19:20 GMT +3, 23 October 2018
    A self-driving car traverses a parking lot at Google's headquarters in Mountain View, California on January 8, 2016.

    This is What’s Wrong With Self-Driving Cars: People

    © AFP 2018 / Noah Berger

    As self-driving cars slowly but steadily make their way into our everyday lives, Wired’s Alex Davies explained the core problem of autonomous vehicle design.

    The future is here: so-called self-driving cars have finally made it to the market. With Tesla's Autopilot feature, you can do whatever you want as your car drives you home, completely on its own… Or can you?

    What if something happens right in the middle of a tremendously tense finale of a movie you're watching on your iPad? Or even worse — when you're just about to beat the top score in Arkanoid or some other game? Will you have what it takes to toss your tablet aside in a split second, get your hands on the wheel and, most importantly, grasp what in the world is going on around you — all in what may be only seconds before a deadly crash?

    Wired's Alex Davies says no. And all of the car manufacturers that are currently developing self-driving cars agree with him.

    Designing what developers call a "handoff" — the task of getting the driver behind the wheel to take control at a moment's notice — has proven very difficult. Not because of machines. Because of us.

    "Having a human there to resume control is very difficult," says Bryan Reimer, an MIT researcher who studies driving behavior.

    According to various studies, drivers who have been relieved of the responsibility of paying full attention to the road almost always let their focus drop to zero. In a YouTube video by Inside Edition, you can see people doing all kinds of things while riding a Tesla down the highway, including sleeping. Yes, sleeping soundly in the driver's seat.

    The very same video features footage from another Tesla driver, who barely managed to retake control of his vehicle in time, just before Tesla's Autopilot jerked left toward an oncoming car.

    So car manufacturers have found themselves in a tricky situation. In trying to provide top-notch safety by embedding autopilot functions in cars, they have opened a safety flaw so large that semi-autonomous cars could end up less safe than driving the old-fashioned way.

    To deal with that problem, manufacturers have to deliver what is known as Level 3 autonomy. A Level 3-capable self-driving car must make sure that the driver is always in a position to take control at a moment's notice. Now that's a complicated task for a machine!

    Just imagine, Davies writes: the car, like an electronic being, has to make sure you keep your eyes on the road, that you don't sleep, don't play cards, don't have a VR headset strapped to your face, and a myriad of other ‘don'ts.' For those less familiar with computer technology, such a task would require an astronomical amount of image-recognition computing and artificial-intelligence decision-making, and the complexity of those algorithms may even surpass the self-driving magic itself!

    Google was the first to realize that all that computational effort is just not worth the trouble, in terms of the bottom line, so it decided to ditch Level 3 autonomy and go straight to the next levels of autonomy — completely self-driving cars that never rely on human beings to solve critical driving problems.

    But Google, joined by another technology company, Uber, is still struggling to perfect its little marshmallow-like machines, which already perform not all that badly, according to TheOatmeal.com. The companies' incentive is high, since both Uber and Google stand to profit a great deal from removing the need for a human driver, if in different ways: Uber's human drivers take the biggest chunk of passengers' money, while Google intends to keep passengers occupied with its other services while they ride (not drive) its little marshmallow-looking thingamajigs.

    In the meantime, more conservative players like Audi are willing to accept the challenge of Level 3. According to Wired, Audi is about to present its newest A8 sedan, packed with technology that serves one purpose: to monitor the driver and keep him constantly reminded of his driving duty through "a combination of alerts the human will see, hear, and feel." That sounds a teeny-tiny bit scary, and definitely not what one might imagine on hearing the words "premium comfort."

    But even Audi, however conservative, is moving toward full autonomy, Davies writes. And yet none of the tech giants pursuing Level 4 and 5 autonomy can answer this simple question: what should you do when your Google marshmallow suddenly decides to ram you into an oncoming car?

    Related:

    Uber’s Self-Driving Car Program Flees from California to Arizona
    General Motors to Start Testing, Producing Self-Driving Cars in Michigan
    Self-Driving Ubers in San Francisco Under Attack for Running Red Lights (VIDEO)
    US Officials to Control Software for Self-Driving Cars Under New Regulations
    Apple Cuts Dozens of Staff Amid Self-Driving Car Project Difficulties
    Tags:
    autonomous vehicles, attention, self-driving car, Uber, Google, Audi, Ford