16:19 GMT 30 October 2020

    A woman in Tempe, Arizona, died early Monday morning after being hit by a self-driving car operated by Uber, making it the first known death of a pedestrian hit by an autonomous vehicle in the US.

    According to police officials, the car was in autonomous mode, but a human safety driver was also behind the wheel at the time of the incident. The woman was struck while crossing Mill Avenue south of Curry Road in Tempe, a city just east of Phoenix. The unidentified woman was reportedly rushed to a hospital, where she died from her injuries, Phoenix station KNXV reported.

    An Uber spokesperson told the station that the company is "fully cooperating" with local authorities.

    Uber has suspended all autonomous car operations in Phoenix, Pittsburgh, San Francisco and Toronto.

    Companies like General Motors, Waymo, Uber and Intel have been testing self-driving cars on Arizona roads in the Phoenix area since 2015. In March 2017, Uber officials temporarily halted their autonomous vehicle program in Tempe, San Francisco and Pittsburgh after a Honda CR-V collided with a self-driving Volvo operated by Uber.

    Earlier this month, Arizona Governor Doug Ducey issued an executive order stating that driverless vehicles do not need a driver behind the wheel as long as they abide by traditional traffic laws.

    "As technology advances, our policies and priorities must adapt to remain competitive in today's economy," Ducey said in a statement this month. "This executive order embraces new technologies by creating an environment that supports autonomous vehicle innovation and maintains a focus on public safety."

    Ducey's order states that robot cars "with, or without, a person present in the vehicle" must follow all state and federal laws as well as Department of Transportation regulations.

    According to the executive order, driverless vehicles are only allowed on the road if the operator submits a statement to the Department of Transportation certifying that the vehicle complies with federal law and is programmed to enter a "minimal risk condition" in the event of a malfunction. This means that if there is a programming error, or the vehicle encounters an unfamiliar situation, the car pulls to the side of the road and shuts down.
