A 2016 Business Insider article predicted that 10 million self-driving cars would be on the road by 2020. This obviously didn’t happen. The earlier excitement around this technology has since cooled, and one big reason is the death of Elaine Herzberg. In March 2018, she was hit by one of ride-hailing company Uber’s cars, which was testing an autonomous driving system. Video from the crash shows it would have been extremely difficult for a human driver to see her and stop the car in time. However, Uber’s car used technology called LiDAR to sense its environment, and LiDAR doesn’t depend on daylight. So even though it was dark out, the car should have sensed her.
A point of interest is that many people seem to misunderstand what “self-driving car” means. There are actually six levels of driving automation, from 0 to 5, and as you go up each level the driver does less and less work. At level 0 there is no automation: the person does everything. Levels 1 and 2 add driver assistance and partial automation, level 3 is conditional automation, and level 4 is high automation under limited conditions. At level 5, drivers are not needed at all.
People see videos of self-driving cars and hear people like Elon Musk talk about how safe the technology is. A blog post on Tesla’s website explains that their plan is to develop “a self-driving capability that is 10X safer”. Beneath all this excitement lies one fact: the technology currently available is not level 5 “full automation”. People might get the feeling it’s level 4 or 5, but Tesla’s autopilot is closer to level 2, moving towards level 3. Quite a few companies are working on this tech, but let’s just compare Tesla with driverless tech company Waymo. These two companies use very different strategies for their driverless technology.
Waymo uses cameras and Google Maps, but its main technology for sensing the environment is LiDAR. LiDAR bounces laser light off everything around the vehicle to create a 3D image of the world up to 300 meters around the car. You can take a short look and see what it’s like to ride in Waymo’s car. Waymo doesn’t produce cars like Tesla does; it buys existing cars and adds its own hardware. To test its system, Waymo also uses something called Carcraft, which is like a high-tech video game for ‘practicing’ real-life driving conditions. By 2018, Carcraft had already simulated over 5 billion miles of driving.
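The basic idea behind LiDAR is simple to illustrate: the sensor times how long a laser pulse takes to bounce back, and since light travels at a known speed, that time gives the distance. The sketch below is purely illustrative (it is not Waymo’s actual code; the function name and numbers are invented for this example):

```python
# Illustrative sketch of time-of-flight ranging, the principle behind LiDAR.
# A laser pulse travels out, reflects, and returns; halving the round trip
# gives the one-way distance to the object.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def lidar_distance_m(round_trip_seconds: float) -> float:
    """Estimate distance (meters) from a laser pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A pulse that returns after 2 microseconds hit something about 300 m away,
# which matches the outer edge of the range mentioned above.
print(round(lidar_distance_m(2e-6)))  # -> 300
```

Because light is so fast, the electronics must measure round trips of only a few millionths of a second, which is part of why LiDAR hardware has historically been expensive.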
Elon Musk (whose SpaceX rockets use LiDAR) doesn’t believe LiDAR is a good choice for cars, saying that it is too expensive and not as useful as developing other types of technology. The logic is that people don’t need LiDAR to measure distance and drive; we just use our eyes. Cameras are much cheaper than the hardware needed for LiDAR. Tesla’s plan is to use software that can judge 3D distance from camera imaging (just like the human eye). They also use radar (what we use to find airplanes in the sky) and ultrasonics (using sound to sense objects in the environment) to sense the world around the car. Since all new Teslas come with the needed hardware, owners can send their driving information back to Tesla, which already has several billion hours of ‘real driving’ experience.
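Ultrasonic sensors work on the same time-of-flight idea as LiDAR, but with sound instead of light. Sound travels far more slowly, so these sensors only cover short ranges, like parking distances. A minimal sketch (the speed-of-sound figure assumes air at roughly room temperature, and the names are invented for illustration):

```python
# Illustrative sketch of ultrasonic ranging: emit a sound pulse, time the
# echo, and halve the round trip to get the one-way distance.
SPEED_OF_SOUND_M_PER_S = 343  # approximate speed in air at ~20 °C

def ultrasonic_distance_m(round_trip_seconds: float) -> float:
    """Estimate distance (meters) from an echo's round-trip time."""
    return SPEED_OF_SOUND_M_PER_S * round_trip_seconds / 2

# An echo returning after about 12 milliseconds means an object roughly
# 2 meters away -- typical parking-sensor territory.
print(round(ultrasonic_distance_m(0.012), 2))  # -> 2.06
```

The contrast with the LiDAR numbers shows why a car needs several sensor types: sound-based ranging is cheap but short-range, while laser-based ranging reaches hundreds of meters.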
The hardware used by both companies is fully developed; it is the software that makes driving decisions that needs to improve. The systems need huge amounts of data, and the AI software simply needs more time to perfect. There are mixed levels of confidence about the safety of self-driving technology. While Elon Musk claims it’s “much safer than human drivers”, others have started to disagree. Either way, the future of self-driving cars is much less certain than it seemed several years ago. Some people in the industry think consumers won’t own self-driving cars anytime in the near future.
These fears will almost certainly cause governments to take more time deciding when this technology is allowed on the road. Elon Musk plans to make Tesla’s self-driving system 10X safer than human drivers, but even once this happens, there will still be many questions to work through. For example, neither the Tesla nor the Waymo system provides 100% real driving data: Tesla’s comes from drivers who are in the car paying attention to the road, and Waymo’s comes from a computer model named after a video game. As more people begin using self-driving systems and more car crashes happen, governments will need to decide who is responsible, and a whole new system of laws will need to be created. Who knows how long that could take…