Self-driving cars are the hot new thing, and every major tech company, as well as every automobile company, is working on its own iteration of this not-so-new concept. These companies know there is huge interest in self-driving cars. Many people find the job of driving tedious and tiresome. Letting an AI computer drive removes the need for a human to be occupied with the unproductive job of driving, and allows the rider to focus on other important tasks or simply relax while the car takes them to their destination.
In 2015, Elon Musk, the billionaire Tesla CEO, claimed that by 2018 Tesla's cars would have achieved "full autonomy," or "Full Self-Driving" in Tesla jargon. Musk has made multiple similar claims since. He also announced a Robo-Taxi program in which Tesla cars would serve as autonomous taxis operating without any drivers. Tesla and Musk have repeatedly failed to deliver on these promises, even though Tesla seems to be the closest to achieving a completely autonomous system.
Tech companies like Apple and Google have also been working on building autonomous cars. Since 2014, Apple has been working on Project Titan, a secretive project in which Apple is trying to build a complete vehicle with full self-driving technology at its core. Google has a subsidiary, Waymo, that develops the core technologies behind autonomous cars.
Yet even the biggest companies, with the deepest pockets and the brightest minds, have not been able to build a flawless, fully self-driving automobile. The problem with these systems is not a lack of data for a machine learning model to learn from. Automobile companies, Tesla in particular, have been collecting driving data from hundreds of thousands of cars all over the world. They have the raw data required to train the AI.
Nor is the problem a lack of hardware in an individual car: the processors that drive these cars are more than powerful enough to enable full self-driving. Autonomous cars do not need huge amounts of memory, because the decisions the AI system takes do not depend on what the vehicle had in memory five minutes ago. The computer driving the vehicle needs data about its surroundings at the given moment. Pedestrians, wildlife, and other vehicles come and go within a matter of seconds, and once they have passed, that data is no longer needed in memory.
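This point about memory can be illustrated with a fixed-size buffer that only ever holds the last few seconds of sensor frames. This is a minimal sketch, not how any real driving stack is implemented; the frame rate, window length, and frame format are hypothetical values chosen for illustration.

```python
from collections import deque

# Hypothetical values for illustration only.
FRAME_RATE_HZ = 30   # sensor frames per second
WINDOW_SECONDS = 2   # how much history the system keeps

class SensorBuffer:
    """Keeps only the most recent few seconds of sensor frames in memory."""

    def __init__(self, frame_rate=FRAME_RATE_HZ, window=WINDOW_SECONDS):
        # deque with maxlen silently discards the oldest frame when full,
        # so memory use stays bounded no matter how long the car drives.
        self.frames = deque(maxlen=frame_rate * window)

    def push(self, frame):
        self.frames.append(frame)

    def latest(self):
        return self.frames[-1] if self.frames else None

buf = SensorBuffer()
for t in range(1000):  # simulate roughly 33 seconds of driving
    buf.push({"t": t, "obstacles": []})

print(len(buf.frames))      # never exceeds 60 frames (2 s at 30 Hz)
print(buf.latest()["t"])    # most recent frame is always available
```

However long the simulated drive runs, the buffer never grows past 60 frames, which is why a modest amount of memory is sufficient for this style of moment-to-moment decision making.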
With no major hardware issue in the way, we are closer to fully self-driving cars than ever, but human intervention is still needed every now and then. We have successfully achieved what is called level 3 autonomy, but the jump to level 5 autonomy is years away.
Here are the levels of autonomy for self-driving cars.
- No automation (level 0): The car does not have an AI system assisting the driver in any way. At most, the car may have constant-speed cruise control.
- Driver assistance (level 1): The car has adaptive cruise control or lane-keeping technology that keeps the car in its lane, but the driver still does most of the driving.
- Partial automation (level 2): The car can keep a safe distance and follow the route, but the driver must be ready to take control whenever necessary.
- Conditional automation (level 3): The car can drive by itself in some situations. Although the driver is rarely required to intervene, he or she must be ready to respond at all times. Teslas currently possess level 3 autonomy: they can drive by themselves, but a driver is still needed behind the wheel.
- High automation (level 4): When travelling on a controlled route, a driver is not required and may even choose to snooze in the back. A driver is still required on other roads.
- Full automation (level 5): There is no need for a driver in the car, which may not even have a steering wheel or pedals. Passengers can sit back and rest because the automobile has everything under control.
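The six levels above can be sketched as a small Python enum. The `driver_required` helper and its logic are an illustrative simplification of the descriptions in this list, not any standard's formal definition.

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    """The six autonomy levels described above, from no automation to full."""
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5

def driver_required(level: AutonomyLevel, on_controlled_route: bool = False) -> bool:
    """True if a human must be ready behind the wheel (simplified rule)."""
    if level <= AutonomyLevel.CONDITIONAL_AUTOMATION:
        # Levels 0-3: a driver must always be ready to respond.
        return True
    if level == AutonomyLevel.HIGH_AUTOMATION:
        # Level 4: no driver needed, but only on controlled routes.
        return not on_controlled_route
    # Level 5: never needs a driver.
    return False

print(driver_required(AutonomyLevel.CONDITIONAL_AUTOMATION))                   # True
print(driver_required(AutonomyLevel.HIGH_AUTOMATION, on_controlled_route=True))  # False
```

The simplification makes the key distinction visible: everything up to level 3 keeps a human in the loop, level 4 drops the human only in a constrained environment, and only level 5 removes the driver entirely.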
Autonomous driving has faced plenty of criticism due to the various minor as well as major accidents that have occurred with vehicles using the current generation of this technology. In most of those cases, it was human error that caused the accident, not the AI. But there have been cases where the AI failed to detect an obstacle and caused a mishap.
The problem with today's level of autonomy is that a human is required behind the wheel, ready to take over whenever needed. For the most part, humans start relying too much on the AI and lose attention. Humans are bad at staying alert after spending hours just watching the road without doing anything.
To go beyond the level of autonomy we have already achieved, we might need to compromise and create controlled routes (to achieve level 4 autonomy), or build AI systems that are far more flexible and advanced than what we already have.