Bet you my entire earnings in the 2040s
Not to write an essay, but there are 3 fundamental reasons they will not happen on a large scale.
1. Algorithmic morality - your car is driving you down a country road, you round a bend, and there are 10 people standing in the middle of the road. The car has a choice: either hit them and protect you from going off the road, or take you off the road, potentially killing you but saving them (basically the trolley problem). Ultimately, the beauty of the human brain is its ability to react to the 'random', and it is surprisingly efficient at making a split-second choice. Unless it is true AI (highly unlikely), the car will have to follow a decision tree to make that choice, and Mercedes-Benz have already admitted that their programming would seek to avoid the collision. You can't ask a programmer to decide whether you live or die... would you get in a car that could be programmed to kill you?
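To make the point concrete, here's a toy sketch of what that pre-programmed choice boils down to. Everything here is invented for illustration - the function, the inputs and the comparison are my assumptions, not anything any manufacturer has published:

```python
# Toy illustration of the "algorithmic morality" problem: someone has to
# write this comparison down in advance. All names and values are made up.

def choose_action(pedestrians_ahead: int, occupants: int) -> str:
    # Option A: stay on the road -> pedestrians at risk
    # Option B: swerve off the road -> occupants at risk
    risk_stay = pedestrians_ahead    # crude proxy: people harmed if we stay
    risk_swerve = occupants          # crude proxy: people harmed if we swerve
    # This one line IS the ethical decision, fixed at compile time,
    # long before the bend in the country road.
    if risk_swerve < risk_stay:
        return "swerve"
    return "stay"

print(choose_action(pedestrians_ahead=10, occupants=1))  # -> swerve
```

The human driver makes this call in the moment; the programmer makes it years earlier, for every driver at once - which is exactly why it's so contentious.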
2. Physical constraints - BMW, Mercedes, Tesla and a raft of others have already developed autonomous tech, but only one (Tesla) has actually decided you can use it without any safety precaution. Having driven through Germany and France last month in a brand-new BMW 5 Series with the latest tech, and in a Benz in the UK with the same, 90% of the time they are great. However, they rely on sensors to detect the lane lines, and should the lines disappear (rain on the camera or road, mud, sensor failure, faded paint, or a slip road), the car literally says 'computer says no' and chucks control back at you immediately - but not before veering dangerously in one direction. Seeing as there is no plan whatsoever to implement an autonomous infrastructure (guide lines etc.), we'll still be relying on the same tech in 2040, albeit better developed, but similarly flawed.
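The hand-back behaviour I'm describing looks roughly like this in pseudocode-ish Python - the threshold, names and structure are all my own invention, just to show why the failure mode is so abrupt:

```python
# Hypothetical sketch of the "computer says no" hand-back. When lane-line
# confidence drops below a threshold, the system disengages instantly
# rather than degrading gracefully. The 0.6 threshold is an assumed value,
# not taken from any real system.

LANE_CONFIDENCE_THRESHOLD = 0.6

def steering_update(lane_confidence: float, current_angle: float) -> dict:
    if lane_confidence < LANE_CONFIDENCE_THRESHOLD:
        # Rain, mud, faded paint, slip road: control is dumped back on the
        # driver with zero warning, holding whatever angle was last computed.
        return {"autonomous": False, "angle": current_angle}
    # ...normal lane-keeping correction would happen here...
    return {"autonomous": True, "angle": current_angle}
```

The point is the cliff edge: there's no middle state between "fully handling it" and "your problem now", which is exactly what you feel when the lines fade mid-corner.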
3. The legal issue - the manufacturers are inherently terrified of lawsuits from the good old US of A resulting in multi-million-dollar settlements for basic stupidity. Because the tech is not 100% effective, the manufacturers cannot allow you to say 'the car was driving': in a fatal accident, the way the western legal system works, we have to have someone to blame. Unfortunately, that someone cannot be the car manufacturer. Tesla are the only brand brave enough to try it, and witness what happened - the guy brave enough to trust it got slammed into the side of a lorry, asleep. Telemetry showed he hadn't touched the pedals or wheel for over two hours, which is how Tesla escaped blame.
Ultimately, if it were all autonomous cars - I mean 100%, entirely - it'd be fine. But the moment you include random human acts in a controlled environment, no amount of processing power in 20 years will be able to deal with that.
I concede that you'll see cars that can drive themselves on motorways and in cities to relieve congestion. On the M25, for example, you could have 2 lanes as a kind of road train, going at 100mph with cars 50cm from each other - you could circle LDN in an hour lol
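Quick back-of-envelope on that claim (taking the M25 as roughly 117 miles round - the speed and 50cm gap are from my scenario above):

```python
# Rough check on the M25 road-train numbers.
m25_miles = 117          # approximate M25 circumference
speed_mph = 100
lap_hours = m25_miles / speed_mph
print(f"Lap time: {lap_hours:.2f} h")        # ~1.17 h, so "in an hour" holds up

# And why it only works if EVERY car is automated:
speed_ms = speed_mph * 1609.344 / 3600       # ~44.7 m/s
gap_seconds = 0.5 / speed_ms                 # time to close a 50cm gap
print(f"50cm gap closes in {gap_seconds * 1000:.0f} ms")  # ~11 ms
```

An 11-millisecond margin is hopeless for human reactions (which are measured in hundreds of milliseconds), which is the whole reason a road train lane would have to exclude manual drivers entirely.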
But no govt will ever legislate that you cannot drive a car without this tech, forcing people to buy new cars. And as cool as it would be, I cannot see you ever getting in a car that takes you from A to B without any driving by you.
Again, sorry for the essay - it's something I'm hella passionate about; I've already written a mock dissertation about it