This is a long post. Short version: Tesla is way ahead on data collection and is pulling further ahead every day.
Do you ever think about what the cameras on a Tesla are doing? Cathie Wood does. She runs ARK Invest and has made billions by investing in disruptive companies, particularly those that use artificial intelligence. Her favorite company, she reiterated last week on CNBC, is Tesla.
I do too, and I go even further than she does in studying this industry (I have been studying self-driving cars for 15+ years and keep a Twitter list of people and companies building autonomous cars). I even count how many go by my front door: one every few minutes. No one else has a neural network like Tesla’s, and no other self-driving car has been seen on my street. Did you know a Tesla crosses the Golden Gate Bridge every 80 seconds or faster? Yes, that was me out there counting cars.
I spend hours out front every week cataloging what goes past (I have Apple’s new over-ear headphones and usually use them outside while walking around talking to dozens of people all over the world who are building the future; they absolutely rock, by the way).
The photo at the top of this post is the street I live on, right near Netflix’s headquarters; I shot it on our daily walk. It is part of a multi-billion-dollar war over the future of transportation. Most people have no idea how far ahead Tesla is, including a very smart software developer I talked with today (I won’t name him because that wouldn’t be nice).
Elon just wrote this over on Twitter:
He’s right. When I talk with people, even techies in Silicon Valley, most have no clue just how advanced Tesla is and how fast it’s moving. Wall Street analysts are even worse. I’ve listened to EVERY analyst and NONE except Wood talks about the data the car is collecting. None have figured out that Tesla is building its own maps, or, if they have, they haven’t explained what that means. (I met part of the Tesla programming team building these features and they admitted to me that they are building their own maps; more on that later for the nerds who want to get into why this matters.)
She says that the data leads Tesla to Robotaxis, which will be highly profitable. She’s right, but that’s only one possible business Tesla can build off of this new real-time data. Others include augmented reality worlds, GIS data to sell to businesses and cities, and new utilities that will run far ahead of Apple’s and Google’s abilities. More on that in a bit. These are all multi-billion-dollar businesses, which is why tens of billions of dollars are being invested in autonomous technologies: at GM, with its Cruise division (already worth about 1/3 of GM’s total market value), and at Apple, which has leaked that it’s entering the space in 2024 with an effort that will cost many billions too.
First, some basics.
Go and watch some of the Tesla FSD videos on YouTube.
You will see just how it works. A Tesla has eight cameras (most of them on the outside of the car). It also has a radar in the front bumper, and several ultrasonic sensors around the car that can see things close by, like a dog running next to the car. These are all fused together into a frame by software, and then 19 different AI systems go to work figuring out where the drivable road surface is, where signs are, where pedestrians and bicyclists are, and much, much more.
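To make the pipeline concrete, here is a minimal sketch of that “fuse, then run many detectors” pattern in Python. Tesla has not published its internals, so every name and structure below is my guess for illustration; only the sensor counts (eight cameras, one radar, ultrasonics) and the idea of many parallel AI tasks come from what’s described above.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """One fused snapshot of the world, built from all sensors at one timestamp."""
    camera_images: list  # eight exterior camera frames (placeholders here)
    radar_returns: list  # forward-radar distance/velocity readings
    ultrasonic: list     # short-range readings for nearby objects

def fuse(cameras, radar, ultrasonic):
    """Combine raw sensor readings into one frame all the detectors can share."""
    return SensorFrame(cameras, radar, ultrasonic)

# Each detector answers one question about the scene: drivable surface,
# signs, pedestrians, and so on. (Names are illustrative, not Tesla's.)
def detect_drivable_surface(frame):
    return {"task": "drivable_surface", "cameras_used": len(frame.camera_images)}

def detect_signs(frame):
    return {"task": "signs", "cameras_used": len(frame.camera_images)}

def detect_pedestrians(frame):
    return {"task": "pedestrians", "cameras_used": len(frame.camera_images)}

DETECTORS = [detect_drivable_surface, detect_signs, detect_pedestrians]

def perceive(cameras, radar, ultrasonic):
    frame = fuse(cameras, radar, ultrasonic)
    return [detect(frame) for detect in DETECTORS]

results = perceive(cameras=["img"] * 8, radar=[42.0], ultrasonic=[0.5] * 12)
print([r["task"] for r in results])
```

The key design idea is that fusion happens once, and every downstream system reads the same shared frame instead of each task re-processing raw sensor data.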
My car shows that it can “see” about 100 yards around the car, which you can see in the FSD (FSD stands for “Full Self Driving”) videos on YouTube.
When I say “see” I mean it shows me on my screen where stop signs, lights, pedestrians, other vehicles, and even curbs and other features of the road are.
For people who don’t track the bleeding edge of computer vision (I have a separate Twitter list of developers doing computer vision, which is how robots, autonomous cars, and augmented reality glasses will “see” the world and figure it out), you might not realize just how good computer vision has gotten. The folks over at Chooch AI, a new startup out of Berkeley’s computer science lab, have shown me how good it is and how much cheaper it is getting, literally every month. Their system can be trained to do a variety of things with cameras, even check whether you are washing your hands properly (important if you are a restaurant worker or a surgeon).
Their system already recognizes 200,000 objects. On an iPhone.
My Tesla doesn’t show that it recognizes that many things, but what it does is amazing. For instance, if I drive by a traffic cone at 90 m.p.h. it shows the cone. If there is a string of cones, my car automatically changes lanes to get away from the construction area and make it safer for everyone.
As I drive down the street it shows parked cars and even garbage cans. But it isn’t showing me the “real” can. It’s showing me what the AI has in its system. This is very important. A lot of people don’t understand just how much data just one garbage can generates. The car captures what kind of can it is. How big is it? How is it positioned in 3D space on the road bed? And it does this even on garbage days, when there are hundreds of cans out on the street I live on.
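Here is a sketch of what one such detection record might look like. The field names are my own guesses, not Tesla’s actual schema; the point is how much structure a single recognized can carries beyond “there is an object.”

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    """Hypothetical record for one recognized object (illustrative fields only)."""
    label: str          # what the classifier thinks it is
    subtype: str        # e.g. which style of can
    width_m: float      # estimated size in meters
    height_m: float
    position_m: tuple   # (x, y, z) relative to the car, on the road bed
    heading_deg: float  # how the object is oriented
    confidence: float   # how sure the network is

can = DetectedObject(
    label="garbage_can",
    subtype="wheeled_bin",       # made-up subtype for illustration
    width_m=0.7,
    height_m=1.1,
    position_m=(3.2, 12.5, 0.0),
    heading_deg=90.0,
    confidence=0.97,
)
# On garbage day, a single street can produce hundreds of these records,
# each one a small but precise piece of 3D map data.
print(can.label, can.position_m)
```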
Why would this data be valuable? Well, a garbage company might want to buy an autonomous garbage truck. They could use these systems to both drive the truck and have a robot figure out where each can is to pick it up (already our garbage trucks only have one person controlling such a robot). It will soon go way further than that.
Why “HD” Maps are So Important
Tesla’s current autopilot/self driving systems have some major flaws (many of these are fixed in the FSD beta, but most owners don’t have that yet):
- They can’t see debris in the road.
- They can’t see potholes that have just formed.
- They can’t join together to make energy usage more efficient.
- They can’t see around corners very well.
Now, there are two approaches. One is to put a LOT more cameras and sensors on the car and a LOT more silicon in each car to properly identify things. I hear that will be Apple’s approach, which is why it’s currently looking at LIDAR sensors. While touring the Tesla factory, of all places, I met the LG camera team (they make the cameras in your iPhone), and they said their cameras will soon be a LOT higher resolution than the ones in my three-year-old Tesla. Apple believes it will be able to put a lot more neural network capability into each car since it now has its own chip manufacturing.
Disclaimer, I own both Tesla and Apple stock, my number one and two positions.
So, could Apple “beat” Tesla? That is the $64 billion question. I don’t believe so. The photo above shows why.
In Silicon Valley there are already so many Teslas that scenes like this one, on HWY 17 between Los Gatos and Santa Cruz, are quite common. Tesla, I hear, will soon start transmitting data from cars in front of you to your car (and to everyone else too).
Why is this important? Well, one day I was driving in the fast lane of Freeway 85 using my car’s Autopilot/FSD features (my car automatically changes lanes, stops, and basically drives itself already, particularly well on freeways, with the above limitations). Cars in front of me started swerving and braking. It turned out a bucket had fallen out of a truck and was rolling around lane #1 (the freeway there has three lanes, and I was in lane #1).
I grabbed the steering wheel and took control, also swerving around the bucket. My Tesla hadn’t seen the bucket, although it was already assisting me in driving, doing very advanced braking. Audi taught me just how good anti-lock and traction control systems are by teaching me to drive on ice. They turned those systems off and I instantly spun the car. They turned them back on and the systems independently braked each wheel to keep my car from losing traction. Elon Musk demonstrated something similar to me with electric motors and traction, too (which is why Teslas are REMARKABLE on ice and snow).
Afterward I talked with the programming team. My car, they said, automatically captures TONS of data during such an event (we call that an intervention, because a human had to intervene in autonomous driving and take over). The programmers have a simulator where they can load all that data and actually “walk around” what happened. Chooch shows that the training is so good that an engineer could just circle the bucket and “teach” the AI systems what it is.
That’s how they taught it to see stop signs and garbage cans, for instance.
So, now a future version of a Tesla will be able to see the bucket and let everyone else know that there’s an object in the road. “But Waze already does that,” you might say. No, it doesn’t. Waze requires a human to tap the screen and say there’s an object on the road. But it doesn’t know whether it’s a bucket, a box, a bedspring, or a beam. And it doesn’t show what lane that thing is in. It certainly doesn’t predict, or track, how said object is moving.
I hear that by the end of the year Tesla will turn on such features. Now, what will that look like on the road? All Teslas will start switching lanes to lane #3. You won’t even know why, even if you are in one, until you pass by the bucket in lane #1.
Can Apple match this? Not until it gets a decent number of cars on the road. Since Apple is aiming at 2024 I think it will be way behind and will find it difficult to catch up.
But it might get worse for Apple, and it certainly will get worse for car companies that aren’t building these capabilities. Why? Maps and Robotaxis.
The next trillion-dollar company
Let’s say Tesla, or, really, anyone, came out with a map that was updated in real time on your phone. Would you continue using Google or Apple Maps that aren’t? Of course not. Here’s why.
One day I was driving my kids on a long trip to pick up some fruit in the Central Valley (I know a strawberry farmer who grows far better strawberries than the ones sold in our local grocery stores). We witnessed a truck burning. Our cameras could have seen that, captured it, and shown it to everyone on the map, keeping it updated as firetrucks arrived and the mess was cleaned up. On the way back we were caught in traffic at that spot, which was still being cleaned up. The current maps show traffic, but don’t show WHY there is traffic.
In the future you’ll see that truck burning on the freeway and maybe you’ll decide to take a different route, or delay your trip, saving you much time. Can Google match this? Not really. Google has no cameras passing by the fire. No neural network deciding whether that’s important to upload for everyone else. And even if a human snapped a photo, it is already out of date and doesn’t show the current status. A Tesla rolls by every few minutes.
Look at the front of my home on Google or Apple. The photos there are 20 and 23 months old! On Tesla’s map, if my home were burning down, you’d see that fire within seconds of it starting, and you’d be able to watch in near real time as roads get closed and emergency vehicles roll by (a fire station is on our street, so I get to watch that often too).
So, soon, Tesla will have far better 3D maps that are updated in real time. Google and Apple can’t match it. Why? No data. Tesla has all the data.
Now, Tesla COULD license this data to Apple, so that Apple could stay relevant. Since Apple has $250 billion in cash, that’s quite a possibility. Which is why I own both Apple and Tesla. Apple is building such a 3D map, from scratch, and is doing a ton of autonomous car work. Remember, Apple could buy GM out of petty cash, so I can never count it out. (Apple also has an amazing new VR/AR headset coming in 2022; more on that in a different post.) Its AR devices will use this new 3D map it is developing.
Tesla, soon, because of the real-time map it is developing, will be able to solve all four of the problems I named above and add major new capabilities. I hear it is already testing caravanning, for instance. This is the ability for an autonomous car in front to control the brakes and acceleration of everyone behind, which will let Tesla build a string of cars going on long trips. Doing that will save about 20% of power for everyone in the chain. It’s why NASCAR drivers learn to “draft” behind other cars: doing so lets them go faster on less fuel.
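The rough arithmetic on that 20% savings is worth spelling out. The pack size and highway consumption below are illustrative ballpark figures I’ve chosen, not Tesla specs; only the 20% number comes from the claim above.

```python
# Back-of-envelope effect of drafting in a caravan on highway range.
# Assumed numbers (pack size, consumption) are illustrative, not Tesla specs.
pack_kwh = 75.0           # ballpark long-range battery pack
wh_per_mile_solo = 250.0  # ballpark highway consumption driving alone
savings = 0.20            # the ~20% reduction claimed for caravanning

wh_per_mile_chain = wh_per_mile_solo * (1 - savings)

range_solo_miles = pack_kwh * 1000 / wh_per_mile_solo
range_chain_miles = pack_kwh * 1000 / wh_per_mile_chain

print(round(range_solo_miles), round(range_chain_miles))  # 300 vs 375 miles
```

Under these assumptions, a 20% cut in consumption turns a 300-mile highway leg into 375 miles, which is the difference between stopping to charge on a long trip and skipping a stop entirely.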
All of this work will lead to robotaxis.
Think of Uber. That company was invented right in front of me during a Paris snowstorm. The impulse there was to make transportation easier. When I visited a slum in South Africa I heard just how big a deal this was to people. One woman told me Uber changed her life since before then taxis wouldn’t visit the slum, so she had a tough time getting around.
In the research for our latest book, “The Infinite Retina,” I learned that Uber is closer to how we will own cars in the future than other models. Why? When autonomous cars arrive, having a car sitting in your garage doing nothing will seem very stupid. Something only the rich will do.
Now the economics. An Uber costs about $60 per hour. Go ahead, order an Uber and tell the driver “stay here for an hour and charge me.” Keep in mind that Uber is still losing money at this rate, too.
When autonomous cars come? That cost will go down to less than $10 an hour. This is why Uber invested billions in autonomous vehicle research. For Uber, though, the problems are even worse. It doesn’t make cars. It doesn’t own cars. It can’t force its drivers to buy a new one to get new features. It can’t force its drivers to buy a specific brand.
Let’s talk about the role consistency plays in building a brand. For instance, take Starbucks Coffee. Is it great coffee? No. I used to live in Seattle, and everyone knows Starbucks’ coffee sucks compared to high-end coffee shops. So why is Starbucks so loved? (Disclaimer: I own stock in Starbucks too.) Because it is ubiquitous AND consistent. Uber is ubiquitous (I took one in Moscow, Russia) but it isn’t consistent, for two reasons:
- The driver in the car. Sometimes they smoke. I even had one who smelled of alcohol.
- The car itself. Sometimes it’s a brand new Mercedes. Sometimes it’s a beat up Toyota.
Tesla’s robotaxis will fix both problems. The economics work out so that Tesla could rent you a Model 3 for about $10 an hour. Even a top-of-the-line Roadster could rent for $30 an hour. A $200,000 car cheaper than an Uber? Yes! And at these prices Tesla will be HUGELY profitable compared to Uber. The wholesale cost of operating a Model 3 is actually only about $3 an hour. Uber and Lyft can’t compete.
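The margin math behind that claim, using only the numbers above (the $60, $10, and $3 figures are the post’s estimates, not audited data):

```python
# Robotaxi margin arithmetic, using the post's own per-hour estimates.
uber_price_per_hour = 60.0     # what an Uber roughly costs today
model3_rental_per_hour = 10.0  # hypothetical robotaxi price
model3_cost_per_hour = 3.0     # claimed wholesale operating cost

margin_per_hour = model3_rental_per_hour - model3_cost_per_hour
gross_margin_pct = margin_per_hour / model3_rental_per_hour * 100
rider_savings_pct = (1 - model3_rental_per_hour / uber_price_per_hour) * 100

print(margin_per_hour, round(gross_margin_pct), round(rider_savings_pct))
```

If these estimates hold, Tesla keeps $7 of every $10 hour (a 70% gross margin) while the rider pays about 83% less than an Uber, which is why a ride-hail company that owns neither the cars nor the drivers has no obvious answer.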
Remember back to that first ride that Elon Musk gave me? What was his sales pitch for Tesla? Fun? Yes! But he really emphasized how electric motors can be made more cheaply than gas ones. He didn’t talk about autonomous vehicles back then, or saving the earth. He saw that he could make a car cheaper than a Toyota and, therefore, disrupt the entire industry. Now that more than a million of them are on the road we can see he was right. The lifetime cost of owning a Tesla is lower than owning a Toyota.
And it’s about to get far worse for Toyota.
Why? Well, if you buy a $45,000 Toyota it’ll cost you about $1,000 a month, if you include all the costs.
That’s a lot of hours of driving if a Tesla is $10 an hour. Most people don’t drive that much every month. Even I only put about 30 hours a month into mine (in three years I put 53,000 miles on my Tesla, and I take a ton of long trips; most of my friends put far less on their cars than I do). So an autonomous Tesla would cost about $300 a month for me. Far less than owning a car and having it sit in my garage, like both our Tesla and our Toyotas are right now.
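The break-even calculation, again using the post’s own figures ($1,000/month all-in ownership, $10/hour rental, my ~30 hours of driving a month):

```python
# When does renting an autonomous car beat owning one?
toyota_all_in_monthly = 1000.0  # post's all-in cost for a $45,000 Toyota
robotaxi_per_hour = 10.0        # hypothetical autonomous rental price
my_hours_per_month = 30         # my actual driving time

breakeven_hours = toyota_all_in_monthly / robotaxi_per_hour
my_robotaxi_monthly = my_hours_per_month * robotaxi_per_hour

print(breakeven_hours, my_robotaxi_monthly)  # 100.0 hours; $300.0 for me
```

You would have to drive more than 100 hours a month (over three hours every single day) before owning wins, and even a heavy driver like me comes in at 30.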
One last thing. The auto industry asked me to do some research into customer acquisition costs. I went around the country asking “are you ready to get into a car without a steering wheel?”
“Hell no,” is the typical answer I got. One guy in Kansas told me “I’m a narcissistic control freak and there’s no way a computer is going to drive me around.”
Google’s head of R&D told me they had the data to prove that after three rides in one, even that guy changes his mind. So I asked a second question: “What if the car drove to you and then you had the choice of driving it or not?”
Almost everyone, including that guy in Kansas, said “yeah, I don’t have a problem with what it does when I’m not in the car.” So, Tesla will have very low customer acquisition cost, if any at all (everyone knows a Tesla is fun to drive).
Kraft Foods execs once told me they spend $34 to acquire a young customer for their cheese (in advertising and other techniques). So I imagine that Waymo (Google’s robotaxi) will have to spend a lot more than that, I figure more than $100 for a while, to get people to try its system, which has no steering wheel and is totally autonomous (it just started operating without a driver in Phoenix, Arizona, and San Francisco).
Plus, Tesla has a huge advantage in brand too over Waymo.
Translation: Cathie Wood is right. Robotaxis will make a crapload of money for Tesla. Now, here’s the rub and why Tesla’s valuation is so high (if you thought it was just a car company it would be extremely overvalued): a robotaxi system doesn’t need many cars on the road. Uber has something around a million drivers, worldwide. Tesla could build that many cars for only a few billion dollars. Plus, its owners have already funded the building of more than a million, and with the Cybertruck on the way, I expect it to sell many millions.
So, Tesla has the brand, the distribution, the consistency, the low customer acquisition cost, and other advantages (like a supercharger network that let us drive ours across America) to make a ton of profit PER CAR. That is what I’m betting on, and it’s what Cathie Wood is betting on too.
Thanks to Brian Roemmele who has been talking to me about this for more than a year. He sees ahead better than anyone else I track right now.