Tesla’s data advantage. Can Apple, or others, keep up?

This is a long post. Short version: Tesla is way ahead on data collection and is pulling further ahead every day.

Do you ever think about what the cameras on a Tesla are doing? Cathie Wood does. She runs ARK Invest and has made billions by investing in disruptive companies, particularly those that use artificial intelligence. Her favorite company, she reiterated last week on CNBC, is Tesla.

I do too, and I go even further than she does in studying this industry (I have been studying self-driving cars for 15+ years and keep a Twitter list of people and companies building autonomous cars). I even count how many Teslas go by my front door: one every few minutes. No one else has a neural network like Tesla’s, and no other self-driving car has been seen on my street. Did you know a Tesla crosses the Golden Gate Bridge every 80 seconds or less? Yes, that was me out there counting cars.

I spend hours out front every week cataloging what goes past (I have Apple’s new over-ear headphones and usually wear them outside while walking around, talking to dozens of people all over the world who are building the future; they absolutely rock, by the way).

The photo at the top of this post is the street I live on, right near Netflix’s headquarters; I shot it on our daily walk. It is part of a multi-billion-dollar war over the future of transportation. Most people have no idea how far ahead Tesla is, including a very smart software developer I talked with today (I won’t name him because that wouldn’t be nice).

Elon just wrote this over on Twitter:

He’s right. When I talk with people, even techies in Silicon Valley, most have no clue just how advanced Tesla is or how fast it’s moving. Wall Street analysts are even worse. I’ve listened to EVERY analyst, and none except Wood talks about the data the car is collecting. None have figured out that Tesla is building its own maps, or, if they have, they haven’t explained what that means. (I met part of the Tesla programming team building these features, and they admitted to me that they are building their own maps; more on that later for the nerds who want to get into why this matters.)

She says that the data leads Tesla to robotaxis, which will be highly profitable. She’s right, but that’s only one possible business Tesla can build off of this new real-time data. Others include augmented reality worlds, GIS data to sell to businesses and cities, and new utilities that will run far ahead of Apple’s and Google’s abilities. More on that in a bit. These are all multi-billion-dollar businesses, which is why tens of billions of dollars are being invested in autonomous technologies: at GM, whose Cruise division is already worth about a third of GM’s total market value, and at Apple, which has leaked that it’s entering the space in 2024 with an effort that will cost many billions too.

First, some basics.

Go and watch some of the Tesla FSD videos on YouTube.

You will see just how it works. A Tesla has eight cameras (most of them on the outside of the car). It also has a radar in the front bumper and several ultrasonic sensors around the car that can see closer things (like a dog running next to the car). These are all fused together into a frame by software, and then 19 different AI systems go to work figuring out where the drivable road surface is, where signs are, where pedestrians and bicyclists are, and much, much more.
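
For the nerds, here’s a tiny sketch of what that kind of architecture can look like in code: one shared backbone fuses all the camera frames, and separate task heads (those “AI systems” above) each read from the fused representation. Every name here is mine, not Tesla’s; its actual network (described in its public talks) isn’t open code, so treat this as an illustration only.

```python
# Hypothetical multi-camera, multi-task perception sketch. All names are
# invented for illustration; Tesla's real system is not public code.
# The idea: fuse all camera frames into one shared representation, then
# let many task-specific heads read from it.
import torch
import torch.nn as nn

class PerceptionStack(nn.Module):
    def __init__(self, num_cameras=8,
                 tasks=("drivable_space", "signs", "pedestrians")):
        super().__init__()
        # Shared backbone applied to every camera frame.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # One small head per task, all reading the same fused features.
        self.heads = nn.ModuleDict(
            {task: nn.Linear(64 * num_cameras, 16) for task in tasks}
        )

    def forward(self, frames):  # frames: (batch, cameras, 3, H, W)
        b, c, ch, h, w = frames.shape
        feats = self.backbone(frames.reshape(b * c, ch, h, w)).reshape(b, -1)
        return {task: head(feats) for task, head in self.heads.items()}

stack = PerceptionStack()
outputs = stack(torch.randn(1, 8, 3, 96, 96))  # one fused 8-camera frame
print({task: out.shape for task, out in outputs.items()})
```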

My car shows that it can “see” about 100 yards around itself, which you can watch in the FSD (“Full Self Driving”) videos on YouTube.

When I say “see” I mean it shows me on my screen where stop signs, lights, pedestrians, other vehicles, and even curbs and other features of the road are.

If you don’t track the bleeding edge of computer vision (I keep a separate Twitter list of developers working on computer vision, which is how robots, autonomous cars, and augmented reality glasses will “see” the world and figure it out), you might not realize just how good it has gotten. The folks over at Chooch AI, a new startup out of Berkeley’s computer science lab, have shown me just how good computer vision is and how much cheaper it is getting, literally every month. Their system can be trained to do a variety of things with cameras, even check whether you are washing your hands properly (important if you are a restaurant worker or a surgeon).

Their system already recognizes 200,000 objects. On an iPhone.
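
Chooch’s actual API isn’t public, so here’s a generic stand-in that makes the point: with an off-the-shelf, phone-class model, recognizing objects in a photo takes a few lines of Python. The model choice and filename here are mine, purely for illustration.

```python
# Generic stand-in, NOT Chooch AI's (private) API: classify a photo with
# a phone-class model. Model choice and filename are illustrative only.
import torch
from PIL import Image
from torchvision import models

weights = models.MobileNet_V3_Small_Weights.DEFAULT
model = models.mobilenet_v3_small(weights=weights).eval()
preprocess = weights.transforms()

img = Image.open("street_photo.jpg")  # any local photo you have
with torch.no_grad():
    logits = model(preprocess(img).unsqueeze(0))
print("Predicted:", weights.meta["categories"][logits.argmax().item()])
```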

My Tesla doesn’t show that it recognizes that many things, but what it does is amazing. For instance, if I drive by a traffic cone at 90 m.p.h., it shows the cone. If there is a string of cones, my car automatically changes lanes to get away from the construction area and make it safer for everyone.

As I drive down the street it shows parked cars and even garbage cans. But it isn’t showing me the “real” can. It’s showing me what the AI has in its system. This is very important. A lot of people don’t understand just how much data a single garbage can generates. The car captures what kind of can it is. How big is it? How is it positioned in 3D space on the roadbed? And it does this even on garbage days, when there are hundreds of cans out on the street I live on.
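
To make that concrete, here’s a hypothetical sketch of the record a single can might generate. The field names are invented (Tesla hasn’t published its schema); only the kinds of fields come from what I describe above.

```python
# Hypothetical record for one detected object, with the kinds of fields
# described above (what the can is, its size, its 3D pose). Field names
# are invented; Tesla's actual schema is not public.
from dataclasses import dataclass

@dataclass
class DetectedObject:
    object_class: str                       # e.g. "garbage_can"
    subtype: str                            # e.g. "96_gallon_wheeled_bin"
    width_m: float                          # estimated size, meters
    height_m: float
    position_m: tuple[float, float, float]  # x, y, z relative to the car
    heading_deg: float                      # orientation on the roadbed
    confidence: float                       # detector confidence, 0..1

can = DetectedObject(
    object_class="garbage_can", subtype="96_gallon_wheeled_bin",
    width_m=0.7, height_m=1.1, position_m=(12.4, -2.1, 0.0),
    heading_deg=94.0, confidence=0.97,
)
print(can)
```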

Why would this data be valuable? Well, a garbage company might want to buy an autonomous garbage truck. It could use these systems both to drive the truck and to have a robot figure out where each can is and pick it up (our garbage trucks already have just one person controlling such a robot). It will soon go way further than that.

Why “HD” Maps are So Important

Tesla’s current autopilot/self driving systems have some major flaws (many of these are fixed in the FSD beta, but most owners don’t have that yet):

  1. They can’t see debris in the road.
  2. They can’t see potholes that have just formed.
  3. They can’t join together to make energy usage more efficient.
  4. They can’t see around corners very well.

Now, there are two approaches. One is to put a LOT more cameras and sensors on the car, and a LOT more silicon in each car, to properly identify things. I hear that will be Apple’s approach, which is why it’s currently looking at LIDAR sensors. I met the LG camera team while touring the Tesla factory, of all places (LG makes the cameras in your iPhone), and they said their cameras will soon be a LOT higher resolution than the ones in my three-year-old Tesla. Apple believes it will be able to put a lot more neural network capability into each car since it now designs its own chips.

Disclaimer: I own both Tesla and Apple stock; they are my number one and two positions.

So, could Apple “beat” Tesla? That is the $64 billion question. I don’t believe so. The photo above shows why.

In Silicon Valley there are already so many Teslas that scenes like this one, on HWY 17 between Los Gatos and Santa Cruz, are quite common. Tesla, I hear, will soon start transmitting data from cars in front of you to your car (and to everyone else too).

Why is this important? Well, one day I was driving in the fast lane of Freeway 85 using my car’s Autopilot/FSD features (my car automatically changes lanes, stops, and basically drives itself already, particularly well on freeways, with the above limitations). Cars in front of me started swerving and braking. It turned out a bucket had fallen out of a truck and was rolling around lane #1 (the leftmost of the three lanes where I was).

I grabbed the steering wheel and took control, also swerving around the bucket. My Tesla hadn’t seen the bucket, although it was already assisting me in driving, doing very advanced braking. Audi taught me just how good anti-lock and traction-control systems are by teaching me to drive on ice. They turned those systems off and I instantly spun the car. They turned them back on, and the systems were independently braking each wheel to keep my car from losing traction. Elon Musk demonstrated something similar to me with electric motors and traction, too (which is why Teslas are REMARKABLE on ice and snow).

Afterward I talked with the programming team. My car, they said, automatically captures TONS of data during such an event (we call that an “intervention,” because a human had to intervene in autonomous driving and take over). The programmers have a simulator where they can load all that data and actually “walk around” what happened. Chooch shows that training is now so good that an engineer could just circle the bucket and “teach” the AI systems what it is.
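
Here’s a rough sketch of what intervention-triggered capture could look like: a rolling buffer of recent sensor frames that gets frozen the moment the human takes over. All names are invented; this models only the behavior the team described to me, not their actual code.

```python
# Invented sketch of intervention-triggered capture: keep a rolling
# buffer of recent frames; freeze it when the human takes over.
import time
from collections import deque

class InterventionRecorder:
    def __init__(self, seconds_to_keep=30, hz=10):
        self.buffer = deque(maxlen=seconds_to_keep * hz)  # rolling window

    def record_frame(self, sensor_frame):
        self.buffer.append((time.time(), sensor_frame))

    def on_intervention(self, reason):
        # Freeze the buffer the moment the driver grabs the wheel;
        # a real system would queue this snapshot for upload.
        return {"reason": reason, "frames": list(self.buffer)}

recorder = InterventionRecorder()
for step in range(100):
    recorder.record_frame({"step": step})  # stand-in for fused sensor data
event = recorder.on_intervention("driver_took_wheel")
print(f"Captured {len(event['frames'])} frames around the takeover")
```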

That’s how they taught it to see stop signs and garbage cans, for instance.

So now a future version of a Tesla will be able to see the bucket and let everyone else know that there’s an object in the road. “But Waze already does that,” you might say. No, it doesn’t. Waze requires a human to tap the screen and say there’s an object on the road. But it doesn’t know whether that object is a bucket, a box, a bedspring, or a beam. It doesn’t show what lane the thing is in. And it certainly doesn’t predict, or track, how said object is moving.
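
A hypothetical version of that fleet hazard message shows how much richer it is than a Waze-style pin: what the object is, which lane it’s in, and how it’s moving. The schema and values are invented for illustration; no such public API exists.

```python
# Invented fleet hazard message, far richer than a Waze-style pin:
# what the object is, which lane it's in, and how it's moving.
import json

hazard = {
    "object_class": "bucket",
    "lane": 1,                                  # leftmost lane
    "gps": {"lat": 37.2358, "lon": -121.9624},  # illustrative coordinates
    "velocity_mps": {"x": -0.8, "y": 0.1},      # rolling, and which way
    "first_seen": "2021-01-17T14:32:05Z",
    "confidence": 0.91,
    "reported_by": "vehicle_anon_7f3a",         # anonymized fleet ID
}
print(json.dumps(hazard, indent=2))
```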

I hear that by the end of the year Tesla will turn on such features. Now, what will that look like on the road? All Teslas will start switching lanes to lane #3. You won’t even know why, even if you are in one, until you pass by the bucket in lane #1.

Can Apple match this? Not until it gets a decent number of cars on the road. Since Apple is aiming at 2024, I think it will be way behind and will find it difficult to catch up.

But it might get worse for Apple, and it certainly will get worse for car companies that aren’t building these capabilities. Why? Maps and robotaxis.

The next trillion-dollar company

Let’s say Tesla, or, really, anyone, came out with a map that was updated in real time on your phone. Would you continue using Google or Apple Maps that aren’t? Of course not. Here’s why.

One day I was driving my kids on a long trip to pick up some fruit in the Central Valley (I know a strawberry farmer who grows far better strawberries than the ones sold in our local grocery stores). We witnessed a truck burning. Our cameras could have seen that, captured it, shown it to everyone on the map, and kept it updated as firetrucks arrived and the mess was cleaned up. On the way back we were caught in traffic at that spot, which was still being cleaned up. Current maps show traffic, but they don’t show WHY there is traffic.

In the future you’ll see that truck burning on the freeway, and maybe you’ll decide to take a different route, or delay your trip, saving yourself much time. Can Google match this? Not really. Google has no cameras passing by the fire, and no neural network deciding whether that’s important to upload for everyone else. Even if a human snapped a photo, it would already be out of date and wouldn’t show the current status. A Tesla rolls by every few minutes.

Look at the front of my home on Google or Apple Maps. The photos there are 20 and 23 months old! On Tesla’s map, if my home were burning down, you’d see that fire within seconds of it starting, and you’d be able to watch in near real time as roads got closed and emergency gear rolled by (a fire station is on our street, so I get to watch that often too).

So, soon, Tesla will have far better 3D maps that are updated in real time. Google and Apple can’t match that. Why? No data. Tesla has all the data.

Now, Tesla COULD license this data to Apple so that Apple could stay relevant. Since Apple has $250 billion in cash, that’s quite a possibility, which is why I own both Apple and Tesla. Apple is building such a 3D map from scratch and is doing a ton of autonomous car work. Remember, Apple could buy GM out of petty cash, so I can never count it out (and Apple has an amazing new VR/AR headset coming in 2022, more on that in a different post), and its AR devices will use this new 3D map it is developing.

Tesla, soon, because of the real-time map it is developing, will be able to solve all four of the problems I named above and add major new capabilities. I hear it is already testing caravanning, for instance: the ability for an autonomous car in front to control the brakes and acceleration of everyone behind it. That will let Tesla build strings of cars going on long trips, saving about 20% of the power for everyone in the chain. It’s the same reason NASCAR drivers learn to “draft” other cars: doing so lets them go faster on less fuel.
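
A toy sketch of the caravanning idea: the lead car broadcasts its acceleration, and each follower mirrors it plus a small correction to hold its gap. The control law and the gains here are mine, purely for illustration; Tesla hasn’t released how its system actually works.

```python
# Toy caravanning sketch: followers mirror the lead car's acceleration
# plus a small proportional correction to hold their gap. The control
# law and numbers are invented for illustration.
def follower_command(lead_accel_mps2, gap_m, target_gap_m=12.0, kp=0.2):
    """Mirror the lead car's acceleration, nudged to keep the gap."""
    return lead_accel_mps2 + kp * (gap_m - target_gap_m)

# The lead car brakes at -2.0 m/s^2; followers at different gaps respond.
for gap in (10.0, 12.0, 15.0):
    cmd = follower_command(lead_accel_mps2=-2.0, gap_m=gap)
    print(f"gap {gap:>4.1f} m -> commanded accel {cmd:+.2f} m/s^2")
```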

All of this work will lead to robotaxis.

Think of Uber. That company was invented right in front of me during a Paris snowstorm. The impulse there was to make transportation easier. When I visited a slum in South Africa I heard just how big a deal this was to people. One woman told me Uber changed her life: before it came along, taxis wouldn’t visit the slum, so she had a tough time getting around.

In the research for our latest book, “The Infinite Retina,” I learned that Uber is closer to how we will own cars in the future than other models are. Why? When autonomous cars arrive, having a car sitting in your garage doing nothing will seem very stupid. Something only the rich will do.

Now the economics. An Uber costs about $60 per hour. Go ahead, order an Uber and tell the driver, “Stay here for an hour and charge me.” Keep in mind that Uber still loses money even at that rate.

When autonomous cars come? That cost will go down to less than $10 an hour. This is why Uber invested billions in autonomous vehicle research. For Uber, though, the problems are even worse. It doesn’t make cars. It doesn’t own cars. It can’t force its drivers to buy a new one to get new features. It can’t force its drivers to buy a specific brand.

Let’s talk about the role consistency plays in building a brand. For instance, let’s talk about Starbucks. Is it great coffee? No. I used to live in Seattle, and everyone there knows Starbucks’ coffee sucks compared to high-end coffee shops. So why is Starbucks so loved? (Disclaimer: I own stock in Starbucks too.) Because it is ubiquitous AND consistent. Uber is ubiquitous (I took one in Moscow, Russia) but it isn’t consistent, for two reasons:

  • The driver in the car. Sometimes they smoke. I even had one who smelled of alcohol.
  • The car itself. Sometimes it’s a brand new Mercedes. Sometimes it’s a beat up Toyota.

Tesla’s robotaxis will fix both problems. Tesla’s economics are such that it could rent you a Model 3 for about $10 an hour. Even a top-of-the-line Roadster could rent for $30 an hour. A $200,000 car cheaper than an Uber? Yes! And at these prices Tesla will be HUGELY profitable compared to Uber: the wholesale cost of running a Model 3 is actually only about $3 an hour. Uber and Lyft can’t compete.
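
Here’s the back-of-the-envelope arithmetic, using the figures above (these are my claimed numbers, not audited economics):

```python
# The post's claimed figures, not audited economics.
uber_price_per_hr = 60.0      # roughly what an Uber costs per hour today
robotaxi_price_per_hr = 10.0  # claimed Model 3 rental price
robotaxi_cost_per_hr = 3.0    # claimed wholesale cost of running a Model 3

margin = robotaxi_price_per_hr - robotaxi_cost_per_hr
print(f"Robotaxi gross margin: ${margin:.0f}/hr "
      f"({margin / robotaxi_price_per_hr:.0%} of the fare)")
print(f"Rider saves ${uber_price_per_hr - robotaxi_price_per_hr:.0f}/hr vs. an Uber")
```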

Remember back to that first ride Elon Musk gave me? What was his sales pitch for Tesla? Fun? Yes! But he really emphasized how electric motors can be made cheaper than gas ones. He didn’t talk about autonomous vehicles back then, or about saving the earth. He saw that he could make a car cheaper to own than a Toyota and, therefore, disrupt the entire industry. Now that more than a million of them are on the road, we can see he was right. The lifetime cost of owning a Tesla is lower than that of owning a Toyota.

And it’s about to get far worse for Toyota.

Why? Well, if you buy a $45,000 Toyota, it’ll cost you about $1,000 a month once you include all the costs.

That’s a lot of hours of driving if a Tesla costs $10 an hour. Most people don’t drive that much every month. Even me: I currently put about 30 hours a month into mine (in three years I’ve put 53,000 miles on my Tesla, and I take a ton of long trips; most of my friends put far, far fewer miles on their cars than I do). So an autonomous Tesla would cost me about $300 a month. Far less than owning a car and having it sit in my garage, like both our Tesla and our Toyota are right now.
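
And the break-even arithmetic on owning versus renting, again using the estimates above:

```python
# Break-even arithmetic from the estimates above.
ownership_cost_per_month = 1000.0  # all-in cost of a $45,000 Toyota
robotaxi_price_per_hr = 10.0
my_hours_per_month = 30            # my actual driving time

breakeven = ownership_cost_per_month / robotaxi_price_per_hr
print(f"Owning beats robotaxis only above {breakeven:.0f} hours/month")
print(f"At {my_hours_per_month} h/month, robotaxis run "
      f"${my_hours_per_month * robotaxi_price_per_hr:.0f}/month vs. $1,000 to own")
```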

One last thing. The auto industry asked me to do some research into customer acquisition costs. I went around the country asking “are you ready to get into a car without a steering wheel?”

“Hell no,” is the typical answer I got. One guy in Kansas told me “I’m a narcissistic control freak and there’s no way a computer is going to drive me around.”

Google’s head of R&D told me they had the data to prove that after three rides in one, even that guy changes his mind. So I asked a second question: “What if the car drove to you, and then you had the choice of driving it or not?”

Almost everyone, including that guy in Kansas, said, “Yeah, I don’t have a problem with what it does when I’m not in the car.” So Tesla will have a very low customer-acquisition cost, if any at all (everyone knows a Tesla is fun to drive).

Kraft Foods execs once told me they spend $34 to acquire a young customer for their cheese (through advertising and other techniques). So I imagine that Waymo (Google’s robotaxi service) will have to spend a lot more than that, I figure more than $100, for a while to get people to try its system, which has no steering wheel and is totally autonomous (it just started working without a driver in Phoenix, Arizona, and San Francisco).

Plus, Tesla has a huge advantage in brand too over Waymo.

Translation: Cathie Wood is right. Robotaxis will make a crapload of money for Tesla. Now, here’s the rub, and why Tesla’s valuation is so high (if you thought it was just a car company, it would be extremely overvalued): a robotaxi system doesn’t need many cars on the road. Uber has something around a million drivers worldwide. Tesla could build that many cars for only a few billion dollars. Plus, its owners have already funded the building of more than a million, and with the Cybertruck on the way, I expect to see it sell many millions more.

So Tesla has the brand, the distribution, the consistency, the low customer-acquisition cost, and other advantages (like a Supercharger network that let us drive ours across America) to make a ton of profit PER CAR. That is what I’m betting on, and it’s what Cathie Wood is betting on too.

Thanks to Brian Roemmele who has been talking to me about this for more than a year. He sees ahead better than anyone else I track right now.

Now that my career is over…

Reading “The Scarlet Letter: A Romance” back in high school in the early 1980s, I never imagined I’d have to wear a letter of shame on my chest while moving around Silicon Valley (don’t understand? Just Google my name and you’ll see my “Letter A”).

That book tells the story of Hester Prynne, who conceives a daughter through an affair and then struggles to create a new life of repentance and dignity, says Wikipedia. Hester’s “Letter A” stood for adultery.

My “Letter A” stands for that, along with abuser and, mostly, asshole. Shame is a heavy thing to bear, for sure. A little heavier than those new Apple headphones. 🙂

Lately, I’ve come to accept that I’m an asshole and that my career is over.

One part of that acceptance is coming to understand that my shame has given me so many gifts, including a 9,000-mile road trip with my kids and a life that doesn’t include traveling. Now that travel is gone, I see just how hard it was to stay sane on the road. I’m so happy we did that road trip in 2018.

Last year was pretty tough, though. At Christmas, Maryam and I remembered all the family and friends we had lost. The year took my dad and one of my best friends, along with five others I used to know. I have a feeling more losses are ahead, as COVID keeps climbing exponentially; yesterday alone America lost 4,500 people. On top of that, I was still learning to deal with my new roles, since the kids were home all the time and Maryam was home for the first time in a number of years too. I am tasked with getting them up most days and onto Zoom school, which is what I call distance learning, since both kids are on Zoom.

There was also a problem popping up in my mental health: depression got a deeper hold on me.

Some days I couldn’t get up. I just didn’t want to do anything. And when I was up, I had a ton of household and emotional work to do, which took my energy. Of course that strained the relationship between me and my company cofounder, Irena Cronin. I was unfair to her, and I was sinking because I just couldn’t tell anyone what was going on.

My wife asked me to see a therapist again last fall because she could see I wasn’t dealing with life well. I did, and he put me on depression medication. That helped me a lot. In just a few months I’ve lost more than 20 pounds, now more than 50 down from my high four years ago, and it feels like the medication filled in a bunch of potholes in my soul.

Another part of what I didn’t like about consulting work is that my stock market investing was doing a lot better than the work I was doing with Irena. The last year was extraordinary in the market and brought us good fortune, even while my mental health suffered. Over the last year my investment returns were 15+ times higher than my other income, despite my writing a critically acclaimed book about Spatial Computing and doing a lot of consulting for companies big and small.

Which brings me to what is next. In talking about the changes with tons of developers and entrepreneurs, I see that they are going to be extraordinary and quick. Autonomous cars will become much more available in 2022. Tesla’s Cybertruck will be next-level in all ways; so much new tech is being readied for it. Apple is readying what I have heard is the biggest product introduction of all time, so big that it will come in two parts. Part one comes this year. More on what Apple is up to soon. It’s big, and most people, even the smart analysts, haven’t figured it out yet.

Anyway, that brings me to my next project: a science fiction ebook about what life is like in 2022. The premise of the book? That the next 24 months will see more new technology ship than human beings have ever shipped, and that deep change is about to hit. Not to mention that you will give companies a LOT of new data through these things. Privacy will radically change over the next 24 months due to new devices from Apple, Facebook, Google, and others.

I always wanted to try my hand at science fiction, but most science fiction, like Star Trek, depicts a future that’s either way off or unattainable. I wondered if I could write about something much more short-term: a family that gets 2022 technology early. That morphed into a fictional neighborhood that Apple, Tesla, and other companies are using to test out new devices, services, and more.

I’m noodling around, still at the beginning. It might massively change as I write the new ebook between now and June. And, yes, my “Letter A” is helping me build characters with some depth of human experience. I’m lucky that I don’t need to make money at the moment, so I can take the time to do that and continue figuring out how to be less of an asshole. It’s a work in progress.

Anyway, now that my online media career is over, it’s OK. I have these new Apple headphones, which really rock. Most people haven’t figured out how much Apple is using neural networks here. Last week I was talking to someone when a lawn mower started up. The guy on the other side of the conversation told me that the lawn mower was “turned off” within a second or two and he couldn’t hear it, despite it running right next to me. Here’s my son, Milan, wearing a pair. He loves them. He hates VR. Which tells you a lot about the kind of human-factors work Apple is doing and the kinds of things we’ll have to go through as we enter a new paradigm of computing.

As I write the book I’ll be online a lot less. Hope to be done by June. As for the “Letter A” I am wearing around town? Well, I’ve found another gift it gives: it helps other people deal with their own troubles. Helping others is the only real way out of the burden it brings. I have a long way to go.

Milan wears Apple’s new AirPods Max headphones.

This full-body MRI scan could save your life

This summer a 40-year-old friend and brilliant software engineer, Brandon Wirtz, died of colon cancer, and my dad died of pancreatic cancer too. At first, neither of their doctors diagnosed them properly (Brandon was frequently getting sick, and my dad kept having more and more problems). Ever since Brandon discovered his cancer, I’ve been taking healthcare more seriously, wondering if there’s a way to diagnose such diseases earlier.

Last week a new clinic, Prenuvo, opened near San Francisco that promises to do just that with a full-body MRI (Magnetic Resonance Imaging) scan. The result is like a high-resolution X-ray, except MRI doesn’t use radiation to make its images.

I was lucky enough to be one of the first to be scanned in its new location (the company has been doing such scans for a decade up in Vancouver, Canada) by founder Dr. Raj Attariwala. I filmed the consultation with Dr. Raj right after my scans were done.

The process? You pay $2,500 and spend an hour inside an MRI machine. For me, it was a chance to hold perfectly still while I listened to the machine whir and buzz around me. Afterward, it takes a few minutes to process your images, and then you sit down with a doctor, as I did here.

Luckily for me, I got a pretty clean bill of health, but you can see this is a powerful diagnostic tool for helping doctors find dozens of problems before they become untreatable, everything from heart disease to a variety of cancers. You can see how Dr. Raj walks through my entire body, including my brain, looking for problems I’ll need to work on. He did find one: my mom had a bad back, and it looks like I’ve been blessed with the same problem. He told me to do exercises to strengthen my core muscles to minimize it in the future.

In talking with cofounder/CEO Andrew Lacy, I learned the company has developed its own MRI machine to do these scans. He told me that most other MRI machines are used only for specific body parts, usually after a cancer or other problem has already been found. Prenuvo, he told me, has modified the software running the MRI machine to do specialized full-body scans that other machines can’t do easily. His team is also using these images to build machine learning that assists the doctors in finding various problems, and that supports its plans to scale this to more people over time (the San Francisco location has two scanners that can handle two people an hour; the company plans to open more locations and do more scans per hour, but that will require more AI work and training doctors to look for problems when they are early-stage, rather than late-stage as they usually see).

For me it was amazing to see inside my own body for the first time, and the company gives its customers all their scans in a mobile app that you can explore on your own time later. It also sends the scans to your primary-care physician, or to other doctors for second opinions.

You can learn more about this service at https://www.prenuvo.com.