An Announcement, and Why the Tesla Humanoid Robot Matters

Irena Cronin and I are seeing huge shifts coming as autonomous vehicles reach the point where they are driving around San Francisco without humans. We recently started comparing notes and found we are seeing the same trends.

So, today we are announcing that I am rejoining Infinite Retina as Chief Strategy Officer. We are bringing new understanding to entrepreneurs and product strategists on Augmented Reality and everything related to it, including AI and Computer Vision. Here is our first analysis (located at: [url]) on why we should all be paying attention to what is happening in humanoid robots and consumer electronics, which include autonomous vehicles that are now arriving in people’s garages and, soon, Augmented Reality devices from Apple and others.

Tomorrow Elon Musk will step on stage and show us the latest AI and robotics. We think this is a much more important announcement than most people are expecting, and here’s an analysis of just how deeply Optimus (Tesla’s humanoid robot), and other humanoid robots, will change all of our homes.

The last time Irena and I collaborated, we wrote a book, The Infinite Retina, that Qualcomm’s head of Augmented and Virtual Reality, Hugo Swart, reviewed as a “must read.” This time, in addition to consulting, Irena and I are doing new analyses in the form of a paid monthly product on Augmented Reality topics – one for people who love Augmented Reality devices, automated electric cars, and other products that make life more fun and better.

Tesla Robot: Consumer Strategy 2028

“Knock knock.”

“Who is there?”

“Your pizza delivery robot. If you invite me in, I can set up your table and do other tasks.”

It will be the first time a product introduces itself to consumers at their front doors, and once inside it will bring a wholesale change to all the brands in the home. Most of them will go away. The robot will – over the years – replace “old brands” with “new brands” that do the same thing, but better. It’ll even change the showerheads to new models to save energy and water.

Skeptics are right to point out this won’t happen soon. But by 2028 we expect such a robot will be in people’s homes and the Robotaxi (think of Uber without a human driver) will demand the inclusion of a humanoid robot that can do things like deliver dinner or groceries.

Tomorrow Tesla will give us a taste of how advanced its robotics program is, how likely we are to get a humanoid robot that helps us at home in five years or less, and how well it can learn to do new jobs in the factory first. It also could explain the business model and why many Tesla owners will want a robot in their home (it could be a key piece of the Robotaxi network – plugging in cars to charge them and getting them back on the road).

There will be other insights, too. 

The catalyst to write this analysis is that we are both seeing signs of a changing consumer, due to Spatial Computing technologies like autonomous vehicles, Augmented Reality, and, particularly, robots.

If you buy into the premise that we are about to see changes in the technologies that go into robots – the AI, the electric motors, the sensor arrays, and even in how humans are living – then you will accept that interacting with the robot will change people: instead of deciding on, say, the brand of soap used in the home, a person will let the robot decide. In our research we’ve found that humans will accept these kinds of changes faster than most consumer products companies believe they will.

These changes go far beyond showerheads or the soap brand you use to wash your clothes, though.

The robot brings with it a bunch of new technologies that could disrupt even Apple, Google, or Amazon, and it will soon start bringing service after service to your home.

The robot brings other robots. (The autonomous vehicle – itself a robot – will bring the humanoid robot to your home, which will bring in other, more specialized robots. This turns everything into a service.)

That statement alone brings radical shifts to the economy. 

Why hasn’t this happened yet?

  1. Until now robots were too expensive to be used for general consumer uses.
  2. No distribution or business model existed to help homeowners afford a fairly expensive new machine. How many homes can afford to pay $50,000 for one?
  3. The AI or software that controls robots was also very expensive and specialized. A robot at Ford’s plant in Detroit puts windshields into trucks every minute. But it can’t fold laundry. The humanoid robot could do both tasks, which points to similar changes coming to workplaces and factories. Our writing here focuses more on the consumer changes, but our previous book covered both and we bet the same will be true of this newsletter in the future.

All three issues holding back humanoid robots are going away at a pretty fast rate. 

All of the technologies that go into a humanoid robot are coming down in price at a fairly constant rate, and they are becoming more capable at about the same rate, so you get an exponential improvement in the number of things a robot can do over time. There are already many robots that do everything from vacuuming floors to cleaning windows to picking weeds out of your garden. Plus, the efficiency of the computers and motors that drive a robot’s hands and legs is getting better over time, so we can now see it doing real work for hours on one charge.

Back to the autonomous vehicle and its role in turning everything into a service: it will bring other robots to the home.

Once cars start driving without a human in the car, something that GM’s Cruise, Waymo (spun out of Google), and others are already doing in San Francisco, California and Phoenix, Arizona, then the car can bring other robots AND let the robots be shared among a number of different houses, which defrays their cost.

This piece – how to get robots into homes, and paid for – is what previous robotics companies, now out of business, like Willow Garage or Giant AI, were missing. A $50,000 robot isn’t an expense many can afford, even in richer neighborhoods. The autonomous vehicle unlocks a new business model of turning everything into a service and sharing the robot’s cost amongst many homes.

Autonomous vehicles will, alone, bring an upheaval as consumers move away from owning cars and toward “transportation as a service.” What does that mean? The best example today is Uber. You pull out your phone and you order a car. You pay for what you use. If you only take one trip a month to the local shopping mall, you’ll pay $20. Far less than the $400 a month a new Toyota costs. 

The humanoid robot could do the same. It could do your laundry for $100 a week, then move next door, where it could do the same for your neighbor, collecting another $100, and so on. And if it can do laundry, it can do a lot more in your home or even your business.
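Some back-of-the-envelope arithmetic shows why sharing defrays the cost. This sketch uses the $50,000 robot price and the $100-a-week fee mentioned in this piece; the number of homes served and the service lifetime are our illustrative assumptions, not figures from any company:

```python
# Back-of-the-envelope: sharing one humanoid robot across many homes.
# $50,000 robot price and $100/week fee come from the text above;
# homes served per week and service lifetime are illustrative assumptions.
ROBOT_COST = 50_000        # purchase price (from the text)
WEEKLY_FEE = 100           # what one home pays per week (from the text)
HOMES_SERVED = 10          # assumption: homes one robot serves each week
LIFETIME_WEEKS = 5 * 52    # assumption: five-year service life

weekly_revenue = WEEKLY_FEE * HOMES_SERVED
weeks_to_pay_off = ROBOT_COST / weekly_revenue
lifetime_revenue = weekly_revenue * LIFETIME_WEEKS

print(f"Weekly revenue: ${weekly_revenue:,}")              # $1,000
print(f"Hardware paid off in {weeks_to_pay_off:.0f} weeks")  # 50 weeks
print(f"Five-year gross revenue: ${lifetime_revenue:,}")   # $260,000
```

Under these assumptions the hardware pays for itself in under a year, which is the whole point of the shared-service model: no single household has to write a $50,000 check.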

When you add autonomous vehicles, humanoid robots, and other major technology shifts like Augmented Reality and virtual beings that will arrive in 2023, you see not just an economic upheaval but an almost complete change in what it means to be human.

The last time we (Irena Cronin and Robert Scoble) studied the market together, the result, The Infinite Retina, earned a “must read” review from Qualcomm’s head of augmented and virtual reality products, Hugo Swart (Qualcomm makes the chips inside everyone’s headsets other than Apple’s). We reconnected recently after realizing that we were both seeing the same trends, from different points of view, that very few others were seeing or studying.

Why now?

There are multiple autonomous vehicle companies now driving around without humans. Yes, not many cities yet, but that will change. 

That, alone, sets up deep changes to economies around the world as more passenger miles, shipping, and other services change from human driven to AI driven. When it also brings humanoid robots into the home, while Apple brings Augmented Reality to the home at the same time, we see something far more profound happening than we saw when we wrote The Infinite Retina two years ago. 

Welcome to the “everything as a service” world and stay tuned to insights from both of us. 

Why now? Because Tesla is updating the status of their Optimus humanoid robot and possibly demonstrating an early version of it on September 30, 2022.

And, yes, the Optimus will push the doorbell button instead of knocking, if you have one.

Life with Tesla Optimus

The first Tesla Cybertrucks will already be a few years old when Tesla’s humanoid robot reaches the first waves of consumers, but what will it do when it arrives in 2028?

Well, first of all, we need to talk about the Cybertruck. By 2028 it will be driving around most cities without a human in it, along with other vehicles Tesla makes. When that happens, the necessary preconditions for humanoid robots will be in place. One robot will walk off the production line and jump into a waiting Cybertruck, which will bring it to people’s homes. Others will go into crates to be shipped around the world to both manufacturing and home users. Once released from their crates, they will be able to hop into a Tesla or other vehicles.

How many more years after that will you see Tesla robots everywhere in Western society? 2030? Certainly by 2035. 

They will help you load things into your truck at your local Home Depot, loading even heavy sheets of sheetrock. In fact, Home Depot could order many humanoid robots for each store. Such a store would quickly become nicer than Lowe’s, if Lowe’s doesn’t also have the same robots.

Which leads to a lesson: every business soon will have to change deeply to attract partnerships with Tesla and others who will want to compete with Tesla as we move into the “Everything as a Service” world. More on how we see businesses changing later.

Let’s go back to that original pizza delivery. What needs to happen to make that happen?

  1. The robot has to be made.
  2. The AI has to be capable enough to go into, say, a Round Table pizza restaurant, and be able to talk with the humans there who are behind the counter — “Hi, I’m here to pick up two large pepperoni pizzas for Irena Cronin.”
  3. The robot has to be able to get to Round Table, get out of the vehicle, walk over any obstacle like mud, grass, dog poop, curbs, sand, stairs, etc., and get to both the counter at Round Table as well as the front door of your home while carrying the pizzas in a thermal pouch to keep them piping hot.

If it just did that, it would unlock the business model. But, the robot also has to be programmed to interact with people. So, it has to understand people deeply, and, even, have a personality to get to its fullest potential.

Why? Trust.

Would you trust a robot that just did what you told it, with no personality? Not as much as if it talked to you in a human way and, even, entertained you. Adrian Kaehler discovered this while running Giant AI (now out of business, but it was working with factory owners to build a humanoid robot run by AI, just like Tesla is). He found that when they made their robot look, and act, more like a human, people accepted it more readily than their earlier prototypes that just looked like a machine with hands.

Trust will soon be the most important score companies track about themselves and their products and services.

Consumers’ attitudes toward computers doing intimate things with them, like cooking together in the kitchen, will soon deeply change because of the autonomous vehicle’s impact on them.

Turns out once you trust a computer to drive you around the world, you change as a consumer – you become far more likely to let an AI run your life after that, since you realize that the AI doesn’t kill you while driving you around. After that, trusting a company to have a robot in your home doesn’t seem nearly as far-fetched a proposition as before you put your life in a robot’s hands (autonomous vehicles are technically robots, too).

So, what will you trust your humanoid robot to do? What will its day be like?

Well, laundry, dishes, shopping, cleaning, gardening, security, maintenance, and, even, saving your life. Future robots will be able to perform CPR on you, saving your life if you have a heart attack in your home. It can call 911 while it is doing that too. The operator might not even realize he or she is talking to a computer. “Hi, I’m a Tesla Optimus calling on behalf of Mary Smith and I’m currently performing CPR on her, and she is showing symptoms of having a heart attack. She has a pulse, but it is a weak one.”

Not every day will be as dramatic for the robot as saving a life, though.

In fact, the first robots we see will be pretty simple. What is the low-hanging fruit they will pick first? Deliveries! Yes, your first humanoid robot will probably arrive at your door with some pizzas or groceries.

Take a Ride in the Future

A truck is rolling up with your pizza delivery. Delivery is the first thing the humanoid robot will do as a service. Why? If you can’t deliver pizza, you can’t deliver anything else. So it will be how many people have their first encounter with a humanoid robot.

You are watching from your home’s front window as a human-looking robot hops out of the passenger seat, walks to the back of the truck, and grabs a heated and insulated bag of pizza from the cargo area and starts walking up to your house. 

You had heard they were coming from TikTok videos shot in other neighborhoods. You knew they could talk with you, show you things on their screens, and ring your doorbell, but so far, even though they move gracefully and quickly, they can’t yet enter the home. This is just an introduction to the robot. Even so, the experience is so different and unique that people record their first meetings on their glasses and phones and share them on social media or through direct messages to family members and friends. “It’s here.”

“Hello Irena Cronin, we have two pizzas for you. Here is what they looked like when they were put into the box.” (The robot’s face turns to a screen where it shows both pizza photos being put into the box).

“Hope you like your order. Today I can’t come into your home, but starting next week, we can do some common tasks in the home for a low monthly price, but we’ll do the first six months for free. Washing dishes and doing your laundry. Plus we can monitor your home for you, making it safer and more efficient. All if you want, of course. If you do, download the “Tesla Robot” app to your Augmented Reality glasses (a barcode appears on the robot’s face). It has been a pleasure serving you and your family. You can call me anytime with the Tesla app. Thank you.”

“It’s weird talking to a robot who just handed me pizza.”

“We get that a lot. Hey, it’s weird for us too! We had to figure out how to get here without ever being here before.”

The AI inside the robot has everything published on the Internet, and quite a few other data sources to pull from in milliseconds. A conversational AI is already planning out potential things the robot can say to you in response to what you say to it. It knows how likely you are to laugh at its jokes before it tells you one. If you laugh earlier or harder than usual, that will be noted in a database about your humor preferences.

But let’s not get into the fun and games yet. The first robot is there to serve a business and make a profit, not just tell you jokes. The first business is the pizza delivery service.

It will be followed by thousands of services, all controlled by you through the Tesla app on your glasses, or the same app on one of your older tablets, phones, or computers. As long as you are within earshot of the Tesla Optimus, and as soon as it verifies your identity – which usually happens before you even start talking – you have complete control of it. Particularly if you own a Tesla vehicle, since you already are running the Tesla app full time to control your vehicle. If you are an owner, you have a virtual robot in your Augmented Reality headset that looks, talks, and walks exactly like the real robot. When it walks next to you, you will think you have a real robot alongside you. At least until you say something like “Hey, Tesla, can you change my robot into an elephant?” If you have Augmented Reality glasses on, why, yes it can!

To make a business, there are a lot of boring steps that have to happen before the robot walks up and knocks on your door or rings your doorbell. The robot has to walk into a Round Table pizza shop, wait in line like a human, introduce itself to the person behind the counter, and ask for Irena’s pizza.

Also, when it is walking from the curb or parking space to your front door, it has to navigate many different things. Some people have stairs. Some people’s front doors are across weird bridges, some made out of rock and wood, others even rope. We have visited homes all around the world, in China, India, Israel, South Africa, and many European countries, along with homes in Canada and Mexico and have seen this. 

Yet others might require walking across some dirt to get to the front door, or navigating past a security guard keeping people who aren’t residents from entering an elevator to the Penthouse Suites. And we haven’t even talked about snow or ice that such a robot would need to navigate without dropping the pizza. 

That, alone, will require huge computer science efforts that cost many billions. Many of those billions have already been spent by teams building autonomous vehicles at places like Google and its Waymo spinout, Apple, Tesla, Mercedes Benz, Amazon’s Zoox, Nuro, GM’s Cruise, Aurora, Wayve, and a few others. But moving a robot through a chaotic environment to your front door will require billions more. Some people live in places with huge crowds right outside their front doors; others live in the middle of forests. A robot will need to navigate all that and interact with people along the way. Every interaction is a potential customer, so it has to be nice, funny, and trustworthy, all in an attempt to win customers.

Just talking their way past security guards and doormen is quite a challenge for human delivery people. Getting up to apartment 4B isn’t as easy as it looks sometimes. Humans often have to call a resident to validate that they really wanted a pizza delivery. The robot can do that automatically, showing the resident a video on its face – and if it uses one of the new 3D screens that have been demonstrated, it can show what something looks like in 3D on the screen that is its face, including your face upstairs as you wait for your pizza: 3D telepresence both inside and around a robot.

The big business idea is that the robots (self-driving cars) will bring other robots (humanoid robots), which then will bring other robots (for specialized tasks like vacuuming, cleaning windows, snowblowing, gardening, and probably a lot more).

But for the first humanoid robot that gets into the home, there are also other things it can do in addition to delivering pizza:

  1. Make the home and the people living there more efficient energy users.
  2. Give time back to family to do something better with.
  3. Build a buying club so bulk pricing lowers cost and improves quality of everyday things.
  4. Introduce new kinds of healthcare and other lifestyle services into the home, improving the health of everyone in the home (it can make better quality food, too). It can monitor your health just by walking by you. Imagine you run by one on your exercise routine and it cheers you on just like my family and a variety of strangers did while I ran marathons in high school.
  5. Improve the safety and security of the home (it can be a sentry at home all night long, noting various problems before you wake up).
  6. Make sure you stick with its service and that you don’t kick it out of your home. 
  7. Optimize the home, even tracking what clothes you wear by whether they disappeared from the home during the day.
  8. Introduce new experiences to the home. The robot could say “Hey, the robots are gonna watch the Beyonce concert tonight (we’ll even be part of the concert). You wanna come?”
  9. Introduce new bartering systems with your neighbors. Trading of food or, even, tools. “Hey, I’ll pay $5 to borrow a screwdriver.” The robot can arrange all sorts of things to be moved around the neighborhood.

Once the robot gets access to the home it can start optimizing it. Looking for things that could be improved. It also is paying attention to the humans in the home, and is building an internal database of things it learns about you as it watches you. “The human Andrea Kaplan likes eating Cheerios at home at about 7 a.m. every morning.”

In the future this knowledge will make it possible to personalize everything, particularly in relation to the robot. If you have a relationship with the robot, even a cold business only one, it could notice you like Cheerios so it has a bowl, spoon, and your Cheerios and milk on your dining room table waiting for you at 7 a.m.

Of course that means it needs to open drawers and find your spoons, and open the refrigerator and find your milk. Even if it is just doing this simple task, isn’t it also making a database of every product next to these things too? Of course it is, and that alone will teach the AI a lot about your personality, your likes, and, even, your belief system, bringing massive changes to what humans believe about privacy.

Why? Well, imagine having a robot that comes into your house but doesn’t talk to you in a fun way – it just does the laundry silently, saying few words. Are you likely to take it with you to your friend’s house? Or to a Home Depot to help with picking up some things for your home improvement project? No.

So, we predict it will talk frequently with you about various topics, and, even, high five you at appropriate times. Why? If you feel it really knows you and entertains you, then you will learn to trust it with, say, doing the groceries. 

It is this trust that is worth trillions of dollars as the robot takes on more and more things around you, turning them all into services. 

First Ten Years: Owning the Delivery and Home Services Markets

Expectations for Tesla Optimus

Here are the parameters, among others, that the Tesla Optimus will need to meet our expectations before it can operate in people’s homes, and we will be watching the Tesla AI event for how well it can do each of these:

1. It should lift at least 55 lbs. Why that much? That is what you can pack and check on an airline. It might need to assist someone loading such a bag into a trunk.

2. It needs to be very quiet. Even when moving around you should never hear it, or only when it has to open a drawer or a door. On the other hand, that might be unnerving for people, so “Hey Tesla can you play some music while walking around?”

3. It needs to be able to communicate with humans via voice, hand signals, and a screen on its face, switching modes to whatever the human prefers. For instance, the robot could switch to sign language for a deaf customer.

4. It needs to walk fast enough to keep up with a human entering, say, a Round Table Pizza. Oh, heck, Boston Dynamics has robots that do parkour (jumping off of buildings), so maybe we need a little more than just a slow walk, no?

5. It needs to be able to get into, and out of, a Tesla vehicle, including putting on and taking off a seat belt. For extra credit, it could “assist” the car in driving tasks, for instance by using its higher resolution cameras to see further and gather better data to more accurately predict the speed of oncoming traffic.

6. It must figure out how to either knock on the door (without leaving a mark) or ring the doorbell.

7. It must be able to carry a package of goods, such as pizzas, from the cargo area to the front door while always keeping them horizontal. Same with a cake. Same with eggs. Can’t break anything or drop anything.

8. It must show the beginnings of a personality with the ability to entertain and delight. In other words, it must have conversational skills that so far computers haven’t demonstrated.

9. It must prove that it will be able to bring more services into the home than is possible otherwise (the business model of the robot bringing other robots).

10. It must demonstrate that it will never hurt humans, children or animals.

We’ll also be watching for skills that will be needed in both factory work and home service work. For instance, can it install a towel rack at home? The skills it would need are similar to those for putting an electric motor into a vehicle on an assembly line.

Why wouldn’t Tesla own the delivery and home services markets if it delivered a humanoid robot that does all that?

Data, Data, Everywhere

Our thesis is that the biggest dataset wins a lot.

It isn’t just our thesis, either. Many strategists at many companies are trying to find new sources of data. NVIDIA laid out the ultimate end of this strategy: one data system that drives everything – robots, autonomous vehicles, Augmented Reality, and virtual beings.

We call this new strategy “the data hydra.” NVIDIA’s Omniverse is the best laid out example, but similar systems are being built at Tesla, Apple, Google, Niantic, Meta, and Bytedance, among others.

On September 20th, 2022, NVIDIA announced new features of its Omniverse. At its heart is a simulator that lets AI teams train the system to do new things, or study a mistake it made by walking around the intersection where an accident occurred.

This hydra needs the data to build a digital twin of everything. What is a digital twin? Think of it as a very faithful digital copy of the real world. Our factories, malls, parks, and other things will soon all have at least one copy. In some places, like Times Square, we can see that there will be hundreds of millions of copies. You could leave pictures or videos of your family on top of this digital twin. And that is just the start.

Lumus, an optics company building the displays that will be in future Augmented Reality glasses, showed us that by 2025 this digital twin will let us watch a concert in a new way. All around our couch will be a music festival and, thanks to Spatial Audio, it’ll sound concert level too. In some cases what people hear in their AirPods Pro will be better than what they would hear at a professional concert, even a high-end one like Coachella. Augmented Reality headphones there “augmented” the audio, making it better. You could turn up the bass, for instance, remove crowd noise, or turn down the concert to a more acceptable level. Business travelers already know that the best noise canceling headphones block out a screaming baby in the seat next to you.

Adrian Kaehler, a computer vision pioneer who worked on the first autonomous vehicle at Stanford and was an early key exec at Magic Leap, started a humanoid robotics company, Giant AI. That company failed to get enough funding. Why? Because if you start analyzing any job that a robot might do, you can see that a humanoid robot that can walk around, learning on its own, will decimate the others.

Where Giant needed to show its robot a task six or so times to “teach” the AI, like putting material into a machine, the Tesla robot – thanks to this data advantage and all that it brings – will learn after one time, or will “reason” through it. After all, Tesla’s AI can drive down a street it has never seen. Some joke that it will learn things from watching YouTube, but our kids are already learning that way, so the AI can too. We no longer laugh. The AI ingestion engines at foundation models like Dall-e or Stable Diffusion ingest hundreds of millions of images. Soon we will see these kinds of AIs evolve into new kinds of information interactions (we used to call this searching).

The robot might read you a poem it had generated by another AI, say, GPT-4, all while putting away the groceries. Why? It knew you like poetry and wanted to make you smile.

Let’s go directly to the point: after looking at how fast all the systems that go into building a robot are improving, we now believe the humanoid robot of 2030 will be so good that humans who have one in their home will feel that it is a friend and associate. If they are that good, you will bring one to lots of places to “help” you and your family.

Tesla AI and Its Simulator Advantage

Every Tesla car that gets upgraded with its latest AI stack (which Tesla calls FSD Beta) ends up uploading about 30 gigabytes of data to the neural network in the Tesla cloud (Tesla calls the new version of that “Dojo”).
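To get a feel for the scale that implies, here is a rough calculation. The 30 GB per-car figure is from the text above; the fleet size running the new stack is purely our assumption for illustration, not a Tesla number:

```python
# Rough scale of the fleet data advantage.
# 30 GB per upgraded car is the figure from the text;
# the fleet size is an illustrative assumption, not a Tesla figure.
GB_PER_CAR = 30
FLEET_SIZE = 400_000       # assumption: cars running the new AI stack

total_gb = GB_PER_CAR * FLEET_SIZE
total_pb = total_gb / 1_000_000   # 1 PB = 1,000,000 GB (decimal units)

print(f"{total_gb:,} GB ≈ {total_pb:.0f} petabytes from one upload cycle")
```

Even with a conservative fleet-size guess, one upload cycle lands in the petabyte range, which is why a simulator fed by an entire fleet is such a moat.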

That feeds a simulator that lets researchers “walk around” what looks like the real world. With moving pedestrians, bikers, hundreds of cars, and more. 

It is this simulator that is one of Tesla’s many secret advantages. The simulator shows off the advantage of having huge amounts of data generated by an army of Tesla robots (cars) moving around the world. 

It lets AI researchers train new AI models to do new tasks. In 2021 Tesla introduced an autotagger into the system, which brought about a huge shift in how these systems can learn. The AI already knows hundreds of thousands of objects in the real world and automatically tags anything that it knows well. This speeds up the AI’s ability to start learning automatically.

Which is where we are headed. There are plenty of examples of AI simulations and robots that start with knowing nothing, and over time by trying thousands of little experiments, they figure out how to walk and move on their own. 

Tesla has the advantage of being able to study humans in a new way while driving around the real world. Its researchers have already needed to train its AI models to understand human movement: what the data from a human running, walking, or biking looks like. It needed to do that to properly behave around humans in the streets.

This research will give Tesla a lead when it comes to building humanoid robots. It will use the simulator to train robots to do a wide variety of tasks, long before Tesla makes a physical robot that can walk into your kitchen. 

How Does It Progress Over Ten Years?

The next ten years will see radical change due to Spatial Computing:

  1. Augmented Reality glasses are worn by the majority of people.
  2. Autonomous vehicles are everywhere on our streets.
  3. Virtual beings hang out with us all day long.
  4. Robots of all types are working all around us.
  5. Many homes now have solar AND backup batteries AND an electric vehicle charging station.
  6. AI systems now ingest massive amounts of data every day and can hallucinate back to you complex scenes, along with running everything in life. The AI foundation models that bring us things like Dall-e are going to look very quaint in a decade.

Here are our predictions for the humanoid robot specifically in 2033:

  1. It will have much higher resolution imaging sensors (cameras, LIDARs, etc.) than today. By 2033, cameras on autonomous vehicles and robots will go from the 1K they are today to 32K. That means they can see further, and smaller, things. Where it might have struggled to see a very small screw before, now it can pick it up without any problem. It also means a robot in an older autonomous vehicle will be able to “assist” the original vehicle and see further.
  2. Most tasks in the home will be turned into services by then. It will be able to install many consumer electronics, or even a shower rack in the bathroom.
  3. It takes over the management of the home (laundry, dishes, garbage, security, and monitoring and controlling all lights, appliances, vehicles, charging, and more). 
  4. At homes that have an electric car charging station, the robot will meet an incoming vehicle and plug it in for charging. This will make the Robotaxi system more resilient and let it get vehicles back on the road after a charge.
  5. Robots will run many businesses that only cater to the automated vehicle network (making food that gets delivered to people’s homes, for instance).
  6. An “air traffic control system” running the transportation-as-a-service network Elon Musk calls “Robotaxi” will make sure robots and autonomous vehicles are sent to the right place at the right time. This is difficult: for a large event like Coachella, the control system will need to move thousands of cars from around the Western United States to Palm Springs to move people around there (we visited Uber’s effort at that festival to understand the traffic-control and movement issues involved).
  7. Humanoid robots used to be “janky” because they couldn’t do a wide variety of things well. Those days are gone; AI rapidly learned to get better.
  8. Humanoid robots will be far more advanced at interacting with humans than they are today. It won’t be unusual to have long, detailed conversations with your robot.
  9. The network will be a lot smarter than today. Your robot will know everything that is happening across the world, in real time. It can read every single tweet. You certainly can’t. 
  10. The robot will enable new services in your home, like telepresence that goes way beyond what Zoom makes possible today.
  11. Automatic shopping services are common. Consumers learned to trust their autonomous vehicles with their lives and so will hand over shopping to the robot, which from that point on always makes sure your refrigerator has milk for the kids.
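The “air traffic control” idea in item 6 above can be sketched as an assignment problem: match idle vehicles to ride requests, with surge events like Coachella showing up as many requests clustered in one place. This is a hypothetical toy sketch, not Tesla’s actual system; the fleet names, coordinates, and the greedy nearest-vehicle rule are all our own illustrative assumptions.

```python
# Hypothetical sketch of a fleet dispatcher: greedily send the nearest
# idle vehicle to each ride request. A real system would also reposition
# vehicles ahead of predicted demand, handle charging, and so on.

def dispatch(vehicles, requests):
    """Assign the nearest idle vehicle to each request, in order.

    vehicles: dict of vehicle id -> (x, y) position
    requests: list of (x, y) pickup locations
    Returns a list of (vehicle_id, pickup) pairs.
    """
    idle = dict(vehicles)  # copy so we can remove assigned vehicles
    plan = []
    for pickup in requests:
        if not idle:
            break  # demand exceeds supply; a real dispatcher would pull in more fleet
        # choose the idle vehicle with the smallest squared distance to the pickup
        vid = min(
            idle,
            key=lambda v: (idle[v][0] - pickup[0]) ** 2 + (idle[v][1] - pickup[1]) ** 2,
        )
        plan.append((vid, pickup))
        del idle[vid]  # that vehicle is now busy
    return plan

# A surge event is just many requests clustered at one venue:
fleet = {"car1": (0, 0), "car2": (5, 5), "car3": (100, 100)}
surge = [(4, 4), (1, 1)]
print(dispatch(fleet, surge))  # -> [('car2', (4, 4)), ('car1', (1, 1))]
```

Greedy nearest-vehicle matching is the simplest possible policy; the interesting engineering in a real Robotaxi network is forecasting demand so vehicles are already nearby before the requests arrive.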

What really gets fun is when you mix robots and autonomous vehicles together with Augmented Reality glasses. That brings effects that will be hard to predict. After all, once movie cameras and cinemas were invented, how many more decades did it take for Star Wars to show up?

But we can imagine people inviting a few robots over on a Friday evening with their friends; the robots will serve dinner and then perform a short skit for entertainment afterward. You’ll wear your Augmented Reality glasses, which will “dress up” your robot as different characters. Immersion improves dramatically when your robot can hand you things while you are inside an immersive experience. This is how 2033 could be very weird compared to today.

The Big Picture

What are the possible larger impacts of the Tesla Optimus? With the Optimus comes an increase in home services spending, as well as an opportunity for Tesla to control the complete supply chain of products that Optimus uses in the home.

The increase in home services spending comes from consumers buying the services that Optimus can do – those services that a person does not have time to do, or just does not want to do. Optimus can serve the same kind of function that a housekeeper or maid does, but can handle more work at once and for much longer stretches of time.

Additionally, Optimus can do things in the home that a housekeeper cannot, such as run diagnostics on major appliances to gauge how they are performing and whether they are running efficiently. It could also do this for the Tesla car in a person’s garage, as well as ready it for use in the morning, which is really useful to tired, hard-working families and professionals.

In addition to these functions, it could serve as a very energetic handyman, plumber, housepainter, etc. Doing all these services and replacing traditional professionals significantly changes the dynamics of the home services market. This disruption has the potential to substantially enlarge that market, thanks to Optimus’s efficiency, superior attention to detail, and physical strength.

In terms of how the Optimus would be made available to consumers, there would probably be several channels. One possibility would be for a company to buy several Optimuses and rent or lease them out. Another would be direct purchases by upper-class families, and a third could be community Optimuses bought by homeowners associations (HOAs), neighborhoods, or cities.

In the process of its work, the Optimus will be using cleaning products and home-improvement and handyman goods. For ease and scale, Tesla has the opportunity to make direct deals with the companies that provide these products, since it would need them in bulk. In this way, Tesla could control the complete supply chain of products and goods for the Optimus; companies that make them would line up to be included since the sales volume would be so high.

While it is difficult to assess right now how big the potential market for the Optimus will be, it would encompass a large majority of upper-middle and upper-class people with families, as well as single, childless professionals, first in the U.S. and then in other parts of the world.

The economic impact that the Optimus brings, even assuming only mid-market penetration, will be significant. Why? Because the potential market is so big. And because the Optimus can do such a wide range of tasks, it will be more efficient than individual specialists, consolidating and increasing demand for its services. The Optimus can go to a home and perform, in a single visit, many varied services that would usually take four or five different kinds of workers. People who were never enticed by certain home services before would now take advantage of having them done. Why not? If the Optimus can cook, mow the lawn, paint, babysit, diagnose electrical issues, and so much more, it is very convenient to have it do many varied tasks during any visit.

What is the impact on the workers the Optimus replaces? Yes, this has the potential to put many different categories of service workers out of business. Robotics and automation tend to have that effect in all kinds of areas of life. We don’t have an answer as to what will happen to the displaced workers; we only know that it will happen.