As Apple puts the finishing touches on its new augmented reality headset, expected later this year, I’ve been tracking innovation in music: Spatial Audio/Dolby Atmos. Why? Dolby Atmos will be a huge part of the announcements Apple is going to make. It will also be very important in the future of the “metaverse.”
Last year we got a new Sonos system that plays Dolby Atmos (new spatial audio/surround sound/better quality). Since I sold audio gear in the 1980s, it’s amazing to me that you can now feel like you are in the middle of a concert. Apple’s headphones, which I also have, support Dolby Atmos too, but don’t really get you the surround sound or the bass of our $3,800 Sonos system.
While watching group forums on Facebook and elsewhere I see lots of others are getting new audio systems that play Dolby Atmos. Movies have played Atmos for years, but music services started sharing Atmos less than a year ago.
The problem is finding Dolby Atmos music.
For instance, Apple’s “Rock Spatial Audio” list has 99 songs. Nice start, but I got bored very quickly. So I started collecting my own. My rock list has 1,131 songs and my hard rock list has 166 songs. Finding these is very difficult. Why? Some albums only have one song done in Dolby Atmos. So you gotta go through each song one by one, and you need to know where to look to find new ones.
None of the services are doing Dolby Atmos fans, like me, justice. I’m on all of them that support Atmos (Tidal, Amazon, and Apple) and even some that don’t support Atmos (like Spotify and YouTube Music).
It makes you wonder why the music industry is hiding its biggest technology advance in decades. When it comes to Apple, I’m pretty sure it is readying its own Dolby Atmos music service for its new headset. But Amazon? Its UI is horrid. Worse, all services have really shitty search engines.
Anyway, Dave Winer regularly writes that blogs let authors route around big companies. This is exactly what is going on here. Now, I know 99.99% of people don’t care. That’s fine. You will when you get new surround sound headphones next year. If you still are reading, just remember that this post exists so when you do start to care about music quality you have a resource to go to.
Anyway, here’s the master list of my 60 playlists. I’m breaking them into two sections: “curated” and “catalog.” Curated means I built the list after listening to every song; I built these for my own home and they are what I listen to every day. Catalog means it’s just a list of everything I can find (like my rock lists) without any concern for quality.
If you use these, make sure you see the Dolby Atmos logo. If you don’t see the logo when playing, then you aren’t getting the full Atmos experience (you might need to turn it on in your phone’s settings, or upgrade your equipment).
So, let’s start with “curated.” The first link is to Apple Music. Amazon has a lot less music in Dolby Atmos format and I have only moved over some of my playlists (they take hours to move over because Amazon has far less Atmos).
I include the link here because Amazon sounds better than Apple. Even on Apple’s own headphones. Why? Because it is using a new version of Dolby Atmos that Apple and Tidal aren’t yet using.
1. Chill Together. 161 songs. This is music that Maryam (I’m her husband) and I like listening to together. Nice and calm music.
2. Dolby Atmos Nightclub. 452 songs. The opposite of Chill Together. Lots of explicit language and mostly hip hop/rap. Loud, obnoxious. Rattles the subwoofers. (Amazon)
3. Dolby Atmos Party. 160 songs. None of the explicitness of the nightclub, but still fun beats to get people dancing. (Amazon)
4. Dolby Atmos Radio. 1,472 songs. Music that’s great to listen to all day long. No explicit stuff, but a wide variety of songs. (Amazon)
5. Dolby Atmos Speaker Demonstrations. 84 songs. The best of the best. I did this list to show family and friends what Dolby Atmos is all about but I found it’s great to keep going back to whenever the software in my speaker system upgrades. (Amazon)
The rest is what I call “catalog.” In other words, genres or other things that don’t have editorial input from me. Here I go for completeness, not quality. Usually I try to stay with Apple’s own categorization.
About 11 years ago I was standing outside in the snow in Munich, Germany with the CTO of a small company, Metaio. He was showing me monsters on the sides of buildings. Apple later bought his company. It got me interested in augmented reality and its uses to make people’s lives more fun and more interesting. The way that first demo happened? The building was turned into an invisible digital twin that the virtual monsters could attach themselves to and move around the building.
Snapchat has, in the past few years, finally brought that tech to consumers with its augmented reality lenses: things that can turn your world into complete augmented reality scenes, way better than what Metaio showed me 11 years ago.
While Snapchat’s invisible digital twin lets developers do very cool things, it leaves me wanting. Why? So far the Snap platform doesn’t give us many computer vision capabilities. It doesn’t really let us do a whole range of things we want our augmented reality worlds to do (like keep track of your keys).
Today Perceptus, from Singulos Research, gives us an important answer to the future of what humans might do with augmented reality, beyond Snap’s filters, which are really designed to make your selfies more interesting. It does this with a new kind of computer vision that can catalog physical items in your home or factory.
Augmented reality has so much more potential to change EVERYTHING in our homes and factories, and Perceptus shows us just how we can use digital twins, computer vision, and AI to make our world better than it was before augmented reality arrived.
CEO Brad Quinton buried the lede when we talked yesterday (a journalism term for hiding an important fact until late in the conversation). About an hour in he let drop “you can play virtual chess without even a chess board.” Then he said you don’t need all the physical pieces, either! The invisible digital twin strikes again.
I’m also not paid by, nor connected with, this company. Although I would work for it for free if asked. Why? This platform has a major impact on the future of things developers will be able to do in our homes.
This does NOT require a headset or glasses. If you watch the video you will see founder Brad Quinton demoing it on a simple iPad. But, of course, this really will rock when we get wearable devices that will enable us to use this kind of augmented reality without holding a device in our hands.
“So, Scoble, what is it?”
It lets you do magic. Just watch the demo in the video I posted or watch the more professional video on Perceptus’ website. It shows the power of digital twins and computer vision in our homes.
Soon devices will augment everything with this kind of computer vision and an invisible digital twin. That’s how Snap’s filters work: they add a digital twin of your face and of the city around you into a database, which developers can then manipulate.
Digital twins are like a 3D copy of the real world.
In the demo Quinton shows, the table is a digital twin. It looks like the real table, doesn’t it? But it isn’t. In this case the digital twin is mostly invisible. Perceptus uses this digital twin to keep track of things like chess pieces and Lego pieces on the table top (and a few other things too, but I’m trying to keep it simple here). Think of it as a new kind of database: one that is laid out on top of the real world.
The magic here is that Perceptus quickly makes this digital twin and figures out what is on top of it. Computer vision running inside says “hey, there’s a rook in a chess game” and keeps track of that rook from then on. Developers can then perform their own magic (for instance, a developer might want to change that rook into something crazy, like a SpongeBob character).
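To make the “database laid out on top of the real world” idea concrete, here’s a minimal sketch of how such a spatial object registry might look. This is my illustration with hypothetical names, not Perceptus’ actual API: each recognized object gets an identity, a label from computer vision, and a 3D position that is updated whenever the camera sees it move.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """One physical item the vision system has recognized."""
    object_id: int
    label: str        # e.g. "rook", "lego_2x4_red"
    position: tuple   # (x, y, z) in meters, relative to the table's digital twin

class DigitalTwin:
    """A spatial 'database' of recognized objects, keyed by identity."""
    def __init__(self):
        self.objects = {}
        self._next_id = 0

    def register(self, label, position):
        """Called when the vision system first recognizes a new object."""
        obj = TrackedObject(self._next_id, label, position)
        self.objects[obj.object_id] = obj
        self._next_id += 1
        return obj

    def update_position(self, object_id, position):
        """Called each time the vision system re-detects the object somewhere new."""
        self.objects[object_id].position = position

    def find(self, label):
        """Query the twin, e.g. 'where are all the rooks right now?'"""
        return [o for o in self.objects.values() if o.label == label]

twin = DigitalTwin()
rook = twin.register("rook", (0.10, 0.00, 0.25))
twin.update_position(rook.object_id, (0.10, 0.00, 0.30))  # the player moved it
```

The point of the sketch: once physical items live in a queryable database like this, an app never has to re-ask “what’s on the table?” — it just reads the twin.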
Don’t focus too much on the two examples Brad demos. The game and Lego organizer aren’t the secret sauce here. They are just examples of things that COULD be built on top of this platform. As I walk around my home I see dozens of things developers could do with Perceptus, from making new musical instruments on top of Coke cans to making all our board games far more interactive and interesting. It could be used for non-entertainment purposes too, like suggesting recipes, or keeping track of things around your home. All my lights have computers in them and they are hard to control. A developer could use this platform to make that much easier. Or could make our Heinz ketchup bottle play games against the salt shaker next to it.
Some details behind the company?
So far it’s self-funded. Quinton says he isn’t raising funds right now, but I’ve learned many times that entrepreneurs are always raising funds even when they say they aren’t. The valuation just hasn’t gotten interesting enough yet! (My words, not his.)
This isn’t his first company: he cofounded Veridae Systems, Invionics, and Abreezio (acquired by Qualcomm), and has 28 patents to his name.
If you have read this far you are really crazy about augmented reality. My kind of people! So, why am I so strongly excited by this company?
Well, we all know Apple is coming with some sort of head-mounted display (the technical term for something like headphones with screens for you to look at). There have been tons of rumors, lots of ideas of what is coming.
I’m not playing that game anymore, except that whatever comes will be the most expensive product launch of all time in any industry. So expectations are extremely high and whatever Apple will do will change the opportunities for developers.
This is a big example of just what kinds of startups are about to come. I’m expecting that over the next 24 months we’ll see hundreds of startups, like this one, created. That has me excited, but even better, I’ve been doing investor research lately and a huge percentage of them are waiting to see what Apple is doing before even considering whether to invest in virtual or augmented reality. I need more time to finish off that research to present real numbers, but it’s already clear after asking 100 investors.
Major companies are already talking to me about augmenting their products in our homes. Tonight I was talking to an employee at Samsung about its appliances and how they would augment them. Then there are consumer products companies like Procter and Gamble. If someone there wants to augment a Tide bottle it’ll need a computer vision platform. Does this work for every use case? The market will decide but it shows me that magic can be brought by developers to EVERY object in our homes. Forks? Yes. Refrigerators? Yes. Board games? Yes. Spices? Yes.
I’m tracking companies that have unicorn potential in this augmented reality world. This certainly is one.
Anyway, the next 16 months will be huge for augmented reality, and this is just another example why. It demonstrates that machine learning/computer vision and digital twins are about to become much more useful to consumers.
First of all, what is my goal? It is simply to find ways to make my systems (which include many headphones, cars, and a Sonos system) sound better and to help you enjoy your system.
There is a bigger goal I have, which is to get the Spatial Computing industry to care a lot more about audio, since music is a foundation for a lot of storytelling and a huge part of the difference between, say, watching the Olympics in person and on TV. Even “silent” films from 100+ years ago have music as part of the story. So what is Dolby Atmos, and why am I so excited by it (enough to make playlists with tens of thousands of songs on them, all in Atmos, and to talk with the music industry frequently)?
Neil Young took me into his studio to teach me about the “gap” between a real-life performance, what was captured on the master recordings, and what people hear coming from their headphones or speakers. On most equipment the gap is huge. But now consumers are getting systems that greatly close the gap. Or could, if they are fed music at higher resolutions and with the ability to build surround sound stages that sound closer to real concerts. That’s where Dolby Atmos comes in.
When you go to a real game, the sound is incredible. So far consumers haven’t had the equipment, nor the source material, to help close that gap. Dolby Atmos closes the gap between a real concert and something you would experience digitally. And closes it in a huge way. It used to be that only wealthy schools could do what my Sonos is now doing.
It gives us several things:
1. It virtualizes speakers, so sound can be placed all around a listener.
2. It turns audio into objects that can be played properly on everything from a $250,000 speaker to the cheap speakers on a modern iPhone.
3. It still includes a stereo render so the music can play on all equipment, even devices that don’t support Atmos (since that’s the vast majority of devices people listen to music on).
4. More bit depth, so sound is better quality.
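The “audio as objects” idea is the key shift: instead of baking sounds into fixed channels, each sound carries its own 3D position, and the player renders it to whatever speakers happen to be present. Here’s a toy sketch of that principle in Python. This is my illustration, not Dolby’s actual renderer (a real Atmos renderer is vastly more sophisticated); it just shows how the same objects can be mixed down to two speakers or twelve.

```python
import math
from dataclasses import dataclass

@dataclass
class AudioObject:
    """One sound source with its own position, independent of any speaker layout."""
    name: str
    position: tuple   # (x, y, z), listener at the origin
    samples: list     # the audio itself (mono)

def render_to_speakers(objects, speakers):
    """Toy renderer: weight each object's signal by inverse distance to each speaker.
    The same object list works for any speaker layout you pass in."""
    length = max(len(o.samples) for o in objects)
    mix = {spk: [0.0] * length for spk in speakers}
    for obj in objects:
        for spk, spk_pos in speakers.items():
            gain = 1.0 / (1.0 + math.dist(obj.position, spk_pos))
            for i, s in enumerate(obj.samples):
                mix[spk][i] += gain * s
    return mix

# Guitar placed to the listener's right, drums behind the listener.
objs = [
    AudioObject("guitar", (1.0, 0.0, 0.0), [0.5, 0.5]),
    AudioObject("drums", (0.0, 0.0, -1.0), [0.8, 0.8]),
]
# Render the same objects to a plain stereo pair in front of the listener.
stereo = render_to_speakers(objs, {"L": (-1.0, 0.0, 1.0), "R": (1.0, 0.0, 1.0)})
```

With the objects fixed and only the speaker dictionary changing, the guitar naturally comes out louder on the right speaker than the left, which is the whole point: position lives with the sound, not with the channel count.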
Putting audio on things, either real or virtual, will be a big deal. The #1 app on Meta’s (formerly Facebook) VR headset, the Quest 2, is Beat Saber, which uses music. But it sounds like crap because it’s 2D music, not Spatial Audio like what Dolby Atmos delivers.
I’m not paid/compensated in any way by Sonos, Dolby, Apple, Amazon, or any company I discuss online (if I ever am, I will disclose that).
My qualifications? I’ve collected tens of thousands of Dolby Atmos songs on Apple Music and moved a lot of those over to other services like Tidal and Amazon so that I can compare music services. If you want help with a specific kind of music, drop me a line, and I’ll send you my playlists on Amazon or Tidal. Apple is the best place because Apple has the biggest catalog of Dolby Atmos that I’ve been able to find. Amazon sounds better, even on Apple headphones, due to using newer Dolby Atmos technology than the others, but it has fewer songs, particularly for those of you who like classical music.
Spatial Audio is something I’ve been studying for decades. I’ve been in Virginia Tech’s building for Augmented Reality research which has 1,600 speakers in one room. When I visited they put me in a recorded football game which blew my mind.
Neil Young had me in his studio to understand what his analog masters caught of his performance and how much of that is stripped away by the technology delivering music at home. I’ve visited with many audio engineers in many studios. Just so you understand: while I’m not an audio engineer, I do have more education on the topic than most people who aren’t, and I’ve even met audio engineers who work entirely in stereo and don’t understand Atmos. The music industry is cleaning those people out and building new studios around the world for Atmos.
What is Dolby Atmos?
Instead of talking about bits and bytes and nerdy stuff, let’s ask ourselves “what is the goal for us to recreate music recordings in our home?”
For me, I have seen hundreds of performances from the front row. Buddy Guy played guitar sitting right next to me for 20 minutes. Reggie Watts performed two feet in front of me in Preservation Hall with the band there. I’ve been to Austin City Limits, Coachella, and many festivals.
My goal has been to recreate this concert experience at home. Most people can’t afford to go to, say, Coachella, to listen to live music. Marshmello played there to about 30,000 people. A few years later he was on Fortnite performing to 11 million. And it sounds like shit compared to Atmos on my Sonos system. The gap between the performance on Fortnite and what his concert was like in real life is huge.
We can do better.
So, let’s start out with something simple.
A three-piece band. A singer. A drummer. A guitar.
The old way was to record in “stereo.” Two audio channels. And distribute that at 44.1 kHz, or worse (Spotify is usually compressed on top of that). That is what is on CDs (which I first started selling around 1980 in the consumer electronics store I worked at).
In stereo the band is “stuck” mostly between your two speakers, with a little that goes outside of that. When we sold audio gear in the 1980s we’d say “this speaker has a wider soundstage,” because in front of you you could more clearly hear where a flute or clarinet was in, say, a symphony.
Today, with Atmos, that soundstage can be all around you. The guitar can be in a specific place in 3D now. That wasn’t the case with stereo. In theory Atmos can move things around differently in the future, too. The Atmos technology separates the position of each sound from the sound itself.
The process of making Dolby Atmos is different than it used to be. Now a technology team needs to “program” where sound should be around you.
That is impossible to do in just stereo. Audio engineers sometimes even put each performer on a different speaker in their studios and move those physical speakers around to decide where to place each one in the computer, which translates to your Sonos as “drums in back, guitar on right, singer in front.”
In my home, with my system, this sounds like my whole room comes alive with audio that extends behind my speakers, over my head, and even behind me if I have the speakers directly to my sides or a little behind me. Sometimes it even “fakes” sounds behind you where there aren’t any speakers. You can actually hear this on an iPhone 13 Pro or Pro Max. Play Dolby Atmos music on one and make sure you see the Dolby Atmos logo (if you don’t, there are a couple of places in settings you need to change, and you need to pay for the top tier of music services, particularly important on Amazon Music, to get it). Turn your phone and you’ll notice that sound often appears like it’s coming from next to you or, even, behind you where there are no speakers!
I also noticed in my headphones and in my car that Atmos music had better bass and clarity when Apple turned on Atmos last year. That’s what got me interested in Atmos. Soon after, I purchased my Arc soundbar and started talking with musician friends and audio engineers about this.
The reason, the engineers tell me, that Atmos has better bass and audio is that when the music is sampled it not only has more samples (48 kHz vs. 44.1 kHz) but also more bit depth (the numbers are longer, which makes for better sound, thanks to more information).
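The arithmetic behind that explanation is easy to check. As a standard rule of thumb, each bit of depth adds about 6.02 dB of dynamic range (20·log10(2) per bit), and the raw uncompressed data rate is just sample rate × bit depth × channels. A quick sketch comparing CD-quality audio to a 48 kHz / 24-bit master:

```python
def dynamic_range_db(bit_depth):
    # Rule of thumb: each bit of depth adds ~6.02 dB of dynamic range.
    return 6.02 * bit_depth

def data_rate_kbps(sample_rate_hz, bit_depth, channels):
    # Uncompressed PCM data rate, in kilobits per second.
    return sample_rate_hz * bit_depth * channels / 1000

# CD: 44.1 kHz, 16-bit, stereo
print(round(dynamic_range_db(16), 1))  # -> 96.3 dB
print(data_rate_kbps(44_100, 16, 2))   # -> 1411.2 kbps

# A 48 kHz, 24-bit stereo master
print(round(dynamic_range_db(24), 1))  # -> 144.5 dB
print(data_rate_kbps(48_000, 24, 2))   # -> 2304.0 kbps
```

So the jump from 16-bit to 24-bit isn’t subtle: it’s roughly 48 dB more dynamic range and over half again as much raw data, before any of Atmos’ spatial information is even counted.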
Some of those benefits come from “high resolution” music (music recorded at higher sampling rates than what was used for CDs), but if you have a full surround sound system you can hear it isn’t just that: the music is fundamentally different than it used to be.
Some purists argue that it isn’t what we should do. That stereo is how “God” made recorded music and we shouldn’t mess with that.
I come from a different place: if we really are going to take audio to the next level we must go way beyond stereo.
I hear another resistance: that there is a better way to do Spatial Audio, particularly in VR, since the platforms that make VR can put sound on any of the virtual polygons that make up what you are seeing in a VR headset. The problem here is that the music industry has decided to go with Dolby Atmos, or a similar technology, Sony’s 360 Reality Audio.
So, we need to see VR headset manufacturers really support Dolby Atmos at a deep level and make it so that the virtual box around the listener can be “attached” to the real world, so that we can really recreate a concert experience. (I’ve been on stage at concerts where musicians are playing, and in real life you can hear what it sounds like between, say, the guitar and drums. You can’t do that at home yet.)
What about the future?
The holy grail is to “fool” listeners into thinking they are at a concert, where sound is coming from all around them, particularly at something like Coachella. There they have dozens of speakers in front, above, and behind you, and that sound is bouncing off everything else.
We aren’t there yet.
Now add on augmented reality or virtual reality glasses. They could “lock” the Atmos virtual box to the real world, letting you “walk around” a band. Like you can in a real concert.
It is this goal that has me most excited. If we can put a 3D sensor on your face, along with screens that cover your eyes and headphones that bring real surround sound and cover your ears, we can deliver much better audio than headphones can today.
So far we haven’t seen this “holy grail” ship to consumers. I expect that in the next year that will change.