Welcome to our newest season of the HumAIn podcast in 2021. HumAIn is your first look at the startups and industry titans that are leading and disrupting ML and AI, data science, developer tools, and technical education. I am your host, David Yakobovitch, and this is HumAIn. If you like this episode, remember to subscribe and leave a review. Now onto our show.
Welcome back, listeners, to the HumAIn podcast. Today we are talking about the future of computing. It is my pleasure to have Robert Scoble, a consultant and book author who’s been involved at the roots of Silicon Valley and a lot of very exciting technology we’ve seen grow and mature as we’re moving into the modern new computing era. Robert, thanks so much for joining us on the show.
It’s an honor to be here.
Well, I’m so excited for this because I’ve looked up to a lot of the great work that you’ve been involved with and the companies and ventures. As we’re now in 2021, there’s such a rapid shift into new technology. Of course, we’ve all seen the pandemic and we’ve all seen the movements there, but now computing is becoming sexy again. It’s becoming exciting again. Why are you excited about this new wave of computing?
Well, I’ve been watching it for almost a decade, really longer than a decade. Eight years ago, I was at the Consumer Electronics Show, walking around the back halls, and I came across a company called PrimeSense and met its founder. PrimeSense is a little company that came from Israel, and they made 3D sensors.
Well, eight years ago, he showed me what a 3D sensor could do. Today, on our new iPhones, we have two of them, one on the front and one on the back, that see the world in 3D. He was showing me all sorts of things that you could do: pressure-sensitive touching, for one. He had a sensor about three feet away from a tabletop, and he could see how hard we were pressing on the table and could do virtual writing on the table, and stuff like that.
Three or four different demos of what a 3D sensor could be used for. Around the same time I was at Metaio, in Munich, Germany, which is an augmented reality company. Both of these companies, by the way, got bought by Apple after I followed them. They were showing me monsters on the sides of the skyscrapers.
Well, today Snapchat does that. So I’m pretty lucky. I got to go around the world and meet a lot of entrepreneurs. I interviewed thousands of entrepreneurs on my video show, like what you’re doing. That got me into all these R&D labs and into little startups that were doing new technology.
Because of where I am in Silicon Valley, I have close-by access to a lot of these companies. I know one company that’s building a contact lens with a little tiny display and a little tiny sensor on it, and that’s being built two miles from here. So the world keeps ticking along and bringing us new things.
We can talk about these new headphones from Apple and just how different these headphones are from Sony’s. A lot of new technology is coming to bear in products nowadays, particularly with AI. You saw the M1 chip that Apple announced; a third of that chip is now dedicated to AI workloads.
That’s a new neural network workload. That’s a huge investment on behalf of Apple, in a technique that didn’t exist a decade ago. Siri was the first company to use this new machine learning technology to do voice recognition and understand what to do with it, and today I just drove here in a self-driving Tesla. So it’s a crazy world that’s hitting literally right now.
I love this technology. I think back to Minority Report, when we talked about contact lenses, and now we’re starting to see it. There’s going to be a day where we don’t need these glasses; we can have augmented contacts, and it’s probably here sooner than many people think. And with the M1 chips, I love that you mentioned that. The company I’m with, which is a distributed SQL startup, is talking about supporting the M1 chip, and it’s amazing, because you don’t have to be cloud-only. You can run at the edge, you can run on local devices, and any company and any person can be doing machine learning every day.
Yeah, it’s true. And people who don’t have a June oven or a Tesla car really don’t understand just how fast this stuff is happening. In the two years since I got my Tesla, it’s now changing lanes automatically and it stops at stoplights and stop signs. It’s really understanding the world in a very different way than it did when I first bought this car. That just shows, in two years, the technology that’s running underneath it is radically changing.
So it’s not just hardware that is eating the world anymore; it’s software, and the combination of software and hardware. Would you say this is the movement of spatial computing that we’re talking about today, that you mentioned we’ve seen with Snapchat and we’re seeing now with many new companies with sensors, with FPGAs?
It’s happening everywhere, and we should define what spatial computing is. Our mobile phones, our TVs, they’re flat. They’re monitors. They’re little rectangles of glass, as I call them. In this new spatial world, we’re soon going to be moving through computing. Computing is going to be all around us. We’re not going to look at little rectangular pieces of glass anymore; we’re going to be wearing the computing on our eyes. The computer is going to put computing everywhere. And it’s not just for humans. It’s for humans, robots, and virtual beings, which is another topic; we could probably do a whole show just on virtual beings, the future of virtual beings.
But there’s quite a bit of work being done to actually have you talking to something that looks like a real thing, a human kind of thing that you would talk to, that would be using spatial computing. So let’s just start talking about the fundamentals, right on this new iPhone that I just got.
There’s a 3D sensor on the back called LIDAR, and it sprays little beams of light out from the sensor. It has about 300,000 of those beams of light. It shoots out one of those beams of light, it hits the wall in front of me and bounces back, and the computer can figure out how far away that wall is just by how long the light takes to bounce back. It’s called a time-of-flight sensor.
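As an aside for readers: the time-of-flight math he describes is simple enough to sketch in a few lines of Python. This is an illustrative toy, not Apple’s API; the names and the 20-nanosecond example are ours.

```python
# Time-of-flight distance: a light pulse goes out, bounces off a surface,
# and comes back. One-way distance is half the round trip at light speed.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Return the one-way distance in meters for a measured round trip."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

# A wall about 3 meters away returns the pulse in roughly 20 nanoseconds.
print(distance_from_round_trip(20e-9))  # ~3.0 meters
```

The sensor repeats this for each of its hundreds of thousands of beams per frame, which is what produces the dot cloud he describes next.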
So it builds a 3D map of your world. You can do this today: you download an app that sees the world in 3D, and it captures your living room or your bedroom in 3D, and you can start playing with it and start understanding how this works. But let’s just go through it. The first step is it turns my wall into billions of little dots. So if I see it in the computer, it’s actually numbers. A programmer is seeing billions of pieces of data coming through the computer and has to make some sense of that. Well, somebody else does that hard work. Rocket scientists figured out how to see a wall out of these little points of data. But those little points of data are too much data for your phone to handle without melting down.
So what it does is convert it, in most cases, to a set of triangles, polygons, we call them. And that’s what you’re actually seeing in video games. When you’re shooting somebody in Call of Duty, you’re shooting a gun that’s made up of tons of little triangles, and your target, your competitor, is tons of little triangles as well. And on those triangles, they put a graphic, so it looks like a human or it looks like a tank rolling around, or something like that. That process is pretty well known now by developers. If you use Unity, if you understand meshes, we call this a mesh of polygons, and this is pretty exciting stuff.
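For readers who want to see the dots-to-triangles step concretely: a common way to mesh a regular grid of depth samples is to split each grid cell into two triangles. A minimal sketch, assuming row-major vertex indexing (this is our simplification, not the actual iPhone pipeline, which also decimates and merges points):

```python
def depth_grid_to_triangles(width: int, height: int):
    """Triangulate a width x height grid of depth points into a mesh.

    Returns triangles as index triples into the flat vertex list,
    two triangles per grid cell.
    """
    triangles = []
    for y in range(height - 1):
        for x in range(width - 1):
            i = y * width + x  # top-left vertex of this cell, row-major
            triangles.append((i, i + 1, i + width))              # upper triangle
            triangles.append((i + 1, i + width + 1, i + width))  # lower triangle
    return triangles

print(len(depth_grid_to_triangles(3, 3)))  # 8 triangles from a 3x3 grid
```

The triangle count grows as roughly twice the number of cells, which is why real systems simplify the mesh aggressively before handing it to a game engine.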
But it’s just the start, because of the AI that’s coming along now. It can segment each thing in your room and separate it from, say, the table. So I have a little water bottle sitting on a table in front of me. Well, the computer vision can cut the bottle out of space and say: Oh, that’s a bottle. And then, since it has a camera, it can actually see the text on the bottle and do an AI lookup, either on Amazon or Google, and figure out: Oh, that’s a Hint water bottle, and it’s the water infused with mango and grapefruit. Someday we’re going to be able to say: Hey Siri, how much do 20 of these cost on Amazon? And it’ll know what you’re talking about, because it’s seeing this bottle in 3D, and it’s reading the label with a camera, and it’s doing these new computer vision lookups, the same technique that a self-driving car uses to see a stop sign.
I was driving my Tesla home today. I was seeing stoplights, and stop signs, and pedestrians and dogs and cars, and it knew what each one of those things was, and it shows me this on the screen. It shows me there’s a dog crossing the street right now in front of you, so you better slow down and stop. Actually, the car automatically slowed down and stopped. Soon we’re going to be wearing glasses that do the same thing in our house. So now let’s have some fun. First, we can replace any of these polygons with different visuals. So I can take the wall and get rid of it and then put Yosemite on the wall, or put an alien crawling through the wall with my Microsoft HoloLens.
I already experienced this, where it turns my walls into virtual things that games can mess with visually, and it’s quite compelling. It’s like: wow, there’s an alien crawling through my wall. So it’s a lot of fun, but it’s also very useful. We’re going to see all sorts of new uses of technology like that, where the glasses are going to know where you left your keys.
The glasses are going to know what is in your kitchen, so at some point you’re going to ask it for help with what to make for dinner, for instance. And you’ll ask: Hey, out of all the things in my kitchen, what should I make for dinner tonight? Well, you have enough pasta to make lasagna. You want to make lasagna? And it’ll actually teach you and show you how to do stuff like that.
Or we’ll even have a Boston Dynamics robot making us lasagna.
There we go. That’s coming, because the cost of robots is coming down at a pretty steady rate. More likely we’re just going to get it delivered from the Italian place around the corner, and a little robot will come and deliver that using spatial computing, or an autonomous car will come and deliver it using spatial computing as well.
This is not really far off. It’s already happening in a lot of places. Waymo, which came out of Google, is already driving in San Francisco, Mountain View, and Phoenix without humans in the car. There are lots of new drone delivery and robot delivery services that are underway and are rolling around our cities. Maybe not in your neighborhood; I don’t see many of those in my neighborhood either, but I’ve seen them in other neighborhoods. I know that they’re coming, and they all work very similarly, using the same technique: a little LIDAR on top of the little robot that sees the world in 3D and understands how to roll down the sidewalk or down the street.
That’s right. In New York City, we’ve seen some of those self-driving shuttles in the Brooklyn Navy Yard. We’ve actually seen self-driving buses at my undergrad, the University of Florida, and those little Nuro robots that bring Coca-Cola and sandwiches to the dormitory rooms. So it’s happening, and I’m definitely bullish on it. And I think you are too, so why are you so bullish about spatial computing?
Well, on the transportation side of things, if you and I rented an Uber right now and said: Hey, just sit here for an hour and charge us, they would charge us between $55 and $80, depending on what part of the United States we’re in.
So when autonomous comes, when we have autonomous cars, the cost is going to go down to about $10 an hour for a Tesla. And that shift in costs is going to be very dramatic for trucking, for deliveries, for taxis, and all that. And that’s why, when you get together with people like Sebastian Thrun, who ran the Google self-driving effort, you start brainstorming about how cities are going to change, because the costs of transportation are radically different, and how buying a car is going to change. A lot of us might not buy a car in such a world, where you can just rent a car and have a Cybertruck show up in a minute and charge you 10 bucks an hour.
Well, I only used my car for half an hour today. So why am I paying for it right now? My Tesla’s sitting literally underneath me in the garage, costing me $2 an hour just to sit there right now. Why am I doing that in a world where we have autonomous cars? I’m not so sure that most people are going to buy a car in this world; they’re just going to rent an autonomous car whenever they want to, because the costs are going to be so different.
So we start thinking about that. And then, I have an autistic son, a special needs kid, who is going to have a lot of trouble in life. He doesn’t speak very clearly. He doesn’t even pay attention when he walks across the road; he’s in his own little world. Well, these devices on his face can really help him live his life. They can warn him not to cross the street because there’s a car coming, or could really show him a new way to live, even to the place of: Hey, your friend is getting mad at you, you know?
Here are some hints on how to handle yourself right now. That sounds a little scary, but these devices are going to help blind people to hear what is around them, and they’ll help deaf people to see. Soon a blind person’s going to have this device on their face. Well, Facebook’s device that they’re planning, that they’re testing actually, has seven cameras around the rim of the glasses. The AI is able to see things like the water bottle in front of me. And so now, if I was blind and I couldn’t see that water bottle, I could ask Siri: ‘Hey Siri, where’s my water bottle?’ ‘It’s right in front of you’.
So, this is quite exciting for human beings, and we haven’t even started touching on how deeply entertainment is going to change. Football: everybody watches football. Unity, which is already a public company, makes the polygons underneath everything we do, and they’re building virtualized football. They’re putting a hundred cameras into football stadiums and soccer stadiums around the world, which are going to build a volumetric mesh of polygons
of the players running around the football stadium that I can enjoy in my glasses. So I can see the football game as if I was actually at the stadium, and see it in 3D, not just on the flat 2D 4K screens that you have on today’s TVs. So TV is about to radically shift as well. And we could keep going. My book has seven industries that are going to radically shift, from healthcare to transportation to FinTech. Retail is gonna change a lot because of this stuff.
Overall, what we’re seeing is the pace of innovation. That’s what I’m hearing about this shift. Even in the beginning of 2021, I heard this story out of Israel where the first artificial cornea was implanted in a 78-year-old man. That’s technology you wouldn’t have thought about 10 years ago. It makes me think where we will be in 2030, 2040. The dreams that self-driving will be here in 2050? Whoa, it’s happening way quicker than that, and so many industries are changing.
I’ve been watching autonomous cars roll around Silicon Valley since, I don’t know, 13, 14 years ago, and even 15, 16, 17 years ago with Stanford research. So it seems slow at times. But these are exponentially growing technologies: the data that they are collecting is growing exponentially, the training cost is coming down exponentially, the compute cost is coming down exponentially. And when you have all that exponential technology changing the world, you’re going to really see radical changes, eventually. My favorite question is: would you rather have a million dollars, or a penny that doubles every day for a month?
Of course the penny.
But you have to live 27, 28 days before the penny starts paying off and passing the million dollars. So if you’re going to die on day 20, take the million dollars and have a party. But if you think you’re going to live 27, 28 days, keep the penny, because then it goes from $100,000 to $200,000 to $400,000 to $800,000 to $1.6 million. By the end of the month, it’s about $11 million on a 31-day month. Most people don’t think like that. That’s why that question is so interesting. Somebody who understands math and exponents says: Oh yeah, I want the penny that’s doubling; that’s the smarter way to go. Most people don’t understand it, but this is why people don’t understand the virus.
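The penny arithmetic he walks through checks out, and it is easy to verify. A toy calculation (day 1 is worth $0.01, each later day doubles the previous one):

```python
def first_day_penny_beats(target_dollars: float = 1_000_000.0):
    """Find the first day the doubling penny passes a flat payout.

    Day 1 is worth $0.01; every subsequent day doubles the previous value.
    Returns (day, value_on_that_day).
    """
    value, day = 0.01, 1
    while value <= target_dollars:
        value *= 2
        day += 1
    return day, round(value, 2)

day, value = first_day_penny_beats()
print(day, value)  # day 28, about $1.34 million
```

By day 31 the same doubling reaches about $10.7 million, which matches his "about $11 million on a 31-day month."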
The COVID virus is spreading exponentially, and people are like: Oh, it’s just like the flu. No, it’s not just like the flu. It’s very different from the flu, because it spreads exponentially faster and it kills at a higher rate. Now we’re seeing what that means: 400,000 people have died, and we’re losing 4,000 people a day now. Remember last March, when it was just a couple of people dying a day, and people were like: Oh, it’s just like the flu. No, it’s not. You’ve got to understand: autonomous cars, robots, and these augmented reality glasses are exponential technology; they get better as they’re trained to do new things.
For instance, if you train an AI to see the Hint water bottle, it makes my life better and it makes life better for everybody who has the glasses. It’s exponential. So it’s very exciting, this new world, but it means a lot of change is coming for people, and that’s very scary. And that’s what I studied. I studied consumer behavior when it comes to new technology. I fell in love with doing that when I sold VHS VCRs in the 1980s in a little tiny consumer electronics shop in Silicon Valley.
I love that. I think about how this technology augments humans, and I think about where we’re moving with the story that you’re sharing, Robert, about the water bottle: should I have to do all these routine and rote actions, or can the machines augment me? And that story you just shared is so powerful, even back to the 1980s. My dad, actually, was an electronics repairman. He used to repair TVs, VCRs, DVD players, all those things over the decades. It’s just incredible to see these old Silicon Valley stories and how they’re coming full circle, becoming modern again.
Yeah. I remember seeing those VCRs, and I was like: Man, that’s cool. Today, I don’t even have a VCR in my house. It seemed like such an important thing to have in your house 20, 30 years ago. But now it’s like: Oh, you have Netflix. Why do I need a VCR? That shows just how fast things can change.
Here’s another example: these new Apple headphones, the AirPods Max, the over-the-ear headphones. This is a radically different approach to audio; there’s a little AI chip in these headphones. Here’s an example of what it does that my Pioneer or Beats or Sony headphones don’t do. I have a lot of headphones; I sold this stuff for a long time. If I’m outside talking to you and a lawnmower starts up right next to me, and this happened to me on a call last week when I was outside showing off something.
I told the guy I was talking to: Tell me what you’re hearing, because I’m standing right next to a lawnmower that just started up. It’s loud, it’s right next to me. He said: Well, I heard the lawnmower start up, and then it was like somebody just turned the volume all the way down on the lawnmower. I hear you just the same; you didn’t change, but the lawnmower’s gone. And I’m like: I’m standing right next to the lawnmower running and you can’t hear the lawnmower at all? And he’s like: Nope. And I’m like: Wow. The AI is listening for patterns of sound while I’m talking to you. It’s doing it right now.
It’s listening for air conditioners. It removes traffic noise, removes even ambulance noise, turns it way down. So the noise canceling on this is, like, wow, next level. And we haven’t even started using some of the magic that’s in the headphones. There are nine little microphones on these headphones. Now, when I worked at Microsoft, I met a guy who developed an array microphone; that’s what these things are called. He had a box with four microphones and a computer controlling the four microphones, and he could focus their attention on things. And he said: If I know where to focus these microphones, I can really get rid of all other noise, because my mouth is moving and sound is coming out of it.
So if the microphones are in an array, they could focus their attention on my mouth. Then they could get rid of any noise that’s not coming out of my mouth, because they would know exactly where they’re listening. This was before the AI that removes the lawnmower and the weed blower and the traffic right now.
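The array-microphone focusing he describes is classically done with delay-and-sum beamforming: shift each microphone’s samples by the extra travel time from the target to that microphone, then average. A minimal sketch with integer sample delays (real beamformers work with fractional delays and frequency-domain weights; this is our simplification):

```python
def delay_and_sum(signals, delays):
    """Toy delay-and-sum beamformer.

    signals: list of per-microphone sample lists.
    delays:  steering delay (in samples) for each microphone.
    Sound arriving from the focused direction lines up and adds;
    off-axis noise averages toward zero.
    """
    usable = min(len(s) - d for s, d in zip(signals, delays))
    return [
        sum(s[n + d] for s, d in zip(signals, delays)) / len(signals)
        for n in range(usable)
    ]

# Two mics hear the same ramp, the second one sample late; steering
# delays of [0, 1] realign them, so the output matches the source.
print(delay_and_sum([[1, 2, 3, 4], [0, 1, 2, 3]], [0, 1]))  # [1.0, 2.0, 3.0]
```

With more microphones, uncorrelated noise cancels more strongly, which is why a nine-microphone array can isolate a mouth a few inches away.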
Tell me about this. I used to buy bone conduction headphones, because my intention was to get rid of the noise. Is that similar to these array microphones?
Well, bone conduction is different. An array microphone is a grouping of microphones that you can focus with a computer. Bone conduction is actually just trying to get closer to your vocal cords and figure out how to remove noise that way.
Anyway, there are multiple techniques, and these do a little bit of that too. There are some fun sensors in these things underneath the ear pads, which pop right out. There’s a lot of new technology in these headphones, and in a way, this is a test for the next product, which is going to be a VR-AR headset with audio as well.
That’s going to let you play games in a new way and watch TV, particularly watch TV, in a new way. Think about a world where everybody has a device like this on, and they can actually talk to each other. I wore the headphones at the dinner table, actually. Now, that’s a weird thing to do, because there’s a social contract problem with wearing a device at the dinner table.
But I got everybody past that, because I’m a nerd and I want to try things out: using these, pushing the button, moving to transparency mode. These have three modes: no noise canceling, noise canceling, and then transparency mode. Transparency mode means I can hear sound from the real world, like the traffic or my kids yelling at me, or something like that. So at Christmas dinner I put it into transparency mode, and it sounded just like I didn’t have any headphones on at all. I was hearing the real world, which is a real key point: I was hearing what Apple wants me to hear, because it’s Apple. These over-the-ear headphones are blocking the analog sound of the real world from getting to your ear.
There are nine microphones picking up all that sound, and then there’s a processing chip that processes the sound and sweetens it and shoots it into the audio driver that you’re hearing. That effect was so well done that it sounded like I was just listening to my family without the headphones. It was pretty mind-blowing.
That breakthrough technology that we’re seeing in the headphones today could be the same breakthrough in this VR-AR technology that Apple may be working on. Imagine a headset, if you will, where the visor can go on and off, and you have it always on to always interact, but then you can choose whether to live in the real world, or in the immersive world, or a hybrid world. It’s so fascinating to think that we’re moving there.
Apple is working on a whole bunch of stuff. My next-door neighbor works as a chip designer on the CarPlay team, and he has a prototype of a spatial computer. That’s going to be in the…Volvo told me they’re going to put this in all new cars by the end of the year. So they’re putting that 3D sensor, like what I was talking about, on your fingers, on the dash, so it can see what you’re touching, and on you, so they can do this new kind of noise canceling.
So it knows where your mouth, the driver’s mouth, is. So when you’re talking to the car, or when you’re doing a phone call or a Zoom call, it knows how to really reduce noise from other things very, very well. Apple is not playing around, and it’s spending $40 billion. You’re going to see a number of different new technologies come out of this effort, not just a headset.
So it sounds like, from an AI perspective, there are really two big areas where we’re seeing the evolution of technology. First, it’s around text and audio and speech, this NLP area. Then, secondly, it’s computer vision, the Hint water bottle, seeing and believing. Why are these two areas suddenly coming to life in this moment of acceleration?
Partly because AI is just starting to really be adopted across a wide number of new use cases. Spotify playlists are done by AI. My oven has a camera in it; you put a piece of toast in there or a piece of salmon, and it goes: Oh, there’s salmon in the oven, we’ll cook it for 12 minutes, while the toast cooks for a minute and a half. These techniques are coming down in cost very, very rapidly.
It used to cost thousands and thousands of dollars and take hours to train on something like a Hint water bottle. Now you don’t even need much data anymore; you need about one picture, or some video that you would tag, and a simulator. It’s coming down in cost so fast because Nvidia and VMware are making these techniques available in data centers at cloud computing companies like Amazon, Microsoft Azure, and Google Cloud.
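A note on the "one picture plus a simulator" point: the usual trick is to multiply a single tagged example into many training variants through transformations. A deliberately tiny sketch using flips and row shifts on a toy image grid (real augmentation pipelines use crops, lighting changes, and full 3D rendering; the function name here is ours):

```python
import random

def synthesize_training_set(image_rows, n_variants=8, seed=0):
    """Toy data augmentation: from one tagged example, generate many
    variants via random horizontal flips and row shifts -- a crude
    stand-in for how a simulator multiplies a single capture."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    variants = []
    for _ in range(n_variants):
        flipped = [row[::-1] if rng.random() < 0.5 else list(row)
                   for row in image_rows]
        shift = rng.randrange(len(flipped))
        variants.append(flipped[shift:] + flipped[:shift])
    return variants

print(len(synthesize_training_set([[0, 1], [1, 0]])))  # 8 variants from one image
```

Each variant keeps the original label, so one tag buys you many labeled examples, which is a big part of why per-object training costs have collapsed.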
So when we think of all the technology that’s coming to market so quickly, trends are always what everyone loves to hear about. You’ve shared a lot of what 2021 holds, and maybe sooner than 2030; but thinking about 2022, since that’s going to be here tomorrow, what should business people do to prepare for the changes that will be coming next year?
Well, understand that Apple is coming, and what happens when Apple comes. Take this $550 headphone: anybody else who did a $550 headphone would sell maybe a hundred of them. It’s a very expensive headphone compared to other headphones. But when Apple comes, people are like: Oh, sure, I’ll put $550 down on that, because I trust that Apple’s going to bring me a product that’s worth $550.
And that’s the brand promise of Apple. Plus it has stores, so I can return it, and I know it’s backed by Apple’s privacy stance and stuff like that. So Apple can do things that very few companies can do in terms of launching new products and making people aware of a new kind of product category, and that’s what’s about to come. If you don’t believe that Apple is going to change the world, well, look at how many AirPods Pro they sold.
They’re making more dollars on the AirPods Pro than Netflix makes in revenue. So, Apple’s coming. Let’s just start there. Be aware that this world is about to come, and come pretty quickly, at your business. Now we could argue: okay, only the rich people will have it for a year, and then it’ll take a year or two or three for Facebook to get it down to 300 bucks and bring it to everybody. But that’s coming too. Facebook is spending $10 billion plus on this technology.
Apple’s spending more than that. Google is spending billions of dollars; they just bought a company called North that made little glasses called Focals. So you can tell they’re interested in coming, and they helped fund a company called Magic Leap, which is probably going to go out of business or get bought for the patents, or whatever, but they spent $2.7 billion on that.
So know that. Then, get into VR. I know a lot of people are like: I don’t want to play with VR; VR is only for gaming, or whatever. No, it’s not. The more time you spend in VR, the more you’re going to be prepared for the changes that are soon going to come to everybody’s face in terms of glasses. Now, when I start talking this way, I get a lot of resistance: I hate wearing glasses. Well, you and I both have to wear glasses, so we’re already a little bit further along the curve toward this stuff than most people are. But these things are going to have so much utility soon. They’re going to remember things for you. They’re going to let you talk to your store, talk to your entertainment in a new way.
You’re going to see new kinds of entertainment. Like I said: put the volumetric football game, where the football field is on your coffee table in front of you, or on the floor in front of you with a huge virtual life-size screen. Much better TVs are coming in this kind of device than are possible even with your best TV, because I can move these virtual monitors anywhere and make them bigger and make them different shapes, like domes or wraparound screens, stuff like that.
So remember what Steve Jobs told Walter Isaacson: I figured out TV. I figured out how to disrupt TV, change TV, whatever he said. I’ll get the exact quote someday, but it’s up on Google. This is what he was talking about, I believe, because he realized that once the screens in these glasses got sharp, and we’re not there yet, that’s why I don’t wear them everywhere, but as soon as a HoloLens gets a 4K display in front of you, then we’re getting into this virtual TV world, and that’s when the world starts flipping.
I can see so many possibilities. When I think about these devices, whatever they’re going to be, there’s going to be a point where we ask: do we even need to learn languages, or can we all communicate across video and audio? There’s so much.
I went to the Shanghai Disneyland three years ago, when it first opened, and I needed a taxi. Well, I don’t speak Chinese. So I pulled out my phone, pulled up Google Translate, walked up to an employee and said: Hey, how do I find a taxi? And it read out in Chinese what I was saying on his side of the phone. When he told my phone in Chinese how to find the taxi, it translated it into English on the phone and into voice, so I could hear him talking to me in pretty close to real time.
So that was four years ago. You’re absolutely right: when you have this kind of technology on your face, it’s going to do a lot of new things and blow away most people, because most people haven’t had that experience of translating something in real time with somebody at a Chinese Disneyland, or whatever.
I saw that firsthand in Taipei and got to see it’s incredible technology.
We went to Rome last November. Or not last November: November 2019. I was using augmented reality on Google Maps to get navigation around the little alleys and back streets of Rome. It worked out amazingly. It knew where I was in space. It was using this kind of technology to understand the buildings and the streets. It would put signs, like ‘Walk up this way’, on the sides of the buildings. It was crazy.
So when we think of spatial, of course, it sounds like there are the big companies that are changing the game, like Google and Apple and so forth. But what are some of the other best companies in spatial that you’re seeing and hearing about today?
To do the hardware that you’re going to want to wear on your face, it takes billions and billions of dollars to develop these things. But there’s a whole bunch of others; I have a Twitter list of 2,000 companies doing VR and AR. Now, a lot of those are agencies that are doing this kind of stuff, but there are companies like Niantic. Niantic builds the game Pokemon Go and one of the Harry Potter games. So people are walking around the streets playing Pokemon Go; well, they’re using the Niantic platform. Niantic is building a 3D map of the entire world, and they’re already using AI and computer vision to understand a lot about what you’re doing.
Are you playing in a park, or are you in a shopping mall? Are you in a gas station playing while your car is pumping gas? Something like that. So as you use it to play Pokemon Go and capture a couple of new Pokemon characters, it’s seeing the world that you’re aiming it around, and it’s ingesting that data. They’re building a platform that developers are going to build all sorts of new things on top of, using this data. So it already knows where there are lakes or oceans or parks or shopping malls. It can tell that just by the data that people have captured with their cameras.
We’re still at the beginning of this, really. In 10 years, this is going to be built out as you’re walking around the world. It’s going to know everything about the world that you’re in, to the point where it’s probably going to know where your Phillips head screwdrivers are in your house. Then you’re going to be able to say: Hey Siri, where did I leave my Phillips head screwdriver? All right, it’s down in the garage. We’ll take you there and put a blue line on the floor to take you down to where you left your screwdriver.
This technology around spatial, I’ve started to see even in applications around education, where employees at Walmart are now using VR headsets to learn how to stock shelves, and truck drivers are learning about the different parts that need repair. So there’s so much going on with both AR and VR, but not everyone really knows the difference between VR and AR.
VR is a technique where you see only a virtual world. AR is when you’re seeing, say, a SpongeBob jumping around the room: you’re seeing what you think is the real world with some virtual things added, like the walls are changed, or there’s a character running around you and you’re playing a game, for instance.
We have lots of arguments in the industry about the spectrum of technologies: VR on one side, AR on the other, and what we call mixed reality in between, or what I call spatial computing, because I just don’t even care about the argument anymore.
You’re seeing only virtual bits. You’re only seeing virtual screens in the VR headset, but the cameras are ingesting the real world, turning it into 3D and then showing you what looks like the real world in the glasses. But is that AR or VR? You’re only seeing the virtual polygons, the virtual sugar cubes that are all around you. So I don’t know; we have these arguments, is that AR or VR? Today Apple is like: We’re doing a VR headset. I’m like, no, we both know you’re doing AR.
But you’re showing it in a device where, if you don’t have the device on, the world is black; you can’t see anything. So isn’t that virtual reality too? Where we’re going, though, is that soon we’re going to put some pretty badass pieces of technology on our faces, and we’re going to be able to go from VR to AR at some level. Now we can argue: how many polygons do you have? What’s your field of view? How sharp is it? Different devices will have different mixtures of these things. Today that is a very specialized device that costs like $10,000, for the car designers, but soon that comes to consumers as a less-than-$2,000 device.
Then we’re watching TV in a whole new way. And if you’re watching TV in a whole new way, do you care what you call it? I don’t know. You’re getting virtual screens in front of you, and you can see the football field in 3D and walk around it; you can even make it bigger and walk around it. Maybe even stop the game and go out and try to see if you can make the same path the quarterback is trying to make. It’s changing human experiences at a fundamental level. We’re augmenting them.
In fact, I was just on a show with a bunch of music industry professionals, and they’re thinking about how to make money with this new technology right now and how to bring music into these new 3D worlds, and they’re starting to do it. Fortnite caught their attention with Marshmello, the electronic music performer. I saw Marshmello at Coachella in the real world, and only 10,000 people could fit into the tent where he was performing; in Fortnite he had 11 million people watching. I was walking around his performance at some level, and Fortnite is still a 2D thing, a thing on 2D screens; it’s not VR. So we’re about to jump into a new way of entertaining each other. We’re going to see a lot of new companies spring up in the next few years because of this.
There are, of course, the big companies. You talked a lot today about Apple, but we cannot count Facebook out. They have their portfolio with Oculus and Portal, and a lot of the new technology.
Apple is going to be more trusted. I expect the rich people to buy Apple. I mean, Apple makes nice stuff. These new headphones are amazing, and if they keep this kind of quality up with their VR and AR headset, it’s going to be pretty nice. But Apple comes at a cost: these are $550 headphones. They’re not affordable for a lot of people. So we’re going to see a gap between the Apple price point, which might be $2,000 for this device, and Facebook, which is like: Well, we’re willing to throw a thousand dollars in the box because of advertising. We have a different business model. We’re going to subsidize the cost of our device. You won’t trust us as much, because you know your private information is going to be used to bring you advertising all over the place.
But the Facebook device might be 300 bucks instead of 2,000 bucks. So most people are going to buy the Facebook one. I know how this game works. Most people go for the device that offers most of the utility at a lot lower price.
It also depends globally. When we think of policies like GDPR and CCPA, of course Apple will come out ahead there, but in East Asia Facebook may come out ahead as well.
So each neighborhood is going to lean more heavily one way or the other. I have a feeling I’m going to buy both, because I want to be able to come on your show again and say: I just got the Facebook one, I just got the Apple one. The Apple one is sharper and has better privacy and better sound, but the Facebook one has better social games, because every time I see one of my Facebook friends, it lights up, stuff like that.
Or the Facebook one does automatic shopping while I’m walking around my kitchen: I put it on in the kitchen, and Amazon will see how much milk I have left, stuff like that. Crazy world coming, isn’t it? It’s hard to know exactly how it will break down, but we are already seeing these prototypes out on the street. Facebook is already showing how the seven cameras look and work; they showed it off months ago at their developer conference. So we can see it’s coming pretty quickly.
I think it’s exciting to see how we’re moving into this augmented world of spatial computing and for listeners of the show, if they’re looking to learn more and get more involved in this space, what are some tips or recommendations you’d like to share?
I have lots of Twitter lists in this space: lists of AI, computer vision, and VR and AR people, and all the brands. So if you start following a couple of Twitter lists, you get updates. I’m on Twitter, so you can find me there. You can do Google searches and find the latest YouTube videos on programming for this world, or see what the latest technologies are from, say, the Consumer Electronics Show or whatnot.
But it depends on who you are. If you’re a developer, I’m going to say learn Unity real fast. Spend every minute of your day learning how to program Unity, which uses C# as the programming language underneath the polygons, to make the polygons do things. Those skills will be very valuable in the next decade.
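[Editor’s note: for developer listeners curious what “making the polygons do things” looks like in practice, here is a minimal sketch of a first Unity C# script. It only runs inside a Unity project when attached to a GameObject; the `Spinner` class name and the rotation speed are illustrative choices of ours, while `MonoBehaviour`, `Update`, `transform.Rotate`, and `Time.deltaTime` are standard Unity engine API.]

```csharp
using UnityEngine;

// Attach this component to any GameObject (for example, a cube) in the
// Unity editor. Unity calls Update() once per rendered frame; multiplying
// by Time.deltaTime makes the spin speed frame-rate independent.
public class Spinner : MonoBehaviour
{
    // Exposed as an editable field in the Unity Inspector.
    public float degreesPerSecond = 45f;

    void Update()
    {
        // Rotate around the object's vertical (Y) axis a little each frame.
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}
```

From there, the same pattern of small per-frame scripts scales up to the AR and VR interactions discussed above, via Unity toolkits such as AR Foundation.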
Well, I love it. I’m a big fan of any technology like this. So you’ve heard it here first on HumAIn: Unity is where you should be spending your time and attention if you’re a data scientist or a software engineer looking to be part of that next wave of spatial computing. Robert Scoble, consultant and book author, thank you so much for joining us on the HumAIn podcast.
Thank you for listening to this episode of the HumAIn podcast. Did the episode measure up to your thoughts on ML and AI, data science, developer tools, and technical education? Share your thoughts with me at humainpodcast.com/contact. Remember to share this episode with a friend, subscribe, and leave a review, and listen for more episodes of HumAIn.