Intel wants to take you inside the metaverse

A metaverse-inspired art installation in Hong Kong.
Image: REUTERS/Tyrone Siu/File Photo

Among Silicon Valley’s hottest buzzwords, the metaverse reigns supreme. Coined by Neal Stephenson in the 1992 novel Snow Crash, the metaverse is a next-generation immersive internet experienced through augmented and virtual reality (AR and VR).

The concept of the metaverse has inspired the tech industry for decades. The virtual world Second Life popped up in the mid-2000s, and more recently gaming companies like Epic Games (maker of Fortnite) and Roblox have started describing their worlds as early versions of the metaverse. Facebook founder Mark Zuckerberg changed Facebook’s parent company’s name to Meta, signaling his intention to design the new immersive internet in its image.

Yet few chip companies, which are crucial to making the metaverse a reality, have gotten in the game. Relative to the enormous computing demands of fully immersive virtual worlds, today’s chips are underpowered. They’re also in short supply: supply chain woes have left the semiconductor industry months behind on delivering enough chips for everything from video game consoles to cars.

So far, only one chip manufacturer, NVIDIA, has announced that it is building a platform for the metaverse. Called Omniverse, its chips are designed for “connecting 3D worlds into a shared virtual universe.”

Now Intel is entering the conversation. In August 2021, the company announced a new series of graphics processors, due to launch in the first quarter of 2022, which it says will power the metaverse. Raja Koduri, who leads Intel’s Accelerated Computing Systems and Graphics Group, said in an exclusive interview that the computing power of today’s chips will need to improve 1,000-fold to power the metaverse.

Koduri spoke about the path to the metaverse, his vision for what it will look like, and how Intel wants to help build it ahead of his public remarks at the RealTime Conference on Dec. 13.

This interview has been edited for clarity and length. 

What are you telling the world today about the metaverse?

One foundational thing we always knew is that for what we imagined in Snow Crash, what we imagined in Ready Player One, for those experiences to be delivered, the computational infrastructure that is needed is 1000 times more than what we currently have.

So the [personal computers] are getting better, the phone is amazing these days, you’ve got a two-teraflop GPU [graphics processors] in the phone… and then you have cloud. There’s lots of progress made, but it is not enough.

Your thesis is that there’s a lot of hype around the metaverse, who’s going to build it, and what it’s going to look like. But before we get to that, chipmakers need to build the infrastructure layer.

Yes, exactly… What I have been in pursuit of for the last five years is preparing the computational framework necessary for the metaverse. You need access to petaflops [one thousand teraflops] of computing in less than a millisecond, less than ten milliseconds for real-time uses.
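To put Koduri’s latency target in perspective, a rough back-of-envelope sketch shows why that compute has to sit physically close to the user. The numbers below (fiber propagation speed, the split between network and render time) are illustrative assumptions, not figures from the interview:

```python
# Back-of-envelope latency budget for "petaflops in under 10 ms".
# Assumed, illustrative numbers -- not from the interview:
SPEED_IN_FIBER_KM_PER_MS = 200   # light covers roughly 200 km per ms in optical fiber
ROUND_TRIP_BUDGET_MS = 10        # Koduri's real-time ceiling
RENDER_BUDGET_MS = 6             # assume most of the budget goes to compute, not transit

network_ms = ROUND_TRIP_BUDGET_MS - RENDER_BUDGET_MS        # time left for the wire
max_distance_km = (network_ms / 2) * SPEED_IN_FIBER_KM_PER_MS  # one-way distance
print(max_distance_km)  # 400.0
```

Under these assumptions the petaflops have to live within a few hundred kilometers of the user, which is why the conversation keeps returning to edge computing rather than distant central clouds.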

We’ve been working in the background on the roads and highways and the train lines you need, assuming this civilization is going to happen. When roads are being built it’s exciting but after that, nobody cares about it. And that’s where we want to get to. Once this is all built, you’ll have your fun in the metaverse.

Why is this the first time that Intel is talking about the metaverse publicly?

Because the first building blocks—the high-performance graphics—are within a few months of launching. Before it was speculative. If we started talking about stuff that is still a year-plus away, it’s like, ‘Too much PowerPoint!’ We will start rolling out in 2022. But this is a four- or five-year journey to get everybody to have access to better, faster compute. We’ll be laying out the first set of roads, if we use that analogy, next year and actually have launches coming up in early [2022].

What is Intel’s vision of the metaverse? Is it one cohesive space? Is it a series of different metaverses?

We envision it as multiverses that may be connected to each other with accounts or something like that. One version of the metaverse I personally aspire to is the ability to have this conversation that you and I are having in a full immersive environment where I see video of you and you see a video of me and it’s photo-real, maybe beyond photo-real. Maybe it’s the Superman version of Scott, but where we can interact and collaborate with people across the world in more three-dimensional reality. That’s one I’m banking on—kind of Zoom on steroids.

The other one is gaming experiences where we are having fun, earning points, and doing quests. Then there is the social stuff that is beyond just kind of having a meeting. It’s a continual social space with avatars and creators, where people can collaboratively work on things even though they’re all remote. You don’t have to be physically on location to create stuff, whether it is creation for storytelling or movies or even physical objects. It’s a wealth of possibilities.

Gaming creation, collaboration, social—they all can have different metaverses. And they may connect, but we see the underlying technology framework as being common.

The path to the metaverse

How far out are we from this vision for the metaverse?

You see the first instances in many shapes and forms already. Our infrastructure today with broadband and 5G rolling out is pretty good in pockets, but it’s not consistent, as you know. I live in the San Francisco Bay Area, and as I go from the South Bay to San Francisco my signal quality and my bandwidth are all over the place. Even in the heart of Silicon Valley, it’s not consistent. One thing about the metaverse to me is that it’s a continual experience. If I’m in the world, I’m in the world. It’s smooth. Especially if you are wearing a headset, if it drops it would be like a punch in the gut.

The compute that you need to render a photo-realistic version of me, or of your environment, needs to be available continuously, anywhere. That means your PCs, your phones, your edge networks, your cell stations that have some compute, and your cloud computing need to work in conjunction, like an orchestra across these three elements (client, edge, and cloud), to deliver that kind of beautiful metaverse. It’ll take time. Facebook, Microsoft, Google, us, and NVIDIA are getting this infrastructure to be kind of omnipresent, but it will take time and effort.

How do we get there: is it just better chips and better broadband and better cloud? How much does the architecture of the internet need to change to support this? 

I don’t think it takes fundamental change, because you see in video delivery that’s been happening—Netflix has centers not too far from you that are streaming video. So the disaggregation of compute has been happening. We use the word edge computing. The formation of the edge has been happening for the last 40 years, slowly and steadily. I’d say we are still at one percent of that prevalence. But, the amazing thing, when stuff like this happens, it grows exponentially. So going from one percent to 90% over the next five or six years—it’ll happen.

Where do Web3 [a decentralized vision of the web] and the metaverse intersect? Is decentralization actually important in building something like this? 

That’s also kind of a buzzy topic. But I do believe the decentralization of compute, and mechanisms where we can much more easily do transactions between us, will help proliferate the metaverse. So that element of Web3, the decentralization, and also some form of microtransaction payment system integrated into the protocols, whether we leverage crypto as it exists today or some other form, will be amazing.

You think blockchains will play a role?

Yeah, I believe that they will, and I believe they’ll also find ways to optimize them so that they can use blockchain without burning a ton of compute cycles, right? Because you need those compute cycles to render your metaverse, not to waste on ledger validation.

On the environment and the chip shortage

How do you build the metaverse in a way that is environmentally conscious?

That’s the biggest technological challenge and an exciting thing for us as engineers, because this 1000x that I referred to at the beginning has to come at the same or even lower energy consumption levels than today. So we have to deliver that 1000x increase at the same energy consumption. We believe that a standard kind of Moore’s law curve [which outlines exponential computing growth over time] is only going to get us to about eight or 10x growth over the next five years.

So algorithms, architectures, neural net algorithms—some of this stuff has to play a role in increasing their efficiency. And that’s one of the things we have been working on as well. Algorithms will have to play a big role in getting the 1000x. We can do brute force. We can put more compute, like these Bitcoin farms. They throw a ton of [computing power] at it, but that’s not energy efficient, as you rightly pointed out. So it has to be that balance where we do energy-efficient compute and hardware, some better algorithms, better architectures. That’s the path to 1000x.

How do the chip shortage and supply chain problems affect your vision and timeline?

They do. The positive side of it, from being a semiconductor person, is that demand is exploding and the metaverse needs even more. So the burden put on us is that we have to be even more efficient in leveraging our fabrication capacity. We need to be able to deliver your metaverse experiences without needing big chips that take up a lot of fab capacity.

We need to be much more efficient. Just like the environmental consciousness—don’t waste water, don’t waste electricity, don’t waste heat—it applies to semiconductors as well.