Microsoft's Andy Beach Talks Streaming and the Metaverse
How is streaming in virtual and immersive environments driving storytelling in unique and experimental ways? Andy Beach, Microsoft CTO, Media & Entertainment, chats with Tim Siglin, Founding Executive Director, Help Me Stream Research Foundation, and Contributing Editor, Streaming Media, about his new role at Microsoft and the ways that the metaverse will change how users experience narrative entertainment.
“I’m delighted to be here with Andy Beach, who I’ve probably known for about 20 years now,” Siglin says.
“I actually started a new job about a year ago at Microsoft,” Beach says. “I've been there 10 years now, which is insane to believe. I never imagined I was going to be there that long. I joined on a contract gig to help Xbox launch media apps, and that turned into a full-time role running engineering teams that specialized in media. And now I'm our Chief Technology Officer for the media and entertainment industry. I'm responsible for our technical strategy. And I work closely with partners and customers around the world on media infrastructure solutions, new ways of delivering streaming, new engagement opportunities, the full gamut. And it just keeps growing. We acquired Xandr earlier this year, and part of the work now is how we interconnect what they're doing as a first-party service with our independent software vendors (ISVs) that are already established. So lots to do.”
Siglin remarks, “One of the fascinating things that I heard yesterday, both in your presentation and in one of the keynotes, was this idea of stitching the polygon world together with the world of video and audio. And it's funny to me, because I worked with Unreal and Unity and all those kinds of things 15 years ago in the gaming space, but it seems like that stuff is almost more relevant now as we're moving into some of these newer media deliveries.”
“Yeah, totally,” Beach says. “I mean, there are so many ways that data gets used to deliver just regular streaming video, and that sort of a hundred x's when you get into a virtual environment where you're delivering some form of video entertainment, or just entertainment, period. Being able to take knowledge of all the participants that might be interacting in it, if it's sort of a multi-viewer session…something like live sports, taking data telemetry from the game and applying it to some sort of graphics outside of the video so that you get almost an enhanced view, almost a top-down view of what's going on. All of that plays a role, and there's so much more ability within a virtual environment to see where people are specifically paying attention, which helps you drive the storytelling in a different way. So I think we're right on the cusp of really a whole new form. And I know people have been saying that and people will continue to say that, but what I think is coming is a lot more visible…obvious experiments that customers can go try and test for themselves until we find the one that people are most interested in, that resonates.”
Beach further elaborates on the specific challenges of making immersive experiences work for all types of users. “I don't think it'll be a traditional video couch-watching experience for everyone,” he says. “That will still exist…part of the challenge is that right now I think we're still so focused on ‘what is that immersive environment like?’ We're not yet thinking in more detail about those who don't want that experience, or who can't have it for accessibility reasons. So how do we extract a piece of that experience from the virtual environment back out into the physical world so they can participate as well? And when we figure that out, that to me is really the metaverse, because that is the hybrid virtual nature.”
Siglin agrees that an entirely new approach is needed for hosting and presenting immersive entertainment. “We've done a really good job of mimicking broadcast TV, but that's all we've done,” he says. “And the immersive models essentially are thinking way beyond that to say, what are things that a linear solution can't do? Here's one question for you: it seems like the industry is waking up to the idea that we don't necessarily need to race to the bottom for real-time streaming, at ‘500 milliseconds,’ if it compromises synchronization to other elements. So it seems like we're moving toward a ‘let's find a sweet spot and agree that we can synchronize.’ And I'm assuming that's really important in the immersive environments as well.”
“Absolutely,” Beach says. “Particularly in one where you've got people that are potentially interacting with each other while they view the content, and they're viewing synchronized content. So you're really dealing with a lot of different interaction overlaps, and you still need that to be fairly low latency, but you are right. There is a sweet spot. I will say there's still going to be a desire and a need for low latency out there, but it won't be used everywhere. It'll be specialized scenarios where it's absolutely needed. So we're not done with it by any stretch. I think we're continually going to see people pushing, but they're pushing it in the right areas now, which is for specific use cases, not just everywhere.”
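To make the “sweet spot” idea concrete, here is a minimal, hypothetical sketch of one common way synchronized viewing is handled: every client targets the same fixed latency budget behind the live edge and gently adjusts its playback rate to stay there, rather than racing to the lowest latency it can individually reach. This is not something Beach or Microsoft described; all names and numbers below are illustrative assumptions.

```python
# Hypothetical illustration of a shared latency target ("sweet spot") for
# synchronized viewing. Each client computes where it should be in the stream
# from a common wall clock, then nudges its playback rate to drift back into sync.

import time

LATENCY_BUDGET_S = 2.0   # agreed-upon delay behind the live edge (assumed value)
MAX_RATE_ADJUST = 0.05   # cap playback-speed correction at 5% so re-syncing is invisible


def target_position(stream_start_wallclock: float, now: float) -> float:
    """Playback position every client should be at, given a shared wall clock."""
    return (now - stream_start_wallclock) - LATENCY_BUDGET_S


def playback_rate(current_position: float, stream_start_wallclock: float) -> float:
    """Return a slightly faster or slower rate so this client converges on the target."""
    drift = target_position(stream_start_wallclock, time.time()) - current_position
    correction = max(-MAX_RATE_ADJUST, min(MAX_RATE_ADJUST, drift * 0.1))
    return 1.0 + correction


if __name__ == "__main__":
    stream_start = time.time() - 60.0   # stream began a minute ago
    my_position = 57.4                  # this client is slightly behind the target
    print(f"target position: {target_position(stream_start, time.time()):.2f}s")
    print(f"playback rate:   {playback_rate(my_position, stream_start):.3f}x")
```

The point of a sketch like this is that the absolute latency number matters less than everyone agreeing on the same one: small, bounded rate corrections keep interacting viewers locked together without chasing the lowest possible delay.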
“Right, good,” Siglin says. “Andy, always interesting to talk to you, and 10 years at Microsoft! Wow. That is hard to believe. Congratulations on the…what is that? That's not the diamond anniversary, it's some anniversary…”
“Maybe,” Beach says. “If they give me a diamond, I’ll happily take it!”
Learn more about VR, AR, and the metaverse at Streaming Media East 2023.