
Empowering Creators with Real-Time Graphics

Natalya Tatarchuk, Distinguished Technical Fellow and Chief Architect, VP, Professional Artistry & Graphics Innovation at Unity Technologies, joins Marc Petit (Epic Games) to discuss her long career in graphics engineering and rendering, how to provide creators with accessible technology, and the challenges ahead in the quest to build the open metaverse.

Guest

Natalya Tatarchuk
Distinguished Technical Fellow and Chief Architect, VP, Professional Artistry & Graphics Innovation, Unity Technologies
Guest on Episode 14 of Building the Open Metaverse

Read

Announcer:

Today on Building the Open Metaverse.

Natalya Tatarchuk:

I think the convergence in some ways is here. I mean, we see a lot of experiences that are really straddling the line between linear stories and interactive stories. And I mean, this varies across the board from projects created in Unity. You've seen some of the recent examples with Neill Blomkamp and others. You guys have put out this amazing Matrix experience, but we've also seen truly fantastic stories, for example, from Pixar. There's a lot that's happening that takes advantage of both the interactivity, but also the capability of the hardware to create these stories.

Announcer:

Welcome to Building the Open Metaverse where technology experts discuss how the community is building the open Metaverse together, hosted by Patrick Cozzi from Cesium and Marc Petit from Epic Games.

Marc Petit:

Welcome to Building the Open Metaverse, the podcast where technologists come and discuss how to create the open Metaverse together with the community. My name is Marc Petit from Epic Games, and today I'm on my own, as Patrick Cozzi from Cesium, my co-host, could not make it, and we did not want to reschedule this conversation as we have a great guest for you today, someone who is really quite busy. So we're super happy to have with us Natalya Tatarchuk of Unity. Natalya, welcome to the show.

Natalya Tatarchuk:

Thank you very much, Marc. I appreciate the opportunity to speak to this audience and I'm excited to have that conversation.

Marc Petit:

Well, thank you. Thanks for being with us. So a little bit of your background: you studied computer science and computer graphics at Harvard and also at Boston University. And you started as a graphics software architect at AMD, I guess it was probably ATI back in the day, where you worked a lot on parallel computing and real-time graphics techniques. So real-time has been a passion of yours for quite some time.

Natalya Tatarchuk:

Oh, yeah. Well, I'll give you a tiny bit about my background. So I come from an engineering family; both my mom and my dad were engineers back in the Soviet Union. And the memory of my childhood actually was very much focused on arcade machines that my dad used to design, which are now in the museum of computer games in Moscow. And so I was very lucky to see this from the time when physics was encoded in hardware, and of course, it was all real-time. You interacted with a joystick. And it's funny that we're here talking about the Metaverse, because he was taking a real-world game where you throw physical bats and converting that to be baked into the hardware.

Natalya Tatarchuk:

And so as a kid, I watched him design this game and convert it into experiences that people can use with their hands, but also with their minds. And that kind of got me super hooked on the notion of what it means to do real-time and to do games and to do interactive experiences. And from then onward, the journey kind of moved in the direction of finding the path through how do I help people create content. So I actually worked on 3D haptics, for example. And funny anecdote, this is where I learned the importance of divide by zero. We were an MIT startup that had these gigantic, super heavy steel arms. And when an exception was thrown across multiple threads, it generated a divide by zero in the force feedback thread, and it would whack you.

Natalya Tatarchuk:

And so with real-time responses, you have to be really careful about how you process, but it was a fun introduction to how you combine these fully virtual experiences. We were crafting in virtual space while using a physical device. And so you had to create this connectivity between the imagination of the person who's doing it in air, right, to the final result on the computer, and create that interface that dynamically gave you that feedback. And it got me completely addicted to the feeling of shaping content and creating in the virtual world from then onwards.
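The force-feedback hazard she describes generalizes: any haptic loop that normalizes a direction vector divides by its length, and at zero length the force command blows up. A minimal sketch of the defensive pattern (the function names and constants below are hypothetical, not the actual startup's code):

```python
import math

def spring_force(pos, target, stiffness=0.8, max_force=5.0):
    """Capped spring force pulling a haptic device at `pos` toward `target`.

    Guards the zero-distance case that would otherwise divide by zero
    when normalizing the direction vector.
    """
    dx = target[0] - pos[0]
    dy = target[1] - pos[1]
    dz = target[2] - pos[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist < 1e-9:
        # At the target there is no direction to normalize, so command
        # zero force instead of dividing by ~0 and whacking the user.
        return (0.0, 0.0, 0.0)
    # Clamp the magnitude so even legitimate large errors never command
    # more force than the hardware can safely deliver.
    magnitude = min(stiffness * dist, max_force)
    return (dx / dist * magnitude,
            dy / dist * magnitude,
            dz / dist * magnitude)
```

The clamp matters as much as the zero guard: after an exception leaves the state inconsistent, bounding the output is what keeps a steel arm from becoming dangerous.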

Natalya Tatarchuk:

And then of course, like you mentioned, moving to AMD and ATI allowed me to see how you drive features through the hardware design, how you think about creating standards, because a lot of what we were doing in the team was not just focused on creating new techniques and new research; a critical part of enabling new techniques and new hardware is to create API standards that allow a number of different companies to participate in the experience of those features. And that was a really good learning experience for me in particular, because a lot of times people think about standards as these involved committees where people dumb things down to the lowest common denominator, but the importance of standards is actually about creating something that people can align on that allows differentiation to go beyond the specific standardization space.

Natalya Tatarchuk:

And I think this is actually a critical thing that I want to discuss with you when we think about the Metaverse and the creation of an interchangeable data format. How do we define these standards that allow that specialization to shine as well as durable interchange? And then of course came the world of live games after that, with Destiny, which is still going, and it tickles my fancy to know that the rendering code I wrote still plays a part in what millions of people experience every day. And that was a tremendous learning experience because, of course, games are some of the most complex ecosystems in terms of software and creativity, combined with the power of constrained computing systems, but it's also interesting to see how people are creating content in these customizable worlds, in these customizable player experiences.

Natalya Tatarchuk:

How are they going to create the evolution of that content going forward? And so there are a lot of challenges that I think are unique to live games. And I know you guys at Epic have built quite a lot of expertise in that space, and it's a very interesting topic, very much connected to the Metaverse, because when I think about that concept, whether it exists or not, to me it's a giant live game, right, in the end, giant live software and so forth. And after Bungie, what I decided to really dig into is how do we enable millions of creators to create effectively? And that's ultimately what drew me to Unity, because I felt that the opportunity to make the biggest difference was there, and so that's where I am today.

Marc Petit:

Okay. So tell us what you do... Before I go there, I actually have a personal question. When I was a student in France learning engineering, I could never understand why the textbooks in physics and mathematics were very expensive except the Russian ones from the Mir editions. Why were those books so cheap? I learned all of my physics and mathematics from Russian textbooks, from the Mir editions, because those were the only ones I could afford back in the day. I don't know if it was part of the propaganda or whatever, but when you mentioned the Soviet Union, I had this flashback about those black books, the Mir editions. Anyway, so tell us, what's your role at Unity now? So you joined in 2017, of course, focused on graphics.

Natalya Tatarchuk:

Yeah, so when I joined, it was actually 2016. When I joined, my focus was to help evolve what we can do with the Unity software for graphics. And we've grown from a tiny team to a tremendously large influence on the world of creators, and much of it has to do with the focus on creation, on creator workflows, but also on enabling scalability. So I ran the product teams for graphics, and we were a multidisciplinary crew: artists, designers, producers, but of course a slew of engineers, ranging from traditional graphics engineers to UI and tools coders, machine learning engineers, asset engineers, you name it. And a big part of that was... a lot of times, graphics gets misconstrued as these rendering nerds who focus on the algorithms, but it is so critical to build the mindset that graphics is really about how you author the source data.

Natalya Tatarchuk:

The pixels don't come from the final RGB representations; they come from somebody thinking through their intent. What is it they are trying to represent? And if you're creating a graphics system, you're creating it end to end, from the authoring to the final representation, through the platform journey, through the performance and transformations that are necessary for specific platforms. And so much of it has been focused on rethinking: how do we create end-to-end coherent systems? Since then, I've moved to a different role as Distinguished Technical Fellow and Chief Architect, Professional Artistry and Graphics Innovation. I also vie for the longest title. There's probably one person who can win it, but no, I'm kidding. It's an accident of fate. What I'm doing right now is effectively looking at how we rethink, again, content creation tools and the ingestion of their results.

Natalya Tatarchuk:

You probably have seen some of the recent changes we've made, bringing Weta and Ziva and SpeedTree and a few others into our teams, and a large part of it is focused on trying to make it super easy for people to create content without having to think about the complexity of how they manage sending the data through the pipeline, how they get it to real-time. And also enabling artists by meeting them where they are, right? A lot of them spent hundreds of hours training a sort of muscle memory, right, whether it is with a Wacom tablet or a mouse and keyboard, and moving from that muscle memory into some other environment is a painful change that is sometimes quite fracturing for a lot of them.

Natalya Tatarchuk:

So we're focusing on building an ecosystem where we can meet them where they are, but then help them become superheroes by letting them meet the audience that they want to meet, right? If they want to meet it on millions of mobile devices, fantastic. We don't want to dictate where they want to see that audience. And I think that is a critical part of building connected systems in those giant live games or giant experiences, because we won't know where the audiences will come to us from when we're creating a particular experience. And the more flexibility we have as creators in where we can actually meet the people who consume our creations, the more we create shareable experiences. And so right now, we're focused on building the service ecosystem for that. And then of course, moving the field forward for graphics innovation and what real-time and graphics in general can do.

Marc Petit:

Fascinating. So before we talk more about the convergence of movies and games, I want to call out that for the past 15 years, since 2006, you've been at the helm of one of the most important courses in graphics for real-time, called Advances in Real-Time Rendering in 3D Graphics and Games. And I just want to call out all of your leadership through the past 15 years. That's a lot of time, 15 years. When you set your mind to something, you don't give up, right?

Natalya Tatarchuk:

Not really. Yeah. No, it's been an honor to have SIGGRAPH's support for this many years, and I have to call out huge thanks to 15 years' worth of chairs and the organizing community. And a big part of the success is... You're right, I don't give up. And when I'm lucky enough to not give up, I also have a community of people who are interested in sharing their ideas, because you can't really build... I mean, the reason I started Advances was, frankly... Yes, I've been in the graphics industry for over three decades. And those of us who started out in games back then know that if you wanted to implement skinning, you didn't go online and look up the algorithm or find a shader, right?

Natalya Tatarchuk:

There was nothing. There were no resources for how you do that. And in many ways, the goal for the Advances course was to really foster a community of innovative thinking that is focused on sharing and building knowledge in the community at large. And I'm grateful that so many people took that call. The speakers have been extraordinarily open about their implementation details, their specific sort of special sauce. We could have had a different outcome, and only through their openness and the support of SIGGRAPH were we able to build that knowledge base.

Marc Petit:

Yeah, this is fantastic. So, are you going to be back in 2022?

Natalya Tatarchuk:

Indeed, we have a pretty good set of speakers already signed up. And in fact, some folks from your company, from our company, from NVIDIA, and many others are excited to speak, and we're hoping to capture a lot of the innovation that's coming in the next six months as well. It's going to be different. I don't know what the format will be. Will it be virtual? Will it be in-person? Of course, we miss the live community of humans meeting up, but while we were in the virtual world during the pandemic, the SIGGRAPH community, or I should say the Advances community, managed to build quite a good set of discussions on Discord and otherwise. And I think that's something we want to lean into more, right, because it invites more people to participate.

Natalya Tatarchuk:

And so in fact, one of the things that we started last year and that we're going to carry forward is an open conference on real-time rendering... sorry, on rendering engine architecture. We are co-organizing that, kind of taking the course that we put forward last year and making it accessible to all, so that people can participate in a lot of the innovations that are coming from the world of both real-time and non-real-time rendering engines. And what we aim to do is focus on really building a shared understanding of what the design choices were and how people approached the problem, rather than specific solutions: what were they evaluating when they were thinking about a particular, let's say, architecture design or algorithm, so that many can also learn. Roads not taken are just as important as the roads that you do take, and often they're not talked about, right? And that might help somebody else's journey, and so we're hoping to make that a really thriving conversation as well. So that will be around, I think, during June this year.

Marc Petit:

Okay. Well, thank you. Thank you for all those years of support. And if you need help, let us know. We're always happy to help good causes. So, movies and games. You mentioned the Weta acquisition, and I think it's on top of everybody's mind. We're all familiar with the kind of insane quality that Weta would get out of the Manuka renderer. And at the same time, having that team now be part of a real-time company is interesting. So how far away are we from this convergence of film and games, and the ability to create a Metaverse that looks like whatever we want, photoreal or not?

Natalya Tatarchuk:

It's interesting that you ask that. I think the convergence in some ways is here. I mean, we see a lot of experiences that are really straddling the line between linear stories and interactive stories. And I mean, this varies across the board from projects created in Unity. You've seen some of the recent examples with Neill Blomkamp and others. You guys have put out this amazing Matrix experience, but we've also seen truly fantastic stories, for example, from Pixar. There's a lot that's happening that takes advantage of both the interactivity, but also the capability of the hardware to create these stories. So I wouldn't say that the convergence is in the future. I think the interesting conversation is what is actually needed to enable the creation of more stories, right? I mean, ultimately, Weta is, of course, the tip of the spear, literally, of what's possible to do with computer graphics today.

Natalya Tatarchuk:

But a large part of it is that it allows you to create in a sort of stable, guaranteed-to-work environment. When Loki, which is Weta's simulation tool, runs, you know that you are not going to be thrown off because some part of the simulation, let's say the cloth, will freak out because you sat on a chair, right? It is so well-designed and carefully constructed that you actually have a guarantee that your simulations will work when the hair goes into the water, right, like you saw in Alita. Whereas in many other cases, you have to code all these special edge cases, especially in games, right? I remember in Destiny, we had cloth, right? And then we had to code all these millions of use cases. If you grab the weapon from the back of your backpack, do this. If the character was shot down and we were saving the rag doll bodies to do something else with them, all these...

Natalya Tatarchuk:

These edge cases end up costing you, effectively, your iteration time, because when you look at the point of view of a story creator, they need to think, "Hey, how will my story move forward," but they also need to think, "Hey, wait a second. Is my cloth going to freak out? Is my, I don't know, armor going to glow because I walked into a corner," right? And suddenly, you have slightly wrong lighting that previously you didn't account for. So one of the things we will be working together to converge with real-time is to bring that guaranteed stability, that pipeline that just works, right, from this very sophisticated tooling, from the deep understanding of physics and optics, into the creators' hands, the millions of people who are doing this on a variety of hardware and a variety of platforms.

Natalya Tatarchuk:

And it's a hard problem, but it's also something that we can start bringing out today. I mean, you can create hair, for example, with Weta's Barbershop tool and start bringing it into games. So it's about creating a set of milestones, so we can start sharing more and more and more. At the same time, we're going to keep pushing what the pinnacle is, what the tip of the spear is. And so when I think about convergence, to me, this is about solving the content creator's iteration time. And if you look at Weta, too, they're equally interested in real-time technologies, because in the end, to them, creators' flow, creators' iteration time, is a critical part of success, right? Not just purely making it cheaper to produce, because it's not just about that.

Natalya Tatarchuk:

Those in games and those in movies know that the more you can iterate on the final look, the final experience, the faster you'll feel that you've arrived at the story that you want to tell. And in the end, it's all about the stories, right? And so they're just as interested in bringing a lot of the elements that we bring from our side, from the real-time domain, into their pipeline to see what can go faster, right, where we can save a lot of time on the compute farm and move it into the real-time space. We've seen a lot of innovation happening in virtual production, in animation, in pipelines. For example, Ziva Dynamics is a huge force in making character creation a significantly streamlined experience through machine learning elements. For example, you could take the Ziva RT trainer, plug it in as effectively a deformation node in Maya, and everybody who's animating, not even people who are targeting real-time, gets feedback in real-time.
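The workflow she describes, training on offline simulation results and then evaluating a cheap learned approximation inside the animator's tool, can be illustrated in miniature with a linear least-squares fit from pose parameters to vertex offsets (a toy stand-in under that assumption; Ziva's actual models and solvers are far richer):

```python
import numpy as np

rng = np.random.default_rng(0)

# Offline step: "simulate" vertex offsets for sampled poses. Here the
# ground truth is a fixed linear map plus noise, standing in for an
# expensive muscle/flesh solver run on a farm.
n_poses, n_pose_params, n_verts = 200, 6, 50
true_map = rng.normal(size=(n_pose_params, 3 * n_verts))
poses = rng.normal(size=(n_poses, n_pose_params))
offsets = poses @ true_map + 0.01 * rng.normal(size=(n_poses, 3 * n_verts))

# Training step: least-squares fit of pose -> offsets.
learned_map, *_ = np.linalg.lstsq(poses, offsets, rcond=None)

# Runtime step: evaluating the learned map is one matmul per frame,
# cheap enough to sit inside a DCC tool as a deformation node.
def deform(pose):
    return (pose @ learned_map).reshape(n_verts, 3)

new_pose = rng.normal(size=n_pose_params)
approx = deform(new_pose)
exact = (new_pose @ true_map).reshape(n_verts, 3)
err = np.abs(approx - exact).max()  # small, bounded by the training noise
```

The design point is the split: all the solver cost lands in the offline and training steps, so the animator's loop only ever pays for the cheap evaluation.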

Natalya Tatarchuk:

Think how much more powerful that iteration loop becomes, because you're now thinking not just about creating the frames, but about creating a faster loop to get the story to the evaluation point. And that's what we're focusing on. And then of course, the other part is, with Weta, these are amazing tools, but if we look at the margins of computer films, they're extraordinarily painful. So many houses aren't able to tell the stories that they want to tell, right? We see this enormous proliferation of blockbusters, and I enjoy them, right? I just watched one this weekend. It's awesome. I love the spectacle. I want more of them. But at the same time, there are always stories that we never get to hear because people simply didn't have the money to produce them, right? It's a very basic question. You weren't funded, right? And one of the things that is near and dear to my heart, and why I'm at Unity, is because we're really focused on letting these stories be told, right?

Natalya Tatarchuk:

One example: just recently I was looking at a company that is helping neurodivergent people and also helping some of the indigenous tribes in Siberia to create a preservation of their stories using AR, right? And in a way, it's a different type of funding for films, for stories, because... It was funny, because when I was watching their presentation... I'm originally from Siberia, the Ural Mountains, right? And so it was actually really, really crazy, like it brought tears to my eyes, because I saw the stories that my grandmother would tell, right, in this AR experience. And they didn't have any money, right? It was a very poorly funded project, but they had all the tools that they needed in order to create that. And so there's a point that I'm making about Weta... How much more powerful will the stories we see in the world be if we bring that to millions of creators, if we make it inexpensive for them to participate, or free?

Natalya Tatarchuk:

And I have a young daughter, and I'm super excited for her to see these stories, because it's not just about the one experience that can get funded because it's low risk. It's also about the myriad of other stories that may be extraordinarily risky to tell, or maybe they're just not as widely interesting, or maybe they're from a small, unproven team. And I think that's something that we can enable. And I think that's the convergence that real-time really can bring to the party, because we're going to make it cheaper to produce stories, we're going to enable faster feedback so people can iterate on the stories, and I know that's what other companies in the field are also eager to do. So the more we all band together to enable this, the more we'll see this proliferate around the world, and the more excited I am to see that world.

Marc Petit:

100% agree on that. Softimage used to say that productivity is creativity by another name. I mean, the quality of a creative product is just the more iterations you put into it, and real-time really helps. So you mentioned bringing tools to millions of people; what is it going to take to actually have content creation tools that are usable by the masses? Where do you think are the gaps or the trajectories that we need to explore?

Natalya Tatarchuk:

Well, there are a couple of things that we are focusing on. So of course, some of it is about releasing the specific tools, right? So on our docket, you mentioned Manuka; Barbershop, a whole bunch of them, they're all on the Unity website. They're amazing at creating specific best-in-class solutions. Let's say I wanted to create amazing forests, like Pandora in Avatar: Lumberjack, Totara, they're basically the way that I can approach that. But even beyond that, if you look at what it takes to create, so much goes into the orchestration of IT aspects: my source control, which version of the content is published, how am I getting this particular file to go to this other program. You mentioned Softimage; how do I get this into my Maya, or is some part of my team in Blender, right?

Natalya Tatarchuk:

Do I have a bunch of people sitting in Houdini? Some of them are using Substance. Some of them are using Unity. Some of them may use Unreal. Some of them may use a custom engine. Much of what we need to focus on, and this is a call-to-action to many... In fact, I want to call out a fantastic presentation by Kim Libreri in a previous podcast that you guys had. He said something like, "Hey, what can two engines do together?" And I have a couple of examples, but I also want to say this. Raph Koster recently gave this presentation on the Metaverse that really stood out to me. It was extraordinarily eloquent, but fundamentally: there's no data portability without standards, and standards are a social coordination problem. I thought it was so well put.

Natalya Tatarchuk:

So back to Kim Libreri. What I think we should do, the true power of the two gorillas, so to speak, of the third-party engines: there are so many scenarios that require us to work together to create these standards for data. And when we look at the world of creation, you mentioned tools, what will it take to get to millions of people? I look at the journey of 2D images, right? And the standardization that happened because we were able to take a GIF and copy it, right? Apple did this amazing work so that I can copy a GIF and paste it into a text, and I can copy a JPEG and paste it into my email. And it became something that most people, who are not deeply knowledgeable professional creators who know the ins and outs of OBJs and everything else, don't think about. This format became ubiquitous. So my ask, what we can do with two engines, is let's come up with a proper standard for a real-time 3D data format.

Natalya Tatarchuk:

It doesn't exist right now, right? When we look at the larger, long-term perspective of content creation, to enable creators to do this with ease, we actually need to make real-time content a first-class citizen in the digital world and in the OSs, and that's nowhere near the case right now. We have USD. It's starting to get a lot of traction. I know we are investing very heavily in the format, and Guido talked on this podcast about the power that the format brings, which is super exciting, but it's fantastic at describing static assets, right? So how do we think about not just the complexity of creation, what it will take for me to author that hair, right? Great, with Barbershop I can go and create the curls and so forth, but how do I make it so everybody can grab it and send it over between different platforms? Even just between different tools right now, it's a nightmare scenario.
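Her point about USD being "fantastic at describing static assets" is visible in how naturally a usda file expresses geometry. A trivial illustrative layer (not a production asset):

```usda
#usda 1.0
(
    defaultPrim = "Chair"
)

def Xform "Chair"
{
    def Mesh "Seat"
    {
        point3f[] points = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
        int[] faceVertexCounts = [4]
        int[] faceVertexIndices = [0, 1, 2, 3]
    }
}
```

Geometry, hierarchy, and materials serialize this cleanly today; the cloth dynamics, procedural rules, or gameplay behavior attached to such an asset have no comparably standardized place to live, which is the gap she is pointing at.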

Natalya Tatarchuk:

And by enabling these robust interchange programs, right, by working together with Kim, with others, with Guido, et cetera, we actually can create a fully interchangeable, real-time 3D format. And that needs to pack dynamics, that needs to pack proceduralism, that needs to pack rigging, that needs to pack animation curves, that needs... And I know right now, whoever is listening to this from the USD world is like, "Oh, you're insane. This is so hard. Nobody's been able to standardize animation curves alone. That is horrible." Raph made that joke in his presentation about how two engines can't agree on which axis is up, right? Animation curves, how much worse is that? But without that, even if we create enormously incredible tools, we still have the limitation for real-time 3D.

Natalya Tatarchuk:

And this is a huge limitation, in my opinion, that we need to solve together for the Metaverse. And by the way, I view the Metaverse as being about creators because, to me, this live, complex ecosystem is about people creating collaboratively, people creating together, people creating with trust that their creation will be durable, that it will be attributed to them, that it will be something they can share with the world, but also profit safely from, right? And we can't get there if the data is not a first-class citizen, and so that's the biggest thing I think we need to work on as a whole industry, because we're pretty far from that for real-time 3D.

Marc Petit:

Yeah, no, I agree. And Patrick and I, I think, are in violent agreement, and our approach has been to try to converge two important trends. You mentioned USD. USD is an open source library, and then the other aspect is we have glTF, which is an open standard managed by the Khronos organization, and we had a conversation with Neil (Trevett) the other day about it. And actually, Vladimir Vukicevic from Unity, Vlad, had a great approach of trying to go incrementally toward that by adding properties, and he made the point of experimenting in the open so that we advance the whole industry. And back to SIGGRAPH: Patrick and I are trying to create something at SIGGRAPH around the open Metaverse, about charting the way. I think exchanging digital characters is probably a decade-long project, and characters always come with a stigma. So we're talking about cars, something that people can get around ... What would it take to exchange drivable cars? To drive a car from Unreal to Unity, working with Neil and the glTF team, and a bunch of other teams.

Marc Petit:

We're trying to unpeel that onion and say, "What is it going to take to exchange, say, a rig? Okay, now we need to rig a suspension; can we at least exchange that?" And try to have a very pragmatic, incremental approach with a lot of small wins along the way, so that it's crawl, walk, run, fly. I look at it as a 10-year-long engagement to try to drive that interoperability. And hopefully, we get to take the first steps in 2022. So we'll call on you for that, Natalya, and work on that. So it's an interesting segue, because this Metaverse thing is happening. There's a lot of hype right now, but when you've been in the graphics industry, you know it will happen. So what's the role of the SIGGRAPH organization in potentially being a force for the open Metaverse? What do you think is the potential of SIGGRAPH, and what should we be doing there as a community?
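One pragmatic shape for the incremental wins Marc describes is glTF's sanctioned extension mechanism: core geometry stays portable everywhere, while an experimental extension carries the semantic layer until it is proven. A sketch (`EXT_vehicle_suspension` and its fields are hypothetical, not a registered Khronos extension, and the mesh indices assume a fuller file):

```json
{
  "asset": { "version": "2.0" },
  "extensionsUsed": ["EXT_vehicle_suspension"],
  "nodes": [
    { "name": "car_body", "mesh": 0 },
    {
      "name": "wheel_FL",
      "mesh": 1,
      "extensions": {
        "EXT_vehicle_suspension": {
          "restLengthMeters": 0.35,
          "stiffness": 42000.0,
          "damping": 3800.0
        }
      }
    }
  ]
}
```

An engine that does not understand the extension still imports a valid static car, which is exactly the crawl-walk-run property he is after.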

Natalya Tatarchuk:

I hope it's crucial, as you mentioned, right? SIGGRAPH has been around for quite a while. Well, funny, I remember going to SIGGRAPH when KIV, right, the original sort of super OG, very closed format, by the way, very much the opposite of what we're talking about, was a thing. And they had presentations about it. But at that point, and still now, SIGGRAPH is the place where a lot of the innovations and thought leadership are introduced, but then also shared with a really wide audience. And I think the ability of SIGGRAPH to reach across so many different manufacturers, so many different companies, big and small, by the way, because I think this really needs to be... Vlad wasn't a company when he was driving his format, right? You guys are driving them. We will be a part of it. We're big companies, et cetera, but it's about combining these diverse points of view and creators.

Natalya Tatarchuk:

And SIGGRAPH gives you a platform to do that, not only to structure the conversations, but also then to expand them out to a larger set of people who can follow them. And because SIGGRAPH has all these decades of conversations built in, right, it also creates trust. And I think one of the most important things it can do is create space so that these conversations are nurtured, right? So they're invited, so that they're thriving, and then broadcast later. So in other words, and I'm going to call this out as a challenge for SIGGRAPH: open up the ACM DL, right, the digital library, and make these conversations open. They should happen on YouTube. Make it wide open so we can crack the walls, because combining the immense trust, right, the immense knowledge base that SIGGRAPH has, from the pioneers of KIV to now, the bleeding edge combined with the knowledge of the past, that is a unique advantage. But bring this conversation to a much larger audience; that's the responsibility and the opportunity.

Marc Petit:

Yeah. And building an online arm, like we've been doing on Discord at SIGGRAPH 2021, is a way to reach more people and create a more permanent conversation, not a once-a-year one. So I agree there's a lot of opportunity, and a bit of a challenge too, because transforming an organization is always difficult.

Natalya Tatarchuk:

Well, that's the responsibility that I really wish for. Again, a lot of it is about the sustainability of the organization, but I think there's a lot to be said about the organization itself having a responsibility, right? That's what I'm hoping to encourage. And I know they care about driving computer graphics, but also about driving the fields related to it. In many ways, when you look at your car example, so much goes into it that isn't at all about the elements that are traditionally SIGGRAPH, yet they are part of making that story. I mean, for the love of God, you'll need...

Natalya Tatarchuk:

I still remember being on a jury for the SIGGRAPH general program when a set of papers came in on TCP/IP protocols, and I remember raising an eyebrow. At the time, I didn't really know the relevance. I was like, "What does this have to do with SIGGRAPH?" And some of the incredibly smart folks from the film industry, which at the time I was still learning about, said, "Well, this is a critical part of what it takes to make a movie. If you don't have an effective scheduling mechanism on the network, you'll fail." So to the point of bringing together all of the relevant, related fields and then opening it up to a larger conversation, that's what SIGGRAPH can do.

Marc Petit:

Yeah, I agree. The leadership potential is right there. So you and I work at commercial game engine companies. What is your take on the role of open source game engines like O3DE and Godot? We had Juan (Linietsky) and Royal (O’Brien) talk on this podcast a few episodes ago. So how do you approach this at Unity?

Natalya Tatarchuk:

Well, I absolutely support it. You see Vlad as a part of the conversation; we are absolutely focused on creating a lot of open source technology. We've done that over the years. In fact, most of the graphics work that we do, if you look at the Scriptable Render Pipeline, is developed fully in the open, on GitHub, where you can see all the PRs. There are a couple of elements here that are, I think, super important for the industry at large. One, it opens up an opportunity to participate for people who may not be able to afford it otherwise; even subscription fees or revenue sharing might be a barrier for some of the folks who are interested. But they have an opinion and they might want to express that opinion, so that's a huge part of enabling open source.

Natalya Tatarchuk:

And the other part of the equation, and you know this as well as I do, is that there are many early platforms and many other elements that are not going to enter the domain of open source, because the respective platform holders have their own positions on how to treat their intellectual property, or even just when to introduce that intellectual property to the world, and we have to be respectful of that too. So I love the idea of having safe playgrounds: Godot, Blender. A lot of the innovation has been happening in that space because so many people can participate in it. This is also where we can evolve, back to that point of social contracts for standards creation. A lot of it is happening in open software because it's a really easy place to express your point of view without any of the complexity that comes with taking a position on a particular standard, even something as simple as, will I have the money to attend a consortium, right?

Natalya Tatarchuk:

If you look at Vulkan or some of the others, pragmatically speaking, you need to have the ability to participate in those consortiums, and some of that comes with fees. That's not a negative statement; open software simply doesn't require them. That's where we see some interesting evolution, some interesting experimentation. It gives people the opportunity to grab it and change it to their needs as they desire. But I personally believe there is a space for both. There's a lot of value in third-party engines that provide support and stability, a level of reliability that open source might not necessarily give you. And that's also important to provide, because when we're talking about a company that effectively puts the livelihoods of hundreds or thousands of people on the line for a particular project, they need to know that they can rely on that project to be thriving. So there are different perspectives that are worthwhile to consider.

Marc Petit:

Yeah, absolutely. It's about combining the power of open standards, and I like your reference to a social contract, it takes a lot of people, with open source, which is a place to innovate and experiment in the open...

Natalya Tatarchuk:

And to share specific positions. At Unity, we do quite a lot of it. In the end, this is a super critical part of how we enable people to tell more stories, how we enable more creators, because it's about creating an open system, not just about open source. You've seen the enormous strife that is happening right now because of walled gardens. In fact, I would say the most negative thread in the Metaverse conversation has been around the question of, will it be a walled garden? Will it be one system that a lot of other people don't get to participate in? And whatever the opinion is about the creatorverse or the Metaverse, the one thing that I love about your podcast is actually the open part, right? The critical part of the emergence of the Metaverse is helping it to be open, to not be driven by one company or one platform or one API, so that there are hundreds and thousands and millions of destinations in that space.

Natalya Tatarchuk:

And that comes primarily from open standards, open participation, right? You need to be able to plug in your data, to your point about the car. Back to the scenario from Kim, what can I think of as an example of the two engines doing something together? Super simple, right? Even if we just want to create a concert, or a UFC fight with some of the stuff that Peter Moore is doing with Metacast at Unity: "Hey, what would it take to have a Fortnite character show up at that UFC fight or concert?" How would we enable that together? I think this is a super interesting thing for us to really band together on, so that my player identity... I created this character in Destiny, and I can take it with me everywhere else, because I've invested a ton of time into that character. I love it, right? How do we make these open standards become, effectively, the springboard for enabling the openness of that creator platform?

Marc Petit:

Because I'm a geek, an optimist geek, I think we'll figure out the technical problems. I worry more about finding a business model, and that's the challenge of this new creator economy. If you take a Destiny character into Fortnite or into another game, how does a creator get a fair share of their creation, and how do we create an economic model around that? So I think that business challenge...

Natalya Tatarchuk:

I think that's actually one of the hardest things we will have to solve. I agree with you. Whatever I think about a USD extension or whatever the new format for real-time 3D ends up being, it won't be instantaneous, but we will solve it. We have so many smart people on this problem; I'm the least smart person in that conversation, right? And it's amazing. But making the creator participate in future revenues from downstream usage of their work... I recently had a conversation about the music industry and real-time 3D. Piracy aside, there's this thriving royalty-free world of music: great, I can use any song, I can do whatever I want with it. It's explicitly given to me as a...

Natalya Tatarchuk:

But if I am a DJ or a composer and I remix a song, there is an actual system for tracking how the royalties from that song flow through. I was recently looking at Madonna's Ray of Light, and it turns out it was based on a '70s song from a very obscure musician, but she paid him a very hefty royalty based on his creation. Likely nobody even remembered the musician behind that particular song, but his creation was able to provide a living for his descendants, for example, because of the implicit agreement for how to deal with the conventions of copyright. That's one of the biggest things that we need to solve, and that's hard. That's not trivial.

Marc Petit:

If I can volunteer an opinion as a host here, which doesn't happen very often: for me, the way you implement this is smart contracts, okay? And there are multiple ways to implement smart contracts. You don't have to adhere to the fully decentralized model of blockchain; there are a lot of ways to implement this database. We have a banking system that can reconcile transactions. It does not equate to a full embrace of blockchain. There are multiple ways to solve this problem, blockchain being one of them, with a certain set of characteristics, but there will be others. But I do believe we need that traceability of the content so that we can implement fair business models in the creator economy. That's probably going to be one of the biggest problems we have to solve.

Natalya Tatarchuk:

Well, a big part of it, if we go back to the technology: one, of course, we shouldn't pick the most complicated solution, we should pick the most doable solution. Step one, let creators participate in the profits with the simplest choice possible. Two, and this is a technical solution, back to that durable, portable data format. That needs to happen, because to take the data from one place to another, we need a durable, portable data format. And not only do we need a durable, portable data format, we need one that allows us to recognize variations of that format. I think that's another critical part, and it's quite difficult.

Natalya Tatarchuk:

If you look at the journey of even a single production that ships, whether it's Weta shipping multiple versions of Avatar, like the sequels, or a game that ships different seasons or different DLCs, there's enormous effort involved in taking one set of content, pushing it through to a subsequent generation, and giving attribution on that variation. Right now, enormous content management systems are put in place just to be able to track attribution, the IP management of all the elements. Recently, I was talking to a colleague of mine who was the original creator of Gollum, and he talked about how much of the content he created has moved on to all these other subsequent productions, but there's no way for him to participate in the results, or even know that it was used. Sometimes it's not just about money; it's about the joy of creation and knowing how much it sparked in the world beyond that. But to your point, to make it thriving, we need to give people the ability to benefit from it.

Marc Petit:

Yeah. Fascinating, Natalya. So we will wrap up with our two traditional questions. The first question is, is there a topic that is near and dear to your heart that we should have covered today, or that you would want to cover in subsequent conversations?

Natalya Tatarchuk:

The main thing I'm spending a lot of my cycles thinking about right now is, how do we author in a way that gets us to scalability of content? There are a lot of conversations and, of course, tons of papers on LOD and simplification, but that's a really complicated topic, and I would love to spend more time thinking and banding together on ideas there. That's something we are focused so heavily on solving at Unity, and I'm not saying this in any market-y way, but ultimately, when creators come to us saying, "Hey, I want to tell the following story," they want to be able to reach their audience without having to lock that audience into a specific hardware model, right?

Natalya Tatarchuk:

Only PlayStation people can experience it, or only Xbox people, or only people with high-end Samsung phones or whatever else, and they don't want that; the complexity of re-authoring content is so painful. So how do we do that? It's a really interesting problem that I'm spending a lot of brain cycles on, with my team and myself, and that's what we're setting out to solve, because back to that creatorverse, we need to be able to create durable content, and durable content means scalable content.

Marc Petit:

Absolutely. Content that will stand the test of time.

Natalya Tatarchuk:

A holy grail, admittedly, but back to the same subject: as with the matter of attribution, there are certainly steps that we can take along that journey.

Marc Petit:

Yeah, you're right. We've been living in the era of durable content for the past 25 years, right? And I think that's going to be one of the big differences moving forward. Our last question is, is there a person or an organization that you want to give a shout-out to today, in the context of the open Metaverse conversation?

Natalya Tatarchuk:

I would love to give a shout-out to the Dreams team at Media Molecule and to Alex Evans, because they've done something so incredibly unique and amazing with user-generated content, with the ability to tell stories in completely different ways and with very advanced approaches to authoring content. I would love for him to come talk about how he thinks the evolution of user-generated content will go, and how machine learning can help us create content faster. They've done this amazing recent work on neural rendering with training in seconds or less; how will that change the world of content creation? That would be my dream for the next podcast, if I could.

Marc Petit:

Yeah, that's a fantastic idea. Thank you so much, Natalya, and thank you for this conversation. It's been a fantastic moment, and I'm sure people will enjoy it. Maybe we should try to have dinner with Kim in Vancouver at SIGGRAPH 2022. Hopefully, we get to meet in person and can advance the conversation further. We should not wait for Vancouver, but at least we should plan to meet in person there. I think it would be great. And again, thank you for your time today, thank you for all your contributions to the community, and for these very, very insightful thoughts. We're super happy to have had you here.

Natalya Tatarchuk:

Thank you kindly, and count me in.

Marc Petit:

Yeah. So on behalf of Patrick, who was with us in spirit today, again, thank you so much. Thank you, everybody, for listening to the podcast. The pickup has been good and the feedback is really good, and it's because we're lucky that the right guests agree to come and talk to us, like Natalya did today. So thank you, everybody. Let us know on social how you feel, and send us your questions and feedback. Thank you so much. Bye-bye.