7 Oct

Unite Copenhagen 2019 Keynote


[MUSIC PLAYING]>>Hello.
>>Hi.
>>And welcome to the Unite Copenhagen 2019 keynote,
here live from the Bella Center. You’re joining us here for
the preshow, and we’re
very glad that you have. Hello, Will Goldstone.
>>Hi, Liz Mercuri.
How are you doing?>>I’m very well indeed. I’m very
excited. So what
are we doing here?>>That’s a very good question.
What the devil
are we doing here? Well, we’re here to count you
down over the next 20 minutes leading up to the keynote.
>>No spoilers>>No spoilers. We’re not
allowed to tell you what
you’re going to see, but we’re going to give you a few hints
and we’re going to welcome a
few guests on the stage to talk about what’s coming
up, too. So for those of you joining on
YouTube, on Twitch, on Facebook
Live, Mixer, Periscope. Welcome. You’re in the right place if
you want to watch the keynote,
and the keynote is going to begin in 20
minutes, like I said. We’re counting down to
some amazing announcements. So what are you looking
forward to, Liz?>>So in particular, I’m
looking forward to some
announcements in graphics. So we’re going to be
hearing a little bit more about the applications of the High
Definition Render Pipeline. I’m also going to be having a look
at the Universal Render Pipeline, formerly known as the
Lightweight Render Pipeline.>>Exactly.
>>And we have a guest as well to
just talk about that later, which is always very exciting
indeed. What about you?
>>I’m looking forward to the latest on DOTS, the
data-oriented tech stack. So we
are going to be showing you. We’ve talked about
performance a lot in the past. We’re going to show you how
that performance kind of
permeates out to convenience and workflow. And frankly,
I think we’re going to show
you something that you’ve never seen before in a
game engine. And I’m
really excited about it.>>It’s extremely exciting.
>>It’s very cool.>>One section I’m particularly
excited about as well is we’re
going to have our VP of AI, Danny Lange, telling
us about the applications
of simulation in Unity. I’m so excited.
I can’t wait for you all to see.>>There’s some really
incredible stuff, and that
stretches beyond game development out to all kinds of other
industries, too. And, of
course, connected games, too.>>Yes. So we’re going to see
something from Madfinger
about their solution for connected games,
which is very exciting indeed.>>It’s true. It’s
true. So there’s over
150 sessions at Unite. There’s a lot of content, right?
>>And there’s like almost
3000 people.>>Yeah. I mean, look around.
>>Look at them.>>Hi everybody.
>>Hey. Hey.
>>Good, good. There’s two waves there.
That will do. Why not? And so the atmosphere
here is buzzing. As always, at Unity, we
have a “Made with Unity”
showcase to showcase what everybody’s got.
>>Yes. So what would a Unite be
without a Made with Unity showcase? So Made with Unity
showcase brings together all
the projects and games from our creators, which is
always extremely exciting. So we’ve got, not just games,
but we’ve got film and animation. We’ve got architecture,
engineering, construction,
travel, automotive. Whatever industry you are in,
we have something for you. If you’re not here –
>>If you’re there, or
home or wherever you are. And what Made with Unity
gems have you been playing?>>Well, me? Well, I’ve
been playing — I signed up
to Apple Arcade last week.>>Yes. Congratulations to all
creators who released last week.>>Yes. So what
you’re seeing right now is Sayonara Wild Hearts by Simogo.
I love this game, Liz. You’ve got to play. I’m
putting a lot of pressure
on Liz to play the game.>>The visuals are very beautiful.
It looks like my jam.>>Yeah, it’s a stunning
rhythm-based game. It’s like
a pop album video game. You are a lady who’s
been transformed into
a pop superheroine, and you’re fighting,
and you’re riding a motorbike.>>Motorbiking.
>>Yeah, it’s awesome,
and I love it.>>That’s great.
>>And then I was also playing
Jenny LeClue by Mografi.>>I have heard this.
>>So this is stunning. Like it’s been in development
for a long time. I’ve been
following it really closely. It’s an adventure game.
It’s really, really funny. And it kind of reminded me of,
you know, the classic days of
LucasArts adventure games.>>Like Nancy Drew.
>>Yeah. And it’s about
a little Nancy Drew girl. She’s solving a mystery in her
town, and it looks fantastic.
Highly recommend that.>>Yeah. That looks great.
>>And then the other one
I was playing was Punch Planet.>>I’ve not heard of this. Oh!
>>It’s the beat ’em up
I was telling you about.>>Yes. You were telling
me about this, because I
love a beat ’em up. Yeah. And I love all the
pop culture references –>>Like Ripley.
>>Yeah. There’s a lady who looks
a bit like Ripley there. And. Yeah. You know, I’m
sure there’s no copyright
problems with that.>>That looks great.
>>It’s an absolutely
awesome game. And of course, coming up
soon as well is Manifold
Garden from William Chyr. This game’s been in the
works for a long time, and
it looks absolutely unique. I absolutely adore this
game, and I can’t wait
to find out what it is.>>So what about the
showcase? Is there anything
you’re excited about here?>>Yeah, of course. So
in the showcase, we have
lots of games as well. I’ve been following
this one for a while. He’s here tonight at the show,
Phi Dinh’s game, Recompile. So this is a 3-D Metroidvania
action game set inside a computer. So it’s like Tron
plus Metroid Prime plus, I
don’t know, Portal — or whatever else. It is just
stunning. I mean, you can
see for yourself. Right?>>Yeah, it looks great.
>>And what have you been
playing that’s at the show?>>Well, I’m very excited
to see Seed. So Seed is by
Berlin-based Klang Games.
drew me to it was its
minimalistic art style. I absolutely love it. And
it’s actually quite deceiving
because the art style does not portray everything
that is in this game. It’s a sandbox MMO in
which players can create
their own ecosystems. So you have a seedling, and you
can cultivate that seedling. A seedling has wants and
needs like human beings. And if you cultivate and
listen to your seedlings, you
can create your own colony and have relationships
develop there. And those cultivations
are also going on
when you’re not playing. So it’s just one of those
games that I’m going to sink
hours into. I just think I am.>>Awesome.
>>And as I said, it’s not just
games in the Made with Unity showcase.
>>Yeah. Of course.
>>We have media and entertainment too. So the scene that you can see
now is A Kite’s Tale by Walt
Disney Animation Studios. I’ve not heard of them.
Who are they?
>>[LAUGHING] So this is their second short in virtual
reality animation. So this
is all hand-drawn animation brought into the medium of VR,
which is very, very exciting
and just looks beautiful. And it’s also very
Disney in its story. It’s about two kites, as
you can see, that are very
different that just learn to live together and
coexist beautifully.>>Awesome.
>>A bit like us, I guess.>>Maybe not. [LAUGHING] So
you can learn more about these
games, other projects and other creators, their
creative process and all that
good stuff on the Made with Unity website, which is
made with dot Unity dot com.>>Marvelous.>>And obviously we wanted
to give a big shout out to
the rest of the Unity indie community that’s not
been featured just now. So we put together this indie
celebration reel to take
a look at some of those great games. Let’s take
a look at it right now. [MUSIC PLAYING] So, tons and tons of great Unity
talent going out into those games. Thanks so much for doing
what you do. We absolutely
love all of them. Liz, do you know what I
love about those games?>>What’s that?
>>Graphics.>>[LAUGHING] Well, that’s a
good job that you enjoy graphics.>>Yeah. Why?
>>Because look who
we have with us.>>Oh, my goodness. It’s
Arisa Scott, our product
manager for graphics. What an incredible
coincidence. Hi, Arisa.
>>Hi, Will. How are you doing?>>I’m doing really good.
>>Thank you for joining us. We won’t keep you long,
because I know you’re going
to be spilling some graphical beans in the keynote itself, so
I’m going to dive right in there. Universal Render Pipeline,
previously referred to as
the Lightweight Render Pipeline. So they’re not
two separate entities?>>Nope. So in 2019.3, we have
renamed Lightweight to Universal. So Lightweight was originally
a more targeted solution
focused on mobile, being super-performant, and a little
bit more of a narrow focus, and its feature set
has grown and evolved. It’s still super-performant,
but it offers fantastic
visuals and more flexibility. So all Lightweight users,
when they get 2019.3, from their point of view, it’s a
name change with more features.>>Awesome. So what
differentiates that from our
built-in renderer, for example?>>More flexibility. It’s also
more performant, because it’s
a single forward-pass renderer, and we’ve got some
exciting stats in the keynote.
>>Yep.>>And there
are a lot of artists’ tools
like VFX Graph and Shader Graph. We also have a new
post-processing stack, which
is integrated directly into Universal, which gives
you more performance and
also beautiful visuals.>>Awesome. Speaking of
beautiful visuals, we’re taking
a look at this spaceship demo right now. And this is
HDRP, right? So what’s the
difference there with HDRP? What kind of add-ons on top
of what you would get with
Universal could you expect?>>Sure. So HDRP is more
targeted at compute-capable
platforms or higher-end platforms like Xbox, PS4,
and you have things like
volumetrics and much richer PBR that is even more realistic
than what we offer in Universal
and a lot of other great features like subsurface
scattering and all the other
acronyms you might want.>>Yeah. SSS, DXR, and the —
>>Yeah. Yeah. Very important. And here we’re seeing a
lot of great VFX Graph
work for this demo, which –>>Oh yeah. We have good news
about VFX Graph right now. It’s
now working with Shader Graph.>>Yeah. So you can author your
shaders inside Shader Graph. Use those directly in
VFX Graph. You get even
more gorgeous effects.>>Awesome. Awesome.
So, Liz, ray tracing.>>[LAUGHING] Huh?
I don’t know why you’re asking me. [LAUGHING] Arisa, tell us a
little bit more about ray tracing. I know that we’ve previously
released blog posts about
this, but can you tell us a little bit more about what
we can expect and how we
can get our hands on it?>>So you can download 2019.3
and use HDRP. Our ray tracing solution is
built directly on top of HDRP.>>So if you have that
package, you would just
upgrade and it’ll be there.>>Yep. It’ll be there.>>Oh, that’s awesome.
>>And if you have the hardware,
then you’ll be able to kind of scale upwards even further,
and you author once and have it work with both
HDRP and with ray tracing. And the nice thing about our
solution for ray tracing is
it’s the final frame entirely in editor, so
you’re seeing ray tracing
within the editor as well. And so you can kind of
iterate much faster.>>Very cool. Well, it’s nearly
keynote time, so I think you
should probably head out.>>Go prep for that. Yeah.
>>So good luck. We’ll see
more of Arisa in the show. Thank you so much.
>>Thank you.>>See you soon.
>>See ya’.
>>Bye.>>Cool. Well, Liz, do you know
what I loved about that segment?>>[LAUGHING] What did you love?>>I loved having a guest.
>>You know, I did as well.>>I think we have
another guest.>>And, you know, from one
hot topic – graphics – to
another, one of the hottest topics in gaming at the moment
is cloud-based gaming, and to
talk to us a little bit more about that, we have
Samuel Peterson from Stadia. Thank you so much
for joining us.>>Thanks so much for
having me. Really excited
to be here at Unite.>>Yes!
>>So the Google Stadia service
launches in November for players, but obviously your audience
right now and back here
is Unity developers. So what advice do you have for
Unity developers and how do they get onto the Stadia platform?
>>Sure. So the good news is we’ve
been working with Unity for
over a year now, so we’re almost there. And back at
GDC this year, we launched
the Stadia Partnerships Program. If you haven’t
heard about it, it’s a
program that we’ve generated to engage with developers,
regardless of their size. So support opportunities range
from financial to production
support to marketing, and we really tailor those to
the individual opportunity
and the individual developer.>>So you’re kind
of solving some of the kind
of like publishing aspects of getting the game out as well.>>Absolutely. It’s really
about helping them bring their
games to market the best way possible.
>>Sweet.>>That sounds like an
incredible opportunity. What advice would you give
to any of the developers who
are watching that want to take advantage of this
partnership?>>Absolutely. So first and foremost, it won’t
surprise you to learn that
we’re looking for games that people want to play. Right? So, key
amongst that is that they’re playable. So we’re looking for games
that are in development. And more important than that
is we’re really interested
in developers who want to take full advantage of
the power of Stadia. So either the back end or some of
our YouTube integration features. You might have 300 enemies
in a battle onscreen in a
single instance, or you could use crowd boost, crowd
play or state share, which
are some of our YouTube integrations; you can read
all about those on our blog.>>I was going to say I
don’t know what those are. If you want to find out,
go to Stadia dot Dev.>>Yeah. Stadia
dot Dev is where you go
to learn about those. Right?>>So what is the actual
application process like? How do people — what’s
it going to be like if
they want to apply, if they’ve got a vertical
slice or something they
want to show you today?>>Sure. So the
initial application
process is really important. That’s where we’re getting to
learn about you as a developer. So you need to submit your Web
domain, your developer background. We want gameplay footage
or an early playable. We’re obviously looking for
developers with recent history
in certified storefronts. Experience in consoles
is very interesting. It’s not a prerequisite, but
it’s definitely a nice to have. And clearly we’d like
developers from Unity and
with expertise in Unity.>>Awesome. Awesome.
>>It’s incredible. So people can head to Stadia
dot Dev to find out more
about the partnership, but also about the Stadia
features, right?>>Absolutely. Head to Stadia dot
Dev to register your interest.>>Well, they have a great show,
Samuel. Thank you so much
for joining us.>>Thanks.
>>And good luck finding
all of this talent out here.>>Yeah,
I’m looking forward to it. Thanks very much.
>>Thank you so much.>>Cool. So we are going to pivot
back to Unity for a moment now.>>We are. So, Liz, I don’t
know about you, but I’ve
used Unity since like 2008 because I’m very, very old
[LAUGHING] and it’s always
kind of looked the same. So thankfully, in 2019.3, we’re
getting a bit of a restyle, thanks to this lady, Mary Luther,
our senior UX designer. Hi, Mary.>>Hey, Will. Nice suit.
>>Thank you so much
for joining us.>>Thanks.
>>So why are we doing this now?
It’s been a long time. It’s been — how long
did you tell me earlier?>>Well, it’s been about 14
years, and we’ve had the
same style in that editor pretty much the whole time.>>Yeah
>>And so Unity is growing a lot. And as you’re going to see at
Unite, we have more awesome
teams than ever, creating new features for Unity.>>True.
>>So it’s really important that
we have a system that is cohesive and consistent, so that
everything can work together.>>So how did you get
started with the redesign? What was the process?
>>Yeah. So the first thing we
did was just talk to users. And what we found when we
did this is kind of like we
could categorize them into three main areas. So the
first major area was just
that people told us that the editor felt really dated, like
you guys were just talking about. So we did user research to
find out more and really sit
down, listen and observe people working in Unity
to see how these things
affected the usability of the product. So what people told
us is that they felt that
the editor was actually less usable and had a lower quality
just because of the way it looked.>>It was like a perception
thing as well.
>>Yeah, totally. We call it aesthetic
usability in the UX world. And then, of course, we had
real kind of usability issues
related to people kind of losing the context of where
they were in the editor. So that’s kind of
the second main area.>>Like focus.
>>Yeah. Focus. They’re like,
“Am I in the hierarchy? Am I in the scene view? And
then I edit the inspector and
then I’ve edited the wrong thing and then I’m angry.” Those kind of issues. And
then the third one was just
lots of little inconsistencies like everywhere and
little usability issues. So like text, it was just too
small to read or too light to
see, lots of things just slightly out of alignment,
and just like a million other
little inconsistencies.>>Ok. Death by a thousand
cuts of awfulness.>>Yeah, for sure.>>But I
mean, we can see the redesign
onscreen right now, and we can hopefully show you a
little video clip of the
side-by-side between 19.2 and 19.3. So tell us a
little bit about the
differences we’ve got here. Obviously, there’s a bit of a flat
design compared to the old look.>>Yeah. So one thing that’s
really common right now, as
modern apps sort of take over the landscape of
tech, is everybody’s going flat.
really bad for usability. If you go too flat, it’s really
hard to tell the difference
between a text input and a dropdown, and it can
really kind of mess you up. So the challenge for us was
to add enough subtle depth
cues to support usability while really making
a nice modern style.>>Yeah, it feels like –and
I think like we got a lot of
feedback on the forum as well, that it was like a good
compromise between the flat
and adding those cues for people.>>Yeah. So one of
those things when we were kind
of thinking about this is we talked to users and we
did a lot of refinement and
iteration on the forums and also via surveys to kind of
get things at a really good
level of detail, because that’s what this kind
of an effort takes. And then the other thing
that we did was we added,
over the past couple releases, things like tab focus and hover. And I mean, all these things have
come in over the last couple. So they’re not new for 2019.3, but
they’ve been all part of this effort. So we’ve added them as
soon as we technically can. And that’s been really helpful
so that when you actually are
modifying an object and editing in the inspector,
you’re less likely to make a
mistake, which has made me feel better. Working in
Unity, I don’t lose my place
as often as I used to.>>Exactly.
>>That sounds amazing.>>And so what’s actually coming
next? You know, we’ve seen
19.3 and what that looks like. What’s coming up in the
future of the editor as
far as you’re concerned?>>Yeah. So, 2019.3 is actually
just the first stage of kind of
like a bigger effort to continue modernizing the editor. And so the next things we
really want to focus on are
usability, accessibility and then, of course, user
workflows to make kind of
Unity a better experience for as many types of users as possible.
And we’re just going to listen
to the users. So we’re going to let the
users lead us where they
lead us in these areas. And we’re going to just
continue to kind of reach
out and talk to them on the forums. And I do actually want
to say thanks to everybody
over the past year out there that kind of worked
with us on the forums, filled
out like the 3 million surveys on theming that we
put out, because with all that
feedback, we were able to make the majority
of those changes. That’s pretty much all I’ve
done for the past year, was
listen to feedback and make stuff better. Yeah.
>>It’s awesome.
Mary, awesome work. Thank you for joining us and
telling us about the editor.>>Thanks, Will. Thanks, Liz.
>>Enjoy.
>>Have a good Unite.>>Yeah. You too.
>>Thank you. You too.
>>Bye.>>Cool. So, of course, the editor
theme isn’t the only thing
that’s coming to make your life better in Unity. We have a
ton of quality of life
improvements coming to the editor very soon, and we’ve put
together a quick trailer to
show you some of the great things that are
coming up in 19.3 and 20.1. Let’s take a look at the
quality-of-life reel right now. [MUSIC PLAYING]>>Ok. Lots of amazing
stuff in that video.>>Lots to talk about,
but no time.
>>No time whatsoever. You’re about to be blown away
by some even more cool stuff
in the keynote right now. Let’s do this.
>>Ten, nine, eight, seven,
six, five, four, three, two, one.>>Enjoy the keynote.
>>Enjoy the keynote. [MUSIC PLAYING]>>Please welcome Kat Strafford. [MUSIC AND APPLAUSE]>>Welcome to Unite Copenhagen.
We are so excited that
you all are here. One of the things that makes
Unity’s community so unique
and special is that people come from everywhere to
come together, unite, share
ideas, share the latest technology,
collaborate and make new friends. So thank you all for joining us.
We have a really exciting
week ahead. So we want to extend a big
welcome not just to everyone
in the room here, but also to everyone on the livestream.
We hope you’re going
to enjoy the show with us. And we also would like to
welcome all the speakers, all
the demoers, partners, and the people who brought
this all together
and made it possible. So welcome. [APPLAUSE]>>So we have a week full of
150 sessions, tons and tons of
demos across the Expo floor, as well as Unite at Night that
I hope you’ll make it to. And of course, the one thing
you’ve asked a lot for,
which is more time and more experts at the expert bar. So before I move on, though,
I want to take a special
minute or second to recognize our beta testers.
Where are you?
Are you right here? Yes. Thank you. Thanks for
making Unity better every day and helping everyone enjoy it. We
really appreciate all that you do. [APPLAUSE]>>You guys are kind of grinning
sheepishly, but we really
appreciate it. Thank you. All right. So tonight, John is
going to share how the power
of real-time 3-D is transforming the way we all create. Lucas and team are going
to share more about our
data-oriented tech stack. Timoni and team are going
to share around our AR
toolset, including MARS, AR foundation and some new features
that we haven’t shared yet. Danny and team, sorry,
just Danny, actually. He’s going to share
more around simulation. And Arisa and team are going
to demonstrate our latest on
graphics, including 2-D features, our new scriptable
render pipelines, as well as
a few other surprises. And we’re going to hear from some
special guests. So let’s dive in. Who first shared their
stories in caves? Connected
art, nature, science? Led us toward new horizons?
Who thought to capture motion? Spark an engine to life? Turn
television into a playground? Every day, you create the playful,
the functional and the unexpected. You break down walls and
take us to astonishing
places and tell stories, beat borders and create
beauty for tomorrow. Because at Unity, we believe
the world is a better place
with more creators in it, where everyone has a chance
to shape the world.
Please welcome John Riccitiello. [MUSIC AND APPLAUSE]>>Good evening and welcome
to Unite in Copenhagen. On behalf of everybody at
Unity, we welcome you all
here and we’re thrilled to be here. It was only a couple
kilometers away from here
where it all started, when 15 years ago, the very first
lines of Unity were written. Now, together with all of you,
an amazing group of developer
partners and customers, we’ve achieved more than
we ever thought possible. We’ve built a better game engine,
and you’ve built amazing games. Today, games built on Unity
account for approximately
50% of all games on every platform in every
country in the world. In just the last 12 months,
games you’ve built were
downloaded 34 billion times. Think about that for a
minute. If you do the math,
that’s 53 downloads of a Unity-based game, the
stuff you build, every
second. 53 per second. And over this time, our industry
has entered its golden era. Today, it’s a $150 billion industry,
far surpassing film and music
to become the largest entertainment medium
on the planet. And we’ve gotten there through
great games like Inside by
Playdead, Sayonara Wild Hearts, or Nintendo’s
Mario Kart Tour from you. And collectively, we surpass
these other media through
innovation, taking advantage of the massive increase in
compute power and networks
to help you deliver groundbreaking content.
And it gets better from here. The constraints we
face are falling away. With magnitudes better
performance, processing,
streaming, the evolution of technology will power gaming
farther than ever before. And because we’re the enablers
of the most important or one
of the most important and significant technologies
of our time, real -time 3-D. Mastery of real-time 3-D
technology and tools means you
can be the agents of change for the gaming industry. But our shared technology
that we use to make games has
also helped us become the leading entertainment media
in the world and it will take
us to other industries as well. Real-time 3-D is already
proving to be invaluable
across industries, helping design, build, and sell
the most complex projects,
ranging from cars to airplanes to skyscrapers,
such as designing and
testing augmented reality experiences that
don’t exist yet. Like Volvo has done with
Varjo, or how a construction
company, Mortenson, created and designed medical operation
suites in VR before constructing
such critical venues in real life, or Bell
building virtual prototypes
of the FCX helicopter for design and test piloting.
You are the pioneers of
the real-time 3-D revolution. This is a revolution
transforming all content, all
creation across dozens of industries, and you started it.
That which is static will move. That which was not social
will be connected. That which
was 2-D and flat will be 3-D, real-time, and immersive.
It’s game makers like you and
pioneers in other industries, many of them here tonight
at Unite, that will lead the
world into the next realm of creative possibility.
So get ready to shape the world. Now, before we welcome Lucas
to the stage, please join me
in taking a look at some of the incredible things
you’ve made with Unity. [APPLAUSE AND MUSIC PLAYING] [APPLAUSE]>>Please welcome Lucas Meyer.
>>Hi everyone. [APPLAUSE AND MUSIC PLAYING] It’s good to see you all here.
I get to talk about our
data-oriented technology stack. If you’ve heard about
it, you’ve probably heard us talk about performance,
performance, performance,
performance, and Performance by Default in particular. The reason we talk about that
so much is because we focus
on it, because it’s the one part of the game engine that
you cannot add back later on. However, what about convenience?
Let’s talk about some
program of convenience. If you’re a programmer that
wants to use DOTS today, you
kind of start like this. You make a component, but to
use it in the editor, you
have to make these authoring components. To make it
do something, you need
to make a system. To make it do that fast,
you need to make a job. And then to run the job,
you need to schedule that job. So you could say, “Well,
I don’t know, Lucas. That sounds like a lot of
work to move something
around,” and you’d be kind of right. Because there’s only
a few lines in this page of
code that actually have some programmer intent. So
the next release of DOTS
tries to help with this. Start with this offering
component at the top. Instead of us asking you to
write it, you can ask the
compiler to write it, by just adding this
GenerateAuthoringComponent attribute. Here in the bottom, with
the struct, let’s remove
that for a second and let’s replace it with the
Entities.ForEach statement.
But if you compare this to
what we started with, that’s already a substantial
improvement in terms of the
amount of boilerplate that you need to deal with. Now, let’s
take a look at this new code. The programmers in the
room might have had a
little heart attack, because there is a lambda expression
in here and those cause GC
allocations, and those cause GC spikes, and those cause
complaining customers. In the next DOTS release, we’re making the lambda expression
in the Entities.ForEach special. Instead of running it as
normal C sharp, we convert it
into that same job struct that we were asking you
to write manually before. So when you use this construct,
you’re using the Unity job
system to go wide over all your cores.
However, it also uses Burst. If you look at the Burst
inspector for this code, you
can see that the inner loop uses these xmm1, xmm2 registers. These are SIMD instructions
that operate on multiple
elements at the same time. So not only are you going
wide over all your cores. On each of those cores,
you’re going wide using
all the SIMD lanes. And that’s a great place for
us to be in, when you can
have the most optimal code you could write, but where you
can write it in a relatively
humane way, that doesn’t come at the expense
of that performance.
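To make that concrete, here is a minimal sketch of the pattern being described, assuming a preview-era build of the Entities package; the MoveSpeed component and MoveSystem names are illustrative rather than the demo’s actual code.

    using Unity.Entities;
    using Unity.Transforms;

    // The attribute asks the compiler to generate the authoring
    // MonoBehaviour for us, as described above.
    [GenerateAuthoringComponent]
    public struct MoveSpeed : IComponentData
    {
        public float Value;
    }

    public class MoveSystem : SystemBase
    {
        protected override void OnUpdate()
        {
            float dt = Time.DeltaTime;
            // The lambda is converted into the same job struct you would
            // otherwise write by hand, so it runs through the job system
            // (wide over all cores) and is compiled with Burst (wide over
            // the SIMD lanes on each core).
            Entities.ForEach((ref Translation translation, in MoveSpeed speed) =>
            {
                translation.Value.y += speed.Value * dt;
            }).ScheduleParallel();
        }
    }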
Another great convenient thing for programmers in DOTS is
that when we ship DOTS, we ship it to you as C
sharp source packages. If you’ve played around with
DOTS, that means that the
complete source code of DOTS and all the features that
are built on top of it are
on your laptop, in your backpack,
under your chair right now. If you make a DOTS game,
and you grab a debugger,
and you start single-stepping through your game code,
you’ll step through your game
code into our DOTS code into the physics code into the netcode
into the animation engine. You can see everything.
You can change everything. You can make fun of my incredibly
verbose local variable names. Many people do that. I won’t
give up, though, because in
DOTS, there’s no more black boxes. You can see
everything. All right, enough
with these programmers. Let’s get Joachim
and Martin on stage. [APPLAUSE] Thank you, Lucas. Now Lucas talked about how to
write code for DOTS, but you
also need a great tool to create your content. And for
DOTS, we’re going to use the
exact same Unity editor you know and love already.
So, Martin, let’s take a
look at some content.>>Yeah. So let’s jump into
our prototyping level here. It’s built exactly like
you would expect in Unity. It’s made out of prefabs,
nested prefabs, game
objects, and components. If we look at the sphere,
we’ll see it as you know it. It has a mesh filter,
renderer, sphere collider. The only difference is that
we have a runtime conversion turning
this game object data into an entity representation. And in DOTS, we make a
distinction between the data
that the game code operates on and the data that content
creators use, and bridging
those two worlds is the conversion workflow. It
converts from an authoring-
friendly game object representation to runtime-
optimized entities, and
this optimized DOTS data is streamable. It’s compact,
and it’s optimal for performance. And in the preview
inspector, you can see that
conversion happening live. So when Martin changes the
physics body to be dynamic,
you can see the inspector update here, and you can see
exactly what these runtime
game code components that are actually being used
at runtime are being
generated from your very simple,
easy-to-understand
authoring representation up there.
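As a rough illustration of that conversion step, a hand-written authoring component could look like the sketch below, assuming the IConvertGameObjectToEntity API of the preview packages; the RotationSpeed names are illustrative.

    using Unity.Entities;
    using UnityEngine;

    // Runtime component: compact, optimized data the game code uses.
    public struct RotationSpeed : IComponentData
    {
        public float RadiansPerSecond;
    }

    // Authoring component: friendly units in the inspector, turned into
    // the runtime representation by the conversion workflow.
    public class RotationSpeedAuthoring : MonoBehaviour, IConvertGameObjectToEntity
    {
        public float DegreesPerSecond = 90f;

        public void Convert(Entity entity, EntityManager dstManager,
                            GameObjectConversionSystem conversionSystem)
        {
            dstManager.AddComponentData(entity, new RotationSpeed
            {
                RadiansPerSecond = Mathf.Deg2Rad * DegreesPerSecond
            });
        }
    }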
Now, to build a game, you need at least a couple of core features. So we built a third person
shooter, and it uses all the
brand new DOTS features: animation, physics, rendering,
and netcode together. And we made a very small,
simple sample project with
the purpose of being easy to understand and digest.
So, Lucas, let’s have a play.>>All right, let’s go.
All right. Where are you? Everyone should
know that Joe made me promise
to lose so you wouldn’t look silly.
[LAUGHING]>>Good takeaway.
[LAUGHING]>>So what you see here, this
is running on our FPS netcode. It is built on top of DOTS
and it makes it easy to
create any network game using an FPS netcode architecture. It has
everything that you would expect. Client side prediction,
lag compensation, and
interpolation are all built in. We’re also using the new
Unity animation package. It is high-performance,
very flexible, and there
are no black boxes. Every character in this
game uses runtime IK. As you can see here, the
feet are perfectly sticking
to the chrome ball. We use Unity Physics for the
character controller, sliding
along the walls, jumping and going over the floor. We’re
using ray casts from Unity
physics for the shooting, and for rendering, we’re
using the scriptable render
pipelines with a DOTS-based rendering component for
optimal performance. So, Martin, the only thing
I’m just not too sure about in this is the color
scheme you picked.
>>Mm hmm. The baby blue might be an
issue and we can try and
find something a bit cooler. That’s why the using
grass like more green. Try that. I save , and boom,
there you have it on your device.>>I can also try something else.
maybe. That didn’t work out
that beautifully,
but like a midnight purple. Do that.
>>All right.>>Push it.
>>All right. So that’s a big deal
because Martin has — [APPLAUSE] Oh, the color wasn’t that nice.>>It’s a big deal because
Martin has this project open
in the editor, but me and Joachim, we are running built
players that are running the
game, and he’s making a change in the editor on the
left, but it’s affecting
the players while they’re running the game that
don’t have to reboot.>>Yeah. So since I’m generating
runtime-ready data in the
editor, there’s no reason that I can’t push
it to any device.>>So what kind of changes can
you make?>>Anything, really. If you look at the chrome
[INDISCERNIBLE], then I
can make translations. I can duplicate, make another
sphere, and go back to the
static component that I touched, add the physics body. Make it dynamic. They’ll
fall down, collide, maybe
topple over if we were lucky. There we go. So, yeah. And I can
even make changes to shaders. Let’s go into my shader
graph where, like any good
TV chef, I prepared a little effect that we can grab on.
Save. Boom. So that’s what I really love
about this feature that — [APPLAUSE] It’s what I really love.
I was running on the tablet. Joachim was running
on a desktop. We could also attach a console
and a phone, and they would
all attach to the same editing session. When Martin
makes a change, DOTS breaks
out the entity data for each of those
devices individually. So that means that he can
see the changes on all of
those devices without the devices having to restart or
the games having to reboot.>>Yeah. And it doesn’t
matter the device. It could be a console or an
iPhone or anything really.>>So what does that do to
your workflow, as an artist?>>Yeah, as an [INDISCERNIBLE]
artist, this means I can build
my entire world on the device with direct feedback
on actual device performance. It’ll be immediately visible
to me if I have performance
headroom in one space of the level, and I can add more
detail there, or if I need to
optimize in another part of the level, and instead
of looking through complex
profiling tools, I can simply hide or delete elements
and see the performance
impact on the scene.>>So I believe this will
have an incredible impact
on your productivity. Fortunately, all of these
features – Unity Animation,
FPS netcode, Live Link, and the conversion workflow – will ship
as preview packages
alongside Unity 2019.3. [APPLAUSE] And we’re also going to make
the full project folder that
we just showed you with all source code and assets
available very soon. [APPLAUSE]>>Thank you.
>>Thank you. [APPLAUSE]>>Please welcome Brandy House. [MUSIC PLAYING AND APPLAUSE]>>From Unkilled to Shadowgun
Legends, Madfinger has
consistently pushed the boundaries of what’s possible
with mobile multiplayer gameplay,
one of the hardest problems developers face today. We know that for Unity to
build the best multiplayer
tools, we need to battle-test our tech in some of the most
challenging scenarios out there. Enter Madfinger’s Shadowgun
War Games, a tournament-style
mobile title designed for e-sports coming
out later this year.
best shooters, why do you say
you’re the best shooters?>>Because it’s true.
Shadowgun Legends is a social
shooter, so you can get into the hop. You can make
friends. You can play in modes. You can set a match, and you
can go fight against aliens. The other side, War Games
, is very quick. There is almost no social stuff.
Of course, you will support
friends. You will support guilds. But the game is designed to be
super-great to jump in, shoot. Making games is challenging.
This is why I’m doing it.
It is what I love. From the very beginning, from the
first idea for the final product, everything is challenging.>>The challenges start with
building, but they go on forever
once it’s a living game.>>What Unity
provides us is a dedicated
server, which allows you to control what a player can
and cannot do. You don’t
need to rely on clients. And, of course, the
biggest challenge is
dealing with cheaters.>>You know, they are ruining
the game to others. So we have to fight them
using dedicated servers
is one big advantage. You can run full simulation
of the game on the servers,
which means you have low latency. You have full control,
and when they get in the game –>>they just won’t be able
to do that nasty stuff.>>I must say that using Unity
Matchmaker is simpler,
because it’s pluggable and it provides all the infrastructure
and allows me to focus only
on the matchmaking logic itself.>>We’ve got a very close
relationship with Unity. And because Unity cares, we
can make games, the best games ever. [APPLAUSE]>>To create games of this
caliber, you need to ensure your games don’t lag,
prevent cheaters from ruining your game, quickly bring
players together into fair
matches, and open communication channels that feel
safe for everyone. Unity worked alongside
Madfinger to bring all
of this into Wargames. Madfinger started with our
new Unity Transport Package,
which enables rapid transfer of data to and from the games
server, keeping players in
sync with as low latency as possible. Next, they integrated
our dedicated server hosting
and matchmaking to seamlessly connect players
to servers across the globe. With our Matchmaker, Madfinger
defined which players they
wanted to group together with custom C sharp
match functions, and
we handled the rest.
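The exact contract of those match functions isn’t shown in the keynote, so the following is a purely hypothetical sketch in the same spirit: a custom C sharp rule that groups waiting tickets into four-player matches within a fixed skill band. Both the Ticket type and the thresholds are invented for illustration.

    using System.Collections.Generic;
    using System.Linq;

    // Hypothetical ticket type standing in for whatever the
    // Matchmaker hands a custom match function.
    public class Ticket
    {
        public string PlayerId;
        public int Skill;
    }

    public static class SkillMatcher
    {
        const int TeamSize = 4;
        const int MaxSkillGap = 200; // assumed tunable threshold

        // Sort waiting tickets by skill, cut them into groups of four,
        // and only emit groups whose skill spread stays inside the band.
        public static List<List<Ticket>> MakeMatches(IEnumerable<Ticket> pool)
        {
            var matches = new List<List<Ticket>>();
            var sorted = pool.OrderBy(t => t.Skill).ToList();
            for (int i = 0; i + TeamSize <= sorted.Count; i += TeamSize)
            {
                var group = sorted.GetRange(i, TeamSize);
                if (group[TeamSize - 1].Skill - group[0].Skill <= MaxSkillGap)
                    matches.Add(group);
            }
            return matches;
        }
    }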
And finally, they plugged into our best-in-class voice
communications to provide their players with an
immersive social experience. As Madfinger has shown, our
Unity Transport package,
available in preview, dedicated server hosting and matchmaking,
in beta next month, and
voice communications provide the foundation to
build a world-class
multiplayer experience. Thank you. [APPLAUSE] [MUSIC PLAYING] Make sure you go to the moon
and [INDISCERNIBLE]
and do the other things. [MUSIC PLAYING] [MUSIC PLAYING]>>Please welcome Timoni West
and Brittany Edmond. [APPLAUSE AND MUSIC PLAYING]>>If you want to make
powerful, deeply interactive, augmented reality experiences
that intelligently interact with the real world, you are in
the right place, because
you need the right tools to build intelligent AR experiences
and we’re building them for you. We’ve built features that not
only unlock the next generation
of AR experiences, but that helped make creating
these experiences much easier.>>That’s right. MARS is our
suite of design and simulation
tools that is built to make the workflow
that much faster. It allows you to prototype,
test, and ship much more
quickly than ever.>>It sits on top of our AR
Foundation framework and it
allows you to create AR experiences that work the way
we really want AR to work –
context aware, flexible, customizable, work in any
location, and with any kind of
data without requiring tons of code.>>Okay, so let’s
talk about some of the ways
that MARS makes your life easier. OK, so you can be
constantly testing without
having to make a device build. And this is enabled through
the MARS always-on query
system, which is very cool, our simulation view, and
our environment templates.
data-oriented applications
using a “what you see is what you get” interface. You can
literally drag and drop
objects into the simulation view, just like in real life. And we have visual gizmos to
help you actually lay out
objects in the space so you can just lay out your scene.
>>And you can use any kind
of hard data and there is a lot. So object tracking, gesture
detection, body tracking,
face tracking, all these different types of data, and
we are future-proofing as
well to make sure that we can take advantage of all the
new cool types of data that
are coming down the pipe.>>And to make the development
process even faster and more
accurate, MARS will have companion apps for AR devices. The first iteration will be
for mobile phones, and we’re
giving you a quick sneak peek right here. You’ll be
able to sync projects in the
Unity cloud and then lay out assets as easily as
placing a 3-D sticker. You’ll be able to use your
phone to create conditions
and record video and world data and export it straight back
into the editor without having
to make a build of your project.>>Can I tell you,
that video part was really hard
to get right? Okay. And the next iteration, of
course, is going to be for wearable AR devices, the
Magic Leap and the HoloLens. And it’s the same idea in mind.
You’re sitting in your space.
You’re taking in world data. You have it on the headset,
and you can export it directly
back into the editor to author against, just same concept.
OK. So there you go. We’ve got MARS and that
is our work environment
for AR development. But of course, we want you
to be able to take your AR
experiences and get them into as many hands as possible. And
we have you covered there, too. Dan, welcome to the stage.>>Thanks, Timoni. [APPLAUSE]>>AR Foundation allows you to
build your experience once and
ship it on all major platforms.>>In today’s AR
ecosystem, each platform has
similar core features, but they each have different workflows
and nuances to enable them. We do the work to unify these
so that you can simply enable
a feature once in Unity using AR Foundation, and that
feature seamlessly works
across all of the platforms.
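As a small example of that “enable once” idea, plane detection in AR Foundation is driven by a single manager component, and the same script runs unchanged on every platform that supports the feature; this sketch assumes an ARPlaneManager is already set up in the scene.

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class PlaneDetectionToggle : MonoBehaviour
    {
        [SerializeField] ARPlaneManager planeManager;

        // Turn plane detection on or off; AR Foundation maps this to the
        // underlying ARKit, ARCore, or other implementation for us.
        public void SetPlaneDetection(bool on)
        {
            planeManager.enabled = on;
            foreach (var plane in planeManager.trackables)
                plane.gameObject.SetActive(on); // also hide planes already found
        }
    }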
>>And if a feature is enabled on one platform but isn’t
available on another, we put in the hooks. So when the
feature becomes available,
it’s easy to put it into an existing app or experience
simply by updating your
packages, not rebuilding your app from scratch.
>>This framework also gives you
the ability to take advantage of all of the awesome features and
workflows we’re building
for Unity, so that you can create truly robust AR apps
that are ready to ship to
internal stakeholders or on any app stores. And we’re not
just talking about mobile anymore.>>That’s because AR
Foundation now extends
to wearable AR devices. For the first time ever,
you can build an experience
once and ship it on ARKit, ARCore, HoloLens,
and Magic Leap devices.>>And when you’re building
an experience to any or all
of these devices, we know that you’re going to want
your user to actually be
able to interact with that content in a variety of ways,
like scaling, rotating,
repositioning it, just to name a few. We also know that
coding these object interactions
from scratch for each device and every new platform
is really time-consuming.>>So we’re giving you the
ability to add interactivity
to your AR and VR applications simply by adding
components to your objects. This brand new XR Interaction
Toolkit handles everything
from touches and gestures on mobile devices all the way
up to tracked controllers and
even tracked hands. This toolkit is coming
out in preview in 2019.3.
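As a hedged sketch of “adding interactivity by adding components,” assuming the XR Interaction Toolkit preview package, making an object grabbable can be as small as this; the helper is illustrative.

    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit;

    public static class Interactivity
    {
        // Make any object hoverable, selectable, and grabbable by the
        // XR controller interactors in the scene.
        public static void MakeGrabbable(GameObject target)
        {
            // Interactors need a collider to hit the object.
            if (target.GetComponent<Collider>() == null)
                target.AddComponent<BoxCollider>();

            // XRGrabInteractable wires up the grab behavior; it requires
            // a Rigidbody, which Unity adds automatically if missing.
            target.AddComponent<XRGrabInteractable>();
        }
    }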
And now we want to leave you with one final note. Having to completely rebuild
your app from scratch in
order to add AR functionality to it is time-consuming and
painful, and we know that is
actually a non-starter for some of you. So we’re
officially giving you the ability
to embed the full power of Unity, inclusive of our AR
offerings, directly into
any native app that already exists today. And with that
capability, combined with
MARS, AR Foundation and the XR Interaction Toolkit, we’re
giving you the tools to build
powerful augmented reality experiences that intelligently
interact with the real world and reach the widest
possible audience. Thank you. [APPLAUSE] [MUSIC PLAYING]
>>Please welcome Adam Chernick. [MUSIC PLAYING] [APPLAUSE AND MUSIC PLAYING]>>Hi, everyone. At SHoP Architects,
we’re focused on innovation. We’ve done projects such
as the Uber headquarters
in San Francisco and U.S. embassies around the world. One of our most recent projects
is a skyscraper that’s
currently under construction in New York City, set to become
the tallest structure in Brooklyn. It’s also the first project
that’s using Unity Reflect,
which is a new product from Unity. But before I jump
into that, let me give you
a little bit of background. See, architects and engineers have
been using 3-D for a long time. The technology is called
building information
modeling or simply BIM. See, now with Unity
Reflect, we can actually do
a lot more with this BIM information,
with these BIM models. Until recently, the time it
took to get these BIM models into real-time engines
was far from real-time. It actually took teams days
or weeks in order to optimize
and convert these models into real-time ready assets. And this really didn’t align
with the speed of iteration
that we needed in our office. And of course, the
consequence was that not a
lot of our projects made it into real-time engines. But now with Unity Reflect,
it’s easy for any architect
or designer to take their BIM models and data into
Unity, and this is a big deal. This is a game-changer for our
industry. So let me show you
a little bit about it. So the first thing to note is
that you get a couple things
out-of-the-box with Unity Reflect. You get a plug-in
for our BIM software,
Autodesk Revit, as well as a Reflect viewer application. So here we are in Autodesk
Revit, and we have our entire model for that
tower project in Brooklyn. And this is actually the
model that we use to create construction documents to
get this building built. And I have another view
here that just isolates the
exterior window panels of this model. And this is
actually what we’re going to focus on in this demo
is these exterior windows. So, first off, what I’m
going to do is I’m going to
pull in all this geometry into the out-of-the-box viewer. And this is the
out-of-the-box viewer we have
over here on the right. All I have to do is open up
these projects and pull it in. And you’ll notice how fast
it is to get this model into
this out-of-the-box Reflect viewer, and you can navigate
and isolate specific components. And another very, very
important thing to note is
the live link function. So if I click this button
right here, we’re actually
creating a live link between our BIM software and
the Reflect viewer. So if we come down to
this awning here,
and that looks a little bit small, so let’s make it a
little bit bigger. We’re going to be able to see
this updated live in the viewer. And this is huge. If we’re
making changes with clients
or general contractors, we’re going to be able to get
these updates in AR or
in VR in real-time. This is very important and
this is what you get out
of the box with Reflect. But since it’s built on Unity,
we’re able to customize and
build on top of this technology. And so that’s
actually what we’re most
excited about here at SHoP Architects. So I’m going to
show you the first thing
that we built with Unity Reflect, which is a panel
status tracking application
for the exterior windows of this tower. So if we jump
back into Revit really quick
here, I want to show you one thing. If I toggle through
and grab one of these
panels, we’ll see all of the associated metadata over
here in the left side. And we’ll see this one specific
piece of metadata here, panel
status, and this piece of data is what we’re
going to be looking for
over in the Unity editor. So if we jump into Unity
now, I’m going to show you
how easy it is to set up a Reflect project.
I have the Reflect package here. I’m just going to drag and drop in
the Reflect prefab and hit play. And what you’re going to
notice is that right out of
the gates, what we get is the exact same functionality
as the Reflect viewer
that you had just seen. But now that we’re —
since we’re in Unity, we can actually look and dive
in a little bit and customize. So I’m going to jump into
this root game object and I’m
going to show you that this is actually where this
model instantiated itself. And we’re going to see over
on the left, when I select
a panel here, all of that metadata that we had in our
BIM software is now accessible
in Unity, and this is huge, including this
panel status parameter, which
is what our custom script is going to be looking for. So if I kill this really
quick and I look at our main
camera, I have a custom script here that I’m just going to
turn on and I’m going to
give it a couple inputs. So I’m going to give it this
manager and I’m going to give
it this root game object, and I’m going to rerun it. So what we’re able to do here
is we’re actually able to
grab that piece of data, and then we’re going to change
the material of each of these panels based on that
panel status piece of data. So it’s going to come in
here, and then it’s going to
change the colors of these panels. And these
colors of these
panels are indicative of the status of that panel on the
construction site, whether
that panel is complete, whether that panel is in
transit on its way to the
construction site, or maybe there’s an issue
associated with that panel. There’s a crack and it
needs to be replaced.
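A hedged sketch of what such a custom script could look like; it assumes some way of reading the “Panel Status” BIM parameter from each instantiated object, and every name here is illustrative rather than SHoP’s actual code.

    using UnityEngine;

    public class PanelStatusColorizer : MonoBehaviour
    {
        public Transform reflectRoot; // the root object Reflect instantiates into
        public Material completeMat;  // panel installed
        public Material transitMat;   // panel in transit to the site
        public Material issueMat;     // panel cracked or otherwise flagged

        void Start()
        {
            // Walk the instantiated hierarchy, read each panel's
            // "Panel Status" parameter, and tint the panel accordingly.
            foreach (var renderer in reflectRoot.GetComponentsInChildren<Renderer>())
            {
                switch (LookUpPanelStatus(renderer.gameObject))
                {
                    case "Complete":   renderer.sharedMaterial = completeMat; break;
                    case "In Transit": renderer.sharedMaterial = transitMat;  break;
                    case "Issue":      renderer.sharedMaterial = issueMat;    break;
                }
            }
        }

        // Stand-in for however Reflect exposes the BIM metadata; the
        // real lookup API is not shown in this demo.
        string LookUpPanelStatus(GameObject panel)
        {
            return "Complete";
        }
    }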
And all we have to do now is switch the build target
out to iOS and build out a mobile AR app. And I actually
have the finished app here on
my iPad, and I’ll show you guys the finished AR app.
So this is an AR Foundation app,
so it’s multi-platform. So all I have to do here is scan
the floor and drop our model. And you’re going to see here
that we have the tower with
the — with all of these panels here. And since now
it’s interactive in an AR
app, I can select a panel. And in the top left, you’re
going to see the panel status
of that is in progress. We’re getting that BIM data
information back into an
augmented reality application. And this is really,
really important. But this is
still just the first step. This is still really
just the first step. Reviewing designs in the
office is great, but what
we’re most excited about is to get this technology onto the
construction site, so that
we can help transform our industry by better connecting
design and construction. See, we actually took
one of our first Reflect
applications on site, to the construction site of this
tower in Brooklyn, and began using it. You’ll see that our construction
team is overlaying these BIM
models and data in real time on top of this
construction site. And this is very, very valuable. This is going to help us
catch design flaws fast,
solve problems faster, and inevitably decrease the
time it takes to build. Ultimately, what this means
is designing and building
in more efficient and sustainable ways. Thank you. [APPLAUSE] [MUSIC]>>Please welcome Danny Lange. [MUSIC]>>Thank you. Thank you. Good evening. As
creators, you’re faced with
many questions around the quality, safety,
and stability of your products. In an ideal world, you would
want to explore all questions
and all possibilities, but unfortunately, you’re
bound by time, inadequate
information, and limited resources of the physical
world we live in. But fortunately, we have Unity: a virtual world that is spatial, where you have physics and
visual fidelity and all the
interactions you need to simulate any experience. However, the scope of your
simulations is currently limited
to the hardware at hand. That is until today. Introducing Unity Simulation. Our new cloud-based
simulation product. Backed by Google Cloud, Unity
Simulation enables you to
automatically run multiple instances of your Unity
projects at an incredible
scale and on a variety of hardware. You can now test
your products at any point
in the development process. You can train your systems
on millions or even billions
of scenarios and test cases. And you can validate your
concepts without any upfront
hardware investments. Unity Simulation is
already breaking barriers
across industries. Take, for example,
a self-driving car. It needs to drive at least
eleven billion miles before
it is considered road safe. With the traditional training
method, you’d maybe get to
this goal in a hundred years. And each time an
update is made to a piece of
hardware or software in the car systems, the whole thing
needs to be revalidated
across a suite of test cases. But with Unity Simulation,
you can drive sooner by
running millions of test scenarios in parallel in the cloud. For example, the LGSVL
Simulator is LG Electronics’
autonomous vehicle simulator. It takes advantage of
several advanced Unity
features, such as the High Definition Render
Pipeline, the job system,
and the Burst compiler. It does that to simulate
photorealistic digital twins
of real-world environments with accurate vehicle
and sensor models. Developers can now build and
test their autonomous vehicles
by running a diverse set of scenarios. Running these
simulations on the desktop
would take months, at least. But with the LGSVL Simulator
running on Unity Simulation,
you can train more quickly, safely and, likely,
save some money too. While simulation can help invent
the future, it’s also crucial to solving everyday
tasks that are important yet labor-intensive and tedious, such as fine-tuning
game balance before
soft-launching a game. We have been working with
iLLOGIKA, a 37-person game studio
based in Canada, on the upcoming title Road Racers. iLLOGIKA wanted to level up
the player experience and
engagement through game balancing. They had to
address some questions
that were tough to answer. Especially when you have tight
deadlines, you know? Are skilled players beating
unskilled players by a big margin? Are there combinations of
cards or power-ups that would
lead to big advantages for a player? In the first case of
a skilled player beating an
unskilled by a big margin, iLLOGIKA wanted to test many
different game settings to
find the one that would even out the playing field and
keep the game fun for all. Using Unity Simulation,
iLLOGIKA decided to do a grid
search and basically explore all combinations of possible
values for three parameters
in the game to find the one combination that would
optimize for the best results.
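The grid-search idea itself is simple; here is a sketch in the spirit of what’s described, with hypothetical balance parameters and a stubbed-out hook where each Unity Simulation run would be queued.

    using System;

    public static class BalanceGridSearch
    {
        public static void Run()
        {
            // Three hypothetical balance parameters, each with a few
            // candidate values: 4 x 4 x 4 = 64 combinations here; a
            // finer grid quickly grows into the thousands of runs.
            float[] accelerations = { 0.8f, 0.9f, 1.0f, 1.1f };
            float[] topSpeeds     = { 40f, 50f, 60f, 70f };
            float[] boostPowers   = { 1f, 1.5f, 2f, 2.5f };

            foreach (var a in accelerations)
                foreach (var s in topSpeeds)
                    foreach (var b in boostPowers)
                        QueueSimulationRun(a, s, b);
        }

        // Stand-in for submitting one run to Unity Simulation; the
        // real submission API is not shown in the keynote.
        static void QueueSimulationRun(float accel, float speed, float boost)
        {
            Console.WriteLine($"run: accel={accel} speed={speed} boost={boost}");
        }
    }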
iLLOGIKA ran over twenty-two thousand simulations in a
matter of hours instead of the weeks it would have taken
with human players doing this. With Unity Simulation,
iLLOGIKA was able to quickly
understand which settings would lead to an enhanced
player experience. Scale is no longer a barrier
with Unity Simulation. But we’re not finished. Doing
simulations at scale is
critical to achieving big goals. But what if you’re a
smaller game studio that
just needs a simple, fast way to optimize your game
for your individual players? Well, Futureplay is
here to tell you more. Thank you. [APPLAUSE]>>Please welcome Tatu Laine. [MUSIC]>>Futureplay Games
is a small studio of 34 people based in Helsinki. Our most notable games are
Idle Farming Empire and
Battlelands Royale, which together have over 65 million downloads. As a game designer, my job is
to keep players in the game. Idle Farming Empire is over
four years old, and we’re
constantly looking for ways to improve our metrics
and keep things fresh. I need to provide a
constant flow of new
content features and events. I have so many ideas I want
to test, but every test
requires an analyst, a programmer, an artist. It can take weeks to design,
and even longer to roll out,
and analyze the results. And as a small studio,
that’s just not feasible. For example, our store hadn’t
been updated in over three years. So last month we decided to make
a better version of our store. We thought it would be
a small thing, but it
turned out we were wrong. Players hated the new store. We had to drop everything we
were doing to roll it back. It took us way too long to correct
our mistake and roll out a fix. We needed something to
help us iterate faster. To make smarter decisions
that don’t come at the
cost of losing players. So when we learned about
Game Tune a decision engine
powered by machine learning that optimizes your
game in real time. We were all in. I was excited
that there was something that
acted and learned far faster than us. One of the
tests we ran was for tutorial
progression speed, and we set GameTune to optimize for Day 7 retention. We then gave it four different
speeds to choose from, from
very slow to very fast. And we assumed that there
was no way the slowest speed
could win, but we assumed wrong again. GameTune was able to immediately identify the slowest speed as the best variant, and we
started seeing positive
results in just a few days. And we didn’t even need to
analyze the results because
the game was adjusted in real time. We would have never
tested these extreme settings
without GameTune because the risk of losing players was too high. With GameTune, we could be more bold.
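For intuition, a decision engine like this behaves much like a multi-armed bandit: it assigns new players to variants, observes a reward such as Day 7 retention, and shifts traffic toward whichever variant is winning. The epsilon-greedy sketch below is purely illustrative and is not GameTune's actual algorithm:

```csharp
using System;

// Illustrative epsilon-greedy bandit over game variants, e.g. four
// tutorial progression speeds. Not GameTune's real implementation.
public class VariantSelector
{
    readonly Random random = new Random();
    readonly int[] plays;      // players assigned to each variant
    readonly double[] rewards; // summed outcomes, e.g. Day 7 retention
    readonly double epsilon;   // fraction of traffic spent exploring

    public VariantSelector(int variantCount, double epsilon = 0.1)
    {
        plays = new int[variantCount];
        rewards = new double[variantCount];
        this.epsilon = epsilon;
    }

    // Pick a variant (0 = very slow .. 3 = very fast) for a new player.
    public int Choose()
    {
        if (random.NextDouble() < epsilon)
            return random.Next(plays.Length); // explore

        int best = 0;
        for (int i = 1; i < plays.Length; i++) // exploit the current winner
            if (Mean(i) > Mean(best)) best = i;
        return best;
    }

    // Report the observed reward, e.g. 1 if the player returned on Day 7.
    public void Report(int variant, double reward)
    {
        plays[variant]++;
        rewards[variant] += reward;
    }

    // Untried variants get priority so every option is sampled at least once.
    double Mean(int i) =>
        plays[i] == 0 ? double.PositiveInfinity : rewards[i] / plays[i];
}
```

And as a result, we were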
able to inject new life into
the game and we now retain 5 percent more players every day and
run double the tests we used to. The time we save on testing
now gets put toward developing
and optimizing features. Thanks to Game Tune, we now
have the power of a large
machine-learning team to deliver the best
experience to our players. Thank you. [APPLAUSE] [MUSIC]>>Please welcome Arisa Scott.>>The Unity graphics
team lives to create best-in-class graphics
technology: performance and state-of-the-art visuals, along with intuitive artist workflows. That's our driving force. When it comes to graphics,
you need power to push your
visuals and control to be able to decide what
matters to you most. We are investing in our
render technology with the
scriptable render pipelines, giving you direct control
over what you need. You can take advantage of two render pipelines out of the box. Use them as a starting point
for your own solution or
customize them to meet your needs. The High Definition Render Pipeline is best to use if your goal is more targeted: pushing graphics as far as you can on high-end hardware, delivering powerful, high-fidelity visuals. It's a fully featured offering in 2019.3, now out of preview, bringing stunning graphics and photorealism at game-ready frame rates. Your HDRP assets
will scale in fidelity on
high-end platforms, taking the best advantage of the
available hardware resources
for best visual quality. And you only have to author
it once with our full suite
of production-ready artist tools such as VFX Graph, Shader Graph, and post-processing. The second pipeline is the Universal Render Pipeline, formerly known as the Lightweight Render Pipeline, which is
best to use if you want full
Unity platform reach with best-in-class visual
quality and performance. It’s a powerful solution right
out of the box with a full
suite of artist tools for content creation. Most
importantly, it visually scales to the capabilities of the hardware on all platforms that Unity targets. We believe it's important
that you have the right
tool for the right job. You now have a range of
options you can confidently
build off of right now and for the future. And if 2-D is where you live, our new suite of tools will unlock even more creativity.
Let's take a look. [APPLAUSE]>>Please welcome Andy
Touch and Jennifer Nordwall. [MUSIC]>>We love 2-D. Whether it’s an RPG
or a Match 3 game, some of today’s most
successful games are 2-D. Including 75 of the
top 100 mobile games. And we’ve been working
to raise the bar for
development across the board. To not only help you create
what you want, but also
improve the experience of how you create it. So let’s
take a look at the 2-D
tools working together. But before we jump in, I
want to give a very big
thank you to Back to the Game Studio, based in Montreal, who made the lovely project that Jennifer and I are about to show you. This project is called Lost Crypt. It was created using the
Universal Render Pipeline,
which now has a built-in 2-D renderer, really dedicated
to making art truly shine. The 2-D renderer introduces
a wealth of new graphics
features that you can see throughout these environments, from the glowing particles in the air to the character's dynamic shadow being cast by the magic wand onto the crypt wall. And the rich tone of the forest and these little spirit sprites are a result of our new native 2-D lighting system that adds variety and depth to create this mystical nighttime atmosphere. So let's go behind the scenes
to see how the new 2-D tools
helped bring this project to life. So previously, to create a day-night cycle, you'd make many different versions of every sprite in the level. But 2-D lights make this much, much easier. All sprites can be reused in every lighting condition. And as you can see, as Jennifer moves this slider, it controls a wide variety of lights that dynamically modify everything that you see in the scene.
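As a rough sketch of how a slider like that can drive the lighting from script, assuming the experimental Light2D API as it shipped around Unity 2019.3 (the namespace and fields moved in later releases):

```csharp
using UnityEngine;
using UnityEngine.Experimental.Rendering.Universal; // Light2D, 2019.3-era namespace

// Hypothetical day-night slider: one value retints and re-intensifies
// a global 2-D light, so every sprite is reused in every lighting
// condition instead of being duplicated per time of day.
public class DayNightSlider : MonoBehaviour
{
    [Range(0f, 1f)] public float timeOfDay; // 0 = night, 1 = day
    public Light2D globalLight;             // a global 2-D light in the scene
    public Gradient lightColor;             // night-to-day tint

    void Update()
    {
        globalLight.color = lightColor.Evaluate(timeOfDay);
        globalLight.intensity = Mathf.Lerp(0.2f, 1f, timeOfDay);
    }
}
```

And here we have what we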
call a 2-D Point Light. This has a wide range of
in-scene gizmos to visually
control how it illuminates the scene. You may notice that the light's intensity and fall-off are also influencing the normal maps on the background wall. And this scene looks great,
but a layer of visual polish
can be added to take it to that next level. Shader
Graph and post-processing
work together with the 2-D renderer. And the visual effects of this water sprite were composed entirely using Shader Graph. As we can see,
the water refracts the dungeon
floor and also reflects the 2-D lights. Shader Graph allows us to
create 2-D shaders with
absolutely no code by connecting these very visual
nodes together. So let's take a look at the creation of organic terrains. In the past, a team would need to create, manage, and lay out every individual little sprite and collider entirely separately. But with Sprite Shape, you define
all your data in just one place. And a visual tool makes it
easy to create and also
iterate on your level shape. And these Sprite Shapes can also be used to generate platform colliders and dynamically change the
environment visuals that
you see on the screen. Now, when it comes to animating
characters, sometimes
frame-by-frame animation can be very costly in terms of
production time, file size,
and just being difficult to iterate on. With a new bone-based solution, a team needs less art to achieve smooth animation. So here we have this character, and she's been created using our new rigging and IK system made entirely for 2-D. And the character's head uses the exact same rigging system, but is instead driven by 2-D physics. And here you
can see that Jennifer is
switching the Sprite Library asset. And this is instantly, in real time, changing how the character looks, all whilst reusing the same 2-D rig, 2-D IK, and 2-D animations.
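In script, a swap like that boils down to retargeting a SpriteResolver at a different Sprite Library entry; a minimal sketch, with made-up category and label names:

```csharp
using UnityEngine;
using UnityEngine.U2D.Animation; // 2D Animation package

// Hypothetical sprite-swap helper: the rig, IK, and animation clips
// stay the same, and only the resolved sprite changes.
public class CharacterSkinSwapper : MonoBehaviour
{
    public SpriteResolver headResolver;

    public void UseWinterOutfit()
    {
        // "Head" and "Winter" are illustrative Sprite Library entries.
        headResolver.SetCategoryAndLabel("Head", "Winter");
    }
}
```

This opens up a world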
of possibilities for
developing your game. As you can see, these new
tools and workflows will help
you to achieve what you want to create in the
most efficient way. Thank you. [APPLAUSE]>>Please welcome Tim Cooper. [MUSIC]>>The Universal Render Pipeline is a powerful rendering
technology that delivers a
combination of beauty and performance right
out of the box. Best of all, it's supported on any platform that Unity can target. Whether you're building for
2-D, 3-D, VR, or AR, if you
wish to develop once and deploy everywhere, this is
the rendering technology
you should be using. By using the Universal Render Pipeline, you are able to use all of our new artist tools and workflows, including VFX Graph, Shader Graph, post-processing, lightmapping tools, and render passes. You can update your projects from Unity's built-in renderer and, simply put, it runs better and scales better. To demonstrate the benefits
of moving to the Universal
Render Pipeline, we decided to upgrade a real project. We took the Polygon Farm
assets store pack by Synty
studios and ran it through our automated upgrade tooling. In every aspect, The
Universal Render Pipeline
performed better and faster. Universal Render Pipeline
utilizes an improved single
pass forward rendering technique that leads to fantastic
performance improvements. CPU and GPU usage with
30 percent faster. Draw codes were significantly
reduced and memory access was
improved by one third. Meaning that you can add
more content and still hit
your target frame rates. Put simply, Universal Render
Pipeline can handle more content. It won’t heat up your
phone as much and your
users can play longer. Now to help me talk about
beauty and extensibility. Andre McGregor will join me. [APPLAUSE]. So this is Crest Ocean, a
tool developed by a small
team of two from the UK. Crest allows you to create
beautiful, realistic oceans
and is built as a plugin for the Universal Render Pipeline. It’s available on
the asset store. Now, when you think about tropical
islands, what comes to mind? To me, it’s beautiful,
inviting, emerald green water. This phenomenon is caused by
the way that bright tropical
light scatters and bounces as it interacts with the water. It’s a difficult effect
to model accurately. Utilizing the power of the
Universal Render Pipeline,
the team was able to extend the rendering without having to modify the Universal source code and develop a number of state-of-the-art rendering features to model this beautiful water you can see on-screen right now. To start with, the team used the custom render pass feature to add shadow accumulation inside the water volume. You can see this here in the cave. Beautiful soft shadows.
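To give a sense of what extending the pipeline without touching its source looks like, here is a minimal skeleton of a URP renderer feature that injects a custom pass. The pass name and contents are hypothetical, not Crest's actual code:

```csharp
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Sketch of a custom render pass injected into URP from the outside.
public class WaterShadowFeature : ScriptableRendererFeature
{
    class WaterShadowPass : ScriptableRenderPass
    {
        public override void Execute(ScriptableRenderContext context,
                                     ref RenderingData renderingData)
        {
            // Record whatever extra work the effect needs, e.g.
            // accumulating shadows inside the water volume.
            CommandBuffer cmd = CommandBufferPool.Get("WaterShadows");
            // ... enqueue custom draw calls here ...
            context.ExecuteCommandBuffer(cmd);
            CommandBufferPool.Release(cmd);
        }
    }

    WaterShadowPass m_Pass;

    public override void Create()
    {
        m_Pass = new WaterShadowPass
        {
            // Run after opaque geometry so the water can sample it.
            renderPassEvent = RenderPassEvent.AfterRenderingOpaques
        };
    }

    public override void AddRenderPasses(ScriptableRenderer renderer,
                                         ref RenderingData renderingData)
    {
        renderer.EnqueuePass(m_Pass);
    }
}
```

Another important part of the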
Universal Render Pipeline is
the physically based shading model that scales from high-end to mobile devices. This is an exposed shader [indiscernible] that you
can use in your projects for custom materials. In Crest Ocean, this was
used to help model the state-of-the-art subsurface light scattering within
the water volume. Without this, the island
looks much less tropical and,
frankly, to me, a little dull. Previously, effects like this were hard to achieve, but with explicit hooks, it's now possible to greatly extend the rendering without having to write a renderer from scratch. So Universal gives you beauty and extensibility. But what about platform reach? Keeping with the theme of tropical islands, let's take a look at
the Boat Attack project. With Universal Render Pipeline,
you can truly author once and
deploy everywhere with great performance and
best-in-class visuals. So let’s take a look at this
running on the XBox One. We develop the Boat Attack
in-house as a way to test
features for Universal as we were developing the pipeline. Do you wanna switch
across to the PlayStation 4, Andre? We added some cool features
to this, like a day-night
cycle and custom time of day. How about you show the
people in the crowd? And finally, let’s take
a look at this running
on the Nintendo Switch. So what’s great about this
is, this was all built from the same Unity project. The Universal Render Pipeline
easily reaches the high end
with fantastic performance, high resolution and
beautiful quality. But a project like this,
how well does it run on mobile? Here you can see three
phones, each with different
power and capabilities. The project you say here is
built from the exact same
content as on the consoles, except some features have
been optimized throughout
scalable quality settings on these phones. We had a rock
solid 30 frames per second
without sacrificing quality. We’ve also upgraded the
post-processing to a
state-of-the-art mobile-optimized version. These effects are
designed to work blazingly
fast and take advantage of hardware specific
to mobile devices. You can easily upgrade your
existing projects using an
upgrade tooling and start to take advantage of the new
features and performance
benefits in the 29.3 beta. If you wish to have
gorgeous graphics and be able
to scale your content, then you need to be using
the production-ready
Universal Render Pipeline. Thank you. [APPLAUSE]>>Please welcome Freeman Fan.>>Since starting Multiverse, our 40-person studio in Shenzhen, China, we've established a reputation
for building immersive VR games. But our newest project
has us applying our
Unity expertise to PC. Today, I’m thrilled to share
an early unveil of our newest
game, Earth From Another Sun. In this
[indiscernible]-like FPS, a mysterious threat in the
future sends Earth through a
wormhole back to the present day. We jumped into
production earlier this year,
leveraging a few Unity sample assets to move fast. Driven by the desire to realize
our vision of a graphically
stunning game with a new level of immersive realism, we
chose the High-Definition
Render Pipeline. HDRP enables us to build
a world that feels more
lifelike and organic, creating an ambience that evokes a sense of tension and mystery. To achieve this, our artists layered a variety of HDRP effects. First, we adjusted the exposure. Then we applied some tone mapping and subsurface scattering, added volumetric lighting, modified the color grading, and added a slight lens distortion. The result speaks for itself: a stunning environment that's alive and far more immersive than what we started with.
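Stacks like that are normally authored in the editor, but as a rough illustration the same layering can be assembled from script through HDRP's Volume system. Component and parameter names below assume the HDRP 2019.3-era API, and the values are invented:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

// Hypothetical mood setup: a global volume layering exposure,
// tone mapping, and lens distortion overrides.
public class MoodSetup : MonoBehaviour
{
    void Start()
    {
        var volume = gameObject.AddComponent<Volume>();
        volume.isGlobal = true;

        var profile = ScriptableObject.CreateInstance<VolumeProfile>();
        volume.profile = profile;

        // Fixed exposure tuned for a dim, tense interior.
        var exposure = profile.Add<Exposure>();
        exposure.mode.Override(ExposureMode.Fixed);
        exposure.fixedExposure.Override(8f);

        // Filmic response curve for a cinematic look.
        var tonemapping = profile.Add<Tonemapping>();
        tonemapping.mode.Override(TonemappingMode.ACES);

        // A slight lens distortion, as described above.
        var distortion = profile.Add<LensDistortion>();
        distortion.intensity.Override(0.05f);
    }
}
```

Even better, we didn't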
have to sacrifice performance. Every shooter strives for
a high frame rate, but
balancing visual fidelity and performance can be a challenge. We’re overjoyed that Earth
From Another Sun runs at a
smooth 60 frames per second. Even during intense battles
while defending against
hordes of enemies or going up against a giant alien worm,
performance never stutters. We're able to achieve our graphical vision while ensuring highly performant gameplay. With the High
Definition Render Pipeline
we’re benefiting from its incredible visual capabilities
while future-proofing our
development as we head toward a release in 2020. Thank you. [APPLAUSE].>>This is the place. It doesn’t look like much. Go take a look. [INDISCERNIBLE MACHINE VOICE] OK, here we go. So who exactly is this guy? You’re kidding.
They're a fairy tale, Boston. Don't be an idiot. I see it. Wait. I can't believe this
place is actually real. This makes no
sense to me, Boston. But you might have been right. You've got one shot at this. I've got one shot at this. [INDISCERNIBLE MACHINE VOICE] I know. Here it is. The last one. We know that now. >>Give it to me. That's why I'm here. Give it to me! [APPLAUSE]>>Please welcome Sylvia Rasheva and Vlad Neykov. [MUSIC]>>What you just saw
renders in real time at 30 frames per second on
commercial grade hardware. What we’d like to do now is to
give you a behind-the-scenes
peek into what this short film looks like in Unity, and to share how we addressed
some of the more interesting
production and artistic challenges in this project.>>So what we have here is
the whole short film from
the beginning to the end, running in Unity in real time. We’d like to focus on some of
the more interesting points
that will allow us to communicate to you how
the team managed to
accomplish their vision.>>With our productions,
we are always striving to
achieve a two-fold goal. On one side, we want to push
the boundaries on visual
fidelity and to get closer to photo realism. And on the
other, we want to enable
artistic stylization and visual development that is
influenced by cinema. We wanted to focus on creating
a compelling experience, but
without getting bogged down with the
technically mundane. So let us dive right in and
first of all, show you how
Unity's High Definition Render Pipeline is used
throughout the project. It is a feature we’ve been
talking a lot about and here
is why it is important. The great thing about the HD
Render Pipeline is that it
makes high-fidelity graphics very accessible. You don’t need to use any
arbitrary measurements and values. You can simply emulate what
you observe in the real
world and you can get the results you expect.>>Lighting a scene, for example,
is very intuitive with a range of powerful features such
as area lights, which are
great for shots like this and volumetric lights which contribute
a lot to the ambient feel. And as Sylvia mentioned,
because HDRP is using physically accurate values, everything else, from the camera lens to bloom, just works. HDRP also comes built in with a
powerful set of post-processing
tools which allow us to modify the final frame to achieve this cinematic look that we desire. Here we can watch, in this shot, how that drastically changes the final look. Let me emphasize
this real quick. We are talking about the final-quality frame, which goes directly into the film. And Vlad is manipulating it in real time in the editor. This is possible thanks to HDRP's fully unified, physically based rendering. HDRP already comes with some
very powerful shaders with
which artists can author unbelievable materials. These include anything from the standard Lit Shader to fabric, hair, or decal shaders. However, in every
project, the need for
something project specific may arise. Unity’s Shader Graph
allows for easy authoring
of custom shaders. This empowers artists to work
independently and to raise
the quality of the visuals they create. For example,
here the team is using Shader
Graph to create a dust shader. This shader is applied
to a material which we’ve
added here to this pipe. As you can see, whenever
I move the pipe, the dust
always settles on top. Now these kind of effects are
really the bread and butter
of any kind of production, but shader graph enables anybody
on the team to create them. Now on to the character Morgan, who appears at the end of the protagonist's journey. It presented a challenge that we hadn't tackled before. It is a creature of undefined shape and gender in a constantly fluctuating emotional state. It is distressed and chaotic. A being which is both good
and evil, both male and
female, both aggressive and vulnerable. How do we
achieve something like that? So we needed a tool that enables us to explore its visual appearance and behavior through fast design iterations. We needed not only to render this character in real time, but also to iterate through a real-time creation process. And the Visual Effects Graph
was just the right tool to
tackle the challenge of creating a character
such as Morgan. By creating a simulation with
GPU particles, an artist can
change the geometry and the character will basically
conform to this new geometry. The particles will
go and change it. So this is really powerful
because it enables not only
to tackle the challenge of Morgan, but also to iterate
and try out new things.>>And this is how we
were able to see Morgan
transitioning between genders, transforming from a
giant to human size. All the while, their
appearance is shifting in line
with their changing emotions. One of the most complex
problems, not only real time
graphics, but NCG in general, is how to achieve
believable human characters. The human face is so familiar
to us that we have evolved to
read and recognize even the tiniest details
of its expression. In order for a digital
human to look acceptable
to us, a large number of technological features need
to be developed and to play
nicely together all at once. With the production of The
Heretic, we wanted to explore
this space and we wanted to help accelerate Unity on a
path towards becoming a next
level tool for creators, no matter how high their ambitions. In addition to all standard
materials, HDRP now ships
with build for master nodes for hair, fabric,
and other advanced materials. Here we are using the
built-in hair shader for
Wayne stubble and eyebrows. Since we are on
the topic of hair: this is the hair geometry, and you can see, as Gawain is being animated, the geometry conforms to the face. And this was possible with the help of the C# Job System and the Burst compiler, which allow us to resolve over 400,000 individual hairs, re-posing them every frame at runtime.
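For a taste of what that job looks like, here is a heavily simplified sketch of a Burst-compiled parallel job that re-poses hair vertices against an animated face. The data layout is invented for illustration and is not The Heretic's actual implementation:

```csharp
using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;
using Unity.Mathematics;

// Hypothetical per-vertex re-posing job: each hair vertex follows the
// current frame of the face triangle it was anchored to at rest.
[BurstCompile]
struct ReposeHairJob : IJobParallelFor
{
    [ReadOnly] public NativeArray<float3> restOffsets;      // offset per hair vertex
    [ReadOnly] public NativeArray<int> anchorTriangles;     // anchor triangle per vertex
    [ReadOnly] public NativeArray<float3x3> triangleFrames; // current orientation per triangle
    [ReadOnly] public NativeArray<float3> triangleOrigins;  // current position per triangle

    public NativeArray<float3> posedPositions;

    public void Execute(int i)
    {
        int tri = anchorTriangles[i];
        posedPositions[i] = triangleOrigins[tri]
                          + math.mul(triangleFrames[tri], restOffsets[i]);
    }
}

// Scheduled once per frame, e.g.:
//   new ReposeHairJob { ... }.Schedule(vertexCount, 64).Complete();
```

In addition, even though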
there isn't a dedicated skin shader, the team here is extending the StackLit master node in HDRP to create a believable skin shader, which is applied to Gawain. And you can see, as we change the light, that the shader reacts as you expect.>>Now, HDRP
and the VFX Graph are
coming out of preview. They are production ready for
content creation in Unity 2019.3. You can have it in your hands
already today in open beta. Thank you. Thank you. [Applause] That’s our show. I hope
you guys had a good time, learned a little bit, and found some interesting tidbits. So we believe, and hope you agree, that real-time 3-D is one of the most powerful storytelling mediums of our
time and into the future. We hope you guys
agree with that. So we are going to call it
a night for those on the
livestream, please stick around for the post show
with Liz and Will. It’ll start immediately after. For all of you in the room
here, we hope you join us for
the keynote reception right behind you. And thank you. Have a great week. [Applause]>>Well.
>>Wowzers.>>It's like game
development Christmas out there. Is it? Let me check my watch. It’s not even December.
Not even December. So many incredible
things in that keynote. I can’t believe it.
What did you enjoy?>>Well, do you know what
my favorite part was?>>I don’t. That’s why I
asked.>>I don’t mean to be
biased, but I’m going to say it. And I really
loved the 2-D section. All those 2-D features. The demo was gorgeous. I just loved how everything
was put together. So we’ve got– So it was
using the 2-D rendering with
Universal Render Pipeline.>>True.>>Sprite Shape and Shader Graph used with 2-D. We had the sprite swap. Sprite Shape. Just, if you
were wanting to create a
2-D game or project.>>Yeah.
>>We just have
so much great stuff.>>Yeah.
>>Very exciting for me.>>We were talking about
this earlier and I love how,
you know, obviously Ori and the Blind Forest came
out a little while ago. Now there is Ori and the
Will of the Wisps and that
kind of quality of 2-D game obviously came from the team
with just a ton of passion
for making an incredible experience. And I just love
that we’ve put together those
tools that now give you, you know, lighting and
particles and, you know,
complex geometry for 2-D.>>Light, shadows.
>>And it’s so cool.>>It’s just incredible.
>>It makes me very happy.>>What about you?
Sorry, I was just rambling.>>That’s fine.
Let's talk about it. For me, I'm all about DOTS and live link. And bringing all of that
stuff to production
readiness is a big deal. Speaking of which, who's this?>>These are our opinions. So let's have an opinion
from someone that matters.>>Oh, Brett Bibby,
VP of engineering. Welcome.
>>Thank you. Good to see you again. Great.>>It's great to be seen. And this is an amazing turnout.>>Oh my gosh.>>
Apologies for the delay. But there is a crowd.>>Yes, there is a crowd.>>So Brett, did you have a
particular highlight this
evening in the keynote?>>Yeah. I mean, for me, I’m such
a sucker for a good story. And the demo team just
knocks it out of the park
every year, every time. And Heretic was like the most
amazing, you know, story to date. And then we only got
to see like part one. I was like, what happens? What’s in the case?
What’s in the box? I thought that was really great.>>But do you actually
know what’s in the box? [LAUGHING] I didn’t know I
was going to ask him that.>>Yeah. No, I’m not.
>>Sorry, I’m just trolling.>>Actually–
>>Sorry.>>Boston’s not in the box.>>Boston is not in the box,
but he’s awesome, or she.>>Yeah. Okay, fine.
>>So, Brett, coming back to it: we had lots of DOTS announcements tonight.>>Oh, yeah. That's a big part of where we're going in 2019 and 2020.>>Talk to us about how we approach making sure that DOTS is something that's ready
for people’s productions and
appropriate for their needs.>>Right. Well, like most
things with Unity, you know,
we have a lot of users out there, a lot of great customers
building amazing things. And when they’re building
stuff and we know it works,
then it’s great and we can go talk to them.
How did you make this? How do you make that? And we can
think about like the solution. Unity is a solution for,
say, a 2-D platformer or
a first-person shooter or something. And so when it
comes to DOTS, this is
really cutting edge stuff. Right. And it’s going to
unlock a set of experiences
that have never before been possible. Therefore, there’s
nobody out there using that. So what we've done is–>>Go and use it, right? Yeah.>>And then you can go and use it. So you can use DOTS today. You can put it on top of an existing game. Works great. But if you want to go all-in on it, there is nobody already all-in on it because it's
brand new technology. So internal productions is what
we spin up to support that.>>And that's what you saw from the sample game team.>>That's right. The sample game team made the sample.>>The DOTS Sample. And then what we saw here was, you know, the next generation of their shooter, but using all DOTS, all the pieces, all integrated. So we know they actually
all worked together.>>How do we get
feedback from that team? So that’s pretty pivotal, right?>>Right. Well, they’re
actually inside of our studios.>>Here in Copenhagen.
>>Here in Copenhagen. And, you know, I think over
the last couple of years,
you’ve seen like a growing number of technical artists
and small productions
starting to pop up around the studios. And I think it's really important that we're actually dogfooding features. It's something we heard from our customers: "You don't make games, but gosh, you should make games.">>So, yes, as an answer, we pretty much are now.>>Yeah, we pretty much are now.>>Awesome.
>>But we don’t
compete with our customers.>>Yeah,
we never compete with people. But we're taking all of that feedback from the internal productions and making it real. Awesome. Brett, thank you so
much for joining us.>>Thanks so much
for being here. And you know, earlier when
we were chatting, you were
asking me about what we’re like most excited about in 2020.>>Oh, yes.
>>Yeah.>>I should totally
ask you that again. So, what are you most excited about in 2020?>>I'm most excited about a
strong, growing investment in artist tooling and
Unity was invented a few
kilometers from here, it was 90 percent programmers,
10 percent artists. Nowadays,
it’s completely opposite. So, you know, starting in the
few months remaining this
year, but on into next year, this internal production
idea, getting more artists,
more teams, more people building inside of Unity, and
making sure that artists
around the world, creators of all kinds have the best
tools and workflows. That’s what I’m excited about. That’s what we’re investing in.>>Cheers to that.
>>Yes. Cool. Brett, thank you. We’re going to grab
some– even more guests. So we’ll see you at the show.>>Thank you.
>>Thank you so much, Brett.>>Amazing. So a lot in the keynote is–>>True.
>>Lots of bases to cover. Lovely guests for you. Next up is my hero,
director of XR, Unity, Timoni. Hello.
>>Hello.
>>Welcome.>>So let’s just
touch on the keynote. Lots of awesome information.>>Yes.>>Is there anything
that you just wanted to
drill down more into? And be like I need to
tell the people this?>>Yes, absolutely. I mean, I feel like our
section could have been the
entire keynote to really dive into it.>>I can't believe you weren't allowed to do that. That's crazy.
>>I’ll tell Catherine. So the one thing I really
want to talk about is the
interaction toolkit, which is new. That’s Matt Delby’s
team on the platforms team. And it is such a fantastic
addition to the way you author
for this new era of sensor driven computers, because when
it comes to how computers work
today, the traditional inputs that we think about are
not the only way you work anymore. You can open your
laptop with your face. Right? That sounds
weird, but it's true. Your face is now an input. Right.
Your hands can be an input. Your voice can be an input. There’s all these new
ways of interacting. And especially when it comes
to the development kits for
the wearable devices. These are expensive. They’re hard to get. Not all the
developers have access to them. So the tool kit actually
provides this base layer
of interactivity where the developer can just think
to themselves, “I need to
select this object here”. Like I need my character
to be able to do it. I need my user to
be able to do it. The toolkit provides this layer of, you know, "it's just a selection," and then handles it across all the different types of inputs and all the different types of devices that you might need to use. And that's fucking fantastic.
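To make that concrete, here is a minimal sketch using the XR Interaction Toolkit. The component is real, but the setup is a hypothetical illustration, and the selection event names have shifted across preview versions of the package:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical setup: one interactable object that reacts to "select"
// no matter which input expressed it (controller ray, direct grab,
// hand tracking, and so on).
[RequireComponent(typeof(Rigidbody))]
public class SelectableRelic : MonoBehaviour
{
    void Awake()
    {
        var interactable = gameObject.AddComponent<XRGrabInteractable>();

        // The toolkit routes every supported input device to this
        // single event, so the game logic only thinks "selection".
        interactable.selectEntered.AddListener(_ =>
            Debug.Log("Relic selected"));
    }
}
```

>>It's amazing.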
to so many developers. Right? Just across the–.
Yes, absolutely. Absolutely.>>And it’s also kind
of like future proofing as well.>>Exactly. Yeah. Yeah.
>>So, help us understand. So we have obviously ARKit and ARCore.>>Yep.
>>Then, on top of that
you have Unity and you have–.>>AR Foundation.
>>Yeah. Yeah. No.
[INAUDIBLE]>>And Mars.>>Yeah. I was gonna say. Get to it!
So who's the core audience for MARS?>>Yes. So the core audience for MARS: when we started building
it, we really were laser
focused on augmented reality developers today, ARKit and ARCore. So the phones and then
also the wearable devices. But the cool thing that we
have found out as we have
built out this tool is that when it comes to augmented
reality, it’s not just about
the devices themselves. It’s about the types of
data that you get in. And again, it's the gesture recognition, the voice recognition, the room mesh scan information, the object detection, and so on. And so what
we’ve found is the workflows
that we’ve created for these headsets and these
phones actually work for all types of real-world devices. Like self-driving cars
or smart speakers. So it actually opens up
Unity for a wider swath of
different types of applications and devices working together
than we had expected. And that is so exciting.>>I didn’t– I
didn’t realize that.>>It doesn’t — Yeah. We didn’t either until we
started to think about it more
and we’re like, well, wait a minute.
>>Awesome. So where can people
learn more about MARS?>>Yes. We have a landing
page where you can sign
up for the closed beta. We’re taking people as they
come in on a case by case
basis just to make sure that we provide the proper support. You know, technical onboarding
and access to the package. We also have a blog post
coming out, I think in the
next week that will give an update on the status and
there will be more to come,
obviously, next year.>>Cool. Thank you.>>Thank you.
>>It’s pretty exciting. Yeah. Thank you.>>[INDISCERNIBLE]. Cool. And finally,
last but not least–.>>Last but not least.>>–is our friend Danny. There he is. He's the VP of AI and Machine Learning. Hello, Danny. You showed some crazy things. [cross-talk]>>Exciting. Yes? Yes,
I’m someone like [INDISCERNIBLE]. But like [INDISCERNIBLE],
I’m someone who doesn’t
know about that area of the business that much. So it’s
really exciting for me to
understand this insight into what you’re working on.
>>And the application of it. Like, what was it? Eleven billion miles
before a self-driving car
is certified as safe. Yeah.>>And just [cross-talk].>>So I'm just sitting here. You need to go out and
get going on that, yeah? It’s never gonna happen.
Or, you just simulate it all.>>Yeah. And I didn’t realize. So apart from that, which
is incredible in itself,
what other applications of simulation could there be?>>Fortunately, many.>>Yes!
>>That’s why we
launched the service. Oh, of course. There’s the whole gaming
business of play testing. We’ve talked about
self-driving cars. But we're also now seeing that robots are coming out of the cages. Yeah. You have seen those robots in cages in car manufacturing.>>I've seen Boston Dynamics. It escapes me.
>>Yeah. Getting out and moving around. Yeah. Yeah.
You can’t program those things. Yeah. They just meet so many
situations all the time. Yeah. So what you have
to do is you have to
simulate at massive scale.>>So you're providing the
human behavior simulated
for them to learn how to respond.
>>Exactly. Yeah. People are going to
interact with these robots. So what do we do?
We use the NPCs in Unity. We import the robot virtually into Unity. And then we let the NPCs interact with the robots, and the robots will learn that there are certain things you shouldn't do.
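As a hedged illustration of that loop, here is roughly what such a robot agent could look like with Unity's ML-Agents package. The observations and rewards are invented, and the method names follow a later ML-Agents API than the one current at this keynote:

```csharp
using Unity.MLAgents;
using Unity.MLAgents.Actuators;
using Unity.MLAgents.Sensors;
using UnityEngine;

// Hypothetical warehouse robot trained while scripted NPCs walk around
// it: getting too close to a person is penalized, so the policy learns
// that there are certain things you shouldn't do.
public class WarehouseRobotAgent : Agent
{
    public Transform nearestNpc;

    public override void CollectObservations(VectorSensor sensor)
    {
        // Let the robot "see" the closest person in its own frame.
        sensor.AddObservation(transform.InverseTransformPoint(nearestNpc.position));
    }

    public override void OnActionReceived(ActionBuffers actions)
    {
        float move = actions.ContinuousActions[0];
        transform.Translate(Vector3.forward * move * Time.deltaTime);

        if (Vector3.Distance(transform.position, nearestNpc.position) < 0.5f)
        {
            AddReward(-1f);  // came dangerously close to a person
            EndEpisode();
        }
        else
        {
            AddReward(0.01f); // small reward for safe operation
        }
    }
}
```

>>Right. We can hope so.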
Yeah.>>And this is a fantastic
opportunity for Unity to use the game technology and all the stuff that game developers have created over the years, the skills, the techniques, and then use them to impact the real world.>>Yes. Amazing. So going back to gaming then,
what are the implications of
using this in gaming?>>Yeah.
Think about it this way. Games are getting more and more
dynamic, more and more adaptive. Yeah. So you want to
fine tune your games. You want to figure out how
your games are going to work when they meet the real audience, the real humans. So what do you do? You simulate: you run bots against your game and see how it performs with different types of robotic players. It's basically NPCs
playing the game.>>So you’re thinking about
like, you know, procedural
games where, you know, creating encounters and having
teams of thousands of game
designers isn’t really an option, but, you know,
using simulation to
create those encounters.>>Yes, exactly. And you can figure out how
they work before you bring
them into the wild, into the market. Simulation is going to change a lot of things.>>So, yes, amazing. And where can people learn more about Unity Simulation as a service?>>So go to our website. It's linked there. You can sign
up and you can join the beta. Get your things rolling.>>Oh, okay.>>Yeah.>>Go cage these robots.>>Yeah, right. Right. Yes.
>>That's the tagline.>>Thanks so much for joining us.>>Thank you.>>Thank you for having me.
We’ll see you.>>Very interesting stuff. Thank you.
>>Well, Elizabeth.>>It’s sadly come to an end.>>It has come to an end,
but, really, it’s only
just getting started. So you can join Unite
Copenhagen from home by searching any social channel for #UniteCopenhagen. We're going to be sharing stuff, and everyone who's here, three thousand people, are going to be joining in and sharing stuff with you. So make sure to look out for that. You can join our
YouTube channel.>>Yes. Many of the talks
from this week are going to
be released onto our YouTube channel in the coming weeks. So please keep your
eyes peeled for that.>>But until then, we
will see you, and I mean
you specifically at GDC.>>Yeah!
>>Thanks for watching, everyone.>>Bye!
>>Great show.
Good night. [MUSIC]
