
elm-webgl

Andrey Kuzmin walks us through Elm's syntax for writing 3D shaders and using them in a type-safe way from Elm.
November 22, 2021
#44

Guest: Andrey Kuzmin (github) (twitter)

Transcript

[00:00:00]
Hello, Jeroen.
[00:00:01]
Hello, Dillon.
[00:00:02]
And today we've got Andrey Kuzmin joining us to finally talk about some 3D stuff in
[00:00:09]
Elm.
[00:00:10]
So, Jeroen, this is our newbie episode.
[00:00:13]
Andrey, thank you so much for joining us.
[00:00:15]
Thank you, folks.
[00:00:17]
It's really nice to be able to talk about this.
[00:00:20]
Yeah, it's great having you.
[00:00:23]
I've seen so many conference talks where you show awesome 3D demos and everything,
[00:00:28]
but I see so many Elm conference talks where things make sense.
[00:00:34]
It's like, oh, types, okay, types can help with this, types can help with that.
[00:00:38]
This can be type safe.
[00:00:40]
And then 3D stuff, and it's just all over my head.
[00:00:43]
So now we've got you here to ask you all our newbie questions.
[00:00:47]
And hopefully, other people who don't understand 3D will come along for the ride and understand
[00:00:52]
it as a result of our newbie questions.
[00:00:54]
Yeah, thank you.
[00:00:55]
I'm really glad to be able to explain all the things.
[00:00:59]
I'll do my best in this.
[00:01:00]
Yeah, no, I mean, you've got all this great knowledge.
[00:01:04]
And I think that a lot of things are, you know, there's the curse of knowledge where
[00:01:09]
you know these things, and it's easy to take for granted.
[00:01:12]
Actually, like going through some of the Elm WebGL documentation, I had this moment where
[00:01:19]
sometimes the questions that people ask me, you know, about like a package I maintain,
[00:01:24]
they're like, what is this?
[00:01:25]
What is a page module in Elm pages?
[00:01:27]
What is, you know, and, and I have to be reminded like, oh, yeah, that's not obvious.
[00:01:33]
And it's so hard to know what questions people are going to have when you've thought so much
[00:01:38]
about something and built this thing.
[00:01:41]
And I had that same feeling reading the WebGL documentation is talking about like meshes
[00:01:45]
and shaders.
[00:01:46]
And I'm like, I need someone to explain it to me like I'm five.
[00:01:49]
So here we are.
[00:01:52]
So maybe let's start with this.
[00:01:53]
If you want to do 3D in Elm, you usually use the elm-explorations/webgl package,
[00:02:00]
right?
[00:02:01]
What is WebGL's part in all this?
[00:02:05]
Why do we use it?
[00:02:06]
And why is it needed?
[00:02:07]
That's a good question.
[00:02:09]
So the reason why this package is in elm-explorations is because there is some
[00:02:14]
native JavaScript code that implements the integration with WebGL.
[00:02:21]
And there is also a part of WebGL that is implemented in the Elm compiler itself.
[00:02:28]
It's like it's really tightly coupled with the types that compiler generates.
[00:02:33]
Right.
[00:02:34]
So, so WebGL is like a language, like a DSL for expressing 3D things.
[00:02:42]
And Elm has GLSL as a syntax you can use.
[00:02:47]
And it actually knows the type information about the GLSL snippets.
[00:02:53]
Are the things I just said accurate?
[00:02:56]
That's correct.
[00:02:57]
Yes.
[00:02:58]
So it does type inference for the shaders that are written in GLSL.
[00:03:04]
That is a language that looks similar to, I guess, C.
[00:03:08]
Yeah.
[00:03:09]
Okay.
[00:03:10]
So GLSL is the language and WebGL is the runtime.
[00:03:15]
WebGL is like a set of APIs, I would say.
[00:03:19]
Okay.
[00:03:20]
To do like 3D graphics or any graphics that are accelerated by the GPU.
[00:03:28]
Okay.
[00:03:29]
And those are natively supported by every browser out there?
[00:03:33]
Yes, pretty much.
[00:03:34]
WebGL 1, the first specification that we are relying on, I
[00:03:41]
think you can safely assume it runs in the majority of devices like even mobile phones
[00:03:48]
like Android or iOS.
[00:03:50]
So okay, let's break down every single term.
[00:03:54]
And Jeroen and I are experts at not knowing what these terms are.
[00:03:59]
So we're...
[00:04:00]
I'm so good at that.
[00:04:01]
We're really good at that.
[00:04:03]
What is a shader?
[00:04:04]
Well, a shader is like a block of code with a main function.
[00:04:08]
And there are two types of shaders in the implementation.
[00:04:12]
One is called the vertex shader and another one is called fragment shader.
[00:04:16]
And these are the definitions of like the code that runs on GPU.
[00:04:22]
So they are being compiled and executed on the GPU.
[00:04:25]
And then all the data that you pass just goes through them.
[00:04:29]
And as the result, you get some visual things on the screen.
[00:04:34]
I guess it's like a very...
[00:04:35]
Like maybe it's an oversimplification.
[00:04:38]
What is the...
[00:04:39]
Like if I think of shading, I think of like maybe like coloring something in with a crayon
[00:04:44]
or something.
[00:04:45]
Like what's the metaphor of shading?
[00:04:47]
Like what does that mean a shader?
[00:04:49]
I don't think...
[00:04:50]
I don't know an answer to this question.
[00:04:53]
But what is it at a high level?
[00:04:55]
Like what...
[00:04:56]
Like if you were to describe what a shader is doing at a high level, not with code, what's
[00:05:01]
its purpose?
[00:05:02]
To describe what?
[00:05:04]
I think for the purpose of like explaining how things work in the WebGL, I think it makes
[00:05:10]
sense to maybe talk about vertex shaders and fragment shaders separately.
[00:05:16]
So if I was to explain what the vertex shader is, it's a shader that runs for every vertex
[00:05:25]
and it produces as an output where this vertex can be found on the screen.
[00:05:31]
Is a vertex the triangle or the point?
[00:05:34]
It's triangles?
[00:05:35]
So, well, depending on the geometry, if the geometry is triangles, then the vertex is
[00:05:40]
a corner, like a point on that triangle, a corner point.
[00:05:45]
It can also be a line.
[00:05:47]
So then it's like a point, like an end of a line segment.
[00:05:53]
Okay.
[00:05:54]
So it's defining how you draw the points, but it's not defining what those points are?
[00:06:00]
The input to it is attributes.
[00:06:03]
And attributes in Elm is like a record with user defined fields.
[00:06:09]
It's information that you can attach to it.
[00:06:11]
So it's like, it's very generic.
[00:06:12]
You can say, oh, you can attach coordinates to it, or you can attach color or texture
[00:06:20]
coordinates.
[00:06:21]
So any information you find useful, it can be attached to it.
[00:06:26]
And then this vertex shader is going to transform that into coordinates on the screen plus some
[00:06:35]
output data, which is called varyings.
[00:06:38]
Okay.
[00:06:39]
So would a shader be almost similar to like CSS rules that could define, you know, a color
[00:06:48]
or like a background image to put over something or a transformation to like move something?
[00:06:56]
Yeah.
[00:06:58]
If you think of the composition of two shaders, then maybe you can think of it in this way.
[00:07:04]
So let's say the input to the vertex shader is like a mesh that is basically, well,
[00:07:12]
if you talk in terms of triangles, like a triangle or geometry, it's a bunch of triangles
[00:07:18]
and each point of this triangle has user defined information like color or coordinates.
[00:07:24]
Then this goes through the vertex shader and it becomes, for each of them, you know where
[00:07:30]
it is on the screen plus any other information that is then interpolated.
[00:07:35]
When another shader picks it up, it operates on fragments, but you can think
[00:07:42]
of them as pixels on the screen.
[00:07:43]
So it basically, it fills those triangles and what it gets as an input is interpolated
[00:07:51]
values.
[00:07:52]
So if you have a triangle that has like red, green and blue attached to its vertices, then
[00:08:02]
if you use that in the fragment shader and the fragment shader determines the output
[00:08:06]
color of each pixel, each fragment, then you might just like interpolate it.
[00:08:11]
So if you return this output, this varying from the vertex shader, then the triangle
[00:08:19]
is going to be filled with like interpolated color depending on how close that each point
[00:08:25]
is to either of the corners.
[00:08:30]
So okay, so there's this GLSL that you can put inside of the body of Elm code with this
[00:08:38]
little [glsl| ... |] notation, and then you actually write GLSL code inside.
[00:08:45]
And then Elm infers type information about that, right?
[00:08:49]
So like what's the mapping between the GLSL, what is it taking from the GLSL and between
[00:08:57]
that and the Elm type?
[00:08:59]
So you define, when you do a GLSL snippet in Elm, it gives you a shader type, right?
[00:09:06]
And then what does it map the GLSL types to the Elm types with?
[00:09:11]
So usually at the top of the shader, you would have some global definitions of attributes,
[00:09:17]
uniforms, varyings for the shader, for the vertex shader.
[00:09:22]
That would mean that it builds up a shader where all the attributes are put together
[00:09:28]
into a record type.
[00:09:30]
All uniforms are grouped in a record type and all the varyings are grouped in a record
[00:09:35]
type.
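A minimal sketch of what such a shader definition can look like in Elm, with hypothetical field names (position, color, perspective, vcolor):

```elm
import Math.Matrix4 exposing (Mat4)
import Math.Vector3 exposing (Vec3)
import WebGL exposing (Shader)

-- attributes (per vertex), uniforms (per entity) and varyings (outputs)
-- each become one Elm record type in the Shader type annotation
vertexShader : Shader { position : Vec3, color : Vec3 } { perspective : Mat4 } { vcolor : Vec3 }
vertexShader =
    [glsl|
        attribute vec3 position;
        attribute vec3 color;
        uniform mat4 perspective;
        varying vec3 vcolor;

        void main () {
            gl_Position = perspective * vec4(position, 1.0);
            vcolor = color;
        }
    |]
```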
[00:09:36]
Like I forgot to mention what the uniforms are.
[00:09:38]
The uniforms are some global constants.
[00:09:42]
So they're not part of the mesh, they are part of the rendering.
[00:09:46]
So like camera position.
[00:09:48]
Yep.
[00:09:49]
Camera.
[00:09:50]
Okay, cool.
[00:09:51]
So that's basically like an input that allows you to, through Elm,
[00:10:00]
change the camera position.
[00:10:02]
You can like make that vary now and you've basically created a binding between Elm and
[00:10:08]
the WebGL interface where you can move the camera position, for example.
[00:10:14]
It's all passed in the declarative way, but if you use like animation frame subscription
[00:10:20]
and you pass a different camera position, then you'll have like a moving camera.
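A sketch of how that subscription side might look, assuming a hypothetical Tick message and a model that accumulates elapsed time; the uniforms are then rebuilt from the model on every view call:

```elm
import Browser.Events

type Msg
    = Tick Float -- milliseconds since the previous animation frame

subscriptions : model -> Sub Msg
subscriptions _ =
    Browser.Events.onAnimationFrameDelta Tick

-- update would accumulate the elapsed time (or an angle) in the model,
-- and view would derive a fresh camera matrix from it, so the uniforms
-- change every frame while the mesh stays the same
```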
[00:10:28]
That's very cool.
[00:10:29]
So you're right.
[00:10:30]
So it's this declarative thing where you have your uniforms, which are these like global
[00:10:36]
values in WebGL and you declare them in your GLSL snippet and then you pass in values, and
[00:10:45]
Elm is going to check that the types match between the uniforms you've declared in your
[00:10:50]
GLSL and the ones you use in your Elm WebGL code.
[00:10:55]
And then you can change those values and the camera moves or the color can change or the
[00:11:02]
rotation can change.
[00:11:04]
And this type safety is not just for uniforms.
[00:11:07]
So basically you have two shaders, their type signatures have to align and also you have
[00:11:13]
a mesh that is defined by its vertices, like an attributes type attached to every
[00:11:19]
vertex, and those attributes will have to match with the input to the vertex shader.
[00:11:25]
The uniforms have to match between the vertex shader and the fragment shader and the uniforms
[00:11:31]
that you pass in from Elm.
[00:11:35]
And also the varyings coming from the vertex shader into the fragment shader, those values that are
[00:11:41]
interpolated, they also have to match.
[00:11:43]
So Elm verifies that all the types align together and it really helps to avoid some problems
[00:11:50]
that you might otherwise encounter when doing it in a raw JavaScript.
[00:11:56]
Or just the C language that is GLSL.
[00:12:00]
You still have to write GLSL if you want to use the raw Elm WebGL.
[00:12:07]
Is that part of GLSL, does that give you some of that type safety in itself but then Elm
[00:12:16]
gives you the bindings to the uniforms that it passes in?
[00:12:19]
Or you're saying if you use a JavaScript library...
[00:12:23]
If you were to do it yourself in JavaScript, then you would be getting runtime exceptions
[00:12:30]
that you'll have to...
[00:12:31]
So you'll have to run your code in order to see them.
[00:12:34]
Does it give it to you as soon as you start the GLSL and then it says, oh, the types are
[00:12:40]
a mismatch?
[00:12:41]
Does it compile the GLSL at runtime?
[00:12:45]
There is a bunch of operations you need to do in order to get it working.
[00:12:49]
There is a pretty big boilerplate, I guess.
[00:12:53]
You have to compile the shaders that may fail.
[00:12:58]
Then you have to link the two shaders into a program.
[00:13:02]
So you need to take the vertex shader and the fragment shader and link them into a program.
[00:13:06]
That may fail.
[00:13:08]
And then when you perform drawing operations, you have to correctly set all the buffers
[00:13:13]
and pass it to the program and run it.
[00:13:16]
So that may fail.
[00:13:18]
And the uniforms may not line up from being passed in from JavaScript and being run in
[00:13:23]
GLSL.
[00:13:24]
And when things go wrong, it's a runtime exception.
[00:13:28]
And usually you see just a white screen with some console errors.
[00:13:36]
I think I'm starting to get a picture of why people get very passionate, it seems, about
[00:13:42]
doing 3D stuff in Elm.
[00:13:44]
Because it sounds like Elm has really...
[00:13:48]
It's so interesting to me because I feel like I have my finger on the pulse of Elm stuff
[00:13:53]
a fair amount, except this stuff is such a blind spot for me.
[00:13:59]
But I see people getting really passionate about it.
[00:14:01]
And this must be one of the reasons.
[00:14:03]
I know some people who even used some raw WebGL before trying out Elm and they say,
[00:14:09]
wow, this is so much easier because of all these guarantees.
[00:14:15]
How are the compiler errors?
[00:14:17]
When you mess up either the GLSL or when you mess up the linking, are the compiler errors
[00:14:23]
good?
[00:14:24]
Helpful?
[00:14:25]
You mean coming out from the raw WebGL or the Elm ones?
[00:14:31]
I mean when you...
[00:14:33]
When you're using WebGL in Elm.
[00:14:35]
Yeah, when you use WebGL in Elm.
[00:14:37]
Yeah, I think they are.
[00:14:39]
It will be basically like, I want this type, but you pass this type.
[00:14:44]
Kind of error.
[00:14:46]
So I'm curious to know, and I'm wondering whether this is a good way to start a conversation
[00:14:52]
because when you need to do something in 3D, how do you think?
[00:14:56]
What do you start with?
[00:14:57]
Do you start with writing a bunch of vertexes and attributes or what is your process?
[00:15:05]
Right.
[00:15:06]
Like maybe we need to define what that something is because it may depend on that.
[00:15:12]
Like, so can you think of an example?
[00:15:16]
Let's say you want to draw a cube or a triangle.
[00:15:20]
So well, then the first thing I would think of is like, I need to define a mesh for the
[00:15:26]
cube.
[00:15:27]
So I need to visualize it in 3D and I need to write down coordinates for all the triangles
[00:15:34]
that form the cube.
[00:15:36]
Yeah.
[00:15:37]
So the mesh is just the, I want to say the physical representation of the 3D object,
[00:15:44]
just like the box, like where are the coordinates of every point?
[00:15:49]
That is a mesh, right?
[00:15:50]
In the simple form, yes.
[00:15:51]
But like what you say coordinate is, it's an attribute.
[00:15:54]
It's a record where you may have coordinates and you may have something else, whatever
[00:15:59]
you want to attach to it.
[00:16:01]
All right.
[00:16:02]
And a mesh is a list of, how do you say, triplet, like a tuple of three.
[00:16:08]
Right.
[00:16:09]
Yeah.
[00:16:10]
Triplet.
[00:16:11]
So it's a list of those.
[00:16:12]
And each of the components of that triplet is this record type that you define yourself.
[00:16:20]
Okay.
[00:16:21]
Which often contains the position or the original position and then some additional information
[00:16:29]
if needed.
[00:16:30]
And color.
[00:16:31]
Yeah.
[00:16:32]
Right.
[00:16:33]
Yeah.
[00:16:34]
I see that in this like simple cube example, there's like right front top, left front
[00:16:38]
top, left back top, et cetera, et cetera, for all the three dimensional points that
[00:16:44]
you would map out for a cube.
[00:16:46]
And then it's creating a face that is stretching from each of those with these three point
[00:16:52]
vectors.
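As a rough sketch, one face of such a cube could be written as two triangles over a user-defined Vertex record (the names Vertex, position and color are illustrative):

```elm
import Math.Vector3 exposing (Vec3, vec3)
import WebGL exposing (Mesh)

type alias Vertex =
    { position : Vec3
    , color : Vec3
    }

-- the front face of a cube: two triangles sharing an edge,
-- each triangle given as a triplet of attribute records
frontFace : Mesh Vertex
frontFace =
    WebGL.triangles
        [ ( Vertex (vec3 (-1) 1 1) (vec3 1 0 0)
          , Vertex (vec3 1 1 1) (vec3 1 0 0)
          , Vertex (vec3 1 (-1) 1) (vec3 1 0 0)
          )
        , ( Vertex (vec3 (-1) 1 1) (vec3 1 0 0)
          , Vertex (vec3 1 (-1) 1) (vec3 1 0 0)
          , Vertex (vec3 (-1) (-1) 1) (vec3 1 0 0)
          )
        ]
```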
[00:16:53]
Yeah.
[00:16:54]
Like when constructing a cube, I think you can be like more clever.
[00:16:58]
Maybe you define one, maybe define a square.
[00:17:02]
And then if you know like how to transform that, how to do like a rotation, maybe then
[00:17:07]
you can define all the sides of the cube.
[00:17:09]
I think there is in one of the examples, I think there was this kind of implementation.
[00:17:14]
Yeah, I can confirm.
[00:17:16]
Or you just write it down, like all the coordinates.
[00:17:20]
Wow.
[00:17:21]
There are so many different skills in this world.
[00:17:24]
Like there's the geometry and these mathematical transformations and the textures, the 3D stuff.
[00:17:33]
That's really interesting.
[00:17:35]
So if we talk about textures, that is another thing in WebGL.
[00:17:42]
So let's say if you want to have a textured cube, then you might want to assign texture
[00:17:50]
coordinates to each vertex of the face, which means basically where this vertex can be found on
[00:17:57]
a texture, on an image.
[00:17:59]
And then you can use this information in the fragment shader that determines the output
[00:18:05]
color and you can pass the texture as a uniform and then you can do a lookup using this coordinate.
[00:18:12]
You can find where that pixel is and take its color and return it.
[00:18:17]
Then you get like a textured rendering of a cube.
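A sketch of what that fragment shader could look like, assuming the texture is passed as a uniform and the interpolated texture coordinate arrives as a varying named vcoord (both names illustrative); the Texture value itself would be loaded separately, for example with WebGL.Texture.load, which returns a Task:

```elm
import Math.Vector2 exposing (Vec2)
import WebGL exposing (Shader)
import WebGL.Texture exposing (Texture)

-- looks up the pixel of the texture at the interpolated coordinate
-- and uses it as the output color of the fragment
texturedFragmentShader : Shader {} { texture : Texture } { vcoord : Vec2 }
texturedFragmentShader =
    [glsl|
        precision mediump float;
        uniform sampler2D texture;
        varying vec2 vcoord;

        void main () {
            gl_FragColor = texture2D(texture, vcoord);
        }
    |]
```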
[00:18:20]
Okay.
[00:18:21]
And that's how we apply a texture on a face or...
[00:18:25]
Okay.
[00:18:26]
So you have your mesh for your cube.
[00:18:29]
What else?
[00:18:30]
What happens next?
[00:18:31]
So then you need to think of the scene.
[00:18:33]
You need to set up a scene.
[00:18:34]
So you probably think of, do you need to move the cube around?
[00:18:38]
Do you want to have...
[00:18:40]
Where do you position the camera?
[00:18:42]
Where the camera is looking at?
[00:18:45]
And in the raw WebGL API, that would be all matrices that define this.
[00:18:50]
So you may want to have like a transformation matrix that you want to apply to your cube.
[00:18:57]
If you define a cube in its own local coordinate system, like I don't know, based on zero,
[00:19:03]
right?
[00:19:04]
Then you might want to move it around, or if you have many cubes, you
[00:19:09]
may position them in 3D.
[00:19:13]
That is defined by a matrix that transforms from the local into the world coordinate system.
[00:19:20]
Is there any convention with like the Y axis going downwards as it grows positive or does
[00:19:26]
it grow up as it grows positive?
[00:19:30]
There's like, usually I think I would use that up.
[00:19:36]
Okay.
[00:19:37]
Yeah.
[00:19:38]
So the intuitive thing.
[00:19:39]
X to the right and Y from the camera.
[00:19:43]
I think this is what I would use.
[00:19:46]
But I think you're not really, you're not restricted because you can do all these transformations
[00:19:52]
for the matrices.
[00:19:54]
Like you can flip the axis, you can do whatever you want.
[00:19:58]
Right.
[00:19:59]
And you could create some mapping of real world units.
[00:20:03]
Like I've heard Ian Mackenzie talk about trying to use real world units to model things.
[00:20:08]
Yeah, certainly.
[00:20:11]
So this is the first matrix, let's say the matrix that moves around objects in the world,
[00:20:18]
like positions them, rotates them.
[00:20:20]
You can compose, you can combine all these transformations in one matrix basically by
[00:20:25]
like first rotating and then moving.
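For instance, with elm-explorations/linear-algebra the two steps can be combined into one model matrix; note that with Mat4.mul the right-hand matrix is the one applied to the vertices first (the rotation here). A sketch with arbitrary numbers:

```elm
import Math.Matrix4 as Mat4 exposing (Mat4)
import Math.Vector3 exposing (vec3)

-- rotate around the Y axis first, then translate along X;
-- the combined matrix goes to the vertex shader as a uniform
modelMatrix : Float -> Mat4
modelMatrix angle =
    Mat4.mul
        (Mat4.makeTranslate (vec3 3 0 0))
        (Mat4.makeRotate angle (vec3 0 1 0))
```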
[00:20:29]
I feel like I'm missing something.
[00:20:30]
So you have your mesh, you say that it's at position 0, 0, 0, and then you first set a
[00:20:38]
camera.
[00:20:40]
Is that it?
[00:20:41]
No, like first you may want to move object around.
[00:20:46]
If you want it to be positioned at 0, 0, 0 and just render it, then you don't need that.
[00:20:52]
Then all you need is like a camera transformation.
[00:20:56]
So that is like there is another package that you have to use, which is elm-explorations/linear-algebra,
[00:21:04]
and that provides all the useful functions to define like...
[00:21:12]
To like manipulate matrices basically, right?
[00:21:15]
Yeah, and create some like various matrices that you might need, like a projection, like
[00:21:23]
a perspective projection matrix.
[00:21:26]
So what is a perspective matrix?
[00:21:28]
So there are two types of projection matrices that exist in linear algebra.
[00:21:34]
One is the matrix that like it takes your 3D thing and it projects it onto like a plane.
[00:21:41]
Yes, that's a definition of projection and what perspective projection does is it makes
[00:21:48]
it so that like parallel lines intersect.
[00:21:52]
If you prolong them, then they would intersect.
[00:21:56]
So you get this kind of realistic view and orthographic projection keeps the parallel
[00:22:02]
lines parallel.
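A sketch of both kinds of projection plus a camera matrix, using elm-explorations/linear-algebra (the specific numbers are arbitrary):

```elm
import Math.Matrix4 as Mat4 exposing (Mat4)
import Math.Vector3 exposing (vec3)

-- perspective projection: 45 degree vertical field of view, 16:9 aspect
-- ratio, near clipping plane at 0.01 and far clipping plane at 100
perspective : Mat4
perspective =
    Mat4.makePerspective 45 (16 / 9) 0.01 100

-- orthographic projection keeps parallel lines parallel
-- (arguments here: left, right, bottom, top, near, far)
orthographic : Mat4
orthographic =
    Mat4.makeOrtho (-10) 10 (-10) 10 0.01 100

-- camera at (0, 2, 6) looking at the origin, with Y as the up direction
camera : Mat4
camera =
    Mat4.makeLookAt (vec3 0 2 6) (vec3 0 0 0) (vec3 0 1 0)
```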
[00:22:03]
Okay, so it's so that you can imitate seeing from an eye.
[00:22:08]
Is that it?
[00:22:09]
All right.
[00:22:10]
So like a lens or different kind of lens?
[00:22:12]
Like would you imitate like a fish eye lens versus a telephoto lens type of thing with
[00:22:17]
those transformations or is that different?
[00:22:19]
I think like one of the inputs is an angle of view as far as I remember.
[00:22:29]
So you might want to like you might be able to do this, but I'm not really sure about
[00:22:34]
the lens transformations whether you can achieve them using the existing functionality.
[00:22:41]
Okay so we were talking about meshes and just to reiterate like a mesh is just part of a
[00:22:48]
shader.
[00:22:49]
So when you define your GLSL snippet in Elm, you're defining a mapping, you're defining
[00:22:58]
your mesh which are the points, the vertices.
[00:23:02]
You're defining your uniforms which are essentially the parameters that you're binding between
[00:23:08]
Elm and GLSL that you can pass in things that vary like camera position.
[00:23:13]
You're defining them, you're saying I want to have those uniforms, I want those attributes,
[00:23:20]
but you're passing them in from Elm, right?
[00:23:22]
Right, like it seems like it's almost like your model, right, for your GLSL.
[00:23:28]
So attributes is mesh and then uniforms is just like an Elm record and then you also
[00:23:37]
get two shaders and then you put it all together into what is called an entity in the Elm
[00:23:46]
WebGL API.
[00:23:47]
Then you have a list of those that you pass into the top level function that creates a
[00:23:56]
canvas element and like a DOM element so that becomes HTML.
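Putting those pieces together might look roughly like this, reusing the hypothetical vertexShader and frontFace mesh from the earlier sketches and assuming a fragmentShader with a matching uniforms type:

```elm
import Html exposing (Html)
import Html.Attributes exposing (height, width)
import Math.Matrix4 as Mat4
import WebGL

-- one entity = two shaders + a mesh + the uniforms record;
-- WebGL.toHtml turns a list of entities into a canvas DOM node
-- (fragmentShader : Shader {} { perspective : Mat4 } { vcolor : Vec3 }
-- is assumed to be defined elsewhere)
view : Html msg
view =
    WebGL.toHtml
        [ width 600, height 400 ]
        [ WebGL.entity
            vertexShader
            fragmentShader
            frontFace
            { perspective = Mat4.makePerspective 45 1.5 0.01 100 }
        ]
```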
[00:24:03]
And so this whole GLSL thing, the purpose of it is, it's a language that is compiling
[00:24:13]
down to highly performant stuff that's running on the GPU of your machine and so these like
[00:24:20]
uniforms are I mean it's compiling down to like very fast GPU instructions essentially,
[00:24:28]
right?
[00:24:29]
And so ELM is able to just sort of vary these few pieces but then it just runs very fast
[00:24:35]
GPU instructions from that, right?
[00:24:38]
Yes, there are numerous performance optimizations in this pipeline.
[00:24:44]
So first of all, all the meshes are cached, so whenever this WebGL.toHtml function,
[00:24:53]
whenever it receives a new mesh in one of the entities, it would upload it into
[00:25:01]
a GPU memory and cache it.
[00:25:04]
So when you call the same function, you pass the same mesh, it doesn't have to reupload
[00:25:10]
the data to GPU.
[00:25:13]
How does it detect whether it's the same?
[00:25:15]
Is it by reference or does it do an equality check?
[00:25:20]
That's a good question.
[00:25:21]
So I think in the past what used to be is the JavaScript object is being tagged with
[00:25:26]
an ID, but I think what we switched to recently is using a WeakMap in JavaScript, if you
[00:25:34]
know about this.
[00:25:36]
And the overall cache kind of or like a representation of what is cached, what's not is a part of
[00:25:44]
the virtual DOM node that is created for that canvas, for the WebGL canvas element.
[00:25:51]
And this node, it stores things like uploaded, like references to uploaded textures, like
[00:25:56]
all the buffers for attributes but the uniforms are not.
[00:26:00]
So the uniforms are changing from like you can set some different uniforms.
[00:26:05]
That's the whole purpose of them to be able to move around and the uniforms is usually
[00:26:11]
not that much data when compared to like a mesh that defines a very complex 3D object.
[00:26:19]
Yes, so that's why you try to have the uniforms change but not the meshes, right?
[00:26:25]
If you change your mesh 60 times a second, then you will have to upload it onto GPU every
[00:26:35]
time and that would make it slow.
[00:26:37]
So yes, this is like it's a really cool performance optimization but it's also it can be thought
[00:26:44]
of as a limitation like if you want to be changing the mesh instead of moving it around.
[00:26:51]
So you need to think of how you can achieve what you want by not changing the mesh but
[00:26:56]
by I don't know having multiple meshes and like positioning them in the way you want.
[00:27:03]
Then like varying your uniforms for example, right?
[00:27:06]
That would be because that's just running the same mesh that it's computed and it knows
[00:27:12]
the ways it varies based on those uniforms.
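In practice that often just means defining the mesh once, as a top-level value or in the model, and only rebuilding the uniforms record each frame. A sketch, assuming cubeTriangles is the list of attribute triplets and Vertex is the record type from the earlier mesh sketch:

```elm
import WebGL exposing (Mesh)

-- defined once at the top level: the same mesh value (and the GPU buffer
-- cached behind it) is reused on every call to view
cubeMesh : Mesh Vertex
cubeMesh =
    WebGL.triangles cubeTriangles

-- by contrast, calling WebGL.triangles inside view would produce a brand
-- new mesh value every frame and force a re-upload to the GPU;
-- per-frame motion belongs in the uniforms (a transformation matrix),
-- not in new vertex data
```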
[00:27:18]
But then at the same time, you don't want to have too many of them.
[00:27:24]
Like you don't want to have too many entities.
[00:27:28]
Because each entity internally results in a draw call and a draw call is like this final
[00:27:35]
execution of all the things together and it's an overhead.
[00:27:40]
So any call to GPU has some sort of latency, some overhead.
[00:27:48]
Like even internally what we do is we try to keep track of what the settings
[00:27:58]
or even the uniforms were, the previous ones that were set on the
[00:28:04]
WebGL context.
[00:28:06]
And if a new operation comes, we see whether it's the same as it was set.
[00:28:12]
Basically some sort of diffing, but like a very simple one.
[00:28:15]
And we do not do it if it is the same to reduce the number of calls to WebGL.
[00:28:22]
You prefer to do an equality check to see whether you need to make the call before rendering.
[00:28:31]
So let's go back to having small uniforms and meshes that don't change.
[00:28:37]
Let's imagine I want to draw a hand and I have one mesh per finger and one mesh for
[00:28:42]
the entire hand and I want to move the fingers.
[00:28:46]
Do I have a uniform that defines like the rotation of the finger or do I have like one
[00:28:55]
mesh for the entire hand?
[00:28:59]
That's an interesting question.
[00:29:01]
I haven't done this but I can think of how it's possible to do it.
[00:29:06]
I think what you sort of need is you need to define a skeleton of some sort.
[00:29:15]
And if you think that the origin position is of the hand, then I guess you can do like
[00:29:23]
the next like bone or something you can specify it in the coordinate system of the hand.
[00:29:29]
And then you can have some sort of tree built up that has like relative positions in it.
[00:29:37]
And then if you move the hand, then everything else will move with it.
[00:29:43]
But then what you need to do internally, you would need to flatten out all the transformations
[00:29:48]
because in the end you're rendering a list of things, right?
[00:29:52]
So every bone will have to be an element in that list.
[00:29:55]
So you would need to compose its transformation with the positional transformation of the
[00:30:01]
hand itself.
[00:30:03]
Which you would do using matrices?
[00:30:05]
Yeah.
[00:30:06]
Okay.
[00:30:07]
Yeah.
[00:30:08]
That's the raw WebGL approach because in the end you pass like a Mat4 and in the end
[00:30:14]
you use this like matrix multiplication on the GPU.
[00:30:19]
But there are other ways to model things like using a library called Elm Geometry from Ian
[00:30:25]
Mackenzie which is a completely different way of thinking of transformations.
[00:30:31]
There are types like Frame3d and that type gives you the possibility to transform between
[00:30:37]
two coordinate systems.
[00:30:39]
Interesting.
[00:30:40]
But at the end of the day GPUs only understand matrices, right?
[00:30:44]
Like that's basically…
[00:30:46]
Yeah.
[00:30:47]
So the way down when you want to render you would have to convert Frame3d into
[00:30:54]
Mat4.
[00:30:55]
Right.
[00:30:56]
Right.
[00:30:57]
To speak the language of the GPU.
[00:30:58]
Yeah.
[00:30:59]
Okay.
[00:31:00]
So if we're…
[00:31:01]
Okay.
[00:31:02]
So we've got our like our shader for our cube and we've got our faces defined which is our
[00:31:09]
cube mesh, right?
[00:31:11]
Our mesh is all of the faces of this cube.
[00:31:14]
And then we create an entity from that mesh which defines the faces of the cube and the
[00:31:20]
shader which…
[00:31:21]
Two shaders.
[00:31:22]
Two shaders.
[00:31:23]
Okay.
[00:31:24]
So what are those two shaders?
[00:31:26]
So the vertex shader and the fragment shader.
[00:31:29]
The vertex shader would work, would be called for each of the vertices and it will be passed
[00:31:35]
the attributes that we attached to all the points that define those triangles
[00:31:41]
from the mesh and the output is where things can be found on the screen in the clip space
[00:31:50]
coordinates.
[00:31:51]
So it's like from minus one to one on all axes.
[00:31:55]
So it's not like pixel coordinates.
[00:31:58]
And then the fragment shader runs and it runs for all the pixels within each triangle and
[00:32:06]
it returns a color which is RGBA and it's defined as also like from zero to one for
[00:32:14]
each component.
[00:32:15]
So R from zero to one, B from zero to one.
[00:32:19]
So the fragment shader runs for every pixel that would be seen on the screen?
[00:32:25]
Okay.
[00:32:26]
I mean it's not computing anything that wouldn't be seen on the screen because it's behind
[00:32:32]
the camera or something.
[00:32:33]
Yeah, so what like there are things.
[00:32:36]
Yeah.
[00:32:37]
So if a thing is outside of the visible area, I think there are different ways like you
[00:32:42]
can get rid of back facing triangles if you, okay, that's a, let me think a bit more.
[00:32:50]
I'll explain this.
[00:32:53]
When you define the projection matrix, you need to pass two values, which is the nearest
[00:33:01]
clipping plane and the furthest clipping plane.
[00:33:04]
Everything that falls out of this will not be visible.
[00:33:09]
If something is too close to the eye, then you don't show it.
[00:33:13]
Is that it?
[00:33:14]
Okay.
[00:33:15]
And what you can do in order to get things invisible is the output is like, so the output
[00:33:23]
from the vertex shader has X, Y and Z, and the Z coordinate, it doesn't affect where
[00:33:32]
the pixel is positioned because our screen is 2D, but it's still, you can still set like
[00:33:37]
a value that is less than minus one or more than one.
[00:33:42]
And that will mean that it's not going to be rendered, this vertex.
[00:33:47]
And I think this is what the projection matrix would do.
[00:33:52]
It will like map coordinates to being outside of the so called clip space, this visible
[00:34:00]
part of coordinates that are returned.
[00:34:03]
Like the range of coordinates that are considered visible when for the output of the vertex
[00:34:10]
shader.
[00:34:12]
And then another way of making things invisible is you can rely on the winding of the vertices.
[00:34:22]
You can rely on what?
[00:34:23]
So you have like counterclockwise or clockwise order, like positions of the vertices.
[00:34:29]
And then if you create an entity with settings, you can make it so that it only renders like
[00:34:36]
front facing or back facing triangles.
[00:34:39]
Is that in order to not have to compute too many things?
[00:34:42]
Is that just a performance optimization?
[00:34:44]
Like you want to draw a house, but you don't want to draw all the walls behind it.
[00:34:49]
So it's just a facade.
[00:34:54]
Same for the cube, I guess you don't want to render the inside of the cube.
[00:34:59]
You only want to render the outside if it's like sealed.
[00:35:05]
You can't look inside.
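A sketch of what that looks like with WebGL.entityWith and the settings module, assuming the vertexShader, frontFace mesh and a matching fragmentShader from the earlier sketches; cullFace back and the default depth test are the parts being illustrated:

```elm
import Math.Matrix4 as Mat4
import WebGL
import WebGL.Settings exposing (back, cullFace)
import WebGL.Settings.DepthTest as DepthTest

-- skip triangles whose winding says they face away from the camera,
-- and keep the usual depth test so nearer fragments win
cubeEntity : WebGL.Entity
cubeEntity =
    WebGL.entityWith
        [ DepthTest.default
        , cullFace back
        ]
        vertexShader
        fragmentShader
        frontFace
        { perspective = Mat4.makePerspective 45 1.5 0.01 100 }
```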
[00:35:08]
So I think this is kind of starting to click for me.
[00:35:12]
So like, okay, so we're creating an entity using WebGL.entity and we give it a vertex
[00:35:20]
shader, which we've talked about.
[00:35:22]
We give it a fragment shader, which is telling it how to color the faces of the cube.
[00:35:27]
And if that's just a shader, so it takes a uniform, we could have the uniform allow
[00:35:33]
us to control the color.
[00:35:35]
It could allow us to vary the color and bind that from Elm.
[00:35:38]
So if we wanted to, we could just say, depending on the face, give it a fixed color, or we
[00:35:44]
could give it a gradient color if we wanted to create that mapping.
[00:35:49]
And then we give it a mesh.
[00:35:50]
Yes, you can do coloring from uniforms too, or you can have them coming from the mesh
[00:35:57]
and pass as varying from the vertex shader into the fragment shader.
[00:36:02]
You have all the options.
[00:36:04]
I'm guessing the fragment shader is mostly used for lighting.
[00:36:09]
So you tell us where the sun is and then it makes some pixels lighter or darker based
[00:36:17]
on their position.
[00:36:18]
Is that it?
[00:36:19]
Yes, you can use it for this too.
[00:36:21]
Like for lighting, you would need to have a normal to the surface.
[00:36:26]
So a normal, it really depends on whether your object is smooth or not.
[00:36:33]
Let's say our cube is not smooth.
[00:36:35]
For that, I think you can even calculate the normal in the fragment shader, but you can
[00:36:41]
also have it attached to each of the vertices when you're defining the cube.
[00:36:45]
What is a normal?
[00:36:46]
It's like a unit vector from the surface, like going to the outside from the surface.
[00:36:54]
Okay, so it's like...
[00:36:56]
Pointing where the surface is facing.
[00:36:58]
Okay, yeah.
[00:37:00]
And then you have another, like you have a direction of the light or like this kind of
[00:37:07]
thing.
[00:37:08]
If you have directional light, like the simplest possible is like the simple flat shading where
[00:37:15]
you have the direction of the light.
[00:37:17]
And then when you're trying to determine what the color of the pixel should be, you can
[00:37:22]
check how these two vectors align.
[00:37:26]
Like if they are facing each other exactly, then I guess it will be highlighted.
[00:37:31]
But if they are facing in the opposite direction, sorry, if they are facing in the same direction,
[00:37:37]
then it's going to be in the dark.
[00:37:40]
Yeah, because you're pointing, the light is shining on the backside of the cube or of
[00:37:47]
that face.
[00:37:48]
Then it means it won't be shown.
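A sketch of that comparison in a fragment shader, assuming the interpolated normal and a surface color arrive as varyings and the light direction (a unit vector pointing from the light into the scene) comes in as a uniform; all names are illustrative:

```elm
import Math.Vector3 exposing (Vec3)
import WebGL exposing (Shader)

-- the dot product of the normal and the (negated) light direction says
-- how directly the surface faces the light; clamp at zero so surfaces
-- facing away just get a small ambient base level
litFragmentShader : Shader {} { lightDirection : Vec3 } { vcolor : Vec3, vnormal : Vec3 }
litFragmentShader =
    [glsl|
        precision mediump float;
        uniform vec3 lightDirection;
        varying vec3 vcolor;
        varying vec3 vnormal;

        void main () {
            float intensity = max(dot(normalize(vnormal), -lightDirection), 0.0);
            gl_FragColor = vec4(vcolor * (0.2 + 0.8 * intensity), 1.0);
        }
    |]
```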
[00:37:50]
Okay.
[00:37:51]
Okay, so this is cool.
[00:37:52]
So like the varyings, so this is like the sort of third piece of a shader.
[00:38:00]
It can have a varying which is an output from the shader.
[00:38:03]
Yeah, so a vertex shader has varyings as an output.
[00:38:08]
The fragment shader has them as an input.
[00:38:11]
Right, right.
[00:38:12]
So for example, if you have an output of the color given which face of the cube we're on,
[00:38:22]
then that varying output from the vertex shader would be, that color would be an input to
[00:38:30]
the fragment shader and then it would just take that color and possibly transform it,
[00:38:34]
but basically paint that color.
[00:38:36]
Yes, basically.
[00:38:39]
It doesn't take it exactly right because you set it onto three vertices of a triangle.
[00:38:45]
Right, remember the vertex shader is working with only the corner points, whilst the fragment
[00:38:51]
shader is filling the whole triangle.
[00:38:56]
So what you get there is an interpolation, an interpolated result based on how close
[00:39:03]
you are to each of the vertices.
[00:39:07]
So like if it's very close, if it's at the vertex itself, then it's going to be the color
[00:39:12]
that is specified for that vertex.
[00:39:13]
If it's somewhere in the center and you have specified different colors for all of the
[00:39:18]
three vertices, then it's going to be like an average of colors.
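The fragment shader counterpart to the earlier vertex shader sketch is very small: it just declares the same varying and uses the already-interpolated value as the pixel color (the perspective uniform appears in the type only so the two shaders share one uniforms record):

```elm
import Math.Matrix4 exposing (Mat4)
import Math.Vector3 exposing (Vec3)
import WebGL exposing (Shader)

-- vcolor arrives here already interpolated across the triangle
fragmentShader : Shader {} { perspective : Mat4 } { vcolor : Vec3 }
fragmentShader =
    [glsl|
        precision mediump float;
        varying vec3 vcolor;

        void main () {
            gl_FragColor = vec4(vcolor, 1.0);
        }
    |]
```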
[00:39:23]
Can you determine how that interpolation is made?
[00:39:26]
Can you use different interpolation functions or is it always the same?
[00:39:29]
It's the same and I can't tell how it works.
[00:39:33]
I think it's like linear.
[00:39:35]
And I'm guessing if you want to be more precise in how the values change from one vertex to
[00:39:43]
another, then you need to multiply the number of triangles, right?
[00:39:48]
So if you have, so let's say your cube is really smooth, so let's say your cube has
[00:39:56]
a very uniform color, then you draw two triangles for one face.
[00:40:01]
But if you want it to be a lot smoother, then you can cut it up into more triangles and
[00:40:07]
have the colors change slightly more.
[00:40:11]
Yeah, if you want to achieve some weird coloring that is not a simple linear interpolation,
[00:40:19]
then you can have, I guess, more segments to curve your own gradient.
[00:40:27]
Not sure how that would work.
[00:40:29]
Maybe you'd rather have a texture and then you map it to all the four vertices of each
[00:40:35]
side.
[00:40:36]
Yeah, or if it's not a cube, but it's more bumpy, then you need to make a lot of triangles
[00:40:43]
for each bump.
[00:40:45]
And then the lighting can have different effects on each triangle separate from another.
[00:40:51]
Whereas if you have only two triangles for an entire face, then what you can do with
[00:40:56]
it becomes very simple.
[00:40:59]
Too simple, maybe?
[00:41:00]
For smooth surfaces, it's like you wouldn't set the normals the same way as you would
[00:41:06]
do for the cube.
[00:41:07]
For the smooth surface, the normal would be pointing from...
[00:41:12]
It's sort of like as an average of all the normals of all the triangles that start with
[00:41:19]
this point.
[00:41:20]
Let me think of how to say it better.
[00:41:26]
If you have a point on such a bump, you might have multiple triangles around this point
[00:41:33]
and they will all have their normals as the strict normals pointing from this triangle.
[00:41:39]
But what you want to attach to that vertex is not either of them, but something in between
[00:41:45]
to have a smooth normal.
[00:41:48]
And then those normals can be passed as varyings and then the normals themselves are going
[00:41:54]
to be interpolated.
[00:41:56]
So what you will be getting in the fragment shader is a normal that is smooth.
[00:42:04]
Because if you use normals for a cube, flat shading, then you will see the rendering,
[00:42:15]
the output will be looking very low poly.
[00:42:19]
All the edges will be very visible because the normals are going to be jumping from one
[00:42:23]
triangle to another.
[00:42:24]
It's not going to be smooth.
[00:42:26]
So the lighting is not going to be smooth because of that.
[00:42:29]
But it really depends on what you are trying to achieve, what kind of look.
[00:42:33]
Yeah, not a cube at this point.
[00:42:37]
So let's say I want to draw the normals or to have the normals be sent in the varyings
[00:42:43]
from the vertex shader to the fragment shader.
[00:42:47]
How do I go about finding GLSL documentation on how to do that?
[00:42:52]
Because I need to write the shader, right?
[00:42:56]
That is not Elm code.
[00:42:57]
So where do I find documentation?
[00:42:59]
How do I get started to do that?
[00:43:02]
I learned about this by looking at some examples.
[00:43:06]
So there are general examples of how to use WebGL and all the GLSL code is the same there.
[00:43:15]
You can look it up there or you can even find some examples in Elm that are pretty simple.
[00:43:21]
Yeah, with this normal thing, I think it's slightly more complex than just rendering
[00:43:29]
a cube.
[00:43:31]
If you want to get into lighting, then it gets more complicated.
[00:43:36]
Maybe at this point, you want to think of like, do you really want to invest a lot into
[00:43:41]
learning 3D graphics in WebGL?
[00:43:44]
Or maybe you'd rather switch to some of the existing packages like Elm 3D scene that would
[00:43:53]
do it for you.
[00:43:55]
You can set up a scene with lights and you'll get nice physically based rendering out of
[00:44:00]
that.
[00:44:01]
So GLSL is a special syntax in Elm, but can you construct it programmatically?
[00:44:08]
Can you create one from a string or something?
[00:44:13]
It's like statically compiled.
[00:44:15]
Okay, so you can't have a package that creates...
[00:44:19]
So you can't have an Elm package that creates a GLSL function programmatically based on
[00:44:26]
user defined inputs?
[00:44:29]
No, there used to be an unsafe API, but you know, in this case, you have a string and you
[00:44:37]
pass that string into runtime and you interpret that.
[00:44:41]
So you would lose all the guarantees.
[00:44:43]
So at some point this was removed and this thing is what is currently being explored
[00:44:49]
of how to have a way to nicely compose shaders or to be able to share some bits of code,
[00:44:56]
like a function that you can use in your shader that does something useful.
[00:45:01]
So I think that is currently being explored.
[00:45:04]
So is the main use case for the Elm WebGL stuff games?
[00:45:10]
I mean, games are the obvious one.
[00:45:12]
I know Ian Mackenzie is very passionate about CAD type stuff like modeling physical objects.
[00:45:21]
What are the types of use cases people would use this for?
[00:45:24]
It's very broad, I think.
[00:45:26]
Like games, yes, or maybe even data visualization or anything that would require 3D visualization.
[00:45:36]
You can think of an advert or like a product page where you have something physical, like
[00:45:43]
I don't know, you're selling phone and you have like a 3D model on the page that lets
[00:45:49]
your users rotate it and look at it in 3D.
[00:45:54]
I think Ian even mentioned a spinner, you can have like an Ajax spinner rendered in
[00:46:00]
it.
[00:46:01]
Why not?
[00:46:02]
Yeah, I think, is the kite project that Erkal Selman created also in WebGL?
[00:46:09]
I think it might be.
[00:46:11]
No, that is using SVG.
[00:46:13]
That's SVG, okay.
[00:46:15]
But Erkal has been doing some really cool visualizations using Elm 3D scene.
[00:46:22]
Data visualizations?
[00:46:23]
No, more like creative coding kind of thing.
[00:46:27]
Because with a functional API, you get all the function composition, all the things, you
[00:46:33]
can produce generative art.
[00:46:36]
You can do random generators and yeah, very cool.
[00:46:41]
So what are some of the cool projects that people should take a look at to get inspiration
[00:46:46]
and look at some examples?
[00:46:48]
So one thing I would share is there is a very nice overview of almost all the 3D graphics
[00:46:56]
projects built in Elm.
[00:46:58]
Luca, I hope I pronounced his second name correctly, Luca Mugnaini, put together.
[00:47:06]
It has all the games and art and all the crazy things built in Elm.
[00:47:12]
I think that is a really cool overview of just seeing what's possible.
[00:47:17]
There are more libraries.
[00:47:20]
There are other projects that are more like libraries.
[00:47:23]
So for example, Roman Potasov, his GitHub is justgook.
[00:47:29]
He implemented a WebGL playground that is very similar to the package that Evan implemented,
[00:47:41]
but it uses WebGL under the hood.
[00:47:43]
A lot of things can be put together using that package and it's very simple.
[00:47:48]
Another thing that I would recommend to check is Ian's Elm 3D scene and all the examples
[00:47:55]
there because it's a high level API.
[00:47:59]
You don't need to deal with shaders.
[00:48:01]
You can put together 3D things using his libraries.
[00:48:06]
Wow, cool.
[00:48:07]
If you'd recommend someone to start with 3D in Elm, you would recommend checking out Elm
[00:48:13]
3D scene or would you recommend them to use Elm WebGL to start with?
[00:48:18]
I think you can start with Elm 3D scene.
[00:48:20]
I think it has a friendlier API, but it really depends; learning the basics also
[00:48:27]
has its advantages because you're more flexible in what you can do.
[00:48:33]
But if you have a use case that maps to what's possible in Elm 3D scene, then that's going
[00:48:38]
to be an easier path.
[00:48:40]
Another project that I wanted to highlight is a project from Tomas Latal.
[00:48:46]
I think he just recently made it public.
[00:48:49]
It's Elm CSG.
[00:48:50]
CSG stands for constructive solid geometry.
[00:48:54]
What this package allows is that you can construct 3D objects using primitive 3D objects and
[00:49:01]
Boolean operations between them.
[00:49:04]
It's very early stages.
[00:49:07]
It's work in progress, but there are some really cool examples.
[00:49:11]
You can make a chess piece with it using some cylinders.
[00:49:17]
It's easier to create 3D shapes using this method than writing down or constructing coordinates
[00:49:24]
yourself.
[00:49:26]
If you have a cylinder, a sphere, a cube, then you can subtract them or group them and
[00:49:34]
then produce another shape using those operations.
[00:49:39]
Because you don't have those shapes in WebGL, everything is a triangle.
[00:49:44]
There are no shapes in WebGL.
[00:49:47]
There are shapes in Ian's package.
[00:49:51]
There is a project that I created that lets you load OBJ files.
[00:49:56]
You can draw things in Blender using 3D graphics software, like 3D design software, and then
[00:50:06]
output an OBJ file and then load it into Elm.
[00:50:11]
That opens a lot of other possibilities for creativity.
[00:50:18]
You can prepare your game assets and then load them into Elm.
[00:50:23]
Using ports, decoders, and stuff like that?
[00:50:28]
It comes with this package, what is it called, an Elm HTTP plugin or this decoder.
[00:50:38]
You can plug it into elm/http and it would load and pass it to you as a TriangularMesh, which
[00:50:48]
is a type that is supported in Elm 3D scene.
[00:50:54]
You get it straight into the format that it expects and then you can render things using
[00:50:59]
Ian's Elm 3D scene.
[00:51:01]
I have plenty of examples like falling ducks, for example.
[00:51:07]
That's awesome.
[00:51:08]
Yeah, Luca's Medium blog post is really cool with all these demos.
[00:51:13]
People should definitely check that out to get a sense of what people are doing with
[00:51:16]
this stuff.
[00:51:18]
I remember when Elm Japan was announced, which is a little sad to talk about since it got
[00:51:25]
cancelled for obvious reasons.
[00:51:27]
Don't say it.
[00:51:28]
I know.
[00:51:29]
I know.
[00:51:30]
But I remember the website was so cool.
[00:51:35]
I wonder if it's just this little floating Tokyo city and cityscape with cranes moving
[00:51:43]
around and you can see it floating in the air.
[00:51:47]
I think you could change it from night to day or something like that.
[00:51:51]
I love these delightful little visualizations for conference websites and stuff.
[00:51:55]
It really makes things fun.
[00:51:57]
I wonder if that was created with Blender or with raw WebGL or what?
[00:52:03]
As far as I understand, I think it's not really using WebGL for this.
[00:52:07]
I think it's using SVGs.
[00:52:09]
But what you can actually do is you can render 3D things using SVG.
[00:52:16]
You can basically do all the projections and all the coordinate transformations yourself
[00:52:21]
in this case.
[00:52:23]
There are packages that you can use in order to do this sort of thing.
[00:52:27]
I think the main problem with this is that you will have to deal with the Z index yourself,
[00:52:34]
with the Z buffer yourself.
[00:52:35]
You have to render one thing after another in order of how close it is to the camera,
[00:52:41]
or intersections would not be possible easily.
[00:52:45]
Wow.
[00:52:46]
Yeah, it does say it's created with SVG.
[00:52:49]
That's really fascinating because it sure looks like a nice 3D model there.
[00:52:54]
Interesting.
[00:52:55]
So all these matrices that do the coordinate transformation, you can multiply the coordinates
[00:53:02]
by these matrices to transform them yourself and then render the final output, the final
[00:53:07]
result using SVG.
[00:53:11]
It might be slower in some cases, but if it's a simple illustration like this, a simple
[00:53:17]
drawing like this, then it…
[00:53:20]
When you use Elm Explorations WebGL, it renders using a canvas, right?
[00:53:25]
An HTML canvas.
[00:53:26]
Okay.
[00:53:27]
Yeah.
[00:53:28]
So WebGL.toHtml is the function that puts this DOM node that is a canvas, that ends
[00:53:34]
up being a canvas element.
[00:53:36]
And you can even pass attributes to it.
[00:53:40]
Yeah, regular HTML attributes.
[00:53:41]
Yeah, like a width and height style.
[00:53:45]
What's the best place for people to ask questions if they're getting started with this stuff?
[00:53:50]
Is the game jam…
[00:53:51]
I know there's a pretty active game jam channel in the Elm Slack.
[00:53:54]
Is that a good place for people to ask?
[00:53:56]
Yeah, the game dev.
[00:53:57]
Oh, there's game dev too and then there's the game jams, right?
[00:54:01]
Yes.
[00:54:02]
No, I think it's the only channel.
[00:54:06]
We used to run game jams.
[00:54:10]
This is where we announced them, but the channel is called game dev.
[00:54:15]
And then on top of that, there is the WebGL channel.
[00:54:20]
That might be more appropriate if you're not into games, but some people ask questions
[00:54:26]
in both and it's totally fine.
[00:54:28]
Awesome.
[00:54:29]
Yeah, there's this whole hidden community that's been off my radar, but it looks like
[00:54:33]
a lot of fun.
[00:54:36]
No Dillon, don't start running games.
[00:54:39]
You have tools to make.
[00:54:41]
Actually games are how I got into WebGL in the first place.
[00:54:46]
We were working on a game and that was slow and we tried to render it with WebGL.
[00:54:51]
That was using WebGL in 2D.
[00:54:55]
You can do 2D, it's not restricting you.
[00:54:59]
And then I thought, maybe it would be cool to do something in 3D too.
[00:55:05]
And I created the Rubik's cube game.
[00:55:09]
And then after that, okay, well, where can I go from this?
[00:55:14]
Maybe I can do a physically simulated dice.
[00:55:19]
And things got complicated.
[00:55:23]
You can layer so many things into it.
[00:55:27]
Physics is a whole other level.
[00:55:30]
Creating things in 3D is one thing, but having bodies interact with real world physics.
[00:55:37]
It seems like a lot of game development is done using Unity and these sort of game engines
[00:55:42]
that do abstract away a lot of the details, I imagine.
[00:55:46]
But yeah, physics.
[00:55:48]
It was quite the ride getting it to work.
[00:55:54]
I didn't start from scratch.
[00:55:56]
I looked at Cannon.js, but that is a library in JavaScript that is very object oriented.
[00:56:03]
Oh, so it didn't help.
[00:56:07]
It did help a lot.
[00:56:11]
It's very clear how it's written.
[00:56:13]
So I was able to read and understand the code.
[00:56:17]
But a lot of algorithms, they are very imperative and converting them into a purely functional
[00:56:26]
style was quite a challenge.
[00:56:29]
Just out of curiosity, because apparently Elm is able to send things to the GPU.
[00:56:35]
Can you send computations to be done on the GPU in Elm?
[00:56:40]
Or is it just WebGL?
[00:56:42]
No, the current WebGL is only letting you render things.
[00:56:49]
And I think arguably the way Elm APIs are done, if there was a thing that would let
[00:56:56]
you run some simulations or computations on the GPU, I think it would probably have a
[00:57:01]
very different API from the graphics oriented API.
[00:57:05]
Right.
[00:57:06]
James Carlson gave a talk a while back about using Futhark to do GPU computations from
[00:57:15]
Elm.
[00:57:16]
I think it was using ports.
[00:57:18]
Yeah, it's an interesting space.
[00:57:20]
I mean, a lot of things are changing right now.
[00:57:23]
So I think WebGPU is being rolled out in browsers, but not in all of them.
[00:57:31]
WebGPU?
[00:57:32]
Yeah, WebGPU is a lower level API, the new API for interacting with GPU and that has
[00:57:41]
broader range of use cases.
[00:57:43]
It makes things possible that were not possible before.
[00:57:48]
But the current Elm API is based on WebGL 1.
[00:57:53]
And the current pipeline it's using has some limitations.
[00:58:02]
For example, what you cannot do is you cannot do multipass rendering, which is render this
[00:58:10]
thing on a texture and then take that texture and then use it in the subsequent rendering.
[00:58:18]
And this technique is used a lot to achieve things like shadow maps, which is a way to
[00:58:26]
render shadows, which is like a really clever way to do them.
[00:58:31]
So you render the scene from the light's point of view.
[00:58:37]
So then you will see what's highlighted, what's in shadows.
[00:58:41]
And then you render it from the camera point of view, but you use that as a map to figure
[00:58:46]
out what is in the shadows, what is highlighted.
[00:58:50]
Or things like HDR or things like, well, there are a lot of use cases that would require
[00:58:57]
this.
[00:58:58]
And you're saying that this two pass system is not possible with WebGL?
[00:59:03]
With the current API of WebGL, you only do one pass.
[00:59:07]
Sorry, of WebGL or Elm WebGL?
[00:59:10]
Elm WebGL.
[00:59:11]
Okay.
[00:59:12]
And we are still exploring possible ways of how to achieve multipass rendering.
[00:59:20]
There are some ideas.
[00:59:21]
I think Ian has some ideas.
[00:59:23]
I did like a rough prototype of this.
[00:59:27]
And so I was able to prove that this is possible, but the API of that is not good.
[00:59:36]
The most problematic thing about this is you have to allocate all this memory for these
[00:59:41]
intermediate buffers in order to store the intermediate renders.
[00:59:46]
And the API is declarative in a way.
[00:59:49]
You only have one way to pass things in.
[00:59:51]
You don't have a way to say when to remove something from the...
[00:59:56]
When to clear the allocation.
[00:59:58]
So you don't know whether you rendered this onto a texture.
[01:00:04]
You don't know how long are you going to need this texture.
[01:00:07]
Is it the result of this rendering?
[01:00:10]
So is it some intermediate result that is still rendered 60 frames per second, or is it like a
[01:00:18]
texture map, you render it once and then you're going to keep it forever and use in all the
[01:00:26]
subsequent renderings.
[01:00:28]
It's really hard to tell.
[01:00:29]
There are multiple use cases and coming up with an API to support all of that is very
[01:00:35]
challenging.
[01:00:36]
Well, I feel like I've got a lot better grasp of 3D stuff in Elm.
[01:00:42]
So there's so much more to learn, but I feel like I've finally started to wrap my head
[01:00:48]
around some of the basics.
[01:00:49]
So this was great.
[01:00:51]
So what should people take a look at if they want to dive in more?
[01:00:57]
We've got the Elm Explorations WebGL package that we've been talking about.
[01:01:00]
The docs are very thorough.
[01:01:02]
So first read the docs.
[01:01:05]
Then check the examples.
[01:01:09]
So there are examples in this package, like on GitHub of this package.
[01:01:14]
I think some of them have been even published on the elm-lang website.
[01:01:20]
So you can see them there.
[01:01:21]
So yeah, play with those.
[01:01:23]
And then at some point you might need to learn a bit more about the underlying things.
[01:01:32]
And there's Learning WebGL, and there is a GitHub repo from, I think, Nacho Martin.
[01:01:40]
He tried to follow these lessons and do things in Elm.
[01:01:49]
So you can see his intermediate results of completing these lessons in Elm.
[01:01:57]
But if you don't want to, then Elm 3D scene, it has its own API and it also has a very
[01:02:05]
good documentation.
[01:02:06]
It even has screenshots of what you see if you use this setting.
[01:02:12]
And you've got your Elm WebGL playground repo.
[01:02:15]
This is what I used in order to play with the API and see how to do things.
[01:02:21]
Because the very first thing, if you want to do a 2D thing, is how can you render a
[01:02:27]
sprite on the screen?
[01:02:28]
That's a very basic thing.
[01:02:31]
Just like a 2D rectangle with an image on it.
[01:02:35]
I have an example like this.
[01:02:37]
Yeah, and you've got lots of games.
[01:02:40]
You've given many conference talks about this stuff.
[01:02:43]
So if people want to see more advanced stuff, less fundamentals and intro to the basics
[01:02:50]
of WebGL, they should definitely check out your conference talks.
[01:02:53]
We'll link to some of those as well.
[01:02:54]
Or just talk to me directly.
[01:02:58]
Even better.
[01:02:59]
That's great.
[01:03:00]
Yeah.
[01:03:01]
I'm unsoundscapes, on the Elm Slack or on Twitter.
[01:03:05]
I'm really glad we got to sit you down and ask all of our dumbest questions.
[01:03:11]
So thank you for being patient with us and walking us through fundamentals.
[01:03:16]
And I hope some other 3D newbies found it useful as well.
[01:03:19]
Yeah, thank you.
[01:03:21]
Thank you very much for inviting me.
[01:03:24]
Very glad to be useful.
[01:03:25]
It was a pleasure.
[01:03:26]
Yeah.
[01:03:27]
Thanks so much, Andrey.
[01:03:28]
And Jeroen, until next time.
[01:03:31]
Until next time.