Shader Demos
Ott Adermann
Project Plan
The main goal of this project is to become more familiar with writing shaders, so that I can create nicer-looking visuals for computer games. Hopefully, by the end of the journey, I will be able to discern when writing separate shader code for an object is a good decision, and doing so will feel as natural as writing a regular script.
If all goes well, I would like to split the milestones into two halves.
In the first half, I will look at the process of writing shaders in a lower-level environment, which will include working with a 3D API directly.
I have initially chosen Vulkan to be this API for two main reasons. The first is that Vulkan supposedly operates closer to the way graphics cards actually work these days, as opposed to OpenGL, which carries a lot of oddities left over from its nearly 30 years of use. For the same reasons, Vulkan is generally also supposed to be faster. The second is that Vulkan is a new API, and because it was first announced as the next-generation OpenGL, it might well largely replace OpenGL within a few years or a decade.
A third, minor reason is that Vulkan, like OpenGL and WebGL, uses GLSL as its shader language (in Vulkan's case, compiled to the SPIR-V bytecode format). These are also the APIs that are collectively gaining popularity and that run on multiple operating systems, such as Windows, Linux, Mac, and Android, as opposed to Microsoft's Direct3D with HLSL, which is losing popularity and runs only on Windows.
A complication I can foresee arising is that Vulkan might be more difficult than OpenGL. If dealing with Vulkan proves too time-consuming, I will switch to WebGL and Three.js, which should let me focus more on the shaders themselves.
I would also note that I will be dealing with Vulkan in C++, and (should Vulkan prove too difficult) with WebGL in JavaScript. I've used both before, but I'm comfortable with neither, so that is an additional layer of difficulty for myself.
In the second half, I'll try out shaders in a higher level environment, namely Unity.
Unity will let me skip basically all the hassle of dealing with an API directly, and will also allow much faster creation of the world in which I can develop, test, and showcase the various shaders.
Unity uses its own shader language, based on HLSL, which is translated into other shader languages as needed. Unity also has what it calls a "surface shader", which isn't an actual shader stage in the graphics pipeline, but which allows tweaking the look of objects without changing the overall lighting Unity provides out of the box.
I also hope to have time to check out two big new graphics-related features in Unity 2018. One is the Scriptable Render Pipeline, which lets users define their own rendering configurations in C#, hopefully a much less complicated process than doing so directly through an API. The other is the Unity Shader Graph, which enables creating shaders without writing any code at all, through a visual interface instead.
I am curious to see how these higher level options compare to doing similar stuff directly in the API / shader code.
These two halves may be split unequally, depending on how the project develops. Since I am mostly unfamiliar with the technologies I will be using, it's very hard to estimate how long various things will take me and how far I can get.
During the project I will be learning both how to write the code for the various shader types and what practical purposes each of them could serve. As such, I don't yet have a list of things I want to try my hand at implementing.
I will of course be looking at vertex and fragment shaders, but possibly also tessellation and geometry shaders, as well as Unity's unique surface shader.
Milestone 1 (09.03)
The mighty triangle
Goals
The goal of the first milestone is to get an application running in Vulkan. That application has to display a colored triangle at the very least. While that may seem like a very easy goal, there is actually a lot of setup work to be done before a Vulkan application can even launch. It is my intention to go over the entire setup process step by step and gain at least some understanding of why each step is necessary. A further complication is my relative inexperience with C++; I will be developing those skills during this milestone as well.
Development notes and results
I started off trying to follow the suggestion of skipping learning C++ and instead writing Vulkan code in a more familiar language, C#, through bindings. I found three implementations of that, but each had problems. For one, all were marked as pre-release, indicating there was still work to be done on them. I found no documentation on any of them indicating how they should be used. And for the one I actually tried to get working regardless, VkSharp, even its own example code didn't work, due to what I assume was some version mismatch. Time spent, no progress made.
So I continued on the previously planned path and worked through the cplusplus.com tutorial with moderate haste. A lot was familiar (although with notably different syntax) from having learned other languages in the past, but C++ also had some concepts that I had to familiarize myself with, the most prominent one being pointers.
Finally, I could download Vulkan, set up my development environment in Visual Studio, and start following the tutorial I had picked out.
Sadly, by this time, the deadline was already close, so I just had time to skim over the tutorial to meet the requirements for the first milestone. It does seem like a very good tutorial though, and despite all the scary syntax, it makes everything seem easy. It takes time to explain everything being done, and also provides a bit of background information.
So the result of the first milestone, as promised:
Milestone 2 (23.03)
Vertex and Fragment Shaders
Goals
During the second milestone I will be finishing with the tutorial I started in the last milestone. As the title of this milestone suggests, I will also be taking a longer look into vertex and fragment shaders.
While even the triangle-rendering application includes some form of vertex and fragment shaders, as do almost all applications that display anything, I will be making at least one fragment shader that displays something other than just color, basic lighting, or textures, and at least one vertex shader that does something other than the usual transformations to post-projection space.
Current possibilities for fragment shaders that I've thought of include:
- infinite detail "textures"
- texture combining
- a lighting model with support for more advanced features like roughness, metallic, and bump maps
- maybe some other interesting ideas you or the internet have
And my current ideas for vertex shaders:
- mesh distortion (most everything falls under this category, maybe I can animate a flag, or make some flora sway in the wind)
- I've no idea how to do this, but I want to
Development notes and results
The Vulkan Monstrosity
I continued from last time with moderate enthusiasm to finish the Vulkan Tutorial I had started. After about the first four hours, my enthusiasm had dropped considerably. I had spent four hours, but I wasn't very far. Plenty of time still left, but plenty to go as well. I considered just skipping working my way through the tutorial, and just copy-pasting the code and working from there to just do my shaders. However, I came to a realization that I kind of need to understand the C++ code written to actually get my program to use the shaders and geometry properly. Somewhat fatigued, I went back to the tutorial.
After about another four hours, I considered giving up. A lot was done, but a lot still needed to be done as well. Working all day is stressful, but I wasn't sure if I had enough time otherwise to finish before the deadline. Still, what was I going to do? I hoped I had about half the tutorial still to go, and with so much time invested already, I didn't want to just waste it all. I continued.
I wasn't really tracking time, but four hours seems like a nice approximation of each of those chunks of continuous work I was doing. Another such chunk had passed now. At this point I was just tired. The entire tutorial was like a bad joke. The triangle was like a false promise. Always so close, yet never in reach. Each chapter promised that we just need to do one more thing. The window, the instance, the surface, the physical device, the logical device, the swap chain, the image views, the render pass, the pipeline, the frame buffers, the command pool, the command buffers... It never ends. But what other choice do I have now? I must carry on.
Our familiar sixth-of-a-full-day had passed again, and this time... This time I was done. I was done with the triangle, but I was also done with Vulkan. I had just switched to copy-pasting code snippets now, reading over the tutorial text, realizing I hadn't really remembered it, reading it again... I was out of determination. I just wanted to write shaders, not... Not this. The triangle wasn't even real. It was like a quick hack or something apparently - created with hard-coded shaders for that particular purpose. The next segments would go on to describe how to do this properly, even finishing off with importing entire models. But no, I was done. This was not what I was here for, and I had to put my lost time behind me. I had to write shaders, and with Vulkan, I wasn't going to get there. Not the proper way at least.
To anyone reading this who is considering trying out Vulkan for the first time, here's my advice:
Have a lot of time.
Vulkan isn't incredibly difficult in my eyes, but it is incredibly verbose. You're going to be spending hours upon hours on small tasks with very little payoff in terms of seeing results.
If you want to do X, but in Vulkan, then stop right there. Don't do it. If you want to do X, do X. Do it in a familiar, or at least a simpler environment. If you want to learn Vulkan, do it separately.
Milestone 3 (06.04)
Vertex and Fragment Shaders (again)
Goals
Due to the catastrophic failure of the last milestone, I have to rethink what I'm going to do in this project and when. I'll be switching to the second half of the project (shaders in Unity) early and redoing my second milestone.
To reiterate and to make sure I will accomplish the milestone this time:
I will make one custom vertex shader and one custom fragment shader for Unity. I will additionally make a basic scene in Unity to demonstrate the results.
Development notes and results
I started off by following one of Unity's few video tutorials on writing shaders, reading some of Unity's documentation on the side.
Shaders in Unity are technically written in ShaderLab code, which acts as a wrapper around the usual shader code, by default written in HLSL (the syntax for which I looked up on MSDN).
My first impression is that Unity's official documentation on its shaders is rather poor, and a lot of the time it was easier to just copy seemingly relevant bits from existing examples to make things work.
I am largely confused by how shaders work in Unity. When creating a new shader, the editor offers four default types - Unlit, Standard Surface, Image Effect, and Compute. But in reality, these are all the same type of file, using the same syntax, differentiated only by their content, I think.
I tried making an Unlit Shader, in which I used a vertex and a fragment shader, and it felt fairly similar to how I've used shaders previously in OpenGL, for example. One difference was that all the shaders were defined one after another in the same file. There were also some Unity helper functions, syntax for exposing properties in the editor, subshaders, passes, LOD, and other things...
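For illustration, here is a minimal sketch of that structure: the ShaderLab wrapper (Properties, SubShader, Pass, LOD) around the HLSL vertex and fragment functions. The names and the flat-color behavior are my own placeholders, not the shader I actually wrote.

```
Shader "Custom/MinimalUnlit"
{
    // Exposed in the editor's material inspector.
    Properties
    {
        _Color ("Tint", Color) = (1,1,1,1)
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 100

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"   // Unity's helper functions

            fixed4 _Color;

            struct appdata { float4 vertex : POSITION; };
            struct v2f     { float4 pos : SV_POSITION; };

            // Vertex shader: the usual object-to-clip-space transform.
            v2f vert (appdata v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                return o;
            }

            // Fragment shader: output a flat, unlit color.
            fixed4 frag (v2f i) : SV_Target
            {
                return _Color;
            }
            ENDCG
        }
    }
}
```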
A default Standard Surface Shader had more code, though, with slightly different syntax in places and many more of Unity's own functions, which seem to exist mainly so you don't have to reimplement Unity's lighting system in your custom shaders.
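For comparison, this is roughly what that default template boils down to, trimmed and with my own comments added. Note that there is no Pass and no explicit vertex/fragment pair; Unity generates those, plus all the lighting code, from the surface function.

```
Shader "Custom/MinimalSurface"
{
    Properties
    {
        _Color ("Color", Color) = (1,1,1,1)
        _MainTex ("Albedo (RGB)", 2D) = "white" {}
        _Glossiness ("Smoothness", Range(0,1)) = 0.5
        _Metallic ("Metallic", Range(0,1)) = 0.0
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 200

        CGPROGRAM
        // Surface function "surf" + Unity's physically based Standard lighting.
        #pragma surface surf Standard fullforwardshadows
        #pragma target 3.0

        sampler2D _MainTex;
        struct Input { float2 uv_MainTex; };

        half _Glossiness;
        half _Metallic;
        fixed4 _Color;

        // Instead of returning a final color, we fill in a data structure
        // describing the surface; Unity's lighting does the rest.
        void surf (Input IN, inout SurfaceOutputStandard o)
        {
            fixed4 c = tex2D(_MainTex, IN.uv_MainTex) * _Color;
            o.Albedo = c.rgb;
            o.Metallic = _Metallic;
            o.Smoothness = _Glossiness;
            o.Alpha = c.a;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```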
As for the results, an unsightly wobbly cube.
Milestone 4 (20.04)
Unity's Shaders
Goals
Due to my confusion from the last milestone, I'm still reluctant to promise much.
But I will complete at least 3 out of the 4 following things:
- A Shader that can use Unity's lighting. (Which is a Surface Shader, if I understood correctly.)
- A Shader that uses Tessellation.
- A Shader that uses a Geometry Shader.
- A Shader that applies some post-processing to the screen. (An Image Effect Shader, if I, again, understood correctly.)
Development notes and results
I managed to complete, as promised, three of those four shaders.
The more I deal with writing shaders in Unity, the more I feel the documentation just isn't good enough. With enough looking, you can actually find answers to most things, but they're scattered all over the documentation, across different categories even, so that looking for them is about as efficient as Googling. Further, the documentation seems somewhat out of date, as some pages list examples and features that other pages document as legacy ways of doing things. Unity's magical helper functions are used all over the examples, and I've no clue where they all come from, or why they sometimes have to be included and sometimes not.
But a surface shader was quite straightforward, and mostly the same as writing the so-called "unlit" shader. The fragment shader part was now replaced by the surface function, which outputs a different data structure that also carries the object's other properties, like roughness. I just re-made the original gelatinous cube, but now with shading.
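As a hedged sketch (not my exact distortion, which I haven't reproduced here), a wobbling but still properly shaded cube can be made by attaching a custom vertex function to the surface shader, along these lines:

```
Shader "Custom/WobblySurface"
{
    Properties
    {
        _Color ("Color", Color) = (0.3, 0.9, 0.4, 1)
        _Amplitude ("Wobble Amplitude", Range(0, 0.5)) = 0.1
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        CGPROGRAM
        // vertex:vert runs our function before Unity's generated vertex shader;
        // addshadow regenerates the shadow pass so shadows wobble too.
        #pragma surface surf Standard vertex:vert addshadow

        fixed4 _Color;
        float _Amplitude;

        // Push vertices along their normals in a time-varying wave.
        void vert (inout appdata_full v)
        {
            v.vertex.xyz += v.normal * sin(_Time.y * 3 + v.vertex.y * 8) * _Amplitude;
        }

        struct Input { float3 worldPos; };

        void surf (Input IN, inout SurfaceOutputStandard o)
        {
            o.Albedo = _Color.rgb;
            o.Smoothness = 0.8; // a glossy, gelatinous look
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```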
Tessellation was actually the easiest of them all, though I'm not sure whether that's because Unity was doing all the heavy lifting behind the scenes. Basically, you just specify a tessellation function in the shader code and return a float as the amount of tessellation you want on the object. Of course, for the best results, you'll want to vary the amount of tessellation based on things like distance from the camera, edge length, and possibly other factors. Tessellation is usually coupled with a vertex shader for displacement, because otherwise the extra geometry goes to waste. I used a displacement map to create a mountainous-looking terrain.
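A sketch of that setup, closely following the fixed-amount pattern from Unity's tessellation documentation (property names like _DispTex are placeholders, and my real terrain shader differed in the details):

```
Shader "Custom/TessellatedDisplacement"
{
    Properties
    {
        _Tess ("Tessellation", Range(1,32)) = 8
        _DispTex ("Displacement Map", 2D) = "gray" {}
        _Displacement ("Displacement", Range(0,1)) = 0.3
        _Color ("Color", Color) = (1,1,1,1)
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        CGPROGRAM
        #pragma surface surf Standard addshadow vertex:disp tessellate:tessFixed
        #pragma target 4.6   // tessellation needs DX11 / OpenGL Core level hardware

        struct appdata
        {
            float4 vertex : POSITION;
            float4 tangent : TANGENT;
            float3 normal : NORMAL;
            float2 texcoord : TEXCOORD0;
        };

        float _Tess;

        // The tessellation function just returns the subdivision factor.
        float4 tessFixed()
        {
            return _Tess;
        }

        sampler2D _DispTex;
        float _Displacement;

        // Displace the new vertices along their normals using the heightmap;
        // without this, the extra geometry would be wasted.
        void disp (inout appdata v)
        {
            float d = tex2Dlod(_DispTex, float4(v.texcoord.xy, 0, 0)).r * _Displacement;
            v.vertex.xyz += v.normal * d;
        }

        struct Input { float2 uv_DispTex; };
        fixed4 _Color;

        void surf (Input IN, inout SurfaceOutputStandard o)
        {
            o.Albedo = _Color.rgb;
        }
        ENDCG
    }
}
```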
And finally the image effect (or post-processing) shader, which also required a small script component to get working. The script used Graphics.Blit inside the OnRenderImage callback to take the current rendered image, pass it through a material (which held the shader), and write the result to a new image, which could go on through further post-processing or be rendered to the screen. I made a shader that can be used to shift the screen's hue.
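The shader half of that pair might look something like the sketch below (the script half is just an OnRenderImage that calls Graphics.Blit(source, destination, material)). This version shifts hue by rotating the color around the grey axis, which is one of several ways to do it, and not necessarily the exact math I used:

```
Shader "Custom/HueShiftEffect"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _HueShift ("Hue Shift (radians)", Range(0, 6.2832)) = 0
    }
    SubShader
    {
        // Image effects draw a full-screen quad; no culling or depth needed.
        Cull Off ZWrite Off ZTest Always

        Pass
        {
            CGPROGRAM
            // vert_img / v2f_img are Unity-provided pass-through helpers.
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float _HueShift;

            // Rotate the color around the grey (1,1,1) axis; a cheap way
            // to shift hue without a full RGB<->HSV round trip.
            float3 hueShift(float3 c, float a)
            {
                const float3 k = float3(0.57735, 0.57735, 0.57735);
                return c * cos(a) + cross(k, c) * sin(a)
                     + k * dot(k, c) * (1.0 - cos(a));
            }

            fixed4 frag (v2f_img i) : SV_Target
            {
                fixed4 col = tex2D(_MainTex, i.uv);
                col.rgb = hueShift(col.rgb, _HueShift);
                return col;
            }
            ENDCG
        }
    }
}
```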
Here are the results of my work on one image:
Milestone 5 (04.05)
Unity's Shaders
Goals
Hopefully Unity's 2018.1 release will be out soon. I would like to try out creating shaders with their new visual tool. If it's not out by the end of April, I'll just use the beta version.
Creating at least four shaders with Shader Graph is this milestone's goal.
These are not promises, but I will try to replicate the cube material I already have, and see whether the tessellated terrain is doable.
I would also like to explore the limits of Shader Graph (what cannot be done with it) and present how to make some shaders with it.
Development notes and results
While Unity's 2018.1 version did launch, Shader Graph is still listed as a "Preview" package. Out of the box, it is currently only usable with Unity's Lightweight Render Pipeline, which is also listed as a "Preview". I noticed some bugs and inconveniences, but the real problem was that it was also missing features. For example, nothing regarding vertex manipulation was in yet, so for the moment it could only be used for what you can generally do with fragment shaders.
The next issue was that somewhere, something broke in a way that ruled out my old method of writing a post-processing shader. For one, the old shaders aren't compatible with the new rendering pipelines. But even the shaders that were had no effect on what my camera displayed. No errors; they were just ignored.
So that leaves out all of the shaders I made before, but a regular transparent shader was easy enough to make. It resembles the gelatinous cubes from before, but without the wobble, so it's more like glass.
The second shader I made was still an attempt at something like the image effect shader from before: if I couldn't apply the shader to the camera texture, I could apply it to some other texture, and the idea should be the same. It turns out Shader Graph already has an implementation of hue, so I made something a bit more complicated - a shader that can edit hue, saturation, and value.
The third shader was inspired by the creepy changing paintings often seen in horror games. It changes the texture of an object depending on the x coordinate of the rendered fragment on screen. The item looks normal when looked at straight on, but different when viewed from the edge of your screen.
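Shader Graph is visual, so there is no code to show, but for comparison, here is roughly what that graph computes, expressed as an ordinary vertex/fragment shader (the names and the exact blend curve are my own approximation):

```
Shader "Custom/CreepyPainting"
{
    Properties
    {
        _MainTex ("Normal Texture", 2D) = "white" {}
        _AltTex ("Creepy Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex, _AltTex;

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
                float4 screenPos : TEXCOORD1;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = v.texcoord.xy;
                // Remember where this fragment lands on the screen.
                o.screenPos = ComputeScreenPos(o.pos);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // 0..1 across the screen after the perspective divide.
                float x = i.screenPos.x / i.screenPos.w;
                // 0 at the screen center, 1 at the left/right edges.
                float edge = abs(x - 0.5) * 2.0;
                return lerp(tex2D(_MainTex, i.uv), tex2D(_AltTex, i.uv), edge);
            }
            ENDCG
        }
    }
}
```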
The fourth shader was for the ground, and should ideally work based on a heightmap and two textures. One texture is used as a base, but for places where the heightmap is below a certain threshold, the other texture should be used. This can be used, for example, to add varying levels of snow or grass to the ground.
And finally, a shader that blends two textures together based on world normals. I used it to show something resembling snowfall or frost on the top side of objects.
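Again as a code-for-comparison sketch (the actual work was done with Shader Graph nodes), the core of this world-normal blend is just a dot product against the up vector feeding a smoothstep. The heightmap-threshold ground shader above works much the same way, with a texture sample in place of the dot product.

```
Shader "Custom/SnowBlend"
{
    Properties
    {
        _BaseTex ("Base", 2D) = "white" {}
        _SnowTex ("Snow", 2D) = "white" {}
        _Threshold ("Snow Threshold", Range(0,1)) = 0.5
        _Softness ("Blend Softness", Range(0.01,1)) = 0.2
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        CGPROGRAM
        #pragma surface surf Standard

        sampler2D _BaseTex, _SnowTex;
        float _Threshold, _Softness;

        struct Input
        {
            float2 uv_BaseTex;
            float3 worldNormal; // world-space normal, filled in by Unity
        };

        void surf (Input IN, inout SurfaceOutputStandard o)
        {
            fixed4 base = tex2D(_BaseTex, IN.uv_BaseTex);
            fixed4 snow = tex2D(_SnowTex, IN.uv_BaseTex);
            // How much the surface faces up decides how snowy it is.
            float up = saturate(dot(normalize(IN.worldNormal), float3(0, 1, 0)));
            float blend = smoothstep(_Threshold - _Softness, _Threshold + _Softness, up);
            o.Albedo = lerp(base.rgb, snow.rgb, blend);
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```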
Overall, it was really easy and intuitive making shaders using this visual method, and I especially liked that I could see the intermediate results between the inputs and the final output.
Milestone 6 (18.05)
Example scene
Goals
The final milestone's goal is to create a playable demo scene with examples of the shaders I've created. This is in preparation for the final presentation, and the aim is to make something presentable that works fine outside of the editor and without too much explanation.
Development notes and results
In the end it seems that half of my project was plagued by technical issues, and this last step is no exception. Creating one final build required some compromises, and the result doesn't reflect the full extent of what I did and learned during the course.
First of all, clearly the entire Vulkan part had to be discarded, but it did provide me with a fair amount of insight into how GPUs work, so it was not lost time.
Secondly, writing shaders by hand in Unity also definitely helped me understand them better, but that work could not be included in the final demo either, since the shader code was not (to my understanding) compatible with the new pipeline that Shader Graph runs on.
I decided to go with demonstrating what I had made in Shader Graph, because that allowed me to create shaders the fastest, meaning I would have time to show more, even if it limited me to what can be done with fragment shaders.
And finally, regarding the build that was supposed to be made - too late did I discover that it was not, in fact, possible. Custom Nodes made in Shader Graph fail to compile in standalone builds, and my shaders were already too reliant on them for me to have time to strip them out. I suppose such are the drawbacks of working with software labeled "preview", and that is a learning experience in itself.
Final Results
Links to the files of the two Unity projects
ShaderLab shaders
Shader Graph shaders
The latter makes up the larger portion of the overall project. Unfortunately, no build could be made due to the technical difficulties described above, but you should be able to load the assets into your own Unity editor and try everything out from there. Alternatively, here is a video of the fragment shaders that were made.