Friday 15 April 2016

Diffuse shader: vertex / fragment shader

In this example, I will show you how to write a shader that achieves the same effect as the previous one (light diffuse), only this time as a vertex/fragment shader.

Here, we will have to compute the lighting manually and color each pixel according to the light it receives.

Please bear in mind that this particular shader only works with a single directional light: it will not react to multiple lights, ambient light, or lights that are not directional. In later posts we will see how to add multiple lights and ambient light.

This is the shader code:

Shader "Custom/DiffuseSingleLight"
{
  Properties
  {
    _Color("Color", Color) = (1,1,1,1)
  }

  Subshader
  {
    Pass
    {
      Tags{"LightMode" = "ForwardBase"}

      CGPROGRAM

      #pragma vertex vert
      #pragma fragment frag

      struct input
      {
        float4 ver : POSITION;
        float3 norm : NORMAL;
      };

      struct v2f
      {
        float4 pos : SV_POSITION;
        float3 norm : NORMAL;
      };

      float4 _Color;
      float4 _LightColor0; //built-in variable, but it must be declared here!

      v2f vert(input i)
      {
        v2f o;

        o.pos = mul(UNITY_MATRIX_MVP, i.ver);
        o.norm = mul(float4(i.norm, 0.0), _World2Object).xyz; //_World2Object is a float4x4
        return o;
      }

      float4 frag(v2f v) : COLOR
      {
        float3 normDirection = normalize(v.norm);
        float3 lightDirection = normalize(_WorldSpaceLightPos0.xyz); //_WorldSpaceLightPos0 is a float4

        float3 light = max(0.0, dot(normDirection, lightDirection)) * _LightColor0.rgb;

        return float4(light, 1) * _Color;
      }

      ENDCG
    }
  }
}

I assume you are now familiar with how to begin a shader program, so I will skip the first few lines of code.

We add the tag "LightMode" = "ForwardBase". This is necessary as it tells Unity that we are using forward rendering and dealing with the main directional light.

In the input structure we have one additional parameter, called norm, with the semantic NORMAL. This will receive the vertex normal vector from the object.

We then declare a variable called _LightColor0. This is a built-in Unity variable that holds the color of the main directional light, but it must still be declared here. If you have more than one directional light while using this shader, one will override the other according to rotation and intensity; the lights will not blend together.

In the vertex function, we convert the normal vector from the input into world space; the result is then normalized in the fragment shader. On the next line, we normalize the direction vector of the directional light, using another built-in Unity variable, _WorldSpaceLightPos0.
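A note on why the multiplication order looks reversed in the normal conversion: normals must be transformed with the inverse transpose of the model matrix, and in CG, mul(vector, matrix) is equivalent to mul(transpose(matrix), vector). So the line in the vertex function can be read as:

```hlsl
// _World2Object is the inverse of the object-to-world (model) matrix.
// Putting the vector first is the same as multiplying by the transpose,
// so this effectively applies the inverse transpose of the model matrix:
o.norm = mul(float4(i.norm, 0.0), _World2Object).xyz;
// the 0.0 in the w component makes sure translation is ignored
```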

With these 2 normalized vectors we can calculate the diffuse light intensity. We first take the dot product of the 2 vectors.

The dot product of two normalized vectors returns a value between -1 and 1, depending on the directions they point in: if they point in the same direction, the value returned is 1; if one points in the exact opposite direction of the other, the value is -1. This value represents the intensity of the light.

Then, we clamp this value to a minimum of 0, using the max function you see on the same line. This is done because we don't want any negative contribution; in other words, the light intensity simply cannot be negative.
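As a quick worked example of the dot product and the clamping (the vectors here are just illustrative values, not part of our shader):

```hlsl
float3 n = float3(0, 1, 0);            // surface normal pointing straight up
float3 l = float3(0, 1, 0);            // direction towards the light, also up
float intensity = max(0.0, dot(n, l)); // dot = 1.0 -> fully lit

// If the light comes from behind, e.g. l = float3(0, -1, 0),
// the dot product is -1.0 and max() clamps the intensity to 0.
```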

Finally, we multiply this intensity by the color of the light, represented by the variable _LightColor0.

At the end of the method we return the light variable, cast to a float4 as we want to return a color. The 1 added is the alpha value. This is then multiplied by the _Color public parameter, which is the user-defined color, to give a tint to the object.

This is the result:


As you may have noticed, writing vertex/fragment shaders requires a lot more code compared to surface shaders; however, we gain much greater flexibility and can create more complex effects.

Sunday 10 April 2016

Diffuse shader: surface shader

In this post I will show you how to create a diffuse shader (which reacts to lighting) using surface shaders.

Surface shaders are intended to simplify the code needed for complex effects. These shaders automatically generate vertex and fragment functions, so we do not need to deal with them ourselves.

I will then show you how to do a diffuse shader using vertex and fragment methods, like we did for the unlit shader in the previous post.

Surface shaders are structured differently from vertex/fragment shaders and, in particular, they have some required parameters: a surface function, which is the CG method that does the surface shading work, and a lighting model. Unity provides pre-made lighting models that can be used in the shader, but it is also possible to create custom ones.

Just like in vertex/fragment shaders, we point to the surface method with the #pragma directive, like so:


#pragma surface surf Lambert [optional parameters]

This tells Unity to look for a function called surf and to use the pre-built lighting model Lambert. It is possible to add additional parameters here, for example, to enable alpha blending.
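For instance, a minimal sketch using the alpha option, which enables alpha blending on the surface shader:

```hlsl
#pragma surface surf Lambert alpha
```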

Then the method itself will have to be declared as such:


void surf (Input IN, inout SurfaceOutput o)
{}

The Input structure is defined by you. Here we declare all the input variables that are needed to achieve whatever effect we are trying to create in the shader. We can use some built-in variables to get information about our model, like world normals etc. A complete list is of course available on the Unity website.

The SurfaceOutput structure is pre-built in Unity, and so are its members. We set its variables to define the output of our shader. Some examples are Albedo, which determines the diffuse color, and Alpha, used for transparency.
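A minimal sketch of a surf function setting both of these members (assuming a _Color property declared in the shader, and the alpha pragma option for the transparency to take effect):

```hlsl
void surf (Input IN, inout SurfaceOutput o)
{
    o.Albedo = _Color.rgb; // diffuse color
    o.Alpha  = _Color.a;   // transparency; only visible with the "alpha" pragma option
}
```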

Now, this is the complete code used to create a diffuse shader:



Shader "Custom/SingleColorSurfaceShader" {
    Properties
    {
      _Color("Tint", Color) = (1,1,1,1)
    }

    SubShader {

      CGPROGRAM

      #pragma surface surf Lambert

      float4 _Color;

      struct Input {
          float4 col : COLOR;
      };

      void surf (Input IN, inout SurfaceOutput o) {
          o.Albedo = _Color.rgb;
      }

      ENDCG
    }
  }

Just like a vertex/fragment shader, we start with the keyword Shader, followed by the name. This is ShaderLab, so the syntax is exactly the same. The main difference is in the CG part of the program.

With the #pragma surface directive we tell the engine where to look for our surface method, and that we are using the Lambert lighting model.

The Input structure contains only one variable, a float4 called col. Actually, this variable is not even used, as we are simply going to output the color that is passed in by the user in the Properties. It is necessary to put at least one variable inside the structure, though, otherwise we will get an error.

In our surf method, we simply set the Albedo member (the diffuse color) of the predefined SurfaceOutput structure to be equal to the _Color public variable.

To sum up, all we are doing here is taking the _Color variable, which is the color selected by the user, and passing it to the SurfaceOutput.Albedo variable, which is a predetermined variable that sets the diffuse color of the model.

As you can see, the code is very simple and we avoid dealing with vertex and fragment methods, which are automatically generated for us behind the scenes.

If we create a material with this shader and attach it to a sphere, this is the result:




Monday 4 April 2016

Custom Unlit Shader

Shaders are both fascinating and frustrating.

Writing shaders is very rewarding, however, it can lead you to total madness.

I already showed you how to write shaders that use the stencil buffer; now I want to demonstrate how to create a shader from the ground up in Unity.

I should probably point out that I am not a shader expert, but I do find the topic extremely interesting and Unity has a lot of functionality that makes writing shaders easier.

In this tutorial we will write the simplest of shaders: one that does not compute lighting or textures, but simply returns a single color. Also, the shader we are going to write is a so-called "vertex/fragment" shader.

Unity has its own language for writing these programs, called ShaderLab. Other languages are also used; the one we are going to look at is CG, which stands for C for Graphics, a shader language developed by Nvidia. The shaders we write are a mix of these 2 languages.

To begin with, right click in the project folder and select Create -> Shader -> Unlit Shader. It doesn't really matter which shader option you choose, as we are going to delete its entire content.

I'm going to post the code below and I will then explain the details.


Shader "Custom/UnlitSingleColor"
{
  Properties
  {
    _Color("Tint", Color) = (1,1,1,1)
  }

  Subshader
  {
    Pass
    {
      CGPROGRAM

      #pragma vertex vert
      #pragma fragment frag

      struct input
      {
        float4 pos : POSITION;
      };

      struct v2f
      {
        float4 pos : SV_POSITION;
      };

      float4 _Color;

      v2f vert(input i)
      {
        v2f o;

        o.pos = mul(UNITY_MATRIX_MVP, i.pos); //UNITY_MATRIX_MVP goes first: these are matrices, and order matters in matrix multiplication

        return o;
      }

      float4 frag(v2f i) : COLOR
      {
        return _Color;
      }

      ENDCG
    }
  }
}

When we write a shader we start with the keyword Shader, followed by the name. The name can contain "/" characters, so the shader will appear in subfolders when we select it in the material. (Fig 1).

Fig 1
Then we create some properties. These variables are public and can be tweaked from the inspector in the shader submenu. The only variable we have is called _Color. Within the brackets, we first pass the name under which the variable will be displayed (in this case, "Tint") and then its type, which is Color. With this type, the inspector will let us choose the color we want using the palette. (Fig 2).

Fig 2
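Besides Color, the Properties block supports other types. Here is a short sketch of some common ones (the names are just examples, not used in our shader):

```hlsl
Properties
{
    _Color("Tint", Color) = (1,1,1,1)
    _MainTex("Texture", 2D) = "white" {}  // a texture property
    _Amount("Amount", Range(0, 1)) = 0.5  // a slider between 0 and 1
    _Shininess("Shininess", Float) = 10   // a plain number
}
```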
Now we start the Subshader. The subshader is, well, the shader itself. You can have as many as you need; usually they perform differently and are written for different platforms. The graphics card will read each subshader you write and use the first one that is compatible with its system.

Right after that, we have the Pass keyword. A pass is basically a draw call. For this simple example, a single pass is enough for what we want to achieve. Some other cases require multiple passes, like the stencil buffer I mentioned earlier or if we want to use multiple lights.

With the CGPROGRAM statement we tell Unity that we are going to write in CG. This section ends with the ENDCG instruction.

If you remember, earlier I mentioned that we are going to write a vertex/fragment shader. This means that this program will contain a vertex function and a fragment function.

It is common practice to use vert and frag as names for these 2 functions. With the two #pragma directives we tell Unity what these 2 functions are called: the vertex method is called vert and the fragment method is called frag.

Before I continue, let's explain what these 2 methods do. It's actually pretty simple: the vertex program is the portion of code that runs for each vertex of the mesh, so you can use it to create animations or special effects like curved worlds. The fragment function runs for each pixel, so it is used to color our mesh and render textures.
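For example, a vertex function could displace vertices to create a simple wave animation. This is just a sketch (not part of our unlit shader), reusing the same input and v2f structs; _Time is a built-in Unity variable whose y component is the time in seconds:

```hlsl
// Sketch: animating vertices in the vertex function
v2f vert(input i)
{
    v2f o;
    i.pos.y += sin(i.pos.x * 4.0 + _Time.y) * 0.1; // wave along the x axis
    o.pos = mul(UNITY_MATRIX_MVP, i.pos);
    return o;
}
```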

The next thing we see is a struct called input. Here we grab all the information we need from "outside": we pass in all the variables regarding our model that need to be processed, like, in this case, the vertex position pos, of type float4 (basically a vector of 4 floats). The POSITION keyword written after it is called a semantic, and it communicates to the GPU what kind of variable this is. This variable is required in pretty much every shader we write, as it is used to display the actual mesh that will run this shader.

The next struct, called v2f, also contains a float4 variable called pos, but the semantic is different. This is because we are going to take the vertex position from the input struct, which is a local position, and convert it into a clip space position (with the semantic SV_POSITION), which is basically a set of coordinates that Unity understands for rendering.

Next, we declare a float4 variable called _Color. As you may have noticed, this is the same name we gave to the Color variable in the Properties section. Since we are now writing in CG, this portion of code is not aware of what happens outside of it, so we need a reference to the _Color parameter we declared earlier in the program. To do so, we simply re-declare it in the CG section of our shader using the same name. The type float4 is commonly used for colors, as RGBA.

We finally get to the vertex method. Its return type is v2f, and it takes the input struct as a parameter.

In the method, we first declare a v2f object, called o.

The next line is something you will see in every shader: it does what I explained earlier, taking the local position of the vertex and converting it into clip space so the model can be rendered in the scene. This is done by multiplying the pos variable of the input struct by UNITY_MATRIX_MVP, which stands for model-view-projection. Remember, this is a matrix multiplication, so the order of the 2 parameters matters.

You can try doing mul(i.pos, UNITY_MATRIX_MVP) for fun and see what happens.

Lastly, we return the object o.

Finally, the fragment method.

This is of type float4, to represent a color, as its semantic suggests.

All we do here is return the _Color value, which is nothing but the color we pick in the inspector. So, every single pixel of this mesh will be colored as dictated by the _Color variable.

Now you can create a capsule or a sphere (anything, really), then create a new material, select this shader, attach the material to the 3D object, and see the result.

With a white color, it looks like this: