ShadersExcerpt
Preface
Who this book is for
What will you learn?
What this book covers
Conventions used
Disclaimer
About the author
Introduction
Chapter 1: The world of 2D shaders
Shaders for 2D elements
Coordinates? What coordinates?
Color the canvas
A grid of perpendicular lines
Live cables and basic iterations
Fire the fireworks!
Watch out for the lightning
The animated field of stars
It’s a rainy day
Scene in the rain
Color the sprite
A view through glass doors
Dot matrix
Emboss effect
Night vision goggles
How to burn a scene
An intentional glitch
The power of partial derivatives
Pixel glow
Fly through a tunnel
Interlude 1: Visual shaders in 2D
Symmetrical ornament
Chapter 2: Procedural art
Mandelbrot set
Voronoi diagram
Sierpiński triangle
Shiny and animated plasma
Neural plexus
Interlude 2: Visual shaders in 3D
Cloaked character
Energy shield
Chapter 3: Screen post-processing
Turn your display into an old CRT monitor
Shockwave
Edge detection
Pixelate it
Into the fog
Interlude 3: Other types of shaders
Epilogue
Copyright
Copyright © 2024 Filip Rachůnek, all rights reserved.
Preface
Shaders are a fascinating technology, and if we aren't afraid to overcome the initial
learning phase, a wonderful world opens up before us, where we can significantly
enhance the appearance of our game with just a few lines of code. Almost every
gaming computer today is equipped with a powerful graphics card, so why not use
its potential to the fullest? The more enhancement options we master, the easier it
will be to create the entire game. And if we don't have the budget for a large team of
experts to help us with this task, this becomes even more important.
Who this book is for
Will this book be useful to you if you work in an engine other than Godot? The answer is yes: you will find plenty of useful information here regardless of the game engine. Well, to avoid any misunderstanding, it's not quite that straightforward.
Shaders for Godot are written in the standard language GLSL (OpenGL Shading
Language), which is slightly modified in the Godot Engine. However, the most
important functions and basic principles remain the same. Unity and Unreal support
the HLSL (High-Level Shading Language), which was developed by Microsoft for
the DirectX standard, and both engines compile HLSL to GLSL in the background.
It's true that the syntax of HLSL differs from GLSL, but the fundamental principles
remain the same. Therefore, if we understand a specific algorithm, it shouldn't be too
difficult to rewrite it for a particular platform.
So if you are working in the Godot Engine, you can use the code from this book
immediately. For engines that don't directly support GLSL, you'll need to adjust the
code a bit. However, the algorithms will be explained in as much detail as possible
to make these adjustments as easy as possible for you.
What will you learn?
You will learn how to draw lines or simple geometric shapes, when to use which
coordinate system, how to achieve the same effect at any game window resolution,
why to use trigonometric or power functions, how to work correctly with colors
including the alpha channel, the art of animation using a time variable, and much
more. Each chapter and subsection will introduce something new that you can
include in your toolkit for writing your own effects.
What this book covers
● The world of 2D shaders: 2D shaders are among the most popular techniques
for enhancing the overall appearance of a game, which is why this chapter is
the most extensive. It starts with a brief description of how shaders work on the
GPU and an explanation of different coordinate systems for pixels, and ends
with a detailed explanation of more than 15 specific effects that have wide
applications in games of all types.
● Visual shaders in 2D: For those who enjoy creating visually and want a break
from coding, I've prepared a short detour dedicated to visual shaders in the
Godot editor. The theoretical section is complemented by a specific example of
creating an animated symmetrical ornament, which we can easily enhance by
adding new visual elements or modifying existing ones.
● Procedural art: If you manage to create an original visual effect with a small
amount of code, you will rightfully be celebrated for the result, at least within
the programming community. This chapter aims to bring you closer to creating
several shaders of this category.
● Visual shaders in 3D: The second detour deals with an advanced form of
visual shaders that are used in 3D environments. We will learn how to enhance
any 3D object to make it appear as though it is hidden behind a camouflage
field, or how to envelop such an object with an animated shield that creates the
impression of a fluctuating electric field.
● Screen post-processing: The final chapter introduces the topic of working with
the entire screen, which we can significantly enhance or change beyond
recognition using post-processing effects. We will learn both simple techniques
for 2D games and advanced post-processing in 3D environments.
Conventions used
There are a number of text conventions used throughout this book.
Code in text: Indicates file or folder names, keywords, annotations, path names,
Godot labels, and user input. Example: “Right-click on this node, select Add Child
Node, and choose ColorRect (search for it in the search bar). The scene should
look something like this.”
A block of code:
shader_type canvas_item;

void fragment() {
    vec2 uv = UV;
    COLOR = vec4(uv.x, 0.0, 0.0, 1.0);
}
Bold: Denotes a new term, a significant word, or any onscreen text that catches your
attention. For instance: “The first line starting with the keyword shader_type is a
mandatory header, without which our code wouldn't compile.”
Italic: Utilized to add captions to screenshots and provide additional information or
comments in the form of side notes.
Disclaimer
Although I have made every effort to ensure that the code examples and snippets are
error-free and typo-free, I cannot guarantee with absolute certainty that no bugs have
been overlooked. If you encounter any difficulties while following the instructions
to code your shaders, I recommend visiting my YouTube channel, where most of the
shaders mentioned here are also available as video tutorials. You are welcome to
contact me via email or on social media.
About the author
What else? I live in Prague, Czech Republic, with my beloved wife and three kids. I
like to compose music, play piano, study chess, and write books. And, of course, I
love to play games from other indie developers, and get inspired by them. :-)
Filip.Rachunek.com
Introduction
This book definitely does not contain comprehensive information about all types of
shaders you can use with the Godot Engine. Instead, I would like to focus on
specific useful effects that have practical applications in various games and are not
extremely difficult to understand. I will try to explain each algorithm in as much
detail as possible so that every programmer can write their own shaders or improve
existing ones after reading the book.
That was quite a brief introduction, don't you think? I could have written more, but
something told me that those who got this book mainly wish to learn as much useful
information about shaders as possible, and a long introductory talk wouldn't help
much with that. I will gradually get to the interesting topics that I will discuss in
more detail.
Are we any smarter now? Well, not really. All we know so far is that a shader is a program that runs on the GPU. But why there, specifically?
Let's assume we want to set the color of a single pixel. This by itself is a simple and
fast task, so a regular computer won't have any trouble with it, even if we wish to do
it sixty times per second, which is a common refresh rate. The problem arises if we
intend to set the color of every pixel on our monitor. In the case of Full-HD
resolution, we need to process 1920 x 1080, that is, 2,073,600 pixels. And if we do
this sixty times per second, we get to a value of 124,416,000 operations, almost 125
million. Every second. So how do we go about solving something like this?
The answer is parallel processing of each pixel. In practice, instead of a single CPU
setting the colors of pixels sequentially (one after another), we have a GPU with a
special processor for each pixel individually. Simply put, we're back to sixty
operations per second (or more if our graphics card can handle it). The GPU does a
tremendous amount of work for us, and our job is to provide instructions for this
work.
● vertex(): With this function, we can perform operations on the vertices defined
for our 2D object. A typical example is their periodic shift, which creates the
effect of a flag waving in the wind.
● fragment(): The fragment function is called for each pixel to set its final color.
● light(): The light function is also called for each pixel and considers light
sources that can further affect the final color.
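For orientation, here is roughly what a canvas_item shader that declares all three functions looks like. This is a minimal sketch of my own (the bodies are placeholders, not code from the book); in this book we will almost always keep only the fragment function.

shader_type canvas_item;

void vertex() {
    // Runs once per vertex of the 2D element; we could shift VERTEX here.
}

void fragment() {
    // Runs once per pixel; the final color is written to the built-in COLOR.
    COLOR = vec4(UV, 0.0, 1.0);
}

void light() {
    // Runs once per pixel for every light affecting the element.
}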
Please keep in mind that this book focuses on Godot 4. The syntax of Godot's shader language (and GDScript) differs in previous versions, such as Godot 3.6, so the code examples provided in this book will not be compatible with Godot 3.x and earlier versions.
To follow along with the examples, let's set up a simple test project:
1. Install Godot 4 from the Godot site or Steam. Any stable version that starts with 4 should be fine (4.0, 4.1, 4.2, and so on).
2. Create a new project (keep the default settings) in a folder of your choice.
3. After you click Create & Edit, and confirm any possible warning dialogs,
Godot will create your new project, and open it.
4. Click on 2D Scene to create a new scene. In the Scene panel (on the left), a
root node Node2D will appear. Right-click on this node, select Add Child
Node, and choose ColorRect (search for it in the search bar). The scene
should look something like this.
5. ColorRect is small by default, so we'll enlarge it to better see our shader effect.
You can either resize it with the mouse or set specific dimensions in the
Inspector. Select the ColorRect node and in the Inspector panel (on the right),
expand the Layout > Transform section under Control, where you'll find
Size. I usually set the dimensions to 600 x 400.
6. Next, we need to add a material, which we will again do in the Inspector. Scroll
down to the CanvasItem > Material section, find the Material property,
and set its value to New ShaderMaterial. Click on the result and for the new
Shader property, select New Shader. A dialog will appear where you can
name this shader.
7. Finally, we need to open the code of our shader in the shader editor. In the
Inspector, click on the new Shader property value, and the editor will
automatically open. We can see that Godot has used a template with empty
vertex, fragment, and light functions. Since we will primarily work with the
fragment function, we can delete the vertex and light functions.
We are ready to write our first shader! If you follow my YouTube channel, you
probably know that I repeat this process at the beginning of almost every video, so if
anything was unclear, I recommend watching my process directly there. Don’t forget
to save the scene with Ctrl-S, and let's get to work.
Coordinates? What coordinates?
Because the GPU processes all pixels in parallel and independently of each other,
the fragment function is also called in parallel and for each pixel separately. This
approach is relatively unusual compared to conventional programming. In many
languages, we simply issue the command "draw a line from point X to point Y,"
whereas when programming a shader, we have to ask an entirely different question,
namely, "What condition do the pixels meet whose coordinates lie on the line we
want to draw?" We will go into this in more detail in subsequent examples. But now
we need to clarify what coordinate system we can actually work with.
● UV: We will use this type most often because these are normalized coordinates
that only relate to the element to which the shader is applied. In practice, this
means that the top-left corner of our ColorRect has coordinates (0,0), while the
bottom-right corner corresponds to (1,1). All other pixels fall within these
intervals, so, for example, the center of our rectangle will have coordinates
(0.5,0.5).
● SCREEN_UV: These are also normalized coordinates, but they relate to the
window (viewport) in which the element with our shader is contained.
Therefore, the coordinate values will change according to the relative position
of the ColorRect in relation to the parent window's position. Thus, we will
encounter these coordinates primarily in shaders that we apply to the entire
screen.
● FRAGCOORD: Unlike the two previous types, these coordinates are not normalized: the xy components hold the pixel's position in the viewport, measured in pixels. We can normalize FRAGCOORD using the internal variable SCREEN_PIXEL_SIZE: SCREEN_UV = FRAGCOORD.xy * SCREEN_PIXEL_SIZE.
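For a quick comparison, here is a tiny sketch (my own, not from the book): painting the element with SCREEN_UV instead of UV makes the gradient depend on where the element sits within the whole viewport, not on the element itself.

shader_type canvas_item;

void fragment() {
    COLOR = vec4(SCREEN_UV, 0.0, 1.0);
}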
If you find this overview somewhat complicated or confusing, don't worry. To
understand the following examples, it is enough to know that UV coordinates are in
the range of 0 to 1, which will form the basis for the vast majority of further
calculations. We will start with simple color settings and observe the results.
Color the canvas
In the previous section, you might have noticed that our ColorRect contains
something like a red gradient. How did I achieve that? Let's write our first shader,
where I will explain everything.
shader_type canvas_item;

void fragment() {
    vec2 uv = UV;
    COLOR = vec4(uv.x, 0.0, 0.0, 1.0);
}
The first line starting with the keyword shader_type is a mandatory header, without
which our code wouldn't compile. By using the value canvas_item, we tell the compiler that this is a 2D shader, in which we will write the final color of the current pixel to the internal variable COLOR. All shaders in this chapter will start with this line.
Since we only need the fragment function, I deleted everything else that Godot
generated using the template. As I mentioned, the GPU will call this function for
each pixel, so it's necessary to know which pixel we're working with. This is
precisely what the first line of this function does, where we create a new 2D vector
uv and assign it the value of the internal variable UV, which represents the
normalized coordinates of the pixel.
In the second line, we perform the actual coloring of the pixel. Since COLOR is
essentially the output of our calculation, every fragment function will end with
assigning a value to this variable. Note that COLOR is a four-dimensional vector
because it represents color in RGBA format. It thus has three color components
(red, green, and blue) and an alpha channel that defines transparency. Since we are
setting only the first component according to the X-coordinate of the uv vector, only
the red component of the pixel changes. As we know, X ranges from 0 on the left to
1 on the right, so pixels on the left edge are black and those on the right are pure
red. The other components, green and blue, are set to 0, and the alpha channel is set
to 1, corresponding to full opacity.
Great! Our first shader is complete. We can continue to experiment with it, for
example, by using the Y-coordinate of the uv vector for the green component of the
color, which would create a 2D gradient.
shader_type canvas_item;

void fragment() {
    vec2 uv = UV;
    COLOR = vec4(uv.x, uv.y, 0.0, 1.0);
}
Now let's show a simple trick for shifting the origin of the coordinates to the center
of our rectangle, which can be useful for creating various symmetrical patterns. As
we know, UV coordinates range from 0 to 1, so the center is at (0.5, 0.5). If we want
the center to be at (0, 0), we need to subtract 0.5 from both components of the UV
vector.
shader_type canvas_item;

void fragment() {
    vec2 uv = UV - 0.5;
    COLOR = vec4(uv.x, uv.y, 0.0, 1.0);
}
We can notice that the top-left quadrant of our rectangle is displayed in black. This is because the top-left corner now has coordinates (-0.5, -0.5), and negative color values are automatically clamped to 0, which results in black. We can keep the gradient symmetrical by taking the absolute value of the shifted coordinates.
shader_type canvas_item;

void fragment() {
    vec2 uv = abs(UV - 0.5);
    COLOR = vec4(uv.x, uv.y, 0.0, 1.0);
}
Tip: The shading language supports various shorthand notations to improve
code readability. For example, we have now subtracted a one-dimensional
value of 0.5 from a two-dimensional vector UV, which actually means that
we have subtracted 0.5 from both components, i.e., UV.x and UV.y.
Now we can clearly see how the colors in the center of the rectangle darken to black,
as the transformed coordinate values approach (0, 0).
Color gradients are cool, but in this primitive form, they might not impress anyone.
Later, we'll show how to transform them into a very impressive animated plasma,
which has definitely more interesting applications. But first, we'll learn something a
bit simpler: how to create a regular square grid.
A grid of perpendicular lines
Starting with this example, I will begin to describe more complex shaders with
somewhat more challenging algorithms. Since this book is primarily intended for
beginners, I will try to explain them as thoroughly as possible. However, it is
possible that some details may remain insufficiently explained for certain readers - if
that happens, please email me at [email protected], and I will gladly
provide additional descriptions. This is one of the great advantages of e-books: it is easy to release new, improved versions.
We still need to work our way up to the effect you can see in the screenshot. We'll
start with the basics, which is drawing a straight line. As I mentioned at the
beginning, drawing such lines requires mastering a different approach and thinking
about each problem in parallel, meaning everything is part of a whole and must be
rendered simultaneously.
I'll show you two ways to add such a simple square grid to the shader. One way
utilizes fractional parts of float numbers, the other works with trigonometric
functions. Let's start with the first one.
shader_type canvas_item;

float draw_grid(vec2 uv) {
    vec2 grid_uv = fract(uv); // keep only the fractional part of the coordinates
    return grid_uv.x;
}

void fragment() {
    vec2 uv = UV - 0.5; // move origin to the center
    vec3 color = vec3(draw_grid(uv));
    COLOR = vec4(color, 1.0);
}
OK, something has changed, and in negative coordinates, we no longer have just a black color. But why does the resulting value tend towards white as it gets closer to zero? This is due to how the fract function works: it returns the fractional part of its argument, fract(x) = x - floor(x), so the result always lies between 0 and 1. For a slightly negative value such as -0.1, we get fract(-0.1) = 0.9, which is why pixels just left of the center are almost white.
Before we move on, let's give our future grid a nicer color. We'll add a uniform
parameter called line_color and use it to colorize the output of the fragment
function.
shader_type canvas_item;
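
// The uniform declaration itself is cut from this excerpt. It presumably looks
// something like the line below (the exact default color is my assumption):
uniform vec3 line_color : source_color = vec3(0.0, 0.4, 1.0);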
What is a uniform parameter? It is a way to pass input values into the shader from an
external source, either directly in the Inspector panel or programmatically using
GDScript. Later, we will see how we can achieve nice animated effects by
periodically changing uniform parameters. But back to the current shader. The new
uniform parameter has appeared in the Inspector, where we can change the color of
our grid.
Now we'll use the parameter in the code.
void fragment() {
    vec2 uv = UV - 0.5; // move origin to the center
    vec3 color = draw_grid(uv) * line_color;
    COLOR = vec4(color, 1.0);
}
We need to ensure that we're not only returning values for vertical lines but also horizontal ones. What happens if we use y instead of x in the draw_grid function? We get the same gradient, only transposed, so to draw both sets of lines at once we combine the two components; a sketch of such a combined function follows.
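The updated draw_grid is not shown in this excerpt. Judging by the summary at the end of this section, which mentions using the max function for both directions, it presumably becomes something like this (a sketch, not necessarily the book's exact code):

float draw_grid(vec2 uv) {
    vec2 grid_uv = fract(uv);
    return max(grid_uv.x, grid_uv.y); // combine vertical and horizontal lines
}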
As we can read in the documentation, the step function with parameters edge and
value returns 0 for all values less than edge and 1 otherwise. Therefore, it's
necessary to set edge very close to 1 to make the resulting line sufficiently thin.
void fragment() {
    vec2 uv = UV - 0.5; // move origin to the center
    vec3 color = step(0.99, draw_grid(uv)) * line_color;
    COLOR = vec4(color, 1.0);
}
The result looks nice, but we already see the first issue - the vertical line is slightly
thicker than the horizontal one. This is because we're working on a rectangle and
haven't addressed the aspect ratio. Let's fix that now.
Since the shader has no way of determining the dimensions of the ColorRect to
which it is applied, we need to pass the parameters for calculating the aspect ratio
ourselves. We will do this using another uniform parameter.
shader_type canvas_item;
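
// The declaration itself is cut from this excerpt. It presumably looks something
// like this (the default matching the 600 x 400 ColorRect is my assumption):
uniform vec2 resolution = vec2(600.0, 400.0);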
And we'll use the parameter in the code. To ensure that the horizontal and vertical
lines have the same thickness and the rectangles become squares (in the next phase),
it is necessary to multiply the X-coordinate of the uv vector by the ratio of
resolution.x to resolution.y.
void fragment() {
    vec2 uv = UV - 0.5; // move origin to the center
    uv.x *= resolution.x / resolution.y; // fix the aspect ratio
    vec3 color = step(0.99, draw_grid(uv)) * line_color;
    COLOR = vec4(color, 1.0);
}
The step function is certainly useful, but if we leave the lines with such sharp edges, there could be problems later when we have more of them and apply additional transformations, such as scale or rotation. Instead of step, we'll use smoothstep. It works very similarly, but it takes two threshold values and interpolates smoothly between 0 and 1 across that range instead of jumping abruptly.
void fragment() {
    vec2 uv = UV - 0.5; // move origin to the center
    uv.x *= resolution.x / resolution.y; // fix the aspect ratio
    vec3 color = smoothstep(0.99, 1.0, draw_grid(uv)) * line_color;
    COLOR = vec4(color, 1.0);
}
Now we can see that the result is indeed a bit smoother, and the transition from
black to blue is more gradual. However, zooming in reveals that it's also slightly
asymmetric, which is caused by how the fract function calculates the resulting
value.
It won't be as noticeable with thin lines. However, if we wanted to address this issue,
it can be done using the second method I mentioned at the beginning - instead of
fract, we'll use a trigonometric function, specifically cosine.
But before we do that, let's finally display a real grid instead of two perpendicular
lines. We'll add a new uniform parameter called zoom, which we'll then multiply by
the UV coordinates in the fragment function.
shader_type canvas_item;
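
// The declaration is cut from this excerpt. Based on the hint_range description
// below, it presumably looks like this (the default of 10.0 is my assumption):
uniform float zoom : hint_range(1.0, 50.0, 0.1) = 10.0;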
What does hint_range mean? Using this optional parameter, we tell the Inspector to
display a slider for setting the value with limits of 1.0 and 50.0, and the step size for
dragging with the mouse will be 0.1. This makes it easier for us to modify the
parameters.
Our shader displays nothing at the moment. This is because the range values of the smoothstep function remain constant, causing the lines to become so thin that they are no longer visible. We need to ensure that the resulting line thickness adapts to the current zoom factor. We'll do that like this.
First, we'll add another uniform parameter to define the desired thickness of the
lines.
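The declaration is not shown in this excerpt; a plausible sketch (the range and default value are my assumptions) would be:

uniform float thickness : hint_range(0.0, 10.0, 0.1) = 1.0;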
void fragment() {
    vec2 uv = UV - 0.5; // move origin to the center
    uv.x *= resolution.x / resolution.y; // fix the aspect ratio
    float line_thickness = zoom * thickness / resolution.y;
    vec3 color = smoothstep(1.0 - line_thickness, 1.0, draw_grid(uv * zoom)) * line_color;
    COLOR = vec4(color, 1.0);
}
And we have a beautifully regular grid. If it seems too dark to us, we can change its
color or multiply the result by some additional value, which we'll call brightness
and add as a uniform parameter.
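The declaration is again not reproduced in this excerpt; a plausible sketch (range and default assumed) would be:

uniform float brightness : hint_range(1.0, 10.0, 0.1) = 2.0;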
Great! We've achieved the desired effect. What else could we do with our grid? Let's
try to come up with a simple animation that would perform translation, rotation,
and scaling. I'll start with translation.
void fragment() {
    vec2 uv = UV - 0.5; // move origin to the center
    uv.x *= resolution.x / resolution.y; // fix the aspect ratio
    uv += vec2(sin(TIME) * 0.4, cos(TIME) * 0.6);
    float line_thickness = zoom * thickness / resolution.y;
    vec3 color = smoothstep(1.0 - line_thickness, 1.0, draw_grid(uv * zoom)) * line_color;
    COLOR = vec4(color * brightness, 1.0);
}
This requires a bit of explanation. So, what did I do? First, I added a third line to the
fragment function, the one that contains the previously unused internal variable
TIME. As the name suggests, TIME contains the elapsed time since the last engine
start, making it useful for various animations and other effects that change over
time. By using it as a parameter for the periodic functions sine and cosine, we
achieved smooth and cyclical movement of the grid. Such an effect, of course,
doesn't stand out in a book, but if you try out the code we've written so far yourself,
you'll see what I mean.
void fragment() {
    vec2 uv = UV - 0.5; // move origin to the center
    uv.x *= resolution.x / resolution.y; // fix the aspect ratio
    uv += vec2(sin(TIME) * 0.4, cos(TIME) * 0.6);
    uv = rotate(uv, TIME * 0.1);
    float line_thickness = zoom * thickness / resolution.y;
    vec3 color = smoothstep(1.0 - line_thickness, 1.0, draw_grid(uv * zoom)) * line_color;
    COLOR = vec4(color * brightness, 1.0);
}
The new line is the fourth in sequence. For controlling the rotation, we again use
TIME, this time multiplied by 0.1 to ensure that the rotation is not too fast.
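The rotate helper itself does not appear in this excerpt. A standard 2D rotation function, which is most likely what is used here, looks roughly like this (my sketch):

vec2 rotate(vec2 uv, float angle) {
    float s = sin(angle);
    float c = cos(angle);
    return mat2(vec2(c, -s), vec2(s, c)) * uv;
}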
Finally, scaling. We already have zoom, but it's a uniform parameter, and its value cannot be reassigned inside the shader code. Therefore, we'll comment out the uniform and redefine zoom as a local variable whose value changes periodically.
For reference, here is the complete shader code, including the line for periodic
changes in the zoom factor.
shader_type canvas_item;
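
// Note: the excerpt reproduces only the fragment function of this listing. The
// uniforms and helper functions below restate the versions sketched earlier in this
// section; their defaults are my assumptions, not necessarily the book's values.
uniform vec3 line_color : source_color = vec3(0.0, 0.4, 1.0);
uniform vec2 resolution = vec2(600.0, 400.0);
uniform float thickness : hint_range(0.0, 10.0, 0.1) = 1.0;
uniform float brightness : hint_range(1.0, 10.0, 0.1) = 2.0;
// uniform float zoom : hint_range(1.0, 50.0, 0.1) = 10.0; // commented out, replaced below

vec2 rotate(vec2 uv, float angle) {
    float s = sin(angle);
    float c = cos(angle);
    return mat2(vec2(c, -s), vec2(s, c)) * uv;
}

float draw_grid(vec2 uv) {
    vec2 grid_uv = fract(uv);
    return max(grid_uv.x, grid_uv.y);
}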
void fragment() {
    vec2 uv = UV - 0.5; // move origin to the center
    uv.x *= resolution.x / resolution.y; // fix the aspect ratio
    uv += vec2(sin(TIME) * 0.4, cos(TIME) * 0.6);
    uv = rotate(uv, TIME * 0.1);
    float zoom = abs(sin(TIME * 0.5)) * 40.0;
    float line_thickness = zoom * thickness / resolution.y;
    vec3 color = smoothstep(1.0 - line_thickness, 1.0, draw_grid(uv * zoom)) * line_color;
    COLOR = vec4(color * brightness, 1.0);
}
What did we learn from this example? There's quite a bit, and it's all useful
information that will be practical later on:
● How to define custom functions for cleaner code and separating logical units.
● How to use the fract function for simple periodic changes in other values.
● What uniform parameters are and how to control them from the Inspector
panel.
● Using the max function for calculating values simultaneously in both vertical
and horizontal directions.
● Effectively clamping values using the step and smoothstep functions.
● How to correctly recalculate the aspect ratio to achieve the desired result
regardless of the element's dimensions.
● How and why to replace fract with trigonometric functions for a smoother
result.
● What hint_range is for uniform parameters.
● How to achieve higher brightness through multiplication.
● Formulas for translation, rotation, and scale.
● Using the internal variable TIME for simple animations.
If any step is not entirely clear, I recommend watching the corresponding video,
where I demonstrate the entire process visually. And don’t hesitate to experiment.
There are certainly many ways to further enhance such a grid.
Live cables and basic iterations
I would like to stay with lines a bit longer and focus on them in this example as well. This time, we will try to bend and shorten them a bit. In addition, the entire effect will be nicely animated, so in the end it will resemble a bundle of strings or cables waving in the wind.
Again, we will start with an empty code for our shader, just like in the previous
section. This time, we will immediately insert the code for recalculating the aspect
ratio and shifting the origin of the coordinates to the center, as was described in the
previous example.
shader_type canvas_item;

void fragment() {
    vec2 uv = UV - 0.5; // move origin to the center
    uv.x *= resolution.x / resolution.y; // fix the aspect ratio
    vec3 color = vec3(0.0);
    COLOR = vec4(color, 1.0);
}
Just a reminder - vec3(0.0) is a shorthand notation for a vector that has the same
value in all components, that is, vec3(0.0, 0.0, 0.0). Similarly,
vec4(color, 1.0) means that the first three components will be copied from the
three-dimensional vector color and the fourth is set to 1.0.
Tip: Some programmers use a shortened notation for decimal numbers, such
as 0. (with a decimal point at the end) instead of 0.0. The shading language
supports this syntax. However, I prefer to stick to the full format, which I
find clearer. In any case, if you encounter something like this in other
sources, it is not an error.
Let’s start with a simple straight line that will be displayed horizontally. We'll add
the draw_line function and implement the code for the line. Our effect should
have some color, so first, we'll add a uniform parameter for selecting the color, and
then we'll use it in the code.
shader_type canvas_item;
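
// The uniform declaration and the draw_line function are cut from this excerpt.
// Based on the description that follows, they presumably look something like this
// (the default color and the exact form of the function are assumptions):
uniform vec3 line_color : source_color = vec3(0.0, 1.0, 0.0);

vec3 draw_line(vec2 uv, vec3 color) {
    // abs(uv.y) is 0 on the horizontal center line and grows toward the edges
    return abs(uv.y) * color;
}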
void fragment() {
    vec2 uv = UV - 0.5; // move origin to the center
    uv.x *= resolution.x / resolution.y; // fix the aspect ratio
    vec3 color = vec3(0.0);
    color += draw_line(uv, line_color);
    COLOR = vec4(color, 1.0);
}
}
What has been added to the code? Besides the mentioned uniform parameter
line_color, which we will use to set the color of the entire effect, a function
called draw_line has appeared above the fragment function. This function, just
like the draw_grid function in the previous example, is responsible for calculating
the colors of the pixels that the desired line will contain. Currently, it returns the
absolute value of the Y-coordinate of the current pixel, which, as we know from the
beginning of this chapter, ranges from 0.5 to 0 and back to 0.5. The result would
thus be a horizontal line surrounded by a gray gradient. However, this time we
directly multiply the result by the set color, giving this effect a green tint.
But why do we add the result of this function to the color variable (in the fragment
function) instead of simply assigning it to replace the original vec3(0.0) vector?
The reason is that we plan to call the draw_line function repeatedly in a loop and
accumulate the intermediate results. It is therefore good to prepare for this from the
beginning.
So, we have something, but it's displaying in inverted colors, and the line is too
thick. First, let's make it a bit smoother, which is best done using the smoothstep
function.
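The smoothed version is not reproduced in this excerpt. Based on the description, and the constant of 0.01 mentioned a little further on, draw_line presumably becomes something like this (the lower threshold of 0.0 is my assumption):

vec3 draw_line(vec2 uv, vec3 color) {
    // invert the smoothstep so the line itself is bright and the surroundings are dark
    return (1.0 - smoothstep(0.0, 0.01, abs(uv.y))) * color;
}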
Now we have a nice straight line! Since the upper limit of the smoothstep function
determines the thickness of the line, it will be advantageous to create another
uniform parameter for it.
shader_type canvas_item;
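
// The declaration is cut from this excerpt; presumably something like this
// (the range is my assumption; the 0.05 default is mentioned a little later):
uniform float line_thickness : hint_range(0.0, 0.2, 0.01) = 0.05;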
By the way, we'll probably be using the smoothstep function and its inverse
(subtracting from 1.0) in multiple places in the code. We can simplify this using
preprocessor directives.
shader_type canvas_item;
#define S smoothstep
#define IS(x, y, z) (1.0 - smoothstep(x, y, z))
What are directives? They start with the keyword #define (including the hash
symbol) and allow us to significantly shorten code that we want to repeat frequently.
In the first line, I defined the abbreviation S, which will perform the same operation
as the smoothstep function. In this case, it is sufficient to write it as a short form
without parameters. On the other hand, the second line, where I define the
abbreviation IS, uses a more complex notation (the value of the smoothstep
function is subtracted from 1.0), so it is necessary to specify the input parameters in
parentheses, which I simply named x, y, and z. There are three because the
smoothstep function has three parameters.
Now, every time we want to apply one of these functions, we can use the predefined
shortcut. We'll do this in our draw_line function and use the uniform parameter
line_thickness as well.
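The updated function is not shown in this excerpt; using the IS shortcut and the new uniform parameter, it presumably reads something like:

vec3 draw_line(vec2 uv, vec3 color) {
    return IS(0.0, line_thickness, abs(uv.y)) * color;
}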
The line is now a bit thicker because I set the default value of the line_thickness
parameter to 0.05, whereas the original constant was 0.01. However, the principle of
the algorithm remains unchanged.
Now let's enhance our line. It would certainly look interesting if its thickness varied
based on the horizontal position of the respective pixel. Let's try multiplying
line_thickness by the absolute value of uv.x.
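Again, the code itself is cut from the excerpt; the described change presumably amounts to something like:

vec3 draw_line(vec2 uv, vec3 color) {
    return IS(0.0, line_thickness * abs(uv.x), abs(uv.y)) * color;
}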
Excellent, the foundation of our effect is ready. Now we just need to add a bit of
ripple to our line, introduce more lines with different parameters, and animate the
result. Let's start with the rippling, for which a naturally wavy trigonometric sine
function works well. Let’s do it by shifting the Y coordinate of the current fragment.
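The excerpt ends before this code appears, but the described shift would look roughly like the following line placed inside the fragment function before the draw_line call (the frequency of 8.0 and the amplitude of 0.1 are my own placeholder values):

uv.y += sin(uv.x * 8.0 + TIME) * 0.1;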
You have reached the end of the excerpt from the book "Shaders in Godot 4: Add
stunning visual effects to your games". I hope that what you have read so far has
piqued your interest, and you would like to learn more about the development of
shaders in the Godot Engine.