TitleGlitch.JPG

HLSL Shader

This is a Work In Progress!

This page represents a work in progress! However, I enjoy posting my progress, so have a look around. If you have any questions or thoughts, feel free to reach out!

Goal

The main goal of this project was to better familiarize myself with HLSL code. I decided to pick something that I felt would be impossible, or at least really difficult, to achieve using the regular shading network.


I was playing Cyberpunk 2077 and was inspired to do some form of glitch shader. It was the glitch that occurs on Johnny Silverhand that initially sent me down the rabbit hole of glitch art in general. Below are two images that I used as my main inspiration during this project. It was these "streak"-based glitches that really inspired my final look. I love how these two pieces managed to make the glitches feel physical, as if they were being affected by the lighting.

I knew I could achieve a similar effect in Houdini, but I wanted to see if I could fake the lighting aspect and achieve the whole thing in a shader.

Cyberpunk-2077-Johnny2.jpg
Cyberpunk-2077-Johnny.jpg
Facevinyl on Twitter.jfif
The Glitch Art of Giacomo Carmagnola.jfif

Learning HLSL

HLSL in Unreal

The goal of this project was to learn and better understand HLSL. While the language seemed quite familiar to me, I found that I ran into more limitations than I was expecting, although most of these had more to do with the way UE4 implements the Custom node than with the HLSL language itself. One of the main struggles I ran into, early on in the process, was realizing that you cannot define sub-functions. This is because the Custom node in UE4 is treated as a function in and of itself.

Creating the Common Node

Helper operations like noise, random, and remapping variables were used almost constantly throughout this glitch shader. Using the out-of-the-box implementation of the UE4 Custom node left my code unwieldy and muddy. I came across a technique on the forums that discussed creating what they referred to as a Common node. This node is stored within a dummy Material Function and is written in a bit of a bizarre syntax.

inCommon_Outer.JPG
inCommon_Inner.JPG

Material Function which, once linked, allowed access to the Common functions

Setup of the Common Function

Once implemented, the inCommon Material Function allowed me to store frequently used functions in a different location and just call out to them from my main loop. Moving them over to Common greatly improved both my coding efficiency and the readability of the code.
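As a rough sketch of the "bizarre syntax" involved (the helper names and constants below are my own placeholders, not the exact contents of my inCommon node): UE4 pastes a Custom node's text into the body of a generated wrapper function, so returning early closes that wrapper and lets you declare free functions of your own. The final stub is left unclosed on purpose, because UE4 appends the closing brace itself.

```hlsl
// Sketch of the Common node trick, assuming UE4 wraps the Custom node's
// text roughly as:  MaterialFloat CustomExpression0(...) { <your code> }
// This return statement closes the generated wrapper function early...
return 1;
}

// ...which lets us declare free helper functions after it:
float Remap(float v, float inLo, float inHi, float outLo, float outHi)
{
    return outLo + (v - inLo) * (outHi - outLo) / (inHi - inLo);
}

// Cheap hash-style random value from a 2D coordinate
float Random2D(float2 uv)
{
    return frac(sin(dot(uv, float2(12.9898, 78.233))) * 43758.5453);
}

// Left deliberately unclosed: UE4 appends the final "}" for us
float CommonDummy(float x)
{
    return x;
```

With the dummy Material Function wired into the main Custom node, its code is emitted earlier in the translated shader, so the main node can simply call Remap(...) or Random2D(...) directly.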

Method

Synopsis / How Does It Work

The short synopsis: the material first finds valid starting positions, then creates stripes of color based on those starting positions, and finally stacks layers of those stripes.

Starting Positions

The system currently works by determining an area in which to create its starting positions. In the past I created random positions within a region of the screen, but ultimately I found this quite difficult to control. From an artistic perspective, while a lot of my reference "feels" random, I learned that it actually follows very clear rules, and it is those rules that make it aesthetically pleasing. As a result I abandoned the idea of true randomness and instead placed slightly randomized starting positions along a user-defined path. This path was defined by two locations in 3D space.
 

I started off just using two boxes placed in the scene to define the two points I wanted the effect to live between. I then used the 3D-to-2D screen space node in Blueprints to take in those two locations and return their screen UV coordinates, and piped those two UV values into a material instance. Each stripe is then a lerp between those two points.
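In shader terms, each starting position boils down to a jittered lerp between the two piped-in UVs. A minimal sketch, assuming PointA/PointB are the material-instance parameters fed from the Blueprint, and StripeIndex, NumStripes, Jitter, and Random2D are placeholder names of my own:

```hlsl
// PointA, PointB: screen-space UVs piped in from the Blueprint.
// StripeIndex, NumStripes, Jitter: assumed parameters for this sketch.
float t = StripeIndex / (float)NumStripes;                 // even spacing along the path
t += (Random2D(float2(StripeIndex, 0.0)) - 0.5) * Jitter;  // slight randomness per stripe
float2 startPos = lerp(PointA, PointB, saturate(t));       // position on the path
```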
 

Once this was working, I moved on to creating sockets in the MetaHuman mesh (which I will discuss later). This allowed the effect to "stick" to the MetaHuman once it was animated. The other advantage of using sockets is that they can easily be called in Blueprints, so I could pipe the result directly into the same system I had created earlier.
 

Streaks

The streaks, as I called them, are created by first placing a thin rectangular box around the starting position and sampling the screen texture at that position. The box then moves down but continues to sample the first region. By repeating this process we are left with a streak that carries the same color value down its entire length.
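A minimal sketch of that sampling pattern, assuming a post-process material where scene-texture index 14 is PostProcessInput0, and where startPos, Width, and Length are placeholder inputs of my own naming:

```hlsl
// Every pixel inside a thin box hanging below startPos samples the scene
// color at the top of the box, smearing that one row of color downward.
float2 d = UV - startPos;
bool inStreak = abs(d.x) < Width * 0.5 && d.y > 0.0 && d.y < Length;

float3 sceneCol  = SceneTextureLookup(UV, 14, false).rgb;                       // untouched pixels
float3 streakCol = SceneTextureLookup(float2(UV.x, startPos.y), 14, false).rgb; // color at the box's top
return inStreak ? float4(streakCol, 1.0) : float4(sceneCol, 1.0);
```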

 

The next step was to add the shading. I discovered that a power-gradient falloff looked the most natural, since that matches how light normally falls off.
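Concretely, that shading is a single pow applied to a 0-1 gradient running down the streak. A sketch, where startPos, Length, streakCol, and FalloffExp are assumed names for this illustration rather than my exact code:

```hlsl
// Power-gradient falloff: bright at the top of the streak, darkening
// toward its end. startPos, Length, FalloffExp are assumed parameters.
float distDown = UV.y - startPos.y;              // how far down the streak this pixel is
float fade = 1.0 - saturate(distDown / Length);  // 1 at the top, 0 at the bottom
float3 shadedCol = streakCol * pow(fade, FalloffExp);
```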

Layers

The next step was to add multiple layers of these streaks, so I made sure each new layer appended exactly to the layer above it, and that each subsequent layer was darker than the one before it.
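The layering can be sketched as a loop that stacks streak segments end to end, darkening each one. All names here are assumptions for the sketch, not my exact implementation:

```hlsl
// Stack NumLayers streak segments end to end, each darker than the last.
// LayerDarken is an assumed brightness multiplier in the 0-1 range.
float3 col = SceneTextureLookup(UV, 14, false).rgb;
for (int i = 0; i < NumLayers; i++)
{
    float layerTop = startPos.y + i * Length;     // append to the layer above it
    if (UV.y > layerTop && UV.y < layerTop + Length
        && abs(UV.x - startPos.x) < Width * 0.5)
    {
        float3 streakCol = SceneTextureLookup(float2(UV.x, startPos.y), 14, false).rgb;
        col = streakCol * pow(LayerDarken, (float)i);  // each layer darker than the last
    }
}
return float4(col, 1.0);
```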

Meta Humans

MetaHumans are highly performant out of the box. This allowed me to quickly iterate on the glitch shader while still having something underneath with enough detail for me to see where it might be breaking down.

Mocap

Using the new LiveLink editor inside of UE4, I was able to connect my iPhone as an input device. The iPhone uses Apple's ARKit to track facial animation, which can be previewed live on a MetaHuman. This greatly sped up my process, as it allowed me to record quick animation without the need to hand-animate anything. MetaHumans also have very nice blend shapes, especially around the forehead and mouth lines, that I found gave very appealing results.

Skeletal Mesh Update

UE4 allows for very easy editing of rigs right in the editor. Additionally, it has the notion of a "Socket," which can be attached to a bone with an offset. These sockets are easily callable from Blueprints and track with the facial animation.

Socket_Heirarchy.JPG

Adding sockets to the rig helps the "glitch" stick to the character. They also give Blueprints an easily callable object.

Socket_Location.JPG

Additionally, editing the master skeleton rig for the project's MetaHuman means any MetaHumans that are added later also get the sockets.

Reference Images: