About a week ago, someone posted a comment on this old post asking if I could update the ripple effect so that it would work with Unity’s Lightweight Render Pipeline and 2D lighting. This seemed like a good opportunity not only to familiarize myself with the LWRP and the post processing stack, but also to clean up some pretty ugly and ancient code.
The final result can be seen here. The red and yellow lights drifting around are point lights being reflected off a Sprite instance (i.e. 2D lighting). There is also a built-in Unity Post Processing vignette effect applied. And by clicking on the game screen, you can trigger the rubbery ripple effect, which is a custom post effect similar to the one from the previous post.
To create that effect, you have to start by setting up a project. I won’t go into a lot of detail on project setup, but basically: create a new project and, using the Package Manager, install the Lightweight RP package (this will automatically bring in the Core RP Library and Post Processing packages as well). Create a Lightweight Render Pipeline asset and add it to your Project Settings in the Graphics tab (we’ll also need to add our shader there, but we’ll come back to that in a bit). Add a sprite to your scene (I just used a picture of a castle I took). Create a material for that sprite (I named mine ‘CastleSpriteDiffuse’) and choose Lightweight Render Pipeline/Simple Lit as the shader source. Set the Base Map of the material to the same texture as your sprite and apply the material to your sprite.
Now you can add a couple Point Lights to your scene and see that they light up the Sprite instance. Nice.
Just to add a bit of motion to an otherwise boring test project, I added this simple script to each point light:
using UnityEngine;

public class AnimatedLight : MonoBehaviour
{
    [SerializeField] private float maxX = 10f;
    [SerializeField] private float maxY = 10f;

    private float speedX, speedY, angleX, angleY;

    private void Start()
    {
        speedX = Random.Range(-5, 5);
        speedY = Random.Range(-5, 5);
        angleX = 0f;
        angleY = 0f;
    }

    private void Update()
    {
        var pos = transform.position;
        pos.x = Mathf.Sin(angleX) * maxX;
        pos.y = Mathf.Cos(angleY) * maxY;
        transform.position = pos;
        angleX += speedX * Mathf.Deg2Rad;
        angleY += speedY * Mathf.Deg2Rad;
    }
}
Not really a necessary thing, but I liked it.
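The motion those lights trace is just a sine/cosine pair advanced by a per-frame angular speed. As a sanity check, here is that math replayed in plain Python (the frame count, speed, and extents are hypothetical values, not anything from the Unity scene). One small gotcha worth noting: Random.Range(-5, 5) in the C# script hits the integer overload, so the speeds are whole degrees per frame and 5 itself is excluded.

```python
import math

DEG2RAD = math.pi / 180.0  # same constant as Mathf.Deg2Rad

def light_pos(frame, speed_deg, max_x, max_y):
    """Position the light would reach by a given frame, assuming both
    angles start at zero and advance at the same speed (a simplification
    of the Unity script, where X and Y speeds are randomized)."""
    ang = frame * speed_deg * DEG2RAD
    return math.sin(ang) * max_x, math.cos(ang) * max_y

# At 3 degrees/frame, frame 30 is a quarter turn: the light sits at (maxX, 0).
x, y = light_pos(30, 3, 10.0, 10.0)
print(round(x, 6), round(y, 6))  # 10.0 0.0
```

Because the angle advances a fixed amount per Update() call rather than per second, the motion is frame-rate dependent, which is fine for a throwaway test scene.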
Finally we can start looking at the fun stuff – the post processing.
For performance, Unity’s post processing stack is applied to a single layer. So add a new layer for post effects (I just called mine PostFX). Now here’s a little gotcha that took me a while to figure out: the camera instance which contains the post processing components must itself be on that post processing layer. So, in the Hierarchy window, select your camera and, in the Inspector window, set its layer to PostFX (or whatever you called your new layer). While the camera is still selected, add a Post Process Layer component, make sure the Trigger is set to the camera instance, and set the Layer property to the PostFX layer. Now add a Post Process Volume component and click the New button to create a new Post Processing profile asset (or create one manually using the context menu in the Project window). Finally, add an effect, select Unity/Vignette, and play around with the settings to get something to taste.
At this point, if you run the project in the editor, you should see something that looks like this:
If all has gone well up to this point, now we can start working on a custom shader that can be added to the post processing stack.
One interesting (or annoying, depending on your point of view) thing about Unity’s render pipelines is that shaders need to be written in HLSL rather than CG. If you’re using Shader Graph, this is handled automatically, but if you’re writing shaders by hand, it’s something you’ll have to take into account. Thankfully, HLSL and CG are similar enough that there is practically no learning curve involved, but the setup of the shader is different. All this means that Unity’s Create/Shader/Image Effect Shader editor asset shortcut is useless here. So let’s start by creating a simple replacement. In your Assets directory, create a new folder named Templates. In that Templates directory create a new .txt file named ‘HLSLPostProcessingTemplate.txt’. Open that .txt file and add this:
Shader "Hidden/NewPostProcessingShader"
{
    HLSLINCLUDE

    #include "Packages/com.unity.postprocessing/PostProcessing/Shaders/StdLib.hlsl"

    TEXTURE2D_SAMPLER2D(_MainTex, sampler_MainTex);

    half4 Frag(VaryingsDefault i) : SV_Target
    {
        half4 col = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, i.texcoord);
        // just invert the colors
        col.rgb = 1 - col.rgb;
        return col;
    }

    ENDHLSL

    SubShader
    {
        Cull Off ZWrite Off ZTest Always
        Pass
        {
            HLSLPROGRAM
            #pragma vertex VertDefault
            #pragma fragment Frag
            ENDHLSL
        }
    }
}
I won’t go over that in too much depth; it’s very similar to Unity’s Create/Shader/Image Effect Shader template, but in HLSL rather than CG. Something interesting to keep in mind: Unity has defined macros for declaring a Sampler2D property (TEXTURE2D_SAMPLER2D(_MainTex, sampler_MainTex);) and for actually sampling it (SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, i.texcoord);). Also, HLSL does not support the fixed data type, so you’re limited to floats and halfs. Other than that, writing HLSL should be pretty straightforward. I do strongly recommend browsing through the Packages/com.unity.postprocessing/PostProcessing/Shaders/StdLib.hlsl file, though, just to get an idea of what other tidbits you’re including in each HLSL shader.
Of course, at this point, we’ve only created a .txt file, not an actual shader, so let’s fix that. Create an Editor directory and add a script named HLSLTemplateCreator.cs which looks like this:
using System.IO;
using UnityEditor;

public class HLSLTemplateCreator
{
    private static string PostProcessingTemplatePath = "Assets/Templates/HLSLPostProcessingTemplate.txt";

    [MenuItem("Assets/Create/Shader/New PostProcessing Shader")]
    private static void CreatePostProcessingHLSLTemplate()
    {
        // Resolve the directory of the current selection, falling back to Assets/
        var path = AssetDatabase.GetAssetPath(Selection.activeObject);
        if (File.Exists(path))
            path = Path.GetDirectoryName(path);
        if (string.IsNullOrEmpty(path))
            path = "Assets/";
        File.Copy(PostProcessingTemplatePath, Path.Combine(path, "NewPostProcessingShader.shader"));
        AssetDatabase.Refresh();
    }
}
Once Unity has spent half the day compiling that in the editor, you can use the asset creation context menu to go Create/Shader/New PostProcessing Shader. So create a Shaders folder and go ahead and do that. You should now have a NewPostProcessingShader.shader file in that Shaders folder that looks like the template we added to the .txt file above. Rename that new shader to RippleShader, open it up, and alter it so it looks like this:
Shader "Hidden/OnebyoneDesign/PostFX/Ripple"
{
    HLSLINCLUDE

    #include "Packages/com.unity.postprocessing/PostProcessing/Shaders/StdLib.hlsl"

    TEXTURE2D_SAMPLER2D(_MainTex, sampler_MainTex);

    half _CenterX;
    half _CenterY;
    half _Amount;
    half _WaveSpeed;
    half _WaveAmount;

    half4 Frag(VaryingsDefault i) : SV_Target
    {
        half2 center = half2(_CenterX, _CenterY);
        half time = _Time.y * _WaveSpeed;
        half amt = _Amount / _ScreenParams.x;

        half2 uv = center.xy - i.texcoord;
        uv.x *= _ScreenParams.x / _ScreenParams.y;

        half dist = sqrt(dot(uv, uv));
        half ang = dist * _WaveAmount - time;
        uv = i.texcoord + normalize(uv) * sin(ang) * amt;

        return SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, uv);
    }

    ENDHLSL

    SubShader
    {
        Cull Off ZWrite Off ZTest Always
        Pass
        {
            HLSLPROGRAM
            #pragma vertex VertDefault
            #pragma fragment Frag
            ENDHLSL
        }
    }
}
I won’t bother going over that – it’s pretty much the same as the shader in the old post – but now in HLSL form. Note that I updated not only the code, but also the name of the shader at the top of the file. That’s very important coming up.
In fact, you can go ahead and copy that shader name, Hidden/OnebyoneDesign/PostFX/Ripple, go back to your Project Settings/Graphics menu, and add it to the collection of Always Included Shaders. This ensures the shader is always included in a build and available for programmatic instantiation after compiling and deploying your project.
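To get a feel for what Frag is actually doing, it can help to replay its math on the CPU for a single fragment. Here is a hedged Python transcription of the displacement (the 1920×1080 screen size is a placeholder for _ScreenParams, and the zero-distance guard is my addition to avoid normalizing a zero vector at the exact center):

```python
import math

def ripple_uv(texcoord, center, time, amount, wave_speed, wave_amount,
              screen_w=1920.0, screen_h=1080.0):
    """Return the displaced sample coordinate for one fragment,
    mirroring the Frag() math from the shader above."""
    t = time * wave_speed
    amt = amount / screen_w                # displacement in UV units
    dx = center[0] - texcoord[0]
    dy = center[1] - texcoord[1]
    dx *= screen_w / screen_h              # aspect-ratio correction
    dist = math.hypot(dx, dy)              # distance from the ripple center
    if dist == 0.0:                        # guard: normalize(0) is undefined
        return texcoord
    ang = dist * wave_amount - t           # rings move outward over time
    nx, ny = dx / dist, dy / dist          # direction toward the center
    offset = math.sin(ang) * amt
    return (texcoord[0] + nx * offset, texcoord[1] + ny * offset)

# With amount 0 the texture is sampled where it was; otherwise each
# fragment is pushed at most amount/screen_w along the radial direction.
print(ripple_uv((0.25, 0.25), (0.5, 0.5), 1.0, 0.0, 10.0, 20.0))
```

The key design point this exposes: because ang grows with distance from the center and shrinks with time, the sin(ang) term forms concentric rings that travel outward, and _Amount (divided by the screen width) caps how far any pixel's sample point can be displaced.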
Now, in order to use this custom shader in the post processing stack, we have to create a C# editor wrapper for it. In your Shaders folder (or someplace else, if you prefer) create a new C# script named Ripple.cs that looks like this:
using System;
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

[Serializable, PostProcess(typeof(RippleRenderer), PostProcessEvent.AfterStack, "OnebyoneDesign/Ripple")]
public sealed class Ripple : PostProcessEffectSettings
{
    [Range(0, 1), Tooltip("Horizontal center of effect")]
    public FloatParameter CenterX = new FloatParameter { value = 0.5f };

    [Range(0, 1), Tooltip("Vertical center of effect")]
    public FloatParameter CenterY = new FloatParameter { value = 0.5f };

    [Tooltip("Amount/Strength of effect")]
    public FloatParameter Amount = new FloatParameter { value = 10f };

    [Tooltip("Speed of ripple waves")]
    public FloatParameter WaveSpeed = new FloatParameter { value = 10f };

    [Range(0, 50), Tooltip("Amount of waves")]
    public FloatParameter WaveAmount = new FloatParameter { value = 20f };
}

public sealed class RippleRenderer : PostProcessEffectRenderer<Ripple>
{
    public override void Render(PostProcessRenderContext context)
    {
        PropertySheet sheet = context.propertySheets.Get(Shader.Find("Hidden/OnebyoneDesign/PostFX/Ripple"));
        sheet.properties.SetFloat("_CenterX", settings.CenterX);
        sheet.properties.SetFloat("_CenterY", settings.CenterY);
        sheet.properties.SetFloat("_Amount", settings.Amount);
        sheet.properties.SetFloat("_WaveSpeed", settings.WaveSpeed);
        sheet.properties.SetFloat("_WaveAmount", settings.WaveAmount);
        context.command.BlitFullscreenTriangle(context.source, context.destination, sheet, 0);
    }
}
That should be fairly self-explanatory. There’s just a settings object containing all the settings/properties of the shader, which is then referenced in a Render event handler to actually update the shader’s properties. I do recommend reading the documentation to learn about PostProcessEvent and friends, but this will do the trick for now. Notice that, once again, the PropertySheet instance gets a reference to the shader by its name – hence the importance of that shader name.
At this point, you can go all the way back to the camera instance in the scene, and in the Post Process Volume component, click ‘Add Effect’ and select OnebyoneDesign/Ripple. Give it a go and play around with the settings. You’ll need to play the game in the Unity Editor to get the full animated effect.
Once you’ve seen it and have an idea of how the shader properties affect the post effect, go ahead and remove that effect from the Post Process Volume component – we’ll now create and add the shader in code.
What’s cool about the PostProcessEffectSettings class (which our Ripple settings class inherits from) is that it inherits from ScriptableObject, so we can easily instantiate it in code with ScriptableObject.CreateInstance(). So in a new C# script named RippleController.cs, add this:
using System.Collections;
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

public class RippleController : MonoBehaviour
{
    [SerializeField] private float maxAmount = 25f;
    [SerializeField] private float friction = .95f;

    private Coroutine rippleRoutine;
    private Ripple ripple;
    private PostProcessVolume rippleVolume;

    private void Start()
    {
        ripple = ScriptableObject.CreateInstance<Ripple>();
        ripple.enabled.Override(false);
        ripple.Amount.Override(0f);
        ripple.WaveAmount.Override(10f);
        ripple.WaveSpeed.Override(15f);
        rippleVolume = PostProcessManager.instance.QuickVolume(gameObject.layer, 100f, ripple);
    }

    private void OnDestroy()
    {
        StopAllCoroutines();
        RuntimeUtilities.DestroyVolume(rippleVolume, true, true);
    }

    private void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            Vector2 position = Input.mousePosition;
            if (rippleRoutine != null)
                StopCoroutine(rippleRoutine);
            ripple.CenterX.Override(position.x / Screen.width);
            ripple.CenterY.Override(position.y / Screen.height);
            rippleRoutine = StartCoroutine(DoRipple());
        }
    }

    private IEnumerator DoRipple()
    {
        ripple.enabled.Override(true);
        float amount = maxAmount;
        while (amount > .5f)
        {
            ripple.Amount.value = amount;
            amount *= friction;
            yield return null;
        }
        ripple.enabled.Override(false);
    }
}
That controller creates an instance of the Ripple effect in Start() and, very importantly, disposes of it in OnDestroy(). By using a Coroutine to perform the wave animation, it is easy to enable the ripple only when necessary and disable it when it is no longer needed, which saves on processing.
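One nice property of the friction-based decay in DoRipple() is that the effect’s lifetime is easy to predict: the amount shrinks geometrically each frame, so the coroutine runs for roughly log(threshold/maxAmount)/log(friction) frames. Here is a quick Python check using the controller’s default values (maxAmount = 25, friction = 0.95, and the hard-coded 0.5 cutoff):

```python
# Sketch of the DoRipple() decay loop: the amount is multiplied by
# friction once per frame until it drops to the 0.5 cutoff.
def ripple_frames(max_amount=25.0, friction=0.95, threshold=0.5):
    frames = 0
    amount = max_amount
    while amount > threshold:
        amount *= friction
        frames += 1
    return frames

# Matches the closed form ceil(log(0.5 / 25) / log(0.95)).
print(ripple_frames())  # 77 frames, i.e. roughly 1.3 s at 60 fps
```

So tightening friction toward 1 stretches the ripple out, while lowering it snaps the effect shut faster; the coroutine length also depends on frame rate, since the decay runs once per rendered frame rather than per second.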
And that is pretty much it. I attached that component to my Sprite instance in the scene, which finally gives you the effect in the example here.
If you’re interested, the entire project is available on Github to download or clone and play around with. Enjoy.