In the third week of work I began preparing a set of triggers driven by input that would parameterise the animation state-machine. For example, when the character moves, the state is set to movement, with a speed-based parameter that the animators will use to blend between the walk and run cycle animations.
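As a rough sketch of how this glue code can look (the parameter and trigger names here, “Speed” and “Jump”, are placeholders rather than the final naming scheme we will settle on with the animators):

```csharp
using UnityEngine;

// Minimal sketch of feeding player input into the Animator's state machine.
// Parameter/trigger names ("Speed", "Jump") are illustrative placeholders.
public class CharacterAnimationDriver : MonoBehaviour
{
    [SerializeField] private Animator animator;
    [SerializeField] private float runSpeed = 6f;

    private void Update()
    {
        // Read movement input and derive a speed value for the blend tree.
        Vector2 move = new Vector2(Input.GetAxis("Horizontal"), Input.GetAxis("Vertical"));
        float speed = move.magnitude * runSpeed;

        // The "Speed" float drives the walk/run blend within the movement state.
        animator.SetFloat("Speed", speed);

        // One-off actions are fired as triggers.
        if (Input.GetButtonDown("Jump"))
        {
            animator.SetTrigger("Jump");
        }
    }
}
```

Keeping this mapping in one place should mean the animators can re-tune the blend tree in the Animator without gameplay code needing to change.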
I also briefly mentioned my experience with writing shaders, and how we could utilise this to accomplish some ‘graphical wins’ that would not take time away from the animators. One suggestion was that I look into the technique used to simulate the GPU-accelerated ‘fur’ shown in Ratchet & Clank. After looking into the technique, known as ‘concentric fur shells’, I followed a mixture of research papers and forum descriptions to create a similar effect in a Unity shader. I used 20 shells in my shader as I found this still performed well on varied hardware, but I may add an option to turn the fur off later if performance dips.
The technique I used to achieve this combines several simple features -
- A sub-shader simply scales the model by a factor. This sub-shader runs 20 times, creating 20 ‘shells’ of the base model.
- The alpha level of each shell is passed through a ‘fur mask’, an asset combining Perlin noise and finer granular noise to give the noisy appearance of individual hairs, with each extruded shell essentially forming part of the length of a hair.
- Several improvements were also added: a deformation along a vector to give the appearance of gravity, along with settings to alter the alpha cutoff at the base and tip of each hair strand (i.e. strands start thicker at the base and taper towards the tip).
The sub-shader looks a little like this (with some removals to prevent copy-pasting of this code)
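In rough outline, a single shell pass ends up looking something like the sketch below (property names such as `_FurMask` and `_FurLength`, and the `SHELL_INDEX` macro, are illustrative rather than the exact names used in the project):

```shaderlab
// Illustrative sketch of one 'shell' pass.
Pass
{
    CGPROGRAM
    #pragma vertex vert
    #pragma fragment frag
    #include "UnityCG.cginc"

    // SHELL_INDEX is the normalised position of this shell (0 = base, 1 = tip);
    // each pass in the main shader defines its own value.
    #ifndef SHELL_INDEX
    #define SHELL_INDEX 0.5
    #endif

    sampler2D _MainTex;
    sampler2D _FurMask;   // noise mask giving the appearance of individual hairs
    float _FurLength;     // total hair length
    float4 _Gravity;      // xyz = direction, w = strength
    float _BaseCutoff;    // alpha cutoff at the root of the hair
    float _TipCutoff;     // alpha cutoff at the tip of the hair

    struct appdata { float4 vertex : POSITION; float3 normal : NORMAL; float2 uv : TEXCOORD0; };
    struct v2f     { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

    v2f vert(appdata v)
    {
        v2f o;
        // Push each vertex outwards to form this shell, bending the outer
        // shells along the gravity vector so the fur appears to droop.
        float3 offset = v.normal * _FurLength * SHELL_INDEX
                      + _Gravity.xyz * _Gravity.w * pow(SHELL_INDEX, 2);
        v.vertex.xyz += offset;
        o.pos = UnityObjectToClipPos(v.vertex);
        o.uv = v.uv;
        return o;
    }

    fixed4 frag(v2f i) : SV_Target
    {
        fixed4 col = tex2D(_MainTex, i.uv);
        fixed mask = tex2D(_FurMask, i.uv).r;
        // The cutoff rises from base to tip so each strand tapers towards its end.
        float cutoff = lerp(_BaseCutoff, _TipCutoff, SHELL_INDEX);
        clip(mask - cutoff);
        return col;
    }
    ENDCG
}
```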
And in the main shader this is simply run multiple times, like so -
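Again in rough outline, with the per-shell code pulled out into an include file (the file name `FurShellPass.cginc` is illustrative), the main shader repeats the pass with an increasing shell index, 20 times in total for 20 shells:

```shaderlab
// Sketch of the enclosing shader - property and file names are illustrative.
Shader "Custom/FurShells"
{
    Properties
    {
        _MainTex ("Albedo", 2D) = "white" {}
        _FurMask ("Fur Mask", 2D) = "white" {}
        _FurLength ("Fur Length", Range(0, 0.2)) = 0.05
        _Gravity ("Gravity (xyz dir, w strength)", Vector) = (0, -1, 0, 0.02)
        _BaseCutoff ("Base Cutoff", Range(0, 1)) = 0.1
        _TipCutoff ("Tip Cutoff", Range(0, 1)) = 0.9
    }
    SubShader
    {
        // An opaque base pass drawing the skin would sit here, followed by
        // one pass per shell, each pulling in the same CG body as above
        // with a higher normalised shell index.
        Pass
        {
            CGPROGRAM
            #define SHELL_INDEX 0.05
            #include "FurShellPass.cginc" // the CGPROGRAM body sketched above
            ENDCG
        }
        Pass
        {
            CGPROGRAM
            #define SHELL_INDEX 0.10
            #include "FurShellPass.cginc"
            ENDCG
        }
        // ... and so on, up to SHELL_INDEX 1.0 for the outermost shell.
    }
    Fallback "Diffuse"
}
```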
The result looks a little like this -
Currently, the team dynamic still feels like there is strong support for the idea, and we are working towards building better synergy between the various specialisations. I believe this will be greatly aided when a more fleshed-out GDD is distributed to the team, but for the time being the artists are building up a repository of reference material. At the moment, the programmers are understandably the only specialisation who can directly work in-engine on components of the final product, albeit using a temporary sandbox environment.