about 5 years ago - /u/Veegie

That would be cool! Currently that isn't on the trivial side of things to implement, due to aspects¹ of the way the shader system works. I think if we were to cut that deep into the code right now, we would prioritize some of the bigger fish to fry. But I honestly do appreciate hearing your feedback (I think I saw someone else post this the other day, too) about things you'd like to see for your Guardian some day.

Thank you!

¹ Technical background, since a few people requested it in my last post:

Shaders currently blend into the base gear pixel shader using an array of float4s and two RGBA textures. Because of this, shaders must be fixed function: they cannot change the way they blend into the base without introducing potentially costly (performance-wise) dynamic branches in the pixel shader. On current platforms that branch would probably be fairly coherent, so the cost may not be too bad. Until we do that work, though, shaders are restricted to CPU-bytecode-driven, time-based animation of parameters (panning textures, parameters oscillating between constant values).

The more challenging thing is that we currently achieve the Taken gradient effect by storing the relative object-space position of the meshes in their vertex information when we compile the bits for disk. That isn't currently supported for player-customizable gear: because the items are customizable, we don't know where the final relative vertex position will land on the player character, due to the ordering in which we compile our assets. This is more complicated than something like a Vex Hobgoblin, where the vertices form one single 'object' that is compiled all at once in its totality.
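A rough sketch of the bake step described above, with made-up struct layout and names (the real vertex format and tooling are of course different):

```c
#include <float.h>

/* Illustrative sketch only. At asset-compile time, each vertex gets a
   baked 0..1 coordinate derived from its relative object-space
   position; the Taken gradient shader reads that value at runtime. */

typedef struct {
    float x, y, z;   /* object-space position */
    float gradient;  /* baked 0..1 Taken-gradient coordinate */
} Vertex;

/* This works for a monolithic object (e.g. a Vex Hobgoblin) because
   all of its vertices are compiled together, so the full extents of
   the object are known at bake time. */
void bake_gradient(Vertex *v, int n)
{
    float lo = FLT_MAX, hi = -FLT_MAX;
    for (int i = 0; i < n; ++i) {
        if (v[i].y < lo) lo = v[i].y;
        if (v[i].y > hi) hi = v[i].y;
    }
    float range = (hi > lo) ? (hi - lo) : 1.0f;
    for (int i = 0; i < n; ++i)
        v[i].gradient = (v[i].y - lo) / range;
}
```

For customizable gear, each armor piece would run this bake independently, so `lo`/`hi` over the final assembled character are unknown at compile time and the per-piece gradients wouldn't line up across the whole Guardian.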

Like anything in software, there are several other alternatives we could pursue when the time comes, but I thought it would be worth mentioning that it's not just a matter of "use the existing Taken shader". I think it would be cool, too!