The new custom sequential parameters open up some interesting possibilities for storing structured data.
I’m curious whether there’s any guidance on their resource costs when used at large scale, compared to DAT tables.
For example, within each RayTK operator, there are tables that define things like the parameters that get imported into the shaders, textures that the operator depends on, etc.
The tables themselves are generally small (up to 8 columns, maybe 10 rows max), but there can be a lot of them. If each RayTK operator has 2-5 of those, a scene with 15 operators can have up to 75 tables.
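To make the scale concrete, here's a rough plain-Python sketch of what one such per-operator table holds, plus the table-count math from above. The column names are invented for illustration; RayTK's actual schemas differ.

```python
# Hypothetical sketch of one per-operator definition table as plain rows
# of strings, the way DAT cells store everything as text.
param_table = [
    # name,    type,    default
    ["radius", "float", "1.0"],
    ["center", "vec3",  "0 0 0"],
]

# Rough upper bound on table count for the scene described above.
ops_in_scene = 15
tables_per_op = 5  # upper end of the 2-5 range
max_tables = ops_in_scene * tables_per_op  # up to 75 small tables
```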
Would it be costly to replace those with sequential parameters on base COMPs? For example, each sequence block would contain the name of a compiler macro along with a value, an enable toggle with an expression, etc.
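As a sketch of what I mean, here's roughly the data one sequence block would carry, modeled in plain Python rather than actual TouchDesigner parameter access. The field names and macro names are hypothetical, just to show the shape of the replacement.

```python
from dataclasses import dataclass

@dataclass
class MacroBlock:
    # Hypothetical fields mirroring one sequence block:
    # a compiler macro name, its value, and an enable expression.
    name: str
    value: str
    enable_expr: str = "True"

# One operator's macro table, expressed as sequence blocks instead of DAT rows.
blocks = [
    MacroBlock("THIS_HAS_TEXTURE", "1", "me.par.Usetexture"),
    MacroBlock("THIS_COORD_TYPE", "vec3"),
]

# Roughly what shader compilation would read back out of the blocks.
defines = [f"#define {b.name} {b.value}" for b in blocks]
```

The upside would be typed parameters (toggles, menus, expressions) instead of free-text cells, at the cost of whatever overhead each parameter adds over a table cell.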
It would be nice to be able to use typed parameters instead of text in table cells, but I’m concerned that it might add a lot of overhead. And it’ll be difficult to do performance testing without first converting tons of toolkit infrastructure to use the new approach.