Substance is really well integrated in Touch, especially with the PBR MAT.
But it's missing the input image, which is one of Substance's strengths.
Of course it is not for live feeds, but for still images (generated by Touch or captured by a camera).
Thanks for any advice on whether it's on the roadmap.
Can you explain what you mean by input image? You can override any texture in a Substance Material by specifying it directly in the PBR MAT, but I’m not sure what you are referring to.
Of course, here are some pictures for you.
Here is a Substance Designer network:
We can add nodes like Noise and Color and promote their parameters so they can be changed in Touch.
We can also add InputColor and InputGrayscale nodes; here Substance waits for a texture selected by the user in the host software, and processes it to create outputs in BaseColor and Normal.
Here is the Substance network (.sbsar) in Cinema4D.
I select a grid texture on my computer, and Substance generates the Normal and BaseColor outputs, as you can see. (It's the same process in Unity, Maya, Houdini, Unreal…)
In Touch, on my first try I thought I would see an input on the left side of the Substance TOP, or maybe an operator reference field in the parameters,
to output a BaseColor and Normal from the Substance TOP.
If you need more explanation, I'm here.
Is this possible?
This looks possible, but it might be a while before we can address it. We've added it to the RFE database.
Just got into that feature, and I agree that the input implementation would be a real game changer!
It would keep it all procedural, in a TouchDesigner way. For example, generating PBR maps from an input image on the fly, instead of exporting/importing 5 hardcoded bitmaps (in an .sbsar archive of 50 megs). That would be proper.
Bump and +1. I think there may be some misunderstanding here, because .sbs/.sbsar files can also act like complex "effects" for chaining material creation or anything else. For instance, you could build a Substance file that takes a height input (e.g. a baked or procedural height map) and gives you back nicer normals, much better and more tailored than what you might get from a simple height > normal GLSL shader. I don't know the ins and outs of the API, but I believe the entire Substance ecosystem is built around .sbsar files, many of which can take multiple input textures (and other data?) and give back various outputs, whether or not they are used immediately in PBR channels. Obviously this stuff is not meant for realtime, though. Anyway, if it's possible, I would definitely request a more open-ended Substance implementation.
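For reference, the "simple height > normal" baseline mentioned above can be sketched in plain Python. This is only an illustrative sketch (finite differences with edge clamping; the `strength` factor and function name are my own, not from any Substance API), and a Substance graph can do far more tailored filtering than this:

```python
import math

def height_to_normal(h, strength=1.0):
    """Naive height-to-normal conversion: finite differences with
    edge clamping, then normalize (-dx, -dy, 1) per texel.
    h is a 2D list of height values in [0, 1]."""
    rows, cols = len(h), len(h[0])
    normals = []
    for y in range(rows):
        row = []
        for x in range(cols):
            # differences in x and y, clamped at the image edges
            dx = (h[y][min(x + 1, cols - 1)] - h[y][max(x - 1, 0)]) * strength
            dy = (h[min(y + 1, rows - 1)][x] - h[max(y - 1, 0)][x]) * strength
            length = math.sqrt(dx * dx + dy * dy + 1.0)
            row.append((-dx / length, -dy / length, 1.0 / length))
        normals.append(row)
    return normals

# A flat height map yields straight-up normals (0, 0, 1) everywhere.
flat = [[0.5] * 4 for _ in range(4)]
print(height_to_normal(flat)[1][1][2])
```

A dedicated Substance graph replaces this crude gradient with proper filtering, blur/sharpen passes, and artist-tuned controls, which is exactly why feeding a height texture *into* an .sbsar is so useful.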
I would assume that if the input image parameter in the .sbs file is tagged to be “exposed” (sorry, whatever the correct terminology is in Substance-land) that it should show up in the .sbsar and then the Substance TOP as a custom parameter once loaded into TouchDesigner.
Is this not the case?
Oops, indeed, I stand corrected: either something got updated a while back or I'm just a dweeb. Yes, it works to promote texture inputs and get them as TOP parameters, awesome!
I'm happy to hear that it's working!
Yeah, I just tried it, and yes, it works!
Oh, I didn't see this update go by.
Of course it costs a lot of CPU, but it can be really useful if you capture an image with TD and then let Substance do the job on another thread.
Here is the proof ^^
Thanks a lot
If the image is not changing, the Substance TOP will not cook. This is by design, as you do not want the Substance TOP cooking: each time a change is made to any of its parameters, or to an input image, the Substance TOP has to recreate all the textures in the material, and this is not a realtime process.
If you want the Base Color swapped out to different images, or an animating texture, you should override Base Color in the PBR MAT instead; this will give you great performance.
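The cook-on-change behaviour described above can be sketched in plain Python. This is not the actual TOP internals, just a minimal cache keyed on the input, with `render()` standing in for the expensive regeneration of every texture map:

```python
# Minimal sketch of cook-on-change: the expensive render only re-runs
# when the input image actually differs from the last one seen.

def make_substance_top(render):
    last_input = object()   # sentinel: guarantees a cook on first use
    cached = None

    def cook(input_image):
        nonlocal last_input, cached
        if input_image != last_input:        # only re-cook on a changed input
            cached = render(input_image)     # expensive: rebuilds every map
            last_input = input_image
        return cached

    return cook

cook_count = 0

def render(image):
    global cook_count
    cook_count += 1
    return f"pbr_maps({image})"

top = make_substance_top(render)
top("grid.png")
top("grid.png")    # unchanged input: served from cache, no cook
top("photo.png")   # new input: cooks again
print(cook_count)
```

This is also why overriding Base Color directly in the PBR MAT is the fast path for animated textures: swapping that override never touches the expensive cook path at all.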