360 reactive Stable Diffusion

Hello everyone, you might have seen the artist Scottie Fox generating 360 skyboxes ("VR Stable Diffusion Gets Even Better! Deforum + Touch Designer" - YouTube). The final result is really cool and impressive, and the general idea is fairly simple if you follow his projection tutorial ("TouchDesigner Walkthru - 360° Projection of AI-Diffused Content" - YouTube) and Torin's Stable Diffusion tutorials (Torin Blankensmith - YouTube).

But there are some things I can't figure out and haven't seen answered anywhere. In theory the questions are simple, but in practice, when I build the network, I run into tons of errors and problems I can't resolve:

  1. How does Scottie generate small parts of the image instead of the whole thing? (I think it could be done by masking the image with noise patterns and sending it to img2img to fill in those spots, but I couldn't make it work.) See the first sketch after this list.
  2. And second, how do I make it realtime? I have Automatic1111 running on my PC and want to use it so the generation stays local. I'm okay with it being slow and not exactly realtime; the goal is continuous generation, so I can then connect parameters with CHOPs to make it reactive. See the second sketch below.
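
For question 1, a minimal sketch of that masking idea, assuming the Automatic1111 web UI is running locally with the --api flag: you send the current frame plus a mask to the /sdapi/v1/img2img endpoint, and only the white region of the mask gets regenerated. The file names, prompt, and random-ellipse mask are placeholder assumptions for illustration, not Scottie's actual setup.

```python
# Hypothetical sketch: regenerate only a masked patch of the current frame
# through Automatic1111's img2img/inpainting endpoint (web UI started with --api).
import base64
import io
import random

import requests
from PIL import Image, ImageDraw

A1111_URL = "http://127.0.0.1:7860"  # default local web UI address


def to_b64(img):
    buf = io.BytesIO()
    img.save(buf, format="PNG")
    return base64.b64encode(buf.getvalue()).decode()


# current equirectangular frame exported from TouchDesigner (placeholder path)
frame = Image.open("current_frame.png").convert("RGB")

# build a mask: white = regenerate, black = keep (here, a random 256 px blotch)
mask = Image.new("L", frame.size, 0)
x = random.randint(0, frame.width - 256)
y = random.randint(0, frame.height - 256)
ImageDraw.Draw(mask).ellipse([x, y, x + 256, y + 256], fill=255)

payload = {
    "init_images": [to_b64(frame)],
    "mask": to_b64(mask),
    "prompt": "alien landscape, volumetric light",  # placeholder prompt
    "denoising_strength": 0.75,
    "inpainting_fill": 1,       # fill the masked area starting from original content
    "inpaint_full_res": False,  # keep the whole frame as context
    "steps": 20,
}

r = requests.post(f"{A1111_URL}/sdapi/v1/img2img", json=payload)
result = base64.b64decode(r.json()["images"][0])
Image.open(io.BytesIO(result)).save("patched_frame.png")
```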

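For question 2, a sketch of a slow-but-continuous loop, assuming Automatic1111 is running locally with --api; the operator names ('audio' with a 'low' channel, 'moviefilein1') are only illustrations. Each call feeds the last output back through img2img, with the denoising strength driven by a CHOP channel, so generation keeps running and stays reactive. Drop it in a Text DAT and call generate() from a Timer CHOP callback or a button; the request blocks, so TouchDesigner pauses during each generation, which matches "slow, not realtime".

```python
# Hypothetical sketch for a Text DAT in TouchDesigner. Seed latest.png with any
# starting image (e.g. one txt2img render) before the first call.
import base64

import requests  # bundled with recent TD builds, or via an external Python path

A1111_URL = "http://127.0.0.1:7860"
FRAME_PATH = project.folder + "/latest.png"


def generate():
    # reactive parameter pulled from a CHOP at the moment the job starts
    low = op('audio')['low'].eval()  # assumed channel, expected range 0..1

    with open(FRAME_PATH, "rb") as f:
        last_frame = base64.b64encode(f.read()).decode()

    payload = {
        "init_images": [last_frame],
        "prompt": "alien skybox, nebula, cinematic lighting",  # placeholder
        "denoising_strength": 0.2 + 0.5 * low,  # how much each pass changes the frame
        "steps": 15,
    }
    r = requests.post(f"{A1111_URL}/sdapi/v1/img2img", json=payload)
    img = base64.b64decode(r.json()["images"][0])

    with open(FRAME_PATH, "wb") as f:
        f.write(img)

    # point a Movie File In TOP at the new file and force it to re-read
    top = op('moviefilein1')
    top.par.file = FRAME_PATH
    top.par.reloadpulse.pulse()
```
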
(If anyone has figured this stuff out and has a network I could take a look at, that would be amazing… or a tutorial somewhere…)

Thanks in advance

I'm looking for the same. Did you find any information?