I’m quite a beginner in this world, and I’m trying to make a VR project with particles that fall each time the Kinect detects someone. The particles should accumulate on the ground over a window of two to three months (day 1: zero particles on the ground; last day: ground completely covered by them).
I’m using the ViveSimple system and have placed my environment and particle system in its geometry box.
I have three major problems:
- I created a BlobTrack system for the detection, but I don’t know how to connect it to the particle system (the particles start from a grid and loop: they fall and regenerate). Specifically, how can I tell them to stop when the detection value is 0, and to fall for just a few seconds when the value is 1?
- After some hours the program of course starts to lag because of the long-term particle calculation. I was thinking of capturing the 3D scene — or better, just the ground with the already-fallen particles — every 10 minutes or so, loading that capture back in and automatically pulsing a reset on the particles, or restarting them, or simply giving them a 10-minute life expectancy. How could this be solved?
- In the geometry box, if I open my render, I can see the scene as I would like it: a nice fog (Camera Fog), a nice dissolve of the far particles (Line MAT near/far), and a glow effect on the particles (Render - Feedback, Blur, Transform, Level, Composite — this last one isn’t even applied to the geometry render). But when I switch to the VR output render, I can’t see these effects anymore and the image looks flat. Why? How can I solve this?
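To make the first question concrete, here is the behaviour I have in mind as a plain-Python sketch (outside TouchDesigner, so it’s easy to reason about; the class name, the 3-second duration, and the fake clock are all my placeholders). In the patch I imagine this logic would live in something like a CHOP Execute DAT that switches the particles’ birth rate on and off:

```python
import time

class EmissionGate:
    """Turns a 0/1 detection signal into a timed 'emit' window.

    A rising edge (0 -> 1) opens the window for `fall_seconds`;
    afterwards emission stops until the next rising edge.
    `clock` is injectable so the logic can be tested without real time.
    """
    def __init__(self, fall_seconds=3.0, clock=time.monotonic):
        self.fall_seconds = fall_seconds
        self.clock = clock
        self.window_end = None  # moment at which emission stops
        self.prev = 0           # previous detection value, for edge detection

    def update(self, detected):
        """Feed the current 0/1 detection value each frame.
        Returns True while particles should be emitted."""
        now = self.clock()
        if detected and not self.prev:  # rising edge: someone was detected
            self.window_end = now + self.fall_seconds
        self.prev = detected
        return self.window_end is not None and now < self.window_end
```

So when BlobTrack reports 0 the gate returns False (no emission), and a 1 opens a few-seconds fall window even if the person walks away immediately. Is this the right way to think about it, and where would this gating normally go?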
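And for the second question, this is the “bake every 10 minutes” idea I described, again as a plain-Python sketch (the class name and the event it fires are my placeholders; in TouchDesigner I imagine a Timer CHOP whose callback pulses a capture of the ground and then a reset on the particle system):

```python
class BakeScheduler:
    """Fires a 'capture the ground, then reset the particles' event
    every `interval` seconds, so the simulation never has to keep
    more than ~10 minutes of particles alive at once."""
    def __init__(self, interval=600.0):  # 600 s = 10 minutes
        self.interval = interval
        self.next_fire = interval
        self.bakes = 0               # how many captures have happened

    def tick(self, elapsed):
        """Call with total elapsed seconds (e.g. once per frame).
        Returns True on the tick where a bake + reset should happen."""
        if elapsed >= self.next_fire:
            self.next_fire += self.interval
            self.bakes += 1
            return True
        return False
```

Over the two-to-three-month run the covered ground would then be an accumulating baked image/texture, while the live simulation only ever holds the last 10 minutes of particles. Does this approach make sense, or is a simple 10-minute particle lifetime the better route?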
If any kind and brilliant soul can help me, I thank you infinitely.