This workflow lets you place any 2D panels into a lit 3D scene and operate them as you would in 2D.
You create Panel Components as you normally would in 2D, but instead of displaying them in a container, they are used in a 3D render by connecting each panel to a 'geopanel' component (a customized Geometry component with custom parameters). The Render Pick DAT and the Render TOP are set up so the panels can be operated virtually.
You can use a mouse with this, and single-finger multitouch currently works as well.
UPDATE: Oct 20, 2015 - THIS IS IN THE PALETTE OF Official 56020 and later.
In this example, some (not all) of the gadgets affect other objects in the scene.
Aside from working with a mouse and Windows-supported touch screens (both via the Multi Touch In DAT), you can remotely drive the main container with panel.interactTouch() and panel.interactMouse().
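Panel coordinates in these calls are normalized to the 0-1 range, so driving a panel remotely usually starts with converting pixel coordinates. Below is a minimal sketch of that conversion; the helper itself is not part of the TouchDesigner API, and the commented-out interactMouse() call and its parameter names are illustrative assumptions:

```python
def to_uv(x, y, width, height):
    """Convert pixel coordinates (origin bottom-left) to normalized (u, v)."""
    return x / width, y / height

# Example: a click at pixel (320, 120) on a 1280x720 container.
u, v = to_uv(320, 120, 1280, 720)

# Inside TouchDesigner you would then forward the event, e.g.:
# op('/project1').interactMouse(u, v, leftClick=1)   # hypothetical call

print(round(u, 3), round(v, 3))  # 0.25 0.167
```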
Flow – how it works:
• You mouse-click on the /project1 container, which contains only a 3D rendered image as its background. The click/drag/release events drive a Multi Touch In DAT that is pointing to /project1.
• The Multi Touch In DAT is passed to the Render Pick DAT, which is also connected to the 3D render. The Render Pick DAT returns an event list in its callback that identifies which 3D objects were hit and where.
• The Render Pick DAT sends interactMouse() events to individual gadgets (panels) to operate them virtually.
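The routing in the steps above can be sketched as plain Python. This is NOT TouchDesigner API code: the real Render Pick DAT callback and Panel interactMouse() signatures differ, and all names here (geopanel names, event fields) are illustrative stand-ins for the idea of mapping a picked geometry back to its source panel:

```python
from dataclasses import dataclass, field

@dataclass
class FakePanel:
    """Stands in for a Panel COMP; records the virtual mouse events it receives."""
    name: str
    events: list = field(default_factory=list)

    def interact_mouse(self, u, v, left_click=0):
        # In TouchDesigner this would be a call like panel.interactMouse(...).
        self.events.append((u, v, left_click))

# One entry per 'geopanel': maps a picked geometry's name to its source panel.
panels = {
    "geo_button1": FakePanel("button1"),
    "geo_slider1": FakePanel("slider1"),
}

def on_pick_event(geo_name, u, v, select):
    """Route one pick result (hit object + texture UV) to the matching panel."""
    panel = panels.get(geo_name)
    if panel is not None:
        panel.interact_mouse(u, v, left_click=1 if select else 0)

# Simulate a click-and-release landing on the slider's geometry at UV (0.5, 0.25).
on_pick_event("geo_slider1", 0.5, 0.25, select=True)
on_pick_event("geo_slider1", 0.5, 0.25, select=False)
print(panels["geo_slider1"].events)  # [(0.5, 0.25, 1), (0.5, 0.25, 0)]
```

The key design point is the geopanel-to-panel lookup table: each picked 3D object resolves back to exactly one 2D panel, so the panel never needs to know it is being rendered in 3D.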
This will be in the palette under Techniques in the next build after 55460.