For the last few weeks, I've been following the trickle of news about the upcoming 3D controller, the Novint Falcon (http://www.novint.com/falcon.htm). If this thing doesn't pull a Gizmondo and is actually for real (and so far there have been actual hands-on reports, so it probably is), it's a 3D controller with touch-feedback functions. I've had the dubious fortune to own quite a number of 3D controllers over the years (anyone remember the Cybermouse?), but what makes this one stand out is that you can actually feel the object you are touching in-game. This goes way beyond a mere rumble pack: the device tracks the location of your hand, calculates the reaction force of the in-game object you're pressing against, and applies that actual force to the controller. From the press release:
"As an example, when a 3D cursor touches a virtual sphere, there is a force normal (perpendicular) to the surface. The device reacts and pushes in the radial direction away from the center of the sphere, proportional to how hard the user pushes against the sphere. The computer keeps track of the direction of the force (based on the position of the cursor) and the amount of the force, 1000 times a second which lets the user slide the 3D cursor across the surface of the sphere, giving it a consistent smooth feel. The effect is that the cursor, and therefore device, physically cannot move through the sphere, and it is actually a virtual solid object. When one looks at the Novint Falcon itself (rather than the cursor and sphere graphics on the computer screen), one can see the “invisible” sphere in the haptic workspace where the haptic device cannot move – it is really there, and you can really touch it! Additionally, other forces and algorithms can be used to give the sphere texture, dynamic properties (i.e. it can bounce like a ball), deformability, or a variety of other effects. "
It doesn't take a Nobel Prize winner to figure out that this could have quite an impact on the way one perceives SL. Since everything is already built out of prims, the basis for integrating a layer of touch-feedback controllers is already there. Only now, instead of just clicking for a bit, you could actually feel those prims: bat them around, push them. I think adding sensory feedback to SL would improve both its attraction and the ability to interpret something as abstract as a virtual 3D world. And I won't even go into the possibilities for SL's expansive red-light district...
I'm not currently involved in integrating this controller with SL in any way, but I'm very keen to see how this develops. So I'd like to lay out the following questions:
- Is it possible to take direct control of your limbs? In particular, could you control your AV's arms and what they're touching (world)? Alternatively, could you integrate a user-controlled 'hand' in the HUD only (client)? (See the IK sketch after this list.)
- Though it would be ideal to integrate a controller like this into the very core of the SL client, what changes would be needed to 'just make it work'? How could you introduce an outside driver that interprets the prims and their locations relative to the aforementioned arm without changing the client (by hooking into the data sent to the rendering engine, maybe)?
- Does SL even support 3D controllers? If not, would keyboard emulation be enough to fake the Z axis? (A rough sketch of that follows below as well.)
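On the 'hand in the HUD' idea: even without touching the avatar skeleton, a client-side hand could follow the device using standard two-link inverse kinematics. Here's a minimal 2D sketch, assuming you have some way to feed the resulting shoulder/elbow angles to whatever draws the hand; everything here is hypothetical and not part of any SL API.

```python
import math

def two_link_ik(tx, ty, upper, fore):
    """Shoulder and elbow angles (radians) placing the hand at (tx, ty).

    The arm is rooted at the origin; 'upper' and 'fore' are segment
    lengths. 'elbow' is the bend measured from the straight-arm pose,
    so the forearm's world angle is shoulder + elbow.
    """
    d = math.hypot(tx, ty)
    d = max(min(d, upper + fore - 1e-6), 1e-6)   # clamp unreachable/degenerate targets
    # Law of cosines gives the interior elbow angle of the triangle
    # with sides (upper, fore, d); the bend is its supplement.
    cos_int = (upper**2 + fore**2 - d**2) / (2 * upper * fore)
    elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_int)))
    cos_sh = (upper**2 + d**2 - fore**2) / (2 * upper * d)
    shoulder = math.atan2(ty, tx) - math.acos(max(-1.0, min(1.0, cos_sh)))
    return shoulder, elbow

# Equal segment lengths, target straight overhead at distance 1:
print(two_link_ik(0.0, 1.0, 1.0, 1.0))  # ~ (0.524, 2.094): 30° shoulder, 120° bend
```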
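And on faking the Z axis: as far as I know, SL already maps Page Up/Page Down to moving the avatar up and down, so an outside driver could translate the controller's Z reading into held keys. A sketch of that idea, where read_z_axis, send_key_down and send_key_up are placeholders for whatever device and OS input APIs you actually have:

```python
import time

DEAD_ZONE = 0.15  # ignore small readings around the controller's rest position

def z_axis_to_keys(read_z_axis, send_key_down, send_key_up):
    """Map a normalized -1.0..1.0 Z reading onto held Page Up/Page Down keys."""
    held = None                      # key we are currently holding, if any
    while True:
        z = read_z_axis()
        if z > DEAD_ZONE:
            want = "PAGE_UP"         # SL: jump / fly up
        elif z < -DEAD_ZONE:
            want = "PAGE_DOWN"       # SL: crouch / descend
        else:
            want = None
        if want != held:             # state changed: release old key, press new one
            if held is not None:
                send_key_up(held)
            if want is not None:
                send_key_down(want)
            held = want
        time.sleep(0.01)             # ~100 Hz polling is plenty for key events
```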
Anyway, those are my thoughts and questions so far. It's all rather hypothetical at this point, but sooner or later this thing is going to happen, and I'd like to know the possibilities once it does.