Scripting a Prim Touchpad
Kliger Dinkin
Registered User
Join date: 22 Apr 2006
Posts: 46
06-01-2006 13:36
I'm teaching myself how to script in SL. I'm not a programmer, but I'm catching up pretty quickly thanks to the LSL wiki and this forum.
I'm trying to make a HUD interface that receives its input like a touchpad: dragging the mouse generates x,y coordinate data that I can transform into a controller for color and sound manipulation. I'm progressing faster on the color and sound part, but I'm stuck on the touchpad interface thingy (which is to say, I can't find any examples).
I have two ideas for how this could work, though:
1/ An LSL function that tracks mouse movement on an object as x,y coordinates (ie: (float)xCoordinate, yCoordinate) relative to it at a certain resolution. The problem is, I don't think this function exists in LSL (as it does in most low-level multimedia dataflow programs like Pure Data etc...)
2/ A hack I thought of is to create a grid of linked rectilinear prims, each prim reacting to a touch event that signals a scripted function. This seems a bit crude though, as I want to use this prim touchpad to trigger smooth transformations.
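For what it's worth, idea 2 can be sketched in a few lines of LSL: each child prim in the grid reports its own cell to the root on touch. The ROW/COL values and the "row * 100 + col" link-message encoding here are just an illustrative convention, not anything built in:

```lsl
// Goes in every child prim of the grid; set ROW and COL per prim.
integer ROW = 2;
integer COL = 5;

default
{
    touch_start(integer num_detected)
    {
        // Tell the root prim which cell was touched
        // (hypothetical convention: message number = row * 100 + col).
        llMessageLinked(LINK_ROOT, ROW * 100 + COL, "", NULL_KEY);
    }
}
```

A script in the root prim would then catch these in a link_message event and map the cell back to color/sound parameters; as noted, the resolution is limited by how many prims you're willing to link.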
Can anyone point me in the right direction?? Thnx.
//Kliger.
carol Wombat
Registered User
Join date: 29 Jan 2006
Posts: 16
06-01-2006 18:31
Hi Kliger,
I have noticed that your avatar's head seems to follow your mouse. Maybe it is possible to get the direction of the head? Otherwise you might need a lot of prims.
Ed
Logan Bauer
Inept Adept
Join date: 13 Jun 2004
Posts: 2,237
06-01-2006 18:46
Well, you need to click on an object to trigger the touch() event; there's no way I know of to track the mouse pointer itself. Here are two threads that show how to tell where the AV is looking while in mouselook, and how to use that to tell where on a single prim someone is clicking; they may help:
/54/33/110583/1.html
/15/ee/38717/1.html
Or, another way to do it: you can set an object physical and then click and drag it - you can then track the position of the object you're dragging, and utilize that. However, this unfortunately doesn't work with the idea of it being a HUD; you would have to drag the object around in front of you on a table or something.
Kliger Dinkin
Registered User
Join date: 22 Apr 2006
Posts: 46
06-02-2006 01:07
From: Logan Bauer Well, you need to click on an object to trigger the touch() event; there's no way I know of to track the mouse pointer itself.
Or, another way to do it: you can set an object physical and then click and drag it - you can then track the position of the object you're dragging, and utilize that. However, this unfortunately doesn't work with the idea of it being a HUD; you would have to drag the object around in front of you on a table or something.
Logan// Thnx for your input... I hadn't thought of dragging/tracking an object. That idea opens a whole other interface space for me. But like you said, it won't work as a HUD (I tried, though). A moving HUD widget seems like a cool idea. //Kliger.
Kliger Dinkin
Registered User
Join date: 22 Apr 2006
Posts: 46
06-02-2006 01:12
From: carol Wombat Hi Kliger,
I have noticed that your avatar's head seems to follow your mouse. Maybe it is possible to get the direction of the head? Otherwise you might need a lot of prims.
Ed Carol// Hmmm, I see what you mean. But I want to use the interface to mix sound, color or textures and I'm not sure that, in this case, I'd want the hand and the eye to necessarily follow each other. I'm going to check it out, though. //Kliger
Kliger Dinkin
Registered User
Join date: 22 Apr 2006
Posts: 46
cheap touchpad-in-a-HUD (kindof)
06-02-2006 04:08
update: When I put this code block in a prim:
default
{
    touch(integer num_detected)
    {
        llSay(0, (string)llGetPos());
    }
}
and attach the prim as a HUD, the Z coordinate changes when I click-drag on the object. It seems like a bug, but it kind of acts like a cheap touchpad. Does anyone understand why this is happening? I'm trying to use it as an interface feature.
Eloise Pasteur
Curious Individual
Join date: 14 Jul 2004
Posts: 1,952
06-02-2006 07:59
That's the expected behaviour. llGetPos() gets the position of the object the script is in (more precisely it gets the position of the root prim if it's in a linked object).
I think you want llDetectedPos(0) for your application - I assume you're trying to get the position of the av touching it?
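To make the distinction concrete, here is a minimal sketch contrasting the two calls (illustrative only):

```lsl
default
{
    touch_start(integer num_detected)
    {
        // Position of the prim (or root prim) the script is in:
        llSay(0, "prim: " + (string)llGetPos());
        // Position of the avatar that touched it (its center point,
        // not its hand or the point on the surface it clicked):
        llSay(0, "toucher: " + (string)llDetectedPos(0));
    }
}
```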
Kliger Dinkin
Registered User
Join date: 22 Apr 2006
Posts: 46
Detected and Get
06-03-2006 06:23
From: Eloise Pasteur I think you want llDetectedPos(0) for your application - I assume you're trying to get the position of the av touching it?
Eloise// When you say the position of the av, what is this measuring?? If it's the hand's movement, then that's what I need. But if it's the position of his/her feet or the body's center point or whatever, that's not what I need. I experimented a bit, and I think it's measuring the latter, and not the interaction between the av and the object, ie: the hand that is touching it. But then again, the difference between Detected and Get is a bit subtle for me... I have a lot to understand yet before I can get this together. Thnx for your help. //Kliger
Logan Bauer
Inept Adept
Join date: 13 Jun 2004
Posts: 2,237
06-03-2006 07:07
From: Kliger Dinkin update: When I put this codeblock in a prim :
default { touch(integer num_detected) { llSay(0, (string)llGetPos()); } }
attach the prim to a HUD, when I click-drag on the object, the Z coordinate changes. It seems like a bug, but it kindof acts like a cheap touchpad. Does anyone understand why this is happening?
Actually, I haven't the slightest idea what's happening here - AFAIK llGetPos() in an attachment should return your AV's position. Testing this with my AV standing perfectly still, the results I got were as you described: click and hold in one spot and I get the exact same position repeated (actually, it takes about 4 readings to "settle" on where the Z pos is. O_o). Click and drag and I see variations in the Z coordinate... but all very small (about a 0.01 variance), and not seeming to mean anything (ie, they didn't seem, to me at least, to correspond to the mouse pointer's location, my location, or anything; it seemed pretty random)... This does NOT seem to happen when attaching a prim to, say, your spine or arm, just the HUD... Sitting on a cube or standing, I got the same results...
Tiger Crossing
The Prim Maker
Join date: 18 Aug 2003
Posts: 1,560
06-03-2006 07:30
I have high hopes that when we have HTML on a prim (something that is coming soon), it can be used for just such situations. There are many places in Second Life where large bunches of prims are used for buttons when a simple texture would be enough... IF we had some way of telling where on the surface the user clicked. If the web-on-a-prim is interactive, then we'll get this functionality for free.
_____________________
~ Tiger Crossing ~ (Nonsanity)
Kliger Dinkin
Registered User
Join date: 22 Apr 2006
Posts: 46
weird, but...
06-03-2006 07:30
Logan/
1/ According to the LSL wiki's page: "llGetPos Returns the object's position in region coordinates, which are relative to the simulator's southwest corner." The object being referred to, the way I read this, is the object that I'm touching in a touch* event.
2/ It seemed to me that the Z coordinates are relative to the last Z position of the object. I'm attaching the object to the HUD from the world, so it has an absolute position.
3/ The variance is small, but to simulate a touchpad, exploiting the delta between these movements could be enough, ie: the difference between t1 & t2, t2 & t3 etc... can simulate mouse movement and speed relative to the object. Now if I can only figure out how to extract the delta between two events... <G>
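A rough sketch of point 3, assuming the jitter in llGetPos() really does vary with the drag: keep the previous reading in a global and report the difference each time touch() fires:

```lsl
vector last_pos;

default
{
    touch_start(integer num_detected)
    {
        // Initialize on the first click so the first delta isn't huge.
        last_pos = llGetPos();
    }

    touch(integer num_detected)
    {
        // touch() keeps firing while the mouse button is held down,
        // so successive readings give a crude motion signal.
        vector now = llGetPos();
        llSay(0, "delta: " + (string)(now - last_pos));
        last_pos = now;
    }
}
```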
//Kliger
Kliger Dinkin
Registered User
Join date: 22 Apr 2006
Posts: 46
06-03-2006 07:34
From: Tiger Crossing I have high hopes that when we have HTML on a prim (something that is coming soon), it can be used for just such situations. There are many places in Second Life where large bunches of prims are used for buttons when a simple texture would be enough... IF we had some way of telling where on the surface the user clicked. If the web-on-a-prim is interactive, then we'll get this functionality for free.
Do you mean by using an image map, for example? Processing & Java? What else? That's blowing my mind.
Ardith Mifflin
Mecha Fiend
Join date: 5 Jun 2004
Posts: 1,416
06-03-2006 08:24
I wrote some rough code a couple of days ago that accomplishes something similar. You can find it here. If anyone's having trouble understanding it or would like me to drop my test object on them, feel free to ask. The code tracks the rotation of your avatar's head while in mouselook and converts that to a position vector to move a cursor prim.
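For later readers, the core of that mouselook trick (as I understand it, so treat this as a sketch, not the exact code from the linked post): while an avatar is in mouselook, the rotation a sensor detects for them follows the view direction, so it can be turned into a pointing vector:

```lsl
default
{
    state_entry()
    {
        // Scan for nearby avatars every 0.2 seconds.
        llSensorRepeat("", NULL_KEY, AGENT, 20.0, PI, 0.2);
    }

    sensor(integer num_detected)
    {
        // In mouselook, the detected rotation tracks the camera,
        // so its forward vector points where the avatar is looking.
        vector fwd = llRot2Fwd(llDetectedRot(0));
        llSay(0, "looking along: " + (string)fwd);
    }
}
```

Intersecting that forward ray with the plane of the touchpad prim would give the x,y point on its surface that drives the cursor.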
Kliger Dinkin
Registered User
Join date: 22 Apr 2006
Posts: 46
06-03-2006 11:37
Ardith//
I'm very interested to see how this works... it seems like the kind of mechanism I have to understand to make my project work. I made a quick attempt at putting it together but didn't understand how all the pieces fit. I'll try again later when I have more time.
//Kliger.