From: someone
Originally posted by Myria Daguerre
Such a system would interfere with administration. A bad script could be programmed to move the object away, or spawn a second copy of itself, when someone (possibly a God) right-clicked it. It would be very difficult to delete such a pest.
Melissa
Linden Gods have the ability to disable scripts and delete objects without touching them, at least from what I've heard.

Plus, what's the point, when you can already do the same thing (make an object move away from Lindens or other people when they get close) with a sensor?
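To illustrate, a move-away pest like the one Myria describes is already possible today with plain sensor events; no selected() event needed (a minimal sketch, the 5 m range and jump distance are just example values):

```lsl
default
{
    state_entry()
    {
        // Scan for any avatar within 5 m, once per second.
        llSensorRepeat("", NULL_KEY, AGENT, 5.0, PI, 1.0);
    }

    sensor(integer num_detected)
    {
        // Jump 5 m directly away from the nearest detected avatar.
        vector away = llVecNorm(llGetPos() - llDetectedPos(0));
        llSetPos(llGetPos() + away * 5.0);
    }
}
```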
I believe this would be a super-cool feature to add: selected(), selected_start(), and selected_end() events would allow for cool editing abilities.
Imagine an object that knows that it's being edited and responds to commands differently during that period of time.
For example, with selected() events enabled, I could create a script to easily modify specific objects with voice commands, only while in edit mode.
I shift-select a Cube and Sphere with my special buildHelper script in them:
"sidescale +z 4"
They both scale their side along the positive up axis (blue arrowhead) to 4 meters.
I shift-deselect the Sphere, deactivating voice commands on it, and select Cube2, leaving Cube still selected.
"alignZWith OtherCube"
Both cubes auto-align with OtherCube.
"eulerRotate <2.192, 23.12, 12.1283>"
Both cubes rotate to that rotation.
Exciting!
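A sketch of what such a buildHelper script might look like. To be clear, selected_start() and selected_end() are the proposed events and do NOT exist in LSL today; everything else (llListen, llParseString2List, llSetScale) is real, and the "sidescale" handling is just one example command:

```lsl
// Hypothetical: gEditing tracks whether this prim is currently selected.
integer gEditing = FALSE;

default
{
    state_entry()
    {
        // Hear the owner's chat on the public channel.
        llListen(0, "", llGetOwner(), "");
    }

    selected_start()  // PROPOSED event: object became selected (yellow halo)
    {
        gEditing = TRUE;
    }

    selected_end()    // PROPOSED event: object was deselected
    {
        gEditing = FALSE;
    }

    listen(integer channel, string name, key id, string message)
    {
        if (!gEditing) return;  // only obey commands while selected

        list args = llParseString2List(message, [" "], []);
        if (llList2String(args, 0) == "sidescale")
        {
            // e.g. "sidescale +z 4" -> scale the Z axis to 4 meters
            vector size = llGetScale();
            size.z = (float)llList2String(args, 2);
            llSetScale(size);
        }
    }
}
```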

And no, the touch events don't already do this! Touch events only account for left clicks. selected() events would allow scripts to see when their object becomes selected via right-click (or left-click while in edit mode).
Selection is represented in the UI by a yellow halo around the object.
It would also be awesome if the llDetected* functions worked in these events too.

==Chris