Visualization

Chance Abattoir
Future Rockin' Resmod
Join date: 3 Apr 2004
Posts: 3,898
09-26-2004 02:32
I will drive to San Francisco and kiss your feet if you can make a modification that allows the user to specify visual depth of field (a la photography).

I love taking photos in SL, but sometimes it'd be nice to have the naked dancing centaur in the background blurred out. I'm sure it'd take some kind of whiz-kid programming and some dithering algorithms to make it happen... but it would rock.
Morgaine Dinova
Active Carbon Unit
Join date: 25 Aug 2004
Posts: 968
09-26-2004 10:07
They can't even get the regular camera working properly for us, so I wouldn't put depth of field at the top of my list of priorities just yet. :-)

With that said, yes indeed, I too would love to see depth-of-field control. Algorithmically it's no problem at all. The difficulty is getting LL to do anything whatsoever these days.

Something's got to give in this area, and fast. Currently LL professes to be leading the field, and in several areas indeed they are, but in camera UI they are trailing behind virtually everybody.
_____________________
-- General Mousebutton API, proposal for interactive gaming
-- Mouselook camera continuity, basic UI camera improvements
Malachi Petunia
Gentle Miscreant
Join date: 21 Sep 2003
Posts: 3,414
09-26-2004 10:26
I think that's what Photoshop/PSP is for. Punch out the subject, Gaussian blur the remainder: instant DoF.

Better still, I often find that avatar edges are contrasty enough for the Extract auto-pen to do a nice job. Or throw a <0, 1, 0> colored prim behind your subject and chroma-key them into having lunch with George Bush.
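If the punch-out gets tedious, the same trick scripts easily. A rough sketch with Python and Pillow; the file names and the pre-saved subject mask are placeholders, not anything SL hands you:

# Fake DoF in post: keep the subject sharp, Gaussian-blur the rest.
from PIL import Image, ImageFilter

shot = Image.open("snapshot.png").convert("RGB")    # your SL snapshot
mask = Image.open("subject_mask.png").convert("L")  # white = subject, black = background
mask = mask.filter(ImageFilter.GaussianBlur(radius=2))  # feather so the cut line doesn't show

blurred = shot.filter(ImageFilter.GaussianBlur(radius=6))  # the "out of focus" layer
result = Image.composite(shot, blurred, mask)  # sharp where mask is white, blurred elsewhere
result.save("snapshot_dof.png")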

Your world, your bitmap, your imagination.
Moleculor Satyr
Fireflies!
Join date: 5 Jan 2004
Posts: 2,650
09-26-2004 11:36
Whut? O.o
_____________________
</sarcasm>
Strife Onizuka
Moonchild
Join date: 3 Mar 2004
Posts: 5,887
09-26-2004 12:49
You could take a stereographic rendering and use that to get distances, then blur everything that's not at the focal length you want.

But I don't know any software off the top of my head that can do this automatically.
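If anyone wants to experiment, here's a rough sketch of the depth half using OpenCV; left.png and right.png are assumed to be two renders from slightly offset camera positions, and whether block matching copes with SL imagery is an open question:

# Recover a rough distance map from a stereo pair.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching: disparity is large for near objects, small for far ones.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# Scale to 0-255 and save; bright = close, dark = far.
vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX)
cv2.imwrite("depthmap.png", vis.astype(np.uint8))

Once you have that map, you can drive the blur per pixel from the difference between each pixel's depth and your chosen focal distance.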

Anyway, I would have thought you'd want a centaur in the background...
_____________________
Truth is a river that is always splitting up into arms that reunite. Islanded between the arms, the inhabitants argue for a lifetime as to which is the main river.
- Cyril Connolly

Without the political will to find common ground, the continual friction of tactic and counter tactic, only creates suspicion and hatred and vengeance, and perpetuates the cycle of violence.
- James Nachtwey
Adam Zaius
Deus
Join date: 9 Jan 2004
Posts: 1,483
09-26-2004 15:14
Originally posted by Strife Onizuka
You could take a stereographic rendering and use that to get distances, then blur everything that's not at the focal length you want.

But I don't know any software off the top of my head that can do this automatically.

Anyway, I would have thought you'd want a centaur in the background...


Render-to-texture with shaders can do this, but ... it would be laggy.
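The per-pixel part of that pass is simple; here's the gist of what the shader would compute, sketched in Python/NumPy with made-up buffers standing in for the render-to-texture outputs. The full-frame blur pass is where the lag would come from:

# Depth-based composite: blend a sharp frame and a pre-blurred frame per pixel.
import numpy as np

def dof_composite(color, depth, blurred, focal_depth, focal_range):
    # Circle-of-confusion weight: 0 at the focal plane, 1 when fully defocused.
    coc = np.clip(np.abs(depth - focal_depth) / focal_range, 0.0, 1.0)[..., None]
    return color * (1.0 - coc) + blurred * coc

# Toy buffers just to show the shapes involved.
h, w = 480, 640
color = np.ones((h, w, 3))                  # sharp render
blurred = np.zeros((h, w, 3))               # same frame, pre-blurred
depth = np.linspace(1.0, 10.0, h * w).reshape(h, w)
out = dof_composite(color, depth, blurred, focal_depth=5.0, focal_range=3.0)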

-Adam
_____________________
Co-Founder / Lead Developer
GigasSecondServer
Chip Midnight
ate my baby!
Join date: 1 May 2003
Posts: 10,231
09-26-2004 16:05
I'm not sure how you could do simulated DOF in a real time 3d engine, except maybe with pixel shaders. I've never seen it done (that I can think of). In 3d scanline rendering (not real time), DOF is done by rendering the same image multiple times with slight rotations of the camera around the focal point and then combining them together. The farther a point is from the focal point, the more variation there is in its position in the images, hence more blurring when they're combined. There's no way to do that in real time. Raytracers do DOF by modeling a lens and refracting the rays through it (I think). It's not something I'd expect to see any time soon in 3d engines, but I'm just speculating :)
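To make the multi-render idea concrete, the loop looks something like this; render_scene() is a purely hypothetical stand-in for whatever the renderer exposes, not a real API:

# Accumulation-style DoF: render from several points on a small "aperture"
# disc, always aiming at the focal point, then average the frames. Points on
# the focal plane stay registered between renders; everything else smears.
import math
import numpy as np

def render_scene(eye, look_at):
    # Hypothetical renderer stub; returns an HxWx3 float frame.
    return np.zeros((480, 640, 3))

def dof_render(eye, focal_point, aperture=0.05, samples=16):
    accum = np.zeros((480, 640, 3))
    for i in range(samples):
        angle = 2.0 * math.pi * i / samples
        jittered = (eye[0] + aperture * math.cos(angle),
                    eye[1] + aperture * math.sin(angle),
                    eye[2])
        accum += render_scene(jittered, focal_point)
    return accum / samples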
_____________________

My other hobby:
www.live365.com/stations/chip_midnight
Moleculor Satyr
Fireflies!
Join date: 5 Jan 2004
Posts: 2,650
09-26-2004 18:00
What?

Is what I'm hearing something about blurry stuff up close, blurry stuff far away, but clear stuff at a certain distance? Huh? Examples?
_____________________
</sarcasm>
Morgaine Dinova
Active Carbon Unit
Join date: 25 Aug 2004
Posts: 968
09-26-2004 18:54
Originally posted by Moleculor Satyr
What?

Is what I'm hearing something about blurry stuff up close, blurry stuff far away, but clear stuff at a certain distance? Huh? Examples?

You probably need to rephrase that, Mole. One has to assume that you're acquainted with depth of field, yet your question is phrased as if you weren't, which is probably making people wonder whether to answer and about what.
_____________________
-- General Mousebutton API, proposal for interactive gaming
-- Mouselook camera continuity, basic UI camera improvements
Malachi Petunia
Gentle Miscreant
Join date: 21 Sep 2003
Posts: 3,414
09-26-2004 21:56
Mol, attached is a terrible example of the relationship between depth of field and "blurriness". But it's late.

Both images were shot with the same everything with only one of the two main variables in photography altered. In each picture, the focus was set to the middle pencil, but notice as the aperture goes from very wide (f5) to very narrow (f19) that the readability of the meterstick and pencils in the fore- and background increases.

Since the wide aperture lets more light in, it takes less time to expose to the same illumination. Conversely on the very narrow aperture (almost a pinhole) the exposure time increases to let enough light in.
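To put a number on that trade-off: the light admitted scales inversely with the square of the f-number, so the exposure time has to scale up by the same squared ratio. A quick arithmetic check:

# Exposure reciprocity: going from f/5 to f/19 cuts the light by (19/5)^2,
# so the shutter must stay open that much longer for the same exposure.
f_wide, f_narrow = 5.0, 19.0
factor = (f_narrow / f_wide) ** 2
print(f"f/19 needs about {factor:.0f}x the exposure time of f/5")  # ~14x, just under 4 stops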

I hope that helped some.
Moleculor Satyr
Fireflies!
Join date: 5 Jan 2004
Posts: 2,650
09-26-2004 22:47
Originally posted by Morgaine Dinova
You probably need to rephrase that, Mole. One has to assume that you're acquainted with depth of field, yet your question is phrased as if you weren't, which is probably making people wonder whether to answer and about what.


Generally when someone asks twice what people are talking about, he honestly wants an answer.

Originally posted by Malachi Petunia
Mol, attached is a terrible example of the relationship between depth of field and "blurriness". But it's late.

Both images were shot with the same everything with only one of the two main variables in photography altered. In each picture, the focus was set to the middle pencil, but notice as the aperture goes from very wide (f5) to very narrow (f19) that the readability of the meterstick and pencils in the fore- and background increases.

Since the wide aperture lets more light in, it takes less time to expose to the same illumination. Conversely on the very narrow aperture (almost a pinhole) the exposure time increases to let enough light in.

I hope that helped some.


Oooooh. Ok. So I guessed right.

*ponders*

*squints*

I can't think of any postprocessing that could do this with any realism. It MIGHT be possible to get all graphics within a certain range to be perfectly clear, and everything outside it to be a set blurriness, but to get a gradual blur? The more gradual, the tougher the problem. I doubt a graphics engine could actually do it in real time right now.

I might be wrong though.

SL almost CERTAINLY can't do it though. You'd need something much more recent.
_____________________
</sarcasm>
Morgaine Dinova
Active Carbon Unit
Join date: 25 Aug 2004
Posts: 968
DOF on GPUs -- GPU programming resource link.
09-27-2004 12:02
Originally posted by Moleculor Satyr
I can't think of any postprocessing that could do this with any realism. It MIGHT be possible to get all graphics within a certain range to be perfectly clear, and everything outside it to be a set blurriness, but to get a gradual blur? The more gradual, the tougher the problem. I doubt a graphics engine could actually do it in real time right now.

Graphics hardware to the rescue. :-) It's a completely new ballgame with programmable shaders being ubiquitous.

To show you how easily it can be done in realtime, check out this useful ShaderTech page on GPU Programming. About halfway down the page there's an entry on Depth of Field, and running binaries and the sources for them are both supplied there. The Cg files that implement the DOF are very short, much shorter than the waterV.cg file that comes with the SL client.

That's just one entry out of 108 there at ShaderTech --- their Shader Source Code Archive is a great resource.
_____________________
-- General Mousebutton API, proposal for interactive gaming
-- Mouselook camera continuity, basic UI camera improvements
Chip Midnight
ate my baby!
Join date: 1 May 2003
Posts: 10,231
Re: DOF on GPUs -- GPU programming resource link.
09-27-2004 18:44
Originally posted by Morgaine Dinova
About halfway down the page there's an entry on Depth of Field, and running binaries and the sources for them are both supplied there. The Cg files that implement the DOF are very short, much shorter than the waterV.cg file that comes with the SL client.


Wow, very cool. Shows you what I know about real time engines these days. hehe.
_____________________

My other hobby:
www.live365.com/stations/chip_midnight
Chance Abattoir
Future Rockin' Resmod
Join date: 3 Apr 2004
Posts: 3,898
09-28-2004 10:15
Originally posted by Malachi Petunia
Mol, attached is a terrible example of the relationship between depth of field and "blurriness". But it's late.

Both images were shot with the same everything with only one of the two main variables in photography altered. In each picture, the focus was set to the middle pencil, but notice as the aperture goes from very wide (f5) to very narrow (f19) that the readability of the meterstick and pencils in the fore- and background increases.

Since the wide aperture lets more light in, it takes less time to expose to the same illumination. Conversely on the very narrow aperture (almost a pinhole) the exposure time increases to let enough light in.

I hope that helped some.


Your example would have been better if you'd used focal length, since you get a more extreme DOF change from long vs. short lenses than from aperture changes.
If you capture the same mise-en-scène with two different lenses, the longer of the two will have a much narrower band of focus. Mol, just take your camera outside with a zoom lens, stand 5 feet away from your subject with your shortest lens, remember the composition of the image, and then walk as far away as you can with your longest lens and try to get the same composition. That's an instant way to see what we're talking about.
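If you'd rather see numbers than take the walk, the standard hyperfocal approximation makes the point (easiest to compute at a fixed subject distance). A rough sketch, all values assumed for illustration: f/5.6, subject at 3 m, a 0.03 mm circle of confusion, comparing a 50mm against a 200mm lens:

# Depth of field from the usual hyperfocal-distance approximation.
def dof_mm(focal_mm, fnumber, subject_mm, coc_mm=0.03):
    h = focal_mm ** 2 / (fnumber * coc_mm) + focal_mm       # hyperfocal distance
    near = h * subject_mm / (h + subject_mm - focal_mm)     # near limit of sharpness
    far = h * subject_mm / (h - (subject_mm - focal_mm))    # far limit of sharpness
    return far - near

for f in (50, 200):
    print(f"{f}mm at f/5.6, subject at 3m: DOF ~ {dof_mm(f, 5.6, 3000):.0f} mm")

That works out to roughly 1.2 m of sharp zone for the 50mm and only about 70 mm for the 200mm, which is the lens effect I mean.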

There's probably an easy way to do it, programming-wise. Maybe it's just that nobody's thought of it yet. After all, pixels are just light (and many programmers often like to whine about what they can't do just to show how much they know, rather than trying to tackle the "impossible";).

;D
Eggy Lippmann
Wiktator
Join date: 1 May 2003
Posts: 7,939
09-28-2004 11:10
I have a better idea. Poke yourself in the eye. It will make everything nice and blurry :)
Chance Abattoir
Future Rockin' Resmod
Join date: 3 Apr 2004
Posts: 3,898
09-29-2004 07:43
From: someone
Originally posted by Eggy Lippmann
I have a better idea. Poke yourself in the eye. It will make everything nice and blurry :)


That would defeat the purpose, but at least you're taking that extra step to think outside the box. :P