These forums are CLOSED.
OpenGL 2.0?
paulie Femto
Into the dark
Join date: 13 Sep 2003
Posts: 1,098
12-23-2004 23:41
To what extent does SL use OpenGL 2.0 features?
_____________________
REUTERS on SL: "Thirty-five thousand people wearing their psyches on the outside and all the attendant unfettered freakishness that brings."
Huns Valen
Don't PM me here.
Join date: 3 May 2003
Posts: 2,749
12-24-2004 00:35
0% AFAIK
_____________________
Adam Zaius
Deus
Join date: 9 Jan 2004
Posts: 1,483
12-24-2004 01:04
Most graphics cards (all consumer cards) cannot take advantage of OpenGL 2.0 yet. Give it at least 12 months before there's full support and compatibility.
-Adam
_____________________
paulie Femto
Into the dark
Join date: 13 Sep 2003
Posts: 1,098
tests
12-24-2004 09:56
Here are some test apps that give an idea of how well a particular graphics card handles OpenGL 2.0. The first app tells me that my shiny new Nvidia GeForce 6800 only has 85% support for OpenGL 2.0. Sad. Some of the tests make my PC crash and reboot. My card seems to handle the second app better, except for the particle tests, which bring my 6800 to its knees. Lol.
Supposedly, a new rendering engine is coming for SL next year. Will it use OpenGL 2.0 features? Lindens, what say you?
http://www.realtech-vr.com/glview/
http://developer.3dlabs.com/openGL2/downloads/
_____________________
REUTERS on SL: "Thirty-five thousand people wearing their psyches on the outside and all the attendant unfettered freakishness that brings."
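For a rough idea of how a tool like glview might arrive at a figure such as "85% support", here is an illustrative Python sketch. The extension checklist below is a hypothetical subset picked for the example, not realtech-vr's actual test list; a real tool would query the driver via glGetString and run rendering tests as well.

```python
# Illustrative sketch: computing a "percent OpenGL 2.0 support" score
# from the extension string a driver advertises. The REQUIRED list is
# a hypothetical subset of 2.0-era functionality, not glview's real list.
REQUIRED = [
    "GL_ARB_shader_objects",
    "GL_ARB_vertex_shader",
    "GL_ARB_fragment_shader",
    "GL_ARB_shading_language_100",
    "GL_ARB_draw_buffers",
    "GL_ARB_point_sprite",
    "GL_EXT_blend_equation_separate",
    "GL_ARB_texture_non_power_of_two",
]

def support_score(extension_string: str) -> float:
    """Return the percentage of the checklist the driver advertises."""
    present = set(extension_string.split())
    hits = sum(1 for ext in REQUIRED if ext in present)
    return 100.0 * hits / len(REQUIRED)

# A driver exposing 6 of the 8 checked extensions scores 75%:
driver_exts = ("GL_ARB_shader_objects GL_ARB_vertex_shader "
               "GL_ARB_fragment_shader GL_ARB_shading_language_100 "
               "GL_ARB_draw_buffers GL_ARB_point_sprite")
print(support_score(driver_exts))  # 75.0
```

A partial score like this is really a statement about the driver, not the silicon, which is part of why the numbers feel odd.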
Huns Valen
Don't PM me here.
Join date: 3 May 2003
Posts: 2,749
12-25-2004 00:11
I bet they are on 1.5 now and will stay on 1.x for at least 1.5-2 years.
_____________________
JC Case
Registered User
Join date: 26 Dec 2004
Posts: 11
12-28-2004 18:09
Alright, I don't get something... how do you 85% support a video feature? I don't think I like video card marketers.
Also, Huns, it's wrong to throw people out of windows. And if you throw them back in, have you fenestrated them?
Max Mandala
Member
Join date: 31 Oct 2003
Posts: 19
|
12-31-2004 16:15
Since SL doesn't even use shaders at the moment I see no reason whatsoever why it should need any OpenGL 2.0 feature. Also, most OpenGL 2.0 features are available as 1.x extensions.
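Max's point about 1.x extensions can be made concrete: most OpenGL 2.0 core features were promoted from earlier ARB/EXT extensions, so an application can fall back on those when a driver only reports GL 1.5. A hedged Python sketch of that availability check follows; the feature-to-extension mapping covers the main promotions and is illustrative, not exhaustive.

```python
# Sketch: each OpenGL 2.0 core feature has 1.x ARB/EXT extension
# ancestors, so a 1.5 driver can offer equivalent functionality.
# Mapping is illustrative, not a complete list of 2.0 promotions.
GL2_CORE_TO_EXTENSIONS = {
    "GLSL shaders": ["GL_ARB_shader_objects",
                     "GL_ARB_vertex_shader",
                     "GL_ARB_fragment_shader",
                     "GL_ARB_shading_language_100"],
    "multiple render targets": ["GL_ARB_draw_buffers"],
    "point sprites": ["GL_ARB_point_sprite"],
    "non-power-of-two textures": ["GL_ARB_texture_non_power_of_two"],
    "separate blend equations": ["GL_EXT_blend_equation_separate"],
}

def feature_available(feature, gl_version, extensions):
    """A feature is usable via a 2.0+ context or via its extensions."""
    if gl_version >= (2, 0):
        return True  # promoted to core, no extension check needed
    return all(ext in extensions for ext in GL2_CORE_TO_EXTENSIONS[feature])

# A GL 1.5 driver exporting the four GLSL extensions can run shaders:
exts = {"GL_ARB_shader_objects", "GL_ARB_vertex_shader",
        "GL_ARB_fragment_shader", "GL_ARB_shading_language_100"}
print(feature_available("GLSL shaders", (1, 5), exts))  # True
```

This is exactly why "we're on 1.5" doesn't preclude shader support in practice.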
paulie Femto
Into the dark
Join date: 13 Sep 2003
Posts: 1,098
Linden input?
01-07-2005 13:31
Lindens? Any input on this one?
_____________________
REUTERS on SL: "Thirty-five thousand people wearing their psyches on the outside and all the attendant unfettered freakishness that brings."
Catherine Omega
Geometry Ninja
Join date: 10 Jan 2003
Posts: 2,053
01-07-2005 15:17
Quoting JC Case: "Alright, I don't get something... how do you 85% support a video feature?"

Say we define a toy protocol:

1.0: If one of us asks the other a question, the reply can be either "yes" or "no".

Now, let's revise it:

1.1: The reply can now be "yes", "no", or "maybe".

...and a second revision:

1.2: Two error messages, "that's not a 'yes or no' question." and "I don't understand.", are now available.

Now, say we want to follow this spec. If I ask you "where are you?", clearly not a "yes or no" question, the response with version 1.2 of the protocol will be one of the two error messages. If I've only half-implemented the 1.2 extensions, it might be a poor choice of error message as well: "I don't understand" might be the only one available. With versions 1.0 and 1.1, there will either be an internal error, resulting in NO reply, or the reply will not make any sense given the context of the question asked.

Basically, this is the problem with many communications protocols: one piece of software will meet many of the criteria, but it won't be a full implementation of the specification. Consequently, interoperability is frequently broken, even in some very, very small way. That's what much of the ATI/Nvidia/SL wackiness has been about. SL meets the OpenGL specs; it's just that when it attempts to talk to the video drivers, they have absolutely no idea what SL is asking, because the driver programmers have spent all their time implementing features for John Carmack.
_____________________
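The versioned-protocol example can be sketched in a few lines of Python. The 1.0/1.1/1.2 version numbers and their messages are the hypothetical spec from the post, not any real protocol; the point is how a half-implemented revision degrades the conversation.

```python
# Toy model of the versioned protocol: how partial implementation of a
# spec revision degrades replies. Versions and messages are the
# hypothetical 1.0/1.1/1.2 spec from the post above.
def reply(question_is_yes_no, version,
          implemented_errors=("I don't understand.",)):
    """Answer per spec version; partial 1.2 support limits error choices."""
    if question_is_yes_no:
        return "yes"  # or "no"; either satisfies every version
    if version >= (1, 2):
        # A full 1.2 implementation could also say "that's not a
        # 'yes or no' question." -- a partial one only has what it built.
        return implemented_errors[0]
    if version == (1, 1):
        return "maybe"  # the best a 1.1 speaker can do
    return None  # 1.0: no valid reply exists -> internal error, silence

print(reply(False, (1, 2)))  # "I don't understand."
print(reply(False, (1, 0)))  # None -> interoperability breaks
```

The `None` case is the "NO reply" failure mode described above: not a wrong answer, but silence.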
Tread Whiplash
Crazy Crafter
Join date: 25 Dec 2004
Posts: 291
....Or....
01-07-2005 22:48
Or they have spent all their time devising "cool looking features" for their Marketing departments. Remember 3dfx's dying days, anyone, and all their "motion blur" and "T-Buffer" effects?
At least when Carmack gets on the video-card makers' case, it's to implement a real, usable feature in an honest-to-god product! Of course, I'm biased; I used to work at Sierra, so I'm apt to side with developers. *chuckle*

Now, since we brought up Carmack: Doom 3, a 2004 release, used the GeForce 1 as the bare baseline bottom-of-the-barrel card where feature compatibility was concerned, right? The GeForce 1 came out in 2000 IIRC, so we're talking about a 3-4 year lag time, folks, between features and their full use. Why? Because it takes time:

Time to implement the features in the drivers,
Time for game programmers to learn how to take advantage of them,
Time for a product to be made,
And (most importantly) time for the market to become saturated with feature-capable cards, so that when you go to retail, you have a decent market for your software!

The OpenGL 2.0 spec is less than 6 months old. Unless we suddenly see drivers from nVidia & ATI that are over 90% compatible with the spec, we won't see major titles using it. My prediction is that, barring a sudden implementation on EXISTING HARDWARE by the aforementioned manufacturers, it will be at least 1 year from now before we see any "consumer level" products using OpenGL 2.0 extensively.

Take care,
--Noel "HB" Wade (Tread Whiplash)