What's in a Video Card?
|
Zennoa Seifert
Registered User
Join date: 28 Oct 2005
Posts: 17
|
11-23-2005 20:06
So here's my system specs:
EPoX nForce3 250Gb chipset
Athlon 64 3000+ processor running at 2.35 GHz
1024 MB of DDR400 RAM
eVGA GeForce 6800 AGP graphics card
Now I should note that I'd been waiting to get my card back after it had to be repaired, and in the meantime I started playing SL on an old TNT2 card. As expected, my graphics sucked.
Now that I have my 6800 back, I did not see much of a performance gain. I haven't done enough testing to figure it out, but when I turn up all the graphics options I get 2 FPS, and when I pull up stats for my video card and my processor, the 6800 is yawning at a comfortable 50 deg C while the processor is working overtime. Is there any video acceleration going on at all? The only thing I really noticed an improvement in was my sprite rendering.
|
Ron Overdrive
Registered User
Join date: 10 Jul 2005
Posts: 1,002
|
11-23-2005 20:57
The majority of 3D rendering calculations are done on the CPU. That goes for every game, not just SL, that doesn't have direct programmable GPU instructions written in. Unfortunately, ATI and nVidia use different instructions, so you'd have to make special plugins for the game to take advantage of either card's full capabilities.
|
Molly Switchblade
Steppin' Razor
Join date: 9 Jan 2006
Posts: 25
|
02-11-2006 13:28
From: Ron Overdrive
The majority of 3D rendering calculations are done on the CPU.

This turns out not to be the case. That is what a 3D GPU is: a 3D graphics processor offloads 3D operations from the CPU to the GPU. That's exactly the job of 3D graphics hardware. Starting in 1995, when an unknown startup named 3Dfx released a 3D graphics daughtercard for the "ProMotion" brand 2D chipsets from Alliance Semiconductor, 3D processing moved to the video chipset, not the CPU.

From: Ron Overdrive
That goes for every game, not just SL, that doesn't have direct programmable GPU instructions written in. Unfortunately, ATI and nVidia use different instructions, so you'd have to make special plugins for the game to take advantage of either card's full capabilities.

APIs like OpenGL and DirectX _are_ these "plugins," although they are implemented at the operating-system level, not the game level (well, since about 1997). Any game using these APIs can therefore use any appropriate API-compliant 3D GPU. FWIW, ATI and NVIDIA both maintain drivers for GPU compliance with both APIs, although some argue that each company is significantly better at one or the other. Having OpenGL inventors work at NVIDIA gives them a real advantage, one ATI countered by snuggling up to Microsoft for under-the-table DirectX advantages.

I don't mean to be nasty, but misinformation begs correction.

--

You may be thinking of textures. Video cards without sufficient RAM to process complex textures have to process them in mainboard RAM (or, usually, not at all).
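To make that concrete: "using the API" is nothing more exotic than this. A minimal C/GLUT sketch, not SL's actual rendering code; the generic calls below are handed to whatever driver is installed, and the driver decides what runs on the GPU.

    /* draw one triangle through the generic OpenGL API; the installed
       driver dispatches transform and rasterization to the GPU */
    #include <GL/glut.h>

    void display(void)
    {
        glClear(GL_COLOR_BUFFER_BIT);
        glBegin(GL_TRIANGLES);          /* vertices handed to the driver */
        glVertex3f(-0.5f, -0.5f, 0.0f);
        glVertex3f( 0.5f, -0.5f, 0.0f);
        glVertex3f( 0.0f,  0.5f, 0.0f);
        glEnd();
        glutSwapBuffers();
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
        glutCreateWindow("generic OpenGL, any compliant GPU");
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }

Notice there isn't one NVIDIA- or ATI-specific line in it; brand support lives in the driver, not the game.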
_____________________
All positioning relative to the avatar is calculated from ... 0,0,0 on the body, which is centered on the pelvis. - Ceera Murakami
Now I understand just about everything I've ever seen in Second Life.
|
Ron Overdrive
Registered User
Join date: 10 Jul 2005
Posts: 1,002
|
02-11-2006 15:43
Actually, _ALL_ mathematical calculations (the bulk of 3D rendering) _ARE_ done on the CPU unless the game is programmed to utilize _brand-specific GPU instructions_ (there is no standard at this time). Neither OpenGL nor Direct3D has built-in support for these brand-specific GPU instructions, as they must remain as generic as possible to stay compatible across a long line of GPUs. The reason GLIDE ran so well is that it was designed solely for 3Dfx GPUs and nothing else. What we need is something like WickedGL, which is an OpenGL replacement for 3Dfx cards that contains optimized instructions to fully utilize the GPU's potential.

Both ATI and nVidia offer their GPU SDKs to make applications perform better on their cards. I don't see how it can be hard to make some OpenGL DLLs that utilize brand-specific instructions for SL. This would offload a large amount of work to our graphics cards, letting our CPUs work less, which frees them up for other things SL may need.

Btw, OpenGL is an open-source project that has been around for a very long time (longer than Glide and Direct3D). nVidia doesn't have programmers from that project working for them because thousands of programmers around the globe have contributed to the project over the years, though some of their programmers may have aided in its development.

From: someone
I don't mean to be nasty, but misinformation begs correction.
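For what it's worth, the way an app finds those brand-specific paths is by asking the driver for its extension list. A rough C sketch, assuming an OpenGL context already exists; the extension names below are real published ones from the two vendors:

    /* probe the driver's extension string for vendor-specific paths */
    #include <stdio.h>
    #include <string.h>
    #include <GL/gl.h>

    void report_vendor_paths(void)
    {
        /* only valid once an OpenGL context is current */
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        if (!ext)
            return;
        if (strstr(ext, "GL_NV_vertex_program"))
            printf("NVIDIA-specific vertex path available\n");
        if (strstr(ext, "GL_ATI_fragment_shader"))
            printf("ATI-specific fragment path available\n");
        if (strstr(ext, "GL_ARB_vertex_program"))
            printf("generic ARB path available on either brand\n");
    }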
|
Molly Switchblade
Steppin' Razor
Join date: 9 Jan 2006
Posts: 25
|
how far from the topic can we get?
02-11-2006 18:02
From: Ron Overdrive
nVidia doesn't have programmers from that project working for them because thousands of programmers around the globe have contributed to the project over the years, though some of their programmers may have aided in its development.

Mark Kilgard, IrisGL programmer (the foundation of OpenGL), original programmer of the OpenGL Utility Toolkit (GLUT), OpenGL Architecture Review Board member, and NVIDIA employee, should be glad that posting words in a forum doesn't make them true.

--

Please then explain this to me. What does a 3D graphics processor do, if this processor is not unloading 3D calculations from the CPU? If "_ALL_ mathematical calculations (the bulk of 3D rendering) _ARE_ done on the CPU," then just what math is it that Jen-Hsun Huang's "more transistors than a Pentium 4 CPU" NV4x-series GPUs are calculating?
_____________________
All positioning relative to the avatar is calculated from ... 0,0,0 on the body, which is centered on the pelvis. - Ceera Murakami
Now I understand just about everything I've ever seen in Second Life.
|
Ron Overdrive
Registered User
Join date: 10 Jul 2005
Posts: 1,002
|
02-12-2006 07:20
From: Molly Switchblade
Please then explain this to me. What does a 3D graphics processor do, if this processor is not unloading 3D calculations from the CPU? If "_ALL_ mathematical calculations (the bulk of 3D rendering) _ARE_ done on the CPU," then just what math is it that Jen-Hsun Huang's "more transistors than a Pentium 4 CPU" NV4x-series GPUs are calculating?

At the current state, the GPU translates the CPU's mathematical results into visual information. Basically, it draws what it's told to draw via the connect-the-dots sheet the CPU supplied. Unless you write your app to offload those mathematical equations to the GPU, everything will still be done CPU-side.

The reason for this is compatibility. The main target for the bulk of nVidia's and ATI's graphics cards is games. If you rewrite your drivers to use the GPU instructions by default when a good number of games aren't written to use them, you may cause compatibility issues, and then you'd need to patch your drivers on a weekly if not daily basis to fix every compatibility issue you run across to keep the customers satisfied. It's in their better interest to convince game makers to optimize their games to run better on their card than on the competition's.

Unfortunately, game makers can't afford to play favorites, so they either A) spend more time and manpower implementing support for both cards' GPU instructions, or B) stick with the generic stuff, optimize their own code to perform well with the two generic API standards, and get their game out by their deadline. For those that do decide to do A, the NV4x GPUs will have all the transistors they need to make the game perform at its best.
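To illustrate the two sides being argued here under the fixed-function OpenGL model of the day, a rough sketch, not anybody's shipping code: path A does the vertex math in the application, while path B hands the same matrix to the driver, which can run the transform on the card's hardware T&L unit.

    /* Path A: the application (CPU) multiplies every vertex itself.
       OpenGL matrices are column-major, so m[0], m[4], m[8], m[12] form row 0. */
    #include <GL/gl.h>

    void transform_on_cpu(const float m[16], const float in[3], float out[3])
    {
        out[0] = m[0]*in[0] + m[4]*in[1] + m[8]*in[2]  + m[12];
        out[1] = m[1]*in[0] + m[5]*in[1] + m[9]*in[2]  + m[13];
        out[2] = m[2]*in[0] + m[6]*in[1] + m[10]*in[2] + m[14];
    }

    /* Path B: the driver applies the matrix to every vertex submitted;
       on cards with hardware T&L, that math runs on the GPU */
    void transform_on_gpu(const float m[16])
    {
        glMatrixMode(GL_MODELVIEW);
        glMultMatrixf(m);
    }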
|
Paradigm Brodsky
Hmmm, How do I set this?
Join date: 28 Apr 2004
Posts: 206
|
02-13-2006 08:40
If it weren't for the richer quality of multiplayer functions and communications on PCs, I would just stick with console games from now on! Everything is made for the console: you don't have to spend $2500 on graphics cards, dual/quad CPUs, bus, memory, RAID, etc. every 6 months, and everyone is on an even playing field. If the Xbox 360 comes out with a keyboard I might never upgrade my PC again.
_____________________
I'll do anything for love, most things for money, and some things for a smile.
|
Paradigm Brodsky
Hmmm, How do I set this?
Join date: 28 Apr 2004
Posts: 206
|
02-13-2006 08:45
From: Paradigm Brodsky
If it weren't for the richer quality of multiplayer functions and communications on PCs, I would just stick with console games from now on! Everything is made for the console: you don't have to spend $2500 on graphics cards, dual/quad CPUs, bus, memory, RAID, etc. every 6 months, and everyone is on an even playing field. If the Xbox 360 comes out with a keyboard I might never upgrade my PC again.
LOL. It just struck me that this will be a future trend. And the hardware manufacturers will retaliate by making consoles upgradable. OMG NOOOOOOOoooooooooooooooooo!!
OMG, they might not even come out with the best parts. They may come out with mediocre specs and make us pay EXTRA for the good stuff, and still make the default console $500. OMG, this consumer exploitation makes me sick!!! How can they do this!!!! WHYYYYYyyyyyyyyyy!!!!!!!!!!!!!!!!!
I think I'm going to have to do a pre-emptive boycott. It's back to PCs for me!
_____________________
I'll do anything for love, most things for money, and some things for a smile.
|
Corvus Drake
Bedroom Spelunker
Join date: 12 Feb 2006
Posts: 1,456
|
Lol
02-14-2006 15:03
Don't let the semantic hostility here get you down. Where the load goes is controlled by a combination of the drivers for your equipment and the application itself. Second Life is similar to Sims2 in how it passes information in a lot of cases.
BTW that's a pretty decent OC on a 3000+.
So anyway, SL is more CPU-intensive than it is GPU-intensive. The rendering itself is really rather simple for most modern video cards such as your 6800 and my 7800GTX twins. The number of graphics pipelines is also a contributing factor, and your 6800 has fewer than the GT or GTX of the same model. The CPU is called on more because most of the load from SL is in running the interpretation of the sims themselves. As long as you aren't running the advanced lighting system in the options menu, which lags even my ubersystem, the GPU won't be squeezed too badly. You should even be able to run 2x or 4x AA on that card.
My dual core made a night-and-day difference. I think SL must be multithreaded; in fact, it must be, with all these various scripts running at once. Save up some cash and get a 3800+; you'll need a BIOS flash, but that'll make a huge change for you. Your video card will make more of a difference when it comes to using AA or AF, which are handled entirely by the GPU and not the CPU for rendering purposes.
|
Ron Overdrive
Registered User
Join date: 10 Jul 2005
Posts: 1,002
|
02-14-2006 17:14
From: Corvus Drake
My dual core made a night-and-day difference. I think SL must be multithreaded; in fact, it must be, with all these various scripts running at once. Save up some cash and get a 3800+; you'll need a BIOS flash, but that'll make a huge change for you. Your video card will make more of a difference when it comes to using AA or AF, which are handled entirely by the GPU and not the CPU for rendering purposes.

Actually, SL is single-threaded. The reason it runs so well is that WinXP is multithreaded, allowing it to spread the system load. Some people have problems with dual-core CPUs and SL precisely because SL is single-threaded. The fix for that, though, is to set the affinity for SL to one of your cores.
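You can do that from Task Manager (right-click the SL process, Set Affinity), or programmatically. A minimal Win32 sketch for the curious; SetProcessAffinityMask is the real API call, but the wrapper name is made up here, and it pins the calling process (pinning another process would need its handle via OpenProcess):

    /* pin the calling process to logical CPU 0 only (Win32) */
    #include <windows.h>

    int pin_to_first_core(void)
    {
        DWORD_PTR mask = 1;  /* bit 0 set = processor 0 only */
        return SetProcessAffinityMask(GetCurrentProcess(), mask) != 0;
    }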
|
Corvus Drake
Bedroom Spelunker
Join date: 12 Feb 2006
Posts: 1,456
|
02-14-2006 17:44
From: Ron Overdrive
Actually, SL is single-threaded. The reason it runs so well is that WinXP is multithreaded, allowing it to spread the system load. Some people have problems with dual-core CPUs and SL precisely because SL is single-threaded. The fix for that, though, is to set the affinity for SL to one of your cores.

The most recent fix for that, enabling a sort of pseudo-SMP for those of us not using Opterons, is in the most recent drivers for the X2 processors at AMD.com.
|
Feynt Mistral
Registered User
Join date: 24 Sep 2005
Posts: 551
|
02-14-2006 18:40
Let's just get it over with and get the guys at Linden Labs to make SL into its own OS. If it completely dominates the CPU and has to have its own drivers to interact with the various graphics cards, we can be sure we get top performance. >D
No, I'm not forgetting any of my meds, why do you ask?
Ron is right: unless you're specifically calling the functions to make use of one video card over another, most of the graphics will be processed through the CPU in a form of software-mode graphics. If we REALLY want graphical efficiency, we either wait for the above to come true (games as an OS... boy, that takes me back. What was it, 20-30 years since the Commodore 64?), accept graphics-card-specific versions of SL (eww! Segregation based on system specs! You just KNOW they'll drop one brand of card that way), or force some sort of system check to discover which brand of video card you're using and call only the right functions for it (which is only a few steps from what they're doing now, as at the moment they're making calls for video RAM and AGP acceleration). A rough sketch of that check is below.
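Something like this, roughly; a sketch, assuming a current OpenGL context, and the vendor strings shown are the ones these drivers actually report:

    /* branch on the video card brand reported by the driver */
    #include <string.h>
    #include <GL/gl.h>

    typedef enum { CARD_NVIDIA, CARD_ATI, CARD_OTHER } card_brand;

    card_brand detect_brand(void)
    {
        const char *vendor = (const char *)glGetString(GL_VENDOR);
        if (vendor && strstr(vendor, "NVIDIA"))  /* "NVIDIA Corporation" */
            return CARD_NVIDIA;
        if (vendor && strstr(vendor, "ATI"))     /* "ATI Technologies Inc." */
            return CARD_ATI;
        return CARD_OTHER;
    }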
And then again, it may not be their fault at all; the rendering could be a fault of interaction with Havok 1, and our graphical woes will all disappear with an engine upgrade. Havok 2/3, the panacea for SL's faults! (yeah, right)
|
Corvus Drake
Bedroom Spelunker
Join date: 12 Feb 2006
Posts: 1,456
|
02-15-2006 10:58
nVidia's recent and beta drivers employ a concept that passes load between dual-core CPUs and the 6- and 7-series video cards. I think that's where I'm seeing my performance hike. That, and SLI forces OpenGL and Direct3D emulation on the GPU itself.
|