After a week of fine-tuning his brand-new, top-of-the-line machine, running Windows XP x64 with two nVidia cards working in SLI tandem (2 GB of RAM and 512 MB of video RAM), my friend and associate at RL work (which uses SL for several educational and cultural projects) wanted to build the ultimate SL powerhouse: a machine able to demonstrate SL to customers and partners at 50 fps, with a drawing distance of 256 m and the remaining Preferences settings at their maximum levels. No less should be expected from a highly fine-tuned machine that does amazing things like 300 fps (!) on some of the "game benchmarks" like 3DMark.
When launching SL, the best he got was... 4.7 fps.
You cannot imagine his disappointment. At that point he was so utterly frustrated that he assumed SLI technology was still too "early" in its development (despite having been around for a year or so) to be usable on "serious" platforms for SL. Sure, most other games ran well, and with some tweaking, fine-tuning, new memory, better drivers, and all sorts of dirty tricks of the trade, he managed, after a week or so, to get a cool-running machine with incredibly smooth rendering on almost everything he had loaded onto his computer.
Except for Second Life, which refused to spew out more than 4.5-4.7 fps, slightly more on totally empty sims, but not much more. Next to him were people using 3- and 4-year-old computers with cheap $60 video cards, happily getting 15-17 fps in the exact same spots.
After another session of deep frustration (he had some classes to teach about SL over the weekend and wished to "show off" SL at its best) he went home early, shaking his head and lamenting the poor code of SL, which can't fully use such advanced technology as dual cards and the latest drivers. He was already starting to think about alternative platforms, something he always does when his frustration level with Second Life gets high. You must understand that this guy has been "selling" SL to all sorts of customers and partners for over a year now, and he needs to feel confident about technical issues related to Second Life. Saying "oh, it works well if you have the right hardware" is a common "excuse" when people complain about a slow-running SL. This time, however, he did not have that excuse; he had the best machine his money could buy (on a limited budget), one that could deal with highly complex 3D games of all kinds, just not SL. For him, SL was fatally flawed.
Well, I hardly expect to understand 10% of the technical buzzwords he uses when explaining how 'video pipelines' work, but I've been a Mentor for over a year now, so when he left, I tried to go over his SL configuration. My first attempts at tweaking the parameters did not improve anything, so I thought I should apply the same rules I use when dealing with newbies complaining about "lag": turn local lighting off, disable far clip, reduce the drawing distance.
I unchecked the box for local lighting, and suddenly I couldn't believe my eyes: from 4.7 fps the machine jumped to 47 fps in the same place!
I have turned local lighting off on every computer I have ever put my hands on since 1.7, so I never got to see whether this feature worked or not; it never worked for me after 1.7, which makes it easy to overlook in other people's configurations. Well, happy with the results, I started to raise all the values in Preferences (stopping short of putting the drawing distance at 512 m; that, at least, really will slow down the machine, and the compromise of a 160 m drawing distance with 30+ fps *everywhere* but at The Edge was more than enough for me).
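To put that compromise in perspective, here is a rough back-of-the-envelope sketch of my own; it assumes in-world content is spread more or less evenly, so the render load grows roughly with the visible area, i.e. with the square of the drawing distance. None of this is actual client code, just arithmetic:

```python
# Crude approximation: how much more content the renderer has to cope with as
# the drawing distance grows, assuming roughly uniform content density so that
# the load scales with the visible area (drawing distance squared).

def relative_render_load(draw_distance_m: float, baseline_m: float = 160.0) -> float:
    """Render load relative to a 160 m drawing distance (a rough estimate only)."""
    return (draw_distance_m / baseline_m) ** 2

for d in (64, 128, 160, 256, 512):
    print(f"{d:>3} m  ->  roughly {relative_render_load(d):.1f}x the content of 160 m")
```

By that rough measure, 512 m means dealing with about ten times as much content as 160 m, which is why even a strong machine grinds to a halt there.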
This impressed me enormously; I had absolutely no idea that local lighting was that badly broken! As a matter of fact, I suddenly witnessed something I had never seen in SL before. With the bandwidth turned up to 1000 Kbps, and since this awesome machine is able to load all textures so stupendously quickly, uncompress them blindingly fast in memory, and push them into the vast amount of available memory on the graphics cards... suddenly the whole world seemed to slow down and work in slow motion while I was flying around a sim!
What was happening? Turning on the statistics, the FPS rate was still in the upper 30s, and everything was rendering so amazingly fast that I couldn't understand what was wrong. The sim was empty of people, with low script usage, and while it certainly had tons of textures and prims, the machine dealt with them like a hot knife through butter.
Then I looked at the time dilation: it was down to 0.52! Oh, so something was wrong with the server, then; no problem, I could always go to the next one, even flying in "slow motion". What was weird was that every time I stopped, time dilation went back up to 1.0. Fly around, and it dropped to 0.5 or thereabouts. Hmm.
Entering the next sim, the same thing happened. And on the next one. And all the neighbouring ones. Uh-oh. What was going on??
The answer seemed simple. This machine had so much computing and processing power that it was squeezing the sim server dry: pulling megabytes and megabytes of texture data, dealing with them in nanoseconds, and asking the server for more, more, and more. In that struggle, the poor server couldn't keep up with the demand. I was in the strange situation of having a machine that was more powerful than the sim server itself! It processed data faster than the sim server could deliver it, and this was causing the time dilation. Wow. That was very cool!
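For anyone who has never watched that number: time dilation is the sim's way of admitting it cannot finish its work within its per-frame budget, so it slows down in-world time instead. Here is a tiny sketch of the idea, with made-up names and assuming the usual target of about 45 sim frames per second; the real server code obviously looks nothing like this:

```python
# Illustration only (hypothetical names): when a sim frame overruns its time
# budget, in-world time is slowed down by the same ratio instead of skipping
# ahead. Assumes a nominal target of ~45 sim frames per second.

TARGET_SIM_FPS = 45.0
FRAME_BUDGET = 1.0 / TARGET_SIM_FPS  # about 22 ms per sim frame

def time_dilation(actual_frame_time_s: float) -> float:
    """1.0 when the sim keeps up; below 1.0 when a frame overruns its budget."""
    return min(1.0, FRAME_BUDGET / actual_frame_time_s)

# An idle sim finishes each frame within budget: dilation 1.0, full speed.
print(round(time_dilation(0.020), 2))   # 1.0

# A hammered sim (say, one greedy client pulling textures as fast as it can)
# needs about twice its budget per frame: dilation ~0.52, visible slow motion.
print(round(time_dilation(0.043), 2))   # 0.52
```

That matches what I was seeing: my client's FPS stayed in the upper 30s, but the sim itself was only getting through about half the simulated time it should, so everything around me crawled.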
And this from a machine that only a few hours ago coughed up just a handful of frames per second, painfully struggling to render them — just because it had local lighting on!
My suggestion/recommendation to Linden Lab: while you fix local lighting in a subsequent revision of the renderer, simply turn this feature off. On almost all decent computers, it comes on by default. Not only should it NEVER be a default, it should be greyed out and not selectable at all! Remember your new residents. They come to SL with their top-of-the-line computers, fine-tuned to play WoW at 80 fps, only to encounter a 3D virtual world where they get 4 or 5 fps on a good day. If only they could launch the SL client without this option on, they would easily get 30-50 fps on their machines! (And we Mentors would be able to explain to them that the reason they 'only' get 50-60% of the performance of other platforms is the streaming nature of SL.)
So, please, Linden Lab, get rid of that option. Your own statistics tell the same story: fewer than 1% of users turn local lighting on anyway, at least since 1.7. I loved it before 1.7, while it still worked reasonably well, but I hardly know anyone who has it enabled these days (I get 0.2 fps when I use it...). It's a useless feature, and an unused one as well.
Get rid of it fast. Don't persist in offering an option that, frankly, does not work and causes excessive frustration!
I'm almost considering setting up a big panel in the Welcome Area saying: "NEW RESIDENTS! WELCOME TO SECOND LIFE! TURN LOCAL LIGHTING OFF NOW AND GET A 10x BOOST IN YOUR PERFORMANCE! THIS IS A FAST PLATFORM IF YOU TURN IT OFF! THE FEATURE HAS BEEN BROKEN FOR ALMOST A YEAR NOW, WE KNOW IT, AND LINDEN LAB IS WORKING ON IT AS YOU READ THIS!"
... but it would be so much simpler to turn it off for good and patiently wait for the upcoming changes to the renderer. LL did the same with the AGP "improvements", at least on the Mac version, a while ago (when they were broken for a few releases)...