Welcome to the Second Life Forums Archive

How to change name and other stuff

Susan Beach
Registered User
Join date: 13 Jan 2003
Posts: 70
05-19-2003 22:33
Is it possible for me to change my name? I'm Susan Beach, but I have no idea how I ended up with that; I've always gone by Susan Killemall, and have for a few years.

And on some totally unrelated subjects - has the fly-off-the-end-of-the-world bug been fixed? The last few days I've been zipping all over the place, and I haven't been bit once by it. It used to be that I couldn't cross 3 sims without getting bit, but now it seems to be working.

Oh, yeah - what about the hole-in-the-avatar problem? My avatar and everyone else's are full of holes. I'm using the latest nVidia drivers, but it still does it. Should I be concerned, or will future releases of SL address this issue?

And lastly - what are the ramifications of building structures with a lot of small objects? Does it place any undue load on the sim or anything like that?
Yuki Sunshine
Designing Woman
Join date: 1 Apr 2003
Posts: 221
05-19-2003 23:56
The thing with the GeForce2 MX is that you have to roll back your drivers to an older version rather than running the latest. 41.09 will fix the avatar gap issue. They've been saying for a while that they're going to fix that problem, but it hasn't been addressed yet. Rolling back drivers works.
Susan Beach
Registered User
Join date: 13 Jan 2003
Posts: 70
05-20-2003 11:15
It's not an MX, it's a full-blown GeForce2 GTS. But I'm assuming that in this case it doesn't matter? I heard a while ago that they would be fixing it in a future release - I'm guessing they haven't done so yet? If that's the case, I'll just wait for them to fix it, though I had thought they would have by now, since it's been a while since they said they would.

Personally, I'd rather they work on the performance issues. I have a machine with 256MB of RAM and a slow hard drive, and it spends too much time churning the swap file to be playable. I'd like to see them tune it to run on machines like that. It screams on my desktop (GF2, 384MB RAM). Well, OK, with a 256K DSL connection it lags badly at times, but at least it's playable...
Peter Linden
Registered User
Join date: 18 Nov 2002
Posts: 177
05-20-2003 13:26
The nVidia driver issue is a tough one. Just so you know, I have an 800MHz GeForce2 machine on bad DSL at home, and I can connect at a much better speed now that I downgraded the drivers from v43.45 to 41.09.

The problem is that NVidia removed the vertex blend extension that we were using from the latest driver version that they released. We can fix this with a software patch, but that would slow down the rendering, and based on what everyone has said, we don't want to do anything that would slow rendering.

Make sense?

-P

(note: edited for accuracy, thanks to Doug for clearing that up for me. He will post a more detailed description when he has time.)
Susan Beach
Registered User
Join date: 13 Jan 2003
Posts: 70
05-20-2003 14:00
If you use surrogate keys for your primary keys in your databases, then changing names would be a snap - but I didn't design your database, and I don't know what goes on under the hood. :)
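
Just to illustrate what I mean - a toy SQLite sketch, with completely made-up table names, since I obviously don't know your actual schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Surrogate key: the row is identified by an id that never changes,
# so the display name is just ordinary data hanging off that id.
con.executescript("""
    CREATE TABLE resident (
        resident_id INTEGER PRIMARY KEY,   -- surrogate key
        name        TEXT NOT NULL
    );
    CREATE TABLE parcel (
        parcel_id INTEGER PRIMARY KEY,
        owner_id  INTEGER REFERENCES resident(resident_id)
    );
""")

con.execute("INSERT INTO resident (name) VALUES ('Susan Beach')")

# A rename touches exactly one row; every foreign key still points at
# the same resident_id, so nothing else in the database has to change.
con.execute("UPDATE resident SET name = 'Susan Killemall' WHERE resident_id = 1")
```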

I understand the driver issue - any idea why nVidia removed the vertex shaders? I thought they would maintain backwards compatibility in the new drivers; go figure. My laptop uses the 28.01 drivers - it's a GeForce4Go, and that's the most current available that I'm aware of, since nVidia doesn't release GF4Go drivers themselves; they're OEM-only :(

What do you mean when you say "I can connect at a much better speed now that I downgraded the drivers from v43.45 to 41.09"? How is driver version related to connect speed? I turn up all the visuals, and there are no performance issues related to video that I'm aware of; fps is always quite acceptable. However, I'm certainly willing to downgrade my desktop to 41.09 if it fixes the holes in the avatars for now.

More importantly, do you have any plans to fix this, or does this affect your code at such a basic level that it's just a ton of work to fix? I would think that sooner or later you are going to have to take action on this issue, as there are going to be thousands of Second Life residents using newer nVidia drivers. Better yet, would nVidia listen to you if you complained and asked them to put vertex shaders back in the drivers?
Doug Linden
Linden Lab Developer
Join date: 27 Nov 2002
Posts: 179
Explanation of nVidia driver issues
05-20-2003 14:40
Whoops, Peter was a little bit off in his description of what the issue with the nVidia drivers is, so let me be a bit more specific...

There is a particular OpenGL extension, GL_EXT_vertex_weighting, which we use to do smooth-skinning of our avatars - i.e., making the joints of our avatars blend seamlessly together. nVidia supported it in their Detonator 41.09 and earlier drivers. Unfortunately for us, when nVidia released the 43.xx revisions of their drivers, they decided to remove support for this extension in favor of the more general-purpose GL_ARB_vertex_program.
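
(For the curious: checking whether a driver still advertises the extension looks roughly like this - a PyOpenGL sketch for illustration, not our actual client code, and it assumes a GL context has already been created.)

```python
from OpenGL.GL import glGetString, GL_EXTENSIONS

def has_gl_extension(name: bytes) -> bool:
    # Only meaningful once a GL context exists; the driver advertises
    # its extensions as one big space-separated string.
    extensions = (glGetString(GL_EXTENSIONS) or b"").split()
    return name in extensions

# Detonator 41.09 and earlier advertise this; the 43.xx drivers dropped it.
can_vertex_weight = has_gl_extension(b"GL_EXT_vertex_weighting")
```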

While GL_ARB_vertex_program is supported on all of nVidia's cards due to their unified driver model, the extension runs in software emulation on chips without dedicated vertex program units - in other words, all GeForce2-based cards, and GeForce4 MX-based cards as well.

Anyway, when the extension is running in emulation mode, it runs REALLY slow - slow to the point that we decided that it would provide a better user experience to have holes in people's avatars rather than cut the frame rate by about a factor of 10.

We do plan to write our own code to do smooth-skinning of our avatars on cards without hardware vertex programs - however, I can't give you an accurate ETA for the fix at this point. Since it can be worked around with a driver rollback, for now it's on the back burner compared to our more serious bugs.
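
(The math involved isn't mysterious, by the way - per vertex, it's the same two-matrix blend the extension did in hardware. A simplified numpy sketch, not our engine code:)

```python
import numpy as np

def blend_vertex(v, bone0, bone1, w):
    """Two-matrix vertex blending, the operation GL_EXT_vertex_weighting
    provides: transform by both bone matrices, then mix by weight w."""
    v = np.append(v, 1.0)                  # homogeneous coordinate
    blended = w * (bone0 @ v) + (1.0 - w) * (bone1 @ v)
    return blended[:3]

# Made-up example: a vertex halfway between two joint transforms.
upper = np.eye(4)
lower = np.eye(4)
lower[0, 3] = 0.1                          # second bone shifted slightly
print(blend_vertex(np.array([0.0, 1.0, 0.0]), upper, lower, 0.5))
```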

As for the name changes - let's just say that there are several technical and non-technical reasons that we don't allow name changes. I don't know enough to be able to provide specific reasons, though. :)

- Doug
Ama Omega
Lost Wanderer
Join date: 11 Dec 2002
Posts: 1,770
05-20-2003 15:07
A note about laptop video drivers.

First: Do not even read this unless you are skilled in the art of computer maintenance and can roll back drivers yourself if your video driver ceases to work.

I have a Toshiba laptop with a GeForce2Go video card as my only computer, and the latest released drivers for it are based on (drum roll please) 8.x Detonators. Not a typo. Well, needless to say, that is unacceptable to me. ;)

So I went on a journey through several Dell drivers and other Toshiba drivers, and finally unzipped some semi-recent Detonators. And lo and behold, there is a driver included called GeForce2 Integrated GPU. I have a GeForce2Go, and couldn't think of a much less integrated card ;) (I know it's probably meant for nForce boards).

So the process is pretty much painless for me - unzip the Detonators from the nVidia site into a folder, update the driver, and choose the option to select the driver yourself. Browse to the right folder and select the Integrated GPU option. It works wonderfully for me.

I can't say it will work for anyone else, but this allows me to use any Detonators I wish. I'm not giving a more detailed description of how to do it, because anyone who needs a step-by-step picture guide won't be able to get back to a usable state if it doesn't work for their laptop, and thus, I think, shouldn't be doing this.

I also don't know if there is a GeForce4 Integrated GPU option, but given the nature of the Unified Driver Architecture and the design path of the 4Go and 4MX processors, the GeForce2 Integrated GPU driver may work just as well for a 4Go card.

As always, Good Luck. :)
Susan Beach
Registered User
Join date: 13 Jan 2003
Posts: 70
05-20-2003 15:08
OK, good explanation, thanks. I agree that it's a back-burner project. When Half-Life 2 comes out, I'm going to buy a GeForce4 just to play it in all its glory, so it's not that big of a deal AFAIC :)

What is the word on the fly-off-the-world issue? I've not been bit for a while - it seems to have gone away, but I don't remember seeing anything in the readme. Just curious - at first it was a real gameplay killer, because I was always getting bit, flying off the world, and crashing SL.
Doug Linden
Linden Lab Developer
Join date: 27 Nov 2002
Posts: 179
05-20-2003 15:18
Susan -

Many of the flying off the world issues have been fixed. It's fairly complicated to explain exactly how they were fixed (it requires quite a bit of understanding of our system's internals), but there are now only a couple of known cases where this can happen, mostly due to extremely poor viewer/simulator performance and/or packet loss.

- Doug
Susan Beach
Registered User
Join date: 13 Jan 2003
Posts: 70
05-20-2003 15:56
Ama Omega - I found the GeForce4 Integrated GPU, but it gave me a black screen after booting. I also tried the GF4 MX 420 - same thing. So I rolled back to the GF4Go (28.xx) drivers, as those are the only ones that seem to work so far. I wonder if nVidia no longer supports this chipset with drivers, and excludes it from the unified driver set?
Ama Omega
Lost Wanderer
Join date: 11 Dec 2002
Posts: 1,770
05-20-2003 16:31
The Go series has never been supported by the Detonators. I guess I just got lucky. :-/ I started using the GeForce2 Integrated driver sometime around Det 32.x, and it has always worked flawlessly.

If you have the spare time, you could also try the GeForce2 Integrated driver. I think the 4Gos are just souped-up 2Gos anyway. <shrug> You may not get all the features of the 4Go, but you seem to have a handle on rolling back if it doesn't work...
Susan Beach
Registered User
Join date: 13 Jan 2003
Posts: 70
05-20-2003 16:43
I did find some drivers that worked, but I was limited to 640x480, 16-bit color, so I finally went back to the 28.xx GF4Go drivers.

The GF4Go is actually a hot little chipset. It doesn't have all of the features of the GF4, but it is light years ahead of the GF2Go. It's basically somewhere between a GeForce4 MX and a full-blown GF4 - more than the MX, but less than the real thing. I think I'll just leave it where it is; it's not that big of a deal :)
Ama Omega
Lost Wanderer
Join date: 11 Dec 2002
Posts: 1,770
05-20-2003 17:46
I believe we already had this conversation. ;) And I still disagree - the 4MX and 4Go cores are based on the 2MX. The 4Go is a 2Go with higher memory bandwidth and max memory (a real performance booster), AA features, video compression/decompression features, and a faster clock. It's a different branch than the GF4. But whatever - like I said, we already discussed this, I think. :)
James Miller
Village Idiot
Join date: 9 Jan 2003
Posts: 1,500
05-20-2003 18:53
My laptop has a 4go, and I wuv it when playing SL :D
Susan Beach
Registered User
Join date: 13 Jan 2003
Posts: 70
05-20-2003 20:33
I think we have been here before. Someone claimed the 4MX and 4Go cores are based on the GeForce2 MX. This simply isn't true. The GeForce4Go core (NV17M) is a highly optimized version of the NV17 (GeForce4) and has nothing at all in common with any of the GeForce2 line. It includes the nfiniteFX II engine and Accuview antialiasing, and is light years ahead of the GeForce2Go. It's better than the GF2MX, but not the full GF4. This information is publicly available on nVidia's web site - go see for yourself. But I guess you can believe what you want ;)
Ama Omega
Lost Wanderer
Join date: 11 Dec 2002
Posts: 1,770
05-20-2003 22:51
sigh

The only Go card with the nfiniteFX II engine is the GeForce4Go 4200. This is because the 4200 is a full-fledged GeForce4 4200 chip, not a standard 4Go. If that's the processor you have, then you have a solid, standard GeForce4 chip, and you're correct that it bears no relation to the 2Go.

Accuview is the added AA I mentioned.

VPE, or Video Processing Engine, is the added video compression/decompression hardware (also added to the GeForce4 MX, which I didn't know).

Lightspeed Memory Architecture II is the enhanced memory bandwidth and max memory.

The GeForce4 is not NV17, but NV25. For reference:

GeForce2 - NV15
GeForce3 - NV20
GeForce4 - NV25
GeForce4 MX - NV17
GeForce4 Go - NV17 (except the GeForce4Go 4200)
GeForce FX - NV30, and the new (already!) FX is NV35

The chip in the Xbox is the NV2A - between the GeForce3 and GeForce4, and highly optimized for TV (lotsa AA, low-res support, and other effects goodies).

If you can find some NV numbers for the 2Go or 2MX, I would like to know them, because I just can't find them. I think they are just derivatives of the NV15 (GeForce2 GTS) and never had their own NV numbers. That doesn't seem quite right, even to me, but I have looked. :)

It is about my only peeve with nVidia - that they labeled the GeForce4 MX cards as GeForce4s. They are based on GeForce2 technology and are inferior to GeForce3s, let alone GeForce4s. Basically, nVidia forked their development tree at the GeForce2: on one branch they took their flagship products, adding new features and greater speed. On the other branch they created the MX budget and Go lines, concentrating on price, with a few hand-me-downs from the flagship, such as AA, and a few extras such as PowerMizer for the Go series and video processing for both. Even with those hand-me-downs, the 4Go and 4MX lines are based on GeForce2 MX/2Go technology - not GeForce3, and not GeForce4.

This information was all gathered from the nVidia website (all technology names and info) and Tom's Hardware Guide (the NV numbers, cross-referenced by searches on Google).