Welcome to the Second Life Forums Archive

These forums are CLOSED. Please visit the new forums HERE

-multiple

Gideon McMillan
Registered User
Join date: 31 Jan 2007
Posts: 6
09-10-2007 10:30
I run -multiple as a gridargs.dat argument on the 1.8.3.2 client in both Windows and Linux.

The performance in Windows for two SL sessions is acceptable. The performance in Linux is not acceptable. Same machine, same config. Is there something else I should set in the Linux client to make the -multiple performance acceptable?
Ceemore Paine
Registered User
Join date: 23 Jun 2007
Posts: 10
09-11-2007 05:52
I am curious about this too. How are people running multiple clients in *nix?
Drake Bacon
Linux is Furry
Join date: 13 Jul 2005
Posts: 443
09-11-2007 10:44
What graphics card are you running, Gideon?
_____________________
Drake Bacon/Drake Winger
Home: Custom AMD X2 (65nm) 5000+, 4 Gig RAM, Gentoo amd64, NVidia GeForce 8600GT PCIe
Mobile: Dell Inspiron E1505 (Core Duo 1.6GHz, 1 gig RAM, Gentoo x86, NVidia GeForce Go 7300 PCIe)
Backup: iMac (Core 2 Duo 2.4GHz, 4 gig RAM, ATI Radeon HD 2400, MacOS X Leopard)
Don't Ask: Asus EeePC 900A (Atom 1.6Ghz, 1 gig RAM, Intel graphics, Gentoo x86)
Angel Sunset
Linutic
Join date: 7 Apr 2005
Posts: 636
09-11-2007 23:15
From: Gideon McMillan
I run -multiple as a gridargs.dat argument on the 1.8.3.2 client in both Windows and Linux.

The performance in Windows for two SL sessions is acceptable. The performance in Linux is not acceptable. Same machine, same config. Is there something else I should set in the Linux client to make the -multiple performance acceptable?


I did this in the past too, occasionally.

What I found was also necessary was the "-port 1300x" parameter, with x from 0 to 8 (that range worked for me), so that the data streams to the clients are separate. Original client with "-port 13000", second with "-port 13002".
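For reference, a minimal dry-run sketch of what the two launch lines look like. This assumes the stock "secondlife" wrapper script passes extra arguments through to the client; to stay safe it only echoes the commands instead of running them:

```shell
#!/bin/sh
# Dry-run sketch: build one launch command per UDP port so the data
# streams to the two clients stay separate. On a real install you would
# run these (backgrounded with &) instead of echoing them.
CMD1="secondlife -multiple -port 13000"
CMD2="secondlife -multiple -port 13002"
echo "first session:  $CMD1 &"
echo "second session: $CMD2 &"
```

Running both on distinct even ports avoids the two sessions fighting over the same UDP socket.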

Some people also installed SL twice, so that the caches remain separate. I THOUGHT I knew the parameter in the secondlife script to set to put the cache in a separate directory, but I don't :( This would save having a second installation...

Another issue could be how well the graphics memory of the card is used, but since the Windows SL seems to be OK with this, if the graphics options in Preferences are set up the same, this SHOULD be OK.

All the above is my experience with the NVidia proprietary drivers. I can't speak for the rest...
_____________________
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Kubuntu Intrepid 8.10, KDE, linux 2.6.27-11, X.Org 11.0, server glx vendor: NVIDIA Corporation, server glx version: 1.5.2, OpenGL vendor: NVIDIA Corporation, OpenGL renderer: GeForce 9800 GTX+/PCI/SSE2, OpenGL version: 3.0.0 NVIDIA 180.29, glu version: 1.3, NVidia GEForce 9800 GTX+ 512 MB, Intel Core 2 Duo, Mem: 3371368k , Swap: 2570360k
Gideon McMillan
Registered User
Join date: 31 Jan 2007
Posts: 6
09-12-2007 05:18
From: Drake Bacon
What graphics card are you running, Gideon?


I'm using an nVidia 6200 with 256 MB.
Gideon McMillan
Registered User
Join date: 31 Jan 2007
Posts: 6
09-12-2007 05:27
From: Angel Sunset

What I found was necessary was also the "-port 1300x" parameter, with x from 0 to 8 worked for me, so that the datastreams to the clients are seperate. Original client with "-port 13000", second with"-port 13002".


Thanks Angel!
Ceemore Paine
Registered User
Join date: 23 Jun 2007
Posts: 10
09-12-2007 07:09
Angel

You're editing those parameters into the secondlife script? Still, how would you keep the caches separate? Isn't the cache kept in the ~/.secondlife folder? I would think that, running two copies of the client without a parameter to set the cache folder, they would both be looking in this folder for settings and cache?
Angel Sunset
Linutic
Join date: 7 Apr 2005
Posts: 636
Separate cache
09-12-2007 23:58
In the GUI you "can set the cache location". However, this does not work on Linux, last time I looked.

I do set the parameters in the script, yes... Each script is then renamed appropriately, e.g. "secondlife00" and "secondlife08" for ports 13000 and 13008 with -multiple, with the original "secondlife" script left unchanged for normal SL access. The "-multiple" parameter does seem to have a slight FPS overhead.

THEORETICALLY, changing the location of the cache in the GUI should write a parameter to the script that launches Second Life, but this does not seem to work.

If I knew the name of the variable I would set it manually in the script, but I don't. Maybe someone can help with this?

Practically, I have found the "shared cache" problem not to be a show-stopper; but that is experience from about a year ago, it may be more critical now :p

---------------------------------------

Edit: the variable is in the README (who reads READMEs??? :D ) in this forum:

"User data is stored in the hidden directory ~/.secondlife by default; you may
override this location with the SECONDLIFE_USER_DIR environment variable if
you wish."

Setting this to point to ".secondlife00" and ".secondlife08" will separate the caches. Again, in the appropriate "secondlifexx" script.
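A sketch of what one such "secondlife00" wrapper could look like. The directory name and port are just the examples above, and the actual launch line is left commented out, so this only prepares the environment:

```shell
#!/bin/sh
# Per-session wrapper sketch: give this session its own user directory
# (settings AND cache) via SECONDLIFE_USER_DIR, as described in the
# README, plus its own port. The dir/port pairing is just the example
# from this thread.
SECONDLIFE_USER_DIR="${HOME:-/tmp}/.secondlife00"
export SECONDLIFE_USER_DIR
mkdir -p "$SECONDLIFE_USER_DIR"
echo "user dir: $SECONDLIFE_USER_DIR"
# exec ./secondlife -multiple -port 13000   # uncomment on a real install
```

A "secondlife08" copy would just swap in ".secondlife08" and port 13008.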

----------------------------------------------------
_____________________
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Kubuntu Intrepid 8.10, KDE, linux 2.6.27-11, X.Org 11.0, server glx vendor: NVIDIA Corporation, server glx version: 1.5.2, OpenGL vendor: NVIDIA Corporation, OpenGL renderer: GeForce 9800 GTX+/PCI/SSE2, OpenGL version: 3.0.0 NVIDIA 180.29, glu version: 1.3, NVidia GEForce 9800 GTX+ 512 MB, Intel Core 2 Duo, Mem: 3371368k , Swap: 2570360k
Tofu Linden
Linden Lab Employee
Join date: 29 Aug 2006
Posts: 471
09-13-2007 06:51
From: Angel Sunset
In the Gui you "can set the cache location". However this does not work for Linux, last time I looked.
Just as an aside, this is fixed internally along with a Linux file-picker revamp. I expect it'll be part of 1.18.4.
Angel Sunset
Linutic
Join date: 7 Apr 2005
Posts: 636
09-13-2007 12:34
Thanks, Tofu!

I know Sun (the Unix company) has a GREAT dislike of GUIs, but I think SL is beyond that kind of purity :D
_____________________
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Kubuntu Intrepid 8.10, KDE, linux 2.6.27-11, X.Org 11.0, server glx vendor: NVIDIA Corporation, server glx version: 1.5.2, OpenGL vendor: NVIDIA Corporation, OpenGL renderer: GeForce 9800 GTX+/PCI/SSE2, OpenGL version: 3.0.0 NVIDIA 180.29, glu version: 1.3, NVidia GEForce 9800 GTX+ 512 MB, Intel Core 2 Duo, Mem: 3371368k , Swap: 2570360k
Moy Loon
eeeeaaaaaaaah!!! <:0
Join date: 29 May 2007
Posts: 6
10-01-2007 22:40
Here's something interesting... I've gotten multiple instances to run flawlessly. I just created another user account and then ran one SL through a terminal. Both run amazingly: ~35 fps on the window I'm on, and ~15 on the window that's inactive.

Much better than running them both as the same user, and MUCH better than running two in Windows XP. The downside is they don't share a cache, so load times will go up a little, and they take up more hard drive space.

My comp is fairly old and runs two clients at slightly less than if it was just one, until I run low on RAM; then it starts to slow down as usual =p
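A dry-run sketch of that second-account approach (the "sl2" account name is hypothetical, and on a real system the second user also needs access to your X display, e.g. via xhost; nothing is actually launched here, the commands are only echoed):

```shell
#!/bin/sh
# Dry-run sketch: run the second viewer under a different local user so
# each account gets its own ~/.secondlife (settings and cache). "sl2" is
# a made-up account name.
SL_USER="sl2"
GRANT="xhost +si:localuser:$SL_USER"
LAUNCH="su - $SL_USER -c \"DISPLAY=${DISPLAY:-:0} secondlife\""
echo "would run: $GRANT"
echo "would run: $LAUNCH"
```

The xhost grant is per-user rather than host-wide, which avoids opening the display to everything.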

Notes:
Often SL will not start when using non-superuser accounts...
I ran it as root by mistake, and from a little testing it ran correctly every time that way... odd...

Specs:
3.0 GHz Celeron / 1 GB RAM / 6600GT 128 MB / 810 GB HDD space
Fedora Core 6, Kernel:2.6.22.2-42.fc6
Gnome