Welcome to the Second Life Forums Archive

These forums are CLOSED.

Any evidence that 1 gig is a good maximum cache size

SuezanneC Baskerville
Forums Rock!
Join date: 22 Dec 2003
Posts: 14,229
12-24-2005 17:04
Can anyone produce any numbers to support claims that 1 gig is a rational maximum cache size?

I am not looking for the usual unsupported claims, platitudes, rules of thumb, aphorisms, old wives tales, broad generalities, gut feelings, etc. that are usually offered in answer to that question.

Has there ever been a test running Second Life in its present form using larger cache sizes on modern user equipment in real life internet connections?
_____________________
-

So long to these forums, the vBulletin forums that used to be at forums.secondlife.com. I will miss them.

I can be found on the web by searching for "SuezanneC Baskerville", or go to

http://www.google.com/profiles/suezanne

-

http://lindenlab.tribe.net/ created on 11/19/03.

Members: Ben, Catherine, Colin, Cory, Dan, Doug, Jim, Philip, Phoenix, Richard,
Robin, and Ryan

-
Eggy Lippmann
Wiktator
Join date: 1 May 2003
Posts: 7,939
12-24-2005 17:19
The semi-official answer to that is, IIRC, "for larger cache sizes it would take more time to search the cache than to redownload the image".
This of course completely ignores the issue that some users may not care how long it takes to load, as long as they aren't paying extra money to their ISP for that traffic. Outside America and a few other selected countries, flat-rate is incredibly uncommon. I can only download up to 3 gigs, after that they start charging me extra. I have paid up to $100 in surcharges in the past. If LL made SL less of a bandwidth hog, I could spend that money on land instead.
SuezanneC Baskerville
Forums Rock!
Join date: 22 Dec 2003
Posts: 14,229
12-24-2005 17:48
The passage Eggy suggested, "for larger cache sizes it would take more time to search the cache than to redownload the image", is a perfect example of the type of unsupported claim that I am not asking for. Thanks, Eggy, for offering that example.

Interesting that you offer a reason to increase the cache size other than speed. Speed is what I was thinking about. I don't have a limit on the amount of bytes I get to send or receive. Oh, there may be some limit but SL is not going to be coming close to whatever it might be.

There is a case in which unsupported claims that servers with similar clock speeds were nearly equivalent in processing capacity were shown to be false as a result of prolonged effort by Second Life users.

Unless it's possible to hack the client to support a larger cache size, I see no way for users to make such tests with regard to cache size.

Many users have processors which are in many ways effectively much the same as two processors. I have one I think fits that bill. I suspect that the second, unused logical processor in that computer could be searching that machine's second hard drive without much effect on the Second Life process running on logical CPU one.
_____________________
Nathan Stewart
Registered User
Join date: 2 Feb 2005
Posts: 1,039
12-24-2005 18:14
From: SuezanneC Baskerville
Can anyone produce any numbers to support claims that 1 gig is a rational maximum cache size?

I am not looking for the usual unsupported claims, platitudes, rules of thumb, aphorisms, old wives tales, broad generalities, gut feelings, etc. that are usually offered in answer to that question.

Has there ever been a test running Second Life in its present form using larger cache sizes on modern user equipment in real life internet connections?


The general slowness with texture loading comes from the image list shifting textures in and out of memory depending on where each one sits on the list. It therefore isn't calling them from the cache quickly enough, so it looks like it's having to reload them from the asset server.

Although, to answer your question, it is possible to test the cache up to a size of 1,125MB, which is a 12% increase in size (this requires altering the settings file, and afterwards you can't use Preferences unless you want to go back to normal settings).

I've been testing it, and in theory it should be faster, although the nature of the index file may slow things down after long use. It's also hard to benchmark at the moment, as everything is very much slowed down by the image list problems.
_____________________
Nathan Stewart
Registered User
Join date: 2 Feb 2005
Posts: 1,039
12-24-2005 18:35
From: SuezanneC Baskerville

Unless it's possible to hack the client to support a larger cache size, I see no way for users to make such tests with regard to cache size.


I can only offer the 12% increase as previously posted; using the next-highest value seems to fail and defaults back to a 512MB cache.

So close Second Life, edit your settings.ini, and change the following line:

VFSSize 3

to

VFSSize 4

Save, then restart SL. After this, don't open Preferences until you finish testing; clicking in the cache box will reset it back to a 1000MB cache. If you look in your cache folder, the new file is approximately 1123MB.

Note: this is totally unsupported and undocumented, so don't take its accuracy, usefulness, or future availability for granted.
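For anyone who wants to script this tweak rather than edit by hand, here's a minimal sketch, assuming (as the post above describes) a plain-text settings.ini containing a `VFSSize` line. The file path varies by install and is passed in explicitly; this is an illustration of the edit, not an official tool.

```python
# Sketch: bump the VFSSize value in the viewer's settings file, as described
# above. The filename (settings.ini) and key (VFSSize) come from the post;
# the exact location varies by install, so pass the path in explicitly.
import re
import shutil

def bump_vfs_size(settings_path, new_value=4):
    """Back up the settings file, then rewrite its VFSSize line in place."""
    shutil.copy(settings_path, settings_path + ".bak")  # keep a backup first
    with open(settings_path) as f:
        text = f.read()
    # Replace e.g. "VFSSize 3" with "VFSSize 4", leaving the rest untouched.
    new_text, count = re.subn(r"(?m)^(VFSSize\s+)\d+",
                              r"\g<1>%d" % new_value, text)
    if count == 0:
        raise ValueError("No VFSSize line found in %s" % settings_path)
    with open(settings_path, "w") as f:
        f.write(new_text)
```

As Nathan warns, this setting is unsupported, so keeping the `.bak` copy around makes it easy to revert.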
_____________________
Ron Overdrive
Registered User
Join date: 10 Jul 2005
Posts: 1,002
12-24-2005 18:40
Keep in mind the cache does fragment rather quickly, which causes further slowdowns. A larger cache will mean more fragmented files to search and load. I've loaded my cache onto another drive once before, and with the cache being the only thing on that drive I noticed huge fragmentation ratios (like 14.1:1).
SuezanneC Baskerville
Forums Rock!
Join date: 22 Dec 2003
Posts: 14,229
12-24-2005 19:00
Ron's post is a general claim, with no numbers to suggest that 1 gig is a sensible maximum limit for the cache size for Second Life. The claim is also false.

Defragging 2 gigs, twice the current cache size, is a trivial operation.

Defrag programs can be set to run in the background during otherwise idle time, or scheduled to run at times while the user is not at the computer.

Hard drives of large capacity are cheap. Maintaining a defragged 10 gig cache on modern sized hard drives is not an issue if the user doesn't want it to be.

Without numbers resulting from actual tests, the claim that a larger cache won't improve speed due to fragmentation is just a superstition.
_____________________
Alondria LeFay
Registered User
Join date: 2 May 2003
Posts: 725
12-24-2005 20:14
SuezanneC - What are you trying to achieve by this? The client will not allow much larger than 1 gig, so how can you expect someone to produce numbers from setting it larger? If your anger is at LL for not allowing it to be higher, perhaps you should ask _them_ for evidence, not the fellow residents here who are trying to be of assistance but are being met with aggravation.
Hiro Pendragon
bye bye f0rums!
Join date: 22 Jan 2004
Posts: 5,905
12-24-2005 20:31
I believe the conclusion of the last major discussion I had with people on this issue was that a 1 gig cache is actually significantly slower to search than a 500MB cache.

I think a better solution is looking at ways to reduce how much is downloaded, to better make use of the cache. My own idea is an inside / outside bounding box object that users could place themselves, so that people outside would not download what's inside, and vice versa. (with options - like allowing 1 way or the other, etc). This would also be a great tool for privacy.
_____________________
Hiro Pendragon
------------------
http://www.involve3d.com - Involve - Metaverse / Emerging Media Studio

Visit my SL blog: http://secondtense.blogspot.com
SuezanneC Baskerville
Forums Rock!
Join date: 22 Dec 2003
Posts: 14,229
12-24-2005 20:48
From: Hiro Pendragon
I believe the last major discussion I had with people on this issue was that 1 gig cache size is actually a significant slowdown to search over a 500mb cache.

Numbers please?

The numbers would need to use today's client, of course, to have any real validity for today's client.

Who's to say that the changes in the interest list made recently, for example, wouldn't invalidate the unsupplied results of this alleged test, reported by unspecified people in a discussion you only believe you had?

Why then would LL set the max cache size over the optimum size?

To spare bandwidth usage for people in Eggy's situation?

If that's the case, why not increase the maximum to allow even more savings for bandwidth-restricted users, and let the rest of us find the optimum size on our own?

It might be because someone made a guess a few years back based on their recollection of some then already out of date performance specs for user equipment and common internet connections, and now it's religious dogma, eternally true without evidence and never needing to be tested.


--- By the way, Merry Christmas, Hiro.
_____________________
FlipperPA Peregrine
Magically Delicious!
Join date: 14 Nov 2003
Posts: 3,703
12-24-2005 23:14
Echoing many of the responses above, I believe the cache to be close to non-functional. A different overall conceptual cache scheme is needed, to be in line with the new interest code.

To wit: While in Indigo, I should *never* see bandwidth spikes above chat, location and other script activity. Yet, I still see 300kbps type activity at times - in my home sim!

I have a hunch that the 2.0 SL will also include a much improved cache - otherwise, the longer draw distances will surely destroy the whole grid. If we can view up to 512 - as is rumored - I'd like to see the local cache not only STORE a 512x512 texture - but also, client side, resize it to 256x256, 128x128, 64x64, etc... and given a distance, render accordingly.

In the current implementation, we view an object that's 1 inch by 1 inch by 1 inch with a 512x512 texture on it at full rez, right? How about texture limits per object face?

Regards,

-Flip
_____________________
Peregrine Salon: www.PeregrineSalon.com - my consulting company
Second Blogger: www.SecondBlogger.com - free, fully integrated Second Life blogging for all avatars!
Adam Zaius
Deus
Join date: 9 Jan 2004
Posts: 1,483
12-24-2005 23:24
From: FlipperPA Peregrine
Echoing many of the responses above, I believe the cache to be close to non-functional. A different overall conceptual cache scheme is needed, to be in line with the new interest code.

To wit: While in Indigo, I should *never* see bandwidth spikes above chat, location and other script activity. Yet, I still see 300kbps type activity at times - in my home sim!

I have a hunch that the 2.0 SL will also include a much improved cache - otherwise, the longer draw distances will surely destroy the whole grid. If we can view up to 512 - as is rumored - I'd like to see the local cache not only STORE a 512x512 texture - but also, client side, resize it to 256x256, 128x128, 64x64, etc... and given a distance, render accordingly.

In the current implementation, we view an object that's 1 inch by 1 inch by 1 inch with a 512x512 texture on it at full rez, right? How about texture limits per object face?

Regards,

-Flip


Actually, there's a very good chance mip-mapping lowers the resolution of those small faces.

-Adam
_____________________
Co-Founder / Lead Developer
GigasSecondServer
FlipperPA Peregrine
Magically Delicious!
Join date: 14 Nov 2003
Posts: 3,703
12-24-2005 23:29
From: Adam Zaius
Actually, there's a very good chance mip-mapping lowers the resolution of those small faces.

-Adam


Yes, but you're still talking real-time calculation. That's not an option, long term. :-)
_____________________
Frans Charming
You only need one Frans
Join date: 28 Jan 2005
Posts: 1,847
12-24-2005 23:39
I remember those old 2.0 pics, and if I recall correctly there were no textures on the prims. I always thought it was just an extreme pic to show some eye candy, not something that would be practical to use.
I would be really surprised if we get such a draw distance with textures on. But who knows, maybe we'll get the option to not download and use textures beyond a certain distance.
_____________________
SuezanneC Baskerville
Forums Rock!
Join date: 22 Dec 2003
Posts: 14,229
12-24-2005 23:54
From: FlipperPA Peregrine
Indigo, I should *never* see bandwidth spikes above chat, location and other script activity. Yet, I still see 300kbps type activity at times - in my home sim!
How much data is required to be stored to display Indigo?

The obvious and no doubt wrong explanation of why you see bandwidth usage is that 1 gig is not enough, so stuff has to get swapped in and out of the cache, hence the bandwidth usage. That is what you would expect to see if the cache wasn't big enough, right?

Alondria - I am not angry. There is not an angry word in anything I wrote. Look at some of the unbearable threads near this one if you want to see what angry posts look like.

I am skeptical.

I and other people have asked this question before and been told 1 gig is big enough.

People make claims that 1 gig is as big as it needs to be, but they offer no proof.

Of course we can't test it properly without a lot of work, if at all. For all I know, we could have that ability if LL would make a few simple changes to the numbers that define the max cache file size, change some loop limits, and recompile.

If I manage to induce someone a bit more technically oriented than me, someone better connected with the Lindens, to take an interest in this and ask the Lindens to show test results if they have them, or to make a modified client available for actual field testing, I am sure they would be more likely to get a successful response from the Lindens than I would. I am just an egg.


Nathan - I am only now seeing your posts, so I have read the responses out of order.

What sorts of tests can one do to get useful numbers for comparing the effects of different cache sizes? It might be of interest to some to run tests with the sizes we can test, and see how that interacts with processor speed, amount of memory, etc.

Second Life is not making full use of the modern processors with logical dual CPUs and such, is it?

If I read Nathan's post right, did he say that the slightly bigger cache seems a little faster, or that it should be a little faster?
_____________________
FlipperPA Peregrine
Magically Delicious!
Join date: 14 Nov 2003
Posts: 3,703
12-25-2005 00:38
I'm curious to hear the Lindens weigh in on the caching issue... and, if they have the time, these random questions:

If, let's say, a texture was only loaded at 1024x1024 for avatars within 5 meters...
And 512x512 was only loaded for avatars within 15 meters...
and 256x256 was only loaded for avatars within, let's say, 30 meters...
and 128x128... etc...

HOWEVER, farther than that, the blank texture was loaded with the color of the texture that was set being the fill color?

Would that work? Now that we can see the color of the texture from above (see: awesome new map) isn't this a possibility to make things more efficient in the viewer?

I wonder. Just dreaming - and thinkin' long term!

-Flip
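Flip's distance ladder can be sketched in a few lines. The distance thresholds and resolutions below come straight from his questions above; everything else (names, the final "fill color" fallback) is an illustrative assumption, not actual viewer logic.

```python
# Sketch of the distance-based texture resolution ladder proposed above.
# Thresholds come from the post; this is an illustration of the idea only.

# (max distance in meters, texture resolution to load)
RESOLUTION_LADDER = [(5, 1024), (15, 512), (30, 256), (60, 128)]

def texture_resolution(distance_m):
    """Return the resolution to load for a given camera distance, or None
    to mean 'skip the texture and render the face with its fill color'."""
    for max_dist, res in RESOLUTION_LADDER:
        if distance_m <= max_dist:
            return res
    return None  # too far away: use the texture's average color instead
```

One design question such a scheme raises is hysteresis: without some margin between the load and unload thresholds, a texture sitting right at a boundary would be fetched and dropped repeatedly as the camera jitters.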
_____________________
Hiro Pendragon
bye bye f0rums!
Join date: 22 Jan 2004
Posts: 5,905
12-25-2005 03:56
From: FlipperPA Peregrine
I'm curious to hear the Lindens weigh in on the caching issue... and, if they have the time, these random questions:

If, let's say, a texture was only loaded at 1024x1024 for avatars within 5 meters...
And 512x512 was only loaded for avatars within 15 meters...
and 256x256 was only loaded for avatars within, let's say, 30 meters...
and 128x128... etc...

HOWEVER, farther than that, the blank texture was loaded with the color of the texture that was set being the fill color?

Would that work? Now that we can see the color of the texture from above (see: awesome new map) isn't this a possibility to make things more efficient in the viewer?

I wonder. Just dreaming - and thinkin' long term!

-Flip

That certainly sounds like a decent idea to work on for SL 2.0 viewer ... perhaps having the prim size as a factor as well? Larger prims would need to texturize farther away than smaller ones.

And remember it would be measured from object to camera, not object to avatar. Or, perhaps the minimum of the two?
_____________________
Nathan Stewart
Registered User
Join date: 2 Feb 2005
Posts: 1,039
12-25-2005 11:41
Hi ya,

Not done any testing today, being Christmas and all, lol. Although my net connection isn't unlimited, it's around 75GB a month, so I'm fairly OK with SL's use of bandwidth.

It's very hard to test comparatively when I see bigger issues with the image list and how it's managing video card RAM. For instance, if you open the statistics window and click Advanced and then Textures, on a 256MB card you will never be able to get GL Mem above 160MB. This is the same as what happens at the 128MB setting: 160MB seems to be counting all the memory used, as the "bound memory" levels appear consistent with what's displayed in the texture console. The same occurs on a 512MB setting. You can test this on lower-end cards, and they will use system memory to cache textures if you use the -noprobe startup option to give you more options in prefs.

I ran my laptop, which is a 64MB one, on the 256MB setting and it only reached the same 160MB GL Mem, although performance suffers greatly doing this.

If you watch the statistics for a while, you'll see where the slowdown is occurring: you can see where you hit your GL Mem limit, and then the viewer has to unload or reduce the quality of some distant textures before it loads closer ones, so you will see GL Mem drop before the textures in front of you load.

The higher the draw distance is set, the more problems you'll see, as objects seem to take up the RAM and you'll get longer wait times, because there is a bigger border on the draw distance over which texture quality must be reduced before images near you can load.

That explains the image list problem, I think, or at least as I'm seeing it. If that made sense: a texture has to unload or reduce quality to free up available memory on your video card before the viewer will request a new texture from either the cache or the asset server. This delay is what makes people say it's a cache problem, when it's probably the image list processing too slowly to even request from the cache in the first place.
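The evict-before-request behavior Nathan describes can be modeled in miniature: nothing new can be fetched (from cache or asset server) until enough GL memory has been freed by dropping distant textures. All names and numbers below are illustrative assumptions, not viewer internals.

```python
# Toy model of the behavior described above: a fixed GL memory budget where
# the farthest textures must be evicted before a nearer one can be loaded.
import heapq

class TextureMemory:
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.used = 0
        # min-heap on negative distance, so the farthest texture pops first
        self.loaded = []  # entries: (-distance, name, size_mb)

    def load(self, name, size_mb, distance):
        """Load a texture, evicting the farthest ones first to make room.
        Returns the list of evicted texture names."""
        evicted = []
        while self.used + size_mb > self.capacity and self.loaded:
            _, victim, vsize = heapq.heappop(self.loaded)
            self.used -= vsize
            evicted.append(victim)  # the unload delay Nathan describes
        if self.used + size_mb > self.capacity:
            raise MemoryError("texture larger than the whole GL budget")
        heapq.heappush(self.loaded, (-distance, name, size_mb))
        self.used += size_mb
        return evicted
```

The point of the toy: even a fast disk cache can't help until this eviction step completes, which is why the symptom looks like a slow cache.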

OK, now onto the cache. This is very hard to test, because we only have that small 12% margin. But I think there is something people may be missing in their tests, and that's latency. A test on a local cache in San Francisco compared with a lookup from the asset server may be slower above 1GB, but when you factor in the kind of pings we experience worldwide, it may very well still be faster to receive from a larger cache. I know I have an average ping of about 200ms in the UK, not to mention how much higher that can get as you go further toward central Europe and Asia.

I agree with the others that the size of Second Life compared with the size of the cache doesn't seem at all proportional anymore. I don't have any actual numbers, as I personally think there are a number of problems coming together to contribute to an overall effect.
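Nathan's latency point reduces to simple arithmetic: a cache hit avoids the network round trip entirely, so even a somewhat slower-to-search cache can win. The figures below are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope for the latency argument above: a cache hit costs
# only the local lookup, while a miss adds a full round trip plus transfer.
# All millisecond figures are illustrative, not measured.

def fetch_time_ms(cache_lookup_ms, hit, rtt_ms, transfer_ms):
    """Time to obtain one texture, in milliseconds."""
    if hit:
        return cache_lookup_ms
    return cache_lookup_ms + rtt_ms + transfer_ms

# With the ~200 ms UK ping from the post, a few extra milliseconds of
# cache-search time is dwarfed by the cost of going to the asset server.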
_____________________
Ron Overdrive
Registered User
Join date: 10 Jul 2005
Posts: 1,002
12-25-2005 15:32
From: SuezanneC Baskerville
Ron's post is a general claim, with no numbers to suggest that 1 gig is a sensible maximum limit for the cache size for Second LIfe. The claim is also false.

How about you show me numbers proving otherwise?
From: someone

Defragging 2 gigs, twice the current cache size, is a trivial operation.

True, but how many people do you know who defrag on a regular basis?
From: someone


Defrag programs can be set to run in the background during otherwise idle time, or scheduled to run at times while the user is not at the computer.

Last I checked, MS Defrag is incapable of doing this efficiently outside of safe mode. Anything else costs money to buy, or time spent downloading illegal warez copies; neither is an option for most people. Also keep in mind these programs require CPU time, which is hard to come by while someone is using nearly 100% of the CPU for SL. And it's still hard for the program to do its job while the same data it's trying to organize is constantly being read, written, and removed.
From: someone


Hard drives of large capacity are cheap. Maintaining a defragged 10 gig cache on modern sized hard drives is not an issue if the user doesn't want it to be.

Once again, how many people actually defrag their system? Better yet, how many people know what defragging is? How about you show us those numbers?
From: someone


Without numbers resulting from actual tests, the claim that a larger cache won't improve speed due to fragmentation is just a superstition.

I learned from personal experience, in my own experiments, how much fragmentation affects speed with SL's cache. You claim you want numbers from people to prove it's not effective. How about instead of asking people for numbers, you generate your own and show us? I think that'd be more effective than expecting people to do the work for you.
SuezanneC Baskerville
Forums Rock!
Join date: 22 Dec 2003
Posts: 14,229
12-25-2005 17:09
To schedule a defrag of the C drive in Windows XP create a batch file called DefragC.bat in a folder you can remember the location of.

The file should have one line that says (minus the quotes): "defrag c:"

Then use the Windows XP task scheduler to run this batch file at a time when you aren't using the computer and don't have any other tasks scheduled.

Schedule this for daily.

The first run will take a long time. Subsequent runs should take 15 minutes or less.

More detailed info on this topic can be found at Make Windows XP Self-Maintaining
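The steps above can be scripted. The sketch below writes the DefragC.bat file exactly as described and builds the corresponding Task Scheduler command line; the `schtasks` syntax shown is an assumption from memory (check `schtasks /?` on your own system), and the folder and time are examples.

```python
# Sketch of the defrag-scheduling steps described above. The batch file
# content comes from the post; the schtasks syntax is an assumption and
# should be checked against your own Windows version before use.
import os

def write_defrag_batch(folder):
    """Create DefragC.bat containing the single 'defrag c:' line."""
    path = os.path.join(folder, "DefragC.bat")
    with open(path, "w") as f:
        f.write("defrag c:\n")
    return path

def schedule_command(batch_path, time="03:00"):
    """Return the (assumed) command to schedule the batch file daily."""
    return ('schtasks /create /tn DefragC /tr "%s" /sc daily /st %s'
            % (batch_path, time))
```

Running the returned command once registers the daily task, matching the "schedule this for daily" step above.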

----

Challenging people to provide evidence for claims does not impose an obligation to offer evidence disproving the claim.
_____________________
Usagi Musashi
UM ™®
Join date: 24 Oct 2004
Posts: 6,083
12-26-2005 04:11
I recently upgraded from 1GB to 3GB on an SLI motherboard. For some reason 1GB was not even close to being enough. Now I'm running better and faster, with a 3200+ 64-bit CPU and SLI motherboard.
As far as I can tell, running 1GB of memory was OK, but for some reason when I was loading textures it was not doing a good enough job. Now with 3GB I load faster, with no gray people.
_____________________
Never quote people who have no idea what they're referring to. It gives them a false feeling of a need for attention...
Ron Overdrive
Registered User
Join date: 10 Jul 2005
Posts: 1,002
12-26-2005 06:19
From: Usagi Musashi
I recently upgraded from 1GB to 3GB on an SLI motherboard. For some reason 1GB was not even close to being enough. Now I'm running better and faster, with a 3200+ 64-bit CPU and SLI motherboard.
As far as I can tell, running 1GB of memory was OK, but for some reason when I was loading textures it was not doing a good enough job. Now with 3GB I load faster, with no gray people.


We're talking about the cache, not the RAM, Usagi. ;) RAM will always increase speed. Later this week I'm upgrading my RAM (hopefully to DDR400), my power supply, my video card (to a GeForce 6600GT/6800GT from a GeForceFX 5500), and a new heatsink/fan combo so I can overclock my system. I'm expecting a night-and-day difference when I'm done.
Alan Jay
IRL: Alan Jenney
Join date: 10 Jul 2006
Posts: 26
Network VS Cache
08-10-2006 09:21
Well, I did some testing with full and empty caches of the available sizes.

This was on a very average PC with a very average graphics card, 1GB memory, a reliable ADSL link at a quieter time, using the 1Mbps maximum bandwidth setting in the client. Ping times were around 200ms in the UK.

With the cache empty (whatever size I chose), it took around 60 seconds to download the data directly in front of me afresh.

The other results are as follows:

50MB: 25-30 seconds
200MB: 12-15 seconds
500MB: 12-15 seconds
1GB: 10-20 seconds

However, it always took 60 seconds in total to complete fetching data from disk, network server, rezzing and rendering!

THEREFORE, I CONCLUDE that the cache doesn't affect the speed of completing the view, but it can reduce the amount of data coming through your ISP.

I'm going to do a couple of tests to see how the available bandwidth affects the performance of the cache. More later.
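Alan's conclusion can be summarized as a two-term model: completion time is pinned at a ~60-second pipeline floor unless the network fetch itself takes longer, so the cache mainly saves bytes, not seconds. The model below is a toy restatement of his numbers, not a measurement.

```python
# Toy model of the results above: total scene completion appears pinned at
# ~60 seconds, so the cache mostly reduces network traffic rather than time,
# unless bandwidth is low enough that the fetch itself exceeds the floor.
# All figures are illustrative, not measurements.

PIPELINE_SECONDS = 60.0  # observed floor: fetch + rez + render

def completion_time(scene_mb, cache_hit_ratio, bandwidth_mbps):
    """Seconds to fully render a scene, given how much must cross the net."""
    network_mb = scene_mb * (1.0 - cache_hit_ratio)
    network_seconds = network_mb * 8.0 / bandwidth_mbps  # MB -> Mb
    return max(PIPELINE_SECONDS, network_seconds)
```

This matches both of Alan's findings: at 1Mbps a warm cache can't beat the 60-second floor, while at lower bandwidths (his follow-up post) the network term dominates and a larger cache genuinely shortens rendering.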
_____________________
SL: "Alan Jay"
IRL: "Alan Jenney"
Zoe Llewelyn
Asylum Inmate
Join date: 15 Jun 2004
Posts: 502
08-10-2006 09:30
From: SuezanneC Baskerville
Can anyone produce any numbers to support claims that 1 gig is a rational maximum cache size?

I am not looking for the usual unsupported claims, platitudes, rules of thumb, aphorisms, old wives tales, broad generalities, gut feelings, etc. that are usually offered in answer to that question.

Has there ever been a test running Second Life in its present form using larger cache sizes on modern user equipment in real life internet connections?


Offer me guidelines to follow that will provide you the numbers you wish and I will be happy to test this.

I have been using a 1 gig cache for some time. I am happy to take whatever measurements you wish at this setting, and then again at a 500MB setting, if you detail the steps you wish me to follow and report.
_____________________
Alan Jay
IRL: Alan Jenney
Join date: 10 Jul 2006
Posts: 26
Effect resulting from bandwidth
08-10-2006 09:42
Well, I did some more tests over a number of different restricted bandwidths.

I'm not going to publish all the data!

It seems that if your bandwidth is significantly lower than the 1Mbps maximum permitted by the client, it will take longer than 60 seconds to fetch all the objects and textures over the network. This means it would be quicker to fetch from the local cache and render the entire scene in 60 seconds.

FROM THIS I CONCLUDE that if you have consistent bandwidth lower than 1Mbps, then using a larger cache (200MB, 500MB) can reduce the rendering time. However, just as with the first tests I did, there seems to be a cut-off of around 60 seconds for a relatively simple, unchanging scene, even if you've been there very recently.
_____________________