Welcome to the Second Life Forums Archive


Only 8,300 people logged in right now and the asset server is almost worthless.

Jesse Malthus
OMG HAX!
Join date: 21 Apr 2006
Posts: 649
08-04-2006 10:29
From: paulie Femto
SQUID can be configured to keep hot items in RAM. Let's hope the SL SQUIDS are doing that.

And on the NAS comment, wouldn't SL's implementation be more like a SAN?

Why are assets not stored in a database? Are they just stored as files? Wouldn't manipulation of assets be easier if they were stored in a database? Or is the asset collection not suited to database storage? The Isilon webpage says that Isilon is adept at storing "unstructured" data. Is the asset collection considered to be "unstructured"?

I love this stuff. I'm such a geek. :)

That's an interesting question. I think assets are stored in files because 1) low overhead for serving them and 2) caching is a heckovalot easier to do with files than database rows.
As for it being unstructured, I suppose in a way it is. The only "structure" would be that it's mapped to a key.
((/me is also a geek, and likes this stuff too))
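To make the key-to-file idea concrete, here's a minimal sketch in Python (the paths and fan-out scheme are invented for illustration; nothing here reflects LL's actual code):

```python
# Sketch of a key/value asset store backed by plain files: the only
# "structure" is that each asset is reachable by its key.
import os
import tempfile

# Stand-in for the real storage root (hypothetical).
ASSET_ROOT = tempfile.mkdtemp(prefix="assets-")

def asset_path(key: str) -> str:
    # Fan out into subdirectories by key prefix so no single
    # directory ends up holding millions of files.
    return os.path.join(ASSET_ROOT, key[:2], key)

def store(key: str, data: bytes) -> None:
    path = asset_path(key)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "wb") as f:
        f.write(data)

def fetch(key: str) -> bytes:
    # Serving is just an open() and a read(): very low overhead.
    with open(asset_path(key), "rb") as f:
        return f.read()
```

No query parser, no transactions, no schema: the filesystem does all the work.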
_____________________
Ruby loves me like Japanese Jesus.
Did Jesus ever go back and clean up those footprints he left? Beach Authority had to spend precious manpower.
Japanese Jesus, where are you?
Pragmatic!
Dale Glass
Evil Scripter
Join date: 12 Feb 2006
Posts: 252
08-04-2006 10:49
From: paulie Femto

Why are assets not stored in a database? Are they just stored as files? Wouldn't manipulation of assets be easier if they were stored in a database? Or is the asset collection not suited to database storage? The Isilon webpage says that Isilon is adept at storing "unstructured" data. Is the asset collection considered to be "unstructured"?

I love this stuff. I'm such a geek. :)


Well, databases are more within my area of knowledge, so I can answer that :-) Basically, because it makes no sense to store it in a database.

The major advantage of a database is storing relationships between pieces of data, and querying it in different ways. A dump of key/value stuff gets no advantage from being in a DB, as there are no relationships to store, and the database itself can't do anything useful with that sort of data. It would even have a performance impact. You have to submit a query, which goes through a query parser, etc, all of which add overhead but give you no benefit.

The only good reason to use a BLOB IMO is for storing data that does belong in the DB, but for whatever reason can't be adequately stored in a better way. For example, if you're logging network packets, you could parse the headers (flags, length, port, etc), and store them in columns, making it so that the database is good for something (you now can easily search packets by port, etc), and storing the payload in a BLOB along with it, as thousands of tiny files on disk are a big problem for some filesystems.
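That packet-logging case could be sketched like this with SQLite (table layout invented for illustration): parsed header fields go into real, queryable columns, while the opaque payload lives in a BLOB alongside them.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE packets (
        id       INTEGER PRIMARY KEY,
        src_port INTEGER,
        dst_port INTEGER,
        length   INTEGER,
        payload  BLOB
    )
""")

def log_packet(src_port: int, dst_port: int, payload: bytes) -> None:
    conn.execute(
        "INSERT INTO packets (src_port, dst_port, length, payload)"
        " VALUES (?, ?, ?, ?)",
        (src_port, dst_port, len(payload), payload),
    )

log_packet(54321, 80, b"GET / HTTP/1.0\r\n\r\n")
log_packet(54322, 443, b"\x16\x03\x01")

# The columns are what make the DB worthwhile: search by port, length, etc.
rows = conn.execute("SELECT payload FROM packets WHERE dst_port = 80").fetchall()
```

Here the database earns its keep because of the columns; a pure key-to-blob dump would get none of that benefit.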
Einsman Schlegel
Disenchanted Fool
Join date: 11 Jun 2003
Posts: 1,461
08-04-2006 10:56
Uh wait. So we do or don't run on MySQL?
Dale Glass
Evil Scripter
Join date: 12 Feb 2006
Posts: 252
08-04-2006 11:10
From: paulie Femto
Sims maintain a local cache of their textures and objects. This cache is emptied upon grid restart. Sims have to "recache" after grid restart. I've asked a Linden about this caching and been told that sim resources aren't recached until someone visits the sim and looks at things!


Here's the thing I don't get: Why do they do that?

A sim shouldn't really need all that data. The sim itself isn't rendering graphics, so it doesn't need textures. So that leads me to think things go this way: Asset server => Sim (cache) => Client, with the sim being the thing that delivers the data to the client.

Now, the reason why I find this strange: some textures are going to be very common in SL, so multiple sims are going to need them. Under the cache-in-the-sim model, each time a sim needs a texture it doesn't have cached, it hits the asset server and asks for it. But since each sim only sees its own local cache, if 5 sims want that texture, that's going to be 5 hits against the asset server.

It would seem that it makes much more sense to put a large cache in front of the asset server, so that those 5 requests result in only 1 lookup on the asset server, and 4 cached replies.

It would seem to me that rather than going through the sim to get an asset it'd be more logical to expose the asset server separately, so that requests for an asset go to say, http://assets.secondlife.com/<key>, and DNS is set up in a round-robin fashion to distribute the load. I think ActiveWorlds did something like that, but it's been ages since I tried it, so I could be wrong.
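The 5-requests-1-lookup point can be sketched in a few lines (the functions are hypothetical stand-ins, not SL's actual interfaces):

```python
origin_hits = 0  # counts real lookups against the asset server

def asset_server_fetch(key: str) -> str:
    # Stand-in for the expensive origin lookup.
    global origin_hits
    origin_hits += 1
    return f"<data for {key}>"

front_cache: dict = {}

def cached_fetch(key: str) -> str:
    # Shared cache sitting in front of the asset server.
    if key not in front_cache:
        front_cache[key] = asset_server_fetch(key)
    return front_cache[key]

# Five sims all request the same popular texture...
for _ in range(5):
    cached_fetch("popular-texture-key")

# ...but the origin is hit only once; the other 4 are cached replies.
```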
Jesse Malthus
OMG HAX!
Join date: 21 Apr 2006
Posts: 649
08-04-2006 11:20
From: Dale Glass
Here's the thing I don't get: Why do they do that?

A sim shouldn't really need all that data. The sim itself isn't rendering graphics, so it doesn't need textures. So that leads me to think things go this way: Asset server => Sim (cache) => Client, with the sim being the thing that delivers the data to the client.

Now, the reason why I find this strange: some textures are going to be very common in SL, so multiple sims are going to need them. Under the cache-in-the-sim model, each time a sim needs a texture it doesn't have cached, it hits the asset server and asks for it. But since each sim only sees its own local cache, if 5 sims want that texture, that's going to be 5 hits against the asset server.

It would seem that it makes much more sense to put a large cache in front of the asset server, so that those 5 requests result in only 1 lookup on the asset server, and 4 cached replies.

It would seem to me that rather than going through the sim to get an asset it'd be more logical to expose the asset server separately, so that requests for an asset go to say, http://assets.secondlife.com/<key>, and DNS is set up in a round-robin fashion to distribute the load. I think ActiveWorlds did something like that, but it's been ages since I tried it, so I could be wrong.

I do believe that's how it's set up in terms of layout. IIRC, squid can perform as a cache cluster, and that could be what's being done.
_____________________
Ruby loves me like Japanese Jesus.
Did Jesus ever go back and clean up those footprints he left? Beach Authority had to spend precious manpower.
Japanese Jesus, where are you?
Pragmatic!
Dale Glass
Evil Scripter
Join date: 12 Feb 2006
Posts: 252
08-04-2006 11:35
From: Jesse Malthus
I do believe that's how it's set up in terms of layout. IIRC, squid can perform as a cache cluster, and that could be what's being done.


Ahh, thanks, that makes a lot more sense.
Troy Vogel
Marginal Prof. of ZOMG!
Join date: 16 Aug 2004
Posts: 478
08-04-2006 11:47
I wonder, instead of having everything centralized couldn't we treat the assets more like a torrent? So every open and running client of SL can function as a server for assets such as textures?

Or how about writing some logic that permasaves some textures on the user's machine -- as in, build in a counter that, after a certain number of reloads of the same texture, permanently saves it on the user's machine until the user purges it? This would be useful for caching people's own build textures. I for one always arrive at my gallery with my textures nice and preloaded, but if I fly around my sim and come back, I have to wait for the textures of my columns to reload. Well, that's insane. They were loaded when I signed on, so why did they even go away when I flew away -- especially considering that I never left the sim?

Also, why not publish encrypted archives of the MOST used textures -- such as those that arrive with the newbie default library? These would be optional downloads for people who want a faster-running SL -- if they're willing to sacrifice disk space for faster performance, we should let them. I know it is completely against the whole original model, but the original model is impeding the sustainable growth of the grid and the builds on it. A new, hybrid model is necessary.

It's not about principles at this point in time, it's about what works... I think the current model is just about at the end of its useful tenure. Building in SL has become consistently more painful from 2004 till today. Right now, after the last few updates, I pull my hair out any time I try to build anything -- in my completely abandoned, empty sim with no serious scripts running. Well, that just makes for a very poor experience....

Troy
_____________________
Dale Glass
Evil Scripter
Join date: 12 Feb 2006
Posts: 252
08-04-2006 11:55
From: Troy Vogel
I wonder, instead of having everything centralized couldn't we treat the assets more like a torrent? So every open and running client of SL can function as a server for assets such as textures?


Ugh, no. You'd end up fetching textures from some poor guy with a modem. If you think now it's bad, imagine what it could be in that setting.

Besides, major griefing potential! Just write a script that replaces all cached textures with something nasty.


From: Troy Vogel

Or how about writing some logic that permasaves some textures on the user's machine -- as in, build in a counter that, after a certain number of reloads of the same texture, permanently saves it on the user's machine until the user purges it? This would be useful for caching people's own build textures. I for one always arrive at my gallery with my textures nice and preloaded, but if I fly around my sim and come back, I have to wait for the textures of my columns to reload. Well, that's insane. They were loaded when I signed on, so why did they even go away when I flew away -- especially considering that I never left the sim?


In theory, there should be a local cache already, but it does seem it could use an improvement.

From: Troy Vogel

Also, why not publish encrypted archives of the MOST used textures -- such as those that arrive with the newbie default library? These would be optional downloads for people who want a faster-running SL -- if they're willing to sacrifice disk space for faster performance, we should let them. I know it is completely against the whole original model, but the original model is impeding the sustainable growth of the grid and the builds on it. A new, hybrid model is necessary.


Encryption would be completely useless here. It's the same reason why DRM simply can't work: You're trying to protect content from that content's viewer, and that implies it must be shown decrypted at some point. No way around it.

But why offer downloads at all? Just allow having a really large cache on disk. Make the maximum size configurable without limit. Store 100GB of textures on your disk if you have space for it. Each texture will get downloaded as you need it, then stay there.
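A sketch of that kind of cache: a user-configurable byte cap, with least-recently-used entries evicted once the cap is exceeded (an in-memory stand-in for illustration; a real one would keep files on disk):

```python
from collections import OrderedDict

class DiskCacheSketch:
    def __init__(self, max_bytes: int):
        self.max_bytes = max_bytes    # user-configurable, no built-in ceiling
        self.used = 0
        self.entries = OrderedDict()  # key -> bytes, least recently used first

    def get(self, key):
        if key in self.entries:
            self.entries.move_to_end(key)  # mark as recently used
            return self.entries[key]
        return None  # miss: download it once, then it stays

    def put(self, key, data: bytes):
        self.entries[key] = data
        self.entries.move_to_end(key)
        self.used += len(data)
        while self.used > self.max_bytes:  # evict LRU entries over the cap
            _, old = self.entries.popitem(last=False)
            self.used -= len(old)

cache = DiskCacheSketch(max_bytes=100)
cache.put("a", b"x" * 60)
cache.put("b", b"y" * 60)  # total would be 120 bytes, so "a" gets evicted
```

Set the cap to 100GB and rarely-seen textures age out on their own; frequently-viewed ones never leave.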
Jesse Malthus
OMG HAX!
Join date: 21 Apr 2006
Posts: 649
08-04-2006 12:06
From: Dale Glass
Ugh, no. You'd end up fetching textures from some poor guy with a modem. If you think now it's bad, imagine what it could be in that setting.

Besides, major griefing potential! Just write a script that replaces all cached textures with something nasty.

Check hashes. That's the way torrents prevent bad blocks. And make it an option, possibly with an incentive like "get a stipend for X bandwidth donated".
I did think that all the Library stuff was part of the game's static data. I could be mistaken though.
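The hash check could look roughly like this (the table of trusted hashes is hypothetical; in BitTorrent the equivalent ships in the .torrent file from a trusted source):

```python
import hashlib

# Known-good hashes would come from a server LL controls, not from peers.
trusted = {"texture-123": hashlib.sha1(b"real texture bytes").hexdigest()}

def verify(key: str, data: bytes) -> bool:
    # Recompute the hash of what the peer sent and compare.
    return hashlib.sha1(data).hexdigest() == trusted[key]

ok = verify("texture-123", b"real texture bytes")   # honest peer
bad = verify("texture-123", b"something nasty")     # griefer's replacement
```

A tampered texture fails the comparison and is simply discarded before it ever gets rendered.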
_____________________
Ruby loves me like Japanese Jesus.
Did Jesus ever go back and clean up those footprints he left? Beach Authority had to spend precious manpower.
Japanese Jesus, where are you?
Pragmatic!
Dale Glass
Evil Scripter
Join date: 12 Feb 2006
Posts: 252
08-04-2006 12:23
From: Jesse Malthus
Check hashes. That's the way torrents prevent bad blocks. And make it an option, possibly with an incentive like "get a stipend for X bandwidth donated".
I did think that all the Library stuff was part of the game's static data. I could be mistaken though.


Sure, you can use signatures. But still, griefing potential. It's pretty easy to use traffic shaping to deliver the data at 0.1KB/s. And it'll be a lot harder to diagnose any problem. Currently, things loading slowly == asset server's fault, and the asset server is right under LL's control, where I'm sure they can make all sorts of detailed observations. With this system it will take a lot longer to determine what's going wrong and how to fix it.

It's probably a doable thing though; I'm just not all that sure it's a very good idea. It also involves coding something new and throwing out the stuff they already have, which probably could be fixed with less effort.
Troy Vogel
Marginal Prof. of ZOMG!
Join date: 16 Aug 2004
Posts: 478
08-04-2006 12:50
From: Dale Glass
Ugh, no. You'd end up fetching textures from some poor guy with a modem. If you think now it's bad, imagine what it could be in that setting.

Besides, major griefing potential! Just write a script that replaces all cached textures with something nasty.


A: Who really uses a modem to access the SL grid on a regular basis? Please raise your hands and let me send you a care package saying "you pooor thing... I feel your payne"

B: I am sure we can write something that would prevent the replacement, griefing thing.

By your account, there would be no networks of torrent users on the internet, but there are... how are torrents any different from SL data?

You make some good points but ultimately I feel that you just took my post and simply refuted every little suggestion I made.

Now take my ideas and, if they don't work according to my original description, make them work. Don't tell me how they don't work, tell me how they will work. :-)

That's cooperative constructive thinking. :-)

Thanks for the feedback,

Troy
_____________________
Troy Vogel
Marginal Prof. of ZOMG!
Join date: 16 Aug 2004
Posts: 478
08-04-2006 12:54
From: Jesse Malthus
Check hashes. That's the way torrents prevent bad blocks. And make it an option, possibly with an incentive like "get a stipend for X bandwidth donated".
I did think that all the Library stuff was part of the game's static data. I could be mistaken though.


I could be wrong, but I swear I used one of the stone wall textures from the default library for my gallery columns, and it reloads every time I fly around and come back. It's a HUGE texture, 512x512 at least, and takes up to 1-2 minutes to load -- practically the last texture to load at my place...

So if it is truly one of the library defaults (I will check this tonight), then definitely it is not built in, it is still streaming down from the SL servers.

Troy
_____________________
Dale Glass
Evil Scripter
Join date: 12 Feb 2006
Posts: 252
08-04-2006 13:26
From: Troy Vogel
A: Who really uses a modem to access the SL grid on a regular basis? Please raise your hands and let me send you a care package saying "you pooor thing... I feel your payne"


Modem, lack of bandwidth, same thing. Take a 256/128K ADSL connection and run SL and BitTorrent at the same time. 128K upload is already painfully slow for somebody used to having a 1Mbps downstream; add BitTorrent consuming most of the little that is available, and it'll take ages to download anything.

Even without BT competing for resources, the above setting is enough that SL + music streaming is pretty tight. People aren't going to like not being able to listen to music because they're serving textures to somebody else.

From: Troy Vogel

B: I am sure we can write something that would prevent the replacement, griefing thing.

According to your account, there would be no networks of torrent users on the internet but there are... how are torrents any different than SL data?

You make some good points but ultimately I feel that you just took my post and simply refuted every little suggestion I made.


Of course, you can sign images. A system like this is possible but this really isn't a simple thing to do. Consider that you have to know who to connect to (there has to be a database of who has what). It needs to deal correctly with people with sluggish connections, take care not to overload them so much they're lagged or can't listen to music, make sure the data can't be corrupted, make sure people intentionally throttling their connection can't break things too badly for other people.

Also, while signing images will make it impossible for people to replace textures with something nasty, for the signature to be checked, the file has to be downloaded first. Offering 1024x1024 corrupted textures would be an easy way of wasting other people's bandwidth. You could make the client report people who serve corrupted files to LL so that they get added to a ban list, but then somebody will make a client that will report random people.

BitTorrent is a good example here. Yeah, it's a wonderful piece of technology, but there are plenty of torrents out there with download speeds of 2KB/s due to the lack of seeds, people throttling their upload bandwidth, etc. BT also has an easier time here: so what if it gets stuck for half an hour when the download takes 3 days anyway? But in SL that'd be nasty.



From: Troy Vogel

Now take my ideas and if they are not working according to my original description, make them work. Don't tell me how they dont work, tell me how they will work. :-)

That's cooperative constructive thinking. :-)

Thanks for the feedback,

Troy


The main problem I see with this idea is that it takes something critical that's fully under LL's control and distributes it around untrusted people, making it vulnerable to various problems that didn't exist before. It's not that it can't be done, but it's not an easy matter, and definitely not just a couple hours of coding. It will require thinking a lot about what might go wrong, and how to work around it.

Also, consider the current state: 370000 users, 6500 online. Making sure all the required data is always present is going to require a lot of redundancy. What happens when the grid goes down, everybody sees grey until enough people log in? What if all of the 20 people that have a texture happen to be offline? I bet that if this gets implemented, something will go wrong sooner or later, and we'll have people whining on the forums just like now. The thing here is whether this system would make them whine less, and I think it's technically complicated.

IMO, it can be solved more easily by creating asset servers in different countries for instance. Make people use the nearest one, distributing the load.
Troy Vogel
Marginal Prof. of ZOMG!
Join date: 16 Aug 2004
Posts: 478
08-04-2006 20:18
hmmmm I'm thinking.... :-) hmmmmm good point good point. Let me get back to you. :-)
_____________________
Cyclopean Sprocket
Compulsive Builder
Join date: 16 Feb 2006
Posts: 17
08-09-2006 16:24
Oooo, lots of miscomprehension going on here. Let me illuminate.

1) Each block of a bittorrent feed has a cryptographic hash associated with it. The only way to spoof a bad block would be to get a server in between the user and the server that's serving out these hashes and replace the received hash with your own. This involves targeting specific pieces of hardware in a network topology and hacking into them, which is WAY beyond the resources of a typical griefer, and way too much effort for mere griefing anyway.

2) Upload and download bandwidth are separate. When you're receiving an audio stream and images from Second Life, your upload bandwidth is mostly idle, which is to say, going to waste. Using it doesn't interfere with your download bandwidth. Bittorrent was specifically designed to take advantage of this by making use of the unused upload bandwidth.

3) The REAL problem with instituting bittorrent is that most people these days are behind a NAT firewall. Does your IP address start with 192.168? If so, then you're actually sharing an IP address with many, many other people. This means that you can't tell your machine to act as a server because your advertised server port wouldn't have a corresponding internet-facing server port for the game to connect to. Without that, bittorrent doesn't work.

4) Caching, caching, caching. One way that the Lindens could solve this would be to have a separate asset server on each sim server. If that asset server didn't have what the client needed it would request it from the main server, pass it on and hold onto it. Items would time out and be discarded if not looked at in any seven to ten day period. This is not unreasonable.

I understand from the previous discussion that the current caching discards its entire cache every time the server is restarted. OMG, that's wasteful. That would be as bad as if your browser threw its cache out at midnight every night. Worse, even, because a sim has a finite number of objects it can hold at one time, which means its caching would be much more efficient, since it would be serving the same stuff over and over again.

Client side caches might be enlarged, but the big benefit would be in ranking what is stored better. Right now the client side cache is either running on a first-in-first-out basis or it's waaaaay too small. If you were to timestamp the last ten or so accesses to the cache (moving from disk storage to memory storage), then apply a halflife to those accesses you could readily determine the most popularly accessed objects and textures and hold on to them instead of the ones from the landscape you just flew over. An increased cache would help, too, but more intelligence in the use of the cache would be of more value. This would cut the need to hit even the sim's asset server down to a quarter of what it currently is.
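The half-life ranking could be sketched like this (the half-life constant is an arbitrary illustration, not a tuned value):

```python
HALF_LIFE = 3600.0  # seconds; arbitrary tuning constant for illustration

def score(access_times, now: float) -> float:
    # Each past access contributes 0.5 ** (age / half-life):
    # a hit just now counts ~1.0, one from hours ago fades toward 0.
    return sum(0.5 ** ((now - t) / HALF_LIFE) for t in access_times)

now = 10_000.0
hot = [now - 10, now - 60, now - 120]   # your own build, viewed constantly
cold = [now - 7200, now - 9000]         # landscape you flew over hours ago

# Evict the lowest-scoring entries first, and the frequently viewed
# textures stay cached while the flyover scenery ages out.
```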

5) From my perspective, the biggest hit caused by the asset server's issues isn't to the quality of everyday hanging out, it's to the two greatest attractions SL has: (1) sightseeing and (2) shopping. If it takes 15 minutes of staring at a wall before the score or so images rez for you, it becomes horribly frustrating to find that outfit you're looking for. Similarly, I used to enjoy flying over the landscape to see what other people are building. Nowadays that's become an exercise in waiting for the building I just ran into to become visible so I can figure out how to get around it.

FYI, I am a data storage, serving, and manipulation specialist in real life, so I'd be one to know.

Cyclopean Sprocket
Howie Lament
Registered User
Join date: 22 Apr 2004
Posts: 30
08-09-2006 23:44
I'd suggest that the sims should run a background check for textures in use and cache those assets automatically, instead of waiting for the first user to request them. I'd also start looking into how to prune dead assets from the asset server.
Is LL able to tell how long it's been since an asset was last accessed? If an asset hasn't been used by anyone for say six months it should be pruned from the main asset server.
If you keep all the old assets forever you'll just waste a ton of disk space.

I'd also try keeping the sim servers in different datacenters if they aren't already. I'm guessing it's all one big datacenter in the USA; it would be nice to have one in Europe and Asia as well, to spread out the network traffic a bit.

Finally there should be changes to how the local client caching works, like increasing the lifespan of locally cached textures if they're in constant use. Honestly right now it feels like the client just throws away the entire cache each time you log off.
Dale Glass
Evil Scripter
Join date: 12 Feb 2006
Posts: 252
08-10-2006 02:13
From: Cyclopean Sprocket
Oooo, lots of miscomprehension going on here. Let me illuminate.

1) Each block of a bittorrent feed has a cryptographic hash associated with it. The only way to spoof a bad block would be to get a server in between the user and the server that's serving out these hashes and replace the received hash with your own. This involves targeting specific pieces of hardware in a network topology and hacking into them, which is WAY beyond the resources of a typical griefer, and way too much effort for mere griefing anyway.


You don't need to spoof to annoy. Somebody asks you: Give me block #5 of file X. You give them random data. They still have to receive the block to verify the hash. Granted, workarounds against this are possible, so this part isn't important.

From: Cyclopean Sprocket

2) Upload and download bandwidth are separate. When you're receiving an audio stream and images from Second Life, your upload bandwidth is mostly idle, which is to say, going to waste. Using it doesn't interfere with your download bandwidth. Bittorrent was specifically designed to take advantage of this by making use of the unused upload bandwidth.


Ha! Not so, Mr. specialist :-) My speciality is networking.

You're forgetting one very important thing here: ADSL is Asymmetric. Meaning, many people have much less upload than download bandwidth.

Now, if you look at RFC 793 (the TCP protocol) you'll notice one very important thing: packets must be ACKed. For every packet you receive, you must send an ACK confirming you got it. This means that in practice, downloading drastically reduces your upload bandwidth and vice versa.

For instance, a quick estimate: at an MTU of 1500 bytes, each packet carries 1460 bytes of data (1500 minus the 20-byte IP and 20-byte TCP headers). Each of those must be answered with an ACK that's at least 40 bytes on the wire (IP plus TCP headers again). So if you have bandwidth of 1Mbps/300K, like I do, just ACKing all that is going to take roughly 10% of your upload bandwidth.

Not bad, you say? Well, now you start uploading textures to somebody else. That means you're now uploading 1500 byte large packets, which make your ACKs wait. This really dramatically reduces your effective upload bandwidth.

All this has another effect: it really, really screws with UDP. UDP works well enough on a network with spare capacity, but if your UDP packets get into the above mess, you're going to find you have trouble resolving DNS: the packets get massively delayed, or dropped outright when they don't fit in the queue.

You can work around that by prioritizing ACKs, or by reducing the MTU so that the packets are smaller and don't clog the upload so much (but that increases the number of ACKs you must send), or by throttling your upload speed. This is why many BT clients these days allow you to throttle your upload speed, so that you can ACK packets fast enough.
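The back-of-the-envelope above, as a quick calculation (assuming a 40-byte minimum ACK and the 1Mbps/300Kbps figures quoted; real stacks also delay and coalesce ACKs, which lowers the number further):

```python
MTU = 1500            # bytes per data packet on the wire
ACK_SIZE = 40         # minimal IP (20) + TCP (20) headers, no options
DOWN_BPS = 1_000_000  # 1 Mbps downstream
UP_BPS = 300_000      # 300 Kbps upstream

packets_per_sec = DOWN_BPS / (MTU * 8)    # ~83 data packets arriving per second
ack_bps = packets_per_sec * ACK_SIZE * 8  # upstream bits/s spent purely on ACKs
overhead = ack_bps / UP_BPS               # fraction of upload eaten by ACKs

print(f"ACK overhead: {overhead:.0%} of upload")
```

And that's the overhead before the 1500-byte texture uploads start queueing in front of those ACKs.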

Really dealing with this situation is hard. Consider SL: You have time sensitive data that must be delivered and received as fast as possible (movement, etc), you have streaming which is latency insensitive but requires a minimal amount of bandwidth, and you're going to add to that something that will fill whatever you have left for the P2P stuff.

This is very hard to balance! A traffic shaping setup that allows you to run SL with streaming, BT and web browsing all at once should be possible, but it's not going to be simple. If you think it is, try to do it with say, the Linux traffic shaper. A problem here is that you only have good control over your own uploads, downloads are harder to throttle.

From: Cyclopean Sprocket

4) Caching, caching, caching. One way that the Lindens could solve this would be to have a separate asset server on each sim server. If that asset server didn't have what the client needed it would request it from the main server, pass it on and hold onto it. Items would time out and be discarded if not looked at in any seven to ten day period. This is not unreasonable.

They already mentioned the sim has a local cache.

From: Cyclopean Sprocket

Client side caches might be enlarged, but the big benefit would be in ranking what is stored better. Right now the client side cache is either running on a first-in-first-out basis or it's waaaaay too small. If you were to timestamp the last ten or so accesses to the cache (moving from disk storage to memory storage), then apply a halflife to those accesses you could readily determine the most popularly accessed objects and textures and hold on to them instead of the ones from the landscape you just flew over. An increased cache would help, too, but more intelligence in the use of the cache would be of more value. This would cut the need to hit even the sim's asset server down to a quarter of what it currently is.


It seems to me the current cache keeps too little. Or at least it doesn't seem to be using even half of the 512MB I assigned to it. That sounds like something could use fixing.


Now, I'll add something to this: my objection to the P2P-style asset serving isn't that it can't be done, it's that I think it's far from trivial to do, and even harder to do better than the asset server currently does.

Even assuming they could deliver a perfect system (which won't happen), it will still take time (a month or two say). Meanwhile, it'd be a waste to do any serious work on an asset server that's going to be obsoleted soon, so we'd be stuck with the current situation until they release the replacement.

The asset server issue seems to be something that should (I lack data to make a good judgement here though) be quite easily solvable by throwing hardware at it. That's a whole lot easier than designing a good P2P system, and should be a lot faster to deliver as well.
Zonax Delorean
Registered User
Join date: 5 Jun 2004
Posts: 767
08-10-2006 03:07
From: Howie Lament
Is LL able to tell how long it's been since an asset was last accessed? If an asset hasn't been used by anyone for say six months it should be pruned from the main asset server.


I was proposing a similar idea a while ago on the forums.
There should be several levels of asset servers, with different access times.
Level 1: the current one. It should be very fast.
Level 2: assets not accessed for six months go into this bucket. Slower to access, but it would have a way to get back into Level 1 if it gets frequently accessed
Level 3: (maybe) an even slower level

But possibly 2 levels would be enough. And with using 'levels', no data would have to be discarded (or maybe only data not accessed for 2-3 years), so if someone goes around a world trip for 6 months and comes back, his house will still be there.
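A sketch of the two-level idea with promotion on access (the six-month threshold is from the thread; all structure and names are invented for illustration):

```python
import time

SIX_MONTHS = 180 * 24 * 3600  # seconds

class TieredStore:
    def __init__(self):
        self.l1 = {}  # fast tier: key -> (data, last_access)
        self.l2 = {}  # slow tier

    def get(self, key, now=None):
        if now is None:
            now = time.time()
        if key in self.l1:
            data, _ = self.l1[key]
        elif key in self.l2:
            data, _ = self.l2.pop(key)  # slow path: found in Level 2
        else:
            return None
        self.l1[key] = (data, now)      # (re)promote to Level 1 on access
        return data

    def demote_idle(self, now=None):
        # Periodic sweep: move assets untouched for six months down to L2.
        if now is None:
            now = time.time()
        for key, (data, last) in list(self.l1.items()):
            if now - last > SIX_MONTHS:
                self.l2[key] = self.l1.pop(key)
```

Nothing is ever discarded: after a six-month world trip, the house is just one slower first fetch away, and it moves back into Level 1.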

I think this would help a lot, too! When I'm developing a product in SL, or making an object, I have several work versions that are not needed anymore. However, every version is saved in the asset server. With scripts, this means every time you press 'save' (make a modification), a new asset is created. Most of this could be thrown out (or moved to L2 or L3).

Though an inventory backup solution would also be welcome, so if someone knows he won't come back in a year, there's at least a way to preserve his valuables.
Cyclopean Sprocket
Compulsive Builder
Join date: 16 Feb 2006
Posts: 17
08-17-2006 11:18
From: Dale Glass
You don't need to spoof to annoy. Somebody asks you: Give me block #5 of file X. You give them random data. They still have to receive the block to verify the hash. Granted, workarounds against this are possible, so this part isn't important.


Any BT client worth its salt would then know not to request any further blocks from that IP address. True that it's an annoyance, but it isn't enough to put serious load on the server. It could even let the server know when an IP address is serving bad blocks. Enough reports of bad blocks and the hash server can remove that IP address from the swarm.

Generally speaking, though, I also don't think that this is the answer. Bittorrent is most effective when large numbers of people are trying to access something large at the same time. You'd have negative efficiency trying to serve objects like that because of their small size. It's unlikely that there would be an advantage to serving images that way because, although they're larger than objects, they're not THAT much larger, and there is an extremely small percentage of occasions when multiple people are dl'ing the same image.

You COULD design algorithms to get around these limitations. For instance, a sim could pack all of its contents into a single file and continually broadcast its contents, occasionally adding "update" files as things change, but these wouldn't load stuff in an order that's relevant to the viewer. It would be a complicated system, and the tradeoffs might result in a less functional system, even if it is more efficient.

From: someone
You're forgetting one very important thing here: ADSL is Asymmetric. Meaning, many people have much less upload than download bandwidth.


Ok, you got me on this one. Nonetheless, even by your description there is quite a bit of unused bandwidth that, with proper throttling, can be used to lighten the load on the SL servers with no impact on the user's performance.

From: someone
It seems to me the current cache keeps too little. Or at least it doesn't seem to be using even half of the 512MB I assigned to it. That sounds like something could use fixing.


I think I spend a good five to ten minutes waiting on my high speed DSL for my local environment to load every time I log on. The time doesn't seem any longer when I go somewhere new. I haven't looked under the hood, but I feel like the client discards (or at least times out) the entire cache when I log off.

From: someone
The asset server issue seems to be something that should (I lack data to make a good judgement here though) be quite easily solvable by throwing hardware at it.


As a person who writes these systems for a living, my bias is obviously to disagree with you here. Like you, though, I'm also arguing in a partial vacuum of hard facts. The probability of this method succeeding is inversely proportional to the complexity of the system being improved.
Cyclopean Sprocket
Compulsive Builder
Join date: 16 Feb 2006
Posts: 17
08-17-2006 11:26
From: Zonax Delorean
so if someone goes around a world trip for 6 months and comes back, his house will still be there.


All this reminds me of the Outer Limits episode where some guy wakes up in a future time that is still being built by those who build the individual seconds. He wanders into an alley that's an entirely empty space, and the builders explain to him that nobody is going to be there to see that place during the second that they're currently building.

Backing things off to L2 (and lower) storage would be a good idea provided L1 didn't have to keep track of those items at all until they're used again. In short, you have to decrease the number of items tracked, which means entirely removing them from the L1 system. As long as the client knows to ask the L2 system when the L1 system doesn't find something, this is a perfectly workable solution, and probably even a good idea.
Kristy Cordeaux
Registered User
Join date: 13 May 2006
Posts: 94
08-17-2006 13:02
They don't care if SL collapses into a black hole. That's been obvious for quite a while. They merely introduce more crap, aggravate more people, and move on to getting the next week's crap ready for downloading. I've seen zero effort devoted toward making the SL experience actually more enjoyable or better. Moreover, they're indifferent toward anything the consumers (us folks in SL) have to say about anything.

SL is extraordinarily beautiful tho, provided you don't try to move or do anything other than pivot in place and wait for everything to rezz.
_____________________
eMachines T5010, but modified to: 2.30Ghz AMD Athlon 64x2 Dual Core processors, NVIDIA GeForce 6150 (128 MB). 250GB HD.