llHTTPRequest...what are the possibilities?
Solar Angel
Madam Codealot
Join date: 10 Apr 2005
Posts: 58
04-18-2006 18:13
Definitely on board with the "exponential delay" idea... cancelling requests is prone to cause some serious issues for perfectly legitimate items that simply get a lot of traffic.
Harris Hare
Second Life Resident
Join date: 5 Nov 2004
Posts: 301
04-25-2006 12:34
I have an idea:
What if HTTP requests from non-attached, in-world objects continue to originate from the simulator and are subject to throttling, while scripts in attachments (which require an active agent) use the SL viewer to originate the request? That way, attachment scripts need not be subject to throttling, since their requests originate only from that particular user's computer and not from the simulator.
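Sketched as a routing rule, the split would look something like this (purely illustrative; no such mechanism exists in LSL or the SL servers):

```python
def http_origin(is_attachment, agent_online):
    """Decide where an HTTP request should originate under the proposal
    above. Both parameter names are hypothetical."""
    if is_attachment and agent_online:
        return "viewer"     # request leaves the user's own machine, so no sim throttle
    return "simulator"      # in-world object: the sim originates and throttles it
```

An attachment whose owner logs off would fall back to the simulator path (or simply stop working), which is one detail the proposal would have to pin down.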
Argent Stonecutter
Emergency Mustelid
Join date: 20 Sep 2005
Posts: 20,263
04-25-2006 13:32
As I suggested in the main thread for this feature, I think the best solution would be to replace the 'blocking' with an exponential delay. Basically, if you play nice then the delay will be nothing at all, or negligible (a small fraction of a second), but if your script is churning out requests, then the delay after each request grows rapidly. If it's still churning them out, the delay keeps climbing until the script(s) are FORCED to make no more than the throttled number of requests. Alternatively, the script can get a block, then do its own exponential backoff and notify the user, who can take off the Hand Of Spam.
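A minimal sketch of how such a sim-side exponential delay might behave (the class, its parameters, and the decay rule are all my own invention, not anything in LSL or the simulator):

```python
class ExponentialThrottle:
    """Per-script delay that doubles while requests arrive too fast and
    decays back to zero once the script plays nice. Illustrative only."""

    def __init__(self, allowed_interval=1.0, base_penalty=0.1, max_delay=60.0):
        self.allowed_interval = allowed_interval  # "playing nice" spacing, in seconds
        self.base_penalty = base_penalty          # first penalty step
        self.max_delay = max_delay                # hard cap on the delay
        self.delay = 0.0
        self.last_request = None

    def next_delay(self, now):
        """Return how long this request must wait, then update state."""
        if self.last_request is not None and (now - self.last_request) < self.allowed_interval:
            # Too fast: double the penalty (exponential growth).
            self.delay = min(max(self.delay * 2, self.base_penalty), self.max_delay)
        else:
            # Playing nice: decay the penalty quickly.
            self.delay /= 4
            if self.delay < self.base_penalty:
                self.delay = 0.0
        self.last_request = now
        return self.delay
```

A script firing every 0.1 s would see its delay double on each call (0.1, 0.2, 0.4, 0.8, ...), while a well-behaved script never waits at all.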
Haravikk Mistral
Registered User
Join date: 8 Oct 2005
Posts: 2,482
04-25-2006 14:29
Just a thought, but here's another idea besides strict throttling and exponential delays for HTTP requests. If we're concerned about illegitimate uses (i.e. DoS attacks), would it make sense to have a script register a web-site before it can send requests to it?
The idea is fairly straight-forward: if you want to access www.haravikk.com, then you must first call something like llHTTPRegister("www.haravikk.com/sl.php").

What this will do is query that URL and wait for a valid response. If the response is valid, then you'll get a reply event declaring that your script has registered successfully to the domain you queried (note: the domain, not just the exact link specified). It would have a fairly hefty, fixed delay, so it would need to be used when starting your script, but the idea is that any repeated calls you're going to make are going to be to the same domain or set of domains anyway. If you make too many failed register requests, then the script will fail.

The only real disadvantage is web-browsing; however, for things like that you should be able to have scripts on your own server doing the call to Google or whatever and outputting the response, so attempting to DoS attack a site would just bring yours down instead.

Dunno, it probably has flaws, and I'm tired, but I figured I'd throw it out in case it's a more viable alternative: it could let us have less restrictive throttling if something exponential is harder to implement.
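The register-before-request rule could be sketched like this (llHTTPRegister is the hypothetical function proposed above; the probe callback, failure limit, and all other details here are my own guesses):

```python
from urllib.parse import urlparse

def _domain(url):
    """Extract the host from a URL, tolerating scheme-less forms
    like 'www.haravikk.com/sl.php'."""
    return urlparse(url).netloc or urlparse("//" + url).netloc

class RegistrationGate:
    """Sketch of the proposed scheme: a script may only issue HTTP
    requests to domains it has successfully registered."""

    MAX_FAILURES = 3  # arbitrary; "too many failed register requests" kills the script

    def __init__(self, probe):
        self.probe = probe        # callable: url -> True if the site answered validly
        self.registered = set()   # domains this script may talk to
        self.failures = 0
        self.dead = False

    def register(self, url):
        if self.dead:
            return False
        if self.probe(url):
            # Registration covers the whole domain, not just the exact link.
            self.registered.add(_domain(url))
            return True
        self.failures += 1
        if self.failures >= self.MAX_FAILURES:
            self.dead = True      # the script "fails" as described above
        return False

    def may_request(self, url):
        return not self.dead and _domain(url) in self.registered
```

The per-domain grant is the key design point: one hefty registration delay up front buys unrestricted (or lightly throttled) access to every URL on that domain afterwards.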