Outbound communications: XML-RPC / HTTP

Lex Neva
wears dorky glasses
Join date: 27 Nov 2004
Posts: 1,361
03-30-2006 11:16
The ability to do HTTP Basic Authentication (username/password/realm) somehow would be cool. It might be possible for us to do this ourselves in LSL if enough freedom with request headers is allowed. I know Basic Authentication isn't necessarily all that secure, but combining it with IP based blocks would make it fairly easy to secure a service to only be accessible by a specific script, with reasonable certainty.
Kitten Lulu
Registered User
Join date: 8 Jul 2005
Posts: 114
03-30-2006 12:08
If we can send custom headers, we can do basic authentication.
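
Something like this, say (just a sketch; HTTP_CUSTOM_HEADER here is a made-up name for whatever mechanism ends up letting us add request headers):

CODE
// Sketch only: HTTP_CUSTOM_HEADER is a hypothetical parameter standing in for
// whatever eventually lets us set extra request headers.
// Basic auth is just base64("user:password") in an Authorization header.
key BasicAuthGet(string url, string user, string pass)
{
    string credentials = llStringToBase64(user + ":" + pass);
    return llHTTPRequest(url,
        [HTTP_METHOD, "GET",
         HTTP_CUSTOM_HEADER, "Authorization", "Basic " + credentials],
        "");
}

default
{
    state_entry()
    {
        // placeholder URL and credentials, purely for illustration
        BasicAuthGet("http://example.com/private/status.txt", "myscript", "hunter2");
    }

    http_response(key id, integer status, list headers, string body)
    {
        llOwnerSay((string)status + ": " + body);
    }
}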
_____________________
I've got a blog!

My products are available on SLExchange or in-world in my shops: Kitten Lulu's Emporium and Kitten&Co.
Adam Zaius
Deus
Join date: 9 Jan 2004
Posts: 1,483
03-30-2006 12:43
Thought about this for a while; "solution" to the headers data:

CODE
llSetHTTPHeaderFilter(list filter)


Let us define the headers we want to see.
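
For example (hypothetical usage, since the call above is only a proposal):

CODE
// Hypothetical: only these headers would get delivered with the response.
llSetHTTPHeaderFilter(["Content-Type", "Content-Length", "Last-Modified"]);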
_____________________
Co-Founder / Lead Developer
GigasSecondServer
Gwyneth Llewelyn
Winking Loudmouth
Join date: 31 Jul 2004
Posts: 1,336
03-30-2006 13:32
I'm just drooling with this announcement, and all I can say is: implement a bare-bones version of this as fast as you can :) ... and improve it on the next round of implementations.

I see that people here are completely crazy about the possibilities :) Controlling WebDAV from an SL application?! Wow, that's utterly awesome, but really, do we need it *now*? We can't even write text to a prim, and I assume that even retrieving images remotely (to be converted on the fly to textures and dropped into an object's inventory...) is out of the question. Also, even if we had 8-bit-clean notecards, I expect that retrieving, say, a zipped file (that you could give to another resident on a 'binary notecard', who in turn could use a scripted device to upload that same zipped file to a remote server with a POST request)... well, the concept is fantastic, but...

... all I really need is an equivalent llLoadURL() that doesn't display a dialogue box, and that has a delay of 0.1 seconds and not 10 seconds ;)

A full CURL implementation can wait until "stage #2". :)
_____________________

Anna Bobbysocks
Registered User
Join date: 29 Jun 2005
Posts: 373
03-30-2006 13:37
I'd rather see the most bare-bones request/reply possible rather than see this follow in the path of Havok or SpeedTree or HTML on a prim or whatever.

I agree with Adam's idea regarding implementing any throttling on a proxy rather than in LSL. CURL has fine proxy support.

Just set the curl proxy to a DNS round robin or a proper load balancer if you have one, and then set up a few squid caches. You might need to implement the throttle as an add-on... I couldn't google up any readily implemented ones.

But, then, voila. Problem solved. People can't DoS via SL because they'd have to make it through the squid. Hell, you could even be kind enough to turn caching on for us as well (or give us a flag)...

Also, you probably don't want to implement the throttling in LSL, because, you know, that doesn't work as well if someone does a grid-wide attack.
Jarod Godel
Utilitarian
Join date: 6 Nov 2003
Posts: 729
03-30-2006 13:49
From: Kelly Linden
It is simply a different project, different feature.
Maybe T-Rex can explain it better than I can.
_____________________
"All designers in SL need to be aware of the fact that there are now quite simple methods of complete texture theft in SL that are impossible to stop..." - Cristiano Midnight

Ad aspera per intelligentem prohibitus.
Anna Bobbysocks
Registered User
Join date: 29 Jun 2005
Posts: 373
03-30-2006 14:03
Unfortunately Jarod, the security ramifications of sending the URL requests via the client are far more brutal than the scalability issues caused by doing it from the sims.

Simply put, you do not make HTTP requests on behalf of individuals, which btw, is why I always keep my streaming turned off.

As Adam and I have pointed out, if you have a proxy in the middle, you can throttle the potential DoS attacks.

The problem comes when LL wants to let anyone own a sim. Do they make the requests via their proxy? At that point, though, they can deal with it. Perhaps just turn off the proxy. It won't be that hard to deal with, whatever the solution is.
Aliasi Stonebender
Return of Catbread
Join date: 30 Jan 2005
Posts: 1,858
03-30-2006 14:06
On the other hand, making it client side has two ugly possibilities.

(a) you'd have the possibility of someone making a script that would use your system as part of a DDoS attack.

(b) If you limit it to the owner of the script (in a manner like llMapDestination), you lose the ability to communicate when your avatar isn't on.

Whereas, for whatever ugliness server-side may entail, it's at least limited to a known quantity of LL-owned servers.
_____________________
Red Mary says, softly, “How a man grows aggressive when his enemy displays propriety. He thinks: I will use this good behavior to enforce my advantage over her. Is it any wonder people hold good behavior in such disregard?”
Anything Surplus Home to the "Nuke the Crap Out of..." series of games and other stuff
Anna Bobbysocks
Registered User
Join date: 29 Jun 2005
Posts: 373
03-30-2006 14:07
Still, LL is going to turn up in some weird logs.

I'd recommend that users have to register the domains at another web page and show that they have agreed to all terms of use. You can then send that filter to the squid proxy.
Strife Onizuka
Moonchild
Join date: 3 Mar 2004
Posts: 5,887
03-30-2006 14:08
It would be wonderful to be able to build the headers being sent, like...

[HTTP_RAW, "Cookie", "mmmm cookies"]

You'd best give us outgoing XML-RPC or I'll be forced to use this:
CODE

key request;

key SendRemoteData(key uuid, string StringValue, integer IntValue)
{
    if (StringValue)
        StringValue = "<string>" + StringValue + "</string>";
    else
        StringValue = "<string/>";
    if (uuid != "")
        uuid = "<string>" + (string)uuid + "</string>";
    else
        uuid = "<string/>";
    string body = "<?xml version=\"1.0\"?><methodCall> <methodName>llRemoteData</methodName><params><param><value> <struct><member><name>Channel</name><value>" + (string)uuid + "</value></member><member><name>IntValue</name><value><int>" + (string)IntValue + "</int></value></member><member><name>StringValue</name><value>" + StringValue +
        "</value></member></struct></value></param></params> </methodCall>";
    return llHTTPRequest([HTTP_URL, "http://xmlrpc.secondlife.com/cgi-bin/xmlrpc.cgi", HTTP_METHOD, "POST"], body);
}

default
{
    state_entry()
    {
        request = SendRemoteData("2d0efb26-0287-f25c-9d55-5fd09339ff14", "", 0);
    }
    http_response(key id, integer status, list headers, string body) // comment this out to make it pass through lslint
    // link_message(integer a, integer status, string body, key id)  // and uncomment this line.
    {
        if (id == request && status == 200)
        {
            if (!llSubStringIndex(body, "<?xml version=\"1.0\"?>"))
            {
                if (llSubStringIndex(body, "<?xml version=\"1.0\"?><methodResponse><fault>"))
                { // a cleverly crafted (ok, not so cleverly crafted) xml-rpc response can generate an invalid xml response.
                    integer p = llSubStringIndex(body, "<int>");
                    @loop;
                    integer m = p + 1 + llSubStringIndex(llDeleteSubString(body, 0, p), "<int>");
                    if (m > p)
                    {
                        p = m;
                        jump loop;
                    }
                    m = p + 1 + llSubStringIndex(llDeleteSubString(body, 0, p), "</int>");
                    integer IntValue = (integer)llGetSubString(body, p + 5, m - 1);
                    string StringValue = llGetSubString(body, 211, p - 53);
                    if (StringValue == "<string/>")
                        StringValue = "";
                    else
                        StringValue = llDeleteSubString(StringValue, -9, 7);
                    // Insert your code here
                    llOwnerSay(llList2CSV([IntValue, StringValue]));
                }
                else
                    ; // invalid request
            }
            else
                ; // some horrible error
        }
    }
}
_____________________
Truth is a river that is always splitting up into arms that reunite. Islanded between the arms, the inhabitants argue for a lifetime as to which is the main river.
- Cyril Connolly

Without the political will to find common ground, the continual friction of tactic and counter tactic, only creates suspicion and hatred and vengeance, and perpetuates the cycle of violence.
- James Nachtwey
Jarod Godel
Utilitarian
Join date: 6 Nov 2003
Posts: 729
03-30-2006 14:23
From: Aliasi Stonebender
(b) If you limit it to the owner of the script (in a manner like llMapDestination), you lose the ability to communicate when your avatar isn't on.
You do realize I'm talking about this kind of model, right: LSL -> User Client -> Web -> User Client -> LSL.

I'm thinking of this working the exact same way that llLoadURL works, where activating the function fires off a message to the user client. The only difference here is that instead of just launching a browser, the command would activate a "wget" function in the SL client, and then return information to the LSL script via either an llListen channel or some other internal messaging system. Maybe I'm missing something in what you and Anna are saying, but I don't see how having the SL client request a web site is any more dangerous or prone to DDoS attacks than simple, everyday web browsing is.

Besides, if I wanted to overcome the throttling, it's simply a matter of putting a self-replicating prim on my land and having it spawn new objects at roughly 25-meter increments in the X- and Y-coordinates. Even if a sim can only request a certain amount of pages at a time, don't forget there are over a hundred connected sims on the mainland. All a person would need to do is pick up a few parcels of land here and there, set up their bugs, and have them launch at the same time. If you could get five or ten viral prims in every sim this way, you could theoretically hit a site 5,000 times every few seconds. In that kind of scenario, it wouldn't matter if the prims got derezzed once in a while or (if they were physical) rolled off the sim. Just firing off the request is what hurts the victim's site.

On the other hand, if these requests went through the SL client, and popped up a little warning, a la llLoadURL, or a permission request, the worst that could happen is a couple of users take the bait and a few 128-kbit upstreams get used until the person logs out.

Maybe you should use smaller words, but I don't see any greater risks in having the SL client (specifically if it's using Mozilla or cURL's code) make the requests, as opposed to the sims doing the calls. What am I missing?
_____________________
"All designers in SL need to be aware of the fact that there are now quite simple methods of complete texture theft in SL that are impossible to stop..." - Cristiano Midnight

Ad aspera per intelligentem prohibitus.
Strife Onizuka
Moonchild
Join date: 3 Mar 2004
Posts: 5,887
03-30-2006 14:41
Personally I don't care a bit about XML-RPC if I can have raw HTTP. XML-RPC has huge overheads and it's easy to poison the attributes (try passing "<evil>" as the string for an xml-rpc response). Currently with XML-RPC there is no way to know who is requesting the information; there is no way to do connection security. If we could run small HTTP servers out of our prims, at least then we could do some sorts of security (speaking of which, you could do HTTPS and have all objects have their own certificates).

Maybe a 4KB limit on body length.
Delay the script according to message length.
Something like 4kbps (a 4KB message (headers + body) would delay the script 10 seconds).

Cut any response's body at 4KB, and delay the event triggering like the above (for a 4KB page, the event wouldn't trigger for 10 seconds).
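
Taking that 4KB-in-10-seconds figure literally, the formula would be roughly this (a sketch only, nothing implemented):

CODE
// Roughly the proposed throttle: ~410 bytes per second, so a 4KB message
// (headers + body) would sleep the script about 10 seconds.
float ThrottleDelay(integer header_bytes, integer body_bytes)
{
    return (header_bytes + body_bytes) / 409.6;
}

default
{
    state_entry()
    {
        llOwnerSay("4KB message: " + (string)ThrottleDelay(0, 4096) + " seconds");
    }
}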
_____________________
Truth is a river that is always splitting up into arms that reunite. Islanded between the arms, the inhabitants argue for a lifetime as to which is the main river.
- Cyril Connolly

Without the political will to find common ground, the continual friction of tactic and counter tactic, only creates suspicion and hatred and vengeance, and perpetuates the cycle of violence.
- James Nachtwey
Gwyneth Llewelyn
Winking Loudmouth
Join date: 31 Jul 2004
Posts: 1,336
Thinking about security issues and prevention of abuse is great, but...
03-30-2006 17:01
I wonder if there was a similar thread to this one on the forums when LL released the llEmail() LSL function?

After all, people could also rez billions of prims grid-wide and do DoS attacks using email... ok, I know, a well-tuned sendmail server will gracefully degrade performance and avoid most of the DoS attacks — but the same applies to Apache, if you know how to do it ;) [I'm clueless about how to do it on M$ equivalents]

/me now listens to the theories of why llEmail() wouldn't be able to create DoS attacks as damaging as outbound HTTP through well-configured proxies.
_____________________

Aliasi Stonebender
Return of Catbread
Join date: 30 Jan 2005
Posts: 1,858
03-30-2006 19:48
From: Jarod Godel

Maybe you should use smaller words, but I don't see any greater risks in having the SL client (specifically if it's using Mozilla or cURL's code) make the requests, as opposed to the sims doing the calls. What am I missing?


Because, if it's server-side, we get object to object communication, for one.
_____________________
Red Mary says, softly, “How a man grows aggressive when his enemy displays propriety. He thinks: I will use this good behavior to enforce my advantage over her. Is it any wonder people hold good behavior in such disregard?”
Anything Surplus Home to the "Nuke the Crap Out of..." series of games and other stuff
Jarod Godel
Utilitarian
Join date: 6 Nov 2003
Posts: 729
03-30-2006 20:03
From: Aliasi Stonebender
Because, if it's server-side, we get object to object communication, for one.
Don't we already have that because of XML-RPC?
_____________________
"All designers in SL need to be aware of the fact that there are now quite simple methods of complete texture theft in SL that are impossible to stop..." - Cristiano Midnight

Ad aspera per intelligentem prohibitus.
Iron Perth
Registered User
Join date: 9 Mar 2005
Posts: 802
03-30-2006 23:20
From: Gwyneth Llewelyn
I wonder if there was a similar thread to this one on the forums when LL released the llEmail() LSL function?

After all, people could also rez billions of prims grid-wide and do DoS attacks using email... ok, I know, a well-tuned sendmail server will gracefully degrade performance and avoid most of the DoS attacks — but the same applies to Apache, if you know how to do it ;) [I'm clueless about how to do it on M$ equivalents]

/me now listens to the theories of why llEmail() wouldn't be able to create DoS attacks as damaging as outbound HTTP through well-configured proxies.


Yes, I believe this has always been a primary concern on the part of Cory and his team.
Jarod Godel
Utilitarian
Join date: 6 Nov 2003
Posts: 729
03-31-2006 06:53
From: Gwyneth Llewelyn
/me now listens to the theories of why llEmail() wouldn't be able to create DoS attacks as damaging as outbound HTTP through well-configured proxies.
Well, not DDoS attacks, but certainly spam. As for why it's never been done, I don't have an answer, except maybe it's not as practical as a farm of zombie machines. *shrugs*

I'm just saying:

1) If you leave HTTP requests on the server, you do have a security concern, and

2) If you put HTTP requests on the client, you'll be distributing the load away from an already bottlenecked system and you won't have to restrict the number of connections for security reasons. (Clients and the servers are already exchanging information, an extra 4k in the mix isn't going to be as expensive as a whole new process that has to hit web sites.)

Also, what's stumping me is this: what kind of applications are people going to be writing that need to access the web when people aren't around? Are LSL coders really looking to build things like timed aggregators and web spiders when they still have no in-world way to write the grabbed data to a disk?
_____________________
"All designers in SL need to be aware of the fact that there are now quite simple methods of complete texture theft in SL that are impossible to stop..." - Cristiano Midnight

Ad aspera per intelligentem prohibitus.
Kelly Linden
Linden Developer
Join date: 29 Mar 2004
Posts: 896
03-31-2006 07:40
Many times http access will be desired when the object owner isn't around. Do you really see no security risk in random code executing web requests from your machine (which happens to be inside your firewall, not to mention on your machine)? There are more security risks with your idea. They are solvable and will have to be solved for Mozilla; however, this proposed feature doesn't have those security risks.

From: someone
1) If you leave HTTP requests on the server, you do have a security concern

Being able to instigate a DDoS attack or spam is addressable by effective throttling and caching, and, while definitely a concern, isn't technically a security issue. What specifically is the security issue you are concerned about with requests coming from the servers?
_____________________
- Kelly Linden
Zero Linden
Linden Lab Employee
Join date: 18 Oct 2005
Posts: 22
03-31-2006 08:38
Nice to see there's a little interest in this ;-)

on ajax-stubs:
We couldn't use a stub pattern without severely limiting people to some specific technology for building their web-based servers.

on remote exploits:
If a remote site offers information such as user's credit cards to any HTTP request, then they have far bigger things to worry about than LSL objects in SL.

on HTTP vs. XML-RPC:
Seems everyone is happy with HTTP. XML-RPC helper functions, XML parsing, etc... would all be nice, but not needed at the start.

on headers out-bound with the request:
Agreement that we want at least the same set that llEmail gives:
  1. Object-Name: Object
  2. Region: Gibson (254976, 256000)
  3. Local-Position: (117, 129, 50)

We'll add Object-Key and Owner-Key/Name too, since many scripts would likely have to encode this information in the body otherwise.

One person really wants to specify additional headers. We'll look into that for round 2.

on returned headers with response:
Some interest in data and location. Some want to save memory by specifying interesting headers. Not yet sure what we'll do here; probably all phase 2.

on delays and timing:
All over the map here: 4-8 calls per second (per object?), no script delay but limit to 5 per 30 seconds, 10 per sim, .1 seconds, etc... I understand the arguments for no script delays, so we'll code a minimal one at first. Other throttling will be done to keep the rates reasonable. Still trying to figure out what they'd be.

on response length:
HTTP_BODY_MAXLENGTH seems like a great idea. We will look into a) how to indicate that truncation has occurred, and b) ways to specify a range. The latter will not be in the first deploy.
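
For what it's worth, usage could look something like this once the parameter exists (a sketch; the URL is just a placeholder, and how truncation gets signalled is still open, per the above):

CODE
// Sketch: ask for the response body to be capped at 4KB.
key req;

default
{
    state_entry()
    {
        req = llHTTPRequest("http://example.com/feed.txt",
            [HTTP_BODY_MAXLENGTH, 4096], "");
    }

    http_response(key id, integer status, list headers, string body)
    {
        if (id == req)
            llOwnerSay((string)status + ": got " + (string)llStringLength(body) + " bytes");
    }
}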

on methods:
We will have to limit methods in the first version (libCURL limitation), but are very eager to look at expanding the set of available ones next. I'd be the first to buy a WebDAV client implemented in LSL and prims!

on mime types:
In the response, we'll take any text-based type and return it as a UTF-8 string. Other types will be discarded. Looking into what character encodings we'll support. (Sorry, no auto-creation of assets from jpegs :-O)

In the request, no one said they needed it. Perhaps text/plain is just fine? Though I would think application/x-www-form-urlencoded would be needed too.
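
If we did take that type, usage might look roughly like this (just a sketch; the URL and form fields are placeholders):

CODE
// Sketch: POSTing a small form-encoded body.
default
{
    touch_start(integer n)
    {
        llHTTPRequest("http://example.com/cgi-bin/guestbook.cgi",
            [HTTP_METHOD, "POST",
             HTTP_MIMETYPE, "application/x-www-form-urlencoded"],
            "name=" + llEscapeURL(llDetectedName(0)) + "&message=" + llEscapeURL("hello from SL"));
    }
}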

on request bodies:
We will act like a proxy: We'll take your body and forward it no matter what the method. Transfer headers will be set automatically, and out of your control. Remember: This is LSL, you can't really send too much data here!

on announcement timing:
Kudos for the great iTunes hack. However, this has been in the works since before the iTunes hack was known, and the timing was purely incidental.

on client-side vs. server-side:
I've heard proposed LSL applications for both kinds of HTTP request. We've decided to tackle server-side originated HTTP first.

Client-side requires more complex security and information flow. The security issue is that the request would now originate behind someone's firewall and could potentially poke something local. While that *is* the very point of it (think controlling iTunes), it is also risky. So we have to figure out the right way to handle permissions. The data issue is that the result of the request would have to be pumped back up to the simulator so the script can see the result. Remember, LSL executes on the simulator. Hence, making the HTTP request client-side doesn't really reduce the abuse potential.

on large responses:
We won't be caching results and giving them with llGetMoreHTTPData, at least at first. This has a potential for really tying up the sims.

on "utter heaven":
Cool!
Jarod Godel
Utilitarian
Join date: 6 Nov 2003
Posts: 729
03-31-2006 09:02
From: Kelly Linden
Do you really see no security risk in random code executing web requests from your machine (which happens to be inside your firewall, not to mention on your machine?).
Kelly, let me explain something to you: AJAX. It's a new kind of web-interface technology where JavaScript gets loaded onto your machine from random, almost anonymous sources on the Internet. Those scripts are technically "random code executing web requests from [people's machines]," and people like them. People are building vast new technologies around them. I'm essentially suggesting you guys implement AJAX into Second Life, make our client just a bit thicker.

I'm not blind to the security risks. Give us a pop-up, a la llLoadURL, before the client goes out to snag a web site. Require the script to get permission from the user before it can access the web. Isn't the data to and from SL already encrypted/compressed? If so, then anything passed back and forth for the web (passwords, credit card numbers, etc.) is secure by way of SL's protocols.

What security risks are you talking about?

From: Kelly Linden
What specifically is the security issue you are concerned about with requests coming from the servers?
It's not security I'm worried about, it's accessibility. Say I want to beef up my iTunes controller... If the HTTP requests come from SL, that means I have to (a) send a command to SL, (b) go across the Internet, (c) have SL send a request to my home machine, (d) get the info from my now publicly accessible machine (which means any packet-sniffing person can control my songs), (e) accept the data from my machine, and (f) integrate it into the LSL system.

If the SL client were able to fire off HTTP requests, this would be much simpler... I would (a) give the LSL script a command, (b) which would send the request to my client, (c) that would be able to talk to my secure 127.0.0.1 server, and (d) push the data back to SL -- all of this, by the way, would be securely encrypted/compressed via the SL protocols.

I'm not suggesting everyone is going to do this -- and, yes, I do understand the risks of having people install web servers and 3rd party code; what I'm suggesting here is for us early adopting developers right now -- but by putting the HTTP requests on the client side of things, you give developers the option to use a 3rd party application to do the bulk of the transactions. If I can set up a little server system on my home computer that links to a bigger server, my home system can juggle things like session states, cookies, etc. All the HTTP communications between SL and myself have to be basic, small commands to control the software on my computer. Plus, if I don't want to run that software, as I imagine most users won't, then plain, old web-based communications still function the same.

This is my old request for a client-side API. I want a direct (and controlled) way to have SL talk to my computer. Using the llLoadURL and avatar permission system ensures that I know when a script I'm interacting with wants to use the web. Using a client-based HTTP request system allows me to talk to local systems -- I could control iTunes, build a local AIM/Jabber/IRC proxy, save textual information to my harddrive, etc. I don't understand what security problems you guys are talking about, especially given that what I'm suggesting doesn't seem technologically that different than someone using JavaScript to redirect you to another web page.

From: Kelly Linden
Many times http access will be desired when the object owner isn't around.
I guess this is how Khamon feels when people claim he wanted to destroy telehubs. Does the object owner have to be logged on for llLoadURL to work? Does the object owner have to be on for avatar animations and dance machines to work?

NO!!!

What I'm suggesting is that when Kelly Linden goes up to a terminal that Jarod Godel owns, and asks it to fetch Kelly Linden's favorite RSS feed, Jarod Godel's terminal sends a message to Kelly Linden's client. Kelly Linden's client asks Kelly Linden for permission to grab the web page, Kelly Linden's client grabs the page, and then Kelly Linden's client returns the data to Jarod Godel's terminal. All this happened while Jarod Godel was offline playing City of Heroes, the same way llLoadURL would have launched a third-party application on Kelly Linden's machine whether Jarod Godel was online or not.

I'm not suggesting the object owner be the HTTP requestor for the script, I'm suggesting the object user be the one.

What kind of http access are you guys planning that won't have some avatar (which represents a person with an SL client) as part of the interaction?
_____________________
"All designers in SL need to be aware of the fact that there are now quite simple methods of complete texture theft in SL that are impossible to stop..." - Cristiano Midnight

Ad aspera per intelligentem prohibitus.
Jarod Godel
Utilitarian
Join date: 6 Nov 2003
Posts: 729
03-31-2006 09:08
From: Zero Linden
Hence, making the HTTP request client side doesn't really reduce the abuse potential.
I want to live in your world, where maybe a dozen idiots who click "yes, I'll spam the White House" represent more bandwidth than two hundred-plus servers at a colo...

And before anyone says buffer, cache, or squid again, let me just remind everyone of this paraphrasing: "Now that the grid has crashed once because of self replicating prims, we're limiting the way prims can replicate to make sure this never happens again."

Distributing the load is always better than resting on the security of a central system.
_____________________
"All designers in SL need to be aware of the fact that there are now quite simple methods of complete texture theft in SL that are impossible to stop..." - Cristiano Midnight

Ad aspera per intelligentem prohibitus.
Lex Neva
wears dorky glasses
Join date: 27 Nov 2004
Posts: 1,361
03-31-2006 10:35
Jarod, I think your request has been received by the Lindens. They've already said they're not going to implement it in the first round. I don't think arguing about it is going to help.

From: Jarod Godel

It's not security I'm worried about, it's accessibility. Say I want to beef up my iTunes controller... If the HTTP requests come from SL, that means I have to (a) send a command to SL, (b) go across the Internet, (c) have SL send a request to my home machine, (d) get the info from my now publicly accessible machine (which means any packet-sniffing person can control my songs), (e) accept the data from my machine, and (f) integrate it into the LSL system.


The thing is, if you just get LL-originated HTTP, you can do everything you want to do with client-side HTTP just fine. You do not have to make your server publicly accessible.

You can block your HTTP server down to just the IP addresses of the llHTTPRequest proxies. That will still allow your server to be accessed by any LSL script, so you'll need further authentication. You can use a password, but that's still open to the very unlikely possibility that someone who uses SL controls a backbone router on the way between LL's servers and yours and manages to sniff your traffic.

Instead, you can use a one-time-pad. Create a notecard with, oh, 64 kilobytes of random passwords, one per line. Put that same data on your server, accessible by your CGI script. Have the LSL script send each password in turn with each successive request. One-time-pad is absolutely secure, except in the incredibly unlikely event that someone sniffs the notecard as you upload it into SL.
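
A rough sketch of the script side (the notecard name, URL and form fields here are all made up; the only slow part is the dataserver round-trip to read each notecard line):

CODE
// Sketch of the one-time-pad idea: read the next unused line from a password
// notecard and send it along with each request. Names and URL are placeholders.
string NOTECARD = "otp";
integer line = 0;
key nc_query;
key http_req;

default
{
    touch_start(integer n)
    {
        nc_query = llGetNotecardLine(NOTECARD, line);
    }

    dataserver(key query, string data)
    {
        if (query != nc_query) return;
        if (data == EOF)
        {
            llOwnerSay("Pad exhausted - time to upload a fresh notecard.");
            return;
        }
        ++line; // never reuse a password
        http_req = llHTTPRequest("http://example.com/cgi-bin/terminal.cgi",
            [HTTP_METHOD, "POST"],
            "password=" + llEscapeURL(data) + "&command=status");
    }

    http_response(key id, integer status, list headers, string body)
    {
        if (id == http_req)
            llOwnerSay("Server replied: " + body);
    }
}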

My point here is that you can do everything you want to do even if all you get is LL-originated HTTP. I think you're one of the very few people who actually has a valid use for client-originated HTTP, so shouldn't you be the one that has to do just a bit of extra work to make your specific system work, rather than have LL implement something that they've already said would require a lot more thought and work due to security concerns?

From: someone

What kind of http access are you guys planning that won't have some avatar (which represents a person with an SL client) as part of the interaction?


How about any kind of terminal like SLExchange or SLBoutique or anything like that that needs to send data to an outside source?
Jarod Godel
Utilitarian
Join date: 6 Nov 2003
Posts: 729
03-31-2006 10:41
From: Lex Neva
How about any kind of terminal like SLExchange or SLBoutique or anything like that that needs to send data to an outside source?
What data does a terminal need to send when no one's around?

And, yeah, I'm not arguing anymore. It's their system, their world, they can do what they want. I reserve the right to laugh when this blows up in their face, though.
_____________________
"All designers in SL need to be aware of the fact that there are now quite simple methods of complete texture theft in SL that are impossible to stop..." - Cristiano Midnight

Ad aspera per intelligentem prohibitus.
Rickard Roentgen
Renaissance Punk
Join date: 4 Apr 2004
Posts: 1,869
03-31-2006 11:19
From: Jarod Godel
1) If you leave HTTP requests on the server, you do have a security concern, and

The security concern is the responsibility of site admins. It's already been stated that a header will be inserted (not to mention the IP range) that will make it apparent that the request is being made by Second Life, allowing for easy blacklisting.

From: Jarod Godel
2) If you put HTTP requests on the client, you'll be distributing the load away from an already bottlenecked system and you won't have to restrict the number of connections for security reasons. (Clients and the servers are already exchanging information, an extra 4k in the mix isn't going to be as expensive as a whole new process that has to hit web sites.)

If you put the http requests on the client, the IP range is no longer as easy to track, you now have to determine which client to use should an object not happen to be attached to an avatar, and you have to send to the server (an extra step or two) before sharing with relevant avatars. An HTTP request is not a significant workload.

From: Jarod Godel
Also, what's stumping me is this: what kind of applications are people going to be writing that need to access the web when people aren't around? Are LSL coders really looking to build things like timed aggregators and web spiders when they still have no in-world way to write the grabbed data to a disk?

jabber relays, grid mappers, update servers, vendor servers, region statistics, etc.
_____________________
Aliasi Stonebender
Return of Catbread
Join date: 30 Jan 2005
Posts: 1,858
03-31-2006 13:00
From: Jarod Godel
Don't we already have that because of XML-RPC?


Ah, but using XML-RPC is limited and using it for two-way communication is something of a hack, as you well know, whereas this is quite a bit more full-featured, or so it would seem.
_____________________
Red Mary says, softly, “How a man grows aggressive when his enemy displays propriety. He thinks: I will use this good behavior to enforce my advantage over her. Is it any wonder people hold good behavior in such disregard?”
Anything Surplus Home to the "Nuke the Crap Out of..." series of games and other stuff