05-09-2008 13:05
Seems to me that one huge issue with how this works is texture transfer. Obvious, right? Well, I had a thought. Right now, there are zones built so that you almost *have to* set the draw distance higher than 128; sometimes it's practically mandated just to see things right. The Privateer Space area, in the Showcase, is one example, where the "walls" provide the appearance of space. Yes, I am sure it works better in the newer client, and I am sure you could just set it to night, but that's not the point. Sometimes you don't have much choice and need a greater viewing distance.

So, I got to thinking. In a 3D program called POV-Ray, when you want to do a fast render just to see how things fit together, you can use what is called a "quick color". This is an approximation of what you are going to see, rather than an exact match. It might be defined like quick_color rgbt <1, 0, 0, 0.5>, if you wanted 100% red with 50% transparency. And while this obviously wouldn't be usable for things like trees (partly transparent crossed planes waving in the distance would look kind of silly), it may work for other objects. If you had both a "view distance" and a "quick color distance", and set the first to 128 and the second to 256, then the server would still have to feed you prims between those distances, but not textures, just the quick colors. This would decrease the bandwidth a decent amount, while still giving a bit more visibility. One could even, if they didn't mind losing some detail, set the view distance under the default, to gain some performance while still "seeing" the objects.
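The two-threshold idea above can be sketched in a few lines. This is purely illustrative: the function and constant names are my own, not part of any real viewer, and a real client would make this decision per-frame in its scene graph.

```python
# Hypothetical sketch of the two-threshold scheme: objects inside the
# view distance get full textures; objects between the view distance
# and the quick-color distance are drawn flat-shaded using only their
# quick color; anything beyond is not drawn at all.

VIEW_DISTANCE = 128.0         # full textures streamed within this range
QUICK_COLOR_DISTANCE = 256.0  # only prim shape + quick color beyond it

def render_mode(distance: float) -> str:
    """Pick how an object at `distance` metres should be drawn."""
    if distance <= VIEW_DISTANCE:
        return "full_texture"   # stream geometry and textures
    if distance <= QUICK_COLOR_DISTANCE:
        return "quick_color"    # stream geometry, use a flat RGBA only
    return "culled"             # not sent to the viewer at all

print(render_mode(100.0))  # full_texture
print(render_mode(200.0))  # quick_color
print(render_mode(300.0))  # culled
```

The point is that the bandwidth saving comes entirely from the middle band: geometry still flows, but the texture stream stops at 128.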

As for how you set the color, I considered two possibilities. One would be to take an average of the colors in a texture, along with any alpha channel data, and generate a single color matching those averages. This has the advantage that it would work with *any* newly uploaded texture, and could be done for existing ones fairly easily. The other would obviously be to let someone "set" it themselves. Of course, a third option combines the two, and would just let you adjust the quick color that was automatically generated for the texture, so that yours appears instead (for cases where averaging lands you with gray, but the real object looks like something else, or isn't visible enough with the pre-generated quick color). At a distance you don't see a lot of detail anyway, so for most objects this seems like a workable idea. For some... it would be a sloppy compromise, but speed is more of a factor here than realism anyway.
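The automatic option is just a per-channel mean over the texture's pixels. Here is a minimal sketch, assuming the texture has already been decoded into a list of RGBA tuples (Pillow's Image.getdata() on an RGBA image gives you exactly that); the function name is mine.

```python
def average_quick_color(pixels):
    """Average RGBA over all pixels to derive a quick color.

    `pixels` is a sequence of (r, g, b, a) tuples with components
    in 0..255. Returns a single (r, g, b, a) tuple -- the color a
    distant, flat-shaded stand-in for this texture would use.
    """
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    a = sum(p[3] for p in pixels) / n
    return (round(r), round(g), round(b), round(a))

# A texture that is half opaque red and half opaque blue averages to
# a muddy purple -- exactly the "averaging may land you with gray"
# (or mud) caveat that motivates letting the user override it.
tex = [(255, 0, 0, 255)] * 8 + [(0, 0, 255, 255)] * 8
print(average_quick_color(tex))  # (128, 0, 128, 255)
```

Averaging the alpha channel too means a mostly-transparent texture yields a mostly-transparent quick color, which is roughly what you'd want at range.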

Then again, it doesn't make much sense to me that no procedural texture/shader system is available either. For a lot of things, from pottery textures, to different sorts of stone, to complex patterns on an object, or pretty much anything else requiring repeated patterns rather than precise details, a procedural texture would work quite well. They are supported by the shaders on modern cards (though probably in a different form than what I have used with POV-Ray), and could save yet another 5-10k per texture, or more. They contain only the description of what the shader needs to do, not the entire bitmap of the texture, so there is less overhead getting them "to" the machine that needs them. The right adaptation of height-field-style data, to generate sculptie data, could even mean that some of "those" could be done via such a system, though this would likely require software calculation rather than using the graphics card's method (but I am hardly an expert, so there may be a way to do that too).
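To make the "description, not bitmap" point concrete, here is a toy procedural pattern, written in Python for readability rather than as real shader code. Everything about it is hypothetical; the idea is that a sim would only need to ship the formula's few parameters, not a bitmap, and the GPU (or CPU) would evaluate the color at each surface point.

```python
def checker(u, v, scale=8,
            color_a=(200, 180, 150, 255),
            color_b=(90, 60, 40, 255)):
    """Tiny procedural checker pattern over UV coordinates in 0..1.

    The entire 'texture' is this formula plus three parameters
    (scale and two colors) -- a few dozen bytes to transmit, versus
    kilobytes for an equivalent bitmap at any resolution.
    """
    if (int(u * scale) + int(v * scale)) % 2 == 0:
        return color_a
    return color_b

# Evaluate at a couple of UV coordinates on the surface:
print(checker(0.05, 0.05))  # (200, 180, 150, 255) -- cell (0, 0)
print(checker(0.05, 0.20))  # (90, 60, 40, 255)    -- cell (0, 1)
```

A real implementation would live in a fragment shader and use noise functions for stone or pottery rather than a checker, but the transmission argument is the same: parameters go over the wire, pixels never do.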