Ahzzmandius Werribee
Registered User
Join date: 22 Oct 2006
Posts: 12
01-23-2007 00:42
Has anyone else noticed that the Linux client has a "bug" where, if it gets more than just a couple of packets out of order, or drops a few packets and reports that "missing packet" warning on the console, the official viewer starts eating about 150% MORE CPU time until it receives those missing packets? What is causing this?

I just tried the First Look viewer for Linux. Amazing speedup in video. Other things seem more responsive as well when my workstation is loaded down to the hilt with CPU/memory-intensive tasks. Just FYI, I run other tasks because my day job is coding: compiling and video editing. Plus it's a pretty much guaranteed way to get SL to drop a couple of packets and produce this excessive CPU use issue in the official viewer.

I have NOT been able to trigger this (yet) in the First Look viewer. I promise I'll try to get it to trigger tomorrow and report back. *grin*