32-bit vs 64-bit |
|
|
Dingo Warrigal
Registered User
Join date: 30 Jun 2006
Posts: 20
|
08-09-2006 15:52
AMD did a real number on the bloody market, all right.
Pretty much everything that makes AMD64 better than IA32 has nothing to do with 64 bits. They shoulda called it AMD64LEETMEMORYBUSOMGMOREREGISTERSPWNEDUINTEL. |
|
Spider Neva
Registered User
Join date: 27 Sep 2004
Posts: 2
|
08-09-2006 16:57
Has anyone tried to run SL in Windows XP Pro x64 Edition? Any issues? How about with nVidia drivers? I'm upgrading to a 64-bit system. I have access to the OEM version of XP x64 and wouldn't mind giving it a try, but if it doesn't run SL properly it'll be a deal-killer. TIA

Annie..... I tried running SL with Win XP Pro x64 and it was a no go... I ended up re-installing my 32-bit version for SL |
|
Shirley Marquez
Ethical SLut
Join date: 28 Oct 2005
Posts: 788
|
08-09-2006 21:30
Principle 2: In general (that is, answer not necessarily tailored to Intel or AMD), a 32-bit app will run faster than a 64-bit version of the same app *on the same hardware*. The main driver for a 64-bit app is when the app needs to directly address *huge* amounts of memory, where huge is greater than about 4GB. Consider this: if all your longs and pointers change from 32-bit to 64-bit, the cpu has to move twice as much data, and the data takes up twice as much cache and register space.

The results I've seen for 32-bit vs. 64-bit Linux on AMD64 systems contradict this -- the 64-bit version is 10-20% faster. The main reason appears to be that the CPU has twice as many registers in 64-bit mode; the compiler can take advantage of this to generate better code that touches main memory less often.

On the other hand, if you do the same test on a Netburst CPU (Pentium 4 or Pentium D), the 64-bit version is slower; the Intel CPUs appear to bottleneck on instruction fetch, and since the 64-bit code is larger because of the bigger pointers, it's slower. I haven't yet seen any 64-bit results for the new Core 2 Duo architecture.

Running 64-bit Windows and then running a 32-bit application on it, though, is always slower, regardless of CPU. There is extra overhead caused by switching between modes, so we can expect this penalty to persist in Vista. |
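Shirley's "twice as much cache" point can be put in rough numbers. Here's a back-of-the-envelope sketch (Python used just as a calculator; the node layout and the 512 KB cache size are illustrative assumptions, not measurements of any real program):

```python
# Rough cache-footprint arithmetic: a pointer-heavy structure grows when
# pointers go from 32-bit to 64-bit, so fewer nodes fit in cache.
# The node layout (two pointers plus a 4-byte payload) is made up.

def node_size(pointer_bytes, payload_bytes=4, pointers_per_node=2):
    """Size of one linked-structure node, ignoring alignment padding."""
    return pointers_per_node * pointer_bytes + payload_bytes

def nodes_per_cache(cache_bytes, pointer_bytes):
    """How many such nodes fit in a cache of the given size."""
    return cache_bytes // node_size(pointer_bytes)

L2_CACHE = 512 * 1024  # 512 KB, typical for an Athlon 64 of the era

fit32 = nodes_per_cache(L2_CACHE, 4)  # 32-bit pointers
fit64 = nodes_per_cache(L2_CACHE, 8)  # 64-bit pointers
print(fit32, fit64)  # 43690 26214
```

So the 64-bit build fits roughly 40% fewer of these nodes in the same cache, which is the cost the extra registers have to outweigh.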
|
Wayfinder Wishbringer
Elf Clan / ElvenMyst
Join date: 28 Oct 2004
Posts: 1,483
|
08-09-2006 21:43
From personal experience thus far, I agree with what Corvus said. I am extremely happy with my 64-bit, dual-core Opteron 170. I have multitasking needs to run SL and many other programs at once. While SL itself doesn't take advantage of multithreading at present, I can still put the extra core to good use.

I totally agree, Torley. I've read a lot on here about Intel Duo-core "sucking". Not only have I not seen substantial tech specs verifying this... my personal experience is the opposite. Sure, there are people (such as hard-core gamers) who set systems side by side and wow.. the AMD finished that demo .25 seconds faster than the Intel so... LOL. Has no effect on the reality of everyday computer use.

When I bought my computer I priced the Intel Duo core against the AMD duo core chips. AMD was significantly more expensive (at least, at the time I priced them). Since I have in the past had problems with AMD chips overheating, and compatibility problems with some applications... I opted for the Intel.

The results? Whereas with single-core I absolutely could not run Second Life with a graphics program such as Gimp (it would lag me to a standstill on both programs)... with my Intel Duo-core I can now run Second Life, Gimp, an email program and Win TV all at the same time with no significant speed degradation. I can handle that kind of "bad" performance.
_____________________
Visit ElvenMyst, home of Elf Clan, one of Second Life's oldest and most popular fantasy groups. Visit Dwagonville, home of the Dwagons, our highly detailed Star Trek exhibit, the Warhammer 40k Arena, the Elf Clan Museum and of course, the Elf Clan Fantasy Market. We welcome all visitors. : )
|
|
Wayfinder Wishbringer
Elf Clan / ElvenMyst
Join date: 28 Oct 2004
Posts: 1,483
|
08-09-2006 21:51
right on i love my machine .. so happy with it Gateway® GT5032 Media Center Desktop

I have a similar machine, and like you, I love it. Or.. I would except... The [expletive deleted] engineers at Gateway still haven't learned the lesson about proprietary motherboard layouts. I can't even buy an advanced processor fan because their motherboard is drilled in a rectangular rather than a square configuration. When Thermaltake doesn't make a fan for a system... a fan doesn't exist for that system.

I keep thinking Gateway will wake up one day and realize that proprietary boards are stupid, stupid, stupid. If I'd had any idea at the time that it was set up that way, I'd have never bought the system. But other than that one thing... it's nice. And Gateway's first-year support is unequalled. Which is good, because I had to replace two defective DVD drives. That comes back to Gateway's second problem of using substandard-quality components.

Ah well. It's the first time in my life I've bought a pre-boxed system. Always built them myself before. Let the buyer beware. Gateway has terrific service. But their conceptualization needs some serious work. They have GOT to get away from that proprietary motherboard / case nonsense. It's dinosaur mentality.
|
|
Shirley Marquez
Ethical SLut
Join date: 28 Oct 2005
Posts: 788
|
08-11-2006 00:41
I have a similar machine, and like you, I love it. Or.. I would except... The [expletive deleted] engineers at Gateway still haven't learned the lesson about proprietary motherboard layouts. I can't even buy an advanced processor fan because their motherboard is drilled in a rectangular rather than a square configuration. When Thermaltake doesn't make a fan for a system... a fan doesn't exist for that system.

Zalman used to make a fan that didn't attach to the CPU at all. You attached a big heat sink to the processor, and there was a bracket, screwed into the same screw holes that hold in some of your slot covers, that held a fan over the heat sink. If you could find one of those old beasts (which, to be fair, WAS a pain to install), it might work on your system. |
|
Wayfinder Wishbringer
Elf Clan / ElvenMyst
Join date: 28 Oct 2004
Posts: 1,483
|
08-11-2006 05:03
Zalman used to make a fan that didn't attach to the CPU at all. You attached a big heat sink to the processor, and there was a bracket, screwed into the same screw holes that hold in some of your slot covers, that held a fan over the heat sink. If you could find one of those old beasts (which, to be fair, WAS a pain to install), it might work on your system.

Thanks. Yeah, I remember that item. It won't work with the Gateway though, for two reasons: 1) how does one attach the heat sink to the CPU, and 2) the slot covers on the Gateway are nowhere near the CPU. I've had in mind for some time to try my own kluge, however, that's similar to the concept you've mentioned here: build my own fan bracket that mounts on a substructure in the case rather than the motherboard. The fan doesn't necessarily have to be touching the heat sink; it just has to blow through it. So in theory, your idea is a good one.
|
|
grumble Loudon
A Little bit a lion
Join date: 30 Nov 2005
Posts: 612
|
08-11-2006 12:35
Two of the processor socket latches broke on one of my computers, so I slapped some heat sink compound on the processor and wedged the heatsink and fan in place with a piece of wood.
|
|
Wayfinder Wishbringer
Elf Clan / ElvenMyst
Join date: 28 Oct 2004
Posts: 1,483
|
08-11-2006 13:20
Two of the processor socket latches broke on one of my computers, so I slapped some heat sink compound on the processor and wedged the heatsink and fan in place with a piece of wood.

LOL. I've heard of rigging... but a block of wood is new. I'm no better though. I'm thinking of gluing a system fan to the plastic CPU housing box with silicone or contact cement. Whatever works in a pinch.
|
|
Emerald Todd
Registered User
Join date: 15 Jul 2006
Posts: 2
|
08-13-2006 12:17
First off, dual cores rock the world when it comes to SL

Heh. First off, you don't exactly know what you're talking about, since SL doesn't even support dual-core processor usage. In fact, SL runs TONS better if you turn off one of the dual cores for SL. ;3 So stop talking about things you /really/ don't know about.

As for all the people WITH dual-core processors running SL, you'll notice a grand improvement by following the steps below. I'm sure the same process works on other OS's, but my guide will be for Windows XP.

Ctrl-alt-del with SL running.
Go to the Processes tab.
Right click on secondlife.exe
Click 'Set Affinity...'
Uncheck CPU 0.
Hit OK.

You should notice a significant change, even when in heavy sims. Note, this does not work on hyper-threaded CPUs, since they're not really dual core. ;3 |
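For the curious, the "Set Affinity" dialog in Task Manager is just building a CPU bitmask in which bit i enables CPU i (this is what the Win32 `SetProcessAffinityMask` call takes). A small sketch of that arithmetic, in Python for illustration only; actually applying a mask to a process needs the Win32 API or a tool that wraps it:

```python
# Sketch of the affinity bitmask behind Task Manager's "Set Affinity"
# dialog: bit i of the mask enables CPU i. Pure arithmetic only.

def affinity_mask(enabled_cpus):
    """Build a CPU-affinity bitmask from a list of enabled CPU indices."""
    mask = 0
    for cpu in enabled_cpus:
        mask |= 1 << cpu
    return mask

# Dual-core machine, CPU 0 unchecked as the post suggests:
print(bin(affinity_mask([1])))     # 0b10 -> SL pinned to the second core
print(bin(affinity_mask([0, 1])))  # 0b11 -> both cores allowed (default)
```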
|
Thili Playfair
Registered User
Join date: 18 Aug 2004
Posts: 2,417
|
08-13-2006 14:21
Well, for some, setting affinity does give you more fps. It seems to be tied to your motherboard; I use nVidia's, and it just drops fps enabling 2 cores in SL. However... it doesn't do this in Windows Vista, so that's why I think it has something to do with motherboard drivers.
However, owning a dual core is fun: run Photoshop, virus progs, defrag, etc. in the background and you won't really notice it, except that the hard drive noise went up. And more and more games are using it. Oblivion can use 4 cores, and most new FPS games are dual-core supported too, putting some load on the other core if you have one available. Single core is going out anyway; quad is up soon n.n; |
|
Argent Stonecutter
Emergency Mustelid
Join date: 20 Sep 2005
Posts: 20,263
|
08-13-2006 16:20
Heh. First off, you don't exactly know what you're talking about, since SL doesn't even support dual-core processor usage.

I'll see about running it with affinity for CPU 1 only some time, see if I can get a *further* improvement. |
|
Wayfinder Wishbringer
Elf Clan / ElvenMyst
Join date: 28 Oct 2004
Posts: 1,483
|
08-13-2006 16:51
Heh. First off, you don't exactly know what you're talking about, since SL doesn't even support dual-core processor usage. In fact, SL runs TONS better if you turn off one of the dual cores for SL. ;3 So stop talking about things you /really/ don't know about.

Emerald, a little more respect for Corvus might be in order here. I have run both solo core and dual core and I assure you... my dual core runs SL much faster than solo. I have in the past tried the step you recommended. It works in some instances. In many instances it doesn't. I don't know why... those are just the facts. It sure didn't work for me. Technically, I know of no reason why SL should behave any differently than any other program... and I know that I am very pleased with my performance gain since I purchased my dual-core system. Before telling others they're ignorant, you might want to consider other possibilities.
|
|
Shirley Marquez
Ethical SLut
Join date: 28 Oct 2005
Posts: 788
|
08-13-2006 22:25
In fact, SL runs TONS better if you turn off one of the dual cores for SL.

Not universally true. If I set affinity on the system I'm using right now (Pentium D 805), my frame rate drops by about 10%, and SL runs less smoothly. It does seem to help on AMD dual-core systems though. It is true that having the second core doesn't speed up SL, or at least not much. But it DOES make life much better if you want to run anything else at the same time. Big win for clothing designers, because Photoshop and SL can coexist happily -- just remember to have plenty of RAM! |
|
Kolarn Lach
Registered User
Join date: 29 Jun 2006
Posts: 14
|
08-14-2006 09:38
A more relevant part of this debate is: when can we actually expect dual-core support from SL? It seems SL is extremely suitable for multi-threading, since a lot of the work that has to be done has no direct impact on the rest (engine state updating vs rendering, for example). Given the fact that within 6 months all new CPUs sold will be dual core, it's an important issue, one that may dramatically improve the already way too poor performance of SL.
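The update-vs-render split Kolarn describes is the classic producer/consumer shape. A minimal sketch (Python as a stand-in language; the frame count and snapshot format are invented for illustration, nothing here is SL's actual code):

```python
# One thread updates engine state and publishes snapshots; another
# consumes snapshots and "renders" them. A bounded queue keeps the two
# threads in step; None is a sentinel meaning "no more frames".
import queue
import threading

def update_loop(frames, snapshots):
    """Simulate the engine-state thread: publish one snapshot per frame."""
    for frame in range(frames):
        snapshots.put({"frame": frame})
    snapshots.put(None)  # signal the renderer to stop

def render_loop(snapshots, rendered):
    """Simulate the render thread: draw each snapshot as it arrives."""
    while True:
        snap = snapshots.get()
        if snap is None:
            break
        rendered.append(snap["frame"])

snapshots = queue.Queue(maxsize=2)  # small buffer: renderer can't fall far behind
rendered = []
t1 = threading.Thread(target=update_loop, args=(5, snapshots))
t2 = threading.Thread(target=render_loop, args=(snapshots, rendered))
t1.start(); t2.start()
t1.join(); t2.join()
print(rendered)  # [0, 1, 2, 3, 4]
```

With one producer and one consumer on a FIFO queue, frames come out in order, which is why this decomposition is comparatively safe.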
|
|
Wayfinder Wishbringer
Elf Clan / ElvenMyst
Join date: 28 Oct 2004
Posts: 1,483
|
08-14-2006 12:06
A more relevant part of this debate is when can we actually expect dual core support from SL? It seems SL is extremely suitable for multi-threading since a lot of work has to be done that has no direct impact on eachother (engine state updating vs rendering for example). Given the fact that within 6 months all new CPUs sold will be dual core, it's an important issue, one that may dramatically improve the already way too poor performance of SL.

Equally important in this is the need to hunt down and eliminate deep core bottleneck and programming issues. SL's database and retrieval system shows every sign of having some real structural and conceptual problems that even dual-core adaptation won't fix. Until those issues are resolved, no matter how fast the client, poor performance will likely remain a problem.
|
|
Kolarn Lach
Registered User
Join date: 29 Jun 2006
Posts: 14
|
08-15-2006 11:02
Equally important in this is the need to hunt down and eliminate deep core bottleneck and programming issues. SL's database and retrieval system shows every sign of having some real structure and concept problems that even dual core adaptation won't fix. Until those issues are resolved, no matter how fast the client, poor performance will likely remain a problem.

Agreed. I make multi-user socket/UDP servers for a living, and I'm quite surprised by both the odd scalability model used in SL and the even more worrying fact of servers slowing down to a crawl with only 40-50 people in the sim (and I do mean slow, as in chat lines showing up 10-15 seconds later). Oh well.. |
|
Wayfinder Wishbringer
Elf Clan / ElvenMyst
Join date: 28 Oct 2004
Posts: 1,483
|
08-15-2006 23:03
OK, update info on the solo/dual core issue.
I did a bit of experimenting today and discovered the following things:

Duo core Intel runs SL faster than solo core. Duo core AMD runs SL slower than single core; it has to be switched to single-core operation for the program. (Load SL, press CTRL/ALT/DEL, click PROCESSES, find SecondLife.exe, right click, SET AFFINITY, uncheck the 2nd core.)

MAJOR ISSUE: Have the fastest, meanest graphics card you can afford. It absolutely makes a difference in the operating speed of SL.

Conclusion: Since SL is far less advanced graphically than many other programs on the market, my belief is that the server/client interaction is not up to snuff, and the client-side graphics engines in SL must be really bad. There's really no conceivable reason (that I can think of) that SL should so intensely tax client graphics that the most powerful graphics cards are required to make it operate properly. Experimentation indicates that anything under an Nvidia 7600 or ATI X800 is not sufficient to run SL properly. You can use less-powerful graphics cards... but it will lag.

So if you're talking strictly SL... the truth is a decent solo-core computer with a really rocking graphics card will do much better than a duo-core with a poor graphics card. However, duo-core will allow running OTHER programs along with SL (such as Photoshop, Gimp, email, etc.) without a degradation in speed. Try that with a solo core and you'll drop to syrup.

This also means that most laptops are not sufficient to run SL. A laptop would have to have a heavy-duty graphics card in it to be able to handle SL properly. Again... many laptops will run SL if you don't mind glitchy movement. But if you want smooth operation, you have to have a hard-core graphics system.

They really need to correct that, imho. There's just no need for SL to require that much of a graphics card. I suggested weeks ago that I thought they were making SL too graphics-demanding. Today's experimentation proved it.
|
|
Steve Mahfouz
Ecstasy Realty
Join date: 1 Oct 2005
Posts: 1,373
|
my experience with AMD 64 bit dual core and 7900 GTX
08-15-2006 23:52
I have one of the cheaper AMD 64 chips, the 3800, on an Asus mobo. The 3800 is dual core. I run 2 video cards, EVGA 7900 GTX, in SLI mode. I thought, yay, I'm going to be smoking with frame rate on SL. Wrong. From what I've read, SL is much more cpu intensive rather than graphics card intensive because of the old rendering engine. Of course, it helps to have a better graphics card. Just my measly $0.02.
_____________________
http://slurl.com/secondlife/Ecstasy/128/129/31
Ecstasy: high quality residential living |
|
Wayfinder Wishbringer
Elf Clan / ElvenMyst
Join date: 28 Oct 2004
Posts: 1,483
|
08-16-2006 05:53
I have one of the cheaper AMD 64 chips, the 3800, on an Asus mobo. The 3800 is dual core. I run 2 video cards, EVGA 7900 GTX, in SLI mode. I thought, yay, I'm going to be smoking with frame rate on SL. Wrong. From what I've read, SL is much more cpu intensive rather than graphics card intensive because of the old rendering engine. Of course, it helps to have a better graphics card. Just my measly $0.02.

Well, at this point I'd have to assume that SL is both CPU and graphics intensive. Since most duo-core processors are fairly high level to start with (in fact, I think all of them are, if I'm right), any duo-core system will work. In solo core, probably a Pentium or Athlon trucking along at 2.8 or higher would be necessary. Anything lower will likely lag. I've never really tested SL with a decent Celeron or Sempron chip. I'd love to have the opportunity to test a high-clock version of these (say a 3.0 or 3.2) with a heavy-duty graphics card just to see what would happen.
|
|
Argent Stonecutter
Emergency Mustelid
Join date: 20 Sep 2005
Posts: 20,263
|
No other game is faced with the same problems as SL.
08-16-2006 16:06
Since SL is far less advanced graphically than many other programs on the market, my belief is that the server/client interaction is not up to snuff, and the client-side graphics engines in SL must be really bad.

The big problem is that other games have professional graphics artists and developers spending days working on setting every surface and object up for optimal rendering, marking surfaces to never be rendered, laying down pre-baked shadows, and doing all kinds of optimizations in the very database that SL has to do in real time. And as long as SL allows easy user-created content, that will always be true. |
|
Wayfinder Wishbringer
Elf Clan / ElvenMyst
Join date: 28 Oct 2004
Posts: 1,483
|
08-16-2006 18:05
The big problem is that other games have professional graphics artists and developers spending days working on setting every surface and object up for optimal rendering, marking surfaces to never be rendered, laying down pre-baked shadows, and doing all kinds of optimizations in the very database that SL has to do in real time. And as long as SL allows easy user-created content, that will always be true.

Unreal has user-created content. There are literally thousands of fully user-created scenarios, worlds, monsters, weapons, etc... and it still works fine. These people aren't professional graphics artists who tweak every single surface; they're just standard people who know how to use 3-D graphics tools, and they plug them into the Unreal engine. The difference is that the Unreal engine is tweaked to the max. Why? Because the creators know that laggy engines make for a bad reputation, and a bad reputation means people play Quake instead of Unreal. The people at Quake are aware of the same thing. They know the bottom-line rule of this game: if you're going to be in the business, ya have to do it right.
|
|
Kolarn Lach
Registered User
Join date: 29 Jun 2006
Posts: 14
|
08-17-2006 07:43
The big problem is that other games have professional graphics artists and developers spending days working on setting every surface and object up for optimal rendering, marking surfaces to never be rendered, laying down pre-baked shadows, and doing all kinds of optimizations in the very database that SL has to do in real time. And as long as SL allows easy user-created content, that will always be true.

Although your points will indeed add overhead, it should not make an impact nearly this large. Current graphics cards should be capable of achieving way higher FPS. Even when I stare at a single polygon it doesn't top 60fps on my 7950GX2. Excuse me, but that's simply flawed. |
|
Argent Stonecutter
Emergency Mustelid
Join date: 20 Sep 2005
Posts: 20,263
|
08-17-2006 08:32
Unreal has user-created content. There are literally thousands of fully-user-created scenarios, worlds, monsters, weapons etc... and it still works fine. These people aren't professional graphics artists who tweak every single surface; they're just standard people who know how to use 3-D graphics tools, and they plug them into the Unreal engine.

Are Unreal or Quake players building them in real time, in the game? No; once the game is running, the scenery is static. The game never has to deal with a new building suddenly showing up in the line of sight, with the culling tree and high- and low-resolution textures and meshes having to be regenerated in real time. There are a huge number of optimizations that SL simply can't do because they require too much setup to do in real time. LL is working on adding them; the culling trees in 1.10 are one example. Now I'm not saying they're as finely tuned as they could be, and they have definitely made mistakes where they've tried to improve the graphic quality (I had to upgrade my video card with 1.10 because the new vertex programs completely killed my old nVidia 5600), but there's no way they can possibly be as efficient as a game with static content. |
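The static-vs-dynamic content point can be made concrete with a toy culling example (Python, invented scene data; real engines use spatial trees such as octrees or BSPs, but a flat list keeps the idea visible): with static content the visible set can be precomputed once, while in SL a newly rezzed object forces the cull pass to be redone on the fly.

```python
# Toy view culling: keep objects whose axis-aligned bounding box
# overlaps the camera's view box. Scene coordinates are made up.

def visible(objects, view_min, view_max):
    """Names of objects whose AABB overlaps the view box, sorted."""
    out = []
    for name, (lo, hi) in objects.items():
        if all(lo[i] <= view_max[i] and hi[i] >= view_min[i] for i in range(3)):
            out.append(name)
    return sorted(out)

scene = {
    "shop":  ((0, 0, 0),   (10, 10, 10)),
    "tower": ((200, 0, 0), (210, 10, 80)),  # far outside the view
}
view = ((-50, -50, -50), (50, 50, 50))

print(visible(scene, *view))  # ['shop']

# A resident rezzes a new building in view: the cull set must be redone.
scene["new_build"] = ((20, 20, 0), (30, 30, 15))
print(visible(scene, *view))  # ['new_build', 'shop']
```

A static game bakes that second step away at build time; SL has to pay for it at runtime, which is Argent's point.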
|
Argent Stonecutter
Emergency Mustelid
Join date: 20 Sep 2005
Posts: 20,263
|
08-17-2006 08:38
Although your points will indeed have overhead, it should not make an impact nearly this large. Current graphics cards should be capable of achieving way higher FPS.

I'm not sure why you care about beating 60 FPS. Anything over 20 FPS I simply don't notice, unless I'm paying obsessive attention, and even there I'm not sure I'm not imagining it. I'd like to peg SL at 30 FPS, and have it simply sleep at the end of a graphics frame if it has cycles to spare. |
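The 30 FPS cap Argent proposes is a simple frame limiter: after rendering, sleep off whatever is left of the 1/30 s budget. A sketch (Python for illustration; the render times are simulated, a real client would measure its own):

```python
# Frame limiter: each frame gets a 1/30 s budget; whatever the render
# didn't use is slept away, freeing the CPU for other processes.
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def spare_time(render_seconds, budget=FRAME_BUDGET):
    """Seconds to sleep after a frame; zero if the frame ran over budget."""
    return max(0.0, budget - render_seconds)

# A fast frame (10 ms of work) leaves ~23 ms for other processes:
print(round(spare_time(0.010) * 1000, 1))  # 23.3
# A slow frame (50 ms) gets no sleep -- the client just runs late:
print(spare_time(0.050))                   # 0.0

def run_frame(render):
    """One iteration of a capped render loop."""
    start = time.perf_counter()
    render()  # draw the frame
    time.sleep(spare_time(time.perf_counter() - start))
```

Note the cap only helps when frames finish early; it does nothing for the heavy-sim case where frames already blow the budget.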