Timetest shootout: Linux vs. Windows!
Posted: Thu Jun 30, 2005 9:06 pm
Ask and ye shall receive. Well, I did run the same tests back-to-back, and I found some pretty interesting results!
Reference system is an MSI KN-266 board running a Duron 1.3GHz, 512MB of SDRAM, and an MSI GeForce4 MX440 32MB vid card running in dualview at 1280x1024 on each screen. Windows is XP Pro with SP2; Linux is SuSE 9.1 (default 2.6 kernel) with KDE and the nVidia drivers.
I couldn't get AA or AF to work in Linux, but no sweat. On a GF4 MX440 it's obviously slow as... molasses, as my first run proves.
All Windows runs were done with every conceivable unneeded service stopped, including all networking. I ran tests in DirectX and OpenGL, and DX performance was about HALF of what the OpenGL renderer could do with my drivers. They aren't the newest, but come on, neither is the game or the video card.
Linux runs were in KDE, with no particular attention to stopping anything. Even had several desktop widgets running. Swapfile usage in Linux was, as usual, 0% the whole time.
All graphics settings in-game were set to "highest" and I didn't use "-nosparkles -nomotionblur" because Durons don't support SSE2 anyway. The only practical thing I could have used in Windows and not Linux was bump mapping, which actually improves the looks a little bit (especially with water ripples) but has no impact on performance that I could tell (I tested both ways).
======== 1280x960 ===========
Windows, 1280x960, no AA or AF, OpenGL renderer:
84.74 FPS average (Descent3 v1.4)
43 FPS min
132 FPS max
42 sec. runtime
Linux, 1280x960, no AA or AF, OpenGL (of course):
88.78 FPS average (Descent3 v1.4)
41 FPS min
117 FPS max
40 sec. runtime
Windows, 1280x960 WITH AA and AF, OpenGL:
26.25 FPS average (Descent3 v1.4)
12 FPS min
31 FPS max
137 sec. runtime
This was the "lets see if we can get my PC to crash" run. Looks decent, until I turned on Antialiasing, when my GF4 took a dump.
You can tell it's actually a warmed-over GF2 here. Having only 32MB for textures didn't help either: each frame is about 4.7MB at 32-bit (1280x960x4 bytes), and you can see the GPU is the limiting factor once you look at the rest of the results. Quick math below if you want to check it.
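If you want to sanity-check that, here's a rough back-of-the-envelope in Python. The buffer layout (double-buffered color plus a 32-bit Z-buffer) is just my assumption about what the driver sets up, so take the exact split with a grain of salt:

# Rough VRAM budget for a 32MB card at 1280x960, 32-bit color.
# Assumes front + back color buffers plus a 32-bit Z-buffer;
# the driver may actually allocate things differently.
width, height, bytes_per_pixel = 1280, 960, 4
frame = width * height * bytes_per_pixel    # one color buffer
buffers = frame * 3                         # front + back + Z
vram = 32 * 1024 * 1024
print("one frame:         %.1f MB" % (frame / 2.0**20))             # ~4.7 MB
print("all buffers:       %.1f MB" % (buffers / 2.0**20))           # ~14.1 MB
print("left for textures: %.1f MB" % ((vram - buffers) / 2.0**20))  # ~17.9 MB

Call it a bit under 18MB left over for textures before AA even enters the picture, so it's no shock the card falls over once antialiasing piles on extra buffer and fill work.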
======== 1024x768 ===========
Windows:
89.24 FPS average (Descent3 v1.4)
36 FPS min
170 FPS max
40 sec. runtime
Linux:
135.96 FPS average (Descent3 v1.4)
52 FPS min
191 FPS max
26 sec. runtime
Here we see Linux starting to pull away from Windows, even though the mins and maxes aren't that far apart.
More than anything, I'd say the Linux run was less "bouncy", as shown by the (much) higher average framerate and the shorter runtime.
======== 640x480 ===========
Windows:
90.35 FPS average (Descent3 v1.4)
53 FPS min
170 FPS max
39 sec. runtime
Linux:
151.16 FPS average (Descent3 v1.4)
103 FPS min
252 FPS max
23 sec. runtime
Here's where we separate the men from the boys. About twice the framerate all around in Linux. 'Nuff said. Just to see what it would do in Windows, I dropped all the detail settings to "low", set pretty much everything else I could find to "low" or "sucky", and even dropped desktop textures to 16-bit. The fastest FPS I got out of Windows, period, was 191, right on par with Linux's max at 1024x768: 2.5 times the screen real estate, plus maxed-out detail settings and 24-bit color (quick pixel math below).
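For the "2.5 times the screen real estate" bit, the raw pixel counts back it up (quick sketch, nothing Descent-specific):

# Pixels per frame at each resolution used in the tests above.
for w, h in [(640, 480), (1024, 768), (1280, 960)]:
    print("%dx%d: %d pixels" % (w, h, w * h))
# 640x480  ->  307200
# 1024x768 ->  786432  (about 2.56x the pixels of 640x480)
# 1280x960 -> 1228800  (4x the pixels of 640x480)

So Windows needed everything dropped to the floor at 640x480 just to match what the Linux build was doing at 1024x768 with everything maxed.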
I hope you guys found this interesting. Was there some setting that I missed? The Linux version doesn't have all the -lowmem switches and such; I'm thinking both versions' defaults should be the same, though. If anyone knows differently, I'd love to hear about it.