G80 Is Killer GPU (128 Unified Shaders @ 1350MHz)
- Aggressor Prime
- DBB Captain
- Posts: 763
- Joined: Wed Feb 05, 2003 3:01 am
- Location: USA
Details are here.
This, combined with the Intel buyout coming tonight, could mean a great 2006/2007 for Intel-nVidia. Needless to say, AMD-ATI's R600 will be a factor of two behind this monster.
- Admiral LSD
- DBB Admiral
- Posts: 1240
- Joined: Sun Nov 18, 2001 3:01 am
- Location: Northam, W.A., Australia
- Contact:
Admiral LSD wrote: The thing is though that with Intel already having a well established chipset division, dominance over the low end of the graphics market and an apparent lack of interest in the high-end and workstation segments, nVidia have nothing to offer them to make buying them out worthwhile.
I'd think nVidia would rather buy one of the smaller CPU companies (whatever happened to Cyrix?) than endure not being able to provide a complete platform the way AMD can, but it wouldn't want to be competing with Intel either.
Damned if they do, damned if they don't.
Wasn't another 3d chipset company in a similar situation all those years ago?
- Gold Leader
- DBB Ace
- Posts: 247
- Joined: Tue Jan 17, 2006 6:39 pm
- Location: Guatamala, Tatooine, Yavin IV
- Contact:
Neo wrote: Did you see the power requirements for GeForce 8800 GTX SLI? 800 W! lol x_x
Hehe -- max of 225W per card times two, all on the 12V rails. Roughly 38A.. plus whatever your PC sucks. Guess we will see a new line of PSUs w/ 4 GF/X power connectors soon..
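A quick back-of-the-envelope check of that 38A figure (a sketch only; the 225W per-card maximum is just the number quoted above, and actual draw will of course vary with load):

# Current drawn from the 12 V rails by two cards at the quoted board maximum.
cards = 2
watts_per_card = 225        # W, maximum cited above for an 8800 GTX
rail_voltage = 12.0         # V, graphics power comes off the 12 V rails

gpu_current = cards * watts_per_card / rail_voltage
print(f"{gpu_current:.1f} A on the 12 V rails")   # 37.5 A, i.e. roughly 38A
# ...plus whatever the CPU, drives and the rest of the system pull.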
More info on the 8800:
http://www.xbitlabs.com/news/video/disp ... 42704.html
http://www.xtremesystems.org/forums/sho ... p?t=120598
GTX pricing:
https://www.excaliberpc.com/parts.asp?stxt=8800+gtx
Kinda steep..
Bigus Dickus -- http://we.pcinlife.com/attachments/foru ... picNQ3.jpg
It seems as though the power requirements aren't THAT bad (though still trending upward):
http://www.dailytech.com/article.aspx?newsid=4812
It seems to perform quite well on benchmarks.
For me this is NVidia marchitecture once again. NVidia constantly boasts about having the most advanced gfx hardware, yet they have failed to deliver truly polished products for a long time now. Remember the GF 58XX debacle? They only just managed to get back into the game with the GF 68XX. Their hardware consumes more power than ATI's and lacks features, particularly in the image quality department. ATI has had the better AA for a long time. NVidia's drivers do not just cheat, they cannot even deliver the image quality ATI can. ATI may not have the most advanced features in their GPUs, but what they have is polished, fine-tuned, highly optimized and fast. And they do it again and again.
Bottom line: NVidia tries to get the headlines, but once such hardware is really needed, ATI will be there again with something better, sleeker, more polished than NVidia.
This is not just fanboy talk. I was an NVidia fan big time - until the Radeon 9700 arrived, and NVidia so miserably failed to deliver on their promises.
If you don't know how big NVidia is into using marketing tricks to get an edge over competitors, google the story of why and how 3dfx got ruined (to make it short: NVidia spent over a year announcing T&L hardware and more without delivering, until 3dfx's back was broken because customers were staring at NVidia's promises and holding back from buying 3dfx hardware. HW T&L never really got popular in the end - it was just a marchitecture trick).
If you still think NVidia has the better drivers, look at their current Linux drivers - they fail to install due to an installer bug, and there is no easy fix. And even if they worked, they'd be a PITA to use compared to ATI's driver installer. I know; I have used them both.
Nah, I'll wait for what ATI will come up with, and I bet it will be good and sleek and fast and consume less power than NVidia's crap and deliver better images.
- Krom
- DBB Database Master
- Posts: 16138
- Joined: Sun Nov 29, 1998 3:01 am
- Location: Camping the energy center. BTW, did you know you can have up to 100 characters in this location box?
- Contact:
Uhh huh, right.
First, 3dfx destroyed themselves by not having a product out to compete; the only thing nVidia did was hold to an extremely aggressive 6-month product cycle. The GeForce2 came out and was more than the Voodoo5 could handle, then the GeForce3 followed shortly after and was the final nail in the 3dfx coffin.
Second, molecular details in image quality and FSAA quality couldn't possibly be more irrelevant at 60+ frames per second in almost any game. Image quality is for screen shots and has almost nothing to do with actually playing. I disable AA in my 6800 GT even though it comes at no performance penalty on my system simply because I can't tell the difference between FSAA on and FSAA off unless I stop moving and look for it hard.
As far as power requirements, ATI is no saint either. All the major GPUs hog a ton of energy, saying ATI is power efficient is like saying Prescott ran cool.
ATI has also cheated in benchmarks, and it is just as well known as nVidia cheating in 3DMark; acting as if ATI never did anything is deceptive and one-sided.
Also, ATI's recent record of hard launches has been nothing but a spectacular failure compared to nVidia actually holding off until they had parts on store shelves for the last few major launches. Keeping up with demand after launch is still trouble, but that goes for ATI too once they finally ship 6-8 weeks after "launch".
Diedel wrote: If you still think NVidia has the better drivers, look at their current Linux drivers - they fail to install due to an installer bug, and there is no easy fix. And if they worked, they'd be a PITA to use compared to ATI's driver installer. I know it, I used them both.
It's not a bug if every distro puts their X.org files in different places.
The installer has a switch to specify where they go.
Diedel wrote: ATi fan post
I would suggest the interested reader check out this article for some background info.
Warning -- further derailing will be deleted. Please stay on topic.
flip,
the current installer has a bug causing it to complain about missing kernel header files.
Grendel,
I know my post is a bit OT; I'm just adding some depth.
Krom,
maybe I am wrong on 3dfx; my sources are contradictory.
The image quality differences are not just molecular. You are giving me the impression that you don't really know what you are talking about. http://www.3dcenter.de/ has a lot of very technical in-depth information about this (they are even quoted in international tech sites). Unfortunately for you it is in German.
As far as power and cooling go, ATI has been better than NVidia since the Radeon 9700. It was NVidia who put two power plugs on their gfx cards, and they continue to build such power hogs.
ATI did cheat, but way less severely than NVidia did, and they stopped it altogether when it was detected, while NVidia continued to cheat.
Admittedly, ATI has had problems delivering their hardware. On the other hand, I don't think 4 to 8 weeks is a big deal.
I stick with what I said earlier: NVidia boasts the most advanced GPU features, but nobody needs them yet, and they need tweaking. ATI sticks with proven tech, highly optimized, doing everything people need. ATI drivers are more stable and image quality is better. Who needs DX10 hardware except those ppl who need to compete for the biggest e-pen0s?
- Krom
- DBB Database Master
- Posts: 16138
- Joined: Sun Nov 29, 1998 3:01 am
- Location: Camping the energy center. BTW, did you know you can have up to 100 characters in this location box?
- Contact:
Diedel, I'm not sure if you were paying attention, but the G80 draws a whopping 13 more watts at the plug under full load than the ATI X1950 XTX (321 vs 308 watts). At idle the difference is a good bit bigger, 229 vs 184 watts (45 watts). But that is still not exactly a "power hog", especially considering how much more powerful the G80 is compared to the X1950 XTX, and I would expect nVidia to reduce the idle power consumption in driver updates before too long.
The reason nVidia uses two power plugs is to properly meet the standards and regulations for the maximum load that should be pulled through a single power plug. By following the standards more tightly, nVidia cards can function properly with a much wider base of installed PSUs, something ATI is not doing. In theory, drawing too much power through a single power plug could melt the plug or cause stability issues from too much load on a single line from the PSU. That is why nVidia uses two power plugs: not because it is impossible to supply the needed wattage over just one plug, but to increase compatibility and stability.
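To put rough numbers on the connector argument (a sketch only; it assumes the commonly cited limits of about 75 W from the PCIe x16 slot and about 75 W per 6-pin PCIe plug - check the actual specifications before relying on them):

# Smallest number of 6-pin plugs that keeps every power source within its limit.
SLOT_W = 75        # assumed max delivered by the PCIe x16 slot
SIX_PIN_W = 75     # assumed max per 6-pin PCIe power connector

def connectors_needed(board_power_w):
    extra = max(0, board_power_w - SLOT_W)   # power the slot alone can't supply
    return -(-extra // SIX_PIN_W)            # ceiling division

print(connectors_needed(225))   # -> 2, matching the two plugs on the 8800 GTX
print(connectors_needed(140))   # -> 1, a single plug would do within these limits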
Well, maybe NVidia is catching up, but they still have to wipe out the pretty bad record they gathered over the last few years. It's easy to lose a good reputation and hard to recover it. And as I wrote, it's not just about raw power and keeping to power plug specs. I am more of a quality freak than a speed freak, and at 60+ fps you can afford quality instead of squeezing 3% higher framerates out of applications.
Edit:
You may want to read this interesting article about the GF 8800. I have to admit that it looks like NVidia did (almost) everything right with the 8800. Interestingly enough, the article states that it is a completely new design, in development since 2002; hence it doesn't suffer from design restrictions from NVidia's previous gfx hardware generations. The article also states that ATI has developed a unified shader gfx hardware for the XBox 360, so you can bet your behind they will come up with something good too when they think the time has come.
Krom,
that article has different numbers on power consumption than you do.
Edit 2:
Another thought after having read another article about the GF 8800 ... they said ATI is clearly beaten (true) and NVidia will rule the Christmas sales ... probably also true, and shedding light on the most common human disease, which is stupidity. The card is superfast, but do you really need that much speed? DX10 and Vista will not be available before the end of January '07, and DX10 games even later. I bet many ppl will buy the 8800 - and have a uselessly powerful piece of hardware in their computers. It was the same with PS 3.0 hardware: there were as good as no games utilizing it. Bottom line: ATI's PS 2 hardware was just as fast as NVidia's PS 3 hardware. The 8800 seems to really be a good gfx card - with features nobody may need, because nobody can use them yet. Tsk. Btw, NVidia recommends using at least 1600x1200 with the 8800, which makes a huge display almost mandatory. I wonder if NVidia shouldn't start to develop some money printing hardware, too ...
- Krom
- DBB Database Master
- Posts: 16138
- Joined: Sun Nov 29, 1998 3:01 am
- Location: Camping the energy center. BTW, did you know you can have up to 100 characters in this location box?
- Contact:
I happen to have a 1600x1200 display myself...
Here is the big useful article on the G80:
http://www.anandtech.com/video/showdoc.aspx?i=2870
Anyway, the article I read on power consumption was the DailyTech quick preview of the card that was posted a few days ago; more detailed articles are around now. But as far as performance per watt goes, G80 wins every time, which is impressive when you consider it has 681 million transistors! The power consumption with two of them in SLI was around 500 watts at the outlet, which is a lot, but not the end-of-the-world projections some people were making. The OCZ 1000 watt PSU they used for testing was overkill.
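For what it's worth, "performance per watt" here just means frames rendered per unit of energy drawn; a trivial sketch (the fps values below are placeholders, not measurements - only the wattages come from the figures quoted earlier in the thread):

# 1 W = 1 J/s, so fps / watts = frames rendered per joule of energy.
def frames_per_joule(fps, system_watts):
    return fps / system_watts

# Hypothetical illustration only - substitute real benchmark results.
print(frames_per_joule(100, 321))   # hypothetical fps at the G80 system's quoted full-load draw
print(frames_per_joule(60, 308))    # hypothetical fps at the X1950 XTX system's quoted draw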
The real kicker that makes this card semi-useless, though, has everything to do with your display. Only a CRT can keep up with the framerates these cards spew at anything under 2560x1600; pairing such an FPS monster with an L-for-Latency LCD is pretty much as stupid as it gets. But don't hold your breath for nVidia's marketing team to tell you that, because they won't.
But to address your final point, the people that buy this kind of hardware are either 1) extremely wealthy and have money to burn on it, or 2) someone who has been planning/saving up for a long time and will use this to build a system that they will use for several years. It has been a long time since any current GPU hardware was made with "current" games in mind. You buy this with the expectation of not having to upgrade your hardware to play any game you want for the next 12-24 months.
edit: oh yeah, ATI has been using unified shaders for quite a while, but this is the first time nVidia has, so that is what all the unified shader fuss is about right now: mostly nVidia's marketing team getting geared up.
anandtech wrote: Back when Sony announced the specifications of the PlayStation 3, everyone asked if it meant the end of PC gaming. After all Cell looked very strong and NVIDIA's RSX GPU had tremendous power. We asked NVIDIA how long it would take until we saw a GPU faster than the RSX. Their answer: by the time the PS3 ships. So congratulations to NVIDIA for making the PS3 obsolete before it ever shipped, as G80 is truly a beast.
IIRC, RSX is actually a modified G70, so it is no surprise the G80 blows it out of the water.
Krom,
the 8800 is so efficient because the unified shaders allow dynamic load balancing of the shader units depending on what work the GPU primarily has to do at any given moment; i.e. no silicon idling while it waits for some other pipeline stage to finish.
Well, it's always like this: there are some forerunners at the front lines of technical development, and the rest follow a year or two later.
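A toy illustration of that load-balancing idea (purely conceptual - this is not how the real hardware scheduler works): with a fixed vertex/pixel split, the vertex units sit idle during a pixel-heavy frame, while a unified pool simply assigns every unit to whatever work is queued.

# Toy model: 32 units, either split into dedicated groups or pooled.
def fixed_split_busy(vertex_jobs, pixel_jobs, vertex_units=8, pixel_units=24):
    # A dedicated unit can only take its own kind of work.
    return min(vertex_jobs, vertex_units) + min(pixel_jobs, pixel_units)

def unified_busy(vertex_jobs, pixel_jobs, units=32):
    # Any unit can take any job, so nothing idles while work is queued.
    return min(vertex_jobs + pixel_jobs, units)

# A pixel-heavy frame: 2 vertex jobs and 40 pixel jobs ready to run.
print(fixed_split_busy(2, 40))   # 26 of 32 units busy - 6 vertex units idle
print(unified_busy(2, 40))       # 32 of 32 units busy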
Diedel wrote: flip, the current installer has a bug causing it to complain about missing kernel header files.
fliptw wrote: odd. it didn't when I installed mine.
I wrote that I had installed the most recent kernel sources.
Wait - I compiled a kernel before installing it; since the installer compiles a kernel interface, it would need the kernel headers.
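Roughly, the requirement amounts to the running kernel's build tree being present (a sketch of that check, not the installer's actual code):

# The driver's kernel module can only be built if the headers/build tree
# for the currently running kernel are installed.
import os
import platform

kernel = platform.release()                    # e.g. "2.6.18-..."
build_dir = f"/lib/modules/{kernel}/build"     # usually a symlink to the source tree

if os.path.isdir(build_dir):
    print(f"Kernel build tree found at {build_dir}; the module should compile.")
else:
    print("No kernel build tree found - install the kernel sources/headers, "
          "or point the installer at them.")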
Diedel wrote: I wrote that I had installed the most recent kernel sources.
And compiled and installed a kernel from them?
Read the special fancy instructions for SUSE?
- Aggressor Prime
- DBB Captain
- Posts: 763
- Joined: Wed Feb 05, 2003 3:01 am
- Location: USA
I am just asking myself what the point of getting a DX10-ready card now is if DX10 itself isn't ready yet ...
Grendel,
if I were you, I'd consider an Innovatek cooler, as they are very slim (they are built so that they do not block the adjacent slot). I have been using them for years (X800, X1800 coolers).
I also would wait for ATI's DX10 hardware. I told ppl here already they'd certainly have something up their sleeve, given their experience with X-Box 360 silicon.
Heh,
I used Innovatek coolers for my 5950U and two 6800U cards. I switched to Koolance for my 7900GTX though, since Innovatek blocks are hard to get in the US (and expensive as hell..). The Koolance block works very well: idles around 26C, load ~35C. The Silenx block looks pretty nice, could as well have been made by Innovatek. All blocks mentioned are low profile, BTW.