G80 Is Killer GPU (128 Unified Shaders @ 1350MHz)

Aggressor Prime
DBB Captain
Posts: 763
Joined: Wed Feb 05, 2003 3:01 am
Location: USA

Post by Aggressor Prime »

Details are here.

This, combined with the Intel buyout coming tonight, could mean a great 2006/2007 for Intel-nVidia. Needless to say, AMD-ATI's R600 will be 2x behind this monster.
Krom
DBB Database Master
Posts: 16138
Joined: Sun Nov 29, 1998 3:01 am
Location: Camping the energy center. BTW, did you know you can have up to 100 characters in this location box?

Post by Krom »

Last I checked, the Intel-nVidia news was just gossip. Granted, Intel has more than enough cash to buy nVidia, but why? Intel already has more market share in the video market than ATI and nVidia combined; they have no reason to buy more.
Aggressor Prime
DBB Captain
Posts: 763
Joined: Wed Feb 05, 2003 3:01 am
Location: USA

Post by Aggressor Prime »

Well, crazier things have happened (like Dell using AMD chips). Only time will tell.
Admiral LSD
DBB Admiral
Posts: 1240
Joined: Sun Nov 18, 2001 3:01 am
Location: Northam, W.A., Australia

Post by Admiral LSD »

The thing is, though, that with Intel already having a well-established chipset division, dominance over the low end of the graphics market, and an apparent lack of interest in the high-end and workstation segments, nVidia has nothing to offer them to make buying them out worthwhile.
fliptw
DBB DemiGod
Posts: 6459
Joined: Sat Oct 24, 1998 2:01 am
Location: Calgary Alberta Canada

Post by fliptw »

Admiral LSD wrote:The thing is, though, that with Intel already having a well-established chipset division, dominance over the low end of the graphics market, and an apparent lack of interest in the high-end and workstation segments, nVidia has nothing to offer them to make buying them out worthwhile.
I'd think nVidia would rather buy one of the smaller CPU companies (whatever happened to Cyrix?) than endure not being able to provide a complete platform like AMD can, but it wouldn't like to be competing with Intel either.

Damned if they do, damned if they don't.

Wasn't another 3d chipset company in a similar situation all those years ago?
Neo
DBB Admiral
Posts: 1027
Joined: Mon Mar 08, 2004 6:03 am
Location: the honeycomb hideout :)

Post by Neo »

Did you see the power requirements for GeForce 8800 GTX SLI? 800 W! lol x_x
Krom
DBB Database Master
Posts: 16138
Joined: Sun Nov 29, 1998 3:01 am
Location: Camping the energy center. BTW, did you know you can have up to 100 characters in this location box?

Post by Krom »

Yes, while AMD and Intel are talking about performance per watt, Nvidia and ATI continue to work on performance per megawatt.
Gold Leader
DBB Ace
Posts: 247
Joined: Tue Jan 17, 2006 6:39 pm
Location: Guatamala, Tatooine, Yavin IV

Post by Gold Leader »

You would need your own power station in your backyard within a few years :roll: unless you can hope for a power tax drop :)
Grendel
3d Pro Master
Posts: 4390
Joined: Mon Oct 28, 2002 3:01 am
Location: Corvallis OR, USA

Post by Grendel »

Neo wrote:Did you see the power requirements for GeForce 8800 GTX SLI? 800 W! lol x_x
Hehe -- max of 225W per card times two, all on the 12V rails. Roughly 38A.. Plus whatever your PC sucks. Guess we will see a new line of PSUs w/ 4 GF/X power connectors soon..
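If you want to sanity-check that number, here's the back-of-the-envelope version (a rough sketch, assuming the full board power really does come off the 12V rails, which is the worst case):

    # Back-of-the-envelope 12 V rail load for 8800 GTX SLI.
    # Assumes worst case: all board power is drawn from the 12 V rails.
    CARD_MAX_W = 225      # max board power per 8800 GTX (see above)
    NUM_CARDS  = 2        # SLI pair
    RAIL_V     = 12.0

    total_w = CARD_MAX_W * NUM_CARDS      # 450 W for the two cards
    amps    = total_w / RAIL_V            # ~37.5 A, before the rest of the PC
    print(f"{total_w} W -> {amps:.1f} A on the 12 V rails")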

More info on the 8800:

http://www.xbitlabs.com/news/video/disp ... 42704.html
http://www.xtremesystems.org/forums/sho ... p?t=120598

GTX pricing:

https://www.excaliberpc.com/parts.asp?stxt=8800+gtx

Kinda steep..
Top Wop
DBB Master
Posts: 5104
Joined: Wed Mar 01, 2000 3:01 am
Location: Far from you.

Post by Top Wop »

Looks like this is a good generation to skip over. I'm not going to upgrade to a nuclear power plant to stay on top of games; my 7600 GT will do just fine. Besides that, I paid good money for my OCZ 520 with the intention of still using it for the next couple of years.
Grendel
3d Pro Master
Posts: 4390
Joined: Mon Oct 28, 2002 3:01 am
Location: Corvallis OR, USA

Post by Grendel »

I got an OCZ 520 too, that thing will power an 8800 GTX just fine :)
JMEaT
DBB Meat ByProduct
Posts: 10047
Joined: Wed Mar 10, 1999 3:01 am
Location: USA

Post by JMEaT »

I find it rather amusing that PCs are reaching the power requirements of a clothes dryer outlet. :P
Grendel
3d Pro Master
Posts: 4390
Joined: Mon Oct 28, 2002 3:01 am
Location: Corvallis OR, USA

Post by Grendel »

Krom
DBB Database Master
Posts: 16138
Joined: Sun Nov 29, 1998 3:01 am
Location: Camping the energy center. BTW, did you know you can have up to 100 characters in this location box?

Post by Krom »

OMG that heat spreader is huge, I wonder how big the actual die is.
Top Wop
DBB Master
Posts: 5104
Joined: Wed Mar 01, 2000 3:01 am
Location: Far from you.

Post by Top Wop »

It seems as though the power requirements aren't THAT bad (but still trending upward):

http://www.dailytech.com/article.aspx?newsid=4812

It seems to perform quite well on benchmarks.
Grendel
3d Pro Master
Posts: 4390
Joined: Mon Oct 28, 2002 3:01 am
Location: Corvallis OR, USA

Post by Grendel »

Krom wrote:OMG that heat spreader is huge, I wonder how big the actual die is.
20x20mm ..
Diedel
D2X Master
Posts: 5278
Joined: Thu Nov 05, 1998 12:01 pm

Post by Diedel »

For me this is NVidia marchitecture once again. NVidia constantly boasts about having the most advanced gfx hardware, yet they have constantly failed to deliver truly polished products for a long time now. Remember the GF 58XX debacle? They only just managed to get back into the game with the GF 68XX. Their hardware consumes more power than ATI's and lacks features, particularly in the image quality department. ATI has had the better AA for a long time. NVidia's drivers don't just cheat, they can't even deliver the image quality ATI can. ATI may not have the most advanced features in their GPUs, but what they have is polished, fine-tuned, highly optimized and fast. And they do it again and again.

Bottom line: NVidia tries to grab the headlines, but once such hardware is really needed, ATI will be there again with something better, sleeker, and more polished than NVidia's.

This is not just fanboy talk. I was an NVidia fan big time - until the Radeon 9700 arrived, and NVidia so miserably failed to deliver on their promises.

If you don't know how big NVidia is on using marketing to get an edge over competitors, google the story of why and how 3dfx got ruined (to make it short: they were ruined by NVidia announcing T&L hardware and more for over a year without delivering, until 3dfx's back was broken because customers were staring at NVidia's promises and holding back from buying 3dfx hardware. HW T&L never really got popular in the end - it was just a marchitecture trick).

If you still think NVidia has the better drivers, look at their current Linux drivers - they fail to install due to an installer bug, and there is no easy fix. And even if they worked, they'd be a PITA to use compared to ATI's driver installer. I know; I have used them both.

Nah, I'll wait for what ATI will come up with, and I bet it will be good and sleek and fast and consume less power than NVidia's crap and deliver better images.
Krom
DBB Database Master
Posts: 16138
Joined: Sun Nov 29, 1998 3:01 am
Location: Camping the energy center. BTW, did you know you can have up to 100 characters in this location box?

Post by Krom »

Uhh huh, right.

First, 3dfx destroyed themselves by not having a product out to compete; the only thing nVidia did was hold to an extremely aggressive 6-month product cycle. The GeForce2 came out and was more than the Voodoo5 could handle, then the GeForce3 followed shortly after and was the final nail in the 3dfx coffin.

Second, molecular details in image quality and FSAA quality couldn't possibly be more irrelevant at 60+ frames per second in almost any game. Image quality is for screenshots and has almost nothing to do with actually playing. I disable AA on my 6800 GT even though it comes at no performance penalty on my system, simply because I can't tell the difference between FSAA on and FSAA off unless I stop moving and look hard for it.

As far as power requirements go, ATI is no saint either. All the major GPUs hog a ton of energy; saying ATI is power efficient is like saying Prescott ran cool.

ATI has also cheated in benchmarks, and it is just as well known as nVidia cheating in 3DMark; talking like ATI never did anything is deceptive and one-sided.

Also, ATI's recent record of hard launches has been nothing but a spectacular failure compared to nVidia actually holding off until they had parts on store shelves for the last few major launches. Keeping up with demand after launch is still trouble, but that goes for ATI too once they finally ship 6-8 weeks after "launch".
fliptw
DBB DemiGod
Posts: 6459
Joined: Sat Oct 24, 1998 2:01 am
Location: Calgary Alberta Canada

Post by fliptw »

Diedel wrote:If you still think NVidia has the better drivers, look at their current Linux drivers - they fail to install due to an installer bug, and there is no easy fix. And if they worked, they'd be a PITA to use compared to ATI's driver installer. I know it, I used them both
It's not a bug if every distro puts its x.org files in different places.

The installer has a switch to specify where they go.
Grendel
3d Pro Master
Posts: 4390
Joined: Mon Oct 28, 2002 3:01 am
Location: Corvallis OR, USA

Post by Grendel »

Diedel wrote:ATi fan post
I would suggest the interested reader check out this article for some background info.

Warning -- further derailing will be deleted. Please stay on topic.
Diedel
D2X Master
Posts: 5278
Joined: Thu Nov 05, 1998 12:01 pm

Post by Diedel »

flip,

the current installer has a bug causing it to complain about missing kernel header files.

Grendel,

I don't think my post is OT; it's just adding some depth.

Krom,

maybe I am wrong on 3dfx; I have contradictory sources.

The image quality differences are not just molecular. You are giving me the impression that you don't really know what you are talking about. http://www.3dcenter.de/ has a lot of very technical, in-depth information about this (they are even quoted by international tech sites). Unfortunately for you, it is in German.

As far as power and cooling go, ATI has been better than NVidia since the Radeon 9700. It was NVidia who put two power plugs on their gfx cards, and they continue to build such power hogs.

ATI did cheat, but way less severely than NVidia did, and they stopped it altogether when it was detected, while NVidia continued to cheat.

Admittedly, ATI has had problems delivering their hardware. On the other hand, I don't think 4 to 8 weeks is a big deal.

I stick with what I said earlier: NVidia boasts about the most advanced GPU features, but nobody needs them yet, and they need tweaking. ATI sticks with proven tech, highly optimized, doing everything people need. ATI's drivers are more stable and their image quality is better. Who needs DX10 hardware except those ppl who need to compete for the biggest e-pen0s?
Krom
DBB Database Master
Posts: 16138
Joined: Sun Nov 29, 1998 3:01 am
Location: Camping the energy center. BTW, did you know you can have up to 100 characters in this location box?

Post by Krom »

Diedel, I'm not sure if you were paying attention, but the G80 draws a whopping 13 more watts at the plug under full load than the ATI x1950xtx (321 vs 308 watts). At idle the difference is a good bit bigger, at 229 vs 184 watts (45 watts). But that's still not exactly a "power hog", especially considering how much more powerful the G80 is compared to the x1950xtx, and I would expect nVidia to reduce the idle power consumption in driver updates before too long.

The reason nVidia uses two power plugs is to properly meet the standards and regulations for the maximum load that should be pulled through a single power plug. This helps because, by following the standards more tightly, nVidia cards can function properly with a much wider base of installed PSUs, something ATI is not doing. In theory, drawing too much power through a single power plug could melt the plug or cause stability issues from too much load on a single line from the PSU. That is why nVidia uses two power plugs: not because it is impossible to supply the needed wattage over just one plug, but to increase compatibility and stability.
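To put rough numbers on it (the ~75W limits for the slot and for each 6-pin plug are my reading of the spec, not something from the linked articles):

    # Why a 225 W card ends up with two auxiliary power plugs.
    # The 75 W figures are assumed spec limits, used here for illustration.
    import math

    SLOT_W = 75    # assumed max draw through the PCIe x16 slot
    PLUG_W = 75    # assumed max draw per 6-pin PCIe connector
    CARD_W = 225   # 8800 GTX max board power

    plugs = math.ceil(max(0, CARD_W - SLOT_W) / PLUG_W)
    print(f"{CARD_W} W card: slot covers {SLOT_W} W, needs {plugs} 6-pin plug(s)")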
Diedel
D2X Master
Posts: 5278
Joined: Thu Nov 05, 1998 12:01 pm

Post by Diedel »

Well, maybe NVidia is catching up, but they still have to wipe out the pretty bad record they have gathered over the last few years. It's easy to lose a good reputation and hard to recover it. And as I wrote, it's not just about raw power and keeping power plug specs. I am more of a quality freak than a speed freak, and at 60+ fps you can afford quality instead of squeezing 3% higher framerates out of applications.

Edit:

You may want to read this interesting article about the GF 8800. I have to admit that it looks like NVidia did (almost) everything right with the 8800. Interestingly enough, the article states that it is a completely new design, in development since 2002; hence it doesn't suffer from design restrictions carried over from NVidia's previous gfx hardware generations. The article also states that ATI has already developed unified shader gfx hardware for the Xbox 360, so you can bet your behind they will come up with something good too when they think the time has come.

Krom,

that article has different numbers on power consumption than you do.

Edit 2:

Another thought after having read another article about the GF 8800 ... they said ATI is clearly beaten (true) and NVidia will rule the Christmas sales ... probably also true, and shedding light on the most common human disease, which is stupidity. The card is super fast, but do you really need that much speed? DX10 and Vista will not be available before the end of January '07, and DX10 games even later. I bet many ppl will buy the 8800 - and have a uselessly powerful piece of hardware in their computers. It was the same with PS 3.0 hardware: there were next to no games utilizing it. Bottom line: ATI's PS 2.0 hardware was just as fast as NVidia's PS 3.0 hardware. The 8800 seems to really be a good gfx card - with features nobody may need, because nobody can use them. Tsk. Btw, NVidia recommends using at least 1600x1200 with the 8800, which makes a huge display almost mandatory. I wonder if NVidia shouldn't start to develop some money printing hardware, too ... :P
Krom
DBB Database Master
Posts: 16138
Joined: Sun Nov 29, 1998 3:01 am
Location: Camping the energy center. BTW, did you know you can have up to 100 characters in this location box?

Post by Krom »

I happen to have a 1600x1200 display myself...

Here is the big useful article on the G80:
http://www.anandtech.com/video/showdoc.aspx?i=2870

Anyway, the article I read on power consumption was the DailyTech quick preview of the card that was posted a few days ago; more detailed articles are around now. But as far as performance per watt goes, the G80 wins every time, which is impressive when you consider it has 681 million transistors! The power consumption with two of them in SLI was around 500 watts at the outlet, which is a lot, but not the end-of-the-world projections some people were making. The OCZ 1000 watt PSU they used for testing was overkill. :P
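For a sense of scale, combining that transistor count with the 20x20mm die size Grendel posted above (just the arithmetic, nothing more):

    # Rough die figures from the numbers quoted in this thread.
    DIE_SIDE_MM = 20              # 20 x 20 mm die size, per Grendel's post above
    TRANSISTORS = 681_000_000     # G80 transistor count

    area_mm2 = DIE_SIDE_MM ** 2                # 400 mm^2
    density  = TRANSISTORS / area_mm2 / 1e6    # ~1.7 million transistors per mm^2
    print(f"{area_mm2} mm^2 die, ~{density:.1f}M transistors/mm^2")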

The real kicker that makes this card semi-useless, though, has everything to do with your display. Only a CRT can keep up with the framerates these cards spew out at anything under 2560x1600; pairing such an FPS monster with an L-for-Latency LCD is pretty much as stupid as it gets. But don't hold your breath for nVidia's marketing team to tell you that, because they won't. :P
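One way to put numbers on the display bottleneck (the 60Hz refresh and the framerates below are illustrative assumptions, not measurements from any of the linked articles):

    # How many rendered frames a fixed-refresh panel can actually show.
    # The 60 Hz refresh and the sample framerates are assumptions for illustration.
    def frames_shown(rendered_fps, refresh_hz=60):
        # The panel can never display more frames per second than it refreshes.
        return min(rendered_fps, refresh_hz)

    for fps in (60, 120, 200):
        print(f"{fps} fps rendered -> {frames_shown(fps)} fps actually displayed")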
anandtech wrote:Back when Sony announced the specifications of the PlayStation 3, everyone asked if it meant the end of PC gaming. After all Cell looked very strong and NVIDIA's RSX GPU had tremendous power. We asked NVIDIA how long it would take until we saw a GPU faster than the RSX. Their answer: by the time the PS3 ships. So congratulations to NVIDIA for making the PS3 obsolete before it ever shipped, as G80 is truly a beast.
IIRC, RSX is actually a modified G70, so it is no surprise the G80 blows it out of the water. :P

But to address your final point, the people that buy this kind of hardware are either 1: extremely wealthy with money to burn on it, or 2: someone who has been planning/saving up for a long time and will use this to build a system that they will keep for several years. It has been a long time since any current GPU hardware was made with "current" games in mind. You buy this with the idea of not having to upgrade your hardware to play any game you want for the next 12-24 months.

edit: Oh yeah, ATI has been using unified shaders for quite a while, but this is the first time nVidia has, so that is what all the unified shader fuss is about right now: mostly nVidia's marketing team getting geared up.
fliptw
DBB DemiGod
Posts: 6459
Joined: Sat Oct 24, 1998 2:01 am
Location: Calgary Alberta Canada

Post by fliptw »

Diedel wrote:flip,

the current installer has a bug causing it to complain about missing kernel header files.
Odd. It didn't when I installed mine.

Wait, I compiled a kernel before installing it; since the installer compiles a kernel interface, it would need the kernel headers.
Grendel
3d Pro Master
Posts: 4390
Joined: Mon Oct 28, 2002 3:01 am
Location: Corvallis OR, USA

Post by Grendel »

Diedel
D2X Master
Posts: 5278
Joined: Thu Nov 05, 1998 12:01 pm

Post by Diedel »

Krom,

the 8800 is so efficient because the unified shaders allow dynamic load balancing of the shader units depending on what work the GPU primarily has to do at any given moment; i.e., no idling silicon waiting for some other pipeline stage to finish.
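A toy model of that load-balancing argument (the unit counts and the workload mix below are made up purely for illustration; they are not G80's real configuration):

    # Fixed vertex/pixel split vs. a unified pool of the same total size.
    # All numbers here are made up to illustrate the load-balancing idea.
    def busy_fixed(vertex_jobs, pixel_jobs, vertex_units=8, pixel_units=24):
        # Dedicated units can only pick up their own kind of work.
        return min(vertex_jobs, vertex_units) + min(pixel_jobs, pixel_units)

    def busy_unified(vertex_jobs, pixel_jobs, total_units=32):
        # Any unit can pick up whatever work is pending.
        return min(vertex_jobs + pixel_jobs, total_units)

    v, p = 30, 6   # a geometry-heavy moment: lots of vertex work, little pixel work
    print("fixed split :", busy_fixed(v, p), "of 32 units busy")    # 14
    print("unified pool:", busy_unified(v, p), "of 32 units busy")  # 32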

Well, it's always like this: there are some forerunners at the front lines of technical development, and the rest come a year or two later.
fliptw wrote:
Diedel wrote:flip,

the current installer has a bug causing it to complain about missing kernel header files.
odd. it didn't when I installed mine.

wait, I compiled a kernel before installing it, as the installer compiles a kernel interface, it would need the kernel headers.
I wrote that I had installed the most recent kernel sources.
Grendel
3d Pro Master
Posts: 4390
Joined: Mon Oct 28, 2002 3:01 am
Location: Corvallis OR, USA

Post by Grendel »

Good article about the card.
fliptw
DBB DemiGod
Posts: 6459
Joined: Sat Oct 24, 1998 2:01 am
Location: Calgary Alberta Canada

Post by fliptw »

Diedel wrote: I wrote that I had installed the most recent kernel sources.
and compiled and installed a kernel from them?

read the special fancy instructions for SUSE?
Diedel
D2X Master
Posts: 5278
Joined: Thu Nov 05, 1998 12:01 pm

Post by Diedel »

The installer only needs the header files, and it's a confirmed bug. See the forums NVidia links to for Linux stuff.
Grendel
3d Pro Master
Posts: 4390
Joined: Mon Oct 28, 2002 3:01 am
Location: Corvallis OR, USA

Post by Grendel »

Here's what I'm waiting for :)
Aggressor Prime
DBB Captain
Posts: 763
Joined: Wed Feb 05, 2003 3:01 am
Location: USA

Post by Aggressor Prime »

It doesn't look so good when we compare the G80 to the R600. Check this and this out.

R600 has 2x the power and 2GB of RAM; however, I'm sure GX2 with a die shrink can fix that. :P

But if you want a DX10-ready card with uber performance, nVidia is delivering now.
Diedel
D2X Master
Posts: 5278
Joined: Thu Nov 05, 1998 12:01 pm

Post by Diedel »

I am just asking myself what the point of getting a DX10-ready card now is if DX10 itself isn't ready yet ... :roll:

Grendel,

if I were you, I'd consider an Innovatek cooler, as they are very slim (they are built so that they do not block the adjacent slot). I have been using them for years now (X800, X1800 coolers).

I would also wait for ATI's DX10 hardware. I already told ppl here that they'd certainly have something up their sleeve, given their experience with the Xbox 360 silicon.
Grendel
3d Pro Master
Posts: 4390
Joined: Mon Oct 28, 2002 3:01 am
Location: Corvallis OR, USA

Post by Grendel »

Heh,

I used Innovatek coolers for my 5950U and two 6800U cards. I switched to Koolance for my 7900GTX tho, since Innovatek blocks are hard to get in the US (and expensive as hell). The Koolance block works very well: it idles around 26C and loads at ~35C. The Silenx block looks pretty nice, it could well have been made by Innovatek :) All the blocks mentioned are low profile, BTW.
Diedel
D2X Master
Posts: 5278
Joined: Thu Nov 05, 1998 12:01 pm

Post by Diedel »

5950U, 2x6800U, 7900GTX .. :shock:

Are you rich? I am single ... :mrgreen:
Grendel
3d Pro Master
Posts: 4390
Joined: Mon Oct 28, 2002 3:01 am
Location: Corvallis OR, USA

Post by Grendel »

I'm sure my GF would be amused by that proposal :P

Want to buy the 5950U w/ dual-side blocks? Or a 6800U block? I can offer them to you for a good price :mrgreen: