Tuesday, May 25th 2010

NVIDIA Releases GeForce GTX 480M, World's Fastest Notebook GPU

NVIDIA made its GeForce GTX 480M GPU official today. The DirectX 11 compliant GPU is based on the GF100 core and packs all the features of its desktop counterparts, such as distributed hardware tessellation, next-generation CUDA, and DirectCompute 5.0. The GF100 core here has a configuration similar to that of the desktop GeForce GTX 465: three of its four graphics processing clusters (GPCs) and 11 of its 16 streaming multiprocessors (SMs) are enabled, giving a CUDA core count of 352. To reduce the overall board footprint, the GPU makes do with a 256-bit wide GDDR5 memory interface, holding 2 GB of memory.
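The 352-core figure follows directly from GF100's building blocks; here is a minimal sketch of the arithmetic in Python (the 32-cores-per-SM count is GF100's published per-SM configuration, not stated in the article):

```python
# CUDA core count of the cut-down GF100 described above.
CORES_PER_SM = 32      # each GF100 streaming multiprocessor (SM) holds 32 CUDA cores
FULL_GF100_SMS = 16    # a full GF100 carries 16 SMs across 4 GPCs
enabled_sms = 11       # the GTX 480M, like the GTX 465, enables 11 of them

cuda_cores = enabled_sms * CORES_PER_SM
print(cuda_cores)      # -> 352, matching the stated core count
```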

To keep within the electrical constraints of notebooks, the GTX 480M uses much lower clock speeds than any desktop product based on GF100. The core is clocked at 425 MHz, the shader domain at 850 MHz, and the memory at 600 MHz (real) or 2.40 GHz (effective), which gives a memory bandwidth of 76.8 GB/s. As mentioned earlier, the GTX 480M packs the full feature-set of its desktop counterparts, including support for NVIDIA 3D Vision, PureVideo HD, PhysX, and CUDA. Two of these GPUs can pair in 2-way SLI; constraints of the notebook form-factor won't allow any more, anyway. The GPU is now available for notebook manufacturers to plan their designs around. NVIDIA claims the GTX 480M to be the fastest notebook GPU. It finds direct competition in the ATI Mobility Radeon HD 5870, which is based on the 800 stream processor-laden Juniper core.
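The 76.8 GB/s bandwidth figure can be checked from the quoted clocks; a minimal sketch of that arithmetic in Python, where the only assumption beyond the article's figures is GDDR5's standard four data transfers per real clock:

```python
# GTX 480M memory bandwidth, computed from the clocks quoted above.
real_clock_hz = 600e6                 # 600 MHz real memory clock
effective_rate = real_clock_hz * 4    # GDDR5: 4 transfers per real clock -> 2.4 GT/s
bus_width_bits = 256                  # memory interface width

bandwidth_gb_s = effective_rate * bus_width_bits / 8 / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")   # -> 76.8 GB/s, matching the article
```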

55 Comments on NVIDIA Releases GeForce GTX 480M, World's Fastest Notebook GPU

#26
$ReaPeR$
it is a matter of point of view, no point in arguing. i would go with the most powerful combo if i was in the market for one, Nvidia or ATI.
#27
a_ump
i'd personally go for an HD 5870 Mobility, simply bc it's around HD 5750 performance, and imo that's plenty of power for even today's games if you're just wanting some gaming on the go here and there.
#28
Benetanegia
MrMilli: I don't know why i bother but here goes:

What i don't understand is that you keep talking about battery life while i didn't say one word about battery life.

1- www.nordichardware.com/en/component/content/article/21856-eurocom-reveals-geforce-gtx-480m-100w-monster.html
Google?

2- 100w isn't that much? For a laptop?
Never mind the battery life of a laptop having a GTX 480M ... how is one supposed to cool one of these in a laptop? This is going to result in a very fat and loud laptop.
Consider this:
- RM 5870 = 50W
- Mobile Core i7 = 45W
- 65% of laptops are supplied with a 60W adapter
- 30% with a 90W adapter
- 4% with a 125W adapter
- and i guess 1% has a 150W adapter
Now take an i7 + 480M + rest of the system and you're way over 150W. That's just not normal for a laptop! You talk about 50W like it's nothing. Maybe not for a PC but most laptops use less than 50W in total. Even if you compare 100W vs 150W, that will result in a huge difference in the cooling system, especially in a laptop.

But i agree, Optimus is a really nice piece of technology. I like it. But ATI has similar technology already available to manufacturers. It is up to the laptop manufacturers to implement the feature, just like Optimus. www.amd.com/us/products/technologies/switchable-graphics/Pages/switchable-graphics.aspx
The thing is, a laptop is designed with the max. TDP in mind. So while Optimus will extend battery life while you're not gaming, it won't make the laptop less bulky.

3- While it's true that nVidia can charge the same amount of money for a 480M as ATI does for a 5870, it just won't. We're talking about silicon 3x the size. The 480M will be more expensive, period. For reference, Eurocom will charge €285 extra for the 480M over the 280M (or the 5870, as it costs as much as the 280M). Do you think 10-15% extra performance is worth €285?
Concerning the PCB, that has nothing to do with nVidia. Making a board with 128-bit tracing is cheaper than a board with 256-bit tracing. Fewer layers = less money.

4- Well ATI has something called 'Ultra Low Power State' for Crossfire systems. This works only on the 5000 generation cards. When idling, it takes the slave card to an even lower power state than normal idling. This feature is enabled from driver 10.2 onwards. This does alleviate the problem a bit. The second chip adds <10W while idling (if i counted correctly, that puts it at around 22W for both without ATI Switchable Graphics).

But one can't argue about this: when you're in the market for one of these huge, ugly, useless and heavy gaming laptops, you're better off with a Radeon Mobility 5870 CF.
You pay the same amount, the laptop is as heavy, it's as loud, it's as bulky, it will eat batteries like there's no tomorrow, but it will be 30-40% faster.
I know one thing, i won't buy either.
You don't know why you bother? I don't know why I bother. lol

1- Thanks. I DID google it up and actually spent 15 minutes googling with no match, not even close.

2- You can come up with the most stupid of arguments. Only 1% come with 150W adapters? And how many laptops do you think will come with a 480M? Or a 5870M in single or CF configuration? Seriously, what a waste of time writing all that, when it's obvious these configs are not for the 99% of people looking for a laptop. Besides, there are many GTX 285M SLI laptops, and that setup consumes way more than a 480M. Let's use your numbers: yep, we are talking about 150W just for the 2 GPUs. Two 4870s weren't much better at 65W each, for a total of 130W in CF. How come? That's impossible! I mean MrMilli said it's impossible and we know that MrMilli can't be wrong...

3- It's much more than a 15% performance difference, to begin with. And what I think about that doesn't matter at all. I would never buy such a thing, and I would never buy a desktop part for more than 250-300 euro either. That doesn't mean I will go on forums saying what other people should or should not get, or what makes or doesn't make sense.

And again, the die size doesn't matter at all and neither does the PCB cost. What matters is how much Nvidia makes thanks to Fermi, including Tesla and Quadro, and the fact of the matter is that Fermi is a much better all-around chip to fill all those markets, and that saves them a lot of money in R&D while allowing them to make tons of money. When it comes to Nvidia's strategy, GeForces are there just to ensure mass production of the chips; it's in the professional markets where they make the real money (profits*). That doesn't mean the desktop parts are useless either. Making a chip (and then a reference card) for every one of those markets would cost much, much more than risking a little bit on one chip. With G80 they made the same risky decision and it was a complete success, with GT200 they did the same and we can call that one a tie, and with Fermi it didn't go very well, but even then they have a killer chip that sells like hotcakes in three markets and is much faster than the competition. Compare it to failures in the past like the FX line or R600. R600 especially was much bigger than the competition, consumed a lot more, was noisy, and was 30% slower than the 8800 GTX, let alone the 8800 Ultra. Fermi at least is 15% faster instead of being so much slower, and it kills in those apps and features it was designed for: DX11, tessellation, GPGPU...

4- No matter the TDP, cooling two GPUs inside a laptop is much more difficult than cooling one. Your statement is completely false: a CF setup would consume much more, and be much hotter, than a single 480M.

* Low-end/mainstream/performance parts make up the majority of the revenue, but 80%+ of profits come from the professional and high-end markets. In order to make any compute-intensive chip you need a lot of money put into R&D and manufacturing, and mainstream is there to ensure some revenue stability and ensure mass production of chips (lowering the prices). They are not there to make money. It's like how Mercedes-Benz, Audi, Jaguar, Lexus, etc. started making more mainstream cars, but their business is still luxury cars.
#29
ToTTenTranz
Benetanegia: here's a review on the 5870M:

www.tomshardware.com/reviews/avadirect-clevo-w860cu-mobility-radeon-hd-5870,2615-11.html

^^30 minutes of battery life. GTX285M wins hands down with 45 mins.
The results in that review look fishy, to say the least.
Just go to the Mobility HD5870 and GTX285M pages in notebookcheck and see how the HD5870 is always vastly superior to the GTX285M in game tests, even though the GTX285M is always paired with a faster CPU.
www.notebookcheck.net/ATI-Mobility-Radeon-HD-5870.23073.0.html
www.notebookcheck.net/NVIDIA-GeForce-GTX-285M.23822.0.html



Maybe it's that weird Clevo driver they used for the HD5870. At tomshardware, don't they know that ATI drivers are now unified for desktop and notebook parts?
The review dates from May 13th. Why didn't they just use the Catalyst 10.4?

It could also explain that seemingly huge GPU consumption while idling (note the difference in power draw between CPU load and GPU load).
#31
ToTTenTranz
mdsx1950: Gotta say. 480M is cool.
Really? I think it's gonna be quite hot.
#32
Steevo
mdsx1950: Gotta say. 480M is cool.
When submerged in a tank of LN.
#33
entropy13
Steevo: When submerged in a tank of LN.
Along with the whole laptop? :laugh:
#34
mdsx1950
ToTTenTranz: Really? I think it's gonna be quite hot.
Lol.

It's good for a hardcore gamer who travels a lot and needs portability.
#35
Atom_Anti
Benetanegia:

See the gaming performance summary at the bottom of this review:
www.notebookcheck.net/Review-MSI-GX640-i5447LW7P-Notebook.30776.0.html

Performance rating
1st place: Radeon HD 5870 @ 43.75 fps
2nd place: Radeon HD 5850 @ 38.63 fps
3rd place: GeForce GTX 285M @ 35.00 fps
4th place: GeForce GTX 260M @ 32.50 fps

The Mobility HD 5850 with GDDR5 also beats the GeForce GTX 285M. The most important thing there is the huge power consumption difference between them.

And do not talk stupid, because a GeForce GTX 480M in a laptop is the stupidest thing that I have ever heard!
#36
Imsochobo
So nvidia made the fastest one?
Doubt it.
The 4870X2 still holds its ground very well: 2x800 shaders @ 650 MHz.
And it's faster than a GTX 470... and the 470 is faster than the 480M...
#37
LiveOrDie
TVman: so you can play games on your lap and roast nuts at the same time = AWESOME
yer but it will be your nuts roasting :laugh:
#38
Roph
Who needs contraception, just play some Crysis with one of these beforehand :laugh:
#39
Fourstaff
We have yet to see a real-world implementation of the chip in a laptop chassis, so this is all speculation.
Fourstaff: Move along guys, nothing to see here.
Hillbeast, I want to game and my home is on the other side of the world (yes, it takes 17 hours to fly back home). I can, A) buy and lug a desktop whenever I fly back, or B) buy 2 computers, 1 here and 1 there, or C) buy a gaming laptop. Doesn't take a genius to work out which one is the best option, does it? :rolleyes:
#40
entropy13
Fourstaff: We have yet to see a real-world implementation of the chip in a laptop chassis, so this is all speculation. […]
Buy 2 computers AND a gaming laptop. :cool:
#41
caleb
entropy13: Buy 2 computers AND a gaming laptop. :cool:
Option 3: move to where you work. Cheaper, safer, more comfortable. Not to mention more time to play.

Price aside, how can any gamer say no to playing BFBC2 on a WC?
Imagine the camping realism :)
#42
HalfAHertz
Fourstaff: We have yet to see a real-world implementation of the chip in a laptop chassis, so this is all speculation. […]
Agreed. The only obvious solution is to implant a scaled-down SoC system directly into your cranium! That way you can game on the go anytime, anywhere! Them new ARM cores look pretty good now, eh?
#43
MrMilli
Benetanegia: You don't know why you bother? I don't know why I bother. lol […]
I won't bother. Seems everybody else has a different opinion than you.

But i strongly urge you to read better. Just like you did with my first post, you misread this one again. Seems to me that you do read the words but don't get the message.
You're a CAD freelancer ... maybe that's why you have a different opinion about this matter. I work for an SI and see things from a different perspective.

Anand's view: www.anandtech.com/show/3740/nvidia-announces-gtx-480m
#44
Unregistered
Are these gonna melt the BGA connections too, like the 8400M GPUs did?
#45
Lipton
Makes me wonder... are there any "laptops" without a battery at all? Like a desktop...

I mean, how many of the desktop replacement users use their 'laptops' without the power cord being plugged in? I guess it's good for when the power goes out...but that's about the only reason I can think of.
#47
Benetanegia
MrMilli: I won't bother. Seems everybody else has a different opinion than you. […]
Jarred's view is like mine:
We expect with the GPU running constantly battery life is likely to be in the ~1 hour range, but that's typical of the gaming notebook market. Far more important than battery life is the performance characteristics, and here's where NVIDIA should trump AMD's Radeon Mobility HD 5870 quite handily.
And in one of the comments:
As for the 100W... well, companies put Core i7 desktop parts into large notebooks, so why not a 100W GPU? You can fit two 285M chips in an 18.4" notebook as well, so a single 100W GPU isn't that much worse. Obviously, these aren't even remotely targeting the portable market but are after people that don't mind large, heavy, but fast notebooks.
If you actually read what you linked, they expect as much as a 75% performance increase over the previous generation. I didn't expect as much, but I do expect more than the ~25% increase you were saying (15% on top of the 10% difference between the 285M and RM 5870), something like 40% average.

They do not mention 100W being impossible or going to fry anything. 100W is high, but nowhere near anything to worry about when the laptop is plugged into the wall. 285M SLI is much harder to cool and power, and many laptop designs exist with this config. Mobility Radeon 4870 X2 laptops exist too, and they too consume much more than a 480M...

BTW I forgot to mention this in the last post, but what percentage of laptops come with 180W AC adapters, according to you? You know they exist, right?
#48
Fourstaff
caleb: Option 3: move to where you work. Cheaper, safer, more comfortable. Not to mention more time to play.
That is unfortunately not available, I am still a student :)

And no, I will NOT buy ARM based processors for my main laptop, can't stand the "games" available on ARM.

Edit: actually, this graphics card might be teh bomb. If you see the laptop with a GTX 480M as a lightweight lanbox with an attached screen, that is.
#49
Bjorn_Of_Iceland
TVman: so you can play games on your lap and roast nuts at the same time = AWESOME
ooh, yum yum.
#50
Fourstaff
Bjorn_Of_Iceland: ooh, yum yum.
I wouldn't feast on my own nuts, thx