
AMD's First 28 nm GPUs in December

btarunr

Editor & Senior Moderator
Staff member
It looks like AMD will have the symbolic achievement of launching its first GPUs built on the new 28 nanometer process within 2011 itself. Sources told Heise.de that AMD is working towards launching some of its planned 28 nm GPUs in the second week of December 2011; one of these sources specifically named December 6. It is not known at this point whether the launched GPU will be for the mobile (notebook) or desktop (graphics card) platform, or even whether it will use the VLIW4 or the so-called 'NextGen' compute architecture.

Another source reinforced the theory that the launch will be more about symbolism than volume manufacturing for sales. It's likely that only a small number of these GPUs will be manufactured, just about enough to send to OEMs for their qualification, and perhaps to the media for published performance testing. We expect these to be lower-end or mid-range GPUs, and since AMD is reserving the NextGen compute architecture for only the high-end part, they will most likely use VLIW4.



 
They could use some $$$ after how BD turned out.

IMO that could be the reason why they're releasing early, in December, and not in Q2 of 2012. People want to upgrade at Christmas, as retailers are more likely to do discounts, but we shall see.
 
HD 6750
HD 6770
HD 6790

HD 6830??
HD 6850
HD 6870
HD 6890??

HD 6930??
HD 6950
HD 6970
HD 6990

I'm guessing that if it isn't a 7XXX series card or one for the mobile market, they will call it one of these.
 
I got lost in all this :/
So the mid-range GPUs are still VLIW4. Does that mean they are die-shrinks of Northern Islands?
 
Well, if both fronts are launching their lowbies first, then getting out with 28 nm will do nothing more than add bragging rights. They'll be low-end, so it's not like they can be labeled a must-have for the new games coming out.
 
I really don't understand how they're able to produce a 28 nm GPU but no CPU at the same node.
 
They'll release something simpler at 28 nm first. I don't mind; it was a good idea to release the HD 4770 at 40 nm first so that the process had time to mature for bigger cards.
 
Well, if both fronts are launching their lowbies first, then getting out with 28 nm will do nothing more than add bragging rights. They'll be low-end, so it's not like they can be labeled a must-have for the new games coming out.
Better power consumption and heat output matter much more for mid-range cards than top-end.
 
This will be interesting.

Cue "Nvidia also" moment in 3...2...1...
 
Better power consumption and heat output matter much more for mid-range cards than top-end.

It matters to top-end cards too! 6970s and 6990s are room heaters when you have a pair of them. The only difference is that at the high end, high power consumption and heat output become more acceptable.

To put it into clearer context...

If AMD's new CPU consumed more power and created more heat but pulled well ahead of Intel's current processors, people would be more forgiving. Higher power consumption is bad, but if it performs like a boss, enthusiasts will overlook it a little.
 
It matters to top-end cards too! 6970s and 6990s are room heaters when you have a pair of them. The only difference is that at the high end, high power consumption and heat output become more acceptable.

To put it into clearer context...

If AMD's new CPU consumed more power and created more heat but pulled well ahead of Intel's current processors, people would be more forgiving. Higher power consumption is bad, but if it performs like a boss, enthusiasts will overlook it a little.
Notice that I said more?
I didn't say it doesn't matter.
 
I got lost in all this :/
So the mid-range GPUs are still VLIW4. Does that mean they are die-shrinks of Northern Islands?

The top-end GPU will use NextGen while every other GPU in the series will use VLIW4.

In the current generation, the top-end GPU uses VLIW4, while every other GPU uses VLIW5.

NextGen is more advanced than VLIW4, which is in turn more advanced than VLIW5.
 
They need to push these out since Bulldozer was such a failure. Let's hope we can get good availability very early in 2012.
 
NextGen is more advanced than VLIW4, which is in turn more advanced than VLIW5.

AFAIK VLIW5 (simple shaders + advanced shaders) is more advanced than VLIW4 (all shaders equal), as the number suggests. Because of this simplicity, VLIW4 scales better, and is therefore used in high-end cards.
 
AFAIK VLIW5 (simple shaders + advanced shaders) is more advanced than VLIW4 (all shaders equal), as the number suggests. Because of this simplicity, VLIW4 scales better, and is therefore used in high-end cards.

No, VLIW5 is the older shader configuration that's die-space and power inefficient compared to VLIW4.
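For anyone keeping count, the difference shows up directly in the shader arithmetic. A quick sketch (the SIMD/cluster figures below are the public specs for Cypress and Cayman; the helper function is just for illustration):

```python
# Illustration of VLIW5 vs. VLIW4 stream-processor counts.
# A VLIW5 cluster pairs 4 simple ALUs with 1 fatter "special function"
# ALU that often sits idle; a VLIW4 cluster uses 4 equal ALUs, which is
# why VLIW4 is more die-space and power efficient per unit of work.

def shader_count(simd_engines, clusters_per_simd, alus_per_cluster):
    """Total stream processors = SIMD engines x clusters x ALUs per cluster."""
    return simd_engines * clusters_per_simd * alus_per_cluster

# Cypress (HD 5870, VLIW5): 20 SIMD engines x 16 clusters x 5 ALUs
cypress = shader_count(20, 16, 5)  # 1600 stream processors

# Cayman (HD 6970, VLIW4): 24 SIMD engines x 16 clusters x 4 ALUs
cayman = shader_count(24, 16, 4)   # 1536 stream processors

print(cypress, cayman)
```

Cayman ends up with fewer shaders on paper than Cypress, yet performs better, because its equal-width ALUs are easier for the compiler to keep busy.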
 
Either way, they should find a way to make shaders more effective. As of now, they're about to die-shrink and pump up the shader count, like they did with the Cypress GPU. nVidia's architecture is still more effective per shader; no fanboy post. 28 nm is kickass though, I've been waiting for it quite a while. These should be helluva efficient. And again, it's going to take long for nVidia to shrink their large GPUs. They haven't even readied up a new GPU yet.
 
Personally, I'm not very optimistic about those "next gen" products. Let's just hope that AMD's GPU division is still more ATi than AMD. At least ATi didn't promise what it couldn't deliver (much?).
 
Back when AMD was ATi, they trumped nVidia multiple times, over and over again with the 9800/X800/X1900 cards. I remember when I had an X1950 XT, and how much better X1950 XTX cards were than their nVidia equivalents (7950 GX2...). Back then, nVidia didn't release an XP driver for months to prepare for the 8800 cards and Vista, which blew again. X1950 cards were not only more powerful, but they were also built on a technologically more advanced GPU. Ever since AMD bought out ATi, they've been underwhelming in performance, with the exception of the HD 5000 cards.
 
The successor of VLIW4 for the upper-tier cards is the scalar+vector "Graphics Core Next" architecture:

[image: Graphics Core Next architecture block diagram]
 
Well, if both fronts are launching their lowbies first, then getting out with 28 nm will do nothing more than add bragging rights. They'll be low-end, so it's not like they can be labeled a must-have for the new games coming out.

I would very much like to see an HD 5770/6770 or 6790 manufactured at 28 nm, hopefully with halved power consumption. I own the Sapphire Vapor-X version of the HD 5770, which supposedly has very efficient cooling, is more on the quiet side of the spectrum, and whatnot, yet I still find it irritatingly loud at times. I want quiet cards with reasonable performance :)
 
I would very much like to see an HD 5770/6770 or 6790 manufactured at 28 nm, hopefully with halved power consumption. I own the Sapphire Vapor-X version of the HD 5770, which supposedly has very efficient cooling, is more on the quiet side of the spectrum, and whatnot, yet I still find it irritatingly loud at times. I want quiet cards with reasonable performance :)

You're best off with an HD 5850, which is a great card. No point in bothering with low-end; you won't get decent performance out of those.
 
This is some good redemption; very interesting.

They could use some $$$ after how BD turned out.

How is it that BD turned out bad? It performs 15-17% lower than the 2600K while being on average 20% cheaper than the 2600K, in CPU cost alone. Then you have the cost of the board, and even a $100 motherboard would allow mild overclocking, providing more than enough.
 
How is it that BD turned out bad? It performs 15-17% lower than the 2600K while being on average 20% cheaper than the 2600K, in CPU cost alone. Then you have the cost of the board, and even a $100 motherboard would allow mild overclocking, providing more than enough.

Performance per watt, and OC'ing limitations due to it. The wisest thing to do right now is to wait for them to bin up BD after the GF-TSMC transaction. Not much point in jumping straight on BD. It's better to wait for the chip to mature. ;)
 
Performance per Watt - and OC'ing limitations due to it

But yet no one slammed the i7 9xx series when Intel did it? It is a 4c/8t chip just like it, only with what I call "hardware hyperthreading". I am sick of all the "BD is crap, I am mad at AMD" hype. I am not saying NOT to get an Intel chip, nor am I being fanboyish here. My intentions are not to be taken as derogatory.
 