
New 28 nm Graphics Cards To Be 45 Percent Faster And Overclock Like Never Before?

Sure, but history tells us that the majority of them are bad :(

To be fair, the majority of games are bad; no one remembers them in a year's time because nobody bought them.

It's not as if there haven't been any awful games/games with bad coding that were aimed at the PC.
 
To be fair, the majority of games are bad; no one remembers them in a year's time because nobody bought them.

It's not as if there haven't been any awful games/games with bad coding that were aimed at the PC.

That's completely right. I must admit that when a game is made for consoles and then ported to PC, I start looking at it with an overly critical eye :)
 
Can we get the same or lower performance and good games developed for PC instead?

Saints Row 3 will get a PC-only version; hopefully GTA5 will have PC-specific features. Games like LA Noire crap on PC gaming; they should be ashamed.

Where's another Crysis when we need one?
 
That's completely right. I must admit that when a game is made for consoles and then ported to PC, I start looking at it with an overly critical eye :)

And you're not wrong, because the games often show all the same limitations as their console counterparts. In fact, Crysis 2 is a great example of this; it was well and truly fixed by patches later on, which made it more of a graphics benchmark than a crappy console port.
 
That's completely right. I must admit that when a game is made for consoles and then ported to PC, I start looking at it with an overly critical eye :)

That would be somewhere around 95% of today's popular games, I would think.
 
A game being a port doesn't matter, really. It DOES matter when that port is BROKEN, functionality-wise.

I agree; it all depends on how much a game gets polished.

Dead Space was a port, and it was one of my favorite games to play.
 
:shadedshu Developers should code on PC first and port to consoles later. Sigh, I still wish the damned console manufacturers had never gone with those stupid PowerPC/Cell designs to begin with. If Intel had gotten at least one of the major consoles onto x86 (or x86_64), it would've made a HUGE difference for us, trust me. ;)
 
was well and truly fixed by patches later on, which made it more of a graphics benchmark than a crappy console port.

No it wasn't, man; they just tacked on a load of badly optimised DX11 stuff.

Crysis 2 as a PC game was a failure, IMO.

(For example, flat planks of wood being tessellated, and the sea being tessellated under the city even though you couldn't see it.)
 
No it wasn't, man; they just tacked on a load of badly optimised DX11 stuff.

Crysis 2 as a PC game was a failure, IMO.

(For example, flat planks of wood being tessellated, and the sea being tessellated under the city even though you couldn't see it.)

Oh. :ohwell: I'd seen it reported that it was, but I'd also seen comments such as yours, now that I think about it. I don't have the game, so I can't talk from experience about this.
 
I would be happy with a 10% performance improvement and 35% lower power. And silent. Sounds good.
 
:shadedshu Developers should code on PC first and port to consoles later. Sigh, I still wish the damned console manufacturers had never gone with those stupid PowerPC/Cell designs to begin with. If Intel had gotten at least one of the major consoles onto x86 (or x86_64), it would've made a HUGE difference for us, trust me. ;)

Haha, they did: the original Xbox. I'm sure you already know this, and I know what you mean. Apparently they are moving away from PowerPC, last I heard. Not only that, the new Wii U is getting a similar architecture based off the RV700.
 
I'll be skeptical for now... still can't get the Bulldozer release out of my head.
 
I can see large overclocks coming. I remember barely being able to OC cards in the past; now, with a lower nm process, I have almost a 20% OC on my 6950. It's just crazy.

Hey, I overclocked my X800 GT (475 MHz core) to almost 600 MHz! :P

But yeah, it is better.
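
For anyone curious, the percentages work out roughly like this; treat it as a back-of-the-envelope sketch, since the 475 MHz stock clock comes from the post above while the 6950's 800 MHz reference clock is my assumption:

```python
# Quick sketch of the overclock arithmetic. The X800 GT's 475 MHz stock clock
# comes from the post above; the HD 6950's 800 MHz reference clock is assumed
# here for illustration, so the exact figures are approximate.

def oc_percent(stock_mhz: float, oc_mhz: float) -> float:
    """Overclock expressed as a percentage over the stock clock."""
    return (oc_mhz / stock_mhz - 1.0) * 100.0

print(f"X800 GT: 475 -> 600 MHz = {oc_percent(475, 600):.1f}% OC")  # ~26.3%
print(f"HD 6950: 800 -> 960 MHz = {oc_percent(800, 960):.1f}% OC")  # ~20.0%
```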

I'll be skeptical for now... still can't get the Bulldozer release out of my head.

AMD CPUs are very different from their GPUs. They have not missed on the GPU side in quite some time.
 
The 6850 I have has low power consumption and runs at cool temps. If they can keep this while adding more performance, they are going in the right direction. I don't really think another Fermi is a good direction.
 
The 6850 I have has low power consumption and runs at cool temps. If they can keep this while adding more performance, they are going in the right direction. I don't really think another Fermi is a good direction.

I strongly agree with you :toast:
Fermi is such an ugly chip at its high end, especially in its first incarnation (GF100). The 480 left me with such a bad impression: high temperatures (I can live with those) and insane power draw...
Luckily, Nvidia tried to fix some of the issues with the GF110 on the 5xx series, but you still can't expect a lot of efficiency from such a big chip :shadedshu

I seriously hope that the next GPU generation puts efficiency before performance, because a single chip shouldn't draw that much power :banghead:
 
I strongly agree with you :toast:
Fermi is such an ugly chip at its high end, especially in its first incarnation (GF100). The 480 left me with such a bad impression: high temperatures (I can live with those) and insane power draw...
Luckily, Nvidia tried to fix some of the issues with the GF110 on the 5xx series, but you still can't expect a lot of efficiency from such a big chip :shadedshu

I seriously hope that the next GPU generation puts efficiency before performance, because a single chip shouldn't draw that much power :banghead:

The GF110 might pull a lot of power, but as a GTX 580 owner, I can tell you it's not a problem in practice. You just need a decent PSU and a large case for it. The card performs beautifully.

Therefore, if the next gen pulls a similar amount of power with better performance, it won't be a problem either.
 
Noticed they said they could be made cheaper too... think that will translate into savings for us? :roll:
 
I know, but I was scared thinking about four of them inside the same case, so I went ATI for this round, and I have to say that with proper ventilation it's not an issue at all.
I'm probably overreacting to what the GTX 480 was, and the GTX 580 isn't that bad :toast:

Noticed they said they could be made cheaper too... think that will translate into savings for us? :roll:

I wouldn't bet on that :D
 
I know, but I was scared thinking about four of them inside the same case, so I went ATI for this round, and I have to say that with proper ventilation it's not an issue at all.
I'm probably overreacting to what the GTX 480 was, and the GTX 580 isn't that bad :toast:

4 GTX 580s? Now there's a challenge! lol. I know how to build a PC, and I wouldn't build one with four without some advice first.

In fact, the 580 actually uses the same amount of power as a 480, because Nvidia used the efficiency improvements to up the rendering power.
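
Rough back-of-the-envelope on that point: if power stays about the same while performance goes up, all of the efficiency gain shows up as extra rendering power. The numbers below are ballpark assumptions, not official specs:

```python
# Rough perf-per-watt sketch with ballpark numbers (not official specs):
# assume both cards draw roughly the same power under load and the GTX 580
# renders about 15% faster than the GTX 480.

cards = {
    "GTX 480": {"power_w": 250, "relative_perf": 1.00},
    "GTX 580": {"power_w": 250, "relative_perf": 1.15},
}

for name, c in cards.items():
    print(f"{name}: {c['relative_perf'] / c['power_w']:.4f} relative perf per watt")
# Same power draw but ~15% more performance -> ~15% better perf/W.
```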
 
4 GTX 580s? Now there's a challenge! lol. I know how to build a PC, and I wouldn't build one with four without some advice first.

In fact, the 580 actually uses the same amount of power as a 480, because Nvidia used the efficiency improvements to up the rendering power.

That's one of the main reasons I went for 2x HD 6990, and the other was that my motherboard only has 3 PCI-E slots (x16/x16/x8), so I had to arrange things in such a way that I could manage without buying another X58 motherboard...
I shouldn't really complain about power draw, I know :p

EDIT: I'm also lazy as hell with cable management; if I were to post a screenshot of my rig, many of you would slap me for the wiring XD
 
That's one of the main reasons I went for 2x HD 6990, and the other was that my motherboard only has 3 PCI-E slots (x16/x16/x8), so I had to arrange things in such a way that I could manage without buying another X58 motherboard...
I shouldn't really complain about power draw, I know :p
No, you're not afraid of power draw, that's for sure. ;)

EDIT: I'm also lazy as hell with cable management; if I were to post a screenshot of my rig, many of you would slap me for the wiring XD
Oh, go on, just do it! :laugh: :toast:
 
The title is somewhat misleading, so let me make this abundantly clear:
45% higher clock rates DOES NOT EQUAL 45% higher performance; there are tons of other factors that go into the final performance of a chip.

That being said, this is interesting news. Coupled with what I hear about AMD using XDR2 memory, this is shaping up to be an interesting graphics card generation.
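
As a toy illustration of why clocks alone don't translate one-to-one into frame rates, here's a crude Amdahl-style sketch where only part of the frame time is assumed to scale with core clock (the 60/40 split is purely an assumption):

```python
# Toy model: only the core-clock-bound fraction of a frame speeds up with a
# clock increase; the rest (memory bandwidth, CPU, etc.) is assumed fixed.
# The 60/40 split is an illustrative assumption, not a measured figure.

def overall_speedup(clock_gain: float, core_bound: float = 0.6) -> float:
    """Amdahl-style estimate of total speedup from a core clock increase."""
    core_time = core_bound / (1.0 + clock_gain)  # core-bound work gets faster
    other_time = 1.0 - core_bound                # everything else stays put
    return 1.0 / (core_time + other_time)

gain = overall_speedup(0.45)  # 45% higher clocks
print(f"~{(gain - 1.0) * 100:.0f}% faster overall, not 45%")  # roughly 23%
```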

A 28 nm die: more space, more cores, more transistors, greater efficiency.

45% higher clock speeds plus a slight increase in shaders/transistors, and you're going to have 45%+ performance.

No one can really predict, but if you can cram more onto that 28 nm die, still clock like a bastard, and still keep it cool, then it's a good prediction, which is usually true with TSMC.

There are usually just manufacturing and design problems that sometimes make smaller processes a little sketchy compared to their roomier brethren.
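
For rough context on the "more transistors" part: in the ideal case, density scales with the inverse square of the feature size, so a 40 nm to 28 nm shrink buys about 2x the transistors in the same area. A minimal sketch of that arithmetic (ideal scaling only; real processes rarely hit it):

```python
# Ideal-case density scaling from a process shrink: density ~ 1 / (feature size)^2.
# Real-world scaling is usually worse, so treat this as an upper bound.

def ideal_density_gain(old_nm: float, new_nm: float) -> float:
    """How many times more transistors fit in the same area, ideally."""
    return (old_nm / new_nm) ** 2

print(f"40 nm -> 28 nm: ~{ideal_density_gain(40, 28):.2f}x transistor density")  # ~2.04x
```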
 
What's really insane is the quality of code in PC games. Graphics aren't really getting better, so why do games get more demanding? They're letting the excess performance of current cards make up for how half-assed their code is.
 
That's completely right. I must admit that when a game is made for consoles and then ported to PC, I start looking at it with an overly critical eye :)
Well, then you will also be looking at Elder Scrolls V along with most games, because that was also made for the console and has been ported to the PC with enhancements. I don't know why the buffoons would design a game around a bloody 6-to-7-year-old game console when in fact the PC is superior in every single aspect of gaming and bloody up to date.

I expect the HD 7970 to be between 2x and 3x faster than the HD 6970 as a 28 nm next-gen design, along with skipping the problematic 32 nm process.
 
A 28 nm die: more space, more cores, more transistors, greater efficiency.

45% higher clock speeds plus a slight increase in shaders/transistors, and you're going to have 45%+ performance.

No one can really predict, but if you can cram more onto that 28 nm die, still clock like a bastard, and still keep it cool, then it's a good prediction, which is usually true with TSMC.

There are usually just manufacturing and design problems that sometimes make smaller processes a little sketchy compared to their roomier brethren.

It says 45% FASTER, not 45% more powerful :p So technically the title is fine.
 