Monday, November 29th 2010

GeForce GTX 570 Specifications, Release Date Leaked

On a day when plenty of other, more geopolitical things are being leaked, our friends from Sweden have turned up the specifications sheet of NVIDIA's upcoming high-end graphics accelerator, the GeForce GTX 570. The GTX 570 will serve as a deputy to the company's recently released GeForce GTX 580: it is based on the same GF110 graphics processor, with 480 CUDA cores enabled and a 320-bit wide GDDR5 memory interface holding 1280 MB of memory. At this point it looks like a cross between the GTX 480 and the GTX 470, but the equation changes when clock speeds step in: 732 MHz core, 1464 MHz CUDA cores, and 950 MHz (3800 MHz effective) memory, churning out 152 GB/s of memory bandwidth. Power consumption is rated at 225 W. NVIDIA's upcoming accelerator is slated for release on 7th December, just five days ahead of AMD's Radeon HD 6900 series launch.
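For reference, the quoted 152 GB/s figure follows directly from the leaked memory numbers. Below is a quick back-of-the-envelope check (a minimal Python sketch assuming the standard GDDR5 arithmetic of effective transfer rate times bus width; variable names are illustrative):

# Rough check of the leaked GTX 570 memory bandwidth figure.
# Assumes the usual GDDR5 arithmetic: effective transfer rate x bus width / 8.
memory_clock_mhz = 950                       # real clock; GDDR5 is quad-pumped
effective_rate_mtps = memory_clock_mhz * 4   # 3800 MT/s effective
bus_width_bits = 320
bandwidth_gb_s = effective_rate_mtps * bus_width_bits / 8 / 1000
print(f"{bandwidth_gb_s:.0f} GB/s")          # prints "152 GB/s", matching the leak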
Source: Sweclockers

44 Comments on GeForce GTX 570 Specifications, Release Date Leaked

#26
[H]@RD5TUFF
sneekypeet: Please let's not use "trolls" or try to stir anyone into a flame war ;)
Sometimes one must fight fire with fire when logic proves ineffective. ;) :toast:
mlee49: The 470s in SLI were phenomenal; these will really be amazing in SLI.
We shall see; a leaked press release is hardly grounds for performance speculation.
#27
Over_Lord
News Editor
Ahh, it's a GTX480 cut down by memory bandwidth
#28
KaelMaelstrom
thunderising: Ahh, it's a GTX480 cut down by memory bandwidth
With the new architecture, of course.
#29
wolf
Performance Enthusiast
KaelMaelstrom: With the new architecture, of course.
New architecture? :laugh: Merely tweaked, my friend.
#30
jamsbong
It is good news all round to have NVIDIA back. However, I do see a significant slowdown in graphics development.

Once upon a time, we were able to see a 3x performance increase per year. Now we are only seeing a mere 1.2x per year or thereabouts.

It is getting less exciting to see new product launches from both NVIDIA and ATI. I guess that means I have 2.5 years to save up for a 3x-performance card (compared to a 2.5-year-old card).
#31
KashunatoR
jamsbong: It is good news all round to have NVIDIA back. However, I do see a significant slowdown in graphics development.

Once upon a time, we were able to see a 3x performance increase per year. Now we are only seeing a mere 1.2x per year or thereabouts.

It is getting less exciting to see new product launches from both NVIDIA and ATI. I guess that means I have 2.5 years to save up for a 3x-performance card (compared to a 2.5-year-old card).
You may be right, but you should take into consideration that they barely release video-card-stressing games anymore, so even the 1.2-1.3x performance increase is overkill. I wish the PS4 came out already.
#32
TheMailMan78
Big Member
KashunatoR: You may be right, but you should take into consideration that they barely release video-card-stressing games anymore, so even the 1.2-1.3x performance increase is overkill. I wish the PS4 came out already.
This is a man with intelligence.


Edit: WTF happened to my Santa avatar!
#33
Easy Rhino
Linux Advocate
KashunatoR: You may be right, but you should take into consideration that they barely release video-card-stressing games anymore, so even the 1.2-1.3x performance increase is overkill. I wish the PS4 came out already.
I have been saying the same thing for the past two years. There was a time when spending $500 on a graphics card was actually worth it if you wanted to game at higher resolutions with all the fixins. Now it really is about having a large e-peen.
#34
OneCool
wolf: New architecture? :laugh: Merely tweaked, my friend.
Yeah, a different sticker on the cooler! :laugh:
#35
newtekie1
Semi-Retired Folder
Easy Rhino: I have been saying the same thing for the past two years. There was a time when spending $500 on a graphics card was actually worth it if you wanted to game at higher resolutions with all the fixins. Now it really is about having a large e-peen.
Definitely. I mean, if you look at it, there are still G80 setups out there that at this point are four years old, and they are still rocking modern games at reasonable settings.
wolf: New architecture? :laugh: Merely tweaked, my friend.
It amazes me how many people believe that some small tweaks are enough to warrant being called a new architecture or a new generation. But I guess we can blame ATi for leading the industry in that direction...
#36
TheMailMan78
Big Member
newtekie1: Definitely. I mean, if you look at it, there are still G80 setups out there that at this point are four years old, and they are still rocking modern games at reasonable settings.

It amazes me how many people believe that some small tweaks are enough to warrant being called a new architecture. But I guess we can blame ATi for leading the industry in that direction...
Oh, but when I say it I'm just nuts?
#37
Easy Rhino
Linux Advocate
newtekie1: Definitely. I mean, if you look at it, there are still G80 setups out there that at this point are four years old, and they are still rocking modern games at reasonable settings.

It amazes me how many people believe that some small tweaks are enough to warrant being called a new architecture. But I guess we can blame ATi for leading the industry in that direction...
LOL! You know that is going to cause problems!
#38
newtekie1
Semi-Retired Folder
TheMailMan78: Oh, but when I say it I'm just nuts?
Not to me.
Easy Rhino: LOL! You know that is going to cause problems!
Which part? It doesn't really matter, because they are both true, and you know it.
#39
Easy Rhino
Linux Advocate
newtekie1: Not to me.

Which part? It doesn't really matter, because they are both true, and you know it.
Of course I know it. It's just that I can hear the trolls coming.
#40
newtekie1
Semi-Retired Folder
Easy Rhino: Of course I know it. It's just that I can hear the trolls coming.
Yeah, me too. :laugh:

I could put it in ATi terms too: my cousin is still running an HD 3850 and has no problem with modern games either, and that card is three years old at this point.

A lot of modern games, especially the console ports, run on ancient hardware because they just aren't built to push graphics cards; they are held back by the console versions. And when you run them at 720p like the consoles do, they take next to nothing to run.
#41
wolf
Performance Enthusiast
newtekie1: Definitely. I mean, if you look at it, there are still G80 setups out there that at this point are four years old, and they are still rocking modern games at reasonable settings.
I can imagine that; an 8800 GTX or Ultra SLI setup, while overpriced at the time, would still kick a fair amount of ass in modern games, and even a single card would fare really well.
newtekie1: It amazes me how many people believe that some small tweaks are enough to warrant being called a new architecture or a new generation. But I guess we can blame ATi for leading the industry in that direction...
Marketing, marketing, marketing...
#42
xBruce88x
newtekie1: Yeah, me too. :laugh:

I could put it in ATi terms too: my cousin is still running an HD 3850 and has no problem with modern games either, and that card is three years old at this point.

A lot of modern games, especially the console ports, run on ancient hardware because they just aren't built to push graphics cards; they are held back by the console versions. And when you run them at 720p like the consoles do, they take next to nothing to run.
I'm sure Crysis 2 will be an exception to that :laugh:
#43
newconroer
TheMailMan78: Shopping for a GPU is a perpetual kick to the nuts.
Echoing what Tekie was saying: the good news is that the G80/GT200 series and the Radeon 4000 series were potent enough that they're still viable today.

I'm still rocking the 4870 X2 (with a puny E6600, mind you) and not finding myself 'struggling' to enjoy the latest games.

At this rate, unless some crazy advancements are made in gaming software demands, I can skip the Radeon 6000 series as well.

Thinking about it, I may just quit gaming until real-time vector drawing is possible in full 3D (that might take a while, though...)
#44
T3kl0rd
newtekie1: Yeah, me too. :laugh:

I could put it in ATi terms too: my cousin is still running an HD 3850 and has no problem with modern games either, and that card is three years old at this point.

A lot of modern games, especially the console ports, run on ancient hardware because they just aren't built to push graphics cards; they are held back by the console versions. And when you run them at 720p like the consoles do, they take next to nothing to run.
What is acceptable performance in games to some is subpar to others. I'm disappointed if a game isn't locked at 60 FPS on my machine; others are satisfied with a constant average over 30 FPS. I still have my HD 3850 AGP, and I can't run much of anything with it and the P4 it was paired with, except for really old games.

What amazes me is that the recommended specs for console ports keep going up and up as the years go by, for ports from the exact same five-year-old console! Compare Oblivion, which runs on one core with some HT from another, to Dragon Age: Origins or GTA IV, which need four cores to run properly. Of course, Oblivion really needs at least two full cores to run properly without the world constantly popping up in front of you, preferably four. I know Elder Scrolls V will right the wrongs with the world popping up. All Xbox 360 ports! Not to mention more powerful GPU requirements. :confused: