
Next-gen NVIDIA GeForce Specs Unveiled, Part 2

That logic has its flaws:

1- The HD3870 X2 is not twice as fast as the HD3870, so we could guess the HD4870 X2 won't be either.

2- The HD3870 X2's price is more than twice that of the HD3870; will this be different? That puts the HD4870 X2 well above $600. Probably it will be cheaper, but it won't launch until August, and we don't know where prices will be by then...
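A rough back-of-the-envelope version of that extrapolation, just to make the arithmetic explicit (every number here is a placeholder assumption, not a confirmed price or benchmark):

```python
# Back-of-the-envelope extrapolation of dual-GPU pricing and scaling.
# All inputs are placeholder assumptions, not confirmed figures.

single_card_price = 300.0   # assumed HD4870 launch price (USD)
single_card_perf = 1.0      # normalised single-GPU performance
x2_price_ratio = 2.1        # HD3870 X2 launched at a bit more than 2x the HD3870
x2_scaling = 1.7            # dual GPUs rarely double performance (CrossFire overhead)

x2_price = single_card_price * x2_price_ratio
x2_perf = single_card_perf * x2_scaling

print(f"Estimated X2 price: ${x2_price:.0f}")   # ~$630 under these assumptions
ratio = (x2_perf / x2_price) / (single_card_perf / single_card_price)
print(f"Perf per dollar vs. the single card: {ratio:.2f}x")   # ~0.81x
```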

I'm not even comparing the R600 to the R700. They're two completely different GPUs. The 4870 X2 is also using a different method of combining the cores, so who knows what new mess they will present to us. Anyway, they said a long time ago that the thing wouldn't be over $499. How accurate that is now, with the price of GDDR5 going up, is up in the air as well. But for now I'll hold them to it until I hear differently.

The only way the two monsters won't compete is if the 4870 X2 is priced that low. I guess it'll make sense to ATi, but there's still no way that 280 or 260 is getting my money. I'd stick with the GT model if I had to get one, but I'm sure people will still flock to snatch them up. Hopefully it will be worth it to them.
 
But, all in all, the new ATI R700 series is a brand new design, not a rehash of an older GPU the way the R600 was of the R500. I think we might be in for a surprise with the new ATI GPUs. Probably just wishful thinking on my part, though :ohwell:

Anyhow, someone correct me if I'm wrong, but I thought I remembered hearing that nVidia's new G200 is another rehash of G92/G80? :confused:

WTF? R600 a rehash of R500? :confused::confused: What are you talking about?

Well, well, it does share many things with the R500 (the XB360 GPU), unified shaders for the most part. But nothing with the R520 and R580, the X1800 and X1900 respectively.

R600 was a completely new PC GPU architecture, RV670 was a rehash of it, and RV770 is again a rehash of RV670 (kinda). GT200 is also (kind of) a rehash of G92, indeed.
 
Actually, the R600 was the brand new design that took a few hints from the R580. The R700 is based on the R600, but with multiple design fixes and improvements. Can't say if the G200 is a G92 rehash, but it could be.

People still bought the HD2900, didn't they? That had awful power consumption and so on. People will still buy the GTX280. Personally, I won't. I don't have that kind of money. I also don't want a mini necular reactor in my case. That's one of the reasons I never got a 2900 Pro or GT. Also, my roommates have had problems with Nvidia drivers that are totally wack (not that I haven't had a few minor issues with ATI, usually the CCC won't install right or work).
 
We can't compare these new GPUs on paper - not until we start seeing the hardware itself on shelves, coupled with real-world gaming benchmarks.

Sure, nVidia's new G200 series does appear a lot better on paper than ATI's new R700 series - but the R700 series has been in design for a long time; we were hearing rumors of it before the R600 was even released, although it shares a lot of its design with the R600.

Just for comparison, the last time we saw a brand spankin new GPU design from ATI was the R500 series - and the cream of the crop there was the 1800/1900 series of cards.

nVidia's 7800/7900 cards looked better on paper than the 1800/1900 series did, but which cards came out of the gate better, and stayed ahead of the competition?


It's very possible we might see that again with these new generations of cards - we'll have to wait and see.
 
Actually, the R600 was the brand new design that took a few hints from the R580. The R700 is based on the R600, but with multiple design fixes and improvements. Can't say if the G200 is a G92 rehash, but it could be.

People still bought the HD2900, didn't they? That had awful power consumption and so on. People will still buy the GTX280. Personally, I won't. I don't have that kind of money. I also don't want a mini necular reactor in my case. That's one of the reasons I never got a 2900 Pro or GT. Also, my roommates have had problems with Nvidia drivers that are totally wack (not that I haven't had a few minor issues with ATI, usually the CCC won't install right or work).

If you're going for Bushisms, it's nucular :P

And my drivers have always been great, along with all the PCs I build (and I use the "betas"). They probably don't know how to uninstall and reinstall properly.
 
Haha I'll say I was (I'm just a horrible speller).

They didn't until I showed them how. That fixed one of their problems, but one of them has a 7950GX2 still, and he was missing resolutions and had some problem with Age of Conan. That could be thrown out because it's a 7950GX2 though I suppose.
 
Just for comparison, the last time we saw a brand spankin new GPU design from ATI was the R500 series - and the cream of the crop there was the 1800/1900 series of cards.

Again, "the last time we saw a brand spankin new GPU design from ATI was the R600"

Also, GT200, AKA G100, AKA G90, has been in development for as much time as, if not more than, the R700.

BTW: the R580 looked a lot better on paper than Nvidia's card; the R520 didn't. Indeed, that's why the X1900 was so much better and the X1800 was not.
 
Haha I'll say I was (I'm just a horrible speller).

They didn't until I showed them how. That fixed one of their problems, but one of them has a 7950GX2 still, and he was missing resolutions and had some problem with Age of Conan. That could be thrown out because it's a 7950GX2 though I suppose.

Yeah, the gx2 kinda sucks haha.
 
The prices aren't justified, and no way in hell are they justified if they are only 3-6% better than ATI's high-end offerings. This is ridiculous.

As an aside, they aren't even increasing clocks, shaders, and memory that much from what they have now.
 
The prices aren't justified, and no way in hell are they justified if they are only 3-6% better than ATI's high-end offerings. This is ridiculous.

As an aside, they aren't even increasing clocks, shaders, and memory that much from what they have now.

nVidia's just swinging a magic wand. You can look at the ROPs or TMUs, but how in the hell is that going to translate when the core/shader/memory clocks are so low - especially the memory. So there's a billion transistors, but if they're doing half the work they could be doing, then that's a sweet bottleneck. Maybe they just figured we'll be voltmodding it anyway; otherwise the 280 won't even come close to the 4870 X2. $600!? :shadedshu
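For what it's worth, the memory side of that argument is simple arithmetic: peak bandwidth is just the effective transfer rate times the bus width. A minimal sketch (the clocks and bus widths below are illustrative placeholders in the ballpark of the rumored specs, not confirmed numbers):

```python
# Peak memory bandwidth = effective transfer rate (MT/s) * bus width (bits) / 8,
# giving MB/s; divide by 1000 for GB/s. All figures below are illustrative
# assumptions, not confirmed specs.

def bandwidth_gbs(effective_mts: float, bus_width_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return effective_mts * bus_width_bits / 8 / 1000

# Hypothetical GTX 280-style setup: GDDR3 at ~2200 MT/s effective on a 512-bit bus
print(bandwidth_gbs(2200, 512))   # ~140.8 GB/s

# Hypothetical HD 4870-style setup: GDDR5 at ~3600 MT/s effective on a 256-bit bus
print(bandwidth_gbs(3600, 256))   # ~115.2 GB/s
```

A wide bus can make up for slower memory (and vice versa), which is why the memory clock alone doesn't tell you whether those billion transistors will actually be fed.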
 
nVidia's just swinging a magic wand. You can look at the ROPs or TMUs, but how in the hell is that going to translate when the core/shader/memory clocks are so low - especially the memory. So there's a billion transistors, but if they're doing half the work they could be doing, then that's a sweet bottleneck. Maybe they just figured we'll be voltmodding it anyway; otherwise the 280 won't even come close to the 4870 X2. $600!? :shadedshu

Exactly, at that price it'd damn well better be the cure for cancer, or else it's just a massive waste of money.
 
Exactly, at that price it'd damn well better be the cure for cancer, or else it's just a massive waste of money.

lmao, that's well funny :laugh:
 
They are on crack at that price, get real.

I bought my 8800GTX stock at $649 because it was the most futureproof card at the time I built my new system. This was November 2006. I still have this first revision of the 8800GTX and it's the best card I ever had. A bit expensive, but it was well worth it.
 
I don't know how accurate this information is, especially since we haven't seen any other reports of it from any reputable sources. I think I'll wait to believe the specs until the cards are actually out, but the shader speeds on these cards seem a little low to me. I know there are more of them, but to drop the speeds that much seems insane.

See, I couldn't agree more; I hate it when manufacturers slack off on things just because the competition isn't there.

Sure, Nvidia cards are faster than ATI cards right now, but that doesn't mean you can slack off on specs. I mean, the shader clock was one of ATI's biggest problems, so they've upped it on this round of cards, and how does Nvidia respond? By lowering the clocks on theirs? That doesn't make any sense to me.

I wonder if they're sandbagging on purpose, like they did with the 7800 GTX 256MB, which got pwned by the X1800 XT; then Nvidia launched a few 7800 GTX 512MB cards with uber clocks out of the blue.

So maybe there's a G200 Ultra chip sitting in Nvidia's labs waiting to crush the RV770. Time will tell.
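On the clock question specifically, raw shader throughput scales with both the clock and the number of units, so a lower clock isn't automatically a regression. A rough sketch of that arithmetic (the shader counts, clocks, and the 3-FLOPs-per-clock figure are illustrative assumptions, not figures from this thread):

```python
# Rough theoretical shader throughput: shaders * shader clock (MHz) * FLOPs per
# shader per clock, divided by 1000 to give GFLOPS. All numbers below are
# illustrative assumptions, not confirmed specs.

def gflops(shader_count: int, shader_clock_mhz: float, flops_per_clock: int = 3) -> float:
    return shader_count * shader_clock_mhz * flops_per_clock / 1000

# A G92-class part: 128 shaders at a high ~1688 MHz shader clock
print(gflops(128, 1688))   # ~648 GFLOPS

# A hypothetical GT200-class part: 240 shaders at a lower ~1296 MHz shader clock
print(gflops(240, 1296))   # ~933 GFLOPS -- more units can outweigh a lower clock
```

Whether the real cards hit anything like those numbers is exactly what the benchmarks will have to show.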
 
This happens with EVERY recent card. The price will be mad high, just like when the 8800 GT first came out... eventually, however, it dropped. I hope this happens to the GTX260 and GTX280 too. And I like the naming, instead of 10800 GTX...
 
Exactly, at that price it'd damn well better be the cure for cancer, or else it's just a massive waste of money.

A massive waste of money is oil sheiks flying around in their private jets with gold-plated bathroom sinks while the rest of their countrymen are struggling every day to survive. :mad:
 
A massive waste of money is oil sheiks flying around in their private jets with their gold-plated bathroom sinks. :mad:

Gold plated? I heard they were solid gold :D
 
heh, & I bet all the heatsinks in their comps are made of diamond :respect:

They know how to use comps? All I ever see them doing is wrecking brand new imports trying to drift :roll:
 
I thought everyone had diamond heatsinks? Intel gave me a special diamond IHS with a diamond ultra 120 that has air chilled to absolute 0 blowing over it. My temperatures are an even 3 kelvin at load.
 
I thought everyone had diamond heatsinks? Intel gave me a special diamond IHS with a diamond ultra 120 that has air chilled to absolute 0 blowing over it. My temperatures are an even 3 kelvin at load.

I thought the PCB was non-conductive platinum and the circuitry was diamond?
 
You must be joking. I have been playing FPS games for as long as I can remember, and I can easily distinguish between 30 fps and 60 fps. Hell, I can even notice the difference between 60 and 80. Perhaps you use a flat panel? I still use a CRT; nothing beats it for serious gaming. That's why, when playing Counter-Strike with my clan a few years ago, I bought the best money could buy at that moment.


I don't understand what you mean: playable at 50 and unplayable under 60?

All games at 30 fps are playable; the human eye doesn't see frames above 25 fps. Maybe you have some special implants (from Nvidia) and you have reached the 50 fps target as a minimum. Good for you, and good for us who play above 30 and under 60 and are happy with it.
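For reference, the frame-time arithmetic behind this back-and-forth, as a minimal sketch (the fps values are just examples):

```python
# Frame time is simply 1000 ms / fps; the gap between 30 and 60 fps is a full
# 16.7 ms per frame, which is why many players report being able to tell them apart.

for fps in (25, 30, 60, 80):
    print(f"{fps:3d} fps -> {1000 / fps:5.1f} ms per frame")
# 25 fps -> 40.0 ms, 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 80 fps -> 12.5 ms
```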
 
You must be joking. I have been playing FPS games for as long as I can remember, and I can easily distinguish between 30 fps and 60 fps. Hell, I can even notice the difference between 60 and 80. Perhaps you use a flat panel? I still use a CRT; nothing beats it for serious gaming. That's why, when playing Counter-Strike with my clan a few years ago, I bought the best money could buy at that moment.

30 and 60 you could probably notice, but I'm afraid your monitor may start to lag behind when you reach 60-80. Personally, I've always found LCDs to give a crisper image than CRTs, as long as you're running at native resolution.
 
30 and 60 you could probably notice, but I'm afraid your monitor may start to lag behind when you reach 60-80. Personally, I've always found LCDs to give a crisper image than CRTs, as long as you're running at native resolution.

Yeah, anything over 60 fps doesn't matter anyway, because your monitor won't keep up. CRTs are stuck at 60Hz as a refresh rate, while LCDs can go to 75Hz, but OSes & drivers keep them at 60Hz anyway. Just because a card is running a game at 90+ fps doesn't mean you're seeing those frames, unless you're superhuman :rolleyes:
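The cap being described is just min(rendered fps, refresh rate): frames rendered above the refresh are never shown. A minimal sketch (the refresh rates here are examples, not claims about any particular monitor):

```python
# With a fixed refresh rate (or vsync on), the frames you actually see per second
# are capped at the monitor's refresh: displayed = min(rendered_fps, refresh_hz).
# The numbers below are examples only.

def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
    return min(rendered_fps, refresh_hz)

print(displayed_fps(90, 60))   # 60 -- the extra 30 rendered frames are never shown
print(displayed_fps(45, 75))   # 45 -- below the refresh rate, the card is the limit
```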
 