Monday, January 16th 2012

NVIDIA Kepler Inbound for March-April

NVIDIA's next high-performance GPU, which will attempt to restore the company's performance leadership in the consumer graphics segment under the GeForce Kepler family, is slated for a March-April launch, according to a VR-Zone report. At CES 2012, NVIDIA focused on its Tegra product line, demonstrating its applications in smartphones, tablets, and even automobiles, but chose to avoid talking about its GeForce family.

According to the report, NVIDIA wants to avoid a paper launch like AMD's, which launched its Radeon HD 7970 on December 22, 2011, but whose market availability was non-existent until two weeks later, on January 9, 2012. NVIDIA wants to ensure that the GeForce product based on its new high-performance GPU will be available in the market on launch day, pinned somewhere between late March and early April. On April 8, Intel will launch its third-generation Core processor family.

Source: VR-Zone

82 Comments on NVIDIA Kepler Inbound for March-April

#1
ViperXTR
^lol indeed, not to mention, the HD 6930 :D
#2
RejZoR
If nothing else, you'll have the option to choose. Because right now, only AMD has the highest-end card you can imagine. And the tech behind it is pretty good.

I just wish they'd offer more software goodies, like FXAA and SMAA algorithms along with the already available MLAA. And I wish NVIDIA would do the same. I know such features can be injected into any game, but with that you're risking a VAC ban, and well, I don't want that.
#3
entropy13
RejZoR said:
I just wish they'd offer more software goodies. Like FXAA and SMAA algorithms along with already available MLAA. And i wish NVIDIA would do the same. I know there are such features that can be injected in any game but with it you're risking VAC ban and well, i don't want that.
FXAA was made by Nvidia. Instead of locking it to their hardware (i.e. Nvidia-only cards, like PhysX), however, they left it to the game developers to add it to the game.
#4
Crap Daddy
It will be the same thing all over again. AMD comes first with a new generation faster than NV's previous one, and then NV comes with its new gen, which in turn is faster than AMD's. Lately we are used to having a few good months in between where AMD is alone on the market. The sad part is that the software (i.e. games) is not keeping pace, so mainstream cards from two generations back can handle any game out there well (depending on the settings, of course) and probably will do so with this year's games.
#5
Aceman.au
As much as I want to go with Nvidia... I'll probably end up going with AMD again because of the price-to-performance ratio... Nvidia can really kill AMD if they bring out the 7XX series really cheap.

Alas it won't happen as they are greedy and want profit here and now.
#6
thematrix606
Crap Daddy said:
It will be the same thing all over again. AMD comes first with a new generation faster than NV's previous one, and then NV comes with its new gen, which in turn is faster than AMD's. Lately we are used to having a few good months in between where AMD is alone on the market. The sad part is that the software (i.e. games) is not keeping pace, so mainstream cards from two generations back can handle any game out there well (depending on the settings, of course) and probably will do so with this year's games.
Unless you own a current 27"+ monitor or a three-monitor setup (which is a bit more expensive, and just not too handy for most), your two-gen-old card will barely breathe, on the VRAM alone. My HD 5870 barely handles medium/high settings at 1920x1200 without any AA in all the latest titles (BF3, Crysis 2, Skyrim). Remember, BF3 uses 1.3-1.5 GB of VRAM alone on Ultra. Wait until we get our high-end game releases this year going beyond that, aside from all the ports, that is, of course xD

And before you bash and say "oh no, 27" is not mainstream, 22" is" remember this: a few years ago a 19" monitor cost you 350 euro and that was mainstream, now your 27" monitor costs 270 euro.
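To put some numbers behind the VRAM point: the raw render targets alone are easy to sanity-check with back-of-the-envelope math. A minimal sketch (a single 32-bit color buffer only; the resolutions are just examples, and real games pile textures, shadow maps, and intermediate buffers on top, which is where figures like 1.3-1.5 GB come from):

```python
# Rough render-target arithmetic: one 32-bit color buffer at a given
# resolution, optionally multiplied by an MSAA sample count.
def framebuffer_mib(width, height, bytes_per_pixel=4, msaa=1):
    return width * height * bytes_per_pixel * msaa / (1024 ** 2)

for w, h in [(1680, 1050), (1920, 1200), (2560, 1440)]:
    print(f"{w}x{h}: {framebuffer_mib(w, h):6.1f} MiB  "
          f"(4x MSAA: {framebuffer_mib(w, h, msaa=4):6.1f} MiB)")
```

Even 2560x1440 with 4x MSAA is only ~56 MiB for one color target, so the bulk of the 1 GB+ figures comes from textures and the many extra buffers modern renderers keep around.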
#7
radrok
Why don't we talk about resolution instead of monitor size?
You don't use monitor size to talk about performance; it makes no sense. A 23.6-incher can have the same resolution as a 24"/27" etc... The performance of a system is the same on different monitor sizes with the same resolution.

As for the news, it's not too late and I'm glad Nvidia announced, I really wanna jump ship this time :)
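To put numbers on the resolution-versus-size point: at a fixed resolution, all that changes with the diagonal is pixel density, not GPU load. A quick sketch of the standard pixels-per-inch formula (the panel sizes below are just illustrative):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Same 1920x1080 workload for the GPU, very different sharpness:
print(round(ppi(1920, 1080, 23.6), 1))  # 93.3
print(round(ppi(1920, 1080, 27.0), 1))  # 81.6
# 2560x1440 at 27" is a much heavier workload AND denser:
print(round(ppi(2560, 1440, 27.0), 1))  # 108.8
```

The GPU renders pixels, not inches, so the 23.6" and 27" 1080p panels above cost exactly the same to drive.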
#8
ViperXTR
FXAA is already in the drivers; you can use NVIDIA Inspector to fiddle with it, along with custom ambient occlusion, forcing AA through flags, and a frame limiter (an alternative to vsync with no input lag).
#9
the54thvoid
radrok said:
Why don't we talk about resolution and not monitor size?
You don't use monitor size to talk about performance, makes no sense, a 23,6 incher can have the same resolution as a 24"/27" etc... The performance of a system is the same on different inch size monitors with the same resolution.

As for the news, it's not too late and I'm glad Nvidia announced, I really wanna change ship this time :)
^^ This.
And before you bash and say "oh no, 27" is not mainstream, 22" is" remember this: a few years ago a 19" monitor cost you 350 euro and that was mainstream, now your 27" monitor costs 270 euro.
Show me a 27" monitor that ISN'T 1080 vertical pixels for less than £400.

This http://www.overclockers.co.uk/showproduct.php?prodid=MO-011-HO&groupid=17&catid=1120&subcat= is the cheapest 2560x1440 27" monitor I can find. After that you're heading towards £500.
#10
thematrix606
the54thvoid said:
Show me a 27" monitor that ISNT 1080 vertical pixels for less than £400.

This http://www.overclockers.co.uk/showproduct.php?prodid=MO-011-HO&groupid=17&catid=1120&subcat= is the cheapest 2560x1440 res 27" monitor i can find. After that you're heading towards £500.
Why you would want 2560 @ 27" is beyond me, I wouldn't want that on anything less than 30".

And the reason 1080 is the standard is because of HD tv, not much we can do there.

So looking for a non-1080 27" is just stupid. 27" 1080 monitors start @ 216 euro here.

On topic: bring on the leaked benchmarks!!! :rockout:
#11
ensabrenoir
Wow... for once AMD took over a thread and it didn't go all negative :) Really looking forward to this launch, as I've only used ATI/AMD and want to give the green team a try. AMD's latest is a beast though. Hope Nvidia brings their A game at nosebleed prices.
#12
KooKKiK
So are we talking about the mainstream "Kepler", GK104,

or "The Kepler", GK100, the mother of all nVidia GPUs???
#13
Frick
Fishfaced Nincompoop
thematrix606 said:
Why you would want 2560 @ 27" is beyond me, I wouldn't want that on anything less than 30".

And the reason 1080 is the standard is because of HD tv, not much we can do there.

So looking for a non 1080 27" is just stupid. 27" 1080 start @ 216 euro here.

On topic: bring on the leaked benchmarks!!! :rockout:
IMO 27" is perfect for 2560 x xxxx. 24" is perfect for 1920x1080, 22" good for 1680x1050. IMO of course.
#14
Crap Daddy
thematrix606 said:
Unless you own a current 27"+ monitor or a three-monitor setup (which is a bit more expensive, and just not too handy for most), your two-gen-old card will barely breathe, on the VRAM alone. My HD 5870 barely handles medium/high settings at 1920x1200 without any AA in all the latest titles (BF3, Crysis 2, Skyrim). Remember, BF3 uses 1.3-1.5 GB of VRAM alone on Ultra. Wait until we get our high-end game releases this year going beyond that, aside from all the ports, that is, of course xD

And before you bash and say "oh no, 27" is not mainstream, 22" is" remember this: a few years ago a 19" monitor cost you 350 euro and that was mainstream, now your 27" monitor costs 270 euro.
I see your point but I think the most common res is 1080p and even lower. A GTX460 can play those mentioned games very well, again depending on the settings (maybe no high-res pack for Crysis 2, Med-high no AA for BF3). Of course if you want all the eye-candy you'll need something much better than that but in the age of mobile entertainment devices it seems that there's really no need for hundreds of dollars worth of GPU. Sad but true. I do hope that Nvidia will not throw in the towel in this business to concentrate on mobile computing.
#15
thematrix606
Crap Daddy said:
I see your point but I think the most common res is 1080p and even lower. A GTX460 can play those mentioned games very well, again depending on the settings (maybe no high-res pack for Crysis 2, Med-high no AA for BF3). Of course if you want all the eye-candy you'll need something much better than that but in the age of mobile entertainment devices it seems that there's really no need for hundreds of dollars worth of GPU. Sad but true. I do hope that Nvidia will not throw in the towel in this business to concentrate on mobile computing.
You know, a few months ago I was sitting in front of my PC, playing games on my tablet, totally ignoring my PC for hours... but now I don't even touch it anymore, staring at my desktop 27" with Crysis 2 in its near-full glory. Using a mouse and keyboard. Jaw-dropping (compared to most other games/consoles/mobile devices).

I do believe the mobile market is huge, and it will only continue to grow, but I'm on board with what Bill said once, yes, Bill. It went something like "a device on the go, a tablet/bigger device at work, and a huge device at home (referring to desktops/TVs+consoles)". I like that idea, since on the desktop you don't have any restrictions on battery life or power, aside from your PSU, of course. You get the full glory of your CPU/GPU, unlike on your laptop. And the keyboard + mouse, a detailed monitor, plus the enhanced GFX compared to a console.

I still think the PC market is HUGE, massive. I would love it if nVidia had it right with the PC gaining market share above the consoles once again. While I liked and enjoyed the idea of consoles back in the day, they are just too damn restrictive nowadays.
#16
thematrix606
Frick said:
IMO 27" is perfect for 2560 x xxxx. 24" is perfect for 1920x1080, 22" good for 1680x1050. IMO of course.
Yeah, it's certainly doable, but I wear glasses, don't have perfect vision, and I'm behind a PC a lot (software tester), so I do enjoy not having to lean closer to my monitor to read my desktop text :)
#17
afw
I think I know what this news means ... :rolleyes: nVidia wants the buyers to wait for their cards and not to buy the new 7xxx series ... this could actually work ... cos some ppl don't have a purpose/reason to buy a new card ... they just want to have the best out there ... :slap:
#18
Zubasa
Aceman.au said:
As much as I want to go with Nvidia... I'll probably end up going with AMD again because of the price-to-performance ratio... Nvidia can really kill AMD if they bring out the 7XX series really cheap.

Alas it won't happen as they are greedy and want profit here and now.
That is assuming they are in the position to do that.
So far nVidia's GPUs have a bigger die size and also more memory chips due to the wider bus.
So in a price war AMD has the upper hand. This has been their strategy since the HD3k series.
#19
Benetanegia
Zubasa said:
That is assuming they are in the position to do that.
They've never been in a better position. Read below.
So far nVidia's GPUs have a bigger die size and also more memory chips due to the wider bus.
So in a price war AMD have the upper hand. This is their strategy since the HD3k series.
Yeah, a strategy that is over now. The old one was strongly based on the fact that Nvidia was aggressively pursuing GPGPU and AMD never did, introducing only some small things here and there, saving up tremendous amounts of transistors in the process.

Just think about Tahiti. Tahiti has 4.3 billion transistors, much much more than the 2.7 billion in Cayman and also much more than GF100/110's 3 billion. GPGPU is expensive (64-bit, memory management, etc.), something that nobody ever noticed or even cared about. Well, AMD finally matched Nvidia on GPGPU features and capabilities, and the result is a chip with 1.3 billion more transistors (+43%) than GF110 that is only 15% faster, with 15% faster clocks; clock for clock they are mostly equal. With a little help from 28 nm and its good transistor clocking, Nvidia could in theory release a hypothetical GF111 @ 900 MHz that would be as fast as the HD 7970. 3 billion transistors vs 4.3 billion; now imagine a 3.6 billion chip, or a 4.5 billion transistor one.

I know it's not something many people here want to hear, and that it even hurts them, but I think Nvidia is in for an easy win this time around. They have to screw up badly not to score this one.
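To reduce the argument above to one ratio, here is the perf-per-transistor math using the figures quoted in this post (forum estimates, not measured data):

```python
# Perf-per-transistor comparison with the numbers quoted above
# (forum estimates): GF110 at 3.0B transistors as the baseline,
# Tahiti at 4.3B and ~15% faster overall.
gf110_transistors = 3.0e9
tahiti_transistors = 4.3e9
tahiti_speedup = 1.15

growth = tahiti_transistors / gf110_transistors - 1
perf_per_transistor = tahiti_speedup / (tahiti_transistors / gf110_transistors)

print(f"Tahiti: {growth:+.0%} transistors, "
      f"{perf_per_transistor:.0%} of GF110's perf per transistor")
```

In other words, on these rough numbers Tahiti spends ~43% more transistors for ~15% more performance, which is the whole argument in one line.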
#20
Zakin
entropy13 said:
FXAA was made by Nvidia. Instead of being hardware locked (i.e. Nvidia only cards like PhysX) however, they left it to the game developers to add it to the game.
Entirely incorrect. Nvidia does not leave it to game developers to add it; FXAA was made by a free-source developer who allowed anybody to add it into their games or into the Nvidia control panel. ATI has the option to put it in if they please but has not; feel free to check his blog, as he is still updating FXAA and working on the next big version of it right now. In the end, Nvidia doesn't own FXAA by any means.

Although, to be quite fair, he does work with Nvidia; he has stated many times over that FXAA is entirely free source for anyone to use and adapt.
#21
Benetanegia
Zakin said:
Entirely incorrect, Nvidia does not leave it to game developers to add it, FXAA was made by a free source developer who allowed anybody to add it into their games or into the Nvidia control panel. ATI has the option to put it in if they please but has not, feel free to check his blog as he is still updating FXAA currently and working on the next big version of it right now. In the end Nvidia doesn't own FXAA in any means.
I have always assumed that FXAA was made by Timothy Lottes, and every Google search I make about it ends in the same place. So, as a short answer, yes, it was made by/for Nvidia.
#22
Zakin
That it is, but as I have said, if you look around, he does leave it open for just about anybody to use. Whereas he may not allow ATI to put it in their panel, it doesn't mean people couldn't simply mod Catalyst to put it in. I actually believe RadeonPro was looking into doing just this. It would be entirely legal as well.

Just to clarify, I run a 6870 and pretty much inject FXAA into almost every game I play, as MLAA is trash in comparison.
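For anyone curious what these injectors actually do: FXAA/MLAA/SMAA are post-process filters that read only the finished frame, detect edges from per-pixel luma, and blend across them. A toy grayscale sketch of that first step (this is not the real FXAA shader, which runs in a pixel shader with sub-pixel offsets; it just shows the general shape of the idea):

```python
# Toy post-process AA: find high-contrast pixels from their luma
# neighbourhood and blend them with their neighbours.
def blend_edges(luma, threshold=0.25):
    """luma: 2D list of floats in [0, 1]. Returns a softened copy."""
    h, w = len(luma), len(luma[0])
    out = [row[:] for row in luma]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = luma[y][x]
            n, s = luma[y - 1][x], luma[y + 1][x]
            e, wv = luma[y][x + 1], luma[y][x - 1]
            contrast = max(c, n, s, e, wv) - min(c, n, s, e, wv)
            if contrast > threshold:  # edge pixel: average with neighbours
                out[y][x] = (c + n + s + e + wv) / 5
    return out

# A hard vertical edge (0.0 | 1.0) gets softened on both sides:
frame = [[0.0] * 3 + [1.0] * 3 for _ in range(6)]
softened = blend_edges(frame)
```

Flat areas stay below the contrast threshold and are untouched; only high-contrast edges get blended, which is why these filters are so cheap compared to MSAA.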
#23
Johnny_Utah
Aceman.au said:
As much as I want to go with Nvidia... I'll probably end up going with AMD again because of the price-to-performance ratio... Nvidia can really kill AMD if they bring out the 7XX series really cheap.

Alas it won't happen as they are greedy and want profit here and now.
There is NO WAY they will bring out a "cheap" 7XX line. Why would they? AMD just moved their prices up, so why would Nvidia drop theirs?

(BTW I own a lot of cards from both teams, red and green)

Price to performance was big a generation ago. Now it's going to be very hard for AMD to keep that crown, as they have increased pricing... look at the 7970 (which I have). I paid 600+, which is what I paid for each of my 580 3GBs... so I expect high-end Kepler to be around 650 and to be faster than the 7970. Where's the price to performance now? Also, it's comical that you blame your ignorance in purchasing on the company. Anyone who did ANY reading about the 6870 could have told you to keep your 5xxx card. Want new tech? You will have to pay for it, as always.
#24
Zubasa
Benetanegia said:

Yeah, a strategy that is over now. The old one was strongly based on the fact that Nvidia was aggressively pursuing GPGPU and AMD never did, introducing only some small things here and there, saving up tremendous amounts of transistors in the process.

Just think about Tahiti. Tahiti has 4.3 billion transistors, much much more than the 2.7 billion in Cayman and also much more than GF100/110's 3 billion. GPGPU is expensive (64-bit, memory management, etc.), something that nobody ever noticed or even cared about. Well, AMD finally matched Nvidia on GPGPU features and capabilities, and the result is a chip with 1.3 billion more transistors (+43%) than GF110 that is only 15% faster, with 15% faster clocks; clock for clock they are mostly equal. With a little help from 28 nm and its good transistor clocking, Nvidia could in theory release a hypothetical GF111 @ 900 MHz that would be as fast as the HD 7970. 3 billion transistors vs 4.3 billion; now imagine a 3.6 billion chip, or a 4.5 billion transistor one.

I know it's not something many people here want to hear, and that it even hurts them, but I think Nvidia is in for an easy win this time around. They have to screw up badly not to score this one.
Let's hope they don't screw up this round, because Kepler is the only reason I am still sitting on my 5870.
The core configuration and the actual performance of the 7970 don't impress me at all, TBH.
#25
Casecutter
If there is anyone who believes Nvidia will release their Über offering and be able to set an MSRP lower than AMD's 7970, I want what you're smoking!

Kepler will be an improvement over the 7970 in performance, but honestly not in power or production costs, unless they're really able to pull a rabbit out of the hat. So it still means complex power sections and coolers to engineer and package onto a PCB.

Per chip, I can't see TSMC giving any price break to Nvidia. The only way this works is if Nvidia purchases entire wafers, their architecture sorts much better, and they get better yields. I don't know how either group contracts on purchasing, or whether either side has an architectural advantage in production at 28 nm. They're both probably struggling with TSMC; the only upside is Nvidia will be reaping production benefits for being behind, while AMD has five months' head start in sales. It probably just evens out on the bottom line by year's end.