Discussion in 'News' started by btarunr, Jan 16, 2012.
leaving out the lonely 5830 are we?
^lol indeed, not to mention, the HD 6930
If nothing else, you'll have the option to choose, because right now only AMD has the highest-end card you can imagine. And the tech behind it is pretty good.
I just wish they'd offer more software goodies, like the FXAA and SMAA algorithms alongside the already available MLAA. And I wish NVIDIA would do the same. I know these features can be injected into any game, but doing that risks a VAC ban, and well, I don't want that.
FXAA was made by Nvidia. Instead of hardware-locking it (i.e. restricting it to Nvidia cards, like PhysX), however, they left it to the game developers to add it to their games.
It will be the same thing all over again: AMD comes first with a new generation faster than the previous NV cards, and then NV comes with its new gen, which in turn is faster than AMD's. Lately we've gotten used to a few good months in between where AMD is alone on the market. The sad part is that the software (i.e. games) is not keeping pace, so mainstream cards from two generations back can handle any game out there well (depending on the settings, of course) and probably will do so with this year's games too.
As much as I want to go with Nvidia... I'll probably end up going with AMD again because of the price-to-performance ratio. Nvidia could really kill AMD if they brought out the 7XX series really cheap.
Alas it won't happen as they are greedy and want profit here and now.
Unless you own a 27"+ monitor or a triple-monitor setup (which is a bit more expensive, and just not too handy for most), your two-generations-old card will barely breathe, on the VRAM alone. My HD 5870 barely handles medium/high settings at 1920x1200 without any AA in all the latest titles (BF3, Crysis 2, Skyrim). Remember, BF3 uses 1.3-1.5 GB of VRAM on Ultra alone. Wait until this year's high-end game releases push beyond that, aside from all the ports, that is, of course xD
And before you bash and say "oh no, 27" is not mainstream, 22" is", remember this: a few years ago a 19" monitor cost you 350 euro and that was mainstream; now a 27" monitor costs 270 euro.
Why don't we talk about resolution and not monitor size?
You don't use monitor size to talk about performance; it makes no sense. A 23.6-inch monitor can have the same resolution as a 24", a 27", etc. The performance of a system is the same on different monitor sizes at the same resolution.
As for the news, it's not too late, and I'm glad Nvidia announced it. I really wanna change ships this time.
FXAA is already in the drivers; you can use Nvidia Inspector to fiddle with it, along with custom ambient occlusion, forcing AA through flags, and a frame limiter (an alternative to vsync with no input lag).
Show me a 27" monitor that ISN'T 1080 vertical pixels for less than £400.
This http://www.overclockers.co.uk/showproduct.php?prodid=MO-011-HO&groupid=17&catid=1120&subcat= is the cheapest 2560x1440 27" monitor I can find. After that you're heading towards £500.
Why you would want 2560 @ 27" is beyond me, I wouldn't want that on anything less than 30".
And the reason 1080 is the standard is because of HD tv, not much we can do there.
So looking for a non-1080 27" is just stupid. 27" 1080p monitors start @ 216 euro here.
On topic: bring on the leaked benchmarks!!!
Wow... for once AMD took over a thread and it didn't go all negative. Really looking forward to this launch, as I've only used ATI/AMD and want to give the green team a try. AMD's latest is a beast, though. Hope Nvidia brings their A game at nosebleed prices.
So are we talking about the mainstream "Kepler" GK104,
or "The Kepler" GK100, the mother of all Nvidia GPUs???
IMO 27" is perfect for 2560 x xxxx. 24" is perfect for 1920x1080, 22" good for 1680x1050. IMO of course.
I see your point but I think the most common res is 1080p and even lower. A GTX460 can play those mentioned games very well, again depending on the settings (maybe no high-res pack for Crysis 2, Med-high no AA for BF3). Of course if you want all the eye-candy you'll need something much better than that but in the age of mobile entertainment devices it seems that there's really no need for hundreds of dollars worth of GPU. Sad but true. I do hope that Nvidia will not throw in the towel in this business to concentrate on mobile computing.
You know, a few months ago I was sitting in front of my PC, playing games on my tablet, totally ignoring my PC for hours... but now I don't even touch the tablet anymore, staring at my desktop 27" with Crysis 2 in its near-full glory, using a mouse and keyboard. Jaw-dropping (compared to most other games/consoles/mobile devices).
I do believe the mobile market is huge and will only continue to grow, but I'm on board with what Bill said once, yes, that Bill. It went something like: "A device on the go, a tablet/bigger device at work, and a huge device at home (referring to desktops/TVs+consoles)." I like that idea, since on the desktop you don't have any restrictions on battery life or power, aside from your PSU, of course. You get the full glory of your CPU/GPU, unlike on a laptop, plus the keyboard and mouse, a detailed monitor, and the enhanced graphics compared to a console.
I still think the PC market is HUGE, massive. I would love it if nVidia had it right with the PC gaining market share above the consoles once again. While I liked and enjoyed the idea of consoles back in the day, they are just too damn restrictive nowadays.
Yeah, it's certainly doable, but I wear glasses, don't have perfect vision, and I'm behind a PC a lot (software tester), so I do enjoy not having to lean closer to my monitor to read my desktop text.
I think I know what this news means... Nvidia wants buyers to wait for their cards and not buy the new 7xxx series... this could actually work, because some people don't have a purpose/reason to buy a new card... they just want to have the best out there.
That is assuming they are in the position to do that.
So far Nvidia's GPUs have had a bigger die size and also more memory chips due to the wider bus.
So in a price war AMD has the upper hand. This has been their strategy since the HD 3000 series.
They've never been in a better position. Read below.
Yeah, a strategy that is over now. The old one was strongly based on the fact that Nvidia was aggressively pursuing GPGPU and AMD never did, introducing only some small things here and there, saving tremendous amounts of transistors in the process.
Just think about Tahiti: it has 4.3 billion transistors, much more than the 2.7 billion in Cayman and also much more than GF100/GF110's 3 billion. GPGPU is expensive (64-bit support, memory management, etc.), something that nobody ever noticed or even cared about. Well, AMD has finally matched Nvidia on GPGPU features and capabilities, and the result is a chip with 1.3 billion more transistors (~43%) than GF110 that is only 15% faster, with 15% faster clocks; clock for clock they are mostly equal. With a little help from 28 nm and its good transistor clocking, Nvidia could in theory release a hypothetical GF111 @ 900 MHz that would be as fast as the HD 7970. 3 billion transistors vs 4.3 billion; now imagine a 3.6 billion chip, or a 4.5 billion one.
I know it's not something many people here want to hear, and that it even hurts them, but I think Nvidia is in for an easy win this time around. They have to screw up badly not to score this one.
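For what it's worth, the back-of-the-envelope math in the post above is easy to check. This quick sketch uses only the figures the poster quotes (4.3B/3.0B transistors, ~15% faster), not verified specs:

```python
# Rough perf-per-transistor comparison using the post's own numbers.
tahiti_transistors = 4.3e9   # HD 7970 (Tahiti), as quoted above
gf110_transistors = 3.0e9    # GTX 580 (GF110), as quoted above

# How many more transistors does Tahiti spend?
extra = tahiti_transistors - gf110_transistors          # 1.3 billion
growth = extra / gf110_transistors                      # fractional increase

# Claimed overall speedup of the 7970 over GF110.
speedup = 1.15

# Performance delivered per transistor, relative to GF110.
perf_per_transistor = speedup / (tahiti_transistors / gf110_transistors)

print(f"Tahiti has {growth:.0%} more transistors than GF110")
print(f"Relative perf per transistor vs GF110: {perf_per_transistor:.2f}")
```

By these numbers Tahiti spends about 43% more transistors for about 15% more performance, which is the post's whole argument about the cost of GPGPU features.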
Entirely incorrect. Nvidia does not leave it to game developers to add it; FXAA was written by a developer who released the source freely, allowing anybody to add it to their games or into the Nvidia control panel. ATI has the option to put it in if they please but has not. Feel free to check his blog; he is still updating FXAA and working on the next big version of it right now. In the end, Nvidia doesn't own FXAA by any means.
Although, to be quite fair, he does work with Nvidia. He has stated many times over that FXAA is entirely open source for anyone to use and adapt.
I have always assumed that FXAA was made by Timothy Lottes, and every Google search I make about it ends in the same place. So, as a short answer: yes, it was made by/for Nvidia.
That it is, but as I have said, if you look around, he does leave it open for just about anybody to use. While he may not allow ATI to put it in their panel, that doesn't mean people couldn't simply mod Catalyst to add it. I actually believe RadeonPro was looking into doing just this. It would be entirely legal as well.
Just to clarify: I run a 6870 and pretty much inject FXAA into almost every game I play, as MLAA is trash in comparison.
There is NO WAY they will bring out a "cheap" 7XX line. Why would they? AMD just moved their prices up, so why would Nvidia drop theirs?
(BTW I own a lot of cards from both teams, red and green)
Price to performance was big a generation ago. Now it's going to be very hard for AMD to keep that crown, as they have increased pricing. Look at the 7970 (which I have): I paid 600+, which is what I paid for each of my 580 3GB cards... so I expect high-end Kepler to be around 650 and faster than the 7970. Where's the price to performance now? Also, it's comical that you blame your own purchasing ignorance on the company. Anyone who did ANY reading about the 6870 could have told you to keep your 5xxx card. Want new tech? You'll have to pay for it, as always.
Let's hope they don't screw up this round, because Kepler is the only reason I'm still sitting on my 5870.
The core configuration and the actual performance of the 7970 don't impress me at all, TBH.