Friday, August 28th 2009
NVIDIA Plans GT300 Demos in September
It looks like NVIDIA doesn't want AMD to drench the media and consumers with enough hype to ensure a smooth, profitable launch of its "Evergreen" family of DirectX 11 GPUs. The party-crasher this time around is NVIDIA's GT300 graphics processor, which sources claim continues NVIDIA's design methodology of a powerful, monolithic GPU. AMD's itinerary for September looks fairly clear: press briefings on and around the 10th (we'll be heading to Munich for ours), a number of previews to follow, launches towards the end of the month, and market availability soon after, in October.
In essence, AMD ends up with all the limelight for the better part of the quarter, in the run-up to the crucial November–December shopping season. Meanwhile, the green camp is reportedly readying its own press briefings for the GT300 GPU, to be held in late September. The scale of those briefings, and how far along the engineering samples are, is not yet known, but it could be enough to gain public attention for its DirectX 11 GPUs.
Source:
Bright Side of News
98 Comments on NVIDIA Plans GT300 Demos in September
It's also pointless to show that this may be just a lot of paper or whatever, taking the above into consideration, as we really don't know what nVidia has up its sleeves.
I know one thing: if I were neutral and didn't really care which card I want, I'd be upset if nVidia didn't come through as quickly as ATI on this. nVidia is basically copying ATI's act, and thus should follow through like ATI as well; if nVidia doesn't, call it a squeaky wheel taking up my time and frustrating me.
XD :rockout:
Told ya they'd make a comeback!:rockout:
can't wait till September :toast:
If you have texture loading problems you need more vmem or a faster HDD, or more system memory, or faster system memory, or to update from XP for better memory handling.
Yes, I know a lot of people had/have issues with GTA4 and cry about it, but here I am with supposedly "inferior" hardware, playing it at 40+ FPS at high resolution, and I've only had a couple of hiccups. So something is wrong with this equation.
Is it your Intel CPU? (Isn't Intel faster than AMD.......)
Is it your GPU? (Isn't a 4870 faster than a 4850?)
Is it your hard drive?
Is it your OS?
Choose one.
I guess since this is my hobby, I need new toys every six months or so...what can I say, I am addicted to hardware...:D
But as I see it, you're addicted to hardware. lol
At any rate, I still can't max it on my PC because of its extraordinary need for framebuffer. Wish I had 2GB cards. lol. Oh, and it runs better in xfire than it does in single GPU, so that's not the issue. If you get poor performance, 9 times out of 10 you have the details set too high for your amount of framebuffer, which causes it to spill over into system memory, which causes it to stutter. Keep it away from your framebuffer limits and it plays smooth as silk most of the time. The other factor is your CPU. It's one of the few games that does benefit from a faster CPU. Crank a quad to the 3.5+ GHz range and you should be perfectly fine.
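The spillover mechanism described above can be sketched with a rough back-of-the-envelope budget check. This is a minimal illustration with hypothetical numbers (the function names, 4 bytes/pixel, three render targets, and the texture-budget figures are all assumptions, not measured GTA IV values):

```python
# Rough, hypothetical estimate of GPU memory use at a given setting,
# illustrating why exceeding the card's framebuffer forces spillover
# into (much slower) system RAM and causes stutter.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Render targets: front, back, and depth (assumed 4 bytes/pixel each)."""
    return width * height * bytes_per_pixel * buffers / 1024**2

def fits_in_vram(vram_mb, width, height, texture_budget_mb):
    """True if render targets plus the texture budget stay inside VRAM."""
    used = framebuffer_mb(width, height) + texture_budget_mb
    return used <= vram_mb, round(used)

# A 512 MB card at 1920x1200 with a ~550 MB texture budget overflows,
# while a 1 GB card at the same settings does not:
print(fits_in_vram(512, 1920, 1200, 550))   # (False, 576)
print(fits_in_vram(1024, 1920, 1200, 550))  # (True, 576)
```

The point the commenter makes falls out directly: the same detail settings fit on a larger card but push a smaller one past its limit, and everything past the limit lives in system memory.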
As for the 5800 series, I'll pass until either a DX11 game releases that I want, or the prices drop along the lines of what happened with the 4800 series, and I see what nv has to offer as a counter (provided I have an SLI capable board at that time).
but really, is there any other use for a 4850 (aside from 2D/3D visuals) that takes up more than 10% of its time?
I guess the main reason for me not upgrading is being extremely strapped for cash these days. :(
LOL, I love hearing people bash ATi over DX10.1 & DX11 because they're relatively unused as yet, while praising nVidia's PhysX, which is just the same, adoption-wise. It's almost like saying Airbus shouldn't make new fuel-efficient planes while there's still plenty of fuel; they should wait until there's a shortage, then act.:wtf:
I hope the market share of the two equalises; then there'll be a real GFX war, which will get even bigger with Intel's Larrabee intro!:rockout:
DX11 brings a lot of usable savings with it, for more eye candy.
ATi has had its SIMD tech since R600, and its engine technologically surpasses nVidia's built-in DX10 tweaks (G80-G92) bolted onto the old, great G70 9.0c core base. They've just been tweaking that great 2005 G70 leap forward for how long now... 4.5 years and counting. Meanwhile, ATi has had its 16x SIMD core under the same umbrella for 2.5 years. Their R600 architecture in fact brought something that was supposed to be "real DX10", and all they could use to negotiate with MS was that their engine complied with something called DX10.1 (which nVidia smartly detoured around). And with only a few improvements over it, they built the "tweaked, pumped up" DX11 core, as you call it :rolleyes: ATi's DX10 engine (R600) was so close to what we now see, more pompously, as DX11, and the only thing it lacked was true MIMD, as nVidia fancily calls it. [I really wonder what real-life miracle they'll bring to the world behind that abbreviation?] On the other hand, that "MIMD functionality" is the basis for DX11 compliance, and we already saw it (AMD's June Taipei presentation) as AMD's multi-threaded communication between their 16 SIMD x 5 SP core packs.
So if they didn't really f*ck us with all that hype, we'll end up with pretty much the same real-life functionality, just differently implemented(?) and named. And who cares about their marketing names and what they'll really call their ingenuity. It would be great if they didn't mock us with all that NDA stuff and pretend they're really reinventing the wheel.
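The SIMD vs. MIMD distinction the commenter is arguing about can be sketched in plain code. This is a loose conceptual illustration only, not GPU code (real GPU hardware schedules work very differently); the data, the lambdas, and the thread-pool analogy are all assumptions made up for the sketch:

```python
# Conceptual sketch: SIMD runs ONE instruction over many data elements
# in lockstep; MIMD lets independent units each run a DIFFERENT
# instruction stream on their own data.
from concurrent.futures import ThreadPoolExecutor

data = [1, 2, 3, 4, 5]

# SIMD-style: every lane applies the same operation to its own element.
simd_result = [x * 2 for x in data]  # one instruction, many data

# MIMD-style: each "unit" has its own program for its own element.
programs = [lambda x: x * 2, lambda x: x + 10, lambda x: x ** 2,
            lambda x: -x, lambda x: x // 2]
with ThreadPoolExecutor(max_workers=5) as pool:
    mimd_result = list(pool.map(lambda px: px[0](px[1]),
                                zip(programs, data)))

print(simd_result)  # [2, 4, 6, 8, 10]
print(mimd_result)  # [2, 12, 9, -4, 2]
```

The practical difference is flexibility: a SIMD array stalls when lanes need to diverge, while MIMD units can each follow their own branch, which is roughly the capability being debated above.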
And about that "Evergreen" stuff... AMD's been green a lot longer than nVidia ;) It's just a pity that they're killing the great ATi brand and appearance. Shame on them. And how can they still call their performance pixie Ruby... when it would be more proper to call her Esmeralda :D to fit their new appearance.
And is it true that the new low-end (Cedar/Redwood) ATi offerings will lack PowerPlay features? It would be really crappy if some budget card that could consume <10W in 2D (<3W for Cedar) lacked those features. Another market rip-off (fraud?).
It's a pity that AMD trails nVidia here, which already put its HybridSLI on death row.
When I DID play that poor excuse for a game, I played it on my 940 @ 3.6 GHz with a single 280. Ran fine for me, AND I had AA with the res set to 1920x1200, and it's only a 1 GB card. Use Google to find out how I did it. It's not that hard, really.
As for the 5870, meh. I'll wait for GT300. The 5870 didn't impress me.
Either way, I don't care, as long as they make good cards each year. And I bet nVidia's even happier with prices creeping back up, as we're going to be back to paying $350+ for a single-GPU card.