Wednesday, July 23rd 2014

NVIDIA Timing GeForce GTX 880 Launch with Gamescom?

NVIDIA is reportedly preparing new graphics cards based on the "Maxwell" architecture, which, judging by the purported 3DMark Fire Strike screenshots, could be high-end parts, maybe even the GeForce GTX 880. The cards could launch on the sidelines of Gamescom, a gaming expo held each year in Cologne, Germany, around mid-August. Given that most information to date points to a late-Q3, maybe even mid-Q4 launch of these cards, it wouldn't surprise us if NVIDIA merely teases them. NVIDIA's next big consumer GPU is widely expected to be the GM204, a performance-segment chip based on the "Maxwell" architecture.

Source: VideoCardz
Add your own comment

20 Comments on NVIDIA Timing GeForce GTX 880 Launch with Gamescom?

#1
mroofie
VideoCardz, the most reliable source ever... NOT!!!
And yes, I'm also waiting for this to turn into an AMD fanboy section :p
I must do it for my country :D
Posted on Reply
#2
ZoneDymo
by: mroofie
VideoCardz, the most reliable source ever... NOT!!!
And yes, I'm also waiting for this to turn into an AMD fanboy section :p
I must do it for my country :D
Can't see in what way this potential news could turn into an "AMD fanboy" section
Posted on Reply
#3
HumanSmoke
Doesn't sound plausible. GM204 tape-out seems to be generally accepted as April, so figure 8-12 weeks of fabrication and testing at the bare minimum, and that's assuming the first-run A1 silicon needs no revision. You'd still need to take into account die packaging, building the boards (presumably Foxconn), retail packaging, and shipping... and that's assuming Nvidia's fab schedule hasn't been revised. Seems like an awfully tight schedule.
by: ZoneDymo
Can't see in what way this potential news could turn into an "AMD fanboy" section
Hi! Welcome to the internet!
If this thread follows just about any other graphics story since 1998, you'll probably be seeing comments along the lines of "too expensive" (even though no pricing is mentioned) and "I don't like Nvidia because [insert random proprietary tech]", followed by some rambling about drivers, SLI scaling, the absolute necessity of DirectCompute and GPGPU functionality, never-to-be-attained theoretical FLOPS performance, relative price-per-perf, and perf-per-watt of cards that aren't the GTX 880. Cue backlash. Cue unrelated comparisons and anecdotal sob stories of dashed dreams.
Something like that....but it won't just be AMD fanboys involved.
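
For what it's worth, the schedule arithmetic above can be sketched like this. The tape-out date and the board/shipping duration are illustrative assumptions, not reported figures:

```python
from datetime import date, timedelta

# Back-of-envelope window from an assumed April 1 tape-out.
# All dates and durations here are assumptions, per the post above.
tape_out = date(2014, 4, 1)
fab_test_weeks = (8, 12)   # fabrication + testing, bare minimum
board_weeks = 4            # die packaging, board build, shipping (guessed)

earliest = tape_out + timedelta(weeks=fab_test_weeks[0] + board_weeks)
latest = tape_out + timedelta(weeks=fab_test_weeks[1] + board_weeks)
print(earliest, "to", latest)  # 2014-06-24 to 2014-07-22
```

Even with no silicon respin, that window only just clears a mid-August Gamescom date, which is why the schedule looks so tight.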
Posted on Reply
#4
RCoon
Gaming Moderator
by: HumanSmoke
Hi! Welcome to the internet!
If this thread follows just about any other graphics story since 1998, you'll probably be seeing comments along the lines of "too expensive" (even though no pricing is mentioned) and "I don't like Nvidia because [insert random proprietary tech]", followed by some rambling about drivers, SLI scaling, the absolute necessity of DirectCompute and GPGPU functionality, never-to-be-attained theoretical FLOPS performance, relative price-per-perf, and perf-per-watt of cards that aren't the GTX 880. Cue backlash. Cue unrelated comparisons and anecdotal sob stories of dashed dreams.
Something like that....but it won't just be AMD fanboys involved.
This needs to be copied and pasted into the Forum Rules section, so every time somebody posts one of those generic statements for AMD or NVidia they get backhanded by every member for being a tool and sent to the naughty corner to read the rules again.
Posted on Reply
#5
urza26
by: RCoon
This needs to be copied and pasted into the Forum Rules section, so every time somebody posts one of those generic statements for AMD or NVidia they get backhanded by every member for being a tool and sent to the naughty corner to read the rules again.
Matrox rocks!! I managed to circumvent the rules, victory is mine! :P
Posted on Reply
#6
alwayssts
by: HumanSmoke
Doesn't sound plausible. GM204 tape-out seems to be generally accepted as April, so figure 8-12 weeks of fabrication and testing at the bare minimum, and that's assuming the first-run A1 silicon needs no revision. You'd still need to take into account die packaging, building the boards (presumably Foxconn), retail packaging, and shipping... and that's assuming Nvidia's fab schedule hasn't been revised. Seems like an awfully tight schedule.

Hi! Welcome to the internet!
If this thread follows just about any other graphics story since 1998, you'll probably be seeing comments along the lines of "too expensive" (even though no pricing is mentioned) and "I don't like Nvidia because [insert random proprietary tech]", followed by some rambling about drivers, SLI scaling, the absolute necessity of DirectCompute and GPGPU functionality, never-to-be-attained theoretical FLOPS performance, relative price-per-perf, and perf-per-watt of cards that aren't the GTX 880. Cue backlash. Cue unrelated comparisons and anecdotal sob stories of dashed dreams.
Something like that....but it won't just be AMD fanboys involved.
Nice random Voodoo2-era reference. My 9700 still kicks your FX 5800's ass, and it was $100 cheaper... Not to mention I got to play BF1942 and Hot Pursuit 2 at playable settings for a few months before you! /JK.

Let me just start it off by saying it simply looks like both AMD and NVIDIA will have fairly optimized 64-ROP parts (more or less 4x the PS4 GPU spec, the same way Hawaii/GK110 products are ~4x the Xbox One). Both look to be coming relatively soon and on 28nm. That both companies seem to have made that feasible is pretty astounding, even if HPL saves a bunch of power... these should be some big freakin' chips. I'm sure they will be very nice, bump each current series down a peg (while those are themselves refreshed in one form or another), and truly usher in 4K gaming while elevating the baseline...

But yeah... how much $? Prolly a lot. Not just kind of a lot, prolly a lot a lot. Also, if they're in a similar power envelope, at what point do more units at a lower clock make more sense than fewer units overclocked higher, even if more efficient?

On a purely fundamental level, I wonder how each achieves this feat, architecturally speaking. I could see AMD doing 512-bit and 7 Gbps RAM, but what about NVIDIA? Tons of cache (bigger chip) and slower RAM? Should be interesting. /#jointhespeculation
Posted on Reply
#7
HumanSmoke
by: alwayssts
On a purely fundamental level, I wonder how each achieves this feat, architecturally speaking. I could see AMD doing 512-bit and 7 Gbps RAM, but what about NVIDIA? Tons of cache (bigger chip) and slower RAM? Should be interesting. /#jointhespeculation
If GK110 managed to marry a 384-bit I/O with 7 Gb/s GDDR5, I wouldn't see them going slower with their next part. If the GM204 is indeed 256-bit, then 7 Gb/s should be considered the baseline for bandwidth, I would have thought. Couldn't see NVIDIA going any less than what GK104 achieved (224 GB/s bandwidth), TBH.
L2 cache, if GM204 follows GM107's example, will be sizeable and should allow for a reasonable speedup. The GM107 suffers heavily in comparison to GK106's bandwidth, bus width, memory speed, core count, ROP count, and texture address units, but the cache structure certainly mitigates what should be a major performance disparity.
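
For reference, the bandwidth numbers being thrown around follow from a simple formula. The 256-bit/7 Gb/s GM204 configuration here is rumored, not confirmed:

```python
# Peak GDDR5 bandwidth: bus width (bits) / 8 bytes-per-bit * per-pin data rate (Gb/s).
def gddr5_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(gddr5_bandwidth_gbs(384, 7))  # GK110 at 7 Gb/s: 336.0 GB/s
print(gddr5_bandwidth_gbs(256, 7))  # rumored GM204: 224.0 GB/s
print(gddr5_bandwidth_gbs(256, 6))  # GK104 (GTX 680): 192.0 GB/s
```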
Posted on Reply
#8
Kaotik
VideoCardz, and everyone spreading their 'news', is being fooled here.

NVIDIA UK's Facebook post was never attached to the image above it. The 'T-27', at the date it was posted, matches the date the SHIELD Tablet becomes available in the UK.
Posted on Reply
#9
Dj-ElectriC
You guys are just trash talkin'.

This picture of 3DMark looks completely legit and there's no way this isn't a GTX 880. Clearly.


:|
Posted on Reply
#10
the54thvoid
by: Dj-ElectriC
You guys are just trash talkin'.

This picture of 3DMark looks completely legit and there's no way this isn't a GTX 880. Clearly.


:|
How dare you sarcastically imply 3DMark images can be falsified. Here's my latest run.

Posted on Reply
#11
FreedomEclipse
~Technological Technocrat~
Hurry the fuck up, NVIDIA! My money is waiting!
Posted on Reply
#12
GhostRyder
Meh, seems a bit early in all honesty for an August release, because a month ago it seemed like things were at a standstill. But I digress; that is why rumors are called rumors. Now it's just going to come down to which one is correct. I guess we will hear soon enough.
Posted on Reply
#13
yogurt_21
Personally, I'd be surprised if we got a new Maxwell-based high-end GPU before they finished refreshing their notebook GPU lineup with Maxwell. So I'd actually expect to see a Maxwell 880M before a new Maxwell-based high-end part for the desktop. The Maxwell 860M came out in March, but the 870M and 880M remained Kepler parts.

Given that low power is Maxwell's bread and butter, I'd expect the mobile solutions to be fully vetted before the desktop high end comes out.
Posted on Reply
#14
MxPhenom 216
Corsair Fanboy
by: HumanSmoke
Doesn't sound plausible. GM204 tape-out seems to be generally accepted as April, so figure 8-12 weeks of fabrication and testing at the bare minimum, and that's assuming the first-run A1 silicon needs no revision. You'd still need to take into account die packaging, building the boards (presumably Foxconn), retail packaging, and shipping... and that's assuming Nvidia's fab schedule hasn't been revised. Seems like an awfully tight schedule.

Hi! Welcome to the internet!
If this thread follows just about any other graphics story since 1998, you'll probably be seeing comments along the lines of "too expensive" (even though no pricing is mentioned) and "I don't like Nvidia because [insert random proprietary tech]", followed by some rambling about drivers, SLI scaling, the absolute necessity of DirectCompute and GPGPU functionality, never-to-be-attained theoretical FLOPS performance, relative price-per-perf, and perf-per-watt of cards that aren't the GTX 880. Cue backlash. Cue unrelated comparisons and anecdotal sob stories of dashed dreams.
Something like that....but it won't just be AMD fanboys involved.

Perhaps the best post you have ever made...........ever.
Posted on Reply
#15
Fluffmeister
HumanSmoke is basically the only person who speaks sense on this forum, so no surprise. :laugh:
Posted on Reply
#16
eidairaman1
by: the54thvoid
How dare you sarcastically imply 3DMark images can be falsified. Here's my latest run.


Lmfao
Posted on Reply
#17
LeonVolcove
Now AMD just needs to bring something similar to what they did with the R9 290/290X and we'll have more competition coming.
Posted on Reply
#18
the54thvoid
by: LeonVolcove
Now AMD just needs to bring something similar to what they did with the R9 290/290X and we'll have more competition coming.
Which at this point would mean they just need to bring a rumour. These ultra-early, unsubstantiated near-fabrications deserve nought but a fart in their general direction. Seriously, if NV do bring something out this autumn, well, super dooper, but let's all remain calm and rational. Kepler had a year's head start on Hawaii (GK110 HPC release), so if NV do have something, it would be unlikely (unfortunately) for AMD to have their new arch out. That being said, why would NV release their 'next gen' when their current single GPU is the better performer? It doesn't make business sense for NV to release the 880 yet.
AMD may have the tech crown with their 295, but that probably doesn't worry NV. I could see NV sitting on their Maxwell cards until AMD 'next gen' rumours started.
Posted on Reply
#19
TheMailMan78
Big Member
by: Fluffmeister
HumanSmoke is basically the only person who speaks sense on this forum, so no surprise. :laugh:
Except me. I always shoot torpedoes of truth. Bulletproof facts doused in horrible grammar and off-color analogies, with a side of I-told-you-so and assholism. Finished off with a three-month ban for wielding unadulterated truth with troll dressing.
Posted on Reply
#20
Steevo
by: urza26
Matrox rocks!! I managed to circumvent the rules, victory is mine! :p
I prefer Cirrus Logic: 24-bit true color, but only at lower resolutions, as you run out of memory. The unparalleled stability of their drivers was second to none. Performance per watt? What watt? They weren't even cooled; they processed pixels with black magic. The single VGA output was elegant, and the satin finish on the I/O shield was by far superior to Matrox and their tin.
Posted on Reply
Add your own comment