Monday, August 20th 2018

GALAX Confirms Specs of RTX 2080 and RTX 2080 Ti

GALAX spilled the beans on the specifications of two of NVIDIA's upcoming high-end graphics cards, as it's becoming increasingly clear that the company could launch the GeForce RTX 2080 and the GeForce RTX 2080 Ti simultaneously, to convince GeForce "Pascal" users to upgrade. The company's strategy appears to be to establish 40-100% performance gains over the previous generation, along with a handful of killer features (such as RTX, VirtualLink, etc.) to trigger the upgrade itch.

Leaked slides from GALAX confirm that the RTX 2080 will be based on the TU104-400 ASIC, while the RTX 2080 Ti is based on the TU102-300. The RTX 2080 will be endowed with 2,944 CUDA cores and a 256-bit wide GDDR6 memory interface holding 8 GB of memory, while the RTX 2080 Ti packs 4,352 CUDA cores and a 352-bit GDDR6 memory bus with 11 GB of memory. The memory clock is the same on both, at 14 Gbps. The RTX 2080 has its TDP rated at 215 W and draws power from a combination of 6-pin and 8-pin PCIe power connectors, while the RTX 2080 Ti pulls 250 W TDP, drawing power through a pair of 8-pin PCIe power connectors. You also get to spy GALAX's triple-fan non-reference cooling solution in the slides below.
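The headline numbers above are easy to sanity-check: GDDR6 bandwidth is the per-pin data rate times the bus width divided by 8 bits per byte, and the maximum board power is 75 W from the PCIe slot plus 75 W per 6-pin and 150 W per 8-pin connector. A quick sketch (function names are illustrative, not from the slides):

```python
# Peak GDDR6 bandwidth: per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte.
def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

# Maximum board power: 75 W from the PCIe slot, plus 75 W per 6-pin
# and 150 W per 8-pin auxiliary connector.
def power_budget_w(six_pin=0, eight_pin=0):
    return 75 + 75 * six_pin + 150 * eight_pin

print(bandwidth_gb_s(14, 256), power_budget_w(six_pin=1, eight_pin=1))  # RTX 2080:    448.0 GB/s, 300 W
print(bandwidth_gb_s(14, 352), power_budget_w(eight_pin=2))             # RTX 2080 Ti: 616.0 GB/s, 375 W
```

Both cards thus have comfortable headroom over their rated TDPs (215 W vs. a 300 W budget, 250 W vs. 375 W), which leaves room for overclocking.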

29 Comments on GALAX Confirms Specs of RTX 2080 and RTX 2080 Ti

#1
Durvelle27
Looks like I know what my next GPU will be
#2
medi01
btarunr said:
40-100% performance gains over the previous generation
Lol.
#3
cucker tarlson
If this is $800/$1,000+ like some leaks say, then don't bother, Durvelle. Wait for AMD to announce 7nm Navi/Vega; a bigger choice and lower prices are always better.
#4
Durvelle27
cucker tarlson said:
If this is $800/$1,000+ like some leaks say, then don't bother, Durvelle. Wait for AMD to announce 7nm Navi/Vega; a bigger choice and lower prices are always better.
I never said it had to be Nvidia :roll:
#6
Durvelle27
cucker tarlson said:
So you know what your next gpu won't be then.
I'm just seeing how it plays out. I currently have a GTX 960, which is the only Nvidia card I've used in 5 years, but anything current is an upgrade over it.
#7
bug
cucker tarlson said:
If this is $800/$1,000+ like some leaks say, then don't bother, Durvelle. Wait for AMD to announce 7nm Navi/Vega; a bigger choice and lower prices are always better.
I'm pretty sure the $1,000 leaked price includes the usual early adopter's fee ;)
#8
coonbro
Durvelle27 said:
I'm just seeing how it plays out. I currently have a GTX 960, which is the only Nvidia card I've used in 5 years, but anything current is an upgrade over it.
lol... "company could launch the GeForce RTX 2080 and the GeForce RTX 2080 Ti simultaneously, to convince GeForce 'Pascal' users to upgrade." I'm a Maxwell user; when it decides to pop, spark, and blow smoke, then I may be convinced to "upgrade", and even then I'd have to think long and hard about whether it's even worth it anymore with today's RGB junk. My PC building enthusiasm is at an all-time low with the state of today's software (limited support) and today's hardware (gimmicked up, and also limited support). Then add that malware service called Win-10 on top of all that. Think I'll pass.
#9
medi01
Durvelle27 said:
I'm just seeing how it plays out. I currently have a GTX 960, which is the only Nvidia card I've used in 5 years, but anything current is an upgrade over it.
It also happens to be exceptionally poor perf/buck, even by Nvidia's not-that-stellar standards.
#10
bug
coonbro said:
lol... "company could launch the GeForce RTX 2080 and the GeForce RTX 2080 Ti simultaneously, to convince GeForce 'Pascal' users to upgrade." I'm a Maxwell user; when it decides to pop, spark, and blow smoke, then I may be convinced to "upgrade", and even then I'd have to think long and hard about whether it's even worth it anymore with today's RGB junk. My PC building enthusiasm is at an all-time low with the state of today's software (limited support) and today's hardware (gimmicked up, and also limited support). Then add that malware service called Win-10 on top of all that. Think I'll pass.
I'm in the same boat. I'd only upgrade if a mid-range GPU starts handling 4k (but there aren't really any 4k monitors I'd buy either). RTX could be a reason to upgrade, but I'm not holding my breath for it making a significant splash in its first iteration. Ray tracing can be that yummy, though.
#11
Durvelle27
Welp, I'd be thoroughly surprised if the leaked prices are wrong and it comes in under $700.
#12
bug
medi01 said:
It also happens to be exceptionally poor perf/buck, even by Nvidia's not-that-stellar standards.
Because you know exactly how much he paid for it :wtf:
But yes, I've had Nvidia ever since my 6600GT, yet the 960 was one card I skipped without thinking twice. My 660Ti carried me all the way to my current 1060.

Durvelle27 said:
Welp, I'd be thoroughly surprised if the leaked prices are wrong and it comes in under $700.
They won't be cheap, because they have no competition. But pre-release prices are always inflated.
#13
Crustybeaver
cucker tarlson said:
If this is $800/$1,000+ like some leaks say, then don't bother, Durvelle. Wait for AMD to announce 7nm Navi/Vega; a bigger choice and lower prices are always better.
Wait for an AMD card, you say? I'm not sure people have that much time.
#14
Raendor
"The company's strategy appears to be to establish 40-100% performance gains over the previous generation"

Lol, looking at these specs, I highly doubt that. No intention to switch from my 1080 so far, especially with all the cards being shown as 3-slot so far; neither my Define C nor my Node 202 case is even fit for hot GPUs. Upgrade after this gen? That's more likely.
#15
coonbro
bug said:
Because you know exactly how much he paid for it :wtf:
But yes, I've had Nvidia ever since my 6600GT, yet the 960 was one card I skipped without thinking twice. My 660Ti carried me all the way to my current 1060.


They won't be cheap, because they have no competition. But pre-release prices are always inflated.
"I've had Nvidia ever since my 6600GT, yet the 960 was one card I skipped without thinking twice"

I still run a build with a 6600 AGP x8, and as far as the 900 card goes, I grabbed one because it was the last of the cards that offered me full support for all my needs, not having to buy new monitors or junk well-working software because it was not Win-10 only, etc.

This 900 card is plug and play with all I run and use. A 10-series or newer can't say or do that at all, and funny, you still pay as much or even more for it to limit you and your needs / can't do as much [I guess they know fools and their money are soon parted]. Ya, computer building has become a money-grab joke. I'm pretty well done with it.
#16
bug
Raendor said:
"The company's strategy appears to be to establish 40-100% performance gains over the previous generation"

Lol, looking at these specs, I highly doubt that. No intention to switch from my 1080 so far, especially with all the cards being shown as 3-slot so far; neither my Define C nor my Node 202 case is even fit for hot GPUs. Upgrade after this gen? That's more likely.
I'm also not sure where that came from. Upgrading from one generation to the next never made much sense, regardless of the generation or manufacturer (doesn't make sense for smartphones, cameras or cars either). Upgrading after two generations makes more sense. And if you're not gaming much, you can keep your card for three generations or more.

coonbro said:
"I've had Nvidia ever since my 6600GT, yet the 960 was one card I skipped without thinking twice"

I still run a build with a 6600 AGP x8, and as far as the 900 card goes, I grabbed one because it was the last of the cards that offered me full support for all my needs, not having to buy new monitors or junk well-working software because it was not Win-10 only, etc.

This 900 card is plug and play with all I run and use. A 10-series or newer can't say or do that at all, and funny, you still pay as much or even more for it to limit you and your needs [I guess they know fools and their money are soon parted]. Ya, computer building has become a money-grab joke. I'm pretty well done with it.
Of course you bought what you needed at the time; you don't need to justify that. It's the people who give advice without knowing you or your needs that are hilarious around here.
In my case, I skipped the 760 because it was only like 10% faster than my 660Ti and then the 960 was 10% faster than the 760. That would have never allowed me to play with more AA, let alone at higher resolutions. Going from the 660Ti to the 1060, however, meant that a mostly maxed out Witcher 3 no longer looked like a slideshow ;)
#17
Durvelle27
bug said:
I'm also not sure where that came from. Upgrading from one generation to the next never made much sense, regardless of the generation or manufacturer (doesn't make sense for smartphones, cameras or cars either). Upgrading after two generations makes more sense. And if you're not gaming much, you can keep your card for three generations or more.


Of course you bought what you needed at the time; you don't need to justify that. It's the people who give advice without knowing you or your needs that are hilarious around here.
In my case, I skipped the 760 because it was only like 10% faster than my 660Ti and then the 960 was 10% faster than the 760. That would have never allowed me to play with more AA, let alone at higher resolutions. Going from the 660Ti to the 1060, however, meant that a mostly maxed out Witcher 3 no longer looked like a slideshow ;)
I don't even know how the GTX 960 performs. I just got it but haven't had time to test it out yet. I know it won't be on the level of my last card, which was an RX 480 8GB.
#18
coonbro
bug said:
I'm also not sure where that came from. Upgrading from one generation to the next never made much sense, regardless of the generation or manufacturer (doesn't make sense for smartphones, cameras or cars either). Upgrading after two generations makes more sense. And if you're not gaming much, you can keep your card for three generations or more.


Of course you bought what you needed at the time; you don't need to justify that. It's the people who give advice without knowing you or your needs that are hilarious around here.
In my case, I skipped the 760 because it was only like 10% faster than my 660Ti and then the 960 was 10% faster than the 760. That would have never allowed me to play with more AA, let alone at higher resolutions. Going from the 660Ti to the 1060, however, meant that a mostly maxed out Witcher 3 no longer looked like a slideshow ;)
Ya, I had to get off AMD cards due to the lack of support in each new driver; got to where, after like 12.6, things did not work. 13.12 was the last I used [7--- series].

But you know, it's like: I buy a 500-buck card that does it all, like say a 900 or older series (XP, analog, etc.), and now they think I'm going to pay 500 bucks for a card that can't? They've got to be nuts thinking I'm spending cash on that deal. But then I guess you do get 20-way RGB lighting with the latest, which is well worth the price in itself, right?
#19
Darmok N Jalad
Seems like a big spec disparity between the regular and Ti editions. Almost seems like they should be different model numbers instead.

coonbro said:
lol... "company could launch the GeForce RTX 2080 and the GeForce RTX 2080 Ti simultaneously, to convince GeForce 'Pascal' users to upgrade." I'm a Maxwell user; when it decides to pop, spark, and blow smoke, then I may be convinced to "upgrade", and even then I'd have to think long and hard about whether it's even worth it anymore with today's RGB junk. My PC building enthusiasm is at an all-time low with the state of today's software (limited support) and today's hardware (gimmicked up, and also limited support). Then add that malware service called Win-10 on top of all that. Think I'll pass.
Yes, and consider that the console market is still the main target of most new games. You have to make sure your game will run well enough on the original Xbox One. Yes, much can be done to scale a game up to high-detail 4K or whatever, but the base settings still need to come in pretty low. Even then, the most popular configuration on Steam is a quad-core CPU, 8 GB RAM, and a midrange GPU running 1080p. That's practically a $400 PS4 Pro. Developers just can't target $900 GPUs and hope to sell in volume to recover the millions of dollars invested in a title.
#20
Vya Domus
Looking at that 2080+ I can't help but think that Nvidia shifted the product stack yet again in terms of relative performance.

1070 replaced by 2080
1070ti replaced by 2080+
1080 replaced by 2080ti
#21
bug
My rule is simple: it has to be under $300 (under $250 is best) and significantly better than what I already own.
Till recently it had to be Nvidia because AMD sucked on Linux. Now AMD is an option for Linux, but their current lineup sucks. So...

Vya Domus said:
Looking at that 2080+ I can't help but think that Nvidia shifted the product stack yet again in terms of relative performance.

1070 replaced by 2080
1070ti replaced by 2080+
1080 replaced by 2080ti
Yes, you can totally get all that from a + :kookoo:
#22
coonbro
bug said:
My rule is simple: it has to be under $300 (under $250 is best) and significantly better than what I already own.
Till recently it had to be Nvidia because AMD sucked on Linux. Now AMD is an option for Linux, but their current lineup sucks. So...


Yes, you can totally get all that from a + :kookoo:
Used to be AMD was the go-to for Linux; then they got funny, and now, like you say, Nvidia is the better choice.

I had great luck with my mid-priced / mid-range cards; then if it goes bad, you're out only $200 or so, not $500+. I look at it like a curve in the lines of cards: like my 7850 AMD card, I liked it and it did me a great job as far as a nice-looking display and just fine FPS, but the support was getting crap, like some games would not work or were no longer supported over 12.6, like I said above, and after I think like 14.xx I got black screening and whatnot I never got with the older drivers doing the same things. It's a shame it ended up that way. Not to say Nvidia is so great, but at least everything I do works under it, with one good solid driver I use [and I don't use their latest ones either; 355.82 on this 900 series today]. AMD got to where I had to use this driver to do this and that driver to do that. That got old quick.
#23
Vayra86
Durvelle27 said:
I'm just seeing how it plays out. I currently have a GTX 960, which is the only Nvidia card I've used in 5 years, but anything current is an upgrade over it.
Get a 1080 ;)
#24
Eric3988
Exciting news day for once; now all we need are some benchmarks. I am not planning on upgrading any time soon, but if I see gains of over 50% then it becomes a lot more tempting to do so.
#25
Durvelle27
Vayra86 said:
Get a 1080 ;)
if prices drop