Tuesday, December 25th 2018

NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

NVIDIA drew consumer ire for differentiating its GeForce GTX 1060 into two variants based on memory, the GTX 1060 3 GB and GTX 1060 6 GB, with the two also featuring different GPU core-configurations. The company plans to double down, or should we say triple down, on its sub-branding shenanigans with the upcoming GeForce RTX 2060. According to VideoCardz, citing a GIGABYTE leak about regulatory filings, NVIDIA could be carving out not two, but six variants of the RTX 2060!

There are at least two parameters that differentiate the six (that we know of, anyway): memory size and memory type. There are three memory sizes, 3 GB, 4 GB, and 6 GB. Each of the three memory sizes comes in two memory types: the latest GDDR6 and the older GDDR5. Based on the six RTX 2060 variants, GIGABYTE could launch up to thirty-nine SKUs. When you add up similar SKU counts from NVIDIA's other AIC partners, there could be upward of 300 RTX 2060 graphics card models to choose from. It won't surprise us if, in addition to memory size and type, GPU core-configurations also vary between the six RTX 2060 variants, compounding consumer confusion. The 12 nm "TU106" silicon already has "A" and "non-A" ASIC classes, so there could be as many as twelve new device IDs in all! The GeForce RTX 2060 is expected to debut in January 2019.
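The variant math is simple combinatorics; a quick sketch of where the six and twelve figures come from (the parameter lists are taken from the leak, while the assumption that each ASIC class needs its own device ID per variant is the article's speculation, not a confirmed fact):

```python
from itertools import product

# Parameters from the leak that differentiate the RTX 2060
memory_sizes = ["3 GB", "4 GB", "6 GB"]
memory_types = ["GDDR6", "GDDR5"]
asic_classes = ["A", "non-A"]  # existing classes of the "TU106" silicon

variants = list(product(memory_sizes, memory_types))
print(len(variants))  # 6 memory-based variants

# If each variant also splits along ASIC class, every combination
# could carry its own device ID
device_ids = len(variants) * len(asic_classes)
print(device_ids)  # up to 12 device IDs
```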
Source: VideoCardz

230 Comments on NVIDIA GeForce RTX 2060 to Ship in Six Variants Based on Memory Size and Type

#101
EarthDog
Charcharo, post: 3966151, member: 162483"
The majority buys AAA games like Origins, Odyssey, Wolfenstein 2, DOOM, Witcher 3 and others just fine...

What I am doing as an enthusiast is enabling what Ultra used to mean 15-20 years ago. :) Nothing more, just a return to how things used to be. I do not think people have to do that since I think the "Ultra or nothing" mentality in PC Gaming is insane.

Those requirements do not reflect reality, so the reasoning behind them is not objectively correct, and it has not been for the past decade. Show me at least 2-3 times when it meant something in the last 5 years and I will give you some props. As it is, it is often just lies.
There are AAA titles that can bring a GPU down...

Good for you! I am glad you dig down and edit ini files. Just saying most people don't.

How about you show me where it doesn't give a general reference point to start from (minimum). I'm not the one pushing the rock uphill and in need of support for my assertions. The recs are a GUIDELINE, not a rule...but you are just saying they are lies, LOL.
Posted on Reply
#102
Charcharo
EarthDog, post: 3966161, member: 79836"
There are plenty of AAA titles that can bring a GPU to its knees.

Good for you! I am glad you dig down and edit ini files. Just saying most people don't.

How about you show me where it doesn't give a general reference point to start from (minimum). I'm not the one pushing the rock uphill and in need of support for my assertions. The recs are a GUIDELINE, not a rule.
And yet they sell by the millions and average gamers play them. Hell, my poor countrymen play these games on their low-end PCs without issues...

I know most people don't dig through the ini files, but as long as you agree that true Ultra is there, tis fine. We agree.

If you define things super generally, I guess you can stretch everything enough to win. My old ATI 5770 finished Witcher 3 at Low settings, 30 fps locked, 900p just fine. It was under the minimum requirements, had locked fps, and was not even using the lowest possible settings (resolution wasn't as low as it would go). i5 750 didn't bottleneck it. So obviously this page is a lie to me:
https://www.systemrequirementslab.com/cyri/requirements/the-witcher-3-wild-hunt/12446

Read the OpenGL part here:
https://www.hardocp.com/article/2014/05/21/wolfenstein_new_order_performance_review

Hell, for a recent example see this:


The R9 380 is much slower than the R9 290... and yet it runs well. We can ultimately do this for almost all games, but what matters is that the requirements are made up. I can give better requirements than the developers, and that is sad.
Posted on Reply
#103
EarthDog
Charcharo, post: 3966169, member: 162483"
And yet they sell by the millions and average gamers play them. Hell, my poor countrymen play these games on their low-end PCs without issues...
Because they meet the minimum requirements; otherwise, chances are the gaming experience is poor (not as the dev wants it).

Charcharo, post: 3966169, member: 162483"
I know most people don't dig through the ini files, but as long as you agree that true Ultra is there, tis fine. We agree.
With respect, I honestly don't care. This has nothing to do with anything here really.

Your examples seem like exactly what I have explained, are they not? I also don't believe there is a standard for minimum/recommended specs set by anyone, so it varies by dev. That said, I'm pretty sure 1080p is a given here, not less. I wouldn't call having to lower the resolution to hit 30 fps (which many feel is unplayable) meeting minimum specs.

The bottom line is they are there as a GUIDELINE, not a rule. There is some flexibility there, but the gaming experience may also suffer.

We'll agree to disagree.;)
Posted on Reply
#104
Charcharo
EarthDog, post: 3966184, member: 79836"
Because they meet the minimum requirements; otherwise, chances are the gaming experience is poor (not as the dev wants it).

With respect, I honestly don't care. This has nothing to do with anything here really.

Your examples seem like exactly what I have explained, are they not? I also don't believe there is a standard for minimum/recommended specs set by anyone, so it varies by dev. That said, I'm pretty sure 1080p is a given here, not less. I wouldn't call having to lower the resolution to hit 30 fps (which many feel is unplayable) meeting minimum specs.

The bottom line is they are there as a GUIDELINE, not a rule. There is some flexibility there, but the gaming experience may also suffer.

We'll agree to disagree.;)
I am actually shocked to see someone talking about system requirements in late 2018. I thought people stopped looking at these years ago and would never have guessed enthusiasts use them. No offence, this is just a shock to me, as I seriously have not seen this happen in a very long time (years). And no, ;p most don't meet the requirements.

I understand your argument about the ini file manipulation, but do know that when you say you run X game on Ultra I will say that you do not actually run it at Ultra.

So your idea of minimum is literally twice the minimum playable fps that the majority of gamers tolerate at a resolution that is much higher than the minimum supported by the actual application? Your idea of minimum is literally my idea of recommended requirements...

Words have meanings in languages. Minimum should be minimum, not some standard above what the majority of console gamers (which is like half of gaming) can achieve. And JayZ's R9 380 was doing a LOT better than that. An R9 290 can literally almost max the game at 60+ fps at 1080P. What kind of minimum is that???

I would prefer actual rules with meaning over things I can toss aside and laugh at, with API lies (as proven by The New Order) tossed in. Solid, dependable and correct rules with clear meanings and definitions. That is something I think all humans love.
Posted on Reply
#105
EarthDog
Charcharo, post: 3966196, member: 162483"
I will say that you do not actually run it at Ultra.
Ultra is what the devs say it is through their preset. Anything else is adding on top. Just because you go to fuel cutoff and past redline.... :p

Charcharo, post: 3966196, member: 162483"
your idea of minimum is literally twice the minimum playable fps that the majority of gamers tolerate at a resolution that is much higher than the minimum supported by the actual application? Your idea of minimum is literally my idea of recommended requirements...
Maybe? I just know that many generally wouldn't consider 30 fps an enjoyable gaming experience on PC. I'm part of that group.
Posted on Reply
#106
Charcharo
EarthDog, post: 3966203, member: 79836"
Ultra is what the devs say it is through their preset. Anything else is adding on top. Just because you go to fuel cutoff and past redline.... :p

Maybe? I just know that many generally wouldn't consider 30 fps an enjoyable gaming experience on PC. I'm part of that group.
So Nightmare settings in DOOM do not exist :P ? I mean I am all for Authorial intent, but there is an argument to be made for Death of the Author, especially when a decade ago things were more logical and Ultra really meant "as high as it would go before breaking". That makes sense.

Many and majority are not synonymous. I don't consider 30 fps playable either, but beggars can't be choosers, and if it's good enough for some rich Americans on their consoles, it's good enough for me.
Posted on Reply
#107
unikin
C'mon guys, anything below 30 fps is a disaster, anything below 20 fps unplayable. We are talking about a MIDRANGE GPU in 2019 here, not low-end stuff like the 560/1050, with performance somewhere between the 1070 and 1070 Ti given the same number of CUDA cores as the 1070 and slightly better IPC. So a 1440p/60+ fps capable GPU. Pairing it with 3 GB of RAM is a sin.
Posted on Reply
#108
ASOT
3 gigs or 3.5 gigs of VRAM, so much debate about it )))) For 1080p medium to high it is OK.

Let's hope and expect that it will come at a competitive price for us; AMD has just been rebranding with the crappy 590, and the 56 and 64 are sadly a joke.
Posted on Reply
#109
bug
Charcharo, post: 3966128, member: 162483"
Do not look at minimum and recommended system requirements. They are made up and make no sense at all. I do not even look at them these days and have not for over a decade.

The difference between Medium and Ultra settings... is overrated most of the time. Plus, most of the time, Ultra settings are not even true Ultra these days. Witcher 3's settings menu does not compare to what even a pleb like me can do in the ini file in 2 minutes without any real modding. That is real Ultra settings :) .

If you can afford an RX 560/ GTX 1050 then you are already quite a bit above the PS4 and Xbox One basic. PS4 Pro is about matched by 1050 Ti and Xbox One X always loses in games vs the GTX 1060 6 GB.

I do not like using different standards for different things. One standard and zero hypocrisy or the discussion is worthless.
You do, however, possess an uncanny ability to mix together all sorts of things barely related to the subject.

The simple truth is that games happen to be playable at settings other than max. Some will look better, some will look worse, depending on budget and who coded the game and made the assets. But I have never met a person who didn't play a game because they couldn't max out shadows or textures. I have met people that delayed playing a game because they were planning an upgrade and thought they'd make the best of it.
Posted on Reply
#110
M2B
EarthDog, post: 3966184, member: 79836"
Because they meet the minimum requirements; otherwise, chances are the gaming experience is poor (not as the dev wants it).

With respect, I honestly don't care. This has nothing to do with anything here really.

Your examples seem like exactly what I have explained, are they not? I also don't believe there is a standard for minimum/recommended specs set by anyone, so it varies by dev. That said, I'm pretty sure 1080p is a given here, not less. I wouldn't call having to lower the resolution to hit 30 fps (which many feel is unplayable) meeting minimum specs.

The bottom line is they are there as a GUIDELINE, not a rule. There is some flexibility there, but the gaming experience may also suffer.

We'll agree to disagree.;)
They don't really think 30 FPS is unplayable; they think that if they say 30 FPS is unplayable, they'll look cool.
More than 120 million people in the world are currently using consoles and enjoying their games at 30 FPS.
Don't get me wrong, 30 FPS is not ideal, and it's far less enjoyable than 60 FPS or more, but it's not unplayable by any means.
I'll proudly choose that "unplayable" 30FPS Red Dead Redemption 2 over 95 percent of the games at 60FPS+.
Posted on Reply
#111
EarthDog
Charcharo, post: 3966207, member: 162483"
Many and majority are not synonymous. I don't consider 30 fps playable either, but beggars can't be choosers, and if it's good enough for some rich Americans on their consoles, it's good enough for me.
Oh, I'd bet good money a majority wouldn't consider 30 fps to be playable in most genres/titles. RTS, I can do... FPS...I'd cry and likely get a headache...

I'm kind of using playable and enjoyable interchangeably. I mean... 15 fps is playable, the game plays... but as for the experience being enjoyable, the majority tend to agree 30 fps isn't it for PCs.

But....this is all a bit OT. I'd love a thread to get down to the bottom of why 30 fps seems different on a console versus a PC (I know why movies can get away with it).
Posted on Reply
#112
lexluthermiester
CandymanGR, post: 3965894, member: 167652"
I dont trust a single word from people who say that what they bought is the best.
So you don't trust actual usage and experience? Sounds like flawed logic. But hey, do carry on..
CandymanGR, post: 3965894, member: 167652"
And also, 3gb are NOT enough.
Sure they are when the settings are properly configured.
CandymanGR, post: 3965894, member: 167652"
You are biased as hell.
The word you're looking for is experienced. I own a PC shop and we build every kind of system for every budget, from bleeding-edge gaming to economy-minded gaming, from Ryzen or Intel to Radeon and GeForce. A 2060 3GB will be bare minimum but still doable as a gaming card, just like the Radeon 4GB cards are doable for 1080p gaming. Calling me biased only shows your ignorance. Good luck with that.
B-Real, post: 3965910, member: 170068"
Of course you have to defend that you have been milked in a crazy way.
Interesting perspective. Another is that I'm sharing actual experience and that it is positive despite the price increase.
B-Real, post: 3965910, member: 170068"
Those who try to defend this piece of crap can't understand our concern is not its performance
While I haven't used a 2060 yet, I have built systems with 2080 Tis, 2080s (and own one), 2070s, Vega 64s, Vega 56s, RX 580s and so on. Each has its pros and cons. Calling one or the other "crap" is so baselessly nonobjective as to effectively sound like drivel. The rest of your points only matter to people looking for reasons to whine and nitpick.
B-Real, post: 3965910, member: 170068"
This is the worst price-performance GPU family ever.
That's an opinion, and not a very good one. Every RTX card I've installed/used kicks the ever-living snot out of its GTX 10-series/Titan counterpart, to say nothing of Radeon cards. And that's before RTRT is factored into the equation. The only people whining about the price are the people who can't afford them or have to wait an extra bit of time to save money for one. Everyone else is buying them up, which is why they are consistently still selling out, months after release.
Posted on Reply
#113
moproblems99
Note: I didn't read the comments, skimmed the article, and didn't see a reference to tensor cores. BUUUUTTTT, if this has any number of tensor cores, I want one. It will never see a 3D scenario, so I can't care how it performs in games.

EarthDog, post: 3966218, member: 79836"
But....this is all a bit OT. I'd love a thread to get down to the bottom of why 30 fps seems different on a console versus a PC (I know why movies can get away with it).
Doesn't it have to do with the fact that they (movies) run at a constant value?

lexluthermiester, post: 3966219, member: 134537"
I own a PC show
Really, what channel and time slot? :laugh:
Posted on Reply
#114
lexluthermiester
EarthDog, post: 3965934, member: 79836"
I'd have to imagine youd be in a minority saying AA isnt needed at 1080p.
Not based on the poll that was done a few months ago here on TPU. Based on that poll, most people tinker with their settings and turn AA down. Then there's Steam's own stats, which show most people turn AA down or off, most of them running at 1080p. Pixel density at 1080p, 1440p and up mostly eliminates the need for AA, as the "pixel laddering" effect isn't as noticeable or pronounced as it was at lower resolutions. Try it yourself. Turn it off and see if it matters that much to you. Be objective though.

moproblems99, post: 3966223, member: 155919"
Really, what channel and time slot? :laugh:
LOL! typo corrected..
Posted on Reply
#115
moproblems99
lexluthermiester, post: 3966224, member: 134537"
Not based on the poll that was done a few months ago here on TPU. Based on that poll, most people tinker with their settings and turn AA down. Then there's Steam's own stats that show most people turn AA down or off, most of them running at 1080p.
I would assume most people that turn AA down on Steam are playing Insurgency or CS:GO.

EDIT:

For me, I don't turn anything down as long as I am over 60fps (75 now while I have this monitor). I can't stand no AA. It looks like crap. Although, still need to try rendering above my native and downscaling to see if visual vs perf is better/worse.
Posted on Reply
#116
EarthDog
lexluthermiester, post: 3966224, member: 134537"
Not based on the poll that was done a few months ago here on TPU. Based on that poll, most people tinker with their settings and turn AA down. Then there's Steam's own stats, which show most people turn AA down or off, most of them running at 1080p. Pixel density at 1080p, 1440p and up mostly eliminates the need for AA, as the "pixel laddering" effect isn't as noticeable or pronounced as it was at lower resolutions. Try it yourself. Turn it off and see if it matters that much to you. Be objective though.


LOL! typo corrected..
links plz... :)

I recall that poll and walked away with a somewhat different takeaway.

Yeah... let's be clear here. Nobody said max AA... but you said "AA off" and that "it wasn't needed at 1080p and higher". We concluded from that poll that an overwhelming majority used AA, be it maxed or somewhere in between. A single-digit % turned it off, while 47% said it depends on game settings, which could mean on or off.

But the reality was for most that they use it when they can.
Posted on Reply
#117
lexluthermiester
moproblems99, post: 3966228, member: 155919"
I would assume most people that turn AA down on Steam are playing Insurgency or CS:GO.
That would be a big assumption. I personally doubt it, but who knows..
moproblems99, post: 3966228, member: 155919"
Although, still need to try rendering above my native and downscaling to see if visual vs perf is better/worse.
Oh turn that on and leave it on. Then turn your AA down or off. You'll like the steady framerates much better.
EarthDog, post: 3966229, member: 79836"
links plz... :)
https://store.steampowered.com/stats/
https://www.techpowerup.com/forums/threads/what-are-your-gpu-settings-for-running-games-and-benchmarks.250063/
Posted on Reply
#118
EarthDog
See edit above for your poll.

I'm mobile and can't dig down on the Steam stats link... feeling saucy enough to post an image of it?
Posted on Reply
#119
CandymanGR
lexluthermiester, post: 3966219, member: 134537"
So you don't trust actual usage and experience? Sounds like flawed logic. But hey, do carry on..

Sure it is when the settings are properly configured.

The word your looking for is experienced. I own a PC shop and we build every kind of system for every budget from bleeding edge gaming to economy minded gaming. From Ryzen or Intel to Radeon and Geforce. A 2060 3GB will be bare minimum but still doable as a gaming card, just like the Radeon 4GB cards are doable for 1080p gaming. Calling me biased only shows your ignorance. Good luck with that.
So now mentioning our past tech experience gives credibility to what we're saying? That's flawed logic, not mine.
Should I start mentioning how many systems I have built in the 25 years I have worked as an IT specialist? Should I? Really?
I don't trust people who buy something which obviously has some flaws and then defend it like there is no tomorrow. That translates a bit as "butthurt" to me.

3 GB of VRAM is NOT enough for MANY games. Not all, but many (AAA titles mostly). I can name a few; I already DID. I've said that two times already. Those are the games we're buying GPUs for, not lightweight games. You don't agree on that?

Do you think what I am saying is coming out of my a**? You think I haven't run benchmarks myself to see what's what? Of course, with the "right settings" the VRAM requirement could go below 3 GB, but that's not the point, because for the games I am speaking of, going below 3 GB of VRAM usage usually also means going for medium/low settings. GTA V, for example, takes about 3.5 GB for medium/high settings at 1080p, NOT even Ultra. Now, if you start editing .ini files, then we're talking about a very customized experience, and that's not normal for the average user. And still, that method cannot change the performance hits or gains beyond the capabilities of the GPU. It is just a more customized method: you can sacrifice quality over speed (and vice versa) exactly as you want, because sometimes the in-game settings do not satisfy all tastes. That's all. But the performance of a GPU with 3 GB of VRAM will be what it is.
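For what it's worth, the render targets themselves are only a small slice of that 3.5 GB. A rough back-of-the-envelope sketch (the twelve-buffer count is an illustrative assumption; in practice it is textures, geometry and other assets that fill multiple gigabytes):

```python
def buffer_mib(width, height, bytes_per_pixel=4):
    """Size of one RGBA8 render target in MiB."""
    return width * height * bytes_per_pixel / 2**20

# A single 1080p buffer is ~8 MiB; even a dozen G-buffer/post-process
# targets stay under ~100 MiB, so multi-GB VRAM usage at 1080p comes
# almost entirely from assets rather than the framebuffers themselves.
print(round(buffer_mib(1920, 1080), 1))    # 7.9
print(round(12 * buffer_mib(1920, 1080)))  # 95
```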

You say all that about your past experience, yet you are ready to defend a NEW GPU for 2019 with 3 GB of VRAM. And you call me ignorant!!!

P.S. And please don't start with NVIDIA's "magic" compression.
Posted on Reply
#120
lexluthermiester
CandymanGR, post: 3966248, member: 167652"
So now mentioning our past tech experience gives credibility to what we're saying? That's flawed logic, not mine.
Should I start mentioning how many systems I have built in the 25 years I have worked as an IT specialist? Should I? Really?
No, this isn't a contest.
CandymanGR, post: 3966248, member: 167652"
Do you think what i am saying is coming out of my a**?
Yes? Mostly because you're not taking into account most real-world usage scenarios. For 1080p, a 2060 with 3GB will work and perform very well in most games out today. Very few AAA titles cannot be made to run well on such a card. How do I know this, you ask? Because it can be done with a 1060 with 3GB. A 2060 is a better-performing card so, naturally, what a 1060 can do a 2060 will do better. Simple deductive reasoning is all that is required to arrive at that conclusion.
CandymanGR, post: 3966248, member: 167652"
for medium/high settings at 1080p, NOT even Ultra.
Most people tinker with their settings so the "medium/high" argument is irrelevant as it ends up being customized.
CandymanGR, post: 3966248, member: 167652"
And you call me ignorant!!!!!!!!!!
Though you took it out of context, that's what I said. And your statements above continue to lend merit to that conclusion.
Posted on Reply
#121
CandymanGR
lexluthermiester, post: 3966251, member: 134537"
No, this isn't a contest.
Then why did you start it?

lexluthermiester, post: 3966251, member: 134537"
Yes? Mostly because you're not taking into account most real-world usage scenarios. For 1080p, a 2060 with 3GB will work and perform very well in most games out today.
Yes, and a 950 with 2 GB of VRAM will also work and perform "well" in most games. But that's NOT the point. We're NOT talking about cards from 2016 (that's how old the 1060 is, in case you don't remember). Also, the VRAM requirements of games keep getting higher. You just cannot accept it. And I can tell you games that look like shit with custom settings for less than 3 GB of VRAM usage. Like Far Cry 4 or Shadow of Mordor.

lexluthermiester, post: 3966251, member: 134537"
Very few AAA titles cannot be made to run well on such a card. How do I know this, you ask? Because it can be done with a 1060 with 3GB. A 2060 is a better-performing card so, naturally, what a 1060 can do a 2060 will do better. Simple deductive reasoning is all that is required to arrive at that conclusion.
On the contrary, a faster core with memory as fast as a 2016 card's (I am referring to the GDDR5 versions) might starve for data sooner and suffer from memory-bus bottlenecks, especially in VRAM-hungry games! But obviously you know how the card will perform without even seeing it first.

lexluthermiester, post: 3966251, member: 134537"
Most people tinker with their settings so the "medium/high" argument is irrelevant as it ends up being customized.
Is that even an argument? Obviously most people customise their settings. So? We need points of origin in order to discuss this; otherwise we can say "yeah, you can customize X game to run with even 2 GB of VRAM usage". That's not the point! You're avoiding the point systematically.

Edit: Oh, and one more thing, by the way. My English is not great, but as far as I remember, calling someone "biased" is not an insult, while calling someone "ignorant" actually is. Especially in the way you've used it.
Posted on Reply
#122
lexluthermiester
CandymanGR, post: 3966257, member: 167652"
Then why did you start it?
:wtf::kookoo:
CandymanGR, post: 3966257, member: 167652"
Yes, and a 950 with 2 GB of VRAM will also work and perform "well" in most games.
No, it wouldn't.
CandymanGR, post: 3966257, member: 167652"
But that's NOT the point.
Sure it is. Gaming performance is exactly the point here.
CandymanGR, post: 3966257, member: 167652"
We're NOT talking about cards from 2016 (that's how old the 1060 is, in case you don't remember).
Oh gee wiz, thanks for reminding me...:rolleyes:
CandymanGR, post: 3966257, member: 167652"
And I can tell you games that look like shit with custom settings for less than 3 GB of VRAM usage.
That's your opinion and a completely subjective one. You're welcome to it.
CandymanGR, post: 3966257, member: 167652"
But obviously you know how the card will perform without even seeing it first.
Sure can; here's my premise for the logic. I had a 1080 and upgraded to a 2080. The performance jump was significant. I have a 1070 in one of my other PCs, and it is known that the 2070 is a big jump in performance. It doesn't take much for a person to conclude that the 2060 will beat out a 1060. Therefore, very naturally, it is easy to conclude that anything a 1060 can do, a 2060 will do much better. I don't need to see it to be able to accurately conclude the general performance of such a card.

CandymanGR, post: 3966257, member: 167652"
but as far as I remember, calling someone "biased" is not an insult
Depends on how you use it, but I digress..
Posted on Reply
#123
CandymanGR
lexluthermiester, post: 3966265, member: 134537"
:wtf::kookoo:

lexluthermiester, post: 3966265, member: 134537"
No, it wouldn't.
Yes it would.

lexluthermiester, post: 3966265, member: 134537"
Sure it is. Gaming performance is exactly the point here.
We were not talking about performance in general; we were talking about whether 3 GB of VRAM is enough or not.

lexluthermiester, post: 3966265, member: 134537"
Oh gee wiz, thanks for reminding me...:rolleyes:
When arguments end, irony starts.

lexluthermiester, post: 3966265, member: 134537"
That's your opinion and a completely subjective one. You're welcome to it.
And yours also. Stop presenting your subjective opinion as fact.

lexluthermiester, post: 3966265, member: 134537"
Sure can, here's my premise for logic. I had a 1080 and upgraded to a 2080. Performance jump was significant. I have a 1070 in one of my other PCs and it is known that the 2070 is a big jump in performance. It doesn't take much for a person to conclude that the 2060 will beat out a 1060. Therefore, very naturally, it is easy to conclude that anything a 1060 can do a 2060 will do much better. Don't need to see it to be able to accurately conclude the general performance of such a card.
So from the point of discussion, which was whether 3 GB of VRAM is enough, you've reached the conclusion that "the next-gen card will be faster than the previous gen". No shit, Sherlock!? What a great discovery you've made. Of course it will be!

But I am speaking of a specific area in terms of performance. How do you know, for example, whether the bus width for the GDDR5 models is the same or not, and therefore whether the bandwidth is similar to the previous gen or not? How do you know the GDDR5 models won't take a performance hit because of bandwidth? How do you know, especially, that the 3 GB VRAM models will have enough data feed to the GPU? Your deductive logic is just as flawed as the rest of your arguments.
This is exactly what I mean: you are avoiding the point systematically.
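On the bandwidth point, peak memory bandwidth is just bus width times per-pin data rate. A quick sketch of what the memory-type split could mean (the 192-bit bus is an assumption carried over from the GTX 1060 6 GB, and the per-pin rates are typical GDDR5/GDDR6 speeds, not confirmed RTX 2060 specs):

```python
def memory_bandwidth_gbs(bus_width_bits, data_rate_gbps_per_pin):
    """Peak bandwidth in GB/s: bus width (bits) x per-pin rate (Gb/s) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

# Assumed 192-bit bus; 8 Gb/s GDDR5 vs 14 Gb/s GDDR6
print(memory_bandwidth_gbs(192, 8))   # 192.0 GB/s for a GDDR5 variant
print(memory_bandwidth_gbs(192, 14))  # 336.0 GB/s for a GDDR6 variant
```

By that arithmetic, a GDDR5 variant on the same bus would give up over 40 percent of the GDDR6 variant's bandwidth, which is exactly why the memory type matters as much as the size.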
Posted on Reply
#124
lexluthermiester
@CandymanGR
You're nitpicking and no longer offering merit-based arguments. At this point it's obviously about ego for you, so I'm out.
Posted on Reply
#125
Xzibit
EarthDog, post: 3966235, member: 79836"
See edit above for your poll.

I'm mobile and can't dig down on the Steam stats link... feeling saucy enough to post an image of it?
I was curious too as to why that reference and if SHS added it. I did not find any.
Posted on Reply