
Are we at pretty much the max needed for high end graphics cards in 2013/2014?

Discussion in 'Graphics Cards' started by vawrvawerawe, Jun 18, 2014.

Poll: Which side is the PS4?

  1. The left side: 40.7%
  2. The right side: 59.3%
Thread Status:
Not open for further replies.
  1. rtwjunkie

    rtwjunkie

    Joined:
    Jul 25, 2008
    Messages:
    2,618 (1.07/day)
    Thanks Received:
    1,497
    Location:
    Louisiana
    Did I miss it somewhere? I thought the OP was going to tell us which picture was which PlayStation?
     
  2. yogurt_21

    yogurt_21

    Joined:
    Feb 18, 2006
    Messages:
    4,466 (1.34/day)
    Thanks Received:
    599
    Location:
    AZ
    Funny, I thought the consensus was that any game that's playable on both will likely show very little variance from the older one, devs being budget-minded and all.

    Think about it: PlayStations have always been nice about being backwards compatible. Pop Tony Hawk into your PS2, then pop it into your PS3... see a difference? No? "OMG conspiracy, PS3 suxxors, you should have stuck with PS2!"

    That's pretty much what this thread is. A game built on an old engine so as to be compatible with both older-gen consoles and newer ones isn't very likely to take much advantage of the newer gen's GPU horsepower. The games that do take advantage of it aren't likely compatible with the older gen to test. So we get this nice little waste of time.
     
  3. newconroer

    newconroer

    Joined:
    Jun 20, 2007
    Messages:
    3,325 (1.17/day)
    Thanks Received:
    376
    At 1600p, two R9 290s cannot fully max Crysis at 60 FPS. So maybe a better question would have been whether the hardware horsepower in modern GPUs is more than enough, but the APIs and architectures could be more efficient and waste less horsepower?
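    To put a rough number on that idea, here's a toy model, not a benchmark: assume each draw call costs a fixed slice of CPU/driver time (the 5 µs per-call figure below is an assumption for illustration, not a measured value for any API) and see how much of a 60 FPS frame budget disappears before the GPU does any real work.

    ```python
    # Toy model: share of a 16.7 ms frame budget eaten by per-draw-call
    # API/driver overhead. The per-call cost is an assumed figure for
    # illustration, not a measurement of any specific API.

    FRAME_BUDGET_MS = 1000.0 / 60.0      # 60 FPS target
    CALL_OVERHEAD_MS = 0.005             # assumed ~5 us of CPU time per draw call

    for draw_calls in (500, 2000, 5000):
        overhead = draw_calls * CALL_OVERHEAD_MS
        print(f"{draw_calls:5d} draw calls -> {overhead:5.1f} ms "
              f"({overhead / FRAME_BUDGET_MS:.0%} of the frame budget)")
    ```

    Under those assumptions, 5000 draw calls blow the entire frame budget on overhead alone, which is the "wasted horsepower" argument in a nutshell.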
     
  4. Champ

    Champ

    Joined:
    Jun 28, 2008
    Messages:
    981 (0.40/day)
    Thanks Received:
    97
    Location:
    Greenville, NC
    Interesting. We carry on about 4K, and yet our more ordinary resolutions (1600p isn't exactly normal, I think) aren't being maxed yet.
     
  5. Champ

    Champ

    Joined:
    Jun 28, 2008
    Messages:
    981 (0.40/day)
    Thanks Received:
    97
    Location:
    Greenville, NC
  6. dr0thegreatest

    dr0thegreatest

    Joined:
    Jul 1, 2014
    Messages:
    37 (0.14/day)
    Thanks Received:
    6
    Location:
    Toronto and Dubai
    I don't even think you would need to upgrade the GPU; two 290Xs are pretty darn good at 4K, and if he's got another slot, throw another 290 into the mix and it should be very good for 4K, unless some seriously demanding 4K game comes out, which is obviously going to happen.
     
    Champ says thanks.
  7. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,747 (2.55/day)
    Thanks Received:
    1,465
    Who needs consoles again?

    [Attached image: Untitled.jpg]

    ne6togadno says thanks.
  8. D007

    D007

    Joined:
    Mar 7, 2007
    Messages:
    3,349 (1.14/day)
    Thanks Received:
    481
    Location:
    Pompano beach, Florida
    And then 4K came out, and everything you have doesn't cut it anymore lol..
    On a big-screen TV, 4K looks amazing.
    It makes 1080p look like 720p.
    Once you look at them, there is no going back lol.

    Comparing performance vs. quality is also impossible in still pictures.
    60 fps vs. 30 fps is a big difference.
    PC will outperform, unquestionably, especially at 4K.

    Desktop color and control panel settings will also make the picture look different.
     
    Last edited: Aug 28, 2014
  9. XL-R8R

    XL-R8R

    Joined:
    Nov 12, 2012
    Messages:
    322 (0.37/day)
    Thanks Received:
    134
    Location:
    Technical Tittery....
    It appears wmwmmwmwmawavewmw (or whatever that crap is, anyway) has once again bailed from his own thread, which he apparently started with the sole purpose of trolling, starting in-fighting between members, or just being pointless.
     
  10. lilhasselhoffer

    lilhasselhoffer

    Joined:
    Apr 2, 2011
    Messages:
    1,703 (1.17/day)
    Thanks Received:
    1,066
    Location:
    East Coast, USA
    I'm having a problem here that doesn't seem to be resolved yet in this thread.

    I want 4K. I want it because everyone wants to max out everything and have the best stuff. That's why this forum exists.


    My conundrum is that, despite wanting 4K, I know that it's stupid. Before I get called out for flaming, or start some sort of debate, allow me to explain myself.

    Consumers currently have three options for content delivery. They can stream, buy a DVD, or buy a Blu-ray. Gaming falls into either the streaming (Steam, Origin, GOG, etc.) or the DVD category. Neither of these content delivery systems has a resolution limit, because the computer generates images on the fly. You could theoretically compress a program onto a DVD and have it drive a thousand monitors once activated, assuming the hardware existed to do so.

    Alternatively, you've got movies and television, which can utilize all content delivery streams. Streaming allows instant content access, if limited by internet speeds. DVDs are great, assuming you can compress the crap out of them and get your two hours of video. Blu-ray is better, because the information storage capacity is much higher. The problem for the last two standards is that neither of them is dynamic. Stored pictures, even when compressed very well, are only that. No piece of hardware can truly improve their fidelity, even if interpolation can make the images appear less pixelated once expanded.


    So the reason 4K is stupid is simple: it hasn't been adopted by one of the two largest content industries, and it's only just being introduced in the other. Movies and television don't exist in the realm of 4K. If the movie industry doesn't support the hardware with content, nobody buys it. If the hardware doesn't exist, there's no content created for it. No content and no hardware means that, despite its advantages, 4K isn't worth adopting for the mainstream consumer right now.



    So looking at this all, my problem is simple. Why concern yourself with 4K and the cards you need to run it today? If you've actually got the money for a 4K monitor, the graphics cards to run it probably aren't a big cost to you. If you're trying to buy cards that are future-proof enough to run 4K, then you're committing to GPUs for a substantial chunk of time. No installed base now means 4K won't reach reasonable pricing for at least several more years; a content pool drives consumer buying, which is what drives prices down. You're asking whether cards shipping right now can drive a theoretical monitor in the distant future. If that's really your intent, ask how many people are still running a 4xxx series Radeon GPU or a 2xx series Nvidia GPU. Those cards came out at about the time that 720p was the standard in media, with 1080p still on the horizon. The only difference is that the pace of hardware improvement has slowed dramatically since then.
     
    AsRock says thanks.
  11. D007

    D007

    Joined:
    Mar 7, 2007
    Messages:
    3,349 (1.14/day)
    Thanks Received:
    481
    Location:
    Pompano beach, Florida
    Not trying to bash ya, Hassle, but you must not know the current state of 4K gaming if this is what you think.
    4K gaming is alive and well.
    Even on single-GPU systems like mine. I run 4K playing Mass Effect 1, 2 and 3 and it runs very nicely.
    Even with one GPU. 3840x2160.

    Also, 4K is extremely reasonable in pricing now.
    My 50" Samsung 4K LCD TV was only $1,500 US.
    That's exactly what I paid for my old 50" 1080p Samsung.
    The price has dropped insanely over the last year.

    Not only that, but watching my Blu-rays has never been more vivid, due to upscaling.
    The difference is immediately noticeable.
    Like looking at a 720p TV next to a 1080p TV.

    Word of advice though: get a monitor or TV with DisplayPort 1.2.
    HDMI support is lacking and only works with the HDMI 1.4 workaround by Nvidia, or Eyefinity's similar workaround.
    Currently SLI is not supported, but it should be very, very soon.
    Word directly from Nvidia seems like it should be within a month.
    Quote: "Our next driver release."
    Which should come out very soon. They usually release a new driver monthly, and we are due for a new one any day now.

    You're not entirely wrong though, it has been a pain in the ass.
    My topic here about it: http://www.techpowerup.com/forums/t...dvertising-and-sli.204582/page-2#post-3155731
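    For anyone wondering why HDMI 1.4 needs a workaround at all, the link bandwidth math tells the story. A rough sketch (blanking overhead is simplified to an assumed 10%, and the effective link rates are the commonly quoted post-8b/10b figures):

    ```python
    # Rough link-bandwidth check for 4K @ 60 Hz, 8-bit RGB. Blanking overhead
    # is an assumed ~10%; effective rates are the usual post-encoding figures.

    width, height, refresh, bpp = 3840, 2160, 60, 24
    BLANKING = 1.10  # assumed timing overhead

    needed_gbps = width * height * refresh * bpp * BLANKING / 1e9  # ~13.1 Gbps

    links = {
        "HDMI 1.4 (~8.16 Gbps effective)": 8.16,
        "DP 1.2 HBR2 (~17.28 Gbps effective)": 17.28,
        "HDMI 2.0 (~14.4 Gbps effective)": 14.4,
    }

    print(f"4K60 RGB needs roughly {needed_gbps:.1f} Gbps")
    for name, rate in links.items():
        verdict = "fits" if rate >= needed_gbps else "too slow, hence the workarounds"
        print(f"  {name}: {verdict}")
    ```

    DP 1.2 clears the bar and HDMI 1.4 doesn't, which is why 4K over HDMI 1.4 tops out at 30 Hz without vendor tricks.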
     
    Last edited: Aug 28, 2014
  12. lilhasselhoffer

    lilhasselhoffer

    Joined:
    Apr 2, 2011
    Messages:
    1,703 (1.17/day)
    Thanks Received:
    1,066
    Location:
    East Coast, USA
    When I began reading I was getting ready to make the comment about your other thread.

    Now, the problem I have with this statement is simple: $1500 isn't a reasonable price for your average consumer. When you can get a 50" 1080p television for $300, and a "next gen" console for about half the cost of just a decent 4K monitor, you're not aiming at the mainstream.


    As far as Mass Effect, I call crap on that. ME 1 ran on the Xbox 360, and I got it running well at 1080p on a 3650 GPU. ME 2 and 3 might have required more chutzpah, but we're still only looking at an Xbox 360. If a modern card, which is functionally 3 or more generations improved upon the mid-range GPU in the 360, cannot run the game at 4K, it would be immensely surprising.


    Now, the next problem with your assumptions relates to interpolation. A 4K monitor isn't any clearer than a 1080p monitor with a 1080p signal. You're saying the image is clearer, but what you really mean is that the pixel density is high enough that everything looks less jagged. Interpolation is effectively just smearing enough vaseline on the screen that the lack of actual pixel differentiation is not noticeable. My 23" 1080p monitor has a higher pixel density than a 50" 4K monitor. Conflating the two is at best a disingenuous comparison.
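    That pixel-density comparison is plain arithmetic; a quick sketch for anyone who wants to check it:

    ```python
    # Pixel density (PPI) = diagonal resolution in pixels / diagonal in inches.
    import math

    def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
        return math.hypot(width_px, height_px) / diagonal_in

    print(f'23" 1080p: {ppi(1920, 1080, 23):.1f} PPI')  # ~95.8
    print(f'50" 4K:    {ppi(3840, 2160, 50):.1f} PPI')  # ~88.1
    ```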


    Finally, everything else wrong with what you are saying: the average consumer couldn't tell you the difference between 1080i and 1080p. The differences between the various HDMI standards, what the heck DP is, and how to even begin getting the correct drivers and workarounds are well beyond their capacity. For a moment, consider that the average consumer is your grandmother (assuming that old a relation still exists for you). They want to buy cable x, plug it into device y, and connect to monitor/TV z. All of this complexity is a barrier to any regular person adopting the technology.



    I guess, put simply, you and I are not the center of the world. In all honesty, none of us on a tech forum discussing 4K represent the real world. No matter what you can do today, the relevant question is what grandma can do without a call to tech support. 4K isn't anywhere near easy enough for grandma, and thus isn't a pressing issue right now. Heck, if the current generation of consoles lasts 7 years, the first time we'll really be having this debate is in another 4 years, when the new console rumors start to swirl and a 46"+ 4K monitor costs less than $500. PC gaming is experiencing an upswing because of indie development, but indies don't work with the Frostbite engine.
     
  13. D007

    D007

    Joined:
    Mar 7, 2007
    Messages:
    3,349 (1.14/day)
    Thanks Received:
    481
    Location:
    Pompano beach, Florida
    Good luck waiting for that, if you want big-screen gaming.
    1080p TVs are still $600-ish in the 50" range.
    No one is going to wait 10 years for prices to become "reasonable," as you call it.
    I consider the prices reasonable currently.
    Not everyone can afford it, but such is life.
    Not everyone buys a Ferrari either.

    I should have elaborated, and that's my fault. I am running ME 1, 2, and 3 with 4K mods, like this one for ME1:
    (MEUITM) http://www.moddb.com/mods/mass-effect-1-new-texture-updatesimprovements-mod
    and ini tweaks for 4K shadows and all the extras.
    My game is NOTHING like your dumbed-down-graphics Xbox version lol..
    If an Xbox tried to play what I am playing, it would shit itself and die in short order. ;)

    Your argument is based on what I consider tiny, 23" monitors. Go big or go home is how I see it.
    50" gaming ftw.
    Of course you won't see much, if anything, @ 23".
    @ 50", however, it is immensely better, so your argument is moot in that respect, if someone wants big-screen gaming like myself.
    I don't crunch myself into some desk chair and huddle in front of my monitor like Gollum and his precious.
    I sit back and lounge on my La-Z-Boy, in comfort, 6 feet away, with full surround sound, baby!
    Go big or go home. :p

    You insult the intelligence of the average consumer, and you must think yourself immensely intelligent with the i vs. p thing..
    Everyone and their grandma knows p is better.
    Don't flatter yourself lol..
    There is no "workaround" you need to worry about at all, as you have implied.
    The workaround applies itself; you do nothing.
    4K works as is, no modifications necessary, and it is a huge visual improvement.

    You base your arguments about high-end gaming on people in retirement homes?
    That is beyond ridiculous, man. How many people in retirement homes do you know of that even care about 4K?
    How many do you know doing high-end gaming? lol
    How many do you know buying 4K setups?
    NONE.. They aren't even in the picture, not part of the demographic, not part of the sales pitch.
    4K and Nvidia's high-end GPUs do not market to 80-year-olds.
    They market to the "enthusiast" group, which is generally younger.
    That is common sense.
    Lmfao man.. come on, at least say things that seem possible. That's beyond nonsensical.

    Again, basing high-end gaming and super-GPU sales on people in retirement homes?
    Half of the people you are talking about will die with 1080p and be happy with it.
    I still know a lot of them using tube TVs, with no desire to upgrade whatsoever.
    They are not part of the demographic, and anyone knows that.

    In 4 years, 4K will NOT be $500 @ 50".
    Just as 1080p, more than 6 years after it was introduced (which is now), STILL is not $500 in the 50" range for high-end sets.
    You will wait forever and NEVER play on a big-screen 4K system.
    If that's what you want, then fine. But I'm pretty sure the rest of us will pay up if we want to, and rock out. :rockout:
     
  14. Champ

    Champ

    Joined:
    Jun 28, 2008
    Messages:
    981 (0.40/day)
    Thanks Received:
    97
    Location:
    Greenville, NC
    I bought my monitor now, planning for tomorrow. If you have the funds, that's what I recommend. By the looks of it, a 60 Hz 4K monitor bought now will last you a good 5 years before we can start maxing it, or before affordable single-card solutions come along. I'm about to run three 290s; just ordered two. I'm just riding along until the next big 4K breakthrough.
     
    D007 says thanks.
  15. andrewsmc

    andrewsmc

    Joined:
    Sep 15, 2008
    Messages:
    1,036 (0.43/day)
    Thanks Received:
    122
    Location:
    Pikeville NC
    Which one is the PS4?
     
  16. lilhasselhoffer

    lilhasselhoffer

    Joined:
    Apr 2, 2011
    Messages:
    1,703 (1.17/day)
    Thanks Received:
    1,066
    Location:
    East Coast, USA


    ...not sure if trolling, or completely detached from reality.


    $300 50" sets appear quite frequently, as deals and offers. The argument that "high-end" displays will never be that cheap is fallacious; by definition, a high-end monitor is not cheap. I'm not seeing why you conflate the high end of televisions with what the average purchase is. Right now a decent 1080p television runs near $500, but you're still looking at a console + TV price of about $1000. That's 67% of the $1500 you've quoted, and significantly less once you count the $1000-or-more computer that you've purchased in order to even run 4K content. 1000/2500 = 2/5 = 40%. So you could experience gaming on a console for 40% of the cost of a PC. The PC may be much prettier, but not 250% of the price prettier.


    Yes, I equate the average user to my grandma. I'm not sure why you're so against it, but be realistic. When 1080 resolutions first came out, people conflated all the values together: 720p must be worse than 1080i, because 1080 > 720. People eventually worked that crap out, but eventually wasn't instantly. Taking offense at this statement is foolish, as there are plenty of people who proudly still own 1080i sets.

    You somehow assume that most people are doing a bunch of research. I call crap, and I call you misinformed. Without searching it out, what is the difference between HDMI 1.0, 1.2, 1.3, 1.4, 1.4a, 1.4b, and 2.0? Can't tell me, can you? That's relatively easy for us to look up, but what about the consumer who sees "HDMI compatible" on a cheaply priced TV? What about the person who has spent 50+ hours working and just wants to watch a movie or play a game, not sink twenty hours into figuring out how to get it working correctly? You seriously think that isn't the bulk of consumers? I'll concede a little here: the measuring stick shouldn't be grandma, it should be your father or mother. If it takes more than 10% of the usable time to get something running for entertainment, then it isn't worth it to them. Life is stressful enough without having to spend hours getting crap set up. This is why services like Geek Squad exist, despite the fact that a teenager can set this stuff up. $50 to not deal with BS means more enjoyment.

    The ME argument is facepalm stupid. Increasing shaders and adding higher-resolution textures is nice, but it isn't anything new. I'd conjecture that 3+ generations of GPU development should do this easily, as the number of shader units on GPUs has increased by that much in the allotted time. What I said was that ME ran on the Xbox 360. If all you are doing is pumping up the textures and tweaking shaders, 90% of the graphics are the same as before. It may be prettier, but "immensely taxing" isn't a valid way to describe it.

    Finally, are you seriously getting into an e-peen measuring contest? The "my screen is big enough to see from the living room" argument is just facepalm stupid. You're saying that four monitors glued together is just fine, but one monitor at 1/4 the viewing distance is somehow worse. You know what, I'll concede. I can spend $140 a monitor, $20 on glue, and $100 on the stand, for a total of $680. That'll give me the 46" size, 4K resolution, and still have me at less than half of what you paid. I cannot put it any more concisely than that: 4K will match 1080p only when our streaming services, or physical media, can deliver the content. Gaming is great, but far from the deciding reason that most people buy a television. Television sales and monitor sales are inextricably linked, so 4K isn't gaining a lot of ground.



    Argue all you'd like, but I ask you to read previous posts. The vast majority of users were at 1080p or less, with 4K being basically a statistical anomaly instead of a real segment. What you're touting, which I apparently "don't get," basically has the market share of Windows Vista: it exists, but it isn't anything more than a niche at this moment. Planning around a niche is stupid, hence why 4K is stupid. In a few years it won't be Vista, it'll be Windows 7: the market share will be up, and the barrier to entry low. When pricing and the barrier to entry are low, 4K will make sense. Of course, by that point the 7xx and Rx 2xx series GPUs will be an anachronism. Asking if they'll still run the content at that point is foolish, and seems to be what the OP is stuck on. Just because we can, doesn't mean that doing it matters.
     
  17. 64K

    64K

    Joined:
    Mar 13, 2014
    Messages:
    1,354 (3.54/day)
    Thanks Received:
    975
    I don't know about the situation with others, but my Comcast HD service, which I pay extra for, gives me most channels at 720p or 1080i, citing bandwidth issues as the main reason they are not mostly 1080p. If this is true, then I doubt a 4K TV will have many channels, if any, actually at 4K.
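    As a rough scaling argument (assuming the same codec and frame rate, so bitrate scales roughly with pixel count), the jump to 4K is a far bigger ask for the cable companies than 1080p ever was:

    ```python
    # Relative bandwidth demand if bitrate scaled linearly with pixel count.
    # Same codec and frame rate assumed; real codecs do somewhat better
    # than linear at higher resolutions.

    formats = {
        "720p":  1280 * 720,
        "1080i": 1920 * 1080 // 2,  # interlaced: half the lines per field
        "1080p": 1920 * 1080,
        "4K":    3840 * 2160,
    }

    base = formats["720p"]
    for name, pixels in formats.items():
        print(f"{name:>5}: {pixels / base:.1f}x the pixel rate of 720p")
    ```

    Under that assumption, 4K needs roughly 9x the bandwidth of a 720p channel, so a provider already rationing 1080p has little room for it.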
     
  18. D007

    D007

    Joined:
    Mar 7, 2007
    Messages:
    3,349 (1.14/day)
    Thanks Received:
    481
    Location:
    Pompano beach, Florida
    My Lord Hassle. You are a complete moron.. idk what else to say.
    You have a severe attitude problem and you must be like 12 years old.
    One more for the ignore list.
    You have exactly ZERO idea of what you are talking about.
    You spew the most nonsensical shit I have ever heard..
    Iggy time..

    Have fun waiting for that hardware for the next 20 years..lol..
     
    Last edited: Aug 29, 2014
  19. Toothless

    Toothless

    Joined:
    Mar 26, 2014
    Messages:
    1,680 (4.55/day)
    Thanks Received:
    966
    Location:
    Island of Berk
    Insults and no hard proof...

    Hmmmmm....
     
    lilhasselhoffer says thanks.
  20. D007

    D007

    Joined:
    Mar 7, 2007
    Messages:
    3,349 (1.14/day)
    Thanks Received:
    481
    Location:
    Pompano beach, Florida
    Proof of what?
    That people cry about paying for top-end hardware?
    You want proof of that?
    Talk to the guy using a mid-grade card, crying about why he can't afford a 4K TV.
    As if he could even run it if he had one.

    I'm not crying that I can't buy that Ferrari.
    Deal with it.

    I don't need to prove anything to you.
    I can see my proof every single time I fire up my badass 4K TV lol..
     
    Last edited: Aug 29, 2014
  21. lilhasselhoffer

    lilhasselhoffer

    Joined:
    Apr 2, 2011
    Messages:
    1,703 (1.17/day)
    Thanks Received:
    1,066
    Location:
    East Coast, USA
    You are an arrogant spoiled child, unwilling to consider any view but your own and unwilling to bring facts to the table that support your conclusion. Obviously, you're going to want the last word, so you're welcome to it, after I put our discourse together so you can point out where I was somehow a "moron" despite the provided evidence.

    As to what is being said, let's go back over it all.
    1) At no point did I say 4K was impossible. I said talking about it in relation to graphics cards currently on the market is stupid. 4K exists, but the number of people actually using it is a statistical anomaly, as proven with facts in this thread. You responded with unintelligible anger about me having a crappy screen if all I wanted was 1080p on a 23" monitor.
    2) You said the pricing of a 4K monitor was reasonable at $1500 for a 50" screen. I pointed out that the average purchase, by definition, isn't a high-end monitor. The average price of a screen around that size, at 1080p, is currently between $300 and $500. You seemed to then imply that all of us "peasants" should get a better job and buy something more expensive.
    3) When confronted with a more than 250% price difference between console 1080p and PC 4K gaming, you dismissed the difference out of hand. No response to the fact that this difference exists was ever given, unless you calling us "broke ass" is a response.
    4) After all of this, you seem to believe offering proof for your point of view is somehow beneath you. You constantly insult us "peasants," and somehow come to the conclusion that you are above having to provide a reasonable response.
    5) Despite everything else, your insults have been tolerated. You could have been flagged as intentionally offensive on this forum. For the record, I'm not doing so because you seem to just be on a tangent and undeserving of scrutiny by a mod. Everyone has their off days, and moments where emotions flare. I've kept my responses civil, and I expect the same from another adult. If you are incapable of this, I will be asking for a review of your conduct.

    You have demonstrably acted like a jerk. This isn't the first time, either. You said you'd never respond to me a few months back on another thread. What has been proven here is that you believe you are better than other people, and thus that other people are idiots. Fine; I'd suggest you take your own advice and stop responding to me. If you'd done so, this whole discussion never would have happened.

    My last comment, before I open the floor to another round of misguided insults, is that you are completely incapable of an adult discussion. If you wanted to prove that you were right, you could have brought in sales figures for 4K televisions, advertisements listing how many GPUs claim they can run a 4K monitor, pricing figures showing that 4K costs are going down, or even figures showing increased uptake of HDMI 2.0 and DP, which would show more devices capable of using 4K signals. I have seen exactly none of this data from you, only the anecdote that "my TV is badass" to support your statements. Instead you whine like a child scorned, and call me names. I'm still waiting on a response about the differences between the HDMI standards. Perhaps, just perhaps, next time you can come to the table with an argument. As it stands, your argument is name-calling and sticking your fingers in your ears.
     
    Toothless says thanks.
  22. ne6togadno

    ne6togadno

    Joined:
    Mar 15, 2013
    Messages:
    1,471 (1.97/day)
    Thanks Received:
    682
    Location:
    GMT +2
  23. newconroer

    newconroer

    Joined:
    Jun 20, 2007
    Messages:
    3,325 (1.17/day)
    Thanks Received:
    376
    I think people are forgetting here that games aren't shipping with 4K or 8K native textures, and third-party modding isn't always using native textures either.

    The only thing I fear about 4K is that it will get dominated by 'TV' resolution formats, giving us pixel counts that are too wide when they should be taller, like those found in PC-format monitors.
     
  24. andrewsmc

    andrewsmc

    Joined:
    Sep 15, 2008
    Messages:
    1,036 (0.43/day)
    Thanks Received:
    122
    Location:
    Pikeville NC
    [attached image]
     
    yogurt_21 and Toothless say thanks.
  25. vawrvawerawe

    Joined:
    Nov 11, 2012
    Messages:
    594 (0.68/day)
    Thanks Received:
    34
    Okay guys, it's been about 3 months since I created this thread.

    Time for the answer!

    According to your votes,
    22 of you thought the left side was the PS4 (40.7%), whereas
    32 of you thought the right side was the PS4 (59.3%).
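
    For completeness, the percentages do match the raw counts; a quick check:

    ```python
    # Sanity check: poll percentages from the raw vote counts (54 total).
    left, right = 22, 32
    total = left + right
    print(f"left:  {left / total:.1%}")   # 40.7%
    print(f"right: {right / total:.1%}")  # 59.3%
    ```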

    [attached image]

    AND NOW FOR THE ANSWER:
    The LEFT side is the PS4! Sadly, that means more than half of you thought the PS3 graphics were actually better than the PS4 graphics.

    So, should you upgrade to a PS4 for graphics' sake alone? Probably not. Unlike the way the PS3 was a huge improvement over the PS2, the PS4 is hardly even an upgrade, if you can even call it an "upgrade" given the loss of DLNA, media streaming, and other things. Personally, I will not be getting a PS4 any time soon.
     
    rtwjunkie and ne6togadno say thanks.
