
AMD Radeon VII Hands On at CES 2019

Ummm, have you never heard of GameFly? They mail you unlimited PS4 or Xbox One games for $20 a month, two at a time, and as soon as the tracking info shows it's been mailed out they mail you the next one in your queue, and you really can't play more than two games at once anyway if you're dedicated to the story or multiplayer, etc.

/shrug, but suit yourself, that's what I'm doing: PS5, 4K QLED Samsung 55", and a GameFly subscription.
There's no gamefly (that I know of) where I live. I also wasn't trying to tell the guy not to buy a console (if you follow my post, you may have noticed I don't usually tell people how to spend their money), I was just saying why that won't work for me.
 
Ummm, have you never heard of GameFly? They mail you unlimited PS4 or Xbox One games for $20 a month, two at a time, and as soon as the tracking info shows it's been mailed out they mail you the next one in your queue, and you really can't play more than two games at once anyway if you're dedicated to the story or multiplayer, etc.

/shrug, but suit yourself, that's what I'm doing: PS5, 4K QLED Samsung 55", and a GameFly subscription.
GameFly only services the USA, no? Doesn't help the rest of the planet much ;)
 
There's no gamefly (that I know of) where I live. I also wasn't trying to tell the guy not to buy a console (if you follow my post, you may have noticed I don't usually tell people how to spend their money), I was just saying why that won't work for me.

Ah yeah, if you don't have access to something like GameFly or Redbox, consoles will still be pretty expensive.
 
This strikes me more as a creator card than a card you'd pick up to play games.
That's what it actually is: an MI card with up to 4 faulty CUs.

Matches the 1080 Ti and has none of those new features.
And it doesn't even have G-Sync!
Lovely that everyone understands what the 2080's performance actually is, now that AMD announced the V7... /grins
 
I'll hold out for a year more (as long as my RX Vega stays alive, that is...). With one small exception: if they release a liquid-cooled, 100% unlocked version later. :-)
 
Me. I really enjoy my 240 Hz 1080p monitor; the smoothness of FPS games and action-adventure games is quite fun when you can hit 180+ fps minimum. Unfortunately this requires low settings for most games, so it would be nice not to have to do that.

Thanks for calling me an idiot though, have fun, cheers.

Hmmm, the 2080 Ti review shows it 60-70 fps faster than any other card at 1080p... not for all games, but for a few.
One simple solution for people like you: LOWER THE RESOLUTION TO 720p. You play at the lowest settings anyway = crap graphics.
 
One simple solution for people like you: LOWER THE RESOLUTION TO 720p. You play at the lowest settings anyway = crap graphics.

Incorrect. I have a 4K big screen for certain games that have capped frame rates, like the Final Fantasy X remaster, and I have a high-refresh 1440p monitor for other types of games where I don't mind 144 Hz. And for FPS games, or when I get in the mood to go on a thrill rampage in Tomb Raider on low settings at 240 fps, I use my 240 Hz monitor and roll-jump-dodge while enjoying the glorious smoothness. And sometimes I will play the same game on the 1440p at a lower refresh rate and enjoy it too. Amazing that we have choice in this life!

Also, my school laptop has 8th-gen Intel UHD 620 graphics and a 768p screen, and I think the indie game Wizard of Legend looks and plays amazingly on it at 60 fps. Some genres do very well at 60 fps and don't benefit at all from more.

It's so amazing how we can have options! No, I must be rigid and believe in one thing like the rest of you! So neat!
 
One simple solution for people like you: LOWER THE RESOLUTION TO 720p. You play at the lowest settings anyway = crap graphics.

LOL. It doesn't even matter, because of diminishing returns above 120. And then they'd still have to play on a TN panel for there to be any benefit, or it's spitting out frames faster than the pixels can change. It's just silly.

I don't even know if there's any benefit above 120. Back in the day on my high-end CRT, 120 fps at 120 Hz was perfect (of course that's FAR superior to LCD).
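For what it's worth, here's a quick back-of-the-envelope sketch of the "faster than the pixels can change" point (the GtG response times below are rough, typical figures, not measurements of any specific panel):

# Frame time at a given refresh rate vs. ballpark pixel response times
for hz in (120, 144, 240):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
# 120 Hz -> 8.33 ms, 144 Hz -> 6.94 ms, 240 Hz -> 4.17 ms
# A fast TN panel (~1 ms GtG) finishes its transition well inside a 4.17 ms
# frame; many IPS/VA panels (~4-8 ms GtG) are still mid-transition when the
# next frame arrives, which is where the smearing argument above comes from.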
 
LOL. It doesn't even matter, because of diminishing returns above 120. And then they'd still have to play on a TN panel for there to be any benefit, or it's spitting out frames faster than the pixels can change. It's just silly.

I don't even know if there's any benefit above 120. Back in the day on my high-end CRT, 120 fps at 120 Hz was perfect (of course that's FAR superior to LCD).

I'm sorry you can't tell the difference. I switch back and forth between 120 and 240 all the time and can tell the difference. Guess I just have better eyes than the rest of you.
 
I'm sorry you can't tell the difference. I switch back and forth between 120 and 240 all the time and can tell the difference. Guess I just have better eyes than the rest of you.

Negative. No one is more sensitive to framerate than me. Film/TV is god-awful to look at. NEVER actually focus when watching, or the stutters (which are nonstop, lol) will drive you insane. It's just pointless to sacrifice detail for super-high FPS. We might as well only play Source games for life in that case. Not only are textures bad, but shadows and aliasing become atrocious. No thanks.
 
Negative. No one is more sensitive to framerate than me. Film/TV is god-awful to look at. NEVER actually focus when watching, or the stutters will drive you insane. It's just pointless to sacrifice detail for super-high FPS. We might as well only play Source games for life in that case.

Plenty of games reach 200-ish fps on medium settings and still look great, and I have a lot of backlog dating back 8 years; those games even go as far as high settings. I can tell the difference, sorry you can't. Take care, buddy.
 
With 16GB of VRAM, could you run an 8K screen mode?
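For reference, a raw 8K framebuffer barely dents 16GB; a rough sketch (assuming a plain 32-bit colour buffer, no AA, and not counting game assets):

# Rough size of one 8K (7680x4320) colour buffer at 32 bits per pixel
width, height, bytes_per_pixel = 7680, 4320, 4
frame_mb = width * height * bytes_per_pixel / 1024**2
print(f"One 8K framebuffer: {frame_mb:.0f} MB")  # ~127 MB
# Even triple-buffered with a depth buffer it stays well under 1 GB, so the
# screen mode itself is the easy part - it's the assets rendered at 8K that
# would eat the VRAM.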

EDIT OFF TOPIC:

Found this strange-looking chip. Not sure what it is.
 

Attachments
  • HBM MEMORY.jpg
I am no Nvidia fan, BUT what does the Radeon VII really bring to the table?

Same performance as the 2080 BUT nothing new; really, they only upped the memory to 16GB.

Prices of the 2080 will be just about the same as the Radeon VII at release.

At the same price, what would you choose: an RTX 2080 with ray tracing, DLSS and AI,
OR
a Radeon VII, a revamped old Vega with really nothing new, only 8GB more memory?

In games with DLSS the 2080 will make the Radeon VII suck its tailpipe with 25-35% higher frame rates.

Also, if the Radeon VII even comes close to affecting 2080 sales, Nvidia will instantly drop the price to counter.

I really do not see an upside for AMD in this battle at the moment with the Radeon VII.

If they did a Radeon VII with 8GB of memory at 499 USD, that would really make a splash in the graphics market.

As it is now, I feel that AMD is just trying to align itself with Nvidia's RTX overpricing with essentially shrunk old Vega tech.
 
So it still can't play nearly any AAA titles at 4K.
The 2060 can get into playable territory at 4K with some reduced details. I doubt this will be worse.
Let's not do what fanboys do and say this card isn't for anyone just because you and I don't plan to get one ;)
 
I am no Nvidia fan, BUT what does the Radeon VII really bring to the table?

Same performance as the 2080 BUT nothing new; really, they only upped the memory to 16GB.

Prices of the 2080 will be just about the same as the Radeon VII at release.

At the same price, what would you choose: an RTX 2080 with ray tracing, DLSS and AI,
OR
a Radeon VII, a revamped old Vega with really nothing new, only 8GB more memory?

In games with DLSS the 2080 will make the Radeon VII suck its tailpipe with 25-35% higher frame rates.

Also, if the Radeon VII even comes close to affecting 2080 sales, Nvidia will instantly drop the price to counter.

I really do not see an upside for AMD in this battle at the moment with the Radeon VII.

If they did a Radeon VII with 8GB of memory at 499 USD, that would really make a splash in the graphics market.

As it is now, I feel that AMD is just trying to align itself with Nvidia's RTX overpricing with essentially shrunk old Vega tech.
DLSS is dumb.
 
Well, I'm excited. I mean, I wish the price was a bit lower, because seriously this is that Ferrari vs Volkswagen idea.
Ferrari sells two cars a year for a million apiece while Volkswagen sells ten million a year for $10,000 apiece.

Volkswagen makes a much higher profit. That's what AMD should do with this: bring out a card that is simply good enough and then price it so damn sharply that everyone who isn't a fanboy would go for it.
*AHEM*

VW also owns Lamborghini, Bentley, and Bugatti, makers of the VEYRON, the best automotive comparison to an RTX Titan.

VW doesn't solely exist as a low-end car maker; they have tons of super-high-margin cars too. AMD can't survive just on the "people's GPU" - that's what they tried with Polaris, and while it gained them market share, it didn't give them profit. Nvidia, OTOH, got one silver platter after another full of $$$$.
 
I am no Nvidia fan, BUT what does the Radeon VII really bring to the table?

Same performance as the 2080 BUT nothing new; really, they only upped the memory to 16GB.

Prices of the 2080 will be just about the same as the Radeon VII at release.

At the same price, what would you choose: an RTX 2080 with ray tracing, DLSS and AI,
OR
a Radeon VII, a revamped old Vega with really nothing new, only 8GB more memory?

In games with DLSS the 2080 will make the Radeon VII suck its tailpipe with 25-35% higher frame rates.

Also, if the Radeon VII even comes close to affecting 2080 sales, Nvidia will instantly drop the price to counter.

I really do not see an upside for AMD in this battle at the moment with the Radeon VII.

If they did a Radeon VII with 8GB of memory at 499 USD, that would really make a splash in the graphics market.

As it is now, I feel that AMD is just trying to align itself with Nvidia's RTX overpricing with essentially shrunk old Vega tech.
Radeon VII is not supposed to be "new". It's just 7nm Vega tech, but that is pretty powerful in itself.
Hell, this card wouldn't even exist if it weren't for AMD having a stream of faulty chips from their professional card output.

The fact that it's a 60-CU GPU and not a 56-CU one speaks volumes about how well that 7nm process is performing - AMD wasn't joking about that.
Anyhow, the way the Vega architecture is designed does not allow them to install less than 16GB of 4096-bit HBM2 on the GPU package.
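For context, the 16GB / 4096-bit figure falls straight out of the HBM2 stack arithmetic (assuming the four 4GB, 1024-bit stacks the Radeon VII ships with):

# Radeon VII memory configuration from its four HBM2 stacks
stacks, bits_per_stack, gb_per_stack = 4, 1024, 4
bus_width_bits = stacks * bits_per_stack       # 4096-bit bus
capacity_gb = stacks * gb_per_stack            # 16 GB
bandwidth_gbs = bus_width_bits * 2.0 / 8       # 2.0 Gbps per pin -> 1024 GB/s
print(bus_width_bits, capacity_gb, bandwidth_gbs)  # 4096 16 1024.0
# Fewer or smaller stacks would cut the bus width or the bandwidth along with
# the capacity, which is the trade-off behind the "can't go lower" argument.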

Compared to the Vega 64's $499 pricing, the $699 price tag is fair, as this 40% increase gives you 30% more performance and twice the memory, which, while not exactly needed at the moment, does make the card future-proof. People scoffed at the R9 390(X)'s 8 gigs of VRAM back in 2015, but that card is still a capable performer at 1080p and 1440p well over three years later.
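Sanity-checking those percentages (using the $499 and $699 MSRPs and the ~30% figure quoted above):

# Price vs. performance delta, Vega 64 -> Radeon VII
price_ratio = 699 / 499          # ~1.40 -> ~40% more expensive
perf_ratio = 1.30                # ~30% faster, per the figure above
perf_per_dollar = perf_ratio / price_ratio
print(f"{price_ratio - 1:.0%} price increase, {perf_per_dollar:.2f}x perf per dollar")
# Roughly 7% worse performance per dollar, offset by double the VRAM.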

Also, AMD have finally released a card with a proper, premium, air-cooled design, meaning for once people don't have to wait on Sapphire to release their treatment to get the good stuff. So yeah, this thing is really worth the money.

As for whether it's a better choice than the RTX 2080, the answer is, of course: it depends.
If you want RT reflections (or shadows, or global illumination - whatever the game in question has, but note that you only get one of these), you will have to forget about native 4K gaming. Obviously at this price level we are not talking about any resolution lower than that. DLSS is supposed to balance this out by upscaling 1440p to 4K without loss in quality, but it doesn't work like that, because high-quality textures get the middle finger in that scenario no matter what.
So it comes down to this: do you want the best textures or the best reflections/etc. at 4K? I can tell you that for me the textures win every time, as they contribute the most to image quality (after the resolution itself, of course), while screen-space reflections are so good at this point that you can hardly tell the difference in actual gameplay.
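On the 1440p-to-4K upscaling point, the pixel counts alone show why fine texture detail suffers (plain arithmetic, nothing vendor-specific):

# Share of a 4K frame that is natively rendered when upscaling from 1440p
rendered_px = 2560 * 1440      # ~3.7 million pixels
target_px = 3840 * 2160        # ~8.3 million pixels
print(f"{rendered_px / target_px:.0%} of the output pixels are rendered natively")
# ~44% - the rest has to be reconstructed, so the finest texture detail is
# the first thing to go, as argued above.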

So you ain't right, mate - plenty of upsides right there. Indeed, I'd actually shell out the cash for the damn thing if I didn't already have a Vega 64 (which I would have passed on if AMD hadn't denied Radeon VII's existence so vehemently up until CES, but oh well - it's marketing, I get it, lol).
 
I am no Nvidia fan, BUT what does the Radeon VII really bring to the table?

Same performance as the 2080 BUT nothing new; really, they only upped the memory to 16GB.

Prices of the 2080 will be just about the same as the Radeon VII at release.

At the same price, what would you choose: an RTX 2080 with ray tracing, DLSS and AI,
OR
a Radeon VII, a revamped old Vega with really nothing new, only 8GB more memory?

In games with DLSS the 2080 will make the Radeon VII suck its tailpipe with 25-35% higher frame rates.

Also, if the Radeon VII even comes close to affecting 2080 sales, Nvidia will instantly drop the price to counter.

I really do not see an upside for AMD in this battle at the moment with the Radeon VII.

If they did a Radeon VII with 8GB of memory at 499 USD, that would really make a splash in the graphics market.

As it is now, I feel that AMD is just trying to align itself with Nvidia's RTX overpricing with essentially shrunk old Vega tech.

I don't have a lot of examples of DLSS, but the FFXV one posted here on TPU doesn't make it seem all that impressive. Sure, high FPS compared to TXAA and FXAA, but DLSS kind of makes everything look blurrier. Even compared to no AA, DLSS looks blurry in the distance. Almost like they took the effect of motion blur and layered it over everything in the background while you're standing still.

So it still can't play nearly any AAA titles at 4K.
I don't know... to be fair, I'm not sure most people play AAA titles, with how poor some of them have been recently - just the same things rehashed over and over again.
 
Because of Nvidia... they made their reference models amazing

The Fury X reference cooler was an AIO - literally still the best reference cooler ever shipped on a reference card (the Vega 64 LC and R9 295X2 notwithstanding). What are you smoking, and would you like to share?
 
If you can pay $20 for a subscription, you can definitely put aside $20/month for your next GPU upgrade, say every 3 years. That's $720 saved up. I mean, my wife has a phone bill of like $115 a month while mine is like $25, lol. Meaning I save $3,240 over 3 years. So yeah, I'm getting that 2080 soon.
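The arithmetic in that post checks out, for what it's worth (using the figures quoted, nothing more):

# Three years of savings at the figures quoted above
months = 36
gpu_fund = 20 * months                # $720 from the $20/month set-aside
phone_savings = (115 - 25) * months   # $3,240 from the phone-bill difference
print(gpu_fund, phone_savings)        # 720 3240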
 
I don't know... to be fair, I'm not sure most people play AAA titles, with how poor some of them have been recently - just the same things rehashed over and over again.

On that note, it still can't play the old AAA titles at 4K either.
 