
AMD RX Vega 56 Benchmarks Leaked - An (Unverified) GTX 1070 Killer

Dream on about both, Volta and GDDR6.
 
Hate to say it, but I don't think mining will ever "peter out". For sure there will be periods of boom and bust, but mining is here to stay, buddy.

And if you think about it, it was only a matter of time before some program found a way to make money off of the massive computational power modern GPUs have.

We shall see, and no... cryptocurrency is not going away; it's simply a highly speculative commodity.

And I'm not your buddy, that's just plain gay.
 
I'm convinced. There are too many posters in this thread being paid specifically to attack other people and hijack the discussion.

As for me, the unverified results are nice, considering AMD hasn't been in such a position in a long time. I think Nvidia will up its game with Volta, which might mean a delayed release until they're sure they can retake the top dog slot without much fanfare.

I guess you need to go see a shrink ASAP. Next you're gonna say that Jensen Huang frequents these forums and slanders AMD in his spare time. :)

I still hope you were joking.
 
- HBC is relevant for pure compute workloads, but not for gaming.
You are so wrong, but just as always you're going to ignore the facts and carry on.
Then please focus on the facts instead of personal attacks.

A predictive cache algorithm can only detect linear access patterns, just like a prefetcher does in a CPU. But it can't predict accesses when there are no patterns, because there is no way to predict random accesses. That's why HBC would work fine for linear traversal of huge datasets, but it wouldn't work for game rendering, where the access patterns vary with game state, camera position, etc.
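To illustrate what I mean, here's a rough sketch of a generic stride detector (my own toy example, not AMD's actual HBCC logic): it only predicts an access when the distance between consecutive accesses repeats, which is exactly why linear walks are easy and random accesses are not.

import random

class StridePredictor:
    """Tracks the stride between consecutive accesses, like a simple CPU prefetcher."""
    def __init__(self):
        self.last_addr = None
        self.stride = None

    def access(self, addr):
        prediction = None
        if self.last_addr is not None:
            new_stride = addr - self.last_addr
            if new_stride == self.stride:
                # Stride seen twice in a row: prefetch the next line/page.
                prediction = addr + self.stride
            self.stride = new_stride
        self.last_addr = addr
        return prediction

p = StridePredictor()
linear_hits = [p.access(a) for a in range(0, 4096, 64)]                  # mostly correct predictions
random_hits = [p.access(random.randrange(1 << 20)) for _ in range(64)]   # essentially never correct

A linear walk keeps the stride constant, so nearly every prediction lands; random accesses never settle on a stride, so the predictor has nothing to work with.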

Volta is professional-only at the moment, and I don't expect that to change until MAYBE the end of 2018. In fact, I am pretty sure Nvidia confirmed the next series is another Maxwell... cough... Pascal refresh (but likely with more GDDR5X/GDDR6). It will be stronger, but lack all of these features.

The only exception I can think of is possibly a cut-down Volta sold as a Titan card for $1500 - $2000.
GV102 and GV104 are already taped out, and the first test batch will arrive soon. So unless Nvidia runs into problems like they did with Fermi, they could be released anywhere from 5 to 10 months from today.
 
No surprise here.

$400, and it slightly loses to the 1080 while having FAR better long-term technology. This will sell well, and Vega64 should be at least 15% stronger than this!

"Long-term technology" what is this BS you're spouting? There's no such thing as future-proofing when it comes to GPU technology - a GPU is relevant for a year, year-and-a-half, maybe even 2 years and then it's just too slow for gaming.
 
Then please focus on the facts instead of personal attacks.

A predictive cache algorithm can only detect linear access patterns, just like a prefetcher does in a CPU. But it can't predict accesses when there are no patterns, because there is no way to predict random accesses. That's why HBC would work fine for linear traversal of huge datasets, but it wouldn't work for game rendering, where the access patterns vary with game state, camera position, etc.

I literally posted a presentation from AMD themselves where they showcased HBC in action in a game, yet you insist it isn't possible. Right, AMD knows nothing about it and you do. You know, just because you bring up all that technical stuff that hasn't got much to do with the subject, it doesn't really prove that you know what you are talking about.

Facts? All you do is ignore them, mate. :laugh: Just like you did in our previous discussion.

We got you, bro: AMD hasn't got a clue about what they're doing, and you do.
 
I literally posted a presentation from AMD themselves where they showcased HBC in action in a game, yet you insist it isn't possible. Right, AMD knows nothing about it and you do. You know, just because you bring up all that technical stuff that hasn't got much to do with the subject, it doesn't really prove that you know what you are talking about.

Facts? All you do is ignore them, mate. :laugh: Just like you did in our previous discussion.

We got you, bro: AMD hasn't got a clue about what they're doing, and you do.

So basically we can't trust reviews and reviewers but we SHOULD trust the company itself? I mean how can you not be trolling?

"Long-term technology" what is this BS you're spouting? There's no such thing as future-proofing when it comes to GPU technology - a GPU is relevant for a year, year-and-a-half, maybe even 2 years and then it's just too slow for gaming.

Hey, you're wrong! AMD has the alien technology that nobody has but everyone wants; that's why everyone's constantly copying them and trying to stop them by paying to make them look bad!
 
So basically we can't trust reviews and reviewers but we SHOULD trust the company itself? I mean how can you not be trolling?

:roll: Of course it looks like trolling when you ain't got a clue about how things work and are under the impression you know better than a team of engineers.

Right, I've wasted enough time with you both; you are both on ignore.
 
:roll: Of course it looks like trolling when you ain't got a clue about how things work and are under the impression you know better than a team of engineers.

Right, I've wasted enough time with you both; you are both on ignore.

The level of your stupidity is astonishing; really, I must compliment you.
 
I thought HBC stood for "Holy Batman, Catwoman!"
 
I thought HBC stood for "Holy Batman, Catwoman!"

But seriously, buy cheap (if even possible) and then flog it to miners... hmm, I'm tempted.
 
Is that not worrying? All Nvidia have to do is refresh Maxwell... cough... Pascal... to stay quite far ahead at minimal cost. All AMD are doing with these compute-heavy consumer cards is fueling the mining craze. There is no disputing AMD's consumer compute ability, but:

1) It's still not enough to be the fastest gaming GPU; and
2) It keeps the profit margins very low.

Anyway, if you see my post further up, Vega 64 is a mining monster. So it's going to disappear fast on one of those jumbo jets. Not good news at all for gamers. :mad:

Nah, that bubble is bursting as we speak. By the time they are on the shelves, most miners will be dumping their older, low-power cards and using those funds to buy Vega and 4K screens for gaming :D
 
I couldn't explain it right, but I see tearing at 75 fps in all the games I tested. Capping it at 74 fps does the trick, though I'm not sure about the explanation behind it.

Same with G-Sync, as far as I understand.
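If I had to guess (and this is just my rough understanding, not a verified explanation): at exactly 75 fps you're sitting right at the top of the panel's variable refresh range, where the monitor can fall back to fixed refresh and tear, and capping a frame below keeps it inside the range. A 74 fps cap is really nothing more than a frame limiter like this hypothetical game-loop sketch:

import time

TARGET_FPS = 74                  # just under a 75 Hz panel's ceiling
FRAME_TIME = 1.0 / TARGET_FPS

def render_frame():
    pass                         # stand-in for the real rendering work

def run(frames=300):
    deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        deadline += FRAME_TIME
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)   # hold the frame rate at ~74 fps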
 
Same with G-Sync, as far as I understand.

I never see any tearing on my G-Sync monitor; it goes up to 165 Hz, though.
 
I guess you need to go see a shrink ASAP. Next you're gonna say that Jensen Huang frequents these forums and slanders AMD in his spare time. :)

I still hope you were joking.
Thanks for making my point.
 
Is that not worrying? All Nvidia have to do is refresh Maxwell... cough... Pascal... to stay quite far ahead at minimal cost. All AMD are doing with these compute-heavy consumer cards is fueling the mining craze. There is no disputing AMD's consumer compute ability, but:

1) It's still not enough to be the fastest gaming GPU; and
2) It keeps the profit margins very low.

Anyway, if you see my post further up, Vega 64 is a mining monster. So it's going to disappear fast on one of those jumbo jets. Not good news at all for gamers. :mad:


I never said it was good, but I wouldn't say this is "bad".

Nvidia made an excellent gaming arch with Maxwell (but it's worthless for most other things besides mining, lol). Nvidia can afford to have 2 architectures at the same time, and AMD cannot. However, AMD is finally starting to make money again, so this will change by 2019.


Why are you so sure Vega won't pan out? It will be mediocre now, but it has FP16/HBC/RPM/etc. Consider that Far Cry 5 and many other games will be written in FP16 in order to support game consoles and AMD cards (that makes Vega 64 a 27 TFLOP card in that game, lol). Vega is the foundation of their future, and it was designed when AMD had no money left for the Radeon department. Thus, even more so than GCN did, this arch will start out slow for its time but age VERY well.
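Back-of-the-envelope, just to show where that 27 TFLOP figure comes from (the boost clock here is my assumption, not a confirmed spec):

shaders     = 4096                               # Vega 64 stream processors
boost_clock = 1.67e9                             # ~1.67 GHz, assumed
fp32_tflops = shaders * 2 * boost_clock / 1e12   # 2 FLOPs per FMA -> ~13.7 TFLOPS
fp16_tflops = fp32_tflops * 2                    # Rapid Packed Math doubles FP16 rate -> ~27 TFLOPS
print(round(fp32_tflops, 1), round(fp16_tflops, 1))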
 
Why are you so sure Vega won't pan out? It will be mediocre now, but it has FP16/HBC/RPM/etc. Consider that Far Cry 5 and many other games will be written in FP16 in order to support game consoles and AMD cards (that makes Vega 64 a 27 TFLOP card in that game, lol). Vega is the foundation of their future, and it was designed when AMD had no money left for the Radeon department. Thus, even more so than GCN did, this arch will start out slow for its time but age VERY well.
Well, for starters, we hear this every single time: buy an underperforming AMD card now and it will be better in the future, but it never pans out. The primary reason is that your expectations are too inflated. Secondly, AMD will shift their focus to Navi in a few months.

FP16 is certainly interesting and will gradually see more use in the future. But for the next 2-3 years there will be a limited number of games giving a little boost there, and even with this boost it still wouldn't beat a 1080 Ti. Also, keep in mind that AMD already needs to improve their scheduling, so FP16 usage will result in even more idle resources. HBC wouldn't give it an advantage over Nvidia unless a game needs more memory than the competition can provide and the game uses intrinsics. So, simply stated, even in your best-case scenario, Vega doesn't look good compared to Pascal.
 
I never said it was good, but I wouldn't say this is "bad".

Nvidia made an excellent gaming arch with Maxwell (but it's worthless for most other things besides mining, lol). Nvidia can afford to have 2 architectures at the same time, and AMD cannot. However, AMD is finally starting to make money again, so this will change by 2019.


Why are you so sure Vega won't pan out? It will be mediocre now, but it has FP16/HBC/RPM/etc. Consider that Far Cry 5 and many other games will be written in FP16 in order to support game consoles and AMD cards (that makes Vega 64 a 27 TFLOP card in that game, lol). Vega is the foundation of their future, and it was designed when AMD had no money left for the Radeon department. Thus, even more so than GCN did, this arch will start out slow for its time but age VERY well.
Out of curiosity, what would qualify as bad in your opinion?
 
I'm just impressed to hear Maxwell is worthless beyond gaming and mining (lol).
 
"Long-term technology" what is this BS you're spouting? There's no such thing as future-proofing when it comes to GPU technology - a GPU is relevant for a year, year-and-a-half, maybe even 2 years and then it's just too slow for gaming.

By far the worst post in this thread. If you don't realize that AMD has gone against the grain then you need to read more.
 
All in all, for such a small group like RTG, with R&D crippled by the former management, I'm just glad for what they have accomplished. They always push for new stuff no matter what, they opted for an open-standards-friendly ecosystem that works long term and is future-proof, they lead the console market, and they're the god of APUs. I remember when AMD's slogan was "The Future is Fusion", back when we were fighting over quad-core CPUs. Sure, they may deliver half-baked tech sometimes, but they strike hard along the way, and I'm sure the next Vega will be even more appealing for hardcore enthusiasts, not just for pros and developers.

Can't wait for the official reviews, and God save us from the miners!
 
Whether or not the Vega 56 is more cost-effective than the GTX 1070 is irrelevant when you consider that the GTX 1070 has been out for about a year and the Vega 56 still isn't. Most people already gave Nvidia their money, and the prospect of a lateral move isn't exactly appealing.
 
@Vya Domus @oxidized I want to hear your arguments on whether HBC has a use in gaming. No videos.

Nah, that bubble is bursting as we speak. By the time they are on the shelves, most miners will be dumping their older, low-power cards and using those funds to buy Vega and 4K screens for gaming :D

Bursting or simply upgrading? Aye, it will burst as mining gets harder, but not completely for a while.
 
Then I have no idea what you checked. First there's Hardware Canucks as a written review, then YT videos like HW Unboxed's, getting 1% in favor of the 1060 where at first it was a 12% difference. Yeah, it consumes 30-40 W more at default (which HW Unboxed says shouldn't be a deal breaker), but actually, JayzTwoCents' XFX 480 video shows that under a stress test that RX 480 eats about 95 to 120 W. And with Crimson, AB, etc. you can control the more power-hungry RX 480s like the Sapphire or MSI. Also, ones like the MSI got later BIOS updates that lowered their power consumption.

Some people were misled by that video of Jay's into thinking that the whole card only uses that much power when running a 3D application like Fire Strike. But in reality, that was the power used by the GPU core section only, not the entire card, because there is no sensor on the card that can measure the total power being drawn. That's why reviewers like TechPowerUp and Tom's Hardware do not look at the power consumption reported by MSI Afterburner (like Jay did at the time) and instead use dedicated equipment to measure just the GPU's power consumption in a system. This is the PCB analysis of the exact card used by Jay (XFX RX 480 GTR):

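Roughly speaking (illustrative numbers only, nothing from Jay's or TPU's actual data): whole-board power is the sum of what flows in over each supply rail, which is what the external equipment measures, while a software sensor that only sees the core rail will always read lower than that.

rails = {
    "pcie_slot_12v":  {"volts": 12.1, "amps": 4.5},   # measured at the slot
    "8pin_connector": {"volts": 12.2, "amps": 8.0},   # measured on the cable
}

board_power = sum(r["volts"] * r["amps"] for r in rails.values())   # ~152 W total board draw
core_only   = 110.0                                                 # what a core-rail-only sensor might show
print(f"board: {board_power:.0f} W, core sensor: {core_only:.0f} W")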
 