
Next-Gen Radeon "Polaris" Nomenclature Changed?

btarunr

Editor & Senior Moderator
It looks like AMD is deviating from its top-level performance grading with its next-generation Radeon graphics cards. The company has so far maintained the Radeon R3 series for embedded low-power APUs, Radeon R5 for the integrated graphics of larger APUs, Radeon R7 for entry-level through mainstream discrete GPUs (e.g. R7 360, R7 370), and Radeon R9 for the performance through enthusiast segments (e.g. R9 285, R9 290X). The new nomenclature could see it rely on the second set of model numbers (e.g. 4#0) to denote market positioning, if a popular rumor on tech bulletin boards such as Reddit holds true.

A Redditor posted an image of a next-gen AMD Radeon demo machine powered by a "Radeon RX 480." Either "X" could be a variable, or it could be series-wide, prefixing all SKUs in the 400 series. It could also be AMD marketing's way of somehow playing with the number 10 (X), to establish some kind of generational parity with NVIDIA's GeForce GTX 10 series. The placard also depicts a new "Radeon" logo with a different, sharper typeface. The "RX 480" was apparently able to run "Doom" (2016) at 2560x1440 @ 144 Hz, with the OpenGL API.



 
"Doom" (2016) at 2560x1440 @ 144 Hz, with the OpenGL API.


Is this a good thing? Because I have no idea about OGL bench numbers.
 
I could be wrong, but weren't AMD cards previously not really capable of running OpenGL properly?
I know with my slightly older-gen card it loads the CPU when used.
 
I could be wrong, but weren't AMD cards previously not really capable of running OpenGL properly?
I know with my slightly older-gen card it loads the CPU when used.

They very much were, but their OpenGL 4.5 implementation was less than perfect. I hear Doom runs on a GL 4.3 fallback with Radeons, but I have no means of testing that.
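For anyone who wants to see what their own driver actually reports, you can query the context directly. A minimal sketch (untested), assuming an OpenGL context has already been created and made current (e.g. via SDL or GLFW) and that the headers expose the GL 3.0 enums:

Code:
/* Query the version and renderer the driver actually gave us. */
#include <stdio.h>
#include <GL/gl.h>
#include <GL/glext.h>

void print_gl_version(void)
{
    GLint major = 0, minor = 0;

    /* GL_MAJOR_VERSION / GL_MINOR_VERSION are queryable on GL 3.0+ contexts */
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);

    printf("Context version : %d.%d\n", major, minor);
    printf("GL_VERSION      : %s\n", (const char *)glGetString(GL_VERSION));
    printf("GL_RENDERER     : %s\n", (const char *)glGetString(GL_RENDERER));
}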
 
They very much were, but their OpenGL 4.5 implementation was less than perfect. I hear Doom runs on a GL 4.3 fallback with Radeons, but I have no means of testing that.


That is correct, bta. I posted some comparison shots in the Doom thread with the OSD up.
 
They very much were, but their OpenGL 4.5 implementation was less than perfect. I hear Doom runs on a GL 4.3 fallback with Radeons, but I have no means of testing that.
Shouldn't it run Vulkan instead? I mean, who in his right mind would code stuff for OpenGL these days?
 
Shouldn't it run Vulkan instead? I mean, who in his right mind would code stuff for OpenGL these days?
Hasn't been patched in yet.
 
Shouldn't it run Vulkan instead? I mean, who in his right mind would code stuff for OpenGL these days?

You cannot have Vulkan without OpenGL. Vulkan replaces only a few functions.

Also, NVIDIA and AMD OpenGL version numbers don't mean much; they have different function sets and implementations, so having "only" 4.3 doesn't mean a card lacks the needed instructions. It depends on the functions used by the game itself.
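To illustrate that last point: what matters is whether the specific extensions a game calls are exposed, and that can be checked directly rather than inferred from the version number. A minimal sketch (untested), assuming a GL 3.0+ context is current and the headers expose glGetStringi; the extension named in the usage comment is just an arbitrary example, not something this thread confirms Doom needs:

Code:
#define GL_GLEXT_PROTOTYPES 1
#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>

/* Return 1 if the current context exposes the named extension. */
int has_gl_extension(const char *name)
{
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; i++) {
        const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i);
        if (ext && strcmp(ext, name) == 0)
            return 1;   /* exposed by this driver */
    }
    return 0;           /* not exposed */
}

/* Usage (hypothetical fallback, purely illustrative):
 *   if (!has_gl_extension("GL_ARB_bindless_texture"))
 *       use_plain_texture_path();
 */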
 
The "RX 480" was apparently able to run "Doom" (2016) at 2560x1440 @ 144 Hz, with the OpenGL API.

According to this article, the GTX 1080 can't achieve this at max settings.
 
The "RX 480" was apparently able to run "Doom" (2016) at 2560x1440 @ 144 Hz, with the OpenGL API.

According to this article, the GTX 1080 can't achieve this at max settings.


The tweet by Robert Hallock was of a demo PC with a 144 Hz FreeSync display and an RX 480 card. This is presumably the Polaris 10 card with 36 CUs. The fact that they're using a 144 Hz monitor kinda implies that the card does quite a bit better than an R9 380, which at the same settings gets somewhere between 50 and 60 fps.
 
That is correct, bta. I posted some comparison shots in the Doom thread with the OSD up.

The id Tech developers confirmed that they require (needed or not) the full set of 4.3 on both NVIDIA and AMD, and only some extensions from 4.4 and 4.5.
The differences in quality could come down to different driver implementations of those extensions (that's up to the driver development team) and/or to using each vendor's proprietary extensions to achieve better performance.
 
If that's the real retail SKU name, then it makes sense: X is 10, so instead of R10 they will use RX.

The "RX 480" was apparently able to run "Doom" (2016) at 2560x1440 @ 144 Hz, with the OpenGL API.

According to this article, the GTX 1080 can't achieve this at max settings.

You do know that 144 is the monitor's refresh rate (it's actually a Lenovo Y27F, a 27" 1080p 144 Hz FreeSync monitor, and they use VSR to run Doom at 1440p), not the FPS produced by the RX 480, right?

The only thing we know from this leak is that the RX 480 can render Doom at 1440p, that's it. And Lenovo made a beautiful curved monitor :D
 
If that's the real retail SKU name, then it makes sense: X is 10, so instead of R10 they will use RX.



You do know that 144 is the monitor's refresh rate (it's actually a Lenovo Y27F, a 27" 1080p 144 Hz FreeSync monitor, and they use VSR to run Doom at 1440p), not the FPS produced by the RX 480, right?

The only thing we know from this leak is that the RX 480 can render Doom at 1440p, that's it. And Lenovo made a beautiful curved monitor :D

Yep, reading comprehension.
 
"Doom" (2016) at 2560x1440 @ 144 Hz, with the OpenGL API.


Is this a good thing? Because I have no idea about OGL bench numbers.

Lol, are you really buying that? Just because it can "run" it on a 144 Hz monitor doesn't say how well.

If they really had something to show, it would be an in-game screenshot with a readable FPS counter.

EDIT:
The tweet by Robert Hallock was of a demo PC with a 144 Hz FreeSync display and an RX 480 card. This is presumably the Polaris 10 card with 36 CUs. The fact that they're using a 144 Hz monitor kinda implies that the card does quite a bit better than an R9 380, which at the same settings gets somewhere between 50 and 60 fps.

They can imply all they want; it means nothing at the end of the day.

Factually wrong.

Proof? Not saying there is none, but you stated it as fact without offering any proof.
 
They are different APIs and are executed differently: Vulkan uses a runtime, while OpenGL is implemented in the OS or the driver.
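As a rough sketch of what "uses a runtime" means in practice (untested, minimal error handling): a Vulkan application links against the loader and asks it for an instance, whereas an OpenGL application asks the OS/window system for a context through WGL/GLX:

Code:
#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void)
{
    /* Describe the application to the loader/driver */
    VkApplicationInfo app = {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .pApplicationName = "loader-demo",
        .apiVersion = VK_API_VERSION_1_0,
    };
    VkInstanceCreateInfo info = {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app,
    };

    VkInstance instance;
    if (vkCreateInstance(&info, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "the loader found no usable Vulkan driver\n");
        return 1;
    }

    printf("Vulkan instance created through the loader\n");
    vkDestroyInstance(instance, NULL);
    return 0;
}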
 
It looks like they want to distance Polaris from the existing naming scheme. I mean, they even changed the code name from Arctic Islands to Polaris as well. And Radeon RX 480 or RX 490 doesn't sound bad at all, to be honest. I just wonder what the top end will be called this time. Will they keep the Fury branding, or will they kill it?
 
You cannot have Vulkan without OpenGL. Vulkan replaces only a few functions.
That is categorically wrong.

Besides, they should have put a simple Vulkan shim in place (call translation only); removing the debugging mode (and the call hierarchy) alone would have improved performance over the OpenGL version, but they are probably being "advised" not to do so.

(I remember a simple translation shim being put in place for some game [Talos Principle?] a few months ago when Vulkan hit drivers, and that alone yielded a nice 15-20% boost over OpenGL. Can't find links for it, though.)

Proof? Not saying there is none, but you stated it as fact without offering any proof.
Uh? There is official documentation available online; there are even examples you can compile and test on a multitude of platforms now. Do you want me to link the OpenGL documentation too?

BTW, have you ever used either in a development environment (not design, actual from-scratch development, even if it was just a shader on two triangles to fill the screen)?
 
Proof? Not saying there is none, but you stated it as fact without offering any proof.

Basically, what @truth teller wrote in post #19.
Vulkan was designed to be fully stand-alone, even more "independent"[1] than OpenGL is. Source: the spec itself, and all the announcements/conferences/etc. stating exactly that.
I suppose some confusion arises from the fact that Vulkan can be (and at some point was[2]) implemented on top of OpenGL, i.e. as a Vulkan -> OpenGL wrapper. I can see how some people could have misunderstood that as "Vulkan runs through OpenGL".

[1] The Vulkan spec itself contains functions for interacting with the underlying system's windowing interfaces in a platform-independent way, while OpenGL has to use additional platform-dependent means (WGL on Windows, GLX on most *nixes, CGL on OS X) that are NOT part of the OpenGL spec (although this is partially alleviated by the still-very-young and not-widely-adopted EGL, which itself is not part of OpenGL either).
[2] Early developer-only Vulkan drivers were implemented as wrappers on top of OpenGL, as it was easier and a lot more useful for developing the spec itself to do that first before putting in all the effort to write actual drivers. Source: this was covered in pre-release conferences and the like by the developers working on it at the time (the videos should be easy to find on YouTube).
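As a small illustration of [1]: the window-system glue in Vulkan is exposed as instance extensions (VK_KHR_surface plus a platform-specific one such as VK_KHR_win32_surface or VK_KHR_xcb_surface) that the loader can enumerate, rather than as a separate API like WGL/GLX/CGL. A minimal sketch (untested):

Code:
#include <stdio.h>
#include <string.h>
#include <vulkan/vulkan.h>

int main(void)
{
    uint32_t count = 0;
    vkEnumerateInstanceExtensionProperties(NULL, &count, NULL);

    VkExtensionProperties props[256];
    if (count > 256)
        count = 256;    /* keep the sketch simple */
    vkEnumerateInstanceExtensionProperties(NULL, &count, props);

    /* List the window-system integration extensions the loader exposes */
    for (uint32_t i = 0; i < count; i++) {
        if (strstr(props[i].extensionName, "surface"))
            printf("WSI extension: %s\n", props[i].extensionName);
    }
    return 0;
}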
 
144 Hz was the monitor, come on guys...

The demo was capped at 60 fps,
which puts the RX 480 slightly below a 980/Fury (non-Ti, non-X).

PS
R3, R5, R7, R9 ... RX?
But then, having several RX cards... doesn't make any sense to me.
 
Factually wrong.
That is categorically wrong.

Besides, they should have put a simple Vulkan shim in place (call translation only); removing the debugging mode (and the call hierarchy) alone would have improved performance over the OpenGL version, but they are probably being "advised" not to do so.

(I remember a simple translation shim being put in place for some game [Talos Principle?] a few months ago when Vulkan hit drivers, and that alone yielded a nice 15-20% boost over OpenGL. Can't find links for it, though.)


Uh? There is official documentation available online; there are even examples you can compile and test on a multitude of platforms now. Do you want me to link the OpenGL documentation too?

BTW, have you ever used either in a development environment (not design, actual from-scratch development, even if it was just a shader on two triangles to fill the screen)?

Now that's much better, that's all I was asking for. I never said the facts were not out there.
 
144 Hz was the monitor, come on guys...

The demo was capped at 60 fps,
which puts the RX 480 slightly below a 980/Fury (non-Ti, non-X).

PS
R3, R5, R7, R9 ... RX?
But then, having several RX cards... doesn't make any sense to me.
Honestly, I didn't know it was capped; mind giving a link to the source?
 
If that's the real retail SKU name, then it makes sense: X is 10, so instead of R10 they will use RX.



You do know that 144 is the monitor's refresh rate (it's actually a Lenovo Y27F, a 27" 1080p 144 Hz FreeSync monitor, and they use VSR to run Doom at 1440p), not the FPS produced by the RX 480, right?

The only thing we know from this leak is that the RX 480 can render Doom at 1440p, that's it. And Lenovo made a beautiful curved monitor :D

VSR or native, what difference does it make? The card is rendering at the selected resolution and then downscaling it. If anything, it's actually doing more work...
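For a rough sense of the extra work (simple pixel math, not a benchmark claim): 2560 x 1440 = 3,686,400 pixels per frame versus 1920 x 1080 = 2,073,600 at the panel's native resolution, so VSR at 1440p means shading roughly 78% more pixels before the image is scaled back down.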
 
RX? Are they trying to bring back Verdetrol again?
[Attached image: pills1.jpg]
 