
AMD Radeon Graphics Roadmap for 2015 Leaked

I expect that to be Fiji Pro. I expect it to have pricing competitive with the GTX 980 but much higher performance, making it the next X1950 Pro, HD 5850, HD 6950, HD 7950, R9 290. If you're building for summer and you think you don't have the money for super-high-end, but just enough for a GTX 980, hold.

Wow, hold it there. So I am assuming you have some insider information on the pricing then? If they can get Fiji Pro into the $499 price range (which I assume is the current price of the 980), then it will be totally different. It may even force down the price of the 980 Ti!

No matter what is gonna happen, I am glad I held onto my money and didn't buy a 970 like another member here was nagging me to do. :D
 
Wow, hold it there. So I am assuming you have some insider information on the pricing then? If they can get Fiji Pro into the $499 price range (which I assume is the current price of the 980), then it will be totally different. It may even force down the price of the 980 Ti!

No matter what is gonna happen, I am glad I held onto my money and didn't buy a 970 like another member here was nagging me to do. :D

Nobody has any Fiji sample or pricing info. This is just my trusty magic ass.
 
I... just want to see what they will actually charge... for the R9 390X w/ 8 gigs, when it's announced around the 16th of this month (June)!
 
Nobody has any Fiji sample or pricing info. This is just my trusty magic ass.

Somehow I pictured that explanation going something like this

greatass.gif

 
GTA5 wants 6 GiB for 4K. Many games will follow thanks to games going 64-bit, Direct3D 12, and consoles having 8 GiB of memory. Everything that previously stopped games from using ridiculous amounts of memory is now gone--the sky is the limit and 4 GiB isn't a very high limit.

Just to correct: for one, you are ASSUMING many will follow, and second, 4K is not really something happening now; almost nobody uses it yet.
If you are, then well, yeah, maybe get something with more RAM, and hey! Next year a newer card will be released anyway, probably with more memory, so if 4 GB of VRAM is too little for you in the future you can... well, wait for the future, I guess.

Also, consoles have 8 GB of RAM to play around with, not dedicated video RAM.
 
Soooo... the R9 390 and 390X will be rebrands/refined versions of the 290 and 290X, and the next-gen HBM stuff will be Fiji and Fiji XT?

Makes sense, I guess, if the refined versions are able to keep up with NVIDIA's options at price-for-performance levels.
 
GTA5 wants 6 GiB for 4K. Many games will follow thanks to games going 64-bit, Direct3D 12, and consoles having 8 GiB of memory. Everything that previously stopped games from using ridiculous amounts of memory is now gone--the sky is the limit and 4 GiB isn't a very high limit.



When referring to memory... GB = base 2; there is no GiB for memory. Side note... wonder how this will change when we are all using non-volatile RAM as storage.

Consoles have 8 GB of combined memory, for OS and GPU... You probably don't need 8 GB dedicated to each GPU and 16 GB for the CPU... sloppy coding shouldn't be rewarded.
 
When referring to memory... GB = base 2; there is no GiB for memory. Side note... wonder how this will change when we are all using non-volatile RAM as storage.
We already are. NVRAM is what SSDs use. Non-volatile just means memory cells that hold a charge without power.


Consoles have 8 GB of combined memory, for OS and GPU... You probably don't need 8 GB dedicated to each GPU and 16 GB for the CPU... sloppy coding shouldn't be rewarded.
Consoles don't run games at PC resolutions or AA settings either... It's not bad coding, just more data. Texture memory usage steadily increases by generation... it has done since the 8086 days. More detail, more data... more data, more space...

Although, in saying that, we should see more efficient resource usage now with WDDM 2.0 and proper unified memory support, so I guess MS and D3D are partly to blame there.
 
We already are. NVRAM is what SSDs use. Non-volatile just means memory cells that hold a charge without power.
... we still refer to SSD NAND in base 10 though... as storage... and yes, I know what non-volatile means... but NVDIMMs are a specific technology of battery-backed DIMMs, typically with NAND on the DIMMs as well.

Consoles don't run games at PC resolutions or AA settings either... It's not bad coding, just more data. Texture memory usage steadily increases by generation... it has done since the 8086 days. More detail, more data... more data, more space...

Although, in saying that, we should see more efficient resource usage now with WDDM 2.0 and proper unified memory support, so I guess MS and D3D are partly to blame there.

Based on the pixel count, 4K should only use ~20% more space than 1080p... DX12 will reduce the bloat a good bit... supposedly. Just because memory usage scales up from plebeian settings doesn't mean it isn't bloated at the lower settings as well.

Even the bloated Skyrim only uses double going from 1080p to 4K...
TESV_VRAM.png


For those games that do chug memory in the future... they will also probably need more GPU horsepower to handle 4K anyway, and with DX12, for the first time, two 4 GB cards will actually give you 8 GB of usable space.
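
To put rough numbers on the render-target-versus-texture split behind that "~20% more" estimate, here is a back-of-the-envelope sketch; the bytes-per-pixel, buffer count, and texture pool size are assumptions for illustration, not figures from any actual game:

```python
# Back-of-the-envelope: how VRAM scales with resolution if only the render
# targets grow and the texture pool stays fixed. All sizes are illustrative
# assumptions, not measurements from any real game.

def render_targets_bytes(width, height, bytes_per_pixel=4, buffer_count=6):
    # e.g. back buffer, depth buffer, and a few G-buffer/post-process targets
    return width * height * bytes_per_pixel * buffer_count

texture_pool = 2.5 * 2**30  # assume ~2.5 GiB of textures, resolution-independent

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    rt = render_targets_bytes(w, h)
    total = texture_pool + rt
    print(f"{name}: render targets {rt / 2**20:.0f} MiB, total ~{total / 2**30:.2f} GiB")

# 4K has 4x the pixels, so the render targets quadruple, but because the
# texture pool dominates, the total grows only modestly rather than 4x.
```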
 
Also, consoles have 8 GB of RAM to play around with, not dedicated video RAM.
I know that but realize that when Xbox 360 and PS3 ports came to PC, they jumped from 256 MiB VRAM usage to at least 1 GiB VRAM easily. If that same ratio applies to current gen ports, we will soon be seeing games that can put 16 GiB of VRAM to work. The resolution doesn't have to be huge to use lots of memory--the game just needs lots of large, highly detailed textures. Many games are moving in that direction quickly. 4 GiB just isn't a lot anymore; it is average. Fury is not supposed to be an average card. 12 GiB in the Titan X is a lot.

When referring to memory... GB = base 2; there is no GiB for memory. Side note... wonder how this will change when we are all using non-volatile RAM as storage.

Consoles have 8 GB of combined memory, for OS and GPU... You probably don't need 8 GB dedicated to each GPU and 16 GB for the CPU... sloppy coding shouldn't be rewarded.
When referring to numbers of all kinds, -iB is base 2 and -B is base 10. Basic math: units never change. Use -iB when dealing with base 2 always. There are no exceptions.

It is not "sloppy coding" to have larger, more detailed environments in games. To do that, you need memory--the more the merrier.

... we still refer to SSD NAND in base 10 though... as storage... and yes, I know what non-volatile means... but NVDIMMs are a specific technology of battery-backed DIMMs, typically with NAND on the DIMMs as well.
Because SSDs are base 10, just like HDDs are. They build them to 80 GB (80 billion bytes), 128 GB, etc., whereas memory is built to 4 GiB (2^30 * 4 bytes), 8 GiB, 16 GiB. We're talking about very specific numbers and how best to describe the number. It's no different than using a kilometer to measure the distance between towns instead of inches. You use the best unit for the job.
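
For anyone following along, a quick sketch of the arithmetic behind that distinction; the byte counts are just the examples from the post above:

```python
# Illustrative only: the same byte count expressed in decimal (GB) and binary (GiB) units.

def to_gb(n_bytes):
    """Decimal gigabytes: 1 GB = 10^9 bytes (how drives are marketed)."""
    return n_bytes / 10**9

def to_gib(n_bytes):
    """Binary gibibytes: 1 GiB = 2^30 bytes (how DRAM capacities come out)."""
    return n_bytes / 2**30

dram = 4 * 2**30   # a "4 GB" DRAM stick is really 4 * 2^30 bytes
ssd = 80 * 10**9   # an "80 GB" drive is 80 * 10^9 bytes

print(f"{dram} bytes = {to_gb(dram):.3f} GB = {to_gib(dram):.0f} GiB")   # 4.295 GB = 4 GiB
print(f"{ssd} bytes = {to_gb(ssd):.0f} GB = {to_gib(ssd):.2f} GiB")      # 80 GB = 74.51 GiB
```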
 
When referring to numbers of all kinds, -iB is base 2 and -B is base 10. Basic math: units never change. Use -iB when dealing with base 2 always. There are no exceptions.
Tell that to JEDEC, memory uses GB for base 2, always.
If you want to change that, get on their board... otherwise use the industry standards.

It is not "sloppy coding" to have larger, more detailed environments in games. To do that, you need memory--the more the merrier.

When you are tessellating water under the map... it most certainly is sloppy coding.
VRAM usage is like a gas... it will always fill the space... if you give coders more space, they will use it. Just because they use it doesn't make the code efficient.
 
When you are tessellating water under the map... it most certainly is sloppy coding.

That's actually called nVidia Gameworks :D
 
Tell that to JEDEC, memory uses GB for base 2, always.
If you want to change that, get on their board... otherwise use the industry standards.
JEDEC says the industry needs to get off of -B for base two:
JEDEC Standard 100B.01 said:
The definitions of kilo, giga, and mega based on powers of two are included only to reflect common usage. IEEE/ASTM SI 10-1997 states "This practice frequently leads to confusion and is deprecated." Further confusion results from the popular use of the megabyte representing 1 024 000 bytes to define the capacity of the 1.44-MB high-density diskette. An alternative system is found in Amendment 2 to IEC 60027-2: Letter symbols to be used in electrical technology – Part 2.
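
As a worked example of the mixed-base confusion the standard is describing, the diskette "megabyte" is neither 10^6 nor 2^20 bytes (illustrative arithmetic only):

```python
# The "1.44 MB" floppy uses a mixed unit: 1.44 * 1000 * 1024 bytes.
capacity = 1.44 * 1000 * 1024   # 1,474,560 bytes (2880 sectors * 512 bytes)
print(capacity / 10**6)         # 1.47456  -> value if MB means 10^6 bytes
print(capacity / 2**20)         # 1.40625  -> value if MiB means 2^20 bytes
# Neither comes out to 1.44, which is exactly the confusion JEDEC cites.
```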
 
JEDEC says the industry needs to get off of -B for base two:
But they said they were not going to change it... and they continue to publish standards stating GB as base 2, lol. In that same standard you quoted a fragment of.
Have you seen DDR4 DIMMs? Still stamped GB.

Frankly, adding a second term leads to confusion. Just knowing storage is base 10 and memory is base 2 is fairly straightforward... now if storage would just start being created as base 2... all would be right in the world.


6 more days and the speculation ends... yay.
 
JEDEC isn't forcing manufacturers to switch to -iB because the usage of -B as base 2 in memory predates the -iB IEC standard. When comparing memory sticks made two decades ago to today, the units still match.

There will come a day when memory is measured in base 10 and JEDEC will have a mess on its plate.
 
Well, I had some high-end cards then, if x9xx was considered high end. X1950 Pro, HD6950 and now HD7950. Yeah, I skipped the HD2900 and HD3xxx series. With HD4000 I "only" had the HD4850, but I don't remember an HD4950 even existing back then...
 
Just send the bloody Fury Pro to market and take my money :roll:
 
... we still refer to SSD NAND in base 10 though... as storage... and yes, I know what non-volatile means... but NVDIMMs are a specific technology of battery-backed DIMMs, typically with NAND on the DIMMs as well.
Well, the DIMMs are, but NVRAM isn't... Prob won't see anything similar in the next 10 years anyhow...

Based on the pixel count, 4K should only use ~20% more space than 1080p... DX12 will reduce the bloat a good bit... supposedly. Just because memory usage scales up from plebeian settings doesn't mean it isn't bloated at the lower settings as well.

Even the bloated Skyrim only uses double going from 1080p to 4K...



TESV_VRAM.png

DX12 isn't going to improve Skyrim performance. Skyrim is DX9; it runs on Gamebryo, which is nearly 15 years old.
Also, the vanilla textures and meshes are pretty low-res. GTA V is high-res... high detail.



For those games that do chug memory in the future... they will also probably need more GPU horsepower to handle 4K anyway, and with DX12, for the first time, two 4 GB cards will actually give you 8 GB of usable space.
That's due to proper virtual addressing support in WDDM 2.0, not DX12.
DX11 has unified addressing already but Windows doesn't apply it properly. Hence why DX11 games in W7/8 get a crapton of system memory allocated for GPU paging.
 
I know that but realize that when Xbox 360 and PS3 ports came to PC, they jumped from 256 MiB VRAM usage to at least 1 GiB VRAM easily. If that same ratio applies to current gen ports, we will soon be seeing games that can put 16 GiB of VRAM to work. The resolution doesn't have to be huge to use lots of memory--the game just needs lots of large, highly detailed textures. Many games are moving in that direction quickly. 4 GiB just isn't a lot anymore; it is average. Fury is not supposed to be an average card. 12 GiB in the Titan X is a lot.


Well, realize that GTA5 was released at the "end" of that console cycle.
And while those consoles inherently stood still in the land of technological advancement, the PC simply moved on in all those years.
So yeah, they have a lot more memory to play/work with on the PC and the newer consoles (as they should).

By that ratio, your 16 GiB of VRAM will happen... let's say 4-5 years in the future, and by that time we'll have seen many a new generation of video card, and nobody will think about the current "flagships" anymore.
 
My current card is 5.5 years old. 8 GiB may not mean much right now but two years from now, 4 GiB will be the laughing stock.
 
My current card is 5.5 years old. 8 GiB may not mean much right now but two years from now, 4 GiB will be the laughing stock.

Sooo, you are running an "old" card which is still working out fine, it seems (otherwise you would have replaced it)... but a 4 GiB card won't?
If 8 GiB shows its value two years from now... well, then upgrade at that time.
If you are arguing a video card should last you longer than that:

1. Then I agree, in the sense that I find the current performance jumps way too small; the GTX 980 Ti and Titan X are a joke imo with the price they command and the performance these supposedly high-end cards produce.

2. Then I do not agree, in the sense that it does last longer than 2 years, but then that consumer is clearly not an enthusiast and falls in the same bracket as you; a 4 GiB card will still be able to run all the games 2 years from now (probably more), just perhaps not on maxed settings (like your previous example of GTA5 at 4K resolution).

Either way it's not a problem; 4 GiB right now is plenty, with perhaps the exception of GTA5 at 4K (although we have benched cards with less than 6 GiB of VRAM at 4K, so riddle me that).
The enthusiast can upgrade when more is needed; the not-so-enthusiast will just play on settings that work for the GPU.
 