Tuesday, December 2nd 2014

Choose R9 290 Series for its 512-bit Memory Bus: AMD

In one of its first interviews since the GeForce GTX 900 series launch, AMD maintained that its Radeon R9 290 series products are still competitive. Speaking with TweakTown, Corporate Vice President of Global Channel Sales Roy Taylor said that gamers should choose the Radeon R9 290X "with its 512-bit memory bus" at its current price of US $370. He stated that the current low pricing of the R9 290 series is due to "ongoing promotions within the channel," and that AMD didn't make an official price adjustment on its end. Taylor dodged questions on when AMD plans to launch its next high-end graphics products, whether they'll measure up to the GTX 900 series, and whether AMD is working with DICE on "Battlefield 5." You can find the full interview at the source link below.
Source: TweakTown

107 Comments on Choose R9 290 Series for its 512-bit Memory Bus: AMD

#76
midnightoil
EarthDogYou mention it's a fact, but... how do you know it's a fact? You haven't supported that assertion with any links.

As I said, NVIDIA was planning on the shrink for Maxwell, but TSMC not being ready delayed it, essentially forcing them to stick with 28nm for this release. Not resting on their laurels, I think they did a pretty damn good job of increasing IPC and memory efficiency and cutting power consumption on that same process to bring out the well-received 900 series. With that in mind, I think NVIDIA will be in a position to catch up sooner rather than later IF AMD brings a game changer to the table.

Remember, NVIDIA has also brought a die shrink to the table on the same platform before (the GTX 260 going from 65nm to 55nm, IIRC). Who's to say they don't have the 20nm plans still on the shelf, ready to go?
Because it is a fact. NVIDIA's 20nm plans relied entirely on the canned 20nm HP process at TSMC. They're now waiting for FinFET to be ready.

AMD weren't relying on that (20nm HP), but may have canned 20nm due to delays / improvements in 28nm / cost / not wanting to go to another node with TSMC when they want to shift production to GF / HBM providing large power savings anyway.
Posted on Reply
#77
EarthDog
Because it is a fact.
LOL, enough already... Bring something to the table, friend. :)

To say they had "no plans" when they had plans and scrapped them because TSMC wasn't ready on that node... Do you think they burned them and they don't exist?? I would imagine the rumor of NVIDIA skipping it and going to 14nm is true as well... but that depends on the fab, whether it's ready, whether yields are good, etc. It wouldn't make any sense to me for them to scrap their 20nm plans and put ALL their eggs in that one basket.
Posted on Reply
#78
64K
KarymidoNIt was the same with the R9 2XX series: I expected them to be more economical with energy and they were not; I hoped they would be less hot and noisy, but they were not... AMD focuses on competitive pricing and reasonably good performance; the problem is that the reference coolers are horrible and the custom models are not attractive.
If by February 2015 AMD hasn't launched a new GPU that really improves on these two points, I'll throw my two R9 270X cards in the trash and buy an ASUS Strix GTX 980. I am Brazilian, and here the NVIDIA cards are more expensive than AMD; the Strix 980 costs about US $1,200, but I have no choice if I want to improve my system without having to change the power supply. I don't want to have to buy a 1200 W power supply for the new R9 300 series cards.
You would only be hurting yourself if you throw those cards in the trash. You could sell them. What resolution are you gaming at? A single card solution might be a better choice for you. You wouldn't need a 1200 watt PSU anyway to run whatever will be the equivalent of your crossfire R9 270X cards in the R9 300 series. The power consumption will probably be about the same but you will get more performance. If you're gaming at 1440p or less then one high end card would be enough. Start a thread in the Graphics Card forum and maybe we can sort this out if you want to.
Posted on Reply
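To put rough numbers on the power-supply question above, here is a quick sizing sketch in Python, for illustration only: the card figures are the commonly cited reference board-power numbers, while the rest-of-system draw and the 30% headroom rule are assumptions.

    # Rough PSU sizing sketch; card TDPs are reference board-power figures
    r9_270x_tdp_w = 180        # per card
    gtx_980_tdp_w = 165
    rest_of_system_w = 200     # CPU, board, drives, fans -- assumed ballpark

    crossfire_load = 2 * r9_270x_tdp_w + rest_of_system_w   # ~560 W
    single_980_load = gtx_980_tdp_w + rest_of_system_w      # ~365 W

    def suggested_psu_w(load_w, headroom=1.3):
        # ~30% headroom keeps the PSU in its efficient operating range
        return round(load_w * headroom, -1)

    print(f"CrossFire R9 270X: ~{crossfire_load} W load -> ~{suggested_psu_w(crossfire_load):.0f} W PSU")
    print(f"Single GTX 980:    ~{single_980_load} W load -> ~{suggested_psu_w(single_980_load):.0f} W PSU")

Either setup lands around 470-730 W under these assumptions, nowhere near 1200 W.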
#79
midnightoil
EarthDogLOL, enough already... Bring something to the table, friend. :)

To say they had "no plans" when they had plans and scrapped them because TSMC wasn't ready on that node... Do you think they burned them and they don't exist?? I would imagine the rumor of NVIDIA skipping it and going to 14nm is true as well... but that depends on the fab, whether it's ready, whether yields are good, etc. It wouldn't make any sense to me for them to scrap their 20nm plans and put ALL their eggs in that one basket.
I've contributed the facts we do know. You've contributed nothing.

I've no idea what you're even talking about. 20nm isn't one homogeneous process. NVIDIA had designs for TSMC's HP 20nm process, but that was shelved a long time ago. Their existing designs (Maxwell & Kepler) won't work on an LP process. Pascal might have, if 20nm LP bulk planar was what they were aiming at, but it's ages away and is surely aiming at FinFET. So they might as well have burned them, because there's no process available to them that they can produce them on ... unless you're proposing that they release half a dozen semi-functional prototypes for $15m each?

AMD has had iterations of their GPU designs for both HP and LP. Because 20nm was so delayed, so capacity constrained and so expensive, their TSMC 20nm LP options probably are shelved ... but maybe not - we'll soon see.
Posted on Reply
#80
renz496
midnightoilI've contributed the facts we do know. You've contributed nothing.

I've no idea what you're even talking about. 20nm isn't one homogeneous process. NVIDIA had designs for TSMC's HP 20nm process, but that was shelved a long time ago. Their existing designs (Maxwell & Kepler) won't work on an LP process. Pascal might have, if 20nm LP bulk planar was what they were aiming at, but it's ages away and is surely aiming at FinFET. So they might as well have burned them, because there's no process available to them that they can produce them on ... unless you're proposing that they release half a dozen semi-functional prototypes for $15m each?

AMD has had iterations of their GPU designs for both HP and LP. Because 20nm was so delayed, so capacity constrained and so expensive, their TSMC 20nm LP options probably are shelved ... but maybe not - we'll soon see.
Bro, don't confuse the rumors we've heard so far with plain reality or FACT. Since you insist it is a fact, give a link to prove it: not some rumor article, but an official NVIDIA statement that they will skip the 20nm node altogether. Same with your claim that AMD designed its discrete GPUs for both HP and LP processes: give a link to prove it.
Posted on Reply
#81
Keullo-e
S.T.A.R.S.
I remember from the HD 2900 XT that bus width alone doesn't do the magic... What a stupid response from AMD. :/
Posted on Reply
#82
plonk420
cedrac18The $190 used price on eBay is my price point. Thank you Nvidia, I have never and will never pay more than $200 for a single component.
I got my 290 new for $215 on Black Friday.
Dj-ElectriCNow, power and drivers or at-launch optimization? well...
What do you mean, drivers? Yeah, the panel is effing slow, but I can't think of a time when I've had driver issues other than OpenCL when trying to mine cryptocoins (and I usually run freakin' olllld drivers).
Posted on Reply
#83
KarymidoN
64KYou would only be hurting yourself if you throw those cards in the trash. You could sell them. What resolution are you gaming at? A single card solution might be a better choice for you. You wouldn't need a 1200 watt PSU anyway to run whatever will be the equivalent of your crossfire R9 270X cards in the R9 300 series. The power consumption will probably be about the same but you will get more performance. If you're gaming at 1440p or less then one high end card would be enough. Start a thread in the Graphics Card forum and maybe we can sort this out if you want to.
I use two monitors (1080p each). I am not disappointed with the performance, but with the temperature and the noise... I've never used Nvidia, and I'm sure that AMD will release cards more powerful than the GTX 900 series, but AMD never invests in lowering the energy consumption and noise of its GPUs. For this reason I will opt for Nvidia's 900 series instead of the R9 300...
Posted on Reply
#84
siki
LinkProUntil AMD finally figures out how to write proper drivers I will stay with nVidia.
If AMD (ATI) hasn't figured that out by now, they never will.
I mean, this story about ATI and drivers goes back to the nineties.
Posted on Reply
#85
MxPhenom 216
ASIC Engineer
renz496Bro, don't confuse the rumors we've heard so far with plain reality or FACT. Since you insist it is a fact, give a link to prove it: not some rumor article, but an official NVIDIA statement that they will skip the 20nm node altogether. Same with your claim that AMD designed its discrete GPUs for both HP and LP processes: give a link to prove it.
This. There has been no official statement from either company on the state of 20nm. It is an assumption plus a rumor from an article that spread across the internet and is now, it seems, interpreted as fact. The fact of the matter is, unless you are in the industry, you really don't know much about it.
Posted on Reply
#86
ManofGod
I love this thread, it is very entertaining. I have had my XFX R9 290 reference version for about 13 months of enjoyment. Nvidia fans claim their 970 is the bestest that ever was and that AMD neveressess bothers with reducing power consumption. :rolleyes: (All the while petting their card and saying, "My precious.") Oh well, it is your money, have fun with it. :nutkick:

Oh, and my reference R9 290, flashed to a 290X, is not too hot or loud at all. Of course, I have a proper case and do not have the card 2 inches from my ear, so there is that. The 512-bit memory bus does help and does not require 10 Billion Gigahurts of memory speed. But you needed a card with Hynix memory to avoid the potential problems that did occur. (Mine has Hynix memory and has had zero problems.)

That is ok, I am sure Nvidia will take real good care of you all. (Except for the nForce 3 debacle and the Nvidia chips failing in laptops just past the one-year mark, but oh well, forgive and forget, eh?)
RecusSo 24 wheels is better than 4, right? Right?


Yes, that 24-wheel real truck is better than that 10-inch model car.
Posted on Reply
#87
MxPhenom 216
ASIC Engineer
You should try editing your posts instead of double posting.
Posted on Reply
#88
TheinsanegamerN
ManofGodI love this thread, it is very entertaining. I have had my XFX R9 290 reference version for about 13 months of enjoyment. Nvidia fans claim their 970 is the bestest that ever was and that AMD neveressess bothers with reducing power consumption. :rolleyes: (All the while petting their card and saying, "My precious.") Oh well, it is your money, have fun with it. :nutkick:

Oh, and my reference R9 290, flashed to a 290X, is not too hot or loud at all. Of course, I have a proper case and do not have the card 2 inches from my ear, so there is that. The 512-bit memory bus does help and does not require 10 Billion Gigahurts of memory speed. But you needed a card with Hynix memory to avoid the potential problems that did occur. (Mine has Hynix memory and has had zero problems.)

That is ok, I am sure Nvidia will take real good care of you all. (Except for the nForce 3 debacle and the Nvidia chips failing in laptops just past the one-year mark, but oh well, forgive and forget, eh?)


Yes, that 24-wheel real truck is better than that 10-inch model car.
Well, consider the fact that Nvidia's second-tier card is faster than AMD's fastest while pulling half the power, and without needing a massive memory bus to accomplish it; it's no wonder people say the 970 is better, BECAUSE IT IS. The 2900 XT had a 512-bit bus as well, and it didn't matter when the GPU couldn't use the bandwidth.
I mean, you could track down a 290X with the right memory so you can overclock it on that 512-bit bus, or you could just get the 970, since it's still faster anyway and you don't have to worry about what kind of memory it has.
And don't act like AMD chips in laptops never fail either (the MacBook Pros and iMacs with AMD chips had the same problem), and the nForce 3 was OVER A DECADE AGO, so yeah, it doesn't really matter now.
Posted on Reply
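To put numbers on the bus-width argument in the posts above, here is a quick bandwidth sketch in Python. Peak memory bandwidth is simply bus width times effective data rate; the specs used are the commonly quoted reference figures, and raw bandwidth of course says nothing about how well a given GPU actually uses it.

    # Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in GT/s
    def bandwidth_gbs(bus_bits, data_rate_gtps):
        return bus_bits / 8 * data_rate_gtps

    print(bandwidth_gbs(512, 5.0))    # R9 290X: 512-bit @ 5.0 Gbps      -> 320.0 GB/s
    print(bandwidth_gbs(256, 7.0))    # GTX 970/980: 256-bit @ 7.0 Gbps  -> 224.0 GB/s
    print(bandwidth_gbs(512, 1.656))  # HD 2900 XT: 512-bit @ ~1.66 Gbps -> ~106 GB/s

The last line is the point made about the 2900 XT: a wide bus only helps when the chip behind it can turn the bandwidth into performance.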
#89
hadesjoy72
SteveS45Personally I don't think AMD is the only camp with a driver issue. Both camps in my opinion are equally meh.
I have two systems: one with an R9 280X and an HD 7970 in CrossFire, and one with a new MSI Gold (Bronze) edition GTX 970.

The GTX 970 has been having a lot of problems over DisplayPort, with screen tearing after coming back from sleep. Google "GTX 900 DisplayPort tearing/black screen"; a lot of people have the same problem. And sometimes switching from Nvidia Surround to normal triple monitor, or vice versa, causes a BSOD on Windows 7.

On the HD 7970 side, I wouldn't say AMD has better or flawless drivers; we all know they don't. But I don't see the Nvidia drivers as superior in any way.

So I think that, driver- and feature-wise, both camps are equally meh.
Ahh... a voice of reason in an otherwise chaotic thread... thank you, sir.
Posted on Reply
#90
Sony Xperia S
LinkProUntil AMD finally figures out how to write proper drivers I will stay with nVidia.
AMD videocards give you better image quality and it's pretty obviously proven here:

pcmonitors.info/reviews/aoc-i2369vm/

We also briefly tested the monitor using an Nvidia GTX 670 just to see if there were any obvious colour differences. As usual the Nvidia card sent out completely the wrong colour signal to the monitor, washing out colours and hugely reducing contrast. To rectify this and make everything look as it should for PC use the following steps should be taken in Nvidia Control Panel.
Posted on Reply
#91
EarthDog
Sony Xperia SAMD videocards give you better image quality and it's pretty obviously proven here:

pcmonitors.info/reviews/aoc-i2369vm/
There is a difference between adjusting gamma, etc., and the image quality produced by drivers. ;)

That is like taking two of the same TVs sitting on the shelf next to each other and saying one is inferior because it wasn't calibrated as well as the other.

Come on now...
Posted on Reply
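For what it's worth, the "washed out colours" and "hugely reduced contrast" described in the quoted review are most commonly attributed to the card outputting a limited-range (16-235) RGB signal to a monitor expecting full range (0-255): a settings/default mismatch rather than a difference in rendered image quality. A minimal Python sketch of that mismatch follows; the conversion is the standard full-to-limited mapping, while the scenario itself is an assumption about what the reviewer saw.

    # Full-range RGB uses 0-255; limited ("video") range maps black to 16 and white to 235.
    def full_to_limited(v):
        return round(16 + v * 219 / 255)

    for name, v in [("black", 0), ("mid grey", 128), ("white", 255)]:
        print(f"{name}: full {v:3d} -> limited {full_to_limited(v):3d}")

    # If the monitor then interprets that limited signal as full range, black is lifted
    # to 16 and white drops to 235 -- raised blacks and reduced contrast, a settings
    # issue rather than a driver image-quality issue.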
#92
ayazam
remember me when the R9 290/X came out? Was it a year ago?
Yet you're all still comparing it, for god's sake...
Posted on Reply
#93
Tatty_Two
Gone Fishing
ayazamremember me when the R9 290/X came out? Was it a year ago?
Yet you're all still comparing it, for god's sake...
How could anyone remember you? This is your first post :)
Posted on Reply
#94
ayazam
Tatty_OneHow could anyone remember you? This is your first post :)
:laugh: Yes, and I'm sorry...
I just felt the 'need' to make an account, though. I have loved TPU :lovetpu: since Fermi came into the world...
Posted on Reply
#95
xenocide
Sony Xperia SAMD videocards give you better image quality and it's pretty obviously proven here:

pcmonitors.info/reviews/aoc-i2369vm/
So they configured it using an AMD card initially, then just plopped an Nvidia card in there and were shocked to find the color and gamma settings were off... oh wait, the line after the one you highlighted points out that they showed you exactly how to correct it (assuming you for some reason swap out your video cards regularly). That accusation is a stretch at best. I could always take the low-hanging fruit and say something to the tune of "does it matter what the image quality is if the game never even runs?"
Posted on Reply
#96
Sony Xperia S
xenocideSo they configured it using an AMD card initially, then just plopped an Nvidia card in there and were shocked to find the color and gamma settings were off... oh wait, the line after the one you highlighted points out that they showed you exactly how to correct it (assuming you for some reason swap out your video cards regularly). That accusation is a stretch at best. I could always take the low-hanging fruit and say something to the tune of "does it matter what the image quality is if the game never even runs?"
Nope, I think you are just confused. Nvidia cards give false colours in all cases: when you turn on your monitor for the first time, and also when you have used whatever you like before to adjust it...
Posted on Reply
#97
EarthDog
Sony Xperia SNope, I think you are just confused. Nvidia cards give false colours in all cases: when you turn on your monitor for the first time, and also when you have used whatever you like before to adjust it...
LOL.
Posted on Reply
#98
Tatty_Two
Gone Fishing
Sony Xperia SNope, I think you are just confused. Nvidia cards give false colours in all cases: when you turn on your monitor for the first time, and also when you have used whatever you like before to adjust it...
Some of what you have said "may" have been true in the analogue age, as analogue output quality was pretty much down to the quality of the RAMDAC, but in our digital age you are way out; in fact, technically it's so unlikely that if you were a betting man you would be better off putting your money on a chocolate horse in a desert horse race.

Take a look at the link below, which tests two different cards in detail over a DVI digital connection. I just want to quote one piece that talks about why, technically, there should not be any differences unless of course settings have been deliberately tampered with:

"The problem is that when you learn a bit about how graphics actually work on computers, it all seems to be impossible. There is no logical way this would be correct. The reason is because a digital signal remains the same, no matter how many times it is retransmitted or changed in form, unless something deliberately changes it.

In the case of color on a computer, you first have to understand how computers represent color. All colors are represented using what is called a tristimulus value. This means it is made up of a red, green, and blue component. This is because our eyes perceive those colors, and use that information to give our brains the color detail we see.

Being digital devices, that means each of those three colors is stored as a number. In the case of desktop graphics, an 8-bit value, from 0-255. You may have encountered this before in programs like Photoshop that will have three sliders, one for each color, demarcated in 256 steps, or in HTML code where you specify colors as #XXYYZZ, where each pair of characters is a color value in hexadecimal (FF in hex is equal to 255 in decimal).

When the computer wants a given color displayed, it sends that tristimulus value to the video card. There the video card looks at it and decides what to do with it based on its lookup table. By default, the lookup table doesn't do anything; it is a straight line, specifying that the output should be the same as the input. It can be changed by the user in the control panel, or by a program such as a monitor calibration program. So by default, the value the OS hands the video card is the value the video card sends out over the DVI cable.

What this all means is that the monitor should be receiving the same digital data from either kind of card, and thus the image should be the same. Thus it would seem the claim isn’t possible"
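To make the quoted explanation concrete, here is a minimal sketch in Python of an 8-bit tristimulus colour passing through a default (identity) lookup table; the values are made up purely for illustration:

    # An 8-bit tristimulus colour: one red, one green and one blue component, each 0-255 (0xFF == 255)
    colour = (0xC8, 0x40, 0x1A)      # i.e. (200, 64, 26)

    identity_lut = list(range(256))  # the default LUT: output value == input value

    sent_over_dvi = tuple(identity_lut[c] for c in colour)
    assert sent_over_dvi == colour   # unchanged, so any card with a default LUT sends the same digital data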


I will be honest now and say that in the past I too believed there to be visual differences. There may well have been, but it is highly likely that those differences were caused by me, in as much as settings, different cabling, etc.; in a "clean" environment it seems there are none.

hardforum.com/showthread.php?t=1694755

Apologies..... I allowed a thread derail to derail me!
Posted on Reply
#99
Sony Xperia S
EarthDogLOL.
I honestly do NOT understand what exactly you are laughing at. Nvidia drivers send a signal to a display that differs from what ATI Catalyst sends.
Actually, I use the default colour, brightness, gamma, etc. settings on my AMD rig, and I have to manually reduce the brightness and adjust the contrast on my Nvidia machine because the screen image looks unnatural.

What is so funny about it? Maybe it's that you didn't get anything from the article itself or the methodology they used? :D
Tatty_OneApologies..... I allowed a thread derail to derail me!
No apologies at all, man.

The thread should be:

Choose R9 290 Series for its superior image quality compared to the competition's: AMD
Posted on Reply
#100
Tatty_Two
Gone Fishing
Sony Xperia SI honestly do NOT understand what exactly you are laughing at. Nvidia drivers send a signal to a display that differs from what ATI Catalyst sends.
Actually, I use the default colour, brightness, gamma, etc. settings on my AMD rig, and I have to manually reduce the brightness and adjust the contrast on my Nvidia machine because the screen image looks unnatural.

What is so funny about it? Maybe it's that you didn't get anything from the article itself or the methodology they used? :D



No apologies at all, man.

The thread should be:

Choose R9 290 Series for its superior image quality compared to the competition's: AMD
One problem with that... it simply isn't true, and I am an AMD man! If you read my post you will see that, unless you tamper with settings, it's pretty much technically impossible.
Posted on Reply