Thursday, April 5th 2018

NVIDIA GeForce GT 1030 Shipping with DDR4 Instead of GDDR5

Low-end graphics cards usually don't attract much attention from the enthusiast crowd. Nevertheless, not all computer users are avid gamers, and most average-Joe users are perfectly happy with an entry-level graphics card, for example, a GeForce GT 1030. To refresh our memories a bit, NVIDIA launched the GeForce GT 1030 last year to compete against AMD's Radeon RX 550. It was recently discovered that several manufacturers have been shipping a lower-spec'd version of the GeForce GT 1030. According to NVIDIA's official specifications, the reference GeForce GT 1030 shipped with 2 GB of GDDR5 memory running at 6008 MHz (GDDR5-effective) across a 64-bit wide memory bus, which amounts to a memory bandwidth of 48 GB/s. However, some models from MSI, Gigabyte, and Palit come with DDR4 memory operating at 2100 MHz instead. If you do the math, that comes down to a memory bandwidth of 16.8 GB/s, which is certainly a huge downgrade, on paper at least. The good news amid the bad is that the DDR4-based variants consume 10 W less than the reference model.
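To sanity-check those figures: peak memory bandwidth is just the effective transfer rate multiplied by the bus width in bytes. Here is a minimal sketch in Python (the function name is ours, purely illustrative):

    def memory_bandwidth_gbps(effective_mtps, bus_width_bits):
        # transfers per second * bytes moved per transfer, expressed in GB/s
        return effective_mtps * 1e6 * (bus_width_bits / 8) / 1e9

    print(memory_bandwidth_gbps(6008, 64))  # GDDR5 variant: ~48.1 GB/s
    print(memory_bandwidth_gbps(2100, 64))  # DDR4 variant: 16.8 GB/s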

Will this memory swap affect real-world performance? Probably. However, we won't know to what extent without proper testing. Unlike the GeForce MX150 fiasco, manufacturers were kind enough to let consumers know the difference between the two models this time around. The lower-end DDR4 variant carries a "D4" designation as part of the graphics card's model name, and consumers can also find the designation on the box. Beware, though, as not all manufacturers will give you the heads-up. For example, Palit doesn't.
Source: Tom's Hardware

59 Comments on NVIDIA GeForce GT 1030 Shipping with DDR4 Instead of GDDR5

#26
dorsetknob
"YOUR RMA REQUEST IS CON-REFUSED"
This Thread is a prime example of one going/gone toxic
Posted on Reply
#27
sutyi
T4C FantasyDDR4 is so much cheaper in bulk and there won't be a huge performance drop
The GT 1030 might not be a high-end GPU, but cutting it down to 1/3rd of the memory bandwidth will probably tank the performance below a Ryzen 2200G...

This should be a GT 1030 LE/SE or GT 1020; this is just bad for the market. Lots of low-budget gamers will buy these cause they will be even cheaper than a "regular" GT 1030, and then wonder where that little bit of performance they were hoping to get went in the first place...

But hey, at least it is not GeForce4 MX level of stupid, which was basically a revamped GeForce2 under the sticker with D3D7, while the rest of the family was D3D8, Ti 4200 and up.

Does someone remember that glorious G92 GPU that seemingly never went away through three generations or so?

8800GS
8800GT
8800GTS
9600GSO
9800GT Green Edition
9800GT
9800GTX
9800GTX+
GTS 150 (OEM)
GT 230 (OEM)
GTS 240 (OEM)
GTS 250
GTX 260M
GTX 280M
GTX 285M

If you look hard enough you may find one in a G-Sync monitor near you... lel.

But coming to think of it... AMD Pitcairn did the same thing.
Posted on Reply
#28
T4C Fantasy
CPU & GPU DB Maintainer
sutyiThe GT 1030 might not be a high-end GPU, but cutting it down to 1/3rd of the memory bandwidth will probably tank the performance below a Ryzen 2200G...

This should be a GT 1030 LE/SE or GT 1020; this is just bad for the market. Lots of low-budget gamers will buy these cause they will be even cheaper than a "regular" GT 1030, and then wonder where that little bit of performance they were hoping to get went in the first place...

But hey, at least it is not GeForce4 MX level of stupid, which was basically a revamped GeForce2 under the sticker with D3D7, while the rest of the family was D3D8, Ti 4200 and up.

Does someone remember that glorious G92 GPU that seemingly never went away through three generations or so?

8800GS
8800GT
8800GTS
9600GSO
9800GT Green Edition
9800GT
9800GTX
9800GTX+
GTS 150 (OEM)
GT 230 (OEM)
GTS 240 (OEM)
GTS 250
GTX 260M
GTX 280M
GTX 285M

If you look hard enough you may find one in a G-Sync monitor near you... lel.

But coming to think of it... AMD Pitcairn did the same thing.
pci-ids.ucw.cz/read/PC/10de/1140
that ID holds more trash than the ISS (International Space Station)
Posted on Reply
#29
Blueberries
10W is huge when you're talking about heat, and memory is where most models run into heat-dissipation problems.

Pretty brilliant, really. It's not a performance card.
Posted on Reply
#30
ppn
10 watts under load, that is. At idle it's less than a watt.
Posted on Reply
#31
R-T-B
AssimilatorAre you honestly going to claim with a straight face that most users of this traditionally pro-AMD forum are not going to jump to the obvious conclusion?
I think the audience's interpretation is irrelevant to what the statement actually states.
Posted on Reply
#32
newtekie1
Semi-Retired Folder
AssimilatorThat's correct, GDDR4 only ever shipped on a handful of products because GDDR3 is cheaper and GDDR5 (which arrived shortly after) is faster. I'm guessing plain-Jane DDR4 is used because it sips less power than the G- variants.
I'd guess it is actually because there really isn't a supply of GDDR4, and regular DDR4 chips are cheaper to get a hold of.
Posted on Reply
#33
Melvis
Turbo Cache?

This will just make the Ryzen APUs look even more tempting to buy.
Posted on Reply
#34
eidairaman1
The Exiled Airman
newtekie1I'd guess it is actually because there really isn't a supply of GDDR4, and regular DDR4 chips are cheaper to get a hold of.
The last cards AFAIK that used GDDR4 were the HD 2900 and 3800 series.
Posted on Reply
#36
evernessince
birdieNot enough drama!

But before NVIDIA takes mostly undeserved flak, let's throw in some logic, reasoning, common sense and data for a change. :)

First of all, NVIDIA does not sell [consumer GPUs based on the] GT 1030 [chip], unlike the title of this click-bait news says.

Secondly, GT 1030 specs page does not list a memory type.

Thirdly, it's up to NVIDIA's partners to specify low speed memory type and, oh, my god, at least MSI is semi-honest about that: GEFORCE GT 1030 2GD4 LP OC and GEFORCE GT 1030 2GHD4 LP OC (there are two more cards with GD4 monikers but I'm too lazy to list them) - see, it's "D4" meaning DDR4.

Fourthly, GT 1030 is such an underpowered chip, lower spec'ed RAM will hardly make it significantly slower than it already is. Hardly anyone buys GT 1030 to game - this chip is barely faster than built-in Coffee Lake graphics.

Fourthly, it must always be up to a buyer to verify his or her purchases against previously known specs.

Sixthly, it's up to NVIDIA's partners to specify the cards which have a worse memory configuration - this is perhaps the only thing you might accuse NVIDIA of.

Now, let's have some serious drama and loud vapid accusations.
First off, the blame ultimately falls on Nvidia if its partners are selling cards that don't live up to expectations. Nvidia surely has tight control over them; if there are variants of a card with large performance differences, it is up to Nvidia to either brand those differently or tell their partners to do so. If not, it's not the partners taking the hit for underperforming GT 1030s, it's Nvidia.

Second, Nvidia not listing the memory spec is an error on their end, given that every major retailer has them listed as GDDR5. They should especially do so now, as another variant of the card will only add to the confusion without proper labeling.

Third, it doesn't matter how underpowered the chip is; it is not an excuse to sell a DDR4 card as a stand-in for a GDDR5 card. Nvidia had better be properly labeling these or else they will have another GTX 970 situation on their hands.

Fourth, yes, to some extent it is the buyer's job to check specs. On the other hand, it is also Nvidia's job to make sure those specs are made clear through branding in the first place. The same thing happened when AMD introduced the RX 450 D, a low-end GPU that had fewer cores, which AMD distinguished with a D. Nvidia can do the same.

FYI, your 3rd point and 6th point are exactly the same. You also skipped 5.

The only thing you've done here is give everyone advance notice of what Nvidia might not do and thus cause customer confusion. If your plan was to stop fanboys or whatever, all you did was give them fuel.
Posted on Reply
#37
T4C Fantasy
CPU & GPU DB Maintainer
evernessinceFirst off, the blame ultimately falls on Nvidia if its partners are selling cards that don't live up to expectations. Nvidia surely has tight control over them; if there are variants of a card with large performance differences, it is up to Nvidia to either brand those differently or tell their partners to do so. If not, it's not the partners taking the hit for underperforming GT 1030s, it's Nvidia.

Second, Nvidia not listing the memory spec is an error on their end, given that every major retailer has them listed as GDDR5. They should especially do so now, as another variant of the card will only add to the confusion without proper labeling.

Third, it doesn't matter how underpowered the chip is; it is not an excuse to sell a GDDR4 card as a stand-in for a GDDR5 card. Nvidia had better be properly labeling these or else they will have another GTX 970 situation on their hands.

Fourth, yes, to some extent it is the buyer's job to check specs. On the other hand, it is also Nvidia's job to make sure those specs are made clear through branding in the first place. The same thing happened when AMD introduced the RX 450 D, a low-end GPU that had fewer cores, which AMD distinguished with a D. Nvidia can do the same.

FYI, your 3rd point and 6th point are exactly the same. You also skipped 5.

The only thing you've done here is give everyone advance notice of what Nvidia might not do and thus cause customer confusion. If your plan was to stop fanboys or whatever, all you did was give them fuel.
The card has DDR4, not GDDR4; GDDR4 is a 2006 technology that died in 2008.
Posted on Reply
#38
_JP_
Honestly, not an issue. It's NVIDIA's low-end chip. Its purpose is to decode video and have a framebuffer that is not your RAM.
Man, this should be news if they launched these cards with ridiculous amounts of memory, like that 630 4GB back then. :)
Posted on Reply
#39
Midland Dog
birdieNot enough drama!

But before NVIDIA takes mostly undeserved flak, let's throw in some logic, reasoning, common sense and data for a change. :)

First of all, NVIDIA does not sell [consumer GPUs based on the] GT 1030 [chip], unlike the title of this click-bait news says.

Secondly, GT 1030 specs page does not list a memory type.

Thirdly, it's up to NVIDIA's partners to specify low speed memory type and, oh, my god, at least MSI is semi-honest about that: GEFORCE GT 1030 2GD4 LP OC and GEFORCE GT 1030 2GHD4 LP OC (there are two more cards with GD4 monikers but I'm too lazy to list them) - see, it's "D4" meaning DDR4.

Fourthly, GT 1030 is such an underpowered chip, lower spec'ed RAM will hardly make it significantly slower than it already is. Hardly anyone buys GT 1030 to game - this chip is barely faster than built-in Coffee Lake graphics.

Fourthly, it must always be up to a buyer to verify his or her purchases against previously known specs.

Sixthly, it's up to NVIDIA's partners to specify the cards which have a worse memory configuration - this is perhaps the only thing you might accuse NVIDIA of.

Now, let's have some serious drama and loud vapid accusations.
The 1030 and 750 Ti trade blows; I don't know any iGPU that can do what a 750 Ti can do, so I don't know any iGPU that can do what a 1030 can do. Also, it will make a difference: it's already a slow card which just got significantly slower.
Posted on Reply
#40
renz496
Eric3988All I know is that when this happens to Radeon cards, the forum explodes with outrage, but when Team Green cards are getting downgraded nobody cares. The sad part is that we power users will know the difference, so if we put together a PC for mom or dad, this won't be an issue. The problem is when mom or dad get a PC with this in there and they're still getting buffering on their YouTube videos or are getting low FPS on solitaire :laugh:
Nah, when it happens to the red team, all I hear is "team green does it too." Take rebrands, for example: NVIDIA doing a rebrand is cheating customers, period; AMD doing a rebrand is a brilliant strategy, since the rebranded parts get a new name and are much cheaper for the masses to buy, with no need to make all-new chips from top to bottom every time a new series comes out.

Both AMD and NVIDIA are just happy to get free publicity from the AMDiots and Nvidiots.
Posted on Reply
#41
kruk
I'm pretty shocked at how many people say it's not a big deal because this card is weak. A gimped card is a gimped card and should be properly differentiated. I know you might not care because it's not your performance bracket, but then why comment at all? Why do you feel the need to send companies the "no problems here, continue with this" signal and ruin the market for those who actually buy these cards? Come on, show some compassion for fellow PC gamers who buy low-end cards!
Posted on Reply
#42
_JP_
Just buy the ones with GDDR5 and don't buy these versions, or turn to the competition, then.
Show them with your purchase decisions what you want.
Posted on Reply
#43
evernessince
_JP_Just buy the ones with GDDR5 and don't buy these versions, or turn to the competition, then.
Show them with your purchase decisions what you want.
The problem being that which cards are GDDR5 isn't being properly noted.
Posted on Reply
#44
T4C Fantasy
CPU & GPU DB Maintainer
evernessinceThe problem being that which cards are GDDR5 isn't being properly noted.
Gigabyte and MSI say it in the name and on the box.
Posted on Reply
#45
Assimilator
evernessinceThe problem being that which cards are GDDR5 isn't being properly noted.
It is, except on Palit's box, which is a mere render, so expecting full technical detail on there is unreasonable.
Posted on Reply
#46
anselmo
I do agree with differentiating the cards. If one has slower memory, market it differently. And make it cheaper. It can appeal to those who have an old system or something very low-end without an iGPU (or a very weak one) and make a great HD box or a casual gaming system.
Posted on Reply
#48
T4C Fantasy
CPU & GPU DB Maintainer
ExV6kGod, that's disgusting. Is that chip still in production?
It's all in laptops, so it might be.
Posted on Reply
#49
PLAfiller
I guess it makes sense. Anything up to and including GT x40 coupled with DDR3 is not worth it compared to an iGPU. With DDR4, if the price doesn't increase a lot, it may make more sense in this segment.
Posted on Reply
#50
Xzibit
GN just reviewed one.

I think he's angry about it.
Posted on Reply