
G.SKILL Breaks Fastest DDR4 Memory Frequency Overclock at 4255MHz

btarunr

Editor & Senior Moderator
Staff member
G.SKILL International Co. Ltd., the world's leading manufacturer of extreme performance memory and solid-state storage, is extremely excited to announce a new memory record for the fastest DDR4 memory frequency, set at 4255MHz CL18-18-18! This amazing achievement was attained on the ASUS Rampage V Extreme motherboard (X99 chipset) with the Intel Core i7-5960X processor, all under sub-zero liquid nitrogen cooling. Below is a screenshot of the record validation by CPU-Z (validation).
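For perspective, here is a quick back-of-the-envelope sketch (plain DDR arithmetic, not figures from the press release) converting DDR4-4255 CL18 into absolute latency and theoretical per-channel bandwidth:

```python
# Back-of-the-envelope numbers for DDR4-4255 CL18.
# DDR transfers data twice per clock, so the I/O clock
# is half the effective transfer rate.
transfer_rate_mts = 4255               # mega-transfers per second
io_clock_mhz = transfer_rate_mts / 2   # ~2127.5 MHz actual clock
cas_cycles = 18

# Absolute CAS latency: cycles divided by clock frequency.
cas_ns = cas_cycles / io_clock_mhz * 1000
print(f"CAS latency: {cas_ns:.2f} ns")  # ~8.46 ns

# Theoretical peak per 64-bit (8-byte) channel.
bandwidth_gbs = transfer_rate_mts * 8 / 1000
print(f"Peak per-channel bandwidth: {bandwidth_gbs:.1f} GB/s")  # ~34.0 GB/s
```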



View at TechPowerUp Main Site
 
Wake me up when they use dual/quad channel and all cores from the CPU. :)
 
This is just marketing B.S. for the gullible. Anyone with a clue knows there is no tangible system benefit to running real applications on a desktop PC with a discrete GPU with memory beyond 1600 MHz. Paying more for higher-frequency DDR3 or DDR4 is for the foolish.
 
This is just marketing B.S. for the gullible. Anyone with a clue knows there is no tangible system benefit to running real applications on a desktop PC with a discrete GPU with memory beyond 1600 MHz. Paying more for higher-frequency DDR3 or DDR4 is for the foolish.

I was waiting for you...
Really, do you get paid to do this?
 
I'd be more interested in who made the RAM chips than in the brand name. Sure, G.Skill puts some good modules together, but these companies never give credit to whoever made the chips.

Which in turn means companies don't tell you when they change them later on in production (asses).
 
They are breeding......

I'm not a "youngster" and I see the clear benefits to what is happening, the advancement that works will be copied and through proliferation will result in better performance across the mainstream sector and not just the enthusiast, much the same way there was no point in upgrading to DDR3 at first DDR4 will take a generation or two before it becomes fully viable as a replacement.

Also, we must adopt faster, lower-voltage RAM; the amount of on-die termination and the signalling voltage are going to make or break newer process nodes for CPUs, since the memory controller is on-die.

My 1100T begs to differ with the claim that anything beyond 1600 has no benefit: wrenching down the timings and increasing the speed with higher-rated memory, running it close to 2000 MHz, made a significant difference in bandwidth-intensive applications. Intel chips thrive with faster RAM, but the newer AMD chips have a serious cache latency issue.
 
To be fair, I agree with some of the reader's comments, but not all by any means. The logic he uses, whilst understandable, is like saying "there is no need for a land speed record because most everywhere you go there are speed limits in everyday use", so in one respect he is not actually wrong. What is being ignored is the challenge the act brings and the feeling of achievement with its success..... often not measured by logic!
 
These numbers are only for SHOW.
They are not for REAL WORLD USAGE.
Uh.. Most people realize this.
They realize that this is just a press release proclaiming a high overclock on a RAM module
They realize that this isn't realistic.
They realize that this is a marketing gimmick.
It really doesn't need explanation or finger pointing to theoretical "dumbers" that probably don't really exist.
 
I was waiting for you...
Really, do you get paid to do this?

He's right though. Outside of a few very specific (non-gaming) applications and synthetic benchmarks, there are virtually no gains in real-world performance from DDR3 RAM faster than 2133 at resolutions above 640×480.

Uh.. Most people realize this.
They realize that this is just a press release proclaiming a high overclock on a RAM module
They realize that this isn't realistic.
They realize that this is a marketing gimmick.
It really doesn't need explanation or finger pointing to theoretical "dumbers" that probably don't really exist.

yet, "dumbers" do continue to blow money on expensive performance RAM, when they could spend the extra $200 (about the difference between 16gb of 2133mhz DDR3 and 3000mhz DDR3) instead to buy a faster GPU. Something that will actually boost in-game performance more than 1/2 to 1 FPS in resolutions above 640*480

At the end of the day, people can spend their money how they want, but stuff like giant Corsair Dominator DIMMs are good for little more than showing off the size of your e-peen, not any realistic gaming FPS gains.
 
He's right though. Outside of a few very specific (non-gaming) applications and synthetic benchmarks, there are virtually no gains in real-world performance from DDR3 RAM faster than 2133.



yet, "dumbers" do continue to blow money on expensive performance RAM, when they could spend the extra $200 (about the difference between 16gb of 2133mhz DDR3 and 3000mhz DDR3) instead to buy a faster GPU. Something that will actually boost in-game performance more than 1/2 to 1 FPS.



Except when I am editing 25 MB/s 1080p video, or 40+ MB/s 4K video, the processor cache doesn't even have enough space for a second of either, so the next-fastest thing, RAM, will be utilized; and as long as the cores are being fully utilized, the limitation becomes how fast we can read, work on, and then write data back to RAM. In other applications faster RAM has huge impacts too, mostly server-side though.

At the end of the day though (http://en.wikiquote.org/wiki/Talk:Bill_Gates), 640 KB was enough, 1 GHz was huge, and many of us "oldsters" probably remember when we thought that was fast enough. Hell, I remember getting my hands on a 500 MHz AMD and overclocking it to 550, and how unbelievable I thought that was at the time, running SDR-133 RAM in matched pairs to reach up to 150 MHz in some configurations!!! http://www.dansdata.com/fastram.htm
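To put rough numbers on that, a minimal sketch (the bitrates are the ones quoted above; the 20 MB L3 cache size is a hypothetical figure) of how little footage fits in cache versus what dual-channel DDR3 can stream:

```python
# How much footage fits in a CPU cache vs. what RAM can stream.
# The 20 MB L3 is a hypothetical size; bitrates are from the post above.
l3_cache_mb = 20
bitrates_mbs = {"1080p": 25, "4K": 40}

for name, rate in bitrates_mbs.items():
    print(f"{name} @ {rate} MB/s: {l3_cache_mb / rate:.2f} s of footage fits in L3")

# Peak dual-channel DDR3 bandwidth (two 8-byte channels).
for mts in (1600, 2133):
    print(f"DDR3-{mts} dual channel: {mts * 8 * 2 / 1000:.1f} GB/s peak")
```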
 
Except when I am editing 25 MB/s 1080p video, or 40+ MB/s 4K video, the processor cache doesn't even have enough space for a second of either, so the next-fastest thing, RAM, will be utilized; and as long as the cores are being fully utilized, the limitation becomes how fast we can read, work on, and then write data back to RAM. In other applications faster RAM has huge impacts too, mostly server-side though.

At the end of the day though (http://en.wikiquote.org/wiki/Talk:Bill_Gates), 640 KB was enough, 1 GHz was huge, and many of us "oldsters" probably remember when we thought that was fast enough. Hell, I remember getting my hands on a 500 MHz AMD and overclocking it to 550, and how unbelievable I thought that was at the time, running SDR-133 RAM in matched pairs to reach up to 150 MHz in some configurations!!! http://www.dansdata.com/fastram.htm

Outside of a few very specific (non-gaming) applications and synthetic benchmarks, there are virtually no gains in real-world performance from DDR3 RAM faster than 2133 at resolutions above 640×480.

My point, rather, was that the marketing of consumer/gamer RAM by G.Skill, Corsair, et al. is rather disingenuous about the actual price/performance ratio provided by speeds greater than 2133 MHz.

I'm all for more performance, but the reality is that no games on the market today or in the foreseeable future (don't kid yourself, gamers are the market for Corsair and G.Skill) actually benefit from high RAM speeds at realistic resolutions. That's why review sites, including TPU, run their RAM benchmarks at sub-1080p resolutions: to isolate the RAM from the GPU. At 1080p and up, the FPS difference between 2133 MHz and 3000 MHz in virtually all modern games ranges from less than 2% down to within the statistical margin of error.

Again, gamers should spend money on a faster GPU or more RAM, not faster RAM

I keep using 2133 MHz because that seems to have replaced 1600 MHz DDR3 as the entry point for building a gaming desktop, and the price difference between it and 1600 MHz is slight enough to make the splurge of $20 to $40 more pretty inconsequential. Not the $150 to $300 jump you'd need to make to get into the 2800 MHz+ range.
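As a sanity check on that price/performance argument, a hedged sketch of cost per percent of FPS gained; every number below is an illustrative placeholder standing in for the rough figures in the posts above, not measured data:

```python
# Illustrative cost per percent of FPS gained; all numbers here are
# placeholders standing in for the rough figures quoted in the thread.
upgrades = {
    "Faster RAM (DDR3-2133 -> DDR3-3000)": (200.0, 2.0),   # ($ cost, % FPS gain)
    "Faster GPU (same $200, one tier up)": (200.0, 25.0),  # hypothetical gain
}

for label, (cost, gain_pct) in upgrades.items():
    print(f"{label}: ${cost / gain_pct:.0f} per 1% FPS gained")
```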
 
Another example is USB 3.0. I have some USB 3.0 flash drives, but more than half of them write at only 7 MB/s, 10 MB/s at maximum.
My advice would be not to cheap out when buying flash drives.
My older USB 2.0 flash drives are able to write at 13.6 or even 20 MB/s. USB 3.0 was all a SCAM.
If you're buying cheap-ass flash drives whose main selling point is that they are moulded to look like Homer Simpson or Hello Kitty, don't be surprised they don't offer top performance. You obviously haven't made any comparisons between USB 2.0 and USB 3.0 for external drives either to arrive at your faulty conclusion.
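For anyone who wants to measure rather than argue, here is a minimal sketch for timing a sequential write to a flash drive (the mount path is hypothetical; real results vary with file size, filesystem, and caching):

```python
import os
import time

# Hypothetical path on the flash drive under test.
TEST_FILE = "/mnt/usb/speedtest.bin"
SIZE_MB = 256
CHUNK = b"\0" * (1024 * 1024)  # write in 1 MiB chunks

start = time.time()
with open(TEST_FILE, "wb") as f:
    for _ in range(SIZE_MB):
        f.write(CHUNK)
    f.flush()
    os.fsync(f.fileno())  # push data to the device, not just the OS cache
elapsed = time.time() - start

os.remove(TEST_FILE)  # clean up the test file
print(f"Sequential write: {SIZE_MB / elapsed:.1f} MB/s")
```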
At the end of the day, people can spend their money how they want, but stuff like giant Corsair Dominator DIMMs are good for little more than showing off the size of your e-peen, not any realistic gaming FPS gains.
So what? Last time I checked, custom paint, chassis lighting, dress up kits, braided cables, and a host of other aesthetics-based accessories/components don't add anything to performance either. It might escape your attention that life exists well beyond what is deemed purely functional.
 
I was waiting for you...
Really, do you get paid to do this?

I'm guessing he gets paid, but probably not very much because they can't afford better than the JEDEC junk ICs thrown out of the bin pile that only run at 1600 CL11 :P

I guess HWBOT is totally pointless as well, and we should all be happy using locked eMachines.
 
My phone does 4K, almost all new movies are shot in 4K, and YouTube and many other content sources can do 4K, so every reviewer, personality, Tom, Dick, and Harry can start running their show through After Effects at 4K, or even at standard 1080p. So it's not so specific.


I have multiple 3.0 drives that run well over 100 MB/s; makes copying large files a breeze.
 
My phone does 4K, almost all new movies are shot in 4K, and YouTube and many other content sources can do 4K, so every reviewer, personality, Tom, Dick, and Harry can start running their show through After Effects at 4K, or even at standard 1080p. So it's not so specific.


I have multiple 3.0 drives that run well over 100 MB/s; makes copying large files a breeze.


Like I said a few times, I'm mainly referring to games, and gamers are G.Skill's and Corsair's market. The few content creators and editors that I know wouldn't even know who G.Skill is, and they use pre-built systems from HP and Boxx.

So what? Last time I checked, custom paint, chassis lighting, dress up kits, braided cables, and a host of other aesthetics-based accessories/components don't add anything to performance either. It might escape your attention that life exists well beyond what is deemed purely functional.

Wait, you can tell the difference between 2133 MHz DDR3 and 3000 MHz DDR3 just by looking at the DIMMs? RAM speed is actually a purely, completely functional consideration, since both G.Skill and Corsair use the same heat spreaders across all their available speeds. All I am saying is: spend your money on a faster GPU, not faster RAM.
 
Wait, you can tell the difference between 2133 MHz DDR3 and 3000 MHz DDR3 just by looking at the DIMMs?
You seem to have missed the point completely. It isn't about visual aesthetics, it is about individuality.
If you want to get pedantic, and it seems you do, the difference is easily seen on any number of publicly available benchmark/system validation submissions. Why do you think that CPU-Z validations are a relatively common signature addition if not as an expression of individuality?
 
You seem to have missed the point completely. It isn't about visual aesthetics, it is about individuality.
If you want to get pedantic, and it seems you do, the difference is easily seen on any number of publicly available benchmark/system validation submissions. Why do you think that CPU-Z validations are a relatively common signature addition if not as an expression of individuality?


Well, you were giving examples that specifically changed the aesthetics of the computer, hence my response about visually identifying RAM, so maybe you missed my point. And expressing individuality by having your sig say either Corsair or G.Skill RAM is like being either an American Apparel or an Old Navy person. It's brand fetishism. And you're right, I'll always be a form-follows-function person. Sorry.

>inb4 "Your specs say G.Skill". You are right, but it was on sale when i bought it and i was trying to be accurate in the specs
 
Does anyone really care what does what at sub-zero?
Last time I checked, no one was doing anything at sub-zero temps..
Show me 5 GHz on all cores and I'll care.
Memory isn't doing anything noticeable past 2133 anyway.
 
This is just marketing B.S. for the gullible. Anyone with a clue knows there is no tangible system benefit to running real applications on a desktop PC with a discrete GPU with memory beyond 1600 MHz. Paying more for higher-frequency DDR3 or DDR4 is for the foolish.
You know, except for any gaming done on an IGP, and we know future ones will be more powerful and will require more bandwidth. And ignoring any application that uses large file transfers, for instance any image editing or video editing software, or any server dealing with large amounts of file transfers, there is NO difference......

And yeah, this may not be an out-of-the-box application, but IT PROVES that it can be done. In a couple of years, 4200 MHz DDR4 may be available for purchase, and it very well could be useful at that time.

It stuns me how stupid some people on this forum can be these days. Do most people really think that a shitty P4 HP and GPUs from the '90s are all anyone needs, and that power users just don't exist?
 
I always say that "Dumbers are always ready to pay more for the numbers."

They are called early adopters and they decide which products fail or which technology matures.

Imagine a world where no one bought 4K displays when they were $10,000 because doing so would be "dumb." Do you think display manufacturers would be willing to invest in 4K if they didn't get any return on their investment for the first 5 to 10 years?

If your argument is that early adopters and innovators are dumb because they pay too much, you're certainly not a technology innovator; people are willing to invest more to get a certain advantage (no matter how small it is). Why get new GPUs at all, when a 770 is fine for 1080p?

I do, however, agree that these benchmark numbers are completely irrelevant for 99.998% (maybe even more) of the people in this world, but the very small percentile of people who do take an interest in this are paving the way for overall faster and cheaper products in a market that faces an uphill battle against the legacy technologies of the "modern" PC.

Edit: there are some great resources on marketing and product development, and just about everything is based on Rogers' bell curve.
http://en.wikipedia.org/wiki/Technology_adoption_lifecycle

Your position in the curve is decided by your keenness for technology, willingness to adopt and, critically, financial abilities.
 
My phone does 4K, almost all new movies are shot in 4K, and YouTube and many other content sources can do 4K, so every reviewer, personality, Tom, Dick, and Harry can start running their show through After Effects at 4K, or even at standard 1080p. So it's not so specific.

Movies are shot in 8K. I mean AAA movies, not the B type ones. :D
 
There's nothing wrong with faster RAM, except the price, and the system may not make good use of it (diminishing returns) due to bottlenecks elsewhere (not enough memory controller buffers to keep the RAM busy, etc.).
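One rough way to see those diminishing returns yourself is a memory-copy microbenchmark; a minimal sketch using NumPy (array sizes chosen to overflow typical CPU caches), which usually plateaus well below the DIMMs' theoretical peak for exactly the reasons above:

```python
import time
import numpy as np

# Arrays big enough (256 MB each) to spill out of any CPU cache, so the
# copy below is bound by main-memory bandwidth rather than cache speed.
N = 32 * 1024 * 1024          # 32M float64 elements
a = np.ones(N)
b = np.empty_like(a)

best = float("inf")
for _ in range(5):            # take the best of several runs
    t0 = time.perf_counter()
    np.copyto(b, a)           # reads 256 MB and writes 256 MB
    best = min(best, time.perf_counter() - t0)

moved_gb = 2 * a.nbytes / 1e9  # bytes read plus bytes written
print(f"Effective copy bandwidth: {moved_gb / best:.1f} GB/s")
```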
 
They are called early adopters and they decide which products fail or which technology matures.
Imagine a world where no one bought 4K displays when they were $10,000 because doing so would be "dumb." Do you think display manufacturers would be willing to invest in 4K if they didn't get any return on their investment for the first 5 to 10 years?

There are cases in which it is not the early adopters who decide the destiny of a given technology.
Look at the Windows 8 shit. Despite all the negativity toward it, MS will still find a way to make it work by, for instance, giving Windows 10 away for free or at very low prices.

But it is not only this, for sure.

The leading corporations decide where to go, no matter what the cost and effort might be.

There's nothing wrong with faster RAM

As far as I remember, faster memory is especially beneficial for APUs.
 