Processor | Ryzen 7 5700X |
---|---|
Motherboard | B550 Elite |
Cooling | Thermalright Peerless Assassin 120 SE |
Memory | 32GB Fury Beast DDR4 3200MHz |
Video Card(s) | Gigabyte RTX 3060 Ti Gaming OC Pro |
Storage | Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs |
Display(s) | LG 27GP850 1440p 165Hz 27'' |
Case | Lian Li Lancool II Performance |
Power Supply | MSI 750W |
Mouse | G502 |
Makes sense.
System Name | Main PC |
---|---|
Processor | Intel Core i7-13700K |
Motherboard | Asrock Z690 Steel Legend D4 - Bios 13.02 |
Cooling | Noctua NH-D15S |
Memory | 32 Gig 3200CL14 |
Video Card(s) | 4080 RTX SUPER FE 16G |
Storage | 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red |
Display(s) | LG 27GL850 |
Case | Fractal Define R4 |
Audio Device(s) | Soundblaster AE-9 |
Power Supply | Antec HCG 750 Gold |
Software | Windows 10 21H2 LTSC |
System Name | His & Hers |
---|---|
Processor | R7 5800X/ R7 7950X3D Stock |
Motherboard | X670E Aorus Pro X/ROG Crosshair VIII Hero |
Cooling | Corsair h150 elite/ Corsair h115i Platinum |
Memory | Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk |
Video Card(s) | EVGA FTW3 Ultra 3080 Ti / Gigabyte Gaming OC 4090 |
Storage | lots of SSD. |
Display(s) | A whole bunch OLED, VA, IPS..... |
Case | 011 Dynamic XL/ Phanteks Evolv X |
Audio Device(s) | Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B |
Power Supply | Seasonic Ultra Prime Titanium 1000w/850w |
Mouse | Logitech G502 Lightspeed/ Logitech G Pro Hero. |
Keyboard | Logitech - G915 LIGHTSPEED / Logitech G Pro |
System Name | Tiny the White Yeti |
---|---|
Processor | 7800X3D |
Motherboard | MSI MAG B650M Mortar WiFi |
Cooling | CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3 |
Memory | 32GB Corsair Vengeance 6000 CL30 |
Video Card(s) | ASRock RX7900XT Phantom Gaming |
Storage | Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB |
Display(s) | Gigabyte G34QWC (3440x1440) |
Case | Lian Li A3 mATX White |
Audio Device(s) | Harman Kardon AVR137 + 2.1 |
Power Supply | EVGA Supernova G2 750W |
Mouse | Steelseries Aerox 5 |
Keyboard | Lenovo Thinkpad Trackpoint II |
VR HMD | HD 420 - Green Edition ;) |
Software | W11 IoT Enterprise LTSC |
Benchmark Scores | Over 9000 |
Exactly. How does a rough 2000 shaders and 24GB of RAM account for 350W?!

> All of these RTX 4000 "leaks" are so obviously nonsensical that it continues to amaze me how many people fall for them.
System Name | Mean machine |
---|---|
Processor | AMD 6900HS |
Memory | 2x16 GB 4800C40 |
Video Card(s) | AMD Radeon 6700S |
It's well known that power and performance don't scale linearly. Which means if the 4090 is actually 2x400W for 2x 3090 performance, power-limiting it to 400W means it will still completely poop on the 3090 by 50% or more.

> Sure.
> 2x400W, 2x faster than a 400W card, 2x the price of a 400W card.
> By that logic lower-tier cards can go exactly like the enthusiast cards. Why not?
> 2x150W card, 2x faster than a 150W card, 2x the price of the 150W card.
System Name | Bro2 |
---|---|
Processor | Ryzen 5800X |
Motherboard | Gigabyte X570 Aorus Elite |
Cooling | Corsair h115i pro rgb |
Memory | 32GB G.Skill Flare X 3200 CL14 @3800MHz CL16 |
Video Card(s) | Powercolor 6900 XT Red Devil 1.1V @2400MHz |
Storage | M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB |
Display(s) | LG 27UD69 UHD / LG 27GN950 |
Case | Fractal Design G |
Audio Device(s) | Realtek 5.1 |
Power Supply | Seasonic 750W GOLD |
Mouse | Logitech G402 |
Keyboard | Logitech slim |
Software | Windows 10 64 bit |
I'm sure, and sincerely hope, someone will call your claim and check if that is the case. Because I can bet one product is not equal to the other, which means claims like yours are more wishful thinking than a rule or a fact.

> Its well known that power and performance dont scale linearly. Which means if the 4090 is actually 2x400w for 2x3090 performance, power limiting it to 400w means it will still completely poop on the 3090 by 50% or more.
Processor | 3900x - Bykski waterblock |
---|---|
Motherboard | MSI b450m mortar max BIOS Date 27 Apr 2023 |
Cooling | αcool 560 rad - 2xPhanteks F140XP |
Memory | Micron 32GB 3200MHz DDR4 |
Video Card(s) | Colorful 3090 ADOC, active backplate cooling |
Storage | WD SN850 2TB, HP EX950 1TB, WD UltraStar HelioSeal 18TB+18TB |
Display(s) | 24" HUION Pro 4K 10-bit |
Case | Aluminium extrusions and copper panels; 60 deliveries for every piece, down to the screws |
Audio Device(s) | Sony stereo mic, Logitech C930, GuliKit Pro 2 + Xbox Series S controller, modded BT headphones (1200mAh) |
Power Supply | Corsair RM1000x |
Mouse | Pen display; no mouse, no click |
Keyboard | Microsoft All-in-One Media Keyboard (embedded touchpad, modded 1000mAh lithium battery) |
Software | Win 11 23h2 build 22631 |
Benchmark Scores | cine23 20000 |
Processor | Ryzen 7 5700X |
---|---|
Motherboard | B550 Elite |
Cooling | Thermalright Peerless Assassin 120 SE |
Memory | 32GB Fury Beast DDR4 3200MHz |
Video Card(s) | Gigabyte RTX 3060 Ti Gaming OC Pro |
Storage | Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs |
Display(s) | LG 27GP850 1440p 165Hz 27'' |
Case | Lian Li Lancool II Performance |
Power Supply | MSI 750W |
Mouse | G502 |
The 4090 Ti will never have 48GB of VRAM; leakers exaggerate VRAM for attention.
The 4080 WILL NOT have 16GB of VRAM. I promise.
Kopite7kimi said the 3080 Ti would have 20GB of VRAM.
He'd better turn his ice bucket over his own head.
Processor | 3900x - Bykski waterblock |
---|---|
Motherboard | MSI b450m mortar max BIOS Date 27 Apr 2023 |
Cooling | αcool 560 rad - 2xPhanteks F140XP |
Memory | Micron 32gb 3200mhz ddr4 |
Video Card(s) | Colorful 3090 ADOC active backplate cooling |
Storage | WD SN850 2tb ,HP EX950 1tb, WD UltraStar Helioseal 18tb+18tb |
Display(s) | 24“ HUION pro 4k 10bit |
Case | aluminium extrusions copper panels, 60 deliveries for every piece down to screws |
Audio Device(s) | sony stereo mic, logitech c930, Gulikit pro 2 + xbox Series S controller, moded bt headphone 1200mAh |
Power Supply | Corsair RM1000x |
Mouse | pen display, no mouse no click |
Keyboard | Microsoft aio media embedded touchpad (moded lithium battery 1000mAh) |
Software | Win 11 23h2 build 22631 |
Benchmark Scores | cine23 20000 |
If the 4080 comes with 16GB, the 4080 Ti will have enough VRAM for 8K.

> It would actually make more sense for a 4090 Ti to have an absurd amount of VRAM than not. The only way to justify buying it is for someone needing a lot of VRAM or some other very, very specific needs.
System Name | Mean machine |
---|---|
Processor | AMD 6900HS |
Memory | 2x16 GB 4800C40 |
Video Card(s) | AMD Radeon 6700S |
Don't you worry, I'll check it myself, but frankly there's no need. It's common sense: power and performance don't scale linearly. It's been known for the last 30 years. If you clock something 10% higher it will require 20% more power. If you clock it 20% higher it will require 50% more power.

> I'm sure and sincerely hope someone will call your claim and check if that is the case. Because I can bet one product is not equal to the other which means making claims like you did is more wishful thinking than a rule or a fact.
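For what it's worth, the superlinear claim can be sanity-checked with a toy model: dynamic power scales roughly as P ≈ C·V²·f, and reaching a higher clock usually needs more voltage. The voltage/frequency slope and constants below are invented illustrative numbers, not measured silicon data:

```python
# Toy model of clock-vs-power scaling: P ~ C * V^2 * f, with an
# assumed (made-up) linear voltage/frequency slope. Illustrates why
# a 10-20% clock bump can cost far more than 10-20% extra power.

def power(freq_ghz, base_freq=4.0, base_v=1.10, v_per_ghz=0.10, c=60.0):
    """Relative dynamic power at a given clock (arbitrary units)."""
    v = base_v + v_per_ghz * (freq_ghz - base_freq)  # assumed V-f curve
    return c * v * v * freq_ghz

base = power(4.0)
for f in (4.0, 4.4, 4.8):
    print(f"{f:.1f} GHz: {power(f) / base:.2f}x power")
```

With these made-up numbers, +10% clock lands around +18% power and +20% clock around +38%; the exact ratios depend entirely on the assumed V-f curve, but the shape (power outrunning frequency) is the point.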
System Name | Hotbox |
---|---|
Processor | AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6), |
Motherboard | ASRock Phantom Gaming B550 ITX/ax |
Cooling | LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14 |
Memory | 32GB G.Skill FlareX 3200c14 @3800c15 |
Video Card(s) | PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W |
Storage | 2TB Adata SX8200 Pro |
Display(s) | Dell U2711 main, AOC 24P2C secondary |
Case | SSUPD Meshlicious |
Audio Device(s) | Optoma Nuforce μDAC 3 |
Power Supply | Corsair SF750 Platinum |
Mouse | Logitech G603 |
Keyboard | Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps |
Software | Windows 10 Pro |
If the xx90 Ti stays in the "this is a Titan but we don't want to call it a Titan 'cause then gamers with tons of cash and impulse control deficiencies won't buy it" category, absurd amounts of VRAM are reasonable. It's not like the 3090 Ti ever makes use of its 24GB outside of productivity applications (or games modded so stupidly and poorly they really aren't worth considering), so unreasonable is par for the course for this product segment. 2x whatever the 4090 has seems "reasonable" to expect in this scenario.

> if 4080 comes with 16gb, 4080Ti will have enough vram for 8k.
> That's a bad marketing move, they better reserve 8k for 4090 only.
> the only Quadro card with 48gb is A6000 at 4949$
This isn't too unreasonable IMO: we've seen similar things before with the higher end GPU being a cut-down, lower clocked wide GPU while the lower end GPU is a higher clocked narrower GPU. RX 6700 vs 6800 isn't too far off. Given that the rumors don't mention clock speeds at all, it's entirely possible that the 4090 is (relatively) wide-and-slow to keep power """in check""" (as if 400W-ish could ever reasonably be called that), while the Ti is a "fuck it, let it burn" SKU with clocks pushed far past any kind of efficiency range.

> Exactly. How does a rough 2000 shaders and 24GB of RAM account for 350W?!
> The supposed 4080 > 4090 adds 8GB of RAM and 6000 shaders for 30W
System Name | Mean machine |
---|---|
Processor | AMD 6900HS |
Memory | 2x16 GB 4800C40 |
Video Card(s) | AMD Radeon 6700S |
I think the hardest part would be to remove 800W from the case, not so much the GPU itself. Of course, if AIBs cheap out on the designs it's gonna be a disaster, but I have hopes they won't on such a high-end product.

> If the xx90 Ti stays in the "this is a Titan but we don't want to call it a Titan 'cause then gamers with tons of cash and impulse control deficiencies won't buy it" category, absurd amounts of VRAM are reasonable. It's not like the 3090 Ti ever makes use of its 24GB outside of productivity applications (or games modded so stupidly and poorly they really aren't worth considering), so unreasonable is par for the course for this product segment. 2x whatever the 4090 has seems "reasonable" to expect in this scenario.
> This isn't too unreasonable IMO: we've seen similar things before with the higher end GPU being a cut-down, lower clocked wide GPU while the lower end GPU is a higher clocked narrower GPU. RX 6700 vs 6800 isn't too far off. Given that the rumors don't mention clock speeds at all, it's entirely possible that the 4090 is (relatively) wide-and-slow to keep power """in check""" (as if 400W-ish could ever reasonably be called that), while the Ti is a "fuck it, let it burn" SKU with clocks pushed far past any kind of efficiency range.
> Of course, there are major questions regarding whether 800W can actually be dissipated from a GPU-sized die quickly enough for it to not overheat without sub-ambient liquid cooling or similar. Heatpipes are definitely out the window, and even a huge vapor chamber with a massive fin stack will struggle unless you're hammering it with extreme airflow.
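To put a rough number on the dissipation worry: steady-state junction temperature is approximately ambient plus P × R_th, where R_th is the total die-to-air thermal resistance. The R_th figures below are illustrative guesses for cooler classes, not measured values:

```python
# Back-of-envelope: delta_T = P * R_th. At 800 W, even small thermal
# resistances produce large temperature rises over ambient.
# R_th values below are rough illustrative assumptions, not data.

def junction_temp_c(power_w, r_th_k_per_w, ambient_c=25.0):
    """Steady-state junction temperature estimate in Celsius."""
    return ambient_c + power_w * r_th_k_per_w

coolers = [("big air cooler", 0.12), ("thick custom loop", 0.06)]
for name, r_th in coolers:
    print(f"{name}: ~{junction_temp_c(800, r_th):.0f} C at 800 W")
```

With these assumed resistances, 800 W means roughly a 96 C rise on big air and a 48 C rise on a serious loop — which is why air cooling looks like a non-starter at that power level.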
System Name | Bro2 |
---|---|
Processor | Ryzen 5800X |
Motherboard | Gigabyte X570 Aorus Elite |
Cooling | Corsair h115i pro rgb |
Memory | 32GB G.Skill Flare X 3200 CL14 @3800MHz CL16 |
Video Card(s) | Powercolor 6900 XT Red Devil 1.1V @2400MHz |
Storage | M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB |
Display(s) | LG 27UD69 UHD / LG 27GN950 |
Case | Fractal Design G |
Audio Device(s) | Realtek 5.1 |
Power Supply | Seasonic 750W GOLD |
Mouse | Logitech G402 |
Keyboard | Logitech slim |
Software | Windows 10 64 bit |
That is not true, unless you haven't given enough details about the 4.9GHz clock. A 5900X or 5950X can clock above 5GHz with an air cooler, so that's that about your LN2 claim for Ryzen. Obviously extreme OC requires LN2, and that goes for both Intel and AMD, no doubt about that.

> Clock zen 3 to 4.9ghz and lets see what happens. Youd need ln2 canisters for that to happen though
Well, I'm sorry to tell you this, but you don't strike me as an honest, unbiased and transparent person, and since you always talk about numbers (your claim), you have never presented any graphs or metrics, aka DATA, or any evidence; but keep trying. There is a balance between power and performance. Balance, or equilibrium, between the two is most desirable, which we all know NV and AMD have been altering in pursuit of performance, which in fact bumps the prices of all products, not just the top end, like people tend to claim when saying it's not a problem. I think it is a problem which some people are simply too arrogant or blind to see.

> Dont you worry, ill check it myself, but no need frankly. Its common sense, power with performance dont scale linearly. Its been known for the last 30 years. If you clock something 10% higher it will require 20% more power. If you clock it 20% higher it will require 50% more power.
Processor | 3900x - Bykski waterblock |
---|---|
Motherboard | MSI b450m mortar max BIOS Date 27 Apr 2023 |
Cooling | αcool 560 rad - 2xPhanteks F140XP |
Memory | Micron 32GB 3200MHz DDR4 |
Video Card(s) | Colorful 3090 ADOC, active backplate cooling |
Storage | WD SN850 2TB, HP EX950 1TB, WD UltraStar HelioSeal 18TB+18TB |
Display(s) | 24" HUION Pro 4K 10-bit |
Case | Aluminium extrusions and copper panels; 60 deliveries for every piece, down to the screws |
Audio Device(s) | Sony stereo mic, Logitech C930, GuliKit Pro 2 + Xbox Series S controller, modded BT headphones (1200mAh) |
Power Supply | Corsair RM1000x |
Mouse | Pen display; no mouse, no click |
Keyboard | Microsoft All-in-One Media Keyboard (embedded touchpad, modded 1000mAh lithium battery) |
Software | Win 11 23h2 build 22631 |
Benchmark Scores | cine23 20000 |
It does at 8K.

> It's not like the 3090 Ti ever makes use of its 24GB outside of productivity applications (or games modded so stupidly and poorly they really aren't worth considering), so unreasonable is par for the course for this product segment.
They probably already overclocked it through the roof to reach 800W.

> I think the hardest part would be to remove 800w from the case, not so much the gpu itself. Of course if aibs cheap out on the designs its gonna be a disaster, but i have hopes they wont on such a high end product
System Name | Silent/X1 Yoga/S25U-1TB |
---|---|
Processor | Ryzen 9800X3D @ 5.4ghz AC 1.18 V, TG AM5 High Performance Heatspreader/1185 G7/Snapdragon 8 Elite |
Motherboard | ASUS ROG Strix X870-I, chipset fans replaced with Noctua A14x25 G2 |
Cooling | Optimus Block, HWLabs Copper 240/40 x2, D5/Res, 4x Noctua A12x25, 1x A14G2, Conductonaut Extreme |
Memory | 64 GB Dominator Titanium White 6000 MT, 130 ns tRFC, active cooled, TG Putty Pro |
Video Card(s) | RTX 3080 Ti Founders Edition, Conductonaut Extreme, 40 W/mK 3D Graphite pads, Corsair XG7 Waterblock |
Storage | Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB |
Display(s) | 34" 240 Hz 3440x1440 34GS95Q LG MLA+ W-OLED, 31.5" 165 Hz 1440P NanoIPS Ultragear, MX900 dual VESA |
Case | Sliger SM570 CNC Alu 13-Litre, 3D printed feet, TG Minuspad Extreme, LINKUP Ultra PCIe 4.0 x16 White |
Audio Device(s) | Audeze Maxwell Ultraviolet w/upgrade pads & Leather LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro |
Power Supply | SF1000 Plat, 13 A transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua |
Mouse | Razer Viper V3 Pro 8 KHz Mercury White w/Pulsar Supergrip tape, Razer Atlas, Razer Strider Chroma |
Keyboard | Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerL60 V2, TLabs Leath/Suede |
Software | Windows 11 IoT Enterprise LTSC 24H2 |
Benchmark Scores | Legendary |
That's for one core.

> That is not true unless you haven't given enough details about the 4.9GHz clock. A 5900X or 5950X can clock above 5GHz with an air cooler, so that's that about your LN2 claim for Ryzen; obviously extreme OC requires that, and that goes for both Intel and AMD, no doubt about that.
> You can't compare two totally different CPUs on a matter of frequency and how much they boost. You can only compare them by performance, power consumption or performance-to-power in general, not by whichever certain scenario satisfies your claim.
> I'm saying, you can't claim that a CPU is efficient at IDLE; that is silly, right? And because it is efficient at IDLE, you draw the conclusion it is efficient in general, which is not true; it is even silly to suggest such a thing. That is clear, and yet we have "efficient in gaming", which supposedly means it is efficient in general, which is as silly as idle-state efficiency.
> Now if you want to go with that notion, you have to get an idle efficiency metric, a gaming efficiency metric and a full-load efficiency metric, and maybe then you will get some sort of score describing efficiency by consolidating all those into one.
> Or you can do it that way: idle efficiency (a certain level of utilization), then gaming efficiency (a certain level of utilization) and full load (100% utilization), all over a given time, and get an average described in units that would make sense.
> Well I'm sorry to tell you this but you don't strike me as an honest, unbiased and transparent person, and since you always talk about numbers (your claim) you have never presented any graphs or metrics, aka DATA, or any evidence; but keep trying. There is a balance between power and performance. Balance or equilibrium between the two is most desirable, which we all know NV and AMD have been altering in pursuit of performance, which in fact bumps the prices of all products, not just the top end like people tend to claim saying that's not a problem. I think it is a problem which some people are simply arrogant or blind to see.
System Name | Mean machine |
---|---|
Processor | AMD 6900HS |
Memory | 2x16 GB 4800C40 |
Video Card(s) | AMD Radeon 6700S |
A 5950X can run CB R23 at 4.9GHz all-core? At what wattage? I'm not comparing them; I'm just saying that power and performance don't scale linearly at all. That applies to CPUs and GPUs as well. Any CPU or GPU at 100 watts will be more efficient than the identical CPU or GPU at 200 watts. That is common sense; arguing with it is just silly.

> That is not true unless you haven't given enough details about the 4.9GHz clock. A 5900X or 5950X can clock above 5GHz with an air cooler, so that's that about your LN2 claim for Ryzen; obviously extreme OC requires that, and that goes for both Intel and AMD, no doubt about that.
> You can't compare two totally different CPUs on a matter of frequency and how much they boost. You can only compare them by performance, power consumption or performance-to-power in general, not by whichever certain scenario satisfies your claim.
> I'm saying, you can't claim that a CPU is efficient at IDLE; that is silly, right? And because it is efficient at IDLE, you draw the conclusion it is efficient in general, which is not true; it is even silly to suggest such a thing. That is clear, and yet we have "efficient in gaming", which supposedly means it is efficient in general, which is as silly as idle-state efficiency.
> Now if you want to go with that notion, you have to get an idle efficiency metric, a gaming efficiency metric and a full-load efficiency metric, and maybe then you will get some sort of score describing efficiency by consolidating all those into one.
> Or you can do it that way: idle efficiency (a certain level of utilization), then gaming efficiency (a certain level of utilization) and full load (100% utilization), all over a given time, and get an average described in units that would make sense.
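The consolidated-metric idea floated above is easy to sketch: weight each state (idle, gaming, full load) by the time spent in it. All hours and wattages below are invented placeholders, not measurements:

```python
# Sketch of a time-weighted power figure across usage states, as
# opposed to quoting idle, gaming or full-load draw in isolation.

def weighted_watts(states):
    """states: list of (hours, watts). Returns time-weighted average watts."""
    total_hours = sum(hours for hours, _ in states)
    return sum(hours * watts for hours, watts in states) / total_hours

profile = [
    (14, 8),   # idle: 14 h/day at 8 W (placeholder)
    (8, 120),  # gaming: 8 h/day at 120 W (placeholder)
    (2, 220),  # full load: 2 h/day at 220 W (placeholder)
]
print(f"time-weighted draw: {weighted_watts(profile):.1f} W")
```

Dividing a similarly time-weighted performance figure by this number would give the kind of single consolidated efficiency score the post is asking for.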
Processor | i7 7700k |
---|---|
Motherboard | MSI Z270 SLI Plus |
Cooling | CM Hyper 212 EVO |
Memory | 2 x 8 GB Corsair Vengeance |
Video Card(s) | Temporary MSI RTX 4070 Super |
Storage | Samsung 850 EVO 250 GB and WD Black 4TB |
Display(s) | Temporary Viewsonic 4K 60 Hz |
Case | Corsair Obsidian 750D Airflow Edition |
Audio Device(s) | Onboard |
Power Supply | EVGA SuperNova 850 W Gold |
Mouse | Logitech G502 |
Keyboard | Logitech G105 |
Software | Windows 10 |
System Name | Hotbox |
---|---|
Processor | AMD Ryzen 7 5800X, 110/95/110, PBO +150MHz, CO -7,-7,-20(x6) |
Motherboard | ASRock Phantom Gaming B550 ITX/ax |
Cooling | LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14 |
Memory | 32GB G.Skill FlareX 3200c14 @3800c15 |
Video Card(s) | PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W |
Storage | 2TB Adata SX8200 Pro |
Display(s) | Dell U2711 main, AOC 24P2C secondary |
Case | SSUPD Meshlicious |
Audio Device(s) | Optoma Nuforce μDAC 3 |
Power Supply | Corsair SF750 Platinum |
Mouse | Logitech G603 |
Keyboard | Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps |
Software | Windows 10 Pro |
... does that video show it utilizing its 24GB? You seem to be making the classic mistake of confusing allocated VRAM with actually necessary, actively utilized VRAM. And even accounting for that, the highest I could see was FS at 20200 MB, and most other high-ish games were in the 16-17GB range. Which, knowing how extremely variable and dependent on drivers, VRAM headroom, opportunistic asset streaming and other factors VRAM allocation is, alongside the difference between allocated and actually used assets in VRAM, tells me that no, this GPU does not make use of its 24GB of VRAM in games. Not even close. For a convincing argument otherwise you'd need a very similar SKU with a lower VRAM amount (16GB? 20GB?) and a clear and demonstrable performance difference. Otherwise, most of those VRAM numbers can be put down to "the GPU has tons of free space, so the game aggressively streams in assets ahead of time, with the knowledge that most of them will never be used".

> It does at 8k
> here's some 8k TV at 1999$: LG QNED MiniLED 99 Series 2021 65 inch Class 8K Smart TV w/ AI ThinQ® (64.5'' Diag) (65QNED99UPA) | LG USA
System Name | Bro2 |
---|---|
Processor | Ryzen 5800X |
Motherboard | Gigabyte X570 Aorus Elite |
Cooling | Corsair h115i pro rgb |
Memory | 32GB G.Skill Flare X 3200 CL14 @3800MHz CL16 |
Video Card(s) | Powercolor 6900 XT Red Devil 1.1V @2400MHz |
Storage | M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB |
Display(s) | LG 27UD69 UHD / LG 27GN950 |
Case | Fractal Design G |
Audio Device(s) | Realtek 5.1 |
Power Supply | Seasonic 750W GOLD |
Mouse | Logitech G402 |
Keyboard | Logitech slim |
Software | Windows 10 64 bit |
It was not specified that it had to be all cores, nor the workload; hence my concern: not enough information.

> That's for one core.
Disagree. It is the producer that sets the power limit and specification of a product and advertises it as such. You can't say the 12900KS is efficient if you cut the power down to 100 watts, since it is not advertised as a 100-watt product and its price point does not reflect a product with the performance and power capabilities of a 100-watt one. You need to take more factors into perspective, not just whatever suits your opinion.

> My point is, any efficiency comparison that isn't done normalized for something (either power draw or performance) is fundamentally flawed. It doesn't take a genius to figure out that a 4090 Ti at 800W is going to be less efficient than a 4090 Ti at 400W. And the same applies to CPUs; comparing a 240W power-limit CPU to a 125W power-limit CPU is just dumb.
You have not said all cores at 4.9, but even if you did, you mentioned LN2 to achieve it, which is wrong; and now you are asking about wattage, yet you claim you do not want to compare Intel's products to AMD's. That is exactly what you want to do; that is why the questions you ask steer into that comparison. It never did scale linearly, but that is obvious. Sometimes your comparisons, and your use of "it doesn't scale linearly", are taken out of context.

> 5950x can run cbr23 at 4.9ghz all core? At what wattage? Im not comparing them, im just saying that power and performance dont scale linearly at all. That applies to cpus and gpus as well. Any cpu or gpu at 100 watts will be more efficient than the identical cpu or gpu at 200 watts. That is common sense, arguing with that is just silly.
Processor | 3900x - Bykski waterblock |
---|---|
Motherboard | MSI b450m mortar max BIOS Date 27 Apr 2023 |
Cooling | αcool 560 rad - 2xPhanteks F140XP |
Memory | Micron 32gb 3200mhz ddr4 |
Video Card(s) | Colorful 3090 ADOC active backplate cooling |
Storage | WD SN850 2tb ,HP EX950 1tb, WD UltraStar Helioseal 18tb+18tb |
Display(s) | 24“ HUION pro 4k 10bit |
Case | aluminium extrusions copper panels, 60 deliveries for every piece down to screws |
Audio Device(s) | sony stereo mic, logitech c930, Gulikit pro 2 + xbox Series S controller, moded bt headphone 1200mAh |
Power Supply | Corsair RM1000x |
Mouse | pen display, no mouse no click |
Keyboard | Microsoft aio media embedded touchpad (moded lithium battery 1000mAh) |
Software | Win 11 23h2 build 22631 |
Benchmark Scores | cine23 20000 |
> ... does that video show it utilizing its 24GB? You seem to be making the classic mistake of confusing allocated VRAM with actually necessary, actively utilized VRAM. And even accounting for that, the highest I could see was FS at 20200 MB, and most other high-ish games were in the 16-17GB range. Which, knowing how extremely variable and dependent on drivers, VRAM headroom, opportunistic asset streaming and other factors VRAM allocation is, alongside the difference between allocated and actually used assets in VRAM, tells me that no, this GPU does not make use of its 24GB of VRAM in games. Not even close. For a convincing argument otherwise you'd need a very similar SKU with a lower VRAM amount (16GB? 20GB?) and a clear and demonstrable performance difference. Otherwise, most of those VRAM numbers can be put down to "the GPU has tons of free space, so the game aggressively streams in assets ahead of time, with the knowledge that most of them will never be used".
System Name | Hotbox |
---|---|
Processor | AMD Ryzen 7 5800X, 110/95/110, PBO +150MHz, CO -7,-7,-20(x6) |
Motherboard | ASRock Phantom Gaming B550 ITX/ax |
Cooling | LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14 |
Memory | 32GB G.Skill FlareX 3200c14 @3800c15 |
Video Card(s) | PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W |
Storage | 2TB Adata SX8200 Pro |
Display(s) | Dell U2711 main, AOC 24P2C secondary |
Case | SSUPD Meshlicious |
Audio Device(s) | Optoma Nuforce μDAC 3 |
Power Supply | Corsair SF750 Platinum |
Mouse | Logitech G603 |
Keyboard | Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps |
Software | Windows 10 Pro |
The normalization is inherent: the comparison is done at the manufacturer-defined default operating specifications of the product. That is what defines the efficiency of the product. This is of course different from, say, architectural efficiency, which is a huge range and highly variable across clock speeds - but that's only really relevant when comparing across architectures, not when speaking of the efficiency of a specific SKU. This is the same as arguing that the Vega 64 and 56 were actually quite efficient - and sure, they were if you manually downclocked them by a few hundred MHz. But that doesn't matter, as that's not how consumers bought them, nor how they were used by the vast majority of users. How the product comes set up from the factory, out of the box, without expert tweaking is the only normalization that matters when testing a product.

> My point is, any efficiency comparison that isn't done normalized for something (either power draw or performance) is fundamentally flawed. It doesn't take a genius to figure out that a 4090 Ti at 800W is going to be less efficient than a 4090 Ti at 400W. And the same applies to CPUs; comparing a 240W power-limit CPU to a 125W power-limit CPU is just dumb.
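The disagreement here largely reduces to which operating point you divide by. As a sketch, with invented placeholder scores and wattages (not measurements of any real CPU), the same chip can look inefficient at its factory power limit and very efficient when capped:

```python
# Efficiency = benchmark score / package power. All numbers below are
# hypothetical placeholders used purely to illustrate the argument.

def perf_per_watt(score, watts):
    return score / watts

stock = perf_per_watt(30000, 240)   # out-of-box power limit
capped = perf_per_watt(27000, 125)  # same chip, manually power-capped

print(f"stock: {stock:.0f} pts/W, capped: {capped:.0f} pts/W")
```

The out-of-box figure describes the product as sold; the capped figure describes what a tuner can get out of it. They answer different questions, which is exactly what the two posters are arguing past each other about.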
System Name | Mean machine |
---|---|
Processor | AMD 6900HS |
Memory | 2x16 GB 4800C40 |
Video Card(s) | AMD Radeon 6700S |
I fundamentally disagree when it comes to products like the K and KS series, which are meant to be tinkered with. If you don't tune the crap out of them, you are spending money for nothing; you might as well go for the non-K versions. It's the same with RAM, for example: you have to activate XMP at the very least.

> The normalization is inherent: the comparison is done at the manufacturer-defined default operating specifications of the product. That is what defines the efficiency of the product. This is of course different from, say, architectural efficiency, which is a huge range and highly variable across clock speeds - but that's only really relevant when comparing across architectures, not when speaking of the efficiency of a specific SKU. This is the same as arguing that the Vega 64 and 56 were actually quite efficient - and sure, they were if you manually downclocked them by a few hundred MHz. But that doesn't matter, as that's not how consumers bought them, nor how they were used by the vast majority of users. How the product comes set up from the factory, out of the box, without expert tweaking is the only normalization that matters when testing a product.
It's obvious I'm talking about all cores.

> It was not specified it had to be all cores nor the workload, thus my concern: not enough information.
> Disagree. It is the producer that sets the power limit and specification of a product and advertises it as such. You can't say the 12900KS is efficient if you cut the power down to 100 watts, since it is not advertised as a 100-watt product and its price point does not reflect a product with the performance and power capabilities of a 100-watt one. You need to take more factors into perspective, not just whatever suits your opinion.
> My point is, any efficiency comparison that isn't done normalized for something (either power draw or performance) is fundamentally flawed. It doesn't take a genius to figure out that a 4090 Ti at 800W is going to be less efficient than a 4090 Ti at 400W. And the same applies to CPUs; comparing a 240W power-limit CPU to a 125W power-limit CPU is just dumb.
> You have not said all cores at 4.9, but even if you did, you mentioned LN2 to achieve it, which is wrong; and now you are asking about wattage, yet you claim you do not want to compare Intel's products to AMD's. That is exactly what you want to do; that is why the questions you ask steer into that comparison. It never did scale linearly, but that is obvious. Sometimes your comparisons, and your use of "it doesn't scale linearly", are taken out of context.
> 5950X can run CB R23 at 4.9GHz all-core? At what wattage? I'm not comparing them, I'm just saying that power and performance don't scale linearly at all. That applies to CPUs and GPUs as well. Any CPU or GPU at 100 watts will be more efficient than the identical CPU or GPU at 200 watts. That is common sense; arguing with it is just silly.