Saturday, October 13th 2018

New PT Data: i9-9900K is 66% Pricier While Being Just 12% Faster than 2700X at Gaming

Principled Technologies (PT), the outfit Intel paid for the outrageous benchmark results shown at its Core i9-9900K launch event, has revised its data after partially improving its testing methodology. Its initial tests comparing the Core i9-9900K to the Ryzen 7 2700X and the Ryzen Threadripper 2950X and 2990WX produced false and misleading results, because PT tested the AMD chip with half its cores effectively disabled, and crippled its memory controller with an extremely sub-optimal memory configuration (four dual-rank modules clocked high, leaving the motherboard to significantly loosen timings).

The original testing provided us with such gems as the i9-9900K "being up to 50 percent faster than the 2700X at gaming." In its revised testing, Principled Technologies corrected half of its rookie mistakes by running the 2700X in the default "Creator Mode," which enables all 8 cores; it did not correct the sub-optimal memory. Despite this, the new data shows the gaming performance difference between the i9-9900K and the 2700X narrowing to about 12.39 percent on average, seldom crossing 20 percent. This is a significant departure from the earlier testing, which skewed the average with >40% differences in some games, caused by half the cores being effectively disabled on the 2700X. The bottom line of PT's new data is this: the Core i9-9900K is roughly 12 percent faster than the Ryzen 7 2700X at gaming, while being a whopping 66 percent pricier ($319 vs. $530 average online prices).
This 12.3% gap between the i9-9900K and the 2700X could narrow further, to single-digit percentages, if the 2700X were tested with an optimal memory configuration, such as two single-rank modules in dual-channel with timings around 14-14-14-34, even with the memory clock kept at DDR4-2933.
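The headline figures are simple ratios; as a quick sanity check, here is a minimal sketch using only the prices and the revised performance average quoted above:

```python
# Figures quoted in the article
price_2700x = 319.0   # USD, average online price
price_9900k = 530.0
perf_gap_pct = 12.39  # PT's revised average gaming advantage of the i9-9900K

# Price premium of the i9-9900K over the 2700X
premium_pct = (price_9900k / price_2700x - 1) * 100
print(f"Price premium: {premium_pct:.0f}%")  # → 66%

# Gaming performance per dollar, normalised so the 2700X = 1.00
perf_per_dollar_9900k = (1 + perf_gap_pct / 100) / price_9900k
perf_per_dollar_2700x = 1.0 / price_2700x
print(f"Relative perf/$ of the 9900K: "
      f"{perf_per_dollar_9900k / perf_per_dollar_2700x:.2f}")  # → 0.68
```

In other words, on PT's own revised numbers the i9-9900K delivers roughly two-thirds of the 2700X's gaming performance per dollar.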

Intel responded to these "triumphant" new numbers with the following statement:
Given the feedback from the tech community, we are pleased that Principled Technologies ran additional tests. They've now published these results along with even more detail on the configurations used and the rationale. The results continue to show that the 9th Gen Intel Core i9-9900K is the world's best gaming processor. We are thankful to Principled Technologies' time and transparency throughout the process. We always appreciate feedback from the tech community and are looking forward to comprehensive third party reviews coming out on October 19.
The media never disputed the possibility of the i9-9900K being faster than the 2700X. It did, however, call out the bovine defecation peddled as "performance advantage data."

The entire testing data follows:
Source: Principled Technologies (PDF)

322 Comments on New PT Data: i9-9900K is 66% Pricier While Being Just 12% Faster than 2700X at Gaming

#251
Smartcom5
_Flareand please do a verification @baseclock if intel holds its 95W TDP
Highly questionable, if you ask me.
Though even if it doesn't exceed those 95 W at base clocks, it will pull pretty much exactly 230–250 W at the wall, excluding the rest of the system – at stock clocks under full load. Never mind overclocking!

Anyway, the overall power consumption will jump quite a bit! It's still physics, isn't it?
The power draw can be extrapolated quite easily, with an ordinary rule of three …

Gaming Load


If an 8700K needs about 66.8 W with its 6 cores under an average gaming load, then a 9900K will draw about 89.07 W. Mind that it still wouldn't be running at its stock clock of 5 GHz, but 'only' at an 8700K's 4.7 GHz – so the additional consumption of the higher clock comes on top of that.

Calculation:

Averaged power-draw at stock-clocks (@4.7 GHz)


Average gaming-load OC'd (@4,9 GHz)


… which makes ~90 W under an average gaming load at 4.7 GHz on 8 cores, by the numbers alone.
So a 9900K will consume at least 90 W in the best theoretical case – though this won't be the actual case, since it has 33% more cache than the 8700K (16 vs. 12 MByte). Due to the increased cache alone it will draw significantly more than those 90 W, and will probably exceed the 95 W TDP.

Or in other words: it is very likely that the 9900K will exceed its 95 W TDP already at stock clocks (as you can see from the 95.74 W at 4.9 GHz, which already oversteps it), especially if it runs any warmer.



Full load


If we then take another look at the 8700K under a heavy torture load like Prime, we see it doesn't get any better. In Prime an 8700K already pulls 159.5 W at stock – and as such, a 9900K will pull at least 212.67 W. Mind that this still isn't at its stock clock of 5 GHz, but again 'only' at an 8700K's stock clock of 4.7 GHz … of course without the additional power draw of the remaining +300 MHz, and without the increased consumption of its larger cache.

Calculation:

Full-load power-consumption at stock-clocks (@4.7 GHz)


Full-load power-consumption OC'd (@4,9 GHz)


As a result, a 9900K will in that (still best theoretical) case consume at least circa 212.67 W under full load – and even that won't be the actual case, as it has 33% more cache than the 8700K (16 vs. 12 MByte). Hence, due to that increased cache it will consume significantly more; on average the 9900K might easily draw 230–250 W. In any case, the official fantasy TDP of just 95 W is pure waste paper here – as usual with Intel's extremely misleading official TDP specifications.
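The rule-of-three extrapolation used above is just linear per-core scaling; a minimal sketch, using the 8700K package-power figures quoted in this post:

```python
def scale_by_cores(watts: float, cores_from: int = 6, cores_to: int = 8) -> float:
    """Naive rule of three: assume package power scales linearly with core count.
    This deliberately ignores the larger cache and the higher stock clock, so
    the result is a best-case floor, as the post itself stresses."""
    return watts * cores_to / cores_from

gaming_8700k = 66.8    # W, average gaming load at 4.7 GHz
torture_8700k = 159.5  # W, Prime torture load at stock

print(f"9900K gaming estimate:  {scale_by_cores(gaming_8700k):.2f} W")   # → 89.07 W
print(f"9900K torture estimate: {scale_by_cores(torture_8700k):.2f} W")  # → 212.67 W
```

These are exactly the 89.07 W and 212.67 W figures derived above.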



Final conclusion

Note!
All numbers here represent the best case (sic!) – the best possible, most favourable numbers – since we are still at 4.9 GHz in this scenario. Particular attention should be paid to the fact that in every single case the numbers reflect the actual Package Power Consumption, and as such solely the processor's consumption in and of itself. These are not whole-system power consumptions! They are the CPU's values alone.

☞ Please also note that the cache, now 33% larger, makes a significant contribution to the actual wattage numbers. Furthermore, all numbers were obtained with the assistance of a chiller whose cooling loop was held permanently at 20°C (which, as a side note, still couldn't keep the 8700K from running into its thermal limit).

Résumé or bottom line
  • All of these are best-case values.
  • All wattages and (possible) clock frequencies under load were made possible by the use of a chiller (compressor cooling).
  • All calculations omit the remaining +100 MHz towards the 9900K's nominal clock (naturally including the corresponding extra consumption).
  • The actual wattage will very likely level off at 230–250 W nominal consumption, at stock clocks under full load.
Smartcom

PS: Please forgive the potentially significant simplification of the given circumstances, made for the purpose of exemplification. Errors excepted.
#252
cadaveca
My name is Dave
Smartcom5As a result, a 9900K will in that (still best theoretical) case consume at least circa 212.67 W under full load – and even that won't be the actual case, as it has 33% more cache than the 8700K (16 vs. 12 MByte). Hence, due to that increased cache it will consume significantly more; on average the 9900K might easily draw 230–250 W. In any case, the official fantasy TDP of just 95 W is pure waste paper here – as usual with Intel's extremely misleading official TDP specifications.
Nope, sorry. Grab a clamp-on current meter, put it over the 8-pin power connector of your board, and you'll see that Intel's TDP numbers are actually quite accurate, and that's including with Turbo clocks.

Problem is, nobody except for me does this in reviews. Go look at any of the board reviews here and you'll see it. I provided the new board reviewer with the hardware to test this as well, so you'll continue to see actual CPU power draw in motherboard reviews here @ TPU.

I'm also happy to report that AMD's current platform TDP numbers are pretty accurate as to actual power draw as well.

You'll also have to note that Turbo clocks on both platforms, and CPU throttle are controlled by this number. You can even find it in board BIOSes, where you can manipulate the maximum power drawn (and some board makers have previously used this to cheat on review benchmark numbers). By default on all current platforms, this number matches a CPU's advertised TDP.


So rather than blame AMD or Intel on this one, you gotta blame the reviewers who are reporting inaccurate information to you. It's especially revolting to me that nobody tests this way, especially considering that Zalman made meters for this you could buy that cost just $40, so you don't even need to spend a lot to measure this accurately. A decent and reliable clamp-on meter these days can be had for around $100.

Power numbers derived from meters connected to the PSU's power cord measure the entire system, as well as PSU inefficiency. Of course those numbers seem inflated – they include the board, memory, drives, mouse and keyboard, as well as the CPU, if not also the idle power draw of a video card.
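A back-of-the-envelope sketch of that point – why a wall-socket reading overstates CPU draw. All figures here are hypothetical, purely for illustration:

```python
def cpu_power_from_wall(wall_watts: float, psu_efficiency: float,
                        other_dc_watts: float) -> float:
    """Estimate CPU draw from a wall reading: first remove the PSU's
    conversion loss, then subtract everything else on the DC side
    (board, memory, drives, peripherals, idling video card)."""
    dc_power = wall_watts * psu_efficiency
    return dc_power - other_dc_watts

# Hypothetical: 300 W at the wall, a 90%-efficient PSU, and ~130 W for
# the rest of the system leaves far less for the CPU than the wall suggests.
print(f"{cpu_power_from_wall(300, 0.90, 130):.0f} W")  # → 140 W
```

A clamp meter on the CPU's 8-pin EPS connector sidesteps both correction terms, which is the point being made here.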
#253
Vayra86
R0H1TThat's not my point really, it's just that Intel can't or shouldn't sell chips based on questionable benchmarks & that practice should never be defended, be it Apple/Intel/Nvidia or AMD.
Agreed, but is anyone in disagreement on that? I haven't seen anyone here 'defending Intel' for these misleading results.

The defense gets erected whenever AMD fans tell Intel users that the difference is negligible, when there are countless benches (even a dozen of the corrected ones in this article underline it!) showing Intel CPUs still excel at higher framerates and in single-thread-limited scenarios. And it's not just benches either, but practice – experience – from using the hardware. We're on an enthusiast forum, so there is going to be a larger-than-normal group interested in top-end performance. And when it comes to price: the performance gap in some scenarios is easily 30% – look at GPUs and the additional cost of a 30% faster 2080 Ti over its little brother below it. Remarkably similar. In both cases you could say 'it must be cheaper', and in both cases we as consumers have influence on that, by simply not buying it.
TheGuruStudUsing performance metrics from a 14 yr old game engine isn't proving anything. More baloney results. It doesn't even have anything to do with single/multithreading. Source straight up runs like doo doo on ryzen.

Don't agree with the dumb dumb. He's saying that intel leads by 40% in ST, but is knocked down to 12% in MT with 15% ish higher clocks. Tell me, where is all that intel IPC at? It doesn't exist. You can conclude that intel currently has a few percent IPC lead lol. And that doesn't include optimized memory for ryzen.

Dummy is flat out wrong or AMD makes the most superior CPU to ever exist for the next 20 yrs b/c of its SMT. Intel's only tangible lead is in freq and/or applications optimized only for intel (which is most everything).

Ever see game benchmarks with all CPUs locked to 4ghz? It's not rosy for intel's ipc "superiority".
- Don't forget your (expensive) B-die sticks
- Don't forget to clock your Intel CPU at 4 Ghz
- Don't use Source as an example game engine
- Don't use a ST limited scenario

And all of a sudden, Ryzen looks almost (still missing some % though) as good as an Intel CPU! You should go work for PT! I heard they're doing a Ryzen piece next month.

Surely you can see the irony. You have just literally summed up everything that is wrong about AMD-fan perspective on performance, and you cannot even see it, apparently. You should take this perfect example to reflect upon. Dummy... :laugh:
#254
Smartcom5
cadavecaNope, sorry. Grab a clamp-on current meter, put it over the 8-pin power connector of your board, and you'll see that Intel's TDP numbers are actually quite accurate, and that's including with Turbo clocks.

Problem is, nobody except for me does this in reviews.
You are aware that such measurements have been made by Igor from Tom's Hardware – and always in exactly that way – aren't you?
He's one of the few who does that, and always has; he's famous for doing exactly that.

In addition, you seem to have overlooked my note down there, where I tried to state expressly what those numbers represent and where they come from, no?
Smartcom5
Note!
All numbers here represent the best case (sic!) – the best possible, most favourable numbers – since we are still at 4.9 GHz in this scenario. Particular attention should be paid to the fact that in every single case the numbers reflect the actual Package Power Consumption, and as such solely the processor's consumption in and of itself. These are not whole-system power consumptions! They are the CPU's values alone.
cadavecaI provided the new board reviewer with the hardware to test this as well, so you'll continue to see actual CPU power draw in motherboard reviews here @ TPU.
Reading that makes me actually genuinely happy …
Now shut up and take my money! Oh, and if you don't mind, let me **** **** ****!
cadavecaYou'll also have to note that Turbo clocks on both platforms, and CPU throttle are controlled by this number. You can even find it in board BIOSes, where you can manipulate the maximum power drawn (and some board makers have previously used this to cheat on review benchmark numbers). By default on all current platforms, this number matches a CPU's advertised TDP.
I'm actually pret·ty aware of the ongoing, widely used abusive methods of sweeping far higher actual numbers under the rug – the all-too-common practice of benchmarking with the roof open and completely unbridled for higher numbers (cough, MCE! unlimited power targets!), while 'determining' the »actual power consumption« afterwards with the product muzzled by BIOS/UEFI options and/or lowered power targets, pre-cooled cards, etcetera – thank you.
cadavecaSo rather than blame AMD or Intel on this one, you gotta blame the reviewers who are reporting inaccurate information to you.
I always criticise such wrongdoings as extremely and excessively misleading and deceiving. Always have, always will.
Especially since such devices and/or apparatuses are not only quite affordable for today's technical editorial departments, but every damn reviewer who considers themselves reputable – or at least has the personal aspiration to be taken seriously and found trustworthy – is nothing less than ob·li·ga·ted to take such measurements and determine the actual and nominal power consumption.

Quoting the overall system's wattage I straight-out consider a direct intent to mislead or deceive. All the more if a) the product is known to draw higher numbers in reality, and especially b) the reviewer has already been made aware that quoting the overall system's wattage presents the product in a massively flattering light.


Smartcom
#256
londiste
Durvelle27Where are the reviews by now
NDA ends on October 19th.
#257
Keicho2
All I see is that the old 4-core i7-6700K delivers technically the same fps as the 2700X in games. Why should an 8-core from Intel at 5 GHz lead by only 12%? :D
#258
cadaveca
My name is Dave
Smartcom5In addition, you seem to have overlooked my note down there were I trying to state expressively, what such numbers are representing and where do those were coming from, no?
No, I've not overlooked anything. You see, if a motherboard is working properly, and its BIOS is configured properly, such a thing IS NOT POSSIBLE. Anyone telling you that a CPU exceeds its TDP doesn't fully understand how these things work, and how power draw for a CPU is controlled via the motherboard; if the BIOS is programmed right, and the board works right, there is ZERO CHANCE for a CPU to exceed the listed TDP.

Now, many things can go wrong that can cause TDP to be exceeded, but it is NEVER supposed to happen, no matter the type of CPU loading. So anyone telling you that this happens, at stock, is misinforming you, and isn't smart enough (IMHO) to investigate why such is taking place. Please also note that the fallacy I see right away is that they are using software to measure this, rather than physical hardware, as we here @ TPU do.
in every single case the numbers reflect the actual Package Power Consumption, and as such solely the processor's consumption in and of itself.
Package Power Consumption is a SOFTWARE reading. So, no this guy is NOT doing as we do. He's reading software, and is assuming that things are reported accurately, when clearly they aren't. He's clearly identified a problem in his configuration for sure, but what and where that problem is, is NOT being reported properly.
#259
Unregistered
trog100an 8700K is currently £460 from Scan UK.. lets be a at least a little bit accurate..
Yeah, thanks to "shortages" – how convenient.
notbHow could we not defend a company that is one of the pillars of IT in our civilization?

You don't have to admire Intel and you might not even respect their contribution to computing (which would be weird for a wannabe enthusiast), but you should understand their importance for stability of this business and the general reality around us.
Do you like pizza? Imagine there was a single company selling 90% of pizzas globally. I'm sure you wouldn't want that company to have any problems. :)

I work in insurance – an industry that's constantly plagued by price wars. People don't like paying for insurance, but they have to. And the business is very scale-dependent, i.e. a large market share greatly improves your margins. Hence, smaller companies sell policies at dumping prices just to build a large client base. It's easier to renew a client than to convince a new one to join. So it makes sense to sell them a product at a loss; if they stay for another 1-2 years, we'll make a profit in the end.

I look at CPU business and I see some analogies. For example: you have a huge technological cost for R&D and product release. Clients are rather loyal to brands. And most importantly: people have to buy CPUs - it's just a matter of whom to buy from.
I'm not saying AMD margins are too low for making their business stable. But business-wise it wouldn't necessarily be a bad idea for them to sell even at a loss now, but get up to 20-30% market share and gain some momentum.
On the other hand, it would be totally sensible for Intel to realize that there's a particular group of people that's naturally pulled towards AMD's characteristics, and fighting for them is very expensive, so sustaining a 90% market share simply costs way too much. Maybe someone had the balls to stand up during a meeting and say: let's give up – it's better to sell 7 CPUs for $500 than 9 for $300.
"Defend a company" is going too far here – this is just a discussion thread; we're not taking our toilet onto Intel and pouring it over them. If you haven't realized already, AMD has been on the "edge" for years now and they've made a comeback. Intel RELIES on AMD existing, since they both license technologies to each other that are essential to the production of processors. Also, on that comment about maintaining the market: Intel will *never* hand over the market to AMD. They are fighting for that market share and need it as high as possible; if they decided to sell 7 for $500 rather than 9 for $300, it would also affect investors and the stock – I'm not going into it that far, however. Intel owns the bulk of the server market, where the real money is to be made, and they've already lost the war for the fastest supercomputers to IBM, which crushed them. If AMD took over the server market, Intel would probably call out for help from other companies and be in big financial trouble. The mainstream is where AMD could start this – they only lack R&D funding, that's the only thing stopping them, and it can be made up in the mainstream. You seriously think Intel would threaten their entire company's existence? They will continue to push for that market share.
#260
londiste
cadavecaNow, many things can go wrong that can cause TDP to be exceeded, but it is NEVER supposed to happen, no matter the type of CPU loading. So anyone telling you that this happens, at stock, is misinforming you, and isn't smart enough (IMHO) to investigate why such is taking place. Please also note that the fallacy I see right away is that they are using software to measure this, rather than physical hardware, as we here @ TPU do.
That actually is not completely true, at least for current-generation (as well as a generation or two back) Intel processors. You can definitely see a CPU temporarily exceeding TDP even without MCE or some other manufacturer's stupid option enabled. From examples I have personally seen with all BIOS/UEFI settings set as close to stock as possible, an i7 8700K will run at 120-130W for a little while before settling down at 95W. Similarly, an i5 8400 runs at 95W for a little while before settling down at 65W. The "little while" seems to depend on the motherboard.

Whether this is the default configuration or not is up for debate. From what I can see in the whitepaper, technically it should not be (PL2 is 1.25x TDP and up to 10ms, with PL1 Tau at 1 second). Are motherboard manufacturers playing around with settings more than they should (in addition to MCE)?

www.intel.com/content/dam/www/public/us/en/documents/datasheets/8th-gen-core-family-datasheet-vol-1.pdf
Chapter 5: Thermal Management (Page 88)

The other thing with Intel's power management is that AVX throws most of it straight out the window. If overclockers decide to disable the default AVX Offset (-2/-3) along with disabling the power limits, that will increase power consumption and heat by a lot. Back to talking about stock - in general Intel has set the limits pretty well, Turbo frequencies will work reasonably fine for anything not AVX. Heavy AVX load, however, will drop the frequencies down to base quickly.
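The PL1/PL2/Tau behaviour described above can be sketched as a toy model. Note this is a simplification: real silicon enforces PL1 over an exponentially weighted moving average of power rather than a hard time cutoff. The numbers come from the comment (PL2 = 1.25 × TDP, a short Tau window):

```python
def allowed_package_power(t_seconds: float, tdp_watts: float = 95.0,
                          pl2_ratio: float = 1.25,
                          tau_seconds: float = 1.0) -> float:
    """Toy model of Intel's two-level power limiting: the CPU may draw up
    to PL2 while the Tau window lasts, then is clamped to PL1 (= TDP).
    Real hardware tracks a moving average, not a hard cutover."""
    pl1 = tdp_watts
    pl2 = tdp_watts * pl2_ratio
    return pl2 if t_seconds < tau_seconds else pl1

# A 95 W-TDP part under a sustained all-core load:
for t in (0.0, 0.5, 1.0, 10.0):
    print(f"t = {t:>4} s  allowed draw = {allowed_package_power(t):.2f} W")
```

With a 95 W TDP this gives 118.75 W inside the window and 95 W after it, which matches the "120-130 W for a little while, then 95 W" behaviour observed on the 8700K.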
cadavecaPlease also note that the fallacy I see right away is that they are using software to measure this, rather than physical hardware, as we here @ TPU do.
Package Power Consumption is a SOFTWARE reading. So, no this guy is NOT doing as we do. He's reading software, and is assuming that things are reported accurately, when clearly they aren't. He's clearly identified a problem in his configuration for sure, but what and where that problem is, is NOT being reported properly.
You are measuring power with a clamp, right? Have you tested, on at least a couple of different motherboards, whether software readings lie and by how much (both motherboard and CPU readings)?
I have been looking for a cheap clamp to test the power myself, but a $40 meter or reasonably cheap clamps do not have very good accuracy, and so far I have not been interested enough to go for something costing a couple hundred moneys. Software might not be in a different ballpark from a cheap meter, assuming the software readings are somewhat verified in hardware.
#261
cadaveca
My name is Dave
londisteYou are measuring power with clamp, right?
Correct, with a FLUKE clamp meter.
londisteHave you tested from at least a couple different motherboards whether software readings lie and by how much (both motherboad and CPU)?
Yeah, I have. Over time I have found that AIDA64 can be fairly reliable once it's been updated properly (and if you have a yearly-renewed licence, they'll gladly update it for you if it doesn't work right), but for some boards it is way off, and when you begin overclocking, on a lot of boards it reads less than 1 W of power consumed (when clearly it is a whole lot more). It really varies from board to board and from version to version of the software in use, whereas the clamp meter simply works, every time.
londisteI have been looking for a cheap clamp to test the power myself but a $40 meter or reasonably cheap clamps do not have a very good accuracy and so far I have not been interested enough to go for something costing couple hundred moneys. Software might not be in a different ballpark from a cheap meter, given that software readings are somewhat verified in hardware.
You can get the FLUKE meter I have for what I'd call a relatively minor cost, especially for me, since I use it so damn often. As someone that liked to overclock under LN2 and such, it is a tool that you cannot be without, as so much information can be had by watching power increases as you push up the clocks, especially with different CPUs. Now, for the average user, it might not be that worthy of an investment I suppose; it just depends how into overclocking you really are.
londisteAre motherboard manufacturers playing around with settings more than they should (in addition to MCE)?
Yeah, they are. Sad but true, and yeah, you can see some spikes from time to time, especially when AVX loading for sure. But as you've surmised, there is a time limit to this, and yeah, that time limit is also in BIOS and can be adjusted.
#262
ToxicTaZ
Intel's 9-year-old architecture still provides the best performance! Even AMD's brand-new Ryzen architecture can't outperform 9-year-old Intel architecture.

When you have the best and no competition, you can name your prices.... Both Intel's 9900K and Nvidia's 2080 Ti are 2018's best CPU & GPU, with no competition from AMD.

If you want a cheaper third-place 2700X, now is the time to buy. Secondly, the 9700K outperforms the 2700X in most games and OCs up to 5.5 GHz with EK.

If you're a PC gamer, the 9700K is your best choice. If you just want bragging rights with benchmarking, get the top dog 9900K.

2700X maxes out at 4.4 GHz

9900K/9700K both max out at 5.5 GHz

8700K/8086K both max out at 5.3 GHz

That's your headroom.
#263
GoldenX
Nice troll.
Enjoy your blue tax for an extra 5 FPS.
#264
Melvis
How the heck did this thread get to 11 pages long? PT retested – not a perfect retest, but this time with an actual 8-core CPU – and we got results closer to what we were all expecting, which at the same time proved Intel wrong! Now it's Intel's turn to redo their in-house testing, since they said they got the same results as PT on the first run of benchmarks :shadedshu: Anyone defending Intel in this thread is either getting paid by/working for Intel, or just plain dumb!

So a 9900K is basically double the price of a 2700X for less than a 12% gaming performance increase; we know who the true winner is here.
#265
GoldenX
/g/ sums it up quite well:

CPU
>Athlon 200GE - Minimal desktop
>R3 2200G - Bare minimum gaming (dGPU optional)
>R5 2400G/i5-8400 - Consider IF on sale
>R5 2600/X - Good gaming & multithreaded work use CPUs
>i7-9700k - If pairing w/ a 2080Ti and the extra $200+ is worth ~135 FPS instead of ~120 FPS to you, despite better CPUs coming next year and requiring new boards
>R7 2700/X - Best value high-end CPU on a non-HEDT platform
>Wait for R7 3700X - Surely the best overall and not a massive disappointment like the 9900k
>Threadripper/Used Xeon - HEDT
#266
Unregistered
Honestly I think the i5-9600K is the best value of the series, www.tweaktown.com/news/63512/intel-core-i5-9600k-6c-6t-overclocks-up-5-2ghz-air/index.html – this thing will cost around $262 or ~£200, and at the 5 GHz+ range it's not far behind the 8700K at all, and massively ahead in single thread. It's not going to be easy for AMD at this price point; that's most likely why they dropped the 2700X's price a bit. The 9900K is just a pure bragging-rights processor, or one for the rich.
#267
TheGuruStud
ToxicTaZIntel's 9-year-old architecture still provides the best performance! Even AMD's brand-new Ryzen architecture can't outperform 9-year-old Intel architecture.

When you have the best and no competition, you can name your prices.... Both Intel's 9900K and Nvidia's 2080 Ti are 2018's best CPU & GPU, with no competition from AMD.

If you want a cheaper third-place 2700X, now is the time to buy. Secondly, the 9700K outperforms the 2700X in most games and OCs up to 5.5 GHz with EK.

If you're a PC gamer, the 9700K is your best choice. If you just want bragging rights with benchmarking, get the top dog 9900K.

2700X maxes out at 4.4 GHz

9900K/9700K both max out at 5.5 GHz

8700K/8086K both max out at 5.3 GHz

That's your headroom.
You're gonna have to put /s in your posts. Someone is going to think you're not trolling.
#268
XXL_AI
Xx Tek Tip xXWe'll see - arm isn't anywhere near as capable as intel and amd.
Nvidia Jetson systems and other ARM powered embedded computers are pretty capable ;)
#269
Unregistered
XXL_AINvidia Jetson systems and other ARM powered embedded computers are pretty capable ;)
Not that capable – they won't deliver similar performance, or anywhere near the performance, of AMD and Intel processors; they can stick to the phone market.
#270
XXL_AI
Oh really? Where's your proof? We're using a router with a 10 GbE switch powered by an Nvidia Jetson; it is fast and extremely secure. Just because you can't game on those platforms (yet) doesn't give you the right to knock their speed. The video-transcoding capabilities of those tiny Jetson modules are at Quadro level, and you know any Quadro can beat the electrons out of an Intel or AMD processor when it comes to parallel encoding or decoding of multiple streams simultaneously.
#271
Unregistered
XXL_AIOh really? Where's your proof? We're using a router with a 10 GbE switch powered by an Nvidia Jetson; it is fast and extremely secure. Just because you can't game on those platforms (yet) doesn't give you the right to knock their speed. The video-transcoding capabilities of those tiny Jetson modules are at Quadro level, and you know any Quadro can beat the electrons out of an Intel or AMD processor when it comes to parallel encoding or decoding of multiple streams simultaneously.
Congratulations – what you're doing is basically using an embedded system designed for one task, and you're comparing it with processors used for a hell of a lot more tasks than just that.
#272
XXL_AI
Try to use any Intel or AMD processor at the level of embedded systems and you'll cry in the end.
#273
Unregistered
XXL_AITry to use any Intel or AMD processor at the level of embedded systems and you'll cry in the end.
This is a discussion about DESKTOP processors, not embedded crap. It's like saying my Android tablet beats a Windows desktop – they're aimed at different users.
#274
rtwjunkie
PC Gaming Enthusiast
RobcostyleI like how this greedy blue-green couple tries to rip you off for extra cash, just pretending to be "exclusive", higher quality, or better performance.
Hell, even at that they are so unconfident, torn between everything they're flushing out in their advertising, that it results in a complete mess about "what their product really is."
And you know what's the prettiest part of all this? THEY ARE NOT PREMIUM. They are not made from better materials than other stuff on the semiconductor market, they won't last longer, they don't have premium options. Hell, do Intel and ngreedia even know what PREMIUM means??? If they sold their CPUs in some Ferrari kind of shop, with a cup of coffee and a manager kissing your arse just to get you to buy their stuff – that I would call premium over AMD. Not just a silly ~10% performance gain.

Rich people pay more for better service or goods – and NEVER for the same stuff available to everyone, just in order to show they have more money.
All in all, PC gaming is a leisure activity – it shouldn't be a major part of your expenses. This price-hike tactic is just fundamentally false, from the very beginning.
Lmao! Nice rant.

Now, stop acting shocked, like this is new. Intel has priced their premium processor at premium prices for nearly two decades. Factored for inflation, those $1,000 chips cost a lot more than this one does.
#275
Tsukiyomi91
pls... Intel has been putting premium price tags on their higher-end SKUs forever; being "surprised" is nothing out of the ordinary.