
AMD Athlon Pro 200GE Detailed: An Extremely Cut-down "Raven Ridge" at $55

Dual cores shouldn't be a thing these days, except maybe inside a cheap laptop.

For business-class systems, dual cores often provide plenty of power. Plus, this dual-core will likely deliver better performance, even multi-threaded, than some quad-cores out there.
 
It's more than enough for office work and media playback, at only 35 W, and with a good, upgradeable platform.
And I'm sure it can game better than a UHD iGPU. Some light gaming on a 1366x768 display seems possible.
 
It will match the G4560's single-thread performance for sure, as AMD at 3.2 GHz is faster than Intel at 3.5 GHz.
 
Can this play compressed 4K 10-bit HDR media? That's all I'm asking.
 
Via CPU? Try it by limiting your own cores.
 
Can this play compressed 4K 10-bit HDR media? That's all I'm asking.

I'm pretty sure the GPU has a built in HEVC decoder, so it should be able to.
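If anyone wants to check what their own box exposes, ffmpeg can report both of those things. A minimal sketch in Python (assuming ffmpeg is installed and on PATH; the helper name is mine):

```python
# Sketch: ask ffmpeg which hardware acceleration methods this build
# exposes (vaapi, d3d11va, ...) and which HEVC decoders it ships.
# Assumes ffmpeg is installed and on PATH.
import subprocess

def ffmpeg_output(*args):
    result = subprocess.run(["ffmpeg", "-hide_banner", *args],
                            capture_output=True, text=True)
    return (result.stdout + result.stderr).splitlines()

hwaccels = [ln.strip() for ln in ffmpeg_output("-hwaccels")[1:] if ln.strip()]
print("Hardware acceleration methods:", ", ".join(hwaccels) or "none")

hevc = [ln.split()[1] for ln in ffmpeg_output("-decoders") if " hevc" in ln]
print("HEVC decoders:", ", ".join(hevc) or "none")
```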
 
What would the "FX" be? Single-core, no SMT, a single Vega core?
FX would be a massively overclocked Ryzen (but with half the cache and IF speed to limit IPC), no SMT, no GPU, a TDP in the 200 W range, and a bundled AIO water cooler. For those wanting a space heater ;)

I want them to bring back the Duron brand. My first CPU was a Duron! It was "great" (read: I could afford it, and I was like 12 at the time!). It also overclocked by the exact amount of 0%. Not that I had even the slightest clue what I was doing, but I couldn't get it above stock clocks whatsoever. IIRC my brother's Athlon was a much better overclocker. Oh, how the world has moved on.
 
I'm pretty sure the GPU has a built in HEVC decoder, so it should be able to.
Some prefer to use the CPU for decoding; it's less bug-prone and allows for more customization.
 
Some prefer to use the CPU for decoding; it's less bug-prone and allows for more customization.
Are there any CPUs at all that can decode 10-bit HDR HEVC @4K60 without dedicated hardware or 'hybrid' solutions?
 
I don't know. Try it, if you have media like that.
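If you do have a sample lying around, here's a rough sketch of how to measure it with ffmpeg from Python. The file name is a placeholder; no -hwaccel flag is passed, so this forces pure CPU decoding:

```python
# Sketch: benchmark pure software (CPU-only) HEVC decoding with ffmpeg.
# No -hwaccel option is given, so the fixed-function decoder is bypassed.
# "clip_4k_hdr.mkv" is a placeholder; point it at your own sample.
import re
import subprocess

cmd = [
    "ffmpeg", "-hide_banner", "-benchmark",
    "-threads", "0",           # let the software decoder use every core
    "-i", "clip_4k_hdr.mkv",   # placeholder 4K 10-bit HEVC sample
    "-f", "null", "-",         # decode only, discard the output frames
]
result = subprocess.run(cmd, capture_output=True, text=True)

# ffmpeg prints progress like "frame= 1440 fps= 48 ..." on stderr;
# the final fps figure is the average decode speed.
fps = re.findall(r"fps=\s*([\d.]+)", result.stderr)
print("Average decode speed:", fps[-1] if fps else "unknown", "fps")
```

If the reported fps stays at or above the clip's frame rate, the CPU is keeping up in real time.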
 
Are there any CPUs at all that can decode 10-bit HDR HEVC @4K60 without dedicated hardware or 'hybrid' solutions?
Kaby Lake [+refresh], Coffee Lake? All of those embedded Pentium Gold/Silver SoCs that came out in the past 1.5 years? Ryzen APUs? Snapdragon 820/835/845? The list goes on...
Pretty sure this new Athlon and the upcoming low-power entry-level mobile chips will also support 4K HDR.
 
Kaby Lake [+refresh], Coffee Lake? All of those embedded Pentium Gold/Silver SoCs that came out in the past 1.5 years? Ryzen APUs? Snapdragon 820/835/845? The list goes on...
Pretty sure this new Athlon and the upcoming low-power entry-level mobile chips will also support 4K HDR.

Most of those support it through a hardware decoder. He's asking for software only, using pure CPU power, and for that I'd be inclined to think anything below a modern powerful quad-core would struggle.
 
Most of those support it through a hardware decoder. He's asking for software only, using pure CPU power, and for that I'd be inclined to think anything below a modern powerful quad-core would struggle.
Oh... that... Well, that's stupid. I'm not sure if even the most powerful consumer CPU can do it in real time this way. My i3-6100 can barely pull 4K@23FPS 8-bit with half the frames dropped, and my older X5650 would do around 1-2 FPS in software mode. Those were just curiosity experiments.
 
Oh... that... Well, that's stupid. I'm not sure if even the most powerful consumer CPU can do it in real time this way. My i3-6100 can barely pull 4K@23FPS 8-bit with half the frames dropped, and my older X5650 would do around 1-2 FPS in software mode. Those were just curiosity experiments.
I completely agree that that is a rather silly requirement, but people here seemed to want it:
Can this play compressed 4K 10-bit HDR media? That's all I'm asking.
Some prefer to use the CPU for decoding; it's less bug-prone and allows for more customization.
Hence me asking if it was at all possible.
My initial guess aligns with yours, @silentbogo: at the very least you'd need an OC'd 8700K or similar, though probably even that couldn't do it. And by then you're likely pushing more than 100 W just to run a single-threaded decoder, which is, as you say, stupid. But to each their own, I suppose.
 
Hence me asking if it was at all possible.
My initial guess aligns with yours, @silentbogo: at the very least you'd need an OC'd 8700K or similar, though probably even that couldn't do it. And by then you're likely pushing more than 100 W just to run a single-threaded decoder, which is, as you say, stupid. But to each their own, I suppose.
What others meant was not software decoding, but the built-in decoder ASIC, like AMD UVD or Intel QuickSync. It's all dedicated hardware solely responsible for video decoding.
Just like GPUs (or graphics accelerators, as they were called at the time) were created to avoid the difficulties of software rendering, all hardware now includes some sort of specialized video encoding/decoding accelerator. It's not only faster, it's also more efficient.
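To make the contrast concrete, routing the same clip through the fixed-function block is just a matter of adding -hwaccel. A sketch assuming VAAPI on Linux (Windows builds would use d3d11va or dxva2; the file name is again a placeholder):

```python
# Sketch: the same decode test, but offloaded to the iGPU's fixed-function
# decoder (UVD/VCN on AMD, Quick Sync on Intel) via -hwaccel.
# Assumes VAAPI on Linux; substitute "d3d11va" on Windows.
import subprocess

cmd = [
    "ffmpeg", "-hide_banner", "-benchmark",
    "-hwaccel", "vaapi",       # hand decoding to the fixed-function block
    "-i", "clip_4k_hdr.mkv",   # placeholder 4K 10-bit HEVC sample
    "-f", "null", "-",         # decode only, discard the output frames
]
result = subprocess.run(cmd, capture_output=True, text=True)

# -benchmark prints CPU time and peak memory at the end of the run;
# compare these against the software-only numbers.
for line in result.stderr.splitlines():
    if line.startswith("bench:"):
        print(line)
```

If the driver can't handle the profile (e.g. 10-bit on older hardware), ffmpeg typically warns on stderr and falls back to software, so it's worth reading the log.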
 
What others meant was not software decoding, but the built-in decoder ASIC, like AMD UVD or Intel QuickSync. It's all dedicated hardware solely responsible for video decoding.
Just like GPUs (or graphics accelerators, as they were called at the time) were created to avoid the difficulties of software rendering, all hardware now includes some sort of specialized video encoding/decoding accelerator. It's not only faster, it's also more efficient.
You're preaching to the choir here, man; this is not news to me. It shouldn't be to anybody.
 
Of course it works with the GPU decoder. That's what I use too. But some people like to do post-processing, and doing it with GPU acceleration is buggy. Kinda weird to ask that of a $55 dual-core Ryzen, but I've seen people do renders on Atoms.

By the way, all decoders are very good at using all threads.

We need cheaper ITX boards for this baby. I want one for my file-sharing PC; the 5150 is a bit old now.
 
I completely agree that that is a rather silly requirement, but people here seemed to want it:
Why is that a silly requirement?? I want to build a mini multimedia box to play 4K Netflix/Amazon/offline videos on my 4K HDR TV; what's so silly about that?? I don't want to buy some dedicated crap from Amazon or Roku, since those cannot properly play 4K HDR 10-bit MKV or H.265 video files. I also have a USB Blu-ray drive capable of playing 4K HDR discs.
 
Why is that a silly requirement?? I want to build a mini multimedia box to play 4K Netflix/Amazon/offline videos on my 4K HDR TV; what's so silly about that?? I don't want to buy some dedicated crap from Amazon or Roku, since those cannot properly play 4K HDR 10-bit MKV or H.265 video files. I also have a USB Blu-ray drive capable of playing 4K HDR discs.
We were talking about software/CPU decoding, i.e. not using fixed-function hardware in the chip. Please read more carefully.
 
We were talking about software/CPU decoding, i.e. not using fixed-function hardware in the chip. Please read more carefully.
I don't understand: was it you who was interested in software decoding, or not?
'Cause no one else has brought up or mentioned software decoding, yet you say:
You're preaching to the choir here, man; this is not news to me. It shouldn't be to anybody.
Kinda weird and pointless, don't you think?
 
I don't understand: was it you who was interested in software decoding, or not?
'Cause no one else has brought up or mentioned software decoding, yet you say:
I brought it up? Huh? Let's see:
Can this play compressed 4K 10-bit HDR media? That's all I'm asking.
Via CPU? Try it by limiting your own cores.
I'm pretty sure the GPU has a built in HEVC decoder, so it should be able to.
Some prefer to use the CPU for decoding; it's less bug-prone and allows for more customization.
Are there any CPUs at all that can decode 10-bit HDR HEVC @4K60 without dedicated hardware or 'hybrid' solutions?
If that's "no one but [me]" bringing it up, then ... yeah. I didn't bring it up, I asked if [thing that other people seemed to be discussing] was at all possible.
 
I asked if [thing that other people seemed to be discussing] was at all possible.
All of it refers to using UVD or QuickSync, 'cause it's more stable. Not software decoding on the CPU.
Once again, to be even more specific: HW decoding on the iGPU, not a dGPU. No one mentioned software decoding.
 
All of it refers to using UVD or QuickSync, 'cause it's more stable. Not software decoding on the CPU.
Once again, to be even more specific: HW decoding on the iGPU, not a dGPU. No one mentioned software decoding.
Again: preaching to the choir. Fixed-function video decoding hardware integrated into iGPUs is well known, at least to me. I don't see how you got so stuck on that point; I just asked for clarification on something tangentially related to it. All in all, I guess this is just a circle of misunderstandings. No idea why you're dead set on blaming the misunderstanding on me (or why someone needs to be singled out for it at all), but I'll leave that to you. It's pretty clear from the posts I've quoted above that that wasn't the case. Now can we stop beating this dead horse?
 