People weren't running ALL games at the max detail of the day on a 750 Ti, or at the very least not at 1440p, which was basically the 4K of that time. Many were aiming for 1080p medium or even 900p.
If you look around you can get an RX 6600 for around that price though. nVidia has always had a bit of a price...
Yes, the RX 6600 has been on my radar and "target locked" for quite a while, though I occasionally consider the 6600 XT/6650 XT. Just have some financial things to sort out before I pull that trigger!
Let me just point out this little site...
12 GB being entry level... And here I am with an RX 560D 4GB. Well... and my Steam Deck OLED. Though I'll admit I don't have much time (or money) for these newer and more demanding games.
Perhaps someday a modder will figure out a way to get something like SLI or other multi-GPU tech to pool memory...
Welp... I know where my bonus check is going. Lost my 512 GB LCD model due to a pawn shop changing management and no longer accepting online payments for pawns. If it's early enough I'll go for the limited edition, though the "normal" 1 TB model will do! I usually lock to 40 Hz, but maybe I'll bump that to...
I can already see Phil's Computer Lab shoving one into an older SFF PC, and Dawid and Random Gaming in HD doing their usual shenanigans too.
Let's hope the price is decent
Awesome job with the thorough testing. Handy to know for those wanting to get the absolute most out of the CPU for certain games.
I am curious if there's any noticeable difference in frame times with E-cores on or off. Would be an interesting follow-up. Maybe mix in Hyper-Threading on or off...
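For anyone curious enough to try that follow-up themselves, a rough sketch of the comparison against two frame-time logs (e.g. PresentMon CSV captures, one with E-cores on and one with them off) could look like this. The column name and file names here are assumptions, not from any particular tool's docs, so adjust to whatever your capture utility actually writes out:

```python
# Rough sketch: compare frame-time consistency between two capture runs.
# Assumes CSV logs with a per-frame "msBetweenPresents" column; rename to
# match whatever your capture tool uses.
import csv
import statistics

def frame_time_stats(path):
    with open(path, newline="") as f:
        times = [float(row["msBetweenPresents"]) for row in csv.DictReader(f)]
    times.sort()
    avg = statistics.mean(times)
    # "1% worst" here = the average of the slowest 1% of frames,
    # a crude stand-in for the usual 1%-low FPS metric.
    worst_1pct = times[int(len(times) * 0.99):]
    return avg, statistics.mean(worst_1pct)

# Hypothetical file names for the two runs being compared.
for label, path in [("E-cores on", "ecores_on.csv"), ("E-cores off", "ecores_off.csv")]:
    avg, worst = frame_time_stats(path)
    print(f"{label}: avg {avg:.2f} ms, worst 1% {worst:.2f} ms")
```

If the averages match but the worst-1% numbers diverge, that's the kind of stutter difference that wouldn't show up in a plain average-FPS chart.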
I wouldn't put it past them to flat out skip the lower end this round, as they have plenty of 3xxx series to fill that performance level. Probably why they're STILL releasing a new version of the 3060
They do so because they can offset their R&D costs by essentially building a permanent OtterBox into the phone. They probably get the other bits wholesale from companies looking to offload outdated inventory.
Might as well; then you'll know the cable will be backwards compatible with the other specs (at least it should be). Guess we'll find out when LTT Labs and others test them.