Discussion in 'News' started by btarunr, Dec 3, 2013.
Can you please name an Intel CPU which has no GPU on the chip?
The difference is Intel never claimed socket retention, they also kept LGA775 around for what, like 5-6 years? That's not bad. AMD were the ones that insisted on keeping Socket AM2/AM3 around for nearly a decade (AM2 was originally released in 2006) while reassuring people they had no interest in changing sockets.
Anything with a P at the end of the model number has the GPU deactivated. It's unreasonable to expect them to make 100,000 of a certain SKU and design a completely different manufacturing process for 5,000-10,000 CPUs because some people are afraid of an iGPU.
What architecture inferiority? The only time Intel had to resort to frequency was during Netburst.
Clock-for-clock Core2 was much faster than Athlon64 and Phenom.
Just go back to the old reviews. The 2.4GHz E6600 completely dominated the 3GHz Athlon 64 X2 6000+.
Bit of a blanket statement. There were too many revisions of the Athlon 64 series and Core 2 series to make that generalisation. Yes, Core 2 was faster; "much faster" depends on which iteration you're talking about. Yes, the E6600 was faster than the Athlon 64 X2 6000+, but I wouldn't say it dominated.
You are cherry-picking too, because when AMD moved to the "Kuma" Athlon X2 revision, i.e. the Athlon X2 7750 BE, the performance was very competitive with the Core 2 Duo Conroe.
When AMD moved to Callisto, i.e. the Phenom II X2 560, it was neck-and-neck with the E8400. The Athlon II X2 could also compete with, and sometimes outcompete, the E8400 (and the Athlon II X3 was much cheaper than an E8400 and handily dominated the entire Core 2 Duo series).
But we have selective memories of those events.
I am surprised that there will be no DDR4 support on their APU lineup in 2015...
Not to mention, next year they are planning a new chipset... for the same socket... that is the only way you'll be able to run desktop Haswell refresh and/or Broadwell-K...
So please, people, a little less of the uneducated, biased, and/or hateful comments that have nothing to do with reality, and think.
AMD would show a lack of strength and confidence in their own products if they released FX processors on the side. Again, people, think, god damn it.
Too bad for the time being, APUs still sound good only on paper... but we shall see.
Accuse someone of cherry-picking, then compare a 7750 BE with a Conroe? Even though Wolfdale and Yorkfield 45nm Penryn-based CPUs had already been in the marketplace for nearly a year?
If all the Phenom II actually had to compete with was an ageing Core 2 Duo, then AMD would have been peachy. Pity Intel already had Lynnfield-based i5 and i7 in the consumer channel by the time the X2 560 arrived, isn't it?
Fixed this... as now it's more poignant, Intel and AMD now are just more aligned.
Rats, was hoping AMD might spin one more iteration for the AM3 socket. Nothing all that earth-shattering, just wanted something like a 6-core 95W part that had a little more oomph, or a "Black" edition. My boys are on 870-based machines and hoping to stretch one more upgrade; looks like the FX-6300 is all I'll have to look at.
AMD... start offering tray parts for $20 less! please...
Well yes because jihadjoe was referring to the E6600 which is Conroe.
The E6600 came out in July 2006; the 7750 BE didn't launch until December 2008. Why bother comparing an AMD processor to an Intel CPU that 1) was two and a half years old, and 2) had been EOL'ed nearly a full year before the 7750 BE even arrived?
OK, so now there is an EOL clause. Conroe became unchallenged because it went EOL; I didn't realise that stipulation.
The Wolfdale Core 2 Duos were around when the Phenom II X2 560 and Athlon II X2/X3 were competing with them; either way you look at it, jihadjoe's statement wasn't 100% accurate.
If AMD tries hard, I am pretty sure they can make better CPUs for the AM3+ socket. The socket is absolutely fine, but the architecture needs refinement. Better-quality L1, L2, and L3 caches, a better IMC, and better IPC are all they need. Instead of core count and cache capacity, they need to focus on per-core performance and cache speed.
For that, they will need to keep the socket constant and either refine the architecture or start it from scratch. As for Vishera, the architecture is good enough, especially the 63xx and the 83xx.
Intel's infamous "Netburst" architecture comes to mind, and their anticompetitive nature sealed AMD's fate.
AMD doesn't really need a new AM3+ architecture at this point. Refine Vishera, get it to 28nm (22nm is a pipe dream ATM), and release it at >4.5GHz. The strategy has worked well with their GPUs, so I think they should try that.
Too bad to see AMD losing the CPU race, but I am sure the majority of us saw this coming.
I've used APUs for HTPCs, but to be honest even Intel HD graphics is good enough for an HTPC, unless you are going for 4K or something, in which case you would need a dedicated GPU anyway.
For cost-effective solutions, AMD is way ahead and always will be. An AMD A10 APU combined with an entry-level GPU like the 6670 delivers excellent performance at 720p. Come on, not all of us play at 1080p with 8x MSAA and so on. Most people from poor countries like the one I live in cannot afford high-end systems for gaming, and computer hardware is damn costly here. For them, an APU is the perfect choice: at least people can play and get 40-50 FPS at 720p in the majority of games. Had it been an Intel system, the HD graphics could not be used alongside a discrete GPU and the cost would have been higher. An A10 and a Core i3 are priced almost the same where I live, the i3 being a bit more expensive. APUs are not just for HTPCs; they are meant for entry-level gaming too.
Coming back to FX, I would say they are good for the price, apart from the 4xxx series. I don't find any reason to get a Core i3 instead of an FX 6xxx.
I am not a fanboy; I am just stating the reality. Here where I live, at this moment, an FX 8350 and a Core i5 3570 cost the same. Both are good in their fields and perform more or less the same overall.
And the reason I said that FX needs better caches is that if you run the AIDA cache and memory benchmark, the FX's L2 and L3 caches perform really badly. You can check it out yourself.
That's a sad thing. It makes me want to go grab a high-end AMD system again, just for the memorabilia...
I was thinking that myself.
Apparently this was a rumour that turned out to be false.
The new slide doesn't debunk the report; if anything, it confirms it:
Both roadmaps are the same; the only difference is that the top one includes 2012 and 2015. Both say that Piledriver (Vishera) will be the only architecture available for the AM3+ platform for the coming years.
Weird. I don't think a future without AMD high-end CPUs is a cheap one, or one I would like.
This "end of the CPU" twaddle has been spouted ad nauseam for the last few years, regardless of the FACT that binning will keep pure-CPU parts around even when the GPU is integrated, because selling chips with a broken GPU as CPUs is better than binning them (into an actual dustbin). And we're not even in that yard yet, since AMD and Intel still need server parts, and the resulting castoffs become consumer parts.
FXs will continue to evolve in a positive way, as mine has, with better software, APIs, and OSes.
I really don't understand all the doom and gloom this announcement is producing. If you really think about it, it was going to have to happen eventually. Let's take a long look at the events that have brought AMD to this point.
AMD has historically always suffered under Intel's shadow. The only times they've been able to outmaneuver Intel is when they've gone for the "Hail Mary pass", so to speak. Think about it: what was the last truly innovative CPU technology that Intel produced on its own? If you don't count instruction set extensions, which can be of dubious use, I'd say Hyper-Threading. The one before that, on-die L2. That's it. The on-die L2 would have been pretty obvious to anyone. There could be some debate about Hyper-Threading, but everyone knew multi-core CPUs were eventually coming; Hyper-Threading was just a step between single-core and multi-core. On the other hand, AMD has produced a lot of innovations that are in common use today, and a lot of people (even me) saw dubious worth in them at the time.
When Intel's Netburst was first on the drawing table, it was all about clock speed. AMD and Intel had been trading blows and neither really had the upper hand, even with AMD being the first to reach 1GHz. So while Intel pursued its failed Netburst architecture (it never scaled in the manner Intel thought it would) and ever-climbing FSB speeds, AMD introduced a much more efficient core architecture, making overall clock speed less important. They then added the following: first, an on-die memory controller, making the FSB less of an issue. Second, a 64-bit extension to the x86 instruction set. I'll admit I was even one to say "so what, other than memory limits, why do we need 64-bit?" OK, I was wrong; how many people still run a 32-bit OS? Third, they introduced the first true dual-core. And lastly, they introduced on-die GPUs. All innovations that AMD pioneered and Intel later adopted.
AMD became king of the mountain around the time of the 64-bit extensions. In fact, that's why they first introduced the FX line of processors: insanely priced, but the best you could get. As it later came out, and was well documented, the only way Intel could even compete was by forcing OEMs to carry only Intel products. Intel almost lost the race, except for a strange turn of events: enthusiasts had started to use the Pentium M processor (part of the Centrino brand) for desktop use. While Intel may not be as innovative as AMD, that doesn't mean they're not smart, and they saw the potential the Pentium M had. I'd say Intel's biggest strengths are its almost inexhaustible resources and its ability to turn on a dime and refocus on more promising avenues. Which is exactly what they did: Intel took the Pentium M and over time added all of AMD's innovations, to the point where they totally dominate the CPU market. In truth, AMD's only recourse is to offer deep discounts on their flagship products. It's sad, but it's also true.
So what can AMD do to once more get out of Intel's shadow? It's pretty obvious that no matter what AMD does with their current processors, they will always be playing second fiddle to Intel's. They'll never be a threat. In fact, as has been shown with Intel's last generation, Intel doesn't even consider AMD a threat anymore. Instead of increasing CPU performance, they seem to be concentrating on reducing power consumption and heat. And that makes sense because of the rise of mobile computing and how ARM has risen to become a major player. Intel has its sights set on taking on ARM, not AMD. But this gives AMD an advantage they haven't had since the Athlon 64 days: room to breathe and maneuver.
So what can they do with this breathing room? Well, they could keep trying to improve traditional processor performance, but we all know where that will lead. No, AMD needs to do what AMD does best: come out of left field with a new technology that no one thinks is viable, just like they did in the past. So what does AMD have that Intel just can't touch? GPU performance on their APU dies. While Intel has made great leaps in this area, they'll never be able to touch AMD; they simply have too much of a lead. So AMD needs to leverage that advantage in the best way it can. How? While it's true that more and more applications are taking advantage of multi-core CPUs, simply adding more and more cores is eventually a dead end. Instead, why not throw that "Hail Mary pass"? Produce an entirely different approach, something like a heterogeneous CPU. http://en.wikipedia.org/wiki/Heterogeneous_computing
AMD already has many of the parts in place on the CPU die, and if their heterogeneous initiative succeeds (http://hsafoundation.com/), they might just pull it off. All they really need once it's up and running is a "killer app", and AMD will quite likely make an end run around Intel.
With all that in mind, AMD can't afford to be split into an APU company and a CPU company. This is an all-or-nothing play, a true "Hail Mary pass". So eventually AMD was going to have to go this route anyway. I'm just hoping it means AMD's heterogeneous system architecture isn't that far away. It just might be one of the biggest game changers for desktop computing we've seen yet, and save AMD's bacon in the process.
Devastator is the codename of the IGP, at least on the A10-5800K; my 7660D was nicknamed Devastator.