
AMD hints at high-performance Zen x86 architecture

Why do they say "wear a wire" when it's a cable? As in Cablegate :-)
 
lol that comic strip reminds me of Intel P4 days haha
 
Not just that, but AMD's AM1 APUs just took a warning shot. Intel's Celeron J1900 is a quad-core SoC with a 10-watt TDP. I suspect it can keep up with AM1 CPUs just fine while consuming half the power.

Ehhhh, wrong. For slow-vs-slower comparisons, the 5350 wipes the floor with the J1900 in most areas. It also has a comfy minimum of 400 MHz of overclocking headroom without touching the voltage, on the stock cooler, and it's a socketed platform. The J1900 doesn't even support SATA 6 Gb/s. In this very small segment, AMD wins. Definitely not a warning shot... maybe a Nerf dart at best, fired while yelling... JUST KIDDING!

Oh, and it's definitely not half the power despite the ratings... a small power difference for a large (if you can call it that) performance increase.
http://www.tomshardware.com/reviews/athlon-5350-am1-platform-review,3801-9.html

(oh dear lordy- someone was wrong on the internets and I just felt I had to correct them - shame on me lol)

Ah, I remember the days when Intel had nothing on AMD and the Barton cores were pissing all over the polished shoes of Intel execs. But that has been many moons. I still support AMD (buy and build) even though I have many Intel-based PCs. All one can do is hope for another run!
 
Neither of you knows enough about the topic at hand to agree or disagree with anything. Get educated:
http://www.xbitlabs.com/news/cpu/di...x_AMD_Engineer_Explains_Bulldozer_Fiasco.html

Thanks for the article. In it, the ex-engineer says, "That changed before I left." So he didn't get fired; he got tired of incompetent fat asses and left.

That story is only one part of it. There are basic design flaws (I am no expert): they went out with less IPC than Thuban and compensated with higher clock speeds.

Let me ask this: how does the graphics processor team design? Are they following the same automated design flow, or something else? I know GPUs and CPUs are different, but the approach might be the same within the same company.

I am not arguing, just trying to understand.
 
I would love to see AMD compete with Intel on all levels. Competition is good for all of us. They spent too many years selling their chips cheap. Customers love them for it, but it's put them between a rock and a hard place: they haven't had enough profit to keep pace with Intel's R&D. I don't see how they will catch up. Intel showed a profit margin of 19.11% on $53.9 billion in sales, while AMD showed a 1.39% profit margin on $5.89 billion in sales. (source: MSN Money)
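To put those percentages in absolute terms, here's a quick back-of-the-envelope calculation using the figures above (just arithmetic, nothing more):

```python
# Rough absolute-profit comparison from the margin figures quoted above.
intel_sales, intel_margin = 53.9e9, 0.1911
amd_sales, amd_margin = 5.89e9, 0.0139

intel_profit = intel_sales * intel_margin  # roughly $10.3 billion
amd_profit = amd_sales * amd_margin        # roughly $0.08 billion

print(f"Intel profit: ${intel_profit / 1e9:.2f}B")
print(f"AMD profit:   ${amd_profit / 1e9:.3f}B")
print(f"Gap: roughly {intel_profit / amd_profit:.0f}x")
```

That's over a hundred times more absolute profit available to fund R&D, which is the point being made.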
 
I would love to see AMD compete with Intel on all levels. Competition is good for all of us. They spent too many years selling their chips cheap. Customers love them for it but it's put them in between a rock and a hard place. They haven't had enough profit to keep pace with Intel's R&D. I don't see how they will catch up with Intel. Intel showed a profit margin of 19.11% on 53.9 billion dollars in sales and AMD showed a 1.39% profit margin on 5.89 billion dollars in sales. (source MSN Money)
It's not about R&D. AMD already has the KNOW-HOW to make a faster chip; they just didn't.
 
It's not about R&D. AMD already has the KNOW-HOW to make a faster chip; they just didn't.

And why do you believe that? An article quoting one ex-employee? What proof do I have that he's not just some disgruntled person looking for attention?
 
And why do you believe that? An article quoting one ex-employee? What proof do I have that he's not just some disgruntled person looking for attention?
Google it. Seriously, I have no time to educate fanboys and trolls.
The information I linked has been known for YEARS, and anyone who knows the subject will tell you the same thing: AMD went to an AUTOMATED chip-design process and the result was Bulldozer. Their GPUs, however, are still laid out by hand.
 
Why are they calling this x86 when their CPUs have been x64 for some time?
 
It's not about R&D. AMD already has the KNOW-HOW to make a faster chip; they just didn't.

Agreed. This is what I've been saying for years. Both AMD and Intel already have the R&D to keep releasing the next fastest processor, each countering the other, for the next decade. It comes down to whether it's financially and operationally feasible and fits with the long-term vision of the company.

And why do you believe that? An article quoting one ex-employee? What proof do I have that he's not just some disgruntled person looking for attention?

OneMoar is correct. AMD is a multi-billion-dollar, multinational company with engineers and scientists working around the clock worldwide in R&D; it stands to reason that they are sitting on a few potential architectures that could be ready to implement with the right backing.
 
It's all good as far as I can see. AMD are likely to design Zen for a node that can compete on a reasonable footing with Intel. I'm hoping they skip PCIe 3.0 for 4.0, given the 2016 timetable.

Oh, and I think that article written by an ex-AMD employee is BS.

Automated design tools would reduce area, not increase it, as several other sources have adequately shown over the years. I remember seeing AMD's core in hand-made and auto-designed form (admittedly on the reliable interwebz), and the hand-made IC was 30% bigger.
 
For all their faults and problems, AMD has demonstrated an ability to do an about-face-forward-mARCH = WIN at the drop of a hat.
*see what I did there, I is phunny
 
For all their faults and problems, AMD has demonstrated an ability to do an about-face-forward-mARCH = WIN at the drop of a hat.
*see what I did there, I is phunny

You really need to get the trolling out of your system. If it isn't trolling, I've got to assume you are drunk.

I assume both of these points because you can't seem to spell, can't construct complete sentences, and link to articles with the same problems.

That article indicates that a computer-designed chip is potentially less efficient than a human-designed one. The OP has linked to an article that said the Bulldozer architecture, and not the chip design, has been problematic. Assuming the issue was just a chip layout issue, there'd have been absolutely no reason for Piledriver not to shine. If there was somehow an outstanding issue specific to interfacing and design, AMD could easily have redesigned poorly performing sections and replaced them with more efficient ones. I don't remember Piledriver suddenly being absolutely amazing, and that would be the only justification for calling automated design a dead end.


Perhaps then, your argument is that people should be involved with the design process. I'd hazard that you've somehow forgotten Netburst if that's the point you're making. That was human designed, and was a massive flop. Unless history is being rewritten, Intel released Netburst. I guess that means a BS article conjecturing about nothing has very little real world relevance.


The fact of the matter is clear, AMD has made a large error in pursuing a new paradigm. Assuming that this was actually proven to be more efficient, you'd be chiding Intel for falling out of technological progress. AMD made an error, and is finally attempting to fix their mistake. Intel has done it in the past, MS does it every other operating system, and as a human you should understand that errors don't mean idiocy. If we were to judge by that logic this forum wouldn't exist, because all of us have asked stupid questions at one time or another. Move past the trolling, and maybe spend another 20 seconds checking your posts so that they form complete sentences. It's impossible to argue your point when you can't convey anything but slurred speech and anger.
 
You really need to get the trolling out of your system. If it isn't trolling, I've got to assume you are drunk.

I assume both of these points because you can't seem to spell, can't construct complete sentences, and link to articles with the same problems.

That article indicates that a computer-designed chip is potentially less efficient than a human-designed one. The OP has linked to an article that said the Bulldozer architecture, and not the chip design, has been problematic. Assuming the issue was just a chip layout issue, there'd have been absolutely no reason for Piledriver not to shine. If there was somehow an outstanding issue specific to interfacing and design, AMD could easily have redesigned poorly performing sections and replaced them with more efficient ones. I don't remember Piledriver suddenly being absolutely amazing, and that would be the only justification for calling automated design a dead end.


Perhaps then, your argument is that people should be involved with the design process. I'd hazard that you've somehow forgotten Netburst if that's the point you're making. That was human designed, and was a massive flop. Unless history is being rewritten, Intel released Netburst. I guess that means a BS article conjecturing about nothing has very little real world relevance.


The fact of the matter is clear, AMD has made a large error in pursuing a new paradigm. Assuming that this was actually proven to be more efficient, you'd be chiding Intel for falling out of technological progress. AMD made an error, and is finally attempting to fix their mistake. Intel has done it in the past, MS does it every other operating system, and as a human you should understand that errors don't mean idiocy. If we were to judge by that logic this forum wouldn't exist, because all of us have asked stupid questions at one time or another. Move past the trolling, and maybe spend another 20 seconds checking your posts so that they form complete sentences. It's impossible to argue your point when you can't convey anything but slurred speech and anger.
Whatever you are smoking, I want some of it, because you are tripping.

"That article indicates that a computer-designed chip is potentially less efficient than a human-designed one" — I'm sorry, WHAT? Software is only as good as the person who programmed it.

I don't think you have a firm grasp of what this discussion is about. It has nothing to do with what you are calling "layout"; it has everything to do with automated engineering tools not being up to par.

And finally, the very reason Intel is a bit ahead of the curve is that REAL people, capable of creativity and logic, are the ones designing the chips, instead of dubious software doing most of the transistor layout and pathway optimization. The fact of the matter is that AMD fired most of the people who could have fixed these problems, to say nothing of the fundamental flaws of the arch itself (pipeline too long, poor cache performance, etc.).

Also, at no point did I imply or state that a human is superior to an automated bit of software. Even though of course I know one is, because the bottom line is that no piece of software is going to have the creativity and genius of a seasoned designer. Those tools exist as an AID, nothing more.

The only person trolling in this thread is you. I get that some people on this board are not my biggest fans. SORRY, DEAL WITH IT.
 
x64 is a 64-bit extension of x86. The full name is x86-64.
Then why does Windows 7, for instance, separate 32-bit programs into Program Files (x86) and 64-bit ones into Program Files? It's kinda misleading. Then again, that's MS.
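For what it's worth, a program can see the distinction from the inside: a process built for 32-bit x86 has 4-byte pointers, while one built for x86-64 has 8-byte pointers. A quick check using only the Python standard library:

```python
import platform
import struct

# The size of a C pointer ("P" format) tells you whether this process
# is running as 32-bit (x86) or 64-bit (x86-64).
bits = struct.calcsize("P") * 8
print(f"This is a {bits}-bit process on a '{platform.machine()}' machine")
# A 32-bit Python on 64-bit Windows is the kind of program that would
# land under "Program Files (x86)".
```

So the "(x86)" folder name is just shorthand for "legacy 32-bit x86 binaries", as opposed to native x86-64 ones.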
 
Software is only as good as the person who programmed it

I can program a calculator, but I can't recite more than 25 digits of pi, and I probably can't multiply numbers more than five digits long within a nanosecond, so your point is slightly flawed.
 
I can program a calculator, but I can't recite more than 25 digits of pi, and I probably can't multiply numbers more than five digits long within a nanosecond, so your point is slightly flawed.
The ability to do mathematics in your head has relatively little to do with figuring out the best place for transistor block Y in relation to cache branch X.
IC design is all about creativity and foresight and maximizing every transistor you can.
Example: say I need to add some transistors to the iGPU block to interface it with the L3 cache. The software may choose to place them in position A3, row 1, layer 3; from the software's standpoint this could very well be the most logical spot.

Now, and bear with me, this is taking a fair bit of creative license:

An engineer could take a good look at the layout and go: "Hmm, ya know, if I move these transistors over here, shuffle this pathway around, and tweak the logic a bit, I can kill two birds with one stone: improve performance and reduce space consumption."
That bit of ingenuity is something automated aids will never do.
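For the curious, automated placement tools typically work the other way around: they don't "understand" the design, they minimize a numeric objective like total wirelength. A toy sketch of the idea (hypothetical block names, simulated-annealing-style swaps; not any real EDA tool):

```python
import random

def wire_cost(pos, nets):
    # Total Manhattan wirelength over all two-pin nets.
    return sum(abs(pos[a][0] - pos[b][0]) + abs(pos[a][1] - pos[b][1])
               for a, b in nets)

def place(blocks, nets, grid=4, steps=2000, seed=0):
    """Toy annealing-style placer: propose swapping two blocks, keep the
    swap if it shortens the wiring (or occasionally anyway, early on,
    to escape local minima)."""
    rng = random.Random(seed)
    cells = [(x, y) for x in range(grid) for y in range(grid)]
    rng.shuffle(cells)
    pos = dict(zip(blocks, cells))           # random initial placement
    best, best_cost = dict(pos), wire_cost(pos, nets)
    for step in range(steps):
        a, b = rng.sample(blocks, 2)
        pos[a], pos[b] = pos[b], pos[a]      # propose a swap
        cost = wire_cost(pos, nets)
        temperature = 1.0 - step / steps     # simple cooling schedule
        if cost < best_cost:
            best, best_cost = dict(pos), cost
        elif rng.random() >= 0.1 * temperature:
            pos[a], pos[b] = pos[b], pos[a]  # reject: undo the swap
    return best, best_cost

# Hypothetical block names, purely for illustration.
blocks = ["alu", "fpu", "l2", "igpu", "decode", "sched"]
nets = [("alu", "sched"), ("fpu", "sched"), ("decode", "sched"), ("l2", "igpu")]
placement, cost = place(blocks, nets)
print(f"final total wirelength: {cost}")
```

The tool only ever sees the cost number; the kind of cross-cutting "kill two birds with one stone" insight described above is exactly what a single scalar objective can miss.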
 
Whatever you are smoking, I want some of it, because you are tripping.

"That article indicates that a computer-designed chip is potentially less efficient than a human-designed one" — I'm sorry, WHAT? Software is only as good as the person who programmed it.

I don't think you have a firm grasp of what this discussion is about. It has nothing to do with what you are calling "layout"; it has everything to do with automated engineering tools not being up to par.

And finally, the very reason Intel is a bit ahead of the curve is that REAL people, capable of creativity and logic, are the ones designing the chips, instead of dubious software doing most of the transistor layout and pathway optimization. The fact of the matter is that AMD fired most of the people who could have fixed these problems, to say nothing of the fundamental flaws of the arch itself (pipeline too long, poor cache performance, etc.).

Also, at no point did I imply or state that a human is superior to an automated bit of software. Even though of course I know one is, because the bottom line is that no piece of software is going to have the creativity and genius of a seasoned designer. Those tools exist as an AID, nothing more.

The only person trolling in this thread is you. I get that some people on this board are not my biggest fans. SORRY, DEAL WITH IT.

Read the article you linked to before you say something else that is absolutely unjustified. Trolling is shouting nearly incoherently at detractors, and you've crossed that threshold here.

1) The engineer complains that automated design tools were utilized for designing components of the chips.
2) The engineer complains that all designs were 20% slower and 20% less efficient. Seems rather hyperbolic, and without real data there's zero proof.
3) The math done by the article writer is rather unglued from fact, and they state so.
4) The article writer seems out of their depth. An AMD core is directly compared to an Intel core, despite the fact that they aren't equals.

Assuming none of this is a valid point for debate, the article is based upon hearsay from an ex-employee. What division were they working in? Why did they leave? How deep did they actually reach into the design process? You can't answer any of that, so all you've got is a wild guess about the facts.


I won't concede that AMD is a failure, because things like the APU prove they have some vision. I agree that Bulldozer was a failure, but hardly think a disgruntled employee is the least biased source for operational information. While you're welcome to go fondle your love for Intel, I hold out hope for the future of AMD. I'm loyal only to the better performer, and that isn't AMD right now. This said, they deserve the opportunity to rectify their Bulldozer issues.

The ability to do mathematics in your head has relatively little to do with figuring out the best place for transistor block Y in relation to cache branch X.
IC design is all about creativity and foresight and maximizing every transistor you can.
Example: say I need to add some transistors to the iGPU block to interface it with the L3 cache. The software may choose to place them in position A3, row 1, layer 3; from the software's standpoint this could very well be the most logical spot.

Now, and bear with me, this is taking a fair bit of creative license:

An engineer could take a good look at the layout and go: "Hmm, ya know, if I move these transistors over here, shuffle this pathway around, and tweak the logic a bit, I can kill two birds with one stone: improve performance and reduce space consumption."
That bit of ingenuity is something automated aids will never do.


That same computer could determine that placing an IC within a certain distance of another will generate enough interference to corrupt data flowing along a parallel channel. It's human design that has allowed things like SATA port degradation, timing bugs, and a handful of crippling errors in the last decade.

Humans are just as imperfect as computers, and creativity isn't always a gift. Thinking that humanity will always find the best route is stupid; if that were true, you'd have something other than a computer planning UPS delivery routes. There's always going to be a better way, no matter who gives you a solution.
 