Discussion in 'News' started by btarunr, May 4, 2011.
if it could be transformed into a ray gun, then yes
The linear difference between 22 and 32 nm is (32 − 22)/32 = 31.25%. 37% − 31.25% = 5.75%.
I don't think performance is necessarily the main benefit of Tri-Gate. I think it's reduced leakage, which means greater efficiency (less having to recycle pipelines), and in theory also lower resistance, which means higher clock speeds within the same 140 W TDP cap. I think Intel has set itself up for a 4+ GHz processor--about 10 years overdue.
The difference between 32 and 22 nm is much greater in area terms:
32²/22² = 1024/484 ≈ 2.12, so a 22 nm transistor should have less than half the area of a 32 nm transistor.
Assuming a perfectly square transistor scaled down to the same proportions, the area reduction is 1 − 484/1024 = 52.734375%.
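The shrink percentages being thrown around in this thread can be checked with a quick script (a sketch of the geometry arithmetic only; actual transistor dimensions don't track node names exactly):

```python
# Back-of-the-envelope node-shrink math for 32 nm -> 22 nm.
old, new = 32.0, 22.0  # nm

# Linear shrink: (32 - 22) / 32
linear_shrink = (old - new) / old        # 0.3125 -> 31.25%

# Area ratio: 32^2 / 22^2 = 1024 / 484
area_ratio = old**2 / new**2             # ~2.12x

# Area reduction: 1 - 22^2 / 32^2
area_shrink = 1 - new**2 / old**2        # 0.52734375 -> ~52.7%

print(f"linear shrink: {linear_shrink:.2%}")   # 31.25%
print(f"area ratio:    {area_ratio:.3f}x")     # 2.116x
print(f"area shrink:   {area_shrink:.4%}")     # 52.7344%
```

So both numbers in the thread are right; they're just measuring different things (one dimension vs. area).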
The 7% figure most likely has nothing to do with 32nm -> 22nm though. They probably compared tri-gate to planar on the same manufacture process (most likely 22nm but it could have been on any). "Performance" is vague though. In any event, it allows them to scale down even smaller because the height of an electron is no longer a limiting factor (well, at least not as much so). This discovery allows for 10nm and beyond to be feasible.
Certainly in this press release it's comparing 32 to 22 nm.
But I'm hopeful 3d transistors become the norm regardless as it can only get better.
Not necessarily; they still brought out the Athlon 64 back when Intel was completely dominating, and it was leagues ahead of what Intel was doing with the P4.
yea and that was pre-ATI purchase.
hehe, well i hope this doesn't hurt AMD as much as it sounds like it could.
Hell, I just got an unlockable dual-core CPU, an AMD 890 mobo, and DDR3 RAM for $250, which I can't say I could have done with Intel.
However, seeing how this new tech works, isn't there an issue with how much thicker the chip would have to be, cooling-wise? Never mind cost. So to me it sounds like AMD are safe, as they're cheaper than Intel now, never mind when Intel takes advantage of this tech on a larger scale.
And would the tech be more prone to issues when making new stuff?
It looks like ARM is not too worried:
From the article:
"Drew also pointed out that ARM has already announced test chips at 22 and 20nm already, with foundry partners TSMC and GlobalFoundries also working on those processes, and that IBM is already working on 14nm."
Well, look at it this way - the majority of ARM cores were until recently built at 65nm and were still kicking Atom's arse (and anything else that was x86) in performance/power. The problem is that ARM isn't as scalable as x86 and is mostly 32-bit, which is why it's limited to small electronic devices. Half the SoC consists of fixed-function hardware, so every one to two years you need a new SoC with updated hardware just to be able to watch the latest movie format or browse the net. Once you scale things past small electronic devices, ARM quickly loses steam.
So would this herald a huge change to Cadence and Synopsys design/synthesis suites, or maybe just a significant modification of DRC?
I always like your posts cause they always link to informative info.
FFS! I just upgraded to an i7 build and now they mention this?! arrrgggg!!!!
I wish IBM would get back into the desktop arena.
I thought it was closer to a FinFET with multiple gates than anything else?
AMD hit 1 GHz first, had 64-bit CPUs out first, had IMCs first, and probably a few other things I've forgotten already.
And Intel created x86 (which is what AMD built their 64-bit extension on), and Intel created the first commercial microprocessor chip.
They both have contributed, though I do think Intel tends to lay out the significant advancements first and AMD tends to follow after.
Intel surely has the lead, but at times AMD has taken the performance and feature crowns. The point was merely to reinforce that while AMD is the underdog most of the time, it's not always true.
Will this tech help Intel take the lead? Damn sure it will. But what if these CPUs cost twice what they do now? The wealthy among us will sing Intel's praises, and everyone else won't be able to afford the new Ivy Bridge 22 nm 3D-transistor Extreme/K editions that let them OC...
And then AMD will catch up with Intel's tech and improve upon it and Intel will have to come up with something new again and the cycle will repeat.
All this talk about who did what first makes me think about one thing: would many of these advances have come so early if Intel and AMD were not competing?
Yeah, I know it's not much of a competition right now, but in the past I think certain generations were all about trying to beat the other company, giving us AMD CPUs that were the fastest at the time and Intel CPUs that were the fastest at the time.
Pretty much beat me to it. It's always one side or the other pushing things forward, then the other tries to one-up them. It's a beautiful cycle for us, as it means more and more power.
Well, yeah. If a company has nobody to compete with, then they don't need to spend all that money on R&D.
That's exactly it. Intel has new tech that dominates, so AMD goes another route: better power efficiency, more cores, lower prices, etc.
Intel then finds a way to make their tech more appealing, likely via the lower-prices dealio.
And then we would lose out big time.
All of us will only be around for so long and competition in the tech industry helps us get to see and use more and more advanced tech while we still can.
competition brings new technology and... some price cuts