• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

Intel Reinvents Transistors Using New 3-D Structure

"Intel's 22nm 3D transistors provide up to a 37 percent performance increase at low voltage versus Intel's 32nm planar transistors."

Wait wait wait, that's like 7% more than the usual, right?

naughty intel, making it seem like it's better than it is.

Not to say it won't give them an advantage, but nowhere near as much as people think, I'd wager. The tech may develop some more, but for now it's only slightly better than a change in fab size.
The difference between 22nm and 32nm is 31.25%. 37% - 31.25% = 5.75%

I don't think performance is necessarily the main benefit of Tri-Gate. I think it's reduced leakage, which means greater efficiency (less having to recycle pipelines) and theoretically also less friction, which means higher clockspeeds for the same 140 W TDP max. I think Intel has set themselves up for a 4+ GHz processor--about 10 years overdue.
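The back-of-envelope math in the posts above can be checked with a quick sketch; this treats the node names "32nm" and "22nm" as literal linear feature sizes and takes Intel's 37% figure at face value, which is an assumption:

```python
# Hypothetical check of the node-shrink arithmetic above,
# treating "32nm" and "22nm" as literal linear feature sizes.
old_nm, new_nm = 32, 22

# Percent reduction in one linear dimension: (32 - 22) / 32 = 31.25%
linear_shrink = (old_nm - new_nm) / old_nm * 100
print(f"linear shrink: {linear_shrink}%")  # 31.25%

# Intel's claimed low-voltage gain minus the linear shrink
claimed_gain = 37.0
print(f"gain beyond linear shrink: {claimed_gain - linear_shrink}%")  # 5.75%
```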
 

The difference between 32 and 22nm is much greater :o

32^2/22^2 = 1024/484 ≈ 2.12, i.e. about a 212% area ratio, so a 22nm transistor should be less than half the size of a 32nm transistor
 
Assuming it is perfectly square and reduced to the same square dimensions, the difference is 52.734375%.

The 7% figure most likely has nothing to do with 32nm -> 22nm though. They probably compared tri-gate to planar on the same manufacturing process (most likely 22nm, but it could have been any). "Performance" is vague though. In any event, it allows them to scale down even smaller because the height of an electron is no longer a limiting factor (well, at least not as much so). This discovery makes 10nm and beyond feasible.
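Since transistors shrink in two dimensions, the area comparison in the posts above squares the linear ratio; a quick sketch (again using the literal node names as feature sizes, an assumption) reproduces both figures:

```python
# Hypothetical area comparison between a 32nm and a 22nm process,
# assuming features scale equally in both dimensions.
old_nm, new_nm = 32, 22

# Area ratio: 32^2 / 22^2 = 1024 / 484 ≈ 2.12 (the "~212%" figure)
area_ratio = old_nm**2 / new_nm**2
print(f"area ratio: {area_ratio:.3f}x")

# Percent area reduction: 1 - 484/1024 = 52.734375%
area_reduction = (1 - new_nm**2 / old_nm**2) * 100
print(f"area reduction: {area_reduction}%")  # 52.734375%
```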
 

Numbers........mind.......burning!
 

Certainly in this press release it's comparing 32 to 22 nm.

But I'm hopeful 3d transistors become the norm regardless as it can only get better.
 
AMD is no match for the R&D department at Intel. AMD unfortunately will always be two steps behind.

not necessarily, they still brought out the Athlon 64 back when intel was completely dominating, and it was leagues ahead of what intel was doing with the P4.
 

yea and that was pre-ATI purchase.
 
I think Intel should call them Killdozer LOL.

hehe, well i hope this doesn't hurt AMD as much as it sounds like it could.

Like hell, i just got a dual unlockable CPU, an AMD 890 mobo and DDR3 ram for $250, which i cannot say i could have done with INTEL.

However, seeing how this new tech works, is there not an issue with how much thicker the chip would have to be, cooling wise? Never mind cost, so to me it sounds like AMD are safe as they're cheaper than INTEL now, never mind when INTEL take advantage of this tech on a larger scale.

And would the tech be more prone to issues when making new stuff?
 
It looks like ARM is not too worried:

http://mobile-device.biz/content/item.php?item=30305

From the article:
"Drew also pointed out that ARM has already announced test chips at 22 and 20nm already, with foundry partners TSMC and GlobalFoundries also working on those processes, and that IBM is already working on 14nm."
 

Well look at it this way - the majority of ARM cores were until recently built at 65nm and were still kicking Atom's arse (and anything else that was x86) in performance/power. The problem is that ARM isn't as scalable as x86 and is mostly 32-bit, so that's why it's limited to small electronic devices. Half the SoC consists of fixed-function hardware, so every one to two years you need a new SoC with updated hardware just to be able to watch the latest movie format or browse the net. Once you scale things past small electronic devices, ARM quickly loses steam.
 
FFS! I just upgraded to an i7 build and now they mention this?! arrrgggg!!!!
 
I wish IBM would get back into the desktop arena.
 
well intel invented something all over again. so when did AMD do that? i can't remember :P

finally something really exciting and new :) can't wait to see what will come of this,

also i wonder, can they just build more bridges on the same transistor? like 2 bridges that have control over 3 flows of current each? hehe, that would be 9x one transistor!

AMD hit 1GHz first, had 64-bit CPUs out first, had IMCs first, and probably a few other things i forgot already.
 

And Intel created x86 (which is what AMD built their 64-bit chip on) and Intel created the first commercial microprocessor chip.

They both have contributed, though i do think Intel tends to lay out the significant advancements first and AMD tends to follow after.
 

intel surely has the lead, but at times AMD has taken the performance and feature crowns. the point was merely to reinforce the fact that while AMD is the underdog most of the time, it's not always true.


will this tech help intel take the lead? damn sure it will. but what if these CPUs cost twice what they do now? the wealthy among us will sing intel's praises, and everyone else won't be able to afford the new ivy bridge 22nm 3D-transistor extreme/K editions that let them OC...
 
And then AMD will catch up with Intel's tech and improve upon it and Intel will have to come up with something new again and the cycle will repeat.
 
All this talk about who did what first makes me think about one thing, would many of these advances have come so early if Intel and AMD were not competing?

Yea i know it's not much of a competition right now, but in the past i think certain generations were all about trying to beat the other company, thus giving us AMD CPUs that were the fastest at the time and Intel CPUs that were the fastest at the time.


*edit*

Pretty much beat me to it :laugh: it's always one side or the other pushing things forward, then the other tries to one-up them. it's a beautiful cycle for us as it means more and more power :D
 
Well, yeah. If a company has nobody to compete with, then they don't need to spend all that money on R&D.
 
that's exactly it. intel has new tech that dominates, so AMD goes another route: better power efficiency, more cores, lower prices, etc.


intel then finds a way to make their tech more appealing, likely via the lower prices dealio.


everyone wins.
 

And then we would lose out big time.

All of us will only be around for so long and competition in the tech industry helps us get to see and use more and more advanced tech while we still can.
 

competition brings new technology and... some price cuts
 