
What does this breakthrough mean for the future of computing? Help me understand the new petahertz transistor.

Prospective transistor speeds might give a false impression of how impressive this is. It takes multiple transistors to form even a rudimentary logic gate, hundreds to build a processor capable of 8-bit operations, thousands to run a program, and millions to run an operating system.

When it comes to performance, the real question is throughput. Would you rather have 1 transistor switching 10^15 times per second, or 10^7 transistors each switching 10^9 times per second? Well, we should probably do the math. Powers of 10 are especially easy to work with in the decimal system, much like powers of 2 are easy to work with in binary. Let's examine.

1 petahertz written out in full is 1,000,000,000,000,000 Hz.

Let's multiply 10^7 by 10^9. Since these are simple powers of ten, we can add the exponents to get 10^16. This might give away the problem immediately.

Our result written out in full is 10,000,000,000,000,000.

Our 'normal processor', consisting of 10 million transistors running at 1 GHz, outperforms our single mega-transistor by an order of magnitude... and still gets outdone by a Pentium III. This is a gross oversimplification, but the point is that it will take a lot of engineering and progress for the photon to supersede the electron in general-purpose computing, and to reach a level where it is usable for today's computing tasks. It's still interesting, at least, and shows where a theoretical 'absolute ceiling' may lie.
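The back-of-envelope comparison above can be sketched in a few lines of Python. Note the transistor counts and clock rates are the illustrative figures from this post, not real chip specifications:

```python
# Aggregate throughput = number of transistors * switches per second each.
# Illustrative figures only, not real hardware specs.
single_phz_transistor = 1 * 10**15      # 1 transistor at 1 PHz
normal_processor = 10**7 * 10**9        # 10 million transistors at 1 GHz

print(f"Single PHz transistor: {single_phz_transistor:.0e} switches/s")
print(f"10M-transistor 1 GHz chip: {normal_processor:.0e} switches/s")
print(f"The 'normal' chip wins by {normal_processor // single_phz_transistor}x")
```

Running this confirms the 10^16 vs 10^15 gap: the conventional chip delivers ten times the raw switching throughput.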
 