
Moore's Law Buckles as Intel's Tick-Tock Cycle Slows Down

I met a guy, mid-'80s, at a party. He was an electronics engineering student at the U of M (Minnesota).

He was very excitable and full of vigor about the upcoming revolution in computing power.

"Photons"

I told him he sounded like a flake and was driving people away. He stopped for a moment, looked around, and realized we were now the only two people standing in the once-full kitchen.

He looked at me, with a little less enthusiasm and said, "You might be right."

I turned and walked outside to join the crowd by the fire while he stood there, alone, for a few moments. Half an hour later I saw him in the basement; he seemed to have taken on a more casual vibe and was calmly conversing with another group.

He stopped me as I walked by, his eyes sparkling like a stoned sophomore, and said, "Thanks, man."

That was the last I heard of the Photon Revolution in Computing Power, and the last I ever saw of that guy.

:D
 
So you're saying you are the reason why we don't have optical computers yet? :cry:



POWER6 and POWER8 reached 5 GHz. I think they are 160-watt chips, though.

Well, if they have as many transistors as SB-E in them and are doing 160 W at 5 GHz, then it's an incredible achievement, because SB-E needs about 300 W to do 5 GHz.
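
A big part of why 5 GHz is so expensive is that CMOS dynamic power scales roughly as C·V²·f, and hitting higher clocks usually needs higher voltage, so power climbs faster than the clock does. Here is a minimal Python sketch of that scaling; the effective capacitance and voltages below are illustrative assumptions, not measured SB-E or POWER figures:

# Rough CMOS dynamic-power scaling: P ~ C_eff * V^2 * f (leakage ignored).
# C_eff and the voltages are illustrative assumptions, not measured values.
def dynamic_power_watts(c_eff_farads, volts, freq_hz):
    return c_eff_farads * volts ** 2 * freq_hz

stock = dynamic_power_watts(20e-9, 1.20, 3.6e9)   # assumed ~3.6 GHz at 1.20 V
pushed = dynamic_power_watts(20e-9, 1.45, 5.0e9)  # assumed ~5.0 GHz at 1.45 V
print(f"stock ~{stock:.0f} W, pushed ~{pushed:.0f} W ({pushed / stock:.1f}x)")

Even with these made-up numbers, the roughly 40% clock bump about doubles dynamic power once the extra voltage is factored in, which is why 5 GHz at 160 W would be remarkable.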
 
I met a guy, mid-'80s, at a party. He was an electronics engineering student at the U of M (Minnesota). [...] That was the last I heard of the Photon Revolution in Computing Power, and the last I ever saw of that guy. :D

"More casual vibe" !? Really? Why do you think that he must be responsible and feel guilty for the fact that (silly) people are not up to his level for that type of conversations?

Yeah, I know we live in a sad world, but is it really worth it to lower your level just so others will stay next to you? :rolleyes:
 
You were there, were you? If so, why weren't you the one standing in the kitchen listening to him while everyone else left?

I was there. I made an observation, one that I thought was germane to this thread and also to that unknown person. Funny, he thanked me. Yet you feel I did something wrong. Feel free to voice your opinion, but believe me, I know me and you don't, which means I really don't care what you postulate about me.

BTW, his "level" of conversation has not proven to be real in any way I have observed. I plug my computer into a wall socket that does not emit photons. I don't know anyone else who has even spoken of photons in relation to computers. So really, what is it you have an issue with?

Again, not that I care.

Peace out random insulter.

:lovetpu:
 
I am not saying that you did anything wrong at that particular moment.

I just don't agree with your present attitude about that situation.
 
Again, You. Don't. Know. Me.

Save your judgement for people you actually have some contact with.

Perhaps do something other than trolling threads pretending to be holier than thou.

Again, Peace Out. Good Night. :lovetpu:
 
I can't say I'm that surprised. Considering how Broadwell complements rather than replaces Haswell, I didn't think Intel would keep 14 nm around for only one top-to-bottom release (Skylake). I suspect that Kaby Lake will be to Skylake what Broadwell was to Haswell: a minor refresh aimed at specific markets.

What intrigues me is what this means for Intel's competitors. Longer times at each process node mean that chip design becomes more important; in contrast, Intel's strength has always been its technological advantage in manufacturing. If competitors are able to match Intel's manufacturing technology for longer each generation, making chip design the primary differentiator, we could see some real competition.
 
Yeah, I know we live in a sad world, but is it really worth it to lower your level just so others will stay next to you? :rolleyes:

Oh my god yes. You have no idea how much "yes" is the correct answer to that.

I would prefer that progress be driven by the desire for more performance and more work done, instead of by someone greedy and stupid trying to save money and make more profit.

People in general don't need more computing power; what they need is lower power draw. And I agree with that: until serious VR/AR becomes a thing and we all have smart Iron Man houses, things will stay the same, and then we'll all have servers built into our homes.
 
Ages ago I read an article about using synthetic diamond, dunno what happened to that.

Anyway, I am expecting some sci-fi stuff in the coming years.
BTW, I don't think I like your current attitude.

:p
 
People in general don't need more computing power

This is the most stupid opinion I have read in quite a while.

If people in your world don't need more computing power, the people in my world do. There are tons of applications where more performance would mean more pleasure and a better user experience.

You are so annoying and arrogant, hell.
 
This is the most stupid opinion I have read in quite a while.

If people in your world don't need more computing power, the people in my world do. There are tons of applications where more performance would mean more pleasure and a better user experience.

You are so annoying and arrogant, hell.

I mean average Joes; they're on tablets and phones. Gamers and professional users will always need moar power, obviously, but those are not the big money makers. Low-power chips that allow for a long runtime on batteries are far more interesting.

Edit: I apologize for coming across as arrogant. I shall try to work more on my replies.
 
I am sure competition plays some part in this: the slower they release new chips, the more they make on current chips, meaning they don't have to spend as much on engineering, research & design, and new market research. The competition is themselves.

I dunno man, 1 billion/month for R&D doesn't seem a little...
 
It still takes a fuxing long time to render video on a computer, it still takes a fuxing long time to upload videos to YouTube; there's a lot of stuff that still takes way too fuxing long on a computer.

Computers are basically the same as they were 30 years ago, with incremental improvements in CPU, memory speed, PCI buses, USB speeds, etc. The computer industry has been on a long-term plan to milk consumers for every incremental upgrade. GPU makers make a card that is 25% faster than the last and we all go nuts, but we still can't run 4K properly. We still can't run 4K higher than 60 Hz.

When there are only 2 players in any industry, the consumers get screwed.
 
It still takes a fuxing long time to render video on a computer, it still takes a fuxing long time to upload videos to YouTube; there's a lot of stuff that still takes way too fuxing long on a computer.

Computers are basically the same as they were 30 years ago, with incremental improvements in CPU, memory speed, PCI buses, USB speeds, etc. The computer industry has been on a long-term plan to milk consumers for every incremental upgrade. GPU makers make a card that is 25% faster than the last and we all go nuts, but we still can't run 4K properly. We still can't run 4K higher than 60 Hz.

When there are only 2 players in any industry, the consumers get screwed.
Not really; how many players are there in the auto industry? Take a look at how far cars have come in a similar time frame, let's say that 30 years you mentioned. I would hazard a guess that computing power has moved forward much more significantly, certainly in raw speed terms. It is a subjective view, though, I agree; in contrast, if you looked at mobile telephone evolution over a similar period, I think they would be well ahead of the others' game.
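
To put that 30-year comparison into rough numbers, here is a hedged back-of-envelope in Python. The transistor counts are approximate, commonly cited figures (not exact), and the doubling rate is the textbook Moore's Law cadence of every two years:

# Back-of-envelope Moore's Law check over roughly 30 years.
# Transistor counts are approximate, commonly cited figures, not exact numbers.
transistors_1985 = 275_000        # Intel 80386 (1985), roughly 275k transistors
transistors_2014 = 2_600_000_000  # 8-core Haswell-E (2014), roughly 2.6 billion

years = 2014 - 1985
actual_growth = transistors_2014 / transistors_1985
predicted_growth = 2 ** (years / 2)  # doubling every two years
print(f"actual ~{actual_growth:,.0f}x vs predicted ~{predicted_growth:,.0f}x")

With these rough figures the actual growth comes out to several thousand times, noticeably under what a strict two-year doubling would predict, which fits the thread's point that the cadence has been slipping.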
 
There was a lot of talk about the magic of graphene, and then it all went silent. I wonder if that's even still on the table...
The world is not ready for that; there's a lot more milking to be done from silicon chips, then silicon + gallium + whatever, and then maybe in 2030 we will finally see some graphene chips.
 
We had (or could have had) the tech to go under 10 nm per transistor since the '80s... but because the shrink has been incremental, driven only by business... oh well...
 
I heard that a company was having issues with the lasers used for optical lithography for next-gen ICs. The lasers use a mercury vapor heated by an above-average electric heater designed for industrial purposes. The problem is they're running these suckers so damn hot that they're burning out. I suspect this is one example of the issues with next-gen technology: it's a lot harder not just to produce the CPUs, but to produce the hardware that makes these CPUs.

Side note: Intel would be able to dedicate less die space to the GPU (or have better-performing GPUs) if their GPU design wasn't a bastardized x86 core.
 
We had (or could have had) the tech to go under 10 nm per transistor since the '80s... but because the shrink has been incremental, driven only by business... oh well...

What do you mean?
 
What do you mean?


He means it's much more profitable to invest in minor process updates and have consumers upgrade all the time than to pour a bunch of money into a big leap and have people pleased for years. Guess which one Intel does...
 
People in general don't need more computing power; what they need is lower power draw
I agree. Normal people are moving to tablets, cell phones, all-in-one computer/monitors, and other things.

That is what I am seeing right here. The "do-it-yourself" desktop model I like is being forgotten.

I am sad for those people who buy computers from ACER, LENOVO, HP, DELL and family. Maintenance and repair can be a real pain and ultra expensive on those closed, proprietary models.

But normal people usually only realize that when they encounter a problem that needs to be repaired. :shadedshu:

Just my opinion. :rolleyes:
 
Nobody who posts on this site would be considered a "normal person"/average PC user. We are the 2% who actually give a damn about what's inside our PCs and take the time to know enough to fix our own hardware and optimize our software. And yes, it's very sad the way the 98% seem to just get more clueless every year.

Most of the people I know think it's okay to download 15 or 20 toolbars and go to "free" game sites that hijack every setting on their PC, then they call me up and say "this thing isn't acting right." I just tell them I'll fix your PC, but I can't fix YOU! These kinds of people call me back at least once a month after their latest crop of mistakes renders the PC unusable. They never learn because they have no interest in learning, and it's all so complicated to remember that they can't be bothered. These people should all get tablets; then they can spend hours on their iPhones talking to help centers in Mandarin Chinese when they break them, or just buy a new one every month...
 