
Arctic Leaks Bucket List of Socket LGA1150 Processor Model Numbers

I'm really curious how the new i7 4770 will perform compared to the 3770. If the performance gap is the same as between the 3770 and the 2700, I foresee some dark times for Intel...

Maybe. I think it really depends on how much less power it uses while running faster, and on how well AMD does with their processors. I don't think Intel is in store for dark times, though, regardless of how much faster the 4770 is over the 3770.
 
I'm really curious how the new i7 4770 will perform compared to the 3770. If the performance gap is the same as between the 3770 and the 2700, I foresee some dark times for Intel...

Marginally better. Intel releases are like Call of Duty titles now.
 
Maybe. I think it really depends on how much less power it uses while running faster, and on how well AMD does with their processors. I don't think Intel is in store for dark times, though, regardless of how much faster the 4770 is over the 3770.


Honestly, I think even Intel sees the writing on the wall... times are a-changing... who wants to be the best vinyl record, cassette tape, and CD maker in an iPod world? I think we're gonna see a new focus from Intel.
 
True that. And if you take into account that 1366×768 is the most popular resolution, you don't even need a gaming-grade graphics card. About a year ago I built a PC for a relative with a Celeron G540, an H61 board, 4GB of RAM, and an HD6670 GDDR5, and he couldn't be happier (he had been "gaming" on a P4 3.2C + X800XT).

The X800XT is the highest of the high end; he must have been doing great.
 
There are two things that need a more powerful/faster CPU: gaming and video editing (for a regular home user, of course). Nothing else comes to mind that needs better; any CPU on the market will handle every task outside gaming/video editing for the usual person, I guess (I've probably missed some software with high demands).


Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.
 
Although LGA 1150 is coming a little "too soon," I'm actually happy that Intel released Ivy Bridge processors. I hate to sound like some kind of eco-friendly nutcase, but compare the power consumption of a 45nm i7 processor with that of a 3770K and the difference is huge. I haven't gotten my 3770K @ 4.3GHz/1.175v to consume more than 80 watts in IntelBurnTest (~60w in crunching, measured with HWMonitor reading the digital VRM interface), while my 2600K at 4.3GHz/1.3v took 130w in IntelBurnTest and 80w crunching. My i7-870 dumps out even more heat and is slower, so the advances in the process nodes are more significant than the actual performance increases.

Intel in the past few years has essentially done what the automotive industry has done in recent years: increase efficiency. You wouldn't use an old Chevy 454 big block these days over a Vortec 5300 or a brand new V6, would you? (unless you enjoy getting 8 MPG)
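To put those numbers in perspective, here's a back-of-the-envelope efficiency comparison. This is a minimal sketch: the wattages are the IntelBurnTest readings quoted above, and the ~7% relative performance factor is an assumed ballpark for Ivy Bridge's gain over Sandy Bridge at the same clock, not something measured in this thread.

```python
# Rough perf-per-watt comparison from the figures quoted above.
# Wattages: the poster's IntelBurnTest readings at 4.3 GHz.
# relative_perf: assumed ~7% gain for Ivy over Sandy (ballpark).

watts_2600k = 130   # Sandy Bridge @ 4.3 GHz / 1.3 V
watts_3770k = 80    # Ivy Bridge   @ 4.3 GHz / 1.175 V
relative_perf = 1.07

power_saving = 1 - watts_3770k / watts_2600k
perf_per_watt = relative_perf * watts_2600k / watts_3770k

print(f"Load power reduction:      {power_saving:.0%}")    # ~38%
print(f"Perf-per-watt improvement: {perf_per_watt:.2f}x")  # ~1.74x
```

At roughly the same speed, that's about a 38% cut in load power, which is exactly the "process node over raw performance" point being made.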
 
There are two things that need a more powerful/faster CPU: gaming and video editing (for a regular home user, of course). Nothing else comes to mind that needs better; any CPU on the market will handle every task outside gaming/video editing for the usual person, I guess (I've probably missed some software with high demands).

I disagree. For the average consumer, gaming and video editing do not require a faster CPU than his or my several-generations-old i7. Gaming is basically all GPU-limited at this point, so the CPU makes little difference, and most consumer-level video editors run pretty well on any decent quad-core. The only thing a faster CPU helps with is render times, and even then the difference between the two is maybe a minute on a decent-length video.
 
The hardware side is there; we are waiting on the software side to catch up, and on the next set of instructions that will enable the next major breakthroughs in processing. Look at what the extra instruction sets did, and then at how much of them is actually in use versus plain basic operation. Software companies are trying to cover as many systems as possible and to remain compatible with consoles and other devices; once we get past these issues, I'd imagine the whole need for speed will become more of a need for optimization.
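As a toy illustration of software catching up to hardware: the same arithmetic runs far faster when it's written in a form that lets the runtime reach the SIMD instruction sets (SSE/AVX) the CPU already has. A minimal sketch, assuming NumPy is installed; exact timings will vary by machine.

```python
# Identical math, two expressions of it: the plain loop leaves the
# CPU's vector units idle, while the vectorized call lets the library
# dispatch to SIMD (SSE/AVX) code paths the hardware has had for years.
import time
import numpy as np

data = np.random.rand(10_000_000)

start = time.perf_counter()
total = 0.0
for x in data:                          # scalar: one element at a time
    total += x * x
loop_time = time.perf_counter() - start

start = time.perf_counter()
total_vec = float(np.dot(data, data))   # vectorized: SIMD-friendly
vec_time = time.perf_counter() - start

print(f"loop: {loop_time:.2f}s   vectorized: {vec_time:.3f}s")
```

Same hardware, same result, wildly different speed: the gain has to come from how the software is written.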
 

Honestly, I think even Intel sees the writing on the wall... times are a-changing... who wants to be the best vinyl record, cassette tape, and CD maker in an iPod world? I think we're gonna see a new focus from Intel.

This is OT, but audio is a very, VERY bad example, as it's a different beast altogether. The best of those are titanically expensive, and titanically good despite being "old". If you get a chance, listen to a good setup with a Linn LP12 and be prepared to be blown away.

When talking about computers, it's often best to talk about computers. Don't bring in other stuff that has nothing to do with them.
 
This is OT, but audio is a very, VERY bad example, as it's a different beast altogether. The best of those are titanically expensive, and titanically good despite being "old". If you get a chance, listen to a good setup with a Linn LP12 and be prepared to be blown away.

When talking about computers, it's often best to talk about computers. Don't bring in other stuff that has nothing to do with them.

1. You missed the point.

2. Seriously? It's just an analogy.
 
We're making them "slower" because we don't need them to be faster, as you yourself point out. And the cloud and virtualization are being smarter, not dumber.

Think about this when you need to render something (video or raytracing), or play Crysis 3...

And virtualization is a violation of the user's privacy, so it is being dumber from the user's point of view. It just means that the hardware and software are owned by a corporation (which could be evil, what do you know; if you ask me, even the USA government is "evil", as most governments are), and the information is controlled by them.

Edit: I have the logo of maybe the largest cloud-computing vendor as my avatar. Oops. :D
 
1. You missed the point.

2. Seriously? It's just an analogy.

Maybe so, but it was a bad analogy. A lot of analogies are stupid.

Think about this when you need to render something (video or raytracing), or play Crysis 3...

And virtualization is a violation of the user's privacy, so it is being dumber from the user's point of view. It just means that the hardware and software are owned by a corporation (which could be evil, what do you know; if you ask me, even the USA government is "evil", as most governments are), and the information is controlled by them.

How many people render stuff, and do you need anything more than an i5 to play Crysis 3 properly? I have no idea, but I'll calmly assume it'll play just fine on an i3 or anything from AMD. Power users and people who need workstations are a different lot from most end consumers, which is what I was talking about.

And you're confusing virtualization with cloud stuff. The rest is good ol' American paranoia, which is kinda silly IMO.
 
I can't wait for these to come out! I can finally upgrade from my old Core 2 Quad! Now I just need to buckle down, study, pass my test, and get a job.

I will gladly take an i7 4770K and an AMD 8970/NVIDIA 780...
 
Maybe so, but it was a bad analogy. A lot of analogies are stupid.



How many people render stuff, and do you need anything more than an i5 to play Crysis 3 properly? I have no idea, but I'll calmly assume it'll play just fine on an i3 or anything from AMD. Power users and people who need workstations are a different lot from most end consumers, which is what I was talking about.

And you're confusing virtualization with cloud stuff. The rest is good ol' American paranoia, which is kinda silly IMO.

:laugh: You think quite highly of yourself, don't you... Thankfully, not everyone and/or everything conforms to your realm of reasoning... but that's what makes us all special... carry on!
 
Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.

I'm also getting a strong sense of déjà vu, back to the Pentium 4 days and Intel trying to push clocks as high as they could reasonably go.

Then I get this knot in my stomach that computers are getting slower, not faster, because consumers aren't demanding faster (see the explosion of tablet and smartphone sales). Then again, developers aren't really pushing for faster hardware like they did in the 1990s and early 2000s.

Maybe it's because AMD is on a crash course and Intel has nothing better to do?

So depressing. :(


None of that is actually true. In fact, reading it again, it sounds a LOT like you're describing AMD's CPU division...

Higher clock speeds year after year? Show me. Oh right, you can't, because [Haswell] clocks are the same as Ivy Bridge's.

Computers getting slower? Can you say GPU acceleration, impressive GPU drivers from AMD this year, boost clocks from both NVIDIA and AMD (plus new SKUs), Hyper-Threading, super-cheap RAM, and sub-$1/GB SSDs?

Stay off the crack and out of the comment sections.
 
:laugh: You think quite highly of yourself, don't you... Thankfully, not everyone and/or everything conforms to your realm of reasoning... but that's what makes us all special... carry on!

So what is wrong with my reasoning? Seriously, if there is anything actually wrong with it, I'll correct it. For realz.

And I still maintain it was a bad analogy, and that most of them are bad.
 
LOL, what makes a bad analogy bad is that it's not understood and isn't relevant. At least for me, I could easily pick up on it.
 
Higher clock speeds year after year? Show me. Oh right, you can't, because [Haswell] clocks are the same as Ivy Bridge's.
Core i7 980X = 3.33 GHz
Core i7 3970X = 3.5 GHz

There are no 6-core Ivy Bridge-based processors out yet.


Computers getting slower? Can you say GPU acceleration, impressive GPU drivers from AMD this year, boost clocks from both NVIDIA and AMD (plus new SKUs), Hyper-Threading, super-cheap RAM, and sub-$1/GB SSDs?
The average of all computing devices is getting slower. Desktops are losing market share as weak ultrabooks, tablets, and phones take over, devices more suited to playing Angry Birds than Crysis.
 
I'm a gamer. It's depressing because gaming technology isn't advancing, with all these idle cores and all this RAM going unused...

That's no lie. Very few games take advantage of quad-core CPUs, and even fewer run 64-bit natively to use more than a couple of gigs of RAM. I remember the rumor of the original Athlon FX working on reverse hyper-threading; I still wish that would come to fruition. I'd rather run a game on a 16 GHz single-core machine than have the game barely tax two of my four cores.
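That "16 GHz single core" wish is really about where the burden of parallelism falls: speedup from extra cores only shows up when the software explicitly divides its work. A minimal sketch using Python's standard library; simulate_chunk is a hypothetical stand-in for a slice of game or render work, not anything from a real engine.

```python
# Work only spreads across cores when the program explicitly splits it.
# A game that wasn't written this way leaves three of four cores idle,
# exactly as described above.
from concurrent.futures import ProcessPoolExecutor
import time

def simulate_chunk(n):
    """Hypothetical stand-in for one slice of game/render work."""
    return sum(i * i for i in range(n))

CHUNK = 2_000_000

if __name__ == "__main__":
    start = time.perf_counter()
    serial = [simulate_chunk(CHUNK) for _ in range(4)]    # one core
    t_serial = time.perf_counter() - start

    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=4) as pool:      # up to four cores
        parallel = list(pool.map(simulate_chunk, [CHUNK] * 4))
    t_parallel = time.perf_counter() - start

    print(f"serial: {t_serial:.2f}s   parallel: {t_parallel:.2f}s")
```

On a quad-core the parallel run should finish in roughly a quarter to a third of the serial time; on the single fast core the enthusiast wishes for, the naive serial version would do just as well.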
 
Reverse hyper-threading was the rumor for Bulldozer. When it was found to be false, enthusiasm for it plummeted.
 
Two CPU generations per socket seems reasonable; I would prefer three. I bet Intel could have made Haswell 1155 if they had wanted to.
These are not full generations, though. Look at 775, that was great. I guess Intel is on the motherboard manufacturers' payroll.
I wouldn't say it's an Intel-only trait. FM1 lasted how many generations?

I also seem to remember that mobo makers sold a reasonable number of 990FX/X boards leading up to Bulldozer's launch, largely off the back of some AMD guerrilla marketing. What huge advance do the 900 chipsets offer that the 800s don't?
 
Finally!! Time to upgrade from my old 1156 i5 750!!! :toast:

Probably gonna get that unlocked 4770 version :)

- or -

Should I wait for the next-generation processor and socket (after 1150)??
 