Arctic Leaks Bucket List of Socket LGA1150 Processor Model Numbers

Conroe was the end-all solution. A computer from that era, with upgraded RAM, is still more than enough for the majority of consumers. Imagine using a 1997 CPU in 2002...

Very true. I'm still on my B3 Q6600 from 2007. I've upgraded to an SSD, added some RAM and swapped in a modern GPU, but after a mild OC to 3GHz I've never felt it to be lacking on the CPU side.

Intel's problem really is that they've built something so good it's hard to offer a truly tangible upgrade on the performance side, since most games very quickly become GPU-limited anyway. That's probably one of the reasons they've been focusing on improving power efficiency instead.

Until the next big game (or other program) comes out and brings the best CPUs to their knees, there's no compelling reason to push for outright performance rather than the far more reasonable performance-per-watt.
 
I know what you mean. I am still on a Q9650 and really I cannot justify buying a new CPU, mobo and RAM... For what, I mean?
 
I upgraded from my Phenom II 940 because I was hitting a bottleneck with memory speeds, and the only way to fix that was to replace the entire platform (both the motherboard and CPU were AM2+, with support for only DDR2). Upgrading eliminated that bottleneck (and gave me 8 DIMM slots for future upgrades), and I don't see my 3820 being inadequate any time soon.

It's a consideration, but if your rig is working fine for your purposes, there is no reason to upgrade. You're right.

Should I wait for the next-generation processor & socket (after 1150)?

Is it slowing you down? Prima.Vera has a good point, if you're not fully taxing your platform then there is no need to upgrade it.
 
Maybe so, but it was a bad analogy. A lot of analogies are stupid.



How many people render stuff, and do you need anything more than an i5 to play Crysis 3 properly? I have no idea, but my cold guess is that it'll play just fine on an i3 or anything from AMD. Power users and people who need workstations are a different lot from most end consumers, which is what I was talking about.

And you're confusing virtualization with cloud stuff. The rest is good ol' American paranoia, which is kinda silly imo.

I love vinyl as an end user, but his analogy is right. I wouldn't want to make them right now. It's a niche market, much like high-end workstations and desktops. It would be a hard field to enter without a lot of resources at your disposal.

And cloud is inferior not because of privacy, but because of uncertainty and security. What if the service goes down? What if there's an error on your account? What if it's hacked? Things of that nature. They are all legitimate concerns. Granted, there are good things about it, but it's far from perfect.

As for these Haswell chips, I'll likely stick with my 980X for the foreseeable future, at least until upgrading gets me a larger performance increase. That will likely mean an unlocked 8-core or better.
 
I am still kicking myself for selling my Q9600 for an i3-230 because I wanted to scale back and save some money. To this day I still don't know why the reasoning centre of my brain shut down and made me think that was a good idea. I should have just stuck with it. I will get the top offerings of the last generation and then call it a day, once I eventually save enough money to finish a complete overhaul of my system.
 
I think what people forget is that nowadays, who doesn't have a 1080p recording camera or smartphone? My current i7 920 is painfully slow at converting videos compared to a 3570K or 3770K using Intel QuickSync technology. Try converting a DVD or Blu-ray to iPad format.
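(For anyone curious how that kind of QuickSync transcode actually gets kicked off, here's a rough sketch that shells out to ffmpeg's h264_qsv encoder from Python. It assumes an ffmpeg build compiled with QSV support; the filenames, bitrate, and preset are placeholder assumptions, not settings anyone in this thread posted.)

```python
import subprocess

# Minimal sketch: hardware H.264 encode via Intel QuickSync (ffmpeg's h264_qsv).
# Filenames and rates below are hypothetical examples.
cmd = [
    "ffmpeg",
    "-i", "input.mkv",        # source, e.g. a ripped Blu-ray
    "-c:v", "h264_qsv",       # QuickSync H.264 encoder (needs QSV-enabled ffmpeg)
    "-preset", "fast",
    "-b:v", "5M",             # target video bitrate
    "-vf", "scale=-2:1080",   # cap at 1080p, width follows aspect ratio
    "-c:a", "aac",
    "-b:a", "160k",
    "output.mp4",             # H.264 + AAC in MP4 plays fine on an iPad
]
subprocess.run(cmd, check=True)
```

On a chip with QuickSync this offloads the encode to the iGPU's fixed-function hardware, which is exactly why a 3570K/3770K walks away from an i7 920 at this job.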
 
Yeah, when I thought about it some more it's quite a good analogy actually, as you say, niche markets etc. I didn't give it enough thought.

And it's not perfect, but it's far from horrible as some people make it out to be. It's all about one's needs.

Edit: also, I apologize if I came across as bitchy. Can't remember now what I wrote and I'm on the phone, but I have a feeling I was, and if so I apologize.

@aquinus: as P.M asks below, what did you do to hit a bottleneck specifically in the memory?
 
That's interesting. What apps are you using that cap your DDR2? Usually I thought the CPU or GPU is the main problem. I have DDR2 OC'd at 1200MHz and still got better scores than DDR3 @ 1600MHz, for example.
 
Believe it or not, StarCraft 2 was the worst offender. Frame rates dropped by large amounts even with plenty of CPU and GPU power to spare; in certain scenes you could watch both GPU and CPU usage drop along with the frame rate. I started adjusting the CPU/NB speed and it was making a difference. I was running DDR2-800 @ 5-5-5-15 at the time, but it wouldn't run much faster than 950MHz if I wanted to keep it stable. So it was either the L3 cache or the DRAM. Either way, since I upgraded to the 3820, even with CFX disabled, it's a night-and-day difference as more and more units come onto the map.
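(If anyone wants to sanity-check whether DRAM bandwidth is the limiter on their own box, here's a crude STREAM-style triad in Python with numpy. It's a minimal sketch: the array size is an arbitrary assumption, and a compiled STREAM binary will report higher absolute numbers, but a big gap between an old DDR2 platform and a newer one still shows up.)

```python
import time
import numpy as np

# Crude STREAM-style "triad" bandwidth check: a = b + s*c.
# N is an arbitrary choice, just big enough to blow past the CPU caches
# (20M doubles = ~160 MB per array).
N = 20_000_000
b = np.random.rand(N)
c = np.random.rand(N)
s = 3.0

start = time.perf_counter()
a = b + s * c
elapsed = time.perf_counter() - start

# The triad touches three arrays of 8-byte doubles (read b, read c, write a).
gb_moved = 3 * N * 8 / 1e9
print(f"~{gb_moved / elapsed:.1f} GB/s effective bandwidth")
```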

As far as the other software goes, it's hard to say if it was the DRAM, because I only tested it a little and making the switch from the 940 to the 3820 made everything run faster in general. It was just a good upgrade, sans the power consumption, but that doesn't bother me since the 940 could eat just as much and give me less. :p

Not to say that the 940 was a bad chip; it made for a very capable rig. I had it for years and it was due for an upgrade. Plus, SB-E handles virtualization very well. I can't complain.

Also, I have my machine crunching right now, and the rig itself (no monitors) isn't eating more than 400 watts (385 watts to be exact, with both GPUs loaded and only the CPU overclocked). Overclocking the 6870s can push that usage up towards 500 watts, and in some cases up to 550 watts.
 