
GHz?

Discussion in 'NVIDIA' started by Desert Eagle, Feb 11, 2012.

  1. Desert Eagle

    Desert Eagle New Member

    Joined:
    Feb 11, 2012
    Messages:
    70 (0.08/day)
    Thanks Received:
    29
    I've looked around a bit here and I may have missed it, but why are we so locked down on GPU GHz? I know from the overclockers that heat is the enemy, and depending on how far you want to push the GHz, it can get very expensive to dissipate that heat. I also know that with a die shrink you can count on improved efficiency and less heat for the same performance as the last generation.
    My question is this: would it be possible to die-shrink a GPU, keep the same number of transistors, but spread them out over the size of the last-generation GPU to offer a greater area for heat dissipation?
    I'm not an engineer, and I'm sure my simple idea has been passed over for good reason, but I wonder about this.
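    The intuition behind the question can be put in numbers: the same total power spread over a larger die means a lower power density (W/mm²) under the heatsink. A minimal sketch, with made-up wattage and die areas (not real GPU specs):

    ```python
    # Back-of-envelope heat-density comparison. The 200 W figure and the
    # die areas are illustrative assumptions, not actual GPU numbers.

    def power_density(watts, die_mm2):
        """Watts dissipated per square millimetre of die area."""
        return watts / die_mm2

    packed = power_density(200, 300)  # shrunk transistors packed tightly
    spread = power_density(200, 520)  # same transistors spread over the old die size

    print(f"packed: {packed:.2f} W/mm^2")
    print(f"spread: {spread:.2f} W/mm^2")
    ```

    The spread-out die has roughly 40% lower power density in this toy example, which is the effect the question is reaching for; the replies below explain why it doesn't work out that way in practice.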
  2. cdawall where the hell are my stars

    Joined:
    Jul 23, 2006
    Messages:
    20,668 (6.99/day)
    Thanks Received:
    2,981
    Location:
    some AF base
    OR they could work on efficiency, and with that efficiency crank the clock speed at the same time.
  3. _JP_

    _JP_

    Joined:
    Apr 16, 2010
    Messages:
    2,681 (1.68/day)
    Thanks Received:
    734
    Location:
    Portugal
    Well, it has already been done, though not exactly how you said it. The best example is the G92 core. Decreasing the photolithography process while keeping the die size doesn't help/improve heat dissipation that much. Also, a die shrink means less die space (obviously), thus more chips can be made out of the same wafer, thus keeping a balance in manufacturing costs.
    I'm sure I'm missing some key points here, but I guess that covers it.
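    The chips-per-wafer point can be sketched with the usual first-order dies-per-wafer estimate: wafer area divided by die area, minus the partial dies lost around the wafer edge. The 300 mm wafer and the two die areas below are illustrative assumptions:

    ```python
    import math

    def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
        """First-order estimate of usable dies per wafer: gross dies by
        area, minus an edge-loss term for partial dies at the rim."""
        radius = wafer_diameter_mm / 2
        gross = math.pi * radius**2 / die_area_mm2
        edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
        return int(gross - edge_loss)

    # Shrinking the die pulls noticeably more chips from the same wafer.
    print(dies_per_wafer(300, 520))  # big last-gen die
    print(dies_per_wafer(300, 300))  # smaller shrunk die
    ```

    This is why keeping the old die size after a shrink throws away the main economic benefit: the smaller die nearly doubles the chip count per wafer in this example.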
  4. slyfox2151

    slyfox2151

    Joined:
    Jan 14, 2009
    Messages:
    2,606 (1.27/day)
    Thanks Received:
    524
    Location:
    Brisbane, Australia
    Wouldn't that increase latency?
  5. Desert Eagle

    Desert Eagle New Member

    Joined:
    Feb 11, 2012
    Messages:
    70 (0.08/day)
    Thanks Received:
    29
    Thank you for the replies, but I still wonder. Let's use a hypothetical. Say we reduce the process to 28 nm but increase the size of the GPU chip while keeping the same number of transistors, thereby increasing the overall area for heat dissipation. Wouldn't that allow for a GHz increase?
  6. TheLaughingMan

    TheLaughingMan

    Joined:
    May 7, 2009
    Messages:
    4,998 (2.58/day)
    Thanks Received:
    1,291
    Location:
    Marietta, GA USA
    My electrical engineering is rusty, but here we go.

    What you are proposing is counterproductive. You are saying we should shrink the individual transistors down but leave the die space the same. Practically, the smaller transistors will use less power and produce less heat. The key point you are missing is that within a confined space there is no place for the heat to dissipate. In order to spread the now-smaller transistors out over a larger surface area, they would have to be connected via metal pathways. Those pathways will be (A) housing a current flow that is producing heat, and (B) trapped in a confined place filled with other heat-producing sources. Overall you have not reduced the number of heat-producing sources, nor have you reduced the amount of metal on the die which holds the heat you are trying to avoid.

    You can't think about a CPU or GPU as a complex component when it comes to heat production. As far as heat is concerned, the objective is to reduce the amount of metal and reduce resistive losses. Every die shrink is about reducing the metal. Often this allows them to increase the transistor count, and thus processing power, while still reducing the overall amount of metal. Every advancement in instruction sets, coding, new 3D gates, and manufacturing techniques is about reducing those losses, either by design or by reducing the number of working transistors needed to achieve a goal. If a new instruction set can shave one clock cycle off a calculation, that could be literally thousands of transistors and gates that don't have to switch.

    What you want to do is decrease the size of the die/chip and increase the efficiency of the heat dissipation via better heatsink technology.
    Desert Eagle says thanks.
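    The power/clock trade-off above follows the standard first-order dynamic-power relation for CMOS, P = C · V² · f: power scales linearly with clock frequency but with the square of voltage, and higher clocks usually demand higher voltage. A sketch with made-up capacitance and voltage figures:

    ```python
    # First-order CMOS dynamic power: P = C * V^2 * f.
    # All numbers here are illustrative; real chips add static leakage on top.

    def dynamic_power(c_farads, volts, hz):
        """Switching power from capacitance, supply voltage, and clock."""
        return c_farads * volts**2 * hz

    base = dynamic_power(1e-9, 1.10, 900e6)         # 900 MHz at 1.10 V
    overclocked = dynamic_power(1e-9, 1.25, 1.2e9)  # 1.2 GHz needs more voltage too

    print(f"power grows {overclocked / base:.2f}x for a 1.33x clock bump")
    ```

    The voltage-squared term is why clock increases cost disproportionately in heat: a 33% clock bump with the voltage to sustain it costs about 72% more power in this example.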
  7. Desert Eagle

    Desert Eagle New Member

    Joined:
    Feb 11, 2012
    Messages:
    70 (0.08/day)
    Thanks Received:
    29
    "The key point you are missing is that within a confined space there is no place for the heat to dissipate. In order to spread the now-smaller transistors out over a larger surface area, they would have to be connected via metal pathways. Those pathways will be (A) housing a current flow that is producing heat, and (B) trapped in a confined place filled with other heat-producing sources. Overall you have not reduced the number of heat-producing sources, nor have you reduced the amount of metal on the die which holds the heat you are trying to avoid."

    Thank you TheLaughingMan. That is the part I didn't understand.
  8. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,216 (2.55/day)
    Thanks Received:
    1,140
    Plus, silicon-on-insulator (SOI) tech is too expensive for GPU cores.
    10 Million points folded for TPU
  9. cadaveca

    cadaveca My name is Dave

    Joined:
    Apr 10, 2006
    Messages:
    13,862 (4.53/day)
    Thanks Received:
    6,930
    Location:
    Edmonton, Alberta
    Hmm? WHUT!?


    CPUs with SOI sell for far less $$$ than GPUs. There is a different reason; expense is just not part of it. AMD Fusion APUs, with a GPU included, use SOI and sell for far less than any current-gen 7-series GPU.


    As the 7970 being very nearly sold out still shows...people will pay whatever is asked for something...you just need to have marketing that justifies the price....and specifically with GPUs, fanboys will buy anyway. nV had no issues selling $800 8800GTX cards.


    And with that said, I guess I'm no longer an ATI/AMD fanboy, because I think I'm gonna buy nV cards this time. :laugh:
  10. Desert Eagle

    Desert Eagle New Member

    Joined:
    Feb 11, 2012
    Messages:
    70 (0.08/day)
    Thanks Received:
    29
    "Plus Silicon on insulator tech is too expensive for GPU cores."

    Could you dumb that down to Homo sapiens intellect, please?
  11. theoneandonlymrk

    theoneandonlymrk

    Joined:
    Mar 10, 2010
    Messages:
    3,375 (2.07/day)
    Thanks Received:
    562
    Location:
    Manchester uk
    It's the manufacturing process; GPUs use a (slightly) cheaper process. GPUs also have more transistors than CPUs anyway, due to the shader array, so they produce way more heat than CPUs. 1 GHz is doable and has been for a few years; more than that's on the way, I'd imagine. AMD especially would have a lot to gain from equalising GPU and CPU speeds, as they'd then be able to integrate the CPU and GPU in APUs more easily to share resources.
    Desert Eagle says thanks.
  12. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,216 (2.55/day)
    Thanks Received:
    1,140
    A 7970 also has RAM, a board, a cooler, power controls, its own BIOS, drivers they have to write, plus die-size cost and yield loss.

    By the time Nvidia or AMD engineer the core, spin it a few times to work out bugs, mass produce it, ship it, then sell it in bulk to AIB partners, that extra cost is a lot.
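    The yield-loss point compounds with die size. Under a simple Poisson defect model, the fraction of good dies falls exponentially with die area, so the cost per *good* die rises faster than the area itself. The wafer cost, die counts, and defect density below are made-up illustrative values:

    ```python
    import math

    def cost_per_good_die(wafer_cost, dies_per_wafer, die_area_mm2,
                          defects_per_mm2=0.001):
        """Wafer cost spread over the dies that survive a Poisson
        yield model: yield = exp(-defect_density * die_area)."""
        yield_rate = math.exp(-defects_per_mm2 * die_area_mm2)
        return wafer_cost / (dies_per_wafer * yield_rate)

    small = cost_per_good_die(5000, 200, 300)  # smaller die, more per wafer
    big = cost_per_good_die(5000, 110, 520)    # big die: fewer dies, worse yield

    print(f"small die: ${small:.2f}  big die: ${big:.2f}")
    ```

    Fewer dies per wafer and a lower yield multiply together, which is why a big die more than doubles the per-chip cost in this sketch.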
  13. cadaveca

    cadaveca My name is Dave

    Joined:
    Apr 10, 2006
    Messages:
    13,862 (4.53/day)
    Thanks Received:
    6,930
    Location:
    Edmonton, Alberta
    I'm of the opinion that CPUs should be add-in cards, and GPUs what gets stuck in mainboards. Perhaps my perspective isn't "current". Costs are nothing. What matters is what the profits are.


    AMD's 7970 launch PROVES that cost really IS NOT a factor. OR they'd not have sold so many cards already. An extra $100 is nothing, and I'm sure it covers it.

    When the SOI fabs are already built, cost isn't that high. We don't need 5 GHz CPUs...we need 5GHz GPUs. ;)

    Stuffing 225 W++ into a dual-slot space is stupid, never mind 500 W, 750 W, 1000 W with multiple cards. But people still do it...

    I don't think looking for excuses (and that's what cost claims are, to me) is gonna solve ANY issues. Instead of "why not", it should be "HOW".
    Desert Eagle says thanks.
  14. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,216 (2.55/day)
    Thanks Received:
    1,140
    You don't honestly believe they couldn't make a 2 GHz GPU with more money? Prescott.

    Heat currently isn't the biggest issue; newer cards run under 80 °C with good air cooling. And can you really think of a good reason we aren't using an included TEC, if money isn't the issue? The fact is, money IS the issue: every time you add cost, you decrease profits. AMD doesn't make a dime more if a retailer or AIB maker raises the price of a finished card.
    Desert Eagle says thanks.
  15. xBruce88x

    xBruce88x

    Joined:
    Oct 29, 2009
    Messages:
    2,361 (1.34/day)
    Thanks Received:
    546
    Another way to cut heat would be to use materials that are less resistant to electrical current; this would lower the heat as well, since you could lower the voltage.

    For now, the best you can do is liquid cooling or thermoelectric cooling. There's liquid nitrogen or oxygen, but that's just plain nuts (to use on a daily basis).
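    The voltage point pays off more than it looks: since dynamic power scales with voltage squared, even a modest voltage drop cuts heat disproportionately. A tiny sketch, with illustrative voltages:

    ```python
    # Dynamic power scales as V^2 when only the voltage changes,
    # so a 10% voltage drop removes roughly 19% of the switching power.

    def relative_power(v_new, v_old):
        """Power ratio after a voltage change, holding clock constant."""
        return (v_new / v_old) ** 2

    print(f"1.20 V -> 1.08 V: {relative_power(1.08, 1.20):.0%} of the original power")
    ```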
  16. Desert Eagle

    Desert Eagle New Member

    Joined:
    Feb 11, 2012
    Messages:
    70 (0.08/day)
    Thanks Received:
    29
    "Stuffing 225 W++ into a dual-slot space is stupid, never mind 500 W, 750 W, 1000 W with multiple cards. But people still do it..."

    Thank you, Dave between the Ca Ca. It's better than being DaveCaCaDave... that would be a sh#t sandwich.

    Humbly speaking for the PC gamers: we stuff our slots with monster cards because (A) it's fun, and (B) the Corporate Profit Hoes that run EA (and some others) can't be bothered to optimize their f'ing games for PC, so we either play games that look like console-port caca or we spend $$$ on something that looks better than caca, because we can.
  17. cadaveca

    cadaveca My name is Dave

    Joined:
    Apr 10, 2006
    Messages:
    13,862 (4.53/day)
    Thanks Received:
    6,930
    Location:
    Edmonton, Alberta
    Yeah, I'm stuck in the middle of the crap that is my daily life. The most I can say I accomplish with my time is my reviews; nothing else really amounts to much. Well, taking care of my kids is important, but that has its own issues as well.


    I am basically using a second card so I can enable Ultra in BF3. One card plays High just fine. You are very right... I was just talking to TheMailman78 the other day, and I mentioned just this:


    Top-level "settings" in games, really, aren't there to be used...today. They are there to make you want to upgrade, in the future.



    :laugh:
