Monday, July 18th 2011

AMD Radeon HD 7000 Series to be PCI-Express 3.0 Compliant

AMD's next generation of graphics processors (GPUs), which will be branded under the HD 7000 series, is reported to be PCI-Express Generation 3 compliant. The desktop discrete graphics cards will feature PCI-Express 3.0 x16 bus interfaces and will be fully backwards-compatible with older versions of the bus, including Gen 1 and Gen 2. Most motherboards sold today feature Gen 2 PCI-E slots, although some of the very latest boards launched by major vendors already feature PCI-Express 3.0 slots.

The new bus doubles the bandwidth of PCI-E 2.0, offering roughly 1 GB/s of bandwidth per lane, per direction. A PCI-Express 3.0 x16 link therefore has 32 GB/s (256 Gbps) of total bandwidth at its disposal, or 16 GB/s per direction. AMD's next generation of GPUs, codenamed "Southern Islands", will be built on TSMC's new 28 nm process and will scale up its VLIW4 stream processor count. Some of the first PC platforms to fully support PCI-Express 3.0 will be based on Intel's Sandy Bridge-E. Whether AMD's GPUs have actually hit a bandwidth bottleneck with PCI-E Gen 2, or AMD is simply aiming to be standards-compliant, is a different question altogether.
Source: Donanim Haber
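
The arithmetic behind those figures is easy to sanity-check. Here is a minimal sketch (the transfer rates and line encodings are the published PCI-SIG values; the function and names are purely illustrative, not from the article):

```python
# Rough PCI-Express bandwidth estimate, per link direction.
# Transfer rates (GT/s) and line encodings are the published PCI-SIG values;
# everything else here is illustrative arithmetic, not vendor data.

GENERATIONS = {
    # name: (GT/s per lane, encoding efficiency = payload bits / line bits)
    "PCIe 1.x": (2.5, 8 / 10),     # 8b/10b encoding
    "PCIe 2.0": (5.0, 8 / 10),     # 8b/10b encoding
    "PCIe 3.0": (8.0, 128 / 130),  # 128b/130b encoding
}

def bandwidth_gbs(gen: str, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s, per direction."""
    gtps, efficiency = GENERATIONS[gen]
    # GT/s * efficiency gives usable Gbit/s per lane; divide by 8 for GB/s.
    return gtps * efficiency / 8 * lanes

if __name__ == "__main__":
    for gen in GENERATIONS:
        print(f"{gen} x16: ~{bandwidth_gbs(gen, 16):.1f} GB/s per direction")
    # PCIe 3.0 x16 works out to ~15.8 GB/s per direction, i.e. ~32 GB/s both ways,
    # matching the "1 GB/s per lane, per direction" figure above.
```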

85 Comments on AMD Radeon HD 7000 Series to be PCI-Express 3.0 Compliant

#76
xenocide
eidairaman1Running Windows 7 on a Dell XPS Gen 1 Laptop just fine, and it's faster than XP ever was.
I haven't really tried it on many legacy setups. My ex-gf's E8300 with 1GB of RAM handled it just as well as XP, so I don't imagine the difference is too huge. I would actually go as far as to say that as long as you have a decent GPU to power all the pretty effects added in the UI you're fine.
Posted on Reply
#77
Shihab
Wasn't this thread about AMD's upcoming gfx card series?
Posted on Reply
#78
[H]@RD5TUFF
faramirName a couple (other than TRIM support for SSDs, which really shouldn't be OS-dependent and obviously affects only users with SSDs)?

They are going to stop providing security updates in April 2014, as per their latest announcement.
TRIM, less sluggish when fully patched, DX11, Aero, compatibility, etc. It's an upgrade! XP needs to GTFO and people need to move on from this 10-year-old technology.
ShihabyoooWasn't this thread about AMD's upcoming gfx card series?
It was till the XP trolls got butt hurt.
Posted on Reply
#79
btarunr
Editor & Senior Moderator
Not to mention usable x64. Windows XP 64-bit is worse than Windows Millennium.

Anyway, let's get back to topic.
Posted on Reply
#80
TheoneandonlyMrK
ShihabyoooWasn't this thread about AMD's upcoming gfx card series?
Bang on, dude, I'm so bored of XPee. What's going on in the world of 7xxx? Any clue as to how many compute units and/or shaders this will have, and how many graphics engines they're gonna go for: 2, hmm, 4, or a boring 1?

After reading about their new compute unit shader design, the Rage3D site goes and mentions 7xxx and VLIW4 in the same paragraph, with no compute unit in sight. Is this a continuation of the two-tier GPUs à la Barts/Cayman, with VLIW4 in the mainstream and GCUs in the high-end cards?
Posted on Reply
#81
dr emulator (madmax)
btarunrNot to mention usable x64. Windows XP 64-bit is worse than Windows Millennium.

Anyway, let's get back to topic.
I agree.

I only mentioned I wouldn't be able to use this card on XP, which for me is a downer, but this is progress.

Still, I'd love to see how the devs of emulators could use this new tech. Obviously it won't be for some time yet, but I can't wait :D
Posted on Reply
#82
bear jesus
I doubt PCI-E 3 will make much difference right now, but it could be very handy for more bandwidth per slot, and per lane used for internal interconnects. Say, 4-way SLI or CrossFire running at x8 PCI-E 3 would be similar to x16 PCI-E 2 (rough math at the end of this post), and the number of lanes some things currently use up (like chipsets and other controller chips such as USB 3, PCI-E controller cards and SSDs) could be reduced, or kept the same for a nice jump in bandwidth.

Sure, it might not impact the first cards much, but most of us are using PCI-E 2 instead of 1 for many reasons :-P
theoneandonlymrkAfter reading about their new compute unit shader design, the Rage3D site goes and mentions 7xxx and VLIW4 in the same paragraph, with no compute unit in sight. Is this a continuation of the two-tier GPUs à la Barts/Cayman, with VLIW4 in the mainstream and GCUs in the high-end cards?
I was thinking the same, as more and more sites have reported both the new compute unit and VLIW4 for the 7xxx cards. If it uses less space, it would probably make sense to keep all of the bottom/low end on VLIW4, since it does pretty well, and when made on 28 nm I'm sure it would lead to some pretty small, cool and cheap-to-produce low-end or even mid-range cards.

But as we don't really know much about the new architecture yet, there could be many reasons not to use it on low-end parts, or VLIW4 could be part of the new architecture in some way, or all the new cards could use the new compute architecture. I guess we just need many, many more leaks. :laugh:
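
The back-of-the-envelope x8 Gen 3 vs. x16 Gen 2 equivalence above checks out. A minimal sketch (using the published PCI-SIG transfer rates and encoding efficiencies; the code itself is purely illustrative):

```python
# Per-direction bandwidth in GB/s: transfer rate (GT/s) * encoding efficiency / 8 * lanes.
# 8b/10b encoding for Gen 2, 128b/130b for Gen 3 (standard PCI-SIG values).
gen2_x16 = 5.0 * (8 / 10) / 8 * 16    # PCIe 2.0 x16: 8.0 GB/s per direction
gen3_x8 = 8.0 * (128 / 130) / 8 * 8   # PCIe 3.0 x8: ~7.9 GB/s per direction

print(f"PCIe 2.0 x16: {gen2_x16:.1f} GB/s/dir, PCIe 3.0 x8: {gen3_x8:.1f} GB/s/dir")
# The two are within a couple of percent of each other, so an x8 Gen 3 slot really
# does offer roughly the bandwidth of an x16 Gen 2 slot.
```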
Posted on Reply
#83
eidairaman1
The Exiled Airman
xenocideI haven't really tried it on many legacy setups. My ex-gf's E8300 with 1GB of RAM handled it just as well as XP, so I don't imagine the difference is too huge. I would actually go as far as to say that as long as you have a decent GPU to power all the pretty effects added in the UI you're fine.
It has to be a DX9-compliant GPU. Mine is an MR 9800 256 MB (R420/23 core).

It works well on my sig setup, but gaming performance lags way behind XP because NV only developed drivers for XP and dropped the drivers they were making for Vista.
Posted on Reply
#84
TheoneandonlyMrK
eidairaman1Quote:
Originally Posted by xenocide
I haven't really tried it on many legacy setups. My ex-gf's E8300 with 1GB of RAM handled it just as well as XP, so I don't imagine the difference is too huge. I would actually go as far as to say that as long as you have a decent GPU to power all the pretty effects added in the UI you're fine.

It has to be a DX9-compliant GPU. Mine is an MR 9800 256 MB (R420/23 core).

It works well on my sig setup, but gaming performance lags way behind XP because NV only developed drivers for XP and dropped the drivers they were making for Vista.
get a room......:D
Posted on Reply