
PCI-Express 3.0 Hits Backwards Compatibility Roadblock, Delayed

btarunr

Editor & Senior Moderator
PCI-SIG (Special Interest Group), the organisation responsible for the development of PCI specifications, announced that third-generation PCI-Express (PCI-E 3.0) has slipped from its target launch window of late 2009 to Q2 2010. Although work on the bus is almost finished, there seem to be problems with implementing backwards compatibility with older generations of PCI-E. Assuming PCI-E 3.0 is standardised in Q2 2010, one can expect implementing products (motherboards and expansion cards supporting PCI-E 3.0) only about a year later.

PCI-E 3.0 packs features that overcome the bottlenecks of PCI-E 2.0, such as the removal of the 8b/10b encoding scheme, which added 20% overhead to PCI-E 2.0's 5 GT/s signalling rate and reduced it to an effective 4 GT/s. At 8 GT/s with leaner encoding, the new bus will have effectively twice the bandwidth.
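To make the arithmetic concrete, here is a minimal sketch (not part of the original article) of the per-lane math. It assumes the 128b/130b encoding PCI-E 3.0 ended up adopting, which is how an 8 GT/s signalling rate delivers roughly double PCI-E 2.0's effective throughput despite being only 1.6x the raw rate:

```python
# Effective PCI-Express data bandwidth per lane, by generation.
# Each transfer carries one bit per lane, so GT/s * encoding efficiency
# gives Gb/s of actual payload capacity.
GENERATIONS = {
    # name: (signalling rate in GT/s, encoding efficiency)
    "PCI-E 1.x": (2.5, 8 / 10),     # 8b/10b: 20% overhead
    "PCI-E 2.0": (5.0, 8 / 10),     # 8b/10b: 20% overhead
    "PCI-E 3.0": (8.0, 128 / 130),  # 128b/130b: ~1.5% overhead (assumed)
}

for name, (rate_gtps, efficiency) in GENERATIONS.items():
    data_gbps = rate_gtps * efficiency
    # Divide by 8 bits/byte, multiply by 16 lanes for an x16 slot.
    print(f"{name}: {data_gbps:.2f} Gb/s per lane, "
          f"{data_gbps * 16 / 8:.2f} GB/s across an x16 slot")
```

Running this gives 4.00 Gb/s per lane for PCI-E 2.0 versus 7.88 Gb/s for PCI-E 3.0, i.e. just under a clean doubling.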

View at TechPowerUp Main Site
 
:ohwell:
We'll get new cards by then.
 
2.0 still hasn't been used to its fullest. Meh, I'll stick with 1.0/1.1.
 
2.0 still hasn't been used to its fullest. Meh, I'll stick with 1.0/1.1.

I would like to see how much headroom a PCIe 2.0 GTX 295 has.
 
I would like to see how much headroom a PCIe 2.0 GTX 295 has.

A good amount, but I doubt you will see much, if any, performance difference going from a GTX 295 in a 1.0/1.1 slot to one in a 2.0 slot.

The card still doesn't use 2.0 to its fullest advantage just yet.
 
Science, fixing problems that don't really affect people.

I do have one question: will PCIe 3.0 improve performance scaling in multi-GPU setups, or is that still a software implementation issue?
 
So when are they ever going to up the power provided to video cards through the slot? I know PCI-E 2.0 is spec'd to be able to provide 150 watts, but for reasons I can only guess at, they haven't implemented it that way yet... Honestly, the only improvement I think PCI-E needs is to actually deliver what its specification promises, not 75 watts.
 
So when are they ever going to up the power provided to video cards through the slot? I know PCI-E 2.0 is spec'd to be able to provide 150 watts, but for reasons I can only guess at, they haven't implemented it that way yet... Honestly, the only improvement I think PCI-E needs is to actually deliver what its specification promises, not 75 watts.

When they think the ATX connector could use some additional wires for the extra current.
 
You want them to start a several-year project only once the standard is required? It needs to be done before we need it; that's how progress works.

Them starting work on a new project isn't the problem, but considering PCs/video cards aren't in dire need of a PCI-E upgrade, it's almost pointless releasing it that early (it was going to come out in late 2009 :eek:). But as said, that's how progress works.
 
So when are they ever going to up the power provided to video cards through the slot? I know PCI-E 2.0 is spec'd to be able to provide 150 watts, but for reasons I can only guess at, they haven't implemented it that way yet... Honestly, the only improvement I think PCI-E needs is to actually deliver what its specification promises, not 75 watts.

A PCI-E 2.0 x16 slot provides 75 W (65 W at 12 V, 10 W at 3.3 V), just as older standards did. The amount of power a slot provides doesn't change between generations; all that changes are the specifications of the connectors.
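As a quick sanity check, here is a sketch of that power budget using the rail figures quoted above (approximations from the post, not the exact current limits in the PCI-SIG spec); the extra headroom for big cards comes from auxiliary connectors rather than from a newer slot revision:

```python
# Power available to a graphics card: slot rails plus auxiliary
# PCI-E power connectors (the slot budget itself stays at 75 W).
SLOT_RAILS_W = {"12V": 65.0, "3.3V": 10.0}   # figures cited in the post
AUX_W = {"6-pin": 75.0, "8-pin": 150.0}      # standard PCI-E aux connectors

slot_total = sum(SLOT_RAILS_W.values())
print(f"Slot alone:           {slot_total:.0f} W")
print(f"Slot + 6-pin:         {slot_total + AUX_W['6-pin']:.0f} W")
print(f"Slot + 6-pin + 8-pin: {slot_total + AUX_W['6-pin'] + AUX_W['8-pin']:.0f} W")
```

That is how cards already draw well past 75 W without any change to the slot.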
 
A good amount, but I doubt you will see much, if any, performance difference going from a GTX 295 in a 1.0/1.1 slot to one in a 2.0 slot.

The card still doesn't use 2.0 to its fullest advantage just yet.

False, sorta...
PCIe 2.0 x8 is bottlenecking the HD 3870 X2s, and PCIe 2.0 x8 offers the same bandwidth as PCIe 1.1 x16, so PCIe 1.1 x16 is surely at its peak usage with a GTX 295.
 
Science, fixing problems that don't really affect people.

I do have one question: will PCIe 3.0 improve performance scaling in multi-GPU setups, or is that still a software implementation issue?

Doesn't look like it will; it's not as if going from 1.0/1.1 to 2.0 did anything noticeable, so why the heck should this? Maybe it matters if a computer analyzes the data, but to us, sweet f-all.

I just turned into TheLaughingMan there, ha... My guess is nothing REALLY special will happen with PCI-E; they will have to bring out a totally new slot, because that's how money is made. I guess if it stopped being backwards compatible there'd be a chance, but that would only worry people at upgrade time.
 
False, sorta...
PCIe 2.0 x8 is bottlenecking the HD 3870 X2s, and PCIe 2.0 x8 offers the same bandwidth as PCIe 1.1 x16, so PCIe 1.1 x16 is surely at its peak usage with a GTX 295.

Well, of course a half-speed slot is going to do that to a dual-GPU card.

2.0 still has extra bandwidth left that hasn't been taken advantage of yet, and I don't see how 3.0 would solve that problem (adding more bandwidth and a few extra features isn't worth it if the actual cards aren't taking advantage of it).

I'll move to 2.0/3.0 when I see a big difference.
 


Looks like AMD is implementing its final revision of the HT-over-PCIe idea. The ATI/AMD chipsets themselves already support it, but AMD wants it in the standard so it can be used across all future chipsets from all companies.

Well, of course a half-speed slot is going to do that to a dual-GPU card.

2.0 still has extra bandwidth left that hasn't been taken advantage of yet, and I don't see how 3.0 would solve that problem (adding more bandwidth and a few extra features isn't worth it if the actual cards aren't taking advantage of it).

I'll move to 2.0/3.0 when I see a big difference.

PCIe 2.0 x8 provides the same bandwidth as PCIe 1.x x16, so I do not see the point in your post...
The idea is improvement. You don't see the world still traveling by horse and buggy, do you? (Besides the Amish, no pun intended.) Improvement and planning for the future is what it's all about... (and money...)
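That bandwidth equivalence is easy to verify with the same encoding math as before (a quick sketch, not from the posts):

```python
# PCIe 2.0 x8 vs PCIe 1.x x16: identical effective bandwidth,
# since both generations use 8b/10b encoding (80% efficient).
def effective_gbps(rate_gtps: float, lanes: int, efficiency: float = 0.8) -> float:
    """Payload bandwidth in Gb/s for a link with the given lane count."""
    return rate_gtps * efficiency * lanes

print(effective_gbps(5.0, lanes=8))    # PCIe 2.0 x8  -> 32.0 Gb/s
print(effective_gbps(2.5, lanes=16))   # PCIe 1.x x16 -> 32.0 Gb/s
```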
 
Looks like AMD is implementing its final revision of the HT-over-PCIe idea. The ATI/AMD chipsets themselves already support it, but AMD wants it in the standard so it can be used across all future chipsets from all companies.



PCIe 2.0 x8 provides the same bandwidth as PCIe 1.x x16, so I do not see the point in your post...
The idea is improvement. You don't see the world still traveling by horse and buggy, do you? (Besides the Amish, no pun intended.) Improvement and planning for the future is what it's all about... (and money...)

The point of my post is that there's really no point in releasing a 3.0 when 2.0 is still fresh for the picking.

You don't see the world still traveling by horse and buggy, do you?

Well yeah, but when you compare a car to something like a horse and buggy you can see a significant difference between the two; the huge advantages of a car justify why you should have one.
 
The point of my post is that there's really no point in releasing a 3.0 when 2.0 is still fresh for the picking.

Well yeah, but when you compare a car to something like a horse and buggy you can see a significant difference between the two; the huge advantages of a car justify why you should have one.

OK, let's go smaller scale...

1985 Toyota Supra vs. 1986 Toyota Supra.
Same thing, same engine with maybe a minor change to the injection... so why is it a "new design"? If the car was released in 1985, it should remain a "1985 Toyota Supra" in 1986...
 
OK, let's go smaller scale...

1985 Toyota Supra vs. 1986 Toyota Supra.
Same thing, same engine with maybe a minor change to the injection... so why is it a "new design"? If the car was released in 1985, it should remain a "1985 Toyota Supra" in 1986...

So then what's the point of going with the '86 if there are only minor changes?
 
So then what's the point of going with the '86 if there are only minor changes?

The small changes... the 1 MPG better fuel economy... the new radio...

Minor tweaks that will be used by future cards... (and by current cards, such as the ATI HD 2/3/4 series, using HT over PCIe; wait for it... wait for it... oh yeah... using two HT lanes to do internal CrossFire = win).
 
PCI-E 3.0 would be nice for SLI/CrossFire on boards that don't do full x16/x16.
 
The small changes... the 1 MPG better fuel economy... the new radio...

Minor tweaks that will be used by future cards... (and by current cards, such as the ATI HD 2/3/4 series, using HT over PCIe; wait for it... wait for it... oh yeah... using two HT lanes to do internal CrossFire = win).

Minor tweaks that will be used by future cards... (and by current cards, such as the ATI HD 2/3/4 series, using HT over PCIe

But I'm not saying we shouldn't go with the new slot at all. We really need to wait for cards to truly utilize the slots we have. Having a new slot with more bandwidth and slightly new features every year or two doesn't help if cards are still comfortably chugging along on an older slot.

I'm not saying they aren't going to be useful in the future, but as of now there really isn't a big reason to go crazy about it.
 
But I'm not saying we shouldn't go with the new slot at all. We really need to wait for cards to truly utilize the slots we have. Having a new slot with more bandwidth and slightly new features every year or two doesn't help if cards are still comfortably chugging along on an older slot.

I'm not saying they aren't going to be useful in the future, but as of now there really isn't a big reason to go crazy about it.

What if the card already has the features? ;-)
 
Then all's good, no need for a new slot.

Umm... but the feature isn't being used yet because the SLOT doesn't support it... so how is that good?
 