
NVIDIA Turing GeForce RTX Technology & Architecture

Deep learning with Turing also plays a role with interferencing applications, be it activities such as identifying common objects in digital photographs, identifying road hazards for self-driving cars, or real-time language translation.

I believe it should read inferencing, or am I mistaken?
 
I believe it should read inferencing, or am I mistaken?

You are not mistaken; that was a typo on my part on a rushed evening. Thanks for catching it, I am correcting it now.
 
GTX 1070 to RTX 2070: 29% more performance
GTX 1080 to RTX 2080: 23.63% more performance
GTX 1080 Ti to RTX 2080 Ti: 30.54% more performance

1070 to 2070 is also 75% more raw bandwidth plus ~25% better compression, so roughly 120% more effective bandwidth. 30% more compute plus ~120% more bandwidth could mean up to ~75% more performance in some bandwidth-bound cases. A rough back-of-envelope sketch of that arithmetic is below.
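A minimal sketch of the numbers being claimed, assuming the GTX 1070's 256 GB/s and the RTX 2070's 448 GB/s rated memory bandwidth, and treating the ~25% compression gain from the post as a flat multiplier (an optimistic simplification):

```python
# Back-of-envelope check of the bandwidth claim above.
# Assumed specs: GTX 1070 = 256 GB/s (8 Gbps GDDR5, 256-bit),
# RTX 2070 = 448 GB/s (14 Gbps GDDR6, 256-bit). The ~25% compression
# figure is taken from the post and applied as a flat multiplier.

gtx_1070_bw = 256.0        # GB/s, raw memory bandwidth
rtx_2070_bw = 448.0        # GB/s, raw memory bandwidth
compression_gain = 1.25    # ~25% better color compression (assumed uniform)

raw_uplift = rtx_2070_bw / gtx_1070_bw - 1.0                            # ~0.75 -> 75% more raw bandwidth
effective_uplift = (rtx_2070_bw * compression_gain) / gtx_1070_bw - 1.0  # ~1.19 -> ~120% more effective bandwidth

print(f"Raw bandwidth uplift:       {raw_uplift:.0%}")
print(f"Effective bandwidth uplift: {effective_uplift:.0%}")

# The measured average gain was ~29-30%, so only heavily
# bandwidth-bound workloads would approach the larger figure.
```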
 
1070 to 2070 is also 75% more raw bandwidth plus ~25% better compression, so roughly 120% more effective bandwidth. 30% more compute plus ~120% more bandwidth could mean up to ~75% more performance in some bandwidth-bound cases.

I don't think it works like that; if it did, the 2070 would have RTX 2080-level performance.
 