
AMD Fast-tracks 7nm "Navi" GPU to Late-2018 Alongside "Zen 2" CPU

Joined
Sep 17, 2014
Messages
21,316 (5.99/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Aren't we mixing 2 things here? Or maybe 3?

1) Ray tracing (RT) is a basic rendering technique. It's been around for decades and is fundamental - not useless or unusable as one of the AMD fanboys claims. :)
2) Real-time ray tracing (RTRT) is... RT in real time ;-), i.e. fast enough (whatever that means).
It's been around for a while, but used for previews - not final renders. Previews are greatly simplified - they ignore some materials and some effects. Also the resulting live render is usually low-res and under 30fps.
3) RTRT in games means it has to be efficient enough to process all effects, at high resolution (1080p+) and at a frame rate acceptable for gaming, i.e. maybe 1080p@60fps, maybe 4K@30fps... (a toy sketch of the single-ray building block all of this rests on follows below)
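Purely as an illustration of that building block (my own toy sketch, not code from any actual renderer): cast one ray and test it against one piece of scene geometry, here a single sphere. A real ray tracer repeats this kind of test, through acceleration structures, for millions of rays every frame.

```cpp
// Toy sketch of the core ray-tracing primitive: one ray tested against one sphere.
#include <cmath>
#include <cstdio>
#include <optional>

struct Vec3 { double x, y, z; };

Vec3   operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(Vec3 a, Vec3 b)       { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Ray    { Vec3 origin, dir; };          // dir assumed normalized
struct Sphere { Vec3 center; double radius; };

// Returns the distance along the ray to the nearest hit, if there is one.
std::optional<double> intersect(const Ray& r, const Sphere& s) {
    Vec3   oc   = r.origin - s.center;
    double b    = dot(oc, r.dir);
    double c    = dot(oc, oc) - s.radius * s.radius;
    double disc = b * b - c;                  // quadratic discriminant
    if (disc < 0.0) return std::nullopt;      // ray misses the sphere
    double t = -b - std::sqrt(disc);          // nearest intersection distance
    if (t <= 0.0) return std::nullopt;        // hit is behind the ray origin
    return t;
}

int main() {
    Ray    ray{{0, 0, 0}, {0, 0, 1}};         // camera ray straight down +Z
    Sphere ball{{0, 0, 5}, 1.0};
    if (auto t = intersect(ray, ball))
        std::printf("hit at distance %.2f\n", *t);  // prints 4.00
}
```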

You're talking about general processing implementation, i.e. what standard GPU cores do.
Nvidia used an ASIC and it's just way faster - just like tensor cores are way faster for neural networks.

Everything else you've said is more or less correct.
If one wants to combine RTRT with 4K@60fps, then doing that on GPGPU is 10 years away from now. But on an ASIC it should be possible within 1-2 generations, i.e. 4 years tops.
But thanks to RTX cards, you don't have to wait 10 years. For a mere $1200 :) you can already make your games look as if it's 2028 (just at 1440p tops).
And when you buy your next RTX card in 2021 for another $1200, it should be OK for 4K@60fps. :)
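To put rough numbers behind the 4K@60fps point: a back-of-envelope ray budget. The rays-per-pixel count below is purely an assumption for illustration, and the ~10 gigarays/s figure is Nvidia's own marketing number for the 2080 Ti, so treat this as a sketch rather than a benchmark.

```cpp
// Back-of-envelope ray budget for 4K@60fps real-time ray tracing.
// rays_per_px is an assumption for illustration; 10 Grays/s is Nvidia's
// marketing figure for the RTX 2080 Ti, not an independent measurement.
#include <cstdio>

int main() {
    const double pixels_4k   = 3840.0 * 2160.0;               // ~8.3 million pixels
    const double fps         = 60.0;
    const double rays_per_px = 4.0;                            // assumed: shadows, reflections, a bounce or two
    const double required    = pixels_4k * fps * rays_per_px;  // rays per second needed
    const double rtx_budget  = 10e9;                           // ~10 gigarays/s (2080 Ti marketing figure)

    std::printf("required : %.1f Grays/s\n", required / 1e9);  // ~2.0 Grays/s
    std::printf("available: %.1f Grays/s\n", rtx_budget / 1e9);
    // Even a handful of rays per pixel at 4K@60 eats a big chunk of the budget,
    // which is why current RTX titles trace only selected effects and denoise heavily.
    return 0;
}
```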

There's just no way around it. AMD will have to respond with a similar tech, ignore RTRT ("Who needs realism? We're so romantic!") or magically make Navi 4x faster than Vega. :)

No need to argue semantics with me. You know exactly what I'm getting at ;) All RT that is not done on the GPU in real time is not the ray tracing we're talking about when it comes to RTX / DXR. We already have pre-cooked lighting, and that is what any kind of non-realtime RT boils down to - it's the same as saying 'AI' when in fact it's nothing more than lots of lines of code and data to cover every possibility.
 
Joined
Jun 28, 2016
Messages
3,595 (1.24/day)
No need to argue semantics with me. You know exactly what I'm getting at ;) All RT that is not done on the GPU in real time is not the ray tracing we're talking about when it comes to RTX / DXR. We already have pre-cooked lighting, and that is what any kind of non-realtime RT boils down to
Well... I'm very fond of strict definitions. :)
Just trying to point out that RTRT can be used for both professional 3D work and gaming.
For professional stuff it's already going on and RTX will just speed things up.
For gaming it wasn't really possible until now. And won't be possible for a while without an ASIC.

For me RTX is important. You see... I'm getting old and I already struggle to find time for casual gaming a few times a week. So the idea that I could get RTRT in games 2-3 years from now instead of 10 is big news.
And, as you can see, I'm a proud owner of a 1050. I paid $150 last summer and I just couldn't find a reason to buy anything more expensive. I'm fine with 1080p and this cheap GPU covers all games I wanted to play. Witcher 3 is the most demanding and I can easily run it at 40-50 fps with decent image quality.

But for me RTRT changes everything. Would I pay $1000 for these new RTX cards? No fu... way.
But would I pay $500 for a card that does RTRT and VR in... let's say... 2022? You bet I would. And now it suddenly became possible. :)

Also, I'd expect RTX to speed up normal (non-live) renders as well. Wonder if that's going to happen.
saying 'AI' when in fact it's nothing more than lots of lines of code and data to cover every possibility.
Oh... this is not true and especially painful for me... but also way off topic. :)
 
Joined
Mar 10, 2010
Messages
11,878 (2.28/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Well... I'm very fond of strict definitions. :)
Just trying to point out that RTRT can be used for both professional 3D work and gaming.
For professional stuff it's already going on and RTX will just speed things up.
For gaming it wasn't really possible until now. And won't be possible for a while without an ASIC.

For me RTX is important. You see... I'm getting old and I already struggle to find time for casual gaming a few times a week. So the idea that I could get RTRT in games 2-3 years from now instead of 10 is big news.
And, as you can see, I'm a proud owner of a 1050. I paid $150 last summer and I just couldn't find a reason to buy anything more expensive. I'm fine with 1080p and this cheap GPU covers all games I wanted to play. Witcher 3 is the most demanding and I can easily run it at 40-50 fps with decent image quality.

But for me RTRT changes everything. Would I pay $1000 for these new RTX cards? No fu... way.
But would I pay $500 for a card that does RTRT and VR in... let's say... 2022? You bet I would. And now it suddenly became possible. :)

Also, I'd expect RTX to speed up normal (non-live) renders as well. Wonder if that's going to happen.

Oh... this is not true and especially painful for me... but also way off topic. :)
Wow, a 1050, and you're in every GPU thread salivating over Nvidia while knocking AMD.

I am not concerned with what you like though, or your perspective, any more than you are with other people's. 1080p died two years ago for me and no one's dragging me back there for one graphics feature. So you may be fine with ray tracing (ray-based shadows and reflections, not true RT) at up to 1440p (really, like that's not the 2080 Ti, and out of most people's reach anyway), but some are not. To make me happy with RTX I'd need two or three 2080 Tis.

But I'm not expecting RTX to work with SLI or its replacement either, so I'm out of RTX for at least three years, imho.

Even before reviews.

And as for game AI, I think he meant the current implementation of AI in games, i.e. just big lists of if-thens in effect, not neural nets, which clearly haven't graced a AAA game yet.
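For what it's worth, here's a toy sketch of what that "big list of if-thens" style of game AI looks like in practice. It's entirely made up for illustration, not code from any shipping game.

```cpp
// Toy illustration of "list-based" game AI: a hand-written priority list of
// if/then rules, no learning involved.
#include <cstdio>

struct EnemyState {
    float health;            // 0..100
    float distanceToPlayer;  // in metres
    bool  hasAmmo;
};

const char* decideAction(const EnemyState& s) {
    if (s.health < 20.0f)           return "flee";    // rule 1: self-preservation first
    if (!s.hasAmmo)                 return "reload";  // rule 2
    if (s.distanceToPlayer < 5.0f)  return "melee";   // rule 3
    if (s.distanceToPlayer < 50.0f) return "shoot";   // rule 4
    return "patrol";                                  // default behaviour
}

int main() {
    EnemyState grunt{75.0f, 12.0f, true};
    std::printf("grunt decides to: %s\n", decideAction(grunt));  // prints "shoot"
}
```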
 
Joined
Apr 30, 2011
Messages
2,657 (0.55/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Nice derailing of that thread lads :)

Imho, AMD might release Vega 20 as an iteration for mixed use, as it did with the Frontier Edition Vega 10. It will help them sell it at a premium to cover the added cost of the 32GB HBM2. My hope is that it manages to close the gap to the fastest nVidia card to <10% by then, at a reasonable power consumption. It could easily be sold for $1000, especially since it would probably be a limited batch. Navi will come in 2019 for sure, and its first iteration will succeed the Polaris-class GPUs.
 
Joined
Mar 10, 2010
Messages
11,878 (2.28/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Nice derailing of that thread lads :)

Imho, AMD might release Vega 20 as an iteration for mixed use, as it did with the Frontier Edition Vega 10. It will help them sell it at a premium to cover the added cost of the 32GB HBM2. My hope is that it manages to close the gap to the fastest nVidia card to <10% by then, at a reasonable power consumption. It could easily be sold for $1000, especially since it would probably be a limited batch. Navi will come in 2019 for sure, and its first iteration will succeed the Polaris-class GPUs.
You, sir, largely share my opinion. However, I'm expecting the geldings of 7nm Vega to get mixed up with half-working stacks of HBM: 3,560-ish cores, 16GB HBM, massive price too. But as you say, a Frontier version first, possibly in 2018 but very late on.
 
Joined
Feb 26, 2016
Messages
548 (0.18/day)
Location
Texas
System Name O-Clock
Processor Intel Core i9-9900K @ 52x/49x 8c8t
Motherboard ASUS Maximus XI Gene
Cooling EK Quantum Velocity C+A, EK Quantum Vector C+A, CE 280, Monsta 280, GTS 280 all w/ A14 IP67
Memory 2x16GB G.Skill TridentZ @3900 MHz CL16
Video Card(s) EVGA RTX 2080 Ti XC Black
Storage Samsung 983 ZET 960GB, 2x WD SN850X 4TB
Display(s) Asus VG259QM
Case Corsair 900D
Audio Device(s) beyerdynamic DT 990 600Ω, Asus SupremeFX Hi-Fi 5.25", Elgato Wave 3
Power Supply EVGA 1600 T2 w/ A14 IP67
Mouse Logitech G403 Wireless (PMW3366)
Keyboard Monsgeek M5W w/ Cherry MX Silent Black RGBs
Software Windows 10 Pro 64 bit
Benchmark Scores https://hwbot.org/search/submissions/permalink?userId=92615&cpuId=5773
Nowhere in the article does it say Navi is coming late 2018. As a matter of fact, Navi wasn't mentioned at all.
The GPU that's going to launch later this year is a 7nm Vega 20 (or whatever it's called), which is not a consumer product.
Read the last paragraph...
 

Nkd

Joined
Sep 15, 2007
Messages
364 (0.06/day)
What do you mean, "to you"? You see what you want to see rather than what is written? It is never once said that Navi is coming to mainstream consumers (or any consumers) in late 2018, or anything of the sort.

Wow, did you read the entire paragraph? I was saying that the title of the article, not the article itself, was misleading, in response to someone else's comment. Right after that sentence it said we know it's coming out in 2019! Read the whole thing next time lol. I think you just read one sentence and didn't pay attention to my entire post lol.
 
Joined
Sep 6, 2013
Messages
3,079 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years latter got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
That's just all in your head, not in anyone else's.
Maybe it's just in your head, that not-so-accurate idea that you know what is in other people's heads.

Vega will do RT and potentially at good perf (relatively speaking to turding). I don't think future cards will have any issue.

As useless as RT already is, it'll be unusable on 2070, so it's not like anything midrange needs it.
If Vega can do RT, I bet we will see a GameWorks library that fixes that. But I don't believe that RT will be useless on 2070 cards. If you can choose various levels of RT quality, the 2070 will be down to low or medium. Then we will get some more articles about how useful G-Sync can be with a 2070 when enabling RT, for smooth graphics. Or, who knows, we could see 720p resolution coming back to life and start seeing articles about how much better visually 720p with RT is, compared to typical, non-RT, 1080p.

I totally will if performance improves, not that they're bad but well...
I can understand that changing platforms is difficult when you are used to one.
 
Joined
Feb 17, 2017
Messages
852 (0.32/day)
Location
Italy
Processor i7 2600K
Motherboard Asus P8Z68-V PRO/Gen 3
Cooling ZeroTherm FZ120
Memory G.Skill Ripjaws 4x4GB DDR3
Video Card(s) MSI GTX 1060 6G Gaming X
Storage Samsung 830 Pro 256GB + WD Caviar Blue 1TB
Display(s) Samsung PX2370 + Acer AL1717
Case Antec 1200 v1
Audio Device(s) aune x1s
Power Supply Enermax Modu87+ 800W
Mouse Logitech G403
Keyboard Qpad MK80
I can understand that changing platforms is difficult when you are used to one.

Nah, it's not, not for me at least. I don't care which company, I just care about performance. If Ryzen poops on what Intel is making, I'll gladly buy that; if it doesn't, I'll buy Intel. Same goes for video cards. It's not like I'll buy anything to do some company a favor.
 
Joined
Jun 28, 2016
Messages
3,595 (1.24/day)
Wow, a 1050, and you're in every GPU thread salivating over Nvidia while knocking AMD.
Why exactly is my GPU important?
ray-based shadows and reflections, not true RT
LOL. What is "true RT"?
And as for game AI, I think he meant the current implementation of AI in games
I don't know if he meant game AI at all. There is a world outside gaming. Google it! :)

And man... I'm looking forward to your comments when AMD releases a GPU with similar RT solution. :-D
I'll be there to remind you of all this rubbish and how 7nm Navi was just around the corner in late 2018.
 
Joined
Mar 10, 2010
Messages
11,878 (2.28/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Why exactly is my GPU important?

LOL. What is "true RT"?

I don't know if he meant game AI at all. There is a world outside gaming. Google it! :)

And man... I'm looking forward to your comments when AMD releases a GPU with similar RT solution. :-D
I'll be there to remind you of all this rubbish and how 7nm Navi was just around the corner in late 2018.
I'm not the one blowing smoke up a vendor's and a tech's ass; I merely stand by the "await reviews" stance.

Your GPU doesn't directly matter, but it does make your stance on AMD, Polaris and Vega strange, since you don't buy into such things but are very, very vocal about something you only READ about.

I know of the other AI; I have Google's AI in my hand right now, obviously. And you know he and I are right: modern/present games have strictly list-based AI, not neural-net based. We're talking consumer gamer tech, so who's bringing pro use-case AI in here? Not me, so I meant purely game AI.

Finally, in this thread I mentioned nothing like Navi in 2018; I said possibly a prosumer 7nm Vega. I'm an optimist, but I wouldn't bet on that either.
 
Joined
Nov 7, 2017
Messages
52 (0.02/day)
I bet Nvidia pays TPU to make these false claims so AMD gets an even worse reputation in the end. Very bad. I'm considering using some other tech sites instead of this one.
 
Joined
Nov 3, 2013
Messages
2,141 (0.55/day)
Location
Serbia
Processor Ryzen 5600
Motherboard X570 I Aorus Pro
Cooling Deepcool AG400
Memory HyperX Fury 2 x 8GB 3200 CL16
Video Card(s) RX 6700 10GB SWFT 309
Storage SX8200 Pro 512 / NV2 512
Display(s) 24G2U
Case NR200P
Power Supply Ion SFX 650
Mouse G703 (TTC Gold 60M)
Keyboard Keychron V1 (Akko Matcha Green) / Apex m500 (Gateron milky yellow)
Software W10
Read the last paragraph...
Yeah and? Where does it mention that that particular GPU, releasing later this year, will be Navi?
 
Joined
Dec 22, 2011
Messages
3,890 (0.85/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
I bet Nvidia pays TPU to make these false claims so AMD gets an even worse reputation in the end. Very bad. I'm considering using some other tech sites instead of this one.

Another conspiracy nut AMD sycophant. Welcome! You'll fit right in.
 
Joined
Dec 28, 2012
Messages
3,553 (0.85/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
That's not how capitalism works. You have to create a product to fill a need, then consumers buy it.

If AMD doesn't make competitive GPUs, nobody will buy them, and they won't make money. At some point the division would get new leadership or be bought by another company, and the products reinvigorated.

Vega 56/64 didn't sell due to supply shortages. If AMD had simply made a 3072/4096-core Polaris card and released it back in 2016, it would have sold well and made AMD money. AMD didn't get sales because they chose to go with cards that were difficult to make, trying to pull a 3dfx.
 
Joined
Nov 4, 2005
Messages
11,757 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
This is what ATI did: a new process node with a new small chip and a refresh. They managed to match Nvidia's performance with a much smaller chip, and beat them on performance per dollar.

I would rather have a cheap GPU that delivers 60 FPS at 1080p with the eye candy maxed out, or at 4K with enough eye candy to make it worthwhile.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.53/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Why exactly is my GPU important?

LOL. What is "true RT"?

I don't know if he meant game AI at all. There is a world outside gaming. Google it! :)

And man... I'm looking forward to your comments when AMD releases a GPU with similar RT solution. :-D
I'll be there to remind you of all this rubbish and how 7nm Navi was just around the corner in late 2018.

They have had RT for a while.
 
Joined
Nov 3, 2013
Messages
2,141 (0.55/day)
Location
Serbia
Processor Ryzen 5600
Motherboard X570 I Aorus Pro
Cooling Deepcool AG400
Memory HyperX Fury 2 x 8GB 3200 CL16
Video Card(s) RX 6700 10GB SWFT 309
Storage SX8200 Pro 512 / NV2 512
Display(s) 24G2U
Case NR200P
Power Supply Ion SFX 650
Mouse G703 (TTC Gold 60M)
Keyboard Keychron V1 (Akko Matcha Green) / Apex m500 (Gateron milky yellow)
Software W10
This is what ATI did: a new process node with a new small chip and a refresh. They managed to match Nvidia's performance with a much smaller chip, and beat them on performance per dollar.
The famous 3870.
Apropos the smaller chip, Vega 20 is about half the size of the 2080 Ti.
But I don't think the yields will be great, and that's probably why AMD decided against a gaming Vega 20. Then again, who knows... maybe they surprise us later in the year. Maybe they push Navi to early '19. Time will tell
 
Joined
Jul 16, 2014
Messages
8,143 (2.25/day)
Location
SE Michigan
System Name Dumbass
Processor AMD Ryzen 7800X3D
Motherboard ASUS TUF gaming B650
Cooling Artic Liquid Freezer 2 - 420mm
Memory G.Skill Sniper 32gb DDR5 6000
Video Card(s) GreenTeam 4070 ti super 16gb
Storage Samsung EVO 500gb & 1Tb, 2tb HDD, 500gb WD Black
Display(s) 1x Nixeus NX_EDG27, 2x Dell S2440L (16:9)
Case Phanteks Enthoo Primo w/8 140mm SP Fans
Audio Device(s) onboard (realtek?) - SPKRS:Logitech Z623 200w 2.1
Power Supply Corsair HX1000i
Mouse Steeseries Esports Wireless
Keyboard Corsair K100
Software windows 10 H
Benchmark Scores https://i.imgur.com/aoz3vWY.jpg?2
Well... I'm very fond of strict definitions. :)
Just trying to point out that RTRT can be used for both professional 3D work and gaming.
For professional stuff it's already going on and RTX will just speed things up.
For gaming it wasn't really possible until now. And won't be possible for a while without an ASIC.

For me RTX is important. You see... I'm getting old and I already struggle to find time for casual gaming a few times a week. So the idea that I could get RTRT in games 2-3 years from now instead of 10 is big news.
And, as you can see, I'm a proud owner of a 1050. I paid $150 last summer and I just couldn't find a reason to buy anything more expensive. I'm fine with 1080p and this cheap GPU covers all games I wanted to play. Witcher 3 is the most demanding and I can easily run it at 40-50 fps with decent image quality.

But for me RTRT changes everything. Would I pay $1000 for these new RTX cards? No fu... way.
But would I pay $500 for a card that does RTRT and VR in... let's say... 2022? You bet I would. And now it suddenly became possible. :)

Also, I'd expect RTX to speed up normal (non-live) renders as well. Wonder if that's going to happen.

Oh... this is not true and especially painful for me... but also way off topic. :)
I swear, every time I come to read this thread it tells me you are paid either by Intel or Nvidia or both to be their personal fanboi. Your arguments are flawed because of that.
 
Joined
Jun 28, 2016
Messages
3,595 (1.24/day)
They have had RT for a while.
AMD supports RT via ProRender - I've mentioned it earlier. It's a nicely written platform, it works on everything (OpenCL-based) and it gives AMD's raw processing power a decent advantage (Vega matches 1080Ti - unlike in games).
But you just can't match an ASIC in this regard. And if you're not very into 3D rendering, then just think about crypto mining. :)
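To illustrate the "works on everything" part: an OpenCL host program doesn't care whose GPU is installed, it just enumerates whatever devices the installed drivers expose. A minimal sketch using the standard OpenCL API (nothing ProRender-specific):

```cpp
// Minimal sketch of why an OpenCL-based renderer is vendor-agnostic: the host
// simply enumerates whatever platforms and devices the installed drivers
// expose (AMD, Nvidia, Intel, CPU).
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main() {
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(0, nullptr, &numPlatforms);

    std::vector<cl_platform_id> platforms(numPlatforms);
    clGetPlatformIDs(numPlatforms, platforms.data(), nullptr);

    for (cl_platform_id p : platforms) {
        char name[256] = {};
        clGetPlatformInfo(p, CL_PLATFORM_NAME, sizeof(name), name, nullptr);

        cl_uint numDevices = 0;
        clGetDeviceIDs(p, CL_DEVICE_TYPE_ALL, 0, nullptr, &numDevices);
        std::printf("%s: %u device(s)\n", name, numDevices);
    }
    return 0;
}
```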

The famous 3870.
Apropos the smaller chip, Vega 20 is about half the size of the 2080 Ti.
I can draw you an even smaller theoretical chip if you want. :)
The 2080 Ti is here, it works, and it's already traveling towards your favourite store.
My favourite store says they can deliver 22 MSI RTX 2080Ti Gaming by 03/10. It was "at least 30" last time I checked. So at least 8 people pre-ordered already (for 5800 PLN ~= 1570 USD).
By comparison, the same store sold 3 (three!!!) Asus Vega 64 Strix since launch.
But I don't think the yields will be great, and that's probably why AMD decided against a gaming Vega 20. Then again, who knows... maybe they surprise us later in the year. Maybe they push Navi to early '19. Time will tell
Navi could be an interesting architecture, but they've built the idea around 7nm. 7nm might just not happen fast enough. And the yields... oh my... the yields!
It's the HBM2 thing happening all over again. AMD wants to build an advantage using more recent, unproven and poorly available tech and they end up caught in all kinds of weird traps.

Now 7nm is the mystical saviour - praised by AMD believers in every comment (just like HBM2 a year ago). We'll see how it turns out this time. :)
 
Joined
Feb 3, 2017
Messages
3,565 (1.33/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Also, I'd expect RTX to speed up normal (non-live) renders as well. Wonder if that's going to happen.
It does. For production stuff I would assume the API they use is OptiX, but the point and the underlying technology are the same.


If Vega can do RT, I bet we will see a GameWorks library that fixes that.
No, we won't. AMD will have their own implementation that DXR will use, and that'll work. When game devs have implemented straight-up GameWorks stuff, then all bets are off, of course. But even then AMD will help them out in that regard :)
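Worth adding that DXR is vendor-neutral at the API level: a game asks D3D12 whether the installed driver exposes a raytracing tier, and whatever the vendor implements underneath (RT cores, a compute fallback, a future AMD path) answers the same query. A minimal sketch of that capability check, using standard D3D12 calls with error handling trimmed:

```cpp
// Minimal sketch: asking D3D12 whether the installed driver exposes DXR,
// regardless of GPU vendor. Error handling trimmed for brevity.
// Link against d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter at feature level 12.0.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));

    if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        std::printf("Driver exposes DXR (tier value %d).\n", (int)opts5.RaytracingTier);
    else
        std::printf("No DXR support on this device/driver.\n");
    return 0;
}
```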
 
Joined
Mar 10, 2010
Messages
11,878 (2.28/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
AMD supports RT via ProRender - I've mentioned it earlier. It's a nicely written platform, it works on everything (OpenCL-based) and it gives AMD's raw processing power a decent advantage (Vega matches 1080Ti - unlike in games).
But you just can't match an ASIC in this regard. And if you're not very into 3D rendering, then just think about crypto mining. :)


I can draw you an even smaller theoretical chip if you want. :)
The 2080 Ti is here, it works, and it's already traveling towards your favourite store.
My favourite store says they can deliver 22 MSI RTX 2080Ti Gaming by 03/10. It was "at least 30" last time I checked. So at least 8 people pre-ordered already (for 5800 PLN ~= 1570 USD).
By comparison, the same store sold 3 (three!!!) Asus Vega 64 Strix since launch.

Navi could be an interesting architecture, but they've built the idea around 7nm. 7nm might just not happen fast enough. And the yields... oh my... the yields!
It's the HBM2 thing happening all over again. AMD wants to build an advantage using more recent, unproven and poorly available tech and they end up caught in all kinds of weird traps.

Now 7nm is the mystical saviour - praised by AMD believers in every comment (just like HBM2 a year ago). We'll see how it turns out this time. :)
You really are a hype bandit. Is your name Susan? Because you seem to like Emmerdale proportions of drama.

Mystical saviour, tut. I'll be here to point out YOU alone said that.
 
Joined
Oct 2, 2004
Messages
13,791 (1.92/day)
It does. For production stuff I would assume the API they use is OptiX, but the point and the underlying technology are the same.


No, we won't. AMD will have their own implementation that DXR will use, and that'll work. When game devs have implemented straight-up GameWorks stuff, then all bets are off, of course. But even then AMD will help them out in that regard :)

That's not my quote :p
 
Joined
Oct 22, 2014
Messages
13,308 (3.78/day)
Location
Sunshine Coast
System Name Lenovo ThinkCentre
Processor AMD 5650GE
Motherboard Lenovo
Memory 32 GB DDR4
Display(s) AOC 24" Freesync 1m.s. 75Hz
Mouse Lenovo
Keyboard Lenovo
Software W11 Pro 64 bit
That's not how capitalism works. You have to create a product to fill a need, then consumers buy it.
Capitalism is creating a need for a product consumers don't need.
 
Joined
Jun 28, 2016
Messages
3,595 (1.24/day)
Mystical saviour, tut. I'll be here to point out YOU alone said that.
No problem with that. I'm not changing my opinions. I've criticized Vega for HBM2 dependency and I feel comfortable criticizing this whole 7nm nonsense.
You know how this ends, right? Nvidia will give us consumer grade 7nm GPUs before AMD (I mean available products, not fairy tales).

Actually, there's still a decent probability that Intel delivers their 10nm whatever-lake lineup before 7nm Zen2 and that would be really funny. ;-)
 