
NVIDIA GeForce RTX 5070 Reviews Reportedly Due for Publication on March 4

T0@st

News Editor
NVIDIA's upcoming mid-range GeForce RTX 5070 12 GB model is almost ready for launch, according to recent reports. Industry moles reckon that GB205 GPU-based specimens are already in the clutches of press and influencer outlets; review embargoes on $549 MSRP-conformant SKUs are due to be lifted on March 4 (as disclosed by a VideoCardz source). Last week, we heard whispers about Team Green's (allegedly) troubled production cycle for incoming GeForce RTX 5070 and RTX 5060 models.

Insiders insist that these issues have caused a delay; NVIDIA had reportedly planned a February GeForce RTX 5070 launch. A revised schedule was leaked to VideoCardz; the publication posits that GeForce RTX 5070 cards will launch at retail on March 5, with reviews of non-MSRP ($549+) models projected to go live on the same day. Based on various leaks, NVIDIA and AMD will likely clash with their respective new offerings. Right now, reviewers could be dealing with sizable piles of competing Team Green and Team Red hardware. Graphics card enthusiasts will be looking forward to incoming comparisons—GeForce RTX 5070 and its Ti sibling versus Radeon RX 9070 XT and RX 9070 (non-XT).

VideoCardz has updated its speculative timetable for forthcoming graphics card embargoes:
  • February 28: Radeon RX 9070 Series Announcement
  • March 4: GeForce RTX 5070 reviews (NEW)
  • March 5: GeForce RTX 5070 sales & non-MSRP Reviews
  • March 5: Radeon RX 9070 Series reviews
  • March 6: Radeon RX 9070 Series sales

View at TechPowerUp Main Site | Source
 
Does that include all the ROPs? Or do they still come in installments? :slap:
 
5070 is meant for 1080p gamers i assume, which is still a lot of people. 12gb vram isn't all that bad for that resolution from what i remember. i often find High preset looks better than Ultra preset in a lot of games these days too, terrible optimization from devs.

so yeah, if i was building a budget rig as a 1080p gamer, a 5070 would be a great choice assuming you can get it at msrp 549. future games will take advantage of the dlss4 frame gen well, so yeah its not a terrible option. 5070 ti is still the sweet spot though.

Does that include all the ROPs? Or do they still come in installments? :slap:

I'm assuming this will be replaced under warranty though, and quickly, since it's easy to prove and a known issue. Nvidia says it's 0.5% of all GPUs sold; if true, that's not terrible imo.
 
5070 is meant for 1080p gamers i assume, which is still a lot of people. 12gb vram isn't all that bad for that resolution from what i remember. i often find High preset looks better than Ultra preset in a lot of games these days too, terrible optimization from devs.

so yeah, if i was building a budget rig as a 1080p gamer, a 5070 would be a great choice assuming you can get it at msrp 549. future games will take advantage of the dlss4 frame gen well, so yeah its not a terrible option. 5070 ti is still the sweet spot though.



I'm assuming this will be replaced under warranty though, and quickly, since it's easy to prove and a known issue. Nvidia says it's 0.5% of all GPUs sold; if true, that's not terrible imo.
"budget rig"... "$550 GPU is a great choice [for 1080p]"

:kookoo:


Y'all been smoking way too much of Jensen's marketing.
 
"budget rig"... "$550 GPU is a great choice [for 1080p]"

:kookoo:


Y'all been smoking way too much of Jensen's marketing.

i mean my teenage nephew is pulling in $280 a week at his part time. it would literally only take one month's pay to build him a decent rig since he has no bills; with a 5070 and a 1080p 180hz 23.8" monitor, he would have a blast, and that would last him many years thanks to 3-4x frame gen in future games/lowering settings

every market has inflation, it is what it is. adjust or whine, i dunno what to tell ya
 
28th is just "announcement"?

Why not swing with the 9070 XT?
 
The memory bandwidth is disappointing, hopefully the 20 Gbps memory overclocks well.
 
i mean my teenage nephew is pulling in $280 a week at his part time. it would literally only take one month's pay to build him a decent rig since he has no bills; with a 5070 and a 1080p 180hz 23.8" monitor, he would have a blast, and that would last him many years thanks to 3-4x frame gen in future games/lowering settings

every market has inflation, it is what it is. adjust or whine, i dunno what to tell ya
Ah yes, because the entire world has a similar earning profile. Newsflash - 1080p PC gaming is nowadays really popular because of e-sports titles and is experiencing growth mostly through emerging markets. No shot that those populations will stomach 550 (more in reality) for a 1080p card. What we see in Steam HS kinda bears it out - the cheaper 4060 is the king.

Now, one might argue that e-sports games don’t really need a 5070 class GPU, sure, but then, what, are we arguing that building a 1080p rig for single player games with a 600 dollar class GPU is a sign of a sane market in big 2025? That performance tier cost HALF as much a bit less than a decade ago. You can “muh inflation” all you like, I thought technical progress was supposed to gradually lower the barrier of entry for certain performance tiers, not… whatever the fuck current GPU clownery is.
 
The memory bandwidth is disappointing, hopefully the 20 Gbps memory overclocks well.
So far all SKUs from the 5090 all the way down to the 5070 Ti have been limited on memory overclocking at the driver level. Don't get your hopes up.
 
5070 is meant for 1080p gamers i assume, which is still a lot of people. 12gb vram isn't all that bad for that resolution from what i remember. i often find High preset looks better than Ultra preset in a lot of games these days too, terrible optimization from devs.

so yeah, if i was building a budget rig as a 1080p gamer, a 5070 would be a great choice assuming you can get it at msrp 549. future games will take advantage of the dlss4 frame gen well, so yeah its not a terrible option. 5070 ti is still the sweet spot though.



I'm assuming this will be replaced under warranty though, and quickly, since it's easy to prove and a known issue. Nvidia says it's 0.5% of all GPUs sold; if true, that's not terrible imo.
Something like a 7700XT will be more than enough for 1080p gaming. You could even go as low as a 7600. If the 5070 is going to be as fast as a 4070 Super, it's going to be a tad overkill/overpriced imo
 
Consoles do 1080p 60/120 for less than $500.

A $550 card is not a budget PC, and it's embarrassing if it's only satisfactory for 1080p.

$550 was good for 1080p gaming 13 years ago with the gtx 680 and HD 7970
 
Ah yes, because the entire world has a similar earning profile. Newsflash - 1080p PC gaming is nowadays really popular because of e-sports titles and is experiencing growth mostly through emerging markets. No shot that those populations will stomach 550 (more in reality) for a 1080p card. What we see in Steam HS kinda bears it out - the cheaper 4060 is the king.

Now, one might argue that e-sports games don’t really need a 5070 class GPU, sure, but then, what, are we arguing that building a 1080p rig for single player games with a 600 dollar class GPU is a sign of a sane market in big 2025? That performance tier cost HALF as much a bit less than a decade ago. You can “muh inflation” all you like, I thought technical progress was supposed to gradually lower the barrier of entry for certain performance tiers, not… whatever the fuck current GPU clownery is.
tbf, GPU dies are not getting smaller, but wafers are getting much more expensive, and games more demanding. RT could be blamed, but some devs said that they could find a way to bring a GPU to its knees just by improving rasterisation; 10x the shader performance still wouldn't be overkill with all the things that still need improving. the biggest difference now is that 4K gaming and high refresh rate gaming are trying to become a thing, but that requires 4x the power needed to max out 1080p (quick pixel math below), while game devs won't chill with visual improvements.

60 fps 1080p max setting is still fairly cheap. Unless you shop with "future proofing" in mind thinking that 12 GB is clutch, and 16GB the ideal amount of vram for 1080p :D

Unlike other areas we are not going to reach a "good enough" point, unless we reach some kind of limit software wise.
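For what it's worth, the 4x figure checks out on raw pixel count (shading cost doesn't scale perfectly linearly, but it's a decent first-order guide):

```python
# quick pixel-count check on the "4K needs ~4x of 1080p" claim
fhd = 1920 * 1080   # 2,073,600 pixels
uhd = 3840 * 2160   # 8,294,400 pixels
print(uhd / fhd)    # 4.0 -> exactly four times the pixels per frame
```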
 
i mean if he wants to play cyberpunk 2077 and kingdom come deliverance 2 at 180 fps on a 180hz 1080p monitor on high/ultra settings, he prob will need a 5070.

not defending nvidia at all, i only own amd products. im just saying for 549 imo it's not a horrible card for 1080p gamers who like to run their games at 140+ frames. little bit of future proofing built in with the 2-4x frame gen (napkin math below), etc.

im not comparing anyone to the rest of the world, everyone's situation is diff. im just saying for him i'd prob try to talk him into spending an extra 150 to upgrade to the 5070
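Napkin math on that, assuming DLSS 4's 2x/3x/4x multi frame gen modes and taking the 180 Hz monitor as the target:

```python
# rendered fps needed to saturate a 180 Hz monitor with frame generation
target_fps = 180
for multiplier in (2, 3, 4):              # DLSS 4 multi frame gen modes
    print(f"{multiplier}x FG -> ~{target_fps / multiplier:.0f} rendered fps")
# 4x FG -> 45 rendered fps; note input latency tracks the 45, not the 180
```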
 
Put reviewers under NDA contracts so other youtubers and "leakers" can slowly reveal when new hardware will be published.

Kinda fishy. For some reason the speculative release table is quite often right. I wonder why.
 
I'm assuming this will be replaced under warranty though, and quickly, since it's easy to prove and a known issue. Nvidia says it's 0.5% of all GPUs sold; if true, that's not terrible imo.
I believe that, but it's still embarrassing this can go undetected through QA. You know, there's only two ways this can happen. Either they don't check for ROPs during QA or they do, but can't count very well.
 
I believe that, but it's still embarrassing this can go undetected through QA. You know, there's only two ways this can happen. Either they don't check for ROPs during QA or they do, but can't count very well.

I imagine QA works on sample sizes, as checking every item's software-level readings would take an immense amount of time, so for every 10 GPUs that come through the conveyor belt, you pick one at random and test it in-depth, and so on.

Maybe this is not how it works, I just assume this is how it does, so I dunno (i do agree with you overall though, there should have been something more foolproof, as I don't recall AMD ever having this issue, at least not in recent years); rough sketch of that assumption below.
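Here's roughly what that hypothetical 1-in-10 scheme would mean for a per-unit defect like the missing ROPs (all numbers assumed; nobody outside Nvidia knows the real process):

```python
import random

def escape_rate(defect_rate=0.005, lot_size=10, lots=200_000):
    """Share of defective units that ship when QA pulls one random
    unit per lot of `lot_size` for an in-depth bench test."""
    shipped_bad = caught_bad = 0
    for _ in range(lots):
        tested = random.randrange(lot_size)       # the one unit QA pulls
        for unit in range(lot_size):
            if random.random() < defect_rate:     # this unit is defective
                if unit == tested:
                    caught_bad += 1               # found on the bench
                else:
                    shipped_bad += 1              # ships to a customer
    return shipped_bad / (shipped_bad + caught_bad)

print(f"~{escape_rate():.0%} of defective units slip through")  # ~90%
```

So random sampling like that would catch a drifting production line, not individual bad dies, which would square with a 0.5% chunk reaching customers.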
 
i will wait for the 16GB version, thanks
 
i will wait for the 16GB version, thanks

at 1440p you really should wait for a 20+GB version, as reviews show 16GB is already hampering at 1440p in a very select few games
 
at 1440p you really should wait for a 20+GB version, as reviews show 16GB is already hampering at 1440p in a very select few games

that seems exaggerated, i don't really use ultra or even all high settings much, and RT should be the same, a gimmick
but i wouldn't say no
 
The memory bandwidth is disappointing, hopefully the 20 Gbps memory overclocks well.
Blame the 192-bit bus. Almost 700 GB/s doesn't sound too bad to me though.

i will wait for the 16GB version, thanks
GB205 has a 192-bit bus so a 16GB version is just not gonna happen.
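Quick back-of-envelope on both points, assuming the commonly reported 28 Gbps effective GDDR7 rate and one memory package per 32-bit channel:

```python
# GDDR7 math for a 192-bit card (rate and densities assumed from public leaks)
bus_bits = 192
gbps_per_pin = 28                       # reported effective rate for the 5070

bandwidth = bus_bits / 8 * gbps_per_pin
print(f"{bandwidth:.0f} GB/s")          # 672 GB/s -> the "almost 700" above

chips = bus_bits // 32                  # one GDDR7 package per 32-bit channel
for chip_gb in (2, 3):                  # 16 Gbit and 24 Gbit module densities
    print(f"{chip_gb} GB chips -> {chips * chip_gb} GB total")
# 2 GB chips -> 12 GB, 3 GB chips -> 18 GB; 16 GB simply doesn't fit 192-bit
```

So the realistic bigger-VRAM option on this bus would be 18 GB via 3 GB modules, not 16 GB.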
 
GB205 has a 192-bit bus so a 16GB version is just not gonna happen.

would it be the 1st time they used a different chip for the same class? i'm just asking
 
would it be the 1st time they used a different chip for the same class? i'm just asking
Usually that happens later during a card's lifetime when they use a different chip for a lower tier card. Take RTX 2060 TU104 or 4070 TiS AD102 for example.
 
Usually that happens later during a card's lifetime when they use a different chip for a lower tier card. Take RTX 2060 TU104 or 4070 TiS AD102 for example.

see! there will be a cost, that they can afford, because I'm sure they will put a nice premium on those 6GB just like they did with the 4060 Ti 16GB
I'm convinced it will happen
 
$549 12GB card in 2025... lol.
 
$549 12GB card in 2025... lol.
You can easily go for the $300 3060 from ASUS. 12GB! Only $300!

After the 33 RTX 5070 cards sell out and inventory is depleted, a 3060 will be about your only option for a while.
 