
NVIDIA's BFGD Solutions Delayed to Q1 2019, Will Cost an Awful Penny

Playing with HDR 1000 is like this:

[attached image: AUTODARK750.jpg]

Prepare to have your eyes on fire after 1 hour.

My 400 cd/m² screen is set to half or a third (depending) of full luminosity so I can use it without losing an eye by the end of the day.
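
Rough numbers to put that in perspective, going just off the figures above (a back-of-envelope comparison, not measurements):

# Back-of-envelope brightness comparison; rough figures, not measurements.
panel_nits = 400                    # the 400 cd/m2 screen mentioned above
comfy_low = panel_nits / 3          # ~133 cd/m2 at a third of full luminosity
comfy_high = panel_nits / 2         # ~200 cd/m2 at half
hdr_peak = 1000                     # the HDR 1000 peak-highlight target
print(hdr_peak / comfy_high, hdr_peak / comfy_low)   # peaks ~5x to ~7.5x brighter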

It's peak brightness, not sustained.
 
Too expensive, too late in the day, and G-Sync is now an unnecessary price burden IMHO with HDMI 2.1 out.

Note they've been chatting about BFGD for so long I'm bored before even seeing one, tut. PR vapour nonsense thread, IMHO.

"Look at us, look at us!" <- Nvidia.
 
It's peak brightness, not sustained.

It's still ridiculously high. And I strongly doubt it's not detrimental to your eyesight, and it is 100% more tiresome to look at. Your eyes have to adjust far more often and more dramatically.

Is there anything that Nvidia does which doesn't cost a pretty penny?

Sure, for GFE they want your soul :p (logins)
 
They better provide a lifetime supply of eyedrops with that.
 
I was really talking about a longer period of time, AND NOT ABOUT FRIGGIN' GPUS, which yet another person here doesn't understand. How much did the first 1440p 144 Hz monitors cost, and how much are they now?

Two years ago they were going for about $400; a quick visit to Newegg shows the exact same monitor going for $360, and its replacement for $449.
 
Things get cheaper as people buy more, really? You still buy into that fairy tale?
It's not a "fairy tale." It just requires a competitive marketplace, which the GPU market presently lacks.
 
It's not a "fairy tale." It just requires a competitive marketplace, which the gpu market presently lacks.
It's only uncompetitive above a certain price point, the price point Nvidia left behind with the 680 when they started out on this price-hike innovation. And I am presently residing in the realistic-expectations camp as far as RTX goes; I await reviews.
IMHO.
 
That price is about as high as the dude who set it... Nvidia execs are smoking the GOOD shit.
 
I didn't realise Nvidia made monitors? But I appreciate the name gets people mad instantly.
 
It's peak brightness, not sustained.
Even worse. Rapid changes in brightness cause eye strain.

I didn't realise Nvidia made monitors? But I appreciate the name gets people mad instantly.
They don't. They make the G-Sync module, which they're selling for probably $1000-2000, and they're putting requirements on the panel itself that add significantly to that cost before it reaches the market. NVIDIA just priced itself out of the market.


Oh damn, I had to re-read that. "NVIDIA SHIELD built-in." So now your monitor has a Tegra chip with its own ARM CPU and Maxwell GPU. Gee, I wonder why. Oh, right, the G-Sync module has historically been basically a mini computer to handle NVIDIA's bullshit; they just went all the way now. I wonder how long this monitor takes to boot up. :roll: And how are they going to manage "ultra-low latency" when everything has to be handled by two GPUs? [facepalm.jpg] Give. Up. NVIDIA. Implement the Adaptive-Sync standard.
 
They don't. They make the G-Sync module, which they're selling for probably $1000-2000, and they're putting requirements on the panel itself that add significantly to that cost before it reaches the market. NVIDIA just priced itself out of the market.

I thought it was $200? People need to make their minds up.
 
That's for non-HDR G-Sync modules. HDR G-Sync modules are closer to $500. There's no way the panel tech is costing $3,500+, so it's likely these new HDR modules are even more expensive than the previous ones. Call it the "SHIELD tax." NVIDIA is all "we're adding features to your monitors so we deserve more money." Monitor manufacturers need to flip the finger at NVIDIA and wash their hands of it. Only the huge monitor manufacturers can even afford to consider selling a G-Sync monitor.
 
Last time I checked, monitor manufacturers are in the business of making money; it's their name on it, after all, not Nvidia's.

If they didn't think this was worth the effort they wouldn't bother.
 
Since none are out yet, I'd say they aren't bothering. The numbers we're seeing probably come from preliminary estimates that are being leaked to the press so that NVIDIA reconsiders the pricing of the module and/or the requirements (1,000 nits is stupid).
 
I wish they would focus more on 1440p with all the features, in like a 27-32" size.
They would sell way more of those than some niche size/resolution monitor.
i have 1440p 32" (well ... more like 1620p) but what do you mean by all feature? G-Sync? erk ... my screen would cost 500+ and not 299 if it carried that :laugh: 144hz? well ... if my card would push more than 60ish average at 1440p (a little lower but not much at 1620p ), granted that even a 1080Ti can't do 144hz in 1440p (close but not equal or above) maybe it would be nice ... (tho ... 20XX would probably do that ... ) 60hz/75hz oc is quite fine in my setup...

short version: my current monitor is a proof that you can have more for less :laugh: (well ... personal opinion right? )

suspecting the incoming 4k ultimate GPU namely the RTX 2080Ti (an arm, a kidney and maybe a part of your liver, out of taxes of course) will push obscenely priced 4k monitor with more gimmick than ever ...
 
Who needs body parts? I thought Switzerland benefited greatly from stolen gold! Cash in and treat yourself.
 
120 fps at 4K? ... Like, what GPU is Nvidia planning to recommend for this? Does Nvidia know something about AMD Navi? Or Intel's 2020 project? Because one thing Nvidia can know for sure: no green-team card will push past 80 fps at 4K anytime soon.

You live under a rock or something? The 2080 is already averaging 71 fps on that 10-game sample Nvidia provided, so the 2080 Ti should be able to break the 100 fps 4K barrier, and that's without taking DLSS etc. into account.
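
A rough back-of-envelope version of that extrapolation, using the published reference specs and assuming 4K performance scales with raw FP32 throughput (it rarely scales perfectly, and the Ti's extra memory bandwidth muddies it further):

# Back-of-envelope scaling of Nvidia's own 71 fps figure; not a benchmark.
rtx2080_fps_4k = 71                              # Nvidia's 10-game 4K average
rtx2080_tflops = 2944 * 2 * 1.710e9 / 1e12       # ~10.1 TFLOPS FP32 at reference boost
rtx2080ti_tflops = 4352 * 2 * 1.545e9 / 1e12     # ~13.4 TFLOPS FP32 at reference boost
estimate = rtx2080_fps_4k * rtx2080ti_tflops / rtx2080_tflops
print(round(estimate))                           # ~95 fps, before DLSS or driver gains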
 
Speaking of which, what are the specs of DisplayPort 1.4a, which Turing is ready for? I can't find them anywhere; the only thing that came up was that it was released in April 2018.

Turing is HDMI 2.0b

https://rog.asus.com/articles/gamin...and-rtx-2080-graphics-cards-from-rog-and-asus

The RTX 2000 series fights AMD Vega 20.

Next year's 7 nm Nvidia Ampere (RTX 3000) series will have HDMI 2.1 and PCIe 4.0... second-generation ray tracing!

But that's Q4 2019 or Q1 2020.

In 2020, Nvidia Ampere is fighting 10 nm Intel Arctic Sound and 7 nm AMD Navi.
 
Playing with HDR 1000 is like this:

[attached image: AUTODARK750.jpg]

Prepare to have your eyes on fire after 1 hour.

My 400 cd/m² screen is set to half or a third (depending) of full luminosity so I can use it without losing an eye by the end of the day.
Next, Nvidia will be selling gaming glasses to ease eye strain during long gaming sessions.
 
These are expensive, but if we get the same monitors with FreeSync, they should be like $500 cheaper! Oh, wait...

On a more serious note, I've given up on trying to find a decent 32", 4K, HDR-capable monitor. The hype is there, but the technology isn't. It will take a few more years. Which is fine, because that's about how long it will take for the video cards I buy (in the $200-300 range) to start handling 4K okay-ish.
 
Turing is HDMI 2.0b

https://rog.asus.com/articles/gamin...and-rtx-2080-graphics-cards-from-rog-and-asus

The RTX 2000 series fights AMD Vega 20.

Next year's 7 nm Nvidia Ampere (RTX 3000) series will have HDMI 2.1 and PCIe 4.0... second-generation ray tracing!

But that's Q4 2019 or Q1 2020.

In 2020, Nvidia Ampere is fighting 10 nm Intel Arctic Sound and 7 nm AMD Navi.

Yes, of course it has HDMI 2.0b, but I was talking about DisplayPort 1.4a. I suppose it has the same bandwidth and other specs as DP 1.4 has; the only difference might be what it says in the FAQ:
What is the current version of the DisplayPort Standard?
DisplayPort 1.4a was published in April, 2018 and defines the new normative requirement and informative guideline for component and system design.

For more information on DisplayPort 1.4a, see DisplayPort 1.4a Standard FAQs
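
As far as I can tell, 1.4a keeps the same HBR3 link as 1.4, so the bandwidth math doesn't change; a rough sketch (ignoring blanking overhead) of why 4K 120 Hz 10-bit RGB still needs DSC or chroma subsampling on that link:

# DP 1.4 / 1.4a link budget vs. a 4K 120 Hz 10-bit RGB stream (rough; ignores blanking).
lanes, hbr3_gbps = 4, 8.1
raw_link = lanes * hbr3_gbps                  # 32.4 Gbit/s on the wire
effective_link = raw_link * 8 / 10            # 25.92 Gbit/s after 8b/10b encoding
width, height, hz, bpp = 3840, 2160, 120, 30  # 10 bits per channel, RGB
pixel_rate = width * height * hz * bpp / 1e9  # ~29.86 Gbit/s of pixel data
print(pixel_rate > effective_link)            # True -> DSC or 4:2:2/4:2:0 needed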
 
I don't object to this; in fact, I find it quite cool. Halo products are supposed to be crazy. But I honestly can't figure out a use case for it. Yes, if it's good enough it can be used as more than a "mere" gaming monitor, but who is the target audience? Companies needing large displays with this tech, for some reason? Gamers with truly high-end computers who play with controllers from the couch? Or are there people who actually use monitors of this size as desktop monitors?
 
It's still ridiculously high. And I strongly doubt it's not detrimental to your eyesight

The point of this 1000-nit target is advertising. They want their flash-flash-flash ads to really sear into you.

edit: It's also potentially a great way to turn over OLED sets more quickly, since those pixels will wear out faster, especially as the ridiculous 8K craze becomes the standard. People will "discover" problems like gamut shrinkage (especially in the blues) and contrast reduction, and manufacturers will offer upgrades to fix the problem. "Old set looking washed out? The new-and-improved sets not only have 10K resolution, they have a wider color gamut than sRGB!"

What gamers and video watchers need more than 1000 nits is better static contrast (except on OLED) and a vastly wider color gamut than the ancient sRGB. The new HDR standard is going in that direction, but too much emphasis is being placed where it shouldn't be (pixel shrinkage and, especially, excessive eye-searing brightness). I have no doubt that the primary factor behind the brightness marketing is advertising. Ad companies have already discovered the trick of turning the screen black periodically during commercials to make people think the ad is over.
 
At this rate these will be DOA by the time they come out...
 