CHOO CHOOOOO!!!!1! Navi Hype Train be rollin'

You don't know what "educated guess" means and you can't even use a dictionary...
An "educated" guess is still a guess!
 
Honestly, I think the 970 got a pretty decent service life for what it was.



By the time GTA VI makes it to PC, all of these GPUs will be obsolete anyway. What did it take for GTA V? Over a year? RDR2 still isn't here six months later. If anyone knows how to milk people, it's Rockstar.

The 290 has held up even longer.

An "educated" guess is still a guess!
:roll:
 
That's the most ridiculous assumption ever. Seriously, dude? Come on lol.
Assumption? You realize these were statements to investors. Making false statements to investors is fraud.

She went from being excited about Navi in January 2019 to being evasive about Navi in April 2019.
 
Well... not so long ago some people on this forum were sure that Intel designed 6-core CPUs in a few months - because AMD surprised them with Zen. Otherwise they'd be making 4 cores forever.
Yet, when it comes to GPUs, the same people are very worried about Intel's R&D potential... ;-)

I don't understand why everyone is so hyped about Intel GPUs; you may as well hope AMD turns RTG around.

What makes anyone think the people that produced Vega and Polaris are somehow going to buck the trend by going to a company that has as much money as the Pope but hasn't been able to produce a decent GPU?
 
Honestly, I think the 970 got a pretty decent service life for what it was.



By the time GTA VI makes it to PC, all of these GPUs will be obsolete anyway. What did it take for GTA V? Over a year? RDR2 still isn't here six months later. If anyone knows how to milk people, it's Rockstar.


In VR the 970 was even faster than my old Fury X, specifically in Fallout 4 VR, simply because AMD refused to add Async Reprojection and Motion Smoothing to R9 GPUs.

3.5 GB of "slow" GDDR5 VRAM winning over 4 GB of HBM was pretty sad.
 
I don't understand why everyone is so hyped about Intel GPUs; you may as well hope AMD turns RTG around.

What makes anyone think the people that produced Vega and Polaris are somehow going to buck the trend by going to a company that has as much money as the Pope but hasn't been able to produce a decent GPU?

Considering Intel's main focus, like Nvidia's, is AI, gaming takes a far back seat.

We are not the money makers for any corporation; it's other corporations that make these companies money.
 
I don't understand why everyone is so hyped about Intel GPUs; you may as well hope AMD turns RTG around.

What makes anyone think the people that produced Vega and Polaris are somehow going to buck the trend by going to a company that has as much money as the Pope but hasn't been able to produce a decent GPU?

Because a good amount of former ATI talent is now at Intel. A shit ton more money will definitely help as well. Kyle of HardOCP speculated about this a loooooong time ago, and it seems to be coming true now.

https://www.hardocp.com/article/2016/05/27/from_ati_to_amd_back_journey_in_futility

I liked ATi, so since most of the team is now at Intel, I might as well root for Intel GPUs now.
 
You have data points from 3 years. 4K resolution. Pretty much the same RAM usage.
The rate at which memory usage is climbing is exponential, because the 32-bit barrier is gone (that barrier is what made the GTX 970 feasible) and consoles are finally shipping with amounts of memory that aren't pathetic, because developers demanded it.

Again: a $500+ product with only 8 GiB of VRAM? The Radeon VII is a better product from a value standpoint because it won't be obsoleted by growing memory demand as fast as the RTX 2080 and RTX 2070 will be. NVIDIA did that, does that, and will continue to do that intentionally (planned obsolescence). AMD has a history of being generous with memory (except Fury, because of technical limitations of HBM).


For the record, HBCC is a carry-over from Radeon Instinct, where 32 GiB often isn't enough. HBCC, I think, allows DMA to virtually all storage resources in the machine, so the GPU can pull data directly without waiting for the CPU to do it. Not really important for games... especially when you have more than enough dedicated VRAM. When you're working with datasets that are in the terabytes (like laser scans and ray tracing), HBCC reduces latency.
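
If it helps to picture what that means in practice: HBCC is conceptually similar to OS demand paging, where pages are faulted in when touched instead of being copied up front. Here's a rough CPU-side analogy in C++ (just a sketch of the paging idea, not AMD's actual mechanism; the dataset file is whatever huge file you point it at):

```cpp
// Rough CPU-side analogy (POSIX) to the idea behind HBCC: map a dataset far
// larger than physical memory and let pages be faulted in on demand instead
// of copying everything up front. A sketch of the paging concept only --
// not AMD's actual mechanism.
#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>
#include <cstdint>
#include <cstdio>

int main(int argc, char** argv) {
    if (argc < 2) {
        std::fprintf(stderr, "usage: %s <huge-dataset-file>\n", argv[0]);
        return 1;
    }
    int fd = open(argv[1], O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) != 0) { perror("fstat"); return 1; }

    // The mapping reserves address space only; the kernel pulls physical
    // pages in when they are touched -- the same "cache what's hot" idea
    // HBCC applies to VRAM backed by system RAM and storage.
    const uint8_t* data = static_cast<const uint8_t*>(
        mmap(nullptr, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0));
    if (data == MAP_FAILED) { perror("mmap"); return 1; }

    // Touch a sparse sample: only those pages ever get faulted in, no
    // matter how big the file is.
    uint64_t sum = 0;
    for (off_t off = 0; off < st.st_size; off += 64 * 1024 * 1024)
        sum += data[off];

    std::printf("sampled checksum: %llu\n", static_cast<unsigned long long>(sum));
    munmap(const_cast<uint8_t*>(data), st.st_size);
    close(fd);
    return 0;
}
```

Only the pages you actually touch get pulled into RAM; HBCC applies the same "cache the hot pages" idea to video memory.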
 
I don't understand why everyone is so hyped about Intel GPUs; you may as well hope AMD turns RTG around.

What makes anyone think the people that produced Vega and Polaris are somehow going to buck the trend by going to a company that has as much money as the Pope but hasn't been able to produce a decent GPU?
Because Intel is the most mature company in this business: very serious and committed to enterprise clients. They will make a good GPU. Maybe not extremely fast, but extremely polished, focused and easy to use (with minimal tinkering). And yeah, I think they could go as far as locking OC. :-)

It'll be an interesting alternative. If you'd rather get a lot of RAM, a lot of oomph and a lot of driver lottery, AMD will probably still be around.
 
Intel is going to come out of the gate with a process tech disadvantage. The rumor mill says 10 nm, but I don't buy it. Nothing Intel has produced to date on 10 nm comes remotely close to the densities necessary for a monolithic GPU. I think the only way Intel is competitive is by outsourcing dGPUs to TSMC 7 nm. On top of that, I'm still not convinced Intel is even prioritizing real-time rendering. The big money, and the market Intel is losing, is in the enterprise compute space (Radeon Instinct and Tesla).
 
It's not, it's an assumption.
"Assumption" is in different class. It's statement that is treated as true without a proof.
"Guess and "educated guess" are not treated as true, but as uncertain.
But a guess is random and an educated guess is based on some information one has.

So for example:
"eidairaman1 can't use a hammer" is a guess - I have no idea, no information.
but
"eidairaman1 can't use a dictionary" is an educated guess - an inference - because over many years of using Internet forums I encountered many people that didn't know some word (no shame). But they quickly checked whether I'm right and either admitted or changed the subject. And you keep drowning.
 
Well, actually no. Educated guess is more like inference (thinking).
Sure, based on prior evidence and consensus.
Neanderthal: Dinosaurs will rule the Earth forever.
Scientists circa 1500: The Earth is flat. The Earth is the centre of the Universe etc.
 
Because Intel is the most mature company in this business: very serious and committed to enterprise clients. They will make a good GPU. Maybe not extremely fast, but extremely polished, focused and easy to use (with minimal tinkering). And yeah, I think they could go as far as locking OC. :)

None of this should inspire a lot of enthusiasm for 'gamers'.

EDIT: Also, what GPU isn't easy to use?
 
That's the question I've asked @FordGT90Concept.
For 3 years we haven't seen a significant increase in VRAM needs. 4K games utilize roughly the same amount, and that's on the highest settings games offer.
So why would this trend change now? Why would games launching in the next 3 years utilize more?
It'll still be 4K.
Maybe you know?

Looking at RTX cards, clearly RTRT and DLSS need some RAM (1-2 GB above what the game normally needs). But that's hardware that won't magically appear on the Radeon VII. And mainstream Nvidia RTX cards are <=8 GB nevertheless.
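
Side note on where the numbers in this argument come from: on Windows, the usage/budget figures that monitoring overlays show can be read straight from DXGI. A minimal sketch using IDXGIAdapter3::QueryVideoMemoryInfo (my own illustration code, not taken from any particular tool); keep in mind "usage" here is what's allocated, which is not the same as what a game strictly needs:

```cpp
// Minimal sketch (Windows, DXGI 1.4+) of where the numbers people argue
// about come from: the OS-reported video memory *budget* vs. current
// *usage*. Illustration code only.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3))) continue;

        DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
        // LOCAL = dedicated VRAM; NON_LOCAL would be shared system memory.
        if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
                0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) {
            DXGI_ADAPTER_DESC1 desc = {};
            adapter->GetDesc1(&desc);
            std::wprintf(L"%ls: budget %llu MiB, in use %llu MiB\n",
                         desc.Description,
                         info.Budget / (1024ull * 1024ull),
                         info.CurrentUsage / (1024ull * 1024ull));
        }
    }
    return 0;
}
```

That allocated-vs-needed distinction is exactly why "game X shows 8 GB used" doesn't settle the argument by itself; engines happily cache assets into whatever budget they're given.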


No, it isn't.
You haven't seen shit, simples.

This is clear because I can use close to 8 GB of VRAM in GTA V at 4K ultra settings, and guess what: the Radeon VII, with twice the bandwidth, runs it better maxed out, as do the 2080 and the Ti.
You're talking rubbish. Sure, I could game using less VRAM, but I would HAVE to reduce resolution or settings; it's really just that simple.
For three years I have seen VRAM usage increase because I was always trying for max IQ at the best resolution I could.
You're running a potato yet know so much about 4K gaming. How? Because you read reviews.
 
"Assumption" is in different class. It's statement that is treated as true without a proof.
"Guess and "educated guess" are not treated as true, but as uncertain.
But a guess is random and an educated guess is based on some information one has.

So for example:
"eidairaman1 can't use a hammer" is a guess - I have no idea, no information.
but
"eidairaman1 can't use a dictionary" is an educated guess - an inference - because over many years of using Internet forums I encountered many people that didn't know some word (no shame). But they quickly checked whether I'm right and either admitted or changed the subject. And you keep drowning.

The only one drowning is you because you assume.
 
A fanboy YouTube channel is now quoted as a source for a clickbait article; proof people never learn.

Threads like this used to get locked here... now they're just where the remaining active members let their fanboy flags fly. Oh, how TPU has fallen (but so has every traditional forum).
 
Scientists circa 1500: The Earth is flat.
Scientists have claimed the Earth is round since BC times. You're mixing that up with the "Sun orbits the Earth vs. Earth orbits the Sun" argument.
 
Basically, many AMD fans say something like this: objectively Radeon GPUs are sh*t, but AMD is small, poor and doesn't give a f*ck, which makes Radeon GPUs great.

This is so true.
 
I'm not sure the car analogy works. It paints AMD as a cheap brand, while it works quite well for NV given their history of lying and cheating. Maybe VW and Ford, or BMW and Ford, would have worked better. They both make good products, but one is perceived as being "better".

But I know what you mean. AMD has been chasing the mid-range family-sedan market while NV has been chasing the high-end sports coupe market.

VW = Nvidia = Dieselgate, it's perfect? Meanwhile, Dacia's got a spotless reputation and competes on price ;) Ford does not. You said it right: perceived to be better :) Not actually better at getting the job done. In fact, VW's got worse failure rates, I believe.

Anyway :D Let's move on

Apex Legends can use more than 8 GB, and that's DX11. As a 4K ultra-IQ gamer, I see that 8 GB limit getting tested today; imagine what spec GTA VI will need.

What!? Apex Legends? Not sure that is a great example. I can almost count the number of different assets in that game on my two hands.
 
None of this should inspire a lot of enthusiasm for 'gamers'.
Why not?
EDIT: Also, what GPU isn't easy to use?
The one you have to underclock, overclock, flash, tune, and pick the right driver for (to get the performance figures everyone talks about on internet forums).

I won't be surprised if drivers come via Windows Update and some GPUs have locked clocks.
The only one drowning is you because you assume.
You have no idea how math or science works, do you? :-D
Next time you comment in an AI-related thread, I'll ask you what "inference" is. Maybe you can at least use Wikipedia. ;-)
Sure, based on prior evidence and consensus.
I said: based on the information one has. It doesn't have to be true, and the inference doesn't have to be correct.
Scientists circa 1500: The Earth is flat.
Ancient Greeks were already pretty certain the Earth is round.
Anyway, the "flat Earth" approximation will never go away in science and engineering. It's very good.
The Earth is the centre of the Universe etc.
In classical physics it is relative, so there was really no way to understand this until we had gravity equations (17th century).
 
Well, I read a few pages here hoping for some "Hype Train" stuff, but I guess the train's left the station already. Too bad I missed it. The same nicknames appeared, and of course the same old stories surfaced again. Apples or oranges? I see the same persistence of some people drowning in their own thoughts :) Amazing, really, how hopeless and depressed some of you are :)

On topic.
This is frustrating. We've been hearing about Navi for a while now, and AMD pushed the release back. The question here is why? Will it be as good as WCCFTECH claims? Well, I sure hope so. And this ray tracing :) It's everywhere now and it's starting to get boring. As of now, I really couldn't care less about all the RTRT. I just want a good card for a reasonable price.
 
On topic.
This is frustrating. We've been hearing about Navi for a while now, and AMD pushed the release back. The question here is why? Will it be as good as WCCFTECH claims? Well, I sure hope so. And this ray tracing :) It's everywhere now and it's starting to get boring. As of now, I really couldn't care less about all the RTRT. I just want a good card for a reasonable price.
Well, we have been hearing about Navi for a while, just not from AMD. Huge difference. They mention it here and there as an afterthought; everything mostly came from rumors and "leaks".
And another thing: WCCF is as reliable as a 1980 Zastava Yugo. Don't give them too much credit.
 