
Atari is working on its first console in more than 20 years

Well, I am less forgiving; they said it could do 4k native 60fps, which is not the truth. Anything I buy that I find is not what it claims goes right back, even more so when it costs $400-$500.

I can see the current ones being replaced pretty soon too, so they can do 4k better with improved graphics.
 
Saying they can do 4k60 doesn't mean they're promising that for every title. They don't have total control over how all devs develop their games, especially AAA non-exclusive titles.

Anything you buy..., yeah, but you're also a PC gamer with higher standards, so consoles don't really apply to you. I just see this as a positive step up in the console realm. It doesn't mean I'm going to buy into it.
 
And what you fail to understand is that the same applies to the PS4 Pro, thus it's still 1.8 TFLOPS behind. You're going by nothing but sites that claim real-world specs. You don't even acknowledge that many devs, who know full well what kind of hardware is required for 4k, have already said the One X is more suited to 4k than the PS4 Pro.

LOL, foot in MY mouth? Even Sony's own spec admits it's not a true 4k player. What you don't get is that you can't go by YouTube's player. There are ways to trick YouTube into using a higher bit rate. Just by outputting a 1080p capture at 2048x1152, you can make it look better than a 1080p upload. And just because the player plays a vid at 4k doesn't mean it hasn't been doctored by upscaling. I imagine your eyes aren't even good enough to tell the difference between true 4k and upscaled 4k, given what you've said.
Odd of you to say, since none of what you said there addresses the topic, and it's obviously more fanboyish than anything I said. You don't win an argument by just implying MS is evil and Sony isn't, without giving any reasons whatsoever. MS lifting the restrictions announced at the One's initial launch, and working hard on backward compatibility, doesn't just happen out of the blue; those are decisions made by actual people that head Xbox. If anyone's acting like a dense, biased fanboy, it's you.
LOL, no, that was you overreacting to my saying MS is the only one putting up a decent console with the One X. You were the one that wanted to turn it into a MS-vs-every-other-console pissing contest, and none of what you said had any validity. This is what happens when gamers think YouTube is the end-all, be-all for judging video quality. :rolleyes:
And you called ME a fanboy. Wow..
MS officially announced the core count and clock speeds, so from whom do you expect proof? It's a technical specification; here is the formula: compute units x shaders per CU x clock speed x 2 = FLOPS. There is no such thing as a need for independent proof or "technologically reasonable" estimates.
??? 40 * 64 * 1172 * 2 = 6,000,640 MFLOPS (with the clock in MHz), aka 6 TFLOPS. Why are you all so doubtful of this? I don't get it.
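For what it's worth, the arithmetic being argued over can be written out as a quick sketch. The 64 shaders per compute unit and 2 FP32 operations per clock (one fused multiply-add) are AMD GCN architectural constants; the CU counts and clocks are the figures quoted in this thread:

```python
# Peak-FP32 sketch of the formula above, for AMD GCN-style GPUs.
# 64 shaders per compute unit and 2 ops per clock (fused multiply-add)
# are GCN architectural constants; CU counts and clocks are the
# figures quoted in this thread.
def peak_tflops(compute_units, clock_mhz, shaders_per_cu=64, ops_per_clock=2):
    flops = compute_units * shaders_per_cu * clock_mhz * 1e6 * ops_per_clock
    return flops / 1e12

print(round(peak_tflops(40, 1172), 2))  # Scorpio: 6.0
print(round(peak_tflops(36, 911), 2))   # PS4 Pro: 4.2
```

Note this is a theoretical ceiling, not a promise of delivered performance.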
That is NOT how FLOPS are calculated. Not all ICs are created equal, and there is no one universal formula for calculating FLOPS, just like there is no universal formula for calculating IPC. Each platform must be tested independently. Careful there, sub-genius, your ignorance is showing.
 

Nice thorough explanation of why I am wrong, mate. I just love these comments.

That IS precisely the formula for peak floating point operations per second. That is the absolute limit of what the hardware can provide with no software overhead. And it is the only objective measurement you can make for raw processing power.

That is because not all pieces of code can make perfect use of the hardware. Every "independent test" will provide a level of performance relevant only within the context of the test itself and in relation to how optimized it is and what you are trying to achieve with it.

There are algorithmic aspects that need to be taken into account as well: you can write a piece of sequential code that will "prove" a Titan Xp is a piece of shit; similarly, you can make use of the data parallelism these GPUs are designed around and make it look like an absolute beast.

"Independent testing" alone is in no way enough to infer how capable the hardware is overall.
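That point is easy to demonstrate even on a CPU. A quick Python sketch, with NumPy standing in for the GPU's parallel hardware (the workload and sizes here are arbitrary illustrations):

```python
import time
import numpy as np

# Same workload (a dot product, 2n FLOPs) written two ways: a sequential
# Python loop vs. a vectorised call that can use SIMD/optimised BLAS.
# The achieved FLOP/s differ enormously even though the math is identical.
n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

t0 = time.perf_counter()
s = 0.0
for i in range(n):        # sequential: one multiply-add per iteration
    s += a[i] * b[i]
t_loop = time.perf_counter() - t0

t0 = time.perf_counter()
v = float(a @ b)          # vectorised: same 2n FLOPs
t_vec = time.perf_counter() - t0

print(f"loop: {2 * n / t_loop / 1e6:.0f} MFLOP/s")
print(f"vectorised: {2 * n / t_vec / 1e6:.0f} MFLOP/s")
```

Neither number is the chip's peak; both are "independent tests", and they can differ by orders of magnitude.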

All I have to say is: gain some serious knowledge, buddy, for your own sake. Otherwise, don't go around pretending you know what you are talking about.

Till then, you've won a place on the ignore list for insulting me for no reason in an incredibly poor attempt to prove your point.
 
I've been working in this industry longer than you've been alive and am well aware of how things work. You are incorrect about how FLOPS are calculated, even for peak values. Your proposed equation fails to account for all possible variables between different calculation/function operations from platform to platform, and even between different products within a platform. Microsoft's declared TFLOPS are literally a marketing number that has no bearing on real-world performance. And the reason I've not offered anything more in this conversation is simple: I don't need to. Anyone who understands how compute operations function knows that Microsoft's declared numbers are a marketing fantasy and that your suggested "math" for estimating FLOPS is nonsense. If you wish to continue rambling on, it's a free world.
 
All this claimed genius, yet you can't fathom the difference between 4k upscaling and true 4k. Even your screen name is odd. Lex wasn't the master, Superman was. The only one "rambling on" here is you, and clearly out of an over-inflated ego.
 
True

Except that AMD is not making the claims. Microsoft is, and they are well known for fudging numbers and outright lying for marketing advantage. Given the known capabilities of existing AMD APUs, 6 TFLOPS seems a dubious claim. 4.6 TFLOPS is a more realistic number, again based on known facts.

The GTX 980 Ti running at Nvidia's spec clock is a 5.4 TFLOP part. Increasing the clock speed will increase that number.
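The peak number does scale linearly with clock, since the shader count is fixed. A quick sketch using the 980 Ti's published 2816 CUDA cores (the clocks below are illustrative, and the exact TFLOPS figure depends on which clock you plug in):

```python
# Peak FP32 scales linearly with clock for a fixed shader count.
# 2816 is the GTX 980 Ti's published CUDA core count; 2 FLOPs per
# core per clock (FMA). The clock values are illustrative.
CORES = 2816

def tflops(clock_mhz):
    return CORES * 2 * clock_mhz * 1e6 / 1e12

print(f"{tflops(1000):.2f} TFLOPS at 1000 MHz")
print(f"{tflops(1200):.2f} TFLOPS at 1200 MHz")
```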

Ok let's do this.

That is .2GHz. If it were .1 would that be a small enough difference to be "close" to you?

Until someone independently proves this number, I'm going with a more technologically reasonable 4.8 TFLOPS, and vs 4.2, that's reasonably close.

You're right on that one, it's 50% more and at a faster speed. But it's still in the ballpark.

You conveniently left out that PS4Pro does 4k very easily and that Sony feels that most users will be streaming 4k content rather than using BD4k discs. That was a weak point to attempt.
Let's add a few more stats shall we?
5. Scorpio has 40 compute units whereas the PS4 Pro has 36. I call that close.
6. Those compute units run at 1172MHz vs 911MHz respectively. Again, that is in the ballpark.

You assume the mistrust is related to who's in charge. That is only one smaller aspect of the total problem. The greater part of the problem is that Microsoft keeps violating the general privacy and digital security of the public without regard to the long term ramifications of their actions. And they keep lying about how much data they are collecting. So yeah, not trustworthy. And an ever increasing number of people are realizing this.

Mid to High end? No. The Mid-range suggestion is reasonable as is the point of the Keyboard and Mouse support. Make no mistake, good choices are being made with the system and it's going to be an improvement.

Um, ok. Everyone is doing that to varying degrees of success. However, Nintendo is a lot better at it. They're also making a lot of really creative, original titles and they're not alone.

What a subjective statement. "Serious" gamers play games they enjoy, regardless of genre. Perhaps the word you were looking for was "professional"?

I don't know, man; I do disagree that a 4.8 vs 4.2 difference is a small gap. That's like jumping to a higher GPU tier. And with the knowledge of where the consoles stand performance-wise, that is quite huge. This can be the difference between upscaling something and not doing so, especially for lighter games. And we also know that with these measly Jaguar cores, every small bit of performance is going to help overall, in a big way and at the moments where it counts.
 
Lex wasn't the master, Superman was.
That was weak.
The only one "rambling on" here is you, and clearly out of an over-inflated ego.
Yup, that's gotta be it. Irony much?
I don't know, man; I do disagree that a 4.8 vs 4.2 difference is a small gap. That's like jumping to a higher GPU tier.
Exactly. Not the earth-shattering jump in performance Microsoft is making it out to be.
And with the knowledge of where the consoles stand performance-wise, that is quite huge. This can be the difference between upscaling something and not doing so, especially for lighter games. And we also know that with these measly Jaguar cores, every small bit of performance is going to help overall, in a big way and at the moments where it counts.
Those are good points. And to be clear, I do think that Scorpio is an improvement over the last generation of SoC, but there is no way that four extra GPU compute units [40 vs 36 over the PS4 Pro] running 200-ish MHz faster are going to equate to 31% greater performance. The laws of physics apply. It's just not going to happen.

Here's the thing, though: if Microsoft delivers on the KB&M support for the platform and the rumor of KB&M patches for the Halo games, I'm all in. I don't really care what the specs are if the games are good. Lack of KB&M support is why I stopped playing FPS's on consoles. Controllers are cumbersome as hell for FPS games and some gamers refuse to play in such a way, me included. Give us KB&M support and the whole thing changes.
 
Lack of KB&M support is why I stopped playing FPS's on consoles. Controllers are cumbersome as hell for FPS games and some gamers refuse to play in such a way, me included. Give us KB&M support and the whole thing changes.
I don't disagree that it's the worst thing about consoles, but I find it really odd anyone would say it's a reason to stop playing on consoles. More like a reason to not play on them to begin with, until now, IF One X really does get proper KB/M support. Not too convinced of that yet really.
 
I don't disagree that it's the worst thing about consoles, but I find it really odd anyone would say it's a reason to stop playing on consoles. More like a reason to not play on them to begin with, until now, IF One X really does get proper KB/M support. Not too convinced of that yet really.
To be fair, I gave it a shot on the original Xbox with Halo 1 & 2 [and again recently with the Master Chief Collection], along with a few others. But this was AFTER having been a PC gamer for half a decade. FPS games are FAR more fun to play on PC. After the Xbox/PS2/GameCube era I gave up on FPS's and RTS's on consoles. KB&M support is a deal breaker. And I agree, Microsoft has a habit of saying one thing and then doing another, or just doing nothing.

EDIT;
After discussing this with a few coworkers, we have arrived at a conclusion. The only way Scorpio could do 6 TFLOPS is if they used the GPU and 6 of the CPU cores to do floating-point calculations. Sony's estimations are based only on GPU compute numbers, but when you factor in the same scenario, the PS4 Pro using its GPU and 6 of its CPU cores, the rough numbers are about 5.5 to 5.6 TFLOPS. But really, how many devs are going to do FLOPs on the CPU? None. So the 6 TFLOPS estimate by Microsoft is technically true, but Sony's numbers are more honest/realistic, as they represent the scenario that devs will most likely use.
 
To be fair, I gave it a shot on the original Xbox with Halo 1 & 2...
My idea of trying consoles is when I'm at a friend's place and he shoves his gamepad in my face, no doubt looking to humiliate me since he knows I suck with them. That said, even though I did better than I expected in some cases, like a boss bloater fight in The Last of Us and some events in NFS Rivals where I kicked his ass, there are just too many awkward-feeling moments where it's epic-fail time to ever want to own one. And even IF the One X gets proper KB/M support, I still think I'm better off with a PC.

As for the TFLOPS thing, I don't recall AMD ever just blatantly lying about such a spec, including claiming it's all GPU power instead of shared with the CPU. They'd be setting themselves up for lost sales, and there's no way in hell MS would be charging $100 more than Sony if it didn't have significantly more power.

So all else I'll say on that is we shall see when it launches. Anyways, this has already distracted from the thread topic enough, and since we disagree on main points, I see no reason to continue diverting the conversation.
 
Nothing beats a controller for Fighting/Driving/Flying/Scroller games though...regardless of platform.

This Atari box is looking less and less appealing. Other than nostalgia I don't see much of a point. I hope they push this out with more than they have presented with so far.
 
Nothing beats a controller for Fighting/Driving/Flying/Scroller games though...regardless of platform.
I tested that theory on GRID 2 by buying a friend's original Xbox controller cheap. While it did arc turns more smoothly, it was just too slow in response to correct small mistakes compared to a KB. The result was that mistakes got amplified and I ended up wagging into walls. The way some of the better devs build algorithmic filters for KB input into their racing games these days, they drive pretty well. It really depends on the game and what the player is used to.
 
Nothing beats a controller for Fighting/Driving/Flying/Scroller games though...regardless of platform
Going to agree to a point. Speaking from experience, nothing beats a quality steering wheel for driving games and nothing beats a flight stick for flight/space sim games.

This Atari box is looking less and less appealing. Other than nostalgia I don't see much of a point. I hope they push this out with more than they have presented with so far.
Not much is known about it yet. Give them time.
It really depends on the game and what the player is used to.
So very true.
 
[attachment: hudson.jpg]
^This just makes me sad that Bill Paxton passed away recently. He was awesome in Training Day.
 
[attachment: atari-vcs-family-adjusted.jpg]


This will be on my stand for sure. Too many fond memories for it not to be.
 
Gorgeous!

Is that your pic?
 
I can 3d print one and throw a Bristol Ridge APU in it myself for a fraction of the cost... I really like the controllers though!
 
My worst expectations of this project came true. Judging by all the vagueness from the mouths of the Ataribox VCS team, it seems like it's gonna be a boring Linux box without content.
Also, April is a re-launch of the campaign, so for the past year they couldn't even come up with at least some solid lies about the ongoing progress. They only changed the name, kept the old chassis design, and did not even confirm that it's a 100% x86_64 platform. So the worst-case scenario is that it may as well be a bigger OUYA with some, by today's standards mundane, Helio X20 or (I seriously hope not) Allwinner A63 onboard. So far all we have is "It's probably going to be somewhere between Shield and Switch" (i.e. in an overestimated ballpark of the Tegra X1). Given the current price of the TX1 and the projected price of the Ataribox, I seriously doubt it's actually based on Tegra, and the only cheap x86 alternatives with equivalent performance are the Atom X5 and last-gen Carrizo.
 
Any console launch without exclusives is going to fail, and I'm not seeing anything that suggests Atari has a plan in that regard. Unless they're going to try to emulate the S/NES Mini's retro-games approach, which I somehow don't think is gonna work with Pong unless all the hipsters decide to buy it.

I can 3d print one and throw a Bristol Ridge APU in it myself for a fraction of the cost... I really like the controllers though!

And it would probably be faster than what they'll go with...
 
Will it be called the Atari 2700? Or the Atari II
 
I can 3d print one and throw a Bristol Ridge APU in it myself for a fraction of the cost... I really like the controllers though!
Buy an Xbox 360 controller? It looks pretty much identical, even down to the ABXY layout.

Oh, you're talking about the joystick. That really comes down to the quality of the joint. Most long sticks like that are made with cheap plastic joints these days, which are garbage. I highly doubt Atari is doing any better.
 