Thursday, May 20th 2010

Next Generation 3DMark Named, Detailed

Futuremark is readying the next generation of its popular 3D graphics benchmark, 3DMark. The new version will be called 3DMark 11 (likely named after the year 2011 or DirectX 11), and will strive to be every present-generation GPU's worst nightmare (read: stress test). With NVIDIA following ATI into the DirectX 11 generation of graphics, making DirectX 11 GPUs "current", 3DMark 11 will focus on the GPUs' DirectX 11 capabilities, probably exploiting new texturing and hardware tessellation features.

The benchmark will be able to push present and future GPUs to their maximum capabilities. At least one of its game tests is known to be called "Undersea Submarine". There is no mention of NVIDIA PhysX, so it's safe to assume that the benchmark will use industry-standard features available to all DirectX 11 generation GPUs. However, similar to a marketing deal with Sapphire for 3DMark Vantage (where a certain game test showed a futuristic vehicle with a Sapphire logo on it), this version will carry MSI branding. What's more, it will be bundled with MSI graphics cards when the software releases. Futuremark will show off its latest creation at this year's Computex event, held in early June in Taiwan. The software will be released sometime in Q3 2010.
Source: Plaza.fl

48 Comments on Next Generation 3DMark Named, Detailed

#26
CDdude55
Crazy 4 TPU!!!
Sounds great. :)

I have never used 3DMark (I'm not so big on the benchmarking scene), but this sounds good.
#27
Kitkat
Wow, 3DMark Equality edition.

Al Sharpton and Jesse Jackson finally stepped in. Now, all of TPU, do your part: stop making PhysX a con for ATI cards. You didn't make Avivo a con for NVIDIA. lol, it's stupid.
#28
DrPepper
The Doctor is in the house
Kitkat: Wow, 3DMark Equality edition.

Al Sharpton and Jesse Jackson finally stepped in. Now, all of TPU, do your part: stop making PhysX a con for ATI cards. You didn't make Avivo a con for NVIDIA. lol, it's stupid.
NVIDIA has their own equivalent, AFAIK.
#29
shevanel
Uh oh... stuff like this makes me break out the plastic.
#30
kid41212003
cadaveca: Yeah, I know that you and I are eye-to-eye on a lot of things, this included.

Vantage has never been a decent bench... far too CPU-limited on the "game" benches. There's no way, if a bench is programmed right, that increasing CPU speed should increase GPU scores.

I understand why it's that way currently, but I can't help but think that's the wrong approach.

I really wonder, with Shattered Horizon out and patched heavily, how long they've really been working on this. Seems to me this announcement is due to them finally getting the proper funding to make a new one (thank you very much for that, MSI!~). I bought Shattered Horizon, but have only played it once... I was buying it to help support them until the next 3DMark, and that's it.
I think you meant 3DMark06; Vantage is not CPU-limited. Even a 2.4 GHz Phenom can max out dual 8800 GTs in Vantage (GPU score), while in 2006 it can't.
#31
kid41212003
What I see is:

Dual 8800 GTs score 10k in GPU points using a Phenom I X4 at 2.4 GHz.

With a Core i7 @ 4.2 GHz, the GPU score is 10,100. Yes, an increase of ~100 points in GPU score from a far better CPU at a higher clock.

Take off one card and I get a little over 5k in GPU score; SLI them and it's 10k.

What about 2006? Phenom 2.4 GHz + SLI 8800 GTs = 11k total score, 5k in SM 2.0 and 6k in SM 3.0.

Result with a Core i7 at 4.2 GHz = 23k total score, 10k in SM 2.0 and 10k in SM 3.0.

It's not that the CPU score is higher; the GPU score (FPS) is doubled.
GPU utilization software is not reliable; most of the time it doesn't work.
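The pattern in those numbers can be sketched with a toy frame-time model (purely illustrative; the millisecond figures are made up and this is not Futuremark's actual scoring): a frame is only done when both the CPU and the GPU have finished their share of the work, so the slower of the two sets the frame rate.

```python
# Toy model: frame time is dominated by whichever of the CPU or GPU
# takes longer to finish its portion of a frame.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound test (Vantage-style GPU score): GPU needs 40 ms, CPU far less.
# Doubling CPU speed changes nothing, because the GPU is the bottleneck.
print(fps(cpu_ms=12.0, gpu_ms=40.0))  # 25.0 fps with a slow CPU
print(fps(cpu_ms=6.0,  gpu_ms=40.0))  # 25.0 fps with a fast CPU

# CPU-bound test (3DMark06-style game test): CPU needs 20 ms, GPU only 8 ms.
# Doubling CPU speed doubles the frame rate, and with it the "GPU" result.
print(fps(cpu_ms=20.0, gpu_ms=8.0))   # 50.0 fps with a slow CPU
print(fps(cpu_ms=10.0, gpu_ms=8.0))   # 100.0 fps with a fast CPU
```

This is why the Vantage GPU score barely moves between a 2.4 GHz Phenom and a 4.2 GHz Core i7, while the 3DMark06 result roughly doubles.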
#32
kid41212003
That's easy to explain. If it can score 18k using a single card but can't scale well with the second, it means the driver is fucked up. Ask AMD about that problem.
#33
erocker
*
Gentlemen, this thread isn't about Vantage or 06, so let's agree to disagree regardless.

Much appreciated. Thanks. :)
#34
kid41212003
Nonetheless, it will likely make heavy use of tessellation, which would give the current NVIDIA cards higher scores and start another war that AMD "doesn't" want to fight. And so, AMD will start neglecting to optimize drivers for benchmarks flagged as NVIDIA-friendly (this is not fact).

Currently known DX11 benchmarks: Bitsquid, Unigine, and Futuremark DX11.
#35
HillBeast
Sweet. I'm wondering, though, if it will do anything like Unigine, where you can run it in DX10 mode so you can still compare it with older cards. If you can, then that would be good too.

On another note, MSI stopped impressing me ages ago. I don't know why, though; I guess Gigabyte just feels a bit more thought out these days than MSI. Oh well.
#36
HillBeast
kid41212003: Nonetheless, it will likely make heavy use of tessellation, which would give the current NVIDIA cards higher scores and start another war that AMD "doesn't" want to fight. And so, AMD will start neglecting to optimize drivers for benchmarks flagged as NVIDIA-friendly (this is not fact).
I totally agree. The last thing I want is for ATI to start optimizing for 3DMark or other benchmarks and making game support really crap. Benchmarks are nice and all, but when driver companies optimize for them, it shows they have stopped making game optimizations.
#37
Divide Overflow
I wonder what kind of licensing or account restrictions will be needed to run it.
#38
SK-1
Time for 3DMark to make our PCs feel puny again...
#39
Kitkat
DrPepper: NVIDIA has their own equivalent, AFAIK.
Doesn't matter, it's dumb. They don't have one, they didn't make it, it's not a con.
#40
Wile E
Power User
Missing features compared to competitors IS, most definitely, a con.

At any rate, I hope this 3DMark is better than Vantage was. Vantage wasn't nearly as brutal as it needed to be.
#41
Easo
There must be sound... xD
#42
Kitkat
Wile E: Missing features compared to competitors IS, most definitely, a con.

At any rate, I hope this 3DMark is better than Vantage was. Vantage wasn't nearly as brutal as it needed to be.
It's not open competition (which is why they are dropping it); there was no open race. Exclusive development for one card. They just made it for themselves and did it. It's NOT a con, period. There's no way it is, or they'd have to do it for every little thing. It's STUPID.
#44
DrPepper
The Doctor is in the house
Kitkat: Doesn't matter, it's dumb. They don't have one, they didn't make it, it's not a con.
NVIDIA has PureVideo. ATI's equivalent is Avivo. That's why it's not listed as a con.
#45
WSP
Futuremark needs to take out the CPU test like it did with 3DMark01/03/05.
Just a GPU bench will do fine, thanks.
#46
SKD007
erocker: NVIDIA doesn't need to pay, as the tessellation on their 4-series cards is superior to ATI's. Basically, it's two tessellators vs. one with ATI.
Exactly... the 4-series cards are superior to ATI's when it comes to tessellation, so NVIDIA will pay them to use more tessellation, so that they can outperform ATI in the test by a big margin. :)

Very simple, right? ;)
#47
OnBoard
The news is now pulled; MSI leaked it too early.

About Vantage being bad-looking? The exact same thing happened with 3DMark05: it just didn't run well on anyone's system, and that made it look bad. 3DMark06 was the same benchmark, but a year later, when everyone had better computers, it ran better.

Vantage looks great as long as you run it at a widescreen resolution and have the power to run it smoothly. 3DMark 11 will "look horrid" for all but 4x0/58xx users, but in a couple of years it'll be good.
#48
Wile E
Power User
Kitkat: It's not open competition (which is why they are dropping it); there was no open race. Exclusive development for one card. They just made it for themselves and did it. It's NOT a con, period. There's no way it is, or they'd have to do it for every little thing. It's STUPID.
That's where you are wrong; it is open competition. For example, NVIDIA not having a technology similar to Eyefinity is a con. ATI not having GPU physics is most definitely a con.