Wednesday, October 21st 2009

NVIDIA RealityServer Propels 3D Cloud Computing Using GPUs

NVIDIA, inventor of the graphics processing unit (GPU), and mental images, world leader in rendering technologies, today introduced the NVIDIA RealityServer platform for cloud computing, a powerful combination of GPUs and software that streams interactive, photo-realistic 3D applications to any web-connected PC, laptop, netbook, or smartphone.

NVIDIA RealityServer - the culmination of nearly 40 collective years of hardware and software engineering by NVIDIA and mental images - enables developers to create a new generation of consumer and enterprise 3D web applications, all with remarkable levels of photo-realism.
  • Automobile product engineering teams will be able to securely share and visualize complex 3D models of cars under different lighting and environmental conditions.
  • Architects and their clients will be able to review sophisticated architectural models, rendered in different settings, including day or night.
  • Online shoppers will be able to interactively design home interiors, rearrange furniture, and view how fabrics will drape, all with perfectly accurate lighting.
The NVIDIA RealityServer platform consists of an NVIDIA Tesla RS GPU-based server running RealityServer software from mental images. While photorealistic imagery has traditionally taken hours or days to create, this first-of-its-kind integrated solution streams images of photorealistic scenes at rates approaching an interactive gaming experience.
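
The press release does not describe the wire protocol, but the architecture it implies is straightforward: frames are rendered on the server's GPUs and pushed to a thin client as a stream of images. The CUDA C++ sketch below is a minimal stand-in for that loop, under stated assumptions: the renderFrame kernel is a trivial gradient in place of a real renderer, and uncompressed PPM frames on stdout stand in for a compressed network stream; none of this is RealityServer's actual API.

```cuda
// Conceptual sketch of GPU-rendered frames streamed to a client.
// The kernel, frame format, and "transport" are illustrative assumptions.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// Stand-in renderer: fills the frame with a time-varying gradient.
__global__ void renderFrame(uchar3* fb, int w, int h, float t)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;
    fb[y * w + x] = make_uchar3(
        (unsigned char)(255.f * x / w),
        (unsigned char)(255.f * y / h),
        (unsigned char)(127.f + 127.f * __sinf(t)));
}

int main()
{
    const int W = 640, H = 360;
    uchar3* d_fb;
    cudaMalloc(&d_fb, W * H * sizeof(uchar3));
    std::vector<uchar3> h_fb(W * H);

    dim3 block(16, 16), grid((W + 15) / 16, (H + 15) / 16);
    for (int frame = 0; frame < 30; ++frame) {    // one second at 30 fps
        renderFrame<<<grid, block>>>(d_fb, W, H, frame / 30.f);
        cudaMemcpy(h_fb.data(), d_fb, W * H * sizeof(uchar3),
                   cudaMemcpyDeviceToHost);
        // "Stream" the frame as a binary PPM on stdout; a real service
        // would compress each frame and send it to the browser client.
        printf("P6\n%d %d\n255\n", W, H);
        fwrite(h_fb.data(), sizeof(uchar3), W * H, stdout);
    }
    cudaFree(d_fb);
    return 0;
}
```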

"This is one giant leap closer to the goal of real-time photo-realistic visual computing for the masses," said Dan Vivoli, senior vice president, NVIDIA. "mental images fully embraced the concept of GPU co-processing to enable Interactive photo-realism anywhere, any time - something that was science fiction just yesterday."

RealityServer software utilizes mental images' iray technology, the world's first physically correct ray-tracing renderer that employs the massively parallel CUDA architecture of NVIDIA GPUs to create stunningly accurate photo-realistic images by simulating the physics of light in its interaction with matter. Because ray tracing is one of the most computationally demanding problems, iray technology is designed to take advantage of the parallel computing power of NVIDIA Tesla.
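
Ray tracing maps naturally onto this architecture because each pixel's primary ray can be traced by its own GPU thread, independently of its neighbors. The kernel below is a toy illustration of that principle - one thread, one ray, one sphere, simple Lambertian shading - and is only a sketch of why the problem parallelizes so well, not iray's physically correct light simulation.

```cuda
// Minimal sketch, not iray: one CUDA thread traces one primary ray against a
// single sphere and applies Lambertian (diffuse) shading. Physically correct
// renderers add many bounces, materials, and Monte Carlo sampling on top.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__device__ float3 sub3(float3 a, float3 b) { return make_float3(a.x - b.x, a.y - b.y, a.z - b.z); }
__device__ float  dot3(float3 a, float3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
__device__ float3 unit3(float3 v) { float l = sqrtf(dot3(v, v)); return make_float3(v.x / l, v.y / l, v.z / l); }

__global__ void trace(unsigned char* img, int w, int h)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    // Camera at the origin, image plane at z = -1, sphere at (0, 0, -3).
    float3 dir = unit3(make_float3((2.f * x / w - 1.f) * w / h,
                                   1.f - 2.f * y / h, -1.f));
    float3 c = make_float3(0.f, 0.f, -3.f);
    float  r = 1.f;

    // Ray/sphere intersection: solve |t*dir - c|^2 = r^2 for t.
    float b = dot3(dir, c);
    float disc = b * b - (dot3(c, c) - r * r);
    unsigned char shade = 30;                          // background
    if (disc > 0.f) {
        float t = b - sqrtf(disc);                     // nearest hit
        float3 p = make_float3(dir.x * t, dir.y * t, dir.z * t);
        float3 n = unit3(sub3(p, c));                  // surface normal
        float3 l = unit3(make_float3(1.f, 1.f, 1.f));  // directional light
        float lambert = fmaxf(dot3(n, l), 0.f);        // diffuse term
        shade = (unsigned char)(40 + 215 * lambert);
    }
    img[y * w + x] = shade;
}

int main()
{
    const int W = 512, H = 512;
    unsigned char* d_img;
    cudaMalloc(&d_img, W * H);
    std::vector<unsigned char> h_img(W * H);

    dim3 block(16, 16), grid((W + 15) / 16, (H + 15) / 16);
    trace<<<grid, block>>>(d_img, W, H);
    cudaMemcpy(h_img.data(), d_img, W * H, cudaMemcpyDeviceToHost);

    printf("P5\n%d %d\n255\n", W, H);                  // grayscale PGM
    fwrite(h_img.data(), 1, W * H, stdout);
    cudaFree(d_img);
    return 0;
}
```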

"Everyone should experience the web the way they experience the world, in 3D," said Rolf Herken, CEO and CTO of mental images. "RealityServer offers extraordinary opportunities and far-reaching implications for businesses and consumers, enabling us to interact with 3D content in the form and manner that our brains expect."

Customer Quotes
"NVIDIA RealityServer can revolutionize marketing and sales via the web by enabling the world to interact with virtual products in a more realistic and unconstrained way," said David Kelly, CEO of mydeco.com, a U.K.-based designer furnishing site. "Our interactive 3D planner, which uses RealityServer, is what attracts customers to mydeco, and the immersive, real-life experience it delivers keeps them engaged."

"Some of the biggest problems we face in aviation research involve managing and visualizing massive datasets while keeping that data secure," said Fernando Toledo, Manager, Virtual Reality Center, National Institute for Aviation Research at Wichita State University. "Using NVIDIA RealityServer for virtual prototyping, design reviews and remote visualization solves those issues. We are extremely impressed by its performance and ability to realistically render our large 3D CAD models over the web without exposing our IP."

"NVIDIA RealityServer is an integral part of our 3D Scenes application on Facebook," said Mark Zohar, CEO, SceneCaster. "This unique and powerful GPU-based Cloud computing solution offers exciting opportunities for the delivery of realistic 3D applications and services."

Product Information
The NVIDIA RealityServer platform will be available November 30, 2009. Tesla RS configurations start at eight GPUs, and scale to support increasing numbers of simultaneous users.

A developer edition of RealityServer 3.0 software will be downloadable free of charge, including the right to deploy non-commercial applications, beginning November 30, 2009.

For more information about NVIDIA RealityServer, please visit this page.

25 Comments on NVIDIA RealityServer Propels 3D Cloud Computing Using GPUs

#1
IceCreamBarr
So is this an office showing how clean it would be if they implemented cloud computing, reducing the amount of desk clutter, OR is this an example of the new 3D capabilities touted in the article? If that's a 3D CAD model, I cannot tell the difference.

Truth be told, I didn't read the entire article, so perhaps my question is answered there.

Barr
#2
El_Mayo
IceCreamBarr said: So is this an office showing how clean it would be if they implemented cloud computing, reducing the amount of desk clutter, OR is this an example of the new 3D capabilities touted in the article? If that's a 3D CAD model, I cannot tell the difference.
I think it's a 3D model.
#3
Imsochobo
3D.

Am I the only one who can see that stuff isn't real? The designs are far from a real-life experience, looks, etc.
#4
El_Mayo
Imsochobo said: 3D. Am I the only one who can see that stuff isn't real? The designs are far from a real-life experience, looks, etc.
It's rather realistic if you ask me.
#5
Steevo
Inventor of the GPU huh?
#6
AsRock
TPU addict
El_Mayo said: It's rather realistic if you ask me.
I've never seen an office so clean, lolz. Totally unrealistic to me.
#7
Benetanegia
Steevo said: Inventor of the GPU huh?
Every time something from Nvidia is posted untouched, the same thing happens. YES, they invented the GPU.


Graphics card != GPU (Graphics Processing Unit) != VPU (Video Processing Unit, Video Processor) != Graphics Accelerator

In fact:

Video Processor
    Graphics Accelerator
        GPU

Examples:

Graphics card = generic name for all cards.

Matrox Millennium = Video Processing Unit

3dfx Voodoo = 3D Graphics Accelerator

GeForce 256 = the first GPU, a term invented by Nvidia that names a very specific variation of something that already existed, just like "CUDA core", for example. It involves, among many other things, hardware transform and lighting (T&L) and full compliance with the OpenGL pipeline. Since then, all graphics vendors have reproduced the "same" architecture and adopted the term because it was widely used in the media.

An analogy for what has happened with the GPU: a pick-up is a type of truck, but imagine that in the future all trucks are pick-up trucks and are simply called pick-ups, while the term "truck" falls into disuse. Someone would then be surprised to hear that somebody "invented" the pick-up when there existed many other... erm... pick-ups (i.e. trucks) before, because the terms "truck" and "car" are no longer used and "pick-up" has taken their place for anything similar to it, including its predecessors.
#8
Steevo
I have an ISA card that has a GPU by common terminology. It is a Matrox. But wait, I had a quasi three-pixel-pushing engine with onboard RAM on a Cirrus PCI card, so then Cirrus invented the GPU. Or perhaps the first LED displays on mainframes were the first GPUs, as they were computer related. The lights on the early mainframes could be considered pixels, so perhaps they invented the first GPU.


Matrox had the first GPU.

From Wiki

Matrox's first graphics card product was the ALT-256 for S-100 bus computers, released in 1978.

From Wiki

Three people co-founded Nvidia in 1993


Nvidia invented shit, ATI invented shit. Matrox was first.


Or this.

en.wikipedia.org/wiki/Graphics_processing_unit
#9
PP Mguire
Imsochobo said: 3D. Am I the only one who can see that stuff isn't real? The designs are far from a real-life experience, looks, etc.
AsRock said: I've never seen an office so clean, lolz. Totally unrealistic to me.
Um... it's 3D. Compare that to, say, Crysis, and I'd say it looks pretty damn real. So if it's too clean, that's a technicality. I asked everybody around my screen and they all thought it was a real picture.
#10
Benetanegia
Steevo said: ... Nvidia invented shit, ATI invented shit. Matrox was first.
Again, they did not invent the GPU; they invented video cards, or video processors in any case, but never the GPU.
#11
PP Mguire
Benetanegia said: Every time something from Nvidia is posted untouched, the same thing happens. YES, they invented the GPU. ...
Benetanegia said: Again, they did not invent the GPU; they invented video cards, or video processors in any case, but never the GPU.
#12
Benetanegia
en.wikipedia.org/wiki/GeForce_256
Upon release, GeForce 256 offered industry-leading real-time 3D rendering performance. It was marketed as "the world's first 'GPU', or Graphics Processing Unit," a term Nvidia defined at the time as "a single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second."
I could debate this forever. It might look like a technicality nowadays, but before the GeForce 256 launched, GPUs didn't exist, primarily (but not exclusively) because the term GPU didn't exist to begin with. ;)

Still confused? Let me give another example. Who invented the car?
From Wikipedia, the free encyclopedia

"Car" and "Cars" redirect here. For other uses, see Car (disambiguation).

An automobile, motor car or car is a wheeled motor vehicle used for transporting passengers, which also carries its own engine or motor.
en.wikipedia.org/wiki/Automobile

History of the car (= automobile):

www.ausbcomp.com/~bbott/cars/carhist.htm

Etymology of car (AKA what car really means): en.wiktionary.org/wiki/car
Noun

car (plural cars)

1. (dated) A wheeled vehicle, drawn by a horse or other animal
2. A wheeled vehicle that moves independently, steered by a driver mostly for personal transportation; a motor car or automobile

She drove her car to the mall.

3. (rail transport, chiefly North American) An unpowered unit in a railroad train.

The conductor linked the cars to the locomotive.
...
So who invented the car?
#13
lemonadesoda
The shadowing is all wrong by the wall and under the table, and the white pillars are over-reflected in the laminate flooring. The perspective is a bit cockeyed too. The TFT screens are too small, buttonless, and have a consistent matte plastic finish all over, without cables or brand markings (front or rear).

Other than those minor complaints, the render is excellent.

HOWEVER: is this a minute-long render, or is this REAL TIME?


PS. I invented the GPU, and the GPU was ME in KINDERGARTEN in 1973. I used to get graph paper from my dad and shade the little boxes to make shapes. Pixels. I did triangles, boxes, squares as primitive shapes, and did circles, letters, shadows and shading. So what if it was a human GPU? I still invented it. I AM the first human GPU. Well actually, maybe some other kid did it 20 years earlier, who knows.
#14
Benetanegia
lemonadesoda said: PS. I invented the GPU, and the GPU was ME in KINDERGARTEN in 1973. ...
Ooooh, sorry to spoil it for you, but it seems that

en.wikipedia.org/wiki/Chameleon
en.wikipedia.org/wiki/Cuttlefish

both were already kicking around in the Jurassic or Triassic. And they were not only the first GPUs, but the first cameras and first OLED screens too. :D
#15
lemonadesoda
Bother. I guess I can't sue for royalties then? LOL

LOL... in other news... WWF charity sues nVidia for copyright infringement. WWF represents a class action on behalf of all chameleons. LOL
#16
Steevo
lemonadesoda said: PS. I invented the GPU, and the GPU was ME in KINDERGARTEN in 1973. ...
Yes, but the statement shows clearly that it can do 10 MILLION at a time!!!! WOOT WOOT WOOT 1111111111111 WAFFLES WAFFLES


Can you do that? I think not.
#17
etrigan420
Jagged angles...

They should have rendered it with an ATi card to fix that!!! :laugh:
#18
lemonadesoda
Hey, I can draw non-jagged angled lines with my crayon. And I did it first. Someone call me a lawyer. Gonna get BOTH nV and ATI now.
#19
Benetanegia
etrigan420 said: Jagged angles... They should have rendered it with an ATi card to fix that!!! :laugh:
It's because of the photons of the global illumination, IMO; flat-shaded/illuminated angles don't have jaggies. They should have used more photons, but maybe that's asking too much for interactive real-time rendering.
lemonadesoda said: Hey, I can draw non-jagged angled lines with my crayon. And I did it first. Someone call me a lawyer. Gonna get BOTH nV and ATI now.
$$$$$$$$$$$
#20
Steevo
There are jagged angles on the bottom edges of the tables, the lighting on the aluminum base of the chair is wrong, and the monitor facing away on the nearest table has jaggies on the bottom and is not casting a shadow. The white reflection of the columns on the floor is not consistent with the lighting angle. The whole image looks like it was taken at too low an ISO and has a lot of noise; it could be compression artifacts, but it just looks meh.
#21
Mussels
Freshwater Moderator
It's definitely 3D.

It's very, VERY well done - but you can tell it's fake by looking at the LCD screens.

1. No reflections/refraction from the one facing us - it looks like an LCD screen powered on with no signal (slightly fuzzy look from ambient reflections).

2. The one facing away from us has no logo or cables ;D



either

A: this is a 3D rendering

or

B: this is a mock shoot, where they just used props

Going by the press release, I say A - and hot damn, ray-traced games will be AWESOME.

P.S. - Nvidia did not invent graphics cards or 3D video cards; they did, however, coin the term GPU and steal the rights to it.
#22
FordGT90Concept
"I go fast!1!11!1!"
Mussels said: 1. No reflections/refraction from the one facing us - it looks like an LCD screen powered on with no signal (slightly fuzzy look from ambient reflections).
Matte screens don't reflect much, if anything; only those with a glossy/glass finish reflect a lot. Looking at it, it looks exactly like a matte screen.

The only thing that doesn't look right to me is the bright spot on the chair. It appears excessively blurry when, being that close, it should be sharp and clear. Also, the wheels on that chair don't look like those on your typical office chair; the wheels should be offset more from their attachment point.

If it weren't for that chair, I'd say it was taken by a camera.

Oh, and the binder close to that chair looks excessively blurry too. Maybe it was taken with a camera but they used very high JPEG compression and/or a crappy program to resize it.
#23
PP Mguire
I have an office chair just like that and the wheels aren't offset.
#24
Weer
lemonadesoda said: PS. I invented the GPU, and the GPU was ME in KINDERGARTEN in 1973. ...
You're 43?
#25
pr0n Inspector
Steevo said: Jagged angles on the bottom edges of the tables... The whole image looks like it was taken at too low an ISO and has a lot of noise; could be compression artifacts, but it just looks meh.
This is jaggy? You really should go to Flickr and see some horribly over-sharpened photos.
Also, higher ISO = more noise, not the other way around. They are trying to mimic a real photo by adding noise.