Thursday, May 22nd 2008

NVIDIA Acquires RayScale Software

PC Perspective sends word that during a meeting in San Jose, NVIDIA announced it will acquire RayScale, a ray tracing software company. An interesting acquisition indeed, but little is known at the moment, and no press release has been issued yet. RayScale started life as a spin-off from the University of Utah and currently provides interactive ray tracing and photo-realistic rendering solutions. Ray tracing is a technique for rendering three-dimensional graphics with very complex light interactions. More information about RayScale can be found on their web page.
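For the curious, the core operation in any ray tracer is testing a ray against scene geometry and keeping the nearest hit. A minimal sketch in Python of the classic ray-sphere test (purely illustrative; this is not RayScale's code):

    import math

    def ray_sphere_hit(origin, direction, center, radius):
        # Solve |origin + t*direction - center|^2 = radius^2, a quadratic in t.
        ox, oy, oz = (origin[i] - center[i] for i in range(3))
        a = sum(d * d for d in direction)
        b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
        c = ox * ox + oy * oy + oz * oz - radius * radius
        disc = b * b - 4 * a * c
        if disc < 0:
            return None                    # the ray misses the sphere
        t = (-b - math.sqrt(disc)) / (2 * a)
        return t if t > 0 else None        # nearest hit in front of the origin

    # A ray fired down -z hits a unit sphere centered 5 units away at t = 4.0:
    print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))

Repeat that test across every pixel and every light, bounce and shadow ray, and you have the "very complex light interactions" that make the technique so expensive.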
Source: PC Perspective

25 Comments on NVIDIA Acquires RayScale Software

#1
erocker
This is very interesting indeed! I've been interested in ray tracing ever since seeing some demos a while back.
Posted on Reply
#2
jbunch07
Yes, very interesting... seems NVIDIA is acquiring all kinds of stuff these days.
Posted on Reply
#3
erocker
Besides making video cards and chipsets, buying out companies is the other thing NVIDIA likes to do. ;)
Posted on Reply
#4
das müffin mann
Kinda makes you wonder what NVIDIA is planning.

Also, someone may know: isn't ray tracing much more CPU intensive, with more cores = better performance? (in regard to ray tracing)
Posted on Reply
#5
jbunch07
erocker: Besides making video cards and chipsets, buying out companies is the other thing NVIDIA likes to do. ;)
No kidding! :)

Ray tracing is pretty cool stuff, though.
Posted on Reply
#6
Triprift
This is an interesting move; wonder what plans NV has with this. :twitch:
Posted on Reply
#7
Solaris17
Super Dainty Moderator
das müffin mann: Kinda makes you wonder what NVIDIA is planning.

Also, someone may know: isn't ray tracing much more CPU intensive, with more cores = better performance? (in regard to ray tracing)
As far as I know from when I looked into it, ray trace calculations are similar to physics ones with a little modification. ;) Just a hint drop.
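To illustrate the more-cores point: each pixel's rays are independent of every other pixel's, so the work splits cleanly across however many cores you have. A toy Python sketch (trace_pixel is a stand-in, not real renderer code):

    from multiprocessing import Pool

    WIDTH, HEIGHT = 640, 480

    def trace_pixel(x, y):
        # Stand-in for a real per-pixel ray trace; the key property is
        # that it depends only on (x, y) and read-only scene data.
        return (x * y) % 256

    def render_row(y):
        return [trace_pixel(x, y) for x in range(WIDTH)]

    if __name__ == "__main__":
        with Pool() as pool:               # one worker per CPU core by default
            image = pool.map(render_row, range(HEIGHT))
        print(len(image), "rows rendered")

Double the cores and, in the ideal case, you halve the render time, which is why ray tracing scales so well with core count.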
Posted on Reply
#8
a111087
Ray tracing is the future!
Posted on Reply
#9
DarkMatter
WOW, they're taking the "fight" against Intel seriously. They're not leaving anything to chance; they're grabbing as many weapons as they can.

Anyway, ray tracing for second-level reflections is cool, but I think (I hope) NVIDIA is not going to aim at completely ray traced renderers. No game developer wants that, from what I've heard, and they are bloody slow renderers.
Posted on Reply
#10
pentastar111
That is interesting... Our current graphics solutions, if my limited knowledge is correct, use "rasterization", which I believe is actually "faking" graphics? Anyway, ray tracing creates a much more realistic image (see thumbnail), and like das müffin mann stated above, the more cores the better... Also, this move by NVIDIA would explain their reluctance to support DX 10.1? Hmmm... If things progress like they have been, tech-wise, we should be seeing games that look like moving photographs within the next 3 to 5 years, with the hardware to run such software still affordable. :D
Posted on Reply
#11
imperialreign
erocker: Besides making video cards and chipsets, cornering the market is the other thing NVIDIA likes to do. ;)
Fixed :D
Posted on Reply
#12
Rebo&Zooty
erocker: Besides making video cards and chipsets, buying out companies is the other thing NVIDIA likes to do. ;)
They're trying to be the Microsoft of the GFX and chipset world. Problem is, it sounds like Intel plans to pull their license to make Intel chipsets come the new CPU, so they may be forced to either kiss AMD's ass or find a way to get an x86 license (AFAIK they don't have one).

Since MS isn't likely to move away from x86-64 any time soon, that leaves them stuck with AMD as a CPU supplier, unless they can somehow help VIA make better CPUs. :P
Posted on Reply
#13
panchoman
Sold my stars!
I don't like this, not at all.

By eliminating smaller companies, NVIDIA is trying to reinforce its position in the ATI-NVIDIA duopoly... Instead of letting RayScale et al. enter the market and become players (good for the consumer), NVIDIA is buying it out to strengthen itself so that it can beat Intel and AMD and form a virtual monopoly. -_-
Posted on Reply
#15
DarkMatter
panchoman: I don't like this, not at all.

By eliminating smaller companies, NVIDIA is trying to reinforce its position in the ATI-NVIDIA duopoly... Instead of letting RayScale et al. enter the market and become players (good for the consumer), NVIDIA is buying it out to strengthen itself so that it can beat Intel and AMD and form a virtual monopoly. -_-
As if RayScale alone was ever going to mean anything in the GPU market...

They were not competing in the GPU business; they were small players in the small business of off-line renderers. They have nothing to do with ATI and NVIDIA's market at all. Just as with Mental Images, also owned by NVIDIA, they will probably continue with the business they are doing today, but with more money. And at the same time they will help NVIDIA introduce real-time ray tracing in games the way it has to be introduced, and not the way Intel is trying to do it, badly.
Posted on Reply
#16
graphicsHorse
Ray tracing in games has a long way to go. You'd need 10,000x the computing power to run real-time ray traced environments with global illumination, caustics, volumetric light and soft area shadows.

I think the primary reason for this acquisition is a next-gen rendering technology that will now unite Mental Ray, Gelato and RayScale.
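A quick back-of-envelope in Python shows the scale of the problem (all figures assumed purely for illustration):

    width, height = 1920, 1200     # target display resolution
    fps = 30                       # real-time frame rate
    rays_per_pixel = 50            # assumed budget for GI, caustics, shadows, AA

    primary = width * height * fps            # primary rays per second
    total = primary * rays_per_pixel          # all rays per second
    print(f"{primary / 1e6:.0f}M primary rays/s, {total / 1e9:.1f}G rays/s total")

That works out to roughly 69 million primary rays and about 3.5 billion total rays every second, before any of those effects even approach film quality.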
Posted on Reply
#17
InnocentCriminal
Resident Grammar Amender
I love how NVIDIA recently retorted to Intel's claim that ray tracing is the future by saying it's actually not, and that it's GPUs all the way. Then they go and buy a ray tracing company...

Intel: the GPU gives no benefit to non-gamers.
graphicsHorse: Ray tracing in games has a long way to go. You'd need 10,000x the computing power to run real-time ray traced environments with global illumination, caustics, volumetric light and soft area shadows.
Y'not wrong, Pixar's films can take days to render just a few minutes of a scene, for example, something with an explosion or the like.
Posted on Reply
#18
Rebo&Zooty
InnocentCriminal: I love how NVIDIA recently retorted to Intel's claim that ray tracing is the future by saying it's actually not, and that it's GPUs all the way. Then they go and buy a ray tracing company...

Intel: the GPU gives no benefit to non-gamers.

Y'not wrong, Pixar's films can take days to render just a few minutes of a scene, for example, something with an explosion or the like.
Actually, I see them moving to a mix of both: GPUs that are designed to support ray tracing as a method to render certain effects. It's quite possible they could come up with a GPU designed specifically to run ray trace software, but it wouldn't be compatible with today's games, so the most likely use will be to create hybrid cards that slowly move in that direction, using a GPU that's designed like today's GPUs for a specific type of processing.

And Intel's ray tracing thing: yeah, the future 5-10 years down the road maybe, MAYBE, but by then CPUs will not look like they do now. I honestly can't see the x86 core as it is now being around that long; a lot of the old x86 design is legacy now, rarely used other than for VERY old apps, hence AMD and Intel are both slowly removing that dedicated processing power and just running it emulated on other parts of the CPU.

I see VIA headed in the right direction: RISC is the future. Less complex CPUs that are optimized to run the software that's commonly used, but that are still capable of running other types of software and older software via emulation (sure, there's a perf hit, but most of those apps are so old or low-powered they still run better than they did when they were made).

I would love to see what the K9 would have been. All I heard about it was that it was based on a lot of concepts from the DEC Alpha (designed by the same team), but somebody in management killed the project, and as I hear it the decision had nothing to do with performance, but with the fact that it wasn't based on the K8. It was a new design using RISC instead of CISC (reduced instruction set computing vs. complex instruction set computing), and somebody in management couldn't understand that this was a good thing that would make future development of the CPU FASTER, since it was less complex, and would also make each core cost less and take up less die space, OR leave more room for new features.

Blah, this move to me is equivalent to the idiot at Intel who chose to go with NetBurst over the P6 core (Pentium Pro/P2/P3/Pentium M/Core/Core Duo/Core 2 are all based on the P6 design). Bone-headed people who can't let go of stupid thoughts.
Worst part was AMD lost their head engineer over that, because he was pissed about his project being canceled for such a stupid reason......

And NO, the AM2 chips are NOT K9. Some board makers refer to them as such, but they are NOT a K9 chip; they are a K8 with an AM2 memory controller, PERIOD.
Posted on Reply
#19
DarkMatter
Hmm, well, off-line software rasterized renderers take several minutes to render an image too, but games reproduce more than 30 in one second, at least that's the goal. Movie studios use a mix of ray traced and rasterized images; they don't use ray tracing only, as it's not cost/time effective. Software renderers are all about image quality and accuracy. Real-time renderers are not as accurate, but on the other hand GPUs are 20 times more powerful than a similarly priced CPU. If they manage to mix rasterization and ray tracing and make it run on the GPU, they could have a renderer with almost a production renderer's quality and good performance.

Also, they are not talking about making an 8800 or even GT200 do ray tracing, but the next or even two or three generations ahead. Each generation the power of cards doubles; think three generations ahead and you have a card with 8x the power of an 8800. What do you want that power for? 3000x2000 pixel resolution? It's hard to notice any difference beyond 1920x1200. 16-32x antialiasing? 4x is all you need, especially at 1920x1200 and above. Anisotropic filtering? At higher display resolutions even 16x is not indispensable with the high-resolution textures of the latest games...

In reality that power will have to go into the detail of the rendered picture. But geometry and textures with 8x the detail of today don't make sense either; 2x maybe. Now some games use 2048x2048 textures, most use 1024x1024. 4096x4096 (4x) is not even used in production graphics.

What's left? Shaders, lighting... Improving rasterizers by 8x is not possible; you would hit a wall in perception, just like the ones I mention in the second paragraph. So adding something new is the only way to go, and that includes ray tracing for reflections, massive amounts of high-quality physics, etc.
That's what NVIDIA is doing, and I love it. I hope AMD is doing the same with the same commitment, even though we haven't heard anything about it.
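As a concrete example of what ray tracing for reflections amounts to, here is the bounce-direction calculation a hybrid renderer would run only at shiny pixels (an illustrative Python sketch, not anyone's actual renderer):

    def reflect(d, n):
        # Mirror incoming direction d about unit surface normal n:
        # r = d - 2 * (d . n) * n
        k = 2.0 * sum(di * ni for di, ni in zip(d, n))
        return tuple(di - k * ni for di, ni in zip(d, n))

    # A ray hitting an upward-facing floor at 45 degrees bounces back up
    # at 45 degrees -- one extra ray per shiny pixel, instead of ray
    # tracing the whole frame:
    print(reflect((1, -1, 0), (0, 1, 0)))   # (1.0, 1.0, 0.0)

The rasterizer still shades everything else; only pixels that need the reflection pay for the extra ray.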
Posted on Reply
#20
btarunr
Editor & Senior Moderator
jbunch07: Yes, very interesting... seems NVIDIA is acquiring all kinds of stuff these days.
Agreed, and they're acquiring the right kind of stuff: small enough for an easy acquisition, useful enough to help strengthen their primary product.

They acquired ULi and ended up making a decent low-end chipset line. They acquired Ageia and now have big plans for PhysX. Now this.
Posted on Reply
#21
InnocentCriminal
Resident Grammar Amender
Rebo&Zooty: ... GPUs that are designed to support ray tracing as a method to render certain effects. It's quite possible they could come up with a GPU designed specifically to run ray trace software...
Y'first sentence said it all, and those are my thoughts exactly. Explains why they bought RayScale.
Posted on Reply
#22
magibeg
Yes, ray tracing is extremely complex, but if you guys actually checked my link you would see that you can do real-time ray tracing using OpenRT. It may not be as far away as we think. Of course, the ray tracing in the link I posted used a cluster of 20 XP1800s to run Quake 3 ray traced, but in terms of raw computing power my quad at 3.6 GHz is probably not all that far off from them. It would be logical to say that Intel's next 8-core chip may be capable of real-time ray tracing, although we would probably have to wait for 16-core systems before they could be useful. Maybe another 3 or 4 years and we could be in business.
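For a rough idea of how those core counts could pay off, Amdahl's law gives the ideal speedup; ray tracing is close to embarrassingly parallel, so assume 95% of the frame time parallelizes (an illustrative figure, not a measurement):

    def amdahl_speedup(cores, parallel_fraction):
        # Ideal speedup when only parallel_fraction of the work parallelizes.
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

    for cores in (4, 8, 16):
        print(cores, "cores ->", round(amdahl_speedup(cores, 0.95), 1), "x")

That prints about 3.5x, 5.9x and 9.1x, which is why 16-core systems look like the point where this starts being useful.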
Posted on Reply
#23
btarunr
Editor & Senior Moderator
InnocentCriminal: Y'first sentence said it all, and those are my thoughts exactly. Explains why they bought RayScale.
They could add functionality to the Tesla.
Posted on Reply
#24
neo1231
das müffin mann: Kinda makes you wonder what NVIDIA is planning.

Also, someone may know: isn't ray tracing much more CPU intensive, with more cores = better performance? (in regard to ray tracing)
That is why NVIDIA invented CUDA: to offload CPU processing onto the GPU!
Posted on Reply
#25
candle_86
Concerning the Intel/AMD issue of Intel being faster, having to deal with that explains CUDA a lot: if they can offload CPU workload to the GPU, it makes the CPU less important. I believe one reason NVIDIA did this is so they aren't dependent on one company's CPUs to truly show what their GPU can do. If they can make CUDA offload physics from the CPU, for example, like the plan is, it means that complex work is gone.
Posted on Reply