News Posts matching #SDK


Intel Hit by a Devastating Data Breach, Chip Designs, Code, Possible Backdoors Leaked

Intel on Thursday was hit by a massive data breach, with someone on Twitter posting links to an archive containing the dump of the breach - a 20-gigabyte treasure chest that includes, but is not limited to: Intel Management Engine bring-up guides, flashing tools, and samples; source code of the Consumer Electronics Firmware Development Kit (CEFDK); silicon and FSP source packages for various platforms; an assortment of development and debugging tools; Simics simulations for "Rocket Lake S" and other platforms; a wealth of roadmaps and other documents; schematics, documents, tools, and firmware for "Tiger Lake"; Intel Trace Hub and decoder files for various Intel ME versions; "Elkhart Lake" silicon reference and sample code; the Bootguard SDK; a "Snow Ridge" simulator; design schematics of various products; and more.

The most fascinating part of the leak is that the person points to the possibility of Intel laying backdoors in its code and designs - a tinfoil-hat theory, though not an implausible one in the post-9/11 world. Intel, in a comment to Tom's Hardware, denied that its security apparatus had been compromised, and instead blamed someone with access to this information for downloading the data. "We are investigating this situation. The information appears to come from the Intel Resource and Design Center, which hosts information for use by our customers, partners and other external parties who have registered for access. We believe an individual with access downloaded and shared this data," a company spokesperson said.

Matrox D1450 Graphics Card for High-Density Output Video Walls Now Shipping

Matrox is pleased to announce that the Matrox D-Series D1450 multi-display graphics card is now shipping. Purpose-built to power next-generation video walls, this new single-slot, quad-4K HDMI graphics card enables OEMs, system integrators, and AV installers to easily combine multiple D1450 boards and quickly deploy high-density-output video walls of up to 16 synchronized 4K displays. Along with a rich assortment of video wall software and developer tools for advanced custom control and application development, D1450 is ideal for a broad range of commercial and critical 24/7 applications, including control rooms, enterprise, industrial, government, military, digital signage, and broadcast settings.
Advanced capabilities

Backed by innovative technology and deep industry expertise, D1450 delivers exceptional video and graphics performance on up to four 4K HDMI monitors from a single-slot card. OEMs, system integrators, and AV professionals can easily add—and synchronize—displays by framelocking up to four D-Series cards via board-to-board framelock cables. In addition, D1450 offers HDCP support to display copy-protected content, as well as Microsoft DirectX 12 and OpenGL support to run the latest professional applications.

Dynics Announces AI-enabled Vision System Powered by NVIDIA T4 Tensor Core GPU

Dynics, Inc., a U.S.-based manufacturer of industrial-grade computer hardware, visualization software, network security, network monitoring and software-defined networking solutions, today announced the XiT4 Inference Server, which helps industrial manufacturing companies increase their yield and provide more consistent manufacturing quality.

Artificial intelligence (AI) is increasingly being integrated into modern manufacturing to improve and automate processes, including 3D vision applications. The XiT4 Inference Server, powered by NVIDIA T4 Tensor Core GPUs, is a fanless hardware platform for AI, machine learning and 3D vision applications. AI technology is allowing manufacturers to increase the efficiency and throughput of their production, while also providing more consistent quality due to higher accuracy and repeatability. Additional benefits are fewer false negatives (test escapes) and fewer false positives, which reduce downstream re-inspection needs, all leading to lower manufacturing costs.

Epic Online Services Launches with New Tools for Cross-Play and More

Epic Games today announces the launch of Epic Online Services, unlocking the ability to effortlessly scale games and unify player communities for all developers. First announced in December 2018, Epic Online Services is battle-tested, powered by the services built for Fortnite across seven major platforms (PlayStation, Xbox, Nintendo Switch, PC, Mac, iOS, and Android). Open to all developers, Epic Online Services is completely free and offers creators a single SDK to quickly and easily launch, operate, and scale their games across engines, stores, and platforms of their choice.

"At Epic, we believe in open, integrated platforms and in the future of gaming being a highly social and connected experience," said Chris Dyl, General Manager, Online Services, Epic Games. "Through Epic Online Services, we strive to help build a user-friendly ecosystem for both developers and players, where creators can benefit regardless of how they choose to build and publish their games, and where players can play games with their friends and enjoy the same quality experience regardless of the hardware they own."

Intel iGPU+dGPU Multi-Adapter Tech Shows Promise Thanks to its Realistic Goals

Intel is revisiting the concept of asymmetric multi-GPU introduced with DirectX 12. The company posted an elaborate technical slide deck it originally planned to present to game developers at the now-cancelled GDC 2020. The technology shows promise because the company isn't insulting developers' intelligence by proposing that the dormant iGPU shoulder the game's entire rendering pipeline for a single-digit percentage performance boost. Rather, it has come up with innovative augmentations to the rendering path such that only certain lightweight compute aspects of the game's rendering are passed on to the iGPU's execution units, so it makes a more meaningful contribution to overall performance. To that effect, Intel is working on an SDK that can be integrated with existing game engines.

Microsoft DirectX 12 introduced the holy grail of multi-GPU technology with its Explicit Multi-Adapter specification. This allows game engines to send rendering traffic to any combination or make of GPUs that support the API, to achieve a performance uplift over a single GPU. It was met with a lukewarm reception from AMD and NVIDIA, and far too few DirectX 12 games actually support it. Intel proposes a specialization of the explicit multi-adapter approach, in which the iGPU's execution units are made to process various low-bandwidth elements during both the rendering and post-processing stages, such as occlusion culling, AI, game physics, etc. Intel's method leverages cross-adapter shared resources sitting in system memory (main memory), and D3D12 asynchronous compute, which creates separate processing queues for rendering and compute.
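The benefit Intel is after can be sketched with a toy cost model (the millisecond figures below are illustrative assumptions, not Intel's numbers, and this is not D3D12 code): when the lightweight compute passes run concurrently on the iGPU's queue instead of serializing on the dGPU, the frame time is bounded by the longer of the two queues.

```python
# Toy model of asymmetric multi-adapter: offload low-bandwidth compute passes
# (occlusion culling, physics, AI) to an iGPU queue while the dGPU renders.
# All costs in milliseconds are made-up illustrative numbers.

DGPU_COSTS = {"geometry": 6.0, "shading": 7.0}            # stays on the dGPU
OFFLOADABLE = {"occlusion_culling": 1.2, "physics": 0.9, "ai": 0.6}
COPY_OVERHEAD_MS = 0.4  # cross-adapter shared-resource traffic via system memory

def frame_time(offload: bool) -> float:
    dgpu = sum(DGPU_COSTS.values())
    if not offload:
        # everything serializes on the dGPU's single queue
        return dgpu + sum(OFFLOADABLE.values())
    # iGPU runs the offloaded passes concurrently; the frame takes as long
    # as the slower of the two queues
    igpu = sum(OFFLOADABLE.values()) + COPY_OVERHEAD_MS
    return max(dgpu, igpu)

print(frame_time(offload=False))  # 15.7 ms: dGPU does everything
print(frame_time(offload=True))   # 13.0 ms: compute passes hidden behind rendering
```

The point of the sketch is that the gain is modest but real, and it only works because the offloaded passes are low-bandwidth enough that the cross-adapter copy overhead does not eat the savings.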

Khronos Group Releases Vulkan Ray Tracing

Today, The Khronos Group, an open consortium of industry-leading companies creating advanced interoperability standards, announces the ratification and public release of the Vulkan Ray Tracing provisional extensions, creating the industry's first open, cross-vendor, cross-platform standard for ray tracing acceleration. Primarily focused on meeting desktop market demand for both real-time and offline rendering, the release of Vulkan Ray Tracing as provisional extensions enables the developer community to provide feedback before the specifications are finalized. Comments and feedback will be collected through the Vulkan GitHub Issues Tracker and Khronos Developer Slack. Developers are also encouraged to share comments with their preferred hardware vendors. The specifications are available today on the Vulkan Registry.

Ray tracing is a rendering technique that realistically simulates how light rays intersect and interact with scene geometry, materials, and light sources to generate photorealistic imagery. It is widely used for film and other production rendering and is beginning to be practical for real-time applications and games. Vulkan Ray Tracing seamlessly integrates a coherent ray tracing framework into the Vulkan API, enabling a flexible merging of rasterization and ray tracing acceleration. Vulkan Ray Tracing is designed to be hardware agnostic and so can be accelerated on both existing GPU compute and dedicated ray tracing cores if available.
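At its core, the "simulating how light rays intersect scene geometry" that the paragraph above describes reduces to intersection tests. A minimal, self-contained sketch of the most common primitive test (ray vs. sphere, via the quadratic formula) looks like this; it is a conceptual illustration, not Vulkan API code:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t along a normalized ray to the nearest sphere
    intersection in front of the origin, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    # With a normalized direction, the quadratic's 'a' coefficient is 1
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None                     # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0    # nearer of the two intersections
    return t if t > 0 else None

# Ray from the origin along +z toward a unit sphere centered 5 units away
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

Hardware ray tracing cores and GPU compute fallbacks both accelerate exactly this kind of test (plus triangle intersections and acceleration-structure traversal), which is why the API can stay hardware-agnostic.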

NVIDIA Files for "Hopper" and "Aerial" Trademarks

In a confirmation that a future NVIDIA graphics architecture will be codenamed "Hopper," the company has trademarked the term with the US-PTO. The trademark application was filed as recently as December 4, and closely follows that of "Aerial," another trademark, which is an SDK for GPU-accelerated 5G vRANs (virtual radio-access networks). Named after eminent computer scientist Grace Hopper, the new graphics architecture by NVIDIA is reportedly among the first to use an MCM package with multiple GPU dies. It reportedly succeeds "Ampere," NVIDIA's next graphics architecture.

NVIDIA Announces New GeForce Experience Features Ahead of RTX Push

NVIDIA today announced new GeForce Experience features to be integrated and expanded in the wake of its RTX platform push. The new features include an increased number of Ansel-supporting titles (including the already-released Prey and Vampyr, as well as the upcoming Metro Exodus and Shadow of the Tomb Raider), as well as RTX-exclusive features that are being implemented into the company's gaming system companion.

There are also some features being implemented that gamers will be able to take advantage of without explicit Ansel SDK integration by the game's developer - which NVIDIA says will bring Ansel support (in some shape or form) to over 200 titles (150 more than the 50-plus titles already supported via the SDK). And capitalizing on Battlefield V's relevance to the gaming crowd, NVIDIA also announced support for Ansel and its Highlights feature for the upcoming title.

Microsoft Releases DirectX Raytracing - NVIDIA Volta-based RTX Adds Real-Time Capability

Microsoft today announced an extension to its DirectX 12 API with DirectX Raytracing, which provides components designed to make real-time ray-tracing easier to implement, and uses Compute Shaders under the hood for wide graphics card compatibility. NVIDIA feels that its "Volta" graphics architecture has enough computational power on tap to make real-time ray-tracing available to the masses. The company has hence collaborated with Microsoft to develop NVIDIA RTX technology as an interoperative part of the DirectX Raytracing (DXR) API, along with a few turnkey effects, which will be made available through the company's next-generation GameWorks SDK program, under GameWorks Ray Tracing, as a ray-tracing denoiser module for the API.

Real-time ray-tracing has long been regarded as a silver bullet for getting lifelike lighting, reflections, and shadows right. Ray-tracing is already big in the real-estate industry for showcasing photorealistic interactive renderings of property under development, but it has stayed away from gaming, which tends to be more intense, with larger scenes, more objects, and rapid camera movements. Movies with big production budgets use render farms to pre-render each ray-traced frame. Movies have hence used ray-traced visual effects for years now, since theirs is not interactive content, and studios are willing to spend vast amounts of time and money to painstakingly render each frame using hundreds of rays per pixel.

Bose Introduces the World's First Audio Augmented Reality Platform

This week at SXSW, Bose introduces Bose AR, the world's first audio augmented reality platform, and glasses to hear - a Bose AR prototype that launches the future of mobile sound. Bose also announces its SDK release schedule for developers, manufacturers, and research institutions, along with collaborations currently under way, and venture funding for related start-ups.

Unlike other augmented reality products and platforms, Bose AR doesn't change what you see, but knows what you're looking at - without an integrated lens or phone camera. And rather than superimposing visual objects on the real world, Bose AR adds an audible layer of information and experiences, making every day better, easier, more meaningful, and more productive.

Dell Partners with Meta to Sell Meta 2 Augmented Reality Development Kit

Dell today announced it will be the first authorized reseller of the Meta 2 Augmented Reality Development Kit, equipping commercial companies with the tools needed to more easily innovate and adopt new AR technology applications that can advance their business. In partnership with Meta, Dell aims to make AR more accessible for business deployment, particularly in healthcare, manufacturing and construction, by providing tools for creating immersive experiences unique to the needs of those industries.

Dell is the only technology provider with an end-to-end ecosystem to consume, create and power VR and AR. The new offering with Meta stems from Dell's VR/AR Technology Partner Program, which brings together other innovators in VR and AR to test and collaborate on the best technology solutions for varying applications and experiences. This program allows Dell to help current and potential customers better navigate the new and rapidly evolving VR/AR ecosystem, by working with partners to verify and certify the best software and hardware solutions for VR and AR applications - bringing standardization where it is needed most.

HTC Reveals Vive Focus Standalone VR Headset and Vive Wave VR Open Platform

HTC, a pioneer in innovative, smart mobile and virtual reality (VR) technologies, today held its VIVE Developer Conference 2017 (VDC2017), where it announced VIVE WAVE, a VR open platform and toolset that will open up the path to easy mobile VR content development and high-performance device optimization for third-party partners. Twelve hardware partners, namely 360QIKU, Baofengmojing, Coocaa, EmdoorVR, Idealens, iQIYI, Juhaokan, Nubia, Pico, Pimax, Quanta and Thundercomm, announced their support for the integration of Vive Wave as well as the VIVEPORT VR content platform into their future products. Vive Wave is a clear step forward in bringing together the highly fragmented mobile VR market that has grown up in China over the last several years. It saves tremendous effort by allowing developers to create content for a common platform and storefront across disparate hardware vendors. Over 35 Chinese and global content developers have already built VR content optimized for Vive Wave, with 14 showing live demos at the event. Vive also unveiled the VIVE FOCUS, its highly anticipated premium standalone VR headset for the China market, which is also based on the Vive Wave VR open platform.

Creative Launches Aurora Reactive SDK for Sound BlasterX Products

Creative Technology Ltd today announced that it would be launching the Aurora Reactive SDK. This tool would effectively convert the Aurora Reactive Lighting System found on Sound BlasterX products into an open platform, allowing developers the freedom to customize, animate and synchronize its lighting behavior. The 16.8 million color Aurora Reactive Lighting System is currently found on the Sound BlasterX Katana, Vanguard K08, Siege M04, AE-5, and Kratos S5.

The Aurora Reactive SDK is a system with APIs (Application Programming Interfaces) that allow third-party developers to program Creative's Sound BlasterX RGB-enabled hardware. The SDK will come complete with sample code, an API library, and documentation to enable even novice programmers to get started.
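The "16.8 million color" figure quoted for the Aurora Reactive system is simply the 24-bit RGB color space: 8 bits (256 levels) for each of the red, green, and blue channels. A one-line check:

```python
# 24-bit RGB: 256 levels per channel -> the "16.8 million colors" figure
levels_per_channel = 256
total_colors = levels_per_channel ** 3
print(total_colors)  # 16777216
```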

Razer Announces the Wolverine Ultimate Gamepad for PC and Xbox One

Razer, the leading global lifestyle brand for gamers, today announced the officially licensed Razer Wolverine Ultimate gaming controller for Xbox One and PC. The Razer Wolverine Ultimate was designed to adapt itself to any gamer. Two interchangeable D-pads, a range of interchangeable thumbsticks with different heights and shapes, and a total of six remappable triggers and buttons - configurable both via Razer Synapse for Xbox and on-the-fly - provide maximum customizability.

An integrated RGB lighting strip that can be controlled via Razer Synapse for Xbox adds more ways to personalize the controller and introduces Razer Chroma to Xbox gamers everywhere. Gamers can choose from 16.8 million colors and a variety of effects that include Static, Spectrum Cycling, Breathing, Wave and more. Additionally, the Razer Wolverine Ultimate will be the first console product to support the Razer Chroma SDK, allowing developers to integrate advanced lighting capabilities for Xbox One games, and console controllers for next level gaming immersion.

Razer Takes Chroma Lighting Beyond Peripherals with the Hardware Development Kit

Razer, the leading global lifestyle brand for gamers, today announced the Razer Chroma Hardware Development Kit (HDK), the world's most advanced modular lighting system for PC gamers and enthusiasts. Integrated within the Razer Chroma ecosystem, the Chroma HDK offers all-in-one color customization with precise control down to the individual LED.

Users can shape and bend the LED strips to fit virtually any surface to light up an entire room, home or office for total game immersion. The individually controllable lights are integrated into Razer Synapse 3, and are powered by Razer Chroma technology, which unlocks customizable lighting features that can be synced across devices.

NVIDIA Announces OptiX 5.0 SDK - AI-Enhanced Ray Tracing

At SIGGRAPH 2017, NVIDIA introduced the latest version of their AI-based, GPU-enabled ray-tracing OptiX API. The company has been at the forefront of GPU-powered AI endeavors in a number of areas, including facial animation, anti-aliasing, denoising, and light transport. OptiX 5.0 brings a renewed focus on AI-based denoising.

AI training is still a brute-force exercise with finesse applied at the end: NVIDIA took tens of thousands of image pairs, each pairing a render at one sample per pixel with a companion render of the same scene at 4,000 rays per pixel, and used them to train the AI to predict what a denoised image looks like. In theory (using the numbers from NVIDIA's training), this means users deploying OptiX 5.0 only need to render one sample per pixel of a given image, instead of the 4,000 rays per pixel that would be needed for its final presentation. Based on its learning, the AI will then fill in the blanks to finalize the image, saving the need to render all that extra data. NVIDIA quotes a 157x improvement in render time using a DGX Station with OptiX 5.0 deployed, against the same render on a CPU-based platform (2x E5-2699 v4 @ 2.20 GHz). The OptiX 5.0 release also includes provisions for GPU-accelerated motion blur, which should do away with the need to render a frame multiple times and then apply a blur filter through a collage of the different frames. NVIDIA said OptiX 5.0 will be available in November. Check the press release after the break.
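Why the 1-spp/4,000-spp pairing works can be seen in a toy Monte Carlo experiment (a conceptual sketch, not the OptiX denoiser): a 1-sample estimate of a pixel's value is very noisy, while a 4,000-sample estimate has essentially converged - so the pairs teach the network the mapping from noisy to clean.

```python
import random

random.seed(0)
TRUE_VALUE = 0.5  # the "converged" radiance of our toy pixel

def render(spp):
    # Monte Carlo estimate: average of spp noisy samples of the pixel
    return sum(random.uniform(0.0, 1.0) for _ in range(spp)) / spp

one_spp  = [render(1)    for _ in range(1000)]  # what the user actually renders
many_spp = [render(4000) for _ in range(10)]    # the ground-truth references

def mean_abs_error(estimates):
    return sum(abs(x - TRUE_VALUE) for x in estimates) / len(estimates)

print(mean_abs_error(one_spp))   # large: a 1-spp image is visibly noisy
print(mean_abs_error(many_spp))  # tiny: 4000 spp has essentially converged
```

The denoiser's job is to bridge that error gap from image context alone, which is where the claimed render-time savings come from.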

NVIDIA Releases VRWorks Audio and 360 Video SDKs at GTC

Further planting its roots in the VR SDK and development field, NVIDIA has just announced the availability of two more SDK packages, for its VRWorks Audio and 360 Video suites. Now part of NVIDIA's VRWorks suite of VR solutions, the VRWorks Audio SDK provides real-time ray tracing of audio in virtual environments, and is supported in Epic's Unreal Engine 4 (here's hoping this solution, or others like it, address the problems of today's game audio). The VRWorks 360 Video SDK, on the other hand, may be less interesting for graphics enthusiasts, in that it addresses the complex challenge of real-time video stitching.

Traditional VR audio (and gaming audio, for that matter) provides an accurate 3D position of the audio source within a virtual environment. However, as it is handled today, sound is processed with little regard for anything but the location of the source. With VRWorks Audio, NVIDIA brings to the table the dimensions and material properties of the physical environment, helping create a truly immersive experience by modeling sound-propagation phenomena such as reflection, refraction and diffraction, in real time, at the GPU level. This work leverages NVIDIA's OptiX ray-tracing technology, which allows VRWorks Audio to trace the path of sound in real time, delivering physically accurate audio that reflects the size, shape and material properties of the virtual environment.
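The simplest form of the propagation modeling described above is a first-order reflection computed with the classic image-source method: mirror the sound source across a surface, and the straight line from the mirrored source to the listener gives the length of the bounced path. This is a toy 2D sketch of the geometry (not VRWorks code); the absorption coefficient is an assumed material property.

```python
import math

def image_source_reflection(src, listener, wall_y):
    """Length of the source -> wall -> listener path for a reflection off a
    horizontal wall at y = wall_y, via the image-source method (2D)."""
    mirrored = (src[0], 2 * wall_y - src[1])   # mirror the source across the wall
    return math.dist(mirrored, listener)

def received_amplitude(path_len, absorption):
    # inverse-distance amplitude falloff, scaled by how much the wall reflects
    return (1.0 / path_len) * (1.0 - absorption)

direct = math.dist((0, 1), (4, 1))                        # direct path: 4.0 m
reflected = image_source_reflection((0, 1), (4, 1), 0.0)  # bounce off floor at y=0
print(round(reflected, 3))                                # 4.472
print(received_amplitude(reflected, absorption=0.3))      # quieter than the direct path
```

A real engine traces many such paths (plus refraction and diffraction) per frame, which is why mapping the problem onto a GPU ray tracer like OptiX is a natural fit.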

NVIDIA Announces Public Ansel SDK, Developer Plugins

NVIDIA's Ansel, a framework for real-time screenshot filters and photographic effects, has seen the release of a public SDK and a few developer plugins to boot. Unreal Engine and Unity have both gained plugins for the technology, and it is reportedly coming to Amazon's Lumberyard engine as well. This should most assuredly aid adoption of the technology, as well as open it up to new markets where it was previously unavailable, such as indie game development. The public SDK is presently available for download from NVIDIA directly at developer.nvidia.com/ansel.

NVIDIA Announces DX12 Gameworks Support

NVIDIA has announced DX12 support for its proprietary GameWorks SDK, including some new exclusive effects such as "Flex" and "Flow." Most interestingly, NVIDIA claims that simulation effects get a massive boost from Async Compute, nearly doubling performance on a GTX 1080 for that style of effects. Obviously, Async Compute is a DX12-exclusive technology. The performance gains in an area where NVIDIA is normally perceived not to do so well are indeed encouraging, even if only within its exclusive ecosystem. Whether GCN-powered cards will see similar gains when running GameWorks titles remains to be seen.

Shadowplay Now Automagically Records Your Greatest Moments

NVIDIA has announced a new SDK for its products known as Shadowplay Highlights. Shadowplay Highlights augments the existing game-recording technology of NVIDIA Shadowplay to automatically capture hot moments in your favorite videogame. Whether it's your latest triple kill or a particularly daring jump on the race track, if the game engine tells the SDK a moment is significant, Shadowplay spins up, combining previously recorded gameplay with live recordings to create a perfect video of your glory moment. You can then edit the footage from within the game and directly upload it to a number of social networks.
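The "combining previously recorded gameplay with live recordings" step is, conceptually, a ring buffer that always holds the last few seconds of frames, spliced with a live tail when the engine fires a highlight event. A toy sketch of that idea (hypothetical class, not the Shadowplay Highlights API):

```python
from collections import deque

class HighlightRecorder:
    """Toy model: retain the last N frames in a ring buffer; on a highlight
    event, splice the buffered frames with frames recorded live afterward."""

    def __init__(self, buffer_frames=3):
        self.ring = deque(maxlen=buffer_frames)  # old frames silently fall out
        self.highlights = []

    def on_frame(self, frame):
        self.ring.append(frame)

    def on_highlight(self, live_frames):
        # previously recorded gameplay + live recording = the finished clip
        self.highlights.append(list(self.ring) + list(live_frames))

rec = HighlightRecorder(buffer_frames=3)
for f in ["f1", "f2", "f3", "f4"]:   # "f1" falls out of the 3-frame buffer
    rec.on_frame(f)
rec.on_highlight(["f5", "f6"])       # game engine flags, say, a triple kill
print(rec.highlights[0])             # ['f2', 'f3', 'f4', 'f5', 'f6']
```

The buffer-size choice is exactly the quality-versus-disk-space trade-off the announcement mentions.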

The technology includes many options for trading quality against disk space, and anything in between. Of course, as with all things Shadowplay, the technology will require a GeForce-branded graphics card, as well as support from game developers. A video demonstrating the technology follows after the break.

IBM and NVIDIA Team Up on World's Fastest Deep Learning Enterprise Solution

IBM and NVIDIA today announced collaboration on a new deep learning tool optimized for the latest IBM and NVIDIA technologies to help train computers to think and learn in more human-like ways at a faster pace. Deep learning is a fast growing machine learning method that extracts information by crunching through millions of pieces of data to detect and rank the most important aspects from the data. Publicly supported among leading consumer web and mobile application companies, deep learning is quickly being adopted by more traditional business enterprises.

Deep learning and other artificial intelligence capabilities are being used across a wide range of industry sectors; in banking to advance fraud detection through facial recognition; in automotive for self-driving automobiles and in retail for fully automated call centers with computers that can better understand speech and answer questions.

NVIDIA Releases VRWorks SDK Update for "Pascal"

NVIDIA today released a major update to its VRWorks SDK that enables game developers to implement new VR features introduced by the GeForce "Pascal" graphics processors, taking advantage of the new Simultaneous Multi-Projection engine (SMP). The two major features introduced are Lens-Matched Shading and Single-Pass Stereo.

Lens-Matched Shading uses SMP to provide substantial performance improvements in pixel shading. The feature improves upon Multi-Res Shading by rendering to a surface that more closely approximates the lens-corrected image that is output to the headset display. This avoids the performance cost of rendering many pixels that are discarded during the VR lens-warp post-process. Single-Pass Stereo, on the other hand, removes the need for a GPU to render the geometry and tessellation of a 3D scene twice (once for each eye/viewport), and lets both viewports share one pass of geometry and tessellation, thereby halving the tessellation and vertex-shading workload.
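The arithmetic behind Single-Pass Stereo's saving can be sketched with a toy frame-cost model (illustrative millisecond figures, assumed for the example): pixel shading still happens once per eye, but the geometry/tessellation pass runs once instead of twice.

```python
def frame_cost(geometry_ms, pixel_ms_per_eye, single_pass_stereo):
    # Both eyes always need their own pixel shading; Single-Pass Stereo
    # shares one geometry/tessellation pass between the two viewports.
    geometry_passes = 1 if single_pass_stereo else 2
    return geometry_passes * geometry_ms + 2 * pixel_ms_per_eye

print(frame_cost(4.0, 3.0, single_pass_stereo=False))  # 14.0 ms: geometry twice
print(frame_cost(4.0, 3.0, single_pass_stereo=True))   # 10.0 ms: geometry once
```

Note that only the geometry term halves; the overall frame-time gain depends on how geometry-heavy the scene is.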

Dell Announces VR-Ready Precision Workstations

Dell today announced new Virtual Reality-ready solutions that feature refined criteria for optimal VR experience, whether consuming or creating VR content. Dell has defined VR-ready solutions by three criteria:
  • Minimum CPU, memory, and graphics requirements to support optimal VR viewing experiences;
  • Graphics drivers that are qualified to work reliably with these solutions; and,
  • Passing performance tests conducted by Dell using test criteria based on HMD (head-mounted display) suppliers, ISVs or 3rd party benchmarks where available.
Working closely with its hardware and software partners, Dell is formalizing its commitment to the future of VR by delivering solutions that are optimized for VR consumption and creation alongside ISV applications for professional customers.

Oculus to Begin Taking Pre-orders for the Oculus Rift CV1 on January 6

Oculus, makers of the popular Oculus Rift VR HMD, announced that it will open the gates for pre-orders for its upcoming Rift CV1 HMD on the 6th of January, 2016, at 08:00 Pacific Time. You'll be able to take it for a spin right out of the box, on the bundled games Lucky's Tale, and EVE: Valkyrie, two games built almost entirely around VR, by leveraging the Oculus SDK.

2016 is shaping up to be the year VR takes off on a big scale, with consumer electronics giants planning to launch their VR headsets, game developers building their games around major VR SDKs, and graphics hardware companies like AMD and NVIDIA making major moves in the VR industry. AMD is sitting on a treasure chest of IP with its LiquidVR technology, while NVIDIA recently announced a VR-ready certification program.

AMD Counters GameWorks with GPUOpen, Leverages Open-Source

AMD is in no mood to let NVIDIA run away with the PC graphics market with its GameWorks SDK, which speeds up PC graphics development (in turn increasing NVIDIA's influence over game development) in a client base predominantly driven by AMD GCN (20% PC graphics market share, and 100% game console market share). AMD's counter to GameWorks is GPUOpen, with the "open" referring to "open-source."

GPUOpen is a vast set of pre-developed visual-effects, tools, libraries, and SDKs, designed to give developers "unprecedented control" over the GPU, helping them get their software closer to the metal than any other software can. The idea here is that an NVIDIA GameWorks designed title won't get you as "close" to the metal on machines such as the Xbox One and PlayStation 4, or PCs with Radeon GPUs, as GPUOpen. Getting "close to the metal" is defined as directly leveraging features exposed by the GPU, with as few software layers between the app and the hardware as possible.