News Posts matching #SDK

Imagination Launches RISC-V CPU Family

Imagination Technologies announces Catapult, a RISC-V CPU product line designed from the ground up for next-generation heterogeneous compute needs. Based on RISC-V, the open instruction set architecture that is transforming processor design, Imagination's Catapult CPUs can be configured for performance, efficiency, or balanced profiles, making them suitable for a wide range of markets.

Leveraging Imagination's 20 years of experience in delivering complex IP solutions, the new CPUs are supported by the rapidly expanding open-standard RISC-V ecosystem, which continues to shake up the embedded CPU industry by offering greater choice. Imagination's entry adds a greater range of product offerings to that ecosystem, especially for heterogeneous systems. Customers now have an even wider choice of solutions built on the open RISC-V ISA, avoiding lock-in with proprietary architectures.

NVIDIA Announces Updated Open-Source Image Scaling SDK

For the past two years, NVIDIA has offered NVIDIA Image Scaling and Sharpening, a driver-based spatial upscaler for all games that doesn't require game or SDK integration to work. With the new November GeForce Game Ready Driver, we have improved the scaling and sharpening algorithm, which now uses a 6-tap filter with 4 directional scaling and adaptive sharpening filters to boost performance. We have also added an in-game sharpness slider, accessible via GeForce Experience, so you can adjust sharpness in real time.
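
NVIDIA hasn't published the new kernels themselves, but the general idea of directional scaling is straightforward: estimate the local edge orientation, then interpolate with a multi-tap filter along the edge rather than across it. Below is a minimal, illustrative C++ sketch of that concept - the tap weights and direction-selection heuristic are our own stand-ins, not NVIDIA's shipping filter.

```cpp
// Illustrative sketch of directional upscaling: NOT NVIDIA's actual NIS kernels.
// For each output sample we estimate local edge orientation from gradients, then
// interpolate with a 6-tap filter along the direction that follows the edge,
// avoiding the stair-stepping of plain bilinear scaling.
#include <algorithm>
#include <cmath>
#include <vector>

struct Image {
    int w, h;
    std::vector<float> px; // grayscale, row-major
    float at(int x, int y) const {
        x = std::clamp(x, 0, w - 1);
        y = std::clamp(y, 0, h - 1);
        return px[y * w + x];
    }
};

// A generic 6-tap interpolation kernel (windowed-sinc weights chosen purely
// for illustration; the shipping filter's taps are not public).
static float tap6(const float s[6], float t) {
    auto sinc = [](float x) {
        return x == 0.f ? 1.f : std::sin(3.14159265f * x) / (3.14159265f * x);
    };
    float sum = 0.f, wsum = 0.f;
    for (int i = 0; i < 6; ++i) {
        float d = t - (i - 2);               // taps at offsets -2..+3
        float w = sinc(d) * sinc(d / 3.f);   // Lanczos-3 style window
        sum += w * s[i];
        wsum += w;
    }
    return sum / wsum;
}

float sampleDirectional(const Image& img, float fx, float fy) {
    int x = (int)fx, y = (int)fy;
    // Estimate the gradient to pick one of 4 scaling directions.
    float gx = img.at(x + 1, y) - img.at(x - 1, y);
    float gy = img.at(x, y + 1) - img.at(x, y - 1);
    // Candidate steps: horizontal, vertical, and the two diagonals.
    const int dirs[4][2] = { {1, 0}, {0, 1}, {1, 1}, {1, -1} };
    // Interpolate along the direction that follows the edge, i.e. roughly
    // perpendicular to the gradient.
    int best = (std::fabs(gx) > std::fabs(gy)) ? 1 : 0;
    if (std::fabs(std::fabs(gx) - std::fabs(gy)) < 0.05f)
        best = (gx * gy < 0) ? 2 : 3;        // near-diagonal edge
    float s[6];
    for (int i = 0; i < 6; ++i)
        s[i] = img.at(x + (i - 2) * dirs[best][0], y + (i - 2) * dirs[best][1]);
    // Fractional offset; for diagonals we reuse the x fraction for simplicity.
    return tap6(s, (best == 1) ? fy - y : fx - x);
}
```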

In contrast to NVIDIA DLSS, the algorithm is non-AI and non-temporal, using only the current low-resolution image rendered by the game as input. While the resulting image quality is best-in-class compared to scaling offered by monitors or other in-game scaling techniques, it lacks the temporal data and AI smarts of DLSS, which are required to deliver native-resolution detail and robust frame-to-frame stability. By combining NVIDIA DLSS and NVIDIA Image Scaling, developers get the best of both worlds: NVIDIA DLSS for the best image quality, and NVIDIA Image Scaling for cross-platform support.

AMD Enables FidelityFX Suite on Xbox Series X|S

AMD has announced that Microsoft's Xbox Series X|S now features support for the company's FidelityFX suite. This move, which enables previously PC-centric technologies on Microsoft's latest-generation gaming consoles, brings feature parity with RDNA 2-powered PC graphics and will eventually enable support for FSR (FidelityFX Super Resolution), the company's upcoming competitor to NVIDIA's DLSS tech.

This means that besides the technologies that are part of the DX 12 Ultimate spec (which the consoles already obviously support), developers now have access to AMD's FidelityFX technologies such as Contrast Adaptive Sharpening, Variable Rate Shading, a ray-traced shadow denoiser, Ambient Occlusion, and Screen Space Reflections. All of these AMD-led developments in the SDK allow for higher performance and/or better visual fidelity. The icing on the cake, however, should be FSR support, which could bring the Series X's 8K claims to bear (alongside high-refresh-rate 4K gaming) - should FSR turn out to be in a similar performance-enhancing ballpark as NVIDIA's DLSS, which we can't really know for sure at this stage (and likely neither can AMD). No word on FidelityFX support for the PS5 has been announced at this time, which raises the question of whether it will eventually be supported, or whether Sony will enable a similar feature via its own development tools.

NVIDIA Extends Data Center Infrastructure Processing Roadmap with BlueField-3 DPU

NVIDIA today announced the NVIDIA BlueField-3 DPU, its next-generation data processing unit, delivering the most powerful software-defined networking, storage, and cybersecurity acceleration capabilities available for data centers.

The first DPU built for AI and accelerated computing, BlueField-3 lets every enterprise deliver applications at any scale with industry-leading performance and data center security. It is optimized for multi-tenant, cloud-native environments, offering software-defined, hardware-accelerated networking, storage, security and management services at data-center scale.

Get Re-Connected With the Latest Version of EK-Loop Connect Software and Sensors

EK, the leading computer cooling solutions provider, is announcing the release of EK-Loop Connect software version 1.3.11. As a company dedicated to providing water cooling enthusiasts and PC builders with the best the market has to offer, and after listening to customer feedback several months ago, EK undertook a rework of the EK-Loop Connect software. Today, the new version of the software is officially ready and available for download.

"We were fortunate enough to have onboarded a team of software developers who were dedicated to the cause and managed to deliver functional software within the set deadline. We promised to deliver the software within 3 months, and here we are. The latest version of the EK-Loop Connect software was tested and confirmed by a small group of beta testers, and today we are ready to present it to the public," said Edvard König, founder of EK.

NVIDIA Reflex Feature Detailed: Vastly Reduces Input Latency, Measures End-to-End System Latency

NVIDIA Reflex is a new innovation designed to minimize input latency in competitive e-sports games. When it comes out later this month with patches to popular e-sports titles such as Fortnite, Apex Legends, and Valorant, along with a GeForce driver update, the feature could improve input latency even without any specialized hardware. Input latency is the time it takes for a user input (such as a mouse click) in a game to be reflected as output on the screen - the time it takes for your mouse click to register as a gunshot in an online shooter and appear on-screen. The feature is compatible with any NVIDIA GeForce GPU from the GTX 900 series onward.

NVIDIA briefly detailed how this works. On the software side, the NVIDIA driver cooperates with a compatible game engine to optimize the game's 3D rendering pipeline. This is accomplished by dynamically reducing the render queue, so fewer frames are queued up for the GPU to render. NVIDIA claims that the technology can also keep the GPU perfectly in sync with the CPU (a 1:1 render queue), reducing the "back-pressure" on the GPU and letting the game sample mouse input at the last possible moment. NVIDIA is releasing Reflex to gamers as GeForce driver updates, and to game developers as the Reflex SDK, which lets them integrate the technology with their game engine, provide a toggle for it, and surface in-game latency metrics.
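
The Reflex SDK's internals aren't public, but the "1:1 render queue" concept is easy to illustrate: cap how many frames the CPU may queue ahead of the GPU, and sample input only once a slot frees up. A conceptual C++ sketch follows - the function names are hypothetical placeholders for an engine's own code, not the Reflex API:

```cpp
// Conceptual sketch of a reduced render queue: NOT the Reflex SDK API.
// A counting semaphore caps how many frames the CPU may queue ahead of the
// GPU; with a cap of 1, the CPU blocks until the GPU drains the previous
// frame, so input can be sampled at the last possible moment.
#include <semaphore>

std::counting_semaphore<8> framesInFlight{1}; // 1 = low latency, larger = more buffering

// Hypothetical placeholders for an engine's own functions.
void sampleInput()         { /* poll mouse and keyboard here */ }
void buildAndSubmitFrame() { /* record and submit GPU command buffers */ }
void onGpuFrameComplete()  { framesInFlight.release(); } // called from a GPU fence

void renderLoop(const bool& running) {
    while (running) {
        framesInFlight.acquire(); // block while the GPU is busy: no back-pressure
        sampleInput();            // latest possible input sample
        buildAndSubmitFrame();    // the fence later releases the slot
    }
}
```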

IBM Delivers Its Highest Quantum Volume to Date

Today, IBM unveiled a new milestone on its quantum computing road map, achieving the company's highest Quantum Volume to date. Combining a series of new software and hardware techniques to improve overall performance, IBM has upgraded one of its newest 27-qubit client-deployed systems to achieve a Quantum Volume of 64. The company has made a total of 28 quantum computers available over the last four years through the IBM Quantum Experience.

Achieving Quantum Advantage, the point where certain information-processing tasks can be performed more efficiently or cost-effectively on a quantum computer than on a classical one, will require improved quantum circuits, the building blocks of quantum applications. Quantum Volume measures the length and complexity of circuits - the higher the Quantum Volume, the higher the potential for exploring solutions to real-world problems across industry, government, and research.
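
For reference, the metric is defined so that its logarithm counts qubits and circuit depth together. Roughly, per IBM's definition:

```latex
% Quantum Volume (simplified): the largest "square" model circuit,
% n qubits wide and n layers deep, that the machine runs successfully
% sets log2(QV) = n. A Quantum Volume of 64 therefore corresponds to
% reliably running 6-qubit, depth-6 circuits:
\[
  \mathrm{QV} = 2^{n}, \qquad 64 = 2^{6} \;\Rightarrow\; n = 6 .
\]
```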

To achieve this milestone, the company focused on a new set of techniques and improvements that used knowledge of the hardware to optimally run the Quantum Volume circuits. These hardware-aware methods are extensible and will improve any quantum circuit run on any IBM Quantum system, resulting in improvements to the experiments and applications which users can explore. These techniques will be available in upcoming releases and improvements to the IBM Cloud software services and the cross-platform open source software development kit (SDK) Qiskit.

Intel Hit by a Devastating Data Breach, Chip Designs, Code, Possible Backdoors Leaked

Intel on Thursday was hit by a massive data breach, with someone on Twitter posting links to an archive that contains the dump of the breach - a 20-gigabyte treasure chest that includes, but is not limited to: Intel Management Engine bring-up guides, flashing tools, and samples; source code of the Consumer Electronics Firmware Development Kit (CEFDK); silicon and FSP source packages for various platforms; an assortment of development and debugging tools; Simics simulation for "Rocket Lake S" and other platforms; a wealth of roadmaps and other documents; schematics, documents, tools, and firmware for "Tiger Lake"; Intel Trace Hub + decoder files for various Intel ME versions; "Elkhart Lake" silicon reference and sample code; the Bootguard SDK; a "Snow Ridge" simulator; design schematics of various products; and more.

The most fascinating part of the leak is that the person points to the possibility of Intel laying backdoors in its code and designs - a tinfoil-hat yet not implausible notion in the post-9/11 world. Intel, in a comment to Tom's Hardware, denied that its security apparatus had been compromised, and instead blamed someone with access to this information for downloading the data. "We are investigating this situation. The information appears to come from the Intel Resource and Design Center, which hosts information for use by our customers, partners and other external parties who have registered for access. We believe an individual with access downloaded and shared this data," a company spokesperson said.

Matrox D1450 Graphics Card for High-Density Output Video Walls Now Shipping

Matrox is pleased to announce that the Matrox D-Series D1450 multi-display graphics card is now shipping. Purpose-built to power next-generation video walls, this new single-slot, quad-4K HDMI graphics card enables OEMs, system integrators, and AV installers to easily combine multiple D1450 boards to quickly deploy high-density-output video walls of up to 16 synchronized 4K displays. Along with a rich assortment of video wall software and developer tools for advanced custom control and application development, D1450 is ideal for a broad range of commercial and critical 24/7 applications, including control rooms, enterprises, industries, government, military, digital signage, broadcast, and more.
Advanced capabilities

Backed by innovative technology and deep industry expertise, D1450 delivers exceptional video and graphics performance on up to four 4K HDMI monitors from a single-slot card. OEMs, system integrators, and AV professionals can easily add—and synchronize—displays by framelocking up to four D-Series cards via board-to-board framelock cables. In addition, D1450 offers HDCP support to display copy-protected content, as well as Microsoft DirectX 12 and OpenGL support to run the latest professional applications.

Dynics Announces AI-enabled Vision System Powered by NVIDIA T4 Tensor Core GPU

Dynics, Inc., a U.S.-based manufacturer of industrial-grade computer hardware, visualization software, network security, network monitoring and software-defined networking solutions, today announced the XiT4 Inference Server, which helps industrial manufacturing companies increase their yield and provide more consistent manufacturing quality.

Artificial intelligence (AI) is increasingly being integrated into modern manufacturing to improve and automate processes, including 3D vision applications. The XiT4 Inference Server, powered by NVIDIA T4 Tensor Core GPUs, is a fanless hardware platform for AI, machine learning, and 3D vision applications. AI technology is allowing manufacturers to increase the efficiency and throughput of their production while also providing more consistent quality thanks to higher accuracy and repeatability. Additional benefits are fewer false negatives (test escapes) and fewer false positives, which reduce downstream re-inspection needs, all leading to lower manufacturing costs.

Epic Online Services Launches with New Tools for Cross-Play and More

Epic Games today announces the launch of Epic Online Services, unlocking the ability to effortlessly scale games and unify player communities for all developers. First announced in December 2018, Epic Online Services is battle-tested, powered by the services built for Fortnite across seven major platforms (PlayStation, Xbox, Nintendo Switch, PC, Mac, iOS, and Android). Open to all developers, Epic Online Services is completely free and offers creators a single SDK to quickly and easily launch, operate, and scale their games across the engines, stores, and platforms of their choice.
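
The SDK itself is a C API. A minimal sketch of bootstrapping it might look like the following - the field names follow the EOS SDK headers as we recall them, so verify against eos_init.h and eos_sdk.h in the actual SDK drop, and note that every credential string below is a placeholder:

```cpp
// Minimal sketch of bootstrapping Epic Online Services via its C SDK.
// Field names are as we recall the EOS headers; treat them as assumptions
// and check your SDK version. All credential strings are placeholders.
#include "eos_sdk.h" // EOS C SDK umbrella header

bool initEOS(EOS_HPlatform& outPlatform) {
    EOS_InitializeOptions initOpts = {};
    initOpts.ApiVersion = EOS_INITIALIZE_API_LATEST;
    initOpts.ProductName = "MyGame";   // placeholder
    initOpts.ProductVersion = "1.0";   // placeholder
    if (EOS_Initialize(&initOpts) != EOS_EResult::EOS_Success)
        return false;

    EOS_Platform_ClientCredentials creds = {};
    creds.ClientId = "...";            // placeholder: from the Dev Portal
    creds.ClientSecret = "...";        // placeholder

    EOS_Platform_Options platOpts = {};
    platOpts.ApiVersion = EOS_PLATFORM_OPTIONS_API_LATEST;
    platOpts.ProductId = "...";        // placeholder
    platOpts.SandboxId = "...";        // placeholder
    platOpts.DeploymentId = "...";     // placeholder
    platOpts.ClientCredentials = creds;

    outPlatform = EOS_Platform_Create(&platOpts);
    return outPlatform != nullptr;
}

// Per frame, the platform must be ticked so callbacks fire:
//   EOS_Platform_Tick(platform);
```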

"At Epic, we believe in open, integrated platforms and in the future of gaming being a highly social and connected experience," said Chris Dyl, General Manager, Online Services, Epic Games. "Through Epic Online Services, we strive to help build a user-friendly ecosystem for both developers and players, where creators can benefit regardless of how they choose to build and publish their games, and where players can play games with their friends and enjoy the same quality experience regardless of the hardware they own."

Intel iGPU+dGPU Multi-Adapter Tech Shows Promise Thanks to Its Realistic Goals

Intel is revisiting the concept of asymmetric multi-GPU introduced with DirectX 12. The company posted an elaborate technical slide-deck it originally planned to present to game developers at the now-cancelled GDC 2020. The technology shows promise because the company isn't insulting developers' intelligence by proposing that the dormant iGPU be made to shoulder the game's entire rendering pipeline for a single-digit percentage performance boost. Rather, it has come up with innovative augmentations to the rendering path such that only certain lightweight compute aspects of the game's rendering are passed on to the iGPU's execution units, so it makes a more meaningful contribution to overall performance. To that effect, Intel is working on an SDK that can be integrated with existing game engines.

Microsoft DirectX 12 introduced the holy grail of multi-GPU technology under its Explicit Multi-Adapter specification. This allows game engines to send rendering traffic to any combination or make of GPUs that support the API, to achieve a performance uplift over a single GPU. It was met with a lukewarm reception from AMD and NVIDIA, and far too few DirectX 12 games actually support it. Intel proposes a specialization of the explicit multi-adapter approach, in which the iGPU's execution units are made to process various low-bandwidth elements during both the rendering and post-processing stages, such as occlusion culling, AI, and game physics. Intel's method leverages cross-adapter shared resources sitting in system memory (main memory), and D3D12 asynchronous compute, which creates separate processing queues for rendering and compute.
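
The D3D12 plumbing Intel's proposal leans on already exists. A minimal C++ sketch of the two building blocks - a queue on each adapter plus a cross-adapter shared heap - might look like this (error handling omitted; the heap size and feature levels are illustrative):

```cpp
// Sketch of the D3D12 building blocks for explicit multi-adapter:
// one queue per device plus a cross-adapter shared heap (error checks omitted).
#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void setupCrossAdapter(IDXGIAdapter1* dgpu, IDXGIAdapter1* igpu) {
    ComPtr<ID3D12Device> devD, devI;
    D3D12CreateDevice(dgpu, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&devD));
    D3D12CreateDevice(igpu, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&devI));

    // Direct (graphics) queue on the dGPU, async-compute queue on the iGPU.
    D3D12_COMMAND_QUEUE_DESC qd = {};
    qd.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> renderQueue;
    devD->CreateCommandQueue(&qd, IID_PPV_ARGS(&renderQueue));
    qd.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    devI->CreateCommandQueue(&qd, IID_PPV_ARGS(&computeQueue));

    // Cross-adapter shared heap, created on the dGPU...
    D3D12_HEAP_DESC hd = {};
    hd.SizeInBytes = 64 * 1024 * 1024; // illustrative size
    hd.Properties.Type = D3D12_HEAP_TYPE_DEFAULT;
    hd.Flags = D3D12_HEAP_FLAG_SHARED | D3D12_HEAP_FLAG_SHARED_CROSS_ADAPTER;
    ComPtr<ID3D12Heap> heapD;
    devD->CreateHeap(&hd, IID_PPV_ARGS(&heapD));

    // ...then opened on the iGPU via a shared handle, so both GPUs see the
    // same memory for passing intermediate results back and forth.
    HANDLE shared = nullptr;
    devD->CreateSharedHandle(heapD.Get(), nullptr, GENERIC_ALL, nullptr, &shared);
    ComPtr<ID3D12Heap> heapI;
    devI->OpenSharedHandle(shared, IID_PPV_ARGS(&heapI));
    CloseHandle(shared);
}
```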

Khronos Group Releases Vulkan Ray Tracing

Today, The Khronos Group, an open consortium of industry-leading companies creating advanced interoperability standards, announces the ratification and public release of the Vulkan Ray Tracing provisional extensions, creating the industry's first open, cross-vendor, cross-platform standard for ray tracing acceleration. Primarily focused on meeting desktop market demand for both real-time and offline rendering, the release of Vulkan Ray Tracing as provisional extensions enables the developer community to provide feedback before the specifications are finalized. Comments and feedback will be collected through the Vulkan GitHub Issues Tracker and Khronos Developer Slack. Developers are also encouraged to share comments with their preferred hardware vendors. The specifications are available today on the Vulkan Registry.

Ray tracing is a rendering technique that realistically simulates how light rays intersect and interact with scene geometry, materials, and light sources to generate photorealistic imagery. It is widely used for film and other production rendering and is beginning to become practical for real-time applications and games. Vulkan Ray Tracing seamlessly integrates a coherent ray tracing framework into the Vulkan API, enabling a flexible merging of rasterization and ray tracing acceleration. Vulkan Ray Tracing is designed to be hardware-agnostic, so it can be accelerated both on existing GPU compute and on dedicated ray tracing cores where available.
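
Developers who want to experiment can probe a device for the provisional support in the usual way. A quick C++ check follows (assuming an already-enumerated VkPhysicalDevice; note the provisional extension name may change before final ratification):

```cpp
// Check whether a physical device exposes the provisional Vulkan Ray Tracing
// extension. The name matches the provisional release ("VK_KHR_ray_tracing");
// it may be split or renamed once the specification is finalized.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool supportsRayTracing(VkPhysicalDevice gpu) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());
    for (const auto& e : exts)
        if (std::strcmp(e.extensionName, "VK_KHR_ray_tracing") == 0)
            return true;
    return false;
}
```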

NVIDIA Files for "Hopper" and "Aerial" Trademarks

In a confirmation that a future NVIDIA graphics architecture will be codenamed "Hopper," the company has trademarked the term with the USPTO. The trademark application was filed as recently as December 4, and closely follows that of "Aerial," another trademark, this one covering an SDK for GPU-accelerated 5G vRANs (virtual radio access networks). Named after eminent computer scientist Grace Hopper, the new graphics architecture by NVIDIA reportedly features one of the first GPU MCMs (a package with multiple GPU dies). It reportedly succeeds "Ampere," NVIDIA's next graphics architecture.

NVIDIA Announces New GeForce Experience Features Ahead of RTX Push

NVIDIA today announced new GeForce Experience features to be integrated and expanded in the wake of its RTX platform push. These include an increased number of Ansel-supporting titles (including the already-released Prey and Vampyr, as well as the upcoming Metro Exodus and Shadow of the Tomb Raider), along with RTX-exclusive features that are being implemented into the company's gaming companion application.

There are also some features being implemented that gamers will be able to take advantage of without explicit Ansel SDK integration by the game's developer - which NVIDIA says will bring Ansel support (in some shape or form) to over 200 titles (150 more than the over 50 titles already supported via the SDK). And capitalizing on Battlefield V's relevance to the gaming crowd, NVIDIA also announced support for Ansel and its Highlights feature in the upcoming title.

Microsoft Releases DirectX Raytracing - NVIDIA Volta-based RTX Adds Real-Time Capability

Microsoft today announced an extension to its DirectX 12 API with DirectX Raytracing, which provides components designed to make real-time ray-tracing easier to implement and uses Compute Shaders under the hood for wide graphics card compatibility. NVIDIA feels that its "Volta" graphics architecture has enough computational power on tap to make real-time ray-tracing available to the masses. The company has hence collaborated with Microsoft to develop the NVIDIA RTX technology as an interoperable part of the DirectX Raytracing (DXR) API, along with a few turnkey effects, which will be made available through the company's next-generation GameWorks SDK program, under GameWorks Ray Tracing, as a ray-tracing denoiser module for the API.
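
For applications, discovering DXR support is a runtime feature query. The sketch below shows the check as it appears in the shipping D3D12 runtime - note the early preview instead routed through an experimental-features path, so treat this as the eventual, not the day-one, interface:

```cpp
// Query whether a D3D12 device supports DirectX Raytracing (DXR), as exposed
// by the shipping runtime (the early preview used an experimental-features path).
#include <d3d12.h>

bool supportsDXR(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    // TIER_1_0 means full DXR support; NOT_SUPPORTED may still allow the
    // compute-shader fallback the article mentions.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```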

Real-time ray-tracing has long been regarded as a silver bullet for getting lifelike lighting, reflections, and shadows right. Ray-tracing is already big in the real-estate industry for showcasing photorealistic interactive renderings of property under development, but it has stayed away from gaming, which tends to be more demanding, with larger scenes, more objects, and rapid camera movements. Movies with big production budgets have used ray-traced visual effects for years: since film is not interactive content, studios are willing to spend vast amounts of time and money on pre-rendered ray-tracing farms, painstakingly rendering each frame using hundreds of rays per pixel.

Bose Introduces the World's First Audio Augmented Reality Platform

This week at SXSW, Bose introduces Bose AR, the world's first audio augmented reality platform, and glasses to hear - a Bose AR prototype that launches the future of mobile sound. Bose also announces the availability schedule for its SDK for developers, manufacturers, and research institutions, along with collaborations currently under way and venture funding for related start-ups.

Unlike other augmented reality products and platforms, Bose AR doesn't change what you see, but knows what you're looking at - without an integrated lens or phone camera. And rather than superimposing visual objects on the real world, Bose AR adds an audible layer of information and experiences, making every day better, easier, more meaningful, and more productive.

Dell Partners with Meta to Sell Meta 2 Augmented Reality Development Kit

Dell today announced it will be the first authorized reseller of the Meta 2 Augmented Reality Development Kit, equipping commercial companies with the tools needed to more easily innovate and adopt new AR technology applications that can advance their business. In partnership with Meta, Dell aims to make AR more accessible for business deployment, particularly in healthcare, manufacturing and construction, by providing tools for creating immersive experiences unique to the needs of those industries.

Dell is the only technology provider with an end-to-end ecosystem to consume, create, and power VR and AR. The new offering with Meta stems from Dell's VR/AR Technology Partner Program, which brings together other innovators in VR and AR to test and collaborate on the best technology solutions for varying applications and experiences. This program helps current and potential customers better navigate the new and rapidly evolving VR/AR ecosystem, by working with partners to verify and certify the best software and hardware solutions for VR and AR applications - bringing standardization where it is needed most.

HTC Reveals Vive Focus Standalone VR Headset and Vive Wave VR Open Platform

HTC, a pioneer in innovative, smart mobile and virtual reality (VR) technologies, today held its VIVE Developer Conference 2017 (VDC2017), where it announced VIVE WAVE, a VR open platform and toolset that will open up the path to easy mobile VR content development and high-performance device optimization for third-party partners. Twelve hardware partners, namely 360QIKU, Baofengmojing, Coocaa, EmdoorVR, Idealens, iQIYI, Juhaokan, Nubia, Pico, Pimax, Quanta, and Thundercomm, announced their support for the integration of Vive Wave as well as the VIVEPORT VR content platform into their future products.

Vive Wave is a clear step forward in bringing together the highly fragmented mobile VR market that has grown up in China over the last several years. It saves tremendous effort by allowing developers to create content for a common platform and storefront across disparate hardware vendors. Over 35 Chinese and global content developers have already built VR content optimized for Vive Wave, with 14 showing live demos at the event. Vive also unveiled the VIVE FOCUS, its highly anticipated premium standalone VR headset for the China market, which is also based on the Vive Wave VR open platform.

Creative Launches Aurora Reactive SDK for Sound BlasterX Products

Creative Technology Ltd today announced that it will be launching the Aurora Reactive SDK. This tool effectively converts the Aurora Reactive Lighting System found on Sound BlasterX products into an open platform, giving developers the freedom to customize, animate, and synchronize its lighting behavior. The 16.8 million-color Aurora Reactive Lighting System is currently found on the Sound BlasterX Katana, Vanguard K08, Siege M04, AE-5, and Kratos S5.

The Aurora Reactive SDK is a system of APIs (Application Programming Interfaces) that allow third-party developers to program Creative's Sound BlasterX RGB-enabled hardware. The SDK will come complete with sample code, an API library, and documentation to enable even novice programmers to get started.
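
Creative has not yet published the API surface, so as a purely hypothetical illustration, a per-LED lighting SDK of this kind typically boils down to a set-color-then-commit loop. Every name in the sketch below is invented:

```cpp
// Purely hypothetical illustration: Creative has not published the Aurora
// Reactive API at the time of writing, so every name below is invented to
// show the general shape of a per-LED RGB lighting SDK.
#include <cstdint>
#include <vector>

struct Rgb { uint8_t r, g, b; };          // 16.8 M colors = 24-bit RGB

class AuroraDeviceSketch {                // hypothetical stand-in class
public:
    explicit AuroraDeviceSketch(int ledCount) : leds_(ledCount, Rgb{0, 0, 0}) {}
    void setLed(int index, Rgb c) { leds_.at(index) = c; }
    void commit() { /* a real SDK would push the frame to the device here */ }
private:
    std::vector<Rgb> leds_;
};

// Example: one step of a simple "breathing" animation across all LEDs.
void breathe(AuroraDeviceSketch& dev, int ledCount, float phase01) {
    const uint8_t level = static_cast<uint8_t>(255 * phase01);
    for (int i = 0; i < ledCount; ++i)
        dev.setLed(i, Rgb{0, 0, level});  // pulse the blue channel
    dev.commit();
}
```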

Razer Announces the Wolverine Ultimate Gamepad for PC and Xbox One

Razer, the leading global lifestyle brand for gamers, today announced the officially licensed Razer Wolverine Ultimate gaming controller for Xbox One and PC. The Razer Wolverine Ultimate was designed to adapt itself to any gamer. Two interchangeable D-Pads, a range of interchangeable thumbsticks with different heights and shapes, and a total of 6 remappable triggers and buttons - configurable both via Razer Synapse for Xbox and on the fly - provide maximum customizability.

An integrated RGB lighting strip that can be controlled via Razer Synapse for Xbox adds more ways to personalize the controller and introduces Razer Chroma to Xbox gamers everywhere. Gamers can choose from 16.8 million colors and a variety of effects, including Static, Spectrum Cycling, Breathing, Wave, and more. Additionally, the Razer Wolverine Ultimate will be the first console product to support the Razer Chroma SDK, allowing developers to integrate advanced lighting capabilities into Xbox One games and console controllers for next-level gaming immersion.

Razer Takes Chroma Lighting Beyond Peripherals with the Hardware Development Kit

Razer, the leading global lifestyle brand for gamers, today announced the Razer Chroma Hardware Development Kit (HDK), the world's most advanced modular lighting system for PC gamers and enthusiasts. Integrated within the Razer Chroma ecosystem, the Chroma HDK offers all-in-one color customization with precise control down to the individual LED.

Users can shape and bend the LED strips to fit virtually any surface to light up an entire room, home or office for total game immersion. The individually controllable lights are integrated into Razer Synapse 3, and are powered by Razer Chroma technology, which unlocks customizable lighting features that can be synced across devices.

NVIDIA Announces OptiX 5.0 SDK - AI-Enhanced Ray Tracing

At SIGGRAPH 2017, NVIDIA introduced the latest version of OptiX, its GPU-accelerated ray-tracing API. The company has been at the forefront of GPU-powered AI endeavors in a number of areas, including facial animation, anti-aliasing, denoising, and light transport. OptiX 5.0 brings a renewed focus on AI-based denoising.

AI training is still a brute-force exercise with finesse applied at the end: NVIDIA took tens of thousands of image pairs - a rendered image with one sample per pixel and a companion image of the same render with 4,000 rays per pixel - and used them to train the AI to predict what a denoised image looks like. In theory (picking up the numbers NVIDIA used for its training), users deploying OptiX 5.0 only need to render one sample per pixel of a given image, instead of the 4,000 rays per pixel that would be needed for its final presentation. Based on its learning, the AI will then be able to fill in the blanks to finalize the image, saving the need to render all that extra data. NVIDIA quotes a 157x improvement in render time using a DGX Station with OptiX 5.0 deployed, against the same render on a CPU-based platform (2x E5-2699 v4 @ 2.20 GHz). The OptiX 5.0 release also includes provisions for GPU-accelerated motion blur, which should do away with the need to render a frame multiple times and then apply a blur filter through a collage of the different frames. NVIDIA said OptiX 5.0 will be available in November. Check the press release after the break.
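
From NVIDIA's description, the denoiser plugs into a render as a post-processing stage over the noisy output buffer. Below is a sketch using the OptiX 5 C++ wrapper as we understand the pre-release documentation - treat these names as assumptions until the November release:

```cpp
// Sketch of attaching the AI denoiser as an OptiX 5 post-processing stage.
// Names follow the pre-release OptiX 5 C++ wrapper as we understand it;
// treat them as assumptions until the SDK ships in November.
#include <optixu/optixpp_namespace.h>

void denoise(optix::Context ctx, optix::Buffer noisyInput, optix::Buffer output,
             unsigned width, unsigned height) {
    // The trained denoising network ships as a built-in stage.
    optix::PostprocessingStage stage =
        ctx->createBuiltinPostProcessingStage("DLDenoiser");
    stage->declareVariable("input_buffer")->set(noisyInput);
    stage->declareVariable("output_buffer")->set(output);

    // Stages run through a command list rather than the normal launch path.
    optix::CommandList cmds = ctx->createCommandList();
    cmds->appendPostprocessingStage(stage, width, height);
    cmds->finalize();
    cmds->execute(); // fills 'output' with the AI-denoised image
}
```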

NVIDIA Releases VRWorks Audio and 360 Video SDKs at GTC

Further planting its roots in the VR SDK and development field, NVIDIA has just announced the availability of two more SDK packages, for its VRWorks Audio and 360 Video suites. Now part of NVIDIA's VRWorks suite of VR solutions, the VRWorks Audio SDK provides real-time ray tracing of audio in virtual environments, and is supported in Epic's Unreal Engine 4 (here's hoping this solution, or others like it, addresses the problems of today's game audio). The VRWorks 360 Video SDK, on the other hand, may be less interesting for graphics enthusiasts, in that it addresses the complex challenge of real-time video stitching.

Traditional VR audio (and gaming audio, for that matter) provides an accurate 3D position of the audio source within a virtual environment. However, as it is handled today, sound is processed with little regard for anything but the location of the source. With VRWorks Audio, NVIDIA brings to the table the dimensions and material properties of the physical environment, helping to create a truly immersive experience by modeling sound propagation phenomena such as reflection, refraction, and diffraction - in real time, on the GPU. This work leverages NVIDIA's OptiX ray-tracing technology, which allows VRWorks Audio to trace the path of sound in real time, delivering physically accurate audio that reflects the size, shape, and material properties of the virtual environment.
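
The underlying geometric-acoustics idea is easy to sketch even without the SDK: a specular reflection can be modeled as if it came from a mirror-image of the source, and the extra path length then yields the echo's delay and attenuation. A self-contained C++ illustration of that classic image-source method (our own toy example, not NVIDIA's API):

```cpp
// Minimal geometric-acoustics illustration (not the VRWorks Audio API):
// a first-order specular reflection off a flat ceiling is modeled with the
// classic image-source method, giving the reflected path length, the echo's
// arrival lag versus the direct sound, and its distance attenuation.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float dist(Vec3 a, Vec3 b) {
    return std::sqrt((a.x - b.x) * (a.x - b.x) +
                     (a.y - b.y) * (a.y - b.y) +
                     (a.z - b.z) * (a.z - b.z));
}

int main() {
    const Vec3 source   = {0.f, 1.5f, 0.f};
    const Vec3 listener = {4.f, 1.5f, 0.f};
    const float wallY   = 3.f;   // flat ceiling at y = 3 m
    const float c       = 343.f; // speed of sound in air, m/s

    // Mirror the source across the ceiling plane to get the image source.
    const Vec3 image = {source.x, 2.f * wallY - source.y, source.z};

    const float direct  = dist(source, listener);
    const float echoLen = dist(image, listener);           // reflected path length
    const float delayMs = (echoLen - direct) / c * 1000.f;  // echo lag vs direct sound
    const float gain    = direct / echoLen;                 // 1/d spherical spreading

    std::printf("direct %.2f m, echo %.2f m, lag %.2f ms, relative gain %.2f\n",
                direct, echoLen, delayMs, gain);
}
```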

NVIDIA Announces Public Ansel SDK, Developer Plugins

NVIDIA's Ansel, a framework for real-time screenshot filters and photographic effects, has seen the release of a public SDK and a few developer plugins to boot. Unreal Engine and Unity have both gained plugins for the technology, and it is reportedly coming to Amazon's Lumberyard engine as well. This should most assuredly aid the adoption of the technology, as well as open it up to new markets where it was previously unavailable, such as indie game development. The public SDK is presently available for download directly from NVIDIA at developer.nvidia.com/ansel