News Posts matching #Hardware


Steam On Linux Restores Hardware Acceleration by Default for NVIDIA GPUs

A previous attempt to enable NVIDIA GPU video hardware acceleration by default within Steam on Linux was thwarted by numerous bugs and faults - adopters of the mid-May Steam Client Beta update reported various crashes in Valve's user interface. The embattled software engineering team has since investigated the matter and released a new update yesterday.

The June 6th Steam Client Beta patch notes list a number of general improvements along with Linux-specific adjustments: "a crash when Steam windows were closed with hardware (HW) acceleration enabled on NVIDIA GPUs" and the re-enabling of "HW acceleration by default for NVIDIA GPUs." Early reports indicate that Linux gamers are having a smoother time after installing yesterday's update.

Apple Introduces M2 Ultra

Apple today announced M2 Ultra, a new system on a chip (SoC) that delivers huge performance increases to the Mac and completes the M2 family. M2 Ultra is the largest and most capable chip Apple has ever created, and it makes the new Mac Studio and Mac Pro the most powerful Mac desktops ever made. M2 Ultra is built using a second-generation 5-nanometer process and uses Apple's groundbreaking UltraFusion technology to connect the die of two M2 Max chips, doubling the performance. M2 Ultra consists of 134 billion transistors—20 billion more than M1 Ultra. Its unified memory architecture supports up to a breakthrough 192 GB of memory capacity, which is 50 percent more than M1 Ultra, and features 800 GB/s of memory bandwidth—twice that of M2 Max. M2 Ultra features a more powerful CPU that's 20 percent faster than M1 Ultra, a larger GPU that's up to 30 percent faster, and a Neural Engine that's up to 40 percent faster. It also features a media engine with twice the capabilities of M2 Max for blazing ProRes acceleration. With all these advancements, M2 Ultra takes Mac performance to a whole new level yet again.
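The comparative claims in Apple's announcement can be cross-checked with simple arithmetic; in the sketch below, the baseline figures are back-computed from the announcement's own numbers and the variable names are purely illustrative:

```python
# Back-computing baseline figures from the comparative claims above.
# All numbers come from the announcement text; the names are our own.
m2_ultra_transistors = 134e9                        # "134 billion transistors"
m1_ultra_transistors = m2_ultra_transistors - 20e9  # "20 billion more than M1 Ultra"

m1_ultra_max_memory_gb = 192 / 1.5                  # 192 GB is "50 percent more"
m2_max_bandwidth_gbs = 800 / 2                      # 800 GB/s is "twice that of M2 Max"

print(m1_ultra_transistors / 1e9)   # 114.0 (billion)
print(m1_ultra_max_memory_gb)       # 128.0
print(m2_max_bandwidth_gbs)         # 400.0
```

These derived baselines (114 billion transistors, 128 GB, 400 GB/s) match the publicly known M1 Ultra and M2 Max specifications, so the announcement's comparisons are internally consistent.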

"M2 Ultra delivers astonishing performance and capabilities for our pro users' most demanding workflows, while maintaining Apple silicon's industry-leading power efficiency," said Johny Srouji, Apple's senior vice president of Hardware Technologies. "With huge performance gains in the CPU, GPU, and Neural Engine, combined with massive memory bandwidth in a single SoC, M2 Ultra is the world's most powerful chip ever created for a personal computer."

Analogue Duo FPGA Retro Console Arriving This Week

Introducing Duo - The Higher Energy Analogue System. You've always known what to expect from a video game system. Until now. Duo is an all-in-one reimagining of perhaps the most underappreciated video game system of all time. Analogue Duo is compatible with nearly every NEC system and game format ever made. TurboGrafx-16. PC Engine. SuperGrafx. TurboGrafx CD. PC Engine CD-ROM. Super Arcade CD-ROM. 1080p. Zero lag. Bluetooth. 2.4G wireless. Because the last thing a video game system should be is predictable. Completely engineered in FPGA. Analogue OS. No emulation.

We're preserving history with FPGA hardware. Duo is designed with unparalleled compatibility. The core functionality of each system is engineered directly into an Altera Cyclone V, a sophisticated FPGA. We spent thousands of hours engineering each system via FPGA for absolute accuracy. Unlike the knock-off and emulation systems that riddle the market today, you'll be experiencing the entire NEC era free of compromises. Duo is designed to preserve video game history, with the respect it deserves.

Intel Cuts Budget for Client and Data Center Groups, Layoffs Imminent

Following its recent Q1 2023 financial report showing declining revenue, Intel is restructuring its Client Computing Group (CCG) and Data Center Group (DCG). These two units were hit the hardest, with 38% and 39% downturns, respectively. According to Dylan Patel of SemiAnalysis, Intel will cut CCG and DCG budgets by 10%, resulting in as much as a 20% reduction of the workforce inside those two groups. An Intel spokesperson confirmed the move in a statement to Tom's Hardware:
Intel Spokesperson: Intel is working to accelerate its strategy while navigating a challenging macro-economic environment. We are focused on identifying cost reductions and efficiency gains through multiple initiatives, including some business and function-specific workforce reductions in areas across the company.

We continue to invest in areas core to our business, including our U.S.-based manufacturing operations, to ensure we are well-positioned for long-term growth. These are difficult decisions, and we are committed to treating impacted employees with dignity and respect.

AMD's Dr. Lisa Su Thinks That Moore's Law is Still Relevant - Innovation Will Keep Legacy Going

Barron's Magazine has been on a technology industry kick this week and published their interview with AMD CEO Dr. Lisa Su on May 3. The interviewer asks Su about her views on Moore's Law, and it becomes apparent that she remains a believer in Gordon Moore's (more than half-century old) prediction - Moore, an Intel co-founder, passed away in late March. Su explains that her company's engineers will need to innovate in order to carry on with that legacy: "I would certainly say I don't think Moore's Law is dead. I think Moore's Law has slowed down. We have to do different things to continue to get that performance and that energy efficiency. We've done chiplets - that's been one big step. We've now done 3-D packaging. We think there are a number of other innovations, as well." Expertise in other areas is also key to hitting technological goals: "Software and algorithms are also quite important. I think you need all of these pieces for us to continue this performance trajectory that we've all been on."

When asked about the challenges involved in advancing CPU designs within limitations, Su responds: "Yes. The transistor costs and the amount of improvement you're getting from density and overall energy reduction is less from each generation. But we're still moving (forward) generation to generation. We're doing plenty of work in 3 nanometer today, and we're looking beyond that to 2 nm as well. But we'll continue to use chiplets and these types of constructions to try to get around some of the Moore's Law challenges." AMD and Intel continue to hold firm with Moore's Law, even though younger upstarts disagree (see NVIDIA). Dr. Lisa Su's latest thoughts stay consistent with her colleague's past statements - AMD CTO Mark Papermaster reckoned that the theory remains pertinent for another six to eight years, although it could be a costly endeavor for AMD - the company believes that it cannot double transistor density every 18 to 24 months without incurring extra expenses.

Apricorn Launches Aegis NVX Hardware-Encrypted USB Storage Device, Boasts Read/Write Speeds of 1 GB/s

Apricorn, the leading manufacturer of software-free, 256-bit AES-XTS hardware-encrypted USB drives, today announced the release of the USB 10 Gbps Aegis NVX. Employing a proprietary architecture, the Aegis NVX is the first Apricorn encrypted device to feature an NVMe SSD inside. Initial capacity offerings will be 500 GB, 1 TB, and 2 TB, with a price range of $339.00 - $739.00 MSRP.

The NVX was conceived to address the immediate protection of raw data delivered directly from its source at high speeds, such as high-definition video cameras with the capability to write to an external SSD via USB-C or HDMI. The NVX's high-speed read/write capabilities of 1,000 MB/s are sought after in the fields of military intelligence, digital forensics, filmmaking, and healthcare, where write speeds over 600 MB/s are critical.

NVIDIA GeForce GTX 1650 Returns to Top Spot According to April Steam Hardware Survey

Valve has released the tabulated results and statistics of its April Steam Hardware and Software Survey - the key takeaway from last month's user-generated data is that NVIDIA's trusty GeForce GTX 1650 GPU is once again the most popular graphics card. It dethrones last month's winner - the NVIDIA RTX 3060, which falls to third place, below the second-place GTX 1060. The RTX 3060 experienced an almost 6% decline in user share from the previous month, while the GTX 1650's user base grew by 2% in the same period. It is interesting to note that the entry for the GTX 1650 encompasses both desktop and laptop variants, while the RTX 3060 is divided into two separate entries in Valve's survey - the desktop version sits at third place and its laptop-oriented sibling trails slightly behind in fourth. NVIDIA absolutely dominates the field with many of its budget and midrange cards (across several older generations) - AMD and Intel barely make it into the top 25, with a small sprinkling of iGPUs and one discrete model (Radeon RX 580) at position 24.

April's survey shows that Intel processors remain the favorite among Steam users with a 67.14% share, while AMD follows in second place with 32.84%. AMD CPU popularity is on the rise compared to previous months, so a more even split could be on the cards if the upward trend continues. System RAM upgrades slowed last month: 52.19% of users are on 16 GB and 16.1% are on 32 GB, representing declines from March of 4.73 and 6.61 percentage points, respectively. The majority of users prefer to stick with Windows 10 64-bit - that OS has a 61.21% share, but its popularity dropped by 12.74 percentage points within the survey period. Windows 11 64-bit is gaining ground with a 10.98-point increase from March to April, sitting in second place with a 33.39% share of the OS user base. As always, the monthly Steam Hardware and Software Survey is not considered pinpoint accurate due to the random nature of user sampling, but overall trends can be discerned from the data on hand.
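The survey's month-over-month movements are simple percentage-point deltas between two monthly shares. A minimal sketch of that arithmetic follows; the March figure is back-computed from the April share and the quoted drop, and the helper name is our own:

```python
def share_delta(previous_pct: float, current_pct: float) -> float:
    """Percentage-point change between two monthly survey shares."""
    return round(current_pct - previous_pct, 2)

# Windows 10 64-bit: 73.95% in March -> 61.21% in April
print(share_delta(73.95, 61.21))  # -12.74
```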

Microsoft Said to be Designing its own Arm SoC to Compete with Apple Silicon

According to Tom's Hardware, Microsoft is busy hiring engineers to help the company design its own Arm-based SoC for Windows 12. Based on job listings, Microsoft is adding people to its "Silicon team," which is currently involved in designing products for Microsoft's Azure, Xbox, Surface, and HoloLens lines. The specific job listings cited by Tom's Hardware mention "optimizing Windows 12 for Silicon-ARM architecture," suggesting we're looking at a custom Arm SoC, with others mentioning "internally developed silicon components" and "building complex, state-of-the-art SOCs using leading silicon technology nodes and will collaborate closely with internal customers and partners."

That said, Microsoft currently works with Qualcomm, and the Microsoft SQ3 found in the Surface Pro 9 is the latest result of that partnership. This raises the question of whether Microsoft has decided to make its own chip to compete with Apple's M-series silicon, or whether it is simply looking to work more closely with Qualcomm by hiring in-house talent that can help tweak Qualcomm silicon to better suit its needs. With Windows 12 scheduled for a 2024 release, it looks like we'll have to wait a while longer to find out what Microsoft is cooking up, but regardless of what it is, Windows on Arm isn't going anywhere.

AMD Ryzen 7000X3D Power Consumption Spiking Beyond 100 W in Idle Mode

According to investigations undertaken by Igor's Lab and Hardware Busters this week, AMD's problematic lineup of Ryzen 7000 and Ryzen 7000X3D CPUs is consuming unexpected levels of power in short bursts when running in idle mode. In conducting more in-depth tests over the past few days, Igor Wallossek and (outgoing TPU PSU expert) Aristeidis Bitziopoulos have both found that the aforementioned AMD processors produce (to the testers' slight concern) power spikes in situations involving minimal computing activity. It is not currently known whether the sharp climbs in power consumption are in any way related to the burnout issues experienced by unlucky overclockers this week.

Aris/crmaris (at Hardware Busters) says that he has tested many of the affected processors in the past, but did not observe any major problems relating to burnout or power consumption spikes. Running new tests this week using his own Powenetics v2 board, Aris found that: "There are some interesting facts here, which I didn't pay much attention to during the reviews because I only look at the average values and not the peak ones in idle. In the 7950X3D, there is a high spike during idle at 130 W, which is unjustified because the peak CPU load is only 3.53%. Even with the Curve Optimizer enabled and a -15 setting, the idle power spike is close to 125 W, so something is happening there. On the 7800X3D, the spike during idle stays low, but this is not the case for the 7900X, which has an idle power spike at 109 W, while the peak CPU load at idle was at 5.12%, so these 109 W are not justified, either."

Bethesda Reveals Redfall's Official PC Hardware Requirements, NVIDIA Releases Exclusive Launch Trailer

With the launch of Redfall set for early next week, Bethesda has kindly provided some prepwork to consider in advance - a breakdown of the crossover shooter's PC specification requirements (minimum, recommended, and ultra) as well as all available accessibility settings and options. The company fails to mention any resolution or framerate targets, so it will be up to PC gamers to investigate the nitty-gritty numbers on the day of Redfall's release (May 2). As part of the promo campaign, NVIDIA has released a gameplay trailer (see below) of the game running on a PC fitted with a presumably high-end GeForce RTX 40-series graphics card.

Redfall's PC audience will get to flex their graphical muscles more than unfortunate current-gen Xbox owners - Bethesda and developer Arkane Studios have recently admitted that their supernatural shooter will be restricted to a 30 FPS Quality Mode on Series X and S consoles at launch. A higher-framerate performance mode is promised in the future - via a game update - but the Microsoft-owned publisher has not set a more specific release window for the upgrade. The mid- and high-tier PC specifications seem quite reasonable, so Redfall is not expected to be a stuttering mess on arrival.

Microsoft FY23 Q3 Earnings Report Shows Losses for OEM Business and Hardware

Microsoft Corp. today announced the following results for the quarter ended March 31, 2023, as compared to the corresponding period of last fiscal year:
  • Revenue was $52.9 billion and increased 7% (up 10% in constant currency)
  • Operating income was $22.4 billion and increased 10% (up 15% in constant currency)
  • Net income was $18.3 billion and increased 9% (up 14% in constant currency)
  • Diluted earnings per share was $2.45 and increased 10% (up 14% in constant currency)
"The world's most advanced AI models are coming together with the world's most universal user interface - natural language - to create a new era of computing," said Satya Nadella, chairman and chief executive officer of Microsoft. "Across the Microsoft Cloud, we are the platform of choice to help customers get the most value out of their digital spend and innovate for this next generation of AI."

Colorful Custom RTX 4060 Ti GPU Clocks Outed, 8 GB VRAM Confirmed

Resident TechPowerUp hardware database overseer T4C Fantasy has divulged some early information about a custom version of the NVIDIA GeForce RTX 4060 Ti - Colorful's catchily named iGame RTX 4060 Ti Ultra White OC model has been added to the TPU GPU database, and T4C Fantasy has revealed a couple of tidbits on Twitter. The GPU has been tuned to a maximum boost clock of 2580 MHz, up from a base clock of 2310 MHz. According to past leaks, the reference version of the GeForce RTX 4060 Ti has a default boost clock of 2535 MHz, so Colorful's engineers have managed to add another 45 MHz on top with their custom iteration - roughly 2% more than the reference default.

T4C Fantasy also confirmed that the Colorful iGame RTX 4060 Ti Ultra W OC will be equipped with 8 GB of VRAM, matching the reference model's rumored memory spec. T4C Fantasy points out that brands have the option to produce RTX 4060 Ti cards with a larger pool of attached video memory, but launch models will likely stick with the standard allotment of 8 GB. The RTX 4060 Ti is listed as based on the Ada Lovelace GPU architecture (GPU variant AD106-350-A1), and T4C Fantasy expects that Team Green will stick with a 5 nm process node - contrary to reports of a transition to 4 nm manufacturing at TSMC's foundries.
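The clock uplift quoted above is easy to verify with a quick sketch; the values (in MHz) are the ones reported in the article, and the variable names are our own:

```python
# Relative boost-clock uplift of Colorful's custom card over the
# reference spec, using the clocks reported above (MHz).
reference_boost_mhz = 2535
custom_boost_mhz = 2580

delta_mhz = custom_boost_mhz - reference_boost_mhz
uplift_pct = delta_mhz / reference_boost_mhz * 100

print(f"+{delta_mhz} MHz ({uplift_pct:.1f}% over reference)")  # +45 MHz (1.8% over reference)
```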

AMD Brings ROCm to Consumer GPUs on Windows OS

AMD has published an exciting development for its Radeon Open Compute Ecosystem (ROCm) users today. ROCm is coming to the Windows operating system, and the company has extended ROCm support to consumer graphics cards instead of only professional-grade GPUs. This milestone is essential for making AMD's GPU family more competitive with NVIDIA and its CUDA-accelerated GPUs. For those unaware, AMD ROCm is a software stack designed for GPU programming. Similar to NVIDIA's CUDA, ROCm is designed for AMD GPUs and was historically limited to Linux-based OSes and GFX9, CDNA, and professional-grade RDNA GPUs.

However, according to documents obtained by Tom's Hardware (which are behind a login wall), AMD has brought ROCm support to the Radeon RX 6900 XT, Radeon RX 6600, and R9 Fury GPUs. What is interesting is not the inclusion of the RX 6900 XT and RX 6600, but the support for the R9 Fury, an eight-year-old graphics card. Also notable is that, of these three GPUs, only the R9 Fury has full ROCm support, while the RX 6900 XT has HIP SDK support and the RX 6600 has only HIP runtime support. To make matters even more complicated, the consumer-grade R9 Fury has full ROCm support only on Linux, not Windows. The reason for this strange selection has yet to be discovered. Still, it is a step in the right direction, and AMD has yet to enable more functionality on Windows and on more consumer GPUs to compete with NVIDIA.
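The patchwork of support tiers described above can be summarized as a small lookup table. The table and helper below are purely an illustration of the article's claims, not an official AMD API:

```python
# Support tiers per GPU, as described in the leaked documents above.
ROCM_SUPPORT = {
    "Radeon RX 6900 XT": "HIP SDK",
    "Radeon RX 6600": "HIP runtime only",
    "R9 Fury": "full ROCm (Linux only)",
}

def support_tier(gpu_name: str) -> str:
    """Look up a GPU's ROCm support tier (hypothetical helper)."""
    return ROCM_SUPPORT.get(gpu_name, "unsupported")

print(support_tier("Radeon RX 6600"))  # HIP runtime only
```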

Bulk Order of GPUs Points to Twitter Tapping Big Time into AI Potential

According to Business Insider, Twitter has made a substantial investment in hardware upgrades at its North American datacenter operation. The company has purchased somewhere in the region of 10,000 GPUs, destined for the social media giant's two remaining datacenter locations. Insider sources claim that Elon Musk has committed to a large language model (LLM) project in an effort to rival OpenAI's ChatGPT. The GPUs will not provide much computational value in normal day-to-day tasks at Twitter - the source reckons the extra processing power will be utilized for deep learning purposes.

Twitter has not revealed any concrete plans for its relatively new in-house artificial intelligence project, but something was afoot when, earlier this year, Musk recruited several research personnel from Alphabet's DeepMind division. It was theorized that he was incubating a resident AI research lab at the time, following personal criticisms leveled at his former colleagues at OpenAI and their very popular, widely adopted chatbot.

U.S. President Invokes Defense Production Act for PCB Production

On Monday, March 27, U.S. President Joe Biden invoked the Defense Production Act to establish a $50 million budget for domestic and Canadian production of printed circuit boards (PCBs). The move was deemed important to matters of national defense, with technology cited as a key part of North American security efforts. In a memo issued that day, Biden stated that without presidential action under the act: "United States industry cannot reasonably be expected to provide the capability for the needed industrial resource, material, or critical technology item in a timely manner."

PCBs form the basis of vital components integrated into military missiles and radars, in addition to electronics used for energy distribution and the nation's healthcare. Biden went on to outline the importance of the Defense Production Act: "I find that action to expand the domestic production capability for printed circuit boards and advanced packaging is necessary to avert an industrial resource or critical technology item shortfall that would severely impair national defense capability."

NVIDIA RTX 3080 Ti Owners Reporting Bricked Cards During Diablo IV Closed Beta Play Sessions

A combination of the Diablo IV Closed Beta and NVIDIA RTX 3080 Ti graphics card is proving lethal for the latter - community feedback has alerted Blizzard to take action, and they will be investigating the issue in the coming days, with assistance from NVIDIA. It is speculated that the game is exposing underlying hardware faults within an affected card, but it is odd that a specific model is generating the largest number of issues. Unlucky 3080 Ti owners participating in the closed beta are said to be experiencing unpleasant or inconsistent in-game performance at best, and BSODs followed by non-functional GPUs at worst.

A Blizzard forumite, ForANge, chimed in with their experience: "My graphics card also burned out. While playing, the fans of my Gigabyte GeForce RTX 3080 Ti suddenly started running at maximum speed. At the same time, the signal from the monitor disappeared. After turning off the power and trying to turn it back on, I couldn't get it to work anymore. The card is just under a year old. It happened during a cutscene with flowers when they showed a snowy landscape."

Bigscreen Introduces Beyond—the World's Smallest VR Headset

Bigscreen today unveiled Bigscreen Beyond, the world's smallest VR headset. Weighing just 127 grams and measuring less than 1-inch at its thinnest point, Beyond has an unprecedented form factor that is 6X lighter than competing VR devices. Beyond features ultra-high resolution OLED displays, custom pancake optics, and tethers to a PC for ultimate VR immersion. Each device is custom-built to the shape of a customer's face, providing unparalleled comfort for long duration VR experiences.

"As passionate VR enthusiasts, we built the VR headset we wanted for ourselves," said Darshan Shankar, Bigscreen's Founder & CEO. "Today's leading VR headsets have doubled in weight compared to headsets from 2016. We built Beyond because we felt VR was too heavy, bulky, and uncomfortable. We invented new technologies to increase comfort, and developed ultra-high-end components like OLED microdisplays and pancake optics to increase immersion. To deliver the best software experience for watching movies in Bigscreen, we also had to build the best hardware with Bigscreen Beyond."

ASUS ROG Strix X670E-I Chipset Sits on an M.2 PCB

AMD's high-end X670E motherboard chipset combines two Promontory 21 chips working together as a single solution. On regularly sized ATX motherboards, having two chips form the chipset is fine, as there is ample room on the PCB. However, on Mini-ITX motherboards, packing two Promontory 21 chips is difficult because PCB area is limited. ASUS has come up with an interesting workaround that allows the company to ship the high-end X670E chipset in a Mini-ITX form factor. Thanks to UNIKO's Hardware's findings, we can take a look at the solution ASUS used.

Instead of placing the two Promontory 21 chips side by side, one sits on the motherboard directly, while the other stands vertically, attached via an M.2 slot. The pictures below show how the chipset looks disassembled.

Fortinet Unveils New ASIC to Accelerate the Convergence of Networking and Security Across Every Network Edge

Fortinet, the global cybersecurity leader driving the convergence of networking and security, today announced FortiSP5, the latest breakthrough in ASIC technology from Fortinet, propelling major leaps forward in securing distributed network edges. Building on over 20 years of ASIC investment and innovation from Fortinet, FortiSP5 delivers significant secure computing power advantages over traditional CPUs and network ASICs, lower cost and power consumption, and the ability to enable new secure infrastructure across branch, campus, 5G, edge compute, operational technologies, and more.

"With the introduction of FortiSP5, Fortinet once again sets new industry records for performance, cost, and energy efficiency. As the only cybersecurity vendor leveraging purpose-built ASICs, an over 20-year investment in innovation, Fortinet delivers the secure computing power that will support the next generation of secure infrastructure." - Ken Xie, Founder, Chairman of the Board, and Chief Executive Officer at Fortinet

Intel Reincarnates VROC Functionality for Xeon Processors

Intel's Xeon processors feature a wide range of embedded functionalities that the company has developed over the years. One such feature is Virtual RAID on CPU (VROC), which provides the functionality of an NVMe RAID card on the CPU itself, simplifying the installation, cost, and maintenance of high-performance storage arrays. Debuting in 2017, it is present in some consumer-facing Core models and in Xeon Scalable platforms, where it sees the highest usage. However, on January 6, Intel posted a product change notice informing users that the VROC function would be discontinued, with the last orders to be placed on January 23 and the last VROC shipments made on March 31. This caused confusion, especially in the enterprise sector, which utilizes Intel Xeon processors for its workloads and storage arrays.

Tom's Hardware has reached out to Intel for clarification and got the following statement:
Intel Spokesperson: The PCN [Product Change Notice] was prematurely posted while the decision was under evaluation. After discussing with the ecosystem and customers, we realize there is significant demand for this product and intend to continue to support it.

US Might Reimpose GPU Import Tariffs in the New Year

Currently, the US has an exclusion in place for import tariffs on graphics cards and GPUs imported from China, but the exclusion is set to expire on the 31st of December this year. So far, the US government has been quiet on whether the import tariff will be reinstated. If it were, US consumers would be looking at a 25 percent import duty on graphics cards, starting on the 1st of January, 2023.

There's no easy way to circumvent the tariff either, as it covers items like "printed circuit assemblies, constituting unfinished logic boards," according to Tom's Hardware. Not all graphics cards are made in China, but the majority are today. It's possible that NVIDIA's move of its logistics center from Hong Kong to Taiwan is related, as NVIDIA would then be shipping products out of Taiwan rather than China, depending on how US Customs classifies Hong Kong these days. We should know what happens in a month's time, but a 25 percent import duty on graphics cards would likely kill most sales, as many people already find them overpriced. This would affect AMD and NVIDIA, as well as their partners, in the same way, unless they make their graphics cards outside of China.
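The arithmetic behind that concern is straightforward; a sketch with a hypothetical card price (only the 25% rate comes from the article):

```python
# Landed cost of a graphics card under a reinstated 25% import duty.
# The example price is hypothetical; the rate is the one quoted above.
TARIFF_RATE = 0.25

def landed_cost(import_price: float) -> float:
    """Price after applying the import duty."""
    return round(import_price * (1 + TARIFF_RATE), 2)

print(landed_cost(799.00))  # 998.75
```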

AMD 4th Generation EPYC "Genoa" Processors Benchmarked

Yesterday, AMD announced the latest addition to its data center family of processors, EPYC Genoa. Named the 4th Generation EPYC processors, they feature a Zen 4 design and bring additional I/O connectivity like PCIe 5.0, DDR5, and CXL support. To shake up cloud, enterprise, and HPC offerings, AMD is manufacturing SKUs with up to 96 cores and 192 threads, an increase from the previous generation's 64C/128T designs. Today, we are learning more about the performance and power characteristics of the 4th Generation AMD EPYC Genoa 9654, 9554, and 9374F SKUs from third-party sources rather than AMD's official presentation. Tom's Hardware published a heap of benchmarks spanning rendering, compilation, encoding, parallel computing, molecular dynamics, and much more.

The comparison tests include the AMD EPYC Milan 7763 and 75F3 and the Intel Xeon Platinum 8380, Intel's current top-end offering until Sapphire Rapids arrives. Comparing 3rd-gen 64C/128T EPYC SKUs with their 4th-gen 64C/128T counterparts, the new generation brings about a 30% increase in compression and parallel compute benchmark performance. When scaling to the 96C/192T SKU, the gap widens, and AMD is the clear performance leader in the server marketplace. For more details about the benchmark results, go here to explore. As for comparisons to Intel's offerings, AMD leads the pack with a more performant single- and multi-threaded design. Of course, beating Sapphire Rapids to market is a significant win for team red, so we are still waiting to see how the 4th Generation Xeon stacks up against Genoa.

HaptX Introduces Industry's Most Advanced Haptic Gloves, Priced for Scalable Deployment

HaptX Inc., the leading provider of realistic haptic technology, today announced pre-order availability of the company's new HaptX Gloves G1, a ground-breaking haptic device optimized for the enterprise metaverse. HaptX has engineered HaptX Gloves G1 with the features most requested by HaptX customers, including improved ergonomics, multiple glove sizes, wireless mobility, new and improved haptic functionality, and multiplayer collaboration, all priced as low as $4,500 per pair - a fraction of the cost of the award-winning HaptX Gloves DK2.

"With HaptX Gloves G1, we're making it possible for all organizations to leverage our lifelike haptics," said Jake Rubin, Founder and CEO of HaptX. "Touch is the cornerstone of the next generation of human-machine interface technologies, and the opportunities are endless." HaptX Gloves G1 leverages advances in materials science and the latest manufacturing techniques to deliver the first haptic gloves that fit like a conventional glove. The Gloves' digits, palm, and wrist are soft and flexible for uninhibited dexterity and comfort. Available in four sizes (Small, Medium, Large, and Extra Large), these Gloves offer the best fit and performance for all adult hands. Inside the Gloves are hundreds of microfluidic actuators that physically displace your skin, so when you touch and interact with virtual objects, the objects feel real.

Meta's Grand Teton Brings NVIDIA Hopper to Its Data Centers

Meta today announced its next-generation AI platform, Grand Teton, including NVIDIA's collaboration on design. Compared to the company's previous generation Zion EX platform, the Grand Teton system packs in more memory, network bandwidth and compute capacity, said Alexis Bjorlin, vice president of Meta Infrastructure Hardware, at the 2022 OCP Global Summit, an Open Compute Project conference.

AI models are used extensively across Facebook for services such as news feed, content recommendations and hate-speech identification, among many other applications. "We're excited to showcase this newest family member here at the summit," Bjorlin said in prepared remarks for the conference, adding her thanks to NVIDIA for its deep collaboration on Grand Teton's design and continued support of OCP.

Microsoft Updates Surface PC Models with the Latest Hardware

Today, we shared our vision for the next era of the Windows PC, where the PC and the cloud intersect and tap into innovative AI technology that unlocks new experiences, so that each of us can participate, be seen and heard, and express our creativity.

For nearly 40 years, the Windows PC has held a place at the center of our lives. It's contributed to new levels of productivity, kept us all connected, and unlocked our creativity and potential through innovations we couldn't have imagined when we first began this journey. Just think about how far we've come in how people interact with it. From the very first text-based keyboard input to the precision of point and click with the mouse, up to today, where touch, voice, pen and gestures all help people use the Windows PC more naturally and intuitively. From its inception, Surface has been a catalyst for that change.