
    After a chaotic three years, GPU sales are starting to look normal-ish again / ArsTechnica · 7 days ago - 21:57 · 1 minute

AMD's Radeon RX 7600. (credit: Andrew Cunningham)

It's been an up-and-down few years for most consumer technology, with a pandemic-fueled boom in PC sales giving way to a sales crater that the market is still gradually recovering from. But few components have had as hard a time as gaming graphics cards, which were nearly impossible to buy at reasonable prices for about two years and then crashed hard as GPU companies responded with unattainably expensive new high-end products.

According to the GPU sales analysts at Jon Peddie Research, things may finally be evening out. Its data shows that GPU shipments have returned to quarter-over-quarter and year-over-year growth after two years of shrinking sales. This is the second consecutive quarter this has happened, which "strongly indicates that things are finally on the upswing for the graphics industry."

JPR reports that overall GPU unit shipments (which include integrated and dedicated GPUs) are up 16.8 percent from Q2 and 36.6 percent from a year ago. Dedicated GPU sales increased 37.4 percent from Q2. When comparing year-over-year numbers, the biggest difference is that Nvidia, AMD, and Intel all have current-generation GPUs available in the $200–$300 range, including the GeForce RTX 4060, the Radeon RX 7600, and the Arc A770 and A750, all of which were either unavailable or newly launched in Q3 of 2022.

Read 4 remaining paragraphs


    Nvidia CEO: US chip independence may take 20 years to achieve / ArsTechnica · Wednesday, 29 November - 22:35

Founder and CEO of NVIDIA Jensen Huang speaks during the New York Times annual DealBook summit on November 29, 2023, in New York City. (credit: Michael M. Santiago / Staff | Getty Images North America)

The US could be up to two decades away from maintaining its own domestic chip supply chain, Nvidia Corp.'s CEO, Jensen Huang, told an audience gathered in New York for the New York Times’s DealBook conference.

Nvidia is a giant in the semiconductor industry, and Huang said his company's success depends on "myriad components that come from different parts of the world," Bloomberg reported. "Not just Taiwan," Huang said, where Taiwan Semiconductor Manufacturing Co. (TSMC) makes the world's most advanced semiconductor technology.

“We are somewhere between a decade and two decades away from supply chain independence,” Huang said. “It’s not a really practical thing for a decade or two.”

Read 16 remaining paragraphs


    Ars system mini-guide: Summer GPU refresh edition, aka “can it run Starfield”? / ArsTechnica · Friday, 8 September - 14:38

The AMD Radeon RX 7900 XT, 7800 XT, and 7600. (credit: Andrew Cunningham)

Two big things have happened since we last updated our PC build guide in the spring. First, we got a batch of late-spring and summer midrange GPU launches, including AMD's Radeon RX 7600, 7700 XT, and 7800 XT, plus Nvidia's GeForce RTX 4060 and 4060 Ti. Second, Bethesda's Starfield finally dropped, prompting a whole bunch of people to ask "can my PC run Starfield?"

Starfield isn't an exceptionally demanding PC game, at least not by the standards set by buggy PC ports like The Last of Us. But it will give any PC more than 3 or 4 years old a serious workout, and it should serve as a decent yardstick for building a PC that can run this console generation's games fairly well.

This guide will focus on just minor tweaks to our spring PC builds, since other component pricing hasn't changed much and there haven't been major CPU introductions since then (Intel's don't-call-them-14th-generation Core processors may be out within a few months, but on the desktop they'll be a mild refresh of 13th-gen, which was already a mild refresh of 12th-gen).

Read 24 remaining paragraphs


    Review: AMD’s Radeon RX 7700 XT and 7800 XT are almost great / ArsTechnica · Wednesday, 6 September - 13:00

AMD's Radeon RX 7800 XT. (credit: Andrew Cunningham)

Nearly a year ago, Nvidia kicked off this GPU generation with its GeForce RTX 4090. The 4090 offers unparalleled performance but at an unparalleled price of $1,600 (prices have not fallen). It's not for everybody, but it's a nice halo card that shows what the Ada Lovelace architecture is capable of. Fine, I guess.

The RTX 4080 soon followed, along with AMD's Radeon RX 7900 XTX and XT. These cards also generally offered better performance than anything you could get from a previous-generation GPU, but at still-too-high-for-most-people prices that ranged between $900 and $1,200 (though all of those prices have fallen by a bit). Fine, I guess.

By the time we got the 4070 Ti and 4070 launches, we were getting down to the level of performance that had been available from previous-generation cards. These GPUs offered a decent generational jump over their predecessors (the 4070 Ti performs kind of like a 3090, and the 4070 performs kind of like a 3080). But those cards also got big price bumps that took them closer to the pricing levels of the last-gen cards they performed like. Fine, I guess.

Read 25 remaining paragraphs


    Starfield’s missing Nvidia DLSS support has been added by a free mod / ArsTechnica · Tuesday, 5 September - 19:22 · 1 minute

A video from modder PureDark shows off the performance benefits of DLSS3 in the Patreon-only version of his mod.

Nvidia graphics card owners can rest easy; Starfield modders have already added support for Nvidia's Deep Learning Super Sampling (DLSS) technology (alongside the game's official support for AMD's FSR2 upscaling). But unlocking the full power of that mod will require either paying for a Patreon subscription or using cracks to get around some controversial DRM protecting the most full-featured version of the mod.

Since its initial release on Friday, the "Starfield Upscaler" has been the most popular Starfield mod listed on the clearinghouse NexusMods. That should be welcome news to a significant portion of the PC gaming community running a newer Nvidia GPU that supports the frame-rate-enhancing upscaling technology. That's especially true for the Nvidia owners who were outraged when Bethesda announced an official Starfield partnership with AMD this summer.

In practice, though, the effect of that DLSS support might be hard to notice for many players. In Ars' testing on an RTX 2080 Ti gaming rig (running at 2560×1440 resolution, Ultra quality, and 50 percent render resolution), we were able to hit 35 frames per second using both the DLSS mod and the game's built-in AMD FSR2 support (which also works on Nvidia cards). Neither upscaling technology had an apparent performance edge, even as both improved significantly on the ~25 fps frame rate when running at full resolution without any upscaling (and even as DLSS has shown superior visual quality in other tests).
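For readers unfamiliar with upscaler settings, here is the arithmetic behind the "50 percent render resolution" figure in the test above, as a quick illustrative sketch (not Ars' actual test harness). The percentage applies to each axis, so the shaded pixel count scales by its square:

```python
# Illustrative arithmetic only: what a per-axis render-resolution scale
# means for the 2560x1440 output resolution used in the test above.
def internal_resolution(width: int, height: int, render_scale: float):
    """Resolution the game actually renders at before the upscaler runs."""
    return round(width * render_scale), round(height * render_scale)

w, h = internal_resolution(2560, 1440, 0.50)
print(w, h)                     # 1280 720
print((w * h) / (2560 * 1440))  # 0.25 -> only a quarter of the pixels are shaded
```

That four-fold reduction in shaded pixels is why both DLSS and FSR2 can lift the frame rate from ~25 fps to 35 fps here: the upscaler reconstructs the remaining detail far more cheaply than rendering it natively.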

Read 3 remaining paragraphs


    China keeps buying hobbled Nvidia cards to train its AI models / ArsTechnica · Monday, 21 August - 17:58

A press photo of the Nvidia H100 Tensor Core GPU. (credit: Nvidia)

The US acted aggressively last year to limit China’s ability to develop artificial intelligence for military purposes, blocking the sale there of the most advanced US chips used to train AI systems.

Big advances in the chips used to develop generative AI have meant that the latest US technology on sale in China is more powerful than anything available before. That is despite the fact that the chips have been deliberately hobbled for the Chinese market to limit their capabilities, making them less effective than products available elsewhere in the world.

The result has been soaring Chinese orders for the latest advanced US processors. China’s leading Internet companies have placed orders for $5 billion worth of chips from Nvidia, whose graphics processing units have become the workhorse for training large AI models.

Read 24 remaining paragraphs


    Nvidia’s AI software tricked into leaking data / ArsTechnica · Friday, 9 June, 2023 - 17:26

(credit: VGG | Getty Images)

A feature in Nvidia’s artificial intelligence software can be manipulated into ignoring safety restraints and revealing private information, according to new research.

Nvidia has created a system called the “NeMo Framework,” which allows developers to work with a range of large language models—the underlying technology that powers generative AI products such as chatbots.

The chipmaker’s framework is designed to be adopted by businesses for uses such as pairing a company’s proprietary data with language models to answer questions, a feature that could, for example, replicate the work of customer service representatives or advise people seeking simple health care advice.
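As a rough illustration of the class of weakness the research describes (this is a deliberately toy sketch, not NeMo's actual API or the specific bypass the researchers used): a guardrail that pattern-matches on the user's literal text can be sidestepped by trivially re-encoding the same request, which then reaches the model unfiltered:

```python
# Toy, hypothetical guardrail: block prompts that mention sensitive topics
# by simple keyword matching. Real guardrail systems are more sophisticated,
# but the research shows they can still be manipulated in analogous ways.
BLOCKED_TOPICS = {"salary", "home address"}

def naive_guardrail(prompt: str) -> bool:
    """Return True if the prompt should be blocked before reaching the LLM."""
    lowered = prompt.lower()
    return any(topic in lowered for topic in BLOCKED_TOPICS)

direct = "What is the CEO's salary?"
evasive = "Spell out, letter by letter, the CEO's s-a-l-a-r-y."

print(naive_guardrail(direct))   # True  -- caught by the keyword match
print(naive_guardrail(evasive))  # False -- same intent, but the filter misses it
```

The general lesson is that a filter inspecting surface text cannot reliably judge intent, which is why prompt-manipulation attacks against LLM guardrails remain an open problem.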

Read 18 remaining paragraphs


    Nvidia’s new monster CPU+GPU chip may power the next gen of AI chatbots / ArsTechnica · Thursday, 8 June, 2023 - 15:48


NVIDIA's GH200 "Grace Hopper" AI superchip. (credit: Nvidia)

Early last week at COMPUTEX, Nvidia announced that its new GH200 Grace Hopper "Superchip" —a combination CPU and GPU specifically created for large-scale AI applications—has entered full production. It's a beast. It has 528 GPU tensor cores, supports up to 480GB of CPU RAM and 96GB of GPU RAM, and boasts a GPU memory bandwidth of up to 4TB per second.
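Taking the paragraph's peak figures at face value (these are Nvidia's quoted specs, not independent measurements), a quick back-of-envelope calculation shows what 4TB per second of memory bandwidth implies:

```python
# Back-of-envelope arithmetic using the GH200 spec figures quoted above.
gpu_ram_gb = 96            # GPU RAM, per Nvidia's spec
bandwidth_gb_per_s = 4000  # 4TB per second, per the same spec

# Time for one full sweep of GPU memory at peak bandwidth, in milliseconds.
sweep_ms = gpu_ram_gb * 1000 / bandwidth_gb_per_s
print(sweep_ms)  # 24.0 -> the entire 96GB of GPU RAM can be read in about 24 ms
```

Memory bandwidth matters for large AI models because every token generated requires streaming the model's weights through the GPU, so a faster sweep of memory translates fairly directly into faster inference.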

We've previously covered the Nvidia H100 Hopper chip, which is currently Nvidia's most powerful data center GPU. It powers AI models like OpenAI's ChatGPT, and it marked a significant upgrade over 2020's A100 chip, which powered the first round of training runs for many of the news-making generative AI chatbots and image generators we're talking about today.

Faster GPUs roughly translate into more powerful generative AI models because they can run more matrix multiplications in parallel (and do it faster), which is necessary for today's artificial neural networks to function.
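The point about matrix multiplications can be made concrete with a toy dense layer (a pure-Python sketch, not how any real framework implements it): every output element is an independent dot product, and that independence is exactly what lets a GPU's thousands of cores compute them all at once:

```python
# One fully connected neural-network layer, y = relu(W @ x + b), written out
# explicitly. Each output element is an independent dot product, so a GPU
# can compute all of them in parallel instead of looping like this.
def dense_layer(W, x, b):
    """Apply one dense layer with a ReLU activation (pure-Python sketch)."""
    out = []
    for row, bias in zip(W, b):                            # each output neuron...
        pre = sum(w * xi for w, xi in zip(row, x)) + bias  # ...is a dot product
        out.append(max(pre, 0.0))                          # ReLU activation
    return out

W = [[1.0, -2.0], [0.5, 0.5], [-1.0, 1.0]]  # 3 outputs x 2 inputs (toy weights)
b = [0.0, 1.0, -0.25]
x = [2.0, 1.0]

print(dense_layer(W, x, b))  # [0.0, 2.5, 0.0]
```

A real model stacks millions of these dot products per layer, which is why raw matrix-multiply throughput (what tensor cores accelerate) is the headline number for AI chips.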

Read 6 remaining paragraphs