<![CDATA[ PCGamer ]]> https://www.pcgamer.com Fri, 16 Aug 2024 16:34:34 +0000 en <![CDATA[ 'Without using these inventions, Western Digital would not be able to compete in the market': WD owes $262,000,000 in damages over a HDD patent dispute ]]> Western Digital is a name synonymous with hard drives, system storage, and all things spinny-platter, with the sort of brand recognition many of its competitors must envy. It looks like it's going to have to reach into its pockets and find a considerable amount of spare change, however, as a Californian jury has decided that the company has violated patent rights and owes damages to the tune of $262m.

MR Technologie (MRT) sued Western Digital in August 2022, claiming that it had infringed upon two patents filed by Dieter Suess, a professor and head of the Physics of Functional Materials department at the University of Vienna—and owner of MRT (via Blocks & Files).

The patents referred to methods that increase the signal-to-noise ratio in an HDD by using magnetic anisotropy effects to help bits change direction. The lawsuit alleged that several of Western Digital's hard drive products contained technology that infringed the patented techniques.

MRT's lawyers reportedly accused WD of using these patented methods without authorisation, which allowed it to increase areal density on the HDDs in question from 300 Gbit/sq in to 1,000 Gbit/sq in. According to a court transcript obtained by Reuters, MRT attorney Mark Fenster of Russ August & Kabat said, during closing arguments:

"Without using these inventions, Western Digital would not be able to compete in the market."

Western Digital attorney Douglas Lumish disagreed: "MRT's lawyers have given false credit, to a fairly magnificent extent, to Dr. Suess for the work of thousands of [Western Digital] engineers over decades and across the planet."

Western Digital has said it will appeal the verdict "as soon as possible".

For a company with a market value estimated at $20.86 billion as of August 16 this year, rustling up $262m in damages may seem like a drop in the ocean. That being said, the accusation that Western Digital violated patent rights in order to compete in the storage market will sting, and it's likely WD will do everything within its power to remove that unsightly mark from its name. Just when you thought spinning platter hard drives were boring, ey?

]]>
https://www.pcgamer.com/hardware/storage/without-using-these-inventions-western-digital-would-not-be-able-to-compete-in-the-market-wd-owes-dollar262000000-in-damages-over-a-hdd-patent-dispute fMtDyVXCrTRBjvxfZLJLA Fri, 16 Aug 2024 16:19:38 +0000
<![CDATA[ Super-svelte CAMM2 memory can deliver higher clock speeds, lower latencies, costs, and even better system cooling says MSI ]]> Let's start with the obvious: CAMM2 memory is flat. Initially designed to fit inside mobile devices thanks to its ultra-slim design, CAMM2 has since been heralded as potentially the future of desktop RAM, and we've seen several manufacturers show off super-skinny modules of CAMM2 that may soon be slotting in (or should that be, mounting on?) our desktop PC motherboards.

But beyond looking odd (it can't just be me crossing my eyes looking at a desktop mobo with no RAM sticking up from the board, can it?), there are several potential advantages to the new form factor, at least according to MSI. It live-streamed a deep dive into CAMM2, espousing the benefits of the new standard and why it might show up in your next build (via Wccftech).

Unlike traditional RAM sticks, CAMM2 modules are directly connected to the corresponding CAMM2 interface, rather than hanging off the bus via "stubs", which hurt signal integrity and limit the bus speed. This more direct connection allows both the inner and outer IMC (Integrated Memory Controller) channels to connect to a single CAMM2 module, which means you can have dual-channel operation on a single module—unlike SO-DIMMs, for example, where you'd need two separate sticks.
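
To put a rough number on why that matters, here's a back-of-the-envelope sketch of peak dual-channel DDR5 bandwidth, assuming an illustrative DDR5-6400 module (the speed is just an example, not an MSI-quoted figure):

```python
# Back-of-the-envelope peak bandwidth for dual-channel DDR5, assuming an
# illustrative DDR5-6400 kit. Each DDR5 channel is 64 bits (8 bytes) wide;
# a single CAMM2 module carries both channels, where SO-DIMMs need one
# stick per channel.
transfers_per_sec = 6_400e6   # DDR5-6400: 6,400 MT/s (example speed)
bytes_per_channel = 8         # 64-bit channel width
channels = 2                  # both channels on one CAMM2 module

peak_gb_s = transfers_per_sec * bytes_per_channel * channels / 1e9
print(f"Peak dual-channel bandwidth: {peak_gb_s:.1f} GB/s")  # ~102.4 GB/s
```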

CAMM2 motherboards also require fewer (and shorter) signal traces, cutting down on costs and potentially leading to higher clocks and lower latency.

The modules have a smaller PCB overall compared to traditional RAM sticks of similar capacity, with only a single PMIC (Power Management Integrated Circuit). Not only does this lead to further reduced costs compared to a traditional DIMM, but also lower power consumption and less heat.

Speaking of heat, CAMM2 is also said to improve cooling. Essentially, RAM sticking up proud from the motherboard can block airflow to the underneath of a CPU cooler, whereas mounting it flush allows air to pass over the top. Traditional RAM sticks can also get in the way of fan placement and water-cooling system tubing, whereas clever old CAMM2 tucks itself neatly to the board, leading to more cooling options.

That being said, Bitspower has already shown off a water block designed to sit on top of a CAMM2 module, which takes up a fair bit of the space that would be otherwise saved and looks like it might reduce some of those benefits for the rest of the system.

That's not to say all cooling solutions would need to be that large, however. MSI's CAMM2 heatsink features a thermal pad on the front-facing part of the module, with a corresponding thermal pad on the PCB that directs heat to a metal mounting bracket on the rear side of the mobo. Dual-channel CAMM2 DDR5 module designs feature DRAM ICs on the rear, necessitating the rear pad and bracket solution to stay chilled.


And finally, there's repairability to consider. CAMM2 connectors are said to be cross-compatible across all motherboards and aren't soldered to the board. This means that, unlike a traditional DIMM slot, a damaged connector should be a lot easier to replace.

There's also only one way it fits into place, which should help prevent the cack-handed among us (and I'm including myself here) from forcing a square peg into a round hole.

Overall then, CAMM2 modules may well be an improvement over the current form factor, and they look pretty neat, to boot. With rear-connector motherboards cropping up left, right and centre, and manufacturers like Asus experimenting with innovative cable-hiding methods, it looks like we'll be seeing some very flush-looking desktop PC internals in the next few years—although whether CAMM2 ends up becoming the new RAM standard, I suppose we'll just have to wait and see.

]]>
https://www.pcgamer.com/hardware/memory/super-svelte-camm2-memory-can-deliver-higher-clock-speeds-lower-latencies-costs-and-even-better-system-cooling-says-msi 6caueXcigmvZzkiCt7JHeX Fri, 16 Aug 2024 15:18:40 +0000
<![CDATA[ Grab this mighty fast RTX 4080 Super and save more than $140 off the MSRP ]]>

MSI RTX 4080 Super | 16 GB GDDR6 | 10,240 CUDA Cores | 2,595 MHz boost | $1,029.99 $855.99 at Walmart (save $174)
With an MSRP of $999, the RTX 4080 Super is a better deal than the original RTX 4080. It's still a lot of money, but a deal like this one makes it a very tempting purchase.

RTX 4080 price check: Amazon $958 | Best Buy $959.99 | Newegg $959.99

When Nvidia launched the GeForce RTX 4080 Super earlier this year, we were pleased to see that its MSRP was lower than the original RTX 4080's—$999 instead of $1,199 was a much nicer deal. But given that it was only a fraction quicker, the RTX 4080 Super still felt like it wasn't great value for money.

But this MSI deal goes a long way to counter that and has more than $140 shaved off the Super's suggested retail price.

So what exactly are you getting for your $856? As with all of the current Nvidia RTX 40-series, it's powered by the Ada Lovelace architecture, which is well regarded for being light on power consumption but heavy on the gaming chops. The RTX 4080 Super sports 10,240 CUDA cores (you can just call them shaders, if you like), and this MSI model's boost clock of 2,595 MHz gives it a peak FP32 throughput of 53.1 TFLOPS.
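
If you're curious where that 53.1 TFLOPS figure comes from, it's just spec-sheet arithmetic—cores, times two FP32 operations per clock, times boost frequency—as this quick sketch shows:

```python
# Where the 53.1 TFLOPS figure comes from: each CUDA core can retire two
# FP32 operations per clock (one fused multiply-add), so peak throughput
# is simply cores x 2 x boost clock.
cuda_cores = 10_240
boost_clock_hz = 2_595e6          # 2,595 MHz boost on this MSI model
fp32_ops_per_core_per_clock = 2   # an FMA counts as two floating-point ops

peak_tflops = cuda_cores * fp32_ops_per_core_per_clock * boost_clock_hz / 1e12
print(f"Peak FP32 throughput: {peak_tflops:.1f} TFLOPS")  # ~53.1
```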

If you're not sure if that's a lot, only the Radeon RX 7900 XTX and RTX 4090 are higher (61.4 and 82.6 TFLOPS, respectively).

The RX 7900 XTX is a good GPU to compare the RTX 4080 Super against because this MSI model is cheaper than AMD's best GPU by a little over $50. While the Radeon card has more VRAM (24 vs 16 GB) and is a bit faster in games using standard rasterization, the RTX 4080 Super is considerably quicker when ray tracing is being used.

And then you've got the full DLSS 3.5 feature set of the RTX card. Both GPUs support upscaling and frame interpolation for boosting performance, but Nvidia's AI-powered systems are generally better, especially its frame generation system. In terms of performance, there's not much to separate them, but DLSS arguably produces better-looking results.

Something else that the RTX 4080 Super is better at is managing its power consumption. On paper, there's not much to separate them, with the Nvidia card having a 320 W limit and the Radeon topping out at 355 W. However, in actual gaming, the Ada Lovelace chip generally stays around the 300 W mark, whereas the RDNA 3 processor heads north of 340 W.

That makes it a little easier for the graphics card's cooler to keep things...err...cool and more importantly, it means the RTX 4080 Super is generally the quieter of the two cards.

At $999, the GeForce RTX 4080 Super is a great but expensive graphics card, but with this deal, it's a much nicer prospect. It's still a lot of money, of course, but until the next generation of GPUs arrives from AMD, Intel, and Nvidia, it's as good as it gets right now.

]]>
https://www.pcgamer.com/hardware/graphics-cards/grab-this-mighty-fast-rtx-4080-super-and-save-more-than-dollar140-off-the-msrp t79Z968uXgtE64Uqy4Qg3A Fri, 16 Aug 2024 15:03:53 +0000
<![CDATA[ Arm reportedly spooling up major new GPU architecture to take on Nvidia ]]> Arm is working on a new GPU design and has no less a target than Nvidia in its sights. That's quite the claim, but it's exactly what Israeli business website Globes is reporting.

Arm is said to be employing a team of up to 100 GPU engineers at its development center in Ra'anana, Israel. But what's not clear is what kind of graphics architecture they've been tasked to build, with the two obvious options being aimed at video rendering or AI training and inferencing. 

Arm already has several GPU designs on its shelves, including the Mali and Immortalis series, which are typically offered as IP that chip makers can license. Arm itself isn't in the business of making and selling GPUs. That applies to CPUs, too, which are the mainstay of its chip-design licensing business.

However, the Mali and Immortalis series are fairly traditional graphics units that are actually intended to process, well, graphics. That's a whole different ball game from building a GPU to process AI.

Granted, Nvidia's AI and gaming GPU architectures are developed in tandem and have shared elements. But there's a whole world of difference between a $40,000 Nvidia AI chip and even its priciest desktop graphics cards.

Moreover, while Arm has form when it comes to both graphics processing and licensing designs for SoCs that include hardware for that specific job, it has no track record in either AI or discrete GPUs.

Unfortunately, the Globes story provides few insights into any of the details. However, it does claim that the Ra'anana facility has been working with Israeli startup NeuReality on its new SR1 hardware for accelerating AI inferencing, which is said to be 90% cheaper than doing the same job with Nvidia GPUs.

All of which means it's hard to draw any firm conclusions. With that in mind, we'd suggest that it's pretty unlikely Arm's plan involves discrete gaming graphics cards. There's far, far more money to be made in cranking out an alternative to Nvidia's all-conquering AI hardware.


On the other hand, those overlaps between graphics and AI acceleration remain. It definitely wouldn't be a complete surprise to see Arm's investments in the latter spill over into benefits for gaming graphics.

What's more, with Qualcomm having launched what you might call the first really serious attempt to get an Arm CPU into the PC with the Snapdragon X series, what Arm does in graphics has the potential to be that little bit more relevant to PC gaming and especially mobile gaming.

In short, don't go expecting to buy an Arm graphics card next year. But you might just be playing PC games on an all-Arm handheld somewhere down the road.

]]>
https://www.pcgamer.com/hardware/graphics-cards/arm-reportedly-spooling-up-major-new-gpu-architecture-to-take-on-nvidia CGSrz55SxjAD3LRCAyzPW9 Fri, 16 Aug 2024 14:37:37 +0000
<![CDATA[ You had one job: Outlook, Word and OneNote may 'unexpectedly close when typing' says Microsoft ]]> Software bugs are a fact of life. Given the increasing complexity of many modern, feature-packed applications, it's no surprise that something, at some point, is going to fall over. However, if an app's primary purpose in life is to handle text, you would hope the simple act of typing wouldn't cause it to close.

According to Microsoft, this exact issue is affecting certain installs of Outlook, Word and OneNote, and its Outlook and Office teams are currently investigating (via Neowin). In a support post, MS defines the problem as:

"After updating to [Office] Version 2407 Build 17830.20138 or higher you find that Outlook, Word, or OneNote may unexpectedly close when typing or doing other authoring tasks such as spell check."

Yep, the simple act of typing or performing a spell check can cause the applications to close, undoubtedly inspiring a choice selection of curse words from affected users. Microsoft says that you can confirm the issue by looking for Event 1000 or Event 1001 in the Windows Event Viewer Application Log and that the issue may be caused by older language packs:

"The faulting module name will vary depending on what language packs you have installed. For example, mscss7it.dll for Italian, or mscss7ge.dll, for German, and others could include: EN, ES, FR, GE, IT, NP, PB."

Currently, MS advises a workaround involving an online repair of the affected Office application, searching for and uninstalling old Office language installations, and reinstalling the Language Accessory Pack for Microsoft 365. 

Affected users are also pointed towards a thread on the Microsoft forums in which multiple users report their Outlook installations crashing from simply typing, saving drafts, and typing in different languages—both in the Windows version and the Android application. 

Many of the users in the thread report that they were typing in (or spellchecking) German when the problem occurred, although by the look of the support post MS seems to think multiple languages could cause the affected apps to close.

Chalk one up for Notepad, I guess, although it's not like you can send an email with the app I internally refer to as "Ol' Reliable". Still, it does have a surprisingly good spellcheck and multiple useful features these days, and I've yet to have it crash on me while I'm taking notes. Sometimes the classics are the best, ey?

]]>
https://www.pcgamer.com/software/you-had-one-job-outlook-word-and-onenote-may-unexpectedly-close-when-typing-says-microsoft ii7Xu43LNzSe7qwoxVS8RX Fri, 16 Aug 2024 13:21:05 +0000
<![CDATA[ Leaked Intel Arrow Lake CPU benchmarks show generational performance regression but that might not be the whole story ]]> What with the ongoing 13th and 14th Gen CPU debacle, wouldn't it be nice if Intel had a hot new generation of desktop processors to fix everything? In theory, that would be Arrow Lake, due out later this year.

The problem is, some new leaked benchmarks spotted by X account Benchleaks (via Tom's Hardware) don't exactly paint a stellar picture of Arrow Lake's performance prowess.

The chip in question is the upcoming Intel Core Ultra 7 265KF. It's expected to slot into the range in the same spot as the existing Core i7-14700KF. So, it's a second rung model with the integrated GPU disabled.

But what of those benchmark results? Well, we're talking Geekbench 6, which comes with the usual caveats. Whatever, the leaked numbers for the 265KF are 3,219 single-thread points and 19,433 multi-thread points.

That compares with 3,005 and 19,595 points, respectively, for the Core i7-14700KF (read our review of the similar Core i7 14700K here). Notably, the new Arrow Lake CPU is expected to offer the same eight Performance and 12 Efficient core configuration as the 14700KF.
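
Taking the leaked numbers at face value, the generation-on-generation deltas are easy enough to work out:

```python
# Generational deltas implied by the leaked Geekbench 6 scores
# (Core Ultra 7 265KF vs Core i7-14700KF), taken at face value.
scores = {
    "single-thread": (3_219, 3_005),    # (leaked 265KF, 14700KF)
    "multi-thread":  (19_433, 19_595),
}
for test, (arrow_lake, raptor_lake) in scores.items():
    delta = (arrow_lake - raptor_lake) / raptor_lake * 100
    print(f"{test}: {delta:+.1f}%")     # roughly +7.1% and -0.8%
```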

However, the big difference with Arrow Lake is that it ditches Hyper-Threading from its Performance cores. That means it drops eight threads from its overall thread count versus its predecessor.

With that in mind, the slight regression in multi-thread performance does make sense. Indeed, it's actually less of a drop than you might expect from the removal of Hyper-Threading support.

But if accurate, any actual regression in multi-thread performance, generation-on-generation, is still very disappointing from what is meant to be a pretty major architectural advance.

The other part of the performance equation, of course, is single-thread performance. Here the new Arrow Lake processor does gap the old Raptor Lake, but not by as much as you'd hope for from a brand new CPU design that seems to be making a fairly significant concession regarding multi-threading.

These Geekbench 6 numbers indicate a single-thread uptick of just over 7%. That's hardly a radical jump. However, multiple factors are not clear. What clock speed is the new chip running at? And is that speed representative of the final product?

The Geekbench entry says 3.9 GHz for the Arrow Lake chip's base clock, which compares to 3.4 GHz for the 14700KF. But again, it's not known if this is an accurate or final figure. And the Max Turbo figure simply isn't known, be that for this particular chip or for the final retail processors.

Moreover, final revisions to elements like microcode can unlock further performance. So, there's a good chance 7% won't be the final single-thread performance delta. And Geekbench is but a single data point, too.

But with all that in mind, if this is a real result from an actual Arrow Lake CPU, it's hard to imagine the final retail product will be cranking out, say, 25% more single-thread performance.

If you also factor in the fairly disappointing advances offered by AMD's latest Ryzen 9000 chips, well, this new generation of CPUs isn't exactly shaping up to be a classic. So, here's hoping Arrow Lake has a few other tricks up its sleeves that these supposed benchmarks do not reveal.

]]>
https://www.pcgamer.com/hardware/processors/leaked-intel-arrow-lake-cpu-benchmarks-show-generational-performance-regression-but-that-might-not-be-the-whole-story SBcy2pAV8gU2FMcdeDovnE Fri, 16 Aug 2024 11:23:09 +0000
<![CDATA[ The FTC is banning fake reviews, false testimonials and all sorts of shady consumer deception with an ominous sounding 'final rule' ]]> When it comes to deciding on your next purchase, be that PC hardware or otherwise, you'll probably be leaning pretty heavily on reviews to make your decision. But it won't come as too much of a surprise when I say that some reviews are less trustworthy than others, with obvious bot accounts, paid reviews and fake testimonials muddying the waters.

Now the FTC has announced what it's calling a 'final rule', aimed at combating exactly that (via PC World). It's been in the works since October 2022 and contains some pretty sweeping prohibitions on some of the biggest bugbears faced by consumers when searching through site reviews to find trustworthy content.

The final rule prohibits fake or false consumer reviews, consumer testimonials and celebrity testimonials, as well as prohibiting businesses from "providing compensation or other incentives conditioned on the writing of consumer reviews expressing a particular sentiment, either positive or negative." It also bans reviews and testimonials written by company insiders that fail to clearly disclose the reviewer's connection to the business.

There's also a prohibition on businesses misrepresenting websites or entities they control as independent reviewers, when voicing opinions about products or services they own. "Fake social media indicators" are also targeted, essentially banning the selling or buying of social media followers, or views generated by a bot or hijacked account.

Finally, the rule prohibits businesses from using "groundless legal threats, physical threats, intimidation or certain false public accusations to prevent or remove a negative consumer review."

In terms of penalties for breaching the rule, the FTC says that this new set of prohibitions will "enhance deterrence and strengthen FTC enforcement actions", and cites a previous Supreme Court decision that it says has hindered its ability to seek monetary relief for consumers up until this point.

"Fake reviews not only waste people’s time and money, but also pollute the marketplace and divert business away from honest competitors," said FTC Chair Lina M. Khan.

"By strengthening the FTC’s toolkit to fight deceptive advertising, the final rule will protect Americans from getting cheated, put businesses that unlawfully game the system on notice, and promote markets that are fair, honest, and competitive."

Given the flood of misleading review content across the internet, this tightening of the FTC's position should hopefully go some way towards deterring US businesses from acting unscrupulously when promoting their products.

As someone who writes many a hardware review, I can't help but cheer for the principles of fairness, proper disclosure and consumer rights receiving some beefed-up protections, at least as far as the FTC is concerned.

]]>
https://www.pcgamer.com/hardware/the-ftc-is-banning-fake-reviews-and-testimonials-with-an-ominous-sounding-final-rule gf9WST9Z2bcnMPuSy23AcP Fri, 16 Aug 2024 10:53:29 +0000
<![CDATA[ LG UltraGear 32GS95UE review ]]> The new LG UltraGear 32GS95UE is not perfect. And yet it does a pretty comprehensive job of blowing every existing 32-inch 4K gaming monitor based on Samsung's QD-OLED panel tech into last month. Wait, make that last year.

The thing is, LG's take on the high-refresh 4K OLED gaming monitor riff isn't on a totally different level to those QD-OLED panels. In fact, it's very similar. But it is undeniably and unambiguously—even if ultimately pretty marginally—better. Hold those thoughts.

On paper, the LG UltraGear 32GS95UE is very similar to the likes of, say, the Alienware 32 AW3225QF, Asus ROG Swift OLED PG32UCDM, Samsung Odyssey G8 OLED G80SD, or MSI MPG 321URX. Whether it's the 32-inch panel size, 4K native resolution, 240 Hz refresh or 0.03 ms response performance, LG's OLED monitor looks like a dead ringer for that quartet of Samsung QD-OLED panels.

LG's 275 nit full-screen brightness rating is actually a little brighter. But that's a "typical" rating, with LG rating the panel at 250 nits "minimum". If it's a close run thing in theory, full-screen brightness is the one area where you might have come into this review with some doubts.

UltraGear 32GS95UE specs


Screen size: 32-inch
Resolution: 3,840 x 2,160
Brightness: 275 nits full screen, 1,300 nits max HDR
Color coverage: 98.5% DCI-P3
Response time: 0.03 ms
Refresh rate: 240 Hz (480 Hz 1080p)
HDR: DisplayHDR 400 True Black
Features: LG WOLED panel, Adaptive Sync, 1x DisplayPort 1.4, 2x HDMI 2.1
Price: $1,399 | £1,300

That's because this LG monitor inevitably uses the WOLED panel technology from sister company LG Display, the subsidiary of the sprawling LG empire that makes the actual OLED panels which go into everything from monitors like this to TVs, phones, watches, cars and the rest. And the one metric by which LG WOLED tech has fallen short previously is full-screen brightness.

If that's now at least on par with Samsung QD-OLED, this particular 32-inch 4K beauty has something none of the Samsung-powered competition currently offers, namely a Dual Mode functionality which uses pixel doubling to essentially run as a native 1080p panel but with an extremely quick 480 Hz refresh rate. 

The idea is to provide the best of both worlds. You get full 4K capability for ultra-sharp, detailed image quality in games (which also benefits things like font rendering day to day), plus the ability to run 1080p at sky-high frame rates and ultra-low latency, without the need to interpolate a 1080p image over a 4K panel. Doing the latter always ends up looking soft and a little blurry compared to a native 1080p monitor of the same size. What's not to like?
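
To put some numbers on that pixel-doubling trick: each 1080p pixel maps onto a 2x2 block of the native grid, and the 480 Hz mode pushes exactly half the pixels per second of the 4K mode, so there's nothing to interpolate. A quick worked example:

```python
# Pixel throughput in the two Dual Mode settings. Each 1080p "pixel" maps to
# a 2x2 block of the native 4K grid, so no interpolation is needed.
def pixels_per_second(width: int, height: int, refresh_hz: int) -> int:
    return width * height * refresh_hz

uhd_240 = pixels_per_second(3840, 2160, 240)
fhd_480 = pixels_per_second(1920, 1080, 480)

print(f"4K @ 240 Hz:    {uhd_240 / 1e9:.2f} Gpixels/s")   # ~1.99
print(f"1080p @ 480 Hz: {fhd_480 / 1e9:.2f} Gpixels/s")   # ~1.00, exactly half
print(f"Native pixels per 1080p pixel: {(3840 * 2160) // (1920 * 1080)}")  # 4, i.e. a 2x2 block
```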

Other highlights include 98.5% coverage of the DCI-P3 digital cinema color space, Nvidia G-Sync compatibility, DisplayHDR True Black 400 certification, plus HDMI and DisplayPort connectivity along with a USB-A hub. In fact, really the only notable omission is a USB-C port.

Depending on your point of view, that may or may not be an issue. But at this extremely lofty price point, it hardly seems like an onerous expectation. Moreover, it's something of a pity given that the 4K resolution and pixel density, not to mention mostly excellent color accuracy, means this monitor does such a stellar job of bridging the void between gaming and productivity.


Put simply, it would be nice to be able to have both a proper gaming rig hooked up via DisplayPort and a laptop running in single-cable mode and picking up a desktop keyboard and mouse, all courtesy of USB-C. Odds are, if you can afford this monitor and you're into gaming, you'll also have a laptop to hook up via USB-C.

With that USB-C themed pico-rant squared away, that's the pre-game considerations covered off. Oh, with the exception of design and ergonomics. In truth, that aspect of the LG UltraGear 32GS95UE is somewhat forgettable, which is why I almost did forget. The slim bezels on all four sides of the panel ensure a slick, contemporary look, while the broad stand adds a touch of individuality. And it's certainly well put together and offers plenty of adjustability including rotation into portrait mode, if that's your thing. But it's not quite up there with Samsung's Odyssey monitor for sheer physical desirability.

But what, then, of the actual image quality? I'll tease you no longer. Here's why this LG is better than those Samsung QD-OLEDs. First, it doesn't suffer from the slightly warm color balance of those 4K QD-OLED monitors. Second, the panel doesn't turn slightly grey in bright ambient light, as QD-OLED panels do, which marginally detracts from their contrast performance and black levels. Third, it does all that while absolutely matching, if not bettering, the QD-OLED competition for full-screen brightness.

Now, those factors may not immediately seem like an absolutely overwhelming roll call of advantages. But we're talking about very expensive displays, so even a small edge matters. More to the point, those wins come with no discernible downsides. In other regards, this monitor is at least as good.


The net result is a ridiculously enjoyable monitor to use for just about anything. The best bit is probably the HDR performance. There's a particular sequence in Cyberpunk 2077 that's a great test of peak brightness. It's an underground bar scene, mostly dark and moody. But the actual bar is surrounded by banks of neon lights. And they absolutely, positively sizzle on this monitor. It's the most impressive rendering I've yet seen.

HDR video looks stellar, too, and really delivers on the whole High Dynamic Range premise. The contrast, the bright highlights right next to inky darkness, these are things that LCD monitors with local dimming just can't compete with.

They can't compete with the speed, either. Pixel response is essentially a solved issue with these OLED monitors. It's questionable whether you'd be able to tell the difference were they any faster. Of course, the 240 Hz refresh ensures very low latency, provided you have a GPU powerful enough to drive this monitor at high frame rates. And you can lower the latency yet further with the aforementioned 1080p mode.

Quick side note on that subject: The Dual Mode feature works slickly. There's a button on the bottom bezel you hit to jump between 4K@240 Hz and 1080p@480 Hz modes. The screen does blank out and the display will resync with your PC, but it happens fast enough. So, the big question is whether you'd mistake the 1080p mode for native 1080p on a 32-inch monitor.

The answer is no, you wouldn't. For sure, it looks a bit better than 1080p interpolated on a 4K 32-inch panel in the usual manner. And, in game, the experience looks closer to native than it does on the Windows desktop, the latter being really pretty fugly. But there's still a softness that belies any true pretence at native rendering. So, it's a welcome enough feature viewed as an extra. It just doesn't quite deliver on the dual-native premise.


Oh, and one last thing. Throughout all of this, we haven't touched on something that's typically fairly critical on an OLED gaming panel: the panel coating. Horror of horrors, the LG UltraGear 32GS95UE doesn't have a glossy coating, something that normally I'd say was a substantial disadvantage on an OLED monitor.

But somehow, the matte coating is just fine. Is it a little "glossier" than a typical matte finish? Possibly. But either way, the sense of contrast and inky black levels, not to mention highlight dazzle is barely, if at all, compromised. Consider my glossy-panel prejudices largely, if not quite comprehensively, dismantled. Oh, okay, were this monitor glossy it would probably look even better. But as it is, it's still my new favorite OLED monitor.


A shout out, too, to both full-screen brightness and SDR content handling. Regarding the former, you can set the panel to either a constant full-screen SDR brightness of around 250 nits or allow it to vary according to how much of the screen is lit up. That variable brightness behaviour has been a bit of a distraction on previous monitors with LG OLED panels.

However, this one is bright enough, full-screen, that it doesn't dim infuriatingly if you open up a large white app window, like a text doc or webpage. In fact, I think it works best in variable mode, which allows it to go that bit brighter most of the time. LG has also managed the calibration of SDR content in HDR mode very nicely. So, you can realistically run this thing in HDR mode all the time. Short of pro-level content creation, there's no need to jump between modes.


But wait, one definitely last thing. Font rendering is just fab on this panel. Again, it's down to the 4K native resolution on a relatively small 32-inch panel. The pixel density is plenty to cover up the non-standard subpixel structure of these OLED panels compared to a conventional RGB LCD monitor.

Buy if...

You want the best 4K gaming OLED out there: LG has done it. This monitor is better than the entire Samsung QD-OLED horde.

Don't buy if...

You want value for money: At $1,400, this is a ridiculously pricey panel, even taking into account how good it undoubtedly is.

As for negatives, if you really must insist, the panel color balance has the very slightest green tinge. It's very minor and not as apparent as the overly warm skew of those QD-OLED alternatives. But for the record, it is there.

All of which means this is one heck of a monitor. It's an HDR killer, the SDR handling and brightness are good, the pixel response is ridiculous and the Dual Mode is a nice little extra even if it isn't quite as advertised. The only thing missing is that USB-C interface, which I can forgive. What's harder to wish away, however, is the price.

This is definitely my favorite 4K OLED monitor. But does that justify the monstrous $1,400 price? After all, you can get a 32-inch 4K OLED for $900, fully $500 less. In the end, it's a personal call. If I could easily afford the extra money, I'd cough up. But if the added $500 was any kind of stretch, I'd be in quite the quandary. I really would.

]]>
https://www.pcgamer.com/hardware/gaming-monitors/lg-ultragear-32gs95ue-review KzYym7jhXWE43qxxh2U5Lg Fri, 16 Aug 2024 10:36:34 +0000
<![CDATA[ 'I'm still amazed that it all came together and actually works': YouTuber spends 14 months building a glorious gaming laptop from desktop parts ]]>

You ever look at your gaming laptop and think, "Damn, I wish I'd built this thing myself over the course of the last 14 months?" Me neither—but where would we be without those willing to boldly go where others won't?

YouTuber Socket Science spent over a year making an entirely bespoke DIY gaming laptop (via Hackaday), and not only does it seem to actually run games pretty well, it also looks rather dashing. This seems to be as much a surprise to Socket Science as to me, because they say they're "still amazed that it all came together and actually works".

In explaining why they decided to undertake this task, Socket Science points out a somewhat open secret about gaming laptop GPUs: they're not as good as the desktop graphics cards of the same name. 

A laptop RTX 4080 isn't as powerful as a desktop one. In fact, a laptop RTX 4080 uses the same AD104 GPU as a desktop RTX 4070 Ti. Throw in mobile power limitations and you have a not insignificantly worse chip.

So, what does Socket Science do about this? Yep, they re-fashion desktop parts into a gaming laptop. The parts in question are an AMD Ryzen 5 5600X, an XFX Radeon RX 6600, a Gigabyte ITX motherboard, some "very low profile RAM", a 120 Hz QHD portable gaming monitor, a thin keyboard, a touchpad, and a DC-to-DC power supply.

To state the obvious, building a laptop is nothing like building a PC. Building a PC is easy: just order the parts and fit them together with loads of space to spare. Not so with a laptop. Parts need to be packed in tightly, with dense cooling, and laid out very strategically. It's usually not a job for home enthusiasts.

Components inside Socket Science's DIY gaming laptop (Image credit: Socket Science on YouTube)

I won't spoil every detail because the video's well worth watching, but the general gist is as follows: Socket Science makes everything thin enough to place in laptop format thanks to a lot of motherboard desoldering and snipping followed by taking apart the graphics card to use just the PCB with the GPU on. 

There's lots of slapping together copper, heatsinks, and heatpipes, followed by putting this all together inside a 3D printed case alongside some 3D printed fans. The base of the build even includes fishing line and popsicle sticks, apparently. 

Finally, the LCD is taken out of the portable monitor and set up in a bespoke 3D printed lid, and the keyboard and touchpad are fitted into the base of the case. None of this is even mentioning the mottled copper finish.

Just writing my very simplified version made me feel a little overwhelmed—I can see why it took Socket Science 14 months. Worth it, though? I'll stick to spending a little extra on a laptop with a somewhat misleadingly named chip, myself, but I'm glad I've now witnessed a veritable gaming laptop being built in DIY fashion without having to raise a finger.

]]>
https://www.pcgamer.com/hardware/gaming-laptops/im-still-amazed-that-it-all-came-together-and-actually-works-youtuber-spends-14-months-building-a-glorious-gaming-laptop-from-desktop-parts 9zEYGEkHuXDczMoF8hXA55 Thu, 15 Aug 2024 16:30:10 +0000
<![CDATA[ Someone's already overclocked the AMD Ryzen 9 9900X and 9950X above 7.4 GHz ]]> It's only been a day since the AMD Ryzen 9 9900X and 9950X launched and it looks like some have already started pushing these new Zen 5 chips to their limits. They're some of the most powerful processors on the market (if not quite the best CPUs for gaming), so it's no surprise they can clock very high when push comes to shove.

MSI pointed out (via Wccftech) that a world record clock speed has been set for the AMD Ryzen 9000-series. This seems to be referring to overclocker TSAIK's recorded clock speeds of 7.4 GHz for the 12-core 9900X and 7.44 GHz for the 16-core 9950X, as submitted to overclocking score aggregator site HWBot. For reference, the stock boost clocks of the two chips are 5.6 GHz and 5.7 GHz respectively, which in turn is more than 1 GHz higher than their respective base clocks.

The big score is the one for the 9950X. Its 7.44 GHz almost matches the highest frequency I could find on HWBot for the previous-gen AMD Ryzen 9 7950X, which is 7.47 GHz, achieved by overclocker rog-fisher. These new chips have only been out a day, though, and the fact they're already approaching that previous-gen record is neat.
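
For a sense of scale, here's how those runs compare with AMD's stock boost clocks and the 7950X record (stock boost figures are AMD's published specs rather than part of the HWBot submissions):

```python
# Rough headroom of the liquid-nitrogen runs over AMD's stock boost clocks,
# plus how close the 9950X gets to the 7950X record mentioned above.
chips = {
    # name: (record clock in GHz, stock boost in GHz)
    "Ryzen 9 9900X": (7.40, 5.6),
    "Ryzen 9 9950X": (7.44, 5.7),
}
prev_gen_record = 7.47  # Ryzen 9 7950X, per HWBot

for name, (record, stock) in chips.items():
    headroom = (record / stock - 1) * 100
    print(f"{name}: {record} GHz, about {headroom:.0f}% above stock boost")

gap_mhz = (prev_gen_record - chips["Ryzen 9 9950X"][0]) * 1000
print(f"The 9950X run is roughly {gap_mhz:.0f} MHz shy of the 7950X record")  # ~30 MHz
```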

Of course, this score doesn't come close to the highest frequencies achieved by any overclock. Last year, for instance, Asus broke the clock speed world record with over 9 GHz on the Intel Core i9 14900KF. But, y'know, apples and oranges. And who's wanting to push Intel chips above defaults these days, anyway, with all the stability issues even at previous default settings?

These AMD chip overclocks clearly won't have been achieved with anything like a standard cooling setup. As with all extreme overclocks, the chips will have been cooled using liquid nitrogen, which I've heard is a very tricky thing to do right. Everything has to be waterproofed and if you cool the chip too quickly it can crack—it's not something for the faint of heart to try.

While we can't expect anything close to these record clock speeds from a regular overclock, it's nice to know the chips are capable of it when push comes to shove. Nice, but not all too surprising given Zen 5's architectural improvements that allow for better power and heat efficiency.

It's certainly good to see what Zen 5 chips are capable of when pushed, given that the mid-range AMD Ryzen 5 9600X and Ryzen 7 9700X are shipping in what seems, for all intents and purposes, like eco mode. So much so, in fact, that it's rumoured AMD will give these mid-range chips a post-launch TDP boost.

It is, however, a shame that the 9900X and 9950X are about as expensive as they come, at least for now. Let's hope for price drops before too long—my money's on some cuts when Intel Arrow Lake CPUs launch.

]]>
https://www.pcgamer.com/hardware/processors/someones-already-overclocked-the-amd-ryzen-9-9900x-and-9950x-above-74-ghz NCe2o5nDoKfChEs4VKeAog Thu, 15 Aug 2024 14:40:48 +0000