

They thought they consumed the onion, but in reality the onion consumed them
Currently studying CS and some other stuff. Best known for previously being top 50 (OCE) in LoL, expert RoN modder, and creator of RoN:EE’s community patch (CBP). He/him.
(header photo by Brian Maffitt)
They thought they consumed the onion, but in reality the onion consumed them
Do you not see any value in engaging with views you don’t personally agree with? I don’t think agreeing with something is a good barometer for whether it’s post-worthy.
If the GN tests accurately map to whatever the Navy’s using, the difference in most games isn’t that significant despite the suboptimal cooling, and if they’re usually just playing TF2 and Halo 2 (as per the article) then even 50% of full performance should still be plenty.
it’s not good for us at all.
I personally think it’s good to be considerate of others when it has essentially no negative effects, but you’re welcome to do whatever the fuck you want
last-generation Alienware (Dell) machines
I noticed that too, it might be a case of military procurement delay ¯\_(ツ)_/¯
Well, yes
The MoD sees embracing gamer culture as a way of attracting and retaining young people, particularly for roles in cyber defence and technology-focused positions. The UK government launched a recruitment plan this year to fast-track gamers into cyber defence roles.
https://aussie.zone/post/252773
I don’t personally care so long as the meaning is still obvious, so I try to keep it in mind for the sake of those who do
She already has 1 mil subs lol
I assume the proposed substitutions have to be genre-constrained
FYI your first link isn’t actually inside of your spoiler tag
Ngl I kind of assumed we would’ve just made them out of something biodegradable and food-safe because it seems insane to use them otherwise
p.s. here’s the intended link for people on platforms that don’t support those types of edits: https://www.theguardian.com/australia-news/2025/jun/29/calls-for-australia-wide-crackdown-on-real-estate-ads-that-use-ai-to-hide-faults-and-lure-in-renters
That was honestly very disorientating lol
Is this what people who get motion sick playing games feel like?
An image from 2023 with the quality of one from the '00s lol
#JusticeForEurope
The animation is cool but man, not quite hitting on the beat is a bummer lol
Sorry, I forgot this was a thing with Wired articles (I don’t read regularly enough for it to usually be a problem) - I’ll add an archive link to the OP
said Prime Minister Netanyahu, “no one should be allowed to have them as they destabilise the region.”
It doesn’t have to be completely fictional to be satire
If you come to the comments having eaten the onion hard drive I swear to god I’m going to slap you
It covers the breadth of problems pretty well, but I feel compelled to point out that there are a few times where things are misrepresented in this post e.g.:
The MSRP for a 5090 is $2k, but the MSRP for the 5090 Astral – a top-end card being used for overclocking world records – is $2.8k. I couldn’t quickly find the European MSRP but my money’s on it being more than 2.2k euro.
NVENC isn’t much of a moat right now, as both Intel and AMD’s encoders are roughly comparable in quality these days (including in Intel’s iGPUs!). There are cases where NVENC might do something specific better (like 4:2:2 support for prosumer/professional use cases) or have better software support in a specific program, but for common use cases like streaming/recording gameplay the alternatives should be roughly equivalent for most users.
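For anyone wanting to compare the three vendors’ encoders themselves, here’s a rough sketch of how each H.264 hardware encoder is invoked in ffmpeg. This is illustrative only: it assumes an ffmpeg build with the relevant encoders compiled in, a GPU from the matching vendor, and a hypothetical input file `gameplay.mkv`; the preset/bitrate flags are just reasonable starting points, not tuned settings.

```shell
# Nvidia NVENC (presets p1 = fastest .. p7 = slowest/best)
ffmpeg -i gameplay.mkv -c:v h264_nvenc -preset p5 -b:v 6M out_nvenc.mp4

# Intel Quick Sync (works on recent iGPUs and Arc cards)
ffmpeg -i gameplay.mkv -c:v h264_qsv -preset medium -b:v 6M out_qsv.mp4

# AMD AMF
ffmpeg -i gameplay.mkv -c:v h264_amf -quality balanced -b:v 6M out_amf.mp4
```

Encoding the same clip at the same bitrate with all three and eyeballing the results (or scoring them with a metric like VMAF) is a quick way to sanity-check the “roughly equivalent” claim for your own use case.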
Production apparently stopped on these for several months leading up to the 50-series launch; it seems unreasonable to harshly judge the pricing of a product that hasn’t had new stock for an extended period of time (of course, you can then judge either the decision to stop production or the still-elevated pricing of the 50 series).
I personally find this take wild given that DLSS 2+ / FSR 4+, when quality-biased, deliver visual quality comparable to native for most users in most situations, and that finding dates back to DLSS 2 in 2023, not even DLSS 3, let alone DLSS 4 (which is markedly better on average). I don’t really care how a frame is generated if it looks good enough (and doesn’t come with other notable downsides like latency). This almost feels like complaining about screen space reflections being “fake” reflections. Like yeah, it’s fake, but if the average player experience is consistently better with it than without it then what does it matter?
Newer, more complex manufacturing nodes are becoming expensive as all fuck. If it’s more cost-efficient to spend some of that die area on specialized cores that do high-quality upscaling than to spend all of it on native rendering, then that’s fine by me. I don’t think branding DLSS (and its equivalents like FSR and XeSS) as “snake oil” is the right takeaway. If the options are (1) spend $X on a card that outputs 60 FPS natively or (2) spend $X on a card that outputs upscaled 80 FPS at quality good enough that I can’t tell it’s not native, then sign me the fuck up for option #2. People less fussy about static image quality and more invested in smoothness can be perfectly happy with 100 FPS and marginally worse image quality. Not everyone is as sweaty about static image quality as some of us in the enthusiast crowd are.
There are some fair points here about RT (though I find exclusively using path tracing for RT performance testing a little disingenuous given the performance gap), but if RT performance is the main complaint then why is the sub-heading “DLSS is, and always was, snake oil”?
obligatory: disagreeing with some of the author’s points is not the same as saying “Nvidia is great”