• sp3ctr4l@lemmy.dbzer0.com · 3 days ago

    My semi-educated guess is that undervolting could help, but the real problem is just fundamentally bad design: the power connector, and the raw amount of power being pushed through such a small space.

    Like… we literally already saw this kind of nonsense with the 40 series…

    https://www.tomshardware.com/pc-components/gpus/nvidia-confident-that-rtx-50-series-power-connectors-unlikely-to-melt-despite-higher-tdp

    So uh… no… ‘Tom’…? … no it does not look like everything has been solved on this front.

    Of course yes, I too would like to see more specialized, actual investigation into this… but uh yeah, my gut feeling is… they are just flying too close to the sun, and have hit the limits of the physics of heat dissipation.

    A 5090 has a max 575W power draw.

    That is fucking insane. That’s almost the power draw of an entire decently high-end gaming PC from a decade ago, maybe even more recently than that.
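    For a sense of scale, here’s a back-of-the-envelope comparison (the 2015-era figures below are approximate TDPs I’m assuming for illustration, not measurements):

    ```python
    # Rough comparison: one RTX 5090 vs. an entire ~2015 high-end gaming PC.
    # The 2015 figures are approximate TDPs (assumptions), not measured draws.
    RTX_5090_MAX_W = 575  # Nvidia's stated max board power

    pc_2015_w = {
        "GTX 980 (GPU)": 165,
        "i7-4790K (CPU)": 88,
        "board + RAM + drives + fans (est.)": 75,
    }
    total_2015 = sum(pc_2015_w.values())  # ~328 W for the whole system

    print(f"whole ~2015 PC under load: ~{total_2015} W")
    print(f"RTX 5090 alone: {RTX_5090_MAX_W} W "
          f"(~{RTX_5090_MAX_W / total_2015:.1f}x the entire older system)")
    ```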

    Even if the 50 series doesn’t have literally exploding power connectors, they evidently just cannot sufficiently manage the heat generated by the electrical power itself…

    Like… fans aren’t enough, and if they wanted to make these things last at all, they’d sell them all with AiO liquid cooling loops and fans.

    But I get the strong impression Nvidia fully doesn’t give a shit about the PC gaming crowd: they’re abusing their monopoly status and cult-like fandom, knowingly and intentionally selling unreliable garbage that is designed to obsolete itself.

    Nvidia’s entire pivot into AI upscaling, frame gen… that they massively coerced the wider gaming industry into… this is an unsustainable paradigm.

    • MudMan@fedia.io · 3 days ago

      That is a broader and well-litigated issue at this point. Going for more power draw wouldn’t be a problem in itself (hey, your microwave will pull 1000W and it doesn’t spontaneously combust). The problem is they designed the whole thing around what would safely (if poorly) handle 350W while staying tidy, and instead they are pushing 600W through it with meaningful cost-cutting shortcuts.
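      To put rough numbers on that (the pin counts and per-pin ratings below are the commonly cited figures for 8-pin PCIe and 12VHPWR connectors, treated here as assumptions rather than spec quotations):

      ```python
      # Per-pin current on a 12 V GPU connector: I = P / V, split across live pins.
      def amps_per_pin(watts: float, volts: float, live_pins: int) -> float:
          return watts / volts / live_pins

      # 8-pin PCIe: 150 W over 3 live 12 V pins, each commonly rated ~7 A or more.
      legacy = amps_per_pin(150, 12, 3)   # ~4.2 A per pin: a wide safety margin

      # 12VHPWR: up to 600 W over 6 live 12 V pins, commonly rated ~9.5 A each.
      modern = amps_per_pin(600, 12, 6)   # ~8.3 A per pin: a slim margin

      print(f"8-pin PCIe: {legacy:.1f} A per pin")
      print(f"12VHPWR:    {modern:.1f} A per pin")
      # With margins that thin, one badly seated pin pushes the remaining pins
      # past their rating, which is how localized melting gets started.
      ```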

      That is what it is, and I think it’s a more than reasonable dealbreaker, which leaves this generation of GPUs down to a low-tier Intel card with its own compatibility issues and a decent but expensive mid-tier AMD offering. We are at a very weird impasse and I have no intuition about where it goes looking forward.

      • sp3ctr4l@lemmy.dbzer0.com · 3 days ago

        … but your microwave is literally designed to heat things, and be used in short bursts, not constantly.

        But, yeah, it seems we basically agree.

        This whole situation is a mess.

        They would need to invent… some entirely new standard or paradigm for power distribution… or slap a liquid cooler on these things, and just fully announce ‘lol, we only make gaming hardware for the top 5% of PC gamers, by income distribution’.

        • MudMan@fedia.io · 3 days ago

          Yeah, that’s my point about the microwave thing. It’s not that the total power is too much, it’s that you need more reliable ways to get it where it needs to be.

          I don’t understand how massively ramping up the power led to thinner wires and smaller plugs, for one thing. Other than someone got fancy and wanted prettier looking cable management over… you know, the laws of physics. Because apparently hardware manufacturers haven’t gotten past the notion that PC enthusiasts want to have a fancy aquarium that also does some computing sometimes. They should have made this thing a chonker with proper mains power wires. It’s called hardware for a reason.
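          The ‘laws of physics’ part is plain Joule heating: the heat a wire sheds goes with the square of the current, and resistance climbs as the conductor gets thinner. A minimal sketch, with illustrative (assumed, not measured) resistance values:

          ```python
          # Joule heating in a single power wire: P_loss = I^2 * R.
          def wire_heat_w(current_a: float, resistance_ohm: float) -> float:
              return current_a ** 2 * resistance_ohm

          # Per-wire currents follow the per-pin math above; resistances are
          # illustrative assumptions (thinner wire means higher R per run).
          old = wire_heat_w(4.2, 0.010)  # 8-pin era: ~0.18 W of heat per wire
          new = wire_heat_w(8.3, 0.012)  # 12VHPWR:   ~0.83 W of heat per wire

          print(f"per-wire heat, old: {old:.2f} W | new: {new:.2f} W")
          # Doubling the current alone quadruples the heat (the I^2 term), and
          # thinner conductors raise R on top of that, all inside a smaller plug.
          ```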

          But I agree that the other option is completely changing how a PC is built. If you’re gonna have a GPU pulling 600W while the entire rest of the system is barely doing half of that maybe it’s time to rethink the idea of a modular board with sockets for cards, CPU and RAM and cables for power delivery. This entire structure was designed for 8086s and Motorola 68000s back when ISA ports were meant to hold a plug for your printer, a hard drive controller and a sound card. Laptops have moved on almost entirely from this format and there are plenty of manufacturers now building PCs on laptop hardware, Apple included.

          Maybe it’s time you start buying a graphics card with integrated shared memory and a slot to plug in a modular CPU instead. Maybe the GPU does its own power management and feeds power to the rest of the system instead of the other way around.

          I don’t know, I’m not a hardware engineer. I can tell the current way of doing things for desktop PCs is dumb now, though.

          • sp3ctr4l@lemmy.dbzer0.com · 3 days ago

            Erp, you posted as I was editing in an addendum, here’s my addendum.

            EDIT:

            It is still absolutely wild to me… just what the fuck is the point of this new real-time raytracing paradigm that necessitated frame upscaling… which in turn necessitates framegen… which (Sony has announced they are looking into this) may soon necessitate AI-assisted input that hallucinates what the player ‘probably wants to be doing’, to counteract the input lag from framegen?
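            On the framegen input-lag point, concretely: interpolation-style frame generation has to hold back at least one real frame before it can interpolate toward it, so latency stays pegged to the real frame rate. A rough, illustrative model (the numbers are assumptions for the example):

            ```python
            # Why framegen smooths the picture but not the feel: input latency
            # tracks REAL frames, and interpolation buffers one of them.
            def frame_time_ms(fps: float) -> float:
                return 1000.0 / fps

            real_fps = 20                # e.g. heavy path tracing settings
            shown_fps = real_fps * 4     # 4x framegen -> "80 fps" on screen

            base_latency = frame_time_ms(real_fps)  # 50 ms per real frame
            buffered = frame_time_ms(real_fps)      # plus one held-back real frame

            print(f"displayed: {shown_fps} fps, but input feels like "
                  f"~{base_latency + buffered:.0f} ms, worse than raw {real_fps} fps")
            ```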

            Cyberpunk 2077, years after release… oh, you want to actually run path tracing at 4K, maxed settings?

            A 4090 with 24 GB of VRAM gets you 20 (real) fps on average.

            What? Why… why are we even designing games with features that no hardware can actually run, no matter how much money you throw at that hardware?

            Apparently a 5090 can get up to 60ish (real) FPS…

            This is insane. CP77 is… the flagship debut and testbed of this new paradigm, and here we are, some 5 years later, and Nvidia is saying: OK, you can now finally do what we initially said you could do 5 years ago, just buy this… currently roughly $4,000 video card.

            … What?

            Ok, to reply to what you just said:

            Yeah, no notes, total agreement, I am now too angry to say anything more poignant.

            • MudMan@fedia.io · 3 days ago

              Yeah, see, on the features argument I’m gonna disagree, just to disrupt the lovefest.

              I LOVE new graphics features. I’ve been messing with raytracers and pathtracers since back when they were command-line DOS applications. Real-time path-traced visuals? Gimme. I don’t have a problem with alternate ways to feed pixels or frames, even. All of that is just a byproduct of all the tensor acceleration they are using for AI anyway; I’m just glad there’s a gaming application for it, too.

              If I’m going to question an unreasonably high technical paradigm, it’s going to be what we consider “gaming monitors” now. Who needs 4K at 240Hz? There are 500Hz displays out there now, which is a frame every 2 milliseconds. Back in the 1080p60 days that was your antialiasing budget.
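              The frame-budget arithmetic, for scale (the ~2ms antialiasing figure is a typical post-process budget from that era, an assumption on my part):

              ```python
              # A whole frame has to fit in 1000 / refresh-rate milliseconds.
              for hz in (60, 144, 240, 500):
                  print(f"{hz:>3} Hz -> {1000 / hz:5.2f} ms per frame")
              # At 500 Hz, the ENTIRE budget (2 ms) is roughly what a single
              # post-process antialiasing pass cost back in the 1080p60 era.
              ```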

              But I’m also willing to acknowledge that other people have different priorities and would rather maximize that for specific applications instead of visual fidelity.

              And that’s the real problem these days: there is no standard spec. Different people are taking different elements of rendering to absurd extremes. Crazy framerates, crazy resolutions, crazy real-time pathtracing, crazy scope, crazy fidelity, crazy assets, crazy low power draw on handhelds. You’re supposed to be servicing all of those needs at once, PLUS this is now the hardware that runs the last two tech gold rushes driving insane speculative investment.

              That is ultimately neither here nor there, but if we’re all going to have to accept that there will never be a fire-and-forget, max-everything-out user-level hardware spec ever again, the least manufacturers could do is prioritize the insane, wildly expensive new prosumer segment not catching on actual fire if you look at it sideways.

              • sp3ctr4l@lemmy.dbzer0.com · 3 days ago

                That is ultimately neither here nor there, but if we’re all going to have to accept that there will never be a fire-and-forget, max-everything-out user-level hardware spec ever again, the least manufacturers could do is prioritize the insane, wildly expensive new prosumer segment not catching on actual fire if you look at it sideways.

                See, this is what I see as the entire problem.

                Just like the US auto industry… and many others…

                Everything is shifting toward what used to be the high-end luxury market… that is now the new ‘normal’, only barely maintainable as ‘normal’ via exploding levels of personal and corporate balance-sheet debt, cratering credit scores, and rising systemic risk from corporate overleveraging.

                The result is that affordability is out the window, massively mismatched with the mainstay of your consumer base, and video games go back to the state they were in during roughly the 80s and 90s… where you basically needed to be pretty well off to experience the new hotness…

                But also, simultaneously: Affordable, basic options… don’t really exist anymore.

                The older used high-end stuff that still kinda works just price-matches the newer low-end stuff that is unreliable for other reasons (planned obsolescence)… until it dies of old age or the supply runs out, even on the used market.

                Gaming companies do literally everything they can to make it difficult or unappealing for you to just go play an old video game.

                MTX, live services, always-online games that just expire and then hey, go fuck yourself. Even if you wanted to host your own servers as a dedicated fanbase: fuck you, no network or server code for you, maybe you can spend a decade reverse-engineering it. Oh right, and all your microtransactions? Poof, gone. The game is unplayable, and all that shit you paid for evaporated too.

                There is going to be a giant rift between the few indie studios that manage to get enough funding to put together something fairly small-scale but actually novel… and a few massive AAAAA games that require fucking supercomputers, and a net worth in the tens of millions to afford to purchase and run them.

                No more middle.

                The gaming landscape is going to bifurcate along with the rest of the increasingly unstable wealth disparity divide in society generally.

                We are no longer in the ‘things generally get better and cheaper’ phase, we are in the corporate neofeudalism phase where you are either a serf or a master… but always, still a consumer.

                The fact that CP2077 is itself the inflection point of this transition is so ironic it literally causes me physical pain.

                … thank you for coming to my TEDtalkrant, lol.

                • MudMan@fedia.io · 3 days ago

                  That mostly tracks. I think the problem is less the availability of affordable options and more the willingness of the market to take those as a standard, though.

                  The affordable options are there. You can get a PS5 starting at 400 bucks (tariffs allowing). That’s a lower sticker price than a launch PS3 and on par with the inflation-adjusted price of a 360. It’s also cheaper than some comparably performant GPUs, let alone an entire PC.

                  Problem is, then you’re playing at some variation of upscaled 1080-1440p at 30 to 60 fps, and apparently the PC market thinks that’s for peasants and you should only ever play at hundreds of fps and many megapixels.

                  And yeah, the tech hasn’t made those specs available at the human tier, and instead the marketers have gotten really good at giving you FOMO for all the high-end features you could be getting.

                  There is a low end. I think the fact that a Steam Deck is ostensibly a full handheld PC starting at 420 bucks is absurd. Not gonna raytrace much on it, though.

                  Do I think games should all be made for Steam Decks and PS5s and not have any features that require beefier hardware? Well, given my point about loving visual features, I’m going to be a no, but I also think we need to get better at managing the FOMO as a group.

                  Or the hardware needs to find a new route to get us back on the Moore’s Law curve. Either/or.

                  • sp3ctr4l@lemmy.dbzer0.com · 3 days ago

                    Unfortunately, I have a bad wrist/hand, and cannot type much more without extreme pain.

                    That being said… wonderful conversation, truly, thank you for that.

                    As a parting comment, I will say:

                    MooresLawIsDead is a great YouTube channel. =D

                    So is uh, Threat Interactive.

                    They go very in depth into showing how… basically, game-engine and game optimization is largely dead; everyone is now just relying on the assumption that every end user has a supercomputer GPU, throwing raw power and input-lag-inducing framegen at the problem to smooth over a whole, whole lot of bad optimization and… unsustainable fundamental game design practices, which of course go along with the unsustainable fundamental hardware architectures.