The first salvo of RTX 50-series GPUs will arrive in January, with pricing starting at $549 for the RTX 5070 and topping out at an eye-watering $1,999 for the flagship RTX 5090. In between those are the $749 RTX 5070 Ti and $999 RTX 5080. Laptop variants of the desktop GPUs will follow in March, with pricing there starting at $1,299 for RTX 5070-equipped PCs.

  • Grandwolf319@sh.itjust.works · 13 hours ago

    From Google:

    The RTX 4090 was released as the first model of the series on October 12, 2022, launched for $1,599 US, and the 16GB RTX 4080 was released on November 16, 2022 for $1,199 US.

    So they dropped the 80 series in price by $200 while increasing the 5090 by $400.

    Pretty smart honestly. Those who have to have the best are willing to spend more and I’m happy the 80 series is more affordable.

  • ramble81@lemm.ee · 19 hours ago

    Just using this thread as a reminder that the new Intel Arc B580 is showing RTX 4060 performance for only $250.

  • SuperSpruce@lemmy.zip · 24 hours ago

    The prices are high, but what's really shocking is the power consumption. The 5090 is 575W(!!), while the 5080 is 360W, the 5070 Ti is 300W, and the 5070 is 250W.

    If you are getting one of these, factor in the cost of a better PSU and your electric bill too. We’re getting closer and closer to the limit of power from a US electrical socket.

    • pishadoot@sh.itjust.works · 8 hours ago

      A 1000W PSU pulls a max of about 8.3A on a 120V circuit.

      Residential circuits in the USA are 15-20A; very rarely are they 10A, though I've seen some super old ones or split 20A breakers in the wild.

      A single duplex outlet must be rated to the same amperage as the breaker in order to be up to code, so with a 5090 PC you're at around half the capacity of what you'd normally find, worst case. Nice big monitors take about an amp each, and other peripherals are negligible.

      You could easily pop a breaker if you’ve got a bunch of other stuff on the same circuit, but that’s true for anything.

      I think the power draw on a 5090 is crazy high, don't get me wrong, but let's be reasonable here: electricity costs, yes, but we're not getting close to the limits of a circuit/receptacle (yet).
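
      For anyone who wants to rerun this with their own numbers, here's a minimal sketch of the calculation. The 120V/15A circuit and ~90% PSU efficiency are assumptions, and the 300W for the rest of the system is an illustrative guess; only the 575W and ~1A-per-monitor figures come from the thread:

      ```python
      # Rough wall-outlet load for a 5090-class PC, using the figures quoted above.
      # The 120 V / 15 A circuit, ~90% PSU efficiency, and ~300 W for the rest of
      # the system are assumptions for illustration, not spec-sheet values.
      WALL_VOLTS = 120
      BREAKER_AMPS = 15
      CONTINUOUS_FACTOR = 0.8  # common rule: keep continuous loads under 80% of the breaker

      def amps_at_wall(dc_watts, psu_efficiency=0.90):
          """Current drawn from the outlet for a given DC load inside the PC."""
          return dc_watts / psu_efficiency / WALL_VOLTS

      system_watts = 575 + 300        # 5090 power figure plus a guess for CPU and the rest
      monitor_amps = 2 * 1.0          # two big monitors at ~1 A each, per the comment

      total = amps_at_wall(system_watts) + monitor_amps
      print(f"PC: {amps_at_wall(system_watts):.1f} A, total: {total:.1f} A "
            f"of a {BREAKER_AMPS * CONTINUOUS_FACTOR:.0f} A continuous budget")
      # -> PC: 8.1 A, total: 10.1 A of a 12 A continuous budget
      ```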

    • Jimmycakes@lemmy.world · 7 hours ago

      There’s gonna be as many tariffs as there were walls that got built and paid for by Mexico.

      Not because it’s bad for the American people.

      It’s because the same people in Congress who would impose tariffs are making hundreds of millions hand over fist on insider-trading stocks. They aren’t gonna fuck up the gravy train for Trump’s dumb ass campaign ramblings.

        • Jimmycakes@lemmy.world · 7 hours ago

          He was freewheeling YOLO the first term. This time he has masters he needs to obey. Elon, for one, ain’t gonna allow tariffs on any electronics he needs for AI or his cars. Nvidia should be safe.

    • Dasus@lemmy.world · 16 hours ago

      But you see, because of the tariffs, American gamers will just default to American GPUs, duh.

      • FrowingFostek@lemmy.world · 14 hours ago

        I’m not super informed about the developments, but won’t this be in the realm of possibility with the new Intel facilities being constructed?

        As a result of the Biden admin’s CHIPS and Science Act, is what I’m alluding to.

  • Professorozone@lemmy.world · 21 hours ago

    Why do people buy this stuff? It only takes like a year before it falls in price as the next one comes along. Gotta get that last 2 FPS, I guess.

    • sus@programming.dev · 18 hours ago

      It seems people realized this, and the old cards aren’t even properly falling in price anymore, even on the used market.

      • Professorozone@lemmy.world · 18 hours ago

        Yeah, I’m never willing to pay for the best. I usually build a new computer with second-best parts. With these prices, my next computer will be built with third-best stuff, I guess.

      • Skates@feddit.nl · 13 hours ago

        I don’t know a lot about computers, but I do know a fair amount about bussy.

        All Linux-related communities on lemmy.

    • Sixty@sh.itjust.works · 22 hours ago

      I’m staying on 1440p deliberately. My 3080 is still perfectly fine for a few more years, at least through the current console gen.

      • superkret@feddit.org · 13 hours ago

        I’ve ditched my gaming PC and am currently playing my favorite game (Kingdom Come Deliverance) on an old laptop. Which means I can’t go higher than 800x480.
        And honestly, the immersion works. After a couple minutes I don’t notice it anymore.

      • Subverb@lemmy.world · 21 hours ago

        You’re not wrong. I just recently upgraded my whole machine, going from a 3090 to a 4090 at 1440p, and I basically can’t tell the difference.

    • DarkCloud@lemmy.world · 1 day ago

      No one should; video graphics haven’t progressed that far. Only the lack of optimisation has.

      • John Richard@lemmy.world · 1 day ago

        You’re missing a major audience willing to pay $2k for these cards: people wanting to run large AI language models locally.

        • cm0002@lemmy.world · 1 day ago

          I’m willing, but unable :'(

          Someday I’ll be able to run something cool like that Deepseek v3 model or something. Probably when they figure out how to run them well in regular RAM, I have a shit ton of that at my disposal. Stupid VRAM. (Maybe they’ll start coming out with GPUs with slotted VRAM lol)

  • halcyoncmdr@lemmy.world · 1 day ago

    Welp, looks like I’ll start looking at AMD and Intel instead. Nvidia is pricing itself at a premium that’s impossible to actually justify compared to competitors.

    There will be people that buy it. Professionals that can actually use the hardware and can justify it via things like business tax benefits, and those with enough money to waste that it doesn’t matter.

    For everyone else, competitors are going to be much better options. Especially with Intel’s very fast progression into the dedicated card game with Arc and generational improvements.

      • ggtdbz@lemmy.dbzer0.com · 22 hours ago

        I’m here to represent the professionals ∩ idiots. We exist too.

        Although seeing those prices is reminding me that my mobile 3070 has been perfectly usable.

    • Aurenkin@sh.itjust.works · 1 day ago

      Some people don’t care about spending $2000 for whatever. I mean, I’m not one of those people but they probably exist.

      • pivot_root@lemmy.world · 1 day ago

        I’m probably one of those people. I don’t have kids, I don’t care much about fun things like vacations, fancy food, or yearly commodity electronics like phones or leased cars, and I’m lucky enough to not have any college debt left.

        A Scrooge McDuck vault of unused money isn’t going to do anything useful when I’m 6 feet underground, so I might as well spend a bit more (within reason*) on one of the few things that I do get enjoyment out of.

        * Specifically: doing research on what I want; waiting for high-end parts to go on sale; never buying marked-up AIB partner GPUs; and only actually upgrading things every 5~6 years after I’ve gotten good value out of my last frivolous purchase.

      • srecko@lemm.ee · 1 day ago

        My company could buy me this (for video editing), but I mostly need it for the VRAM, which should be cheap. I’d like to be able to afford it without it doubling the price of my PC.

  • MrGerrit@feddit.nl · 1 day ago

    I bought my 4080 Super recently and hope it lasts me a good 12+ years like my old card did. These prices are insane!

      • MrGerrit@feddit.nl · 21 hours ago

        My old card literally died, and I was forced to get a new one.

        Sleep tight, prince, my AMD R9 Fury X. You will be missed.

        • spookex@lemmy.world · 9 hours ago

          I’m still running a 1070 Ti; before that it was the R9 390 and R9 390X, though both of them died. One probably had its voltage regulator fail, and the other probably had its chip die, since it would boot Windows but die as soon as it had to do any work.

  • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 1 day ago

    My question is: will the 5080 perform half as fast as the 5090? Or is it going to be like the 4080 vs 4090 again, where the 4080 was like 80% of the price for 60% of the performance?
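
    To put that 40-series comparison in perf-per-dollar terms (a minimal sketch; the 80%/60% figures are the rough ones from this comment, not benchmark numbers):

    ```python
    # Relative value of a hypothetical "80% of the price, 60% of the performance" card.
    price_ratio = 0.80  # 4080 cost roughly 80% of a 4090 (per the comment)
    perf_ratio = 0.60   # ...for roughly 60% of the performance

    print(f"perf per dollar vs the 4090: {perf_ratio / price_ratio:.2f}x")
    # -> 0.75x, i.e. ~25% worse perf-per-dollar than the flagship
    ```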

    • Evil_Shrubbery@lemm.ee · 23 hours ago

      I think that at higher resolutions (4K) there’s gonna be a bit bigger difference than in the 40 series, because of the 256-bit vs 384-bit mem bussy in the 4080 vs 4090, compared to 256-bit vs 512-bit in the 5080 vs 5090.

      That memory throughput & bandwidth might not get such a big bump in the next gen or two.
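
      As a minimal sketch of why the bus width matters here: peak bandwidth scales with bus width times per-pin data rate. The bus widths are the ones above; the per-pin rates are assumptions for illustration, roughly GDDR6X-class vs GDDR7-class speeds:

      ```python
      # Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gbps.
      # Bus widths are from the comment; per-pin data rates are illustrative guesses.
      def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
          return bus_bits / 8 * gbps_per_pin

      cards = {
          "4080 (256-bit GDDR6X)": (256, 22.4),
          "4090 (384-bit GDDR6X)": (384, 21.0),
          "5080 (256-bit GDDR7)":  (256, 30.0),
          "5090 (512-bit GDDR7)":  (512, 28.0),
      }

      for name, (bits, rate) in cards.items():
          print(f"{name}: {bandwidth_gbs(bits, rate):.0f} GB/s")
      # With these numbers the 4090 has ~1.4x the 4080's bandwidth, while the
      # 5090 lands at ~1.9x the 5080's: a wider gap, as the comment suggests.
      ```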

  • Xtallll@lemmy.blahaj.zone · 1 day ago

    They claim the 5070 gives 4090 performance for $549. That lower end of the 50-series lineup looks nice.

    • ShinkanTrain@lemmy.ml · 1 day ago

      That “4090 performance” includes the new frame generation, where only a quarter of all frames are actually rendered. It’s borderline false advertising.
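
      To put a number on that (a minimal sketch; the 4x ratio is just what “only a quarter of all frames” implies):

      ```python
      # If only 1 in 4 displayed frames is actually rendered (3 are generated),
      # the underlying render rate is a quarter of the advertised frame rate.
      def rendered_fps(displayed_fps: float, frames_per_rendered: int = 4) -> float:
          return displayed_fps / frames_per_rendered

      print(rendered_fps(240))  # an advertised "240 fps" implies 60.0 truly rendered fps
      ```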

      • Jimmycakes@lemmy.world · 7 hours ago

        It’s called upcycling. If the frame is generated, then the kids have to mine fewer virgin frames from the mines. Before this breakthrough we were rendering so many frames the landfills were completely full.