Well I am shocked, SHOCKED I say! Well, not that shocked.

  • LostXOR@fedia.io · 13 days ago

    For the price of one 5090 you could build 2-3 midrange gaming PCs lol. It’s crazy that anyone would even consider buying it unless they’re rich or actually need it for something important.

    • Oniononon@sopuli.xyz · 13 days ago

      And still have your house burn down because it’s just a 2080 with 9.8 jigawatts pushed into it.

      There isn’t a single reason to get any of the 5 series IMO; they don’t offer anything. And I say that as a 3D artist for games.

      Edit: never mind, I remember some idiots got roped into 4K for gaming and are now paying the price like marketing wanted them to.

        • kattfisk@lemmy.dbzer0.com · 7 days ago

          4K is an outrageously high resolution.

          If I were conspiratorial, I would say that 4K was normalized as the next step above 1440p in order to create demand for many generations of new graphics cards, because it was introduced long before there was hardware able to use it without serious compromises. (I don’t actually think it’s a conspiracy though.)

          For comparison, 1440p has 78% more pixels than 1080p. That’s quite a jump in pixel density and required performance.

          4K has 125% more pixels than 1440p (300% more than 1080p). The step up is massive, and the additional performance required is as well.

          Now there is a resolution missing in between them: 3200x1800 is the natural next step above 1440p*. At 56% more pixels it would be a nice improvement, without an outrageous jump in performance. But it doesn’t exist outside of a few laptops, for some reason.

          *All these resolutions are multiples of 640x360. 720p is 2x, 1080p is 3x, 1440p is 4x, and 4K is 6x. 1800p is the missing 5x.
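
          For anyone who wants to double-check the pixel math, here’s a minimal sketch in Python (the helper name pct_more is my own):

          ```python
          # Pixel counts for each step in the 640x360 ladder described above:
          # 720p = 2x, 1080p = 3x, 1440p = 4x, 1800p = 5x, 4K = 6x.
          BASE_W, BASE_H = 640, 360
          STEPS = {"720p": 2, "1080p": 3, "1440p": 4, "1800p": 5, "4K": 6}

          pixels = {name: (BASE_W * n) * (BASE_H * n) for name, n in STEPS.items()}

          def pct_more(frm: str, to: str) -> float:
              """Percent more pixels `to` has compared to `frm`."""
              return (pixels[to] / pixels[frm] - 1) * 100

          print(f"1080p -> 1440p: +{pct_more('1080p', '1440p'):.0f}%")  # +78%
          print(f"1440p -> 4K:    +{pct_more('1440p', '4K'):.0f}%")     # +125%
          print(f"1080p -> 4K:    +{pct_more('1080p', '4K'):.0f}%")     # +300%
          print(f"1440p -> 1800p: +{pct_more('1440p', '1800p'):.0f}%")  # +56%
          ```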

        • Damage@feddit.it · 13 days ago

          Somehow 4K resolution got a bad rep in the computing world, with people opposing it for both play and productivity.

          “You can’t see the difference at 50cm away!” or something like that. Must be bad eyesight, I guess.

    • Lord Wiggle@lemmy.world · 13 days ago

      unless they’re rich or actually need it for something important

      Fucking youtubers and crypto miners.

      • Robust Mirror@aussie.zone · 12 days ago

        Anyone who preorders a digital game is a dummy. Preorders were created to ensure you got some of the limited physical stock.

    • overload@sopuli.xyz · 13 days ago

      Absolutely. Truly creative games are made by smaller dev teams that aren’t forcing ray tracing and lifelike graphics. The new Indiana Jones game isn’t a GPU seller, and it’s the only game I’ve personally had poor performance in with my 3070 Ti at 1440p.

  • Anomalocaris@lemm.ee · 13 days ago

    Plus, I have a 3060, and it’s still amazing.

    Don’t feel the need to upgrade at all.

  • EndlessNightmare@reddthat.com · 12 days ago

    I don’t buy every generation; I skip 1 if not 2. I have a 40xx series card and will probably wait until the 70xx (I’m assuming the series naming here) before upgrading.

  • bluesheep@lemm.ee · 13 days ago

    Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090

    Yeah no shit, what a weird fucking take

  • simple@piefed.social · 13 days ago

    Unfortunately, gamers aren’t the real target audience for new GPUs; it’s AI bros. Even if nobody buys a 4090/5090 for gaming, they’re always out of stock, as LLM enthusiasts and small companies use them for AI.

  • RadioFreeArabia@lemmy.world · 12 days ago

    I just looked up the price and I was like, “Yikes!” You can get a PS5 Pro + optional Blu-ray drive, a Steam Deck OLED, and a Nintendo Switch 2, and still have plenty of money left to spend on games.

  • gravitas_deficiency@sh.itjust.works · 12 days ago

    Nvidia doesn’t really care about the high-end gamer demographic nearly as much as they used to, because it’s no longer their bread and butter. Nvidia’s cash cow at this point is supplying hardware for ML data centers. It’s an order of magnitude more lucrative than serving the consumer + enthusiast market.

    So my next card is probably gonna be an RX 9070 XT.

    • ameancow@lemmy.world · 12 days ago

      Even the RX 9070 is running around $900 USD; I cannot fathom affording even state-of-the-art gaming from years ago at this point. I am still using a GTX 1660, playing games from years ago that I never got around to, and having a grand time. Most adults I know are in the same boat: either not even considering upgrading their PC, or playing their kid’s console games.

      Every year we say “gonna look into upgrading,” but every year prices go up and wages stay the same (or disappear entirely as private equity ravages the business world, digesting every company that isn’t also a private-equity predator), and the prices of just living and eating are insane. So at this rate, a lot of us might start reading again.

      • jacksilver@lemmy.world · 11 days ago

        It makes me wonder if this will bring more people back to consoles. The library may be more limited, but when a console costs less than just a GPU, it’ll be more tempting.

  • chunes@lemmy.world · 13 days ago

    I stopped maintaining an AAA-capable rig in 2016. I’ve been playing indies since and haven’t felt left out whatsoever.

    • MotoAsh@lemmy.world · 12 days ago

      Don’t worry, you haven’t missed anything. Sure, the games are prettier, but most of them are designed and written more poorly than 99% of indie titles…

      • JustEnoughDucks@feddit.nl · 12 days ago

        It’s funny, because often they aren’t prettier. Well-optimized, well-made games from 5 or even 10 years ago often look on par with or better than the majority of AAA slop pushed out now (obviously with exceptions for some really good-looking games like Space Marine and some others), and yet the disk size is still 10x what it was. They’re just unrefined and unoptimized, and they try to use computationally expensive filters, lighting, sharpening, and antialiasing to make up for the mediocre quality.

      • Honytawk@feddit.nl · 12 days ago

        The majority, sure, but there are some gems.

        Baldur’s Gate 3, Clair Obscur: Expedition 33, Doom Eternal, Elden Ring, God of War, … for example.

        You can always wait a couple of years before playing them, but saying they didn’t miss anything is a gross understatement.

  • arc99@lemmy.world · 12 days ago

    Not surprised. Many of these high-end GPUs are bought not for gaming but for bitcoin mining, and demand has driven prices beyond MSRP in some cases. Stupidly power hungry and overpriced.

    My GPU, an RTX 2060, is getting a little long in the tooth, and I’ll hand it off to one of the kids for their PC, but I need to find something that is a tangible performance improvement without costing eleventy stupid dollars. Nvidia seems to be lying a lot about the performance of the 5060, so I might look at AMD or Intel next time around. Probably need to replace my PSU while I’m at it.

    • DefederateLemmyMl@feddit.nl · 12 days ago

      bitcoin mining

      That’s a thing of the past; it’s not profitable anymore unless you use ASIC miners. Some people still GPU-mine on niche coins, but it’s nowhere near the scale it was during the bitcoin and ethereum craze a few years ago.

      AI is driving up prices. Or rather, it’s reducing availability, which then translates into higher prices.

      Another thing is that board manufacturers, distributors, and retailers have figured out that they can jack up GPU prices above MSRP and enough suckers will still buy them. They’ll sell less volume, but they’ll make more profit per unit.

    • Valmond@lemmy.world · 12 days ago

      My kid got the 2060; I bought an RX 6400. I don’t need the hairy arms anymore.

      Then again, I have become old and grumpy, playing old games.

      • WhatYouNeed@lemmy.world · 12 days ago

        Hell, I’m still rocking a GTX 950. It runs Left 4 Dead 2 and Team Fortress 2; what more do I need?

  • WereCat@lemmy.world · 12 days ago

    The progress is just not there.

    I got an RX 6800 XT for €400 in May 2023, when it was already an almost 3-year-old card. Fast-forward to today: the RX 9060 XT 16GB costs more and is still slower in raster. The only things going for it are FSR4, a better encoder, and a bit better RT performance, which I couldn’t care less about.

    • DefederateLemmyMl@feddit.nl · 12 days ago

      a bit better RT performance, which I couldn’t care less about

      Yeah, ray tracing is not really relevant on these cards; the performance hit is just too great.

      The RX 9070 XT is the first AMD GPU where you can consider turning it on.

  • candyman337@lemmy.world · 12 days ago

    It’s just that I’m not impressed: the raster performance bump at 1440p was just not worth the price jump at all. On top of that, they have manufacturing issues and issues with their stupid 12-pin connector? And all the shit on the business side, like not providing drivers to reviewers, etc. Fuuucccckk all that, man. I’m waiting until AMD gets a little better with ray tracing and then switching to team red.

  • JackbyDev@programming.dev · 12 days ago

    Uhhh, I went from a Radeon 1090 (or whatever they’re called; it’s an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Console gamers effectively do that as well. It’s normal to not buy a GPU every year.

  • Demognomicon@lemmy.world · 12 days ago

    I have a 4090. I don’t see any reason to pay $4K+ for fake frames and a few percent better performance. Maybe post-Trump, next gen, and/or if prices become reasonable and cables stop melting.

    • Critical_Thinker@lemm.ee · 12 days ago

      I don’t think the 5090 has been $4k in months, in terms of average sale price. $4k was basically March. $3k is pretty common now as a listed scalp price, and completed sales on fleabay commonly seem to be $2600-2800 now.

      The problem is that $2k was too much to begin with. It should be cheaper, but they are selling ML cards at such a markup, with truly endless demand at the moment, that there’s zero reason to put any focus at all on the gaming segment beyond a token offering that raises the margin for them. So business-wise they are doing great, I guess?

      As a 9070 XT and 6800 XT owner, it feels like AMD is practically done with the GPU market. It just sucks for everyone that the GPU monopoly is here, presumably to stay. It feels like backroom deals creating a noncompetitive landscape must be prevalent, and Nvidia’s total stranglehold via its artificial monopoly on code compatibility makes the hardware side irrelevant.

      • brucethemoose@lemmy.world · 12 days ago

        One issue is that everyone is supply-constrained by TSMC. Even Arc Battlemage is out of stock at MSRP.

        I bet Intel is kicking themselves for using TSMC. It kinda made sense when they decided years ago, but holy heck, they’d be swimming in market share if they used their own fabs instead (and kept the bigger die).

        I feel like another is… marketing?

        Like, many buyers just impulse buy, or go with what some shill recommended in a feed. Doesn’t matter how competitive anything is anymore.