• hdnsmbt@lemmy.world · 10 months ago

    You realize it’s not devs that make those decisions, right? It’s publishers and execs. You know, the guys who make the actual money in all this. Stop blaming devs for stupid exec decisions.

        • MindSkipperBro12@lemmy.world · 10 months ago

          It’s an extreme example but the principle remains the same: The idea of someone’s responsibility when following questionable orders.

          • hdnsmbt@lemmy.world · 10 months ago

            No, it’s a fucking stupid comparison, man. One thing leads to dead people, the other thing leads to slightly less convenient entertainment software. Can you figure out the rest for yourself? Fuck all the way off with your “questionable orders”.

  • mavu@discuss.tchncs.de · 10 months ago

    I hate this conflation of “Developer” with every other role in modern game development.

    If you think the new Porsche looks shit, do you blame the mechanical engineer who designed the brake mechanism?

    If your new manga body pillow gives you a rash, do you blame the graphic designer of the manga?

    There is not a single thing listed in the meme above that is actually the fault of the developers working on the game. We don’t even need to talk about the first picture.

    Game size is a studio-management decision. They want to stuff as much (repetitive, boring) content into the game as possible. Plus a multiplayer mode no one asked for.

    Optimizations don’t happen because the CEO decides to book the game’s sales money this quarter, not next, and ships an unfinished product.

    Always online is ALWAYS a management decision.

    It’s a shit joke: it’s wrong because it blames the wrong people, and it’s also just dumb.

  • beefbot@lemmy.blahaj.zone · 10 months ago

    Problems with game developers might better be understood as problems with capitalism, to paraphrase Ted Chiang

    • beefbot@lemmy.blahaj.zone · 10 months ago

      We can’t update games or refactor code to make it smaller bc our bosses demand we constantly work harder, better, faster, stronger. They force us into games that require more expensive hardware bc the entire tech industry depends on people upgrading every other year. And it’s online constantly bc we hoover up player data for our new profit centre where we sell all your data.

      And now they made a meme that deflects blame off them and onto devs, who have way more contact w the public than anonymous rich people

    • ZILtoid1991@kbin.social · 10 months ago

      A lot of today’s indie devs are also… well…

      groomerwojak.jpg: “I groomed a teen fan of mine, and when she came forward I made her write an apology. Also, I spent my Patreon money on a sexdoll, and my code is spaghetti.”

      “We barely managed to make a functioning game with premade assets, and our popularity was so dependent on Pokémon underperforming that our fanbase became a toxic cesspool who can’t express their love for the game without actively dissing Nintendo.”

      “I’m a bigoted con artist who rebrands every time he gets busted for his crappy horror game.”

      “Optimization? We are already using low-poly assets!”

      “The assets in our pixel-art games are badly misaligned, and we use high-resolution fonts because no one makes bitmap fonts anymore.”

    • RandomStickman@kbin.social · 10 months ago

      That and it easily running on Linux, either natively or through Proton, is why I haven’t touched any AAA in like… at least 5 years? Maybe closer to 10.

  • TrickDacy@lemmy.world · 10 months ago

    Besides being a maintenance fucking nightmare, wouldn’t writing a game in assembly make it a lot harder to go cross-platform? I really don’t get that panel.

    • lunarul@lemmy.world · 10 months ago

      I kept scrolling for this comment. Writing in assembly means you can only write for one specific instruction set. The innovation of programming languages was not just making things easier to write; it was the compilation step, which could take the same code and produce machine-code output for different systems, making it much easier to support multiple platforms.

      • TrickDacy@lemmy.world · 10 months ago

        Yeah, exactly. Apparently they meant “most machines” as in “most machines that could run Windows”, like in a performance sense. Weird way to put it imo, since “most machines” to me would refer to platform concerns.

    • MimicJar@lemmy.world · 10 months ago

      Yes, yes it would. They meant to say that it would improve performance (if done well, which it was). That improved performance would allow it to run on a wide variety of devices, including those with low specs.

      Also, at the time, writing for x86 only would have been plenty portable. Even today that would cover “standard” PC architecture. (Although nowadays you probably want to put it on mobile devices, gaming consoles or macOS, so not ideal.)

      • TrickDacy@lemmy.world · 10 months ago

        Yeah, it being about performance makes sense. Still don’t know how that dude managed to write a full-ass game in assembly though. Takes a special brain to even be able to think that way.

    • Twinklebreeze @lemmy.world · 10 months ago

      Yes. Because older is always better. Then, when the present becomes the before times, people will look back fondly on it too.

    • twinnie@feddit.uk · 10 months ago

      Everyone seems to think that games like Doom and Half-Life came out all the time. I remember looking at shareware disks in shops and seeing loads of games that looked like total crap.

      • bazus1@lemmy.world · 10 months ago

        For sure! Just go to an abandonware site and try browsing a specific year. You have to wade through pages of garbo to find something worth playing.

  • Skullgrid@lemmy.world · 10 months ago

    Games back then: created by 1 to 4 people with autism because they wanted to have fun on a computer

    Games now: driven by dickheads who just left business school, at the whims of billionaire conglomerate funds.

    • mossy_@lemmy.world · 10 months ago

      I miss when games used to be good. Anyone 'member Vampire Survivors, Lethal Company, Bug Fables? Developers these days just can’t compare.

      • Skullgrid@lemmy.world · 10 months ago

        now that’s survivorship bias

        EDIT: here’s the fun thing: Lethal Company would have been a mod back in the day

        • Acters@lemmy.world · 10 months ago

          Tbf, games were easier to create using in-game functions and logic that were built for another game. Modding a whole rework was easier than making an entire game from scratch. Undeniably, Lethal Company is similar in look and feel, but it has better gameplay than some mods.

          • bob_lemon@feddit.de · 10 months ago

            Exactly. Creating a mod for Half-Life or similar titles was simply the easiest way to get a decent working 3D FPS engine without coding it yourself.

        • mossy_@lemmy.world · 10 months ago

          Is your point that developers today aren’t as good/benevolent/whatever as devs back in the day? I’m saying (sarcastically, I suppose) that the same type of developers exist today. What does survivorship bias have to do with it? Is my point moot because GMOD exists?

          • Skullgrid@lemmy.world · 10 months ago

            Your point is moot because there is an unending hose of indie games being created, and knowing that two gems exist doesn’t mean the rest of the cottage industry measures up to what was achieved earlier; nor does said indie scene have a similar rate of success to the old industry back then.

            • mossy_@lemmy.world · 10 months ago

              What are you, a shareholder? Why does the ‘rate of success’ matter? I didn’t list three games because there were only two gems.

              It’s like being at the library and saying “fantasy authors will never compete with what JK Rowling was writing, just look at how many books are here!”

  • Bloodyhog@lemmy.world · 10 months ago

    Ok, that got me. I still remember the days of the ZX and that funny noise… But I do have a question about one part of the meme: can someone explain to me why on Earth updates now weigh tens of gigs? I can accept that hi-res textures and other assets can take that space, but those are most likely not the bits being updated most of the time. Why don’t devs just take the code they actually updated and send that our way?

    • PsychedSy@sh.itjust.works · 10 months ago

      I’ve got 2gig fiber, not 56k dialup. It’s Steam’s bandwidth now. They paid Valve their 30%. Why bother with insane compression that just makes it feel slow for us?

      • Bloodyhog@lemmy.world · 10 months ago

        That is also a factor I do not understand: bandwidth costs the storefront money, so wouldn’t Steam and the others want to decrease that load? And well done you with that fiber, you dog! I also have a fiber line but see no reason to upgrade from my tariff (150 Mbit, I think?) that covers everything, just to shave off that one hour of download time a year.

    • MystikIncarnate@lemmy.ca · 10 months ago

      For modern games, from what I’ve seen, they’ve taken a more modular approach to how assets are stored. You’ll have large data files which are essentially full of compressed textures or something similar. Depending on how many textures you’re using and how many versions of each texture are available (for different detail levels), that can be a lot of assets, even if all the assets in a given file are, say, all wall textures.

      The problem is that the updaters/installers are not sophisticated enough to update a single texture inside a compressed dataset file, so the solution is to replace the entire dataset with one that contains the new information. While you’re only adding an item or changing how something looks, you’re sending not just that item but all the similar items (everything in the same set) again, even though 90% of it didn’t change. The files can easily reach into the tens of gigabytes because of how many assets are needed. Adding a map? The dataset file for all maps needs to be sent. Adding a weapon, or changing the look/feel/animation of a weapon? Here’s the entire weapon dataset again.

      Though not nearly as horrible, the same can be said for the libraries and executable binaries of the game logic. A variable was added? Well, here’s the entire binary file with the change (not just the change). Binaries tend to be a lot smaller than the assets, so it’s less problematic.

      The entirety of the game content is likely stored in a handful (maybe a few dozen at most) of dataset files, so if any one of them changes for any reason, end users need to download 5-10% of the installed size of the game to get the update.

      Is there a better way? Probably, but it may be too complex to be worth it. Basically: write a small patching program that unpacks the dataset, replaces/inserts the new assets, then repacks it. That would reduce the download size, but it increases the amount of work the end user’s system has to do for the update, which may or may not be viable depending on the systems you’ve made the game for. PC games should support it, but what happens when you’re shipping across PC, Xbox, PlayStation, and Nintendo Switch? Do those consoles give your game the read/write access to storage it needs to do the unpacking and repacking? Do they have the space for it?
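
      As a rough sketch of that idea, assuming the dataset were a plain zip archive (real engines use their own container formats, and every file name below is made up):

      ```python
      import os
      import zipfile

      def patch_dataset(dataset_path: str, entry_name: str, new_bytes: bytes) -> None:
          """Rewrite one entry of the archive, recopying every other entry unchanged."""
          tmp_path = dataset_path + ".tmp"
          with zipfile.ZipFile(dataset_path) as src, \
               zipfile.ZipFile(tmp_path, "w", zipfile.ZIP_DEFLATED) as dst:
              for item in src.infolist():
                  if item.filename == entry_name:
                      dst.writestr(item.filename, new_bytes)  # the one patched asset
                  else:
                      dst.writestr(item, src.read(item))      # untouched assets, rebuilt locally
          os.replace(tmp_path, dataset_path)  # swap the rebuilt archive over the old one

      # Hypothetical usage: the update ships only the new texture bytes,
      # and the client splices them into the existing dataset.
      # patch_dataset("textures_walls.pak", "wall_brick_07.dds", new_texture_bytes)
      ```

      The download shrinks to one asset, but the client now pays the CPU cost (and needs the scratch space) to rebuild the whole archive, which is exactly the console problem described above.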

      It becomes a risk. Done the way it is now, if you have enough room to download the update, no extra space is needed, since the update manager simply copies the updated dataset wholesale over the old one.

      It’s a game of choices and variables, risks and rewards. Developers definitely don’t want to get into the business of custom update mechanisms per platform, so you have to find a solution that works for everyone who might be running the game. The current approach wastes bandwidth, but it has the merit of being cross-compatible and consistent: the process is the same on every platform.

      • Bloodyhog@lemmy.world · 10 months ago

        The console argument does actually make a lot of sense to me, thank you for the detailed response. It would still (seemingly) be possible to structure the project in a way that allows replacing only what actually needs replacing, but that requires more investment in the architecture and would likely cause more errors due to the added complexity. Still, I cannot forgive the BG3 coders for making me redownload those 120 GB or so! )

        • MystikIncarnate@lemmy.ca · 10 months ago

          The issue is the compression. There are hundreds of individual assets, and the process to compress, or more accurately to decompress, the assets for use takes processor resources. Usually it only really needs to be done a few times, when the game starts and loads the assets required. Basically, when you get to a loading screen, the game is unpacking the relevant assets from those dataset files. Every time the game opens one of those datasets, it takes time to establish access to the dataset file on the host system, then unpack the index of the dataset, and finally go and retrieve the assets needed.

          Two things about this process: first, securing access to the file and getting the index is fairly slow. Allocating anything takes significant time (relative to the other steps in the process) and accomplishes nothing except preparing to load the relevant assets; it’s basically just wasted time. Second, compressed files are most efficient at making the total size smaller when there’s more data in the file.

          Very basically, the simplest common compression, zip (aka “compressed folders” in Windows), looks through the files for repeating sections of data, then replaces all that repeated content with references to the original data. A reference is much smaller than the data it replaces. This can also be thought of as de-duplication. In this way, if you had a set of files that all contained mostly the same data, say text files full of the same repeating messages, the resulting compression would be very high (a much smaller size). This method is used for things like log files, since there are many repeating dates, times, and messages with only small unique variances from line to line. This is an extremely basic description of one very common style of compression; it’s certainly not the only way, and not necessarily the method (or the only method) being used here.
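
          As a toy demonstration of that point, using Python’s built-in zlib (the log line is invented):

          ```python
          import os
          import zlib

          # Repetitive data, like log files, compresses extremely well...
          repetitive = b"2024-01-01 12:00:00 INFO request handled OK\n" * 1000
          # ...while unique (here: random) data barely compresses at all.
          random_ish = os.urandom(len(repetitive))

          print(len(repetitive), "->", len(zlib.compress(repetitive)))  # ~44 KB -> a few hundred bytes
          print(len(random_ish), "->", len(zlib.compress(random_ish)))  # stays roughly the same size
          ```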

          If there’s less content per compressed dataset file, there are fewer opportunities for the compression to make the content smaller, so large sets of similar data are preferable to smaller files containing more diverse data.

          This, combined with the relatively long open time per file, means that programmers want as few dataset files as possible, both to keep the system from having to open many files to retrieve the required data during load times, and to keep the compression of those files at optimal levels.

          If, for example, many smaller files were used, then yes, updates would be smaller. However, loading times could end up being doubled or tripled. Given that you will, in theory, be loading data many times over (every time you load into a game or a map or something) compared to how frequently you perform updates, the right choice is to have updates take longer, with more data to download, so that your intra-session loads are much faster.
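
          A crude way to feel that per-file overhead yourself; the exact numbers vary wildly by OS and disk, so this is only a sketch:

          ```python
          import os
          import tempfile
          import time

          CHUNK = os.urandom(64 * 1024)  # 64 KiB payload
          COUNT = 1024                   # 64 MiB total either way

          with tempfile.TemporaryDirectory() as d:
              # Store the same data once as a single big file, once as many small files.
              big = os.path.join(d, "one_big.pak")
              with open(big, "wb") as f:
                  for _ in range(COUNT):
                      f.write(CHUNK)
              for i in range(COUNT):
                  with open(os.path.join(d, f"small_{i}.pak"), "wb") as f:
                      f.write(CHUNK)

              t0 = time.perf_counter()
              with open(big, "rb") as f:
                  f.read()
              t1 = time.perf_counter()
              for i in range(COUNT):
                  with open(os.path.join(d, f"small_{i}.pak"), "rb") as f:
                      f.read()
              t2 = time.perf_counter()

          print(f"one big file:     {t1 - t0:.4f}s")
          print(f"many small files: {t2 - t1:.4f}s")
          ```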

          With the integration of solid-state storage in most modern systems, loading times have been dramatically reduced thanks to the sheer speed at which files can be locked, opened, and streamed into working memory, but it’s still a trade-off that needs to be taken into account. This is especially true for PC releases, since PCs can have wildly different hardware and not everyone is using SSDs or similarly fast flash storage; some are on older systems, or simply prefer the cheaper space of spinning-platter hard disks.

          All of this must be counterbalanced to provide the best possible experience for the end user, and I assure you that all aspects of this process are heavily scrutinized by the people who design the game. Often these decisions are made early on, so that the rest of the loading system can be designed around them consistently and doesn’t need to be reworked partway through the lifecycle of the game. It’s very likely that even as systems and standards change, the loading system in the game will not; so if the game was designed with optimizations for hard disks (not SSDs) in mind, that won’t change until at least the next major release in that game’s franchise.

          What isn’t really excusable is when the next game in a franchise gets a large overhaul and the old loading system (with all of its obsolete optimizations) is reused for the more modern title, which I’m certain happens at most AAA studios. They reuse a lot of the existing systems and code to reduce how much work is required to go from concept to release, and hopefully to shorten the time and effort needed to launch. Such systems should be under scrutiny at all times whenever possible, to further streamline the process and optimize it for the majority of players. If that means the outlier customers trying to play the latest game on a WD Green spinning disk have a worse time because they haven’t bought an SSD, while the 90%+ who have at least a SATA SSD benefit from the newer loading system, then so be it.

          But I’m starting to cross over into my opinions a bit more than I intended, so I’ll stop there. I hope that helps make sense of what’s happening and why such decisions are made. As always, if anyone reads this and knows more than I do, please speak up and correct me. I’m just some guy on the internet, and I’m not perfect. I don’t make games and I’m not a developer; I am a systems administrator, so I see these issues constantly. I know how the subsystems work and I have a deep understanding of the underlying technology, but I haven’t done any serious coding work in a long, long time. I may be wrong or inaccurate on a few points, and I welcome any corrections anyone can share.

          Have a good day.

  • I Cast Fist@programming.dev · 10 months ago

    For those who are unaware, the second chad is most likely referring to .kkrieger. Not a full game, but a demo (from the demoscene) whose goal was to make a fully playable game within a maximum size of 96 KB. Even going very slowly, you won’t need more than 5 minutes to finish it.

    The startup is very CPU-heavy and takes a while, even on modern systems, because it generates all the geometry, textures, lighting and whatnot from procedural routines stored in the binary.
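
    The core trick, in toy form: ship a formula instead of pixels. The classic XOR pattern below is nothing like .kkrieger’s actual generators, but it shows how a few bytes of code can stand in for kilobytes of stored texture data:

    ```python
    def generate_texture(width: int = 256, height: int = 256) -> bytes:
        """Derive a 64 KiB grayscale texture from a one-line formula at load time."""
        return bytes((x ^ y) & 0xFF for y in range(height) for x in range(width))

    texture = generate_texture()
    print(f"{len(texture)} bytes of texture data, generated rather than stored")
    ```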

  • chicken@lemmy.dbzer0.com · 10 months ago

    Very rose-tinted glasses. I remember horrifying cache-corruption bugs that locked you out of certain game areas permanently on that save, random illegal-operation exceptions crashing games (no autosave, btw), the whole system regularly freezing and needing to be completely restarted, and games just inexplicably not working in the first place because of some hardware incompatibility. The internet sucked for finding fixes back then, and patches weren’t a thing, so you were just screwed.

    I would say that games no longer all being written in C and assembly, trying to squeeze out every possible bit of performance with nothing but dev machismo as a safeguard, is in fact a good thing.