• Avicenna@programming.dev · 33 points · 7 days ago

    lol let’s fuck all other computational sciences so that people can generate cats drinking whiskey in robes de chambre.

  • Gsus4@mander.xyz · +7 / −1 · 7 days ago

    Shit, looks like the crazy stock bubble is starting to dwarf and eat into the consumer market instead of just bursting…

  • COASTER1921@lemmy.ml · 4 points · 6 days ago

    All web browsers are nearly unusable with 4 GB of RAM lately. Even with desktop Linux I usually have nearly my full 8 GB used. With 8 GB AND Windows it’s only a matter of time before these computers become unusable…

  • mlg@lemmy.world · 3 points · edited · 6 days ago

    zswap and zram becoming highly critical software again, trying to shove your fat 16 GB of allocated application into 8 GB of RAM with room to spare.
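    For anyone curious why zram helps here, a back-of-the-envelope sketch (the numbers and the ~3:1 compression ratio are illustrative assumptions, not guarantees):

    ```shell
    # Rough effective-capacity math for zram swap (all numbers illustrative).
    # A zram device of size Z at compression ratio r occupies only about
    # Z/r of physical RAM, so total usable space is roughly:
    #   effective = RAM - Z/r + Z
    ram_gb=8     # physical RAM
    zram_gb=4    # size of the zram swap device
    ratio=3      # assumed compression ratio (~3:1 is a common zstd ballpark)

    effective=$(( ram_gb - zram_gb / ratio + zram_gb ))
    echo "~${effective} GB effective"

    # Actual setup (requires root) would look something like:
    #   modprobe zram
    #   zramctl --find --size 4G --algorithm zstd
    #   mkswap /dev/zram0 && swapon --priority 100 /dev/zram0
    ```

    With these numbers that works out to roughly 11 GB of addressable space on an 8 GB machine, at the cost of CPU time spent compressing.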

  • kerrigan778@lemmy.blahaj.zone · 1 point · edited · 6 days ago

    Lol, forget a browser, is that even enough to run Windows 11 well?

    (In before jokes about windows 11 ever running “well”)

  • JTskulk@lemmy.world · +234 / −1 · 8 days ago

    Do it. I know which OS will run fine on 8G of RAM and which one won’t.

  • tomkatt@lemmy.world · 101 points · 8 days ago

    Hah, guess they’re gonna have to run Linux. Windows 11 would choke on 8 GB RAM.

    • themachinestops@lemmy.dbzer0.com (OP) · +21 / −1 · edited · 8 days ago

      I agree. Funnily enough, they require a minimum of 4 GB, but I had problems with it even at 16 GB. 8 GB is nowhere near enough, especially if you use Excel, Teams, and that crap.

      • Buckshot@programming.dev · 13 points · 8 days ago

        I have to use Windows at work; I have 32 GB and regularly get browser tabs unloaded for low memory. I’m not running VMs or anything, usually just Firefox, Visual Studio, and Slack.

        My personal computer is Linux with 16 GB, and that’s more than enough.

        • Zannsolo@lemmy.world · 1 point · 7 days ago

          I have 64 GB: 32 for a VM and 32 for the host, and security software eats it all. I pity the poor bastards on the 8 GB work laptops. My project is funded separately from all the others, so we got to order our own laptops. Our previous laptops sucked at 32 GB total.

        • Jason2357@lemmy.ca · 4 points · edited · 8 days ago

          Same here. My modern 32 GB work machine is a slog: 2 minutes from wake to actually working, and sometimes 10 seconds just to use the Start menu. An older ThinkPad with 16 GB and Linux/COSMIC desktop wakes almost instantly and is perfectly snappy for most things.

      • yeehaw@lemmy.ca · 6 points · 8 days ago

        The most bloated apps, like Outlook and Teams, regularly use nearly a gig of RAM each in my experience. Brutal.

          • yeehaw@lemmy.ca · 2 points · 7 days ago

            Yeah, that depends on the content too, and what bullshit they load in the background. In this context, these days I look at tabs as apps lol

  • some_designer_dude@lemmy.world · 78 points · 8 days ago

    So RAM costs them more now, and they need to pass those costs on to customers. However, it seems like they’re also trying to redefine what “mid-range” means to us all, as if we aren’t fully aware of what computers are capable of and what amount of memory is good versus not. Making the various ranges cost more is intuitive. Enshittifying the ranges to sell them at the same price is just antithetical to the whole concept of the ranges…

    • some_designer_dude@lemmy.world · 29 points · 8 days ago

      It’s like if batteries got stupidly expensive and they tried to tell us that 200 km of range is what you get in a touring EV these days. But the distances between all the places haven’t changed…

    • Zephorah@discuss.online · +10 / −1 · edited · 8 days ago

      Isn’t the issue here the newest generation? I keep reading they’re way less tech-savvy than the rest of us. Combined with the propensity for young people to have an iPhone or Android and no PC, this junk will likely slide right by.

      • some_designer_dude@lemmy.world · 16 points · 8 days ago

        So our generation will be the first to have to teach both our boomer parents AND our millennial offspring what “RAM” is?!

        (╯°□°)╯︵ ┻━┻

        • Zephorah@discuss.online · 17 points · 8 days ago

          Gen X is mostly forgotten until tech support is needed, in either direction, yes.

          Millennials are fine, lol, and overlap with Gen X; this relates to the teens and early-20s age group.

  • Pika@sh.itjust.works · 76 points · edited · 8 days ago

    Maybe this will be a boon. The entire reason RAM requirements got so high is that software optimization was put on the back burner. Maybe a RAM shortage, where people can’t obtain the RAM they need, will force the big-name software devs to start being more frugal with it. (Talking to you, Chrome… which is currently using 2 gigs alone just trying to show a Twitch stream…)

    • relativestranger@feddit.nl · 1 point · 6 days ago

      that would require the software companies to actually spend money on competent developers instead of tossing peanuts at prompt writers.

    • MountingSuspicion@reddthat.com · 67 points · 8 days ago

      Or more realistically be used as an excuse for always online cloud based services a la office 365. “We would let you download the app, but most users don’t have the computing power so instead we’ll just make this a helpful subscription!”

        • MountingSuspicion@reddthat.com · 7 points · 8 days ago

          “Oh don’t worry, you won’t have to actually load spreadsheets anymore, just give our AI full access to your files and it will do whatever you ask :)”

          Ideally, you’re correct though and companies start investing in optimization. I don’t see it going that way, but a girl can dream.

      • tal@lemmy.today · 2 points · 8 days ago

        Honestly, it’ll be more efficient to have memory in a datacenter, in that datacenter hardware will see higher average capacity utilization, but it’s gonna drive up datacenter prices too.

        • Jason2357@lemmy.ca · 7 points · 8 days ago

          Not sure I agree. Centralizing storage, and especially memory, creates incredible round trip costs.

          • tal@lemmy.today · 1 point · edited · 8 days ago

            I mean, efficient in terms of memory utilization, like. Obviously there are gonna be associated costs and drawbacks with having remote compute.

            Just that if the world has only N GB of RAM, you can probably get more out of it on some system running a bunch of containers, where any inactive memory gets used by some other container.

        • MountingSuspicion@reddthat.com · 1 point · 8 days ago

          As a dirty commie, I agree, but unfortunately under capitalism it is just an avenue for exploitation. Large companies are deciding what we can or cannot have access to and setting the price for it in a manner completely divorced from what they’re offering.

        • BackgrndNoize@lemmy.world · 1 point · 8 days ago

          But imagine the latency and network bandwidth issues; there’s a reason most companies moved away from the huge central mainframe model to distributed computing.

    • kieron115@startrek.website · 27 points · edited · 8 days ago

      At least one studio, Larian, has confirmed this is the case for them.

      When discussing the pressures the company faces when releasing a game in early access, such as audience expectations, Vincke told us, “Interestingly, another [issue Larian is facing] is really the price of RAM and the price of SSDs and f**k, man. It’s like, literally, we’ve never had it like this.”

      He continued, “It kind of ruins all of your projections that you had about it because normally, you know the curves, and you can protect the hardware. It’s gonna be an interesting one. It means that most likely, we already need to do a lot of optimization work in early access that we didn’t necessarily want to do at that point in time. So it’s challenging, but it’s video games.”

      • Holytimes@sh.itjust.works · +5 / −3 · 8 days ago

        Good. Fuck studios just throwing optimization in the bin ’cause they can. They should actually do some problem-solving instead of brute-forcing everything.

          • MountingSuspicion@reddthat.com · +4 / −1 · 8 days ago

            Not the person you’re responding to, but “most likely, we already need to do a lot of optimization work in early access that we didn’t necessarily want to do at that point” indicates to me that optimization was not a top priority. It’s not unusual to optimize after a proof of concept or something, but I imagine in gaming (I don’t do game dev, admittedly) you don’t want that too late in the process.

            If they’re not planning on having it in early access, then their early consistent user base will be more worried about other things. If min spec is 8, then people with 4 won’t get it, or won’t complain about poor performance, because technically it’s their machine that’s the issue. Lack of complaints about that, plus feedback about other things, further shifts the priority away from optimization.

            Plus, anyone who’s worked in dev spaces, or on probably any kind of deliverable, knows that there are things that just don’t happen despite your best intentions. Things like optimization are the first to go in the dev space, so by openly admitting to putting it off, it does feel like an admission of “we were probably just not going to get around to it.” In my experience, the further out you plan to optimize, the more man-hours you end up wasting, so I don’t see a company investing heavily in that at any point, but doing so post early launch seems wasteful if they legitimately cared about it.

    • BackgrndNoize@lemmy.world · 5 points · 8 days ago

      In the world of AI vibe coding, I don’t think so; I think they’ll push people even more towards web apps.

  • Itdidnttrickledown@lemmy.world · 65 points · 8 days ago

    Large computing will exist solely in the cloud where you will pay a subscription for it. Can’t have these grubbing consumers buy anything we elites don’t get a monthly cut of.

    I wish this was sarcasm.

    • Corkyskog@sh.itjust.works · +9 / −7 · 8 days ago

      I am honestly not sure if that’s a bad thing aside from the capitalism of it all. Almost all tasks normal people do could be done with a 10 year old computer running Linux.

      I will find it hilarious if this RAM pricing issue causes people to move to Linux rather than put up with slow-ass Windows 11.