• @[email protected]
    link
    fedilink
    English
    07 months ago

    That doesn’t make sense. I have the 8GB M2 and don’t have any issues with 20+ tabs, video calling, torrents, Luminar, Little Snitch, etc., open right now.

    • @[email protected]
      link
      fedilink
      English
      97 months ago

      15 tabs of Safari, which some would argue is the better browser thanks to its efficiency and its available privacy configuration options. What if you prefer Chrome or Firefox?

      I will argue in Apple’s defense that their stack includes very effective libraries that intrinsically make applications on macOS better in many regards, but 8GB is still 8GB, and an SoC isn’t upgradeable. The competition has far cheaper 16GB options, and Apple is back to looking like complete assholes again.

        • Adam · 2 points · 7 months ago

          The fact you got downvoted for someone else’s assumption (that was upvoted) makes me chuckle. There’s some serious Apple hating going on here*.

          *sometimes deserved. Not really in this case.

    • @[email protected]
      link
      fedilink
      English
      -137 months ago

      That’s because PC people try to equate specs across dissimilar architectures with an OS that is not written explicitly to utilize that architecture. They haven’t read enough about it or experienced it in practice to have an informed opinion. We can get downvoted together on our “substandard hardware” that works wonderfully. lol

      • @[email protected]
        link
        fedilink
        English
        87 months ago

        The only memory-utilization advantage gained by sharing memory between the CPU and GPU is zero-copy operations between the two. The occasional texture upload and framebuffer access are nowhere near enough to make 8 GiB the functional equivalent of 16 GiB.
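
        A minimal sketch of that zero-copy path using Metal on Apple silicon (the buffer size and contents here are arbitrary, purely for illustration):

        ```swift
        import Metal

        // On Apple silicon the CPU and GPU share one physical pool of memory,
        // so a .storageModeShared buffer can be written by the CPU and read by
        // the GPU with no copy or blit in between.
        guard let device = MTLCreateSystemDefaultDevice() else { fatalError("No Metal device") }

        let count = 1024
        let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                                       options: .storageModeShared)!

        // The CPU writes directly into the same allocation the GPU will read.
        let values = buffer.contents().bindMemory(to: Float.self, capacity: count)
        for i in 0..<count { values[i] = Float(i) }

        // A compute or render pass can now bind `buffer` directly; no upload pass
        // is needed. On a discrete GPU the same data would first be copied to VRAM.
        ```

        It saves the copy, not the capacity: the working set still has to fit in that single 8 GiB pool.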

        If you want to see something “written explicitly to utilize [a unified memory] architecture,” look no further than the Nintendo Switch. The operating system and applications are designed specifically for the hardware, and even first-party titles are choked by the hardware’s memory capacity and bandwidth.

        • @[email protected]
          link
          fedilink
          English
          -47 months ago

          The Tegra is similar in that it’s also an SoC, but it doesn’t have nearly as many dedicated, independent processing cores designed around specialized workloads.

          The M1 Pro has a 10-core CPU with 8 performance cores and 2 efficiency cores, a 16-core GPU, and a 16-core Neural Engine, all with 200GB/s of memory bandwidth.

          • @[email protected]
            link
            fedilink
            English
            67 months ago

            The M1 through M3 are still miles ahead of the Tegra; I don’t disagree. My point was that software designed specifically for a platform can’t make up for the platform’s shortcomings. The SoC itself is excellently designed to meet needs well into the future, but the 8 GiB of total system memory on the base model is unfortunately one of those shortcomings.

            Apple’s use of memory compression keeps it from being too noticeable, but it’s going to become a problem as application memory requirements grow (just as it did on the Switch).
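
            You can actually watch the compressor doing that work. A rough sketch using the Mach host-statistics API on macOS (field and constant names come from the Darwin headers; treat this as illustrative rather than production code):

            ```swift
            import Darwin

            // Ask the Mach host for 64-bit VM statistics, which include the number
            // of pages currently held by the memory compressor.
            var stats = vm_statistics64()
            var count = mach_msg_type_number_t(
                MemoryLayout<vm_statistics64_data_t>.size / MemoryLayout<integer_t>.size)

            let kr = withUnsafeMutablePointer(to: &stats) { ptr in
                ptr.withMemoryRebound(to: integer_t.self, capacity: Int(count)) { intPtr in
                    host_statistics64(mach_host_self(), HOST_VM_INFO64, intPtr, &count)
                }
            }

            if kr == KERN_SUCCESS {
                let pageSize = UInt64(vm_page_size)
                let compressedMiB = UInt64(stats.compressor_page_count) * pageSize / (1024 * 1024)
                // Under sustained memory pressure this can grow to multiple gigabytes.
                print("Pages held by compressor: \(stats.compressor_page_count) (~\(compressedMiB) MiB)")
            }
            ```

            Compression buys headroom at the cost of CPU time and latency; it stretches 8 GiB, it doesn’t replace the missing 8 GiB.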

            • @[email protected]
              link
              fedilink
              English
              -37 months ago

              Sure, but no one is saying 8GB is good enough for everyone. It’s a base model. Grandma can use it to check her Facebook and do online banking. It’s good for plenty of basic users. I have an M1 Mini with 8GB that I use as a home server. It works great, but I need my M2 MBP with 16GB of unified memory to use FCP, PS, and Logic Pro. With that, I can master 4K HDR in FCP from an unmastered source in Logic Pro without high memory pressure, let alone swap. There’s no way I’d have the same performance from a PC with 16GB of RAM in Adobe Premiere and Pro Tools. I’ve been there before.

              8GB really is a suitable low-end configuration, and most Mac users would agree. I’m not surprised a magazine dedicated to PC gaming hardware thinks otherwise.