OK, maybe you wouldn’t pay three grand for a Project DIGITS PC. But what about a $1,000 Blackwell PC from Acer, Asus, or Lenovo?


Besides, why not use native Linux as the primary operating system on this new chip family? Linux, after all, already runs on the Grace Blackwell Superchip. Windows doesn’t. It’s that simple.

Nowadays, Linux runs well on Nvidia chips. Recent benchmarks show that the open-source Linux graphics drivers perform about as well on Nvidia GPUs as Nvidia's proprietary drivers.

Even Linus Torvalds thinks Nvidia has gotten its open-source and Linux act together. In August 2023, Torvalds said, “Nvidia got much more involved in the kernel. Nvidia went from being on my list of companies who are not good to my list of companies who are doing really good work.”

  • @[email protected]
    link
    fedilink
    English
    818 hours ago

    Don’t get too excited – if this goes like the last few NVidia hardware launches, it will:

    • cost too much
    • run a non-mainline kernel
    • lose NVidia support after 3 months

    Go talk to all the Jetson owners out there and see how happy they are with NVidia Linux boxes. I’ll believe it when I see it (and when it’s supported for longer than a quarter).

    • qaz

      I don’t care why they got their shit together, I’m happy as long as they fix the open source drivers.

  • I Cast Fist

    Linux, after all, already runs on the Grace Blackwell Superchip. Windows doesn’t.

    And why is that?

    Project DIGITS features the new NVIDIA GB10 Grace Blackwell Superchip, offering a petaflop of AI computing performance for prototyping, fine-tuning and running large AI models.

    With the Grace Blackwell architecture, enterprises and researchers can prototype, fine-tune and test models on local Project DIGITS systems running Linux-based NVIDIA DGX OS, and then deploy them seamlessly on NVIDIA DGX Cloud™, accelerated cloud instances or data center infrastructure.

    Oh, because it’s not a fucking consumer product. It’s for enterprises that need a cheap supercomputer.

  • @[email protected]
    link
    fedilink
    English
    7
    edit-2
    1 day ago

    Or you can just buy any random potato computer (or assemble it yourself from stuff you found) and still run Linux on it.

  • @[email protected]
    link
    fedilink
    English
    41 day ago

    Haven’t they been making things like the Jetson AGX for years? I guess this is an announcement of the next generation.

    • @[email protected]
      link
      fedilink
      English
      -1
      edit-2
      1 day ago

      It’s a pile of shit compared to any other SBC. It’s difficult to develop for or run anything on because it has an ARM chip.

      • @[email protected]
        link
        fedilink
        English
        41 day ago

        But ARM is the most deployed microprocessor in the world? I’d much rather write ARM assembly than Intel or PowerPC. For higher-level languages, ARM has good compiler support. Can you explain why you don’t like ARM? I’m genuinely curious, because it is probably my favorite development environment (I mostly write embedded systems software).

        • @[email protected]
          link
          fedilink
          English
          112 hours ago

          It is a horrible ecosystem that could very well end the era of the personal computer. I type this on an ARM device which I can never be root on. ARM has been the biggest rollback in user freedom since Windows 10.

        • @[email protected]
          link
          fedilink
          English
          -11 day ago

          Linux packages don’t work on it unless they’re custom compiled, and the OS is supplied by Nvidia unless you make or compile your own, so support for these will be abandoned when the next one comes out. Minimal performance for the price, in exchange for lower power consumption. Really only useful for image recognition for OEMs – automated factory quality inspection, robotics, etc. – where Internet access is limited.

          • @[email protected]
            link
            fedilink
            English
            11 day ago

            The AGX that I use has Ubuntu 22.04 LTS. I have been able to update it with apt. For us, it has been a good environment for CUDA. We run a Rust application that uses C++ CUDA image processing on the back end. Sorry people are downvoting you.

  • @[email protected]
    link
    fedilink
    English
    4
    edit-2
    1 day ago

    I’m planning on getting a new PC soon. I was planning on avoiding Nvidia because I had read it might be more difficult to get drivers. Does this mean they are going to improve things in general, or just for the newest and likely most expensive stuff? I don’t want to buy the newest possible GPU, since they always have a bloated price for being new, and a bit older ones are likely decent enough too.

    • @[email protected]
      link
      fedilink
      English
      216 hours ago

      Nvidia drivers on Linux are messy and have been for a long time. It took them ages to fix Vsync in Wayland. If you want to run Linux, go AMD (or Intel).

      • @[email protected]
        link
        fedilink
        English
        28 hours ago

        I was planning on getting an AMD GPU. Are there any other components that might have similar issues? I want to build this PC specifically for Linux.

        • @[email protected]
          link
          fedilink
          English
          28 hours ago

          Not really, everything else should just work. At least if you don’t plan to buy an obscure USB sound card or something like that 😄

          Have fun!

    • @[email protected]
      link
      fedilink
      English
      01 day ago

      Modern Nvidia GPUs work great, like the GTX 900 series and newer.

      The main problem is Nvidia’s legacy cards, where Nvidia isn’t updating its proprietary drivers and isn’t making them open source. That leaves newer kernels with nouveau, which has fewer features and uses more power, but is Wayland compatible.

  • XNX

    Can’t load the article. Does it mention whether these will be ARM computers?

  • Chemical Wonka

    Don’t forget those who made it happen. Nvidia was “forced” to integrate Linux into its ecosystem

    Nvidia has always been hostile to the Linux community or negligent to say the least


    • @[email protected]
      link
      fedilink
      English
      182 days ago

      Nvidia was “forced” to integrate Linux into its ecosystem

      100% bullcrap.

      Nvidia’s servers for data processing have always run Linux. And you know what those servers run? It’s not Windows, that’s for sure. So why would they write multiple versions of a driver for the same hardware interface? Their servers use the same drivers that you would use for gaming on a Linux desktop system.

      In fact, no version of Windows is supported on their DGX servers, and AFAIK you can’t even install Windows on it (even if you managed, it wouldn’t be usable).

      Long story short, a vendor we were working with (about 6 or 7 years ago now), was working on their Linux version of their SDK. We wanted to do some preliminary testing on Nvidia’s new T4s that at this point were only available via Nvidia’s testing datacenter (which we had access to).

      During a call with some of the Nvidia engineers I had to ask the awkward question of “any chance there’s a Windows server we can test on?”. I knew it was a cringe question and I died a little during the 10 second silence until one of the Nvidia guys finally replied with “no one uses Windows for this stuff”. And he said it slowly like the reply to such a question needed to go slow to be understood, because who else would ask that question unless you’re slow in the head?

      Nvidia has always been hostile to the Linux community or negligent to say the least

      People say “hostile”, but I think a better word is arrogant. They wanted to force the industry to use their own implementations they owned or pioneered like egl-stream instead of open standards. But AMD and Intel have proven that open source graphics drivers not only work, but benefit from being open so that the community can scratch their own itches and fix issues faster.

      • @[email protected]
        link
        fedilink
        English
        21 day ago

        Yep, Nvidia has never been hostile towards Linux; they benefit from supporting it. They just don’t care to support the desktop that much, and frankly neither do AMD or Intel. They often take an extremely long time to fix simple bugs that only affect desktop usage. Fortunately, in their case, the drivers can be fixed by other open source contributors.

    • Cris

      Man, I completely forgot about that. That’s honestly wild to think about in retrospect…

      • @[email protected]
        link
        fedilink
        English
        132 days ago

        It’s not. It had nothing to do with it. Nvidia was all in with Linux as soon as they realized their hardware could be used for data processing and AI. That realization was way more than a decade ago.

        • @[email protected]
          link
          fedilink
          English
          3
          edit-2
          1 day ago

          Their drivers were good for AI and compute way before the leak.

          But suddenly their desktop drivers are open, after hackers leaked their desktop driver code? Mmmmmmm…

          • @[email protected]
            link
            fedilink
            English
            51 day ago

            They only open sourced the kernel drivers, which just makes sense for them to do. Userspace drivers, which these attackers wanted to be open, are still very much closed. Likely had nothing to do with it.

            • @[email protected]
              link
              fedilink
              English
              1
              edit-2
              3 hours ago

              There is a userspace driver being worked into nouveau as we speak that is supposed to replace the closed userspace driver currently used in the official setup.

              Turns out the kernel driver plus the documentation they published was what we needed.

              • @[email protected]
                link
                fedilink
                English
                13 hours ago

                That’s true, but it’s still not Nvidia making that, so it’s a bit of a different thing. It will never support certain things like CUDA. It is really cool though, and would’ve never happened without the open kernel modules.

    • projectmoon

      Don’t know about “always.” In recent years, like the past 10 years, definitely. But I remember a time when Nvidia was the only reasonable recommendation for a graphics card on Linux, because Radeon was so bad. This was before Wayland, and probably even before AMD bought ATI. And it was certainly long before the amdgpu drivers existed.

      • Dark Arc

        Yeah it was before AMD did graphics.

        ATI had an atrocious closed source driver. I used it … but it was not good at much of anything.

        • @[email protected]
          link
          fedilink
          English
          7
          edit-2
          2 days ago

          I had an ATI card in my PC when the Pentium 4 was all the rage. I literally spent my teenage years learning English in order to get the dumb games I saved up ages for to work without crashing constantly. It’s shocking how the same terrible card manufacturer is part of the company that now makes the only CPU worth a damn and great GPUs.

      • @[email protected]
        link
        fedilink
        English
        23 days ago

        Nvidia is still rather nice with FreeBSD, because their official proprietary driver there is, well, fully official, while drivers ported from Linux somewhat lag behind and have problems sometimes.

  • @[email protected]
    link
    fedilink
    English
    813 days ago

    Honestly, I’ve found that my compute needs have been surpassed quite a while ago, and so I could easily get away with buying a $300 computer.

    • Snot Flickerman (OP)

      Honestly, for real, a lot of low-power PCs are really useful once they have crap like Windows off of them and a lightweight Linux distro on them.

      • @[email protected]
        link
        fedilink
        English
        373 days ago

        Exactly. Get yourself a somewhat low-end PC, wipe windows, and install Linux Mint, and you’re pretty much golden.

        • @[email protected]
          link
          fedilink
          English
          123 days ago

          Did exactly this with an old laptop and use it mainly for TV and occasional browsing when staying at our hut/cottage. Still a bit slow, but it works.

      • @[email protected]
        link
        fedilink
        English
        133 days ago

        I’ve found my preferences have been creeping up in price again, but only because I’ve found I want an actually physically lightweight laptop, and those have been getting more available, linux-able and capable.

        I only need a few hundred dollars’ worth of computer, and anything more can live on a rack somewhere. I’ll pay more than that for my computer to be light enough that I don’t need to think about it.

    • @[email protected]
      link
      fedilink
      English
      20
      edit-2
      3 days ago

      Up until the early 2000s, serial computation speed doubled about every 18 months, which meant that virtually all software simply ran twice as quickly after every 18 months of CPU advances. And since taking advantage of that was trivial, new software releases did: they traded CPU cycles for shorter development time or more functionality, and demanded current hardware to run at a reasonable clip.

      In that environment, it was quite important to upgrade the CPU.

      But that hasn’t been happening for about twenty years now. Serial computation speed still increases, but not nearly as quickly any more.

      This is about ten years old now:

      https://preshing.com/20120208/a-look-back-at-single-threaded-cpu-performance/

      Throughout the 80’s and 90’s, CPUs were able to run virtually any kind of software twice as fast every 18-20 months. The rate of change was incredible. Your 486SX-16 was almost obsolete by the time you got it through the door. But eventually, at some point in the mid-2000’s, progress slowed down considerably for single-threaded software – which was most software.

      Perhaps the turning point came in May 2004, when Intel canceled its latest single-core development effort to focus on multicore designs. Later that year, Herb Sutter wrote his now-famous article, The Free Lunch Is Over. Not all software will run remarkably faster year-over-year anymore, he warned us. Concurrent software would continue its meteoric rise, but single-threaded software was about to get left in the dust.

      If you’re willing to trust this line, it seems that in the eight years since January 2004, mainstream performance has increased by a factor of about 4.6x, which works out to 21% per year. Compare that to the 28x increase between 1996 and 2004! Things have really slowed down.

      We can also look at about the twelve years since then, which is even slower:

      https://www.cpubenchmark.net/compare/2026vs6296/Intel-i7-4960X-vs-Intel-Ultra-9-285K

      This is using a benchmark to compare the single-threaded performance of the i7 4960X (Intel’s high-end processor back at the start of 2013) to that of the Intel Ultra 9 285K, the current one. In those ~12 years, the latest processor has managed to get single-threaded performance about (5068/2070)=~2.448 times the 12-year-old processor. That’s (5068/2070)^(1/12)=1.07747, about a 7.7% performance improvement per year. The age of a processor doesn’t matter nearly as much in that environment.
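      The arithmetic in that comparison is easy to sanity-check. A small sketch (the benchmark scores are the single-thread numbers quoted above; the function name is mine, not from any benchmark tool):

```python
# Annualized growth factor implied by two benchmark scores N years apart.
def annualized_growth(old_score: float, new_score: float, years: float) -> float:
    """Per-year growth factor: (new/old) ** (1/years)."""
    return (new_score / old_score) ** (1.0 / years)

# Single-thread scores quoted above: i7-4960X (2013) vs. Ultra 9 285K (~2025).
ratio = 5068 / 2070                       # total improvement, ~2.45x over ~12 years
rate = annualized_growth(2070, 5068, 12)  # ~1.077, i.e. ~7.7% per year

# For comparison, the 2004-2012 figure from the quoted Preshing article:
rate_2004_2012 = annualized_growth(1.0, 4.6, 8)  # ~1.21, i.e. ~21% per year

print(f"total: {ratio:.3f}x, per year: {(rate - 1) * 100:.1f}%")
```

      Same formula, both eras: the per-year rate is just the total ratio raised to 1/years.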

      We still have had significant parallel computation increases. GPUs in particular have gotten considerably more powerful. But unlike with serial compute, parallel compute isn’t a “free” performance improvement – software needs to be rewritten to take advantage of that, it’s often hard to parallelize solving problems, and some problems cannot be solved in parallel.
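      The point that parallel compute isn’t a “free” improvement is usually quantified with Amdahl’s law: the serial fraction of a program caps the total speedup no matter how many cores you add. A minimal sketch (my own illustration, not from the comment):

```python
# Amdahl's law: with parallelizable fraction p and n workers,
# speedup = 1 / ((1 - p) + p / n). The serial part (1 - p) caps the gain.
def amdahl_speedup(p: float, n: int) -> float:
    """Upper bound on speedup for parallel fraction p on n workers."""
    return 1.0 / ((1.0 - p) + p / n)

# A program that is 90% parallelizable gains about 6.4x on 16 cores,
# and can never exceed 10x (= 1 / 0.1) even with unlimited cores.
print(amdahl_speedup(0.9, 16))
print(amdahl_speedup(0.9, 10**9))
```

      This is why rewriting for parallelism only pays off when the serial fraction is small.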

      Honestly, I’d say that the most-noticeable shift is away from rotational drives to SSDs – there are tasks for which SSDs can greatly outperform rotational drives.

      • @[email protected]
        link
        fedilink
        English
        12 days ago

        My line for computational adequacy was crossed with the Core2Duo. Any chip since has been fine for everyday administration or household use, and they are still fine running Linux.

        Any Apple silicon including the M1 is now adequate even for high end production, setting a new low bar, and a new watershed.

      • @[email protected]
        link
        fedilink
        English
        33 days ago

        You know, that would explain a lot because I had no idea that there was an authentication pin and that’s total bullshit.

    • @[email protected]
      link
      fedilink
      English
      10
      edit-2
      3 days ago

      I bought a former office HP EliteDesk 800 G2 16GB for $120 on eBay or Amazon (can’t recall) 2 years ago with the intention of it just being my server. I ended up not unhooking the monitor and leaving it on my desk since it’s plenty fast for my needs. No massive PC gaming rig but it plays Steam indie titles and even 3D modeling and slicing apps at full speed. I just haven’t needed to get anything else.

      • @[email protected]
        link
        fedilink
        English
        7
        edit-2
        3 days ago

        Being blind, I don’t play video games and don’t do any kind of 3D graphics and stuff like that. So many, many computers would fit my specifications.

        Edit: My laptop right now is a Dell Latitude E5400 from like 2014 with eight gigabytes of RAM and a 7200 RPM drive with an Intel Core i5 and it works well enough. Honestly, the only problem with it is that it does not charge the battery. So as soon as it is unplugged from the wall, it just dies. And it’s not the battery itself because I’ve tried getting new batteries for it. It’s something in the charging circuitry. It works fine when it’s on wall power, but it just does not charge the battery. I figure with it being 10 years old already, at some point I will have to replace it.

        • @[email protected]
          link
          fedilink
          English
          4
          edit-2
          3 days ago

          And it’s not the battery itself because I’ve tried getting new batteries for it. It’s something in the charging circuitry. It works fine when it’s on wall power, but it just does not charge the battery.

          At least some Dell laptops authenticate to the charger so that only “authentic Dell chargers” can charge the battery, though they’ll run off third-party chargers without charging the battery.

          Unfortunately, it’s a common problem – and I’ve seen this myself – for the authentication pin on an “authentic Dell charger” to become slightly bent or something, at which it will no longer authenticate and the laptop will refuse to charge the battery.

          I bet the charger on yours is a barrel charger with that pin down the middle.

          hits Amazon

          Yeah, looks like it.

          https://www.amazon.com/dp/B086VYSZVL?psc=1

          I don’t have a great picture for the 65W one, but the 45W charger here has an image looking down the charger barrel showing that internal pin.

          If you want to keep using that laptop and want to use the battery, I’d try swapping out the charger. If you don’t have an official Dell charger, make sure that the one you get is one of those (unless some “universal charger” has managed to break their authentication scheme in the intervening years; I haven’t been following things).

          EDIT: Even one of the top reviews on that Amazon page mentions it:

          I have a DELL, that has the straight barrel plug with the pin in it. THEY REALLY made a BAD DECISION when they made these DELL laptops with that type of plug instead of making it with a dog leg style plug. I have to replace my charger cord A LOT because the pin gets bent inside and it stops charging at that plug, but the rest of the charger is still good…

        • Cyborganism

          Oh hey, I have question for you then. Are you using any braille system with your computer? Or is it a kind of voice reader thing you have going on? What do you use for reading posts and comments on Lemmy?

          • @[email protected]
            link
            fedilink
            English
            4
            edit-2
            3 days ago

            I use my phone a lot more frequently than I use my computer and I use the TalkBack screen reader on my phone primarily. I can read and write Braille of course and have been able to do so since I was a little kid but I don’t do it very often primarily because I’ve always found reading to be slow for me and so I prefer audio. I’m able to better absorb information through audio than through reading it directly and always have been.

            Edit: I’m not totally blind so my primary navigation is through memorization of where things are and then to read posts and stuff like that that’s long I use the screen reader. So, for example, on my home screen, I know where I’ve placed my app icons, so I can just easily navigate to them, and in settings, for example, I know roughly where the menus are that I’m looking for, and so can navigate to them quickly. I also use the magnification gestures a lot. So, primarily, I navigate with memorization, magnification gestures, and screen reader for longer stuff.

                • @[email protected]
                  link
                  fedilink
                  English
                  11 day ago

                  Can you actually understand mumble rappers?

                  Seriously though I find these accessibility discussions very informative and they make me think about how I develop things and share information. Thank you for sharing.

        • Snot Flickerman (OP)

          Oh snap I am really sorry to intrude but I have a question for someone like yourself who is an avid PC user and is also blind.

          How do you feel about the prohibitive cost of braille terminals? I am not blind but I remember seeing the film Sneakers when I was young and the blind hacker Whistler using a braille terminal. As an adult I looked into them and was shocked that some cost more than a mid-range laptop. Are they even that useful or is this a relic that I recall but has been superseded by more useful assistive technologies?

          • @[email protected]
            link
            fedilink
            English
            53 days ago

            Mind you, I don’t use Braille super often. And the Braille note taker devices are quite expensive. For sure. But just direct Braille displays have come down quite a bit in price. I remember a couple of years ago, a Braille display was launched called the Orbit Reader 20, which is a 20 cell Braille display. And I think it was like $400 or something like that. Compared to the $5,000 that some Braille note-taker devices can cost, $400 is nothing.

            • Snot Flickerman (OP)

              Thanks for the feedback, that’s super interesting to me. I’m glad to hear they’ve come down in price more recently, for sure!

              • @[email protected]
                link
                fedilink
                English
                32 days ago

                Same here. It used to be that you had to get them subsidized by government programs such as vocational rehabilitation. But now they are affordable by just saving for a little bit.

    • @[email protected]
      link
      fedilink
      English
      3
      edit-2
      2 days ago

      For real, I’m happily using an APU for 90% of the time. I barely need a dedicated GPU at all any more. I use Mint btw.

    • @[email protected]
      link
      fedilink
      English
      63 days ago

      I was that way for the longest time. I was more than content with my 4 core 8 thread 4th Gen. i7 laptop. I only upgraded to an 11th Gen. i9 system because I wanted to play some games on the go.

      But after I upgraded to that system I started to do so much more, and all at once – mostly because I actually could, where the old system would cry in pain long before then. But mid last year I finally broke down and bought a 13th Gen. i9 system to replace it, and man do I flog the shit out of this computer. Just having the spare power lying around made me want to do more and more with it.

      • @[email protected]
        link
        fedilink
        English
        12 days ago

        My phone has been my primary computing device for several years now, and so I hardly ever use my laptop anyway. So it honestly doesn’t make a whole lot of sense for me to spend a ton of money on it.

          • @[email protected]
            link
            fedilink
            English
            1
            edit-2
            2 days ago

            Personally, I don’t do spreadsheets, or I do them incredibly rarely, but obviously for that, you need a bigger screen. And I do have a laptop. I just don’t use it all that often. My phone is my primary computing device. My phone is not my only computing device.

      • @[email protected]
        link
        fedilink
        English
        23 days ago

        My current laptop is a Dell Latitude E5400 and it has like 4 threads with 8 gigs of RAM and a 7200 RPM drive, and it works well enough even though it’s 10 years old. Honestly, the only problem with it is that it does not charge the battery. It’s something in the charging circuitry, since it works fine when it’s on wall power but absolutely will not charge a battery anymore.

        • @[email protected]
          link
          fedilink
          English
          12 days ago

          I’m still dailying my Acer C720 Chromebook with Linux Mint lol. I’m thankful the flimsy charging port hasn’t given out yet. But it’s coming.

  • Serge Matveenko

    Well, it’s still a modified custom distro and other distros will need to invest extra effort to be able to run there. So, no actual freedom of choice for users again…

    • @[email protected]
      link
      fedilink
      English
      -11 day ago

      Not true. You can run other distros, but it won’t be ready to go without a decent amount of work.

        • @[email protected]
          link
          fedilink
          English
          01 day ago

          They said there is no freedom of choice. You are free to choose any distro you want. HW mfgs aren’t under obligation to provide inbox support for niche markets.

          • @[email protected]
            link
            fedilink
            English
            2
            edit-2
            1 day ago

            Folks, I think we’ve got a bot here that’s just spewing pre-prepared lines, regardless of how irrelevant the comment is.

            • @[email protected]
              link
              fedilink
              English
              -11 day ago

              Versus someone that doesn’t understand the market segment this is meant for. People using it for training don’t care about the distro or gfx drivers. It’s an appliance.

  • @[email protected]
    link
    fedilink
    English
    -111 day ago

    I fucking wish you could filter out words like “Linux” on Lemmy so I don’t have to hear it anymore. I avoid Linux out of spite to all the Linux bros