• Captain Janeway
    85 months ago

    Well, I’m guessing they actually did test local AI on 4GB and 8GB RAM laptops and realized it would be an awful user experience. It’s just too slow.

    I wish they’d rolled it in as an option, though.

      • suoko
        15 months ago

        Llamafile with the TinyLlama model is only about 640 MB. It could be gated behind a flag, or shipped as an extension.
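
        For anyone curious, running a llamafile is basically a download-and-execute affair. This is a rough sketch only: the filename and download URL below are placeholders, not real artifacts; grab the actual TinyLlama llamafile from the Mozilla-Ocho/llamafile releases page.

        ```shell
        # Placeholder URL: substitute the real TinyLlama llamafile release asset.
        curl -L -o tinyllama.llamafile \
          https://example.com/TinyLlama-1.1B-Chat.llamafile

        # llamafiles are self-contained executables (llama.cpp + weights in one file)
        chmod +x tinyllama.llamafile

        # By default this starts a local llama.cpp web server you can chat with
        ./tinyllama.llamafile
        ```

        Since the whole thing is a single portable executable, a browser vendor could in principle ship or fetch one on demand, which is roughly what "a flag or an extension" would look like in practice.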