• Daxtron2@startrek.website
    10 months ago

    Then you haven’t been paying attention. There have been huge strides in the field of small open language models, which can now do inference with low enough power consumption to run locally on a phone.