All in all, pretty decent. Sorry for attaching a 35-minute video instead of linking to Twitter, but I wanted to comment on this. Pretty cool, though. Not a huge fan of Mark, but I prefer this over what the rest are doing.

The open source AI model that you can fine-tune, distill and deploy anywhere. It is available in 8B, 70B and 405B versions.
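Here's a rough sketch of what "fine-tune, distill and deploy anywhere" looks like in practice for just running the model, using the Hugging Face transformers library. The model ID and the assumption that your access request for the gated meta-llama repo has been approved are mine, not from the post.

```python
# Minimal sketch: run the 8B instruct variant with Hugging Face transformers.
# Assumes `pip install transformers torch accelerate` and approved access to
# the gated meta-llama repository; the exact model ID may differ.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",
    device_map="auto",    # spread layers across whatever GPU/CPU is available
    torch_dtype="auto",   # pick bf16/fp16 automatically if supported
)

print(generator("Explain model distillation in one paragraph.",
                max_new_tokens=120)[0]["generated_text"])
```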

Benchmarks

    • RobotToaster@mander.xyz · 4 months ago

      https://opensource.org/blog/metas-llama-2-license-is-not-open-source

      The actual licence is here: https://ai.meta.com/llama/license/

      iv. Your use of the Llama Materials must comply with applicable laws and regulations (including trade compliance laws and regulations) and adhere to the Acceptable Use Policy for the Llama Materials (available at https://ai.meta.com/llama/use-policy), which is hereby incorporated by reference into this Agreement.

      v. You will not use the Llama Materials or any output or results of the Llama Materials to improve any other large language model (excluding Llama 2 or derivative works thereof).

      1. Additional Commercial Terms. If, on the Llama 2 version release date, the monthly active users of the products or services made available by or for Licensee, or Licensee’s affiliates, is greater than 700 million monthly active users in the preceding calendar month, you must request a license from Meta, which Meta may grant to you in its sole discretion, and you are not authorized to exercise any of the rights under this Agreement unless or until Meta otherwise expressly grants you such rights.
    • hendrik@palaver.p3x.de · 4 months ago

      The issue is that “open source” is a term for computer software, and it doesn’t really apply to other things, but people use it regardless. With software, it means you share the recipe: the program code. With machine learning models, there isn’t really such a thing. What matters is a pile of numbers, the weights, and those do get shared in this case. But you can’t reproduce them. For that you’d need the dataset that went in, which Meta doesn’t share because lots of it is copyrighted and they have several court cases running because they just stole the texts and said it’s alright. What open source does allow (amongst other things) is building upon things and modifying them, and that can be done with the models to a certain degree: they can be fine-tuned and incorporated into custom projects (there’s a rough sketch after this comment). In the end, they (Meta) want to frame things a certain way and be the good guys, but the term still doesn’t really mean what it’s supposed to mean.

      There are other models with other licenses: some are Apache-licensed, some do or don’t allow commercial usage, and a few even come with the datasets and everything available. But those aren’t state of the art anymore.
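On the fine-tuning point above, here's roughly what building on the released weights usually looks like with LoRA adapters via the peft library. The model ID, dataset file, and hyperparameters are illustrative assumptions, not something from the thread.

```python
# Rough sketch of fine-tuning with LoRA adapters, so only small low-rank
# adapter matrices are trained instead of all 8B base weights.
# Assumes `pip install transformers peft datasets accelerate`, approved access
# to the gated meta-llama repo, and a local data.jsonl with a "text" field;
# the model ID, file name, and hyperparameters are placeholders.
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from peft import LoraConfig, get_peft_model
from datasets import load_dataset

base = "meta-llama/Meta-Llama-3.1-8B"
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token   # Llama ships without a pad token

model = AutoModelForCausalLM.from_pretrained(base, device_map="auto")
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))

data = load_dataset("json", data_files="data.jsonl")["train"]
data = data.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
                remove_columns=data.column_names)

Trainer(
    model=model,
    args=TrainingArguments(output_dir="llama-lora", per_device_train_batch_size=1,
                           num_train_epochs=1, learning_rate=2e-4),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()
```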

      • laughterlaughter@lemmy.world · 4 months ago

        Thanks. I know all this; I was just lazy and continued to use the terminology used in the post or thread.

        So, what are the restrictions? That you can’t use them commercially, for example?

        And if an average Joe wants to re-create a “for personal use” Jarvis, what would they use today?

        • hendrik@palaver.p3x.de · 4 months ago

          I haven’t looked it up. If it’s the same as before, there aren’t that many restrictions. You can’t use it commercially if you have more than 700 million monthly active users, so basically only if you’re Google, Amazon or OpenAI; everyone else is allowed to use it commercially. They have some stupid rules about how you have to name your derivatives, and they’re not liable for anything. You can read the license if you’re interested.

          What would people use? They’d use exactly this, probably one of the smaller variants; this is state of the art. Or you’d use one of the competing models from a few weeks or months back: Mistral, … You’d certainly not train anything from the ground up, because that costs millions of dollars in electricity and hardware (or in renting hardware).
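For the "personal Jarvis on your own machine" case, a common route is a quantized copy of one of the smaller variants. Here's a minimal sketch using llama-cpp-python; the GGUF file name and the prompts are placeholder assumptions on my part.

```python
# Minimal sketch: chat locally with a quantized 8B model via llama-cpp-python,
# which runs on CPU or a modest GPU. Assumes `pip install llama-cpp-python`
# and a downloaded quantized GGUF file; the file name below is a placeholder.
from llama_cpp import Llama

llm = Llama(model_path="Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf", n_ctx=4096)

out = llm.create_chat_completion(messages=[
    {"role": "system", "content": "You are a helpful home assistant."},
    {"role": "user", "content": "Summarize what's on my to-do list for today."},
])
print(out["choices"][0]["message"]["content"])
```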