Comment by fragmede 2 days ago

No. In that case, you're providing two things: a binary version of your tool, and the tool's source. That source is available for others to inspect and to build their own copy. However, given just the weights, we don't have the source and can't inspect what alignment went into the model. In the case of DeepSeek, we know they purposefully trained their model to treat Tiananmen Square as something it shouldn't discuss. But without the source used to create the model, we don't know what else is lurking inside it.

NitpickLawyer 2 days ago

> However, given just the weights, we don't have the source

This is incorrect, given the definitions in the license.

> (Apache 2.0) "Source" form shall mean *the preferred form for making modifications*, including but not limited to software source code, documentation source, and configuration files.

(emphasis mine)

In LLMs, the weights are the preferred form of making modifications. Weights are not compiled from something else. You start with the weights (randomly initialised) and at every step of training you adjust the weights. That is not akin to compilation, for many reasons (both theoretical and practical).
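
To make the "not akin to compilation" point concrete, here is a minimal sketch, assuming nothing more than NumPy and a toy two-parameter linear model (the data, learning rate, and step count are all made up): the weights start as random noise and the training loop adjusts those same numbers in place; there is no separate artifact they are compiled from.

```python
# Toy illustration only: gradient descent on a made-up linear task.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=2)              # weights start as random noise
X = rng.normal(size=(100, 2))       # invented training inputs
y = X @ np.array([3.0, -1.0])       # invented targets

lr = 0.1
for step in range(200):
    grad = 2 * X.T @ (X @ w - y) / len(X)   # gradient of mean squared error
    w -= lr * grad                          # "training" = adjusting w in place

print(w)   # ends up near [3, -1]; this array is the artifact that gets shipped
```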

In general, licenses do not give you rights over the "know-how" or "processes" by which the licensed parts were created. What you get is the ability to inspect, modify, and redistribute the work as you see fit. And most importantly, you modify the work just like the creators modify the work (hence the preferred form). Just not with the same data (i.e. you can modify the source of Chrome all you want, just not with the "know-how and knowledge" of a Google engineer - the license cannot offer that).

This is also covered in the EU AI Act, btw.

> General-purpose AI models released under free and open-source licences should be considered to ensure high levels of transparency and openness if their parameters, including the weights, the information on the model architecture, and the information on model usage are made publicly available. The licence should be considered to be free and open-source also when it allows users to run, copy, distribute, study, change and improve software and data, including models under the condition that the original provider of the model is credited, the identical or comparable terms of distribution are respected.

  • fragmede 2 days ago

    > In LLMs, the weights are the preferred form of making modifications.

    No, they aren't. We happen to be able to modify the weights directly, sure, but why would any lab ever train something from scratch if editing weights were the preferred form?

    • NitpickLawyer 2 days ago

      Training is modifying the weights. How you modify them is not the object of a license; it never was.

      • v9v 2 days ago

        Would you accept the argument that compiling is modifying the bytes in the memory space reserved for an executable?

        I can edit the executable at the byte level if I so desire, and this is also what compilers do, but the developer would instead modify the source code to make changes to the program and then feed that through a compiler.

        Similarly, I can edit the weights of a neural network myself (using any tool I want) but the developers of the network would be altering the training dataset and the training code to make changes instead.
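
        A throwaway sketch of that distinction (same kind of toy linear model as above; the data and the `train` helper are invented for illustration): both routes end up changing the model's behaviour, but only one of them is how the developers themselves would work.

        ```python
        # Hypothetical two-route sketch with made-up data; not anyone's real workflow.
        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.normal(size=(100, 2))
        y = X @ np.array([3.0, -1.0])

        def train(targets):
            """The developers' route: edit the data/targets (the 'source') and rerun training."""
            w = np.zeros(2)
            for _ in range(200):
                w -= 0.1 * 2 * X.T @ (X @ w - targets) / len(X)
            return w

        w = train(y)

        # Route 1: patch the released artifact directly, like hex-editing a binary.
        w_patched = w.copy()
        w_patched[1] = 0.0

        # Route 2: change the "source" (the targets here) and regenerate the artifact.
        w_retrained = train(X @ np.array([3.0, 0.0]))
        ```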

      • noodletheworld 2 days ago

        > And most importantly, you modify the work *just like the creators* modify the work

        Emphasis mine.

        Weights are not open source.

        You can define terms to mean whatever you want, but fundamentally, if you cannot modify the “output” the way the original creators could, it's not in the spirit of open source.

        Isn't that literally what you said?

        How can you possibly claim both a) that you can modify it the way the creators did, and b) that that's all you need to be open source, while also making c) the categorically incorrect assertion that the weights allow you to do this?

        Whatever, I guess, but your argument is logically wrong and philosophically flawed.