Re: Guidelines for pre-trained ML model weight binaries
From: Andreas Enge
Subject: Re: Guidelines for pre-trained ML model weight binaries
Date: Wed, 6 Sep 2023 16:28:10 +0200
Hello,
related to this thread, I just came across an entry in Cory Doctorow's blog:
https://pluralistic.net/2023/08/18/openwashing/#you-keep-using-that-word-i-do-not-think-it-means-what-you-think-it-means
It is already interesting in its dissection of the terms "open" vs. "free",
which is quite relevant to us (though it just echoes the sentiment I had anyway).
The end can be read as an invitation *not* to package neural network
related software at all: by packaging "big corporation X"'s free software,
which is untrainable on anything but big corporations' hardware, we
actually help big corporation X entrap users in its "ecosystem".
Andreas