jsight 8 days ago

Distillation can transfer the knowledge from an existing model into a newly trained one. That doesn't solve the cost problem, but costs are steadily coming down.
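
For concreteness, here's a minimal sketch of the standard soft-target distillation loss in PyTorch. The function name, the temperature value, and the frozen-teacher usage sketch are illustrative assumptions, not any particular lab's recipe:

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, temperature=2.0):
        # Soften both distributions, then push the student's toward the teacher's.
        soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
        log_probs = F.log_softmax(student_logits / temperature, dim=-1)
        # KL divergence, scaled by T^2 to keep gradient magnitudes stable
        # (the scaling from Hinton et al., 2015).
        return F.kl_div(log_probs, soft_targets, reduction="batchmean") * temperature**2

    # Usage sketch: the teacher is frozen; only the student receives gradients.
    # with torch.no_grad():
    #     teacher_logits = teacher(batch)
    # loss = distillation_loss(student(batch), teacher_logits)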

goku12 8 days ago

That's still a crude repurposing of an inscrutable artifact. Open source requires you to share the source data from which that artifact (the model parameters) was created.