This is exactly how I was thinking we could implement downloading footprints and symbols. What is fundamentally wrong with this approach assuming that each KiCad user downloads 5% of the library?
One big issue is how KiCad gets a directory listing of the 3D files, even before it curls them down. Internally, KiCad uses the GitHub API to fetch a directory listing of the available models. This "mostly" works, except that the GitHub API has moved on in some regards: specifically, large directories (IIRC those with more than 100 models) will have some models silently missing from the listing.
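To make the failure mode concrete, here is a minimal sketch of the kind of per-directory listing call involved, using GitHub's standard repository-contents endpoint. This is not KiCad's actual code; the owner/repo/path values are illustrative, and the fetch function is injectable so the filtering logic can be exercised without the network.

```python
import json
from urllib.request import Request, urlopen

API = "https://api.github.com/repos/{owner}/{repo}/contents/{path}"

def fetch_json(url):
    # Real network fetch. The contents endpoint is the one that silently
    # truncates very large directories, which is the problem described above.
    req = Request(url, headers={"Accept": "application/vnd.github+json"})
    with urlopen(req) as resp:
        return json.load(resp)

def list_models(owner, repo, path, fetch=fetch_json):
    """Return the file names the API reports for one directory.

    `fetch` is a parameter so the listing logic can be tested offline
    with canned JSON instead of a live API call.
    """
    entries = fetch(API.format(owner=owner, repo=repo, path=path))
    return sorted(e["name"] for e in entries if e["type"] == "file")
```

In practice the only way to notice the truncation is to compare a listing like this against a manifest of what the directory is known to contain, since the API gives no error for the missing entries.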
What are the potential workarounds for Git not supporting per-file downloads?
- Use Git as God intended it to be used, and clone the entire repo
- Use the SVN interface to GitHub, which allows directory traversal and pulling down individual files, and is not limited by the restrictions placed on API access. I have done some experimenting with a Python SVN library and it seems to have potential. A lot of work would be required to turn this into a fully fledged library tool, but I think it is worth further investigation.
- Host each model (zipped) on a website that gets rebuilt every 48 hours. Users can download a single model (zipped) or a single folder of models (e.g. pin headers) zipped. With good 7z compression, the pin header models are only a couple of meg (for the entire set!). What a weirdly specific answer, you may say… almost as if I have been working on this exact thing…
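For the SVN option, a rough sketch of what per-file fetching could look like, shelling out to the `svn` command-line client rather than any particular Python binding. GitHub's SVN bridge exposes the default branch under `/trunk` of the repo URL; the repo URL and directory names below are illustrative assumptions, and the runner is injectable so the command construction can be tested without touching the network.

```python
import subprocess

# GitHub's SVN bridge serves the default branch under /trunk.
# Hypothetical repo URL for illustration only.
SVN_ROOT = "https://github.com/KiCad/kicad-packages3D/trunk"

def svn_cmd(action, *args):
    # Build the svn invocation; kept separate so it can be inspected in tests.
    return ["svn", "--non-interactive", action, *args]

def list_dir(path, run=subprocess.run):
    """List one directory of the repo without cloning anything."""
    out = run(svn_cmd("ls", f"{SVN_ROOT}/{path}"),
              capture_output=True, text=True, check=True)
    return out.stdout.splitlines()

def fetch_file(path, dest, run=subprocess.run):
    """Pull down a single model file to `dest`."""
    run(svn_cmd("export", f"{SVN_ROOT}/{path}", dest), check=True)
```

The appeal of this route is exactly what the bullet above says: `svn ls` gives full directory traversal and `svn export` gives single-file download, with none of the API's listing limits.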
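And for the hosting option, a minimal sketch of the periodic rebuild job: walk a models tree and produce one archive per folder, so a user can grab a single set (e.g. all pin headers) without cloning anything. The stdlib `zipfile` module stands in here for the 7z compression mentioned above; a real rebuild would likely shell out to 7z for the better ratios, and the directory layout is an assumption.

```python
import zipfile
from pathlib import Path

def build_archives(models_root, out_dir):
    """Zip each top-level model folder into its own archive.

    models_root -- directory containing one subfolder per model set
                   (layout assumed for illustration)
    out_dir     -- where the per-folder archives are written
    Returns the list of archive paths that were created.
    """
    models_root, out_dir = Path(models_root), Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    archives = []
    for folder in sorted(p for p in models_root.iterdir() if p.is_dir()):
        archive = out_dir / f"{folder.name}.zip"
        with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
            for f in sorted(folder.rglob("*")):
                if f.is_file():
                    # Store paths relative to the root so the archive
                    # unpacks into the original folder structure.
                    zf.write(f, f.relative_to(models_root))
        archives.append(archive)
    return archives
```

Run on a 48-hour cron, this gives a static site of per-folder downloads with no API, Git, or SVN involvement on the user's side at all.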