help-guix

Architecture to reduce download time when pulling multiple packages


From: Josh Marshall
Subject: Architecture to reduce download time when pulling multiple packages
Date: Wed, 11 Oct 2023 23:27:00 -0400

Presently, I am waiting until the end of global warming for my texlive
packages to finish downloading.  I see that packages are served from a
few different servers.  Would the following be feasible as a feature
to improve effective download speed?

First, list the base information: which packages exist and where they
are located.
1) For each package identifier, list the locations from which it may
be obtained.
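
A minimal sketch of what step 1 might look like, assuming the
availability map is built once up front; the package names and server
URLs here are purely illustrative, not real Guix substitute servers:

```python
# Hypothetical availability map: package identifier -> the list of
# locations (substitute servers) known to carry that package.
availability = {
    "texlive-base":  ["https://serverA.example", "https://serverB.example"],
    "texlive-fonts": ["https://serverA.example"],
}

def locations(pkg):
    """Return every location a package can be obtained from."""
    return availability.get(pkg, [])
```

This map is also exactly the input the scheduling step below needs,
since the per-package location count drives the priority order.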

Next, track the top-level status of each location, so that a first
pass of scheduling packages across servers is possible.  Something
simple and sane: prefer assigning each server a unique package, and
more specifically fetch the packages available from the fewest
locations first, so as not to create a bottleneck later on.
2) For each location, track whether it is presently downloading a
package, and which one.
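
The scheduling rule in step 2 could be sketched as a rarest-first
assignment: sort packages by how few locations carry them, then give
each idle server at most one package.  This is one possible reading of
the proposal, not an existing Guix mechanism:

```python
def schedule(availability, busy):
    """Assign at most one package to each idle server, rarest-first.

    availability: dict mapping package -> list of servers carrying it.
    busy: set of servers currently occupied with a download.
    Returns a dict mapping package -> the server chosen for it.
    """
    assignments = {}
    # Rarest-first: packages reachable from the fewest locations are
    # scheduled before widely mirrored ones, to avoid a late bottleneck.
    for pkg in sorted(availability, key=lambda p: len(availability[p])):
        for server in availability[pkg]:
            if server not in busy and server not in assignments.values():
                assignments[pkg] = server
                break
    return assignments
```

Re-running this each time a download finishes (moving the freed server
out of `busy`) would give the simple top-level scheduler the post
describes.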

It would be simpler to stop at step 2, but we can do better.  If a
package can be sourced from multiple locations, and those locations
cannot each be given a unique package (typically because there are
more locations than packages), then the download of a single package
can be interleaved between multiple locations.
3) For each actively downloading package, list the locations actively
assigned to fetch its data, alongside the information needed to
interleave the data coming in from each location.
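
One way the interleaving in step 3 could work, assuming the servers
support byte-range requests, is to plan round-robin byte ranges across
the locations holding the package.  This is only a sketch of the
bookkeeping, with an invented `plan_ranges` helper:

```python
def plan_ranges(size, servers, chunk=1 << 20):
    """Split a download of `size` bytes into inclusive byte ranges,
    handed out round-robin across the servers holding the package.

    Returns a list of (server, first_byte, last_byte) tuples that
    together cover the whole file exactly once.
    """
    plan = []
    offset = 0
    i = 0
    while offset < size:
        end = min(offset + chunk, size)
        # Inclusive range, as in an HTTP "Range: bytes=first-last" header.
        plan.append((servers[i % len(servers)], offset, end - 1))
        offset = end
        i += 1
    return plan
```

A real implementation would fetch each range concurrently and reassemble
the chunks in offset order, which is the "information to interleave data
coming in from each location" that step 3 calls for.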

If someone is willing to do a bit of mentoring, this might be a good
project to work on.  Any thoughts on this?  Is this rehashing old
ground?


