Commit Graph

24 Commits

Author SHA1 Message Date
Thomas Petazzoni
1db15e2853 support/download/dl-wrapper: add concept of download post-processing
In order to support package managers such as Cargo (Rust) or Go, we
want to run some custom logic after the main download, but before
packing the tarball and checking the hash.

To implement this, this commit introduces a concept of download
post-processing: if -p <something> is passed to the dl-wrapper, then
support/download/<something>-post-process will be called.
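
For illustration, the dispatch boils down to something like this (the
variable name and error handling here are assumptions, not the actual
dl-wrapper code):

    # sketch: "-p cargo" makes the wrapper run support/download/cargo-post-process
    # after the download, but before packing the tarball and checking the hash
    if [ -n "${post_process}" ]; then
        support/download/"${post_process}"-post-process || exit 1
    fi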

Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
[yann.morin.1998@free.fr:
  - double-quote variable expansion when calling post-process script
]
Signed-off-by: Yann E. MORIN <yann.morin.1998@free.fr>
2022-01-07 11:19:10 +01:00
John Keeping
cff428fe31 download/git: support Git LFS
Git Large File Storage replaces large files with text pointers in the
Git repository while storing the contents on a remote server.  If a
repository is using this extension, then git-lfs must be used to
checkout the large files before the source archive is generated.
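
Sketch of the extra step in the git helper (the detection test here is
an assumption; git lfs fetch/checkout are the standard git-lfs commands):

    # after the normal checkout and submodule handling, materialise LFS objects
    if [ -f .gitattributes ] && grep -q 'filter=lfs' .gitattributes; then
        git lfs fetch
        git lfs checkout
    fi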

Signed-off-by: John Keeping <john@metanate.com>
[vfazio:
  - add git-lfs to DL_TOOLS_DEPENDENCIES
  - fixup for 5a0d681394
    ("infra/pkg-download: make the DOWNLOAD macro fully parameterised")
]
Signed-off-by: Vincent Fazio <vfazio@xes-inc.com>
[Arnout:
 - don't "git lfs install";
 - recurse into submodules.
]
Signed-off-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
2022-01-06 11:32:23 +01:00
Thomas Preston
16f660f8ce support/download: Add SFTP support
Add Secure File Transfer Program (SFTP) support using a simple wrapper.
SFTP is a common protocol used to transfer files securely between
enterprises, but it is not currently supported in Buildroot because all
of the packages are usually available via HTTP, git or some other
download method.

SFTP is similar to FTP but it performs all operations over an encrypted
SSH transport using a specific protocol. This is unlike ftps, which is
traditional FTP over an SSL/TLS connection.
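
A minimal sketch of such a helper, assuming the common backend options
(-q, -o, -u) and OpenSSH's sftp in batch mode (the real script differs
in detail):

    #!/bin/sh
    while getopts "qo:u:" OPT; do
        case "${OPT}" in
        q)  quiet=-q;;              # disable sftp's progress meter and warnings
        o)  output="${OPTARG}";;    # local file to create
        u)  uri="${OPTARG}";;       # e.g. sftp://user@host/path/to/file
        esac
    done
    # fetch a single file non-interactively
    printf 'get %s %s\n' "${uri##*/}" "${output}" | sftp ${quiet} -b - "${uri%/*}"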

Signed-off-by: Thomas Preston <thomas.preston@codethink.co.uk>
Signed-off-by: Michael Drake <michael.drake@codethink.co.uk>
[Arnout:
 - update documentation with sftp everywhere scp is mentioned;
 - rename "verbose" variable to "quiet";
 - print the sftp command, similar to wget and scp helpers.
]
Signed-off-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
2022-01-06 09:34:05 +01:00
Damien Thébault
bb7c13273f support/download/dl-wrapper: fix urlencode option never being sent to backend
Since commit 38de434123 ("download: fix file:// BR2_PRIMARY_SITE
(download cache)"), the urlencode option is no longer passed to the
download backend, because we use ${backend} instead of
${backend_urlencode}.

We must get the urlencode information from backend_urlencode.
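
In other words, the flag has to be derived along these lines (sketch;
the exact option letter passed down to the backend is an assumption):

    # ${backend_urlencode} still carries the "|urlencode" suffix, ${backend} does not
    case "${backend_urlencode}" in
    *\|urlencode) urlencode="-e";;      # option letter shown here is illustrative
    esac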

Signed-off-by: Damien Thébault <damien.thebault@vitec.com>
Reviewed-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
[Thomas: rework commit log]
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-12-09 17:16:04 +01:00
Peter Korsgaard
721e4cbb52 Merge branch 'next'
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-09-07 13:13:17 +02:00
Hollis Blanchard
38de434123 download: fix file:// BR2_PRIMARY_SITE (download cache)
wget is the only downloader currently usable with BR2_PRIMARY_SITE, and that
doesn't work at all for file:// URLs. The symptoms are these:

	support/download/dl-wrapper -c '2.4.47' -d '/PATH/build/sw/source/attr' -D '/PATH/build/sw/source' -f 'attr-2.4.47.src.tar.gz' -H 'package/attr//attr.hash' -n 'attr-2.4.47' -N 'attr' -o '/PATH/build/sw/source/attr/attr-2.4.47.src.tar.gz'  -u file\|urlencode+file:///NFS/buildroot_dl_cache/attr -u file\|urlencode+file:///NFS/buildroot_dl_cache -u http+http://download.savannah.gnu.org/releases/attr -u http\|urlencode+http://sources.buildroot.net/attr -u http\|urlencode+http://sources.buildroot.net  --
	file:///NFS/buildroot_dl_cache/attr/attr-2.4.47.src.tar.gz: Unsupported scheme `file'.
	ERROR: attr-2.4.47.src.tar.gz has wrong sha256 hash:
	ERROR: expected: 25772f653ac5b2e3ceeb89df50e4688891e21f723c460636548971652af0a859
	ERROR: got     : e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
	ERROR: Incomplete download, or man-in-the-middle (MITM) attack

In the case of custom Linux kernel versions, this is fatal, because there isn't
necessarily a hash file to indicate that wget's empty tarball is wrong.

This seems to have been broken by commit c8ef0c03b0, because:
1. BR2_PRIMARY_SITE always appends "urlencode" (package/pkg-download.mk)
2. Anything with the "|urlencode" suffix in $uri will end up using wget due to
   the backend case wildcarding.
3. The wget backend rejects file:/// URLs ("unsupported scheme"), and we end up
   with an empty .tar.gz file in the downloads directory.

Fix that by shell-extracting the backend name from the part to the left
of the "|". I'm not certain that all URIs will have a "|", so this code
only looks for a "|" left of the "+".
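
The extraction amounts to two parameter expansions along these lines
(illustrative sketch, not the literal dl-wrapper code):

    # uri looks like:  file|urlencode+file:///NFS/buildroot_dl_cache/attr
    backend_urlencode="${uri%%+*}"      # "file|urlencode"  (drop from the first '+')
    backend="${backend_urlencode%%|*}"  # "file"            (keep what is left of the '|')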

Signed-off-by: Hollis Blanchard <hollis_blanchard@mentor.com>
Reviewed-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Tested-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-08-21 00:24:29 +02:00
Yann E. MORIN
4dc6c43703 support/download: remove help from wrapper
The download wrapper is a purely internal helper, and is not supposed to
be called manually, so there is no need to offer any help.

Besides, the help text was way outdated.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-08-09 23:47:54 +02:00
Robert Beckett
a5ba72946e dl-wrapper: Fix support for URIs containing '+'
'+' is a valid character in a URL. The current dl-wrapper gets the
URI scheme by dropping everything after the last '+' character, with
the intention of finding 'git' from e.g. 'git+https://uri'.

If a URI has a '+' anywhere else in it, the wrapper ends up using too
much of the string as the scheme, and fails to match the handler properly.

An example of where this form of URI is used is when using deploy tokens
in GitLab. It uses a form like https://<username>:<password>@gitlab.com/<group>/<repo.git>
where the username for a deploy token is of the form 'gitlab+deploy-token-<number>'.

Use the %% parameter expansion to drop everything from the first '+'
onwards, since the scheme is always the part of the string before the
first '+'.
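
Concretely (example values made up):

    uri='git+https://gitlab+deploy-token-1234:TOKEN@gitlab.com/group/repo.git'
    scheme="${uri%%+*}"   # "%%" drops the longest suffix: keeps "git", before the FIRST '+'
    # the old behaviour was equivalent to ${uri%+*}, which only dropped the part
    # after the LAST '+', leaving "git+https://gitlab" as the bogus scheme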

Signed-off-by: Robert Beckett <bbeckett@netvu.org.uk>
Acked-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-06-04 21:59:55 +02:00
Thomas Petazzoni
2679444980 support/download/dl-wrapper: pass the correct -N option
${raw_name} is never defined in dl-wrapper, and therefore the value
passed to the -N option is always empty. This causes a problem for the
'cvs' backend, which uses the value of this option as the CVS module
to be downloaded.

If the name of the CVS module is omitted, all the CVS modules from
that CVS repository are downloaded, which creates a tarball with a lot
more content, with the actually useful content in a sub-directory,
obviously breaking the patches that should be applied, and the entire
build process that follows.

Fixes:

  http://autobuild.buildroot.net/results/fcee0e3d7eeeb373313b1794092c729b1b052348/

Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Tested-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Acked-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-04-12 22:54:00 +02:00
Thomas Petazzoni
0cf077a209 support/download/dl-wrapper: fix passing remaining options to helper scripts
When calling the backend-specific helper scripts, the remaining
options are in ${@}. However, in order to let the helper script know
that those remaining options should not be parsed, but instead passed
as-is to the download tool, they must be separated from the main
options by "--".

Without this, packages that use <pkg>_DL_OPTS, such as the
amd-catalyst package, cannot download their tarball anymore.
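
Schematically (helper and variable names here are illustrative):

    # before: "${@}" (the <pkg>_DL_OPTS leftovers) got mixed into the helper's own options
    # after:  "--" stops the helper's getopts, so everything after it reaches the tool as-is
    support/download/"${backend}" ${quiet} -o "${tmp_output}" -u "${uri}" -- "${@}"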

Fixes:

  http://autobuild.buildroot.net/results/de818f6e4c8e63d5e8a49c445d10c34eccc40410/

Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Tested-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Acked-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-04-12 22:53:55 +02:00
Yann E. MORIN
0d19fcf6c2 core/download: fix when the BR2_DL_DIR does not accept hardlinks
When the BR2_DL_DIR is a mountpoint (presumably shared between various
machines, or mounted from the local host when running in a VM), it is
possible that it does not support hardlinks (e.g. Samba, or the VMware
VMFS, etc.).

If the hardlink fails, fall back to copying the file. As a last resort,
if that also fails, eventually fall back to doing the download.
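
The fallback chain is essentially (sketch; variable names are
illustrative):

    # hardlink if the filesystem allows it, otherwise copy; if both fail,
    # fall through to the normal download
    ln "${old_dl_dir}/${filename}" "${dl_dir}/${filename}" 2>/dev/null ||
        cp "${old_dl_dir}/${filename}" "${dl_dir}/${filename}" 2>/dev/null ||
        true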

Note: this means that the dl-wrapper is no longer atomic-safe: the code
suffers from a TOCTTOU condition: the file may be created in-between the
check and the moment we try to ln/cp it. Fortunately, the dl-wrapper is
now run under a flock, so we're still safe. If we eventually go for a
more fine-grained implementation, we'll have to be careful then.

Reported-by: Arnout Vandecappelle <arnout@mind.be>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Arnout Vandecappelle <arnout@mind.be>
Cc: Peter Korsgaard <peter@korsgaard.com>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-04-03 22:45:00 +02:00
Yann E. MORIN
e80d1d0af4 core/download: look for archives in the global download dir first
For existing setups, the global download directory may have a lot of the
required archives, so look in there before attempting a download.

We simply hard-link them if found there and not in the new per-package
location. Then we resume the existing procedure (which means the new
hardlink will get removed if it happened to not match the hash).
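
I.e. roughly (sketch; variable names are illustrative):

    # reuse a tarball from the global BR2_DL_DIR if the per-package dir lacks it
    if [ ! -e "${dl_dir}/${filename}" ] && [ -e "${old_dl_dir}/${filename}" ]; then
        ln "${old_dl_dir}/${filename}" "${dl_dir}/${filename}"
    fi
    # then proceed as before: the hash check will remove the link if it does not match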

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Peter Korsgaard <peter@korsgaard.com>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-04-02 17:48:56 +02:00
Maxime Hadjinlian
5d2a018ddf download: add missing '-d' option
The infrastructure needs to give the 'dl_dir' to the dl-wrapper, which in
turn needs to give it to the helper.  It will only be used by the 'git'
helper as of now.

Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-04-02 17:48:56 +02:00
Maxime Hadjinlian
c8ef0c03b0 download: put most of the infra in dl-wrapper
The goal here is to simplify the infrastructure by putting most of the
code in the dl-wrapper as it is easier to implement and to read.

Most of the functions were already common; this patch finalizes it by
making pkg-download.mk pass all the parameters needed to the dl-wrapper,
which in turn will pass everything to every backend.

The backend will then cherry-pick what it needs from these arguments
and act accordingly.

It eases the transition to the addition of a sub-directory per package
in the DL_DIR and, later on, a git cache.

[Peter: drop ';' in BR_NO_CHECK_HASH_FOR in DOWNLOAD macro and swap cd/rm
	-rf as mentioned by Yann, fix typos]
Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Tested-by: Luca Ceresoli <luca@lucaceresoli.net>
Reviewed-by: Luca Ceresoli <luca@lucaceresoli.net>
Reviewed-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-04-02 14:19:51 +02:00
Yann E. MORIN
91e776b5af core/pkg-download: change all helpers to use common options
Currently all download helpers accept the local output file, the remote
locations, the changesets and so on... as positional arguments.

This was all well and good when that was all we needed.

But then we added an option to quiesce their verbosity, and that was
shoehorned with a trivial getopts, still keeping all the existing
positional arguments as... positional arguments.

Adding yet more options while keeping positional arguments will not be
very easy, even if we do not envision any new option in the foreseeable
future (but 640K ought to be enough for everyone, remember? ;-) ).

Change all helpers to accept a set of generic options (-q for quiet and
-o for the output file) as well as helper-specific options (like -r for
the repository, -c for a changeset...).

Maxime:
Changed -R to -r for recurse (only for the git backend)
Changed -r to -u for the URI (for all backends)
Changed -R to -c for the changeset (for the CVS and SVN backends)
Added the export of BR_BACKEND_DL_GETOPTS so that all the backend
wrappers can use the same options easily
Now all the backends use the same common options.
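
With that, every backend can share a single parsing loop, roughly
(illustrative sketch; the exact contents of BR_BACKEND_DL_GETOPTS are
not shown here):

    while getopts "${BR_BACKEND_DL_GETOPTS}" OPT; do
        case "${OPT}" in
        q)  quiet=-q;;
        o)  output="${OPTARG}";;
        u)  uri="${OPTARG}";;
        c)  cset="${OPTARG}";;
        r)  recurse=1;;             # git backend only
        :)  printf "option '%s' expects a mandatory argument\n" "${OPTARG}"; exit 1;;
        \?) printf "unknown option '%s'\n" "${OPTARG}"; exit 1;;
        esac
    done
    shift $((OPTIND-1))             # whatever remains is passed to the download tool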

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Reviewed-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Reviewed-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-04-02 12:22:50 +02:00
Gaël PORTAY
cf9bf7d28b support/download: keep files downloaded without hash
In the situation where the hash is missing from the hash file, the
dl-wrapper downloads the file again and again until the developer
specifies the hash to complete the download step.

To avoid this situation, the freshly-downloaded file is not removed
anymore after a successful download.

After this change, the behaviour is as follows:

- Hash file doesn't exist, or file is in BR_NO_CHECK_HASH_FOR
  => always succeeds.

- Hash file exists, but file is not present
  => file is NOT removed, build is terminated immediately (i.e.
     secondary site is not tried).

- Hash file exists, file is present, but hash mismatch
  => file is removed, secondary site is tried.
  => If all primary/secondary site downloads or hash checks fail, the
     build is terminated.
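
Inside the wrapper's loop over the configured sites, this amounts to
roughly the following (sketch; the check-hash exit codes used here are
assumptions):

    support/download/check-hash ${quiet} "${hash_file}" "${tmp_dl}" "${base_name}"
    case "${?}" in
    0)  break;;                 # OK (hash matches, no hash file, or BR_NO_CHECK_HASH_FOR)
    3)  exit 1;;                # assumed code: file not listed in the hash file
                                #   -> keep the download, terminate immediately
    *)  rm -f "${tmp_dl}";;     # mismatch: remove the file, let the loop try the next site
    esac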

Signed-off-by: Gaël PORTAY <gael.portay@savoirfairelinux.com>
[Arnout: extend commit log]
Signed-off-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
2018-04-01 15:02:53 +02:00
Yann E. MORIN
ec3e057f79 core/pkg-infra: download git submodules if the package wants them
Add a new package variable that packages can set to specify that they
need git submodules.

Only accept this option if the download method is git, as we cannot get
submodules via an http download (via wget).
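
When a package opts in, the git backend then does the equivalent of the
following (sketch; how the flag is plumbed through the infra is omitted):

    # ${recurse} is set when the infra asked for submodules (git download method only)
    if [ -n "${recurse}" ]; then
        git submodule update --init --recursive
    fi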

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Aleksandar Simeonov <aleksandar@barix.com>
Tested-by: Matt Weber <matt@thewebers.ws>
Reviewed-by: Matt Weber <matt@thewebers.ws>
Tested-By: Nicolas Cavallari <nicolas.cavallari@green-communications.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2016-07-02 19:11:14 +02:00
Yann E. MORIN
3c19d511d7 support/download: properly catch missing hashes
When the hash check reports that there is no hash for a file, and this
is treated as an error (now: because BR2_ENFORCE_CHECK_HASH is set;
later: because that will be the new and only behaviour), exit promptly
in error.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Reviewed-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Reviewed-by: Samuel Martin <s.martin49@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-04-03 14:08:54 +02:00
Yann E. MORIN
c22fe04a78 support/download: make hash file optional
Currently, specifying a hash file for our download wrapper is mandatory.

However, when we download a git, svn, bzr, hg or cvs tree, there's by
design no hash to check the download against.

Since we're going to have hash checking mandatory when a hash file
exists, this would break those downloads from a repository.

So, make specifying a hash file optional when calling our download
wrapper and bail out early from the check-hash script if no hash file is
specified.
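
So the check-hash script can simply bail out at the top, along these
lines (sketch; the variable name is illustrative):

    # no hash file was passed, or it does not exist: nothing to check, accept the file
    if [ -z "${h_file}" ] || [ ! -f "${h_file}" ]; then
        exit 0
    fi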

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Reviewed-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Reviewed-by: Samuel Martin <s.martin49@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-04-03 14:08:00 +02:00
Fabio Porcedda
0b600227b0 pkg-download: silence downloads if make is silent
If doing a silent build (make -s -> QUIET=-q), silence all downloads,
by passing the -q flag downward to backends as well as to check-hash.

Change a printf to use the trace functions.

Signed-off-by: Fabio Porcedda <fabio.porcedda@gmail.com>
Signed-off-by: Yann E. MORIN <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-01-04 18:39:46 +01:00
Yann E. MORIN
9bb7d10eea pkg-download: check hashes for locally cached files
In some cases, upstreams just update their releases in-place, without
renaming them. When such a package is updated in Buildroot, a new hash
matching the new upstream release is included in the corresponding .hash
file.

As a consequence, users who previously downloaded that package's tarball
with an older version of Buildroot will get stuck with an old archive
for that package, and after updating their Buildroot copy will be greeted
with a failed download, due to the local file not matching the new
hashes.

Also, an upstream will sometimes serve us HTML garbage instead of the
actual tarball we requested, like SourceForge does from time to time for
as-yet unknown reasons.

So, to avoid this situation, check the hashes prior to doing the
download. If the hashes match, consider the locally cached file genuine,
and do not download it. However, if the locally cached file does not
match the known hashes we have for it, it is promptly removed, and a
download is re-attempted.
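
The pre-download check is conceptually just (sketch; names are
illustrative):

    # a cached file whose hashes match is kept and not downloaded again
    if [ -e "${output}" ]; then
        if support/download/check-hash ${quiet} "${hash_file}" "${output}" "${base_name}"; then
            exit 0
        fi
        rm -f "${output}"       # stale or corrupted cache entry: remove and re-download
    fi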

Note: this does not add any overhead compared to the previous situation,
because we were already checking hashes of locally cached files. It just
changes the order in which we do the checks. For the record, here is the
overhead of hashing a 231MiB file (qt-everywhere-opensource-src-4.8.6.tar.gz)
on a Core i5 @ 2.5GHz:

            cache-cold  cache-hot
    sha1      1.914s      0.762s
    sha256    2.109s      1.270s

But again, this overhead already existed before this patch.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Peter Korsgaard <jacmet@uclibc.org>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
Reviewed-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-12-11 23:59:41 +01:00
Yann E. MORIN
9b88c60484 pkg-download: verify the hashes from the download wrapper
Instead of repeating the check in our download rules, delegate the check
of the hashes to the download wrapper.

This needs three different changes:

  - add a new argument to the download wrapper, that is the full path to
    the hash file; if the hash file does not exist, that does not change
    the current behaviour, as the existence of the hash file is checked
    for in the check-hash script;

  - add a third argument to the check-hash script, to be the basename of
    the file to check; this is required because we no longer check the
    final file with the final filename, but an intermediate file with a
    temporary filename;

  - do the actual call to the check-hash script from within the download
    wrapper.

This further paves the way to doing pre-download checks of the hashes
for the locally cached files.

Note: this patch removes the check of hashes for already-downloaded
files, since the wrapper script exits early. The behaviour of checking
locally cached files will be restored and enhanced in the following
patch.
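
Put together, the tail of the wrapper then looks roughly like this
(sketch; ${tmp_dl} stands for the intermediate file with its temporary
name):

    # third argument is the final basename, since ${tmp_dl} has a temporary name
    if ! support/download/check-hash "${hash_file}" "${tmp_dl}" "${base_name}"; then
        rm -f "${tmp_dl}"
        exit 1
    fi
    mv "${tmp_dl}" "${output}"      # only a verified file gets its final name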

[Thomas: fix minor typo in comment.]

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Peter Korsgaard <jacmet@uclibc.org>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-12-11 23:59:25 +01:00
Yann E. MORIN
2685937e06 pkg-download: check for already downloaded file in the download wrapper
Instead of repeating the same test again and again in all our download
rules, just delegate the check for an already downloaded file to the
download wrapper.

This clears the path for doing the hash checks on a cached file
before the download.
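
I.e. near the top of the wrapper (sketch):

    # nothing to do if the file has already been downloaded
    if [ -e "${output}" ]; then
        exit 0
    fi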

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Peter Korsgaard <jacmet@uclibc.org>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
Reviewed-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-12-11 23:58:44 +01:00
Yann E. MORIN
78b92e5055 support/download: add option parsing to the download wrapper
Instead of relying on argument ordering, use actual options in the
download wrapper.

Download backends (bzr, cp, hg...) are left as-is, because it does not
make sense to complexify them, since they are mostly very trivial shell
scripts, and adding option parsing would be really overkill.

This commit also renames the script to dl-wrapper so that it looks
better in the traces and is not confused with another wrapper.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-12-11 23:58:37 +01:00