Commit Graph

150 Commits

Author SHA1 Message Date
Hollis Blanchard
38de434123 download: fix file:// BR2_PRIMARY_SITE (download cache)
wget is the only downloader currently usable with BR2_PRIMARY_SITE, and that
doesn't work at all for file:// URLs. The symptoms are these:

	support/download/dl-wrapper -c '2.4.47' -d '/PATH/build/sw/source/attr' -D '/PATH/build/sw/source' -f 'attr-2.4.47.src.tar.gz' -H 'package/attr//attr.hash' -n 'attr-2.4.47' -N 'attr' -o '/PATH/build/sw/source/attr/attr-2.4.47.src.tar.gz'  -u file\|urlencode+file:///NFS/buildroot_dl_cache/attr -u file\|urlencode+file:///NFS/buildroot_dl_cache -u http+http://download.savannah.gnu.org/releases/attr -u http\|urlencode+http://sources.buildroot.net/attr -u http\|urlencode+http://sources.buildroot.net  --
	file:///NFS/buildroot_dl_cache/attr/attr-2.4.47.src.tar.gz: Unsupported scheme `file'.
	ERROR: attr-2.4.47.src.tar.gz has wrong sha256 hash:
	ERROR: expected: 25772f653ac5b2e3ceeb89df50e4688891e21f723c460636548971652af0a859
	ERROR: got     : e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
	ERROR: Incomplete download, or man-in-the-middle (MITM) attack

In the case of custom Linux kernel versions, this is fatal, because there isn't
necessarily a hash file to indicate that wget's empty tarball is wrong.

This seems to have been broken by commit c8ef0c03b0, because:
1. BR2_PRIMARY_SITE always appends "urlencode" (package/pkg-download.mk)
2. Anything with the "|urlencode" suffix in $uri will end up using wget due to
   the backend case wildcarding.
3. The wget backend rejects file:/// URLs ("unsupported scheme"), and we end up
   with an empty .tar.gz file in the downloads directory.

Fix that by shell-extracting the backend name from the part left of "|". I'm not
certain that all URLs will have a "|", so this code only looks for a "|" to the
left of the "+".

Signed-off-by: Hollis Blanchard <hollis_blanchard@mentor.com>
Reviewed-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Tested-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-08-21 00:24:29 +02:00
Yann E. MORIN
4dc6c43703 support/download: remove help from wrapper
The download wrapper is a purely internal helper, and is not supposed to
be called manually, so there is no need to offer any help.

Besides, the help text was badly out of date.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-08-09 23:47:54 +02:00
Robert Beckett
a5ba72946e dl-wrapper: Fix support for URIs containing '+'
'+' is a valid character in a URL. The current dl-wrapper gets the
URI scheme by dropping everything after the last '+' character, with
the intention of extracting 'git' from e.g. 'git+https://uri'.

If a URI has a '+' anywhere else in it, the wrapper ends up using too much
of the string as the scheme, and fails to match the handler properly.

One example of such a URI is a GitLab deploy token. It uses a form like
https://<username>:<password>@gitlab.com/<group>/<repo.git>, where the
deploy-token username is of the form 'gitlab+deploy-token-<number>'.

Use the %% operator, which removes the longest matching suffix, so that
everything from the first '+' onward is dropped: whatever precedes the
first '+' in the string is the scheme.
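
A minimal shell sketch of the difference between the two expansions (URI value
illustrative):

    uri='git+https://gitlab+deploy-token-123:secret@gitlab.com/group/repo.git'
    printf '%s\n' "${uri%+*}"    # 'git+https://gitlab' -- cut at the LAST '+': too much kept
    printf '%s\n' "${uri%%+*}"   # 'git'                -- cut at the FIRST '+': just the scheme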

Signed-off-by: Robert Beckett <bbeckett@netvu.org.uk>
Acked-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-06-04 21:59:55 +02:00
Arnout Vandecappelle (Essensium/Mind)
66f40a7242 download/cvs: add a 10 minute timeout
Apparently, CVS servers can be deadlocked and in that case clients will
retry connecting to them indefinitely. Cfr.
http://autobuild.buildroot.net/results/23d/23d1034b33d0354de15de2ec4a8ccd0603e8db78/build-end.log
Apparently, the sf.net CVS server got in such a deadlock on 2018-05-18,
and almost 2 weeks later it is still not fixed.

Instead of just hanging, we should fall back on BR2_SECONDARY_SITE. To
achieve this, it's sufficient to add a timeout to the CVS command.

The timeout value is of course arbitrary. However, we can assume that
nobody will be putting large projects under CVS any more. So if the
download takes more than 5 minutes, it's probably broken. Let's put the
timeout at 10 minutes then.
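
A hedged sketch of the idea, wrapping an illustrative cvs invocation (not the
exact backend command) in coreutils timeout:

    # 600 seconds: a deadlocked server makes the command fail instead of hanging,
    # so the download wrapper can move on to the secondary site.
    timeout 600 cvs -z3 -d":pserver:anonymous@${server}:/cvsroot/${project}" \
        export -d "${basename}" -r "${rev}" "${module}"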

Fixes:
http://autobuild.buildroot.net/results/db3/db33d4fa507fb3b4132423cd0a7e25a1fe6e4105
http://autobuild.buildroot.net/results/b6d/b6d927dcc73ac8d754422577dacefff4ff918a5c
http://autobuild.buildroot.net/results/23d/23d1034b33d0354de15de2ec4a8ccd0603e8db78
http://autobuild.buildroot.net/results/127/1272a3aa3077e434c9805ec3034f35e6fcc330d4

Signed-off-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-05-31 19:32:45 +02:00
Angelo Compagnucci
43ca1987e6 support/download/file: remove set -x
Signed-off-by: Angelo Compagnucci <angelo@amarulasolutions.com>
Acked-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-05-13 22:26:43 +02:00
Angelo Compagnucci
a9bb361435 support/download/file: fix file:// protocol handling
Since the rework of the download infrastructure, the "file" download
helper gets passed a URL that starts with file://, but forgets to
strip it before passing it to "cp", causing a failure as the "cp"
program isn't prepared for file paths starting with file://. This is
fixed by stripping the file:// at the beginning of the URL.

In addition, the path passed to cp lacked a slash between the
directory path and the filename part of the URL. This is fixed by
adding a slash at the appropriate places.
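
A hedged sketch of the resulting copy (variable names illustrative):

    # Strip the 'file://' scheme and join the directory and the filename with
    # an explicit '/' before handing the path to cp.
    cp -f "${uri#file://}/${filename}" "${output}"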

Fixes the following build failure when the "file" download method is
used:

cp: cannot stat 'file:///home/angelo/DEV/TOOLCHAINSarmv7-eabihf--glibc--bleeding-edge-2017.11-1.tar.bz2': No such file or directory

Signed-off-by: Angelo Compagnucci <angelo@amarulasolutions.com>
Reviewed-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-05-13 22:23:13 +02:00
Yann E. MORIN
4f54c959dc download/git: always do full-clone
We currently do a shallow clone, in an attempt to save bandwidth and
download time.

However, now that we keep the git tree as a cache, it may happen that we
need to checkout an earlier commit, and that would not be present with a
shallow clone.

Furthermore, the shallow fetch is already really broken, and just
happens to work by chance. Consider the following actions, which are
basically what happens today:

    mkdir git
    git init git
    cd git
    git remote add origin https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git
    git fetch origin --depth 1 v4.17-rc1
    if ! git fetch origin v4.17-rc1:v4.17-rc1 ; then
        echo "warning"
    fi
    git checkout v4.17-rc1

The checkout succeeds just because of the git-fetch in the if-condition,
which is initially there to fetch the special refs from github PRs, or
gerrit reviews. That fails, but we just print a warning. If we were to
ever remove support for special refs, then the checkout would fail.

The whole purpose of the git cache is to actually save bandwidth and
download time, but in the long run. For one-offs, people would
preferably use a wget download (e.g. with the github macro) instead of
a git clone.

We switch to always doing a full clone. It is more correct, and pays off
in the long run...

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Cc: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-05-01 21:22:28 +02:00
Yann E. MORIN
80d8bc6e46 download/git: ensure we can checkout repos with submodule conversions
When a git tree has had sub-dir <-> sub-module conversions, or has had
submodules added or removed over the course of time, checking out a
changeset across those conversions/additions/removals may leave
untracked files, or may fail because of a conflict of type.

So, before we checkout the new changeset, we forcibly remove the
submodules. The new set of submodules, if any, will be restored later.

Ideally, we would use a native git command: git submodule deinit --all.
However, that was only introduced in git 1.8.3 which, while not being
recent by modern standards, is still too recent for some enterprise-grade
distributions (RHEL6 only has git-1.7.1).

So, instead, we just use git submodule foreach to rm -rf each submodule's
directory.

Again, we would ideally use 'cd $toplevel && rm -rf $path', but
$toplevel was only introduced in git 1.7.2. $path has always been there.

So, instead, we just cd back one level, and remove the basename of the
directory.
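
A hedged sketch of that removal (git itself sets ${path} inside 'submodule foreach'):

    # For each registered submodule, go up one level and remove its directory;
    # ${path##*/} is the basename of the submodule path, and ${toplevel} is
    # avoided because it needs git >= 1.7.2.
    git submodule --quiet foreach 'cd .. && rm -rf "${path##*/}"'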

Eventually, we need to get rid of the now-empty and untracked directories
that were parents of a removed submodule. For example, if ./foo/bar/ was a
submodule, then ./foo/bar/ was removed, which left ./foo/ around.

Yet again, recent-ish git versions would have removed it during the
forced checkout, but old-ish versions (e.g. 1.7.1) do not remove it with
the forced checkout.

Instead, we rely on the forced-forced clean of directories, untracked
files, and ignored content, which is already in use, to really get rid of
the extra stuff we are not interested in.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Cc: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Reviewed-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Tested-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-05-01 21:22:28 +02:00
Yann E. MORIN
3d2e018889 download/git: ensure we checkout to a clean state
Force the checkout to ignore and throw away any local changes. This
allows recovering from a previous partial checkout (e.g. killed by
the user, or by a CI job...)

git checkout -f has been supported since the inception of git, so we
can use it without any second thought.

Also do a forced-forced clean, to really get rid of all untracked stuff.
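
A hedged sketch of the two commands (the cset variable is illustrative):

    git checkout -f -q "${cset}"   # throw away any local modifications
    git clean -ffdx                # remove untracked and ignored files and directories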

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Reviewed-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Tested-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-05-01 21:22:28 +02:00
Yann E. MORIN
b7efb43e86 download/git: try to recover from utterly-broken repositories
In some cases, the repository may be in a state we can't automatically
recover from, especially since we must still support oldish git versions
that do not provide the necessary commands or options thereof.

As a last-ditch recovery, delete the repository and recreate the cache
from scratch.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Cc: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-05-01 21:22:28 +02:00
Yann E. MORIN
428a0649ff download/git: quickly exit when the cset does not exist
Check that the given cset is indeed something we can checkout. If not,
then exit early.

This will be useful when a later commit traps any failing git command to
try to recover the repository by doing a clone from scratch: when the cset
is not a commit, it does not mean the repository is broken, and re-cloning
from scratch would not help, so there is no need to trash a good cache.

Reported-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Reviewed-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Tested-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-05-01 21:22:28 +02:00
Yann E. MORIN
577315687f download/git: run all git commands in the current directory
That way, we can pushd earlier, which will help with last-ditch recovery
in a followup commit.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Cc: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Reviewed-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Tested-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-05-01 21:22:28 +02:00
Yann E. MORIN
f1eb192e26 download/git: add warning not to use our git cache
We really want the user not to use our git cache manually, or their
changes (committed or not) may eventually get lost.

So, add a warning file, not unlike the one we put in the target/
directory, to warn the user not to use the git tree.

Ideally, we would have carried this file in support/misc/, but the git
backend does not have access to it: the working directory is somewhere
unknown, and TOPDIR is not exported in the environment.

So, we have to carry it in-line in the backend instead.
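
A hedged sketch of what carrying it in-line can look like (file name and wording
illustrative):

    printf '%s\n%s\n' \
        'This git tree is for internal Buildroot use only.' \
        'Do not work in it: your changes, committed or not, may be lost.' \
        > "${git_cache}/DO_NOT_USE_THIS_REPOSITORY.txt"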

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Cc: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Reviewed-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Tested-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-05-01 21:22:28 +02:00
Yann E. MORIN
6e2f5d0254 download/git: fix transform regexp for older tar versions
Older versions of tar (e.g. 1.27.1) incorrectly interpret the escaping
of the regexp separator, and generate broken tarballs.

For example, given the following transform expression:
    --transform="s/^\.\//squashfs-e38956b92f738518c29734399629e7cdb33072d3\//"

the resulting paths in the generated tarball would be:
    squashfs-e38956b92f738518c29734399629e7cdb33072d3\/

i.e. a directory whose last character is indeed a '\'.

We fix that by using a separator which is very unlikely to occur in a
filename.

Fixes:
    http://autobuild.buildroot.org/results/742/7427f34e5c9f6d043b0fe6ad2c66cc0f31d2b24f/

and probably a slew of others as well...

Take this opportunity to fix indentation on the following line
(leading spaces, not TABs).

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-04-20 10:48:27 +02:00
Yann E. MORIN
e2b71e852d download/git: be sure we do fetch something
The different versions of git will behave in different ways when
fetching remote references, as summarised by the table below:

              | ancient git               | new git
--------------------------------------------------------------------
git fetch     | fetches all refs but tags | fetches all refs but tags
git fetch -t  | fetches only tags         | fetches all refs and tags

(git-fetch may still fetch tags, but only if reachable from a branch)

So, to cover all the bases, we do a simple fetch, to be sure we have
branches, followed by the existing fetch -t, to get extra tags.
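
A hedged sketch of the two fetches (URI handling simplified):

    git fetch "${uri}"      # branches (and, with new git, the tags reachable from them)
    git fetch "${uri}" -t   # tags (old git: only tags; new git: all refs and tags)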

Fixes:
    http://autobuild.buildroot.net/results/0a2/0a238a7f55ea56c33b639ad03ed5796143426889/build-end.log

Reported-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-04-19 23:10:24 +02:00
Yann E. MORIN
d48a1b0d5e download/git: ensure we have a sane repository
There are cases where a repository might be broken, e.g. when a previous
operation was killed or otherwise failed unexpectedly.

We fix that by always initialising the repository, as suggested by
Ricardo. git-init is safe on an otherwise-healthy repository:

    Running git init in an existing repository is safe. It will not
    overwrite things that are already there. [...]

Using git-init will just ensure that we have the strictly required files
to form a sane tree. Any blob that is still missing would get fetched
later on.

Reported-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Reported-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Acked-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Tested-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-04-19 22:32:28 +02:00
Yann E. MORIN
a07da16e78 download/git: ensure we always work in the expected repository
git always looks up the directory hierarchy until it finds a repository.
In case the git cache is broken, it may no longer be identified as a
repository, and git will look higher up in the directories until it finds one.

In the default conditions, this would be Buildroot's own git tree
(because DL_DIR is a subdir of Buildroot), but in some situations it may
very well be any repository the user has Buildroot in, like a
br2-external tree...

So, we force git to use our git cache and never look elsewhere, as
suggested by Ricardo.

Use GIT_DIR, as it has been there for ages now, while --git-dir was
only introduced later (even if most distros ship a later version),
as suggested by Arnout.

Also fix the one call to git that was not using the wrapper.
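
A hedged sketch of one shape such a wrapper can take (not the exact backend code):

    # Every git invocation goes through _git, so GIT_DIR always points at our
    # cache and git never walks up into Buildroot's own tree or a br2-external.
    _git() {
        GIT_DIR="${git_cache}/.git" git "${@}"
    }
    _git remote set-url origin "${uri}"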

Reported-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Acked-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Tested-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-04-19 22:30:50 +02:00
Thomas Petazzoni
2679444980 support/download/dl-wrapper: pass the correct -N option
${raw_name} is never defined in dl-wrapper, and therefore the value
passed to the -N option is always empty. This causes a problem for the
'cvs' backend, which uses the value of this option as the CVS module
to be downloaded.

If the name of the CVS module is omitted, all the CVS modules from
that CVS repository are downloaded, which creates a tarball with a lot
more contents, and the actual useful contents in a sub-directory,
obviously breaking patches that should be applied, and the entire
build process that follows.

Fixes:

  http://autobuild.buildroot.net/results/fcee0e3d7eeeb373313b1794092c729b1b052348/

Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Tested-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Acked-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-04-12 22:54:00 +02:00
Thomas Petazzoni
0cf077a209 support/download/dl-wrapper: fix passing remaining options to helper scripts
When calling the backend-specific helper scripts, the remaining
options are in ${@}. However, in order to let the helper script know
that those remaining options should not be parsed, but instead passed
as-is to the download tool, they must be separated from the main
options by "--".

Without this, packages that use <pkg>_DL_OPTS, such as the
amd-catalyst package, cannot download their tarball anymore.
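
A hedged sketch of the call shape (helper path and options illustrative):

    # Everything after '--' reaches the helper as-is, so <pkg>_DL_OPTS is handed
    # to the real download tool instead of being parsed as wrapper options.
    "${helper}" ${quiet} -o "${output}" -u "${uri}" -- "${@}"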

Fixes:

  http://autobuild.buildroot.net/results/de818f6e4c8e63d5e8a49c445d10c34eccc40410/

Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Tested-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Acked-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-04-12 22:53:55 +02:00
Yann E. MORIN
6dfaa33d13 download/git: more resilient in case of kill
In case the git backend gets killed right after it has finished
initialising the repository, but before it could add the remote, we'd
end up with a repository without the 'origin' remote, so we would not
be able to change its URL.

Another case that may happen (like in the build failure, below),
is that the repository was initialised with a previous version
of Buildroot, before the commit e17719264b (download/git: don't
require too-recent git) was applied, and that repository was
still lying around...

Fixes:
    http://autobuild.buildroot.org/results/25a/25aae054634368fadb265b97ebe4dda809deff6f/

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Cc: Peter Korsgaard <peter@korsgaard.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
2018-04-10 09:28:43 +02:00
Yann E. MORIN
e17719264b download/git: don't require too-recent git
git has supported -C only since 1.8.5, and some distros have not yet
caught up after more than 4 years...

Fall back to entering the directory.
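
A hedged sketch of the fallback (directory variable illustrative):

    # 'git -C "${dir}" ...' needs git >= 1.8.5; entering the directory works everywhere.
    pushd "${git_cache}" >/dev/null
    git fetch origin
    popd >/dev/null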

Fixes:
    http://autobuild.buildroot.net/results/35f9f7a4adc6c2cad741079e4afdf1408c94703b

Reported-by: André Hentschel <nerv@dawncrow.de>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: André Hentschel <nerv@dawncrow.de>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-04-08 21:46:36 +02:00
Yann E. MORIN
54785851ee download/git: fix transform-name
When a package contains a relative symlink whose first component is '..'
(thus pointing one directory higher), for example package 'meh' contains
this symlink:

    foo/bar -> ../buz

then it would be stored as 'meh-version./buz' because of the
transform-name pattern replacement.

Fix it to only match the leading './'.

Reported-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Cc: Peter Korsgaard <peter@korsgaard.com>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Reviewed-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Tested-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-04-08 17:40:16 +02:00
Yann E. MORIN
585c5dc35a core/download: fix local backend
Since c8ef0c03b0 (download: put most of the infra in dl-wrapper), the
backend for local files is now named after the scheme, which is 'file'
for a local file.

From the same commit on, the directory part and the basename are now
passed separately, to let the backend reconstruct the full path when it
needs to do so, which is the case for the 'file' backend too.

Finally, ff559846fd (support/download: Add support to pass options
directly to downloaders) introduced a nasty error, as it made use of
"${@}" when calling its internal function. Revert that mess now...

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Peter Korsgaard <peter@korsgaard.com>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-04-06 12:43:47 +02:00
Ricardo Martincoski
573ea2c7d4 download/git: fix basename for files inside tarballs
Commit "6d938bcb52 download: git: introduce cache feature" introduced a
typo that makes the tarball to contain files without the package
basename:
$ tar -tvf good-a238b1dfcd825d47d834af3c5223417c8411d90d.tar.gz
-rw-r--r-- 0/0               8 2017-10-14 02:10 ./file

Historically, all tarballs are generated with the basename:
$ tar -tvf good-a238b1dfcd825d47d834af3c5223417c8411d90d.tar.gz
-rw-r--r-- 0/0               8 2017-10-14 02:10 good-a238b1dfcd825d47d834af3c5223417c8411d90d/file

The hashes in the tree were calculated with the basename.

In the most common scenario, after the download ends the tarball is
generated, the hash mismatches, and the download mechanism falls back to
using the tarball from http://sources.buildroot.net .

The problem can be reproduced by forcing the download of any git package
PKG that has a hash file to check against:
$ make defconfig
$ ./utils/config --set-str BR2_BACKUP_SITE ""
$ BR2_DL_DIR=$(mktemp -d) make PKG-dirclean PKG-source

Fix the typo so the basename is really added to the files, which was
clearly the intention of the code.

Signed-off-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Peter Korsgaard <peter@korsgaard.com>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Cc: Yann E. MORIN <yann.morin.1998@free.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-04-05 08:15:45 +02:00
Yann E. MORIN
0d19fcf6c2 core/download: fix when the BR2_DL_DIR does not accept hardlinks
When the BR2_DL_DIR is a mountpoint (presumably shared between various
machines, or mounted from the local host when running in a VM), it is
possible that it does not support hardlinks (e.g. samba, or the VMware
VMFS, etc...).

If the hardlink fails, fall back to copying the file. As a last resort,
if that also fails, eventually fall back to doing the download.
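
A hedged sketch of the fallback chain (directory variables illustrative):

    if [ -e "${old_dl_dir}/${filename}" ] && [ ! -e "${pkg_dl_dir}/${filename}" ]; then
        # Hardlink if the filesystem allows it, otherwise copy; if both fail,
        # the normal download path will fetch the file as usual.
        ln "${old_dl_dir}/${filename}" "${pkg_dl_dir}/${filename}" 2>/dev/null \
            || cp "${old_dl_dir}/${filename}" "${pkg_dl_dir}/${filename}" \
            || true
    fi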

Note: this means that the dl-wrapper is no longer atomic-safe: the code
suffers from a TOCTTOU condition: the file may be created in-between the
check and the moment we try to ln/cp it. Fortunately, the dl-wrapper is
now run under a flock, so we're still safe. If we eventually go for a
more fine-grained implementation, we'll have to be careful then.

Reported-by: Arnout Vandecappelle <arnout@mind.be>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Arnout Vandecappelle <arnout@mind.be>
Cc: Peter Korsgaard <peter@korsgaard.com>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-04-03 22:45:00 +02:00
Maxime Hadjinlian
6d938bcb52 download: git: introduce cache feature
Now we keep the git clone that we download and generate our tarball
from there.
The main goal here is that if you change the version of a package (say
Linux), instead of cloning all over again, you will simply 'git fetch'
the missing objects from the repo, then generate the tarball again.

This should speed up the 'source' part of the build significantly.

The drawback is that the DL_DIR will grow much larger; but time is more
important than disk space nowadays.

Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-04-02 17:52:53 +02:00
Yann E. MORIN
e80d1d0af4 core/download: look for archives in the global download dir first
For existing setups, the global download directory may have a lot of the
required archives, so look in there before attempting a download.

We simply hard-link them if found there and not in the new per-package
location. Then we resume the existing procedure (which means the new
hardlink will get removed if it happens not to match the hash).

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Peter Korsgaard <peter@korsgaard.com>
Cc: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-04-02 17:48:56 +02:00
Maxime Hadjinlian
5d2a018ddf download: add missing '-d' option
The infrastructure needs to give the 'dl_dir' to the dl-wrapper, which in
turn needs to give it to the helper. For now, it is only used by the 'git'
helper.

Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-04-02 17:48:56 +02:00
Maxime Hadjinlian
c8ef0c03b0 download: put most of the infra in dl-wrapper
The goal here is to simplify the infrastructure by putting most of the
code in the dl-wrapper as it is easier to implement and to read.

Most of the functions were already common; this patch finalizes that by
making pkg-download.mk pass all the parameters needed to the dl-wrapper,
which in turn passes everything to every backend.

The backend will then cherry-pick what it needs from these arguments
and act accordingly.

It eases the transition to the addition of a sub directory per package
in the DL_DIR, and later on, a git cache.

[Peter: drop ';' in BR_NO_CHECK_HASH_FOR in DOWNLOAD macro and swap cd/rm
	-rf as mentioned by Yann, fix typos]
Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Tested-by: Luca Ceresoli <luca@lucaceresoli.net>
Reviewed-by: Luca Ceresoli <luca@lucaceresoli.net>
Reviewed-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-04-02 14:19:51 +02:00
Yann E. MORIN
91e776b5af core/pkg-download: change all helpers to use common options
Currently all download helpers accept the local output file, the remote
locations, the changesets and so on... as positional arguments.

This was all well and good when that was all we needed.

But then we added an option to quiesce their verbosity, and that was
shoehorned with a trivial getopts, still keeping all the existing
positional arguments as... positional arguments.

Adding yet more options while keeping positional arguments will not be
very easy, even if we do not envision any new option in the foreseeable
future (but 640K ought to be enough for everyone, remember? ;-) ).

Change all helpers to accept a set of generic options (-q for quiet and
-o for the output file) as well as helper-specific options (like -r for
the repository, -c for a changeset...).
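
A hedged sketch of what such option handling looks like in a backend (option
letters partly illustrative):

    verbose=
    recurse=0
    while getopts "qo:u:c:r" OPT; do
        case "${OPT}" in
        q)  verbose=-q;;
        o)  output="${OPTARG}";;
        u)  uri="${OPTARG}";;
        c)  cset="${OPTARG}";;
        r)  recurse=1;;
        esac
    done
    shift $((OPTIND - 1))   # whatever remains is passed verbatim to the download tool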

Maxime:
Changed -R to -r for recurse (only for the git backend)
Changed -r to -u for URI (for all backends)
Changed -R to -c for cset (for the CVS and SVN backends)
Added the export of BR_BACKEND_DL_GETOPTS so all the backend wrappers
can use the same options easily
Now all the backends use the same common options.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Reviewed-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Reviewed-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2018-04-02 12:22:50 +02:00
Gaël PORTAY
cf9bf7d28b support/download: keep files downloaded without hash
In the situation where the hash is missing from the hash file, the
dl-wrapper downloads the file again and again until the developer
specifies the hash to complete the download step.

To avoid this situation, the freshly-downloaded file is no longer removed
after a successful download.

After this change, the behaviour is as follows:

- Hash file doesn't exist, or file is in BR_NO_CHECK_HASH_FOR
  => always succeeds.

- Hash file exists, but file is not present
  => file is NOT removed, build is terminated immediately (i.e.
     secondary site is not tried).

- Hash file exists, file is present, but hash mismatch
  => file is removed, secondary site is tried.
  => If all primary/secondary site downloads or hash checks fail, the
     build is terminated.

Signed-off-by: Gaël PORTAY <gael.portay@savoirfairelinux.com>
[Arnout: extend commit log]
Signed-off-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
2018-04-01 15:02:53 +02:00
Sam Voss
ad477b0859 support/download: svn non-interactive in BR2_SVN
Instead of overriding the _svn command and injecting --non-interactive,
change the default value of BR2_SVN to include this flag so the end user
can choose not to use the flag.

This change helps users behind corporate system rules which may not
allow them to locally cache credentials and require interactive mode.

Signed-off-by: Sam Voss <sam.voss@rockwellcollins.com>

[Originally implemented by]
CC: "Yann E. MORIN" <yann.morin.1998@free.fr>
Acked-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2017-11-26 14:18:57 +01:00
Yann E. MORIN
4013f11a5b support/download: force svn to be non-interactive
Fixes:
    http://autobuild.buildroot.org/results/2af/2af7412846c576089f8596857ab8c81ac31c1bed/

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: André Hentschel <nerv@dawncrow.de>
Reviewed-by: André Hentschel <nerv@dawncrow.de>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2017-11-05 21:16:51 +01:00
Gaël PORTAY
f9a52fad7b support/download: fix check_one_hash description
Function check_one_hash takes three arguments:
 - algo hash
 - known hash
 - file to hash

Signed-off-by: Gaël PORTAY <gael.portay@savoirfairelinux.com>
Acked-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
2017-09-19 22:11:30 +02:00
Petr Kulhavy
04a22cf1b5 download/git: force gzip compression level 6
Force gzip compression level 6 when calculating the hash of a downloaded git
repo, to make sure the tar->gzip->checksum chain always provides a consistent
result.

The script was relying on the default compression level, which is not
necessarily consistent across different gzip versions. Level 6 is gzip's
current default compression level.
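
A hedged sketch of pinning the level (pipeline simplified):

    # An explicit '-6' keeps the compressed output, and therefore its hash,
    # stable even if a future gzip changes its default level.
    tar cf - "${basename}" | gzip -6 -n > "${output}"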

Signed-off-by: Petr Kulhavy <brain@jikos.cz>
Acked-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2017-09-12 23:36:13 +02:00
Ricardo Martincoski
5d6ec40b75 download/git: clarify why .git is removed
The removal of the .git dir before creating the tarball is no longer
just an optimization. It is necessary to make the tarball reproducible.
Also, without the removal, large tarballs (gigabytes) would be created
for some linux trees.

Update the comment accordingly.

Reported-by: Baruch Siach <baruch@tkos.co.il>
Signed-off-by: Ricardo Martincoski <ricardo.martincoski@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2017-04-20 22:27:23 +02:00
Arnout Vandecappelle
0f369a9231 download/git: create GNU format tar files
On most distros, the tar format defaults to GNU. However, at build time
the default format may be changed to posix. Also, future versions of
tar will default to posix.

Since we want the tarballs created by the git download method to be
reproducible (so their hash can be checked), we should explicitly
specify the format. Since existing tarballs on sources.buildroot.org
use the GNU format, and also the existing hashes in the *.hash files
are based on GNU format tarballs, we use the GNU format.

In addition, the Posix format encodes atime and ctime as well as mtime,
but tar offers no option like --mtime to override them. In the GNU
format, atime and ctime are only encoded if the --incremental option is
given.
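
A hedged sketch of forcing the format (the other options are illustrative):

    tar cf - --format=gnu --numeric-owner --owner=0 --group=0 \
        --mtime="${commit_date}" "${basename}" | gzip -n > "${output}"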

Signed-off-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Cc: Peter Seiderer <ps.report@gmx.net>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2017-03-21 23:15:46 +01:00
Yann E. MORIN
4f87aa7db5 support/download: make the git wrapper more robust
Currently, there are two failure paths in the wrapper:

  - if the tar fails, then the error is ignored because it is on the
    left-hand-side of a pipe;

  - if the find fails, then the error is ignored because it is a
    process substitution (and there is a pipe, too).

While the former could be fixed with "set -o pipefail", the latter
cannot be fixed that way, so we must use an intermediate file for it.

So, fix both issues by using intermediate files, both to generate the
list of files to include in the archive, and generate the archive in a
temporary tarball.
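
A hedged sketch of the restructured steps (file names illustrative):

    # Each step writes to a real file, so a failure in find or tar is not hidden
    # behind a pipe or a process substitution.
    find . -not -type d -and -not -path './.git/*' > "${tmp_list}"
    tar cf "${tmp_tar}" --transform="s/^\.\//${basename}\//" -T "${tmp_list}"
    gzip -n < "${tmp_tar}" > "${output}"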

Fixes the following build issue, where the find is failing for whatever
unknown reason:
    http://autobuild.buildroot.net/results/20f/20fd76d2256eee81837f7e9bbaefbe79d7645ae9/

And this one, where the process substitution failed, also for an unknown
reason:
    http://autobuild.buildroot.org/results/018/018971ea9227b386fe25d3c264c7e80b843a9f68/

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2016-10-25 23:49:35 +02:00
Romain Perier
ff559846fd support/download: Add support to pass options directly to downloaders
This adds support for passing options to the underlying command used by
the downloader. This is useful for retrieving data from servers that check
user logins or passwords, for going through a proxy, or for passing specific
options when cloning a repository via git or hg.

Signed-off-by: Romain Perier <romain.perier@free-electrons.com>
Acked-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2016-08-23 14:59:01 +02:00
Enrique Ocaña González
1e39f08733 support/download/git: Fix compatibility issue with git older than 1.8.4
The "--no-patch" option used by the git downloader appeared on git
1.8.4. Systems with older git versions show an error and fall back to
the wget downloader, which isn't suitable for all the cases.

Signed-off-by: Enrique Ocaña González <eocanha@igalia.com>
Tested-by: Matthew Weber <matthew.weber@rockwellcollins.com>
Tested-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Acked-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2016-07-28 22:02:13 +02:00
Yann E. MORIN
d1830ca877 support/download: don't over-remove files from git archives
Now that we manually create git archives, we remove all .git-related
files. However, we also exclude empty directories.

This means that a directory which only had a .gitignore file is excluded
from the archive.

Fixes:
    http://autobuild.buildroot.org/results/2aa/2aa8954311f009988880d27b6e48af91bc74c346/
    http://autobuild.buildroot.org/results/b45/b45cceea99b9860ccf1c925eeda498a823b30903/
    http://autobuild.buildroot.org/results/5ae/5ae336052fd32057d9631649279e142a81f5651f/
    http://autobuild.buildroot.org/results/5fc/5fc3abf4a1aea677f576e16c49253d00720a8bef/

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2016-07-04 10:04:22 +02:00
Yann E. MORIN
ec3e057f79 core/pkg-infra: download git submodules if the package wants them
Add a new package variable that packages can set to specify that they
need git submodules.

Only accept this option if the download method is git, as we can not get
submodules via an http download (via wget).

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Aleksandar Simeonov <aleksandar@barix.com>
Tested-by: Matt Weber <matt@thewebers.ws>
Reviewed-by: Matt Weber <matt@thewebers.ws>
Tested-By: Nicolas Cavallari <nicolas.cavallari@green-communications.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2016-07-02 19:11:14 +02:00
Yann E. MORIN
f109e7eeb5 support/download/git: add support for submodules
Some git repositories may be split into a master repository and
submodules. Up until now, we did not have support for submodules,
because we were using bare clones, in which it is not possible to
update the list of submodules.

Now that we are using plain clones with a working copy, we can retrieve
the submodules.

Add an option to the git download helper to kick off the update of
submodules, so that they are only fetched for those packages that
require them. Also document the existing -q option at the same time.
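
A hedged sketch of the conditional update (the flag variable is illustrative):

    # Only packages that declared they need submodules pay the extra fetch cost.
    if [ ${recurse} -eq 1 ]; then
        git submodule update --init --recursive
    fi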

Submodules have a .git file at their root, which contains the path to
the real .git directory of the master repository. Since we remove it,
there is no point in keeping those .git files either.

Note: this is currently unused, but will be enabled with the follow-up
patch that adds the necessary parts in the pkg-generic and pkg-download
infrastructures.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Tested-by: Matt Weber <matt@thewebers.ws>
Reviewed-by: Matt Weber <matt@thewebers.ws>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2016-07-02 19:11:14 +02:00
Yann E. MORIN
3abd5ba424 support/download/git: do not use git archive, handle it manually
We currently use git-archive to generate the tarball. This is all handy
and dandy, but git-archive does not support submodules. In the follow-up
patch, we're going to handle submodules, so we would not be able to use
git-archive.

Instead, we manually generate the archive:
  - extract the tree to the requested cset,
  - get the date of the commit to store in the archive,
  - store only numeric owners,
  - store owner and group as 0 (zero, although any arbitrary value would
    have been fine, as long as it's a constant),
  - sort the files to store in the archive.

We also get rid of the .git directory, because there is no reason to
keep it in the context of Buildroot. Some people would love to keep it
so as to speed up later downloads when updating a package, but that is
not really doable. For example:
  - use current Buildroot
  - it would need foo-12345, so do a clone and keep the .git in the
    generated tarball
  - update Buildroot
  - it would need foo-98765
For that second clone, how could we know we would have to first extract
foo-12345? So, the .git in the archive is pretty much useless for
Buildroot.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Tested-by: Matt Weber <matt@thewebers.ws>
Reviewed-by: Matt Weber <matt@thewebers.ws>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2016-07-02 19:11:14 +02:00
Yann E. MORIN
13c89c2f89 support/download/git: do not use bare clones
Currently, we are using bare clones, so as to minimise the disk usage,
most notably for largeish repositories such as the one for the Linux
kernel, which can go beyond the 1GiB barrier.

However, this precludes updating (and thus using) the submodules, if
any, of the repositories, as a working copy is required to use
submodules (because we need to know the list of submodules, where to find
them, where to clone them, what cset to checkout, and all of those depend
on the checked-out cset of the parent repository).

Switch to using /plain/ clones with a working copy.

This means that the extra refs used by some forges (like pull-requests
for Github, or changes for gerrit...) are no longer fetched as part of
the clone, because git does not offer to do a mirror clone when there is
a working copy.

Instead, we have to fetch those special refs by hand. Since there is no
easy way to know whether the cset the user asked for is such a
special ref or not, we just try to always fetch the cset requested by
the user; if this fails, we assume that this is not a special ref (most
probably, it is a sha1) and we defer the check to the archive creation,
which would fail if the requested cset is missing anyway.
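
A hedged sketch of that "try, then assume it is a sha1" logic (variables illustrative):

    # A special ref (github PR, gerrit change...) must be fetched explicitly;
    # if the fetch fails, assume ${cset} is a plain sha1 already in the clone.
    if ! git fetch origin "${cset}:${cset}"; then
        printf 'Could not fetch %s as a ref; assuming it is a sha1\n' "${cset}" >&2
    fi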

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Tested-by: Matt Weber <matt@thewebers.ws>
Reviewed-by: Matt Weber <matt@thewebers.ws>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2016-07-02 19:11:14 +02:00
Yann E. MORIN
8773182092 support/download: really, really make git archives reproducible
The way we use it, gzip will store the current time in the header, which
leads to unreproducible archives.

Fix that by telling gzip to not store the name and date of the file it
compresses, with the -n option. Since it compresses its stdin, there was
already no filename stored; now there's even no date stored.

Note: gzip has had -n since at least 1.2.4, released in 1993, so
virtually every gzip out there nowadays has it.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2016-02-27 23:13:55 +01:00
Joao Mano
4fad12598d support/download: alternative access methods to CVS
Allows the user to specify access methods other than :pserver:anonymous@
for CVS repositories. This shall be defined in the <pkg>_SITE variable.

[Thomas:
 - as suggested by Yann, quote the variable expansion
 - as suggested by Yann, use a regexp match
 - tweak commit log]

Signed-off-by: Joao Mano <joao@datacom.ind.br>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2016-01-20 23:26:03 +01:00
Yann E. MORIN
b9be0f65ac support/download: support older bazaar versions
In efe7f68 (support/download: generate reproducible Bazaar archives),
bzr was instructed to store files with the timestamp set to the date
they were last modified in the repository, instead of the current date,
using the --per-file-timestamp option.

However, this option has been added only in bzr-2.2 (August 2010) which
is not available on older distros.

We fix that by not using --per-file-timestamp when the bzr version is
older than 2.2, in which case we just generate the archive with the
current date set on files.

This means the archive is non-reproducible, and if a hash is
available for that archive, the hash will not match, and Buildroot will
try to download from the mirror (if any) or fail (if there is no mirror).

Fixes:
    http://autobuild.buildroot.org/results/51f/51f4ff5462c15a85937d411f457096224d00fdcd
    http://autobuild.buildroot.org/results/b88/b8828b5fbc16128408c2f44169ac23de7e34d770
    http://autobuild.buildroot.org/results/fb4/fb4b0fb2131b40c18273dbe5e51b393cb6df18ec
    ...

[Peter: simplify sed invocation]
Reported-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2016-01-18 20:46:04 +01:00
Yann E. MORIN
efe7f684a6 support/download: generate reproducible Bazaar archives
Similarly to what has previously been done for the Hg download backend,
instruct bzr to generate the archive on stdout, so that we can generate
reproducible archives.

When instructing bzr to generate the output file by itself, it uses a
temporary file that is then fed to gzip, which in turn stores the
timestamp of that file in the generated archive, whereas when the output
is generated on stdout, there is no timestamp, so the archive is then
reproducible.

Bizarrely enough, we can tell 'bazaar' not to generate a bazaar in the
archive. Cool, huh? ;-]

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2016-01-03 21:51:50 +01:00
Yann E. MORIN
76b51f90c0 support/download: generate reproducible Hg archives
When hg directly creates the output file, the hash for that file changes
every time.

However, if we just tell hg to output the archive on stdout and we do
the redirect to the file, then the archive is reproducible.

(The reason is that in the first case, a temporary file is created and
then compressed, and gzip is adding the filename and its timestamp in
the gzip header, while in the second case, there is no temporary file,
and thus no timestamp and thus it is reproducible.)
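
A hedged sketch of the stdout form (options illustrative; this assumes hg accepts
'-' as the destination meaning stdout):

    # hg writes the tgz to stdout, so gzip never sees a named temporary file
    # and stores no filename or timestamp in the header.
    hg archive --type tgz --prefix "${basename}" --rev "${cset}" - > "${output}"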

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Yegor Yefremov <yegorslists@googlemail.com>
Tested-by: Yegor Yefremov <yegorslists@googlemail.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2016-01-03 21:50:38 +01:00
Yann E. MORIN
3f2bdd0701 support/download: protect from custom commands with spaces in args
Some users may provide custom download commands with spaces in their
arguments, like so:
    BR2_HG="hg --config foo.bar='some space-separated value'"

However, the way we currently call those commands does not account
for the extra quotes, and each space-separated part of the command is
interpreted as a separate argument.

Fix that by calling 'eval' on the commands.

Because of the eval, we must further quote our own arguments, to avoid
the eval further splitting them in case there are spaces (even though
we do not support paths with spaces, better be clean from the onset to
avoid breakage in the future).

We change all the wrappers to use a wrapper-function, even those with
a single call, so they all look alike.

Note that we do not single-quote some of the variables, like ${verbose}
because it can be empty and we really do not want to generate an
empty-string argument. That's not a problem, as ${verbose} would not
normally contain space-separated values (it could get set to something
like '-q -v' but in that case we'd still want two arguments, so that's
fine).
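
A hedged sketch of the wrapper-function pattern (shown for hg; the other backends
follow the same shape):

    _hg() {
        eval ${HG} "${@}"
    }
    # Because of the eval, our own arguments get an extra layer of single quotes
    # so they survive the second round of word splitting.
    _hg clone ${verbose} "'${uri}'" "'${basename}'"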

Reported-by: Thomas De Schampheleire <patrickdepinguin@gmail.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas De Schampheleire <patrickdepinguin@gmail.com>
Reviewed-by: Thomas De Schampheleire <thomas.de.schampheleire@gmail.com>
Tested-by: Thomas De Schampheleire <thomas.de.schampheleire@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-12-12 17:00:46 +01:00
Vivien Didelot
f8b8251a92 support/download: fetch all refs on full git clone
When specifying BR2_LINUX_KERNEL_CUSTOM_REPO_VERSION, a user may want to
specify the SHA of a reference other than a branch or tag.

For instance, Gerrit stores the patchsets under refs/changes/xx/xxx, and
Github stores the pull requests under refs/pull/xxx/head.

When cloning a repository with --bare, you don't fetch these references.
This patch uses --mirror for a full clone, in order to give the user
access to all references of the Git repository.
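
A hedged sketch of the difference (URL handling simplified):

    # --bare fetches only branches and tags; --mirror maps refs/* to refs/*,
    # so refs/changes/* (gerrit) and refs/pull/* (github) are fetched too.
    git clone --mirror "${uri}" "${basename}.git"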

Signed-off-by: Vivien Didelot <vivien.didelot@savoirfairelinux.com>
Reviewed-by: "Maxime Hadjinlian" <maxime.hadjinlian@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-11-29 18:52:44 +01:00
Yann E. MORIN
697e2b7d94 support/download: fix the Hg backend for tags
When the version of a package is a Mercurial tag, the download fails,
with:
    abort: unknown revision 'X.Y.Z'!

This is because, in Mercurial, tags are commits like the others, and
when we clone, we actively request a tag. But then, the server
"dereferences" that tag and sends us the revision pointed to by that
tag. Of course, since the tag itself is added by a commit that comes after
the revision we got, we do not have the revision that adds the tag.

So, we just have to download the full repository to be sure we have
the tags in our local clone.

Reported-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-08-29 11:38:00 +02:00
Yann E. MORIN
3dea23cf53 core/download: don't be needlessly verbose in backends
In 50c8b7e (support/download: support -q in all download backends), the
backend were made to respect the quietness of the main Makefile, when -s
is poassed on the 'make' command line. In doing so, they were all made
to be verbose by default.

However, the verbosity of some of the tools, like scp, is very high, and
is in fact intended for debug purposes.

Drop being verbose by default, just use whatever each tool deems normal
output. Only respect the quietness requested by the user.

Reported-by: Thomas De Schampheleire <patrickdepinguin@gmail.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Reviewed-by: Thomas De Schampheleire <thomas.de.schampheleire@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-07-26 15:18:39 +02:00
Fabio Porcedda
6c0d6592f4 support/download/cvs: add support to use a date instead of a tag
This is useful when a tag is not available.

Also fix support for Fedora where the command "cvs -r :<version>" doesn't work.

Signed-off-by: Fabio Porcedda <fabio.porcedda@gmail.com>
Acked-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-07-10 11:09:20 +02:00
Baruch Siach
ef92d32f19 support/download: relocate the git clone comment
Following commit 95a572282e (pkg-infra: move the git download helper to a
script, 2014-07-02), move the comment describing the shallow clone trickery as
well. Merge this comment with the existing helper comment that was added in
7e40a1103a (support/download: convert git to use the wrapper, 2014-08-03).

Rename $($(PKG)_DL_VERSION) to ${cset} to match the helper code context.

Cc: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Baruch Siach <baruch@tkos.co.il>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2015-05-31 11:56:56 +02:00
Yann E. MORIN
f75a9f07a6 support/download: restore mandatory check of hashes
Now that custom external toolchains that are to be downloaded properly
instruct the infrastructure not to fail on a missing hash, restore the
mandatory hash check for everything else.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
Reviewed-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Acked-by: Gustavo Zacarias <gustavo@zacarias.com.ar>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-04-25 11:51:44 +02:00
Yann E. MORIN
8d2f4e6246 support/download: add possibility to not fail on missing hash
In very constrained cases, it might be needed to not fail if a hash is
missing. This is notably the case for custom external toolchains to be
downloaded, because we do have a .hash file for external toolchains,
but we obviously can not have hashes for all existing custom toolchains
(he, "custom"!).

So, add a way to avoid failing in that case.

From the Makefile, we export the list of files for which not to check
the hash. Then, from the check-hash script, if no check was done, and
the file we were trying to match is in this exclusion list, we just exit
without error.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
Acked-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Tested-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>

changes v6 -> v7:
  - /beautify/ the pattern in the case clause

Changed v5 -> v6:  (Arnout)
  - fix the pattern in the case clause

Changes v4 -> v5:
  - micro-optimisation, use case-esac instead of a for-loop  (Arnout)
  - typoes  (Arnout)

Changes v3 -> v4:
  - drop the magic value, use a list of excluded files  (Arnout)

Changes v1 -> v2:
  - fix typoes in commit log

Signed-off-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Tested-by: Gustavo Zacarias <gustavo@zacarias.com.ar>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-04-25 11:51:08 +02:00
Masahiro Yamada
5b686a0674 Fix typos in comment blocks
[Thomas: fix issues noticed by Arnout:
  - Rewrap the linux/Config.in paragraph
  - Revert the "is a toolchain dependency" -> "has a toolchain
    dependency" change from pkg-generic.mk, as the original was
    correct.]

Signed-off-by: Masahiro Yamada <yamada.masahiro@socionext.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-04-08 22:44:43 +02:00
Yann E. MORIN
aa3a409b05 support/download: do not warn about missing hash file for repositories
When downloading from a repository, we explicitly pass no hash file,
because we can't check hashes in that case.

However, we're still printing a message that there is a missing hash
file.

Besides being a bit annoying (since we can't do anything about it), it
may also be wrong, especially for packages for which we support multiple
versions, with some being downloaded via a git clone and others as
tarballs.

Just print no warning when the path to the hash file is empty.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
Reviewed-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-04-06 23:29:02 +02:00
Yann E. MORIN
209b754996 support/download: quick fix to avoid breaking on custom toolchains
When the user selects a custom toolchain to be downloaded, there's no
hash for that toolchain, so the download fails, now that hashes are
mandatory.

Fix that by simply exiting as if there was no error, until we have a
better fix...

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-04-03 14:21:47 +02:00
Yann E. MORIN
b6b833e7bc support/download: warn when there's no .hash file
Instead of silently accepting a missing .hash file, print a warning.

This can be grepped from a build log, to find packages that still have
no hash, with the long-term goal of adding hashes for all packages.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Reviewed-by: Samuel Martin <s.martin49@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-04-03 14:08:54 +02:00
Yann E. MORIN
3275315432 support/download: always fail when there's no hash
At the time we introduced hashes, we did not want to be too harsh in the
beginning, and wanted to give people some time to adapt to and accept the
hashes. So we have so far only whined^Wwarned about a missing hash (when
the .hash file exists).

Some time has passed now, and people still forget to update the hashes
when bumping packages.

Let's make that warning a little bit more annoying...

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Reviewed-by: Samuel Martin <s.martin49@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-04-03 14:08:54 +02:00
Yann E. MORIN
3c19d511d7 support/download: properly catch missing hashes
When the hash check reports no hash for a file, and this is treated as
an error (now: because BR2_ENFORCE_CHECK_HASH is set; later: because
that will be the new and only behaviour), exit promptly with an error.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Reviewed-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Reviewed-by: Samuel Martin <s.martin49@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-04-03 14:08:54 +02:00
Yann E. MORIN
0012b36c14 support/download: return different exit codes for different failures
Return different exit codes depending on the error that occurred:

  0: no error (hash file missing, or all hashes match)
  1: unknown option
  2: hash file exists, but at least one hash in error
  3: hash file exists, but no hash for file to check
  4: hash file exists, but at least one hash type unknown

This will be used in a later patch to decide whether the downloaded file
should be kept or removed.
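
For illustration, the end of the check-hash script could map its bookkeeping
onto those codes roughly like this (the counter names are made up for the
sketch):

    # nb_checks: hashes verified; nb_errors: mismatches; nb_unknown: unknown types
    if [ ${nb_errors} -gt 0 ]; then
        exit 2          # at least one hash in error
    elif [ ${nb_unknown} -gt 0 ]; then
        exit 4          # at least one hash type unknown
    elif [ ${nb_checks} -eq 0 ]; then
        exit 3          # hash file exists, but no hash for the file to check
    fi
    exit 0              # everything matched (or no hash file at all)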

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Samuel Martin <s.martin49@gmail.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Reviewed-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-04-03 14:08:50 +02:00
Yann E. MORIN
1ba85b7f87 support/download: add explicit no-hash support
Add support to explicitly state that an archive has no hash.

This can be used for archives downloaded from a repository, like a
git-clone or a subversion checkout, or using the github helper.

This will come in handy when we eventually make hashes mandatory as
soon as a .hash file exists: for some packages, like gcc, some versions
are downloaded as archives from upstream, while other versions may come
from a GitHub repository (via the github helper).

In this case, a .hash file would exist that contains hashes for the
downloaded tarballs, but archives generated from the repository would
not have a hash (since it is currently not possible to make such
archives reproducible). So, we need a way to explicitly state that
there is, on purpose, no hash for those archives.

So, add 'none' as a new type of hash.
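
As a purely illustrative example, such a .hash file for a hypothetical 'foo'
package could then mix both kinds of entries (the hash value below is a
placeholder):

    # Hash for the upstream release tarball:
    sha256  0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef  foo-1.0.tar.gz
    # No hash, on purpose, for the archive generated from the repository:
    none    xxx  foo-1234567.tar.gz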

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Reviewed-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Reviewed-by: Samuel Martin <s.martin49@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-04-03 14:08:00 +02:00
Yann E. MORIN
c22fe04a78 support/download: make hash file optional
Currently, specifying a hash file for our download wrapper is mandatory.

However, when we download a git, svn, bzr, hg or cvs tree, there's by
design no hash to check the download against.

Since we're going to make hash checking mandatory whenever a hash file
exists, this would break those downloads from a repository.

So, make specifying a hash file optional when calling our download
wrapper and bail out early from the check-hash script if no hash file is
specified.
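
A minimal sketch of that early bail-out at the top of the check-hash script
(the positional arguments are illustrative):

    h_file="${1}"   # path to the .hash file; empty for git/svn/bzr/hg/cvs downloads
    file="${2}"     # file whose hashes should be checked

    # No hash file specified: nothing to check, and that is not an error.
    if [ -z "${h_file}" ]; then
        exit 0
    fi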

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Reviewed-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Reviewed-by: Samuel Martin <s.martin49@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-04-03 14:08:00 +02:00
Yann E. MORIN
4bf1174a83 support/download: further silence the git helper
We expressly call printf in the git helper; those calls were not
addressed in the previous silent-build patchset.

Just redirect stdout to oblivion when being silent.

Reported-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Acked-by: Fabio Porcedda <fabio.porcedda@gmail.com>
Tested-by: Fabio Porcedda <fabio.porcedda@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-01-05 15:15:09 +01:00
Fabio Porcedda
0b600227b0 pkg-download: silence downloads if make is silent
If doing a silent build (make -s -> QUIET=-q), silence all downloads,
by passing the -q flag downward to backends as well as to check-hash.

Change a printf to use the trace functions.

Signed-off-by: Fabio Porcedda <fabio.porcedda@gmail.com>
Signed-off-by: Yann E. MORIN <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-01-04 18:39:46 +01:00
Yann E. MORIN
50c8b7e947 support/download: support -q in all download backends
Add an option flag to all backends, as well as the check-hash script, so
as to silence download helpers when the user wants a silent build.

Additionally, make the default verbose.

Inspired by Fabio's patch on git/svn.

[Thomas: fix a typo "Environemnt" -> "Environment"]

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Fabio Porcedda <fabio.porcedda@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2015-01-04 18:38:36 +01:00
Yann E. MORIN
9bb7d10eea pkg-download: check hashes for locally cached files
In some cases, upstream just updates its releases in-place, without
renaming them. When such a package is updated in Buildroot, a new hash
matching the new upstream release is included in the corresponding .hash
file.

As a consequence, users who previously downloaded that package's tarball
with an older version of Buildroot will get stuck with an old archive
for that package, and after updating their Buildroot copy will be greeted
with a failed download, due to the local file not matching the new
hashes.

Also, an upstream will sometimes serve us HTML garbage instead of the
actual tarball we requested, as SourceForge does from time to time for
as-yet unknown reasons.

So, to avoid this situation, check the hashes prior to doing the
download. If the hashes match, consider the locally cached file genuine,
and do not download it. However, if the locally cached file does not
match the known hashes we have for it, it is promptly removed, and a
download is re-attempted.
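
In rough pseudo-shell, the new ordering in the download wrapper could look
like this (helper and variable names are illustrative):

    # A cached file is only trusted if its hashes match; otherwise it is
    # discarded and the normal download path is taken.
    if [ -e "${output}" ]; then
        if support/download/check-hash "${hash_file}" "${output}" "${base_name}"; then
            exit 0              # cached file is genuine: no download needed
        else
            rm -f "${output}"   # stale or corrupted: re-download it
        fi
    fi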

Note: this does not add any overhead compared to the previous situation,
because we were already checking hashes of locally cached files. It just
changes the order in which we do the checks. For the record, here is the
overhead of hashing a 231MiB file (qt-everywhere-opensource-src-4.8.6.tar.gz)
on a Core i5 @ 2.5GHz:

            cache-cold  cache-hot
    sha1      1.914s      0.762s
    sha256    2.109s      1.270s

But again, this overhead already existed before this patch.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Peter Korsgaard <jacmet@uclibc.org>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
Reviewed-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-12-11 23:59:41 +01:00
Yann E. MORIN
9b88c60484 pkg-download: verify the hashes from the download wrapper
Instead of repeating the check in our download rules, delegate the check
of the hashes to the download wrapper.

This needs three different changes:

  - add a new argument to the download wrapper, that is the full path to
    the hash file; if the hash file does not exist, that does not change
    the current behaviour, as the existence of the hash file is checked
    for in the check-hash script;

  - add a third argument to the check-hash script, to be the basename of
    the file to check; this is required because we no longer check the
    final file with the final filename, but an intermediate file with a
    temporary filename;

  - do the actual call to the check-hash script from within the download
    wrapper.

This further paves the way to doing pre-download checks of the hashes
for the locally cached files.
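
A hedged sketch of what the call from the wrapper could look like, with the
basename passed explicitly since the file being checked still has a temporary
name (all names here are illustrative):

    # ${tmp_dl} is the temporary file the backend just produced;
    # ${filename} is the final base name the hashes are recorded under.
    if [ -n "${hash_file}" ]; then
        support/download/check-hash "${hash_file}" "${tmp_dl}" "${filename}" || {
            rm -f "${tmp_dl}"
            exit 1
        }
    fi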

Note: this patch removes the check for hashes for already downloaded
files, since the wrapper script exits early. The behaviour of checking
locally cached files will be restored and enhanced in the following
patch.

[Thomas: fix minor typo in comment.]

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Peter Korsgaard <jacmet@uclibc.org>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-12-11 23:59:25 +01:00
Yann E. MORIN
2685937e06 pkg-download: check for already downloaded file in the download wrapper
Instead of repeating the same test again and again in all our download
rules, just delegate the check for an already downloaded file to the
download wrapper.

This clears up the path for doing the hash checks on a cached file
before the download.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Peter Korsgaard <jacmet@uclibc.org>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
Reviewed-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-12-11 23:58:44 +01:00
Yann E. MORIN
78b92e5055 support/download: add option parsing to the download wrapper
Instead of relying on argument ordering, use actual options in the
download wrapper.

Download backends (bzr, cp, hg...) are left as-is, because it does not
make sense to complicate them: they are mostly very trivial shell
scripts, and adding option parsing would be overkill.

This commit also renames the script to dl-wrapper so it looks better in
the traces, and it is not confused with another wrapper.
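
As an illustration, option parsing in such a wrapper is typically a small
getopts loop along these lines (the option letters and variable names are
made up for the sketch):

    while getopts ":H:o:q" OPT; do
        case "${OPT}" in
        H)  hash_file="${OPTARG}";;   # path to the .hash file
        o)  output="${OPTARG}";;      # final output file
        q)  quiet="-q";;              # propagate silence to the backend
        :)  printf "option '-%s' expects an argument\n" "${OPTARG}" >&2; exit 1;;
        \?) printf "unknown option '-%s'\n" "${OPTARG}" >&2; exit 1;;
        esac
    done
    shift $((OPTIND-1))               # whatever remains goes to the backend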

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-12-11 23:58:37 +01:00
Yann E. MORIN
dab47af5d9 pkg-download: fix arguments to hash checking script
The arguments are correctly used, but incorrectly documented.
Swap the comments to match the actual usage.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-12-07 22:08:03 +01:00
Bjørn Forsman
acc76a6ef9 Change /bin/bash shebangs into /usr/bin/env bash
Not all systems have /bin/bash (e.g. NixOS[1] doesn't). Buildroot
already uses /usr/bin/env shebangs for other interpreters (perl,
python), so why not bash?

This changes only the shebangs used by Buildroot itself; stuff installed
to the target system is left unchanged.

With this applied I can run Buildroot unmodified on NixOS.

[1]: http://nixos.org/

Signed-off-by: Bjørn Forsman <bjorn.forsman@gmail.com>
Acked-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2014-10-25 01:55:37 +02:00
Yann E. MORIN
8f76366c0f support/download: fix the git helper output file format
The git helper uses gzip to compress the intermediate tarball. But gzip
removes the source file, and creates a new file named by appending .gz to
the original file name.

Thus, we end up with output.gz, while the download wrapper expects just
output, and so believes the download failed.

Fix that by storing the tar from git to a temporary file, then pipe this
file to gzip's stdin, and redirect gzip's stdout to the output file.
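
A sketch of the fixed sequence (the intermediate file name is illustrative):

    # 'gzip ${output}' would rename the file to ${output}.gz, which the wrapper
    # does not expect. Keeping gzip on stdin/stdout preserves the output name.
    git archive --format=tar -o "${output}.tmp" "${cset}"
    gzip < "${output}.tmp" > "${output}"
    rm -f "${output}.tmp"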

Reported-by: Graham Newton <gnewton@peavey-eu.com>
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Peter Seiderer <ps.report@gmx.net>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-08-06 19:41:05 +02:00
Peter Seiderer
9d1ab43211 support/download: fix the git helper
Re-add the git_done variable (lost in commit [1]).

Fixes download problem reported by Rohit Kumar [2].

[1] http://git.buildroot.net/buildroot/commit/?id=7e40a1103a919a8177f00ddca2b46b4439953511
[2] http://lists.busybox.net/pipermail/buildroot/2014-August/103733.html

Signed-off-by: Peter Seiderer <ps.report@gmx.net>
Acked-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-08-06 19:40:34 +02:00
Yann E. MORIN
9a8470c9ee support/download: convert wget to use the wrapper
This drastically simplifies the wget helper, as it no longer has to deal
with atomically saving the downloaded archive.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Tested-by: Thomas De Schampheleire <thomas.de.schampheleire@gmail.com>
(Tested by running 'make busybox-source')
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-08-04 20:12:25 +02:00
Yann E. MORIN
a92290aa20 support/download: convert svn to use the wrapper
This drastically simplifies the svn helper, as it no longer has to deal
with atomically saving the downloaded archive.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Tested-by: Thomas De Schampheleire <thomas.de.schampheleire@gmail.com>
(Tested by running 'make open2300-source')
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-08-04 20:12:25 +02:00
Yann E. MORIN
5b3880e174 support/download: convert scp to use the wrapper
This drastically simplifies the scp helper, as it no longer has to deal
with atomically saving the downloaded archive.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Tested-by: Thomas De Schampheleire <thomas.de.schampheleire@gmail.com>
(Tested by setting a primary site to 'scp://localhost:/tmp' and
running 'make vim-source')
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-08-04 20:12:25 +02:00
Yann E. MORIN
93e678b767 support/download: convert Hg to use the wrapper
This drastically simplifies the hg helper, as it no longer has to deal
with atomically saving the downloaded archive.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Tested-by: Thomas De Schampheleire <thomas.de.schampheleire@gmail.com>
(Tested by running 'make vim-source')
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-08-04 20:12:24 +02:00
Yann E. MORIN
7e40a1103a support/download: convert git to use the wrapper
This drastically simplifies the git helper, as it no longer has to deal
with atomically saving the downloaded archive.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Tested-by: Thomas De Schampheleire <thomas.de.schampheleire@gmail.com>
(Tested by running 'make fmc-fsl-sdk-source')
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-08-04 20:12:24 +02:00
Yann E. MORIN
389f50dfb6 support/download: convert cvs to use the wrapper
This drastically simplifies the cvs helper, as it no longer has to deal
with atomically saving the downloaded archive.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Reviewed-by: Thomas De Schampheleire <thomas.de.schampheleire@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-08-04 20:12:24 +02:00
Yann E. MORIN
84469226c1 support/download: convert localfiles to use the wrapper
This drastically simplifies the localfiles helper, as it no longer has
to deal with atomically saving the downloaded archive.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Tested-by: Thomas De Schampheleire <thomas.de.schampheleire@gmail.com>
(Tested by setting BUSYBOX_SITE = file:///tmp and running 'make busybox-source')
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-08-04 20:12:24 +02:00
Yann E. MORIN
c48d45e342 support/download: convert bzr to use the wrapper
This drastically simplifies the bzr helper, as it no longer has to
deal with atomically saving the downloaded archive.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-08-04 20:12:24 +02:00
Yann E. MORIN
b5a9470bdf support/download: add download wrapper
The download wrapper is responsible for ensuring the atomicity
of saving into $(BR2_DL_DIR).

It calls the appropriate download helper, telling it to save the
downloaded content to a temporary file in $(BUILD_DIR) (so it does
not clutter $(BR2_DL_DIR) with partial, failed downloads).

Then, only if the download helper was successful does the wrapper
save the downloaded content to the final location, still via a
temporary file, finally renaming it atomically to the final output
file.
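
In rough pseudo-shell, the sequence implemented by the wrapper could be
sketched as follows (paths, variable and helper names are illustrative):

    # 1. Let the backend write into a scratch file under $(BUILD_DIR).
    tmp_dl="$(mktemp "${BUILD_DIR}/.XXXXXX")"
    support/download/${backend} "${tmp_dl}" "${url}" || { rm -f "${tmp_dl}"; exit 1; }

    # 2. Copy the result next to its final location in $(BR2_DL_DIR)...
    tmp_output="$(mktemp "${output}.XXXXXX")"
    cp "${tmp_dl}" "${tmp_output}" && rm -f "${tmp_dl}"

    # 3. ...then atomically rename it into place (same filesystem, plain rename(2)).
    mv "${tmp_output}" "${output}"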

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Reviewed-by: Thomas De Schampheleire <thomas.de.schampheleire@gmail.com>
Signed-off-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
2014-08-04 20:12:24 +02:00
Yann E. MORIN
ebe6154ff4 support/download: fix the git helper
When switching the git helper over to a shell script, a special case was
not carried over: in case the remote has the required reference, we
attempt a shallow clone, using --depth 1. However, this is not supported
when the remote is accessed with the http protocol.

Therefore, the download fails.

Before the conversion to a shell script, the helper in the Makefile
would fall back to doing a full clone.

This is the case and behaviour that were lost in the conversion.

To avoid making the script too complex, we only attempt a full clone if
needed. And we decide that a full clone is needed by default; we decide
it is unnecessary if the remote has the needed reference *and* the
shallow clone was successful.
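
A hedged sketch of that decision (reusing the git_done idea; the exact
commands and variable names are illustrative):

    git_done=0
    # Only try a shallow clone if the remote actually advertises the wanted ref.
    if [ -n "$(git ls-remote "${repo}" "${cset}")" ]; then
        if git clone --depth 1 -b "${cset}" "${repo}" "${basename}"; then
            git_done=1                      # shallow clone succeeded
        fi
    fi
    # By default a full clone is needed, e.g. when --depth is refused over http.
    if [ ${git_done} -eq 0 ]; then
        git clone "${repo}" "${basename}"
    fi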

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2014-07-08 23:26:48 +02:00
Yann E. MORIN
84d7986940 support/download: fix the bzr helper
bzr uses the extension of the output file name to know what output
format to use: tar, tgz, tar.bz2... If no extension is recognised, bzr
will output to a directory.

Since we use 'mktemp .XXXXXX' to generate temporary files, the name
obviously never ends with a recognised extension. Thus, bzr expects the
output to be a directory, and fails since it is a file.

Fix that by forcing the output format.
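
The fix amounts to passing an explicit format instead of letting bzr guess it
from the file name; a hedged sketch (option spelling and variable names are
illustrative):

    # Our mktemp-generated output name has no recognisable extension, so tell
    # bzr explicitly that we want a gzip-compressed tarball.
    bzr export -r "${rev}" --root="${basename}/" --format=tgz "${output}" "${repo}"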

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Reviewed-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2014-07-08 10:53:39 +02:00
Yann E. MORIN
051876936f manual: add documentation about packages' hashes
Although md5 is, for legacy reasons, a supported hash type,
it is not documented on purpose, since it is now known to
be weak.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Baruch Siach <baruch@tkos.co.il>
Cc: Arnout Vandecappelle <arnout@mind.be>
Cc: Samuel Martin <s.martin49@gmail.com>
Cc: Thomas De Schampheleire <patrickdepinguin@gmail.com>
Reviewed-by: Thomas De Schampheleire <thomas.de.schampheleire@gmail.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2014-07-04 23:39:26 +02:00
Yann E. MORIN
9bd8b59526 pkg-infra: add possibility to check downloaded files against known hashes
Some of the packages that Buildroot might build are sensitive packages,
related to security: openssl, dropbear, ca-certificates...

Some of those packages are downloaded over plain http, because there is
no way to get them over a secure channel, such as https.

In these dark times of pervasive surveillance, given the potential for harm
that a tampered-with package could cause, we may want to check the integrity
of those sensitive packages.

So, each package may now provide a list of hashes for all files that need
to be downloaded, and Buildroot will just fail if any downloaded file does
not match its known hash, in which case it is removed.

Hashes can be any of the md5, sha1 or sha2 variants, and will be checked
even if the file was pre-downloaded.
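
By way of illustration, such a per-package hash list is a plain text file of
'<type> <hash> <filename>' lines, and several hash types may be provided for
the same file; a hedged sketch with placeholder values:

    md5     0123456789abcdef0123456789abcdef                                  foo-1.0.tar.gz
    sha1    0123456789abcdef0123456789abcdef01234567                          foo-1.0.tar.gz
    sha256  0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef  foo-1.0.tar.gz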

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Baruch Siach <baruch@tkos.co.il>
Cc: Arnout Vandecappelle <arnout@mind.be>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
Reviewed-by: Samuel Martin <s.martin49@gmail.com>
Cc: Thomas De Schampheleire <patrickdepinguin@gmail.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2014-07-04 23:35:19 +02:00
Yann E. MORIN
19afad50f5 pkg-infra: don't use DL_DIR as scratchpad for temporary downloads
DL_DIR can be a very precious place for some users: they use it to store
all the downloaded archives to share across all their Buildroot (and
maybe non-Buildroot) builds.

We do not want to trash this location with our temporary downloads (e.g.
git, Hg, svn or cvs repository clones/checkouts, or temporary wget and bzr
tarballs).

It turns out that we already have some kind of scratchpad: the BUILD_DIR.
Although it is not really a disposable location, that's the best we have
so far.

Also, we create the temporary tarballs with mktemp, using the final tarball
name as a template, since we want the temporary file to be on the same
filesystem as the final location, so the 'mv' is just a plain, atomic
rename(2), and we are not left with a half-copied file at the final location.

Using mktemp ensures all temp file names are unique, so it allows for
parallel downloads from different build dirs at the same time, without
clobbering each other's downloads.

Note: we're using neither ${TMP} nor ${TMPDIR} since they are shared
locations, sometimes with little space (e.g. tmpfs), and some of the
repositories we clone/checkout can be very big.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Samuel Martin <s.martin49@gmail.com>
Cc: Arnout Vandecappelle <arnout@mind.be>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Thomas De Schampheleire <patrickdepinguin@gmail.com>
Tested-by: Thomas De Schampheleire <thomas.de.schampheleire@gmail.com>
[tested a particular scenario that used to fail: two separate builds
using a shared DL_DIR, ccache enabled, so that they run almost
synchronously. These would download the same file at the same time,
corrupting each other. With the patches in this series, all works
fine.]
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2014-07-04 23:28:13 +02:00
Yann E. MORIN
283b8b7fea pkg-infra: move the cp (aka localfiles) download helper to a script
Maintaining the download helpers in the Makefile has proved to be a bit
complex, so move it to a shell script.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2014-07-03 00:07:57 +02:00
Yann E. MORIN
54adff55d5 pkg-infra: move the scp download helper to a script
Maintaining the download helpers in the Makefile has proved to be a bit
complex, so move it to a shell script.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2014-07-03 00:07:12 +02:00
Yann E. MORIN
45261f1fb7 pkg-infra: move the bzr download helper to a script
Maintaining the download helpers in the Makefile has proved to be a bit
complex, so move it to a shell script.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2014-07-03 00:03:59 +02:00
Yann E. MORIN
4bee7629ee pkg-infra: move the wget download helper to a script
Maintaining the download helpers in the Makefile has proved to be a bit
complex, so move it to a shell script.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Reviewed-by: Samuel Martin <s.martin49@gmail.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2014-07-03 00:00:15 +02:00
Yann E. MORIN
2fd4b959bd pkg-infra: move the hg download helper to a script
Maintaining the download helpers in the Makefile has proved to be a bit
complex, so move it to a shell script.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Reviewed-by: Samuel Martin <s.martin49@gmail.com>
Tested-by: Thomas De Schampheleire <thomas.de.schampheleire@gmail.com>
[tested a particular scenario that used to fail, when the 'hg archive'
step is interrupted, now working fine]
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2014-07-02 23:59:10 +02:00
Yann E. MORIN
f4526c053f pkg-infra: move the cvs download helper to a script
Maintaining the download helpers in the Makefile has proved to be a bit
complex, so move it to a shell script.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Reviewed-by: Samuel Martin <s.martin49@gmail.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2014-07-02 23:54:16 +02:00
Yann E. MORIN
daf034f881 pkg-infra: move the svn download helper to a script
Maintaining the download helpers in the Makefile has proved to be a bit
complex, so move it to a shell script.

[Peter: redirect pushd/popd output to /dev/null]
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Reviewed-by: Samuel Martin <s.martin49@gmail.com>
Reviewed-by: Ryan Barnett <ryan.barnett@rockwellcollins.com>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2014-07-02 23:51:56 +02:00
Yann E. MORIN
95a572282e pkg-infra: move the git download helper to a script
The git download helper is getting a bit more complex. Fixing it in the
Makefile when it breaks (like the recent breakage with a non-existing
sha1-cset) proves to be challenging, to say the least.

Move it into a shell script in support/download/git, which will make
it much easier to read, maintain, fix and enhance in the future.

[Peter: redirect pushd/popd output to /dev/null]
Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Peter Korsgaard <jacmet@uclibc.org>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Acked-by: Luca Ceresoli <luca@lucaceresoli.net>
Cc: Arnout Vandecappelle <arnout@mind.be>
Reviewed-by: Samuel Martin <s.martin49@gmail.com>
Reviewed-by: Arnout Vandecappelle (Essensium/Mind) <arnout@mind.be>
Signed-off-by: Peter Korsgaard <peter@korsgaard.com>
2014-07-02 23:47:13 +02:00