In https://review.opendev.org/c/openstack/loci/+/888351
we build python-nss separately because we need to
patch it before building, and we move the resulting
.whl file to the root directory.
Now let's add --find-links / to all pip wheel commands
so pip can find that wheel.
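A minimal sketch of the resulting flow, using an illustrative directory and a dummy wheel filename (in the image the directory is / itself):

```shell
# Illustrative only: stand in for the patched python-nss wheel that
# the build moves to the image root, then point pip at that directory.
mkdir -p /tmp/wheelroot
touch /tmp/wheelroot/python_nss-1.0.1-cp310-cp310-linux_x86_64.whl
ls /tmp/wheelroot
# In the real image:
# pip wheel --find-links / python-nss===1.0.1
```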
Change-Id: I5faca92eb229af989b4bf63e6e044e49c7bacb5a
The libnss3 headers in Ubuntu Jammy are incompatible
with python-nss===1.0.1. Ubuntu Jammy itself
provides the binary package python-nss and
applies a patch that renames the RSAPublicKey/DSAPublicKey
types to PyRSAPublicKey/PyDSAPublicKey.
This change applies the same patch before
building the python-nss wheel.
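A rough sketch of what such a rename patch amounts to, demonstrated on a made-up dummy file (the real patch edits the python-nss C sources):

```shell
# Create a dummy source file using the conflicting type names.
printf 'RSAPublicKey obj; DSAPublicKey d;\n' > /tmp/nss_demo.c
# Rename the types, as the distro patch does (GNU sed assumed).
sed -i 's/\bRSAPublicKey\b/PyRSAPublicKey/g; s/\bDSAPublicKey\b/PyDSAPublicKey/g' /tmp/nss_demo.c
cat /tmp/nss_demo.c
```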
Change-Id: I5211aac7b6a3bbc1f1b74e8662170dc8932525f9
Not all deployments will use uwsgi by default, but there is currently
no way to disable it, and uwsgi itself isn't truly a Python app (it's
a C program with a Python build shim).
The user will still be able to install binary package based uwsgi or
include uwsgi in the extra pip packages to install if they wish to.
Related-Id: I76612794c1ba8dbc45b62dff00cee43c6ba10a34
Change-Id: I15008d41633168fda31e061003ccf4681cade68e
This change removes the Python 2 references and workarounds from
the requirements.sh script. This is part of the effort to remove
Python 2 usage and workarounds from the loci repository.
Change-Id: I23271067587dff938c3bcce4f798db8dc33a428b
The workaround for trollius appears to be no longer needed since
openstack has moved away from python 2. This change removes the
logic to handle removing trollius since it's no longer present
in upper-constraints.
Change-Id: Icd832edffec2cea907276480fd8c67e44459ad90
This package requires a newer version of librdkafka-dev than the
distributions ship, so the wheel cannot be built there. On amd64 it
has wheels on PyPI, so pip doesn't attempt to build it, but the build
fails on aarch64 systems.
Change-Id: If0a90919cdddaf91a67ce96646dea87cf0b9632c
Without CPUCOUNT=1, uwsgi frequently fails to build in
certain environments. Once this was implemented, the
builds were stable again.
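uwsgi's build system reads CPUCOUNT from the environment to decide how many parallel compiler jobs to run; pinning it looks roughly like:

```shell
# Force a single compiler job for the uwsgi build.
export CPUCOUNT=1
echo "building uwsgi with CPUCOUNT=$CPUCOUNT"
# pip wheel uwsgi   # the actual build step (network access required)
```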
Change-Id: If85687046297c582b39ca591b21e93c76b7de876
The new pip resolver being released with pip 20.3 does not support
unnamed constraints. Instead of relying on pip to handle unnamed
constraints, we'll first build the corresponding wheels, determine
their versions, and substitute the unnamed constraints with named
ones in upper-constraints.txt. These are then fed to the general
wheel build and used by consumers of the wheel bundle.
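A toy demonstration of the substitution step (the package name, URL, and version are made up):

```shell
# upper-constraints.txt with one unnamed (URL) constraint.
printf 'git+https://opendev.org/openstack/foo#egg=foo\n' > /tmp/uc.txt
# After building the wheel we learn its version...
built_version=1.2.3
# ...and replace the unnamed constraint with a named pin.
sed -i "s|^git+.*#egg=foo$|foo===${built_version}|" /tmp/uc.txt
cat /tmp/uc.txt
```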
Change-Id: Ie2acf93d4c91c6bbf3a3777b6ee697253207e2ef
Signed-off-by: Andrii Ostapenko <andrii.ostapenko@att.com>
If the KEEP_ALL_WHEELS flag is not False, keep all packages specified
in upper-constraints.txt together with their dependencies.
Change-Id: I79051757780eabba15a2995fa8eaece0ff459f9e
Signed-off-by: Andrii Ostapenko <andrii.ostapenko@att.com>
The patch adds KEEP_ALL_WHEELS, set to False by default.
The flag allows wheels to be kept rather than removed after
the requirements image is built. This is useful where
reproducible builds with the same WHEEL image are needed,
as it keeps third-party libs (those not specified in
upper-constraints.txt) at the same versions.
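The flag check presumably looks something like this (the exact script wording is an assumption, not taken from the change):

```shell
# Default the flag to False, as the patch describes.
KEEP_ALL_WHEELS=${KEEP_ALL_WHEELS:-False}
if [ "$KEEP_ALL_WHEELS" != "False" ]; then
  echo "keeping all wheels from upper-constraints.txt plus dependencies"
else
  echo "pruning wheels not needed for the requirements image"
fi
```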
Change-Id: I00fd3df2ba46072b3f19c9d08f69bb3c1f53d01f
If additional pip packages are provided for the requirements image,
they will be installed into the system before the wheels are built
and, in addition, their wheels are stored in the local pip cache.
Later, during wheel creation, the cached wheel will be used and then
deleted, since the system will consider the package available in
wheel form.
This only matters when a package is not available in wheel form and
needs to be part of the requirements image.
Change-Id: I6d5410da02ff9717418de55876162d34b6185e1d
This reverts commit acfd446c03.
The workarounds discussed here are implemented in Ocata/Pike.
Stein and above don't need enum-compat, but Rocky
would still fail without this.
Change-Id: I83fe0c46cad24d9a1c64c4047429bb3491791fe5
Trollius cannot be built (it is no longer present on PyPI) and is
only used by Zaqar. If we don't bypass this, we can't build LOCI
on older branches. I have proposed fixes in requirements and
zaqar; when these merge in the oldest branches, we can revert
this.
Change-Id: Id1a5413ce36a82063301c011b7adf8d8ac5d06ba
This bug was fixed in requirements repo a while back.
upper-constraints.txt contains a constraint for enum-compat now and we
do not need to explicitly call it out anymore.
Related-Id: I7c8bfbb89f13db1c251e762dfaf2020fa1f2fdc8
Change-Id: I42b9f3d253e2062a5e2ec92ae24be88d24617f3f
M2Crypto wheel building (tested on 0.33 and 0.31) requires the openssl
devel libraries. Yet, building the wheel with the default libraries
on CentOS and Ubuntu fails, due to an incompatible version shipped
by default.
M2Crypto is only used for some specific cinder drivers, and
can probably be installed from distro packages as a post build
action, if necessary. Alternatively, using the python3 version of
pywbem doesn't need M2Crypto.
Change-Id: I0ae1df6a71bbaabd21c2dfcfd1ffe847a811e008
Any pylxd version before 2.2.7 fails to build now that
urllib3 1.25 is out. This removes them, and therefore
allows us to build requirements for older openstack
branches than Rocky.
See also https://review.opendev.org/#/c/655498/
Change-Id: I632f648f59020a2010ffe05f59ca634461752997
Now that requirements patch has merged, we can remove this hack.
This reverts commit 329e83a6e3.
Depends-On: https://review.openstack.org/#/c/641738/
Change-Id: I2b3f6fc95df6e15077d280436ca5ea06715b3c5a
When building the requirements image, constraints try to bring in
version 0.3.5 of zVMCloudConnector, which requires Python 3.5 at
most, while Leap 15 only has Python 3.6.
Change-Id: I15f57006246059ead08d2cbc4ee494b3891e2081
At least one distro (Ubuntu Xenial)
does not have the necessary packages in its default repositories
for building python-qpid-proton 0.14.0 (libqpid-proton2-dev doesn't
provide the qpid-proton bits required by the wheel build).
When building the wheel, the bundling of qpid-proton fails
because the Apache mirrors have removed this old version.
This prevents the newton branch from building.
Example pip wheel log:
```
Collecting python-qpid-proton===0.14.0 (from -r requirements.txt (line 1))
Using cached d39f0e805fdda8ebc0eae03d2647f6/python-qpid-proton-0.14.0.tar.gz
Building wheels for collected packages: python-qpid-proton
Running setup.py bdist_wheel for python-qpid-proton ... error
Complete output from command /usr/bin/python -u -c (redacted)
running bdist_wheel
running build
running build_ext
running configure
Did not find libqpid-proton via pkg-config:
Bundling qpid-proton into the extension
fetching http://www.apache.org/dist/qpid/proton/0.14.0/qpid-proton-0.14.0.tar.gz into build/bundled
error: HTTP Error 404: Not Found
```
Change-Id: I1b3708da4cefa7e15ec5f5623e182a3258508fc7
The 'echo %1' in requirements.sh did just that, echoed %1 without
reporting the source of the pip error. This patch changes this
command to 'cat $1', which dumps the content of the file referred
to by $1. This content is the pip package to install. Upon failure,
the user can know exactly which pip build failed and search for the
relevant error message in the build log.
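A small demonstration of the difference (the file name, contents, and function wrapper are illustrative, not from the script):

```shell
# The file pip failed on names the package being built.
echo 'python-qpid-proton===0.14.0' > /tmp/failed-req.txt
# Before: printed the literal placeholder, telling the user nothing.
report_failure_old() { echo '%1'; }
# After: dump the file, naming the package whose build failed.
report_failure_new() { cat "$1"; }
report_failure_new /tmp/failed-req.txt
```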
Change-Id: Ieb5c928c31f1a6673152a0b1a15f6efe21cc063e
This patch adds a new argument, PIP_WHEEL_ARGS:
- allows additional parameters for pip wheel
- defaults to PIP_ARGS
Also use PIP_ARGS with all pip installs.
Useful when building the requirements image with local pip mirrors.
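The defaulting likely amounts to something like this (the mirror URL is made up):

```shell
# Mirror-aware pip args; PIP_WHEEL_ARGS falls back to PIP_ARGS.
PIP_ARGS="--index-url http://mirror.example/simple"
PIP_WHEEL_ARGS="${PIP_WHEEL_ARGS:-$PIP_ARGS}"
echo "pip wheel $PIP_WHEEL_ARGS -r requirements.txt"
```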
Change-Id: I43c6b921467150509d013554aaa1983f30abedff
Cleans up the comment style to remove author names and clarify
the comment as it relates to the code. Using the NOTE (NAME):
format is redundant and takes away attention from the purpose
of documenting why an action is being taken.
Also updates the status of TODO and FIXME items, including removing
code that was a workaround fixed by a recent patch.
Change-Id: I2e087be1e204c618d1dbe499b3f69eae34ce656f
Install packages from PIP_PACKAGES before wheel build to be able
to limit dependencies specified in "setup_requires" of packages
listed in upper-constraints.txt, which pip has no way to control.
This is intended to work around https://github.com/lxc/pylxd/issues/308
or similar issues.
Change-Id: I36bd4347727b4c7f4beea2cdba6674445fbb4ca4
Without this patch, we are stuck on old versions of bindep
and distro, and there is no reason to pin or patch bindep anymore.
This is a problem because more recent distributions like
openSUSE Leap 15 will have profile detection issues.
This should fix that.
Change-Id: Ia3637b3801590cda92be433ca3ba79f62e15512a
'scikit-learn' requires the 'numpy' and 'scipy' packages, which take
a long time to build.
We need 'numpy' for the nova image but can ignore the rest (they are
needed by the 'monasca_analytics' project, which we do not use).
On the x86-64 architecture all those packages come as binaries from PyPI.
Change-Id: I36b6ca9d094f83d48e7fa41a7818441169bd04e6
The upstream requirements generator has a bug and is not pulling in
enum-compat, although lower-constraints.txt strangely has it. This is
a quick fix for the issue.
Change-Id: I7c8bfbb89f13db1c251e762dfaf2020fa1f2fdc8
One of the claims we have made is that we can install in an airgapped
environment. This was true until we started installing master bindep to
take advantage of the new syntax.
Instead of installing master, we can just patch the file and remove the
patch when the new tag is created. Let's do that.
Change-Id: I7a86e6315c44e4e9aa050ce19865708dd6c90ff6
This prevents ~200MB of needless PyPI downloading and speeds up the
build process, especially in environments without a PyPI mirror.
Change-Id: Icb36ad752923730f5daca22ef08455551e1161a8
The cassandra driver is far and away the longest package we build. It
normally takes 3-4 minutes *after* all the other packages have built to
finish. The recommendation for this is to increase the build concurrency
with the variable above.
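cassandra-driver's build honors a concurrency environment variable for its Cython compilation; bumping it looks roughly like this (the value 8 is illustrative):

```shell
# Parallelize the cassandra-driver extension build.
export CASS_DRIVER_BUILD_CONCURRENCY=8
echo "CASS_DRIVER_BUILD_CONCURRENCY=$CASS_DRIVER_BUILD_CONCURRENCY"
# pip wheel cassandra-driver   # actual build (network access required)
```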
Change-Id: I3965f62525d509e8811ed65c599d55b558ce39e4
We cannot build the python-qpid-proton pip package on Debian because
it ships OpenSSL 1.1.0. This seems like a lesser-used package, so for
now we will just skip building it and give the world some time to
resolve the issue.
Change-Id: I4af88cb57ce2fc614d373c83cf3745c4aaaa5c7b
With this patch we stop trying to collect all possible packages we
might want and instead only save packages that we needed to build into
wheels ourselves because they were only distributed as source tarballs
upstream.
Doing it this way has a few advantages:
* Reduced layer size from 250MB of wheels to 50MB
* Can install additional packages that were not built into the wheels
image ahead of time (assuming they exist as wheels upstream)
* Faster build times on slower WAN connections
* Less I/O when building
We explicitly limit to binary installs with "--only-binary :all:" to
prevent any accidental source building.
This has no effect on building in an air-gapped environment. You would
already need a PyPI mirror to build in an air-gapped environment in the
first place.
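The guard amounts to passing pip's binary-only switch on installs, along the lines of (wheel path and variable name are illustrative):

```shell
# Refuse source builds; install only prebuilt wheels.
PIP_INSTALL_ARGS="--only-binary :all: --find-links /tmp/wheels"
echo "pip install $PIP_INSTALL_ARGS -r requirements.txt"
```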
Change-Id: I746d315ff642c951bbe798af7fbf951b5aa81a4f