Also tweak Hive a bit and refer to artifacts in a new (but not totally
ideal) location.
Co-Authored-By: Jeremy Freudberg <jeremyfreudberg@gmail.com>
Change-Id: I3a25ee8c282849911089adf6c3593b1bb50fd067
The canonical location for the artifacts going forward is
tarballs.openstack.org/sahara-extra/, so fix the link to use that
and also use https.
Moreover, since the last tarball required for building images
is available on tarballs.openstack.org, remove the last references
to sahara-files for artifacts and documentation
(the new location had already been in use for a while in a few places).
There are still a few references to sahara-files,
but they all concern CentOS 6, which is no longer supported
by diskimage-builder, and should be removed separately.
Change-Id: Iab5a4d50a0abc6ab278837b6a9efd5e30f31c44a
Building CDH images for version 5.5.0 is no longer supported,
so remove the now-dead code.
Also, add Ambari usage info to the sahara-image-create
command.
Change-Id: I6fffe25ee9daf651355611be675137babb67e2a8
Set DIB_CDH_MINOR_VERSION in diskimage-create so that it cannot
be an empty value. The lines assigning DIB_CDH_VERSION for the other
CDH versions, where DIB_CDH_MINOR_VERSION is not used, were also
reordered for consistency.
Also, to help prevent future errors, derive the value of
DIB_CDH_MINOR_VERSION from DIB_CDH_VERSION in all elements where
it is used (thanks to the ${VAR:-value} syntax, which does not fail
under set -u even when DIB_CDH_MINOR_VERSION is unset).
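The pattern this change relies on can be sketched as follows; the exact
default value derived from DIB_CDH_VERSION is an illustrative assumption,
not the element's literal code:

```shell
#!/bin/bash
# set -u makes any reference to an unset variable fatal.
set -eu

DIB_CDH_VERSION=5.5
# ${VAR:-value} substitutes a default without tripping "set -u",
# even when DIB_CDH_MINOR_VERSION was never assigned.
# The ".0" suffix here is an assumed example default.
DIB_CDH_MINOR_VERSION=${DIB_CDH_MINOR_VERSION:-${DIB_CDH_VERSION}.0}
echo "$DIB_CDH_MINOR_VERSION"
```

With DIB_CDH_MINOR_VERSION unset this prints the derived default; an
exported value takes precedence.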
Closes-Bug: #1657482
Change-Id: I31b25fd4ba886d051b9b57902cd72349a4a2dbfa
We migrated the hadoop-openstack jars to tarballs.openstack.org,
so we should switch to using those files.
Change-Id: I2b8a773c29649113fa85a9e0260b54b77bda6a3f
Vanilla 2.6.0 is not in the supported list
in any current branch of sahara, so we can
just drop it. If needed, the stable/mitaka
branch should be used for building that image.
Change-Id: I81ed8209f2154f112fe7f6718029b84548793380
Add support for CDH 5.7 on Ubuntu, CentOS and CentOS 7.
Also add support for CDH minor versions (5.7.0 and 5.7.1), which can
be selected with the DIB_CDH_MINOR_VERSION variable.
By default, version 5.7.0 is built.
Note: for the Spark plugin we use only the HDFS part of CDH, so we do
not need to create symbolic links for Oozie (for this we check the
variable DIB_CDH_HDFS_ONLY).
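A minimal pre-build environment sketch; the variable names come from this
commit message, while the values and the surrounding setup are illustrative
assumptions:

```shell
#!/bin/bash
# Select a specific CDH minor version (5.7.0 is assumed to be the
# default when this variable is left unset).
export DIB_CDH_MINOR_VERSION=5.7.1
# For Spark images only the HDFS part of CDH is needed, so this flag
# tells the elements to skip the Oozie symbolic links.
export DIB_CDH_HDFS_ONLY=1
echo "building CDH ${DIB_CDH_MINOR_VERSION}"
```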
depends-on: I1167d0d98ae6fb6fabaf7f1f9a344691d459b50b
Change-Id: I1d71ffbdf78373d27f7e1304164b32da786ac10b
bp: cdh-5-7-support
This patch adds our custom hadoop swiftfs implementation to the
Vanilla, Spark, Ambari and CDH plugin images in order to allow the
use of Swift with the Keystone API v3.
Partial-bug: 1558064
Depends-on: Ie6df4a542a16b4417b505b4a621e8b4b921364d3
Change-Id: Icd4b62bd4293bc9b40dba171a22285c7d0ac75c7
This change replaces the hardcoded value for the hadoop swift jar
filename with a variable that defaults to the old value
"hadoop-swift.jar" but also allows redefinition.
* add variables for hadoop swift jar file name
* add documentation for new variable
* correct documentation for the lib path variable
* add more to the basic description documentation
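The default-with-override pattern described above can be sketched as
follows; the variable name DIB_HADOOP_SWIFT_JAR_NAME is an assumption
here, chosen to match the commit's description:

```shell
#!/bin/bash
# Replace the hardcoded filename with a variable that keeps the old
# value "hadoop-swift.jar" as its default but can be redefined by the
# caller (e.g. "export DIB_HADOOP_SWIFT_JAR_NAME=my-swift.jar").
DIB_HADOOP_SWIFT_JAR_NAME=${DIB_HADOOP_SWIFT_JAR_NAME:-hadoop-swift.jar}
echo "$DIB_HADOOP_SWIFT_JAR_NAME"
```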
Change-Id: I44eaaa8b23ae30231b7b4087740d8dd5d050ec00
Closes-Bug: 1498967
Make use of the package-installs element to declare, in YAML format,
the packages to be installed at the beginning of a phase.
Besides reducing the number of explicit 'install-packages pkg1 ...'
invocations, the packages can also be installed just once per phase.
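A minimal sketch of the declarative format, assuming diskimage-builder's
package-installs conventions; the package names are illustrative:

```yaml
# package-installs.yaml, placed in an element's root directory.
# A key with no value installs the package in the default install phase;
# attributes such as "phase" can pin it elsewhere.
ntp:
curl:
wget:
  phase: pre-install.d
```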
Change-Id: I1f1acfb2bd74fed5cf4c0b48bc739f7f75c35d83
Following the latest dib-lint reporting, make sure almost all the
scripts are enabling -e, -u, and pipefail. This eases the discovery of
failing commands, and the usage of unset variables.
There are a few exceptions where the flags are not set:
* elements/hadoop-hdp/install.d/40-install-hdp, set -e
when installing HDP 2.0, it tries to install tez, which does not seem
to exist in Hortonworks' repositories
* elements/ssh/install.d/33-ssh, set -e
the version of augtool (part of augeas 0.10) in older Ubuntu versions
(like Precise, needed by the cloudera plugin) exits with wrong return
values in autosave mode
* elements/storm/install.d/60-storm, set -e
It tries to change the permission of /etc/supervisord.conf, which does
not seem to exist
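The header that dib-lint expects on element scripts can be sketched as:

```shell
#!/bin/bash
# Abort on any failing command (-e) and on references to unset
# variables (-u); pipefail makes a pipeline fail if any stage fails,
# not just the last one.
set -eu
set -o pipefail

# A trivial command so the script has observable output.
status="flags-set"
echo "$status"
```

Without pipefail, `failing-cmd | tee log` would mask the failure because
the exit status of the pipeline would come from tee.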
Change-Id: Ic1314639dfc6a66c48ca87b6820707a2b0cb1dbd
Partial-Bug: #1435306
Mimic commit 36b59c001c1643217449646b371df46d2cb11b91 in
diskimage-builder, by adopting the usage of $DIB_DEBUG_TRACE to check
whether to enable tracing in scripts.
Unlike with the diskimage-builder commit, the default is to not enable
tracing even in the few scripts that used to unconditionally "set -x".
Enabling tracing can be done by either:
- passing -x to disk-image-create
- exporting DIB_DEBUG_TRACE=N, with N=0/1
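The conditional-tracing snippet adopted from diskimage-builder looks
roughly like this (a sketch of the pattern, not the verbatim hunk):

```shell
#!/bin/bash
# Tracing is off by default; it is enabled only when the caller exports
# DIB_DEBUG_TRACE with a value greater than 0 (disk-image-create's -x
# flag sets this for all scripts).
if [ "${DIB_DEBUG_TRACE:-0}" -gt 0 ]; then
    set -x
    tracing=on
else
    tracing=off
fi
echo "tracing: $tracing"
```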
Change-Id: I56ccd6753df31f7ddda641640cdb1985b2d9e856
Partial-Bug: #1435306
The Spark image needs the hadoop-swift.jar in /usr/lib/hadoop
in order for Spark jobs to access Swift URLs.
Change-Id: I1c47cbc877e6c6628dc3fd2181152b2c4d4cd3f9
Partial-Implements: blueprint edp-spark-swift-integration
WARNING:
----
Before merging this commit the alias for
http://sahara-files.mirantis.com needs to be in place.
Also before merging this commit the new openstack git project must be
available at https://git.openstack.org/openstack/sahara-image-elements/
NOTE:
----
The file 'elements/hadoop-hdp/source-repository-hadoopswift' contains a
link to the HortonWorks repository that holds the Hadoop Swift rpm;
this link needs to be updated when HortonWorks makes the change.
Implements: blueprint savanna-renaming-image-elements
Change-Id: Icb9a992f8545535af3a111580ce7c9622d754c67