This patch allows building sahara images on Ubuntu/Debian
distributions with the qemu-utils package.
Change-Id: Ibfa5c2550898a0d36b51057f5cd2b85f434b57f5
Story: 2006505
Task: 36483
Even if we are not really testing plain images much, make it clear
that we have not tested any Hadoop components on Bionic yet.
Change-Id: I684a98338945d4449e37d9f652bddddf827838bf
Remedy patching problems, version conflicts, classpath issues, etc.
ALSO: Switch the Hadoop libraries used on the Spark standalone plugin to
Hadoop 2.7.3. The version was previously 2.6.5, to match Cloudera's
so-called "Hadoop 2.6.0", but in fact this concordance is not at all
necessary...
Change-Id: Iafafb64fd60a1ae585375a68173c84fbb82c7e1f
Sahara projects migrated to storyboard.openstack.org.
Replace the references to Launchpad, including the bugs now
available as stories.
Fix a reference to GitHub.
Change-Id: Iadba69efc1e310b6a19463d3398bf5c6549acd73
Also tweak Hive a bit and refer to artifacts in a new (but not totally
ideal) location.
Co-Authored-By: Jeremy Freudberg <jeremyfreudberg@gmail.com>
Change-Id: I3a25ee8c282849911089adf6c3593b1bb50fd067
The canonical location for the artifacts from now on is
tarballs.openstack.org/sahara-extra/, so fix the link to use that
and also use https.
Moreover, since the last tarball required for building images
is available on tarballs.openstack.org, remove the last references
to sahara-files for artifacts and documentation
(the new location had already been used in a few places for a while).
There are still a few references to sahara-files,
but they are all about CentOS 6, which is no longer supported
by diskimage-builder, and should be removed separately.
Change-Id: Iab5a4d50a0abc6ab278837b6a9efd5e30f31c44a
* Handle Hadoop classpath better
* Include proper support for Spark classpath
* Formally limit element's use to Vanilla and Spark
Change-Id: I65abd7e375dba11599a4ab943d24f878235cd71d
Closes-Bug: #1727757
Closes-Bug: #1728061
In certain cases, it is preferable to build a different image
format; for example, the Ceph RBD backend prefers the RAW image
format.
This option allows passing that value over to DIB in order to
control the final image format.
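As a rough sketch of the idea (DIB_IMAGE_FORMAT is an illustrative
variable name here, not necessarily the real one; disk-image-create's
`-t` option is what actually selects the output type):

```shell
# Hedged sketch: forward a user-requested image format to diskimage-builder.
# "qcow2" is assumed as the default; "raw" would be chosen for Ceph RBD.
IMAGE_FORMAT="${DIB_IMAGE_FORMAT:-qcow2}"
echo "disk-image-create -t ${IMAGE_FORMAT} -o sahara-image vm ubuntu"
```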
Change-Id: I851fc222b5e8a77d148c5f9d53c2688b17e6e96f
As a prerequisite for S3 datasource support, the hadoop-aws jar needs
to be on the Hadoop classpath. The jar is copied into the proper folder
when possible on the appropriate plugins, and otherwise can be provided
by the user through a download URL.
Additionally, set the correct value of DIB_HDFS_LIB_DIR on the Vanilla
plugin to avoid any unnecessary symlinking.
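A minimal sketch of that placement logic (the paths, jar version, and
the HADOOP_AWS_DOWNLOAD_URL variable name are illustrative assumptions,
not the element's real values):

```shell
# Hedged sketch: put hadoop-aws on the Hadoop classpath, preferring a
# copy already shipped with the distribution, else a user-supplied URL.
HDFS_LIB_DIR="${DIB_HDFS_LIB_DIR:-/opt/hadoop/share/hadoop/tools/lib}"
JAR="hadoop-aws-2.7.3.jar"
if [ -f "${HDFS_LIB_DIR}/${JAR}" ]; then
  echo "jar already present; no symlinking or download needed"
elif [ -n "${HADOOP_AWS_DOWNLOAD_URL:-}" ]; then
  echo "curl -L -o ${HDFS_LIB_DIR}/${JAR} ${HADOOP_AWS_DOWNLOAD_URL}"
else
  echo "no local jar and no download URL provided" >&2
fi
```

Pointing DIB_HDFS_LIB_DIR at the directory where the jar already lives
is what avoids the unnecessary symlink on the Vanilla plugin.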
Partially-Implements: bp sahara-support-s3
Change-Id: I94c5b0055b87f6a4e1382118d0718e588fccfe87
Now hosted on tarballs.o.o. Plus, for Hadoop 2.7.1, correct the content
of this tarball.
Change-Id: Ib42df14dcb62548082d38fb36f3928d632ee4da5
Closes-Bug: #1705942
Instead of hard-coding Spark 1.6.0, allow use of DIB_SPARK_VERSION (and
its corresponding CLI argument `-s`) to specify which version of Spark
should be included on images for the Vanilla plugin.
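A sketch of the version selection (the tarball naming scheme shown is
an assumption for illustration, not necessarily what the element uses):

```shell
# Hedged sketch: honour DIB_SPARK_VERSION, keeping the previously
# hard-coded 1.6.0 as the fallback default.
DIB_SPARK_VERSION="${DIB_SPARK_VERSION:-1.6.0}"
SPARK_TARBALL="spark-${DIB_SPARK_VERSION}-bin-hadoop2.6.tgz"
echo "${SPARK_TARBALL}"
```

Per the message above, `./diskimage-create.sh ... -s <version>` maps to
DIB_SPARK_VERSION.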
Change-Id: Ia7c7027c9eadfc9d724733a503990ca78e487ee9
Following the approved spec:
- do not automatically build centos (CentOS 6) images when no
operating system is specified for a specific plugin;
- if 'centos' is explicitly specified, print out a warning message;
- whenever 'centos' is explicitly specified by the gate script,
do not build it anymore.
Blueprint: deprecate-centos6-images
Change-Id: I4e11b97061d6e1f9804bae0157a345ed484d7dbe
Actually there is not much difference between using CDH 5.4 and 5.5
for Spark, because both use Hadoop 2.6 as a base.
Also remove a redundant case in installing packages for CDH.
Change-Id: Ie24bb72365352edb22d94a461df0a0af6cd71806
Closes-bug: 1686400
Building CDH images with versions below 5.5.0 is no longer supported.
Remove this now-useless code.
Also, add the Ambari usage info to the sahara-image-create
command.
Change-Id: I6fffe25ee9daf651355611be675137babb67e2a8
Set DIB_CDH_MINOR_VERSION in diskimage-create so that it cannot be an
empty value. For consistency, the lines assigning DIB_CDH_VERSION for
the other CDH versions (where DIB_CDH_MINOR_VERSION is not used) were
reordered as well.
Also, in order to try to prevent future errors, set the value of
DIB_CDH_MINOR_VERSION based on DIB_CDH_VERSION in all elements where
it is used (thanks to the ${VAR:-value} syntax which does not fail
with set -u even if DIB_CDH_MINOR_VERSION is not assigned).
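The expansion behaviour described above can be seen in a minimal
example (the version number is illustrative):

```shell
set -u  # unset variables are fatal, hence the ${VAR:-value} defaults

DIB_CDH_VERSION="5.7"   # illustrative version number
# Expands safely under "set -u" even when DIB_CDH_MINOR_VERSION is unset:
DIB_CDH_MINOR_VERSION="${DIB_CDH_MINOR_VERSION:-${DIB_CDH_VERSION}.0}"
echo "${DIB_CDH_MINOR_VERSION}"
```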
Closes-Bug: #1657482
Change-Id: I31b25fd4ba886d051b9b57902cd72349a4a2dbfa
Sahara uses netcat for indirect access to a cluster's instances;
additionally include the netcat package
so that indirect access works properly.
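For illustration only (all addresses are made up), the mechanism behind
indirect access is netcat acting as an SSH ProxyCommand relay through a
reachable gateway instance:

```shell
# Illustrative sketch: netcat relays the raw SSH stream to an instance
# that is only reachable on a private network.
GATEWAY="172.18.0.2"          # publicly reachable proxy instance
TARGET="10.0.0.5"             # cluster instance on a private network
PROXY_CMD="nc ${TARGET} 22"   # netcat forwards the SSH connection
echo "ssh -o ProxyCommand='ssh ${GATEWAY} ${PROXY_CMD}' ubuntu@${TARGET}"
```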
Closes-bug: 1649644
Change-Id: Ifb255b3c51fd7810730d5809dab913ef0fcd0f17
Those versions are removed or disabled in the current master; in any
case, the older branches of sahara-image-elements can still build them.
Change-Id: I4e14662465e96ac223792a2f40e3925a7ecbebd5
The other images (Ambari, CDH, MapR) do not support Xenial.
Spark depends on CDH, and while Vanilla and Storm may support Xenial,
there are other packages and dependencies that need to be fixed first.
Change-Id: Ied720e35d2a5372f228861b104cca1afddaf26f9
Add the spark element to vanilla images.
It makes it possible to run Spark jobs
on vanilla clusters.
bp spark-jobs-for-vanilla-hadoop
Change-Id: Ie5eb9ec10b0052c9d1f6284b312edfee0ddba4f0