sahara-image-elements/elements
Michael Ionkin d56039497b Add hadoop openstack swift jar to plugins images
This patch adds our custom hadoop swiftfs implementation to Vanilla,
Spark, Ambari and CDH plugins images in order to allow usage of
swift with Keystone API v3

Partial-bug: 1558064
Depends-on: Ie6df4a542a16b4417b505b4a621e8b4b921364d3
Change-Id: Icd4b62bd4293bc9b40dba171a22285c7d0ac75c7
2016-03-24 19:02:11 +03:00
ambari Merge "gate-sahara-buildimages-ambari (>= 2.1.2) job fix" 2016-02-14 12:54:50 +00:00
apt-mirror Improvements to README.rst of elements 2015-04-29 18:41:30 +02:00
centos-mirror Improvements to README.rst of elements 2015-04-29 18:41:30 +02:00
disable-firewall Merge "hadoop: add vanilla/2.6 based on CentOS 7" 2015-08-05 05:56:09 +00:00
extjs Improvements to README.rst of elements 2015-04-29 18:41:30 +02:00
fedora-mirror Improvements to README.rst of elements 2015-04-29 18:41:30 +02:00
hadoop hadoop: move scripts from post-install.d to install.d 2015-09-18 16:15:56 +02:00
hadoop-cdh Deprecate Spark 0.x and 1.0.x images 2015-07-17 09:28:26 +00:00
hadoop-cloudera Merge "Added support of Spark 1.6.0" 2016-02-15 06:50:34 +00:00
hadoop-hdp Merge "Remove unneeded code from HDP element" 2015-05-28 03:14:28 +00:00
hadoop-mapr Add support of building images for MapR 5.1.0 2016-03-03 11:46:18 +00:00
hdp-local-mirror Use direct links to HDP distribution 2015-11-09 07:20:25 +00:00
hive Improvements to README.rst of elements 2015-04-29 18:41:30 +02:00
java Move JRE_PATH/bin export to separate file 2015-09-09 16:06:18 +03:00
mysql mysql: remove extra +x permissions from pkg-map file 2015-09-15 12:52:50 +02:00
nfs-shares NFS share utility installation 2015-07-28 13:07:13 -04:00
ntp Add elements for sync time on VM 2015-07-21 12:25:53 +00:00
oozie oozie: do not install some extra packages 2015-09-18 16:15:57 +02:00
openjdk Use JRE directory for JRE_HOME for RHEL-based os 2015-09-08 12:59:51 +03:00
oracle-java Make almost all the element scripts as e/u/pipefail 2015-04-07 12:46:39 +02:00
root-passwd Improvements to README.rst of elements 2015-04-29 18:41:30 +02:00
sahara-version/root.d Use tox env for building images 2015-05-05 14:49:08 +03:00
spark Deprecate Spark 0.x and 1.0.x images 2015-07-17 09:28:26 +00:00
ssh Add centos7 plain image 2015-07-08 09:40:51 +02:00
storm Add dib-lint checks for elements 2015-07-06 13:03:12 +03:00
swift_hadoop Add hadoop openstack swift jar to plugins images 2016-03-24 19:02:11 +03:00
xfs-tools Install xfsprogs for ability to formatting volumes in XFS FS 2015-08-18 08:50:53 +00:00
zookeeper Start switching to declarative package-installs 2015-04-15 00:44:54 +08:00
.gitignore Add a .gitignore. 2013-09-02 12:58:57 +04:00
README.rst Renaming all Savanna references to Sahara 2014-03-13 15:26:47 +04:00

README.rst

Diskimage-builder tools for creating cloud images

Steps to create a cloud image with Apache Hadoop installed using the diskimage-builder project:

  1. Clone the repository "https://github.com/openstack/diskimage-builder" locally. Note: make sure you have commit 43b96d91 in your clone; it provides a mapping for default-jre.
git clone https://github.com/openstack/diskimage-builder
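One way to verify that commit is present in your clone (an optional check; assumes git 1.8 or newer):
(cd diskimage-builder && git merge-base --is-ancestor 43b96d91 HEAD) && echo "commit 43b96d91 is present"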
  2. Add the ~/diskimage-builder/bin/ directory to your PATH (for example, PATH=$PATH:/home/$USER/diskimage-builder/bin/ ).
  3. Export the ELEMENTS_PATH=/home/$USER/diskimage-builder/elements/ variable in your .bashrc, then source it.
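For example, assuming diskimage-builder was cloned into your home directory (adjust the paths otherwise), steps 2 and 3 can be done with:
echo 'export PATH=$PATH:$HOME/diskimage-builder/bin/' >> ~/.bashrc
echo 'export ELEMENTS_PATH=$HOME/diskimage-builder/elements/' >> ~/.bashrc
source ~/.bashrc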
  4. Copy the file "img-build-sudoers" from ~/diskimage-builder/sudoers.d/ to your /etc/sudoers.d/ and set its ownership and permissions:
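For example, assuming diskimage-builder is cloned in your home directory, the copy can be done with:
sudo cp ~/diskimage-builder/sudoers.d/img-build-sudoers /etc/sudoers.d/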
chmod 440 /etc/sudoers.d/img-build-sudoers
chown root:root /etc/sudoers.d/img-build-sudoers
  5. Export the sahara-elements commit id variable (from the sahara-extra directory):
export SAHARA_ELEMENTS_COMMIT_ID=`git show --format=%H | head -1`
  6. Move the elements/ directory to diskimage-builder/elements/:
mv elements/*  /path_to_disk_image_builder/diskimage-builder/elements/
  7. Export the DIB commit id variable (from the DIB directory):
export DIB_COMMIT_ID=`git show --format=%H | head -1`
  8. Run one of the following commands to create a cloud image that can run on OpenStack:

8.1. Ubuntu cloud image

JAVA_FILE=jdk-7u21-linux-x64.tar.gz DIB_HADOOP_VERSION=1.2.1 OOZIE_FILE=oozie-4.0.0.tar.gz disk-image-create base vm hadoop oozie ubuntu root-passwd -o ubuntu_hadoop_1_2_1

8.2. Fedora cloud image

JAVA_FILE=jdk-7u21-linux-x64.tar.gz DIB_HADOOP_VERSION=1.2.1 OOZIE_FILE=oozie-4.0.0.tar.gz DIB_IMAGE_SIZE=10 disk-image-create base vm fedora hadoop root-passwd oozie -o fedora_hadoop_1_2_1

Note: If you are building the image on an Ubuntu or Fedora 18 host, you should add the 'selinux-permissive' element:

JAVA_FILE=jdk-7u21-linux-x64.tar.gz DIB_HADOOP_VERSION=1.2.1 OOZIE_FILE=oozie-4.0.0.tar.gz DIB_IMAGE_SIZE=10 disk-image-create base vm fedora hadoop root-passwd oozie selinux-permissive -o fedora_hadoop_1_2_1

In these commands, the 'DIB_HADOOP_VERSION' parameter is the version of Hadoop to be installed. You can use the 'JAVA_DOWNLOAD_URL' parameter to specify a download link for the JDK (tarball or bin). The 'DIB_IMAGE_SIZE' parameter specifies the size of the instance's hard disk; you need to set it because Fedora and CentOS do not use all of the available volume. If you have already downloaded the JDK package, move it to "elements/hadoop/install.d/" and pass its filename as the 'JAVA_FILE' parameter.

For the EDP components to work with Sahara DIB images, you need pre-installed Oozie libs. Use 'OOZIE_DOWNLOAD_URL' to specify a link to an Oozie archive (tar.gz); for example, we have built Oozie libs here: http://sahara-files.mirantis.com/oozie-4.0.0.tar.gz. If you have already downloaded the archive, move it to "elements/oozie/install.d/" and pass its filename as the 'OOZIE_FILE' parameter.
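As an illustration only (a sketch; substitute a real JDK link for the placeholder), the download-URL parameters can be used in place of the local files from step 8.1:

JAVA_DOWNLOAD_URL=<link to JDK tarball> DIB_HADOOP_VERSION=1.2.1 OOZIE_DOWNLOAD_URL=http://sahara-files.mirantis.com/oozie-4.0.0.tar.gz disk-image-create base vm hadoop oozie ubuntu root-passwd -o ubuntu_hadoop_1_2_1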