sahara-image-elements/elements
Pino Toscano cea2cb1850 Fix extjs.zip download URL
Switch to a different URL, found here [1], which works and points to the
same ext-2.2.zip as used so far.

[1] https://issues.cloudera.org/browse/DISTRO-476

Change-Id: I4a74c0d4d870683c4e4cc311736cd74349563fab
Closes-Bug: #1444348
(cherry picked from commit 2d7defcaca)
2015-04-17 17:16:28 +02:00
apt-mirror Make almost all the element scripts as e/u/pipefail 2015-04-07 12:46:39 +02:00
centos-mirror Make almost all the element scripts as e/u/pipefail 2015-04-07 12:46:39 +02:00
disable-firewall Make almost all the element scripts as e/u/pipefail 2015-04-07 12:46:39 +02:00
extjs Fix extjs.zip download URL 2015-04-17 17:16:28 +02:00
fedora-mirror Make almost all the element scripts as e/u/pipefail 2015-04-07 12:46:39 +02:00
hadoop Add tar and wget installation to setup-hadoop script 2015-04-08 13:09:28 +03:00
hadoop-cdh Fix "unbound variable" for Spark plugin 2015-04-08 15:13:30 +03:00
hadoop-cloudera Switch to the extjs element 2015-04-08 18:59:11 +02:00
hadoop-hdp Make almost all the element scripts as e/u/pipefail 2015-04-07 12:46:39 +02:00
hadoop-mapr Switch to the extjs element 2015-04-08 18:59:11 +02:00
hive Merge "Make almost all the element scripts as e/u/pipefail" 2015-04-07 16:37:24 +00:00
java Make almost all the element scripts as e/u/pipefail 2015-04-07 12:46:39 +02:00
mysql Merge "Make almost all the element scripts as e/u/pipefail" 2015-04-07 16:37:24 +00:00
oozie Merge "Make almost all the element scripts as e/u/pipefail" 2015-04-07 16:37:24 +00:00
openjdk Simplify environment.d scripts 2015-04-01 15:44:38 +02:00
oracle-java Make almost all the element scripts as e/u/pipefail 2015-04-07 12:46:39 +02:00
redhat-lsb/pre-install.d Make almost all the element scripts as e/u/pipefail 2015-04-07 12:46:39 +02:00
root-passwd Make almost all the element scripts as e/u/pipefail 2015-04-07 12:46:39 +02:00
sahara-version/install.d Make almost all the element scripts as e/u/pipefail 2015-04-07 12:46:39 +02:00
spark Merge "Make almost all the element scripts as e/u/pipefail" 2015-04-07 16:37:24 +00:00
ssh Make almost all the element scripts as e/u/pipefail 2015-04-07 12:46:39 +02:00
storm Change supervisor conf file path 2015-04-07 15:16:56 -03:00
swift_hadoop Make almost all the element scripts as e/u/pipefail 2015-04-07 12:46:39 +02:00
updater/install.d Make almost all the element scripts as e/u/pipefail 2015-04-07 12:46:39 +02:00
zookeeper Make almost all the element scripts as e/u/pipefail 2015-04-07 12:46:39 +02:00
.gitignore Add a .gitignore. 2013-09-02 12:58:57 +04:00
README.rst Renaming all Savanna references to Sahara 2014-03-13 15:26:47 +04:00

README.rst

Diskimage-builder tools for creating cloud images

Steps to create a cloud image with Apache Hadoop installed, using the diskimage-builder project:

  1. Clone the repository "https://github.com/openstack/diskimage-builder" locally. Note: make sure you have commit 43b96d91 in your clone; it provides a mapping for default-jre.
git clone https://github.com/openstack/diskimage-builder
  2. Add the ~/diskimage-builder/bin/ directory to your PATH (see the example below).
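For example, assuming the repository was cloned into your home directory:
export PATH=$PATH:/home/$USER/diskimage-builder/bin/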
  3. Add the variable ELEMENTS_PATH=/home/$USER/diskimage-builder/elements/ to your .bashrc, then source it (see the example below).
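For example, assuming the same clone location:
echo "export ELEMENTS_PATH=/home/$USER/diskimage-builder/elements/" >> ~/.bashrc
source ~/.bashrc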
  4. Copy the file "img-build-sudoers" from ~/diskimage-builder/sudoers.d/ to /etc/sudoers.d/ and set its permissions and ownership (the commands below need root privileges):
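A copy command such as the following should work (path assumed from the step above; prefix with sudo if you are not root):
cp ~/diskimage-builder/sudoers.d/img-build-sudoers /etc/sudoers.d/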
chmod 440 /etc/sudoers.d/img-build-sudoers
chown root:root /etc/sudoers.d/img-build-sudoers
  5. Export the sahara-elements commit id variable (from the sahara-extra directory):
export SAHARA_ELEMENTS_COMMIT_ID=`git show --format=%H | head -1`
  6. Move the elements/ directory to diskimage-builder/elements/:
mv elements/*  /path_to_disk_image_builder/diskimage-builder/elements/
  7. Export the DIB commit id variable (from the DIB directory):
export DIB_COMMIT_ID=`git show --format=%H | head -1`
  8. Run one of the following commands to create a cloud image that is able to run on OpenStack:

8.1. Ubuntu cloud image

JAVA_FILE=jdk-7u21-linux-x64.tar.gz DIB_HADOOP_VERSION=1.2.1 OOZIE_FILE=oozie-4.0.0.tar.gz disk-image-create base vm hadoop oozie ubuntu root-passwd -o ubuntu_hadoop_1_2_1

8.2. Fedora cloud image

JAVA_FILE=jdk-7u21-linux-x64.tar.gz DIB_HADOOP_VERSION=1.2.1 OOZIE_FILE=oozie-4.0.0.tar.gz DIB_IMAGE_SIZE=10 disk-image-create base vm fedora hadoop root-passwd oozie -o fedora_hadoop_1_2_1

Note: If you are building this image on an Ubuntu or Fedora 18 host, you should add the 'selinux-permissive' element:

JAVA_FILE=jdk-7u21-linux-x64.tar.gz DIB_HADOOP_VERSION=1.2.1 OOZIE_FILE=oozie-4.0.0.tar.gz DIB_IMAGE_SIZE=10 disk-image-create base vm fedora hadoop root-passwd oozie selinux-permissive -o fedora_hadoop_1_2_1

In these commands, the 'DIB_HADOOP_VERSION' parameter is the version of Hadoop to be installed. You can use the 'JAVA_DOWNLOAD_URL' parameter to specify a download link for the JDK (tarball or bin). 'DIB_IMAGE_SIZE' specifies the size of the instance's hard disk; you need to set it because Fedora and CentOS do not use all of the available volume by default. If you have already downloaded the JDK package, move it to "elements/hadoop/install.d/" and use its filename as the 'JAVA_FILE' parameter.

For EDP components to work with Sahara DIB images, you need pre-installed Oozie libs. Use 'OOZIE_DOWNLOAD_URL' to specify a link to an Oozie archive (tar.gz); for example, we have built Oozie libs here: http://sahara-files.mirantis.com/oozie-4.0.0.tar.gz. If you have already downloaded the archive, move it to "elements/oozie/install.d/" and use its filename as the 'OOZIE_FILE' parameter.
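For reference, a download-based variant of the Ubuntu command above might look like the following sketch (substitute a real JDK download link of your own; the Oozie URL is the example archive mentioned above):

JAVA_DOWNLOAD_URL=<link to JDK tarball or bin> OOZIE_DOWNLOAD_URL=http://sahara-files.mirantis.com/oozie-4.0.0.tar.gz DIB_HADOOP_VERSION=1.2.1 disk-image-create base vm hadoop oozie ubuntu root-passwd -o ubuntu_hadoop_1_2_1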