Improvements to README.rst of elements

Improve the formatting of README.rst of elements:
- proper syntax for bash snippets
- improve list formatting
- use list definitions for environment variables
- remove the explicit bash usage from diskimage-create.sh invocations
- wrap lines at 80 characters
- various small text formattings

Change-Id: I93b4c35b31dac66af22663b2cec426a08fa73bd8
Pino Toscano 2015-04-29 18:41:30 +02:00
parent 3850d6fcdb
commit 939055eff0
16 changed files with 200 additions and 81 deletions


@@ -2,5 +2,12 @@
 apt-mirror
 ==========
 
-This element setups mirror for updating Ubuntu cloud image. Using mirror increases speed of building image.
-You should specify http url for Ubuntu mirror using parameter 'UBUNTU_MIRROR'.
+This element sets up the mirror for updating the Ubuntu cloud image.
+Using a mirror improves the speed of the image building.
+
+Environment Variables
+---------------------
+
+UBUNTU_MIRROR
+  :Required: Yes
+  :Description: URL to the Ubuntu mirror.
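The new ``UBUNTU_MIRROR`` variable can be exported before the build, e.g. from an ``environment.d``-style script. A minimal sketch, where the mirror URL is only a placeholder example:

```shell
# Hypothetical example: point UBUNTU_MIRROR at a mirror near you
# before invoking the image build.
export UBUNTU_MIRROR="http://archive.ubuntu.com/ubuntu/"
echo "Ubuntu mirror: $UBUNTU_MIRROR"
```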


@@ -2,5 +2,12 @@
 centos-mirror
 =============
 
-This element setups mirror for updating CentOS cloud image. Using mirror increases speed of building image.
-You should specify http url for CentOS mirror using parameter 'CENTOS_MIRROR'.
+This element sets up the mirror for updating the CentOS cloud image.
+Using a mirror improves the speed of the image building.
+
+Environment Variables
+---------------------
+
+CENTOS_MIRROR
+  :Required: Yes
+  :Description: URL to the CentOS mirror.


@@ -4,7 +4,8 @@ disable-firewall
 This element disables all firewalls on the image.
 
-Recognized firewalls -
-. iptables
-. ip6tables
-. firewalld
+Recognized firewalls:
+
+- iptables
+- ip6tables
+- firewalld


@@ -5,24 +5,26 @@ extjs
 This element downloads extjs from its website, caching it so it is
 not downloaded every time, and optionally unpacking it.
 
-Configuration
--------------
+Environment Variables
+---------------------
 
 The element can be configured by exporting variables using a
-`environment.d` script; variables with ``*`` are mandatory:
+`environment.d` script.
 
-* EXTJS\_DESTINATION\_DIR ``*``
-  The directory where to extract (or copy) extjs. Mandatory, must be
-  an absolute directory within the image, e.g. ``/usr/share/someapp``.
-  The directory is created if not existing already.
+EXTJS_DESTINATION_DIR
+  :Required: Yes
+  :Description: The directory where to extract (or copy) extjs; must be
+    an absolute directory within the image. The directory is created if not
+    existing already.
+  :Example: ``EXTJS_DESTINATION_DIR=/usr/share/someapp``
 
-* EXTJS\_DOWNLOAD\_URL
-  The URL from where to download extjs. Defaults to
-  ``http://dev.sencha.com/deploy/ext-2.2.zip``.
+EXTJS_DOWNLOAD_URL
+  :Required: No
+  :Default: ``http://dev.sencha.com/deploy/ext-2.2.zip``
+  :Description: The URL from where to download extjs.
 
-* EXTJS\_NO\_UNPACK
-  If set to 1, then the extjs tarball is simply copied to the location
-  specified by EXTJS\_DESTINATION\_DIR.
+EXTJS_NO_UNPACK
+  :Required: No
+  :Default: *unset*
+  :Description: If set to 1, then the extjs tarball is simply copied to the
+    location specified by ``EXTJS_DESTINATION_DIR``.
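As a sketch, an ``environment.d`` script for this element could export the variables like this (both values are examples only, matching the defaults documented above):

```shell
# Example environment.d-style fragment; both values are examples.
export EXTJS_DESTINATION_DIR="/usr/share/someapp"
export EXTJS_DOWNLOAD_URL="http://dev.sencha.com/deploy/ext-2.2.zip"
# EXTJS_NO_UNPACK is left unset, so the archive gets extracted
# into EXTJS_DESTINATION_DIR instead of being copied verbatim.
echo "extjs goes to: $EXTJS_DESTINATION_DIR"
```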


@@ -2,5 +2,12 @@
 fedora-mirror
 =============
 
-This element setups mirror for updating Fedora cloud image. Using mirror increases speed of building image.
-You should specify http url for Fedora mirror using parameter 'FEDORA_MIRROR'.
+This element sets up the mirror for updating the Fedora cloud image.
+Using a mirror improves the speed of the image building.
+
+Environment Variables
+---------------------
+
+FEDORA_MIRROR
+  :Required: Yes
+  :Description: URL to the Fedora mirror.


@@ -2,10 +2,17 @@
 hadoop-cloudera
 ===============
 
-Installs cloudera (cloudera-manager-agent cloudera-manager-daemons cloudera-manager-server cloudera-manager-server-db-2 hadoop-hdfs-namenode hadoop-hdfs-secondarynamenode hadoop-hdfs-datanode hadoop-yarn-resourcemanager hadoop-yarn-nodemanager hadoop-mapreduce hadoop-mapreduce-historyserver) and java (oracle-j2sdk1.7) packages from cloudera repositories `cdh5 <http://archive-primary.cloudera.com/cdh5/>`_ and `cm5 <http://archive-primary.cloudera.com/cm5>`_.
+Installs cloudera (cloudera-manager-agent cloudera-manager-daemons
+cloudera-manager-server cloudera-manager-server-db-2 hadoop-hdfs-namenode
+hadoop-hdfs-secondarynamenode hadoop-hdfs-datanode hadoop-yarn-resourcemanager
+hadoop-yarn-nodemanager hadoop-mapreduce hadoop-mapreduce-historyserver) and
+Java (oracle-j2sdk1.7) packages from cloudera repositories
+`cdh5 <http://archive-primary.cloudera.com/cdh5/>`_ and
+`cm5 <http://archive-primary.cloudera.com/cm5>`_.
 
-In order to create the Cloudera images with the diskimage-create.sh script, use the following syntax to select the "cloudera" plugin:
+In order to create the Cloudera images with ``diskimage-create.sh``, use the
+following syntax to select the ``cloudera`` plugin:
 
 .. sourcecode:: bash
 
-    sudo bash diskimage-create.sh -p cloudera
+    diskimage-create.sh -p cloudera


@@ -4,19 +4,34 @@ hadoop-hdp
 Installs the JDK, the Hortonworks Data Platform, and Apache Ambari.
 
-Please set the DIB_HDP_VERSION environment variable to configure the install to use a given version. The default script (mentioned below) sets this variable for each supported version.
-
-Currently, the following versions of the Hortonworks Data Platform are supported for image building:
-1.3
-2.0
+Currently, the following versions of the Hortonworks Data Platform are
+supported for image building:
+
+- 1.3
+- 2.0
 
 The following script:
 
-sahara-image-elements/diskimage-create/diskimage-create.sh
+.. code:: bash
+
+    diskimage-create/diskimage-create.sh
 
-is the default script to use for creating CentOS images with HDP installed/configured. This script can be used without modification, or can be used as an example to describe how a more customized script may be created with the "hadoop-hdp" diskimage-builder element.
+is the default script to use for creating CentOS images with HDP
+installed/configured. This script can be used without modification, or can
+be used as an example to describe how a more customized script may be created
+with the ``hadoop-hdp`` element.
 
-In order to create the HDP images with the diskimage-create.sh script, use the following syntax to select the "hdp" plugin:
+In order to create the HDP images with ``diskimage-create.sh``, use the
+following syntax to select the ``hdp`` plugin:
 
 .. code:: bash
 
     diskimage-create.sh -p hdp
+
+Environment Variables
+---------------------
+
+DIB_HDP_VERSION
+  :Required: Yes
+  :Description: Version of the Hortonworks Data Platform to install.
+  :Example: ``DIB_HDP_VERSION=2.0``
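A minimal sketch of selecting a version before the build (the version value is one of the two supported releases listed above):

```shell
# Hypothetical example: pick one of the supported HDP versions (1.3 or 2.0)
# before running diskimage-create.sh -p hdp.
export DIB_HDP_VERSION="2.0"
echo "HDP version: $DIB_HDP_VERSION"
```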


@@ -2,24 +2,38 @@
 hadoop-mapr
 ===========
 
-Creates images with local mirrors of MapR repositories: `core <http://package.mapr.com/releases/>`_ and `ecosystem <http://package.mapr.com/releases/ecosystem-4.x/>`_.
-Installs `OpenJDK <http://http://openjdk.java.net/>`_ and `Scala <http://www.scala-lang.org/>`_.
+Creates images with local mirrors of MapR repositories:
+`core <http://package.mapr.com/releases/>`_ and
+`ecosystem <http://package.mapr.com/releases/ecosystem-4.x/>`_.
+Installs `OpenJDK <http://openjdk.java.net/>`_ and
+`Scala <http://www.scala-lang.org/>`_.
 
-In order to create the MapR images with the diskimage-create.sh script, use the following syntax to select the "MapR" plugin:
+In order to create the MapR images with ``diskimage-create.sh``, use the
+following syntax to select the ``MapR`` plugin:
 
 .. sourcecode:: bash
 
-    bash diskimage-create.sh -p mapr [-i ubuntu|centos] [-r 3.1.1|4.0.1|4.0.2]
+    diskimage-create.sh -p mapr [-i ubuntu|centos] [-r 3.1.1|4.0.1|4.0.2]
 
 NOTE: By default MapR 4.0.1 version will be used
 
-In order to speed up image creation process you can download archives with MapR repositories and specify environment variable:
-``DIB_MAPR_CORE_DEB_REPO``, ``DIB_MAPR_CORE_RPM_REPO``, ``DIB_MAPR_ECO_DEB_REPO``, ``DIB_MAPR_ECO_RPM_REPO``
+In order to speed up image creation process you can download archives with MapR
+repositories and specify environment variables:
+``DIB_MAPR_CORE_DEB_REPO``, ``DIB_MAPR_CORE_RPM_REPO``,
+``DIB_MAPR_ECO_DEB_REPO``, ``DIB_MAPR_ECO_RPM_REPO``.
 
 For example:
 
 .. sourcecode:: bash
 
     export DIB_MAPR_CORE_DEB_REPO="file://<path-to-archive>/mapr-v4.0.1GA.deb.tgz"
     export DIB_MAPR_CORE_RPM_REPO="file://<path-to-archive>/mapr-v4.0.1GA.rpm.tgz"
     export DIB_MAPR_ECO_DEB_REPO="http://<URL>/mapr-ecosystem.deb.tgz"
     export DIB_MAPR_ECO_RPM_REPO="http://<URL>/mapr-ecosystem.rpm.tgz"
-    bash diskimage-create.sh -p mapr -r 4.0.1
+    diskimage-create.sh -p mapr -r 4.0.1
+
+Environment Variables
+---------------------
+
+DIB_MAPR_VERSION
+  :Required: Yes
+  :Description: Version of MapR to install.
+  :Example: ``DIB_MAPR_VERSION=4.0.1``


@@ -2,27 +2,35 @@
 hadoop
 ======
 
-Installs Java and Hadoop, configures SSH
-========================================
+Installs Java and Hadoop, configures SSH.
 
 HOWTO build Hadoop Native Libs
 ------------------------------
 
-+ Install: *jdk >= 6*, *maven*, *cmake* and *protobuf >= 2.5.0*
-+ Get Hadoop source code:
-```sh
-$ wget http://archive.apache.org/dist/hadoop/core/hadoop-2.6.0/hadoop-2.6.0-src.tar.gz
-```
-+ Unpack source
-```sh
-$ tar xvf hadoop-2.6.0-src.tar.gz
-```
-+ Build Hadoop
-```sh
-$ cd hadoop-2.6.0-src
-$ mvn package -Pdist,native -DskipTests
-```
-+ Create tarball with Hadoop Native Libs
-```sh
-$ cd hadoop-dist/target/hadoop-2.6.0/lib
-$ tar -czvf hadoop-native-libs-2.6.0.tar.gz native
-```
+- Install: *jdk >= 6*, *maven*, *cmake* and *protobuf >= 2.5.0*
+- Get Hadoop source code:
+
+  .. code:: bash
+
+      wget http://archive.apache.org/dist/hadoop/core/hadoop-2.6.0/hadoop-2.6.0-src.tar.gz
+
+- Unpack source:
+
+  .. code:: bash
+
+      tar xvf hadoop-2.6.0-src.tar.gz
+
+- Build Hadoop:
+
+  .. code:: bash
+
+      cd hadoop-2.6.0-src
+      mvn package -Pdist,native -DskipTests
+
+- Create tarball with Hadoop Native Libs:
+
+  .. code:: bash
+
+      cd hadoop-dist/target/hadoop-2.6.0/lib
+      tar -czvf hadoop-native-libs-2.6.0.tar.gz native


@@ -3,5 +3,19 @@ hive
 ====
 
 Installs Hive on Ubuntu and Fedora.
 
-Hive stores metadata in MySQL databases. So, this element requires 'mysql' element.
-You can specify download link for Hive using parameter 'HIVE_DOWNLOAD_URL' or choose Hive version using parameter 'HIVE_VERSION'.
+Hive stores metadata in MySQL databases. So, this element requires the
+``mysql`` element.
+
+Environment Variables
+---------------------
+
+HIVE_VERSION
+  :Required: Yes, if ``HIVE_DOWNLOAD_URL`` is not set.
+  :Description: Version of Hive to fetch from apache.org.
+  :Example: ``HIVE_VERSION=0.11.0``
+
+HIVE_DOWNLOAD_URL
+  :Required: Yes, if ``HIVE_VERSION`` is not set.
+  :Default: ``http://archive.apache.org/dist/hive/hive-$HIVE_VERSION/hive-$HIVE_VERSION-bin.tar.gz``
+  :Description: Download URL of the Hive package.
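The default download URL is built from ``HIVE_VERSION`` by plain shell expansion; a minimal sketch of that expansion, using the example version above:

```shell
# Sketch of how the default download URL expands from HIVE_VERSION;
# the version number is an example.
HIVE_VERSION="0.11.0"
HIVE_DOWNLOAD_URL="http://archive.apache.org/dist/hive/hive-$HIVE_VERSION/hive-$HIVE_VERSION-bin.tar.gz"
echo "$HIVE_DOWNLOAD_URL"
```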


@@ -2,5 +2,7 @@
 mysql
 =====
 
-This element setups basic components of MySQL on Ubuntu and Fedora.
-It is light version of original MySQL element (https://github.com/openstack/tripleo-image-elements/tree/master/elements/mysql).
+This element sets up the basic components of MySQL.
+It is a light version of the original MySQL element
+(https://github.com/openstack/tripleo-image-elements/tree/master/elements/mysql).


@@ -2,4 +2,4 @@
 oozie
 =====
 
-Oozie deployment
+Oozie deployment.


@@ -6,3 +6,10 @@ Assign a password to root.
 
 This is useful when booting outside of a cloud environment (e.g. manually via
 kvm).
+
+Environment Variables
+---------------------
+
+DIB_PASSWORD
+  :Required: Yes
+  :Description: The password for the root user.
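A one-line sketch of setting the variable before the build (the value is a placeholder; use a strong, private password in practice):

```shell
# Hypothetical example only: never use a literal example password
# for a real image.
export DIB_PASSWORD="example-root-password"
echo "root password is set: ${DIB_PASSWORD:+yes}"
```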


@@ -2,12 +2,27 @@
 spark
 =====
 
-Installs Spark on Ubuntu. Requires Hadoop CDH 4 (hadoop-cdh element).
+Installs Spark on Ubuntu. Requires Hadoop CDH 4 (``hadoop-cdh`` element).
 
-It will install a version of Spark known to be compatible with CDH 4
-This behaviour can be controlled also by using 'DIB_SPARK_VERSION' or directly with
-'SPARK_DOWNLOAD_URL'
+It will install a version of Spark known to be compatible with CDH 4;
+this behaviour can be controlled also by using ``DIB_SPARK_VERSION`` or
+directly with ``SPARK_DOWNLOAD_URL``.
 
 If you set 'SPARK_CUSTOM_DISTRO' to 1, you can point the 'SPARK_DOWNLOAD_URL'
 variable to a custom Spark distribution created with the make-distribution.sh
 script included in Spark.
+
+Environment Variables
+---------------------
+
+DIB_HADOOP_VERSION
+  :Required: Yes, if ``SPARK_DOWNLOAD_URL`` is not set.
+  :Description: Version of the Hadoop platform. See also
+    http://spark.apache.org/docs/latest/hadoop-third-party-distributions.html.
+  :Example: ``DIB_HADOOP_VERSION=CDH4``
+
+DIB_SPARK_VERSION
+  :Required: No
+  :Default: Depends on ``DIB_HADOOP_VERSION``.
+  :Description: Version of Spark to download from apache.org.
+
+SPARK_DOWNLOAD_URL
+  :Required: Yes, if ``DIB_HADOOP_VERSION`` is not set.
+  :Default: ``http://archive.apache.org/dist/spark/spark-$DIB_SPARK_VERSION/spark-$DIB_SPARK_VERSION-bin-$SPARK_HADOOP_DL.tgz``
+  :Description: Download URL of the Spark package.
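The default URL is again built by shell expansion. A minimal sketch; both the Spark version and the ``SPARK_HADOOP_DL`` value (which the element derives from ``DIB_HADOOP_VERSION``) are assumptions for illustration:

```shell
# Sketch of the default URL expansion. SPARK_HADOOP_DL is derived by the
# element from DIB_HADOOP_VERSION; the values here are assumptions.
DIB_SPARK_VERSION="1.0.0"
SPARK_HADOOP_DL="cdh4"
SPARK_DOWNLOAD_URL="http://archive.apache.org/dist/spark/spark-$DIB_SPARK_VERSION/spark-$DIB_SPARK_VERSION-bin-$SPARK_HADOOP_DL.tgz"
echo "$SPARK_DOWNLOAD_URL"
```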


@@ -3,4 +3,4 @@ ssh
 ===
 
 This element installs an SSH server then configures it to be suitable
-for use with Sahara
+for use with Sahara.


@@ -2,4 +2,17 @@
 swift_hadoop
 ============
 
-You can add your own Swift into image. Use 'swift_url' to specify download link for Swift jar file.
+Install a Swift jar file into the image.
+
+Environment Variables
+---------------------
+
+swift_url
+  :Required: No
+  :Default: http://sahara-files.mirantis.com/hadoop-swift/hadoop-swift-latest.jar
+  :Description: Location of the Swift jar file.
+
+HDFS_LIB_DIR
+  :Required: No
+  :Default: /usr/lib/hadoop
+  :Description: Directory in the guest where to save the Swift jar.
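Both variables are optional and can be overridden together; a minimal sketch where the URL and the library directory are hypothetical examples, not the documented defaults:

```shell
# Hypothetical overrides for the two variables; both values are examples.
export swift_url="http://example.com/hadoop-swift/hadoop-swift-latest.jar"
export HDFS_LIB_DIR="/usr/lib/hadoop-mapreduce"
echo "$swift_url will be saved to $HDFS_LIB_DIR"
```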