f376b0f480
Update the Spark element to use the existing hadoop-cloudera element for HDFS for Spark versions > 1.0, instead of the ad-hoc cloudera-cdh one. For Spark 1.0.2, CDH4 via the old hadoop-cdh element is used, since a precompiled binary for CDH5 is not available.

This change also makes it possible to specify an arbitrary Spark version via the new -s commandline switch, reducing the amount of code needed to support future versions of Spark. The defaults for Spark are 1.3.1 and CDH 5.3, a combination that works well in our deployments.

A small change is needed in the cloudera element: when creating a Spark image, only the HDFS packages have to be installed.

README files have been updated to clarify that the default versions are tested, while other combinations are not. A reference to the SparkPlugin wiki page was added, pointing to a table of supported versions.

Change-Id: Ifc2a0c8729981e1e1df79b556a4c2e6bd1ba893a
Implements: blueprint support-spark-1-3
Depends-On: I8fa482b6d1d6abaa6633aec309a3ba826a8b7ebb
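As an illustration of the new switch described above (a sketch only; the flag names are taken from the commit message, and the exact invocation may differ by release), building a Spark image with a pinned version might look like:

```shell
# Hypothetical invocation: select the spark plugin with -p and pin the
# Spark version with the new -s switch (defaults: Spark 1.3.1, CDH 5.3).
./diskimage-create.sh -p spark -s 1.3.1
```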
Element contents:

- environment.d
- install.d
- post-install.d
- pre-install.d
- README.rst
- element-deps
- package-installs.yaml
hadoop-cloudera
===============
Installs the Cloudera packages (cloudera-manager-agent, cloudera-manager-daemons, cloudera-manager-server, cloudera-manager-server-db-2, hadoop-hdfs-namenode, hadoop-hdfs-secondarynamenode, hadoop-hdfs-datanode, hadoop-yarn-resourcemanager, hadoop-yarn-nodemanager, hadoop-mapreduce, hadoop-mapreduce-historyserver) and the Java package (oracle-j2sdk1.7) from the Cloudera cdh5 and cm5 repositories.
In order to create the Cloudera images with ``diskimage-create.sh``, use the following syntax to select the ``cloudera`` plugin:

::

  diskimage-create.sh -p cloudera
Environment Variables
---------------------

The element can be configured by exporting variables using an ``environment.d`` script.

CDH_HDFS_ONLY
  Required: No

  Description: If set, only the namenode and datanode packages, together with their dependencies, will be installed.
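For illustration, here is a minimal sketch (not the element's actual install script) of how the ``CDH_HDFS_ONLY`` flag could gate package selection; the package names are taken from the element description above and abbreviated for brevity:

```shell
#!/bin/bash
# Hypothetical sketch: choose the package set based on CDH_HDFS_ONLY.
# The real element reads the flag from an environment.d script.
select_packages() {
    if [ -n "${CDH_HDFS_ONLY:-}" ]; then
        # HDFS-only mode (used for Spark images): just namenode and datanode.
        echo "hadoop-hdfs-namenode hadoop-hdfs-datanode"
    else
        # Full install: manager, HDFS, YARN and MapReduce packages.
        echo "cloudera-manager-server hadoop-hdfs-namenode hadoop-hdfs-datanode hadoop-yarn-resourcemanager hadoop-mapreduce"
    fi
}

CDH_HDFS_ONLY=1 select_packages
# prints "hadoop-hdfs-namenode hadoop-hdfs-datanode"
```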