Merge "Rename Savanna to Sahara"

Jenkins authored 2014-03-17 21:11:27 +00:00; committed by Gerrit Code Review
commit 97a145c420
5 changed files with 24 additions and 24 deletions


@@ -1,12 +1,12 @@
-Savanna Style Commandments
-==========================
+Sahara Style Commandments
+=========================
 - Step 1: Read the OpenStack Style Commandments
 http://docs.openstack.org/developer/hacking/
 - Step 2: Read on
-Savanna Specific Commandments
------------------------------
+Sahara Specific Commandments
+----------------------------
 None so far


@@ -1,9 +1,9 @@
-Savanna-extra project
-=====================
+Sahara-extra project
+====================
-Savanna-extra is place for Savanna components not included into the main `Savanna repository <https://github.com/stackforge/savanna>`_
+Sahara-extra is place for Sahara components not included into the main `Sahara repository <https://github.com/openstack/sahara>`_
 Here is the list of components:
-* Sources for Swift filesystem implementation for Hadoop: https://github.com/stackforge/savanna-extra/blob/master/hadoop-swiftfs/README.rst
-* `Diskimage-builder <https://github.com/stackforge/diskimage-builder>`_ elements moved to the new repo: https://github.com/stackforge/savanna-image-elements
+* Sources for Swift filesystem implementation for Hadoop: https://github.com/openstack/sahara-extra/blob/master/hadoop-swiftfs/README.rst
+* `Diskimage-builder <https://github.com/openstack/diskimage-builder>`_ elements moved to the new repo: https://github.com/openstack/sahara-image-elements


@@ -43,8 +43,8 @@ To run this example from Oozie, you will need to modify the ``job.properties`` f
 to specify the correct ``jobTracker`` and ``nameNode`` addresses for your cluster.
 You will also need to modify the ``workflow.xml`` file to contain the correct input
-and output paths. These paths may be Savanna swift urls or hdfs paths. If swift
-urls are used, set the ``fs.swift.service.savanna.username`` and ``fs.swift.service.savanna.password``
+and output paths. These paths may be Sahara swift urls or hdfs paths. If swift
+urls are used, set the ``fs.swift.service.sahara.username`` and ``fs.swift.service.sahara.password``
 properties in the ``<configuration>`` section.
 1) Upload the ``wordcount`` directory to hdfs
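
For context, the ``job.properties`` file referenced in this hunk usually carries little more than the cluster endpoints and the workflow path. A minimal sketch, with placeholder host names and an assumed HDFS location that are not part of this change:

    # Oozie job.properties sketch -- replace hosts and ports with your cluster's values
    nameNode=hdfs://namenode.example.com:8020
    jobTracker=jobtracker.example.com:8021
    queueName=default
    # assumed location of the wordcount directory uploaded in step 1
    oozie.wf.application.path=${nameNode}/user/${user.name}/wordcount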
@@ -55,12 +55,12 @@ properties in the ``<configuration>`` section.
 ``$ oozie job -oozie http://oozie_server:port/oozie -config wordcount/job.properties -run``
-3) Don't forget to create your swift input path! A Savanna swift url looks like *swift://container.savanna/object*
+3) Don't forget to create your swift input path! A Sahara swift url looks like *swift://container.sahara/object*
-Running from the Savanna UI
+Running from the Sahara UI
 ===========================
-Running the WordCount example from the Savanna UI is very similar to running a Pig, Hive,
+Running the WordCount example from the Sahara UI is very similar to running a Pig, Hive,
 or MapReduce job.
 1) Create a job binary that points to the ``edp-wordcount.jar`` file
@@ -69,8 +69,8 @@ or MapReduce job.
 a) Add the input and output paths to ``args``
-b) If swift input or output paths are used, set the ``fs.swift.service.savanna.username`` and ``fs.swift.service.savanna.password``
+b) If swift input or output paths are used, set the ``fs.swift.service.sahara.username`` and ``fs.swift.service.sahara.password``
 configuration values
-c) The Savanna UI will prompt for the required ``main_class`` value and the optional ``java_opts`` value
+c) The Sahara UI will prompt for the required ``main_class`` value and the optional ``java_opts`` value
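
Steps a) through c) map onto Sahara's EDP job settings for a Java job. A rough sketch of the assembled configuration expressed as JSON; the ``edp.java.main_class`` key and the overall ``configs``/``args`` layout are assumptions about Sahara's job_configs format, not part of this change:

    {
        "configs": {
            "edp.java.main_class": "org.apache.hadoop.examples.WordCount",
            "fs.swift.service.sahara.username": "swiftuser",
            "fs.swift.service.sahara.password": "swiftpassword"
        },
        "args": [
            "swift://user.sahara/input",
            "swift://user.sahara/output"
        ]
    }

The workflow.xml diff below carries the same main class, credentials, and argument values.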


@@ -27,17 +27,17 @@
 <value>${queueName}</value>
 </property>
 <property>
-<name>fs.swift.service.savanna.username</name>
+<name>fs.swift.service.sahara.username</name>
 <value>swiftuser</value>
 </property>
 <property>
-<name>fs.swift.service.savanna.password</name>
+<name>fs.swift.service.sahara.password</name>
 <value>swiftpassword</value>
 </property>
 </configuration>
 <main-class>org.apache.hadoop.examples.WordCount</main-class>
-<arg>swift://user.savanna/input</arg>
-<arg>swift://user.savanna/output</arg>
+<arg>swift://user.sahara/input</arg>
+<arg>swift://user.sahara/output</arg>
 </java>
 <ok to="end"/>
 <error to="fail"/>
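
The ``<arg>`` elements above point at ``swift://user.sahara/...`` paths, which ties back to the earlier reminder to create the swift input path before submitting the job. A hedged sketch using the python-swiftclient CLI, where the ``user`` container and ``input`` object names mirror the example URLs and the OS_* credentials are assumed to be exported in the environment:

    # assumes OS_AUTH_URL, OS_USERNAME, OS_PASSWORD and OS_TENANT_NAME are set
    $ swift post user            # create the container behind swift://user.sahara/...
    $ swift upload user input    # upload a local file named "input" as the job's input object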


@@ -1,7 +1,7 @@
 [metadata]
-name = savanna-extra
+name = sahara-extra
 version = 2014.1
-summary = Extras for Savanna: elements, hadoop-swiftfs
+summary = Extras for Sahara: hadoop-swiftfs
 description-file = README.rst
 license = Apache Software License
 classifiers =
@@ -11,11 +11,11 @@ classifiers =
     Operating System :: POSIX :: Linux
 author = OpenStack
 author-email = openstack-dev@lists.openstack.org
-home-page = https://savanna.readthedocs.org
+home-page = https://sahara.readthedocs.org
 [files]
 data_files =
-    share/savanna-elements = elements/*
+    share/sahara-elements = elements/*
 [global]
 setup-hooks = pbr.hooks.setup_hook