Commit Graph

9 Commits

Author SHA1 Message Date
Telles Nobrega 92686f5a10 Prepare Sahara core for plugin split
As part of the effort to make Sahara more user- and operator-friendly,
we are splitting the plugins out of Sahara core.

The main goal of this change is to facilitate installation,
maintenance, and upgrade of plugins. With the plugins outside
the main Sahara code, operators will be able to install a subset
of plugins, as well as upgrade to newer versions of the plugins
without having to wait for a new version of OpenStack to be
released. It also aims to make it easier for new contributors
to develop and maintain their own plugins.

Sahara Spec: https://specs.openstack.org/openstack/sahara-specs/specs/rocky/plugins-outside-sahara-core.html

Change-Id: I7ed0945fd82e37daaf6b29f947d3dba06db9d158
2019-01-10 22:18:24 -03:00
Telles Nobrega e1a36ee28c Updating Spark versions
We are adding new Spark version 2.3.0.

Change-Id: I3a1c8decdc17c2c9b63af29ee9199cf24f11e0e2
2018-06-26 09:16:56 -03:00
Trevor McKay e00a2bbdf7 Remove support for Spark 1.0.0
It was deprecated in Liberty and is marked
for removal in Mitaka.

Change-Id: I3ba6941b1e1aa6900b5f59ea52a0370577729d9e
Implements: blueprint remove-spark-100
2016-02-23 10:17:41 -05:00
Sergey Gotliv e7d6799155 Adding support for the Spark Shell job
Implements: blueprint edp-add-spark-shell-action
Change-Id: I6d2ec02f854ab2eeeab2413bb56f1a359a3837c1
2015-08-27 13:29:36 +03:00
Trevor McKay 1018a540a5 Ensure working dir is on driver class path for Spark/Swift
For Spark/Swift integration, we use a wrapper class to set up
the hadoop environment.  For this to succeed, the current
working directory must be on the classpath. Newer versions of
Spark have changed how the default classpath is generated, so
Sahara must ensure explicitly that the working dir will be
included.
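The guarantee described above can be sketched as a small helper (a hypothetical illustration, not Sahara's actual code) that prepends the working directory to a driver classpath when it is missing:

```python
def build_driver_classpath(existing=None):
    """Return a driver classpath guaranteed to include the current
    working directory (".").

    Hypothetical helper illustrating the fix described in this commit;
    the real Sahara implementation differs.
    """
    if not existing:
        return "."
    parts = existing.split(":")
    if "." in parts:
        # Working dir already present; leave the classpath untouched.
        return existing
    # Prepend "." so the Swift wrapper class can find files placed
    # in the job's working directory.
    return ".:" + existing
```

Regardless of how newer Spark versions assemble their default classpath, passing the result of such a helper (e.g. via spark-submit's `--driver-class-path` option) keeps the working directory visible to the driver.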

Change-Id: I6680bf8736cada93e87821ef37de3c3b4202ead4
Closes-Bug: #1486544
2015-08-21 21:52:54 +03:00
Alexander Aleksiyants 74159dfdd2 Spark job for Cloudera 5.3.0 and 5.4.0 added
Spark jobs in Cloudera 5.3.0 and 5.4.0 plugins are now supported.
Required unit tests have been added. Merged with current
master HEAD.

Change-Id: Ic8fde97e424e45c6f31f7794749793b26c844915
Implements: blueprint spark-jobs-for-cdh-5-3-0
2015-07-10 17:45:11 +03:00
Trevor McKay a7adef4708 Implement job-types endpoint support methods for Spark plugin
This change implements the optional methods in the Plugins SPI
to support the job-types endpoint for the Spark plugin.

Config hints at this point are unchanged. Additional work may be
needed to provide config-hints specific to Spark plugin versions.
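The optional methods this commit implements can be sketched roughly as follows; the class and method shapes here are illustrative assumptions, not a verbatim copy of the Plugins SPI:

```python
# Hypothetical sketch of the optional SPI methods backing the
# job-types endpoint for a Spark-style plugin.
class SparkPluginProvider:
    EDP_JOB_TYPES = ["Spark"]

    def get_edp_job_types(self, versions=None):
        # Map each requested plugin version to the EDP job types
        # it supports.
        result = {}
        for version in versions or []:
            result[version] = list(self.EDP_JOB_TYPES)
        return result

    def get_edp_config_hints(self, job_type, version):
        # Config hints are unchanged at this point (see commit
        # message); return an empty job_config skeleton.
        return {"job_config": {"configs": [], "args": []}}
```

The job-types endpoint can then aggregate these per-plugin answers without hard-coding knowledge of any one plugin.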

Partial-Implements: blueprint edp-job-types-endpoint
Change-Id: I1cd318da11c997119b192e7396969f89d8f0f216
2015-03-17 17:58:31 -04:00
Andrew Lazarev e55238a881 Moved validate_edp from plugin SPI to edp_engine
Now the EDP engine is fully responsible for validating data for
job execution.

Other changes:
* Removed API calls from validation to remove circular dependency
* Removed plugins patching in validation to allow non-vanilla
  plugins testing
* Renamed job_executor to job_execution
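The shape of the refactor can be sketched as follows; the class and method names are illustrative assumptions, not Sahara's exact code:

```python
# Hypothetical sketch: validation now lives on the EDP engine itself
# rather than in a validate_edp() hook on the plugin SPI.
class SparkJobEngine:
    def validate_job_execution(self, cluster, job, data):
        # The engine owns the check that the execution data is valid
        # for this job on this cluster; plugins no longer need to
        # implement (or be patched around) a separate SPI method.
        if not data.get("input_id") or not data.get("output_id"):
            raise ValueError(
                "Data sources are required for %s jobs" % job["type"])
```

Because the check sits on the engine, non-vanilla plugins can be exercised in tests without patching plugin internals.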

Change-Id: I14c86f33b355cb4317e96a70109d8d72d52d3c00
Closes-Bug: #1357512
2014-09-10 10:10:41 -07:00
Andrew Lazarev 42526b808b Made EDP engine plugin specific
+ Moved 'get_hdfs_user' method from plugin SPI to EDP engine

Further steps: move other EDP-specific methods to the EDP engine

Change-Id: I0537397894012f496ea4abc2661aa8331fbf6bd3
Partial-Bug: #1357512
2014-08-21 12:45:43 -07:00