Retire Packaging Deb project repos

This commit is part of a series to retire the Packaging Deb
project. Step 2 is to remove all content from the project
repos, replacing it with a README that explains where to find
ongoing work and how to recover the repo if needed at some
future point (as in
https://docs.openstack.org/infra/manual/drivers.html#retiring-a-project).

Change-Id: I712c6c047a8a1c6be699308911e2e0583eedf9ae
Tony Breeds 2017-09-12 16:10:58 -06:00
parent 03fa880061
commit 2e6c2f814d
136 changed files with 14 additions and 14207 deletions

.coveragerc
@@ -1,12 +0,0 @@
[run]
branch = True
source = saharaclient
omit =
.tox/*
saharaclient/tests/*
[paths]
source = saharaclient
[report]
ignore_errors = True

.gitignore
@@ -1,40 +0,0 @@
*.py[co]
*.egg
*.egg-info
dist
build
eggs
parts
var
sdist
develop-eggs
.installed.cfg
pip-log.txt
.tox
*.mo
.mr.developer.cfg
.DS_Store
Thumbs.db
.venv
.idea
out
target
*.iml
*.ipr
*.iws
*.db
.coverage
nosetests.xml
pylint-report.txt
ChangeLog
cscope.out
.testrepository
AUTHORS
cover
doc/html
doc/source/apidoc
doc/source/api
doc/build
*.log
# Files created by releasenotes build
releasenotes/build

.gitreview
@@ -1,4 +0,0 @@
[gerrit]
host=review.openstack.org
port=29418
project=openstack/python-saharaclient.git

.testr.conf
@@ -1,7 +0,0 @@
[DEFAULT]
test_command=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \
OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \
OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-60} \
${PYTHON:-python} -m subunit.run discover $DISCOVER_DIRECTORY $LISTOPT $IDOPTION
test_id_option=--load-list $IDFILE
test_list_option=--list

CONTRIBUTING.rst
@@ -1,21 +0,0 @@
If you would like to contribute to the development of OpenStack,
you must follow the steps in the "If you're a developer"
section of this page:
http://wiki.openstack.org/HowToContribute
You can find more Sahara-specific info in our How To Participate guide:
http://docs.openstack.org/developer/python-saharaclient/devref/how_to_participate.html
Once those steps have been completed, changes to OpenStack
should be submitted for review via the Gerrit tool, following
the workflow documented at:
http://wiki.openstack.org/GerritWorkflow
Pull requests submitted through GitHub will be ignored.
Bugs should be filed on Launchpad, not GitHub:
https://bugs.launchpad.net/python-saharaclient

HACKING.rst
@@ -1,45 +0,0 @@
Sahara Style Commandments
=========================
- Step 1: Read the OpenStack Style Commandments
https://docs.openstack.org/hacking/latest/
- Step 2: Read on
Sahara Specific Commandments
----------------------------
Commit Messages
---------------
Using a common format for commit messages will help keep our git history
readable. Follow these guidelines:
- [S365] First, provide a brief summary of 50 characters or less. Summaries
of greater than 72 characters will be rejected by the gate.
- [S364] The first line of the commit message should provide an accurate
description of the change, not just a reference to a bug or blueprint.
Imports
-------
- [S366, S367] Organize your imports according to the ``Import order`` guideline
Dictionaries/Lists
------------------
- [S360] Ensure default arguments are not mutable.
- [S368] Must use a dict comprehension instead of a dict constructor with a
sequence of key-value pairs. For more information, please refer to
http://legacy.python.org/dev/peps/pep-0274/
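A minimal sketch of both rules above (the names are illustrative, not from
the codebase):

.. sourcecode:: python

    # S360: a mutable default is shared across calls; use None as a sentinel.
    def add_tag(tag, tags=None):
        tags = [] if tags is None else tags
        tags.append(tag)
        return tags

    # S368: build the dict with a comprehension rather than passing a
    # sequence of key-value pairs to the dict constructor.
    pairs = [('name', 'worker'), ('count', 3)]
    settings = {k: v for k, v in pairs}  # preferred over dict([...])
..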
Logs
----
- [S373] Don't translate log messages
- [S374] Don't use deprecated log levels
Importing json
--------------
- [S375] Prefer ``jsonutils`` from ``oslo_serialization`` over the standard
``json`` module when working with JSON objects.
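For illustration, a small hedged example of S375 (``jsonutils`` mirrors the
standard ``json`` API, so swapping it in is mechanical):

.. sourcecode:: python

    from oslo_serialization import jsonutils

    payload = jsonutils.dumps({'cluster': 'demo'})  # instead of json.dumps
    data = jsonutils.loads(payload)                 # instead of json.loads
..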

LICENSE
@@ -1,176 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.

MANIFEST.in
@@ -1,9 +0,0 @@
include AUTHORS
include README.rst
include ChangeLog
include LICENSE
exclude .gitignore
exclude .gitreview
global-exclude *.pyc

README
@@ -0,0 +1,14 @@
This project is no longer maintained.
The contents of this repository are still available in the Git
source code management system. To see the contents of this
repository before it reached its end of life, please check out the
previous commit with "git checkout HEAD^1".
For ongoing work on maintaining OpenStack packages in the Debian
distribution, please see the Debian OpenStack packaging team at
https://wiki.debian.org/OpenStack/.
For any further questions, please email
openstack-dev@lists.openstack.org or join #openstack-dev on
Freenode.

README.rst
@@ -1,50 +0,0 @@
========================
Team and repository tags
========================
.. image:: http://governance.openstack.org/badges/python-saharaclient.svg
:target: http://governance.openstack.org/reference/tags/index.html
.. Change things from this point on
Python bindings to the OpenStack Sahara API
===========================================
.. image:: https://img.shields.io/pypi/v/python-saharaclient.svg
:target: https://pypi.python.org/pypi/python-saharaclient/
:alt: Latest Version
.. image:: https://img.shields.io/pypi/dm/python-saharaclient.svg
:target: https://pypi.python.org/pypi/python-saharaclient/
:alt: Downloads
This is a client for the OpenStack Sahara API. There's a Python API (the
``saharaclient`` module), and a command-line script (``sahara``). Each
implements the OpenStack Sahara API. You can find documentation for both
Python bindings and CLI in `Docs`_.
Development takes place via the usual OpenStack processes as outlined
in the `developer guide
<http://docs.openstack.org/infra/manual/developers.html>`_.
.. _Docs: http://docs.openstack.org/developer/python-saharaclient/
* License: Apache License, Version 2.0
* `PyPi`_ - package installation
* `Online Documentation`_
* `Launchpad project`_ - release management
* `Blueprints`_ - feature specifications
* `Bugs`_ - issue tracking
* `Source`_
* `Specs`_
* `How to Contribute`_
.. _PyPi: https://pypi.python.org/pypi/python-saharaclient
.. _Online Documentation: http://docs.openstack.org/developer/python-saharaclient
.. _Launchpad project: https://launchpad.net/python-saharaclient
.. _Blueprints: https://blueprints.launchpad.net/python-saharaclient
.. _Bugs: https://bugs.launchpad.net/python-saharaclient
.. _Source: https://git.openstack.org/cgit/openstack/python-saharaclient
.. _How to Contribute: http://docs.openstack.org/infra/manual/developers.html
.. _Specs: http://specs.openstack.org/openstack/sahara-specs/

@@ -1,90 +0,0 @@
# Copyright (c) 2015 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import inspect
import os
import sys
from docutils import nodes
from . import ext
def _get_command(classes):
"""Associates each command class with command depending on setup.cfg
"""
commands = {}
setup_file = os.path.join(
os.path.abspath(os.path.join(os.path.dirname(__file__), '../..')),
'setup.cfg')
for line in open(setup_file, 'r'):
for cl in classes:
if cl in line:
commands[cl] = line.split(' = ')[0].strip().replace('_', ' ')
return commands
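# Editorial note (not part of the original file): _get_command expects
# OpenStackClient entry-point lines in setup.cfg of the (illustrative) form
#     dataprocessing_cluster_list = saharaclient.osc.v1.clusters:ListClusters
# The split/strip/replace above turns the left-hand side into the command
# string 'dataprocessing cluster list'.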
class ArgParseDirectiveOSC(ext.ArgParseDirective):
"""Sphinx extension that automatically documents commands and options
of the module that contains OpenstackClient/cliff command objects
Usage example:
.. cli::
:module: saharaclient.osc.v1.clusters
"""
def run(self):
module_name = self.options['module']
mod = __import__(module_name, globals(), locals())
classes = inspect.getmembers(sys.modules[module_name], inspect.isclass)
classes_names = [cl[0] for cl in classes]
commands = _get_command(classes_names)
items = []
for cl in classes:
parser = cl[1](None, None).get_parser(None)
parser.prog = commands[cl[0]]
items.append(nodes.subtitle(text=commands[cl[0]]))
result = ext.parse_parser(
parser, skip_default_values='nodefault' in self.options)
result = ext.parser_navigate(result, '')
nested_content = ext.nodes.paragraph()
self.state.nested_parse(
self.content, self.content_offset, nested_content)
nested_content = nested_content.children
for item in nested_content:
if not isinstance(item, ext.nodes.definition_list):
items.append(item)
if 'description' in result:
items.append(self._nested_parse_paragraph(result['description']))
items.append(ext.nodes.literal_block(text=result['usage']))
items.append(ext.print_command_args_and_opts(
ext.print_arg_list(result, nested_content),
ext.print_opt_list(result, nested_content),
ext.print_subcommand_list(result, nested_content)
))
if 'epilog' in result:
items.append(self._nested_parse_paragraph(result['epilog']))
return items
def setup(app):
app.add_directive('cli', ArgParseDirectiveOSC)

@@ -1,386 +0,0 @@
# Copyright (c) 2013 Alex Rudakov
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from argparse import ArgumentParser
import os
from docutils import nodes
from docutils.statemachine import StringList
from docutils.parsers.rst.directives import flag, unchanged
from sphinx.util.compat import Directive
from sphinx.util.nodes import nested_parse_with_titles
from .parser import parse_parser, parser_navigate
def map_nested_definitions(nested_content):
if nested_content is None:
raise Exception('Nested content should be iterable, not null')
# build definition dictionary
definitions = {}
for item in nested_content:
if not isinstance(item, nodes.definition_list):
continue
for subitem in item:
if not isinstance(subitem, nodes.definition_list_item):
continue
if not len(subitem.children) > 0:
continue
classifier = '@after'
idx = subitem.first_child_matching_class(nodes.classifier)
if idx is not None:
ci = subitem[idx]
if len(ci.children) > 0:
classifier = ci.children[0].astext()
if classifier is not None and classifier not in (
'@replace', '@before', '@after'):
raise Exception('Unknown classifier: %s' % classifier)
idx = subitem.first_child_matching_class(nodes.term)
if idx is not None:
ch = subitem[idx]
if len(ch.children) > 0:
term = ch.children[0].astext()
idx = subitem.first_child_matching_class(nodes.definition)
if idx is not None:
def_node = subitem[idx]
def_node.attributes['classifier'] = classifier
definitions[term] = def_node
return definitions
def print_arg_list(data, nested_content):
definitions = map_nested_definitions(nested_content)
items = []
if 'args' in data:
for arg in data['args']:
my_def = [nodes.paragraph(text=arg['help'])] if arg['help'] else []
name = arg['name']
my_def = apply_definition(definitions, my_def, name)
if len(my_def) == 0:
my_def.append(nodes.paragraph(text='Undocumented'))
if 'choices' in arg:
my_def.append(nodes.paragraph(
text=('Possible choices: %s' % ', '.join([str(c) for c in arg['choices']]))))
items.append(
nodes.option_list_item(
'', nodes.option_group('', nodes.option_string(text=name)),
nodes.description('', *my_def)))
return nodes.option_list('', *items) if items else None
def print_opt_list(data, nested_content):
definitions = map_nested_definitions(nested_content)
items = []
if 'options' in data:
for opt in data['options']:
names = []
my_def = [nodes.paragraph(text=opt['help'])] if opt['help'] else []
for name in opt['name']:
option_declaration = [nodes.option_string(text=name)]
if opt['default'] is not None \
and opt['default'] != '==SUPPRESS==':
option_declaration += nodes.option_argument(
'', text='=' + str(opt['default']))
names.append(nodes.option('', *option_declaration))
my_def = apply_definition(definitions, my_def, name)
if len(my_def) == 0:
my_def.append(nodes.paragraph(text='Undocumented'))
if 'choices' in opt:
my_def.append(nodes.paragraph(
text=('Possible choices: %s' % ', '.join([str(c) for c in opt['choices']]))))
items.append(
nodes.option_list_item(
'', nodes.option_group('', *names),
nodes.description('', *my_def)))
return nodes.option_list('', *items) if items else None
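# Editorial note (added for clarity): '==SUPPRESS==' is the literal value of
# argparse.SUPPRESS, so options whose default was suppressed are rendered
# above without an '=<default>' argument.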
def print_command_args_and_opts(arg_list, opt_list, sub_list=None):
items = []
if arg_list:
items.append(nodes.definition_list_item(
'', nodes.term(text='Positional arguments:'),
nodes.definition('', arg_list)))
if opt_list:
items.append(nodes.definition_list_item(
'', nodes.term(text='Options:'),
nodes.definition('', opt_list)))
if sub_list and len(sub_list):
items.append(nodes.definition_list_item(
'', nodes.term(text='Sub-commands:'),
nodes.definition('', sub_list)))
return nodes.definition_list('', *items)
def apply_definition(definitions, my_def, name):
if name in definitions:
definition = definitions[name]
classifier = definition['classifier']
if classifier == '@replace':
return definition.children
if classifier == '@after':
return my_def + definition.children
if classifier == '@before':
return definition.children + my_def
raise Exception('Unknown classifier: %s' % classifier)
return my_def
def print_subcommand_list(data, nested_content):
definitions = map_nested_definitions(nested_content)
items = []
if 'children' in data:
for child in data['children']:
my_def = [nodes.paragraph(
text=child['help'])] if child['help'] else []
name = child['name']
my_def = apply_definition(definitions, my_def, name)
if len(my_def) == 0:
my_def.append(nodes.paragraph(text='Undocumented'))
if 'description' in child:
my_def.append(nodes.paragraph(text=child['description']))
my_def.append(nodes.literal_block(text=child['usage']))
my_def.append(print_command_args_and_opts(
print_arg_list(child, nested_content),
print_opt_list(child, nested_content),
print_subcommand_list(child, nested_content)
))
items.append(
nodes.definition_list_item(
'',
nodes.term('', '', nodes.strong(text=name)),
nodes.definition('', *my_def)
)
)
return nodes.definition_list('', *items)
class ArgParseDirective(Directive):
has_content = True
option_spec = dict(module=unchanged, func=unchanged, ref=unchanged,
prog=unchanged, path=unchanged, nodefault=flag,
manpage=unchanged, nosubcommands=unchanged, passparser=flag)
def _construct_manpage_specific_structure(self, parser_info):
"""
Construct a typical man page consisting of the following elements:
NAME (automatically generated, out of our control)
SYNOPSIS
DESCRIPTION
OPTIONS
FILES
SEE ALSO
BUGS
"""
# SYNOPSIS section
synopsis_section = nodes.section(
'',
nodes.title(text='Synopsis'),
nodes.literal_block(text=parser_info["bare_usage"]),
ids=['synopsis-section'])
# DESCRIPTION section
description_section = nodes.section(
'',
nodes.title(text='Description'),
nodes.paragraph(text=parser_info.get(
'description', parser_info.get(
'help', "undocumented").capitalize())),
ids=['description-section'])
nested_parse_with_titles(
self.state, self.content, description_section)
if parser_info.get('epilog'):
# TODO: do whatever sphinx does to understand ReST inside
# docstrings magically imported from other places. The nested
# parse method invoked above seems to be able to do this but
# I haven't found a way to do it for arbitrary text
description_section += nodes.paragraph(
text=parser_info['epilog'])
# OPTIONS section
options_section = nodes.section(
'',
nodes.title(text='Options'),
ids=['options-section'])
if 'args' in parser_info:
options_section += nodes.paragraph()
options_section += nodes.subtitle(text='Positional arguments:')
options_section += self._format_positional_arguments(parser_info)
if 'options' in parser_info:
options_section += nodes.paragraph()
options_section += nodes.subtitle(text='Optional arguments:')
options_section += self._format_optional_arguments(parser_info)
items = [
# NOTE: we cannot generate NAME ourselves. It is generated by
# docutils.writers.manpage
synopsis_section,
description_section,
# TODO: files
# TODO: see also
# TODO: bugs
]
if len(options_section.children) > 1:
items.append(options_section)
if 'nosubcommands' not in self.options:
# SUBCOMMANDS section (non-standard)
subcommands_section = nodes.section(
'',
nodes.title(text='Sub-Commands'),
ids=['subcommands-section'])
if 'children' in parser_info:
subcommands_section += self._format_subcommands(parser_info)
if len(subcommands_section) > 1:
items.append(subcommands_section)
if os.getenv("INCLUDE_DEBUG_SECTION"):
import json
# DEBUG section (non-standard)
debug_section = nodes.section(
'',
nodes.title(text="Argparse + Sphinx Debugging"),
nodes.literal_block(text=json.dumps(parser_info, indent=' ')),
ids=['debug-section'])
items.append(debug_section)
return items
def _format_positional_arguments(self, parser_info):
assert 'args' in parser_info
items = []
for arg in parser_info['args']:
arg_items = []
if arg['help']:
arg_items.append(nodes.paragraph(text=arg['help']))
else:
arg_items.append(nodes.paragraph(text='Undocumented'))
if 'choices' in arg:
arg_items.append(
nodes.paragraph(
text='Possible choices: ' + ', '.join(arg['choices'])))
items.append(
nodes.option_list_item(
'',
nodes.option_group(
'', nodes.option(
'', nodes.option_string(text=arg['metavar'])
)
),
nodes.description('', *arg_items)))
return nodes.option_list('', *items)
def _format_optional_arguments(self, parser_info):
assert 'options' in parser_info
items = []
for opt in parser_info['options']:
names = []
opt_items = []
for name in opt['name']:
option_declaration = [nodes.option_string(text=name)]
if opt['default'] is not None \
and opt['default'] != '==SUPPRESS==':
option_declaration += nodes.option_argument(
'', text='=' + str(opt['default']))
names.append(nodes.option('', *option_declaration))
if opt['help']:
opt_items.append(nodes.paragraph(text=opt['help']))
else:
opt_items.append(nodes.paragraph(text='Undocumented'))
if 'choices' in opt:
opt_items.append(
nodes.paragraph(
text='Possible choices: ' + ', '.join(opt['choices'])))
items.append(
nodes.option_list_item(
'', nodes.option_group('', *names),
nodes.description('', *opt_items)))
return nodes.option_list('', *items)
def _format_subcommands(self, parser_info):
assert 'children' in parser_info
items = []
for subcmd in parser_info['children']:
subcmd_items = []
if subcmd['help']:
subcmd_items.append(nodes.paragraph(text=subcmd['help']))
else:
subcmd_items.append(nodes.paragraph(text='Undocumented'))
items.append(
nodes.definition_list_item(
'',
nodes.term('', '', nodes.strong(
text=subcmd['bare_usage'])),
nodes.definition('', *subcmd_items)))
return nodes.definition_list('', *items)
def _nested_parse_paragraph(self, text):
content = nodes.paragraph()
self.state.nested_parse(StringList(text.split("\n")), 0, content)
return content
def run(self):
if 'module' in self.options and 'func' in self.options:
module_name = self.options['module']
attr_name = self.options['func']
elif 'ref' in self.options:
_parts = self.options['ref'].split('.')
module_name = '.'.join(_parts[0:-1])
attr_name = _parts[-1]
else:
raise self.error(
':module: and :func: should be specified, or :ref:')
mod = __import__(module_name, globals(), locals(), [attr_name])
if not hasattr(mod, attr_name):
raise self.error((
'Module "%s" has no attribute "%s"\n'
'Incorrect argparse :module: or :func: values?'
) % (module_name, attr_name))
func = getattr(mod, attr_name)
if isinstance(func, ArgumentParser):
parser = func
elif 'passparser' in self.options:
parser = ArgumentParser()
func(parser)
else:
parser = func()
if 'path' not in self.options:
self.options['path'] = ''
path = str(self.options['path'])
if 'prog' in self.options:
parser.prog = self.options['prog']
result = parse_parser(
parser, skip_default_values='nodefault' in self.options)
result = parser_navigate(result, path)
if 'manpage' in self.options:
return self._construct_manpage_specific_structure(result)
nested_content = nodes.paragraph()
self.state.nested_parse(
self.content, self.content_offset, nested_content)
nested_content = nested_content.children
items = []
# add common content between
for item in nested_content:
if not isinstance(item, nodes.definition_list):
items.append(item)
if 'description' in result:
items.append(self._nested_parse_paragraph(result['description']))
items.append(nodes.literal_block(text=result['usage']))
items.append(print_command_args_and_opts(
print_arg_list(result, nested_content),
print_opt_list(result, nested_content),
print_subcommand_list(result, nested_content)
))
if 'epilog' in result:
items.append(self._nested_parse_paragraph(result['epilog']))
return items
def setup(app):
app.add_directive('argparse', ArgParseDirective)

@@ -1,138 +0,0 @@
# Copyright (c) 2013 Alex Rudakov
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from argparse import _HelpAction, _SubParsersAction
import re
class NavigationException(Exception):
pass
def parser_navigate(parser_result, path, current_path=None):
if isinstance(path, str):
if path == '':
return parser_result
path = re.split(r'\s+', path)
current_path = current_path or []
if len(path) == 0:
return parser_result
if 'children' not in parser_result:
raise NavigationException(
'Current parser has no child elements. (path: %s)' %
' '.join(current_path))
next_hop = path.pop(0)
for child in parser_result['children']:
if child['name'] == next_hop:
current_path.append(next_hop)
return parser_navigate(child, path, current_path)
raise NavigationException(
'Current parser has no child element with name: %s (path: %s)' % (
next_hop, ' '.join(current_path)))
def _try_add_parser_attribute(data, parser, attribname):
attribval = getattr(parser, attribname, None)
if attribval is None:
return
if not isinstance(attribval, str):
return
if len(attribval) > 0:
data[attribname] = attribval
def _format_usage_without_prefix(parser):
"""
Use private argparse APIs to get the usage string without
the 'usage: ' prefix.
"""
fmt = parser._get_formatter()
fmt.add_usage(parser.usage, parser._actions,
parser._mutually_exclusive_groups, prefix='')
return fmt.format_help().strip()
def parse_parser(parser, data=None, **kwargs):
if data is None:
data = {
'name': '',
'usage': parser.format_usage().strip(),
'bare_usage': _format_usage_without_prefix(parser),
'prog': parser.prog,
}
_try_add_parser_attribute(data, parser, 'description')
_try_add_parser_attribute(data, parser, 'epilog')
for action in parser._get_positional_actions():
if isinstance(action, _HelpAction):
continue
if isinstance(action, _SubParsersAction):
helps = {}
for item in action._choices_actions:
helps[item.dest] = item.help
# commands which share an existing parser are an alias,
# don't duplicate docs
subsection_alias = {}
subsection_alias_names = set()
for name, subaction in action._name_parser_map.items():
if subaction not in subsection_alias:
subsection_alias[subaction] = []
else:
subsection_alias[subaction].append(name)
subsection_alias_names.add(name)
for name, subaction in action._name_parser_map.items():
if name in subsection_alias_names:
continue
subalias = subsection_alias[subaction]
subaction.prog = '%s %s' % (parser.prog, name)
subdata = {
'name': name if not subalias else
'%s (%s)' % (name, ', '.join(subalias)),
'help': helps.get(name, ''),
'usage': subaction.format_usage().strip(),
'bare_usage': _format_usage_without_prefix(subaction),
}
parse_parser(subaction, subdata, **kwargs)
data.setdefault('children', []).append(subdata)
continue
if 'args' not in data:
data['args'] = []
arg = {
'name': action.dest,
'help': action.help or '',
'metavar': action.metavar
}
if action.choices:
arg['choices'] = action.choices
data['args'].append(arg)
show_defaults = (
('skip_default_values' not in kwargs)
or (kwargs['skip_default_values'] is False))
for action in parser._get_optional_actions():
if isinstance(action, _HelpAction):
continue
if 'options' not in data:
data['options'] = []
option = {
'name': action.option_strings,
'default': action.default if show_defaults else '==SUPPRESS==',
'help': action.help or ''
}
if action.choices:
option['choices'] = action.choices
if "==SUPPRESS==" not in option['help']:
data['options'].append(option)
return data
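# Editorial sketch (not part of the original file): how parse_parser and
# parser_navigate above fit together, assuming a toy parser built here.
if __name__ == '__main__':
    from argparse import ArgumentParser
    demo = ArgumentParser(prog='demo')
    sub = demo.add_subparsers()
    run = sub.add_parser('run', help='run a task')
    run.add_argument('--count', default=1, help='how many times')
    data = parse_parser(demo)                  # nested dict of usage/options
    run_data = parser_navigate(data, 'run')    # descend into the subcommand
    print(run_data['options'][0]['name'])      # ['--count']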

@@ -1,11 +0,0 @@
<h3>Useful Links</h3>
<ul>
<li><a href="https://wiki.openstack.org/wiki/Sahara">Sahara @ OpenStack Wiki</a></li>
<li><a href="https://launchpad.net/sahara">Sahara @ Launchpad</a></li>
</ul>
{% if READTHEDOCS %}
<script type='text/javascript'>
$('div.body').css('margin', 0)
</script>
{% endif %}

@@ -1,4 +0,0 @@
{% extends "basic/layout.html" %}
{% set css_files = css_files + ['_static/tweaks.css'] %}
{% block relbar1 %}{% endblock relbar1 %}

@@ -1,4 +0,0 @@
[theme]
inherit = nature
stylesheet = nature.css
pygments_style = tango

@@ -1,167 +0,0 @@
Sahara Client
=============
Overview
--------
Sahara Client provides a list of Python interfaces to communicate with the
Sahara REST API. Sahara Client enables users to perform most of the existing
operations, such as retrieving template lists, creating clusters, and
submitting EDP jobs.
Instantiating a Client
----------------------
To start using the Sahara Client, users have to create an instance of the
`Client` class. The client constructor takes a list of parameters used to
authenticate and to locate the Sahara endpoint.
.. autoclass:: saharaclient.api.client.Client
:members:
**Important!**
Not all of the parameters above are mandatory. Providing enough of them to
determine the Sahara endpoint, authenticate the user, and select the tenant
to operate in is sufficient.
Authentication check
~~~~~~~~~~~~~~~~~~~~
Passing authentication parameters to Sahara Client is deprecated. A Keystone
Session object should be used instead. For example:
.. sourcecode:: python
from keystoneauth1.identity import v2
from keystoneauth1 import session
from saharaclient import client
auth = v2.Password(auth_url=AUTH_URL,
username=USERNAME,
password=PASSWORD,
tenant_name=PROJECT_ID)
ses = session.Session(auth=auth)
sahara = client.Client('1.1', session=ses)
..
For more information about Keystone Sessions, see `Using Sessions`_.
.. _Using Sessions: http://docs.openstack.org/developer/python-keystoneclient/using-sessions.html
Sahara endpoint discovery
~~~~~~~~~~~~~~~~~~~~~~~~~
If the user has a direct URL pointing to the Sahara REST API, it may be
specified as `sahara_url`. If this parameter is missing, the Sahara client
will use the Keystone Service Catalog to find the endpoint. Two parameters,
`service_type` and `endpoint_type`, configure the endpoint search; both have
default values.
.. sourcecode:: python
from keystoneauth1.identity import v2
from keystoneauth1 import session
from saharaclient import client
auth = v2.Password(auth_url=AUTH_URL,
username=USERNAME,
password=PASSWORD,
tenant_name=PROJECT_ID)
ses = session.Session(auth=auth)
sahara = client.Client('1.1', session=ses,
service_type="non-default-service-type",
endpoint_type="internalURL")
..
Object managers
---------------
Sahara Client has a list of fields to operate with:
* plugins
* clusters
* cluster_templates
* node_group_templates
* images
* data_sources
* job_binaries
* job_binary_internals
* job_executions
* job_types
Each of these fields is a reference to a Manager for the corresponding group
of REST calls.
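A hedged usage sketch, assuming the session-based ``sahara`` client created
in the examples above (the UUID is a placeholder):

.. sourcecode:: python

    plugins = sahara.plugins.list()   # wraps GET /plugins
    cluster = sahara.clusters.get('11111111-2222-3333-4444-555555555555')
..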
Supported operations
--------------------
Plugin ops
~~~~~~~~~~
.. autoclass:: saharaclient.api.plugins.PluginManager
:members:
Image Registry ops
~~~~~~~~~~~~~~~~~~
.. autoclass:: saharaclient.api.images.ImageManager
:members:
Node Group Template ops
~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: saharaclient.api.node_group_templates.NodeGroupTemplateManager
:members:
Cluster Template ops
~~~~~~~~~~~~~~~~~~~~
.. autoclass:: saharaclient.api.cluster_templates.ClusterTemplateManager
:members:
Cluster ops
~~~~~~~~~~~
.. autoclass:: saharaclient.api.clusters.ClusterManager
:members:
Data Source ops
~~~~~~~~~~~~~~~
.. autoclass:: saharaclient.api.data_sources.DataSourceManager
:members:
Job Binary Internal ops
~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: saharaclient.api.job_binary_internals.JobBinaryInternalsManager
:members: create, update
Job Binary ops
~~~~~~~~~~~~~~
.. autoclass:: saharaclient.api.job_binaries.JobBinariesManager
:members:
Job ops
~~~~~~~
.. autoclass:: saharaclient.api.jobs.JobsManager
:members:
Job Execution ops
~~~~~~~~~~~~~~~~~
.. autoclass:: saharaclient.api.job_executions.JobExecutionsManager
:members:
Job Types ops
~~~~~~~~~~~~~
.. autoclass:: saharaclient.api.job_types.JobTypesManager
:members:

@@ -1,64 +0,0 @@
Sahara CLI Commands
===================
The following commands are currently supported by the Sahara CLI:
Plugins
-------
.. cli::
:module: saharaclient.osc.v1.plugins
Images
------
.. cli::
:module: saharaclient.osc.v1.images
Node Group Templates
--------------------
.. cli::
:module: saharaclient.osc.v1.node_group_templates
Cluster Templates
-----------------
.. cli::
:module: saharaclient.osc.v1.cluster_templates
Clusters
--------
.. cli::
:module: saharaclient.osc.v1.clusters
Data Sources
------------
.. cli::
:module: saharaclient.osc.v1.data_sources
Job Binaries
------------
.. cli::
:module: saharaclient.osc.v1.job_binaries
Job Types
---------
.. cli::
:module: saharaclient.osc.v1.job_types
Job Templates
-------------
.. cli::
:module: saharaclient.osc.v1.job_templates
Jobs
----
.. cli::
:module: saharaclient.osc.v1.jobs

@@ -1,269 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import subprocess
import sys
import os
import warnings
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#sys.path.insert(0, os.path.abspath('.'))
sys.path.insert(0, os.path.abspath('../../saharaclient'))
sys.path.append(os.path.abspath('..'))
sys.path.append(os.path.abspath('../bin'))
# -- General configuration -----------------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
#needs_sphinx = '1.0'
# Add any Sphinx extension module names here, as strings. They can be extensions
# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = ['sphinx.ext.autodoc', 'sphinx.ext.doctest', 'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.viewcode', 'ext.cli',
'openstackdocstheme']
# openstackdocstheme options
repository_name = 'openstack/python-saharaclient'
bug_project = 'python-saharaclient'
bug_tag = 'doc'
html_last_updated_fmt = '%Y-%m-%d %H:%M'
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# The suffix of source filenames.
source_suffix = '.rst'
# The encoding of source files.
#source_encoding = 'utf-8-sig'
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'Sahara Client'
copyright = u'2013, OpenStack Foundation'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# Version info
from saharaclient.version import version_info as saharaclient_version
release = saharaclient_version.release_string()
# The short X.Y version.
version = saharaclient_version.version_string()
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#language = None
# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
#today = ''
# Else, today_fmt is used as the format for a strftime call.
#today_fmt = '%B %d, %Y'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = []
# The reST default role (used for this markup: `text`) to use for all documents.
#default_role = None
# If true, '()' will be appended to :func: etc. cross-reference text.
#add_function_parentheses = True
# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
#add_module_names = True
# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
#show_authors = False
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# A list of ignored prefixes for module index sorting.
#modindex_common_prefix = []
# -- Options for HTML output ---------------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = 'openstackdocs'
if on_rtd:
html_theme_path = ['.']
html_theme = '_theme_rtd'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#html_theme_options = {}
# Add any paths that contain custom themes here, relative to this directory.
#html_theme_path = []
# The name for this set of Sphinx documents. If None, it defaults to
# "<project> v<release> documentation".
html_title = 'Sahara Client'
# A shorter title for the navigation bar. Default is the same as html_title.
#html_short_title = None
# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
#html_logo = None
# The name of an image file (within the static path) to use as favicon of the
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
#html_favicon = None
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
# html_static_path = ['_static']
# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
#html_use_smartypants = True
# Custom sidebar templates, maps document names to template names.
html_sidebars = {
'index': ['sidebarlinks.html', 'localtoc.html', 'searchbox.html', 'sourcelink.html'],
'**': ['localtoc.html', 'relations.html',
'searchbox.html', 'sourcelink.html']
}
# Additional templates that should be rendered to pages, maps page names to
# template names.
#html_additional_pages = {}
# If false, no module index is generated.
#html_domain_indices = True
# If false, no index is generated.
#html_use_index = True
# If true, the index is split into individual pages for each letter.
#html_split_index = False
# If true, links to the reST sources are added to the pages.
#html_show_sourcelink = True
# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
#html_show_sphinx = True
# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
#html_show_copyright = True
# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it. The value of this option must be the
# base URL from which the finished HTML is served.
#html_use_opensearch = ''
# This is the file name suffix for HTML files (e.g. ".xhtml").
#html_file_suffix = None
# Output file base name for HTML help builder.
htmlhelp_basename = 'SaharaClientDoc'
# -- Options for LaTeX output --------------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
#'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
#'preamble': '',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, documentclass [howto/manual]).
latex_documents = [
('index', 'saharaclientdoc.tex', u'Sahara Client',
u'OpenStack Foundation', 'manual'),
]
# The name of an image file (relative to this directory) to place at the top of
# the title page.
#latex_logo = None
# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
#latex_use_parts = False
# If true, show page references after internal links.
#latex_show_pagerefs = False
# If true, show URL addresses after external links.
#latex_show_urls = False
# Documents to append as an appendix to all manuals.
#latex_appendices = []
# If false, no module index is generated.
#latex_domain_indices = True
# -- Options for manual page output --------------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
('index', 'saharaclient', u'Sahara Client',
[u'OpenStack Foundation'], 1)
]
# If true, show URL addresses after external links.
#man_show_urls = False
# -- Options for Texinfo output ------------------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
('index', 'Sahara Client', u'Sahara Client',
u'OpenStack Foundation', 'Sahara Client', 'Sahara Client',
'Miscellaneous'),
]
# Documents to append as an appendix to all manuals.
#texinfo_appendices = []
# If false, no module index is generated.
#texinfo_domain_indices = True
# How to display URL addresses: 'footnote', 'no', or 'inline'.
#texinfo_show_urls = 'footnote'

@@ -1,68 +0,0 @@
How to Participate
==================
Getting started
---------------
* Create an account on `Github <https://github.com/openstack/sahara>`_
(if you don't have one)
* Make sure that your local git is properly configured by executing
``git config --list``. If not, configure ``user.name`` and ``user.email``
* Create an account on `Launchpad <https://launchpad.net/sahara>`_
(if you don't have one)
* Subscribe to `OpenStack general mail-list <http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack>`_
* Subscribe to `OpenStack development mail-list <http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev>`_
* Create `OpenStack profile <https://www.openstack.org/profile/>`_
* Login to `OpenStack Gerrit <https://review.openstack.org/>`_ with your
Launchpad id
* Sign `OpenStack Individual Contributor License Agreement <https://review.openstack.org/#/settings/agreements>`_
* Make sure that your email is listed in `identities <https://review.openstack.org/#/settings/web-identities>`_
* Subscribe to code-reviews. Go to your settings on http://review.openstack.org
* Go to ``watched projects``
* Add ``openstack/sahara``, ``openstack/sahara-dashboard``,
``openstack/sahara-extra``, ``openstack/python-saharaclient``,
``openstack/sahara-image-elements``, ``openstack/horizon``
How to stay in touch with the community?
----------------------------------------
* If you have something to discuss use
`OpenStack development mail-list <http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev>`_.
Prefix mail subject with ``[Sahara]``
* Join ``#openstack-sahara`` IRC channel on `freenode <http://freenode.net/>`_
* Join public weekly meetings on *Thursdays at 18:00 UTC* on
``#openstack-meeting-alt`` IRC channel
* Join public weekly meetings on *Thursdays at 14:00 UTC* on
``#openstack-meeting-3`` IRC channel
How to send your first patch on review?
---------------------------------------
* Checkout Sahara code from `Github <https://github.com/openstack/sahara>`_
* Carefully read https://wiki.openstack.org/wiki/Gerrit_Workflow
* Pay special attention to https://wiki.openstack.org/wiki/Gerrit_Workflow#Committing_Changes
* Apply and commit your changes
* Make sure that your code passes ``PEP8`` checks and unit tests
* Send your patch on review
* Monitor status of your patch review on https://review.openstack.org/#/

@@ -1,42 +0,0 @@
Python bindings to the OpenStack Sahara API
===========================================
This is a client for the OpenStack Sahara API. There's :doc:`a Python API
<api>` (the :mod:`saharaclient` module), and a :doc:`command-line utility
<shell>` (installed as an OpenStackClient plugin). Each implements the entire
OpenStack Sahara API.
To use the sahara client, you'll need credentials for an OpenStack cloud
that implements the Data Processing API.
You may want to read the `OpenStack Sahara Docs`__ -- the overview, at
least -- to get an idea of the concepts. Once you understand the concepts,
this library should make more sense.
__ http://docs.openstack.org/developer/sahara/
Contents:
.. toctree::
:maxdepth: 2
api
shell
cli
how_to_participate
Contributing
============
Code is hosted in `review.o.o`_ and mirrored to `github`_ and `git.o.o`_.
Submit bugs to the Sahara project on `launchpad`_ and to the Sahara client on
`launchpad_client`_. Submit code to the openstack/python-saharaclient project
using `gerrit`_.
.. _review.o.o: https://review.openstack.org
.. _github: https://github.com/openstack/python-saharaclient
.. _git.o.o: http://git.openstack.org/cgit/openstack/python-saharaclient
.. _launchpad: https://launchpad.net/sahara
.. _launchpad_client: https://launchpad.net/python-saharaclient
.. _gerrit: http://docs.openstack.org/infra/manual/developers.html#development-workflow

@@ -1,64 +0,0 @@
Sahara CLI
==========
The Sahara shell utility is now part of the OpenStackClient, so all
shell commands take the following form:
.. code-block:: bash
$ openstack dataprocessing <command> [arguments...]
To get a list of all possible commands you can run:
.. code-block:: bash
$ openstack help dataprocessing
To get detailed help for the command you can run:
.. code-block:: bash
$ openstack help dataprocessing <command>
For more information about commands and their parameters, refer to
:doc:`the Sahara CLI commands <cli>`.
For more information about the capabilities and features of the
OpenStackClient CLI, refer to the `OpenStackClient documentation <http://docs.openstack.org/developer/python-openstackclient/>`_
Configuration
-------------
The CLI is configured via environment variables and command-line options which
are described in http://docs.openstack.org/developer/python-openstackclient/authentication.html.
Authentication using username/password is most commonly used and can be
provided with environment variables:
.. code-block:: bash
export OS_AUTH_URL=<url-to-openstack-identity>
export OS_PROJECT_NAME=<project-name>
export OS_USERNAME=<username>
export OS_PASSWORD=<password> # (optional)
or command-line options:
.. code-block:: bash
--os-auth-url <url>
--os-project-name <project-name>
--os-username <username>
[--os-password <password>]
Additionally, the :program:`sahara` API URL can be configured with the parameter:
.. code-block:: bash
--os-data-processing-url
or with the environment variable:
.. code-block:: bash
export OS_DATA_PROCESSING_URL=<url-to-sahara-API>

@@ -1,4 +0,0 @@
---
features:
- >
Automatically generated documentation for saharaclient API was added.

@@ -1,4 +0,0 @@
---
features:
- >
Automatically generated documentation for saharaclient CLI was added.

@@ -1,4 +0,0 @@
---
deprecations:
- >
The old CLI is deprecated and will not be maintained.

@@ -1,4 +0,0 @@
---
features:
- Added integration of Designate for hostname resolution through DNS
servers.

@@ -1,4 +0,0 @@
---
features:
- Added the ability to dump event logs for clusters. A shortened
version of the event logs can also be displayed via an option.

@@ -1,5 +0,0 @@
---
fixes:
- >
[`bug 1534050 <https://bugs.launchpad.net/python-saharaclient/+bug/1534050>`_]
Now object's fields can be unset with ``update`` calls.

@@ -1,4 +0,0 @@
---
features:
- >
Pagination for list operations is implemented.

@@ -1,6 +0,0 @@
---
fixes:
- >
[`bug 1508406 <https://bugs.launchpad.net/python-saharaclient/+bug/1508406>`_]
Now ``description`` and ``extra`` parameters of jobs ``create`` method
are optional.

@@ -1,6 +0,0 @@
---
fixes:
- >
[`bug 1506448 <https://bugs.launchpad.net/python-saharaclient/+bug/1506448>`_]
Now ``mains``, ``libs`` and ``description`` parameters of jobs ``create``
method are optional.

@@ -1,6 +0,0 @@
---
fixes:
- >
[`bug 1507966 <https://bugs.launchpad.net/python-saharaclient/+bug/1507966>`_]
Now the ``input_id``, ``output_id`` and ``configs`` parameters of the job
executions ``create`` method are optional.

@@ -1,4 +0,0 @@
---
features:
- >
A new CLI was implemented as part of the openstackclient.

@@ -1,4 +0,0 @@
---
features:
- Plugin updates are now supported in saharaclient. Information about
plugin labels is also available to users.

@@ -1,4 +0,0 @@
---
prelude: >
Functional tests were moved to the sahara-tests repository. Please refer
to the README of sahara-tests for how to run these tests now.

@@ -1,7 +0,0 @@
---
prelude: >
Old CLI commands are removed. Please use OpenStackClient
instead.
deprecations:
- Old CLI commands are removed. Please use OpenStackClient
instead.

@@ -1,5 +0,0 @@
---
deprecations:
- >
[`bug 1519510 <https://bugs.launchpad.net/python-saharaclient/+bug/1519510>`_]
Support of python 2.6 was dropped.

@@ -1,5 +0,0 @@
---
deprecations:
- >
[`bug 1526170 <https://bugs.launchpad.net/python-saharaclient/+bug/1526170>`_]
Support of python 3.3 was dropped.

@@ -1,10 +0,0 @@
---
upgrade:
- The 'version' option is replaced by the 'plugin-version' option.
fixes:
- 'version' is a global option used for getting the client version, so
there were problems with the OpenStack client: when 'version' was
specified for a plugin, OSC treated it as a request for the current
client version. To fix this problem, 'version' is replaced by
'plugin-version'. Related bug 1565775.

@@ -1,4 +0,0 @@
---
features:
- >
Now shares can be edited on an existing cluster.

@@ -1,4 +0,0 @@
---
other:
- >
Start using reno to manage release notes.

@@ -1,5 +0,0 @@
---
fixes:
- >
[`bug 1500790 <https://bugs.launchpad.net/python-saharaclient/+bug/1500790>`_]
Now tags can be added and removed simultaneously in one call.

@@ -1,5 +0,0 @@
---
fixes:
- >
[`bug 1510470 <https://bugs.launchpad.net/python-saharaclient/+bug/1510470>`_]
Now ``desc`` parameter of ``update_image`` is optional.

@@ -1,6 +0,0 @@
---
fixes:
- >
[`bug 1499697 <https://bugs.launchpad.net/python-saharaclient/+bug/1499697>`_]
Now node group templates can be created and updated with
``volume_mount_prefix`` parameter.

@@ -1,228 +0,0 @@
# -*- coding: utf-8 -*-
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Sahara Client Release Notes documentation build configuration file
extensions = [
'reno.sphinxext',
'openstackdocstheme'
]
# openstackdocstheme options
repository_name = 'openstack/python-saharaclient'
bug_project = 'python-saharaclient'
bug_tag = 'releasenotes'
html_last_updated_fmt = '%Y-%m-%d %H:%M'
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# The suffix of source filenames.
source_suffix = '.rst'
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'Saharaclient Release Notes'
copyright = u'2015, Sahara Developers'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
from saharaclient.version import version_info as saharaclient_version
# The full version, including alpha/beta/rc tags.
release = saharaclient_version.version_string_with_vcs()
# The short X.Y version.
version = saharaclient_version.canonical_version_string()
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = []
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# -- Options for HTML output ----------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = 'openstackdocs'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
# html_theme_options = {}
# Add any paths that contain custom themes here, relative to this directory.
# html_theme_path = []
# The name for this set of Sphinx documents. If None, it defaults to
# "<project> v<release> documentation".
# html_title = None
# A shorter title for the navigation bar. Default is the same as html_title.
# html_short_title = None
# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
# html_logo = None
# The name of an image file (within the static path) to use as favicon of the
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
# html_favicon = None
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
# directly to the root of the documentation.
# html_extra_path = []
# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
# using the given strftime format.
# html_last_updated_fmt = '%b %d, %Y'
# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
# html_use_smartypants = True
# Custom sidebar templates, maps document names to template names.
# html_sidebars = {}
# Additional templates that should be rendered to pages, maps page names to
# template names.
# html_additional_pages = {}
# If false, no module index is generated.
# html_domain_indices = True
# If false, no index is generated.
# html_use_index = True
# If true, the index is split into individual pages for each letter.
# html_split_index = False
# If true, links to the reST sources are added to the pages.
# html_show_sourcelink = True
# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
# html_show_sphinx = True
# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
# html_show_copyright = True
# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it. The value of this option must be the
# base URL from which the finished HTML is served.
# html_use_opensearch = ''
# This is the file name suffix for HTML files (e.g. ".xhtml").
# html_file_suffix = None
# Output file base name for HTML help builder.
htmlhelp_basename = 'SaharaClientReleaseNotesdoc'
# -- Options for LaTeX output ---------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
# 'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
# 'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
# 'preamble': '',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
('index', 'SaharaClientReleaseNotes.tex',
u'Sahara Client Release Notes Documentation',
u'Sahara Client Developers', 'manual'),
]
# The name of an image file (relative to this directory) to place at the top of
# the title page.
# latex_logo = None
# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
# latex_use_parts = False
# If true, show page references after internal links.
# latex_show_pagerefs = False
# If true, show URL addresses after external links.
# latex_show_urls = False
# Documents to append as an appendix to all manuals.
# latex_appendices = []
# If false, no module index is generated.
# latex_domain_indices = True
# -- Options for manual page output ---------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
('index', 'saharaclientreleasenotes',
u'Sahara Client Release Notes Documentation',
[u'Sahara Developers'], 1)
]
# If true, show URL addresses after external links.
# man_show_urls = False
# -- Options for Texinfo output -------------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
('index', 'SaharaClientReleaseNotes',
u'Sahara Client Release Notes Documentation',
u'Sahara Developers', 'SaharaClientReleaseNotes',
'One line description of project.',
'Miscellaneous'),
]
# Documents to append as an appendix to all manuals.
# texinfo_appendices = []
# If false, no module index is generated.
# texinfo_domain_indices = True
# How to display URL addresses: 'footnote', 'no', or 'inline'.
# texinfo_show_urls = 'footnote'
# If true, do not generate a @detailmenu in the "Top" node's menu.
# texinfo_no_detailmenu = False
# -- Options for Internationalization output ------------------------------
locale_dirs = ['locale/']

View File

@ -1,11 +0,0 @@
===========================
Saharaclient Release Notes
===========================
.. toctree::
:maxdepth: 1
unreleased
ocata
newton
mitaka

View File

@ -1,6 +0,0 @@
===================================
Mitaka Series Release Notes
===================================
.. release-notes::
:branch: origin/stable/mitaka

View File

@ -1,6 +0,0 @@
===================================
Newton Series Release Notes
===================================
.. release-notes::
:branch: origin/stable/newton

View File

@ -1,6 +0,0 @@
===================================
Ocata Series Release Notes
===================================
.. release-notes::
:branch: origin/stable/ocata

View File

@ -1,5 +0,0 @@
==============================
Current Series Release Notes
==============================
.. release-notes::

View File

@ -1,16 +0,0 @@
# The order of packages is significant, because pip processes them in the order
# of appearance. Changing the order has an impact on the overall integration
# process, which may cause wedges in the gate later.
pbr!=2.1.0,>=2.0.0 # Apache-2.0
Babel!=2.4.0,>=2.3.4 # BSD
keystoneauth1>=3.0.1 # Apache-2.0
osc-lib>=1.7.0 # Apache-2.0
oslo.log>=3.22.0 # Apache-2.0
oslo.serialization!=2.19.1,>=1.10.0 # Apache-2.0
oslo.i18n!=3.15.2,>=2.1.0 # Apache-2.0
oslo.utils>=3.20.0 # Apache-2.0
python-openstackclient!=3.10.0,>=3.3.0 # Apache-2.0
requests>=2.14.2 # Apache-2.0
six>=1.9.0 # MIT

View File

@ -1,164 +0,0 @@
#!/bin/bash
set -eu
function usage {
echo "Usage: $0 [OPTION]..."
echo "Run python-saharaclient test suite"
echo ""
echo " -V, --virtual-env Always use virtualenv. Install automatically if not present"
echo " -N, --no-virtual-env Don't use virtualenv. Run tests in local environment"
echo " -s, --no-site-packages Isolate the virtualenv from the global Python environment"
echo " -x, --stop Stop running tests after the first error or failure."
echo " -f, --force Force a clean re-build of the virtual environment. Useful when dependencies have been added."
echo " -p, --pep8 Just run pep8"
echo " -P, --no-pep8 Don't run pep8"
echo " -c, --coverage Generate coverage report"
echo " -h, --help Print this usage message"
echo " --hide-elapsed Don't print the elapsed time for each test along with slow test list"
echo ""
echo "Note: with no options specified, the script will try to run the tests in a virtual environment,"
echo " If no virtualenv is found, the script will ask if you would like to create one. If you "
echo " prefer to run tests NOT in a virtual environment, simply pass the -N option."
exit
}
function process_option {
case "$1" in
-h|--help) usage;;
-V|--virtual-env) always_venv=1; never_venv=0;;
-N|--no-virtual-env) always_venv=0; never_venv=1;;
-s|--no-site-packages) no_site_packages=1;;
-f|--force) force=1;;
-p|--pep8) just_pep8=1;;
-P|--no-pep8) no_pep8=1;;
-c|--coverage) coverage=1;;
-*) testropts="$testropts $1";;
*) testrargs="$testrargs $1"
esac
}
venv=.venv
with_venv=tools/with_venv.sh
always_venv=0
never_venv=0
force=0
no_site_packages=0
installvenvopts=
testrargs=
testropts=
wrapper=""
just_pep8=0
no_pep8=0
coverage=0
LANG=en_US.UTF-8
LANGUAGE=en_US:en
LC_ALL=C
for arg in "$@"; do
process_option $arg
done
if [ $no_site_packages -eq 1 ]; then
installvenvopts="--no-site-packages"
fi
function init_testr {
if [ ! -d .testrepository ]; then
${wrapper} testr init
fi
}
function run_tests {
# Cleanup *pyc
${wrapper} find . -type f -name "*.pyc" -delete
if [ $coverage -eq 1 ]; then
# Do not test test_coverage_ext when gathering coverage.
if [ "x$testrargs" = "x" ]; then
testrargs="^(?!.*test_coverage_ext).*$"
fi
export PYTHON="${wrapper} coverage run --source saharaclient --parallel-mode"
fi
# Just run the test suites in current environment
set +e
TESTRTESTS="$TESTRTESTS $testrargs"
echo "Running \`${wrapper} $TESTRTESTS\`"
${wrapper} $TESTRTESTS
RESULT=$?
set -e
copy_subunit_log
return $RESULT
}
function copy_subunit_log {
LOGNAME=`cat .testrepository/next-stream`
LOGNAME=$(($LOGNAME - 1))
LOGNAME=".testrepository/${LOGNAME}"
cp $LOGNAME subunit.log
}
function run_pep8 {
echo "Running flake8 ..."
${wrapper} flake8
}
TESTRTESTS="testr run --parallel $testropts"
if [ $never_venv -eq 0 ]
then
# Remove the virtual environment if --force used
if [ $force -eq 1 ]; then
echo "Cleaning virtualenv..."
rm -rf ${venv}
fi
if [ -e ${venv} ]; then
wrapper="${with_venv}"
else
if [ $always_venv -eq 1 ]; then
# Automatically install the virtualenv
python tools/install_venv.py $installvenvopts
wrapper="${with_venv}"
else
echo -e "No virtual environment found...create one? (Y/n) \c"
read use_ve
if [ "x$use_ve" = "xY" -o "x$use_ve" = "x" -o "x$use_ve" = "xy" ]; then
# Install the virtualenv and run the test suite in it
python tools/install_venv.py $installvenvopts
wrapper=${with_venv}
fi
fi
fi
fi
# Delete old coverage data from previous runs
if [ $coverage -eq 1 ]; then
${wrapper} coverage erase
fi
if [ $just_pep8 -eq 1 ]; then
run_pep8
exit
fi
init_testr
run_tests
# NOTE(sirp): we only want to run pep8 when we're running the full-test suite,
# not when we're running tests individually. To handle this, we need to
# distinguish between options (testropts), which begin with a '-', and
# arguments (testrargs).
if [ -z "$testrargs" ]; then
if [ $no_pep8 -eq 0 ]; then
run_pep8
fi
fi
if [ $coverage -eq 1 ]; then
echo "Generating coverage report in covhtml/"
${wrapper} coverage combine
${wrapper} coverage html --include='saharaclient/*' --omit='saharaclient/openstack/common/*' -d covhtml -i
fi

View File

@ -1,19 +0,0 @@
# Copyright 2017 Huawei, Inc. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
from saharaclient import version
__version__ = version.version_info.version_string()

View File

@ -1,20 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""oslo.i18n integration module.
See http://docs.openstack.org/developer/oslo.i18n/usage.html
"""
import oslo_i18n
_ = oslo_i18n.TranslatorFactory(domain='saharaclient').primary

View File

@ -1,277 +0,0 @@
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import copy
from oslo_serialization import jsonutils
from six.moves.urllib import parse
from saharaclient._i18n import _
class Resource(object):
resource_name = 'Something'
defaults = {}
def __init__(self, manager, info):
self.manager = manager
info = info.copy()
self._info = info
self._set_defaults(info)
self._add_details(info)
def _set_defaults(self, info):
for name, value in self.defaults.items():
if name not in info:
info[name] = value
def _add_details(self, info):
for (k, v) in info.items():
try:
setattr(self, k, v)
self._info[k] = v
except AttributeError:
# In this case we already defined the attribute on the class
pass
def to_dict(self):
return copy.deepcopy(self._info)
def __str__(self):
return '%s %s' % (self.resource_name, str(self._info))
def _check_items(obj, searches):
try:
return all(getattr(obj, attr) == value for (attr, value) in searches)
except AttributeError:
return False
class NotUpdated(object):
"""A sentinel class to signal that parameter should not be updated."""
def __repr__(self):
return 'NotUpdated'
class ResourceManager(object):
resource_class = None
def __init__(self, api):
self.api = api
def find(self, **kwargs):
return [i for i in self.list() if _check_items(i, kwargs.items())]
def find_unique(self, **kwargs):
found = self.find(**kwargs)
if not found:
raise APIException(error_code=404,
error_message=_("No matches found."))
if len(found) > 1:
raise APIException(error_code=409,
error_message=_("Multiple matches found."))
return found[0]
def _copy_if_defined(self, data, **kwargs):
for var_name, var_value in kwargs.items():
if var_value is not None:
data[var_name] = var_value
def _copy_if_updated(self, data, **kwargs):
for var_name, var_value in kwargs.items():
if not isinstance(var_value, NotUpdated):
data[var_name] = var_value
def _create(self, url, data, response_key=None, dump_json=True):
if dump_json:
kwargs = {'json': data}
else:
kwargs = {'data': data}
resp = self.api.post(url, **kwargs)
if resp.status_code != 202:
self._raise_api_exception(resp)
if response_key is not None:
data = get_json(resp)[response_key]
else:
data = get_json(resp)
return self.resource_class(self, data)
def _update(self, url, data, response_key=None, dump_json=True):
if dump_json:
kwargs = {'json': data}
else:
kwargs = {'data': data}
resp = self.api.put(url, **kwargs)
if resp.status_code != 202:
self._raise_api_exception(resp)
if response_key is not None:
data = get_json(resp)[response_key]
else:
data = get_json(resp)
return self.resource_class(self, data)
def _patch(self, url, data, response_key=None, dump_json=True):
if dump_json:
kwargs = {'json': data}
else:
kwargs = {'data': data}
resp = self.api.patch(url, **kwargs)
if resp.status_code != 202:
self._raise_api_exception(resp)
if response_key is not None:
data = get_json(resp)[response_key]
else:
data = get_json(resp)
return self.resource_class(self, data)
def _post(self, url, data, response_key=None, dump_json=True):
if dump_json:
kwargs = {'json': data}
else:
kwargs = {'data': data}
resp = self.api.post(url, **kwargs)
if resp.status_code != 202:
self._raise_api_exception(resp)
if response_key is not None:
data = get_json(resp)[response_key]
else:
data = get_json(resp)
return self.resource_class(self, data)
def _list(self, url, response_key):
resp = self.api.get(url)
if resp.status_code == 200:
data = get_json(resp)[response_key]
return [self.resource_class(self, res)
for res in data]
else:
self._raise_api_exception(resp)
def _page(self, url, response_key, limit=None):
resp = self.api.get(url)
if resp.status_code == 200:
result = get_json(resp)
data = result[response_key]
meta = result.get('markers')
next, prev = None, None
if meta:
prev = meta.get('prev')
next = meta.get('next')
l = [self.resource_class(self, res)
for res in data]
return Page(l, prev, next, limit)
else:
self._raise_api_exception(resp)
def _get(self, url, response_key=None):
resp = self.api.get(url)
if resp.status_code == 200:
if response_key is not None:
data = get_json(resp)[response_key]
else:
data = get_json(resp)
return self.resource_class(self, data)
else:
self._raise_api_exception(resp)
def _delete(self, url):
resp = self.api.delete(url)
if resp.status_code != 204:
self._raise_api_exception(resp)
def _plurify_resource_name(self):
return self.resource_class.resource_name + 's'
def _raise_api_exception(self, resp):
try:
error_data = get_json(resp)
except Exception:
msg = _("Failed to parse response from Sahara: %s") % resp.reason
raise APIException(
error_code=resp.status_code,
error_message=msg)
raise APIException(error_code=error_data.get("error_code"),
error_name=error_data.get("error_name"),
error_message=error_data.get("error_message"))
def get_json(response):
"""Provide backward compatibility with old versions of requests library."""
json_field_or_function = getattr(response, 'json', None)
if callable(json_field_or_function):
return response.json()
else:
return jsonutils.loads(response.content)
class APIException(Exception):
def __init__(self, error_code=None, error_name=None, error_message=None):
super(APIException, self).__init__(error_message)
self.error_code = error_code
self.error_name = error_name
self.error_message = error_message
def get_query_string(search_opts, limit=None, marker=None, sort_by=None,
reverse=None):
opts = {}
if marker is not None:
opts['marker'] = marker
if limit is not None:
opts['limit'] = limit
if sort_by is not None:
if reverse:
opts['sort_by'] = "-%s" % sort_by
else:
opts['sort_by'] = sort_by
if search_opts is not None:
opts.update(search_opts)
if opts:
qparams = sorted(opts.items(), key=lambda x: x[0])
query_string = "?%s" % parse.urlencode(qparams, doseq=True)
else:
query_string = ""
return query_string
class Page(list):
def __init__(self, l, prev, next, limit):
super(Page, self).__init__(l)
self.prev = prev
self.next = next
self.limit = limit
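
For illustration, here is what ``get_query_string`` produces: the options are sorted alphabetically before encoding, and ``sort_by`` gains a ``-`` prefix when ``reverse`` is set. A small sketch:

    from saharaclient.api import base

    qs = base.get_query_string({'name': 'demo'}, limit=5,
                               marker='abc', sort_by='id', reverse=True)
    # Keys are sorted before encoding, so this prints:
    # ?limit=5&marker=abc&name=demo&sort_by=-id
    print(qs)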

View File

@ -1,188 +0,0 @@
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import warnings
from keystoneauth1 import adapter
from keystoneauth1 import exceptions
from keystoneauth1.identity import v2
from keystoneauth1.identity import v3
from keystoneauth1 import session as keystone_session
from keystoneauth1 import token_endpoint
from saharaclient.api import cluster_templates
from saharaclient.api import clusters
from saharaclient.api import data_sources
from saharaclient.api import images
from saharaclient.api import job_binaries
from saharaclient.api import job_binary_internals
from saharaclient.api import job_executions
from saharaclient.api import job_types
from saharaclient.api import jobs
from saharaclient.api import node_group_templates
from saharaclient.api import plugins
USER_AGENT = 'python-saharaclient'
class HTTPClient(adapter.Adapter):
def request(self, *args, **kwargs):
kwargs.setdefault('raise_exc', False)
return super(HTTPClient, self).request(*args, **kwargs)
class Client(object):
"""Client for the OpenStack Data Processing v1 API.
:param str username: Username for Keystone authentication.
:param str api_key: Password for Keystone authentication.
:param str project_id: Keystone Tenant id.
:param str project_name: Keystone Tenant name.
:param str auth_url: Keystone URL that will be used for authentication.
:param str sahara_url: Sahara REST API URL to communicate with.
:param str endpoint_type: Desired Sahara endpoint type.
:param str service_type: Sahara service name in Keystone catalog.
:param str input_auth_token: Keystone authorization token.
:param session: Keystone Session object.
:param auth: Keystone Authentication Plugin object.
:param boolean insecure: Allow insecure.
:param string cacert: Path to the Privacy Enhanced Mail (PEM) file
which contains certificates needed to establish
SSL connection with the identity service.
:param string region_name: Name of a region to select when choosing an
endpoint from the service catalog.
"""
def __init__(self, username=None, api_key=None, project_id=None,
project_name=None, auth_url=None, sahara_url=None,
endpoint_type='publicURL', service_type='data-processing',
input_auth_token=None, session=None, auth=None,
insecure=False, cacert=None, region_name=None, **kwargs):
if not session:
warnings.simplefilter('once', category=DeprecationWarning)
warnings.warn('Passing authentication parameters to saharaclient '
'is deprecated. Please construct and pass an '
'authenticated session object directly.',
DeprecationWarning)
warnings.resetwarnings()
if input_auth_token:
auth = token_endpoint.Token(sahara_url, input_auth_token)
else:
auth = self._get_keystone_auth(auth_url=auth_url,
username=username,
api_key=api_key,
project_id=project_id,
project_name=project_name)
verify = True
if insecure:
verify = False
elif cacert:
verify = cacert
session = keystone_session.Session(verify=verify)
if not auth:
auth = session.auth
# NOTE(Toan): bug #1512801. If sahara_url is provided, it does not
# matter if service_type is orthographically correct or not.
# Only find Sahara service_type and endpoint in Keystone catalog
# if sahara_url is not provided.
if not sahara_url:
service_type = self._determine_service_type(session,
auth,
service_type,
endpoint_type)
kwargs['user_agent'] = USER_AGENT
kwargs.setdefault('interface', endpoint_type)
kwargs.setdefault('endpoint_override', sahara_url)
client = HTTPClient(session=session,
auth=auth,
service_type=service_type,
region_name=region_name,
**kwargs)
self.clusters = clusters.ClusterManager(client)
self.cluster_templates = (
cluster_templates.ClusterTemplateManager(client)
)
self.node_group_templates = (
node_group_templates.NodeGroupTemplateManager(client)
)
self.plugins = plugins.PluginManager(client)
self.images = images.ImageManager(client)
self.data_sources = data_sources.DataSourceManager(client)
self.jobs = jobs.JobsManager(client)
self.job_executions = job_executions.JobExecutionsManager(client)
self.job_binaries = job_binaries.JobBinariesManager(client)
self.job_binary_internals = (
job_binary_internals.JobBinaryInternalsManager(client)
)
self.job_types = job_types.JobTypesManager(client)
def _get_keystone_auth(self, username=None, api_key=None, auth_url=None,
project_id=None, project_name=None):
if not auth_url:
raise RuntimeError("No auth url specified")
if 'v2.0' in auth_url:
return v2.Password(auth_url=auth_url,
username=username,
password=api_key,
tenant_id=project_id,
tenant_name=project_name)
else:
# NOTE(jamielennox): Setting these to default is what
# keystoneclient does in the event they are not passed.
return v3.Password(auth_url=auth_url,
username=username,
password=api_key,
user_domain_id='default',
project_id=project_id,
project_name=project_name,
project_domain_id='default')
@staticmethod
def _determine_service_type(session, auth, service_type, interface):
"""Check a catalog for data-processing or data_processing"""
# NOTE(jamielennox): calling get_endpoint forces an auth on
# initialization which is required for backwards compatibility. It
# also allows us to reset the service type if not in the catalog.
for st in (service_type, service_type.replace('-', '_')):
try:
url = auth.get_endpoint(session,
service_type=st,
interface=interface)
except exceptions.Unauthorized:
raise RuntimeError("Not Authorized")
except exceptions.EndpointNotFound:
# NOTE(jamielennox): bug #1428447. This should not be
# raised, instead None should be returned. Handle in case
# it changes in the future
url = None
if url:
return st
raise RuntimeError("Could not find Sahara endpoint in catalog")

View File

@ -1,99 +0,0 @@
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from saharaclient.api import base
class ClusterTemplate(base.Resource):
resource_name = 'Cluster Template'
class ClusterTemplateManager(base.ResourceManager):
resource_class = ClusterTemplate
NotUpdated = base.NotUpdated()
def create(self, name, plugin_name, hadoop_version, description=None,
cluster_configs=None, node_groups=None, anti_affinity=None,
net_id=None, default_image_id=None, use_autoconfig=None,
shares=None, is_public=None, is_protected=None,
domain_name=None):
"""Create a Cluster Template."""
data = {
'name': name,
'plugin_name': plugin_name,
'hadoop_version': hadoop_version,
}
self._copy_if_defined(data,
description=description,
cluster_configs=cluster_configs,
node_groups=node_groups,
anti_affinity=anti_affinity,
neutron_management_network=net_id,
default_image_id=default_image_id,
use_autoconfig=use_autoconfig,
shares=shares,
is_public=is_public,
is_protected=is_protected,
domain_name=domain_name)
return self._create('/cluster-templates', data, 'cluster_template')
def update(self, cluster_template_id, name=NotUpdated,
plugin_name=NotUpdated, hadoop_version=NotUpdated,
description=NotUpdated, cluster_configs=NotUpdated,
node_groups=NotUpdated, anti_affinity=NotUpdated,
net_id=NotUpdated, default_image_id=NotUpdated,
use_autoconfig=NotUpdated, shares=NotUpdated,
is_public=NotUpdated, is_protected=NotUpdated,
domain_name=NotUpdated):
"""Update a Cluster Template."""
data = {}
self._copy_if_updated(data, name=name,
plugin_name=plugin_name,
hadoop_version=hadoop_version,
description=description,
cluster_configs=cluster_configs,
node_groups=node_groups,
anti_affinity=anti_affinity,
neutron_management_network=net_id,
default_image_id=default_image_id,
use_autoconfig=use_autoconfig,
shares=shares,
is_public=is_public,
is_protected=is_protected,
domain_name=domain_name)
return self._update('/cluster-templates/%s' % cluster_template_id,
data, 'cluster_template')
def list(self, search_opts=None, marker=None,
limit=None, sort_by=None, reverse=None):
"""Get list of Cluster Templates."""
query = base.get_query_string(search_opts, marker=marker, limit=limit,
sort_by=sort_by, reverse=reverse)
url = "/cluster-templates%s" % query
return self._page(url, 'cluster_templates', limit)
def get(self, cluster_template_id):
"""Get information about a Cluster Template."""
return self._get('/cluster-templates/%s' % cluster_template_id,
'cluster_template')
def delete(self, cluster_template_id):
"""Delete a Cluster Template."""
self._delete('/cluster-templates/%s' % cluster_template_id)
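
The ``NotUpdated`` sentinel is what lets ``update`` distinguish "leave this field alone" from "unset this field", which is the behaviour the bug 1534050 release note describes. A sketch, assuming ``sahara`` is an authenticated ``Client`` and the template id is a placeholder:

    # Omitted keyword arguments stay NotUpdated and are not sent at all;
    # passing None explicitly unsets the field on the server side.
    sahara.cluster_templates.update('template-id',
                                    description='updated description',
                                    domain_name=None)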

View File

@ -1,138 +0,0 @@
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from six.moves.urllib import parse
from saharaclient.api import base
class Cluster(base.Resource):
resource_name = 'Cluster'
class ClusterManager(base.ResourceManager):
resource_class = Cluster
NotUpdated = base.NotUpdated()
def create(self, name, plugin_name, hadoop_version,
cluster_template_id=None, default_image_id=None,
is_transient=None, description=None, cluster_configs=None,
node_groups=None, user_keypair_id=None,
anti_affinity=None, net_id=None, count=None,
use_autoconfig=None, shares=None,
is_public=None, is_protected=None):
"""Launch a Cluster."""
data = {
'name': name,
'plugin_name': plugin_name,
'hadoop_version': hadoop_version,
}
# If count is not greater than 1, reset it to None so that the dict
# built by _copy_if_defined does not contain the count parameter.
if count and count <= 1:
count = None
self._copy_if_defined(data,
cluster_template_id=cluster_template_id,
is_transient=is_transient,
default_image_id=default_image_id,
description=description,
cluster_configs=cluster_configs,
node_groups=node_groups,
user_keypair_id=user_keypair_id,
anti_affinity=anti_affinity,
neutron_management_network=net_id,
count=count,
use_autoconfig=use_autoconfig,
shares=shares,
is_public=is_public,
is_protected=is_protected)
if count:
return self._create('/clusters/multiple', data)
return self._create('/clusters', data, 'cluster')
def scale(self, cluster_id, scale_object):
"""Scale an existing Cluster.
:param scale_object: dict that describes scaling operation
:Example:
The following `scale_object` can be used to change the number of
instances in the node group and add instances of new node group to
existing cluster:
.. sourcecode:: json
{
"add_node_groups": [
{
"count": 3,
"name": "new_ng",
"node_group_template_id": "ngt_id"
}
],
"resize_node_groups": [
{
"count": 2,
"name": "old_ng"
}
]
}
"""
return self._update('/clusters/%s' % cluster_id, scale_object)
def list(self, search_opts=None, limit=None, marker=None,
sort_by=None, reverse=None):
"""Get a list of Clusters."""
query = base.get_query_string(search_opts, limit=limit, marker=marker,
sort_by=sort_by, reverse=reverse)
url = "/clusters%s" % query
return self._page(url, 'clusters', limit)
def get(self, cluster_id, show_progress=False):
"""Get information about a Cluster."""
url = ('/clusters/%(cluster_id)s?%(params)s' %
{"cluster_id": cluster_id,
"params": parse.urlencode({"show_progress": show_progress})})
return self._get(url, 'cluster')
def delete(self, cluster_id):
"""Delete a Cluster."""
self._delete('/clusters/%s' % cluster_id)
def update(self, cluster_id, name=NotUpdated, description=NotUpdated,
is_public=NotUpdated, is_protected=NotUpdated,
shares=NotUpdated):
"""Update a Cluster."""
data = {}
self._copy_if_updated(data, name=name, description=description,
is_public=is_public, is_protected=is_protected,
shares=shares)
return self._patch('/clusters/%s' % cluster_id, data)
def verification_update(self, cluster_id, status):
"""Start a verification for a Cluster."""
data = {'verification': {'status': status}}
return self._patch("/clusters/%s" % cluster_id, data)

View File

@ -1,80 +0,0 @@
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from saharaclient.api import base
class DataSources(base.Resource):
resource_name = 'Data Source'
class DataSourceManager(base.ResourceManager):
resource_class = DataSources
def create(self, name, description, data_source_type,
url, credential_user=None, credential_pass=None,
is_public=None, is_protected=None):
"""Create a Data Source."""
data = {
'name': name,
'description': description,
'type': data_source_type,
'url': url,
'credentials': {}
}
self._copy_if_defined(data['credentials'],
user=credential_user,
password=credential_pass)
self._copy_if_defined(data, is_public=is_public,
is_protected=is_protected)
return self._create('/data-sources', data, 'data_source')
def list(self, search_opts=None, limit=None, marker=None,
sort_by=None, reverse=None):
"""Get a list of Data Sources."""
query = base.get_query_string(search_opts, limit=limit, marker=marker,
sort_by=sort_by, reverse=reverse)
url = "/data-sources%s" % query
return self._page(url, 'data_sources', limit)
def get(self, data_source_id):
"""Get information about a Data Source."""
return self._get('/data-sources/%s' % data_source_id, 'data_source')
def delete(self, data_source_id):
"""Delete a Data Source."""
self._delete('/data-sources/%s' % data_source_id)
def update(self, data_source_id, update_data):
"""Update a Data Source.
:param dict update_data: dict that contains fields that should be
updated with new values.
Fields that can be updated:
* name
* description
* type
* url
* is_public
* is_protected
* credentials - dict with `user` and `password` keyword arguments
"""
return self._update('/data-sources/%s' % data_source_id,
update_data)
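
A sketch of an ``update`` call built from the fields the docstring lists (id and values are placeholders):

    sahara.data_sources.update('data-source-id', {
        'name': 'renamed-input',
        'url': 'swift://container/new-object',
        'credentials': {'user': 'demo', 'password': 'secret'},
    })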

View File

@ -1,76 +0,0 @@
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from saharaclient.api import parameters as params
class Helpers(object):
def __init__(self, sahara_client):
self.sahara = sahara_client
self.plugins = self.sahara.plugins
def _get_node_processes(self, plugin):
processes = []
for proc_lst in plugin.node_processes.values():
processes += proc_lst
return [(proc_name, proc_name) for proc_name in processes]
def get_node_processes(self, plugin_name, hadoop_version):
plugin = self.plugins.get_version_details(plugin_name, hadoop_version)
return self._get_node_processes(plugin)
def _extract_parameters(self, configs, scope, applicable_target):
parameters = []
for config in configs:
if (config['scope'] == scope and
config['applicable_target'] == applicable_target):
parameters.append(params.Parameter(config))
return parameters
def get_cluster_general_configs(self, plugin_name, hadoop_version):
plugin = self.plugins.get_version_details(plugin_name, hadoop_version)
return self._extract_parameters(plugin.configs, 'cluster', "general")
def get_general_node_group_configs(self, plugin_name, hadoop_version):
plugin = self.plugins.get_version_details(plugin_name, hadoop_version)
return self._extract_parameters(plugin.configs, 'node', 'general')
def get_targeted_node_group_configs(self, plugin_name, hadoop_version):
plugin = self.plugins.get_version_details(plugin_name, hadoop_version)
parameters = dict()
for service in plugin.node_processes.keys():
parameters[service] = self._extract_parameters(plugin.configs,
'node', service)
return parameters
def get_targeted_cluster_configs(self, plugin_name, hadoop_version):
plugin = self.plugins.get_version_details(plugin_name, hadoop_version)
parameters = dict()
for service in plugin.node_processes.keys():
parameters[service] = self._extract_parameters(plugin.configs,
'cluster', service)
return parameters

View File

@ -1,74 +0,0 @@
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from saharaclient.api import base
class Image(base.Resource):
resource_name = 'Image'
defaults = {'description': ''}
class ImageManager(base.ResourceManager):
resource_class = Image
def list(self, search_opts=None):
"""Get a list of registered images."""
query = base.get_query_string(search_opts)
return self._list('/images%s' % query, 'images')
def get(self, id):
"""Get information about an image"""
return self._get('/images/%s' % id, 'image')
def unregister_image(self, image_id):
"""Remove an Image from Sahara Image Registry."""
self._delete('/images/%s' % image_id)
def update_image(self, image_id, user_name, desc=None):
"""Create or update an Image in Image Registry."""
desc = desc if desc else ''
data = {"username": user_name,
"description": desc}
return self._post('/images/%s' % image_id, data)
def update_tags(self, image_id, new_tags):
"""Update an Image tags.
:param new_tags: list of tags that will replace currently
assigned tags
"""
# Do not add :param list in the docstring above until this is solved:
# https://github.com/sphinx-doc/sphinx/issues/2549
old_image = self.get(image_id)
old_tags = frozenset(old_image.tags)
new_tags = frozenset(new_tags)
to_add = list(new_tags - old_tags)
to_remove = list(old_tags - new_tags)
add_response, remove_response = None, None
if to_add:
add_response = self._post('/images/%s/tag' % image_id,
{'tags': to_add}, 'image')
if to_remove:
remove_response = self._post('/images/%s/untag' % image_id,
{'tags': to_remove}, 'image')
return remove_response or add_response or self.get(image_id)
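
``update_tags`` is the one-call add-and-remove path mentioned in the bug 1500790 release note: it diffs the desired tag set against the currently assigned one and issues the ``tag``/``untag`` requests as needed. A sketch with a placeholder image id:

    # Anything missing from this list is untagged, anything new in it
    # is tagged, all within a single call.
    sahara.images.update_tags('image-id', ['vanilla', '2.7.1'])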

View File

@ -1,79 +0,0 @@
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from saharaclient.api import base
class JobBinaries(base.Resource):
resource_name = 'Job Binary'
class JobBinariesManager(base.ResourceManager):
resource_class = JobBinaries
def create(self, name, url, description=None, extra=None, is_public=None,
is_protected=None):
"""Create a Job Binary."""
data = {
"name": name,
"url": url
}
self._copy_if_defined(data, description=description, extra=extra,
is_public=is_public, is_protected=is_protected)
return self._create('/job-binaries', data, 'job_binary')
def list(self, search_opts=None, limit=None, marker=None,
sort_by=None, reverse=None):
"""Get a list of Job Binaries."""
query = base.get_query_string(search_opts, limit=limit, marker=marker,
sort_by=sort_by, reverse=reverse)
url = "/job-binaries%s" % query
return self._page(url, 'binaries', limit)
def get(self, job_binary_id):
"""Get information about a Job Binary."""
return self._get('/job-binaries/%s' % job_binary_id, 'job_binary')
def delete(self, job_binary_id):
"""Delete a Job Binary."""
self._delete('/job-binaries/%s' % job_binary_id)
def get_file(self, job_binary_id):
"""Download a Job Binary."""
resp = self.api.get('/job-binaries/%s/data' % job_binary_id)
if resp.status_code != 200:
self._raise_api_exception(resp)
return resp.content
def update(self, job_binary_id, data):
"""Update Job Binary.
:param dict data: dict that contains fields that should be updated
with new values.
Fields that can be updated:
* name
* description
* url
* is_public
* is_protected
* extra - dict with `user` and `password` keyword arguments
"""
return self._update(
'/job-binaries/%s' % job_binary_id, data, 'job_binary')

View File

@ -1,63 +0,0 @@
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from six.moves.urllib import parse as urlparse
from saharaclient.api import base
class JobBinaryInternal(base.Resource):
resource_name = 'JobBinaryInternal'
class JobBinaryInternalsManager(base.ResourceManager):
resource_class = JobBinaryInternal
NotUpdated = base.NotUpdated()
def create(self, name, data):
"""Create a Job Binary Internal.
:param str data: raw script text
"""
return self._update('/job-binary-internals/%s' %
urlparse.quote(name.encode('utf-8')), data,
'job_binary_internal', dump_json=False)
def list(self, search_opts=None, limit=None, marker=None,
sort_by=None, reverse=None):
"""Get a list of Job Binary Internals."""
query = base.get_query_string(search_opts, limit=limit, marker=marker,
sort_by=sort_by, reverse=reverse)
url = "/job-binary-internals%s" % query
return self._page(url, 'binaries', limit)
def get(self, job_binary_id):
"""Get information about a Job Binary Internal."""
return self._get('/job-binary-internals/%s' % job_binary_id,
'job_binary_internal')
def delete(self, job_binary_id):
"""Delete a Job Binary Internal."""
self._delete('/job-binary-internals/%s' % job_binary_id)
def update(self, job_binary_id, name=NotUpdated, is_public=NotUpdated,
is_protected=NotUpdated):
"""Update a Job Binary Internal."""
data = {}
self._copy_if_updated(data, name=name, is_public=is_public,
is_protected=is_protected)
return self._patch('/job-binary-internals/%s' % job_binary_id, data)

View File

@ -1,65 +0,0 @@
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from saharaclient.api import base
class JobExecution(base.Resource):
resource_name = 'JobExecution'
class JobExecutionsManager(base.ResourceManager):
resource_class = JobExecution
NotUpdated = base.NotUpdated()
def list(self, search_opts=None, marker=None, limit=None,
sort_by=None, reverse=None):
"""Get a list of Job Executions."""
query = base.get_query_string(search_opts, limit=limit, marker=marker,
sort_by=sort_by, reverse=reverse)
url = "/job-executions%s" % query
return self._page(url, 'job_executions', limit)
def get(self, obj_id):
"""Get information about a Job Execution."""
return self._get('/job-executions/%s' % obj_id, 'job_execution')
def delete(self, obj_id):
"""Delete a Job Execution."""
self._delete('/job-executions/%s' % obj_id)
def create(self, job_id, cluster_id, input_id=None,
output_id=None, configs=None, interface=None, is_public=None,
is_protected=None):
"""Launch a Job."""
url = "/jobs/%s/execute" % job_id
data = {
"cluster_id": cluster_id,
}
self._copy_if_defined(data, input_id=input_id, output_id=output_id,
job_configs=configs, interface=interface,
is_public=is_public, is_protected=is_protected)
return self._create(url, data, 'job_execution')
def update(self, obj_id, is_public=NotUpdated, is_protected=NotUpdated):
"""Update a Job Execution."""
data = {}
self._copy_if_updated(data, is_public=is_public,
is_protected=is_protected)
return self._patch('/job-executions/%s' % obj_id, data)
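
Since bug 1507966 made ``input_id``, ``output_id`` and ``configs`` optional, launching a job only requires the job and cluster ids; a sketch with placeholder ids:

    # input_id, output_id, configs and interface may all be omitted.
    je = sahara.job_executions.create(job_id='job-id',
                                      cluster_id='cluster-id')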

View File

@ -1,29 +0,0 @@
# Copyright (c) 2015 Red Hat Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from saharaclient.api import base
class JobType(base.Resource):
resource_name = 'JobType'
class JobTypesManager(base.ResourceManager):
resource_class = JobType
def list(self, search_opts=None):
"""Get a list of job types supported by plugins."""
query = base.get_query_string(search_opts)
return self._list('/job-types%s' % query, 'job_types')

View File

@ -1,69 +0,0 @@
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from saharaclient.api import base
class Job(base.Resource):
resource_name = 'Job'
class JobsManager(base.ResourceManager):
resource_class = Job
NotUpdated = base.NotUpdated()
def create(self, name, type, mains=None, libs=None, description=None,
interface=None, is_public=None, is_protected=None):
"""Create a Job."""
data = {
'name': name,
'type': type
}
self._copy_if_defined(data, description=description, mains=mains,
libs=libs, interface=interface,
is_public=is_public, is_protected=is_protected)
return self._create('/jobs', data, 'job')
def list(self, search_opts=None, limit=None,
marker=None, sort_by=None, reverse=None):
"""Get a list of Jobs."""
query = base.get_query_string(search_opts, limit=limit, marker=marker,
sort_by=sort_by, reverse=reverse)
url = "/jobs%s" % query
return self._page(url, 'jobs', limit)
def get(self, job_id):
"""Get information about a Job"""
return self._get('/jobs/%s' % job_id, 'job')
def get_configs(self, job_type):
"""Get config hints for a specified Job type."""
return self._get('/jobs/config-hints/%s' % job_type)
def delete(self, job_id):
"""Delete a Job"""
self._delete('/jobs/%s' % job_id)
def update(self, job_id, name=NotUpdated, description=NotUpdated,
is_public=NotUpdated, is_protected=NotUpdated):
"""Update a Job."""
data = {}
self._copy_if_updated(data, name=name, description=description,
is_public=is_public, is_protected=is_protected)
return self._patch('/jobs/%s' % job_id, data)
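
Matching the bug 1506448 release note, ``mains``, ``libs`` and ``description`` can now be omitted when creating a job; a sketch (the ``id`` attribute is assumed to come back in the server response):

    job = sahara.jobs.create(name='wordcount', type='MapReduce')
    # Optional fields can still be set later:
    sahara.jobs.update(job.id, description='counts words')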

View File

@ -1,129 +0,0 @@
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from saharaclient.api import base
class NodeGroupTemplate(base.Resource):
resource_name = 'Node Group Template'
class NodeGroupTemplateManager(base.ResourceManager):
resource_class = NodeGroupTemplate
NotUpdated = base.NotUpdated()
def create(self, name, plugin_name, hadoop_version, flavor_id,
description=None, volumes_per_node=None, volumes_size=None,
node_processes=None, node_configs=None, floating_ip_pool=None,
security_groups=None, auto_security_group=None,
availability_zone=None, volumes_availability_zone=None,
volume_type=None, image_id=None, is_proxy_gateway=None,
volume_local_to_instance=None, use_autoconfig=None,
shares=None, is_public=None, is_protected=None,
volume_mount_prefix=None):
"""Create a Node Group Template."""
data = {
'name': name,
'plugin_name': plugin_name,
'hadoop_version': hadoop_version,
'flavor_id': flavor_id,
'node_processes': node_processes
}
self._copy_if_defined(data,
description=description,
node_configs=node_configs,
floating_ip_pool=floating_ip_pool,
security_groups=security_groups,
auto_security_group=auto_security_group,
availability_zone=availability_zone,
image_id=image_id,
is_proxy_gateway=is_proxy_gateway,
use_autoconfig=use_autoconfig,
shares=shares,
is_public=is_public,
is_protected=is_protected
)
if volumes_per_node:
data.update({"volumes_per_node": volumes_per_node,
"volumes_size": volumes_size})
if volumes_availability_zone:
data.update({"volumes_availability_zone":
volumes_availability_zone})
if volume_type:
data.update({"volume_type": volume_type})
if volume_local_to_instance:
data.update(
{"volume_local_to_instance": volume_local_to_instance})
if volume_mount_prefix:
data.update({"volume_mount_prefix": volume_mount_prefix})
return self._create('/node-group-templates', data,
'node_group_template')
def update(self, ng_template_id, name=NotUpdated, plugin_name=NotUpdated,
hadoop_version=NotUpdated, flavor_id=NotUpdated,
description=NotUpdated, volumes_per_node=NotUpdated,
volumes_size=NotUpdated, node_processes=NotUpdated,
node_configs=NotUpdated, floating_ip_pool=NotUpdated,
security_groups=NotUpdated, auto_security_group=NotUpdated,
availability_zone=NotUpdated,
volumes_availability_zone=NotUpdated, volume_type=NotUpdated,
image_id=NotUpdated, is_proxy_gateway=NotUpdated,
volume_local_to_instance=NotUpdated, use_autoconfig=NotUpdated,
shares=NotUpdated, is_public=NotUpdated,
is_protected=NotUpdated, volume_mount_prefix=NotUpdated):
"""Update a Node Group Template."""
data = {}
self._copy_if_updated(
data, name=name, plugin_name=plugin_name,
hadoop_version=hadoop_version, flavor_id=flavor_id,
description=description, volumes_per_node=volumes_per_node,
volumes_size=volumes_size, node_processes=node_processes,
node_configs=node_configs, floating_ip_pool=floating_ip_pool,
security_groups=security_groups,
auto_security_group=auto_security_group,
availability_zone=availability_zone,
volumes_availability_zone=volumes_availability_zone,
volume_type=volume_type, image_id=image_id,
is_proxy_gateway=is_proxy_gateway,
volume_local_to_instance=volume_local_to_instance,
use_autoconfig=use_autoconfig, shares=shares,
is_public=is_public, is_protected=is_protected,
volume_mount_prefix=volume_mount_prefix
)
return self._update('/node-group-templates/%s' % ng_template_id, data,
'node_group_template')
def list(self, search_opts=None, marker=None,
limit=None, sort_by=None, reverse=None):
"""Get a list of Node Group Templates."""
query = base.get_query_string(search_opts, limit=limit, marker=marker,
sort_by=sort_by, reverse=reverse)
url = "/node-group-templates%s" % query
return self._page(url, 'node_group_templates', limit)
def get(self, ng_template_id):
"""Get information about a Node Group Template."""
return self._get('/node-group-templates/%s' % ng_template_id,
'node_group_template')
def delete(self, ng_template_id):
"""Delete a Node Group Template."""
self._delete('/node-group-templates/%s' % ng_template_id)
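
The bug 1499697 release note refers to the ``volume_mount_prefix`` handling above; a creation sketch in which every value is a placeholder:

    ngt = sahara.node_group_templates.create(
        name='worker', plugin_name='vanilla', hadoop_version='2.7.1',
        flavor_id='2', node_processes=['datanode', 'nodemanager'],
        volumes_per_node=2, volumes_size=10,
        volume_mount_prefix='/volumes/disk')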

View File

@ -1,26 +0,0 @@
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
class Parameter(object):
"""This bean is used for building config entries."""
def __init__(self, config):
self.name = config['name']
self.description = config.get('description', "No description")
self.required = not config['is_optional']
self.default_value = config.get('default_value', None)
self.initial_value = self.default_value
self.param_type = config['config_type']
self.priority = int(config.get('priority', 2))

View File

@ -1,75 +0,0 @@
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from six.moves.urllib import parse as urlparse
from saharaclient.api import base
class Plugin(base.Resource):
resource_name = 'Plugin'
def __init__(self, manager, info):
base.Resource.__init__(self, manager, info)
# Horizon requires each object in table to have an id
self.id = self.name
class PluginManager(base.ResourceManager):
resource_class = Plugin
def list(self, search_opts=None):
"""Get a list of Plugins."""
query = base.get_query_string(search_opts)
return self._list('/plugins%s' % query, 'plugins')
def get(self, plugin_name):
"""Get information about a Plugin."""
return self._get('/plugins/%s' % plugin_name, 'plugin')
def get_version_details(self, plugin_name, hadoop_version):
"""Get version details
Get the list of Services and Service Parameters for a specified
Plugin and Plugin Version.
"""
return self._get('/plugins/%s/%s' % (plugin_name, hadoop_version),
'plugin')
def update(self, plugin_name, values):
"""Update plugin and then return updated result to user
"""
return self._patch("/plugins/%s" % plugin_name, values, 'plugin')
def convert_to_cluster_template(self, plugin_name, hadoop_version,
template_name, filecontent):
"""Convert to cluster template
        Create a Cluster Template directly, bypassing the regular
        Cluster Template creation mechanism.
"""
resp = self.api.post('/plugins/%s/%s/convert-config/%s' %
(plugin_name,
hadoop_version,
urlparse.quote(template_name)),
data=filecontent)
if resp.status_code != 202:
raise RuntimeError('Failed to upload template file for plugin "%s"'
' and version "%s"' %
(plugin_name, hadoop_version))
else:
return base.get_json(resp)['cluster_template']
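# --- Illustrative usage (not part of the original module) ---
# A sketch of the manager calls above, assuming `sahara` is an
# authenticated saharaclient Client; the plugin name and version
# ('vanilla', '2.7.1') are examples only.
for plugin in sahara.plugins.list():
    print(plugin.id)  # id mirrors name so Horizon tables work (see above)
details = sahara.plugins.get_version_details('vanilla', '2.7.1')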

View File

@ -1,47 +0,0 @@
# Copyright (c) 2014 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from oslo_utils import importutils
class UnsupportedVersion(Exception):
"""Indication for using an unsupported version of the API.
Indicates that the user is trying to use an unsupported
version of the API.
"""
pass
def get_client_class(version):
version_map = {
'1.0': 'saharaclient.api.client.Client',
'1.1': 'saharaclient.api.client.Client',
}
try:
client_path = version_map[str(version)]
except (KeyError, ValueError):
supported_versions = ', '.join(version_map.keys())
msg = ("Invalid client version '%(version)s'; must be one of: "
"%(versions)s") % {'version': version,
'versions': supported_versions}
raise UnsupportedVersion(msg)
return importutils.import_class(client_path)
def Client(version, *args, **kwargs):
client_class = get_client_class(version)
return client_class(*args, **kwargs)
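# --- Illustrative usage (not part of the original module) ---
# Both supported version strings resolve to the same client class;
# anything else raises UnsupportedVersion listing the supported ones.
cls = get_client_class('1.1')
try:
    get_client_class('9.9')
except UnsupportedVersion as e:
    print(e)  # Invalid client version '9.9'; must be one of: 1.0, 1.1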

View File

@ -1,69 +0,0 @@
# Copyright (c) 2015 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""OpenStackClient plugin for Data Processing service."""
from osc_lib import utils
from oslo_log import log as logging
LOG = logging.getLogger(__name__)
DEFAULT_DATA_PROCESSING_API_VERSION = "1.1"
API_VERSION_OPTION = "os_data_processing_api_version"
API_NAME = "data_processing"
API_VERSIONS = {
"1.1": "saharaclient.api.client.Client"
}
def make_client(instance):
data_processing_client = utils.get_client_class(
API_NAME,
instance._api_version[API_NAME],
API_VERSIONS)
LOG.debug('Instantiating data-processing client: %s',
data_processing_client)
kwargs = utils.build_kwargs_dict('endpoint_type', instance._interface)
client = data_processing_client(
session=instance.session,
region_name=instance._region_name,
cacert=instance._cacert,
insecure=instance._insecure,
sahara_url=instance._cli_options.data_processing_url,
**kwargs
)
return client
def build_option_parser(parser):
"""Hook to add global options."""
parser.add_argument(
"--os-data-processing-api-version",
metavar="<data-processing-api-version>",
default=utils.env(
'OS_DATA_PROCESSING_API_VERSION',
default=DEFAULT_DATA_PROCESSING_API_VERSION),
help=("Data processing API version, default=" +
DEFAULT_DATA_PROCESSING_API_VERSION +
' (Env: OS_DATA_PROCESSING_API_VERSION)'))
parser.add_argument(
"--os-data-processing-url",
default=utils.env(
"OS_DATA_PROCESSING_URL"),
help=("Data processing API URL, "
"(Env: OS_DATA_PROCESSING_API_URL)"))
return parser
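# --- Illustrative usage (not part of the original module) ---
# A sketch of the option hook in isolation, using a plain argparse
# parser instead of the real OSC one; the version value is an example.
import argparse

opts = build_option_parser(argparse.ArgumentParser()).parse_args(
    ['--os-data-processing-api-version', '1.1'])
assert opts.os_data_processing_api_version == '1.1'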

View File

@ -1,508 +0,0 @@
# Copyright (c) 2015 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
from osc_lib.command import command
from osc_lib import exceptions
from osc_lib import utils as osc_utils
from oslo_log import log as logging
from oslo_serialization import jsonutils as json
from saharaclient.osc.v1 import utils
CT_FIELDS = ['id', 'name', 'plugin_name', 'plugin_version', 'description',
'node_groups', 'anti_affinity', 'use_autoconfig', 'is_default',
'is_protected', 'is_public', 'domain_name']
def _format_node_groups_list(node_groups):
return ', '.join(
['%s:%s' % (ng['name'], ng['count']) for ng in node_groups])
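# Example (illustrative, not in the original source): the helper above
# flattens the API's node_groups structure into a 'name:count' listing.
#     >>> _format_node_groups_list([{'name': 'worker', 'count': 3},
#     ...                           {'name': 'master', 'count': 1}])
#     'worker:3, master:1'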
def _format_ct_output(data):
data['plugin_version'] = data.pop('hadoop_version')
data['node_groups'] = _format_node_groups_list(data['node_groups'])
data['anti_affinity'] = osc_utils.format_list(data['anti_affinity'])
def _configure_node_groups(node_groups, client):
node_groups_list = dict(
map(lambda x: x.split(':', 1), node_groups))
node_groups = []
plugins_versions = set()
for name, count in node_groups_list.items():
ng = utils.get_resource(client.node_group_templates, name)
node_groups.append({'name': ng.name,
'count': int(count),
'node_group_template_id': ng.id})
plugins_versions.add((ng.plugin_name, ng.hadoop_version))
if len(plugins_versions) != 1:
        raise exceptions.CommandError('Node groups with the same plugin '
                                      'and version must be specified')
plugin, plugin_version = plugins_versions.pop()
return plugin, plugin_version, node_groups
class CreateClusterTemplate(command.ShowOne):
"""Creates cluster template"""
log = logging.getLogger(__name__ + ".CreateClusterTemplate")
def get_parser(self, prog_name):
parser = super(CreateClusterTemplate, self).get_parser(prog_name)
parser.add_argument(
'--name',
metavar="<name>",
help="Name of the cluster template [REQUIRED if JSON is not "
"provided]",
)
parser.add_argument(
'--node-groups',
metavar="<node-group:instances_count>",
nargs="+",
help="List of the node groups(names or IDs) and numbers of "
"instances for each one of them [REQUIRED if JSON is not "
"provided]"
)
parser.add_argument(
'--anti-affinity',
metavar="<anti-affinity>",
nargs="+",
help="List of processes that should be added to an anti-affinity "
"group"
)
parser.add_argument(
'--description',
metavar="<description>",
help='Description of the cluster template'
)
parser.add_argument(
'--autoconfig',
action='store_true',
default=False,
help='If enabled, instances of the cluster will be '
'automatically configured',
)
parser.add_argument(
'--public',
action='store_true',
default=False,
help='Make the cluster template public (Visible from other '
'projects)',
)
parser.add_argument(
'--protected',
action='store_true',
default=False,
help='Make the cluster template protected',
)
parser.add_argument(
'--json',
metavar='<filename>',
help='JSON representation of the cluster template. Other '
'arguments will not be taken into account if this one is '
'provided'
)
parser.add_argument(
'--shares',
metavar='<filename>',
help='JSON representation of the manila shares'
)
parser.add_argument(
'--configs',
metavar='<filename>',
help='JSON representation of the cluster template configs'
)
parser.add_argument(
'--domain-name',
metavar='<domain-name>',
help='Domain name for instances of this cluster template. This '
'option is available if \'use_designate\' config is True'
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
if parsed_args.json:
blob = osc_utils.read_blob_file_contents(parsed_args.json)
try:
template = json.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'template from file %s: %s' % (parsed_args.json, e))
if 'neutron_management_network' in template:
template['net_id'] = template.pop('neutron_management_network')
data = client.cluster_templates.create(**template).to_dict()
else:
if not parsed_args.name or not parsed_args.node_groups:
raise exceptions.CommandError(
                    'At least --name and --node-groups arguments should be '
                    'specified, or a JSON template should be provided with '
                    'the --json argument')
configs = None
if parsed_args.configs:
blob = osc_utils.read_blob_file_contents(parsed_args.configs)
try:
configs = json.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'configs from file %s: %s' % (parsed_args.configs, e))
shares = None
if parsed_args.shares:
blob = osc_utils.read_blob_file_contents(parsed_args.shares)
try:
shares = json.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'shares from file %s: %s' % (parsed_args.shares, e))
plugin, plugin_version, node_groups = _configure_node_groups(
parsed_args.node_groups, client)
data = client.cluster_templates.create(
name=parsed_args.name,
plugin_name=plugin,
hadoop_version=plugin_version,
description=parsed_args.description,
node_groups=node_groups,
use_autoconfig=parsed_args.autoconfig,
cluster_configs=configs,
shares=shares,
is_public=parsed_args.public,
is_protected=parsed_args.protected,
domain_name=parsed_args.domain_name
).to_dict()
_format_ct_output(data)
data = utils.prepare_data(data, CT_FIELDS)
return self.dict2columns(data)
class ListClusterTemplates(command.Lister):
"""Lists cluster templates"""
log = logging.getLogger(__name__ + ".ListClusterTemplates")
def get_parser(self, prog_name):
parser = super(ListClusterTemplates, self).get_parser(prog_name)
parser.add_argument(
'--long',
action='store_true',
default=False,
help='List additional fields in output',
)
parser.add_argument(
'--plugin',
metavar="<plugin>",
help="List cluster templates for specific plugin"
)
parser.add_argument(
'--plugin-version',
metavar="<plugin_version>",
help="List cluster templates with specific version of the "
"plugin"
)
parser.add_argument(
'--name',
metavar="<name-substring>",
help="List cluster templates with specific substring in the "
"name"
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
search_opts = {}
if parsed_args.plugin:
search_opts['plugin_name'] = parsed_args.plugin
if parsed_args.plugin_version:
search_opts['hadoop_version'] = parsed_args.plugin_version
data = client.cluster_templates.list(search_opts=search_opts)
if parsed_args.name:
data = utils.get_by_name_substring(data, parsed_args.name)
if parsed_args.long:
columns = ('name', 'id', 'plugin_name', 'hadoop_version',
'node_groups', 'description')
column_headers = utils.prepare_column_headers(
columns, {'hadoop_version': 'plugin_version'})
else:
columns = ('name', 'id', 'plugin_name', 'hadoop_version')
column_headers = utils.prepare_column_headers(
columns, {'hadoop_version': 'plugin_version'})
return (
column_headers,
(osc_utils.get_item_properties(
s,
columns,
formatters={
'node_groups': _format_node_groups_list
}
) for s in data)
)
class ShowClusterTemplate(command.ShowOne):
"""Display cluster template details"""
log = logging.getLogger(__name__ + ".ShowClusterTemplate")
def get_parser(self, prog_name):
parser = super(ShowClusterTemplate, self).get_parser(prog_name)
parser.add_argument(
"cluster_template",
metavar="<cluster-template>",
help="Name or id of the cluster template to display",
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
data = utils.get_resource(
client.cluster_templates, parsed_args.cluster_template).to_dict()
_format_ct_output(data)
data = utils.prepare_data(data, CT_FIELDS)
return self.dict2columns(data)
class DeleteClusterTemplate(command.Command):
"""Deletes cluster template"""
log = logging.getLogger(__name__ + ".DeleteClusterTemplate")
def get_parser(self, prog_name):
parser = super(DeleteClusterTemplate, self).get_parser(prog_name)
parser.add_argument(
"cluster_template",
metavar="<cluster-template>",
nargs="+",
help="Name(s) or id(s) of the cluster template(s) to delete",
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
for ct in parsed_args.cluster_template:
ct_id = utils.get_resource_id(client.cluster_templates, ct)
client.cluster_templates.delete(ct_id)
sys.stdout.write(
'Cluster template "{ct}" has been removed '
'successfully.\n'.format(ct=ct))
class UpdateClusterTemplate(command.ShowOne):
"""Updates cluster template"""
log = logging.getLogger(__name__ + ".UpdateClusterTemplate")
def get_parser(self, prog_name):
parser = super(UpdateClusterTemplate, self).get_parser(prog_name)
parser.add_argument(
'cluster_template',
metavar="<cluster-template>",
help="Name or ID of the cluster template [REQUIRED]",
)
parser.add_argument(
'--name',
metavar="<name>",
help="New name of the cluster template",
)
parser.add_argument(
'--node-groups',
metavar="<node-group:instances_count>",
nargs="+",
help="List of the node groups(names or IDs) and numbers of"
"instances for each one of them"
)
parser.add_argument(
'--anti-affinity',
metavar="<anti-affinity>",
nargs="+",
help="List of processes that should be added to an anti-affinity "
"group"
)
parser.add_argument(
'--description',
metavar="<description>",
help='Description of the cluster template'
)
autoconfig = parser.add_mutually_exclusive_group()
autoconfig.add_argument(
'--autoconfig-enable',
action='store_true',
help='Instances of the cluster will be '
'automatically configured',
dest='use_autoconfig'
)
autoconfig.add_argument(
'--autoconfig-disable',
action='store_false',
help='Instances of the cluster will not be '
'automatically configured',
dest='use_autoconfig'
)
public = parser.add_mutually_exclusive_group()
public.add_argument(
'--public',
action='store_true',
help='Make the cluster template public '
'(Visible from other projects)',
dest='is_public'
)
public.add_argument(
'--private',
action='store_false',
help='Make the cluster template private '
'(Visible only from this tenant)',
dest='is_public'
)
protected = parser.add_mutually_exclusive_group()
protected.add_argument(
'--protected',
action='store_true',
help='Make the cluster template protected',
dest='is_protected'
)
protected.add_argument(
'--unprotected',
action='store_false',
help='Make the cluster template unprotected',
dest='is_protected'
)
parser.add_argument(
'--json',
metavar='<filename>',
help='JSON representation of the cluster template. Other '
'arguments will not be taken into account if this one is '
'provided'
)
parser.add_argument(
'--shares',
metavar='<filename>',
help='JSON representation of the manila shares'
)
parser.add_argument(
'--configs',
metavar='<filename>',
help='JSON representation of the cluster template configs'
)
parser.add_argument(
'--domain-name',
metavar='<domain-name>',
default=None,
help='Domain name for instances of this cluster template. This '
'option is available if \'use_designate\' config is True'
)
parser.set_defaults(is_public=None, is_protected=None,
use_autoconfig=None)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
ct_id = utils.get_resource_id(
client.cluster_templates, parsed_args.cluster_template)
if parsed_args.json:
blob = osc_utils.read_blob_file_contents(parsed_args.json)
try:
template = json.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'template from file %s: %s' % (parsed_args.json, e))
data = client.cluster_templates.update(
ct_id, **template).to_dict()
else:
plugin, plugin_version, node_groups = None, None, None
if parsed_args.node_groups:
plugin, plugin_version, node_groups = _configure_node_groups(
parsed_args.node_groups, client)
configs = None
if parsed_args.configs:
blob = osc_utils.read_blob_file_contents(parsed_args.configs)
try:
configs = json.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'configs from file %s: %s' % (parsed_args.configs, e))
shares = None
if parsed_args.shares:
blob = osc_utils.read_blob_file_contents(parsed_args.shares)
try:
shares = json.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'shares from file %s: %s' % (parsed_args.shares, e))
update_dict = utils.create_dict_from_kwargs(
name=parsed_args.name,
plugin_name=plugin,
hadoop_version=plugin_version,
description=parsed_args.description,
node_groups=node_groups,
use_autoconfig=parsed_args.use_autoconfig,
cluster_configs=configs,
shares=shares,
is_public=parsed_args.is_public,
is_protected=parsed_args.is_protected,
domain_name=parsed_args.domain_name
)
data = client.cluster_templates.update(
ct_id, **update_dict).to_dict()
_format_ct_output(data)
data = utils.prepare_data(data, CT_FIELDS)
return self.dict2columns(data)
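# --- Illustrative usage (not part of the original module) ---
# A sketch of a file suitable for the --json option of
# CreateClusterTemplate, based on the keys consumed by take_action()
# above; the plugin, version and IDs are placeholders.
from oslo_serialization import jsonutils

template = {
    'name': 'ct-example',
    'plugin_name': 'vanilla',
    'hadoop_version': '2.7.1',
    'node_groups': [
        {'name': 'worker', 'count': 3,
         'node_group_template_id': 'NODE-GROUP-TEMPLATE-UUID'},
    ],
    # Renamed to net_id by take_action() before the API call.
    'neutron_management_network': 'NETWORK-UUID',
}
with open('cluster_template.json', 'w') as f:
    f.write(jsonutils.dumps(template, indent=4))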

View File

@ -1,662 +0,0 @@
# Copyright (c) 2015 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
from osc_lib.command import command
from osc_lib import exceptions
from osc_lib import utils as osc_utils
from oslo_log import log as logging
from oslo_serialization import jsonutils
from saharaclient.osc.v1 import utils
CLUSTER_FIELDS = ["cluster_template_id", "use_autoconfig", "user_keypair_id",
"status", "image", "node_groups", "id", "info",
"anti_affinity", "plugin_version", "name", "is_transient",
"is_protected", "description", "is_public",
"neutron_management_network", "plugin_name"]
def _format_node_groups_list(node_groups):
return ', '.join(
['%s:%s' % (ng['name'], ng['count']) for ng in node_groups])
def _format_cluster_output(data):
data['plugin_version'] = data.pop('hadoop_version')
data['image'] = data.pop('default_image_id')
data['node_groups'] = _format_node_groups_list(data['node_groups'])
data['anti_affinity'] = osc_utils.format_list(data['anti_affinity'])
def _prepare_health_checks(data):
additional_data = {}
ver = data.get('verification', {})
additional_fields = ['verification_status']
additional_data['verification_status'] = ver.get('status', 'UNKNOWN')
for check in ver.get('checks', []):
row_name = "Health check (%s)" % check['name']
additional_data[row_name] = check['status']
additional_fields.append(row_name)
return additional_data, additional_fields
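# Example (illustrative, not in the original source): verification data
# is flattened into one row per health check.
#     >>> _prepare_health_checks({'verification': {
#     ...     'status': 'GREEN',
#     ...     'checks': [{'name': 'Some check', 'status': 'GREEN'}]}})
#     ({'verification_status': 'GREEN',
#       'Health check (Some check)': 'GREEN'},
#      ['verification_status', 'Health check (Some check)'])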
def _get_plugin_version(cluster_template, client):
ct = utils.get_resource(client.cluster_templates, cluster_template)
return ct.plugin_name, ct.hadoop_version, ct.id
class CreateCluster(command.ShowOne):
"""Creates cluster"""
log = logging.getLogger(__name__ + ".CreateCluster")
def get_parser(self, prog_name):
parser = super(CreateCluster, self).get_parser(prog_name)
parser.add_argument(
'--name',
metavar="<name>",
help="Name of the cluster [REQUIRED if JSON is not provided]",
)
parser.add_argument(
'--cluster-template',
metavar="<cluster-template>",
help="Cluster template name or ID [REQUIRED if JSON is not "
"provided]"
)
parser.add_argument(
'--image',
metavar="<image>",
help='Image that will be used for cluster deployment (Name or ID) '
'[REQUIRED if JSON is not provided]'
)
parser.add_argument(
'--description',
metavar="<description>",
help='Description of the cluster'
)
parser.add_argument(
'--user-keypair',
metavar="<keypair>",
            help='User keypair to get access to VMs after cluster creation'
)
parser.add_argument(
'--neutron-network',
metavar="<network>",
help='Instances of the cluster will get fixed IP addresses in '
'this network. (Name or ID should be provided)'
)
parser.add_argument(
'--count',
metavar="<count>",
type=int,
help='Number of clusters to be created'
)
parser.add_argument(
'--public',
action='store_true',
default=False,
help='Make the cluster public (Visible from other projects)',
)
parser.add_argument(
'--protected',
action='store_true',
default=False,
help='Make the cluster protected',
)
parser.add_argument(
'--transient',
action='store_true',
default=False,
help='Create transient cluster',
)
parser.add_argument(
'--json',
metavar='<filename>',
help='JSON representation of the cluster. Other '
'arguments (except for --wait) will not be taken into '
'account if this one is provided'
)
parser.add_argument(
'--wait',
action='store_true',
default=False,
help='Wait for the cluster creation to complete',
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
network_client = self.app.client_manager.network
if parsed_args.json:
blob = osc_utils.read_blob_file_contents(parsed_args.json)
try:
template = jsonutils.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'template from file %s: %s' % (parsed_args.json, e))
if 'neutron_management_network' in template:
template['net_id'] = template.pop('neutron_management_network')
if 'count' in template:
parsed_args.count = template['count']
data = client.clusters.create(**template).to_dict()
else:
if not parsed_args.name or not parsed_args.cluster_template \
or not parsed_args.image:
raise exceptions.CommandError(
                    'At least --name, --cluster-template and --image '
                    'arguments should be specified, or a JSON template '
                    'should be provided with the --json argument')
plugin, plugin_version, template_id = _get_plugin_version(
parsed_args.cluster_template, client)
image_id = utils.get_resource_id(client.images, parsed_args.image)
net_id = (network_client.find_network(
parsed_args.neutron_network, ignore_missing=False).id if
parsed_args.neutron_network else None)
data = client.clusters.create(
name=parsed_args.name,
plugin_name=plugin,
hadoop_version=plugin_version,
cluster_template_id=template_id,
default_image_id=image_id,
description=parsed_args.description,
is_transient=parsed_args.transient,
user_keypair_id=parsed_args.user_keypair,
net_id=net_id,
count=parsed_args.count,
is_public=parsed_args.public,
is_protected=parsed_args.protected
).to_dict()
if parsed_args.count and parsed_args.count > 1:
clusters = [
utils.get_resource(client.clusters, id)
for id in data['clusters']]
if parsed_args.wait:
for cluster in clusters:
if not osc_utils.wait_for_status(
client.clusters.get, cluster.id):
self.log.error(
'Error occurred during cluster creation: %s',
                            cluster.id)
data = {}
for cluster in clusters:
data[cluster.name] = cluster.id
else:
if parsed_args.wait:
if not osc_utils.wait_for_status(
client.clusters.get, data['id']):
self.log.error(
'Error occurred during cluster creation: %s',
data['id'])
data = client.clusters.get(data['id']).to_dict()
_format_cluster_output(data)
data = utils.prepare_data(data, CLUSTER_FIELDS)
return self.dict2columns(data)
class ListClusters(command.Lister):
"""Lists clusters"""
log = logging.getLogger(__name__ + ".ListClusters")
def get_parser(self, prog_name):
parser = super(ListClusters, self).get_parser(prog_name)
parser.add_argument(
'--long',
action='store_true',
default=False,
help='List additional fields in output',
)
parser.add_argument(
'--plugin',
metavar="<plugin>",
help="List clusters with specific plugin"
)
parser.add_argument(
'--plugin-version',
metavar="<plugin_version>",
help="List clusters with specific version of the "
"plugin"
)
parser.add_argument(
'--name',
metavar="<name-substring>",
help="List clusters with specific substring in the name"
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
search_opts = {}
if parsed_args.plugin:
search_opts['plugin_name'] = parsed_args.plugin
if parsed_args.plugin_version:
search_opts['hadoop_version'] = parsed_args.plugin_version
data = client.clusters.list(search_opts=search_opts)
if parsed_args.name:
data = utils.get_by_name_substring(data, parsed_args.name)
if parsed_args.long:
columns = ('name', 'id', 'plugin_name', 'hadoop_version',
'status', 'description', 'default_image_id')
column_headers = utils.prepare_column_headers(
columns, {'hadoop_version': 'plugin_version',
'default_image_id': 'image'})
else:
columns = ('name', 'id', 'plugin_name', 'hadoop_version', 'status')
column_headers = utils.prepare_column_headers(
columns, {'hadoop_version': 'plugin_version',
'default_image_id': 'image'})
return (
column_headers,
(osc_utils.get_item_properties(
s,
columns
) for s in data)
)
class ShowCluster(command.ShowOne):
"""Display cluster details"""
log = logging.getLogger(__name__ + ".ShowCluster")
def get_parser(self, prog_name):
parser = super(ShowCluster, self).get_parser(prog_name)
parser.add_argument(
"cluster",
metavar="<cluster>",
help="Name or id of the cluster to display",
)
parser.add_argument(
'--verification',
action='store_true',
default=False,
help='List additional fields for verifications',
)
parser.add_argument(
'--show-progress',
action='store_true',
default=False,
            help='Show brief details of the cluster event logs.'
)
parser.add_argument(
'--full-dump-events',
action='store_true',
default=False,
            help='Dump the full event log details to a file.'
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
kwargs = {}
if parsed_args.show_progress or parsed_args.full_dump_events:
kwargs['show_progress'] = True
data = utils.get_resource(
client.clusters, parsed_args.cluster, **kwargs).to_dict()
provision_steps = data.get('provision_progress', [])
provision_steps = utils.created_at_sorted(provision_steps)
if parsed_args.full_dump_events:
file_name = utils.random_name('event-logs')
# making full dump
with open(file_name, 'w') as file:
jsonutils.dump(provision_steps, file, indent=4)
sys.stdout.write('Event log dump saved to file: %s\n' % file_name)
_format_cluster_output(data)
fields = []
if parsed_args.verification:
ver_data, fields = _prepare_health_checks(data)
data.update(ver_data)
fields.extend(CLUSTER_FIELDS)
data = self.dict2columns(utils.prepare_data(data, fields))
if parsed_args.show_progress:
output_steps = []
for step in provision_steps:
st_name, st_type = step['step_name'], step['step_type']
description = "%s: %s" % (st_type, st_name)
if step['successful'] is None:
progress = "Step in progress"
elif step['successful']:
progress = "Step completed successfully"
else:
progress = 'Step has failed events'
output_steps += [(description, progress)]
data = utils.extend_columns(data, output_steps)
return data
class DeleteCluster(command.Command):
"""Deletes cluster"""
log = logging.getLogger(__name__ + ".DeleteCluster")
def get_parser(self, prog_name):
parser = super(DeleteCluster, self).get_parser(prog_name)
parser.add_argument(
"cluster",
metavar="<cluster>",
nargs="+",
help="Name(s) or id(s) of the cluster(s) to delete",
)
parser.add_argument(
'--wait',
action='store_true',
default=False,
help='Wait for the cluster(s) delete to complete',
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
clusters = []
for cluster in parsed_args.cluster:
cluster_id = utils.get_resource_id(
client.clusters, cluster)
client.clusters.delete(cluster_id)
clusters.append((cluster_id, cluster))
sys.stdout.write(
'Cluster "{cluster}" deletion has been started.\n'.format(
cluster=cluster))
if parsed_args.wait:
for cluster_id, cluster_arg in clusters:
if not utils.wait_for_delete(client.clusters, cluster_id):
self.log.error(
                    'Error occurred during cluster deletion: %s',
                    cluster_id)
else:
sys.stdout.write(
'Cluster "{cluster}" has been removed '
'successfully.\n'.format(cluster=cluster_arg))
class UpdateCluster(command.ShowOne):
"""Updates cluster"""
log = logging.getLogger(__name__ + ".UpdateCluster")
def get_parser(self, prog_name):
parser = super(UpdateCluster, self).get_parser(prog_name)
parser.add_argument(
'cluster',
metavar="<cluster>",
help="Name or ID of the cluster",
)
parser.add_argument(
'--name',
metavar="<name>",
help="New name of the cluster",
)
parser.add_argument(
'--description',
metavar="<description>",
help='Description of the cluster'
)
parser.add_argument(
'--shares',
metavar="<filename>",
help='JSON representation of the manila shares'
)
public = parser.add_mutually_exclusive_group()
public.add_argument(
'--public',
action='store_true',
help='Make the cluster public '
'(Visible from other projects)',
dest='is_public'
)
public.add_argument(
'--private',
action='store_false',
help='Make the cluster private '
'(Visible only from this tenant)',
dest='is_public'
)
protected = parser.add_mutually_exclusive_group()
protected.add_argument(
'--protected',
action='store_true',
help='Make the cluster protected',
dest='is_protected'
)
protected.add_argument(
'--unprotected',
action='store_false',
help='Make the cluster unprotected',
dest='is_protected'
)
parser.set_defaults(is_public=None, is_protected=None)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
cluster_id = utils.get_resource_id(
client.clusters, parsed_args.cluster)
shares = None
if parsed_args.shares:
blob = osc_utils.read_blob_file_contents(parsed_args.shares)
try:
shares = jsonutils.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'shares from file %s: %s' % (parsed_args.shares, e))
update_dict = utils.create_dict_from_kwargs(
name=parsed_args.name,
description=parsed_args.description,
is_public=parsed_args.is_public,
is_protected=parsed_args.is_protected,
shares=shares
)
data = client.clusters.update(cluster_id, **update_dict).cluster
_format_cluster_output(data)
data = utils.prepare_data(data, CLUSTER_FIELDS)
return self.dict2columns(data)
class ScaleCluster(command.ShowOne):
"""Scales cluster"""
log = logging.getLogger(__name__ + ".ScaleCluster")
def get_parser(self, prog_name):
parser = super(ScaleCluster, self).get_parser(prog_name)
parser.add_argument(
'cluster',
metavar="<cluster>",
help="Name or ID of the cluster",
)
parser.add_argument(
'--instances',
nargs='+',
metavar='<node-group-template:instances_count>',
help='Node group templates and number of their instances to be '
                 'scaled to [REQUIRED if JSON is not provided]'
)
parser.add_argument(
'--json',
metavar='<filename>',
help='JSON representation of the cluster scale object. Other '
'arguments (except for --wait) will not be taken into '
'account if this one is provided'
)
parser.add_argument(
'--wait',
action='store_true',
default=False,
help='Wait for the cluster scale to complete',
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
cluster = utils.get_resource(
client.clusters, parsed_args.cluster)
if parsed_args.json:
blob = osc_utils.read_blob_file_contents(parsed_args.json)
try:
template = jsonutils.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'template from file %s: %s' % (parsed_args.json, e))
data = client.clusters.scale(cluster.id, template).to_dict()
else:
scale_object = {
"add_node_groups": [],
"resize_node_groups": []
}
scale_node_groups = dict(
map(lambda x: x.split(':', 1), parsed_args.instances))
cluster_ng_map = {
ng['node_group_template_id']: ng['name'] for ng
in cluster.node_groups}
for name, count in scale_node_groups.items():
ngt = utils.get_resource(client.node_group_templates, name)
if ngt.id in cluster_ng_map:
scale_object["resize_node_groups"].append({
"name": cluster_ng_map[ngt.id],
"count": int(count)
})
else:
scale_object["add_node_groups"].append({
"node_group_template_id": ngt.id,
"name": ngt.name,
"count": int(count)
})
if not scale_object['add_node_groups']:
del scale_object['add_node_groups']
if not scale_object['resize_node_groups']:
del scale_object['resize_node_groups']
data = client.clusters.scale(cluster.id, scale_object).cluster
sys.stdout.write(
'Cluster "{cluster}" scaling has been started.\n'.format(
cluster=parsed_args.cluster))
if parsed_args.wait:
if not osc_utils.wait_for_status(
client.clusters.get, data['id']):
self.log.error(
                    'Error occurred during cluster scaling: %s',
                    cluster.id)
data = client.clusters.get(cluster.id).to_dict()
_format_cluster_output(data)
data = utils.prepare_data(data, CLUSTER_FIELDS)
return self.dict2columns(data)
class VerificationUpdateCluster(command.ShowOne):
"""Updates cluster verifications"""
log = logging.getLogger(__name__ + ".VerificationUpdateCluster")
def get_parser(self, prog_name):
parser = super(VerificationUpdateCluster, self).get_parser(prog_name)
parser.add_argument(
'cluster',
metavar="<cluster>",
help="Name or ID of the cluster",
)
status = parser.add_mutually_exclusive_group(required=True)
status.add_argument(
'--start',
action='store_const',
const='START',
help='Start health verification for the cluster',
dest='status'
)
status.add_argument(
'--show',
help='Show health of the cluster',
action='store_true'
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
if parsed_args.show:
data = utils.get_resource(
client.clusters, parsed_args.cluster).to_dict()
ver_data, ver_fields = _prepare_health_checks(data)
data = utils.prepare_data(ver_data, ver_fields)
return self.dict2columns(data)
else:
cluster_id = utils.get_resource_id(
client.clusters, parsed_args.cluster)
client.clusters.verification_update(
cluster_id, parsed_args.status)
if parsed_args.status == 'START':
print_status = 'started'
sys.stdout.write(
'Cluster "{cluster}" health verification has been '
'{status}.\n'.format(cluster=parsed_args.cluster,
status=print_status))
return {}, {}
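# --- Illustrative usage (not part of the original module) ---
# A sketch of the scale payload that ScaleCluster.take_action() builds
# from --instances (and that --json may supply directly); the group
# names, counts and template id are placeholders.
scale_object = {
    'resize_node_groups': [
        {'name': 'worker', 'count': 5},  # existing group, new size
    ],
    'add_node_groups': [
        {'node_group_template_id': 'NODE-GROUP-TEMPLATE-UUID',
         'name': 'monitoring', 'count': 1},  # group added to the cluster
    ],
}
# Passed as-is to the API: client.clusters.scale(cluster_id, scale_object)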

View File

@ -1,303 +0,0 @@
# Copyright (c) 2015 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
from osc_lib.command import command
from osc_lib import utils as osc_utils
from oslo_log import log as logging
from saharaclient.osc.v1 import utils
DATA_SOURCE_FIELDS = ['name', 'id', 'type', 'url', 'description', 'is_public',
'is_protected']
DATA_SOURCE_CHOICES = ["swift", "hdfs", "maprfs", "manila"]
class CreateDataSource(command.ShowOne):
"""Creates data source"""
log = logging.getLogger(__name__ + ".CreateDataSource")
def get_parser(self, prog_name):
parser = super(CreateDataSource, self).get_parser(prog_name)
parser.add_argument(
'name',
metavar="<name>",
help="Name of the data source",
)
parser.add_argument(
'--type',
metavar="<type>",
choices=DATA_SOURCE_CHOICES,
help="Type of the data source (%s) "
"[REQUIRED]" % ', '.join(DATA_SOURCE_CHOICES),
required=True
)
parser.add_argument(
'--url',
metavar="<url>",
help="URL for the data source [REQUIRED]",
required=True
)
parser.add_argument(
'--username',
metavar="<username>",
help="Username for accessing the data source URL"
)
parser.add_argument(
'--password',
metavar="<password>",
help="Password for accessing the data source URL"
)
parser.add_argument(
'--description',
metavar="<description>",
help="Description of the data source"
)
parser.add_argument(
'--public',
action='store_true',
default=False,
help='Make the data source public',
)
parser.add_argument(
'--protected',
action='store_true',
default=False,
help='Make the data source protected',
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
description = parsed_args.description or ''
data = client.data_sources.create(
name=parsed_args.name, description=description,
data_source_type=parsed_args.type, url=parsed_args.url,
credential_user=parsed_args.username,
credential_pass=parsed_args.password,
is_public=parsed_args.public,
is_protected=parsed_args.protected).to_dict()
data = utils.prepare_data(data, DATA_SOURCE_FIELDS)
return self.dict2columns(data)
class ListDataSources(command.Lister):
"""Lists data sources"""
log = logging.getLogger(__name__ + ".ListDataSources")
def get_parser(self, prog_name):
parser = super(ListDataSources, self).get_parser(prog_name)
parser.add_argument(
'--long',
action='store_true',
default=False,
help='List additional fields in output',
)
parser.add_argument(
'--type',
metavar="<type>",
choices=DATA_SOURCE_CHOICES,
help="List data sources of specific type "
"(%s)" % ', '.join(DATA_SOURCE_CHOICES)
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
search_opts = {'type': parsed_args.type} if parsed_args.type else {}
data = client.data_sources.list(search_opts=search_opts)
if parsed_args.long:
columns = DATA_SOURCE_FIELDS
column_headers = utils.prepare_column_headers(columns)
else:
columns = ('name', 'id', 'type')
column_headers = utils.prepare_column_headers(columns)
return (
column_headers,
(osc_utils.get_item_properties(
s,
columns
) for s in data)
)
class ShowDataSource(command.ShowOne):
"""Display data source details"""
log = logging.getLogger(__name__ + ".ShowDataSource")
def get_parser(self, prog_name):
parser = super(ShowDataSource, self).get_parser(prog_name)
parser.add_argument(
"data_source",
metavar="<data-source>",
help="Name or id of the data source to display",
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
data = utils.get_resource(
client.data_sources, parsed_args.data_source).to_dict()
data = utils.prepare_data(data, DATA_SOURCE_FIELDS)
return self.dict2columns(data)
class DeleteDataSource(command.Command):
"""Delete data source"""
log = logging.getLogger(__name__ + ".DeleteDataSource")
def get_parser(self, prog_name):
parser = super(DeleteDataSource, self).get_parser(prog_name)
parser.add_argument(
"data_source",
metavar="<data-source>",
nargs="+",
help="Name(s) or id(s) of the data source(s) to delete",
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
for ds in parsed_args.data_source:
data_source_id = utils.get_resource_id(
client.data_sources, ds)
client.data_sources.delete(data_source_id)
sys.stdout.write(
'Data Source "{ds}" has been removed '
'successfully.\n'.format(ds=ds))
class UpdateDataSource(command.ShowOne):
"""Update data source"""
log = logging.getLogger(__name__ + ".UpdateDataSource")
def get_parser(self, prog_name):
parser = super(UpdateDataSource, self).get_parser(prog_name)
parser.add_argument(
'data_source',
metavar="<data-source>",
help="Name or id of the data source",
)
parser.add_argument(
'--name',
metavar="<name>",
help="New name of the data source",
)
parser.add_argument(
'--type',
metavar="<type>",
choices=DATA_SOURCE_CHOICES,
help="Type of the data source "
"(%s)" % ', '.join(DATA_SOURCE_CHOICES)
)
parser.add_argument(
'--url',
metavar="<url>",
help="URL for the data source"
)
parser.add_argument(
'--username',
metavar="<username>",
help="Username for accessing the data source URL"
)
parser.add_argument(
'--password',
metavar="<password>",
help="Password for accessing the data source URL"
)
parser.add_argument(
'--description',
metavar="<description>",
help="Description of the data source"
)
public = parser.add_mutually_exclusive_group()
public.add_argument(
'--public',
action='store_true',
dest='is_public',
help='Make the data source public (Visible from other projects)',
)
public.add_argument(
'--private',
action='store_false',
dest='is_public',
help='Make the data source private (Visible only from this '
'tenant)',
)
protected = parser.add_mutually_exclusive_group()
protected.add_argument(
'--protected',
action='store_true',
dest='is_protected',
help='Make the data source protected',
)
protected.add_argument(
'--unprotected',
action='store_false',
dest='is_protected',
help='Make the data source unprotected',
)
parser.set_defaults(is_public=None, is_protected=None)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
credentials = {}
if parsed_args.username:
credentials['user'] = parsed_args.username
if parsed_args.password:
credentials['password'] = parsed_args.password
if not credentials:
credentials = None
update_fields = utils.create_dict_from_kwargs(
name=parsed_args.name,
description=parsed_args.description,
type=parsed_args.type, url=parsed_args.url,
credentials=credentials,
is_public=parsed_args.is_public,
is_protected=parsed_args.is_protected)
ds_id = utils.get_resource_id(
client.data_sources, parsed_args.data_source)
data = client.data_sources.update(ds_id, update_fields).data_source
data = utils.prepare_data(data, DATA_SOURCE_FIELDS)
return self.dict2columns(data)
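# --- Illustrative usage (not part of the original module) ---
# The CLI above maps onto these manager calls; `sahara` is assumed to be
# an authenticated saharaclient Client, and the swift URL and
# credentials are placeholders.
ds = sahara.data_sources.create(
    name='input-logs', description='',
    data_source_type='swift', url='swift://container/input',
    credential_user='demo', credential_pass='secret',
    is_public=False, is_protected=False)
sahara.data_sources.update(ds.id, {'description': 'raw web logs'})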

View File

@ -1,309 +0,0 @@
# Copyright (c) 2015 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
from osc_lib.command import command
from osc_lib import utils as osc_utils
from oslo_log import log as logging
from saharaclient.osc.v1 import utils
IMAGE_FIELDS = ['name', 'id', 'username', 'tags', 'status', 'description']
class ListImages(command.Lister):
"""Lists registered images"""
log = logging.getLogger(__name__ + ".ListImages")
def get_parser(self, prog_name):
parser = super(ListImages, self).get_parser(prog_name)
parser.add_argument(
'--long',
action='store_true',
default=False,
help='List additional fields in output',
)
parser.add_argument(
'--name',
metavar="<name-regex>",
help="Regular expression to match image name"
)
parser.add_argument(
'--tags',
metavar="<tag>",
nargs="+",
help="List images with specific tag(s)"
)
parser.add_argument(
'--username',
metavar="<username>",
help="List images with specific username"
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
search_opts = {'tags': parsed_args.tags} if parsed_args.tags else {}
data = client.images.list(search_opts=search_opts)
if parsed_args.name:
data = utils.get_by_name_substring(data, parsed_args.name)
if parsed_args.username:
data = [i for i in data if parsed_args.username in i.username]
if parsed_args.long:
columns = IMAGE_FIELDS
column_headers = [c.capitalize() for c in columns]
else:
columns = ('name', 'id', 'username', 'tags')
column_headers = [c.capitalize() for c in columns]
return (
column_headers,
(osc_utils.get_item_properties(
s,
columns,
formatters={
'tags': osc_utils.format_list
},
) for s in data)
)
class ShowImage(command.ShowOne):
"""Display image details"""
log = logging.getLogger(__name__ + ".ShowImage")
def get_parser(self, prog_name):
parser = super(ShowImage, self).get_parser(prog_name)
parser.add_argument(
"image",
metavar="<image>",
help="Name or id of the image to display",
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
data = utils.get_resource(
client.images, parsed_args.image).to_dict()
data['tags'] = osc_utils.format_list(data['tags'])
data = utils.prepare_data(data, IMAGE_FIELDS)
return self.dict2columns(data)
class RegisterImage(command.ShowOne):
"""Register an image"""
log = logging.getLogger(__name__ + ".RegisterImage")
def get_parser(self, prog_name):
parser = super(RegisterImage, self).get_parser(prog_name)
parser.add_argument(
"image",
metavar="<image>",
help="Name or ID of the image to register",
)
parser.add_argument(
"--username",
metavar="<username>",
help="Username of privileged user in the image [REQUIRED]",
required=True
)
parser.add_argument(
"--description",
metavar="<description>",
help="Description of the image. If not provided, description of "
"the image will be reset to empty",
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
image_client = self.app.client_manager.image
image_id = osc_utils.find_resource(
image_client.images, parsed_args.image).id
data = client.images.update_image(
image_id, user_name=parsed_args.username,
desc=parsed_args.description).image
data['tags'] = osc_utils.format_list(data['tags'])
data = utils.prepare_data(data, IMAGE_FIELDS)
return self.dict2columns(data)
class UnregisterImage(command.Command):
"""Unregister image(s)"""
log = logging.getLogger(__name__ + ".RegisterImage")
def get_parser(self, prog_name):
parser = super(UnregisterImage, self).get_parser(prog_name)
parser.add_argument(
"image",
metavar="<image>",
nargs="+",
help="Name(s) or id(s) of the image(s) to unregister",
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
for image in parsed_args.image:
image_id = utils.get_resource_id(client.images, image)
client.images.unregister_image(image_id)
sys.stdout.write(
'Image "{image}" has been unregistered '
'successfully.\n'.format(image=image))
class SetImageTags(command.ShowOne):
"""Set image tags (Replace current image tags with provided ones)"""
log = logging.getLogger(__name__ + ".AddImageTags")
def get_parser(self, prog_name):
parser = super(SetImageTags, self).get_parser(prog_name)
parser.add_argument(
"image",
metavar="<image>",
help="Name or id of the image",
)
parser.add_argument(
'--tags',
metavar="<tag>",
nargs="+",
required=True,
help="Tag(s) to set [REQUIRED]"
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
image_id = utils.get_resource_id(client.images, parsed_args.image)
data = client.images.update_tags(image_id, parsed_args.tags).to_dict()
data['tags'] = osc_utils.format_list(data['tags'])
data = utils.prepare_data(data, IMAGE_FIELDS)
return self.dict2columns(data)
class AddImageTags(command.ShowOne):
"""Add image tags"""
log = logging.getLogger(__name__ + ".AddImageTags")
def get_parser(self, prog_name):
parser = super(AddImageTags, self).get_parser(prog_name)
parser.add_argument(
"image",
metavar="<image>",
help="Name or id of the image",
)
parser.add_argument(
'--tags',
metavar="<tag>",
nargs="+",
required=True,
help="Tag(s) to add [REQUIRED]"
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
image = utils.get_resource(client.images, parsed_args.image)
parsed_args.tags.extend(image.tags)
data = client.images.update_tags(
image.id, list(set(parsed_args.tags))).to_dict()
data['tags'] = osc_utils.format_list(data['tags'])
data = utils.prepare_data(data, IMAGE_FIELDS)
return self.dict2columns(data)
class RemoveImageTags(command.ShowOne):
"""Remove image tags"""
log = logging.getLogger(__name__ + ".RemoveImageTags")
def get_parser(self, prog_name):
parser = super(RemoveImageTags, self).get_parser(prog_name)
parser.add_argument(
"image",
metavar="<image>",
help="Name or id of the image",
)
group = parser.add_mutually_exclusive_group()
group.add_argument(
'--tags',
metavar="<tag>",
nargs="+",
help="Tag(s) to remove"
        )
group.add_argument(
'--all',
action='store_true',
default=False,
help='Remove all tags from image',
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
image = utils.get_resource(client.images, parsed_args.image)
if parsed_args.all:
data = client.images.update_tags(image.id, []).to_dict()
else:
parsed_args.tags = parsed_args.tags or []
new_tags = list(set(image.tags) - set(parsed_args.tags))
data = client.images.update_tags(image.id, new_tags).to_dict()
data['tags'] = osc_utils.format_list(data['tags'])
data = utils.prepare_data(data, IMAGE_FIELDS)
return self.dict2columns(data)
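# --- Illustrative usage (not part of the original module) ---
# Tag updates always send the full resulting list; a sketch of the set
# logic used by AddImageTags and RemoveImageTags, with made-up tags.
current = {'vanilla', '2.7.1'}
added = list(current | {'stable'})    # AddImageTags: union with new tags
removed = list(current - {'2.7.1'})   # RemoveImageTags: difference
# Either list would be sent via client.images.update_tags(image_id, tags).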

View File

@ -1,431 +0,0 @@
# Copyright (c) 2015 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from os import path
import sys
from osc_lib.command import command
from osc_lib import exceptions
from osc_lib import utils as osc_utils
from oslo_log import log as logging
from oslo_serialization import jsonutils
from saharaclient.api import base
from saharaclient.osc.v1 import utils
JOB_BINARY_FIELDS = ['name', 'id', 'url', 'description', 'is_public',
'is_protected']
class CreateJobBinary(command.ShowOne):
"""Creates job binary"""
log = logging.getLogger(__name__ + ".CreateJobBinary")
def get_parser(self, prog_name):
parser = super(CreateJobBinary, self).get_parser(prog_name)
parser.add_argument(
'--name',
metavar="<name>",
help="Name of the job binary [REQUIRED if JSON is not provided]",
)
creation_type = parser.add_mutually_exclusive_group()
creation_type.add_argument(
'--data',
metavar='<file>',
help='File that will be stored in the internal DB [REQUIRED if '
'JSON and URL are not provided]'
)
creation_type.add_argument(
'--url',
metavar='<url>',
help='URL for the job binary [REQUIRED if JSON and file are '
'not provided]'
)
parser.add_argument(
'--description',
metavar="<description>",
help="Description of the job binary"
)
parser.add_argument(
'--username',
metavar='<username>',
help='Username for accessing the job binary URL',
)
password = parser.add_mutually_exclusive_group()
password.add_argument(
'--password',
metavar='<password>',
help='Password for accessing the job binary URL',
)
password.add_argument(
'--password-prompt',
dest="password_prompt",
action="store_true",
help='Prompt interactively for password',
)
parser.add_argument(
'--public',
action='store_true',
default=False,
help='Make the job binary public',
)
parser.add_argument(
'--protected',
action='store_true',
default=False,
help='Make the job binary protected',
)
parser.add_argument(
'--json',
metavar='<filename>',
help='JSON representation of the job binary. Other '
'arguments will not be taken into account if this one is '
'provided'
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
if parsed_args.json:
blob = osc_utils.read_blob_file_contents(parsed_args.json)
try:
template = jsonutils.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'template from file %s: %s' % (parsed_args.json, e))
data = client.job_binaries.create(**template).to_dict()
else:
if parsed_args.data:
                with open(parsed_args.data) as f:
                    data = f.read()
jbi_id = client.job_binary_internals.create(
parsed_args.name, data).id
parsed_args.url = 'internal-db://' + jbi_id
if parsed_args.password_prompt:
parsed_args.password = osc_utils.get_password(
self.app.stdin, confirm=False)
if parsed_args.password and not parsed_args.username:
raise exceptions.CommandError(
                    'A username should be provided via --username when a '
                    'password is given')
if parsed_args.username and not parsed_args.password:
raise exceptions.CommandError(
'Password should be provided via --password or entered '
'interactively with --password-prompt')
if parsed_args.password and parsed_args.username:
extra = {
'user': parsed_args.username,
'password': parsed_args.password
}
else:
extra = None
data = client.job_binaries.create(
name=parsed_args.name, url=parsed_args.url,
description=parsed_args.description, extra=extra,
is_public=parsed_args.public,
is_protected=parsed_args.protected).to_dict()
data = utils.prepare_data(data, JOB_BINARY_FIELDS)
return self.dict2columns(data)
class ListJobBinaries(command.Lister):
"""Lists job binaries"""
log = logging.getLogger(__name__ + ".ListJobBinaries")
def get_parser(self, prog_name):
parser = super(ListJobBinaries, self).get_parser(prog_name)
parser.add_argument(
'--long',
action='store_true',
default=False,
help='List additional fields in output',
)
parser.add_argument(
'--name',
metavar="<name-substring>",
help="List job binaries with specific substring in the "
"name"
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
data = client.job_binaries.list()
if parsed_args.name:
data = utils.get_by_name_substring(data, parsed_args.name)
if parsed_args.long:
columns = ('name', 'id', 'url', 'description', 'is_public',
'is_protected')
column_headers = utils.prepare_column_headers(columns)
else:
columns = ('name', 'id', 'url')
column_headers = utils.prepare_column_headers(columns)
return (
column_headers,
(osc_utils.get_item_properties(
s,
columns
) for s in data)
)
class ShowJobBinary(command.ShowOne):
"""Display job binary details"""
log = logging.getLogger(__name__ + ".ShowJobBinary")
def get_parser(self, prog_name):
parser = super(ShowJobBinary, self).get_parser(prog_name)
parser.add_argument(
"job_binary",
metavar="<job-binary>",
help="Name or ID of the job binary to display",
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
data = utils.get_resource(
client.job_binaries, parsed_args.job_binary).to_dict()
data = utils.prepare_data(data, JOB_BINARY_FIELDS)
return self.dict2columns(data)
class DeleteJobBinary(command.Command):
"""Deletes job binary"""
log = logging.getLogger(__name__ + ".DeleteJobBinary")
def get_parser(self, prog_name):
parser = super(DeleteJobBinary, self).get_parser(prog_name)
parser.add_argument(
"job_binary",
metavar="<job-binary>",
nargs="+",
help="Name(s) or id(s) of the job binary(ies) to delete",
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
        for jb_name in parsed_args.job_binary:
            jb = utils.get_resource(client.job_binaries, jb_name)
if jb.url.startswith("internal-db"):
jbi_id = jb.url.replace('internal-db://', '')
try:
client.job_binary_internals.delete(jbi_id)
except base.APIException as ex:
# check if job binary internal was already deleted for
# some reasons
if not ex.error_code == '404':
raise
client.job_binaries.delete(jb.id)
sys.stdout.write(
'Job binary "{jb}" has been removed '
                'successfully.\n'.format(jb=jb_name))
class UpdateJobBinary(command.ShowOne):
"""Updates job binary"""
log = logging.getLogger(__name__ + ".UpdateJobBinary")
def get_parser(self, prog_name):
parser = super(UpdateJobBinary, self).get_parser(prog_name)
parser.add_argument(
'job_binary',
metavar="<job-binary>",
help="Name or ID of the job binary",
)
parser.add_argument(
'--name',
metavar="<name>",
help="New name of the job binary",
)
parser.add_argument(
'--url',
metavar='<url>',
help='URL for the job binary [Internal DB URL can not be updated]'
)
parser.add_argument(
'--description',
metavar="<description>",
help='Description of the job binary'
)
parser.add_argument(
'--username',
metavar='<username>',
help='Username for accessing the job binary URL',
)
password = parser.add_mutually_exclusive_group()
password.add_argument(
'--password',
metavar='<password>',
help='Password for accessing the job binary URL',
)
password.add_argument(
'--password-prompt',
dest="password_prompt",
action="store_true",
help='Prompt interactively for password',
)
public = parser.add_mutually_exclusive_group()
public.add_argument(
'--public',
action='store_true',
help='Make the job binary public (Visible from other projects)',
dest='is_public'
)
public.add_argument(
'--private',
action='store_false',
help='Make the job binary private (Visible only from'
' this project)',
dest='is_public'
)
protected = parser.add_mutually_exclusive_group()
protected.add_argument(
'--protected',
action='store_true',
help='Make the job binary protected',
dest='is_protected'
)
protected.add_argument(
'--unprotected',
action='store_false',
help='Make the job binary unprotected',
dest='is_protected'
)
parser.add_argument(
'--json',
metavar='<filename>',
help='JSON representation of the update object. Other '
'arguments will not be taken into account if this one is '
'provided'
)
parser.set_defaults(is_public=None, is_protected=None)
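        # Tri-state defaults: None means "leave unchanged", so only the
        # flags the user actually passed end up in the update payload.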
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
jb_id = utils.get_resource_id(
client.job_binaries, parsed_args.job_binary)
if parsed_args.json:
blob = osc_utils.read_blob_file_contents(parsed_args.json)
try:
template = jsonutils.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'template from file %s: %s' % (parsed_args.json, e))
data = client.job_binaries.update(jb_id, template).to_dict()
else:
if parsed_args.password_prompt:
parsed_args.password = osc_utils.get_password(
self.app.stdin, confirm=False)
extra = {}
if parsed_args.password:
extra['password'] = parsed_args.password
if parsed_args.username:
extra['user'] = parsed_args.username
if not extra:
extra = None
update_fields = utils.create_dict_from_kwargs(
name=parsed_args.name, url=parsed_args.url,
description=parsed_args.description,
extra=extra, is_public=parsed_args.is_public,
is_protected=parsed_args.is_protected
)
data = client.job_binaries.update(
jb_id, update_fields).to_dict()
data = utils.prepare_data(data, JOB_BINARY_FIELDS)
return self.dict2columns(data)
class DownloadJobBinary(command.Command):
"""Downloads job binary"""
log = logging.getLogger(__name__ + ".DownloadJobBinary")
def get_parser(self, prog_name):
parser = super(DownloadJobBinary, self).get_parser(prog_name)
parser.add_argument(
"job_binary",
metavar="<job-binary>",
help="Name or ID of the job binary to download",
)
parser.add_argument(
'--file',
metavar="<file>",
help='Destination file (defaults to job binary name)',
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
if not parsed_args.file:
parsed_args.file = parsed_args.job_binary
if path.exists(parsed_args.file):
msg = ('File "%s" already exists. Chose another one with '
'--file argument.' % parsed_args.file)
raise exceptions.CommandError(msg)
else:
jb_id = utils.get_resource_id(
client.job_binaries, parsed_args.job_binary)
data = client.job_binaries.get_file(jb_id)
with open(parsed_args.file, 'w') as f:
f.write(data)
sys.stdout.write(
'Job binary "{jb}" has been downloaded '
'successfully.\n'.format(jb=parsed_args.job_binary))

View File

@ -1,327 +0,0 @@
# Copyright (c) 2015 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
from osc_lib.command import command
from osc_lib import exceptions
from osc_lib import utils as osc_utils
from oslo_log import log as logging
from oslo_serialization import jsonutils
from saharaclient.osc.v1 import utils
JOB_TEMPLATE_FIELDS = ['name', 'id', 'type', 'mains', 'libs', 'description',
'is_public', 'is_protected']
JOB_TYPES_CHOICES = ['Hive', 'Java', 'MapReduce', 'Storm', 'Storm.Pyleus',
'Pig', 'Shell', 'MapReduce.Streaming', 'Spark']
def _format_job_template_output(data):
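    # Mutates 'data' in place: each main/lib binary is rendered as a
    # "name:id" pair and the lists are flattened for tabular output.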
data['mains'] = osc_utils.format_list(
['%s:%s' % (m['name'], m['id']) for m in data['mains']])
data['libs'] = osc_utils.format_list(
['%s:%s' % (l['name'], l['id']) for l in data['libs']])
class CreateJobTemplate(command.ShowOne):
"""Creates job template"""
log = logging.getLogger(__name__ + ".CreateJobTemplate")
def get_parser(self, prog_name):
parser = super(CreateJobTemplate, self).get_parser(prog_name)
parser.add_argument(
'--name',
metavar="<name>",
help="Name of the job template [REQUIRED if JSON is not provided]",
)
parser.add_argument(
'--type',
metavar="<type>",
choices=JOB_TYPES_CHOICES,
help="Type of the job (%s) "
"[REQUIRED if JSON is not provided]" % ', '.join(
JOB_TYPES_CHOICES)
)
parser.add_argument(
'--mains',
metavar="<main>",
nargs='+',
help="Name(s) or ID(s) for job's main job binary(s)",
)
parser.add_argument(
'--libs',
metavar="<lib>",
nargs='+',
help="Name(s) or ID(s) for job's lib job binary(s)",
)
parser.add_argument(
'--description',
metavar="<description>",
help="Description of the job template"
)
parser.add_argument(
'--public',
action='store_true',
default=False,
help='Make the job template public',
)
parser.add_argument(
'--protected',
action='store_true',
default=False,
help='Make the job template protected',
)
parser.add_argument(
'--interface',
metavar='<filename>',
help='JSON representation of the interface'
)
parser.add_argument(
'--json',
metavar='<filename>',
help='JSON representation of the job template'
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
if parsed_args.json:
blob = osc_utils.read_blob_file_contents(parsed_args.json)
try:
template = jsonutils.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'template from file %s: %s' % (parsed_args.json, e))
data = client.jobs.create(**template).to_dict()
else:
if parsed_args.interface:
                blob = osc_utils.read_blob_file_contents(
                    parsed_args.interface)
try:
parsed_args.interface = jsonutils.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
                        'interface from file %s: %s' % (
                            parsed_args.interface, e))
mains_ids = [utils.get_resource_id(client.job_binaries, m) for m
in parsed_args.mains] if parsed_args.mains else None
libs_ids = [utils.get_resource_id(client.job_binaries, m) for m
in parsed_args.libs] if parsed_args.libs else None
data = client.jobs.create(
name=parsed_args.name, type=parsed_args.type, mains=mains_ids,
libs=libs_ids, description=parsed_args.description,
interface=parsed_args.interface, is_public=parsed_args.public,
is_protected=parsed_args.protected).to_dict()
_format_job_template_output(data)
data = utils.prepare_data(data, JOB_TEMPLATE_FIELDS)
return self.dict2columns(data)
class ListJobTemplates(command.Lister):
"""Lists job templates"""
log = logging.getLogger(__name__ + ".ListJobTemplates")
def get_parser(self, prog_name):
parser = super(ListJobTemplates, self).get_parser(prog_name)
parser.add_argument(
'--long',
action='store_true',
default=False,
help='List additional fields in output',
)
parser.add_argument(
'--type',
metavar="<type>",
choices=JOB_TYPES_CHOICES,
help="List job templates of specific type"
)
parser.add_argument(
'--name',
metavar="<name-substring>",
help="List job templates with specific substring in the "
"name"
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
search_opts = {'type': parsed_args.type} if parsed_args.type else {}
data = client.jobs.list(search_opts=search_opts)
if parsed_args.name:
data = utils.get_by_name_substring(data, parsed_args.name)
if parsed_args.long:
columns = ('name', 'id', 'type', 'description', 'is_public',
'is_protected')
column_headers = utils.prepare_column_headers(columns)
else:
columns = ('name', 'id', 'type')
column_headers = utils.prepare_column_headers(columns)
return (
column_headers,
(osc_utils.get_item_properties(
s,
columns
) for s in data)
)
class ShowJobTemplate(command.ShowOne):
"""Display job template details"""
log = logging.getLogger(__name__ + ".ShowJobTemplate")
def get_parser(self, prog_name):
parser = super(ShowJobTemplate, self).get_parser(prog_name)
parser.add_argument(
"job_template",
metavar="<job-template>",
help="Name or ID of the job template to display",
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
data = utils.get_resource(
client.jobs, parsed_args.job_template).to_dict()
_format_job_template_output(data)
data = utils.prepare_data(data, JOB_TEMPLATE_FIELDS)
return self.dict2columns(data)
class DeleteJobTemplate(command.Command):
"""Deletes job template"""
log = logging.getLogger(__name__ + ".DeleteJobTemplate")
def get_parser(self, prog_name):
parser = super(DeleteJobTemplate, self).get_parser(prog_name)
parser.add_argument(
"job_template",
metavar="<job-template>",
nargs="+",
help="Name(s) or id(s) of the job template(s) to delete",
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
for jt in parsed_args.job_template:
jt_id = utils.get_resource_id(client.jobs, jt)
client.jobs.delete(jt_id)
sys.stdout.write(
'Job template "{jt}" has been removed '
'successfully.\n'.format(jt=jt))
class UpdateJobTemplate(command.ShowOne):
"""Updates job template"""
log = logging.getLogger(__name__ + ".UpdateJobTemplate")
def get_parser(self, prog_name):
parser = super(UpdateJobTemplate, self).get_parser(prog_name)
parser.add_argument(
'job_template',
metavar="<job-template>",
help="Name or ID of the job template",
)
parser.add_argument(
'--name',
metavar="<name>",
help="New name of the job template",
)
parser.add_argument(
'--description',
metavar="<description>",
help='Description of the job template'
)
public = parser.add_mutually_exclusive_group()
public.add_argument(
'--public',
action='store_true',
help='Make the job template public '
'(Visible from other projects)',
dest='is_public'
)
public.add_argument(
'--private',
action='store_false',
            help='Make the job template private '
                 '(Visible only from this project)',
dest='is_public'
)
protected = parser.add_mutually_exclusive_group()
protected.add_argument(
'--protected',
action='store_true',
help='Make the job template protected',
dest='is_protected'
)
protected.add_argument(
'--unprotected',
action='store_false',
help='Make the job template unprotected',
dest='is_protected'
)
parser.set_defaults(is_public=None, is_protected=None)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
jt_id = utils.get_resource_id(
client.jobs, parsed_args.job_template)
update_data = utils.create_dict_from_kwargs(
name=parsed_args.name,
description=parsed_args.description,
is_public=parsed_args.is_public,
is_protected=parsed_args.is_protected
)
data = client.jobs.update(jt_id, **update_data).job
_format_job_template_output(data)
data = utils.prepare_data(data, JOB_TEMPLATE_FIELDS)
return self.dict2columns(data)

View File

@ -1,133 +0,0 @@
# Copyright (c) 2015 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from os import path
import sys
from osc_lib.command import command
from osc_lib import exceptions
from osc_lib import utils as osc_utils
from oslo_log import log as logging
from oslo_serialization import jsonutils
from saharaclient.osc.v1.job_templates import JOB_TYPES_CHOICES
from saharaclient.osc.v1 import utils
class ListJobTypes(command.Lister):
"""Lists job types supported by plugins"""
log = logging.getLogger(__name__ + ".ListJobTypes")
def get_parser(self, prog_name):
parser = super(ListJobTypes, self).get_parser(prog_name)
parser.add_argument(
'--type',
metavar="<type>",
choices=JOB_TYPES_CHOICES,
help="Get information about specific job type"
)
parser.add_argument(
'--plugin',
metavar="<plugin>",
help="Get only job types supported by this plugin"
)
parser.add_argument(
'--plugin-version',
metavar="<plugin_version>",
help="Get only job types supported by specific version of the "
"plugin. This parameter will be taken into account only if "
"plugin is provided"
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
search_opts = {}
if parsed_args.type:
search_opts['type'] = parsed_args.type
if parsed_args.plugin:
search_opts['plugin'] = parsed_args.plugin
if parsed_args.plugin_version:
search_opts['plugin_version'] = parsed_args.plugin_version
elif parsed_args.plugin_version:
raise exceptions.CommandError(
'--plugin-version argument should be specified with --plugin '
'argument')
data = client.job_types.list(search_opts=search_opts)
for job in data:
plugins = []
for plugin in job.plugins:
versions = ", ".join(sorted(plugin["versions"].keys()))
if versions:
versions = "(" + versions + ")"
plugins.append(plugin["name"] + versions)
job.plugins = ', '.join(plugins)
columns = ('name', 'plugins')
column_headers = utils.prepare_column_headers(columns)
return (
column_headers,
(osc_utils.get_item_properties(
s,
columns
) for s in data)
)
class GetJobTypeConfigs(command.Command):
"""Get job type configs"""
log = logging.getLogger(__name__ + ".GetJobTypeConfigs")
def get_parser(self, prog_name):
parser = super(GetJobTypeConfigs, self).get_parser(prog_name)
parser.add_argument(
"job_type",
metavar="<job-type>",
choices=JOB_TYPES_CHOICES,
help="Type of the job to provide config information about",
)
parser.add_argument(
'--file',
metavar="<file>",
help='Destination file (defaults to job type)',
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
if not parsed_args.file:
parsed_args.file = parsed_args.job_type
data = client.jobs.get_configs(parsed_args.job_type).to_dict()
if path.exists(parsed_args.file):
self.log.error('File "%s" already exists. Choose another one with '
                           'the --file argument.' % parsed_args.file)
else:
with open(parsed_args.file, 'w') as f:
jsonutils.dump(data, f, indent=4)
sys.stdout.write(
'"%(type)s" job configs were saved in "%(file)s"'
'file' % {'type': parsed_args.job_type,
'file': parsed_args.file})

View File

@ -1,384 +0,0 @@
# Copyright (c) 2015 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
from osc_lib.command import command
from osc_lib import exceptions
from osc_lib import utils as osc_utils
from oslo_log import log as logging
from oslo_serialization import jsonutils
from saharaclient.osc.v1 import utils
JOB_FIELDS = ['id', 'job_template_id', 'cluster_id', 'input_id', 'output_id',
'start_time', 'end_time', 'status', 'is_public', 'is_protected',
'engine_job_id']
JOB_STATUS_CHOICES = ['done-with-error', 'failed', 'killed', 'pending',
'running', 'succeeded', 'to-be-killed']
def _format_job_output(data):
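    # Mutates 'data' in place: the status is pulled out of the nested
    # 'info' blob and the server-side 'job_id' is renamed to the
    # client-facing 'job_template_id'.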
data['status'] = data['info']['status']
del data['info']
data['job_template_id'] = data.pop('job_id')
class ExecuteJob(command.ShowOne):
"""Executes job"""
log = logging.getLogger(__name__ + ".ExecuteJob")
def get_parser(self, prog_name):
parser = super(ExecuteJob, self).get_parser(prog_name)
parser.add_argument(
'--job-template',
metavar="<job-template>",
help="Name or ID of the job template "
"[REQUIRED if JSON is not provided]",
)
parser.add_argument(
'--cluster',
metavar="<cluster>",
help="Name or ID of the cluster "
"[REQUIRED if JSON is not provided]",
)
parser.add_argument(
'--input',
metavar="<input>",
help="Name or ID of the input data source",
)
parser.add_argument(
'--output',
metavar="<output>",
help="Name or ID of the output data source",
)
parser.add_argument(
'--params',
metavar="<name:value>",
nargs='+',
help="Parameters to add to the job"
)
parser.add_argument(
'--args',
metavar="<argument>",
nargs='+',
help="Arguments to add to the job"
)
parser.add_argument(
'--public',
action='store_true',
default=False,
help='Make the job public',
)
parser.add_argument(
'--protected',
action='store_true',
default=False,
help='Make the job protected',
)
configs = parser.add_mutually_exclusive_group()
configs.add_argument(
'--config-json',
metavar='<filename>',
help='JSON representation of the job configs'
)
configs.add_argument(
'--configs',
metavar="<name:value>",
nargs='+',
help="Configs to add to the job"
)
parser.add_argument(
'--interface',
metavar='<filename>',
help='JSON representation of the interface'
)
parser.add_argument(
'--json',
metavar='<filename>',
help='JSON representation of the job. Other arguments will not be '
'taken into account if this one is provided'
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
if parsed_args.json:
blob = osc_utils.read_blob_file_contents(parsed_args.json)
try:
template = jsonutils.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'template from file %s: %s' % (parsed_args.json, e))
if 'job_configs' in template:
template['configs'] = template.pop('job_configs')
data = client.job_executions.create(**template).to_dict()
else:
if not parsed_args.cluster or not parsed_args.job_template:
raise exceptions.CommandError(
                    'At least the --cluster and --job-template arguments '
                    'should be specified, or a JSON template should be '
                    'provided with the --json argument')
job_configs = {}
if parsed_args.interface:
                blob = osc_utils.read_blob_file_contents(
                    parsed_args.interface)
try:
parsed_args.interface = jsonutils.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
                        'interface from file %s: %s' % (
                            parsed_args.interface, e))
if parsed_args.config_json:
                blob = osc_utils.read_blob_file_contents(
                    parsed_args.config_json)
try:
job_configs['configs'] = jsonutils.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
                        'configs from file %s: %s' % (
                            parsed_args.config_json, e))
elif parsed_args.configs:
job_configs['configs'] = dict(
map(lambda x: x.split(':', 1), parsed_args.configs))
if parsed_args.args:
job_configs['args'] = parsed_args.args
if parsed_args.params:
job_configs['params'] = dict(
map(lambda x: x.split(':', 1), parsed_args.params))
jt_id = utils.get_resource_id(
client.jobs, parsed_args.job_template)
cluster_id = utils.get_resource_id(
client.clusters, parsed_args.cluster)
if parsed_args.input not in [None, "", "None"]:
input_id = utils.get_resource_id(
client.data_sources, parsed_args.input)
else:
input_id = None
if parsed_args.output not in [None, "", "None"]:
output_id = utils.get_resource_id(
client.data_sources, parsed_args.output)
else:
output_id = None
data = client.job_executions.create(
job_id=jt_id, cluster_id=cluster_id, input_id=input_id,
output_id=output_id, interface=parsed_args.interface,
configs=job_configs, is_public=parsed_args.public,
is_protected=parsed_args.protected).to_dict()
sys.stdout.write(
'Job "{job}" has been started successfully.\n'.format(
job=data['id']))
_format_job_output(data)
data = utils.prepare_data(data, JOB_FIELDS)
return self.dict2columns(data)
class ListJobs(command.Lister):
"""Lists jobs"""
log = logging.getLogger(__name__ + ".ListJobs")
def get_parser(self, prog_name):
parser = super(ListJobs, self).get_parser(prog_name)
parser.add_argument(
'--long',
action='store_true',
default=False,
help='List additional fields in output',
)
parser.add_argument(
'--status',
metavar="<status>",
choices=JOB_STATUS_CHOICES,
help="List jobs with specific status"
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
data = client.job_executions.list()
for job in data:
job.status = job.info['status']
if parsed_args.status:
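            # The API reports statuses upper-cased and without dashes
            # (e.g. 'done-with-error' becomes 'DONEWITHERROR'), so the
            # CLI choice is normalized before comparing.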
data = [job for job in data
if job.info['status'] == parsed_args.status.replace(
'-', '').upper()]
if parsed_args.long:
columns = ('id', 'cluster id', 'job id', 'status', 'start time',
'end time')
column_headers = utils.prepare_column_headers(columns)
else:
columns = ('id', 'cluster id', 'job id', 'status')
column_headers = utils.prepare_column_headers(columns)
return (
column_headers,
(osc_utils.get_item_properties(
s,
columns
) for s in data)
)
class ShowJob(command.ShowOne):
"""Display job details"""
log = logging.getLogger(__name__ + ".ShowJob")
def get_parser(self, prog_name):
parser = super(ShowJob, self).get_parser(prog_name)
parser.add_argument(
"job",
metavar="<job>",
help="ID of the job to display",
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
data = client.job_executions.get(parsed_args.job).to_dict()
_format_job_output(data)
data = utils.prepare_data(data, JOB_FIELDS)
return self.dict2columns(data)
class DeleteJob(command.Command):
"""Deletes job"""
log = logging.getLogger(__name__ + ".DeleteJob")
def get_parser(self, prog_name):
parser = super(DeleteJob, self).get_parser(prog_name)
parser.add_argument(
"job",
metavar="<job>",
nargs="+",
help="ID(s) of the job(s) to delete",
)
parser.add_argument(
'--wait',
action='store_true',
default=False,
help='Wait for the job(s) delete to complete',
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
for job_id in parsed_args.job:
client.job_executions.delete(job_id)
sys.stdout.write(
'Job "{job}" deletion has been started.\n'.format(job=job_id))
if parsed_args.wait:
for job_id in parsed_args.job:
if not utils.wait_for_delete(client.job_executions, job_id):
self.log.error(
                        'An error occurred during job deletion: %s',
                        job_id)
else:
sys.stdout.write(
'Job "{job}" has been removed successfully.\n'.format(
job=job_id))
class UpdateJob(command.ShowOne):
"""Updates job"""
log = logging.getLogger(__name__ + ".UpdateJob")
def get_parser(self, prog_name):
parser = super(UpdateJob, self).get_parser(prog_name)
parser.add_argument(
'job',
metavar="<job>",
help="ID of the job to update",
)
public = parser.add_mutually_exclusive_group()
public.add_argument(
'--public',
action='store_true',
help='Make the job public (Visible from other projects)',
dest='is_public'
)
public.add_argument(
'--private',
action='store_false',
help='Make the job private (Visible only from this project)',
dest='is_public'
)
protected = parser.add_mutually_exclusive_group()
protected.add_argument(
'--protected',
action='store_true',
help='Make the job protected',
dest='is_protected'
)
protected.add_argument(
'--unprotected',
action='store_false',
help='Make the job unprotected',
dest='is_protected'
)
parser.set_defaults(is_public=None, is_protected=None)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
update_dict = utils.create_dict_from_kwargs(
is_public=parsed_args.is_public,
is_protected=parsed_args.is_protected)
data = client.job_executions.update(
parsed_args.job, **update_dict).job_execution
_format_job_output(data)
data = utils.prepare_data(data, JOB_FIELDS)
return self.dict2columns(data)

View File

@ -1,691 +0,0 @@
# Copyright (c) 2015 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
from osc_lib.command import command
from osc_lib import exceptions
from osc_lib import utils as osc_utils
from oslo_log import log as logging
from oslo_serialization import jsonutils as json
from saharaclient.osc.v1 import utils
NGT_FIELDS = ['id', 'name', 'plugin_name', 'plugin_version', 'node_processes',
'description', 'auto_security_group', 'security_groups',
'availability_zone', 'flavor_id', 'floating_ip_pool',
'volumes_per_node', 'volumes_size',
'volume_type', 'volume_local_to_instance', 'volume_mount_prefix',
'volumes_availability_zone', 'use_autoconfig',
'is_proxy_gateway', 'is_default', 'is_protected', 'is_public']
def _format_ngt_output(data):
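    # Mutates 'data' in place; the volume-related fields are dropped
    # entirely when the node group has no attached volumes.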
data['node_processes'] = osc_utils.format_list(data['node_processes'])
data['plugin_version'] = data.pop('hadoop_version')
if data['volumes_per_node'] == 0:
del data['volume_local_to_instance']
del data['volume_mount_prefix']
        del data['volume_type']
del data['volumes_availability_zone']
del data['volumes_size']
class CreateNodeGroupTemplate(command.ShowOne):
"""Creates node group template"""
log = logging.getLogger(__name__ + ".CreateNodeGroupTemplate")
def get_parser(self, prog_name):
parser = super(CreateNodeGroupTemplate, self).get_parser(prog_name)
parser.add_argument(
'--name',
metavar="<name>",
help="Name of the node group template [REQUIRED if JSON is not "
"provided]",
)
parser.add_argument(
'--plugin',
metavar="<plugin>",
help="Name of the plugin [REQUIRED if JSON is not provided]"
)
parser.add_argument(
'--plugin-version',
metavar="<plugin_version>",
help="Version of the plugin [REQUIRED if JSON is not provided]"
)
parser.add_argument(
'--processes',
metavar="<processes>",
nargs="+",
help="List of the processes that will be launched on each "
"instance [REQUIRED if JSON is not provided]"
)
parser.add_argument(
'--flavor',
metavar="<flavor>",
help="Name or ID of the flavor [REQUIRED if JSON is not provided]"
)
parser.add_argument(
'--security-groups',
metavar="<security-groups>",
nargs="+",
help="List of the security groups for the instances in this node "
"group"
)
parser.add_argument(
'--auto-security-group',
action='store_true',
default=False,
help='Indicates if an additional security group should be created '
'for the node group',
)
parser.add_argument(
'--availability-zone',
metavar="<availability-zone>",
help="Name of the availability zone where instances "
"will be created"
)
parser.add_argument(
'--floating-ip-pool',
metavar="<floating-ip-pool>",
help="ID of the floating IP pool"
)
parser.add_argument(
'--volumes-per-node',
type=int,
metavar="<volumes-per-node>",
help="Number of volumes attached to every node"
)
parser.add_argument(
'--volumes-size',
type=int,
metavar="<volumes-size>",
help='Size of volumes attached to node (GB). '
'This parameter will be taken into account only '
'if volumes-per-node is set and non-zero'
)
parser.add_argument(
'--volumes-type',
metavar="<volumes-type>",
help='Type of the volumes. '
'This parameter will be taken into account only '
'if volumes-per-node is set and non-zero'
)
parser.add_argument(
'--volumes-availability-zone',
metavar="<volumes-availability-zone>",
help='Name of the availability zone where volumes will be created.'
' This parameter will be taken into account only '
'if volumes-per-node is set and non-zero'
)
parser.add_argument(
'--volumes-mount-prefix',
metavar="<volumes-mount-prefix>",
help='Prefix for mount point directory. '
'This parameter will be taken into account only '
'if volumes-per-node is set and non-zero'
)
parser.add_argument(
'--volumes-locality',
action='store_true',
default=False,
help='If enabled, instance and attached volumes will be created on'
' the same physical host. This parameter will be taken into '
'account only if volumes-per-node is set and non-zero',
)
parser.add_argument(
'--description',
metavar="<description>",
help='Description of the node group template'
)
parser.add_argument(
'--autoconfig',
action='store_true',
default=False,
help='If enabled, instances of the node group will be '
'automatically configured',
)
parser.add_argument(
'--proxy-gateway',
action='store_true',
default=False,
help='If enabled, instances of the node group will be used to '
'access other instances in the cluster',
)
parser.add_argument(
'--public',
action='store_true',
default=False,
help='Make the node group template public (Visible from other '
'projects)',
)
parser.add_argument(
'--protected',
action='store_true',
default=False,
help='Make the node group template protected',
)
parser.add_argument(
'--json',
metavar='<filename>',
help='JSON representation of the node group template. Other '
'arguments will not be taken into account if this one is '
'provided'
)
parser.add_argument(
'--shares',
metavar='<filename>',
help='JSON representation of the manila shares'
)
parser.add_argument(
'--configs',
metavar='<filename>',
help='JSON representation of the node group template configs'
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
if parsed_args.json:
blob = osc_utils.read_blob_file_contents(parsed_args.json)
try:
template = json.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'template from file %s: %s' % (parsed_args.json, e))
data = client.node_group_templates.create(**template).to_dict()
else:
if (not parsed_args.name or not parsed_args.plugin or
not parsed_args.plugin_version or not parsed_args.flavor or
not parsed_args.processes):
raise exceptions.CommandError(
                    'At least the --name, --plugin, --plugin-version, '
                    '--processes and --flavor arguments should be '
                    'specified, or a JSON template should be provided '
                    'with the --json argument')
configs = None
if parsed_args.configs:
blob = osc_utils.read_blob_file_contents(parsed_args.configs)
try:
configs = json.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'configs from file %s: %s' % (parsed_args.configs, e))
shares = None
if parsed_args.shares:
blob = osc_utils.read_blob_file_contents(parsed_args.shares)
try:
shares = json.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'shares from file %s: %s' % (parsed_args.shares, e))
compute_client = self.app.client_manager.compute
flavor_id = osc_utils.find_resource(
compute_client.flavors, parsed_args.flavor).id
data = client.node_group_templates.create(
name=parsed_args.name,
plugin_name=parsed_args.plugin,
hadoop_version=parsed_args.plugin_version,
flavor_id=flavor_id,
description=parsed_args.description,
volumes_per_node=parsed_args.volumes_per_node,
volumes_size=parsed_args.volumes_size,
node_processes=parsed_args.processes,
floating_ip_pool=parsed_args.floating_ip_pool,
security_groups=parsed_args.security_groups,
auto_security_group=parsed_args.auto_security_group,
availability_zone=parsed_args.availability_zone,
volume_type=parsed_args.volumes_type,
is_proxy_gateway=parsed_args.proxy_gateway,
volume_local_to_instance=parsed_args.volumes_locality,
use_autoconfig=parsed_args.autoconfig,
is_public=parsed_args.public,
is_protected=parsed_args.protected,
node_configs=configs,
shares=shares,
volumes_availability_zone=parsed_args.volumes_availability_zone
).to_dict()
_format_ngt_output(data)
data = utils.prepare_data(data, NGT_FIELDS)
return self.dict2columns(data)
class ListNodeGroupTemplates(command.Lister):
"""Lists node group templates"""
log = logging.getLogger(__name__ + ".ListNodeGroupTemplates")
def get_parser(self, prog_name):
parser = super(ListNodeGroupTemplates, self).get_parser(prog_name)
parser.add_argument(
'--long',
action='store_true',
default=False,
help='List additional fields in output',
)
parser.add_argument(
'--plugin',
metavar="<plugin>",
help="List node group templates for specific plugin"
)
parser.add_argument(
'--plugin-version',
metavar="<plugin_version>",
help="List node group templates with specific version of the "
"plugin"
)
parser.add_argument(
'--name',
metavar="<name-substring>",
help="List node group templates with specific substring in the "
"name"
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
search_opts = {}
if parsed_args.plugin:
search_opts['plugin_name'] = parsed_args.plugin
if parsed_args.plugin_version:
search_opts['hadoop_version'] = parsed_args.plugin_version
data = client.node_group_templates.list(search_opts=search_opts)
if parsed_args.name:
data = utils.get_by_name_substring(data, parsed_args.name)
if parsed_args.long:
columns = ('name', 'id', 'plugin_name', 'hadoop_version',
'node_processes', 'description')
column_headers = utils.prepare_column_headers(
columns, {'hadoop_version': 'plugin_version'})
else:
columns = ('name', 'id', 'plugin_name', 'hadoop_version')
column_headers = utils.prepare_column_headers(
columns, {'hadoop_version': 'plugin_version'})
return (
column_headers,
(osc_utils.get_item_properties(
s,
columns,
formatters={
'node_processes': osc_utils.format_list
}
) for s in data)
)
class ShowNodeGroupTemplate(command.ShowOne):
"""Display node group template details"""
log = logging.getLogger(__name__ + ".ShowNodeGroupTemplate")
def get_parser(self, prog_name):
parser = super(ShowNodeGroupTemplate, self).get_parser(prog_name)
parser.add_argument(
"node_group_template",
metavar="<node-group-template>",
help="Name or id of the node group template to display",
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
data = utils.get_resource(
client.node_group_templates,
parsed_args.node_group_template).to_dict()
_format_ngt_output(data)
data = utils.prepare_data(data, NGT_FIELDS)
return self.dict2columns(data)
class DeleteNodeGroupTemplate(command.Command):
"""Deletes node group template"""
log = logging.getLogger(__name__ + ".DeleteNodeGroupTemplate")
def get_parser(self, prog_name):
parser = super(DeleteNodeGroupTemplate, self).get_parser(prog_name)
parser.add_argument(
"node_group_template",
metavar="<node-group-template>",
nargs="+",
help="Name(s) or id(s) of the node group template(s) to delete",
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
for ngt in parsed_args.node_group_template:
ngt_id = utils.get_resource_id(
client.node_group_templates, ngt)
client.node_group_templates.delete(ngt_id)
sys.stdout.write(
'Node group template "{ngt}" has been removed '
'successfully.\n'.format(ngt=ngt))
class UpdateNodeGroupTemplate(command.ShowOne):
"""Updates node group template"""
log = logging.getLogger(__name__ + ".UpdateNodeGroupTemplate")
def get_parser(self, prog_name):
parser = super(UpdateNodeGroupTemplate, self).get_parser(prog_name)
parser.add_argument(
'node_group_template',
metavar="<node-group-template>",
help="Name or ID of the node group template",
)
parser.add_argument(
'--name',
metavar="<name>",
help="New name of the node group template",
)
parser.add_argument(
'--plugin',
metavar="<plugin>",
help="Name of the plugin"
)
parser.add_argument(
'--plugin-version',
metavar="<plugin_version>",
help="Version of the plugin"
)
parser.add_argument(
'--processes',
metavar="<processes>",
nargs="+",
help="List of the processes that will be launched on each "
"instance"
)
parser.add_argument(
'--security-groups',
metavar="<security-groups>",
nargs="+",
help="List of the security groups for the instances in this node "
"group"
)
autosecurity = parser.add_mutually_exclusive_group()
autosecurity.add_argument(
'--auto-security-group-enable',
action='store_true',
help='Additional security group should be created '
'for the node group',
dest='use_auto_security_group'
)
autosecurity.add_argument(
'--auto-security-group-disable',
action='store_false',
help='Additional security group should not be created '
'for the node group',
dest='use_auto_security_group'
)
parser.add_argument(
'--availability-zone',
metavar="<availability-zone>",
help="Name of the availability zone where instances "
"will be created"
)
parser.add_argument(
'--flavor',
metavar="<flavor>",
help="Name or ID of the flavor"
)
parser.add_argument(
'--floating-ip-pool',
metavar="<floating-ip-pool>",
help="ID of the floating IP pool"
)
parser.add_argument(
'--volumes-per-node',
type=int,
metavar="<volumes-per-node>",
help="Number of volumes attached to every node"
)
parser.add_argument(
'--volumes-size',
type=int,
metavar="<volumes-size>",
help='Size of volumes attached to node (GB). '
'This parameter will be taken into account only '
'if volumes-per-node is set and non-zero'
)
parser.add_argument(
'--volumes-type',
metavar="<volumes-type>",
help='Type of the volumes. '
'This parameter will be taken into account only '
'if volumes-per-node is set and non-zero'
)
parser.add_argument(
'--volumes-availability-zone',
metavar="<volumes-availability-zone>",
help='Name of the availability zone where volumes will be created.'
' This parameter will be taken into account only '
'if volumes-per-node is set and non-zero'
)
parser.add_argument(
'--volumes-mount-prefix',
metavar="<volumes-mount-prefix>",
help='Prefix for mount point directory. '
'This parameter will be taken into account only '
'if volumes-per-node is set and non-zero'
)
volumelocality = parser.add_mutually_exclusive_group()
volumelocality.add_argument(
'--volumes-locality-enable',
action='store_true',
help='Instance and attached volumes will be created on '
'the same physical host. This parameter will be taken into '
'account only if volumes-per-node is set and non-zero',
dest='volume_locality'
)
volumelocality.add_argument(
'--volumes-locality-disable',
action='store_false',
            help='Instance and attached volumes creation on the same '
                 'physical host will not be regulated. This parameter '
                 'will be taken into account only if volumes-per-node '
                 'is set and non-zero',
dest='volume_locality'
)
parser.add_argument(
'--description',
metavar="<description>",
help='Description of the node group template'
)
autoconfig = parser.add_mutually_exclusive_group()
autoconfig.add_argument(
'--autoconfig-enable',
action='store_true',
help='Instances of the node group will be '
'automatically configured',
dest='use_autoconfig'
)
autoconfig.add_argument(
'--autoconfig-disable',
action='store_false',
help='Instances of the node group will not be '
'automatically configured',
dest='use_autoconfig'
)
proxy = parser.add_mutually_exclusive_group()
proxy.add_argument(
'--proxy-gateway-enable',
action='store_true',
help='Instances of the node group will be used to '
'access other instances in the cluster',
dest='is_proxy_gateway'
)
proxy.add_argument(
'--proxy-gateway-disable',
action='store_false',
help='Instances of the node group will not be used to '
'access other instances in the cluster',
dest='is_proxy_gateway'
)
public = parser.add_mutually_exclusive_group()
public.add_argument(
'--public',
action='store_true',
help='Make the node group template public '
'(Visible from other projects)',
dest='is_public'
)
public.add_argument(
'--private',
action='store_false',
help='Make the node group template private '
'(Visible only from this project)',
dest='is_public'
)
protected = parser.add_mutually_exclusive_group()
protected.add_argument(
'--protected',
action='store_true',
help='Make the node group template protected',
dest='is_protected'
)
protected.add_argument(
'--unprotected',
action='store_false',
help='Make the node group template unprotected',
dest='is_protected'
)
parser.add_argument(
'--json',
metavar='<filename>',
help='JSON representation of the node group template update '
'fields. Other arguments will not be taken into account if '
'this one is provided'
)
parser.add_argument(
'--shares',
metavar='<filename>',
help='JSON representation of the manila shares'
)
parser.add_argument(
'--configs',
metavar='<filename>',
help='JSON representation of the node group template configs'
)
parser.set_defaults(is_public=None, is_protected=None,
is_proxy_gateway=None, volume_locality=None,
use_auto_security_group=None, use_autoconfig=None)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
ngt_id = utils.get_resource_id(
client.node_group_templates, parsed_args.node_group_template)
if parsed_args.json:
blob = osc_utils.read_blob_file_contents(parsed_args.json)
try:
template = json.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'template from file %s: %s' % (parsed_args.json, e))
data = client.node_group_templates.update(
ngt_id, **template).to_dict()
else:
configs = None
if parsed_args.configs:
blob = osc_utils.read_blob_file_contents(parsed_args.configs)
try:
configs = json.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'configs from file %s: %s' % (parsed_args.configs, e))
shares = None
if parsed_args.shares:
blob = osc_utils.read_blob_file_contents(parsed_args.shares)
try:
shares = json.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'shares from file %s: %s' % (parsed_args.shares, e))
flavor_id = None
if parsed_args.flavor:
compute_client = self.app.client_manager.compute
flavor_id = osc_utils.find_resource(
compute_client.flavors, parsed_args.flavor).id
update_dict = utils.create_dict_from_kwargs(
name=parsed_args.name,
plugin_name=parsed_args.plugin,
hadoop_version=parsed_args.plugin_version,
flavor_id=flavor_id,
description=parsed_args.description,
volumes_per_node=parsed_args.volumes_per_node,
volumes_size=parsed_args.volumes_size,
node_processes=parsed_args.processes,
floating_ip_pool=parsed_args.floating_ip_pool,
security_groups=parsed_args.security_groups,
auto_security_group=parsed_args.use_auto_security_group,
availability_zone=parsed_args.availability_zone,
volume_type=parsed_args.volumes_type,
is_proxy_gateway=parsed_args.is_proxy_gateway,
volume_local_to_instance=parsed_args.volume_locality,
use_autoconfig=parsed_args.use_autoconfig,
is_public=parsed_args.is_public,
is_protected=parsed_args.is_protected,
node_configs=configs,
shares=shares,
volumes_availability_zone=parsed_args.volumes_availability_zone
)
data = client.node_group_templates.update(
ngt_id, **update_dict).to_dict()
_format_ngt_output(data)
data = utils.prepare_data(data, NGT_FIELDS)
return self.dict2columns(data)

View File

@ -1,221 +0,0 @@
# Copyright (c) 2015 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from os import path
import sys
from osc_lib.command import command
from osc_lib import exceptions
from osc_lib import utils as osc_utils
from oslo_log import log as logging
from oslo_serialization import jsonutils
from saharaclient.osc.v1 import utils
def _serialize_label_items(plugin):
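    # Flattens plugin-wide and per-version labels into a sorted list of
    # ('plugin[ version X]: <label>', status) pairs for display.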
labels = {}
pl_labels = plugin.get('plugin_labels', {})
for label, data in pl_labels.items():
labels['plugin: %s' % label] = data['status']
vr_labels = plugin.get('version_labels', {})
for version, version_data in vr_labels.items():
for label, data in version_data.items():
labels[
'plugin version %s: %s' % (version, label)] = data['status']
labels = utils.prepare_data(labels, list(labels.keys()))
return sorted(labels.items())
class ListPlugins(command.Lister):
"""Lists plugins"""
log = logging.getLogger(__name__ + ".ListPlugins")
def get_parser(self, prog_name):
parser = super(ListPlugins, self).get_parser(prog_name)
parser.add_argument(
'--long',
action='store_true',
default=False,
help='List additional fields in output',
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
data = client.plugins.list()
if parsed_args.long:
columns = ('name', 'title', 'versions', 'description')
column_headers = utils.prepare_column_headers(columns)
else:
columns = ('name', 'versions')
column_headers = utils.prepare_column_headers(columns)
return (
column_headers,
(osc_utils.get_item_properties(
s,
columns,
formatters={
'versions': osc_utils.format_list
},
) for s in data)
)
class ShowPlugin(command.ShowOne):
"""Display plugin details"""
log = logging.getLogger(__name__ + ".ShowPlugin")
def get_parser(self, prog_name):
parser = super(ShowPlugin, self).get_parser(prog_name)
parser.add_argument(
"plugin",
metavar="<plugin>",
help="Name of the plugin to display",
)
parser.add_argument(
"--plugin-version",
metavar="<plugin_version>",
help='Version of the plugin to display'
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
if parsed_args.plugin_version:
data = client.plugins.get_version_details(
parsed_args.plugin, parsed_args.plugin_version).to_dict()
processes = data.pop('node_processes')
for k, v in processes.items():
processes[k] = osc_utils.format_list(v)
data['required_image_tags'] = osc_utils.format_list(
data['required_image_tags'])
label_items = _serialize_label_items(data)
data = utils.prepare_data(
data, ['required_image_tags', 'name', 'description', 'title'])
data = self.dict2columns(data)
data = utils.extend_columns(data, label_items)
data = utils.extend_columns(
data, [('Service:', 'Available processes:')])
data = utils.extend_columns(
data, sorted(processes.items()))
else:
data = client.plugins.get(parsed_args.plugin).to_dict()
data['versions'] = osc_utils.format_list(data['versions'])
items = _serialize_label_items(data)
data = utils.prepare_data(
data, ['versions', 'name', 'description', 'title'])
data = utils.extend_columns(self.dict2columns(data), items)
return data
class GetPluginConfigs(command.Command):
"""Get plugin configs"""
log = logging.getLogger(__name__ + ".GetPluginConfigs")
def get_parser(self, prog_name):
parser = super(GetPluginConfigs, self).get_parser(prog_name)
parser.add_argument(
"plugin",
metavar="<plugin>",
help="Name of the plugin to provide config information about",
)
parser.add_argument(
"plugin_version",
metavar="<plugin_version>",
help="Version of the plugin to provide config information about",
)
parser.add_argument(
'--file',
metavar="<file>",
help="Destination file (defaults to a combination of "
"plugin name and plugin version)",
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
if not parsed_args.file:
parsed_args.file = (parsed_args.plugin + '-' +
parsed_args.plugin_version)
if path.exists(parsed_args.file):
msg = ('File "%s" already exists. Choose another one with '
                   'the --file argument.' % parsed_args.file)
raise exceptions.CommandError(msg)
else:
data = client.plugins.get_version_details(
parsed_args.plugin, parsed_args.plugin_version).to_dict()
with open(parsed_args.file, 'w') as f:
jsonutils.dump(data, f, indent=4)
sys.stdout.write(
'"%(plugin)s" plugin "%(version)s" version configs '
'was saved in "%(file)s" file\n' % {
'plugin': parsed_args.plugin,
'version': parsed_args.plugin_version,
'file': parsed_args.file})
class UpdatePlugin(command.ShowOne):
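    """Updates plugin"""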
log = logging.getLogger(__name__ + ".UpdatePlugin")
def get_parser(self, prog_name):
parser = super(UpdatePlugin, self).get_parser(prog_name)
parser.add_argument(
"plugin",
metavar="<plugin>",
help="Name of the plugin to provide config information about",
)
parser.add_argument(
'json',
metavar="<json>",
help='JSON representation of the plugin update dictionary',
)
return parser
def take_action(self, parsed_args):
self.log.debug("take_action(%s)", parsed_args)
client = self.app.client_manager.data_processing
blob = osc_utils.read_blob_file_contents(parsed_args.json)
try:
update_dict = jsonutils.loads(blob)
except ValueError as e:
raise exceptions.CommandError(
'An error occurred when reading '
'update dict from file %s: %s' % (parsed_args.json, e))
plugin = client.plugins.update(parsed_args.plugin, update_dict)
data = plugin.to_dict()
data['versions'] = osc_utils.format_list(data['versions'])
items = _serialize_label_items(data)
data = utils.prepare_data(
data, ['versions', 'name', 'description', 'title'])
data = utils.extend_columns(self.dict2columns(data), items)
return data

View File

@ -1,101 +0,0 @@
# Copyright (c) 2015 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import time
from oslo_utils import timeutils
from oslo_utils import uuidutils
from saharaclient.api import base
def get_resource(manager, name_or_id, **kwargs):
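    # Accepts either a UUID or a unique name; extra kwargs force an
    # additional GET even for name lookups, since find_unique() cannot
    # apply them.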
if uuidutils.is_uuid_like(name_or_id):
return manager.get(name_or_id, **kwargs)
else:
resource = manager.find_unique(name=name_or_id)
if kwargs:
            # we really need an additional call to apply the kwargs
resource = manager.get(resource.id, **kwargs)
return resource
def created_at_sorted(objs, reverse=False):
return sorted(objs, key=created_at_key, reverse=reverse)
def random_name(prefix=None):
return "%s-%s" % (prefix, uuidutils.generate_uuid()[:8])
def created_at_key(obj):
return timeutils.parse_isotime(obj["created_at"])
def get_resource_id(manager, name_or_id):
if uuidutils.is_uuid_like(name_or_id):
return name_or_id
else:
return manager.find_unique(name=name_or_id).id
def create_dict_from_kwargs(**kwargs):
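    # Drops None-valued kwargs, e.g. (name='x', url=None) -> {'name': 'x'}.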
return {k: v for (k, v) in kwargs.items() if v is not None}
def prepare_data(data, fields):
new_data = {}
for f in fields:
if f in data:
new_data[f.replace('_', ' ').capitalize()] = data[f]
return new_data
def unzip(data):
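    # Transposes a sequence of pairs: [(a, 1), (b, 2)] -> [(a, b), (1, 2)].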
return zip(*data)
def extend_columns(columns, items):
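    # Appends extra (label, value) rows to ShowOne output, with an empty
    # row acting as a visual separator.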
return unzip(list(unzip(columns)) + [('', '')] + items)
def prepare_column_headers(columns, remap=None):
remap = remap if remap else {}
new_columns = []
for c in columns:
for old, new in remap.items():
c = c.replace(old, new)
new_columns.append(c.replace('_', ' ').capitalize())
return new_columns
def get_by_name_substring(data, name):
return [obj for obj in data if name in obj.name]
def wait_for_delete(manager, obj_id, sleep_time=5, timeout=3000):
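    # Polls until the resource 404s (deleted, returns True) or the
    # timeout expires (returns False); other API errors are re-raised.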
s_time = timeutils.utcnow()
while timeutils.delta_seconds(s_time, timeutils.utcnow()) < timeout:
try:
manager.get(obj_id)
except base.APIException as ex:
if ex.error_code == 404:
return True
raise
time.sleep(sleep_time)
return False

View File

@ -1,138 +0,0 @@
# Copyright (c) 2013 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import re
import tokenize
import pep8
from saharaclient.tests.hacking import commit_message
from saharaclient.tests.hacking import import_checks
from saharaclient.tests.hacking import logging_checks
RE_OSLO_IMPORTS = (re.compile(r"(((from)|(import))\s+oslo\.)"),
re.compile(r"(from\s+oslo\s+import)"))
RE_DICT_CONSTRUCTOR_WITH_LIST_COPY = re.compile(r".*\bdict\((\[)?(\(|\[)")
RE_USE_JSONUTILS_INVALID_LINE = re.compile(r"(import\s+json)")
RE_USE_JSONUTILS_VALID_LINE = re.compile(r"(import\s+jsonschema)")
RE_MUTABLE_DEFAULT_ARGS = re.compile(r"^\s*def .+\((.+=\{\}|.+=\[\])")
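# RE_MUTABLE_DEFAULT_ARGS catches signatures such as "def f(a={}):" or
# "def f(a=[]):" (illustrative examples).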
def _starts_with_any(line, *prefixes):
for prefix in prefixes:
if line.startswith(prefix):
return True
return False
def _any_in(line, *sublines):
for subline in sublines:
if subline in line:
return True
return False
def import_db_only_in_conductor(logical_line, filename):
"""Check that db calls are only in conductor module and in tests.
S361
"""
if _any_in(filename,
"sahara/conductor",
"sahara/tests",
"sahara/db"):
return
if _starts_with_any(logical_line,
"from sahara import db",
"from sahara.db",
"import sahara.db"):
yield (0, "S361: sahara.db import only allowed in "
"sahara/conductor/*")
def hacking_no_author_attr(logical_line, tokens):
"""__author__ should not be used.
S362: __author__ = slukjanov
"""
for token_type, text, start_index, _, _ in tokens:
if token_type == tokenize.NAME and text == "__author__":
yield (start_index[1],
"S362: __author__ should not be used")
def check_oslo_namespace_imports(logical_line):
"""Check to prevent old oslo namespace usage.
S363
"""
if re.match(RE_OSLO_IMPORTS[0], logical_line):
yield(0, "S363: '%s' must be used instead of '%s'." % (
logical_line.replace('oslo.', 'oslo_'),
logical_line))
if re.match(RE_OSLO_IMPORTS[1], logical_line):
yield(0, "S363: '%s' must be used instead of '%s'" % (
'import oslo_%s' % logical_line.split()[-1],
logical_line))
def dict_constructor_with_list_copy(logical_line):
"""Check to prevent dict constructor with a sequence of key-value pairs.
S368
"""
if RE_DICT_CONSTRUCTOR_WITH_LIST_COPY.match(logical_line):
yield (0, 'S368: Must use a dict comprehension instead of a dict '
'constructor with a sequence of key-value pairs.')
def use_jsonutils(logical_line, filename):
"""Check to prevent importing json in sahara code.
S375
"""
if pep8.noqa(logical_line):
return
if (RE_USE_JSONUTILS_INVALID_LINE.match(logical_line) and
not RE_USE_JSONUTILS_VALID_LINE.match(logical_line)):
yield(0, "S375: Use jsonutils from oslo_serialization instead"
" of json")
def no_mutable_default_args(logical_line):
"""Check to prevent mutable default argument in sahara code.
S360
"""
msg = "S360: Method's default argument shouldn't be mutable!"
if RE_MUTABLE_DEFAULT_ARGS.match(logical_line):
yield (0, msg)
def factory(register):
register(import_db_only_in_conductor)
register(hacking_no_author_attr)
register(check_oslo_namespace_imports)
register(commit_message.OnceGitCheckCommitTitleBug)
register(commit_message.OnceGitCheckCommitTitleLength)
register(import_checks.hacking_import_groups)
register(import_checks.hacking_import_groups_together)
register(dict_constructor_with_list_copy)
register(logging_checks.no_translate_logs)
register(logging_checks.accepted_log_levels)
register(use_jsonutils)
register(no_mutable_default_args)

View File

@ -1,95 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import os
import re
import subprocess # nosec
from hacking import core
class GitCheck(core.GlobalCheck):
"""Base-class for Git related checks."""
def _get_commit_title(self):
# Check if we're inside a git checkout
try:
subp = subprocess.Popen( # nosec
['git', 'rev-parse', '--show-toplevel'],
stdout=subprocess.PIPE, stderr=subprocess.PIPE)
gitdir = subp.communicate()[0].rstrip()
except OSError:
# "git" was not found
return None
if not os.path.exists(gitdir):
return None
# Get title of most recent commit
subp = subprocess.Popen( # nosec
['git', 'log', '--no-merges', '--pretty=%s', '-1'],
stdout=subprocess.PIPE)
title = subp.communicate()[0]
if subp.returncode:
raise Exception("git log failed with code %s" % subp.returncode)
return title.decode('utf-8')
class OnceGitCheckCommitTitleBug(GitCheck):
"""Check git commit messages for bugs.
OpenStack HACKING recommends not referencing a bug or blueprint in the
first line; the first line should provide an accurate description of the
change.
S364
"""
name = "GitCheckCommitTitleBug"
# From https://github.com/openstack/openstack-ci-puppet
# /blob/master/modules/gerrit/manifests/init.pp#L74
# Changeid|bug|blueprint
GIT_REGEX = re.compile(
r'(I[0-9a-f]{8,40})|'
r'([Bb]ug|[Ll][Pp])[\s\#:]*(\d+)|'
r'([Bb]lue[Pp]rint|[Bb][Pp])[\s\#:]*([A-Za-z0-9\-]+)')
def run_once(self):
title = self._get_commit_title()
# NOTE(jogo): a title that matches the regex but has more than 3 words is
# still an acceptable description
if (title and self.GIT_REGEX.search(title) is not None
and len(title.split()) <= 3):
return (1, 0,
"S364: git commit title ('%s') should provide an accurate "
"description of the change, not just a reference to a bug "
"or blueprint" % title.strip(), self.name)
class OnceGitCheckCommitTitleLength(GitCheck):
"""Check git commit message length.
HACKING recommends commit titles of 50 chars or less, but only enforces
a 72 character limit.
S365 Title limited to 72 chars
"""
name = "GitCheckCommitTitleLength"
def run_once(self):
title = self._get_commit_title()
if title and len(title) > 72:
return (
1, 0,
"S365: git commit title ('%s') should be under 50 chars"
% title.strip(),
self.name)

View File

@ -1,450 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import imp
from hacking import core
# NOTE(Kezar): These checks are good enough as long as we support only
# py2.7. As soon as we move to py3.x we will need to drop or rewrite them.
# You can read more about it in the dev-list archive, topic:
# "[hacking]proposed rules drop for 1.0"
def _find_module(module, path=None):
mod_base = module
parent_path = None
while '.' in mod_base:
first, _, mod_base = mod_base.partition('.')
parent_path = path
_, path, _ = imp.find_module(first, path)
path = [path]
try:
_, path, _ = imp.find_module(mod_base, path)
except ImportError:
# NOTE(bnemec): There are two reasons we might get here: 1) A
# non-module import and 2) an import of a namespace module that is
# in the same namespace as the current project, which caused us to
# recurse into the project namespace but fail to find the third-party
# module. For 1), we won't be able to import it as a module, so we
# return the parent module's path, but for 2) the import below should
# succeed, so we re-raise the ImportError because the module was
# legitimately not found in this path.
try:
__import__(module)
except ImportError:
# Non-module import, return the parent path if we have it
if parent_path:
return parent_path
raise
raise
return path
module_cache = dict()
# List of all Python 2 stdlib modules - anything not in this list will be
# allowed in either the stdlib or third-party groups to allow for Python 3
# stdlib additions.
# The list was generated via the following script, which is a variation on
# the one found here:
# http://stackoverflow.com/questions/6463918/how-can-i-get-a-list-of-all-the-python-standard-library-modules
"""
from distutils import sysconfig
import os
import sys
std_lib = sysconfig.get_python_lib(standard_lib=True)
prefix_len = len(std_lib) + 1
modules = ''
line = '['
mod_list = []
for top, dirs, files in os.walk(std_lib):
for name in files:
if 'site-packages' not in top:
if name == '__init__.py':
full_name = top[prefix_len:].replace('/', '.')
mod_list.append(full_name)
elif name.endswith('.py'):
full_name = top.replace('/', '.') + '.'
full_name += name[:-3]
full_name = full_name[prefix_len:]
mod_list.append(full_name)
elif name.endswith('.so') and top.endswith('lib-dynload'):
full_name = name[:-3]
if full_name.endswith('module'):
full_name = full_name[:-6]
mod_list.append(full_name)
for name in sys.builtin_module_names:
mod_list.append(name)
mod_list.sort()
for mod in mod_list:
if len(line + mod) + 8 > 79:
modules += '\n' + line
line = ' '
line += "'%s', " % mod
print modules + ']'
"""
py2_stdlib = [
'BaseHTTPServer', 'Bastion', 'CGIHTTPServer', 'ConfigParser', 'Cookie',
'DocXMLRPCServer', 'HTMLParser', 'MimeWriter', 'Queue',
'SimpleHTTPServer', 'SimpleXMLRPCServer', 'SocketServer', 'StringIO',
'UserDict', 'UserList', 'UserString', '_LWPCookieJar',
'_MozillaCookieJar', '__builtin__', '__future__', '__main__',
'__phello__.foo', '_abcoll', '_ast', '_bisect', '_bsddb', '_codecs',
'_codecs_cn', '_codecs_hk', '_codecs_iso2022', '_codecs_jp',
'_codecs_kr', '_codecs_tw', '_collections', '_crypt', '_csv',
'_ctypes', '_curses', '_curses_panel', '_elementtree', '_functools',
'_hashlib', '_heapq', '_hotshot', '_io', '_json', '_locale',
'_lsprof', '_multibytecodec', '_multiprocessing', '_osx_support',
'_pyio', '_random', '_socket', '_sqlite3', '_sre', '_ssl',
'_strptime', '_struct', '_symtable', '_sysconfigdata',
'_threading_local', '_warnings', '_weakref', '_weakrefset', 'abc',
'aifc', 'antigravity', 'anydbm', 'argparse', 'array', 'ast',
'asynchat', 'asyncore', 'atexit', 'audiodev', 'audioop', 'base64',
'bdb', 'binascii', 'binhex', 'bisect', 'bsddb', 'bsddb.db',
'bsddb.dbobj', 'bsddb.dbrecio', 'bsddb.dbshelve', 'bsddb.dbtables',
'bsddb.dbutils', 'bz2', 'cPickle', 'cProfile', 'cStringIO',
'calendar', 'cgi', 'cgitb', 'chunk', 'cmath', 'cmd', 'code', 'codecs',
'codeop', 'collections', 'colorsys', 'commands', 'compileall',
'compiler', 'compiler.ast', 'compiler.consts', 'compiler.future',
'compiler.misc', 'compiler.pyassem', 'compiler.pycodegen',
'compiler.symbols', 'compiler.syntax', 'compiler.transformer',
'compiler.visitor', 'contextlib', 'cookielib', 'copy', 'copy_reg',
'crypt', 'csv', 'ctypes', 'ctypes._endian', 'ctypes.macholib',
'ctypes.macholib.dyld', 'ctypes.macholib.dylib',
'ctypes.macholib.framework', 'ctypes.util', 'ctypes.wintypes',
'curses', 'curses.ascii', 'curses.has_key', 'curses.panel',
'curses.textpad', 'curses.wrapper', 'datetime', 'dbhash', 'dbm',
'decimal', 'difflib', 'dircache', 'dis', 'distutils',
'distutils.archive_util', 'distutils.bcppcompiler',
'distutils.ccompiler', 'distutils.cmd', 'distutils.command',
'distutils.command.bdist', 'distutils.command.bdist_dumb',
'distutils.command.bdist_msi', 'distutils.command.bdist_rpm',
'distutils.command.bdist_wininst', 'distutils.command.build',
'distutils.command.build_clib', 'distutils.command.build_ext',
'distutils.command.build_py', 'distutils.command.build_scripts',
'distutils.command.check', 'distutils.command.clean',
'distutils.command.config', 'distutils.command.install',
'distutils.command.install_data',
'distutils.command.install_egg_info',
'distutils.command.install_headers', 'distutils.command.install_lib',
'distutils.command.install_scripts', 'distutils.command.register',
'distutils.command.sdist', 'distutils.command.upload',
'distutils.config', 'distutils.core', 'distutils.cygwinccompiler',
'distutils.debug', 'distutils.dep_util', 'distutils.dir_util',
'distutils.dist', 'distutils.emxccompiler', 'distutils.errors',
'distutils.extension', 'distutils.fancy_getopt',
'distutils.file_util', 'distutils.filelist', 'distutils.log',
'distutils.msvc9compiler', 'distutils.msvccompiler',
'distutils.spawn', 'distutils.sysconfig', 'distutils.text_file',
'distutils.unixccompiler', 'distutils.util', 'distutils.version',
'distutils.versionpredicate', 'dl', 'doctest', 'dumbdbm',
'dummy_thread', 'dummy_threading', 'email', 'email._parseaddr',
'email.base64mime', 'email.charset', 'email.encoders', 'email.errors',
'email.feedparser', 'email.generator', 'email.header',
'email.iterators', 'email.message', 'email.mime',
'email.mime.application', 'email.mime.audio', 'email.mime.base',
'email.mime.image', 'email.mime.message', 'email.mime.multipart',
'email.mime.nonmultipart', 'email.mime.text', 'email.parser',
'email.quoprimime', 'email.utils', 'encodings', 'encodings.aliases',
'encodings.ascii', 'encodings.base64_codec', 'encodings.big5',
'encodings.big5hkscs', 'encodings.bz2_codec', 'encodings.charmap',
'encodings.cp037', 'encodings.cp1006', 'encodings.cp1026',
'encodings.cp1140', 'encodings.cp1250', 'encodings.cp1251',
'encodings.cp1252', 'encodings.cp1253', 'encodings.cp1254',
'encodings.cp1255', 'encodings.cp1256', 'encodings.cp1257',
'encodings.cp1258', 'encodings.cp424', 'encodings.cp437',
'encodings.cp500', 'encodings.cp720', 'encodings.cp737',
'encodings.cp775', 'encodings.cp850', 'encodings.cp852',
'encodings.cp855', 'encodings.cp856', 'encodings.cp857',
'encodings.cp858', 'encodings.cp860', 'encodings.cp861',
'encodings.cp862', 'encodings.cp863', 'encodings.cp864',
'encodings.cp865', 'encodings.cp866', 'encodings.cp869',
'encodings.cp874', 'encodings.cp875', 'encodings.cp932',
'encodings.cp949', 'encodings.cp950', 'encodings.euc_jis_2004',
'encodings.euc_jisx0213', 'encodings.euc_jp', 'encodings.euc_kr',
'encodings.gb18030', 'encodings.gb2312', 'encodings.gbk',
'encodings.hex_codec', 'encodings.hp_roman8', 'encodings.hz',
'encodings.idna', 'encodings.iso2022_jp', 'encodings.iso2022_jp_1',
'encodings.iso2022_jp_2', 'encodings.iso2022_jp_2004',
'encodings.iso2022_jp_3', 'encodings.iso2022_jp_ext',
'encodings.iso2022_kr', 'encodings.iso8859_1', 'encodings.iso8859_10',
'encodings.iso8859_11', 'encodings.iso8859_13',
'encodings.iso8859_14', 'encodings.iso8859_15',
'encodings.iso8859_16', 'encodings.iso8859_2', 'encodings.iso8859_3',
'encodings.iso8859_4', 'encodings.iso8859_5', 'encodings.iso8859_6',
'encodings.iso8859_7', 'encodings.iso8859_8', 'encodings.iso8859_9',
'encodings.johab', 'encodings.koi8_r', 'encodings.koi8_u',
'encodings.latin_1', 'encodings.mac_arabic', 'encodings.mac_centeuro',
'encodings.mac_croatian', 'encodings.mac_cyrillic',
'encodings.mac_farsi', 'encodings.mac_greek', 'encodings.mac_iceland',
'encodings.mac_latin2', 'encodings.mac_roman',
'encodings.mac_romanian', 'encodings.mac_turkish', 'encodings.mbcs',
'encodings.palmos', 'encodings.ptcp154', 'encodings.punycode',
'encodings.quopri_codec', 'encodings.raw_unicode_escape',
'encodings.rot_13', 'encodings.shift_jis', 'encodings.shift_jis_2004',
'encodings.shift_jisx0213', 'encodings.string_escape',
'encodings.tis_620', 'encodings.undefined',
'encodings.unicode_escape', 'encodings.unicode_internal',
'encodings.utf_16', 'encodings.utf_16_be', 'encodings.utf_16_le',
'encodings.utf_32', 'encodings.utf_32_be', 'encodings.utf_32_le',
'encodings.utf_7', 'encodings.utf_8', 'encodings.utf_8_sig',
'encodings.uu_codec', 'encodings.zlib_codec', 'errno', 'exceptions',
'fcntl', 'filecmp', 'fileinput', 'fnmatch', 'formatter', 'fpformat',
'fractions', 'ftplib', 'functools', 'future_builtins', 'gc', 'gdbm',
'genericpath', 'getopt', 'getpass', 'gettext', 'glob', 'grp', 'gzip',
'hashlib', 'heapq', 'hmac', 'hotshot', 'hotshot.log', 'hotshot.stats',
'hotshot.stones', 'htmlentitydefs', 'htmllib', 'httplib', 'idlelib',
'idlelib.AutoComplete', 'idlelib.AutoCompleteWindow',
'idlelib.AutoExpand', 'idlelib.Bindings', 'idlelib.CallTipWindow',
'idlelib.CallTips', 'idlelib.ClassBrowser', 'idlelib.CodeContext',
'idlelib.ColorDelegator', 'idlelib.Debugger', 'idlelib.Delegator',
'idlelib.EditorWindow', 'idlelib.FileList', 'idlelib.FormatParagraph',
'idlelib.GrepDialog', 'idlelib.HyperParser', 'idlelib.IOBinding',
'idlelib.IdleHistory', 'idlelib.MultiCall', 'idlelib.MultiStatusBar',
'idlelib.ObjectBrowser', 'idlelib.OutputWindow', 'idlelib.ParenMatch',
'idlelib.PathBrowser', 'idlelib.Percolator', 'idlelib.PyParse',
'idlelib.PyShell', 'idlelib.RemoteDebugger',
'idlelib.RemoteObjectBrowser', 'idlelib.ReplaceDialog',
'idlelib.RstripExtension', 'idlelib.ScriptBinding',
'idlelib.ScrolledList', 'idlelib.SearchDialog',
'idlelib.SearchDialogBase', 'idlelib.SearchEngine',
'idlelib.StackViewer', 'idlelib.ToolTip', 'idlelib.TreeWidget',
'idlelib.UndoDelegator', 'idlelib.WidgetRedirector',
'idlelib.WindowList', 'idlelib.ZoomHeight', 'idlelib.aboutDialog',
'idlelib.configDialog', 'idlelib.configHandler',
'idlelib.configHelpSourceEdit', 'idlelib.configSectionNameDialog',
'idlelib.dynOptionMenuWidget', 'idlelib.idle', 'idlelib.idlever',
'idlelib.keybindingDialog', 'idlelib.macosxSupport', 'idlelib.rpc',
'idlelib.run', 'idlelib.tabbedpages', 'idlelib.textView', 'ihooks',
'imageop', 'imaplib', 'imghdr', 'imp', 'importlib', 'imputil',
'inspect', 'io', 'itertools', 'json', 'json.decoder', 'json.encoder',
'json.scanner', 'json.tool', 'keyword', 'lib2to3', 'lib2to3.__main__',
'lib2to3.btm_matcher', 'lib2to3.btm_utils', 'lib2to3.fixer_base',
'lib2to3.fixer_util', 'lib2to3.fixes', 'lib2to3.fixes.fix_apply',
'lib2to3.fixes.fix_basestring', 'lib2to3.fixes.fix_buffer',
'lib2to3.fixes.fix_callable', 'lib2to3.fixes.fix_dict',
'lib2to3.fixes.fix_except', 'lib2to3.fixes.fix_exec',
'lib2to3.fixes.fix_execfile', 'lib2to3.fixes.fix_exitfunc',
'lib2to3.fixes.fix_filter', 'lib2to3.fixes.fix_funcattrs',
'lib2to3.fixes.fix_future', 'lib2to3.fixes.fix_getcwdu',
'lib2to3.fixes.fix_has_key', 'lib2to3.fixes.fix_idioms',
'lib2to3.fixes.fix_import', 'lib2to3.fixes.fix_imports',
'lib2to3.fixes.fix_imports2', 'lib2to3.fixes.fix_input',
'lib2to3.fixes.fix_intern', 'lib2to3.fixes.fix_isinstance',
'lib2to3.fixes.fix_itertools', 'lib2to3.fixes.fix_itertools_imports',
'lib2to3.fixes.fix_long', 'lib2to3.fixes.fix_map',
'lib2to3.fixes.fix_metaclass', 'lib2to3.fixes.fix_methodattrs',
'lib2to3.fixes.fix_ne', 'lib2to3.fixes.fix_next',
'lib2to3.fixes.fix_nonzero', 'lib2to3.fixes.fix_numliterals',
'lib2to3.fixes.fix_operator', 'lib2to3.fixes.fix_paren',
'lib2to3.fixes.fix_print', 'lib2to3.fixes.fix_raise',
'lib2to3.fixes.fix_raw_input', 'lib2to3.fixes.fix_reduce',
'lib2to3.fixes.fix_renames', 'lib2to3.fixes.fix_repr',
'lib2to3.fixes.fix_set_literal', 'lib2to3.fixes.fix_standarderror',
'lib2to3.fixes.fix_sys_exc', 'lib2to3.fixes.fix_throw',
'lib2to3.fixes.fix_tuple_params', 'lib2to3.fixes.fix_types',
'lib2to3.fixes.fix_unicode', 'lib2to3.fixes.fix_urllib',
'lib2to3.fixes.fix_ws_comma', 'lib2to3.fixes.fix_xrange',
'lib2to3.fixes.fix_xreadlines', 'lib2to3.fixes.fix_zip',
'lib2to3.main', 'lib2to3.patcomp', 'lib2to3.pgen2',
'lib2to3.pgen2.conv', 'lib2to3.pgen2.driver', 'lib2to3.pgen2.grammar',
'lib2to3.pgen2.literals', 'lib2to3.pgen2.parse', 'lib2to3.pgen2.pgen',
'lib2to3.pgen2.token', 'lib2to3.pgen2.tokenize', 'lib2to3.pygram',
'lib2to3.pytree', 'lib2to3.refactor', 'linecache', 'linuxaudiodev',
'locale', 'logging', 'logging.config', 'logging.handlers', 'macpath',
'macurl2path', 'mailbox', 'mailcap', 'markupbase', 'marshal', 'math',
'md5', 'mhlib', 'mimetools', 'mimetypes', 'mimify', 'mmap',
'modulefinder', 'multifile', 'multiprocessing',
'multiprocessing.connection', 'multiprocessing.dummy',
'multiprocessing.dummy.connection', 'multiprocessing.forking',
'multiprocessing.heap', 'multiprocessing.managers',
'multiprocessing.pool', 'multiprocessing.process',
'multiprocessing.queues', 'multiprocessing.reduction',
'multiprocessing.sharedctypes', 'multiprocessing.synchronize',
'multiprocessing.util', 'mutex', 'netrc', 'new', 'nis', 'nntplib',
'ntpath', 'nturl2path', 'numbers', 'opcode', 'operator', 'optparse',
'os', 'os2emxpath', 'ossaudiodev', 'parser', 'pdb', 'pickle',
'pickletools', 'pipes', 'pkgutil', 'plat-linux2.CDROM',
'plat-linux2.DLFCN', 'plat-linux2.IN', 'plat-linux2.TYPES',
'platform', 'plistlib', 'popen2', 'poplib', 'posix', 'posixfile',
'posixpath', 'pprint', 'profile', 'pstats', 'pty', 'pwd',
'py_compile', 'pyclbr', 'pydoc', 'pydoc_data', 'pydoc_data.topics',
'pyexpat', 'quopri', 'random', 're', 'readline', 'repr', 'resource',
'rexec', 'rfc822', 'rlcompleter', 'robotparser', 'runpy', 'sched',
'select', 'sets', 'sgmllib', 'sha', 'shelve', 'shlex', 'shutil',
'signal', 'site', 'smtpd', 'smtplib', 'sndhdr', 'socket', 'spwd',
'sqlite3', 'sqlite3.dbapi2', 'sqlite3.dump', 'sre', 'sre_compile',
'sre_constants', 'sre_parse', 'ssl', 'stat', 'statvfs', 'string',
'stringold', 'stringprep', 'strop', 'struct', 'subprocess', 'sunau',
'sunaudio', 'symbol', 'symtable', 'sys', 'sysconfig', 'syslog',
'tabnanny', 'tarfile', 'telnetlib', 'tempfile', 'termios', 'test',
'test.test_support', 'textwrap', 'this', 'thread', 'threading',
'time', 'timeit', 'timing', 'toaiff', 'token', 'tokenize', 'trace',
'traceback', 'tty', 'types', 'unicodedata', 'unittest',
'unittest.__main__', 'unittest.case', 'unittest.loader',
'unittest.main', 'unittest.result', 'unittest.runner',
'unittest.signals', 'unittest.suite', 'unittest.test',
'unittest.test.dummy', 'unittest.test.support',
'unittest.test.test_assertions', 'unittest.test.test_break',
'unittest.test.test_case', 'unittest.test.test_discovery',
'unittest.test.test_functiontestcase', 'unittest.test.test_loader',
'unittest.test.test_program', 'unittest.test.test_result',
'unittest.test.test_runner', 'unittest.test.test_setups',
'unittest.test.test_skipping', 'unittest.test.test_suite',
'unittest.util', 'urllib', 'urllib2', 'urlparse', 'user', 'uu',
'uuid', 'warnings', 'wave', 'weakref', 'webbrowser', 'whichdb',
'wsgiref', 'wsgiref.handlers', 'wsgiref.headers',
'wsgiref.simple_server', 'wsgiref.util', 'wsgiref.validate', 'xdrlib',
'xml', 'xml.dom', 'xml.dom.NodeFilter', 'xml.dom.domreg',
'xml.dom.expatbuilder', 'xml.dom.minicompat', 'xml.dom.minidom',
'xml.dom.pulldom', 'xml.dom.xmlbuilder', 'xml.etree',
'xml.etree.ElementInclude', 'xml.etree.ElementPath',
'xml.etree.ElementTree', 'xml.etree.cElementTree', 'xml.parsers',
'xml.parsers.expat', 'xml.sax', 'xml.sax._exceptions',
'xml.sax.expatreader', 'xml.sax.handler', 'xml.sax.saxutils',
'xml.sax.xmlreader', 'xmllib', 'xmlrpclib', 'xxsubtype', 'zipfile', ]
# Dynamic modules that can't be auto-discovered by the script above
manual_stdlib = ['os.path', ]
py2_stdlib.extend(manual_stdlib)
def _get_import_type(module):
if module in module_cache:
return module_cache[module]
def cache_type(module_type):
module_cache[module] = module_type
return module_type
# Check static stdlib list
if module in py2_stdlib:
return cache_type('stdlib')
# Check if the module is local
try:
_find_module(module, ['.'])
# If the previous line succeeded then it must be a project module
return cache_type('project')
except ImportError:
pass
# Otherwise treat it as third-party - this means we may treat some stdlib
# modules as third-party, but that's okay because we are allowing
# third-party libs in the stdlib section.
return cache_type('third-party')
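# NOTE (illustrative): expected results, assuming this runs from a project
# checkout with the third-party package 'six' installed:
#
#     _get_import_type('os')            # -> 'stdlib' (in py2_stdlib)
#     _get_import_type('six')           # -> 'third-party'
#     _get_import_type('saharaclient')  # -> 'project' (found on '.')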
@core.flake8ext
def hacking_import_groups(logical_line, blank_before, previous_logical,
indent_level, previous_indent_level, physical_line,
noqa):
r"""Check that imports are grouped correctly.
OpenStack HACKING guide recommendation for imports:
imports grouped such that Python standard library imports are together,
third party library imports are together, and project imports are
together
Okay: import os\nimport sys\n\nimport six\n\nimport hacking
Okay: import six\nimport znon_existent_package
Okay: import os\nimport threading
S366: import mock\nimport os
S366: import hacking\nimport os
S366: import hacking\nimport nonexistent
S366: import hacking\nimport mock
"""
if (noqa or blank_before > 0 or
indent_level != previous_indent_level):
return
normalized_line = core.import_normalize(logical_line.strip()).split()
normalized_previous = core.import_normalize(previous_logical.
strip()).split()
def compatible(previous, current):
if previous == current:
return True
if normalized_line and normalized_line[0] == 'import':
current_type = _get_import_type(normalized_line[1])
if normalized_previous and normalized_previous[0] == 'import':
previous_type = _get_import_type(normalized_previous[1])
if not compatible(previous_type, current_type):
yield(0, 'S366: imports not grouped correctly '
'(%s: %s, %s: %s)' %
(normalized_previous[1], previous_type,
normalized_line[1], current_type))
class ImportGroupData(object):
"""A class to hold persistent state data for import group checks.
To verify import grouping, it is necessary to know the current group
for the current file. This cannot always be known solely from the
current and previous line, so this class can be used to keep track.
"""
# NOTE(bnemec): *args is needed because the test code tries to run this
# as a flake8 check and passes an argument to it.
def __init__(self, *args):
self.current_group = None
self.current_filename = None
self.current_import = None
together_data = ImportGroupData()
@core.flake8ext
def hacking_import_groups_together(logical_line, blank_lines, indent_level,
previous_indent_level, line_number,
physical_line, filename, noqa):
r"""Check that like imports are grouped together.
OpenStack HACKING guide recommendation for imports:
Imports should be grouped together by type.
Okay: import os\nimport sys
Okay: try:\n import foo\nexcept ImportError:\n pass\n\nimport six
Okay: import abc\nimport mock\n\nimport six
Okay: import eventlet\neventlet.monkey_patch()\n\nimport copy
S367: import mock\n\nimport six
S367: import os\n\nimport sys
S367: import mock\nimport os\n\nimport sys
"""
if line_number == 1 or filename != together_data.current_filename:
together_data.current_group = None
together_data.current_filename = filename
if noqa:
return
def update_current_group(current):
together_data.current_group = current
normalized_line = core.import_normalize(logical_line.strip()).split()
if normalized_line:
if normalized_line[0] == 'import':
current_type = _get_import_type(normalized_line[1])
previous_import = together_data.current_import
together_data.current_import = normalized_line[1]
matched = current_type == together_data.current_group
update_current_group(current_type)
if (matched and indent_level == previous_indent_level and
blank_lines >= 1):
yield(0, 'S367: like imports should be grouped together (%s '
'and %s from %s are separated by whitespace)' %
(previous_import,
together_data.current_import,
current_type))
else:
# Reset on non-import code
together_data.current_group = None

View File

@ -1,64 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import re
ALL_LOG_LEVELS = "info|exception|warning|critical|error|debug"
RE_ACCEPTED_LOG_LEVELS = re.compile(
r"(.)*LOG\.(%(levels)s)\(" % {'levels': ALL_LOG_LEVELS})
# Since the _Lx() helpers have been removed, we only need to check for _()
RE_TRANSLATED_LOG = re.compile(
r"(.)*LOG\.(%(levels)s)\(\s*_\(" % {'levels': ALL_LOG_LEVELS})
def no_translate_logs(logical_line, filename):
"""Check for 'LOG.*(_('
Translators don't provide translations for log messages, and operators
asked not to translate them.
* This check assumes that 'LOG' is a logger.
* Use filename so we can start enforcing this in specific folders instead
of needing to do so all at once.
S373
"""
msg = "S373 Don't translate logs"
if RE_TRANSLATED_LOG.match(logical_line):
yield (0, msg)
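# NOTE (illustrative): a minimal sketch of what S373 flags; the message is
# invented.
#
#     LOG.error(_("Cluster creation failed"))   # flagged: translated log
#     LOG.error("Cluster creation failed")      # accepted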
def accepted_log_levels(logical_line, filename):
"""In Sahara we use only 5 log levels.
This check is needed because we don't want new contributors to
use deprecated log levels.
S374
"""
# NOTE(Kezar): sahara/tests is included because we don't require
# translations in tests. sahara/db/templates provides a separate CLI
# interface, so we don't want to translate it.
ignore_dirs = ["sahara/db/templates",
"sahara/tests"]
for directory in ignore_dirs:
if directory in filename:
return
msg = ("S374 You used deprecated log level. Accepted log levels are "
"%(levels)s" % {'levels': ALL_LOG_LEVELS})
if logical_line.startswith("LOG."):
if not RE_ACCEPTED_LOG_LEVELS.search(logical_line):
yield(0, msg)
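# NOTE (illustrative): a minimal sketch of what S374 flags; 'warn' is the
# classic deprecated spelling.
#
#     LOG.warn("node group resized")     # flagged: not an accepted level
#     LOG.warning("node group resized")  # accepted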

View File

@ -1,46 +0,0 @@
# Copyright (c) 2014 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import testtools
from saharaclient.api import base
from saharaclient.api import client
from requests_mock.contrib import fixture
class BaseTestCase(testtools.TestCase):
URL = 'http://localhost:8386'
TOKEN = 'token'
def setUp(self):
super(BaseTestCase, self).setUp()
self.responses = self.useFixture(fixture.Fixture())
self.client = client.Client(sahara_url=self.URL,
input_auth_token=self.TOKEN)
def assertFields(self, body, obj):
for key, value in body.items():
self.assertEqual(value, getattr(obj, key))
class TestResource(base.Resource):
resource_name = 'Test Resource'
defaults = {'description': 'Test Description',
'extra': "extra"}
class TestManager(base.ResourceManager):
resource_class = TestResource
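# NOTE (illustrative): a sketch of how these helpers combine in a test; the
# body dict is hypothetical, and base.Resource is assumed to accept
# (manager, info):
#
#     class ExampleResourceTest(BaseTestCase):
#         def test_fields(self):
#             body = {'description': 'changed', 'extra': 'e'}
#             resource = TestResource(TestManager(self.client), body)
#             self.assertFields(body, resource)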

View File

@ -1,42 +0,0 @@
# Copyright (c) 2015 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import mock
from saharaclient.osc import plugin
from saharaclient.tests.unit import base
class TestDataProcessingPlugin(base.BaseTestCase):
@mock.patch("saharaclient.api.client.Client")
def test_make_client(self, p_client):
instance = mock.Mock()
instance._api_version = {"data_processing": '1.1'}
instance.session = 'session'
instance._region_name = 'region_name'
instance._cacert = 'cacert'
instance._insecure = 'insecure'
instance._cli_options.data_processing_url = 'url'
instance._interface = 'public'
plugin.make_client(instance)
p_client.assert_called_with(session='session',
region_name='region_name',
cacert='cacert',
insecure='insecure',
sahara_url='url',
endpoint_type='public')

Some files were not shown because too many files have changed in this diff.