Retire Packaging Deb project repos

This commit is part of a series to retire the Packaging Deb
project. Step 2 is to remove all content from the project
repos, replacing it with a README that explains where to find
ongoing work and how to recover the repo if needed at some
future point (as in
https://docs.openstack.org/infra/manual/drivers.html#retiring-a-project).

Change-Id: I3b2becdf2cf9158c71f3d272171b27be50fd6acb
Tony Breeds 2017-09-12 16:04:35 -06:00
parent 47e28fabc3
commit 7dcd5a937e
80 changed files with 14 additions and 9050 deletions

@@ -1,6 +0,0 @@
[run]
branch = True
source = osc_lib
[report]
ignore_errors = True

.gitignore

@@ -1,25 +0,0 @@
*.DS_Store
*.egg*
*.log
*.mo
*.pyc
*.swo
*.swp
*~
.coverage
.idea
.testrepository
.tox
AUTHORS
build
ChangeLog
dist
# Doc related
doc/build
doc/source/reference/api/
# Development environment files
.project
.pydevproject
cover
# Files created by releasenotes build
releasenotes/build

@@ -1,4 +0,0 @@
[gerrit]
host=review.openstack.org
port=29418
project=openstack/osc-lib.git

@@ -1,2 +0,0 @@
<josh.kearney@pistoncloud.com> <josh@jk0.org>
<matt.joyce@cloudscaling.com> <matt@nycresistor.com>

@@ -1,9 +0,0 @@
[DEFAULT]
test_command=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \
OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \
OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-60} \
${PYTHON:-python} -m subunit.run discover -t ./ ${OS_TEST_PATH:-./osc_lib/tests} $LISTOPT $IDOPTION
test_id_option=--load-list $IDFILE
test_list_option=--list
group_regex=([^\.]+\.)+

@@ -1,108 +0,0 @@
OpenStack Style Commandments
============================
- Step 1: Read the OpenStack Style Commandments
http://docs.openstack.org/developer/hacking/
- Step 2: Read on
General
-------
- thou shalt not violate causality in our time cone, or else
Docstrings
----------
Docstrings should ONLY use triple-double-quotes (``"""``).
Single-line docstrings should NEVER have extraneous whitespace
between enclosing triple-double-quotes.
Deviation! Sentence fragments do not have punctuation. Specifically, in the
command classes the one-line docstring is also the help string for that
command, and those do not have periods.
"""A one line docstring looks like this"""
Calling Methods
---------------
Deviation! When breaking up method calls due to the 79 char line length limit,
use the alternate 4 space indent. With the first argument on the succeeding
line, all arguments will then be vertically aligned. Use the same convention
as with other data structure literals: end the last argument line with a
comma and put the closing paren on its own line, indented to the level of the
starting line.
unnecessarily_long_function_name(
    'string one',
    'string two',
    kwarg1=constants.ACTIVE,
    kwarg2=['a', 'b', 'c'],
)
Text encoding
-------------
Note: this section has clearly not been implemented in this project yet; the
intention is to do so.
All text within python code should be of type 'unicode'.
WRONG:
>>> s = 'foo'
>>> s
'foo'
>>> type(s)
<type 'str'>
RIGHT:
>>> u = u'foo'
>>> u
u'foo'
>>> type(u)
<type 'unicode'>
Transitions between internal unicode and external strings should always
be immediately and explicitly encoded or decoded.
All external text that is not explicitly encoded (database storage,
commandline arguments, etc.) should be presumed to be encoded as utf-8.
WRONG:
infile = open('testfile', 'r')
mystring = infile.readline()
myreturnstring = do_some_magic_with(mystring)
outfile.write(myreturnstring)
RIGHT:
infile = open('testfile', 'r')
mystring = infile.readline()
mytext = mystring.decode('utf-8')
returntext = do_some_magic_with(mytext)
returnstring = returntext.encode('utf-8')
outfile.write(returnstring)
Python 3.x Compatibility
------------------------
OpenStackClient strives to be Python 3.3 compatible. Common guidelines:
* Convert print statements to functions: print statements should be converted
to an appropriate log or other output mechanism.
* Use six where applicable: x.iteritems() is converted to six.iteritems(x),
  for example.
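A minimal sketch (not from the original HACKING file; the data is made up)
showing both guidelines together::

    from __future__ import print_function

    import six

    counts = {'servers': 3, 'volumes': 1}

    # print statement -> print function (or better, a logger)
    print('resource counts:')

    # x.iteritems() -> six.iteritems(x); works on Python 2 and 3
    for name, count in six.iteritems(counts):
        print('%s: %d' % (name, count))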
Running Tests
-------------
Note: Oh boy, are we behind on writing tests. But they are coming!
The testing system is based on a combination of tox and testr. If you just
want to run the whole suite, run `tox` and all will be fine. However, if
you'd like to dig in a bit more, you might want to learn some things about
testr itself. A basic walkthrough for OpenStack can be found at
http://wiki.openstack.org/testr

LICENSE

@@ -1,176 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.

README

@@ -0,0 +1,14 @@
This project is no longer maintained.
The contents of this repository are still available in the Git
source code management system. To see the contents of this
repository before it reached its end of life, please check out the
previous commit with "git checkout HEAD^1".
For ongoing work on maintaining OpenStack packages in the Debian
distribution, please see the Debian OpenStack packaging team at
https://wiki.debian.org/OpenStack/.
For any further questions, please email
openstack-dev@lists.openstack.org or join #openstack-dev on
Freenode.

@@ -1,66 +0,0 @@
=======
osc-lib
=======
.. image:: https://img.shields.io/pypi/v/osc-lib.svg
:target: https://pypi.python.org/pypi/osc-lib/
:alt: Latest Version
.. image:: https://img.shields.io/pypi/dm/osc-lib.svg
:target: https://pypi.python.org/pypi/osc-lib/
:alt: Downloads
OpenStackClient (aka OSC) is a command-line client for OpenStack. osc-lib
is a package of common support modules for writing OSC plugins.
* `PyPi`_ - package installation
* `Online Documentation`_
* `Launchpad project`_ - part of OpenStackClient
* `Bugs`_ - issue tracking
* `Source`_
* `Developer`_ - getting started as a developer
* `Contributing`_ - contributing code
* `Testing`_ - testing code
* IRC: #openstack-sdks on Freenode (irc.freenode.net)
* License: Apache 2.0
.. _PyPi: https://pypi.python.org/pypi/osc-lib
.. _Online Documentation: http://docs.openstack.org/osc-lib/latest/
.. _Launchpad project: https://launchpad.net/python-openstackclient
.. _Bugs: https://bugs.launchpad.net/python-openstackclient
.. _Source: https://git.openstack.org/cgit/openstack/osc-lib
.. _Developer: http://docs.openstack.org/project-team-guide/project-setup/python.html
.. _Contributing: http://docs.openstack.org/infra/manual/developers.html
.. _Testing: http://docs.openstack.org/osc-lib/latest/contributor/#testing
Getting Started
===============
osc-lib can be installed from PyPI using pip::
pip install osc-lib
Transition From OpenStackclient
===============================
This library was extracted from the main OSC repo after the OSC 2.4.0 release.
The following are the changes to imports that will cover the majority of the
transition to using osc-lib:
* openstackclient.api.api -> osc_lib.api.api
* openstackclient.api.auth -> osc_lib.api.auth
* openstackclient.api.utils -> osc_lib.api.utils
* openstackclient.common.command -> osc_lib.command.command
* openstackclient.common.commandmanager -> osc_lib.command.commandmanager
* openstackclient.common.exceptions -> osc_lib.exceptions
* openstackclient.common.logs -> osc_lib.logs
* openstackclient.common.parseractions -> osc_lib.cli.parseractions
* openstackclient.common.session -> osc_lib.session
* openstackclient.common.utils -> osc_lib.utils
* openstackclient.i18n -> osc_lib.i18n
* openstackclient.shell -> osc_lib.shell
Also, some of the test fixtures and modules may be used:
* openstackclient.tests.fakes -> osc_lib.tests.fakes
* openstackclient.tests.utils -> osc_lib.tests.utils
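In most cases the transition is a one-line import change; for example, using
the ``utils`` mapping above (the call shown is only illustrative)::

    # Before (OpenStackClient 2.4.0 and earlier)
    from openstackclient.common import utils

    # After
    from osc_lib import utils

    utils.env('OS_CLOUD')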

@@ -1,130 +0,0 @@
# Makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = build
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source
.PHONY: help clean html pdf dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text changes linkcheck doctest
help:
@echo "Please use \`make <target>' where <target> is one of"
@echo " html to make standalone HTML files"
@echo " pdf to make pdf with rst2pdf"
@echo " dirhtml to make HTML files named index.html in directories"
@echo " singlehtml to make a single large HTML file"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " qthelp to make HTML files and a qthelp project"
@echo " devhelp to make HTML files and a Devhelp project"
@echo " epub to make an epub"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " latexpdf to make LaTeX files and run them through pdflatex"
@echo " text to make text files"
@echo " changes to make an overview of all changed/added/deprecated items"
@echo " linkcheck to check all external links for integrity"
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
clean:
-rm -rf $(BUILDDIR)/*
html:
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
pdf:
$(SPHINXBUILD) -b pdf $(ALLSPHINXOPTS) $(BUILDDIR)/pdf
@echo
@echo "Build finished. The PDFs are in $(BUILDDIR)/pdf."
dirhtml:
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
singlehtml:
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
@echo
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
pickle:
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
@echo
@echo "Build finished; now you can process the pickle files."
json:
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
@echo
@echo "Build finished; now you can process the JSON files."
htmlhelp:
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in $(BUILDDIR)/htmlhelp."
qthelp:
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/NebulaDocs.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/NebulaDocs.qhc"
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@echo "Build finished."
@echo "To view the help file:"
@echo "# mkdir -p $$HOME/.local/share/devhelp/NebulaDocs"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/NebulaDocs"
@echo "# devhelp"
epub:
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
@echo
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
latex:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
@echo "Run \`make' in that directory to run these through (pdf)latex" \
"(use \`make latexpdf' here to do that automatically)."
latexpdf:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through pdflatex..."
make -C $(BUILDDIR)/latex all-pdf
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
text:
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
@echo
@echo "Build finished. The text files are in $(BUILDDIR)/text."
changes:
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
@echo
@echo "The overview file is in $(BUILDDIR)/changes."
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
doctest:
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
@echo "Testing of doctests in the sources finished, look at the " \
"results in $(BUILDDIR)/doctest/output.txt."

@@ -1,43 +0,0 @@
# Copyright 2014 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import os.path as path
from sphinx import apidoc
# NOTE(blk-u): pbr will run Sphinx multiple times when it generates
# documentation. Once for each builder. To run this extension we use the
# 'builder-inited' hook that fires at the beginning of a Sphinx build.
# We use ``run_already`` to make sure apidocs are only generated once
# even if Sphinx is run multiple times.
run_already = False
def run_apidoc(app):
global run_already
if run_already:
return
run_already = True
package_dir = path.abspath(path.join(app.srcdir, '..', '..',
'osc_lib'))
source_dir = path.join(app.srcdir, 'api')
apidoc.main(['apidoc', package_dir, '-f',
'-H', 'osc-lib Modules',
'-o', source_dir])
def setup(app):
app.connect('builder-inited', run_apidoc)

@@ -1,267 +0,0 @@
# -*- coding: utf-8 -*-
#
# OpenStack Command Line Client documentation build configuration file, created
# by sphinx-quickstart on Wed May 16 12:05:58 2012.
#
# This file is execfile()d with the current directory set to its containing
# dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.
import os
import sys
import openstackdocstheme
import pbr.version
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '..', '..')))
# NOTE(blk-u): Path for our Sphinx extension, remove when
# https://launchpad.net/bugs/1260495 is fixed.
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
# -- General configuration ----------------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
#needs_sphinx = '1.0'
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = ['sphinx.ext.autodoc',
'sphinx.ext.doctest',
'sphinx.ext.todo',
'openstackdocstheme',
]
# openstackdocstheme options
repository_name = 'openstack/osc-lib'
bug_project = 'python-openstackclient'
bug_tag = 'osc-lib'
html_last_updated_fmt = '%Y-%m-%d %H:%M'
# Add any paths that contain templates here, relative to this directory.
#templates_path = ['_templates']
# The suffix of source filenames.
source_suffix = '.rst'
# The encoding of source files.
#source_encoding = 'utf-8-sig'
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'OpenStackClient CLI Base'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
version_info = pbr.version.VersionInfo('osc-lib')
#
# The short X.Y version.
version = version_info.version_string()
# The full version, including alpha/beta/rc tags.
release = version_info.release_string()
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#language = None
# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
#today = ''
# Else, today_fmt is used as the format for a strftime call.
#today_fmt = '%B %d, %Y'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = []
# The reST default role (used for this markup: `text`) to use for all
# documents.
#default_role = None
# If true, '()' will be appended to :func: etc. cross-reference text.
#add_function_parentheses = True
# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
#add_module_names = True
# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
#show_authors = False
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# A list of ignored prefixes for module index sorting.
modindex_common_prefix = ['osc_lib.']
# -- Options for HTML output --------------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#html_theme_path = ["."]
#html_theme = '_theme'
html_theme = 'openstackdocs'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#html_theme_options = {}
# Add any paths that contain custom themes here, relative to this directory.
#html_theme_path = []
html_theme_path = [openstackdocstheme.get_html_theme_path()]
# The name for this set of Sphinx documents. If None, it defaults to
# "<project> v<release> documentation".
#html_title = None
# A shorter title for the navigation bar. Default is the same as html_title.
#html_short_title = None
# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
#html_logo = None
# The name of an image file (within the static path) to use as favicon of the
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
#html_favicon = None
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
#html_static_path = ['_static']
# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
# using the given strftime format.
#html_last_updated_fmt = '%b %d, %Y'
# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
#html_use_smartypants = True
# Custom sidebar templates, maps document names to template names.
#html_sidebars = {}
# Additional templates that should be rendered to pages, maps page names to
# template names.
#html_additional_pages = {}
# If false, no module index is generated.
#html_domain_indices = True
# If false, no index is generated.
#html_use_index = True
# If true, the index is split into individual pages for each letter.
#html_split_index = False
# If true, links to the reST sources are added to the pages.
#html_show_sourcelink = True
# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
#html_show_sphinx = True
# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
#html_show_copyright = True
# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it. The value of this option must be the
# base URL from which the finished HTML is served.
#html_use_opensearch = ''
# This is the file name suffix for HTML files (e.g. ".xhtml").
#html_file_suffix = None
# Output file base name for HTML help builder.
htmlhelp_basename = 'OpenStackCommandLineClientdoc'
# -- Options for LaTeX output -------------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
#'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
#'preamble': '',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, documentclass [howto/manual])
# .
latex_documents = [
('index', 'OpenStackCommandLineClient.tex',
u'OpenStack Command Line Client Documentation',
u'OpenStack'),
]
# The name of an image file (relative to this directory) to place at the top of
# the title page.
#latex_logo = None
# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
#latex_use_parts = False
# If true, show page references after internal links.
#latex_show_pagerefs = False
# If true, show URL addresses after external links.
#latex_show_urls = False
# Documents to append as an appendix to all manuals.
#latex_appendices = []
# If false, no module index is generated.
#latex_domain_indices = True
# -- Options for manual page output -------------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
#man_pages = []
# If true, show URL addresses after external links.
#man_show_urls = False
# -- Options for Texinfo output -----------------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
('index', 'OpenStackCommandLineClient',
u'OpenStack Command Line Client Documentation',
u'OpenStack', 'OpenStackCommandLineClient',
'One line description of project.',
'Miscellaneous'),
]
# Documents to append as an appendix to all manuals.
#texinfo_appendices = []
# If false, no module index is generated.
#texinfo_domain_indices = True
# How to display URL addresses: 'footnote', 'no', or 'inline'.
#texinfo_show_urls = 'footnote'

@@ -1,19 +0,0 @@
==============
Contributing
==============
osc-lib utilizes all of the usual OpenStack processes and requirements for
contributions. The code is hosted `on OpenStack's Git server`_. `Bug reports`_
and `blueprints`_ may be submitted to the :code:`python-openstackclient` project
on `Launchpad`_. Code may be submitted to the
:code:`openstack/osc-lib` project using `Gerrit`_.
Developers may also be found in the `IRC channel`_ ``#openstack-sdks``.
.. _`on OpenStack's Git server`: https://git.openstack.org/cgit/openstack/python-openstackclient/tree
.. _Launchpad: https://launchpad.net/python-openstackclient
.. _Gerrit: http://docs.openstack.org/infra/manual/developers.html#development-workflow
.. _Bug reports: https://bugs.launchpad.net/python-openstackclient/+bugs
.. _blueprints: https://blueprints.launchpad.net/python-openstackclient
.. _PyPi: https://pypi.python.org/pypi/osc-lib
.. _tarball: http://tarballs.openstack.org/osc-lib
.. _IRC channel: https://wiki.openstack.org/wiki/IRC

@@ -1,21 +0,0 @@
=========================================
osc-lib -- OpenStackClient Plugin Library
=========================================
OpenStackClient (aka OSC) is a command-line client for OpenStack. osc-lib
is a package of common support modules for writing OSC plugins.
Contents:
.. toctree::
:maxdepth: 2
user/index
reference/index
contributor/index
.. rubric:: Indices and tables
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

@@ -1,8 +0,0 @@
=======================
Library API Reference
=======================
.. toctree::
:maxdepth: 2
api/autoindex

@@ -1 +0,0 @@
.. include:: ../../../ChangeLog

@@ -1,9 +0,0 @@
===============
Using osc-lib
===============
.. toctree::
:maxdepth: 2
transition
change_log

@@ -1,109 +0,0 @@
===============================
Transition from OpenStackClient
===============================
``osc-lib`` was extracted from the main OpenStackClient repo after the
OSC 2.4.0 release. During the migration all module names have changed
from ``openstackclient.*`` to ``osc_lib.*``. In addition, some re-arranging
has been done internally to better align modules.
The complete list of public module name changes:
* ``openstackclient.api.api`` -> ``osc_lib.api.api``
* ``openstackclient.api.auth`` -> ``osc_lib.api.auth``
* ``openstackclient.api.utils`` -> ``osc_lib.api.utils``
* ``openstackclient.common.command`` -> ``osc_lib.command.command``
* ``openstackclient.common.commandmanager`` -> ``osc_lib.command.commandmanager``
* ``openstackclient.common.exceptions`` -> ``osc_lib.exceptions``
* ``openstackclient.common.logs`` -> ``osc_lib.logs``
* ``openstackclient.common.parseractions`` -> ``osc_lib.cli.parseractions``
* ``openstackclient.common.session`` -> ``osc_lib.session``
* ``openstackclient.common.utils`` -> ``osc_lib.utils``
* ``openstackclient.tests.fakes`` -> ``osc_lib.tests.fakes``
* ``openstackclient.tests.utils`` -> ``osc_lib.tests.utils``
Additional Changes
==================
In addition to the existing public modules, other parts of OSC have been
extracted, including the base ``Command``, ``CommandManager``, ``ClientManager``
and ``Session`` classes.
ClientManager
-------------
The OSC ``ClientManager`` is responsible for managing all of the handles to the
individual API client objects as well as coordinating session and authentication
objects.
Plugins are encouraged to use the ClientManager interface for obtaining information
about global configuration.
* ``openstackclient.common.clientmanager`` -> ``osc_lib.clientmanager``
* All of the handling of the ``verify``/``insecure``/``cacert`` configuration
options has been consolidated into ``ClientManager``. This converts the ``--verify``,
``--insecure`` and ``--os-cacert`` options into a ``Requests``-compatible
``verify`` attribute and a ``cacert`` attribute for the legacy client libraries.
Both are now public; the ``_insecure`` attribute has been removed.
.. list-table:: Verify/Insecure/CACert
:header-rows: 1
* - --verify
- --insecure
- --cacert
- Result
* - None
- None
-
- ``verify=True``, ``cacert=None``
* - True
- None
- None
- ``verify=True``, ``cacert=None``
* - None
- True
- None
- ``verify=False``, ``cacert=None``
* - None
- None
- <filename>
- ``verify=cacert``, ``cacert=<filename>``
* - True
- None
- <filename>
- ``verify=cacert``, ``cacert=<filename>``
* - None
- True
- <filename>
- ``verify=False``, ``cacert=None``
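The table can be read as a small decision rule. A minimal sketch of that logic
(illustrative only, not the actual ``ClientManager`` code; the helper name is
made up, and ``--verify`` is omitted because it does not change the outcome in
any row above)::

    def resolve_verify(insecure=None, cacert=None):
        """Map --insecure/--os-cacert to (verify, cacert) as in the table."""
        if cacert and not insecure:
            # A CA bundle means: verify against that bundle.
            return cacert, cacert
        if insecure:
            # --insecure wins: no verification, ignore any CA bundle.
            return False, None
        return True, None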
* A number of other ``ClientManager`` attributes have also been made public to
encourage their direct use rather than reaching in to the global options passed
in the ``ClientManager`` constructor:
* ``_verify`` -> ``verify``
* ``_cacert`` -> ``cacert``
* ``_cert`` -> ``cert``
* ``_insecure`` -> removed, use ``not verify``
* ``_interface`` -> ``interface``
* ``_region_name`` -> ``region_name``
Shell
=====
* ``openstackclient.shell`` -> ``osc_lib.shell``
* Break up ``OpenStackShell.initialize_app()``
* leave all plugin initialization in OSC in ``_load_plugins()``
* leave all command loading in OSC in ``_load_commands()``
API
===
The API base layer is the common point for all API subclasses. It is a
wrapper around ``keystoneauth1.session.Session`` that fixes the ``request()``
interface and provides simple endpoint handling that is useful when a Service
Catalog is either not available or is insufficient. It also adds simple
implementations of the common API CRUD operations: create(), delete(), etc.
* ``KeystoneSession`` -> merged into ``BaseAPI``
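A minimal usage sketch (the service type and endpoint below are made up, and
authentication is omitted, so a real call would also need a configured
session)::

    from osc_lib.api import api

    volume = api.BaseAPI(
        service_type='volume',
        endpoint='https://volume.example.com/v2',
    )
    # GET https://volume.example.com/v2/volumes
    volumes = volume.list('volumes')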

@@ -1,19 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
from osc_lib import version as _version
__all__ = ['__version__']
__version__ = _version.version_string

@@ -1,417 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Base API Library"""
import simplejson as json
import six
from keystoneauth1 import exceptions as ksa_exceptions
from keystoneauth1 import session as ksa_session
from osc_lib import exceptions
from osc_lib.i18n import _
class BaseAPI(object):
"""Base API wrapper for keystoneauth1.session.Session
Encapsulate the translation between keystoneauth1.session.Session
and requests.Session in a single layer:
* Restore some requests.session.Session compatibility;
keystoneauth1.session.Session.request() has the method and url
arguments swapped from the rest of the requests-using world.
* Provide basic endpoint handling when a Service Catalog is not
available.
"""
# Which service are we? Set in API-specific subclasses
SERVICE_TYPE = ""
# The common OpenStack microversion header
HEADER_NAME = "OpenStack-API-Version"
def __init__(
self,
session=None,
service_type=None,
endpoint=None,
**kwargs
):
"""Base object that contains some common API objects and methods
:param keystoneauth1.session.Session session:
The session to be used for making the HTTP API calls. If None,
a default keystoneauth1.session.Session will be created.
:param string service_type:
API name, i.e. ``identity`` or ``compute``
:param string endpoint:
An optional URL to be used as the base for API requests on
this API.
:param kwargs:
Keyword arguments passed to keystoneauth1.session.Session().
"""
super(BaseAPI, self).__init__()
# Create a keystoneauth1.session.Session if one is not supplied
if not session:
self.session = ksa_session.Session(**kwargs)
else:
self.session = session
self.service_type = service_type
self.endpoint = self._munge_endpoint(endpoint)
def _munge_endpoint(self, endpoint):
"""Hook to allow subclasses to massage the passed-in endpoint
Hook to massage passed-in endpoints from arbitrary sources,
including direct user input. By default just remove trailing
'/' as all of our path info strings start with '/' and not all
services can handle '//' in their URLs.
Some subclasses will override this to do additional work, most
likely with regard to API versions.
:param string endpoint: The service endpoint, generally direct
from the service catalog.
:return: The modified endpoint
"""
if isinstance(endpoint, six.string_types):
return endpoint.rstrip('/')
else:
return endpoint
def _request(self, method, url, session=None, **kwargs):
"""Perform call into session
All API calls are funneled through this method to provide a common
place to finalize the passed URL and other things.
:param string method:
The HTTP method name, i.e. ``GET``, ``PUT``, etc
:param string url:
The API-specific portion of the URL path, or a full URL if
``endpoint`` was not supplied at initialization.
:param keystoneauth1.session.Session session:
An initialized session to override the one created at
initialization.
:param kwargs:
Keyword arguments passed to requests.request().
:return: the requests.Response object
"""
# If session arg is supplied, use it this time, but don't save it
if not session:
session = self.session
# Do the auto-endpoint magic
if self.endpoint:
if url:
url = '/'.join([self.endpoint.rstrip('/'), url.lstrip('/')])
else:
# NOTE(dtroyer): This is left here after _munge_endpoint() is
# added because endpoint is public and there is
# no accounting for what may happen.
url = self.endpoint.rstrip('/')
else:
# Pass on the lack of URL unmolested to maintain the same error
# handling from keystoneauth: raise EndpointNotFound
pass
# Hack out empty headers 'cause KSA can't stomach it
if 'headers' in kwargs and kwargs['headers'] is None:
kwargs.pop('headers')
# Why is ksc session backwards???
return session.request(url, method, **kwargs)
# The basic action methods all take a Session and return dict/lists
def create(
self,
url,
session=None,
method=None,
**params
):
"""Create a new resource
:param string url:
The API-specific portion of the URL path
:param Session session:
HTTP client session
:param string method:
HTTP method (default POST)
"""
if not method:
method = 'POST'
ret = self._request(method, url, session=session, **params)
# Should this move into _request()?
try:
return ret.json()
except json.JSONDecodeError:
return ret
def delete(
self,
url,
session=None,
**params
):
"""Delete a resource
:param string url:
The API-specific portion of the URL path
:param Session session:
HTTP client session
"""
return self._request('DELETE', url, **params)
def list(
self,
path,
session=None,
body=None,
detailed=False,
headers=None,
**params
):
"""Return a list of resources
GET ${ENDPOINT}/${PATH}?${PARAMS}
path is often the object's plural resource type
:param string path:
The API-specific portion of the URL path
:param Session session:
HTTP client session
:param body: data that will be encoded as JSON and passed in POST
request (GET will be sent by default)
:param bool detailed:
Adds '/details' to path for some APIs to return extended attributes
:param dict headers:
Headers dictionary to pass to requests
:returns:
JSON-decoded response, could be a list or a dict-wrapped-list
"""
if detailed:
path = '/'.join([path.rstrip('/'), 'details'])
if body:
ret = self._request(
'POST',
path,
# service=self.service_type,
json=body,
params=params,
headers=headers,
)
else:
ret = self._request(
'GET',
path,
# service=self.service_type,
params=params,
headers=headers,
)
try:
return ret.json()
except json.JSONDecodeError:
return ret
# Layered actions built on top of the basic action methods do not
# explicitly take a Session but one may still be passed in kwargs
def find_attr(
self,
path,
value=None,
attr=None,
resource=None,
):
"""Find a resource via attribute or ID
Most APIs return a list wrapped by a dict with the resource
name as key. Some APIs (Identity) return a dict when a query
string is present and there is one return value. Take steps to
unwrap these bodies and return a single dict without any resource
wrappers.
:param string path:
The API-specific portion of the URL path
:param string value:
value to search for
:param string attr:
attribute to use for resource search
:param string resource:
plural of the object resource name; defaults to path
For example:
n = find(netclient, 'network', 'networks', 'matrix')
"""
# Default attr is 'name'
if attr is None:
attr = 'name'
# Default resource is path - in many APIs they are the same
if resource is None:
resource = path
def getlist(kw):
"""Do list call, unwrap resource dict if present"""
ret = self.list(path, **kw)
if isinstance(ret, dict) and resource in ret:
ret = ret[resource]
return ret
# Search by attribute
kwargs = {attr: value}
data = getlist(kwargs)
if isinstance(data, dict):
return data
if len(data) == 1:
return data[0]
if len(data) > 1:
msg = _("Multiple %(resource)s exist with %(attr)s='%(value)s'")
raise exceptions.CommandError(
msg % {'resource': resource,
'attr': attr,
'value': value}
)
# Search by id
kwargs = {'id': value}
data = getlist(kwargs)
if len(data) == 1:
return data[0]
msg = _("No %(resource)s with a %(attr)s or ID of '%(value)s' found")
raise exceptions.CommandError(
msg % {'resource': resource,
'attr': attr,
'value': value}
)
def find_bulk(
self,
path,
headers=None,
**kwargs
):
"""Bulk load and filter locally
:param string path:
The API-specific portion of the URL path
:param kwargs:
A dict of AVPs to match - logical AND
:param dict headers:
Headers dictionary to pass to requests
:returns: list of resource dicts
"""
print("keys: %s" % kwargs.keys())
items = self.list(path)
if isinstance(items, dict):
# strip off the enclosing dict
key = list(items.keys())[0]
items = items[key]
ret = []
for o in items:
try:
if all(o[attr] == kwargs[attr] for attr in kwargs.keys()):
ret.append(o)
except KeyError:
continue
return ret
def find_one(
self,
path,
**kwargs
):
"""Find a resource by name or ID
:param string path:
The API-specific portion of the URL path
:returns:
resource dict
"""
bulk_list = self.find_bulk(path, **kwargs)
num_bulk = len(bulk_list)
if num_bulk == 0:
msg = _("none found")
raise exceptions.NotFound(msg)
elif num_bulk > 1:
msg = _("many found")
raise RuntimeError(msg)
return bulk_list[0]
def find(
self,
path,
value=None,
attr=None,
headers=None,
):
"""Find a single resource by name or ID
:param string path:
The API-specific portion of the URL path
:param string value:
search expression (required, really)
:param string attr:
name of attribute for secondary search
:param dict headers:
Headers dictionary to pass to requests
"""
try:
ret = self._request(
'GET', "/%s/%s" % (path, value),
headers=headers,
).json()
if isinstance(ret, dict):
# strip off the enclosing dict
key = list(ret.keys())[0]
ret = ret[key]
except (
ksa_exceptions.NotFound,
ksa_exceptions.BadRequest,
):
kwargs = {attr: value}
try:
ret = self.find_one(
path,
headers=headers,
**kwargs
)
except (
exceptions.NotFound,
ksa_exceptions.NotFound,
):
msg = _("%s not found") % value
raise exceptions.NotFound(msg)
return ret

@@ -1,215 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Authentication Library"""
import argparse
from keystoneauth1.identity.v3 import k2k
from keystoneauth1.loading import base
from osc_lib import exceptions as exc
from osc_lib.i18n import _
from osc_lib import utils
# Initialize the list of Authentication plugins early in order
# to get the command-line options
PLUGIN_LIST = None
# List of plugin command line options
OPTIONS_LIST = {}
def get_plugin_list():
"""Gather plugin list and cache it"""
global PLUGIN_LIST
if PLUGIN_LIST is None:
PLUGIN_LIST = base.get_available_plugin_names()
return PLUGIN_LIST
def get_options_list():
"""Gather plugin options so the help action has them available"""
global OPTIONS_LIST
if not OPTIONS_LIST:
for plugin_name in get_plugin_list():
plugin_options = base.get_plugin_options(plugin_name)
for o in plugin_options:
os_name = o.name.lower().replace('_', '-')
os_env_name = 'OS_' + os_name.upper().replace('-', '_')
OPTIONS_LIST.setdefault(
os_name, {'env': os_env_name, 'help': ''},
)
# TODO(mhu) simplistic approach, would be better to only add
# help texts if they vary from one auth plugin to another
# also the text rendering is ugly in the CLI ...
OPTIONS_LIST[os_name]['help'] += 'With %s: %s\n' % (
plugin_name,
o.help,
)
return OPTIONS_LIST
def check_valid_authorization_options(options, auth_plugin_name):
"""Validate authorization options, and provide helpful error messages."""
if (options.auth.get('project_id') and not
options.auth.get('domain_id') and not
options.auth.get('domain_name') and not
options.auth.get('project_name') and not
options.auth.get('tenant_id') and not
options.auth.get('tenant_name')):
raise exc.CommandError(_(
'Missing parameter(s): '
'Set either a project or a domain scope, but not both. Set a '
'project scope with --os-project-name, OS_PROJECT_NAME, or '
'auth.project_name. Alternatively, set a domain scope with '
'--os-domain-name, OS_DOMAIN_NAME or auth.domain_name.'
))
def check_valid_authentication_options(options, auth_plugin_name):
"""Validate authentication options, and provide helpful error messages
:param required_scope: indicate whether a scoped token is required
"""
# Get all the options defined within the plugin.
plugin_opts = base.get_plugin_options(auth_plugin_name)
plugin_opts = {opt.dest: opt for opt in plugin_opts}
# NOTE(aloga): this is a horrible hack. We need a way to specify the
# required options in the plugins. Using the "required" argument for
# the oslo_config.cfg.Opt does not work, as it is not possible to load the
# plugin if the option is not defined, so the error will simply be:
# "NoMatchingPlugin: The plugin foobar could not be found"
msgs = []
# when no auth params are passed in, user advised to use os-cloud
if not options.auth:
msgs.append(_(
'Set a cloud-name with --os-cloud or OS_CLOUD'
))
else:
if 'password' in plugin_opts and not options.auth.get('username'):
msgs.append(_(
'Set a username with --os-username, OS_USERNAME,'
' or auth.username'
))
if 'auth_url' in plugin_opts and not options.auth.get('auth_url'):
msgs.append(_(
'Set an authentication URL, with --os-auth-url,'
' OS_AUTH_URL or auth.auth_url'
))
if 'url' in plugin_opts and not options.auth.get('url'):
msgs.append(_(
'Set a service URL, with --os-url, OS_URL or auth.url'
))
if 'token' in plugin_opts and not options.auth.get('token'):
msgs.append(_(
'Set a token with --os-token, OS_TOKEN or auth.token'
))
if msgs:
raise exc.CommandError(
_('Missing parameter(s): \n%s') % '\n'.join(msgs)
)
def build_auth_plugins_option_parser(parser):
"""Auth plugins options builder
Builds dynamically the list of options expected by each available
authentication plugin.
"""
available_plugins = list(get_plugin_list())
parser.add_argument(
'--os-auth-type',
metavar='<auth-type>',
dest='auth_type',
default=utils.env('OS_AUTH_TYPE'),
help=_('Select an authentication type. Available types: %s.'
' Default: selected based on --os-username/--os-token'
' (Env: OS_AUTH_TYPE)') % ', '.join(available_plugins),
choices=available_plugins
)
# Maintain compatibility with old tenant env vars
envs = {
'OS_PROJECT_NAME': utils.env(
'OS_PROJECT_NAME',
default=utils.env('OS_TENANT_NAME')
),
'OS_PROJECT_ID': utils.env(
'OS_PROJECT_ID',
default=utils.env('OS_TENANT_ID')
),
}
for o in get_options_list():
# Remove tenant options from KSC plugins and replace them below
if 'tenant' not in o:
parser.add_argument(
'--os-' + o,
metavar='<auth-%s>' % o,
dest=o.replace('-', '_'),
default=envs.get(
OPTIONS_LIST[o]['env'],
utils.env(OPTIONS_LIST[o]['env']),
),
help=_('%(help)s\n(Env: %(env)s)') % {
'help': OPTIONS_LIST[o]['help'],
'env': OPTIONS_LIST[o]['env'],
},
)
# add tenant-related options for compatibility
# this is deprecated but still used in some tempest tests...
parser.add_argument(
'--os-tenant-name',
metavar='<auth-tenant-name>',
dest='os_project_name',
help=argparse.SUPPRESS,
)
parser.add_argument(
'--os-tenant-id',
metavar='<auth-tenant-id>',
dest='os_project_id',
help=argparse.SUPPRESS,
)
return parser
def get_keystone2keystone_auth(local_auth, service_provider,
project_id=None, project_name=None,
project_domain_id=None,
project_domain_name=None):
"""Return Keystone 2 Keystone authentication for service provider.
:param local_auth: authentication to use with the local Keystone
:param service_provider: service provider id as registered in Keystone
:param project_id: project id to scope to in the service provider
:param project_name: project name to scope to in the service provider
:param project_domain_id: id of domain in the service provider
:param project_domain_name: name of domain to scope to in the service provider
:return: Keystone2Keystone auth object for service provider
"""
return k2k.Keystone2Keystone(local_auth,
service_provider,
project_id=project_id,
project_name=project_name,
project_domain_id=project_domain_id,
project_domain_name=project_domain_name)

@@ -1,84 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""API Utilities Library"""
def simple_filter(
data=None,
attr=None,
value=None,
property_field=None,
):
"""Filter a list of dicts
:param list data:
The list to be filtered. The list is modified in-place and will
be changed if any filtering occurs.
:param string attr:
The name of the attribute to filter. If attr does not exist no
match will succeed and no rows will be returned. If attr is
None no filtering will be performed and all rows will be returned.
:param string value:
The value to filter. None is considered to be a 'no filter' value.
'' matches against a Python empty string.
:param string property_field:
The name of the data field containing a property dict to filter.
If property_field is None, attr is a field name. If property_field
is not None, attr is a property key name inside the named property
field.
:returns:
Returns the filtered list
:rtype list:
This simple filter (one attribute, one exact-match value) searches a
list of dicts to select items. It first searches the item dict for a
matching ``attr`` then does an exact-match on the ``value``. If
``property_field`` is given, it will look inside that field (if it
exists and is a dict) for a matching ``value``.
"""
# Take the do-nothing case shortcut
if not data or not attr or value is None:
return data
# NOTE(dtroyer): This filter modifies the provided list in-place using
# list.remove() so we need to start at the end so the loop pointer does
# not skip any items after a deletion.
for d in reversed(data):
if attr in d:
# Searching data fields
search_value = d[attr]
elif (property_field and property_field in d and
isinstance(d[property_field], dict)):
# Searching a properties field - do this separately because
# we don't want to fail over to checking the fields if a
# property name is given.
if attr in d[property_field]:
search_value = d[property_field][attr]
else:
search_value = None
else:
search_value = None
# could do regex here someday...
if not search_value or search_value != value:
# remove from list
try:
data.remove(d)
except ValueError:
# it's already gone!
pass
return data
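# Example usage (illustrative; the data below is made up). simple_filter()
# modifies the list in place and also returns it, so fresh copies are used
# for each call:
#
#     servers = [
#         {'name': 'web1', 'properties': {'flavor': 'small'}},
#         {'name': 'db1', 'properties': {'flavor': 'large'}},
#     ]
#     # Match on a top-level field:
#     simple_filter(list(servers), attr='name', value='web1')
#     # -> [{'name': 'web1', 'properties': {'flavor': 'small'}}]
#     # Match on a key inside the 'properties' dict:
#     simple_filter(list(servers), attr='flavor', value='large',
#                   property_field='properties')
#     # -> [{'name': 'db1', 'properties': {'flavor': 'large'}}]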

@@ -1,220 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""OpenStackConfig subclass for argument compatibility"""
import logging
from os_client_config import config
from os_client_config import exceptions as occ_exceptions
from oslo_utils import strutils
import six
LOG = logging.getLogger(__name__)
# Subclass OpenStackConfig in order to munge config values
# before auth plugins are loaded
class OSC_Config(config.OpenStackConfig):
def _auth_select_default_plugin(self, config):
"""Select a default plugin based on supplied arguments
Migrated from auth.select_auth_plugin()
"""
identity_version = str(config.get('identity_api_version', ''))
if config.get('username', None) and not config.get('auth_type', None):
if identity_version == '3':
config['auth_type'] = 'v3password'
elif identity_version.startswith('2'):
config['auth_type'] = 'v2password'
else:
# let keystoneauth figure it out itself
config['auth_type'] = 'password'
elif config.get('token', None) and not config.get('auth_type', None):
if identity_version == '3':
config['auth_type'] = 'v3token'
elif identity_version.startswith('2'):
config['auth_type'] = 'v2token'
else:
# let keystoneauth figure it out itself
config['auth_type'] = 'token'
else:
# The ultimate default is similar to the original behaviour,
# but this time with version discovery
if not config.get('auth_type', None):
config['auth_type'] = 'password'
LOG.debug("Auth plugin %s selected" % config['auth_type'])
return config
def _auth_v2_arguments(self, config):
"""Set up v2-required arguments from v3 info
Migrated from auth.build_auth_params()
"""
if ('auth_type' in config and config['auth_type'].startswith("v2")):
if 'project_id' in config['auth']:
config['auth']['tenant_id'] = config['auth']['project_id']
if 'project_name' in config['auth']:
config['auth']['tenant_name'] = config['auth']['project_name']
return config
def _auth_v2_ignore_v3(self, config):
"""Remove v3 arguemnts if present for v2 plugin
Migrated from clientmanager.setup_auth()
"""
# NOTE(hieulq): If USER_DOMAIN_NAME, USER_DOMAIN_ID, PROJECT_DOMAIN_ID
# or PROJECT_DOMAIN_NAME is present and API_VERSION is 2.0, then
# ignore all domain related configs.
if (str(config.get('identity_api_version', '')).startswith('2') and
config.get('auth_type').endswith('password')):
domain_props = [
'project_domain_id',
'project_domain_name',
'user_domain_id',
'user_domain_name',
]
for prop in domain_props:
if config['auth'].pop(prop, None) is not None:
LOG.warning("Ignoring domain related config " +
prop + " because identity API version is 2.0")
return config
def _auth_default_domain(self, config):
"""Set a default domain from available arguments
Migrated from clientmanager.setup_auth()
"""
identity_version = str(config.get('identity_api_version', ''))
auth_type = config.get('auth_type', None)
# TODO(mordred): This is a usability improvement that's broadly useful
# We should port it back up into os-client-config.
default_domain = config.get('default_domain', None)
if (identity_version == '3' and
not auth_type.startswith('v2') and
default_domain):
# NOTE(stevemar): If PROJECT_DOMAIN_ID or PROJECT_DOMAIN_NAME is
# present, then do not change the behaviour. Otherwise, set the
# PROJECT_DOMAIN_ID to 'OS_DEFAULT_DOMAIN' for better usability.
if (
auth_type in ("password", "v3password", "v3totp") and
not config['auth'].get('project_domain_id') and
not config['auth'].get('project_domain_name')
):
config['auth']['project_domain_id'] = default_domain
# NOTE(stevemar): If USER_DOMAIN_ID or USER_DOMAIN_NAME is present,
# then do not change the behaviour. Otherwise, set the
# USER_DOMAIN_ID to 'OS_DEFAULT_DOMAIN' for better usability.
# NOTE(aloga): this should only be set if there is a username.
# TODO(dtroyer): Move this to os-client-config after the plugin has
# been loaded so we can check directly if the options are accepted.
if (
auth_type in ("password", "v3password", "v3totp") and
not config['auth'].get('user_domain_id') and
not config['auth'].get('user_domain_name')
):
config['auth']['user_domain_id'] = default_domain
return config
def auth_config_hook(self, config):
"""Allow examination of config values before loading auth plugin
OpenStackClient will override this to perform additional checks
on auth_type.
"""
config = self._auth_select_default_plugin(config)
config = self._auth_v2_arguments(config)
config = self._auth_v2_ignore_v3(config)
config = self._auth_default_domain(config)
if LOG.isEnabledFor(logging.DEBUG):
LOG.debug("auth_config_hook(): %s",
strutils.mask_password(six.text_type(config)))
return config
def _validate_auth_ksc(self, config, cloud, fixed_argparse=None):
"""Old compatibility hack for OSC, no longer needed/wanted"""
return config
def _validate_auth(self, config, loader, fixed_argparse=None):
"""Validate auth plugin arguments"""
# May throw a keystoneauth1.exceptions.NoMatchingPlugin
plugin_options = loader.get_options()
msgs = []
prompt_options = []
for p_opt in plugin_options:
# if it's in config, win, move it and kill it from config dict
# if it's in config.auth but not in config we're good
# deprecated loses to current
# provided beats default, deprecated or not
winning_value = self._find_winning_auth_value(p_opt, config)
if not winning_value:
winning_value = self._find_winning_auth_value(
p_opt, config['auth'])
# if the plugin tells us that this value is required
# then error if it doesn't exist now
if not winning_value and p_opt.required:
msgs.append(
'Missing value {auth_key}'
' required for auth plugin {plugin}'.format(
auth_key=p_opt.name, plugin=config.get('auth_type'),
)
)
# Clean up after ourselves
for opt in [p_opt.name] + [o.name for o in p_opt.deprecated]:
opt = opt.replace('-', '_')
config.pop(opt, None)
config['auth'].pop(opt, None)
if winning_value:
# Prefer the plugin configuration dest value if the value's key
# is marked as deprecated.
if p_opt.dest is None:
config['auth'][p_opt.name.replace('-', '_')] = (
winning_value)
else:
config['auth'][p_opt.dest] = winning_value
# See if this needs a prompting
if (
'prompt' in vars(p_opt) and
p_opt.prompt is not None and
p_opt.dest not in config['auth'] and
self._pw_callback is not None
):
# Defer these until we know all required opts are present
prompt_options.append(p_opt)
if msgs:
raise occ_exceptions.OpenStackConfigException('\n'.join(msgs))
else:
for p_opt in prompt_options:
config['auth'][p_opt.dest] = self._pw_callback(p_opt.prompt)
return config
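
A rough sketch of how the default auth plugin is chosen; the config dict is hand-built for illustration (real values come from clouds.yaml, the environment, and the command line), and constructing OSC_Config() assumes os-client-config can initialize normally in your environment:

    from osc_lib.cli import client_config

    cfg = {
        'identity_api_version': '3',
        'username': 'demo',
        'auth': {},
    }
    occ = client_config.OSC_Config()
    cfg = occ._auth_select_default_plugin(cfg)
    # cfg['auth_type'] == 'v3password' because a username was supplied and
    # the identity API version is 3; a token would have selected 'v3token'.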

View File

@ -1,47 +0,0 @@
# Copyright 2017 Huawei, Inc. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Formattable column for specify content type"""
from cliff import columns
from osc_lib import utils
class DictColumn(columns.FormattableColumn):
"""Format column for dict content"""
def human_readable(self):
return utils.format_dict(self._value)
class DictListColumn(columns.FormattableColumn):
"""Format column for dict, key is string, value is list"""
def human_readable(self):
return utils.format_dict_of_list(self._value)
class ListColumn(columns.FormattableColumn):
"""Format column for list content"""
def human_readable(self):
return utils.format_list(self._value)
class ListDictColumn(columns.FormattableColumn):
"""Format column for list of dict content"""
def human_readable(self):
return utils.format_list_of_dicts(self._value)
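
A brief usage sketch; the exact human-readable strings come from the osc_lib.utils formatters, so the outputs shown in comments are approximate:

    from osc_lib.cli import format_columns

    d = format_columns.DictColumn({'a': '1', 'b': '2'})
    print(d.human_readable())      # roughly: a='1', b='2'

    lst = format_columns.ListColumn(['x', 'y'])
    print(lst.human_readable())    # roughly: x, y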

View File

@ -1,255 +0,0 @@
# Copyright 2013 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""argparse Custom Actions"""
import argparse
from osc_lib.i18n import _
class KeyValueAction(argparse.Action):
"""A custom action to parse arguments as key=value pairs
Ensures that ``dest`` is a dict
"""
def __call__(self, parser, namespace, values, option_string=None):
# Make sure we have an empty dict rather than None
if getattr(namespace, self.dest, None) is None:
setattr(namespace, self.dest, {})
# Add value if an assignment else remove it
if '=' in values:
values_list = values.split('=', 1)
# NOTE(qtang): Prevent null key setting in property
if '' == values_list[0]:
msg = _("Property key must be specified: %s")
raise argparse.ArgumentTypeError(msg % str(values))
else:
getattr(namespace, self.dest, {}).update([values_list])
else:
msg = _("Expected 'key=value' type, but got: %s")
raise argparse.ArgumentTypeError(msg % str(values))
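
An illustrative argparse sketch using ``KeyValueAction`` (the --property option name mirrors common OpenStackClient usage but is only an example here):

    import argparse

    from osc_lib.cli import parseractions

    parser = argparse.ArgumentParser()
    parser.add_argument(
        '--property',
        metavar='<key=value>',
        action=parseractions.KeyValueAction,
        help='Property to set (repeat option to set multiple properties)',
    )
    args = parser.parse_args(['--property', 'size=10', '--property', 'tier=gold'])
    # args.property == {'size': '10', 'tier': 'gold'}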
class MultiKeyValueAction(argparse.Action):
"""A custom action to parse arguments as key1=value1,key2=value2 pairs
Ensure that ``dest`` is a list. The list will finally contain multiple
dicts, with key=value pairs in them.
NOTE: The argument string must be comma-separated key=value pairs.
Commas (',') and equals signs ('=') may not be used within a key or value.
"""
def __init__(self, option_strings, dest, nargs=None,
required_keys=None, optional_keys=None, **kwargs):
"""Initialize the action object, and parse customized options
Required keys and optional keys can be specified when initializing
the action to enable the key validation. If none of them specified,
the key validation will be skipped.
:param required_keys: a list of required keys
:param optional_keys: a list of optional keys
"""
if nargs:
msg = _("Parameter 'nargs' is not allowed, but got %s")
raise ValueError(msg % nargs)
super(MultiKeyValueAction, self).__init__(option_strings,
dest, **kwargs)
# required_keys: A list of keys that are required. None by default.
if required_keys and not isinstance(required_keys, list):
msg = _("'required_keys' must be a list")
raise TypeError(msg)
self.required_keys = set(required_keys or [])
# optional_keys: A list of keys that are optional. None by default.
if optional_keys and not isinstance(optional_keys, list):
msg = _("'optional_keys' must be a list")
raise TypeError(msg)
self.optional_keys = set(optional_keys or [])
def __call__(self, parser, namespace, values, metavar=None):
# Make sure we have an empty list rather than None
if getattr(namespace, self.dest, None) is None:
setattr(namespace, self.dest, [])
params = {}
for kv in values.split(','):
# Add value if an assignment else raise ArgumentTypeError
if '=' in kv:
kv_list = kv.split('=', 1)
# NOTE(qtang): Prevent null key setting in property
if '' == kv_list[0]:
msg = _("Each property key must be specified: %s")
raise argparse.ArgumentTypeError(msg % str(kv))
else:
params.update([kv_list])
else:
msg = _(
"Expected comma separated 'key=value' pairs, but got: %s"
)
raise argparse.ArgumentTypeError(msg % str(kv))
# Check key validation
valid_keys = self.required_keys | self.optional_keys
if valid_keys:
invalid_keys = [k for k in params if k not in valid_keys]
if invalid_keys:
msg = _(
"Invalid keys %(invalid_keys)s specified.\n"
"Valid keys are: %(valid_keys)s"
)
raise argparse.ArgumentTypeError(msg % {
'invalid_keys': ', '.join(invalid_keys),
'valid_keys': ', '.join(valid_keys),
})
if self.required_keys:
missing_keys = [k for k in self.required_keys if k not in params]
if missing_keys:
msg = _(
"Missing required keys %(missing_keys)s.\n"
"Required keys are: %(required_keys)s"
)
raise argparse.ArgumentTypeError(msg % {
'missing_keys': ', '.join(missing_keys),
'required_keys': ', '.join(self.required_keys),
})
# Update the dest dict
getattr(namespace, self.dest, []).append(params)
class MultiKeyValueCommaAction(MultiKeyValueAction):
"""Custom action to parse arguments from a set of key=value pair
Ensures that ``dest`` is a dict.
Parses dict by separating comma separated string into individual values
Ex. key1=val1,val2,key2=val3 => {"key1": "val1,val2", "key2": "val3"}
"""
def __call__(self, parser, namespace, values, option_string=None):
"""Overwrite the __call__ function of MultiKeyValueAction
This is done to handle scenarios where we may have comma-separated
data as a single value.
"""
# Make sure we have an empty list rather than None
if getattr(namespace, self.dest, None) is None:
setattr(namespace, self.dest, [])
params = {}
key = ''
for kv in values.split(','):
# Add value if an assignment else raise ArgumentTypeError
if '=' in kv:
kv_list = kv.split('=', 1)
# NOTE(qtang): Prevent null key setting in property
if '' == kv_list[0]:
msg = _("A key must be specified before '=': %s")
raise argparse.ArgumentTypeError(msg % str(kv))
else:
params.update([kv_list])
key = kv_list[0]
else:
# If the ',' split does not have key=value pair, then it
# means the current value is a part of the previous
# key=value pair, so append it.
try:
params[key] = "%s,%s" % (params[key], kv)
except KeyError:
msg = _("A key=value pair is required: %s")
raise argparse.ArgumentTypeError(msg % str(kv))
# Check key validation
valid_keys = self.required_keys | self.optional_keys
if valid_keys:
invalid_keys = [k for k in params if k not in valid_keys]
if invalid_keys:
msg = _(
"Invalid keys %(invalid_keys)s specified.\n"
"Valid keys are: %(valid_keys)s"
)
raise argparse.ArgumentTypeError(msg % {
'invalid_keys': ', '.join(invalid_keys),
'valid_keys': ', '.join(valid_keys),
})
if self.required_keys:
missing_keys = [k for k in self.required_keys if k not in params]
if missing_keys:
msg = _(
"Missing required keys %(missing_keys)s.\n"
"Required keys are: %(required_keys)s"
)
raise argparse.ArgumentTypeError(msg % {
'missing_keys': ', '.join(missing_keys),
'required_keys': ', '.join(self.required_keys),
})
# Update the dest dict
getattr(namespace, self.dest, []).append(params)
class RangeAction(argparse.Action):
"""A custom action to parse a single value or a range of values
Parses single integer values or a range of integer values delimited
by a colon and returns a tuple of integers:
'4' sets ``dest`` to (4, 4)
'6:9' sets ``dest`` to (6, 9)
"""
def __call__(self, parser, namespace, values, option_string=None):
range = values.split(':')
if len(range) == 0:
# Nothing passed, return a zero default
setattr(namespace, self.dest, (0, 0))
elif len(range) == 1:
# Only a single value is present
setattr(namespace, self.dest, (int(range[0]), int(range[0])))
elif len(range) == 2:
# Range of two values
if int(range[0]) <= int(range[1]):
setattr(namespace, self.dest, (int(range[0]), int(range[1])))
else:
msg = _("Invalid range, %(min)s is not less than %(max)s")
raise argparse.ArgumentError(self, msg % {
'min': range[0],
'max': range[1],
})
else:
# Too many values
msg = _("Invalid range, too many values")
raise argparse.ArgumentError(self, msg)
class NonNegativeAction(argparse.Action):
"""A custom action to check whether the value is non-negative or not
Ensures the value is >= 0.
"""
def __call__(self, parser, namespace, values, option_string=None):
if int(values) >= 0:
setattr(namespace, self.dest, values)
else:
msg = _("%s expected a non-negative integer")
raise argparse.ArgumentTypeError(msg % str(option_string))
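
A sketch of ``MultiKeyValueAction`` and ``RangeAction`` wired into argparse; the option names and key names are made up for illustration:

    import argparse

    from osc_lib.cli import parseractions

    parser = argparse.ArgumentParser()
    parser.add_argument(
        '--rule',
        action=parseractions.MultiKeyValueAction,
        required_keys=['protocol'],
        optional_keys=['port'],
        help='Rule as protocol=<proto>[,port=<port>] (repeat to add rules)',
    )
    parser.add_argument(
        '--id-range',
        action=parseractions.RangeAction,
        help='A single value or a <start>:<end> range',
    )
    args = parser.parse_args(
        ['--rule', 'protocol=tcp,port=80', '--id-range', '6:9'])
    # args.rule == [{'protocol': 'tcp', 'port': '80'}]
    # args.id_range == (6, 9)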

View File

@ -1,285 +0,0 @@
# Copyright 2012-2013 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Manage access to the clients, including authenticating when needed."""
import copy
import logging
import sys
from oslo_utils import strutils
import six
from osc_lib.api import auth
from osc_lib import exceptions
from osc_lib import session as osc_session
from osc_lib import version
LOG = logging.getLogger(__name__)
PLUGIN_MODULES = []
class ClientCache(object):
"""Descriptor class for caching created client handles."""
def __init__(self, factory):
self.factory = factory
self._handle = None
def __get__(self, instance, owner):
# Tell the ClientManager to login to keystone
if self._handle is None:
try:
self._handle = self.factory(instance)
except AttributeError as err:
# Make sure the failure propagates. Otherwise, the plugin just
# quietly isn't there.
new_err = exceptions.PluginAttributeError(err)
six.reraise(new_err.__class__, new_err, sys.exc_info()[2])
return self._handle
class ClientManager(object):
"""Manages access to API clients, including authentication."""
# NOTE(dtroyer): Keep around the auth required state of the _current_
# command since ClientManager has no visibility to the
# command itself; assume auth is not required.
_auth_required = False
def __init__(
self,
cli_options=None,
api_version=None,
pw_func=None,
app_name=None,
app_version=None,
):
"""Set up a ClientManager
:param cli_options:
Options collected from the command-line, environment, or wherever
:param api_version:
Dict of API versions: key is API name, value is the version
:param pw_func:
Callback function for asking the user for a password. The function
takes an optional string for the prompt ('Password: ' on None) and
returns a string containing the password
:param app_name:
The name of the application for passing through to the useragent
:param app_version:
The version of the application for passing through to the useragent
"""
self._cli_options = cli_options
self._api_version = api_version
self._pw_callback = pw_func
self._app_name = app_name
self._app_version = app_version
self.region_name = self._cli_options.region_name
self.interface = self._cli_options.interface
self.timing = self._cli_options.timing
self._auth_ref = None
self.session = None
# self.verify is the Requests-compatible form
# self.cacert is the form used by the legacy client libs
# self.insecure is not needed, use 'not self.verify'
# NOTE(dtroyer): Per bug https://bugs.launchpad.net/bugs/1447784
# --insecure overrides any --os-cacert setting
# Set a hard default
self.verify = True
if self._cli_options.insecure:
# Handle --insecure
self.verify = False
self.cacert = None
else:
if (self._cli_options.cacert is not None
and self._cli_options.cacert != ''):
# --cacert implies --verify here
self.verify = self._cli_options.cacert
self.cacert = self._cli_options.cacert
else:
# Fall through also gets --verify
if self._cli_options.verify is not None:
self.verify = self._cli_options.verify
self.cacert = None
# Set up client certificate and key
# NOTE(cbrandily): This converts client certificate/key to requests
# cert argument: None (no client certificate), a path
# to client certificate or a tuple with client
# certificate/key paths.
self.cert = self._cli_options.cert
if self.cert and self._cli_options.key:
self.cert = self.cert, self._cli_options.key
# TODO(mordred) The logic above to set all of these is duplicated in
# os-client-config but needs more effort to tease apart and ensure that
# values are being passed in. For now, let osc_lib do it and just set
# the values in occ.config
self._cli_options.config['verify'] = self.verify
self._cli_options.config['cacert'] = self.cacert
# Attack of the killer passthrough values
self._cli_options.config['cert'] = self._cli_options.cert
self._cli_options.config['key'] = self._cli_options.key
# TODO(mordred) We also don't have any support for setting or passing
# in api_timeout, which is set in occ defaults but we skip occ defaults
# so set it here by hand and later we should potentially expose this
# directly to osc
self._cli_options.config['api_timeout'] = None
# Get logging from root logger
root_logger = logging.getLogger('')
LOG.setLevel(root_logger.getEffectiveLevel())
# NOTE(gyee): use this flag to indicate whether auth setup has already
# been completed. If so, do not perform auth setup again. The reason
# we need this flag is that we want to be able to perform auth setup
# outside of auth_ref as auth_ref itself is a property. We cannot
# retrofit auth_ref to optionally skip scope check. Some operations
# do not require a scoped token. In those cases, we call setup_auth
# prior to dereferencing auth_ref.
self._auth_setup_completed = False
def setup_auth(self):
"""Set up authentication
This is deferred until authentication is actually attempted because
it gets in the way of things that do not require auth.
"""
if self._auth_setup_completed:
return
# Stash the selected auth type
self.auth_plugin_name = self._cli_options.config['auth_type']
# Basic option checking to avoid unhelpful error messages
auth.check_valid_authentication_options(
self._cli_options,
self.auth_plugin_name,
)
# Horrible hack alert...must handle prompt for null password if
# password auth is requested.
if (self.auth_plugin_name.endswith('password') and
not self._cli_options.auth.get('password')):
self._cli_options.auth['password'] = self._pw_callback()
LOG.info('Using auth plugin: %s', self.auth_plugin_name)
LOG.debug('Using parameters %s',
strutils.mask_password(self._cli_options.auth))
self.auth = self._cli_options.get_auth()
if self._cli_options.service_provider:
self.auth = auth.get_keystone2keystone_auth(
self.auth,
self._cli_options.service_provider,
self._cli_options.remote_project_id,
self._cli_options.remote_project_name,
self._cli_options.remote_project_domain_id,
self._cli_options.remote_project_domain_name
)
self.session = osc_session.TimingSession(
auth=self.auth,
verify=self.verify,
cert=self.cert,
app_name=self._app_name,
app_version=self._app_version,
additional_user_agent=[('osc-lib', version.version_string)],
)
self._auth_setup_completed = True
def validate_scope(self):
if self._auth_ref.project_id is not None:
# We already have a project scope.
return
if self._auth_ref.domain_id is not None:
# We already have a domain scope.
return
# We do not have a scoped token (and the user's default project scope
# was not implied), so the client needs to be explicitly configured
# with a scope.
auth.check_valid_authorization_options(
self._cli_options,
self.auth_plugin_name,
)
@property
def auth_ref(self):
"""Dereference will trigger an auth if it hasn't already"""
if not self._auth_required:
# Forcibly skip auth if we know we do not need it
return None
if not self._auth_ref:
self.setup_auth()
LOG.debug("Get auth_ref")
self._auth_ref = self.auth.get_auth_ref(self.session)
return self._auth_ref
def is_service_available(self, service_type):
"""Check if a service type is in the current Service Catalog"""
# Trigger authentication necessary to discover endpoint
if self.auth_ref:
service_catalog = self.auth_ref.service_catalog
else:
service_catalog = None
# Default to None (unknown) when there is no service catalog to check.
service_available = None
if service_catalog:
if service_type in service_catalog.get_endpoints():
service_available = True
LOG.debug("%s endpoint in service catalog", service_type)
else:
service_available = False
LOG.debug("No %s endpoint in service catalog", service_type)
else:
LOG.debug("No service catalog")
return service_available
def get_endpoint_for_service_type(self, service_type, region_name=None,
interface='public'):
"""Return the endpoint URL for the service type."""
if not interface:
interface = 'public'
# See if we are using password flow auth, i.e. we have a
# service catalog to select endpoints from
if self.auth_ref:
endpoint = self.auth_ref.service_catalog.url_for(
service_type=service_type,
region_name=region_name,
interface=interface,
)
else:
# Get the passed endpoint directly from the auth plugin
endpoint = self.auth.get_endpoint(
self.session,
interface=interface,
)
return endpoint
def get_configuration(self):
return copy.deepcopy(self._cli_options.config)
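
The --insecure/--os-cacert/--verify precedence implemented in __init__ above can be restated as a small standalone sketch (not part of the library, shown only to make the fall-through order explicit):

    def resolve_verify(insecure, cacert, verify):
        """Return (requests-style verify, legacy cacert) from the CLI options"""
        if insecure:
            # --insecure overrides any --os-cacert setting (bug 1447784)
            return False, None
        if cacert:
            # A CA bundle path implies verification against that bundle
            return cacert, cacert
        # Fall through to the --verify option, defaulting to verification on
        return (verify if verify is not None else True), None

    # resolve_verify(True, '/etc/ssl/ca.pem', None)  -> (False, None)
    # resolve_verify(False, '/etc/ssl/ca.pem', None) -> ('/etc/ssl/ca.pem', '/etc/ssl/ca.pem')
    # resolve_verify(False, None, None)              -> (True, None)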

View File

@ -1,63 +0,0 @@
# Copyright 2016 NEC Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import abc
import logging
from cliff import command
from cliff import lister
from cliff import show
import six
from osc_lib import exceptions
from osc_lib.i18n import _
class CommandMeta(abc.ABCMeta):
def __new__(mcs, name, bases, cls_dict):
if 'log' not in cls_dict:
cls_dict['log'] = logging.getLogger(
cls_dict['__module__'] + '.' + name)
return super(CommandMeta, mcs).__new__(mcs, name, bases, cls_dict)
@six.add_metaclass(CommandMeta)
class Command(command.Command):
def run(self, parsed_args):
self.log.debug('run(%s)', parsed_args)
return super(Command, self).run(parsed_args)
def validate_os_beta_command_enabled(self):
if not self.app.options.os_beta_command:
msg = _('Caution: This is a beta command and subject to '
'change. Use global option --os-beta-command '
'to enable this command.')
raise exceptions.CommandError(msg)
def deprecated_option_warning(self, old_option, new_option):
"""Emit a warning for use of a deprecated option"""
self.log.warning(
_("The %(old)s option is deprecated, please use %(new)s instead.")
% {'old': old_option, 'new': new_option}
)
class Lister(Command, lister.Lister):
pass
class ShowOne(Command, show.ShowOne):
pass

View File

@ -1,59 +0,0 @@
# Copyright 2012-2013 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Modify cliff.CommandManager"""
import pkg_resources
import cliff.commandmanager
class CommandManager(cliff.commandmanager.CommandManager):
"""Add additional functionality to cliff.CommandManager
Load additional command groups after initialization
Add _command_group() methods
"""
def __init__(self, namespace, convert_underscores=True):
self.group_list = []
super(CommandManager, self).__init__(namespace, convert_underscores)
def load_commands(self, namespace):
self.group_list.append(namespace)
return super(CommandManager, self).load_commands(namespace)
def add_command_group(self, group=None):
"""Adds another group of command entrypoints"""
if group:
self.load_commands(group)
def get_command_groups(self):
"""Returns a list of the loaded command groups"""
return self.group_list
def get_command_names(self, group=None):
"""Returns a list of commands loaded for the specified group"""
group_list = []
if group is not None:
for ep in pkg_resources.iter_entry_points(group):
cmd_name = (
ep.name.replace('_', ' ')
if self.convert_underscores
else ep.name
)
group_list.append(cmd_name)
return group_list
return list(self.commands.keys())
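
A short usage sketch; the 'openstack.widget.v1' entry point group is made up for illustration:

    from osc_lib.command import commandmanager

    mgr = commandmanager.CommandManager('openstack.cli')
    mgr.add_command_group('openstack.widget.v1')
    assert 'openstack.widget.v1' in mgr.get_command_groups()
    # get_command_names('openstack.widget.v1') lists the commands registered
    # under that entry point group, with underscores converted to spaces.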

View File

@ -1,41 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Timing Implementation"""
from osc_lib.command import command
class Timing(command.Lister):
"""Show timing data"""
def take_action(self, parsed_args):
column_headers = (
'URL',
'Seconds',
)
results = []
total = 0.0
for url, td in self.app.timing_data:
# NOTE(dtroyer): Take the long way here because total_seconds()
# was added in py27.
sec = (td.microseconds + (td.seconds + td.days *
86400) * 1e6) / 1e6
total += sec
results.append((url, sec))
results.append(('Total', total))
return (
column_headers,
results,
)
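
For reference, the manual arithmetic above matches what timedelta.total_seconds() returns on Python >= 2.7:

    from datetime import timedelta

    td = timedelta(days=1, seconds=2, microseconds=500000)
    sec = (td.microseconds + (td.seconds + td.days * 86400) * 1e6) / 1e6
    assert sec == td.total_seconds() == 86402.5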

View File

@ -1,122 +0,0 @@
# Copyright 2012-2013 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Exception definitions."""
class CommandError(Exception):
pass
class AuthorizationFailure(Exception):
pass
class PluginAttributeError(Exception):
"""A plugin threw an AttributeError while being lazily loaded."""
# This *must not* inherit from AttributeError;
# that would defeat the whole purpose.
pass
class NoTokenLookupException(Exception):
"""This does not support looking up endpoints from an existing token."""
pass
class EndpointNotFound(Exception):
"""Could not find Service or Region in Service Catalog."""
pass
class UnsupportedVersion(Exception):
"""The user is trying to use an unsupported version of the API"""
pass
class InvalidValue(Exception):
"""An argument value is not valid: wrong type, out of range, etc"""
message = "Supplied value is not valid"
class ClientException(Exception):
"""The base exception class for all exceptions this library raises."""
def __init__(self, code, message=None, details=None):
self.code = code
self.message = message or self.__class__.message
self.details = details
def __str__(self):
return "%s (HTTP %s)" % (self.message, self.code)
class BadRequest(ClientException):
"""HTTP 400 - Bad request: you sent some malformed data."""
http_status = 400
message = "Bad request"
class Unauthorized(ClientException):
"""HTTP 401 - Unauthorized: bad credentials."""
http_status = 401
message = "Unauthorized"
class Forbidden(ClientException):
"""HTTP 403 - Forbidden: not authorized to access to this resource."""
http_status = 403
message = "Forbidden"
class NotFound(ClientException):
"""HTTP 404 - Not found"""
http_status = 404
message = "Not found"
class Conflict(ClientException):
"""HTTP 409 - Conflict"""
http_status = 409
message = "Conflict"
class OverLimit(ClientException):
"""HTTP 413 - Over limit: reached the API limits for this time period."""
http_status = 413
message = "Over limit"
# NotImplemented is a Python built-in constant, so use a different class name.
class HTTPNotImplemented(ClientException):
"""HTTP 501 - Not Implemented: server does not support this operation."""
http_status = 501
message = "Not Implemented"
# In Python 2.4 Exception is old-style and thus doesn't have a __subclasses__()
# so we cannot do this:
# _code_map = dict((c.http_status, c)
# for c in ClientException.__subclasses__())
#
# Instead, we have to hardcode it:
_code_map = dict((c.http_status, c) for c in [
BadRequest,
Unauthorized,
Forbidden,
NotFound,
OverLimit,
HTTPNotImplemented
])
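
A hypothetical helper (not part of this module) showing how ``_code_map`` is typically used to turn an HTTP status code into the matching exception:

    from osc_lib import exceptions

    def exception_from_status(status_code, message=None, details=None):
        """Map an HTTP status code to the matching ClientException subclass"""
        cls = exceptions._code_map.get(status_code, exceptions.ClientException)
        return cls(status_code, message, details)

    # str(exception_from_status(404, 'no such server')) -> 'no such server (HTTP 404)'
    # Unknown status codes fall back to the ClientException base class.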

View File

@ -1,21 +0,0 @@
# Copyright 2012-2013 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
import oslo_i18n
_translators = oslo_i18n.TranslatorFactory(domain='osc_lib')
# The primary translation function using the well-known name "_"
_ = _translators.primary
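
Throughout the library this is consumed as a regular translation marker, for example:

    from osc_lib.i18n import _

    msg = _("Invalid range, too many values")
    # oslo_i18n returns the translated string when translations are installed,
    # otherwise the original English text.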

View File

@ -1,196 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Application logging"""
import logging
import sys
import warnings
def get_loggers():
loggers = {}
for logkey in logging.Logger.manager.loggerDict.keys():
loggers[logkey] = logging.getLevelName(logging.getLogger(logkey).level)
return loggers
def log_level_from_options(options):
# if --debug, --quiet or --verbose is not specified,
# the default logging level is warning
log_level = logging.WARNING
if options.verbose_level == 0:
# --quiet
log_level = logging.ERROR
elif options.verbose_level == 2:
# One --verbose
log_level = logging.INFO
elif options.verbose_level >= 3:
# Two or more --verbose
log_level = logging.DEBUG
return log_level
def log_level_from_string(level_string):
log_level = {
'critical': logging.CRITICAL,
'error': logging.ERROR,
'warning': logging.WARNING,
'info': logging.INFO,
'debug': logging.DEBUG,
}.get(level_string, logging.WARNING)
return log_level
def log_level_from_config(config):
# Check the command line option
verbose_level = config.get('verbose_level')
if config.get('debug', False):
verbose_level = 3
if verbose_level == 0:
verbose_level = 'error'
elif verbose_level == 1:
# If a command line option has not been specified, check the
# configuration file
verbose_level = config.get('log_level', 'warning')
elif verbose_level == 2:
verbose_level = 'info'
else:
verbose_level = 'debug'
return log_level_from_string(verbose_level)
def set_warning_filter(log_level):
if log_level == logging.ERROR:
warnings.simplefilter("ignore")
elif log_level == logging.WARNING:
warnings.simplefilter("ignore")
elif log_level == logging.INFO:
warnings.simplefilter("once")
class _FileFormatter(logging.Formatter):
"""Customize the logging format for logging handler"""
_LOG_MESSAGE_BEGIN = (
'%(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s ')
_LOG_MESSAGE_CONTEXT = '[%(cloud)s %(username)s %(project)s] '
_LOG_MESSAGE_END = '%(message)s'
_LOG_DATE_FORMAT = '%Y-%m-%d %H:%M:%S'
def __init__(self, options=None, config=None, **kwargs):
context = {}
if options:
context = {
'cloud': getattr(options, 'cloud', ''),
'project': getattr(options, 'os_project_name', ''),
'username': getattr(options, 'username', ''),
}
elif config:
context = {
'cloud': config.config.get('cloud', ''),
'project': config.auth.get('project_name', ''),
'username': config.auth.get('username', ''),
}
if context:
self.fmt = (self._LOG_MESSAGE_BEGIN +
(self._LOG_MESSAGE_CONTEXT % context) +
self._LOG_MESSAGE_END)
else:
self.fmt = self._LOG_MESSAGE_BEGIN + self._LOG_MESSAGE_END
logging.Formatter.__init__(self, self.fmt, self._LOG_DATE_FORMAT)
class LogConfigurator(object):
_CONSOLE_MESSAGE_FORMAT = '%(message)s'
def __init__(self, options):
self.root_logger = logging.getLogger('')
self.root_logger.setLevel(logging.DEBUG)
# Force verbose_level 3 on --debug
self.dump_trace = False
if options.debug:
options.verbose_level = 3
self.dump_trace = True
# Always send higher-level messages to the console via stderr
self.console_logger = logging.StreamHandler(sys.stderr)
log_level = log_level_from_options(options)
self.console_logger.setLevel(log_level)
formatter = logging.Formatter(self._CONSOLE_MESSAGE_FORMAT)
self.console_logger.setFormatter(formatter)
self.root_logger.addHandler(self.console_logger)
# Set the warning filter now
set_warning_filter(log_level)
# Set up logging to a file
self.file_logger = None
log_file = options.log_file
if log_file:
self.file_logger = logging.FileHandler(filename=log_file)
self.file_logger.setFormatter(_FileFormatter(options=options))
self.file_logger.setLevel(log_level)
self.root_logger.addHandler(self.file_logger)
# Requests logs some stuff at INFO that we don't want
# unless we have DEBUG
requests_log = logging.getLogger("requests")
# Other modules we don't want DEBUG output for
cliff_log = logging.getLogger('cliff')
stevedore_log = logging.getLogger('stevedore')
iso8601_log = logging.getLogger("iso8601")
if options.debug:
# --debug forces traceback
requests_log.setLevel(logging.DEBUG)
else:
requests_log.setLevel(logging.ERROR)
cliff_log.setLevel(logging.ERROR)
stevedore_log.setLevel(logging.ERROR)
iso8601_log.setLevel(logging.ERROR)
def configure(self, cloud_config):
log_level = log_level_from_config(cloud_config.config)
set_warning_filter(log_level)
self.dump_trace = cloud_config.config.get('debug', self.dump_trace)
self.console_logger.setLevel(log_level)
log_file = cloud_config.config.get('log_file')
if log_file:
if not self.file_logger:
self.file_logger = logging.FileHandler(filename=log_file)
self.file_logger.setFormatter(_FileFormatter(config=cloud_config))
self.file_logger.setLevel(log_level)
self.root_logger.addHandler(self.file_logger)
logconfig = cloud_config.config.get('logging')
if logconfig:
highest_level = logging.NOTSET
for k in logconfig.keys():
level = log_level_from_string(logconfig[k])
logging.getLogger(k).setLevel(level)
if (highest_level < level):
highest_level = level
self.console_logger.setLevel(highest_level)
if self.file_logger:
self.file_logger.setLevel(highest_level)
# loggers that are not set will use the handler level, so we
# need to set the global level for all the loggers
for logkey in logging.Logger.manager.loggerDict.keys():
logger = logging.getLogger(logkey)
if logger.level == logging.NOTSET:
logger.setLevel(log_level)
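
A small sketch of the verbosity mapping helpers; the Namespace stands in for parsed command-line options:

    import argparse
    import logging

    from osc_lib import logs

    opts = argparse.Namespace(verbose_level=2)       # one -v on the command line
    assert logs.log_level_from_options(opts) == logging.INFO

    assert logs.log_level_from_string('debug') == logging.DEBUG
    assert logs.log_level_from_string('bogus') == logging.WARNING   # unknown -> warning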

View File

@ -1,50 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Subclass of keystoneauth1.session"""
from keystoneauth1 import session
class TimingSession(session.Session):
"""A Session that supports collection of timing data per Method URL"""
def __init__(
self,
**kwargs
):
"""Pass through all arguments except timing"""
super(TimingSession, self).__init__(**kwargs)
# times is a list of tuples: ("method url", elapsed_time)
self.times = []
def get_timings(self):
return self.times
def reset_timings(self):
self.times = []
def request(self, url, method, **kwargs):
"""Wrap the usual request() method with the timers"""
resp = super(TimingSession, self).request(url, method, **kwargs)
for h in resp.history:
self.times.append((
"%s %s" % (h.request.method, h.request.url),
h.elapsed,
))
self.times.append((
"%s %s" % (resp.request.method, resp.request.url),
resp.elapsed,
))
return resp
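
A rough usage sketch; with no auth plugin the request goes out unauthenticated, and the URL here is only a placeholder:

    from osc_lib import session as osc_session

    sess = osc_session.TimingSession()
    resp = sess.request('https://example.com/', 'GET')
    for method_url, elapsed in sess.get_timings():
        # elapsed is a datetime.timedelta for that request (and any redirects)
        print(method_url, elapsed.total_seconds())
    sess.reset_timings()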

View File

@ -1,479 +0,0 @@
# Copyright 2012-2013 OpenStack Foundation
# Copyright 2015 Dean Troyer
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Command-line interface to the OpenStack APIs"""
import getpass
import locale
import logging
import sys
import traceback
from cliff import app
from cliff import command
from cliff import complete
from cliff import help
from oslo_utils import importutils
from oslo_utils import strutils
import six
from osc_lib.cli import client_config as cloud_config
from osc_lib import clientmanager
from osc_lib.command import commandmanager
from osc_lib.command import timing
from osc_lib import exceptions as exc
from osc_lib.i18n import _
from osc_lib import logs
from osc_lib import utils
osprofiler_profiler = importutils.try_import("osprofiler.profiler")
DEFAULT_DOMAIN = 'default'
def prompt_for_password(prompt=None):
"""Prompt user for a password
Prompt for a password if stdin is a tty.
"""
if not prompt:
prompt = 'Password: '
pw = None
# If stdin is a tty, try prompting for the password
if hasattr(sys.stdin, 'isatty') and sys.stdin.isatty():
# Check for Ctl-D
try:
pw = getpass.getpass(prompt)
except EOFError:
pass
# No password because we didn't have a tty or nothing was entered
if not pw:
raise exc.CommandError(_("No password entered, or found via"
" --os-password or OS_PASSWORD"),)
return pw
class OpenStackShell(app.App):
CONSOLE_MESSAGE_FORMAT = '%(levelname)s: %(name)s %(message)s'
log = logging.getLogger(__name__)
timing_data = []
def __init__(
self,
description=None,
version=None,
command_manager=None,
stdin=None,
stdout=None,
stderr=None,
interactive_app_factory=None,
deferred_help=False,
):
# Patch command.Command to add a default auth_required = True
command.Command.auth_required = True
# Some commands do not need authentication
help.HelpCommand.auth_required = False
complete.CompleteCommand.auth_required = False
# Slight change to the meaning of --debug
self.DEFAULT_DEBUG_VALUE = None
self.DEFAULT_DEBUG_HELP = 'Set debug logging and traceback on errors.'
# Do default for positionals
if not command_manager:
cm = commandmanager.CommandManager('openstack.cli')
else:
cm = command_manager
super(OpenStackShell, self).__init__(
description=__doc__.strip(),
version=version,
command_manager=cm,
deferred_help=True,
)
# Until we have command line arguments parsed, dump any stack traces
self.dump_stack_trace = True
# Set in subclasses
self.api_version = None
self.client_manager = None
self.command_options = None
self.do_profile = False
def configure_logging(self):
"""Configure logging for the app."""
self.log_configurator = logs.LogConfigurator(self.options)
self.dump_stack_trace = self.log_configurator.dump_trace
def run(self, argv):
ret_val = 1
self.command_options = argv
try:
ret_val = super(OpenStackShell, self).run(argv)
return ret_val
except Exception as e:
if not logging.getLogger('').handlers:
logging.basicConfig()
if self.dump_stack_trace:
self.log.error(traceback.format_exc())
else:
self.log.error('Exception raised: ' + str(e))
return ret_val
finally:
self.log.info("END return value: %s", ret_val)
def init_profile(self):
self.do_profile = osprofiler_profiler and self.options.profile
if self.do_profile:
osprofiler_profiler.init(self.options.profile)
def close_profile(self):
if self.do_profile:
trace_id = osprofiler_profiler.get().get_base_id()
# NOTE(dbelova): let's use warning log level to see these messages
# printed. In fact we can define custom log level here with value
# bigger than most big default one (CRITICAL) or something like
# that (PROFILE = 60 for instance), but not sure we need it here.
self.log.warning("Trace ID: %s" % trace_id)
self.log.warning("Display trace with command:\n"
"osprofiler trace show --html %s " % trace_id)
def run_subcommand(self, argv):
self.init_profile()
try:
ret_value = super(OpenStackShell, self).run_subcommand(argv)
finally:
self.close_profile()
return ret_value
def interact(self):
self.init_profile()
try:
ret_value = super(OpenStackShell, self).interact()
finally:
self.close_profile()
return ret_value
def build_option_parser(self, description, version):
parser = super(OpenStackShell, self).build_option_parser(
description,
version)
# service token auth argument
parser.add_argument(
'--os-cloud',
metavar='<cloud-config-name>',
dest='cloud',
default=utils.env('OS_CLOUD'),
help=_('Cloud name in clouds.yaml (Env: OS_CLOUD)'),
)
# Global arguments
parser.add_argument(
'--os-region-name',
metavar='<auth-region-name>',
dest='region_name',
default=utils.env('OS_REGION_NAME'),
help=_('Authentication region name (Env: OS_REGION_NAME)'),
)
parser.add_argument(
'--os-cacert',
metavar='<ca-bundle-file>',
dest='cacert',
default=utils.env('OS_CACERT', default=None),
help=_('CA certificate bundle file (Env: OS_CACERT)'),
)
parser.add_argument(
'--os-cert',
metavar='<certificate-file>',
dest='cert',
default=utils.env('OS_CERT'),
help=_('Client certificate bundle file (Env: OS_CERT)'),
)
parser.add_argument(
'--os-key',
metavar='<key-file>',
dest='key',
default=utils.env('OS_KEY'),
help=_('Client certificate key file (Env: OS_KEY)'),
)
verify_group = parser.add_mutually_exclusive_group()
verify_group.add_argument(
'--verify',
action='store_true',
default=None,
help=_('Verify server certificate (default)'),
)
verify_group.add_argument(
'--insecure',
action='store_true',
default=None,
help=_('Disable server certificate verification'),
)
parser.add_argument(
'--os-default-domain',
metavar='<auth-domain>',
dest='default_domain',
default=utils.env(
'OS_DEFAULT_DOMAIN',
default=DEFAULT_DOMAIN),
help=_('Default domain ID, default=%s. '
'(Env: OS_DEFAULT_DOMAIN)') % DEFAULT_DOMAIN,
)
parser.add_argument(
'--os-interface',
metavar='<interface>',
dest='interface',
choices=['admin', 'public', 'internal'],
default=utils.env('OS_INTERFACE'),
help=_('Select an interface type.'
' Valid interface types: [admin, public, internal].'
' (Env: OS_INTERFACE)'),
)
parser.add_argument(
'--timing',
default=False,
action='store_true',
help=_("Print API call timing info"),
)
parser.add_argument(
'--os-beta-command',
action='store_true',
help=_("Enable beta commands which are subject to change"),
)
# osprofiler HMAC key argument
if osprofiler_profiler:
parser.add_argument(
'--os-profile',
metavar='hmac-key',
dest='profile',
default=utils.env('OS_PROFILE'),
help=_('HMAC key for encrypting profiling context data'),
)
return parser
# return clientmanager.build_plugin_option_parser(parser)
"""
Break up initialize_app() so that overriding it in a subclass does not
require duplicating a lot of the method
* super()
* _final_defaults()
* OpenStackConfig
* get_one_cloud
* _load_plugins()
* _load_commands()
* ClientManager
"""
def _final_defaults(self):
# Set the default plugin to None
# NOTE(dtroyer): This is here to set up for setting it to a default
# in the calling CLI
self._auth_type = None
# Converge project/tenant options
project_id = getattr(self.options, 'project_id', None)
project_name = getattr(self.options, 'project_name', None)
tenant_id = getattr(self.options, 'tenant_id', None)
tenant_name = getattr(self.options, 'tenant_name', None)
# handle some v2/v3 authentication inconsistencies by just acting like
# the project and tenant information are both present. This can
# go away if we stop registering all the argparse options together.
if project_id and not tenant_id:
self.options.tenant_id = project_id
if project_name and not tenant_name:
self.options.tenant_name = project_name
if tenant_id and not project_id:
self.options.project_id = tenant_id
if tenant_name and not project_name:
self.options.project_name = tenant_name
# Save default domain
self.default_domain = self.options.default_domain
def _load_plugins(self):
"""Load plugins via stevedore
osc-lib has no opinion on what plugins should be loaded
"""
pass
def _load_commands(self):
"""Load commands via cliff/stevedore
osc-lib has no opinion on what commands should be loaded
"""
pass
def initialize_app(self, argv):
"""Global app init bits:
* set up API versions
* validate authentication info
* authenticate against Identity if requested
"""
# Parent __init__ parses argv into self.options
super(OpenStackShell, self).initialize_app(argv)
self.log.info("START with options: %s",
strutils.mask_password(self.command_options))
self.log.debug("options: %s",
strutils.mask_password(self.options))
# Callout for stuff between superclass init and o-c-c
self._final_defaults()
# Do configuration file handling
# Ignore the default value of interface. Only if it is set later
# will it be used.
try:
self.cloud_config = cloud_config.OSC_Config(
override_defaults={
'interface': None,
'auth_type': self._auth_type,
},
)
except (IOError, OSError) as e:
self.log.critical("Could not read clouds.yaml configuration file")
self.print_help_if_requested()
raise e
# TODO(thowe): Change cliff so the default value for debug
# can be set to None.
if not self.options.debug:
self.options.debug = None
# NOTE(dtroyer): Need to do this with validate=False to defer the
# auth plugin handling to ClientManager.setup_auth()
self.cloud = self.cloud_config.get_one_cloud(
cloud=self.options.cloud,
argparse=self.options,
validate=False,
)
self.log_configurator.configure(self.cloud)
self.dump_stack_trace = self.log_configurator.dump_trace
self.log.debug("defaults: %s", self.cloud_config.defaults)
self.log.debug("cloud cfg: %s",
strutils.mask_password(self.cloud.config))
# Callout for stuff between o-c-c and ClientManager
# self._initialize_app_2(self.options)
self._load_plugins()
self._load_commands()
# Handle deferred help and exit
self.print_help_if_requested()
self.client_manager = clientmanager.ClientManager(
cli_options=self.cloud,
api_version=self.api_version,
pw_func=prompt_for_password,
)
def prepare_to_run_command(self, cmd):
"""Set up auth and API versions"""
self.log.info(
'command: %s -> %s.%s (auth=%s)',
getattr(cmd, 'cmd_name', '<none>'),
cmd.__class__.__module__,
cmd.__class__.__name__,
cmd.auth_required,
)
# NOTE(dtroyer): If auth is not required for a command, skip
# get_one_cloud()'s validation to avoid loading plugins
validate = cmd.auth_required
# NOTE(dtroyer): Save the auth required state of the _current_ command
# in the ClientManager
self.client_manager._auth_required = cmd.auth_required
# Validate auth options
self.cloud = self.cloud_config.get_one_cloud(
cloud=self.options.cloud,
argparse=self.options,
validate=validate,
)
# Push the updated args into ClientManager
self.client_manager._cli_options = self.cloud
if cmd.auth_required:
self.client_manager.setup_auth()
if hasattr(cmd, 'required_scope') and cmd.required_scope:
# let the command decide whether we need a scoped token
self.client_manager.validate_scope()
# Trigger the Identity client to initialize
self.client_manager.auth_ref
return
def clean_up(self, cmd, result, err):
self.log.debug('clean_up %s: %s', cmd.__class__.__name__, err or '')
# Process collected timing data
if self.options.timing:
# Get session data
self.timing_data.extend(
self.client_manager.session.get_timings(),
)
# Use the Timing pseudo-command to generate the output
tcmd = timing.Timing(self, self.options)
tparser = tcmd.get_parser('Timing')
# If anything other than prettytable is specified, force csv
format = 'table'
# Check the formatter used in the actual command
if hasattr(cmd, 'formatter') \
and cmd.formatter != cmd._formatter_plugins['table'].obj:
format = 'csv'
sys.stdout.write('\n')
targs = tparser.parse_args(['-f', format])
tcmd.run(targs)
def main(argv=None):
if argv is None:
argv = sys.argv[1:]
if six.PY2:
# Emulate Py3, decode argv into Unicode based on locale so that
# commands always see arguments as text instead of binary data
encoding = locale.getpreferredencoding()
if encoding:
argv = map(lambda arg: arg.decode(encoding), argv)
return OpenStackShell().run(argv)
if __name__ == "__main__":
sys.exit(main(sys.argv[1:]))
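
A minimal sketch of a CLI built on this shell; the class name and entry point group are hypothetical:

    import sys

    from osc_lib import shell


    class MyShell(shell.OpenStackShell):

        def _load_commands(self):
            # osc-lib loads no commands itself; add our own entry point group
            self.command_manager.add_command_group('my.cli.v1')


    def main(argv=None):
        if argv is None:
            argv = sys.argv[1:]
        return MyShell().run(argv)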

View File

@ -1,56 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""API Test Fakes"""
from keystoneauth1 import session
from requests_mock.contrib import fixture
from osc_lib.tests import utils
RESP_ITEM_1 = {
'id': '1',
'name': 'alpha',
'status': 'UP',
'props': {'a': 1, 'b': 2},
}
RESP_ITEM_2 = {
'id': '2',
'name': 'beta',
'status': 'DOWN',
'props': {'a': 2, 'b': 2},
}
RESP_ITEM_3 = {
'id': '3',
'name': 'delta',
'status': 'UP',
'props': {'a': 3, 'b': 1},
}
LIST_RESP = [RESP_ITEM_1, RESP_ITEM_2]
LIST_BODY = {
'p1': 'xxx',
'p2': 'yyy',
}
class TestSession(utils.TestCase):
BASE_URL = 'https://api.example.com:1234/test'
def setUp(self):
super(TestSession, self).setUp()
self.sess = session.Session()
self.requests_mock = self.useFixture(fixture.Fixture())
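
An illustrative test built on this fixture (the '/items/1' path is made up); requests_mock intercepts the call so no network access happens:

    class TestExample(TestSession):

        def test_get_item(self):
            self.requests_mock.register_uri(
                'GET',
                self.BASE_URL + '/items/1',
                json=RESP_ITEM_1,
                status_code=200,
            )
            resp = self.sess.get(self.BASE_URL + '/items/1')
            self.assertEqual(RESP_ITEM_1, resp.json())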

View File

@ -1,521 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Base API Library Tests"""
from keystoneauth1 import exceptions as ksa_exceptions
from keystoneauth1 import session
from osc_lib.api import api
from osc_lib import exceptions
from osc_lib.tests.api import fakes as api_fakes
class TestBaseAPIDefault(api_fakes.TestSession):
def setUp(self):
super(TestBaseAPIDefault, self).setUp()
self.api = api.BaseAPI()
def test_baseapi_request_no_url(self):
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz',
json=api_fakes.RESP_ITEM_1,
status_code=200,
)
self.assertRaises(
ksa_exceptions.EndpointNotFound,
self.api._request,
'GET',
'',
)
self.assertIsNotNone(self.api.session)
self.assertNotEqual(self.sess, self.api.session)
def test_baseapi_request_url(self):
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz',
json=api_fakes.RESP_ITEM_1,
status_code=200,
)
ret = self.api._request('GET', self.BASE_URL + '/qaz')
self.assertEqual(api_fakes.RESP_ITEM_1, ret.json())
self.assertIsNotNone(self.api.session)
self.assertNotEqual(self.sess, self.api.session)
def test_baseapi_request_url_path(self):
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz',
json=api_fakes.RESP_ITEM_1,
status_code=200,
)
self.assertRaises(
ksa_exceptions.EndpointNotFound,
self.api._request,
'GET',
'/qaz',
)
self.assertIsNotNone(self.api.session)
self.assertNotEqual(self.sess, self.api.session)
def test_baseapi_request_session(self):
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz',
json=api_fakes.RESP_ITEM_1,
status_code=200,
)
ret = self.api._request(
'GET',
self.BASE_URL + '/qaz',
session=self.sess,
)
self.assertEqual(api_fakes.RESP_ITEM_1, ret.json())
self.assertIsNotNone(self.api.session)
self.assertNotEqual(self.sess, self.api.session)
class TestBaseAPIEndpointArg(api_fakes.TestSession):
def test_baseapi_endpoint_no_endpoint(self):
x_api = api.BaseAPI(
session=self.sess,
)
self.assertIsNotNone(x_api.session)
self.assertEqual(self.sess, x_api.session)
self.assertIsNone(x_api.endpoint)
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz',
json=api_fakes.RESP_ITEM_1,
status_code=200,
)
# Normal url
self.assertRaises(
ksa_exceptions.EndpointNotFound,
x_api._request,
'GET',
'/qaz',
)
# No leading '/' url
self.assertRaises(
ksa_exceptions.EndpointNotFound,
x_api._request,
'GET',
'qaz',
)
# Extra leading '/' url
self.assertRaises(
ksa_exceptions.connection.UnknownConnectionError,
x_api._request,
'GET',
'//qaz',
)
def test_baseapi_endpoint_no_extra(self):
x_api = api.BaseAPI(
session=self.sess,
endpoint=self.BASE_URL,
)
self.assertIsNotNone(x_api.session)
self.assertEqual(self.sess, x_api.session)
self.assertEqual(self.BASE_URL, x_api.endpoint)
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz',
json=api_fakes.RESP_ITEM_1,
status_code=200,
)
# Normal url
ret = x_api._request('GET', '/qaz')
self.assertEqual(api_fakes.RESP_ITEM_1, ret.json())
# No leading '/' url
ret = x_api._request('GET', 'qaz')
self.assertEqual(api_fakes.RESP_ITEM_1, ret.json())
# Extra leading '/' url
ret = x_api._request('GET', '//qaz')
self.assertEqual(api_fakes.RESP_ITEM_1, ret.json())
def test_baseapi_endpoint_extra(self):
x_api = api.BaseAPI(
session=self.sess,
endpoint=self.BASE_URL + '/',
)
self.assertIsNotNone(x_api.session)
self.assertEqual(self.sess, x_api.session)
self.assertEqual(self.BASE_URL, x_api.endpoint)
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz',
json=api_fakes.RESP_ITEM_1,
status_code=200,
)
# Normal url
ret = x_api._request('GET', '/qaz')
self.assertEqual(api_fakes.RESP_ITEM_1, ret.json())
# No leading '/' url
ret = x_api._request('GET', 'qaz')
self.assertEqual(api_fakes.RESP_ITEM_1, ret.json())
# Extra leading '/' url
ret = x_api._request('GET', '//qaz')
self.assertEqual(api_fakes.RESP_ITEM_1, ret.json())
class TestBaseAPIArgs(api_fakes.TestSession):
def setUp(self):
super(TestBaseAPIArgs, self).setUp()
self.api = api.BaseAPI(
session=self.sess,
endpoint=self.BASE_URL,
)
def test_baseapi_request_url_path(self):
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz',
json=api_fakes.RESP_ITEM_1,
status_code=200,
)
ret = self.api._request('GET', '/qaz')
self.assertEqual(api_fakes.RESP_ITEM_1, ret.json())
self.assertIsNotNone(self.api.session)
self.assertEqual(self.sess, self.api.session)
def test_baseapi_request_session(self):
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz',
json=api_fakes.RESP_ITEM_1,
status_code=200,
)
new_session = session.Session()
ret = self.api._request('GET', '/qaz', session=new_session)
self.assertEqual(api_fakes.RESP_ITEM_1, ret.json())
self.assertIsNotNone(self.api.session)
self.assertNotEqual(new_session, self.api.session)
class TestBaseAPICreate(api_fakes.TestSession):
def setUp(self):
super(TestBaseAPICreate, self).setUp()
self.api = api.BaseAPI(
session=self.sess,
endpoint=self.BASE_URL,
)
def test_baseapi_create_post(self):
self.requests_mock.register_uri(
'POST',
self.BASE_URL + '/qaz',
json=api_fakes.RESP_ITEM_1,
status_code=202,
)
ret = self.api.create('qaz')
self.assertEqual(api_fakes.RESP_ITEM_1, ret)
def test_baseapi_create_put(self):
self.requests_mock.register_uri(
'PUT',
self.BASE_URL + '/qaz',
json=api_fakes.RESP_ITEM_1,
status_code=202,
)
ret = self.api.create('qaz', method='PUT')
self.assertEqual(api_fakes.RESP_ITEM_1, ret)
def test_baseapi_delete(self):
self.requests_mock.register_uri(
'DELETE',
self.BASE_URL + '/qaz',
status_code=204,
)
ret = self.api.delete('qaz')
self.assertEqual(204, ret.status_code)
class TestBaseAPIFind(api_fakes.TestSession):
def setUp(self):
super(TestBaseAPIFind, self).setUp()
self.api = api.BaseAPI(
session=self.sess,
endpoint=self.BASE_URL,
)
def test_baseapi_find_attr_by_id(self):
# All first requests (by name) will fail in this test
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz?name=1',
json={'qaz': []},
status_code=200,
)
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz?id=1',
json={'qaz': [api_fakes.RESP_ITEM_1]},
status_code=200,
)
ret = self.api.find_attr('qaz', '1')
self.assertEqual(api_fakes.RESP_ITEM_1, ret)
# value not found
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz?name=0',
json={'qaz': []},
status_code=200,
)
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz?id=0',
json={'qaz': []},
status_code=200,
)
self.assertRaises(
exceptions.CommandError,
self.api.find_attr,
'qaz',
'0',
)
# Attribute other than 'name'
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz?status=UP',
json={'qaz': [api_fakes.RESP_ITEM_1]},
status_code=200,
)
ret = self.api.find_attr('qaz', 'UP', attr='status')
self.assertEqual(api_fakes.RESP_ITEM_1, ret)
ret = self.api.find_attr('qaz', value='UP', attr='status')
self.assertEqual(api_fakes.RESP_ITEM_1, ret)
def test_baseapi_find_attr_by_name(self):
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz?name=alpha',
json={'qaz': [api_fakes.RESP_ITEM_1]},
status_code=200,
)
ret = self.api.find_attr('qaz', 'alpha')
self.assertEqual(api_fakes.RESP_ITEM_1, ret)
# value not found
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz?name=0',
json={'qaz': []},
status_code=200,
)
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz?id=0',
json={'qaz': []},
status_code=200,
)
self.assertRaises(
exceptions.CommandError,
self.api.find_attr,
'qaz',
'0',
)
# Attribute other than 'name'
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz?status=UP',
json={'qaz': [api_fakes.RESP_ITEM_1]},
status_code=200,
)
ret = self.api.find_attr('qaz', 'UP', attr='status')
self.assertEqual(api_fakes.RESP_ITEM_1, ret)
ret = self.api.find_attr('qaz', value='UP', attr='status')
self.assertEqual(api_fakes.RESP_ITEM_1, ret)
def test_baseapi_find_attr_path_resource(self):
# Test a resource name that differs from the request path
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/wsx?name=1',
json={'qaz': []},
status_code=200,
)
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/wsx?id=1',
json={'qaz': [api_fakes.RESP_ITEM_1]},
status_code=200,
)
ret = self.api.find_attr('wsx', '1', resource='qaz')
self.assertEqual(api_fakes.RESP_ITEM_1, ret)
def test_baseapi_find_bulk_none(self):
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz',
json=api_fakes.LIST_RESP,
status_code=200,
)
ret = self.api.find_bulk('qaz')
self.assertEqual(api_fakes.LIST_RESP, ret)
# Verify headers arg does not interfere
ret = self.api.find_bulk('qaz', headers={})
self.assertEqual(api_fakes.LIST_RESP, ret)
def test_baseapi_find_bulk_one(self):
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz',
json=api_fakes.LIST_RESP,
status_code=200,
)
ret = self.api.find_bulk('qaz', id='1')
self.assertEqual([api_fakes.LIST_RESP[0]], ret)
# Verify headers arg does not interfere with search
ret = self.api.find_bulk('qaz', id='1', headers={})
self.assertEqual([api_fakes.LIST_RESP[0]], ret)
ret = self.api.find_bulk('qaz', id='0')
self.assertEqual([], ret)
ret = self.api.find_bulk('qaz', name='beta')
self.assertEqual([api_fakes.LIST_RESP[1]], ret)
ret = self.api.find_bulk('qaz', error='bogus')
self.assertEqual([], ret)
def test_baseapi_find_bulk_two(self):
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz',
json=api_fakes.LIST_RESP,
status_code=200,
)
ret = self.api.find_bulk('qaz', id='1', name='alpha')
self.assertEqual([api_fakes.LIST_RESP[0]], ret)
ret = self.api.find_bulk('qaz', id='1', name='beta')
self.assertEqual([], ret)
ret = self.api.find_bulk('qaz', id='1', error='beta')
self.assertEqual([], ret)
def test_baseapi_find_bulk_dict(self):
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz',
json={'qaz': api_fakes.LIST_RESP},
status_code=200,
)
ret = self.api.find_bulk('qaz', id='1')
self.assertEqual([api_fakes.LIST_RESP[0]], ret)
class TestBaseAPIList(api_fakes.TestSession):
def setUp(self):
super(TestBaseAPIList, self).setUp()
self.api = api.BaseAPI(
session=self.sess,
endpoint=self.BASE_URL,
)
def test_baseapi_list_no_args(self):
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz',
json=api_fakes.LIST_RESP,
status_code=204,
)
ret = self.api.list('/qaz')
self.assertEqual(api_fakes.LIST_RESP, ret)
def test_baseapi_list_params(self):
params = {'format': 'json'}
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '?format=json',
json=api_fakes.LIST_RESP,
status_code=200,
)
ret = self.api.list('', **params)
self.assertEqual(api_fakes.LIST_RESP, ret)
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz?format=json',
json=api_fakes.LIST_RESP,
status_code=200,
)
ret = self.api.list('qaz', **params)
self.assertEqual(api_fakes.LIST_RESP, ret)
def test_baseapi_list_body(self):
self.requests_mock.register_uri(
'POST',
self.BASE_URL + '/qaz',
json=api_fakes.LIST_RESP,
status_code=200,
)
ret = self.api.list('qaz', body=api_fakes.LIST_BODY)
self.assertEqual(api_fakes.LIST_RESP, ret)
def test_baseapi_list_detailed(self):
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz/details',
json=api_fakes.LIST_RESP,
status_code=200,
)
ret = self.api.list('qaz', detailed=True)
self.assertEqual(api_fakes.LIST_RESP, ret)
def test_baseapi_list_filtered(self):
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz?attr=value',
json=api_fakes.LIST_RESP,
status_code=200,
)
ret = self.api.list('qaz', attr='value')
self.assertEqual(api_fakes.LIST_RESP, ret)
def test_baseapi_list_wrapped(self):
self.requests_mock.register_uri(
'GET',
self.BASE_URL + '/qaz?attr=value',
json={'responses': api_fakes.LIST_RESP},
status_code=200,
)
ret = self.api.list('qaz', attr='value')
self.assertEqual({'responses': api_fakes.LIST_RESP}, ret)
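Taken together, the cases above cover the BaseAPI calls used throughout these tests. A minimal usage sketch, restricted to exactly those calls and relying on the imports at the top of this module (the endpoint and resource names below are placeholders), might look like:

    sess = session.Session()
    client = api.BaseAPI(session=sess, endpoint='http://example.com/v1')

    client.create('widgets')                 # POST <endpoint>/widgets
    client.create('widgets', method='PUT')   # PUT  <endpoint>/widgets
    client.list('widgets', attr='value')     # GET  <endpoint>/widgets?attr=value
    client.find_attr('widgets', 'alpha')     # search by name=..., falling back to id=...
    client.find_bulk('widgets', id='1')      # GET the list, then filter client-side
    client.delete('widgets')                 # DELETE <endpoint>/widgets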

View File

@ -1,115 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""API Utilities Library Tests"""
import copy
from osc_lib.api import api
from osc_lib.api import utils as api_utils
from osc_lib.tests.api import fakes as api_fakes
class TestBaseAPIFilter(api_fakes.TestSession):
"""The filters can be tested independently"""
def setUp(self):
super(TestBaseAPIFilter, self).setUp()
self.api = api.BaseAPI(
session=self.sess,
endpoint=self.BASE_URL,
)
self.input_list = [
api_fakes.RESP_ITEM_1,
api_fakes.RESP_ITEM_2,
api_fakes.RESP_ITEM_3,
]
def test_simple_filter_none(self):
output = api_utils.simple_filter(
)
self.assertIsNone(output)
def test_simple_filter_no_attr(self):
output = api_utils.simple_filter(
copy.deepcopy(self.input_list),
)
self.assertEqual(self.input_list, output)
def test_simple_filter_attr_only(self):
output = api_utils.simple_filter(
copy.deepcopy(self.input_list),
attr='status',
)
self.assertEqual(self.input_list, output)
def test_simple_filter_attr_value(self):
output = api_utils.simple_filter(
copy.deepcopy(self.input_list),
attr='status',
value='',
)
self.assertEqual([], output)
output = api_utils.simple_filter(
copy.deepcopy(self.input_list),
attr='status',
value='UP',
)
self.assertEqual(
[api_fakes.RESP_ITEM_1, api_fakes.RESP_ITEM_3],
output,
)
output = api_utils.simple_filter(
copy.deepcopy(self.input_list),
attr='fred',
value='UP',
)
self.assertEqual([], output)
def test_simple_filter_prop_attr_only(self):
output = api_utils.simple_filter(
copy.deepcopy(self.input_list),
attr='b',
property_field='props',
)
self.assertEqual(self.input_list, output)
output = api_utils.simple_filter(
copy.deepcopy(self.input_list),
attr='status',
property_field='props',
)
self.assertEqual(self.input_list, output)
def test_simple_filter_prop_attr_value(self):
output = api_utils.simple_filter(
copy.deepcopy(self.input_list),
attr='b',
value=2,
property_field='props',
)
self.assertEqual(
[api_fakes.RESP_ITEM_1, api_fakes.RESP_ITEM_2],
output,
)
output = api_utils.simple_filter(
copy.deepcopy(self.input_list),
attr='b',
value=9,
property_field='props',
)
self.assertEqual([], output)
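For reference, a behaviourally equivalent sketch of the filter these cases describe (a hypothetical stand-in, not the osc_lib implementation) could read:

    def sketch_simple_filter(data=None, attr=None, value=None,
                             property_field=None):
        # No list given: nothing to filter
        if data is None:
            return None
        # Without a complete attr/value pair the input passes through
        if attr is None or value is None:
            return data
        result = []
        for item in data:
            # Optionally look the attribute up inside a property sub-dict
            source = item.get(property_field, {}) if property_field else item
            if source.get(attr) == value:
                result.append(item)
        return result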

View File

@ -1,222 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
from osc_lib.cli import client_config
from osc_lib.tests import utils
class TestOSCConfig(utils.TestCase):
def setUp(self):
super(TestOSCConfig, self).setUp()
self.cloud = client_config.OSC_Config()
def test_auth_select_default_plugin(self):
config = {
'auth_type': 'admin_token',
}
ret_config = self.cloud._auth_select_default_plugin(config)
self.assertEqual('admin_token', ret_config['auth_type'])
def test_auth_select_default_plugin_password(self):
config = {
'username': 'fred',
}
ret_config = self.cloud._auth_select_default_plugin(config)
self.assertEqual('password', ret_config['auth_type'])
self.assertEqual('fred', ret_config['username'])
def test_auth_select_default_plugin_password_v2(self):
config = {
'identity_api_version': '2',
'username': 'fred',
}
ret_config = self.cloud._auth_select_default_plugin(config)
self.assertEqual('v2password', ret_config['auth_type'])
self.assertEqual('fred', ret_config['username'])
def test_auth_select_default_plugin_password_v2_int(self):
config = {
'identity_api_version': 2,
'username': 'fred',
}
ret_config = self.cloud._auth_select_default_plugin(config)
self.assertEqual('v2password', ret_config['auth_type'])
self.assertEqual('fred', ret_config['username'])
def test_auth_select_default_plugin_password_v3(self):
config = {
'identity_api_version': '3',
'username': 'fred',
}
ret_config = self.cloud._auth_select_default_plugin(config)
self.assertEqual('v3password', ret_config['auth_type'])
self.assertEqual('fred', ret_config['username'])
def test_auth_select_default_plugin_password_v3_int(self):
config = {
'identity_api_version': 3,
'username': 'fred',
}
ret_config = self.cloud._auth_select_default_plugin(config)
self.assertEqual('v3password', ret_config['auth_type'])
self.assertEqual('fred', ret_config['username'])
def test_auth_select_default_plugin_token(self):
config = {
'token': 'subway',
}
ret_config = self.cloud._auth_select_default_plugin(config)
self.assertEqual('token', ret_config['auth_type'])
self.assertEqual('subway', ret_config['token'])
def test_auth_select_default_plugin_token_v2(self):
config = {
'identity_api_version': '2.2',
'token': 'subway',
}
ret_config = self.cloud._auth_select_default_plugin(config)
self.assertEqual('v2token', ret_config['auth_type'])
self.assertEqual('subway', ret_config['token'])
def test_auth_select_default_plugin_token_v3(self):
config = {
'identity_api_version': '3',
'token': 'subway',
}
ret_config = self.cloud._auth_select_default_plugin(config)
self.assertEqual('v3token', ret_config['auth_type'])
self.assertEqual('subway', ret_config['token'])
def test_auth_v2_arguments(self):
config = {
'identity_api_version': '2',
'auth_type': 'v2password',
'auth': {
'username': 'fred',
},
}
ret_config = self.cloud._auth_v2_arguments(config)
self.assertEqual('fred', ret_config['auth']['username'])
self.assertFalse('tenant_id' in ret_config['auth'])
self.assertFalse('tenant_name' in ret_config['auth'])
config = {
'identity_api_version': '3',
'auth_type': 'v3password',
'auth': {
'username': 'fred',
'project_id': 'id',
},
}
ret_config = self.cloud._auth_v2_arguments(config)
self.assertEqual('fred', ret_config['auth']['username'])
self.assertFalse('tenant_id' in ret_config['auth'])
self.assertFalse('tenant_name' in ret_config['auth'])
config = {
'identity_api_version': '2',
'auth_type': 'v2password',
'auth': {
'username': 'fred',
'project_id': 'id',
},
}
ret_config = self.cloud._auth_v2_arguments(config)
self.assertEqual('id', ret_config['auth']['tenant_id'])
self.assertFalse('tenant_name' in ret_config['auth'])
config = {
'identity_api_version': '2',
'auth_type': 'v2password',
'auth': {
'username': 'fred',
'project_name': 'name',
},
}
ret_config = self.cloud._auth_v2_arguments(config)
self.assertFalse('tenant_id' in ret_config['auth'])
self.assertEqual('name', ret_config['auth']['tenant_name'])
def test_auth_v2_ignore_v3(self):
config = {
'identity_api_version': '2',
'auth_type': 'v2password',
'auth': {
'username': 'fred',
'project_id': 'id',
'project_domain_id': 'bad',
},
}
ret_config = self.cloud._auth_v2_ignore_v3(config)
self.assertEqual('fred', ret_config['auth']['username'])
self.assertFalse('project_domain_id' in ret_config['auth'])
def test_auth_default_domain_not_set(self):
config = {
'identity_api_version': '3',
'auth_type': 'v3oidcpassword',
'default_domain': 'default',
'auth': {
'username': 'fred',
'project_id': 'id',
},
}
ret_config = self.cloud._auth_default_domain(config)
self.assertEqual('v3oidcpassword', ret_config['auth_type'])
self.assertEqual('default', ret_config['default_domain'])
self.assertEqual('fred', ret_config['auth']['username'])
self.assertNotIn('project_domain_id', ret_config['auth'])
self.assertNotIn('user_domain_id', ret_config['auth'])
def test_auth_default_domain_use_default(self):
config = {
'identity_api_version': '3',
'auth_type': 'v3password',
'default_domain': 'default',
'auth': {
'username': 'fred',
'project_id': 'id',
},
}
ret_config = self.cloud._auth_default_domain(config)
self.assertEqual('v3password', ret_config['auth_type'])
self.assertEqual('default', ret_config['default_domain'])
self.assertEqual('fred', ret_config['auth']['username'])
self.assertEqual('default', ret_config['auth']['project_domain_id'])
self.assertEqual('default', ret_config['auth']['user_domain_id'])
def test_auth_default_domain_use_given(self):
config = {
'identity_api_version': '3',
'auth_type': 'v3password',
'default_domain': 'default',
'auth': {
'username': 'fred',
'project_id': 'id',
'project_domain_id': 'proj',
'user_domain_id': 'use'
},
}
ret_config = self.cloud._auth_default_domain(config)
self.assertEqual('v3password', ret_config['auth_type'])
self.assertEqual('default', ret_config['default_domain'])
self.assertEqual('fred', ret_config['auth']['username'])
self.assertEqual('proj', ret_config['auth']['project_domain_id'])
self.assertEqual('use', ret_config['auth']['user_domain_id'])
def test_auth_config_hook_default(self):
config = {}
ret_config = self.cloud.auth_config_hook(config)
self.assertEqual('password', ret_config['auth_type'])
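As a quick illustration of the default-plugin selection covered above, using values taken directly from these cases:

    cloud = client_config.OSC_Config()

    # Username only -> generic password plugin
    cloud._auth_select_default_plugin({'username': 'fred'})
    # returns a config with auth_type == 'password'

    # Token plus identity_api_version 3 -> versioned token plugin
    cloud._auth_select_default_plugin(
        {'identity_api_version': '3', 'token': 'subway'})
    # returns a config with auth_type == 'v3token'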

View File

@ -1,65 +0,0 @@
# Copyright 2017 Huawei, Inc. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
from osc_lib.cli import format_columns
from osc_lib.tests import utils
class TestDictColumn(utils.TestCase):
def test_dict_column(self):
dict_content = {
'key1': 'value1',
'key2': 'value2',
}
col = format_columns.DictColumn(dict_content)
self.assertEqual(dict_content, col.machine_readable())
self.assertEqual("key1='value1', key2='value2'", col.human_readable())
class TestDictListColumn(utils.TestCase):
def test_dict_list_column(self):
dict_list_content = {'public': ['2001:db8::8', '172.24.4.6'],
'private': ['2000:db7::7', '192.24.4.6']}
col = format_columns.DictListColumn(dict_list_content)
self.assertEqual(dict_list_content, col.machine_readable())
self.assertEqual('private=192.24.4.6, 2000:db7::7; '
'public=172.24.4.6, 2001:db8::8',
col.human_readable())
class TestListColumn(utils.TestCase):
def test_list_column(self):
list_content = [
'key1',
'key2',
]
col = format_columns.ListColumn(list_content)
self.assertEqual(list_content, col.machine_readable())
self.assertEqual("key1, key2", col.human_readable())
class TestListDictColumn(utils.TestCase):
def test_list_dict_column(self):
list_dict_content = [
{'key1': 'value1'},
{'key2': 'value2'},
]
col = format_columns.ListDictColumn(list_dict_content)
self.assertEqual(list_dict_content, col.machine_readable())
self.assertEqual("key1='value1'\nkey2='value2'", col.human_readable())

View File

@ -1,428 +0,0 @@
# Copyright 2012-2013 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
import argparse
from osc_lib.cli import parseractions
from osc_lib.tests import utils
class TestKeyValueAction(utils.TestCase):
def setUp(self):
super(TestKeyValueAction, self).setUp()
self.parser = argparse.ArgumentParser()
# Set up our typical usage
self.parser.add_argument(
'--property',
metavar='<key=value>',
action=parseractions.KeyValueAction,
default={'green': '20%', 'format': '#rgb'},
help='Property to store for this volume '
'(repeat option to set multiple properties)',
)
def test_good_values(self):
results = self.parser.parse_args([
'--property', 'red=',
'--property', 'green=100%',
'--property', 'blue=50%',
])
actual = getattr(results, 'property', {})
# All should pass through unmolested
expect = {'red': '', 'green': '100%', 'blue': '50%', 'format': '#rgb'}
self.assertEqual(expect, actual)
def test_error_values(self):
data_list = [
['--property', 'red', ],
['--property', '=', ],
['--property', '=red', ]
]
for data in data_list:
self.assertRaises(argparse.ArgumentTypeError,
self.parser.parse_args, data)
class TestMultiKeyValueAction(utils.TestCase):
def setUp(self):
super(TestMultiKeyValueAction, self).setUp()
self.parser = argparse.ArgumentParser()
# Set up our typical usage
self.parser.add_argument(
'--test',
metavar='req1=xxx,req2=yyy',
action=parseractions.MultiKeyValueAction,
dest='test',
default=None,
required_keys=['req1', 'req2'],
optional_keys=['opt1', 'opt2'],
help='Test'
)
def test_good_values(self):
results = self.parser.parse_args([
'--test', 'req1=aaa,req2=bbb',
'--test', 'req1=,req2=',
])
actual = getattr(results, 'test', [])
expect = [
{'req1': 'aaa', 'req2': 'bbb'},
{'req1': '', 'req2': ''},
]
self.assertItemsEqual(expect, actual)
def test_empty_required_optional(self):
self.parser.add_argument(
'--test-empty',
metavar='req1=xxx,req2=yyy',
action=parseractions.MultiKeyValueAction,
dest='test_empty',
default=None,
required_keys=[],
optional_keys=[],
help='Test'
)
results = self.parser.parse_args([
'--test-empty', 'req1=aaa,req2=bbb',
'--test-empty', 'req1=,req2=',
])
actual = getattr(results, 'test_empty', [])
expect = [
{'req1': 'aaa', 'req2': 'bbb'},
{'req1': '', 'req2': ''},
]
self.assertItemsEqual(expect, actual)
def test_error_values_with_comma(self):
data_list = [
['--test', 'mmm,nnn=zzz', ],
['--test', 'nnn=zzz,=', ],
['--test', 'nnn=zzz,=zzz', ]
]
for data in data_list:
self.assertRaises(argparse.ArgumentTypeError,
self.parser.parse_args, data)
def test_error_values_without_comma(self):
self.assertRaises(
argparse.ArgumentTypeError,
self.parser.parse_args,
[
'--test', 'mmmnnn',
]
)
def test_missing_key(self):
self.assertRaises(
argparse.ArgumentTypeError,
self.parser.parse_args,
[
'--test', 'req2=ddd',
]
)
def test_invalid_key(self):
self.assertRaises(
argparse.ArgumentTypeError,
self.parser.parse_args,
[
'--test', 'req1=aaa,req2=bbb,aaa=req1',
]
)
def test_required_keys_not_list(self):
self.assertRaises(
TypeError,
self.parser.add_argument,
'--test-required-dict',
metavar='req1=xxx,req2=yyy',
action=parseractions.MultiKeyValueAction,
dest='test_required_dict',
default=None,
required_keys={'aaa': 'bbb'},
optional_keys=['opt1', 'opt2'],
help='Test'
)
def test_optional_keys_not_list(self):
self.assertRaises(
TypeError,
self.parser.add_argument,
'--test-optional-dict',
metavar='req1=xxx,req2=yyy',
action=parseractions.MultiKeyValueAction,
dest='test_optional_dict',
default=None,
required_keys=['req1', 'req2'],
optional_keys={'aaa': 'bbb'},
help='Test'
)
class TestMultiKeyValueCommaAction(utils.TestCase):
def setUp(self):
super(TestMultiKeyValueCommaAction, self).setUp()
self.parser = argparse.ArgumentParser()
# Typical usage
self.parser.add_argument(
'--test',
metavar='req1=xxx,yyy',
action=parseractions.MultiKeyValueCommaAction,
dest='test',
default=None,
required_keys=['req1'],
optional_keys=['opt2'],
help='Test',
)
def test_mkvca_required(self):
results = self.parser.parse_args([
'--test', 'req1=aaa,bbb',
])
actual = getattr(results, 'test', [])
expect = [
{'req1': 'aaa,bbb'},
]
self.assertItemsEqual(expect, actual)
results = self.parser.parse_args([
'--test', 'req1=',
])
actual = getattr(results, 'test', [])
expect = [
{'req1': ''},
]
self.assertItemsEqual(expect, actual)
results = self.parser.parse_args([
'--test', 'req1=aaa,bbb',
'--test', 'req1=',
])
actual = getattr(results, 'test', [])
expect = [
{'req1': 'aaa,bbb'},
{'req1': ''},
]
self.assertItemsEqual(expect, actual)
def test_mkvca_optional(self):
results = self.parser.parse_args([
'--test', 'req1=aaa,bbb',
])
actual = getattr(results, 'test', [])
expect = [
{'req1': 'aaa,bbb'},
]
self.assertItemsEqual(expect, actual)
results = self.parser.parse_args([
'--test', 'req1=aaa,bbb',
'--test', 'req1=,opt2=ccc',
])
actual = getattr(results, 'test', [])
expect = [
{'req1': 'aaa,bbb'},
{'req1': '', 'opt2': 'ccc'},
]
self.assertItemsEqual(expect, actual)
try:
results = self.parser.parse_args([
'--test', 'req1=aaa,bbb',
'--test', 'opt2=ccc',
])
self.fail('ArgumentTypeError should be raised')
except argparse.ArgumentTypeError as e:
self.assertEqual(
'Missing required keys req1.\nRequired keys are: req1',
str(e),
)
def test_mkvca_multiples(self):
results = self.parser.parse_args([
'--test', 'req1=aaa,bbb,opt2=ccc',
])
actual = getattr(results, 'test', [])
expect = [{
'req1': 'aaa,bbb',
'opt2': 'ccc',
}]
self.assertItemsEqual(expect, actual)
def test_mkvca_no_required_optional(self):
self.parser.add_argument(
'--test-empty',
metavar='req1=xxx,yyy',
action=parseractions.MultiKeyValueCommaAction,
dest='test_empty',
default=None,
required_keys=[],
optional_keys=[],
help='Test',
)
results = self.parser.parse_args([
'--test-empty', 'req1=aaa,bbb',
])
actual = getattr(results, 'test_empty', [])
expect = [
{'req1': 'aaa,bbb'},
]
self.assertItemsEqual(expect, actual)
results = self.parser.parse_args([
'--test-empty', 'xyz=aaa,bbb',
])
actual = getattr(results, 'test_empty', [])
expect = [
{'xyz': 'aaa,bbb'},
]
self.assertItemsEqual(expect, actual)
def test_mkvca_invalid_key(self):
try:
self.parser.parse_args([
'--test', 'req1=aaa,bbb=',
])
self.fail('ArgumentTypeError should be raised')
except argparse.ArgumentTypeError as e:
self.assertIn(
'Invalid keys bbb specified.\nValid keys are:',
str(e),
)
try:
self.parser.parse_args([
'--test', 'nnn=aaa',
])
self.fail('ArgumentTypeError should be raised')
except argparse.ArgumentTypeError as e:
self.assertIn(
'Invalid keys nnn specified.\nValid keys are:',
str(e),
)
def test_mkvca_value_no_key(self):
try:
self.parser.parse_args([
'--test', 'req1=aaa,=bbb',
])
self.fail('ArgumentTypeError should be raised')
except argparse.ArgumentTypeError as e:
self.assertEqual(
"A key must be specified before '=': =bbb",
str(e),
)
try:
self.parser.parse_args([
'--test', '=nnn',
])
self.fail('ArgumentTypeError should be raised')
except argparse.ArgumentTypeError as e:
self.assertEqual(
"A key must be specified before '=': =nnn",
str(e),
)
try:
self.parser.parse_args([
'--test', 'nnn',
])
self.fail('ArgumentTypeError should be raised')
except argparse.ArgumentTypeError as e:
self.assertIn(
'A key=value pair is required:',
str(e),
)
def test_mkvca_required_keys_not_list(self):
self.assertRaises(
TypeError,
self.parser.add_argument,
'--test-required-dict',
metavar='req1=xxx',
action=parseractions.MultiKeyValueCommaAction,
dest='test_required_dict',
default=None,
required_keys={'aaa': 'bbb'},
optional_keys=['opt1', 'opt2'],
help='Test',
)
def test_mkvca_optional_keys_not_list(self):
self.assertRaises(
TypeError,
self.parser.add_argument,
'--test-optional-dict',
metavar='req1=xxx',
action=parseractions.MultiKeyValueCommaAction,
dest='test_optional_dict',
default=None,
required_keys=['req1', 'req2'],
optional_keys={'aaa': 'bbb'},
help='Test',
)
class TestNonNegativeAction(utils.TestCase):
def setUp(self):
super(TestNonNegativeAction, self).setUp()
self.parser = argparse.ArgumentParser()
# Set up our typical usage
self.parser.add_argument(
'--foo',
metavar='<foo>',
type=int,
action=parseractions.NonNegativeAction,
)
def test_negative_values(self):
self.assertRaises(
argparse.ArgumentTypeError,
self.parser.parse_args,
"--foo -1".split()
)
def test_zero_values(self):
results = self.parser.parse_args(
'--foo 0'.split()
)
actual = getattr(results, 'foo', None)
self.assertEqual(actual, 0)
def test_positive_values(self):
results = self.parser.parse_args(
'--foo 1'.split()
)
actual = getattr(results, 'foo', None)
self.assertEqual(actual, 1)
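For orientation, an argparse action with the key=value behaviour TestKeyValueAction exercises could be sketched as below, relying on the argparse import at the top of this file; SketchKeyValueAction is a hypothetical name, not the osc_lib class:

    class SketchKeyValueAction(argparse.Action):
        """Collect repeated key=value options into a dict (sketch only)"""

        def __call__(self, parser, namespace, values, option_string=None):
            if getattr(namespace, self.dest, None) is None:
                setattr(namespace, self.dest, {})
            # A non-empty key and an '=' are required, matching the
            # error cases tested above
            if '=' not in values or values.startswith('='):
                raise argparse.ArgumentTypeError(
                    "Expected 'key=value', got: %s" % values)
            key, value = values.split('=', 1)
            # Merge into any default dict so pre-set keys survive
            getattr(namespace, self.dest).update({key: value})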

View File

@ -1,52 +0,0 @@
# Copyright 2016 NEC Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import mock
from osc_lib.command import command
from osc_lib import exceptions
from osc_lib.tests import fakes as test_fakes
from osc_lib.tests import utils as test_utils
class FakeCommand(command.Command):
def take_action(self, parsed_args):
pass
class TestCommand(test_utils.TestCase):
def test_command_has_logger(self):
cmd = FakeCommand(mock.Mock(), mock.Mock())
self.assertTrue(hasattr(cmd, 'log'))
self.assertEqual(
'osc_lib.tests.command.test_command.FakeCommand',
cmd.log.name,
)
def test_validate_os_beta_command_enabled(self):
cmd = FakeCommand(mock.Mock(), mock.Mock())
cmd.app = mock.Mock()
cmd.app.options = test_fakes.FakeOptions()
# No exception is raised when enabled.
cmd.app.options.os_beta_command = True
cmd.validate_os_beta_command_enabled()
cmd.app.options.os_beta_command = False
self.assertRaises(
exceptions.CommandError,
cmd.validate_os_beta_command_enabled,
)

View File

@ -1,107 +0,0 @@
# Copyright 2012-2013 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
import mock
from osc_lib.command import commandmanager
from osc_lib.tests import utils
class FakeCommand(object):
@classmethod
def load(cls):
return cls
def __init__(self):
return
FAKE_CMD_ONE = FakeCommand
FAKE_CMD_TWO = FakeCommand
FAKE_CMD_ALPHA = FakeCommand
FAKE_CMD_BETA = FakeCommand
class FakeCommandManager(commandmanager.CommandManager):
commands = {}
def load_commands(self, namespace):
if namespace == 'test':
self.commands['one'] = FAKE_CMD_ONE
self.commands['two'] = FAKE_CMD_TWO
self.group_list.append(namespace)
elif namespace == 'greek':
self.commands['alpha'] = FAKE_CMD_ALPHA
self.commands['beta'] = FAKE_CMD_BETA
self.group_list.append(namespace)
class TestCommandManager(utils.TestCase):
def test_add_command_group(self):
mgr = FakeCommandManager('test')
# Make sure add_command() still functions
mock_cmd_one = mock.Mock()
mgr.add_command('mock', mock_cmd_one)
cmd_mock, name, args = mgr.find_command(['mock'])
self.assertEqual(mock_cmd_one, cmd_mock)
# Find a command added in initialization
cmd_one, name, args = mgr.find_command(['one'])
self.assertEqual(FAKE_CMD_ONE, cmd_one)
# Load another command group
mgr.add_command_group('greek')
# Find a new command
cmd_alpha, name, args = mgr.find_command(['alpha'])
self.assertEqual(FAKE_CMD_ALPHA, cmd_alpha)
# Ensure that the original commands were not overwritten
cmd_two, name, args = mgr.find_command(['two'])
self.assertEqual(FAKE_CMD_TWO, cmd_two)
def test_get_command_groups(self):
mgr = FakeCommandManager('test')
# Make sure add_command() still functions
mock_cmd_one = mock.Mock()
mgr.add_command('mock', mock_cmd_one)
cmd_mock, name, args = mgr.find_command(['mock'])
self.assertEqual(mock_cmd_one, cmd_mock)
# Load another command group
mgr.add_command_group('greek')
gl = mgr.get_command_groups()
self.assertEqual(['test', 'greek'], gl)
def test_get_command_names(self):
mock_cmd_one = mock.Mock()
mock_cmd_one.name = 'one'
mock_cmd_two = mock.Mock()
mock_cmd_two.name = 'cmd two'
mock_pkg_resources = mock.Mock(
return_value=[mock_cmd_one, mock_cmd_two],
)
with mock.patch(
'pkg_resources.iter_entry_points',
mock_pkg_resources,
) as iter_entry_points:
mgr = commandmanager.CommandManager('test')
iter_entry_points.assert_called_once_with('test')
cmds = mgr.get_command_names('test')
self.assertEqual(['one', 'cmd two'], cmds)

View File

@ -1,94 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Test Timing pseudo-command"""
import datetime
from osc_lib.command import timing
from osc_lib.tests import fakes
from osc_lib.tests import utils
timing_url = 'GET http://localhost:5000'
timing_elapsed = 0.872809
class FakeGenericClient(object):
def __init__(self, **kwargs):
self.auth_token = kwargs['token']
self.management_url = kwargs['endpoint']
class TestTiming(utils.TestCommand):
columns = (
'URL',
'Seconds',
)
def setUp(self):
super(TestTiming, self).setUp()
self.app.timing_data = []
self.app.client_manager.compute = FakeGenericClient(
endpoint=fakes.AUTH_URL,
token=fakes.AUTH_TOKEN,
)
self.app.client_manager.volume = FakeGenericClient(
endpoint=fakes.AUTH_URL,
token=fakes.AUTH_TOKEN,
)
# Get the command object to test
self.cmd = timing.Timing(self.app, None)
def test_timing_list_no_data(self):
arglist = []
verifylist = []
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
# In cliff's Lister base command class, the abstract method take_action()
# returns a tuple of the column names and an iterable containing the
# data to be listed.
columns, data = self.cmd.take_action(parsed_args)
self.assertEqual(self.columns, columns)
datalist = [
('Total', 0.0,)
]
self.assertEqual(datalist, data)
def test_timing_list(self):
self.app.timing_data = [(
timing_url,
datetime.timedelta(microseconds=timing_elapsed * 1000000),
)]
arglist = []
verifylist = []
parsed_args = self.check_parser(self.cmd, arglist, verifylist)
# In cliff's Lister base command class, the abstract method take_action()
# returns a tuple of the column names and an iterable containing the
# data to be listed.
columns, data = self.cmd.take_action(parsed_args)
self.assertEqual(self.columns, columns)
datalist = [
(timing_url, timing_elapsed),
('Total', timing_elapsed),
]
self.assertEqual(datalist, data)

View File

@ -1,191 +0,0 @@
# Copyright 2013 Nebula Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
import mock
import six
import sys
from keystoneauth1 import fixture
AUTH_TOKEN = "foobar"
AUTH_URL = "http://0.0.0.0"
USERNAME = "itchy"
PASSWORD = "scratchy"
PROJECT_NAME = "poochie"
PROJECT_ID = "30c3da29-61f5-4b7b-8eb2-3d18287428c7"
REGION_NAME = "richie"
INTERFACE = "catchy"
VERSION = "3"
SERVICE_PROVIDER_ID = "bob"
TEST_RESPONSE_DICT = fixture.V2Token(token_id=AUTH_TOKEN,
user_name=USERNAME)
_s = TEST_RESPONSE_DICT.add_service('identity', name='keystone')
_s.add_endpoint(AUTH_URL + ':5000/v2.0')
_s = TEST_RESPONSE_DICT.add_service('network', name='neutron')
_s.add_endpoint(AUTH_URL + ':9696')
_s = TEST_RESPONSE_DICT.add_service('compute', name='nova')
_s.add_endpoint(AUTH_URL + ':8774/v2')
_s = TEST_RESPONSE_DICT.add_service('image', name='glance')
_s.add_endpoint(AUTH_URL + ':9292')
_s = TEST_RESPONSE_DICT.add_service('object', name='swift')
_s.add_endpoint(AUTH_URL + ':8080/v1')
TEST_RESPONSE_DICT_V3 = fixture.V3Token(user_name=USERNAME)
TEST_RESPONSE_DICT_V3.set_project_scope()
TEST_VERSIONS = fixture.DiscoveryList(href=AUTH_URL)
def to_unicode_dict(catalog_dict):
"""Converts dict to unicode dict"""
if isinstance(catalog_dict, dict):
return {to_unicode_dict(key): to_unicode_dict(value)
for key, value in catalog_dict.items()}
elif isinstance(catalog_dict, list):
return [to_unicode_dict(element) for element in catalog_dict]
elif isinstance(catalog_dict, str):
return catalog_dict + u""
else:
return catalog_dict
class FakeStdout(object):
def __init__(self):
self.content = []
def write(self, text):
self.content.append(text)
def make_string(self):
result = ''
for line in self.content:
result = result + line
return result
class FakeLog(object):
def __init__(self):
self.messages = {}
def debug(self, msg):
self.messages['debug'] = msg
def info(self, msg):
self.messages['info'] = msg
def warning(self, msg):
self.messages['warning'] = msg
def error(self, msg):
self.messages['error'] = msg
def critical(self, msg):
self.messages['critical'] = msg
class FakeApp(object):
def __init__(self, _stdout, _log):
self.stdout = _stdout
self.client_manager = None
self.stdin = sys.stdin
self.stdout = _stdout or sys.stdout
self.stderr = sys.stderr
self.log = _log
class FakeOptions(object):
def __init__(self, **kwargs):
self.os_beta_command = False
class FakeClientManager(object):
def __init__(self):
self.compute = None
self.identity = None
self.image = None
self.object_store = None
self.volume = None
self.network = None
self.session = None
self.auth_ref = None
self.auth_plugin_name = None
def get_configuration(self):
return {
'auth': {
'username': USERNAME,
'password': PASSWORD,
'token': AUTH_TOKEN,
},
'region': REGION_NAME,
'identity_api_version': VERSION,
}
class FakeResource(object):
def __init__(self, manager=None, info=None, loaded=False, methods=None):
"""Set attributes and methods for a resource.
:param manager:
The resource manager
:param Dictionary info:
A dictionary with all attributes
:param bool loaded:
True if the resource is loaded in memory
:param Dictionary methods:
A dictionary with all methods
"""
info = info or {}
methods = methods or {}
self.__name__ = type(self).__name__
self.manager = manager
self._info = info
self._add_details(info)
self._add_methods(methods)
self._loaded = loaded
def _add_details(self, info):
for (k, v) in six.iteritems(info):
setattr(self, k, v)
def _add_methods(self, methods):
"""Fake methods with MagicMock objects.
For each <@key, @value> pair in methods, add a callable MagicMock
object named @key as an attribute, and set the mock's return_value to
@value. When users access the attribute with (), @value will be
returned, which looks like a function call.
"""
for (name, ret) in six.iteritems(methods):
method = mock.MagicMock(return_value=ret)
setattr(self, name, method)
def __repr__(self):
reprkeys = sorted(k for k in self.__dict__.keys() if k[0] != '_' and
k != 'manager')
info = ", ".join("%s=%s" % (k, getattr(self, k)) for k in reprkeys)
return "<%s %s>" % (self.__class__.__name__, info)
def keys(self):
return self._info.keys()
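Illustrative use of FakeResource as described by its docstrings (the attribute and method names here are placeholders):

    fake = FakeResource(
        info={'id': 'a1', 'name': 'fake1'},
        methods={'reboot': True},
    )
    fake.name        # 'fake1' -- set by _add_details()
    fake.reboot()    # True    -- a MagicMock installed by _add_methods()
    fake.keys()      # the keys of the original info dict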

View File

@ -1,379 +0,0 @@
# Copyright 2012-2013 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
import copy
import mock
from keystoneauth1.access import service_catalog
from keystoneauth1 import exceptions as ksa_exceptions
from keystoneauth1.identity import generic as generic_plugin
from keystoneauth1.identity.v3 import k2k
from keystoneauth1 import loading
from keystoneauth1 import token_endpoint
from os_client_config import cloud_config
from osc_lib.api import auth
from osc_lib import clientmanager
from osc_lib import exceptions as exc
from osc_lib.tests import fakes
from osc_lib.tests import utils
AUTH_REF = {'version': 'v2.0'}
AUTH_REF.update(fakes.TEST_RESPONSE_DICT['access'])
SERVICE_CATALOG = service_catalog.ServiceCatalogV2(AUTH_REF)
AUTH_DICT = {
'auth_url': fakes.AUTH_URL,
'username': fakes.USERNAME,
'password': fakes.PASSWORD,
'project_name': fakes.PROJECT_NAME
}
# This is deferred in api.auth but we need it here...
auth.get_options_list()
class Container(object):
attr = clientmanager.ClientCache(lambda x: object())
buggy_attr = clientmanager.ClientCache(lambda x: x.foo)
def __init__(self):
pass
class TestClientCache(utils.TestCase):
def test_singleton(self):
# NOTE(dtroyer): Verify that the ClientCache descriptor only invokes
# the factory one time and always returns the same value after that.
c = Container()
self.assertEqual(c.attr, c.attr)
def test_attribute_error_propagates(self):
c = Container()
err = self.assertRaises(exc.PluginAttributeError,
getattr, c, 'buggy_attr')
self.assertNotIsInstance(err, AttributeError)
self.assertEqual("'Container' object has no attribute 'foo'", str(err))
class TestClientManager(utils.TestClientManager):
def test_client_manager_admin_token(self):
token_auth = {
'endpoint': fakes.AUTH_URL,
'token': fakes.AUTH_TOKEN,
}
client_manager = self._make_clientmanager(
auth_args=token_auth,
auth_plugin_name='admin_token',
)
self.assertEqual(
fakes.AUTH_URL,
client_manager._cli_options.config['auth']['endpoint'],
)
self.assertEqual(
fakes.AUTH_TOKEN,
client_manager.auth.get_token(None),
)
self.assertIsInstance(
client_manager.auth,
token_endpoint.Token,
)
# NOTE(dtroyer): This is intentionally not assertFalse() as the return
# value from is_service_available() may be == None
self.assertNotEqual(
False,
client_manager.is_service_available('network'),
)
def test_client_manager_password(self):
client_manager = self._make_clientmanager(
auth_required=True,
)
self.assertEqual(
fakes.AUTH_URL,
client_manager._cli_options.config['auth']['auth_url'],
)
self.assertEqual(
fakes.USERNAME,
client_manager._cli_options.config['auth']['username'],
)
self.assertEqual(
fakes.PASSWORD,
client_manager._cli_options.config['auth']['password'],
)
self.assertIsInstance(
client_manager.auth,
generic_plugin.Password,
)
self.assertTrue(client_manager.verify)
self.assertIsNone(client_manager.cert)
# These need to stick around until the old-style clients are gone
self.assertEqual(
AUTH_REF.pop('version'),
client_manager.auth_ref.version,
)
self.assertEqual(
fakes.to_unicode_dict(AUTH_REF),
client_manager.auth_ref._data['access'],
)
self.assertEqual(
dir(SERVICE_CATALOG),
dir(client_manager.auth_ref.service_catalog),
)
self.assertTrue(client_manager.is_service_available('network'))
def test_client_manager_password_verify(self):
client_manager = self._make_clientmanager(
auth_required=True,
)
self.assertTrue(client_manager.verify)
self.assertIsNone(client_manager.cacert)
self.assertTrue(client_manager.is_service_available('network'))
def test_client_manager_password_verify_ca(self):
config_args = {
'cacert': 'cafile',
}
client_manager = self._make_clientmanager(
config_args=config_args,
auth_required=True,
)
# Test that client_manager.verify is Requests-compatible,
# i.e. it contains the value of cafile here
self.assertTrue(client_manager.verify)
self.assertEqual('cafile', client_manager.verify)
self.assertEqual('cafile', client_manager.cacert)
self.assertTrue(client_manager.is_service_available('network'))
def test_client_manager_password_verify_false(self):
config_args = {
'verify': False,
}
client_manager = self._make_clientmanager(
config_args=config_args,
auth_required=True,
)
self.assertFalse(client_manager.verify)
self.assertIsNone(client_manager.cacert)
self.assertTrue(client_manager.is_service_available('network'))
def test_client_manager_password_verify_insecure(self):
config_args = {
'insecure': True,
}
client_manager = self._make_clientmanager(
config_args=config_args,
auth_required=True,
)
self.assertFalse(client_manager.verify)
self.assertIsNone(client_manager.cacert)
self.assertTrue(client_manager.is_service_available('network'))
def test_client_manager_password_verify_insecure_ca(self):
config_args = {
'insecure': True,
'cacert': 'cafile',
}
client_manager = self._make_clientmanager(
config_args=config_args,
auth_required=True,
)
# insecure overrides cacert
self.assertFalse(client_manager.verify)
self.assertIsNone(client_manager.cacert)
self.assertTrue(client_manager.is_service_available('network'))
def test_client_manager_password_client_cert(self):
config_args = {
'cert': 'cert',
}
client_manager = self._make_clientmanager(
config_args=config_args,
)
self.assertEqual('cert', client_manager.cert)
def test_client_manager_password_client_key(self):
config_args = {
'cert': 'cert',
'key': 'key',
}
client_manager = self._make_clientmanager(
config_args=config_args,
)
self.assertEqual(('cert', 'key'), client_manager.cert)
def test_client_manager_select_auth_plugin_password(self):
# test password auth
auth_args = {
'auth_url': fakes.AUTH_URL,
'username': fakes.USERNAME,
'password': fakes.PASSWORD,
'tenant_name': fakes.PROJECT_NAME,
}
self._make_clientmanager(
auth_args=auth_args,
identity_api_version='2.0',
auth_plugin_name='v2password',
)
auth_args = copy.deepcopy(self.default_password_auth)
auth_args.update({
'user_domain_name': 'default',
'project_domain_name': 'default',
})
self._make_clientmanager(
auth_args=auth_args,
identity_api_version='3',
auth_plugin_name='v3password',
)
# Use v2.0 auth args
auth_args = {
'auth_url': fakes.AUTH_URL,
'username': fakes.USERNAME,
'password': fakes.PASSWORD,
'tenant_name': fakes.PROJECT_NAME,
}
self._make_clientmanager(
auth_args=auth_args,
identity_api_version='2.0',
)
# Use v3 auth args
auth_args = copy.deepcopy(self.default_password_auth)
auth_args.update({
'user_domain_name': 'default',
'project_domain_name': 'default',
})
self._make_clientmanager(
auth_args=auth_args,
identity_api_version='3',
)
def test_client_manager_select_auth_plugin_token(self):
# test token auth
self._make_clientmanager(
# auth_args=auth_args,
identity_api_version='2.0',
auth_plugin_name='v2token',
)
self._make_clientmanager(
# auth_args=auth_args,
identity_api_version='3',
auth_plugin_name='v3token',
)
self._make_clientmanager(
# auth_args=auth_args,
identity_api_version='x',
auth_plugin_name='token',
)
def test_client_manager_select_auth_plugin_failure(self):
self.assertRaises(
ksa_exceptions.NoMatchingPlugin,
self._make_clientmanager,
identity_api_version='3',
auth_plugin_name='bad_plugin',
)
@mock.patch('osc_lib.api.auth.check_valid_authentication_options')
def test_client_manager_auth_setup_once(self, check_authn_options_func):
loader = loading.get_plugin_loader('password')
auth_plugin = loader.load_from_options(**AUTH_DICT)
client_manager = self._clientmanager_class()(
cli_options=cloud_config.CloudConfig(
name='t1',
region='1',
config=dict(
auth_type='password',
auth=AUTH_DICT,
interface=fakes.INTERFACE,
region_name=fakes.REGION_NAME,
),
auth_plugin=auth_plugin,
),
api_version={
'identity': '2.0',
},
)
self.assertFalse(client_manager._auth_setup_completed)
client_manager.setup_auth()
self.assertTrue(check_authn_options_func.called)
self.assertTrue(client_manager._auth_setup_completed)
# now make sure we don't do auth setup the second time around
# by checking whether check_valid_authentication_options() gets called again
check_authn_options_func.reset_mock()
client_manager.auth_ref
check_authn_options_func.assert_not_called()
def test_client_manager_endpoint_disabled(self):
auth_args = copy.deepcopy(self.default_password_auth)
auth_args.update({
'user_domain_name': 'default',
'project_domain_name': 'default',
})
# v3 fake doesn't have network endpoint
client_manager = self._make_clientmanager(
auth_args=auth_args,
identity_api_version='3',
auth_plugin_name='v3password',
)
self.assertFalse(client_manager.is_service_available('network'))
def test_client_manager_k2k_auth_setup(self):
loader = loading.get_plugin_loader('password')
auth_plugin = loader.load_from_options(**AUTH_DICT)
client_manager = self._clientmanager_class()(
cli_options=cloud_config.CloudConfig(
name='t1',
region='1',
config=dict(
auth_type='password',
auth=AUTH_DICT,
interface=fakes.INTERFACE,
region_name=fakes.REGION_NAME,
service_provider=fakes.SERVICE_PROVIDER_ID,
remote_project_id=fakes.PROJECT_ID
),
auth_plugin=auth_plugin,
),
api_version={
'identity': '3',
},
)
self.assertFalse(client_manager._auth_setup_completed)
client_manager.setup_auth()
# Note(knikolla): Make sure that the auth object is of the correct
# type and that the service_provider is correctly set.
self.assertIsInstance(client_manager.auth, k2k.Keystone2Keystone)
self.assertEqual(client_manager.auth._sp_id, fakes.SERVICE_PROVIDER_ID)
self.assertEqual(client_manager.auth.project_id, fakes.PROJECT_ID)
self.assertTrue(client_manager._auth_setup_completed)
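The ClientCache behaviour asserted in TestClientCache (the factory runs once per owning instance and the result is cached) can be pictured with a sketch like this; SketchClientCache is hypothetical and omits the PluginAttributeError wrapping tested above:

    class SketchClientCache(object):
        """Descriptor that lazily builds, then caches, a client object"""

        def __init__(self, factory):
            self.factory = factory
            self._cache = {}

        def __get__(self, instance, owner):
            if instance is None:
                return self
            if instance not in self._cache:
                # Invoked at most once per owning instance
                self._cache[instance] = self.factory(instance)
            return self._cache[instance]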

View File

@ -1,204 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
import logging
import mock
from osc_lib import logs
from osc_lib.tests import utils
class TestContext(utils.TestCase):
def test_log_level_from_options(self):
opts = mock.Mock()
opts.verbose_level = 0
self.assertEqual(logging.ERROR, logs.log_level_from_options(opts))
opts.verbose_level = 1
self.assertEqual(logging.WARNING, logs.log_level_from_options(opts))
opts.verbose_level = 2
self.assertEqual(logging.INFO, logs.log_level_from_options(opts))
opts.verbose_level = 3
self.assertEqual(logging.DEBUG, logs.log_level_from_options(opts))
def test_log_level_from_config(self):
cfg = {'verbose_level': 0}
self.assertEqual(logging.ERROR, logs.log_level_from_config(cfg))
cfg = {'verbose_level': 1}
self.assertEqual(logging.WARNING, logs.log_level_from_config(cfg))
cfg = {'verbose_level': 2}
self.assertEqual(logging.INFO, logs.log_level_from_config(cfg))
cfg = {'verbose_level': 3}
self.assertEqual(logging.DEBUG, logs.log_level_from_config(cfg))
cfg = {'verbose_level': 1, 'log_level': 'critical'}
self.assertEqual(logging.CRITICAL, logs.log_level_from_config(cfg))
cfg = {'verbose_level': 1, 'log_level': 'error'}
self.assertEqual(logging.ERROR, logs.log_level_from_config(cfg))
cfg = {'verbose_level': 1, 'log_level': 'warning'}
self.assertEqual(logging.WARNING, logs.log_level_from_config(cfg))
cfg = {'verbose_level': 1, 'log_level': 'info'}
self.assertEqual(logging.INFO, logs.log_level_from_config(cfg))
cfg = {'verbose_level': 1, 'log_level': 'debug'}
self.assertEqual(logging.DEBUG, logs.log_level_from_config(cfg))
cfg = {'verbose_level': 1, 'log_level': 'bogus'}
self.assertEqual(logging.WARNING, logs.log_level_from_config(cfg))
cfg = {'verbose_level': 1, 'log_level': 'info', 'debug': True}
self.assertEqual(logging.DEBUG, logs.log_level_from_config(cfg))
@mock.patch('warnings.simplefilter')
def test_set_warning_filter(self, simplefilter):
logs.set_warning_filter(logging.ERROR)
simplefilter.assert_called_with("ignore")
logs.set_warning_filter(logging.WARNING)
simplefilter.assert_called_with("ignore")
logs.set_warning_filter(logging.INFO)
simplefilter.assert_called_with("once")
class TestFileFormatter(utils.TestCase):
def test_nothing(self):
formatter = logs._FileFormatter()
self.assertEqual(('%(asctime)s.%(msecs)03d %(process)d %(levelname)s '
'%(name)s %(message)s'), formatter.fmt)
def test_options(self):
class Opts(object):
cloud = 'cloudy'
os_project_name = 'projecty'
username = 'usernamey'
options = Opts()
formatter = logs._FileFormatter(options=options)
self.assertEqual(('%(asctime)s.%(msecs)03d %(process)d %(levelname)s '
'%(name)s [cloudy usernamey projecty] %(message)s'),
formatter.fmt)
def test_config(self):
config = mock.Mock()
config.config = {'cloud': 'cloudy'}
config.auth = {'project_name': 'projecty', 'username': 'usernamey'}
formatter = logs._FileFormatter(config=config)
self.assertEqual(('%(asctime)s.%(msecs)03d %(process)d %(levelname)s '
'%(name)s [cloudy usernamey projecty] %(message)s'),
formatter.fmt)
class TestLogConfigurator(utils.TestCase):
def setUp(self):
super(TestLogConfigurator, self).setUp()
self.options = mock.Mock()
self.options.verbose_level = 1
self.options.log_file = None
self.options.debug = False
self.root_logger = mock.Mock()
self.root_logger.setLevel = mock.Mock()
self.root_logger.addHandler = mock.Mock()
self.requests_log = mock.Mock()
self.requests_log.setLevel = mock.Mock()
self.cliff_log = mock.Mock()
self.cliff_log.setLevel = mock.Mock()
self.stevedore_log = mock.Mock()
self.stevedore_log.setLevel = mock.Mock()
self.iso8601_log = mock.Mock()
self.iso8601_log.setLevel = mock.Mock()
self.loggers = [
self.root_logger,
self.requests_log,
self.cliff_log,
self.stevedore_log,
self.iso8601_log]
@mock.patch('logging.StreamHandler')
@mock.patch('logging.getLogger')
@mock.patch('osc_lib.logs.set_warning_filter')
def test_init(self, warning_filter, getLogger, handle):
getLogger.side_effect = self.loggers
console_logger = mock.Mock()
console_logger.setFormatter = mock.Mock()
console_logger.setLevel = mock.Mock()
handle.return_value = console_logger
configurator = logs.LogConfigurator(self.options)
getLogger.assert_called_with('iso8601') # last call
warning_filter.assert_called_with(logging.WARNING)
self.root_logger.setLevel.assert_called_with(logging.DEBUG)
self.root_logger.addHandler.assert_called_with(console_logger)
self.requests_log.setLevel.assert_called_with(logging.ERROR)
self.cliff_log.setLevel.assert_called_with(logging.ERROR)
self.stevedore_log.setLevel.assert_called_with(logging.ERROR)
self.iso8601_log.setLevel.assert_called_with(logging.ERROR)
self.assertFalse(configurator.dump_trace)
@mock.patch('logging.getLogger')
@mock.patch('osc_lib.logs.set_warning_filter')
def test_init_no_debug(self, warning_filter, getLogger):
getLogger.side_effect = self.loggers
self.options.debug = True
configurator = logs.LogConfigurator(self.options)
warning_filter.assert_called_with(logging.DEBUG)
self.requests_log.setLevel.assert_called_with(logging.DEBUG)
self.assertTrue(configurator.dump_trace)
@mock.patch('logging.FileHandler')
@mock.patch('logging.getLogger')
@mock.patch('osc_lib.logs.set_warning_filter')
@mock.patch('osc_lib.logs._FileFormatter')
def test_init_log_file(self, formatter, warning_filter, getLogger, handle):
getLogger.side_effect = self.loggers
self.options.log_file = '/tmp/log_file'
file_logger = mock.Mock()
file_logger.setFormatter = mock.Mock()
file_logger.setLevel = mock.Mock()
handle.return_value = file_logger
mock_formatter = mock.Mock()
formatter.return_value = mock_formatter
logs.LogConfigurator(self.options)
handle.assert_called_with(filename=self.options.log_file)
self.root_logger.addHandler.assert_called_with(file_logger)
file_logger.setFormatter.assert_called_with(mock_formatter)
file_logger.setLevel.assert_called_with(logging.WARNING)
@mock.patch('logging.FileHandler')
@mock.patch('logging.getLogger')
@mock.patch('osc_lib.logs.set_warning_filter')
@mock.patch('osc_lib.logs._FileFormatter')
def test_configure(self, formatter, warning_filter, getLogger, handle):
getLogger.side_effect = self.loggers
configurator = logs.LogConfigurator(self.options)
cloud_config = mock.Mock()
config_log = '/tmp/config_log'
cloud_config.config = {
'log_file': config_log,
'verbose_level': 1,
'log_level': 'info'}
file_logger = mock.Mock()
file_logger.setFormatter = mock.Mock()
file_logger.setLevel = mock.Mock()
handle.return_value = file_logger
mock_formatter = mock.Mock()
formatter.return_value = mock_formatter
configurator.configure(cloud_config)
warning_filter.assert_called_with(logging.INFO)
handle.assert_called_with(filename=config_log)
self.root_logger.addHandler.assert_called_with(file_logger)
file_logger.setFormatter.assert_called_with(mock_formatter)
file_logger.setLevel.assert_called_with(logging.INFO)
self.assertFalse(configurator.dump_trace)
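A compact sketch of the verbosity-to-level mapping these tests assert, relying on the logging import at the top of this file (a hypothetical helper, not the osc_lib code):

    _LEVELS = {0: logging.ERROR, 1: logging.WARNING, 2: logging.INFO}

    def sketch_log_level_from_config(cfg):
        # debug=True always wins
        if cfg.get('debug'):
            return logging.DEBUG
        # An explicit log_level name overrides verbose_level;
        # unknown names fall back to WARNING
        name = cfg.get('log_level')
        if name:
            return getattr(logging, name.upper(), logging.WARNING)
        # Otherwise map -v counts: 0=ERROR, 1=WARNING, 2=INFO, 3+=DEBUG
        return _LEVELS.get(cfg.get('verbose_level', 1), logging.DEBUG)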

View File

@ -1,611 +0,0 @@
# Copyright 2012-2013 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
import copy
import mock
import os
import sys
import testtools
from osc_lib import shell
from osc_lib.tests import utils
DEFAULT_AUTH_URL = "http://127.0.0.1:5000/v2.0/"
DEFAULT_PROJECT_ID = "xxxx-yyyy-zzzz"
DEFAULT_PROJECT_NAME = "project"
DEFAULT_DOMAIN_ID = "aaaa-bbbb-cccc"
DEFAULT_DOMAIN_NAME = "default"
DEFAULT_USER_DOMAIN_ID = "aaaa-bbbb-cccc"
DEFAULT_USER_DOMAIN_NAME = "domain"
DEFAULT_PROJECT_DOMAIN_ID = "aaaa-bbbb-cccc"
DEFAULT_PROJECT_DOMAIN_NAME = "domain"
DEFAULT_USERNAME = "username"
DEFAULT_PASSWORD = "password"
DEFAULT_CLOUD = "altocumulus"
DEFAULT_REGION_NAME = "ZZ9_Plural_Z_Alpha"
DEFAULT_TOKEN = "token"
DEFAULT_SERVICE_URL = "http://127.0.0.1:8771/v3.0/"
DEFAULT_AUTH_PLUGIN = "v2password"
DEFAULT_INTERFACE = "internal"
DEFAULT_COMPUTE_API_VERSION = ""
DEFAULT_IDENTITY_API_VERSION = ""
DEFAULT_IMAGE_API_VERSION = ""
DEFAULT_VOLUME_API_VERSION = ""
DEFAULT_NETWORK_API_VERSION = ""
LIB_COMPUTE_API_VERSION = ""
LIB_IDENTITY_API_VERSION = ""
LIB_IMAGE_API_VERSION = ""
LIB_VOLUME_API_VERSION = ""
LIB_NETWORK_API_VERSION = ""
CLOUD_1 = {
'clouds': {
'scc': {
'auth': {
'auth_url': DEFAULT_AUTH_URL,
'project_name': DEFAULT_PROJECT_NAME,
'username': 'zaphod',
},
'region_name': 'occ-cloud,krikkit',
'donut': 'glazed',
'interface': 'public',
}
}
}
CLOUD_2 = {
'clouds': {
'megacloud': {
'cloud': 'megadodo',
'auth': {
'project_name': 'heart-o-gold',
'username': 'zaphod',
},
'region_name': 'occ-cloud,krikkit,occ-env',
'log_file': '/tmp/test_log_file',
'log_level': 'debug',
'cert': 'mycert',
'key': 'mickey',
}
}
}
PUBLIC_1 = {
'public-clouds': {
'megadodo': {
'auth': {
'auth_url': DEFAULT_AUTH_URL,
'project_name': DEFAULT_PROJECT_NAME,
},
'region_name': 'occ-public',
'donut': 'cake',
}
}
}
# The option table values is a tuple of (<value>, <test-opt>, <test-env>)
# where <value> is the test value to use, <test-opt> is True if this option
# should be tested as a CLI option and <test-env> is True of this option
# should be tested as an environment variable.
# Global options that should be parsed before shell.initialize_app() is called
global_options = {
'--os-cloud': (DEFAULT_CLOUD, True, True),
'--os-region-name': (DEFAULT_REGION_NAME, True, True),
'--os-default-domain': (DEFAULT_DOMAIN_NAME, True, True),
'--os-cacert': ('/dev/null', True, True),
'--timing': (True, True, False),
'--os-profile': ('SECRET_KEY', True, True),
'--os-interface': (DEFAULT_INTERFACE, True, True)
}
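# A minimal reading sketch of one table entry (illustrative, derived from the
# comment above): '--os-region-name': (DEFAULT_REGION_NAME, True, True) means
# the value DEFAULT_REGION_NAME is exercised both as the --os-region-name CLI
# option and as the OS_REGION_NAME environment variable, while
# '--timing': (True, True, False) is exercised as a CLI flag only.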
class TestShellArgV(utils.TestShell):
"""Test the deferred help flag"""
def setUp(self):
super(TestShellArgV, self).setUp()
def test_shell_argv(self):
"""Test argv decoding
Python 2 does nothing with argv while Python 3 decodes it into
Unicode before we ever see it. We manually decode when running
under Python 2 so verify that we get the right argv types.
Use the argv supplied by the test runner so we get actual Python
runtime behaviour; we only need to check the type of argv[0]
        which will always be present.
"""
with mock.patch(
"osc_lib.shell.OpenStackShell.run",
self.app,
):
# Ensure type gets through unmolested through shell.main()
argv = sys.argv
shell.main(sys.argv)
self.assertEqual(type(argv[0]), type(self.app.call_args[0][0][0]))
# When shell.main() gets sys.argv itself it should be decoded
shell.main()
self.assertEqual(type(u'x'), type(self.app.call_args[0][0][0]))
class TestShellHelp(utils.TestShell):
"""Test the deferred help flag"""
def setUp(self):
super(TestShellHelp, self).setUp()
self.useFixture(utils.EnvFixture())
@testtools.skip("skip until bug 1444983 is resolved")
def test_help_options(self):
flag = "-h list server"
kwargs = {
"deferred_help": True,
}
with mock.patch(self.app_patch + ".initialize_app", self.app):
_shell, _cmd = utils.make_shell(), flag
utils.fake_execute(_shell, _cmd)
self.assertEqual(
kwargs["deferred_help"],
_shell.options.deferred_help,
)
class TestShellOptions(utils.TestShell):
"""Test the option handling by argparse and os_client_config
This covers getting the CLI options through the initial processing
and validates the arguments to initialize_app() and occ_get_one()
"""
def setUp(self):
super(TestShellOptions, self).setUp()
self.useFixture(utils.EnvFixture())
def test_empty_auth(self):
os.environ = {}
self._assert_initialize_app_arg("", {})
self._assert_cloud_config_arg("", {})
def test_no_options(self):
os.environ = {}
self._assert_initialize_app_arg("", {})
self._assert_cloud_config_arg("", {})
def test_global_options(self):
self._test_options_init_app(global_options)
self._test_options_get_one_cloud(global_options)
def test_global_env(self):
self._test_env_init_app(global_options)
self._test_env_get_one_cloud(global_options)
class TestShellCli(utils.TestShell):
"""Test handling of specific global options
_shell.options is the parsed command line from argparse
_shell.client_manager.* are the values actually used
"""
def setUp(self):
super(TestShellCli, self).setUp()
env = {}
self.useFixture(utils.EnvFixture(env.copy()))
def test_shell_args_no_options(self):
_shell = utils.make_shell()
with mock.patch(
"osc_lib.shell.OpenStackShell.initialize_app",
self.app,
):
utils.fake_execute(_shell, "list user")
self.app.assert_called_with(["list", "user"])
def test_shell_args_tls_options(self):
"""Test the TLS verify and CA cert file options"""
_shell = utils.make_shell()
# Default
utils.fake_execute(_shell, "module list")
self.assertIsNone(_shell.options.verify)
self.assertIsNone(_shell.options.insecure)
self.assertIsNone(_shell.options.cacert)
self.assertTrue(_shell.client_manager.verify)
self.assertIsNone(_shell.client_manager.cacert)
# --verify
utils.fake_execute(_shell, "--verify module list")
self.assertTrue(_shell.options.verify)
self.assertIsNone(_shell.options.insecure)
self.assertIsNone(_shell.options.cacert)
self.assertTrue(_shell.client_manager.verify)
self.assertIsNone(_shell.client_manager.cacert)
# --insecure
utils.fake_execute(_shell, "--insecure module list")
self.assertIsNone(_shell.options.verify)
self.assertTrue(_shell.options.insecure)
self.assertIsNone(_shell.options.cacert)
self.assertFalse(_shell.client_manager.verify)
self.assertIsNone(_shell.client_manager.cacert)
# --os-cacert
utils.fake_execute(_shell, "--os-cacert foo module list")
self.assertIsNone(_shell.options.verify)
self.assertIsNone(_shell.options.insecure)
self.assertEqual('foo', _shell.options.cacert)
self.assertEqual('foo', _shell.client_manager.verify)
self.assertEqual('foo', _shell.client_manager.cacert)
# --os-cacert and --verify
utils.fake_execute(_shell, "--os-cacert foo --verify module list")
self.assertTrue(_shell.options.verify)
self.assertIsNone(_shell.options.insecure)
self.assertEqual('foo', _shell.options.cacert)
self.assertEqual('foo', _shell.client_manager.verify)
self.assertEqual('foo', _shell.client_manager.cacert)
# --os-cacert and --insecure
# NOTE(dtroyer): Per bug https://bugs.launchpad.net/bugs/1447784
# in this combination --insecure now overrides any
# --os-cacert setting, where before --insecure
# was ignored if --os-cacert was set.
utils.fake_execute(_shell, "--os-cacert foo --insecure module list")
self.assertIsNone(_shell.options.verify)
self.assertTrue(_shell.options.insecure)
self.assertEqual('foo', _shell.options.cacert)
self.assertFalse(_shell.client_manager.verify)
self.assertIsNone(_shell.client_manager.cacert)
def test_shell_args_cert_options(self):
"""Test client cert options"""
_shell = utils.make_shell()
# Default
utils.fake_execute(_shell, "module list")
self.assertEqual('', _shell.options.cert)
self.assertEqual('', _shell.options.key)
self.assertIsNone(_shell.client_manager.cert)
# --os-cert
utils.fake_execute(_shell, "--os-cert mycert module list")
self.assertEqual('mycert', _shell.options.cert)
self.assertEqual('', _shell.options.key)
self.assertEqual('mycert', _shell.client_manager.cert)
# --os-key
utils.fake_execute(_shell, "--os-key mickey module list")
self.assertEqual('', _shell.options.cert)
self.assertEqual('mickey', _shell.options.key)
self.assertIsNone(_shell.client_manager.cert)
# --os-cert and --os-key
utils.fake_execute(
_shell,
"--os-cert mycert --os-key mickey module list"
)
self.assertEqual('mycert', _shell.options.cert)
self.assertEqual('mickey', _shell.options.key)
self.assertEqual(('mycert', 'mickey'), _shell.client_manager.cert)
@mock.patch("os_client_config.config.OpenStackConfig._load_config_file")
def test_shell_args_cloud_no_vendor(self, config_mock):
"""Test cloud config options without the vendor file"""
config_mock.return_value = ('file.yaml', copy.deepcopy(CLOUD_1))
_shell = utils.make_shell()
utils.fake_execute(
_shell,
"--os-cloud scc module list",
)
self.assertEqual(
'scc',
_shell.cloud.name,
)
# These come from clouds.yaml
self.assertEqual(
DEFAULT_AUTH_URL,
_shell.cloud.config['auth']['auth_url'],
)
self.assertEqual(
DEFAULT_PROJECT_NAME,
_shell.cloud.config['auth']['project_name'],
)
self.assertEqual(
'zaphod',
_shell.cloud.config['auth']['username'],
)
self.assertEqual(
'occ-cloud',
_shell.cloud.config['region_name'],
)
self.assertEqual(
'occ-cloud',
_shell.client_manager.region_name,
)
self.assertEqual(
'glazed',
_shell.cloud.config['donut'],
)
self.assertEqual(
'public',
_shell.cloud.config['interface'],
)
self.assertIsNone(_shell.cloud.config['cert'])
self.assertIsNone(_shell.cloud.config['key'])
self.assertIsNone(_shell.client_manager.cert)
@mock.patch("os_client_config.config.OpenStackConfig._load_vendor_file")
@mock.patch("os_client_config.config.OpenStackConfig._load_config_file")
def test_shell_args_cloud_public(self, config_mock, public_mock):
"""Test cloud config options with the vendor file"""
config_mock.return_value = ('file.yaml', copy.deepcopy(CLOUD_2))
public_mock.return_value = ('file.yaml', copy.deepcopy(PUBLIC_1))
_shell = utils.make_shell()
utils.fake_execute(
_shell,
"--os-cloud megacloud module list",
)
self.assertEqual(
'megacloud',
_shell.cloud.name,
)
# These come from clouds-public.yaml
self.assertEqual(
DEFAULT_AUTH_URL,
_shell.cloud.config['auth']['auth_url'],
)
self.assertEqual(
'cake',
_shell.cloud.config['donut'],
)
# These come from clouds.yaml
self.assertEqual(
'heart-o-gold',
_shell.cloud.config['auth']['project_name'],
)
self.assertEqual(
'zaphod',
_shell.cloud.config['auth']['username'],
)
self.assertEqual(
'occ-cloud',
_shell.cloud.config['region_name'],
)
self.assertEqual(
'occ-cloud',
_shell.client_manager.region_name,
)
self.assertEqual('mycert', _shell.cloud.config['cert'])
self.assertEqual('mickey', _shell.cloud.config['key'])
self.assertEqual(('mycert', 'mickey'), _shell.client_manager.cert)
@mock.patch("os_client_config.config.OpenStackConfig._load_vendor_file")
@mock.patch("os_client_config.config.OpenStackConfig._load_config_file")
def test_shell_args_precedence(self, config_mock, vendor_mock):
config_mock.return_value = ('file.yaml', copy.deepcopy(CLOUD_2))
vendor_mock.return_value = ('file.yaml', copy.deepcopy(PUBLIC_1))
_shell = utils.make_shell()
# Test command option overriding config file value
utils.fake_execute(
_shell,
"--os-cloud megacloud --os-region-name krikkit module list",
)
self.assertEqual(
'megacloud',
_shell.cloud.name,
)
# These come from clouds-public.yaml
self.assertEqual(
DEFAULT_AUTH_URL,
_shell.cloud.config['auth']['auth_url'],
)
self.assertEqual(
'cake',
_shell.cloud.config['donut'],
)
# These come from clouds.yaml
self.assertEqual(
'heart-o-gold',
_shell.cloud.config['auth']['project_name'],
)
self.assertEqual(
'zaphod',
_shell.cloud.config['auth']['username'],
)
self.assertEqual(
'krikkit',
_shell.cloud.config['region_name'],
)
self.assertEqual(
'krikkit',
_shell.client_manager.region_name,
)
class TestShellCliPrecedence(utils.TestShell):
"""Test option precedencr order"""
def setUp(self):
super(TestShellCliPrecedence, self).setUp()
env = {
'OS_CLOUD': 'megacloud',
'OS_REGION_NAME': 'occ-env',
}
self.useFixture(utils.EnvFixture(env.copy()))
@mock.patch("os_client_config.config.OpenStackConfig._load_vendor_file")
@mock.patch("os_client_config.config.OpenStackConfig._load_config_file")
def test_shell_args_precedence_1(self, config_mock, vendor_mock):
"""Test environment overriding occ"""
config_mock.return_value = ('file.yaml', copy.deepcopy(CLOUD_2))
vendor_mock.return_value = ('file.yaml', copy.deepcopy(PUBLIC_1))
_shell = utils.make_shell()
# Test env var
utils.fake_execute(
_shell,
"module list",
)
self.assertEqual(
'megacloud',
_shell.cloud.name,
)
# These come from clouds-public.yaml
self.assertEqual(
DEFAULT_AUTH_URL,
_shell.cloud.config['auth']['auth_url'],
)
self.assertEqual(
'cake',
_shell.cloud.config['donut'],
)
# These come from clouds.yaml
self.assertEqual(
'heart-o-gold',
_shell.cloud.config['auth']['project_name'],
)
self.assertEqual(
'zaphod',
_shell.cloud.config['auth']['username'],
)
# These come from the environment
self.assertEqual(
'occ-env',
_shell.cloud.config['region_name'],
)
self.assertEqual(
'occ-env',
_shell.client_manager.region_name,
)
@mock.patch("os_client_config.config.OpenStackConfig._load_vendor_file")
@mock.patch("os_client_config.config.OpenStackConfig._load_config_file")
def test_shell_args_precedence_2(self, config_mock, vendor_mock):
"""Test command line overriding environment and occ"""
config_mock.return_value = ('file.yaml', copy.deepcopy(CLOUD_2))
vendor_mock.return_value = ('file.yaml', copy.deepcopy(PUBLIC_1))
_shell = utils.make_shell()
# Test command option overriding config file value
utils.fake_execute(
_shell,
"--os-region-name krikkit list user",
)
self.assertEqual(
'megacloud',
_shell.cloud.name,
)
# These come from clouds-public.yaml
self.assertEqual(
DEFAULT_AUTH_URL,
_shell.cloud.config['auth']['auth_url'],
)
self.assertEqual(
'cake',
_shell.cloud.config['donut'],
)
# These come from clouds.yaml
self.assertEqual(
'heart-o-gold',
_shell.cloud.config['auth']['project_name'],
)
self.assertEqual(
'zaphod',
_shell.cloud.config['auth']['username'],
)
# These come from the command line
self.assertEqual(
'krikkit',
_shell.cloud.config['region_name'],
)
self.assertEqual(
'krikkit',
_shell.client_manager.region_name,
)
@mock.patch("os_client_config.config.OpenStackConfig._load_vendor_file")
@mock.patch("os_client_config.config.OpenStackConfig._load_config_file")
def test_shell_args_precedence_3(self, config_mock, vendor_mock):
"""Test command line overriding environment and occ"""
config_mock.return_value = ('file.yaml', copy.deepcopy(CLOUD_1))
vendor_mock.return_value = ('file.yaml', copy.deepcopy(PUBLIC_1))
_shell = utils.make_shell()
# Test command option overriding config file value
utils.fake_execute(
_shell,
"--os-cloud scc --os-region-name krikkit list user",
)
self.assertEqual(
'scc',
_shell.cloud.name,
)
# These come from clouds-public.yaml
self.assertEqual(
DEFAULT_AUTH_URL,
_shell.cloud.config['auth']['auth_url'],
)
self.assertEqual(
'glazed',
_shell.cloud.config['donut'],
)
# These come from clouds.yaml
self.assertEqual(
DEFAULT_PROJECT_NAME,
_shell.cloud.config['auth']['project_name'],
)
self.assertEqual(
'zaphod',
_shell.cloud.config['auth']['username'],
)
# These come from the command line
self.assertEqual(
'krikkit',
_shell.cloud.config['region_name'],
)
self.assertEqual(
'krikkit',
_shell.client_manager.region_name,
)

View File

@ -1,724 +0,0 @@
# Copyright 2012-2013 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
import time
import uuid
from cliff import columns as cliff_columns
import mock
from osc_lib.cli import format_columns
from osc_lib import exceptions
from osc_lib.tests import fakes
from osc_lib.tests import utils as test_utils
from osc_lib import utils
PASSWORD = "Pa$$w0rd"
WASSPORD = "Wa$$p0rd"
DROWSSAP = "dr0w$$aP"
class FakeOddballResource(fakes.FakeResource):
def get(self, attr):
"""get() is needed for utils.find_resource()"""
if attr == 'id':
return self.id
elif attr == 'name':
return self.name
else:
return None
class TestUtils(test_utils.TestCase):
def _get_test_items(self):
item1 = {'a': 1, 'b': 2}
item2 = {'a': 1, 'b': 3}
item3 = {'a': 2, 'b': 2}
item4 = {'a': 2, 'b': 1}
return [item1, item2, item3, item4]
def test_find_min_match_no_sort(self):
items = self._get_test_items()
sort_str = None
flair = {}
expect_items = items
self.assertEqual(
expect_items,
list(utils.find_min_match(items, sort_str, **flair)),
)
def test_find_min_match_no_flair(self):
items = self._get_test_items()
sort_str = 'b'
flair = {}
expect_items = [items[3], items[0], items[2], items[1]]
self.assertEqual(
expect_items,
utils.find_min_match(items, sort_str, **flair),
)
def test_find_min_match_a2(self):
items = self._get_test_items()
sort_str = 'b'
flair = {'a': 2}
expect_items = [items[3], items[2]]
self.assertEqual(
expect_items,
utils.find_min_match(items, sort_str, **flair),
)
def test_find_min_match_b2(self):
items = self._get_test_items()
sort_str = 'b'
flair = {'b': 2}
expect_items = [items[0], items[2], items[1]]
self.assertEqual(
expect_items,
utils.find_min_match(items, sort_str, **flair),
)
def test_find_min_match_b5(self):
items = self._get_test_items()
sort_str = 'b'
flair = {'b': 5}
expect_items = []
self.assertEqual(
expect_items,
utils.find_min_match(items, sort_str, **flair),
)
def test_find_min_match_a2_b2(self):
items = self._get_test_items()
sort_str = 'b'
flair = {'a': 2, 'b': 2}
expect_items = [items[2]]
self.assertEqual(
expect_items,
utils.find_min_match(items, sort_str, **flair),
)
def test_get_password_good(self):
with mock.patch("getpass.getpass", return_value=PASSWORD):
mock_stdin = mock.Mock()
mock_stdin.isatty = mock.Mock()
mock_stdin.isatty.return_value = True
self.assertEqual(PASSWORD, utils.get_password(mock_stdin))
def test_get_password_bad_once(self):
answers = [PASSWORD, WASSPORD, DROWSSAP, DROWSSAP]
with mock.patch("getpass.getpass", side_effect=answers):
mock_stdin = mock.Mock()
mock_stdin.isatty = mock.Mock()
mock_stdin.isatty.return_value = True
self.assertEqual(DROWSSAP, utils.get_password(mock_stdin))
def test_get_password_no_tty(self):
mock_stdin = mock.Mock()
mock_stdin.isatty = mock.Mock()
mock_stdin.isatty.return_value = False
self.assertRaises(exceptions.CommandError,
utils.get_password,
mock_stdin)
def test_get_password_cntrl_d(self):
with mock.patch("getpass.getpass", side_effect=EOFError()):
mock_stdin = mock.Mock()
mock_stdin.isatty = mock.Mock()
mock_stdin.isatty.return_value = True
self.assertRaises(exceptions.CommandError,
utils.get_password,
mock_stdin)
def test_sort_items_with_one_key(self):
items = self._get_test_items()
sort_str = 'b'
expect_items = [items[3], items[0], items[2], items[1]]
self.assertEqual(expect_items, utils.sort_items(items, sort_str))
def test_sort_items_with_multiple_keys(self):
items = self._get_test_items()
sort_str = 'a,b'
expect_items = [items[0], items[1], items[3], items[2]]
self.assertEqual(expect_items, utils.sort_items(items, sort_str))
def test_sort_items_all_with_direction(self):
items = self._get_test_items()
sort_str = 'a:desc,b:desc'
expect_items = [items[2], items[3], items[1], items[0]]
self.assertEqual(expect_items, utils.sort_items(items, sort_str))
def test_sort_items_some_with_direction(self):
items = self._get_test_items()
sort_str = 'a,b:desc'
expect_items = [items[1], items[0], items[2], items[3]]
self.assertEqual(expect_items, utils.sort_items(items, sort_str))
def test_sort_items_with_object(self):
item1 = mock.Mock(a=1, b=2)
item2 = mock.Mock(a=1, b=3)
item3 = mock.Mock(a=2, b=2)
item4 = mock.Mock(a=2, b=1)
items = [item1, item2, item3, item4]
sort_str = 'b,a'
expect_items = [item4, item1, item3, item2]
self.assertEqual(expect_items, utils.sort_items(items, sort_str))
def test_sort_items_with_empty_key(self):
items = self._get_test_items()
sort_srt = ''
self.assertEqual(items, utils.sort_items(items, sort_srt))
sort_srt = None
self.assertEqual(items, utils.sort_items(items, sort_srt))
def test_sort_items_with_invalid_key(self):
items = self._get_test_items()
sort_str = 'c'
self.assertRaises(exceptions.CommandError,
utils.sort_items,
items, sort_str)
def test_sort_items_with_invalid_direction(self):
items = self._get_test_items()
sort_str = 'a:bad_dir'
self.assertRaises(exceptions.CommandError,
utils.sort_items,
items, sort_str)
@mock.patch.object(time, 'sleep')
def test_wait_for_delete_ok(self, mock_sleep):
# Tests the normal flow that the resource is deleted with a 404 coming
# back on the 2nd iteration of the wait loop.
resource = mock.MagicMock(status='ACTIVE', progress=None)
mock_get = mock.Mock(side_effect=[resource,
exceptions.NotFound(404)])
manager = mock.MagicMock(get=mock_get)
res_id = str(uuid.uuid4())
callback = mock.Mock()
self.assertTrue(utils.wait_for_delete(manager, res_id,
callback=callback))
mock_sleep.assert_called_once_with(5)
callback.assert_called_once_with(0)
@mock.patch.object(time, 'sleep')
def test_wait_for_delete_timeout(self, mock_sleep):
# Tests that we fail if the resource is not deleted before the timeout.
resource = mock.MagicMock(status='ACTIVE')
mock_get = mock.Mock(return_value=resource)
manager = mock.MagicMock(get=mock_get)
res_id = str(uuid.uuid4())
self.assertFalse(utils.wait_for_delete(manager, res_id, sleep_time=1,
timeout=1))
mock_sleep.assert_called_once_with(1)
@mock.patch.object(time, 'sleep')
def test_wait_for_delete_error(self, mock_sleep):
# Tests that we fail if the resource goes to error state while waiting.
resource = mock.MagicMock(status='ERROR')
mock_get = mock.Mock(return_value=resource)
manager = mock.MagicMock(get=mock_get)
res_id = str(uuid.uuid4())
self.assertFalse(utils.wait_for_delete(manager, res_id))
mock_sleep.assert_not_called()
@mock.patch.object(time, 'sleep')
def test_wait_for_delete_error_with_overrides(self, mock_sleep):
# Tests that we fail if the resource is my_status=failed
resource = mock.MagicMock(my_status='FAILED')
mock_get = mock.Mock(return_value=resource)
manager = mock.MagicMock(get=mock_get)
res_id = str(uuid.uuid4())
self.assertFalse(utils.wait_for_delete(manager, res_id,
status_field='my_status',
error_status=['failed']))
mock_sleep.assert_not_called()
@mock.patch.object(time, 'sleep')
def test_wait_for_delete_error_with_overrides_exception(self, mock_sleep):
        # Tests that we succeed when get() raises one of the named exceptions
mock_get = mock.Mock(side_effect=Exception)
manager = mock.MagicMock(get=mock_get)
res_id = str(uuid.uuid4())
self.assertTrue(utils.wait_for_delete(manager, res_id,
exception_name=['Exception']))
mock_sleep.assert_not_called()
@mock.patch.object(time, 'sleep')
def test_wait_for_status_ok(self, mock_sleep):
# Tests the normal flow that the resource is status=active
resource = mock.MagicMock(status='ACTIVE')
status_f = mock.Mock(return_value=resource)
res_id = str(uuid.uuid4())
self.assertTrue(utils.wait_for_status(status_f, res_id,))
mock_sleep.assert_not_called()
@mock.patch.object(time, 'sleep')
def test_wait_for_status_ok_with_overrides(self, mock_sleep):
# Tests the normal flow that the resource is status=complete
resource = mock.MagicMock(my_status='COMPLETE')
status_f = mock.Mock(return_value=resource)
res_id = str(uuid.uuid4())
self.assertTrue(utils.wait_for_status(status_f, res_id,
status_field='my_status',
success_status=['complete']))
mock_sleep.assert_not_called()
@mock.patch.object(time, 'sleep')
def test_wait_for_status_error(self, mock_sleep):
# Tests that we fail if the resource is status=error
resource = mock.MagicMock(status='ERROR')
status_f = mock.Mock(return_value=resource)
res_id = str(uuid.uuid4())
self.assertFalse(utils.wait_for_status(status_f, res_id))
mock_sleep.assert_not_called()
@mock.patch.object(time, 'sleep')
def test_wait_for_status_error_with_overrides(self, mock_sleep):
# Tests that we fail if the resource is my_status=failed
resource = mock.MagicMock(my_status='FAILED')
status_f = mock.Mock(return_value=resource)
res_id = str(uuid.uuid4())
self.assertFalse(utils.wait_for_status(status_f, res_id,
status_field='my_status',
error_status=['failed']))
mock_sleep.assert_not_called()
def test_build_kwargs_dict_value_set(self):
self.assertEqual({'arg_bla': 'bla'},
utils.build_kwargs_dict('arg_bla', 'bla'))
def test_build_kwargs_dict_value_None(self):
self.assertEqual({}, utils.build_kwargs_dict('arg_bla', None))
def test_build_kwargs_dict_value_empty_str(self):
self.assertEqual({}, utils.build_kwargs_dict('arg_bla', ''))
def test_is_ascii_bytes(self):
self.assertFalse(utils.is_ascii(b'\xe2'))
def test_is_ascii_string(self):
self.assertFalse(utils.is_ascii(u'\u2665'))
def test_format_size(self):
self.assertEqual("999", utils.format_size(999))
self.assertEqual("100K", utils.format_size(100000))
self.assertEqual("2M", utils.format_size(2000000))
self.assertEqual(
"16.4M", utils.format_size(16361280)
)
self.assertEqual(
"1.6G", utils.format_size(1576395005)
)
self.assertEqual("0", utils.format_size(None))
def test_backward_compat_col_lister(self):
fake_col_headers = ['ID', 'Name', 'Size']
columns = ['Display Name']
column_map = {'Display Name': 'Name'}
results = utils.backward_compat_col_lister(fake_col_headers,
columns,
column_map)
self.assertIsInstance(results, list)
self.assertIn('Display Name', results)
self.assertNotIn('Name', results)
self.assertIn('ID', results)
self.assertIn('Size', results)
def test_backward_compat_col_lister_no_specify_column(self):
fake_col_headers = ['ID', 'Name', 'Size']
columns = []
column_map = {'Display Name': 'Name'}
results = utils.backward_compat_col_lister(fake_col_headers,
columns,
column_map)
self.assertIsInstance(results, list)
self.assertNotIn('Display Name', results)
self.assertIn('Name', results)
self.assertIn('ID', results)
self.assertIn('Size', results)
def test_backward_compat_col_lister_with_tuple_headers(self):
fake_col_headers = ('ID', 'Name', 'Size')
columns = ['Display Name']
column_map = {'Display Name': 'Name'}
results = utils.backward_compat_col_lister(fake_col_headers,
columns,
column_map)
self.assertIsInstance(results, list)
self.assertIn('Display Name', results)
self.assertNotIn('Name', results)
self.assertIn('ID', results)
self.assertIn('Size', results)
def test_backward_compat_col_showone(self):
fake_object = {'id': 'fake-id',
'name': 'fake-name',
'size': 'fake-size'}
columns = ['display_name']
column_map = {'display_name': 'name'}
results = utils.backward_compat_col_showone(fake_object,
columns,
column_map)
self.assertIsInstance(results, dict)
self.assertIn('display_name', results)
self.assertIn('id', results)
self.assertNotIn('name', results)
self.assertIn('size', results)
def test_backward_compat_col_showone_no_specify_column(self):
fake_object = {'id': 'fake-id',
'name': 'fake-name',
'size': 'fake-size'}
columns = []
column_map = {'display_name': 'name'}
results = utils.backward_compat_col_showone(fake_object,
columns,
column_map)
self.assertIsInstance(results, dict)
self.assertNotIn('display_name', results)
self.assertIn('id', results)
self.assertIn('name', results)
self.assertIn('size', results)
def _test_get_item_properties_with_formatter(self, formatters):
names = ('id', 'attr')
item = fakes.FakeResource(info={'id': 'fake-id', 'attr': ['a', 'b']})
res_id, res_attr = utils.get_item_properties(item, names,
formatters=formatters)
self.assertEqual('fake-id', res_id)
return res_attr
def test_get_item_properties_with_format_func(self):
formatters = {'attr': utils.format_list}
res_attr = self._test_get_item_properties_with_formatter(formatters)
self.assertEqual(utils.format_list(['a', 'b']), res_attr)
def test_get_item_properties_with_formattable_column(self):
formatters = {'attr': format_columns.ListColumn}
res_attr = self._test_get_item_properties_with_formatter(formatters)
self.assertIsInstance(res_attr, format_columns.ListColumn)
def _test_get_dict_properties_with_formatter(self, formatters):
names = ('id', 'attr')
item = {'id': 'fake-id', 'attr': ['a', 'b']}
res_id, res_attr = utils.get_dict_properties(item, names,
formatters=formatters)
self.assertEqual('fake-id', res_id)
return res_attr
def test_get_dict_properties_with_format_func(self):
formatters = {'attr': utils.format_list}
res_attr = self._test_get_dict_properties_with_formatter(formatters)
self.assertEqual(utils.format_list(['a', 'b']), res_attr)
def test_get_dict_properties_with_formattable_column(self):
formatters = {'attr': format_columns.ListColumn}
res_attr = self._test_get_dict_properties_with_formatter(formatters)
self.assertIsInstance(res_attr, format_columns.ListColumn)
class NoUniqueMatch(Exception):
pass
class TestFindResource(test_utils.TestCase):
def setUp(self):
super(TestFindResource, self).setUp()
self.name = 'legos'
self.expected = mock.Mock()
self.manager = mock.Mock()
self.manager.resource_class = mock.Mock()
self.manager.resource_class.__name__ = 'lego'
def test_find_resource_get_int(self):
self.manager.get = mock.Mock(return_value=self.expected)
result = utils.find_resource(self.manager, 1)
self.assertEqual(self.expected, result)
self.manager.get.assert_called_with(1)
def test_find_resource_get_int_string(self):
self.manager.get = mock.Mock(return_value=self.expected)
result = utils.find_resource(self.manager, "2")
self.assertEqual(self.expected, result)
self.manager.get.assert_called_with("2")
def test_find_resource_get_name_and_domain(self):
name = 'admin'
domain_id = '30524568d64447fbb3fa8b7891c10dd6'
# NOTE(stevemar): we need an iterable side-effect because the same
# function (manager.get()) is used twice, the first time an exception
# will happen, then the result will be found, but only after using
# the domain ID as a query arg
side_effect = [Exception('Boom!'), self.expected]
self.manager.get = mock.Mock(side_effect=side_effect)
result = utils.find_resource(self.manager, name, domain_id=domain_id)
self.assertEqual(self.expected, result)
self.manager.get.assert_called_with(name, domain_id=domain_id)
def test_find_resource_get_uuid(self):
uuid = '9a0dc2a0-ad0d-11e3-a5e2-0800200c9a66'
self.manager.get = mock.Mock(return_value=self.expected)
result = utils.find_resource(self.manager, uuid)
self.assertEqual(self.expected, result)
self.manager.get.assert_called_with(uuid)
def test_find_resource_get_whatever(self):
self.manager.get = mock.Mock(return_value=self.expected)
result = utils.find_resource(self.manager, 'whatever')
self.assertEqual(self.expected, result)
self.manager.get.assert_called_with('whatever')
def test_find_resource_find(self):
self.manager.get = mock.Mock(side_effect=Exception('Boom!'))
self.manager.find = mock.Mock(return_value=self.expected)
result = utils.find_resource(self.manager, self.name)
self.assertEqual(self.expected, result)
self.manager.get.assert_called_with(self.name)
self.manager.find.assert_called_with(name=self.name)
def test_find_resource_find_not_found(self):
self.manager.get = mock.Mock(side_effect=Exception('Boom!'))
self.manager.find = mock.Mock(
side_effect=exceptions.NotFound(404, "2")
)
result = self.assertRaises(exceptions.CommandError,
utils.find_resource,
self.manager,
self.name)
self.assertEqual("No lego with a name or ID of 'legos' exists.",
str(result))
self.manager.get.assert_called_with(self.name)
self.manager.find.assert_called_with(name=self.name)
def test_find_resource_list_forbidden(self):
self.manager.get = mock.Mock(side_effect=Exception('Boom!'))
self.manager.find = mock.Mock(side_effect=Exception('Boom!'))
self.manager.list = mock.Mock(
side_effect=exceptions.Forbidden(403)
)
self.assertRaises(exceptions.Forbidden,
utils.find_resource,
self.manager,
self.name)
self.manager.list.assert_called_with()
def test_find_resource_find_no_unique(self):
self.manager.get = mock.Mock(side_effect=Exception('Boom!'))
self.manager.find = mock.Mock(side_effect=NoUniqueMatch())
result = self.assertRaises(exceptions.CommandError,
utils.find_resource,
self.manager,
self.name)
self.assertEqual("More than one lego exists with the name 'legos'.",
str(result))
self.manager.get.assert_called_with(self.name)
self.manager.find.assert_called_with(name=self.name)
def test_find_resource_silly_resource(self):
# We need a resource with no resource_class for this test, start fresh
self.manager = mock.Mock()
self.manager.get = mock.Mock(side_effect=Exception('Boom!'))
self.manager.find = mock.Mock(
side_effect=AttributeError(
"'Controller' object has no attribute 'find'",
)
)
silly_resource = FakeOddballResource(
None,
{'id': '12345', 'name': self.name},
loaded=True,
)
self.manager.list = mock.Mock(
return_value=[silly_resource, ],
)
result = utils.find_resource(self.manager, self.name)
self.assertEqual(silly_resource, result)
self.manager.get.assert_called_with(self.name)
self.manager.find.assert_called_with(name=self.name)
def test_find_resource_silly_resource_not_found(self):
# We need a resource with no resource_class for this test, start fresh
self.manager = mock.Mock()
self.manager.get = mock.Mock(side_effect=Exception('Boom!'))
self.manager.find = mock.Mock(
side_effect=AttributeError(
"'Controller' object has no attribute 'find'",
)
)
self.manager.list = mock.Mock(return_value=[])
result = self.assertRaises(exceptions.CommandError,
utils.find_resource,
self.manager,
self.name)
self.assertEqual("Could not find resource legos",
str(result))
self.manager.get.assert_called_with(self.name)
self.manager.find.assert_called_with(name=self.name)
def test_find_resource_silly_resource_no_unique_match(self):
# We need a resource with no resource_class for this test, start fresh
self.manager = mock.Mock()
self.manager.get = mock.Mock(side_effect=Exception('Boom!'))
self.manager.find = mock.Mock(
side_effect=AttributeError(
"'Controller' object has no attribute 'find'",
)
)
silly_resource = FakeOddballResource(
None,
{'id': '12345', 'name': self.name},
loaded=True,
)
silly_resource_same = FakeOddballResource(
None,
{'id': 'abcde', 'name': self.name},
loaded=True,
)
self.manager.list = mock.Mock(return_value=[silly_resource,
silly_resource_same])
result = self.assertRaises(exceptions.CommandError,
utils.find_resource,
self.manager,
self.name)
self.assertEqual("More than one resource exists "
"with the name or ID 'legos'.", str(result))
self.manager.get.assert_called_with(self.name)
self.manager.find.assert_called_with(name=self.name)
def test_format_dict(self):
expected = "a='b', c='d', e='f'"
self.assertEqual(expected,
utils.format_dict({'a': 'b', 'c': 'd', 'e': 'f'}))
self.assertEqual(expected,
utils.format_dict({'e': 'f', 'c': 'd', 'a': 'b'}))
self.assertIsNone(utils.format_dict(None))
def test_format_dict_of_list(self):
expected = "a=a1, a2; b=b1, b2; c=c1, c2; e="
self.assertEqual(expected,
utils.format_dict_of_list({'a': ['a2', 'a1'],
'b': ['b2', 'b1'],
'c': ['c1', 'c2'],
'd': None,
'e': []})
)
self.assertEqual(expected,
utils.format_dict_of_list({'c': ['c1', 'c2'],
'a': ['a2', 'a1'],
'b': ['b2', 'b1'],
'e': []})
)
self.assertIsNone(utils.format_dict_of_list(None))
def test_format_dict_of_list_with_separator(self):
expected = "a=a1, a2\nb=b1, b2\nc=c1, c2\ne="
self.assertEqual(expected,
utils.format_dict_of_list({'a': ['a2', 'a1'],
'b': ['b2', 'b1'],
'c': ['c1', 'c2'],
'd': None,
'e': []},
separator='\n')
)
self.assertEqual(expected,
utils.format_dict_of_list({'c': ['c1', 'c2'],
'a': ['a2', 'a1'],
'b': ['b2', 'b1'],
'e': []},
separator='\n')
)
self.assertIsNone(utils.format_dict_of_list(None,
separator='\n'))
def test_format_list(self):
expected = 'a, b, c'
self.assertEqual(expected, utils.format_list(['a', 'b', 'c']))
self.assertEqual(expected, utils.format_list(['c', 'b', 'a']))
self.assertIsNone(utils.format_list(None))
def test_format_list_of_dicts(self):
expected = "a='b', c='d'\ne='f'"
sorted_data = [{'a': 'b', 'c': 'd'}, {'e': 'f'}]
unsorted_data = [{'c': 'd', 'a': 'b'}, {'e': 'f'}]
self.assertEqual(expected, utils.format_list_of_dicts(sorted_data))
self.assertEqual(expected, utils.format_list_of_dicts(unsorted_data))
self.assertEqual('', utils.format_list_of_dicts([]))
self.assertEqual('', utils.format_list_of_dicts([{}]))
self.assertIsNone(utils.format_list_of_dicts(None))
def test_format_list_separator(self):
expected = 'a\nb\nc'
actual_pre_sorted = utils.format_list(['a', 'b', 'c'], separator='\n')
actual_unsorted = utils.format_list(['c', 'b', 'a'], separator='\n')
self.assertEqual(expected, actual_pre_sorted)
self.assertEqual(expected, actual_unsorted)
class TestAssertItemEqual(test_utils.TestCommand):
def test_assert_normal_item(self):
expected = ['a', 'b', 'c']
actual = ['a', 'b', 'c']
self.assertItemEqual(expected, actual)
def test_assert_item_with_formattable_columns(self):
expected = [format_columns.DictColumn({'a': 1, 'b': 2}),
format_columns.ListColumn(['x', 'y', 'z'])]
actual = [format_columns.DictColumn({'a': 1, 'b': 2}),
format_columns.ListColumn(['x', 'y', 'z'])]
self.assertItemEqual(expected, actual)
def test_assert_item_different_length(self):
expected = ['a', 'b', 'c']
actual = ['a', 'b']
self.assertRaises(AssertionError,
self.assertItemEqual, expected, actual)
def test_assert_item_formattable_columns_vs_legacy_formatter(self):
expected = [format_columns.DictColumn({'a': 1, 'b': 2}),
format_columns.ListColumn(['x', 'y', 'z'])]
actual = [utils.format_dict({'a': 1, 'b': 2}),
utils.format_list(['x', 'y', 'z'])]
self.assertRaises(AssertionError,
self.assertItemEqual, expected, actual)
def test_assert_item_different_formattable_columns(self):
class ExceptionColumn(cliff_columns.FormattableColumn):
def human_readable(self):
raise Exception('always fail')
expected = [format_columns.DictColumn({'a': 1, 'b': 2})]
actual = [ExceptionColumn({'a': 1, 'b': 2})]
# AssertionError is a subclass of Exception
# so raising AssertionError ensures ExceptionColumn.human_readable()
# is not called.
self.assertRaises(AssertionError,
self.assertItemEqual, expected, actual)
def test_assert_list_item(self):
expected = [
['a', 'b', 'c'],
[format_columns.DictColumn({'a': 1, 'b': 2}),
format_columns.ListColumn(['x', 'y', 'z'])]
]
actual = [
['a', 'b', 'c'],
[format_columns.DictColumn({'a': 1, 'b': 2}),
format_columns.ListColumn(['x', 'y', 'z'])]
]
self.assertListItemEqual(expected, actual)

View File

@ -1,407 +0,0 @@
# Copyright 2012-2013 OpenStack Foundation
# Copyright 2013 Nebula Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
import copy
import json as jsonutils
import mock
import os
from cliff import columns as cliff_columns
import fixtures
from keystoneauth1 import loading
from os_client_config import cloud_config
from oslo_utils import importutils
from requests_mock.contrib import fixture
import testtools
from osc_lib import clientmanager
from osc_lib import shell
from osc_lib.tests import fakes
def fake_execute(shell, cmd):
"""Pretend to execute shell commands."""
return shell.run(cmd.split())
def make_shell(shell_class=None):
"""Create a new command shell and mock out some bits."""
if shell_class is None:
shell_class = shell.OpenStackShell
_shell = shell_class()
_shell.command_manager = mock.Mock()
# _shell.cloud = mock.Mock()
return _shell
def opt2attr(opt):
if opt.startswith('--os-'):
attr = opt[5:]
elif opt.startswith('--'):
attr = opt[2:]
else:
attr = opt
return attr.lower().replace('-', '_')
def opt2env(opt):
return opt[2:].upper().replace('-', '_')
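# Illustrative examples of the two helpers above (not part of the original
# module); the results follow directly from the string transforms:
#
#     opt2attr('--os-region-name')  ->  'region_name'
#     opt2attr('--timing')          ->  'timing'
#     opt2env('--os-region-name')   ->  'OS_REGION_NAME'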
class EnvFixture(fixtures.Fixture):
"""Environment Fixture.
This fixture replaces os.environ with provided env or an empty env.
"""
def __init__(self, env=None):
self.new_env = env or {}
def _setUp(self):
self.orig_env, os.environ = os.environ, self.new_env
self.addCleanup(self.revert)
def revert(self):
os.environ = self.orig_env
class ParserException(Exception):
pass
class TestCase(testtools.TestCase):
def setUp(self):
testtools.TestCase.setUp(self)
if (os.environ.get("OS_STDOUT_CAPTURE") == "True" or
os.environ.get("OS_STDOUT_CAPTURE") == "1"):
stdout = self.useFixture(fixtures.StringStream("stdout")).stream
self.useFixture(fixtures.MonkeyPatch("sys.stdout", stdout))
if (os.environ.get("OS_STDERR_CAPTURE") == "True" or
os.environ.get("OS_STDERR_CAPTURE") == "1"):
stderr = self.useFixture(fixtures.StringStream("stderr")).stream
self.useFixture(fixtures.MonkeyPatch("sys.stderr", stderr))
def assertNotCalled(self, m, msg=None):
"""Assert a function was not called"""
if m.called:
if not msg:
msg = 'method %s should not have been called' % m
self.fail(msg)
class TestCommand(TestCase):
"""Test cliff command classes"""
def setUp(self):
super(TestCommand, self).setUp()
# Build up a fake app
self.fake_stdout = fakes.FakeStdout()
self.fake_log = fakes.FakeLog()
self.app = fakes.FakeApp(self.fake_stdout, self.fake_log)
self.app.client_manager = fakes.FakeClientManager()
def check_parser(self, cmd, args, verify_args):
cmd_parser = cmd.get_parser('check_parser')
try:
parsed_args = cmd_parser.parse_args(args)
except SystemExit:
raise ParserException("Argument parse failed")
for av in verify_args:
attr, value = av
if attr:
self.assertIn(attr, parsed_args)
self.assertEqual(value, getattr(parsed_args, attr))
return parsed_args
def assertItemEqual(self, expected, actual):
"""Compare item considering formattable columns.
This method compares an observed item to an expected item column by
column. If a column is a formattable column, observed and expected
columns are compared using human_readable() and machine_readable().
"""
self.assertEqual(len(expected), len(actual))
for col_expected, col_actual in zip(expected, actual):
if isinstance(col_expected, cliff_columns.FormattableColumn):
self.assertIsInstance(col_actual, col_expected.__class__)
self.assertEqual(col_expected.human_readable(),
col_actual.human_readable())
self.assertEqual(col_expected.machine_readable(),
col_actual.machine_readable())
else:
self.assertEqual(col_expected, col_actual)
def assertListItemEqual(self, expected, actual):
"""Compare a list of items considering formattable columns.
Each pair of observed and expected items are compared
using assertItemEqual() method.
"""
self.assertEqual(len(expected), len(actual))
for item_expected, item_actual in zip(expected, actual):
self.assertItemEqual(item_expected, item_actual)
class TestClientManager(TestCase):
"""ClientManager class test framework"""
default_password_auth = {
'auth_url': fakes.AUTH_URL,
'username': fakes.USERNAME,
'password': fakes.PASSWORD,
'project_name': fakes.PROJECT_NAME,
}
default_token_auth = {
'auth_url': fakes.AUTH_URL,
'token': fakes.AUTH_TOKEN,
}
def setUp(self):
super(TestClientManager, self).setUp()
self.mock = mock.Mock()
self.requests = self.useFixture(fixture.Fixture())
# fake v2password token retrieval
self.stub_auth(json=fakes.TEST_RESPONSE_DICT)
# fake token and token_endpoint retrieval
self.stub_auth(json=fakes.TEST_RESPONSE_DICT,
url='/'.join([fakes.AUTH_URL, 'v2.0/tokens']))
# fake v3password token retrieval
self.stub_auth(json=fakes.TEST_RESPONSE_DICT_V3,
url='/'.join([fakes.AUTH_URL, 'v3/auth/tokens']))
# fake password token retrieval
self.stub_auth(json=fakes.TEST_RESPONSE_DICT_V3,
url='/'.join([fakes.AUTH_URL, 'auth/tokens']))
# fake password version endpoint discovery
self.stub_auth(json=fakes.TEST_VERSIONS,
url=fakes.AUTH_URL,
verb='GET')
# Mock the auth plugin
self.auth_mock = mock.Mock()
def stub_auth(self, json=None, url=None, verb=None, **kwargs):
subject_token = fakes.AUTH_TOKEN
base_url = fakes.AUTH_URL
if json:
text = jsonutils.dumps(json)
headers = {
'X-Subject-Token': subject_token,
'Content-Type': 'application/json',
}
if not url:
url = '/'.join([base_url, 'tokens'])
url = url.replace("/?", "?")
if not verb:
verb = 'POST'
self.requests.register_uri(
verb,
url,
headers=headers,
text=text,
)
def _clientmanager_class(self):
"""Allow subclasses to override the ClientManager class"""
return clientmanager.ClientManager
def _make_clientmanager(
self,
auth_args=None,
config_args=None,
identity_api_version=None,
auth_plugin_name=None,
auth_required=None,
):
if identity_api_version is None:
identity_api_version = '2.0'
if auth_plugin_name is None:
auth_plugin_name = 'password'
if auth_plugin_name.endswith('password'):
auth_dict = copy.deepcopy(self.default_password_auth)
elif auth_plugin_name.endswith('token'):
auth_dict = copy.deepcopy(self.default_token_auth)
else:
auth_dict = {}
if auth_args is not None:
auth_dict = auth_args
cli_options = {
'auth_type': auth_plugin_name,
'auth': auth_dict,
'interface': fakes.INTERFACE,
'region_name': fakes.REGION_NAME,
}
if config_args is not None:
cli_options.update(config_args)
loader = loading.get_plugin_loader(auth_plugin_name)
auth_plugin = loader.load_from_options(**auth_dict)
client_manager = self._clientmanager_class()(
cli_options=cloud_config.CloudConfig(
name='t1',
region='1',
config=cli_options,
auth_plugin=auth_plugin,
),
api_version={
'identity': identity_api_version,
},
)
client_manager._auth_required = auth_required is True
client_manager.setup_auth()
client_manager.auth_ref
self.assertEqual(
auth_plugin_name,
client_manager.auth_plugin_name,
)
return client_manager
class TestShell(TestCase):
# Full name of the OpenStackShell class to test (cliff.app.App subclass)
shell_class_name = "osc_lib.shell.OpenStackShell"
def setUp(self):
super(TestShell, self).setUp()
self.shell_class = importutils.import_class(self.shell_class_name)
self.cmd_patch = mock.patch(self.shell_class_name + ".run_subcommand")
self.cmd_save = self.cmd_patch.start()
self.addCleanup(self.cmd_patch.stop)
self.app = mock.Mock("Test Shell")
def _assert_initialize_app_arg(self, cmd_options, default_args):
"""Check the args passed to initialize_app()
The argv argument to initialize_app() is the remainder from parsing
global options declared in both cliff.app and
osc_lib.OpenStackShell build_option_parser(). Any global
options passed on the command line should not be in argv but in
_shell.options.
"""
with mock.patch(
self.shell_class_name + ".initialize_app",
self.app,
):
_shell = make_shell(shell_class=self.shell_class)
_cmd = cmd_options + " module list"
fake_execute(_shell, _cmd)
self.app.assert_called_with(["module", "list"])
for k in default_args.keys():
self.assertEqual(
default_args[k],
vars(_shell.options)[k],
"%s does not match" % k,
)
def _assert_cloud_config_arg(self, cmd_options, default_args):
"""Check the args passed to cloud_config.get_one_cloud()
The argparse argument to get_one_cloud() is an argparse.Namespace
object that contains all of the options processed to this point in
initialize_app().
"""
cloud = mock.Mock(name="cloudy")
cloud.config = {}
self.occ_get_one = mock.Mock(return_value=cloud)
with mock.patch(
"os_client_config.config.OpenStackConfig.get_one_cloud",
self.occ_get_one,
):
_shell = make_shell(shell_class=self.shell_class)
_cmd = cmd_options + " module list"
fake_execute(_shell, _cmd)
self.app.assert_called_with(["module", "list"])
opts = self.occ_get_one.call_args[1]['argparse']
for k in default_args.keys():
self.assertEqual(
default_args[k],
vars(opts)[k],
"%s does not match" % k,
)
def _test_options_init_app(self, test_opts):
"""Test options on the command line"""
for opt in test_opts.keys():
if not test_opts[opt][1]:
continue
key = opt2attr(opt)
if isinstance(test_opts[opt][0], str):
cmd = opt + " " + test_opts[opt][0]
else:
cmd = opt
kwargs = {
key: test_opts[opt][0],
}
self._assert_initialize_app_arg(cmd, kwargs)
def _test_env_init_app(self, test_opts):
"""Test options in the environment"""
for opt in test_opts.keys():
if not test_opts[opt][2]:
continue
key = opt2attr(opt)
kwargs = {
key: test_opts[opt][0],
}
env = {
opt2env(opt): test_opts[opt][0],
}
os.environ = env.copy()
self._assert_initialize_app_arg("", kwargs)
def _test_options_get_one_cloud(self, test_opts):
"""Test options sent "to os_client_config"""
for opt in test_opts.keys():
if not test_opts[opt][1]:
continue
key = opt2attr(opt)
if isinstance(test_opts[opt][0], str):
cmd = opt + " " + test_opts[opt][0]
else:
cmd = opt
kwargs = {
key: test_opts[opt][0],
}
self._assert_cloud_config_arg(cmd, kwargs)
def _test_env_get_one_cloud(self, test_opts):
"""Test environment options sent "to os_client_config"""
for opt in test_opts.keys():
if not test_opts[opt][2]:
continue
key = opt2attr(opt)
kwargs = {
key: test_opts[opt][0],
}
env = {
opt2env(opt): test_opts[opt][0],
}
os.environ = env.copy()
self._assert_cloud_config_arg("", kwargs)

View File

@ -1,637 +0,0 @@
# Copyright 2012-2013 OpenStack Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Common client utilities"""
import copy
import getpass
import logging
import os
import six
import time
import warnings
from cliff import columns as cliff_columns
from oslo_utils import importutils
from osc_lib import exceptions
from osc_lib.i18n import _
LOG = logging.getLogger(__name__)
def backward_compat_col_lister(column_headers, columns, column_map):
"""Convert the column headers to keep column backward compatibility.
    Replace the new column name in the column headers with the old name, so
    that the old column name can still be selected via the --column/-c
    option, like: volume list -c 'Display Name'
:param column_headers: The column headers to be output in list command.
:param columns: The columns to be output.
:param column_map: The key of map is old column name, the value is new
column name, like: {'old_col': 'new_col'}
"""
if not columns:
return column_headers
# NOTE(RuiChen): column_headers may be a tuple in some code, like:
# volume v1, convert it to a list in order to change
# the column name.
column_headers = list(column_headers)
for old_col, new_col in six.iteritems(column_map):
if old_col in columns:
LOG.warning(_('The column "%(old_column)s" was deprecated, '
                          'please use "%(new_column)s" instead.') % {
'old_column': old_col,
'new_column': new_col}
)
if new_col in column_headers:
column_headers[column_headers.index(new_col)] = old_col
return column_headers
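# A minimal usage sketch (illustrative, not in the original module); the
# expected result mirrors the assertions in osc_lib/tests/test_utils.py:
#
#     backward_compat_col_lister(
#         ['ID', 'Name', 'Size'], ['Display Name'], {'Display Name': 'Name'})
#     ->  ['ID', 'Display Name', 'Size']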
def backward_compat_col_showone(show_object, columns, column_map):
"""Convert the output object to keep column backward compatibility.
    Replace the new column name in the output object with the old name, so
    that the old column name can still be selected via the --column/-c
    option, like: volume show -c 'display_name'
:param show_object: The object to be output in create/show commands.
:param columns: The columns to be output.
:param column_map: The key of map is old column name, the value is new
column name, like: {'old_col': 'new_col'}
"""
if not columns:
return show_object
show_object = copy.deepcopy(show_object)
for old_col, new_col in six.iteritems(column_map):
if old_col in columns:
LOG.warning(_('The column "%(old_column)s" was deprecated, '
                          'please use "%(new_column)s" instead.') % {
'old_column': old_col,
'new_column': new_col}
)
if new_col in show_object:
show_object.update({old_col: show_object.pop(new_col)})
return show_object
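# A minimal usage sketch (illustrative, not in the original module); the
# expected result mirrors osc_lib/tests/test_utils.py:
#
#     backward_compat_col_showone(
#         {'id': 'x', 'name': 'n', 'size': 's'},
#         ['display_name'], {'display_name': 'name'})
#     ->  {'id': 'x', 'display_name': 'n', 'size': 's'}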
def build_kwargs_dict(arg_name, value):
"""Return a dictionary containing `arg_name` if `value` is set."""
kwargs = {}
if value:
kwargs[arg_name] = value
return kwargs
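# Illustrative sketch (not in the original module), mirroring the unit tests:
#
#     build_kwargs_dict('arg_bla', 'bla')  ->  {'arg_bla': 'bla'}
#     build_kwargs_dict('arg_bla', None)   ->  {}
#     build_kwargs_dict('arg_bla', '')     ->  {}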
def env(*vars, **kwargs):
"""Search for the first defined of possibly many env vars
Returns the first environment variable defined in vars, or
returns the default defined in kwargs.
"""
for v in vars:
value = os.environ.get(v, None)
if value:
return value
return kwargs.get('default', '')
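# Illustrative sketch (not in the original module); the variable names are
# hypothetical: env('OS_USERNAME', 'USERNAME', default='nobody') returns the
# value of the first variable that is set and non-empty, otherwise 'nobody'.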
def find_min_match(items, sort_attr, **kwargs):
"""Find all resources meeting the given minimum constraints
:param items: A List of objects to consider
:param sort_attr: Attribute to sort the resulting list
:param kwargs: A dict of attributes and their minimum values
    :rtype: A list of resources sorted by sort_attr that meet the minimums
"""
def minimum_pieces_of_flair(item):
"""Find lowest value greater than the minumum"""
result = True
for k in kwargs:
# AND together all of the given attribute results
result = result and kwargs[k] <= get_field(item, k)
return result
return sort_items(filter(minimum_pieces_of_flair, items), sort_attr)
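# A minimal usage sketch (illustrative, not in the original module); the
# expected result mirrors osc_lib/tests/test_utils.py:
#
#     items = [{'a': 1, 'b': 2}, {'a': 1, 'b': 3},
#              {'a': 2, 'b': 2}, {'a': 2, 'b': 1}]
#     find_min_match(items, 'b', a=2)
#     ->  [{'a': 2, 'b': 1}, {'a': 2, 'b': 2}]   # a >= 2, sorted by 'b'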
def find_resource(manager, name_or_id, **kwargs):
"""Helper for the _find_* methods.
:param manager: A client manager class
:param name_or_id: The resource we are trying to find
:param kwargs: To be used in calling .find()
:rtype: The found resource
This method will attempt to find a resource in a variety of ways.
Primarily .get() methods will be called with `name_or_id` as an integer
value, and tried again as a string value.
If both fail, then a .find() is attempted, which is essentially calling
a .list() function with a 'name' query parameter that is set to
`name_or_id`.
Lastly, if any kwargs are passed in, they will be treated as additional
query parameters. This is particularly handy in the case of finding
resources in a domain.
"""
# Case 1: name_or_id is an ID, we need to call get() directly
# for example: /projects/454ad1c743e24edcad846d1118837cac
# For some projects, the name only will work. For keystone, this is not
# enough information, and domain information is necessary.
try:
return manager.get(name_or_id)
except Exception:
pass
if kwargs:
# Case 2: name_or_id is a name, but we have query args in kwargs
# for example: /projects/demo&domain_id=30524568d64447fbb3fa8b7891c10dd
try:
return manager.get(name_or_id, **kwargs)
except Exception:
pass
# Case 3: Try to get entity as integer id. Keystone does not have integer
# IDs, they are UUIDs, but some things in nova do, like flavors.
try:
if isinstance(name_or_id, int) or name_or_id.isdigit():
return manager.get(int(name_or_id), **kwargs)
# FIXME(dtroyer): The exception to catch here is dependent on which
# client library the manager passed in belongs to.
# Eventually this should be pulled from a common set
# of client exceptions.
except Exception as ex:
if (type(ex).__name__ == 'NotFound' or
type(ex).__name__ == 'HTTPNotFound' or
type(ex).__name__ == 'TypeError'):
pass
else:
raise
# Case 4: Try to use find.
# Reset the kwargs here for find
if len(kwargs) == 0:
kwargs = {}
try:
# Prepare the kwargs for calling find
if 'NAME_ATTR' in manager.resource_class.__dict__:
# novaclient does this for oddball resources
kwargs[manager.resource_class.NAME_ATTR] = name_or_id
else:
kwargs['name'] = name_or_id
except Exception:
pass
# finally try to find entity by name
try:
return manager.find(**kwargs)
# FIXME(dtroyer): The exception to catch here is dependent on which
# client library the manager passed in belongs to.
# Eventually this should be pulled from a common set
# of client exceptions.
except Exception as ex:
if type(ex).__name__ == 'NotFound':
msg = _(
"No %(resource)s with a name or ID of '%(id)s' exists."
)
raise exceptions.CommandError(msg % {
'resource': manager.resource_class.__name__.lower(),
'id': name_or_id,
})
if type(ex).__name__ == 'NoUniqueMatch':
msg = _(
"More than one %(resource)s exists with the name '%(id)s'."
)
raise exceptions.CommandError(msg % {
'resource': manager.resource_class.__name__.lower(),
'id': name_or_id,
})
else:
pass
# Case 5: For client with no find function, list all resources and hope
# to find a matching name or ID.
count = 0
for resource in manager.list():
if (resource.get('id') == name_or_id or
resource.get('name') == name_or_id):
count += 1
_resource = resource
if count == 0:
# we found no match, report back this error:
msg = _("Could not find resource %s")
raise exceptions.CommandError(msg % name_or_id)
elif count == 1:
return _resource
else:
# we found multiple matches, report back this error
msg = _("More than one resource exists with the name or ID '%s'.")
raise exceptions.CommandError(msg % name_or_id)
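# A condensed sketch of the lookup order implemented above (illustrative, not
# in the original module):
#
#     find_resource(manager, 'demo')
#     # 1. manager.get('demo'), then manager.get('demo', **kwargs)
#     # 2. manager.get(int('demo')) when the value looks like an integer ID
#     # 3. manager.find(name='demo') (or NAME_ATTR for oddball resources)
#     # 4. scan manager.list() for a matching 'id' or 'name'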
def format_dict(data):
"""Return a formatted string of key value pairs
:param data: a dict
:rtype: a string formatted to key='value'
"""
if data is None:
return None
output = ""
for s in sorted(data):
output = output + s + "='" + six.text_type(data[s]) + "', "
return output[:-2]
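# Example use of format_dict (an illustrative sketch, not part of the
# original module):
#
#   >>> format_dict({'ram': 512, 'vcpus': 1})
#   "ram='512', vcpus='1'"
#   >>> format_dict(None) is None
#   True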
def format_dict_of_list(data, separator='; '):
"""Return a formatted string of key value pair
:param data: a dict, key is string, value is a list of string, for example:
{u'public': [u'2001:db8::8', u'172.24.4.6']}
:param separator: the separator to use between key/value pair
(default: '; ')
:return: a string formatted to {'key1'=['value1', 'value2']} with separated
by separator
"""
if data is None:
return None
output = []
for key in sorted(data):
value = data[key]
if value is None:
continue
value_str = format_list(value)
group = "%s=%s" % (key, value_str)
output.append(group)
return separator.join(output)
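# Example use of format_dict_of_list (an illustrative sketch, not part of
# the original module):
#
#   >>> format_dict_of_list({'public': ['2001:db8::8', '172.24.4.6']})
#   'public=172.24.4.6, 2001:db8::8'
#   >>> format_dict_of_list({'a': ['x'], 'b': ['y']}, separator=' | ')
#   'a=x | b=y'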
def format_list(data, separator=', '):
"""Return a formatted strings
:param data: a list of strings
:param separator: the separator to use between strings (default: ', ')
:rtype: a string formatted based on separator
"""
if data is None:
return None
return separator.join(sorted(data))
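# Example use of format_list (an illustrative sketch, not part of the
# original module):
#
#   >>> format_list(['zebra', 'apple', 'mango'])
#   'apple, mango, zebra'
#   >>> format_list(['a', 'b'], separator='; ')
#   'a; b'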
def format_list_of_dicts(data):
"""Return a formatted string of key value pairs for each dict
:param data: a list of dicts
:rtype: a string formatted to key='value' with dicts separated by new line
"""
if data is None:
return None
return '\n'.join(format_dict(i) for i in data)
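# Example use of format_list_of_dicts (an illustrative sketch, not part of
# the original module):
#
#   >>> format_list_of_dicts([{'id': 1, 'name': 'a'}, {'id': 2, 'name': 'b'}])
#   "id='1', name='a'\nid='2', name='b'"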
def format_size(size):
"""Display size of a resource in a human readable format
:param string size:
The size of the resource in bytes.
:returns:
Returns the size in human-friendly format
    :rtype: string
This function converts the size (provided in bytes) of a resource
into a human-friendly format such as K, M, G, T, P, E, Z
"""
suffix = ['', 'K', 'M', 'G', 'T', 'P', 'E', 'Z']
base = 1000.0
index = 0
if size is None:
size = 0
while size >= base:
index = index + 1
size = size / base
padded = '%.1f' % size
stripped = padded.rstrip('0').rstrip('.')
return '%s%s' % (stripped, suffix[index])
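# Example use of format_size (an illustrative sketch, not part of the
# original module):
#
#   >>> format_size(1590)
#   '1.6K'
#   >>> format_size(1000000)
#   '1M'
#   >>> format_size(None)
#   '0'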
def get_client_class(api_name, version, version_map):
"""Returns the client class for the requested API version
:param api_name: the name of the API, e.g. 'compute', 'image', etc
:param version: the requested API version
:param version_map: a dict of client classes keyed by version
:rtype: a client class for the requested API version
"""
try:
client_path = version_map[str(version)]
except (KeyError, ValueError):
sorted_versions = sorted(version_map.keys(),
key=lambda s: list(map(int, s.split('.'))))
msg = _(
"Invalid %(api_name)s client version '%(version)s'. "
"must be one of: %(version_map)s"
)
raise exceptions.UnsupportedVersion(msg % {
'api_name': api_name,
'version': version,
'version_map': ', '.join(sorted_versions),
})
return importutils.import_class(client_path)
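# Example use of get_client_class (an illustrative sketch, not part of the
# original module; the stdlib paths stand in for real client class paths
# such as 'xxxclient.v2.client.Client'):
#
#   version_map = {'1.0': 'json.JSONDecoder', '2.0': 'json.JSONEncoder'}
#   client_class = get_client_class('example', '2.0', version_map)
#   # Requesting an unknown version such as '3.0' raises
#   # exceptions.UnsupportedVersion listing '1.0, 2.0' as the valid choices.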
def get_dict_properties(item, fields, mixed_case_fields=None, formatters=None):
"""Return a tuple containing the item properties.
:param item: a single dict resource
:param fields: tuple of strings with the desired field names
:param mixed_case_fields: tuple of field names to preserve case
:param formatters: dictionary mapping field names to callables
to format the values
"""
if mixed_case_fields is None:
mixed_case_fields = []
if formatters is None:
formatters = {}
row = []
for field in fields:
if field in mixed_case_fields:
field_name = field.replace(' ', '_')
else:
field_name = field.lower().replace(' ', '_')
data = item[field_name] if field_name in item else ''
if field in formatters:
formatter = formatters[field]
if issubclass(formatter, cliff_columns.FormattableColumn):
data = formatter(data)
else:
warnings.warn(
'The usage of formatter functions is now discouraged. '
'Consider using cliff.columns.FormattableColumn instead. '
'See reviews linked with bug 1687955 for more detail.',
category=DeprecationWarning)
if data is not None:
data = formatter(data)
row.append(data)
return tuple(row)
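# Example use of get_dict_properties (an illustrative sketch, not part of
# the original module):
#
#   >>> item = {'id': 'abc123', 'fixed_ip_address': '10.0.0.5',
#   ...         'status': 'ACTIVE'}
#   >>> get_dict_properties(item, ('ID', 'Fixed IP Address', 'Status'))
#   ('abc123', '10.0.0.5', 'ACTIVE')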
def get_effective_log_level():
"""Returns the lowest logging level considered by logging handlers
Retrieve and return the smallest log level set among the root
logger's handlers (in case of multiple handlers).
"""
root_log = logging.getLogger()
min_log_lvl = logging.CRITICAL
for handler in root_log.handlers:
min_log_lvl = min(min_log_lvl, handler.level)
return min_log_lvl
def get_field(item, field):
try:
if isinstance(item, dict):
return item[field]
else:
return getattr(item, field)
except Exception:
msg = _("Resource doesn't have field %s")
raise exceptions.CommandError(msg % field)
def get_item_properties(item, fields, mixed_case_fields=None, formatters=None):
"""Return a tuple containing the item properties.
:param item: a single item resource (e.g. Server, Project, etc)
:param fields: tuple of strings with the desired field names
:param mixed_case_fields: tuple of field names to preserve case
:param formatters: dictionary mapping field names to callables
to format the values
"""
if mixed_case_fields is None:
mixed_case_fields = []
if formatters is None:
formatters = {}
row = []
for field in fields:
if field in mixed_case_fields:
field_name = field.replace(' ', '_')
else:
field_name = field.lower().replace(' ', '_')
data = getattr(item, field_name, '')
if field in formatters:
formatter = formatters[field]
if issubclass(formatter, cliff_columns.FormattableColumn):
data = formatter(data)
else:
warnings.warn(
'The usage of formatter functions is now discouraged. '
'Consider using cliff.columns.FormattableColumn instead. '
'See reviews linked with bug 1687955 for more detail.',
category=DeprecationWarning)
if data is not None:
data = formatter(data)
row.append(data)
return tuple(row)
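# Example use of get_item_properties (an illustrative sketch, not part of
# the original module; the namedtuple stands in for a real resource object
# such as a Server):
#
#   >>> import collections
#   >>> Server = collections.namedtuple('Server', ['id', 'name', 'status'])
#   >>> get_item_properties(Server('abc123', 'web-1', 'ACTIVE'),
#   ...                     ('Name', 'Status'))
#   ('web-1', 'ACTIVE')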
def get_password(stdin, prompt=None, confirm=True):
message = prompt or "User Password:"
if hasattr(stdin, 'isatty') and stdin.isatty():
try:
while True:
first_pass = getpass.getpass(message)
if not confirm:
return first_pass
second_pass = getpass.getpass("Repeat " + message)
if first_pass == second_pass:
return first_pass
msg = _("The passwords entered were not the same")
print(msg)
        except EOFError:  # Ctrl-D
msg = _("Error reading password")
raise exceptions.CommandError(msg)
msg = _("No terminal detected attempting to read password")
raise exceptions.CommandError(msg)
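# Example use of get_password (an illustrative sketch, not part of the
# original module; it prompts twice on a TTY and raises CommandError when
# stdin is not a terminal):
#
#   import sys
#   password = get_password(sys.stdin, prompt='New Password:', confirm=True)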
def is_ascii(string):
try:
(string.decode('ascii') if isinstance(string, bytes)
else string.encode('ascii'))
return True
except (UnicodeEncodeError, UnicodeDecodeError):
return False
def read_blob_file_contents(blob_file):
try:
with open(blob_file) as file:
blob = file.read().strip()
return blob
except IOError:
msg = _("Error occurred trying to read from file %s")
raise exceptions.CommandError(msg % blob_file)
def sort_items(items, sort_str):
"""Sort items based on sort keys and sort directions given by sort_str.
:param items: a list or generator object of items
    :param sort_str: a string defining the sort rules; the format is
        '<key1>:[direction1],<key2>:[direction2]...', where each direction
        is 'asc' for ascending or 'desc' for descending and defaults to
        ascending when omitted
:return: sorted items
"""
if not sort_str:
return items
# items may be a generator object, transform it to a list
items = list(items)
sort_keys = sort_str.strip().split(',')
for sort_key in reversed(sort_keys):
reverse = False
if ':' in sort_key:
sort_key, direction = sort_key.split(':', 1)
if not sort_key:
msg = _("'<empty string>'' is not a valid sort key")
raise exceptions.CommandError(msg)
if direction not in ['asc', 'desc']:
if not direction:
direction = "<empty string>"
msg = _(
"'%(direction)s' is not a valid sort direction for "
"sort key %(sort_key)s, use 'asc' or 'desc' instead"
)
raise exceptions.CommandError(msg % {
'direction': direction,
'sort_key': sort_key,
})
if direction == 'desc':
reverse = True
items.sort(key=lambda item: get_field(item, sort_key),
reverse=reverse)
return items
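# Example use of sort_items (an illustrative sketch, not part of the
# original module):
#
#   >>> flavors = [{'name': 'm1.small', 'ram': 2048},
#   ...            {'name': 'm1.tiny', 'ram': 512}]
#   >>> [f['name'] for f in sort_items(flavors, 'ram')]
#   ['m1.tiny', 'm1.small']
#   >>> [f['name'] for f in sort_items(flavors, 'ram:desc')]
#   ['m1.small', 'm1.tiny']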
def wait_for_delete(manager,
res_id,
status_field='status',
error_status=['error'],
exception_name=['NotFound'],
sleep_time=5,
timeout=300,
callback=None):
"""Wait for resource deletion
:param manager: the manager from which we can get the resource
:param res_id: the resource id to watch
:param status_field: the status attribute in the returned resource object,
this is used to check for error states while the resource is being
deleted
:param error_status: a list of status strings for error
:param exception_name: a list of exception strings for deleted case
:param sleep_time: wait this long between checks (seconds)
:param timeout: check until this long (seconds)
:param callback: called per sleep cycle, useful to display progress; this
function is passed a progress value during each iteration of the wait
loop
:rtype: True on success, False if the resource has gone to error state or
the timeout has been reached
"""
total_time = 0
while total_time < timeout:
try:
# might not be a bad idea to re-use find_resource here if it was
# a bit more friendly in the exceptions it raised so we could just
# handle a NotFound exception here without parsing the message
res = manager.get(res_id)
except Exception as ex:
if type(ex).__name__ in exception_name:
return True
raise
status = getattr(res, status_field, '').lower()
if status in error_status:
return False
if callback:
progress = getattr(res, 'progress', None) or 0
callback(progress)
time.sleep(sleep_time)
total_time += sleep_time
# if we got this far we've timed out
return False
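# Example use of wait_for_delete (an illustrative sketch, not part of the
# original module; `volume_client.volumes` is a hypothetical manager and the
# exception name for a missing resource depends on the client library):
#
#   deleted = wait_for_delete(volume_client.volumes, volume_id,
#                             sleep_time=2, timeout=120)
#   if not deleted:
#       raise exceptions.CommandError('Volume went to error or timed out')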
def wait_for_status(status_f,
res_id,
status_field='status',
success_status=['active'],
error_status=['error'],
sleep_time=5,
callback=None):
"""Wait for status change on a resource during a long-running operation
:param status_f: a status function that takes a single id argument
:param res_id: the resource id to watch
:param status_field: the status attribute in the returned resource object
:param success_status: a list of status strings for successful completion
:param error_status: a list of status strings for error
:param sleep_time: wait this long (seconds)
:param callback: called per sleep cycle, useful to display progress
:rtype: True on success
"""
while True:
res = status_f(res_id)
status = getattr(res, status_field, '').lower()
if status in success_status:
retval = True
break
elif status in error_status:
retval = False
break
if callback:
progress = getattr(res, 'progress', None) or 0
callback(progress)
time.sleep(sleep_time)
return retval
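# Example use of wait_for_status (an illustrative sketch, not part of the
# original module; `compute_client.servers.get` is a hypothetical status
# function that takes a single id argument):
#
#   ready = wait_for_status(compute_client.servers.get, server_id,
#                           success_status=['active'],
#                           callback=lambda p: print('%d%% complete' % p))
#   # ready is True once the server reaches 'active', False on an error status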

View File

@ -1,19 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import pbr.version
__all__ = ['version_info', 'version_string']
version_info = pbr.version.VersionInfo('osc-lib')
version_string = version_info.version_string()

View File

@ -1,53 +0,0 @@
---
prelude: >
``osc-lib`` was extracted from the main OpenStackClient
repo after the OSC 2.4.0 release. A number of the lower-layer
modules were simply renamed into the osc_lib namespace::
* openstackclient.api.api -> osc_lib.api.api
* openstackclient.api.auth -> osc_lib.api.auth
* openstackclient.api.utils -> osc_lib.api.utils
* openstackclient.common.command -> osc_lib.command.command
* openstackclient.common.exceptions -> osc_lib.exceptions
* openstackclient.common.logs -> osc_lib.logs
* openstackclient.common.parseractions -> osc_lib.cli.parseractions
* openstackclient.common.session -> osc_lib.session
* openstackclient.common.utils -> osc_lib.utils
* openstackclient.i18n -> osc_lib.i18n
The higher-layer components, such as the OpenStackShell and ClientManager
objects, have had significant changes made to them to streamline interaction
with ``os-client-config`` and ``keystoneauth`` in addition to the rename::
* openstackclient.common.commandmanager -> osc_lib.command.commandmanager
* openstackclient.shell -> osc_lib.shell
features:
- Add ``utils.find_min_match()`` function to filter a list
based on a set of minimum values of attributes. For example,
selecting all compute flavors that have a minimum amount of
    RAM, disk, and VCPUs.
- Add ``cli.client_config.OSC_Config`` as a subclass of
``os_client_config.config.OpenStackConfig`` to collect all of the
configuration option special cases in OSC into one place and insert
    them into the ``os-client-config`` handling.
fixes:
- The ``parseractions.KeyValueAction`` class now raises
    an ``argparse.ArgumentTypeError`` exception when the
argument is not in the form ``<key>=<value>``.
- Change ``utils.find_resource()`` to handle client managers
that lack a ``find()`` method. Raise an
``exceptions.CommandError`` exception when multiple matches
are found.
- Change ``utils.find_resource()`` to handle glanceclient's
``HTTPNotFound`` exception.
- Change ``utils.find_resource()`` to attempt lookups as
IDs first, falling back to ``find()`` methods when available.
- Refactor ``ClientManager`` class to remove OSC-specific logic and
move all option special-cases into ``cli.client_config.OSC_Config``.
Also change some private attributes to public (``region_name``,
``interface``, ``cacert``, ``verify`` and remove ``_insecure``).
- Refactor ``OpenStackShell`` to handle only global argument
processing and setting up the ClientManager with configuration
from ``os-client-config``. Command and plugin loading remain in
OSC.

View File

@ -1,6 +0,0 @@
---
features:
  - |
    Add ``MultiKeyValueCommaAction`` as a ``MultiKeyValueAction`` subclass
that allows values to include a comma. For example:
``--property key1=value1,value2,key2=value3,value4,value5``.

View File

@ -1,7 +0,0 @@
---
fixes:
- |
Add additional precedence fixes to the argument precedence problems
in os-client-config 1.18.0 and earlier. This all will be removed
    when os-client-config 1.19.x is the minimum allowed version in
OpenStack's global requirements.txt.

View File

@ -1,6 +0,0 @@
---
fixes:
- |
Prevent null key setting for key-value pairs in the ``KeyValueAction``
and ``MultiKeyValueAction`` parser actions.
[Bug `1558690 <https://bugs.launchpad.net/bugs/1558690>`_]

View File

@ -1,7 +0,0 @@
---
security:
- |
This release contains the fix for `bug 1630822`_ so that passwords are
no longer leaked when using the ``--debug`` or ``-vv`` options.
.. _bug 1630822: https://bugs.launchpad.net/python-openstackclient/+bug/1630822

View File

@ -1,6 +0,0 @@
---
features:
- |
    The ``--os-profile`` argument can now be loaded from the ``OS_PROFILE``
    environment variable to avoid repeating ``--os-profile``
in openstack commands.

View File

@ -1,5 +0,0 @@
---
fixes:
- |
Decode argv into Unicode on Python 2 in ``OpenStackShell.main()``
[OSC Bug `1603494 <https://bugs.launchpad.net/bugs/1603494>`_]

View File

@ -1,350 +0,0 @@
# -*- coding: utf-8 -*-
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# osc-lib Release Notes documentation build configuration file, created by
# sphinx-quickstart on Thu Jul 28 17:16:41 2016.
#
# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
import openstackdocstheme
# sys.path.insert(0, os.path.abspath('.'))
# -- General configuration ------------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
#
# needs_sphinx = '1.0'
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
'reno.sphinxext',
'sphinx.ext.extlinks',
]
# Set aliases for extlinks
# * lpbug - generic Launchpad bug :lpbug:`123456`
# * oscbp - OSC blueprints :oscbp:`Blue Print <bp-name>`
# * oscdoc - OSC Docs :oscdoc:`Command List <command-list>`
extlinks = {
'lpbug': (
'https://bugs.launchpad.net/bugs/%s',
'Bug ',
),
'oscbp': (
'https://blueprints.launchpad.net/python-openstackclient/+spec/%s',
'',
),
'oscdoc': (
'http://docs.openstack.org/developer/osc-lib/%s.html',
'',
),
}
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
#
# source_suffix = ['.rst', '.md']
source_suffix = '.rst'
# The encoding of source files.
#
# source_encoding = 'utf-8-sig'
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'osc-lib Release Notes'
copyright = u'2016, osc-lib Developers'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
from osc_lib.version import version_info as osc_lib_version
# The full version, including alpha/beta/rc tags.
release = osc_lib_version.version_string_with_vcs()
# The short X.Y version.
version = osc_lib_version.canonical_version_string()
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = None
# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
#
# today = ''
#
# Else, today_fmt is used as the format for a strftime call.
#
# today_fmt = '%B %d, %Y'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# These patterns also affect html_static_path and html_extra_path
exclude_patterns = []
# The reST default role (used for this markup: `text`) to use for all
# documents.
#
# default_role = None
# If true, '()' will be appended to :func: etc. cross-reference text.
#
# add_function_parentheses = True
# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
#
# add_module_names = True
# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
#
# show_authors = False
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# A list of ignored prefixes for module index sorting.
# modindex_common_prefix = []
# If true, keep warnings as "system message" paragraphs in the built documents.
# keep_warnings = False
# -- Options for HTML output ----------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = 'openstackdocs'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#
# html_theme_options = {}
# Add any paths that contain custom themes here, relative to this directory.
# html_theme_path = []
html_theme_path = [openstackdocstheme.get_html_theme_path()]
# The name for this set of Sphinx documents. If None, it defaults to
# "<project> v<release> documentation".
# html_title = None
# A shorter title for the navigation bar. Default is the same as html_title.
#
# html_short_title = None
# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
#
# html_logo = None
# The name of an image file (within the static path) to use as favicon of the
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
#
# html_favicon = None
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
# directly to the root of the documentation.
#
# html_extra_path = []
# If not None, a 'Last updated on:' timestamp is inserted at every page
# bottom, using the given strftime format.
# The empty string is equivalent to '%b %d, %Y'.
#
# html_last_updated_fmt = '%b %d, %Y'
# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
#
# html_use_smartypants = True
# Custom sidebar templates, maps document names to template names.
#
# html_sidebars = {}
# Additional templates that should be rendered to pages, maps page names to
# template names.
#
# html_additional_pages = {}
# If false, no module index is generated.
#
# html_domain_indices = True
# If false, no index is generated.
#
# html_use_index = True
# If true, the index is split into individual pages for each letter.
#
# html_split_index = False
# If true, links to the reST sources are added to the pages.
#
# html_show_sourcelink = True
# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
#
# html_show_sphinx = True
# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
#
# html_show_copyright = True
# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it. The value of this option must be the
# base URL from which the finished HTML is served.
#
# html_use_opensearch = ''
# This is the file name suffix for HTML files (e.g. ".xhtml").
# html_file_suffix = None
# Output file base name for HTML help builder.
htmlhelp_basename = 'OSC_LIBReleaseNotesdoc'
# -- Options for LaTeX output ---------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
# 'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
# 'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
# 'preamble': '',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [(
'index',
'OSC_LIBReleaseNotes.tex',
u'osc-lib Release Notes Documentation',
u'osc-lib Developers',
'manual'),
]
# The name of an image file (relative to this directory) to place at the top of
# the title page.
#
# latex_logo = None
# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
#
# latex_use_parts = False
# If true, show page references after internal links.
#
# latex_show_pagerefs = False
# If true, show URL addresses after external links.
#
# latex_show_urls = False
# Documents to append as an appendix to all manuals.
#
# latex_appendices = []
# If false, no module index is generated.
#
# latex_domain_indices = True
# -- Options for manual page output ---------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [(
'index',
'osc_libreleasenotes',
u'osc-lib Release Notes Documentation',
[u'osc-lib Developers'],
1,
)]
# If true, show URL addresses after external links.
#
# man_show_urls = False
# -- Options for Texinfo output -------------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [(
'index',
'OSC_LIBReleaseNotes',
u'osc-lib Release Notes Documentation',
u'osc-lib Developers',
'OSC_LIBReleaseNotes',
'Common base library for OpenStackClient plugins.',
'Miscellaneous',
)]
# Documents to append as an appendix to all manuals.
#
# texinfo_appendices = []
# If false, no module index is generated.
#
# texinfo_domain_indices = True
# How to display URL addresses: 'footnote', 'no', or 'inline'.
#
# texinfo_show_urls = 'footnote'
# If true, do not generate a @detailmenu in the "Top" node's menu.
#
# texinfo_no_detailmenu = False
# -- Options for Internationalization output ------------------------------
locale_dirs = ['locale/']

View File

@ -1,16 +0,0 @@
=====================
osc-lib Release Notes
=====================
.. toctree::
:maxdepth: 1
unreleased
ocata
newton
Indices and tables
==================
* :ref:`genindex`
* :ref:`search`

View File

@ -1,6 +0,0 @@
=============================
Newton Series Release Notes
=============================
.. release-notes::
:branch: origin/stable/newton

View File

@ -1,6 +0,0 @@
===================================
Ocata Series Release Notes
===================================
.. release-notes::
:branch: origin/stable/ocata

View File

@ -1,5 +0,0 @@
=====================
Current Release Notes
=====================
.. release-notes::

View File

@ -1,14 +0,0 @@
# The order of packages is significant, because pip processes them in the order
# of appearance. Changing the order has an impact on the overall integration
# process, which may cause wedges in the gate later.
pbr!=2.1.0,>=2.0.0 # Apache-2.0
six>=1.9.0 # MIT
Babel!=2.4.0,>=2.3.4 # BSD
cliff>=2.8.0 # Apache-2.0
keystoneauth1>=2.21.0 # Apache-2.0
os-client-config>=1.28.0 # Apache-2.0
oslo.i18n!=3.15.2,>=2.1.0 # Apache-2.0
oslo.utils>=3.20.0 # Apache-2.0
simplejson>=2.2.0 # MIT
stevedore>=1.20.0 # Apache-2.0

View File

@ -1,35 +0,0 @@
[metadata]
name = osc-lib
summary = OpenStackClient Library
description-file =
README.rst
author = OpenStack
author-email = openstack-dev@lists.openstack.org
home-page = http://docs.openstack.org/developer/osc-lib
classifier =
Environment :: OpenStack
Intended Audience :: Information Technology
Intended Audience :: System Administrators
License :: OSI Approved :: Apache Software License
Operating System :: POSIX :: Linux
Programming Language :: Python
Programming Language :: Python :: 2
Programming Language :: Python :: 2.7
Programming Language :: Python :: 3
Programming Language :: Python :: 3.5
[files]
packages =
osc_lib
[build_sphinx]
source-dir = doc/source
build-dir = doc/build
all_files = 1
warning-is-error = 1
[pbr]
autodoc_index_modules = True
api_doc_dir = reference/api
autodoc_exclude_modules =
osc_lib.tests.*

View File

@ -1,29 +0,0 @@
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# THIS FILE IS MANAGED BY THE GLOBAL REQUIREMENTS REPO - DO NOT EDIT
import setuptools
# In python < 2.7.4, a lazy loading of package `pbr` will break
# setuptools if some other modules registered functions in `atexit`.
# solution from: http://bugs.python.org/issue15881#msg170215
try:
import multiprocessing # noqa
except ImportError:
pass
setuptools.setup(
setup_requires=['pbr>=2.0.0'],
pbr=True)

View File

@ -1,20 +0,0 @@
# The order of packages is significant, because pip processes them in the order
# of appearance. Changing the order has an impact on the overall integration
# process, which may cause wedges in the gate later.
hacking<0.11,>=0.10.0
coverage!=4.4,>=4.0 # Apache-2.0
fixtures>=3.0.0 # Apache-2.0/BSD
mock>=2.0 # BSD
oslotest>=1.10.0 # Apache-2.0
requests-mock>=1.1 # Apache-2.0
sphinx>=1.6.2 # BSD
os-testr>=0.8.0 # Apache-2.0
testrepository>=0.0.18 # Apache-2.0/BSD
testtools>=1.4.0 # MIT
osprofiler>=1.4.0 # Apache-2.0
bandit>=1.1.0 # Apache-2.0
# Documentation
openstackdocstheme>=1.11.0 # Apache-2.0
reno!=2.3.1,>=1.8.0 # Apache-2.0

View File

@ -1,30 +0,0 @@
#!/usr/bin/env bash
# The client constraints file contains a version pin for this client that
# conflicts with installing the client from source, so we remove that pin
# from the constraints file before applying it for a from-source installation.
CONSTRAINTS_FILE="$1"
shift 1
set -e
# NOTE(tonyb): Place this in the tox environment's log dir so it will get
# published to logs.openstack.org for easy debugging.
localfile="$VIRTUAL_ENV/log/upper-constraints.txt"
if [[ "$CONSTRAINTS_FILE" != http* ]]; then
CONSTRAINTS_FILE="file://$CONSTRAINTS_FILE"
fi
# NOTE(tonyb): need to add curl to bindep.txt if the project supports bindep
curl "$CONSTRAINTS_FILE" --insecure --progress-bar --output "$localfile"
pip install -c"$localfile" openstack-requirements
# This is the main purpose of the script: allow local installation of
# the current repo. It is listed in the constraints file, so any
# install would be constrained; we need to unconstrain it.
edit-constraints "$localfile" -- "$CLIENT_NAME"
pip install -c"$localfile" -U "$@"
exit $?

41
tox.ini
View File

@ -1,41 +0,0 @@
[tox]
minversion = 2.0
envlist = py35,py27,pep8
skipsdist = True
[testenv]
usedevelop = True
install_command = {toxinidir}/tools/tox_install.sh {env:UPPER_CONSTRAINTS_FILE:https://git.openstack.org/cgit/openstack/requirements/plain/upper-constraints.txt} {opts} {packages}
setenv = VIRTUAL_ENV={envdir}
BRANCH_NAME=master
CLIENT_NAME=osc-lib
deps = -r{toxinidir}/test-requirements.txt
commands = ostestr {posargs}
whitelist_externals = ostestr
[testenv:pep8]
commands = flake8
[testenv:venv]
commands = {posargs}
[testenv:cover]
commands =
python setup.py test --coverage --coverage-package-name=osc_lib --testr-args='{posargs}'
coverage report
[testenv:debug]
commands = oslo_debug_helper -t osc_lib/tests {posargs}
[testenv:docs]
commands = python setup.py build_sphinx
[testenv:releasenotes]
commands = sphinx-build -a -E -W -d releasenotes/build/doctrees -b html releasenotes/source releasenotes/build/html
[flake8]
show-source = True
exclude = .git,.tox,dist,doc,*lib/python*,*egg,build,tools
# If 'ignore' is not set there are default errors and warnings that are set
# Doc: http://flake8.readthedocs.org/en/latest/config.html#default
ignore = __