Retire Packaging Deb project repos

This commit is part of a series to retire the Packaging Deb project. Step 2 is to remove all content from the project repos, replacing it with a README that explains where to find ongoing work and how to recover the repo if needed at some future point (as described in https://docs.openstack.org/infra/manual/drivers.html#retiring-a-project).

Change-Id: I9df787b33aa77024eeaa742be30f582c6dd836a4
parent dd1ceea294
commit 5eff8c62f0
@@ -1,8 +0,0 @@
[run]
branch = True
source = oslo_utils
omit = oslo_utils/tests/*

[report]
ignore_errors = True
precision = 2
@@ -1,55 +0,0 @@
*.py[cod]

# C extensions
*.so

# Packages
*.egg
*.egg-info
dist
build
eggs
parts
bin
var
sdist
develop-eggs
.installed.cfg
lib
lib64

# Installer logs
pip-log.txt

# Unit test / coverage reports
.coverage
cover
.tox
nosetests.xml
.testrepository

# Translations
*.mo

# Mr Developer
.mr.developer.cfg
.project
.pydevproject

# Complexity
output/*.html
output/*/index.html

# Sphinx
doc/build

# pbr generates these
AUTHORS
ChangeLog

# Editors
*~
.*.swp

# reno build
releasenotes/build
@@ -1,4 +0,0 @@
[gerrit]
host=review.openstack.org
port=29418
project=openstack/oslo.utils.git
.mailmap
@@ -1,3 +0,0 @@
# Format is:
# <preferred e-mail> <other e-mail 1>
# <preferred e-mail> <other e-mail 2>
@@ -1,7 +0,0 @@
[DEFAULT]
test_command=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \
             OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \
             OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-60} \
             ${PYTHON:-python} -m subunit.run discover -t ./ . $LISTOPT $IDOPTION
test_id_option=--load-list $IDFILE
test_list_option=--list
@@ -1,16 +0,0 @@
If you would like to contribute to the development of OpenStack,
you must follow the steps in this page:

   http://docs.openstack.org/infra/manual/developers.html

Once those steps have been completed, changes to OpenStack
should be submitted for review via the Gerrit tool, following
the workflow documented at:

   http://docs.openstack.org/infra/manual/developers.html#development-workflow

Pull requests submitted through GitHub will be ignored.

Bugs should be filed on Launchpad, not GitHub:

   https://bugs.launchpad.net/oslo.utils
@@ -1,4 +0,0 @@
oslo.utils Style Commandments
======================================================

Read the OpenStack Style Commandments https://docs.openstack.org/hacking/latest/
LICENSE
@@ -1,175 +0,0 @@

                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.
@@ -0,0 +1,14 @@
This project is no longer maintained.

The contents of this repository are still available in the Git
source code management system. To see the contents of this
repository before it reached its end of life, please check out the
previous commit with "git checkout HEAD^1".

For ongoing work on maintaining OpenStack packages in the Debian
distribution, please see the Debian OpenStack packaging team at
https://wiki.debian.org/OpenStack/.

For any further questions, please email
openstack-dev@lists.openstack.org or join #openstack-dev on
Freenode.
README.rst
@@ -1,28 +0,0 @@
========================
Team and repository tags
========================

.. image:: http://governance.openstack.org/badges/oslo.utils.svg
    :target: http://governance.openstack.org/reference/tags/index.html

.. Change things from this point on

==========
oslo.utils
==========

.. image:: https://img.shields.io/pypi/v/oslo.utils.svg
    :target: https://pypi.python.org/pypi/oslo.utils/
    :alt: Latest Version

.. image:: https://img.shields.io/pypi/dm/oslo.utils.svg
    :target: https://pypi.python.org/pypi/oslo.utils/
    :alt: Downloads

The oslo.utils library provides support for common utility type functions,
such as encoding, exception handling, string manipulation, and time handling.

* Free software: Apache license
* Documentation: https://docs.openstack.org/oslo.utils/latest/
* Source: https://git.openstack.org/cgit/openstack/oslo.utils
* Bugs: https://bugs.launchpad.net/oslo.utils
@@ -1,82 +0,0 @@
# -*- coding: utf-8 -*-
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import os
import sys

sys.path.insert(0, os.path.abspath('../..'))
# -- General configuration ----------------------------------------------------

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = [
    'sphinx.ext.autodoc',
    'openstackdocstheme'
]

# openstackdocstheme options
repository_name = 'openstack/oslo.utils'
bug_project = 'oslo.utils'
bug_tag = ''

# autodoc generation is a bit aggressive and a nuisance when doing heavy
# text edit cycles.
# execute "export SPHINX_DEBUG=1" in your terminal to disable

# The suffix of source filenames.
source_suffix = '.rst'

# The master toctree document.
master_doc = 'index'

# General information about the project.
project = u'oslo.utils'
copyright = u'2014, OpenStack Foundation'

# If true, '()' will be appended to :func: etc. cross-reference text.
add_function_parentheses = True

# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
add_module_names = True

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'

# -- Options for HTML output --------------------------------------------------

# The theme to use for HTML and HTML Help pages. Major themes that come with
# Sphinx are currently 'default' and 'sphinxdoc'.
# html_theme_path = ["."]
# html_theme = '_theme'
# html_static_path = ['static']
html_theme = 'openstackdocs'

html_last_updated_fmt = '%Y-%m-%d %H:%M'

# Output file base name for HTML help builder.
htmlhelp_basename = '%sdoc' % project

# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, documentclass
# [howto/manual]).
latex_documents = [
    ('index',
     '%s.tex' % project,
     u'%s Documentation' % project,
     u'OpenStack Foundation', 'manual'),
]

# Example configuration for intersphinx: refer to the Python standard library.
#intersphinx_mapping = {'http://docs.python.org/': None}
@@ -1,5 +0,0 @@
============
Contributing
============

.. include:: ../../../CONTRIBUTING.rst
@@ -1,22 +0,0 @@
======================================
Welcome to oslo.utils's documentation!
======================================

The `oslo`_ utils library provides support for common utility type functions,
such as encoding, exception handling, string manipulation, and time handling.

.. toctree::
   :maxdepth: 1

   install/index
   user/index
   reference/index
   contributor/index

.. rubric:: Indices and tables

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

.. _oslo: https://wiki.openstack.org/wiki/Oslo
@@ -1,12 +0,0 @@
============
Installation
============

At the command line::

    $ pip install oslo.utils

Or, if you have virtualenvwrapper installed::

    $ mkvirtualenv oslo.utils
    $ pip install oslo.utils
@@ -1,6 +0,0 @@
===========
dictutils
===========

.. automodule:: oslo_utils.dictutils
   :members:

@@ -1,6 +0,0 @@
=============
encodeutils
=============

.. automodule:: oslo_utils.encodeutils
   :members:

@@ -1,6 +0,0 @@
===============
eventletutils
===============

.. automodule:: oslo_utils.eventletutils
   :members:

@@ -1,6 +0,0 @@
==========
excutils
==========

.. automodule:: oslo_utils.excutils
   :members:

@@ -1,7 +0,0 @@
=============
fileutils
=============

.. automodule:: oslo_utils.fileutils
   :members:

@@ -1,6 +0,0 @@
=============
fixture
=============

.. automodule:: oslo_utils.fixture
   :members:

@@ -1,6 +0,0 @@
=============
importutils
=============

.. automodule:: oslo_utils.importutils
   :members:

@@ -1,23 +0,0 @@
=============
API Reference
=============

.. toctree::
   :maxdepth: 2

   dictutils
   encodeutils
   eventletutils
   excutils
   fileutils
   fixture
   importutils
   netutils
   reflection
   secretutils
   specs_matcher
   strutils
   timeutils
   units
   uuidutils
   versionutils

@@ -1,6 +0,0 @@
==========
netutils
==========

.. automodule:: oslo_utils.netutils
   :members:

@@ -1,6 +0,0 @@
============
reflection
============

.. automodule:: oslo_utils.reflection
   :members:

@@ -1,6 +0,0 @@
=============
secretutils
=============

.. automodule:: oslo_utils.secretutils
   :members: constant_time_compare

@@ -1,6 +0,0 @@
==============
specs_matcher
==============

.. automodule:: oslo_utils.specs_matcher
   :members:

@@ -1,6 +0,0 @@
==========
strutils
==========

.. automodule:: oslo_utils.strutils
   :members:

@@ -1,7 +0,0 @@
===========
timeutils
===========

.. automodule:: oslo_utils.timeutils
   :members:
   :special-members: __enter__, __exit__

@@ -1,6 +0,0 @@
=======
units
=======

.. automodule:: oslo_utils.units
   :members:

@@ -1,6 +0,0 @@
===========
uuidutils
===========

.. automodule:: oslo_utils.uuidutils
   :members:

@@ -1,6 +0,0 @@
==============
versionutils
==============

.. automodule:: oslo_utils.versionutils
   :members:
@@ -1 +0,0 @@
.. include:: ../../../ChangeLog
@@ -1,15 +0,0 @@
==================
Using oslo.service
==================

.. toctree::
   :maxdepth: 2

   usage
   timeutils

.. history contains a lot of sections, toctree with maxdepth 1 is used.
.. toctree::
   :maxdepth: 1

   history
@@ -1,96 +0,0 @@
===========
timeutils
===========

Using a stopwatch (as a context manager)
----------------------------------------

::

    >>> from oslo_utils import timeutils
    >>> import time
    >>>
    >>> def slow_routine(delay):
    ...     def i_am_slow():
    ...         time.sleep(delay)
    ...     return i_am_slow
    ...
    >>>
    >>> half_sec_func = slow_routine(0.5)
    >>> with timeutils.StopWatch() as w:
    ...     half_sec_func()
    ...
    >>> print(w.elapsed())
    0.500243999995

Manually using a stopwatch
--------------------------

::

    >>> from oslo_utils import timeutils
    >>> import time
    >>> w = timeutils.StopWatch()
    >>> w.start()
    <oslo_utils.timeutils.StopWatch object at 0x2b85a0ab7590>
    >>> time.sleep(0.1)
    >>> time.sleep(0.1)
    >>> time.sleep(0.1)
    >>> time.sleep(0.1)
    >>> w.stop()
    <oslo_utils.timeutils.StopWatch object at 0x2b85a0ab7590>
    >>> w.elapsed()
    13.96467600017786

Tracking durations with a stopwatch
-----------------------------------

::

    >>> from oslo_utils import timeutils
    >>> w = timeutils.StopWatch(duration=10)
    >>> w.start()
    <oslo_utils.timeutils.StopWatch object at 0x2b85a7940a10>
    >>> w.elapsed()
    2.023942000232637
    >>> w.leftover()
    4.648160999640822
    >>> w.leftover()
    3.5522090001031756
    >>> w.leftover()
    3.0481000002473593
    >>> w.leftover()
    2.1918740002438426
    >>> w.leftover()
    1.6966530000790954
    >>> w.leftover()
    1.1202940000221133
    >>> w.leftover()
    0.0
    >>> w.expired()
    True

Tracking and splitting with a stopwatch
---------------------------------------

::

    >>> from oslo_utils import timeutils
    >>> w = timeutils.StopWatch()
    >>> w.start()
    <oslo_utils.timeutils.StopWatch object at 0x2ba75c12b050>
    >>> w.split()
    Split(elapsed=3.02423300035, length=3.02423300035)
    >>> w.split()
    Split(elapsed=6.44820600003, length=3.42397299968)
    >>> w.split()
    Split(elapsed=7.9678720003, length=1.51966600027)
    >>> w.splits
    (Split(elapsed=3.02423300035, length=3.02423300035), Split(elapsed=6.44820600003, length=3.42397299968), Split(elapsed=7.9678720003, length=1.51966600027))
    >>> w.stop()
    <oslo_utils.timeutils.StopWatch object at 0x2ba75c12b050>
    >>> w.elapsed()
    16.799759999848902
    >>> w.splits
    (Split(elapsed=3.02423300035, length=3.02423300035), Split(elapsed=6.44820600003, length=3.42397299968), Split(elapsed=7.9678720003, length=1.51966600027))
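For readers without oslo.utils installed, the stopwatch behaviour shown in the deleted documentation above can be approximated with the standard library alone. The following is a minimal sketch backed by `time.monotonic`; the class and method names mirror the examples but this is not the oslo API:

```python
import time


class StopWatch:
    # Minimal sketch of an elapsed-time tracker backed by a monotonic
    # clock, mirroring start()/elapsed()/leftover()/expired() usage above.
    def __init__(self, duration=None):
        self.duration = duration
        self._started_at = None

    def start(self):
        self._started_at = time.monotonic()
        return self

    def elapsed(self):
        return time.monotonic() - self._started_at

    def leftover(self):
        # Only meaningful when a duration was given.
        return max(0.0, self.duration - self.elapsed())

    def expired(self):
        return self.duration is not None and self.elapsed() >= self.duration


w = StopWatch(duration=0.05)
w.start()
time.sleep(0.06)
print(w.expired())  # True
```

The real implementation also handles stop/resume and splits; this sketch only covers the duration-tracking pattern.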
@@ -1,10 +0,0 @@
=======
Usage
=======

To use oslo.utils in a project, import the individual module you
need. For example::

    from oslo_utils import strutils

    slug = strutils.to_slug('input value')
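For illustration, the slugification used in the deleted usage example can be approximated with the standard library. This is a rough sketch, not the actual `oslo_utils.strutils.to_slug` implementation:

```python
import re
import unicodedata


def to_slug(value):
    # Sketch of slugification: strip accents via NFKD normalization,
    # drop non-word characters, then hyphenate runs of whitespace.
    value = unicodedata.normalize('NFKD', value)
    value = value.encode('ascii', 'ignore').decode('ascii')
    value = re.sub(r'[^\w\s-]', '', value).strip().lower()
    return re.sub(r'[-\s]+', '-', value)


print(to_slug('input value'))  # input-value
```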
@@ -1,27 +0,0 @@
# Copyright 2014 IBM Corp.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""oslo.i18n integration module.

See https://docs.openstack.org/oslo.i18n/latest/user/index.html .

"""

import oslo_i18n


_translators = oslo_i18n.TranslatorFactory(domain='oslo_utils')

# The primary translation function using the well-known name "_"
_ = _translators.primary
@@ -1,31 +0,0 @@
# Copyright (c) 2016 EasyStack Inc.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import six


def flatten_dict_to_keypairs(d, separator=':'):
    """Generator that produces sequence of keypairs for nested dictionaries.

    :param d: dictionaries which may be nested
    :param separator: symbol between names
    """
    for name, value in sorted(six.iteritems(d)):
        if isinstance(value, dict):
            for subname, subvalue in flatten_dict_to_keypairs(value,
                                                              separator):
                yield ('%s%s%s' % (name, separator, subname), subvalue)
        else:
            yield name, value
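The removed `flatten_dict_to_keypairs` generator can be exercised as below; this Python 3 sketch substitutes `dict.items()` for `six.iteritems` but is otherwise the same logic:

```python
def flatten_dict_to_keypairs(d, separator=':'):
    # Walk the dictionary in sorted key order; recurse into nested
    # dicts, joining key segments with the separator.
    for name, value in sorted(d.items()):
        if isinstance(value, dict):
            for subname, subvalue in flatten_dict_to_keypairs(value,
                                                              separator):
                yield ('%s%s%s' % (name, separator, subname), subvalue)
        else:
            yield name, value


pairs = list(flatten_dict_to_keypairs({'a': {'b': 1}, 'c': 2}))
print(pairs)  # [('a:b', 1), ('c', 2)]
```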
@@ -1,188 +0,0 @@
# Copyright 2014 Red Hat, Inc.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import sys

import six


# NOTE(blk-u): This provides a symbol that can be overridden just for this
# module during testing. sys.getfilesystemencoding() is called by coverage so
# mocking it globally caused the coverage job to fail.
_getfilesystemencoding = sys.getfilesystemencoding


def safe_decode(text, incoming=None, errors='strict'):
    """Decodes incoming text/bytes string using `incoming` if they're not
    already unicode.

    :param incoming: Text's current encoding
    :param errors: Errors handling policy. See here for valid
        values http://docs.python.org/2/library/codecs.html
    :returns: text or a unicode `incoming` encoded
        representation of it.
    :raises TypeError: If text is not an instance of str
    """
    if not isinstance(text, (six.string_types, six.binary_type)):
        raise TypeError("%s can't be decoded" % type(text))

    if isinstance(text, six.text_type):
        return text

    if not incoming:
        incoming = (sys.stdin.encoding or
                    sys.getdefaultencoding())

    try:
        return text.decode(incoming, errors)
    except UnicodeDecodeError:
        # Note(flaper87) If we get here, it means that
        # sys.stdin.encoding / sys.getdefaultencoding
        # didn't return a suitable encoding to decode
        # text. This happens mostly when global LANG
        # var is not set correctly and there's no
        # default encoding. In this case, most likely
        # python will use ASCII or ANSI encoders as
        # default encodings but they won't be capable
        # of decoding non-ASCII characters.
        #
        # Also, UTF-8 is being used since it's an ASCII
        # extension.
        return text.decode('utf-8', errors)


def safe_encode(text, incoming=None,
                encoding='utf-8', errors='strict'):
    """Encodes incoming text/bytes string using `encoding`.

    If incoming is not specified, text is expected to be encoded with
    current python's default encoding. (`sys.getdefaultencoding`)

    :param incoming: Text's current encoding
    :param encoding: Expected encoding for text (Default UTF-8)
    :param errors: Errors handling policy. See here for valid
        values http://docs.python.org/2/library/codecs.html
    :returns: text or a bytestring `encoding` encoded
        representation of it.
    :raises TypeError: If text is not an instance of str

    See also to_utf8() function which is simpler and don't depend on
    the locale encoding.
    """
    if not isinstance(text, (six.string_types, six.binary_type)):
        raise TypeError("%s can't be encoded" % type(text))

    if not incoming:
        incoming = (sys.stdin.encoding or
                    sys.getdefaultencoding())

    # Avoid case issues in comparisons
    if hasattr(incoming, 'lower'):
        incoming = incoming.lower()
    if hasattr(encoding, 'lower'):
        encoding = encoding.lower()

    if isinstance(text, six.text_type):
        return text.encode(encoding, errors)
    elif text and encoding != incoming:
        # Decode text before encoding it with `encoding`
        text = safe_decode(text, incoming, errors)
        return text.encode(encoding, errors)
    else:
        return text


def to_utf8(text):
    """Encode Unicode to UTF-8, return bytes unchanged.

    Raise TypeError if text is not a bytes string or a Unicode string.

    .. versionadded:: 3.5
    """
    if isinstance(text, bytes):
        return text
    elif isinstance(text, six.text_type):
        return text.encode('utf-8')
    else:
        raise TypeError("bytes or Unicode expected, got %s"
                        % type(text).__name__)


def exception_to_unicode(exc):
    """Get the message of an exception as a Unicode string.

    On Python 3, the exception message is always a Unicode string. On
    Python 2, the exception message is a bytes string *most* of the time.

    If the exception message is a bytes strings, try to decode it from UTF-8
    (superset of ASCII), from the locale encoding, or fallback to decoding it
    from ISO-8859-1 (which never fails).

    .. versionadded:: 1.6
    """
    msg = None
    if six.PY2:
        # First try by calling the unicode type constructor. We should try
        # unicode() before exc.__unicode__() because subclasses of unicode can
        # be easily casted to unicode, whereas they have no __unicode__()
        # method.
        try:
            msg = unicode(exc)  # NOQA
        except UnicodeError:
            # unicode(exc) fail with UnicodeDecodeError on Python 2 if
            # exc.__unicode__() or exc.__str__() returns a bytes string not
            # decodable from the default encoding (ASCII)
            if hasattr(exc, '__unicode__'):
                # Call directly the __unicode__() method to avoid
                # the implicit decoding from the default encoding
                try:
                    msg = exc.__unicode__()
                except UnicodeError:  # nosec
                    pass
||||
|
||||
if msg is None:
|
||||
# Don't call directly str(exc), because it fails with
|
||||
# UnicodeEncodeError on Python 2 if exc.__str__() returns a Unicode
|
||||
# string not encodable to the default encoding (ASCII)
|
||||
msg = exc.__str__()
|
||||
|
||||
if isinstance(msg, six.text_type):
|
||||
# This should be the default path on Python 3 and an *optional* path
|
||||
# on Python 2 (if for some reason the exception message was already
|
||||
# in unicode instead of the more typical bytes string); so avoid
|
||||
# further converting to unicode in both of these cases.
|
||||
return msg
|
||||
|
||||
try:
|
||||
# Try to decode from UTF-8 (superset of ASCII). The decoder fails
|
||||
# if the string is not a valid UTF-8 string: the UTF-8 codec includes
|
||||
# a validation algorithm to ensure the consistency of the codec.
|
||||
return msg.decode('utf-8')
|
||||
except UnicodeDecodeError: # nosec
|
||||
pass
|
||||
|
||||
# Try the locale encoding, most error messages are encoded to this encoding
|
||||
# (ex: os.strerror(errno))
|
||||
encoding = _getfilesystemencoding()
|
||||
try:
|
||||
return msg.decode(encoding)
|
||||
except UnicodeDecodeError: # nosec
|
||||
pass
|
||||
|
||||
# The encoding is not ASCII, not UTF-8, nor the locale encoding. Fallback
|
||||
# to the ISO-8859-1 encoding which never fails. It will produce mojibake
|
||||
# if the message is not encoded to ISO-8859-1, but we don't want a super
|
||||
# complex heuristic to get the encoding of an exception message.
|
||||
return msg.decode('latin1')
|
|
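The decoding cascade in `exception_to_unicode` (UTF-8 first, then the locale/filesystem encoding, then a latin-1 last resort) can be sketched stdlib-only; `message_to_unicode` is an illustrative name for this sketch, not the oslo API:

```python
import sys


def message_to_unicode(msg):
    """Decode a bytes error message: try UTF-8, then the filesystem
    encoding, then fall back to latin-1 (which never fails)."""
    if isinstance(msg, str):
        return msg
    for enc in ('utf-8', sys.getfilesystemencoding()):
        try:
            return msg.decode(enc)
        except UnicodeDecodeError:
            pass
    return msg.decode('latin1')


print(message_to_unicode(b'caf\xc3\xa9'))  # valid UTF-8 input
print(message_to_unicode(b'caf\xe9'))      # falls through to latin-1
```

The latin-1 step can never raise because every byte value 0x00-0xFF maps to a code point, which is exactly why the original uses it as the terminal fallback.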
@ -1,175 +0,0 @@
# -*- coding: utf-8 -*-

# Copyright (C) 2015 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Eventlet utils helper module.

.. versionadded:: 1.3
"""

import threading
import warnings

from oslo_utils import importutils

# These may or may not exist; so carefully import them if we can...
_eventlet = importutils.try_import('eventlet')
_patcher = importutils.try_import('eventlet.patcher')

# Attribute that can be used by others to see if eventlet is even currently
# usable (can be used in unittests to skip test cases or test classes that
# require eventlet to work).
EVENTLET_AVAILABLE = all((_eventlet, _patcher))

# Taken from eventlet.py (v0.16.1) patcher code (it's not an accessible set
# for some reason...)
_ALL_PATCH = frozenset(['__builtin__', 'MySQLdb', 'os',
                        'psycopg', 'select', 'socket', 'thread', 'time'])


def fetch_current_thread_functor():
    """Get the current thread.

    If eventlet is used to monkey-patch the threading module, return the
    current eventlet greenthread. Otherwise, return the current Python thread.

    .. versionadded:: 1.5
    """
    # Until https://github.com/eventlet/eventlet/issues/172 is resolved
    # or addressed we have to use a complicated workaround to get an object
    # that will not be recycled; the usage of threading.current_thread()
    # doesn't appear to currently be monkey patched and therefore isn't
    # reliable to use (and breaks badly when used, as all threads share
    # the same current_thread() object)...
    if not EVENTLET_AVAILABLE:
        return threading.current_thread
    else:
        green_threaded = _patcher.is_monkey_patched('thread')
        if green_threaded:
            return _eventlet.getcurrent
        else:
            return threading.current_thread


def warn_eventlet_not_patched(expected_patched_modules=None,
                              what='this library'):
    """Warns if eventlet is being used without patching provided modules.

    :param expected_patched_modules: list of modules to check to ensure that
                                     they are patched (and to warn if they
                                     are not); these names should correspond
                                     to the names passed into the eventlet
                                     monkey_patch() routine. If not provided
                                     then *all* the modules that could be
                                     patched are checked. The currently valid
                                     selection is one or multiple of
                                     ['MySQLdb', '__builtin__', 'all', 'os',
                                     'psycopg', 'select', 'socket', 'thread',
                                     'time'] (where 'all' has an inherent
                                     special meaning).
    :type expected_patched_modules: list/tuple/iterable
    :param what: string to merge into the warnings message to identify
                 what is being checked (used in forming the emitted warnings
                 message).
    :type what: string
    """
    if not expected_patched_modules:
        expanded_patched_modules = _ALL_PATCH.copy()
    else:
        expanded_patched_modules = set()
        for m in expected_patched_modules:
            if m == 'all':
                expanded_patched_modules.update(_ALL_PATCH)
            else:
                if m not in _ALL_PATCH:
                    raise ValueError("Unknown module '%s' requested to check"
                                     " if patched" % m)
                else:
                    expanded_patched_modules.add(m)
    if EVENTLET_AVAILABLE:
        try:
            # The patcher code stores a dictionary here of all module
            # names -> whether they were patched...
            #
            # Example:
            #
            # >>> _patcher.monkey_patch(os=True)
            # >>> print(_patcher.already_patched)
            # {'os': True}
            maybe_patched = bool(_patcher.already_patched)
        except AttributeError:
            # Assume it is patched (the attribute used here doesn't appear
            # to be a public documented API, so we will assume that
            # everything is patched when that attribute isn't there, to be
            # safe...)
            maybe_patched = True
        if maybe_patched:
            not_patched = []
            for m in sorted(expanded_patched_modules):
                if not _patcher.is_monkey_patched(m):
                    not_patched.append(m)
            if not_patched:
                warnings.warn("It is highly recommended that when eventlet"
                              " is used that the %s modules are monkey"
                              " patched when using %s (to avoid"
                              " spurious or unexpected lock-ups"
                              " and/or hangs)" % (not_patched, what),
                              RuntimeWarning, stacklevel=3)


def is_monkey_patched(module):
    """Safely determine whether eventlet patching is enabled for a module.

    :param module: String, module name
    :returns: Bool, True if module is patched, False otherwise
    """
    if _patcher is None:
        return False
    return _patcher.is_monkey_patched(module)


class _Event(object):
    """A class that provides a consistent eventlet/threading Event API.

    This wraps the eventlet.event.Event class to have the same API as
    the standard threading.Event object.
    """
    def __init__(self, *args, **kwargs):
        self.clear()

    def clear(self):
        self._set = False
        self._event = _eventlet.event.Event()

    def is_set(self):
        return self._set

    isSet = is_set

    def set(self):
        self._set = True
        self._event.send(True)

    def wait(self, timeout=None):
        with _eventlet.timeout.Timeout(timeout, False):
            self._event.wait()
        return self.is_set()


def Event():
    if is_monkey_patched("thread"):
        return _Event()
    else:
        return threading.Event()
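The "pick the right thread accessor" pattern in `fetch_current_thread_functor` can be sketched stdlib-only; `try_import` here is a minimal stand-in for `oslo_utils.importutils.try_import`, and the whole block degrades gracefully when eventlet is not installed:

```python
import importlib
import threading


def try_import(name):
    """Best-effort import: return the module, or None if unavailable
    (a sketch of oslo_utils.importutils.try_import)."""
    try:
        return importlib.import_module(name)
    except ImportError:
        return None


_eventlet = try_import('eventlet')
_patcher = try_import('eventlet.patcher')


def fetch_current_thread_functor():
    # Prefer the greenthread accessor only when 'thread' is actually
    # monkey-patched; otherwise the stdlib accessor is the safe choice.
    if _eventlet and _patcher and _patcher.is_monkey_patched('thread'):
        return _eventlet.getcurrent
    return threading.current_thread


fn = fetch_current_thread_functor()
print(fn())
```

Returning a *functor* rather than the thread itself lets callers capture the decision once and call it cheaply from hot paths.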
@ -1,346 +0,0 @@
# Copyright 2011 OpenStack Foundation.
# Copyright 2012, Red Hat, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Exception related utilities.
"""

import functools
import logging
import os
import sys
import time
import traceback

import six

from oslo_utils import encodeutils
from oslo_utils import reflection
from oslo_utils import timeutils


class CausedByException(Exception):
    """Base class for exceptions which have associated causes.

    NOTE(harlowja): in later versions of python we can likely remove the need
    to have a ``cause`` here as PY3+ have implemented :pep:`3134` which
    handles chaining in a much more elegant manner.

    :param message: the exception message, typically some string that is
                    useful for consumers to view when debugging or analyzing
                    failures.
    :param cause: the cause of the exception being raised, when provided this
                  should itself be an exception instance, this is useful for
                  creating a chain of exceptions for versions of python where
                  this is not yet implemented/supported natively.

    .. versionadded:: 2.4
    """
    def __init__(self, message, cause=None):
        super(CausedByException, self).__init__(message)
        self.cause = cause

    def __bytes__(self):
        return self.pformat().encode("utf8")

    def __str__(self):
        return self.pformat()

    def _get_message(self):
        # We must *not* call into the ``__str__`` method as that will
        # reactivate the pformat method, which will end up badly (and doesn't
        # look pretty at all); so be careful...
        return self.args[0]

    def pformat(self, indent=2, indent_text=" ", show_root_class=False):
        """Pretty formats a caused exception + any connected causes."""
        if indent < 0:
            raise ValueError("Provided 'indent' must be greater than"
                             " or equal to zero instead of %s" % indent)
        buf = six.StringIO()
        if show_root_class:
            buf.write(reflection.get_class_name(self, fully_qualified=False))
            buf.write(": ")
        buf.write(self._get_message())
        active_indent = indent
        next_up = self.cause
        seen = []
        while next_up is not None and next_up not in seen:
            seen.append(next_up)
            buf.write(os.linesep)
            if isinstance(next_up, CausedByException):
                buf.write(indent_text * active_indent)
                buf.write(reflection.get_class_name(next_up,
                                                    fully_qualified=False))
                buf.write(": ")
                buf.write(next_up._get_message())
            else:
                lines = traceback.format_exception_only(type(next_up),
                                                        next_up)
                for i, line in enumerate(lines):
                    buf.write(indent_text * active_indent)
                    if line.endswith("\n"):
                        # We'll add our own newlines on...
                        line = line[0:-1]
                    buf.write(line)
                    if i + 1 != len(lines):
                        buf.write(os.linesep)
                # Don't go deeper into non-caused-by exceptions, as we
                # don't know if their exception 'cause' attributes are even
                # usable objects...
                break
            active_indent += indent
            next_up = getattr(next_up, 'cause', None)
        return buf.getvalue()


def raise_with_cause(exc_cls, message, *args, **kwargs):
    """Helper to raise + chain exceptions (when able) and associate a *cause*.

    NOTE(harlowja): Since in py3.x exceptions can be chained (due to
    :pep:`3134`) we should try to raise the desired exception with the given
    *cause* (or extract a *cause* from the current stack if able) so that the
    exception formats nicely in old and new versions of python. Since py2.x
    does **not** support exception chaining (or formatting) the exception
    class provided should take a ``cause`` keyword argument (which it may
    discard if it wants) to its constructor which can then be
    inspected/retained on py2.x to get *similar* information as would be
    automatically included/obtainable in py3.x.

    :param exc_cls: the exception class to raise (typically one derived
                    from :py:class:`.CausedByException` or equivalent).
    :param message: the text/str message that will be passed to
                    the exception's constructor as its first positional
                    argument.
    :param args: any additional positional arguments to pass to the
                 exception's constructor.
    :param kwargs: any additional keyword arguments to pass to the
                   exception's constructor.

    .. versionadded:: 1.6
    """
    if 'cause' not in kwargs:
        exc_type, exc, exc_tb = sys.exc_info()
        try:
            if exc is not None:
                kwargs['cause'] = exc
        finally:
            # Leave no references around (especially with regards to
            # tracebacks and any variables that it retains internally).
            del exc_type, exc, exc_tb
    six.raise_from(exc_cls(message, *args, **kwargs), kwargs.get('cause'))


class save_and_reraise_exception(object):
    """Save current exception, run some code and then re-raise.

    In some cases the exception context can be cleared, resulting in an
    attempt to re-raise None after an exception handler is run. This can
    happen when eventlet switches greenthreads, or when code running in an
    exception handler raises and catches another exception. In both cases
    the exception context will be cleared.

    To work around this, we save the exception state, run handler code, and
    then re-raise the original exception. If another exception occurs, the
    saved exception is logged and the new exception is re-raised.

    In some cases the caller may not want to re-raise the exception, and
    for those circumstances this context provides a reraise flag that
    can be used to suppress the exception. For example::

        except Exception:
            with save_and_reraise_exception() as ctxt:
                decide_if_need_reraise()
                if not should_be_reraised:
                    ctxt.reraise = False

    If another exception occurs and the reraise flag is False,
    the saved exception will not be logged.

    If the caller wants to raise a new exception during exception handling,
    they can set reraise to False initially, with the ability to set it back
    to True if needed::

        except Exception:
            with save_and_reraise_exception(reraise=False) as ctxt:
                [if statements to determine whether to raise a new exception]
                # Not raising a new exception, so reraise
                ctxt.reraise = True

    .. versionchanged:: 1.4
       Added *logger* optional parameter.
    """
    def __init__(self, reraise=True, logger=None):
        self.reraise = reraise
        if logger is None:
            logger = logging.getLogger()
        self.logger = logger
        self.type_, self.value, self.tb = (None, None, None)

    def force_reraise(self):
        if self.type_ is None and self.value is None:
            raise RuntimeError("There is no (currently) captured exception"
                               " to force the reraising of")
        six.reraise(self.type_, self.value, self.tb)

    def capture(self, check=True):
        (type_, value, tb) = sys.exc_info()
        if check and type_ is None and value is None:
            raise RuntimeError("There is no active exception to capture")
        self.type_, self.value, self.tb = (type_, value, tb)
        return self

    def __enter__(self):
        # TODO(harlowja): perhaps someday in the future turn check here
        # to true, because that is likely the desired intention, and doing
        # so ensures that people are actually using this correctly.
        return self.capture(check=False)

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is not None:
            if self.reraise:
                self.logger.error('Original exception being dropped: %s',
                                  traceback.format_exception(self.type_,
                                                             self.value,
                                                             self.tb))
            return False
        if self.reraise:
            self.force_reraise()


def forever_retry_uncaught_exceptions(*args, **kwargs):
    """Decorates the provided function with infinite retry behavior.

    The retry delay is **always** one second unless the keyword argument
    ``retry_delay`` is passed with a different value (values less than zero
    are automatically changed to 0.0).

    If repeated exceptions with the same message occur, logging will only
    be triggered for those equivalent messages every 60.0 seconds; this can
    be altered via the keyword argument ``same_log_delay`` (exceptions that
    change the message are always logged, no matter what this delay is set
    to). As in the ``retry_delay`` case, if this is less than zero it will
    be automatically changed to 0.0.
    """

    def decorator(infunc):
        retry_delay = max(0.0, float(kwargs.get('retry_delay', 1.0)))
        same_log_delay = max(0.0, float(kwargs.get('same_log_delay', 60.0)))

        @six.wraps(infunc)
        def wrapper(*args, **kwargs):
            last_exc_message = None
            same_failure_count = 0
            watch = timeutils.StopWatch(duration=same_log_delay)
            while True:
                try:
                    return infunc(*args, **kwargs)
                except Exception as exc:
                    this_exc_message = encodeutils.exception_to_unicode(exc)
                    if this_exc_message == last_exc_message:
                        same_failure_count += 1
                    else:
                        same_failure_count = 1
                    if this_exc_message != last_exc_message or \
                            watch.expired():
                        # The watch has expired or the exception message
                        # changed, so time to log it again...
                        logging.exception(
                            'Unexpected exception occurred %d time(s)... '
                            'retrying.' % same_failure_count)
                        if not watch.has_started():
                            watch.start()
                        else:
                            watch.restart()
                        same_failure_count = 0
                        last_exc_message = this_exc_message
                    time.sleep(retry_delay)
        return wrapper

    # This is needed to handle when the decorator has args or the decorator
    # doesn't have args; python is rather weird here...
    if kwargs or not args:
        return decorator
    else:
        if len(args) == 1:
            return decorator(args[0])
        else:
            return decorator


class exception_filter(object):
    """A context manager that prevents some exceptions from being raised.

    Use this class as a decorator for a function that returns whether a given
    exception should be ignored, in cases where complex logic beyond subclass
    matching is required. e.g.

    >>> @exception_filter
    >>> def ignore_test_assertions(ex):
    ...     return isinstance(ex, AssertionError) and 'test' in str(ex)

    The filter matching function can then be used as a context manager:

    >>> with ignore_test_assertions:
    ...     assert False, 'This is a test'

    or called directly:

    >>> try:
    ...     assert False, 'This is a test'
    ... except Exception as ex:
    ...     ignore_test_assertions(ex)

    Any non-matching exception will be re-raised. When the filter is used as
    a context manager, the traceback for re-raised exceptions is always
    preserved. When the filter is called as a function, the traceback is
    preserved provided that no other exceptions have been raised in the
    intervening time. The context manager method is preferred for this
    reason except in cases where the ignored exception affects control flow.
    """

    def __init__(self, should_ignore_ex):
        self._should_ignore_ex = should_ignore_ex

        if all(hasattr(should_ignore_ex, a)
               for a in functools.WRAPPER_ASSIGNMENTS):
            functools.update_wrapper(self, should_ignore_ex)

    def __get__(self, obj, owner):
        return type(self)(self._should_ignore_ex.__get__(obj, owner))

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_val is not None:
            return self._should_ignore_ex(exc_val)

    def __call__(self, ex):
        """Re-raise any exception value not being filtered out.

        If the exception was the last to be raised, it will be re-raised
        with its original traceback.
        """
        exc_type, exc_val, traceback = sys.exc_info()

        try:
            if not self._should_ignore_ex(ex):
                if exc_val is ex:
                    six.reraise(exc_type, exc_val, traceback)
                else:
                    raise ex
        finally:
            del exc_type, exc_val, traceback
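The capture/re-raise mechanics behind `save_and_reraise_exception` can be sketched with only the stdlib; `save_and_reraise` below is a minimal illustrative sketch (no logging, no `six`), not the oslo class itself:

```python
import sys


class save_and_reraise(object):
    """Minimal sketch of the save-and-reraise pattern: capture
    sys.exc_info() on enter, re-raise the saved exception on a clean
    exit unless the caller disabled reraising."""

    def __init__(self, reraise=True):
        self.reraise = reraise
        self.exc_info = (None, None, None)

    def __enter__(self):
        self.exc_info = sys.exc_info()
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is not None:
            return False            # a new exception wins; let it propagate
        if self.reraise and self.exc_info[1] is not None:
            raise self.exc_info[1]  # re-raise the saved exception


def cleanup_then_reraise():
    try:
        raise ValueError("boom")
    except ValueError:
        with save_and_reraise() as ctxt:
            pass                    # cleanup code would run here


try:
    cleanup_then_reraise()
except ValueError as exc:
    print("re-raised:", exc)
```

Setting `reraise=False` before the block exits swallows the saved exception, mirroring the `ctxt.reraise = False` idiom in the docstring above.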
@ -1,105 +0,0 @@
# Copyright 2011 OpenStack Foundation.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
File utilities.

.. versionadded:: 1.8
"""

import contextlib
import errno
import os
import stat
import tempfile

from oslo_utils import excutils

_DEFAULT_MODE = stat.S_IRWXU | stat.S_IRWXG | stat.S_IRWXO


def ensure_tree(path, mode=_DEFAULT_MODE):
    """Create a directory (and any ancestor directories required).

    :param path: Directory to create
    :param mode: Directory creation permissions
    """
    try:
        os.makedirs(path, mode)
    except OSError as exc:
        if exc.errno == errno.EEXIST:
            if not os.path.isdir(path):
                raise
        else:
            raise


def delete_if_exists(path, remove=os.unlink):
    """Delete a file, but ignore the file-not-found error.

    :param path: File to delete
    :param remove: Optional function to remove passed path
    """
    try:
        remove(path)
    except OSError as e:
        if e.errno != errno.ENOENT:
            raise


@contextlib.contextmanager
def remove_path_on_error(path, remove=delete_if_exists):
    """Protect code that wants to operate on PATH atomically.
    Any exception will cause PATH to be removed.

    :param path: File to work with
    :param remove: Optional function to remove passed path
    """
    try:
        yield
    except Exception:
        with excutils.save_and_reraise_exception():
            remove(path)


def write_to_tempfile(content, path=None, suffix='', prefix='tmp'):
    """Create a temporary file containing data.

    Create a temporary file containing the specified content, with a
    specified filename suffix and prefix. The tempfile will be created in a
    default location, or in the directory `path`, if it is not None. `path`
    and its parent directories will be created if they don't exist.

    :param content: bytestring to write to the file
    :param path: same as parameter 'dir' for mkstemp
    :param suffix: same as parameter 'suffix' for mkstemp
    :param prefix: same as parameter 'prefix' for mkstemp

    For example, it can be used in database tests for creating
    configuration files.

    .. versionadded:: 1.9
    """
    if path:
        ensure_tree(path)

    (fd, path) = tempfile.mkstemp(suffix=suffix, dir=path, prefix=prefix)
    try:
        os.write(fd, content)
    finally:
        os.close(fd)
    return path
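The `ensure_tree` EEXIST handling and `write_to_tempfile` flow can be exercised with a stdlib-only sketch (same names, but re-implemented here without oslo imports so the example stands alone):

```python
import errno
import os
import tempfile


def ensure_tree(path, mode=0o777):
    """Create a directory tree, tolerating a pre-existing directory
    (mirrors the EEXIST handling above)."""
    try:
        os.makedirs(path, mode)
    except OSError as exc:
        if exc.errno != errno.EEXIST or not os.path.isdir(path):
            raise


def write_to_tempfile(content, path=None, suffix='', prefix='tmp'):
    """Write bytes to a fresh mkstemp file, creating `path` if needed."""
    if path:
        ensure_tree(path)
    fd, name = tempfile.mkstemp(suffix=suffix, dir=path, prefix=prefix)
    try:
        os.write(fd, content)
    finally:
        os.close(fd)
    return name


target = write_to_tempfile(b'hello', suffix='.conf')
print(open(target, 'rb').read())
```

Note that calling `ensure_tree` twice on the same path is safe: the second call hits EEXIST on an existing directory and returns silently, which is the whole point of the guard.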
@ -1,51 +0,0 @@
# Copyright 2015 OpenStack Foundation
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Test fixtures.

.. versionadded:: 1.3
"""

import fixtures

from oslo_utils import timeutils


class TimeFixture(fixtures.Fixture):
    """A fixture for overriding the time returned by timeutils.utcnow().

    :param override_time: datetime instance or list thereof. If not given,
                          defaults to the current UTC time.
    """

    def __init__(self, override_time=None):
        super(TimeFixture, self).__init__()
        self._override_time = override_time

    def setUp(self):
        super(TimeFixture, self).setUp()
        timeutils.set_time_override(self._override_time)
        self.addCleanup(timeutils.clear_time_override)

    def advance_time_delta(self, timedelta):
        """Advance overridden time using a datetime.timedelta."""
        timeutils.advance_time_delta(timedelta)

    def advance_time_seconds(self, seconds):
        """Advance overridden time by seconds."""
        timeutils.advance_time_seconds(seconds)
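`TimeFixture` drives the `timeutils` override hooks; the underlying pattern of swapping the clock for a controllable one can be sketched stdlib-only (`FakeClock` is an illustrative name, not the oslo API):

```python
import datetime


class FakeClock(object):
    """A controllable replacement for a utcnow()-style function:
    tests pin the start time, then advance it deterministically."""

    def __init__(self, start):
        self._now = start

    def utcnow(self):
        return self._now

    def advance(self, delta):
        self._now += delta


clock = FakeClock(datetime.datetime(2017, 1, 1))
print(clock.utcnow())
clock.advance(datetime.timedelta(seconds=90))
print(clock.utcnow())
```

The fixture adds one thing this sketch omits: `addCleanup` restores the real clock automatically when the test ends, so a failing test cannot leak the override into later tests.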
@ -1,78 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""Thread safe fnmatch re-implementation.

The standard library fnmatch in Python versions <= 2.7.9 has a thread
safety issue; this module was created for that case. See:
https://bugs.python.org/issue23191

.. versionadded:: 3.3
"""

from __future__ import absolute_import

import fnmatch as standard_fnmatch
import os
import posixpath
import re
import sys


if sys.version_info > (2, 7, 9):
    fnmatch = standard_fnmatch.fnmatch
    fnmatchcase = standard_fnmatch.fnmatchcase
    filter = standard_fnmatch.filter
    translate = standard_fnmatch.translate
else:
    _MATCH_CACHE = {}
    _MATCH_CACHE_MAX = 100

    translate = standard_fnmatch.translate

    def _get_cached_pattern(pattern):
        cached_pattern = _MATCH_CACHE.get(pattern)
        if cached_pattern is None:
            translated_pattern = translate(pattern)
            cached_pattern = re.compile(translated_pattern)
            if len(_MATCH_CACHE) >= _MATCH_CACHE_MAX:
                _MATCH_CACHE.clear()
            _MATCH_CACHE[pattern] = cached_pattern
        return cached_pattern

    def fnmatchcase(filename, pattern):
        cached_pattern = _get_cached_pattern(pattern)
        return cached_pattern.match(filename) is not None

    def fnmatch(filename, pattern):
        filename = os.path.normcase(filename)
        pattern = os.path.normcase(pattern)
        return fnmatchcase(filename, pattern)

    def filter(filenames, pattern):
        filtered_filenames = []

        pattern = os.path.normcase(pattern)
        cached_pattern = _get_cached_pattern(pattern)

        if os.path is posixpath:
            # normcase on posix is a NOP. Optimize it away from the loop.
            for filename in filenames:
                if cached_pattern.match(filename):
                    filtered_filenames.append(filename)
        else:
            for filename in filenames:
                norm_name = os.path.normcase(filename)
                if cached_pattern.match(norm_name):
                    filtered_filenames.append(filename)

        return filtered_filenames
@@ -1,177 +0,0 @@
# Copyright 2010 United States Government as represented by the
# Administrator of the National Aeronautics and Space Administration.
# All Rights Reserved.
# Copyright (c) 2010 Citrix Systems, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Helper methods to deal with images.

.. versionadded:: 3.1

.. versionchanged:: 3.14.0
   Added the *format* parameter.
"""

import json
import re

from oslo_utils._i18n import _
from oslo_utils import strutils


class QemuImgInfo(object):
    """Parse Qemu image information from the output of `qemu-img info`.

    An instance of :class:`QemuImgInfo` has the properties: `image`,
    `backing_file`, `file_format`, `virtual_size`, `cluster_size`,
    `disk_size`, `snapshots` and `encrypted`.

    The *format* parameter can be set to 'json' or 'human'. With 'json'
    format output, the image information is parsed more easily and
    reliably.
    """
    BACKING_FILE_RE = re.compile((r"^(.*?)\s*\(actual\s+path\s*:"
                                  r"\s+(.*?)\)\s*$"), re.I)
    TOP_LEVEL_RE = re.compile(r"^([\w\d\s\_\-]+):(.*)$")
    SIZE_RE = re.compile(r"(\d*\.?\d+)(\w+)?(\s*\(\s*(\d+)\s+bytes\s*\))?",
                         re.I)

    def __init__(self, cmd_output=None, format='human'):
        if format == 'json':
            details = json.loads(cmd_output or '{}')
            self.image = details.get('filename')
            self.backing_file = details.get('backing-filename')
            self.file_format = details.get('format')
            self.virtual_size = details.get('virtual-size')
            self.cluster_size = details.get('cluster-size')
            self.disk_size = details.get('actual-size')
            self.snapshots = details.get('snapshots', [])
            self.encrypted = details.get('encrypted')
        else:
            details = self._parse(cmd_output or '')
            self.image = details.get('image')
            self.backing_file = details.get('backing_file')
            self.file_format = details.get('file_format')
            self.virtual_size = details.get('virtual_size')
            self.cluster_size = details.get('cluster_size')
            self.disk_size = details.get('disk_size')
            self.snapshots = details.get('snapshot_list', [])
            self.encrypted = details.get('encrypted')

    def __str__(self):
        lines = [
            'image: %s' % self.image,
            'file_format: %s' % self.file_format,
            'virtual_size: %s' % self.virtual_size,
            'disk_size: %s' % self.disk_size,
            'cluster_size: %s' % self.cluster_size,
            'backing_file: %s' % self.backing_file,
        ]
        if self.snapshots:
            lines.append("snapshots: %s" % self.snapshots)
        if self.encrypted:
            lines.append("encrypted: %s" % self.encrypted)
        return "\n".join(lines)

    def _canonicalize(self, field):
        # Standardize on underscores/lc/no dash and no spaces
        # since qemu seems to have mixed outputs here... and
        # this format allows for better integration with python
        # - i.e. for usage in kwargs and such...
        field = field.lower().strip()
        for c in (" ", "-"):
            field = field.replace(c, '_')
        return field

    def _extract_bytes(self, details):
        # Replace it with the byte amount
        real_size = self.SIZE_RE.search(details)
        if not real_size:
            raise ValueError(_('Invalid input value "%s".') % details)
        magnitude = real_size.group(1)
        unit_of_measure = real_size.group(2)
        bytes_info = real_size.group(3)
        if bytes_info:
            return int(real_size.group(4))
        elif not unit_of_measure:
            return int(magnitude)
        return strutils.string_to_bytes('%s%sB' % (magnitude, unit_of_measure),
                                        return_int=True)

    def _extract_details(self, root_cmd, root_details, lines_after):
        real_details = root_details
        if root_cmd == 'backing_file':
            # Replace it with the real backing file
            backing_match = self.BACKING_FILE_RE.match(root_details)
            if backing_match:
                real_details = backing_match.group(2).strip()
        elif root_cmd in ['virtual_size', 'cluster_size', 'disk_size']:
            # Replace it with the byte amount (if we can convert it)
            if root_details in ('None', 'unavailable'):
                real_details = 0
            else:
                real_details = self._extract_bytes(root_details)
        elif root_cmd == 'file_format':
            real_details = real_details.strip().lower()
        elif root_cmd == 'snapshot_list':
            # Next line should be a header, starting with 'ID'
            if not lines_after or not lines_after.pop(0).startswith("ID"):
                msg = _("Snapshot list encountered but no header found!")
                raise ValueError(msg)
            real_details = []
            # This is the sprintf pattern we will try to match
            # "%-10s%-20s%7s%20s%15s"
            # ID TAG VM SIZE DATE VM CLOCK (current header)
            while lines_after:
                line = lines_after[0]
                line_pieces = line.split()
                if len(line_pieces) != 6:
                    break
                # Check against this pattern in the final position
                # "%02d:%02d:%02d.%03d"
                date_pieces = line_pieces[5].split(":")
                if len(date_pieces) != 3:
                    break
                lines_after.pop(0)
                real_details.append({
                    'id': line_pieces[0],
                    'tag': line_pieces[1],
                    'vm_size': line_pieces[2],
                    'date': line_pieces[3],
                    'vm_clock': line_pieces[4] + " " + line_pieces[5],
                })
        return real_details

    def _parse(self, cmd_output):
        # Analysis done of qemu-img.c to figure out what is going on here
        # Find all points start with some chars and then a ':' then a newline
        # and then handle the results of those 'top level' items in a separate
        # function.
        #
        # TODO(harlowja): newer versions might have a json output format
        #                 we should switch to that whenever possible.
        #                 see: http://bit.ly/XLJXDX
        contents = {}
        lines = [x for x in cmd_output.splitlines() if x.strip()]
        while lines:
            line = lines.pop(0)
            top_level = self.TOP_LEVEL_RE.match(line)
            if top_level:
                root = self._canonicalize(top_level.group(1))
                if not root:
                    continue
                root_details = top_level.group(2).strip()
                details = self._extract_details(root, root_details, lines)
                contents[root] = details
        return contents
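The 'json' branch of the constructor above simply remaps qemu-img's hyphenated JSON keys onto underscore attributes, defaulting missing keys to None. A minimal self-contained sketch of that mapping (the sample output below is hypothetical, not captured from a real `qemu-img` run):

```python
import json

# Hypothetical qemu-img info --output=json result, for illustration only.
SAMPLE = ('{"filename": "disk.qcow2", "format": "qcow2", '
          '"virtual-size": 1073741824, "actual-size": 197120, '
          '"cluster-size": 65536}')


def parse_qemu_img_json(cmd_output):
    # Mirrors the 'json' branch above: hyphenated qemu-img keys become
    # underscore fields, and absent keys default to None (or [] for
    # snapshots).
    details = json.loads(cmd_output or '{}')
    return {
        'image': details.get('filename'),
        'file_format': details.get('format'),
        'virtual_size': details.get('virtual-size'),
        'disk_size': details.get('actual-size'),
        'cluster_size': details.get('cluster-size'),
        'backing_file': details.get('backing-filename'),
        'snapshots': details.get('snapshots', []),
    }


info = parse_qemu_img_json(SAMPLE)
```

Unlike the 'human' branch, there is no regex parsing here at all, which is why the class docstring recommends the 'json' format as the more reliable path.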
@@ -1,123 +0,0 @@
# Copyright 2011 OpenStack Foundation.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Import related utilities and helper functions.
"""

import sys
import traceback


def import_class(import_str):
    """Returns a class from a string including module and class.

    .. versionadded:: 0.3
    """
    mod_str, _sep, class_str = import_str.rpartition('.')
    __import__(mod_str)
    try:
        return getattr(sys.modules[mod_str], class_str)
    except AttributeError:
        raise ImportError('Class %s cannot be found (%s)' %
                          (class_str,
                           traceback.format_exception(*sys.exc_info())))


def import_object(import_str, *args, **kwargs):
    """Import a class and return an instance of it.

    .. versionadded:: 0.3
    """
    return import_class(import_str)(*args, **kwargs)


def import_object_ns(name_space, import_str, *args, **kwargs):
    """Tries to import an object from a default namespace.

    Imports a class and returns an instance of it, first by trying
    to find the class in a default namespace, then falling back to
    a full path if not found in the default namespace.

    .. versionadded:: 0.3

    .. versionchanged:: 2.6
       Don't capture :exc:`ImportError` when instantiating the object, only
       when importing the object class.
    """
    import_value = "%s.%s" % (name_space, import_str)
    try:
        cls = import_class(import_value)
    except ImportError:
        cls = import_class(import_str)
    return cls(*args, **kwargs)


def import_module(import_str):
    """Import a module.

    .. versionadded:: 0.3
    """
    __import__(import_str)
    return sys.modules[import_str]


def import_versioned_module(module, version, submodule=None):
    """Import a versioned module in format {module}.v{version}[.{submodule}].

    :param module: the module name.
    :param version: the version number.
    :param submodule: the submodule name.
    :raises ValueError: For any invalid input.

    .. versionadded:: 0.3

    .. versionchanged:: 3.17
       Added *module* parameter.
    """

    # NOTE(gcb) Disallow parameter version include character '.'
    if '.' in '%s' % version:
        raise ValueError("Parameter version shouldn't include character '.'.")
    module_str = '%s.v%s' % (module, version)
    if submodule:
        module_str = '.'.join((module_str, submodule))
    return import_module(module_str)


def try_import(import_str, default=None):
    """Try to import a module and if it fails return default."""
    try:
        return import_module(import_str)
    except ImportError:
        return default


def import_any(module, *modules):
    """Try to import a module from a list of modules.

    :param modules: A list of modules to try and import
    :returns: The first module found that can be imported
    :raises ImportError: If no modules can be imported from list

    .. versionadded:: 3.8
    """
    for module_name in (module,) + modules:
        imported_module = try_import(module_name)
        if imported_module:
            return imported_module

    raise ImportError('Unable to import any modules from the list %s' %
                      str(modules))
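A quick usage sketch of the `import_class`/`try_import` pattern above, inlined here so it runs without oslo_utils installed; the target names (`collections.OrderedDict`, the nonexistent `no_such_module_xyz`) are stdlib examples chosen purely for illustration:

```python
import sys
import traceback


def import_class(import_str):
    # Same contract as the helper above: dotted path in, class object out.
    mod_str, _sep, class_str = import_str.rpartition('.')
    __import__(mod_str)
    try:
        return getattr(sys.modules[mod_str], class_str)
    except AttributeError:
        raise ImportError('Class %s cannot be found (%s)' %
                          (class_str,
                           traceback.format_exception(*sys.exc_info())))


def try_import(import_str, default=None):
    # Import a module by name, swallowing ImportError and returning
    # the caller-supplied default instead.
    try:
        __import__(import_str)
        return sys.modules[import_str]
    except ImportError:
        return default


cls = import_class('collections.OrderedDict')   # resolves to the class
missing = try_import('no_such_module_xyz')      # missing module -> None
```

`try_import` is the piece `import_any` builds on: it walks a candidate list and returns the first module whose `try_import` result is truthy.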
@@ -1,39 +0,0 @@
# Translations template for oslo.utils.
# Copyright (C) 2015 ORGANIZATION
# This file is distributed under the same license as the oslo.utils project.
#
# Translators:
# Andreas Jaeger <jaegerandi@gmail.com>, 2014
# Andreas Jaeger <jaegerandi@gmail.com>, 2016. #zanata
msgid ""
msgstr ""
"Project-Id-Version: oslo.utils 3.7.1.dev13\n"
"Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n"
"POT-Creation-Date: 2016-05-02 20:03+0000\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"PO-Revision-Date: 2014-12-07 06:44+0000\n"
"Last-Translator: Andreas Jaeger <jaegerandi@gmail.com>\n"
"Language: de\n"
"Plural-Forms: nplurals=2; plural=(n != 1);\n"
"Generated-By: Babel 2.0\n"
"X-Generator: Zanata 3.7.3\n"
"Language-Team: German\n"

#, python-format
msgid "Could not determine IPv4 address for interface %(interface)s: %(error)s"
msgstr ""
"Konnte IPv4 Adresse für Interface %(interface)s nicht bestimmen: %(error)s"

#, python-format
msgid "Could not determine IPv4 address for interface %s, using 127.0.0.1"
msgstr ""
"Konnte IPv4 Adresse für Interface %s nicht bestimmen, verwende 127.0.0.1"

msgid ""
"Could not determine default network interface, using 127.0.0.1 for IPv4 "
"address"
msgstr ""
"Konnte default Network Interface nicht bestimmen, 127.0.0.1 wird als IPv4 "
"Adresse benutzt."
@@ -1,24 +0,0 @@
# Andreas Jaeger <jaegerandi@gmail.com>, 2016. #zanata
msgid ""
msgstr ""
"Project-Id-Version: oslo.utils 3.7.1.dev13\n"
"Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n"
"POT-Creation-Date: 2016-05-02 20:03+0000\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"PO-Revision-Date: 2016-01-31 08:11+0000\n"
"Last-Translator: Andreas Jaeger <jaegerandi@gmail.com>\n"
"Language-Team: German\n"
"Language: de\n"
"X-Generator: Zanata 3.7.3\n"
"Plural-Forms: nplurals=2; plural=(n != 1)\n"

msgid "tcp_keepcnt not available on your system"
msgstr "tcp_keepcnt ist nicht verfügbar auf ihrem System"

msgid "tcp_keepidle not available on your system"
msgstr "tcp_keepidle ist nicht verfügbar auf ihrem System"

msgid "tcp_keepintvl not available on your system"
msgstr "tcp_keepintvl ist nicht verfügbar auf ihrem System"
@@ -1,67 +0,0 @@
# Translations template for oslo.utils.
# Copyright (C) 2015 ORGANIZATION
# This file is distributed under the same license as the oslo.utils project.
#
# Translators:
# Andreas Jaeger <jaegerandi@gmail.com>, 2014
# Ettore Atalan <atalanttore@googlemail.com>, 2014
# Andreas Jaeger <jaegerandi@gmail.com>, 2016. #zanata
msgid ""
msgstr ""
"Project-Id-Version: oslo.utils 3.7.1.dev13\n"
"Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n"
"POT-Creation-Date: 2016-05-02 20:03+0000\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"PO-Revision-Date: 2016-01-31 08:10+0000\n"
"Last-Translator: Andreas Jaeger <jaegerandi@gmail.com>\n"
"Language: de\n"
"Plural-Forms: nplurals=2; plural=(n != 1);\n"
"Generated-By: Babel 2.0\n"
"X-Generator: Zanata 3.7.3\n"
"Language-Team: German\n"

#, python-format
msgid "%s is not a string or unicode"
msgstr "%s ist keine Zeichenkette oder Unicode"

#, python-format
msgid ""
"Bad prefix or mac format for generating IPv6 address by EUI-64: %(prefix)s, "
"%(mac)s:"
msgstr ""
"Falsches Präfix- oder MAC-Format für das Generieren der IPv6-Adresse durch "
"EUI-64: %(prefix)s, %(mac)s:"

#, python-format
msgid "Bad prefix type for generating IPv6 address by EUI-64: %s"
msgstr ""
"Falscher Präfixtyp für das Generieren der IPv6-Adresse durch EUI-64: %s"

#, python-format
msgid "Invalid input value \"%s\"."
msgstr "Ungültiger Eingabewert \"%s\"."

#, python-format
msgid "Invalid string format: %s"
msgstr "Ungültiges Stringformat: %s"

#, python-format
msgid "Invalid unit system: \"%s\""
msgstr "Ungültiges Einheitensystem: \"%s\""

msgid "Snapshot list encountered but no header found!"
msgstr "Momentaufnahmenliste gefunden, aber kein Header gefunden!"

msgid "Unable to generate IP address by EUI64 for IPv4 prefix"
msgstr ""
"IP-Adresse kann nicht mithilfe von EUI64 mit dem IPv4-Präfix generiert werden"

#, python-format
msgid "Unrecognized value '%(val)s', acceptable values are: %(acceptable)s"
msgstr "Nicht erkannter Wert '%(val)s', zulässige Werte are: %(acceptable)s"

#, python-format
msgid "Version %s is invalid."
msgstr "Version %s ist ungültig."
@@ -1,30 +0,0 @@
# Translations template for oslo.utils.
# Copyright (C) 2015 ORGANIZATION
# This file is distributed under the same license as the oslo.utils project.
#
# Translators:
# Andi Chandler <andi@gowling.com>, 2014
# Andreas Jaeger <jaegerandi@gmail.com>, 2016. #zanata
msgid ""
msgstr ""
"Project-Id-Version: oslo.utils 3.7.1.dev13\n"
"Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n"
"POT-Creation-Date: 2016-05-02 20:03+0000\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"PO-Revision-Date: 2014-09-02 09:07+0000\n"
"Last-Translator: Andi Chandler <andi@gowling.com>\n"
"Language: en-GB\n"
"Plural-Forms: nplurals=2; plural=(n != 1);\n"
"Generated-By: Babel 2.0\n"
"X-Generator: Zanata 3.7.3\n"
"Language-Team: English (United Kingdom)\n"

#, python-format
msgid "Original exception being dropped: %s"
msgstr "Original exception being dropped: %s"

#, python-format
msgid "Unexpected exception occurred %d time(s)... retrying."
msgstr "Unexpected exception occurred %d time(s)... retrying."
@@ -1,38 +0,0 @@
# Translations template for oslo.utils.
# Copyright (C) 2015 ORGANIZATION
# This file is distributed under the same license as the oslo.utils project.
#
# Translators:
# Andi Chandler <andi@gowling.com>, 2014-2015
# Andreas Jaeger <jaegerandi@gmail.com>, 2016. #zanata
msgid ""
msgstr ""
"Project-Id-Version: oslo.utils 3.7.1.dev13\n"
"Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n"
"POT-Creation-Date: 2016-05-02 20:03+0000\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"PO-Revision-Date: 2015-04-22 05:28+0000\n"
"Last-Translator: Andi Chandler <andi@gowling.com>\n"
"Language: en-GB\n"
"Plural-Forms: nplurals=2; plural=(n != 1);\n"
"Generated-By: Babel 2.0\n"
"X-Generator: Zanata 3.7.3\n"
"Language-Team: English (United Kingdom)\n"

#, python-format
msgid "Could not determine IPv4 address for interface %(interface)s: %(error)s"
msgstr ""
"Could not determine IPv4 address for interface %(interface)s: %(error)s"

#, python-format
msgid "Could not determine IPv4 address for interface %s, using 127.0.0.1"
msgstr "Could not determine IPv4 address for interface %s, using 127.0.0.1"

msgid ""
"Could not determine default network interface, using 127.0.0.1 for IPv4 "
"address"
msgstr ""
"Could not determine default network interface, using 127.0.0.1 for IPv4 "
"address"
@@ -1,31 +0,0 @@
# Translations template for oslo.utils.
# Copyright (C) 2015 ORGANIZATION
# This file is distributed under the same license as the oslo.utils project.
#
# Translators:
# Andi Chandler <andi@gowling.com>, 2014
# Andreas Jaeger <jaegerandi@gmail.com>, 2016. #zanata
msgid ""
msgstr ""
"Project-Id-Version: oslo.utils 3.7.1.dev13\n"
"Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n"
"POT-Creation-Date: 2016-05-02 20:03+0000\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"PO-Revision-Date: 2014-09-02 09:08+0000\n"
"Last-Translator: Andi Chandler <andi@gowling.com>\n"
"Language: en-GB\n"
"Plural-Forms: nplurals=2; plural=(n != 1);\n"
"Generated-By: Babel 2.0\n"
"X-Generator: Zanata 3.7.3\n"
"Language-Team: English (United Kingdom)\n"

msgid "tcp_keepcnt not available on your system"
msgstr "tcp_keepcnt not available on your system"

msgid "tcp_keepidle not available on your system"
msgstr "tcp_keepidle not available on your system"

msgid "tcp_keepintvl not available on your system"
msgstr "tcp_keepintvl not available on your system"
@@ -1,82 +0,0 @@
# Translations template for oslo.utils.
# Copyright (C) 2015 ORGANIZATION
# This file is distributed under the same license as the oslo.utils project.
#
# Translators:
# Andi Chandler <andi@gowling.com>, 2014-2015
# OpenStack Infra <zanata@openstack.org>, 2015. #zanata
# Andi Chandler <andi@gowling.com>, 2016. #zanata
# Andreas Jaeger <jaegerandi@gmail.com>, 2016. #zanata
msgid ""
msgstr ""
"Project-Id-Version: oslo.utils 3.12.1.dev3\n"
"Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n"
"POT-Creation-Date: 2016-06-10 18:11+0000\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"PO-Revision-Date: 2016-06-09 11:15+0000\n"
"Last-Translator: Andi Chandler <andi@gowling.com>\n"
"Language: en-GB\n"
"Plural-Forms: nplurals=2; plural=(n != 1);\n"
"Generated-By: Babel 2.0\n"
"X-Generator: Zanata 3.7.3\n"
"Language-Team: English (United Kingdom)\n"

#, python-format
msgid "%(name)s has %(length)s characters, less than %(min_length)s."
msgstr "%(name)s has %(length)s characters, less than %(min_length)s."

#, python-format
msgid "%(name)s has %(length)s characters, more than %(max_length)s."
msgstr "%(name)s has %(length)s characters, more than %(max_length)s."

#, python-format
msgid "%s is not a string or unicode"
msgstr "%s is not a string or unicode"

#, python-format
msgid ""
"Bad prefix or mac format for generating IPv6 address by EUI-64: %(prefix)s, "
"%(mac)s:"
msgstr ""
"Bad prefix or mac format for generating IPv6 address by EUI-64: %(prefix)s, "
"%(mac)s:"

#, python-format
msgid "Bad prefix type for generating IPv6 address by EUI-64: %s"
msgstr "Bad prefix type for generating IPv6 address by EUI-64: %s"

#, python-format
msgid "Invalid input value \"%s\"."
msgstr "Invalid input value \"%s\"."

#, python-format
msgid "Invalid path: %s"
msgstr "Invalid path: %s"

#, python-format
msgid "Invalid string format: %s"
msgstr "Invalid string format: %s"

#, python-format
msgid "Invalid unit system: \"%s\""
msgstr "Invalid unit system: \"%s\""

msgid "Snapshot list encountered but no header found!"
msgstr "Snapshot list encountered but no header found!"

msgid "Unable to generate IP address by EUI64 for IPv4 prefix"
msgstr "Unable to generate IP address by EUI64 for IPv4 prefix"

#, python-format
msgid "Unrecognized value '%(val)s', acceptable values are: %(acceptable)s"
msgstr "Unrecognised value '%(val)s', acceptable values are: %(acceptable)s"

#, python-format
msgid "Version %s is invalid."
msgstr "Version %s is invalid."

#, python-format
msgid "minsegs > maxsegs: %(min)d > %(max)d)"
msgstr "minsegs > maxsegs: %(min)d > %(max)d)"
@@ -1,24 +0,0 @@
# Alex Eng <loones1595@gmail.com>, 2016. #zanata
# KATO Tomoyuki <kato.tomoyuki@jp.fujitsu.com>, 2016. #zanata
msgid ""
msgstr ""
"Project-Id-Version: oslo.utils 3.15.1.dev4\n"
"Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n"
"POT-Creation-Date: 2016-07-11 23:17+0000\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"PO-Revision-Date: 2016-07-02 08:15+0000\n"
"Last-Translator: KATO Tomoyuki <kato.tomoyuki@jp.fujitsu.com>\n"
"Language-Team: Spanish\n"
"Language: es\n"
"X-Generator: Zanata 3.7.3\n"
"Plural-Forms: nplurals=2; plural=(n != 1)\n"

#, python-format
msgid "Original exception being dropped: %s"
msgstr "Se está descartando la excepción original: %s"

#, python-format
msgid "Unexpected exception occurred %d time(s)... retrying."
msgstr "Ocurrió una excepción inesperada... reintentando %d vez(veces)."
@@ -1,31 +0,0 @@
# Translations template for oslo.utils.
# Copyright (C) 2015 ORGANIZATION
# This file is distributed under the same license as the oslo.utils project.
#
# Translators:
# Corina Roe <croe@redhat.com>, 2014
# Jonathan Dupart <jonathan+transifex@dupart.org>, 2014
# Andreas Jaeger <jaegerandi@gmail.com>, 2016. #zanata
msgid ""
msgstr ""
"Project-Id-Version: oslo.utils 3.7.1.dev13\n"
"Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n"
"POT-Creation-Date: 2016-05-02 20:03+0000\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"PO-Revision-Date: 2014-10-08 05:32+0000\n"
"Last-Translator: Corina Roe <croe@redhat.com>\n"
"Language: fr\n"
"Plural-Forms: nplurals=2; plural=(n > 1);\n"
"Generated-By: Babel 2.0\n"
"X-Generator: Zanata 3.7.3\n"
"Language-Team: French\n"

#, python-format
msgid "Original exception being dropped: %s"
msgstr "Exception d'origine abandonnée : %s"

#, python-format
msgid "Unexpected exception occurred %d time(s)... retrying."
msgstr "Une exception inattendue s'est produite %d foi(s) ... nouvel essai."
@@ -1,41 +0,0 @@
# Translations template for oslo.utils.
# Copyright (C) 2015 ORGANIZATION
# This file is distributed under the same license as the oslo.utils project.
#
# Translators:
# Maxime COQUEREL <max.coquerel@gmail.com>, 2014-2015
# Andreas Jaeger <jaegerandi@gmail.com>, 2016. #zanata
msgid ""
msgstr ""
"Project-Id-Version: oslo.utils 3.7.1.dev13\n"
"Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n"
"POT-Creation-Date: 2016-05-02 20:03+0000\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"PO-Revision-Date: 2015-03-08 04:45+0000\n"
"Last-Translator: Maxime COQUEREL <max.coquerel@gmail.com>\n"
"Language: fr\n"
"Plural-Forms: nplurals=2; plural=(n > 1);\n"
"Generated-By: Babel 2.0\n"
"X-Generator: Zanata 3.7.3\n"
"Language-Team: French\n"

#, python-format
msgid "Could not determine IPv4 address for interface %(interface)s: %(error)s"
msgstr ""
"Impossible de déterminer l'adresse IPv4 utilisé par l'interface "
"%(interface)s: %(error)s"

#, python-format
msgid "Could not determine IPv4 address for interface %s, using 127.0.0.1"
msgstr ""
"Impossible de déterminer l'adresse IPv4 utilisé par l'interface %s pour "
"l'adresse 127.0.0.1"

msgid ""
"Could not determine default network interface, using 127.0.0.1 for IPv4 "
"address"
msgstr ""
"Impossible de déterminer l'interface de réseau par défaut, utilisé pour "
"l'adresse IPv4 127.0.0.1"
@@ -1,31 +0,0 @@
# Translations template for oslo.utils.
# Copyright (C) 2015 ORGANIZATION
# This file is distributed under the same license as the oslo.utils project.
#
# Translators:
# Maxime COQUEREL <max.coquerel@gmail.com>, 2014
# Andreas Jaeger <jaegerandi@gmail.com>, 2016. #zanata
msgid ""
msgstr ""
"Project-Id-Version: oslo.utils 3.7.1.dev13\n"
"Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n"
"POT-Creation-Date: 2016-05-02 20:03+0000\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"PO-Revision-Date: 2014-09-25 09:15+0000\n"
"Last-Translator: Maxime COQUEREL <max.coquerel@gmail.com>\n"
"Language: fr\n"
"Plural-Forms: nplurals=2; plural=(n > 1);\n"
"Generated-By: Babel 2.0\n"
"X-Generator: Zanata 3.7.3\n"
"Language-Team: French\n"

msgid "tcp_keepcnt not available on your system"
msgstr "tcp_keepcnt non disponible sur votre système"

msgid "tcp_keepidle not available on your system"
msgstr "tcp_keepidle non disponible sur votre système"

msgid "tcp_keepintvl not available on your system"
msgstr "tcp_keepintv non disponible sur votre système"
@@ -1,67 +0,0 @@
# Translations template for oslo.utils.
# Copyright (C) 2015 ORGANIZATION
# This file is distributed under the same license as the oslo.utils project.
#
# Translators:
# Maxime COQUEREL <max.coquerel@gmail.com>, 2014-2015
# OpenStack Infra <zanata@openstack.org>, 2015. #zanata
# Tom Cocozzello <tjcocozz@us.ibm.com>, 2015. #zanata
# Andreas Jaeger <jaegerandi@gmail.com>, 2016. #zanata
msgid ""
msgstr ""
"Project-Id-Version: oslo.utils 3.7.1.dev13\n"
"Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n"
"POT-Creation-Date: 2016-05-02 20:03+0000\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"PO-Revision-Date: 2015-07-27 10:55+0000\n"
"Last-Translator: Maxime COQUEREL <max.coquerel@gmail.com>\n"
"Language: fr\n"
"Plural-Forms: nplurals=2; plural=(n > 1);\n"
"Generated-By: Babel 2.0\n"
"X-Generator: Zanata 3.7.3\n"
"Language-Team: French\n"

#, python-format
msgid "%s is not a string or unicode"
msgstr "%s n'est pas une chaîne ou unicode"

#, python-format
msgid ""
"Bad prefix or mac format for generating IPv6 address by EUI-64: %(prefix)s, "
"%(mac)s:"
msgstr ""
"Mauvais type de préfixe ou mauvais format d'adresse mac pour générer une "
"adresse IPv6 par EUI-64: %(prefix)s, %(mac)s:"

#, python-format
msgid "Bad prefix type for generating IPv6 address by EUI-64: %s"
msgstr "Mauvais type de préfixe pour générer adresse IPv6 par EUI-64: %s"

#, python-format
msgid "Invalid input value \"%s\"."
msgstr "Valeur en entrée \"%s\" non valide."

#, python-format
msgid "Invalid string format: %s"
msgstr "Format de chaine de caractère non valide: %s"

#, python-format
msgid "Invalid unit system: \"%s\""
msgstr "Unit système non valide: \"%s\""

msgid "Snapshot list encountered but no header found!"
msgstr "Liste d'instantanés trouvée mais aucun en-tête trouvé !"

msgid "Unable to generate IP address by EUI64 for IPv4 prefix"
msgstr "Impossible de générer l'adresse IP par EUI64 pour le préfixe IPv4"

#, python-format
msgid "Unrecognized value '%(val)s', acceptable values are: %(acceptable)s"
msgstr ""
"Valeur non reconnue '%(val)s', les valeurs acceptables sont: %(acceptable)s"

#, python-format
msgid "Version %s is invalid."
msgstr "La version %s est invalide."
@@ -1,24 +0,0 @@
# Alex Eng <loones1595@gmail.com>, 2016. #zanata
# KATO Tomoyuki <kato.tomoyuki@jp.fujitsu.com>, 2016. #zanata
msgid ""
msgstr ""
"Project-Id-Version: oslo.utils 3.15.1.dev4\n"
"Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n"
"POT-Creation-Date: 2016-07-11 23:17+0000\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"PO-Revision-Date: 2016-07-02 08:15+0000\n"
"Last-Translator: KATO Tomoyuki <kato.tomoyuki@jp.fujitsu.com>\n"
"Language-Team: Portuguese (Brazil)\n"
"Language: pt-BR\n"
"X-Generator: Zanata 3.7.3\n"
"Plural-Forms: nplurals=2; plural=(n != 1)\n"

#, python-format
msgid "Original exception being dropped: %s"
msgstr "Exceção original sendo cancelada: %s"

#, python-format
msgid "Unexpected exception occurred %d time(s)... retrying."
msgstr "Exceção não esperada ocorreu %d vez(es)... tentando novamente."
@@ -1,453 +0,0 @@
# Copyright 2012 OpenStack Foundation.
# All Rights Reserved.
#
#    Licensed under the Apache License, Version 2.0 (the "License"); you may
#    not use this file except in compliance with the License. You may obtain
#    a copy of the License at
#
#         http://www.apache.org/licenses/LICENSE-2.0
#
#    Unless required by applicable law or agreed to in writing, software
#    distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
#    WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
#    License for the specific language governing permissions and limitations
#    under the License.

"""
Network-related utilities and helper functions.
"""

import logging
import os
import re
import socket

import netaddr
import netifaces
import six
from six.moves.urllib import parse

from oslo_utils._i18n import _


LOG = logging.getLogger(__name__)
_IS_IPV6_ENABLED = None


def parse_host_port(address, default_port=None):
    """Interpret a string as a host:port pair.

    An IPv6 address MUST be escaped if accompanied by a port,
    because otherwise ambiguity ensues: 2001:db8:85a3::8a2e:370:7334
    means both [2001:db8:85a3::8a2e:370:7334] and
    [2001:db8:85a3::8a2e:370]:7334.

    >>> parse_host_port('server01:80')
    ('server01', 80)
    >>> parse_host_port('server01')
    ('server01', None)
    >>> parse_host_port('server01', default_port=1234)
    ('server01', 1234)
    >>> parse_host_port('[::1]:80')
    ('::1', 80)
    >>> parse_host_port('[::1]')
    ('::1', None)
    >>> parse_host_port('[::1]', default_port=1234)
    ('::1', 1234)
    >>> parse_host_port('2001:db8:85a3::8a2e:370:7334', default_port=1234)
    ('2001:db8:85a3::8a2e:370:7334', 1234)
    >>> parse_host_port(None)
    (None, None)
    """
    if not address:
        return (None, None)

    if address[0] == '[':
        # Escaped ipv6
        _host, _port = address[1:].split(']')
        host = _host
        if ':' in _port:
            port = _port.split(':')[1]
        else:
            port = default_port
    else:
        if address.count(':') == 1:
            host, port = address.split(':')
        else:
            # 0 means ipv4, >1 means ipv6.
            # We prohibit unescaped ipv6 addresses with port.
            host = address
            port = default_port

    return (host, None if port is None else int(port))


def is_valid_ipv4(address):
    """Verify that address represents a valid IPv4 address.

    :param address: Value to verify
    :type address: string
    :returns: bool

    .. versionadded:: 1.1
    """
    try:
        return netaddr.valid_ipv4(address)
    except netaddr.AddrFormatError:
        return False


def is_valid_ipv6(address):
    """Verify that address represents a valid IPv6 address.

    :param address: Value to verify
    :type address: string
    :returns: bool

    .. versionadded:: 1.1
    """
    if not address:
        return False

    parts = address.rsplit("%", 1)
    address = parts[0]
    scope = parts[1] if len(parts) > 1 else None
    if scope is not None and (len(scope) < 1 or len(scope) > 15):
        return False

    try:
        return netaddr.valid_ipv6(address, netaddr.core.INET_PTON)
    except netaddr.AddrFormatError:
        return False


def is_valid_cidr(address):
    """Verify that address represents a valid CIDR address.

    :param address: Value to verify
    :type address: string
    :returns: bool

    .. versionadded:: 3.8
    """
    try:
        # Validate the correct CIDR Address
        netaddr.IPNetwork(address)
    except (TypeError, netaddr.AddrFormatError):
        return False

    # The validation above only partially verifies the /xx part,
    # so verify it here.
    ip_segment = address.split('/')

    if (len(ip_segment) <= 1 or
            ip_segment[1] == ''):
        return False

    return True


def is_valid_ipv6_cidr(address):
    """Verify that address represents a valid IPv6 CIDR address.

    :param address: address to verify
    :type address: string
    :returns: true if address is valid, false otherwise

    .. versionadded:: 3.17
    """
    try:
        netaddr.IPNetwork(address, version=6).cidr
        return True
    except (TypeError, netaddr.AddrFormatError):
        return False


def get_ipv6_addr_by_EUI64(prefix, mac):
    """Calculate IPv6 address using EUI-64 specification.

    This method calculates the IPv6 address using the EUI-64
    addressing scheme as explained in rfc2373.

    :param prefix: IPv6 prefix.
    :param mac: IEEE 802 48-bit MAC address.
    :returns: IPv6 address on success.
    :raises ValueError, TypeError: For any invalid input.

    .. versionadded:: 1.4
    """
    # Check if the prefix is an IPv4 address
    if is_valid_ipv4(prefix):
        msg = _("Unable to generate IP address by EUI64 for IPv4 prefix")
        raise ValueError(msg)
    try:
        eui64 = int(netaddr.EUI(mac).eui64())
        prefix = netaddr.IPNetwork(prefix)
        return netaddr.IPAddress(prefix.first + eui64 ^ (1 << 57))
    except (ValueError, netaddr.AddrFormatError):
        raise ValueError(_('Bad prefix or mac format for generating IPv6 '
                           'address by EUI-64: %(prefix)s, %(mac)s:')
                         % {'prefix': prefix, 'mac': mac})
    except TypeError:
        raise TypeError(_('Bad prefix type for generating IPv6 address by '
                          'EUI-64: %s') % prefix)


def is_ipv6_enabled():
    """Check if IPv6 support is enabled on the platform.

    This API looks at the platform's proc entries to figure
    out the status of IPv6 support on the platform.

    :returns: True if the platform has IPv6 support, False otherwise.

    .. versionadded:: 1.4
    """

    global _IS_IPV6_ENABLED

    if _IS_IPV6_ENABLED is None:
        disabled_ipv6_path = "/proc/sys/net/ipv6/conf/default/disable_ipv6"
        if os.path.exists(disabled_ipv6_path):
            with open(disabled_ipv6_path, 'r') as f:
                disabled = f.read().strip()
            _IS_IPV6_ENABLED = disabled == "0"
        else:
            _IS_IPV6_ENABLED = False
    return _IS_IPV6_ENABLED


def is_valid_ip(address):
    """Verify that address represents a valid IP address.

    :param address: Value to verify
    :type address: string
    :returns: bool

    .. versionadded:: 1.1
    """
    return is_valid_ipv4(address) or is_valid_ipv6(address)


def is_valid_mac(address):
    """Verify the format of a MAC address.

    Check if a MAC address is valid and contains six octets. Accepts
    colon-separated format only.

    :param address: MAC address to be validated.
    :returns: True if valid. False if not.

    .. versionadded:: 3.17
    """
    m = "[0-9a-f]{2}(:[0-9a-f]{2}){5}$"
    return (isinstance(address, six.string_types) and
            re.match(m, address.lower()))


def _is_int_in_range(value, start, end):
    """Try to convert value to int and check if it lies within
    range 'start' to 'end'.

    :param value: value to verify
    :param start: start number of range
    :param end: end number of range
    :returns: bool
    """
    try:
        val = int(value)
    except (ValueError, TypeError):
        return False
    return (start <= val <= end)


def is_valid_port(port):
    """Verify that port represents a valid port number.

    Port can be valid integer having a value of 0 up to and
    including 65535.

    .. versionadded:: 1.1.1
    """
    return _is_int_in_range(port, 0, 65535)


def is_valid_icmp_type(type):
    """Verify if ICMP type is valid.

    :param type: ICMP *type* field can only be a valid integer
    :returns: bool

    ICMP *type* field can be valid integer having a value of 0
    up to and including 255.
    """
    return _is_int_in_range(type, 0, 255)


def is_valid_icmp_code(code):
    """Verify if ICMP code is valid.

    :param code: ICMP *code* field can be valid integer or None
    :returns: bool

    ICMP *code* field can be either None or valid integer having
    a value of 0 up to and including 255.
    """
    if code is None:
        return True
    return _is_int_in_range(code, 0, 255)


def get_my_ipv4():
    """Returns the actual ipv4 of the local machine.

    This code figures out what source address would be used if some traffic
    were to be sent out to some well known address on the Internet. In this
    case, an IP from RFC5737 is used, but the specific address does not
    matter much. No traffic is actually sent.

    .. versionadded:: 1.1

    .. versionchanged:: 1.2.1
       Return ``'127.0.0.1'`` if there is no default interface.
    """
    try:
        csock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        csock.connect(('192.0.2.0', 80))
        (addr, port) = csock.getsockname()
        csock.close()
        return addr
    except socket.error:
        return _get_my_ipv4_address()


def _get_my_ipv4_address():
    """Figure out the best ipv4."""
    LOCALHOST = '127.0.0.1'
    gtw = netifaces.gateways()
    try:
        interface = gtw['default'][netifaces.AF_INET][1]
    except (KeyError, IndexError):
        LOG.info('Could not determine default network interface, '
                 'using 127.0.0.1 for IPv4 address')
        return LOCALHOST

    try:
        return netifaces.ifaddresses(interface)[netifaces.AF_INET][0]['addr']
    except (KeyError, IndexError):
        LOG.info('Could not determine IPv4 address for interface %s, '
                 'using 127.0.0.1',
                 interface)
    except Exception as e:
        LOG.info('Could not determine IPv4 address for '
                 'interface %(interface)s: %(error)s',
                 {'interface': interface, 'error': e})
    return LOCALHOST


class _ModifiedSplitResult(parse.SplitResult):
    """Split results class for urlsplit."""

    def params(self, collapse=True):
        """Extracts the query parameters from the split urls components.

        This method will provide back as a dictionary the query parameter
        names and values that were provided in the url.

        :param collapse: Boolean, turn on or off collapsing of query values
        with the same name. Since a url can contain the same query parameter
        name with different values it may or may not be useful for users to
        care that this has happened. This parameter when True uses the
        last value that was given for a given name, while if False it will
        retain all values provided by associating the query parameter name with
        a list of values instead of a single (non-list) value.
        """
        if self.query:
            if collapse:
                return dict(parse.parse_qsl(self.query))
            else:
                params = {}
                for (key, value) in parse.parse_qsl(self.query):
                    if key in params:
                        if isinstance(params[key], list):
                            params[key].append(value)
                        else:
                            params[key] = [params[key], value]
                    else:
                        params[key] = value
                return params
        else:
            return {}


def urlsplit(url, scheme='', allow_fragments=True):
    """Parse a URL using urlparse.urlsplit(), splitting query and fragments.
    This function papers over Python issue9374_ when needed.

    .. _issue9374: http://bugs.python.org/issue9374

    The parameters are the same as urlparse.urlsplit.
    """
    scheme, netloc, path, query, fragment = parse.urlsplit(
        url, scheme, allow_fragments)
    if allow_fragments and '#' in path:
        path, fragment = path.split('#', 1)
    if '?' in path:
        path, query = path.split('?', 1)
    return _ModifiedSplitResult(scheme, netloc,
                                path, query, fragment)


def set_tcp_keepalive(sock, tcp_keepalive=True,
                      tcp_keepidle=None,
                      tcp_keepalive_interval=None,
                      tcp_keepalive_count=None):
    """Set values for tcp keepalive parameters

    This function configures tcp keepalive parameters if users wish to do
    so.

    :param tcp_keepalive: Boolean, turn on or off tcp_keepalive. If users are
      not sure, this should be True, and default values will be used.

    :param tcp_keepidle: time to wait before starting to send keepalive probes
    :param tcp_keepalive_interval: time between successive probes, once the
      initial wait time is over
    :param tcp_keepalive_count: number of probes to send before the connection
      is killed
    """

    # NOTE(praneshp): Despite keepalive being a tcp concept, the level is
    # still SOL_SOCKET. This is a quirk.
    if isinstance(tcp_keepalive, bool):
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, tcp_keepalive)
    else:
        raise TypeError("tcp_keepalive must be a boolean")

    if not tcp_keepalive:
        return

    # These options aren't available in the OS X version of eventlet.
    # Idle + Count * Interval effectively gives you the total timeout.
    if tcp_keepidle is not None:
        if hasattr(socket, 'TCP_KEEPIDLE'):
            sock.setsockopt(socket.IPPROTO_TCP,
                            socket.TCP_KEEPIDLE,
                            tcp_keepidle)
        else:
            LOG.warning('tcp_keepidle not available on your system')
    if tcp_keepalive_interval is not None:
        if hasattr(socket, 'TCP_KEEPINTVL'):
            sock.setsockopt(socket.IPPROTO_TCP,
                            socket.TCP_KEEPINTVL,
                            tcp_keepalive_interval)
        else:
            LOG.warning('tcp_keepintvl not available on your system')
    if tcp_keepalive_count is not None:
        if hasattr(socket, 'TCP_KEEPCNT'):
            sock.setsockopt(socket.IPPROTO_TCP,
                            socket.TCP_KEEPCNT,
                            tcp_keepalive_count)
        else:
            LOG.warning('tcp_keepcnt not available on your system')
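The EUI-64 derivation that `get_ipv6_addr_by_EUI64` delegates to netaddr can be sketched with only the standard library: splice `ff:fe` into the middle of the 48-bit MAC, flip the universal/local bit of the first octet, and OR the resulting interface ID into the prefix. The helper below (`eui64_address`) is a hypothetical standalone illustration, not part of oslo.utils:

```python
import ipaddress

def eui64_address(prefix, mac):
    """Sketch of RFC 2373 EUI-64 IPv6 address generation (stdlib only)."""
    octets = [int(b, 16) for b in mac.split(":")]
    octets[0] ^= 0x02                             # flip the universal/local bit
    eui = octets[:3] + [0xFF, 0xFE] + octets[3:]  # splice ff:fe into the middle
    iface_id = int.from_bytes(bytes(eui), "big")
    net = ipaddress.IPv6Network(prefix)
    return str(ipaddress.IPv6Address(int(net.network_address) | iface_id))

print(eui64_address("2001:db8::/64", "00:16:3e:33:44:55"))
# 2001:db8::216:3eff:fe33:4455
```

The `(1 << 57)` XOR in the oslo code above performs the same universal/local bit flip, just on the already-assembled 128-bit integer.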
@@ -1,221 +0,0 @@
# -*- coding: utf-8 -*-

#    Copyright (C) 2012-2013 Yahoo! Inc. All Rights Reserved.
#
#    Licensed under the Apache License, Version 2.0 (the "License"); you may
#    not use this file except in compliance with the License. You may obtain
#    a copy of the License at
#
#         http://www.apache.org/licenses/LICENSE-2.0
#
#    Unless required by applicable law or agreed to in writing, software
#    distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
#    WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
#    License for the specific language governing permissions and limitations
#    under the License.

"""
Reflection module.

.. versionadded:: 1.1
"""

import inspect
import types

import six

try:
    _TYPE_TYPE = types.TypeType
except AttributeError:
    _TYPE_TYPE = type

# See: https://docs.python.org/2/library/__builtin__.html#module-__builtin__
# and see https://docs.python.org/2/reference/executionmodel.html (and likely
# others)...
_BUILTIN_MODULES = ('builtins', '__builtin__', '__builtins__', 'exceptions')

if six.PY3:
    Parameter = inspect.Parameter
    Signature = inspect.Signature
    get_signature = inspect.signature
else:
    # Provide an equivalent but use funcsigs instead...
    import funcsigs
    Parameter = funcsigs.Parameter
    Signature = funcsigs.Signature
    get_signature = funcsigs.signature


def get_members(obj, exclude_hidden=True):
    """Yields the members of an object, filtering by hidden/not hidden.

    .. versionadded:: 2.3
    """
    for (name, value) in inspect.getmembers(obj):
        if name.startswith("_") and exclude_hidden:
            continue
        yield (name, value)


def get_member_names(obj, exclude_hidden=True):
    """Get all the member names for an object."""
    return [name for (name, _obj) in
            get_members(obj, exclude_hidden=exclude_hidden)]


def get_class_name(obj, fully_qualified=True, truncate_builtins=True):
    """Get class name for object.

    If object is a type, returns name of the type. If object is a bound
    method or a class method, returns its ``self`` object's class name.
    If object is an instance of class, returns instance's class name.
    Else, name of the type of the object is returned. If fully_qualified
    is True, returns fully qualified name of the type. For builtin types,
    just the name is returned. TypeError is raised if the class name
    cannot be determined from the object.
    """
    if inspect.isfunction(obj):
        raise TypeError("Can't get class name.")

    if inspect.ismethod(obj):
        obj = get_method_self(obj)
    if not isinstance(obj, six.class_types):
        obj = type(obj)
    if truncate_builtins:
        try:
            built_in = obj.__module__ in _BUILTIN_MODULES
        except AttributeError:  # nosec
            pass
        else:
            if built_in:
                return obj.__name__
    if fully_qualified and hasattr(obj, '__module__'):
        return '%s.%s' % (obj.__module__, obj.__name__)
    else:
        return obj.__name__


def get_all_class_names(obj, up_to=object,
                        fully_qualified=True, truncate_builtins=True):
    """Get class names of object parent classes.

    Iterate over all class names object is instance or subclass of,
    in order of method resolution (mro). If the up_to parameter is
    provided, only names of classes that are subclasses of that class
    are returned.
    """
    if not isinstance(obj, six.class_types):
        obj = type(obj)
    for cls in obj.mro():
        if issubclass(cls, up_to):
            yield get_class_name(cls,
                                 fully_qualified=fully_qualified,
                                 truncate_builtins=truncate_builtins)


def get_callable_name(function):
    """Generate a name from callable.

    Tries to do the best to guess the fully qualified callable name.
    """
    method_self = get_method_self(function)
    if method_self is not None:
        # This is a bound method.
        if isinstance(method_self, six.class_types):
            # This is a bound class method.
            im_class = method_self
        else:
            im_class = type(method_self)
        try:
            parts = (im_class.__module__, function.__qualname__)
        except AttributeError:
            parts = (im_class.__module__, im_class.__name__, function.__name__)
    elif inspect.ismethod(function) or inspect.isfunction(function):
        # This could be a function, a static method, an unbound method...
        try:
            parts = (function.__module__, function.__qualname__)
        except AttributeError:
            if hasattr(function, 'im_class'):
                # This is an unbound method, which exists only in Python 2.x
                im_class = function.im_class
                parts = (im_class.__module__,
                         im_class.__name__, function.__name__)
            else:
                parts = (function.__module__, function.__name__)
    else:
        im_class = type(function)
        if im_class is _TYPE_TYPE:
            im_class = function
        try:
            parts = (im_class.__module__, im_class.__qualname__)
        except AttributeError:
            parts = (im_class.__module__, im_class.__name__)
    return '.'.join(parts)


def get_method_self(method):
    """Gets the ``self`` object attached to this method (or none)."""
    if not inspect.ismethod(method):
        return None
    try:
        return six.get_method_self(method)
    except AttributeError:
        return None


def is_same_callback(callback1, callback2, strict=True):
    """Returns if the two callbacks are the same."""
    if callback1 is callback2:
        # This happens when plain methods are given (or static/non-bound
        # methods).
        return True
    if callback1 == callback2:
        if not strict:
            return True
        # Two bound methods are equal if the functions themselves are equal
        # and the objects they are applied to are equal. This means that a
        # bound method could be the same bound method on another object if
        # the objects have __eq__ methods that return true (when in fact it
        # is a different bound method). Python u so crazy!
        try:
            self1 = six.get_method_self(callback1)
            self2 = six.get_method_self(callback2)
            return self1 is self2
        except AttributeError:  # nosec
            pass
    return False


def is_bound_method(method):
    """Returns if the given method is bound to an object."""
    return get_method_self(method) is not None


def is_subclass(obj, cls):
    """Returns if the object is a class and a subclass of the given class."""
    return inspect.isclass(obj) and issubclass(obj, cls)


def get_callable_args(function, required_only=False):
    """Get names of callable arguments.

    Special arguments (like ``*args`` and ``**kwargs``) are not included
    in the output.

    If required_only is True, optional arguments (with default values)
    are not included in the output.
    """
    sig = get_signature(function)
    function_args = list(six.iterkeys(sig.parameters))
    for param_name, p in six.iteritems(sig.parameters):
        if (p.kind in (Parameter.VAR_POSITIONAL, Parameter.VAR_KEYWORD)
                or (required_only and p.default is not Parameter.empty)):
            function_args.remove(param_name)
    return function_args


def accepts_kwargs(function):
    """Returns ``True`` if function accepts kwargs otherwise ``False``."""
    sig = get_signature(function)
    return any(p.kind == Parameter.VAR_KEYWORD
               for p in six.itervalues(sig.parameters))
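On Python 3, most of the branching in `get_callable_name` collapses into `__qualname__`, which already encodes the defining class for bound methods. A minimal sketch of that fast path (the `callable_name` helper and `Greeter` class are hypothetical, not part of the oslo API):

```python
import inspect

class Greeter(object):
    def hello(self):
        return "hi"

def callable_name(fn):
    """Best-effort fully qualified name, Python 3 only."""
    if inspect.ismethod(fn):
        # For bound methods, take the module from the self object's class.
        return "%s.%s" % (type(fn.__self__).__module__, fn.__qualname__)
    return "%s.%s" % (fn.__module__, fn.__qualname__)

print(callable_name(Greeter().hello))  # ends with "Greeter.hello"
print(callable_name(callable_name))    # ends with "callable_name"
```

The `funcsigs`/`im_class` machinery above exists only to give Python 2 the same behaviour.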
@@ -1,47 +0,0 @@
# All Rights Reserved.
#
#    Licensed under the Apache License, Version 2.0 (the "License"); you may
#    not use this file except in compliance with the License. You may obtain
#    a copy of the License at
#
#         http://www.apache.org/licenses/LICENSE-2.0
#
#    Unless required by applicable law or agreed to in writing, software
#    distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
#    WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
#    License for the specific language governing permissions and limitations
#    under the License.

"""
Secret utilities.

.. versionadded:: 3.5
"""

import hmac

import six


try:
    constant_time_compare = hmac.compare_digest
except AttributeError:
    def constant_time_compare(first, second):
        """Returns True if both string inputs are equal, otherwise False.

        This function should take a constant amount of time regardless of
        how many characters in the strings match. This function uses an
        approach designed to prevent timing analysis by avoiding
        content-based short circuiting behaviour, making it appropriate
        for cryptography.
        """
        if isinstance(first, six.string_types):
            first = first.encode('utf-8')
        if isinstance(second, six.string_types):
            second = second.encode('utf-8')
        if len(first) != len(second):
            return False
        result = 0
        for x, y in zip(first, second):
            result |= ord(x) ^ ord(y)
        return result == 0
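On any Python with `hmac.compare_digest` (2.7.7+ and all of Python 3), the `try` branch wins and `constant_time_compare` is simply an alias for it; the hand-rolled XOR loop is only a fallback. A quick demonstration of the aliased behaviour:

```python
import hmac

# Equal inputs compare True; any difference compares False. Unlike ==,
# the comparison time does not depend on how many leading bytes match,
# which defeats timing attacks against token/signature checks.
assert hmac.compare_digest(b"s3cret-token", b"s3cret-token")
assert not hmac.compare_digest(b"s3cret-token", b"s3cret-tokeX")
assert not hmac.compare_digest(b"short", b"longer-input")
print("constant-time comparisons ok")
```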
@@ -1,103 +0,0 @@
# Copyright (c) 2011 OpenStack Foundation
# All Rights Reserved.
#
#    Licensed under the Apache License, Version 2.0 (the "License"); you may
#    not use this file except in compliance with the License. You may obtain
#    a copy of the License at
#
#         http://www.apache.org/licenses/LICENSE-2.0
#
#    Unless required by applicable law or agreed to in writing, software
#    distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
#    WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
#    License for the specific language governing permissions and limitations
#    under the License.

import ast
import operator

import pyparsing
from pyparsing import Literal
from pyparsing import OneOrMore
from pyparsing import Regex


def _all_in(x, *y):
    x = ast.literal_eval(x)
    if not isinstance(x, list):
        raise TypeError("<all-in> must compare with a list literal"
                        " string, EG \"%s\"" % (['aes', 'mmx'],))
    return all(val in x for val in y)


op_methods = {
    # This one is special/odd,
    # TODO(harlowja): fix it so that it's not greater than or
    # equal, see here for the original @ https://review.openstack.org/#/c/8089/
    '=': lambda x, y: float(x) >= float(y),
    # More sane ops/methods
    '!=': lambda x, y: float(x) != float(y),
    '<=': lambda x, y: float(x) <= float(y),
    '<': lambda x, y: float(x) < float(y),
    '==': lambda x, y: float(x) == float(y),
    '>=': lambda x, y: float(x) >= float(y),
    '>': lambda x, y: float(x) > float(y),
    's!=': operator.ne,
    's<': operator.lt,
    's<=': operator.le,
    's==': operator.eq,
    's>': operator.gt,
    's>=': operator.ge,
    '<all-in>': _all_in,
    '<in>': lambda x, y: y in x,
    '<or>': lambda x, *y: any(x == a for a in y),
}


def make_grammar():
    """Creates the grammar to be used by a spec matcher."""
    # This is apparently how pyparsing recommends to be used,
    # as http://pyparsing.wikispaces.com/share/view/644825 states that
    # it is not thread-safe to use a parser across threads.

    unary_ops = (
        # Order matters here (so that '=' doesn't match before '==')
        Literal("==") | Literal("=") |
        Literal("!=") | Literal("<in>") |
        Literal(">=") | Literal("<=") |
        Literal(">") | Literal("<") |
        Literal("s==") | Literal("s!=") |
        # Order matters here (so that '<' doesn't match before '<=')
        Literal("s<=") | Literal("s<") |
        # Order matters here (so that '>' doesn't match before '>=')
        Literal("s>=") | Literal("s>"))

    all_in_nary_op = Literal("<all-in>")
    or_ = Literal("<or>")

    # An atom is anything that is not a keyword, followed by anything
    # but whitespace
    atom = ~(unary_ops | all_in_nary_op | or_) + Regex(r"\S+")

    unary = unary_ops + atom
    nary = all_in_nary_op + OneOrMore(atom)
    disjunction = OneOrMore(or_ + atom)

    # Even-numbered tokens will be '<or>', so we drop them
    disjunction.setParseAction(lambda _s, _l, t: ["<or>"] + t[1::2])

    expr = disjunction | nary | unary | atom
    return expr


def match(cmp_value, spec):
    """Match a given value to a given spec DSL."""
    expr = make_grammar()
    try:
        tree = expr.parseString(spec)
    except pyparsing.ParseException:
        tree = [spec]
    if len(tree) == 1:
        return tree[0] == cmp_value

    op = op_methods[tree[0]]
    return op(cmp_value, *tree[1:])
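The DSL this grammar accepts ("bare atom", "operator operand", or "<or> a <or> b") can be illustrated with a stripped-down, pyparsing-free evaluator. `mini_match` and `mini_ops` below are a hypothetical sketch covering only whitespace-separated specs and a handful of the operators, not the real parser:

```python
# Minimal sketch of the spec-matching DSL: same operator-table idea as
# op_methods above, but tokenizing by whitespace instead of pyparsing.
mini_ops = {
    '>=': lambda x, y: float(x) >= float(y),
    '<in>': lambda x, y: y in x,
    's==': lambda x, y: x == y,
    '<or>': lambda x, *y: any(x == a for a in y),
}

def mini_match(cmp_value, spec):
    tokens = spec.split()
    if len(tokens) == 1:
        # A bare atom means exact string equality.
        return tokens[0] == cmp_value
    op = mini_ops[tokens[0]]
    # For disjunctions, drop the repeated '<or>' separators, mirroring
    # the parse action that keeps only odd-numbered tokens.
    return op(cmp_value, *[t for t in tokens[1:] if t != '<or>'])

print(mini_match("4", ">= 3"))              # True (numeric comparison)
print(mini_match("x86_64", "s== x86_64"))   # True (string comparison)
print(mini_match("b", "<or> a <or> b"))     # True (disjunction)
```

This is how Nova-style extra_specs like `">= 2"` against a flavor value are evaluated by the real `match()`.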
@@ -1,504 +0,0 @@
# Copyright 2011 OpenStack Foundation.
# All Rights Reserved.
#
#    Licensed under the Apache License, Version 2.0 (the "License"); you may
#    not use this file except in compliance with the License. You may obtain
#    a copy of the License at
#
#         http://www.apache.org/licenses/LICENSE-2.0
#
#    Unless required by applicable law or agreed to in writing, software
#    distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
#    WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
#    License for the specific language governing permissions and limitations
#    under the License.

"""
System-level utilities and helper functions.
"""

import copy
import math
import re
import unicodedata

import pyparsing as pp
import six
from six.moves import urllib

from oslo_utils._i18n import _
from oslo_utils import encodeutils


UNIT_PREFIX_EXPONENT = {
    'k': 1,
    'K': 1,
    'Ki': 1,
    'M': 2,
    'Mi': 2,
    'G': 3,
    'Gi': 3,
    'T': 4,
    'Ti': 4,
}
UNIT_SYSTEM_INFO = {
    'IEC': (1024, re.compile(r'(^[-+]?\d*\.?\d+)([KMGT]i?)?(b|bit|B)$')),
    'SI': (1000, re.compile(r'(^[-+]?\d*\.?\d+)([kMGT])?(b|bit|B)$')),
}

TRUE_STRINGS = ('1', 't', 'true', 'on', 'y', 'yes')
FALSE_STRINGS = ('0', 'f', 'false', 'off', 'n', 'no')

SLUGIFY_STRIP_RE = re.compile(r"[^\w\s-]")
SLUGIFY_HYPHENATE_RE = re.compile(r"[-\s]+")


# NOTE(flaper87): The following globals are used by `mask_password`
_SANITIZE_KEYS = ['adminPass', 'admin_pass', 'password', 'admin_password',
                  'auth_token', 'new_pass', 'auth_password', 'secret_uuid',
                  'secret', 'sys_pswd', 'token', 'configdrive',
                  'CHAPPASSWORD', 'encrypted_key']

# NOTE(ldbragst): Let's build a list of regex objects using the list of
# _SANITIZE_KEYS we already have. This way, we only have to add the new key
# to the list of _SANITIZE_KEYS and we can generate regular expressions
# for XML and JSON automatically.
_SANITIZE_PATTERNS_2 = {}
_SANITIZE_PATTERNS_1 = {}

# NOTE(amrith): Some regular expressions have only one parameter, some
# have two parameters. Use different lists of patterns here.
_FORMAT_PATTERNS_1 = [r'(%(key)s\s*[=]\s*)[^\s^\'^\"]+']
_FORMAT_PATTERNS_2 = [r'(%(key)s\s*[=]\s*[\"\'])[^\"\']*([\"\'])',
                      r'(%(key)s\s+[\"\'])[^\"\']*([\"\'])',
                      r'([-]{2}%(key)s\s+)[^\'^\"^=^\s]+([\s]*)',
                      r'(<%(key)s>)[^<]*(</%(key)s>)',
                      r'([\"\']%(key)s[\"\']\s*:\s*[\"\'])[^\"\']*([\"\'])',
                      r'([\'"][^"\']*%(key)s[\'"]\s*:\s*u?[\'"])[^\"\']*'
                      '([\'"])',
                      r'([\'"][^\'"]*%(key)s[\'"]\s*,\s*\'--?[A-z]+\'\s*,\s*u?'
                      '[\'"])[^\"\']*([\'"])',
                      r'(%(key)s\s*--?[A-z]+\s*)\S+(\s*)']

# NOTE(dhellmann): Keep a separate list of patterns by key so we only
# need to apply the substitutions for keys we find using a quick "in"
# test.
for key in _SANITIZE_KEYS:
    _SANITIZE_PATTERNS_1[key] = []
    _SANITIZE_PATTERNS_2[key] = []

    for pattern in _FORMAT_PATTERNS_2:
        reg_ex = re.compile(pattern % {'key': key}, re.DOTALL)
        _SANITIZE_PATTERNS_2[key].append(reg_ex)

    for pattern in _FORMAT_PATTERNS_1:
        reg_ex = re.compile(pattern % {'key': key}, re.DOTALL)
        _SANITIZE_PATTERNS_1[key].append(reg_ex)


def int_from_bool_as_string(subject):
    """Interpret a string as a boolean and return either 1 or 0.

    Any string value in:

        ('True', 'true', 'On', 'on', '1')

    is interpreted as a boolean True.

    Useful for JSON-decoded stuff and config file parsing
    """
    return int(bool_from_string(subject))


def bool_from_string(subject, strict=False, default=False):
    """Interpret a subject as a boolean.

    A subject can be a boolean, a string or an integer. Boolean type value
    will be returned directly, otherwise the subject will be converted to
    a string. A case-insensitive match is performed such that strings
    matching 't','true', 'on', 'y', 'yes', or '1' are considered True and,
    when `strict=False`, anything else returns the value specified by
    'default'.

    Useful for JSON-decoded stuff and config file parsing.
|
||||
|
||||
If `strict=True`, unrecognized values, including None, will raise a
|
||||
ValueError which is useful when parsing values passed in from an API call.
|
||||
Strings yielding False are 'f', 'false', 'off', 'n', 'no', or '0'.
|
||||
"""
|
||||
if isinstance(subject, bool):
|
||||
return subject
|
||||
if not isinstance(subject, six.string_types):
|
||||
subject = six.text_type(subject)
|
||||
|
||||
lowered = subject.strip().lower()
|
||||
|
||||
if lowered in TRUE_STRINGS:
|
||||
return True
|
||||
elif lowered in FALSE_STRINGS:
|
||||
return False
|
||||
elif strict:
|
||||
acceptable = ', '.join(
|
||||
"'%s'" % s for s in sorted(TRUE_STRINGS + FALSE_STRINGS))
|
||||
msg = _("Unrecognized value '%(val)s', acceptable values are:"
|
||||
" %(acceptable)s") % {'val': subject,
|
||||
'acceptable': acceptable}
|
||||
raise ValueError(msg)
|
||||
else:
|
||||
return default
|
||||
|
||||
|
||||
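For reference, the `bool_from_string()` logic above can be sketched standalone (assuming Python 3 and dropping the `six` and `_()` translation dependencies; the two tuples are copied verbatim from the module constants):

```python
TRUE_STRINGS = ('1', 't', 'true', 'on', 'y', 'yes')
FALSE_STRINGS = ('0', 'f', 'false', 'off', 'n', 'no')


def bool_from_string(subject, strict=False, default=False):
    # Booleans pass through untouched; everything else is stringified.
    if isinstance(subject, bool):
        return subject
    lowered = str(subject).strip().lower()
    if lowered in TRUE_STRINGS:
        return True
    if lowered in FALSE_STRINGS:
        return False
    if strict:
        acceptable = ', '.join(
            "'%s'" % s for s in sorted(TRUE_STRINGS + FALSE_STRINGS))
        raise ValueError("Unrecognized value '%s', acceptable values are: %s"
                         % (subject, acceptable))
    return default


print(bool_from_string('YES'))    # case-insensitive -> True
print(bool_from_string('maybe'))  # unrecognized, non-strict -> False
```

With `strict=True` the same `'maybe'` input raises `ValueError` instead of silently returning the default.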
def is_valid_boolstr(value):
    """Check if the provided string is a valid bool string or not.

    :param value: value to verify
    :type value: string
    :returns: true if value is boolean string, false otherwise

    .. versionadded:: 3.17
    """
    boolstrs = TRUE_STRINGS + FALSE_STRINGS
    return str(value).lower() in boolstrs


def string_to_bytes(text, unit_system='IEC', return_int=False):
    """Converts a string into a float representation of bytes.

    The units supported for IEC ::

        Kb(it), Kib(it), Mb(it), Mib(it), Gb(it), Gib(it), Tb(it), Tib(it)
        KB, KiB, MB, MiB, GB, GiB, TB, TiB

    The units supported for SI ::

        kb(it), Mb(it), Gb(it), Tb(it)
        kB, MB, GB, TB

    Note that the SI unit system does not support capital letter 'K'

    :param text: String input for bytes size conversion.
    :param unit_system: Unit system for byte size conversion.
    :param return_int: If True, returns integer representation of text
                       in bytes. (default: decimal)
    :returns: Numerical representation of text in bytes.
    :raises ValueError: If text has an invalid value.

    """
    try:
        base, reg_ex = UNIT_SYSTEM_INFO[unit_system]
    except KeyError:
        msg = _('Invalid unit system: "%s"') % unit_system
        raise ValueError(msg)
    match = reg_ex.match(text)
    if match:
        magnitude = float(match.group(1))
        unit_prefix = match.group(2)
        if match.group(3) in ['b', 'bit']:
            magnitude /= 8
    else:
        msg = _('Invalid string format: %s') % text
        raise ValueError(msg)
    if not unit_prefix:
        res = magnitude
    else:
        res = magnitude * pow(base, UNIT_PREFIX_EXPONENT[unit_prefix])
    if return_int:
        return int(math.ceil(res))
    return res

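A minimal standalone sketch of `string_to_bytes()` above, covering the IEC system only and reusing the module's regex and exponent table verbatim:

```python
import math
import re

UNIT_PREFIX_EXPONENT = {'k': 1, 'K': 1, 'Ki': 1, 'M': 2, 'Mi': 2,
                        'G': 3, 'Gi': 3, 'T': 4, 'Ti': 4}
IEC_RE = re.compile(r'(^[-+]?\d*\.?\d+)([KMGT]i?)?(b|bit|B)$')


def string_to_bytes(text, return_int=False):
    match = IEC_RE.match(text)
    if not match:
        raise ValueError('Invalid string format: %s' % text)
    magnitude = float(match.group(1))
    if match.group(3) in ('b', 'bit'):
        magnitude /= 8          # bit units are converted to bytes
    if match.group(2):
        magnitude *= 1024 ** UNIT_PREFIX_EXPONENT[match.group(2)]
    return int(math.ceil(magnitude)) if return_int else magnitude


print(string_to_bytes('1KiB'))  # 1024.0
print(string_to_bytes('8Kb'))   # 1024.0 (8 kilobits -> 1024 bytes)
```

Note the order of operations mirrors the original: the bit-to-byte division happens before the prefix multiplication, which gives the same result either way.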
def to_slug(value, incoming=None, errors="strict"):
    """Normalize string.

    Convert to lowercase, remove non-word characters, and convert spaces
    to hyphens.

    Inspired by Django's `slugify` filter.

    :param value: Text to slugify
    :param incoming: Text's current encoding
    :param errors: Errors handling policy. See here for valid
        values http://docs.python.org/2/library/codecs.html
    :returns: slugified unicode representation of `value`
    :raises TypeError: If text is not an instance of str
    """
    value = encodeutils.safe_decode(value, incoming, errors)
    # NOTE(aababilov): no need to use safe_(encode|decode) here:
    # encodings are always "ascii", error handling is always "ignore"
    # and types are always known (first: unicode; second: str)
    value = unicodedata.normalize("NFKD", value).encode(
        "ascii", "ignore").decode("ascii")
    value = SLUGIFY_STRIP_RE.sub("", value).strip().lower()
    return SLUGIFY_HYPHENATE_RE.sub("-", value)


# NOTE(dhellmann): Before submitting a patch to add a new argument to
# this function to allow the caller to pass in "extra" or "additional"
# or "replacement" patterns to be masked out, please note that we have
# discussed that feature many times and always rejected it based on
# the desire to have Oslo functions behave consistently across all
# projects and *especially* to have security features work the same
# way no matter where they are used. If every project adopted its own
# set of patterns for secret values, it would be very difficult to audit
# the logging to ensure that everything is properly masked. So, please
# either add your pattern to the module-level variables at the top of
# this file or, even better, pick an existing pattern or key to use in
# your application to ensure that the value is masked by this
# function.
def mask_password(message, secret="***"):  # nosec
    """Replace password with *secret* in message.

    :param message: The string which includes security information.
    :param secret: value with which to replace passwords.
    :returns: The unicode value of message with the password fields masked.

    For example:

    >>> mask_password("'adminPass' : 'aaaaa'")
    "'adminPass' : '***'"
    >>> mask_password("'admin_pass' : 'aaaaa'")
    "'admin_pass' : '***'"
    >>> mask_password('"password" : "aaaaa"')
    '"password" : "***"'
    >>> mask_password("'original_password' : 'aaaaa'")
    "'original_password' : '***'"
    >>> mask_password("u'original_password' : u'aaaaa'")
    "u'original_password' : u'***'"

    .. versionadded:: 0.2

    .. versionchanged:: 1.1
       Replace also ``'auth_token'``, ``'new_pass'`` and ``'auth_password'``
       keys.

    .. versionchanged:: 1.1.1
       Replace also ``'secret_uuid'`` key.

    .. versionchanged:: 1.5
       Replace also ``'sys_pswd'`` key.

    .. versionchanged:: 2.6
       Replace also ``'token'`` key.

    .. versionchanged:: 2.7
       Replace also ``'secret'`` key.

    .. versionchanged:: 3.4
       Replace also ``'configdrive'`` key.

    .. versionchanged:: 3.8
       Replace also ``'CHAPPASSWORD'`` key.
    """

    try:
        message = six.text_type(message)
    except UnicodeDecodeError:  # nosec
        # NOTE(jecarey): Temporary fix to handle cases where message is a
        # byte string. A better solution will be provided in Kilo.
        pass

    substitute1 = r'\g<1>' + secret
    substitute2 = r'\g<1>' + secret + r'\g<2>'

    # NOTE(ldbragst): Check to see if anything in message contains any key
    # specified in _SANITIZE_KEYS, if not then just return the message since
    # we don't have to mask any passwords.
    for key in _SANITIZE_KEYS:
        if key in message:
            for pattern in _SANITIZE_PATTERNS_2[key]:
                message = re.sub(pattern, substitute2, message)
            for pattern in _SANITIZE_PATTERNS_1[key]:
                message = re.sub(pattern, substitute1, message)

    return message

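To see the substitution mechanics in isolation, here is one representative pattern from `_FORMAT_PATTERNS_2` above, expanded for the `password` key and applied the way `mask_password()` applies it: groups 1 and 2 capture the text surrounding the secret, and the replacement keeps them while swapping the secret value itself:

```python
import re

# The '([\"\']%(key)s[\"\']\s*:\s*[\"\'])[^\"\']*([\"\'])' pattern with
# %(key)s expanded to 'password'.
pattern = re.compile(
    r'([\"\']password[\"\']\s*:\s*[\"\'])[^\"\']*([\"\'])', re.DOTALL)

# substitute2 from the function body: keep group 1, insert the secret,
# keep group 2 (the closing quote).
masked = pattern.sub(r'\g<1>' + '***' + r'\g<2>', '"password" : "aaaaa"')
print(masked)  # "password" : "***"
```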
def mask_dict_password(dictionary, secret="***"):  # nosec
    """Replace password with *secret* in a dictionary recursively.

    :param dictionary: The dictionary which includes secret information.
    :param secret: value with which to replace secret information.
    :returns: The dictionary with string substitutions.

    A dictionary (which may contain nested dictionaries) contains
    information (such as passwords) which should not be revealed, and
    this function helps detect and replace those with the 'secret'
    provided (or `***` if none is provided).

    Substitution is performed in one of three situations:

    If the key is something that is considered to be indicative of a
    secret, then the corresponding value is replaced with the secret
    provided (or `***` if none is provided).

    If a value in the dictionary is a string, then it is masked
    using the ``mask_password()`` function.

    Finally, if a value is a dictionary, this function will
    recursively mask that dictionary as well.

    For example:

    >>> mask_dict_password({'password': 'd81juxmEW_',
    >>>                     'user': 'admin',
    >>>                     'home-dir': '/home/admin'},
    >>>                    '???')
    {'password': '???', 'user': 'admin', 'home-dir': '/home/admin'}

    For example (the value is masked using mask_password())

    >>> mask_dict_password({'password': '--password d81juxmEW_',
    >>>                     'user': 'admin',
    >>>                     'home-dir': '/home/admin'},
    >>>                    '???')
    {'password': '--password ???', 'user': 'admin',
     'home-dir': '/home/admin'}

    For example (a nested dictionary is masked):

    >>> mask_dict_password({"nested": {'password': 'd81juxmEW_',
    >>>                                'user': 'admin',
    >>>                                'home': '/home/admin'}},
    >>>                    '???')
    {"nested": {'password': '???', 'user': 'admin', 'home': '/home/admin'}}

    .. versionadded:: 3.4

    """

    if not isinstance(dictionary, dict):
        raise TypeError("Expected a dictionary, got %s instead."
                        % type(dictionary))

    out = copy.deepcopy(dictionary)

    for k, v in dictionary.items():
        if isinstance(v, dict):
            out[k] = mask_dict_password(v, secret=secret)
            continue
        # NOTE(jlvillal): Check to see if anything in the dictionary 'key'
        # contains any key specified in _SANITIZE_KEYS.
        for sani_key in _SANITIZE_KEYS:
            if sani_key in k:
                out[k] = secret
                break
        else:
            # We did not find a match for the key name in the
            # _SANITIZE_KEYS, so we fall through to here
            if isinstance(v, six.string_types):
                out[k] = mask_password(v, secret=secret)
    return out

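The `for`/`else` construct in the loop above is easy to misread: the `else` branch runs only when the loop completes without hitting `break`, i.e. when no sanitize key matched the dictionary key. A minimal illustration of just that dispatch (with `SANITIZE_KEYS` as a shortened stand-in for the module's `_SANITIZE_KEYS`):

```python
SANITIZE_KEYS = ('password', 'token')


def mask_value(key, value, secret='***'):
    out = value
    for sani_key in SANITIZE_KEYS:
        if sani_key in key:
            out = secret  # key looks sensitive: mask the whole value
            break
    else:
        # No break occurred: the key is not sensitive, keep the value.
        # (In mask_dict_password() this is where mask_password() is
        # applied to string values instead.)
        pass
    return out


print(mask_value('user_password', 'd81juxmEW_'))  # ***
print(mask_value('home-dir', '/home/admin'))      # /home/admin
```

Note the substring test (`sani_key in key`), which is why a key like `user_password` is masked even though it is not literally in the list.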
def is_int_like(val):
    """Check if a value looks like an integer with base 10.

    :param val: Value to verify
    :type val: string
    :returns: bool

    .. versionadded:: 1.1
    """
    try:
        return six.text_type(int(val)) == six.text_type(val)
    except (TypeError, ValueError):
        return False


def check_string_length(value, name=None, min_length=0, max_length=None):
    """Check the length of specified string.

    :param value: the value of the string
    :param name: the name of the string
    :param min_length: the min_length of the string
    :param max_length: the max_length of the string
    :raises TypeError, ValueError: For any invalid input.

    .. versionadded:: 3.7
    """
    if name is None:
        name = value

    if not isinstance(value, six.string_types):
        msg = _("%s is not a string or unicode") % name
        raise TypeError(msg)

    length = len(value)
    if length < min_length:
        msg = _("%(name)s has %(length)s characters, less than "
                "%(min_length)s.") % {'name': name, 'length': length,
                                      'min_length': min_length}
        raise ValueError(msg)

    if max_length and length > max_length:
        msg = _("%(name)s has %(length)s characters, more than "
                "%(max_length)s.") % {'name': name, 'length': length,
                                      'max_length': max_length}
        raise ValueError(msg)

def split_path(path, minsegs=1, maxsegs=None, rest_with_last=False):
    """Validate and split the given HTTP request path.

    **Examples**::

        ['a'] = _split_path('/a')
        ['a', None] = _split_path('/a', 1, 2)
        ['a', 'c'] = _split_path('/a/c', 1, 2)
        ['a', 'c', 'o/r'] = _split_path('/a/c/o/r', 1, 3, True)

    :param path: HTTP Request path to be split
    :param minsegs: Minimum number of segments to be extracted
    :param maxsegs: Maximum number of segments to be extracted
    :param rest_with_last: If True, trailing data will be returned as part
                           of last segment. If False, and there is
                           trailing data, raises ValueError.
    :returns: list of segments with a length of maxsegs (non-existent
              segments will return as None)
    :raises: ValueError if given an invalid path

    .. versionadded:: 3.11
    """
    if not maxsegs:
        maxsegs = minsegs
    if minsegs > maxsegs:
        raise ValueError(_('minsegs > maxsegs: %(min)d > %(max)d)') %
                         {'min': minsegs, 'max': maxsegs})
    if rest_with_last:
        segs = path.split('/', maxsegs)
        minsegs += 1
        maxsegs += 1
        count = len(segs)
        if (segs[0] or count < minsegs or count > maxsegs or
                '' in segs[1:minsegs]):
            raise ValueError(_('Invalid path: %s') % urllib.parse.quote(path))
    else:
        minsegs += 1
        maxsegs += 1
        segs = path.split('/', maxsegs)
        count = len(segs)
        if (segs[0] or count < minsegs or count > maxsegs + 1 or
                '' in segs[1:minsegs] or
                (count == maxsegs + 1 and segs[maxsegs])):
            raise ValueError(_('Invalid path: %s') % urllib.parse.quote(path))
    segs = segs[1:maxsegs]
    segs.extend([None] * (maxsegs - 1 - len(segs)))
    return segs

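For reference, `split_path()` above transcribed as standalone Python 3 (plain strings in place of the `_()` translation marker, stdlib `urllib`), exercised against the docstring examples:

```python
from urllib.parse import quote


def split_path(path, minsegs=1, maxsegs=None, rest_with_last=False):
    if not maxsegs:
        maxsegs = minsegs
    if minsegs > maxsegs:
        raise ValueError('minsegs > maxsegs: %d > %d' % (minsegs, maxsegs))
    if rest_with_last:
        # Stop splitting after maxsegs slashes so trailing data stays
        # attached to the last segment.
        segs = path.split('/', maxsegs)
        minsegs += 1
        maxsegs += 1
        count = len(segs)
        if (segs[0] or count < minsegs or count > maxsegs or
                '' in segs[1:minsegs]):
            raise ValueError('Invalid path: %s' % quote(path))
    else:
        minsegs += 1
        maxsegs += 1
        segs = path.split('/', maxsegs)
        count = len(segs)
        if (segs[0] or count < minsegs or count > maxsegs + 1 or
                '' in segs[1:minsegs] or
                (count == maxsegs + 1 and segs[maxsegs])):
            raise ValueError('Invalid path: %s' % quote(path))
    # Drop the empty leading segment and pad to the requested width.
    segs = segs[1:maxsegs]
    segs.extend([None] * (maxsegs - 1 - len(segs)))
    return segs


print(split_path('/a'))                 # ['a']
print(split_path('/a', 1, 2))           # ['a', None]
print(split_path('/a/c/o/r', 1, 3, True))  # ['a', 'c', 'o/r']
```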
def split_by_commas(value):
    """Split values by commas and quotes according to api-wg

    :param value: value to be split

    .. versionadded:: 3.17
    """
    word = (pp.QuotedString(quoteChar='"', escChar='\\')
            | pp.Word(pp.printables, excludeChars='",'))
    grammar = pp.stringStart + pp.delimitedList(word) + pp.stringEnd

    try:
        return list(grammar.parseString(value))
    except pp.ParseException:
        raise ValueError("Invalid value: %s" % value)
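The splitting rule implemented above with pyparsing (commas separate items, double quotes protect embedded commas) can be roughly approximated with the stdlib `csv` module; note this is only a stand-in, since `csv` uses doubled quotes rather than the backslash escapes the pyparsing grammar accepts:

```python
import csv


def split_by_commas(value):
    # csv.reader treats the string as a single one-line record;
    # skipinitialspace drops whitespace after each comma.
    return next(csv.reader([value], skipinitialspace=True))


print(split_by_commas('a,"b,c",d'))  # ['a', 'b,c', 'd']
```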
@@ -1,13 +0,0 @@
# -*- coding: utf-8 -*-

# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
@@ -1,55 +0,0 @@
# -*- coding: utf-8 -*-

# Copyright 2010-2011 OpenStack Foundation
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import os

import fixtures
import testtools

_TRUE_VALUES = ('true', '1', 'yes')

# FIXME(dhellmann) Update this to use oslo.test library


class TestCase(testtools.TestCase):

    """Test case base class for all unit tests."""

    def setUp(self):
        """Run before each test method to initialize test environment."""

        super(TestCase, self).setUp()
        test_timeout = os.environ.get('OS_TEST_TIMEOUT', 0)
        try:
            test_timeout = int(test_timeout)
        except ValueError:
            # If timeout value is invalid do not set a timeout.
            test_timeout = 0
        if test_timeout > 0:
            self.useFixture(fixtures.Timeout(test_timeout, gentle=True))

        self.useFixture(fixtures.NestedTempfile())
        self.useFixture(fixtures.TempHomeDir())

        if os.environ.get('OS_STDOUT_CAPTURE') in _TRUE_VALUES:
            stdout = self.useFixture(fixtures.StringStream('stdout')).stream
            self.useFixture(fixtures.MonkeyPatch('sys.stdout', stdout))
        if os.environ.get('OS_STDERR_CAPTURE') in _TRUE_VALUES:
            stderr = self.useFixture(fixtures.StringStream('stderr')).stream
            self.useFixture(fixtures.MonkeyPatch('sys.stderr', stderr))

        self.log_fixture = self.useFixture(fixtures.FakeLogger())
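The `OS_TEST_TIMEOUT` parsing idiom in `setUp()` above, extracted standalone: a missing or invalid value falls back to 0, which disables the timeout fixture entirely:

```python
import os


def get_test_timeout(environ=os.environ):
    # Mirrors the setUp() logic: environment values are strings, and
    # anything that does not parse as an int means "no timeout".
    try:
        return int(environ.get('OS_TEST_TIMEOUT', 0))
    except ValueError:
        return 0


print(get_test_timeout({'OS_TEST_TIMEOUT': '60'}))    # 60
print(get_test_timeout({'OS_TEST_TIMEOUT': 'oops'}))  # 0 (invalid)
print(get_test_timeout({}))                           # 0 (unset)
```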
@@ -1,28 +0,0 @@
# Copyright 2012 IBM Corp.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.


class FakeDriver():
    def __init__(self, first_arg=True):
        self.first_arg = first_arg


class FakeDriver2():
    def __init__(self, first_arg):
        self.first_arg = first_arg


class FakeDriver3():
    def __init__(self):
        raise ImportError("ImportError occurs in __init__")
@@ -1,28 +0,0 @@
# Copyright 2016, EasyStack, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.


class V2FakeDriver(object):
    def __init__(self, first_arg=True):
        self.first_arg = first_arg


class V2FakeDriver2(object):
    def __init__(self, first_arg):
        self.first_arg = first_arg


class V2FakeDriver3(object):
    def __init__(self):
        raise ImportError("ImportError occurs in __init__")
@@ -1,40 +0,0 @@
# -*- coding: utf-8 -*-

# Copyright (c) 2016 EasyStack Inc.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.


from oslotest import base as test_base

from oslo_utils import dictutils as du


class DictUtilsTestCase(test_base.BaseTestCase):

    def test_flatten_dict_to_keypairs(self):
        data = {'a': 'A', 'b': 'B',
                'nested': {'a': 'A', 'b': 'B'}}
        pairs = list(du.flatten_dict_to_keypairs(data))
        self.assertEqual([('a', 'A'), ('b', 'B'),
                          ('nested:a', 'A'), ('nested:b', 'B')],
                         pairs)

    def test_flatten_dict_to_keypairs_with_separator(self):
        data = {'a': 'A', 'b': 'B',
                'nested': {'a': 'A', 'b': 'B'}}
        pairs = list(du.flatten_dict_to_keypairs(data, separator='.'))
        self.assertEqual([('a', 'A'), ('b', 'B'),
                          ('nested.a', 'A'), ('nested.b', 'B')],
                         pairs)
@@ -1,144 +0,0 @@
# Copyright 2012, Red Hat, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import threading
import warnings

import mock
from oslotest import base as test_base
import six

from oslo_utils import eventletutils


class EventletUtilsTest(test_base.BaseTestCase):
    def setUp(self):
        super(EventletUtilsTest, self).setUp()
        self._old_avail = eventletutils.EVENTLET_AVAILABLE
        eventletutils.EVENTLET_AVAILABLE = True

    def tearDown(self):
        super(EventletUtilsTest, self).tearDown()
        eventletutils.EVENTLET_AVAILABLE = self._old_avail

    @mock.patch("oslo_utils.eventletutils._patcher")
    def test_warning_not_patched(self, mock_patcher):
        mock_patcher.already_patched = True
        mock_patcher.is_monkey_patched.return_value = False
        with warnings.catch_warnings(record=True) as capture:
            warnings.simplefilter("always")
            eventletutils.warn_eventlet_not_patched(['os'])
        self.assertEqual(1, len(capture))
        w = capture[0]
        self.assertEqual(RuntimeWarning, w.category)
        self.assertIn('os', six.text_type(w.message))

    @mock.patch("oslo_utils.eventletutils._patcher")
    def test_warning_not_patched_none_provided(self, mock_patcher):
        mock_patcher.already_patched = True
        mock_patcher.is_monkey_patched.return_value = False
        with warnings.catch_warnings(record=True) as capture:
            warnings.simplefilter("always")
            eventletutils.warn_eventlet_not_patched()
        self.assertEqual(1, len(capture))
        w = capture[0]
        self.assertEqual(RuntimeWarning, w.category)
        for m in eventletutils._ALL_PATCH:
            self.assertIn(m, six.text_type(w.message))

    @mock.patch("oslo_utils.eventletutils._patcher")
    def test_warning_not_patched_all(self, mock_patcher):
        mock_patcher.already_patched = True
        mock_patcher.is_monkey_patched.return_value = False
        with warnings.catch_warnings(record=True) as capture:
            warnings.simplefilter("always")
            eventletutils.warn_eventlet_not_patched(['all'])
        self.assertEqual(1, len(capture))
        w = capture[0]
        self.assertEqual(RuntimeWarning, w.category)
        for m in eventletutils._ALL_PATCH:
            self.assertIn(m, six.text_type(w.message))

    @mock.patch("oslo_utils.eventletutils._patcher")
    def test_no_warning(self, mock_patcher):
        mock_patcher.already_patched = True
        mock_patcher.is_monkey_patched.return_value = True
        with warnings.catch_warnings(record=True) as capture:
            warnings.simplefilter("always")
            eventletutils.warn_eventlet_not_patched(['os'])
        self.assertEqual(0, len(capture))

    @mock.patch("oslo_utils.eventletutils._patcher")
    def test_eventlet_is_patched(self, mock_patcher):
        mock_patcher.is_monkey_patched.return_value = True
        self.assertTrue(eventletutils.is_monkey_patched('os'))
        mock_patcher.is_monkey_patched.return_value = False
        self.assertFalse(eventletutils.is_monkey_patched('os'))

    @mock.patch("oslo_utils.eventletutils._patcher", None)
    def test_eventlet_no_patcher(self):
        self.assertFalse(eventletutils.is_monkey_patched('os'))

    @mock.patch("oslo_utils.eventletutils._patcher")
    def test_partially_patched_warning(self, mock_patcher):
        is_patched = set()
        mock_patcher.already_patched = True
        mock_patcher.is_monkey_patched.side_effect = lambda m: m in is_patched
        with warnings.catch_warnings(record=True) as capture:
            warnings.simplefilter("always")
            eventletutils.warn_eventlet_not_patched(['os'])
        self.assertEqual(1, len(capture))
        is_patched.add('os')
        with warnings.catch_warnings(record=True) as capture:
            warnings.simplefilter("always")
            eventletutils.warn_eventlet_not_patched(['os'])
        self.assertEqual(0, len(capture))
        is_patched.add('thread')
        with warnings.catch_warnings(record=True) as capture:
            warnings.simplefilter("always")
            eventletutils.warn_eventlet_not_patched(['os', 'thread'])
        self.assertEqual(0, len(capture))
        with warnings.catch_warnings(record=True) as capture:
            warnings.simplefilter("always")
            eventletutils.warn_eventlet_not_patched(['all'])
        self.assertEqual(1, len(capture))
        w = capture[0]
        self.assertEqual(RuntimeWarning, w.category)
        for m in ['os', 'thread']:
            self.assertNotIn(m, six.text_type(w.message))

    def test_invalid_patch_check(self):
        self.assertRaises(ValueError,
                          eventletutils.warn_eventlet_not_patched,
                          ['blah.blah'])

    @mock.patch('oslo_utils.eventletutils._Event.clear')
    def test_event_api_compat(self, mock_clear):
        with mock.patch('oslo_utils.eventletutils.is_monkey_patched',
                        return_value=True):
            e_event = eventletutils.Event()
        self.assertIsInstance(e_event, eventletutils._Event)

        t_event = eventletutils.Event()
        if six.PY3:
            t_event_cls = threading.Event
        else:
            t_event_cls = threading._Event
        self.assertIsInstance(t_event, t_event_cls)

        public_methods = [m for m in dir(t_event) if not m.startswith("_") and
                          callable(getattr(t_event, m))]

        for method in public_methods:
            self.assertTrue(hasattr(e_event, method))
@@ -1,569 +0,0 @@
# Copyright 2012, Red Hat, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import logging
import time

import mock
from oslotest import base as test_base
from oslotest import moxstubout

from oslo_utils import excutils
from oslo_utils import timeutils


mox = moxstubout.mox


class Fail1(excutils.CausedByException):
    pass


class Fail2(excutils.CausedByException):
    pass


class CausedByTest(test_base.BaseTestCase):

    def test_caused_by_explicit(self):
        e = self.assertRaises(Fail1,
                              excutils.raise_with_cause,
                              Fail1, "I was broken",
                              cause=Fail2("I have been broken"))
        self.assertIsInstance(e.cause, Fail2)
        e_p = e.pformat()
        self.assertIn("I have been broken", e_p)
        self.assertIn("Fail2", e_p)

    def test_caused_by_implicit(self):

        def raises_chained():
            try:
                raise Fail2("I have been broken")
            except Fail2:
                excutils.raise_with_cause(Fail1, "I was broken")

        e = self.assertRaises(Fail1, raises_chained)
        self.assertIsInstance(e.cause, Fail2)
        e_p = e.pformat()
        self.assertIn("I have been broken", e_p)
        self.assertIn("Fail2", e_p)


class SaveAndReraiseTest(test_base.BaseTestCase):

    def test_save_and_reraise_exception_forced(self):

        def _force_reraise():
            try:
                raise IOError("I broke")
            except Exception:
                with excutils.save_and_reraise_exception() as e:
                    e.reraise = False
                    e.force_reraise()

        self.assertRaises(IOError, _force_reraise)

    def test_save_and_reraise_exception_capture_reraise(self):

        def _force_reraise():
            try:
                raise IOError("I broke")
            except Exception:
                excutils.save_and_reraise_exception().capture().force_reraise()

        self.assertRaises(IOError, _force_reraise)

    def test_save_and_reraise_exception_capture_not_active(self):
        e = excutils.save_and_reraise_exception()
        self.assertRaises(RuntimeError, e.capture, check=True)

    def test_save_and_reraise_exception_forced_not_active(self):
        e = excutils.save_and_reraise_exception()
        self.assertRaises(RuntimeError, e.force_reraise)
        e = excutils.save_and_reraise_exception()
        e.capture(check=False)
        self.assertRaises(RuntimeError, e.force_reraise)

    def test_save_and_reraise_exception(self):
        e = None
        msg = 'foo'
        try:
            try:
                raise Exception(msg)
            except Exception:
                with excutils.save_and_reraise_exception():
                    pass
        except Exception as _e:
            e = _e

        self.assertEqual(str(e), msg)

    @mock.patch('logging.getLogger')
    def test_save_and_reraise_exception_dropped(self, get_logger_mock):
        logger = get_logger_mock()
        e = None
        msg = 'second exception'
        try:
            try:
                raise Exception('dropped')
            except Exception:
                with excutils.save_and_reraise_exception():
                    raise Exception(msg)
        except Exception as _e:
            e = _e
        self.assertEqual(str(e), msg)
        self.assertTrue(logger.error.called)

    def test_save_and_reraise_exception_no_reraise(self):
        """Test that suppressing the reraise works."""
        try:
            raise Exception('foo')
        except Exception:
            with excutils.save_and_reraise_exception() as ctxt:
                ctxt.reraise = False

    @mock.patch('logging.getLogger')
    def test_save_and_reraise_exception_dropped_no_reraise(self,
                                                           get_logger_mock):
        logger = get_logger_mock()
        e = None
        msg = 'second exception'
        try:
            try:
                raise Exception('dropped')
            except Exception:
                with excutils.save_and_reraise_exception(reraise=False):
                    raise Exception(msg)
        except Exception as _e:
            e = _e
        self.assertEqual(str(e), msg)
        self.assertFalse(logger.error.called)

    def test_save_and_reraise_exception_provided_logger(self):
        fake_logger = mock.MagicMock()
        try:
            try:
                raise Exception('foo')
            except Exception:
                with excutils.save_and_reraise_exception(logger=fake_logger):
                    raise Exception('second exception')
        except Exception:
            pass
        self.assertTrue(fake_logger.error.called)


class ForeverRetryUncaughtExceptionsTest(test_base.BaseTestCase):

    def setUp(self):
        super(ForeverRetryUncaughtExceptionsTest, self).setUp()
        moxfixture = self.useFixture(moxstubout.MoxStubout())
        self.mox = moxfixture.mox
|
||||
self.stubs = moxfixture.stubs
|
||||
|
||||
@excutils.forever_retry_uncaught_exceptions
|
||||
def exception_generator(self):
|
||||
exc = self.exception_to_raise()
|
||||
while exc is not None:
|
||||
raise exc
|
||||
exc = self.exception_to_raise()
|
||||
|
||||
def exception_to_raise(self):
|
||||
return None
|
||||
|
||||
def my_time_sleep(self, arg):
|
||||
pass
|
||||
|
||||
def exc_retrier_common_start(self):
|
||||
self.stubs.Set(time, 'sleep', self.my_time_sleep)
|
||||
self.mox.StubOutWithMock(logging, 'exception')
|
||||
self.mox.StubOutWithMock(timeutils, 'now',
|
||||
use_mock_anything=True)
|
||||
self.mox.StubOutWithMock(self, 'exception_to_raise')
|
||||
|
||||
def exc_retrier_sequence(self, exc_id=None,
|
||||
exc_count=None, before_timestamp_calls=(),
|
||||
after_timestamp_calls=()):
|
||||
self.exception_to_raise().AndReturn(
|
||||
Exception('unexpected %d' % exc_id))
|
||||
# Timestamp calls that happen before the logging is possibly triggered.
|
||||
for timestamp in before_timestamp_calls:
|
||||
timeutils.now().AndReturn(timestamp)
|
||||
if exc_count != 0:
|
||||
logging.exception(mox.In(
|
||||
'Unexpected exception occurred %d time(s)' % exc_count))
|
||||
# Timestamp calls that happen after the logging is possibly triggered.
|
||||
for timestamp in after_timestamp_calls:
|
||||
timeutils.now().AndReturn(timestamp)
|
||||
|
||||
def exc_retrier_common_end(self):
|
||||
self.exception_to_raise().AndReturn(None)
|
||||
self.mox.ReplayAll()
|
||||
self.exception_generator()
|
||||
self.addCleanup(self.stubs.UnsetAll)
|
||||
|
||||
def test_exc_retrier_1exc_gives_1log(self):
|
||||
self.exc_retrier_common_start()
|
||||
self.exc_retrier_sequence(exc_id=1, exc_count=1,
|
||||
after_timestamp_calls=[0])
|
||||
self.exc_retrier_common_end()
|
||||
|
||||
def test_exc_retrier_same_10exc_1min_gives_1log(self):
|
||||
self.exc_retrier_common_start()
|
||||
self.exc_retrier_sequence(exc_id=1,
|
||||
after_timestamp_calls=[0], exc_count=1)
|
||||
        # By design, the following exceptions don't get logged because they
        # are within the same minute.
        for i in range(2, 11):
            self.exc_retrier_sequence(exc_id=1,
                                      before_timestamp_calls=[i],
                                      exc_count=0)
        self.exc_retrier_common_end()

    def test_exc_retrier_same_2exc_2min_gives_2logs(self):
        self.exc_retrier_common_start()
        self.exc_retrier_sequence(exc_id=1,
                                  after_timestamp_calls=[0], exc_count=1)
        self.exc_retrier_sequence(exc_id=1,
                                  before_timestamp_calls=[65], exc_count=1,
                                  after_timestamp_calls=[65, 66])
        self.exc_retrier_common_end()

    def test_exc_retrier_same_10exc_2min_gives_2logs(self):
        self.exc_retrier_common_start()
        self.exc_retrier_sequence(exc_id=1,
                                  after_timestamp_calls=[0], exc_count=1)
        for ts in [12, 23, 34, 45]:
            self.exc_retrier_sequence(exc_id=1,
                                      before_timestamp_calls=[ts],
                                      exc_count=0)
        # The previous 4 exceptions are counted here
        self.exc_retrier_sequence(exc_id=1,
                                  before_timestamp_calls=[106],
                                  exc_count=5,
                                  after_timestamp_calls=[106, 107])
        # Again, the following are not logged due to being within
        # the same minute
        for ts in [117, 128, 139, 150]:
            self.exc_retrier_sequence(exc_id=1,
                                      before_timestamp_calls=[ts],
                                      exc_count=0)
        self.exc_retrier_common_end()

    def test_exc_retrier_mixed_4exc_1min_gives_2logs(self):
        self.exc_retrier_common_start()
        self.exc_retrier_sequence(exc_id=1,
                                  # The stop watch will be started,
                                  # which will consume one timestamp call.
                                  after_timestamp_calls=[0], exc_count=1)
        # By design, this second 'unexpected 1' exception is not counted. This
        # is likely a rare thing and is a sacrifice for code simplicity.
        self.exc_retrier_sequence(exc_id=1, exc_count=0,
                                  # Since the exception will be the same
                                  # the expiry method will be called, which
                                  # uses up a timestamp call.
                                  before_timestamp_calls=[5])
        self.exc_retrier_sequence(exc_id=2, exc_count=1,
                                  # The watch should get reset, which uses
                                  # up two timestamp calls.
                                  after_timestamp_calls=[10, 20])
        # Again, trailing exceptions within a minute are not counted.
        self.exc_retrier_sequence(exc_id=2, exc_count=0,
                                  # Since the exception will be the same
                                  # the expiry method will be called, which
                                  # uses up a timestamp call.
                                  before_timestamp_calls=[25])
        self.exc_retrier_common_end()

    def test_exc_retrier_mixed_4exc_2min_gives_2logs(self):
        self.exc_retrier_common_start()
        self.exc_retrier_sequence(exc_id=1,
                                  # The stop watch will now be started.
                                  after_timestamp_calls=[0], exc_count=1)
        # Again, this second exception of the same type is not counted
        # for the sake of code simplicity.
        self.exc_retrier_sequence(exc_id=1,
                                  before_timestamp_calls=[10], exc_count=0)
        # The difference between this and the previous case is the log
        # is also triggered by more than a minute expiring.
        self.exc_retrier_sequence(exc_id=2, exc_count=1,
                                  # The stop watch will now be restarted.
                                  after_timestamp_calls=[100, 105])
        self.exc_retrier_sequence(exc_id=2,
                                  before_timestamp_calls=[110], exc_count=0)
        self.exc_retrier_common_end()

    def test_exc_retrier_mixed_4exc_2min_gives_3logs(self):
        self.exc_retrier_common_start()
        self.exc_retrier_sequence(exc_id=1,
                                  # The stop watch will now be started.
                                  after_timestamp_calls=[0], exc_count=1)
        # This time the second 'unexpected 1' exception is counted because
        # the same exception occurs just as the minute expires.
        self.exc_retrier_sequence(exc_id=1,
                                  before_timestamp_calls=[10], exc_count=0)
        self.exc_retrier_sequence(exc_id=1,
                                  before_timestamp_calls=[100],
                                  after_timestamp_calls=[100, 105],
                                  exc_count=2)
        self.exc_retrier_sequence(exc_id=2, exc_count=1,
                                  after_timestamp_calls=[110, 111])
        self.exc_retrier_common_end()


class ExceptionFilterTest(test_base.BaseTestCase):

    def _make_filter_func(self, ignore_classes=AssertionError):
        @excutils.exception_filter
        def ignore_exceptions(ex):
            '''Ignore some exceptions F.'''
            return isinstance(ex, ignore_classes)

        return ignore_exceptions

    def _make_filter_method(self, ignore_classes=AssertionError):
        class ExceptionIgnorer(object):
            def __init__(self, ignore):
                self.ignore = ignore

            @excutils.exception_filter
            def ignore_exceptions(self, ex):
                '''Ignore some exceptions M.'''
                return isinstance(ex, self.ignore)

        return ExceptionIgnorer(ignore_classes).ignore_exceptions

    def _make_filter_classmethod(self, ignore_classes=AssertionError):
        class ExceptionIgnorer(object):
            ignore = ignore_classes

            @excutils.exception_filter
            @classmethod
            def ignore_exceptions(cls, ex):
                '''Ignore some exceptions C.'''
                return isinstance(ex, cls.ignore)

        return ExceptionIgnorer.ignore_exceptions

    def _make_filter_staticmethod(self, ignore_classes=AssertionError):
        class ExceptionIgnorer(object):
            @excutils.exception_filter
            @staticmethod
            def ignore_exceptions(ex):
                '''Ignore some exceptions S.'''
                return isinstance(ex, ignore_classes)

        return ExceptionIgnorer.ignore_exceptions

    def test_filter_func_call(self):
        ignore_assertion_error = self._make_filter_func()

        try:
            assert False, "This is a test"
        except Exception as exc:
            ignore_assertion_error(exc)

    def test_raise_func_call(self):
        ignore_assertion_error = self._make_filter_func()

        try:
            raise RuntimeError
        except Exception as exc:
            self.assertRaises(RuntimeError, ignore_assertion_error, exc)

    def test_raise_previous_func_call(self):
        ignore_assertion_error = self._make_filter_func()

        try:
            raise RuntimeError
        except Exception as exc1:
            try:
                raise RuntimeError
            except Exception as exc2:
                self.assertIsNot(exc1, exc2)
            raised = self.assertRaises(RuntimeError,
                                       ignore_assertion_error,
                                       exc1)
            self.assertIs(exc1, raised)

    def test_raise_previous_after_filtered_func_call(self):
        ignore_assertion_error = self._make_filter_func()

        try:
            raise RuntimeError
        except Exception as exc1:
            try:
                assert False, "This is a test"
            except Exception:
                pass
            self.assertRaises(RuntimeError, ignore_assertion_error, exc1)

    def test_raise_other_func_call(self):
        @excutils.exception_filter
        def translate_exceptions(ex):
            raise RuntimeError

        try:
            assert False, "This is a test"
        except Exception as exc:
            self.assertRaises(RuntimeError, translate_exceptions, exc)

    def test_filter_func_context_manager(self):
        ignore_assertion_error = self._make_filter_func()

        with ignore_assertion_error:
            assert False, "This is a test"

    def test_raise_func_context_manager(self):
        ignore_assertion_error = self._make_filter_func()

        def try_runtime_err():
            with ignore_assertion_error:
                raise RuntimeError

        self.assertRaises(RuntimeError, try_runtime_err)

    def test_raise_other_func_context_manager(self):
        @excutils.exception_filter
        def translate_exceptions(ex):
            raise RuntimeError

        def try_assertion():
            with translate_exceptions:
                assert False, "This is a test"

        self.assertRaises(RuntimeError, try_assertion)

    def test_noexc_func_context_manager(self):
        ignore_assertion_error = self._make_filter_func()

        with ignore_assertion_error:
            pass

    def test_noexc_nocall_func_context_manager(self):
        @excutils.exception_filter
        def translate_exceptions(ex):
            raise RuntimeError

        with translate_exceptions:
            pass

    def test_func_docstring(self):
        ignore_func = self._make_filter_func()
        self.assertEqual('Ignore some exceptions F.', ignore_func.__doc__)

    def test_filter_method_call(self):
        ignore_assertion_error = self._make_filter_method()

        try:
            assert False, "This is a test"
        except Exception as exc:
            ignore_assertion_error(exc)

    def test_raise_method_call(self):
        ignore_assertion_error = self._make_filter_method()

        try:
            raise RuntimeError
        except Exception as exc:
            self.assertRaises(RuntimeError, ignore_assertion_error, exc)

    def test_filter_method_context_manager(self):
        ignore_assertion_error = self._make_filter_method()

        with ignore_assertion_error:
            assert False, "This is a test"

    def test_raise_method_context_manager(self):
        ignore_assertion_error = self._make_filter_method()

        def try_runtime_err():
            with ignore_assertion_error:
                raise RuntimeError

        self.assertRaises(RuntimeError, try_runtime_err)

    def test_method_docstring(self):
        ignore_func = self._make_filter_method()
        self.assertEqual('Ignore some exceptions M.', ignore_func.__doc__)

    def test_filter_classmethod_call(self):
        ignore_assertion_error = self._make_filter_classmethod()

        try:
            assert False, "This is a test"
        except Exception as exc:
            ignore_assertion_error(exc)

    def test_raise_classmethod_call(self):
        ignore_assertion_error = self._make_filter_classmethod()

        try:
            raise RuntimeError
        except Exception as exc:
            self.assertRaises(RuntimeError, ignore_assertion_error, exc)

    def test_filter_classmethod_context_manager(self):
        ignore_assertion_error = self._make_filter_classmethod()

        with ignore_assertion_error:
            assert False, "This is a test"

    def test_raise_classmethod_context_manager(self):
        ignore_assertion_error = self._make_filter_classmethod()

        def try_runtime_err():
            with ignore_assertion_error:
                raise RuntimeError

        self.assertRaises(RuntimeError, try_runtime_err)

    def test_classmethod_docstring(self):
        ignore_func = self._make_filter_classmethod()
        self.assertEqual('Ignore some exceptions C.', ignore_func.__doc__)

    def test_filter_staticmethod_call(self):
        ignore_assertion_error = self._make_filter_staticmethod()

        try:
            assert False, "This is a test"
        except Exception as exc:
            ignore_assertion_error(exc)

    def test_raise_staticmethod_call(self):
        ignore_assertion_error = self._make_filter_staticmethod()

        try:
            raise RuntimeError
        except Exception as exc:
            self.assertRaises(RuntimeError, ignore_assertion_error, exc)

    def test_filter_staticmethod_context_manager(self):
        ignore_assertion_error = self._make_filter_staticmethod()

        with ignore_assertion_error:
            assert False, "This is a test"

    def test_raise_staticmethod_context_manager(self):
        ignore_assertion_error = self._make_filter_staticmethod()

        def try_runtime_err():
            with ignore_assertion_error:
                raise RuntimeError

        self.assertRaises(RuntimeError, try_runtime_err)

    def test_staticmethod_docstring(self):
        ignore_func = self._make_filter_staticmethod()
        self.assertEqual('Ignore some exceptions S.', ignore_func.__doc__)
@ -1,191 +0,0 @@
# Copyright 2011 OpenStack Foundation.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import errno
import os
import shutil
import stat
import tempfile
import uuid

from oslotest import base as test_base
import six

from oslo_utils import fileutils


TEST_PERMISSIONS = stat.S_IRWXU


class EnsureTree(test_base.BaseTestCase):
    def test_ensure_tree(self):
        tmpdir = tempfile.mkdtemp()
        try:
            testdir = '%s/foo/bar/baz' % (tmpdir,)
            fileutils.ensure_tree(testdir, TEST_PERMISSIONS)
            self.assertTrue(os.path.isdir(testdir))
            self.assertEqual(os.stat(testdir).st_mode,
                             TEST_PERMISSIONS | stat.S_IFDIR)
        finally:
            if os.path.exists(tmpdir):
                shutil.rmtree(tmpdir)


class DeleteIfExists(test_base.BaseTestCase):
    def test_file_present(self):
        tmpfile = tempfile.mktemp()

        open(tmpfile, 'w')
        fileutils.delete_if_exists(tmpfile)
        self.assertFalse(os.path.exists(tmpfile))

    def test_file_absent(self):
        tmpfile = tempfile.mktemp()

        fileutils.delete_if_exists(tmpfile)
        self.assertFalse(os.path.exists(tmpfile))

    def test_dir_present(self):
        tmpdir = tempfile.mktemp()
        os.mkdir(tmpdir)

        fileutils.delete_if_exists(tmpdir, remove=os.rmdir)
        self.assertFalse(os.path.exists(tmpdir))

    def test_file_error(self):
        def errm(path):
            raise OSError(errno.EINVAL, '')

        tmpfile = tempfile.mktemp()

        open(tmpfile, 'w')
        self.assertRaises(OSError, fileutils.delete_if_exists, tmpfile, errm)
        os.unlink(tmpfile)


class RemovePathOnError(test_base.BaseTestCase):
    def test_error(self):
        tmpfile = tempfile.mktemp()
        open(tmpfile, 'w')

        try:
            with fileutils.remove_path_on_error(tmpfile):
                raise Exception
        except Exception:
            self.assertFalse(os.path.exists(tmpfile))

    def test_no_error(self):
        tmpfile = tempfile.mktemp()
        open(tmpfile, 'w')

        with fileutils.remove_path_on_error(tmpfile):
            pass
        self.assertTrue(os.path.exists(tmpfile))
        os.unlink(tmpfile)

    def test_remove(self):
        tmpfile = tempfile.mktemp()
        open(tmpfile, 'w')

        try:
            with fileutils.remove_path_on_error(tmpfile, remove=lambda x: x):
                raise Exception
        except Exception:
            self.assertTrue(os.path.exists(tmpfile))
            os.unlink(tmpfile)

    def test_remove_dir(self):
        tmpdir = tempfile.mktemp()
        os.mkdir(tmpdir)

        try:
            with fileutils.remove_path_on_error(
                    tmpdir,
                    lambda path: fileutils.delete_if_exists(path, os.rmdir)):
                raise Exception
        except Exception:
            self.assertFalse(os.path.exists(tmpdir))


class WriteToTempfileTestCase(test_base.BaseTestCase):
    def setUp(self):
        super(WriteToTempfileTestCase, self).setUp()
        self.content = 'testing123'.encode('ascii')

    def check_file_content(self, path):
        with open(path, 'r') as fd:
            ans = fd.read()
            self.assertEqual(self.content, six.b(ans))

    def test_file_without_path_and_suffix(self):
        res = fileutils.write_to_tempfile(self.content)
        self.assertTrue(os.path.exists(res))

        (basepath, tmpfile) = os.path.split(res)
        self.assertTrue(basepath.startswith(tempfile.gettempdir()))
        self.assertTrue(tmpfile.startswith('tmp'))

        self.check_file_content(res)

    def test_file_with_not_existing_path(self):
        random_dir = uuid.uuid4().hex
        path = '/tmp/%s/test1' % random_dir
        res = fileutils.write_to_tempfile(self.content, path=path)
        self.assertTrue(os.path.exists(res))
        (basepath, tmpfile) = os.path.split(res)
        self.assertEqual(basepath, path)
        self.assertTrue(tmpfile.startswith('tmp'))

        self.check_file_content(res)
        shutil.rmtree('/tmp/' + random_dir)

    def test_file_with_not_default_suffix(self):
        suffix = '.conf'
        res = fileutils.write_to_tempfile(self.content, suffix=suffix)
        self.assertTrue(os.path.exists(res))

        (basepath, tmpfile) = os.path.split(res)
        self.assertTrue(basepath.startswith(tempfile.gettempdir()))
        self.assertTrue(tmpfile.startswith('tmp'))
        self.assertTrue(tmpfile.endswith('.conf'))

        self.check_file_content(res)

    def test_file_with_not_existing_path_and_not_default_suffix(self):
        suffix = '.txt'
        random_dir = uuid.uuid4().hex
        path = '/tmp/%s/test2' % random_dir
        res = fileutils.write_to_tempfile(self.content,
                                          path=path,
                                          suffix=suffix)
        self.assertTrue(os.path.exists(res))
        (basepath, tmpfile) = os.path.split(res)
        self.assertTrue(tmpfile.startswith('tmp'))
        self.assertEqual(basepath, path)
        self.assertTrue(tmpfile.endswith(suffix))

        self.check_file_content(res)
        shutil.rmtree('/tmp/' + random_dir)

    def test_file_with_not_default_prefix(self):
        prefix = 'test'
        res = fileutils.write_to_tempfile(self.content, prefix=prefix)
        self.assertTrue(os.path.exists(res))

        (basepath, tmpfile) = os.path.split(res)
        self.assertTrue(tmpfile.startswith(prefix))
        self.assertTrue(basepath.startswith(tempfile.gettempdir()))

        self.check_file_content(res)
@ -1,63 +0,0 @@
# Copyright 2015 OpenStack Foundation
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import datetime

from oslotest import base as test_base

from oslo_utils import fixture
from oslo_utils import timeutils


class TimeFixtureTest(test_base.BaseTestCase):

    def test_set_time_override_using_default(self):
        # When the fixture is used with its default constructor, the
        # override_time is set to the current timestamp.
        # Also, when the fixture is cleaned up, the override_time is reset.

        self.assertIsNone(timeutils.utcnow.override_time)
        with fixture.TimeFixture():
            self.assertIsNotNone(timeutils.utcnow.override_time)
        self.assertIsNone(timeutils.utcnow.override_time)

    def test_set_time_override(self):
        # When the fixture is used to set a time, utcnow returns that time.

        new_time = datetime.datetime(2015, 1, 2, 3, 4, 6, 7)
        self.useFixture(fixture.TimeFixture(new_time))
        self.assertEqual(new_time, timeutils.utcnow())
        # Call again to make sure it keeps returning the same time.
        self.assertEqual(new_time, timeutils.utcnow())

    def test_advance_time_delta(self):
        # advance_time_delta() advances the overridden time by some timedelta.

        new_time = datetime.datetime(2015, 1, 2, 3, 4, 6, 7)
        time_fixture = self.useFixture(fixture.TimeFixture(new_time))
        time_fixture.advance_time_delta(datetime.timedelta(seconds=1))
        expected_time = datetime.datetime(2015, 1, 2, 3, 4, 7, 7)
        self.assertEqual(expected_time, timeutils.utcnow())

    def test_advance_time_seconds(self):
        # advance_time_seconds() advances the overridden time by some number of
        # seconds.

        new_time = datetime.datetime(2015, 1, 2, 3, 4, 6, 7)
        time_fixture = self.useFixture(fixture.TimeFixture(new_time))
        time_fixture.advance_time_seconds(2)
        expected_time = datetime.datetime(2015, 1, 2, 3, 4, 8, 7)
        self.assertEqual(expected_time, timeutils.utcnow())
@ -1,61 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from __future__ import absolute_import

import fnmatch as standard_fnmatch
import ntpath
import posixpath
import sys

import mock
from oslotest import base
import six


fnmatch = None


class TestFnmatch(base.BaseTestCase):

    def _test_fnmatch(self):
        self.assertFalse(fnmatch.fnmatch("tesX", "Test"))
        self.assertTrue(fnmatch.fnmatch("test", "test"))
        self.assertFalse(fnmatch.fnmatchcase("test", "Test"))
        self.assertTrue(fnmatch.fnmatchcase("test", "test"))
        self.assertTrue(fnmatch.fnmatch("testX", "test*"))
        self.assertEqual(["Test"], fnmatch.filter(["Test", "TestX"], "Test"))

    def _test_fnmatch_posix_nt(self):
        with mock.patch("os.path", new=posixpath):
            self.assertFalse(fnmatch.fnmatch("test", "Test"))
            self._test_fnmatch()
        with mock.patch("os.path", new=ntpath):
            self._test_fnmatch()
            self.assertTrue(fnmatch.fnmatch("test", "Test"))
            self.assertEqual(["Test"],
                             fnmatch.filter(["Test", "TestX"], "test"))

    def test_fnmatch(self):
        global fnmatch

        fnmatch = standard_fnmatch
        self._test_fnmatch_posix_nt()

        with mock.patch.object(sys, 'version_info', new=(2, 7, 11)):
            from oslo_utils import fnmatch as oslo_fnmatch
            fnmatch = oslo_fnmatch
            self._test_fnmatch_posix_nt()

        with mock.patch.object(sys, 'version_info', new=(2, 7, 0)):
            six.moves.reload_module(oslo_fnmatch)
            self._test_fnmatch_posix_nt()
@ -1,227 +0,0 @@
# Copyright (C) 2012 Yahoo! Inc.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from oslotest import base as test_base
import testscenarios

from oslo_utils import imageutils

load_tests = testscenarios.load_tests_apply_scenarios


class ImageUtilsRawTestCase(test_base.BaseTestCase):

    _image_name = [
        ('disk_config', dict(image_name='disk.config')),
    ]

    _file_format = [
        ('raw', dict(file_format='raw')),
    ]

    _virtual_size = [
        ('64M', dict(virtual_size='64M',
                     exp_virtual_size=67108864)),
        ('64M_with_byte_hint', dict(virtual_size='64M (67108844 bytes)',
                                    exp_virtual_size=67108844)),
        ('64M_byte', dict(virtual_size='67108844',
                          exp_virtual_size=67108844)),
        ('4.4M', dict(virtual_size='4.4M',
                      exp_virtual_size=4613735)),
        ('4.4M_with_byte_hint', dict(virtual_size='4.4M (4592640 bytes)',
                                     exp_virtual_size=4592640)),
        ('2K', dict(virtual_size='2K',
                    exp_virtual_size=2048)),
        ('2K_with_byte_hint', dict(virtual_size='2K (2048 bytes)',
                                   exp_virtual_size=2048)),
    ]

    _disk_size = [
        ('96K', dict(disk_size='96K',
                     exp_disk_size=98304)),
        ('96K_byte', dict(disk_size='98304',
                          exp_disk_size=98304)),
        ('3.1G', dict(disk_size='3.1G',
                      exp_disk_size=3328599655)),
        ('unavailable', dict(disk_size='unavailable',
                             exp_disk_size=0)),
    ]

    _garbage_before_snapshot = [
        ('no_garbage', dict(garbage_before_snapshot=None)),
        ('garbage_before_snapshot_list', dict(garbage_before_snapshot=False)),
        ('garbage_after_snapshot_list', dict(garbage_before_snapshot=True)),
    ]

    _snapshot_count = [
        ('no_snapshots', dict(snapshot_count=None)),
        ('one_snapshots', dict(snapshot_count=1)),
        ('three_snapshots', dict(snapshot_count=3)),
    ]

    @classmethod
    def generate_scenarios(cls):
        cls.scenarios = testscenarios.multiply_scenarios(
            cls._image_name,
            cls._file_format,
            cls._virtual_size,
            cls._disk_size,
            cls._garbage_before_snapshot,
            cls._snapshot_count)

    def _initialize_img_info(self):
        return ('image: %s' % self.image_name,
                'file_format: %s' % self.file_format,
                'virtual_size: %s' % self.virtual_size,
                'disk_size: %s' % self.disk_size)

    def _insert_snapshots(self, img_info):
        img_info = img_info + ('Snapshot list:',)
        img_info = img_info + ('ID '
                               'TAG '
                               'VM SIZE '
                               'DATE '
                               'VM CLOCK',)
        for i in range(self.snapshot_count):
            img_info = img_info + ('%d '
                                   'd9a9784a500742a7bb95627bb3aace38 '
                                   '0 2012-08-20 10:52:46 '
                                   '00:00:00.000' % (i + 1),)
        return img_info

    def _base_validation(self, image_info):
        self.assertEqual(image_info.image, self.image_name)
        self.assertEqual(image_info.file_format, self.file_format)
        self.assertEqual(image_info.virtual_size, self.exp_virtual_size)
        self.assertEqual(image_info.disk_size, self.exp_disk_size)
        if self.snapshot_count is not None:
            self.assertEqual(len(image_info.snapshots), self.snapshot_count)

    def test_qemu_img_info(self):
        img_info = self._initialize_img_info()
        if self.garbage_before_snapshot is True:
            img_info = img_info + ('blah BLAH: bb',)
        if self.snapshot_count is not None:
            img_info = self._insert_snapshots(img_info)
        if self.garbage_before_snapshot is False:
            img_info = img_info + ('junk stuff: bbb',)
        example_output = '\n'.join(img_info)
        image_info = imageutils.QemuImgInfo(example_output)
        self._base_validation(image_info)


ImageUtilsRawTestCase.generate_scenarios()


class ImageUtilsQemuTestCase(ImageUtilsRawTestCase):

    _file_format = [
        ('qcow2', dict(file_format='qcow2')),
    ]

    _qcow2_cluster_size = [
        ('65536', dict(cluster_size='65536', exp_cluster_size=65536)),
    ]

    _qcow2_encrypted = [
        ('no_encryption', dict(encrypted=None)),
        ('encrypted', dict(encrypted='yes')),
    ]

    _qcow2_backing_file = [
        ('no_backing_file', dict(backing_file=None)),
        ('backing_file_path',
         dict(backing_file='/var/lib/nova/a328c7998805951a_2',
              exp_backing_file='/var/lib/nova/a328c7998805951a_2')),
        ('backing_file_path_with_actual_path',
         dict(backing_file='/var/lib/nova/a328c7998805951a_2 '
                           '(actual path: /b/3a988059e51a_2)',
              exp_backing_file='/b/3a988059e51a_2')),
    ]

    @classmethod
    def generate_scenarios(cls):
        cls.scenarios = testscenarios.multiply_scenarios(
            cls._image_name,
            cls._file_format,
            cls._virtual_size,
            cls._disk_size,
            cls._garbage_before_snapshot,
            cls._snapshot_count,
            cls._qcow2_cluster_size,
            cls._qcow2_encrypted,
            cls._qcow2_backing_file)

    def test_qemu_img_info(self):
        img_info = self._initialize_img_info()
        img_info = img_info + ('cluster_size: %s' % self.cluster_size,)
        if self.backing_file is not None:
            img_info = img_info + ('backing file: %s' %
                                   self.backing_file,)
        if self.encrypted is not None:
            img_info = img_info + ('encrypted: %s' % self.encrypted,)
        if self.garbage_before_snapshot is True:
            img_info = img_info + ('blah BLAH: bb',)
        if self.snapshot_count is not None:
            img_info = self._insert_snapshots(img_info)
        if self.garbage_before_snapshot is False:
            img_info = img_info + ('junk stuff: bbb',)
        example_output = '\n'.join(img_info)
        image_info = imageutils.QemuImgInfo(example_output)
|
||||
self._base_validation(image_info)
|
||||
self.assertEqual(image_info.cluster_size, self.exp_cluster_size)
|
||||
if self.backing_file is not None:
|
||||
self.assertEqual(image_info.backing_file,
|
||||
self.exp_backing_file)
|
||||
if self.encrypted is not None:
|
||||
self.assertEqual(image_info.encrypted, self.encrypted)
|
||||
|
||||
ImageUtilsQemuTestCase.generate_scenarios()
|
||||
|
||||
|
||||
class ImageUtilsBlankTestCase(test_base.BaseTestCase):
|
||||
def test_qemu_img_info_blank(self):
|
||||
example_output = '\n'.join(['image: None', 'file_format: None',
|
||||
'virtual_size: None', 'disk_size: None',
|
||||
'cluster_size: None',
|
||||
'backing_file: None'])
|
||||
image_info = imageutils.QemuImgInfo()
|
||||
self.assertEqual(str(image_info), example_output)
|
||||
self.assertEqual(len(image_info.snapshots), 0)
|
||||
|
||||
|
||||
class ImageUtilsJSONTestCase(test_base.BaseTestCase):
|
||||
def test_qemu_img_info_json_format(self):
|
||||
img_output = '''{
|
||||
"virtual-size": 41126400,
|
||||
"filename": "fake_img",
|
||||
"cluster-size": 65536,
|
||||
"format": "qcow2",
|
||||
"actual-size": 13168640
|
||||
}'''
|
||||
image_info = imageutils.QemuImgInfo(img_output, format='json')
|
||||
self.assertEqual(41126400, image_info.virtual_size)
|
||||
self.assertEqual('fake_img', image_info.image)
|
||||
self.assertEqual(65536, image_info.cluster_size)
|
||||
self.assertEqual('qcow2', image_info.file_format)
|
||||
self.assertEqual(13168640, image_info.disk_size)
|
||||
|
||||
def test_qemu_img_info_json_format_blank(self):
|
||||
img_output = '{}'
|
||||
image_info = imageutils.QemuImgInfo(img_output, format='json')
|
||||
self.assertIsNone(image_info.virtual_size)
|
||||
self.assertIsNone(image_info.image)
|
||||
self.assertIsNone(image_info.cluster_size)
|
||||
self.assertIsNone(image_info.file_format)
|
||||
self.assertIsNone(image_info.disk_size)
|
|
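The tests above feed `imageutils.QemuImgInfo` synthetic `key: value` text in the shape of `qemu-img info` output, including junk lines it must ignore. As a minimal sketch of that parsing idea (not the oslo.utils implementation; `parse_img_info` is a hypothetical helper):

```python
def parse_img_info(output):
    """Parse 'qemu-img info'-style text into a dict, skipping junk lines."""
    info = {}
    for line in output.splitlines():
        if ':' not in line:
            # Lines without a key/value separator are noise; skip them.
            continue
        key, _, value = line.partition(':')
        info[key.strip().lower().replace(' ', '_')] = value.strip()
    return info


example = '\n'.join(['image: disk.img',
                     'file_format: raw',
                     'virtual_size: 4096',
                     'disk_size: 96K'])
print(parse_img_info(example)['file_format'])  # raw
```

Unlike this sketch, the real class also handles the snapshot table and a JSON mode (`format='json'`), as the later test cases show.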
@ -1,152 +0,0 @@
# Copyright 2011 OpenStack Foundation.
# All Rights Reserved.
#
#    Licensed under the Apache License, Version 2.0 (the "License"); you may
#    not use this file except in compliance with the License. You may obtain
#    a copy of the License at
#
#         http://www.apache.org/licenses/LICENSE-2.0
#
#    Unless required by applicable law or agreed to in writing, software
#    distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
#    WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
#    License for the specific language governing permissions and limitations
#    under the License.

import datetime
import sys

from oslotest import base as test_base

from oslo_utils import importutils


class ImportUtilsTest(test_base.BaseTestCase):

    # NOTE(jkoelker) There has GOT to be a way to test this. But mocking
    #                __import__ is the devil. Right now we just make
    #                sure we can import something from the stdlib
    def test_import_class(self):
        dt = importutils.import_class('datetime.datetime')
        self.assertEqual(sys.modules['datetime'].datetime, dt)

    def test_import_bad_class(self):
        self.assertRaises(ImportError, importutils.import_class,
                          'lol.u_mad.brah')

    def test_import_module(self):
        dt = importutils.import_module('datetime')
        self.assertEqual(sys.modules['datetime'], dt)

    def test_import_object_optional_arg_not_present(self):
        obj = importutils.import_object('oslo_utils.tests.fake.FakeDriver')
        self.assertEqual(obj.__class__.__name__, 'FakeDriver')

    def test_import_object_optional_arg_present(self):
        obj = importutils.import_object('oslo_utils.tests.fake.FakeDriver',
                                        first_arg=False)
        self.assertEqual(obj.__class__.__name__, 'FakeDriver')

    def test_import_object_required_arg_not_present(self):
        # arg 1 isn't optional here
        self.assertRaises(TypeError, importutils.import_object,
                          'oslo_utils.tests.fake.FakeDriver2')

    def test_import_object_required_arg_present(self):
        obj = importutils.import_object('oslo_utils.tests.fake.FakeDriver2',
                                        first_arg=False)
        self.assertEqual(obj.__class__.__name__, 'FakeDriver2')

    # namespace tests
    def test_import_object_ns_optional_arg_not_present(self):
        obj = importutils.import_object_ns('oslo_utils',
                                           'tests.fake.FakeDriver')
        self.assertEqual(obj.__class__.__name__, 'FakeDriver')

    def test_import_object_ns_optional_arg_present(self):
        obj = importutils.import_object_ns('oslo_utils',
                                           'tests.fake.FakeDriver',
                                           first_arg=False)
        self.assertEqual(obj.__class__.__name__, 'FakeDriver')

    def test_import_object_ns_required_arg_not_present(self):
        # arg 1 isn't optional here
        self.assertRaises(TypeError, importutils.import_object_ns,
                          'oslo_utils', 'tests.fake.FakeDriver2')

    def test_import_object_ns_required_arg_present(self):
        obj = importutils.import_object_ns('oslo_utils',
                                           'tests.fake.FakeDriver2',
                                           first_arg=False)
        self.assertEqual(obj.__class__.__name__, 'FakeDriver2')

    # namespace tests
    def test_import_object_ns_full_optional_arg_not_present(self):
        obj = importutils.import_object_ns('tests2',
                                           'oslo_utils.tests.fake.FakeDriver')
        self.assertEqual(obj.__class__.__name__, 'FakeDriver')

    def test_import_object_ns_full_optional_arg_present(self):
        obj = importutils.import_object_ns('tests2',
                                           'oslo_utils.tests.fake.FakeDriver',
                                           first_arg=False)
        self.assertEqual(obj.__class__.__name__, 'FakeDriver')

    def test_import_object_ns_full_required_arg_not_present(self):
        # arg 1 isn't optional here
        self.assertRaises(TypeError, importutils.import_object_ns,
                          'tests2', 'oslo_utils.tests.fake.FakeDriver2')

    def test_import_object_ns_full_required_arg_present(self):
        obj = importutils.import_object_ns('tests2',
                                           'oslo_utils.tests.fake.FakeDriver2',
                                           first_arg=False)
        self.assertEqual(obj.__class__.__name__, 'FakeDriver2')

    def test_import_object_ns_raise_import_error_in_init(self):
        self.assertRaises(ImportError, importutils.import_object_ns,
                          'tests2', 'oslo_utils.tests.fake.FakeDriver3')

    def test_import_object(self):
        dt = importutils.import_object('datetime.time')
        self.assertIsInstance(dt, sys.modules['datetime'].time)

    def test_import_object_with_args(self):
        dt = importutils.import_object('datetime.datetime', 2012, 4, 5)
        self.assertIsInstance(dt, sys.modules['datetime'].datetime)
        self.assertEqual(dt, datetime.datetime(2012, 4, 5))

    def test_import_versioned_module(self):
        v2 = importutils.import_versioned_module('oslo_utils.tests.fake', 2)
        self.assertEqual(sys.modules['oslo_utils.tests.fake.v2'], v2)

        dummpy = importutils.import_versioned_module('oslo_utils.tests.fake',
                                                     2, 'dummpy')
        self.assertEqual(sys.modules['oslo_utils.tests.fake.v2.dummpy'],
                         dummpy)

    def test_import_versioned_module_wrong_version_parameter(self):
        self.assertRaises(ValueError,
                          importutils.import_versioned_module,
                          'oslo_utils.tests.fake', "2.0", 'fake')

    def test_import_versioned_module_error(self):
        self.assertRaises(ImportError,
                          importutils.import_versioned_module,
                          'oslo_utils.tests.fake', 2, 'fake')

    def test_try_import(self):
        dt = importutils.try_import('datetime')
        self.assertEqual(sys.modules['datetime'], dt)

    def test_try_import_returns_default(self):
        foo = importutils.try_import('foo.bar')
        self.assertIsNone(foo)

    def test_import_any_none_found(self):
        self.assertRaises(ImportError, importutils.import_any,
                          'foo.bar', 'foo.foo.bar')

    def test_import_any_found(self):
        dt = importutils.import_any('foo.bar', 'datetime')
        self.assertEqual(sys.modules['datetime'], dt)
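The importutils tests above revolve around importing classes and instantiating objects from dotted-path strings. A hedged sketch of that core pattern using only the standard library (the real helpers in `oslo_utils.importutils` add error handling and namespace fallbacks not shown here):

```python
import importlib


def import_class(import_str):
    """Import a class given a dotted 'module.ClassName' string."""
    module_name, _, class_name = import_str.rpartition('.')
    module = importlib.import_module(module_name)
    return getattr(module, class_name)


def import_object(import_str, *args, **kwargs):
    """Import a class and instantiate it with the given arguments."""
    return import_class(import_str)(*args, **kwargs)


dt_cls = import_class('datetime.datetime')
print(dt_cls(2012, 4, 5))  # 2012-04-05 00:00:00
```

This is why `test_import_bad_class` expects `ImportError` (the module lookup fails) while `test_import_object_required_arg_not_present` expects `TypeError` (the import succeeds but the constructor call does not).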
@ -1,430 +0,0 @@
# Copyright 2012 OpenStack Foundation.
# All Rights Reserved.
#
#    Licensed under the Apache License, Version 2.0 (the "License"); you may
#    not use this file except in compliance with the License. You may obtain
#    a copy of the License at
#
#         http://www.apache.org/licenses/LICENSE-2.0
#
#    Unless required by applicable law or agreed to in writing, software
#    distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
#    WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
#    License for the specific language governing permissions and limitations
#    under the License.

import contextlib
import socket

import mock
import netifaces
from oslotest import base as test_base
import six

from oslo_utils import netutils


class NetworkUtilsTest(test_base.BaseTestCase):

    def test_no_host(self):
        result = netutils.urlsplit('http://')
        self.assertEqual('', result.netloc)
        self.assertIsNone(result.port)
        self.assertIsNone(result.hostname)
        self.assertEqual('http', result.scheme)

    def test_parse_host_port(self):
        self.assertEqual(('server01', 80),
                         netutils.parse_host_port('server01:80'))
        self.assertEqual(('server01', None),
                         netutils.parse_host_port('server01'))
        self.assertEqual(('server01', 1234),
                         netutils.parse_host_port('server01',
                                                  default_port=1234))
        self.assertEqual(('::1', 80),
                         netutils.parse_host_port('[::1]:80'))
        self.assertEqual(('::1', None),
                         netutils.parse_host_port('[::1]'))
        self.assertEqual(('::1', 1234),
                         netutils.parse_host_port('[::1]',
                                                  default_port=1234))
        self.assertEqual(('2001:db8:85a3::8a2e:370:7334', 1234),
                         netutils.parse_host_port(
                             '2001:db8:85a3::8a2e:370:7334',
                             default_port=1234))

    def test_urlsplit(self):
        result = netutils.urlsplit('rpc://myhost?someparam#somefragment')
        self.assertEqual(result.scheme, 'rpc')
        self.assertEqual(result.netloc, 'myhost')
        self.assertEqual(result.path, '')
        self.assertEqual(result.query, 'someparam')
        self.assertEqual(result.fragment, 'somefragment')

        result = netutils.urlsplit(
            'rpc://myhost/mypath?someparam#somefragment',
            allow_fragments=False)
        self.assertEqual(result.scheme, 'rpc')
        self.assertEqual(result.netloc, 'myhost')
        self.assertEqual(result.path, '/mypath')
        self.assertEqual(result.query, 'someparam#somefragment')
        self.assertEqual(result.fragment, '')

        result = netutils.urlsplit(
            'rpc://user:pass@myhost/mypath?someparam#somefragment',
            allow_fragments=False)
        self.assertEqual(result.scheme, 'rpc')
        self.assertEqual(result.netloc, 'user:pass@myhost')
        self.assertEqual(result.path, '/mypath')
        self.assertEqual(result.query, 'someparam#somefragment')
        self.assertEqual(result.fragment, '')

    def test_urlsplit_ipv6(self):
        ipv6_url = 'http://[::1]:443/v2.0/'
        result = netutils.urlsplit(ipv6_url)
        self.assertEqual(result.scheme, 'http')
        self.assertEqual(result.netloc, '[::1]:443')
        self.assertEqual(result.path, '/v2.0/')
        self.assertEqual(result.hostname, '::1')
        self.assertEqual(result.port, 443)

        ipv6_url = 'http://user:pass@[::1]/v2.0/'
        result = netutils.urlsplit(ipv6_url)
        self.assertEqual(result.scheme, 'http')
        self.assertEqual(result.netloc, 'user:pass@[::1]')
        self.assertEqual(result.path, '/v2.0/')
        self.assertEqual(result.hostname, '::1')
        self.assertIsNone(result.port)

        ipv6_url = 'https://[2001:db8:85a3::8a2e:370:7334]:1234/v2.0/xy?ab#12'
        result = netutils.urlsplit(ipv6_url)
        self.assertEqual(result.scheme, 'https')
        self.assertEqual(result.netloc, '[2001:db8:85a3::8a2e:370:7334]:1234')
        self.assertEqual(result.path, '/v2.0/xy')
        self.assertEqual(result.hostname, '2001:db8:85a3::8a2e:370:7334')
        self.assertEqual(result.port, 1234)
        self.assertEqual(result.query, 'ab')
        self.assertEqual(result.fragment, '12')

    def test_urlsplit_params(self):
        test_url = "http://localhost/?a=b&c=d"
        result = netutils.urlsplit(test_url)
        self.assertEqual({'a': 'b', 'c': 'd'}, result.params())
        self.assertEqual({'a': 'b', 'c': 'd'}, result.params(collapse=False))

        test_url = "http://localhost/?a=b&a=c&a=d"
        result = netutils.urlsplit(test_url)
        self.assertEqual({'a': 'd'}, result.params())
        self.assertEqual({'a': ['b', 'c', 'd']}, result.params(collapse=False))

        test_url = "http://localhost"
        result = netutils.urlsplit(test_url)
        self.assertEqual({}, result.params())

        test_url = "http://localhost?"
        result = netutils.urlsplit(test_url)
        self.assertEqual({}, result.params())

    def test_set_tcp_keepalive(self):
        mock_sock = mock.Mock()
        netutils.set_tcp_keepalive(mock_sock, True, 100, 10, 5)
        calls = [
            mock.call.setsockopt(socket.SOL_SOCKET,
                                 socket.SO_KEEPALIVE, True),
        ]
        if hasattr(socket, 'TCP_KEEPIDLE'):
            calls += [
                mock.call.setsockopt(socket.IPPROTO_TCP,
                                     socket.TCP_KEEPIDLE, 100)
            ]
        if hasattr(socket, 'TCP_KEEPINTVL'):
            calls += [
                mock.call.setsockopt(socket.IPPROTO_TCP,
                                     socket.TCP_KEEPINTVL, 10),
            ]
        if hasattr(socket, 'TCP_KEEPCNT'):
            calls += [
                mock.call.setsockopt(socket.IPPROTO_TCP,
                                     socket.TCP_KEEPCNT, 5)
            ]
        mock_sock.assert_has_calls(calls)

        mock_sock.reset_mock()
        netutils.set_tcp_keepalive(mock_sock, False)
        self.assertEqual(1, len(mock_sock.mock_calls))

    def test_is_valid_ipv4(self):
        self.assertTrue(netutils.is_valid_ipv4('42.42.42.42'))

        self.assertFalse(netutils.is_valid_ipv4('-1.11.11.11'))

        self.assertFalse(netutils.is_valid_ipv4(''))

    def test_is_valid_ipv6(self):
        self.assertTrue(netutils.is_valid_ipv6('::1'))

        self.assertTrue(netutils.is_valid_ipv6('fe80::1%eth0'))

        self.assertFalse(netutils.is_valid_ip('fe%80::1%eth0'))

        self.assertFalse(netutils.is_valid_ipv6(
            '1fff::a88:85a3::172.31.128.1'))

        self.assertFalse(netutils.is_valid_ipv6(''))

    def test_is_valid_ip(self):
        self.assertTrue(netutils.is_valid_ip('127.0.0.1'))

        self.assertTrue(netutils.is_valid_ip('2001:db8::ff00:42:8329'))

        self.assertTrue(netutils.is_valid_ip('fe80::1%eth0'))

        self.assertFalse(netutils.is_valid_ip('256.0.0.0'))

        self.assertFalse(netutils.is_valid_ip('::1.2.3.'))

        self.assertFalse(netutils.is_valid_ip(''))

        self.assertFalse(netutils.is_valid_ip(None))

    def test_is_valid_mac(self):
        self.assertTrue(netutils.is_valid_mac("52:54:00:cf:2d:31"))
        self.assertTrue(netutils.is_valid_mac(u"52:54:00:cf:2d:31"))
        self.assertFalse(netutils.is_valid_mac("127.0.0.1"))
        self.assertFalse(netutils.is_valid_mac("not:a:mac:address"))
        self.assertFalse(netutils.is_valid_mac("52-54-00-cf-2d-31"))
        self.assertFalse(netutils.is_valid_mac("aa bb cc dd ee ff"))
        self.assertTrue(netutils.is_valid_mac("AA:BB:CC:DD:EE:FF"))
        self.assertFalse(netutils.is_valid_mac("AA BB CC DD EE FF"))
        self.assertFalse(netutils.is_valid_mac("AA-BB-CC-DD-EE-FF"))

    def test_is_valid_cidr(self):
        self.assertTrue(netutils.is_valid_cidr('10.0.0.0/24'))
        self.assertTrue(netutils.is_valid_cidr('10.0.0.1/32'))
        self.assertTrue(netutils.is_valid_cidr('0.0.0.0/0'))
        self.assertTrue(netutils.is_valid_cidr('2600::/64'))
        self.assertTrue(netutils.is_valid_cidr(
            '0000:0000:0000:0000:0000:0000:0000:0001/32'))

        self.assertFalse(netutils.is_valid_cidr('10.0.0.1'))
        self.assertFalse(netutils.is_valid_cidr('10.0.0.1/33'))
        self.assertFalse(netutils.is_valid_cidr(10))

    def test_is_valid_ipv6_cidr(self):
        self.assertTrue(netutils.is_valid_ipv6_cidr("2600::/64"))
        self.assertTrue(netutils.is_valid_ipv6_cidr(
            "abcd:ef01:2345:6789:abcd:ef01:192.168.254.254/48"))
        self.assertTrue(netutils.is_valid_ipv6_cidr(
            "0000:0000:0000:0000:0000:0000:0000:0001/32"))
        self.assertTrue(netutils.is_valid_ipv6_cidr(
            "0000:0000:0000:0000:0000:0000:0000:0001"))
        self.assertFalse(netutils.is_valid_ipv6_cidr("foo"))
        self.assertFalse(netutils.is_valid_ipv6_cidr("127.0.0.1"))

    def test_valid_port(self):
        valid_inputs = [0, '0', 1, '1', 2, '3', '5', 8, 13, 21,
                        '80', '3246', '65535']
        for input_str in valid_inputs:
            self.assertTrue(netutils.is_valid_port(input_str))

    def test_valid_port_fail(self):
        invalid_inputs = ['-32768', '65536', 528491, '528491',
                          '528.491', 'thirty-seven', None]
        for input_str in invalid_inputs:
            self.assertFalse(netutils.is_valid_port(input_str))

    def test_get_my_ip(self):
        sock_attrs = {
            'return_value.getsockname.return_value': ['1.2.3.4', '']}
        with mock.patch('socket.socket', **sock_attrs):
            addr = netutils.get_my_ipv4()
        self.assertEqual(addr, '1.2.3.4')

    def test_is_int_in_range(self):
        valid_inputs = [(1, -100, 100),
                        ('1', -100, 100),
                        (100, -100, 100),
                        ('100', -100, 100),
                        (-100, -100, 100),
                        ('-100', -100, 100)]
        for input_value in valid_inputs:
            self.assertTrue(netutils._is_int_in_range(*input_value))

    def test_is_int_not_in_range(self):
        invalid_inputs = [(None, 1, 100),
                          ('ten', 1, 100),
                          (-1, 0, 255),
                          ('None', 1, 100)]
        for input_value in invalid_inputs:
            self.assertFalse(netutils._is_int_in_range(*input_value))

    def test_valid_icmp_type(self):
        valid_inputs = [1, '1', 0, '0', 255, '255']
        for input_value in valid_inputs:
            self.assertTrue(netutils.is_valid_icmp_type(input_value))

    def test_invalid_icmp_type(self):
        invalid_inputs = [-1, '-1', 256, '256', None, 'None', 'five']
        for input_value in invalid_inputs:
            self.assertFalse(netutils.is_valid_icmp_type(input_value))

    def test_valid_icmp_code(self):
        valid_inputs = [1, '1', 0, '0', 255, '255', None]
        for input_value in valid_inputs:
            self.assertTrue(netutils.is_valid_icmp_code(input_value))

    def test_invalid_icmp_code(self):
        invalid_inputs = [-1, '-1', 256, '256', 'None', 'zero']
        for input_value in invalid_inputs:
            self.assertFalse(netutils.is_valid_icmp_code(input_value))

    @mock.patch('socket.socket')
    @mock.patch('oslo_utils.netutils._get_my_ipv4_address')
    def test_get_my_ip_socket_error(self, ip, mock_socket):
        mock_socket.side_effect = socket.error
        ip.return_value = '1.2.3.4'
        addr = netutils.get_my_ipv4()
        self.assertEqual(addr, '1.2.3.4')

    @mock.patch('netifaces.gateways')
    @mock.patch('netifaces.ifaddresses')
    def test_get_my_ipv4_address_with_default_route(
            self, ifaddr, gateways):
        with mock.patch.dict(netifaces.__dict__, {'AF_INET': '0'}):
            ifaddr.return_value = {'0': [{'addr': '172.18.204.1'}]}
            addr = netutils._get_my_ipv4_address()
        self.assertEqual('172.18.204.1', addr)

    @mock.patch('netifaces.gateways')
    @mock.patch('netifaces.ifaddresses')
    def test_get_my_ipv4_address_without_default_route(
            self, ifaddr, gateways):
        with mock.patch.dict(netifaces.__dict__, {'AF_INET': '0'}):
            ifaddr.return_value = {}
            addr = netutils._get_my_ipv4_address()
        self.assertEqual('127.0.0.1', addr)

    @mock.patch('netifaces.gateways')
    @mock.patch('netifaces.ifaddresses')
    def test_get_my_ipv4_address_without_default_interface(
            self, ifaddr, gateways):
        gateways.return_value = {}
        addr = netutils._get_my_ipv4_address()
        self.assertEqual('127.0.0.1', addr)
        self.assertFalse(ifaddr.called)


class IPv6byEUI64TestCase(test_base.BaseTestCase):
    """Unit tests to generate IPv6 by EUI-64 operations."""

    def test_generate_IPv6_by_EUI64(self):
        addr = netutils.get_ipv6_addr_by_EUI64('2001:db8::',
                                               '00:16:3e:33:44:55')
        self.assertEqual('2001:db8::216:3eff:fe33:4455', addr.format())

    def test_generate_IPv6_with_IPv4_prefix(self):
        ipv4_prefix = '10.0.8'
        mac = '00:16:3e:33:44:55'
        self.assertRaises(ValueError, lambda:
                          netutils.get_ipv6_addr_by_EUI64(ipv4_prefix, mac))

    def test_generate_IPv6_with_bad_mac(self):
        bad_mac = '00:16:3e:33:44:5Z'
        prefix = '2001:db8::'
        self.assertRaises(ValueError, lambda:
                          netutils.get_ipv6_addr_by_EUI64(prefix, bad_mac))

    def test_generate_IPv6_with_bad_prefix(self):
        mac = '00:16:3e:33:44:55'
        bad_prefix = 'bb'
        self.assertRaises(ValueError, lambda:
                          netutils.get_ipv6_addr_by_EUI64(bad_prefix, mac))

    def test_generate_IPv6_with_error_prefix_type(self):
        mac = '00:16:3e:33:44:55'
        prefix = 123
        self.assertRaises(TypeError, lambda:
                          netutils.get_ipv6_addr_by_EUI64(prefix, mac))

    def test_generate_IPv6_with_empty_prefix(self):
        mac = '00:16:3e:33:44:55'
        prefix = ''
        self.assertRaises(ValueError, lambda:
                          netutils.get_ipv6_addr_by_EUI64(prefix, mac))


@contextlib.contextmanager
def mock_file_content(content):
    # Allows StringIO to act like a context manager-enabled file.
    yield six.StringIO(content)


class TestIsIPv6Enabled(test_base.BaseTestCase):

    def setUp(self):
        super(TestIsIPv6Enabled, self).setUp()

        def reset_detection_flag():
            netutils._IS_IPV6_ENABLED = None
        reset_detection_flag()
        self.addCleanup(reset_detection_flag)

    @mock.patch('os.path.exists', return_value=True)
    @mock.patch('six.moves.builtins.open', return_value=mock_file_content('0'))
    def test_enabled(self, mock_open, exists):
        enabled = netutils.is_ipv6_enabled()
        self.assertTrue(enabled)

    @mock.patch('os.path.exists', return_value=True)
    @mock.patch('six.moves.builtins.open', return_value=mock_file_content('1'))
    def test_disabled(self, mock_open, exists):
        enabled = netutils.is_ipv6_enabled()
        self.assertFalse(enabled)

    @mock.patch('os.path.exists', return_value=False)
    @mock.patch('six.moves.builtins.open',
                side_effect=AssertionError('should not read'))
    def test_disabled_non_exists(self, mock_open, exists):
        enabled = netutils.is_ipv6_enabled()
        self.assertFalse(enabled)

    @mock.patch('os.path.exists', return_value=True)
    def test_memoize_enabled(self, exists):
        # Reset the flag to appear that we haven't looked for it yet.
        netutils._IS_IPV6_ENABLED = None
        with mock.patch('six.moves.builtins.open',
                        return_value=mock_file_content('0')) as mock_open:
            enabled = netutils.is_ipv6_enabled()
            self.assertTrue(mock_open.called)
        self.assertTrue(netutils._IS_IPV6_ENABLED)
        self.assertTrue(enabled)
        # The second call should not use open again
        with mock.patch('six.moves.builtins.open',
                        side_effect=AssertionError('should not be called')):
            enabled = netutils.is_ipv6_enabled()
        self.assertTrue(enabled)

    @mock.patch('os.path.exists', return_value=True)
    def test_memoize_disabled(self, exists):
        # Reset the flag to appear that we haven't looked for it yet.
        netutils._IS_IPV6_ENABLED = None
        with mock.patch('six.moves.builtins.open',
                        return_value=mock_file_content('1')):
            enabled = netutils.is_ipv6_enabled()
        self.assertFalse(enabled)
        # The second call should not use open again
        with mock.patch('six.moves.builtins.open',
                        side_effect=AssertionError('should not be called')):
            enabled = netutils.is_ipv6_enabled()
        self.assertFalse(enabled)

    @mock.patch('os.path.exists', return_value=False)
    @mock.patch('six.moves.builtins.open',
                side_effect=AssertionError('should not read'))
    def test_memoize_not_exists(self, mock_open, exists):
        # Reset the flag to appear that we haven't looked for it yet.
        netutils._IS_IPV6_ENABLED = None
        enabled = netutils.is_ipv6_enabled()
        self.assertFalse(enabled)
        enabled = netutils.is_ipv6_enabled()
        self.assertFalse(enabled)
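The `test_parse_host_port` cases above pin down a convention worth noting: IPv6 literals must be bracketed (`[::1]:80`) for a port to be recognized, while an unbracketed IPv6 address is treated as host-only. A simplified sketch of that behavior (the real logic is `netutils.parse_host_port`; this hypothetical version skips its edge-case handling):

```python
def parse_host_port(address, default_port=None):
    """Split 'host:port' or '[v6host]:port' into a (host, port) tuple."""
    if address.startswith('['):
        # Bracketed IPv6 literal, e.g. '[::1]:80' or '[::1]'
        host, _, rest = address[1:].partition(']')
        port = rest.partition(':')[2]
        return host, int(port) if port else default_port
    if address.count(':') == 1:
        # Hostname or IPv4 with an explicit port
        host, _, port = address.partition(':')
        return host, int(port)
    # Bare hostname/IPv4, or unbracketed IPv6 (no port recognized)
    return address, default_port


print(parse_host_port('[::1]:80'))        # ('::1', 80)
print(parse_host_port('server01', 1234))  # ('server01', 1234)
```

Without the bracket rule, a colon-separated IPv6 address like `2001:db8::1` would be ambiguous: there is no way to tell the final group from a port number.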
@ -1,361 +0,0 @@
# -*- coding: utf-8 -*-

#    Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
#    Licensed under the Apache License, Version 2.0 (the "License"); you may
#    not use this file except in compliance with the License. You may obtain
#    a copy of the License at
#
#         http://www.apache.org/licenses/LICENSE-2.0
#
#    Unless required by applicable law or agreed to in writing, software
#    distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
#    WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
#    License for the specific language governing permissions and limitations
#    under the License.

from oslotest import base as test_base
import six
import testtools

from oslo_utils import reflection


if six.PY3:
    RUNTIME_ERROR_CLASSES = ['RuntimeError', 'Exception',
                             'BaseException', 'object']
else:
    RUNTIME_ERROR_CLASSES = ['RuntimeError', 'StandardError', 'Exception',
                             'BaseException', 'object']


def dummy_decorator(f):

    @six.wraps(f)
    def wrapper(*args, **kwargs):
        return f(*args, **kwargs)

    return wrapper


def mere_function(a, b):
    pass


def function_with_defs(a, b, optional=None):
    pass


def function_with_kwargs(a, b, **kwargs):
    pass


class TestObject(object):
    def _hello(self):
        pass

    def hi(self):
        pass


class Class(object):

    def method(self, c, d):
        pass

    @staticmethod
    def static_method(e, f):
        pass

    @classmethod
    def class_method(cls, g, h):
        pass


class BadClass(object):
    def do_something(self):
        pass

    def __nonzero__(self):
        return False


class CallableClass(object):
    def __call__(self, i, j):
        pass


class ClassWithInit(object):
    def __init__(self, k, l):
        pass


class MemberGetTest(test_base.BaseTestCase):
    def test_get_members_exclude_hidden(self):
        obj = TestObject()
        members = list(reflection.get_members(obj, exclude_hidden=True))
        self.assertEqual(1, len(members))

    def test_get_members_no_exclude_hidden(self):
        obj = TestObject()
        members = list(reflection.get_members(obj, exclude_hidden=False))
        self.assertGreater(len(members), 1)

    def test_get_members_names_exclude_hidden(self):
        obj = TestObject()
        members = list(reflection.get_member_names(obj, exclude_hidden=True))
        self.assertEqual(["hi"], members)

    def test_get_members_names_no_exclude_hidden(self):
        obj = TestObject()
        members = list(reflection.get_member_names(obj, exclude_hidden=False))
        members = [member for member in members if not member.startswith("__")]
        self.assertEqual(["_hello", "hi"], sorted(members))


class CallbackEqualityTest(test_base.BaseTestCase):
    def test_different_simple_callbacks(self):

        def a():
            pass

        def b():
            pass

        self.assertFalse(reflection.is_same_callback(a, b))

    def test_static_instance_callbacks(self):

        class A(object):

            @staticmethod
            def b(a, b, c):
                pass

        a = A()
        b = A()

        self.assertTrue(reflection.is_same_callback(a.b, b.b))

    def test_different_instance_callbacks(self):

        class A(object):
            def b(self):
                pass

            def __eq__(self, other):
                return True

            def __ne__(self, other):
                return not self.__eq__(other)

        b = A()
        c = A()

        self.assertFalse(reflection.is_same_callback(b.b, c.b))
        self.assertTrue(reflection.is_same_callback(b.b, c.b, strict=False))


class BoundMethodTest(test_base.BaseTestCase):
    def test_baddy(self):
        b = BadClass()
        self.assertTrue(reflection.is_bound_method(b.do_something))

    def test_static_method(self):
        self.assertFalse(reflection.is_bound_method(Class.static_method))


class GetCallableNameTest(test_base.BaseTestCase):

    def test_mere_function(self):
        name = reflection.get_callable_name(mere_function)
        self.assertEqual('.'.join((__name__, 'mere_function')), name)

    def test_method(self):
        name = reflection.get_callable_name(Class.method)
        self.assertEqual('.'.join((__name__, 'Class', 'method')), name)

    def test_instance_method(self):
        name = reflection.get_callable_name(Class().method)
        self.assertEqual('.'.join((__name__, 'Class', 'method')), name)

    def test_static_method(self):
        name = reflection.get_callable_name(Class.static_method)
        if six.PY3:
            self.assertEqual('.'.join((__name__, 'Class', 'static_method')),
                             name)
        else:
            # NOTE(imelnikov): static method are just functions, class name
            # is not recorded anywhere in them.
            self.assertEqual('.'.join((__name__, 'static_method')), name)

    def test_class_method(self):
        name = reflection.get_callable_name(Class.class_method)
        self.assertEqual('.'.join((__name__, 'Class', 'class_method')), name)

    def test_constructor(self):
        name = reflection.get_callable_name(Class)
        self.assertEqual('.'.join((__name__, 'Class')), name)

    def test_callable_class(self):
        name = reflection.get_callable_name(CallableClass())
        self.assertEqual('.'.join((__name__, 'CallableClass')), name)

    def test_callable_class_call(self):
        name = reflection.get_callable_name(CallableClass().__call__)
        self.assertEqual('.'.join((__name__, 'CallableClass',
                                   '__call__')), name)


# These extended/special case tests only work on python 3, due to python 2
# being broken/incorrect with regard to these special cases...
@testtools.skipIf(not six.PY3, 'python 3.x is not currently available')
class GetCallableNameTestExtended(test_base.BaseTestCase):
    # Tests items in http://legacy.python.org/dev/peps/pep-3155/
|
||||
|
||||
class InnerCallableClass(object):
|
||||
def __call__(self):
|
||||
pass
|
||||
|
||||
def test_inner_callable_class(self):
|
||||
obj = self.InnerCallableClass()
|
||||
name = reflection.get_callable_name(obj.__call__)
|
||||
expected_name = '.'.join((__name__, 'GetCallableNameTestExtended',
|
||||
'InnerCallableClass', '__call__'))
|
||||
self.assertEqual(expected_name, name)
|
||||
|
||||
def test_inner_callable_function(self):
|
||||
def a():
|
||||
|
||||
def b():
|
||||
pass
|
||||
|
||||
return b
|
||||
|
||||
name = reflection.get_callable_name(a())
|
||||
expected_name = '.'.join((__name__, 'GetCallableNameTestExtended',
|
||||
'test_inner_callable_function', '<locals>',
|
||||
'a', '<locals>', 'b'))
|
||||
self.assertEqual(expected_name, name)
|
||||
|
||||
def test_inner_class(self):
|
||||
obj = self.InnerCallableClass()
|
||||
name = reflection.get_callable_name(obj)
|
||||
expected_name = '.'.join((__name__,
|
||||
'GetCallableNameTestExtended',
|
||||
'InnerCallableClass'))
|
||||
self.assertEqual(expected_name, name)
|
||||
|
||||
|
||||
class GetCallableArgsTest(test_base.BaseTestCase):
|
||||
|
||||
def test_mere_function(self):
|
||||
result = reflection.get_callable_args(mere_function)
|
||||
self.assertEqual(['a', 'b'], result)
|
||||
|
||||
def test_function_with_defaults(self):
|
||||
result = reflection.get_callable_args(function_with_defs)
|
||||
self.assertEqual(['a', 'b', 'optional'], result)
|
||||
|
||||
def test_required_only(self):
|
||||
result = reflection.get_callable_args(function_with_defs,
|
||||
required_only=True)
|
||||
self.assertEqual(['a', 'b'], result)
|
||||
|
||||
def test_method(self):
|
||||
result = reflection.get_callable_args(Class.method)
|
||||
self.assertEqual(['self', 'c', 'd'], result)
|
||||
|
||||
def test_instance_method(self):
|
||||
result = reflection.get_callable_args(Class().method)
|
||||
self.assertEqual(['c', 'd'], result)
|
||||
|
||||
def test_class_method(self):
|
||||
result = reflection.get_callable_args(Class.class_method)
|
||||
self.assertEqual(['g', 'h'], result)
|
||||
|
||||
def test_class_constructor(self):
|
||||
result = reflection.get_callable_args(ClassWithInit)
|
||||
self.assertEqual(['k', 'l'], result)
|
||||
|
||||
def test_class_with_call(self):
|
||||
result = reflection.get_callable_args(CallableClass())
|
||||
self.assertEqual(['i', 'j'], result)
|
||||
|
||||
def test_decorators_work(self):
|
||||
@dummy_decorator
|
||||
def special_fun(x, y):
|
||||
pass
|
||||
result = reflection.get_callable_args(special_fun)
|
||||
self.assertEqual(['x', 'y'], result)
|
||||
|
||||
|
||||
class AcceptsKwargsTest(test_base.BaseTestCase):
|
||||
|
||||
def test_no_kwargs(self):
|
||||
self.assertEqual(False, reflection.accepts_kwargs(mere_function))
|
||||
|
||||
def test_with_kwargs(self):
|
||||
self.assertEqual(True, reflection.accepts_kwargs(function_with_kwargs))
|
||||
|
||||
|
||||
class GetClassNameTest(test_base.BaseTestCase):
|
||||
|
||||
def test_std_exception(self):
|
||||
name = reflection.get_class_name(RuntimeError)
|
||||
self.assertEqual('RuntimeError', name)
|
||||
|
||||
def test_class(self):
|
||||
name = reflection.get_class_name(Class)
|
||||
self.assertEqual('.'.join((__name__, 'Class')), name)
|
||||
|
||||
def test_qualified_class(self):
|
||||
class QualifiedClass(object):
|
||||
pass
|
||||
|
||||
name = reflection.get_class_name(QualifiedClass)
|
||||
self.assertEqual('.'.join((__name__, 'QualifiedClass')), name)
|
||||
|
||||
def test_instance(self):
|
||||
name = reflection.get_class_name(Class())
|
||||
self.assertEqual('.'.join((__name__, 'Class')), name)
|
||||
|
||||
def test_int(self):
|
||||
name = reflection.get_class_name(42)
|
||||
self.assertEqual('int', name)
|
||||
|
||||
def test_class_method(self):
|
||||
name = reflection.get_class_name(Class.class_method)
|
||||
self.assertEqual('%s.Class' % __name__, name)
|
||||
# test with fully_qualified=False
|
||||
name = reflection.get_class_name(Class.class_method,
|
||||
fully_qualified=False)
|
||||
self.assertEqual('Class', name)
|
||||
|
||||
def test_static_method(self):
|
||||
self.assertRaises(TypeError, reflection.get_class_name,
|
||||
Class.static_method)
|
||||
|
||||
def test_unbound_method(self):
|
||||
self.assertRaises(TypeError, reflection.get_class_name,
|
||||
mere_function)
|
||||
|
||||
def test_bound_method(self):
|
||||
c = Class()
|
||||
name = reflection.get_class_name(c.method)
|
||||
self.assertEqual('%s.Class' % __name__, name)
|
||||
# test with fully_qualified=False
|
||||
name = reflection.get_class_name(c.method, fully_qualified=False)
|
||||
self.assertEqual('Class', name)
|
||||
|
||||
|
||||
class GetAllClassNamesTest(test_base.BaseTestCase):
|
||||
|
||||
def test_std_class(self):
|
||||
names = list(reflection.get_all_class_names(RuntimeError))
|
||||
self.assertEqual(RUNTIME_ERROR_CLASSES, names)
|
||||
|
||||
def test_std_class_up_to(self):
|
||||
names = list(reflection.get_all_class_names(RuntimeError,
|
||||
up_to=Exception))
|
||||
self.assertEqual(RUNTIME_ERROR_CLASSES[:-2], names)
|
|
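The qualified-name behaviour these tests exercise rests on Python 3's `__qualname__` attribute (PEP 3155), which already records the nesting path that `reflection.get_callable_name` joins with the module name. A minimal stand-in (not oslo.utils' implementation; `callable_name`, `Outer`, and `factory` are illustrative names) can be sketched as:

```python
class Outer(object):
    class InnerCallable(object):
        def __call__(self):
            pass


def factory():
    def inner():
        pass
    return inner


def callable_name(obj):
    """Return 'module.QualifiedName' for a function, method, or class."""
    if not hasattr(obj, '__qualname__'):  # e.g. a callable instance
        obj = type(obj)
    return '.'.join((obj.__module__, obj.__qualname__))


# Nested classes and closures keep their full path, e.g.
# 'Outer.InnerCallable' and 'factory.<locals>.inner'.
print(callable_name(Outer.InnerCallable))
print(callable_name(factory()))
```

Instances have no `__qualname__` of their own (it lives on `type`), which is why the sketch falls back to `type(obj)` for callable objects, mirroring `test_callable_class` above.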
@@ -1,54 +0,0 @@
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from oslotest import base as test_base
import testscenarios

from oslo_utils import secretutils


class SecretUtilsTest(testscenarios.TestWithScenarios,
                      test_base.BaseTestCase):

    scenarios = [
        ('binary', {'converter': lambda text: text.encode('utf-8')}),
        ('unicode', {'converter': lambda text: text}),
    ]

    def test_constant_time_compare(self):
        # make sure it works as a compare; the "constant time" aspect
        # isn't appropriate to test in unittests
        ctc = secretutils.constant_time_compare
        self.assertTrue(ctc(self.converter(u'abcd'),
                            self.converter(u'abcd')))
        self.assertTrue(ctc(self.converter(u''),
                            self.converter(u'')))
        self.assertTrue(ctc('abcd', 'abcd'))
        self.assertFalse(ctc(self.converter(u'abcd'),
                             self.converter(u'efgh')))
        self.assertFalse(ctc(self.converter(u'abc'),
                             self.converter(u'abcd')))
        self.assertFalse(ctc(self.converter(u'abc'),
                             self.converter(u'abc\x00')))
        self.assertFalse(ctc(self.converter(u''),
                             self.converter(u'abc')))
        self.assertTrue(ctc(self.converter(u'abcd1234'),
                            self.converter(u'abcd1234')))
        self.assertFalse(ctc(self.converter(u'abcd1234'),
                             self.converter(u'ABCD234')))
        self.assertFalse(ctc(self.converter(u'abcd1234'),
                             self.converter(u'a')))
        self.assertFalse(ctc(self.converter(u'abcd1234'),
                             self.converter(u'1234abcd')))
        self.assertFalse(ctc('abcd1234', '1234abcd'))
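The standard library ships the same timing-safe comparison as `hmac.compare_digest` (Python 3.3+), and the truth table the test above checks can be reproduced directly with it. This is a sketch of the equivalent stdlib behaviour, not a claim about how `secretutils.constant_time_compare` is implemented internally:

```python
import hmac

# Equal inputs compare True; any length or content difference compares False,
# without an early-exit that would leak timing information.
assert hmac.compare_digest('abcd', 'abcd')
assert hmac.compare_digest('', '')
assert not hmac.compare_digest('abcd', 'efgh')
assert not hmac.compare_digest('abc', 'abcd')
assert not hmac.compare_digest('abc', 'abc\x00')
assert hmac.compare_digest(b'abcd1234', b'abcd1234')
assert not hmac.compare_digest(b'abcd1234', b'1234abcd')
```

Note that `hmac.compare_digest` accepts two `bytes` objects or two ASCII-only `str` objects, which matches the binary/unicode scenarios the test class multiplies over.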
@@ -1,422 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from oslotest import base as test_base

from oslo_utils import specs_matcher


class SpecsMatcherTestCase(test_base.BaseTestCase):
    def _do_specs_matcher_test(self, value, req, matches):
        assertion = self.assertTrue if matches else self.assertFalse
        assertion(specs_matcher.match(value, req))

    def test_specs_matches_simple(self):
        self._do_specs_matcher_test(
            value='1',
            req='1',
            matches=True)

    def test_specs_fails_simple(self):
        self._do_specs_matcher_test(
            value='',
            req='1',
            matches=False)

    def test_specs_fails_simple2(self):
        self._do_specs_matcher_test(
            value='3',
            req='1',
            matches=False)

    def test_specs_fails_simple3(self):
        self._do_specs_matcher_test(
            value='222',
            req='2',
            matches=False)

    def test_specs_fails_with_bogus_ops(self):
        self._do_specs_matcher_test(
            value='4',
            req='! 2',
            matches=False)

    def test_specs_matches_with_op_eq(self):
        self._do_specs_matcher_test(
            value='123',
            req='= 123',
            matches=True)

    def test_specs_matches_with_op_eq2(self):
        self._do_specs_matcher_test(
            value='124',
            req='= 123',
            matches=True)

    def test_specs_fails_with_op_eq(self):
        self._do_specs_matcher_test(
            value='34',
            req='= 234',
            matches=False)

    def test_specs_fails_with_op_eq3(self):
        self._do_specs_matcher_test(
            value='34',
            req='=',
            matches=False)

    def test_specs_matches_with_op_seq(self):
        self._do_specs_matcher_test(
            value='123',
            req='s== 123',
            matches=True)

    def test_specs_fails_with_op_seq(self):
        self._do_specs_matcher_test(
            value='1234',
            req='s== 123',
            matches=False)

    def test_specs_matches_with_op_sneq(self):
        self._do_specs_matcher_test(
            value='1234',
            req='s!= 123',
            matches=True)

    def test_specs_fails_with_op_sneq(self):
        self._do_specs_matcher_test(
            value='123',
            req='s!= 123',
            matches=False)

    def test_specs_matches_with_op_sge(self):
        self._do_specs_matcher_test(
            value='234',
            req='s>= 1000',
            matches=True)

    def test_specs_matches_with_op_sge2(self):
        self._do_specs_matcher_test(
            value='234',
            req='s>= 234',
            matches=True)

    def test_specs_fails_with_op_sge(self):
        self._do_specs_matcher_test(
            value='1000',
            req='s>= 234',
            matches=False)

    def test_specs_matches_with_op_sle(self):
        self._do_specs_matcher_test(
            value='1000',
            req='s<= 1234',
            matches=True)

    def test_specs_matches_with_op_sle2(self):
        self._do_specs_matcher_test(
            value='1234',
            req='s<= 1234',
            matches=True)

    def test_specs_fails_with_op_sle(self):
        self._do_specs_matcher_test(
            value='1234',
            req='s<= 1000',
            matches=False)

    def test_specs_matches_with_op_sl(self):
        self._do_specs_matcher_test(
            value='12',
            req='s< 2',
            matches=True)

    def test_specs_fails_with_op_sl(self):
        self._do_specs_matcher_test(
            value='2',
            req='s< 12',
            matches=False)

    def test_specs_fails_with_op_sl2(self):
        self._do_specs_matcher_test(
            value='12',
            req='s< 12',
            matches=False)

    def test_specs_matches_with_op_sg(self):
        self._do_specs_matcher_test(
            value='2',
            req='s> 12',
            matches=True)

    def test_specs_fails_with_op_sg(self):
        self._do_specs_matcher_test(
            value='12',
            req='s> 2',
            matches=False)

    def test_specs_fails_with_op_sg2(self):
        self._do_specs_matcher_test(
            value='12',
            req='s> 12',
            matches=False)

    def test_specs_matches_with_op_in(self):
        self._do_specs_matcher_test(
            value='12311321',
            req='<in> 11',
            matches=True)

    def test_specs_matches_with_op_in2(self):
        self._do_specs_matcher_test(
            value='12311321',
            req='<in> 12311321',
            matches=True)

    def test_specs_matches_with_op_in3(self):
        self._do_specs_matcher_test(
            value='12311321',
            req='<in> 12311321 <in>',
            matches=True)

    def test_specs_fails_with_op_in(self):
        self._do_specs_matcher_test(
            value='12310321',
            req='<in> 11',
            matches=False)

    def test_specs_fails_with_op_in2(self):
        self._do_specs_matcher_test(
            value='12310321',
            req='<in> 11 <in>',
            matches=False)

    def test_specs_matches_with_op_or(self):
        self._do_specs_matcher_test(
            value='12',
            req='<or> 11 <or> 12',
            matches=True)

    def test_specs_matches_with_op_or2(self):
        self._do_specs_matcher_test(
            value='12',
            req='<or> 11 <or> 12 <or>',
            matches=True)

    def test_specs_matches_with_op_or3(self):
        self._do_specs_matcher_test(
            value='12',
            req='<or> 12',
            matches=True)

    def test_specs_fails_with_op_or(self):
        self._do_specs_matcher_test(
            value='13',
            req='<or> 11 <or> 12',
            matches=False)

    def test_specs_fails_with_op_or2(self):
        self._do_specs_matcher_test(
            value='13',
            req='<or> 11 <or> 12 <or>',
            matches=False)

    def test_specs_fails_with_op_or3(self):
        self._do_specs_matcher_test(
            value='13',
            req='<or> 11',
            matches=False)

    def test_specs_matches_with_op_le(self):
        self._do_specs_matcher_test(
            value='2',
            req='<= 10',
            matches=True)

    def test_specs_matches_with_op_le2(self):
        self._do_specs_matcher_test(
            value='10',
            req='<= 10',
            matches=True)

    def test_specs_fails_with_op_le(self):
        self._do_specs_matcher_test(
            value='3',
            req='<= 2',
            matches=False)

    def test_specs_matches_with_op_ge(self):
        self._do_specs_matcher_test(
            value='3',
            req='>= 1',
            matches=True)

    def test_specs_matches_with_op_ge2(self):
        self._do_specs_matcher_test(
            value='3.0',
            req='>= 3',
            matches=True)

    def test_specs_matches_with_op_g(self):
        self._do_specs_matcher_test(
            value='3',
            req='> 1',
            matches=True)

    def test_specs_matches_with_op_g2(self):
        self._do_specs_matcher_test(
            value='3',
            req='> 3',
            matches=False)

    def test_specs_matches_with_op_g3(self):
        self._do_specs_matcher_test(
            value='3.0',
            req='> 2',
            matches=True)

    def test_specs_matches_with_op_l(self):
        self._do_specs_matcher_test(
            value='3',
            req='< 5',
            matches=True)

    def test_specs_matches_with_op_l2(self):
        self._do_specs_matcher_test(
            value='3',
            req='< 3',
            matches=False)

    def test_specs_matches_with_op_l3(self):
        self._do_specs_matcher_test(
            value='1.0',
            req='< 6',
            matches=True)

    def test_specs_fails_with_op_ge(self):
        self._do_specs_matcher_test(
            value='2',
            req='>= 3',
            matches=False)

    def test_specs_matches_with_op_ne(self):
        self._do_specs_matcher_test(
            value='3.2',
            req='!= 3.1',
            matches=True)

    def test_specs_fails_with_op_ne(self):
        self._do_specs_matcher_test(
            value='3.2',
            req='!= 3.2',
            matches=False)

    def test_specs_matches_with_op_eqeq(self):
        self._do_specs_matcher_test(
            value='3',
            req='== 3',
            matches=True)

    def test_specs_matches_with_op_eqeq2(self):
        self._do_specs_matcher_test(
            value='3.0',
            req='== 3',
            matches=True)

    def test_specs_fails_with_op_eqeq(self):
        self._do_specs_matcher_test(
            value='3.0',
            req='== 3.1',
            matches=False)

    def test_specs_matches_all_with_op_allin(self):
        self._do_specs_matcher_test(
            value=str(['aes', 'mmx', 'aux']),
            req='<all-in> aes mmx',
            matches=True)

    def test_specs_matches_one_with_op_allin(self):
        self._do_specs_matcher_test(
            value=str(['aes', 'mmx', 'aux']),
            req='<all-in> mmx',
            matches=True)

    def test_specs_fails_with_op_allin(self):
        self._do_specs_matcher_test(
            value=str(['aes', 'mmx', 'aux']),
            req='<all-in> txt',
            matches=False)

    def test_specs_fails_all_with_op_allin(self):
        self._do_specs_matcher_test(
            value=str(['aes', 'mmx', 'aux']),
            req='<all-in> txt 3dnow',
            matches=False)

    def test_specs_fails_match_one_with_op_allin(self):
        self._do_specs_matcher_test(
            value=str(['aes', 'mmx', 'aux']),
            req='<all-in> txt aes',
            matches=False)

    def test_specs_fails_match_substr_single(self):
        self._do_specs_matcher_test(
            value=str(['X_X']),
            req='<all-in> _',
            matches=False)

    def test_specs_fails_match_substr(self):
        self._do_specs_matcher_test(
            value=str(['X___X']),
            req='<all-in> ___',
            matches=False)

    def test_specs_fails_match_substr_reversed(self):
        self._do_specs_matcher_test(
            value=str(['aes', 'mmx', 'aux']),
            req='<all-in> XaesX',
            matches=False)

    def test_specs_fails_onechar_with_op_allin(self):
        self.assertRaises(
            TypeError,
            specs_matcher.match,
            value=str(['aes', 'mmx', 'aux']),
            req='<all-in> e')

    def test_specs_errors_list_with_op_allin(self):
        self.assertRaises(
            TypeError,
            specs_matcher.match,
            value=['aes', 'mmx', 'aux'],
            req='<all-in> aes')

    def test_specs_errors_str_with_op_allin(self):
        self.assertRaises(
            TypeError,
            specs_matcher.match,
            value='aes',
            req='<all-in> aes')

    def test_specs_errors_dict_literal_with_op_allin(self):
        self.assertRaises(
            TypeError,
            specs_matcher.match,
            value=str({'aes': 1}),
            req='<all-in> aes')

    def test_specs_errors_bad_literal_with_op_allin(self):
        self.assertRaises(
            TypeError,
            specs_matcher.match,
            value="^&*($",
            req='<all-in> aes')
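The requirement strings above follow a small operator grammar (the real one in `oslo_utils.specs_matcher` covers many more operators). A hypothetical mini-matcher, inferred only from the expectations the tests encode (note that the tests treat `'= 123'` as a numeric greater-or-equal: `'124'` matches it), might look like:

```python
def mini_match(value, req):
    """Toy matcher for a few of the operators the tests above exercise.

    Covers numeric '=' (greater-or-equal), substring '<in>', and the
    alternation '<or>'; any other requirement is a plain string equality.
    This is an illustrative reconstruction, not oslo.utils' grammar.
    """
    tokens = req.split()
    if not tokens:
        return False
    op = tokens[0]
    if op == '=':
        # Per the tests, '=' compares numerically as >=.
        return len(tokens) > 1 and float(value) >= float(tokens[1])
    if op == '<in>':
        return tokens[1] in value
    if op == '<or>':
        return value in [t for t in tokens[1:] if t != '<or>']
    return value == req


assert mini_match('123', '= 123')
assert mini_match('124', '= 123')
assert mini_match('12311321', '<in> 11')
assert not mini_match('13', '<or> 11 <or> 12')
assert mini_match('1', '1')
```

The real matcher is considerably stricter (for example it rejects non-list values for `<all-in>` with a `TypeError`, as the last few tests show); the sketch only illustrates the request format.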
@ -1,806 +0,0 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
|
||||
# Copyright 2011 OpenStack Foundation.
|
||||
# All Rights Reserved.
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import copy
|
||||
import math
|
||||
|
||||
import mock
|
||||
from oslotest import base as test_base
|
||||
import six
|
||||
import testscenarios
|
||||
|
||||
from oslo_utils import strutils
|
||||
from oslo_utils import units
|
||||
|
||||
load_tests = testscenarios.load_tests_apply_scenarios
|
||||
|
||||
|
||||
class StrUtilsTest(test_base.BaseTestCase):
|
||||
|
||||
@mock.patch("six.text_type")
|
||||
def test_bool_bool_from_string_no_text(self, mock_text):
|
||||
self.assertTrue(strutils.bool_from_string(True))
|
||||
self.assertFalse(strutils.bool_from_string(False))
|
||||
self.assertEqual(0, mock_text.call_count)
|
||||
|
||||
def test_bool_bool_from_string(self):
|
||||
self.assertTrue(strutils.bool_from_string(True))
|
||||
self.assertFalse(strutils.bool_from_string(False))
|
||||
|
||||
def test_bool_bool_from_string_default(self):
|
||||
self.assertTrue(strutils.bool_from_string('', default=True))
|
||||
self.assertFalse(strutils.bool_from_string('wibble', default=False))
|
||||
|
||||
def _test_bool_from_string(self, c):
|
||||
self.assertTrue(strutils.bool_from_string(c('true')))
|
||||
self.assertTrue(strutils.bool_from_string(c('TRUE')))
|
||||
self.assertTrue(strutils.bool_from_string(c('on')))
|
||||
self.assertTrue(strutils.bool_from_string(c('On')))
|
||||
self.assertTrue(strutils.bool_from_string(c('yes')))
|
||||
self.assertTrue(strutils.bool_from_string(c('YES')))
|
||||
self.assertTrue(strutils.bool_from_string(c('yEs')))
|
||||
self.assertTrue(strutils.bool_from_string(c('1')))
|
||||
self.assertTrue(strutils.bool_from_string(c('T')))
|
||||
self.assertTrue(strutils.bool_from_string(c('t')))
|
||||
self.assertTrue(strutils.bool_from_string(c('Y')))
|
||||
self.assertTrue(strutils.bool_from_string(c('y')))
|
||||
|
||||
self.assertFalse(strutils.bool_from_string(c('false')))
|
||||
self.assertFalse(strutils.bool_from_string(c('FALSE')))
|
||||
self.assertFalse(strutils.bool_from_string(c('off')))
|
||||
self.assertFalse(strutils.bool_from_string(c('OFF')))
|
||||
self.assertFalse(strutils.bool_from_string(c('no')))
|
||||
self.assertFalse(strutils.bool_from_string(c('0')))
|
||||
self.assertFalse(strutils.bool_from_string(c('42')))
|
||||
self.assertFalse(strutils.bool_from_string(c(
|
||||
'This should not be True')))
|
||||
self.assertFalse(strutils.bool_from_string(c('F')))
|
||||
self.assertFalse(strutils.bool_from_string(c('f')))
|
||||
self.assertFalse(strutils.bool_from_string(c('N')))
|
||||
self.assertFalse(strutils.bool_from_string(c('n')))
|
||||
|
||||
# Whitespace should be stripped
|
||||
self.assertTrue(strutils.bool_from_string(c(' 1 ')))
|
||||
self.assertTrue(strutils.bool_from_string(c(' true ')))
|
||||
self.assertFalse(strutils.bool_from_string(c(' 0 ')))
|
||||
self.assertFalse(strutils.bool_from_string(c(' false ')))
|
||||
|
||||
def test_bool_from_string(self):
|
||||
self._test_bool_from_string(lambda s: s)
|
||||
|
||||
def test_unicode_bool_from_string(self):
|
||||
self._test_bool_from_string(six.text_type)
|
||||
self.assertFalse(strutils.bool_from_string(u'使用', strict=False))
|
||||
|
||||
exc = self.assertRaises(ValueError, strutils.bool_from_string,
|
||||
u'使用', strict=True)
|
||||
expected_msg = (u"Unrecognized value '使用', acceptable values are:"
|
||||
u" '0', '1', 'f', 'false', 'n', 'no', 'off', 'on',"
|
||||
u" 't', 'true', 'y', 'yes'")
|
||||
self.assertEqual(expected_msg, six.text_type(exc))
|
||||
|
||||
def test_other_bool_from_string(self):
|
||||
self.assertFalse(strutils.bool_from_string(None))
|
||||
self.assertFalse(strutils.bool_from_string(mock.Mock()))
|
||||
|
||||
def test_int_bool_from_string(self):
|
||||
self.assertTrue(strutils.bool_from_string(1))
|
||||
|
||||
self.assertFalse(strutils.bool_from_string(-1))
|
||||
self.assertFalse(strutils.bool_from_string(0))
|
||||
self.assertFalse(strutils.bool_from_string(2))
|
||||
|
||||
def test_strict_bool_from_string(self):
|
||||
# None isn't allowed in strict mode
|
||||
exc = self.assertRaises(ValueError, strutils.bool_from_string, None,
|
||||
strict=True)
|
||||
expected_msg = ("Unrecognized value 'None', acceptable values are:"
|
||||
" '0', '1', 'f', 'false', 'n', 'no', 'off', 'on',"
|
||||
" 't', 'true', 'y', 'yes'")
|
||||
self.assertEqual(expected_msg, str(exc))
|
||||
|
||||
# Unrecognized strings aren't allowed
|
||||
self.assertFalse(strutils.bool_from_string('Other', strict=False))
|
||||
exc = self.assertRaises(ValueError, strutils.bool_from_string, 'Other',
|
||||
strict=True)
|
||||
expected_msg = ("Unrecognized value 'Other', acceptable values are:"
|
||||
" '0', '1', 'f', 'false', 'n', 'no', 'off', 'on',"
|
||||
" 't', 'true', 'y', 'yes'")
|
||||
self.assertEqual(expected_msg, str(exc))
|
||||
|
||||
# Unrecognized numbers aren't allowed
|
||||
exc = self.assertRaises(ValueError, strutils.bool_from_string, 2,
|
||||
strict=True)
|
||||
expected_msg = ("Unrecognized value '2', acceptable values are:"
|
||||
" '0', '1', 'f', 'false', 'n', 'no', 'off', 'on',"
|
||||
" 't', 'true', 'y', 'yes'")
|
||||
self.assertEqual(expected_msg, str(exc))
|
||||
|
||||
# False-like values are allowed
|
||||
self.assertFalse(strutils.bool_from_string('f', strict=True))
|
||||
self.assertFalse(strutils.bool_from_string('false', strict=True))
|
||||
self.assertFalse(strutils.bool_from_string('off', strict=True))
|
||||
self.assertFalse(strutils.bool_from_string('n', strict=True))
|
||||
self.assertFalse(strutils.bool_from_string('no', strict=True))
|
||||
self.assertFalse(strutils.bool_from_string('0', strict=True))
|
||||
|
||||
self.assertTrue(strutils.bool_from_string('1', strict=True))
|
||||
|
||||
# Avoid font-similarity issues (one looks like lowercase-el, zero like
|
||||
# oh, etc...)
|
||||
for char in ('O', 'o', 'L', 'l', 'I', 'i'):
|
||||
self.assertRaises(ValueError, strutils.bool_from_string, char,
|
||||
strict=True)
|
||||
|
||||
def test_int_from_bool_as_string(self):
|
||||
self.assertEqual(1, strutils.int_from_bool_as_string(True))
|
||||
self.assertEqual(0, strutils.int_from_bool_as_string(False))
|
||||
|
||||
def test_is_valid_boolstr(self):
|
||||
self.assertTrue(strutils.is_valid_boolstr('true'))
|
||||
self.assertTrue(strutils.is_valid_boolstr('false'))
|
||||
self.assertTrue(strutils.is_valid_boolstr('yes'))
|
||||
self.assertTrue(strutils.is_valid_boolstr('no'))
|
||||
self.assertTrue(strutils.is_valid_boolstr('y'))
|
||||
self.assertTrue(strutils.is_valid_boolstr('n'))
|
||||
self.assertTrue(strutils.is_valid_boolstr('1'))
|
||||
self.assertTrue(strutils.is_valid_boolstr('0'))
|
||||
self.assertTrue(strutils.is_valid_boolstr(1))
|
||||
self.assertTrue(strutils.is_valid_boolstr(0))
|
||||
|
||||
self.assertFalse(strutils.is_valid_boolstr('maybe'))
|
||||
self.assertFalse(strutils.is_valid_boolstr('only on tuesdays'))
|
||||
|
||||
def test_slugify(self):
|
||||
to_slug = strutils.to_slug
|
||||
self.assertRaises(TypeError, to_slug, True)
|
||||
self.assertEqual(six.u("hello"), to_slug("hello"))
|
||||
self.assertEqual(six.u("two-words"), to_slug("Two Words"))
|
||||
self.assertEqual(six.u("ma-any-spa-ce-es"),
|
||||
to_slug("Ma-any\t spa--ce- es"))
|
||||
self.assertEqual(six.u("excamation"), to_slug("exc!amation!"))
|
||||
self.assertEqual(six.u("ampserand"), to_slug("&ser$and"))
|
||||
self.assertEqual(six.u("ju5tnum8er"), to_slug("ju5tnum8er"))
|
||||
self.assertEqual(six.u("strip-"), to_slug(" strip - "))
|
||||
self.assertEqual(six.u("perche"), to_slug(six.b("perch\xc3\xa9")))
|
||||
self.assertEqual(six.u("strange"),
|
||||
to_slug("\x80strange", errors="ignore"))
|
||||
|
||||
|
||||
class StringToBytesTest(test_base.BaseTestCase):
|
||||
|
||||
_unit_system = [
|
||||
('si', dict(unit_system='SI')),
|
||||
('iec', dict(unit_system='IEC')),
|
||||
('invalid_unit_system', dict(unit_system='KKK', assert_error=True)),
|
||||
]
|
||||
|
||||
_sign = [
|
||||
('no_sign', dict(sign='')),
|
||||
('positive', dict(sign='+')),
|
||||
('negative', dict(sign='-')),
|
||||
('invalid_sign', dict(sign='~', assert_error=True)),
|
||||
]
|
||||
|
||||
_magnitude = [
|
||||
('integer', dict(magnitude='79')),
|
||||
('decimal', dict(magnitude='7.9')),
|
||||
('decimal_point_start', dict(magnitude='.9')),
|
||||
('decimal_point_end', dict(magnitude='79.', assert_error=True)),
|
||||
        ('invalid_literal', dict(magnitude='7.9.9', assert_error=True)),
        ('garbage_value', dict(magnitude='asdf', assert_error=True)),
    ]

    _unit_prefix = [
        ('no_unit_prefix', dict(unit_prefix='')),
        ('k', dict(unit_prefix='k')),
        ('K', dict(unit_prefix='K')),
        ('M', dict(unit_prefix='M')),
        ('G', dict(unit_prefix='G')),
        ('T', dict(unit_prefix='T')),
        ('Ki', dict(unit_prefix='Ki')),
        ('Mi', dict(unit_prefix='Mi')),
        ('Gi', dict(unit_prefix='Gi')),
        ('Ti', dict(unit_prefix='Ti')),
        ('invalid_unit_prefix', dict(unit_prefix='B', assert_error=True)),
    ]

    _unit_suffix = [
        ('b', dict(unit_suffix='b')),
        ('bit', dict(unit_suffix='bit')),
        ('B', dict(unit_suffix='B')),
        ('invalid_unit_suffix', dict(unit_suffix='Kg', assert_error=True)),
    ]

    _return_int = [
        ('return_dec', dict(return_int=False)),
        ('return_int', dict(return_int=True)),
    ]

    @classmethod
    def generate_scenarios(cls):
        cls.scenarios = testscenarios.multiply_scenarios(cls._unit_system,
                                                         cls._sign,
                                                         cls._magnitude,
                                                         cls._unit_prefix,
                                                         cls._unit_suffix,
                                                         cls._return_int)

    def test_string_to_bytes(self):

        def _get_quantity(sign, magnitude, unit_suffix):
            res = float('%s%s' % (sign, magnitude))
            if unit_suffix in ['b', 'bit']:
                res /= 8
            return res

        def _get_constant(unit_prefix, unit_system):
            if not unit_prefix:
                return 1
            elif unit_system == 'SI':
                res = getattr(units, unit_prefix)
            elif unit_system == 'IEC':
                if unit_prefix.endswith('i'):
                    res = getattr(units, unit_prefix)
                else:
                    res = getattr(units, '%si' % unit_prefix)
            return res

        text = ''.join([self.sign, self.magnitude, self.unit_prefix,
                        self.unit_suffix])
        err_si = self.unit_system == 'SI' and (self.unit_prefix == 'K' or
                                               self.unit_prefix.endswith('i'))
        err_iec = self.unit_system == 'IEC' and self.unit_prefix == 'k'
        if getattr(self, 'assert_error', False) or err_si or err_iec:
            self.assertRaises(ValueError, strutils.string_to_bytes,
                              text, unit_system=self.unit_system,
                              return_int=self.return_int)
            return
        quantity = _get_quantity(self.sign, self.magnitude, self.unit_suffix)
        constant = _get_constant(self.unit_prefix, self.unit_system)
        expected = quantity * constant
        actual = strutils.string_to_bytes(text, unit_system=self.unit_system,
                                          return_int=self.return_int)
        if self.return_int:
            self.assertEqual(actual, int(math.ceil(expected)))
        else:
            self.assertAlmostEqual(actual, expected)


StringToBytesTest.generate_scenarios()

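The scenario matrix above pins down the grammar that `strutils.string_to_bytes` accepts: an optional sign, a decimal magnitude, an optional SI or IEC unit prefix, and a `b`/`bit`/`B` suffix. As a rough, stdlib-only sketch of those rules (a hypothetical helper with simplified prefix handling, not the oslo_utils implementation):

```python
import math
import re

# SI prefixes are powers of 1000; IEC prefixes ('Ki', 'Mi', ...) are
# powers of 1024. A 'b'/'bit' suffix means bits, so divide by 8.
SI = {'': 1, 'k': 1000, 'M': 1000 ** 2, 'G': 1000 ** 3, 'T': 1000 ** 4}
IEC = {'': 1, 'Ki': 1024, 'Mi': 1024 ** 2, 'Gi': 1024 ** 3, 'Ti': 1024 ** 4}


def simple_string_to_bytes(text, unit_system='SI', return_int=False):
    table = SI if unit_system == 'SI' else IEC
    # magnitude, then a (lazy) alphabetic prefix, then the unit suffix
    m = re.match(r'^([+-]?\d+(?:\.\d+)?)([a-zA-Z]*?)(bit|b|B)$', text)
    if not m:
        raise ValueError('invalid string %r' % text)
    magnitude, prefix, suffix = m.groups()
    if prefix not in table:
        raise ValueError('invalid unit prefix %r' % prefix)
    res = float(magnitude) * table[prefix]
    if suffix in ('b', 'bit'):
        res /= 8
    return int(math.ceil(res)) if return_int else res
```

For example, `simple_string_to_bytes('1KiB', unit_system='IEC')` yields `1024.0`, while a garbled magnitude such as `'asdfB'` raises `ValueError`, mirroring the `assert_error` scenarios.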
class MaskPasswordTestCase(test_base.BaseTestCase):

    def test_json(self):
        # Test 'adminPass' w/o spaces
        payload = """{'adminPass':'TL0EfN33'}"""
        expected = """{'adminPass':'***'}"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'adminPass' with spaces
        payload = """{ 'adminPass' : 'TL0EfN33' }"""
        expected = """{ 'adminPass' : '***' }"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'admin_pass' w/o spaces
        payload = """{'admin_pass':'TL0EfN33'}"""
        expected = """{'admin_pass':'***'}"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'admin_pass' with spaces
        payload = """{ 'admin_pass' : 'TL0EfN33' }"""
        expected = """{ 'admin_pass' : '***' }"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'admin_password' w/o spaces
        payload = """{'admin_password':'TL0EfN33'}"""
        expected = """{'admin_password':'***'}"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'admin_password' with spaces
        payload = """{ 'admin_password' : 'TL0EfN33' }"""
        expected = """{ 'admin_password' : '***' }"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'password' w/o spaces
        payload = """{'password':'TL0EfN33'}"""
        expected = """{'password':'***'}"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'password' with spaces
        payload = """{ 'password' : 'TL0EfN33' }"""
        expected = """{ 'password' : '***' }"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'auth_password' w/o spaces
        payload = """{'auth_password':'TL0EfN33'}"""
        expected = """{'auth_password':'***'}"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'auth_password' with spaces
        payload = """{ 'auth_password' : 'TL0EfN33' }"""
        expected = """{ 'auth_password' : '***' }"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'secret_uuid' w/o spaces
        payload = """{'secret_uuid':'myuuid'}"""
        expected = """{'secret_uuid':'***'}"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'secret_uuid' with spaces
        payload = """{ 'secret_uuid' : 'myuuid' }"""
        expected = """{ 'secret_uuid' : '***' }"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'token' w/o spaces
        payload = """{'token':'token'}"""
        expected = """{'token':'***'}"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'token' with spaces
        payload = """{ 'token' : 'token' }"""
        expected = """{ 'token' : '***' }"""
        self.assertEqual(expected, strutils.mask_password(payload))

    def test_xml(self):
        # Test 'adminPass' w/o spaces
        payload = """<adminPass>TL0EfN33</adminPass>"""
        expected = """<adminPass>***</adminPass>"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'adminPass' with spaces
        payload = """<adminPass>
                        TL0EfN33
                     </adminPass>"""
        expected = """<adminPass>***</adminPass>"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'admin_pass' w/o spaces
        payload = """<admin_pass>TL0EfN33</admin_pass>"""
        expected = """<admin_pass>***</admin_pass>"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'admin_pass' with spaces
        payload = """<admin_pass>
                        TL0EfN33
                     </admin_pass>"""
        expected = """<admin_pass>***</admin_pass>"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'admin_password' w/o spaces
        payload = """<admin_password>TL0EfN33</admin_password>"""
        expected = """<admin_password>***</admin_password>"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'admin_password' with spaces
        payload = """<admin_password>
                        TL0EfN33
                     </admin_password>"""
        expected = """<admin_password>***</admin_password>"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'password' w/o spaces
        payload = """<password>TL0EfN33</password>"""
        expected = """<password>***</password>"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'password' with spaces
        payload = """<password>
                        TL0EfN33
                     </password>"""
        expected = """<password>***</password>"""
        self.assertEqual(expected, strutils.mask_password(payload))

    def test_xml_attribute(self):
        # Test 'adminPass' w/o spaces
        payload = """adminPass='TL0EfN33'"""
        expected = """adminPass='***'"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'adminPass' with spaces
        payload = """adminPass = 'TL0EfN33'"""
        expected = """adminPass = '***'"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'adminPass' with double quotes
        payload = """adminPass = "TL0EfN33\""""
        expected = """adminPass = "***\""""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'admin_pass' w/o spaces
        payload = """admin_pass='TL0EfN33'"""
        expected = """admin_pass='***'"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'admin_pass' with spaces
        payload = """admin_pass = 'TL0EfN33'"""
        expected = """admin_pass = '***'"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'admin_pass' with double quotes
        payload = """admin_pass = "TL0EfN33\""""
        expected = """admin_pass = "***\""""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'admin_password' w/o spaces
        payload = """admin_password='TL0EfN33'"""
        expected = """admin_password='***'"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'admin_password' with spaces
        payload = """admin_password = 'TL0EfN33'"""
        expected = """admin_password = '***'"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'admin_password' with double quotes
        payload = """admin_password = "TL0EfN33\""""
        expected = """admin_password = "***\""""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'password' w/o spaces
        payload = """password='TL0EfN33'"""
        expected = """password='***'"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'password' with spaces
        payload = """password = 'TL0EfN33'"""
        expected = """password = '***'"""
        self.assertEqual(expected, strutils.mask_password(payload))
        # Test 'password' with double quotes
        payload = """password = "TL0EfN33\""""
        expected = """password = "***\""""
        self.assertEqual(expected, strutils.mask_password(payload))

    def test_json_message(self):
        payload = """body: {"changePassword": {"adminPass": "1234567"}}"""
        expected = """body: {"changePassword": {"adminPass": "***"}}"""
        self.assertEqual(expected, strutils.mask_password(payload))
        payload = """body: {"rescue": {"admin_pass": "1234567"}}"""
        expected = """body: {"rescue": {"admin_pass": "***"}}"""
        self.assertEqual(expected, strutils.mask_password(payload))
        payload = """body: {"rescue": {"admin_password": "1234567"}}"""
        expected = """body: {"rescue": {"admin_password": "***"}}"""
        self.assertEqual(expected, strutils.mask_password(payload))
        payload = """body: {"rescue": {"password": "1234567"}}"""
        expected = """body: {"rescue": {"password": "***"}}"""
        self.assertEqual(expected, strutils.mask_password(payload))

    def test_xml_message(self):
        payload = """<?xml version="1.0" encoding="UTF-8"?>
<rebuild
    xmlns="http://docs.openstack.org/compute/api/v1.1"
    name="foobar"
    imageRef="http://openstack.example.com/v1.1/32278/images/70a599e0-31e7"
    accessIPv4="1.2.3.4"
    accessIPv6="fe80::100"
    adminPass="seekr3t">
  <metadata>
    <meta key="My Server Name">Apache1</meta>
  </metadata>
</rebuild>"""
        expected = """<?xml version="1.0" encoding="UTF-8"?>
<rebuild
    xmlns="http://docs.openstack.org/compute/api/v1.1"
    name="foobar"
    imageRef="http://openstack.example.com/v1.1/32278/images/70a599e0-31e7"
    accessIPv4="1.2.3.4"
    accessIPv6="fe80::100"
    adminPass="***">
  <metadata>
    <meta key="My Server Name">Apache1</meta>
  </metadata>
</rebuild>"""
        self.assertEqual(expected, strutils.mask_password(payload))
        payload = """<?xml version="1.0" encoding="UTF-8"?>
<rescue xmlns="http://docs.openstack.org/compute/api/v1.1"
    admin_pass="MySecretPass"/>"""
        expected = """<?xml version="1.0" encoding="UTF-8"?>
<rescue xmlns="http://docs.openstack.org/compute/api/v1.1"
    admin_pass="***"/>"""
        self.assertEqual(expected, strutils.mask_password(payload))
        payload = """<?xml version="1.0" encoding="UTF-8"?>
<rescue xmlns="http://docs.openstack.org/compute/api/v1.1"
    admin_password="MySecretPass"/>"""
        expected = """<?xml version="1.0" encoding="UTF-8"?>
<rescue xmlns="http://docs.openstack.org/compute/api/v1.1"
    admin_password="***"/>"""
        self.assertEqual(expected, strutils.mask_password(payload))
        payload = """<?xml version="1.0" encoding="UTF-8"?>
<rescue xmlns="http://docs.openstack.org/compute/api/v1.1"
    password="MySecretPass"/>"""
        expected = """<?xml version="1.0" encoding="UTF-8"?>
<rescue xmlns="http://docs.openstack.org/compute/api/v1.1"
    password="***"/>"""
        self.assertEqual(expected, strutils.mask_password(payload))

    def test_mask_password(self):
        payload = "test = 'password' : 'aaaaaa'"
        expected = "test = 'password' : '111'"
        self.assertEqual(expected,
                         strutils.mask_password(payload, secret='111'))

        payload = 'mysqld --password "aaaaaa"'
        expected = 'mysqld --password "****"'
        self.assertEqual(expected,
                         strutils.mask_password(payload, secret='****'))

        payload = 'mysqld --password aaaaaa'
        expected = 'mysqld --password ???'
        self.assertEqual(expected,
                         strutils.mask_password(payload, secret='???'))

        payload = 'mysqld --password = "aaaaaa"'
        expected = 'mysqld --password = "****"'
        self.assertEqual(expected,
                         strutils.mask_password(payload, secret='****'))

        payload = "mysqld --password = 'aaaaaa'"
        expected = "mysqld --password = '****'"
        self.assertEqual(expected,
                         strutils.mask_password(payload, secret='****'))

        payload = "mysqld --password = aaaaaa"
        expected = "mysqld --password = ****"
        self.assertEqual(expected,
                         strutils.mask_password(payload, secret='****'))

        payload = "test = password = aaaaaa"
        expected = "test = password = 111"
        self.assertEqual(expected,
                         strutils.mask_password(payload, secret='111'))

        payload = "test = password= aaaaaa"
        expected = "test = password= 111"
        self.assertEqual(expected,
                         strutils.mask_password(payload, secret='111'))

        payload = "test = password =aaaaaa"
        expected = "test = password =111"
        self.assertEqual(expected,
                         strutils.mask_password(payload, secret='111'))

        payload = "test = password=aaaaaa"
        expected = "test = password=111"
        self.assertEqual(expected,
                         strutils.mask_password(payload, secret='111'))

        payload = 'test = "original_password" : "aaaaaaaaa"'
        expected = 'test = "original_password" : "***"'
        self.assertEqual(expected, strutils.mask_password(payload))

        payload = 'test = "param1" : "value"'
        expected = 'test = "param1" : "value"'
        self.assertEqual(expected, strutils.mask_password(payload))

        payload = """{'adminPass':'TL0EfN33'}"""
        payload = six.text_type(payload)
        expected = """{'adminPass':'***'}"""
        self.assertEqual(expected, strutils.mask_password(payload))

        payload = """{'token':'mytoken'}"""
        payload = six.text_type(payload)
        expected = """{'token':'***'}"""
        self.assertEqual(expected, strutils.mask_password(payload))

        payload = ("test = 'node.session.auth.password','-v','TL0EfN33',"
                   "'nomask'")
        expected = ("test = 'node.session.auth.password','-v','***',"
                    "'nomask'")
        self.assertEqual(expected, strutils.mask_password(payload))

        payload = ("test = 'node.session.auth.password', '--password', "
                   "'TL0EfN33', 'nomask'")
        expected = ("test = 'node.session.auth.password', '--password', "
                    "'***', 'nomask'")
        self.assertEqual(expected, strutils.mask_password(payload))

        payload = ("test = 'node.session.auth.password', '--password', "
                   "'TL0EfN33'")
        expected = ("test = 'node.session.auth.password', '--password', "
                    "'***'")
        self.assertEqual(expected, strutils.mask_password(payload))

        payload = "test = node.session.auth.password -v TL0EfN33 nomask"
        expected = "test = node.session.auth.password -v *** nomask"
        self.assertEqual(expected, strutils.mask_password(payload))

        payload = ("test = node.session.auth.password --password TL0EfN33 "
                   "nomask")
        expected = ("test = node.session.auth.password --password *** "
                    "nomask")
        self.assertEqual(expected, strutils.mask_password(payload))

        payload = ("test = node.session.auth.password --password TL0EfN33")
        expected = ("test = node.session.auth.password --password ***")
        self.assertEqual(expected, strutils.mask_password(payload))

        payload = "test = cmd --password my\xe9\x80\x80pass"
        expected = ("test = cmd --password ***")
        self.assertEqual(expected, strutils.mask_password(payload))


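The JSON, XML, and CLI cases above all reduce to substituting the value that follows a known sensitive key. A minimal, hypothetical sketch of just the quoted-key flavor (nowhere near as thorough as the real `strutils.mask_password`, which also handles XML elements, attributes, and `--password` arguments):

```python
import re

# Key names treated as sensitive in the tests above.
_SENSITIVE_KEYS = ['adminPass', 'admin_pass', 'admin_password', 'password',
                   'auth_password', 'secret_uuid', 'token']


def simple_mask_password(message, secret='***'):
    """Mask "'key' : 'value'"-style occurrences of sensitive keys."""
    for key in _SENSITIVE_KEYS:
        # quoted key, optional whitespace around ':', quoted value
        pattern = r"""(['"]%s['"]\s*:\s*['"])[^'"]*(['"])""" % key
        message = re.sub(pattern, r'\g<1>%s\g<2>' % secret, message)
    return message
```

Because each pattern anchors on the quote immediately before the key, `password` does not double-match inside `admin_password` or `auth_password`; each longer key is masked by its own entry in the list.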
class MaskDictionaryPasswordTestCase(test_base.BaseTestCase):

    def test_dictionary(self):
        payload = {'password': 'TL0EfN33'}
        expected = {'password': '***'}
        self.assertEqual(expected,
                         strutils.mask_dict_password(payload))

        payload = {'user': 'admin', 'password': 'TL0EfN33'}
        expected = {'user': 'admin', 'password': '***'}
        self.assertEqual(expected,
                         strutils.mask_dict_password(payload))

        payload = {'strval': 'somestring',
                   'dictval': {'user': 'admin', 'password': 'TL0EfN33'}}
        expected = {'strval': 'somestring',
                    'dictval': {'user': 'admin', 'password': '***'}}
        self.assertEqual(expected,
                         strutils.mask_dict_password(payload))

        payload = {'strval': '--password abc',
                   'dont_change': 'this is fine',
                   'dictval': {'user': 'admin', 'password': b'TL0EfN33'}}
        expected = {'strval': '--password ***',
                    'dont_change': 'this is fine',
                    'dictval': {'user': 'admin', 'password': '***'}}
        self.assertEqual(expected,
                         strutils.mask_dict_password(payload))

        payload = {'ipmi_password': 'KeDrahishvowphyecMornEm0or('}
        expected = {'ipmi_password': '***'}
        self.assertEqual(expected,
                         strutils.mask_dict_password(payload))

    def test_do_no_harm(self):
        payload = {}
        expected = {}
        self.assertEqual(expected,
                         strutils.mask_dict_password(payload))

        payload = {'somekey': 'somevalue',
                   'anotherkey': 'anothervalue'}
        expected = {'somekey': 'somevalue',
                    'anotherkey': 'anothervalue'}
        self.assertEqual(expected,
                         strutils.mask_dict_password(payload))

    def test_mask_values(self):
        payload = {'somekey': 'test = cmd --password my\xe9\x80\x80pass'}
        expected = {'somekey': 'test = cmd --password ***'}
        self.assertEqual(expected,
                         strutils.mask_dict_password(payload))

    def test_other_non_str_values(self):
        payload = {'password': 'DK0PK1AK3', 'bool': True,
                   'dict': {'cat': 'meow', 'password': "*aa38skdjf"},
                   'float': 0.1, 'int': 123, 'list': [1, 2], 'none': None,
                   'str': 'foo'}
        expected = {'password': '***', 'bool': True,
                    'dict': {'cat': 'meow', 'password': '***'},
                    'float': 0.1, 'int': 123, 'list': [1, 2], 'none': None,
                    'str': 'foo'}
        self.assertEqual(expected,
                         strutils.mask_dict_password(payload))

    def test_argument_untouched(self):
        """Make sure that the argument passed in is not modified"""
        payload = {'password': 'DK0PK1AK3', 'bool': True,
                   'dict': {'cat': 'meow', 'password': "*aa38skdjf"},
                   'float': 0.1, 'int': 123, 'list': [1, 2], 'none': None,
                   'str': 'foo'}
        pristine = copy.deepcopy(payload)
        # Send the payload into the function, to see if it gets modified
        strutils.mask_dict_password(payload)
        self.assertEqual(pristine, payload)


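The behavior these tests describe is a recursive walk that masks password-like keys, rewrites `--password` arguments in string values, and never mutates the input. A hypothetical sketch of that contract (the real `strutils.mask_dict_password` delegates string values to `mask_password` and recognizes more key patterns):

```python
import copy
import re


def simple_mask_dict_password(payload, secret='***'):
    """Recursively mask password-like values in a dict (sketch only)."""
    masked = copy.deepcopy(payload)  # never mutate the caller's dict
    for key, value in masked.items():
        if isinstance(value, dict):
            masked[key] = simple_mask_dict_password(value, secret)
        elif 'password' in key.lower():
            # covers 'password', 'admin_password', 'ipmi_password', ...
            masked[key] = secret
        elif isinstance(value, str):
            # mask CLI-style '--password <value>' inside string values
            masked[key] = re.sub(r'(--password\s+)\S+',
                                 r'\g<1>%s' % secret, value)
    return masked
```

The `deepcopy` up front is what makes the `test_argument_untouched` property hold: the caller's nested dicts are copied before any value is overwritten.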
class IsIntLikeTestCase(test_base.BaseTestCase):
    def test_is_int_like_true(self):
        self.assertTrue(strutils.is_int_like(1))
        self.assertTrue(strutils.is_int_like("1"))
        self.assertTrue(strutils.is_int_like("514"))
        self.assertTrue(strutils.is_int_like("0"))

    def test_is_int_like_false(self):
        self.assertFalse(strutils.is_int_like(1.1))
        self.assertFalse(strutils.is_int_like("1.1"))
        self.assertFalse(strutils.is_int_like("1.1.1"))
        self.assertFalse(strutils.is_int_like(None))
        self.assertFalse(strutils.is_int_like("0."))
        self.assertFalse(strutils.is_int_like("aaaaaa"))
        self.assertFalse(strutils.is_int_like("...."))
        self.assertFalse(strutils.is_int_like("1g"))
        self.assertFalse(
            strutils.is_int_like("0cc3346e-9fef-4445-abe6-5d2b2690ec64"))
        self.assertFalse(strutils.is_int_like("a1"))
        # NOTE(viktors): 12e3 - is a float number
        self.assertFalse(strutils.is_int_like("12e3"))
        # NOTE(viktors): Check integer numbers with base not 10
        self.assertFalse(strutils.is_int_like("0o51"))
        self.assertFalse(strutils.is_int_like("0xDEADBEEF"))


class StringLengthTestCase(test_base.BaseTestCase):
    def test_check_string_length(self):
        self.assertIsNone(strutils.check_string_length(
            'test', 'name', max_length=255))
        self.assertRaises(ValueError,
                          strutils.check_string_length,
                          '', 'name', min_length=1)
        self.assertRaises(ValueError,
                          strutils.check_string_length,
                          'a' * 256, 'name', max_length=255)
        self.assertRaises(TypeError,
                          strutils.check_string_length,
                          11, 'name', max_length=255)
        self.assertRaises(TypeError,
                          strutils.check_string_length,
                          dict(), 'name', max_length=255)

    def test_check_string_length_noname(self):
        self.assertIsNone(strutils.check_string_length(
            'test', max_length=255))
        self.assertRaises(ValueError,
                          strutils.check_string_length,
                          '', min_length=1)
        self.assertRaises(ValueError,
                          strutils.check_string_length,
                          'a' * 256, max_length=255)
        self.assertRaises(TypeError,
                          strutils.check_string_length,
                          11, max_length=255)
        self.assertRaises(TypeError,
                          strutils.check_string_length,
                          dict(), max_length=255)


class SplitPathTestCase(test_base.BaseTestCase):
    def test_split_path_failed(self):
        self.assertRaises(ValueError, strutils.split_path, '')
        self.assertRaises(ValueError, strutils.split_path, '/')
        self.assertRaises(ValueError, strutils.split_path, '//')
        self.assertRaises(ValueError, strutils.split_path, '//a')
        self.assertRaises(ValueError, strutils.split_path, '/a/c')
        self.assertRaises(ValueError, strutils.split_path, '//c')
        self.assertRaises(ValueError, strutils.split_path, '/a/c/')
        self.assertRaises(ValueError, strutils.split_path, '/a//')
        self.assertRaises(ValueError, strutils.split_path, '/a', 2)
        self.assertRaises(ValueError, strutils.split_path, '/a', 2, 3)
        self.assertRaises(ValueError, strutils.split_path, '/a', 2, 3, True)
        self.assertRaises(ValueError, strutils.split_path, '/a/c/o/r', 3, 3)
        self.assertRaises(ValueError, strutils.split_path, '/a', 5, 4)

    def test_split_path_success(self):
        self.assertEqual(strutils.split_path('/a'), ['a'])
        self.assertEqual(strutils.split_path('/a/'), ['a'])
        self.assertEqual(strutils.split_path('/a/c', 2), ['a', 'c'])
        self.assertEqual(strutils.split_path('/a/c/o', 3), ['a', 'c', 'o'])
        self.assertEqual(strutils.split_path('/a/c/o/r', 3, 3, True),
                         ['a', 'c', 'o/r'])
        self.assertEqual(strutils.split_path('/a/c', 2, 3, True),
                         ['a', 'c', None])
        self.assertEqual(strutils.split_path('/a/c/', 2), ['a', 'c'])
        self.assertEqual(strutils.split_path('/a/c/', 2, 3), ['a', 'c', ''])

    def test_split_path_invalid_path(self):
        try:
            strutils.split_path('o\nn e', 2)
        except ValueError as err:
            self.assertEqual(str(err), 'Invalid path: o%0An%20e')
        try:
            strutils.split_path('o\nn e', 2, 3, True)
        except ValueError as err:
            self.assertEqual(str(err), 'Invalid path: o%0An%20e')


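The `split_path` contract exercised above (leading slash required, empty segments rejected, `rest_with_last` folding the tail into the final segment, short results padded with `None`) originates in Swift's URL-path helper. A self-contained sketch consistent with those cases, written as an assumed reconstruction rather than the strutils source:

```python
from urllib.parse import quote


def split_path(path, minsegs=1, maxsegs=None, rest_with_last=False):
    """Validate and split an '/a/c/o'-style path into segments."""
    if not maxsegs:
        maxsegs = minsegs
    if minsegs > maxsegs:
        raise ValueError('minsegs > maxsegs: %d > %d' % (minsegs, maxsegs))
    if rest_with_last:
        # keep everything past the last counted '/' in the final segment
        segs = path.split('/', maxsegs)
        minsegs += 1
        maxsegs += 1
        count = len(segs)
        if (segs[0] or count < minsegs or count > maxsegs or
                '' in segs[1:minsegs]):
            raise ValueError('Invalid path: %s' % quote(path))
    else:
        minsegs += 1
        maxsegs += 1
        segs = path.split('/', maxsegs)
        count = len(segs)
        if (segs[0] or count < minsegs or count > maxsegs + 1 or
                '' in segs[1:minsegs] or
                (count == maxsegs + 1 and segs[maxsegs])):
            raise ValueError('Invalid path: %s' % quote(path))
    segs = segs[1:maxsegs]
    segs.extend([None] * (maxsegs - 1 - len(segs)))
    return segs
```

The `quote()` call explains the `'Invalid path: o%0An%20e'` message asserted above: control characters in the offending path are percent-encoded before being embedded in the error.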
class SplitByCommas(test_base.BaseTestCase):
    def test_not_closed_quotes(self):
        self.assertRaises(ValueError, strutils.split_by_commas, '"ab","b""')

    def test_no_comma_before_opening_quotes(self):
        self.assertRaises(ValueError, strutils.split_by_commas, '"ab""b"')

    def test_quote_inside_unquoted(self):
        self.assertRaises(ValueError, strutils.split_by_commas, 'a"b,cd')

    def check(self, expect, input):
        self.assertEqual(expect, strutils.split_by_commas(input))

    def test_plain(self):
        self.check(["a,b", "ac"], '"a,b",ac')

    def test_with_backslash_inside_quoted(self):
        self.check(['abc"', 'de', 'fg,h', 'klm\\', '"nop'],
                   r'"abc\"","de","fg,h","klm\\","\"nop"')

    def test_with_backslash_inside_unquoted(self):
        self.check([r'a\bc', 'de'], r'a\bc,de')

    def test_with_escaped_quotes_in_row_inside_quoted(self):
        self.check(['a"b""c', 'd'], r'"a\"b\"\"c",d')
@ -1,629 +0,0 @@
# Copyright 2011 OpenStack Foundation.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import calendar
import datetime
import logging
import time

import iso8601
import mock
from oslotest import base as test_base
from testtools import matchers

from oslo_utils import timeutils


def monotonic_iter(start=0, incr=0.05):
    while True:
        yield start
        start += incr


class TimeUtilsTest(test_base.BaseTestCase):

    def setUp(self):
        super(TimeUtilsTest, self).setUp()
        self.skynet_self_aware_time_str = '1997-08-29T06:14:00Z'
        self.skynet_self_aware_time_ms_str = '1997-08-29T06:14:00.000123Z'
        self.skynet_self_aware_time = datetime.datetime(1997, 8, 29, 6, 14, 0)
        self.skynet_self_aware_ms_time = datetime.datetime(1997, 8, 29, 6, 14,
                                                           0, 123)
        self.one_minute_before = datetime.datetime(1997, 8, 29, 6, 13, 0)
        self.one_minute_after = datetime.datetime(1997, 8, 29, 6, 15, 0)
        self.skynet_self_aware_time_perfect_str = '1997-08-29T06:14:00.000000'
        self.skynet_self_aware_time_perfect = datetime.datetime(1997, 8, 29,
                                                                6, 14, 0)
        self.addCleanup(timeutils.clear_time_override)

    def test_isotime(self):
        with mock.patch('datetime.datetime') as datetime_mock:
            datetime_mock.utcnow.return_value = self.skynet_self_aware_time
            dt = timeutils.isotime()
            self.assertEqual(dt, self.skynet_self_aware_time_str)

    def test_isotime_micro_second_precision(self):
        with mock.patch('datetime.datetime') as datetime_mock:
            datetime_mock.utcnow.return_value = self.skynet_self_aware_ms_time
            dt = timeutils.isotime(subsecond=True)
            self.assertEqual(dt, self.skynet_self_aware_time_ms_str)

    def test_parse_isotime(self):
        expect = timeutils.parse_isotime(self.skynet_self_aware_time_str)
        skynet_self_aware_time_utc = self.skynet_self_aware_time.replace(
            tzinfo=iso8601.iso8601.UTC)
        self.assertEqual(skynet_self_aware_time_utc, expect)

    def test_parse_isotime_micro_second_precision(self):
        expect = timeutils.parse_isotime(self.skynet_self_aware_time_ms_str)
        skynet_self_aware_time_ms_utc = self.skynet_self_aware_ms_time.replace(
            tzinfo=iso8601.iso8601.UTC)
        self.assertEqual(skynet_self_aware_time_ms_utc, expect)

    def test_strtime(self):
        expect = timeutils.strtime(self.skynet_self_aware_time_perfect)
        self.assertEqual(self.skynet_self_aware_time_perfect_str, expect)

    def test_parse_strtime(self):
        perfect_time_format = self.skynet_self_aware_time_perfect_str
        expect = timeutils.parse_strtime(perfect_time_format)
        self.assertEqual(self.skynet_self_aware_time_perfect, expect)

    def test_strtime_and_back(self):
        orig_t = datetime.datetime(1997, 8, 29, 6, 14, 0)
        s = timeutils.strtime(orig_t)
        t = timeutils.parse_strtime(s)
        self.assertEqual(orig_t, t)

    def _test_is_older_than(self, fn):
        strptime = datetime.datetime.strptime
        with mock.patch('datetime.datetime') as datetime_mock:
            datetime_mock.utcnow.return_value = self.skynet_self_aware_time
            datetime_mock.strptime = strptime
            expect_true = timeutils.is_older_than(fn(self.one_minute_before),
                                                  59)
            self.assertTrue(expect_true)
            expect_false = timeutils.is_older_than(fn(self.one_minute_before),
                                                   60)
            self.assertFalse(expect_false)
            expect_false = timeutils.is_older_than(fn(self.one_minute_before),
                                                   61)
            self.assertFalse(expect_false)

    def test_is_older_than_datetime(self):
        self._test_is_older_than(lambda x: x)

    def test_is_older_than_str(self):
        self._test_is_older_than(timeutils.strtime)

    def test_is_older_than_aware(self):
        """Tests sending is_older_than an 'aware' datetime."""
        self._test_is_older_than(lambda x: x.replace(
            tzinfo=iso8601.iso8601.UTC))

    def test_is_older_than_aware_no_utc(self):
        self._test_is_older_than(lambda x: x.replace(
            tzinfo=iso8601.iso8601.FixedOffset(1, 0, 'foo')).replace(
                hour=7))

    def _test_is_newer_than(self, fn):
        strptime = datetime.datetime.strptime
        with mock.patch('datetime.datetime') as datetime_mock:
            datetime_mock.utcnow.return_value = self.skynet_self_aware_time
            datetime_mock.strptime = strptime
            expect_true = timeutils.is_newer_than(fn(self.one_minute_after),
                                                  59)
            self.assertTrue(expect_true)
            expect_false = timeutils.is_newer_than(fn(self.one_minute_after),
                                                   60)
            self.assertFalse(expect_false)
            expect_false = timeutils.is_newer_than(fn(self.one_minute_after),
                                                   61)
            self.assertFalse(expect_false)

    def test_is_newer_than_datetime(self):
        self._test_is_newer_than(lambda x: x)

    def test_is_newer_than_str(self):
        self._test_is_newer_than(timeutils.strtime)

    def test_is_newer_than_aware(self):
        """Tests sending is_newer_than an 'aware' datetime."""
        self._test_is_newer_than(lambda x: x.replace(
            tzinfo=iso8601.iso8601.UTC))

    def test_is_newer_than_aware_no_utc(self):
        self._test_is_newer_than(lambda x: x.replace(
            tzinfo=iso8601.iso8601.FixedOffset(1, 0, 'foo')).replace(
                hour=7))

    def test_set_time_override_using_default(self):
        now = timeutils.utcnow_ts()

        # NOTE(kgriffs): Normally it's bad form to sleep in a unit test,
        # but this is the only way to test that set_time_override defaults
        # to setting the override to the current time.
        time.sleep(1)

        timeutils.set_time_override()
        overridden_now = timeutils.utcnow_ts()
        self.assertThat(now, matchers.LessThan(overridden_now))

    def test_utcnow_ts(self):
        skynet_self_aware_ts = 872835240
        skynet_dt = datetime.datetime.utcfromtimestamp(skynet_self_aware_ts)
        self.assertEqual(self.skynet_self_aware_time, skynet_dt)

        # NOTE(kgriffs): timeutils.utcnow_ts() uses time.time()
        # IFF time override is not set.
        with mock.patch('time.time') as time_mock:
            time_mock.return_value = skynet_self_aware_ts
            ts = timeutils.utcnow_ts()
            self.assertEqual(ts, skynet_self_aware_ts)

        timeutils.set_time_override(skynet_dt)
        ts = timeutils.utcnow_ts()
        self.assertEqual(ts, skynet_self_aware_ts)

    def test_utcnow(self):
        timeutils.set_time_override(mock.sentinel.utcnow)
        self.assertEqual(timeutils.utcnow(), mock.sentinel.utcnow)

        timeutils.clear_time_override()
        self.assertFalse(timeutils.utcnow() == mock.sentinel.utcnow)

        self.assertTrue(timeutils.utcnow())

    def test_advance_time_delta(self):
        timeutils.set_time_override(self.one_minute_before)
        timeutils.advance_time_delta(datetime.timedelta(seconds=60))
        self.assertEqual(timeutils.utcnow(), self.skynet_self_aware_time)

    def test_advance_time_seconds(self):
        timeutils.set_time_override(self.one_minute_before)
        timeutils.advance_time_seconds(60)
        self.assertEqual(timeutils.utcnow(), self.skynet_self_aware_time)

    def test_marshall_time(self):
        now = timeutils.utcnow()
        binary = timeutils.marshall_now(now)
        backagain = timeutils.unmarshall_time(binary)
        self.assertEqual(now, backagain)

    def test_marshall_time_with_tz(self):
        now = timeutils.utcnow()
        now = now.replace(tzinfo=iso8601.iso8601.UTC)
        binary = timeutils.marshall_now(now)
        self.assertEqual("UTC", binary['tzname'])
        backagain = timeutils.unmarshall_time(binary)
        self.assertEqual(now, backagain)
        self.assertIsNotNone(backagain.tzinfo)
        self.assertEqual(now.utcoffset(), backagain.utcoffset())

    def test_unmarshall_time_leap_second(self):
        leap_dict = dict(day=30, month=6, year=2015,
                         hour=23, minute=59,
                         second=timeutils._MAX_DATETIME_SEC + 1,
                         microsecond=0)
        leap_time = timeutils.unmarshall_time(leap_dict)

        leap_dict.update(second=timeutils._MAX_DATETIME_SEC)
        expected = timeutils.unmarshall_time(leap_dict)

        self.assertEqual(expected, leap_time)

    def test_delta_seconds(self):
        before = timeutils.utcnow()
        after = before + datetime.timedelta(days=7, seconds=59,
                                            microseconds=123456)
        self.assertAlmostEqual(604859.123456,
                               timeutils.delta_seconds(before, after))

    def test_iso8601_from_timestamp(self):
        utcnow = timeutils.utcnow()
        iso = timeutils.isotime(utcnow)
        ts = calendar.timegm(utcnow.timetuple())
        self.assertEqual(iso, timeutils.iso8601_from_timestamp(ts))

    def test_iso8601_from_timestamp_ms(self):
        ts = timeutils.utcnow_ts(microsecond=True)
        utcnow = datetime.datetime.utcfromtimestamp(ts)
        iso = timeutils.isotime(utcnow, subsecond=True)
        self.assertEqual(iso, timeutils.iso8601_from_timestamp(ts, True))

    def test_is_soon(self):
        expires = timeutils.utcnow() + datetime.timedelta(minutes=5)
        self.assertFalse(timeutils.is_soon(expires, 120))
        self.assertTrue(timeutils.is_soon(expires, 300))
        self.assertTrue(timeutils.is_soon(expires, 600))

        with mock.patch('datetime.datetime') as datetime_mock:
|
||||
datetime_mock.utcnow.return_value = self.skynet_self_aware_time
|
||||
expires = timeutils.utcnow()
|
||||
self.assertTrue(timeutils.is_soon(expires, 0))
|
||||
|
||||
|
||||
class TestIso8601Time(test_base.BaseTestCase):
|
||||
|
||||
def _instaneous(self, timestamp, yr, mon, day, hr, minute, sec, micro):
|
||||
self.assertEqual(timestamp.year, yr)
|
||||
self.assertEqual(timestamp.month, mon)
|
||||
self.assertEqual(timestamp.day, day)
|
||||
self.assertEqual(timestamp.hour, hr)
|
||||
self.assertEqual(timestamp.minute, minute)
|
||||
self.assertEqual(timestamp.second, sec)
|
||||
self.assertEqual(timestamp.microsecond, micro)
|
||||
|
||||
def _do_test(self, time_str, yr, mon, day, hr, minute, sec, micro, shift):
|
||||
DAY_SECONDS = 24 * 60 * 60
|
||||
timestamp = timeutils.parse_isotime(time_str)
|
||||
self._instaneous(timestamp, yr, mon, day, hr, minute, sec, micro)
|
||||
offset = timestamp.tzinfo.utcoffset(None)
|
||||
self.assertEqual(offset.seconds + offset.days * DAY_SECONDS, shift)
|
||||
|
||||
def test_zulu(self):
|
||||
time_str = '2012-02-14T20:53:07Z'
|
||||
self._do_test(time_str, 2012, 2, 14, 20, 53, 7, 0, 0)
|
||||
|
||||
def test_zulu_micros(self):
|
||||
time_str = '2012-02-14T20:53:07.123Z'
|
||||
self._do_test(time_str, 2012, 2, 14, 20, 53, 7, 123000, 0)
|
||||
|
||||
def test_offset_east(self):
|
||||
time_str = '2012-02-14T20:53:07+04:30'
|
||||
offset = 4.5 * 60 * 60
|
||||
self._do_test(time_str, 2012, 2, 14, 20, 53, 7, 0, offset)
|
||||
|
||||
def test_offset_east_micros(self):
|
||||
time_str = '2012-02-14T20:53:07.42+04:30'
|
||||
offset = 4.5 * 60 * 60
|
||||
self._do_test(time_str, 2012, 2, 14, 20, 53, 7, 420000, offset)
|
||||
|
||||
def test_offset_west(self):
|
||||
time_str = '2012-02-14T20:53:07-05:30'
|
||||
offset = -5.5 * 60 * 60
|
||||
self._do_test(time_str, 2012, 2, 14, 20, 53, 7, 0, offset)
|
||||
|
||||
def test_offset_west_micros(self):
|
||||
time_str = '2012-02-14T20:53:07.654321-05:30'
|
||||
offset = -5.5 * 60 * 60
|
||||
self._do_test(time_str, 2012, 2, 14, 20, 53, 7, 654321, offset)
|
||||
|
||||
def test_compare(self):
|
||||
zulu = timeutils.parse_isotime('2012-02-14T20:53:07')
|
||||
east = timeutils.parse_isotime('2012-02-14T20:53:07-01:00')
|
||||
west = timeutils.parse_isotime('2012-02-14T20:53:07+01:00')
|
||||
self.assertTrue(east > west)
|
||||
self.assertTrue(east > zulu)
|
||||
self.assertTrue(zulu > west)
|
||||
|
||||
def test_compare_micros(self):
|
||||
zulu = timeutils.parse_isotime('2012-02-14T20:53:07.6544')
|
||||
east = timeutils.parse_isotime('2012-02-14T19:53:07.654321-01:00')
|
||||
west = timeutils.parse_isotime('2012-02-14T21:53:07.655+01:00')
|
||||
self.assertTrue(east < west)
|
||||
self.assertTrue(east < zulu)
|
||||
self.assertTrue(zulu < west)
|
||||
|
||||
def test_zulu_roundtrip(self):
|
||||
time_str = '2012-02-14T20:53:07Z'
|
||||
zulu = timeutils.parse_isotime(time_str)
|
||||
self.assertEqual(zulu.tzinfo, iso8601.iso8601.UTC)
|
||||
self.assertEqual(timeutils.isotime(zulu), time_str)
|
||||
|
||||
def test_east_roundtrip(self):
|
||||
time_str = '2012-02-14T20:53:07-07:00'
|
||||
east = timeutils.parse_isotime(time_str)
|
||||
self.assertEqual(east.tzinfo.tzname(None), '-07:00')
|
||||
self.assertEqual(timeutils.isotime(east), time_str)
|
||||
|
||||
def test_west_roundtrip(self):
|
||||
time_str = '2012-02-14T20:53:07+11:30'
|
||||
west = timeutils.parse_isotime(time_str)
|
||||
self.assertEqual(west.tzinfo.tzname(None), '+11:30')
|
||||
self.assertEqual(timeutils.isotime(west), time_str)
|
||||
|
||||
def test_now_roundtrip(self):
|
||||
time_str = timeutils.isotime()
|
||||
now = timeutils.parse_isotime(time_str)
|
||||
self.assertEqual(now.tzinfo, iso8601.iso8601.UTC)
|
||||
self.assertEqual(timeutils.isotime(now), time_str)
|
||||
|
||||
def test_zulu_normalize(self):
|
||||
time_str = '2012-02-14T20:53:07Z'
|
||||
zulu = timeutils.parse_isotime(time_str)
|
||||
normed = timeutils.normalize_time(zulu)
|
||||
self._instaneous(normed, 2012, 2, 14, 20, 53, 7, 0)
|
||||
|
||||
def test_east_normalize(self):
|
||||
time_str = '2012-02-14T20:53:07-07:00'
|
||||
east = timeutils.parse_isotime(time_str)
|
||||
normed = timeutils.normalize_time(east)
|
||||
self._instaneous(normed, 2012, 2, 15, 3, 53, 7, 0)
|
||||
|
||||
def test_west_normalize(self):
|
||||
time_str = '2012-02-14T20:53:07+21:00'
|
||||
west = timeutils.parse_isotime(time_str)
|
||||
normed = timeutils.normalize_time(west)
|
||||
self._instaneous(normed, 2012, 2, 13, 23, 53, 7, 0)
|
||||
|
||||
def test_normalize_aware_to_naive(self):
|
||||
dt = datetime.datetime(2011, 2, 14, 20, 53, 7)
|
||||
time_str = '2011-02-14T20:53:07+21:00'
|
||||
aware = timeutils.parse_isotime(time_str)
|
||||
naive = timeutils.normalize_time(aware)
|
||||
self.assertTrue(naive < dt)
|
||||
|
||||
def test_normalize_zulu_aware_to_naive(self):
|
||||
dt = datetime.datetime(2011, 2, 14, 20, 53, 7)
|
||||
time_str = '2011-02-14T19:53:07Z'
|
||||
aware = timeutils.parse_isotime(time_str)
|
||||
naive = timeutils.normalize_time(aware)
|
||||
self.assertTrue(naive < dt)
|
||||
|
||||
def test_normalize_naive(self):
|
||||
dt = datetime.datetime(2011, 2, 14, 20, 53, 7)
|
||||
dtn = datetime.datetime(2011, 2, 14, 19, 53, 7)
|
||||
naive = timeutils.normalize_time(dtn)
|
||||
self.assertTrue(naive < dt)
|
||||
|
||||
|
||||
class TimeItTest(test_base.BaseTestCase):
|
||||
|
||||
@mock.patch('time.sleep')
|
||||
@mock.patch('oslo_utils.timeutils.now')
|
||||
def test_timed(self, mock_now, mock_sleep):
|
||||
mock_now.side_effect = monotonic_iter(incr=0.1)
|
||||
fake_logger = mock.MagicMock(logging.getLogger(), autospec=True)
|
||||
|
||||
@timeutils.time_it(fake_logger)
|
||||
def slow_function():
|
||||
time.sleep(0.1)
|
||||
|
||||
slow_function()
|
||||
self.assertTrue(mock_now.called)
|
||||
self.assertTrue(mock_sleep.called)
|
||||
self.assertTrue(fake_logger.log.called)
|
||||
fake_logger.log.assert_called_with(logging.DEBUG, mock.ANY, mock.ANY)
|
||||
|
||||
@mock.patch('time.sleep')
|
||||
@mock.patch('oslo_utils.timeutils.now')
|
||||
def test_no_timed_disabled(self, mock_now, mock_sleep):
|
||||
mock_now.side_effect = monotonic_iter(incr=0.1)
|
||||
fake_logger = mock.MagicMock(logging.getLogger(), autospec=True)
|
||||
|
||||
@timeutils.time_it(fake_logger, enabled=False)
|
||||
def slow_function():
|
||||
time.sleep(0.1)
|
||||
|
||||
slow_function()
|
||||
self.assertFalse(mock_now.called)
|
||||
self.assertFalse(fake_logger.log.called)
|
||||
|
||||
@mock.patch('time.sleep')
|
||||
@mock.patch('oslo_utils.timeutils.now')
|
||||
def test_no_timed_to_fast(self, mock_now, mock_sleep):
|
||||
mock_now.side_effect = monotonic_iter(incr=0.1)
|
||||
fake_logger = mock.MagicMock(logging.getLogger(), autospec=True)
|
||||
|
||||
@timeutils.time_it(fake_logger, min_duration=10)
|
||||
def fast_function():
|
||||
pass
|
||||
|
||||
fast_function()
|
||||
self.assertFalse(fake_logger.log.called)
|
||||
|
||||
@mock.patch('time.sleep')
|
||||
@mock.patch('oslo_utils.timeutils.now')
|
||||
def test_no_timed_exception(self, mock_now, mock_sleep):
|
||||
mock_now.side_effect = monotonic_iter(incr=0.1)
|
||||
fake_logger = mock.MagicMock(logging.getLogger(), autospec=True)
|
||||
|
||||
@timeutils.time_it(fake_logger)
|
||||
def broken_function():
|
||||
raise IOError("Broken")
|
||||
|
||||
self.assertRaises(IOError, broken_function)
|
||||
self.assertFalse(fake_logger.log.called)
|
||||
|
||||
@mock.patch('time.sleep')
|
||||
@mock.patch('oslo_utils.timeutils.now')
|
||||
def test_timed_custom_message(self, mock_now, mock_sleep):
|
||||
mock_now.side_effect = monotonic_iter(incr=0.1)
|
||||
fake_logger = mock.MagicMock(logging.getLogger(), autospec=True)
|
||||
|
||||
@timeutils.time_it(fake_logger, message="That took a long time")
|
||||
def slow_function():
|
||||
time.sleep(0.1)
|
||||
|
||||
slow_function()
|
||||
self.assertTrue(mock_now.called)
|
||||
self.assertTrue(mock_sleep.called)
|
||||
self.assertTrue(fake_logger.log.called)
|
||||
fake_logger.log.assert_called_with(logging.DEBUG,
|
||||
"That took a long time",
|
||||
mock.ANY)
|
||||
|
||||
@mock.patch('time.sleep')
|
||||
@mock.patch('oslo_utils.timeutils.now')
|
||||
def test_timed_custom_level(self, mock_now, mock_sleep):
|
||||
mock_now.side_effect = monotonic_iter(incr=0.1)
|
||||
fake_logger = mock.MagicMock(logging.getLogger(), autospec=True)
|
||||
|
||||
@timeutils.time_it(fake_logger, log_level=logging.INFO)
|
||||
def slow_function():
|
||||
time.sleep(0.1)
|
||||
|
||||
slow_function()
|
||||
self.assertTrue(mock_now.called)
|
||||
self.assertTrue(mock_sleep.called)
|
||||
self.assertTrue(fake_logger.log.called)
|
||||
fake_logger.log.assert_called_with(logging.INFO, mock.ANY, mock.ANY)
|
||||
|
||||
|
||||
class StopWatchTest(test_base.BaseTestCase):
|
||||
def test_leftover_no_duration(self):
|
||||
watch = timeutils.StopWatch()
|
||||
watch.start()
|
||||
self.assertRaises(RuntimeError, watch.leftover)
|
||||
self.assertRaises(RuntimeError, watch.leftover, return_none=False)
|
||||
self.assertIsNone(watch.leftover(return_none=True))
|
||||
|
||||
def test_no_states(self):
|
||||
watch = timeutils.StopWatch()
|
||||
self.assertRaises(RuntimeError, watch.stop)
|
||||
self.assertRaises(RuntimeError, watch.resume)
|
||||
|
||||
def test_bad_expiry(self):
|
||||
self.assertRaises(ValueError, timeutils.StopWatch, -1)
|
||||
|
||||
@mock.patch('oslo_utils.timeutils.now')
|
||||
def test_backwards(self, mock_now):
|
||||
mock_now.side_effect = [0, 0.5, -1.0, -1.0]
|
||||
watch = timeutils.StopWatch(0.1)
|
||||
watch.start()
|
||||
self.assertTrue(watch.expired())
|
||||
self.assertFalse(watch.expired())
|
||||
self.assertEqual(0.0, watch.elapsed())
|
||||
|
||||
@mock.patch('oslo_utils.timeutils.now')
|
||||
def test_expiry(self, mock_now):
|
||||
mock_now.side_effect = monotonic_iter(incr=0.2)
|
||||
watch = timeutils.StopWatch(0.1)
|
||||
watch.start()
|
||||
self.assertTrue(watch.expired())
|
||||
|
||||
@mock.patch('oslo_utils.timeutils.now')
|
||||
def test_not_expired(self, mock_now):
|
||||
mock_now.side_effect = monotonic_iter()
|
||||
watch = timeutils.StopWatch(0.1)
|
||||
watch.start()
|
||||
self.assertFalse(watch.expired())
|
||||
|
||||
def test_has_started_stopped(self):
|
||||
watch = timeutils.StopWatch()
|
||||
self.assertFalse(watch.has_started())
|
||||
self.assertFalse(watch.has_stopped())
|
||||
watch.start()
|
||||
|
||||
self.assertTrue(watch.has_started())
|
||||
self.assertFalse(watch.has_stopped())
|
||||
|
||||
watch.stop()
|
||||
self.assertTrue(watch.has_stopped())
|
||||
self.assertFalse(watch.has_started())
|
||||
|
||||
def test_no_expiry(self):
|
||||
watch = timeutils.StopWatch(0.1)
|
||||
self.assertRaises(RuntimeError, watch.expired)
|
||||
|
||||
@mock.patch('oslo_utils.timeutils.now')
|
||||
def test_elapsed(self, mock_now):
|
||||
mock_now.side_effect = monotonic_iter(incr=0.2)
|
||||
watch = timeutils.StopWatch()
|
||||
watch.start()
|
||||
matcher = matchers.GreaterThan(0.19)
|
||||
self.assertThat(watch.elapsed(), matcher)
|
||||
|
||||
def test_no_elapsed(self):
|
||||
watch = timeutils.StopWatch()
|
||||
self.assertRaises(RuntimeError, watch.elapsed)
|
||||
|
||||
def test_no_leftover(self):
|
||||
watch = timeutils.StopWatch()
|
||||
self.assertRaises(RuntimeError, watch.leftover)
|
||||
watch = timeutils.StopWatch(1)
|
||||
self.assertRaises(RuntimeError, watch.leftover)
|
||||
|
||||
@mock.patch('oslo_utils.timeutils.now')
|
||||
def test_pause_resume(self, mock_now):
|
||||
mock_now.side_effect = monotonic_iter()
|
||||
watch = timeutils.StopWatch()
|
||||
watch.start()
|
||||
watch.stop()
|
||||
elapsed = watch.elapsed()
|
||||
self.assertAlmostEqual(elapsed, watch.elapsed())
|
||||
watch.resume()
|
||||
self.assertNotEqual(elapsed, watch.elapsed())
|
||||
|
||||
@mock.patch('oslo_utils.timeutils.now')
|
||||
def test_context_manager(self, mock_now):
|
||||
mock_now.side_effect = monotonic_iter()
|
||||
with timeutils.StopWatch() as watch:
|
||||
pass
|
||||
matcher = matchers.GreaterThan(0.04)
|
||||
self.assertThat(watch.elapsed(), matcher)
|
||||
|
||||
@mock.patch('oslo_utils.timeutils.now')
|
||||
def test_context_manager_splits(self, mock_now):
|
||||
mock_now.side_effect = monotonic_iter()
|
||||
with timeutils.StopWatch() as watch:
|
||||
time.sleep(0.01)
|
||||
watch.split()
|
||||
self.assertRaises(RuntimeError, watch.split)
|
||||
self.assertEqual(1, len(watch.splits))
|
||||
|
||||
def test_splits_stopped(self):
|
||||
watch = timeutils.StopWatch()
|
||||
watch.start()
|
||||
watch.split()
|
||||
watch.stop()
|
||||
self.assertRaises(RuntimeError, watch.split)
|
||||
|
||||
def test_splits_never_started(self):
|
||||
watch = timeutils.StopWatch()
|
||||
self.assertRaises(RuntimeError, watch.split)
|
||||
|
||||
@mock.patch('oslo_utils.timeutils.now')
|
||||
def test_splits(self, mock_now):
|
||||
mock_now.side_effect = monotonic_iter()
|
||||
|
||||
watch = timeutils.StopWatch()
|
||||
watch.start()
|
||||
self.assertEqual(0, len(watch.splits))
|
||||
|
||||
watch.split()
|
||||
self.assertEqual(1, len(watch.splits))
|
||||
self.assertEqual(watch.splits[0].elapsed,
|
||||
watch.splits[0].length)
|
||||
|
||||
watch.split()
|
||||
splits = watch.splits
|
||||
self.assertEqual(2, len(splits))
|
||||
self.assertNotEqual(splits[0].elapsed, splits[1].elapsed)
|
||||
self.assertEqual(splits[1].length,
|
||||
splits[1].elapsed - splits[0].elapsed)
|
||||
|
||||
watch.stop()
|
||||
self.assertEqual(2, len(watch.splits))
|
||||
|
||||
watch.start()
|
||||
self.assertEqual(0, len(watch.splits))
|
||||
|
||||
@mock.patch('oslo_utils.timeutils.now')
|
||||
def test_elapsed_maximum(self, mock_now):
|
||||
mock_now.side_effect = [0, 1] + ([11] * 4)
|
||||
|
||||
watch = timeutils.StopWatch()
|
||||
watch.start()
|
||||
self.assertEqual(1, watch.elapsed())
|
||||
|
||||
self.assertEqual(11, watch.elapsed())
|
||||
self.assertEqual(1, watch.elapsed(maximum=1))
|
||||
|
||||
watch.stop()
|
||||
self.assertEqual(11, watch.elapsed())
|
||||
self.assertEqual(11, watch.elapsed())
|
||||
self.assertEqual(0, watch.elapsed(maximum=-1))
|
|
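The StopWatch behaviour these tests pin down (validated expiry, elapsed time clamped against backwards clock reads, a `maximum` cap) can be sketched with the standard library alone. This is an illustrative stand-in, not the retired oslo_utils implementation; the class name and defaults here are assumptions:

```python
import time


class StopWatch:
    """Minimal monotonic stopwatch mirroring the semantics tested above."""

    def __init__(self, duration=None):
        if duration is not None and duration < 0:
            raise ValueError("duration must be greater or equal to zero")
        self._duration = duration
        self._started_at = None
        self._stopped_at = None

    def start(self):
        # (Re)starting discards any previous stop point.
        self._started_at = time.monotonic()
        self._stopped_at = None

    def stop(self):
        if self._started_at is None:
            raise RuntimeError("can not stop an unstarted stopwatch")
        self._stopped_at = time.monotonic()

    def elapsed(self, maximum=None):
        if self._started_at is None:
            raise RuntimeError("can not get elapsed time of an unstarted stopwatch")
        end = self._stopped_at if self._stopped_at is not None else time.monotonic()
        # Clamp at zero so a backwards clock read never yields negative time.
        elapsed = max(0.0, end - self._started_at)
        if maximum is not None and elapsed > maximum:
            return max(0.0, maximum)
        return elapsed

    def expired(self):
        if self._duration is None:
            raise RuntimeError("no duration was set on this stopwatch")
        return self.elapsed() > self._duration
```

Note how `elapsed(maximum=-1)` clamps to zero rather than returning a negative value, matching the last assertion in `test_elapsed_maximum`.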
@@ -1,60 +0,0 @@
# Copyright (c) 2012 Intel Corporation.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import uuid

from oslotest import base as test_base

from oslo_utils import uuidutils


class UUIDUtilsTest(test_base.BaseTestCase):

    def test_generate_uuid(self):
        uuid_string = uuidutils.generate_uuid()
        self.assertIsInstance(uuid_string, str)
        self.assertEqual(len(uuid_string), 36)
        # make sure there are 4 dashes
        self.assertEqual(len(uuid_string.replace('-', '')), 32)

    def test_generate_uuid_dashed_false(self):
        uuid_string = uuidutils.generate_uuid(dashed=False)
        self.assertIsInstance(uuid_string, str)
        self.assertEqual(len(uuid_string), 32)
        self.assertFalse('-' in uuid_string)

    def test_is_uuid_like(self):
        self.assertTrue(uuidutils.is_uuid_like(str(uuid.uuid4())))
        self.assertTrue(uuidutils.is_uuid_like(
            '{12345678-1234-5678-1234-567812345678}'))
        self.assertTrue(uuidutils.is_uuid_like(
            '12345678123456781234567812345678'))
        self.assertTrue(uuidutils.is_uuid_like(
            'urn:uuid:12345678-1234-5678-1234-567812345678'))
        self.assertTrue(uuidutils.is_uuid_like(
            'urn:bbbaaaaa-aaaa-aaaa-aabb-bbbbbbbbbbbb'))
        self.assertTrue(uuidutils.is_uuid_like(
            'uuid:bbbaaaaa-aaaa-aaaa-aabb-bbbbbbbbbbbb'))
        self.assertTrue(uuidutils.is_uuid_like(
            '{}---bbb---aaa--aaa--aaa-----aaa---aaa--bbb-bbb---bbb-bbb-bb-{}'))

    def test_is_uuid_like_insensitive(self):
        self.assertTrue(uuidutils.is_uuid_like(str(uuid.uuid4()).upper()))

    def test_id_is_uuid_like(self):
        self.assertFalse(uuidutils.is_uuid_like(1234567))

    def test_name_is_uuid_like(self):
        self.assertFalse(uuidutils.is_uuid_like('zhongyueluo'))
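Most of the `is_uuid_like` contract exercised above (dashed, undashed, braced, and urn:/uuid:-prefixed forms accepted case-insensitively; non-strings and non-hex names rejected) can be sketched on top of the stdlib `uuid` module. This is an illustrative sketch built on normalize-and-reparse, not the retired implementation, and it does not attempt the last, deliberately pathological dashed-garbage case:

```python
import uuid


def is_uuid_like(val):
    """Return True if val string-normalizes to a valid UUID (illustrative)."""
    try:
        # uuid.UUID() already tolerates braces, dashes, and urn: prefixes;
        # comparing against the normalized hex digits keeps the check strict
        # about stray characters.
        normalized = str(val).lower().replace('urn:', '').replace('uuid:', '')
        normalized = normalized.strip('{}').replace('-', '')
        return uuid.UUID(val).hex == normalized
    except (TypeError, ValueError, AttributeError):
        return False
```

Catching `AttributeError` alongside `ValueError` is what makes non-string inputs such as `1234567` return False instead of raising.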
@@ -1,85 +0,0 @@
# Copyright (c) 2013 OpenStack Foundation
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from oslotest import base as test_base

from oslo_utils import versionutils


class IsCompatibleTestCase(test_base.BaseTestCase):
    def test_same_version(self):
        self.assertTrue(versionutils.is_compatible('1', '1'))
        self.assertTrue(versionutils.is_compatible('1.0', '1.0'))
        self.assertTrue(versionutils.is_compatible('1.0.0', '1.0.0'))

    def test_requested_minor_greater(self):
        self.assertFalse(versionutils.is_compatible('1.1', '1.0'))

    def test_requested_minor_less_than(self):
        self.assertTrue(versionutils.is_compatible('1.0', '1.1'))

    def test_requested_patch_greater(self):
        self.assertFalse(versionutils.is_compatible('1.0.1', '1.0.0'))

    def test_requested_patch_less_than(self):
        self.assertTrue(versionutils.is_compatible('1.0.0', '1.0.1'))

    def test_requested_patch_not_present_same(self):
        self.assertTrue(versionutils.is_compatible('1.0', '1.0.0'))

    def test_requested_patch_not_present_less_than(self):
        self.assertTrue(versionutils.is_compatible('1.0', '1.0.1'))

    def test_current_patch_not_present_same(self):
        self.assertTrue(versionutils.is_compatible('1.0.0', '1.0'))

    def test_current_patch_not_present_less_than(self):
        self.assertFalse(versionutils.is_compatible('1.0.1', '1.0'))

    def test_same_major_true(self):
        """Even though the current version is 2.0, since `same_major` defaults
        to `True`, 1.0 is deemed incompatible.
        """
        self.assertFalse(versionutils.is_compatible('2.0', '1.0'))
        self.assertTrue(versionutils.is_compatible('1.0', '1.0'))
        self.assertFalse(versionutils.is_compatible('1.0', '2.0'))

    def test_same_major_false(self):
        """With `same_major` set to False, the major version compatibility
        rule is not enforced, so a requested version of 1.0 is deemed
        compatible with a current version of 2.0.
        """
        self.assertFalse(versionutils.is_compatible('2.0', '1.0',
                                                    same_major=False))
        self.assertTrue(versionutils.is_compatible('1.0', '1.0',
                                                   same_major=False))
        self.assertTrue(versionutils.is_compatible('1.0', '2.0',
                                                   same_major=False))

    def test_convert_version_to_int(self):
        self.assertEqual(6002000, versionutils.convert_version_to_int('6.2.0'))
        self.assertEqual(6004003,
                         versionutils.convert_version_to_int((6, 4, 3)))
        self.assertEqual(5, versionutils.convert_version_to_int((5, )))
        self.assertRaises(ValueError,
                          versionutils.convert_version_to_int, '5a.6b')

    def test_convert_version_to_string(self):
        self.assertEqual('6.7.0', versionutils.convert_version_to_str(6007000))
        self.assertEqual('4', versionutils.convert_version_to_str(4))

    def test_convert_version_to_tuple(self):
        self.assertEqual((6, 7, 0),
                         versionutils.convert_version_to_tuple('6.7.0'))
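The packing scheme that the last three tests pin down — three decimal digits per version component, so '6.2.0' becomes 6002000 — can be sketched as follows. An illustrative reimplementation of the tested behaviour, not the retired code:

```python
def convert_version_to_int(version):
    """Pack a dotted version string or component tuple into one integer,
    giving each component three decimal digits."""
    try:
        if isinstance(version, str):
            version = tuple(int(part) for part in version.split('.'))
        result = 0
        for part in version:
            result = result * 1000 + int(part)
        return result
    except (ValueError, TypeError):
        raise ValueError("version %r is invalid" % (version,))


def convert_version_to_str(version_int):
    """Inverse of the packing above: 6007000 -> '6.7.0', 4 -> '4'."""
    parts = []
    while version_int >= 1000:
        version_int, part = divmod(version_int, 1000)
        parts.append(part)
    parts.append(version_int)
    return '.'.join(str(p) for p in reversed(parts))
```

The three-digits-per-component choice means each minor and patch component must stay below 1000 for the packing to round-trip.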
@@ -1,256 +0,0 @@
# -*- coding: utf-8 -*-
|
||||
|
||||
# Copyright 2014 Red Hat, Inc.
|
||||
# All Rights Reserved.
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
import mock
|
||||
from oslo_i18n import fixture as oslo_i18n_fixture
|
||||
from oslotest import base as test_base
|
||||
import six
|
||||
import testtools
|
||||
|
||||
from oslo_utils import encodeutils
|
||||
|
||||
|
||||
class EncodeUtilsTest(test_base.BaseTestCase):
|
||||
|
||||
def test_safe_decode(self):
|
||||
safe_decode = encodeutils.safe_decode
|
||||
self.assertRaises(TypeError, safe_decode, True)
|
||||
self.assertEqual(six.u('ni\xf1o'), safe_decode(six.b("ni\xc3\xb1o"),
|
||||
incoming="utf-8"))
|
||||
if six.PY2:
|
||||
# In Python 3, bytes.decode() doesn't support anymore
|
||||
# bytes => bytes encodings like base64
|
||||
self.assertEqual(six.u("test"), safe_decode("dGVzdA==",
|
||||
incoming='base64'))
|
||||
|
||||
self.assertEqual(six.u("strange"), safe_decode(six.b('\x80strange'),
|
||||
errors='ignore'))
|
||||
|
||||
self.assertEqual(six.u('\xc0'), safe_decode(six.b('\xc0'),
|
||||
incoming='iso-8859-1'))
|
||||
|
||||
# Forcing incoming to ascii so it falls back to utf-8
|
||||
self.assertEqual(six.u('ni\xf1o'), safe_decode(six.b('ni\xc3\xb1o'),
|
||||
incoming='ascii'))
|
||||
|
||||
self.assertEqual(six.u('foo'), safe_decode(b'foo'))
|
||||
|
||||
def test_safe_encode_none_instead_of_text(self):
|
||||
self.assertRaises(TypeError, encodeutils.safe_encode, None)
|
||||
|
||||
def test_safe_encode_bool_instead_of_text(self):
|
||||
self.assertRaises(TypeError, encodeutils.safe_encode, True)
|
||||
|
||||
def test_safe_encode_int_instead_of_text(self):
|
||||
self.assertRaises(TypeError, encodeutils.safe_encode, 1)
|
||||
|
||||
def test_safe_encode_list_instead_of_text(self):
|
||||
self.assertRaises(TypeError, encodeutils.safe_encode, [])
|
||||
|
||||
def test_safe_encode_dict_instead_of_text(self):
|
||||
self.assertRaises(TypeError, encodeutils.safe_encode, {})
|
||||
|
||||
def test_safe_encode_tuple_instead_of_text(self):
|
||||
self.assertRaises(TypeError, encodeutils.safe_encode, ('foo', 'bar', ))
|
||||
|
||||
def test_safe_encode_py2(self):
|
||||
if six.PY2:
|
||||
# In Python 3, str.encode() doesn't support anymore
|
||||
# text => text encodings like base64
|
||||
self.assertEqual(
|
||||
six.b("dGVzdA==\n"),
|
||||
encodeutils.safe_encode("test", encoding='base64'),
|
||||
)
|
||||
else:
|
||||
self.skipTest("Requires py2.x")
|
||||
|
||||
def test_safe_encode_force_incoming_utf8_to_ascii(self):
|
||||
# Forcing incoming to ascii so it falls back to utf-8
|
||||
self.assertEqual(
|
||||
six.b('ni\xc3\xb1o'),
|
||||
encodeutils.safe_encode(six.b('ni\xc3\xb1o'), incoming='ascii'),
|
||||
)
|
||||
|
||||
def test_safe_encode_same_encoding_different_cases(self):
|
||||
with mock.patch.object(encodeutils, 'safe_decode', mock.Mock()):
|
||||
utf8 = encodeutils.safe_encode(
|
||||
six.u('foo\xf1bar'), encoding='utf-8')
|
||||
self.assertEqual(
|
||||
encodeutils.safe_encode(utf8, 'UTF-8', 'utf-8'),
|
||||
encodeutils.safe_encode(utf8, 'utf-8', 'UTF-8'),
|
||||
)
|
||||
self.assertEqual(
|
||||
encodeutils.safe_encode(utf8, 'UTF-8', 'utf-8'),
|
||||
encodeutils.safe_encode(utf8, 'utf-8', 'utf-8'),
|
||||
)
|
||||
encodeutils.safe_decode.assert_has_calls([])
|
||||
|
||||
def test_safe_encode_different_encodings(self):
|
||||
text = six.u('foo\xc3\xb1bar')
|
||||
result = encodeutils.safe_encode(
|
||||
text=text, incoming='utf-8', encoding='iso-8859-1')
|
||||
self.assertNotEqual(text, result)
|
||||
self.assertNotEqual(six.b("foo\xf1bar"), result)
|
||||
|
||||
def test_to_utf8(self):
|
||||
self.assertEqual(encodeutils.to_utf8(b'a\xe9\xff'), # bytes
|
||||
b'a\xe9\xff')
|
||||
self.assertEqual(encodeutils.to_utf8(u'a\xe9\xff\u20ac'), # Unicode
|
||||
b'a\xc3\xa9\xc3\xbf\xe2\x82\xac')
|
||||
self.assertRaises(TypeError, encodeutils.to_utf8, 123) # invalid
|
||||
|
||||
# oslo.i18n Message objects should also be accepted for convenience.
|
||||
# It works because Message is a subclass of six.text_type. Use the
|
||||
# lazy translation to get a Message instance of oslo_i18n.
|
||||
msg = oslo_i18n_fixture.Translation().lazy("test")
|
||||
self.assertEqual(encodeutils.to_utf8(msg),
|
||||
b'test')
|
||||
|
||||
|
||||
class ExceptionToUnicodeTest(test_base.BaseTestCase):
|
||||
|
||||
def test_str_exception(self):
|
||||
# The regular Exception class cannot be used directly:
|
||||
# Exception(u'\xe9').__str__() raises an UnicodeEncodeError
|
||||
# on Python 2
|
||||
class StrException(Exception):
|
||||
def __init__(self, value):
|
||||
Exception.__init__(self)
|
||||
self.value = value
|
||||
|
||||
def __str__(self):
|
||||
return self.value
|
||||
|
||||
# On Python 3, an exception which returns bytes with is __str__()
|
||||
# method (like StrException(bytes)) is probably a bug, but it was not
|
||||
# harder to support this silly case in exception_to_unicode().
|
||||
|
||||
# Decode from ASCII
|
||||
exc = StrException(b'bytes ascii')
|
||||
self.assertEqual(encodeutils.exception_to_unicode(exc),
|
||||
u'bytes ascii')
|
||||
|
||||
# Decode from UTF-8
|
||||
exc = StrException(b'utf-8 \xc3\xa9\xe2\x82\xac')
|
||||
self.assertEqual(encodeutils.exception_to_unicode(exc),
|
||||
u'utf-8 \xe9\u20ac')
|
||||
|
||||
# Force the locale encoding to ASCII to test the fallback
|
||||
with mock.patch.object(encodeutils, '_getfilesystemencoding',
|
||||
return_value='ascii'):
|
||||
# Fallback: decode from ISO-8859-1
|
||||
exc = StrException(b'rawbytes \x80\xff')
|
||||
self.assertEqual(encodeutils.exception_to_unicode(exc),
|
||||
u'rawbytes \x80\xff')
|
||||
|
||||
# No conversion needed
|
||||
exc = StrException(u'unicode ascii')
|
||||
self.assertEqual(encodeutils.exception_to_unicode(exc),
|
||||
u'unicode ascii')
|
||||
|
||||
# No conversion needed
|
||||
exc = StrException(u'unicode \xe9\u20ac')
|
||||
self.assertEqual(encodeutils.exception_to_unicode(exc),
|
||||
u'unicode \xe9\u20ac')
|
||||
|
||||
# Test the locale encoding
|
||||
with mock.patch.object(encodeutils, '_getfilesystemencoding',
|
||||
return_value='koi8_r'):
|
||||
exc = StrException(b'\xf2\xd5\xd3\xd3\xcb\xc9\xca')
|
||||
# Decode from the locale encoding
|
||||
# (the message cannot be decoded from ASCII nor UTF-8)
|
||||
self.assertEqual(encodeutils.exception_to_unicode(exc),
|
||||
u'\u0420\u0443\u0441\u0441\u043a\u0438\u0439')
|
||||
|
||||
@testtools.skipIf(six.PY3, 'test specific to Python 2')
|
||||
def test_unicode_exception(self):
|
||||
# Exception with a __unicode__() method, but no __str__()
|
||||
class UnicodeException(Exception):
|
||||
def __init__(self, value):
|
||||
Exception.__init__(self)
|
||||
self.value = value
|
||||
|
||||
def __unicode__(self):
|
||||
return self.value
|
||||
|
||||
# __unicode__() returns unicode
|
||||
exc = UnicodeException(u'unicode \xe9\u20ac')
|
||||
        self.assertEqual(encodeutils.exception_to_unicode(exc),
                         u'unicode \xe9\u20ac')

        # __unicode__() returns bytes (does this case really happen in the
        # wild?)
        exc = UnicodeException(b'utf-8 \xc3\xa9\xe2\x82\xac')
        self.assertEqual(encodeutils.exception_to_unicode(exc),
                         u'utf-8 \xe9\u20ac')

    @testtools.skipIf(six.PY3, 'test specific to Python 2')
    def test_unicode_or_str_exception(self):
        # Exception with __str__() and __unicode__() methods
        class UnicodeOrStrException(Exception):
            def __init__(self, unicode_value, str_value):
                Exception.__init__(self)
                self.unicode_value = unicode_value
                self.str_value = str_value

            def __unicode__(self):
                return self.unicode_value

            def __str__(self):
                return self.str_value

        # __unicode__() returns unicode
        exc = UnicodeOrStrException(u'unicode \xe9\u20ac', b'str')
        self.assertEqual(encodeutils.exception_to_unicode(exc),
                         u'unicode \xe9\u20ac')

        # __unicode__() returns bytes (does this case really happen in the
        # wild?)
        exc = UnicodeOrStrException(b'utf-8 \xc3\xa9\xe2\x82\xac', b'str')
        self.assertEqual(encodeutils.exception_to_unicode(exc),
                         u'utf-8 \xe9\u20ac')

    @testtools.skipIf(six.PY3, 'test specific to Python 2')
    def test_unicode_only_exception(self):
        # Exception with a __unicode__() method and a __str__() which
        # raises an exception (similar to the Message class of oslo_i18n)
        class UnicodeOnlyException(Exception):
            def __init__(self, value):
                Exception.__init__(self)
                self.value = value

            def __unicode__(self):
                return self.value

            def __str__(self):
                raise UnicodeError("use unicode()")

        # __unicode__() returns unicode
        exc = UnicodeOnlyException(u'unicode \xe9\u20ac')
        self.assertEqual(encodeutils.exception_to_unicode(exc),
                         u'unicode \xe9\u20ac')

        # __unicode__() returns bytes
        exc = UnicodeOnlyException(b'utf-8 \xc3\xa9\xe2\x82\xac')
        self.assertEqual(encodeutils.exception_to_unicode(exc),
                         u'utf-8 \xe9\u20ac')

    def test_oslo_i18n_message(self):
        # use the lazy translation to get a Message instance of oslo_i18n
        exc = oslo_i18n_fixture.Translation().lazy("test")
        self.assertEqual(encodeutils.exception_to_unicode(exc),
                         u"test")
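The cases exercised above (text message, UTF-8 bytes message) can be condensed into a small Python 3 sketch. This is not the `oslo_utils.encodeutils` implementation, just an illustration of the decode-with-UTF-8-fallback behaviour the deleted tests verify; the helper name is made up:

```python
# Hedged sketch (hypothetical helper, not the encodeutils code): turn an
# exception's message into text, decoding UTF-8 bytes payloads.
def exception_message_to_text(exc):
    msg = exc.args[0] if exc.args else ''
    if isinstance(msg, bytes):
        # assume UTF-8, replacing undecodable sequences
        return msg.decode('utf-8', errors='replace')
    return str(msg)

print(exception_message_to_text(Exception(b'utf-8 \xc3\xa9\xe2\x82\xac')))
# -> utf-8 é€  (the same expected value as the assertions above)
```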
@ -1,544 +0,0 @@
# Copyright 2011 OpenStack Foundation.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Time related utilities and helper functions.
"""

import calendar
import datetime
import logging
import time

from debtcollector import removals
import iso8601
from monotonic import monotonic as now  # noqa
import pytz
import six

from oslo_utils import reflection

# ISO 8601 extended time format with microseconds
_ISO8601_TIME_FORMAT_SUBSECOND = '%Y-%m-%dT%H:%M:%S.%f'
_ISO8601_TIME_FORMAT = '%Y-%m-%dT%H:%M:%S'
PERFECT_TIME_FORMAT = _ISO8601_TIME_FORMAT_SUBSECOND

_MAX_DATETIME_SEC = 59


@removals.remove(
    message="use datetime.datetime.isoformat()",
    version="1.6",
    removal_version="?",
)
def isotime(at=None, subsecond=False):
    """Stringify time in ISO 8601 format.

    .. deprecated:: 1.5.0
       Use :func:`utcnow` and :func:`datetime.datetime.isoformat` instead.
    """
    if not at:
        at = utcnow()
    st = at.strftime(_ISO8601_TIME_FORMAT
                     if not subsecond
                     else _ISO8601_TIME_FORMAT_SUBSECOND)
    tz = at.tzinfo.tzname(None) if at.tzinfo else 'UTC'
    st += ('Z' if tz == 'UTC' else tz)
    return st


def parse_isotime(timestr):
    """Parse time from ISO 8601 format."""
    try:
        return iso8601.parse_date(timestr)
    except iso8601.ParseError as e:
        raise ValueError(six.text_type(e))
    except TypeError as e:
        raise ValueError(six.text_type(e))


@removals.remove(
    message="use either datetime.datetime.isoformat() "
            "or datetime.datetime.strftime() instead",
    version="1.6",
    removal_version="?",
)
def strtime(at=None, fmt=PERFECT_TIME_FORMAT):
    """Returns formatted utcnow.

    .. deprecated:: 1.5.0
       Use :func:`utcnow()`, :func:`datetime.datetime.isoformat`
       or :func:`datetime.strftime` instead:

       * ``strtime()`` => ``utcnow().isoformat()``
       * ``strtime(fmt=...)`` => ``utcnow().strftime(fmt)``
       * ``strtime(at)`` => ``at.isoformat()``
       * ``strtime(at, fmt)`` => ``at.strftime(fmt)``
    """
    if not at:
        at = utcnow()
    return at.strftime(fmt)


def parse_strtime(timestr, fmt=PERFECT_TIME_FORMAT):
    """Turn a formatted time back into a datetime."""
    return datetime.datetime.strptime(timestr, fmt)


def normalize_time(timestamp):
    """Normalize time in arbitrary timezone to UTC naive object."""
    offset = timestamp.utcoffset()
    if offset is None:
        return timestamp
    return timestamp.replace(tzinfo=None) - offset


def is_older_than(before, seconds):
    """Return True if before is older than seconds.

    .. versionchanged:: 1.7
       Accept datetime string with timezone information.
       Fix comparison with timezone aware datetime.
    """
    if isinstance(before, six.string_types):
        before = parse_isotime(before)

    before = normalize_time(before)

    return utcnow() - before > datetime.timedelta(seconds=seconds)


def is_newer_than(after, seconds):
    """Return True if after is newer than seconds.

    .. versionchanged:: 1.7
       Accept datetime string with timezone information.
       Fix comparison with timezone aware datetime.
    """
    if isinstance(after, six.string_types):
        after = parse_isotime(after)

    after = normalize_time(after)

    return after - utcnow() > datetime.timedelta(seconds=seconds)


def utcnow_ts(microsecond=False):
    """Timestamp version of our utcnow function.

    See :py:class:`oslo_utils.fixture.TimeFixture`.

    .. versionchanged:: 1.3
       Added optional *microsecond* parameter.
    """
    if utcnow.override_time is None:
        # NOTE(kgriffs): This is several times faster
        # than going through calendar.timegm(...)
        timestamp = time.time()
        if not microsecond:
            timestamp = int(timestamp)
        return timestamp

    now = utcnow()
    timestamp = calendar.timegm(now.timetuple())

    if microsecond:
        timestamp += float(now.microsecond) / 1000000

    return timestamp


def utcnow(with_timezone=False):
    """Overridable version of utils.utcnow that can return a TZ-aware datetime.

    See :py:class:`oslo_utils.fixture.TimeFixture`.

    .. versionchanged:: 1.6
       Added *with_timezone* parameter.
    """
    if utcnow.override_time:
        try:
            return utcnow.override_time.pop(0)
        except AttributeError:
            return utcnow.override_time
    if with_timezone:
        return datetime.datetime.now(tz=iso8601.iso8601.UTC)
    return datetime.datetime.utcnow()


@removals.remove(
    message="use datetime.datetime.utcfromtimestamp().isoformat()",
    version="1.6",
    removal_version="?",
)
def iso8601_from_timestamp(timestamp, microsecond=False):
    """Returns an iso8601 formatted date from timestamp.

    .. versionchanged:: 1.3
       Added optional *microsecond* parameter.

    .. deprecated:: 1.5.0
       Use :func:`datetime.datetime.utcfromtimestamp` and
       :func:`datetime.datetime.isoformat` instead.
    """
    return isotime(datetime.datetime.utcfromtimestamp(timestamp), microsecond)


utcnow.override_time = None


def set_time_override(override_time=None):
    """Overrides utils.utcnow.

    Make it return a constant time or a list thereof, one at a time.

    See :py:class:`oslo_utils.fixture.TimeFixture`.

    :param override_time: datetime instance or list thereof. If not
                          given, defaults to the current UTC time.
    """
    utcnow.override_time = override_time or datetime.datetime.utcnow()


def advance_time_delta(timedelta):
    """Advance overridden time using a datetime.timedelta.

    See :py:class:`oslo_utils.fixture.TimeFixture`.
    """
    assert utcnow.override_time is not None  # nosec
    try:
        for dt in utcnow.override_time:
            dt += timedelta
    except TypeError:
        utcnow.override_time += timedelta


def advance_time_seconds(seconds):
    """Advance overridden time by seconds.

    See :py:class:`oslo_utils.fixture.TimeFixture`.
    """
    advance_time_delta(datetime.timedelta(0, seconds))


def clear_time_override():
    """Remove the overridden time.

    See :py:class:`oslo_utils.fixture.TimeFixture`.
    """
    utcnow.override_time = None


def marshall_now(now=None):
    """Make an rpc-safe datetime with microseconds.

    .. versionchanged:: 1.6
       Timezone information is now serialized instead of being stripped.
    """
    if not now:
        now = utcnow()
    d = dict(day=now.day, month=now.month, year=now.year, hour=now.hour,
             minute=now.minute, second=now.second,
             microsecond=now.microsecond)
    if now.tzinfo:
        d['tzname'] = now.tzinfo.tzname(None)
    return d


def unmarshall_time(tyme):
    """Unmarshall a datetime dict.

    .. versionchanged:: 1.5
       Drop leap second.

    .. versionchanged:: 1.6
       Added support for timezone information.
    """

    # NOTE(ihrachys): datetime does not support leap seconds,
    # so the best thing we can do for now is dropping them
    # http://bugs.python.org/issue23574
    second = min(tyme['second'], _MAX_DATETIME_SEC)
    dt = datetime.datetime(day=tyme['day'],
                           month=tyme['month'],
                           year=tyme['year'],
                           hour=tyme['hour'],
                           minute=tyme['minute'],
                           second=second,
                           microsecond=tyme['microsecond'])
    tzname = tyme.get('tzname')
    if tzname:
        tzinfo = pytz.timezone(tzname)
        dt = tzinfo.localize(dt)
    return dt


def delta_seconds(before, after):
    """Return the difference between two timing objects.

    Compute the difference in seconds between two date, time, or
    datetime objects (as a float, to microsecond resolution).
    """
    delta = after - before
    return delta.total_seconds()


def is_soon(dt, window):
    """Determines if time is going to happen in the next window seconds.

    :param dt: the time
    :param window: minimum seconds to remain to consider the time not soon

    :return: True if expiration is within the given duration
    """
    soon = (utcnow() + datetime.timedelta(seconds=window))
    return normalize_time(dt) <= soon


class Split(object):
    """An *immutable* stopwatch split.

    See: http://en.wikipedia.org/wiki/Stopwatch for what this is/represents.

    .. versionadded:: 1.4
    """

    __slots__ = ['_elapsed', '_length']

    def __init__(self, elapsed, length):
        self._elapsed = elapsed
        self._length = length

    @property
    def elapsed(self):
        """Duration from stopwatch start."""
        return self._elapsed

    @property
    def length(self):
        """Seconds from last split (or the elapsed time if no prior split)."""
        return self._length

    def __repr__(self):
        r = reflection.get_class_name(self, fully_qualified=False)
        r += "(elapsed=%s, length=%s)" % (self._elapsed, self._length)
        return r


def time_it(logger, log_level=logging.DEBUG,
            message="It took %(seconds).02f seconds to"
                    " run function '%(func_name)s'",
            enabled=True, min_duration=0.01):
    """Decorator that will log how long its decorated function takes to run.

    This does **not** output a log if the decorated function fails
    with an exception.

    :param logger: logger instance to use when logging elapsed time
    :param log_level: logger logging level to use when logging elapsed time
    :param message: customized message to use when logging elapsed time;
                    the message may use the automatically provided values
                    ``%(seconds)`` and ``%(func_name)`` if it finds those
                    values useful to record
    :param enabled: whether to enable or disable this decorator (useful to
                    decorate a function with this decorator, and then easily
                    be able to switch that decoration off by some config or
                    other value)
    :param min_duration: argument that determines if logging is triggered
                         or not; it defaults to 0.01 seconds to avoid
                         logging when durations and/or elapsed function call
                         times are less than 0.01 seconds. To disable
                         any ``min_duration`` checks, set this value to
                         less than or equal to zero, or to None.
    """

    def decorator(func):
        if not enabled:
            return func

        @six.wraps(func)
        def wrapper(*args, **kwargs):
            with StopWatch() as w:
                result = func(*args, **kwargs)
            time_taken = w.elapsed()
            if min_duration is None or time_taken >= min_duration:
                logger.log(log_level, message,
                           {'seconds': time_taken,
                            'func_name': reflection.get_callable_name(func)})
            return result

        return wrapper

    return decorator


class StopWatch(object):
    """A simple timer/stopwatch helper class.

    Inspired by: apache-commons-lang java stopwatch.

    Not thread-safe (when a single watch is mutated by multiple threads at
    the same time). Thread-safe when used by a single thread (not shared) or
    when operations are performed in a thread-safe manner on these objects by
    wrapping those operations with locks.

    It will use the `monotonic`_ pypi library to find an appropriate
    monotonically increasing time providing function (which typically varies
    depending on operating system and python version).

    .. _monotonic: https://pypi.python.org/pypi/monotonic/

    .. versionadded:: 1.4
    """
    _STARTED = 'STARTED'
    _STOPPED = 'STOPPED'

    def __init__(self, duration=None):
        if duration is not None and duration < 0:
            raise ValueError("Duration must be greater or equal to"
                             " zero and not %s" % duration)
        self._duration = duration
        self._started_at = None
        self._stopped_at = None
        self._state = None
        self._splits = ()

    def start(self):
        """Starts the watch (if not already started).

        NOTE(harlowja): resets any splits previously captured (if any).
        """
        if self._state == self._STARTED:
            return self
        self._started_at = now()
        self._stopped_at = None
        self._state = self._STARTED
        self._splits = ()
        return self

    @property
    def splits(self):
        """Accessor to all/any splits that have been captured."""
        return self._splits

    def split(self):
        """Captures a split/elapsed since start time (and doesn't stop)."""
        if self._state == self._STARTED:
            elapsed = self.elapsed()
            if self._splits:
                length = self._delta_seconds(self._splits[-1].elapsed, elapsed)
            else:
                length = elapsed
            self._splits = self._splits + (Split(elapsed, length),)
            return self._splits[-1]
        else:
            raise RuntimeError("Can not create a split time of a stopwatch"
                               " if it has not been started or if it has been"
                               " stopped")

    def restart(self):
        """Restarts the watch from a started/stopped state."""
        if self._state == self._STARTED:
            self.stop()
        self.start()
        return self

    @staticmethod
    def _delta_seconds(earlier, later):
        # Uses max to avoid the delta/time going backwards (and thus negative).
        return max(0.0, later - earlier)

    def elapsed(self, maximum=None):
        """Returns how many seconds have elapsed."""
        if self._state not in (self._STARTED, self._STOPPED):
            raise RuntimeError("Can not get the elapsed time of a stopwatch"
                               " if it has not been started/stopped")
        if self._state == self._STOPPED:
            elapsed = self._delta_seconds(self._started_at, self._stopped_at)
        else:
            elapsed = self._delta_seconds(self._started_at, now())
        if maximum is not None and elapsed > maximum:
            elapsed = max(0.0, maximum)
        return elapsed

    def __enter__(self):
        """Starts the watch."""
        self.start()
        return self

    def __exit__(self, type, value, traceback):
        """Stops the watch (ignoring errors if stop fails)."""
        try:
            self.stop()
        except RuntimeError:  # nosec: errors are meant to be ignored
            pass

    def leftover(self, return_none=False):
        """Returns how many seconds are left until the watch expires.

        :param return_none: when ``True`` instead of raising a ``RuntimeError``
                            when no duration has been set this call will
                            return ``None`` instead.
        :type return_none: boolean
        """
        if self._state != self._STARTED:
            raise RuntimeError("Can not get the leftover time of a stopwatch"
                               " that has not been started")
        if self._duration is None:
            if not return_none:
                raise RuntimeError("Can not get the leftover time of a watch"
                                   " that has no duration")
            return None
        return max(0.0, self._duration - self.elapsed())

    def expired(self):
        """Returns if the watch has expired (ie, duration provided elapsed)."""
        if self._state not in (self._STARTED, self._STOPPED):
            raise RuntimeError("Can not check if a stopwatch has expired"
                               " if it has not been started/stopped")
        if self._duration is None:
            return False
        return self.elapsed() > self._duration

    def has_started(self):
        """Returns True if the watch is in a started state."""
        return self._state == self._STARTED

    def has_stopped(self):
        """Returns True if the watch is in a stopped state."""
        return self._state == self._STOPPED

    def resume(self):
        """Resumes the watch from a stopped state."""
        if self._state == self._STOPPED:
            self._state = self._STARTED
            return self
        else:
            raise RuntimeError("Can not resume a stopwatch that has not been"
                               " stopped")

    def stop(self):
        """Stops the watch."""
        if self._state == self._STOPPED:
            return self
        if self._state != self._STARTED:
            raise RuntimeError("Can not stop a stopwatch that has not been"
                               " started")
        self._stopped_at = now()
        self._state = self._STOPPED
        return self
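The split semantics in the removed StopWatch (each split records the elapsed time at capture plus the length since the previous split) can be sketched in a few lines of plain Python 3, with `time.monotonic` standing in for the external `monotonic` package. This is a hypothetical mini version for illustration, not the oslo_utils class:

```python
import time


class MiniStopWatch:
    """Sketch of the split-tracking behaviour of the removed StopWatch."""

    def __init__(self):
        self._started_at = None
        self.splits = []  # list of (elapsed, length) tuples

    def start(self):
        self._started_at = time.monotonic()
        self.splits = []  # starting resets previously captured splits
        return self

    def elapsed(self):
        # clamp at zero so time never appears to go backwards
        return max(0.0, time.monotonic() - self._started_at)

    def split(self):
        elapsed = self.elapsed()
        # length = time since the last split, or total elapsed if none yet
        length = elapsed - self.splits[-1][0] if self.splits else elapsed
        self.splits.append((elapsed, length))
        return self.splits[-1]


w = MiniStopWatch().start()
time.sleep(0.01)
first = w.split()
time.sleep(0.01)
second = w.split()
```

As in the original, the first split's `length` equals its `elapsed`, and each later `length` is the gap between consecutive splits.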
@ -1,54 +0,0 @@
# Copyright 2013 IBM Corp
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Unit constants
"""

# Binary unit constants.
Ki = 1024
"Binary kilo unit"
Mi = 1024 ** 2
"Binary mega unit"
Gi = 1024 ** 3
"Binary giga unit"
Ti = 1024 ** 4
"Binary tera unit"
Pi = 1024 ** 5
"Binary peta unit"
Ei = 1024 ** 6
"Binary exa unit"
Zi = 1024 ** 7
"Binary zetta unit"
Yi = 1024 ** 8
"Binary yotta unit"

# Decimal unit constants.
k = 1000
"Decimal kilo unit"
M = 1000 ** 2
"Decimal mega unit"
G = 1000 ** 3
"Decimal giga unit"
T = 1000 ** 4
"Decimal tera unit"
P = 1000 ** 5
"Decimal peta unit"
E = 1000 ** 6
"Decimal exa unit"
Z = 1000 ** 7
"Decimal zetta unit"
Y = 1000 ** 8
"Decimal yotta unit"
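A quick illustration of why the module keeps both families of constants: the same byte count reads differently in binary (1024-based) and decimal (1000-based) units. The constants are redefined here so the snippet stands alone:

```python
# Binary vs. decimal unit constants, as in the removed units.py.
Ki = 1024
Mi = 1024 ** 2
M = 1000 ** 2

size = 5 * Mi            # 5 MiB expressed in bytes
print(size)              # 5242880
print(size / M)          # 5.24288 decimal megabytes
print(size // Ki)        # 5120 KiB
```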
@ -1,58 +0,0 @@
# Copyright (c) 2012 Intel Corporation.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
UUID related utilities and helper functions.

.. versionadded:: 1.1
"""

import uuid


def generate_uuid(dashed=True):
    """Creates a random uuid string.

    :param dashed: Generate uuid with dashes or not
    :type dashed: bool
    :returns: string
    """
    if dashed:
        return str(uuid.uuid4())
    return uuid.uuid4().hex


def _format_uuid_string(string):
    return (string.replace('urn:', '')
                  .replace('uuid:', '')
                  .strip('{}')
                  .replace('-', '')
                  .lower())


def is_uuid_like(val):
    """Returns validation of a value as a UUID.

    :param val: Value to verify
    :type val: string
    :returns: bool

    .. versionchanged:: 1.1.1
       Support non-lowercase UUIDs.
    """
    try:
        return str(uuid.UUID(val)).replace('-', '') == _format_uuid_string(val)
    except (TypeError, ValueError, AttributeError):
        return False
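The check above works by canonicalising the candidate string (stripping `urn:`/`uuid:` prefixes, braces and dashes, lowercasing) and comparing it against the hex form of the parsed UUID. A self-contained restatement of that logic, using only the stdlib `uuid` module:

```python
import uuid


def format_uuid_string(string):
    # canonicalise: drop urn:/uuid: prefixes, braces, dashes; lowercase
    return (string.replace('urn:', '')
                  .replace('uuid:', '')
                  .strip('{}')
                  .replace('-', '')
                  .lower())


def is_uuid_like(val):
    try:
        # parse, then compare the dashless forms so case and braces
        # do not matter
        return str(uuid.UUID(val)).replace('-', '') == format_uuid_string(val)
    except (TypeError, ValueError, AttributeError):
        # non-strings and malformed strings are simply "not uuid-like"
        return False
```

Note the comparison (rather than just parsing) rejects inputs that `uuid.UUID` would accept but that are not in any recognised textual form.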
@ -1,88 +0,0 @@
# Copyright (c) 2013 OpenStack Foundation
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Helpers for comparing version strings.

.. versionadded:: 1.6
"""

import pkg_resources
import six

from oslo_utils._i18n import _


def is_compatible(requested_version, current_version, same_major=True):
    """Determine whether `requested_version` is satisfied by
    `current_version`; in other words, `current_version` is >=
    `requested_version`.

    :param requested_version: version to check for compatibility
    :param current_version: version to check against
    :param same_major: if True, the major version must be identical between
        `requested_version` and `current_version`. This is used when a
        major-version difference indicates incompatibility between the two
        versions. Since this is the common-case in practice, the default is
        True.
    :returns: True if compatible, False if not
    """
    requested_parts = pkg_resources.parse_version(requested_version)
    current_parts = pkg_resources.parse_version(current_version)

    if same_major and (requested_parts[0] != current_parts[0]):
        return False

    return current_parts >= requested_parts


def convert_version_to_int(version):
    """Convert a version to an integer.

    *version* must be a string with dots or a tuple of integers.

    .. versionadded:: 2.0
    """
    try:
        if isinstance(version, six.string_types):
            version = convert_version_to_tuple(version)
        if isinstance(version, tuple):
            return six.moves.reduce(lambda x, y: (x * 1000) + y, version)
    except Exception as ex:
        msg = _("Version %s is invalid.") % version
        six.raise_from(ValueError(msg), ex)


def convert_version_to_str(version_int):
    """Convert a version integer to a string with dots.

    .. versionadded:: 2.0
    """
    version_numbers = []
    factor = 1000
    while version_int != 0:
        version_number = version_int - (version_int // factor * factor)
        version_numbers.insert(0, six.text_type(version_number))
        version_int = version_int // factor

    return '.'.join(map(str, version_numbers))


def convert_version_to_tuple(version_str):
    """Convert a version string with dots to a tuple.

    .. versionadded:: 2.0
    """
    return tuple(int(part) for part in version_str.split('.'))
@ -1,3 +0,0 @@
---
other:
  - Introduce reno for deployer release notes.
@ -1,3 +0,0 @@
---
fixes:
  - Expanded range of allowed ports by adding 0 to valid number.
@ -1,281 +0,0 @@
# -*- coding: utf-8 -*-
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
# sys.path.insert(0, os.path.abspath('.'))

# -- General configuration ------------------------------------------------

# If your documentation needs a minimal Sphinx version, state it here.
# needs_sphinx = '1.0'

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
    'openstackdocstheme',
    'reno.sphinxext',
]

# openstackdocstheme options
repository_name = 'openstack/oslo.utils'
bug_project = 'oslo.utils'
bug_tag = ''

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# The suffix of source filenames.
source_suffix = '.rst'

# The encoding of source files.
# source_encoding = 'utf-8-sig'

# The master toctree document.
master_doc = 'index'

# General information about the project.
project = u'oslo.utils Release Notes'
copyright = u'2016, oslo.utils Developers'

# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
# The full version, including alpha/beta/rc tags.
import pkg_resources
release = pkg_resources.get_distribution('oslo.utils').version
# The short X.Y version.
version = release

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
# language = None

# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
# today = ''
# Else, today_fmt is used as the format for a strftime call.
# today_fmt = '%B %d, %Y'

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = []

# The reST default role (used for this markup: `text`) to use for all
# documents.
# default_role = None

# If true, '()' will be appended to :func: etc. cross-reference text.
# add_function_parentheses = True

# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
# add_module_names = True

# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
# show_authors = False

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'

# A list of ignored prefixes for module index sorting.
# modindex_common_prefix = []

# If true, keep warnings as "system message" paragraphs in the built documents.
# keep_warnings = False


# -- Options for HTML output ----------------------------------------------

# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = 'openstackdocs'

# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
# html_theme_options = {}

# Add any paths that contain custom themes here, relative to this directory.
# html_theme_path = []

# The name for this set of Sphinx documents. If None, it defaults to
# "<project> v<release> documentation".
# html_title = None

# A shorter title for the navigation bar. Default is the same as html_title.
# html_short_title = None

# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
# html_logo = None

# The name of an image file (within the static path) to use as favicon of the
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
# html_favicon = None

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']

# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
# directly to the root of the documentation.
# html_extra_path = []

# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
# using the given strftime format.
html_last_updated_fmt = '%Y-%m-%d %H:%M'

# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
# html_use_smartypants = True

# Custom sidebar templates, maps document names to template names.
# html_sidebars = {}

# Additional templates that should be rendered to pages, maps page names to
# template names.
# html_additional_pages = {}

# If false, no module index is generated.
# html_domain_indices = True

# If false, no index is generated.
# html_use_index = True

# If true, the index is split into individual pages for each letter.
# html_split_index = False

# If true, links to the reST sources are added to the pages.
# html_show_sourcelink = True

# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
# html_show_sphinx = True

# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
# html_show_copyright = True

# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it. The value of this option must be the
# base URL from which the finished HTML is served.
# html_use_opensearch = ''

# This is the file name suffix for HTML files (e.g. ".xhtml").
# html_file_suffix = None

# Output file base name for HTML help builder.
htmlhelp_basename = 'oslo.utilsReleaseNotesDoc'


# -- Options for LaTeX output ---------------------------------------------

latex_elements = {
    # The paper size ('letterpaper' or 'a4paper').
    # 'papersize': 'letterpaper',

    # The font size ('10pt', '11pt' or '12pt').
    # 'pointsize': '10pt',

    # Additional stuff for the LaTeX preamble.
    # 'preamble': '',
}

# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
    ('index', 'oslo.utilsReleaseNotes.tex',
     u'oslo.utils Release Notes Documentation',
|
||||
u'oslo.utils Developers', 'manual'),
|
||||
]
|
||||
|
||||
# The name of an image file (relative to this directory) to place at the top of
|
||||
# the title page.
|
||||
# latex_logo = None
|
||||
|
||||
# For "manual" documents, if this is true, then toplevel headings are parts,
|
||||
# not chapters.
|
||||
# latex_use_parts = False
|
||||
|
||||
# If true, show page references after internal links.
|
||||
# latex_show_pagerefs = False
|
||||
|
||||
# If true, show URL addresses after external links.
|
||||
# latex_show_urls = False
|
||||
|
||||
# Documents to append as an appendix to all manuals.
|
||||
# latex_appendices = []
|
||||
|
||||
# If false, no module index is generated.
|
||||
# latex_domain_indices = True
|
||||
|
||||
|
||||
# -- Options for manual page output ---------------------------------------
|
||||
|
||||
# One entry per manual page. List of tuples
|
||||
# (source start file, name, description, authors, manual section).
|
||||
man_pages = [
|
||||
('index', 'oslo.utilsReleaseNotes',
|
||||
u'oslo.utils Release Notes Documentation',
|
||||
[u'oslo.utils Developers'], 1)
|
||||
]
|
||||
|
||||
# If true, show URL addresses after external links.
|
||||
# man_show_urls = False
|
||||
|
||||
|
||||
# -- Options for Texinfo output -------------------------------------------
|
||||
|
||||
# Grouping the document tree into Texinfo files. List of tuples
|
||||
# (source start file, target name, title, author,
|
||||
# dir menu entry, description, category)
|
||||
texinfo_documents = [
|
||||
('index', 'oslo.utilsReleaseNotes',
|
||||
u'oslo.utils Release Notes Documentation',
|
||||
u'oslo.utils Developers', 'oslo.utilsReleaseNotes',
|
||||
'One line description of project.',
|
||||
'Miscellaneous'),
|
||||
]
|
||||
|
||||
# Documents to append as an appendix to all manuals.
|
||||
# texinfo_appendices = []
|
||||
|
||||
# If false, no module index is generated.
|
||||
# texinfo_domain_indices = True
|
||||
|
||||
# How to display URL addresses: 'footnote', 'no', or 'inline'.
|
||||
# texinfo_show_urls = 'footnote'
|
||||
|
||||
# If true, do not generate a @detailmenu in the "Top" node's menu.
|
||||
# texinfo_no_detailmenu = False
|
||||
|
||||
# -- Options for Internationalization output ------------------------------
|
||||
locale_dirs = ['locale/']
|
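The `html_last_updated_fmt` value above is a plain `strftime` pattern that Sphinx uses to render the "Last updated on:" footer. A minimal sketch of what that pattern produces (the timestamp here is a made-up example, not from the repo):

```python
from datetime import datetime

# Hypothetical build timestamp, for illustration only.
ts = datetime(2017, 3, 1, 14, 30)

# Sphinx applies the conf.py pattern to the page build time.
stamp = ts.strftime('%Y-%m-%d %H:%M')
print(stamp)  # → 2017-03-01 14:30
```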
@ -1,10 +0,0 @@
===========================
 oslo.utils Release Notes
===========================

.. toctree::
   :maxdepth: 1

   unreleased
   ocata
   newton
@ -1,6 +0,0 @@
=============================
 Newton Series Release Notes
=============================

.. release-notes::
   :branch: origin/stable/newton
@ -1,6 +0,0 @@
|
|||
===================================
|
||||
Ocata Series Release Notes
|
||||
===================================
|
||||
|
||||
.. release-notes::
|
||||
:branch: origin/stable/ocata
|
|
@ -1,5 +0,0 @@
|
|||
==========================
|
||||
Unreleased Release Notes
|
||||
==========================
|
||||
|
||||
.. release-notes::
|
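Each deleted note file uses an over/underlined reStructuredText title, where the adornment line must be at least as long as the title text. A quick sketch of that rule (the `rst_title` helper is hypothetical, not part of the repo):

```python
def rst_title(text, char='='):
    """Build an RST section title with matching over- and underline.

    The adornment must span at least the full title; here it is made
    exactly as long as the text.
    """
    bar = char * len(text)
    return f"{bar}\n{text}\n{bar}"

print(rst_title('Ocata Series Release Notes'))
```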