Retire the project

Change-Id: Ic92e304924561a1aafdadab08803d282938aa17a
Gauvain Pocentek 2018-01-19 11:10:40 +01:00
parent 3605ead36c
commit e59f07864e
206 changed files with 8 additions and 24461 deletions


@@ -1,7 +0,0 @@
[run]
branch = True
source = cerberus
omit = cerberus/tests/*,cerberus/openstack/*
[report]
ignore_errors = True
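For context, this is the configuration a coverage run would pick up; a minimal sketch of driving it from Python, assuming the coverage package is installed (the test-suite invocation is a placeholder):

    import coverage

    # Picks up branch=True, source=cerberus and the omit rules from .coveragerc
    cov = coverage.Coverage(config_file='.coveragerc')
    cov.start()
    # ... run the cerberus test suite here (placeholder) ...
    cov.stop()
    cov.report()  # ignore_errors=True is read from the [report] section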

.gitignore

@@ -1,51 +0,0 @@
*.py[cod]
# C extensions
*.so
# Packages
*.egg
*.egg-info
dist
build
eggs
parts
bin
var
sdist
develop-eggs
.installed.cfg
# Installer logs
pip-log.txt
# Unit test / coverage reports
.coverage
.tox
nosetests.xml
.testrepository
# Translations
*.mo
# Mr Developer
.mr.developer.cfg
.project
.pydevproject
# Complexity
output/*.html
output/*/index.html
# Sphinx
doc/build
# pbr generates these
AUTHORS
ChangeLog
# Editors
*~
.*.swp
.*sw?
.idea


@@ -1,4 +0,0 @@
[gerrit]
host=review.openstack.org
port=29418
project=openstack/cerberus.git


@@ -1,3 +0,0 @@
# Format is:
# <preferred e-mail> <other e-mail 1>
# <preferred e-mail> <other e-mail 2>


@@ -1,4 +0,0 @@
[DEFAULT]
test_command=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} OS_TEST_TIMEOUT=60 ${PYTHON:-python} -m subunit.run discover -t ./ ${TESTS_DIR:-./cerberus/tests/unit/} $LISTOPT $IDOPTION
test_id_option=--load-list $IDFILE
test_list_option=--list
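The templated test_command above ultimately performs stdlib-style test discovery (streamed through subunit); with the default TESTS_DIR, the discovery step is roughly equivalent to this sketch:

    import unittest

    # Mirrors "discover -t ./ ./cerberus/tests/unit/" from the command above
    suite = unittest.defaultTestLoader.discover('cerberus/tests/unit/',
                                                top_level_dir='./')
    unittest.TextTestRunner().run(suite)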


@@ -1,17 +0,0 @@
If you would like to contribute to the development of OpenStack,
you must follow the steps in the "If you're a developer, start here"
section of this page:
http://wiki.openstack.org/HowToContribute
Once those steps have been completed, changes to OpenStack
should be submitted for review via the Gerrit tool, following
the workflow documented at:
http://wiki.openstack.org/GerritWorkflow
Pull requests submitted through GitHub will be ignored.
Bugs should be filed on Launchpad, not GitHub:
https://bugs.launchpad.net/cerberus


@@ -1,5 +0,0 @@
===========================
cerberus Style Commandments
===========================
Read the OpenStack Style Commandments http://docs.openstack.org/developer/hacking/

LICENSE

@@ -1,175 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.


@@ -1,202 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.


@@ -1,6 +0,0 @@
include AUTHORS
include ChangeLog
exclude .gitignore
exclude .gitreview
global-exclude *.pyc


@@ -1,15 +1,10 @@
===============================
cerberus
===============================
This project is no longer maintained.
Cerberus security component
The contents of this repository are still available in the Git
source code management system. To see the contents of this
repository before it reached its end of life, please check out the
previous commit with "git checkout HEAD^1".
* Free software: Apache license
* Documentation: http://docs.openstack.org/developer/cerberus
* Source: http://git.openstack.org/cgit/openstack/cerberus
* Bugs: http://bugs.launchpad.net/replace with the name of the project on launchpad
Features
--------
* TODO
For any further questions, please email
openstack-dev@lists.openstack.org or join #openstack-dev on
Freenode.


@@ -1 +0,0 @@
[python: **.py]


@@ -1,23 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import eventlet
eventlet.monkey_patch()
import pbr.version
__version__ = pbr.version.VersionInfo(
'cerberus').version_string()
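The pbr call above resolves the version from installed package metadata; a minimal sketch of the same lookup, assuming the cerberus distribution (and pbr) were still installed:

    import pbr.version

    # Reads the version string from the installed 'cerberus' package metadata
    print(pbr.version.VersionInfo('cerberus').version_string())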


@@ -1,27 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from oslo.config import cfg
from cerberus.openstack.common.gettextutils import _ # noqa
keystone_opts = [
cfg.StrOpt('auth_strategy', default='keystone',
help=_('The strategy to use for authentication.'))
]
CONF = cfg.CONF
CONF.register_opts(keystone_opts)
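For context, a sketch of how the registered option would be read elsewhere in the code base (assuming the same oslo.config namespace as above; non-default values would normally come from a loaded configuration file):

    from oslo.config import cfg

    import cerberus.api  # registers auth_strategy as a side effect of import

    print(cfg.CONF.auth_strategy)  # 'keystone' unless overridden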


@@ -1,112 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import pecan
from wsgiref import simple_server
from oslo.config import cfg
from cerberus.api import auth
from cerberus.api import config as api_config
from cerberus.api import hooks
from cerberus.openstack.common import log as logging
LOG = logging.getLogger(__name__)
auth_opts = [
cfg.StrOpt('api_paste_config',
default="api_paste.ini",
help="Configuration file for WSGI definition of API."
),
]
api_opts = [
cfg.StrOpt('host_ip',
default="0.0.0.0",
help="Host serving the API."
),
cfg.IntOpt('port',
default=8300,
help="Host port serving the API."
),
]
CONF = cfg.CONF
CONF.register_opts(auth_opts)
CONF.register_opts(api_opts, group='api')
def get_pecan_config():
# Set up the pecan configuration
filename = api_config.__file__.replace('.pyc', '.py')
return pecan.configuration.conf_from_file(filename)
def setup_app(pecan_config=None, extra_hooks=None):
if not pecan_config:
pecan_config = get_pecan_config()
app_hooks = [hooks.ConfigHook(),
hooks.DBHook(),
hooks.ContextHook(pecan_config.app.acl_public_routes),
hooks.NoExceptionTracebackHook()]
if pecan_config.app.enable_acl:
app_hooks.append(hooks.AuthorizationHook(
pecan_config.app.member_routes))
pecan.configuration.set_config(dict(pecan_config), overwrite=True)
app = pecan.make_app(
pecan_config.app.root,
static_root=pecan_config.app.static_root,
template_path=pecan_config.app.template_path,
debug=CONF.debug,
force_canonical=getattr(pecan_config.app, 'force_canonical', True),
hooks=app_hooks,
guess_content_type_from_ext=False
)
if pecan_config.app.enable_acl:
strategy = auth.strategy(CONF.auth_strategy)
return strategy.install(app,
cfg.CONF,
pecan_config.app.acl_public_routes)
return app
def build_server():
# Create the WSGI server and start it
host = CONF.api.host_ip
port = CONF.api.port
server_cls = simple_server.WSGIServer
handler_cls = simple_server.WSGIRequestHandler
pecan_config = get_pecan_config()
pecan_config.app.enable_acl = (CONF.auth_strategy == 'keystone')
app = setup_app(pecan_config=pecan_config)
srv = simple_server.make_server(
host,
port,
app,
server_cls,
handler_cls)
return srv
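A short usage sketch for the module above, assuming the cerberus package and its pecan/oslo dependencies were still installable:

    from cerberus.api import app as api_app

    # build_server() wires the pecan WSGI app into wsgiref's simple_server,
    # listening on CONF.api.host_ip:CONF.api.port (defaults: 0.0.0.0:8300)
    server = api_app.build_server()
    server.serve_forever()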


@@ -1,62 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from cerberus.api.middleware import auth_token
from cerberus.openstack.common import log
STRATEGIES = {}
LOG = log.getLogger(__name__)
OPT_GROUP_NAME = 'keystone_authtoken'
class KeystoneAuth(object):
@classmethod
def _register_opts(cls, conf):
"""Register keystoneclient middleware options."""
if OPT_GROUP_NAME not in conf:
conf.register_opts(auth_token.opts, group=OPT_GROUP_NAME)
auth_token.CONF = conf
@classmethod
def install(cls, app, conf, public_routes):
"""Install Auth check on application."""
LOG.debug(u'Installing Keystone\'s auth protocol')
cls._register_opts(conf)
conf = dict(conf.get(OPT_GROUP_NAME))
return auth_token.AuthTokenMiddleware(app,
conf=conf,
public_api_routes=public_routes)
STRATEGIES['keystone'] = KeystoneAuth
def strategy(strategy):
"""Returns the Auth Strategy.
:param strategy: String representing
the strategy to use
"""
try:
return STRATEGIES[strategy]
except KeyError:
raise RuntimeError
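For illustration, the registry above is consumed the same way setup_app() in app.py does it; a standalone sketch with a placeholder WSGI app:

    from oslo.config import cfg
    from cerberus.api import auth

    def wsgi_app(environ, start_response):  # placeholder application
        start_response('200 OK', [('Content-Type', 'text/plain')])
        return [b'ok']

    strategy_cls = auth.strategy(cfg.CONF.auth_strategy)  # KeystoneAuth by default
    # Wraps the app in keystoneclient's auth_token middleware, leaving the
    # listed public routes unauthenticated
    app = strategy_cls.install(wsgi_app, cfg.CONF, public_routes=['/', '/v1'])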


@@ -1,23 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# Pecan Application Configurations
app = {
'root': 'cerberus.api.root.RootController',
'modules': ['cerberus.api'],
'static_root': '%(confdir)s/public',
'template_path': '%(confdir)s/templates',
'debug': True,
'enable_acl': False,
'acl_public_routes': ['/', '/v1'],
'member_routes': ['/v1/security_reports', ]
}


@@ -1,153 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from oslo.config import cfg
from pecan import hooks
from webob import exc
from cerberus.common import context
from cerberus.common import policy
from cerberus.db import api as dbapi
class ConfigHook(hooks.PecanHook):
"""Attach the config object to the request so controllers can get to it."""
def before(self, state):
state.request.cfg = cfg.CONF
class DBHook(hooks.PecanHook):
"""Attach the dbapi object to the request so controllers can get to it."""
def before(self, state):
state.request.dbapi = dbapi.get_instance()
class ContextHook(hooks.PecanHook):
"""Configures a request context and attaches it to the request.
The following HTTP request headers are used:
X-User-Id or X-User:
Used for context.user_id.
X-Tenant-Id or X-Tenant:
Used for context.tenant.
X-Auth-Token:
Used for context.auth_token.
X-Roles:
Used for setting context.is_admin flag to either True or False.
The flag is set to True, if X-Roles contains either an administrator
or admin substring. Otherwise it is set to False.
"""
def __init__(self, public_api_routes):
self.public_api_routes = public_api_routes
super(ContextHook, self).__init__()
def before(self, state):
user_id = state.request.headers.get('X-User-Id')
user_id = state.request.headers.get('X-User', user_id)
tenant_id = state.request.headers.get('X-Tenant-Id')
tenant = state.request.headers.get('X-Tenant', tenant_id)
domain_id = state.request.headers.get('X-User-Domain-Id')
domain_name = state.request.headers.get('X-User-Domain-Name')
auth_token = state.request.headers.get('X-Auth-Token')
roles = state.request.headers.get('X-Roles', '').split(',')
creds = {'roles': roles}
is_public_api = state.request.environ.get('is_public_api', False)
is_admin = policy.enforce('context_is_admin',
state.request.headers,
creds)
state.request.context = context.RequestContext(
auth_token=auth_token,
user=user_id,
tenant_id=tenant_id,
tenant=tenant,
domain_id=domain_id,
domain_name=domain_name,
is_admin=is_admin,
is_public_api=is_public_api,
roles=roles)
class AuthorizationHook(hooks.PecanHook):
"""Verify that the user has admin rights.
Checks whether the request context is an admin context and
rejects the request if the api is not public.
"""
def __init__(self, member_routes):
self.member_routes = member_routes
super(AuthorizationHook, self).__init__()
def is_path_in_routes(self, path):
for p in self.member_routes:
if path.startswith(p):
return True
return False
def before(self, state):
ctx = state.request.context
if not ctx.is_admin and not ctx.is_public_api and \
not self.is_path_in_routes(state.request.path):
raise exc.HTTPForbidden()
class NoExceptionTracebackHook(hooks.PecanHook):
"""Workaround rpc.common: deserialize_remote_exception.
deserialize_remote_exception builds rpc exception traceback into error
message which is then sent to the client. Such behavior is a security
concern so this hook is aimed to cut-off traceback from the error message.
"""
# NOTE(max_lobur): 'after' hook used instead of 'on_error' because
# 'on_error' never fired for wsme+pecan pair. wsme @wsexpose decorator
# catches and handles all the errors, so 'on_error' dedicated for unhandled
# exceptions never fired.
def after(self, state):
# Omit empty body. Some errors may not have body at this level yet.
if not state.response.body:
return
# Do nothing if there is no error.
if 200 <= state.response.status_int < 400:
return
json_body = state.response.json
# Do not remove traceback when server in debug mode (except 'Server'
# errors when 'debuginfo' will be used for traces).
if cfg.CONF.debug and json_body.get('faultcode') != 'Server':
return
faultstring = json_body.get('faultstring')
traceback_marker = 'Traceback (most recent call last):'
if faultstring and (traceback_marker in faultstring):
# Cut off the traceback.
faultstring = faultstring.split(traceback_marker, 1)[0]
# Remove trailing newlines and spaces if any.
json_body['faultstring'] = faultstring.rstrip()
# Replace the whole json. Cannot change the original one because it's
# generated on the fly.
state.response.json = json_body
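The traceback trimming above boils down to a plain string split; a standalone illustration with a toy faultstring (not taken from a real response):

    traceback_marker = 'Traceback (most recent call last):'
    faultstring = ('Invalid plugin id\n'
                   'Traceback (most recent call last):\n'
                   '  File "worker.py", line 1, in <module>\n')
    # Everything from the marker onward is dropped, trailing whitespace stripped
    print(faultstring.split(traceback_marker, 1)[0].rstrip())  # -> Invalid plugin id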


@@ -1,20 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from cerberus.api.middleware import auth_token
AuthTokenMiddleware = auth_token.AuthTokenMiddleware


@@ -1,62 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import re
from keystoneclient.middleware import auth_token
from cerberus.common import exception
from cerberus.common import safe_utils
from cerberus.openstack.common.gettextutils import _ # noqa
from cerberus.openstack.common import log
LOG = log.getLogger(__name__)
class AuthTokenMiddleware(auth_token.AuthProtocol):
"""A wrapper on Keystone auth_token middleware.
Does not perform verification of authentication tokens
for public routes in the API.
"""
def __init__(self, app, conf, public_api_routes=[]):
route_pattern_tpl = '%s(\.json|\.xml)?$'
try:
self.public_api_routes = [re.compile(route_pattern_tpl % route_tpl)
for route_tpl in public_api_routes]
except re.error as e:
msg = _('Cannot compile public API routes: %s') % e
LOG.error(msg)
raise exception.ConfigInvalid(error_msg=msg)
super(AuthTokenMiddleware, self).__init__(app, conf)
def __call__(self, env, start_response):
path = safe_utils.safe_rstrip(env.get('PATH_INFO'), '/')
# The information whether the API call is being performed against the
# public API is required for some other components. Saving it to the
# WSGI environment is reasonable thereby.
env['is_public_api'] = any(map(lambda pattern: re.match(pattern, path),
self.public_api_routes))
if env['is_public_api']:
return self.app(env, start_response)
return super(AuthTokenMiddleware, self).__call__(env, start_response)
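To make the public-route matching concrete, here is the pattern behaviour in isolation (a standalone sketch reusing the same template string):

    import re

    route_pattern_tpl = r'%s(\.json|\.xml)?$'
    pattern = re.compile(route_pattern_tpl % '/v1')
    print(bool(pattern.match('/v1')))        # True  -> bypasses token validation
    print(bool(pattern.match('/v1.json')))   # True  -> bypasses token validation
    print(bool(pattern.match('/v1/tasks')))  # False -> goes through auth_token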


@@ -1,140 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import pecan
from pecan import rest
from wsme import types as wtypes
import wsmeext.pecan as wsme_pecan
from cerberus.api.v1 import controllers as v1_api
from cerberus.openstack.common import log as logging
LOG = logging.getLogger(__name__)
VERSION_STATUS = wtypes.Enum(wtypes.text, 'EXPERIMENTAL', 'STABLE')
class APILink(wtypes.Base):
"""API link description.
"""
type = wtypes.text
"""Type of link."""
rel = wtypes.text
"""Relationship with this link."""
href = wtypes.text
"""URL of the link."""
@classmethod
def sample(cls):
version = 'v1'
sample = cls(
rel='self',
type='text/html',
href='http://127.0.0.1:8888/{id}'.format(
id=version))
return sample
class APIMediaType(wtypes.Base):
"""Media type description.
"""
base = wtypes.text
"""Base type of this media type."""
type = wtypes.text
"""Type of this media type."""
@classmethod
def sample(cls):
sample = cls(
base='application/json',
type='application/vnd.openstack.sticks-v1+json')
return sample
class APIVersion(wtypes.Base):
"""API Version description.
"""
id = wtypes.text
"""ID of the version."""
status = VERSION_STATUS
"""Status of the version."""
updated = wtypes.text
"Last update in iso8601 format."
links = [APILink]
"""List of links to API resources."""
media_types = [APIMediaType]
"""Types accepted by this API."""
@classmethod
def sample(cls):
version = 'v1'
updated = '2014-08-11T16:00:00Z'
links = [APILink.sample()]
media_types = [APIMediaType.sample()]
sample = cls(id=version,
status='STABLE',
updated=updated,
links=links,
media_types=media_types)
return sample
class RootController(rest.RestController):
"""Root REST Controller exposing versions of the API.
"""
v1 = v1_api.V1Controller()
@wsme_pecan.wsexpose([APIVersion])
def get(self):
"""Return the version list
"""
# TODO(sheeprine): Maybe we should store all the API version
# information in every API module
ver1 = APIVersion(
id='v1',
status='EXPERIMENTAL',
updated='2015-03-09T16:00:00Z',
links=[
APILink(
rel='self',
href='{scheme}://{host}/v1'.format(
scheme=pecan.request.scheme,
host=pecan.request.host,
)
)
],
media_types=[]
)
versions = []
versions.append(ver1)
return versions
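For reference, a GET on the API root served by this controller would serialize ver1 into roughly the following body (a sketch: scheme and host are taken from the incoming request, 8300 being the default API port configured in app.py):

    # Approximate JSON body returned by RootController.get()
    [{
        "id": "v1",
        "status": "EXPERIMENTAL",
        "updated": "2015-03-09T16:00:00Z",
        "links": [{"rel": "self", "href": "http://127.0.0.1:8300/v1"}],
        "media_types": []
    }]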


@@ -1,32 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from pecan import rest
from cerberus.api.v1.controllers import plugins as plugins_api
from cerberus.api.v1.controllers import security_alarms as \
security_alarms_api
from cerberus.api.v1.controllers import security_reports as \
security_reports_api
from cerberus.api.v1.controllers import tasks as tasks_api
class V1Controller(rest.RestController):
"""API version 1 controller. """
plugins = plugins_api.PluginsController()
security_alarms = security_alarms_api.SecurityAlarmsController()
security_reports = security_reports_api.SecurityReportsController()
tasks = tasks_api.TasksController()


@@ -1,34 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from pecan import rest
from oslo.config import cfg
from oslo import messaging
from cerberus.openstack.common import log
LOG = log.getLogger(__name__)
class BaseController(rest.RestController):
def __init__(self):
super(BaseController, self).__init__()
transport = messaging.get_transport(cfg.CONF)
target = messaging.Target(topic='test_rpc', server='server1')
self.client = messaging.RPCClient(transport, target)


@@ -1,166 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
#
import json
import pecan
from webob import exc
from wsme import types as wtypes
import wsmeext.pecan as wsme_pecan
from oslo import messaging
from cerberus.api.v1.controllers import base
from cerberus.api.v1.datamodels import plugin as plugin_models
from cerberus.common import errors
from cerberus import db
from cerberus.db.sqlalchemy import models
from cerberus.openstack.common import log
LOG = log.getLogger(__name__)
_ENFORCER = None
class PluginsController(base.BaseController):
def list_plugins(self):
""" List all the plugins installed on system """
# Get information about plugins loaded by Cerberus
try:
plugins = self._plugins()
except messaging.RemoteError as e:
LOG.exception(e)
raise
try:
# Get information about plugins stored in db
db_plugins_info = db.plugins_info_get()
except Exception as e:
LOG.exception(e)
raise
plugins_info = {}
for plugin_info in db_plugins_info:
plugins_info[plugin_info.name] = models.\
PluginInfoJsonSerializer().serialize(plugin_info)
for key in plugins:
if key in plugins_info:
if isinstance(plugins_info[key], dict) and isinstance(
plugins[key], dict):
plugins_info[key].update(plugins[key])
pluginResources = []
for k, v in plugins_info.items():
pluginResources.append(
plugin_models.PluginResource(v))
return plugin_models.PluginResourceCollection(plugins=pluginResources)
def _plugins(self):
""" Get a dict of plugins loaded by Cerberus Manager """
ctx = pecan.request.context.to_dict()
try:
plugins = self.client.call(ctx, 'get_plugins')
except messaging.RemoteError as e:
LOG.exception(e)
raise
plugins_ = {}
for plugin in plugins:
plugin_ = json.loads(plugin)
plugins_[plugin_['name']] = plugin_
return plugins_
@wsme_pecan.wsexpose(plugin_models.PluginResourceCollection)
def get_all(self):
""" Get a list of plugins loaded by Cerberus manager
:return: a list of plugins loaded by Cerberus manager
:raises:
HTTPServiceUnavailable: an error occurred in Cerberus Manager or
the service is unavailable
HTTPNotFound: any other error
"""
# Get information about plugins loaded by Cerberus
try:
plugins = self.list_plugins()
except messaging.RemoteError:
raise exc.HTTPServiceUnavailable()
except Exception as e:
LOG.exception(e)
raise exc.HTTPNotFound()
return plugins
def get_plugin(self, uuid):
""" Get information about plugin loaded by Cerberus"""
try:
plugin = self._plugin(uuid)
except messaging.RemoteError:
raise
except errors.PluginNotFound:
raise
try:
# Get information about plugin stored in db
db_plugin_info = db.plugin_info_get_from_uuid(uuid)
plugin_info = models.PluginInfoJsonSerializer().\
serialize(db_plugin_info)
plugin_info.update(plugin)
except Exception as e:
LOG.exception(e)
raise
return plugin_models.PluginResource(plugin_info)
def _plugin(self, uuid):
""" Get a specific plugin thanks to its identifier """
ctx = pecan.request.context.to_dict()
try:
plugin = self.client.call(ctx, 'get_plugin_from_uuid', uuid=uuid)
except messaging.RemoteError as e:
LOG.exception(e)
raise
if plugin is None:
LOG.exception('Plugin %s not found.' % uuid)
raise errors.PluginNotFound(uuid)
return json.loads(plugin)
@wsme_pecan.wsexpose(plugin_models.PluginResource,
wtypes.text)
def get_one(self, uuid):
""" Get details of a specific plugin whose identifier is uuid
:param uuid: the identifier of the plugin
:return: details of a specific plugin
:raises:
HTTPServiceUnavailable: an error occurred in Cerberus Manager or
the service is unavailable
HTTPNotFound: Plugin is not found. Also any other error
"""
try:
plugin = self.get_plugin(uuid)
except messaging.RemoteError:
raise exc.HTTPServiceUnavailable()
except errors.PluginNotFound:
raise exc.HTTPNotFound()
except Exception as e:
LOG.exception(e)
raise exc.HTTPNotFound()
return plugin


@@ -1,124 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import pecan
from webob import exc
from wsme import types as wtypes
import wsmeext.pecan as wsme_pecan
from cerberus.api.v1.controllers import base
from cerberus.api.v1.datamodels import security_alarm as alarm_models
from cerberus.common import errors
from cerberus import db
from cerberus.db.sqlalchemy import models
from cerberus.openstack.common import log
LOG = log.getLogger(__name__)
class SecurityAlarmsController(base.BaseController):
@pecan.expose()
def _lookup(self, alarm_id, *remainder):
return SecurityAlarmController(alarm_id), remainder
def list_security_alarms(self):
""" List all the security alarms of all projects or just one. """
try:
security_alarms = db.security_alarm_get_all()
except Exception as e:
LOG.exception(e)
raise errors.DbError(
"Security alarms could not be retrieved"
)
return security_alarms
@wsme_pecan.wsexpose(alarm_models.SecurityAlarmResourceCollection)
def get_all(self):
""" Get stored security alarms.
:return: list of security alarms
:raises:
HTTPNotFound: Any database error
"""
try:
security_alarms = self.list_security_alarms()
except errors.DbError:
raise exc.HTTPNotFound()
alarms_resource = []
# todo(eglamn3) : no need to serialize here
for security_alarm in security_alarms:
alarms_resource.append(
alarm_models.SecurityAlarmResource(
models.SecurityAlarmJsonSerializer().
serialize(security_alarm)))
return alarm_models.SecurityAlarmResourceCollection(
security_alarms=alarms_resource)
class SecurityAlarmController(base.BaseController):
_custom_actions = {
'tickets': ['PUT']
}
def __init__(self, alarm_id):
super(SecurityAlarmController, self).__init__()
pecan.request.context['alarm_id'] = alarm_id
self._uuid = alarm_id
def get_security_alarm(self, alarm_id):
try:
security_alarm = db.security_alarm_get(alarm_id)
except Exception as e:
LOG.exception(e)
raise errors.DbError(
"Security alarm %s could not be retrieved" % alarm_id
)
return security_alarm
@wsme_pecan.wsexpose(alarm_models.SecurityAlarmResource,
wtypes.text)
def get(self):
"""Get security alarm in db
:return: a security alarm
:raises:
HTTPNotFound: Alarm not found or any database error
"""
try:
security_alarm = self.get_security_alarm(self._uuid)
except errors.DbError:
raise exc.HTTPNotFound()
s_alarm = models.SecurityAlarmJsonSerializer().\
serialize(security_alarm)
return alarm_models.SecurityAlarmResource(initial_data=s_alarm)
@pecan.expose("json")
def tickets(self, ticket_id):
"""Modify the ticket id associated with a security alarm in db.
:param ticket_id: the ticket_id to store in db.
:raises:
HTTPNotFound: Alarm not found or any database error
"""
try:
db.security_alarm_update_ticket_id(self._uuid, ticket_id)
except Exception:
raise exc.HTTPNotFound()


@@ -1,144 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import pecan
from webob import exc
from wsme import types as wtypes
import wsmeext.pecan as wsme_pecan
from cerberus.api.v1.controllers import base
from cerberus.api.v1.datamodels import security_report as report_models
from cerberus.common import errors
from cerberus import db
from cerberus.db.sqlalchemy import models
from cerberus.openstack.common import log
LOG = log.getLogger(__name__)
class SecurityReportsController(base.BaseController):
@pecan.expose()
def _lookup(self, report_id, *remainder):
return SecurityReportController(report_id), remainder
def list_security_reports(self, project_id=None):
""" List all the security reports of all projects or just one. """
try:
security_reports = db.security_report_get_all(
project_id=project_id)
except Exception as e:
LOG.exception(e)
raise errors.DbError(
"Security reports could not be retrieved"
)
return security_reports
@wsme_pecan.wsexpose(report_models.SecurityReportResourceCollection)
def get_all(self):
""" Get stored security reports.
:return: list of security reports
:raises:
HTTPNotFound: Any database error
"""
ctx = pecan.request.context
try:
if ctx.is_admin:
security_reports = self.list_security_reports()
else:
security_reports = self.list_security_reports(ctx.tenant_id)
except errors.DbError:
raise exc.HTTPNotFound()
reports_resource = []
# todo(eglamn3) : no need to serialize here
for security_report in security_reports:
reports_resource.append(
report_models.SecurityReportResource(
models.SecurityReportJsonSerializer().
serialize(security_report)))
return report_models.SecurityReportResourceCollection(
security_reports=reports_resource)
class SecurityReportController(base.BaseController):
_custom_actions = {
'tickets': ['PUT']
}
def __init__(self, uuid):
super(SecurityReportController, self).__init__()
pecan.request.context['uuid'] = uuid
self._id = uuid
def get_security_report(self, uuid):
try:
security_report = db.security_report_get(uuid)
except Exception as e:
LOG.exception(e)
raise errors.DbError(
"Security report %s could not be retrieved" % uuid
)
return security_report
@wsme_pecan.wsexpose(report_models.SecurityReportResource,
wtypes.text)
def get(self):
"""Get security report in db.
:return: a security report
:raises:
HTTPNotFound: Report not found or any database error
"""
try:
security_report = self.get_security_report(self._id)
except errors.DbError:
raise exc.HTTPNotFound()
if security_report is None:
raise exc.HTTPNotFound()
s_report = models.SecurityReportJsonSerializer().\
serialize(security_report)
return report_models.SecurityReportResource(initial_data=s_report)
@pecan.expose("json")
def tickets(self, ticket_id):
"""Modify the ticket id associated with a security report in db.
:param ticket_id: the ticket_id to store in db.
:raises:
HTTPNotFound: Report not found or any database error
"""
try:
db.security_report_update_ticket_id(self._id, ticket_id)
except Exception:
raise exc.HTTPNotFound()
@wsme_pecan.wsexpose()
def delete(self):
"""Delete the security report stored in db.
:raises:
HTTPNotFound: Report not found or any database error
"""
try:
db.security_report_delete(self._id)
except Exception as e:
LOG.exception(e)
raise exc.HTTPNotFound()


@@ -1,236 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import json
import pecan
from webob import exc
import wsme
from wsme import types as wtypes
import wsmeext.pecan as wsme_pecan
from oslo.messaging import rpc
from cerberus.api.v1.controllers import base
from cerberus.api.v1.datamodels import task as task_models
from cerberus.openstack.common import log
LOG = log.getLogger(__name__)
action_kind = ["stop", "start", "force_delete"]
action_kind_enum = wtypes.Enum(str, *action_kind)
class ActionController(base.BaseController):
_custom_actions = {
'stop': ['POST'],
'force_delete': ['POST'],
'start': ['POST'],
}
@wsme_pecan.wsexpose(None, wtypes.text)
def stop(self, task_id):
"""Stop task
:raises:
HTTPBadRequest: task not found or impossible to stop it
"""
try:
self.stop_task(task_id)
except rpc.RemoteError:
raise exc.HTTPBadRequest(
explanation="Task can not be stopped")
@wsme_pecan.wsexpose(None, wtypes.text)
def force_delete(self, task_id):
"""Force delete task
:raises:
HTTPNotFound: task is not found
"""
try:
self.force_delete_task(task_id)
except rpc.RemoteError as e:
raise exc.HTTPNotFound(explanation=e.value)
@wsme_pecan.wsexpose(None, wtypes.text)
def start(self, task_id):
"""Start task
:raises:
HTTPBadRequest: task not found or impossible to start it
"""
try:
self.start_task(task_id)
except rpc.RemoteError as e:
raise exc.HTTPBadRequest(explanation=e.value)
def stop_task(self, task_id):
ctx = pecan.request.context.to_dict()
try:
self.client.call(ctx, 'stop_task', task_id=task_id)
except rpc.RemoteError as e:
LOG.exception(e)
raise
def force_delete_task(self, task_id):
ctx = pecan.request.context.to_dict()
try:
self.client.call(ctx,
'force_delete_recurrent_task',
task_id=task_id)
except rpc.RemoteError as e:
LOG.exception(e)
raise
def start_task(self, task_id):
ctx = pecan.request.context.to_dict()
try:
self.client.call(ctx,
'start_recurrent_task',
task_id=task_id)
except rpc.RemoteError as e:
LOG.exception(e)
raise
class TasksController(base.BaseController):
action = ActionController()
def __init__(self):
super(TasksController, self).__init__()
def list_tasks(self):
ctx = pecan.request.context.to_dict()
try:
tasks = self.client.call(ctx, 'get_tasks')
except rpc.RemoteError as e:
LOG.exception(e)
raise
tasks_resource = []
for task in tasks:
tasks_resource.append(
task_models.TaskResource(json.loads(task)))
return task_models.TaskResourceCollection(tasks=tasks_resource)
@wsme_pecan.wsexpose(task_models.TaskResourceCollection)
def get_all(self):
""" List tasks handled by Cerberus Manager.
:return: list of tasks
:raises:
HTTPServiceUnavailable: an error occurred in Cerberus Manager or
the service is unavailable
"""
try:
tasks = self.list_tasks()
except rpc.RemoteError:
raise exc.HTTPServiceUnavailable()
return tasks
def get_task(self, task_id):
ctx = pecan.request.context.to_dict()
try:
task = self.client.call(ctx, 'get_task', task_id=task_id)
except rpc.RemoteError as e:
LOG.exception(e)
raise
return json.loads(task)
@wsme_pecan.wsexpose(task_models.TaskResource,
wtypes.text)
def get(self, task_id):
""" Get details of a task
:return: task details
:raises:
HTTPNotFound: task is not found
"""
try:
task = self.get_task(task_id)
except rpc.RemoteError:
raise exc.HTTPNotFound()
except Exception as e:
LOG.exception(e)
raise exc.HTTPNotFound()
return task_models.TaskResource(initial_data=task)
def create_task(self, task):
ctx = pecan.request.context.to_dict()
try:
if task.period is wsme.Unset:
task.period = None
task.id = self.client.call(
ctx,
'create_task',
plugin_id=task.plugin_id,
method_=task.method,
task_period=task.period,
task_name=task.name,
task_type=task.type,
persistent=task.persistent
)
except rpc.RemoteError as e:
LOG.exception(e)
raise
return task
@wsme_pecan.wsexpose(task_models.TaskResource,
body=task_models.TaskResource)
def post(self, task):
"""Create a task
:return: task details
:raises:
HTTPBadRequest: invalid task or the task could not be created
"""
try:
task = self.create_task(task)
except rpc.RemoteError as e:
LOG.exception(e)
raise exc.HTTPBadRequest(explanation=e.value)
except Exception as e:
LOG.exception(e)
raise exc.HTTPBadRequest()
return task
@wsme_pecan.wsexpose(None, wtypes.text)
def delete(self, task_id):
"""Delete a task
:raises:
HTTPNotFound: task does not exist
"""
try:
self.delete_task(task_id)
except rpc.RemoteError as e:
raise exc.HTTPNotFound(explanation=e.value)
except Exception as e:
LOG.exception(e)
raise
def delete_task(self, task_id):
ctx = pecan.request.context.to_dict()
try:
self.client.call(ctx, 'delete_recurrent_task', task_id=task_id)
except rpc.RemoteError as e:
LOG.exception(e)
raise
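# Illustrative sketch (not part of the original module): how a client might
# exercise these controllers over HTTP with python-requests. The base URL,
# the "/v1/tasks" mount point and the Keystone token header are assumptions
# about the deployment, not taken from this file.
def _example_create_task():
    import requests
    cerberus_url = "http://localhost:8300"          # assumed endpoint
    headers = {"X-Auth-Token": "<keystone-token>"}  # assumed auth middleware
    body = {"name": "scan_images",
            "plugin_id": "063d4206-5afc-409c-a4d1-c2a469299d37",
            "method": "process_image",
            "type": "recurrent",
            "period": 60,
            "persistent": True}
    # POST maps to TasksController.post(); stop/start/force_delete are
    # exposed as POST sub-resources through ActionController.
    return requests.post(cerberus_url + "/v1/tasks",
                         json=body, headers=headers).json()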

View File

@ -1,26 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import wsme
from wsme import types as wtypes
class Base(wtypes.Base):
def as_dict_from_keys(self, keys):
return dict((k, getattr(self, k))
for k in keys
if hasattr(self, k) and
getattr(self, k) != wsme.Unset)
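# Minimal sketch (not part of the original module): attributes that are still
# wsme.Unset are dropped by as_dict_from_keys().
class _SampleResource(Base):
    name = wtypes.text
    value = wtypes.text
# _SampleResource(name='x').as_dict_from_keys(['name', 'value'])
# -> {'name': 'x'}   ('value' was never set, so it is skipped)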

View File

@ -1,87 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
#
import decimal
from cerberus.api.v1.datamodels import base
from wsme import types as wtypes
class PluginResource(base.Base):
"""Type describing a plugin.
"""
name = wtypes.text
"""Name of the plugin."""
id = wtypes.IntegerType()
"""Id of the plugin."""
uuid = wtypes.text
"""Uuid of the plugin."""
methods = [wtypes.text]
"""Hook methods."""
version = wtypes.text
"""Version of the plugin."""
provider = wtypes.text
"""Provider of the plugin."""
subscribed_events = [wtypes.text]
"""Subscribed events of the plugin."""
type = wtypes.text
"""Type of the plugin."""
tool_name = wtypes.text
"""Tool name of the plugin."""
description = wtypes.text
"""Description of the plugin."""
def as_dict(self):
return self.as_dict_from_keys(['name', 'id', 'uuid', 'methods',
'version', 'provider',
'subscribed_events', 'type',
'tool_name', 'description'])
def __init__(self, initial_data):
super(PluginResource, self).__init__()
for key in initial_data:
setattr(self, key, initial_data[key])
@classmethod
def sample(cls):
sample = cls(initial_data={
'name': 'some_plugin',
'version': '2015.1',
'tool_name': 'some_tool',
'provider': 'some_provider',
'type': 'scanner',
'id': decimal.Decimal(1),
'uuid': '063d4206-5afc-409c-a4d1-c2a469299d37',
'methods': ['method_1', 'method_2'],
'subscribed_events': ['image.update']})
return sample
class PluginResourceCollection(base.Base):
"""A list of Plugins."""
plugins = [PluginResource]

View File

@ -1,94 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import datetime
import decimal
from cerberus.api.v1.datamodels import base
from wsme import types as wtypes
class SecurityAlarmResource(base.Base):
""" Representation of a security alarm.
"""
id = wtypes.IntegerType()
"""Security alarm id."""
plugin_id = wtypes.wsattr(wtypes.text)
"""Associated plugin id."""
alarm_id = wtypes.wsattr(wtypes.text)
"""Associated alarm id."""
timestamp = datetime.datetime
"""creation date."""
status = wtypes.wsattr(wtypes.text)
"""Status."""
severity = wtypes.wsattr(wtypes.text)
"""Severity."""
project_id = wtypes.wsattr(wtypes.text)
"""Associated project id."""
component_id = wtypes.wsattr(wtypes.text)
"""Component id."""
summary = wtypes.wsattr(wtypes.text)
"""Summary."""
description = wtypes.wsattr(wtypes.text)
"""Description."""
ticket_id = wtypes.wsattr(wtypes.text)
"""Associated ticket id."""
def as_dict(self):
return self.as_dict_from_keys(
['id', 'plugin_id', 'alarm_id', 'timestamp',
'status', 'severity', 'component_id', 'project_id',
'summary', 'description', 'ticket_id']
)
def __init__(self, initial_data=None):
super(SecurityAlarmResource, self).__init__()
if initial_data is not None:
for key in initial_data:
setattr(self, key, initial_data[key])
@classmethod
def sample(cls):
sample = cls(initial_data={
'id': decimal.Decimal(1),
'plugin_id': '927c8435-f81f-468a-92cb-ebb08ed0fad2',
'alarm_id': 'fea4b170-ed46-4a50-8b91-ed1c6876be7d',
'timestamp': datetime.datetime(2015, 3, 24, 9, 50, 50, 577840),
'status': 'new',
'severity': 'critical',
'project_id': 'e845a1f2004847e4ac14cb1732a2e75f',
'component_id': '4b75699f7a9649438932bebdbf9711e0',
'summary': 'Several attempts to log failed',
'description': 'Apache suffered an attack by brute force.'
' Thousands of attempts to log failed'})
return sample
class SecurityAlarmResourceCollection(base.Base):
"""A list of Security alarms."""
security_alarms = [SecurityAlarmResource]

View File

@ -1,113 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import datetime
import uuid
from cerberus.api.v1.datamodels import base
from wsme import types as wtypes
class SecurityReportResource(base.Base):
""" Representation of a security report.
"""
uuid = wtypes.wsattr(wtypes.text)
"""Security report id."""
plugin_id = wtypes.wsattr(wtypes.text)
"""Associated plugin id."""
report_id = wtypes.wsattr(wtypes.text)
"""Associated report id provided by plugin."""
component_id = wtypes.wsattr(wtypes.text)
"""Associated component id."""
component_type = wtypes.wsattr(wtypes.text)
"""Component type."""
component_name = wtypes.wsattr(wtypes.text)
"""Component name."""
project_id = wtypes.wsattr(wtypes.text)
"""Associated project id."""
title = wtypes.wsattr(wtypes.text)
"""Title of report."""
description = wtypes.wsattr(wtypes.text)
"""Description."""
security_rating = float
"""Security rating."""
vulnerabilities = wtypes.wsattr(wtypes.text)
"""Vulnerabilities."""
vulnerabilities_number = wtypes.IntegerType()
"""Total of Vulnerabilities."""
last_report_date = datetime.datetime
"""Last report date."""
ticket_id = wtypes.wsattr(wtypes.text, mandatory=True)
"""Associated ticket id."""
def as_dict(self):
return self.as_dict_from_keys(
['uuid', 'plugin_id', 'report_id', 'component_id',
'component_type', 'component_name', 'project_id',
'title', 'description', 'security_rating',
'vulnerabilities', 'vulnerabilities_number',
'last_report_date', 'ticket_id']
)
def __init__(self, initial_data=None):
super(SecurityReportResource, self).__init__()
if initial_data is not None:
for key in initial_data:
setattr(self, key, initial_data[key])
@classmethod
def sample(cls):
sample = cls(initial_data={
'uuid': str(uuid.uuid4()),
'security_rating': float(7.4),
'component_name': 'openstack-server',
'component_id': 'a1d869a1-6ab0-4f02-9e56-f83034bacfcb',
'component_type': 'instance',
'vulnerabilities_number': '2',
'description': 'security report',
'title': 'Security report',
'last_report_date': datetime.datetime(2015, 5, 6, 16, 19, 29),
'project_id': '510c7f4ed14243f09df371bba2561177',
'plugin_id': '063d4206-5afc-409c-a4d1-c2a469299d37',
'report_id': 'fea4b170-ed46-4a50-8b91-ed1c6876be7d',
'vulnerabilities': '{"443": {"archived": "false", '
'"protocol": "tcp", "family": "Web Servers", '
'"iface_id": 329, '
'"plugin": "1.3.6.1.4.1.25623.1.0.10386",'
'"ip": "192.168.100.3", "id": 443,'
'"output": "Summary": "Remote web server does'
' not reply with 404 error code"}}'})
return sample
class SecurityReportResourceCollection(base.Base):
"""A list of Security reports."""
security_reports = [SecurityReportResource]

View File

@ -1,76 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import decimal
from cerberus.api.v1.datamodels import base
from wsme import types as wtypes
class TaskResource(base.Base):
""" Representation of a task.
"""
name = wtypes.wsattr(wtypes.text, default="unknown")
"""Name of the task."""
period = wtypes.IntegerType()
"""Period if periodic."""
method = wtypes.wsattr(wtypes.text, mandatory=True)
"""Hook methods."""
state = wtypes.wsattr(wtypes.text)
"""Running or not."""
id = wtypes.IntegerType()
"""Associated task id."""
plugin_id = wtypes.wsattr(wtypes.text, mandatory=True)
"""Associated plugin id."""
type = wtypes.wsattr(wtypes.text, default="unique")
"""Type of the task."""
persistent = wtypes.wsattr(bool, default=False)
"""If task must persist."""
def as_dict(self):
return self.as_dict_from_keys(['name', 'period', 'method', 'state',
'id', 'plugin_id', 'type',
'persistent'])
def __init__(self, initial_data=None):
super(TaskResource, self).__init__()
if initial_data is not None:
for key in initial_data:
setattr(self, key, initial_data[key])
@classmethod
def sample(cls):
sample = cls(initial_data={
'name': 'some_task',
'period': decimal.Decimal(3),
'persistent': True,
'state': 'running',
'plugin_id': '063d4206-5afc-409c-a4d1-c2a469299d37',
'type': 'recurrent',
'id': '4820cea8-e88e-463b-ae1f-6bbde009cc93'})
return sample
class TaskResourceCollection(base.Base):
"""A list of Tasks."""
tasks = [TaskResource]

View File

@ -1,15 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

View File

@ -1,65 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import functools
from keystoneclient.v2_0 import client as keystone_client_v2_0
from oslo.config import cfg
from cerberus.openstack.common import log
cfg.CONF.import_group('service_credentials', 'cerberus.service')
LOG = log.getLogger(__name__)
def logged(func):
@functools.wraps(func)
def with_logging(*args, **kwargs):
try:
return func(*args, **kwargs)
except Exception as e:
LOG.exception(e)
raise
return with_logging
class Client(object):
"""A client which gets information via python-keystoneclient."""
def __init__(self):
"""Initialize a keystone client object."""
conf = cfg.CONF.service_credentials
self.keystone_client_v2_0 = keystone_client_v2_0.Client(
username=conf.os_username,
password=conf.os_password,
tenant_name=conf.os_tenant_name,
auth_url=conf.os_auth_url,
region_name=conf.os_region_name,
)
@logged
def user_detail_get(self, user):
"""Returns details for a user."""
return self.keystone_client_v2_0.users.get(user)
@logged
def roles_for_user(self, user, tenant=None):
"""Returns role for a given id."""
return self.keystone_client_v2_0.roles.roles_for_user(user, tenant)
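# Usage sketch (not part of the original module). The wrapper reads the
# [service_credentials] config section, so cfg.CONF must be parsed before the
# client is built; the user and tenant ids below are made-up examples.
def _example_keystone_usage():
    client = Client()
    user = client.user_detail_get('9f1c7b5e6f6a4d1a8f2f0b5c3d4e5f6a')
    roles = client.roles_for_user(user.id,
                                  tenant='e845a1f2004847e4ac14cb1732a2e75f')
    return [role.name for role in roles]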

View File

@ -1,109 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import functools
from neutronclient.v2_0 import client as neutron_client
from oslo.config import cfg
from cerberus.openstack.common import log
cfg.CONF.import_group('service_credentials', 'cerberus.service')
LOG = log.getLogger(__name__)
def logged(func):
@functools.wraps(func)
def with_logging(*args, **kwargs):
try:
return func(*args, **kwargs)
except Exception as e:
LOG.exception(e)
raise
return with_logging
class Client(object):
"""A client which gets information via python-neutronclient."""
def __init__(self):
"""Initialize a neutron client object."""
conf = cfg.CONF.service_credentials
self.neutronClient = neutron_client.Client(
username=conf.os_username,
password=conf.os_password,
tenant_name=conf.os_tenant_name,
auth_url=conf.os_auth_url,
)
@logged
def list_networks(self, tenant_id):
"""Returns the list of networks of a given tenant"""
return self.neutronClient.list_networks(
tenant_id=tenant_id).get("networks", None)
@logged
def list_floatingips(self, tenant_id):
"""Returns the list of networks of a given tenant"""
return self.neutronClient.list_floatingips(
tenant_id=tenant_id).get("floatingips", None)
@logged
def list_associated_floatingips(self, **params):
"""Returns the list of associated floating ips of a given tenant"""
floating_ips = self.neutronClient.list_floatingips(
**params).get("floatingips", None)
# A floating IP is an IP address on an external network, which is
# associated with a specific port, and optionally a specific IP
# address, on a private OpenStack Networking network. Therefore a
# floating IP allows access to an instance on a private network from an
# external network. Floating IPs can only be defined on networks for
# which the attribute router:external (by the external network
# extension) has been set to True.
associated_floating_ips = []
for floating_ip in floating_ips:
if floating_ip.get("port_id") is not None:
associated_floating_ips.append(floating_ip)
return associated_floating_ips
@logged
def net_ips_get(self, network_id):
"""
Return ip pools used in all subnets of a network
:param network_id:
:return: list of pools
"""
subnets = self.neutronClient.show_network(
network_id)["network"]["subnets"]
ips = []
for subnet in subnets:
ips.append(self.subnet_ips_get(subnet))
return ips
@logged
def get_net_of_subnet(self, subnet_id):
return self.neutronClient.show_subnet(
subnet_id)["subnet"]["network_id"]
@logged
def subnet_ips_get(self, subnet_id):
"""Returns the ip pool (allocation pools) of a subnet."""
return self.neutronClient.show_subnet(
subnet_id)["subnet"]["allocation_pools"]

View File

@ -1,109 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import functools
from novaclient.v3 import client as nova_client
from oslo.config import cfg
from cerberus.openstack.common import log
OPTS = [
cfg.BoolOpt('nova_http_log_debug',
default=False,
help='Allow novaclient\'s debug log output.'),
]
SERVICE_OPTS = [
cfg.StrOpt('nova',
default='compute',
help='Nova service type.'),
]
cfg.CONF.register_opts(OPTS)
cfg.CONF.register_opts(SERVICE_OPTS, group='service_types')
# cfg.CONF.import_opt('http_timeout', 'cerberus.service')
cfg.CONF.import_group('service_credentials', 'cerberus.service')
LOG = log.getLogger(__name__)
def logged(func):
@functools.wraps(func)
def with_logging(*args, **kwargs):
try:
return func(*args, **kwargs)
except Exception as e:
LOG.exception(e)
raise
return with_logging
class Client(object):
"""A client which gets information via python-novaclient."""
def __init__(self, bypass_url=None, auth_token=None):
"""Initialize a nova client object."""
conf = cfg.CONF.service_credentials
tenant = conf.os_tenant_id or conf.os_tenant_name
self.nova_client = nova_client.Client(
username=conf.os_username,
project_id=tenant,
auth_url=conf.os_auth_url,
password=conf.os_password,
region_name=conf.os_region_name,
endpoint_type=conf.os_endpoint_type,
service_type=cfg.CONF.service_types.nova,
bypass_url=bypass_url,
cacert=conf.os_cacert,
insecure=conf.insecure,
http_log_debug=cfg.CONF.nova_http_log_debug,
no_cache=True)
@logged
def instance_get_all(self):
"""Returns list of all instances."""
search_opts = {'all_tenants': True}
return self.nova_client.servers.list(
detailed=True,
search_opts=search_opts)
@logged
def get_instance_details_from_floating_ip(self, ip):
"""
Get the instance associated with the floating IP "ip".
:param ip: the floating IP that should belong to an instance
:return: the instance if the IP is found, else None
"""
instances = self.instance_get_all()
try:
for instance in instances:
# An instance can belong to many networks. An instance can
# have two ips in a network:
# at least a private ip and potentially a floating ip
addresses_in_networks = instance.addresses.values()
for addresses_in_network in addresses_in_networks:
for address_in_network in addresses_in_network:
if ((address_in_network.get('OS-EXT-IPS:type', None)
== 'floating')
and (address_in_network['addr'] == ip)):
return instance
except Exception as e:
LOG.exception(e)
raise
return None
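# Usage sketch (not part of the original module): map a floating IP back to
# the server holding it; the address is a made-up example.
def _example_nova_usage():
    client = Client()
    instance = client.get_instance_details_from_floating_ip('192.168.100.3')
    return None if instance is None else (instance.id, instance.name)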

View File

@ -1,15 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

View File

@ -1,42 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import sys
from oslo.config import cfg
from cerberus.common import config
from cerberus import manager
from cerberus.openstack.common import log
from cerberus.openstack.common import service
LOG = log.getLogger(__name__)
def main():
log.set_defaults(cfg.CONF.default_log_levels)
argv = sys.argv
config.parse_args(argv)
log.setup(cfg.CONF, 'cerberus')
launcher = service.ProcessLauncher()
c_manager = manager.CerberusManager()
launcher.launch_service(c_manager)
launcher.wait()
if __name__ == '__main__':
main()

View File

@ -1,45 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import sys
from oslo.config import cfg
from cerberus.api import app
from cerberus.common import config
from cerberus.openstack.common import log
CONF = cfg.CONF
CONF.import_opt('auth_strategy', 'cerberus.api')
LOG = log.getLogger(__name__)
def main():
argv = sys.argv
config.parse_args(argv)
log.setup(cfg.CONF, 'cerberus')
server = app.build_server()
log.set_defaults(cfg.CONF.default_log_levels)
LOG.info("cerberus-api starting...")
try:
server.serve_forever()
except KeyboardInterrupt:
pass
if __name__ == '__main__':
main()

View File

@ -1,40 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import sys
from oslo.config import cfg
from sqlalchemy import create_engine
from cerberus.common import config
def main():
argv = sys.argv
config.parse_args(argv)
engine = create_engine(cfg.CONF.database.connection)
conn = engine.connect()
try:
conn.execute("CREATE DATABASE cerberus")
except Exception:
pass
conn.close()
if __name__ == '__main__':
main()

View File

@ -1,110 +0,0 @@
# -*- encoding: utf-8 -*-
#
# Copyright 2013 Hewlett-Packard Development Company, L.P.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""
Run storage database migration.
"""
import sys
from oslo.config import cfg
from cerberus.db import migration
from cerberus import service
CONF = cfg.CONF
class DBCommand(object):
def upgrade(self):
migration.upgrade(CONF.command.revision)
def downgrade(self):
migration.downgrade(CONF.command.revision)
def revision(self):
migration.revision(CONF.command.message, CONF.command.autogenerate)
def stamp(self):
migration.stamp(CONF.command.revision)
def version(self):
print(migration.version())
def create_schema(self):
migration.create_schema()
def add_command_parsers(subparsers):
command_object = DBCommand()
parser = subparsers.add_parser('upgrade',
help="Upgrade the database schema to the latest version. "
"Optionally, use --revision to specify an alembic revision "
"string to upgrade to.")
parser.set_defaults(func=command_object.upgrade)
parser.add_argument('--revision', nargs='?')
parser = subparsers.add_parser('downgrade',
help="Downgrade the database schema to the oldest revision. "
"While optional, one should generally use --revision to "
"specify the alembic revision string to downgrade to.")
parser.set_defaults(func=command_object.downgrade)
parser.add_argument('--revision', nargs='?')
parser = subparsers.add_parser('stamp')
parser.add_argument('--revision', nargs='?')
parser.set_defaults(func=command_object.stamp)
parser = subparsers.add_parser('revision',
help="Create a new alembic revision. "
"Use --message to set the message string.")
parser.add_argument('-m', '--message')
parser.add_argument('--autogenerate', action='store_true')
parser.set_defaults(func=command_object.revision)
parser = subparsers.add_parser('version',
help="Print the current version information and exit.")
parser.set_defaults(func=command_object.version)
parser = subparsers.add_parser('create_schema',
help="Create the database schema.")
parser.set_defaults(func=command_object.create_schema)
command_opt = cfg.SubCommandOpt('command',
title='Command',
help='Available commands',
handler=add_command_parsers)
CONF.register_cli_opt(command_opt)
def main():
# This is a hack to keep compatibility with the previous usage of the dbsync
# command; please switch to an explicit 'upgrade' subcommand.
valid_commands = set([
'upgrade', 'downgrade', 'revision',
'version', 'stamp', 'create_schema',
])
if not set(sys.argv) & valid_commands:
sys.argv.append('upgrade')
service.prepare_service(sys.argv)
CONF.command.func()
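# Typical invocations of this entry point, assuming setup.cfg wires main() to
# a "cerberus-dbsync" console script (the packaging is not shown in this
# change); the revision id is a made-up example:
#
#     cerberus-dbsync upgrade
#     cerberus-dbsync upgrade --revision 3a1e2f4b5c6d
#     cerberus-dbsync revision -m "add security_reports table" --autogenerate
#     cerberus-dbsync version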

View File

@ -1,15 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

View File

@ -1,147 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import functools
import json
import kombu
import logging
from oslo.messaging._drivers import amqp as rpc_amqp
from oslo.messaging._drivers import amqpdriver
from oslo.messaging._drivers import common as rpc_common
from oslo.messaging._drivers import impl_rabbit
from oslo.messaging.openstack.common.gettextutils import _ # noqa
LOG = logging.getLogger(__name__)
def _get_queue_arguments(conf):
"""Construct the arguments for declaring a queue.
If the rabbit_ha_queues option is set, we declare a mirrored queue
as described here:
http://www.rabbitmq.com/ha.html
Setting x-ha-policy to all means that the queue will be mirrored
to all nodes in the cluster.
"""
return {'x-ha-policy': 'all'} if conf.rabbit_ha_queues else {}
class CerberusRabbitMessage(dict):
def __init__(self, raw_message):
if isinstance(raw_message.payload, unicode):
message = rpc_common.deserialize_msg(
json.loads(raw_message.payload))
else:
message = rpc_common.deserialize_msg(raw_message.payload)
super(CerberusRabbitMessage, self).__init__(message)
self._raw_message = raw_message
def acknowledge(self):
self._raw_message.ack()
def requeue(self):
self._raw_message.requeue()
class CerberusConsumerBase(impl_rabbit.ConsumerBase):
def _callback_handler(self, message, callback):
"""Call callback with deserialized message.
Messages that are processed and ack'ed.
"""
try:
callback(CerberusRabbitMessage(message))
except Exception:
LOG.exception(_("Failed to process message"
" ... skipping it."))
message.ack()
class CerberusTopicConsumer(CerberusConsumerBase):
"""Consumer class for 'topic'."""
def __init__(self, conf, channel, topic, callback, tag, exchange_name,
name=None, **kwargs):
"""Init a 'topic' queue.
:param channel: the amqp channel to use
:param topic: the topic to listen on
:paramtype topic: str
:param callback: the callback to call when messages are received
:param tag: a unique ID for the consumer on the channel
:param exchange_name: the exchange name to use
:param name: optional queue name, defaults to topic
:paramtype name: str
Other kombu options may be passed as keyword arguments
"""
# Default options
options = {'durable': conf.amqp_durable_queues,
'queue_arguments': _get_queue_arguments(conf),
'auto_delete': conf.amqp_auto_delete,
'exclusive': False}
options.update(kwargs)
exchange = kombu.entity.Exchange(name=exchange_name,
type='topic',
durable=options['durable'],
auto_delete=options['auto_delete'])
super(CerberusTopicConsumer, self).__init__(channel,
callback,
tag,
name=name or topic,
exchange=exchange,
routing_key=topic,
**options)
class CerberusConnection(impl_rabbit.Connection):
def __init__(self, conf, url):
super(CerberusConnection, self).__init__(conf, url)
def declare_topic_consumer(self, exchange_name, topic, callback=None,
queue_name=None):
"""Create a 'topic' consumer."""
self.declare_consumer(functools.partial(CerberusTopicConsumer,
name=queue_name,
exchange_name=exchange_name,
),
topic, callback)
class CerberusRabbitDriver(amqpdriver.AMQPDriverBase):
def __init__(self, conf, url,
default_exchange=None,
allowed_remote_exmods=None):
conf.register_opts(impl_rabbit.rabbit_opts)
conf.register_opts(rpc_amqp.amqp_opts)
connection_pool = rpc_amqp.get_connection_pool(conf,
url,
CerberusConnection)
super(CerberusRabbitDriver, self).__init__(conf, url,
connection_pool,
default_exchange,
allowed_remote_exmods)

View File

@ -1,26 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from oslo.config import cfg
from cerberus import version
def parse_args(argv, default_config_files=None):
cfg.CONF(argv[1:],
project='cerberus',
version=version.version_info.release_string(),
default_config_files=default_config_files)

View File

@ -1,64 +0,0 @@
# -*- encoding: utf-8 -*-
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from cerberus.openstack.common import context
class RequestContext(context.RequestContext):
"""Extends security contexts from the OpenStack common library."""
def __init__(self, auth_token=None, domain_id=None, domain_name=None,
user=None, tenant_id=None, tenant=None, is_admin=False,
is_public_api=False, read_only=False, show_deleted=False,
request_id=None, roles=None):
"""Stores several additional request parameters:
:param domain_id: The ID of the domain.
:param domain_name: The name of the domain.
:param is_public_api: Specifies whether the request should be processed
without authentication.
"""
self.tenant_id = tenant_id
self.is_public_api = is_public_api
self.domain_id = domain_id
self.domain_name = domain_name
self.roles = roles or []
super(RequestContext, self).__init__(auth_token=auth_token,
user=user, tenant=tenant,
is_admin=is_admin,
read_only=read_only,
show_deleted=show_deleted,
request_id=request_id)
def to_dict(self):
return {'auth_token': self.auth_token,
'user': self.user,
'tenant_id': self.tenant_id,
'tenant': self.tenant,
'is_admin': self.is_admin,
'read_only': self.read_only,
'show_deleted': self.show_deleted,
'request_id': self.request_id,
'domain_id': self.domain_id,
'roles': self.roles,
'domain_name': self.domain_name,
'is_public_api': self.is_public_api}
@classmethod
def from_dict(cls, values):
values.pop('user', None)
values.pop('tenant', None)
return cls(**values)
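# Round-trip sketch (not part of the original module): this mirrors how the
# API controllers serialize the context before an RPC call
# (pecan.request.context.to_dict()).
def _example_context_roundtrip():
    ctx = RequestContext(auth_token='tok', user='admin',
                         tenant_id='e845a1f2004847e4ac14cb1732a2e75f',
                         roles=['admin'])
    payload = ctx.to_dict()                 # plain dict, safe to serialize
    restored = RequestContext.from_dict(payload)
    return restored.tenant_id == ctx.tenant_id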

View File

@ -1,124 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from cerberus.openstack.common.gettextutils import _ # noqa
class InvalidOperation(Exception):
def __init__(self, description):
super(InvalidOperation, self).__init__(description)
class PluginNotFound(InvalidOperation):
def __init__(self, uuid):
super(PluginNotFound, self).__init__("Plugin %s does not exist"
% str(uuid))
class TaskPeriodNotInteger(InvalidOperation):
def __init__(self):
super(TaskPeriodNotInteger, self).__init__(
"The period of the task must be provided as an integer"
)
class TaskNotFound(InvalidOperation):
def __init__(self, _id):
super(TaskNotFound, self).__init__(
_('Task %s does not exist') % _id
)
class TaskDeletionNotAllowed(InvalidOperation):
def __init__(self, _id):
super(TaskDeletionNotAllowed, self).__init__(
_("Deletion of task %s is not allowed because either it "
"does not exist or it is not recurrent") % _id
)
class TaskStartNotAllowed(InvalidOperation):
def __init__(self, _id):
super(TaskStartNotAllowed, self).__init__(
_("Starting task %s is not allowed because either it "
"does not exist or it is not recurrent") % _id
)
class TaskStartNotPossible(InvalidOperation):
def __init__(self, _id):
super(TaskStartNotPossible, self).__init__(
_("Starting task %s is not possible because it is running") % _id
)
class MethodNotString(InvalidOperation):
def __init__(self):
super(MethodNotString, self).__init__(
"Method must be provided as a string"
)
class MethodNotCallable(InvalidOperation):
def __init__(self, method, name):
super(MethodNotCallable, self).__init__(
"Method named %s is not callable by plugin %s"
% (str(method), str(name))
)
class TaskObjectNotProvided(InvalidOperation):
def __init__(self):
super(TaskObjectNotProvided, self).__init__(
"Task object not provided in request"
)
class PluginIdNotProvided(InvalidOperation):
def __init__(self):
super(PluginIdNotProvided, self).__init__(
"Plugin id not provided in request"
)
class MethodNotProvided(InvalidOperation):
def __init__(self):
super(MethodNotProvided, self).__init__(
"Method not provided in request"
)
class PolicyEnforcementError(Exception):
def __init__(self):
super(PolicyEnforcementError, self).__init__(
"Policy enforcement error"
)
class DbError(Exception):
def __init__(self, description):
super(DbError, self).__init__(description)

View File

@ -1,161 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
"""Cerberus base exception handling.
Includes decorator for re-raising Cerberus-type exceptions.
SHOULD include dedicated exception logging.
"""
import functools
import logging
import sys
import webob.exc
from oslo.config import cfg
from cerberus.common import safe_utils
from cerberus.openstack.common import excutils
from cerberus.openstack.common.gettextutils import _ # noqa
LOG = logging.getLogger(__name__)
exc_log_opts = [
cfg.BoolOpt('fatal_exception_format_errors',
default=False,
help='Make exception message format errors fatal'),
]
CONF = cfg.CONF
CONF.register_opts(exc_log_opts)
class ConvertedException(webob.exc.WSGIHTTPException):
def __init__(self, code=0, title="", explanation=""):
self.code = code
self.title = title
self.explanation = explanation
super(ConvertedException, self).__init__()
def _cleanse_dict(original):
"""Strip all admin_password, new_pass, rescue_pass keys from a dict."""
return dict((k, v) for k, v in original.iteritems() if "_pass" not in k)
def wrap_exception(notifier=None, get_notifier=None):
"""This decorator wraps a method to catch any exceptions that may
get thrown. It logs the exception as well as optionally sending
it to the notification system.
"""
def inner(f):
def wrapped(self, context, *args, **kw):
# Don't store self or context in the payload, it now seems to
# contain confidential information.
try:
return f(self, context, *args, **kw)
except Exception as e:
with excutils.save_and_reraise_exception():
if notifier or get_notifier:
payload = dict(exception=e)
call_dict = safe_utils.getcallargs(f, context,
*args, **kw)
cleansed = _cleanse_dict(call_dict)
payload.update({'args': cleansed})
# If f has multiple decorators, they must use
# functools.wraps to ensure the name is
# propagated.
event_type = f.__name__
(notifier or get_notifier()).error(context,
event_type,
payload)
return functools.wraps(f)(wrapped)
return inner
class CerberusException(Exception):
"""Base Cerberus Exception
To correctly use this class, inherit from it and define
a 'msg_fmt' property. That msg_fmt will get printf'd
with the keyword arguments provided to the constructor.
"""
msg_fmt = _("An unknown exception occurred.")
code = 500
headers = {}
safe = False
def __init__(self, message=None, **kwargs):
self.kwargs = kwargs
if 'code' not in self.kwargs:
try:
self.kwargs['code'] = self.code
except AttributeError:
pass
if not message:
try:
message = self.msg_fmt % kwargs
except Exception:
exc_info = sys.exc_info()
# kwargs doesn't match a variable in the message
# log the issue and the kwargs
LOG.exception(_('Exception in string format operation'))
for name, value in kwargs.iteritems():
LOG.error("%s: %s" % (name, value)) # noqa
if CONF.fatal_exception_format_errors:
raise exc_info[0], exc_info[1], exc_info[2]
else:
# at least get the core message out if something happened
message = self.msg_fmt
super(CerberusException, self).__init__(message)
def format_message(self):
# NOTE(mrodden): use the first argument to the python Exception object
# which should be our full CerberusException message (see __init__)
return self.args[0]
class DBException(CerberusException):
msg_fmt = _("Database error.")
class ReportExists(DBException):
msg_fmt = _("Report %(report_id)s already exists for plugin "
"%(plugin_id)s.")
class PluginInfoExists(DBException):
msg_fmt = _("Plugin info %(plugin_id)s already exists.")
class AlarmExists(DBException):
msg_fmt = _("Alarm %(alarm_id)s already exists.")
class TaskExists(DBException):
msg_fmt = _("Task %(task_id)s already exists.")

View File

@ -1,26 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import datetime
import json
class DateTimeEncoder(json.JSONEncoder):
def default(self, obj):
"""JSON serializer for objects not serializable by default json code"""
if isinstance(obj, datetime.datetime):
return obj.isoformat()
# Fall back to the base class so unsupported types still raise TypeError
# instead of silently serializing to null.
return super(DateTimeEncoder, self).default(obj)
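# Usage sketch (not part of the original module): datetimes in RPC/API
# payloads serialize to ISO 8601 strings.
def _example_datetime_encoding():
    payload = {'last_report_date': datetime.datetime(2015, 5, 6, 16, 19, 29)}
    # -> '{"last_report_date": "2015-05-06T16:19:29"}'
    return json.dumps(payload, cls=DateTimeEncoder)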

View File

@ -1,66 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import sys
from eventlet import event
from eventlet import greenthread
from cerberus.openstack.common.gettextutils import _LE, _LW # noqa
from cerberus.openstack.common import log as logging
from cerberus.openstack.common import loopingcall
from cerberus.openstack.common import timeutils
LOG = logging.getLogger(__name__)
class CerberusFixedIntervalLoopingCall(loopingcall.FixedIntervalLoopingCall):
"""A fixed interval looping call."""
def start(self, interval, initial_delay=None):
self._running = True
done = event.Event()
def _inner():
if initial_delay:
greenthread.sleep(initial_delay)
try:
while self._running:
start = timeutils.utcnow()
self.f(*self.args, **self.kw)
end = timeutils.utcnow()
if not self._running:
break
delay = interval - timeutils.delta_seconds(start, end)
if delay <= 0:
LOG.warn(_LW('task run outlasted interval by %s sec') %
-delay)
greenthread.sleep(delay if delay > 0 else 0)
except loopingcall.LoopingCallDone as e:
self.stop()
done.send(e.retvalue)
except Exception:
LOG.exception(_LE('in fixed duration looping call'))
done.send_exception(*sys.exc_info())
return
else:
done.send(True)
self.done = done
self.gt = greenthread.spawn(_inner)
return self.done
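# Usage sketch (not part of the original module): this is the primitive that
# the thread group elsewhere in this change builds add_timer() on top of.
def _example_looping_call(poll_plugin):
    pulse = CerberusFixedIntervalLoopingCall(poll_plugin)
    pulse.start(interval=60, initial_delay=5)   # returns an eventlet Event
    return pulse                                # caller may pulse.stop() later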

View File

@ -1,67 +0,0 @@
# Copyright (c) 2011 OpenStack Foundation
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Policy Engine For Cerberus."""
from oslo.config import cfg
from cerberus.openstack.common import policy
_ENFORCER = None
CONF = cfg.CONF
def init_enforcer(policy_file=None, rules=None,
default_rule=None, use_conf=True):
"""Synchronously initializes the policy enforcer
:param policy_file: Custom policy file to use, if none is specified,
`CONF.policy_file` will be used.
:param rules: Default dictionary / Rules to use. It will be
considered just in the first instantiation.
:param default_rule: Default rule to use, CONF.default_rule will
be used if none is specified.
:param use_conf: Whether to load rules from config file.
"""
global _ENFORCER
if _ENFORCER:
return
_ENFORCER = policy.Enforcer(policy_file=policy_file,
rules=rules,
default_rule=default_rule,
use_conf=use_conf)
def get_enforcer():
"""Provides access to the single instance of Policy enforcer."""
if not _ENFORCER:
init_enforcer()
return _ENFORCER
def enforce(rule, target, creds, do_raise=False, exc=None, *args, **kwargs):
"""A shortcut for policy.Enforcer.enforce()
Checks authorization of a rule against the target and credentials.
"""
enforcer = get_enforcer()
return enforcer.enforce(rule, target, creds, do_raise=do_raise,
exc=exc, *args, **kwargs)
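# Usage sketch (not part of the original module): "cerberus:get_tasks" is a
# made-up rule name and the credentials dict is an example request context.
def _example_enforce():
    creds = {'roles': ['admin'],
             'tenant_id': '510c7f4ed14243f09df371bba2561177'}
    # Returns a boolean; with do_raise=True the enforcer raises instead.
    return enforce('cerberus:get_tasks', {}, creds)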

View File

@ -1,70 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
"""Utilities and helper functions that won't produce circular imports."""
import inspect
import six
from cerberus.openstack.common import log
LOG = log.getLogger(__name__)
def getcallargs(function, *args, **kwargs):
"""This is a simplified inspect.getcallargs (2.7+).
It should be replaced when python >= 2.7 is standard.
"""
keyed_args = {}
argnames, varargs, keywords, defaults = inspect.getargspec(function)
keyed_args.update(kwargs)
# NOTE(alaski) the implicit 'self' or 'cls' argument shows up in
# argnames but not in args or kwargs. Uses 'in' rather than '==' because
# some tests use 'self2'.
if 'self' in argnames[0] or 'cls' == argnames[0]:
# The function may not actually be a method or have im_self.
# Typically seen when it's stubbed with mox.
if inspect.ismethod(function) and hasattr(function, 'im_self'):
keyed_args[argnames[0]] = function.im_self
else:
keyed_args[argnames[0]] = None
remaining_argnames = filter(lambda x: x not in keyed_args, argnames)
keyed_args.update(dict(zip(remaining_argnames, args)))
if defaults:
num_defaults = len(defaults)
for argname, value in zip(argnames[-num_defaults:], defaults):
if argname not in keyed_args:
keyed_args[argname] = value
return keyed_args
def safe_rstrip(value, chars=None):
"""Removes trailing characters from a string if that does not make it empty
:param value: A string value that will be stripped.
:param chars: Characters to remove.
:return: Stripped value.
"""
if not isinstance(value, six.string_types):
LOG.warn(("Failed to remove trailing character. Returning original "
"object. Supplied object is not a string: %s,") % value)
return value
return value.rstrip(chars) or value
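# Behaviour sketch (not part of the original module): getcallargs() maps
# positional arguments by name and fills in defaults.
def _example_getcallargs():
    def create_task(ctx, task_id=None, name='unknown'):
        return ctx, task_id, name
    # -> {'ctx': {'tenant': 't1'}, 'task_id': 42, 'name': 'unknown'}
    return getcallargs(create_task, {'tenant': 't1'}, task_id=42)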

View File

@ -1,110 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# dateutil parses the ISO 8601 strings produced by the 'date' serializer.
from dateutil import parser
class JsonSerializer(object):
"""A serializer that provides methods to serialize and deserialize JSON
dictionaries.
Note, one of the assumptions this serializer makes is that all objects that
it is used to deserialize have a constructor that can take all of the
attribute arguments. I.e. If you have an object with 3 attributes, the
constructor needs to take those three attributes as keyword arguments.
"""
__attributes__ = None
"""The attributes to be serialized by the seralizer.
The implementor needs to provide these."""
__required__ = None
"""The attributes that are required when deserializing.
The implementor needs to provide these."""
__attribute_serializer__ = None
"""The serializer to use for a specified attribute. If an attribute is not
included here, no special serializer will be used.
The implementor needs to provide these."""
__object_class__ = None
"""The class that the deserializer should generate.
The implementor needs to provide these."""
serializers = dict(
date=dict(
serialize=lambda x: x.isoformat(),
deserialize=lambda x: parser.parse(x)
)
)
def deserialize(self, json, **kwargs):
"""Deserialize a JSON dictionary and return a populated object.
This takes the JSON data, and deserializes it appropriately and then
calls the constructor of the object to be created with all of the
attributes.
Args:
json: The JSON dict with all of the data
**kwargs: Optional values that can be used as defaults if they are
not present in the JSON data
Returns:
The deserialized object.
Raises:
ValueError: If any of the required attributes are not present
"""
d = dict()
for attr in self.__attributes__:
if attr in json:
val = json[attr]
elif attr in self.__required__:
try:
val = kwargs[attr]
except KeyError:
raise ValueError("{} must be set".format(attr))
else:
# Attribute absent from both the JSON data and the kwargs
# defaults: skip it rather than reusing a stale value.
continue
serializer = self.__attribute_serializer__.get(attr)
if serializer:
d[attr] = self.serializers[serializer]['deserialize'](val)
else:
d[attr] = val
return self.__object_class__(**d)
def serialize(self, obj):
"""Serialize an object to a dictionary.
Take all of the attributes defined in self.__attributes__ and create
a dictionary containing those values.
Args:
obj: The object to serialize
Returns:
A dictionary containing all of the serialized data from the object.
"""
d = dict()
for attr in self.__attributes__:
val = getattr(obj, attr)
if val is None:
continue
serializer = self.__attribute_serializer__.get(attr)
if serializer:
d[attr] = self.serializers[serializer]['serialize'](val)
else:
d[attr] = val
return d
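# Concrete sketch of an implementor (not part of the original module), as
# described in the class docstring; the report class and field names below are
# illustrative only. The 'date' serializer relies on dateutil for parsing.
class _ExampleReport(object):
    def __init__(self, title=None, last_report_date=None):
        self.title = title
        self.last_report_date = last_report_date


class _ExampleReportSerializer(JsonSerializer):
    __attributes__ = ['title', 'last_report_date']
    __required__ = ['title']
    __attribute_serializer__ = dict(last_report_date='date')
    __object_class__ = _ExampleReport
# _ExampleReportSerializer().serialize(
#     _ExampleReport('weekly scan', datetime.datetime(2015, 5, 6)))
# -> {'title': 'weekly scan', 'last_report_date': '2015-05-06T00:00:00'}
# deserialize() on that dict rebuilds an _ExampleReport with a real datetime.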

View File

@ -1,32 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from cerberus.common import threadgroup
from cerberus.openstack.common import service
class CerberusService(service.Service):
def __init__(self, threads=1000):
super(CerberusService, self).__init__(threads)
self.tg = threadgroup.CerberusThreadGroup(threads)
class CerberusServices(service.Services):
def __init__(self):
super(CerberusServices, self).__init__()
self.tg = threadgroup.CerberusThreadGroup()

View File

@ -1,60 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from cerberus.common import loopingcall
from cerberus.db.sqlalchemy import api as db_api
from cerberus.openstack.common import threadgroup
class CerberusThread(threadgroup.Thread):
def __init__(self, f, thread, group, *args, **kwargs):
super(CerberusThread, self).__init__(thread, group)
self.f = f
self.args = args
self.kw = kwargs
class CerberusThreadGroup(threadgroup.ThreadGroup):
def add_stopped_timer(self, callback, *args, **kwargs):
pulse = loopingcall.CerberusFixedIntervalLoopingCall(callback,
*args,
**kwargs)
self.timers.append(pulse)
return pulse
def add_timer(self, interval, callback, initial_delay=None,
*args, **kwargs):
pulse = loopingcall.CerberusFixedIntervalLoopingCall(callback,
*args,
**kwargs)
pulse.start(interval=interval,
initial_delay=initial_delay)
self.timers.append(pulse)
return pulse
def add_thread(self, callback, *args, **kwargs):
gt = self.pool.spawn(callback, *args, **kwargs)
th = CerberusThread(callback, gt, self, *args, **kwargs)
self.threads.append(th)
return th
def thread_done(self, thread):
self.threads.remove(thread)
try:
db_api.delete_task(thread.kw.get('task_id'))
except Exception:
raise

View File

@ -1,176 +0,0 @@
# Copyright 2010 United States Government as represented by the
# Administrator of the National Aeronautics and Space Administration.
# Copyright 2011 Justin Santa Barbara
# Copyright (c) 2012 NTT DOCOMO, INC.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Utilities and helper functions."""
import netaddr
import re
import six
import uuid
from oslo.config import cfg
from cerberus.common import exception
from cerberus.openstack.common.gettextutils import _ # noqa
from cerberus.openstack.common import log as logging
CONF = cfg.CONF
LOG = logging.getLogger(__name__)
class LazyPluggable(object):
"""A pluggable backend loaded lazily based on some value."""
def __init__(self, pivot, config_group=None, **backends):
self.__backends = backends
self.__pivot = pivot
self.__backend = None
self.__config_group = config_group
def __get_backend(self):
if not self.__backend:
if self.__config_group is None:
backend_name = CONF[self.__pivot]
else:
backend_name = CONF[self.__config_group][self.__pivot]
if backend_name not in self.__backends:
msg = _('Invalid backend: %s') % backend_name
raise exception.CerberusException(msg)
backend = self.__backends[backend_name]
if isinstance(backend, tuple):
name = backend[0]
fromlist = backend[1]
else:
name = backend
fromlist = backend
self.__backend = __import__(name, None, None, fromlist)
return self.__backend
def __getattr__(self, key):
backend = self.__get_backend()
return getattr(backend, key)
def is_valid_ipv4(address):
"""Verify that address represents a valid IPv4 address."""
try:
return netaddr.valid_ipv4(address)
except Exception:
return False
def is_valid_ipv6(address):
try:
return netaddr.valid_ipv6(address)
except Exception:
return False
def is_valid_ipv6_cidr(address):
try:
str(netaddr.IPNetwork(address, version=6).cidr)
return True
except Exception:
return False
def get_shortened_ipv6(address):
addr = netaddr.IPAddress(address, version=6)
return str(addr.ipv6())
def get_shortened_ipv6_cidr(address):
net = netaddr.IPNetwork(address, version=6)
return str(net.cidr)
def is_valid_cidr(address):
"""Check if the provided ipv4 or ipv6 address is a valid CIDR address."""
try:
# Validate the correct CIDR Address
netaddr.IPNetwork(address)
except netaddr.core.AddrFormatError:
return False
except UnboundLocalError:
# NOTE(MotoKen): work around bug in netaddr 0.7.5 (see detail in
# https://github.com/drkjam/netaddr/issues/2)
return False
# Prior validation partially verify /xx part
# Verify it here
ip_segment = address.split('/')
if (len(ip_segment) <= 1 or
ip_segment[1] == ''):
return False
return True
def get_ip_version(network):
"""Returns the IP version of a network (IPv4 or IPv6).
:raises: AddrFormatError if invalid network.
"""
if netaddr.IPNetwork(network).version == 6:
return "IPv6"
elif netaddr.IPNetwork(network).version == 4:
return "IPv4"
def convert_to_list_dict(lst, label):
"""Convert a value or list into a list of dicts."""
if not lst:
return None
if not isinstance(lst, list):
lst = [lst]
return [{label: x} for x in lst]
def sanitize_hostname(hostname):
"""Return a hostname which conforms to RFC-952 and RFC-1123 specs."""
if isinstance(hostname, six.text_type):
hostname = hostname.encode('latin-1', 'ignore')
hostname = re.sub('[ _]', '-', hostname)
    hostname = re.sub(r'[^\w.-]+', '', hostname)
hostname = hostname.lower()
hostname = hostname.strip('.-')
return hostname
def generate_uuid():
return str(uuid.uuid4())
def is_uuid_like(val):
"""Returns validation of a value as a UUID.
For our purposes, a UUID is a canonical form string:
aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa
"""
try:
return str(uuid.UUID(val)) == val
except (TypeError, ValueError, AttributeError):
return False
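A few illustrative calls for the validators above, with the results they produce as written (nothing cerberus-specific is needed beyond netaddr and six):

# Expected outcomes of the helpers defined in this module.
assert is_valid_ipv4('192.168.0.1')
assert not is_valid_cidr('192.168.0.0')            # missing the /xx part
assert is_valid_cidr('192.168.0.0/24')
assert get_ip_version('2001:db8::/32') == 'IPv6'
assert is_uuid_like(generate_uuid())
assert sanitize_hostname(u'My Host_01.') == 'my-host-01'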

View File

@ -1,17 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from cerberus.db.api import * # noqa

View File

@ -1,137 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from oslo.config import cfg
from cerberus.db.sqlalchemy import models
from cerberus.openstack.common.db import api as db_api
CONF = cfg.CONF
CONF.import_opt('backend', 'cerberus.openstack.common.db.options',
group='database')
_BACKEND_MAPPING = {'sqlalchemy': 'cerberus.db.sqlalchemy.api'}
IMPL = db_api.DBAPI(CONF.database.backend, backend_mapping=_BACKEND_MAPPING,
lazy=True)
def setup_db():
engine = get_engine()
models.register_models(engine)
def drop_db():
engine = get_engine()
models.unregister_models(engine)
def get_instance():
"""Return a DB API instance."""
return IMPL
def get_engine():
return IMPL.get_engine()
def get_session():
return IMPL.get_session()
def security_report_create(values):
"""Create an instance from the values dictionary."""
return IMPL.security_report_create(values)
def security_report_update_last_report_date(uuid, date):
"""Create an instance from the values dictionary."""
return IMPL.security_report_update_last_report_date(uuid, date)
def security_report_update_ticket_id(uuid, ticket_id):
"""Create an instance from the values dictionary."""
return IMPL.security_report_update_ticket_id(uuid, ticket_id)
def security_report_get_all(project_id=None):
"""Get all security reports"""
return IMPL.security_report_get_all(project_id=project_id)
def security_report_get(uuid):
"""Get security report from its id in database"""
return IMPL.security_report_get(uuid)
def security_report_get_from_report_id(report_id):
"""Get security report from its report identifier"""
return IMPL.security_report_get_from_report_id(report_id)
def security_report_delete(uuid):
    """Delete the security report identified by its uuid"""
    return IMPL.security_report_delete(uuid)
def plugins_info_get():
"""Get information about plugins stored in db"""
return IMPL.plugins_info_get()
def plugin_info_get_from_uuid(id):
"""
Get information about plugin stored in db
:param id: the uuid of the plugin
"""
return IMPL.plugin_info_get_from_uuid(id)
def plugin_version_update(id, version):
return IMPL.plugin_version_update(id, version)
def security_alarm_create(values):
return IMPL.security_alarm_create(values)
def security_alarm_get_all():
return IMPL.security_alarm_get_all()
def security_alarm_get(id):
return IMPL.security_alarm_get(id)
def security_alarm_update_ticket_id(alarm_id, ticket_id):
"""Create an instance from the values dictionary."""
return IMPL.security_alarm_update_ticket_id(alarm_id, ticket_id)
def create_task(values):
return IMPL.create_task(values)
def delete_task(id):
IMPL.delete_task(id)
def update_state_task(id, running):
IMPL.update_state_task(id, running)
def get_all_tasks():
return IMPL.get_all_tasks()
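Every call in this facade is forwarded to the backend selected by CONF.database.backend (sqlalchemy, per the mapping above), which is loaded lazily on first use. A hedged sketch of how service code typically goes through it, assuming a configured database; the project id and ticket id are placeholders:

# Placeholder identifiers; the first call triggers lazy loading of the
# sqlalchemy backend named in _BACKEND_MAPPING.
from cerberus.db import api as db_api

reports = db_api.security_report_get_all(project_id='some-project-id')
for report in reports:
    db_api.security_report_update_ticket_id(report.uuid, 'TICKET-42')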

View File

@ -1,55 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
"""Database setup and migration commands."""
from oslo.config import cfg
from cerberus.common import utils
CONF = cfg.CONF
CONF.import_opt('backend',
'cerberus.openstack.common.db.options',
group='database')
IMPL = utils.LazyPluggable(
pivot='backend',
config_group='database',
sqlalchemy='cerberus.db.sqlalchemy.migration')
INIT_VERSION = 0
def upgrade(version=None):
"""Migrate the database to `version` or the most recent version."""
return IMPL.upgrade(version)
def downgrade(version=None):
return IMPL.downgrade(version)
def version():
return IMPL.version()
def stamp(version):
return IMPL.stamp(version)
def revision(message, autogenerate):
return IMPL.revision(message, autogenerate)
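A sketch of driving these commands programmatically, which is essentially what the cerberus-dbsync entry point does; the import path cerberus.db.migration is assumed from the package layout, and a configured database connection is required:

# Assumed import path; cerberus-dbsync wraps the same calls.
from cerberus.db import migration

migration.upgrade()            # migrate to the latest revision ('head')
print(migration.version())     # print the current database revision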

View File

@ -1,15 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

View File

@ -1,54 +0,0 @@
# A generic, single database configuration.
[alembic]
# path to migration scripts
script_location = %(here)s/alembic
# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s
# max length of characters to apply to the
# "slug" field
#truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false
#sqlalchemy.url = driver://user:pass@localhost/dbname
# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARN
handlers = console
qualname =
[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S

View File

@ -1,16 +0,0 @@
Please see https://alembic.readthedocs.org/en/latest/index.html for general documentation.
To create alembic migrations, use:
$ cerberus-dbsync revision --message --autogenerate
To stamp the db with the most recent migration version, without actually running migrations:
$ cerberus-dbsync stamp --revision head
Upgrade can be performed by:
$ cerberus-dbsync - for backward compatibility
$ cerberus-dbsync upgrade
$ cerberus-dbsync upgrade --revision head
Downgrading the db:
$ cerberus-dbsync downgrade
$ cerberus-dbsync downgrade --revision base

View File

@ -1,54 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from logging import config as log_config
from alembic import context
from cerberus.db.sqlalchemy import api as sqla_api
from cerberus.db.sqlalchemy import models
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
log_config.fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
target_metadata = models.Base.metadata
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
def run_migrations_online():
"""Run migrations in 'online' mode.
In this scenario we need to create an Engine
and associate a connection with the context.
"""
engine = sqla_api.get_engine()
with engine.connect() as connection:
context.configure(connection=connection,
target_metadata=target_metadata)
with context.begin_transaction():
context.run_migrations()
run_migrations_online()

View File

@ -1,22 +0,0 @@
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision}
Create Date: ${create_date}
"""
# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}
def upgrade():
${upgrades if upgrades else "pass"}
def downgrade():
${downgrades if downgrades else "pass"}

View File

@ -1,116 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
"""initial_migration
Revision ID: 2dd6320a2745
Revises: None
Create Date: 2015-06-25 10:45:10.853595
"""
# revision identifiers, used by Alembic.
revision = '2dd6320a2745'
down_revision = None
from alembic import op
import sqlalchemy as sa
def upgrade():
op.create_table(
'plugin_info',
sa.Column('id', sa.Integer, primary_key=True, nullable=False),
sa.Column('uuid', sa.Text),
sa.Column('name', sa.Text),
sa.Column('version', sa.Text),
sa.Column('provider', sa.Text),
sa.Column('type', sa.Text),
sa.Column('description', sa.Text),
sa.Column('tool_name', sa.Text),
sa.Column('created_at', sa.DateTime),
sa.Column('updated_at', sa.DateTime),
sa.Column('deleted_at', sa.DateTime),
sa.Column('deleted', sa.Integer),
        mysql_engine='InnoDB',
        mysql_charset='utf8'
)
op.create_table(
'security_report',
sa.Column('id', sa.Integer, primary_key=True, nullable=False),
sa.Column('plugin_id', sa.Text),
sa.Column('report_id', sa.VARCHAR(255), unique=True),
sa.Column('component_id', sa.Text),
sa.Column('component_type', sa.Text),
sa.Column('component_name', sa.Text),
sa.Column('project_id', sa.Text),
sa.Column('ticket_id', sa.Text),
sa.Column('title', sa.Text),
sa.Column('description', sa.Text),
sa.Column('security_rating', sa.Float),
sa.Column('vulnerabilities', sa.Text),
sa.Column('vulnerabilities_number', sa.Integer),
sa.Column('last_report_date', sa.DateTime),
sa.Column('created_at', sa.DateTime),
sa.Column('updated_at', sa.DateTime),
sa.Column('deleted_at', sa.DateTime),
sa.Column('deleted', sa.Integer),
        mysql_engine='InnoDB',
        mysql_charset='utf8'
)
op.create_table(
'security_alarm',
sa.Column('id', sa.Integer, primary_key=True, nullable=False),
sa.Column('plugin_id', sa.Text),
sa.Column('alarm_id', sa.VARCHAR(255), unique=True),
sa.Column('component_id', sa.Text),
sa.Column('project_id', sa.Text),
sa.Column('ticket_id', sa.Text),
sa.Column('timestamp', sa.DateTime),
sa.Column('summary', sa.Text),
sa.Column('severity', sa.Text),
sa.Column('status', sa.Text),
sa.Column('description', sa.Text),
sa.Column('created_at', sa.DateTime),
sa.Column('updated_at', sa.DateTime),
sa.Column('deleted_at', sa.DateTime),
sa.Column('deleted', sa.Integer),
mysql_engine='InnoDB',
mysql_charset='utf8'
)
op.create_table(
'task',
sa.Column('id', sa.Integer, primary_key=True, nullable=False),
sa.Column('type', sa.Text),
sa.Column('plugin_id', sa.Text),
sa.Column('uuid', sa.Text),
sa.Column('name', sa.Text),
sa.Column('method', sa.Text),
sa.Column('running', sa.Boolean),
sa.Column('period', sa.Integer),
sa.Column('created_at', sa.DateTime),
sa.Column('updated_at', sa.DateTime),
sa.Column('deleted_at', sa.DateTime),
sa.Column('deleted', sa.Integer),
mysql_engine='InnoDB',
mysql_charset='utf8'
)
def downgrade():
raise NotImplementedError(('Downgrade from initial migration is'
' unsupported.'))

View File

@ -1,332 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
"""text_to_varchar
Revision ID: 4426f811d4d9
Revises: 2dd6320a2745
Create Date: 2015-06-25 10:47:00.485303
"""
# revision identifiers, used by Alembic.
revision = '4426f811d4d9'
down_revision = '2dd6320a2745'
from alembic import op
import sqlalchemy as sa
def upgrade():
# In table plugin_info
op.alter_column(
table_name='plugin_info',
column_name='uuid',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='plugin_info',
column_name='name',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='plugin_info',
column_name='version',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='plugin_info',
column_name='provider',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='plugin_info',
column_name='type',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='plugin_info',
column_name='description',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='plugin_info',
column_name='tool_name',
type_=sa.VARCHAR(255)
)
# In table security_report, except column vulnerabilities
op.alter_column(
table_name='security_report',
column_name='plugin_id',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='security_report',
column_name='component_id',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='security_report',
column_name='component_type',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='security_report',
column_name='component_name',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='security_report',
column_name='project_id',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='security_report',
column_name='ticket_id',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='security_report',
column_name='title',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='security_report',
column_name='description',
type_=sa.VARCHAR(255)
)
# In table security_alarm
op.alter_column(
table_name='security_alarm',
column_name='plugin_id',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='security_alarm',
column_name='component_id',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='security_alarm',
column_name='project_id',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='security_alarm',
column_name='ticket_id',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='security_alarm',
column_name='summary',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='security_alarm',
column_name='severity',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='security_alarm',
column_name='status',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='security_alarm',
column_name='description',
type_=sa.VARCHAR(255)
)
# In table task
op.alter_column(
table_name='task',
column_name='type',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='task',
column_name='plugin_id',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='task',
column_name='uuid',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='task',
column_name='name',
type_=sa.VARCHAR(255)
)
op.alter_column(
table_name='task',
column_name='method',
type_=sa.VARCHAR(255)
)
def downgrade():
# In table plugin_info
op.alter_column(
table_name='plugin_info',
column_name='uuid',
type_=sa.TEXT
)
op.alter_column(
table_name='plugin_info',
column_name='name',
type_=sa.TEXT
)
op.alter_column(
table_name='plugin_info',
column_name='version',
type_=sa.TEXT
)
op.alter_column(
table_name='plugin_info',
column_name='provider',
type_=sa.TEXT
)
op.alter_column(
table_name='plugin_info',
column_name='type',
type_=sa.TEXT
)
op.alter_column(
table_name='plugin_info',
column_name='description',
type_=sa.TEXT
)
op.alter_column(
table_name='plugin_info',
column_name='tool_name',
type_=sa.TEXT
)
# In table security_report, except column vulnerabilities (still Text)
# and report_id (already varchar)
op.alter_column(
table_name='security_report',
column_name='plugin_id',
type_=sa.TEXT
)
op.alter_column(
table_name='security_report',
column_name='component_id',
type_=sa.TEXT
)
op.alter_column(
table_name='security_report',
column_name='component_type',
type_=sa.TEXT
)
op.alter_column(
table_name='security_report',
column_name='component_name',
type_=sa.TEXT
)
op.alter_column(
table_name='security_report',
column_name='project_id',
type_=sa.TEXT
)
op.alter_column(
table_name='security_report',
column_name='ticket_id',
type_=sa.TEXT
)
op.alter_column(
table_name='security_report',
column_name='title',
type_=sa.TEXT
)
op.alter_column(
table_name='security_report',
column_name='description',
type_=sa.TEXT
)
# In table security_alarm, except alarm_id (already varchar)
op.alter_column(
table_name='security_alarm',
column_name='plugin_id',
type_=sa.TEXT
)
op.alter_column(
table_name='security_alarm',
column_name='component_id',
type_=sa.TEXT
)
op.alter_column(
table_name='security_alarm',
column_name='project_id',
type_=sa.TEXT
)
op.alter_column(
table_name='security_alarm',
column_name='ticket_id',
type_=sa.TEXT
)
op.alter_column(
table_name='security_alarm',
column_name='summary',
type_=sa.TEXT
)
op.alter_column(
table_name='security_alarm',
column_name='severity',
type_=sa.TEXT
)
op.alter_column(
table_name='security_alarm',
column_name='status',
type_=sa.TEXT
)
op.alter_column(
table_name='security_alarm',
column_name='description',
type_=sa.TEXT
)
# In table task
op.alter_column(
table_name='task',
column_name='type',
type_=sa.TEXT
)
op.alter_column(
table_name='task',
column_name='plugin_id',
type_=sa.TEXT
)
op.alter_column(
table_name='task',
column_name='uuid',
type_=sa.TEXT
)
op.alter_column(
table_name='task',
column_name='name',
type_=sa.TEXT
)
op.alter_column(
table_name='task',
column_name='method',
type_=sa.TEXT
)

View File

@ -1,50 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
"""alter_security_report_add_uuid
Revision ID: 479e56a9ae3b
Revises: 4426f811d4d9
Create Date: 2015-06-25 10:48:06.260041
"""
# revision identifiers, used by Alembic.
revision = '479e56a9ae3b'
down_revision = '4426f811d4d9'
from alembic import op
import sqlalchemy as sa
def upgrade():
op.add_column('security_report',
sa.Column('uuid', sa.VARCHAR(255), unique=True))
op.drop_constraint('report_id', 'security_report', type_='unique')
op.create_unique_constraint('unique_uuid',
'security_report',
['uuid'])
op.create_unique_constraint('unique_report_id_plugin_id',
'security_report',
['report_id', 'plugin_id'])
def downgrade():
op.drop_column('security_report', 'uuid')
op.drop_constraint('unique_report_id_plugin_id',
'security_report',
type_='unique')
op.create_unique_constraint('report_id', 'security_report', ['report_id'])

View File

@ -1,400 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import sys
import threading
from oslo.config import cfg
from cerberus.common import exception
from cerberus.db.sqlalchemy import models
from cerberus.openstack.common.db import exception as db_exc
from cerberus.openstack.common.db.sqlalchemy import session as db_session
from cerberus.openstack.common import log
CONF = cfg.CONF
LOG = log.getLogger(__name__)
_ENGINE_FACADE = None
_LOCK = threading.Lock()
_FACADE = None
def _create_facade_lazily():
global _FACADE
if _FACADE is None:
_FACADE = db_session.EngineFacade(
CONF.database.connection,
**dict(CONF.database.iteritems())
)
return _FACADE
def get_engine():
facade = _create_facade_lazily()
return facade.get_engine()
def get_session(**kwargs):
facade = _create_facade_lazily()
return facade.get_session(**kwargs)
def get_backend():
"""The backend is this module itself."""
return sys.modules[__name__]
def model_query(model, *args, **kwargs):
"""Query helper for simpler session usage.
:param session: if present, the session to use
"""
session = kwargs.get('session') or get_session()
query = session.query(model, *args)
return query
def _security_report_create(values):
try:
security_report_ref = models.SecurityReport()
security_report_ref.update(values)
security_report_ref.save()
except db_exc.DBDuplicateEntry as e:
LOG.exception(e)
raise exception.ReportExists(report_id=values['report_id'],
plugin_id=values['plugin_id'])
except Exception as e:
LOG.exception(e)
raise exception.DBException()
return security_report_ref
def security_report_create(values):
return _security_report_create(values)
def _security_report_update_last_report_date(uuid, date):
try:
session = get_session()
report = model_query(models.SecurityReport, read_deleted="no",
session=session).filter(models.SecurityReport.uuid
== uuid).first()
report.last_report_date = date
report.save(session)
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def security_report_update_last_report_date(uuid, date):
_security_report_update_last_report_date(uuid, date)
def _security_report_update_ticket_id(uuid, ticket_id):
try:
session = get_session()
report = model_query(models.SecurityReport, read_deleted="no",
session=session).filter(models.SecurityReport.uuid
== uuid).first()
report.ticket_id = ticket_id
report.save(session)
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def security_report_update_ticket_id(uuid, ticket_id):
_security_report_update_ticket_id(uuid, ticket_id)
def _security_report_get_all(project_id=None):
try:
session = get_session()
if project_id is None:
return model_query(models.SecurityReport, read_deleted="no",
session=session).all()
else:
return model_query(models.SecurityReport, read_deleted="no",
session=session).\
filter(models.SecurityReport.project_id == project_id).all()
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def security_report_get_all(project_id=None):
return _security_report_get_all(project_id=project_id)
def _security_report_get(uuid):
try:
session = get_session()
return model_query(
models.SecurityReport, read_deleted="no", session=session).filter(
models.SecurityReport.uuid == uuid).first()
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def security_report_get(uuid):
return _security_report_get(uuid)
def _security_report_get_from_report_id(report_id):
try:
session = get_session()
return model_query(
models.SecurityReport, read_deleted="no", session=session).filter(
models.SecurityReport.report_id == report_id).first()
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def security_report_get_from_report_id(report_id):
return _security_report_get_from_report_id(report_id)
def _security_report_delete(uuid):
try:
session = get_session()
report = model_query(
models.SecurityReport, read_deleted="no",
session=session).filter_by(uuid=uuid)
report.delete()
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def security_report_delete(uuid):
return _security_report_delete(uuid)
def _plugin_info_create(values):
try:
plugin_info_ref = models.PluginInfo()
plugin_info_ref.update(values)
plugin_info_ref.save()
except db_exc.DBDuplicateEntry:
raise exception.PluginInfoExists(plugin_id=values['id'])
except Exception as e:
LOG.exception(e)
raise exception.DBException()
return plugin_info_ref
def plugin_info_create(values):
return _plugin_info_create(values)
def _plugins_info_get():
try:
session = get_session()
return model_query(models.PluginInfo,
read_deleted="no",
session=session).all()
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def plugins_info_get():
return _plugins_info_get()
def _plugin_info_get(name):
try:
session = get_session()
return model_query(models.PluginInfo,
read_deleted="no",
session=session).filter(models.PluginInfo.name ==
name).first()
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def plugin_info_get(name):
return _plugin_info_get(name)
def _plugin_info_get_from_uuid(plugin_id):
try:
session = get_session()
return model_query(models.PluginInfo,
read_deleted="no",
session=session).filter(models.PluginInfo.uuid ==
plugin_id).first()
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def plugin_info_get_from_uuid(plugin_id):
return _plugin_info_get_from_uuid(plugin_id)
def _plugin_version_update(plugin_id, version):
try:
session = get_session()
plugin = model_query(models.PluginInfo, read_deleted="no",
session=session).filter(models.PluginInfo.id ==
plugin_id).first()
plugin.version = version
plugin.save(session)
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def plugin_version_update(plugin_id, version):
_plugin_version_update(plugin_id, version)
def _security_alarm_create(values):
try:
security_alarm_ref = models.SecurityAlarm()
security_alarm_ref.update(values)
security_alarm_ref.save()
except db_exc.DBDuplicateEntry as e:
LOG.exception(e)
raise exception.AlarmExists(alarm_id=values['id'])
except Exception as e:
LOG.exception(e)
raise exception.DBException()
return security_alarm_ref
def security_alarm_create(values):
return _security_alarm_create(values)
def _security_alarm_get_all():
try:
session = get_session()
return model_query(models.SecurityAlarm, read_deleted="no",
session=session).all()
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def security_alarm_get_all():
return _security_alarm_get_all()
def _security_alarm_get(alarm_id):
try:
session = get_session()
return model_query(
models.SecurityAlarm, read_deleted="no", session=session).filter(
models.SecurityAlarm.alarm_id == alarm_id).first()
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def security_alarm_get(alarm_id):
return _security_alarm_get(alarm_id)
def _security_alarm_update_ticket_id(alarm_id, ticket_id):
try:
session = get_session()
alarm = model_query(
models.SecurityAlarm, read_deleted="no", session=session).filter(
models.SecurityAlarm.alarm_id == alarm_id).first()
alarm.ticket_id = ticket_id
alarm.save(session)
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def security_alarm_update_ticket_id(alarm_id, ticket_id):
_security_alarm_update_ticket_id(alarm_id, ticket_id)
def _create_task(values):
try:
task_ref = models.Task()
task_ref.update(values)
task_ref.save()
except db_exc.DBDuplicateEntry as e:
LOG.exception(e)
raise exception.TaskExists(task_id=values['uuid'])
except Exception as e:
LOG.exception(e)
raise exception.DBException()
return task_ref
def create_task(values):
return _create_task(values)
def _delete_task(task_id):
try:
session = get_session()
task = model_query(models.Task, read_deleted="no",
session=session).filter_by(uuid=task_id)
task.delete()
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def delete_task(task_id):
_delete_task(task_id)
def _update_state_task(task_id, running):
try:
session = get_session()
task = model_query(models.Task, read_deleted="no",
session=session).filter_by(uuid=task_id).first()
task.running = running
task.save(session)
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def update_state_task(task_id, running):
_update_state_task(task_id, running)
def _get_all_tasks():
try:
session = get_session()
return model_query(models.Task, read_deleted="no",
session=session).all()
except Exception as e:
LOG.exception(e)
raise exception.DBException()
def get_all_tasks():
return _get_all_tasks()
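model_query above is a thin wrapper around session.query; note that, as written, the read_deleted keyword the callers in this module pass is collected by **kwargs but never applied. A hedged sketch of using the helper directly, assuming a configured database; the project id is a placeholder:

# Direct use of the query helper defined above.
session = get_session()
reports = model_query(models.SecurityReport, session=session).filter(
    models.SecurityReport.project_id == 'some-project-id').all()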

View File

@ -1,90 +0,0 @@
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import os
import alembic
from alembic import config as alembic_config
import alembic.migration as alembic_migration
from cerberus.db.sqlalchemy import api as sqla_api
INIT_VERSION = 0
def _alembic_config():
path = os.path.join(os.path.dirname(__file__), 'alembic.ini')
config = alembic_config.Config(path)
return config
def version(config=None):
"""Current database version.
:returns: Database version
:rtype: string
"""
engine = sqla_api.get_engine()
with engine.connect() as conn:
context = alembic_migration.MigrationContext.configure(conn)
return context.get_current_revision()
def upgrade(revision, config=None):
"""Used for upgrading database.
:param version: Desired database version
:type version: string
"""
revision = revision or 'head'
config = config or _alembic_config()
    alembic.command.upgrade(config, revision)
def downgrade(revision, config=None):
"""Used for downgrading database.
:param version: Desired database version
:type version: string
"""
revision = revision or 'base'
config = config or _alembic_config()
return alembic.command.downgrade(config, revision)
def stamp(revision, config=None):
"""Stamps database with provided revision.
Dont run any migrations.
:param revision: Should match one from repository or head - to stamp
database with most recent revision
:type revision: string
"""
config = config or _alembic_config()
return alembic.command.stamp(config, revision=revision)
def revision(message=None, autogenerate=False, config=None):
"""Creates template for migration.
:param message: Text that will be used for migration title
:type message: string
:param autogenerate: If True - generates diff based on current database
state
:type autogenerate: bool
"""
config = config or _alembic_config()
return alembic.command.revision(config, message=message,
autogenerate=autogenerate)

View File

@ -1,188 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
"""
SQLAlchemy models for cerberus data.
"""
from sqlalchemy import Boolean, Column, String, Integer, DateTime, Float, Text
from sqlalchemy.ext.declarative import declarative_base
from oslo.config import cfg
from cerberus.common import serialize
from cerberus.openstack.common.db.sqlalchemy import models
CONF = cfg.CONF
class CerberusBase(models.SoftDeleteMixin,
models.TimestampMixin,
models.ModelBase):
metadata = None
def save(self, session=None):
from cerberus.db.sqlalchemy import api
if session is None:
session = api.get_session()
super(CerberusBase, self).save(session=session)
Base = declarative_base(cls=CerberusBase)
class PluginInfo(Base, CerberusBase):
"""Plugin info"""
__tablename__ = 'plugin_info'
__table_args__ = ()
id = Column(Integer, primary_key=True)
uuid = Column(String(255))
name = Column(String(255))
version = Column(String(255))
provider = Column(String(255))
type = Column(String(255))
description = Column(String(255))
tool_name = Column(String(255))
class PluginInfoJsonSerializer(serialize.JsonSerializer):
"""Plugin info serializer"""
__attributes__ = ['id', 'uuid', 'name', 'version', 'provider',
'type', 'description', 'tool_name']
__required__ = ['id']
__attribute_serializer__ = dict(created_at='date', deleted_at='date',
acknowledged_at='date')
__object_class__ = PluginInfo
class SecurityReport(Base, CerberusBase):
"""Security Report"""
__tablename__ = 'security_report'
__table_args__ = ()
id = Column(Integer, primary_key=True)
uuid = Column(String(255), unique=True)
plugin_id = Column(String(255))
report_id = Column(String(255))
component_id = Column(String(255))
component_type = Column(String(255))
component_name = Column(String(255))
project_id = Column(String(255))
title = Column(String(255))
description = Column(String(255))
security_rating = Column(Float)
vulnerabilities = Column(Text)
vulnerabilities_number = Column(Integer)
last_report_date = Column(DateTime)
ticket_id = Column(String(255))
class SecurityReportJsonSerializer(serialize.JsonSerializer):
"""Security report serializer"""
__attributes__ = ['id', 'uuid', 'title', 'description', 'plugin_id',
'report_id', 'component_id', 'component_type',
'component_name', 'project_id', 'security_rating',
'vulnerabilities', 'vulnerabilities_number',
'last_report_date', 'ticket_id', 'deleted', 'created_at',
'deleted_at', 'updated_at']
__required__ = ['uuid', 'title', 'component_id']
__attribute_serializer__ = dict(created_at='date', deleted_at='date',
acknowledged_at='date')
__object_class__ = SecurityReport
class SecurityAlarm(Base, CerberusBase):
"""Security alarm coming from Security Information and Event Manager
for example
"""
__tablename__ = 'security_alarm'
__table_args__ = ()
id = Column(Integer, primary_key=True)
plugin_id = Column(String(255))
alarm_id = Column(String(255), unique=True)
timestamp = Column(DateTime)
status = Column(String(255))
severity = Column(String(255))
project_id = Column(String(255))
component_id = Column(String(255))
summary = Column(String(255))
description = Column(String(255))
ticket_id = Column(String(255))
class SecurityAlarmJsonSerializer(serialize.JsonSerializer):
"""Security report serializer"""
__attributes__ = ['id', 'plugin_id', 'alarm_id', 'timestamp', 'status',
'severity', 'project_id', 'component_id', 'summary',
'description', 'ticket_id', 'deleted', 'created_at',
'deleted_at', 'updated_at']
__required__ = ['id', 'title']
__attribute_serializer__ = dict(created_at='date', deleted_at='date',
acknowledged_at='date')
__object_class__ = SecurityAlarm
class Task(Base, CerberusBase):
"""Tasks for security purposes (e.g: daily scans...)
"""
__tablename__ = 'task'
__table_args__ = ()
id = Column(Integer, primary_key=True)
name = Column(String(255))
method = Column(String(255))
type = Column(String(255))
period = Column(Integer)
plugin_id = Column(String(255))
running = Column(Boolean)
uuid = Column(String(255))
class TaskJsonSerializer(serialize.JsonSerializer):
"""Security report serializer"""
__attributes__ = ['id', 'name', 'method', 'type', 'period',
'plugin_id', 'running', 'uuid', 'deleted', 'created_at',
'deleted_at', 'updated_at']
__required__ = ['id', ]
__attribute_serializer__ = dict(created_at='date', deleted_at='date',
acknowledged_at='date')
__object_class__ = Task
def register_models(engine):
"""Creates database tables for all models with the given engine."""
models = (PluginInfo, SecurityReport, SecurityAlarm, Task)
for model in models:
model.metadata.create_all(engine)
def unregister_models(engine):
"""Drops database tables for all models with the given engine."""
models = (PluginInfo, SecurityReport, SecurityAlarm, Task)
for model in models:
model.metadata.drop_all(engine)
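A short sketch of how these models are created and persisted, mirroring the dictionaries the manager builds and the save() helper defined above; the field values are placeholders and a configured database connection is assumed:

# Placeholder values; update()/save() come from the oslo ModelBase mixin
# and the CerberusBase.save() defined above.
report = SecurityReport()
report.update({'uuid': 'aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa',
               'plugin_id': 'plugin-uuid',
               'report_id': 'report-1',
               'title': 'Example report',
               'component_id': 'component-uuid'})
report.save()    # opens a session via cerberus.db.sqlalchemy.api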

View File

@ -1,592 +0,0 @@
#
# Copyright (c) 2014 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import json
import uuid
from oslo.config import cfg
from oslo import messaging
from stevedore import extension
from cerberus.common import errors
from cerberus.common import exception as cerberus_exception
from cerberus.common import service
from cerberus.db.sqlalchemy import api as db_api
from cerberus import notifications
from cerberus.openstack.common import log
from cerberus.openstack.common import loopingcall
from cerberus.openstack.common import threadgroup
from plugins import base
OPTS = [
cfg.StrOpt('notifier_topic',
default='notifications',
help='The topic that Cerberus uses for generating '
'notifications')
]
cfg.CONF.register_opts(OPTS)
LOG = log.getLogger(__name__)
_SECURITY_REPORT = 'security_report'
def store_report_and_notify(title, plugin_id, report_id, component_id,
component_name, component_type, project_id,
description, security_rating, vulnerabilities,
vulnerabilities_number, last_report_date):
report_uuid = uuid.uuid4()
report = {'title': title,
'plugin_id': plugin_id,
'uuid': str(report_uuid),
'report_id': report_id,
'component_id': component_id,
'component_type': component_type,
'component_name': component_name,
'project_id': project_id,
'description': description,
'security_rating': security_rating,
'vulnerabilities': vulnerabilities,
'vulnerabilities_number': vulnerabilities_number}
try:
db_api.security_report_create(report)
db_api.security_report_update_last_report_date(
report_uuid, last_report_date)
notifications.send_notification('store', 'security_report', report)
except cerberus_exception.DBException:
raise
def store_alarm_and_notify(plugin_id, alarm_id, timestamp, status, severity,
component_id, description, summary):
alarm = {'plugin_id': plugin_id,
'alarm_id': alarm_id,
'timestamp': timestamp,
'status': status,
'severity': severity,
'component_id': component_id,
'description': description,
'summary': summary}
try:
db_api.security_alarm_create(alarm)
notifications.send_notification('store', 'security_alarm', alarm)
except cerberus_exception.DBException:
raise
class CerberusManager(service.CerberusService):
TASK_NAMESPACE = 'cerberus.plugins'
@classmethod
def _get_cerberus_manager(cls):
return extension.ExtensionManager(
namespace=cls.TASK_NAMESPACE,
invoke_on_load=True,
)
def __init__(self):
super(CerberusManager, self).__init__()
self.notifier = None
def _register_plugin(self, extension):
"""Register plugin in database
:param extension: stevedore extension containing the plugin to register
:return:
"""
version = extension.entry_point.dist.version
plugin = extension.obj
db_plugin_info = db_api.plugin_info_get(plugin._name)
if db_plugin_info is None:
db_plugin_info = db_api.plugin_info_create({'name': plugin._name,
'uuid': uuid.uuid4(),
'version': version,
'provider':
plugin.PROVIDER,
'type': plugin.TYPE,
'description':
plugin.DESCRIPTION,
'tool_name':
plugin.TOOL_NAME})
else:
db_api.plugin_version_update(db_plugin_info.id, version)
plugin._uuid = db_plugin_info.uuid
def add_stored_tasks(self):
"""Add stored tasks when Cerberus starts"""
tasks = db_api.get_all_tasks()
for task in tasks:
kwargs = {}
kwargs['task_name'] = task.name
kwargs['task_type'] = task.type
kwargs['task_period'] = task.period
kwargs['task_id'] = task.uuid
kwargs['running'] = task.running
kwargs['persistent'] = True
self._add_task(task.plugin_id, task.method, **kwargs)
def start(self):
"""Start Cerberus Manager"""
self.rpc_server = None
self.notification_server = None
super(CerberusManager, self).start()
transport = messaging.get_transport(cfg.CONF)
self.notifier = notifications._get_notifier()
targets = []
plugins = []
self.cerberus_manager = self._get_cerberus_manager()
if not list(self.cerberus_manager):
LOG.warning('Failed to load any task handlers for %s',
self.TASK_NAMESPACE)
for extension in self.cerberus_manager:
handler = extension.obj
LOG.debug('Plugin loaded: ' + extension.name)
LOG.debug(('Event types from %(name)s: %(type)s')
% {'name': extension.name,
'type': ', '.join(handler._subscribedEvents)})
self._register_plugin(extension)
handler.register_manager(self)
targets.extend(handler.get_targets(cfg.CONF))
plugins.append(handler)
self.add_stored_tasks()
if transport:
rpc_target = messaging.Target(topic='test_rpc', server='server1')
self.rpc_server = messaging.get_rpc_server(transport, rpc_target,
[self],
executor='eventlet')
self.notification_server = messaging.get_notification_listener(
transport, targets, plugins, executor='eventlet')
LOG.info("RPC Server starting...")
self.rpc_server.start()
self.notification_server.start()
def _get_unique_task(self, task_id):
"""Get unique task (executed once) thanks to its identifier
:param task_id: the uique identifier of the task
:return: the task or None if there is not any task with this id
"""
try:
unique_task = next(
thread for thread in self.tg.threads
if (thread.kw.get('task_id', None) == task_id))
except StopIteration:
return None
return unique_task
def _get_recurrent_task(self, task_id):
"""Get recurrent task thanks to its identifier
:param task_id: the uique identifier of the task
:return: the task or None if there is not any task with this id
"""
try:
recurrent_task = next(timer for timer in self.tg.timers if
(timer.kw.get('task_id', None) == task_id))
except StopIteration:
return None
return recurrent_task
def _add_unique_task(self, callback, *args, **kwargs):
"""Add an unique task (executed once) without delay
:param callback: Callable function to call when it's necessary
:param args: list of positional arguments to call the callback with
:param kwargs: dict of keyword arguments to call the callback with
:return the thread object that is created
"""
return self.tg.add_thread(callback, *args, **kwargs)
def _add_stopped_reccurent_task(self, callback, period, initial_delay=None,
*args, **kwargs):
"""Add a recurrent task (executed periodically) without starting it
:param callback: Callable function to call when it's necessary
        :param period: the time in seconds between two executions of the task
        :param initial_delay: the delay in seconds before the first execution
        of the task occurs
:param args: list of positional arguments to call the callback with
:param kwargs: dict of keyword arguments to call the callback with
"""
return self.tg.add_stopped_timer(callback, initial_delay,
*args, **kwargs)
def _add_recurrent_task(self, callback, period, initial_delay=None, *args,
**kwargs):
"""Add a recurrent task (executed periodically)
:param callback: Callable function to call when it's necessary
        :param period: the time in seconds between two executions of the task
        :param initial_delay: the delay in seconds before the first execution
        of the task occurs
:param args: list of positional arguments to call the callback with
:param kwargs: dict of keyword arguments to call the callback with
"""
return self.tg.add_timer(period, callback, initial_delay, *args,
**kwargs)
def get_plugins(self, ctx):
'''List plugins loaded by Cerberus manager
This method is called by the Cerberus-api rpc client
'''
json_plugins = []
for extension in self.cerberus_manager:
plugin = extension.obj
res = json.dumps(plugin, cls=base.PluginEncoder)
json_plugins.append(res)
return json_plugins
def _get_plugin_from_uuid(self, plugin_id):
for extension in self.cerberus_manager:
plugin = extension.obj
if plugin._uuid == plugin_id:
return plugin
return None
def get_plugin_from_uuid(self, ctx, uuid):
plugin = self._get_plugin_from_uuid(uuid)
if plugin is not None:
return json.dumps(plugin, cls=base.PluginEncoder)
else:
return None
def _add_task(self, plugin_id, method_, *args, **kwargs):
'''Add a task in the Cerberus manager
:param plugin_id: the uuid of the plugin to call method onto
:param method_: the method to call back
:param task_type: the type of task to create
:param args: some extra arguments
:param kwargs: some extra keyworded arguments
'''
kwargs['plugin_id'] = plugin_id
task_type = kwargs.get('task_type', "unique")
plugin = self._get_plugin_from_uuid(plugin_id)
if plugin is None:
raise errors.PluginNotFound(plugin_id)
if (task_type.lower() == 'recurrent'):
try:
task_period = int(kwargs.get('task_period', None))
except (TypeError, ValueError) as e:
LOG.exception(e)
raise errors.TaskPeriodNotInteger()
try:
if kwargs.get('running', True) is True:
task = self._add_recurrent_task(getattr(plugin, method_),
task_period,
*args,
**kwargs)
else:
task = self._add_stopped_reccurent_task(
getattr(plugin, method_),
task_period,
*args,
**kwargs)
except TypeError as e:
LOG.exception(e)
raise errors.MethodNotString()
except AttributeError as e:
LOG.exception(e)
raise errors.MethodNotCallable(method_,
plugin.__class__.__name__)
else:
try:
task = self._add_unique_task(
getattr(plugin, method_),
*args,
**kwargs)
except TypeError as e:
LOG.exception(e)
raise errors.MethodNotString()
except AttributeError as e:
LOG.exception(e)
raise errors.MethodNotCallable(method_,
plugin.__class__.__name__)
return task
def _store_task(self, task, method_):
try:
task_period_ = task.kw.get('task_period', None)
if task_period_ is not None:
task_period = int(task_period_)
else:
task_period = task_period_
db_api.create_task({'name': task.kw.get('task_name',
'Unknown'),
'method': str(method_),
'type': task.kw['task_type'],
'period': task_period,
'plugin_id': task.kw['plugin_id'],
'running': True,
'uuid': task.kw['task_id']})
except Exception as e:
LOG.exception(e)
pass
def create_task(self, ctx, plugin_id, method_, *args, **kwargs):
"""Create a task
        This method is called by an rpc client. It adds a task to the manager
        and stores it if the task is persistent
:param ctx: a request context dict supplied by client
:param plugin_id: the uuid of the plugin to call method onto
:param method_: the method to call back
:param args: some extra arguments
:param kwargs: some extra keyworded arguments
"""
task_id = uuid.uuid4()
try:
task = self._add_task(plugin_id, method_, *args,
task_id=str(task_id), **kwargs)
except Exception:
raise
if kwargs.get('persistent', False) is True:
try:
self._store_task(task, method_)
except Exception as e:
LOG.exception(e)
pass
return str(task_id)
def _stop_recurrent_task(self, task_id):
"""Stop the recurrent task but does not remove it from the ThreadGroup.
The task still exists and could be started. Plus, if the task is
running, wait for the end of its execution
:param task_id: the id of the recurrent task to stop
:return:
:raises:
StopIteration: the task is not found
"""
recurrent_task = self._get_recurrent_task(task_id)
if recurrent_task is None:
raise errors.TaskNotFound(task_id)
recurrent_task.stop()
if recurrent_task.kw.get('persistent', False) is True:
try:
db_api.update_state_task(task_id, False)
except Exception as e:
LOG.exception(e)
raise e
def _stop_unique_task(self, task_id):
"""Stop the task. This task is automatically deleted as it's not
recurrent
"""
unique_task = self._get_unique_task(task_id)
if unique_task is None:
raise errors.TaskNotFound(task_id)
unique_task.stop()
if unique_task.kw.get('persistent', False) is True:
try:
db_api.delete_task(task_id)
except Exception as e:
LOG.exception(e)
raise e
def _stop_task(self, task_id):
task = self._get_task(task_id)
if isinstance(task, loopingcall.FixedIntervalLoopingCall):
try:
self._stop_recurrent_task(task_id)
except errors.InvalidOperation:
raise
elif isinstance(task, threadgroup.Thread):
try:
self._stop_unique_task(task_id)
except errors.InvalidOperation:
raise
def stop_task(self, ctx, task_id):
try:
self._stop_task(task_id)
except errors.InvalidOperation:
raise
return task_id
def _delete_recurrent_task(self, task_id):
"""
Stop the task and delete the recurrent task from the ThreadGroup.
If the task is running, wait for the end of its execution
:param task_id: the identifier of the task to delete
:return:
"""
recurrent_task = self._get_recurrent_task(task_id)
if (recurrent_task is None):
raise errors.TaskDeletionNotAllowed(task_id)
recurrent_task.stop()
try:
self.tg.timers.remove(recurrent_task)
except ValueError:
raise
if recurrent_task.kw.get('persistent', False) is True:
try:
db_api.delete_task(task_id)
except Exception as e:
LOG.exception(e)
raise e
def delete_recurrent_task(self, ctx, task_id):
'''
This method is designed to be called by an rpc client.
E.g: Cerberus-api
Stop the task and delete the recurrent task from the ThreadGroup.
If the task is running, wait for the end of its execution
:param ctx: a request context dict supplied by client
:param task_id: the identifier of the task to delete
'''
try:
self._delete_recurrent_task(task_id)
except errors.InvalidOperation:
raise
return task_id
def _force_delete_recurrent_task(self, task_id):
"""
Stop the task even if it is running and delete the recurrent task from
the ThreadGroup.
:param task_id: the identifier of the task to force delete
:return:
"""
recurrent_task = self._get_recurrent_task(task_id)
if (recurrent_task is None):
raise errors.TaskDeletionNotAllowed(task_id)
recurrent_task.stop()
recurrent_task.gt.kill()
try:
self.tg.timers.remove(recurrent_task)
except ValueError:
raise
if recurrent_task.kw.get('persistent', False) is True:
try:
db_api.delete_task(task_id)
except Exception as e:
LOG.exception(e)
raise e
def force_delete_recurrent_task(self, ctx, task_id):
'''
This method is designed to be called by an rpc client.
E.g: Cerberus-api
Stop the task even if it is running and delete the recurrent task
from the ThreadGroup.
:param ctx: a request context dict supplied by client
:param task_id: the identifier of the task to force delete
'''
try:
self._force_delete_recurrent_task(task_id)
except errors.InvalidOperation:
raise
return task_id
def _get_tasks(self):
tasks = []
for timer in self.tg.timers:
tasks.append(timer)
for thread in self.tg.threads:
tasks.append(thread)
return tasks
def _get_task(self, task_id):
task = self._get_unique_task(task_id)
task_ = self._get_recurrent_task(task_id)
if (task is None and task_ is None):
raise errors.TaskNotFound(task_id)
return task if task is not None else task_
def get_tasks(self, ctx):
tasks_ = []
tasks = self._get_tasks()
for task in tasks:
if (isinstance(task, loopingcall.FixedIntervalLoopingCall)):
tasks_.append(
json.dumps(task,
cls=base.FixedIntervalLoopingCallEncoder))
elif (isinstance(task, threadgroup.Thread)):
tasks_.append(
json.dumps(task,
cls=base.ThreadEncoder))
return tasks_
def get_task(self, ctx, task_id):
try:
task = self._get_task(task_id)
except errors.InvalidOperation:
raise
if isinstance(task, loopingcall.FixedIntervalLoopingCall):
return json.dumps(task,
cls=base.FixedIntervalLoopingCallEncoder)
elif isinstance(task, threadgroup.Thread):
return json.dumps(task,
cls=base.ThreadEncoder)
def _start_recurrent_task(self, task_id):
"""
Start the task
:param task_id: the identifier of the task to start
:return:
"""
recurrent_task = self._get_recurrent_task(task_id)
if (recurrent_task is None):
raise errors.TaskStartNotAllowed(str(task_id))
period = recurrent_task.kw.get("task_period", None)
if recurrent_task._running is True:
raise errors.TaskStartNotPossible(str(task_id))
else:
try:
recurrent_task.start(int(period))
if recurrent_task.kw.get('persistent', False) is True:
db_api.update_state_task(task_id, True)
except Exception as e:
LOG.exception(e)
raise e
def start_recurrent_task(self, ctx, task_id):
'''
This method is designed to be called by an rpc client.
E.g: Cerberus-api
        Start a recurrent task after it has been stopped
:param ctx: a request context dict supplied by client
:param task_id: the identifier of the task to start
'''
try:
self._start_recurrent_task(task_id)
except errors.InvalidOperation:
raise
return task_id
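The methods above that take a ctx argument (get_plugins, create_task, stop_task, start_recurrent_task, ...) form the RPC surface exposed on the 'test_rpc' topic. A rough sketch of how a client such as cerberus-api could drive it; the plugin id, method name and period are placeholders, and the transport comes from whatever oslo.messaging configuration is in effect:

# Hedged sketch of an RPC caller; identifier values are placeholders.
from oslo.config import cfg
from oslo import messaging

transport = messaging.get_transport(cfg.CONF)
client = messaging.RPCClient(transport, messaging.Target(topic='test_rpc'))

task_id = client.call({}, 'create_task',
                      plugin_id='plugin-uuid',
                      method_='process',
                      task_type='recurrent',
                      task_period=3600,
                      task_name='daily-scan',
                      persistent=True)
client.call({}, 'stop_task', task_id=task_id)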

View File

@ -1,93 +0,0 @@
#
# Copyright (c) 2015 EUROGICIEL
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import socket
from oslo.config import cfg
from oslo import messaging
from cerberus.openstack.common.gettextutils import _ # noqa
from cerberus.openstack.common import log
notifier_opts = [
cfg.StrOpt('default_publisher_id',
default=None,
help='Default publisher_id for outgoing notifications'),
cfg.StrOpt('notifier_topic',
default='notifications',
help='The topic that Cerberus uses for generating '
'notifications')
]
cfg.CONF.register_opts(notifier_opts)
LOG = log.getLogger(__name__)
_notifier = None
def _get_notifier():
"""Return a notifier object.
If _notifier is None it means that a notifier object has not been set.
If _notifier is False it means that a notifier has previously failed to
construct.
Otherwise it is a constructed Notifier object.
"""
global _notifier
if _notifier is None:
host = cfg.CONF.default_publisher_id or socket.gethostname()
try:
transport = messaging.get_transport(cfg.CONF)
_notifier = messaging.Notifier(transport, "security.%s" % host,
topic=cfg.CONF.notifier_topic)
except Exception:
LOG.exception("Failed to construct notifier")
_notifier = False
return _notifier
def _reset_notifier():
global _notifier
_notifier = None
def send_notification(operation, resource_type, payload):
"""Send notification to inform observers about the affected resource.
This method doesn't raise an exception when sending the notification fails.
:param operation: operation being performed (created, updated, or deleted)
:param resource_type: type of resource being operated on
:param resource_id: ID of resource being operated on
"""
context = {}
service = 'security'
event_type = '%(service)s.%(resource_type)s.%(operation)s' % {
'service': service,
'resource_type': resource_type,
'operation': operation}
notifier = _get_notifier()
if notifier:
try:
LOG.info('Sending %(event_type)s notification...',
{'event_type': event_type})
notifier.info(context, event_type, payload)
except Exception:
LOG.exception(_(
'Failed to send %(event_type)s notification'),
{'event_type': event_type})
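A minimal usage sketch for send_notification; the import path and payload below are assumptions, since the file path is collapsed in this diff.

# Hypothetical usage; module path and payload content are illustrative only.
from cerberus.common import notifications

# Emits an event of type 'security.security_report.created'.
notifications.send_notification(
    'created',
    'security_report',
    {'id': 'report-1', 'title': 'example report'})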

View File

@ -1,17 +0,0 @@
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import six
six.add_move(six.MovedModule('mox', 'mox', 'mox3.mox'))

View File

@ -1,40 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""oslo.i18n integration module.
See http://docs.openstack.org/developer/oslo.i18n/usage.html
"""
import oslo.i18n
# NOTE(dhellmann): This reference to o-s-l-o will be replaced by the
# application name when this module is synced into the separate
# repository. It is OK to have more than one translation function
# using the same domain, since there will still only be one message
# catalog.
_translators = oslo.i18n.TranslatorFactory(domain='oslo')
# The primary translation function using the well-known name "_"
_ = _translators.primary
# Translators for log levels.
#
# The abbreviated names are meant to reflect the usual use of a short
# name like '_'. The "L" is for "log" and the other letter comes from
# the level.
_LI = _translators.log_info
_LW = _translators.log_warning
_LE = _translators.log_error
_LC = _translators.log_critical
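As a usage sketch, assuming the module is synced in as cerberus.openstack.common.i18n (the exact path is not visible here), code elsewhere in the tree would mark strings like this:

# Hypothetical usage; the import path follows the sync convention described above.
from cerberus.openstack.common.i18n import _, _LW
from cerberus.openstack.common import log

LOG = log.getLogger(__name__)

def validate(value):
    if value is None:
        LOG.warning(_LW('missing value, falling back to default'))
        raise ValueError(_('a value is required'))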

View File

@ -1,221 +0,0 @@
# Copyright 2013 OpenStack Foundation
# Copyright 2013 Spanish National Research Council.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# E0202: An attribute inherited from %s hide this method
# pylint: disable=E0202
import abc
import argparse
import os
import six
from stevedore import extension
from cerberus.openstack.common.apiclient import exceptions
_discovered_plugins = {}
def discover_auth_systems():
"""Discover the available auth-systems.
This won't take into account the old style auth-systems.
"""
global _discovered_plugins
_discovered_plugins = {}
def add_plugin(ext):
_discovered_plugins[ext.name] = ext.plugin
ep_namespace = "cerberus.openstack.common.apiclient.auth"
mgr = extension.ExtensionManager(ep_namespace)
mgr.map(add_plugin)
def load_auth_system_opts(parser):
"""Load options needed by the available auth-systems into a parser.
This function will try to populate the parser with options from the
available plugins.
"""
group = parser.add_argument_group("Common auth options")
BaseAuthPlugin.add_common_opts(group)
for name, auth_plugin in six.iteritems(_discovered_plugins):
group = parser.add_argument_group(
"Auth-system '%s' options" % name,
conflict_handler="resolve")
auth_plugin.add_opts(group)
def load_plugin(auth_system):
try:
plugin_class = _discovered_plugins[auth_system]
except KeyError:
raise exceptions.AuthSystemNotFound(auth_system)
return plugin_class(auth_system=auth_system)
def load_plugin_from_args(args):
"""Load required plugin and populate it with options.
Try to guess auth system if it is not specified. Systems are tried in
alphabetical order.
:type args: argparse.Namespace
:raises: AuthPluginOptionsMissing
"""
auth_system = args.os_auth_system
if auth_system:
plugin = load_plugin(auth_system)
plugin.parse_opts(args)
plugin.sufficient_options()
return plugin
for plugin_auth_system in sorted(six.iterkeys(_discovered_plugins)):
plugin_class = _discovered_plugins[plugin_auth_system]
plugin = plugin_class()
plugin.parse_opts(args)
try:
plugin.sufficient_options()
except exceptions.AuthPluginOptionsMissing:
continue
return plugin
raise exceptions.AuthPluginOptionsMissing(["auth_system"])
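# Usage sketch (illustrative; the plugin name 'keystone' is an assumption
# about what is registered under the entry-point namespace above):
#
#     discover_auth_systems()
#     plugin = load_plugin('keystone')
#     # or, from parsed CLI arguments:
#     # plugin = load_plugin_from_args(parsed_args)
#     token, endpoint = plugin.token_and_endpoint('publicURL', 'security')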
@six.add_metaclass(abc.ABCMeta)
class BaseAuthPlugin(object):
"""Base class for authentication plugins.
An authentication plugin needs to override at least the authenticate
method to be a valid plugin.
"""
auth_system = None
opt_names = []
common_opt_names = [
"auth_system",
"username",
"password",
"tenant_name",
"token",
"auth_url",
]
def __init__(self, auth_system=None, **kwargs):
self.auth_system = auth_system or self.auth_system
self.opts = dict((name, kwargs.get(name))
for name in self.opt_names)
@staticmethod
def _parser_add_opt(parser, opt):
"""Add an option to parser in two variants.
:param opt: option name (with underscores)
"""
dashed_opt = opt.replace("_", "-")
env_var = "OS_%s" % opt.upper()
arg_default = os.environ.get(env_var, "")
arg_help = "Defaults to env[%s]." % env_var
parser.add_argument(
"--os-%s" % dashed_opt,
metavar="<%s>" % dashed_opt,
default=arg_default,
help=arg_help)
parser.add_argument(
"--os_%s" % opt,
metavar="<%s>" % dashed_opt,
help=argparse.SUPPRESS)
@classmethod
def add_opts(cls, parser):
"""Populate the parser with the options for this plugin.
"""
for opt in cls.opt_names:
# use `BaseAuthPlugin.common_opt_names` since it is never
# changed in child classes
if opt not in BaseAuthPlugin.common_opt_names:
cls._parser_add_opt(parser, opt)
@classmethod
def add_common_opts(cls, parser):
"""Add options that are common for several plugins.
"""
for opt in cls.common_opt_names:
cls._parser_add_opt(parser, opt)
@staticmethod
def get_opt(opt_name, args):
"""Return option name and value.
:param opt_name: name of the option, e.g., "username"
:param args: parsed arguments
"""
return (opt_name, getattr(args, "os_%s" % opt_name, None))
def parse_opts(self, args):
"""Parse the actual auth-system options if any.
This method is expected to populate the attribute `self.opts` with a
        dict containing the options and values needed to perform authentication.
"""
self.opts.update(dict(self.get_opt(opt_name, args)
for opt_name in self.opt_names))
def authenticate(self, http_client):
"""Authenticate using plugin defined method.
The method usually analyses `self.opts` and performs
        a request to the authentication server.
:param http_client: client object that needs authentication
:type http_client: HTTPClient
:raises: AuthorizationFailure
"""
self.sufficient_options()
self._do_authenticate(http_client)
@abc.abstractmethod
def _do_authenticate(self, http_client):
"""Protected method for authentication.
"""
def sufficient_options(self):
"""Check if all required options are present.
:raises: AuthPluginOptionsMissing
"""
missing = [opt
for opt in self.opt_names
if not self.opts.get(opt)]
if missing:
raise exceptions.AuthPluginOptionsMissing(missing)
@abc.abstractmethod
def token_and_endpoint(self, endpoint_type, service_type):
"""Return token and endpoint.
:param service_type: Service type of the endpoint
:type service_type: string
:param endpoint_type: Type of endpoint.
Possible values: public or publicURL,
internal or internalURL,
admin or adminURL
:type endpoint_type: string
:returns: tuple of token and endpoint strings
:raises: EndpointException
"""

View File

@ -1,500 +0,0 @@
# Copyright 2010 Jacob Kaplan-Moss
# Copyright 2011 OpenStack Foundation
# Copyright 2012 Grid Dynamics
# Copyright 2013 OpenStack Foundation
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""
Base utilities to build API operation managers and objects on top of.
"""
# E1102: %s is not callable
# pylint: disable=E1102
import abc
import copy
import six
from six.moves.urllib import parse
from cerberus.openstack.common.apiclient import exceptions
from cerberus.openstack.common import strutils
def getid(obj):
"""Return id if argument is a Resource.
Abstracts the common pattern of allowing both an object or an object's ID
(UUID) as a parameter when dealing with relationships.
"""
try:
if obj.uuid:
return obj.uuid
except AttributeError:
pass
try:
return obj.id
except AttributeError:
return obj
# TODO(aababilov): call run_hooks() in HookableMixin's child classes
class HookableMixin(object):
"""Mixin so classes can register and run hooks."""
_hooks_map = {}
@classmethod
def add_hook(cls, hook_type, hook_func):
"""Add a new hook of specified type.
:param cls: class that registers hooks
:param hook_type: hook type, e.g., '__pre_parse_args__'
:param hook_func: hook function
"""
if hook_type not in cls._hooks_map:
cls._hooks_map[hook_type] = []
cls._hooks_map[hook_type].append(hook_func)
@classmethod
def run_hooks(cls, hook_type, *args, **kwargs):
"""Run all hooks of specified type.
:param cls: class that registers hooks
:param hook_type: hook type, e.g., '__pre_parse_args__'
        :param args: positional args to be passed to every hook function
        :param kwargs: keyword args to be passed to every hook function
"""
hook_funcs = cls._hooks_map.get(hook_type) or []
for hook_func in hook_funcs:
hook_func(*args, **kwargs)
class BaseManager(HookableMixin):
"""Basic manager type providing common operations.
Managers interact with a particular type of API (servers, flavors, images,
etc.) and provide CRUD operations for them.
"""
resource_class = None
def __init__(self, client):
"""Initializes BaseManager with `client`.
:param client: instance of BaseClient descendant for HTTP requests
"""
super(BaseManager, self).__init__()
self.client = client
def _list(self, url, response_key, obj_class=None, json=None):
"""List the collection.
:param url: a partial URL, e.g., '/servers'
:param response_key: the key to be looked up in response dictionary,
e.g., 'servers'
:param obj_class: class for constructing the returned objects
(self.resource_class will be used by default)
:param json: data that will be encoded as JSON and passed in POST
request (GET will be sent by default)
"""
if json:
body = self.client.post(url, json=json).json()
else:
body = self.client.get(url).json()
if obj_class is None:
obj_class = self.resource_class
data = body[response_key]
# NOTE(ja): keystone returns values as list as {'values': [ ... ]}
# unlike other services which just return the list...
try:
data = data['values']
except (KeyError, TypeError):
pass
return [obj_class(self, res, loaded=True) for res in data if res]
def _get(self, url, response_key):
"""Get an object from collection.
:param url: a partial URL, e.g., '/servers'
:param response_key: the key to be looked up in response dictionary,
e.g., 'server'
"""
body = self.client.get(url).json()
return self.resource_class(self, body[response_key], loaded=True)
def _head(self, url):
"""Retrieve request headers for an object.
:param url: a partial URL, e.g., '/servers'
"""
resp = self.client.head(url)
return resp.status_code == 204
def _post(self, url, json, response_key, return_raw=False):
"""Create an object.
:param url: a partial URL, e.g., '/servers'
:param json: data that will be encoded as JSON and passed in POST
request (GET will be sent by default)
:param response_key: the key to be looked up in response dictionary,
e.g., 'servers'
:param return_raw: flag to force returning raw JSON instead of
Python object of self.resource_class
"""
body = self.client.post(url, json=json).json()
if return_raw:
return body[response_key]
return self.resource_class(self, body[response_key])
def _put(self, url, json=None, response_key=None):
"""Update an object with PUT method.
:param url: a partial URL, e.g., '/servers'
:param json: data that will be encoded as JSON and passed in POST
request (GET will be sent by default)
:param response_key: the key to be looked up in response dictionary,
e.g., 'servers'
"""
resp = self.client.put(url, json=json)
# PUT requests may not return a body
if resp.content:
body = resp.json()
if response_key is not None:
return self.resource_class(self, body[response_key])
else:
return self.resource_class(self, body)
def _patch(self, url, json=None, response_key=None):
"""Update an object with PATCH method.
:param url: a partial URL, e.g., '/servers'
:param json: data that will be encoded as JSON and passed in POST
request (GET will be sent by default)
:param response_key: the key to be looked up in response dictionary,
e.g., 'servers'
"""
body = self.client.patch(url, json=json).json()
if response_key is not None:
return self.resource_class(self, body[response_key])
else:
return self.resource_class(self, body)
def _delete(self, url):
"""Delete an object.
:param url: a partial URL, e.g., '/servers/my-server'
"""
return self.client.delete(url)
@six.add_metaclass(abc.ABCMeta)
class ManagerWithFind(BaseManager):
"""Manager with additional `find()`/`findall()` methods."""
@abc.abstractmethod
def list(self):
pass
def find(self, **kwargs):
"""Find a single item with attributes matching ``**kwargs``.
This isn't very efficient: it loads the entire list then filters on
the Python side.
"""
matches = self.findall(**kwargs)
num_matches = len(matches)
if num_matches == 0:
msg = "No %s matching %s." % (self.resource_class.__name__, kwargs)
raise exceptions.NotFound(msg)
elif num_matches > 1:
raise exceptions.NoUniqueMatch()
else:
return matches[0]
def findall(self, **kwargs):
"""Find all items with attributes matching ``**kwargs``.
This isn't very efficient: it loads the entire list then filters on
the Python side.
"""
found = []
searches = kwargs.items()
for obj in self.list():
try:
if all(getattr(obj, attr) == value
for (attr, value) in searches):
found.append(obj)
except AttributeError:
continue
return found
class CrudManager(BaseManager):
"""Base manager class for manipulating entities.
Children of this class are expected to define a `collection_key` and `key`.
- `collection_key`: Usually a plural noun by convention (e.g. `entities`);
      used to refer to collections in both URLs (e.g. `/v3/entities`) and JSON
objects containing a list of member resources (e.g. `{'entities': [{},
{}, {}]}`).
- `key`: Usually a singular noun by convention (e.g. `entity`); used to
refer to an individual member of the collection.
"""
collection_key = None
key = None
def build_url(self, base_url=None, **kwargs):
"""Builds a resource URL for the given kwargs.
Given an example collection where `collection_key = 'entities'` and
        `key = 'entity'`, the following URLs could be generated.
By default, the URL will represent a collection of entities, e.g.::
/entities
If kwargs contains an `entity_id`, then the URL will represent a
specific member, e.g.::
/entities/{entity_id}
:param base_url: if provided, the generated URL will be appended to it
"""
url = base_url if base_url is not None else ''
url += '/%s' % self.collection_key
# do we have a specific entity?
entity_id = kwargs.get('%s_id' % self.key)
if entity_id is not None:
url += '/%s' % entity_id
return url
def _filter_kwargs(self, kwargs):
"""Drop null values and handle ids."""
for key, ref in six.iteritems(kwargs.copy()):
if ref is None:
kwargs.pop(key)
else:
if isinstance(ref, Resource):
kwargs.pop(key)
kwargs['%s_id' % key] = getid(ref)
return kwargs
def create(self, **kwargs):
kwargs = self._filter_kwargs(kwargs)
return self._post(
self.build_url(**kwargs),
{self.key: kwargs},
self.key)
def get(self, **kwargs):
kwargs = self._filter_kwargs(kwargs)
return self._get(
self.build_url(**kwargs),
self.key)
def head(self, **kwargs):
kwargs = self._filter_kwargs(kwargs)
return self._head(self.build_url(**kwargs))
def list(self, base_url=None, **kwargs):
"""List the collection.
:param base_url: if provided, the generated URL will be appended to it
"""
kwargs = self._filter_kwargs(kwargs)
return self._list(
'%(base_url)s%(query)s' % {
'base_url': self.build_url(base_url=base_url, **kwargs),
'query': '?%s' % parse.urlencode(kwargs) if kwargs else '',
},
self.collection_key)
def put(self, base_url=None, **kwargs):
"""Update an element.
:param base_url: if provided, the generated URL will be appended to it
"""
kwargs = self._filter_kwargs(kwargs)
return self._put(self.build_url(base_url=base_url, **kwargs))
def update(self, **kwargs):
kwargs = self._filter_kwargs(kwargs)
params = kwargs.copy()
params.pop('%s_id' % self.key)
return self._patch(
self.build_url(**kwargs),
{self.key: params},
self.key)
def delete(self, **kwargs):
kwargs = self._filter_kwargs(kwargs)
return self._delete(
self.build_url(**kwargs))
def find(self, base_url=None, **kwargs):
"""Find a single item with attributes matching ``**kwargs``.
:param base_url: if provided, the generated URL will be appended to it
"""
kwargs = self._filter_kwargs(kwargs)
rl = self._list(
'%(base_url)s%(query)s' % {
'base_url': self.build_url(base_url=base_url, **kwargs),
'query': '?%s' % parse.urlencode(kwargs) if kwargs else '',
},
self.collection_key)
num = len(rl)
if num == 0:
msg = "No %s matching %s." % (self.resource_class.__name__, kwargs)
raise exceptions.NotFound(404, msg)
elif num > 1:
raise exceptions.NoUniqueMatch
else:
return rl[0]
class Extension(HookableMixin):
"""Extension descriptor."""
SUPPORTED_HOOKS = ('__pre_parse_args__', '__post_parse_args__')
manager_class = None
def __init__(self, name, module):
super(Extension, self).__init__()
self.name = name
self.module = module
self._parse_extension_module()
def _parse_extension_module(self):
self.manager_class = None
for attr_name, attr_value in self.module.__dict__.items():
if attr_name in self.SUPPORTED_HOOKS:
self.add_hook(attr_name, attr_value)
else:
try:
if issubclass(attr_value, BaseManager):
self.manager_class = attr_value
except TypeError:
pass
def __repr__(self):
return "<Extension '%s'>" % self.name
class Resource(object):
"""Base class for OpenStack resources (tenant, user, etc.).
This is pretty much just a bag for attributes.
"""
HUMAN_ID = False
NAME_ATTR = 'name'
def __init__(self, manager, info, loaded=False):
"""Populate and bind to a manager.
:param manager: BaseManager object
:param info: dictionary representing resource attributes
:param loaded: prevent lazy-loading if set to True
"""
self.manager = manager
self._info = info
self._add_details(info)
self._loaded = loaded
def __repr__(self):
reprkeys = sorted(k
for k in self.__dict__.keys()
if k[0] != '_' and k != 'manager')
info = ", ".join("%s=%s" % (k, getattr(self, k)) for k in reprkeys)
return "<%s %s>" % (self.__class__.__name__, info)
@property
def human_id(self):
"""Human-readable ID which can be used for bash completion.
"""
if self.NAME_ATTR in self.__dict__ and self.HUMAN_ID:
return strutils.to_slug(getattr(self, self.NAME_ATTR))
return None
def _add_details(self, info):
for (k, v) in six.iteritems(info):
try:
setattr(self, k, v)
self._info[k] = v
except AttributeError:
# In this case we already defined the attribute on the class
pass
def __getattr__(self, k):
if k not in self.__dict__:
#NOTE(bcwaldon): disallow lazy-loading if already loaded once
if not self.is_loaded():
self.get()
return self.__getattr__(k)
raise AttributeError(k)
else:
return self.__dict__[k]
def get(self):
"""Support for lazy loading details.
        Some clients, such as novaclient, have the option to lazy load the
        details, which can be loaded with this function.
"""
# set_loaded() first ... so if we have to bail, we know we tried.
self.set_loaded(True)
if not hasattr(self.manager, 'get'):
return
new = self.manager.get(self.id)
if new:
self._add_details(new._info)
def __eq__(self, other):
if not isinstance(other, Resource):
return NotImplemented
# two resources of different types are not equal
if not isinstance(other, self.__class__):
return False
if hasattr(self, 'id') and hasattr(other, 'id'):
return self.id == other.id
return self._info == other._info
def is_loaded(self):
return self._loaded
def set_loaded(self, val):
self._loaded = val
def to_dict(self):
return copy.deepcopy(self._info)
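A sketch of how a concrete client would build on these helpers; Entity and EntityManager are invented names, not classes from this repository.

# Hypothetical subclasses built on the helpers above.
class Entity(Resource):
    pass

class EntityManager(CrudManager):
    resource_class = Entity
    collection_key = 'entities'
    key = 'entity'

# Given an http_client (see client.py below), CRUD calls map to URLs:
#   manager.create(name='x')      -> POST /entities
#   manager.get(entity_id='123')  -> GET  /entities/123
#   manager.list()                -> GET  /entities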

View File

@ -1,358 +0,0 @@
# Copyright 2010 Jacob Kaplan-Moss
# Copyright 2011 OpenStack Foundation
# Copyright 2011 Piston Cloud Computing, Inc.
# Copyright 2013 Alessio Ababilov
# Copyright 2013 Grid Dynamics
# Copyright 2013 OpenStack Foundation
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""
OpenStack Client interface. Handles the REST calls and responses.
"""
# E0202: An attribute inherited from %s hide this method
# pylint: disable=E0202
import logging
import time
try:
import simplejson as json
except ImportError:
import json
import requests
from cerberus.openstack.common.apiclient import exceptions
from cerberus.openstack.common import importutils
_logger = logging.getLogger(__name__)
class HTTPClient(object):
"""This client handles sending HTTP requests to OpenStack servers.
Features:
- share authentication information between several clients to different
services (e.g., for compute and image clients);
- reissue authentication request for expired tokens;
- encode/decode JSON bodies;
- raise exceptions on HTTP errors;
- pluggable authentication;
- store authentication information in a keyring;
- store time spent for requests;
- register clients for particular services, so one can use
`http_client.identity` or `http_client.compute`;
- log requests and responses in a format that is easy to copy-and-paste
into terminal and send the same request with curl.
"""
user_agent = "cerberus.openstack.common.apiclient"
def __init__(self,
auth_plugin,
region_name=None,
endpoint_type="publicURL",
original_ip=None,
verify=True,
cert=None,
timeout=None,
timings=False,
keyring_saver=None,
debug=False,
user_agent=None,
http=None):
self.auth_plugin = auth_plugin
self.endpoint_type = endpoint_type
self.region_name = region_name
self.original_ip = original_ip
self.timeout = timeout
self.verify = verify
self.cert = cert
self.keyring_saver = keyring_saver
self.debug = debug
self.user_agent = user_agent or self.user_agent
self.times = [] # [("item", starttime, endtime), ...]
self.timings = timings
# requests within the same session can reuse TCP connections from pool
self.http = http or requests.Session()
self.cached_token = None
def _http_log_req(self, method, url, kwargs):
if not self.debug:
return
string_parts = [
"curl -i",
"-X '%s'" % method,
"'%s'" % url,
]
for element in kwargs['headers']:
header = "-H '%s: %s'" % (element, kwargs['headers'][element])
string_parts.append(header)
_logger.debug("REQ: %s" % " ".join(string_parts))
if 'data' in kwargs:
_logger.debug("REQ BODY: %s\n" % (kwargs['data']))
def _http_log_resp(self, resp):
if not self.debug:
return
_logger.debug(
"RESP: [%s] %s\n",
resp.status_code,
resp.headers)
if resp._content_consumed:
_logger.debug(
"RESP BODY: %s\n",
resp.text)
def serialize(self, kwargs):
if kwargs.get('json') is not None:
kwargs['headers']['Content-Type'] = 'application/json'
kwargs['data'] = json.dumps(kwargs['json'])
try:
del kwargs['json']
except KeyError:
pass
def get_timings(self):
return self.times
def reset_timings(self):
self.times = []
def request(self, method, url, **kwargs):
"""Send an http request with the specified characteristics.
Wrapper around `requests.Session.request` to handle tasks such as
setting headers, JSON encoding/decoding, and error handling.
:param method: method of HTTP request
:param url: URL of HTTP request
:param kwargs: any other parameter that can be passed to
            requests.Session.request (such as `headers`) or `json`
that will be encoded as JSON and used as `data` argument
"""
kwargs.setdefault("headers", kwargs.get("headers", {}))
kwargs["headers"]["User-Agent"] = self.user_agent
if self.original_ip:
kwargs["headers"]["Forwarded"] = "for=%s;by=%s" % (
self.original_ip, self.user_agent)
if self.timeout is not None:
kwargs.setdefault("timeout", self.timeout)
kwargs.setdefault("verify", self.verify)
if self.cert is not None:
kwargs.setdefault("cert", self.cert)
self.serialize(kwargs)
self._http_log_req(method, url, kwargs)
if self.timings:
start_time = time.time()
resp = self.http.request(method, url, **kwargs)
if self.timings:
self.times.append(("%s %s" % (method, url),
start_time, time.time()))
self._http_log_resp(resp)
if resp.status_code >= 400:
_logger.debug(
"Request returned failure status: %s",
resp.status_code)
raise exceptions.from_response(resp, method, url)
return resp
@staticmethod
def concat_url(endpoint, url):
"""Concatenate endpoint and final URL.
E.g., "http://keystone/v2.0/" and "/tokens" are concatenated to
"http://keystone/v2.0/tokens".
:param endpoint: the base URL
:param url: the final URL
"""
return "%s/%s" % (endpoint.rstrip("/"), url.strip("/"))
def client_request(self, client, method, url, **kwargs):
"""Send an http request using `client`'s endpoint and specified `url`.
If request was rejected as unauthorized (possibly because the token is
expired), issue one authorization attempt and send the request once
again.
:param client: instance of BaseClient descendant
:param method: method of HTTP request
:param url: URL of HTTP request
:param kwargs: any other parameter that can be passed to
            `HTTPClient.request`
"""
filter_args = {
"endpoint_type": client.endpoint_type or self.endpoint_type,
"service_type": client.service_type,
}
token, endpoint = (self.cached_token, client.cached_endpoint)
just_authenticated = False
if not (token and endpoint):
try:
token, endpoint = self.auth_plugin.token_and_endpoint(
**filter_args)
except exceptions.EndpointException:
pass
if not (token and endpoint):
self.authenticate()
just_authenticated = True
token, endpoint = self.auth_plugin.token_and_endpoint(
**filter_args)
if not (token and endpoint):
raise exceptions.AuthorizationFailure(
"Cannot find endpoint or token for request")
old_token_endpoint = (token, endpoint)
kwargs.setdefault("headers", {})["X-Auth-Token"] = token
self.cached_token = token
client.cached_endpoint = endpoint
# Perform the request once. If we get Unauthorized, then it
# might be because the auth token expired, so try to
# re-authenticate and try again. If it still fails, bail.
try:
return self.request(
method, self.concat_url(endpoint, url), **kwargs)
except exceptions.Unauthorized as unauth_ex:
if just_authenticated:
raise
self.cached_token = None
client.cached_endpoint = None
self.authenticate()
try:
token, endpoint = self.auth_plugin.token_and_endpoint(
**filter_args)
except exceptions.EndpointException:
raise unauth_ex
if (not (token and endpoint) or
old_token_endpoint == (token, endpoint)):
raise unauth_ex
self.cached_token = token
client.cached_endpoint = endpoint
kwargs["headers"]["X-Auth-Token"] = token
return self.request(
method, self.concat_url(endpoint, url), **kwargs)
def add_client(self, base_client_instance):
"""Add a new instance of :class:`BaseClient` descendant.
`self` will store a reference to `base_client_instance`.
Example:
>>> def test_clients():
... from keystoneclient.auth import keystone
... from openstack.common.apiclient import client
... auth = keystone.KeystoneAuthPlugin(
... username="user", password="pass", tenant_name="tenant",
... auth_url="http://auth:5000/v2.0")
... openstack_client = client.HTTPClient(auth)
... # create nova client
... from novaclient.v1_1 import client
... client.Client(openstack_client)
... # create keystone client
... from keystoneclient.v2_0 import client
... client.Client(openstack_client)
... # use them
... openstack_client.identity.tenants.list()
... openstack_client.compute.servers.list()
"""
service_type = base_client_instance.service_type
if service_type and not hasattr(self, service_type):
setattr(self, service_type, base_client_instance)
def authenticate(self):
self.auth_plugin.authenticate(self)
# Store the authentication results in the keyring for later requests
if self.keyring_saver:
self.keyring_saver.save(self)
class BaseClient(object):
"""Top-level object to access the OpenStack API.
This client uses :class:`HTTPClient` to send requests. :class:`HTTPClient`
will handle a bunch of issues such as authentication.
"""
service_type = None
endpoint_type = None # "publicURL" will be used
cached_endpoint = None
def __init__(self, http_client, extensions=None):
self.http_client = http_client
http_client.add_client(self)
# Add in any extensions...
if extensions:
for extension in extensions:
if extension.manager_class:
setattr(self, extension.name,
extension.manager_class(self))
def client_request(self, method, url, **kwargs):
return self.http_client.client_request(
self, method, url, **kwargs)
def head(self, url, **kwargs):
return self.client_request("HEAD", url, **kwargs)
def get(self, url, **kwargs):
return self.client_request("GET", url, **kwargs)
def post(self, url, **kwargs):
return self.client_request("POST", url, **kwargs)
def put(self, url, **kwargs):
return self.client_request("PUT", url, **kwargs)
def delete(self, url, **kwargs):
return self.client_request("DELETE", url, **kwargs)
def patch(self, url, **kwargs):
return self.client_request("PATCH", url, **kwargs)
@staticmethod
def get_class(api_name, version, version_map):
"""Returns the client class for the requested API version
:param api_name: the name of the API, e.g. 'compute', 'image', etc
:param version: the requested API version
:param version_map: a dict of client classes keyed by version
:rtype: a client class for the requested API version
"""
try:
client_path = version_map[str(version)]
except (KeyError, ValueError):
msg = "Invalid %s client version '%s'. must be one of: %s" % (
(api_name, version, ', '.join(version_map.keys())))
raise exceptions.UnsupportedVersion(msg)
return importutils.import_class(client_path)
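get_class is normally wrapped by a top-level client factory; a sketch with an invented version map follows (the dotted paths are illustrative only).

# Hypothetical factory; the module paths below are not part of this repository.
_VERSION_MAP = {
    '1': 'cerberusclient.v1.client.Client',
    '2': 'cerberusclient.v2.client.Client',
}

def make_client(version, http_client):
    cls = BaseClient.get_class('security', version, _VERSION_MAP)
    return cls(http_client)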

View File

@ -1,459 +0,0 @@
# Copyright 2010 Jacob Kaplan-Moss
# Copyright 2011 Nebula, Inc.
# Copyright 2013 Alessio Ababilov
# Copyright 2013 OpenStack Foundation
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""
Exception definitions.
"""
import inspect
import sys
import six
class ClientException(Exception):
"""The base exception class for all exceptions this library raises.
"""
pass
class MissingArgs(ClientException):
"""Supplied arguments are not sufficient for calling a function."""
def __init__(self, missing):
self.missing = missing
msg = "Missing argument(s): %s" % ", ".join(missing)
super(MissingArgs, self).__init__(msg)
class ValidationError(ClientException):
"""Error in validation on API client side."""
pass
class UnsupportedVersion(ClientException):
"""User is trying to use an unsupported version of the API."""
pass
class CommandError(ClientException):
"""Error in CLI tool."""
pass
class AuthorizationFailure(ClientException):
"""Cannot authorize API client."""
pass
class ConnectionRefused(ClientException):
"""Cannot connect to API service."""
pass
class AuthPluginOptionsMissing(AuthorizationFailure):
"""Auth plugin misses some options."""
def __init__(self, opt_names):
super(AuthPluginOptionsMissing, self).__init__(
"Authentication failed. Missing options: %s" %
", ".join(opt_names))
self.opt_names = opt_names
class AuthSystemNotFound(AuthorizationFailure):
"""User has specified a AuthSystem that is not installed."""
def __init__(self, auth_system):
super(AuthSystemNotFound, self).__init__(
"AuthSystemNotFound: %s" % repr(auth_system))
self.auth_system = auth_system
class NoUniqueMatch(ClientException):
"""Multiple entities found instead of one."""
pass
class EndpointException(ClientException):
"""Something is rotten in Service Catalog."""
pass
class EndpointNotFound(EndpointException):
"""Could not find requested endpoint in Service Catalog."""
pass
class AmbiguousEndpoints(EndpointException):
"""Found more than one matching endpoint in Service Catalog."""
def __init__(self, endpoints=None):
super(AmbiguousEndpoints, self).__init__(
"AmbiguousEndpoints: %s" % repr(endpoints))
self.endpoints = endpoints
class HttpError(ClientException):
"""The base exception class for all HTTP exceptions.
"""
http_status = 0
message = "HTTP Error"
def __init__(self, message=None, details=None,
response=None, request_id=None,
url=None, method=None, http_status=None):
self.http_status = http_status or self.http_status
self.message = message or self.message
self.details = details
self.request_id = request_id
self.response = response
self.url = url
self.method = method
formatted_string = "%s (HTTP %s)" % (self.message, self.http_status)
if request_id:
formatted_string += " (Request-ID: %s)" % request_id
super(HttpError, self).__init__(formatted_string)
class HTTPRedirection(HttpError):
"""HTTP Redirection."""
message = "HTTP Redirection"
class HTTPClientError(HttpError):
"""Client-side HTTP error.
Exception for cases in which the client seems to have erred.
"""
message = "HTTP Client Error"
class HttpServerError(HttpError):
"""Server-side HTTP error.
Exception for cases in which the server is aware that it has
erred or is incapable of performing the request.
"""
message = "HTTP Server Error"
class MultipleChoices(HTTPRedirection):
"""HTTP 300 - Multiple Choices.
Indicates multiple options for the resource that the client may follow.
"""
http_status = 300
message = "Multiple Choices"
class BadRequest(HTTPClientError):
"""HTTP 400 - Bad Request.
The request cannot be fulfilled due to bad syntax.
"""
http_status = 400
message = "Bad Request"
class Unauthorized(HTTPClientError):
"""HTTP 401 - Unauthorized.
Similar to 403 Forbidden, but specifically for use when authentication
is required and has failed or has not yet been provided.
"""
http_status = 401
message = "Unauthorized"
class PaymentRequired(HTTPClientError):
"""HTTP 402 - Payment Required.
Reserved for future use.
"""
http_status = 402
message = "Payment Required"
class Forbidden(HTTPClientError):
"""HTTP 403 - Forbidden.
The request was a valid request, but the server is refusing to respond
to it.
"""
http_status = 403
message = "Forbidden"
class NotFound(HTTPClientError):
"""HTTP 404 - Not Found.
The requested resource could not be found but may be available again
in the future.
"""
http_status = 404
message = "Not Found"
class MethodNotAllowed(HTTPClientError):
"""HTTP 405 - Method Not Allowed.
A request was made of a resource using a request method not supported
by that resource.
"""
http_status = 405
message = "Method Not Allowed"
class NotAcceptable(HTTPClientError):
"""HTTP 406 - Not Acceptable.
The requested resource is only capable of generating content not
acceptable according to the Accept headers sent in the request.
"""
http_status = 406
message = "Not Acceptable"
class ProxyAuthenticationRequired(HTTPClientError):
"""HTTP 407 - Proxy Authentication Required.
The client must first authenticate itself with the proxy.
"""
http_status = 407
message = "Proxy Authentication Required"
class RequestTimeout(HTTPClientError):
"""HTTP 408 - Request Timeout.
The server timed out waiting for the request.
"""
http_status = 408
message = "Request Timeout"
class Conflict(HTTPClientError):
"""HTTP 409 - Conflict.
Indicates that the request could not be processed because of conflict
in the request, such as an edit conflict.
"""
http_status = 409
message = "Conflict"
class Gone(HTTPClientError):
"""HTTP 410 - Gone.
Indicates that the resource requested is no longer available and will
not be available again.
"""
http_status = 410
message = "Gone"
class LengthRequired(HTTPClientError):
"""HTTP 411 - Length Required.
The request did not specify the length of its content, which is
required by the requested resource.
"""
http_status = 411
message = "Length Required"
class PreconditionFailed(HTTPClientError):
"""HTTP 412 - Precondition Failed.
The server does not meet one of the preconditions that the requester
put on the request.
"""
http_status = 412
message = "Precondition Failed"
class RequestEntityTooLarge(HTTPClientError):
"""HTTP 413 - Request Entity Too Large.
The request is larger than the server is willing or able to process.
"""
http_status = 413
message = "Request Entity Too Large"
def __init__(self, *args, **kwargs):
try:
self.retry_after = int(kwargs.pop('retry_after'))
except (KeyError, ValueError):
self.retry_after = 0
super(RequestEntityTooLarge, self).__init__(*args, **kwargs)
class RequestUriTooLong(HTTPClientError):
"""HTTP 414 - Request-URI Too Long.
The URI provided was too long for the server to process.
"""
http_status = 414
message = "Request-URI Too Long"
class UnsupportedMediaType(HTTPClientError):
"""HTTP 415 - Unsupported Media Type.
The request entity has a media type which the server or resource does
not support.
"""
http_status = 415
message = "Unsupported Media Type"
class RequestedRangeNotSatisfiable(HTTPClientError):
"""HTTP 416 - Requested Range Not Satisfiable.
The client has asked for a portion of the file, but the server cannot
supply that portion.
"""
http_status = 416
message = "Requested Range Not Satisfiable"
class ExpectationFailed(HTTPClientError):
"""HTTP 417 - Expectation Failed.
The server cannot meet the requirements of the Expect request-header field.
"""
http_status = 417
message = "Expectation Failed"
class UnprocessableEntity(HTTPClientError):
"""HTTP 422 - Unprocessable Entity.
The request was well-formed but was unable to be followed due to semantic
errors.
"""
http_status = 422
message = "Unprocessable Entity"
class InternalServerError(HttpServerError):
"""HTTP 500 - Internal Server Error.
A generic error message, given when no more specific message is suitable.
"""
http_status = 500
message = "Internal Server Error"
# NotImplemented is a python keyword.
class HttpNotImplemented(HttpServerError):
"""HTTP 501 - Not Implemented.
The server either does not recognize the request method, or it lacks
the ability to fulfill the request.
"""
http_status = 501
message = "Not Implemented"
class BadGateway(HttpServerError):
"""HTTP 502 - Bad Gateway.
The server was acting as a gateway or proxy and received an invalid
response from the upstream server.
"""
http_status = 502
message = "Bad Gateway"
class ServiceUnavailable(HttpServerError):
"""HTTP 503 - Service Unavailable.
The server is currently unavailable.
"""
http_status = 503
message = "Service Unavailable"
class GatewayTimeout(HttpServerError):
"""HTTP 504 - Gateway Timeout.
The server was acting as a gateway or proxy and did not receive a timely
response from the upstream server.
"""
http_status = 504
message = "Gateway Timeout"
class HttpVersionNotSupported(HttpServerError):
"""HTTP 505 - HttpVersion Not Supported.
The server does not support the HTTP protocol version used in the request.
"""
http_status = 505
message = "HTTP Version Not Supported"
# _code_map contains all the classes that have http_status attribute.
_code_map = dict(
(getattr(obj, 'http_status', None), obj)
for name, obj in six.iteritems(vars(sys.modules[__name__]))
if inspect.isclass(obj) and getattr(obj, 'http_status', False)
)
def from_response(response, method, url):
"""Returns an instance of :class:`HttpError` or subclass based on response.
:param response: instance of `requests.Response` class
:param method: HTTP method used for request
:param url: URL used for request
"""
kwargs = {
"http_status": response.status_code,
"response": response,
"method": method,
"url": url,
"request_id": response.headers.get("x-compute-request-id"),
}
if "retry-after" in response.headers:
kwargs["retry_after"] = response.headers["retry-after"]
content_type = response.headers.get("Content-Type", "")
if content_type.startswith("application/json"):
try:
body = response.json()
except ValueError:
pass
else:
if isinstance(body, dict):
error = list(body.values())[0]
kwargs["message"] = error.get("message")
kwargs["details"] = error.get("details")
elif content_type.startswith("text/"):
kwargs["details"] = response.text
try:
cls = _code_map[response.status_code]
except KeyError:
if 500 <= response.status_code < 600:
cls = HttpServerError
elif 400 <= response.status_code < 500:
cls = HTTPClientError
else:
cls = HttpError
return cls(**kwargs)
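Callers usually meet from_response indirectly, by catching the mapped classes around an HTTPClient call; a brief sketch, assuming http_client is an HTTPClient instance from client.py above and the URL is invented.

# Hypothetical error handling; the URL is illustrative only.
try:
    resp = http_client.request('GET', 'http://cerberus.example.com/v1/tasks')
except NotFound:
    resp = None                      # 404: treat the resource as absent
except HttpServerError as exc:       # any 5xx mapped by from_response
    raise RuntimeError('cerberus service error: %s' % exc)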

View File

@ -1,173 +0,0 @@
# Copyright 2013 OpenStack Foundation
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""
A fake server that "responds" to API methods with pre-canned responses.
All of these responses come from the spec, so if for some reason the spec's
wrong the tests might raise AssertionError. I've indicated in comments the
places where actual behavior differs from the spec.
"""
# W0102: Dangerous default value %s as argument
# pylint: disable=W0102
import json
import requests
import six
from six.moves.urllib import parse
from cerberus.openstack.common.apiclient import client
def assert_has_keys(dct, required=[], optional=[]):
for k in required:
try:
assert k in dct
except AssertionError:
extra_keys = set(dct.keys()).difference(set(required + optional))
raise AssertionError("found unexpected keys: %s" %
list(extra_keys))
class TestResponse(requests.Response):
"""Wrap requests.Response and provide a convenient initialization.
"""
def __init__(self, data):
super(TestResponse, self).__init__()
self._content_consumed = True
if isinstance(data, dict):
self.status_code = data.get('status_code', 200)
# Fake the text attribute to streamline Response creation
text = data.get('text', "")
if isinstance(text, (dict, list)):
self._content = json.dumps(text)
default_headers = {
"Content-Type": "application/json",
}
else:
self._content = text
default_headers = {}
if six.PY3 and isinstance(self._content, six.string_types):
self._content = self._content.encode('utf-8', 'strict')
self.headers = data.get('headers') or default_headers
else:
self.status_code = data
def __eq__(self, other):
return (self.status_code == other.status_code and
self.headers == other.headers and
self._content == other._content)
class FakeHTTPClient(client.HTTPClient):
def __init__(self, *args, **kwargs):
self.callstack = []
self.fixtures = kwargs.pop("fixtures", None) or {}
if not args and not "auth_plugin" in kwargs:
args = (None, )
super(FakeHTTPClient, self).__init__(*args, **kwargs)
def assert_called(self, method, url, body=None, pos=-1):
"""Assert than an API method was just called.
"""
expected = (method, url)
called = self.callstack[pos][0:2]
assert self.callstack, \
"Expected %s %s but no calls were made." % expected
assert expected == called, 'Expected %s %s; got %s %s' % \
(expected + called)
if body is not None:
if self.callstack[pos][3] != body:
raise AssertionError('%r != %r' %
(self.callstack[pos][3], body))
def assert_called_anytime(self, method, url, body=None):
"""Assert than an API method was called anytime in the test.
"""
expected = (method, url)
assert self.callstack, \
"Expected %s %s but no calls were made." % expected
found = False
entry = None
for entry in self.callstack:
if expected == entry[0:2]:
found = True
break
assert found, 'Expected %s %s; got %s' % \
(method, url, self.callstack)
if body is not None:
assert entry[3] == body, "%s != %s" % (entry[3], body)
self.callstack = []
def clear_callstack(self):
self.callstack = []
def authenticate(self):
pass
def client_request(self, client, method, url, **kwargs):
# Check that certain things are called correctly
if method in ["GET", "DELETE"]:
assert "json" not in kwargs
# Note the call
self.callstack.append(
(method,
url,
kwargs.get("headers") or {},
kwargs.get("json") or kwargs.get("data")))
try:
fixture = self.fixtures[url][method]
except KeyError:
pass
else:
return TestResponse({"headers": fixture[0],
"text": fixture[1]})
# Call the method
args = parse.parse_qsl(parse.urlparse(url)[4])
kwargs.update(args)
munged_url = url.rsplit('?', 1)[0]
munged_url = munged_url.strip('/').replace('/', '_').replace('.', '_')
munged_url = munged_url.replace('-', '_')
callback = "%s_%s" % (method.lower(), munged_url)
if not hasattr(self, callback):
raise AssertionError('Called unknown API method: %s %s, '
'expected fakes method name: %s' %
(method, url, callback))
resp = getattr(self, callback)(**kwargs)
if len(resp) == 3:
status, headers, body = resp
else:
status, body = resp
headers = {}
return TestResponse({
"status_code": status,
"text": body,
"headers": headers,
})
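A sketch of how a unit test would drive FakeHTTPClient through fixtures; the URL and payload are invented for illustration.

# Hypothetical test usage; URL and payload are illustrative only.
fixtures = {
    '/security_reports': {
        'GET': ({}, {'security_reports': [{'id': '1'}]}),
    },
}
fake = FakeHTTPClient(fixtures=fixtures)
resp = fake.client_request(None, 'GET', '/security_reports')
assert resp.status_code == 200
fake.assert_called('GET', '/security_reports')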

View File

@ -1,309 +0,0 @@
# Copyright 2012 Red Hat, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# W0603: Using the global statement
# W0621: Redefining name %s from outer scope
# pylint: disable=W0603,W0621
from __future__ import print_function
import getpass
import inspect
import os
import sys
import textwrap
import prettytable
import six
from six import moves
from cerberus.openstack.common.apiclient import exceptions
from cerberus.openstack.common.gettextutils import _
from cerberus.openstack.common import strutils
from cerberus.openstack.common import uuidutils
def validate_args(fn, *args, **kwargs):
"""Check that the supplied args are sufficient for calling a function.
>>> validate_args(lambda a: None)
Traceback (most recent call last):
...
MissingArgs: Missing argument(s): a
>>> validate_args(lambda a, b, c, d: None, 0, c=1)
Traceback (most recent call last):
...
MissingArgs: Missing argument(s): b, d
:param fn: the function to check
:param arg: the positional arguments supplied
:param kwargs: the keyword arguments supplied
"""
argspec = inspect.getargspec(fn)
num_defaults = len(argspec.defaults or [])
required_args = argspec.args[:len(argspec.args) - num_defaults]
def isbound(method):
return getattr(method, 'im_self', None) is not None
if isbound(fn):
required_args.pop(0)
missing = [arg for arg in required_args if arg not in kwargs]
missing = missing[len(args):]
if missing:
raise exceptions.MissingArgs(missing)
def arg(*args, **kwargs):
"""Decorator for CLI args.
Example:
>>> @arg("name", help="Name of the new entity")
... def entity_create(args):
... pass
"""
def _decorator(func):
add_arg(func, *args, **kwargs)
return func
return _decorator
def env(*args, **kwargs):
"""Returns the first environment variable set.
If all are empty, defaults to '' or keyword arg `default`.
"""
for arg in args:
value = os.environ.get(arg)
if value:
return value
return kwargs.get('default', '')
def add_arg(func, *args, **kwargs):
"""Bind CLI arguments to a shell.py `do_foo` function."""
if not hasattr(func, 'arguments'):
func.arguments = []
# NOTE(sirp): avoid dups that can occur when the module is shared across
# tests.
if (args, kwargs) not in func.arguments:
# Because of the semantics of decorator composition if we just append
# to the options list positional options will appear to be backwards.
func.arguments.insert(0, (args, kwargs))
def unauthenticated(func):
"""Adds 'unauthenticated' attribute to decorated function.
Usage:
>>> @unauthenticated
... def mymethod(f):
... pass
"""
func.unauthenticated = True
return func
def isunauthenticated(func):
"""Checks if the function does not require authentication.
Mark such functions with the `@unauthenticated` decorator.
:returns: bool
"""
return getattr(func, 'unauthenticated', False)
def print_list(objs, fields, formatters=None, sortby_index=0,
mixed_case_fields=None):
"""Print a list or objects as a table, one row per object.
:param objs: iterable of :class:`Resource`
:param fields: attributes that correspond to columns, in order
:param formatters: `dict` of callables for field formatting
:param sortby_index: index of the field for sorting table rows
:param mixed_case_fields: fields corresponding to object attributes that
have mixed case names (e.g., 'serverId')
"""
formatters = formatters or {}
mixed_case_fields = mixed_case_fields or []
if sortby_index is None:
kwargs = {}
else:
kwargs = {'sortby': fields[sortby_index]}
pt = prettytable.PrettyTable(fields, caching=False)
pt.align = 'l'
for o in objs:
row = []
for field in fields:
if field in formatters:
row.append(formatters[field](o))
else:
if field in mixed_case_fields:
field_name = field.replace(' ', '_')
else:
field_name = field.lower().replace(' ', '_')
data = getattr(o, field_name, '')
row.append(data)
pt.add_row(row)
print(strutils.safe_encode(pt.get_string(**kwargs)))
def print_dict(dct, dict_property="Property", wrap=0):
"""Print a `dict` as a table of two columns.
:param dct: `dict` to print
:param dict_property: name of the first column
:param wrap: wrapping for the second column
"""
pt = prettytable.PrettyTable([dict_property, 'Value'], caching=False)
pt.align = 'l'
for k, v in six.iteritems(dct):
# convert dict to str to check length
if isinstance(v, dict):
v = six.text_type(v)
if wrap > 0:
v = textwrap.fill(six.text_type(v), wrap)
# if value has a newline, add in multiple rows
# e.g. fault with stacktrace
if v and isinstance(v, six.string_types) and r'\n' in v:
lines = v.strip().split(r'\n')
col1 = k
for line in lines:
pt.add_row([col1, line])
col1 = ''
else:
pt.add_row([k, v])
print(strutils.safe_encode(pt.get_string()))
def get_password(max_password_prompts=3):
"""Read password from TTY."""
verify = strutils.bool_from_string(env("OS_VERIFY_PASSWORD"))
pw = None
if hasattr(sys.stdin, "isatty") and sys.stdin.isatty():
# Check for Ctrl-D
try:
for __ in moves.range(max_password_prompts):
pw1 = getpass.getpass("OS Password: ")
if verify:
pw2 = getpass.getpass("Please verify: ")
else:
pw2 = pw1
if pw1 == pw2 and pw1:
pw = pw1
break
except EOFError:
pass
return pw
def find_resource(manager, name_or_id, **find_args):
"""Look for resource in a given manager.
Used as a helper for the _find_* methods.
Example:
def _find_hypervisor(cs, hypervisor):
#Get a hypervisor by name or ID.
return cliutils.find_resource(cs.hypervisors, hypervisor)
"""
# first try to get entity as integer id
try:
return manager.get(int(name_or_id))
except (TypeError, ValueError, exceptions.NotFound):
pass
# now try to get entity as uuid
try:
tmp_id = strutils.safe_encode(name_or_id)
if uuidutils.is_uuid_like(tmp_id):
return manager.get(tmp_id)
except (TypeError, ValueError, exceptions.NotFound):
pass
# for str id which is not uuid
if getattr(manager, 'is_alphanum_id_allowed', False):
try:
return manager.get(name_or_id)
except exceptions.NotFound:
pass
try:
try:
return manager.find(human_id=name_or_id, **find_args)
except exceptions.NotFound:
pass
# finally try to find entity by name
try:
resource = getattr(manager, 'resource_class', None)
name_attr = resource.NAME_ATTR if resource else 'name'
kwargs = {name_attr: name_or_id}
kwargs.update(find_args)
return manager.find(**kwargs)
except exceptions.NotFound:
msg = _("No %(name)s with a name or "
"ID of '%(name_or_id)s' exists.") % \
{
"name": manager.resource_class.__name__.lower(),
"name_or_id": name_or_id
}
raise exceptions.CommandError(msg)
except exceptions.NoUniqueMatch:
msg = _("Multiple %(name)s matches found for "
"'%(name_or_id)s', use an ID to be more specific.") % \
{
"name": manager.resource_class.__name__.lower(),
"name_or_id": name_or_id
}
raise exceptions.CommandError(msg)
def service_type(stype):
"""Adds 'service_type' attribute to decorated function.
Usage:
@service_type('volume')
def mymethod(f):
...
"""
def inner(f):
f.service_type = stype
return f
return inner
def get_service_type(f):
"""Retrieves service type from function."""
return getattr(f, 'service_type', None)
def pretty_choice_list(l):
return ', '.join("'%s'" % i for i in l)
def exit(msg=''):
if msg:
        print(msg, file=sys.stderr)
sys.exit(1)
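A sketch of how a shell command would combine these helpers; do_task_list, the cs.tasks manager and the column names are invented for illustration.

# Hypothetical shell command built on the helpers above.
@arg('--limit', metavar='<limit>', default=None,
     help='Maximum number of tasks to show.')
def do_task_list(cs, args):
    """List recurrent tasks."""
    tasks = cs.tasks.list()
    print_list(tasks, ['Id', 'Name', 'Period'])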

View File

@ -1,307 +0,0 @@
# Copyright 2012 SINA Corporation
# Copyright 2014 Cisco Systems, Inc.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
"""Extracts OpenStack config option info from module(s)."""
from __future__ import print_function
import argparse
import imp
import os
import re
import socket
import sys
import textwrap
from oslo.config import cfg
import six
import stevedore.named
from cerberus.openstack.common import gettextutils
from cerberus.openstack.common import importutils
gettextutils.install('cerberus')
STROPT = "StrOpt"
BOOLOPT = "BoolOpt"
INTOPT = "IntOpt"
FLOATOPT = "FloatOpt"
LISTOPT = "ListOpt"
DICTOPT = "DictOpt"
MULTISTROPT = "MultiStrOpt"
OPT_TYPES = {
STROPT: 'string value',
BOOLOPT: 'boolean value',
INTOPT: 'integer value',
FLOATOPT: 'floating point value',
LISTOPT: 'list value',
DICTOPT: 'dict value',
MULTISTROPT: 'multi valued',
}
OPTION_REGEX = re.compile(r"(%s)" % "|".join([STROPT, BOOLOPT, INTOPT,
FLOATOPT, LISTOPT, DICTOPT,
MULTISTROPT]))
PY_EXT = ".py"
BASEDIR = os.path.abspath(os.path.join(os.path.dirname(__file__),
"../../../../"))
WORDWRAP_WIDTH = 60
def raise_extension_exception(extmanager, ep, err):
raise
def generate(argv):
parser = argparse.ArgumentParser(
description='generate sample configuration file',
)
parser.add_argument('-m', dest='modules', action='append')
parser.add_argument('-l', dest='libraries', action='append')
parser.add_argument('srcfiles', nargs='*')
parsed_args = parser.parse_args(argv)
mods_by_pkg = dict()
for filepath in parsed_args.srcfiles:
pkg_name = filepath.split(os.sep)[1]
mod_str = '.'.join(['.'.join(filepath.split(os.sep)[:-1]),
os.path.basename(filepath).split('.')[0]])
mods_by_pkg.setdefault(pkg_name, list()).append(mod_str)
# NOTE(lzyeval): place top level modules before packages
pkg_names = sorted(pkg for pkg in mods_by_pkg if pkg.endswith(PY_EXT))
ext_names = sorted(pkg for pkg in mods_by_pkg if pkg not in pkg_names)
pkg_names.extend(ext_names)
# opts_by_group is a mapping of group name to an options list
# The options list is a list of (module, options) tuples
opts_by_group = {'DEFAULT': []}
if parsed_args.modules:
for module_name in parsed_args.modules:
module = _import_module(module_name)
if module:
for group, opts in _list_opts(module):
opts_by_group.setdefault(group, []).append((module_name,
opts))
# Look for entry points defined in libraries (or applications) for
# option discovery, and include their return values in the output.
#
# Each entry point should be a function returning an iterable
# of pairs with the group name (or None for the default group)
# and the list of Opt instances for that group.
if parsed_args.libraries:
loader = stevedore.named.NamedExtensionManager(
'oslo.config.opts',
names=list(set(parsed_args.libraries)),
invoke_on_load=False,
on_load_failure_callback=raise_extension_exception
)
for ext in loader:
for group, opts in ext.plugin():
opt_list = opts_by_group.setdefault(group or 'DEFAULT', [])
opt_list.append((ext.name, opts))
for pkg_name in pkg_names:
mods = mods_by_pkg.get(pkg_name)
mods.sort()
for mod_str in mods:
if mod_str.endswith('.__init__'):
mod_str = mod_str[:mod_str.rfind(".")]
mod_obj = _import_module(mod_str)
if not mod_obj:
raise RuntimeError("Unable to import module %s" % mod_str)
for group, opts in _list_opts(mod_obj):
opts_by_group.setdefault(group, []).append((mod_str, opts))
print_group_opts('DEFAULT', opts_by_group.pop('DEFAULT', []))
for group in sorted(opts_by_group.keys()):
print_group_opts(group, opts_by_group[group])
def _import_module(mod_str):
try:
if mod_str.startswith('bin.'):
imp.load_source(mod_str[4:], os.path.join('bin', mod_str[4:]))
return sys.modules[mod_str[4:]]
else:
return importutils.import_module(mod_str)
except Exception as e:
sys.stderr.write("Error importing module %s: %s\n" % (mod_str, str(e)))
return None
def _is_in_group(opt, group):
"Check if opt is in group."
for value in group._opts.values():
# NOTE(llu): Temporary workaround for bug #1262148, wait until
# newly released oslo.config support '==' operator.
if not(value['opt'] != opt):
return True
return False
def _guess_groups(opt, mod_obj):
# is it in the DEFAULT group?
if _is_in_group(opt, cfg.CONF):
return 'DEFAULT'
# what other groups is it in?
for value in cfg.CONF.values():
if isinstance(value, cfg.CONF.GroupAttr):
if _is_in_group(opt, value._group):
return value._group.name
raise RuntimeError(
"Unable to find group for option %s, "
"maybe it's defined twice in the same group?"
% opt.name
)
def _list_opts(obj):
def is_opt(o):
return (isinstance(o, cfg.Opt) and
not isinstance(o, cfg.SubCommandOpt))
opts = list()
for attr_str in dir(obj):
attr_obj = getattr(obj, attr_str)
if is_opt(attr_obj):
opts.append(attr_obj)
elif (isinstance(attr_obj, list) and
all(map(lambda x: is_opt(x), attr_obj))):
opts.extend(attr_obj)
ret = {}
for opt in opts:
ret.setdefault(_guess_groups(opt, obj), []).append(opt)
return ret.items()
def print_group_opts(group, opts_by_module):
print("[%s]" % group)
print('')
for mod, opts in opts_by_module:
print('#')
print('# Options defined in %s' % mod)
print('#')
print('')
for opt in opts:
_print_opt(opt)
print('')
def _get_my_ip():
try:
csock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
csock.connect(('8.8.8.8', 80))
(addr, port) = csock.getsockname()
csock.close()
return addr
except socket.error:
return None
def _sanitize_default(name, value):
"""Set up a reasonably sensible default for pybasedir, my_ip and host."""
if value.startswith(sys.prefix):
# NOTE(jd) Don't use os.path.join, because it is likely to think the
# second part is an absolute pathname and therefore drop the first
# part.
value = os.path.normpath("/usr/" + value[len(sys.prefix):])
elif value.startswith(BASEDIR):
return value.replace(BASEDIR, '/usr/lib/python/site-packages')
elif BASEDIR in value:
return value.replace(BASEDIR, '')
elif value == _get_my_ip():
return '10.0.0.1'
elif value in (socket.gethostname(), socket.getfqdn()) and 'host' in name:
return 'cerberus'
elif value.strip() != value:
return '"%s"' % value
return value
def _print_opt(opt):
opt_name, opt_default, opt_help = opt.dest, opt.default, opt.help
if not opt_help:
sys.stderr.write('WARNING: "%s" is missing help string.\n' % opt_name)
opt_help = ""
opt_type = None
try:
opt_type = OPTION_REGEX.search(str(type(opt))).group(0)
except (ValueError, AttributeError) as err:
sys.stderr.write("%s\n" % str(err))
sys.exit(1)
opt_help = u'%s (%s)' % (opt_help,
OPT_TYPES[opt_type])
print('#', "\n# ".join(textwrap.wrap(opt_help, WORDWRAP_WIDTH)))
if opt.deprecated_opts:
for deprecated_opt in opt.deprecated_opts:
if deprecated_opt.name:
deprecated_group = (deprecated_opt.group if
deprecated_opt.group else "DEFAULT")
print('# Deprecated group/name - [%s]/%s' %
(deprecated_group,
deprecated_opt.name))
try:
if opt_default is None:
print('#%s=<None>' % opt_name)
elif opt_type == STROPT:
assert(isinstance(opt_default, six.string_types))
print('#%s=%s' % (opt_name, _sanitize_default(opt_name,
opt_default)))
elif opt_type == BOOLOPT:
assert(isinstance(opt_default, bool))
print('#%s=%s' % (opt_name, str(opt_default).lower()))
elif opt_type == INTOPT:
assert(isinstance(opt_default, int) and
not isinstance(opt_default, bool))
print('#%s=%s' % (opt_name, opt_default))
elif opt_type == FLOATOPT:
assert(isinstance(opt_default, float))
print('#%s=%s' % (opt_name, opt_default))
elif opt_type == LISTOPT:
assert(isinstance(opt_default, list))
print('#%s=%s' % (opt_name, ','.join(opt_default)))
elif opt_type == DICTOPT:
assert(isinstance(opt_default, dict))
opt_default_strlist = [str(key) + ':' + str(value)
for (key, value) in opt_default.items()]
print('#%s=%s' % (opt_name, ','.join(opt_default_strlist)))
elif opt_type == MULTISTROPT:
assert(isinstance(opt_default, list))
if not opt_default:
opt_default = ['']
for default in opt_default:
print('#%s=%s' % (opt_name, default))
print('')
except Exception:
sys.stderr.write('Error in option "%s"\n' % opt_name)
sys.exit(1)
def main():
generate(sys.argv[1:])
if __name__ == '__main__':
main()
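# Usage sketch (not part of the original file): besides the command line entry
# point above, generate() can also be driven programmatically with an
# argv-style list.  The module name passed with -m is an assumption -- any
# importable module that registers oslo.config options would do.
def generate_db_sample():
    generate(['-m', 'cerberus.openstack.common.db.options'])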


@ -1,111 +0,0 @@
# Copyright 2011 OpenStack Foundation.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""
Simple class that stores security context information in the web request.
Projects should subclass this class if they wish to enhance the request
context or provide additional information in their specific WSGI pipeline.
"""
import itertools
import uuid
def generate_request_id():
return 'req-%s' % str(uuid.uuid4())
class RequestContext(object):
"""Helper class to represent useful information about a request context.
Stores information about the security context under which the user
accesses the system, as well as additional request information.
"""
user_idt_format = '{user} {tenant} {domain} {user_domain} {p_domain}'
def __init__(self, auth_token=None, user=None, tenant=None, domain=None,
user_domain=None, project_domain=None, is_admin=False,
read_only=False, show_deleted=False, request_id=None,
instance_uuid=None):
self.auth_token = auth_token
self.user = user
self.tenant = tenant
self.domain = domain
self.user_domain = user_domain
self.project_domain = project_domain
self.is_admin = is_admin
self.read_only = read_only
self.show_deleted = show_deleted
self.instance_uuid = instance_uuid
if not request_id:
request_id = generate_request_id()
self.request_id = request_id
def to_dict(self):
user_idt = (
self.user_idt_format.format(user=self.user or '-',
tenant=self.tenant or '-',
domain=self.domain or '-',
user_domain=self.user_domain or '-',
p_domain=self.project_domain or '-'))
return {'user': self.user,
'tenant': self.tenant,
'domain': self.domain,
'user_domain': self.user_domain,
'project_domain': self.project_domain,
'is_admin': self.is_admin,
'read_only': self.read_only,
'show_deleted': self.show_deleted,
'auth_token': self.auth_token,
'request_id': self.request_id,
'instance_uuid': self.instance_uuid,
'user_identity': user_idt}
def get_admin_context(show_deleted=False):
context = RequestContext(None,
tenant=None,
is_admin=True,
show_deleted=show_deleted)
return context
def get_context_from_function_and_args(function, args, kwargs):
"""Find an arg of type RequestContext and return it.
This is useful in a couple of decorators where we don't
know much about the function we're wrapping.
"""
for arg in itertools.chain(kwargs.values(), args):
if isinstance(arg, RequestContext):
return arg
return None
def is_user_context(context):
    """Indicates if the request context is a normal user."""
    if not context:
        return False
    if context.is_admin:
        return False
    # NOTE: the base RequestContext above exposes 'user' and 'tenant'
    # attributes (not 'user_id'/'project_id'), so check the ones that
    # actually exist.
    if not context.user or not context.tenant:
        return False
    return True
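# Usage sketch (not part of the original file): building a request context and
# serialising it, e.g. for logging or notification payloads; the values are
# purely illustrative.
ctx = RequestContext(auth_token='secret-token', user='alice', tenant='demo')
print(ctx.to_dict()['user_identity'])    # 'alice demo - - -'
admin_ctx = get_admin_context(show_deleted=True)
assert admin_ctx.is_admin and admin_ctx.show_deleted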


@ -1,162 +0,0 @@
# Copyright (c) 2013 Rackspace Hosting
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Multiple DB API backend support.
A DB backend module should implement a method named 'get_backend' which
takes no arguments. The method can return any object that implements DB
API methods.
"""
import functools
import logging
import threading
import time
from cerberus.openstack.common.db import exception
from cerberus.openstack.common.gettextutils import _LE
from cerberus.openstack.common import importutils
LOG = logging.getLogger(__name__)
def safe_for_db_retry(f):
"""Enable db-retry for decorated function, if config option enabled."""
f.__dict__['enable_retry'] = True
return f
class wrap_db_retry(object):
"""Retry db.api methods, if DBConnectionError() raised
Retry decorated db.api methods. If we enabled `use_db_reconnect`
in config, this decorator will be applied to all db.api functions,
marked with @safe_for_db_retry decorator.
Decorator catchs DBConnectionError() and retries function in a
loop until it succeeds, or until maximum retries count will be reached.
"""
def __init__(self, retry_interval, max_retries, inc_retry_interval,
max_retry_interval):
super(wrap_db_retry, self).__init__()
self.retry_interval = retry_interval
self.max_retries = max_retries
self.inc_retry_interval = inc_retry_interval
self.max_retry_interval = max_retry_interval
def __call__(self, f):
@functools.wraps(f)
def wrapper(*args, **kwargs):
next_interval = self.retry_interval
remaining = self.max_retries
while True:
try:
return f(*args, **kwargs)
except exception.DBConnectionError as e:
if remaining == 0:
LOG.exception(_LE('DB exceeded retry limit.'))
raise exception.DBError(e)
if remaining != -1:
remaining -= 1
LOG.exception(_LE('DB connection error.'))
# NOTE(vsergeyev): We are using patched time module, so
# this effectively yields the execution
# context to another green thread.
time.sleep(next_interval)
if self.inc_retry_interval:
next_interval = min(
next_interval * 2,
self.max_retry_interval
)
return wrapper
class DBAPI(object):
def __init__(self, backend_name, backend_mapping=None, lazy=False,
**kwargs):
"""Initialize the chosen DB API backend.
:param backend_name: name of the backend to load
:type backend_name: str
:param backend_mapping: backend name -> module/class to load mapping
:type backend_mapping: dict
:param lazy: load the DB backend lazily on the first DB API method call
:type lazy: bool
Keyword arguments:
:keyword use_db_reconnect: retry DB transactions on disconnect or not
:type use_db_reconnect: bool
:keyword retry_interval: seconds between transaction retries
:type retry_interval: int
:keyword inc_retry_interval: increase retry interval or not
:type inc_retry_interval: bool
:keyword max_retry_interval: max interval value between retries
:type max_retry_interval: int
:keyword max_retries: max number of retries before an error is raised
:type max_retries: int
"""
self._backend = None
self._backend_name = backend_name
self._backend_mapping = backend_mapping or {}
self._lock = threading.Lock()
if not lazy:
self._load_backend()
self.use_db_reconnect = kwargs.get('use_db_reconnect', False)
self.retry_interval = kwargs.get('retry_interval', 1)
self.inc_retry_interval = kwargs.get('inc_retry_interval', True)
self.max_retry_interval = kwargs.get('max_retry_interval', 10)
self.max_retries = kwargs.get('max_retries', 20)
def _load_backend(self):
with self._lock:
if not self._backend:
# Import the untranslated name if we don't have a mapping
backend_path = self._backend_mapping.get(self._backend_name,
self._backend_name)
backend_mod = importutils.import_module(backend_path)
self._backend = backend_mod.get_backend()
def __getattr__(self, key):
if not self._backend:
self._load_backend()
attr = getattr(self._backend, key)
if not hasattr(attr, '__call__'):
return attr
# NOTE(vsergeyev): If `use_db_reconnect` option is set to True, retry
# DB API methods, decorated with @safe_for_db_retry
# on disconnect.
if self.use_db_reconnect and hasattr(attr, 'enable_retry'):
attr = wrap_db_retry(
retry_interval=self.retry_interval,
max_retries=self.max_retries,
inc_retry_interval=self.inc_retry_interval,
max_retry_interval=self.max_retry_interval)(attr)
return attr
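# Usage sketch (not part of the original file): wiring DBAPI to a backend
# module.  The backend path is an assumption -- the real project would point
# it at a module exposing a get_backend() function.
db_api = DBAPI('cerberus.db.sqlalchemy.api',       # hypothetical backend path
               lazy=True,                          # defer the import
               use_db_reconnect=True,
               retry_interval=2,
               max_retries=5)
# Attribute access is proxied to the backend; with use_db_reconnect enabled,
# backend methods decorated with @safe_for_db_retry are wrapped in
# wrap_db_retry on access, e.g.:
# db_api.get_security_reports(context)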


@ -1,56 +0,0 @@
# Copyright 2010 United States Government as represented by the
# Administrator of the National Aeronautics and Space Administration.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""DB related custom exceptions."""
import six
from cerberus.openstack.common.gettextutils import _
class DBError(Exception):
"""Wraps an implementation specific exception."""
def __init__(self, inner_exception=None):
self.inner_exception = inner_exception
super(DBError, self).__init__(six.text_type(inner_exception))
class DBDuplicateEntry(DBError):
"""Wraps an implementation specific exception."""
    def __init__(self, columns=None, inner_exception=None):
        # avoid a mutable default argument; fall back to an empty list
        self.columns = columns or []
        super(DBDuplicateEntry, self).__init__(inner_exception)
class DBDeadlock(DBError):
def __init__(self, inner_exception=None):
super(DBDeadlock, self).__init__(inner_exception)
class DBInvalidUnicodeParameter(Exception):
message = _("Invalid Parameter: "
"Unicode is not supported by the current database.")
class DbMigrationError(DBError):
"""Wraps migration specific exception."""
def __init__(self, message=None):
super(DbMigrationError, self).__init__(message)
class DBConnectionError(DBError):
"""Wraps connection specific exception."""
pass
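# Usage sketch (not part of the original file): callers are expected to catch
# these translated exceptions rather than driver-specific ones.  insert_row()
# is a hypothetical helper that simulates a constraint violation.
def insert_row():
    raise DBDuplicateEntry(columns=['name'])
try:
    insert_row()
except DBDuplicateEntry as exc_info:
    print('duplicate value in columns: %s' % ', '.join(exc_info.columns))
except DBConnectionError:
    print('database unreachable, retry later')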


@ -1,171 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import copy
from oslo.config import cfg
database_opts = [
cfg.StrOpt('sqlite_db',
deprecated_group='DEFAULT',
default='cerberus.sqlite',
help='The file name to use with SQLite'),
cfg.BoolOpt('sqlite_synchronous',
deprecated_group='DEFAULT',
default=True,
help='If True, SQLite uses synchronous mode'),
cfg.StrOpt('backend',
default='sqlalchemy',
deprecated_name='db_backend',
deprecated_group='DEFAULT',
help='The backend to use for db'),
cfg.StrOpt('connection',
help='The SQLAlchemy connection string used to connect to the '
'database',
secret=True,
deprecated_opts=[cfg.DeprecatedOpt('sql_connection',
group='DEFAULT'),
cfg.DeprecatedOpt('sql_connection',
group='DATABASE'),
cfg.DeprecatedOpt('connection',
group='sql'), ]),
cfg.StrOpt('mysql_sql_mode',
default='TRADITIONAL',
help='The SQL mode to be used for MySQL sessions. '
'This option, including the default, overrides any '
'server-set SQL mode. To use whatever SQL mode '
'is set by the server configuration, '
'set this to no value. Example: mysql_sql_mode='),
cfg.IntOpt('idle_timeout',
default=3600,
deprecated_opts=[cfg.DeprecatedOpt('sql_idle_timeout',
group='DEFAULT'),
cfg.DeprecatedOpt('sql_idle_timeout',
group='DATABASE'),
cfg.DeprecatedOpt('idle_timeout',
group='sql')],
help='Timeout before idle sql connections are reaped'),
cfg.IntOpt('min_pool_size',
default=1,
deprecated_opts=[cfg.DeprecatedOpt('sql_min_pool_size',
group='DEFAULT'),
cfg.DeprecatedOpt('sql_min_pool_size',
group='DATABASE')],
help='Minimum number of SQL connections to keep open in a '
'pool'),
cfg.IntOpt('max_pool_size',
default=None,
deprecated_opts=[cfg.DeprecatedOpt('sql_max_pool_size',
group='DEFAULT'),
cfg.DeprecatedOpt('sql_max_pool_size',
group='DATABASE')],
help='Maximum number of SQL connections to keep open in a '
'pool'),
cfg.IntOpt('max_retries',
default=10,
deprecated_opts=[cfg.DeprecatedOpt('sql_max_retries',
group='DEFAULT'),
cfg.DeprecatedOpt('sql_max_retries',
group='DATABASE')],
help='Maximum db connection retries during startup. '
'(setting -1 implies an infinite retry count)'),
cfg.IntOpt('retry_interval',
default=10,
deprecated_opts=[cfg.DeprecatedOpt('sql_retry_interval',
group='DEFAULT'),
cfg.DeprecatedOpt('reconnect_interval',
group='DATABASE')],
help='Interval between retries of opening a sql connection'),
cfg.IntOpt('max_overflow',
default=None,
deprecated_opts=[cfg.DeprecatedOpt('sql_max_overflow',
group='DEFAULT'),
cfg.DeprecatedOpt('sqlalchemy_max_overflow',
group='DATABASE')],
help='If set, use this value for max_overflow with sqlalchemy'),
cfg.IntOpt('connection_debug',
default=0,
deprecated_opts=[cfg.DeprecatedOpt('sql_connection_debug',
group='DEFAULT')],
help='Verbosity of SQL debugging information. 0=None, '
'100=Everything'),
cfg.BoolOpt('connection_trace',
default=False,
deprecated_opts=[cfg.DeprecatedOpt('sql_connection_trace',
group='DEFAULT')],
help='Add python stack traces to SQL as comment strings'),
cfg.IntOpt('pool_timeout',
default=None,
deprecated_opts=[cfg.DeprecatedOpt('sqlalchemy_pool_timeout',
group='DATABASE')],
help='If set, use this value for pool_timeout with sqlalchemy'),
cfg.BoolOpt('use_db_reconnect',
default=False,
help='Enable the experimental use of database reconnect '
'on connection lost'),
cfg.IntOpt('db_retry_interval',
default=1,
help='seconds between db connection retries'),
cfg.BoolOpt('db_inc_retry_interval',
default=True,
help='Whether to increase interval between db connection '
'retries, up to db_max_retry_interval'),
cfg.IntOpt('db_max_retry_interval',
default=10,
help='max seconds between db connection retries, if '
'db_inc_retry_interval is enabled'),
cfg.IntOpt('db_max_retries',
default=20,
help='maximum db connection retries before error is raised. '
'(setting -1 implies an infinite retry count)'),
]
CONF = cfg.CONF
CONF.register_opts(database_opts, 'database')
def set_defaults(sql_connection, sqlite_db, max_pool_size=None,
max_overflow=None, pool_timeout=None):
"""Set defaults for configuration variables."""
cfg.set_defaults(database_opts,
connection=sql_connection,
sqlite_db=sqlite_db)
# Update the QueuePool defaults
if max_pool_size is not None:
cfg.set_defaults(database_opts,
max_pool_size=max_pool_size)
if max_overflow is not None:
cfg.set_defaults(database_opts,
max_overflow=max_overflow)
if pool_timeout is not None:
cfg.set_defaults(database_opts,
pool_timeout=pool_timeout)
def list_opts():
"""Returns a list of oslo.config options available in the library.
The returned list includes all oslo.config options which may be registered
at runtime by the library.
Each element of the list is a tuple. The first element is the name of the
group under which the list of elements in the second element will be
registered. A group name of None corresponds to the [DEFAULT] group in
config files.
The purpose of this is to allow tools like the Oslo sample config file
generator to discover the options exposed to users by this library.
:returns: a list of (group_name, opts) tuples
"""
return [('database', copy.deepcopy(database_opts))]
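# Usage sketch (not part of the original file): a service would typically call
# set_defaults() early during startup and read the values back through CONF
# once the config object has been initialised.  The connection string is an
# example value.
def configure_database_defaults():
    set_defaults(sql_connection='sqlite:///cerberus.sqlite',
                 sqlite_db='cerberus.sqlite',
                 max_pool_size=10)
    CONF([], project='cerberus')             # parse an (empty) command line
    return CONF.database.connection          # 'sqlite:///cerberus.sqlite'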


@ -1,278 +0,0 @@
# coding: utf-8
#
# Copyright (c) 2013 OpenStack Foundation
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
# Base on code in migrate/changeset/databases/sqlite.py which is under
# the following license:
#
# The MIT License
#
# Copyright (c) 2009 Evan Rosson, Jan Dittberner, Domen Kožar
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
import os
import re
from migrate.changeset import ansisql
from migrate.changeset.databases import sqlite
from migrate import exceptions as versioning_exceptions
from migrate.versioning import api as versioning_api
from migrate.versioning.repository import Repository
import sqlalchemy
from sqlalchemy.schema import UniqueConstraint
from cerberus.openstack.common.db import exception
from cerberus.openstack.common.gettextutils import _
def _get_unique_constraints(self, table):
"""Retrieve information about existing unique constraints of the table
This feature is needed for _recreate_table() to work properly.
Unfortunately, it's not available in sqlalchemy 0.7.x/0.8.x.
"""
data = table.metadata.bind.execute(
"""SELECT sql
FROM sqlite_master
WHERE
type='table' AND
name=:table_name""",
table_name=table.name
).fetchone()[0]
UNIQUE_PATTERN = "CONSTRAINT (\w+) UNIQUE \(([^\)]+)\)"
return [
UniqueConstraint(
*[getattr(table.columns, c.strip(' "')) for c in cols.split(",")],
name=name
)
for name, cols in re.findall(UNIQUE_PATTERN, data)
]
def _recreate_table(self, table, column=None, delta=None, omit_uniques=None):
"""Recreate the table properly
Unlike the corresponding original method of sqlalchemy-migrate this one
doesn't drop existing unique constraints when creating a new one.
"""
table_name = self.preparer.format_table(table)
# we remove all indexes so as not to have
# problems during copy and re-create
for index in table.indexes:
index.drop()
# reflect existing unique constraints
for uc in self._get_unique_constraints(table):
table.append_constraint(uc)
# omit given unique constraints when creating a new table if required
table.constraints = set([
cons for cons in table.constraints
if omit_uniques is None or cons.name not in omit_uniques
])
self.append('ALTER TABLE %s RENAME TO migration_tmp' % table_name)
self.execute()
insertion_string = self._modify_table(table, column, delta)
table.create(bind=self.connection)
self.append(insertion_string % {'table_name': table_name})
self.execute()
self.append('DROP TABLE migration_tmp')
self.execute()
def _visit_migrate_unique_constraint(self, *p, **k):
"""Drop the given unique constraint
The corresponding original method of sqlalchemy-migrate just
raises NotImplemented error
"""
self.recreate_table(p[0].table, omit_uniques=[p[0].name])
def patch_migrate():
"""A workaround for SQLite's inability to alter things
    SQLite's abilities to alter tables are very limited (please read
    http://www.sqlite.org/lang_altertable.html for more details).
    E.g. one can't drop a column or a constraint in SQLite. The
workaround for this is to recreate the original table omitting
the corresponding constraint (or column).
sqlalchemy-migrate library has recreate_table() method that
implements this workaround, but it does it wrong:
- information about unique constraints of a table
is not retrieved. So if you have a table with one
unique constraint and a migration adding another one
you will end up with a table that has only the
latter unique constraint, and the former will be lost
- dropping of unique constraints is not supported at all
The proper way to fix this is to provide a pull-request to
sqlalchemy-migrate, but the project seems to be dead. So we
can go on with monkey-patching of the lib at least for now.
"""
# this patch is needed to ensure that recreate_table() doesn't drop
# existing unique constraints of the table when creating a new one
helper_cls = sqlite.SQLiteHelper
helper_cls.recreate_table = _recreate_table
helper_cls._get_unique_constraints = _get_unique_constraints
# this patch is needed to be able to drop existing unique constraints
constraint_cls = sqlite.SQLiteConstraintDropper
constraint_cls.visit_migrate_unique_constraint = \
_visit_migrate_unique_constraint
constraint_cls.__bases__ = (ansisql.ANSIColumnDropper,
sqlite.SQLiteConstraintGenerator)
def db_sync(engine, abs_path, version=None, init_version=0, sanity_check=True):
"""Upgrade or downgrade a database.
Function runs the upgrade() or downgrade() functions in change scripts.
:param engine: SQLAlchemy engine instance for a given database
:param abs_path: Absolute path to migrate repository.
:param version: Database will upgrade/downgrade until this version.
If None - database will update to the latest
available version.
:param init_version: Initial database version
:param sanity_check: Require schema sanity checking for all tables
"""
if version is not None:
try:
version = int(version)
except ValueError:
raise exception.DbMigrationError(
message=_("version should be an integer"))
current_version = db_version(engine, abs_path, init_version)
repository = _find_migrate_repo(abs_path)
if sanity_check:
_db_schema_sanity_check(engine)
if version is None or version > current_version:
return versioning_api.upgrade(engine, repository, version)
else:
return versioning_api.downgrade(engine, repository,
version)
def _db_schema_sanity_check(engine):
"""Ensure all database tables were created with required parameters.
:param engine: SQLAlchemy engine instance for a given database
"""
if engine.name == 'mysql':
onlyutf8_sql = ('SELECT TABLE_NAME,TABLE_COLLATION '
'from information_schema.TABLES '
'where TABLE_SCHEMA=%s and '
'TABLE_COLLATION NOT LIKE "%%utf8%%"')
# NOTE(morganfainberg): exclude the sqlalchemy-migrate and alembic
# versioning tables from the tables we need to verify utf8 status on.
# Non-standard table names are not supported.
EXCLUDED_TABLES = ['migrate_version', 'alembic_version']
table_names = [res[0] for res in
engine.execute(onlyutf8_sql, engine.url.database) if
res[0].lower() not in EXCLUDED_TABLES]
if len(table_names) > 0:
raise ValueError(_('Tables "%s" have non utf8 collation, '
'please make sure all tables are CHARSET=utf8'
) % ','.join(table_names))
def db_version(engine, abs_path, init_version):
"""Show the current version of the repository.
:param engine: SQLAlchemy engine instance for a given database
:param abs_path: Absolute path to migrate repository
    :param init_version: Initial database version
"""
repository = _find_migrate_repo(abs_path)
try:
return versioning_api.db_version(engine, repository)
except versioning_exceptions.DatabaseNotControlledError:
meta = sqlalchemy.MetaData()
meta.reflect(bind=engine)
tables = meta.tables
if len(tables) == 0 or 'alembic_version' in tables:
db_version_control(engine, abs_path, version=init_version)
return versioning_api.db_version(engine, repository)
else:
raise exception.DbMigrationError(
message=_(
"The database is not under version control, but has "
"tables. Please stamp the current version of the schema "
"manually."))
def db_version_control(engine, abs_path, version=None):
"""Mark a database as under this repository's version control.
Once a database is under version control, schema changes should
only be done via change scripts in this repository.
:param engine: SQLAlchemy engine instance for a given database
:param abs_path: Absolute path to migrate repository
:param version: Initial database version
"""
repository = _find_migrate_repo(abs_path)
versioning_api.version_control(engine, repository, version)
return version
def _find_migrate_repo(abs_path):
"""Get the project's change script repository
:param abs_path: Absolute path to migrate repository
"""
if not os.path.exists(abs_path):
raise exception.DbMigrationError("Path %s not found" % abs_path)
return Repository(abs_path)
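# Usage sketch (not part of the original file): synchronising a database to
# the newest schema.  The repository path is an assumption about the project
# layout, which is why the calls are left commented out.
engine = sqlalchemy.create_engine('sqlite:///cerberus.sqlite')
repo = '/path/to/cerberus/db/sqlalchemy/migrate_repo'    # hypothetical
# db_sync(engine, repo)                    # upgrade to the latest version
# print(db_version(engine, repo, 0))       # report the current version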


@ -1,78 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import os
import alembic
from alembic import config as alembic_config
import alembic.migration as alembic_migration
from cerberus.openstack.common.db.sqlalchemy.migration_cli import ext_base
from cerberus.openstack.common.db.sqlalchemy import session as db_session
class AlembicExtension(ext_base.MigrationExtensionBase):
order = 2
@property
def enabled(self):
return os.path.exists(self.alembic_ini_path)
def __init__(self, migration_config):
"""Extension to provide alembic features.
:param migration_config: Stores specific configuration for migrations
:type migration_config: dict
"""
self.alembic_ini_path = migration_config.get('alembic_ini_path', '')
self.config = alembic_config.Config(self.alembic_ini_path)
# option should be used if script is not in default directory
repo_path = migration_config.get('alembic_repo_path')
if repo_path:
self.config.set_main_option('script_location', repo_path)
self.db_url = migration_config['db_url']
def upgrade(self, version):
return alembic.command.upgrade(self.config, version or 'head')
def downgrade(self, version):
if isinstance(version, int) or version is None or version.isdigit():
version = 'base'
return alembic.command.downgrade(self.config, version)
def version(self):
engine = db_session.create_engine(self.db_url)
with engine.connect() as conn:
context = alembic_migration.MigrationContext.configure(conn)
return context.get_current_revision()
def revision(self, message='', autogenerate=False):
"""Creates template for migration.
:param message: Text that will be used for migration title
:type message: string
:param autogenerate: If True - generates diff based on current database
state
:type autogenerate: bool
"""
return alembic.command.revision(self.config, message=message,
autogenerate=autogenerate)
def stamp(self, revision):
"""Stamps database with provided revision.
:param revision: Should match one from repository or head - to stamp
database with most recent revision
:type revision: string
"""
return alembic.command.stamp(self.config, revision=revision)
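# Usage sketch (not part of the original file): the extension is normally
# loaded through stevedore, but it can be exercised directly.  Both values in
# the config dict are assumptions.
migration_config = {
    'alembic_ini_path': '/etc/cerberus/alembic.ini',    # hypothetical
    'db_url': 'sqlite:///cerberus.sqlite',
}
alembic_ext = AlembicExtension(migration_config)
if alembic_ext.enabled:                  # True only if the ini file exists
    alembic_ext.upgrade('head')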


@ -1,79 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import abc
import six
@six.add_metaclass(abc.ABCMeta)
class MigrationExtensionBase(object):
    # used to sort migrations in logical order
order = 0
@property
def enabled(self):
"""Used for availability verification of a plugin.
:rtype: bool
"""
return False
@abc.abstractmethod
def upgrade(self, version):
"""Used for upgrading database.
:param version: Desired database version
:type version: string
"""
@abc.abstractmethod
def downgrade(self, version):
"""Used for downgrading database.
:param version: Desired database version
:type version: string
"""
@abc.abstractmethod
def version(self):
"""Current database version.
        :returns: Database version
:rtype: string
"""
def revision(self, *args, **kwargs):
"""Used to generate migration script.
In migration engines that support this feature, it should generate
new migration script.
Accept arbitrary set of arguments.
"""
raise NotImplementedError()
def stamp(self, *args, **kwargs):
"""Stamps database based on plugin features.
Accept arbitrary set of arguments.
"""
raise NotImplementedError()
def __cmp__(self, other):
"""Used for definition of plugin order.
:param other: MigrationExtensionBase instance
:rtype: bool
"""
return self.order > other.order
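# Minimal sketch (not part of the original file) of a concrete plugin built on
# the base class above; it only illustrates which members a real extension has
# to provide.
class NoopExtension(MigrationExtensionBase):
    order = 3
    @property
    def enabled(self):
        return True
    def upgrade(self, version):
        return version
    def downgrade(self, version):
        return version
    def version(self):
        return '0'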


@ -1,69 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import logging
import os
from cerberus.openstack.common.db.sqlalchemy import migration
from cerberus.openstack.common.db.sqlalchemy.migration_cli import ext_base
from cerberus.openstack.common.db.sqlalchemy import session as db_session
from cerberus.openstack.common.gettextutils import _LE
LOG = logging.getLogger(__name__)
class MigrateExtension(ext_base.MigrationExtensionBase):
"""Extension to provide sqlalchemy-migrate features.
:param migration_config: Stores specific configuration for migrations
:type migration_config: dict
"""
order = 1
def __init__(self, migration_config):
self.repository = migration_config.get('migration_repo_path', '')
self.init_version = migration_config.get('init_version', 0)
self.db_url = migration_config['db_url']
self.engine = db_session.create_engine(self.db_url)
@property
def enabled(self):
return os.path.exists(self.repository)
def upgrade(self, version):
version = None if version == 'head' else version
return migration.db_sync(
self.engine, self.repository, version,
init_version=self.init_version)
def downgrade(self, version):
try:
            # version for migrate should be a valid int - else skip
if version in ('base', None):
version = self.init_version
version = int(version)
return migration.db_sync(
self.engine, self.repository, version,
init_version=self.init_version)
except ValueError:
LOG.error(
_LE('Migration number for migrate plugin must be valid '
'integer or empty, if you want to downgrade '
'to initial state')
)
raise
def version(self):
return migration.db_version(
self.engine, self.repository, init_version=self.init_version)
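# Usage sketch (not part of the original file): exercising the extension
# directly instead of through stevedore.  The repository path is an
# assumption; enabled stays False until it exists on disk.
migration_config = {
    'migration_repo_path': '/path/to/migrate_repo',     # hypothetical
    'init_version': 0,
    'db_url': 'sqlite:///cerberus.sqlite',
}
migrate_ext = MigrateExtension(migration_config)
print(migrate_ext.enabled)               # False unless the repository exists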


@ -1,71 +0,0 @@
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from stevedore import enabled
MIGRATION_NAMESPACE = 'cerberus.openstack.common.migration'
def check_plugin_enabled(ext):
"""Used for EnabledExtensionManager"""
return ext.obj.enabled
class MigrationManager(object):
def __init__(self, migration_config):
self._manager = enabled.EnabledExtensionManager(
MIGRATION_NAMESPACE,
check_plugin_enabled,
invoke_kwds={'migration_config': migration_config},
invoke_on_load=True
)
if not self._plugins:
raise ValueError('There must be at least one plugin active.')
@property
def _plugins(self):
return sorted(ext.obj for ext in self._manager.extensions)
def upgrade(self, revision):
"""Upgrade database with all available backends."""
results = []
for plugin in self._plugins:
results.append(plugin.upgrade(revision))
return results
def downgrade(self, revision):
"""Downgrade database with available backends."""
        # downgrading should be performed in reverse order
results = []
for plugin in reversed(self._plugins):
results.append(plugin.downgrade(revision))
return results
def version(self):
"""Return last version of db."""
last = None
for plugin in self._plugins:
version = plugin.version()
if version:
last = version
return last
def revision(self, message, autogenerate):
"""Generate template or autogenerated revision."""
        # revision should be done only by the last plugin
return self._plugins[-1].revision(message, autogenerate)
def stamp(self, revision):
"""Create stamp for a given revision."""
return self._plugins[-1].stamp(revision)
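# Usage sketch (not part of the original file): plugins are discovered through
# the 'cerberus.openstack.common.migration' entry-point namespace, so building
# a manager only works in an installed tree that registers at least one
# extension; both paths below are assumptions.
migration_config = {
    'alembic_ini_path': '/etc/cerberus/alembic.ini',    # hypothetical
    'migration_repo_path': '/path/to/migrate_repo',     # hypothetical
    'init_version': 0,
    'db_url': 'sqlite:///cerberus.sqlite',
}
# manager = MigrationManager(migration_config)
# manager.upgrade('head')                # run all enabled plugins
# print(manager.version())               # latest version reported by plugins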


@ -1,119 +0,0 @@
# Copyright (c) 2011 X.commerce, a business unit of eBay Inc.
# Copyright 2010 United States Government as represented by the
# Administrator of the National Aeronautics and Space Administration.
# Copyright 2011 Piston Cloud Computing, Inc.
# Copyright 2012 Cloudscaling Group, Inc.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""
SQLAlchemy models.
"""
import six
from sqlalchemy import Column, Integer
from sqlalchemy import DateTime
from sqlalchemy.orm import object_mapper
from cerberus.openstack.common import timeutils
class ModelBase(six.Iterator):
"""Base class for models."""
__table_initialized__ = False
def save(self, session):
"""Save this object."""
        # NOTE(boris-42): This part of the code should look like:
# session.add(self)
# session.flush()
# But there is a bug in sqlalchemy and eventlet that
# raises NoneType exception if there is no running
# transaction and rollback is called. As long as
# sqlalchemy has this bug we have to create transaction
# explicitly.
with session.begin(subtransactions=True):
session.add(self)
session.flush()
def __setitem__(self, key, value):
setattr(self, key, value)
def __getitem__(self, key):
return getattr(self, key)
def get(self, key, default=None):
return getattr(self, key, default)
@property
def _extra_keys(self):
"""Specifies custom fields
Subclasses can override this property to return a list
of custom fields that should be included in their dict
representation.
For reference check tests/db/sqlalchemy/test_models.py
"""
return []
def __iter__(self):
        columns = list(dict(object_mapper(self).columns).keys())
# NOTE(russellb): Allow models to specify other keys that can be looked
# up, beyond the actual db columns. An example would be the 'name'
# property for an Instance.
columns.extend(self._extra_keys)
self._i = iter(columns)
return self
# In Python 3, __next__() has replaced next().
def __next__(self):
n = six.advance_iterator(self._i)
return n, getattr(self, n)
def next(self):
return self.__next__()
def update(self, values):
"""Make the model object behave like a dict."""
for k, v in six.iteritems(values):
setattr(self, k, v)
def iteritems(self):
"""Make the model object behave like a dict.
Includes attributes from joins.
"""
local = dict(self)
joined = dict([(k, v) for k, v in six.iteritems(self.__dict__)
if not k[0] == '_'])
local.update(joined)
return six.iteritems(local)
class TimestampMixin(object):
created_at = Column(DateTime, default=lambda: timeutils.utcnow())
updated_at = Column(DateTime, onupdate=lambda: timeutils.utcnow())
class SoftDeleteMixin(object):
deleted_at = Column(DateTime)
deleted = Column(Integer, default=0)
def soft_delete(self, session):
"""Mark this object as deleted."""
self.deleted = self.id
self.deleted_at = timeutils.utcnow()
self.save(session=session)
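# Usage sketch (not part of the original file): combining the mixins above
# with a declarative base.  The table and columns are purely illustrative.
from sqlalchemy import String
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
class Alarm(Base, ModelBase, TimestampMixin, SoftDeleteMixin):
    """Hypothetical model used only to show the mixin composition."""
    __tablename__ = 'alarm'
    id = Column(Integer, primary_key=True)
    name = Column(String(255))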


@ -1,157 +0,0 @@
# Copyright 2013 Mirantis.inc
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Provision test environment for specific DB backends"""
import argparse
import logging
import os
import random
import string
from six import moves
import sqlalchemy
from cerberus.openstack.common.db import exception as exc
LOG = logging.getLogger(__name__)
def get_engine(uri):
"""Engine creation
Call the function without arguments to get admin connection. Admin
connection required to create temporary user and database for each
particular test. Otherwise use existing connection to recreate connection
to the temporary database.
"""
return sqlalchemy.create_engine(uri, poolclass=sqlalchemy.pool.NullPool)
def _execute_sql(engine, sql, driver):
"""Initialize connection, execute sql query and close it."""
try:
with engine.connect() as conn:
if driver == 'postgresql':
conn.connection.set_isolation_level(0)
for s in sql:
conn.execute(s)
except sqlalchemy.exc.OperationalError:
msg = ('%s does not match database admin '
'credentials or database does not exist.')
LOG.exception(msg % engine.url)
raise exc.DBConnectionError(msg % engine.url)
def create_database(engine):
"""Provide temporary user and database for each particular test."""
driver = engine.name
auth = {
'database': ''.join(random.choice(string.ascii_lowercase)
for i in moves.range(10)),
'user': engine.url.username,
'passwd': engine.url.password,
}
sqls = [
"drop database if exists %(database)s;",
"create database %(database)s;"
]
if driver == 'sqlite':
return 'sqlite:////tmp/%s' % auth['database']
elif driver in ['mysql', 'postgresql']:
sql_query = map(lambda x: x % auth, sqls)
_execute_sql(engine, sql_query, driver)
else:
raise ValueError('Unsupported RDBMS %s' % driver)
params = auth.copy()
params['backend'] = driver
return "%(backend)s://%(user)s:%(passwd)s@localhost/%(database)s" % params
def drop_database(admin_engine, current_uri):
"""Drop temporary database and user after each particular test."""
engine = get_engine(current_uri)
driver = engine.name
auth = {'database': engine.url.database, 'user': engine.url.username}
if driver == 'sqlite':
try:
os.remove(auth['database'])
except OSError:
pass
elif driver in ['mysql', 'postgresql']:
sql = "drop database if exists %(database)s;"
_execute_sql(admin_engine, [sql % auth], driver)
else:
raise ValueError('Unsupported RDBMS %s' % driver)
def main():
"""Controller to handle commands
::create: Create test user and database with random names.
::drop: Drop user and database created by previous command.
"""
parser = argparse.ArgumentParser(
description='Controller to handle database creation and dropping'
' commands.',
        epilog='Under normal circumstances this is not used directly.'
' Used in .testr.conf to automate test database creation'
' and dropping processes.')
subparsers = parser.add_subparsers(
help='Subcommands to manipulate temporary test databases.')
create = subparsers.add_parser(
'create',
help='Create temporary test '
'databases and users.')
create.set_defaults(which='create')
create.add_argument(
'instances_count',
type=int,
help='Number of databases to create.')
drop = subparsers.add_parser(
'drop',
help='Drop temporary test databases and users.')
drop.set_defaults(which='drop')
drop.add_argument(
'instances',
nargs='+',
help='List of databases uri to be dropped.')
args = parser.parse_args()
connection_string = os.getenv('OS_TEST_DBAPI_ADMIN_CONNECTION',
'sqlite://')
engine = get_engine(connection_string)
which = args.which
if which == "create":
for i in range(int(args.instances_count)):
print(create_database(engine))
elif which == "drop":
for db in args.instances:
drop_database(engine, db)
if __name__ == "__main__":
main()
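# Usage sketch (not part of the original file): provisioning and dropping a
# throwaway sqlite database without going through the command line interface.
admin_engine = get_engine('sqlite://')
test_uri = create_database(admin_engine)   # e.g. sqlite:////tmp/<random name>
print(test_uri)
drop_database(admin_engine, test_uri)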

Some files were not shown because too many files have changed in this diff.