Initial commit of the code

This is the initial commit of the code for the project. Previous
development took place in https://github.com/CCI-MOC/k2k-proxy.git

Change-Id: I5cce38e22581e1f0a82c2c76a64e7bbf2cd7490b
Co-Authored-By: George Silvis, III <george.iii.silvis@gmail.com>
Co-Authored-By: Wjdan Alharthi <walharth@bu.edu>
Author: Kristi Nikolla
Date: 2016-11-07 09:49:52 -05:00
Parent: ddc6401072
Commit: dcc8b3aa76
45 changed files with 3011 additions and 0 deletions

.gitignore (vendored, new file, 58 lines)

@@ -0,0 +1,58 @@
*.py[cod]
# C extensions
*.so
# Packages
*.egg*
*.egg-info
dist
build
eggs
parts
bin
var
sdist
develop-eggs
.installed.cfg
lib
lib64
# Installer logs
pip-log.txt
# Unit test / coverage reports
cover/
.coverage*
!.coveragerc
.tox
nosetests.xml
.testrepository
.venv
# Translations
*.mo
# Mr Developer
.mr.developer.cfg
.project
.pydevproject
# Complexity
output/*.html
output/*/index.html
# Sphinx
doc/build
# pbr generates these
AUTHORS
ChangeLog
# Editors
*~
.*.swp
.*sw?
# Files created by releasenotes build
releasenotes/build

.testr.conf (new file, 7 lines)

@@ -0,0 +1,7 @@
[DEFAULT]
test_command=OS_STDOUT_CAPTURE=${OS_STDOUT_CAPTURE:-1} \
OS_STDERR_CAPTURE=${OS_STDERR_CAPTURE:-1} \
OS_TEST_TIMEOUT=${OS_TEST_TIMEOUT:-60} \
${PYTHON:-python} -m subunit.run discover -t ./ . $LISTOPT $IDOPTION
test_id_option=--load-list $IDFILE
test_list_option=--list
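The `${OS_STDOUT_CAPTURE:-1}` and `${OS_TEST_TIMEOUT:-60}` expressions in the test command above use standard POSIX shell default substitution: the variable's value is used when it is set, otherwise the literal after `:-` is used. A minimal, standalone illustration (independent of testr itself):

```shell
#!/bin/sh
# The variable is unset, so the fallback 60 is substituted.
unset OS_TEST_TIMEOUT
echo "${OS_TEST_TIMEOUT:-60}"     # prints 60

# Once set, the variable's own value wins over the default.
OS_TEST_TIMEOUT=120
echo "${OS_TEST_TIMEOUT:-60}"     # prints 120
```

This is how the testr configuration lets operators override capture and timeout behaviour per-run without editing the file.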

CONTRIBUTING.rst (new file, 17 lines)

@@ -0,0 +1,17 @@
If you would like to contribute to the development of OpenStack, you must
follow the steps on this page:

   http://docs.openstack.org/infra/manual/developers.html

If you already have a good understanding of how the system works and your
OpenStack accounts are set up, you can skip to the development workflow
section of this documentation to learn how changes to OpenStack should be
submitted for review via the Gerrit tool:

   http://docs.openstack.org/infra/manual/developers.html#development-workflow

Pull requests submitted through GitHub will be ignored.

Bugs should be filed on Launchpad, not GitHub:

   https://bugs.launchpad.net/mixmatch

HACKING.rst (new file, 4 lines)

@@ -0,0 +1,4 @@
mixmatch Style Commandments
===========================

Read the OpenStack Style Commandments: http://docs.openstack.org/developer/hacking/

LICENSE (new file, 176 lines)

@@ -0,0 +1,176 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.

MANIFEST.in (new file, 6 lines)

@@ -0,0 +1,6 @@
include AUTHORS
include ChangeLog
exclude .gitignore
exclude .gitreview
global-exclude *.pyc

README.rst (new file, 18 lines)

@@ -0,0 +1,18 @@
===============================
mixmatch
===============================

Combine resources across federated OpenStack deployments.

A proxy service that forwards REST API requests to a remote service provider
which is federated using Keystone-to-Keystone Federation (K2K).

The proxy learns the location of resources and forwards requests to the
correct service provider. This allows OpenStack services to use resources
provided by other federated OpenStack deployments, e.g. Nova attaching a
remote volume.

* Free software: Apache license
* Documentation: http://docs.openstack.org/developer/mixmatch
* Source: http://git.openstack.org/cgit/openstack/mixmatch
* Bugs: http://bugs.launchpad.net/mixmatch
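The routing idea described in the README can be sketched in a few lines of plain Python. This is an illustration only, not mixmatch's actual API: the names `RESOURCE_LOCATIONS`, `SP_ENDPOINTS`, and `build_forward_url` are invented for the sketch, and the real proxy uses a database-backed resource mapping and authenticated Keystone sessions.

```python
# Hypothetical sketch: the proxy remembers which service provider (SP) owns
# each resource ID, and rewrites the request URL toward that SP's endpoint.
RESOURCE_LOCATIONS = {
    # resource_id -> owning service provider (invented sample data)
    "8f1b3c2d": "coffee-sp",
}

SP_ENDPOINTS = {
    "default": "http://localhost:8776",        # local Cinder
    "coffee-sp": "http://192.168.0.106:8776",  # remote federated Cinder
}

def build_forward_url(resource_id, path):
    """Pick the owning SP (local if unknown) and build the forwarded URL."""
    sp = RESOURCE_LOCATIONS.get(resource_id, "default")
    return SP_ENDPOINTS[sp] + path

print(build_forward_url("8f1b3c2d", "/v2/volumes/8f1b3c2d"))
print(build_forward_url("unknown-id", "/v2/volumes/unknown-id"))
```

Unknown resource IDs fall back to the local deployment; the real proxy can additionally broadcast the request to all service providers (see the `search_by_broadcast` option below).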

babel.cfg (new file, 2 lines)

@@ -0,0 +1,2 @@
[python: **.py]

doc/source/conf.py (executable, new file, 75 lines)

@@ -0,0 +1,75 @@
# -*- coding: utf-8 -*-
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import sys
sys.path.insert(0, os.path.abspath('../..'))
# -- General configuration ----------------------------------------------------
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = [
    'sphinx.ext.autodoc',
    # 'sphinx.ext.intersphinx',
    'oslosphinx'
]
# autodoc generation is a bit aggressive and a nuisance when doing heavy
# text edit cycles.
# execute "export SPHINX_DEBUG=1" in your terminal to disable
# The suffix of source filenames.
source_suffix = '.rst'
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'mixmatch'
copyright = u'2016, OpenStack Foundation'
# If true, '()' will be appended to :func: etc. cross-reference text.
add_function_parentheses = True
# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
add_module_names = True
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# -- Options for HTML output --------------------------------------------------
# The theme to use for HTML and HTML Help pages. Major themes that come with
# Sphinx are currently 'default' and 'sphinxdoc'.
# html_theme_path = ["."]
# html_theme = '_theme'
# html_static_path = ['static']
# Output file base name for HTML help builder.
htmlhelp_basename = '%sdoc' % project
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, documentclass
# [howto/manual]).
latex_documents = [
    ('index',
     '%s.tex' % project,
     u'%s Documentation' % project,
     u'OpenStack Foundation', 'manual'),
]
# Example configuration for intersphinx: refer to the Python standard library.
#intersphinx_mapping = {'http://docs.python.org/': None}

(file name not shown; new file, 4 lines)

@@ -0,0 +1,4 @@
============
Contributing
============
.. include:: ../../CONTRIBUTING.rst

doc/source/index.rst (new file, 25 lines)

@@ -0,0 +1,25 @@
.. mixmatch documentation master file, created by
   sphinx-quickstart on Tue Jul 9 22:26:36 2013.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

Welcome to mixmatch's documentation!
========================================================

Contents:

.. toctree::
   :maxdepth: 2

   readme
   installation
   usage
   contributing

Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

(file name not shown; new file, 12 lines)

@@ -0,0 +1,12 @@
============
Installation
============
At the command line::

    $ pip install mixmatch

Or, if you have virtualenvwrapper installed::

    $ mkvirtualenv mixmatch
    $ pip install mixmatch

doc/source/readme.rst (new file, 1 line)

@@ -0,0 +1 @@
.. include:: ../../README.rst

doc/source/usage.rst (new file, 7 lines)

@@ -0,0 +1,7 @@
========
Usage
========
To use mixmatch in a project::

    import mixmatch

etc/k2k-proxy.conf (new file, 52 lines)

@@ -0,0 +1,52 @@
[database]
connection="sqlite:////home/ubuntu/proxy.db"
[keystone]
auth_url="http://127.0.0.1:5000/v3"
username="admin"
user_domain_id="default"
password="nomoresecrete"
project_name="admin"
project_domain_id="default"
[proxy]
aggregation=True
token_caching=False
search_by_broadcast=True
service_providers=default, coffee-sp
caching=True
image_api_versions = v2.3, v2.2, v2.1, v2.0, v1.1, v1.0
volume_api_versions = v3.0, v2.0, v1.0
[cache]
enabled=True
backend=dogpile.cache.memory
[sp_default]
sp_name=default
messagebus="rabbit://stackrabbit:stackqueue@localhost"
auth_url="http://127.0.0.1:5000/v3"
image_endpoint="http://localhost:9292"
volume_endpoint="http://localhost:8776"
[sp_coffee-sp]
sp_name=coffee-sp
messagebus="rabbit://stackrabbit:stackqueue@192.168.0.106"
auth_url="http://192.168.0.106:5000/v3"
image_endpoint="http://192.168.0.106:9292"
volume_endpoint="http://192.168.0.106:8776"
# Logging
[loggers]
keys = root
[handlers]
keys = stdout
[formatters]
keys = default
[logger_root]
level = DEBUG
handlers = stdout
formatter = default
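The layout of this file follows a convention that `mixmatch/config.py` (below) relies on: every name listed in `[proxy] service_providers` has a matching `[sp_<name>]` section. mixmatch itself parses the file with oslo.config, but the section-naming scheme can be illustrated with the stdlib `configparser` alone:

```python
# Illustration only: mixmatch uses oslo.config, not configparser. The sample
# string mirrors the structure of etc/k2k-proxy.conf above.
import configparser

sample = """
[proxy]
service_providers = default, coffee-sp

[sp_default]
sp_name = default
auth_url = http://127.0.0.1:5000/v3

[sp_coffee-sp]
sp_name = coffee-sp
auth_url = http://192.168.0.106:5000/v3
"""

cfg = configparser.ConfigParser()
cfg.read_string(sample)

# Each listed provider maps to a section named 'sp_<provider>'.
sps = [s.strip() for s in cfg["proxy"]["service_providers"].split(",")]
for sp in sps:
    section = "sp_%s" % sp
    print(sp, "->", cfg[section]["auth_url"])
```

A provider listed in `[proxy]` without a corresponding `[sp_...]` section would leave the proxy with no endpoints or message bus for that provider.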

httpd/apache.conf (new file, 17 lines)

@@ -0,0 +1,17 @@
LoadModule wsgi_module modules/mod_wsgi.so
WSGISocketPrefix /var/run/wsgi
Listen 5001
<VirtualHost *:5001>
    WSGIPassAuthorization On
    WSGIChunkedRequest On
    #WSGIDaemonProcess k2k-proxy user=ubuntu group=ubuntu threads=2

    WSGIScriptAlias / /home/ubuntu/k2k-proxy/httpd/k2k-proxy.wsgi

    <Directory /home/ubuntu/k2k-proxy/httpd>
        #WSGIProcessGroup k2k-proxy
        #WSGIApplicationGroup %{GLOBAL}
        Require all granted
    </Directory>
</VirtualHost>

httpd/k2k-proxy.wsgi (new file, 15 lines)

@@ -0,0 +1,15 @@
# Copyright 2016 Massachusetts Open Cloud
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from mixmatch.proxy import app as application

mixmatch/__init__.py (new file, 19 lines)

@@ -0,0 +1,19 @@
# -*- coding: utf-8 -*-
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import pbr.version
__version__ = pbr.version.VersionInfo(
    'mixmatch').version_string()

mixmatch/auth.py (new file, 104 lines)

@@ -0,0 +1,104 @@
# Copyright 2016 Massachusetts Open Cloud
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from keystoneauth1 import identity
from keystoneauth1 import session
from keystoneclient import v3
from keystoneauth1.exceptions import http
import json
from flask import abort
from mixmatch import config
from mixmatch.config import LOG, CONF, get_conf_for_sp


@config.MEMOIZE_SESSION
def get_client():
    """Return a Keystone client capable of validating tokens."""
    LOG.info("Getting Admin Client")
    service_auth = identity.Password(
        auth_url=CONF.keystone.auth_url,
        username=CONF.keystone.username,
        password=CONF.keystone.password,
        project_name=CONF.keystone.project_name,
        project_domain_id=CONF.keystone.project_domain_id,
        user_domain_id=CONF.keystone.user_domain_id
    )
    local_session = session.Session(auth=service_auth)
    return v3.client.Client(session=local_session)


@config.MEMOIZE_SESSION
def get_local_auth(user_token):
    """Return a Keystone session for the local cluster."""
    LOG.info("Getting session for %s" % user_token)
    client = get_client()
    token = v3.tokens.TokenManager(client)

    try:
        token_data = token.validate(token=user_token, include_catalog=False)
    except http.NotFound:
        abort(401)

    project_id = token_data['project']['id']
    local_auth = identity.v3.Token(auth_url=CONF.keystone.auth_url,
                                   token=user_token,
                                   project_id=project_id)
    return session.Session(auth=local_auth)


@config.MEMOIZE_SESSION
def get_unscoped_sp_auth(service_provider, user_token):
    """Perform K2K auth, and return an unscoped session."""
    conf = get_conf_for_sp(service_provider)
    local_auth = get_local_auth(user_token).auth
    LOG.info("Getting unscoped session for (%s, %s)" % (service_provider,
                                                        user_token))
    remote_auth = identity.v3.Keystone2Keystone(
        local_auth,
        conf.sp_name
    )
    return session.Session(auth=remote_auth)


def get_projects_at_sp(service_provider, user_token):
    """Perform K2K auth, and return the projects that can be scoped to."""
    conf = get_conf_for_sp(service_provider)
    unscoped_session = get_unscoped_sp_auth(service_provider, user_token)
    r = json.loads(str(unscoped_session.get(
        conf.auth_url + "/OS-FEDERATION/projects").text))
    return [project[u'id'] for project in r[u'projects']]


@config.MEMOIZE_SESSION
def get_sp_auth(service_provider, user_token, remote_project_id):
    """Perform K2K auth, and return a session for a remote cluster."""
    conf = get_conf_for_sp(service_provider)
    local_auth = get_local_auth(user_token).auth
    LOG.info("Getting session for (%s, %s, %s)" % (service_provider,
                                                   user_token,
                                                   remote_project_id))
    remote_auth = identity.v3.Keystone2Keystone(
        local_auth,
        conf.sp_name,
        project_id=remote_project_id
    )
    return session.Session(auth=remote_auth)

mixmatch/config.py (new file, 169 lines)

@@ -0,0 +1,169 @@
# Copyright 2016 Massachusetts Open Cloud
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from os import path
from oslo_config import cfg
from oslo_log import log
from oslo_cache import core as cache
LOG = log.getLogger('root')
CONF = cfg.CONF
# Proxy
proxy_group = cfg.OptGroup(name='proxy',
                           title='Proxy Config Group')
proxy_opts = [
    cfg.IntOpt('port',
               default=5001,
               help='Web Server Port'),
    cfg.ListOpt('service_providers',
                default=[],
                help='List of service providers'),
    cfg.BoolOpt('search_by_broadcast',
                default=False,
                help='Search All Service Providers on Unknown Resource ID'),
    cfg.BoolOpt('aggregation',
                default=False,
                help='Enable Aggregation when listing resources.'),
    cfg.BoolOpt('caching',
                default=True,
                help='Enable token caching using oslo.cache'),
    cfg.IntOpt('cache_time',
               default=600,
               help='How long to store cached tokens for'),
    cfg.ListOpt('image_api_versions',
                default=['v2.3', 'v2.2', 'v2.1', 'v2.0', 'v1.1', 'v1.0'],
                help='List of supported image api versions'),
    cfg.ListOpt('volume_api_versions',
                default=['v3.0', 'v2.0', 'v1.0'],
                help='List of supported volume api versions'),
]

# Keystone
keystone_group = cfg.OptGroup(name='keystone',
                              title='Keystone Config Group')
keystone_opts = [
    cfg.StrOpt('auth_url',
               default='http://localhost:35357/v3',
               help='Keystone AUTH URL'),
    cfg.StrOpt('username',
               default='admin',
               help='Proxy username'),
    cfg.StrOpt('user_domain_id',
               default='default',
               help='Proxy user domain id'),
    cfg.StrOpt('password',
               default='nomoresecrete',
               help='Proxy user password'),
    cfg.StrOpt('project_name',
               default='admin',
               help='Proxy project name'),
    cfg.StrOpt('project_domain_id',
               default='default',
               help='Proxy project domain id')
]
CONF.register_group(proxy_group)
CONF.register_opts(proxy_opts, proxy_group)
CONF.register_group(keystone_group)
CONF.register_opts(keystone_opts, keystone_group)
# Logging
log.register_options(CONF)
# Caching
cache.configure(CONF)
MEMOIZE_SESSION = None
session_cache_region = cache.create_region()
MEMOIZE_SESSION = cache.get_memoization_decorator(
    CONF, session_cache_region, group="proxy")


def load_config():
    """Load parameters from the proxy's config file."""
    conf_files = [f for f in ['k2k-proxy.conf',
                              'etc/k2k-proxy.conf',
                              '/etc/k2k-proxy.conf'] if path.isfile(f)]
    if conf_files:
        CONF(default_config_files=conf_files)


def more_config():
    """Perform configuration that must be delayed until after import time.

    This code must be delayed until the config files have been loaded. They
    are in a separate file so that unit tests can run them without loading
    configuration from a file.
    """
    cache.configure_cache_region(CONF, session_cache_region)

    for service_provider in CONF.proxy.service_providers:
        sp_group = cfg.OptGroup(name='sp_%s' % service_provider,
                                title=service_provider)
        sp_opts = [
            cfg.StrOpt('sp_name',
                       default="default",
                       help='SP ID in Keystone Catalog. Omit for local.'),
            cfg.StrOpt('messagebus',
                       help='URI to connect to message bus'),
            cfg.StrOpt('services',
                       default=None,
                       help='Enabled services for this service provider.'),
            cfg.StrOpt('auth_url',
                       default=None,
                       help='Keystone AUTH URL for Service Provider'),
            cfg.StrOpt('image_endpoint',
                       default=None,
                       help="Image Endpoint for Service Provider"),
            cfg.StrOpt('volume_endpoint',
                       default=None,
                       help="Volume Endpoint for Service Provider")
        ]
        CONF.register_group(sp_group)
        CONF.register_opts(sp_opts, sp_group)

    log.setup(CONF, 'demo')


def get_conf_for_sp(sp_id):
    """Get the configuration object for a specific service provider."""
    return CONF.__getattr__('sp_%s' % sp_id)
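One subtlety in `load_config` is worth spelling out: a guard written as `conf_files is not []` compares object *identity* rather than emptiness, so it is true even when the list is empty, because a fresh list literal is always a distinct object. A plain truthiness check is the correct way to skip loading when no config file was found:

```python
# Demonstration of identity vs. equality vs. truthiness for lists.
a = []
b = []

print(a is b)    # False: two distinct list objects, so `a is not b` is True
print(a == b)    # True: equal contents
print(bool(a))   # False: an empty list is falsy, so `if a:` skips the body
```

This is why `if conf_files:` correctly falls back to oslo.config defaults when none of the candidate paths exist, whereas an identity comparison against `[]` would never take the false branch.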

mixmatch/listener.py (new file, 186 lines)

@@ -0,0 +1,186 @@
# Copyright 2016 Massachusetts Open Cloud
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import oslo_messaging
from mixmatch import config
from mixmatch.config import CONF, LOG
from mixmatch import model
from mixmatch.model import insert, delete, ResourceMapping
import eventlet
eventlet.monkey_patch()


class VolumeCreateEndpoint(object):
    filter_rule = oslo_messaging.NotificationFilter(
        publisher_id='^volume.*',
        event_type='^volume.create.start$')

    def __init__(self, sp_name):
        self.sp_name = sp_name

    def info(self, ctxt, publisher_id, event_type, payload, metadata):
        LOG.info('Creating volume mapping %s -> %s at %s' % (
            payload['volume_id'],
            payload['tenant_id'],
            self.sp_name))
        insert(ResourceMapping("volumes",
                               payload['volume_id'],
                               payload['tenant_id'],
                               self.sp_name))


class VolumeDeleteEndpoint(object):
    filter_rule = oslo_messaging.NotificationFilter(
        publisher_id='^volume.*',
        event_type='^volume.delete.end$')

    def __init__(self, sp_name):
        self.sp_name = sp_name

    def info(self, ctxt, publisher_id, event_type, payload, metadata):
        LOG.info('Deleting volume mapping %s -> %s at %s' % (
            payload['volume_id'],
            payload['tenant_id'],
            self.sp_name))
        delete(ResourceMapping.find("volumes", payload['volume_id']))


class VolumeTransferEndpoint(object):
    filter_rule = oslo_messaging.NotificationFilter(
        publisher_id='^volume.*',
        event_type='^volume.transfer.accept.end$')

    def __init__(self, sp_name):
        self.sp_name = sp_name

    def info(self, ctxt, publisher_id, event_type, payload, metadata):
        LOG.info('Moving volume mapping %s -> %s at %s' % (
            payload['volume_id'],
            payload['tenant_id'],
            self.sp_name))
        mapping = ResourceMapping.find("volumes", payload['volume_id'])
        # Since we're manually updating a field, we have to sanitize the UUID
        # ourselves.
        mapping.tenant_id = payload['tenant_id'].replace("-", "")


class SnapshotCreateEndpoint(object):
    filter_rule = oslo_messaging.NotificationFilter(
        publisher_id='^snapshot.*',
        event_type='^snapshot.create.start$')

    def __init__(self, sp_name):
        self.sp_name = sp_name

    def info(self, ctxt, publisher_id, event_type, payload, metadata):
        LOG.info('Creating snapshot mapping %s -> %s at %s' % (
            payload['snapshot_id'],
            payload['tenant_id'],
            self.sp_name))
        insert(ResourceMapping("snapshots",
                               payload['snapshot_id'],
                               payload['tenant_id'],
                               self.sp_name))


class SnapshotDeleteEndpoint(object):
    filter_rule = oslo_messaging.NotificationFilter(
        publisher_id='^snapshot.*',
        event_type='^snapshot.delete.end$')

    def __init__(self, sp_name):
        self.sp_name = sp_name

    def info(self, ctxt, publisher_id, event_type, payload, metadata):
        LOG.info('Deleting snapshot mapping %s -> %s at %s' % (
            payload['snapshot_id'],
            payload['tenant_id'],
            self.sp_name))
        delete(ResourceMapping.find("snapshots", payload['snapshot_id']))


class ImageCreateEndpoint(object):
    filter_rule = oslo_messaging.NotificationFilter(
        publisher_id='^image.*',
        event_type='^image.create$')

    def __init__(self, sp_name):
        self.sp_name = sp_name

    def info(self, ctxt, publisher_id, event_type, payload, metadata):
        LOG.info('Creating image mapping %s -> %s at %s' % (
            payload['id'],
            payload['owner'],
            self.sp_name))
        insert(ResourceMapping("images",
                               payload['id'],
                               payload['owner'],
                               self.sp_name))


class ImageDeleteEndpoint(object):
    filter_rule = oslo_messaging.NotificationFilter(
        publisher_id='^image.*',
        event_type='^image.delete$')

    def __init__(self, sp_name):
        self.sp_name = sp_name

    def info(self, ctxt, publisher_id, event_type, payload, metadata):
        LOG.info('Deleting image mapping %s -> %s at %s' % (
            payload['id'],
            payload['owner'],
            self.sp_name))
        delete(ResourceMapping.find("images", payload['id']))


def get_endpoints_for_sp(sp_name):
    return [
        VolumeCreateEndpoint(sp_name),
        VolumeDeleteEndpoint(sp_name),
        VolumeTransferEndpoint(sp_name),
        SnapshotCreateEndpoint(sp_name),
        SnapshotDeleteEndpoint(sp_name),
        ImageCreateEndpoint(sp_name),
        ImageDeleteEndpoint(sp_name)
    ]


def get_server_for_sp(sp):
    """Get notification listener for a particular service provider.

    The server can be run in the background under eventlet using .start()
    """
    cfg = config.get_conf_for_sp(sp)
    transport = oslo_messaging.get_notification_transport(CONF, cfg.messagebus)
    targets = [oslo_messaging.Target(topic='notifications')]
    return oslo_messaging.get_notification_listener(
        transport,
        targets,
        get_endpoints_for_sp(cfg.sp_name),
        executor='eventlet')


if __name__ == "__main__":
    config.load_config()
    config.more_config()
    model.create_tables()
    LOG.info("Now listening for changes")
    for sp in CONF.proxy.service_providers:
        get_server_for_sp(sp).start()

    while True:
        eventlet.sleep(5)
        # XXX do something moderately more intelligent than this...
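Each endpoint's `filter_rule` above pairs two anchored regular expressions, matched against the notification's `publisher_id` and `event_type`. The matching behaviour itself can be illustrated with the stdlib `re` module (the sample publisher IDs below are invented; real Cinder publisher IDs follow the `volume.<host>` pattern):

```python
import re

# The patterns from VolumeCreateEndpoint above.
publisher_rule = re.compile('^volume.*')
event_rule = re.compile('^volume.create.start$')

def matches(publisher_id, event_type):
    """True when both fields satisfy the endpoint's filter rule,
    mimicking how NotificationFilter gates calls to info()."""
    return bool(publisher_rule.match(publisher_id) and
                event_rule.match(event_type))

print(matches('volume.localhost', 'volume.create.start'))    # True
print(matches('snapshot.localhost', 'volume.create.start'))  # False
print(matches('volume.localhost', 'volume.create.end'))      # False
```

Only notifications passing both checks reach an endpoint's `info()` method, which is why each create/delete/transfer event type gets its own endpoint class.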

mixmatch/model.py (new file, 77 lines)

@@ -0,0 +1,77 @@
# Copyright 2016 Massachusetts Open Cloud
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import sqlalchemy as sql
from sqlalchemy.ext.declarative import declarative_base
from oslo_db.sqlalchemy import enginefacade
from oslo_db.sqlalchemy import models
BASE = declarative_base(cls=models.ModelBase)
class ResourceMapping(BASE):
"""The location of a particular resource."""
__tablename__ = 'resource_mapping'
id = sql.Column(sql.Integer, primary_key=True)
resource_type = sql.Column(sql.String(60), nullable=False)
resource_id = sql.Column(sql.String(255), nullable=False)
resource_sp = sql.Column(sql.String(255), nullable=False)
tenant_id = sql.Column(sql.String(255), nullable=False)
def __init__(self, resource_type, resource_id, tenant_id, resource_sp):
self.resource_type = resource_type
self.resource_id = resource_id.replace("-", "")
self.tenant_id = tenant_id.replace("-", "")
self.resource_sp = resource_sp
def __repr__(self):
return str((self.resource_type, self.resource_id, self.resource_sp))
def __eq__(self, other):
return (self.resource_type == other.resource_type and
self.resource_id == other.resource_id and
self.resource_sp == other.resource_sp and
self.tenant_id == other.tenant_id)
def __ne__(self, other):
return not self.__eq__(other)
@classmethod
def find(cls, resource_type, resource_id):
context = enginefacade.transaction_context()
with enginefacade.reader.using(context) as session:
mapping = session.query(ResourceMapping).filter_by(
resource_type=resource_type,
resource_id=resource_id.replace("-", "")
).first()
return mapping
def insert(entity):
context = enginefacade.transaction_context()
with enginefacade.writer.using(context) as session:
session.add(entity)
def delete(entity):
context = enginefacade.transaction_context()
with enginefacade.writer.using(context) as session:
session.delete(entity)
def create_tables():
BASE.metadata.create_all(enginefacade.get_legacy_facade().get_engine())
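`ResourceMapping` strips dashes from IDs both in `__init__` and in `find`, so hyphenated and bare UUID strings resolve to the same row. A small sketch of that invariant (the helper name is illustrative):

```python
def normalize_id(resource_id):
    # Same dash-stripping ResourceMapping applies on insert and lookup.
    return resource_id.replace("-", "")


hyphenated = '6c4ae06e-14bd-422e-97af-e07223c99e18'
bare = '6c4ae06e14bd422e97afe07223c99e18'
assert normalize_id(hyphenated) == bare
assert normalize_id(bare) == bare  # already-normalized IDs pass through
```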

mixmatch/proxy.py Normal file
@@ -0,0 +1,273 @@
# Copyright 2016 Massachusetts Open Cloud
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import uuid
import requests
import flask
from flask import abort
from mixmatch import config
from mixmatch.config import LOG, CONF
from mixmatch.session import app
from mixmatch.session import chunked_reader
from mixmatch.session import request
from mixmatch import auth
from mixmatch import model
from mixmatch import services
def stream_response(response):
yield response.raw.read()
def is_valid_uuid(value):
try:
uuid.UUID(value, version=4)
return True
except ValueError:
return False
class RequestHandler:
def __init__(self, method, path, headers):
self.method = method
self.path = path
self.headers = headers
self.request_path = path.split('/')
# workaround to fix glance requests
# that do not contain the image prefix
if self.request_path[0] in ['v1', 'v2']:
self.request_path.insert(0, 'image')
self.service_type = self.request_path[0]
if len(self.request_path) == 1:
# unversioned calls with no action
self._forward = self._list_api_versions
return
elif len(self.request_path) == 2:
# versioned calls with no action
abort(400)
self.version = self.request_path[1]
self.detailed = True
if self.service_type == 'image':
# /image/{version}/{action}
self.action = self.request_path[2:]
elif self.service_type == 'volume':
# /volume/{version}/{project_id}/{action}
self.action = self.request_path[3:]
# if request is to /volumes, change it
# to /volumes/detail for aggregation
if self.method == 'GET' \
and self.action[-1] == 'volumes':
self.detailed = False
self.action.append('detail')
else:
raise ValueError
if self.method in ['GET']:
self.stream = True
else:
self.stream = False
resource_id = None
mapping = None
aggregate = False
if len(self.action) > 1 and is_valid_uuid(self.action[1]):
resource_id = self.action[1]
mapping = model.ResourceMapping.find(
resource_type=self.action[0],
resource_id=resource_id)
else:
if self.method == 'GET' \
and self.action[0] in ['images', 'volumes', 'snapshots']:
aggregate = True
self.local_token = headers['X-AUTH-TOKEN']
LOG.info('Local Token: %s ' % self.local_token)
if 'MM-SERVICE-PROVIDER' in headers and 'MM-PROJECT-ID' in headers:
# The user wants a specific service provider, use that SP.
self.service_provider = headers['MM-SERVICE-PROVIDER']
self.project_id = headers['MM-PROJECT-ID']
self._forward = self._targeted_forward
elif aggregate:
self._forward = self._aggregate_forward
elif mapping:
# Which we already know the location of, use that SP.
self.service_provider = mapping.resource_sp
self.project_id = mapping.tenant_id
self._forward = self._targeted_forward
else:
self._forward = self._search_forward
def _do_request_on(self, sp, project_id=None):
if sp == 'default':
auth_session = auth.get_local_auth(self.local_token)
else:
auth_session = auth.get_sp_auth(sp,
self.local_token,
project_id)
headers = self._prepare_headers(self.headers)
headers['X-AUTH-TOKEN'] = auth_session.get_token()
url = services.construct_url(
sp,
self.service_type,
self.version,
self.action,
project_id=auth_session.get_project_id()
)
LOG.info('%s: %s' % (self.method, url))
if self.chunked:
return requests.request(method=self.method,
url=url,
headers=headers,
data=chunked_reader())
else:
return requests.request(method=self.method,
url=url,
headers=headers,
data=request.data,
stream=self.stream,
params=self._prepare_args(request.args))
def _finalize(self, response):
if not self.stream:
final_response = flask.Response(
response.text,
response.status_code
)
for key, value in response.headers.items():
final_response.headers[key] = value
else:
final_response = flask.Response(
flask.stream_with_context(stream_response(response)),
response.status_code,
content_type=response.headers['content-type']
)
return final_response
def _local_forward(self):
return self._finalize(self._do_request_on('default'))
def _targeted_forward(self):
return self._finalize(
self._do_request_on(self.service_provider, self.project_id))
def _search_forward(self):
if not CONF.proxy.search_by_broadcast:
return self._local_forward()
for sp in CONF.proxy.service_providers:
if sp == 'default':
response = self._do_request_on('default')
if 200 <= response.status_code < 300:
return self._finalize(response)
else:
self.service_provider = sp
for project in auth.get_projects_at_sp(sp, self.local_token):
response = self._do_request_on(sp, project)
if 200 <= response.status_code < 300:
return self._finalize(response)
return flask.Response(
"Not found.\n",
404
)
def _aggregate_forward(self):
if not CONF.proxy.aggregation:
return self._local_forward()
responses = {}
for sp in CONF.proxy.service_providers:
if sp == 'default':
responses['default'] = self._do_request_on('default')
else:
for proj in auth.get_projects_at_sp(sp, self.local_token):
responses[(sp, proj)] = self._do_request_on(sp, proj)
return flask.Response(
services.aggregate(responses,
self.action[0],
request.args.to_dict(),
request.base_url,
detailed=self.detailed),
200,
content_type=responses['default'].headers['content-type']
)
def _list_api_versions(self):
return services.list_api_versions(self.service_type,
request.base_url)
def forward(self):
return self._forward()
@staticmethod
def _prepare_headers(user_headers):
headers = dict()
headers['Accept'] = user_headers.get('Accept', '')
headers['Content-Type'] = user_headers.get('Content-Type', '')
for key, value in user_headers.items():
if key.lower().startswith('x-') and key.lower() != 'x-auth-token':
headers[key] = value
return headers
@staticmethod
def _prepare_args(user_args):
"""Prepare the GET arguments by removing the limit and marker.
This is because the id of the marker will only be present in one of
the service providers.
"""
args = user_args.copy()
args.pop('limit', None)
args.pop('marker', None)
return args
@property
def chunked(self):
return self.headers.get('Transfer-Encoding', '').lower() == 'chunked'
@app.route('/', defaults={'path': ''}, methods=['GET', 'POST', 'PUT',
'DELETE', 'HEAD', 'PATCH'])
@app.route('/<path:path>', methods=['GET', 'POST', 'PUT',
'DELETE', 'HEAD', 'PATCH'])
def proxy(path):
k2k_request = RequestHandler(request.method, path, request.headers)
return k2k_request.forward()
def main():
config.load_config()
config.more_config()
model.create_tables()
if __name__ == "__main__":
main()
app.run(port=5001, threaded=True)
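`RequestHandler.__init__` above derives the service type, version, and action entirely from the URL path. A self-contained sketch of those splitting rules (the glance prefix fix-up and the per-service action offsets), with hypothetical inputs:

```python
def parse_path(path):
    parts = path.split('/')
    # Glance clients may omit the 'image' prefix; re-insert it.
    if parts[0] in ('v1', 'v2'):
        parts.insert(0, 'image')
    service_type = parts[0]
    version = parts[1] if len(parts) > 1 else None
    if service_type == 'image':
        action = parts[2:]   # /image/{version}/{action}
    else:
        action = parts[3:]   # /volume/{version}/{project_id}/{action}
    return service_type, version, action
```

For example, `parse_path('v2/images/abc')` yields `('image', 'v2', ['images', 'abc'])`.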

mixmatch/services.py Normal file
@@ -0,0 +1,198 @@
# Copyright 2016 Massachusetts Open Cloud
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import json
import os
import operator
from six.moves.urllib import parse
from mixmatch import config
CONF = config.CONF
def construct_url(service_provider, service_type,
version, action, project_id=None):
"""Construct the full URL for an Openstack API call."""
conf = config.get_conf_for_sp(service_provider)
if service_type == 'image':
endpoint = conf.image_endpoint
return "%(endpoint)s/%(version)s/%(action)s" % {
'endpoint': endpoint,
'version': version,
'action': os.path.join(*action)
}
elif service_type == 'volume':
endpoint = conf.volume_endpoint
return "%(endpoint)s/%(version)s/%(project)s/%(action)s" % {
'endpoint': endpoint,
'version': version,
'project': project_id,
'action': os.path.join(*action)
}
else:
raise ValueError
def aggregate(responses, key, params=None, path=None, detailed=True):
"""Combine responses from several clusters into one response."""
if params:
limit = int(params.get('limit', 0))
sort = params.get('sort', None)
marker = params.get('marker', None)
sort_key = params.get('sort_key', None)
sort_dir = params.get('sort_dir', None)
if sort and not sort_key:
sort_key, sort_dir = sort.split(':')
else:
sort_key = None
limit = 0
marker = None
resource_list = []
for location, response in responses.items():
resources = json.loads(response.text)
if isinstance(resources, dict):
resource_list += resources[key]
start = 0
last = end = len(resource_list)
if sort_key:
resource_list = sorted(resource_list,
key=operator.itemgetter(sort_key),
reverse=_is_reverse(sort_dir))
if marker:
# Find the position of the resource with marker id
# and set the list to start at the one after that.
for index, item in enumerate(resource_list):
if item['id'] == marker:
start = index + 1
break
if limit != 0:
end = start + limit
# this hack is to handle GET requests to /volumes
# we automatically make the call to /volumes/detail
# because we need sorting information. Here we
# remove the extra values /volumes/detail provides
if key == 'volumes' and not detailed:
resource_list[start:end] = \
_remove_details(resource_list[start:end])
response = {key: resource_list[start:end]}
# Inject the pagination URIs
if start > 0:
params.pop('marker', None)
response['start'] = '%s?%s' % (path, parse.urlencode(params))
if end < last:
params['marker'] = response[key][-1]['id']
response['next'] = '%s?%s' % (path, parse.urlencode(params))
return json.dumps(response)
def list_api_versions(service_type, url):
api_versions = list()
if service_type == 'image':
supported_versions = CONF.proxy.image_api_versions
for version in supported_versions:
info = dict()
if version == supported_versions[0]:
info.update({'status': 'CURRENT'})
else:
info.update({'status': 'SUPPORTED'})
info.update({
'id': version,
'links': [
{'href': '%s/%s/' % (url,
version[:-2]),
'rel': 'self'}
]
})
api_versions.append(info)
return json.dumps({'versions': api_versions})
elif service_type == 'volume':
supported_versions = CONF.proxy.volume_api_versions
for version in supported_versions:
info = dict()
if version == supported_versions[0]:
info.update({
'status': 'CURRENT',
'min_version': version[1:],
'version': version[1:]
})
else:
info.update({
'status': 'SUPPORTED',
'min_version': '',
'version': ''
})
info.update({
'id': version,
'updated': '2014-06-28T12:20:21Z', # FIXME
'links': [
{'href': 'http://docs.openstack.org/',
'type': 'text/html',
'rel': 'describedby'},
{'href': '%s/%s/' % (url,
version[:-2]),
'rel': 'self'}
],
'media-types': [
{'base': 'application/json',
'type':
'application/vnd.openstack.volume+json;version=%s'
% version[1:-2]},
{'base': 'application/xml',
'type':
'application/vnd.openstack.volume+xml;version=%s'
% version[1:-2]}
]
})
api_versions.append(info)
return json.dumps({'versions': api_versions})
else:
raise ValueError
def _is_reverse(order):
"""Return False if order is asc, True if order is desc"""
if order == 'asc':
return False
elif order == 'desc':
return True
else:
raise ValueError
def _remove_details(volumes):
"""Delete key, value pairs if key is not in keys"""
keys = ['id', 'links', 'name']
for i in range(len(volumes)):
volumes[i] = {key: volumes[i][key] for key in keys}
return volumes
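The sort/marker/limit handling inside `aggregate()` above can be isolated into a small helper. A sketch under the same semantics (sort first, then resume after the marker id, then apply the limit; the helper name is illustrative):

```python
import operator


def paginate(items, marker=None, limit=0, sort_key=None, sort_dir='asc'):
    """Mirror the sort/marker/limit slicing in aggregate() above."""
    if sort_key:
        items = sorted(items, key=operator.itemgetter(sort_key),
                       reverse=(sort_dir == 'desc'))
    start = 0
    if marker:
        # Start at the item after the one with the marker id.
        for index, item in enumerate(items):
            if item['id'] == marker:
                start = index + 1
                break
    end = start + limit if limit else len(items)
    return items[start:end]
```

For example, `paginate(vols, marker='a', limit=1)` returns the single item following id `'a'`.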

mixmatch/session.py Normal file
@@ -0,0 +1,40 @@
# Copyright 2016 Massachusetts Open Cloud
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import flask
app = flask.Flask(__name__)
request = flask.request
def chunked_reader():
try:
# If we're running under uWSGI, use the uwsgi.chunked_read method
# to read chunked input.
import uwsgi # noqa
while True:
chunk = uwsgi.chunked_read()
if len(chunk) > 0:
yield chunk
else:
return
except ImportError:
# Otherwise try to read the wsgi input. This works in embedded Apache.
stream = flask.request.environ["wsgi.input"]
try:
while True:
yield next(stream)
except StopIteration:
return
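The `wsgi.input` branch above pulls one line at a time from the stream. An equivalent stdlib pattern reads fixed-size blocks from any file-like object (a sketch, not the uWSGI code path; the function name is illustrative):

```python
import io


def read_blocks(stream, block_size=8192):
    """Yield successive blocks until the stream is exhausted."""
    while True:
        block = stream.read(block_size)
        if not block:
            return
        yield block
```

`b''.join(read_blocks(io.BytesIO(body)))` reassembles the original body.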

View File
@@ -0,0 +1,155 @@
# Copyright 2016 Massachusetts Open Cloud
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
IMAGE_LIST_V2 = {
"images": [
{
"checksum": "eb9139e4942121f22bbc2afc0400b2a4",
"container_format": "ami",
"created_at": "2016-08-19T15:47:10Z",
"disk_format": "ami",
"file": "/v2/images/61f655c0-4511-4307-a257-4162c87a5130/file",
"id": "61f655c0-4511-4307-a257-4162c87a5130",
"kernel_id": "130b7b71-487a-4553-b336-0a72ec590c99",
"min_disk": 0,
"min_ram": 0,
"name": "cirros-0.3.4-x86_64-uec",
"owner": "5f4358e168b747a487fe34e64c5619b2",
"protected": False,
"ramdisk_id": "941882c5-b992-4fa9-bcba-9d25d2f4e3b8",
"schema": "/v2/schemas/image",
"self": "/v2/images/61f655c0-4511-4307-a257-4162c87a5130",
"size": 25165824,
"status": "active",
"tags": [],
"updated_at": "2016-08-19T15:47:10Z",
"virtual_size": None,
"visibility": "public"
},
{
"checksum": "be575a2b939972276ef675752936977f",
"container_format": "ari",
"created_at": "2016-08-19T15:47:08Z",
"disk_format": "ari",
"file": "/v2/images/941882c5-b992-4fa9-bcba-9d25d2f4e3b8/file",
"id": "941882c5-b992-4fa9-bcba-9d25d2f4e3b8",
"min_disk": 0,
"min_ram": 0,
"name": "cirros-0.3.4-x86_64-uec-ramdisk",
"owner": "5f4358e168b747a487fe34e64c5619b2",
"protected": False,
"schema": "/v2/schemas/image",
"self": "/v2/images/941882c5-b992-4fa9-bcba-9d25d2f4e3b8",
"size": 3740163,
"status": "active",
"tags": [],
"updated_at": "2016-08-19T15:47:08Z",
"virtual_size": None,
"visibility": "public"
},
{
"checksum": "8a40c862b5735975d82605c1dd395796",
"container_format": "aki",
"created_at": "2016-08-19T15:46:58Z",
"disk_format": "aki",
"file": "/v2/images/130b7b71-487a-4553-b336-0a72ec590c99/file",
"id": "130b7b71-487a-4553-b336-0a72ec590c99",
"min_disk": 0,
"min_ram": 0,
"name": "cirros-0.3.4-x86_64-uec-kernel",
"owner": "5f4358e168b747a487fe34e64c5619b2",
"protected": False,
"schema": "/v2/schemas/image",
"self": "/v2/images/130b7b71-487a-4553-b336-0a72ec590c99",
"size": 4979632,
"status": "active",
"tags": [],
"updated_at": "2016-08-19T15:47:02Z",
"virtual_size": None,
"visibility": "public"
}
]
}
IMAGE_LIST_V2_2 = {
"images": [
{
"status": "active",
"name": "cirros-0.3.2-x86_64-disk",
"tags": [],
"container_format": "bare",
"created_at": "2014-11-07T17:07:06Z",
"disk_format": "qcow2",
"updated_at": "2014-11-07T17:19:09Z",
"visibility": "public",
"self": "/v2/images/1bea47ed-f6a9-463b-b423-14b9cca9ad27",
"min_disk": 0,
"protected": False,
"id": "1bea47ed-f6a9-463b-b423-14b9cca9ad27",
"file": "/v2/images/1bea47ed-f6a9-463b-b423-14b9cca9ad27/file",
"checksum": "64d7c1cd2b6f60c92c14662941cb7913",
"owner": "5ef70662f8b34079a6eddb8da9d75fe8",
"size": 13167616,
"min_ram": 0,
"schema": "/v2/schemas/image",
"virtual_size": None
},
{
"status": "active",
"name": "F17-x86_64-cfntools",
"tags": [],
"container_format": "bare",
"created_at": "2014-10-30T08:23:39Z",
"disk_format": "qcow2",
"updated_at": "2014-11-03T16:40:10Z",
"visibility": "public",
"self": "/v2/images/781b3762-9469-4cec-b58d-3349e5de4e9c",
"min_disk": 0,
"protected": False,
"id": "781b3762-9469-4cec-b58d-3349e5de4e9c",
"file": "/v2/images/781b3762-9469-4cec-b58d-3349e5de4e9c/file",
"checksum": "afab0f79bac770d61d24b4d0560b5f70",
"owner": "5ef70662f8b34079a6eddb8da9d75fe8",
"size": 476704768,
"min_ram": 0,
"schema": "/v2/schemas/image",
"virtual_size": None
}
]
}
VOLUME_LIST_V2 = {
"volumes": [
{
"id": "69baebf2-c242-47f4-b0a3-ab1761cfe755",
"links": [
{
"href": "http://localhost:8776/v2/"
"5f4358e168b747a487fe34e64c5619b2/"
"volumes/"
"69baebf2-c242-47f4-b0a3-ab1761cfe755",
"rel": "self"
},
{
"href": "http://localhost:8776/"
"5f4358e168b747a487fe34e64c5619b2/"
"volumes/"
"69baebf2-c242-47f4-b0a3-ab1761cfe755",
"rel": "bookmark"
}
],
"name": "volume1"
}
]
}

View File

@@ -0,0 +1,160 @@
# Copyright 2016 Massachusetts Open Cloud
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from testtools import testcase
from oslo_messaging.notify import dispatcher as notify_dispatcher
import mock
from mixmatch.model import ResourceMapping
from mixmatch.listener import get_endpoints_for_sp
class TestListener(testcase.TestCase):
@mock.patch('mixmatch.listener.insert')
def test_create_volume(self, insert):
endpoints = get_endpoints_for_sp('default')
dispatcher = notify_dispatcher.NotificationDispatcher(
endpoints, serializer=None)
MESSAGE = {
'payload': {
'volume_id': "1232123212321",
'tenant_id': "abdbabdbabdba"
},
'priority': 'info',
'publisher_id': 'volume.node4',
'event_type': 'volume.create.start',
'timestamp': '2014-03-03 18:21:04.369234',
'message_id': '99863dda-97f0-443a-a0c1-6ed317b7fd45'
}
incoming = mock.Mock(ctxt={}, message=MESSAGE)
dispatcher.dispatch(incoming)
insert.assert_called_with(
ResourceMapping('volumes',
'1232123212321',
'abdbabdbabdba',
'default'))
@mock.patch('mixmatch.listener.ResourceMapping.find', return_value=35)
@mock.patch('mixmatch.listener.delete')
def test_delete_volume(self, delete, find):
endpoints = get_endpoints_for_sp('default')
dispatcher = notify_dispatcher.NotificationDispatcher(
endpoints, serializer=None)
MESSAGE = {
'payload': {
'volume_id': "1232123212321",
'tenant_id': "abdbabdbabdba"
},
'priority': 'info',
'publisher_id': 'volume.node4',
'event_type': 'volume.delete.end',
'timestamp': '2014-03-03 18:21:04.369234',
'message_id': '99863dda-97f0-443a-a0c1-6ed317b7fd45'
}
incoming = mock.Mock(ctxt={}, message=MESSAGE)
dispatcher.dispatch(incoming)
find.assert_called_with('volumes', '1232123212321')
delete.assert_called_with(35)
@mock.patch('mixmatch.listener.insert')
def test_create_snapshot(self, insert):
endpoints = get_endpoints_for_sp('default')
dispatcher = notify_dispatcher.NotificationDispatcher(
endpoints, serializer=None)
MESSAGE = {
'payload': {
'snapshot_id': "1232123212321",
'tenant_id': "abdbabdbabdba"
},
'priority': 'info',
'publisher_id': 'snapshot.node4',
'event_type': 'snapshot.create.start',
'timestamp': '2014-03-03 18:21:04.369234',
'message_id': '99863dda-97f0-443a-a0c1-6ed317b7fd45'
}
incoming = mock.Mock(ctxt={}, message=MESSAGE)
dispatcher.dispatch(incoming)
insert.assert_called_with(
ResourceMapping('snapshots',
'1232123212321',
'abdbabdbabdba',
'default'))
@mock.patch('mixmatch.listener.ResourceMapping.find', return_value=35)
@mock.patch('mixmatch.listener.delete')
def test_delete_snapshot(self, delete, find):
endpoints = get_endpoints_for_sp('default')
dispatcher = notify_dispatcher.NotificationDispatcher(
endpoints, serializer=None)
MESSAGE = {
'payload': {
'snapshot_id': "1232123212321",
'tenant_id': "abdbabdbabdba"
},
'priority': 'info',
'publisher_id': 'snapshot.node4',
'event_type': 'snapshot.delete.end',
'timestamp': '2014-03-03 18:21:04.369234',
'message_id': '99863dda-97f0-443a-a0c1-6ed317b7fd45'
}
incoming = mock.Mock(ctxt={}, message=MESSAGE)
dispatcher.dispatch(incoming)
find.assert_called_with('snapshots', '1232123212321')
delete.assert_called_with(35)
@mock.patch('mixmatch.listener.insert')
def test_create_image(self, insert):
endpoints = get_endpoints_for_sp('default')
dispatcher = notify_dispatcher.NotificationDispatcher(
endpoints, serializer=None)
MESSAGE = {
'payload': {
'id': "1232123212321",
'owner': "abdbabdbabdba"
},
'priority': 'info',
'publisher_id': 'image.node4',
'event_type': 'image.create',
'timestamp': '2014-03-03 18:21:04.369234',
'message_id': '99863dda-97f0-443a-a0c1-6ed317b7fd45'
}
incoming = mock.Mock(ctxt={}, message=MESSAGE)
dispatcher.dispatch(incoming)
insert.assert_called_with(
ResourceMapping('images',
'1232123212321',
'abdbabdbabdba',
'default'))
@mock.patch('mixmatch.listener.ResourceMapping.find', return_value=35)
@mock.patch('mixmatch.listener.delete')
def test_delete_image(self, delete, find):
endpoints = get_endpoints_for_sp('default')
dispatcher = notify_dispatcher.NotificationDispatcher(
endpoints, serializer=None)
MESSAGE = {
'payload': {
'id': "1232123212321",
'owner': "abdbabdbabdba"
},
'priority': 'info',
'publisher_id': 'image.node4',
'event_type': 'image.delete',
'timestamp': '2014-03-03 18:21:04.369234',
'message_id': '99863dda-97f0-443a-a0c1-6ed317b7fd45'
}
incoming = mock.Mock(ctxt={}, message=MESSAGE)
dispatcher.dispatch(incoming)
find.assert_called_with('images', '1232123212321')
delete.assert_called_with(35)

View File

@@ -0,0 +1,333 @@
# Copyright 2016 Massachusetts Open Cloud
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import six
from testtools import testcase
from requests_mock.contrib import fixture as requests_fixture
from oslo_config import fixture as config_fixture
import oslo_db
import fixtures
import json
from mixmatch.config import CONF, more_config
from mixmatch.proxy import app
from mixmatch.model import BASE, enginefacade, insert, ResourceMapping
class FakeSession(object):
"""A replacement for keystoneauth1.session.Session."""
def __init__(self, token, project):
self.token = token
self.project = project
def get_token(self):
return self.token
def get_project_id(self):
return self.project
class SessionFixture(fixtures.Fixture):
"""A fixture that mocks get_{sp,local}_session."""
def _setUp(self):
def get_local_auth(token):
return FakeSession(token, self.local_auths[token])
def get_sp_auth(sp, token, project):
return FakeSession(self.sp_auths[(sp, token, project)], project)
def get_projects_at_sp(sp, token):
if sp in self.sp_projects:
return self.sp_projects[sp]
else:
return []
self.local_auths = {}
self.sp_auths = {}
self.sp_projects = {}
self.useFixture(fixtures.MonkeyPatch(
'mixmatch.auth.get_sp_auth', get_sp_auth))
self.useFixture(fixtures.MonkeyPatch(
'mixmatch.auth.get_local_auth', get_local_auth))
self.useFixture(fixtures.MonkeyPatch(
'mixmatch.auth.get_projects_at_sp', get_projects_at_sp))
def add_local_auth(self, token, project):
self.local_auths[token] = project
def add_sp_auth(self, sp, token, project, remote_token):
self.sp_auths[(sp, token, project)] = remote_token
def add_project_at_sp(self, sp, project):
if sp in self.sp_projects:
self.sp_projects[sp].append(project)
else:
self.sp_projects[sp] = [project]
class DatabaseFixture(fixtures.Fixture):
"""A fixture that performs each test in a new, blank database."""
def __init__(self, conf):
super(DatabaseFixture, self).__init__()
oslo_db.options.set_defaults(conf, connection="sqlite://")
def _setUp(self):
context = enginefacade.transaction_context()
with enginefacade.writer.using(context) as session:
self.engine = session.get_bind()
BASE.metadata.create_all(bind=self.engine)
self.addCleanup(BASE.metadata.drop_all, bind=self.engine)
def recreate(self):
BASE.metadata.create_all(bind=self.engine)
class TestMock(testcase.TestCase):
def setUp(self):
super(TestMock, self).setUp()
self.requests_fixture = self.useFixture(requests_fixture.Fixture())
self.config_fixture = self.useFixture(config_fixture.Config(conf=CONF))
self.session_fixture = self.useFixture(SessionFixture())
self.db_fixture = self.useFixture(DatabaseFixture(conf=CONF))
self.app = app.test_client()
# set config values
self.config_fixture.load_raw_values(
group='proxy',
service_providers='default, remote1',
aggregation=True)
self.config_fixture.load_raw_values(
group='sp_default',
image_endpoint='http://images.local',
volume_endpoint='http://volumes.local')
self.config_fixture.load_raw_values(
group='sp_remote1',
image_endpoint='http://images.remote1',
volume_endpoint='http://volumes.remote1')
more_config()
def test_get_image(self):
self.session_fixture.add_local_auth('wewef', 'my_project_id')
insert(ResourceMapping("images", "6c4ae06e14bd422e97afe07223c99e18",
"not-to-be-read", "default"))
EXPECTED = 'WEOFIHJREINJEFDOWEIJFWIENFERINWFKEWF'
self.requests_fixture.get(
'http://images.local/v2/images/'
'6c4ae06e-14bd-422e-97af-e07223c99e18',
request_headers={'X-AUTH-TOKEN': 'wewef'},
text=six.u(EXPECTED),
headers={'CONTENT-TYPE': 'application/json'})
response = self.app.get(
'/image/v2/images/6c4ae06e-14bd-422e-97af-e07223c99e18',
headers={'X-AUTH-TOKEN': 'wewef',
'CONTENT-TYPE': 'application/json'})
self.assertEqual(response.data, six.b(EXPECTED))
def test_get_image_remote(self):
REMOTE_PROJECT_ID = "319d8162b38342609f5fafe1404216b9"
LOCAL_TOKEN = "my-local-token"
REMOTE_TOKEN = "my-remote-token"
self.session_fixture.add_sp_auth('remote1', LOCAL_TOKEN,
REMOTE_PROJECT_ID, REMOTE_TOKEN)
insert(ResourceMapping("images", "6c4ae06e14bd422e97afe07223c99e18",
REMOTE_PROJECT_ID, "remote1"))
EXPECTED = 'WEOFIHJREINJEFDOWEIJFWIENFERINWFKEWF'
self.requests_fixture.get(
'http://images.remote1/v2/images/'
'6c4ae06e-14bd-422e-97af-e07223c99e18',
text=six.u(EXPECTED),
request_headers={'X-AUTH-TOKEN': REMOTE_TOKEN},
headers={'CONTENT-TYPE': 'application/json'})
response = self.app.get(
'/image/v2/images/6c4ae06e-14bd-422e-97af-e07223c99e18',
headers={'X-AUTH-TOKEN': LOCAL_TOKEN,
'CONTENT-TYPE': 'application/json'})
self.assertEqual(response.data, six.b(EXPECTED))
def test_get_image_default_to_local(self):
self.session_fixture.add_local_auth('wewef', 'my_project_id')
self.requests_fixture.get(
'http://images.local/v2/images/'
'6c4ae06e-14bd-422e-97af-e07223c99e18',
text="nope.",
status_code=400,
request_headers={'X-AUTH-TOKEN': 'wewef'},
headers={'CONTENT-TYPE': 'application/json'})
response = self.app.get(
'/image/v2/images/6c4ae06e-14bd-422e-97af-e07223c99e18',
headers={'X-AUTH-TOKEN': 'wewef',
'CONTENT-TYPE': 'application/json'})
self.assertEqual(response.status_code, 400)
def test_get_image_search_local(self):
self.config_fixture.load_raw_values(group='proxy',
search_by_broadcast=True)
self.session_fixture.add_local_auth('wewef', 'my_project_id')
IMAGE = 'Here is my image.'
self.requests_fixture.get(
'http://images.local/v2/images/'
'6c4ae06e-14bd-422e-97af-e07223c99e18',
text=six.u(IMAGE),
status_code=200,
request_headers={'X-AUTH-TOKEN': 'wewef'},
headers={'CONTENT-TYPE': 'application/json'})
# Don't add a response for the remote SP, to ensure that our code
# always checks locally first.
response = self.app.get(
'/image/v2/images/6c4ae06e-14bd-422e-97af-e07223c99e18',
headers={'X-AUTH-TOKEN': 'wewef',
'CONTENT-TYPE': 'application/json'})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data, six.b(IMAGE))
def test_get_image_search_remote(self):
REMOTE_PROJECT_ID = "319d8162b38342609f5fafe1404216b9"
self.config_fixture.load_raw_values(group='proxy',
search_by_broadcast=True)
self.session_fixture.add_local_auth('local-tok', 'my_project_id')
self.session_fixture.add_sp_auth('remote1', 'local-tok',
REMOTE_PROJECT_ID, 'remote-tok')
self.session_fixture.add_project_at_sp('remote1', REMOTE_PROJECT_ID)
IMAGE = 'Here is my image.'
self.requests_fixture.get(
'http://images.local/v2/images/'
'6c4ae06e-14bd-422e-97af-e07223c99e18',
text="nope.",
status_code=400,
request_headers={'X-AUTH-TOKEN': 'local-tok'},
headers={'CONTENT-TYPE': 'application/json'})
self.requests_fixture.get(
'http://images.remote1/v2/images/'
'6c4ae06e-14bd-422e-97af-e07223c99e18',
text=six.u(IMAGE),
status_code=200,
request_headers={'X-AUTH-TOKEN': 'remote-tok'},
headers={'CONTENT-TYPE': 'application/json'})
response = self.app.get(
'/image/v2/images/6c4ae06e-14bd-422e-97af-e07223c99e18',
headers={'X-AUTH-TOKEN': 'local-tok',
'CONTENT-TYPE': 'application/json'})
self.assertEqual(response.status_code, 200)
self.assertEqual(response.data, six.b(IMAGE))
def test_get_image_search_nexists(self):
REMOTE_PROJECT_ID = "319d8162b38342609f5fafe1404216b9"
self.config_fixture.load_raw_values(group='proxy',
search_by_broadcast=True)
self.session_fixture.add_local_auth('local-tok', 'my_project_id')
self.session_fixture.add_sp_auth('remote1', 'local-tok',
REMOTE_PROJECT_ID, 'remote-tok')
self.session_fixture.add_project_at_sp('remote1', REMOTE_PROJECT_ID)
self.requests_fixture.get(
'http://images.local/v2/images/'
'6c4ae06e-14bd-422e-97af-e07223c99e18',
text="nope.",
status_code=400,
request_headers={'X-AUTH-TOKEN': 'local-tok'},
headers={'CONTENT-TYPE': 'application/json'})
self.requests_fixture.get(
'http://images.remote1/v2/images/'
'6c4ae06e-14bd-422e-97af-e07223c99e18',
text="also nope.",
status_code=403,
request_headers={'X-AUTH-TOKEN': 'remote-tok'},
headers={'CONTENT-TYPE': 'application/json'})
response = self.app.get(
'/image/v2/images/6c4ae06e-14bd-422e-97af-e07223c99e18',
headers={'X-AUTH-TOKEN': 'local-tok',
'CONTENT-TYPE': 'application/json'})
self.assertEqual(response.status_code, 404)

    def test_list_images(self):
REMOTE_PROJECT_ID = "319d8162b38342609f5fafe1404216b9"
self.session_fixture.add_local_auth('local-tok', 'my_project_id')
self.session_fixture.add_sp_auth('remote1', 'local-tok',
REMOTE_PROJECT_ID, 'remote-tok')
self.session_fixture.add_project_at_sp('remote1', REMOTE_PROJECT_ID)
LOCAL_IMAGES = json.dumps({
"images": [
{"id": "1bea47ed-f6a9-463b-b423-14b9cca9ad27",
"size": 4096},
{"id": "781b3762-9469-4cec-b58d-3349e5de4e9c",
"size": 476704768}
],
})
REMOTE1_IMAGES = json.dumps({
"images": [
{"id": "4af2929a-3c1f-4ccf-bf91-724444719c78",
"size": 13167626}
],
})
EXPECTED = {
"images": [
{"id": "1bea47ed-f6a9-463b-b423-14b9cca9ad27",
"size": 4096},
{"id": "4af2929a-3c1f-4ccf-bf91-724444719c78",
"size": 13167626},
{"id": "781b3762-9469-4cec-b58d-3349e5de4e9c",
"size": 476704768}
],
}
self.requests_fixture.get(
'http://images.local/v2/images',
text=LOCAL_IMAGES,
status_code=200,
request_headers={'X-AUTH-TOKEN': 'local-tok'},
headers={'CONTENT-TYPE': 'application/json'})
self.requests_fixture.get(
'http://images.remote1/v2/images',
text=REMOTE1_IMAGES,
status_code=200,
request_headers={'X-AUTH-TOKEN': 'remote-tok'},
headers={'CONTENT-TYPE': 'application/json'})
response = self.app.get(
'/image/v2/images',
headers={'X-AUTH-TOKEN': 'local-tok',
'CONTENT-TYPE': 'application/json'})
actual = json.loads(response.data.decode("ascii"))
actual['images'].sort(key=(lambda x: x[u'id']))
self.assertEqual(actual, EXPECTED)

    def test_unversioned_calls_no_action(self):
response = self.app.get(
'/image',
headers={'X-AUTH-TOKEN': 'local-tok',
'CONTENT-TYPE': 'application/json'})
self.assertEqual(response.status_code, 200)
actual = json.loads(response.data.decode("ascii"))
self.assertEqual(len(actual['versions']), 6)

    def test_versioned_calls_no_action(self):
response = self.app.get(
'/image/v2',
headers={'X-AUTH-TOKEN': 'local-tok',
'CONTENT-TYPE': 'application/json'})
self.assertEqual(response.status_code, 400)
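The two broadcast tests above pin down a simple fallback: with `search_by_broadcast` enabled, the proxy queries the local endpoint first, then each federated service provider, serving the first success and returning 404 when every endpoint refuses. A minimal sketch of that control flow (the function and fetcher names are hypothetical, not mixmatch's API):

```python
# Minimal sketch of broadcast search: try each endpoint in turn, return
# the first 2xx response, and fall back to 404 when none succeeds.
# 'fetchers' stands in for per-service-provider HTTP calls.
def broadcast_search(fetchers):
    for fetch in fetchers:
        status, body = fetch()
        if 200 <= status < 300:
            return status, body
    return 404, None


# Mirrors test_get_image_search_remote: local says 400, remote1 serves it.
assert broadcast_search([
    lambda: (400, 'nope.'),
    lambda: (200, 'Here is my image.'),
]) == (200, 'Here is my image.')

# Mirrors test_get_image_search_nexists: every endpoint refuses.
assert broadcast_search([
    lambda: (400, 'nope.'),
    lambda: (403, 'also nope.'),
]) == (404, None)
```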

@@ -0,0 +1,39 @@
# Copyright 2016 Massachusetts Open Cloud
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import uuid

from testtools import testcase

from mixmatch import proxy


class TestRequestHandler(testcase.TestCase):
    def setUp(self):
        super(TestRequestHandler, self).setUp()

    def test_prepare_headers(self):
        user_headers = {
            'x-auth-token': uuid.uuid4(),
        }
        headers = proxy.RequestHandler._prepare_headers(user_headers)
        self.assertEqual({'Accept': '', 'Content-Type': ''}, headers)

    def test_prepare_args(self):
        user_args = {
            'limit': 1,
            'marker': uuid.uuid4()
        }
        args = proxy.RequestHandler._prepare_args(user_args)
        self.assertEqual({}, args)
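The assertions above imply that the proxy whitelists what it forwards: the client's `x-auth-token` is dropped and only `Accept`/`Content-Type` survive (defaulting to `''`), while query args like `limit` and `marker` are stripped. A hedged sketch of such filtering — the allow-list and helper name are illustrative, not mixmatch's actual implementation:

```python
# Hypothetical header whitelist mirroring what the tests assert:
# everything the client sends is dropped except a fixed allow-list,
# with missing entries defaulting to ''.
ALLOWED_HEADERS = ('Accept', 'Content-Type')


def prepare_headers(user_headers):
    # Lower-case keys so 'X-Auth-Token'/'x-auth-token' are both dropped
    # rather than forwarded to a remote service provider.
    lowered = {k.lower(): v for k, v in user_headers.items()}
    return {h: lowered.get(h.lower(), '') for h in ALLOWED_HEADERS}


assert prepare_headers({'x-auth-token': 'secret'}) == {
    'Accept': '', 'Content-Type': ''}
```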

@@ -0,0 +1,280 @@
# Copyright 2016 Massachusetts Open Cloud
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import json
from six.moves.urllib import parse
from oslo_config import fixture as config_fixture
from mixmatch.config import CONF
from testtools import testcase
from mixmatch import services
from mixmatch.tests.unit import samples


class Response:
    def __init__(self, text):
        self.text = text


# Source: http://stackoverflow.com/a/9468284
class Url(object):
    """A url object that can be compared with other url objects
    without regard to the vagaries of encoding, escaping, and ordering
    of parameters in query strings."""
def __init__(self, url):
parts = parse.urlparse(url)
_query = frozenset(parse.parse_qsl(parts.query))
_path = parse.unquote_plus(parts.path)
parts = parts._replace(query=_query, path=_path)
self.parts = parts

    def __eq__(self, other):
        return self.parts == other.parts

    def __hash__(self):
        return hash(self.parts)
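The `Url` helper above makes URL comparison insensitive to query-parameter ordering and percent-encoding. A standalone demonstration — the class is repeated here (using the stdlib `urllib.parse` directly) purely so the snippet runs on its own:

```python
from urllib import parse


class Url(object):
    """Mirror of the test helper above, repeated to keep this runnable."""
    def __init__(self, url):
        parts = parse.urlparse(url)
        _query = frozenset(parse.parse_qsl(parts.query))
        _path = parse.unquote_plus(parts.path)
        parts = parts._replace(query=_query, path=_path)
        self.parts = parts

    def __eq__(self, other):
        return self.parts == other.parts

    def __hash__(self):
        return hash(self.parts)


# Query-string ordering and percent-encoding do not affect equality:
assert Url('http://h/p?a=1&b=2') == Url('http://h/p?b=2&a=1')
assert Url('http://h/a%20b') == Url('http://h/a b')
# Differing parameter values still compare unequal:
assert Url('http://h/p?a=1') != Url('http://h/p?a=2')
```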


VOLUMES = {'default': Response(json.dumps(samples.VOLUME_LIST_V2)),
'sp1': Response(json.dumps(samples.VOLUME_LIST_V2))}
IMAGES = {'default': Response(json.dumps(samples.IMAGE_LIST_V2)),
'sp1': Response(json.dumps(samples.IMAGE_LIST_V2_2))}
SMALLEST_IMAGE = '941882c5-b992-4fa9-bcba-9d25d2f4e3b8'
EARLIEST_IMAGE = '781b3762-9469-4cec-b58d-3349e5de4e9c'
SECOND_EARLIEST_IMAGE = '1bea47ed-f6a9-463b-b423-14b9cca9ad27'
LATEST_IMAGE = '61f655c0-4511-4307-a257-4162c87a5130'
IMAGE_PATH = 'http://localhost/image/images'
IMAGES_IN_SAMPLE = 5
VOLUMES_IN_SAMPLE = 2
API_VERSIONS = 'v3.2, v2.0, v1'
NUM_OF_VERSIONS = 3
IMAGE_UNVERSIONED = 'http://localhost/image'
IMAGE_VERSIONED = 'http://localhost/image/v3/'


class TestServices(testcase.TestCase):
    def setUp(self):
        super(TestServices, self).setUp()
        self.config_fixture = self.useFixture(config_fixture.Config(conf=CONF))

    def test_aggregate_key(self):
# Aggregate 'images'
response = json.loads(services.aggregate(IMAGES, 'images'))
self.assertEqual(IMAGES_IN_SAMPLE, len(response['images']))
# Aggregate 'volumes'
response = json.loads(services.aggregate(VOLUMES, 'volumes'))
self.assertEqual(VOLUMES_IN_SAMPLE, len(response['volumes']))

    def test_aggregate_limit(self):
params = {
'limit': 1
}
response = json.loads(services.aggregate(IMAGES, 'images',
params, IMAGE_PATH))
self.assertEqual(1, len(response['images']))

    def test_aggregate_sort_images_ascending(self):
"""Sort images by smallest size, ascending."""
params = {
'sort': 'size:asc'
}
response = json.loads(services.aggregate(IMAGES, 'images',
params, IMAGE_PATH))
self.assertEqual(response['images'][0]['id'], SMALLEST_IMAGE)

    def test_aggregate_sort_images_limit(self):
"""Sort images by smallest size, ascending, limit to 1, alt format."""
params = {
'sort_key': 'size',
'sort_dir': 'asc',
'limit': 1
}
response = json.loads(services.aggregate(IMAGES, 'images',
params, IMAGE_PATH))
# Ensure the smallest is first and there is only 1 entry.
self.assertEqual(response['images'][0]['id'], SMALLEST_IMAGE)
self.assertEqual(1, len(response['images']))
# Ensure the 'next' url is correct.
self.assertEqual(
Url(response['next']),
Url(self._prepare_url(
IMAGE_PATH,
self._prepare_params(params, marker=SMALLEST_IMAGE)
))
)

    def test_sort_images_date_limit_ascending(self):
"""Sort images by last update, ascending, limit to 2."""
params = {
'sort': 'updated_at:asc',
'limit': 2
}
response = json.loads(services.aggregate(IMAGES, 'images',
params, IMAGE_PATH))
# Check the first and second are the correct ids.
self.assertEqual(response['images'][0]['id'], EARLIEST_IMAGE)
self.assertEqual(response['images'][1]['id'], SECOND_EARLIEST_IMAGE)
self.assertEqual(2, len(response['images']))
# Check the next link
self.assertEqual(
Url(response['next']),
Url(self._prepare_url(
IMAGE_PATH,
self._prepare_params(params, marker=SECOND_EARLIEST_IMAGE)
))
)

    def test_sort_images_date_limit_descending(self):
"""Sort images by last update, descending, limit 1."""
params = {
'sort': 'updated_at:desc',
'limit': 1
}
response = json.loads(services.aggregate(IMAGES, 'images',
params, IMAGE_PATH))
# Check the id and size
self.assertEqual(response['images'][0]['id'], LATEST_IMAGE)
self.assertEqual(1, len(response['images']))
# Check the next link
self.assertEqual(
Url(response['next']),
Url(self._prepare_url(
IMAGE_PATH,
self._prepare_params(params, marker=LATEST_IMAGE)
))
)

    def test_sort_images_date_ascending_pagination(self):
"""Sort images by last update, ascending, skip the first one."""
params = {
'sort': 'updated_at:asc',
'limit': 1,
'marker': EARLIEST_IMAGE
}
response = json.loads(services.aggregate(IMAGES, 'images',
params, IMAGE_PATH))
# Ensure we skipped the first one
self.assertEqual(response['images'][0]['id'], SECOND_EARLIEST_IMAGE)
self.assertEqual(1, len(response['images']))
# Next link
self.assertEqual(
Url(response['next']),
Url(self._prepare_url(
IMAGE_PATH,
self._prepare_params(params, marker=SECOND_EARLIEST_IMAGE)
))
)
# Start link
self.assertEqual(
Url(response['start']),
Url(self._prepare_url(
IMAGE_PATH,
self._prepare_params(params)
))
)

    def test_marker_without_limit(self):
"""Test marker without limit."""
params = {
'sort': 'updated_at:asc',
'marker': EARLIEST_IMAGE
}
response = json.loads(services.aggregate(IMAGES, 'images',
params, IMAGE_PATH))
# Ensure we skipped the first one
self.assertEqual(response['images'][0]['id'], SECOND_EARLIEST_IMAGE)
self.assertEqual(IMAGES_IN_SAMPLE - 1, len(response['images']))
# Start link
self.assertEqual(
Url(response['start']),
Url(self._prepare_url(
IMAGE_PATH,
self._prepare_params(params)
))
)

    def test_marker_last(self):
"""Test marker without limit, nothing to return."""
params = {
'sort': 'updated_at:asc',
'marker': LATEST_IMAGE
}
response = json.loads(services.aggregate(IMAGES, 'images',
params, IMAGE_PATH))
# Ensure we skipped the first one
self.assertEqual(0, len(response['images']))
# Start link
self.assertEqual(
Url(response['start']),
Url(self._prepare_url(
IMAGE_PATH,
self._prepare_params(params)
))
)

    def test_list_api_versions(self):
self.config_fixture.load_raw_values(group='proxy',
image_api_versions=API_VERSIONS,
volume_api_versions=API_VERSIONS)
# List image api
response = json.loads(services.list_api_versions('image',
IMAGE_UNVERSIONED))
current_version = response['versions'][0]['id']
current_version_status = response['versions'][0]['status']
current_version_url = response['versions'][0]['links'][0]['href']
self.assertEqual(NUM_OF_VERSIONS, len(response['versions']))
self.assertEqual(current_version, 'v3.2')
self.assertEqual(current_version_status, 'CURRENT')
self.assertEqual(
Url(current_version_url),
Url(IMAGE_VERSIONED))

    @staticmethod
def _prepare_params(user_params, marker=None):
params = user_params.copy()
if marker:
params['marker'] = marker
else:
params.pop('marker', None)
return params

    @staticmethod
def _prepare_url(url, params):
return '%s?%s' % (url, parse.urlencode(params))
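The marker/limit assertions above all follow one pattern: merge the per-service-provider lists, sort, resume after the `marker` id, cap at `limit`, and build a `next` link whose `marker` is the id of the last item returned. A hedged sketch of those semantics (names are illustrative; mixmatch's `services.aggregate` also handles the JSON envelopes and link URLs):

```python
# Sketch of merge + sort + marker/limit pagination as exercised by the
# tests above. Not mixmatch's implementation, just the observable rules.
def aggregate_sketch(responses, sort_key='size', ascending=True,
                     limit=None, marker=None):
    # Merge the result lists from every endpoint into one list.
    items = [item for resp in responses.values() for item in resp]
    items.sort(key=lambda x: x[sort_key], reverse=not ascending)
    if marker is not None:
        ids = [x['id'] for x in items]
        items = items[ids.index(marker) + 1:]  # resume after the marker id
    if limit is not None:
        items = items[:limit]
    return items


responses = {
    'default': [{'id': 'a', 'size': 40}, {'id': 'b', 'size': 10}],
    'sp1': [{'id': 'c', 'size': 25}],
}
# Smallest first, limited to 2 -> ids 'b' then 'c'.
page = aggregate_sketch(responses, limit=2)
assert [x['id'] for x in page] == ['b', 'c']
# A 'next' link would carry the caller's params plus marker=<last id>.
next_params = {'limit': 2, 'marker': page[-1]['id']}
assert next_params['marker'] == 'c'
```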

mixmatch/wsgi.py
@@ -0,0 +1,19 @@
# Copyright 2016 Massachusetts Open Cloud
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from mixmatch import proxy
from mixmatch import session

proxy.main()

application = session.app

releasenotes/source/conf.py
@@ -0,0 +1,275 @@
# -*- coding: utf-8 -*-
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Glance Release Notes documentation build configuration file, created by
# sphinx-quickstart on Tue Nov 3 17:40:50 2015.
#
# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
# sys.path.insert(0, os.path.abspath('.'))
# -- General configuration ------------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
# needs_sphinx = '1.0'
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
'oslosphinx',
'reno.sphinxext',
]
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# The suffix of source filenames.
source_suffix = '.rst'
# The encoding of source files.
# source_encoding = 'utf-8-sig'
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'mixmatch Release Notes'
copyright = u'2016, OpenStack Foundation'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
# The full version, including alpha/beta/rc tags.
release = ''
# The short X.Y version.
version = ''
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
# language = None
# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
# today = ''
# Else, today_fmt is used as the format for a strftime call.
# today_fmt = '%B %d, %Y'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = []
# The reST default role (used for this markup: `text`) to use for all
# documents.
# default_role = None
# If true, '()' will be appended to :func: etc. cross-reference text.
# add_function_parentheses = True
# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
# add_module_names = True
# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
# show_authors = False
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# A list of ignored prefixes for module index sorting.
# modindex_common_prefix = []
# If true, keep warnings as "system message" paragraphs in the built documents.
# keep_warnings = False
# -- Options for HTML output ----------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = 'default'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
# html_theme_options = {}
# Add any paths that contain custom themes here, relative to this directory.
# html_theme_path = []
# The name for this set of Sphinx documents. If None, it defaults to
# "<project> v<release> documentation".
# html_title = None
# A shorter title for the navigation bar. Default is the same as html_title.
# html_short_title = None
# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
# html_logo = None
# The name of an image file (within the static path) to use as favicon of the
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
# html_favicon = None
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
# directly to the root of the documentation.
# html_extra_path = []
# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
# using the given strftime format.
# html_last_updated_fmt = '%b %d, %Y'
# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
# html_use_smartypants = True
# Custom sidebar templates, maps document names to template names.
# html_sidebars = {}
# Additional templates that should be rendered to pages, maps page names to
# template names.
# html_additional_pages = {}
# If false, no module index is generated.
# html_domain_indices = True
# If false, no index is generated.
# html_use_index = True
# If true, the index is split into individual pages for each letter.
# html_split_index = False
# If true, links to the reST sources are added to the pages.
# html_show_sourcelink = True
# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
# html_show_sphinx = True
# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
# html_show_copyright = True
# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it. The value of this option must be the
# base URL from which the finished HTML is served.
# html_use_opensearch = ''
# This is the file name suffix for HTML files (e.g. ".xhtml").
# html_file_suffix = None
# Output file base name for HTML help builder.
htmlhelp_basename = 'GlanceReleaseNotesdoc'
# -- Options for LaTeX output ---------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
# 'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
# 'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
# 'preamble': '',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
('index', 'GlanceReleaseNotes.tex', u'Glance Release Notes Documentation',
u'Glance Developers', 'manual'),
]
# The name of an image file (relative to this directory) to place at the top of
# the title page.
# latex_logo = None
# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
# latex_use_parts = False
# If true, show page references after internal links.
# latex_show_pagerefs = False
# If true, show URL addresses after external links.
# latex_show_urls = False
# Documents to append as an appendix to all manuals.
# latex_appendices = []
# If false, no module index is generated.
# latex_domain_indices = True
# -- Options for manual page output ---------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
('index', 'glancereleasenotes', u'Glance Release Notes Documentation',
[u'Glance Developers'], 1)
]
# If true, show URL addresses after external links.
# man_show_urls = False
# -- Options for Texinfo output -------------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
('index', 'GlanceReleaseNotes', u'Glance Release Notes Documentation',
u'Glance Developers', 'GlanceReleaseNotes',
'One line description of project.',
'Miscellaneous'),
]
# Documents to append as an appendix to all manuals.
# texinfo_appendices = []
# If false, no module index is generated.
# texinfo_domain_indices = True
# How to display URL addresses: 'footnote', 'no', or 'inline'.
# texinfo_show_urls = 'footnote'
# If true, do not generate a @detailmenu in the "Top" node's menu.
# texinfo_no_detailmenu = False
# -- Options for Internationalization output ------------------------------
locale_dirs = ['locale/']

@@ -0,0 +1,8 @@
============================================
mixmatch Release Notes
============================================
.. toctree::
   :maxdepth: 1

   unreleased

@@ -0,0 +1,5 @@
==============================
Current Series Release Notes
==============================
.. release-notes::

requirements.txt
@@ -0,0 +1,19 @@
# The order of packages is significant, because pip processes them in the order
# of appearance. Changing the order has an impact on the overall integration
# process, which may cause wedges in the gate later.
pbr>=1.6 # Apache-2.0
eventlet
flask
sqlalchemy
keystoneauth1
oslo.cache
oslo.config
oslo.messaging
oslo.log
oslo.db
python-keystoneclient
requests
six
uwsgi

run_proxy.sh
@@ -0,0 +1,9 @@
#!/usr/bin/env bash
uwsgi --socket 0.0.0.0:5001 \
--protocol=http \
--http-chunked-input \
-w mixmatch.wsgi \
--master \
--processes 4 \
--threads 2

setup.cfg
@@ -0,0 +1,55 @@
[metadata]
name = mixmatch
summary = Combine resources across federated OpenStack deployments
description-file =
README.rst
author = OpenStack
author-email = openstack-dev@lists.openstack.org
home-page = http://www.openstack.org/
classifier =
Development Status :: 4 - Beta
Environment :: OpenStack
Intended Audience :: Information Technology
Intended Audience :: System Administrators
License :: OSI Approved :: Apache Software License
Operating System :: POSIX :: Linux
Programming Language :: Python
Programming Language :: Python :: 2
Programming Language :: Python :: 2.7
Programming Language :: Python :: 3
Programming Language :: Python :: 3.3
Programming Language :: Python :: 3.4
Programming Language :: Python :: 3.5
[files]
packages =
mixmatch
data_files =
etc/ = etc/*
[build_sphinx]
source-dir = doc/source
build-dir = doc/build
all_files = 1
[upload_sphinx]
upload-dir = doc/build/html
[compile_catalog]
directory = mixmatch/locale
domain = mixmatch
[update_catalog]
domain = mixmatch
output_dir = mixmatch/locale
input_file = mixmatch/locale/mixmatch.pot
[extract_messages]
keywords = _ gettext ngettext l_ lazy_gettext
mapping_file = babel.cfg
output_file = mixmatch/locale/mixmatch.pot
[build_releasenotes]
all_files = 1
build-dir = releasenotes/build
source-dir = releasenotes/source

setup.py
@@ -0,0 +1,29 @@
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# THIS FILE IS MANAGED BY THE GLOBAL REQUIREMENTS REPO - DO NOT EDIT
import setuptools
# In python < 2.7.4, a lazy loading of package `pbr` will break
# setuptools if some other modules registered functions in `atexit`.
# solution from: http://bugs.python.org/issue15881#msg170215
try:
import multiprocessing # noqa
except ImportError:
pass
setuptools.setup(
setup_requires=['pbr'],
pbr=True)

test-requirements.txt
@@ -0,0 +1,23 @@
# The order of packages is significant, because pip processes them in the order
# of appearance. Changing the order has an impact on the overall integration
# process, which may cause wedges in the gate later.
os-testr
flake8
mock
requests_mock
hacking>=0.11.0,<0.12 # Apache-2.0
coverage>=3.6 # Apache-2.0
python-subunit>=0.0.18 # Apache-2.0/BSD
sphinx>=1.2.1,!=1.3b1,<1.4 # BSD
oslosphinx>=4.7.0 # Apache-2.0
oslotest>=1.10.0 # Apache-2.0
testrepository>=0.0.18 # Apache-2.0/BSD
testscenarios>=0.4 # Apache-2.0/BSD
testtools>=1.4.0 # MIT
# releasenotes
reno>=1.8.0 # Apache2

tox.ini
@@ -0,0 +1,40 @@
[tox]
minversion = 2.0
envlist = py35,py27,pypy,pep8
skipsdist = True
[testenv]
usedevelop = True
install_command = pip install -c{env:UPPER_CONSTRAINTS_FILE:https://git.openstack.org/cgit/openstack/requirements/plain/upper-constraints.txt} {opts} {packages}
setenv =
VIRTUAL_ENV={envdir}
PYTHONWARNINGS=default::DeprecationWarning
deps = -r{toxinidir}/test-requirements.txt
commands = python setup.py test --slowest --testr-args='{posargs}'
[testenv:pep8]
commands = flake8 {posargs}
[testenv:venv]
commands = {posargs}
[testenv:cover]
commands = python setup.py test --coverage --testr-args='{posargs}'
[testenv:docs]
commands = python setup.py build_sphinx
[testenv:releasenotes]
commands =
sphinx-build -a -E -W -d releasenotes/build/doctrees -b html releasenotes/source releasenotes/build/html
[testenv:debug]
commands = oslo_debug_helper {posargs}
[flake8]
# E123, E125 skipped as they are invalid PEP-8.
show-source = True
ignore = E123,E125
builtins = _
exclude=.venv,.git,.tox,dist,doc,*lib/python*,*egg,build