Cryptographically signed tokens
Uses CMS to create tokens that can be verified without network calls. Tokens encapsulate the authorization information (user name and roles) as a JSON document. The JSON document is cryptographically signed with a private key from Keystone, in accordance with the Cryptographic Message Syntax (CMS), in DER format and then Base64 encoded. The PEM header, footer, and line breaks are stripped to minimize the size, and slashes, which are not safe in URLs, are converted to hyphens.

Since signed tokens are not validated against the Keystone server, they remain valid until their expiration time. This means that even if a user has their roles revoked or their account disabled, those changes will not take effect until their token times out. The prototype for this is Kerberos, which has the same limitation and has functioned successfully with it for decades. The token timeout can be set much shorter than the default of 8 hours, but that may mean that users' tokens time out before long-running tasks complete.

This should be a drop-in replacement for the current token production code. Although the signed token is longer than the older format, the token is still a unique stream of alphanumeric characters. The auth_token middleware is capable of handling both UUID and signed tokens.

To start with, the PKI functionality is disabled, to avoid breaking existing deployments. It can be enabled with the config value:

[signing]
disable_pki = False

The 'id_hash' column is added to the SQL schema because SQLAlchemy insists on each table having a primary key, and primary keys are limited to roughly 250 characters (768 bytes, with multi-byte character sets using more than one byte per character), so the ID field can no longer be used as the primary key. id_hash is a hash of the id column and should be used for lookups, as it is indexed.

middleware/auth_token.py needs to stand alone in the other services, and uses keystone.common.cms in order to verify tokens.
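The middleware's dual handling reduces to a simple length check: `uuid.uuid4().hex` is always exactly 32 characters, so anything longer must be a CMS-signed token. A minimal sketch of that dispatch (the function name here is illustrative, not the middleware's actual API):

```python
import uuid

UUID_TOKEN_LENGTH = 32  # uuid.uuid4().hex is always 32 hex characters


def classify_token(token_id):
    """Return 'signed' for CMS tokens, 'uuid' for legacy tokens.

    Mirrors the length check the middleware uses: anything longer than
    a 32-character UUID is verified locally as a signed token; anything
    else is looked up against the Keystone server.
    """
    if len(token_id) > UUID_TOKEN_LENGTH:
        return 'signed'
    return 'uuid'
```

This is why the change can be a drop-in replacement: both token formats flow through the same code path and are told apart without any extra metadata.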
The token needs to have all of the data from the original authenticate code contained in the signed document, as the authenticate RPC will no longer be called in many cases. The datetime of expiry is signed into the token.

The certificates are accessible via web APIs. On the remote service side, the certificates needed to authenticate tokens are stored in /tmp/keystone-signing by default; remote systems use the Paste API to read configuration values. Certificates are retrieved only if they are not already on the local system.

When authenticating inside Keystone itself, the database is still checked for token presence. This allows Keystone to continue to enforce timeouts and disabled users.

The service catalog has been added to the signed token. Although this greatly increases the size of the token, it makes it consistent with what is fetched during the token authenticate checks.

This change also fixes time variations in the expiry test. Although unrelated to the above changes, it was making testing very frustrating.

For the database upgrade scripts, we now only bring 'token' up to V1 in the 001 script. This makes it possible to use the same 002 script both for upgrading and for initializing a new database. Upon upgrade, the current UUID tokens are retained in the id_hash and id fields, and the mechanisms to verify UUID tokens work the same as before. On downgrade, token ids are dropped.

Takes into account changes for "Raise unauthorized if tenant disabled" Bug 1003962

Change-Id: I89b5aa609143bbe09a36bfaf64758c5306e86de7
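The id_hash scheme described above can be sketched as follows: signed tokens can run to thousands of characters, far past the indexed-column limit, so they are reduced to a fixed 32-character MD5 digest for the primary key, while short UUID tokens pass through unchanged. This is a sketch of the idea, not the exact backend code (the real driver runs under Python 2 and hashes the raw string):

```python
import hashlib
import uuid

UUID_TOKEN_LENGTH = 32


def token_to_key(token_id):
    """Reduce a token id to a value short enough for an indexed key.

    UUID tokens already fit within the key-length limit and are used
    directly; longer signed tokens are hashed down to 32 hex characters.
    """
    if len(token_id) > UUID_TOKEN_LENGTH:
        return hashlib.md5(token_id.encode('utf-8')).hexdigest()
    return token_id
```

Because the mapping is the identity for UUID tokens, the upgraded schema can keep serving pre-existing tokens with no rewriting: their id and id_hash are the same value.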
parent 4ed0551985
commit bcc0f6d6fc
@@ -0,0 +1,86 @@
import os
import stat
import subprocess


UUID_TOKEN_LENGTH = 32


def cms_verify(formatted, signing_cert_file_name, ca_file_name):
    """Verifies the signature of the contents in accordance with CMS syntax."""
    process = subprocess.Popen(["openssl", "cms", "-verify",
                                "-certfile", signing_cert_file_name,
                                "-CAfile", ca_file_name,
                                "-inform", "PEM",
                                "-nosmimecap", "-nodetach",
                                "-nocerts", "-noattr"],
                               stdin=subprocess.PIPE,
                               stdout=subprocess.PIPE,
                               stderr=subprocess.PIPE)
    output, err = process.communicate(formatted)
    retcode = process.poll()
    if retcode:
        raise subprocess.CalledProcessError(retcode, "openssl", output=err)
    return output


def token_to_cms(signed_text):
    copy_of_text = signed_text.replace('-', '/')

    formatted = "-----BEGIN CMS-----\n"
    line_length = 64
    while len(copy_of_text) > 0:
        if (len(copy_of_text) > line_length):
            formatted += copy_of_text[:line_length]
            copy_of_text = copy_of_text[line_length:]
        else:
            formatted += copy_of_text
            copy_of_text = ""
        formatted += "\n"

    formatted += "-----END CMS-----\n"

    return formatted


def verify_token(token, signing_cert_file_name, ca_file_name):
    return cms_verify(token_to_cms(token),
                      signing_cert_file_name,
                      ca_file_name)


def cms_sign_text(text, signing_cert_file_name, signing_key_file_name):
    """Uses OpenSSL to sign a document.

    Produces a Base64 encoding of a DER formatted CMS document:
    http://en.wikipedia.org/wiki/Cryptographic_Message_Syntax
    """
    process = subprocess.Popen(["openssl", "cms", "-sign",
                                "-signer", signing_cert_file_name,
                                "-inkey", signing_key_file_name,
                                "-outform", "PEM",
                                "-nosmimecap", "-nodetach",
                                "-nocerts", "-noattr"],
                               stdin=subprocess.PIPE,
                               stdout=subprocess.PIPE)
    output, unused_err = process.communicate(text)
    retcode = process.poll()
    if retcode:
        raise subprocess.CalledProcessError(retcode,
                                            "openssl", output=output)
    return cms_to_token(output)


def cms_to_token(cms_text):

    start_delim = "-----BEGIN CMS-----"
    end_delim = "-----END CMS-----"
    signed_text = cms_text
    signed_text = signed_text.replace('/', '-')
    signed_text = signed_text.replace(start_delim, '')
    signed_text = signed_text.replace(end_delim, '')
    signed_text = signed_text.replace('\n', '')

    return signed_text

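Since token_to_cms and cms_to_token above are pure string transforms, their round-trip behavior can be checked without invoking OpenSSL. A standalone restatement of the two functions from this commit:

```python
def token_to_cms(signed_text):
    """Rebuild strict PEM framing from the compact token form."""
    copy_of_text = signed_text.replace('-', '/')  # undo URL-safe substitution
    formatted = "-----BEGIN CMS-----\n"
    while copy_of_text:
        formatted += copy_of_text[:64] + "\n"  # PEM lines are 64 columns
        copy_of_text = copy_of_text[64:]
    formatted += "-----END CMS-----\n"
    return formatted


def cms_to_token(cms_text):
    """Strip PEM framing and make the payload URL-safe."""
    stripped = cms_text.replace("-----BEGIN CMS-----", '')
    stripped = stripped.replace("-----END CMS-----", '')
    return stripped.replace('/', '-').replace('\n', '')


# Round trip: a compact token survives PEM framing and back unchanged.
# (Safe because standard Base64 never contains '-', so every hyphen in a
# compact token came from a slash.)
token = ("SGVsbG8vd29ybGQ" * 20).replace('/', '-')
assert cms_to_token(token_to_cms(token)) == token
```

The hyphen substitution is lossless precisely because '-' is not in the standard Base64 alphabet; the same guarantee lets the middleware undo it before handing the block to `openssl cms -verify`.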
@@ -23,14 +23,25 @@ from keystone.common import sql
 import keystone.catalog.backends.sql
 import keystone.contrib.ec2.backends.sql
 import keystone.identity.backends.sql
-import keystone.token.backends.sql
+# intentionally leave off token. We bring it up to V1 here manually


 def upgrade(migrate_engine):
     # Upgrade operations go here. Don't create your own engine; bind
     # migrate_engine to your metadata
     meta = MetaData()
     meta.bind = migrate_engine
     dialect = migrate_engine.url.get_dialect().name

     sql.ModelBase.metadata.create_all(migrate_engine)

+    token = Table('token', meta,
+                  Column('id', sql.String(64), primary_key=True),
+                  Column('expires', sql.DateTime()),
+                  Column('extra', sql.JsonBlob()))
+
+    token.create(migrate_engine, checkfirst=True)


 def downgrade(migrate_engine):
     # Operations to reverse the above upgrade go here.

@@ -0,0 +1,3 @@
alter table token drop id;
alter table token change id_hash id varchar(64);

@@ -0,0 +1,2 @@
alter table token change id id_hash varchar(64);
alter table token add id varchar(2048);

@@ -0,0 +1,8 @@
drop table token;

CREATE TABLE token (
    id VARCHAR(64) NOT NULL,
    expires DATETIME,
    extra TEXT,
    PRIMARY KEY (id)
);

@@ -0,0 +1,27 @@
CREATE TABLE token_backup (
    id_hash VARCHAR(64) NOT NULL,
    id VARCHAR(1024),
    expires DATETIME,
    extra TEXT,
    PRIMARY KEY (id_hash)
);

insert into token_backup
    select id as old_id,
           '',
           expires as old_expires,
           extra as old_extra from token;

drop table token;

CREATE TABLE token (
    id_hash VARCHAR(64) NOT NULL,
    id VARCHAR(1024),
    expires DATETIME,
    extra TEXT,
    PRIMARY KEY (id_hash)
);

insert into token select * from token_backup;

drop table token_backup;

@@ -110,7 +110,6 @@ def register_cli_int(*args, **kw):
    group = kw.pop('group', None)
    return conf.register_cli_opt(cfg.IntOpt(*args, **kw), group=group)


register_str('admin_token', default='ADMIN')
register_str('bind_host', default='0.0.0.0')
register_str('compute_port', default=8774)

@@ -126,6 +125,8 @@ register_str('keyfile', group='ssl', default=None)
register_str('ca_certs', group='ssl', default=None)
register_bool('cert_required', group='ssl', default=False)
# signing options
register_bool('disable_pki', group='signing',
              default=True)
register_str('certfile', group='signing',
             default="/etc/keystone/ssl/certs/signing_cert.pem")
register_str('keyfile', group='signing',

@@ -136,6 +137,7 @@ register_int('key_size', group='signing', default=2048)
register_int('valid_days', group='signing', default=3650)
register_str('ca_password', group='signing', default=None)


# sql options
register_str('connection', group='sql', default='sqlite:///keystone.db')
register_int('idle_timeout', group='sql', default=200)

@@ -154,7 +156,6 @@ register_str('driver', group='ec2',
register_str('driver', group='stats',
             default='keystone.contrib.stats.backends.kvs.Stats')


# ldap
register_str('url', group='ldap', default='ldap://localhost')
register_str('user', group='ldap', default='dc=Manager,dc=example,dc=com')

@@ -94,14 +94,17 @@ HTTP_X_ROLE
"""

import httplib
import json
import logging
import os
import stat
import subprocess
import time

import webob
import webob.exc

from keystone.openstack.common import jsonutils

from keystone.common import cms

LOG = logging.getLogger(__name__)

@@ -146,6 +149,22 @@ class AuthProtocol(object):
        self.cert_file = conf.get('certfile')
        self.key_file = conf.get('keyfile')

        # signing
        self.signing_dirname = conf.get('signing_dir', '/tmp/keystone-signing')
        if (os.path.exists(self.signing_dirname) and
                not os.access(self.signing_dirname, os.W_OK)):
            raise "TODO: Need to find an Exception to raise here."

        if not os.path.exists(self.signing_dirname):
            os.makedirs(self.signing_dirname)
            # will throw IOError if it cannot change permissions
            os.chmod(self.signing_dirname, stat.S_IRWXU)

        val = '%s/signing_cert.pem' % self.signing_dirname
        self.signing_cert_file_name = val
        val = '%s/cacert.pem' % self.signing_dirname
        self.ca_file_name = val

        # Credentials used to verify this component with the Auth service since
        # validating tokens is a privileged call
        self.admin_token = conf.get('admin_token')

@@ -271,6 +290,29 @@ class AuthProtocol(object):
                                           self.key_file,
                                           self.cert_file)

    def _http_request(self, method, path):
        """HTTP request helper used to make unspecified content type requests.

        :param method: http method
        :param path: relative request url
        :return (http response object)
        :raise ServerError when unable to communicate with keystone

        """
        conn = self._get_http_connection()

        try:
            conn.request(method, path)
            response = conn.getresponse()
            body = response.read()
        except Exception, e:
            LOG.error('HTTP connection exception: %s' % e)
            raise ServiceError('Unable to communicate with keystone')
        finally:
            conn.close()

        return response, body

    def _json_request(self, method, path, body=None, additional_headers=None):
        """HTTP request helper used to make json requests.

@@ -347,49 +389,30 @@ class AuthProtocol(object):
             raise ServiceError('invalid json response')

     def _validate_user_token(self, user_token, retry=True):
-        """Authenticate user token with keystone.
+        """Authenticate user using PKI

         :param user_token: user's token id
-        :param retry: flag that forces the middleware to retry
-                      user authentication when an indeterminate
-                      response is received. Optional.
-        :return token object received from keystone on success
+        :param retry: Ignored, as it is not longer relevant
+        :return uncrypted body of the token if the token is valid
         :raise InvalidUserToken if token is rejected
-        :raise ServiceError if unable to authenticate token
+        :no longer raises ServiceError since it no longer makes RPC

         """
-        cached = self._cache_get(user_token)
-        if cached:
-            return cached
-
-        headers = {'X-Auth-Token': self.get_admin_token()}
-        response, data = self._json_request('GET',
-                                            '/v2.0/tokens/%s' % user_token,
-                                            additional_headers=headers)
-
-        if response.status == 200:
+        try:
+            cached = self._cache_get(user_token)
+            if cached:
+                return cached
+            if (len(user_token) > cms.UUID_TOKEN_LENGTH):
+                verified = self.verify_signed_token(user_token)
+                data = json.loads(verified)
+            else:
+                data = self.verify_uuid_token(user_token, retry)
             self._cache_put(user_token, data)
             return data
-        if response.status == 404:
-            # FIXME(ja): I'm assuming the 404 status means that user_token is
-            # invalid - not that the admin_token is invalid
+        except Exception as e:
             self._cache_store_invalid(user_token)
             LOG.warn("Authorization failed for token %s", user_token)
             raise InvalidUserToken('Token authorization failed')
-        if response.status == 401:
-            LOG.info('Keystone rejected admin token %s, resetting', headers)
-            self.admin_token = None
-        else:
-            LOG.error('Bad response code while validating token: %s' %
-                      response.status)
-        if retry:
-            LOG.info('Retrying validation')
-            return self._validate_user_token(user_token, False)
-        else:
-            LOG.warn("Invalid user token: %s. Keystone response: %s.",
-                     user_token, data)
-
-        raise InvalidUserToken()

     def _build_user_headers(self, token_info):
         """Convert token object into headers.

@@ -541,6 +564,100 @@ class AuthProtocol(object):
                             'invalid',
                             time=self.token_cache_time)

    def cert_file_missing(self, called_proc_err, file_name):
        return (called_proc_err.output.find(self.signing_cert_file_name)
                and not os.path.exists(self.signing_cert_file_name))

    def verify_uuid_token(self, user_token, retry=True):
        """Authenticate user token with keystone.

        :param user_token: user's token id
        :param retry: flag that forces the middleware to retry
                      user authentication when an indeterminate
                      response is received. Optional.
        :return token object received from keystone on success
        :raise InvalidUserToken if token is rejected
        :raise ServiceError if unable to authenticate token

        """

        headers = {'X-Auth-Token': self.get_admin_token()}
        response, data = self._json_request('GET',
                                            '/v2.0/tokens/%s' % user_token,
                                            additional_headers=headers)

        if response.status == 200:
            self._cache_put(user_token, data)
            return data
        if response.status == 404:
            # FIXME(ja): I'm assuming the 404 status means that user_token is
            # invalid - not that the admin_token is invalid
            self._cache_store_invalid(user_token)
            LOG.warn("Authorization failed for token %s", user_token)
            raise InvalidUserToken('Token authorization failed')
        if response.status == 401:
            LOG.info('Keystone rejected admin token %s, resetting', headers)
            self.admin_token = None
        else:
            LOG.error('Bad response code while validating token: %s' %
                      response.status)
        if retry:
            LOG.info('Retrying validation')
            return self._validate_user_token(user_token, False)
        else:
            LOG.warn("Invalid user token: %s. Keystone response: %s.",
                     user_token, data)

        raise InvalidUserToken()

    def verify_signed_token(self, signed_text):
        """Converts a block of Base64 encoding to strict PEM format
        and verifies the signature of the contents in accordance with
        CMS syntax. If either of the certificate files is missing,
        fetch it and retry.
        """

        formatted = cms.token_to_cms(signed_text)

        while True:
            try:
                output = cms.cms_verify(formatted, self.signing_cert_file_name,
                                        self.ca_file_name)
            except subprocess.CalledProcessError as err:
                if self.cert_file_missing(err, self.signing_cert_file_name):
                    self.fetch_signing_cert()
                    continue
                if self.cert_file_missing(err, self.ca_file_name):
                    self.fetch_ca_cert()
                    continue
                raise err
            return output

    def fetch_signing_cert(self):
        response, data = self._http_request('GET',
                                            '/v2.0/certificates/signing')
        try:
            # todo check response
            certfile = open(self.signing_cert_file_name, 'w')
            certfile.write(data)
            certfile.close()
        except (AssertionError, KeyError):
            LOG.warn("Unexpected response from keystone service: %s", data)
            raise ServiceError('invalid json response')

    def fetch_ca_cert(self):
        response, data = self._http_request('GET',
                                            '/v2.0/certificates/ca')
        try:
            # todo check response
            certfile = open(self.ca_file_name, 'w')
            certfile.write(data)
            certfile.close()
        except (AssertionError, KeyError):
            LOG.warn("Unexpected response from keystone service: %s", data)
            raise ServiceError('invalid json response')


def filter_factory(global_conf, **local_conf):
    """Returns a WSGI filter app for use with paste.deploy."""

@@ -15,9 +15,10 @@
# under the License.

import uuid

import routes
import json

from keystone import config
from keystone import catalog
from keystone.common import logging
from keystone.common import wsgi

@@ -27,6 +28,11 @@ from keystone.openstack.common import timeutils
from keystone import policy
from keystone import token

from keystone.common import cms
from keystone.common import logging
from keystone.common import utils
from keystone.common import wsgi


LOG = logging.getLogger(__name__)

@@ -63,6 +69,17 @@ class AdminRouter(wsgi.ComposingRouter):
                       action='endpoints',
                       conditions=dict(method=['GET']))

        # Certificates used to verify auth tokens
        mapper.connect('/certificates/ca',
                       controller=auth_controller,
                       action='ca_cert',
                       conditions=dict(method=['GET']))

        mapper.connect('/certificates/signing',
                       controller=auth_controller,
                       action='signing_cert',
                       conditions=dict(method=['GET']))

        # Miscellaneous Operations
        extensions_controller = AdminExtensionsController()
        mapper.connect('/extensions',

@@ -94,6 +111,16 @@ class PublicRouter(wsgi.ComposingRouter):
                       action='authenticate',
                       conditions=dict(method=['POST']))

        mapper.connect('/certificates/ca',
                       controller=auth_controller,
                       action='ca_cert',
                       conditions=dict(method=['GET']))

        mapper.connect('/certificates/signing',
                       controller=auth_controller,
                       action='signing_cert',
                       conditions=dict(method=['GET']))

        # Miscellaneous
        extensions_controller = PublicExtensionsController()
        mapper.connect('/extensions',

@@ -225,6 +252,18 @@ class TokenController(wsgi.Application):
        self.policy_api = policy.Manager()
        super(TokenController, self).__init__()

    def ca_cert(self, context, auth=None):
        ca_file = open(config.CONF.signing.ca_certs, 'r')
        data = ca_file.read()
        ca_file.close()
        return data

    def signing_cert(self, context, auth=None):
        cert_file = open(config.CONF.signing.certfile, 'r')
        data = cert_file.read()
        cert_file.close()
        return data

    def authenticate(self, context, auth=None):
        """Authenticate credentials and return a token.

@@ -247,7 +286,6 @@ class TokenController(wsgi.Application):
         that will return a token that is scoped to that tenant.
         """

-        token_id = uuid.uuid4().hex
         if 'passwordCredentials' in auth:
             user_id = auth['passwordCredentials'].get('userId', None)
             username = auth['passwordCredentials'].get('username', '')

@@ -290,14 +328,10 @@ class TokenController(wsgi.Application):
                 raise exception.Unauthorized()
             except AssertionError as e:
                 raise exception.Unauthorized(e.message)
+            auth_token_data = dict(zip(["user", "tenant", "metadata"],
+                                       auth_info))
+            expiry = self.token_api._get_default_expire_time(context=context)

-            token_ref = self.token_api.create_token(
-                context,
-                token_id,
-                dict(id=token_id,
-                     user=user_ref,
-                     tenant=tenant_ref,
-                     metadata=metadata_ref))
             if tenant_ref:
                 catalog_ref = self.catalog_api.get_catalog(
                     context=context,

@@ -308,56 +342,60 @@ class TokenController(wsgi.Application):
             catalog_ref = {}

         elif 'token' in auth:
-            token = auth['token'].get('id', None)
+            old_token = auth['token'].get('id', None)
             tenant_name = auth.get('tenantName')

-            # more compat
-            if tenant_name:
-                tenant_ref = self.identity_api.get_tenant_by_name(
-                    context=context, tenant_name=tenant_name)
-                tenant_id = tenant_ref['id']
-            else:
-                tenant_id = auth.get('tenantId', None)
-
             try:
                 old_token_ref = self.token_api.get_token(context=context,
-                                                         token_id=token)
+                                                         token_id=old_token)
             except exception.NotFound:
                 raise exception.Unauthorized()

             user_ref = old_token_ref['user']
+            user_id = user_ref['id']

             # If the user is disabled don't allow them to authenticate
-            current_user_ref = self.identity_api.get_user(
-                context=context,
-                user_id=user_ref['id'])
+            current_user_ref = self.identity_api.get_user(context=context,
+                                                          user_id=user_id)
             if not current_user_ref.get('enabled', True):
-                LOG.warning('User %s is disabled' % user_ref['id'])
+                LOG.warning('User %s is disabled' % user_id)
                 raise exception.Unauthorized()

-            tenants = self.identity_api.get_tenants_for_user(context,
-                                                             user_ref['id'])
-            if tenant_id and tenant_id not in tenants:
-                raise exception.Unauthorized()
+            if tenant_name:
+                tenant_ref = self.identity_api.\
+                    get_tenant_by_name(context=context,
+                                       tenant_name=tenant_name)
+                tenant_id = tenant_ref['id']
+            else:
+                tenant_id = auth.get('tenantId', None)
+            tenants = self.identity_api.get_tenants_for_user(context, user_id)
+
+            if tenant_id:
+                if not tenant_id in tenants:
+                    LOG.warning('User %s is authorized for tenant %s'
+                                % (user_id, tenant_id))
+                    raise exception.Unauthorized()
+
+            #if the old token is sufficient unpack and return it.
+            if (old_token_ref['tenant']) and \
+                    (tenant_id == old_token_ref['tenant']['id']) and\
+                    len(old_token) > cms.UUID_TOKEN_LENGTH:
+                return_data = \
+                    json.loads(cms.verify_token
+                               (old_token,
+                                config.CONF.signing.certfile,
+                                config.CONF.signing.ca_certs))
+                return_data['access']['token']['id'] = old_token
+                return return_data

+            expiry = old_token_ref['expires']
             try:
-                tenant_ref = self.identity_api.get_tenant(
-                    context=context,
-                    tenant_id=tenant_id)
-                metadata_ref = self.identity_api.get_metadata(
-                    context=context,
-                    user_id=user_ref['id'],
-                    tenant_id=tenant_ref['id'])
-                catalog_ref = self.catalog_api.get_catalog(
-                    context=context,
-                    user_id=user_ref['id'],
-                    tenant_id=tenant_ref['id'],
-                    metadata=metadata_ref)
+                tenant_ref = self.identity_api.get_tenant(context=context,
+                                                          tenant_id=tenant_id)
             except exception.TenantNotFound:
                 tenant_ref = None
                 metadata_ref = {}
                 catalog_ref = {}

-            except exception.MetadataNotFound:
-                metadata_ref = {}
-                catalog_ref = {}

@@ -367,21 +405,61 @@ class TokenController(wsgi.Application):
                     LOG.warning('Tenant %s is disabled' % tenant_id)
                     raise exception.Unauthorized()

-            token_ref = self.token_api.create_token(
-                context, token_id, dict(id=token_id,
-                                        user=user_ref,
-                                        tenant=tenant_ref,
-                                        metadata=metadata_ref,
-                                        expires=old_token_ref['expires']))
+            if tenant_ref:
+                metadata_ref = self.identity_api.get_metadata(
+                    context=context,
+                    user_id=user_ref['id'],
+                    tenant_id=tenant_ref['id'])
+                catalog_ref = self.catalog_api.get_catalog(
+                    context=context,
+                    user_id=user_ref['id'],
+                    tenant_id=tenant_ref['id'],
+                    metadata=metadata_ref)
+
+            auth_token_data = dict(dict(user=current_user_ref,
+                                        tenant=tenant_ref,
+                                        metadata=metadata_ref))
+
+        auth_token_data['expires'] = expiry
+        auth_token_data['id'] = 'placeholder'

         # TODO(termie): optimize this call at some point and put it into the
         #               the return for metadata
         # fill out the roles in the metadata
         roles_ref = []
         for role_id in metadata_ref.get('roles', []):
-            roles_ref.append(self.identity_api.get_role(context, role_id))
-        logging.debug('TOKEN_REF %s', token_ref)
-        return self._format_authenticate(token_ref, roles_ref, catalog_ref)
+            role_ref = self.identity_api.get_role(context, role_id)
+            roles_ref.append(dict(name=role_ref['name']))
+
+        token_data = self._format_token(auth_token_data, roles_ref)
+
+        service_catalog = self._format_catalog(catalog_ref)
+        token_data['access']['serviceCatalog'] = service_catalog
+
+        if config.CONF.signing.disable_pki:
+            token_id = uuid.uuid4().hex
+            signed = token_id
+        else:
+            signed = cms.cms_sign_text(json.dumps(token_data),
+                                       config.CONF.signing.certfile,
+                                       config.CONF.signing.keyfile)
+            token_id = signed
+        try:
+            token_ref = self.token_api.create_token(
+                context, token_id, dict(key=token_id,
+                                        id=signed,
+                                        user=user_ref,
+                                        tenant=tenant_ref,
+                                        metadata=metadata_ref))
+        except Exception as ex:
+            #an identical token may have been created already.
+            #if so, return the token_data as it is also identical
+            try:
+                exist_token = self.token_api.get_token(context=context,
+                                                       token_id=token_id)
+            except exception.TokenNotFound:
+                raise ex
+
+        token_data['access']['token']['id'] = signed
+
+        return token_data

     def _get_token_ref(self, context, token_id, belongs_to=None):
         """Returns a token if a valid one exists.

@@ -390,14 +468,22 @@ class TokenController(wsgi.Application):

         """
         # TODO(termie): this stuff should probably be moved to middleware
-        self.assert_admin(context)
-
-        token_ref = self.token_api.get_token(context=context,
-                                             token_id=token_id)
-
-        if belongs_to:
-            assert token_ref['tenant']['id'] == belongs_to
-
+        if len(token_id) > cms.UUID_TOKEN_LENGTH:
+            self.assert_admin(context)
+            data = json.loads(cms.cms_verify(cms.token_to_cms(token_id),
+                                             config.CONF.signing.certfile,
+                                             config.CONF.signing.ca_certs))
+            access_data = data['access']
+            token_ref = access_data['token']
+            user_data = access_data['user']
+            token_ref['metadata'] = access_data['metadata']
+            token_ref['user'] = user_data
+            if belongs_to:
+                assert token_ref['tenant']['id'] == belongs_to
+            token_ref['expires']
+        else:
+            token_ref = self.token_api.get_token(context=context,
+                                                 token_id=token_id)
         return token_ref

     # admin only

@@ -464,7 +550,8 @@ class TokenController(wsgi.Application):
         metadata_ref = token_ref['metadata']
         expires = token_ref['expires']
         if expires is not None:
-            expires = timeutils.isotime(expires)
+            if not isinstance(expires, unicode):
+                expires = timeutils.isotime(expires)
         o = {'access': {'token': {'id': token_ref['id'],
                                   'expires': expires,
                                   },

@@ -482,6 +569,14 @@ class TokenController(wsgi.Application):
             o['access']['token']['tenant'] = token_ref['tenant']
         if catalog_ref is not None:
             o['access']['serviceCatalog'] = self._format_catalog(catalog_ref)
+        if metadata_ref:
+            if 'is_admin' in metadata_ref:
+                o['access']['metadata'] = {'is_admin':
+                                           metadata_ref['is_admin']}
+            else:
+                o['access']['metadata'] = {'is_admin': 0}
+            if 'roles' in metadata_ref:
+                o['access']['metadata']['roles'] = metadata_ref['roles']
         return o

     def _format_catalog(self, catalog_ref):

@@ -15,8 +15,11 @@
 # under the License.

 import copy
+import datetime
+import hashlib
 import uuid

-from keystone.common import sql
+from keystone.common import sql, cms
 from keystone import exception
 from keystone.openstack.common import timeutils
 from keystone import token

@@ -24,7 +27,8 @@ from keystone import token

 class TokenModel(sql.ModelBase, sql.DictBase):
     __tablename__ = 'token'
-    id = sql.Column(sql.String(64), primary_key=True)
+    id_hash = sql.Column(sql.String(64), primary_key=True)
+    id = sql.Column(sql.String(1024))
     expires = sql.Column(sql.DateTime(), default=None)
     extra = sql.Column(sql.JsonBlob())

@@ -33,13 +37,14 @@ class TokenModel(sql.ModelBase, sql.DictBase):
         # shove any non-indexed properties into extra
         extra = copy.deepcopy(token_dict)
         data = {}
-        for k in ('id', 'expires'):
+        for k in ('id_hash', 'id', 'expires'):
             data[k] = extra.pop(k, None)
         data['extra'] = extra
         return cls(**data)

     def to_dict(self):
         out = copy.deepcopy(self.extra)
+        out['id_hash'] = self.id
         out['id'] = self.id
         out['expires'] = self.expires
         return out

@@ -49,21 +54,29 @@ class Token(sql.Base, token.Driver):
     # Public interface
     def get_token(self, token_id):
         session = self.get_session()
-        token_ref = session.query(TokenModel).filter_by(id=token_id).first()
-        now = timeutils.utcnow()
+        token_ref = session.query(TokenModel)\
+            .filter_by(id_hash=self.token_to_key(token_id)).first()
+        now = datetime.datetime.utcnow()
         if token_ref and (not token_ref.expires or now < token_ref.expires):
             return token_ref.to_dict()
         else:
             raise exception.TokenNotFound(token_id=token_id)

+    def token_to_key(self, token_id):
+        if len(token_id) > cms.UUID_TOKEN_LENGTH:
+            hash = hashlib.md5()
+            hash.update(token_id)
+            return hash.hexdigest()
+        else:
+            return token_id
+
     def create_token(self, token_id, data):
         data_copy = copy.deepcopy(data)
         if 'expires' not in data_copy:
             data_copy['expires'] = self._get_default_expire_time()

         token_ref = TokenModel.from_dict(data_copy)
         token_ref.id = token_id
+        token_ref.id_hash = self.token_to_key(token_id)
         session = self.get_session()
         with session.begin():
             session.add(token_ref)
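The `token_to_key()` logic can be exercised standalone. A minimal sketch, assuming `cms.UUID_TOKEN_LENGTH` is the 32-character length of a `uuid4().hex` token (the constant's actual value is not shown in this diff):

```python
import hashlib
import uuid

UUID_TOKEN_LENGTH = 32  # assumed value of cms.UUID_TOKEN_LENGTH

def token_to_key(token_id):
    # Signed tokens are far longer than a uuid, so reduce them to a
    # fixed-size md5 digest that fits the indexed String(64) id_hash
    # column; plain uuid tokens are stored and looked up as-is.
    if len(token_id) > UUID_TOKEN_LENGTH:
        return hashlib.md5(token_id.encode("utf-8")).hexdigest()
    return token_id

short = uuid.uuid4().hex
assert token_to_key(short) == short          # uuid tokens pass through
assert len(token_to_key("M" * 1000)) == 32   # signed tokens become digests
```

Because both token formats are funneled through one lookup key, the same SQL path serves uuid and signed tokens, which is what lets the middleware handle both.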
@@ -72,6 +85,12 @@ class Token(sql.Base, token.Driver):

     def delete_token(self, token_id):
         session = self.get_session()
+        token_ref = session.query(TokenModel)\
+            .filter_by(id_hash=self.token_to_key(token_id))\
+            .first()
+        if not token_ref:
+            raise exception.TokenNotFound(token_id=token_id)
+
         with session.begin():
             if not session.query(TokenModel).filter_by(id=token_id).delete():
                 raise exception.TokenNotFound(token_id=token_id)
@@ -0,0 +1,4 @@
auth_token.pem was constructed using the following command

openssl cms -sign -in auth_token.json -nosmimecap -signer signing_cert.pem -inkey private_key.pem -outform PEM -nodetach -nocerts -noattr -out auth_token.pem
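As described in the overview, the PEM output of that signing command is then packed into a token: the header, footer, and line breaks are stripped, and slashes become hyphens. A rough sketch of the round trip (the helper names are assumptions, not the actual `keystone.common.cms` API):

```python
PEM_HEADER = "-----BEGIN CMS-----"
PEM_FOOTER = "-----END CMS-----"

def cms_to_token(cms_pem):
    # Strip the PEM armor and line breaks, then replace '/' (invalid in
    # the token's Base64-derived alphabet) with '-'.
    body = cms_pem.replace(PEM_HEADER, "").replace(PEM_FOOTER, "")
    return body.replace("\n", "").replace("/", "-")

def token_to_cms(token):
    # Reverse the transformation and re-wrap at 64 columns so the result
    # can be handed back to openssl for verification.
    b64 = token.replace("-", "/")
    lines = [b64[i:i + 64] for i in range(0, len(b64), 64)]
    return "\n".join([PEM_HEADER] + lines + [PEM_FOOTER]) + "\n"

pem = PEM_HEADER + "\nAbc/12+3\n" + PEM_FOOTER + "\n"
token = cms_to_token(pem)
assert token == "Abc-12+3"
assert "Abc/12+3" in token_to_cms(token)
```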
@@ -0,0 +1 @@
{"access": {"serviceCatalog": [{"endpoints": [{"adminURL": "http://127.0.0.1:8776/v1/64b6f3fbcc53435e8a60fcf89bb6617a", "region": "regionOne", "internalURL": "http://127.0.0.1:8776/v1/64b6f3fbcc53435e8a60fcf89bb6617a", "publicURL": "http://127.0.0.1:8776/v1/64b6f3fbcc53435e8a60fcf89bb6617a"}], "endpoints_links": [], "type": "volume", "name": "volume"}, {"endpoints": [{"adminURL": "http://127.0.0.1:9292/v1", "region": "regionOne", "internalURL": "http://127.0.0.1:9292/v1", "publicURL": "http://127.0.0.1:9292/v1"}], "endpoints_links": [], "type": "image", "name": "glance"}, {"endpoints": [{"adminURL": "http://127.0.0.1:8774/v1.1/64b6f3fbcc53435e8a60fcf89bb6617a", "region": "regionOne", "internalURL": "http://127.0.0.1:8774/v1.1/64b6f3fbcc53435e8a60fcf89bb6617a", "publicURL": "http://127.0.0.1:8774/v1.1/64b6f3fbcc53435e8a60fcf89bb6617a"}], "endpoints_links": [], "type": "compute", "name": "nova"}, {"endpoints": [{"adminURL": "http://127.0.0.1:35357/v2.0", "region": "RegionOne", "internalURL": "http://127.0.0.1:35357/v2.0", "publicURL": "http://127.0.0.1:5000/v2.0"}], "endpoints_links": [], "type": "identity", "name": "keystone"}],"token": {"expires": "2012-06-02T14:47:34Z", "id": "placeholder", "tenant": {"enabled": true, "description": null, "name": "tenant_name1", "id": "tenant_id1"}}, "user": {"username": "user_name1", "roles_links": ["role1","role2"], "id": "user_id1", "roles": [{"name": "role1"}, {"name": "role2"}], "name": "user_name1"}}}
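The auth_token.json fixture is the payload that gets signed; once the middleware has verified the CMS envelope, the authorization data (user, tenant, roles) is read straight out of the JSON with no network call. A trimmed-down illustration:

```python
import json

# Abbreviated version of the auth_token.json fixture above.
access_doc = """
{"access": {"token": {"expires": "2012-06-02T14:47:34Z", "id": "placeholder",
                      "tenant": {"name": "tenant_name1", "id": "tenant_id1"}},
            "user": {"username": "user_name1", "id": "user_id1",
                     "roles": [{"name": "role1"}, {"name": "role2"}]}}}
"""

data = json.loads(access_doc)["access"]
roles = [r["name"] for r in data["user"]["roles"]]
assert roles == ["role1", "role2"]
assert data["token"]["tenant"]["id"] == "tenant_id1"
```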
@@ -0,0 +1,40 @@
-----BEGIN CMS-----
MIIG7QYJKoZIhvcNAQcCoIIG3jCCBtoCAQExCTAHBgUrDgMCGjCCBc4GCSqGSIb3
DQEHAaCCBb8EggW7eyJhY2Nlc3MiOiB7InNlcnZpY2VDYXRhbG9nIjogW3siZW5k
cG9pbnRzIjogW3siYWRtaW5VUkwiOiAiaHR0cDovLzEyNy4wLjAuMTo4Nzc2L3Yx
LzY0YjZmM2ZiY2M1MzQzNWU4YTYwZmNmODliYjY2MTdhIiwgInJlZ2lvbiI6ICJy
ZWdpb25PbmUiLCAiaW50ZXJuYWxVUkwiOiAiaHR0cDovLzEyNy4wLjAuMTo4Nzc2
L3YxLzY0YjZmM2ZiY2M1MzQzNWU4YTYwZmNmODliYjY2MTdhIiwgInB1YmxpY1VS
TCI6ICJodHRwOi8vMTI3LjAuMC4xOjg3NzYvdjEvNjRiNmYzZmJjYzUzNDM1ZThh
NjBmY2Y4OWJiNjYxN2EifV0sICJlbmRwb2ludHNfbGlua3MiOiBbXSwgInR5cGUi
OiAidm9sdW1lIiwgIm5hbWUiOiAidm9sdW1lIn0sIHsiZW5kcG9pbnRzIjogW3si
YWRtaW5VUkwiOiAiaHR0cDovLzEyNy4wLjAuMTo5MjkyL3YxIiwgInJlZ2lvbiI6
ICJyZWdpb25PbmUiLCAiaW50ZXJuYWxVUkwiOiAiaHR0cDovLzEyNy4wLjAuMTo5
MjkyL3YxIiwgInB1YmxpY1VSTCI6ICJodHRwOi8vMTI3LjAuMC4xOjkyOTIvdjEi
fV0sICJlbmRwb2ludHNfbGlua3MiOiBbXSwgInR5cGUiOiAiaW1hZ2UiLCAibmFt
ZSI6ICJnbGFuY2UifSwgeyJlbmRwb2ludHMiOiBbeyJhZG1pblVSTCI6ICJodHRw
Oi8vMTI3LjAuMC4xOjg3NzQvdjEuMS82NGI2ZjNmYmNjNTM0MzVlOGE2MGZjZjg5
YmI2NjE3YSIsICJyZWdpb24iOiAicmVnaW9uT25lIiwgImludGVybmFsVVJMIjog
Imh0dHA6Ly8xMjcuMC4wLjE6ODc3NC92MS4xLzY0YjZmM2ZiY2M1MzQzNWU4YTYw
ZmNmODliYjY2MTdhIiwgInB1YmxpY1VSTCI6ICJodHRwOi8vMTI3LjAuMC4xOjg3
NzQvdjEuMS82NGI2ZjNmYmNjNTM0MzVlOGE2MGZjZjg5YmI2NjE3YSJ9XSwgImVu
ZHBvaW50c19saW5rcyI6IFtdLCAidHlwZSI6ICJjb21wdXRlIiwgIm5hbWUiOiAi
bm92YSJ9LCB7ImVuZHBvaW50cyI6IFt7ImFkbWluVVJMIjogImh0dHA6Ly8xMjcu
MC4wLjE6MzUzNTcvdjIuMCIsICJyZWdpb24iOiAiUmVnaW9uT25lIiwgImludGVy
bmFsVVJMIjogImh0dHA6Ly8xMjcuMC4wLjE6MzUzNTcvdjIuMCIsICJwdWJsaWNV
UkwiOiAiaHR0cDovLzEyNy4wLjAuMTo1MDAwL3YyLjAifV0sICJlbmRwb2ludHNf
bGlua3MiOiBbXSwgInR5cGUiOiAiaWRlbnRpdHkiLCAibmFtZSI6ICJrZXlzdG9u
ZSJ9XSwidG9rZW4iOiB7ImV4cGlyZXMiOiAiMjAxMi0wNi0wMlQxNDo0NzozNFoi
LCAiaWQiOiAicGxhY2Vob2xkZXIiLCAidGVuYW50IjogeyJlbmFibGVkIjogdHJ1
ZSwgImRlc2NyaXB0aW9uIjogbnVsbCwgIm5hbWUiOiAidGVuYW50X25hbWUxIiwg
ImlkIjogInRlbmFudF9pZDEifX0sICJ1c2VyIjogeyJ1c2VybmFtZSI6ICJ1c2Vy
X25hbWUxIiwgInJvbGVzX2xpbmtzIjogWyJyb2xlMSIsInJvbGUyIl0sICJpZCI6
ICJ1c2VyX2lkMSIsICJyb2xlcyI6IFt7Im5hbWUiOiAicm9sZTEifSwgeyJuYW1l
IjogInJvbGUyIn1dLCAibmFtZSI6ICJ1c2VyX25hbWUxIn19fQ0KMYH3MIH0AgEB
MFQwTzEVMBMGA1UEChMMUmVkIEhhdCwgSW5jMREwDwYDVQQHEwhXZXN0Zm9yZDEW
MBQGA1UECBMNTWFzc2FjaHVzZXR0czELMAkGA1UEBhMCVVMCAQEwBwYFKw4DAhow
DQYJKoZIhvcNAQEBBQAEgYAD6hPEpc/0wHe3rYDBFec52h7gxdbrTNEN7jmwdFto
xw0QnucmCREh9IUikJ2ob0c0uUC6cmNPajD9aFkGWhvNswNH2W2BYzUiC3CHM7U0
7nsIe3OOatqyUAyoQUhHZnIAx1tOgdPBVflnrtdIV1vkdqxednlJZ52Hxob2PP3h
xg==
-----END CMS-----
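A fixture like the one above can be verified entirely offline with the certificate alone. The following round trip is a sketch using a throwaway self-signed key and cert (the filenames are local to the sketch, not the repository's fixtures); the sign command mirrors the one in the README, and the verify command mirrors the flags at verification time:

```shell
# Generate a throwaway self-signed signing key and certificate.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=signing" \
    -keyout private_key.pem -out signing_cert.pem

# Sign a small JSON payload the same way auth_token.pem was produced.
echo '{"user": "demo", "roles": ["role1"]}' > auth_token.json
openssl cms -sign -in auth_token.json -nosmimecap -signer signing_cert.pem \
    -inkey private_key.pem -outform PEM -nodetach -nocerts -noattr \
    -out auth_token.pem

# Verification needs only the certificate -- no call to Keystone.
openssl cms -verify -in auth_token.pem -inform PEM \
    -certfile signing_cert.pem -CAfile signing_cert.pem -out verified.json
```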
@@ -0,0 +1,18 @@
-----BEGIN CERTIFICATE-----
MIICzjCCAjegAwIBAgIJAMwBikmrmZ0sMA0GCSqGSIb3DQEBBAUAME8xFTATBgNV
BAoTDFJlZCBIYXQsIEluYzERMA8GA1UEBxMIV2VzdGZvcmQxFjAUBgNVBAgTDU1h
c3NhY2h1c2V0dHMxCzAJBgNVBAYTAlVTMB4XDTEyMDUxODE5MzQ1MVoXDTIyMDUx
NjE5MzQ1MVowTzEVMBMGA1UEChMMUmVkIEhhdCwgSW5jMREwDwYDVQQHEwhXZXN0
Zm9yZDEWMBQGA1UECBMNTWFzc2FjaHVzZXR0czELMAkGA1UEBhMCVVMwgZ8wDQYJ
KoZIhvcNAQEBBQADgY0AMIGJAoGBAORnyPRzimWPxIeTJ3DEedU5hzRjzfDC8ZHP
ZgmB81V5VUiPTB72uNf8Wh6p0mhBMSmVkmvWJNjdrGWXU/SmtVd9EFLRyLwUt9kk
3fjEHBl7HXLc1kAwaBsmA6LGDHvxQ34zXB2hvqd5x3BwPGnzN5XUEHjIjQncLkhi
86BqaTkhAgMBAAGjgbEwga4wDAYDVR0TBAUwAwEB/zAdBgNVHQ4EFgQUv20jLjrl
MDv+KyKSjzuEmagGCekwfwYDVR0jBHgwdoAUv20jLjrlMDv+KyKSjzuEmagGCemh
U6RRME8xFTATBgNVBAoTDFJlZCBIYXQsIEluYzERMA8GA1UEBxMIV2VzdGZvcmQx
FjAUBgNVBAgTDU1hc3NhY2h1c2V0dHMxCzAJBgNVBAYTAlVTggkAzAGKSauZnSww
DQYJKoZIhvcNAQEEBQADgYEAYLM3oI2qawJpyNODliOkwRvlSsotF/2pn5EU85I5
vGewZxrgwwy2DbK6w8EECcarOjRJwz1ZYyi8ZpATipbLTX2JtmSwiye6YjhJyU4f
yp7jtnalLlpoDigHHWjc1jzoKDQTk7g1F/XzUBTG5rcEB24IzLXgr7vt2TU+7/nq
KbY=
-----END CERTIFICATE-----
@@ -0,0 +1,16 @@
-----BEGIN PRIVATE KEY-----
MIICdgIBADANBgkqhkiG9w0BAQEFAASCAmAwggJcAgEAAoGBAKaTKHl5YfzfWUkV
QS5O6UoBLQ+Sh/tHjXpKhsSmFXkKD4nFQiIf2X1HGdQkKFY258pVvWbVNb82LT4k
F7r+tElQh4zzPO2f633hPs+GrrvzyDwXIKU2Y0/7aAy9mcPpHEK0ACnn0vYzF5Ax
1FhqHmXpeNpxla4dxK1wPFNIwWgdAgMBAAECgYBTNwjtRnpxPZL5M6kQXVOmKNg+
A1Hzcld3VGvnKaFoimIgzW6wZYDdWPvKQxXznBJHvnWUPcdP8ty/QoCoZj3h5ABA
PaaJjsMDYzP5XzvFi1X0bWu5DZbrd5aCqCJV7qiHrAg6kfOzzqGgQULrh/LJh0nn
1ZIDzx4o7RM9nreOAQJBANJxRNgh3msy4K72dipHewSX0ZBg0TlophfqXYuBauK0
twIiqOtZwNmBM+bO8sYOqki/eagbzihEjcomVP+THCECQQDKor5ZKxRLPGW5t0B4
ix85mbIHo7jkbVjcwEFEwnIZ5uLj0KD3G31UqmrocXuzJmWhwryWmwx0+BHMlhTq
Nyx9AkEAmVZRTI75KvEqiDIrjckB2SnqWCJDsWoQRDLQMJt/T2tQQi0RGlQO0i1z
rQU0Hp6G83UZZyXDhNHW4uolWwhNIQJAU3UT0MXdZd9KRmMjOoKSKbcTi/HyhKJE
pybHuvoa5HAjopCauyunQuetgG6889wsn6ME6UKSrto8+nYVxyFSQQJALJ6x4AxJ
IJJiR9lHIGQKw2SD1cty1FkSxHWcSc3CMTy3COrchI6o4wSJ/jMIRT95c09Ir5bT
Mgus0nrjlXFl7w==
-----END PRIVATE KEY-----
@@ -0,0 +1,13 @@
-----BEGIN CERTIFICATE-----
MIICCzCCAXQCAQEwDQYJKoZIhvcNAQEEBQAwTzEVMBMGA1UEChMMUmVkIEhhdCwg
SW5jMREwDwYDVQQHEwhXZXN0Zm9yZDEWMBQGA1UECBMNTWFzc2FjaHVzZXR0czEL
MAkGA1UEBhMCVVMwHhcNMTIwNTE4MTk0MTQyWhcNMTMwNTE4MTk0MTQyWjBNMQsw
CQYDVQQGEwJVUzEWMBQGA1UECBMNTWFzc2FjaHVzZXR0czEVMBMGA1UEChMMUmVk
IEhhdCwgSW5jMQ8wDQYDVQQDEwZheW91bmcwgZ8wDQYJKoZIhvcNAQEBBQADgY0A
MIGJAoGBAKaTKHl5YfzfWUkVQS5O6UoBLQ+Sh/tHjXpKhsSmFXkKD4nFQiIf2X1H
GdQkKFY258pVvWbVNb82LT4kF7r+tElQh4zzPO2f633hPs+GrrvzyDwXIKU2Y0/7
aAy9mcPpHEK0ACnn0vYzF5Ax1FhqHmXpeNpxla4dxK1wPFNIwWgdAgMBAAEwDQYJ
KoZIhvcNAQEEBQADgYEA1Nr9B+iTLLzlMc+8dsyJpDEzVPACVkElhVDojODfOW3p
MD0rINb+icprJVp+zBOR0MDYtGyBFUNGLFE3z2i5gWKu/63Ge3wfC0KBLFs6UQEd
82MQS3pBEub+4SM7XkhKajx12YgkX0ntEpNCAkm/YdGW4af5xlkViJ3cBpqWwuk=
-----END CERTIFICATE-----
@@ -602,7 +602,7 @@ class IdentityTests(object):
 class TokenTests(object):
     def test_token_crud(self):
         token_id = uuid.uuid4().hex
-        data = {'id': token_id, 'a': 'b'}
+        data = {'id': token_id, 'id_hash': token_id, 'a': 'b'}
         data_ref = self.token_api.create_token(token_id, data)
         expires = data_ref.pop('expires')
         self.assertTrue(isinstance(expires, datetime.datetime))
@@ -632,7 +632,8 @@ class TokenTests(object):
     def test_expired_token(self):
         token_id = uuid.uuid4().hex
         expire_time = timeutils.utcnow() - datetime.timedelta(minutes=1)
-        data = {'id': token_id, 'a': 'b', 'expires': expire_time}
+        data = {'id_hash': token_id, 'id': token_id, 'a': 'b',
+                'expires': expire_time}
         data_ref = self.token_api.create_token(token_id, data)
         self.assertDictEqual(data_ref, data)
         self.assertRaises(exception.TokenNotFound,
@@ -640,7 +641,7 @@ class TokenTests(object):

     def test_null_expires_token(self):
         token_id = uuid.uuid4().hex
-        data = {'id': token_id, 'a': 'b', 'expires': None}
+        data = {'id': token_id, 'id_hash': token_id, 'a': 'b', 'expires': None}
         data_ref = self.token_api.create_token(token_id, data)
         self.assertDictEqual(data_ref, data)
         new_data_ref = self.token_api.get_token(token_id)