Multihash implementation for Glance

Partially implements blueprint multihash.

Requires glance_store 0.26.1

Co-authored-by: Scott McClymont <scott.mcclymont@verizonwireless.com>
Co-authored-by: Brian Rosmaita <rosmaita.fossdev@gmail.com>

Change-Id: Ib28ea1f6c431db6434dbab2a234018e82d5a6d1a
Brian Rosmaita 2018-07-30 15:48:49 -04:00
parent ff77f59bd4
commit 0b24dbd620
32 changed files with 511 additions and 35 deletions

View File

@ -202,6 +202,8 @@ Response Parameters
- min_disk: min_disk
- min_ram: min_ram
- name: name
- os_hash_algo: os_hash_algo
- os_hash_value: os_hash_value
- owner: owner
- protected: protected
- schema: schema-image
@ -266,6 +268,8 @@ Response Parameters
- min_disk: min_disk
- min_ram: min_ram
- name: name
- os_hash_algo: os_hash_algo
- os_hash_value: os_hash_value
- owner: owner
- protected: protected
- schema: schema-image
@ -584,8 +588,10 @@ Response Parameters
- id: id
- min_disk: min_disk
- min_ram: min_ram
- owner: owner
- name: name
- owner: owner
- os_hash_algo: os_hash_algo
- os_hash_value: os_hash_value
- protected: protected
- schema: schema-image
- self: self

View File

@ -484,6 +484,27 @@ next:
in: body
required: true
type: string
os_hash_algo:
description: |
The algorithm used to compute a secure hash of the image data for this
image. The result of the computation is displayed as the value of the
``os_hash_value`` property. The value might be ``null`` (JSON null
data type). The algorithm used is chosen by the cloud operator; it
may not be configured by end users. *(Since Image API v2.7)*
in: body
required: true
type: string
os_hash_value:
description: |
The hexdigest of the secure hash of the image data computed using the
algorithm whose name is the value of the ``os_hash_algo`` property.
The value might be ``null`` (JSON null data type) if data has not
yet been associated with this image, or if the image was created using
a version of the Image Service API prior to version 2.7.
*(Since Image API v2.7)*
in: body
required: true
type: string
owner:
description: |
An identifier for the owner of the image, usually the project (also
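Because the two new properties are self-describing, an image consumer can verify downloaded image data with nothing beyond the Python standard library. A minimal client-side sketch, assuming the image record came from GET /v2/images/{image_id} and the bytes from GET /v2/images/{image_id}/file (the function and variable names below are illustrative and not part of this change):

    import hashlib

    def verify_multihash(image_record, image_data):
        """Return True if image_data matches the image's os_hash_value."""
        algo = image_record['os_hash_algo']       # e.g. 'sha512'
        expected = image_record['os_hash_value']  # hexdigest string
        if algo is None or expected is None:
            # data not yet associated, or image predates Image API v2.7
            return False
        hasher = hashlib.new(algo)
        hasher.update(image_data)
        return hasher.hexdigest() == expected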

View File

@ -15,6 +15,8 @@
"id": "b2173dd3-7ad6-4362-baa6-a68bce3565cb",
"file": "/v2/images/b2173dd3-7ad6-4362-baa6-a68bce3565cb/file",
"checksum": null,
"os_hash_algo": null,
"os_hash_value": null,
"owner": "bab7d5c60cd041a0a36f7c4b6e1dd978",
"virtual_size": null,
"min_ram": 0,

View File

@ -13,6 +13,8 @@
"id": "1bea47ed-f6a9-463b-b423-14b9cca9ad27",
"file": "/v2/images/1bea47ed-f6a9-463b-b423-14b9cca9ad27/file",
"checksum": "64d7c1cd2b6f60c92c14662941cb7913",
"os_hash_algo": "sha512",
"os_hash_value": "073b4523583784fbe01daff81eba092a262ec37ba6d04dd3f52e4cd5c93eb8258af44881345ecda0e49f3d8cc6d2df6b050ff3e72681d723234aff9d17d0cf09"
"owner": "5ef70662f8b34079a6eddb8da9d75fe8",
"size": 13167616,
"min_ram": 0,

View File

@ -13,6 +13,8 @@
"id": "1bea47ed-f6a9-463b-b423-14b9cca9ad27",
"file": "/v2/images/1bea47ed-f6a9-463b-b423-14b9cca9ad27/file",
"checksum": "64d7c1cd2b6f60c92c14662941cb7913",
"os_hash_algo": "sha512",
"os_hash_value": "073b4523583784fbe01daff81eba092a262ec37ba6d04dd3f52e4cd5c93eb8258af44881345ecda0e49f3d8cc6d2df6b050ff3e72681d723234aff9d17d0cf09"
"owner": "5ef70662f8b34079a6eddb8da9d75fe8",
"size": 13167616,
"min_ram": 0,

View File

@ -9,6 +9,8 @@
"min_ram": 512,
"name": "Fedora 17",
"owner": "02a7fb2dd4ef434c8a628c511dcbbeb6",
"os_hash_algo": "sha512",
"os_hash_value": "ef7d1ed957ffafefb324d50ebc6685ed03d0e64549762ba94a1c44e92270cdbb69d7437dd1e101d00dd41684aaecccad1edc5c2e295e66d4733025b052497844"
"protected": false,
"schema": "/v2/schemas/image",
"self": "/v2/images/2b61ed2b-f800-4da0-99ff-396b742b8646",

View File

@ -15,6 +15,8 @@
"id": "1bea47ed-f6a9-463b-b423-14b9cca9ad27",
"file": "/v2/images/1bea47ed-f6a9-463b-b423-14b9cca9ad27/file",
"checksum": "64d7c1cd2b6f60c92c14662941cb7913",
"os_hash_algo": "sha512",
"os_hash_value": "073b4523583784fbe01daff81eba092a262ec37ba6d04dd3f52e4cd5c93eb8258af44881345ecda0e49f3d8cc6d2df6b050ff3e72681d723234aff9d17d0cf09"
"owner": "5ef70662f8b34079a6eddb8da9d75fe8",
"size": 13167616,
"min_ram": 0,
@ -36,6 +38,8 @@
"id": "781b3762-9469-4cec-b58d-3349e5de4e9c",
"file": "/v2/images/781b3762-9469-4cec-b58d-3349e5de4e9c/file",
"checksum": "afab0f79bac770d61d24b4d0560b5f70",
"os_hash_algo": "sha512",
"os_hash_value": "ea3e20140df1cc65f53d4c5b9ee3b38d0d6868f61bbe2230417b0f98cef0e0c7c37f0ebc5c6456fa47f013de48b452617d56c15fdba25e100379bd0e81ee15ec"
"owner": "5ef70662f8b34079a6eddb8da9d75fe8",
"size": 476704768,
"min_ram": 0,

View File

@ -145,6 +145,24 @@
"is_base": false,
"type": "string"
},
"os_hash_algo": {
"description": "Algorithm to calculate the os_hash_value",
"maxLength": 64,
"readOnly": true,
"type": [
"null",
"string"
]
},
"os_hash_value": {
"description": "Hexdigest of the image contents using the algorithm specified by the os_hash_algo",
"maxLength": 128,
"readOnly": true,
"type": [
"null",
"string"
]
},
"os_version": {
"description": "Operating system version as specified by the distributor",
"is_base": false,

View File

@ -166,6 +166,24 @@
"is_base": false,
"type": "string"
},
"os_hash_algo": {
"description": "Algorithm to calculate the os_hash_value",
"maxLength": 64,
"readOnly": true,
"type": [
"null",
"string"
]
},
"os_hash_value": {
"description": "Hexdigest of the image contents using the algorithm specified by the os_hash_algo",
"maxLength": 128,
"readOnly": true,
"type": [
"null",
"string"
]
},
"os_version": {
"description": "Operating system version as specified by the distributor",
"is_base": false,

View File

@ -315,6 +315,8 @@ class ImmutableImageProxy(object):
min_disk = _immutable_attr('base', 'min_disk')
min_ram = _immutable_attr('base', 'min_ram')
protected = _immutable_attr('base', 'protected')
os_hash_algo = _immutable_attr('base', 'os_hash_algo')
os_hash_value = _immutable_attr('base', 'os_hash_value')
os_hidden = _immutable_attr('base', 'os_hidden')
locations = _immutable_attr('base', 'locations', proxy=ImmutableLocations)
checksum = _immutable_attr('base', 'checksum')

View File

@ -446,7 +446,8 @@ class RequestDeserializer(wsgi.JSONRequestDeserializer):
_disallowed_properties = ('direct_url', 'self', 'file', 'schema')
_readonly_properties = ('created_at', 'updated_at', 'status', 'checksum',
'size', 'virtual_size', 'direct_url', 'self',
'file', 'schema', 'id')
'file', 'schema', 'id', 'os_hash_algo',
'os_hash_value')
_reserved_properties = ('location', 'deleted', 'deleted_at')
_base_properties = ('checksum', 'created_at', 'container_format',
'disk_format', 'id', 'min_disk', 'min_ram', 'name',
@ -884,7 +885,8 @@ class ResponseSerializer(wsgi.JSONResponseSerializer):
attributes = ['name', 'disk_format', 'container_format',
'visibility', 'size', 'virtual_size', 'status',
'checksum', 'protected', 'min_ram', 'min_disk',
'owner', 'os_hidden']
'owner', 'os_hidden', 'os_hash_algo',
'os_hash_value']
for key in attributes:
image_view[key] = getattr(image, key)
image_view['id'] = image.image_id
@ -1018,6 +1020,19 @@ def get_base_properties():
'description': _('md5 hash of image contents.'),
'maxLength': 32,
},
'os_hash_algo': {
'type': ['null', 'string'],
'readOnly': True,
'description': _('Algorithm to calculate the os_hash_value'),
'maxLength': 64,
},
'os_hash_value': {
'type': ['null', 'string'],
'readOnly': True,
'description': _('Hexdigest of the image contents using the '
'algorithm specified by the os_hash_algo'),
'maxLength': 128,
},
'owner': {
'type': ['null', 'string'],
'description': _('Owner of the image'),

View File

@ -191,6 +191,40 @@ Possible values:
Related options:
* image_property_quota
""")),
cfg.StrOpt('hashing_algorithm',
default='sha512',
help=_("""
Secure hashing algorithm used for computing the 'os_hash_value' property.
This option configures the Glance "multihash", which consists of two
image properties: the 'os_hash_algo' and the 'os_hash_value'. The
'os_hash_algo' will be populated by the value of this configuration
option, and the 'os_hash_value' will be populated by the hexdigest computed
when the algorithm is applied to the uploaded or imported image data.
The value must be a valid secure hash algorithm name recognized by the
python 'hashlib' library. You can determine what these are by examining
the 'hashlib.algorithms_available' data member of the version of the
library being used in your Glance installation. For interoperability
purposes, however, we recommend that you use the set of secure hash
names supplied by the 'hashlib.algorithms_guaranteed' data member because
those algorithms are guaranteed to be supported by the 'hashlib' library
on all platforms. Thus, any image consumer using 'hashlib' locally should
be able to verify the 'os_hash_value' of the image.
The default value of 'sha512' is a performant secure hash algorithm.
If this option is misconfigured, any attempts to store image data will fail.
For that reason, we recommend using the default value.
Possible values:
* Any secure hash algorithm name recognized by the Python 'hashlib'
library
Related options:
* None
""")),
cfg.IntOpt('image_member_quota', default=128,
help=_("""
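The help text above recommends choosing a name from hashlib.algorithms_guaranteed. A quick sanity check of a candidate value, as a sketch (nothing here is part of the change itself):

    import hashlib

    candidate = 'sha512'
    # 'sha512' is in the guaranteed set, so any hashlib-based consumer can verify it
    print(candidate in hashlib.algorithms_guaranteed)   # True on all platforms
    print(candidate in hashlib.algorithms_available)    # True (superset of the above)

    # A misconfigured name fails as soon as a hash object is constructed
    try:
        hashlib.new('not-a-real-algorithm')
    except ValueError as exc:
        print(exc)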

View File

@ -130,6 +130,8 @@ class ImageRepo(object):
protected=db_image['protected'],
locations=location_strategy.get_ordered_locations(locations),
checksum=db_image['checksum'],
os_hash_algo=db_image['os_hash_algo'],
os_hash_value=db_image['os_hash_value'],
owner=db_image['owner'],
disk_format=db_image['disk_format'],
container_format=db_image['container_format'],
@ -162,6 +164,8 @@ class ImageRepo(object):
'protected': image.protected,
'locations': locations,
'checksum': image.checksum,
'os_hash_algo': image.os_hash_algo,
'os_hash_value': image.os_hash_value,
'owner': image.owner,
'disk_format': image.disk_format,
'container_format': image.container_format,

View File

@ -225,6 +225,8 @@ def _image_format(image_id, **values):
'size': None,
'virtual_size': None,
'checksum': None,
'os_hash_algo': None,
'os_hash_value': None,
'tags': [],
'created_at': dt,
'updated_at': dt,
@ -735,7 +737,7 @@ def image_create(context, image_values, v1_mode=False):
'protected', 'is_public', 'container_format',
'disk_format', 'created_at', 'updated_at', 'deleted',
'deleted_at', 'properties', 'tags', 'visibility',
'os_hidden'])
'os_hidden', 'os_hash_algo', 'os_hash_value'])
incorrect_keys = set(image_values.keys()) - allowed_keys
if incorrect_keys:

View File

@ -0,0 +1,26 @@
# Copyright (C) 2018 Verizon Wireless
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
def has_migrations(engine):
"""Returns true if at least one data row can be migrated."""
return False
def migrate(engine):
"""Return the number of rows migrated."""
return 0

View File

@ -0,0 +1,25 @@
# Copyright (C) 2018 Verizon Wireless
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# revision identifiers, used by Alembic.
revision = 'rocky_contract02'
down_revision = 'rocky_contract01'
branch_labels = None
depends_on = 'rocky_expand02'
def upgrade():
pass

View File

@ -0,0 +1,33 @@
# Copyright (C) 2018 Verizon Wireless
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""add os_hash_algo and os_hash_value columns to images table"""
from alembic import op
from sqlalchemy import Column, String
# revision identifiers, used by Alembic.
revision = 'rocky_expand02'
down_revision = 'rocky_expand01'
branch_labels = None
depends_on = None
def upgrade():
algo_col = Column('os_hash_algo', String(length=64), nullable=True)
value_col = Column('os_hash_value', String(length=128), nullable=True)
op.add_column('images', algo_col)
op.add_column('images', value_col)
op.create_index('os_hash_value_image_idx', 'images', ['os_hash_value'])

View File

@ -468,6 +468,10 @@ def _make_conditions_from_filters(filters, is_public=None):
checksum = filters.pop('checksum')
image_conditions.append(models.Image.checksum == checksum)
if 'os_hash_value' in filters:
os_hash_value = filters.pop('os_hash_value')
image_conditions.append(models.Image.os_hash_value == os_hash_value)
for (k, v) in filters.pop('properties', {}).items():
prop_filters = _make_image_property_condition(key=k, value=v)
prop_conditions.append(prop_filters)

View File

@ -120,7 +120,8 @@ class Image(BASE, GlanceBase):
Index('owner_image_idx', 'owner'),
Index('created_at_image_idx', 'created_at'),
Index('updated_at_image_idx', 'updated_at'),
Index('os_hidden_image_idx', 'os_hidden'))
Index('os_hidden_image_idx', 'os_hidden'),
Index('os_hash_value_image_idx', 'os_hash_value'))
id = Column(String(36), primary_key=True,
default=lambda: str(uuid.uuid4()))
@ -134,6 +135,8 @@ class Image(BASE, GlanceBase):
name='image_visibility'), nullable=False,
server_default='shared')
checksum = Column(String(32))
os_hash_algo = Column(String(64))
os_hash_value = Column(String(128))
min_disk = Column(Integer, nullable=False, default=0)
min_ram = Column(Integer, nullable=False, default=0)
owner = Column(String(255))

View File

@ -48,7 +48,8 @@ def _import_delayed_delete():
class ImageFactory(object):
_readonly_properties = ['created_at', 'updated_at', 'status', 'checksum',
'size', 'virtual_size']
'os_hash_algo', 'os_hash_value', 'size',
'virtual_size']
_reserved_properties = ['owner', 'locations', 'deleted', 'deleted_at',
'direct_url', 'self', 'file', 'schema']
@ -127,6 +128,8 @@ class Image(object):
self.protected = kwargs.pop('protected', False)
self.locations = kwargs.pop('locations', [])
self.checksum = kwargs.pop('checksum', None)
self.os_hash_algo = kwargs.pop('os_hash_algo', None)
self.os_hash_value = kwargs.pop('os_hash_value', None)
self.owner = kwargs.pop('owner', None)
self._disk_format = kwargs.pop('disk_format', None)
self._container_format = kwargs.pop('container_format', None)

View File

@ -175,6 +175,8 @@ class Image(object):
os_hidden = _proxy('base', 'os_hidden')
locations = _proxy('base', 'locations')
checksum = _proxy('base', 'checksum')
os_hash_algo = _proxy('base', 'os_hash_algo')
os_hash_value = _proxy('base', 'os_hash_value')
owner = _proxy('base', 'owner')
disk_format = _proxy('base', 'disk_format')
container_format = _proxy('base', 'container_format')

View File

@ -428,12 +428,19 @@ class ImageProxy(glance.domain.proxy.Image):
else:
verifier = None
location, size, checksum, loc_meta = self.store_api.add_to_backend(
hashing_algo = CONF['hashing_algorithm']
(location,
size,
checksum,
multihash,
loc_meta) = self.store_api.add_to_backend_with_multihash(
CONF,
self.image.image_id,
utils.LimitingReader(utils.CooperativeReader(data),
CONF.image_size_cap),
size,
hashing_algo,
context=self.context,
verifier=verifier)
@ -454,6 +461,8 @@ class ImageProxy(glance.domain.proxy.Image):
'status': 'active'}]
self.image.size = size
self.image.checksum = checksum
self.image.os_hash_value = multihash
self.image.os_hash_algo = hashing_algo
self.image.status = 'active'
def get_data(self, offset=0, chunk_size=None):
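As the diff above shows, glance_store's add_to_backend_with_multihash() takes the configured hashing_algo and returns the multihash hexdigest alongside the legacy MD5 checksum. Conceptually, the store updates both digests as it streams chunks to the backend; a simplified sketch of that idea (illustrative only, not the glance_store implementation):

    import hashlib

    def write_with_hashes(chunks, write, hashing_algo='sha512'):
        md5 = hashlib.md5()                     # legacy 'checksum' property
        multihash = hashlib.new(hashing_algo)   # 'os_hash_value' property
        size = 0
        for chunk in chunks:
            write(chunk)             # hand the bytes to the backend store
            md5.update(chunk)
            multihash.update(chunk)
            size += len(chunk)
        return size, md5.hexdigest(), multihash.hexdigest()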

View File

@ -0,0 +1,41 @@
# Copyright (c) 2018 Verizon Wireless
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from oslo_db.sqlalchemy import test_base
from oslo_db.sqlalchemy import utils as db_utils
from glance.tests.functional.db import test_migrations
class TestRockyExpand02Mixin(test_migrations.AlembicMigrationsMixin):
def _get_revisions(self, config):
return test_migrations.AlembicMigrationsMixin._get_revisions(
self, config, head='rocky_expand02')
def _pre_upgrade_rocky_expand02(self, engine):
images = db_utils.get_table(engine, 'images')
self.assertNotIn('os_hash_algo', images.c)
self.assertNotIn('os_hash_value', images.c)
def _check_rocky_expand02(self, engine, data):
images = db_utils.get_table(engine, 'images')
self.assertIn('os_hash_algo', images.c)
self.assertTrue(images.c.os_hash_algo.nullable)
self.assertIn('os_hash_value', images.c)
self.assertTrue(images.c.os_hash_value.nullable)
class TestRockyExpand02MySQL(TestRockyExpand02Mixin,
test_base.MySQLOpportunisticTestCase):
pass

View File

@ -13,6 +13,7 @@
# License for the specific language governing permissions and limitations
# under the License.
import hashlib
import os
import signal
import uuid
@ -158,6 +159,8 @@ class TestImages(functional.FunctionalTest):
u'container_format',
u'owner',
u'checksum',
u'os_hash_algo',
u'os_hash_value',
u'size',
u'virtual_size',
])
@ -186,23 +189,29 @@ class TestImages(functional.FunctionalTest):
self.assertEqual(1, len(images))
self.assertEqual(image_id, images[0]['id'])
def _verify_image_checksum_and_status(checksum=None, status=None):
# Checksum should be populated and status should be active
def _verify_image_hashes_and_status(
checksum=None, os_hash_value=None, status=None):
path = self._url('/v2/images/%s' % image_id)
response = requests.get(path, headers=self._headers())
self.assertEqual(http.OK, response.status_code)
image = jsonutils.loads(response.text)
self.assertEqual(checksum, image['checksum'])
if os_hash_value:
# make sure we're using the hashing_algorithm we expect
self.assertEqual(six.text_type('sha512'),
image['os_hash_algo'])
self.assertEqual(os_hash_value, image['os_hash_value'])
self.assertEqual(status, image['status'])
# Upload some image data to staging area
path = self._url('/v2/images/%s/stage' % image_id)
headers = self._headers({'Content-Type': 'application/octet-stream'})
response = requests.put(path, headers=headers, data='ZZZZZ')
image_data = b'ZZZZZ'
response = requests.put(path, headers=headers, data=image_data)
self.assertEqual(http.NO_CONTENT, response.status_code)
# Verify image is in uploading state and checksum is None
_verify_image_checksum_and_status(status='uploading')
# Verify image is in uploading state, hashes are None
_verify_image_hashes_and_status(status='uploading')
# Import image to store
path = self._url('/v2/images/%s/import' % image_id)
@ -225,9 +234,11 @@ class TestImages(functional.FunctionalTest):
status='active',
max_sec=2,
delay_sec=0.2)
_verify_image_checksum_and_status(
checksum='8f113e38d28a79a5a451b16048cc2b72',
status='active')
expect_c = six.text_type(hashlib.md5(image_data).hexdigest())
expect_h = six.text_type(hashlib.sha512(image_data).hexdigest())
_verify_image_hashes_and_status(checksum=expect_c,
os_hash_value=expect_h,
status='active')
# Ensure the size is updated to reflect the data uploaded
path = self._url('/v2/images/%s' % image_id)
@ -300,6 +311,8 @@ class TestImages(functional.FunctionalTest):
u'container_format',
u'owner',
u'checksum',
u'os_hash_algo',
u'os_hash_value',
u'size',
u'virtual_size',
])
@ -328,17 +341,22 @@ class TestImages(functional.FunctionalTest):
self.assertEqual(1, len(images))
self.assertEqual(image_id, images[0]['id'])
def _verify_image_checksum_and_status(checksum=None, status=None):
# Checksum should be populated and status should be active
def _verify_image_hashes_and_status(
checksum=None, os_hash_value=None, status=None):
path = self._url('/v2/images/%s' % image_id)
response = requests.get(path, headers=self._headers())
self.assertEqual(http.OK, response.status_code)
image = jsonutils.loads(response.text)
self.assertEqual(checksum, image['checksum'])
if os_hash_value:
# make sure we're using the hashing_algorithm we expect
self.assertEqual(six.text_type('sha512'),
image['os_hash_algo'])
self.assertEqual(os_hash_value, image['os_hash_value'])
self.assertEqual(status, image['status'])
# Verify image is in queued state and checksum is None
_verify_image_checksum_and_status(status='queued')
# Verify image is in queued state and hashes are None
_verify_image_hashes_and_status(status='queued')
# Import image to store
path = self._url('/v2/images/%s/import' % image_id)
@ -346,10 +364,11 @@ class TestImages(functional.FunctionalTest):
'content-type': 'application/json',
'X-Roles': 'admin',
})
image_data_uri = ('https://www.openstack.org/assets/openstack-logo/'
'2016R/OpenStack-Logo-Horizontal.eps.zip')
data = jsonutils.dumps({'method': {
'name': 'web-download',
'uri': 'https://www.openstack.org/assets/openstack-logo/'
'2016R/OpenStack-Logo-Horizontal.eps.zip'
'uri': image_data_uri
}})
response = requests.post(path, headers=headers, data=data)
self.assertEqual(http.ACCEPTED, response.status_code)
@ -364,9 +383,12 @@ class TestImages(functional.FunctionalTest):
max_sec=20,
delay_sec=0.2,
start_delay_sec=1)
_verify_image_checksum_and_status(
checksum='bcd65f8922f61a9e6a20572ad7aa2bdd',
status='active')
with requests.get(image_data_uri) as r:
expect_c = six.text_type(hashlib.md5(r.content).hexdigest())
expect_h = six.text_type(hashlib.sha512(r.content).hexdigest())
_verify_image_hashes_and_status(checksum=expect_c,
os_hash_value=expect_h,
status='active')
# Deleting image should work
path = self._url('/v2/images/%s' % image_id)
@ -428,6 +450,8 @@ class TestImages(functional.FunctionalTest):
u'container_format',
u'owner',
u'checksum',
u'os_hash_algo',
u'os_hash_value',
u'size',
u'virtual_size',
u'locations',
@ -493,6 +517,8 @@ class TestImages(functional.FunctionalTest):
u'container_format',
u'owner',
u'checksum',
u'os_hash_algo',
u'os_hash_value',
u'size',
u'virtual_size',
u'locations',
@ -722,23 +748,28 @@ class TestImages(functional.FunctionalTest):
response = requests.get(path, headers=headers)
self.assertEqual(http.NO_CONTENT, response.status_code)
def _verify_image_checksum_and_status(checksum, status):
# Checksum should be populated and status should be active
def _verify_image_hashes_and_status(checksum, os_hash_value, status):
# hashes should be populated and status should be active
path = self._url('/v2/images/%s' % image_id)
response = requests.get(path, headers=self._headers())
self.assertEqual(http.OK, response.status_code)
image = jsonutils.loads(response.text)
self.assertEqual(checksum, image['checksum'])
# make sure we're using the default algo
self.assertEqual(six.text_type('sha512'), image['os_hash_algo'])
self.assertEqual(os_hash_value, image['os_hash_value'])
self.assertEqual(status, image['status'])
# Upload some image data
path = self._url('/v2/images/%s/file' % image_id)
headers = self._headers({'Content-Type': 'application/octet-stream'})
response = requests.put(path, headers=headers, data='ZZZZZ')
image_data = b'ZZZZZ'
response = requests.put(path, headers=headers, data=image_data)
self.assertEqual(http.NO_CONTENT, response.status_code)
expected_checksum = '8f113e38d28a79a5a451b16048cc2b72'
_verify_image_checksum_and_status(expected_checksum, 'active')
expect_c = six.text_type(hashlib.md5(image_data).hexdigest())
expect_h = six.text_type(hashlib.sha512(image_data).hexdigest())
_verify_image_hashes_and_status(expect_c, expect_h, 'active')
# `disk_format` and `container_format` cannot
# be replaced when the image is active.
@ -757,7 +788,7 @@ class TestImages(functional.FunctionalTest):
path = self._url('/v2/images/%s/file' % image_id)
response = requests.get(path, headers=self._headers())
self.assertEqual(http.OK, response.status_code)
self.assertEqual(expected_checksum, response.headers['Content-MD5'])
self.assertEqual(expect_c, response.headers['Content-MD5'])
self.assertEqual('ZZZZZ', response.text)
# Uploading duplicate data should be rejected with a 409. The
@ -766,7 +797,7 @@ class TestImages(functional.FunctionalTest):
headers = self._headers({'Content-Type': 'application/octet-stream'})
response = requests.put(path, headers=headers, data='XXX')
self.assertEqual(http.CONFLICT, response.status_code)
_verify_image_checksum_and_status(expected_checksum, 'active')
_verify_image_hashes_and_status(expect_c, expect_h, 'active')
# Ensure the size is updated to reflect the data uploaded
path = self._url('/v2/images/%s' % image_id)
@ -944,6 +975,8 @@ class TestImages(functional.FunctionalTest):
u'container_format',
u'owner',
u'checksum',
u'os_hash_algo',
u'os_hash_value',
u'size',
u'virtual_size',
u'locations',
@ -1009,6 +1042,8 @@ class TestImages(functional.FunctionalTest):
u'container_format',
u'owner',
u'checksum',
u'os_hash_algo',
u'os_hash_value',
u'size',
u'virtual_size',
u'locations',

View File

@ -38,6 +38,8 @@ class TestSchemas(functional.FunctionalTest):
'name',
'visibility',
'checksum',
'os_hash_algo',
'os_hash_value',
'created_at',
'updated_at',
'tags',

View File

@ -15,6 +15,7 @@
# under the License.
import collections
import hashlib
import os.path
import mock
@ -66,6 +67,8 @@ class ImageStub(object):
self.status = status
self.extra_properties = extra_properties
self.checksum = 'c2e5db72bd7fd153f53ede5da5a06de3'
self.os_hash_algo = 'sha512'
self.os_hash_value = hashlib.sha512(b'glance').hexdigest()
self.created_at = '2013-09-28T15:27:36Z'
self.updated_at = '2013-09-28T15:27:37Z'
self.locations = []

View File

@ -238,6 +238,30 @@ class FakeStoreAPI(object):
checksum = 'Z'
return (image_id, size, checksum, self.store_metadata)
def add_to_backend_with_multihash(
self, conf, image_id, data, size, hashing_algo,
scheme=None, context=None, verifier=None):
store_max_size = 7
current_store_size = 2
for location in self.data.keys():
if image_id in location:
raise exception.Duplicate()
if not size:
# 'data' is a string wrapped in a LimitingReader|CooperativeReader
# pipeline, so peek under the hood of those objects to get at the
# string itself.
size = len(data.data.fd)
if (current_store_size + size) > store_max_size:
raise exception.StorageFull()
if context.user == USER2:
raise exception.Forbidden()
if context.user == USER3:
raise exception.StorageWriteDenied()
self.data[image_id] = (data, size)
checksum = 'Z'
multihash = 'ZZ'
return (image_id, size, checksum, multihash, self.store_metadata)
def check_location_metadata(self, val, key=''):
store.check_location_metadata(val)

View File

@ -15,6 +15,7 @@
import datetime
import eventlet
import hashlib
import uuid
import glance_store as store
@ -56,6 +57,10 @@ TENANT4 = 'c6c87f25-8a94-47ed-8c83-053c25f42df4'
CHKSUM = '93264c3edf5972c9f1cb309543d38a5c'
CHKSUM1 = '43254c3edf6972c9f1cb309543d38a8c'
FAKEHASHALGO = 'fake-name-for-sha512'
MULTIHASH1 = hashlib.sha512(b'glance').hexdigest()
MULTIHASH2 = hashlib.sha512(b'image_service').hexdigest()
def _db_fixture(id, **kwargs):
obj = {
@ -64,6 +69,8 @@ def _db_fixture(id, **kwargs):
'visibility': 'shared',
'properties': {},
'checksum': None,
'os_hash_algo': FAKEHASHALGO,
'os_hash_value': None,
'owner': None,
'status': 'queued',
'tags': [],
@ -87,6 +94,8 @@ def _domain_fixture(id, **kwargs):
'name': None,
'visibility': 'private',
'checksum': None,
'os_hash_algo': None,
'os_hash_value': None,
'owner': None,
'status': 'queued',
'size': None,
@ -149,6 +158,7 @@ class TestImagesController(base.IsolatedUnitTest):
def _create_images(self):
self.images = [
_db_fixture(UUID1, owner=TENANT1, checksum=CHKSUM,
os_hash_algo=FAKEHASHALGO, os_hash_value=MULTIHASH1,
name='1', size=256, virtual_size=1024,
visibility='public',
locations=[{'url': '%s/%s' % (BASE_URI, UUID1),
@ -157,6 +167,7 @@ class TestImagesController(base.IsolatedUnitTest):
container_format='bare',
status='active'),
_db_fixture(UUID2, owner=TENANT1, checksum=CHKSUM1,
os_hash_algo=FAKEHASHALGO, os_hash_value=MULTIHASH2,
name='2', size=512, virtual_size=2048,
visibility='public',
disk_format='raw',
@ -166,6 +177,7 @@ class TestImagesController(base.IsolatedUnitTest):
properties={'hypervisor_type': 'kvm', 'foo': 'bar',
'bar': 'foo'}),
_db_fixture(UUID3, owner=TENANT3, checksum=CHKSUM1,
os_hash_algo=FAKEHASHALGO, os_hash_value=MULTIHASH2,
name='3', size=512, virtual_size=2048,
visibility='public', tags=['windows', '64bit', 'x86']),
_db_fixture(UUID4, owner=TENANT4, name='4',
@ -291,6 +303,34 @@ class TestImagesController(base.IsolatedUnitTest):
output = self.controller.index(req, filters={'checksum': '236231827'})
self.assertEqual(0, len(output['images']))
def test_index_with_os_hash_value_filter_single_image(self):
req = unit_test_utils.get_fake_request(
'/images?os_hash_value=%s' % MULTIHASH1)
output = self.controller.index(req,
filters={'os_hash_value': MULTIHASH1})
self.assertEqual(1, len(output['images']))
actual = list([image.image_id for image in output['images']])
expected = [UUID1]
self.assertEqual(expected, actual)
def test_index_with_os_hash_value_filter_multiple_images(self):
req = unit_test_utils.get_fake_request(
'/images?os_hash_value=%s' % MULTIHASH2)
output = self.controller.index(req,
filters={'os_hash_value': MULTIHASH2})
self.assertEqual(2, len(output['images']))
actual = list([image.image_id for image in output['images']])
expected = [UUID3, UUID2]
self.assertEqual(expected, actual)
def test_index_with_non_existent_os_hash_value(self):
fake_hash_value = hashlib.sha512(b'not_used_in_fixtures').hexdigest()
req = unit_test_utils.get_fake_request(
'/images?os_hash_value=%s' % fake_hash_value)
output = self.controller.index(req,
filters={'os_hash_value': fake_hash_value})
self.assertEqual(0, len(output['images']))
def test_index_size_max_filter(self):
request = unit_test_utils.get_fake_request('/images?size_max=512')
output = self.controller.index(request, filters={'size_max': 512})
@ -2776,6 +2816,8 @@ class TestImagesDeserializer(test_utils.BaseTestCase):
'id': '00000000-0000-0000-0000-000000000000',
'status': 'active',
'checksum': 'abcdefghijklmnopqrstuvwxyz012345',
'os_hash_algo': 'supersecure',
'os_hash_value': 'a' * 32 + 'b' * 32 + 'c' * 32 + 'd' * 32,
'size': 9001,
'virtual_size': 9001,
'created_at': ISOTIME,
@ -3435,7 +3477,9 @@ class TestImagesSerializer(test_utils.BaseTestCase):
visibility='public', container_format='ami',
tags=['one', 'two'], disk_format='ami',
min_ram=128, min_disk=10,
checksum='ca425b88f047ce8ec45ee90e813ada91'),
checksum='ca425b88f047ce8ec45ee90e813ada91',
os_hash_algo=FAKEHASHALGO,
os_hash_value=MULTIHASH1),
# NOTE(bcwaldon): This second fixture depends on default behavior
# and sets most values to None
@ -3456,6 +3500,8 @@ class TestImagesSerializer(test_utils.BaseTestCase):
'size': 1024,
'virtual_size': 3072,
'checksum': 'ca425b88f047ce8ec45ee90e813ada91',
'os_hash_algo': FAKEHASHALGO,
'os_hash_value': MULTIHASH1,
'container_format': 'ami',
'disk_format': 'ami',
'min_ram': 128,
@ -3485,6 +3531,8 @@ class TestImagesSerializer(test_utils.BaseTestCase):
'min_ram': None,
'min_disk': None,
'checksum': None,
'os_hash_algo': None,
'os_hash_value': None,
'disk_format': None,
'virtual_size': None,
'container_format': None,
@ -3564,6 +3612,8 @@ class TestImagesSerializer(test_utils.BaseTestCase):
'size': 1024,
'virtual_size': 3072,
'checksum': 'ca425b88f047ce8ec45ee90e813ada91',
'os_hash_algo': FAKEHASHALGO,
'os_hash_value': MULTIHASH1,
'container_format': 'ami',
'disk_format': 'ami',
'min_ram': 128,
@ -3601,6 +3651,8 @@ class TestImagesSerializer(test_utils.BaseTestCase):
'min_ram': None,
'min_disk': None,
'checksum': None,
'os_hash_algo': None,
'os_hash_value': None,
'disk_format': None,
'virtual_size': None,
'container_format': None,
@ -3621,6 +3673,8 @@ class TestImagesSerializer(test_utils.BaseTestCase):
'size': 1024,
'virtual_size': 3072,
'checksum': 'ca425b88f047ce8ec45ee90e813ada91',
'os_hash_algo': FAKEHASHALGO,
'os_hash_value': MULTIHASH1,
'container_format': 'ami',
'disk_format': 'ami',
'min_ram': 128,
@ -3687,6 +3741,8 @@ class TestImagesSerializer(test_utils.BaseTestCase):
'size': 1024,
'virtual_size': 3072,
'checksum': 'ca425b88f047ce8ec45ee90e813ada91',
'os_hash_algo': FAKEHASHALGO,
'os_hash_value': MULTIHASH1,
'container_format': 'ami',
'disk_format': 'ami',
'min_ram': 128,
@ -3733,6 +3789,8 @@ class TestImagesSerializerWithUnicode(test_utils.BaseTestCase):
'min_ram': 128,
'min_disk': 10,
'checksum': u'ca425b88f047ce8ec45ee90e813ada91',
'os_hash_algo': FAKEHASHALGO,
'os_hash_value': MULTIHASH1,
'extra_properties': {'lang': u'Fran\u00E7ais',
u'dispos\u00E9': u'f\u00E2ch\u00E9'},
}),
@ -3752,6 +3810,8 @@ class TestImagesSerializerWithUnicode(test_utils.BaseTestCase):
u'size': 1024,
u'virtual_size': 3072,
u'checksum': u'ca425b88f047ce8ec45ee90e813ada91',
u'os_hash_algo': six.text_type(FAKEHASHALGO),
u'os_hash_value': six.text_type(MULTIHASH1),
u'container_format': u'ami',
u'disk_format': u'ami',
u'min_ram': 128,
@ -3790,6 +3850,8 @@ class TestImagesSerializerWithUnicode(test_utils.BaseTestCase):
u'size': 1024,
u'virtual_size': 3072,
u'checksum': u'ca425b88f047ce8ec45ee90e813ada91',
u'os_hash_algo': six.text_type(FAKEHASHALGO),
u'os_hash_value': six.text_type(MULTIHASH1),
u'container_format': u'ami',
u'disk_format': u'ami',
u'min_ram': 128,
@ -3822,6 +3884,8 @@ class TestImagesSerializerWithUnicode(test_utils.BaseTestCase):
u'size': 1024,
u'virtual_size': 3072,
u'checksum': u'ca425b88f047ce8ec45ee90e813ada91',
u'os_hash_algo': six.text_type(FAKEHASHALGO),
u'os_hash_value': six.text_type(MULTIHASH1),
u'container_format': u'ami',
u'disk_format': u'ami',
u'min_ram': 128,
@ -3856,6 +3920,8 @@ class TestImagesSerializerWithUnicode(test_utils.BaseTestCase):
u'size': 1024,
u'virtual_size': 3072,
u'checksum': u'ca425b88f047ce8ec45ee90e813ada91',
u'os_hash_algo': six.text_type(FAKEHASHALGO),
u'os_hash_value': six.text_type(MULTIHASH1),
u'container_format': u'ami',
u'disk_format': u'ami',
u'min_ram': 128,
@ -3895,6 +3961,7 @@ class TestImagesSerializerWithExtendedSchema(test_utils.BaseTestCase):
self.fixture = _domain_fixture(
UUID2, name='image-2', owner=TENANT2,
checksum='ca425b88f047ce8ec45ee90e813ada91',
os_hash_algo=FAKEHASHALGO, os_hash_value=MULTIHASH1,
created_at=DATETIME, updated_at=DATETIME, size=1024,
virtual_size=3072, extra_properties=props)
@ -3907,6 +3974,8 @@ class TestImagesSerializerWithExtendedSchema(test_utils.BaseTestCase):
'protected': False,
'os_hidden': False,
'checksum': 'ca425b88f047ce8ec45ee90e813ada91',
'os_hash_algo': FAKEHASHALGO,
'os_hash_value': MULTIHASH1,
'tags': [],
'size': 1024,
'virtual_size': 3072,
@ -3936,6 +4005,8 @@ class TestImagesSerializerWithExtendedSchema(test_utils.BaseTestCase):
'protected': False,
'os_hidden': False,
'checksum': 'ca425b88f047ce8ec45ee90e813ada91',
'os_hash_algo': FAKEHASHALGO,
'os_hash_value': MULTIHASH1,
'tags': [],
'size': 1024,
'virtual_size': 3072,
@ -3964,6 +4035,7 @@ class TestImagesSerializerWithAdditionalProperties(test_utils.BaseTestCase):
self.fixture = _domain_fixture(
UUID2, name='image-2', owner=TENANT2,
checksum='ca425b88f047ce8ec45ee90e813ada91',
os_hash_algo=FAKEHASHALGO, os_hash_value=MULTIHASH1,
created_at=DATETIME, updated_at=DATETIME, size=1024,
virtual_size=3072, extra_properties={'marx': 'groucho'})
@ -3977,6 +4049,8 @@ class TestImagesSerializerWithAdditionalProperties(test_utils.BaseTestCase):
'protected': False,
'os_hidden': False,
'checksum': 'ca425b88f047ce8ec45ee90e813ada91',
'os_hash_algo': FAKEHASHALGO,
'os_hash_value': MULTIHASH1,
'marx': 'groucho',
'tags': [],
'size': 1024,
@ -4012,6 +4086,8 @@ class TestImagesSerializerWithAdditionalProperties(test_utils.BaseTestCase):
'protected': False,
'os_hidden': False,
'checksum': 'ca425b88f047ce8ec45ee90e813ada91',
'os_hash_algo': FAKEHASHALGO,
'os_hash_value': MULTIHASH1,
'marx': 123,
'tags': [],
'size': 1024,
@ -4042,6 +4118,8 @@ class TestImagesSerializerWithAdditionalProperties(test_utils.BaseTestCase):
'protected': False,
'os_hidden': False,
'checksum': 'ca425b88f047ce8ec45ee90e813ada91',
'os_hash_algo': FAKEHASHALGO,
'os_hash_value': MULTIHASH1,
'tags': [],
'size': 1024,
'virtual_size': 3072,

View File

@ -33,7 +33,8 @@ class TestSchemasController(test_utils.BaseTestCase):
'disk_format', 'updated_at', 'visibility', 'self',
'file', 'container_format', 'schema', 'id', 'size',
'direct_url', 'min_ram', 'min_disk', 'protected',
'locations', 'owner', 'virtual_size', 'os_hidden'])
'locations', 'owner', 'virtual_size', 'os_hidden',
'os_hash_algo', 'os_hash_value'])
self.assertEqual(expected, set(output['properties'].keys()))
def test_image_has_correct_statuses(self):

View File

@ -35,7 +35,7 @@ future==0.16.0
futurist==1.2.0
gitdb2==2.0.3
GitPython==2.1.8
glance-store==0.22.0
glance-store==0.26.1
greenlet==0.4.13
hacking==0.12.0
httplib2==0.9.1

View File

@ -0,0 +1,55 @@
---
features:
- |
This release implements the Glance spec `Secure Hash Algorithm Support
<https://specs.openstack.org/openstack/glance-specs/specs/rocky/approved/glance/multihash.html>`_
(also known as "multihash"). This feature supplements the current
'checksum' image property with a self-describing secure hash. The
self-description consists of two new image properties:
* ``os_hash_algo`` - this contains the name of the secure hash algorithm
used to generate the value on this image
* ``os_hash_value`` - this is the hexdigest computed by applying the
secure hash algorithm named in the ``os_hash_algo`` property to the
image data
These are read-only image properties and are not user-modifiable.
The secure hash algorithm used is an operator-configurable setting. See
the help text for 'hashing_algorithm' in the sample Glance configuration
file for more information.
The default secure hash algorithm is SHA-512. It should be suitable for
most applications.
The legacy 'checksum' image property, which provides an MD5 message
digest of the image data, is preserved for backward compatibility.
issues:
- |
The ``os_hash_value`` image property, introduced as part of the
`Secure Hash Algorithm Support
<https://specs.openstack.org/openstack/glance-specs/specs/rocky/approved/glance/multihash.html>`_
("multihash") feature, is limited to 128 characters. This is sufficient
to store 512 bits as a hexadecimal numeral.
- |
The "multihash" implemented in this release (`Secure Hash Algorithm Support
<https://specs.openstack.org/openstack/glance-specs/specs/rocky/approved/glance/multihash.html>`_)
is computed only for new images. There is no provision for computing
the multihash for existing images. Thus, users should expect to see
JSON 'null' values for the ``os_hash_algo`` and ``os_hash_value`` image
properties on images created prior to the installation of the Rocky
release at your site.
security:
- |
This release implements the Glance spec `Secure Hash Algorithm Support
<https://specs.openstack.org/openstack/glance-specs/specs/rocky/approved/glance/multihash.html>`_,
which introduces a self-describing "multihash" to the image-show response.
This feature supplements the current 'checksum' image property with a
self-describing secure hash. The default hashing algorithm is SHA-512,
which is currently considered secure. In the event that algorithm is
compromised, you will immediately be able to begin using a different
algorithm (as long as it's supported by the Python 'hashlib' library and
has output that fits in 128 characters) by modifying the value of the
'hashing_algorithm' configuration option and either restarting or issuing
a SIGHUP to Glance.

View File

@ -46,7 +46,7 @@ retrying!=1.3.0,>=1.2.3 # Apache-2.0
osprofiler>=1.4.0 # Apache-2.0
# Glance Store
glance-store>=0.22.0 # Apache-2.0
glance-store>=0.26.1 # Apache-2.0
debtcollector>=1.2.0 # Apache-2.0