Flatten the v2 API

This is a squashed commit of work done by Doug and myself.
Thanks Doug!

Author: Angus Salkeld <asalkeld@redhat.com>

    Add a Statistics class
    Note this is a bit different from the spec
    (http://wiki.openstack.org/Ceilometer/blueprints/APIv2)
    as wsme doesn't really like different types being returned
    from the same method.

    I have:
    GET /v2/meters/<meter> - raw samples
    GET /v2/meters/<meter>/statistics - for the stats

    Make the error reporting better for invalid fields
    Try to protect against passing the wrong arguments into the db api.
    Also, get_resources() takes start/stop_timestamp, not start/stop.
    Fix most of the duration test cases (overlapping ones are still broken)
    Add some log messages to warn of unimplemented queries
    Fix the start/end timestamp passed into calc_duration()
    Make the query op default to 'eq'
    Fix v2 event list paths
    Remove v2 list projects tests
    Re-Add the duration
    Implement get_meter_statistics() for sqlalchemy.
    Add tests for get_meter_statistics()
    Fix the latest pep8 1.4 issues
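One of the items above, "Make the query op default to 'eq'", is implemented in the diff with a wsme.wsproperty because a plain wsattr default did not work. A minimal, hypothetical sketch of that fallback behaviour (the class below is a simplified stand-in, not the actual wsme type):

```python
class Query(object):
    """Simplified stand-in for the API Query type."""

    def __init__(self, field, value, op=None):
        self.field = field
        self.value = value
        self._op = op

    @property
    def op(self):
        # Fall back to equality when no operator was supplied,
        # mirroring the get_op()/set_op() wsproperty in the diff.
        return self._op or 'eq'

    @op.setter
    def op(self, value):
        self._op = value
```

A query built without an operator therefore behaves as an equality test, which is what the API tests rely on.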

Author: Doug Hellmann <doug.hellmann@dreamhost.com>
    fixme comment
    Fix duration calculation
    fix event listing tests
    remove obsolete list tests
    update resource listing tests
    remove obsolete list tests
    fix max statistics tests for projects
    fix max tests for resource queries
    fix tests for stats using project filter
    Fix sum tests for resource queries
    Fix the statistics calculation in the mongo driver to handle getting
    no data back from the query.
    Update the queries in the test code.
    enable logging for wsme in the tests to help with debugging
    always include all query fields to keep values aligned
    only include the start and end timestamp keywords wanted by the EventFilter
    update url used in acl tests
    update tests for listing meters
    convert prints to logging calls and add a few todo/fixme notes
    add some debugging and error checking to _query_to_kwargs
    add q argument to get_json() to make it easier to pass queries to the service
    do not stub out controller we have deleted
    fix whitespace issues to make pep8 happy

Change-Id: I1b9a4c26fb8cc74ae1a002f93b84db05d0b20192
Blueprint: api-aggregate-average
Blueprint: api-server-pecan-wsme
This commit is contained in:
Angus Salkeld 2013-01-16 22:46:12 +11:00
parent 25405d2e63
commit f3bc7d0109
23 changed files with 1117 additions and 1046 deletions


@ -3,6 +3,7 @@
# Copyright © 2012 New Dream Network, LLC (DreamHost)
#
# Author: Doug Hellmann <doug.hellmann@dreamhost.com>
# Angus Salkeld <asalkeld@redhat.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@ -18,75 +19,26 @@
"""Version 2 of the API.
"""
# [ ] / -- information about this version of the API
# [GET ] / -- information about this version of the API
#
# [ ] /extensions -- list of available extensions
# [ ] /extensions/<extension> -- details about a specific extension
# [GET ] /resources -- list the resources
# [GET ] /resources/<resource> -- information about the resource
# [GET ] /meters -- list the meters
# [POST ] /meters -- insert a new sample (and meter/resource if needed)
# [GET ] /meters/<meter> -- list the samples for this meter
# [PUT ] /meters/<meter> -- update the meter (not the samples)
# [DELETE] /meters/<meter> -- delete the meter and samples
#
# [ ] /sources -- list of known sources (where do we get this?)
# [ ] /sources/components -- list of components which provide metering
# data (where do we get this)?
#
# [x] /projects/<project>/resources -- list of resource ids
# [x] /resources -- list of resource ids
# [x] /sources/<source>/resources -- list of resource ids
# [x] /users/<user>/resources -- list of resource ids
#
# [x] /users -- list of user ids
# [x] /sources/<source>/users -- list of user ids
#
# [x] /projects -- list of project ids
# [x] /sources/<source>/projects -- list of project ids
#
# [ ] /resources/<resource> -- metadata
#
# [ ] /projects/<project>/meters -- list of meters reporting for parent obj
# [ ] /resources/<resource>/meters -- list of meters reporting for parent obj
# [ ] /sources/<source>/meters -- list of meters reporting for parent obj
# [ ] /users/<user>/meters -- list of meters reporting for parent obj
#
# [x] /projects/<project>/meters/<meter> -- events
# [x] /resources/<resource>/meters/<meter> -- events
# [x] /sources/<source>/meters/<meter> -- events
# [x] /users/<user>/meters/<meter> -- events
#
# [ ] /projects/<project>/meters/<meter>/duration -- total time for selected
# meter
# [x] /resources/<resource>/meters/<meter>/duration -- total time for selected
# meter
# [ ] /sources/<source>/meters/<meter>/duration -- total time for selected
# meter
# [ ] /users/<user>/meters/<meter>/duration -- total time for selected meter
#
# [ ] /projects/<project>/meters/<meter>/volume -- total or max volume for
# selected meter
# [x] /projects/<project>/meters/<meter>/volume/max -- max volume for
# selected meter
# [x] /projects/<project>/meters/<meter>/volume/sum -- total volume for
# selected meter
# [ ] /resources/<resource>/meters/<meter>/volume -- total or max volume for
# selected meter
# [x] /resources/<resource>/meters/<meter>/volume/max -- max volume for
# selected meter
# [x] /resources/<resource>/meters/<meter>/volume/sum -- total volume for
# selected meter
# [ ] /sources/<source>/meters/<meter>/volume -- total or max volume for
# selected meter
# [ ] /users/<user>/meters/<meter>/volume -- total or max volume for selected
# meter
import datetime
import os
import inspect
import pecan
from pecan import request
from pecan.rest import RestController
import wsme
import wsmeext.pecan as wsme_pecan
from wsme.types import Base, text, wsattr
from wsme.types import Base, text, Enum
from ceilometer.openstack.common import jsonutils
from ceilometer.openstack.common import log as logging
from ceilometer.openstack.common import timeutils
from ceilometer import storage
@ -95,148 +47,131 @@ from ceilometer import storage
LOG = logging.getLogger(__name__)
# FIXME(dhellmann): Change APIs that use this to return float?
class MeterVolume(Base):
volume = wsattr(float, mandatory=False)
def __init__(self, volume, **kw):
if volume is not None:
volume = float(volume)
super(MeterVolume, self).__init__(volume=volume, **kw)
operation_kind = Enum(str, 'lt', 'le', 'eq', 'ne', 'ge', 'gt')
class DateRange(Base):
start = datetime.datetime
end = datetime.datetime
search_offset = int
class Query(Base):
def get_op(self):
return self._op or 'eq'
def __init__(self, start=None, end=None, search_offset=0):
if start is not None:
start = start.replace(tzinfo=None)
if end is not None:
end = end.replace(tzinfo=None)
super(DateRange, self).__init__(start=start,
end=end,
search_offset=search_offset,
)
def set_op(self, value):
self._op = value
@property
def query_start(self):
"""The timestamp the query should use to start, including
the search offset.
"""
if self.start is None:
return None
return (self.start -
datetime.timedelta(minutes=self.search_offset))
field = text
#op = wsme.wsattr(operation_kind, default='eq')
# this ^ doesn't seem to work.
op = wsme.wsproperty(operation_kind, get_op, set_op)
value = text
@property
def query_end(self):
"""The timestamp the query should use to end, including
the search offset.
"""
if self.end is None:
return None
return (self.end +
datetime.timedelta(minutes=self.search_offset))
def to_dict(self):
return {'query_start': self.query_start,
'query_end': self.query_end,
'start_timestamp': self.start,
'end_timestamp': self.end,
'search_offset': self.search_offset,
}
def __repr__(self):
# for logging calls
return '<Query %r %s %r>' % (self.field, self.op, self.value)
class MeterVolumeController(object):
@wsme_pecan.wsexpose(MeterVolume, DateRange)
def max(self, daterange=None):
"""Find the maximum volume for the matching meter events.
"""
if daterange is None:
daterange = DateRange()
q_ts = daterange.to_dict()
try:
meter = request.context['meter_id']
except KeyError:
raise ValueError('No meter specified')
resource = request.context.get('resource_id')
project = request.context.get('project_id')
# Query the database for the max volume
f = storage.EventFilter(meter=meter,
resource=resource,
start=q_ts['query_start'],
end=q_ts['query_end'],
project=project,
)
# TODO(sberler): do we want to return an error if the resource
# does not exist?
results = list(request.storage_conn.get_volume_max(f))
value = None
if results:
if resource:
# If the caller specified a resource there should only
# be one result.
value = results[0].get('value')
def _query_to_kwargs(query, db_func):
# TODO(dhellmann): This function needs tests of its own.
valid_keys = inspect.getargspec(db_func)[0]
if 'self' in valid_keys:
valid_keys.remove('self')
translation = {'user_id': 'user',
'project_id': 'project',
'resource_id': 'resource'}
stamp = {}
trans = {}
metaquery = {}
for i in query:
if i.field == 'timestamp':
# FIXME(dhellmann): This logic is not consistent with the
# way the timestamps are treated inside the mongo driver
# (the end timestamp is always tested using $lt). We
# should just pass a single timestamp through to the
# storage layer with the operator and let the storage
# layer use that operator.
if i.op in ('lt', 'le'):
stamp['end_timestamp'] = i.value
elif i.op in ('gt', 'ge'):
stamp['start_timestamp'] = i.value
else:
# FIXME(sberler): Currently get_volume_max is really
# always grouping by resource_id. We should add a new
# function in the storage driver that does not do this
# grouping (and potentially rename the existing one to
# get_volume_max_by_resource())
value = max(result.get('value') for result in results)
return MeterVolume(volume=value)
@wsme_pecan.wsexpose(MeterVolume, DateRange)
def sum(self, daterange=None):
"""Compute the total volume for the matching meter events.
"""
if daterange is None:
daterange = DateRange()
q_ts = daterange.to_dict()
try:
meter = request.context['meter_id']
except KeyError:
raise ValueError('No meter specified')
resource = request.context.get('resource_id')
project = request.context.get('project_id')
f = storage.EventFilter(meter=meter,
project=project,
start=q_ts['query_start'],
end=q_ts['query_end'],
resource=resource,
)
# TODO(sberler): do we want to return an error if the resource
# does not exist?
results = list(request.storage_conn.get_volume_sum(f))
value = None
if results:
if resource:
# If the caller specified a resource there should only
# be one result.
value = results[0].get('value')
LOG.warn('_query_to_kwargs ignoring %r unexpected op %r' %
(i.field, i.op))
else:
if i.op != 'eq':
LOG.warn('_query_to_kwargs ignoring %r unimplemented op %r' %
(i.field, i.op))
elif i.field == 'search_offset':
stamp['search_offset'] = i.value
elif i.field.startswith('metadata.'):
metaquery[i.field] = i.value
else:
# FIXME(sberler): Currently get_volume_max is really
# always grouping by resource_id. We should add a new
# function in the storage driver that does not do this
# grouping (and potentially rename the existing one to
# get_volume_max_by_resource())
value = sum(result.get('value') for result in results)
trans[translation.get(i.field, i.field)] = i.value
return MeterVolume(volume=value)
kwargs = {}
if metaquery and 'metaquery' in valid_keys:
kwargs['metaquery'] = metaquery
if stamp:
q_ts = _get_query_timestamps(stamp)
if 'start' in valid_keys:
kwargs['start'] = q_ts['query_start']
kwargs['end'] = q_ts['query_end']
elif 'start_timestamp' in valid_keys:
kwargs['start_timestamp'] = q_ts['query_start']
kwargs['end_timestamp'] = q_ts['query_end']
else:
raise wsme.exc.UnknownArgument('timestamp',
"not valid for this resource")
if trans:
for k in trans:
if k not in valid_keys:
raise wsme.exc.UnknownArgument(k,
"unrecognized query field")
kwargs[k] = trans[k]
return kwargs
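The translation step in _query_to_kwargs above can be sketched on its own. This is a hypothetical, simplified version: it keeps the user_id/project_id/resource_id renaming and the argspec validation, but drops the timestamp and metadata handling; it also uses inspect.getfullargspec (the Python 3 name for the getargspec call in the diff), and get_resources here is a dummy, not the storage driver's method:

```python
import inspect


def query_to_kwargs(query, db_func):
    """Translate (field, value) query pairs to the db function's kwargs."""
    valid_keys = [a for a in inspect.getfullargspec(db_func).args
                  if a != 'self']
    translation = {'user_id': 'user',
                   'project_id': 'project',
                   'resource_id': 'resource'}
    kwargs = {}
    for field, value in query:
        key = translation.get(field, field)
        if key not in valid_keys:
            # Reject fields the db function does not accept,
            # as the UnknownArgument check above does.
            raise ValueError('unrecognized query field %r' % field)
        kwargs[key] = value
    return kwargs


def get_resources(user=None, project=None, resource=None):
    """Dummy db function used only to demonstrate the argspec check."""
    return user, project, resource
```

The argspec inspection is what lets the same helper serve different storage methods with different keyword names.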
def _get_query_timestamps(args={}):
"""Return any optional timestamp information in the request.
Determine the desired range, if any, from the GET arguments. Set
up the query range using the specified offset.
[query_start ... start_timestamp ... end_timestamp ... query_end]
Returns a dictionary containing:
query_start: First timestamp to use for query
start_timestamp: start_timestamp parameter from request
query_end: Final timestamp to use for query
end_timestamp: end_timestamp parameter from request
search_offset: search_offset parameter from request
"""
search_offset = int(args.get('search_offset', 0))
start_timestamp = args.get('start_timestamp')
if start_timestamp:
start_timestamp = timeutils.parse_isotime(start_timestamp)
start_timestamp = start_timestamp.replace(tzinfo=None)
query_start = (start_timestamp -
datetime.timedelta(minutes=search_offset))
else:
query_start = None
end_timestamp = args.get('end_timestamp')
if end_timestamp:
end_timestamp = timeutils.parse_isotime(end_timestamp)
end_timestamp = end_timestamp.replace(tzinfo=None)
query_end = end_timestamp + datetime.timedelta(minutes=search_offset)
else:
query_end = None
return {'query_start': query_start,
'query_end': query_end,
'start_timestamp': start_timestamp,
'end_timestamp': end_timestamp,
'search_offset': search_offset,
}
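The range widening that _get_query_timestamps performs, [query_start ... start ... end ... query_end], reduces to subtracting and adding the search offset. A small self-contained sketch (the function name is mine, not from the diff):

```python
import datetime


def widen_range(start, end, search_offset_minutes):
    """Widen [start, end] by the search offset on both sides.

    Either endpoint may be None, in which case the corresponding
    query bound is also None (unbounded).
    """
    delta = datetime.timedelta(minutes=search_offset_minutes)
    query_start = start - delta if start is not None else None
    query_end = end + delta if end is not None else None
    return query_start, query_end
```

The offset widens the query window so that samples just outside the requested range can still inform duration clamping.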
def _flatten_metadata(metadata):
@ -248,7 +183,7 @@ def _flatten_metadata(metadata):
if type(v) not in set([list, dict, set]))
class Event(Base):
class Sample(Base):
source = text
counter_name = text
counter_type = text
@ -265,78 +200,37 @@ class Event(Base):
if counter_volume is not None:
counter_volume = float(counter_volume)
resource_metadata = _flatten_metadata(resource_metadata)
super(Event, self).__init__(counter_volume=counter_volume,
resource_metadata=resource_metadata,
**kwds)
super(Sample, self).__init__(counter_volume=counter_volume,
resource_metadata=resource_metadata,
**kwds)
class Duration(Base):
start_timestamp = datetime.datetime
end_timestamp = datetime.datetime
class Statistics(Base):
min = float
max = float
avg = float
sum = float
count = int
duration = float
duration_start = datetime.datetime
duration_end = datetime.datetime
def __init__(self, start_timestamp=None, end_timestamp=None, **kwds):
super(Statistics, self).__init__(**kwds)
self._update_duration(start_timestamp, end_timestamp)
class MeterController(RestController):
"""Manages operations on a single meter.
"""
volume = MeterVolumeController()
_custom_actions = {
'duration': ['GET'],
}
def __init__(self, meter_id):
request.context['meter_id'] = meter_id
self._id = meter_id
@wsme_pecan.wsexpose([Event], DateRange)
def get_all(self, daterange=None):
"""Return all events for the meter.
"""
if daterange is None:
daterange = DateRange()
f = storage.EventFilter(
user=request.context.get('user_id'),
project=request.context.get('project_id'),
start=daterange.query_start,
end=daterange.query_end,
resource=request.context.get('resource_id'),
meter=self._id,
source=request.context.get('source_id'),
)
return [Event(**e)
for e in request.storage_conn.get_raw_events(f)
]
@wsme_pecan.wsexpose(Duration, DateRange)
def duration(self, daterange=None):
"""Computes the duration of the meter events in the time range given.
"""
if daterange is None:
daterange = DateRange()
# Query the database for the interval of timestamps
# within the desired range.
f = storage.EventFilter(user=request.context.get('user_id'),
project=request.context.get('project_id'),
start=daterange.query_start,
end=daterange.query_end,
resource=request.context.get('resource_id'),
meter=self._id,
source=request.context.get('source_id'),
)
min_ts, max_ts = request.storage_conn.get_event_interval(f)
def _update_duration(self, start_timestamp, end_timestamp):
# "Clamp" the timestamps we return to the original time
# range, excluding the offset.
LOG.debug('start_timestamp %s, end_timestamp %s, min_ts %s, max_ts %s',
daterange.start, daterange.end, min_ts, max_ts)
if daterange.start and min_ts and min_ts < daterange.start:
min_ts = daterange.start
if (start_timestamp and
self.duration_start and
self.duration_start < start_timestamp):
self.duration_start = start_timestamp
LOG.debug('clamping min timestamp to range')
if daterange.end and max_ts and max_ts > daterange.end:
max_ts = daterange.end
if (end_timestamp and
self.duration_end and
self.duration_end > end_timestamp):
self.duration_end = end_timestamp
LOG.debug('clamping max timestamp to range')
# If we got valid timestamps back, compute a duration in minutes.
@ -349,23 +243,65 @@ class MeterController(RestController):
# If the timestamps are invalid, return None as a
# sentinal indicating that there is something "funny"
# about the range.
if min_ts and max_ts and (min_ts <= max_ts):
if (self.duration_start and
self.duration_end and
self.duration_start <= self.duration_end):
# Can't use timedelta.total_seconds() because
# it is not available in Python 2.6.
diff = max_ts - min_ts
duration = (diff.seconds + (diff.days * 24 * 60 ** 2)) / 60
diff = self.duration_end - self.duration_start
self.duration = (diff.seconds + (diff.days * 24 * 60 ** 2)) / 60
else:
min_ts = max_ts = duration = None
self.duration_start = self.duration_end = self.duration = None
return Duration(start_timestamp=min_ts,
end_timestamp=max_ts,
duration=duration,
)
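The duration arithmetic above avoids timedelta.total_seconds() because, as the comment notes, it is not available on Python 2.6. The equivalent computation in isolation (function name is mine):

```python
import datetime


def duration_minutes(start, end):
    """Duration between two datetimes in minutes.

    Rebuilds the total seconds from days + seconds instead of
    calling timedelta.total_seconds(), which Python 2.6 lacks.
    """
    diff = end - start
    return (diff.seconds + diff.days * 24 * 60 ** 2) / 60
```

Note that on Python 2 the trailing division is integer division, so sub-minute remainders are truncated; on Python 3 it yields a float.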
class MeterController(RestController):
"""Manages operations on a single meter.
"""
_custom_actions = {
'statistics': ['GET'],
}
def __init__(self, meter_id):
request.context['meter_id'] = meter_id
self._id = meter_id
@wsme_pecan.wsexpose([Sample], [Query])
def get_all(self, q=[]):
"""Return all events for the meter.
"""
kwargs = _query_to_kwargs(q, storage.EventFilter.__init__)
kwargs['meter'] = self._id
f = storage.EventFilter(**kwargs)
return [Sample(**e)
for e in request.storage_conn.get_raw_events(f)
]
@wsme_pecan.wsexpose(Statistics, [Query])
def statistics(self, q=[]):
"""Computes the statistics of the meter events in the time range given.
"""
kwargs = _query_to_kwargs(q, storage.EventFilter.__init__)
kwargs['meter'] = self._id
f = storage.EventFilter(**kwargs)
computed = request.storage_conn.get_meter_statistics(f)
# Find the original timestamp in the query to use for clamping
# the duration returned in the statistics.
start = end = None
for i in q:
if i.field == 'timestamp' and i.op in ('lt', 'le'):
end = timeutils.parse_isotime(i.value).replace(tzinfo=None)
elif i.field == 'timestamp' and i.op in ('gt', 'ge'):
start = timeutils.parse_isotime(i.value).replace(tzinfo=None)
stat = Statistics(start_timestamp=start,
end_timestamp=end,
**computed)
return stat
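The _update_duration logic above clamps the reported duration endpoints back into the originally requested range (excluding the search offset). The clamping itself is a generic bounded-value operation; a hypothetical helper illustrating it:

```python
def clamp(value, start=None, end=None):
    """Clamp value into [start, end]; None bounds are unbounded.

    Mirrors the duration_start/duration_end clamping in
    Statistics._update_duration: values outside the requested
    range are pulled back to the nearest bound.
    """
    if value is None:
        return None
    if start is not None and value < start:
        value = start
    if end is not None and value > end:
        value = end
    return value
```

Works equally for datetimes or numbers, since only comparison is used.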
class Meter(Base):
name = text
type = text
unit = text
resource_id = text
project_id = text
user_id = text
@ -378,18 +314,23 @@ class MetersController(RestController):
def _lookup(self, meter_id, *remainder):
return MeterController(meter_id), remainder
@wsme_pecan.wsexpose([Meter])
def get_all(self):
user_id = request.context.get('user_id')
project_id = request.context.get('project_id')
resource_id = request.context.get('resource_id')
source_id = request.context.get('source_id')
@wsme_pecan.wsexpose([Meter], [Query])
def get_all(self, q=[]):
kwargs = _query_to_kwargs(q, request.storage_conn.get_meters)
return [Meter(**m)
for m in request.storage_conn.get_meters(user=user_id,
project=project_id,
resource=resource_id,
source=source_id,
)]
for m in request.storage_conn.get_meters(**kwargs)]
class Resource(Base):
resource_id = text
project_id = text
user_id = text
timestamp = datetime.datetime
metadata = {text: text}
def __init__(self, metadata={}, **kwds):
metadata = _flatten_metadata(metadata)
super(Resource, self).__init__(metadata=metadata, **kwds)
class ResourceController(RestController):
@ -399,28 +340,11 @@ class ResourceController(RestController):
def __init__(self, resource_id):
request.context['resource_id'] = resource_id
meters = MetersController()
class MeterDescription(Base):
counter_name = text
counter_type = text
class Resource(Base):
resource_id = text
project_id = text
user_id = text
timestamp = datetime.datetime
metadata = {text: text}
meter = wsattr([MeterDescription])
def __init__(self, meter=[], metadata={}, **kwds):
meter = [MeterDescription(**m) for m in meter]
metadata = _flatten_metadata(metadata)
super(Resource, self).__init__(meter=meter,
metadata=metadata,
**kwds)
@wsme_pecan.wsexpose([Resource])
def get_all(self):
r = request.storage_conn.get_resources(
resource=request.context.get('resource_id'))[0]
return Resource(**r)
class ResourcesController(RestController):
@ -430,153 +354,17 @@ class ResourcesController(RestController):
def _lookup(self, resource_id, *remainder):
return ResourceController(resource_id), remainder
@wsme_pecan.wsexpose([Resource])
def get_all(self, start_timestamp=None, end_timestamp=None):
if start_timestamp:
start_timestamp = timeutils.parse_isotime(start_timestamp)
if end_timestamp:
end_timestamp = timeutils.parse_isotime(end_timestamp)
@wsme_pecan.wsexpose([Resource], [Query])
def get_all(self, q=[]):
kwargs = _query_to_kwargs(q, request.storage_conn.get_resources)
resources = [
Resource(**r)
for r in request.storage_conn.get_resources(
source=request.context.get('source_id'),
user=request.context.get('user_id'),
project=request.context.get('project_id'),
start_timestamp=start_timestamp,
end_timestamp=end_timestamp,
)
]
for r in request.storage_conn.get_resources(**kwargs)]
return resources
class ProjectController(RestController):
"""Works on resources."""
def __init__(self, project_id):
request.context['project_id'] = project_id
meters = MetersController()
resources = ResourcesController()
class ProjectsController(RestController):
"""Works on projects."""
@pecan.expose()
def _lookup(self, project_id, *remainder):
return ProjectController(project_id), remainder
@wsme_pecan.wsexpose([text])
def get_all(self):
source_id = request.context.get('source_id')
projects = list(request.storage_conn.get_projects(source=source_id))
return projects
meters = MetersController()
class UserController(RestController):
"""Works on reusers."""
def __init__(self, user_id):
request.context['user_id'] = user_id
meters = MetersController()
resources = ResourcesController()
class UsersController(RestController):
"""Works on users."""
@pecan.expose()
def _lookup(self, user_id, *remainder):
return UserController(user_id), remainder
@wsme_pecan.wsexpose([text])
def get_all(self):
source_id = request.context.get('source_id')
users = list(request.storage_conn.get_users(source=source_id))
return users
class Source(Base):
name = text
data = {text: text}
@staticmethod
def sample():
return Source(name='openstack',
data={'key': 'value'})
class SourceController(RestController):
"""Works on resources."""
def __init__(self, source_id, data):
request.context['source_id'] = source_id
self._id = source_id
self._data = data
@wsme_pecan.wsexpose(Source)
def get(self):
response = Source(name=self._id, data=self._data)
return response
meters = MetersController()
resources = ResourcesController()
projects = ProjectsController()
users = UsersController()
class SourcesController(RestController):
"""Works on sources."""
def __init__(self):
self._sources = None
@property
def sources(self):
# FIXME(dhellmann): Add a configuration option for the filename.
#
# FIXME(dhellmann): We only want to load the file once in a process,
# but we want to be able to mock the loading out in separate tests.
#
if not self._sources:
self._sources = self._load_sources(os.path.abspath("sources.json"))
return self._sources
@staticmethod
def _load_sources(filename):
try:
with open(filename, "r") as f:
sources = jsonutils.load(f)
except IOError as err:
LOG.warning('Could not load data source definitions from %s: %s' %
(filename, err))
sources = {}
return sources
@pecan.expose()
def _lookup(self, source_id, *remainder):
try:
data = self.sources[source_id]
except KeyError:
# Unknown source
pecan.abort(404, detail='No source %s' % source_id)
return SourceController(source_id, data), remainder
@wsme_pecan.wsexpose([Source])
def get_all(self):
return [Source(name=key, data=value)
for key, value in self.sources.iteritems()]
class V2Controller(object):
"""Version 2 API controller root."""
projects = ProjectsController()
resources = ResourcesController()
sources = SourcesController()
users = UsersController()
meters = MetersController()


@ -83,7 +83,7 @@ class Connection(object):
@abc.abstractmethod
def get_resources(self, user=None, project=None, source=None,
start_timestamp=None, end_timestamp=None,
metaquery={}):
metaquery={}, resource=None):
"""Return an iterable of dictionaries containing resource information.
{ 'resource_id': UUID of the resource,
@ -99,7 +99,8 @@ class Connection(object):
:param source: Optional source filter.
:param start_timestamp: Optional modified timestamp start range.
:param end_timestamp: Optional modified timestamp end range.
:param metaquery: Optional dict with metadata to match on..
:param metaquery: Optional dict with metadata to match on.
:param resource: Optional resource filter.
"""
@abc.abstractmethod
@ -159,3 +160,22 @@ class Connection(object):
( datetime.datetime(), datetime.datetime() )
"""
@abc.abstractmethod
def get_meter_statistics(self, event_filter):
"""Return a dictionary containing meter statistics.
described by the query parameters.
The filter must have a meter value set.
{ 'min':
'max':
'avg':
'sum':
'count':
'duration':
'duration_start':
'duration_end':
}
"""


@ -73,7 +73,7 @@ class Connection(base.Connection):
def get_resources(self, user=None, project=None, source=None,
start_timestamp=None, end_timestamp=None,
metaquery={}):
metaquery={}, resource=None):
"""Return an iterable of dictionaries containing resource information.
{ 'resource_id': UUID of the resource,
@ -90,6 +90,7 @@ class Connection(base.Connection):
:param start_timestamp: Optional modified timestamp start range.
:param end_timestamp: Optional modified timestamp end range.
:param metaquery: Optional dict with metadata to match on.
:param resource: Optional resource filter.
"""
def get_meters(self, user=None, project=None, resource=None, source=None,
@ -129,3 +130,21 @@ class Connection(base.Connection):
"""Return the min and max timestamp for events
matching the event_filter.
"""
def get_meter_statistics(self, event_filter):
"""Return a dictionary containing meter statistics.
described by the query parameters.
The filter must have a meter value set.
{ 'min':
'max':
'avg':
'sum':
'count':
'duration':
'duration_start':
'duration_end':
}
"""


@ -176,6 +176,36 @@ class Connection(base.Connection):
}
""")
MAP_STATS = bson.code.Code("""
function () {
emit('statistics', { min : this.counter_volume,
max : this.counter_volume,
qty : this.counter_volume,
count : 1,
timestamp_min : this.timestamp,
timestamp_max : this.timestamp } )
}
""")
REDUCE_STATS = bson.code.Code("""
function (key, values) {
var res = values[0];
for ( var i=1; i<values.length; i++ ) {
if ( values[i].min < res.min )
res.min = values[i].min;
if ( values[i].max > res.max )
res.max = values[i].max;
res.count += values[i].count;
res.qty += values[i].qty;
if ( values[i].timestamp_min < res.timestamp_min )
res.timestamp_min = values[i].timestamp_min;
if ( values[i].timestamp_max > res.timestamp_max )
res.timestamp_max = values[i].timestamp_max;
}
return res;
}
""")
def __init__(self, conf):
opts = self._parse_connection_url(conf.database_connection)
LOG.info('connecting to MongoDB on %s:%s', opts['host'], opts['port'])
@ -308,7 +338,7 @@ class Connection(base.Connection):
def get_resources(self, user=None, project=None, source=None,
start_timestamp=None, end_timestamp=None,
metaquery={}):
metaquery={}, resource=None):
"""Return an iterable of dictionaries containing resource information.
{ 'resource_id': UUID of the resource,
@ -325,6 +355,7 @@ class Connection(base.Connection):
:param start_timestamp: Optional modified timestamp start range.
:param end_timestamp: Optional modified timestamp end range.
:param metaquery: Optional dict with metadata to match on.
:param resource: Optional resource filter.
"""
q = {}
if user is not None:
@ -333,6 +364,8 @@ class Connection(base.Connection):
q['project_id'] = project
if source is not None:
q['source'] = source
if resource is not None:
q['_id'] = resource
q.update(metaquery)
# FIXME(dhellmann): This may not perform very well,
@ -412,6 +445,56 @@ class Connection(base.Connection):
del e['_id']
yield e
def get_meter_statistics(self, event_filter):
"""Return a dictionary containing meter statistics.
described by the query parameters.
The filter must have a meter value set.
{ 'min':
'max':
'avg':
'sum':
'count':
'duration':
'duration_start':
'duration_end':
}
"""
q = make_query_from_filter(event_filter)
results = self.db.meter.map_reduce(self.MAP_STATS,
self.REDUCE_STATS,
{'inline': 1},
query=q,
)
if results['results']:
r = results['results'][0]['value']
(start, end) = self._fix_interval_min_max(r['timestamp_min'],
r['timestamp_max'])
else:
start = None
end = None
r = {'count': 0,
'min': None,
'max': None,
'avg': None,
'qty': None,
'duration': None,
'duration_start': None,
'duration_end': None,
}
count = int(r['count'])
return {'min': r['min'],
'sum': r['qty'],
'count': count,
'avg': (r['qty'] / count) if count > 0 else None,
'max': r['max'],
'duration': 0,
'duration_start': start,
'duration_end': end,
}
def get_volume_sum(self, event_filter):
"""Return the sum of the volume field for the events
described by the query parameters.
@ -438,6 +521,34 @@ class Connection(base.Connection):
return ({'resource_id': r['_id'], 'value': r['value']}
for r in results['results'])
def _fix_interval_min_max(self, a_min, a_max):
if hasattr(a_min, 'valueOf') and a_min.valueOf is not None:
# NOTE (dhellmann): HACK ALERT
#
# The real MongoDB server can handle Date objects and
# the driver converts them to datetime instances
# correctly but the in-memory implementation in MIM
# (used by the tests) returns a spidermonkey.Object
# representing the "value" dictionary and there
# doesn't seem to be a way to recursively introspect
# that object safely to convert the min and max values
# back to datetime objects. In this method, we know
# what type the min and max values are expected to be,
# so it is safe to do the conversion
# here. JavaScript's time representation uses
# different units than Python's, so we divide to
# convert to the right units and then create the
# datetime instances to return.
#
# The issue with MIM is documented at
# https://sourceforge.net/p/merciless/bugs/3/
#
a_min = datetime.datetime.fromtimestamp(
a_min.valueOf() // 1000)
a_max = datetime.datetime.fromtimestamp(
a_max.valueOf() // 1000)
return (a_min, a_max)
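The unit conversion the HACK ALERT comment describes is simple once isolated: JavaScript's Date.valueOf() counts milliseconds since the epoch, while Python's timestamp constructors want seconds. A standalone sketch (the function name is mine; I use utcfromtimestamp for determinism, whereas the code above uses the local-time fromtimestamp):

```python
import datetime


def js_millis_to_datetime(ms):
    """Convert a JavaScript Date.valueOf() result to a datetime.

    JavaScript counts milliseconds since the epoch; Python's
    timestamp constructors want seconds, hence the // 1000.
    """
    return datetime.datetime.utcfromtimestamp(ms // 1000)
```

Floor division also discards the sub-second part, which is acceptable here since the API reports durations in minutes.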
def get_event_interval(self, event_filter):
"""Return the min and max timestamps from events,
using the event_filter to limit the events seen.
@ -452,32 +563,5 @@ class Connection(base.Connection):
)
if results['results']:
answer = results['results'][0]['value']
a_min = answer['min']
a_max = answer['max']
if hasattr(a_min, 'valueOf') and a_min.valueOf is not None:
# NOTE (dhellmann): HACK ALERT
#
# The real MongoDB server can handle Date objects and
# the driver converts them to datetime instances
# correctly but the in-memory implementation in MIM
# (used by the tests) returns a spidermonkey.Object
# representing the "value" dictionary and there
# doesn't seem to be a way to recursively introspect
# that object safely to convert the min and max values
# back to datetime objects. In this method, we know
# what type the min and max values are expected to be,
# so it is safe to do the conversion
# here. JavaScript's time representation uses
# different units than Python's, so we divide to
# convert to the right units and then create the
# datetime instances to return.
#
# The issue with MIM is documented at
# https://sourceforge.net/p/merciless/bugs/3/
#
a_min = datetime.datetime.fromtimestamp(
a_min.valueOf() // 1000)
a_max = datetime.datetime.fromtimestamp(
a_max.valueOf() // 1000)
return (a_min, a_max)
return self._fix_interval_min_max(answer['min'], answer['max'])
return (None, None)


@ -222,7 +222,7 @@ class Connection(base.Connection):
def get_resources(self, user=None, project=None, source=None,
start_timestamp=None, end_timestamp=None,
metaquery=None):
metaquery=None, resource=None):
"""Return an iterable of dictionaries containing resource information.
{ 'resource_id': UUID of the resource,
@@ -239,6 +239,7 @@ class Connection(base.Connection):
:param start_timestamp: Optional modified timestamp start range.
:param end_timestamp: Optional modified timestamp end range.
:param metaquery: Optional dict with metadata to match on.
:param resource: Optional resource filter.
"""
query = model_query(Resource, session=self.session)
if user is not None:
@@ -251,6 +252,8 @@ class Connection(base.Connection):
query = query.filter(Resource.timestamp < end_timestamp)
if project is not None:
query = query.filter(Resource.project_id == project)
if resource is not None:
query = query.filter(Resource.id == resource)
query = query.options(
sqlalchemy_session.sqlalchemy.orm.joinedload('meters'))
if metaquery is not None:
@@ -368,6 +371,41 @@ class Connection(base.Connection):
a_min, a_max = results[0]
return (a_min, a_max)
def get_meter_statistics(self, event_filter):
"""Return a dictionary containing meter statistics
described by the query parameters.
The filter must have a meter value set.
{ 'min':
'max':
'avg':
'sum':
'count':
'duration':
'duration_start':
'duration_end':
}
"""
query = self.session.query(func.min(Meter.timestamp),
func.max(Meter.timestamp),
func.sum(Meter.counter_volume),
func.min(Meter.counter_volume),
func.max(Meter.counter_volume),
func.count(Meter.counter_volume))
query = make_query_from_filter(query, event_filter)
results = query.all()
res = results[0]
count = int(res[5])
return {'count': count,
'min': res[3],
'max': res[4],
'avg': (res[2] / count) if count > 0 else None,
'sum': res[2],
'duration': None,
'duration_start': res[0],
'duration_end': res[1],
}
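The aggregates this query collects can be mirrored in pure Python, which makes the empty-result behaviour explicit (a sketch for illustration only, not part of the patch; the real driver computes these via SQL `func` aggregates):

```python
def compute_statistics(volumes):
    """Compute the min/max/sum/count/avg aggregates that
    get_meter_statistics() asks the database for, returning None
    for every value-derived field when there are no samples."""
    count = len(volumes)
    total = sum(volumes) if count else None
    return {
        'count': count,
        'min': min(volumes) if count else None,
        'max': max(volumes) if count else None,
        'sum': total,
        # Guard the division exactly as the driver does for count == 0.
        'avg': (total / count) if count else None,
    }
```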
############################


@@ -133,21 +133,10 @@ class FunctionalTest(unittest.TestCase):
self.stubs = stubout.StubOutForTesting()
self.app = self._make_app()
self._stubout_sources()
def _make_app(self):
return load_test_app(self.config)
def _stubout_sources(self):
"""Source data is usually read from a file, but
we want to let tests define their own. The class
attribute SOURCE_DATA is injected into the controller
as though it was read from the usual configuration
file.
"""
self.stubs.SmartSet(v2.SourcesController, 'sources',
self.SOURCE_DATA)
def tearDown(self):
self.mox.UnsetStubs()
self.stubs.UnsetAll()
@@ -156,11 +145,19 @@ class FunctionalTest(unittest.TestCase):
set_config({}, overwrite=True)
def get_json(self, path, expect_errors=False, headers=None,
extra_params={}, **params):
q=[], **params):
full_path = self.PATH_PREFIX + path
query_params = {'q.field': [],
'q.value': [],
'q.op': [],
}
for query in q:
for name in ['field', 'op', 'value']:
query_params['q.%s' % name].append(query.get(name, ''))
all_params = {}
all_params.update(params)
all_params.update(extra_params)
if q:
all_params.update(query_params)
print 'GET: %s %r' % (full_path, all_params)
response = self.app.get(full_path,
params=all_params,

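The q-flattening in `get_json` keeps the three parallel lists aligned by always appending an entry for each of field/op/value, even when a key is omitted. A standalone sketch (the function name is illustrative; the test helper inlines this logic):

```python
def flatten_query(q):
    """Flatten a list of {'field', 'op', 'value'} dicts into the
    aligned q.field/q.op/q.value parameter lists the v2 API expects."""
    params = {'q.field': [], 'q.op': [], 'q.value': []}
    for query in q:
        for name in ('field', 'op', 'value'):
            # Append '' for missing keys so the three lists stay aligned.
            params['q.%s' % name].append(query.get(name, ''))
    return params
```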

@@ -44,11 +44,11 @@ class TestAPIACL(FunctionalTest):
return result
def test_non_authenticated(self):
response = self.get_json('/sources', expect_errors=True)
response = self.get_json('/meters', expect_errors=True)
self.assertEqual(response.status_int, 401)
def test_authenticated_wrong_role(self):
response = self.get_json('/sources',
response = self.get_json('/meters',
expect_errors=True,
headers={
"X-Roles": "Member",
@@ -74,7 +74,7 @@ class TestAPIACL(FunctionalTest):
# self.assertEqual(response.status_int, 401)
def test_authenticated(self):
response = self.get_json('/sources',
response = self.get_json('/meters',
expect_errors=True,
headers={
"X-Roles": "admin",


@@ -51,31 +51,40 @@ class TestComputeDurationByResource(FunctionalTest):
def _stub_interval_func(self, func):
self.stubs.Set(impl_test.TestConnection,
'get_event_interval',
'get_meter_statistics',
func)
def _set_interval(self, start, end):
def get_interval(ignore_self, event_filter):
assert event_filter.start
assert event_filter.end
return (start, end)
return {'count': 0,
'min': None,
'max': None,
'avg': None,
'qty': None,
'duration': None,
'duration_start': start,
'duration_end': end,
}
self._stub_interval_func(get_interval)
def _invoke_api(self):
return self.get_json(
'/resources/resource-id/meters/instance:m1.tiny/duration',
extra_params={
'daterange.start': self.start.isoformat(),
'daterange.end': self.end.isoformat(),
# this value doesn't matter, db call is mocked
'daterange.search_offset': 10,
})
return self.get_json('/meters/instance:m1.tiny/statistics',
q=[{'field': 'timestamp',
'op': 'ge',
'value': self.start.isoformat()},
{'field': 'timestamp',
'op': 'le',
'value': self.end.isoformat()},
{'field': 'search_offset',
'value': 10}])
def test_before_range(self):
self._set_interval(self.early1, self.early2)
data = self._invoke_api()
assert data['start_timestamp'] is None
assert data['end_timestamp'] is None
assert data['duration_start'] is None
assert data['duration_end'] is None
assert data['duration'] is None
def _assert_times_match(self, actual, expected):
@@ -88,62 +97,81 @@ class TestComputeDurationByResource(FunctionalTest):
def test_overlap_range_start(self):
self._set_interval(self.early1, self.middle1)
data = self._invoke_api()
self._assert_times_match(data['start_timestamp'], self.start)
self._assert_times_match(data['end_timestamp'], self.middle1)
self._assert_times_match(data['duration_start'], self.start)
self._assert_times_match(data['duration_end'], self.middle1)
assert data['duration'] == 8 * 60
def test_within_range(self):
self._set_interval(self.middle1, self.middle2)
data = self._invoke_api()
self._assert_times_match(data['start_timestamp'], self.middle1)
self._assert_times_match(data['end_timestamp'], self.middle2)
self._assert_times_match(data['duration_start'], self.middle1)
self._assert_times_match(data['duration_end'], self.middle2)
assert data['duration'] == 10 * 60
def test_within_range_zero_duration(self):
self._set_interval(self.middle1, self.middle1)
data = self._invoke_api()
self._assert_times_match(data['start_timestamp'], self.middle1)
self._assert_times_match(data['end_timestamp'], self.middle1)
self._assert_times_match(data['duration_start'], self.middle1)
self._assert_times_match(data['duration_end'], self.middle1)
assert data['duration'] == 0
def test_overlap_range_end(self):
self._set_interval(self.middle2, self.late1)
data = self._invoke_api()
self._assert_times_match(data['start_timestamp'], self.middle2)
self._assert_times_match(data['end_timestamp'], self.end)
self._assert_times_match(data['duration_start'], self.middle2)
self._assert_times_match(data['duration_end'], self.end)
assert data['duration'] == (6 * 60) - 1
def test_after_range(self):
self._set_interval(self.late1, self.late2)
data = self._invoke_api()
assert data['start_timestamp'] is None
assert data['end_timestamp'] is None
assert data['duration_start'] is None
assert data['duration_end'] is None
assert data['duration'] is None
def test_without_end_timestamp(self):
def get_interval(ignore_self, event_filter):
return (self.late1, self.late2)
return {'count': 0,
'min': None,
'max': None,
'avg': None,
'qty': None,
'duration': None,
'duration_start': self.late1,
'duration_end': self.late2,
}
self._stub_interval_func(get_interval)
data = self.get_json(
'/resources/resource-id/meters/instance:m1.tiny/duration',
extra_params={
'daterange.start': self.late1.isoformat(),
# this value doesn't matter, db call is mocked
'daterange.search_offset': 10,
})
self._assert_times_match(data['start_timestamp'], self.late1)
self._assert_times_match(data['end_timestamp'], self.late2)
data = self.get_json('/meters/instance:m1.tiny/statistics',
q=[{'field': 'timestamp',
'op': 'ge',
'value': self.late1.isoformat()},
{'field': 'resource_id',
'value': 'resource-id'},
{'field': 'search_offset',
'value': 10}])
self._assert_times_match(data['duration_start'], self.late1)
self._assert_times_match(data['duration_end'], self.late2)
def test_without_start_timestamp(self):
def get_interval(ignore_self, event_filter):
return {'count': 0,
'min': None,
'max': None,
'avg': None,
'qty': None,
'duration': None,
'duration_start': self.early1,
'duration_end': self.early2,
}
return (self.early1, self.early2)
self._stub_interval_func(get_interval)
data = self.get_json(
'/resources/resource-id/meters/instance:m1.tiny/duration',
extra_params={
'daterange.end': self.early2.isoformat(),
# this value doesn't matter, db call is mocked
'daterange.search_offset': 10,
})
self._assert_times_match(data['start_timestamp'], self.early1)
self._assert_times_match(data['end_timestamp'], self.early2)
data = self.get_json('/meters/instance:m1.tiny/statistics',
q=[{'field': 'timestamp',
'op': 'le',
'value': self.early2.isoformat()},
{'field': 'resource_id',
'value': 'resource-id'},
{'field': 'search_offset',
'value': 10}])
self._assert_times_match(data['duration_start'], self.early1)
self._assert_times_match(data['duration_end'], self.early2)


@@ -1,80 +0,0 @@
# -*- encoding: utf-8 -*-
#
# Copyright © 2012 New Dream Network, LLC (DreamHost)
#
# Author: Steven Berler <steven.berler@dreamhost.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Test the _get_query_timestamps helper function.
"""
import unittest
import datetime
from ceilometer.api.controllers import v2 as api
class DateRangeTest(unittest.TestCase):
def test_get_query_timestamps_none_specified(self):
result = api.DateRange().to_dict()
expected = {'start_timestamp': None,
'end_timestamp': None,
'query_start': None,
'query_end': None,
'search_offset': 0,
}
assert result == expected
def test_get_query_timestamps_start(self):
d = datetime.datetime(2012, 9, 20, 12, 13, 14)
result = api.DateRange(start=d).to_dict()
expected = {
'start_timestamp': datetime.datetime(2012, 9, 20, 12, 13, 14),
'end_timestamp': None,
'query_start': datetime.datetime(2012, 9, 20, 12, 13, 14),
'query_end': None,
'search_offset': 0,
}
assert result == expected
def test_get_query_timestamps_end(self):
d = datetime.datetime(2012, 9, 20, 12, 13, 14)
result = api.DateRange(end=d).to_dict()
expected = {
'end_timestamp': datetime.datetime(2012, 9, 20, 12, 13, 14),
'start_timestamp': None,
'query_end': datetime.datetime(2012, 9, 20, 12, 13, 14),
'query_start': None,
'search_offset': 0,
}
assert result == expected
def test_get_query_timestamps_with_offset(self):
result = api.DateRange(
end=datetime.datetime(2012, 9, 20, 13, 24, 25),
start=datetime.datetime(2012, 9, 20, 12, 13, 14),
search_offset=20,
).to_dict()
expected = {
'query_end': datetime.datetime(2012, 9, 20, 13, 44, 25),
'query_start': datetime.datetime(2012, 9, 20, 11, 53, 14),
'end_timestamp': datetime.datetime(2012, 9, 20, 13, 24, 25),
'start_timestamp': datetime.datetime(2012, 9, 20, 12, 13, 14),
'search_offset': 20,
}
assert result == expected
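The search_offset behaviour these (now removed) tests pinned down is that the query window is widened by the offset in minutes on each side. A standalone sketch of that widening (function name is illustrative, not part of the patch):

```python
import datetime


def apply_search_offset(start, end, search_offset):
    """Widen the [start, end] query window by search_offset minutes
    on each side, as DateRange.to_dict() computed query_start/query_end."""
    delta = datetime.timedelta(minutes=search_offset)
    query_start = (start - delta) if start is not None else None
    query_end = (end + delta) if end is not None else None
    return query_start, query_end
```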


@@ -75,45 +75,70 @@ class TestListEvents(FunctionalTest):
self.conn.record_metering_data(msg2)
def test_all(self):
data = self.get_json('/resources')
data = self.get_json('/meters/instance')
self.assertEquals(2, len(data))
def test_empty_project(self):
data = self.get_json('/projects/no-such-project/meters/instance')
data = self.get_json('/meters/instance',
q=[{'field': 'project_id',
'value': 'no-such-project',
}])
self.assertEquals([], data)
def test_by_project(self):
data = self.get_json('/projects/project1/meters/instance')
data = self.get_json('/meters/instance',
q=[{'field': 'project_id',
'value': 'project1',
}])
self.assertEquals(1, len(data))
def test_empty_resource(self):
data = self.get_json('/resources/no-such-resource/meters/instance')
data = self.get_json('/meters/instance',
q=[{'field': 'resource_id',
'value': 'no-such-resource',
}])
self.assertEquals([], data)
def test_by_resource(self):
data = self.get_json('/resources/resource-id/meters/instance')
data = self.get_json('/meters/instance',
q=[{'field': 'resource_id',
'value': 'resource-id',
}])
self.assertEquals(1, len(data))
def test_empty_source(self):
data = self.get_json('/sources/no-such-source/meters/instance',
expect_errors=True)
self.assertEquals(data.status_int, 404)
data = self.get_json('/meters/instance',
q=[{'field': 'source',
'value': 'no-such-source',
}])
self.assertEquals(0, len(data))
def test_by_source(self):
data = self.get_json('/sources/test_source/meters/instance')
data = self.get_json('/meters/instance',
q=[{'field': 'source',
'value': 'test_source',
}])
self.assertEquals(1, len(data))
def test_empty_user(self):
data = self.get_json('/users/no-such-user/meters/instance')
data = self.get_json('/meters/instance',
q=[{'field': 'user_id',
'value': 'no-such-user',
}])
self.assertEquals([], data)
def test_by_user(self):
data = self.get_json('/users/user-id/meters/instance')
data = self.get_json('/meters/instance',
q=[{'field': 'user_id',
'value': 'user-id',
}])
self.assertEquals(1, len(data))
def test_metadata(self):
data = self.get_json('/resources/resource-id/meters/instance')
self.assertEquals(1, len(data))
data = self.get_json('/meters/instance',
q=[{'field': 'resource_id',
'value': 'resource-id',
}])
sample = data[0]
self.assert_('resource_metadata' in sample)
self.assertEqual(


@@ -116,12 +116,16 @@ class TestListMeters(FunctionalTest):
'meter.mine']))
def test_with_resource(self):
data = self.get_json('/resources/resource-id/meters')
data = self.get_json('/meters', q=[{'field': 'resource_id',
'value': 'resource-id',
}])
ids = set(r['name'] for r in data)
self.assertEquals(set(['meter.test']), ids)
def test_with_source(self):
data = self.get_json('/sources/test_source/meters')
data = self.get_json('/meters', q=[{'field': 'source',
'value': 'test_source',
}])
ids = set(r['resource_id'] for r in data)
self.assertEquals(set(['resource-id',
'resource-id2',
@@ -129,13 +133,22 @@ class TestListMeters(FunctionalTest):
'resource-id4']), ids)
def test_with_source_non_existent(self):
data = self.get_json('/sources/test_source_doesnt_exist/meters',
expect_errors=True)
self.assert_('No source test_source_doesnt_exist' in
data.json['error_message'])
data = self.get_json('/meters',
q=[{'field': 'source',
'value': 'test_source_doesnt_exist',
}],
)
assert not data
def test_with_user(self):
data = self.get_json('/users/user-id/meters')
data = self.get_json('/meters',
q=[{'field': 'user_id',
'value': 'user-id',
}],
)
uids = set(r['user_id'] for r in data)
self.assertEquals(set(['user-id']), uids)
nids = set(r['name'] for r in data)
self.assertEquals(set(['meter.mine', 'meter.test']), nids)
@@ -144,14 +157,26 @@
self.assertEquals(set(['resource-id', 'resource-id2']), rids)
def test_with_user_non_existent(self):
data = self.get_json('/users/user-id-foobar123/meters')
data = self.get_json('/meters',
q=[{'field': 'user_id',
'value': 'user-id-foobar123',
}],
)
self.assertEquals(data, [])
def test_with_project(self):
data = self.get_json('/projects/project-id2/meters')
data = self.get_json('/meters',
q=[{'field': 'project_id',
'value': 'project-id2',
}],
)
ids = set(r['resource_id'] for r in data)
self.assertEquals(set(['resource-id3', 'resource-id4']), ids)
def test_with_project_non_existent(self):
data = self.get_json('/projects/jd-was-here/meters')
data = self.get_json('/meters',
q=[{'field': 'project_id',
'value': 'jd-was-here',
}],
)
self.assertEquals(data, [])


@@ -1,122 +0,0 @@
# -*- encoding: utf-8 -*-
#
# Copyright © 2012 New Dream Network, LLC (DreamHost)
#
# Author: Doug Hellmann <doug.hellmann@dreamhost.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Test listing users.
"""
import datetime
import logging
from ceilometer.collector import meter
from ceilometer import counter
from ceilometer.openstack.common import cfg
from .base import FunctionalTest
LOG = logging.getLogger(__name__)
class TestListProjects(FunctionalTest):
def test_empty(self):
data = self.get_json('/projects')
self.assertEquals([], data)
def test_projects(self):
counter1 = counter.Counter(
'instance',
'cumulative',
'',
1,
'user-id',
'project-id',
'resource-id',
timestamp=datetime.datetime(2012, 7, 2, 10, 40),
resource_metadata={'display_name': 'test-server',
'tag': 'self.counter',
}
)
msg = meter.meter_message_from_counter(counter1,
cfg.CONF.metering_secret,
'test_source',
)
self.conn.record_metering_data(msg)
counter2 = counter.Counter(
'instance',
'cumulative',
'',
1,
'user-id2',
'project-id2',
'resource-id-alternate',
timestamp=datetime.datetime(2012, 7, 2, 10, 41),
resource_metadata={'display_name': 'test-server',
'tag': 'self.counter2',
}
)
msg2 = meter.meter_message_from_counter(counter2,
cfg.CONF.metering_secret,
'test_source',
)
self.conn.record_metering_data(msg2)
data = self.get_json('/projects')
self.assertEquals(['project-id', 'project-id2'], data)
def test_with_source(self):
counter1 = counter.Counter(
'instance',
'cumulative',
'',
1,
'user-id',
'project-id',
'resource-id',
timestamp=datetime.datetime(2012, 7, 2, 10, 40),
resource_metadata={'display_name': 'test-server',
'tag': 'self.counter',
}
)
msg = meter.meter_message_from_counter(counter1,
cfg.CONF.metering_secret,
'test_source',
)
self.conn.record_metering_data(msg)
counter2 = counter.Counter(
'instance',
'cumulative',
'',
1,
'user-id2',
'project-id2',
'resource-id-alternate',
timestamp=datetime.datetime(2012, 7, 2, 10, 41),
resource_metadata={'display_name': 'test-server',
'tag': 'self.counter2',
}
)
msg2 = meter.meter_message_from_counter(counter2,
cfg.CONF.metering_secret,
'not-test',
)
self.conn.record_metering_data(msg2)
data = self.get_json('/sources/test_source/projects')
self.assertEquals(['project-id'], data)


@@ -119,7 +119,9 @@ class TestListResources(FunctionalTest):
)
self.conn.record_metering_data(msg2)
data = self.get_json('/sources/test_list_resources/resources')
data = self.get_json('/resources', q=[{'field': 'source',
'value': 'test_list_resources',
}])
ids = [r['resource_id'] for r in data]
self.assertEquals(['resource-id'], ids)
@@ -162,7 +164,9 @@ class TestListResources(FunctionalTest):
)
self.conn.record_metering_data(msg2)
data = self.get_json('/users/user-id/resources')
data = self.get_json('/resources', q=[{'field': 'user_id',
'value': 'user-id',
}])
ids = [r['resource_id'] for r in data]
self.assertEquals(['resource-id'], ids)
@@ -205,7 +209,9 @@ class TestListResources(FunctionalTest):
)
self.conn.record_metering_data(msg2)
data = self.get_json('/projects/project-id/resources')
data = self.get_json('/resources', q=[{'field': 'project_id',
'value': 'project-id',
}])
ids = [r['resource_id'] for r in data]
self.assertEquals(['resource-id'], ids)


@@ -1,47 +0,0 @@
# -*- encoding: utf-8 -*-
#
# Copyright © 2012 Julien Danjou
#
# Author: Julien Danjou <julien@danjou.info>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Test listing users.
"""
from .base import FunctionalTest
class TestListSource(FunctionalTest):
def test_all(self):
ydata = self.get_json('/sources')
self.assertEqual(len(ydata), 1)
source = ydata[0]
self.assertEqual(source['name'], 'test_source')
def test_source(self):
ydata = self.get_json('/sources/test_source')
self.assert_("data" in ydata)
self.assert_("somekey" in ydata['data'])
self.assertEqual(ydata['data']["somekey"], '666')
def test_unknownsource(self):
ydata = self.get_json(
'/sources/test_source_that_does_not_exist',
expect_errors=True)
print 'GOT:', ydata
self.assertEqual(ydata.status_int, 404)
self.assert_(
"No source test_source_that_does_not_exist" in
ydata.json['error_message']
)


@@ -1,123 +0,0 @@
# -*- encoding: utf-8 -*-
#
# Copyright © 2012 New Dream Network, LLC (DreamHost)
#
# Author: Doug Hellmann <doug.hellmann@dreamhost.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Test listing users.
"""
import datetime
import logging
from ceilometer.collector import meter
from ceilometer import counter
from ceilometer.openstack.common import cfg
from .base import FunctionalTest
LOG = logging.getLogger(__name__)
class TestListUsers(FunctionalTest):
SOURCE_DATA = {'test_list_users': {}}
def test_empty(self):
data = self.get_json('/users')
self.assertEquals([], data)
def test_users(self):
counter1 = counter.Counter(
'instance',
'cumulative',
'',
1,
'user-id',
'project-id',
'resource-id',
timestamp=datetime.datetime(2012, 7, 2, 10, 40),
resource_metadata={'display_name': 'test-server',
'tag': 'self.counter'}
)
msg = meter.meter_message_from_counter(counter1,
cfg.CONF.metering_secret,
'test_list_users',
)
self.conn.record_metering_data(msg)
counter2 = counter.Counter(
'instance',
'cumulative',
'',
1,
'user-id2',
'project-id',
'resource-id-alternate',
timestamp=datetime.datetime(2012, 7, 2, 10, 41),
resource_metadata={'display_name': 'test-server',
'tag': 'self.counter2',
}
)
msg2 = meter.meter_message_from_counter(counter2,
cfg.CONF.metering_secret,
'test_list_users',
)
self.conn.record_metering_data(msg2)
data = self.get_json('/users')
self.assertEquals(['user-id', 'user-id2'], data)
def test_with_source(self):
counter1 = counter.Counter(
'instance',
'cumulative',
'',
1,
'user-id',
'project-id',
'resource-id',
timestamp=datetime.datetime(2012, 7, 2, 10, 40),
resource_metadata={'display_name': 'test-server',
'tag': 'self.counter',
}
)
msg = meter.meter_message_from_counter(counter1,
cfg.CONF.metering_secret,
'test_list_users',
)
self.conn.record_metering_data(msg)
counter2 = counter.Counter(
'instance',
'cumulative',
'',
1,
'user-id2',
'project-id',
'resource-id-alternate',
timestamp=datetime.datetime(2012, 7, 2, 10, 41),
resource_metadata={'display_name': 'test-server',
'tag': 'self.counter2',
}
)
msg2 = meter.meter_message_from_counter(counter2,
cfg.CONF.metering_secret,
'not-test',
)
self.conn.record_metering_data(msg2)
data = self.get_json('/sources/test_list_users/users')
self.assertEquals(['user-id'], data)


@@ -31,7 +31,7 @@ from .base import FunctionalTest
class TestMaxProjectVolume(FunctionalTest):
PATH = '/projects/project1/meters/volume.size/volume/max'
PATH = '/meters/volume.size/statistics'
def setUp(self):
super(TestMaxProjectVolume, self).setUp()
@@ -60,42 +60,72 @@ class TestMaxProjectVolume(FunctionalTest):
self.conn.record_metering_data(msg)
def test_no_time_bounds(self):
data = self.get_json(self.PATH)
expected = {'volume': 7}
self.assertEqual(data, expected)
data = self.get_json(self.PATH, q=[{'field': 'project_id',
'value': 'project1',
}])
self.assertEqual(data['max'], 7)
self.assertEqual(data['count'], 3)
def test_start_timestamp(self):
data = self.get_json(
self.PATH,
extra_params={'daterange.start': '2012-09-25T11:30:00'})
expected = {'volume': 7}
self.assertEqual(data, expected)
data = self.get_json(self.PATH, q=[{'field': 'project_id',
'value': 'project1',
},
{'field': 'timestamp',
'op': 'ge',
'value': '2012-09-25T11:30:00',
},
])
self.assertEqual(data['max'], 7)
self.assertEqual(data['count'], 2)
def test_start_timestamp_after(self):
data = self.get_json(
self.PATH,
extra_params={'daterange.start': '2012-09-25T12:34:00'})
expected = {'volume': None}
self.assertEqual(data, expected)
data = self.get_json(self.PATH, q=[{'field': 'project_id',
'value': 'project1',
},
{'field': 'timestamp',
'op': 'ge',
'value': '2012-09-25T12:34:00',
},
])
self.assertEqual(data['max'], None)
self.assertEqual(data['count'], 0)
def test_end_timestamp(self):
data = self.get_json(
self.PATH,
extra_params={'daterange.end': '2012-09-25T11:30:00'})
expected = {'volume': 5}
self.assertEqual(data, expected)
data = self.get_json(self.PATH, q=[{'field': 'project_id',
'value': 'project1',
},
{'field': 'timestamp',
'op': 'le',
'value': '2012-09-25T11:30:00',
},
])
self.assertEqual(data['max'], 5)
self.assertEqual(data['count'], 1)
def test_end_timestamp_before(self):
data = self.get_json(
self.PATH,
extra_params={'daterange.end': '2012-09-25T09:54:00'})
expected = {'volume': None}
self.assertEqual(data, expected)
data = self.get_json(self.PATH, q=[{'field': 'project_id',
'value': 'project1',
},
{'field': 'timestamp',
'op': 'le',
'value': '2012-09-25T09:54:00',
},
])
self.assertEqual(data['max'], None)
self.assertEqual(data['count'], 0)
def test_start_end_timestamp(self):
data = self.get_json(
self.PATH,
extra_params={'daterange.start': '2012-09-25T11:30:00',
'daterange.end': '2012-09-25T11:32:00'})
expected = {'volume': 6}
self.assertEqual(data, expected)
data = self.get_json(self.PATH, q=[{'field': 'project_id',
'value': 'project1',
},
{'field': 'timestamp',
'op': 'ge',
'value': '2012-09-25T11:30:00',
},
{'field': 'timestamp',
'op': 'le',
'value': '2012-09-25T11:32:00',
},
])
self.assertEqual(data['max'], 6)
self.assertEqual(data['count'], 1)


@@ -30,7 +30,7 @@ from ceilometer.tests.db import require_map_reduce
class TestMaxResourceVolume(FunctionalTest):
PATH = '/resources/resource-id/meters/volume.size/volume/max'
PATH = '/meters/volume.size/statistics'
def setUp(self):
super(TestMaxResourceVolume, self).setUp()
@@ -59,42 +59,72 @@ class TestMaxResourceVolume(FunctionalTest):
self.conn.record_metering_data(msg)
def test_no_time_bounds(self):
data = self.get_json(self.PATH)
expected = {'volume': 7}
assert data == expected
data = self.get_json(self.PATH, q=[{'field': 'resource_id',
'value': 'resource-id',
}])
assert data['max'] == 7
assert data['count'] == 3
def test_start_timestamp(self):
data = self.get_json(
self.PATH,
extra_params={'daterange.start': '2012-09-25T11:30:00'})
expected = {'volume': 7}
assert data == expected
data = self.get_json(self.PATH, q=[{'field': 'resource_id',
'value': 'resource-id',
},
{'field': 'timestamp',
'op': 'ge',
'value': '2012-09-25T11:30:00',
},
])
assert data['max'] == 7
assert data['count'] == 2
def test_start_timestamp_after(self):
data = self.get_json(
self.PATH,
extra_params={'daterange.start': '2012-09-25T12:34:00'})
expected = {'volume': None}
assert data == expected
data = self.get_json(self.PATH, q=[{'field': 'resource_id',
'value': 'resource-id',
},
{'field': 'timestamp',
'op': 'ge',
'value': '2012-09-25T12:34:00',
},
])
assert data['max'] is None
assert data['count'] == 0
def test_end_timestamp(self):
data = self.get_json(
self.PATH,
extra_params={'daterange.end': '2012-09-25T11:30:00'})
expected = {'volume': 5}
assert data == expected
data = self.get_json(self.PATH, q=[{'field': 'resource_id',
'value': 'resource-id',
},
{'field': 'timestamp',
'op': 'le',
'value': '2012-09-25T11:30:00',
},
])
assert data['max'] == 5
assert data['count'] == 1
def test_end_timestamp_before(self):
data = self.get_json(
self.PATH,
extra_params={'daterange.end': '2012-09-25T09:54:00'})
expected = {'volume': None}
assert data == expected
data = self.get_json(self.PATH, q=[{'field': 'resource_id',
'value': 'resource-id',
},
{'field': 'timestamp',
'op': 'le',
'value': '2012-09-25T09:54:00',
},
])
assert data['max'] is None
assert data['count'] == 0
def test_start_end_timestamp(self):
data = self.get_json(
self.PATH,
extra_params={'daterange.start': '2012-09-25T11:30:00',
'daterange.end': '2012-09-25T11:32:00'})
expected = {'volume': 6}
assert data == expected
data = self.get_json(self.PATH, q=[{'field': 'resource_id',
'value': 'resource-id',
},
{'field': 'timestamp',
'op': 'ge',
'value': '2012-09-25T11:30:00',
},
{'field': 'timestamp',
'op': 'le',
'value': '2012-09-25T11:32:00',
},
])
assert data['max'] == 6
assert data['count'] == 1


@@ -0,0 +1,118 @@
# -*- encoding: utf-8 -*-
#
# Copyright © 2012 New Dream Network, LLC (DreamHost)
#
# Author: Doug Hellmann <doug.hellmann@dreamhost.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Test listing raw events.
"""
import datetime
import logging
import unittest
from ceilometer.api.controllers import v2
LOG = logging.getLogger(__name__)
class TestStatisticsDuration(unittest.TestCase):
def setUp(self):
super(TestStatisticsDuration, self).setUp()
# Create events relative to the range and pretend
# that the intervening events exist.
self.early1 = datetime.datetime(2012, 8, 27, 7, 0)
self.early2 = datetime.datetime(2012, 8, 27, 17, 0)
self.start = datetime.datetime(2012, 8, 28, 0, 0)
self.middle1 = datetime.datetime(2012, 8, 28, 8, 0)
self.middle2 = datetime.datetime(2012, 8, 28, 18, 0)
self.end = datetime.datetime(2012, 8, 28, 23, 59)
self.late1 = datetime.datetime(2012, 8, 29, 9, 0)
self.late2 = datetime.datetime(2012, 8, 29, 19, 0)
def test_nulls(self):
s = v2.Statistics(duration_start=None,
duration_end=None,
start_timestamp=None,
end_timestamp=None,
)
assert s.duration_start is None
assert s.duration_end is None
assert s.duration is None
def test_overlap_range_start(self):
s = v2.Statistics(duration_start=self.early1,
duration_end=self.middle1,
start_timestamp=self.start,
end_timestamp=self.end,
)
assert s.duration_start == self.start
assert s.duration_end == self.middle1
assert s.duration == 8 * 60
def test_within_range(self):
s = v2.Statistics(duration_start=self.middle1,
duration_end=self.middle2,
start_timestamp=self.start,
end_timestamp=self.end,
)
assert s.duration_start == self.middle1
assert s.duration_end == self.middle2
assert s.duration == 10 * 60
def test_within_range_zero_duration(self):
s = v2.Statistics(duration_start=self.middle1,
duration_end=self.middle1,
start_timestamp=self.start,
end_timestamp=self.end,
)
assert s.duration_start == self.middle1
assert s.duration_end == self.middle1
assert s.duration == 0
def test_overlap_range_end(self):
s = v2.Statistics(duration_start=self.middle2,
duration_end=self.late1,
start_timestamp=self.start,
end_timestamp=self.end,
)
assert s.duration_start == self.middle2
assert s.duration_end == self.end
assert s.duration == (6 * 60) - 1
def test_after_range(self):
s = v2.Statistics(duration_start=self.late1,
duration_end=self.late2,
start_timestamp=self.start,
end_timestamp=self.end,
)
assert s.duration_start is None
assert s.duration_end is None
assert s.duration is None
def test_without_timestamp(self):
s = v2.Statistics(duration_start=self.late1,
duration_end=self.late2,
start_timestamp=None,
end_timestamp=None,
)
assert s.duration_start == self.late1
assert s.duration_end == self.late2
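The duration rule these tests pin down — clip the sample window to the query range, and report nothing when the window falls wholly outside it — can be sketched as follows. `clamp_duration` is a hypothetical standalone helper written only to mirror the assertions above, not the actual `Statistics` property:

```python
import datetime


def clamp_duration(duration_start, duration_end,
                   start_timestamp, end_timestamp):
    """Clip the sample window to the query range and return
    (start, end, duration-in-minutes), or Nones if no data fits."""
    if duration_start is None or duration_end is None:
        return None, None, None
    # Clip each bound to the query range when a range is given.
    if start_timestamp and duration_start < start_timestamp:
        duration_start = start_timestamp
    if end_timestamp and duration_end > end_timestamp:
        duration_end = end_timestamp
    # A window entirely outside the range yields no duration at all.
    if duration_start > duration_end:
        return None, None, None
    delta = duration_end - duration_start
    return duration_start, duration_end, delta.total_seconds() / 60
```

With the fixture dates above, a window starting before the range is clipped to the range start, giving the `8 * 60` minutes asserted in `test_overlap_range_start`.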


@@ -30,7 +30,7 @@ from ceilometer.tests.db import require_map_reduce
class TestSumProjectVolume(FunctionalTest):
-PATH = '/projects/project1/meters/volume.size/volume/sum'
+PATH = '/meters/volume.size/statistics'
def setUp(self):
super(TestSumProjectVolume, self).setUp()
@@ -59,42 +59,74 @@ class TestSumProjectVolume(FunctionalTest):
self.conn.record_metering_data(msg)
def test_no_time_bounds(self):
-data = self.get_json(self.PATH)
-expected = {'volume': 5 + 6 + 7}
-assert data == expected
+data = self.get_json(self.PATH, q=[{'field': 'project_id',
+'value': 'project1',
+}])
+expected = 5 + 6 + 7
+assert data['sum'] == expected
+assert data['count'] == 3
def test_start_timestamp(self):
-data = self.get_json(
-self.PATH,
-extra_params={'daterange.start': '2012-09-25T11:30:00'})
-expected = {'volume': 6 + 7}
-assert data == expected
+data = self.get_json(self.PATH, q=[{'field': 'project_id',
+'value': 'project1',
+},
+{'field': 'timestamp',
+'op': 'ge',
+'value': '2012-09-25T11:30:00',
+},
+])
+expected = 6 + 7
+assert data['sum'] == expected
+assert data['count'] == 2
def test_start_timestamp_after(self):
-data = self.get_json(
-self.PATH,
-extra_params={'daterange.start': '2012-09-25T12:34:00'})
-expected = {'volume': None}
-assert data == expected
+data = self.get_json(self.PATH, q=[{'field': 'project_id',
+'value': 'project1',
+},
+{'field': 'timestamp',
+'op': 'ge',
+'value': '2012-09-25T12:34:00',
+},
+])
+assert data['sum'] is None
+assert data['count'] == 0
def test_end_timestamp(self):
-data = self.get_json(
-self.PATH,
-extra_params={'daterange.end': '2012-09-25T11:30:00'})
-expected = {'volume': 5}
-assert data == expected
+data = self.get_json(self.PATH, q=[{'field': 'project_id',
+'value': 'project1',
+},
+{'field': 'timestamp',
+'op': 'le',
+'value': '2012-09-25T11:30:00',
+},
+])
+assert data['sum'] == 5
+assert data['count'] == 1
def test_end_timestamp_before(self):
-data = self.get_json(
-self.PATH,
-extra_params={'daterange.end': '2012-09-25T09:54:00'})
-expected = {'volume': None}
-assert data == expected
+data = self.get_json(self.PATH, q=[{'field': 'project_id',
+'value': 'project1',
+},
+{'field': 'timestamp',
+'op': 'le',
+'value': '2012-09-25T09:54:00',
+},
+])
+assert data['sum'] is None
+assert data['count'] == 0
def test_start_end_timestamp(self):
-data = self.get_json(
-self.PATH,
-extra_params={'daterange.start': '2012-09-25T11:30:00',
-'daterange.end': '2012-09-25T11:32:00'})
-expected = {'volume': 6}
-assert data == expected
+data = self.get_json(self.PATH, q=[{'field': 'project_id',
+'value': 'project1',
+},
+{'field': 'timestamp',
+'op': 'ge',
+'value': '2012-09-25T11:30:00',
+},
+{'field': 'timestamp',
+'op': 'le',
+'value': '2012-09-25T11:32:00',
+},
+])
+assert data['sum'] == 6
+assert data['count'] == 1
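All of these tests share the v2 query shape: a list of `{'field', 'op', 'value'}` dicts with `op` defaulting to `'eq'`. A rough sketch of the `_query_to_kwargs` translation the commit message describes — the function name, field mapping, and error behaviour here are assumptions drawn from the tests, not the real implementation:

```python
def query_to_kwargs(query):
    """Translate a v2 query list into keyword arguments for the
    storage layer, defaulting the comparison op to 'eq'."""
    # API field names -> storage-layer keyword names (assumed mapping).
    translation = {'user_id': 'user',
                   'project_id': 'project',
                   'resource_id': 'resource'}
    kwargs = {}
    for item in query:
        op = item.get('op', 'eq')  # the op defaults to equality
        field = item['field']
        if field == 'timestamp':
            # ge/gt bound the start of the range, le/lt bound the end
            if op in ('ge', 'gt'):
                kwargs['start_timestamp'] = item['value']
            elif op in ('le', 'lt'):
                kwargs['end_timestamp'] = item['value']
        elif op == 'eq':
            kwargs[translation.get(field, field)] = item['value']
        else:
            # warn of unimplemented queries rather than silently drop them
            raise ValueError('unimplemented operator %s on %s' % (op, field))
    return kwargs
```

This is also where the "protect from passing in the wrong arguments into the db api" fix lives: only known keywords ever reach the storage driver.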


@@ -30,7 +30,7 @@ from ceilometer.tests.db import require_map_reduce
class TestSumResourceVolume(FunctionalTest):
-PATH = '/resources/resource-id/meters/volume.size/volume/sum'
+PATH = '/meters/volume.size/statistics'
def setUp(self):
super(TestSumResourceVolume, self).setUp()
@@ -59,42 +59,67 @@ class TestSumResourceVolume(FunctionalTest):
self.conn.record_metering_data(msg)
def test_no_time_bounds(self):
-data = self.get_json(self.PATH)
-expected = {'volume': 5 + 6 + 7}
-assert data == expected
+data = self.get_json(self.PATH, q=[{'field': 'resource_id',
+'value': 'resource-id',
+}])
+assert data['sum'] == 5 + 6 + 7
+assert data['count'] == 3
def test_start_timestamp(self):
-data = self.get_json(
-self.PATH,
-extra_params={'daterange.start': '2012-09-25T11:30:00'})
-expected = {'volume': 6 + 7}
-assert data == expected
+data = self.get_json(self.PATH, q=[{'field': 'resource_id',
+'value': 'resource-id',
+},
+{'field': 'timestamp',
+'op': 'ge',
+'value': '2012-09-25T11:30:00',
+}])
+assert data['sum'] == 6 + 7
+assert data['count'] == 2
def test_start_timestamp_after(self):
-data = self.get_json(
-self.PATH,
-extra_params={'daterange.start': '2012-09-25T12:34:00'})
-expected = {'volume': None}
-assert data == expected
+data = self.get_json(self.PATH, q=[{'field': 'resource_id',
+'value': 'resource-id',
+},
+{'field': 'timestamp',
+'op': 'ge',
+'value': '2012-09-25T12:34:00',
+}])
+assert data['sum'] is None
+assert data['count'] == 0
def test_end_timestamp(self):
-data = self.get_json(
-self.PATH,
-extra_params={'daterange.end': '2012-09-25T11:30:00'})
-expected = {'volume': 5}
-assert data == expected
+data = self.get_json(self.PATH, q=[{'field': 'resource_id',
+'value': 'resource-id',
+},
+{'field': 'timestamp',
+'op': 'le',
+'value': '2012-09-25T11:30:00',
+}])
+assert data['sum'] == 5
+assert data['count'] == 1
def test_end_timestamp_before(self):
-data = self.get_json(
-self.PATH,
-extra_params={'daterange.end': '2012-09-25T09:54:00'})
-expected = {'volume': None}
-assert data == expected
+data = self.get_json(self.PATH, q=[{'field': 'resource_id',
+'value': 'resource-id',
+},
+{'field': 'timestamp',
+'op': 'le',
+'value': '2012-09-25T09:54:00',
+}])
+assert data['sum'] is None
+assert data['count'] == 0
def test_start_end_timestamp(self):
-data = self.get_json(
-self.PATH,
-extra_params={'daterange.start': '2012-09-25T11:30:00',
-'daterange.end': '2012-09-25T11:32:00'})
-expected = {'volume': 6}
-assert data == expected
+data = self.get_json(self.PATH, q=[{'field': 'resource_id',
+'value': 'resource-id',
+},
+{'field': 'timestamp',
+'op': 'ge',
+'value': '2012-09-25T11:30:00',
+},
+{'field': 'timestamp',
+'op': 'lt',
+'value': '2012-09-25T11:32:00',
+}])
+assert data['sum'] == 6
+assert data['count'] == 1


@@ -526,3 +526,89 @@ class TestGetEventInterval(MongoDBEngineTestBase):
s, e = self.conn.get_event_interval(self._filter)
assert s is None
assert e is None
class StatisticsTest(MongoDBEngineTestBase):
def setUp(self):
super(StatisticsTest, self).setUp()
require_map_reduce(self.conn)
self.counters = []
for i in range(3):
c = counter.Counter(
'volume.size',
'gauge',
'GiB',
5 + i,
'user-id',
'project1',
'resource-id',
timestamp=datetime.datetime(2012, 9, 25, 10 + i, 30 + i),
resource_metadata={'display_name': 'test-volume',
'tag': 'self.counter',
}
)
self.counters.append(c)
msg = meter.meter_message_from_counter(c,
secret='not-so-secret',
source='test',
)
self.conn.record_metering_data(msg)
for i in range(3):
c = counter.Counter(
'volume.size',
'gauge',
'GiB',
8 + i,
'user-5',
'project2',
'resource-6',
timestamp=datetime.datetime(2012, 9, 25, 10 + i, 30 + i),
resource_metadata={'display_name': 'test-volume',
'tag': 'self.counter',
}
)
self.counters.append(c)
msg = meter.meter_message_from_counter(c,
secret='not-so-secret',
source='test',
)
self.conn.record_metering_data(msg)
def test_by_user(self):
f = storage.EventFilter(
user='user-5',
meter='volume.size',
)
results = self.conn.get_meter_statistics(f)
assert results['count'] == 3
assert results['min'] == 8
assert results['max'] == 10
assert results['sum'] == 27
assert results['avg'] == 9
def test_by_project(self):
f = storage.EventFilter(
meter='volume.size',
resource='resource-id',
start='2012-09-25T11:30:00',
end='2012-09-25T11:32:00',
)
results = self.conn.get_meter_statistics(f)
assert results['count'] == 1
assert results['min'] == 6
assert results['max'] == 6
assert results['sum'] == 6
assert results['avg'] == 6
def test_one_resource(self):
f = storage.EventFilter(
user='user-id',
meter='volume.size',
)
results = self.conn.get_meter_statistics(f)
assert results['count'] == 3
assert results['min'] == 5
assert results['max'] == 7
assert results['sum'] == 18
assert results['avg'] == 6
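The asserted figures follow directly from the recorded volumes (5, 6, 7 for project1 and 8, 9, 10 for project2). A minimal pure-Python aggregation, shown only to make the expected dict shape explicit — the mongo driver itself computes this with map-reduce, and per the commit message must also cope with the query returning no data:

```python
def compute_statistics(volumes):
    """Aggregate a list of sample volumes into the statistics dict
    shape the storage drivers return; empty input yields no data."""
    if not volumes:
        return {'count': 0, 'min': None, 'max': None,
                'sum': None, 'avg': None}
    total = sum(volumes)
    return {'count': len(volumes),
            'min': min(volumes),
            'max': max(volumes),
            'sum': total,
            'avg': total / len(volumes)}
```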


@@ -734,6 +734,91 @@ class MaxResourceTest(SQLAlchemyEngineSubBase):
assert results == expected
class StatisticsTest(SQLAlchemyEngineSubBase):
def setUp(self):
super(StatisticsTest, self).setUp()
self.counters = []
for i in range(3):
c = counter.Counter(
'volume.size',
'gauge',
'GiB',
5 + i,
'user-id',
'project1',
'resource-id',
timestamp=datetime.datetime(2012, 9, 25, 10 + i, 30 + i),
resource_metadata={'display_name': 'test-volume',
'tag': 'self.counter',
}
)
self.counters.append(c)
msg = meter.meter_message_from_counter(c,
cfg.CONF.metering_secret,
'source1',
)
self.conn.record_metering_data(msg)
for i in range(3):
c = counter.Counter(
'volume.size',
'gauge',
'GiB',
8 + i,
'user-5',
'project2',
'resource-6',
timestamp=datetime.datetime(2012, 9, 25, 10 + i, 30 + i),
resource_metadata={'display_name': 'test-volume',
'tag': 'self.counter',
}
)
self.counters.append(c)
msg = meter.meter_message_from_counter(c,
cfg.CONF.metering_secret,
'source1',
)
self.conn.record_metering_data(msg)
def test_by_user(self):
f = storage.EventFilter(
user='user-5',
meter='volume.size',
)
results = self.conn.get_meter_statistics(f)
assert results['count'] == 3
assert results['min'] == 8
assert results['max'] == 10
assert results['sum'] == 27
assert results['avg'] == 9
def test_by_project(self):
f = storage.EventFilter(
meter='volume.size',
resource='resource-id',
start='2012-09-25T11:30:00',
end='2012-09-25T11:32:00',
)
results = self.conn.get_meter_statistics(f)
assert results['count'] == 1
assert results['min'] == 6
assert results['max'] == 6
assert results['sum'] == 6
assert results['avg'] == 6
def test_one_resource(self):
f = storage.EventFilter(
user='user-id',
meter='volume.size',
)
results = self.conn.get_meter_statistics(f)
assert results['count'] == 3
assert results['min'] == 5
assert results['max'] == 7
assert results['sum'] == 18
assert results['avg'] == 6
def test_model_table_args():
cfg.CONF.database_connection = 'mysql://localhost'
assert table_args()
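On the SQL side the same five statistics come out of a single aggregate query. The sketch below uses stdlib sqlite3 against a deliberately simplified stand-in table, not the driver's real sqlalchemy model (which would presumably build the equivalent with `func.min`/`func.max`/`func.sum`/`func.avg`):

```python
import sqlite3

# Simplified stand-in for the meter table; columns are illustrative only.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE meter (user_id TEXT, counter_volume REAL)')
conn.executemany('INSERT INTO meter VALUES (?, ?)',
                 [('user-5', 8), ('user-5', 9), ('user-5', 10),
                  ('user-id', 5), ('user-id', 6), ('user-id', 7)])

# One aggregate query yields all five statistics for a filtered user.
row = conn.execute(
    'SELECT COUNT(counter_volume), MIN(counter_volume), '
    'MAX(counter_volume), SUM(counter_volume), AVG(counter_volume) '
    'FROM meter WHERE user_id = ?', ('user-5',)).fetchone()
count, vmin, vmax, vsum, vavg = row
```

Doing the aggregation in the database keeps `get_meter_statistics()` to a single round trip instead of pulling every sample back to Python.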


@@ -8,6 +8,13 @@ then
project_name=demo
fi
if [ -z "$OS_USERNAME" ]
then
user=demo
else
user=$OS_USERNAME
fi
# Convert a possible project name to an id, if we have
# keystone installed.
if which keystone >/dev/null
@@ -41,7 +48,7 @@ late2="2012-08-31T20:00:00"
mkdata() {
${bindir}/make_test_data.py --project "$project" \
-    --start "$2" --end "$3" \
+    --user "$user" --start "$2" --end "$3" \
"$1" instance:m1.tiny 1
}