Fuel plugin builder v5

- Fuel release configuration can now be delivered as a Fuel plugin
  [bp/release-as-a-plugin](https://blueprints.launchpad.net/fuel/+spec/release-as-a-plugin).
  Setting the flag ``is_release: true`` in the ``releases:`` section of
  ``metadata.yaml`` defines a new Fuel release. You can also define
  ``base_release: release_template.yaml`` inside a release description to
  share a single template between multiple releases.
- ``*_path`` directives are now supported in ``metadata.yaml``. You can
  reference a folder, an external file, or the merged output of a glob using
  keys like ``deployment_tasks_path: 'dt.yaml'``.
- Graphs linked with the plugin or with releases can now be defined via a
  directive inside ``metadata.yaml``; see the examples.
- All root YAML files except ``metadata.yaml`` are now optional.
- Added templates and examples for the Fuel plugin package ``v5.0.0``.
- Fuel plugin builder has been refactored to make all configurations traceable
  and to decouple the file system logic from the validation and build logic.
  [LP1539598](https://bugs.launchpad.net/fuel/+bug/1539598)
- Added a Docker environment example for creating and building plugins.
- Added experimental ``JSON`` manifest support.
- Schemas are aligned with Fuel versions.
- Advanced build process reporting. FPB now tries to detect all possible
  problems in the plugin configuration and report them instead of failing on
  the first one; it can also warn the user without stopping execution.
- Plugin package v4.0.0 can now be built without ``tasks.yaml``.

Change-Id: I55d0313db7cd64ab16802a75ff0d9edd16782d01
Implements-blueprint: release-as-a-plugin
Closes-Bug: #1539598
Closes-Bug: #1552248
Ilya Kutukov 2016-09-22 11:36:57 +03:00
parent 6dbc520503
commit 56b0d8ffa4
104 changed files with 7615 additions and 5278 deletions

.gitignore vendored
View File

@ -14,12 +14,6 @@ built_plugins/
# Packages
*.egg*
*.egg-info
dist
build
eggs
parts
bin
var
sdist
develop-eggs
.installed.cfg
@ -62,3 +56,5 @@ ChangeLog
.*.swp
.*sw?
.idea
.docker_build
build

View File

@ -1,5 +1,47 @@
# Changelog
## 5.0.0 (Not released)
The new package version "5.0.0" includes the following features:
- Fuel release configuration can now be delivered as a Fuel plugin
[bp/release-as-a-plugin](https://blueprints.launchpad.net/fuel/+spec/release-as-a-plugin).
Setting the flag ``is_release: true`` in the ``releases:`` section of
``metadata.yaml`` defines a new Fuel release (see the configuration sketch
below). You can also define ``base_release: release_template.yaml`` inside a
release description to share a single template between multiple releases.
- ``*_path`` directives are now supported in ``metadata.yaml``. You can
reference a folder, an external file, or the merged output of a glob using
keys like ``deployment_tasks_path: 'dt.yaml'``.
- Graphs linked with the plugin or with releases can now be defined via a
directive inside ``metadata.yaml``; see the examples.
- All root YAML files except ``metadata.yaml`` are now optional.
- Added templates and examples for the Fuel plugin package ``v5.0.0``.
- Fuel plugin builder has been refactored to make all configurations traceable
and to decouple the file system logic from the validation and build logic.
[LP1539598](https://bugs.launchpad.net/fuel/+bug/1539598)
This release also includes several experimental features and improvements:
- Added a Docker environment example for creating and building plugins.
- Added experimental ``JSON`` manifest support.
- Schemas are aligned with Fuel versions.
- Advanced build process reporting. FPB now tries to detect all possible
problems in the plugin configuration and report them instead of failing on
the first one; it can also warn the user without stopping execution.
The reporting tree provides a clear hierarchical output and improves
integration by supporting several output formats: ``json``, ``yaml`` and
plain text.
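A minimal ``metadata.yaml`` sketch combining the new directives (all names,
paths and values below are illustrative, not prescribed):

```yaml
name: example-plugin
package_version: '5.0.0'

# *_path keys may point to a file, a folder, or a glob whose output is merged:
deployment_tasks_path: 'dt.yaml'

# graphs may be linked with the plugin or with individual releases:
graphs:
  - type: default
    tasks_path: graphs/default.yaml

releases:
  - is_release: true                     # defines a new Fuel release
    base_release: release_template.yaml  # template shared between releases
```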
Bugfixes:
- Plugin package v4.0.0 can now be built without ``tasks.yaml``.
[LP1552248](https://bugs.launchpad.net/fuel/+bug/1552248)
## 4.1.0 (2016-06-29)
Bugfixes:

Dockerfile Normal file
View File

@ -0,0 +1,24 @@
FROM centos:7
RUN yum update -y; yum clean all
RUN yum install -y epel-release
RUN yum install -y git
RUN yum install -y python-pip python-wheel
RUN yum install -y rpm-build createrepo dpkg-scanpackages
#install epel as we need dpkg from epel-testing
RUN yum -y install epel-release
#install dpkg binaries from epel-testing
RUN yum -y install --enablerepo=epel-testing dpkg-dev tar
RUN mkdir -p /build
VOLUME /build
COPY . /fuel-plugins
WORKDIR /fuel-plugins
RUN python setup.py install
RUN pip install -r requirements.txt
ENTRYPOINT cd /build/ && find . -mindepth 1 -maxdepth 1 -type d -exec fpb --build {} \;
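# A possible way to use this image (the tag and host directory below are
# illustrative, not part of this commit):
#   docker build -t fpb .
#   docker run -v "$(pwd)/plugins:/build" fpb
# The entrypoint above then builds every plugin directory found under /build.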

View File

@ -11,7 +11,6 @@ can be spread across multiple nodes. Or you might want to use a GlusterFS
plugin so that you can use a Gluster file system as backend for Cinder
volumes.
Finding Plugins
---------------
@ -64,6 +63,98 @@ This will:
* clone the fuel_plugin_example plugin with the name fuel_plugin_name
* build the plugin ``.rpm`` package.
This is not working on my OS
````````````````````````````
You can find an example of a Docker setup that creates and builds a plugin by
running:
.. code:: bash
tox -edocker
How Fuel plugin builder works
-----------------------------
The Fuel plugin builder entry point is the ``./cli.py`` file, which provides
CLI command bindings to the ``./actions`` module.
Two actions are available (see the usage sketch after the step list below):
* create
  Bootstraps the plugin file structure by combining files from the
  ``templates/*`` folders defined in ``./version_mapping.py``.
* build
  Builds the plugin package from the working directory.
The build involves 5 steps, with a module responsible for each step for the
given plugin package version:
* Preloading (single for all)
* Loading
* Validation
* Data schema checking
* Package Build
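For example, the two actions are invoked as follows (the plugin name is
illustrative):

.. code:: bash

    fpb --create fuel_plugin_example   # bootstrap the plugin file structure
    fpb --build fuel_plugin_example    # run the build steps described below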
Preloading opens metadata.yaml, looks up the package version and chooses the
appropriate classes using `./version_mapping.py`.
Loading is performed by the Loader class of the given version, which knows
where to look for files, how to understand their formats and how to resolve
external references inside the data. The loader output is a list/dict tree
with the metadata.yaml content as its root. A loading report tree is attached
to this list/dict structure, and based on the report status FPB decides
whether to continue the build process or reject it, providing a failure
report to the developer.
Validation is performed by one of the Validator classes located in the
./validators folder, which takes the list/dict data tree as input. The
validator business logic takes the data tree apart and applies @check
functions to its branches, producing a report tree (consisting of ReportNode
instances) as output.
JSON Schema checks are the part of validation that makes sure the shape of
data tree branches, or of the whole tree, is valid. We are making plugins for
Fuel, so the data structure schemas follow Fuel versioning (starting from
v6.0) and you can easily express which Fuel version your package validation
should be compatible with. These schemas are located in the ./schemas folder.
Building itself copies files (preserving their permissions) and makes an
`rpm` package based on your plugin's `metadata.yaml`, the command line
arguments and `plugin_rpm.spec.mako` (whose path is defined by the
`rpm_spec_src_path` builder attribute), which is rendered with this context
to a plugin_rpm.spec file.
All validation and loading processes produce reports.
Reports are trees of ReportNode() instances.
You can write messages with `report_node.error('ERROR!')`,
`report_node.warning('Warning!')` and `report_node.info('Info')`, and attach
nodes to one another with
`report_node.add_nodes(ReportNode('Im a child!'), ReportNode('Im too!'))`.
Best of all, you can render every tree branch as a text log, YAML or JSON
document by simply calling `print report_branch_node.render('yaml')`, as
sketched below.
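A minimal sketch of this API (node titles and messages are illustrative):

.. code:: python

    from fuel_plugin_builder import utils

    root = utils.ReportNode('Checking plugin')
    child = utils.ReportNode('Checking metadata.yaml')
    child.warning('Field "fuel_version" is deprecated')
    root.add_nodes(child)

    print(root.render('yaml'))  # 'json' and plain text are also supported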
How FPB and Fuel versions are aligned?
``````````````````````````````````````
Fuel Plugin Builder <-> Fuel versions mapping:

====  =====  =====
Fuel  FPB    Tasks
====  =====  =====
6.0   1.0.0  0.0.0
6.0   1.0.1  0.0.0
6.0   1.0.2  0.0.1
6.1   2.0.0  1.0.0
6.1   2.0.1  1.0.0
6.1   2.0.2  1.0.0
6.1   2.0.3  1.0.0
6.1   2.0.4  1.0.0
7.0   3.0.0  1.0.1
8.0   4.0.0  2.0.0
8.0   4.1.0  2.1.0
9.1   5.0.0  2.2.0
====  =====  =====
Examples
````````

build_releases_plugin.sh Executable file
View File

@ -0,0 +1,6 @@
sudo python setup.py install
mkdir -p ./.docker_build/ && cd ./.docker_build/
rm -rf ./release-plugin
fpb --create release-plugin --fuel-import --library-path ../../fuel-library --nailgun-path ../../fuel-web/nailgun/nailgun
cd ..
tox -edocker

View File

@ -12,12 +12,6 @@
# License for the specific language governing permissions and limitations
# under the License.
from fuel_plugin_builder.actions.base import BaseAction
from fuel_plugin_builder.actions.create import CreatePlugin
from fuel_plugin_builder.actions.build import BuildPluginV1
from fuel_plugin_builder.actions.build import BuildPluginV2
from fuel_plugin_builder.actions.build import BuildPluginV3
from fuel_plugin_builder.actions.build import BuildPluginV4
from fuel_plugin_builder.actions.build import BuildPluginV5
from fuel_plugin_builder.actions.build import make_builder
from .base import BaseAction
from .build import make_builder
from .create import CreatePlugin

View File

@ -16,270 +16,21 @@
from __future__ import unicode_literals
import abc
import logging
import os
from os.path import join as join_path
from fuel_plugin_builder.actions import BaseAction
from fuel_plugin_builder import errors
from fuel_plugin_builder import utils
from fuel_plugin_builder.validators import ValidatorManager
from fuel_plugin_builder import version_mapping
import fuel_plugin_builder
logger = logging.getLogger(__name__)
class BaseBuildPlugin(BaseAction):
@abc.abstractproperty
def requires(self):
"""Should return a list of commands which
are required for the builder
"""
@abc.abstractproperty
def result_package_mask(self):
"""Should return mask for built package
"""
@abc.abstractmethod
def make_package(self):
"""Method should be implemented in child classes
"""
def __init__(self, plugin_path):
self.plugin_path = plugin_path
self.pre_build_hook_cmd = './pre_build_hook'
self.meta = utils.parse_yaml(
join_path(self.plugin_path, 'metadata.yaml')
)
self.build_dir = join_path(self.plugin_path, '.build')
self.build_src_dir = join_path(self.build_dir, 'src')
self.checksums_path = join_path(self.build_src_dir, 'checksums.sha1')
self.name = self.meta['name']
def run(self):
logger.debug('Start plugin building "%s"', self.plugin_path)
self.clean()
self.run_pre_build_hook()
self.check()
self.build_repos()
self.add_checksums_file()
self.make_package()
def clean(self):
utils.remove(self.build_dir)
utils.create_dir(self.build_dir)
utils.remove_by_mask(self.result_package_mask)
def run_pre_build_hook(self):
if utils.which(join_path(self.plugin_path, self.pre_build_hook_cmd)):
utils.exec_cmd(self.pre_build_hook_cmd, self.plugin_path)
def add_checksums_file(self):
utils.create_checksums_file(self.build_src_dir, self.checksums_path)
def build_repos(self):
utils.create_dir(self.build_src_dir)
utils.copy_files_in_dir(
join_path(self.plugin_path, '*'),
self.build_src_dir)
releases_paths = {}
for release in self.meta['releases']:
releases_paths.setdefault(release['os'], [])
releases_paths[release['os']].append(
join_path(self.build_src_dir, release['repository_path']))
self.build_ubuntu_repos(releases_paths.get('ubuntu', []))
self.build_centos_repos(releases_paths.get('centos', []))
@classmethod
def build_ubuntu_repos(cls, releases_paths):
for repo_path in releases_paths:
utils.exec_piped_cmds(
['dpkg-scanpackages -m .', 'gzip -c9 > Packages.gz'],
cwd=repo_path)
@classmethod
def build_centos_repos(cls, releases_paths):
for repo_path in releases_paths:
repo_packages = join_path(repo_path, 'Packages')
utils.create_dir(repo_packages)
utils.move_files_in_dir(
join_path(repo_path, '*.rpm'),
repo_packages)
utils.exec_cmd('createrepo -o {0} {0}'.format(repo_path))
def check(self):
self._check_requirements()
self._check_structure()
def _check_requirements(self):
not_found = filter(lambda r: not utils.which(r), self.requires)
if not_found:
raise errors.FuelCannotFindCommandError(
'Cannot find commands "{0}", '
'install required commands and try again'.format(
', '.join(not_found)))
def _check_structure(self):
ValidatorManager(self.plugin_path).get_validator().validate()
class BuildPluginV1(BaseBuildPlugin):
requires = ['rpm', 'createrepo', 'dpkg-scanpackages']
@property
def result_package_mask(self):
return join_path(self.plugin_path, '{0}-*.fp'.format(self.name))
def make_package(self):
full_name = '{0}-{1}'.format(self.meta['name'],
self.meta['version'])
tar_name = '{0}.fp'.format(full_name)
tar_path = join_path(
self.plugin_path,
tar_name)
utils.make_tar_gz(self.build_src_dir, tar_path, full_name)
class BuildPluginV2(BaseBuildPlugin):
requires = ['rpmbuild', 'rpm', 'createrepo', 'dpkg-scanpackages']
rpm_spec_src_path = 'templates/v2/build/plugin_rpm.spec.mako'
release_tmpl_src_path = 'templates/v2/build/Release.mako'
def __init__(self, *args, **kwargs):
super(BuildPluginV2, self).__init__(*args, **kwargs)
self.plugin_version, self.full_version = utils.version_split_name_rpm(
self.meta['version'])
self.rpm_path = os.path.abspath(
join_path(self.plugin_path, '.build', 'rpm'))
self.rpm_src_path = join_path(self.rpm_path, 'SOURCES')
self.full_name = '{0}-{1}'.format(
self.meta['name'], self.plugin_version)
tar_name = '{0}.fp'.format(self.full_name)
self.tar_path = join_path(self.rpm_src_path, tar_name)
fpb_dir = join_path(os.path.dirname(__file__), '..')
self.spec_src = os.path.abspath(join_path(
fpb_dir, self.rpm_spec_src_path))
self.release_tmpl_src = os.path.abspath(join_path(
fpb_dir, self.release_tmpl_src_path))
self.spec_dst = join_path(self.rpm_path, 'plugin_rpm.spec')
self.rpm_packages_mask = join_path(
self.rpm_path, 'RPMS', 'noarch', '*.rpm')
@property
def result_package_mask(self):
return join_path(
self.plugin_path, '{0}-*.noarch.rpm'.format(self.name))
def make_package(self):
"""Builds rpm package
"""
utils.create_dir(self.rpm_src_path)
utils.make_tar_gz(self.build_src_dir, self.tar_path, self.full_name)
utils.render_to_file(
self.spec_src,
self.spec_dst,
self._make_data_for_template())
utils.exec_cmd(
'rpmbuild -vv --nodeps --define "_topdir {0}" '
'-bb {1}'.format(self.rpm_path, self.spec_dst))
utils.copy_files_in_dir(self.rpm_packages_mask, self.plugin_path)
def _make_data_for_template(self):
"""Generates data for spec template
:returns: dictionary with required data
"""
return {
'name': self.full_name,
'version': self.full_version,
'summary': self.meta['title'],
'description': self.meta['description'],
'license': ' and '.join(self.meta.get('licenses', [])),
'homepage': self.meta.get('homepage'),
'vendor': ', '.join(self.meta.get('authors', [])),
'year': utils.get_current_year()}
def build_ubuntu_repos(self, releases_paths):
for repo_path in releases_paths:
utils.exec_piped_cmds(
['dpkg-scanpackages -m .', 'gzip -c9 > Packages.gz'],
cwd=repo_path)
release_path = join_path(repo_path, 'Release')
utils.render_to_file(
self.release_tmpl_src,
release_path,
{'plugin_name': self.meta['name'],
'major_version': self.plugin_version})
class BuildPluginV3(BuildPluginV2):
rpm_spec_src_path = 'templates/v3/build/plugin_rpm.spec.mako'
release_tmpl_src_path = 'templates/v3/build/Release.mako'
def _make_data_for_template(self):
data = super(BuildPluginV3, self)._make_data_for_template()
uninst = utils.read_if_exist(
join_path(self.plugin_path, "uninstall.sh"))
preinst = utils.read_if_exist(
join_path(self.plugin_path, "pre_install.sh"))
postinst = utils.read_if_exist(
join_path(self.plugin_path, "post_install.sh"))
plugin_build_version = str(self.meta.get('build_version', '1'))
data.update(
{'postinstall_hook': postinst,
'preinstall_hook': preinst,
'uninstall_hook': uninst,
'build_version': plugin_build_version}
)
return data
class BuildPluginV4(BuildPluginV3):
pass
class BuildPluginV5(BuildPluginV4):
pass
def make_builder(plugin_path):
"""Creates build object.
:param str plugin_path: path to the plugin
:returns: specific version of builder object
"""
builder = version_mapping.get_version_mapping_from_plugin(
plugin_path)['builder']
builder = \
fuel_plugin_builder.version_mapping.get_plugin_package_config_for_path(
plugin_path
)['builder']
return builder(plugin_path)

View File

@ -18,48 +18,207 @@ import logging
import os
import re
from fuel_plugin_builder.actions import BaseAction
import jinja2
import six
import yaml
import fuel_plugin_builder
from fuel_plugin_builder.actions.base import BaseAction
from fuel_plugin_builder import consts
from fuel_plugin_builder import errors
from fuel_plugin_builder import messages
from fuel_plugin_builder import utils
from fuel_plugin_builder import version_mapping
logger = logging.getLogger(__name__)
class CreatePlugin(BaseAction):
plugin_name_pattern = re.compile(consts.PLUGIN_NAME_PATTERN)
def __init__(self, plugin_path, package_version=None):
def __init__(
self,
plugin_path,
package_version=None,
fuel_import=False,
nailgun_path=None,
library_path=None):
self.plugin_name = utils.basename(plugin_path.rstrip('/'))
self.plugin_path = plugin_path
self.package_version = (package_version or
version_mapping.latest_version)
consts.LATEST_VERSION)
self.fuel_import = fuel_import
self.nailgun_path = nailgun_path
self.library_path = library_path
self.render_ctx = {'plugin_name': self.plugin_name}
self.template_paths = version_mapping.get_plugin_for_version(
self.package_version)['templates']
self.template_paths = \
fuel_plugin_builder.version_mapping.get_plugin_package_config(
self.package_version)['templates']
def check(self):
if utils.exists(self.plugin_path):
if utils.is_exists(self.plugin_path):
raise errors.PluginDirectoryExistsError(
'Plugins directory {0} already exists, '
'choose another name'.format(self.plugin_path))
if not self.plugin_name_pattern.match(self.plugin_name):
raise errors.ValidationError(
messages.PLUGIN_WRONG_NAME_EXCEPTION_MESSAGE)
"Plugin name is invalid, use only lower "
"case letters, numbers, '_', '-' symbols")
def run(self):
logger.debug('Start plugin creation "%s"', self.plugin_path)
report = utils.ReportNode(
'Start plugin creation "{}"'.format(self.plugin_path))
# todo(ikutukov): add to report
self.check()
for template_path in self.template_paths:
template_dir = os.path.join(
os.path.dirname(__file__), '..', template_path)
report.info('Adding template from {}'.format(template_dir))
utils.copy(template_dir, self.plugin_path)
utils.render_files_in_dir(self.plugin_path, self.render_ctx)
if self.fuel_import:
report.info("Applying Nailgun configuration")
report.add_nodes(
self.import_releases(
self.nailgun_path,
self.library_path
)
)
else:
report.info("Creating fresh plugin")
report.info('Plugin bootstrap is created')
return report
def make_release_files_and_metadata(self, release_data, graphs):
fields = {
'networks_metadata': 'metadata/networks.yaml',
'volumes_metadata': 'metadata/volumes.yaml',
'roles_metadata': 'metadata/roles.yaml',
'network_roles_metadata': 'metadata/network_roles.yaml',
'attributes_metadata': 'attributes/attributes.yaml',
'vmware_attributes_metadata': 'attributes/vmware.yaml',
'node_attributes_metadata': 'attributes/node.yaml',
'nic_attributes_metadata': 'attributes/nic.yaml',
'bond_attributes_metadata': 'attributes/bond.yaml',
'node_attributes': 'attributes/node.yaml',
}
report = utils.ReportNode(
'Adding release: {}'.format(release_data.get('name')))
result = {
'is_release': True,
'deployment_scripts_path': 'deployment_scripts',
'repository_path': 'repositories/ubuntu'
}
fm = utils.FilesManager()
def _safe_string(unsafe_string):
return "".join(
[c if re.match(r'\w', c) else '-' for c in unsafe_string]
).lower()
for f in release_data:
if f in fields:
relative_path = os.path.join(
_safe_string(release_data.get('name')),
fields[f]
)
fm.save(
os.path.join(
self.plugin_path,
relative_path
),
release_data[f]
)
result[f.replace('_metadata', '') + '_path'] = relative_path
else:
# leave it at root metadata
result[f] = release_data[f]
result['graphs'] = graphs
return report.mix_to_data(result)
def import_releases(self, nailgun_path, library_path):
report = utils.ReportNode('Importing releases from nailgun')
if not nailgun_path:
return report.error('No nailgun path defined')
if not library_path:
return report.error('No nailgun library path defined')
plugin_metadata_path = os.path.join(self.plugin_path, 'metadata.yaml')
report.info('Using: {}'.format(plugin_metadata_path))
openstack_file_path = os.path.join(
nailgun_path, 'fixtures', 'openstack.yaml')
report.info('Using: {}'.format(openstack_file_path))
fuel_settings_path = os.path.join(nailgun_path, 'settings.yaml')
report.info('Using: {}'.format(fuel_settings_path))
library_tasks_path = os.path.join(library_path, '**/tasks.yaml')
report.info('Using: {}'.format(library_tasks_path))
fm = utils.FilesManager()
library_graphs = []
library_tasks = fm.load(library_tasks_path) or []
if library_tasks:
library_graphs.append({
'type': 'default',
'name': 'default',
'tasks': library_tasks
})
for graph in library_graphs:
tasks_path = os.path.join('graphs', graph['type'] + '.yaml')
fm.save(
os.path.join(self.plugin_path, tasks_path),
graph['tasks']
)
graph['tasks_path'] = tasks_path
del graph['tasks']
fixture = fm.load(openstack_file_path, decode=False)
nailgun_settings = fm.load(fuel_settings_path)
# taken from nailgun fixman
t = jinja2.Template(fixture)
fixture = yaml.load(
six.StringIO(t.render(settings=nailgun_settings)))
for i in range(0, len(fixture)):
def extend(obj):
if 'extend' in obj:
obj['extend'] = extend(obj['extend'])
return utils.dict_merge(obj.get('extend', {}), obj)
fixture[i] = extend(fixture[i])
fixture[i].pop('extend', None)
# returning to FPB codebase
releases_content = [
r['fields']
for r in fixture
if r.get('pk', None) is not None
]
releases_root_metadata = []
for release_content in releases_content:
result = self.make_release_files_and_metadata(release_content,
library_graphs)
report.add_nodes(result.report)
releases_root_metadata.append(dict(result))
report.info('Saving to {}'.format(plugin_metadata_path))
plugin_metadata = fm.load(plugin_metadata_path)
plugin_metadata['releases'] = releases_root_metadata
plugin_metadata['name'] = 'plugin-releases'
fm.save(plugin_metadata_path, plugin_metadata)
report.info('Done')
return report

View File

@ -0,0 +1,4 @@
from .builder_base import PluginBuilderBase
from .builder_v1 import PluginBuilderV1
from .builder_v2 import PluginBuilderV2
from .builder_v3 import PluginBuilderV3

View File

@ -0,0 +1,157 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from __future__ import unicode_literals
import abc
import logging
from os.path import join as join_path
import sys
import fuel_plugin_builder
from fuel_plugin_builder import actions
from fuel_plugin_builder import errors
from fuel_plugin_builder import loaders
from fuel_plugin_builder import utils
logger = logging.getLogger(__name__)
class PluginBuilderBase(actions.BaseAction):
loader_class = loaders.PluginLoaderV1
@abc.abstractproperty
def requires(self):
"""Should return a list of commands which
are required for the builder
"""
@abc.abstractproperty
def result_package_mask(self):
"""Should return mask for built package
"""
@abc.abstractmethod
def make_package(self):
"""Method should be implemented in child classes
"""
def __init__(self, plugin_path, loader=None):
self.plugin_path = plugin_path
self.report = utils.ReportNode('Building: {}'.format(self.plugin_path))
if loader:
self.loader = loader
else:
self.loader = self.loader_class(plugin_path)
self.pre_build_hook_cmd = './pre_build_hook'
data = self.loader.load(
self.plugin_path
)
self.report.add_nodes(data.report)
if data.report.is_failed():
print(self.report.render())
sys.exit(-1)
else:
self.data = data
self.build_dir = join_path(self.plugin_path, '.build')
self.build_src_dir = join_path(self.build_dir, 'src')
self.checksums_path = join_path(self.build_src_dir, 'checksums.sha1')
self.name = self.data['name']
def run(self):
self.report.info('Start plugin building "{}"'.format(self.plugin_path))
self.clean()
self.run_pre_build_hook()
self.check()
self.build_repos()
self.add_checksums_file()
self.make_package()
if self.report.is_successful():
self.report.info('Plugin is built')
return self.report
def clean(self):
utils.remove(self.build_dir)
utils.create_dir(self.build_dir)
utils.remove_by_mask(self.result_package_mask)
self.report.info('Cleaning complete')
def run_pre_build_hook(self):
if utils.which(join_path(self.plugin_path, self.pre_build_hook_cmd)):
utils.exec_cmd(self.pre_build_hook_cmd, self.plugin_path)
self.report.info('Prebuilt hook executed')
def add_checksums_file(self):
utils.create_checksums_file(self.build_src_dir, self.checksums_path)
self.report.info('Checksums file added')
def build_repos(self):
utils.create_dir(self.build_src_dir)
utils.copy_files_in_dir(
join_path(self.plugin_path, '*'),
self.build_src_dir)
releases_paths = {}
for release in self.data['releases']:
releases_paths.setdefault(release['operating_system'], [])
releases_paths[release['operating_system']].append(
join_path(self.build_src_dir, release['repository_path']))
self.build_ubuntu_repos(releases_paths.get('ubuntu', []))
self.build_centos_repos(releases_paths.get('centos', []))
self.report.info('Repositories are built')
@classmethod
def build_ubuntu_repos(cls, releases_paths):
for repo_path in releases_paths:
utils.exec_piped_cmds(
['dpkg-scanpackages -m .', 'gzip -c9 > Packages.gz'],
cwd=repo_path)
@classmethod
def build_centos_repos(cls, releases_paths):
for repo_path in releases_paths:
repo_packages = join_path(repo_path, 'Packages')
utils.create_dir(repo_packages)
utils.move_files_in_dir(
join_path(repo_path, '*.rpm'),
repo_packages)
utils.exec_cmd('createrepo -o {0} {0}'.format(repo_path))
def check(self):
self._check_requirements()
self._validate()
def _check_requirements(self):
self.report.info('Checking requirements')
not_found = filter(lambda r: not utils.which(r), self.requires)
err_message = 'Cannot find commands "{0}", install required ' \
'commands and try again'.format(', '.join(not_found))
if not_found:
self.report.error(err_message)
print(self.report.render())
raise errors.FuelCannotFindCommandError(err_message)
def _validate(self):
validation_report = fuel_plugin_builder.version_mapping.get_validator(
self.plugin_path).validate(self.data)
self.report.add_nodes(validation_report)
if validation_report.is_failed():
print(self.report.render())
raise errors.ValidationError()

View File

@ -0,0 +1,46 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from __future__ import unicode_literals
import logging
from os.path import join as join_path
from fuel_plugin_builder.builders.builder_base import PluginBuilderBase
from fuel_plugin_builder import loaders
from fuel_plugin_builder import utils
logger = logging.getLogger(__name__)
class PluginBuilderV1(PluginBuilderBase):
loader_class = loaders.PluginLoaderV1
requires = ['rpm', 'createrepo', 'dpkg-scanpackages']
@property
def result_package_mask(self):
return join_path(self.plugin_path, '{0}-*.fp'.format(self.name))
def make_package(self):
full_name = '{0}-{1}'.format(self.data['name'],
self.data['version'])
tar_name = '{0}.fp'.format(full_name)
tar_path = join_path(
self.plugin_path,
tar_name)
utils.make_tar_gz(self.build_src_dir, tar_path, full_name)
return utils.ReportNode('Package is made {}'.format(full_name))

View File

@ -0,0 +1,123 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from __future__ import unicode_literals
import logging
import os
from os.path import join as join_path
from fuel_plugin_builder.builders.builder_base import PluginBuilderBase
from fuel_plugin_builder import loaders
from fuel_plugin_builder import utils
logger = logging.getLogger(__name__)
class PluginBuilderV2(PluginBuilderBase):
loader_class = loaders.PluginLoaderV1  # PluginLoaderV1 here is not a typo
requires = ['rpmbuild', 'rpm', 'createrepo', 'dpkg-scanpackages']
rpm_spec_src_path = 'templates/v2/build/plugin_rpm.spec.mako'
release_tmpl_src_path = 'templates/v2/build/Release.mako'
def __init__(self, *args, **kwargs):
super(PluginBuilderV2, self).__init__(*args, **kwargs)
self.plugin_version, self.full_version = utils.version_split_name_rpm(
self.data['version'])
self.rpm_path = os.path.abspath(
join_path(self.plugin_path, '.build', 'rpm'))
self.rpm_src_path = join_path(self.rpm_path, 'SOURCES')
self.full_name = '{0}-{1}'.format(
self.data['name'], self.plugin_version)
tar_name = '{0}.fp'.format(self.full_name)
self.tar_path = join_path(self.rpm_src_path, tar_name)
fpb_dir = join_path(os.path.dirname(__file__), '..')
self.spec_src = os.path.abspath(join_path(
fpb_dir, self.rpm_spec_src_path))
self.release_tmpl_src = os.path.abspath(join_path(
fpb_dir, self.release_tmpl_src_path))
self.spec_dst = join_path(self.rpm_path, 'plugin_rpm.spec')
self.rpm_packages_mask = join_path(
self.rpm_path, 'RPMS', 'noarch', '*.rpm')
@property
def result_package_mask(self):
return join_path(
self.plugin_path, '{0}-*.noarch.rpm'.format(self.name))
def make_package(self):
"""Builds rpm package
"""
report = utils.ReportNode("Making package:")
utils.create_dir(self.rpm_src_path)
utils.make_tar_gz(self.build_src_dir, self.tar_path, self.full_name)
utils.load_template_and_render_to_file(
self.spec_src,
self.spec_dst,
self._make_data_for_template())
build_cmd = 'rpmbuild -vv --nodeps --define "_topdir {0}" ' \
'-bb {1}'.format(self.rpm_path, self.spec_dst)
report.info("Running build command: {}".format(build_cmd))
utils.exec_cmd(build_cmd)
report.info("Copying {} to {}".format(
self.rpm_packages_mask, self.plugin_path))
utils.copy_files_in_dir(self.rpm_packages_mask, self.plugin_path)
return report
def _make_data_for_template(self):
"""Generates data for spec template
:returns: dictionary with required data
"""
data = {
'name': self.full_name,
'version': self.full_version,
'summary': self.data['title'],
'description': self.data['description'],
'license': ' and '.join(self.data.get('licenses', [])),
'homepage': self.data.get('homepage'),
'vendor': ', '.join(self.data.get('authors', [])),
'year': utils.get_current_year()
}
return data
def build_ubuntu_repos(self, releases_paths):
for repo_path in releases_paths:
utils.exec_piped_cmds(
['dpkg-scanpackages -m .', 'gzip -c9 > Packages.gz'],
cwd=repo_path)
release_path = join_path(repo_path, 'Release')
utils.load_template_and_render_to_file(
self.release_tmpl_src,
release_path,
{
'plugin_name': self.data['name'],
'major_version': self.plugin_version
}
)

View File

@ -0,0 +1,51 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from __future__ import unicode_literals
import logging
from os.path import join as join_path
from fuel_plugin_builder.builders.builder_v2 import PluginBuilderV2
from fuel_plugin_builder import errors
from fuel_plugin_builder import loaders
from fuel_plugin_builder import utils
logger = logging.getLogger(__name__)
class PluginBuilderV3(PluginBuilderV2):
rpm_spec_src_path = 'templates/v3/build/plugin_rpm.spec.mako'
release_tmpl_src_path = 'templates/v3/build/Release.mako'
loader_class = loaders.PluginLoaderV3
def _make_data_for_template(self):
data = super(PluginBuilderV3, self)._make_data_for_template()
data['build_version'] = str(self.data.get('build_version', '1'))
fm = utils.FilesManager()
for key, script_file in (
('uninstall_hook', 'uninstall.sh'),
('preinstall_hook', 'pre_install.sh'),
('postinstall_hook', 'post_install.sh')
):
try:
data[key] = fm.load(
join_path(self.plugin_path, script_file))
except errors.NoPluginFileFound:
data[key] = ''
return data

View File

@ -0,0 +1,316 @@
# -*- coding: utf-8 -*-
#
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import functools
import jsonschema
import six
from fuel_plugin_builder import utils
# This file provides a number of functions that perform plugin-specific data
# requirements/integrity and file system checks and return a report.
#
# Basic rules for using checks:
# 1. A check is wrapped with the @check decorator, which provides a clear
#    report node.
# 2. A check may call another check and use its report.
# 3. A check always returns a ReportNode.
def check(function_or_title):
"""Check decorator.
Feel free to use either form:
@check
def check_function(report, *args, **kwargs):
    ...
    return report
or
@check('This is my check')
def check_function(report, *args, **kwargs):
    ...
    return report
"""
def decorator(f):
if callable(function_or_title):
title = "Running check function '{}'".format(f.__name__)
else:
title = function_or_title
@functools.wraps(f)
def wrapper(*args, **kwargs):
report = utils.ReportNode(title)
result = f(report, *args, **kwargs)
assert isinstance(result, utils.ReportNode)
return result
return wrapper
if callable(function_or_title):
return decorator(function_or_title)
return decorator
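# A minimal usage sketch (the check below is hypothetical): the decorator
# injects a fresh ReportNode as the first argument, and the wrapped function
# must return it.
#
#     @check('Checking that "name" is set')
#     def name_is_set(report, data):
#         if not data.get('name'):
#             report.error(u'"name" is missing')
#         return report
#
#     print(name_is_set({'name': 'my-plugin'}).render())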
@check("Applying JSON Schema to data")
def json_schema_is_valid(report, schema, data):
"""Check data with JSON Schema.
:param report: report node
:type report: ReportNode
:param schema: JSON Schema
:type schema: list|dict
:param data: data to check
:type data: list|dict
:return: report node
:rtype: utils.ReportNode
"""
json_schema_validator = jsonschema.Draft4Validator(schema)
def _convert_errors_tree_report_tree(json_schema_errors, report_node):
"""Make detailed report tree of JSON errors.
:param json_schema_errors: validation errors
:type json_schema_errors: iterable[ValidationError]
:param report_node: report node
:type report_node: ReportNode
:return: report node
:rtype: utils.ReportNode
"""
for exc in sorted(json_schema_errors, key=lambda e: e.path):
path = u' -> '.join(map(six.text_type, exc.path)) or None
error_node = utils.ReportNode(exc, level='error')
if path:
path_node = utils.ReportNode(path)
sub_record_node = report_node.add_nodes(
path_node.add_nodes(error_node))
else:
sub_record_node = report_node.add_nodes(error_node)
if exc.context: # make nested report nodes
_convert_errors_tree_report_tree(
exc.context,
sub_record_node
)
return report_node
_convert_errors_tree_report_tree(
json_schema_validator.iter_errors(data),
report
)
return report
@check("Applying multiple JSON Schemas distinguished by record 'type' field")
def multi_json_schema_is_valid(report, schemas, data):
"""Checks multiple JSON Schema using record ``type`` field to choose
appropriate schema.
:param report: report node
:type report: ReportNode
:param schemas: dict of schemas in format
{
'type1': schema1,
'type2': schema2
}
:type schemas: dict
:param data: data tree
:type data: list[list|dict]
:return: report
:rtype: utils.ReportNode
"""
if not isinstance(data, list):
report.error(u'Data should be a list of entities')
return report
for record_id, record in enumerate(data):
record_type = record.get('type', '')
schema = schemas.get(record_type)
if schema is not None:
report.add_nodes(
json_schema_is_valid(schema, record)
)
else:
report.error(u'Invalid type: {0} for record: {1}'.format(
record_type, record_id
))
return report
@check('Checking path existence')
def path_exists(report, path):
"""Check if path is exists or path mask has been resolved to at least
one path.
:param report: report node
:type report: ReportNode
:param path: path
:type path: basestring|str
:return: report
:rtype: utils.ReportNode
"""
report.info(u'Path: {}'.format(path))
if not utils.fs.get_paths(path):
report.error(u'Path does not exist')
return report
@check('Checking folder existence')
def dir_exists(report, path):
"""Check if dir is exists.
:param report: report node
:type report: ReportNode
:param path: path
:type path: str
:return: report
:rtype: utils.ReportNode
"""
report.info(path)
if not utils.fs.is_dir(path):
report.error(u'Directory does not exist')
return report
@check('Checking file existence')
def file_exists(report, path):
"""Check if file is exists.
:param report: report node
:type report: ReportNode
:param path: path
:type path: str
:return: report
:rtype: utils.ReportNode
"""
report.info(path)
if not (utils.fs.is_exists(path) and utils.fs.is_file(path)):
report.error(u'File not found')
return report
@check('Checking fuel_version compatibility with package_version')
def fuel_ver_compatible_with_package_ver(
report, minimal_fuel_version, plugin_metadata):
"""Checks Fuel version compatibility with plugin package version.
:param report: report node
:type report: ReportNode
:param minimal_fuel_version: basic supported version
:type minimal_fuel_version: str
:param plugin_metadata: plugin metadata root
:type plugin_metadata: dict
:return: report
:rtype: utils.ReportNode
"""
report.info(u'Expected Fuel version >= {0}'.format(minimal_fuel_version))
incompatible_versions = list()
for fuel_version in plugin_metadata.get('fuel_version', []):
if (
utils.strict_version(fuel_version) <
utils.strict_version(minimal_fuel_version)
):
incompatible_versions.append(fuel_version)
if incompatible_versions:
report.error(
'Current plugin format {0} is not compatible with '
'{2} Fuel release. Fuel version must be {1} or higher. '
'Please remove {2} version from metadata.yaml file or '
'downgrade package_version.'.format(
plugin_metadata['package_version'],
minimal_fuel_version,
', '.join(incompatible_versions)
)
)
else:
report.info(u'Plugin is compatible with target Fuel version.')
return report
@check("Checking for legacy field 'fuel_version'")
def legacy_fuel_version(report, metadata):
if metadata.get('fuel_version'):
report.warning(u'"fuel_version" field in metadata.yaml is deprecated '
u'and will be removed in further Fuel releases.')
return report
@check("Checking environment attributes")
def env_attributes(report, data, attr_root_schema,
attribute_schema, attribute_meta_schema):
"""Check attributes in environment config file.
'attributes' is not a required field, but if it is present it should
contain UI elements OR a metadata structure.
:param report: report node
:type report: ReportNode
:param data: attributes data
:type data: dict
:param attr_root_schema: JSON schema of the attributes root
:type attr_root_schema: dict
:param attribute_schema: JSON schema of an attribute
:type attribute_schema: dict
:param attribute_meta_schema: JSON schema of the attribute metadata
:type attribute_meta_schema: dict
:return: report
:rtype: utils.ReportNode
"""
report.add_nodes(
json_schema_is_valid(attr_root_schema, data)
)
if report.is_failed():
return report
attrs = data.get('attributes', {}) or {}
for attr_id, attr in six.iteritems(attrs):
# Metadata object is totally different
# from the others, we have to set different
# validator for it
if attr_id == 'metadata':
schema = attribute_meta_schema
else:
schema = attribute_schema
report.add_nodes(
json_schema_is_valid(schema, attr)
)
return report
@check("Looking for deprecated 'mode' directive inside releases")
def mode_directive(report, release_record):
mode = release_record.get('mode')
if mode is not None:
report.warning(u'"mode" directive id deprecated and ignored by Fuel '
u'releases elder then 6.1')
return report

View File

@ -16,15 +16,14 @@
import argparse
import logging
import six
import sys
import six
from fuel_plugin_builder import actions
from fuel_plugin_builder import errors
from fuel_plugin_builder import messages
from fuel_plugin_builder.validators import ValidatorManager
from fuel_plugin_builder.logger import configure_logger
from fuel_plugin_builder import version_mapping
logger = logging.getLogger(__name__)
@ -38,8 +37,19 @@ def handle_exception(exc):
logger.exception(exc)
if isinstance(exc, errors.FuelCannotFindCommandError):
print_err(messages.HEADER)
print_err(messages.INSTALL_REQUIRED_PACKAGES)
print_err('=' * 50)
print_err("""
Was not able to find required packages.
If you use Ubuntu, run:
# sudo apt-get install createrepo rpm dpkg-dev
If you use CentOS, run:
# yum install createrepo dpkg-devel dpkg-dev rpm rpm-build
""")
elif isinstance(exc, errors.ValidationError):
print_err('Validation failed')
@ -63,7 +73,7 @@ def parse_args():
"""
parser = argparse.ArgumentParser(
description='fpb is a fuel plugin builder which '
'helps you create plugin for Fuel')
'helps you create plugin for Fuel')
# TODO(vsharshov): we should move to subcommands instead of
# exclusive group, because in this case we could not
@ -88,6 +98,18 @@ def parse_args():
'--package-version', help='which package version to use',
type=decode_string)
parser.add_argument(
'--fuel-import', help='Create plugin from existing releases',
action="store_true")
parser.add_argument(
'--nailgun-path', help='path to existing Nailgun configuration '
'to create releases from',
type=decode_string)
parser.add_argument(
'--library-path', help='path to existing Fuel Library repo '
'to create releases from',
type=decode_string)
result = parser.parse_args()
package_version_check(result, parser)
@ -100,14 +122,22 @@ def perform_action(args):
:param args: argparse object
"""
if args.create:
actions.CreatePlugin(args.create, args.package_version).run()
print('Plugin is created')
report = actions.CreatePlugin(
plugin_path=args.create,
package_version=args.package_version,
fuel_import=args.fuel_import,
nailgun_path=args.nailgun_path,
library_path=args.library_path
).run()
elif args.build:
actions.make_builder(args.build).run()
print('Plugin is built')
report = actions.make_builder(args.build).run()
elif args.check:
ValidatorManager(args.check).get_validator().validate()
print('Plugin is valid')
report = version_mapping.get_validator(args.check).validate()
else:
print("Invalid args: {}".format(args))
return
print(report.render())
def package_version_check(args, parser):

View File

@ -14,5 +14,23 @@
# License for the specific language governing permissions and limitations
# under the License.
# Only lower case letters, numbers, '_', '-' symbols
# Default files encoding
DEFAULT_ENCODING = 'utf-8'
# In order of preference
SUPPORTED_FORMATS = ('yaml', 'json',)
# Used during plugin build
TAR_PARAMETERS = 'w:gz'
# Template extension
TEMPLATE_EXTENSION = 'mako'
# Latest plugin package version
LATEST_VERSION = '5.0.0'
# Plugin name pattern that is used in schemas and the builder
PLUGIN_NAME_PATTERN = '^[a-z0-9_-]+$'
# Suffix for the metadata.yaml path fields
PATHS_SUFFIX = '_path'

View File

@ -17,6 +17,15 @@ class FuelPluginException(Exception):
pass
class ReportedException(FuelPluginException):
def __init__(self, report):
self.report = report
super(ReportedException, self).__init__()
def __str__(self):
return self.report.render()
class FuelCannotFindCommandError(FuelPluginException):
pass
@ -45,12 +54,16 @@ class FileIsEmpty(ValidationError):
class FileDoesNotExist(ValidationError):
def __init__(self, file_path):
def __init__(self, file_path=None):
super(FileDoesNotExist, self).__init__(
"File '{0}' does not exist".format(file_path)
)
class FilesInPathDoesNotExist(ValidationError):
pass
class WrongPackageVersionError(FuelPluginException):
pass
@ -61,3 +74,41 @@ class ReleasesDirectoriesError(FuelPluginException):
class WrongPluginDirectoryError(FuelPluginException):
pass
class InspectionConfigurationError(FuelPluginException):
pass
class InvalidFileFormat(FuelPluginException):
message = "Invalid file format: {}, supported formats are:"
def __init__(self, path, supported_formats, *args, **kwargs):
super(InvalidFileFormat, self).__init__(*args, **kwargs)
self.message = self.message.format(path, supported_formats.join(', '))
class CantReadFile(FuelPluginException):
message = "Can't read file: {}"
def __init__(self, path, *args, **kwargs):
super(CantReadFile, self).__init__(*args, **kwargs)
self.message = self.message.format(path)
class InvalidFileExtension(FuelPluginException):
def __init__(self, extension):
super(InvalidFileExtension, self).__init__(
"Invalid file extension: {}".format(extension)
)
class NoPluginFileFound(FuelPluginException):
message = "Plugin file not found"
def __init__(self, message):
self.message = message
class FailedToLoadPlugin(FuelPluginException):
message = "Failed to load plugin"

View File

@ -0,0 +1,6 @@
from .loader_base import PluginLoaderBase
from .loader_preloader import PluginLoaderPreloader
from .loader_v1 import PluginLoaderV1
from .loader_v3 import PluginLoaderV3
from .loader_v4 import PluginLoaderV4
from .loader_v5 import PluginLoaderV5

View File

@ -0,0 +1,184 @@
# -*- coding: utf-8 -*-
#
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import os
import six
from fuel_plugin_builder import errors
from fuel_plugin_builder import utils
class PluginLoaderBase(object):
"""Plugin loader.
Loader deals with the file structure, providing the ability to load, combine
and form the data tree from the plugin directory.
If the loader fails, it raises an exception with the report attached.
"""
_metadata_path = "metadata.yaml"
_path_suffix = "_path"
_dont_resolve_path_keys = {'repository_path', 'deployment_scripts_path'}
paths_to_fields = {}
def __init__(self, plugin_path=None):
self.files_manager = utils.FilesManager()
self.plugin_path = plugin_path
def _get_absolute_path(self, path):
"""Get absolute path from the relative to the plugins folder.
:param path: relative path
:type path: str
:return: path string
:rtype: str
"""
return os.path.join(self.plugin_path, path)
@property
def _root_metadata_path(self):
"""Where is the root plugin data file located."""
return self._get_absolute_path(self._metadata_path)
def _recursive_process_paths(self, data, report):
"""Recursively processed nested list/dict.
:param data: data
:type data: iterable
:param report: report node
:type report: utils.ReportNode
:returns: data
:rtype: list|dict
"""
if isinstance(data, dict):
new_data = {}
for key in tuple(data):
value = data[key]
# if we have a key with a path we can do 3 things:
#
# * if it points to a directory, check that the dir exists and
#   leave the path intact
#
# * if it is a `glob`-compatible mask, iterate over the files
#   that match this mask and are compatible with the
#   FileManager, then merge these files' data if they have a
#   list or dict as a common data root.
#   Then remove the _path suffix from the key.
#
# * if it is a file compatible with the FileManager, read this
#   file and remove the _path suffix from the key.
if key.endswith(self._path_suffix) \
and isinstance(value, six.string_types):
if os.path.isdir(self._get_absolute_path(value)):
report.info(u"{} is valid directory".format(
value))
new_data[key] = value
elif key in self._dont_resolve_path_keys:
report.error(u"{} is invalid directory".format(
value))
new_data[key] = value
else:
cleaned_key = key[:- len(self._path_suffix)]
try:
loaded_data = self.files_manager.load(
self._get_absolute_path(value)
)
new_data[cleaned_key] = loaded_data
except Exception as exc:
path_node = utils.ReportNode(data[key])
report.add_nodes(path_node.error(exc))
# keep path as is
new_data[key] = value
else:
new_data[key] = self._recursive_process_paths(
data[key], report)
elif isinstance(data, list):
new_data = [
self._recursive_process_paths(record, report)
for record in data
]
else:
new_data = data
return new_data
def _load_root_metadata_file(self):
"""Get plugin root data (usually, it's metadata.yaml).
:return: data
:rtype: DictResultWithReport|ListResultWithReport
"""
report = utils.ReportNode(u"Loading root metadata file:{}".format(
self._root_metadata_path
))
# todo(ikutukov): the current loading scheme and tests rely on the case
# where no metadata.yaml file is present, so we are skipping all
# exceptions.
data = {}
try:
data = self.files_manager.load(self._root_metadata_path)
data = self._recursive_process_paths(data, report)
except Exception as exc:
report.warning(exc)
finally:
return report.mix_to_data(data)
def _process_legacy_fields(self, data):
for release in data.get('releases', []):
if release.get('os'):
if release.get('operating_system') is None:
release['operating_system'] = release['os']
del release['os']
return data
def load(self, plugin_path=None):
"""Loads data from the given plugin path and producing data tree.
:param plugin_path: plugin root path
:param plugin_path: str|basestring|None
:return: data tree starting from the data in root metadata file
:rtype: tuple(dict, utils.ReportNode)
"""
self.plugin_path = \
plugin_path if plugin_path is not None else self.plugin_path
report = utils.ReportNode(
u"File structure validation: {}".format(self.plugin_path))
data = self._load_root_metadata_file()
report.add_nodes(data.report)
# load files with fixed location
for key, file_path in six.iteritems(self.paths_to_fields):
file_report = utils.ReportNode(file_path)
try:
data[key] = self.files_manager.load(
self._get_absolute_path(file_path)
)
except errors.NoPluginFileFound as exc: # not found files are OK
file_report.warning(exc)
except Exception as exc:
file_report.error(exc)
report.add_nodes(file_report)
data = self._process_legacy_fields(data)
return report.mix_to_data(data)
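# Usage sketch (the path is illustrative): a concrete loader subclass, e.g.
# PluginLoaderV5, reads metadata.yaml, resolves *_path references and returns
# the data tree with the loading report attached:
#
#     loader = PluginLoaderV5('/path/to/plugin')
#     data = loader.load()
#     if data.report.is_failed():
#         print(data.report.render())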

View File

@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
# Copyright 2015 Mirantis, Inc.
#
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@ -14,10 +14,11 @@
# License for the specific language governing permissions and limitations
# under the License.
from fuel_plugin_builder.loaders.loader_base import PluginLoaderBase
from fuel_plugin_builder.validators.schemas.base import BaseSchema
from fuel_plugin_builder.validators.schemas.v1 import SchemaV1
from fuel_plugin_builder.validators.schemas.v2 import SchemaV2
from fuel_plugin_builder.validators.schemas.v3 import SchemaV3
from fuel_plugin_builder.validators.schemas.v4 import SchemaV4
from fuel_plugin_builder.validators.schemas.v5 import SchemaV5
class PluginLoaderPreloader(PluginLoaderBase):
@property
def _root_metadata_path(self):
"""Where is the root plugin data file located."""
return self._get_absolute_path('metadata.*')

View File

@ -0,0 +1,24 @@
# -*- coding: utf-8 -*-
#
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from fuel_plugin_builder.loaders.loader_base import PluginLoaderBase
class PluginLoaderV1(PluginLoaderBase):
paths_to_fields = {
'attributes_metadata': 'environment_config.yaml',
'tasks': 'tasks.yaml',
}

View File

@ -0,0 +1,31 @@
# -*- coding: utf-8 -*-
#
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from fuel_plugin_builder.loaders.loader_base import PluginLoaderBase
class PluginLoaderV3(PluginLoaderBase):
paths_to_fields = {
'attributes_metadata': 'environment_config.yaml',
'tasks': 'tasks.yaml',
'deployment_tasks': 'deployment_tasks.yaml',
'network_roles_metadata': 'network_roles.yaml',
'roles_metadata': 'node_roles.yaml',
'volumes_metadata': 'volumes.yaml',
}

View File

@ -0,0 +1,33 @@
# -*- coding: utf-8 -*-
#
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from fuel_plugin_builder.loaders.loader_base import PluginLoaderBase
class PluginLoaderV4(PluginLoaderBase):
paths_to_fields = {
'attributes_metadata': 'environment_config.yaml',
'tasks': 'tasks.yaml',
'deployment_tasks': 'deployment_tasks.yaml',
'network_roles_metadata': 'network_roles.yaml',
'roles_metadata': 'node_roles.yaml',
'volumes_metadata': 'volumes.yaml',
'components_metadata': 'components.yaml'
}

View File

@ -0,0 +1,41 @@
# -*- coding: utf-8 -*-
#
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from fuel_plugin_builder.loaders.loader_base import PluginLoaderBase
class PluginLoaderV5(PluginLoaderBase):
paths_to_fields = {
'attributes_metadata': 'environment_config.yaml',
'tasks': 'tasks.yaml',
'deployment_tasks': 'deployment_tasks.yaml',
'network_roles_metadata': 'network_roles.yaml',
'roles_metadata': 'node_roles.yaml',
'volumes_metadata': 'volumes.yaml',
'components_metadata': 'components.yaml',
'nic_attributes_metadata': 'nic_config.yaml',
'bond_attributes_metadata': 'bond_config.yaml',
'node_attributes_metadata': 'node_config.yaml'
}
@property
def _root_metadata_path(self):
"""Where is the root plugin data file located."""
return self._get_absolute_path('metadata.*')
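For orientation, a minimal usage sketch of these loaders, based on how the test suite further below drives them (a loader is constructed with the plugin root and `load()` returns the assembled data tree); the exact import path is an assumption:

```python
# Hedged usage sketch; the import path and the load() return shape are
# assumptions based on the tests below, not a confirmed public API.
from fuel_plugin_builder.loaders import PluginLoaderV5

loader = PluginLoaderV5('/path/to/plugin')
data_tree = loader.load()
# Field names from paths_to_fields become keys of the loaded tree:
nic_schema = data_tree['nic_attributes_metadata']
tasks = data_tree['tasks']
```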


@ -0,0 +1,59 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
# Please keep in mind the data schema versioning policy and its nuances:
#
# Fuel Plugin Builder (FPB) sits at the intersection of several versioning
# lines of subsystems and APIs, which keeps the data schema tricky (we keep
# it as small as possible).
#
# ``Tasks`` version is defined by changes in the data flow chain between
# the Fuel API, Astute, Fuel Library and the QA environment. In theory, all
# versions of these tasks could be delivered now; in practice, v1.0.0 is
# not even close to the way you may want to configure what Nailgun does.
#
# ``Plugin package`` version, like the FPB version, is semver: the major
# version goes up on every change of the plugin format and on any change of
# the data schema inside it that could affect internal and third-party
# plugin developers.
#
# ``FPB own version``: FPB is not released together with Fuel, but respects
# Fuel milestones. Fuel keeps good backward compatibility with plugins that
# are 3 or even more major releases old, so there is no rush to roll out
# changes that open new, sometimes experimental, functionality. Everyone
# who wants to work with new features is free to clone and use the latest
# master of FPB to build the new plugin package format.
#
# So we have a hypothetical snapshot of versions:
#
# FPB version 4.1.0
# Plugins package version 4.0.0
# Fuel version 9.0.1
# Tasks version 2.1.0
#
from .attributes import *
from .common import *
from .components import *
from .graph import *
from .metadata import *
from .network_roles import *
from .node_attributes import *
from .node_roles import *
from .release import *
from .task import *
from .volumes import *
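The star imports above re-export the versioned schema singletons defined in the sibling modules, so a consumer can pick the schema matching its target Fuel line. A hedged sketch, assuming this package lives at `fuel_plugin_builder.schemas`:

```python
# Hedged sketch; the package location is an assumption.
from fuel_plugin_builder import schemas

print(schemas.metadata_v9_1.schema['required'])
print(schemas.release_v6_0.schema['properties'].keys())
```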


@ -0,0 +1,250 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
class EnvConfigSchemaV6_0(object):
schema = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'properties': {
'attributes': {
'type': 'object',
'additionalProperties': {
'type': 'object',
'properties': {
'type': {'type': 'string'},
'weight': {'type': 'integer'},
'value': {'type': ['string', 'boolean']},
'label': {'type': 'string'}}}}}}
class AttrElementsSchemaV6_0(object):
attr_element = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['type', 'label', 'weight', 'value'],
'properties': {
'type': {'type': 'string'},
'weight': {'type': 'integer'},
'value': {'type': ['string', 'boolean']},
'label': {'type': 'string'},
'values': {'type': 'array', 'items': {
'type': 'object',
'required': ['data', 'label'],
'properties': {
'data': {'type': 'string'},
'label': {'type': 'string'}}}}}}
attr_meta = {'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'properties': {
'label': {'type': 'string'},
'weight': {'type': 'integer'},
'toggleable': {'type': 'boolean'},
'enabled': {'type': 'boolean'},
'restrictions': {
'type': 'array',
'items': {'type': ['string', 'object']}}}}
attr_root = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'properties': {
'attributes': {'type': 'object'}}}
class SchemaAttributesV6_1(object):
_condition = {'type': 'string'}
@property
def _full_restriction(self):
return {
'type': 'object',
'required': ['condition'],
'properties': {
'condition': self._condition,
'message': {'type': 'string'},
'action': {'type': 'string'}}}
_short_restriction = {
'type': 'object',
'minProperties': 1,
'maxProperties': 1}
@property
def _restrictions(self):
return {
'type': 'array',
'minItems': 1,
'items': {
'anyOf': [
self._condition,
self._full_restriction,
self._short_restriction]}}
@property
def attr_element(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['type', 'label', 'weight', 'value'],
'properties': {
'type': {'type': 'string'},
'weight': {'type': 'integer'},
'value': {'type': ['string', 'boolean']},
'label': {'type': 'string'},
'restrictions': self._restrictions,
'values': {'type': 'array', 'items':
{'type': 'object',
'required': ['data', 'label'],
'properties': {
'data': {'type': 'string'},
'label': {'type': 'string'}}}}}
}
@property
def attr_meta(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'properties': {
'label': {'type': 'string'},
'weight': {'type': 'integer'},
'toggleable': {'type': 'boolean'},
'enabled': {'type': 'boolean'},
'restrictions': self._restrictions}
}
attr_root = {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'properties': {
'attributes': {'type': 'object'}}}
class SchemaAttributesV8_0(object):
@property
def condition(self):
return {'type': 'string'}
@property
def full_restriction(self):
return {
'type': 'object',
'required': ['condition'],
'properties': {
'condition': self.condition,
'message': {'type': 'string'},
'action': {'type': 'string'}
}
}
@property
def short_restriction(self):
return {
'type': 'object',
'minProperties': 1,
'maxProperties': 1}
@property
def restrictions(self):
return {
'type': 'array',
'minItems': 1,
'items': {
'anyOf': [
self.condition,
self.full_restriction,
self.short_restriction
]
}
}
@property
def attr_element(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['type', 'label', 'weight', 'value'],
'properties': {
'type': {'type': 'string'},
'weight': {'type': 'integer'},
'value': {'anyOf': [
{'type': 'string'},
{'type': 'boolean'},
{
'type': 'object',
'properties': {'generator': {'type': 'string'}}
}
]},
'label': {'type': 'string'},
'restrictions': self.restrictions,
'values': {
'type': 'array',
'items': {
'type': 'object',
'required': ['data', 'label'],
'properties': {
'data': {'type': 'string'},
'label': {'type': 'string'}
}
}
}
}
}
@property
def attr_meta(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'properties': {
'label': {'type': 'string'},
'weight': {'type': 'integer'},
'toggleable': {'type': 'boolean'},
'enabled': {'type': 'boolean'},
'restrictions': self.restrictions
}
}
@property
def attr_root(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': [],
'properties': {
'attributes': {
'type': 'object',
'properties': {
'group': {
'enum': [
'general', 'security',
'compute', 'network',
'storage', 'logging',
'openstack_services', 'other'
]
}
}
}
}
}
env_config_v6_0 = EnvConfigSchemaV6_0()
attr_elements_v6_0 = AttrElementsSchemaV6_0()
attributes_v6_1 = SchemaAttributesV6_1()
attributes_v8_0 = SchemaAttributesV8_0()
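These singletons hold plain JSON Schema documents, so they can be checked directly with the `jsonschema` library (already used elsewhere in FPB's tests). A hedged sketch with an illustrative attribute:

```python
import jsonschema

# Illustrative attribute; the field set matches the required list in
# attr_element above ('type', 'label', 'weight', 'value').
sample_attribute = {
    'type': 'text',
    'label': 'Service endpoint',
    'weight': 10,
    'value': 'http://127.0.0.1:8080',
}
jsonschema.validate(sample_attribute, attributes_v8_0.attr_element)
```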


@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
# Copyright 2014 Mirantis, Inc.
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@ -14,20 +14,22 @@
# License for the specific language governing permissions and limitations
# under the License.
from fuel_plugin_builder import consts


class SchemaCommonV6_0(object):
    @property
    def plugin_name_pattern(self):
        return consts.PLUGIN_NAME_PATTERN

    @property
    def list_of_strings(self):
        return {'type': 'array',
                'items': {'type': 'string'}}

    @property
    def positive_integer(self):
        return {'type': 'integer', 'minimum': 0}


common_v6_0 = SchemaCommonV6_0()


@ -0,0 +1,72 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
class SchemaComponentsV8_0(object):
_components_types_str = \
'|'.join(['hypervisor', 'network', 'storage', 'additional_service'])
_component_name_pattern = \
'^({0}):([0-9a-z_-]+:)*[0-9a-z_-]+$'.format(
_components_types_str)
_compatible_component_name_pattern = \
'^({0}):([0-9a-z_-]+:)*([0-9a-z_-]+|(\*)?)$'.format(
_components_types_str)
@property
def _components_items(self):
return {
'type': 'array',
'items': {
'type': 'object',
'required': ['name'],
'properties': {
'name': {
'type': 'string',
'pattern': self._compatible_component_name_pattern
},
'message': {'type': 'string'}
}
}
}
@property
def schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'array',
'items': {
'required': ['name', 'label'],
'type': 'object',
'additionalProperties': False,
'properties': {
'name': {
'type': 'string',
'pattern': self._component_name_pattern
},
'label': {'type': 'string'},
'description': {'type': 'string'},
'compatible': self._components_items,
'requires': self._components_items,
'incompatible': self._components_items,
'bind': {'type': 'array'}
}
}
}
components_v8_0 = SchemaComponentsV8_0()
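The two patterns above differ only in that the "compatible" variant additionally admits a trailing wildcard segment. An illustrative check (the leading underscore marks these patterns as internals; they are read here purely for demonstration):

```python
import re

name = components_v8_0._component_name_pattern
compat = components_v8_0._compatible_component_name_pattern

assert re.match(name, 'additional_service:my-service')
assert re.match(name, 'storage:backends:ceph')
assert re.match(compat, 'storage:backends:*')       # wildcard tail allowed
assert re.match(name, 'storage:backends:*') is None  # but not in plain names
assert re.match(name, 'Storage:Ceph') is None        # upper case is rejected
```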


@ -0,0 +1,36 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from .task import task_v2_2_0
class SchemaGraphV9_1(object):
@property
def graph(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'additionalProperties': False,
'properties': {
'name': {'type': 'string'},
'type': {'type': 'string'},
'tasks': task_v2_2_0.tasks,
'metadata': {'type': 'object'}
}
}
graph_v9_1 = SchemaGraphV9_1()
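A hedged example of a graph document shaped for this schema; whether an empty task list passes depends on the `task_v2_2_0.tasks` sub-schema, which is not shown in this diff:

```python
# Structural sketch only; 'tasks' is validated by the v2.2.0 tasks schema.
example_graph = {
    'name': 'default deployment graph',
    'type': 'deployment',
    'tasks': [],        # shape governed by task_v2_2_0.tasks
    'metadata': {},
}
```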


@ -0,0 +1,238 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from .common import common_v6_0
from .release import release_v6_0


# Tasks have their own versioning line, slightly dependent on the Nailgun
# and FPB versions.
class SchemaMetadataV6_0(object):
_package_version = {'enum': ['1.0.0']}
@property
def schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'plugin',
'type': 'object',
'required': ['name', 'title', 'version', 'releases',
'package_version'],
'properties': {
'name': {
'type': 'string',
# Only lower case letters, numbers, '_', '-' symbols
'pattern': '^[a-z0-9_-]+$'},
'title': {'type': 'string'},
'version': {'type': 'string'},
'package_version': self._package_version,
'description': {'type': 'string'},
'fuel_version': {'type': 'array',
'items': {'type': 'string'}},
'releases': {
'type': 'array',
'items': release_v6_0.schema}}}
class SchemaMetadataV6_1(object):
_plugin_name_pattern = '^[a-z0-9_-]+$'
_package_version = {'enum': ['2.0.0']}
@property
def schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'plugin',
'type': 'object',
'required': [
'name',
'title',
'version',
'package_version',
'description',
'fuel_version',
'licenses',
'authors',
'homepage',
'releases',
'groups'],
'properties': {
'name': {
'type': 'string',
'pattern': self._plugin_name_pattern},
'title': {
'type': 'string'},
'version': {
'type': 'string'},
'package_version': self._package_version,
'description': {
'type': 'string'},
'fuel_version': common_v6_0.list_of_strings,
'licenses': common_v6_0.list_of_strings,
'authors': common_v6_0.list_of_strings,
'groups': {
'type': 'array',
'uniqueItems': True,
'items': {
'enum': [
'network',
'storage',
'storage::cinder',
'storage::glance',
'hypervisor'
]
}
},
'homepage': {'type': 'string'},
'releases': {
'type': 'array',
'items': release_v6_0.schema
}
}
}
class SchemaMetadataV7_0(SchemaMetadataV6_1):
_package_version = {'enum': ['3.0.0']}
class SchemaMetadataV8_0(SchemaMetadataV7_0):
_package_version = {'enum': ['4.0.0']}
@property
def schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'plugin',
'type': 'object',
'required': [
'name',
'title',
'version',
'package_version',
'description',
'fuel_version',
'licenses',
'authors',
'homepage',
'releases',
'groups',
'is_hotpluggable'],
'properties': {
'name': {
'type': 'string',
'pattern': self._plugin_name_pattern
},
'title': {'type': 'string'},
'version': {'type': 'string'},
'package_version': self._package_version,
'description': {'type': 'string'},
'fuel_version': common_v6_0.list_of_strings,
'licenses': common_v6_0.list_of_strings,
'authors': common_v6_0.list_of_strings,
'groups': {
'type': 'array',
'uniqueItems': True,
'items': {
'enum': [
'network',
'storage',
'storage::cinder',
'storage::glance',
'hypervisor',
'equipment'
]
}
},
'homepage': {
'type': 'string'
},
'releases': {
'type': 'array',
'items': release_v6_0.schema
},
'is_hotpluggable': {'type': 'boolean'}
}
}
class SchemaMetadataV9_1(object):
_package_version = {'enum': ['5.0.0']}
_plugin_name_pattern = '^[a-z0-9_-]+$'
@property
def schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'plugin',
'type': 'object',
'required': [
'name',
'title',
'version',
'package_version',
'description',
'fuel_version',
'licenses',
'authors',
'homepage',
'releases',
'groups'
],
'properties': {
'name': {
'type': 'string',
'pattern': common_v6_0.plugin_name_pattern
},
'title': {'type': 'string'},
'version': {'type': 'string'},
'package_version': self._package_version,
'description': {'type': 'string'},
'fuel_version': common_v6_0.list_of_strings,
'licenses': common_v6_0.list_of_strings,
'authors': common_v6_0.list_of_strings,
'groups': {
'type': 'array',
'uniqueItems': True,
'items': {
'enum': [
'network',
'storage',
'storage::cinder',
'storage::glance',
'hypervisor',
'monitoring'
]
}
},
'homepage': {'type': 'string'},
'releases': {
'type': 'array',
'items': { # more detailed check will be at release level
'type': 'object'
}
}
}
}
metadata_v6_0 = SchemaMetadataV6_0()
metadata_v6_1 = SchemaMetadataV6_1()
metadata_v7_0 = SchemaMetadataV7_0()
metadata_v8_0 = SchemaMetadataV8_0()
metadata_v9_1 = SchemaMetadataV9_1()
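Taken together, a minimal manifest for package version 5.0.0 that satisfies `SchemaMetadataV9_1` looks roughly like this, assuming `consts.PLUGIN_NAME_PATTERN` follows the `'^[a-z0-9_-]+$'` convention used by the older schemas:

```python
import jsonschema

minimal_metadata = {
    'name': 'fuel_plugin_example_v5',
    'title': 'Example plugin',
    'version': '1.0.0',
    'package_version': '5.0.0',
    'description': 'Example plugin description',
    'fuel_version': ['9.1'],
    'licenses': ['Apache License Version 2.0'],
    'authors': ['Example author'],
    'homepage': 'https://github.com/openstack/fuel-plugins',
    'groups': [],
    # only the object shape is checked here; the detailed check happens at
    # the release level
    'releases': [{'name': 'ExampleRelease'}],
}
jsonschema.validate(minimal_metadata, metadata_v9_1.schema)
```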


@ -0,0 +1,128 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
class SchemaNetworkRolesV7_0(object):
_network_role_pattern = '^[0-9a-z_-]+$'
@property
def _vip(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'array',
'items': {
'type': 'object',
'required': ['name'],
'properties': {
'name': {
'type': 'string',
'pattern': self._network_role_pattern
},
'namespace': {
'type': 'string',
'pattern': self._network_role_pattern
}
}
}
}
@property
def schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'array',
'items': {
'type': 'object',
'required': [
'id',
'default_mapping',
'properties'
],
'properties': {
'id': {
'type': 'string'
},
'default_mapping': {
'type': 'string'
},
'properties': {
'type': 'object',
'required': ['subnet', 'gateway', 'vip'],
'properties': {
'subnet': {
'type': 'boolean'
},
'gateway': {
'type': 'boolean'
},
'vip': self._vip
}
}
}
}
}
class SchemaNetworkRolesV8_0(object):
_network_role_pattern = '^[0-9a-z_-]+$'
_vip = {
'type': 'object',
'required': ['name'],
'properties': {
'name': {
'type': 'string',
'pattern': _network_role_pattern
},
'namespace': {
'type': 'string',
'pattern': _network_role_pattern
}
}
}
_vips = {
'type': 'array',
'items': _vip
}
@property
def schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'array',
'items': {
'type': 'object',
'required': ['id', 'default_mapping', 'properties'],
'properties': {
'id': {'type': 'string'},
'default_mapping': {'type': 'string'},
'properties': {
'type': 'object',
'required': ['subnet', 'gateway', 'vip'],
'properties': {
'subnet': {'type': 'boolean'},
'gateway': {'type': 'boolean'},
'vip': self._vips
}
}
}
}
}
network_roles_v7_0 = SchemaNetworkRolesV7_0()
network_roles_v8_0 = SchemaNetworkRolesV8_0()


@ -14,17 +14,13 @@
# License for the specific language governing permissions and limitations
# under the License.
from .attributes import attributes_v8_0


class SchemaNodeAttributesV9_1(object):
    @property
    def node_attributes(self):
        return {
            '$schema': 'http://json-schema.org/draft-04/schema#',
            'type': 'object',
@ -32,13 +28,13 @@
                '^[0-9a-zA-Z_-]+$': {"$ref": "#/definitions/attrItem"}
            },
            "definitions": {
                "attrItem": self.node_nic_attributes
            },
            "additionalProperties": False
        }

    @property
    def node_nic_attributes(self):
        return {
            '$schema': 'http://json-schema.org/draft-04/schema#',
            'type': 'object',
@ -61,7 +57,7 @@
            'description': {'type': 'string'},
            'type': {'type': 'string'},
            'value': {},
            'restrictions': attributes_v8_0.restrictions
        }
    }
@ -72,6 +68,9 @@
            'required': ['label'],
            'properties': {
                'label': {'type': 'string'},
                'restrictions': attributes_v8_0.restrictions
            }
        }


node_attributes_v9_1 = SchemaNodeAttributesV9_1()


@ -0,0 +1,150 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from .common import common_v6_0
class SchemaNodeRolesV7_0(object):
@property
def _rule(self):
return {
'type': ['string', 'integer']
}
@property
def _override(self):
return {
'type': 'object',
            'description': 'Property that can change the recommended|min|max'
                           ' limit properties due to some additional'
                           ' condition',
'required': ['condition'],
'properties': {
'condition': {'type': 'string'},
'max': self._rule,
'recommended': self._rule,
'min': self._rule,
'message': {'type': 'string'}
}
}
@property
def _overrides(self):
return {
'type': 'array',
'description': 'Array of limit override properties',
'minItems': 1,
'items': self._override
}
@property
def _condition(self):
return {'type': 'string'}
@property
def _limits(self):
return {
'type': 'object',
'description': 'Limits for count of nodes for node role',
'properties': {
'condition': self._condition,
'max': self._rule,
'recommended': self._rule,
'min': self._rule,
'overrides': self._overrides
}
}
@property
def _full_restriction(self):
return {
'type': 'object',
'required': ['condition'],
'properties': {
'condition': self._condition,
'message': {'type': 'string'},
'action': {'type': 'string'}}}
@property
def _short_restriction(self):
return {
'type': 'object',
'minProperties': 1,
'maxProperties': 1}
@property
def _restrictions(self):
return {
'type': 'array',
'minItems': 1,
'items': {
'anyOf': [
self._condition,
self._full_restriction,
self._short_restriction]}}
@property
def schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'patternProperties': {
'^[0-9a-zA-Z_-]+$': {
'type': 'object',
'required': ['name', 'description'],
'properties': {
'name': {
'type': 'string',
'description': 'Name that will be shown on UI'},
'description': {
'type': 'string',
'description': ('Short description of role'
' functionality')},
'conflicts': {
'oneOf': [
common_v6_0.list_of_strings,
{
'type': 'string',
'enum': ['*']
}
]
},
                        'has_primary': {
                            'type': 'boolean',
                            'description': ('During orchestration this role'
                                            ' will be split into'
                                            ' primary-role and role.')},
'public_ip_required': {
'type': 'boolean',
'description': ('Specify if role needs public'
' IP address.')},
'update_required': common_v6_0.list_of_strings,
'update_once': common_v6_0.list_of_strings,
'weight': {
'type': 'integer',
'description': ('Specify weight that will be'
' used to sort out the roles'
' on the Fuel web UI')
},
'limits': self._limits,
'restrictions': self._restrictions
}
}
},
'additionalProperties': False
}
node_roles_v7_0 = SchemaNodeRolesV7_0()


@ -0,0 +1,56 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# Tasks have their own versioning line, slightly dependent on the Nailgun
# and FPB versions.
class SchemaReleaseV6_0(object):
@property
def schema(self):
return {
'type': 'object',
'required': ['version', 'operating_system', 'mode'],
'properties': {
'version': {'type': 'string'},
'operating_system': {'enum': ['ubuntu', 'centos']},
'deployment_scripts_path': {'type': 'string'},
'repository_path': {'type': 'string'},
'mode': {'type': 'array',
'items': {'enum': ['ha', 'multinode']}}}}
class SchemaReleaseV9_1(object):
@property
def schema(self):
return {
'type': 'object',
'required': ['version', 'operating_system', 'mode'],
'properties': {
'version': {'type': 'string'},
'operating_system': {'enum': ['ubuntu', 'centos']},
'deployment_scripts_path': {'type': 'string'},
'repository_path': {'type': 'string'},
'mode': {
'type': 'array',
'items': {'enum': ['ha', 'multinode']}
}
}
}
release_v6_0 = SchemaReleaseV6_0()
release_v9_1 = SchemaReleaseV9_1()
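An illustrative release description that satisfies both variants (field names follow the schemas above; note the key is `operating_system`):

```python
import jsonschema

release = {
    'version': 'mitaka-9.0',
    'operating_system': 'ubuntu',
    'mode': ['ha'],
    'deployment_scripts_path': 'deployment_scripts/',
    'repository_path': 'repositories/ubuntu',
}
jsonschema.validate(release, release_v6_0.schema)
jsonschema.validate(release, release_v9_1.schema)
```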

File diff suppressed because it is too large.


@ -0,0 +1,69 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from .task import task_v2_1_0
class SchemaVolumesV7_0(object):
@property
def schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['volumes_roles_mapping', 'volumes'],
'properties': {
'volumes_roles_mapping': {
'type': 'object',
'patternProperties': {
task_v2_1_0.task_name_pattern: {
'type': 'array',
'minItems': 1,
'items': {
'type': 'object',
'description': 'Volume allocations for role',
'required': ['allocate_size', 'id'],
'properties': {
'allocate_size': {
'type': 'string',
'enum': ['all', 'min', 'full-disk']
},
'id': {'type': 'string'}
}
}
}
},
'additionalProperties': False
},
'volumes': {
'type': 'array',
'items': {
'type': 'object',
'required': ['id', 'type'],
'properties': {
'id': {
'type': 'string'
},
'type': {
'type': 'string'
}
}
}
}
}
}
volumes_v7_0 = SchemaVolumesV7_0()

View File

@ -1,2 +1,2 @@
Label: ${plugin_name}
Version: ${major_version}


@ -1,2 +1,2 @@
Label: ${plugin_name}
Version: ${major_version}


@ -0,0 +1,10 @@
- id: delete
type: puppet
version: 2.1.0
roles:
- deleted
parameters:
puppet_manifest: "delete.pp"
puppet_modules: "."
timeout: 3600
retries: 10


@ -0,0 +1,62 @@
# These tasks will be merged into deployment graph. Here you
# can specify new tasks for any roles, even built-in ones.
- id: ${plugin_name}_role
type: group
role: [${plugin_name}_role]
parameters:
strategy:
type: parallel
- id: ${plugin_name}-deployment-puppet
type: puppet
role: [${plugin_name}_role]
# If you do not want to use the task-based deployment introduced as
# experimental in Fuel v8.0, comment out the code section below this
# comment, uncomment the two lines below it, and do the same for the
# tasks below.
version: 2.0.0
cross-depends:
- name: deploy_start
cross-depended-by:
- name: deploy_end
# requires: [deploy_start] # version 1.0.0
# required_for: [deploy_end]
parameters:
puppet_manifest: "deploy.pp"
puppet_modules: "."
timeout: 3600
#- id: ${plugin_name}-post-deployment-sh
# type: shell
# role: [${plugin_name}_role]
# version: 2.0.0
# cross-depends:
# - name: post_deployment_start
# cross-depended-by:
# - name: post_deployment_end
# # requires: [post_deployment_start]
# # required_for: [post_deployment_end]
# parameters:
# cmd: echo post_deployment_task_executed > /tmp/post_deployment
# retries: 3
# interval: 20
# timeout: 180
#- id: ${plugin_name}-pre-deployment-sh
# type: shell
# role: [${plugin_name}_role]
# version: 2.0.0
# cross-depends:
# - name: pre_deployment_start
# cross-depended-by:
# - name: pre_deployment_end
# # requires: [pre_deployment_start]
# # required_for: [pre_deployment_end]
# parameters:
# cmd: echo pre_deployment_task_executed > /tmp/pre_deployment
# retries: 3
# interval: 20
# timeout: 180


@ -0,0 +1,10 @@
- id: verify_networks
type: puppet
version: 2.1.0
roles: ["*", "master"]
required_for: ["deploy_start"]
parameters:
puppet_manifest: "delete.pp"
puppet_modules: "."
timeout: 3600
retries: 10


@ -0,0 +1,9 @@
- id: provision
type: puppet
version: 2.0.0
roles: "*"
parameters:
puppet_manifest: "provision.pp"
puppet_modules: "."
timeout: 3600
retries: 10


@ -1,21 +1,22 @@
# Plugin name
name: fuel_plugin_example_v5
# Human-readable name for your plugin
title: Title for fuel_plugin_example_v5 plugin
# Plugin version
version: '1.0.0'
# Description
description: Please describe your plugin here
# Minimum required fuel version
fuel_version: ['9.1', '10.0']
# Specify license of your plugin
licenses: ['Apache License Version 2.0']
# Specify author or company name
authors: ['Specify author or company name']
# A link to the plugin's page
homepage: 'https://github.com/openstack/fuel-plugins'
# Specify a group which your plugin implements, possible options:
# network, storage, storage::cinder, storage::glance, hypervisor,
# equipment
groups: []
# Change `false` to `true` if the plugin can be installed in the environment
# after the deployment.
@ -23,6 +24,53 @@ is_hotpluggable: false
# The plugin is compatible with releases in the list
releases:
  - name: 'ExampleRelease'
    description: 'Example Release Description'
    operating_system: 'ubuntu'
    version: '1.0.0'
    is_release: true
    networks_path: metadata/networks.yaml
    volumes_path: metadata/volumes.yaml
    roles_path: metadata/roles.yaml
    network_roles_path: metadata/network_roles.yaml
    components_path: metadata/components.yaml
    attributes_path: attributes/attributes.yaml
    vmware_attributes_path: attributes/vmware.yaml
    node_attributes_path: attributes/node.yaml
    nic_attributes_path: attributes/nic.yaml
    bond_attributes_path: attributes/bond.yaml
    deployment_scripts_path: deployment_scripts/
    repository_path: repositories/ubuntu
    # deployment_tasks was used in Fuel 9.0.x as the deployment graph;
    # don't use it in newer Fuel versions
    deployment_tasks_path: graphs/deployment_tasks.yaml
    graphs:
      - type: provisioning
        name: provisioning
        tasks_path: graphs/provisioning.yaml
      - type: deployment
        name: default deployment graph
        tasks_path: graphs/deployment_tasks.yaml
      - type: deletion
        name: deletion
        tasks_path: graphs/deletion.yaml
      - type: network_verification
        name: network_verification
        tasks_path: graphs/network_verification.yaml
      # `default` was used in Fuel 9.0.x as the deployment graph
      - type: default
        name: deployment-graph-name
        tasks_path: graphs/deployment_tasks.yaml
  - os: ubuntu
    version: mitaka-9.0
    mode: ['ha']


@ -0,0 +1,12 @@
# This file contains wizard component descriptions that are pretty similar
# to the `environment_config.yaml`.
# Please take a look at the following links for details:
# - https://blueprints.launchpad.net/fuel/+spec/component-registry
# - https://specs.openstack.org/openstack/fuel-specs/specs/8.0/component-registry.html
- name: additional_service:${plugin_name}
  compatible: []
  requires: []
  incompatible: []
  label: "Plugin label that will be shown in the UI"
  description: "Component description (optional)"


@ -0,0 +1,15 @@
# Unique network role name
- id: "example_net_role"
# Role mapping to network
default_mapping: "public"
properties:
# Should be true if network role requires subnet being set
subnet: true
# Should be true if network role requires gateway being set
gateway: false
# List of VIPs to be allocated
vip:
# Unique VIP name
- name: "vip_name"
# Optional linux namespace for VIP
namespace: "haproxy"


@ -0,0 +1,13 @@
${plugin_name}_role:
# Role name
name: "Set here the name for the role. This name will be displayed in the Fuel web UI"
# Role description
description: "Write description for your role"
# If primary then during orchestration this role will be
# separated into primary-role and role
has_primary: false
# Assign public IP to node if true
public_ip_required: false
# Weight that will be used to sort out the
# roles on the Fuel web UI
weight: 1000


@ -0,0 +1,7 @@
volumes_roles_mapping:
# Default role mapping
${plugin_name}_role:
- {allocate_size: "min", id: "os"}
# Set here new volumes for your role
volumes: []


@ -21,9 +21,17 @@ except ImportError:
from unittest2.case import TestCase
import mock
import os
from StringIO import StringIO
from fuel_plugin_builder import errors
from pyfakefs import fake_filesystem_unittest
import yaml
from fuel_plugin_builder import consts
from fuel_plugin_builder import utils
from fuel_plugin_builder import validators
from fuel_plugin_builder import version_mapping
class FakeFile(StringIO):
@ -34,6 +42,7 @@ class FakeFile(StringIO):
here, because it hangs when we use 'with' statement,
and when we want to read file by chunks.
"""
def __enter__(self):
return self
@ -83,166 +92,124 @@ class BaseTestCase(TestCase):
for method in methods:
setattr(obj, method, mock.MagicMock())

    def _make_fake_metadata_data(self, **kwargs):
        """Generate metadata based on example and custom fields from kwargs.

        :return: metadata
        :rtype: dict
        """
        metadata = {
            'package_version': '5.0.0',
            'fuel_version': ['9.1']
        }
        metadata.update(kwargs)
        return metadata


class FakeFSTest(BaseTestCase, fake_filesystem_unittest.TestCase):
    plugin_path = '/tmp/plugin/'  # path inside mock FS
    fpb_dir = os.path.join(os.path.dirname(__file__), '..')
    validator_class = validators.ValidatorBase
    loader_class = None
    package_version = None

    def _delete_from_fakefs(self, path):
        """Remove record from mockfs if it exists.

        :param path: path
        :type path: str
        """
        fakefs_path = self._make_fakefs_path(path)
        if os.path.exists(fakefs_path):
            self.fs.RemoveObject(fakefs_path)

    def _make_fakefs_path(self, relative_path):
        """Make absolute path related to the plugin example root folder.

        :param relative_path: relative path
        :type relative_path: str

        :return: absolute path
        :rtype: str
        """
        return os.path.abspath(
            os.path.join(
                self.plugin_path, relative_path
            )
        )

    def _patch_fakefs_file(self, path, add_data):
        fakefs_path = self._make_fakefs_path(path)
        if os.path.exists(fakefs_path):
            raw_data = self.fs.GetObject(fakefs_path)
            data = yaml.safe_load(raw_data.contents)
            data.update(add_data)
        else:
            data = add_data
        self._create_fakefs_yaml(path, data)

    def _create_fakefs_yaml(self, path, new_data):
        """Replace file with a new one inside mockfs.

        :param path: relative path
        :type path: str|basestring
        :param new_data: list/dict structure that will be serialised to YAML
        :type new_data: dict|list
        """
        self._delete_from_fakefs(path)
        self.fs.CreateFile(
            file_path=self._make_fakefs_path(path),
            contents=yaml.dump(new_data)
        )

    def setUp(self):
        super(FakeFSTest, self).setUp()
        template_paths = version_mapping.get_plugin_package_config(
            self.package_version)['templates']

        for template_path in template_paths:
            template_path = os.path.join(self.fpb_dir, template_path)
            print("Setting up fakeFs from template {}".format(template_path))
            for root, _, file_names in os.walk(template_path):
                for filename in file_names:
                    src_path = os.path.abspath(
                        os.path.join(root, filename)
                    )
                    extension = utils.get_path_extension(src_path)
                    if extension == consts.TEMPLATE_EXTENSION:
                        content = utils.template.render_template_file(
                            src_path, plugin_name="test-plugin")
                        dst_path = os.path.join(
                            self.plugin_path,
                            os.path.relpath(
                                utils.fs.get_path_without_extension(src_path),
                                template_path
                            )
                        )
                    else:
                        dst_path = os.path.join(
                            self.plugin_path,
                            os.path.relpath(
                                src_path,
                                template_path
                            )
                        )
                        with open(src_path) as f:
                            content = f.read()

                    try:
                        self.fs.RemoveObject(dst_path)
                    except IOError:
                        pass
                    self.fs.CreateFile(
                        file_path=dst_path,
                        contents=content
                    )

        self.validator = self.validator_class()
        if isinstance(self.loader_class.load, mock.Mock):
            self.loader_class.load.reset_mock()
        self.setUpPyfakefs()  # setup place is important
        self.loader = self.loader_class(self.plugin_path)
        self.data_tree = self.loader.load()

    def check_raised_exception(self, utils_mock, mock_data,
                               err_msg, executed_method,
                               err_type=errors.ValidationError):
        """Check if the given error with the given type was raised.

        :param obj utils_mock: fuel_plugin_builder.utils mock
        :param List[dict] mock_data: mock data
        :param str err_msg: what error message is expected
        :param function executed_method: what method should be executed
        :param Exception err_type: what error type is expected
        """
        utils_mock.parse_yaml.return_value = mock_data
        with self.assertRaisesRegexp(err_type, err_msg):
            executed_method()

    def tearDown(self):
        super(FakeFSTest, self).tearDown()


@ -1,183 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright 2014 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import jsonschema
import mock
from fuel_plugin_builder import errors
from fuel_plugin_builder.tests.base import BaseTestCase
from fuel_plugin_builder.validators import LegacyBaseValidator
class LegacyBaseValidatorTestCase(BaseTestCase):
def setUp(self):
class NewValidator(LegacyBaseValidator):
@property
def basic_version(self):
return None
def validate(self):
pass
self.plugin_path = '/tmp/plugin_path'
self.validator = NewValidator(self.plugin_path)
self.data = {'data': 'data1'}
self.schema = self.make_schema(['data'], {'data': {'type': 'string'}})
self.format_checker = jsonschema.FormatChecker
@classmethod
def make_schema(cls, required, properties):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': required,
'properties': properties}
@mock.patch('fuel_plugin_builder.validators.base.jsonschema')
def test_validate_schema(self, schema_mock):
self.validator.validate_schema(
self.data,
self.schema,
'file_path')
schema_mock.validate.assert_called_once_with(
self.data,
self.schema, format_checker=self.format_checker)
def test_validate_schema_raises_error(self):
schema = self.make_schema(['key'], {'key': {'type': 'string'}})
data = {}
with self.assertRaisesRegexp(
errors.ValidationError,
"File 'file_path', 'key' is a required property"):
self.validator.validate_schema(data, schema, 'file_path')
def test_validate_schema_raises_error_path_in_message(self):
schema = self.make_schema(
['key'],
{'key': {'type': 'array', 'items': {'type': 'string'}}})
data = {'key': ['str', 'str', 0]}
expected_error = ("File 'file_path', 0 is not of type "
"'string', value path 'key -> 2'")
with self.assertRaisesRegexp(
errors.ValidationError,
expected_error):
self.validator.validate_schema(data, schema, 'file_path')
def test_validate_schema_raises_error_custom_value_path(self):
schema = self.make_schema(['key'], {'key': {'type': 'string'}})
data = {}
with self.assertRaisesRegexp(
errors.ValidationError,
"File 'file_path', 'key' is a required property, "
"value path '0 -> path2'"):
self.validator.validate_schema(
data, schema, 'file_path', value_path=[0, 'path2'])
@mock.patch(
'fuel_plugin_builder.validators.base'
'.LegacyBaseValidator.validate_schema')
def test_validate_file_by_schema_failed(self, utils_mock):
utils_mock.parse_yaml.return_value = self.data
with self.assertRaisesRegexp(
errors.FileDoesNotExist,
"File '/tmp/plugin_path' does not exist"):
self.validator.validate_file_by_schema(
self.schema, self.plugin_path)
@mock.patch('fuel_plugin_builder.validators.base.utils')
@mock.patch(
'fuel_plugin_builder.validators.base'
'.LegacyBaseValidator.validate_schema')
def test_validate_file_by_schema(self, validate_mock, utils_mock):
utils_mock.parse_yaml.return_value = self.data
self.validator.validate_file_by_schema(self.schema, self.plugin_path)
utils_mock.parse_yaml.assert_called_once_with(self.plugin_path)
validate_mock(self.data, self.schema, self.plugin_path)
@mock.patch('fuel_plugin_builder.validators.base.utils')
@mock.patch(
'fuel_plugin_builder.validators.base'
'.LegacyBaseValidator.validate_schema')
def test_validate_file_by_schema_empty_file_passes(
self, validate_mock, utils_mock):
utils_mock.parse_yaml.return_value = None
self.validator.validate_file_by_schema(
self.schema,
self.plugin_path,
allow_empty=True)
utils_mock.parse_yaml.assert_called_once_with(self.plugin_path)
@mock.patch('fuel_plugin_builder.validators.base.utils')
@mock.patch(
'fuel_plugin_builder.validators.base'
'.LegacyBaseValidator.validate_schema')
def test_validate_file_by_schema_empty_file_fails(
self, validate_mock, utils_mock):
utils_mock.parse_yaml.return_value = None
with self.assertRaises(errors.FileIsEmpty):
self.validator.validate_file_by_schema(
self.schema,
self.plugin_path,
allow_empty=False)
def test_validate_schema_with_subschemas(self):
schema_object = {
'key': {
'type': 'array',
'items': {
'anyOf': [
{
'type': 'string'
},
{
'type': 'object',
'required': ['inner_key'],
'properties': {
'inner_key_1': {'type': 'string'},
'inner_key_2': {'type': 'string'},
}
},
{
'type': 'object',
'minProperties': 1,
'maxProperties': 1
}
]
}
}
}
schema = self.make_schema(['key'], schema_object)
with self.assertRaisesRegexp(
errors.ValidationError,
"File 'file_path', True is not of type 'string', "
"value path '0 -> path1 -> key -> 0'"):
data = {'key': [True]}
self.validator.validate_schema(
data, schema, 'file_path', value_path=[0, 'path1'])
with self.assertRaisesRegexp(
errors.ValidationError,
"File 'file_path', True is not of type 'string', "
"value path '0 -> path1 -> key -> 0 -> inner_key_1'"):
data = {'key': [{'inner_key_1': True, 'inner_key_2': 'str'}]}
self.validator.validate_schema(
data, schema, 'file_path', value_path=[0, 'path1'])


@ -1,401 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright 2014 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from __future__ import unicode_literals
import mock
import os
from os.path import join as join_path
from fuel_plugin_builder.actions.build import BaseBuildPlugin
from fuel_plugin_builder.actions.build import BuildPluginV1
from fuel_plugin_builder.actions.build import BuildPluginV2
from fuel_plugin_builder.actions.build import BuildPluginV3
from fuel_plugin_builder import errors
from fuel_plugin_builder.tests.base import BaseTestCase
class BaseBuild(BaseTestCase):
# Prevent test runner to run tests in base
__test__ = False
# Redefine class
builder_class = None
releases = [
{'os': 'ubuntu',
'deployment_scripts_path': 'deployment_scripts_path',
'repository_path': 'repository_path'}]
def setUp(self):
self.plugins_name = 'fuel_plugin'
self.plugin_path = '/tmp/{0}'.format(self.plugins_name)
self.builder = self.create_builder(self.plugin_path)
def create_builder(self, plugin_path, meta=None):
meta = meta or self.meta
with mock.patch(
'fuel_plugin_builder.actions.build.utils.parse_yaml',
return_value=meta):
return self.builder_class(plugin_path)
def test_run(self):
mocked_methods = [
'clean',
'run_pre_build_hook',
'check',
'build_repos',
'add_checksums_file',
'make_package']
self.mock_methods(self.builder, mocked_methods)
self.builder.run()
self.builder.clean.assert_called_once_with()
self.builder.run_pre_build_hook.assert_called_once_with()
self.builder.check.assert_called_once_with()
self.builder.add_checksums_file()
self.builder.build_repos.assert_called_once_with()
self.builder.make_package()
@mock.patch('fuel_plugin_builder.actions.build.utils.which')
@mock.patch('fuel_plugin_builder.actions.build.utils.exec_cmd',
return_value=True)
def test_run_pre_build_hook(self, exec_cmd_mock, which_mock):
self.builder.run_pre_build_hook()
exec_cmd_mock.assert_called_once_with(self.builder.pre_build_hook_cmd,
self.builder.plugin_path)
which_mock.assert_called_once_with(
join_path(self.builder.plugin_path,
self.builder.pre_build_hook_cmd))
@mock.patch('fuel_plugin_builder.actions.build.utils')
def test_build_repos(self, utils_mock):
with mock.patch.object(
self.builder_class, 'build_ubuntu_repos') as build_ubuntu_mock:
with mock.patch.object(
self.builder_class,
'build_centos_repos') as build_centos_mock:
self.builder.build_repos()
utils_mock.create_dir.assert_called_once_with(
self.builder.build_src_dir)
utils_mock.copy_files_in_dir.assert_called_once_with(
'/tmp/fuel_plugin/*',
self.builder.build_src_dir)
build_centos_mock.assert_called_once_with([])
build_ubuntu_mock.assert_called_once_with([
'/tmp/fuel_plugin/.build/src/repository_path'])
@mock.patch('fuel_plugin_builder.actions.build.utils')
def test_build_ubuntu_repos(self, utils_mock):
path = '/repo/path'
self.builder.build_ubuntu_repos([path])
utils_mock.exec_piped_cmds.assert_called_once_with(
['dpkg-scanpackages -m .', 'gzip -c9 > Packages.gz'],
cwd=path)
@mock.patch('fuel_plugin_builder.actions.build.utils')
def test_build_centos_repos(self, utils_mock):
path = '/repo/path'
self.builder.build_centos_repos([path])
utils_mock.create_dir.assert_called_once_with(
'/repo/path/Packages')
utils_mock.move_files_in_dir.assert_called_once_with(
'/repo/path/*.rpm', '/repo/path/Packages')
utils_mock.exec_cmd.assert_called_once_with(
'createrepo -o /repo/path /repo/path')
@mock.patch.object(BaseBuildPlugin, '_check_requirements')
@mock.patch.object(BaseBuildPlugin, '_check_structure')
def test_check(self, check_structure_mock, check_requirements_mock):
self.builder.check()
check_structure_mock.assert_called_once_with()
check_requirements_mock.assert_called_once_with()
@mock.patch('fuel_plugin_builder.actions.build.utils.which',
return_value=True)
def test_check_requirements(self, _):
self.builder._check_requirements()
@mock.patch('fuel_plugin_builder.actions.build.ValidatorManager')
def test_check_structure(self, manager_class_mock):
validator_manager_obj = mock.MagicMock()
manager_class_mock.return_value = validator_manager_obj
validator_mock = mock.MagicMock()
validator_manager_obj.get_validator.return_value = validator_mock
self.builder._check_structure()
manager_class_mock.assert_called_once_with(self.plugin_path)
validator_manager_obj.get_validator.assert_called_once_with()
validator_mock.validate.assert_called_once_with()
@mock.patch(
'fuel_plugin_builder.actions.build.utils.create_checksums_file')
def test_add_checksums_file(self, create_checksums_file_mock):
self.builder.add_checksums_file()
create_checksums_file_mock.assert_called_once_with(
self.builder.build_src_dir, self.builder.checksums_path)
@mock.patch('fuel_plugin_builder.actions.build.utils')
def test_clean(self, utils_mock):
self.builder.clean()
utils_mock.assert_has_calls([
mock.call.remove(self.builder.build_dir),
mock.call.create_dir(self.builder.build_dir),
mock.call.remove_by_mask(self.builder.result_package_mask)])
class TestBaseBuildV1(BaseBuild):
__test__ = True
builder_class = BuildPluginV1
meta = {
'releases': BaseBuild.releases,
'version': '1.2.3',
'name': 'plugin_name'
}
@mock.patch('fuel_plugin_builder.actions.build.utils')
def test_make_package(self, utils_mock):
self.builder.make_package()
tar_path = '/tmp/fuel_plugin/plugin_name-1.2.3.fp'
utils_mock.make_tar_gz.assert_called_once_with(
self.builder.build_src_dir,
tar_path,
'plugin_name-1.2.3')
@mock.patch('fuel_plugin_builder.actions.build.utils.which',
return_value=False)
def test_check_requirements_raises_error(self, _):
self.assertRaisesRegexp(
errors.FuelCannotFindCommandError,
'Cannot find commands "rpm, createrepo, dpkg-scanpackages", '
'install required commands and try again',
self.builder._check_requirements)
class TestBaseBuildV2(BaseBuild):
__test__ = True
builder_class = BuildPluginV2
meta = {
'releases': BaseBuild.releases,
'version': '1.2.3',
'name': 'plugin_name',
'title': 'Plugin title',
'description': 'Description',
'licenses': ['Apache', 'BSD'],
'authors': ['author1', 'author2'],
'homepage': 'url'
}
def path_from_plugin(self, plugin_path, path):
return join_path(plugin_path, path)
@mock.patch('fuel_plugin_builder.actions.build.utils')
def check_make_package(self, builder, plugin_path, utils_mock):
plugin_path = plugin_path
utils_mock.get_current_year.return_value = '2014'
builder.make_package()
rpm_src_path = self.path_from_plugin(plugin_path,
'.build/rpm/SOURCES')
utils_mock.create_dir.assert_called_once_with(rpm_src_path)
fp_dst = self.path_from_plugin(
plugin_path, '.build/rpm/SOURCES/plugin_name-1.2.fp')
utils_mock.make_tar_gz.assert_called_once_with(
self.path_from_plugin(plugin_path, '.build/src'),
fp_dst,
'plugin_name-1.2')
spec_src = os.path.abspath(join_path(
os.path.dirname(__file__), '..',
self.builder.rpm_spec_src_path))
utils_mock.render_to_file.assert_called_once_with(
spec_src,
join_path(plugin_path, '.build/rpm/plugin_rpm.spec'),
{'vendor': 'author1, author2',
'description': 'Description',
'license': 'Apache and BSD',
'summary': 'Plugin title',
'version': '1.2.3',
'homepage': 'url',
'name': 'plugin_name-1.2',
'year': '2014'})
utils_mock.exec_cmd.assert_called_once_with(
'rpmbuild -vv --nodeps --define "_topdir {0}" -bb '
'{1}'.format(
self.path_from_plugin(plugin_path, '.build/rpm'),
self.path_from_plugin(plugin_path,
'.build/rpm/plugin_rpm.spec')))
utils_mock.copy_files_in_dir.assert_called_once_with(
self.path_from_plugin(plugin_path,
'.build/rpm/RPMS/noarch/*.rpm'),
plugin_path
)
def test_make_package(self):
self.check_make_package(self.builder, self.plugin_path)
def test_make_package_with_non_ascii_chars_in_path(self):
plugin_path = '/tmp/тест/' + self.plugins_name
builder = self.create_builder(plugin_path)
self.check_make_package(builder, plugin_path)
@mock.patch('fuel_plugin_builder.actions.build.utils.which',
return_value=False)
def test_check_requirements_raises_error(self, _):
self.assertRaisesRegexp(
errors.FuelCannotFindCommandError,
'Cannot find commands "rpmbuild, rpm, createrepo, '
'dpkg-scanpackages", install required commands and try again',
self.builder._check_requirements)
@mock.patch('fuel_plugin_builder.actions.build.utils')
def test_build_ubuntu_repos(self, utils_mock):
path = '/repo/path'
self.builder.build_ubuntu_repos([path])
utils_mock.exec_piped_cmds.assert_called_once_with(
['dpkg-scanpackages -m .', 'gzip -c9 > Packages.gz'],
cwd=path)
release_src = os.path.abspath(join_path(
os.path.dirname(__file__), '..',
self.builder.release_tmpl_src_path))
utils_mock.render_to_file.assert_called_once_with(
release_src,
'/repo/path/Release',
{'major_version': '1.2',
'plugin_name': 'plugin_name'})
class TestBaseBuildV3(BaseBuild):
__test__ = True
builder_class = BuildPluginV3
meta = {
'releases': BaseBuild.releases,
'version': '1.2.3',
'name': 'plugin_name',
'title': 'Plugin title',
'description': 'Description',
'licenses': ['Apache', 'BSD'],
'authors': ['author1', 'author2'],
'homepage': 'url'
}
def path_from_plugin(self, path):
return join_path(self.plugin_path, path)
@mock.patch('fuel_plugin_builder.actions.build.utils')
def _test_make_package(self, utils_mock):
utils_mock.get_current_year.return_value = '2014'
utils_mock.read_if_exist.side_effect = ['echo uninst', 'echo preinst',
'echo postinst']
self.builder.make_package()
rpm_src_path = self.path_from_plugin('.build/rpm/SOURCES')
utils_mock.create_dir.assert_called_once_with(rpm_src_path)
fp_dst = self.path_from_plugin('.build/rpm/SOURCES/plugin_name-1.2.fp')
utils_mock.make_tar_gz.assert_called_once_with(
self.path_from_plugin('.build/src'),
fp_dst,
'plugin_name-1.2')
utils_mock.exec_cmd.assert_called_once_with(
'rpmbuild -vv --nodeps --define "_topdir {0}" -bb '
'{1}'.format(
self.path_from_plugin('.build/rpm'),
self.path_from_plugin('.build/rpm/plugin_rpm.spec')))
utils_mock.copy_files_in_dir.assert_called_once_with(
self.path_from_plugin('.build/rpm/RPMS/noarch/*.rpm'),
self.plugin_path)
utils_mock.read_if_exist.assert_has_calls([
mock.call(self.path_from_plugin('uninstall.sh')),
mock.call(self.path_from_plugin('pre_install.sh')),
mock.call(self.path_from_plugin('post_install.sh'))])
return utils_mock
def test_make_package(self):
utils_mock = self._test_make_package()
spec_src = os.path.abspath(join_path(
os.path.dirname(__file__), '..',
self.builder.rpm_spec_src_path))
utils_mock.render_to_file.assert_called_once_with(
spec_src,
join_path(self.plugin_path, '.build/rpm/plugin_rpm.spec'),
{'vendor': 'author1, author2',
'description': 'Description',
'license': 'Apache and BSD',
'summary': 'Plugin title',
'version': '1.2.3',
'homepage': 'url',
'name': 'plugin_name-1.2',
'year': '2014',
'preinstall_hook': 'echo preinst',
'postinstall_hook': 'echo postinst',
'uninstall_hook': 'echo uninst',
'build_version': '1'})
def test_make_package_with_build_version(self):
meta = {
'releases': BaseBuild.releases,
'version': '1.2.3',
'name': 'plugin_name',
'title': 'Plugin title',
'description': 'Description',
'licenses': ['Apache', 'BSD'],
'authors': ['author1', 'author2'],
'homepage': 'url',
'build_version': '34'
}
self.builder = self.create_builder(self.plugin_path, meta=meta)
utils_mock = self._test_make_package()
spec_src = os.path.abspath(join_path(
os.path.dirname(__file__), '..',
self.builder.rpm_spec_src_path))
utils_mock.render_to_file.assert_called_once_with(
spec_src,
join_path(self.plugin_path, '.build/rpm/plugin_rpm.spec'),
{'vendor': 'author1, author2',
'description': 'Description',
'license': 'Apache and BSD',
'summary': 'Plugin title',
'version': '1.2.3',
'homepage': 'url',
'name': 'plugin_name-1.2',
'year': '2014',
'preinstall_hook': 'echo preinst',
'postinstall_hook': 'echo postinst',
'uninstall_hook': 'echo uninst',
'build_version': '34'})

View File

@ -0,0 +1,155 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from __future__ import unicode_literals
from os.path import join as join_path
import mock
from fuel_plugin_builder import builders
from fuel_plugin_builder.tests.base import BaseTestCase
from fuel_plugin_builder import utils
class BaseBuildTestCase(BaseTestCase):
# Prevent the test runner from running tests in this base class

__test__ = False
# Redefine class
builder_class = builders.PluginBuilderBase
fake_metadata = None
releases = [
{'operating_system': 'ubuntu',
'deployment_scripts_path': 'deployment_scripts_path',
'repository_path': 'repository_path'}]
def setUp(self):
super(BaseTestCase, self).setUp()
self.plugin_path = '/tmp/fuel_plugin'
self.builder = self._create_builder(self.plugin_path)
def _create_builder(self, plugin_path, fake_metadata=None):
fake_metadata = utils.ReportNode().mix_to_data(
fake_metadata or self.fake_metadata)
loader = self.builder_class.loader_class(plugin_path)
loader.load = mock.Mock(return_value=fake_metadata)
return self.builder_class(plugin_path, loader=loader)
def test_run(self):
mocked_methods = [
'clean',
'run_pre_build_hook',
'check',
'build_repos',
'add_checksums_file',
'make_package']
self.mock_methods(self.builder, mocked_methods)
self.builder.run()
self.builder.clean.assert_called_once_with()
self.builder.run_pre_build_hook.assert_called_once_with()
self.builder.check.assert_called_once_with()
self.builder.add_checksums_file.assert_called_once_with()
self.builder.build_repos.assert_called_once_with()
self.builder.make_package.assert_called_once_with()
@mock.patch('fuel_plugin_builder.utils.which')
@mock.patch('fuel_plugin_builder.utils.exec_cmd',
return_value=True)
def test_run_pre_build_hook(self, exec_cmd_mock, which_mock):
self.builder.run_pre_build_hook()
exec_cmd_mock.assert_called_once_with(self.builder.pre_build_hook_cmd,
self.builder.plugin_path)
which_mock.assert_called_once_with(
join_path(self.builder.plugin_path,
self.builder.pre_build_hook_cmd))
@mock.patch('fuel_plugin_builder.utils.create_dir')
@mock.patch('fuel_plugin_builder.utils.copy_files_in_dir')
def test_build_repos(self, copy_files_in_dir_m, create_dir_m):
with mock.patch.object(
self.builder_class, 'build_ubuntu_repos') as build_ubuntu_mock:
with mock.patch.object(
self.builder_class,
'build_centos_repos') as build_centos_mock:
self.builder.build_repos()
create_dir_m.assert_called_once_with(
self.builder.build_src_dir)
copy_files_in_dir_m.assert_called_once_with(
'/tmp/fuel_plugin/*',
self.builder.build_src_dir)
build_centos_mock.assert_called_once_with([])
build_ubuntu_mock.assert_called_once_with([
'/tmp/fuel_plugin/.build/src/repository_path'])
@mock.patch('fuel_plugin_builder.utils.exec_piped_cmds')
@mock.patch('fuel_plugin_builder.utils.load_template_and_render_to_file')
def test_build_ubuntu_repos(self,
load_template_and_render_to_file_m,
exec_piped_cmds_m):
path = '/repo/path'
self.builder.build_ubuntu_repos([path])
exec_piped_cmds_m.assert_called_once_with(
['dpkg-scanpackages -m .', 'gzip -c9 > Packages.gz'],
cwd=path)
@mock.patch('fuel_plugin_builder.utils.create_dir')
@mock.patch('fuel_plugin_builder.utils.move_files_in_dir')
@mock.patch('fuel_plugin_builder.utils.exec_cmd')
def test_build_centos_repos(
self, exec_cmd_m, move_files_in_dir_m, create_dir_m):
path = '/repo/path'
self.builder.build_centos_repos([path])
create_dir_m.assert_called_once_with(
'/repo/path/Packages')
move_files_in_dir_m.assert_called_once_with(
'/repo/path/*.rpm', '/repo/path/Packages')
exec_cmd_m.assert_called_once_with(
'createrepo -o /repo/path /repo/path')
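A rough standalone equivalent of the behaviour asserted here (not the builder's actual code): RPMs are gathered into a Packages/ subdirectory, then createrepo writes the repository metadata in place.

import glob
import os
import shutil
import subprocess

def build_centos_repo(repo_path):
    packages_dir = os.path.join(repo_path, 'Packages')
    if not os.path.isdir(packages_dir):
        os.makedirs(packages_dir)          # what utils.create_dir is mocked for
    for rpm in glob.glob(os.path.join(repo_path, '*.rpm')):
        shutil.move(rpm, packages_dir)     # what utils.move_files_in_dir covers
    subprocess.check_call(
        'createrepo -o {0} {0}'.format(repo_path), shell=True)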
@mock.patch.object(builders.PluginBuilderBase, '_check_requirements')
@mock.patch.object(builders.PluginBuilderBase, '_validate')
def test_check(self, check_structure_mock, check_requirements_mock):
check_structure_mock.return_value = utils.ReportNode('Mock node')
check_requirements_mock.return_value = utils.ReportNode('Mock node')
self.builder.check()
check_structure_mock.assert_called_once_with()
check_requirements_mock.assert_called_once_with()
@mock.patch('fuel_plugin_builder.utils.which',
return_value=True)
def test_check_requirements(self, _):
self.builder._check_requirements()
@mock.patch(
'fuel_plugin_builder.utils.create_checksums_file')
def test_add_checksums_file(self, create_checksums_file_mock):
self.builder.add_checksums_file()
create_checksums_file_mock.assert_called_once_with(
self.builder.build_src_dir, self.builder.checksums_path)
@mock.patch('fuel_plugin_builder.utils.remove')
@mock.patch('fuel_plugin_builder.utils.create_dir')
@mock.patch('fuel_plugin_builder.utils.remove_by_mask')
def test_clean(self, remove_by_mask_m, created_dir_m, remove_m):
self.builder.clean()
remove_m.assert_called_once_with(self.builder.build_dir)
created_dir_m.assert_called_once_with(self.builder.build_dir)
remove_by_mask_m.assert_called_once_with(
self.builder.result_package_mask)
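Taken together, these base-class tests fix the build pipeline's order of operations. A condensed, hypothetical view of that flow (the real PluginBuilderBase also threads a ReportNode through each step):

def run_build(builder):
    # Order of operations asserted by test_run above.
    builder.clean()               # wipe .build/ and stale packages
    builder.run_pre_build_hook()  # optional executable in the plugin root
    builder.check()               # schema validation + required-tools check
    builder.build_repos()         # copy sources, build ubuntu/centos repos
    builder.add_checksums_file()  # record per-file checksums
    builder.make_package()        # emit the final artifact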

View File

@ -0,0 +1,54 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from __future__ import unicode_literals
import mock
from fuel_plugin_builder import builders
from fuel_plugin_builder import errors
from fuel_plugin_builder.tests.test_builder_base import BaseBuildTestCase
class TestBuilderV1(BaseBuildTestCase):
__test__ = True
builder_class = builders.PluginBuilderV1
fake_metadata = {
'releases': BaseBuildTestCase.releases,
'version': '1.2.3',
'name': 'plugin_name',
'package_version': '1.0.0'
}
@mock.patch('fuel_plugin_builder.utils.make_tar_gz')
def test_make_package(self, make_tar_gz_m):
self.builder.make_package()
tar_path = '/tmp/fuel_plugin/plugin_name-1.2.3.fp'
make_tar_gz_m.assert_called_once_with(
self.builder.build_src_dir,
tar_path,
'plugin_name-1.2.3')
@mock.patch('fuel_plugin_builder.utils.which',
return_value=False)
def test_check_requirements_raises_error(self, _):
self.assertRaisesRegexp(
errors.FuelCannotFindCommandError,
'Cannot find commands "rpm, createrepo, dpkg-scanpackages", '
'install required commands and try again',
self.builder._check_requirements)

View File

@ -0,0 +1,140 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from __future__ import unicode_literals
import os
from os.path import join as join_path
import mock
from fuel_plugin_builder import builders
from fuel_plugin_builder import errors
from fuel_plugin_builder.tests.test_builder_base import BaseBuildTestCase
class TestBuilderV2(BaseBuildTestCase):
__test__ = True
builder_class = builders.PluginBuilderV2
fake_metadata = {
'releases': BaseBuildTestCase.releases,
'version': '1.2.3',
'name': 'plugin_name',
'title': 'Plugin title',
'description': 'Description',
'licenses': ['Apache', 'BSD'],
'authors': ['author1', 'author2'],
'homepage': 'url'
}
def path_from_plugin(self, plugin_path, path):
return join_path(plugin_path, path)
# FIXME(ikutukov): investigate a better approach to utils mocking
@mock.patch('fuel_plugin_builder.utils.get_current_year')
@mock.patch('fuel_plugin_builder.utils.create_dir')
@mock.patch('fuel_plugin_builder.utils.make_tar_gz')
@mock.patch('fuel_plugin_builder.utils.load_template_and_render_to_file')
@mock.patch('fuel_plugin_builder.utils.exec_cmd')
@mock.patch('fuel_plugin_builder.utils.copy_files_in_dir')
def check_make_package(self, builder, plugin_path,
copy_files_in_dir_m, exec_cmd_m,
load_template_and_render_to_file_m, make_tar_gz_m,
create_dir_m, get_current_year_m):
get_current_year_m.return_value = '2016'
builder.make_package()
rpm_src_path = self.path_from_plugin(plugin_path,
'.build/rpm/SOURCES')
create_dir_m.assert_called_once_with(rpm_src_path)
fp_dst = self.path_from_plugin(
plugin_path, '.build/rpm/SOURCES/plugin_name-1.2.fp')
make_tar_gz_m.assert_called_once_with(
self.path_from_plugin(plugin_path, '.build/src'),
fp_dst,
'plugin_name-1.2')
spec_src = os.path.abspath(join_path(
os.path.dirname(__file__), '..',
self.builder.rpm_spec_src_path))
load_template_and_render_to_file_m.assert_called_once_with(
spec_src,
join_path(plugin_path, '.build/rpm/plugin_rpm.spec'),
{
'vendor': 'author1, author2',
'description': 'Description',
'license': 'Apache and BSD',
'summary': 'Plugin title',
'version': '1.2.3',
'homepage': 'url',
'name': 'plugin_name-1.2',
'year': '2016'
}
)
exec_cmd_m.assert_called_once_with(
'rpmbuild -vv --nodeps --define "_topdir {0}" -bb '
'{1}'.format(
self.path_from_plugin(plugin_path, '.build/rpm'),
self.path_from_plugin(plugin_path,
'.build/rpm/plugin_rpm.spec')))
copy_files_in_dir_m.assert_called_once_with(
self.path_from_plugin(plugin_path,
'.build/rpm/RPMS/noarch/*.rpm'),
plugin_path
)
def test_make_package(self):
self.check_make_package(self.builder, self.plugin_path)
def test_make_package_with_non_ascii_chars_in_path(self):
plugin_path = '/tmp/тест/fuel_plugin'
builder = self._create_builder(plugin_path)
self.check_make_package(builder, plugin_path)
@mock.patch('fuel_plugin_builder.utils.which',
return_value=False)
def test_check_requirements_raises_error(self, _):
self.assertRaisesRegexp(
errors.FuelCannotFindCommandError,
'Cannot find commands "rpmbuild, rpm, createrepo, '
'dpkg-scanpackages", install required commands and try again',
self.builder._check_requirements)
@mock.patch('fuel_plugin_builder.utils.exec_piped_cmds')
@mock.patch('fuel_plugin_builder.utils.load_template_and_render_to_file')
def test_build_ubuntu_repos(
self, load_template_and_render_to_file_m, exec_piped_cmds_m):
path = '/repo/path'
self.builder.build_ubuntu_repos([path])
exec_piped_cmds_m.assert_called_once_with(
['dpkg-scanpackages -m .', 'gzip -c9 > Packages.gz'],
cwd=path)
release_src = os.path.abspath(join_path(
os.path.dirname(__file__), '..',
self.builder.release_tmpl_src_path))
load_template_and_render_to_file_m.assert_called_once_with(
release_src,
'/repo/path/Release',
{
'major_version': '1.2',
'plugin_name': 'plugin_name'
}
)
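The exact rpmbuild invocation asserted above is worth spelling out: the build tree is rooted at the plugin's private .build/rpm directory via _topdir, it is a binary-only build (-bb), and dependency checks are skipped (--nodeps). A minimal sketch, assuming plain subprocess semantics:

import subprocess

def build_rpm(topdir, spec_path):
    # Self-contained rpmbuild run rooted at the plugin's .build/rpm tree.
    subprocess.check_call(
        'rpmbuild -vv --nodeps --define "_topdir {0}" -bb {1}'.format(
            topdir, spec_path),
        shell=True)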

View File

@ -0,0 +1,142 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from __future__ import unicode_literals
import os
from os.path import join as join_path
import mock
from fuel_plugin_builder import builders
from fuel_plugin_builder.tests.test_builder_base import BaseBuildTestCase
class TestBuilderV3(BaseBuildTestCase):
__test__ = True
builder_class = builders.PluginBuilderV3
fake_metadata = {
'releases': BaseBuildTestCase.releases,
'version': '1.2.3',
'name': 'plugin_name',
'title': 'Plugin title',
'description': 'Description',
'licenses': ['Apache', 'BSD'],
'authors': ['author1', 'author2'],
'homepage': 'url'
}
def path_from_plugin(self, path):
return join_path(self.plugin_path, path)
@mock.patch('fuel_plugin_builder.utils.get_current_year')
@mock.patch('fuel_plugin_builder.utils.create_dir')
@mock.patch('fuel_plugin_builder.utils.make_tar_gz')
@mock.patch('fuel_plugin_builder.utils.exec_cmd')
@mock.patch('fuel_plugin_builder.utils.copy_files_in_dir')
@mock.patch('fuel_plugin_builder.utils.FilesManager.load')
def _test_make_package(self, load_m,
copy_files_in_dir_m, exec_cmd_m,
make_tar_gz_m,
create_dir_m, get_current_year_m):
get_current_year_m.return_value = '2014'
load_m.side_effect = ['echo uninst', 'echo preinst', 'echo postinst']
self.builder.make_package()
rpm_src_path = self.path_from_plugin('.build/rpm/SOURCES')
create_dir_m.assert_called_once_with(rpm_src_path)
fp_dst = self.path_from_plugin('.build/rpm/SOURCES/plugin_name-1.2.fp')
make_tar_gz_m.assert_called_once_with(
self.path_from_plugin('.build/src'),
fp_dst,
'plugin_name-1.2')
exec_cmd_m.assert_called_once_with(
'rpmbuild -vv --nodeps --define "_topdir {0}" -bb '
'{1}'.format(
self.path_from_plugin('.build/rpm'),
self.path_from_plugin('.build/rpm/plugin_rpm.spec')))
copy_files_in_dir_m.assert_called_once_with(
self.path_from_plugin('.build/rpm/RPMS/noarch/*.rpm'),
self.plugin_path)
load_m.assert_has_calls([
mock.call(self.path_from_plugin('uninstall.sh')),
mock.call(self.path_from_plugin('pre_install.sh')),
mock.call(self.path_from_plugin('post_install.sh'))])
@mock.patch('fuel_plugin_builder.utils.load_template_and_render_to_file')
def test_make_package(self, load_template_and_render_to_file_m):
self._test_make_package()
spec_src = os.path.abspath(join_path(
os.path.dirname(__file__), '..',
self.builder.rpm_spec_src_path))
load_template_and_render_to_file_m.assert_called_once_with(
spec_src,
join_path(self.plugin_path, '.build/rpm/plugin_rpm.spec'),
{'vendor': 'author1, author2',
'description': 'Description',
'license': 'Apache and BSD',
'summary': 'Plugin title',
'version': '1.2.3',
'homepage': 'url',
'name': 'plugin_name-1.2',
'year': '2014',
'preinstall_hook': 'echo preinst',
'postinstall_hook': 'echo postinst',
'uninstall_hook': 'echo uninst',
'build_version': '1'})
@mock.patch('fuel_plugin_builder.utils.load_template_and_render_to_file')
def test_make_package_with_build_version(
self, load_template_and_render_to_file_m):
meta = {
'releases': BaseBuildTestCase.releases,
'version': '1.2.3',
'name': 'plugin_name',
'title': 'Plugin title',
'description': 'Description',
'licenses': ['Apache', 'BSD'],
'authors': ['author1', 'author2'],
'homepage': 'url',
'build_version': '34'
}
self.builder = self._create_builder(
self.plugin_path, fake_metadata=meta)
self._test_make_package()
spec_src = os.path.abspath(join_path(
os.path.dirname(__file__), '..',
self.builder.rpm_spec_src_path))
load_template_and_render_to_file_m.assert_called_once_with(
spec_src,
join_path(self.plugin_path, '.build/rpm/plugin_rpm.spec'),
{'vendor': 'author1, author2',
'description': 'Description',
'license': 'Apache and BSD',
'summary': 'Plugin title',
'version': '1.2.3',
'homepage': 'url',
'name': 'plugin_name-1.2',
'year': '2014',
'preinstall_hook': 'echo preinst',
'postinstall_hook': 'echo postinst',
'uninstall_hook': 'echo uninst',
'build_version': '34'})
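The v3 builder differs from v2 mainly in the template context: uninstall.sh, pre_install.sh and post_install.sh are read from the plugin root and embedded into the spec as scriptlet hooks, and build_version defaults to '1'. A hypothetical assembly of that context, matching the dictionaries asserted above (helper name and signature are illustrative only):

def make_spec_context(meta, hooks, year):
    major = '.'.join(meta['version'].split('.')[:2])   # '1.2.3' -> '1.2'
    return {
        'name': '{0}-{1}'.format(meta['name'], major),
        'version': meta['version'],
        'summary': meta['title'],
        'description': meta['description'],
        'vendor': ', '.join(meta['authors']),
        'license': ' and '.join(meta['licenses']),
        'homepage': meta['homepage'],
        'year': year,
        'preinstall_hook': hooks.get('pre_install', ''),
        'postinstall_hook': hooks.get('post_install', ''),
        'uninstall_hook': hooks.get('uninstall', ''),
        'build_version': meta.get('build_version', '1'),
    }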

View File

@ -0,0 +1,166 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import mock
from fuel_plugin_builder import checks
from fuel_plugin_builder.tests.base import BaseTestCase
from fuel_plugin_builder import utils
class TestChecks(BaseTestCase):
def test_json_schema_is_valid(self):
report = checks.json_schema_is_valid(
utils.make_schema(
['data'],
{'data': {'type': 'string'}}
),
{'data': 'data'}
)
self.assertEqual(0, report.count_failures())
def test_json_schema_is_invalid(self):
report = checks.json_schema_is_valid(
utils.make_schema(
['data'],
{'data': {'type': 'string'}}
),
{'bad_data': 'data'}
)
self.assertEqual(1, report.count_failures())
self.assertTrue(report.is_failed())
self.assertIn("'data' is a required property", report.render())
def test_multi_json_schema_is_valid(self):
report = checks.multi_json_schema_is_valid(
schemas={
'type1': utils.make_schema(
['data1'],
{'data1': {'type': 'string'}}
),
'type2': utils.make_schema(
['data2'],
{'data2': {'type': 'string'}}
)
},
data=[
{'type': 'type1', 'data1': 'somedata'},
{'type': 'type2', 'data2': 'somedata'}
]
)
self.assertIn("Success!", report.render())
self.assertFalse(report.is_failed())
self.assertEqual(0, report.count_failures())
def test_multi_json_schema_is_invalid(self):
report = checks.multi_json_schema_is_valid(
schemas={
'type1': utils.make_schema(
['data1'],
{'data1': {'type': 'string'}}
),
'type2': utils.make_schema(
['data2'],
{'data2': {'type': 'string'}}
)
},
data=[
{
'type': 'badtype',
'data1': 'somedata'
},
{
'type': 'type1',
'badkey': 'somedata'
}
]
)
self.assertTrue(report.is_failed())
self.assertEqual(2, report.count_failures())
self.assertIn("Please fix 2 errors listed above", report.render())
@mock.patch('fuel_plugin_builder.utils.fs.os.path.lexists')
@mock.patch('fuel_plugin_builder.utils.fs.os.path.isfile')
def test_is_file_is_ok(self, isfile_m, exists_m):
exists_m.return_value = True
isfile_m.return_value = True
report = checks.file_exists('.')
self.assertFalse(report.is_failed())
@mock.patch('fuel_plugin_builder.utils.fs.os.path.lexists')
@mock.patch('fuel_plugin_builder.utils.fs.os.path.isfile')
def test_is_file_is_not_ok(self, isfile_m, exists_m):
exists_m.return_value = True
isfile_m.return_value = False
report = checks.file_exists('.')
self.assertTrue(report.is_failed())
exists_m.return_value = False
isfile_m.return_value = True
report = checks.file_exists('.')
self.assertTrue(report.is_failed())
exists_m.return_value = False
isfile_m.return_value = False
report = checks.file_exists('.')
self.assertTrue(report.is_failed())
def test_is_compatible_ok(self):
fuel_version_checks = (
(['6.0', '6.1', '7.0', '8.0']),
(['6.1', '7.0', '8.0']),
)
for fuel_version in fuel_version_checks:
report = checks.fuel_ver_compatible_with_package_ver(
minimal_fuel_version='6.0',
plugin_metadata={
'fuel_version': fuel_version,
'package_version': '4.0.0'
}
)
self.assertFalse(report.is_failed())
self.assertIn('Expected Fuel version >= 6.0', report.render())
def test_is_compatible_fail(self):
fuel_version_checks = (
(['6.0', '6.1', '7.0', '8.0', '9.0'], ['6.0', '6.1', '7.0']),
(['6.1', '7.0'], ['6.1', '7.0']),
)
minimal_fuel_version = '8.0'
for fuel_version, bad_versions in fuel_version_checks:
report = checks.fuel_ver_compatible_with_package_ver(
minimal_fuel_version=minimal_fuel_version,
plugin_metadata={
'fuel_version': fuel_version,
'package_version': '4.0.0'
}
)
self.assertEqual(1, report.count_failures())
self.assertIn(
'Current plugin format 4.0.0 is not compatible '
'with {0} Fuel release'
''.format(', '.join(bad_versions)),
report.render()
)
self.assertIn(
'Fuel version must be {} or higher'
''.format(minimal_fuel_version),
report.render()
)
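The compatibility rule these two tests encode: every Fuel release listed in fuel_version that is older than the minimum required by the package format is named in the failure report. A guess at the core comparison:

from distutils.version import StrictVersion

def incompatible_fuel_versions(fuel_versions, minimal_fuel_version):
    # Releases below the package format's minimal Fuel version fail.
    minimum = StrictVersion(minimal_fuel_version)
    return [v for v in fuel_versions if StrictVersion(v) < minimum]

# e.g. minimal '8.0' against ['6.0', '6.1', '7.0', '8.0', '9.0']
# yields ['6.0', '6.1', '7.0'], the versions named in the report.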

View File

@ -32,8 +32,12 @@ class TestCli(BaseTestCase):
perform_action(args)
actions_mock.CreatePlugin.assert_called_once_with(
'plugin_path',
'2.0.0')
plugin_path='plugin_path',
package_version='2.0.0',
fuel_import=mock.ANY,
nailgun_path=mock.ANY,
library_path=mock.ANY
)
creatre_obj.run.assert_called_once_with()
@mock.patch('fuel_plugin_builder.cli.actions')
@ -44,7 +48,13 @@ class TestCli(BaseTestCase):
perform_action(args)
actions_mock.CreatePlugin.assert_called_once_with('plugin_path', None)
actions_mock.CreatePlugin.assert_called_once_with(
plugin_path='plugin_path',
package_version=None,
fuel_import=mock.ANY,
nailgun_path=mock.ANY,
library_path=mock.ANY
)
creatre_obj.run.assert_called_once_with()
@mock.patch('fuel_plugin_builder.cli.actions.make_builder')
@ -60,7 +70,7 @@ class TestCli(BaseTestCase):
builder_mock.assert_called_once_with('plugin_path')
build_obj.run.assert_called_once_with()
@mock.patch('fuel_plugin_builder.cli.ValidatorManager')
@mock.patch('fuel_plugin_builder.cli.version_mapping.get_validator')
def test_perform_check(self, validator_mock):
args = mock.MagicMock(
create=None,

View File

@ -16,28 +16,26 @@
import mock
from fuel_plugin_builder.actions import CreatePlugin
from fuel_plugin_builder import actions
from fuel_plugin_builder import errors
from fuel_plugin_builder import messages
from fuel_plugin_builder.tests.base import BaseTestCase
class TestCreate(BaseTestCase):
def setUp(self):
self.plugins_name = 'fuel_plugin'
self.plugin_path = '/tmp/{0}'.format(self.plugins_name)
self.template_dir = '/temp_dir'
self.creator = CreatePlugin(self.plugin_path)
self.creator = actions.CreatePlugin(self.plugin_path)
self.creator.template_dir = self.template_dir
@mock.patch('fuel_plugin_builder.actions.create.utils.exists',
@mock.patch('fuel_plugin_builder.actions.create.utils.is_exists',
return_value=False)
def test_check(self, exists_mock):
self.creator.check()
exists_mock.assert_called_once_with(self.plugin_path)
@mock.patch('fuel_plugin_builder.actions.create.utils.exists',
@mock.patch('fuel_plugin_builder.actions.create.utils.is_exists',
return_value=True)
def test_check_when_plugin_exists_with_same_name(self, exists_mock):
self.assertRaisesRegexp(
@ -47,17 +45,18 @@ class TestCreate(BaseTestCase):
self.creator.check)
exists_mock.assert_called_once_with(self.plugin_path)
@mock.patch('fuel_plugin_builder.actions.create.utils.exists',
@mock.patch('fuel_plugin_builder.actions.create.utils.is_exists',
return_value=False)
def test_check_with_invalid_name(self, exists_mock):
self.creator.plugin_name = 'Test_plugin'
self.assertRaisesRegexp(
errors.ValidationError,
messages.PLUGIN_WRONG_NAME_EXCEPTION_MESSAGE,
"Plugin name is invalid, use only lower "
"case letters, numbers, '_', '-' symbols",
self.creator.check)
exists_mock.assert_called_once_with(self.plugin_path)
@mock.patch.object(CreatePlugin, 'check')
@mock.patch.object(actions.CreatePlugin, 'check')
@mock.patch('fuel_plugin_builder.actions.create.utils')
def test_run(self, utils_mock, _):
self.creator.run()

View File

@ -0,0 +1,506 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import re

from fuel_plugin_builder import loaders
from fuel_plugin_builder.tests.base import FakeFSTest
from fuel_plugin_builder import validators
PLUGIN_V5_DATA = {
"description": "Please describe your plugin here",
"releases": [
{
"is_release": True,
"operating_system": "ubuntu",
"description": "Example Release Description",
"roles": {
"test-plugin_role": {
"has_primary": False,
"public_ip_required": False,
"description": "Write description for your role",
"weight": 1000,
"name": "Set here the name for the role. This name will "
"be displayed in the Fuel web UI"
}
},
"network_roles": [
{
"id": "example_net_role",
"properties": {
"subnet": True,
"vip": [
{
"namespace": "haproxy",
"name": "vip_name"
}
],
"gateway": False
},
"default_mapping": "public"
}
],
"name": "ExampleRelease",
"repository_path": "repositories/ubuntu",
"vmware_attributes": None,
"graphs": [
{
"tasks": [
{
"parameters": {
"retries": 10,
"puppet_modules": ".",
"puppet_manifest": "provision.pp",
"timeout": 3600
},
"version": "2.0.0",
"type": "puppet",
"id": "provision",
"roles": "*"
}
],
"name": "provisioning",
"type": "provisioning"
},
{
"tasks": [
{
"role": [
"test-plugin_role"
],
"type": "group",
"id": "test-plugin_role",
"parameters": {
"strategy": {
"type": "parallel"
}
}
},
{
"parameters": {
"puppet_modules": ".",
"puppet_manifest": "deploy.pp",
"timeout": 3600
},
"cross-depended-by": [
{
"name": "deploy_end"
}
],
"version": "2.0.0",
"role": [
"test-plugin_role"
],
"cross-depends": [
{
"name": "deploy_start"
}
],
"type": "puppet",
"id": "test-plugin-deployment-puppet"
}
],
"name": "default deployment graph",
"type": "deployment"
},
{
"tasks": [
{
"parameters": {
"retries": 10,
"puppet_modules": ".",
"puppet_manifest": "delete.pp",
"timeout": 3600
},
"version": "2.1.0",
"type": "puppet",
"id": "delete",
"roles": [
"deleted"
]
}
],
"name": "deletion",
"type": "deletion"
},
{
"tasks": [
{
"parameters": {
"retries": 10,
"puppet_modules": ".",
"puppet_manifest": "delete.pp",
"timeout": 3600
},
"roles": [
"*",
"master"
],
"version": "2.1.0",
"required_for": [
"deploy_start"
],
"type": "puppet",
"id": "verify_networks"
}
],
"name": "network_verification",
"type": "network_verification"
},
{
"tasks": [
{
"role": [
"test-plugin_role"
],
"type": "group",
"id": "test-plugin_role",
"parameters": {
"strategy": {
"type": "parallel"
}
}
},
{
"parameters": {
"puppet_modules": ".",
"puppet_manifest": "deploy.pp",
"timeout": 3600
},
"cross-depended-by": [
{
"name": "deploy_end"
}
],
"version": "2.0.0",
"role": [
"test-plugin_role"
],
"cross-depends": [
{
"name": "deploy_start"
}
],
"type": "puppet",
"id": "test-plugin-deployment-puppet"
}
],
"name": "deployment-graph-name",
"type": "default"
}
],
"version": "1.0.0",
"deployment_scripts_path": "deployment_scripts/",
"components": [
{
"description": "Component description (optional)",
"incompatible": [],
"label": "Plugin label, that will be shown on UI",
"compatible": [],
"requires": [],
"name": "additional_service:test-plugin"
}
],
"attributes": None,
"volumes": {
"volumes_roles_mapping": {
"test-plugin_role": [
{
"id": "os",
"allocate_size": "min"
}
]
},
"volumes": []
},
"networks": None,
"deployment_tasks": [
{
"role": [
"test-plugin_role"
],
"type": "group",
"id": "test-plugin_role",
"parameters": {
"strategy": {
"type": "parallel"
}
}
},
{
"parameters": {
"puppet_modules": ".",
"puppet_manifest": "deploy.pp",
"timeout": 3600
},
"cross-depended-by": [
{
"name": "deploy_end"
}
],
"version": "2.0.0",
"role": [
"test-plugin_role"
],
"cross-depends": [
{
"name": "deploy_start"
}
],
"type": "puppet",
"id": "test-plugin-deployment-puppet"
}
],
'bond_attributes': None,
'nic_attributes': None,
'node_attributes': None
},
{
'operating_system': 'ubuntu',
'repository_path': 'repositories/ubuntu',
'version': 'mitaka-9.0',
'mode': ['ha'],
'deployment_scripts_path': 'deployment_scripts/'
},
{
'operating_system': 'ubuntu',
'repository_path': 'repositories/ubuntu',
'version': 'newton-10.0',
'mode': ['ha'],
'deployment_scripts_path': 'deployment_scripts/'
}
],
"title": "Title for fuel_plugin_example_v5 plugin",
"package_version": "5.0.0",
'nic_attributes_metadata': {
'attribute_b': {
'type': 'checkbox',
'description': 'Some description',
'value': False,
'label': 'NIC attribute B'
},
'attribute_a': {
'type': 'text',
'description': 'Some description',
'value': '',
'label': 'NIC attribute A'
}
},
'node_attributes_metadata': {
'plugin_section_a': {
'metadata': {
'group': 'some_new_section',
'label': 'Section A'
},
'attribute_b': {
'type': 'checkbox',
'description': 'Some description',
'value': '',
'label': 'Node attribute B for section A'
},
'attribute_a': {
'type': 'text',
'description': 'Some description',
'value': '',
'label': 'Node attribute A for section A'
}
}
},
'bond_attributes_metadata': {
'attribute_b': {
'type': 'checkbox',
'description': 'Some description',
'value': False,
'label': 'Bond attribute B'
},
'attribute_a': {
'type': 'text',
'description': 'Some description',
'value': '',
'label': 'Bond attribute A'
}
},
"volumes_metadata": {
"volumes_roles_mapping": {
"test-plugin": [
{
"id": "os",
"allocate_size": "min"
}
]
},
"volumes": []
},
"attributes_metadata": {
"attributes": {
"test-plugin_text": {
"weight": 25,
"type": "text",
"description": "Description for text field",
"value": "Set default value",
"label": "Text field"
}
}
},
"is_hotpluggable": False,
"version": "1.0.0",
"fuel_version": [
"9.1", "10.0"
],
"groups": [],
"authors": [
"Specify author or company name"
],
"licenses": [
"Apache License Version 2.0"
],
"roles_metadata": {
"test-plugin": {
"has_primary": False,
"public_ip_required": False,
"description": "Write description for your role",
"weight": 1000,
"name": "Set here the name for the role. This name will be "
"displayed in the Fuel web UI"
}
},
"homepage": "https://github.com/openstack/fuel-plugins",
"network_roles_metadata": [
{
"id": "example_net_role",
"properties": {
"subnet": True,
"vip": [
{
"namespace": "haproxy",
"name": "vip_name"
}
],
"gateway": False
},
"default_mapping": "public"
}
],
"deployment_tasks": [
{
"role": [
"test-plugin"
],
"type": "group",
"id": "test-plugin",
"parameters": {
"strategy": {
"type": "parallel"
}
}
},
{
"parameters": {
"puppet_modules": ".",
"puppet_manifest": "deploy.pp",
"timeout": 3600
},
"requires": [
"deploy_start"
],
"groups": [
"test-plugin"
],
"required_for": [
"deploy_end"
],
"type": "puppet",
"id": "test-plugin-deployment-puppet"
}
],
"name": "fuel_plugin_example_v5"
}
class TestLoaderV5(FakeFSTest):
validator_class = validators.ValidatorV5
loader_class = loaders.PluginLoaderV5
package_version = '5.0.0'
def test_loaded_ok(self):
self.assertIn(u'Success!', self.data_tree.report.render())
self.assertFalse(self.data_tree.report.is_failed())
self.assertEqual(PLUGIN_V5_DATA, self.data_tree)
def test_loader_fail_on_missing_graph_file(self):
self.fs.RemoveObject(
self._make_fakefs_path('graphs')
)
data = self.loader.load(self.plugin_path)
self.assertIn(u"graphs/deployment_tasks.yaml", data.report.render())
self.assertIn(u"Can't find file.", data.report.render())
self.assertTrue(data.report.is_failed())
self.assertEqual(
{
'type': 'provisioning',
'name': 'provisioning',
'tasks_path': 'graphs/provisioning.yaml'
},
data['releases'][0]['graphs'][0]
)
self.assertEqual(
'graphs/provisioning.yaml',
data['releases'][0]['graphs'][0].get('tasks_path')
)
self.assertEqual(
'graphs/deployment_tasks.yaml',
data['releases'][0]['graphs'][1].get('tasks_path')
)
def test_loader_fail_on_missing_attributes_file(self):
self.fs.RemoveObject(
self._make_fakefs_path('attributes/attributes.yaml')
)
data = self.loader.load(self.plugin_path)
self.assertIn(u"attributes/attributes.yaml", data.report.render())
self.assertIn(u"Can't find file.", data.report.render())
self.assertTrue(data.report.is_failed())
self.assertEqual(
None,
data['releases'][0].get('attributes')
)
self.assertEqual(
'attributes/attributes.yaml',
data['releases'][0].get('attributes_path')
)
def test_fail_on_bad_release_path(self):
self.fs.RemoveObject(
self._make_fakefs_path('repositories/ubuntu')
)
self.fs.RemoveObject(
self._make_fakefs_path('deployment_scripts/')
)
data = self.loader.load(self.plugin_path)
self.assertTrue(data.report.is_failed())
self.assertTrue(
re.search(
r'repositories\/ubuntu is invalid directory',
data.report.render()))
self.assertTrue(
re.search(
r'deployment_scripts\/ is invalid directory',
data.report.render()))
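These loader tests pin down the *_path contract: a key such as attributes_path, or a graph's tasks_path, is resolved relative to the plugin root and replaced by the loaded file content; when the file is missing, the original *_path key is left in place so the failure report can point at it. A sketch of that convention under those assumptions (function name is illustrative):

import os

import yaml

def resolve_path_directive(plugin_path, data, key):
    path_key = key + '_path'                      # e.g. 'attributes_path'
    src = os.path.join(plugin_path, data.get(path_key, ''))
    if os.path.isfile(src):
        with open(src) as f:
            data[key] = yaml.safe_load(f)         # inline the file content
    # On a missing file the data keeps only the *_path key and the
    # loader marks its report as failed.
    return data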

View File

@ -29,31 +29,35 @@ from fuel_plugin_builder import utils
class TestUtils(BaseTestCase):
@mock.patch('fuel_plugin_builder.utils.os.path.isfile', return_value=True)
@mock.patch('fuel_plugin_builder.utils.os.access', return_value=True)
@mock.patch('fuel_plugin_builder.utils.fs.os.path.isfile',
return_value=True)
@mock.patch('fuel_plugin_builder.utils.fs.os.access',
return_value=True)
def test_is_executable_returns_true(self, access_mock, isfile_mock):
file_name = 'file_name'
self.assertTrue(utils.is_executable(file_name))
isfile_mock.assert_called_once_with(file_name)
access_mock.assert_called_once_with(file_name, os.X_OK)
@mock.patch('fuel_plugin_builder.utils.os.path.isfile', return_value=True)
@mock.patch('fuel_plugin_builder.utils.os.access', return_value=False)
@mock.patch('fuel_plugin_builder.utils.fs.os.path.isfile',
return_value=True)
@mock.patch('fuel_plugin_builder.utils.fs.os.access',
return_value=False)
def test_is_executable_returns_false(self, access_mock, isfile_mock):
file_name = 'file_name'
self.assertFalse(utils.is_executable(file_name))
isfile_mock.assert_called_once_with(file_name)
access_mock.assert_called_once_with(file_name, os.X_OK)
@mock.patch('fuel_plugin_builder.utils.os')
@mock.patch('fuel_plugin_builder.utils.is_executable', return_value=True)
@mock.patch('fuel_plugin_builder.utils.fs.os')
@mock.patch('fuel_plugin_builder.utils.fs.is_executable',
return_value=True)
def test_which_returns_for_absolute_path_exec(self, _, os_mock):
path = '/usr/bin/some_exec'
os_mock.path.split.return_value = ('/usr/bin/', 'some_exec')
self.assertEqual(utils.which(path), path)
@mock.patch('fuel_plugin_builder.utils.is_executable',
@mock.patch('fuel_plugin_builder.utils.fs.is_executable',
side_effect=[False, True])
def test_which_returns_if_exec_in_env_path(self, _):
# some_exec is in the /bin directory
@ -61,7 +65,8 @@ class TestUtils(BaseTestCase):
with patch.dict('os.environ', {'PATH': '/usr/bin:/bin'}):
self.assertEqual(utils.which(path), '/bin/some_exec')
@mock.patch('fuel_plugin_builder.utils.is_executable', return_value=False)
@mock.patch('fuel_plugin_builder.utils.is_executable',
return_value=False)
def test_which_returns_none(self, _):
with patch.dict('os.environ', {'PATH': '/usr/bin:/bin'}):
self.assertIsNone(utils.which('some_exec'))
@ -110,7 +115,7 @@ class TestUtils(BaseTestCase):
utils.exec_piped_cmds(['some command', 'some other command'])
process_mock.communicate.assert_called_with(input='stdout')
@mock.patch('fuel_plugin_builder.utils.os')
@mock.patch('fuel_plugin_builder.utils.fs.os')
def test_create_dir(self, os_mock):
path = '/dir/path'
os_mock.path.isdir.return_value = False
@ -118,7 +123,7 @@ class TestUtils(BaseTestCase):
os_mock.path.isdir.assert_called_once_with(path)
os_mock.makedirs.assert_called_once_with(path)
@mock.patch('fuel_plugin_builder.utils.os')
@mock.patch('fuel_plugin_builder.utils.fs.os')
def test_create_dir_dont_create_if_created(self, os_mock):
path = '/dir/path'
os_mock.path.isdir.return_value = True
@ -126,33 +131,34 @@ class TestUtils(BaseTestCase):
os_mock.path.isdir.assert_called_once_with(path)
self.method_was_not_called(os_mock.makedirs)
@mock.patch('fuel_plugin_builder.utils.os.path.lexists', return_value=True)
@mock.patch('fuel_plugin_builder.utils.fs.os.path.lexists',
return_value=True)
def test_exists(self, os_exists):
file_path = '/dir/path'
self.assertTrue(utils.exists(file_path))
self.assertTrue(utils.fs.is_exists(file_path))
os_exists.assert_called_once_with(file_path)
@mock.patch('fuel_plugin_builder.utils.os.path.lexists',
@mock.patch('fuel_plugin_builder.utils.fs.os.path.lexists',
return_value=False)
def test_exists_returns_false(self, os_exists):
file_path = '/dir/path'
self.assertFalse(utils.exists(file_path))
self.assertFalse(utils.fs.is_exists(file_path))
os_exists.assert_called_once_with(file_path)
@mock.patch('fuel_plugin_builder.utils.os.path.basename')
@mock.patch('fuel_plugin_builder.utils.fs.os.path.basename')
def test_basename(self, base_mock):
path = 'some_path'
base_mock.return_value = path
self.assertEqual(utils.basename(path), path)
base_mock.assert_called_once_with(path)
@mock.patch('fuel_plugin_builder.utils.shutil')
@mock.patch('fuel_plugin_builder.utils.fs.shutil')
def test_copy_file_permissions(self, shutil_mock):
utils.copy_file_permissions('src', 'dst')
shutil_mock.copymode.assert_called_once_with('src', 'dst')
@mock.patch('fuel_plugin_builder.utils.shutil')
@mock.patch('fuel_plugin_builder.utils.os')
@mock.patch('fuel_plugin_builder.utils.fs.shutil')
@mock.patch('fuel_plugin_builder.utils.fs.os')
def test_remove_file(self, os_mock, shutil_mock):
path = 'file_for_removing'
os_mock.path.isdir.return_value = False
@ -160,8 +166,8 @@ class TestUtils(BaseTestCase):
os_mock.remove.assert_called_once_with(path)
self.method_was_not_called(shutil_mock.rmtree)
@mock.patch('fuel_plugin_builder.utils.shutil')
@mock.patch('fuel_plugin_builder.utils.os')
@mock.patch('fuel_plugin_builder.utils.fs.shutil')
@mock.patch('fuel_plugin_builder.utils.fs.os')
def test_remove_dir(self, os_mock, shutil_mock):
path = 'dir_for_removing'
os_mock.path.isdir.return_value = True
@ -170,33 +176,33 @@ class TestUtils(BaseTestCase):
shutil_mock.rmtree.assert_called_once_with(path)
self.method_was_not_called(os_mock.remove)
@mock.patch('fuel_plugin_builder.utils.dir_util')
@mock.patch('fuel_plugin_builder.utils.shutil')
@mock.patch('fuel_plugin_builder.utils.os')
def test_copy_file(self, os_mock, shutil_mock, dir_util_mock):
@mock.patch('fuel_plugin_builder.utils.fs.dir_util')
@mock.patch('fuel_plugin_builder.utils.fs.shutil')
@mock.patch('fuel_plugin_builder.utils.fs.os')
def test_copy_file(self, os_mock, shutil_mock, dir_util_m):
src = '/tmp/source_file'
dst = '/tmp/destination_file'
os_mock.path.isdir.return_value = False
utils.copy(src, dst)
shutil_mock.copy.assert_called_once_with(src, dst)
self.method_was_not_called(dir_util_mock.copy_tree)
self.method_was_not_called(dir_util_m.copy_tree)
@mock.patch('fuel_plugin_builder.utils.dir_util')
@mock.patch('fuel_plugin_builder.utils.shutil')
@mock.patch('fuel_plugin_builder.utils.os')
def test_copy_dir(self, os_mock, shutil_mock, dir_util_mock):
@mock.patch('fuel_plugin_builder.utils.fs.dir_util')
@mock.patch('fuel_plugin_builder.utils.fs.shutil')
@mock.patch('fuel_plugin_builder.utils.fs.os')
def test_copy_dir(self, is_dir_mock, shutil_mock, dir_util_m):
src = '/tmp/source_file'
dst = '/tmp/destination_file'
os_mock.path.isdir.return_value = True
utils.copy(src, dst)
dir_util_mock.copy_tree.assert_called_once_with(
is_dir_mock.return_value = True
utils.fs.copy(src, dst)
dir_util_m.copy_tree.assert_called_once_with(
src,
dst,
preserve_symlinks=True)
self.method_was_not_called(shutil_mock.copy)
@mock.patch('fuel_plugin_builder.utils.copy')
@mock.patch('fuel_plugin_builder.utils.glob',
@mock.patch('fuel_plugin_builder.utils.fs.copy')
@mock.patch('fuel_plugin_builder.utils.fs.glob.glob',
return_value=['file1', 'file2'])
def test_copy_files_in_dir(self, glob_mock, copy_mock):
mask = 'file*'
@ -208,7 +214,7 @@ class TestUtils(BaseTestCase):
[mock.call('file1', '/tmp/file1'),
mock.call('file2', '/tmp/file2')])
@mock.patch('fuel_plugin_builder.utils.tarfile')
@mock.patch('fuel_plugin_builder.utils.fs.tarfile')
def test_make_tar_gz(self, tarfile_mock):
src = 'dir'
dst = '/tmp/file.fp'
@ -220,8 +226,8 @@ class TestUtils(BaseTestCase):
tar_mock.add.assert_called_once_with(src, arcname=prefix)
tar_mock.close.assert_called_once_with()
@mock.patch('fuel_plugin_builder.utils.shutil.move')
@mock.patch('fuel_plugin_builder.utils.glob',
@mock.patch('fuel_plugin_builder.utils.fs.shutil.move')
@mock.patch('fuel_plugin_builder.utils.fs.glob.glob',
return_value=['file1', 'file2'])
def test_move_files_in_dir(self, glob_mock, move_mock):
mask = 'file*'
@ -233,19 +239,9 @@ class TestUtils(BaseTestCase):
[mock.call('file1', '/tmp/file1'),
mock.call('file2', '/tmp/file2')])
@mock.patch('__builtin__.open')
@mock.patch('fuel_plugin_builder.utils.yaml')
def test_parse_yaml(self, yaml_mock, open_mock):
path = '/tmp/path'
file_mock = mock.MagicMock()
open_mock.return_value = file_mock
utils.parse_yaml(path)
open_mock.assert_called_once_with(path)
yaml_mock.load.assert_called_once_with(file_mock)
def test_render_to_file_unicode_handling(self):
expected = u'тест'
params = {'vendors': expected}
context = {'vendors': expected}
template_content = "${vendors}"
temp_dir = tempfile.mkdtemp()
@ -257,16 +253,18 @@ class TestUtils(BaseTestCase):
with open(src_file, 'w') as f:
f.write(template_content)
utils.render_to_file(src=src_file, dst=dst_file, params=params)
utils.template.load_template_and_render_to_file(
src=src_file, dst=dst_file, context=context)
with open(dst_file, 'rb') as f:
actual = f.read()
self.assertEqual(expected, actual.decode('utf-8'))
@mock.patch('fuel_plugin_builder.utils.copy_file_permissions')
@mock.patch('fuel_plugin_builder.utils.render_to_file')
@mock.patch('fuel_plugin_builder.utils.remove')
@mock.patch('fuel_plugin_builder.utils.os.walk')
@mock.patch('fuel_plugin_builder.utils.template.copy_file_permissions')
@mock.patch(
'fuel_plugin_builder.utils.template.load_template_and_render_to_file')
@mock.patch('fuel_plugin_builder.utils.template.remove')
@mock.patch('fuel_plugin_builder.utils.fs.os.walk')
def test_render_files_in_dir(
self, walk_mock, remove_mock, render_mock, copy_permissions_mock):
dir_path = '/tmp/some_plugin'
@ -298,19 +296,9 @@ class TestUtils(BaseTestCase):
'/tmp/some_plugin/file4')],
copy_permissions_mock.call_args_list)
def test_calculate_sha(self):
file_path = '/tmp/file'
with mock.patch('__builtin__.open',
self.mock_open('fake file content')):
self.assertEqual(
utils.calculate_sha(file_path),
'5083c27641e7e4ae287d690cb3fafb4dd6e8f6ab')
@mock.patch('fuel_plugin_builder.utils.calculate_sha')
@mock.patch('fuel_plugin_builder.utils.os.walk')
def test_calculate_checksums(self, walk_mock, sha_mock):
@mock.patch('fuel_plugin_builder.utils.checksum.calculate_file_sha')
@mock.patch('fuel_plugin_builder.utils.fs.os.walk')
def test_calculate_file_checksums(self, walk_mock, sha_mock):
dir_path = '/tmp/dir_path'
walk_mock.return_value = [
[dir_path, '', ['file1.txt', 'file2.txt']],
@ -319,7 +307,7 @@ class TestUtils(BaseTestCase):
sha_mock.side_effect = ['sha_1', 'sha_2', 'sha_3']
self.assertEqual(
utils.calculate_checksums(dir_path),
utils.checksum.calculate_file_checksums(dir_path),
[{'file_path': 'file1.txt', 'checksum': 'sha_1'},
{'file_path': 'file2.txt', 'checksum': 'sha_2'},
{'file_path': 'file3.txt', 'checksum': 'sha_3'}])
@ -330,7 +318,7 @@ class TestUtils(BaseTestCase):
mock.call('/tmp/dir_path/file3.txt')],
sha_mock.call_args_list)
@mock.patch('fuel_plugin_builder.utils.calculate_checksums')
@mock.patch('fuel_plugin_builder.utils.checksum.calculate_file_checksums')
def test_create_checksums_file(self, calculate_mock):
calculate_mock.return_value = [
{'checksum': 'checksum2', 'file_path': 'file2.txt'},
@ -346,8 +334,8 @@ class TestUtils(BaseTestCase):
fileobj.getvalue(),
'checksum file1.txt\nchecksum2 file2.txt\n')
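The asserted file format is one '<sha> <relative path>' record per line, ordered by path. A minimal writer matching that output:

def write_checksums_file(checksums, fileobj):
    # Sort by path so the checksums file is deterministic.
    for record in sorted(checksums, key=lambda r: r['file_path']):
        fileobj.write('{0} {1}\n'.format(
            record['checksum'], record['file_path']))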
@mock.patch('fuel_plugin_builder.utils.remove')
@mock.patch('fuel_plugin_builder.utils.glob',
@mock.patch('fuel_plugin_builder.utils.fs.remove')
@mock.patch('fuel_plugin_builder.utils.fs.glob.glob',
return_value=['file1', 'file2'])
def test_remove_by_mask(self, glob_mock, remove_mock):
mask = '/tmp/test/*.yaml'
@ -356,19 +344,3 @@ class TestUtils(BaseTestCase):
self.assertEqual(
remove_mock.call_args_list,
[mock.call('file1'), mock.call('file2')])
@mock.patch('fuel_plugin_builder.utils.exists',
return_value=True)
def test_read_if_exist(self, utils_exists):
file_path = '/tmp/file'
with mock.patch('__builtin__.open', self.mock_open("foo")):
self.assertEqual(utils.read_if_exist(file_path), "foo")
utils_exists.assert_called_once_with(file_path)
@mock.patch('fuel_plugin_builder.utils.exists',
return_value=False)
def test_read_if_exist_returns_empty(self, utils_exists):
file_path = '/tmp/file'
with mock.patch('__builtin__.open', self.mock_open("foo")):
self.assertEqual(utils.read_if_exist(file_path), "")
utils_exists.assert_called_once_with(file_path)

View File

@ -1,39 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright 2014 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import mock
from fuel_plugin_builder.tests.base import BaseTestCase
from fuel_plugin_builder.validators import ValidatorManager
class TestValidatorManager(BaseTestCase):
def setUp(self):
self.plugin_path = '/tmp/plugin_path'
def test_get_validator(self):
validator = mock.MagicMock(return_value='test')
with mock.patch(
'fuel_plugin_builder.validators.manager.'
'version_mapping.get_version_mapping_from_plugin',
return_value={'validator': validator}):
self.assertEqual(
ValidatorManager(self.plugin_path).get_validator(),
'test')
validator.assert_called_once_with(self.plugin_path)

View File

@ -14,69 +14,109 @@
# License for the specific language governing permissions and limitations
# under the License.
import mock
from fuel_plugin_builder import errors
from fuel_plugin_builder.tests.base import LegacyBaseValidatorTestCase
from fuel_plugin_builder.validators.schemas.v1 import SchemaV1
from fuel_plugin_builder.validators.validator_v1 import ValidatorV1
from fuel_plugin_builder import loaders
from fuel_plugin_builder.tests.base import FakeFSTest
from fuel_plugin_builder import validators
class TestValidatorV1(LegacyBaseValidatorTestCase):
class TestValidatorV1(FakeFSTest):
validator_class = validators.ValidatorV1
loader_class = loaders.PluginLoaderV1
package_version = '1.0.0'
__test__ = True
validator_class = ValidatorV1
schema_class = SchemaV1
@mock.patch('fuel_plugin_builder.validators.validator_v1.utils')
def test_check_tasks(self, utils_mock):
mocked_methods = [
'validate_schema'
]
self.mock_methods(self.validator, mocked_methods)
utils_mock.parse_yaml.return_value = [
{'type': 'puppet', 'parameters': 'param1'},
{'type': 'shell', 'parameters': 'param2'}]
def test_check_schemas(self):
report = self.validator.validate(self.data_tree)
self.assertIn('metadata', report.render())
self.assertIn('tasks', report.render())
self.assertIn('attributes', report.render())
self.validator.check_tasks()
def test_check_env_config_attrs_checks_metadata(self):
self.data_tree['environment_config'] = {
'attributes': {'metadata': []}
}
report = self.validator.validate(self.data_tree)
self.assertTrue(report.is_failed())
self.assertIn("[] is not of type 'object'", report.render())
self.assertEqual(
[mock.call('param1', self.schema_class().puppet_parameters,
self.validator.tasks_path,
value_path=[0, 'parameters']),
mock.call('param2', self.schema_class().shell_parameters,
self.validator.tasks_path,
value_path=[1, 'parameters'])],
self.validator.validate_schema.call_args_list)
def test_check_env_config_attrs_do_not_fail_if_empty(self):
self.data_tree['environment_config'] = {}
report = self.validator.validate(self.data_tree)
self.assertFalse(report.is_failed())
@mock.patch('fuel_plugin_builder.validators.validator_v1.utils')
def test_check_tasks_no_parameters_not_failed(self, utils_mock):
mocked_methods = [
'validate_schema'
]
self.mock_methods(self.validator, mocked_methods)
utils_mock.parse_yaml.return_value = [
{'type': 'puppet'},
]
def test_check_env_config_attrs_fail_if_none(self):
self.data_tree['environment_config'] = {
'attributes': None
}
report = self.validator.validate(self.data_tree)
self.assertTrue(report.is_failed())
self.assertIn("None is not of type 'object'", report.render())
self.validator.check_tasks()
def test_check_env_config_attrs_checks_attrs(self):
self.data_tree['environment_config'] = {
'attributes': {
'key1': {
'type': True,
'label': 'text',
'value': 'text',
'weight': 1}}}
self.assertEqual(
[mock.call(None, self.schema_class().puppet_parameters,
self.validator.tasks_path,
value_path=[0, 'parameters'])],
self.validator.validate_schema.call_args_list)
report = self.validator.validate(self.data_tree)
self.assertTrue(report.is_failed())
self.assertIn("True is not of type 'string'", report.render())
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_compatibility(self, utils_mock):
utils_mock.parse_yaml.return_value = {
'fuel_version': ['5.1', '6.0', '6.1'],
'package_version': '1.0.0'}
def test_check_env_config_attrs_generator_value(self):
self.data_tree['environment_config'] = {
'attributes': {
'key1': {
'type': 'hidden',
'label': '',
'value': {'generator': 'password'},
'weight': 1}}}
report = self.validator.validate(self.data_tree)
self.assertTrue(report.is_failed())
self.assertIn("{'generator': 'password'} is not "
"of type 'string', 'boolean'", report.render())
with self.assertRaisesRegexp(
errors.ValidationError,
'Current plugin format 1.0.0 is not compatible with 5.1 Fuel'
' release. Fuel version must be 6.0 or higher.'
' Please remove 5.1 version from metadata.yaml file or'
' downgrade package_version.'):
self.validator.check_compatibility()
def test_check_env_config_attrs_restriction_fails(self):
self.data_tree['environment_config'] = {
'attributes': {
'key1': {
'type': 'text',
'label': 'test',
'value': 'test',
'weight': 1,
'restrictions': [
{
'condition': 'false',
'action': 'disable'
},
{
'condition': True,
'action': 'hide'
}
]
}
}
}
report = self.validator.validate(self.data_tree)
self.assertTrue(report.is_failed())
self.assertIn("True is not of type 'string'", report.render())
def test_check_validate(self):
self.mock_methods(self.validator, ['validate'])
self.validator.validate(self.data_tree)
self.validator.validate.assert_called_once_with(self.data_tree)
def test_check_tasks(self):
report = self.validator.validate(self.data_tree)
self.assertFalse(report.is_failed())
def test_check_tasks_with_no_parameters_failed(self):
self.data_tree['tasks'] = [{'type': 'puppet'}]
report = self.validator.validate(self.data_tree)
self.assertTrue(report.is_failed())
self.assertIn("'parameters' is a required property", report.render())
self.assertIn("'stage' is a required property", report.render())
self.assertIn("'role' is a required property", report.render())

View File

@ -14,73 +14,84 @@
# License for the specific language governing permissions and limitations
# under the License.
import mock
from fuel_plugin_builder import errors
from fuel_plugin_builder.tests.base import LegacyBaseValidatorTestCase
from fuel_plugin_builder.validators.schemas.v2 import SchemaV2
from fuel_plugin_builder.validators.validator_v2 import ValidatorV2
from fuel_plugin_builder import loaders
from fuel_plugin_builder.tests.base import FakeFSTest
from fuel_plugin_builder import validators
class TestValidatorV2(LegacyBaseValidatorTestCase):
class TestValidatorV2(FakeFSTest):
__test__ = True
validator_class = ValidatorV2
schema_class = SchemaV2
validator_class = validators.ValidatorV2
loader_class = loaders.PluginLoaderV1
package_version = '2.0.0'
@mock.patch('fuel_plugin_builder.validators.validator_v2.utils')
def test_check_tasks(self, utils_mock):
def test_check_compatibility_failed(self):
self.data_tree['fuel_version'] = ['6.0', '6.1']
self.data_tree['package_version'] = '2.0.0'
err_msg = 'Current plugin format 2.0.0 is not compatible with 6.0 ' \
'Fuel release. Fuel version must be 6.1 or higher. ' \
'Please remove 6.0 version from metadata.yaml file or ' \
'downgrade package_version.'
report = self.validator.validate(self.data_tree)
self.assertTrue(report.is_failed())
self.assertIn(err_msg, report.render())
def test_check_tasks_not_failed(self):
self.data_tree['tasks'] = [
{
'type': 'puppet',
'role': '*',
'stage': 'pre_deployment',
'parameters': {
'timeout': 42,
'puppet_manifest': '/my/manifest',
'puppet_modules': '/my/modules'
}
},
{
'type': 'shell',
'role': '*',
'stage': 'pre_deployment',
'parameters': {
'cmd': 'echo all > /tmp/plugin.all',
'timeout': 42
}
},
{
'type': 'reboot',
'role': '*',
'stage': 'pre_deployment',
'parameters': {'timeout': 42}
},
# {
# 'type': 'reboot',
# 'role': '*',
# 'stage': 'pre_deployment/+100.1',
# 'parameters': {'timeout': 42}
# },
# {
# 'type': 'reboot',
# 'role': '*',
# 'stage': 'pre_deployment/-100',
# 'parameters': {'timeout': 42}
# }
]
report = self.validator.validate(self.data_tree)
self.assertFalse(report.is_failed())
def test_check_tasks_empty_parameters_not_failed(self):
mocked_methods = [
'validate_schema'
]
self.mock_methods(self.validator, mocked_methods)
utils_mock.parse_yaml.return_value = [
{'type': 'puppet', 'parameters': 'param1'},
{'type': 'shell', 'parameters': 'param2'},
{'type': 'reboot', 'parameters': 'param3'}]
self.validator.check_tasks()
self.assertEqual(
[mock.call('param1', self.schema_class().puppet_parameters,
self.validator.tasks_path,
value_path=[0, 'parameters']),
mock.call('param2', self.schema_class().shell_parameters,
self.validator.tasks_path,
value_path=[1, 'parameters']),
mock.call('param3', self.schema_class().reboot_parameters,
self.validator.tasks_path,
value_path=[2, 'parameters'])],
self.validator.validate_schema.call_args_list)
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_compatibility(self, utils_mock):
utils_mock.parse_yaml.return_value = {
'fuel_version': ['6.0', '6.1'],
'package_version': '2.0.0'}
with self.assertRaisesRegexp(
errors.ValidationError,
'Current plugin format 2.0.0 is not compatible with 6.0 Fuel'
' release. Fuel version must be 6.1 or higher.'
' Please remove 6.0 version from metadata.yaml file or'
' downgrade package_version.'):
self.validator.check_compatibility()
@mock.patch('fuel_plugin_builder.validators.validator_v2.utils')
def test_check_tasks_no_parameters_not_failed(self, utils_mock):
mocked_methods = [
'validate_schema'
self.data_tree['tasks'] = [
{
'type': 'reboot',
'role': '*',
'stage': 'pre_deployment'
}
]
self.mock_methods(self.validator, mocked_methods)
utils_mock.parse_yaml.return_value = [
{'type': 'puppet'},
]
self.validator.check_tasks()
self.assertEqual(
[mock.call(None, self.schema_class().puppet_parameters,
self.validator.tasks_path,
value_path=[0, 'parameters'])],
self.validator.validate_schema.call_args_list)
report = self.validator.validate(self.data_tree)
self.assertFalse(report.is_failed())

View File

@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
# Copyright 2015 Mirantis, Inc.
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@ -14,624 +14,13 @@
# License for the specific language governing permissions and limitations
# under the License.
import mock
from fuel_plugin_builder import errors
from fuel_plugin_builder.tests.base import LegacyBaseValidatorTestCase
from fuel_plugin_builder.validators.schemas import SchemaV3
from fuel_plugin_builder.validators.validator_v3 import ValidatorV3
from fuel_plugin_builder import loaders
from fuel_plugin_builder.tests.base import FakeFSTest
from fuel_plugin_builder import validators
class TestValidatorV3(LegacyBaseValidatorTestCase):
class TestValidatorV3(FakeFSTest):
__test__ = True
validator_class = ValidatorV3
schema_class = SchemaV3
def test_validate(self):
mocked_methods = [
'check_schemas',
'check_tasks',
'check_releases_paths',
'check_compatibility',
'check_deployment_tasks'
]
self.check_validate(mocked_methods)
@mock.patch('fuel_plugin_builder.validators.validator_v3.utils')
def test_check_tasks_schema_validation_failed(self, utils_mock, *args):
data_sets = [
{
'type': 'shell',
'parameters': {
'timeout': 3
},
'stage': 'post_deployment',
'role': '*'
},
{
'type': 'puppet',
'parameters': {
'timeout': 3
},
'stage': 'post_deployment',
'role': '*'
},
{
'type': 'puppet',
'parameters': {
'timeout': 3,
'cmd': 'xx'
},
'stage': 'post_deployment',
'role': '*'
},
{
'type': 'shell',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': 'yy',
},
'stage': 'post_deployment',
'role': '*'
},
{
'type': 'puppet',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': 'yy',
'retries': 'asd',
},
'stage': 'post_deployment',
'role': '*'
},
{
'type': 'puppet',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': '',
'retries': 1,
},
'stage': 'pre_deployment',
'role': '*'
},
{
'type': 'puppet',
'parameters': {
'timeout': 3,
'puppet_manifest': '',
'puppet_modules': 'yy',
'retries': 1,
},
'stage': 'pre_deployment',
'role': '*'
}
]
for data in data_sets:
utils_mock.parse_yaml.return_value = [data]
self.assertRaises(errors.ValidationError,
self.validator.check_deployment_tasks)
@mock.patch('fuel_plugin_builder.validators.validator_v3.utils')
def test_check_tasks_schema_validation_passed(self, utils_mock, *args):
data_sets = [
[
{
'type': 'shell',
'parameters': {
'timeout': 3,
'cmd': 'xx'
},
'stage': 'post_deployment',
'role': '*'
},
],
[
{
'type': 'shell',
'parameters': {
'timeout': 3,
'cmd': 'xx'
},
'stage': 'post_deployment',
'role': '*'
},
{
'type': 'puppet',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': 'xxx'
},
'stage': 'post_deployment',
'role': '*'
},
],
[
{
'type': 'shell',
'parameters': {
'timeout': 3,
'cmd': 'reboot'
},
'stage': 'post_deployment',
'role': '*'
},
{
'type': 'shell',
'parameters': {
'timeout': 3,
'cmd': 'xx'
},
'stage': 'post_deployment',
'role': '*'
},
{
'type': 'puppet',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': 'xxx'
},
'stage': 'post_deployment',
'role': '*'
}
],
[
{
'type': 'shell',
'parameters': {
'timeout': 3,
'cmd': 'reboot'
},
'stage': 'post_deployment',
'role': '*'
},
{
'type': 'shell',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': 'yy',
'cmd': 'reboot'
},
'stage': 'post_deployment',
'role': '*'
},
{
'type': 'puppet',
'parameters': {
'timeout': 3,
'retries': 10,
'puppet_manifest': 'xx',
'puppet_modules': 'xxx'
},
'stage': 'post_deployment',
'role': '*'
},
{
'type': 'puppet',
'parameters': {
'timeout': 3,
'retries': 10,
'puppet_manifest': 'xx',
'puppet_modules': 'xxx'
},
'stage': 'post_deployment',
'role': 'master'
},
]
]
for data in data_sets:
utils_mock.parse_yaml.return_value = data
self.validator.check_deployment_tasks()
@mock.patch('fuel_plugin_builder.validators.validator_v3.utils')
@mock.patch('fuel_plugin_builder.validators.base.utils.exists')
def test_check_tasks_no_file(self, exists_mock, utils_mock, *args):
mocked_methods = ['validate_schema']
self.mock_methods(self.validator, mocked_methods)
exists_mock.return_value = False
self.validator.check_deployment_tasks()
self.assertFalse(self.validator.validate_schema.called)
def test_check_schemas(self):
mocked_methods = [
'check_env_config_attrs',
'check_deployment_tasks_schema',
'check_network_roles_schema',
'check_node_roles_schema',
'check_volumes_schema'
]
self.mock_methods(self.validator, mocked_methods)
self.mock_methods(self.validator, ['validate_file_by_schema'])
self.validator.check_schemas()
self.assertEqual(
[mock.call(self.schema_class().metadata_schema,
self.validator.meta_path),
mock.call(self.schema_class().tasks_schema,
self.validator.tasks_path, allow_not_exists=True)],
self.validator.validate_file_by_schema.call_args_list)
for method in mocked_methods:
getattr(self.validator, method).assert_called_once_with()
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_compatibility_failed(self, utils_mock):
fuel_version_checks = (
(['6.0', '6.1', '7.0']),
(['6.1', '7.0']),
)
for fuel_version in fuel_version_checks:
mock_data = {
'fuel_version': fuel_version,
'package_version': '3.0.0'}
err_msg = 'Current plugin format 3.0.0 is not compatible with ' \
'{0} Fuel release. Fuel version must be 7.0 or higher.' \
' Please remove {0} version from metadata.yaml file or' \
' downgrade package_version.'.format(fuel_version[0])
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_compatibility)
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_compatibility_passed(self, utils_mock):
utils_mock.parse_yaml.return_value = {
'fuel_version': ['7.0'],
'package_version': '3.0.0'}
self.validator.check_compatibility()
@mock.patch('fuel_plugin_builder.validators.validator_v3.utils')
def test_role_attribute_is_required_for_deployment_task_types(
self, utils_mock, *args):
deployment_task_types = [
'group', 'shell', 'copy_files', 'sync', 'upload_file']
for task_type in deployment_task_types:
mock_data = [{
'id': 'plugin_name',
'type': task_type}]
err_msg = "File '/tmp/plugin_path/deployment_tasks.yaml', " \
"'role' is a required property, value path '0'"
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_deployment_tasks)
@mock.patch('fuel_plugin_builder.validators.validator_v3.utils')
def test_parameters_attribute_is_required_for_deployment_task_types(
self, utils_mock, *args):
deployment_task_types = ['copy_files', 'sync', 'upload_file']
for task_type in deployment_task_types:
mock_data = [{
'id': 'plugin_name',
'type': task_type,
'role': '*'}]
err_msg = "File '/tmp/plugin_path/deployment_tasks.yaml', " \
"'parameters' is a required property, value path '0'"
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_deployment_tasks)
@mock.patch('fuel_plugin_builder.validators.validator_v3.utils')
def test_files_attribute_is_required_for_copy_files_task_type(
self, utils_mock, *args):
mock_data = [{
'id': 'plugin_name',
'type': 'copy_files',
'role': '*',
'parameters': {}}]
err_msg = "File '/tmp/plugin_path/deployment_tasks.yaml', " \
"'files' is a required property, value path '0 " \
"-> parameters'"
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_deployment_tasks)
@mock.patch('fuel_plugin_builder.validators.validator_v3.utils')
def test_files_should_contain_at_least_one_item_for_copy_files_task_type(
self, utils_mock, *args):
mock_data = [{
'id': 'plugin_name',
'type': 'copy_files',
'role': '*',
'parameters': {'files': []}}]
err_msg = "File '/tmp/plugin_path/deployment_tasks.yaml', " \
"\[\] is too short, value path '0 -> parameters -> files'"
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_deployment_tasks)
@mock.patch('fuel_plugin_builder.validators.validator_v3.utils')
def test_src_and_dst_attributes_are_required_for_copy_files_task_type(
self, utils_mock, *args):
data_to_check = [
([{
'id': 'plugin_name',
'type': 'copy_files',
'role': '*',
'parameters': {
'files': [{}]}
}], 'src'),
([{
'id': 'plugin_name',
'type': 'copy_files',
'role': '*',
'parameters': {
'files': [{'src': 'some_source'}]}
}], 'dst')]
for mock_data, key in data_to_check:
err_msg = "File '/tmp/plugin_path/deployment_tasks.yaml', " \
"'{0}' is a required property, value path '0 " \
"-> parameters -> files -> 0'".format(key)
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_deployment_tasks)
@mock.patch('fuel_plugin_builder.validators.validator_v3.utils')
def test_src_and_dst_attributes_are_required_for_sync_task_type(
self, utils_mock, *args):
data_to_check = [
([{
'id': 'plugin_name',
'type': 'sync',
'role': '*',
'parameters': {}
}], 'src'),
([{
'id': 'plugin_name',
'type': 'sync',
'role': '*',
'parameters': {'src': 'some_source'}
}], 'dst')]
for mock_data, key in data_to_check:
err_msg = "File '/tmp/plugin_path/deployment_tasks.yaml', " \
"'{0}' is a required property, value path '0 " \
"-> parameters'".format(key)
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_deployment_tasks)
@mock.patch('fuel_plugin_builder.validators.validator_v3.utils')
def test_path_and_data_attributes_are_required_for_upload_file_task_type(
self, utils_mock, *args):
data_to_check = [
([{
'id': 'plugin_name',
'type': 'upload_file',
'role': '*',
'parameters': {}
}], 'path'),
([{
'id': 'plugin_name',
'type': 'upload_file',
'role': '*',
'parameters': {'path': 'some_path'}
}], 'data')]
for mock_data, key in data_to_check:
err_msg = "File '/tmp/plugin_path/deployment_tasks.yaml', " \
"'{0}' is a required property, value path '0 " \
"-> parameters'".format(key)
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_deployment_tasks)
@mock.patch('fuel_plugin_builder.validators.validator_v3.utils')
def test_check_group_type_deployment_task_does_not_contain_manifests(
self, utils_mock, *args):
utils_mock.parse_yaml.return_value = [{
'id': 'plugin_name',
'type': 'group',
'role': ['plugin_name'],
'parameters': {}}]
self.validator.check_deployment_tasks()
@mock.patch('fuel_plugin_builder.validators.validator_v3.utils')
def test_check_deployment_task_role_failed(self, utils_mock, *args):
mock_data = [{
'id': 'plugin_name',
'type': 'group',
'role': ['plugin_n@me']}]
err_msg = "File '/tmp/plugin_path/deployment_tasks.yaml'," \
" 'plugin_n@me' does not match"
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_deployment_tasks)
@mock.patch('fuel_plugin_builder.validators.validator_v3.utils')
def test_check_deployment_task_role(self, utils_mock, *args):
utils_mock.parse_yaml.return_value = [
{'id': 'plugin_name', 'type': 'group', 'role': []},
{'id': 'plugin_name', 'type': 'group', 'role': ['a', 'b']},
{'id': 'plugin_name', 'type': 'group', 'role': '*'},
{'id': 'plugin_name', 'type': 'puppet', 'role': []},
{'id': 'plugin_name', 'type': 'puppet', 'role': ['a', 'b']},
{'id': 'plugin_name', 'type': 'puppet', 'role': '*'},
{'id': 'plugin_name', 'type': 'shell', 'role': []},
{'id': 'plugin_name', 'type': 'shell', 'role': ['a', 'b']},
{'id': 'plugin_name', 'type': 'shell', 'role': '*'},
{'id': 'plugin_name', 'type': 'skipped'},
{'id': 'plugin_name', 'type': 'stage'},
{'id': 'plugin_name', 'type': 'reboot'},
{
'id': 'plugin_name',
'type': 'copy_files',
'role': '*',
'parameters': {
'files': [
{'src': 'some_source', 'dst': 'some_destination'}]}
},
{
'id': 'plugin_name',
'type': 'sync',
'role': '*',
'parameters': {
'src': 'some_source', 'dst': 'some_destination'}
},
{
'id': 'plugin_name',
'type': 'upload_file',
'role': '*',
'parameters': {
'path': 'some_path', 'data': 'some_data'}
},
]
self.validator.check_deployment_tasks()
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_deployment_task_id(self, utils_mock):
mock_data = [{
'id': 'plugin_n@me',
'type': 'group',
'role': ['plugin_name']}]
err_msg = "File '/tmp/plugin_path/deployment_tasks.yaml'," \
" 'plugin_n@me' does not match"
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_deployment_tasks_schema)
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_deployment_task_valid_dependencies(self, utils_mock):
utils_mock.parse_yaml.return_value = [{
'id': 'plugin_name',
'type': 'group',
'role': ['plugin_name'],
'requires': ['dependency_1', 'dependency_2']}]
self.validator.check_deployment_tasks_schema()
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_deployment_task_invalid_dependencies(self, utils_mock):
mock_data = [{
'id': 'plugin_name',
'type': 'group',
'role': ['plugin_name'],
'requires': ['dependency_1', 'dependency_#']}]
err_msg = "File '/tmp/plugin_path/deployment_tasks.yaml'," \
" 'dependency_#' does not match"
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_deployment_tasks_schema)
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_node_roles_have_correct_name(self, utils_mock):
mock_data = {
'plug$n_n@me': {
'name': 'test_plugin',
'description': 'test plugin'}}
err_msg = "File '/tmp/plugin_path/node_roles.yaml', Additional" \
" properties are not allowed"
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_node_roles_schema)
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_node_role_should_has_name(self, utils_mock):
mock_data = {
'plugin_name': {
'description': 'test plugin'}}
err_msg = "File '/tmp/plugin_path/node_roles.yaml', 'name' is" \
" a required property, value path 'plugin_name'"
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_node_roles_schema)
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_node_role_conflicts(self, utils_mock):
utils_mock.parse_yaml.return_value = {
'plugin_name': {
'name': 'test_plugin',
'description': 'test plugin',
'conflicts': '*'}}
self.validator.check_node_roles_schema()
utils_mock.parse_yaml.return_value = {
'plugin_name': {
'name': 'test_plugin',
'description': 'test plugin',
'conflicts': ['some_role']}}
self.validator.check_node_roles_schema()
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_valid_volumes_roles_mapping_name(self, utils_mock):
utils_mock.parse_yaml.return_value = {
'volumes_roles_mapping': {
'mapping_name': [{'allocate_size': 'min', 'id': 'test'}]},
'volumes': []}
self.validator.check_volumes_schema()
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_invalid_volumes_roles_mapping_name(self, utils_mock):
mock_data = {
'volumes_roles_mapping': {
'm@pping_name': [{'allocate_size': 'min', 'id': 'test'}]},
'volumes': []}
err_msg = "File '/tmp/plugin_path/volumes.yaml', Additional" \
" properties are not allowed"
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_volumes_schema)
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_valid_network_roles(self, utils_mock):
utils_mock.parse_yaml.return_value = [{
"id": "example_net_role",
"default_mapping": "public",
"properties": {
"subnet": True,
"gateway": False,
"vip": [{
"name": "vip_name",
"namespace": "haproxy"}]}}]
self.validator.check_network_roles_schema()
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_network_roles_vip_have_invalid_name(self, utils_mock):
mock_data = [{
"id": "example_net_role",
"default_mapping": "public",
"properties": {
"subnet": True,
"gateway": False,
"vip": [{
"name": "vip@name",
"namespace": "haproxy"}]}}]
err_msg = "File '/tmp/plugin_path/network_roles.yaml'," \
" 'vip@name' does not match"
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_network_roles_schema)
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_network_roles_vip_have_invalid_namespace(self, utils_mock):
mock_data = [{
"id": "example_net_role",
"default_mapping": "public",
"properties": {
"subnet": True,
"gateway": False,
"vip": [{
"name": "vip_name",
"namespace": "hap roxy"}]}}]
err_msg = "File '/tmp/plugin_path/network_roles.yaml'," \
" 'hap roxy' does not match"
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_network_roles_schema)
validator_class = validators.ValidatorV3
loader_class = loaders.PluginLoaderV3
package_version = '3.0.0'


@ -14,11 +14,7 @@
# License for the specific language governing permissions and limitations
# under the License.
import mock
from fuel_plugin_builder import errors
from fuel_plugin_builder.tests.test_validator_v3 import TestValidatorV3
from fuel_plugin_builder.validators.schemas import SchemaV4
from fuel_plugin_builder.validators.validator_v4 import ValidatorV4
@ -26,898 +22,4 @@ class TestValidatorV4(TestValidatorV3):
__test__ = True
validator_class = ValidatorV4
schema_class = SchemaV4
package_version = '4.0.0'
def setUp(self):
super(TestValidatorV4, self).setUp()
self.metadata = {
'name': 'plugin_name-12',
'title': 'plugin_name-12',
'version': '1.2.3',
'package_version': self.package_version,
'description': 'Description',
'fuel_version': ['8.0.0'],
'licenses': ['Apache', 'BSD'],
'authors': ['Author1', 'Author2'],
'homepage': 'http://test.com',
'releases': [
{
"os": "ubuntu",
"version": "liberty-8.0",
"mode": ['ha'],
"deployment_scripts_path": "deployment_scripts/",
"repository_path": "repositories/ubuntu"
}
],
'groups': [],
'is_hotpluggable': False
}
def test_check_schemas(self):
mocked_methods = [
'check_metadata_schema',
'check_env_config_attrs',
'check_tasks_schema',
'check_deployment_tasks_schema',
'check_network_roles_schema',
'check_node_roles_schema',
'check_volumes_schema',
'check_components_schema'
]
self.mock_methods(self.validator, mocked_methods)
self.mock_methods(self.validator, ['validate_file_by_schema'])
self.validator.check_schemas()
for method in mocked_methods:
getattr(self.validator, method).assert_called_once_with()
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_compatibility_failed(self, utils_mock):
fuel_version_checks = (
(['6.0', '6.1', '7.0', '8.0']),
(['6.1', '7.0', '8.0']),
(['6.0', '6.1', '7.0']),
(['6.1', '7.0']),
)
for fuel_version in fuel_version_checks:
mock_data = {
'fuel_version': fuel_version,
'package_version': '4.0.0'}
err_msg = 'Current plugin format 4.0.0 is not compatible with ' \
'{0} Fuel release. Fuel version must be 8.0 or higher.' \
' Please remove {0} version from metadata.yaml file or' \
' downgrade package_version.'.format(fuel_version[0])
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_compatibility)
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_compatibility_passed(self, utils_mock):
utils_mock.parse_yaml.return_value = {
'fuel_version': ['8.0'],
'package_version': '4.0.0'}
self.validator.check_compatibility()
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_is_hotpluggable_flag(self, utils_mock):
mock_data = {
'name': 'plugin_name-12',
'title': 'plugin_name-12',
'version': '1.2.3',
'package_version': self.package_version,
'description': 'Description',
'fuel_version': ['8.0.0'],
'licenses': ['Apache', 'BSD'],
'authors': ['Author1', 'Author2'],
'homepage': 'http://test.com',
'releases': [
{
"os": "ubuntu",
"version": "liberty-8.0",
"mode": ['ha'],
"deployment_scripts_path": "deployment_scripts/",
"repository_path": "repositories/ubuntu"
}
],
'groups': ['network'],
'is_hotpluggable': True
}
utils_mock.parse_yaml.return_value = mock_data
self.assertEqual(None, self.validator.check_metadata_schema())
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_environment_config_settings_groups(self, utils_mock):
mock_data = {'attributes': {}}
utils_mock.parse_yaml.return_value = mock_data
self.assertEqual(None, self.validator.check_env_config_attrs())
mock_data = {'attributes': {'metadata': {}}}
utils_mock.parse_yaml.return_value = mock_data
self.assertEqual(None, self.validator.check_env_config_attrs())
mock_data = {'attributes': {'metadata': {'group': 'network'}}}
utils_mock.parse_yaml.return_value = mock_data
self.assertEqual(None, self.validator.check_env_config_attrs())
mock_data = {'attributes': {'metadata': {'group': 'unknown'}}}
utils_mock.parse_yaml.return_value = mock_data
self.assertRaises(
errors.ValidationError,
self.validator.check_env_config_attrs
)
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_environment_config_type_attrs(self, utils_mock):
mock_data = {
'attributes': {
'server-name': {
'value': [],
'label': 'test',
'weight': 1,
'type': 'text_list',
}
}
}
utils_mock.parse_yaml.return_value = mock_data
self.assertEqual(None, self.validator.check_env_config_attrs())
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_components_schema_validation_failed(self, utils_mock):
data_sets = [
{
'name': 'test_additional_item',
'type': 'network',
'label': 'test label',
'compatible': []
},
{
'name': 'test_wrong_label_type',
'label': 1
},
{
'name': 'test_wrong_description_type',
'description': []
},
{
'compatible': [],
'incompatible': []
},
{
'name': 'wrong::type_name:*',
'compatible': [],
'incompatible': []
},
{
'name': 'storage::NameWithUpperCase',
'label': 'Component Label'
},
{
'name': 'storage::wrong_compatible_types',
'compatible': {},
'requires': 3,
'incompatible': ""
},
{
'name': 'storage:no_name_compatible_items',
'incompatible': [{
'message': 'Component incompatible with XXX'
}],
},
{
'name': 'storage:wrong_message_compatible_items',
'incompatible': [{
'name': 'storage:*',
'message': 1234
}]
},
{
'name': 'network:new_net:wrong_compatible',
'compatible': [
{'name': ''},
{'name': 'wrong::component'},
{'name': 'storage:UpperCaseWrongName'},
{'name': 'Another_wrong**'}
]
}
]
for data in data_sets:
utils_mock.parse_yaml.return_value = [data]
self.assertRaises(errors.ValidationError,
self.validator.check_components_schema)
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_components_schema_validation_passed(self, utils_mock):
data_sets = [
{
'name': 'network:test_name',
'label': 'Test Name network'
},
{
'name': 'storage:sub-type:test_name',
'label': 'Test Storage',
'description': 'New Test Storage Description',
'compatible': [
{'name': 'hypervisor:libvirt:*'},
{'name': 'hypervisor:wmvare_new_1'},
{'name': 'network:neutron:ml2:*'},
{'name': 'additional_service:murano'},
],
'requires': [{
'name': 'hypervisor:libvirt:kvm',
'message': 'Requires message'
}],
'incompatible': [
{
'name': 'storage:*',
'message': 'New storage is incompatible with other'
},
{
'name': 'additional_service:sahara',
'message': 'New storage is incompatible with Sahara'
}
]
},
{
'name': 'hypervisor:new',
'label': 'New Hypervisor',
'compatible': []
},
{
'name': 'additional_service:ironic-new',
'label': 'Ironic New',
'bind': [('some_key', 'some_val')],
'incompatible': [{
'name': 'additional_service:*',
'message': 'Alert message'
}],
'requires': [{
'name': 'storage:test'
}]
}
]
for data in data_sets:
utils_mock.parse_yaml.return_value = [data]
self.validator.check_components_schema()
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_groups(self, utils_mock):
groups_data = [
["network"],
["storage"],
["storage::cinder"],
["storage::glance"],
["hypervisor"],
["equipment"],
["storage::cinder", "equipment"],
[]
]
for gd in groups_data:
self.metadata['groups'] = gd
utils_mock.parse_yaml.return_value = self.metadata
self.assertEqual(None, self.validator.check_metadata_schema())
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_deployment_task_reexecute_on(self, utils_mock):
mock_data = [{
'id': 'plugin_task',
'type': 'puppet',
'groups': ['controller'],
'reexecute_on': ['bla']}]
err_msg = "File '/tmp/plugin_path/deployment_tasks.yaml', " \
"'bla' is not one of"
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_deployment_tasks_schema)
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
@mock.patch('fuel_plugin_builder.validators.validator_v4.logger')
def test_role_attribute_is_required_for_deployment_task_types(
self, logger_mock, utils_mock, *args):
deployment_tasks_data = [
{
'id': 'plugin_name',
'type': 'group'
},
{
'id': 'plugin_name',
'type': 'shell'
},
{
'id': 'plugin_name',
'type': 'copy_files',
'parameters': {
'files': [{'src': '/dev/null', 'dst': '/dev/null'}]
}
},
{
'id': 'plugin_name',
'type': 'sync',
'parameters': {'src': '/dev/null', 'dst': '/dev/null'}
},
{
'id': 'plugin_name',
'type': 'upload_file',
'parameters': {
'path': 'http://test.com',
'data': 'VGVzdERhdGE='
}
}
]
for task in deployment_tasks_data:
utils_mock.parse_yaml.return_value = [task]
logger_mock.warn.reset_mock()
self.validator.check_deployment_tasks()
self.assertEqual(logger_mock.warn.call_count, 1)
# This is the section of tests inherited from the v3 validator
# where the mock decorators are re-defined for the v4 module
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
@mock.patch('fuel_plugin_builder.validators.base.utils.exists')
def test_check_tasks_no_file(self, exists_mock, utils_mock, *args):
super(TestValidatorV4, self).test_check_tasks_no_file(
exists_mock, utils_mock)
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_check_deployment_task_role(self, utils_mock, *args):
utils_mock.parse_yaml.return_value = [
{'id': 'plugin_name', 'type': 'group', 'groups': ['a', 'b']},
{'id': 'plugin_name', 'type': 'group', 'groups': '*'},
{'id': 'plugin_name', 'type': 'puppet', 'role': ['a', 'b']},
{'id': 'plugin_name', 'type': 'puppet', 'role': '*'},
{'id': 'plugin_name', 'type': 'shell', 'roles': ['a', 'b']},
{'id': 'plugin_name', 'type': 'shell', 'roles': '*'},
{'id': 'plugin_name', 'type': 'skipped', 'role': '/test/'},
{'id': 'plugin_name', 'type': 'stage'},
{'id': 'plugin_name', 'type': 'reboot', 'groups': 'contrail'},
{
'id': 'plugin_name',
'type': 'copy_files',
'role': '*',
'parameters': {
'files': [
{'src': 'some_source', 'dst': 'some_destination'}]}
},
{
'id': 'plugin_name',
'type': 'sync',
'role': 'plugin_name',
'parameters': {
'src': 'some_source', 'dst': 'some_destination'}
},
{
'id': 'plugin_name',
'type': 'upload_file',
'role': '/^.*plugin\w+name$/',
'parameters': {
'path': 'some_path', 'data': 'some_data'}
},
]
self.validator.check_deployment_tasks()
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_check_deployment_task_role_failed(self, utils_mock, *args):
mock_data = [{
'id': 'plugin_name',
'type': 'group',
'role': ['plugin_n@me']}]
err_msg = "field should"
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_deployment_tasks)
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_check_deployment_task_required_missing(self, utils_mock, *args):
mock_data = [{
'groups': 'plugin_name',
'type': 'puppet'}]
err_msg = 'required'
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_deployment_tasks)
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_check_deployment_task_required_roles_missing_is_ok(
self, utils_mock, *args):
utils_mock.parse_yaml.return_value = [{
'id': 'plugin_name',
'type': 'stage'}]
self.validator.check_deployment_tasks()
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_check_deployment_task_role_regexp_failed(self, utils_mock, *args):
mock_data = [{
'id': 'plugin_name',
'type': 'group',
'role': '/[0-9]++/'}]
err_msg = "field should.*multiple repeat"
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_deployment_tasks)
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_check_group_type_deployment_task_does_not_contain_manifests(
self, utils_mock, *args):
super(
TestValidatorV4, self
).test_check_group_type_deployment_task_does_not_contain_manifests(
utils_mock)
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_files_attribute_is_required_for_copy_files_task_type(
self, utils_mock, *args):
super(
TestValidatorV4, self
).test_files_attribute_is_required_for_copy_files_task_type(
utils_mock)
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_files_should_contain_at_least_one_item_for_copy_files_task_type(
self, utils_mock, *args):
super(
TestValidatorV4, self
).test_files_should_contain_at_least_one_item_for_copy_files_task_type(
utils_mock)
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_parameters_attribute_is_required_for_deployment_task_types(
self, utils_mock, *args):
super(
TestValidatorV4, self
).test_parameters_attribute_is_required_for_deployment_task_types(
utils_mock)
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_path_and_data_attributes_are_required_for_upload_file_task_type(
self, utils_mock, *args):
super(
TestValidatorV4, self
).test_path_and_data_attributes_are_required_for_upload_file_task_type(
utils_mock)
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_src_and_dst_attributes_are_required_for_copy_files_task_type(
self, utils_mock, *args):
super(
TestValidatorV4, self
).test_src_and_dst_attributes_are_required_for_copy_files_task_type(
utils_mock)
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_src_and_dst_attributes_are_required_for_sync_task_type(
self, utils_mock, *args):
super(
TestValidatorV4, self
).test_src_and_dst_attributes_are_required_for_sync_task_type(
utils_mock)
# todo(ikutukov): validation for old-style tasks.yaml without
# id and normal dependencies. Have to find out what to do with them.
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_check_tasks_schema_validation_failed(self, utils_mock, *args):
pass
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_check_tasks_schema_validation_passed(self, utils_mock, *args):
pass
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_check_tasks_schema_1_0_validation_failed(self, utils_mock, *args):
checks = [
{
'data': {
'id': 'task-id',
'type': 'shell',
'parameters': {
'timeout': 3
},
'stage': 'post_deployment',
'role': '*'
},
'errorTextContains': "'cmd' is a required property, "
"value path '0 -> parameters'"
},
{
'data': {
'id': 'task-id',
'type': 'puppet',
'parameters': {
'timeout': 3
},
'stage': 'post_deployment',
'role': '*'
},
'errorTextContains': "'puppet_manifest' is a required property"
", value path '0 -> parameters'"
},
{
'data': {
'id': 'task-id',
'type': 'puppet',
'parameters': {
'timeout': 3,
'cmd': 'xx'
},
'stage': 'post_deployment',
'role': '*'
},
'errorTextContains': "'puppet_manifest' is a required property"
", value path '0 -> parameters'"
},
{
'data': {
'id': 'task-id',
'type': 'shell',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': 'yy',
},
'stage': 'post_deployment',
'role': '*'
},
'errorTextContains': "'cmd' is a required property, value path"
" '0 -> parameters'"
},
{
'data': {
'id': 'task-id',
'type': 'puppet',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': 'yy',
'retries': 'asd',
},
'stage': 'post_deployment',
'role': '*'
},
'errorTextContains': "'asd' is not of type 'integer', value "
"path '0 -> parameters -> retries'"
},
{
'data': {
'id': 'task-id',
'type': 'puppet',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': '',
'retries': 1,
},
'stage': 'pre_deployment',
'role': '*'
},
'errorTextContains': "'' is too short, value path '0 -> "
"parameters -> puppet_modules'"
},
{
'data': {
'id': 'task-id',
'type': 'puppet',
'parameters': {
'timeout': 3,
'puppet_manifest': '',
'puppet_modules': 'yy',
'retries': 1,
},
'stage': 'pre_deployment',
'role': '*'
},
'errorTextContains': "'' is too short, value path '0 -> "
"parameters -> puppet_manifest'"
}
]
for check in checks:
utils_mock.parse_yaml.return_value = [check['data']]
self.assertRaisesRegexp(
errors.ValidationError,
check['errorTextContains'],
self.validator.check_deployment_tasks)
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_check_tasks_schema_1_0_validation_passed(self, utils_mock, *args):
data_sets = [
[
{
'id': 'task_id',
'type': 'shell',
'parameters': {
'timeout': 3,
'cmd': 'xx'
},
'stage': 'post_deployment',
'role': '*'
},
],
[
{
'id': 'task_id',
'type': 'shell',
'parameters': {
'timeout': 3,
'cmd': 'xx'
},
'stage': 'post_deployment',
'role': '*'
},
{
'id': 'task_id',
'type': 'puppet',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': 'xxx'
},
'stage': 'post_deployment',
'role': '*'
},
],
[
{
'id': 'task_id',
'type': 'shell',
'parameters': {
'timeout': 3,
'cmd': 'reboot'
},
'stage': 'post_deployment',
'role': '*'
},
{
'id': 'task_id',
'type': 'shell',
'parameters': {
'timeout': 3,
'cmd': 'xx'
},
'stage': 'post_deployment',
'role': '*'
},
{
'id': 'task_id',
'type': 'puppet',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': 'xxx'
},
'stage': 'post_deployment',
'role': '*'
}
],
[
{
'id': 'task_id',
'type': 'shell',
'parameters': {
'timeout': 3,
'cmd': 'reboot'
},
'stage': 'post_deployment',
'role': '*'
},
{
'id': 'task_id',
'type': 'shell',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': 'yy',
'cmd': 'reboot'
},
'stage': 'post_deployment',
'role': '*'
},
{
'id': 'task_id',
'type': 'puppet',
'parameters': {
'timeout': 3,
'retries': 10,
'puppet_manifest': 'xx',
'puppet_modules': 'xxx'
},
'stage': 'post_deployment',
'role': '*'
},
{
'id': 'task_id',
'type': 'puppet',
'parameters': {
'timeout': 3,
'retries': 10,
'puppet_manifest': 'xx',
'puppet_modules': 'xxx'
},
'stage': 'post_deployment',
'role': 'master'
},
]
]
for data in data_sets:
utils_mock.parse_yaml.return_value = data
self.validator.check_deployment_tasks()
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_check_tasks_schema_2_0_validation_failed(self, utils_mock, *args):
tasks_data = [
{
'id': 'test',
'type': 'shell',
'version': '2'
},
{
'id': 'test',
'type': 'shell',
'cross-depends': [
{
'role': 'role_without_name'
}
]
},
{
'id': 'test',
'type': 'shell',
'parameters': {
'strategy': 'NOSUCHSTRATEGY'
}
},
{
'id': 'test',
'type': 'shell',
'parameters': {
'strategy': {
'type': 'NOSUCHSTRATEGY'
}
}
}
]
utils_mock.parse_yaml.return_value = tasks_data
self.assertRaises(errors.ValidationError,
self.validator.check_deployment_tasks)
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_check_tasks_schema_2_0_validation_passed(self, utils_mock, *args):
tasks_data = [
{
'id': 'task_id',
'type': 'puppet',
'version': '2.0.0',
'parameters': {
'timeout': 3,
'retries': 10,
'puppet_manifest': 'xx',
'puppet_modules': 'xxx'
},
'stage': 'post_deployment',
'roles': ['test_role'],
'cross-depends': [
{
'name': 'task_id2',
},
{
'name': 'task_id2',
'role': ['some_role']
},
{
'name': 'task_id2',
'role': 'some_role'
},
{
'name': 'task_id2',
'policy': 'all'
},
{
'name': 'task_id2',
'policy': 'any'
}
],
'cross-depended-by': [
{
'name': 'task_id2',
},
{
'name': 'task_id2',
'role': ['some_role']
},
{
'name': 'task_id2',
'role': 'some_role'
},
{
'name': 'task_id2',
'policy': 'all'
},
{
'name': 'task_id2',
'policy': 'any'
}
],
'strategy': {
'type': 'parallel',
'amount': 10
}
}
]
utils_mock.parse_yaml.return_value = tasks_data
self.validator.check_deployment_tasks()
@mock.patch('fuel_plugin_builder.validators.validator_v4.utils')
def test_check_tasks_schema_2_1_validation_passed(self, utils_mock, *args):
# this is a slightly modified task from netconfig.yaml
tasks_data = [
{
"id": "netconfig",
"type": "puppet",
"version": "2.1.0",
"groups": [
"primary-controller",
"controller",
],
"required_for": [
"deploy_end"
],
"requires": [
"tools"
],
"condition": {
"yaql_exp": "changedAny($.network_scheme, $.dpdk, $.get('"
"use_ovs'), $.get('set_rps'), $.get('set_rps')"
", $.get('run_ping_checker'), $.network_scheme"
".endpoints.values().where(\n $.get('gateway'"
") != null).gateway)\n"
},
"parameters": {
"puppet_manifest": "/etc/puppet/modules/osnailyfacter/"
"modular/netconfig/netconfig.pp",
"puppet_modules": "/etc/puppet/modules",
"timeout": 300,
"strategy": {
"type": "parallel",
"amount": {
"yaql_exp": "switch($.get('deployed_before', {})."
"get('value') => 1, true => 3)\n"
}
}
},
"test_pre": {
"cmd": "ruby /etc/puppet/modules/osnailyfacter/modular/"
"netconfig/netconfig_pre.rb"
},
"test_post": {
"cmd": "ruby /etc/puppet/modules/osnailyfacter/modular/"
"netconfig/netconfig_post.rb"
},
"cross-depends": {
"yaql_exp": "switch( (\n $.roles.any($.matches('("
"primary-)?(controller|mongo)'))\n "
"or ($.network_metadata.get('vips',{}).get"
"('management') = null)\n ) => [],\n "
"true => [{name =>'virtual_ips'}]\n)\n"
}
}
]
utils_mock.parse_yaml.return_value = tasks_data
self.validator.check_deployment_tasks()
@mock.patch('fuel_plugin_builder.validators.base.utils.exists')
def test_check_tasks_schema_validation_no_file(self, exists_mock, *args):
mocked_methods = ['validate_schema']
self.mock_methods(self.validator, mocked_methods)
exists_mock.return_value = False
self.validator.check_tasks_schema()
self.assertFalse(self.validator.validate_schema.called)


@ -14,221 +14,260 @@
# License for the specific language governing permissions and limitations
# under the License.
import mock
from fuel_plugin_builder import errors
from fuel_plugin_builder.tests.test_validator_v4 import TestValidatorV4
from fuel_plugin_builder.validators.schemas import SchemaV5
from fuel_plugin_builder.validators.validator_v5 import ValidatorV5
from fuel_plugin_builder import loaders
from fuel_plugin_builder.tests.base import FakeFSTest
from fuel_plugin_builder import validators
class TestValidatorV5(TestValidatorV4):
__test__ = True
validator_class = ValidatorV5
schema_class = SchemaV5
class TestValidatorV5(FakeFSTest):
validator_class = validators.ValidatorV5
loader_class = loaders.PluginLoaderV5
package_version = '5.0.0'
def setUp(self):
super(TestValidatorV5, self).setUp()
__test__ = True
def test_check_schemas(self):
mocked_methods = [
'check_metadata_schema',
'check_env_config_attrs',
'check_tasks_schema',
'check_deployment_tasks_schema',
'check_network_roles_schema',
'check_node_roles_schema',
'check_volumes_schema',
'check_components_schema',
'check_node_attributes_schema'
]
self.mock_methods(self.validator, mocked_methods)
self.mock_methods(
self.validator,
['validate_file_by_schema', 'check_interface_attributes_schema']
def test_validate(self):
report = self.validator.validate(self.data_tree)
self.assertIn(u'Success!', report.render())
def test_fuel_version_legacy_warning(self):
self.data_tree.update(
self._make_fake_metadata_data(fuel_version=['9.1'])
)
self.validator.check_schemas()
report = self.validator.validate(self.data_tree)
self.assertIn('WARNING: "fuel_version" field in metadata.yaml is '
'deprecated and will be removed in further Fuel '
'releases.', report.render())
self.assertFalse(report.is_failed())
self.assertEqual(
[mock.call(self.validator.bond_config_path),
mock.call(self.validator.nic_config_path)],
self.validator.check_interface_attributes_schema.call_args_list)
for method in mocked_methods:
getattr(self.validator, method).assert_called_once_with()
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_compatibility_failed(self, utils_mock):
def test_check_compatibility_failed(self):
fuel_version_checks = (
(['8.0', '9.0', '10.0']),
(['6.1', '7.0', '8.0']),
(['6.0', '6.1', '7.0']),
(['6.1', '7.0']),
(['8.0', '9.0', '9.1', '10.0'], ['8.0', '9.0']),
(['6.1', '7.0', '8.0'], ['6.1', '7.0', '8.0']),
(['6.0', '6.1', '7.0'], ['6.0', '6.1', '7.0']),
(['6.1', '7.0'], ['6.1', '7.0']),
)
for fuel_version in fuel_version_checks:
mock_data = {
'fuel_version': fuel_version,
'package_version': '5.0.0'}
self.data_tree['package_version'] = '5.0.0'
for fuel_version, incompatible_versions in fuel_version_checks:
self.data_tree['fuel_version'] = fuel_version
err_msg = 'Current plugin format 5.0.0 is not compatible with ' \
'{0} Fuel release. Fuel version must be 9.0 or higher.' \
' Please remove {0} version from metadata.yaml file or' \
' downgrade package_version.'.format(fuel_version[0])
'{0} Fuel release. Fuel version must be 9.1 or higher.' \
' Please remove {0} version from metadata.yaml file ' \
'or downgrade package_version.' \
''.format(', '.join(incompatible_versions))
report = self.validator.validate(self.data_tree)
self.assertTrue(report.is_failed())
self.assertIn(err_msg, report.render())
self.check_raised_exception(
utils_mock, mock_data,
err_msg, self.validator.check_compatibility)
def test_check_compatibility_passed(self):
self.data_tree['package_version'] = '5.0.0'
self.data_tree['fuel_version'] = ['9.1', '9.2', '10.0']
report = self.validator.validate(self.data_tree)
self.assertFalse(report.is_failed())
self.assertIn('Success!', report.render())
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_compatibility_passed(self, utils_mock):
utils_mock.parse_yaml.return_value = {
'fuel_version': ['9.0', '9.1', '9.2', '10.0'],
'package_version': '5.0.0'}
self.validator.check_compatibility()
@mock.patch('fuel_plugin_builder.validators.base.utils.exists')
def test_check_interface_attributes_schema_validation_no_file(self,
exists_mock):
mocked_methods = ['validate_schema']
self.mock_methods(self.validator, mocked_methods)
exists_mock.return_value = False
self.validator.check_interface_attributes_schema(mock.ANY)
self.assertFalse(self.validator.validate_schema.called)
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_interface_attributes_schema_validation_failed(self,
utils_mock):
data_sets = [
def test_check_tasks_schema_validation_failed(self):
bad_tasks_data = [
{
'123': {
'label': 'Attribute without type',
'description': 'Attribute without type',
'value': ''
}
},
{
'attribute_without_label': {
'description': 'Attribute without label',
'type': 'text',
'value': 'attribute_value'
}
}, {
'attribute_without_value': {
'label': 'Attribute without value',
'description': 'Attribute without value',
'type': 'text',
}
},
{
'attribute-1': {
'description': 'Attribute with wrong label type',
'label': 123,
'type': 'checkbox',
}
},
{
'attribute-2': {
'label': 'Attribute with wrong type type',
'type': [],
}
},
{
'attribute-3': {
'label': 'Attribute with wrong description type',
'type': 'text',
'description': False
}
},
{
'attribute-4': {
'label': 'Attribute with wrong restrictions type',
'type': 'text',
'restrictions': {}
}
},
{
'label': 'Missed attribute name. Wrong level nesting.',
'type': 'text',
'value': ''
},
{
'extra_level': {
'attribute_name': {
'label': 'Attribute with extra nesting level',
'type': 'text',
'value': ''
}
}
},
{
'uns@pported_letters=!n_attr_name*': {
'label': 'Attribute with wrong name',
'type': 'text',
'value': ''
}
},
['wrong interface attributes object type']
]
for data in data_sets:
utils_mock.parse_yaml.return_value = data
self.assertRaises(errors.ValidationError,
self.validator.check_interface_attributes_schema,
mock.ANY)
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_interface_attributes_schema_validation_passed(self,
utils_mock):
data_sets = [
{
'123': {
'label': 'Attribute with min required fields',
'type': 'text',
'value': ''
}
},
{
'Attribute_1': {
'label': 'Attribute with restrictions & complex value',
'description': 'Some attribute description',
'type': 'text',
'value': {'key1': ['val_1', 'val_2']},
'restrictions': [
{
'condition': 'false',
'action': 'disable'
}
]
'type': 'shell',
'parameters': {
'timeout': 3
},
'attribute-2': {
'label': 'Attribute with additional fields',
'type': 'number',
'description': 'Some attribute description',
'value': 10,
'min': 0
'stage': 'post_deployment',
'role': '*'
},
{
'type': 'puppet',
'parameters': {
'timeout': 3
},
'metadata': {
'label': 'Some metadata'
}
'stage': 'post_deployment',
'role': '*'
},
{
'parameters': {
'timeout': 3,
'cmd': 'xx'
},
'stage': 'post_deployment',
'role': '*'
},
{
'type': 'shell',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': 'yy',
},
'stage': 'post_deployment',
'role': '*'
},
{
'type': 'puppet',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': 'yy',
'retries': 'asd',
},
'stage': 'post_deployment',
'role': '*'
},
{
'type': 'puppet',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': '',
'retries': 1,
},
'stage': 'pre_deployment',
'role': '*'
},
{
'type': 'puppet',
'parameters': {
'timeout': 3,
'puppet_manifest': '',
'puppet_modules': 'yy',
'retries': 1,
},
'stage': 'pre_deployment',
'role': '*'
}
]
self.data_tree['releases'][0]['graphs'][0]['tasks'] = \
bad_tasks_data
report = self.validator.validate(self.data_tree)
self.assertEqual(report.count_failures(), 7 + 1)
self.assertIn('Failure!', report.render())
def test_check_tasks_schema_validation_passed(self):
data_sets = [
[
{
'id': 'test1',
'type': 'shell',
'parameters': {
'timeout': 3,
'cmd': 'xx'
},
'stage': 'post_deployment',
'role': '*'
},
],
[
{
'id': 'test1',
'type': 'shell',
'parameters': {
'timeout': 3,
'cmd': 'xx'
},
'stage': 'post_deployment',
'role': '*'
},
{
'id': 'test2',
'type': 'puppet',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': 'xxx'
},
'stage': 'post_deployment',
'role': '*'
},
],
[
{
'id': 'test3',
'type': 'shell',
'parameters': {
'timeout': 3,
'cmd': 'reboot'
},
'stage': 'post_deployment',
'role': '*'
},
{
'id': 'test4',
'type': 'shell',
'parameters': {
'timeout': 3,
'cmd': 'xx'
},
'stage': 'post_deployment',
'role': '*'
},
{
'id': 'test5',
'type': 'puppet',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': 'xxx'
},
'stage': 'post_deployment',
'role': '*'
}
],
[
{
'id': 'test1',
'type': 'shell',
'parameters': {
'timeout': 3,
'cmd': 'reboot'
},
'stage': 'post_deployment',
'role': '*'
},
{
'id': 'test2',
'type': 'shell',
'parameters': {
'timeout': 3,
'puppet_manifest': 'xx',
'puppet_modules': 'yy',
'cmd': 'reboot'
},
'stage': 'post_deployment',
'role': '*'
},
{
'id': 'test3',
'type': 'puppet',
'parameters': {
'timeout': 3,
'retries': 10,
'puppet_manifest': 'xx',
'puppet_modules': 'xxx'
},
'stage': 'post_deployment',
'role': '*'
},
{
'id': 'test4',
'type': 'puppet',
'parameters': {
'timeout': 3,
'retries': 10,
'puppet_manifest': 'xx',
'puppet_modules': 'xxx'
},
'stage': 'post_deployment',
'role': 'master'
},
]
]
for data in data_sets:
utils_mock.parse_yaml.return_value = data
self.validator.check_interface_attributes_schema('nic_config_path')
self.data_tree['releases'][0]['graphs'][0]['tasks'] = data
report = self.validator.validate(self.data_tree)
self.assertFalse(report.is_failed())
self.assertIn('Success!', report.render())
@mock.patch('fuel_plugin_builder.validators.base.utils.exists')
def test_check_node_attributes_schema_validation_no_file(self,
exists_mock):
mocked_methods = ['validate_schema']
self.mock_methods(self.validator, mocked_methods)
exists_mock.return_value = False
self.validator.check_node_attributes_schema()
self.assertFalse(self.validator.validate_schema.called)
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_node_attributes_schema_validation_failed(self, utils_mock):
def test_check_node_attributes_schema_validation_failed(self):
data_sets = [
{
'plugin_section': {
@ -395,58 +434,177 @@ class TestValidatorV5(TestValidatorV4):
]
for data in data_sets:
utils_mock.parse_yaml.return_value = data
self.assertRaises(errors.ValidationError,
self.validator.check_node_attributes_schema)
self.data_tree['node_attributes_metadata'] = data
report = self.validator.validate(self.data_tree)
self.assertIn('Failure!', report.render())
self.assertTrue(report.is_failed())
@mock.patch('fuel_plugin_builder.validators.base.utils')
def test_check_node_attributes_schema_validation_passed(self, utils_mock):
def test_check_node_attributes_schema_validation_passed(self):
data = {
'plugin_section': {
'metadata': {
'label': 'Some label'
},
'123': {
'label': 'Attribute with min required fields',
'type': 'text',
'value': ''
}
},
'plugin_section123': {
'Attribute_1': {
'label': 'Attribute with restrictions & complex value',
'description': 'Some attribute description',
'type': 'text',
'value': {'key1': ['val_1', 'val_2']},
'restrictions': [
{
'condition': 'false',
'action': 'disable'
}
]
},
'attribute-2': {
'label': 'Attribute with additional fields',
'type': 'number',
'description': 'Some attribute description',
'value': 10,
'min': 0
},
'metadata': {
'label': 'Metadata with extra field & restrictions',
'restrictions': [
{
'condition': 'false',
'action': 'disable'
}
],
'group': 'group A'
}
}
}
self.data_tree['node_attributes_metadata'] = data
report = self.validator.validate(self.data_tree)
self.assertIn('Success!', report.render())
self.assertFalse(report.is_failed())
def test_check_interface_attributes_schema_validation_failed(self):
data_sets = [
{
'plugin_section': {
'metadata': {
'label': 'Some label'
},
'123': {
'label': 'Attribute with min required fields',
'123': {
'label': 'Attribute without type',
'description': 'Attribute without type',
'value': ''
}
},
{
'attribute_without_label': {
'description': 'Attribute without label',
'type': 'text',
'value': 'attribute_value'
}
}, {
'attribute_without_value': {
'label': 'Attribute without value',
'description': 'Attribute without value',
'type': 'text',
}
},
{
'attribute-1': {
'description': 'Attribute with wrong label type',
'label': 123,
'type': 'checkbox',
}
},
{
'attribute-2': {
'label': 'Attribute with wrong type type',
'type': [],
}
},
{
'attribute-3': {
'label': 'Attribute with wrong description type',
'type': 'text',
'description': False
}
},
{
'attribute-4': {
'label': 'Attribute with wrong restrictions type',
'type': 'text',
'restrictions': {}
}
},
{
'label': 'Missed attribute name. Wrong level nesting.',
'type': 'text',
'value': ''
},
{
'extra_level': {
'attribute_name': {
'label': 'Attribute with extra nesting level',
'type': 'text',
'value': ''
}
}
},
{
'uns@pported_letters=!n_attr_name*': {
'label': 'Attribute with wrong name',
'type': 'text',
'value': ''
}
},
['wrong interface attributes object type']
]
for data in data_sets:
self.data_tree['bond_attributes_metadata'] = data
report = self.validator.validate(self.data_tree)
self.assertIn('Failure!', report.render())
self.assertTrue(report.is_failed())
def test_check_interface_attributes_schema_validation_passed(self):
data_sets = [
{
'123': {
'label': 'Attribute with min required fields',
'type': 'text',
'value': ''
}
},
{
'Attribute_1': {
'label': 'Attribute with restrictions & complex value',
'description': 'Some attribute description',
'type': 'text',
'value': {'key1': ['val_1', 'val_2']},
'restrictions': [
{
'condition': 'false',
'action': 'disable'
}
]
},
'plugin_section123': {
'Attribute_1': {
'label': 'Attribute with restrictions & complex value',
'description': 'Some attribute description',
'type': 'text',
'value': {'key1': ['val_1', 'val_2']},
'restrictions': [
{
'condition': 'false',
'action': 'disable'
}
]
},
'attribute-2': {
'label': 'Attribute with additional fields',
'type': 'number',
'description': 'Some attribute description',
'value': 10,
'min': 0
},
'metadata': {
'label': 'Metadata with extra field & restrictions',
'restrictions': [
{
'condition': 'false',
'action': 'disable'
}
],
'group': 'group A'
}
'attribute-2': {
'label': 'Attribute with additional fields',
'type': 'number',
'description': 'Some attribute description',
'value': 10,
'min': 0
},
'metadata': {
'label': 'Some metadata'
}
}
]
for data in data_sets:
utils_mock.parse_yaml.return_value = data
self.validator.check_node_attributes_schema()
self.data_tree['nic_attributes_metadata'] = data
report = self.validator.validate(self.data_tree)
self.assertIn('Success!', report.render())
self.assertFalse(report.is_failed())


@ -14,66 +14,55 @@
# License for the specific language governing permissions and limitations
# under the License.
from fuel_plugin_builder import errors
from fuel_plugin_builder import builders
from fuel_plugin_builder import loaders
from fuel_plugin_builder.tests.base import BaseTestCase
from fuel_plugin_builder.validators import ValidatorV1
from fuel_plugin_builder.validators import ValidatorV2
from fuel_plugin_builder.validators import ValidatorV3
from fuel_plugin_builder.validators import ValidatorV4
from fuel_plugin_builder.validators import ValidatorV5
from fuel_plugin_builder.version_mapping import get_plugin_for_version
from fuel_plugin_builder import validators
from fuel_plugin_builder.version_mapping import \
get_plugin_package_config
class TestVersionMapping(BaseTestCase):
def test_get_plugin_for_version_1(self):
result = get_plugin_for_version('1.0.0')
self.assertEqual(result['version'], '1.0.0')
self.assertEqual(
result['templates'],
['templates/base', 'templates/v1/'])
self.assertEqual(result['validator'], ValidatorV1)
def test_get_plugin_for_version_2(self):
result = get_plugin_for_version('2.0.0')
self.assertEqual(result['version'], '2.0.0')
self.assertEqual(
result['templates'],
['templates/base', 'templates/v2/plugin_data/'])
self.assertEqual(result['validator'], ValidatorV2)
def test_get_plugin_for_version_3(self):
result = get_plugin_for_version('3.0.0')
self.assertEqual(result['version'], '3.0.0')
self.assertEqual(
result['templates'],
['templates/base', 'templates/v3/plugin_data/'])
self.assertEqual(result['validator'], ValidatorV3)
def test_get_plugin_for_version_4(self):
result = get_plugin_for_version('4.0.0')
self.assertEqual(result['version'], '4.0.0')
self.assertEqual(
result['templates'],
[
'templates/base',
'templates/v3/plugin_data/',
'templates/v4/plugin_data/'])
self.assertEqual(result['validator'], ValidatorV4)
def test_get_plugin_for_version_5(self):
result = get_plugin_for_version('5.0.0')
self.assertEqual(result['version'], '5.0.0')
self.assertEqual(
result['templates'],
[
'templates/base',
'templates/v3/plugin_data/',
'templates/v4/plugin_data/',
'templates/v5/plugin_data/'])
self.assertEqual(result['validator'], ValidatorV5)
def test_get_plugin_for_existing_versions(self):
for n, validator, builder, loader in (
(
1,
validators.ValidatorV1,
builders.PluginBuilderV1,
loaders.PluginLoaderV1
),
(
2,
validators.ValidatorV2,
builders.PluginBuilderV2,
loaders.PluginLoaderV1
),
(
3,
validators.ValidatorV3,
builders.PluginBuilderV3,
loaders.PluginLoaderV3
),
(
4,
validators.ValidatorV4,
builders.PluginBuilderV3,
loaders.PluginLoaderV4
),
(
5,
validators.ValidatorV5,
builders.PluginBuilderV3,
loaders.PluginLoaderV5
)
):
result = get_plugin_package_config('{}.0.0'.format(n))
self.assertEqual(result['version'], '{}.0.0'.format(n))
self.assertEqual(result['validator'], validator)
self.assertEqual(result['builder'], builder)
self.assertEqual(result['loader'], loader)
def test_get_plugin_for_version_raises_error(self):
with self.assertRaisesRegexp(errors.WrongPackageVersionError,
with self.assertRaisesRegexp(Exception,
'Wrong package version "2999"'):
get_plugin_for_version('2999')
get_plugin_package_config('2999')


@ -1,387 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright 2014 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import datetime
import hashlib
import io
import logging
import os
import shutil
import subprocess
import tarfile
import yaml
from distutils import dir_util
from distutils.version import StrictVersion
from glob import glob
from mako.template import Template
from fuel_plugin_builder import errors
logger = logging.getLogger(__name__)
def is_executable(file_path):
"""Checks if file executable
:param str file_path: path to the file
:returns: True if file is executable, False if is not
"""
return os.path.isfile(file_path) and os.access(file_path, os.X_OK)
def which(cmd):
"""Checks if file executable
:param str cmd: the name of the command or path
:returns: None if there is no such command,
if there is such command returns
the path to the command
"""
fpath, fname = os.path.split(cmd)
if fpath:
if is_executable(cmd):
return cmd
for path in os.environ['PATH'].split(os.pathsep):
exe_file = os.path.join(path, cmd)
if is_executable(exe_file):
return exe_file
return None
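# Illustrative usage (hypothetical command names, not from the original
# module):
#   which('bash')         # -> e.g. '/bin/bash' when found on PATH
#   which('/bin/bash')    # -> '/bin/bash' if it exists and is executable
#   which('no-such-cmd')  # -> None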
def exec_cmd(cmd, cwd=None):
"""Execute command with logging.
Output of stdout and stderr will be written
to the log.
:param cmd: shell command
:param cwd: string or None
"""
logger.debug(u'Execute command "{0}"'.format(cmd))
child = subprocess.Popen(
cmd, stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
shell=True,
cwd=cwd)
logger.debug(u'Stdout and stderr of command "{0}":'.format(cmd))
for line in child.stdout:
logger.debug(line.rstrip())
child.wait()
exit_code = child.returncode
if exit_code != 0:
raise errors.ExecutedErrorNonZeroExitCode(
u'Shell command executed with "{0}" '
'exit code: {1} '.format(exit_code, cmd))
logger.debug(u'Command "{0}" successfully executed'.format(cmd))
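# Illustrative usage (hypothetical command, not from the original module):
#   exec_cmd('ls -la', cwd='/tmp')
# logs stdout/stderr line by line and raises
# errors.ExecutedErrorNonZeroExitCode on a non-zero exit status.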
def exec_piped_cmds(cmds, cwd=None):
"""Execute pipe of commands with logging.
:param cmds: list of shell commands
:type cmds: list
:param cwd: current working directory
:type cwd: string or None
"""
logger.debug(u'Executing commands "{0}"'.format(" | ".join(cmds)))
std_out = None
for cmd in cmds:
child = subprocess.Popen(
cmd,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
stdin=subprocess.PIPE,
shell=True,
cwd=cwd)
std_out, std_err = child.communicate(input=std_out)
exit_code = child.returncode
if exit_code != 0:
logger.debug(u'Stderr of command "{0}":'.format(cmd))
logger.debug(std_err)
raise errors.ExecutedErrorNonZeroExitCode(
u'Shell command executed with "{0}" '
'exit code: {1} '.format(exit_code, cmd))
logger.debug(u'Stdout of command "{0}":'.format(" | ".join(cmds)))
logger.debug(std_out)
logger.debug(
u'Command "{0}" successfully executed'.format(" | ".join(cmds))
)
def create_dir(dir_path):
"""Creates directory
:param dir_path: directory path
:raises: errors.DirectoryExistsError
"""
logger.debug(u'Creating directory %s', dir_path)
if not os.path.isdir(dir_path):
os.makedirs(dir_path)
def exists(path):
"""Checks if filel is exist
:param str path: path to the file
:returns: True if file is exist, Flase if is not
"""
return os.path.lexists(path)
def basename(path):
"""Basename for path
:param str path: path to the file
:returns: str with filename
"""
return os.path.basename(path)
def render_to_file(src, dst, params):
"""Render mako template and write it to specified file
:param src: path to template
:param dst: path where rendered template will be saved
"""
logger.debug(u'Render template from {0} to {1} with params: {2}'.format(
src, dst, params))
# NOTE(aroma): we use io.open because sometimes we ended up with
# non-ascii chars in rendered template so must explicitly
# convert content to 'utf-8' encoding before writing
with io.open(src, 'r', encoding='utf-8') as f:
template_file = f.read()
with io.open(dst, 'w', encoding='utf-8') as f:
# NOTE(aroma): 'render' in such configuration always
# return unicode object as the result
rendered_file = Template(template_file).render(**params)
f.write(rendered_file)
def render_files_in_dir(dir_path, params):
"""Renders all *.mako files and removes templates
:param str dir_path: path to the directory
:param dict params: parameters for rendering
"""
for root, _, files in os.walk(dir_path):
for file_path in files:
name, extension = os.path.splitext(file_path)
if not extension == '.mako':
continue
src_path = os.path.join(root, file_path)
dst_path = os.path.join(root, name)
render_to_file(src_path, dst_path, params)
copy_file_permissions(src_path, dst_path)
remove(src_path)
def copy_file_permissions(src, dst):
"""Copies file permissions
:param str src: source file
:param str dst: destination
"""
shutil.copymode(src, dst)
def remove(path):
"""Remove file or directory
:param path: a file or directory to remove
"""
logger.debug(u'Removing "%s"', path)
if not os.path.lexists(path):
return
if os.path.isdir(path) and not os.path.islink(path):
shutil.rmtree(path)
else:
os.remove(path)
def copy(src, dst):
"""Copy a given file or directory from one place to another.
Overwrites already existing files.
:param src: copy from
:param dst: copy to
"""
logger.debug(u'Copy from %s to %s', src, dst)
if os.path.isdir(src):
# dir_util.copy_tree use here instead of shutil.copytree because
# it can overwrite existing folder and files. This is necessary
# for our template combinations, e.g.: base and v1
dir_util.copy_tree(src, dst, preserve_symlinks=True)
else:
shutil.copy(src, dst)
def copy_files_in_dir(src, dst):
"""Copies file in directory
:param str src: source files
:param str dst: destination directory
"""
logger.debug(u'Copy files in directory %s %s', src, dst)
for f in glob(src):
dst_path = os.path.join(dst, os.path.basename(f))
copy(f, dst_path)
def move_files_in_dir(src, dst):
"""Move files or directories
:param str src: source files or directories
:param str dst: destination directory
"""
logger.debug(u'Move files to directory %s %s', src, dst)
for f in glob(src):
dst_path = os.path.join(dst, os.path.basename(f))
shutil.move(f, dst_path)
def make_tar_gz(dir_path, tar_path, files_prefix):
"""Compress the file in tar.gz archive
:param str dir_path: directory for archiving
:param str tar_path: the name and path to the file
:param str files_prefix: the directory in the tar files where all
of the files are allocated
"""
logger.debug(u'Archive directory %s to file %s', dir_path, tar_path)
tar = tarfile.open(tar_path, 'w:gz')
tar.add(dir_path, arcname=files_prefix)
tar.close()
def parse_yaml(path):
"""Parses yaml file
:param str path: path to the file
:returns: dict or list
"""
return yaml.load(open(path))
def calculate_sha(file_path, chunk_size=2 ** 20):
"""Calculate file's checksum
:param str file_path: file path
:param int chunk_size: optional parameter, size of chunk
:returns: SHA1 string
"""
sha = hashlib.sha1()
with open(file_path, 'rb') as f:
for chunk in iter(lambda: f.read(chunk_size), b''):
sha.update(chunk)
return sha.hexdigest()
def calculate_checksums(dir_path):
"""Calculates checksums of files in the directory
:param str dir_path: path to the directory
:returns: list of dicts, where 'checksum' is SHA1,
'file_path' is a relative path to the file
"""
checksums = []
for root, _, files in os.walk(dir_path):
for file_path in files:
full_path = os.path.join(root, file_path)
rel_path = os.path.relpath(full_path, dir_path)
checksums.append({
'checksum': calculate_sha(full_path),
'file_path': rel_path})
return checksums
def create_checksums_file(dir_path, checksums_file):
"""Creates file with checksums
:param str dir_path: path to the directory for checksums calculation
:param str checksums_file: path to the file where checksums are saved
"""
checksums = calculate_checksums(dir_path)
checksums_sorted = sorted(checksums, key=lambda c: c['file_path'])
checksum_lines = [
'{checksum} {file_path}\n'.format(**checksum)
for checksum in checksums_sorted]
with open(checksums_file, 'w') as f:
f.writelines(checksum_lines)
def version_split_name_rpm(version):
version_tuple = StrictVersion(version).version
major = '.'.join(map(str, version_tuple[0:2]))
minor = version
return (major, minor)
def get_current_year():
"""Returns current year
"""
return str(datetime.date.today().year)
def remove_by_mask(mask):
"""Deletes files by mask
:param str mask: files mask
"""
logger.debug(u'Remove files by mask %s', mask)
for f in glob(mask):
remove(f)
def read_if_exist(filename):
"""Read contents from filename
:param str filename: path to the file
:returns: str with contents of filename or empty string
"""
if not exists(filename):
logger.debug('File not found. Skipping {0}'.format(filename))
return ""
with open(filename) as f:
logger.debug('Reading file {0}'.format(filename))
return f.read()

View File

@ -0,0 +1,34 @@
from .checksum import calculate_file_checksums
from .checksum import calculate_file_sha
from .checksum import create_checksums_file
from .data_structures import dict_merge
from .data_structures import Enum
from .files_manager import FilesManager
from .fs import basename
from .fs import copy
from .fs import copy_file_permissions
from .fs import copy_files_in_dir
from .fs import create_dir
from .fs import exec_piped_cmds
from .fs import files_in_path
from .fs import get_path_extension
from .fs import get_path_without_extension
from .fs import get_paths
from .fs import is_dir
from .fs import is_executable
from .fs import is_exists
from .fs import is_file
from .fs import make_tar_gz
from .fs import move_files_in_dir
from .fs import remove
from .fs import remove_by_mask
from .fs import which
from .reports import ReportNode
from .schema import make_schema
from .sys_calls import exec_cmd
from .template import load_template_and_render_to_file
from .template import render_files_in_dir
from .template import render_template_file
from .time import get_current_year
from .version import strict_version
from .version import version_split_name_rpm
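A minimal usage sketch (hypothetical caller code, assuming the package is importable as fuel_plugin_builder): the flat re-exports above let callers pull every helper from a single namespace:

    from fuel_plugin_builder import utils

    report = utils.ReportNode(u'Build started')
    if utils.is_file('./metadata.yaml'):  # illustrative path
        report.info(u'metadata.yaml found')
    print(report.render())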

View File

@ -0,0 +1,72 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import hashlib
import os
def calculate_file_sha(file_path, chunk_size=2 ** 20):
"""Calculate file's checksum
:param str file_path: file path
:param int chunk_size: optional parameter, size of chunk
:returns: SHA1 string
"""
sha = hashlib.sha1()
with open(file_path, 'rb') as f:
for chunk in iter(lambda: f.read(chunk_size), b''):
sha.update(chunk)
return sha.hexdigest()
def calculate_file_checksums(dir_path):
"""Calculates checksums of files in the directory
:param str dir_path: path to the directory
:returns: list of dicts, where 'checksum' is SHA1,
'file_path' is a relative path to the file
"""
checksums = []
for root, _, files in os.walk(dir_path):
for file_path in files:
full_path = os.path.join(root, file_path)
rel_path = os.path.relpath(full_path, dir_path)
checksums.append({
'checksum': calculate_file_sha(full_path),
'file_path': rel_path})
return checksums
def create_checksums_file(dir_path, checksums_file):
"""Creates file with checksums
:param dir_path: path to the directory for checksums calculation
:type dir_path: str
:param checksums_file: path to the file where checksums are saved
:type checksums_file: str
"""
checksums = calculate_file_checksums(dir_path)
checksums_sorted = sorted(checksums, key=lambda c: c['file_path'])
checksum_lines = [
'{checksum} {file_path}\n'.format(**checksum)
for checksum in checksums_sorted]
with open(checksums_file, 'w') as f:
f.writelines(checksum_lines)
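A short usage sketch (all paths here are hypothetical) showing how the three helpers compose:

    # Hash one file, collect per-file hashes, then write a sorted manifest.
    sha = calculate_file_sha('/tmp/plugin/metadata.yaml')
    per_file = calculate_file_checksums('/tmp/plugin')
    create_checksums_file('/tmp/plugin', '/tmp/plugin.sha1')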

View File

@ -0,0 +1,43 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import collections
from copy import deepcopy
import six
def Enum(*values, **kwargs):
names = kwargs.get('names')
if names:
return collections.namedtuple('Enum', names)(*values)
return collections.namedtuple('Enum', values)(*values)
def dict_merge(a, b):
"""recursively merges dict's. not just simple a['key'] = b['key'], if
both a and bhave a key who's value is a dict then dict_merge is called
on both values and the result stored in the returned dictionary.
"""
if not isinstance(b, dict):
return deepcopy(b)
result = deepcopy(a)
for k, v in six.iteritems(b):
if k in result and isinstance(result[k], dict):
result[k] = dict_merge(result[k], v)
else:
result[k] = deepcopy(v)
return result
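A small illustration of the merge semantics (a sketch, not part of the module):

    a = {'meta': {'name': 'plugin', 'groups': ['network']}}
    b = {'meta': {'groups': ['storage']}, 'enabled': True}
    # Nested dicts merge key by key; non-dict values (lists included)
    # are replaced by a deep copy of the right-hand value.
    assert dict_merge(a, b) == {
        'meta': {'name': 'plugin', 'groups': ['storage']},
        'enabled': True,
    }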

View File

@ -0,0 +1,278 @@
# -*- coding: utf-8 -*-
#
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import glob
import json
import os
import yaml
from fuel_plugin_builder import errors
from fuel_plugin_builder.utils.fs import create_dir
class FilesManager(object):
"""Files Manager allows to load and save files with auto-serialization.
All files loading and saving operations are recommended to be
performed via FilesManager class.
Also, it's recommended to work with FM using absolute paths to avoid
relative paths mess.
"""
_deserializers = {
"json": json.loads,
"yaml": yaml.load,
"yml": yaml.load,
"txt": lambda v: v,
"md": lambda v: v,
"sh": lambda v: v
}
_serializers = {
"json": json.dumps,
"yaml": yaml.safe_dump,
"yml": yaml.safe_dump,
"txt": lambda v: v,
"md": lambda v: v,
"sh": lambda v: v
}
@staticmethod
def _get_normalized_extension(path):
"""Get normalized file extension.
:param path: path
:type path: str|basestring
:return: lowercased extension without dot
:rtype: str|basestring
"""
extension = os.path.splitext(path)[1].lower()
if extension:
if extension[0] == '.':
extension = extension[1:]
return extension
def _get_files_by_mask(self, path_mask, allowed_formats=None):
"""Find all files of allowed format in path.
:param path_mask: path mask like ./my-file.*
:type path_mask: str|basestring
:param allowed_formats: available file formats
allow all if not defined
:type allowed_formats: iterable|None
:return: list of sorted files paths
:rtype: list
"""
path_mask_parts = path_mask.split('**/')
paths = []
paths_to_glob = []
if len(path_mask_parts) == 1:
paths_to_glob = [path_mask]
else:
for dir, _, _ in os.walk(path_mask_parts[0]):
paths_to_glob.append(os.path.join(dir, path_mask_parts[1]))
for path_to_glob in paths_to_glob:
for path in glob.glob(path_to_glob):
extension = self._get_normalized_extension(path)
if not allowed_formats or extension in allowed_formats:
paths.append(path)
if paths:
return sorted(paths)
@staticmethod
def _merge_data_records(data_records):
"""Merge data records.
Accepts list and dict structures, respecting the order of records.
If at least one record has a list as its root, that list is extended
by all other found lists, and records with objects at the root are
appended to it.
If all records have an object at the root, fields are overridden
record by record in the given order.
example 1:
_merge_data_records([
[{'field1': 1}],
{'field2': 2},
[{'field1': 3}],
])
will return:
[
{'field1': 1},
{'field2': 2},
{'field1': 3}
]
example 2:
_merge_data_records([
{'field1': 1},
{'field2': 2},
{'field1': 3},
])
will return:
{
'field1': 3,
'field2': 2
}
:param data_records: list of data records
:type data_records: list[list|dict]
:return: resulting data
:rtype: list|dict|other objects
"""
unmergable = []
dicts_to_merge = []
merged_list = []
for data_record in data_records:
if isinstance(data_record, dict):
dicts_to_merge.append(data_record)
elif isinstance(data_record, list):
merged_list.extend(data_record)
else:
unmergable.append(data_record)
if len(merged_list): # we have list as root structure
merged_list.extend(dicts_to_merge)
merged_list.extend(unmergable)
return merged_list
elif len(dicts_to_merge):
merged_dict = {}
for dict_to_merge in dicts_to_merge:
merged_dict.update(dict_to_merge)
return merged_dict
elif len(unmergable) == 1:
return unmergable[0]
elif len(unmergable) > 1:
return unmergable
@property
def supported_input_formats(self):
return list(self._deserializers)
@property
def supported_output_formats(self):
return list(self._serializers)
def load(
self,
path_mask,
skip_unknown_files=False,
skip_unreadable_files=False,
decode=True,
*args,
**kwargs
):
"""Load file from path mask or direct path.
:param path_mask: path
:type path_mask: str
:param skip_unknown_files: do not stop on files with unsupported formats
default=False
:type skip_unknown_files: bool
:param skip_unreadable_files: do not stop on file reading errors
default=False
:type skip_unreadable_files: bool
:param decode: decode automatically (Default: True)
:type decode: bool
:raises: InvalidFileFormat
:raises: TypeError
:raises: yaml.YAMLError
:return: data
:rtype: list|dict
"""
paths = self._get_files_by_mask(
path_mask, self.supported_input_formats)
if not paths:
raise errors.NoPluginFileFound(
u"Can't find file. "
u"Ensure that file is on its place and have one of "
u"the following data files formats: {}.".format(
u", ".join(self.supported_input_formats)
)
)
data_records = []
for path in paths:
extension = self._get_normalized_extension(path)
deserializer = self._deserializers.get(extension)
if deserializer is not None:
try:
with open(path, 'r') as content_file:
raw_content = content_file.read()
if decode:
data_records.append(
deserializer(raw_content, *args, **kwargs)
)
else:
data_records.append(raw_content)
except IOError as e:
if not skip_unreadable_files:
raise e
else:
e = errors.InvalidFileFormat(
path, self.supported_input_formats)
if not skip_unknown_files:
raise e
return self._merge_data_records(data_records)
def save(self, path, data, mode='w', *args, **kwargs):
"""Save data to given file path applying serializer.
:param path: full path with extension that will define serialization
format.
:type path: str
:param data: data to save
:type data: list|dict
:param mode: file write mode
:type mode: str|basestring
:raises: InvalidFileFormat
:raises: TypeError
:raises: yaml.YAMLError
:return: data
:rtype: list|dict
"""
extension = self._get_normalized_extension(path)
serializer = self._serializers.get(extension)
if serializer is not None:
serialized_data = serializer(data, *args, **kwargs)
create_dir(os.path.dirname(path))
with open(path, mode) as content_file:
content_file.write(serialized_data)
else:
raise errors.InvalidFileFormat(
path, self.supported_output_formats)
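A hedged usage sketch (the mask and paths are illustrative): loading with a glob mask merges every matching manifest, and saving picks the serializer from the destination extension:

    fm = FilesManager()
    # e.g. deployment_tasks.yaml and/or deployment_tasks.json
    tasks = fm.load('/plugin/deployment_tasks.*')
    fm.save('/plugin/build/deployment_tasks.yaml', tasks)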

View File

@ -0,0 +1,344 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import glob
import logging
import os
import shutil
import subprocess
import tarfile
from fuel_plugin_builder import errors
logger = logging.getLogger(__name__)
from distutils import dir_util
def copy(src, dst):
"""Copy a given file or directory from one place to another.
Overwrites already existing files.
:param src: copy from
:param dst: copy to
"""
logger.debug(u'Copy from %s to %s', src, dst)
if os.path.isdir(src):
# dir_util.copy_tree use here instead of shutil.copytree because
# it can overwrite existing folder and files. This is necessary
# for our template combinations, e.g.: base and v1
dir_util.copy_tree(src, dst, preserve_symlinks=True)
else:
shutil.copy(src, dst)
def copy_file_permissions(src, dst):
"""Copies file permissions
:param str src: source file
:param str dst: destination
"""
shutil.copymode(src, dst)
def copy_files_in_dir(src, dst):
"""Copies file in directory
:param str src: source files
:param str dst: destination directory
"""
logger.debug(u'Copy files in directory %s %s', src, dst)
for f in get_paths(src):
dst_path = os.path.join(dst, os.path.basename(f))
copy(f, dst_path)
def create_dir(dir_path):
"""Creates directory.
:param dir_path: directory path
:type dir_path: str
:raises: errors.DirectoryExistsError
"""
logger.debug(u'Creating directory %s', dir_path)
if not os.path.isdir(dir_path):
os.makedirs(dir_path)
def exec_piped_cmds(cmds, cwd=None):
"""Execute pipe of commands with logging.
:param cmds: list of shell commands
:type cmds: list
:param cwd: current working directory
:type cwd: string or None
"""
logger.debug(u'Executing commands "{0}"'.format(" | ".join(cmds)))
std_out = None
for cmd in cmds:
child = subprocess.Popen(
cmd,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
stdin=subprocess.PIPE,
shell=True,
cwd=cwd)
std_out, std_err = child.communicate(input=std_out)
exit_code = child.returncode
if exit_code != 0:
logger.debug(u'Stderr of command "{0}":'.format(cmd))
logger.debug(std_err)
raise errors.ExecutedErrorNonZeroExitCode(
u'Shell command executed with "{0}" '
u'exit code: {1} '.format(exit_code, cmd))
logger.debug(u'Stdout of command "{0}":'.format(" | ".join(cmds)))
logger.debug(std_out)
logger.debug(
u'Command "{0}" successfully executed'.format(" | ".join(cmds))
)
def make_tar_gz(dir_path, tar_path, files_prefix):
"""Compress the file in tar.gz archive.
:param dir_path: directory for archiving
:type dir_path: str
:param tar_path: the name and path to the file
:type tar_path: str
:param files_prefix: the directory in the tar files where all
of the files are allocated
:type files_prefix: str
"""
logger.debug(u'Archive directory %s to file %s', dir_path, tar_path)
tar = tarfile.open(tar_path, 'w:gz')
tar.add(dir_path, arcname=files_prefix)
tar.close()
def move_files_in_dir(src, dst):
"""Move files or directories.
:param src: source files or directories
:type src: str
:param dst: destination directory
:type dst: str
"""
logger.debug(u'Move files to directory %s %s', src, dst)
for f in get_paths(src):
dst_path = os.path.join(dst, os.path.basename(f))
shutil.move(f, dst_path)
def remove(path):
"""Remove file or directory.
:param path: a file or directory to remove
:type path: str
"""
logger.debug(u'Removing "%s"', path)
if not os.path.lexists(path):
return
if os.path.isdir(path) and not os.path.islink(path):
shutil.rmtree(path)
else:
os.remove(path)
def remove_by_mask(mask):
"""Deletes files by mask.
:param mask: files mask
:type mask: str
"""
logger.debug(u'Remove files by mask %s', mask)
for f in get_paths(mask):
remove(f)
def get_paths(path_mask):
"""Returns glob(bed) files list.
:param path_mask: glob path mask
:type path_mask: str
:return: list of paths
:rtype: list[str]
"""
return glob.glob(path_mask)
def is_exists(path):
"""Checks if path is exists.
:param path: path to the file
:type path: str
:returns: True if file is exist, Flase if is not
"""
return os.path.lexists(path)
def is_file(path):
"""Checks if path is file.
:param path: path
:type path: str
:returns: True if given path is a file, False if it is not
:rtype: bool
"""
return os.path.isfile(path)
def is_dir(path):
"""Checks if path is directory.
:param path: path
:type path: str
:returns: True if given path is a directory, False if it is not
:rtype: bool
"""
return os.path.isdir(path)
def is_executable(file_path):
"""Checks if file executable.
:param file_path: path to the file
:type file_path: str
:returns: True if file is executable, False if is not
:rtype: bool
"""
return os.path.isfile(file_path) and os.access(file_path, os.X_OK)
def which(cmd):
"""Checks cmd location.
:param cmd: the name of the command or path
:type cmd: str
:returns: None if there is no such command, otherwise
the path to the command
:rtype: None|str
"""
file_path, file_name = os.path.split(cmd)
if file_path:
if is_executable(cmd):
return cmd
for path in os.environ['PATH'].split(os.pathsep):
exe_file = os.path.join(path, cmd)
if is_executable(exe_file):
return exe_file
return None
def basename(path):
"""Basename for path
:param str path: path to the file
:returns: str with filename
"""
return os.path.basename(path)
def files_in_path(path, follow_links=False):
"""Walks dir and return list of found files or list with given path if
given path is not a folder.
:param follow_links: follow links while walking
:type follow_links: bool
:param path: path
:type path: str
:return: list of file paths
:rtype: list[str]
"""
matches = []
if os.path.exists(path):
if os.path.isdir(path):
for root, dir_names, file_names in os.walk(
path, followlinks=follow_links):
for filename in file_names:
matches.append(os.path.join(root, filename))
else:
matches.append(path)
return matches
def normalize_extension(extension):
"""Normalize extension.
examples:
> ".JSON" -> "json"
> ".yaml" -> "yaml"
> "CSV" -> "csv"
> "intact" -> "intact"
> "." -> InvalidFileExtension
> "" -> InvalidFileExtension
:param extension: extension
:type extension: str
:return: normalised extension
:rtype: str
"""
if extension:
if extension[0] == '.':
extension = extension[1:]
return extension.lower()
def get_path_without_extension(path):
"""Return path without extension.
Example:
> /var/config/template.yaml.mako -> /var/config/template.yaml
> /var/config/template.yaml -> /var/config/template
> /var/config/template -> /var/config/template
:param path: path
:type path: str
:return: path without extension
:rtype: str|None
"""
if path:
return os.path.splitext(path)[0]
else:
return None
def get_path_extension(path):
"""Get extensions from path.
:param path: path
:type path: str
:return: normalized extension
:rtype: str
"""
return normalize_extension(os.path.splitext(path)[1])
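A few behaviour checks for the path helpers above (a sketch; the directory path is hypothetical):

    assert normalize_extension('.JSON') == 'json'
    assert get_path_extension('/plugin/metadata.yaml') == 'yaml'
    assert get_path_without_extension(
        '/plugin/template.yaml.mako') == '/plugin/template.yaml'
    # files_in_path() flattens a directory tree into a list of files.
    for file_path in files_in_path('/plugin'):
        print(file_path)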

View File

@ -0,0 +1,370 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import datetime
import json
import traceback
import six
import yaml
from fuel_plugin_builder.utils.data_structures import Enum
class DataWithReport(object):
"""Incapsulates result list/dict and report as property."""
def __init__(self, seq=None, report=None, **kwargs):
"""Initialize DataWithReport.
:param seq: initial content of the data structure
:type seq: iterable|None
:param report: report node
:type report: ReportNode|None
:returns: extended list or dict
:rtype: DictResultWithReport|ListResultWithReport
"""
super(DataWithReport, self).__init__(seq, **kwargs)
self.report = report or ReportNode(u'No report provided')
class DictResultWithReport(DataWithReport, dict):
pass
class ListResultWithReport(DataWithReport, list):
pass
class TextReportConfig(object):
indent_size = 4
failure_pointer = '> '
line_delimiter = '\n'
def __init__(self, **kwargs):
# update only already defined attributes
for k in kwargs:
if hasattr(self, k) and kwargs.get(k, None) is not None:
self.__setattr__(k, kwargs[k])
class ReportNode(object):
"""Basic unit of Reports tree.
Any ReportNode could be rendered as report with all children tree.
"""
REPORT_LEVELS = Enum(
'error',
'warning',
'info',
'debug'
)
RENDER_FORMATS = Enum(
'text',
'json',
'yaml'
)
text_report_config = TextReportConfig()
# Size of the new level text indent when rendering report
text = None
level = None
children = None
time = None
failed = False
@property
def _renderers(self):
return {
self.RENDER_FORMATS.text: self._render_text,
self.RENDER_FORMATS.yaml: self._render_yaml,
self.RENDER_FORMATS.json: self._render_json
}
def __init__(self,
text=None,
level=None,
children=None,
time=None,
failed=None):
"""Basic unit of report tree.
:param text: node text
:type text: str|basestring
:param level: message level
:type level: str|basestring
:param children: list of child ReportNodes
:type children: list[ReportNode]
:param time: override node creation time
:type time: datetime.datetime
:param failed: failure flag that affects rendering
:type failed: boolean
"""
self.text = self._format_message_content(text)
self.time = time or datetime.datetime.now()
self.children = children if children is not None else []
self.level = level or self.level
if self.level == self.REPORT_LEVELS.error:
self.failed = True
if failed is not None:
self.failed = failed
def _render_json(self, depth=0, *args, **kwargs):
next_level = depth + 1
result = {}
if self.text:
result['text'] = self.text
if self.level:
result['level'] = self.level
if self.time:
result['time'] = self.time
if len(self.children):
result['children'] = [
child._render_json(next_level, *args, **kwargs)
for child in self.children]
if depth > 0:
return result
else:
return json.dumps(result, *args, **kwargs)
def _render_yaml(self, depth=0, *args, **kwargs):
next_level = depth + 1
result = {}
if self.text:
result['text'] = self.text
if self.level:
result['level'] = self.level
if self.time:
result['time'] = self.time
if len(self.children):
result['children'] = [
child._render_yaml(next_level, *args, **kwargs)
for child in self.children]
if depth > 0:
return result
else:
return yaml.safe_dump(result, *args, **kwargs)
def _render_text(self, depth=None, config=None):
config = config if config else self.text_report_config
indent = config.indent_size * (depth or 0) * ' '
error_indent_size = max(
len(indent) - len(config.failure_pointer),
0
)
error_indent = error_indent_size * ' '
lines = []
failed = self.failed
for child in self.children:
child_lines = child._render_text(
0 if depth is None else depth + 1,
config=config)
lines.extend(child_lines)
def _make_level_string(string):
return '{}: '.format(string.upper()) if string else ''
if self.text or self.level:
output = '{indent}{pointer}{text}'.format(
indent=error_indent if failed else indent,
pointer=config.failure_pointer if failed else '',
text='{level}{text}'.format(
level=_make_level_string(self.level),
text=self.text or ''
)
)
lines.insert(0, output)
if depth is None:
return config.line_delimiter.join(lines)
else:
return lines
@staticmethod
def _format_message_content(msg_or_exc):
if not msg_or_exc:
return msg_or_exc
if isinstance(msg_or_exc, six.string_types):
return msg_or_exc
elif isinstance(msg_or_exc, Exception):
tb = traceback.format_exc()
return msg_or_exc.message or repr(msg_or_exc) + (tb or '')
else:
return "{}".format(msg_or_exc)
def _attach_message(self, msg_or_exc, level, *args, **kwargs):
self.add_nodes(
ReportNode(self._format_message_content(msg_or_exc), level)
)
self.add_nodes(
*(
ReportNode(arg, level=level)
for arg in args
)
)
self.add_nodes(
*(
ReportNode(u'{}: {}'.format(key, kwargs[key]))
for key in kwargs
)
)
return self
def add_nodes(self, *nodes):
"""Add single node or several nodes.
:param nodes: one or several report nodes
:type nodes: list[ReportNode]
:raises: InspectionConfigurationError
"""
for node in nodes:
self.children.append(node)
return self
def error(self, msg_or_exc, *args, **kwargs):
"""Add child ReportNode with error message.
:param msg_or_exc: message or exception
:type msg_or_exc: str|basestring|Exception
:return: self
:rtype: ReportNode
"""
return self._attach_message(
msg_or_exc=msg_or_exc,
level=self.REPORT_LEVELS.error,
*args, **kwargs
)
def warning(self, msg_or_exc, *args, **kwargs):
"""Add child ReportNode with warning message.
:param msg_or_exc: message or exception
:type msg_or_exc: str|basestring|Exception
:return: self
:rtype: ReportNode
"""
return self._attach_message(
msg_or_exc=msg_or_exc,
level=self.REPORT_LEVELS.warning,
*args, **kwargs
)
def warn(self, msg_or_exc, *args, **kwargs):
"""Alias to warning."""
return self.warning(msg_or_exc, *args, **kwargs)
def info(self, msg_or_exc, *args, **kwargs):
"""Add child ReportNode with info message.
:param msg_or_exc: message or exception
:type msg_or_exc: str|basestring|Exception
:return: self
:rtype: ReportNode
"""
return self._attach_message(
msg_or_exc=msg_or_exc,
level=self.REPORT_LEVELS.info,
*args, **kwargs
)
def render(
self,
output_format=RENDER_FORMATS.text,
add_summary=True,
*args, **kwargs
):
"""Render report tree to the text.
:param output_format: render format
text (default), json and yaml are supported.
:type output_format: str|basestring
:param add_summary: include statistics and result
:type add_summary: bool
:return: report strings
:rtype: str|basestring
"""
root_node = ReportNode(children=[self])
if add_summary:
summary_node = ReportNode(u'Summary:')
fail_count = self.count_failures()
if fail_count:
summary_node.info(
u'Failure!')
summary_node.info(
u'Please fix {} errors listed above.'.format(fail_count))
else:
summary_node.info(u'Success!')
root_node.add_nodes(summary_node)
return root_node._renderers[output_format](*args, **kwargs)
def count_failures(self, start_from=0):
"""Count failure messages inside report.
:param start_from: start count from
:type start_from: int
:return: errors count
:rtype: int
"""
count = start_from
if self.failed:
count += 1
for child in self.children:
count = child.count_failures(count)
return count
def is_failed(self):
"""Is this report about failure.
:return: is failed
:rtype: boolean
"""
return bool(self.count_failures())
def is_successful(self):
"""Is this report OK.
:return: is successful
:rtype: boolean
"""
return not bool(self.count_failures())
def mix_to_data(self, data):
"""Replace data with reported data with .report attribute
:param data: list|dict
:return: data with report
:rtype: DataWithReport|ListResultWithReport|DictResultWithReport
"""
if isinstance(data, list):
return ListResultWithReport(data, self)
else:
return DictResultWithReport(data, self)
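A minimal sketch of the intended reporting flow (the messages are illustrative):

    root = ReportNode(u'Validating plugin')
    root.info(u'metadata.yaml found')
    root.error(u'tasks.yaml is malformed')
    print(root.render())          # text tree plus a Summary section
    assert root.count_failures() == 1
    assert root.is_failed()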

View File

@ -0,0 +1,24 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
def make_schema(required, properties):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': required,
'properties': properties
}
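A small sketch of composing a draft-04 schema with this helper (the field names are illustrative):

    metadata_schema = make_schema(
        required=['name', 'version'],
        properties={
            'name': {'type': 'string'},
            'version': {'type': 'string'},
        }
    )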

View File

@ -0,0 +1,54 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import logging
import subprocess
from fuel_plugin_builder import errors
logger = logging.getLogger(__name__)
def exec_cmd(cmd, cwd=None):
"""Execute command with logging.
Output of STDOUT and STDERR will be written
to the log.
:param cmd: shell command
:type cmd: str|basestring
:param cwd: string or None
:type cwd: str|basestring|None
"""
logger.debug(u'Execute command "{0}"'.format(cmd))
child = subprocess.Popen(
cmd, stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
shell=True,
cwd=cwd)
logger.debug(u'Stdout and stderr of command "{0}":'.format(cmd))
for line in child.stdout:
logger.debug(line.rstrip())
child.wait()
exit_code = child.returncode
if exit_code != 0:
raise errors.ExecutedErrorNonZeroExitCode(
u'Shell command executed with "{0}" '
'exit code: {1} '.format(exit_code, cmd))
logger.debug(u'Command "{0}" successfully executed'.format(cmd))
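An illustrative call (the command and cwd are placeholders); a non-zero exit raises ExecutedErrorNonZeroExitCode:

    try:
        exec_cmd('tar -czf plugin.tar.gz .', cwd='/tmp/build')
    except errors.ExecutedErrorNonZeroExitCode as exc:
        logger.error(exc)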

View File

@ -0,0 +1,83 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import io
import logging
import os
from mako.template import Template
from fuel_plugin_builder import consts
from fuel_plugin_builder.utils.fs import copy_file_permissions
from fuel_plugin_builder.utils.fs import remove
logger = logging.getLogger(__name__)
def render_files_in_dir(dir_path, params):
"""Renders all *.mako files and removes templates
:param str dir_path: path to the directory
:param dict params: parameters for rendering
"""
for root, _, files in os.walk(dir_path):
for file_path in files:
name, extension = os.path.splitext(file_path)
if not extension == '.mako':
continue
src_path = os.path.join(root, file_path)
dst_path = os.path.join(root, name)
load_template_and_render_to_file(src_path, dst_path, params)
copy_file_permissions(src_path, dst_path)
remove(src_path)
def render_template_file(src, **context):
"""Render Mako template to string.
:param src: path to template
:type src: str
:param context: template engine context
:type context: list|dict|None
:return: string
:rtype: str
"""
with io.open(src, 'r', encoding=consts.DEFAULT_ENCODING) as f:
template_file = f.read()
rendered_file_content = Template(template_file).render(**context)
return rendered_file_content
def load_template_and_render_to_file(src, dst, context):
"""Render Mako template and write it to specified file.
:param src: path to template
:type src: str
:param dst: path where rendered template will be saved
:type dst: str
:param context: template engine context
:type context: list|dict|None
"""
logger.debug(u'Render template from {0} to {1} with params: {2}'.format(
src, dst, context))
with io.open(src, 'r', encoding=consts.DEFAULT_ENCODING) as f:
template_file = f.read()
with io.open(dst, 'w', encoding=consts.DEFAULT_ENCODING) as f:
rendered_file = Template(template_file).render(**context)
f.write(rendered_file)
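A hedged example of rendering a Mako template (paths and context keys are hypothetical):

    content = render_template_file(
        '/plugin/metadata.yaml.mako', plugin_name='my_plugin')
    load_template_and_render_to_file(
        '/plugin/metadata.yaml.mako',
        '/plugin/metadata.yaml',
        {'plugin_name': 'my_plugin'})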

View File

@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@ -14,8 +14,10 @@
# License for the specific language governing permissions and limitations
# under the License.
import datetime
def get_current_year():
"""Returns current year
"""
return str(datetime.date.today().year)

View File

@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@ -14,15 +14,16 @@
# License for the specific language governing permissions and limitations
# under the License.
from distutils.version import StrictVersion
def strict_version(minimal_fuel_version):
return StrictVersion(minimal_fuel_version)
def version_split_name_rpm(version):
version_tuple = StrictVersion(version).version
major = '.'.join(map(str, version_tuple[0:2]))
minor = version
return major, minor
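Expected behaviour of the version helpers, as a quick sketch:

    assert strict_version('9.0') < strict_version('9.1')
    # (major, full version) pair used for RPM naming:
    assert version_split_name_rpm('9.1.0') == ('9.1', '9.1.0')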

View File

@ -14,10 +14,9 @@
# License for the specific language governing permissions and limitations
# under the License.
from .validator_base import ValidatorBase
from .validator_v1 import ValidatorV1
from .validator_v2 import ValidatorV2
from .validator_v3 import ValidatorV3
from .validator_v4 import ValidatorV4
from .validator_v5 import ValidatorV5

View File

@ -1,190 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright 2014 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import abc
import logging
import jsonschema
import six
from distutils.version import StrictVersion
from fuel_plugin_builder import errors
from fuel_plugin_builder import utils
from os.path import join as join_path
logger = logging.getLogger(__name__)
@six.add_metaclass(abc.ABCMeta)
class LegacyBaseValidator(object):
@abc.abstractproperty
def basic_version(self):
pass
def __init__(self, plugin_path, format_checker=jsonschema.FormatChecker):
self.plugin_path = plugin_path
self.format_checker = format_checker
def validate_schema(self, data, schema, file_path, value_path=None):
logger.debug(
'Start schema validation for %s file, %s', file_path, schema)
try:
jsonschema.validate(data, schema,
format_checker=self.format_checker)
except jsonschema.exceptions.ValidationError as exc:
raise errors.ValidationError(
self._make_error_message(exc, file_path, value_path))
def _make_error_message(self, exc, file_path, value_path):
if value_path is None:
value_path = []
if exc.absolute_path:
value_path.extend(exc.absolute_path)
if exc.context:
sub_exceptions = sorted(
exc.context, key=lambda e: len(e.schema_path), reverse=True)
sub_message = sub_exceptions[0]
value_path.extend(list(sub_message.absolute_path)[2:])
message = sub_message.message
else:
message = exc.message
error_msg = "File '{0}', {1}".format(file_path, message)
if value_path:
value_path = ' -> '.join(map(six.text_type, value_path))
error_msg = '{0}, {1}'.format(
error_msg, "value path '{0}'".format(value_path))
return error_msg
def validate_file_by_schema(self, schema, file_path,
allow_not_exists=False, allow_empty=False):
"""Validate file with given JSON schema.
:param schema: object dict
:type schema: object
:param file_path: path to the file
:type file_path: basestring
:param allow_not_exists: if true don't raise error on missing file
:type allow_not_exists: bool
:param allow_empty: allow file to contain no json
:type allow_empty: bool
:return:
"""
if not utils.exists(file_path):
if allow_not_exists:
logger.debug('No file "%s". Skipping check.', file_path)
return
else:
raise errors.FileDoesNotExist(file_path)
data = utils.parse_yaml(file_path)
if data is not None:
self.validate_schema(data, schema, file_path)
else:
if not allow_empty:
raise errors.FileIsEmpty(file_path)
@abc.abstractmethod
def validate(self):
"""Performs validation
"""
def check_schemas(self):
logger.debug('Start schema checking "%s"', self.plugin_path)
self.validate_file_by_schema(
self.schema.metadata_schema,
self.meta_path)
self.validate_file_by_schema(
self.schema.tasks_schema,
self.tasks_path)
self.check_env_config_attrs()
def check_env_config_attrs(self):
"""Check attributes in environment config file.
'attributes' is not required field, but if it's
present it should contain UI elements OR metadata
structure.
"""
config = utils.parse_yaml(self.env_conf_path)
if not config:
return
self.validate_schema(
config,
self.schema.attr_root_schema,
self.env_conf_path)
attrs = config.get('attributes', {})
for attr_id, attr in six.iteritems(attrs):
schema = self.schema.attr_element_schema
# Metadata object is totally different
# from the others, we have to set different
# validator for it
if attr_id == 'metadata':
schema = self.schema.attr_meta_schema
self.validate_schema(
attr,
schema,
self.env_conf_path,
value_path=['attributes', attr_id])
def check_releases_paths(self):
meta = utils.parse_yaml(self.meta_path)
for release in meta['releases']:
scripts_path = join_path(
self.plugin_path,
release['deployment_scripts_path'])
repo_path = join_path(
self.plugin_path,
release['repository_path'])
wrong_paths = []
for path in [scripts_path, repo_path]:
if not utils.exists(path):
wrong_paths.append(path)
if wrong_paths:
raise errors.ReleasesDirectoriesError(
'Cannot find directories {0} for release "{1}"'.format(
', '.join(wrong_paths), release))
def check_compatibility(self):
"""Json schema doesn't have any conditions, so we have
to make sure here, that this validation schema can be used
for described fuel releases
"""
meta = utils.parse_yaml(self.meta_path)
for fuel_release in meta['fuel_version']:
if StrictVersion(fuel_release) < StrictVersion(self.basic_version):
raise errors.ValidationError(
'Current plugin format {0} is not compatible with {2} Fuel'
' release. Fuel version must be {1} or higher.'
' Please remove {2} version from metadata.yaml file or'
' downgrade package_version.'
.format(
meta['package_version'],
self.basic_version,
fuel_release))

View File

@ -20,7 +20,7 @@ from sre_constants import error as sre_error
import jsonschema
import six
import errors
class FormatChecker(jsonschema.FormatChecker):

View File

@ -1,211 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from fuel_plugin_builder import consts
class BaseSchema(object):
@property
def plugin_release_schema(self):
return {
'type': 'object',
'required': ['version', 'os', 'mode'],
'properties': {
'version': {'type': 'string'},
'os': {'enum': ['ubuntu', 'centos']},
'deployment_scripts_path': {'type': 'string'},
'repository_path': {'type': 'string'},
'mode': {'type': 'array',
'items': {'enum': ['ha', 'multinode']}}}
}
@property
def condition(self):
return {'type': 'string'}
@property
def full_restriction(self):
return {
'type': 'object',
'required': ['condition'],
'properties': {
'condition': self.condition,
'message': {'type': 'string'},
'action': {'type': 'string'}}}
@property
def short_restriction(self):
return {
'type': 'object',
'minProperties': 1,
'maxProperties': 1}
@property
def restrictions(self):
return {
'type': 'array',
'minItems': 1,
'items': {
'anyOf': [
self.condition,
self.full_restriction,
self.short_restriction]}}
@property
def metadata_schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'plugin',
'type': 'object',
'required': [
'name',
'title',
'version',
'package_version',
'description',
'fuel_version',
'releases',
],
'properties': {
'name': {
'type': 'string',
'pattern': consts.PLUGIN_NAME_PATTERN},
'title': {'type': 'string'},
'version': {'type': 'string'},
'package_version': {'enum': ['1.0.0']},
'description': {'type': 'string'},
'fuel_version': self.list_of_strings,
'releases': {
'type': 'array',
'items': self.plugin_release_schema}}
}
@property
def list_of_strings(self):
return {'type': 'array',
'items': {'type': 'string'}}
@property
def positive_integer(self):
return {'type': 'integer', 'minimum': 0}
@property
def puppet_parameters(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['timeout', 'puppet_modules', 'puppet_manifest'],
'properties': {
'timeout': self.positive_integer,
'puppet_modules': {'type': 'string'},
'puppet_manifest': {'type': 'string'}}
}
@property
def shell_parameters(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['timeout', 'cmd'],
'properties': {
'timeout': self.positive_integer,
'cmd': {'type': 'string'}}
}
@property
def task_base_parameters(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['timeout'],
'properties': {
'timeout': self.positive_integer}
}
@property
def task_schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['parameters', 'type', 'stage', 'role'],
'properties': {
'type': {'enum': ['puppet', 'shell']},
'parameters': self.task_base_parameters,
'stage': {'enum': ['post_deployment', 'pre_deployment']},
'role': {
'oneOf': [
self.list_of_strings,
{'enum': ['*', 'master']}]}}
}
@property
def tasks_schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'array',
'items': self.task_schema
}
@property
def attr_element_schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['type', 'label', 'weight', 'value'],
'properties': {
'type': {'type': 'string'},
'weight': {'type': 'integer'},
'value': {'anyOf': [
{'type': 'string'},
{'type': 'boolean'},
{'type': 'object',
'properties': {'generator': {'type': 'string'}}},
{'type': 'array',
'items': {'anyOf': [{'type': 'string'},
{'type': 'boolean'}]}},
]},
'label': {'type': 'string'},
'restrictions': self.restrictions,
'values': {'type': 'array', 'items':
{'type': 'object',
'required': ['data', 'label'],
'properties': {
'data': {'type': 'string'},
'label': {'type': 'string'}}}}}
}
@property
def attr_meta_schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'properties': {
'label': {'type': 'string'},
'weight': {'type': 'integer'},
'toggleable': {'type': 'boolean'},
'enabled': {'type': 'boolean'},
'restrictions': self.restrictions}
}
@property
def attr_root_schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'properties': {
'attributes': {'type': 'object'}}
}

View File

@ -1,96 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright 2014 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from fuel_plugin_builder import consts
from fuel_plugin_builder.validators.schemas import BaseSchema
class SchemaV2(BaseSchema):
@property
def package_version(self):
return {'enum': ['2.0.0']}
@property
def metadata_schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'title': 'plugin',
'type': 'object',
'required': [
'name',
'title',
'version',
'package_version',
'description',
'fuel_version',
'licenses',
'authors',
'homepage',
'releases',
'groups'],
'properties': {
'name': {
'type': 'string',
'pattern': consts.PLUGIN_NAME_PATTERN},
'title': {'type': 'string'},
'version': {'type': 'string'},
'package_version': self.package_version,
'description': {'type': 'string'},
'fuel_version': self.list_of_strings,
'licenses': self.list_of_strings,
'authors': self.list_of_strings,
'groups': {'type': 'array', 'uniqueItems': True, 'items':
{'enum':
['network',
'storage',
'storage::cinder',
'storage::glance',
'hypervisor',
'monitoring']}},
'homepage': {'type': 'string'},
'releases': {
'type': 'array',
'items': self.plugin_release_schema}}
}
@property
def task_schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['parameters', 'type', 'stage', 'role'],
'properties': {
'type': {'enum': ['puppet', 'shell', 'reboot']},
'parameters': self.task_base_parameters,
'stage': {'type': 'string',
'pattern':
'^(post_deployment|pre_deployment)'
'(/[-+]?([0-9]*\.[0-9]+|[0-9]+))?$'},
'role': {
'oneOf': [
self.list_of_strings,
{'enum': ['*', 'master']}]}}
}
@property
def reboot_parameters(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['timeout'],
'properties': {'timeout': self.positive_integer}
}

View File

@ -1,393 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from fuel_plugin_builder.validators.schemas import SchemaV2
TASK_NAME_PATTERN = '^[0-9a-zA-Z_-]+$'
NETWORK_ROLE_PATTERN = '^[0-9a-z_-]+$'
FILE_PERMISSIONS_PATTERN = '^[0-7]{4}$'
class SchemaV3(SchemaV2):
@property
def task_role(self):
return {
'oneOf': [
self.task_group,
{'enum': ['*', 'master']}
]
}
@property
def task_group(self):
return {
'type': 'array',
'items': {
'type': 'string',
'pattern': TASK_NAME_PATTERN
}
}
@property
def rule(self):
return {
'type': ['string', 'integer']
}
@property
def override(self):
return {
'type': 'object',
'description': 'Property which can change limit recommended|min'
'|max properties due to some additional condition',
'required': ['condition'],
'properties': {
'condition': {'type': 'string'},
'max': self.rule,
'recommended': self.rule,
'min': self.rule,
'message': {'type': 'string'}
}
}
@property
def overrides(self):
return {
'type': 'array',
'description': 'Array of limit override properties',
'minItems': 1,
'items': self.override
}
@property
def limits(self):
return {
'type': 'object',
'description': 'Limits for count of nodes for node role',
'properties': {
'condition': self.condition,
'max': self.rule,
'recommended': self.rule,
'min': self.rule,
'overrides': self.overrides
}
}
@property
def package_version(self):
return {'enum': ['3.0.0']}
@property
def puppet_task(self):
return {
'type': 'object',
'properties': {
'type': {'enum': ['puppet']},
'groups': self.task_group,
'role': self.task_role,
'parameters': {
'type': 'object',
'required': [
'puppet_manifest', 'puppet_modules', 'timeout'],
'properties': {
'puppet_manifest': {
'type': 'string',
'minLength': 1},
'puppet_modules': {
'type': 'string',
'minLength': 1},
'timeout': {
'type': 'integer'},
'retries': {
'type': 'integer'},
}
}
}
}
@property
def shell_task(self):
return {
'type': 'object',
'required': ['role'],
'properties': {
'type': {'enum': ['shell']},
'role': self.task_role,
'parameters': {
'type': 'object',
'required': ['cmd'],
'properties': {
'cmd': {
'type': 'string'},
'retries': {
'type': 'integer'},
'interval': {
'type': 'integer'},
'timeout': {
'type': 'integer'}}}}
}
@property
def group_task(self):
return {
'type': 'object',
'required': ['role'],
'properties': {
'type': {'enum': ['group']},
'role': self.task_role,
'parameters': {
'type': 'object',
'properties': {
'strategy': {
'type': 'object',
'properties': {
'type': {
'enum': ['parallel', 'one_by_one']}}}}}}
}
@property
def skipped_task(self):
return {
'type': 'object',
'properties': {
'type': {'enum': ['skipped']}}
}
@property
def copy_files(self):
return {
'type': 'object',
'required': ['role', 'parameters'],
'properties': {
'type': {'enum': ['copy_files']},
'role': self.task_role,
'parameters': {
'type': 'object',
'required': ['files'],
'properties': {
'files': {
'type': 'array',
'minItems': 1,
'items': {
'type': 'object',
'required': ['src', 'dst'],
'properties': {
'src': {'type': 'string'},
'dst': {'type': 'string'}}}},
'permissions': {
'type': 'string',
'pattern': FILE_PERMISSIONS_PATTERN},
'dir_permissions': {
'type': 'string',
'pattern': FILE_PERMISSIONS_PATTERN}}}}
}
@property
def sync(self):
return {
'type': 'object',
'required': ['role', 'parameters'],
'properties': {
'type': {'enum': ['sync']},
'role': self.task_role,
'parameters': {
'type': 'object',
'required': ['src', 'dst'],
'properties': {
'src': {'type': 'string'},
'dst': {'type': 'string'},
'timeout': {'type': 'integer'}}}}
}
@property
def upload_file(self):
return {
'type': 'object',
'required': ['role', 'parameters'],
'properties': {
'type': {'enum': ['upload_file']},
'role': self.task_role,
'parameters': {
'type': 'object',
'required': ['path', 'data'],
'properties': {
'path': {'type': 'string'},
'data': {'type': 'string'}}}}
}
@property
def stage(self):
return {
'type': 'object',
'properties': {
'type': {'enum': ['stage']}}
}
@property
def reboot(self):
return {
'type': 'object',
'properties': {
'type': {'enum': ['reboot']},
'parameters': {
'type': 'object',
'properties': {
'timeout': {'type': 'integer'}}}}
}
@property
def deployment_task_schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'array',
'items': {
'type': 'object',
'required': ['id', 'type'],
'properties': {
'id': {
'type': 'string',
'pattern': TASK_NAME_PATTERN},
'type': {
'enum': [
'puppet',
'shell',
'group',
'skipped',
'copy_files',
'sync',
'upload_file',
'stage',
'reboot']},
'required_for': self.task_group,
'requires': self.task_group}}
}
@property
def network_roles_schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'array',
'items': {
'type': 'object',
'required': ['id', 'default_mapping', 'properties'],
'properties': {
'id': {'type': 'string'},
'default_mapping': {'type': 'string'},
'properties': {
'type': 'object',
'required': ['subnet', 'gateway', 'vip'],
'properties': {
'subnet': {'type': 'boolean'},
'gateway': {'type': 'boolean'},
'vip': {
'type': 'array',
'items': {
'type': 'object',
'required': ['name'],
'properties': {
'name': {
'type': 'string',
'pattern': NETWORK_ROLE_PATTERN},
'namespace': {
'type': 'string',
'pattern': NETWORK_ROLE_PATTERN}
}}}}}}}
}
@property
def node_roles_schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'patternProperties': {
'^[0-9a-zA-Z_-]+$': {
'type': 'object',
'required': ['name', 'description'],
'properties': {
'name': {
'type': 'string',
'description': 'Name that will be shown on UI'},
'description': {
'type': 'string',
'description': ('Short description of role'
' functionality')},
'conflicts': {
'oneOf': [
self.list_of_strings,
{'type': 'string', 'enum': ['*']}]},
                    'has_primary': {
                        'type': 'boolean',
                        'description': ('During orchestration this role'
                                        ' will be split into'
                                        ' primary-role and role.')},
                    'public_ip_required': {
                        'type': 'boolean',
                        'description': ('Specify if the role needs a'
                                        ' public IP address.')},
'update_required': self.list_of_strings,
'update_once': self.list_of_strings,
                    'weight': {
                        'type': 'integer',
                        'description': ('Specify the weight that will be'
                                        ' used to order the roles'
                                        ' on the Fuel web UI')},
'limits': self.limits,
'restrictions': self.restrictions}}},
'additionalProperties': False
}
@property
def volume_schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['volumes_roles_mapping', 'volumes'],
'properties': {
'volumes_roles_mapping': {
'type': 'object',
'patternProperties': {
TASK_NAME_PATTERN: {
'type': 'array',
'minItems': 1,
'items': {
'type': 'object',
'description': 'Volume allocations for role',
'required': ['allocate_size', 'id'],
'properties': {
'allocate_size': {
'type': 'string',
'enum': ['all', 'min', 'full-disk']},
'id': {'type': 'string'}}}}},
'additionalProperties': False},
'volumes': {
'type': 'array',
'items': {
'type': 'object',
'required': ['id', 'type'],
'properties': {
'id': {
'type': 'string'},
'type': {
'type': 'string'}}}}}
}
@property
def task_base_parameters(self):
schema = super(SchemaV3, self).task_base_parameters
schema['properties']['retries'] = self.positive_integer
return schema


@ -1,423 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright 2015 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import six
from fuel_plugin_builder.validators.schemas import SchemaV3
COMPONENTS_TYPES_STR = '|'.join(
['hypervisor', 'network', 'storage', 'additional_service'])
COMPONENT_NAME_PATTERN = \
'^({0}):([0-9a-z_-]+:)*[0-9a-z_-]+$'.format(COMPONENTS_TYPES_STR)
COMPATIBLE_COMPONENT_NAME_PATTERN = \
    r'^({0}):([0-9a-z_-]+:)*([0-9a-z_-]+|(\*)?)$'.format(COMPONENTS_TYPES_STR)
TASK_NAME_PATTERN = TASK_ROLE_PATTERN = r'^[0-9a-zA-Z_-]+$|^\*$'
NETWORK_ROLE_PATTERN = r'^[0-9a-z_-]+$'
FILE_PERMISSIONS_PATTERN = r'^[0-7]{4}$'
TASK_VERSION_PATTERN = r'^\d+\.\d+\.\d+$'
STAGE_PATTERN = r'^(post_deployment|pre_deployment)' \
                r'(/[-+]?([0-9]*\.[0-9]+|[0-9]+))?$'
ROLE_ALIASES = ('roles', 'groups', 'role')
TASK_OBLIGATORY_FIELDS = ['id', 'type']
ROLELESS_TASKS = ('stage',)
class SchemaV4(SchemaV3):
def __init__(self):
super(SchemaV4, self).__init__()
self.role_pattern = TASK_ROLE_PATTERN
self.roleless_tasks = ROLELESS_TASKS
self.role_aliases = ROLE_ALIASES
@property
def _node_resolve_policy(self):
return {
'type': 'string',
'enum': ['all', 'any']
}
@property
def _yaql_expression(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['yaql_exp'],
'properties': {
'yaql_exp': {'type': 'string'},
}
}
@property
def _task_relation(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['name'],
'properties': {
'name': {
'oneOf': [
{'type': 'string'},
self._yaql_expression],
},
'role': {
'oneOf': [
{'type': 'string'},
{'type': 'array'},
self._yaql_expression]
},
'policy': self._node_resolve_policy,
}
}
@property
def _task_role(self):
return {
'oneOf': [
{
'type': 'string',
'format': 'fuel_task_role_format'
},
{
'type': 'array',
'items': {
'type': 'string',
'format': 'fuel_task_role_format'
}
}
]
}
@property
def _task_strategy(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': ['type'],
'properties': {
'type': {
'type': 'string',
'enum': ['parallel', 'one_by_one']},
'amount': {
'oneOf': [
{'type': 'integer'},
self._yaql_expression
]
}
}
}
@property
def _task_stage(self):
return {'type': 'string', 'pattern': STAGE_PATTERN}
@property
def _task_reexecute(self):
return {
'type': 'array',
'items': {
'type': 'string',
'enum': ['deploy_changes']
}
}
def _gen_task_schema(self, task_types, required=None,
parameters=None):
"""Generate deployment task schema using prototype.
:param task_types: task types
:type task_types: str|list
:param required: new required fields
:type required: list
:param parameters: new properties dict
:type parameters: dict
:return:
:rtype: dict
"""
if not task_types:
raise ValueError('Task type should not be empty')
if isinstance(task_types, six.string_types):
task_types = [task_types]
# patch strategy parameter
parameters = parameters or {
"type": "object",
}
parameters.setdefault("properties", {})
parameters["properties"].setdefault("strategy", self._task_strategy)
task_specific_req_fields = list(set(TASK_OBLIGATORY_FIELDS +
(required or [])))
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'object',
'required': task_specific_req_fields,
'properties': {
'type': {'enum': task_types},
'id': {
'type': 'string',
'pattern': TASK_NAME_PATTERN},
'version': {
'type': 'string', "pattern": TASK_VERSION_PATTERN},
'role': self._task_role,
'groups': self._task_role,
'roles': self._task_role,
'required_for': self.task_group,
'requires': self.task_group,
'cross-depends': {
'oneOf': [
{'type': 'array', 'items': self._task_relation},
self._yaql_expression]
},
'cross-depended-by': {
'oneOf': [
{'type': 'array', 'items': self._task_relation},
self._yaql_expression]
},
'stage': self._task_stage,
'tasks': { # used only for 'group' tasks
'type': 'array',
'items': {
'type': 'string',
'pattern': TASK_ROLE_PATTERN}},
'reexecute_on': self._task_reexecute,
'parameters': parameters,
},
}
@property
def deployment_task_schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'array',
'items': {
"$ref": "#/definitions/anyTask"
},
"definitions": {
"anyTask": self._gen_task_schema(
[
'copy_files',
'group',
'reboot',
'shell',
'skipped',
'stage',
'sync',
'puppet',
'upload_file',
]
)
}
}
@property
def copy_files_task(self):
return self._gen_task_schema(
"copy_files",
['parameters'],
{
'type': 'object',
'required': ['files'],
'properties': {
'files': {
'type': 'array',
'minItems': 1,
'items': {
'type': 'object',
'required': ['src', 'dst'],
'properties': {
'src': {'type': 'string'},
'dst': {'type': 'string'}}}},
'permissions': {
'type': 'string',
'pattern': FILE_PERMISSIONS_PATTERN},
'dir_permissions': {
'type': 'string',
'pattern': FILE_PERMISSIONS_PATTERN}}})
@property
def group_task(self):
return self._gen_task_schema("group", [])
@property
def puppet_task(self):
return self._gen_task_schema(
"puppet",
[],
{
'type': 'object',
'required': [
'puppet_manifest', 'puppet_modules', 'timeout'],
'properties': {
'puppet_manifest': {
'type': 'string', 'minLength': 1},
'puppet_modules': {
'type': 'string', 'minLength': 1},
'timeout': {'type': 'integer'},
'retries': {'type': 'integer'}
}
}
)
@property
def reboot_task(self):
return self._gen_task_schema(
"reboot",
[],
{
'type': 'object',
'properties': {
'timeout': {'type': 'integer'}
}
}
)
@property
def shell_task(self):
return self._gen_task_schema(
"shell",
[],
{
'type': 'object',
'required': ['cmd'],
'properties': {
'cmd': {
'type': 'string'},
'retries': {
'type': 'integer'},
'interval': {
'type': 'integer'},
'timeout': {
'type': 'integer'}
}
}
)
@property
def skipped_task(self):
return self._gen_task_schema("skipped")
@property
def stage_task(self):
return self._gen_task_schema("stage")
@property
def sync_task(self):
return self._gen_task_schema(
"sync",
['parameters'],
{
'type': 'object',
'required': ['src', 'dst'],
'properties': {
'src': {'type': 'string'},
'dst': {'type': 'string'},
'timeout': {'type': 'integer'}
}
}
)
@property
def upload_file_task(self):
return self._gen_task_schema(
"upload_file",
['parameters'],
{
'type': 'object',
'required': ['path', 'data'],
'properties': {
'path': {'type': 'string'},
'data': {'type': 'string'}
}
}
)
@property
def package_version(self):
return {'enum': ['4.0.0']}
@property
def metadata_schema(self):
schema = super(SchemaV4, self).metadata_schema
schema['required'].append('is_hotpluggable')
schema['properties']['is_hotpluggable'] = {'type': 'boolean'}
schema['properties']['groups']['items']['enum'].append('equipment')
return schema
@property
def attr_root_schema(self):
schema = super(SchemaV4, self).attr_root_schema
schema['properties']['attributes']['properties'] = {
'metadata': {
'type': 'object',
'properties': {
'group': {
'enum': [
'general', 'security',
'compute', 'network',
'storage', 'logging',
'openstack_services', 'other'
]
}
}
}
}
return schema
@property
def components_items(self):
return {
'type': 'array',
'items': {
'type': 'object',
'required': ['name'],
'properties': {
'name': {
'type': 'string',
'pattern': COMPATIBLE_COMPONENT_NAME_PATTERN
},
'message': {'type': 'string'}
}
}
}
@property
def components_schema(self):
return {
'$schema': 'http://json-schema.org/draft-04/schema#',
'type': 'array',
'items': {
'required': ['name', 'label'],
'type': 'object',
'additionalProperties': False,
'properties': {
'name': {
'type': 'string',
'pattern': COMPONENT_NAME_PATTERN
},
'label': {'type': 'string'},
'description': {'type': 'string'},
'compatible': self.components_items,
'requires': self.components_items,
'incompatible': self.components_items,
'bind': {'type': 'array'}
}
}
}
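
To make the prototype concrete, here is a hypothetical deployment task (all values invented) that the schema generated by `_gen_task_schema('puppet', ...)` above would accept; note the YAQL branch of `cross-depends` and the `strategy` key that the generator patches into `parameters`:

```python
# Hypothetical v4 deployment task (invented values). 'id' and 'type' come
# from TASK_OBLIGATORY_FIELDS; 'cross-depends' uses the yaql_exp branch of
# the oneOf; 'strategy' is the parameter injected by _gen_task_schema.
task = {
    'id': 'deploy-my-plugin',        # must match TASK_NAME_PATTERN
    'type': 'puppet',
    'version': '2.0.0',              # must match TASK_VERSION_PATTERN
    'roles': ['primary-controller'],
    'cross-depends': {'yaql_exp': '$.nodes.where($.status = "ready")'},
    'parameters': {
        'puppet_manifest': 'site.pp',
        'puppet_modules': 'modules',
        'timeout': 720,
        'strategy': {'type': 'parallel'},
    },
}
```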


@ -0,0 +1,127 @@
# -*- coding: utf-8 -*-
# Copyright 2014 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import logging
import six
from fuel_plugin_builder import checks
from fuel_plugin_builder import utils
logger = logging.getLogger(__name__)
class ValidatorBase(object):
"""Base Validator.
New ValidatorBase targeted to plugin package version >= 5.0.0 and using
Checks to describe custom logic and providing output based on
utils.ReportNode class.
Check is a basic logic unit that performing validations with given
parameters.
"""
package_version = '0.0.1'
minimal_fuel_version = '0.1'
_metadata_schema = None
_data_tree_schemas = {}
_data_tree_multi_schemas = {}
_data_tree_env_attributes = {}
def validate(self, data_tree):
"""Validate data tree and return report.
:param data_tree: plugin data tree starting from the metadata.yaml dict
:type data_tree: dict
:return: report
:rtype: utils.ReportNode
"""
report = utils.ReportNode('Validating data')
        # fixme(ikutukov): the compatibility check should be moved out of
        # the base class once plugin packages > 5.0.0 are defined
report.add_nodes(
checks.fuel_ver_compatible_with_package_ver(
self.minimal_fuel_version, data_tree
)
)
report.add_nodes(
self.check_data_tree_branches_schema(data_tree)
)
report.add_nodes(
self.check_data_tree_multi_branches_schema(data_tree)
)
report.add_nodes(
self.check_data_tree_env_attributes(data_tree)
)
return report
def check_data_tree_branches_schema(self, data_tree):
schema_check_report = utils.ReportNode('Checking schemas')
for branch_key, schema in \
six.iteritems(self._data_tree_schemas):
if branch_key:
if data_tree.get(branch_key):
report = utils.ReportNode(branch_key)
report.add_nodes(
checks.json_schema_is_valid(
schema,
data_tree[branch_key]
)
)
schema_check_report.add_nodes(report)
else:
report = utils.ReportNode('metadata')
report.add_nodes(
checks.json_schema_is_valid(schema, data_tree)
)
schema_check_report.add_nodes(report)
return schema_check_report
def check_data_tree_multi_branches_schema(self, data_tree):
schema_check_report = utils.ReportNode('Checking multi schemas')
for branch_key, multi_schema in \
six.iteritems(self._data_tree_multi_schemas):
if data_tree.get(branch_key):
report = utils.ReportNode(branch_key)
report.add_nodes(
checks.multi_json_schema_is_valid(
multi_schema,
data_tree[branch_key]
)
)
schema_check_report.add_nodes(report)
return schema_check_report
def check_data_tree_env_attributes(self, data_tree):
schema_check_report = utils.ReportNode(
'Checking env attributes schemas')
for branch_key, multi_schema in \
six.iteritems(self._data_tree_env_attributes):
if data_tree.get(branch_key):
report = utils.ReportNode(branch_key)
report.add_nodes(
checks.env_attributes(
data_tree.get(branch_key),
*multi_schema
)
)
schema_check_report.add_nodes(report)
return schema_check_report
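
A sketch of how a concrete validator is expected to build on this base: declare per-branch schemas and let `validate()` assemble the report. The schema references are taken from the validator files below; the `ReportNode` usage at the end is an assumption, since its helpers are not shown in this diff:

```python
# Sketch (assumptions flagged inline): wiring a validator onto ValidatorBase.
from fuel_plugin_builder import schemas
from fuel_plugin_builder.validators.validator_base import ValidatorBase

class MyValidator(ValidatorBase):
    package_version = '5.0.0'
    minimal_fuel_version = '9.1'
    _data_tree_schemas = {
        '': schemas.metadata_v9_1.schema,   # '' validates the whole data tree
        'tasks': schemas.task_v0_0_2.tasks,
    }

data_tree = {'name': 'my_plugin', 'package_version': '5.0.0'}  # toy metadata.yaml dict
report = MyValidator().validate(data_tree)
print(report)  # assumption: ReportNode renders itself as readable text
```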


@ -15,52 +15,31 @@
# under the License.
import logging
from os.path import join as join_path
from fuel_plugin_builder import utils
from fuel_plugin_builder.validators.base import LegacyBaseValidator
from fuel_plugin_builder.validators.schemas import SchemaV1
from fuel_plugin_builder import schemas
from fuel_plugin_builder.validators.validator_base import ValidatorBase
logger = logging.getLogger(__name__)
class ValidatorV1(LegacyBaseValidator):
class ValidatorV1(ValidatorBase):
package_version = '1.0.0'
minimal_fuel_version = '6.0'
schema = SchemaV1()
@property
def basic_version(self):
return '6.0'
def __init__(self, *args, **kwargs):
super(ValidatorV1, self).__init__(*args, **kwargs)
self.meta_path = join_path(self.plugin_path, 'metadata.yaml')
self.tasks_path = join_path(self.plugin_path, 'tasks.yaml')
self.env_conf_path = join_path(
self.plugin_path, 'environment_config.yaml')
def validate(self):
self.check_schemas()
self.check_tasks()
self.check_releases_paths()
self.check_compatibility()
def check_tasks(self):
"""Json schema doesn't have any conditions, so we have
to make sure here, that puppet task is really puppet
and shell task is correct too
"""
logger.debug('Start tasks checking "%s"', self.tasks_path)
tasks = utils.parse_yaml(self.tasks_path)
schemas = {
'puppet': self.schema.puppet_parameters,
'shell': self.schema.shell_parameters}
for idx, task in enumerate(tasks):
self.validate_schema(
task.get('parameters'),
schemas[task['type']],
self.tasks_path,
value_path=[idx, 'parameters'])
_data_tree_schemas = {
'': schemas.metadata_v6_0.schema,
'tasks': schemas.task_v0_0_2.tasks
}
_data_tree_multi_schemas = {
'tasks': {
'puppet': schemas.task_v0_0_0.puppet_task,
'shell': schemas.task_v0_0_0.shell_task
}
}
_data_tree_env_attributes = {
'environment_config': [
schemas.attributes_v6_1.attr_root,
schemas.attributes_v6_1.attr_element,
schemas.attributes_v6_1.attr_meta
]
}


@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
# Copyright 2014 Mirantis, Inc.
#
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@ -13,60 +13,21 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import logging
from os.path import join as join_path
from fuel_plugin_builder import utils
from fuel_plugin_builder.validators.base import LegacyBaseValidator
from fuel_plugin_builder.validators.schemas import SchemaV2
from fuel_plugin_builder import schemas
from fuel_plugin_builder.validators.validator_base import ValidatorBase
logger = logging.getLogger(__name__)
class ValidatorV2(LegacyBaseValidator):
schema = SchemaV2()
@property
def basic_version(self):
return '6.1'
def __init__(self, *args, **kwargs):
super(ValidatorV2, self).__init__(*args, **kwargs)
self.meta_path = join_path(self.plugin_path, 'metadata.yaml')
self.tasks_path = join_path(self.plugin_path, 'tasks.yaml')
self.env_conf_path = join_path(
self.plugin_path, 'environment_config.yaml')
def validate(self):
self.check_schemas()
self.check_tasks()
self.check_releases_paths()
self.check_compatibility()
def _parse_tasks(self):
return utils.parse_yaml(self.tasks_path)
def check_tasks(self):
"""Json schema doesn't have any conditions, so we have
to make sure here, that puppet task is really puppet,
shell or reboot tasks are correct too
"""
logger.debug('Start tasks checking "%s"', self.tasks_path)
tasks = self._parse_tasks()
if tasks is None:
return
schemas = {
'puppet': self.schema.puppet_parameters,
'shell': self.schema.shell_parameters,
'reboot': self.schema.reboot_parameters}
for idx, task in enumerate(tasks):
self.validate_schema(
task.get('parameters'),
schemas[task['type']],
self.tasks_path,
value_path=[idx, 'parameters'])
class ValidatorV2(ValidatorBase):
package_version = '2.0.0'
minimal_fuel_version = '6.1'
_data_tree_schemas = {
'': schemas.metadata_v6_1.schema,
'tasks': schemas.task_v0_0_2.tasks
}
_data_tree_multi_schemas = {
'tasks': {
'puppet': schemas.task_v1_0_0.puppet_task,
'shell': schemas.task_v1_0_0.shell_task,
'reboot': schemas.task_v1_0_0.reboot_task
}
}
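
The `_data_tree_multi_schemas` mapping implies per-item dispatch: each entry in a task list is validated against the schema matching its `type`. The real logic lives in `checks.multi_json_schema_is_valid`; the standalone sketch below assumes that behavior rather than reproducing the repo's code:

```python
# Standalone sketch of the dispatch assumed in checks.multi_json_schema_is_valid:
# pick the schema by each item's 'type' key and validate the item against it.
import jsonschema

def validate_tasks(schemas_by_type, tasks):
    for idx, task in enumerate(tasks):
        schema = schemas_by_type.get(task.get('type'))
        if schema is None:
            raise ValueError('Unknown task type at index {0}'.format(idx))
        jsonschema.validate(task, schema)
```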


@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
# Copyright 2015 Mirantis, Inc.
#
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@ -13,113 +13,37 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import logging
from os.path import join as join_path
from fuel_plugin_builder import errors
from fuel_plugin_builder import utils
from fuel_plugin_builder.validators.schemas import SchemaV3
from fuel_plugin_builder import schemas
from fuel_plugin_builder.validators import ValidatorV2
logger = logging.getLogger(__name__)
class ValidatorV3(ValidatorV2):
package_version = '3.0.0'
minimal_fuel_version = '7.0'
schema = SchemaV3()
def __init__(self, *args, **kwargs):
super(ValidatorV3, self).__init__(*args, **kwargs)
self.deployment_tasks_path = join_path(
self.plugin_path, 'deployment_tasks.yaml')
self.network_roles_path = join_path(
self.plugin_path, 'network_roles.yaml')
self.node_roles_path = join_path(
self.plugin_path, 'node_roles.yaml')
self.volumes_path = join_path(
self.plugin_path, 'volumes.yaml')
@property
def basic_version(self):
return '7.0'
def validate(self):
super(ValidatorV3, self).validate()
self.check_deployment_tasks()
def check_schemas(self):
logger.debug('Start schema checking "%s"', self.plugin_path)
self.validate_file_by_schema(
self.schema.metadata_schema,
self.meta_path)
self.validate_file_by_schema(
self.schema.tasks_schema,
self.tasks_path,
allow_not_exists=True
)
self.check_env_config_attrs()
self.check_deployment_tasks_schema()
self.check_network_roles_schema()
self.check_node_roles_schema()
self.check_volumes_schema()
def check_deployment_tasks_schema(self):
self.validate_file_by_schema(
self.schema.deployment_task_schema,
self.deployment_tasks_path)
def check_network_roles_schema(self):
self.validate_file_by_schema(
self.schema.network_roles_schema,
self.network_roles_path,
allow_not_exists=True)
def check_node_roles_schema(self):
self.validate_file_by_schema(
self.schema.node_roles_schema,
self.node_roles_path,
allow_not_exists=True)
def check_volumes_schema(self):
self.validate_file_by_schema(
self.schema.volume_schema,
self.volumes_path,
allow_not_exists=True)
def check_deployment_tasks(self):
logger.debug(
'Start deployment tasks checking "%s"',
self.deployment_tasks_path)
deployment_tasks = utils.parse_yaml(self.deployment_tasks_path)
schemas = {
'puppet': self.schema.puppet_task,
'shell': self.schema.shell_task,
'group': self.schema.group_task,
'skipped': self.schema.skipped_task,
'copy_files': self.schema.copy_files,
'sync': self.schema.sync,
'upload_file': self.schema.upload_file,
'stage': self.schema.stage,
'reboot': self.schema.reboot}
for idx, deployment_task in enumerate(deployment_tasks):
if deployment_task['type'] not in schemas:
                error_msg = 'There is no such task type: ' \
                            '{0}'.format(deployment_task['type'])
raise errors.ValidationError(error_msg)
self.validate_schema(
deployment_task,
schemas[deployment_task['type']],
self.deployment_tasks_path,
value_path=[idx])
def _parse_tasks(self):
if utils.exists(self.tasks_path):
tasks = utils.parse_yaml(self.tasks_path)
            # The tasks schema is not checked in check_schemas, so we
            # perform a manual check when parsing the tasks file
if tasks is None:
raise errors.FileIsEmpty(self.tasks_path)
return None
_data_tree_schemas = {
'': schemas.metadata_v7_0.schema,
'tasks': schemas.task_v0_0_2.tasks,
'deployment_tasks': schemas.task_v1_0_0.tasks,
'network_roles_metadata': schemas.network_roles_v7_0.schema,
'node_roles_metadata': schemas.node_roles_v7_0.schema,
'volumes_metadata': schemas.volumes_v7_0.schema
}
_data_tree_multi_schemas = {
'tasks': {
'puppet': schemas.task_v1_0_0.puppet_task,
'shell': schemas.task_v1_0_0.shell_task,
'reboot': schemas.task_v1_0_0.reboot_task
},
'deployment_tasks': {
'puppet': schemas.task_v1_0_0.puppet_task,
'shell': schemas.task_v1_0_0.shell_task,
'group': schemas.task_v1_0_0.group_task,
'skipped': schemas.task_v1_0_0.skipped_task,
'copy_files': schemas.task_v1_0_0.copy_files_task,
'sync': schemas.task_v1_0_0.sync_task,
'upload_file': schemas.task_v1_0_0.upload_file_task,
'stage': schemas.task_v1_0_0.stage_task,
'reboot': schemas.task_v1_0_0.reboot_task
}
}


@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
# Copyright 2015 Mirantis, Inc.
#
# Copyright 2016 Mirantis, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
@ -13,122 +13,45 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import logging
from os.path import join as join_path
from fuel_plugin_builder import errors
from fuel_plugin_builder import utils
from fuel_plugin_builder.validators.formatchecker import FormatChecker
from fuel_plugin_builder.validators.schemas import SchemaV4
from fuel_plugin_builder import schemas
from fuel_plugin_builder.validators import ValidatorV3
logger = logging.getLogger(__name__)
class ValidatorV4(ValidatorV3):
package_version = '4.0.0'
minimal_fuel_version = '8.0'
schema = SchemaV4()
def __init__(self, *args, **kwargs):
super(ValidatorV4, self).__init__(format_checker=FormatChecker(
role_patterns=[self.schema.role_pattern]), *args, **kwargs)
self.components_path = join_path(self.plugin_path, 'components.yaml')
@property
def basic_version(self):
return '8.0'
def check_metadata_schema(self):
self.validate_file_by_schema(
self.schema.metadata_schema,
self.meta_path,
allow_not_exists=True)
def check_tasks_schema(self):
self.validate_file_by_schema(
self.schema.tasks_schema,
self.tasks_path,
allow_not_exists=True,
allow_empty=True
)
def check_schemas(self):
logger.debug('Start schema checking "%s"', self.plugin_path)
self.check_metadata_schema()
self.check_tasks_schema()
self.check_env_config_attrs()
self.check_deployment_tasks_schema()
self.check_network_roles_schema()
self.check_node_roles_schema()
self.check_volumes_schema()
self.check_components_schema()
def check_components_schema(self):
self.validate_file_by_schema(self.schema.components_schema,
self.components_path,
allow_not_exists=True)
def check_deployment_tasks(self):
logger.debug(
'Start deployment tasks checking "%s"',
self.deployment_tasks_path)
deployment_tasks = utils.parse_yaml(self.deployment_tasks_path)
schemas = {
'puppet': self.schema.puppet_task,
'shell': self.schema.shell_task,
'group': self.schema.group_task,
'skipped': self.schema.skipped_task,
'copy_files': self.schema.copy_files_task,
'sync': self.schema.sync_task,
'upload_file': self.schema.upload_file_task,
'stage': self.schema.stage_task,
'reboot': self.schema.reboot_task}
for idx, deployment_task in enumerate(deployment_tasks):
if deployment_task['type'] not in schemas:
                error_msg = 'There is no such task type: ' \
                            '{0}'.format(deployment_task['type'])
raise errors.ValidationError(error_msg)
if deployment_task['type'] not in self.schema.roleless_tasks:
for role_alias in self.schema.role_aliases:
deployment_role = deployment_task.get(role_alias)
if deployment_role:
break
else:
                    logger.warn(
                        'Task {0} does not contain {1} fields. That may '
                        'lead to tasks being unassigned to nodes.'.format(
                            deployment_task['id'],
                            '/'.join(self.schema.role_aliases)))
self.validate_schema(
deployment_task,
schemas[deployment_task['type']],
self.deployment_tasks_path,
value_path=[idx])
def check_tasks(self):
"""Check legacy tasks.yaml."""
logger.debug('Start tasks checking "%s"', self.tasks_path)
if utils.exists(self.tasks_path):
# todo(ikutukov): remove self._check_tasks
tasks = utils.parse_yaml(self.tasks_path)
if tasks is None:
return
schemas = {
'puppet': self.schema.puppet_parameters,
'shell': self.schema.shell_parameters,
'reboot': self.schema.reboot_parameters}
for idx, task in enumerate(tasks):
self.validate_schema(
task.get('parameters'),
schemas[task['type']],
self.tasks_path,
value_path=[idx, 'parameters'])
else:
logger.debug('File "%s" doesn\'t exist', self.tasks_path)
_data_tree_schemas = {
'': schemas.metadata_v8_0.schema,
'tasks': schemas.task_v0_0_2.tasks,
'deployment_tasks': schemas.task_v2_0_0.tasks,
'network_roles_metadata': schemas.network_roles_v8_0.schema,
'node_roles_metadata': schemas.node_roles_v7_0.schema,
'volumes_metadata': schemas.volumes_v7_0.schema,
'components_metadata': schemas.components_v8_0.schema
}
_data_tree_multi_schemas = {
'tasks': {
'puppet': schemas.task_v1_0_0.puppet_task,
'shell': schemas.task_v1_0_0.shell_task,
'reboot': schemas.task_v1_0_0.reboot_task
},
'deployment_tasks': {
'puppet': schemas.task_v2_0_0.puppet_task,
'shell': schemas.task_v2_0_0.shell_task,
'group': schemas.task_v2_0_0.group_task,
'skipped': schemas.task_v2_0_0.skipped_task,
'copy_files': schemas.task_v2_0_0.copy_files_task,
'sync': schemas.task_v2_0_0.sync_task,
'upload_file': schemas.task_v2_0_0.upload_file_task,
'stage': schemas.task_v2_0_0.stage_task,
'reboot': schemas.task_v2_0_0.reboot_task
}
}
_data_tree_env_attributes = {
'environment_config': [
schemas.attributes_v8_0.attr_root,
schemas.attributes_v8_0.attr_element,
schemas.attributes_v8_0.attr_meta
]
}
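
The warning emitted by the removed `check_deployment_tasks` above boils down to one rule; restated as a standalone helper (hypothetical function name, same logic and message as the code shown):

```python
# Restatement of the v4 role-alias warning: a task that is not roleless
# must carry one of the role aliases, or it may never be assigned to nodes.
ROLE_ALIASES = ('roles', 'groups', 'role')
ROLELESS_TASKS = ('stage',)

def unassigned_task_warning(task):
    if task['type'] in ROLELESS_TASKS:
        return None
    if any(task.get(alias) for alias in ROLE_ALIASES):
        return None
    return ('Task {0} does not contain {1} fields. That may lead to '
            'tasks being unassigned to nodes.'
            .format(task['id'], '/'.join(ROLE_ALIASES)))
```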


@ -14,38 +14,81 @@
# License for the specific language governing permissions and limitations
# under the License.
from os.path import join as join_path
from fuel_plugin_builder.validators.schemas import SchemaV5
from fuel_plugin_builder.validators import ValidatorV4
from fuel_plugin_builder import checks
from fuel_plugin_builder import schemas
from fuel_plugin_builder import utils
from fuel_plugin_builder.validators.validator_base import ValidatorBase
class ValidatorV5(ValidatorV4):
class ValidatorV5(ValidatorBase):
package_version = '5.0.0'
minimal_fuel_version = '9.1'
schema = SchemaV5()
_data_tree_schemas = {
'': schemas.metadata_v9_1.schema,
'tasks': schemas.task_v0_0_2.tasks,
'deployment_tasks': schemas.task_v2_1_0.tasks,
'network_roles_metadata': schemas.network_roles_v8_0.schema,
'node_roles_metadata': schemas.node_roles_v7_0.schema,
'volumes_metadata': schemas.volumes_v7_0.schema,
'components_metadata': schemas.components_v8_0.schema,
'node_attributes_metadata': (
schemas.node_attributes_v9_1.node_attributes
),
'nic_attributes_metadata': (
schemas.node_attributes_v9_1.node_nic_attributes
),
'bond_attributes_metadata': (
schemas.node_attributes_v9_1.node_nic_attributes
)
}
_data_tree_multi_schemas = {
'tasks': {
'puppet': schemas.task_v1_0_0.puppet_task,
'shell': schemas.task_v1_0_0.shell_task,
'reboot': schemas.task_v1_0_0.reboot_task
},
'deployment_tasks': {
'puppet': schemas.task_v2_1_0.puppet_task,
'shell': schemas.task_v2_1_0.shell_task,
'group': schemas.task_v2_1_0.group_task,
'skipped': schemas.task_v2_1_0.skipped_task,
'copy_files': schemas.task_v2_1_0.copy_files_task,
'sync': schemas.task_v2_1_0.sync_task,
'upload_file': schemas.task_v2_1_0.upload_file_task,
'stage': schemas.task_v2_1_0.stage_task,
'reboot': schemas.task_v2_1_0.reboot_task,
'move_to_bootstrap': schemas.task_v2_2_0.move_to_bootstrap_task,
'master_shell': schemas.task_v2_2_0.master_shell_task,
'erase_node': schemas.task_v2_2_0.erase_node_task,
}
}
def __init__(self, *args, **kwargs):
super(ValidatorV5, self).__init__(*args, **kwargs)
self.bond_config_path = join_path(self.plugin_path, 'bond_config.yaml')
self.nic_config_path = join_path(self.plugin_path, 'nic_config.yaml')
self.node_config_path = join_path(self.plugin_path, 'node_config.yaml')
def validate(self, data_tree):
"""See ValidatorBase documentation."""
report = super(ValidatorV5, self).validate(data_tree)
@property
def basic_version(self):
return '9.0'
report.add_nodes(
checks.legacy_fuel_version(data_tree)
)
def check_schemas(self):
super(ValidatorV5, self).check_schemas()
self.check_node_attributes_schema()
self.check_interface_attributes_schema(self.bond_config_path)
self.check_interface_attributes_schema(self.nic_config_path)
report.add_nodes(
checks.mode_directive(data_tree)
)
def check_node_attributes_schema(self):
self.validate_file_by_schema(self.schema.node_attributes_schema,
self.node_config_path,
allow_not_exists=True)
# check releases schema
for release in data_tree.get('releases', []):
release_report = utils.ReportNode('Checking release:')
for graph in release.get('graphs', []):
release_report.info('Graph: "{}"'.format(
graph.get('type'))
)
release_report.add_nodes(
checks.json_schema_is_valid(
schema=schemas.graph_v9_1.graph,
data=graph
)
)
def check_interface_attributes_schema(self, file_path):
self.validate_file_by_schema(self.schema.node_nic_attributes_schema,
file_path,
allow_not_exists=True)
report.add_nodes(release_report)
return report
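
For reference, a hypothetical `data_tree` fragment showing the release graphs that the loop above validates against `schemas.graph_v9_1.graph`. Only `releases`, `graphs`, and the graph `type` key appear in the validator code; the other field names are assumptions:

```python
# Hypothetical metadata fragment; 'name' and 'tasks_path' are assumed
# fields, while 'releases', 'graphs' and graph 'type' come from the
# validator code above.
data_tree = {
    'releases': [{
        'graphs': [{
            'type': 'provisioning',
            'name': 'My provisioning graph',            # assumption
            'tasks_path': 'graphs/provisioning.yaml',   # assumption
        }],
    }],
}
```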
