Retire anvil

Removes all project contents (and leaves a README.rst
behind for future reference).

Kolla is a good replacement; people should likely switch
to that instead.

http://docs.openstack.org/developer/kolla/

Change-Id: I5e2bf5842af7a67949186ab41814e55ce03deb5f
Joshua Harlow 2016-12-20 13:21:04 -05:00
parent 703d5236ae
commit 649ab1fb3f
237 changed files with 16 additions and 22252 deletions

@@ -1,68 +0,0 @@
# Contributing to Anvil
## General
Anvil is written in python (we should be compatible with ``python >= 2.6``).
Anvil's official repository is located on GitHub at: https://github.com/openstack/anvil
Besides the master branch, which tracks the OpenStack ``trunk``, tags will be maintained for all OpenStack releases starting with `essex`.
The primary script in anvil is ``smithy``, which performs the bulk of the work for anvil's use cases (it acts as the main program entry-point).
A number of additional scripts can be found in the ``tools`` directory that may or may not be useful to you.
## Documentation
Please create documentation in the ``docs/`` folder, which will be synced with:
http://readthedocs.org/docs/anvil/
This will suffice until a more *official* documentation site can be made.
## Style
* Please attempt to follow [pep8] for all code submitted.
* Please also attempt to run [pylint] on all code submitted.
* Please also attempt to run the [yaml] validation if you adjust any [yaml] files in the `conf` directory.
## Environment Variables
* The ``OS_*`` environment variables should be the only ones used for all authentication to OpenStack clients as documented in the [CLI Auth] wiki page.
## Documentation Formats
Documentation should all be written in [markdown] or [rst]. Although GitHub does support other formats, it seems better to standardize on one of those.
## Style Commandments
1. Read http://www.python.org/dev/peps/pep-0008/
1. Read http://www.python.org/dev/peps/pep-0008/ again
1. Read on
### Overall
1. Put two newlines between top-level code (funcs, classes, etc)
1. Put one newline between methods in classes and anywhere else
1. Do not write "except:", use "except Exception:" at the very least
1. Include your name with TODOs as in "#TODO(termie)"
1. Do not name anything the same name as a built-in or reserved word
1. Do not use the '_' as a single character variable as it is used with
the [gettext] module and can lead to confusion if used for other purposes.
### Imports
1. Do not import objects, only modules (not strictly enforced)
1. Do not import more than one module per line
1. Do not make relative imports
1. Order your imports by the full module path
1. Organize your imports in lexical order (see the example below)
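A short sketch (not from the repository itself) that follows these commandments; ``sh.listdir`` is anvil's shell helper, everything else is illustrative:

```python
import sys

from anvil import shell as sh


def list_trace_files(trace_dir):
    # TODO(yourname): filter out non-trace files as well.
    return sorted(sh.listdir(trace_dir, files_only=True))


def main():
    print(list_trace_files(sys.argv[1]))
```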
[gettext]: http://docs.python.org/2/library/gettext.html
[CLI Auth]: http://wiki.openstack.org/CLIAuth
[yaml]: http://en.wikipedia.org/wiki/YAML
[pep8]: http://www.python.org/dev/peps/pep-0008/
[pylint]: http://pypi.python.org/pypi/pylint
[markdown]: http://daringfireball.net/projects/markdown/
[rst]: http://docutils.sourceforge.net/docs/user/rst/quickstart.html

LICENSE
@@ -1,176 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.

@@ -1,9 +0,0 @@
include README.rst
include HACKING.md
include requirements.txt
include test-requirements.txt
include LICENSE
exclude .gitignore
exclude .gitreview
global-exclude *.pyc

@@ -1,14 +1,20 @@
-We want more information!
-=========================
+=====
+Anvil
+=====
 
-Please check out: http://anvil.readthedocs.org.
+Dead
+----
 
-Licensing
-=========
+This project is no longer maintained.
 
-Anvil is licensed under the Apache License, Version 2.0 (the "License"); you
-may not use this file except in compliance with the License. You may obtain a
-copy of the License at http://www.apache.org/licenses/LICENSE-2.0
+The contents of this repository are still available in the Git
+source code management system. To see the contents of this
+repository before it reached its end of life, please check out the
+previous commit with "git checkout HEAD^1".
 
-Some tools are licensed under different terms; see tools/README.rst for
-more information.
+Contacts
+--------
+
+For any further questions, please email
+openstack-dev@lists.openstack.org or join #openstack-dev on
+Freenode.

@@ -1,15 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

@@ -1,254 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import os
import re
import sys
import time
import traceback as tb
import six
sys.path.insert(0, os.path.join(os.path.abspath(os.pardir)))
sys.path.insert(0, os.path.abspath(os.getcwd()))
from anvil import actions
from anvil import colorizer
from anvil import distro
from anvil import exceptions as excp
from anvil import log as logging
from anvil import opts
from anvil import origins as _origins
from anvil import persona
from anvil import pprint
from anvil import settings
from anvil import shell as sh
from anvil import utils
LOG = logging.getLogger()
SETTINGS_FILE = "/etc/anvil/settings.yaml"
ANVIL_DIRS = tuple(["/etc/anvil/", '/usr/share/anvil/'])
def run(args):
"""Starts the execution after args have been parsed and logging has been setup.
"""
LOG.debug("CLI arguments are:")
utils.log_object(args, logger=LOG, level=logging.DEBUG, item_max_len=128)
# Keep the old args around so we have the full set to write out
saved_args = dict(args)
action = args.pop("action", '').strip().lower()
if re.match(r"^moo[o]*$", action):
return
try:
runner_cls = actions.class_for(action)
except Exception as ex:
raise excp.OptionException(str(ex))
if runner_cls.needs_sudo:
ensure_perms()
# Check persona file exists
persona_fn = args.pop('persona_fn')
if not persona_fn:
raise excp.OptionException("No persona file name specified!")
if not sh.isfile(persona_fn):
raise excp.OptionException("Invalid persona file %r specified!" % (persona_fn))
# Check origin file exists
origins_fn = args.pop('origins_fn')
if not origins_fn:
raise excp.OptionException("No origin file name specified!")
if not sh.isfile(origins_fn):
raise excp.OptionException("Invalid origin file %r specified!" % (origins_fn))
args['origins_fn'] = sh.abspth(origins_fn)
# Determine the root directory...
root_dir = sh.abspth(args.pop("dir"))
(repeat_string, line_max_len) = utils.welcome()
print(pprint.center_text("Action Runner", repeat_string, line_max_len))
# !!
# From here on out we should be using the logger (and not print)!!
# !!
# Ensure the anvil dirs are there if others are about to use it...
if not sh.isdir(root_dir):
LOG.info("Creating anvil root directory at path: %s", root_dir)
sh.mkdir(root_dir)
try:
for d in ANVIL_DIRS:
if sh.isdir(d):
continue
LOG.info("Creating anvil auxiliary directory at path: %s", d)
sh.mkdir(d)
except OSError as e:
LOG.warn("Failed ensuring auxiliary directories due to %s", e)
# Load the origins...
origins = _origins.load(args['origins_fn'],
patch_file=args.get('origins_patch'))
# Load the distro/s
possible_distros = distro.load(settings.DISTRO_DIR,
distros_patch=args.get('distros_patch'))
# Load + match the persona to the possible distros...
try:
persona_obj = persona.load(persona_fn)
except Exception as e:
raise excp.OptionException("Error loading persona file: %s due to %s" % (persona_fn, e))
else:
dist = persona_obj.match(possible_distros, origins)
LOG.info('Persona selected distro: %s from %s possible distros',
colorizer.quote(dist.name), len(possible_distros))
# Update the dist with any other info...
dist.inject_platform_overrides(persona_obj.distro_updates, source=persona_fn)
dist.inject_platform_overrides(origins, source=origins_fn)
# Print it out...
LOG.debug("Distro settings are:")
for line in dist.pformat(item_max_len=128).splitlines():
LOG.debug(line)
# Get the object we will be running with...
runner = runner_cls(distro=dist,
root_dir=root_dir,
name=action,
cli_opts=args)
# Now that the settings are known to work, store them for next run
store_current_settings(saved_args)
LOG.info("Starting action %s on %s for distro: %s",
colorizer.quote(action), colorizer.quote(utils.iso8601()),
colorizer.quote(dist.name))
LOG.info("Using persona: %s", colorizer.quote(persona_fn))
LOG.info("Using origins: %s", colorizer.quote(origins_fn))
LOG.info("In root directory: %s", colorizer.quote(root_dir))
start_time = time.time()
runner.run(persona_obj)
end_time = time.time()
pretty_time = utils.format_time(end_time - start_time)
LOG.info("It took %s seconds or %s minutes to complete action %s.",
colorizer.quote(pretty_time['seconds']), colorizer.quote(pretty_time['minutes']), colorizer.quote(action))
def load_previous_settings():
settings_prev = None
try:
settings_prev = utils.load_yaml(SETTINGS_FILE)
except Exception:
# Errors could be expected on format problems
# or on the file not being readable....
pass
return settings_prev
def store_current_settings(c_settings):
# Remove certain keys that just shouldn't be saved
to_save = dict(c_settings)
for k in ['action', 'verbose']:
if k in c_settings:
to_save.pop(k, None)
buf = six.StringIO()
buf.write("# Anvil last used settings\n")
buf.write(utils.add_header(SETTINGS_FILE,
utils.prettify_yaml(to_save),
adjusted=sh.isfile(SETTINGS_FILE)))
try:
sh.write_file(SETTINGS_FILE, buf.getvalue())
except OSError as e:
LOG.warn("Failed writing to %s due to %s", SETTINGS_FILE, e)
def ensure_perms():
# Ensure we are running as root to start...
if not sh.got_root():
raise excp.PermException("Root access required")
def main():
"""Starts the execution of anvil without injecting variables into
the global namespace. Ensures that logging is setup and that sudo access
is available and in-use.
Arguments: N/A
Returns: 1 for success, 0 for failure and 2 for permission change failure.
"""
# Do this first so people can see the help message...
args = opts.parse(load_previous_settings())
# Configure logging levels
log_level = logging.INFO
if args['verbose']:
log_level = logging.DEBUG
logging.setupLogging(log_level, tee_filename=args['tee_file'])
LOG.debug("Log level is: %s" % (logging.getLevelName(log_level)))
def print_exc(exc):
if not exc:
return
msg = str(exc).strip()
if not msg:
return
if not (msg.endswith(".") or msg.endswith("!")):
msg = msg + "."
if msg:
print(msg)
def print_traceback():
traceback = None
if log_level < logging.INFO:
# See: http://docs.python.org/library/traceback.html
# When it's not None you get more detailed info about the exception
traceback = sys.exc_traceback
tb.print_exception(sys.exc_type, sys.exc_value,
traceback, file=sys.stdout)
try:
run(args)
utils.goodbye(True)
return 0
except excp.PermException as e:
print_exc(e)
print(("This program should be running via %s as it performs some root-only commands is it not?")
% (colorizer.quote('sudo', quote_color='red')))
return 2
except excp.OptionException as e:
print_exc(e)
print("Perhaps you should try %s" % (colorizer.quote('--help', quote_color='red')))
return 1
except Exception:
utils.goodbye(False)
print_traceback()
return 1
if __name__ == "__main__":
sys.exit(main())

@@ -1,38 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from anvil.actions import build
from anvil.actions import prepare
_NAMES_TO_RUNNER = {
'build': build.BuildAction,
'prepare': prepare.PrepareAction,
}
_RUNNER_TO_NAMES = dict((v, k) for k, v in _NAMES_TO_RUNNER.items())
def names():
"""Returns a list of the available action names."""
return list(sorted(_NAMES_TO_RUNNER.keys()))
def class_for(action):
"""Given an action name, look up the factory for that action runner."""
try:
return _NAMES_TO_RUNNER[action]
except KeyError:
raise RuntimeError('Unrecognized action %s' % action)
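# Example usage (sketch): how an action name becomes a runner class;
# with the table above, names() yields ['build', 'prepare'].
#
#   >>> from anvil import actions
#   >>> actions.names()
#   ['build', 'prepare']
#   >>> actions.class_for('build')
#   <class 'anvil.actions.build.BuildAction'>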

@@ -1,362 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License..
import abc
import copy
from anvil import cfg
from anvil import colorizer
from anvil import env
from anvil import exceptions as excp
from anvil import importer
from anvil import log as logging
from anvil import persona as _persona
from anvil import phase
from anvil import shell as sh
from anvil import utils
import six
LOG = logging.getLogger(__name__)
BASE_ENTRYPOINTS = {
'build': 'anvil.components.base_build:BuildComponent',
}
BASE_PYTHON_ENTRYPOINTS = dict(BASE_ENTRYPOINTS)
BASE_PYTHON_ENTRYPOINTS.update({
'build': 'anvil.components.base_build:PythonBuildComponent',
})
SPECIAL_GROUPS = _persona.SPECIAL_GROUPS
class PhaseFunctors(object):
def __init__(self, start, run, end):
self.start = start
self.run = run
self.end = end
class Action(object):
__metaclass__ = abc.ABCMeta
needs_sudo = True
def __init__(self, name, distro, root_dir, cli_opts):
self.distro = distro
self.name = name
# Root directory where all files/downloads will be based at
self.root_dir = root_dir
# Action phases are tracked in this directory
self.phase_dir = sh.joinpths(root_dir, 'phases')
# Yamls are loaded (with their reference links) using this instance,
# rooted at the component directory where component configuration will be found.
self.config_loader = cfg.YamlMergeLoader(root_dir,
origins_path=cli_opts['origins_fn'])
# Stored for components to get any options
self.cli_opts = cli_opts
@abc.abstractproperty
def lookup_name(self):
# Name that will be used to look up this module
# in any configuration (may or may not be the same as the name
# of this action)....
raise NotImplementedError()
@abc.abstractmethod
def _run(self, persona, groups):
"""Run the phases of processing for this action.
Subclasses are expected to override this method to
do something useful.
"""
raise NotImplementedError()
def _make_default_entry_points(self, component_name, component_options):
if component_options.get('python_entrypoints'):
return BASE_PYTHON_ENTRYPOINTS.copy()
return BASE_ENTRYPOINTS.copy()
def _merge_subsystems(self, distro_subsystems, desired_subsystems):
subsystems = {}
for subsystem_name in desired_subsystems:
# Return a deep copy so that later instances can not modify
# other instances' subsystems accidentally...
subsystems[subsystem_name] = copy.deepcopy(distro_subsystems.get(subsystem_name, {}))
return subsystems
def _construct_siblings(self, name, siblings, base_params, sibling_instances):
# First set up the sibling instance action references
for (action, _entry_point) in siblings.items():
if action not in sibling_instances:
sibling_instances[action] = {}
there_siblings = {}
for (action, entry_point) in siblings.items():
sibling_params = utils.merge_dicts(base_params, self.cli_opts, preserve=True)
# Give the sibling the reference to all other siblings being created
# which will be populated when they are created (now or later) for
# the same action
sibling_params['instances'] = sibling_instances[action]
a_sibling = importer.construct_entry_point(entry_point, **sibling_params)
# Update the sibling we are returning and the corresponding
# siblings for that action (so that the sibling can have the
# correct 'sibling' instances associated with it, if it needs those...)
there_siblings[action] = a_sibling
# Update all siblings being constructed so that their siblings will
# be correct when fetched...
sibling_instances[action][name] = a_sibling
return there_siblings
def _construct_instances(self, persona):
"""Create component objects for each component in the persona."""
# Keeps track of all sibling instances across all components + actions
# so that each instance or sibling instance will be connected to the
# right set of siblings....
sibling_instances = {}
components_created = set()
groups = []
for group in persona.matched_components:
instances = utils.OrderedDict()
for c in group:
if c in components_created:
raise RuntimeError("Can not duplicate component %s in a"
" later group %s" % (c, group.id))
d_component = self.distro.extract_component(
c, self.lookup_name, default_entry_point_creator=self._make_default_entry_points)
LOG.debug("Constructing component %r (%s)", c, d_component.entry_point)
d_subsystems = d_component.options.pop('subsystems', {})
sibling_params = {}
sibling_params['name'] = c
# First create its siblings with a 'minimal' set of options.
# This is done so that they will work in a minimal state; they do not
# get access to the persona options since those are action specific (or could be).
# If this is not useful, we can give them full access; unsure if that's worse or better...
active_subsystems = self._merge_subsystems(distro_subsystems=d_subsystems,
desired_subsystems=persona.wanted_subsystems.get(c, []))
sibling_params['subsystems'] = active_subsystems
sibling_params['siblings'] = {} # This gets adjusted during construction
sibling_params['distro'] = self.distro
sibling_params['options'] = self.config_loader.load(
distro=d_component, component=c,
origins_patch=self.cli_opts.get('origins_patch'))
LOG.debug("Constructing %r %s siblings...", c, len(d_component.siblings))
my_siblings = self._construct_siblings(c, d_component.siblings, sibling_params, sibling_instances)
# Now inject the full options and create the target instance
# with the full set of options and not the restricted set that
# siblings get...
instance_params = dict(sibling_params)
instance_params['instances'] = instances
instance_params['options'] = self.config_loader.load(
distro=d_component, component=c, persona=persona,
origins_patch=self.cli_opts.get('origins_patch'))
instance_params['siblings'] = my_siblings
instance_params = utils.merge_dicts(instance_params, self.cli_opts, preserve=True)
instances[c] = importer.construct_entry_point(d_component.entry_point, **instance_params)
if c not in SPECIAL_GROUPS:
components_created.add(c)
groups.append((group.id, instances))
return groups
def _verify_components(self, groups):
for group, instances in groups:
LOG.info("Verifying that the components of group %s are ready"
" to rock-n-roll.", colorizer.quote(group))
for _c, instance in six.iteritems(instances):
instance.verify()
def _warm_components(self, groups):
for group, instances in groups:
LOG.info("Warming up component configurations of group %s.",
colorizer.quote(group))
for _c, instance in six.iteritems(instances):
instance.warm_configs()
def _on_start(self, persona, groups):
LOG.info("Booting up your components.")
LOG.debug("Starting environment settings:")
utils.log_object(env.get(), logger=LOG, level=logging.DEBUG, item_max_len=64)
sh.mkdirslist(self.phase_dir)
self._verify_components(groups)
self._warm_components(groups)
def _on_finish(self, persona, groups):
LOG.info("Tearing down your components.")
LOG.debug("Final environment settings:")
utils.log_object(env.get(), logger=LOG, level=logging.DEBUG, item_max_len=64)
def _get_phase_filename(self, phase_name):
# Do some canonicalization of the phase name so it's in a semi-standard format...
phase_name = phase_name.lower().strip()
phase_name = phase_name.replace("-", '_')
phase_name = phase_name.replace(" ", "_")
if not phase_name:
raise ValueError("Phase name must not be empty")
return sh.joinpths(self.phase_dir, "%s.phases" % (phase_name))
def _run_many_phase(self, functors, group, instances, phase_name, *inv_phase_names):
"""Run a given 'functor' across all of the components, passing *all* instances to run."""
# This phase recorder will be used to check if a given component
# and action has ran in the past, if so that components action
# will not be ran again. It will also be used to mark that a given
# component has completed a phase (if that phase runs).
if not phase_name:
phase_recorder = phase.NullPhaseRecorder()
else:
phase_recorder = phase.PhaseRecorder(self._get_phase_filename(phase_name))
# These phase recorders will be used to undo other actions' activities,
# i.e., when an install completes you want the uninstall phase to be
# removed from that action's phase file (and so on). This list will be
# used to accomplish that.
neg_phase_recs = []
if inv_phase_names:
for n in inv_phase_names:
if not n:
neg_phase_recs.append(phase.NullPhaseRecorder())
else:
neg_phase_recs.append(phase.PhaseRecorder(self._get_phase_filename(n)))
def change_activate(instance, on_off):
# Activate/deactivate a component instance and its siblings (if any)
#
# This is used when, say, you are looking at components
# that have been activated before your component has been.
#
# Typically this is useful for checking if a previous component
# has a shared dependency with your component and if so then there
# is no need to reinstall said dependency...
instance.activated = on_off
for (_name, sibling_instance) in instance.siblings.items():
sibling_instance.activated = on_off
def run_inverse_recorders(c_name):
for n in neg_phase_recs:
n.unmark(c_name)
# Reset all activations
for c, instance in six.iteritems(instances):
change_activate(instance, False)
# Run all components which have not been run previously (due to phase tracking)
instances_started = utils.OrderedDict()
for c, instance in six.iteritems(instances):
if c in SPECIAL_GROUPS:
c = "%s_%s" % (c, group)
if c in phase_recorder:
LOG.debug("Skipping phase named %r for component %r since it already happened.", phase_name, c)
else:
try:
with phase_recorder.mark(c):
if functors.start:
functors.start(instance)
instances_started[c] = instance
except excp.NoTraceException:
pass
if functors.run:
results = functors.run(list(six.itervalues(instances_started)))
else:
results = [None] * len(instances_started)
instances_ran = instances_started
for i, (c, instance) in enumerate(six.iteritems(instances_ran)):
result = results[i]
try:
with phase_recorder.mark(c):
if functors.end:
functors.end(instance, result)
except excp.NoTraceException:
pass
for c, instance in six.iteritems(instances_ran):
change_activate(instance, True)
run_inverse_recorders(c)
def _run_phase(self, functors, group, instances, phase_name, *inv_phase_names):
"""Run a given 'functor' across all of the components, in order individually."""
# This phase recorder will be used to check if a given component
# and action has run in the past; if so, that component's action
# will not be run again. It will also be used to mark that a given
# component has completed a phase (if that phase runs).
if not phase_name:
phase_recorder = phase.NullPhaseRecorder()
else:
phase_recorder = phase.PhaseRecorder(self._get_phase_filename(phase_name))
# These phase recorders will be used to undo other actions' activities,
# i.e., when an install completes you want the uninstall phase to be
# removed from that action's phase file (and so on). This list will be
# used to accomplish that.
neg_phase_recs = []
if inv_phase_names:
for n in inv_phase_names:
if not n:
neg_phase_recs.append(phase.NullPhaseRecorder())
else:
neg_phase_recs.append(phase.PhaseRecorder(self._get_phase_filename(n)))
def change_activate(instance, on_off):
# Activate/deactivate a component instance and its siblings (if any)
#
# This is used when, say, you are looking at components
# that have been activated before your component has been.
#
# Typically this is useful for checking if a previous component
# has a shared dependency with your component and if so then there
# is no need to reinstall said dependency...
instance.activated = on_off
for (_name, sibling_instance) in instance.siblings.items():
sibling_instance.activated = on_off
def run_inverse_recorders(c_name):
for n in neg_phase_recs:
n.unmark(c_name)
# Reset all activations
for c, instance in six.iteritems(instances):
change_activate(instance, False)
# Run all components which have not been run previously (due to phase tracking)
for c, instance in six.iteritems(instances):
if c in SPECIAL_GROUPS:
c = "%s_%s" % (c, group)
if c in phase_recorder:
LOG.debug("Skipping phase named %r for component %r since it already happened.", phase_name, c)
else:
try:
with phase_recorder.mark(c):
if functors.start:
functors.start(instance)
if functors.run:
result = functors.run(instance)
else:
result = None
if functors.end:
functors.end(instance, result)
except excp.NoTraceException:
pass
change_activate(instance, True)
run_inverse_recorders(c)
def run(self, persona):
groups = self._construct_instances(persona)
LOG.info("Processing components for action %s.", colorizer.quote(self.name))
for group in persona.matched_components:
utils.log_iterable(group,
header="Activating group %s in the following order" % colorizer.quote(group.id),
logger=LOG)
self._on_start(persona, groups)
self._run(persona, groups)
self._on_finish(persona, groups)
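# The phase recorders used above live in anvil.phase (not shown in this
# hunk). A minimal sketch of the interface _run_phase relies on --
# membership testing, a mark() context manager and unmark() -- could look
# like this (a simplification, not the real anvil.phase implementation):
#
#   import contextlib
#
#   class SimplePhaseRecorder(object):
#       def __init__(self):
#           self._done = set()
#       def __contains__(self, component):
#           return component in self._done
#       @contextlib.contextmanager
#       def mark(self, component):
#           yield  # only record completion if no exception escaped
#           self._done.add(component)
#       def unmark(self, component):
#           self._done.discard(component)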

@@ -1,44 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# pylint: disable=R0915
from anvil.actions import base as action
from anvil import colorizer
from anvil import log
LOG = log.getLogger(__name__)
class BuildAction(action.Action):
needs_sudo = True
@property
def lookup_name(self):
return 'build'
def _run(self, persona, groups):
prior_groups = []
for group, instances in groups:
LOG.info("Building group %s...", colorizer.quote(group))
dependency_handler_class = self.distro.dependency_handler_class
dependency_handler = dependency_handler_class(self.distro,
self.root_dir,
instances.values(),
self.cli_opts,
group, prior_groups)
dependency_handler.build_binary()
prior_groups.append((group, instances))

@@ -1,95 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# pylint: disable=R0915
from anvil.actions import base as action
from anvil.actions import states
from anvil import colorizer
from anvil import log
LOG = log.getLogger(__name__)
class PrepareAction(action.Action):
needs_sudo = False
@property
def lookup_name(self):
return 'build'
def _run(self, persona, groups):
prior_groups = []
for group, instances in groups:
LOG.info("Preparing group %s...", colorizer.quote(group))
dependency_handler_class = self.distro.dependency_handler_class
dependency_handler = dependency_handler_class(self.distro,
self.root_dir,
instances.values(),
self.cli_opts,
group, prior_groups)
removals = states.reverts("download")
self._run_phase(
action.PhaseFunctors(
start=lambda i: LOG.info('Downloading %s.', colorizer.quote(i.name)),
run=lambda i: i.download(),
end=lambda i, result: LOG.info("Performed %s downloads.", len(result))
),
group,
instances,
"download",
*removals
)
removals.extend(states.reverts("download-patch"))
self._run_phase(
action.PhaseFunctors(
start=lambda i: LOG.info('Post-download patching %s.', colorizer.quote(i.name)),
run=lambda i: i.patch("download"),
end=None,
),
group,
instances,
"download-patch",
*removals
)
dependency_handler.package_start()
removals.extend(states.reverts("package"))
if not hasattr(dependency_handler, 'package_instances'):
self._run_phase(
action.PhaseFunctors(
start=lambda i: LOG.info("Packaging %s.", colorizer.quote(i.name)),
run=dependency_handler.package_instance,
end=None,
),
group,
instances,
"package",
*removals
)
else:
self._run_many_phase(
action.PhaseFunctors(
start=lambda i: LOG.info("Packaging %s.", colorizer.quote(i.name)),
run=dependency_handler.package_instances,
end=None,
),
group,
instances,
"package",
*removals
)
dependency_handler.package_finish()
prior_groups.append((group, instances))

@@ -1,31 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2014 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# These map what action states will cause what other action states to be
# removed (aka the inverse operations of each state). This is used so that
# we can skip states that have already completed as well as redo states when
# the inverse is applied.
_INVERSES = {
"download": [],
"download-patch": [],
}
def reverts(action):
try:
return list(_INVERSES[action])
except KeyError:
return []
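# Example (sketch): with the table above, reverts("download") and
# reverts("not-a-state") both return []. If _INVERSES instead held,
# say, {"install": ["uninstall"]} (a hypothetical entry, not part of
# this table), reverts("install") would return ["uninstall"], causing
# a previously completed "uninstall" phase to be unmarked whenever
# "install" runs again.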

@@ -1,97 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2015 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import sys
import threading
from concurrent import futures
import six
from six.moves import queue as compat_queue
from six.moves import range as compat_range
from anvil import log as logging
LOG = logging.getLogger(__name__)
_TOMBSTONE = object()
def _chained_worker(ident, shared_death, queue, futs):
running = True
while running:
if shared_death.is_set():
LOG.warn("Worker %s dying unhappily...", ident)
running = False
else:
w = queue.get()
if w is _TOMBSTONE:
queue.put(w)
LOG.info("Worker %s dying happily...", ident)
running = False
else:
func, fut = w
if fut.set_running_or_notify_cancel():
try:
result = func()
except BaseException:
LOG.exception("Worker %s dying unhappily...", ident)
exc_type, exc_val, exc_tb = sys.exc_info()
if six.PY2:
fut.set_exception_info(exc_val, exc_tb)
else:
fut.set_exception(exc_val)
# Stop all other workers from doing any more work...
shared_death.set()
for fut in futs:
fut.cancel()
running = False
else:
fut.set_result(result)
class ChainedWorkerExecutor(object):
def __init__(self, max_workers):
self._workers = []
self._max_workers = int(max_workers)
self._queue = compat_queue.Queue()
self._death = threading.Event()
def run(self, funcs):
if self._workers:
raise RuntimeError("Can not start another `run` with %s"
" existing workers" % (len(self._workers)))
self._queue = compat_queue.Queue()
self._death.clear()
futs = []
for i in compat_range(0, self._max_workers):
w = threading.Thread(target=_chained_worker,
args=(i + 1, self._death,
self._queue, futs))
w.daemon = True
w.start()
self._workers.append(w)
for func in funcs:
fut = futures.Future()
futs.append(fut)
self._queue.put((func, fut))
return futs
def wait(self):
self._queue.put(_TOMBSTONE)
while self._workers:
w = self._workers.pop()
w.join()
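# Example usage (sketch): run three callables on two chained workers;
# the lambdas here are illustrative, not part of anvil.
#
#   executor = ChainedWorkerExecutor(max_workers=2)
#   futs = executor.run([lambda: 1, lambda: 2, lambda: 3])
#   executor.wait()
#   print([f.result() for f in futs])  # -> [1, 2, 3]
#
# If any callable raises, the shared death event asks the remaining
# workers to stop and the not-yet-started futures are cancelled.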

@@ -1,282 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import re
from anvil import exceptions
from anvil import log as logging
from anvil import origins as _origins
from anvil import settings
from anvil import shell as sh
from anvil import utils
LOG = logging.getLogger(__name__)
class YamlMergeLoader(object):
"""Holds merging process component options (based on Yaml reference loader).
Merge order is:
* Directory options (app_dir, component_dir...).
* Distro matched options (from `distros` directory).
* Origins matched options (from `origins` directory)
* General component options (from `general.yaml`).
* Persona general options (from personas/basic*.yaml with `general:` key).
* Specific component options (from `component_name.yaml`).
* Persona specific options (from personas/basic*.yaml
with `component_name:` key).
All merging is done left to right, with later sources overwriting existing options (keys)
"""
def __init__(self, root_dir, origins_path=None):
self._root_dir = root_dir
self._base_loader = YamlRefLoader(settings.COMPONENT_CONF_DIR)
self._origins_path = origins_path
def _get_dir_opts(self, component):
component_dir = sh.joinpths(self._root_dir, component)
trace_dir = sh.joinpths(component_dir, 'traces')
app_dir = sh.joinpths(component_dir, 'app')
return utils.OrderedDict([
('app_dir', app_dir),
('component_dir', component_dir),
('root_dir', self._root_dir),
('trace_dir', trace_dir)
])
def _apply_persona(self, component, persona):
"""Apply persona specific options according to component.
Include the general.yaml in each application since it typically contains
useful shared settings.
"""
for conf in ('general', component):
if persona is not None:
# Note: any additional redefines could be added here.
persona_specific = persona.component_options.get(component, {})
try:
self._base_loader.update_cache(conf, persona_specific)
except exceptions.YamlConfigNotFoundException:
LOG.warn("Unable to update the loaders cache with"
" component '%s' configuration using"
" persona specific data: %s", conf,
persona_specific, exc_info=True)
def load(self, distro, component, persona=None, origins_patch=None):
# NOTE (vnovikov): applying takes place before loading reference links
self._apply_persona(component, persona)
dir_opts = self._get_dir_opts(component)
distro_opts = distro.options
origins_opts = {}
if self._origins_path:
try:
origins = _origins.load(self._origins_path,
patch_file=origins_patch)
origins_opts = origins[component]
except KeyError:
pass
component_opts = []
for conf in ('general', component):
try:
component_opts.append(self._base_loader.load(conf))
except exceptions.YamlConfigNotFoundException:
LOG.warn("Unable to find component specific configuration"
" for component '%s'", conf, exc_info=True)
# NOTE (vnovikov): merge order is the same as arguments order below.
merged_opts = utils.merge_dicts(
dir_opts,
distro_opts,
origins_opts,
*component_opts
)
return merged_opts
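# Illustration (sketch): the "later sources win" ordering described in
# the class docstring, shown with plain dicts standing in for the real
# option sources; utils.merge_dicts may merge more deeply, dict.update
# is used here only to show precedence:
#
#   dir_opts = {'app_dir': '/opt/anvil/nova/app', 'verbose': False}
#   distro_opts = {'verbose': True}
#   merged = {}
#   for opts in (dir_opts, distro_opts):
#       merged.update(opts)
#   # merged['verbose'] -> True: distro options overwrote directory options.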
class YamlRefLoader(object):
"""Reference loader for *.yaml configs.
Handles safe loading of the *.yaml files, caching, resolving
all reference links and transforming all data to python built-in types.
Some basics:
In this context a reference means a value formatted like:
opt: "$(source:option)" , or
opt: "some-additional-data-$(source:option)-some-postfix-data", where:
opt - base option name
source - other source config (i.e. other *.yaml file) from which we
should get 'option'
option - option name in 'source'
In other words it means that loader will try to find and read 'option' from
'source'.
Any source config also allows:
References to itself via its own name (opt: "$(source:opt)",
in file source.yaml)
References to auto parameters (opt: $(auto:ip), which inserts the current ip).
'auto' supports the options: 'ip', 'hostname' and 'home'
Implicit and multiple references, e.g.
s.yaml => opt: "here 3 opts: $(source:opt), $(source2:opt) and $(auto:ip)".
Exception cases:
* if the referenced 'option' does not exist then YamlOptionNotFoundException is raised
* if the config 'source' does not exist then YamlConfigNotFoundException is raised
* if a reference loop is found then YamlLoopException is raised
Config file example:
(file sample.yaml)
reference: "$(source:option)"
ip: "$(auto:ip)"
self_ref: "$(sample:ip)" # this will equal ip option.
opt: "http://$(auto:ip)/"
"""
def __init__(self, path):
self._conf_ext = '.yaml'
self._ref_pattern = re.compile(r"\$\(([\w\d-]+)\:([\w\d-]+)\)")
self._predefined_refs = {
'auto': {
'ip': utils.get_host_ip,
'home': sh.gethomedir,
'hostname': sh.hostname,
}
}
self._path = path # path to root directory with configs
self._cached = {} # buffer to save already loaded configs
self._processed = {} # buffer to save already processed configs
self._ref_stack = [] # stack for controlling reference loop
def _process_string(self, value):
"""Processing string (and reference links) values via regexp."""
processed = value
# Process each reference in value (one by one)
for match in self._ref_pattern.finditer(value):
ref_conf, ref_opt = match.groups()
val = self._load_option(ref_conf, ref_opt)
if match.group(0) == value:
return val
else:
processed = re.sub(self._ref_pattern, str(val), processed, count=1)
return processed
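# For example (sketch): given value = "http://$(auto:ip)/" the pattern
# captures ('auto', 'ip'); since the reference is not the whole string,
# the looked-up value is substituted in place, yielding something like
# "http://10.0.0.5/". A value that is exactly "$(auto:ip)" is instead
# returned as the raw looked-up object (which need not be a string).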
def _process_dict(self, value):
"""Process dictionary values."""
processed = utils.OrderedDict()
for opt, val in sorted(value.items()):
res = self._process(val)
processed[opt] = res
return processed
def _process_iterable(self, value):
"""Process list, set or tuple values."""
processed = []
for item in value:
processed.append(self._process(item))
return processed
def _process_asis(self, value):
"""Process built-in values."""
return value
def _process(self, value):
"""Base recursive method for processing references."""
if isinstance(value, basestring):
processed = self._process_string(value)
elif isinstance(value, dict):
processed = self._process_dict(value)
elif isinstance(value, (list, set, tuple)):
processed = self._process_iterable(value)
else:
processed = self._process_asis(value)
return processed
def _precache(self):
"""Cache and process predefined auto-references."""
for conf, options in self._predefined_refs.items():
if conf not in self._processed:
processed = dict((option, functor())
for option, functor in options.items())
self._cached[conf] = processed
self._processed[conf] = processed
def _load_option(self, conf, opt):
try:
return self._processed[conf][opt]
except KeyError:
if (conf, opt) in self._ref_stack:
raise exceptions.YamlLoopException(conf, opt, self._ref_stack)
self._ref_stack.append((conf, opt))
self._cache(conf)
try:
raw_value = self._cached[conf][opt]
except KeyError:
try:
cur_conf, cur_opt = self._ref_stack[-1]
except IndexError:
cur_conf, cur_opt = None, None
raise exceptions.YamlOptionNotFoundException(
cur_conf, cur_opt, conf, opt
)
result = self._process(raw_value)
self._processed.setdefault(conf, {})[opt] = result
self._ref_stack.pop()
return result
def _cache(self, conf):
"""Cache config file into memory to avoid re-reading it from disk."""
if conf not in self._cached:
path = sh.joinpths(self._path, conf + self._conf_ext)
if not sh.isfile(path):
raise exceptions.YamlConfigNotFoundException(path)
self._cached[conf] = utils.load_yaml(path) or {}
def update_cache(self, conf, dict2update):
self._cache(conf)
# NOTE (vnovikov): should remove obsolete processed data
self._cached[conf].update(dict2update)
self._processed[conf] = {}
def load(self, conf):
"""Load config `conf` from same yaml file with and resolve all
references.
"""
self._precache()
self._cache(conf)
# NOTE(imelnikov): some confs may be partially processed, so
# we have to ensure all the options got loaded.
for opt in self._cached[conf].iterkeys():
self._load_option(conf, opt)
# TODO(imelnikov): can we really restore original order here?
self._processed[conf] = utils.OrderedDict(
sorted(self._processed.get(conf, {}).iteritems())
)
return self._processed[conf]
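# Example usage (sketch): loading the sample.yaml shown in the class
# docstring above; the directory path here is illustrative.
#
#   loader = YamlRefLoader('/etc/anvil/components')
#   opts = loader.load('sample')   # reads sample.yaml, resolves references
#   print(opts['ip'])              # the current host ip via $(auto:ip)
#   print(opts['opt'])             # 'http://<that ip>/'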

@@ -1,55 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import sys
import termcolor
from anvil import env
from anvil import type_utils as tu
COLORS = termcolor.COLORS.keys()
LOG_COLOR = True
if 'LOG_COLOR' in env.get():
LOG_COLOR = tu.make_bool(env.get_key('LOG_COLOR'))
if not sys.stdout.isatty():
LOG_COLOR = False
def quote(data, quote_color='green', **kargs):
if not LOG_COLOR:
return "'%s'" % (data)
else:
text = str(data)
if len(text) == 0:
text = "''"
return color(text, quote_color, **kargs)
def color(data, color_to_be, bold=False, underline=False, blink=False):
text = str(data)
text_attrs = list()
if bold:
text_attrs.append('bold')
if underline:
text_attrs.append('underline')
if blink:
text_attrs.append('blink')
if LOG_COLOR and color_to_be in COLORS:
return termcolor.colored(text, color_to_be, attrs=text_attrs)
else:
return text
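# Example usage (sketch):
#
#   from anvil import colorizer
#   print(colorizer.quote('nova'))   # green nova on a tty, 'nova' otherwise
#   print(colorizer.color('ready', 'blue', bold=True))
#
# With LOG_COLOR set to a false-y value in the environment (or stdout
# not a tty) both calls fall back to plain, uncolored text.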

@@ -1,15 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

@@ -1,182 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import six
from anvil import exceptions as excp
from anvil import log as logging
from anvil import patcher
from anvil import settings
from anvil import shell as sh
from anvil import trace as tr
from anvil import type_utils as tu
from anvil import utils
LOG = logging.getLogger(__name__)
class Component(object):
def __init__(self, name, subsystems, instances, options, siblings, distro, **kwargs):
# Subsystems this was requested with
self.subsystems = subsystems
# The component name (from config)
self.name = name
# Any component options
self.options = options
# All the other active instances
self.instances = instances
# All the other class names that can be used alongside this class
self.siblings = siblings
# The distribution 'interaction object'
self.distro = distro
# Turned on and off as phases get activated
self.activated = False
# Where our binaries will be located
self.bin_dir = "/usr/bin/"
# Where configuration will be written
self.cfg_dir = sh.joinpths("/etc/", self.name)
# Pre-populated passwords (assumed to arrive via kwargs); consumed
# by get_password() below.
self.passwords = kwargs.get('passwords', {})
def get_password(self, option):
pw_val = self.passwords.get(option)
if pw_val is None:
raise excp.PasswordException("Password asked for option %s but none was pre-populated!" % (option))
return pw_val
def get_interpolated_option(self, option, default_value=None):
tried = [option]
while True:
option_value = utils.get_deep(self.options, [option])
if option_value is None:
return default_value
else:
if not isinstance(option_value, six.string_types):
return option_value
if option_value.startswith("$"):
maybe_option = option_value[1:]
if maybe_option in tried:
tried.append(maybe_option)
raise excp.ConfigException("Option loop %s detected" % tried)
else:
tried.append(maybe_option)
option = maybe_option
else:
return option_value
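# For example (sketch), with hypothetical options
# {'db_host': '$host', 'host': '127.0.0.1'}:
# get_interpolated_option('db_host') follows the '$host' link and
# returns '127.0.0.1', while a cycle such as {'a': '$b', 'b': '$a'}
# raises a ConfigException naming the loop.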
def get_option(self, option, *options, **kwargs):
option_value = utils.get_deep(self.options, [option] + list(options))
if option_value is None:
return kwargs.get('default_value')
else:
return option_value
def get_bool_option(self, option, *options, **kwargs):
if 'default_value' not in kwargs:
kwargs['default_value'] = False
return tu.make_bool(self.get_option(option, *options, **kwargs))
def get_int_option(self, option, *options, **kwargs):
if 'default_value' not in kwargs:
kwargs['default_value'] = 0
return int(self.get_option(option, *options, **kwargs))
@property
def env_exports(self):
return {}
def verify(self):
pass
def __str__(self):
return "%s@%s" % (tu.obj_name(self), self.name)
@property
def params(self):
# Various params that are frequently accessed
return {
'APP_DIR': self.get_option('app_dir'),
'COMPONENT_DIR': self.get_option('component_dir'),
'TRACE_DIR': self.get_option('trace_dir'),
}
def warm_configs(self):
# Before any actions occur you get the chance to
# warm up the configs you might use (i.e. for prompting for passwords
# earlier rather than later)
pass
def subsystem_names(self):
return self.subsystems.keys()
@property
def packages(self):
return []
def package_names(self):
"""Return a set of names of all packages for this component."""
names = set()
for pack in self.packages:
try:
names.add(pack["name"])
except KeyError:
pass
daemon_to_package = self.get_option("daemon_to_package")
if not daemon_to_package:
daemon_to_package = {}
for key in self.subsystem_names():
try:
names.add(daemon_to_package[key])
except KeyError:
names.add("openstack-%s-%s" % (self.name, key))
return names
class BasicComponent(Component):
def __init__(self, *args, **kargs):
super(BasicComponent, self).__init__(*args, **kargs)
trace_fn = tr.trace_filename(self.get_option('trace_dir'), 'created')
self.tracewriter = tr.TraceWriter(trace_fn, break_if_there=False)
def download(self):
return []
def list_patches(self, section):
what_patches = self.get_option('patches', section)
if not what_patches:
what_patches = [sh.joinpths(settings.CONFIG_DIR, 'patches',
self.name, section)]
canon_what_patches = []
for path in what_patches:
if sh.isdir(path):
patches = sorted(fn for fn in sh.listdir(path, files_only=True)
if fn.endswith('patch'))
canon_what_patches.extend(patches)
elif sh.isfile(path):
canon_what_patches.append(path)
return canon_what_patches
def patch(self, section):
canon_what_patches = self.list_patches(section)
if canon_what_patches:
target_dir = self.get_option('app_dir')
patcher.apply_patches(canon_what_patches, target_dir)
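
# Usage sketch (an assumption, not how the framework actually wires things
# up): for a component named 'nova' and the 'install' section, patches are
# collected from <CONFIG_DIR>/patches/nova/install/*.patch (or from the
# 'patches' option) and applied to the component's app_dir:
#
#   component.patch('install')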

View File

@ -1,93 +0,0 @@
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from anvil.components import base
from anvil import downloader as down
from anvil import log as logging
from anvil import shell as sh
from anvil import utils

from anvil.packaging.helpers import pip_helper

LOG = logging.getLogger(__name__)

# Potential files that can hold a project's requirements...
REQUIREMENT_FILES = [
    'pip-requires',
    'requirements.txt',
    'requirements-py2.txt',
]

TEST_REQUIREMENT_FILES = [
    'test-requires',
    'test-requirements.txt',
]


class BuildComponent(base.BasicComponent):
    pass


class PythonBuildComponent(BuildComponent):
    def __init__(self, *args, **kargs):
        super(PythonBuildComponent, self).__init__(*args, **kargs)
        self._origins_fn = kargs['origins_fn']
        app_dir = self.get_option('app_dir')
        tools_dir = sh.joinpths(app_dir, 'tools')
        self.requires_files = []
        self.test_requires_files = []
        for path in [app_dir, tools_dir]:
            for req_fn in REQUIREMENT_FILES:
                self.requires_files.append(sh.joinpths(path, req_fn))
            for req_fn in TEST_REQUIREMENT_FILES:
                self.test_requires_files.append(sh.joinpths(path, req_fn))

    def config_params(self, config_fn):
        mp = dict(self.params)
        if config_fn:
            mp['CONFIG_FN'] = config_fn
        return mp

    def download(self):
        """Download sources needed to build the component, if any."""
        target_dir = self.get_option('app_dir')
        download_cfg = utils.load_yaml(self._origins_fn).get(self.name, {})
        if not target_dir or not download_cfg:
            return []
        uri = download_cfg.pop('repo', None)
        if not uri:
            raise ValueError(("Could not find repo uri for %r component from the %r "
                              "config file." % (self.name, self._origins_fn)))
        uris = [uri]
        utils.log_iterable(uris, logger=LOG,
                           header="Downloading from %s uris" % (len(uris)))
        sh.mkdirslist(target_dir, tracewriter=self.tracewriter)
        # This is used to delete what is downloaded (done before
        # fetching to ensure it's cleaned up even on download failures)
        self.tracewriter.download_happened(target_dir, uri)
        down.GitDownloader(uri, target_dir, **download_cfg).download()
        return uris

    @property
    def egg_info(self):
        app_dir = self.get_option('app_dir')
        pbr_version = self.get_interpolated_option("pbr_version")
        egg_info = pip_helper.get_directory_details(app_dir, pbr_version=pbr_version)
        egg_info = egg_info.copy()
        egg_info['dependencies'] = pip_helper.read_requirement_files(self.requires_files)[1]
        egg_info['test_dependencies'] = pip_helper.read_requirement_files(self.test_requires_files)[1]
        return egg_info
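
# A minimal origins yaml entry that download() above understands might look
# like this (component name, uri and branch are hypothetical); everything
# besides 'repo' is passed straight through to GitDownloader:
#
#   nova:
#       repo: git://git.openstack.org/openstack/nova.git
#       branch: stable/mitaka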

View File

@ -1,214 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
# Copyright (C) 2012 New Dream Network, LLC (DreamHost) All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import collections
import copy
import glob
import jsonpatch
import os
import platform
import re
import shlex
import six

from anvil import exceptions as excp
from anvil import importer
from anvil import log as logging
from anvil import pprint
from anvil import shell as sh
from anvil import utils

LOG = logging.getLogger(__name__)

Component = collections.namedtuple(  # pylint: disable=C0103
    "Component", 'entry_point,options,siblings')


class Distro(object):
    def __init__(self,
                 name, platform_pattern,
                 install_helper, dependency_handler,
                 components, **kwargs):
        self.name = name
        self._platform_pattern_text = platform_pattern
        self._platform_pattern = re.compile(platform_pattern, re.IGNORECASE)
        self._install_helper = install_helper
        self._dependency_handler = dependency_handler
        self._commands = kwargs.get('commands', {})
        self._components = components
        self.inject_platform_overrides(kwargs)

    def inject_platform_overrides(self, potential_data, source='??'):
        if 'platform_overrides' not in potential_data:
            return
        overrides = potential_data['platform_overrides']
        plts = _get_platform_names()
        patterns = [(k, re.compile(k, re.IGNORECASE), override)
                    for k, override in six.iteritems(overrides)]
        for k, pat, override in patterns:
            if any(pat.search(plt) for plt in plts):
                LOG.info("Merging in 'platform_overrides' that matched"
                         " platform %s from %s (sub-key %s)", plts, source, k)
                self._dependency_handler = utils.recursive_merge(
                    self._dependency_handler, override)

    def pformat(self, item_max_len=None):
        data = {
            'name': self.name,
            'dependency_handler': self._dependency_handler,
            'commands': self._commands,
            'pattern': "/%s/i" % self._platform_pattern_text,
        }
        return pprint.pformat(data, item_max_len=item_max_len)

    def _fetch_value(self, root, keys, quiet):
        end_key = keys[-1]
        for k in keys[0:-1]:
            if quiet:
                root = root.get(k)
                if root is None:
                    return None
            else:
                root = root[k]
        end_value = None
        if not quiet:
            end_value = root[end_key]
        else:
            end_value = root.get(end_key)
        return end_value

    def get_dependency_config(self, key, *more_keys, **kwargs):
        root = dict(self._dependency_handler)
        # NOTE(harlowja): Don't allow access to the dependency handler class
        # name. Access should be via the property instead.
        root.pop('name', None)
        keys = [key] + list(more_keys)
        return self._fetch_value(root, keys, kwargs.get('quiet', False))

    def get_command_config(self, key, *more_keys, **kwargs):
        root = dict(self._commands)
        keys = [key] + list(more_keys)
        return self._fetch_value(root, keys, kwargs.get('quiet', False))

    def get_command(self, key, *more_keys, **kwargs):
        """Retrieves a string for running a command from the setup
        and splits it to return a list.
        """
        val = self.get_command_config(key, *more_keys, **kwargs)
        if not val:
            return []
        else:
            return shlex.split(val)

    def known_component(self, name):
        return name in self._components

    def supports_platform(self, platform_name):
        """Does this distro support the named platform?

        :param platform_name: Return value from platform.platform().
        """
        return bool(self._platform_pattern.search(platform_name))

    @property
    def install_helper_class(self):
        """Return an install helper that will work for this distro."""
        return importer.import_entry_point(self._install_helper)

    @property
    def dependency_handler_class(self):
        """Return a dependency handler that will work for this distro."""
        return importer.import_entry_point(self._dependency_handler["name"])

    def extract_component(self, name, action, default_entry_point_creator=None):
        """Return the class + component info to use for doing the action with the component."""
        try:
            # Use a copy instead of the original since we will be
            # modifying this dictionary which may not be wanted for future
            # usages of this dictionary (so keep the original clean)...
            component_info = copy.deepcopy(self._components[name])
        except KeyError:
            component_info = {}
        action_classes = component_info.pop('action_classes', {})
        if default_entry_point_creator is not None:
            default_action_classes = default_entry_point_creator(name,
                                                                 copy.deepcopy(component_info))
            if default_action_classes:
                for (an_action, entry_point) in six.iteritems(default_action_classes):
                    if an_action not in action_classes:
                        action_classes[an_action] = entry_point
        try:
            entry_point = action_classes.pop(action)
        except KeyError:
            raise RuntimeError('No entrypoint configured/generated for'
                               ' %r %r for distribution %r' % (action, name, self.name))
        else:
            return Component(entry_point, component_info, action_classes)


def _get_platform_names():
    plts = [
        platform.platform(),
    ]
    linux_plt = platform.linux_distribution()[0:2]
    linux_plt = "-".join(linux_plt)
    linux_plt = linux_plt.replace(" ", "-")
    plts.append(linux_plt)
    return [plt.lower() for plt in plts]


def _match_distros(distros):
    plts = _get_platform_names()
    matches = []
    for d in distros:
        if any(d.supports_platform(plt) for plt in plts):
            matches.append(d)
    if not matches:
        raise excp.ConfigException('No distro matched for platform %s' % plts)
    else:
        return matches


def load(path, distros_patch=None):
    """Load configuration for all distros found in path.

    :param path: path containing distro configuration in yaml format
    :param distros_patch: distros file patch, JSON Patch format (RFC 6902)
    """
    distro_possibles = []
    patch = jsonpatch.JsonPatch(distros_patch) if distros_patch else None
    input_files = glob.glob(sh.joinpths(path, '*.yaml'))
    if not input_files:
        raise excp.ConfigException('Did not find any distro definition files in %r' % path)
    for fn in input_files:
        LOG.debug("Attempting to load distro definition from %r", fn)
        try:
            cls_kvs = utils.load_yaml(fn)
            # Apply any user specified patches to distros file
            if patch:
                patch.apply(cls_kvs, in_place=True)
        except Exception as err:
            LOG.warning('Could not load distro definition from %r: %s', fn, err)
        else:
            if 'name' not in cls_kvs:
                name, _ext = os.path.splitext(sh.basename(fn))
                cls_kvs['name'] = name
            distro_possibles.append(Distro(**cls_kvs))
    matches = _match_distros(distro_possibles)
    LOG.debug("Matched distros %s", [m.name for m in matches])
    return matches
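
# Usage sketch (path, patch target and entry point are hypothetical): load
# distro definitions and tweak them with an RFC 6902 JSON Patch before the
# platform matching runs:
#
#   patch = [{'op': 'replace',
#             'path': '/dependency_handler/name',
#             'value': 'anvil.packaging.venv:VenvDependencyHandler'}]
#   matches = load('/etc/anvil/distros', distros_patch=patch)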

View File

@ -1,178 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import abc
import contextlib
import functools
import re
import urllib2

import progressbar

from anvil import colorizer
from anvil import exceptions
from anvil import log as logging
from anvil import shell as sh

LOG = logging.getLogger(__name__)


class Downloader(object):
    __metaclass__ = abc.ABCMeta

    def __init__(self, uri, dst):
        self._uri = uri
        self._dst = dst

    @abc.abstractmethod
    def download(self):
        raise NotImplementedError()


class GitDownloader(Downloader):
    def __init__(self, uri, dst, **kwargs):
        Downloader.__init__(self, uri, dst)
        self._branch = self._get_string_from_dict(kwargs, 'branch')
        self._tag = self._get_string_from_dict(kwargs, 'tag')
        self._sha1 = self._get_string_from_dict(kwargs, 'sha1')
        self._refspec = self._get_string_from_dict(kwargs, 'refspec')
        git_sources = len([a for a in (self._tag, self._sha1, self._branch) if a])
        if git_sources > 1:
            raise exceptions.ConfigException('Too many sources. Please, '
                                             'specify only one of tag/SHA1/branch.')
        if not git_sources:
            self._branch = 'master'

    def _get_string_from_dict(self, params, key):
        value = params.get(key)
        if value:
            value = str(value)
        return value

    def download(self):
        branch = self._branch
        tag = self._tag
        start_point = self._sha1 or self._tag
        if start_point:
            # Avoid 'detached HEAD state' message by moving to a
            # $tag-anvil branch for that tag
            new_branch = "%s-%s" % (start_point[:8], 'anvil')
            checkout_what = [start_point, '-b', new_branch]
        else:
            # Set it up to track the remote branch correctly
            new_branch = branch
            checkout_what = ['-t', '-b', new_branch, 'origin/%s' % branch]
        if sh.isdir(self._dst) and sh.isdir(sh.joinpths(self._dst, '.git')):
            LOG.info("Existing git directory located at %s, leaving it alone.",
                     colorizer.quote(self._dst))
            # Do 'git clean -xdfq' and 'git reset --hard' to undo possible changes
            cmd = ["git", "clean", "-xdfq"]
            sh.execute(cmd, cwd=self._dst)
            cmd = ["git", "reset", "--hard"]
            sh.execute(cmd, cwd=self._dst)
            cmd = ["git", "fetch", "origin"]
            sh.execute(cmd, cwd=self._dst)
        else:
            LOG.info("Downloading %s to %s.", colorizer.quote(self._uri),
                     colorizer.quote(self._dst))
            cmd = ["git", "clone", self._uri, self._dst]
            sh.execute(cmd)
        if self._refspec:
            LOG.info("Fetching ref %s.", self._refspec)
            cmd = ["git", "fetch", self._uri, self._refspec]
            sh.execute(cmd, cwd=self._dst)
        if self._sha1:
            LOG.info("Adjusting to SHA1 %s.", colorizer.quote(self._sha1))
        elif tag:
            LOG.info("Adjusting to tag %s.", colorizer.quote(tag))
        else:
            LOG.info("Adjusting branch to %s.", colorizer.quote(branch))
        # Detach, drop new_branch if it exists, and checkout to new_branch.
        # Newer git allows branch resetting: git checkout -B $new_branch
        # so all of this exists for compatibility with older RHEL git.
        cmd = ["git", "rev-parse", "HEAD"]
        git_head = sh.execute(cmd, cwd=self._dst)[0].strip()
        cmd = ["git", "checkout", git_head]
        sh.execute(cmd, cwd=self._dst)
        cmd = ["git", "branch", "-D", new_branch]
        sh.execute(cmd, cwd=self._dst, check_exit_code=False)
        cmd = ["git", "checkout"] + checkout_what
        sh.execute(cmd, cwd=self._dst)
        # NOTE(aababilov): the old openstack.common.setup reports any tag that
        # contains HEAD as the project's version. That breaks the whole RPM
        # building process, so delete all the extra tags.
        cmd = ["git", "tag", "--contains", "HEAD"]
        tag_names = [
            i
            for i in sh.execute(cmd, cwd=self._dst)[0].splitlines()
            if i and i != tag]
        # Make sure we are not removing a tag that has the same commit
        # reference as a branch; otherwise the repository would be broken.
        if tag_names:
            cmd = ["git", "show-ref", "--tags", "--dereference"] + tag_names
            for line in sh.execute(cmd, cwd=self._dst)[0].splitlines():
                res = re.search(r"(.+)\s+refs/tags/(.+)\^\{\}$", line)
                if res is None:
                    continue
                ref, tag_name = res.groups()
                if ref == git_head and tag_name in tag_names:
                    tag_names.remove(tag_name)
        if tag_names:
            LOG.info("Removing tags: %s", colorizer.quote(" ".join(tag_names)))
            cmd = ["git", "tag", "-d"] + tag_names
            sh.execute(cmd, cwd=self._dst)
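
# Usage sketch (uri/path are hypothetical): fetch a repository pinned to a
# tag; exactly one of tag/sha1/branch may be given, otherwise 'master' is
# used:
#
#   GitDownloader('git://example.com/project.git', '/tmp/project',
#                 tag='2015.1.0').download()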


class UrlLibDownloader(Downloader):
    def __init__(self, uri, store_where, **kargs):
        Downloader.__init__(self, uri, store_where)
        self.quiet = kargs.get('quiet', False)
        self.timeout = kargs.get('timeout', 5)

    def _make_bar(self, size):
        widgets = [
            'Fetching: ', progressbar.Percentage(),
            ' ', progressbar.Bar(),
            ' ', progressbar.ETA(),
            ' ', progressbar.FileTransferSpeed(),
        ]
        return progressbar.ProgressBar(widgets=widgets, maxval=size)

    def download(self):
        LOG.info('Downloading using urllib2: %s to %s.',
                 colorizer.quote(self._uri), colorizer.quote(self._dst))
        p_bar = None

        def update_bar(progress_bar, bytes_down):
            if progress_bar:
                progress_bar.update(bytes_down)

        try:
            with contextlib.closing(urllib2.urlopen(self._uri, timeout=self.timeout)) as conn:
                c_len = conn.headers.get('content-length')
                if c_len is not None and not self.quiet:
                    try:
                        p_bar = self._make_bar(int(c_len))
                        p_bar.start()
                    except ValueError:
                        pass
                with open(self._dst, 'wb') as ofh:
                    return (self._dst, sh.pipe_in_out(conn, ofh,
                                                      chunk_cb=functools.partial(update_bar, p_bar)))
        finally:
            if p_bar:
                p_bar.finish()
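
# Usage sketch (url/path are hypothetical): stream a file to disk with a
# progress bar, returning the destination and the bytes transferred:
#
#   UrlLibDownloader('http://example.com/f.tar.gz', '/tmp/f.tar.gz',
#                    timeout=10).download()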

View File

@ -1,39 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import os


def get():
    return dict(os.environ)


def set(key, value):
    # This is really screwy, python is really odd in this area
    # See: http://docs.python.org/library/os.html
    # Calling putenv() directly does not change os.environ, so it's better to modify os.environ.
    if key is not None:
        os.environ[str(key)] = str(value)


def get_key(key, default_value=None):
    if not key:
        return default_value
    key = str(key)
    value = get().get(key)
    if value is None:
        value = default_value
    return value
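
# Usage sketch: read and write process environment values with defaults:
#
#   set('OS_USERNAME', 'admin')
#   get_key('OS_USERNAME')                      # -> 'admin'
#   get_key('OS_MISSING', default_value='n/a')  # -> 'n/a'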

View File

@ -1,165 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import contextlib
import subprocess
import sys

import six


class AnvilException(Exception):
    pass


class PermException(AnvilException):
    pass


class OptionException(AnvilException):
    pass


class PasswordException(AnvilException):
    # Referenced by components/base.py when a requested password was never
    # pre-populated (definition restored here for cross-module consistency).
    pass


class DownloadException(AnvilException):
    pass


class NoTraceException(AnvilException):
    pass


class PackageException(AnvilException):
    pass


class FileException(AnvilException):
    pass


class ConfigException(AnvilException):
    pass


class DependencyException(AnvilException):
    pass


class ProcessExecutionError(IOError):
    MESSAGE_TPL = (
        '%(description)s\n'
        'Command: %(command)s\n'
        'Exit code: %(exit_code)s\n'
        'Stdout: %(stdout)s\n'
        'Stderr: %(stderr)s'
    )

    # Truncate stdout & stderr content to this many lines when creating a
    # process execution error (the full stdout/stderr can still be accessed).
    _TRUNCATED_OUTPUT_LINES = 7

    def __init__(self, cmd, exec_kwargs=None,
                 stdout='', stderr='', exit_code=None, description=None,
                 where_output=None):
        if not isinstance(exit_code, (long, int)):
            exit_code = '-'
        if not description:
            description = 'Unexpected error while running command.'
        if not exec_kwargs:
            exec_kwargs = {}
        self._stdout = self._format(exec_kwargs.get('stdout'), stdout)
        self._stderr = self._format(exec_kwargs.get('stderr'), stderr)
        message = self.MESSAGE_TPL % {
            'exit_code': exit_code,
            'command': cmd,
            'description': description,
            'stdout': self._truncate_lines(self._stdout, where_output),
            'stderr': self._truncate_lines(self._stderr, where_output),
        }
        IOError.__init__(self, message)

    @classmethod
    def _truncate_lines(cls, content, where_output=None):
        """Truncates a given text blob using the class defined line limit."""
        if not content:
            return content
        lines = content.splitlines(True)
        if len(lines) > cls._TRUNCATED_OUTPUT_LINES:
            content = "".join(lines[-cls._TRUNCATED_OUTPUT_LINES:])
            if where_output:
                content += " (see %s for more details...)" % (where_output)
            else:
                content += "..."
        return content

    @staticmethod
    def _format(stream, output):
        if stream != subprocess.PIPE and stream is not None:
            return "<redirected to %s>" % stream.name
        return output

    @property
    def stdout(self):
        """Access the full (non-truncated) stdout."""
        return self._stdout

    @property
    def stderr(self):
        """Access the full (non-truncated) stderr."""
        return self._stderr


class YamlException(ConfigException):
    pass


class YamlOptionNotFoundException(YamlException):
    """Raised by YamlRefLoader if reference option not found."""

    def __init__(self, conf, opt, ref_conf, ref_opt):
        msg = "In `{0}`=>`{1}: '$({2}:{3})'` " \
              "reference option `{3}` not found." \
              .format(conf, opt, ref_conf, ref_opt)
        super(YamlOptionNotFoundException, self).__init__(msg)


class YamlConfigNotFoundException(YamlException):
    """Raised by YamlRefLoader if config source not found."""

    def __init__(self, path):
        msg = "Could not find (open) yaml source {0}.".format(path)
        super(YamlConfigNotFoundException, self).__init__(msg)


class YamlLoopException(YamlException):
    """Raised by YamlRefLoader if reference loop found."""

    def __init__(self, conf, opt, ref_stack):
        prettified_stack = "".join(
            "\n%s`%s`=>`%s`" % (" " * i, c, o)
            for i, (c, o) in enumerate(ref_stack))
        msg = "In `{0}`=>`{1}` reference loop found.\n" \
              "Reference stack is:{2}." \
              .format(conf, opt, prettified_stack)
        super(YamlLoopException, self).__init__(msg)


@contextlib.contextmanager
def reraise():
    ex_type, ex, ex_tb = sys.exc_info()
    try:
        yield ex
    except Exception:
        raise
    else:
        six.reraise(ex_type, ex, ex_tb)
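
# Usage sketch for reraise(): run cleanup while an exception is in flight; if
# the body raises, that error propagates instead, otherwise the original
# exception is re-raised with its traceback intact (risky()/cleanup() are
# hypothetical):
#
#   try:
#       risky()
#   except Exception:
#       with reraise():
#           cleanup()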

View File

@ -1,71 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
# Copyright (C) 2012 New Dream Network, LLC (DreamHost) All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import sys

from anvil import log as logging
from anvil import utils

LOG = logging.getLogger(__name__)


def construct_entry_point(fullname, *args, **kwargs):
    cls = import_entry_point(fullname)
    LOG.debug("Constructing %r (%s)", fullname, cls)
    if kwargs:
        LOG.debug("Kwargs are:")
        utils.log_object(kwargs, logger=LOG, level=logging.DEBUG)
    if args:
        LOG.debug("Args are:")
        utils.log_object(args, logger=LOG, level=logging.DEBUG)
    return cls(*args, **kwargs)


def partition(fullname):
    """The name should be in dotted.path:ClassName syntax."""
    if ':' not in fullname:
        raise ValueError('Invalid entry point specifier %r' % fullname)
    (module_name, _sep, classname) = fullname.partition(':')
    return (module_name, classname)


def import_entry_point(fullname):
    """Given a name import the class and return it."""
    (module_name, classname) = partition(fullname)
    try:
        import_module(module_name)
        # This is done to ensure we get the right submodule
        module = __import__(module_name)
        for submodule in module_name.split('.')[1:]:
            module = getattr(module, submodule)
        LOG.debug("Importing class: %s", classname)
        cls = getattr(module, classname)
        # TODO(harlowja) actually verify this is a class??
    except (ImportError, AttributeError, ValueError) as err:
        raise RuntimeError('Could not load entry point %s: %s' %
                           (fullname, err))
    return cls


def import_module(module_name):
    try:
        LOG.debug("Importing module: %s", module_name)
        __import__(module_name)
        return sys.modules.get(module_name, None)
    except (ImportError, ValueError) as err:
        raise RuntimeError('Could not load module %s: %s' %
                           (module_name, err))
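
# Usage sketch: entry points use 'dotted.path:ClassName' syntax, so these two
# calls import a class and construct it in one go (constructor arguments here
# are hypothetical):
#
#   cls = import_entry_point('anvil.downloader:GitDownloader')
#   obj = construct_entry_point('anvil.downloader:GitDownloader',
#                               'git://example.com/p.git', '/tmp/p')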

View File

@ -1,313 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2013 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import ConfigParser
from ConfigParser import DEFAULTSECT
from ConfigParser import NoOptionError
from ConfigParser import NoSectionError
from StringIO import StringIO

import iniparse
from iniparse import ini

import re

from anvil import log as logging
from anvil import utils

LOG = logging.getLogger(__name__)


class StringiferMixin(object):
    def __init__(self):
        pass

    def stringify(self, fn=None):
        outputstream = StringIO()
        self.write(outputstream)
        contents = utils.add_header(fn, outputstream.getvalue())
        return contents


class ConfigHelperMixin(object):
    DEF_INT = 0
    DEF_FLOAT = 0.0
    DEF_BOOLEAN = False
    DEF_BASE = None

    def __init__(self, templatize_values=False):
        self.templatize_values = templatize_values

    def get(self, section, option):
        value = self.DEF_BASE
        try:
            value = super(ConfigHelperMixin, self).get(section, option)
        except NoSectionError:
            pass
        except NoOptionError:
            pass
        return value

    def _template_value(self, option, value):
        if not self.templatize_values:
            return value
        tpl_value = StringIO()
        safe_value = str(option)
        for c in ['-', ' ', '\t', ':', '$', '%', '(', ')']:
            safe_value = safe_value.replace(c, '_')
        tpl_value.write("$(%s)" % (safe_value.upper().strip()))
        comment_value = str(value).strip().encode('string_escape')
        for c in ['(', ')', '$']:
            comment_value = comment_value.replace(c, '')
        comment_value = comment_value.strip()
        tpl_value.write(" # %s" % (comment_value))
        return tpl_value.getvalue()

    def set(self, section, option, value):
        if not self.has_section(section) and section.lower() != 'default':
            self.add_section(section)
        value = self._template_value(option, value)
        super(ConfigHelperMixin, self).set(section, option, value)

    def remove_option(self, section, option):
        if self.has_option(section, option):
            super(ConfigHelperMixin, self).remove_option(section, option)

    def getboolean(self, section, option):
        if not self.has_option(section, option):
            return self.DEF_BOOLEAN
        return super(ConfigHelperMixin, self).getboolean(section, option)

    def getfloat(self, section, option):
        if not self.has_option(section, option):
            return self.DEF_FLOAT
        return super(ConfigHelperMixin, self).getfloat(section, option)

    def getint(self, section, option):
        if not self.has_option(section, option):
            return self.DEF_INT
        return super(ConfigHelperMixin, self).getint(section, option)

    def getlist(self, section, option):
        return self.get(section, option).split(",")


class BuiltinConfigParser(ConfigHelperMixin, ConfigParser.RawConfigParser, StringiferMixin):
    def __init__(self, fns=None, templatize_values=False):
        ConfigHelperMixin.__init__(self, templatize_values)
        ConfigParser.RawConfigParser.__init__(self)
        StringiferMixin.__init__(self)
        # Make option names case sensitive
        # See: http://docs.python.org/library/configparser.html#ConfigParser.RawConfigParser.optionxform
        self.optionxform = str
        if fns:
            for f in fns:
                self.read(f)


class AnvilConfigParser(iniparse.RawConfigParser):
    """Extends RawConfigParser with the following functionality:

    1. All commented options with related comments belong to
       their own section, not to the global scope. This is needed
       to insert new options into the proper position after the
       same commented option in the section, if present.
    2. Overrides the set option behavior to insert an option right
       after the same commented option, if present, otherwise to
       insert it at the section beginning.
    3. Includes the [DEFAULT] section if present (but not present
       in the original).
    """

    # Commented option regexp
    option_regex = re.compile(
        r"""
        ^[;#]        # comment line starts with ';' or '#'
        \s*          # then maybe some spaces
                     # then option name
        ([^:=\s[]    # at least one non-special symbol here
        [^:=]*?)     # option continuation
        \s*          # then maybe some spaces
        [:=]         # option-value separator ':' or '='
        .*           # then option value
        $            # then line ends
        """, re.VERBOSE)

    def __init__(self, defaults=None, dict_type=dict, include_defaults=True):
        super(AnvilConfigParser, self).__init__(defaults=defaults,
                                                dict_type=dict_type)
        self._include_defaults = include_defaults

    def readfp(self, fp, filename=None):
        super(AnvilConfigParser, self).readfp(fp, filename)
        self._on_after_file_read()

    def set(self, section, option, value):
        """Overrides option set behavior."""
        try:
            self._set_section_option(self.data[section], option, value)
        except KeyError:
            raise NoSectionError(section)

    def _sections(self):
        """Gets all the underlying sections (including DEFAULT).

        The underlying iniparse library seems to exclude the DEFAULT section,
        which makes it hard to tell if we should include the DEFAULT section
        in output or whether the library will include it for us (which it
        will do if the origin data had a DEFAULT section).
        """
        sections = set()
        for x in self.data._data.contents:
            if isinstance(x, ini.LineContainer):
                sections.add(x.name)
        return sections

    def write(self, fp):
        """Writes sections to the provided file object, and also includes the
        default section if it is not present in the origin data but should be
        present in the output data.
        """
        if self.data._bom:
            fp.write(u'\ufeff')
        default_added = False
        if self._include_defaults and DEFAULTSECT not in self._sections():
            try:
                sect = self.data[DEFAULTSECT]
            except KeyError:
                pass
            else:
                default_added = True
                fp.write("%s\n" % (ini.SectionLine(DEFAULTSECT)))
                for lines in sect._lines:
                    fp.write("%s\n" % (lines))
        data = "%s" % (self.data._data)
        if default_added and data:
            # Remove extra spaces since we added a section before this.
            data = data.lstrip()
            data = "\n" + data
        fp.write(data)

    def _on_after_file_read(self):
        """This function is called after reading a config file to move all
        commented lines into the section they belong to; otherwise such
        commented lines end up at the top level, which is not very suitable
        for us.
        """
        curr_section = None
        pending_lines = []
        remove_lines = []
        for line_obj in self.data._data.contents:
            if isinstance(line_obj, ini.LineContainer):
                curr_section = line_obj
                pending_lines = []
            else:
                if curr_section is not None:
                    pending_lines.append(line_obj)
                    # if the line is a commented option - add it and all
                    # pending lines into the current section
                    if self.option_regex.match(line_obj.line) is not None:
                        curr_section.extend(pending_lines)
                        remove_lines.extend(pending_lines)
                        pending_lines = []
        for line_obj in remove_lines:
            self.data._data.contents.remove(line_obj)

    @classmethod
    def _set_section_option(cls, section, key, value):
        """This function is used to override the __setitem__ behavior
        of the INISection to search for a suitable place to insert a new
        option if it doesn't exist. The 'suitable' place is considered
        to be after the same commented option, if present, otherwise
        the new option is placed at the section beginning.
        """
        if section._optionxform:
            xkey = section._optionxform(key)
        else:
            xkey = key
        if xkey in section._compat_skip_empty_lines:
            section._compat_skip_empty_lines.remove(xkey)
        if xkey not in section._options:
            # create a dummy object - value may have multiple lines
            obj = ini.LineContainer(ini.OptionLine(key, ''))
            # search for the line index to insert after
            line_idx = 0
            section_lines = section._lines[-1].contents
            for idx, line_obj in reversed(list(enumerate(section_lines))):
                if not isinstance(line_obj, ini.LineContainer):
                    if line_obj.line is not None:
                        match_res = cls.option_regex.match(line_obj.line)
                        if match_res is not None and match_res.group(1) == xkey:
                            line_idx = idx
                            break
            # insert the new parameter object on the next line after the
            # commented option, otherwise insert it at the beginning
            section_lines.insert(line_idx + 1, obj)
            section._options[xkey] = obj
        section._options[xkey].value = value


class RewritableConfigParser(ConfigHelperMixin, AnvilConfigParser, StringiferMixin):
    def __init__(self, fns=None, templatize_values=False):
        ConfigHelperMixin.__init__(self, templatize_values)
        AnvilConfigParser.__init__(self)
        StringiferMixin.__init__(self)
        # Make option names case sensitive
        # See: http://docs.python.org/library/configparser.html#ConfigParser.RawConfigParser.optionxform
        self.optionxform = str
        if fns:
            for f in fns:
                self.read(f)


class DefaultConf(object):
    """This class represents the data/format of the config file with
    a large DEFAULT section.
    """

    current_section = DEFAULTSECT

    def __init__(self, backing, current_section=None):
        self.backing = backing
        self.current_section = current_section or self.current_section

    def add_with_section(self, section, key, value, *values):
        real_key = str(key)
        real_value = ""
        if len(values):
            str_values = [str(value)] + [str(v) for v in values]
            real_value = ",".join(str_values)
        else:
            real_value = str(value)
        LOG.debug("Added conf key %r with value %r under section %r",
                  real_key, real_value, section)
        self.backing.set(section, real_key, real_value)

    def add(self, key, value, *values):
        self.add_with_section(self.current_section, key, value, *values)

    def remove(self, section, key):
        self.backing.remove_option(section, key)


def create_parser(cfg_cls, component, fns=None):
    templatize_values = component.get_bool_option('template_config')
    cfg_opts = {
        'fns': fns,
        'templatize_values': templatize_values,
    }
    return cfg_cls(**cfg_opts)
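
# Usage sketch (file and option names are hypothetical): rewrite an ini file
# while preserving its comments; new options land next to their commented
# counterparts:
#
#   parser = RewritableConfigParser(fns=['/etc/nova/nova.conf'])
#   parser.set('DEFAULT', 'debug', 'True')
#   contents = parser.stringify('/etc/nova/nova.conf')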

View File

@ -1,126 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Copyright 2011 OpenStack LLC.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# pylint: disable=C0103

import logging
import sys

from anvil import colorizer

# A list of things we want to replicate from logging levels
CRITICAL = logging.CRITICAL
FATAL = logging.FATAL
ERROR = logging.ERROR
WARNING = logging.WARNING
WARN = logging.WARN
INFO = logging.INFO
DEBUG = logging.DEBUG
NOTSET = logging.NOTSET

# Methods
debug = logging.debug
info = logging.info
warning = logging.warning
warn = logging.warn
error = logging.error
exception = logging.exception
critical = logging.critical
log = logging.log

# Nice translator
getLevelName = logging.getLevelName

# Classes
root = logging.root
Formatter = logging.Formatter

# Handlers
StreamHandler = logging.StreamHandler
FileHandler = logging.FileHandler


class TermFormatter(logging.Formatter):
    COLOR_MAP = {
        logging.DEBUG: 'blue',
        logging.INFO: 'cyan',
        logging.WARNING: 'yellow',
        logging.ERROR: 'red',
        logging.CRITICAL: 'red',
    }

    MSG_COLORS = {
        logging.CRITICAL: 'red',
    }

    def __init__(self, reg_fmt=None, date_format=None):
        logging.Formatter.__init__(self, reg_fmt, date_format)

    def _format_msg(self, lvl, msg):
        color_to_be = self.MSG_COLORS.get(lvl)
        if color_to_be:
            return colorizer.color(msg, color_to_be, bold=True)
        else:
            return msg

    def _format_lvl(self, lvl, lvl_name):
        color_to_be = self.COLOR_MAP.get(lvl)
        if color_to_be:
            return colorizer.color(lvl_name, color_to_be)
        else:
            return lvl_name

    def format(self, record):
        record.levelname = self._format_lvl(record.levelno, record.levelname)
        record.msg = self._format_msg(record.levelno, record.msg)
        return logging.Formatter.format(self, record)


class TermAdapter(logging.LoggerAdapter):
    warn = logging.LoggerAdapter.warning

    def __init__(self, logger):
        logging.LoggerAdapter.__init__(self, logger, dict())


def setupLogging(log_level,
                 term_format='%(levelname)s: @%(name)s : %(message)s',
                 tee_filename='/var/log/anvil.log',
                 tee_format='%(asctime)s : %(levelname)s: @%(name)s : %(message)s'):
    root_logger = getLogger().logger
    console_formatter = TermFormatter(term_format)
    console_logger = StreamHandler(sys.stdout)
    console_logger.setLevel(log_level)
    console_logger.setFormatter(console_formatter)
    root_logger.addHandler(console_logger)
    file_formatter = logging.Formatter(tee_format)
    file_logger = FileHandler(tee_filename)
    file_logger.setFormatter(file_formatter)
    file_logger.setLevel(DEBUG)
    root_logger.addHandler(file_logger)
    root_logger.setLevel(DEBUG)


def getLogger(name='anvil'):
    return TermAdapter(logging.getLogger(name))
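
# Usage sketch: colorized console output at INFO plus a DEBUG-level tee file
# (the tee path below is hypothetical):
#
#   setupLogging(INFO, tee_filename='/tmp/anvil.log')
#   LOG = getLogger(__name__)
#   LOG.info("forging...")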

View File

@ -1,220 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from StringIO import StringIO
import json
import multiprocessing
import textwrap

from optparse import IndentedHelpFormatter
from optparse import OptionGroup
from optparse import OptionParser
from optparse import OptionValueError

from anvil import actions
from anvil import env
from anvil import settings
from anvil import shell as sh
from anvil import utils
from anvil import version

OVERVIEW = """Overview: Anvil is a forging tool to help build OpenStack components
and their dependencies into a complete system. It checks out the components from
git and builds them and their dependencies into packages."""

STEPS = """Steps: For a smooth experience please make sure you go through the
following steps when running."""

STEP_SECTIONS = {
    'building': [
        './smithy -a prepare',
        './smithy -a build',
    ],
}


def _format_list(in_list):
    sorted_list = sorted(in_list)
    return "[" + ", ".join(sorted_list) + "]"


def _size_cb(option, opt_str, value, parser):
    try:
        parser.values.show_amount = utils.to_bytes(value)
    except (TypeError, ValueError) as e:
        raise OptionValueError("Invalid value for %s due to %s" % (opt_str, e))


class SmithyHelpFormatter(IndentedHelpFormatter):
    def _wrap_it(self, text):
        return textwrap.fill(text, width=self.width,
                             initial_indent="", subsequent_indent="  ")

    def format_epilog(self, epilog):
        buf = StringIO()
        buf.write(IndentedHelpFormatter.format_epilog(self, epilog))
        buf.write("\n")
        buf.write(self._wrap_it('For further information check out: '
                                'http://anvil.readthedocs.org'))
        buf.write("\n")
        return buf.getvalue()

    def format_usage(self, usage):
        buf = StringIO()
        buf.write(IndentedHelpFormatter.format_usage(self, usage))
        buf.write("\n")
        buf.write(self._wrap_it(OVERVIEW))
        buf.write("\n\n")
        buf.write(self._wrap_it(STEPS))
        buf.write("\n\n")
        for k in sorted(STEP_SECTIONS.keys()):
            buf.write("%s:\n" % (k.title()))
            for line in STEP_SECTIONS[k]:
                buf.write("  %s\n" % (line))
        return buf.getvalue()


def _get_default_dir():
    root_dir = env.get_key('INSTALL_ROOT')
    if root_dir:
        return root_dir
    return sh.joinpths(sh.gethomedir(), 'openstack')


def parse(previous_settings=None):
    version_str = "%s v%s" % ('anvil', version.version_string())
    help_formatter = SmithyHelpFormatter(width=120)
    parser = OptionParser(version=version_str, formatter=help_formatter,
                          prog='smithy')

    # Root options
    parser.add_option("-v", "--verbose",
                      action="store_true",
                      dest="verbose",
                      default=False,
                      help="make the output logging verbose")

    # Install/start/stop/uninstall specific options
    base_group = OptionGroup(parser, "Action specific options")
    base_group.add_option("-p", "--persona",
                          action="store",
                          type="string",
                          dest="persona_fn",
                          default=sh.joinpths(settings.PERSONA_DIR, 'in-a-box', 'basic.yaml'),
                          metavar="FILE",
                          help="persona yaml file to apply (default: %default)")
    base_group.add_option("-a", "--action",
                          action="store",
                          type="string",
                          dest="action",
                          metavar="ACTION",
                          help="required action to perform: %s" % (_format_list(actions.names())))
    base_group.add_option("-o", "--origins",
                          action="store",
                          type="string",
                          dest="origins_fn",
                          default=sh.joinpths(settings.ORIGINS_DIR, 'master.yaml'),
                          metavar="FILE",
                          help="yaml file describing where to get openstack sources "
                               "from (default: %default)")
    base_group.add_option("--origins-patch",
                          action="store",
                          type="string",
                          dest="origins_patch_fn",
                          default=None,
                          metavar="FILE",
                          help="origins file patch, JSON Patch format (RFC 6902)")
    base_group.add_option("--distros-patch",
                          action="store",
                          type="string",
                          dest="distros_patch_fn",
                          default=None,
                          metavar="FILE",
                          help="distros file patch, JSON Patch format (RFC 6902)")
    base_group.add_option("-j", "--jobs",
                          action="store",
                          type="int",
                          dest="jobs",
                          default=multiprocessing.cpu_count() + 1,
                          metavar="JOBS",
                          help="number of building jobs to run simultaneously (default: %default)")
    base_group.add_option("-d", "--directory",
                          action="store",
                          type="string",
                          dest="dir",
                          metavar="DIR",
                          default=_get_default_dir(),
                          help=("empty root DIR or DIR with existing components (default: %default)"))
    base_group.add_option("--tee-file",
                          action="store",
                          type="string",
                          dest="tee_file",
                          metavar="FILE",
                          default='/var/log/anvil.log',
                          help=("location to store tee of output (default: %default)"))
    parser.add_option_group(base_group)

    build_group = OptionGroup(parser, "Build specific options")
    build_group.add_option('-u', "--usr-only",
                           action="store_true",
                           dest="usr_only",
                           default=False,
                           help=("when packaging only store /usr directory"
                                 " (default: %default)"))
    build_group.add_option("--venv-deploy-dir",
                           action="store",
                           type="string",
                           dest="venv_deploy_dir",
                           default=None,
                           help=("for virtualenv builds, make the virtualenv "
                                 "relocatable to a directory different from "
                                 "build directory"))
    build_group.add_option('-c', "--overwrite-configs",
                           action="store_true",
                           dest="overwrite_configs",
                           default=False,
                           help=("when packaging, have rpm mark config "
                                 "files with %config instead of treating them "
                                 "as files that are overwritten on each rpm install"))
    parser.add_option_group(build_group)

    # Extract only what we care about; these will be passed
    # to the constructor of actions as arguments,
    # so don't adjust the naming willy-nilly...
    if previous_settings:
        parser.set_defaults(**previous_settings)
    (options, _args) = parser.parse_args()
    values = {}
    values['dir'] = (options.dir or "")
    values['action'] = (options.action or "")
    values['jobs'] = options.jobs
    values['persona_fn'] = options.persona_fn
    values['origins_fn'] = options.origins_fn
    values['verbose'] = options.verbose
    values['usr_only'] = options.usr_only
    values['tee_file'] = options.tee_file
    values['overwrite_configs'] = options.overwrite_configs
    if options.origins_patch_fn:
        with open(options.origins_patch_fn) as fp:
            values['origins_patch'] = json.load(fp)
    if options.distros_patch_fn:
        with open(options.distros_patch_fn) as fp:
            values['distros_patch'] = json.load(fp)
    values['venv_deploy_dir'] = options.venv_deploy_dir
    return values
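
# An example patch file for --origins-patch / --distros-patch (contents are
# hypothetical; the format is an RFC 6902 JSON Patch array, loaded above
# with json.load):
#
#   [{"op": "replace", "path": "/nova/branch", "value": "stable/mitaka"}]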

View File

@ -1,40 +0,0 @@
# -*- coding: utf-8 -*-
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2014 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import jsonpatch

from anvil import utils


class Origin(dict):
    def __init__(self, filename, patched=False):
        super(Origin, self).__init__()
        self.filename = filename
        self.patched = patched


def load(filename, patch_file=None):
    base = utils.load_yaml(filename)
    patched = False
    if patch_file:
        patch = jsonpatch.JsonPatch(patch_file)
        patch.apply(base, in_place=True)
        patched = True
    origin = Origin(filename, patched=patched)
    origin.update(base)
    return origin

View File

@ -1,26 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Package formats and package management systems support.
Supported formats:
- pip
- RPM
Supported systems:
- pip
- YUM
"""

View File

@ -1,381 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# R0902: Too many instance attributes
# pylint: disable=R0902

import functools

from anvil import colorizer
from anvil import exceptions as exc
from anvil import log as logging
from anvil.packaging.helpers import multipip_helper
from anvil.packaging.helpers import pip_helper
from anvil import shell as sh
from anvil import trace as tr
from anvil import utils

LOG = logging.getLogger(__name__)


def sort_req(r1, r2):
    return cmp(r1.key, r2.key)


class InstallHelper(object):
    """Run pre and post install for a single package."""

    def __init__(self, distro):
        self.distro = distro

    def pre_install(self, pkg, params=None):
        cmds = pkg.get('pre-install')
        if cmds:
            LOG.info("Running pre-install commands for package %s.", colorizer.quote(pkg['name']))
            utils.execute_template(*cmds, params=params)

    def post_install(self, pkg, params=None):
        cmds = pkg.get('post-install')
        if cmds:
            LOG.info("Running post-install commands for package %s.", colorizer.quote(pkg['name']))
            utils.execute_template(*cmds, params=params)
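
# A package entry with hooks, as consumed above, might look like this (the
# command structure is an assumption based on how execute_template() is
# invoked; the names are hypothetical):
#
#   pkg = {'name': 'openstack-nova-api',
#          'pre-install': [{'cmd': ['service', 'openstack-nova-api', 'stop']}],
#          'post-install': [{'cmd': ['service', 'openstack-nova-api', 'start']}]}
#   InstallHelper(distro).pre_install(pkg)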
class DependencyHandler(object):
"""Basic class for handler of OpenStack dependencies."""
# Sometimes pip fails doing things, retry it when this happens...
RETRIES = 3
RETRY_DELAY = 10
def __init__(self, distro, root_dir,
instances, opts, group, prior_groups):
self.distro = distro
self.root_dir = root_dir
self.instances = instances
self.prior_groups = prior_groups
self.opts = opts or {}
self.group = group
self.retries = max(0, int(opts.get('pip_retries', self.RETRIES)))
self.retry_delay = max(0, float(opts.get('pip_retry_delay',
self.RETRY_DELAY)))
# Various paths we will use while operating
self.deps_dir = sh.joinpths(self.root_dir, "deps")
self.download_dir = sh.joinpths(self.deps_dir, "download")
self.log_dir = sh.joinpths(self.deps_dir, "output")
sh.mkdir(self.log_dir, recurse=True)
self.gathered_requires_filename = sh.joinpths(self.deps_dir, "pip-requires-group-%s" % group)
self.forced_requires_filename = sh.joinpths(self.deps_dir, "forced-requires-group-%s" % group)
self.download_requires_filename = sh.joinpths(self.deps_dir, "download-requires-group-%s" % group)
self.multipip = multipip_helper.Helper()
# List of requirements
self.pips_to_install = []
self.forced_pips = []
# Instances to there app directory (with a setup.py inside)
self.package_dirs = self._get_package_dirs(instances)
# Track what file we create so they can be cleaned up on uninstall.
trace_fn = tr.trace_filename(self.root_dir, 'deps')
self.tracewriter = tr.TraceWriter(trace_fn, break_if_there=False)
self.tracereader = tr.TraceReader(trace_fn)
self.requirements = {}
for key in ("build-requires", "requires", "conflicts"):
req_set = set()
for inst in self.instances:
req_set |= set(pkg["name"]
for pkg in inst.get_option(key) or [])
self.requirements[key] = req_set
ignore_pips = set()
ignore_distro_pips = self.distro.get_dependency_config("ignoreable_pips", quiet=True)
if ignore_distro_pips:
ignore_pips.update(ignore_distro_pips)
self.ignore_pips = ignore_pips
def iter_instance_and_eggs(self, include_priors):
groups = [self.instances]
if include_priors:
for _group, prior_instances in self.prior_groups:
groups.append(list(prior_instances.values()))
for instances in groups:
for i in instances:
try:
yield i, dict(i.egg_info)
except AttributeError:
pass
@property
def python_names(self):
return [egg['name']
for _instance, egg in self.iter_instance_and_eggs(True)]
@staticmethod
def _get_package_dirs(instances):
package_dirs = []
for inst in instances:
app_dir = inst.get_option("app_dir")
if sh.isfile(sh.joinpths(app_dir, "setup.py")):
package_dirs.append(app_dir)
return package_dirs
def package_start(self):
create_requirement = pip_helper.create_requirement
def gather_extras(instance):
pips = []
for p in instance.get_option("pips", default_value=[]):
req = create_requirement(p['name'], p.get('version'))
pips.append(str(req))
requires_files = list(getattr(instance, 'requires_files', []))
if instance.get_bool_option('use_tests_requires', default_value=True):
requires_files.extend(getattr(instance, 'test_requires_files', []))
return (pips, requires_files)
requires_files = []
extra_pips = []
for i in self.instances:
instance_pips, instance_requires_files = gather_extras(i)
extra_pips.extend(instance_pips)
requires_files.extend(instance_requires_files)
requires_files = filter(sh.isfile, requires_files)
self._gather_pips_to_install(requires_files, sorted(set(extra_pips)))
self._scan_pip_requires(requires_files)
def package_instance(self, instance):
pass
def package_finish(self):
pass
def build_binary(self):
pass
def install(self, general):
pass
def install_all_deps(self):
pass
def uninstall(self):
pass
def destroy(self):
self.uninstall()
# Clear out any files touched.
if self.tracereader.exists():
for f in self.tracereader.files_touched():
sh.unlink(f)
for d in self.tracereader.dirs_made():
sh.deldir(d)
sh.unlink(self.tracereader.filename())
def _scan_pip_requires(self, requires_files):
own_eggs = [egg for _instance, egg
in self.iter_instance_and_eggs(False)]
def replace_forced_requirements(fn, forced_by_key):
old_lines = sh.load_file(fn).splitlines()
new_lines = []
alterations = []
for line in old_lines:
try:
source_req = pip_helper.extract_requirement(line)
except (ValueError, TypeError):
pass
else:
if source_req:
validate_requirement(fn, source_req)
try:
replace_req = forced_by_key[source_req.key]
except KeyError:
pass
else:
replace_req = str(replace_req)
source_req = str(source_req)
if replace_req != source_req:
line = replace_req
alterations.append("%s => %s"
% (colorizer.quote(source_req),
colorizer.quote(replace_req)))
new_lines.append(line)
if alterations:
contents = "# Cleaned on %s\n\n%s\n" % (utils.iso8601(), "\n".join(new_lines))
sh.write_file_and_backup(fn, contents)
utils.log_iterable(alterations,
logger=LOG,
header="Replaced %s requirements in %s"
% (len(alterations), fn),
color=None)
return len(alterations)
def on_replace_done(fn, time_taken):
LOG.debug("Replacing potential forced requirements in %s"
" took %s seconds", colorizer.quote(fn), time_taken)
def validate_requirement(filename, source_req):
install_egg = None
for egg_info in own_eggs:
if egg_info['name'] == source_req.key:
install_egg = egg_info
break
if not install_egg:
return
# Ensure what we are about to install/create will actually work
# with the desired version. If it is not compatible then we should
# abort and someone should update the tag/branch in the origin
# file (or fix it via some other mechanism).
if install_egg['version'] not in source_req:
msg = ("Can not satisfy '%s' with '%s', version"
" conflict found in %s")
raise exc.DependencyException(msg % (source_req,
install_egg['req'],
filename))
if not requires_files:
return
requires_files = sorted(requires_files)
utils.log_iterable(requires_files,
logger=LOG,
header="Scanning %s pip 'requires' files" % (len(requires_files)))
forced_by_key = {}
for pkg in self.forced_pips:
forced_by_key[pkg.key] = pkg
mutations = 0
for fn in requires_files:
LOG.debug("Replacing any potential forced requirements in %s",
colorizer.quote(fn))
mutations += utils.time_it(functools.partial(on_replace_done, fn),
replace_forced_requirements,
fn, forced_by_key)
# NOTE(imelnikov): after updating requirement lists we should re-fetch
# data from them again, so we drop pip helper caches here.
if mutations > 0:
pip_helper.drop_caches()
def _gather_pips_to_install(self, requires_files, extra_pips=None):
"""Analyze requires_files and extra_pips.
Updates `self.forced_pips` and `self.pips_to_install`.
Writes requirements to `self.gathered_requires_filename`.
"""
ignore_pips = set(self.python_names)
ignore_pips.update(self.ignore_pips)
forced_pips = set()
forced_distro_pips = self.distro.get_dependency_config("forced_pips", quiet=True)
if forced_distro_pips:
forced_pips.update(forced_distro_pips)
compatibles, incompatibles = self.multipip.resolve(extra_pips,
requires_files,
ignore_pips,
forced_pips)
self.pips_to_install = compatibles
sh.write_file(self.gathered_requires_filename, "\n".join(self.pips_to_install))
pip_requirements, raw_requirements = pip_helper.read_requirement_files([self.gathered_requires_filename])
pips_to_install = sorted(raw_requirements, cmp=sort_req)
utils.log_iterable(pips_to_install, logger=LOG,
header="Full known python dependency list")
for (name, lines) in incompatibles.items():
LOG.warn("Incompatible requirements found for %s",
colorizer.quote(name, quote_color='red'))
for line in lines:
LOG.warn(line)
if not self.pips_to_install:
LOG.error("No valid dependencies found. Something went wrong.")
raise exc.DependencyException("No valid dependencies found")
# Translate those that we altered requirements for into a set of forced
# requirements file (and associated list).
self.forced_pips = []
forced_pip_keys = []
for req in [pip_helper.extract_requirement(line) for line in self.pips_to_install]:
if req.key in incompatibles and req.key not in forced_pip_keys:
self.forced_pips.append(req)
forced_pip_keys.append(req.key)
self.forced_pips = sorted(self.forced_pips, cmp=sort_req)
forced_pips = [str(req) for req in self.forced_pips]
utils.log_iterable(forced_pips, logger=LOG,
header="Automatically forced python dependencies")
sh.write_file(self.forced_requires_filename, "\n".join(forced_pips))
def _filter_download_requires(self):
"""Shrinks the pips that were downloaded into a smaller set.
:returns: a list of all requirements that must be downloaded
:rtype: list of str
"""
return self.pips_to_install
def _examine_download_dir(self, pips_to_download, pip_download_dir):
pip_names = set([p.key for p in pips_to_download])
what_downloaded = sorted(sh.listdir(pip_download_dir, files_only=True))
LOG.info("Validating %s files that were downloaded.", len(what_downloaded))
for filename in what_downloaded:
pkg_details = pip_helper.get_archive_details(filename)
req = pkg_details['req']
if req.key not in pip_names:
LOG.info("Dependency %s was automatically included.",
colorizer.quote(req))
return what_downloaded
@staticmethod
def _requirements_satisfied(pips_list, download_dir):
downloaded_req = [pip_helper.get_archive_details(filename)["req"]
for filename in sh.listdir(download_dir, files_only=True)]
downloaded_req = dict((req.key, req.specs[0][1]) for req in downloaded_req)
for req_str in pips_list:
req = pip_helper.extract_requirement(req_str)
try:
downloaded_version = downloaded_req[req.key]
except KeyError:
return False
else:
if downloaded_version not in req:
return False
return True
def _try_download(self, pips_to_download, attempt=0):
def on_download_finish(time_taken):
LOG.info("Took %0.2f seconds to download...", time_taken)
LOG.info("Downloading %s dependencies with pip (attempt %s)...",
len(pips_to_download), attempt)
output_filename = sh.joinpths(self.log_dir,
"pip-download-attempt-%s.log" % (attempt))
LOG.info("Please wait this may take a while...")
LOG.info("Check %s for download activity details...",
colorizer.quote(output_filename))
utils.time_it(on_download_finish,
pip_helper.download_dependencies,
self.download_dir, pips_to_download,
output_filename)
def download_dependencies(self):
"""Download dependencies from `$deps_dir/download-requires`."""
# NOTE(aababilov): do not drop download_dir - it can be reused
sh.mkdirslist(self.download_dir, tracewriter=self.tracewriter)
pips_to_download = self._filter_download_requires()
sh.write_file(self.download_requires_filename,
"\n".join([str(req) for req in pips_to_download]))
if not pips_to_download:
return ([], [])
# NOTE(aababilov): the user could have changed persona, so check
# that all requirements have been downloaded...
if self._requirements_satisfied(pips_to_download, self.download_dir):
LOG.info("All python dependencies have been already downloaded")
else:
utils.retry(self.retries, self.retry_delay,
self._try_download, pips_to_download)
pips_downloaded = [pip_helper.extract_requirement(p) for p in pips_to_download]
what_downloaded = self._examine_download_dir(pips_downloaded, self.download_dir)
return (pips_downloaded, what_downloaded)

View File

@ -1,15 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

View File

@ -1,48 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import collections
import json
from anvil import shell as sh
class Helper(object):
def __init__(self):
self._executable = sh.which("explode_envra", ["tools/"])
def explode(self, *filenames):
if not filenames:
return []
cmdline = [self._executable]
for filename in filenames:
cmdline.append(sh.basename(filename))
(stdout, _stderr) = sh.execute(cmdline)
results = []
missing = collections.deque(filenames)
for line in stdout.splitlines():
decoded = json.loads(line)
decoded['origin'] = missing.popleft()
results.append(decoded)
if missing:
raise AssertionError("%s filenames names were lost during"
" exploding: %s" % (len(missing),
list(missing)))
if len(results) > len(filenames):
diff = len(results) - len(filenames)
raise AssertionError("%s filenames appeared unexpectedly while"
" exploding" % (diff))
return results
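# Hedged usage sketch (the filename and result fields are illustrative;
# the exact keys depend on the bundled ``explode_envra`` tool):
#
#   helper = Helper()
#   details = helper.explode("python-nova-2013.1-1.el6.noarch.rpm")
#   # -> [{'name': 'python-nova', 'version': '2013.1', ...,
#   #      'origin': 'python-nova-2013.1-1.el6.noarch.rpm'}]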

View File

@ -1,73 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2014 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import collections
import six
from anvil import shell as sh
from anvil import utils
class Helper(object):
def __init__(self):
self._multipip_executable = sh.which("multipip", ["tools/"])
def _call_multipip(self, requirements,
requires_files=None, ignore_requirements=None,
forced_requirements=None):
cmdline = [self._multipip_executable]
if requires_files:
cmdline.append("-r")
cmdline.extend(requires_files)
if ignore_requirements:
cmdline.append("--ignore-package")
cmdline.extend(ignore_requirements)
if forced_requirements:
cmdline.append("--force-package")
cmdline.extend(forced_requirements)
if requirements:
cmdline.append("--")
cmdline.extend(requirements)
(stdout, stderr) = sh.execute(cmdline, check_exit_code=False)
compatibles = list(utils.splitlines_not_empty(stdout))
incompatibles = collections.defaultdict(list)
current_name = ''
for line in stderr.strip().splitlines():
if line.endswith(": incompatible requirements"):
current_name = line.split(":", 1)[0].lower().strip()
if current_name not in incompatibles:
incompatibles[current_name] = []
else:
incompatibles[current_name].append(line)
cleaned_incompatibles = dict()
for (requirement, lines) in six.iteritems(incompatibles):
requirement = requirement.strip()
if not requirement:
continue
if not lines:
continue
cleaned_incompatibles[requirement] = lines
incompatibles = cleaned_incompatibles
return (compatibles, incompatibles)
def resolve(self, requirements,
requires_files=None, ignore_requirements=None,
forced_requirements=None):
return self._call_multipip(requirements,
requires_files=requires_files,
ignore_requirements=ignore_requirements,
forced_requirements=forced_requirements)
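# Illustrative only: assuming multipip reports conflicts on stderr in the
# form parsed above (a "<name>: incompatible requirements" header line
# followed by detail lines), a resolve() call behaves roughly like:
#
#   compatibles, incompatibles = Helper().resolve(["pip==1.3.1"],
#                                                 requires_files=["r.txt"])
#   # incompatibles -> {'pip': ['pip==1.3.1 (from command line)',
#   #                           'pip>=1.4 (from r.txt)']}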

View File

@ -1,249 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import glob
import pkg_resources
import re
import sys
import tempfile
import threading
from pip import download as pip_download
from pip import req as pip_req
from pip import utils as pip_util
import pkginfo
import six
from anvil import log as logging
from anvil import shell as sh
from anvil import utils
LOG = logging.getLogger(__name__)
# Caches and their associated locks...
REQUIREMENT_FILE_CACHE = {}
REQUIREMENT_FILE_CACHE_LOCK = threading.RLock()
EGGS_DETAILED = {}
EGGS_DETAILED_LOCK = threading.RLock()
PYTHON_KEY_VERSION_RE = re.compile("^(.+)-([0-9][0-9.a-zA-Z]*)$")
PIP_EXECUTABLE = sh.which_first(['pip', 'pip-python'])
def create_requirement(name, version=None):
name = pkg_resources.safe_name(name.strip())
if not name:
raise ValueError("Pip requirement provided with an empty name")
if version is not None:
if isinstance(version, (int, float, long)):
version = "==%s" % version
if isinstance(version, basestring):
if version[0] not in "=<>":
version = "==%s" % version
else:
raise TypeError(
"Pip requirement version must be a string or numeric type")
name = "%s%s" % (name, version)
return pkg_resources.Requirement.parse(name)
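# Example behaviour of the helper above (derived from its logic):
#   create_requirement("Nova", "2013.1")  # -> Requirement('Nova==2013.1')
#   create_requirement("nova", ">=1.0")   # -> Requirement('nova>=1.0')
#   create_requirement("nova", 7)         # -> Requirement('nova==7')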
def _split(line):
if line.startswith('-e') or line.startswith('--editable'):
if line.startswith('-e'):
line = line[2:].strip()
else:
line = line[len('--editable'):].strip().lstrip('=')
if line:
return ('-e', line)
return (None, line)
def extract(line):
if line.startswith('-e') or line.startswith('--editable'):
if line.startswith('-e'):
line = line[2:].strip()
else:
line = line[len('--editable'):].strip().lstrip('=')
req = pip_req.InstallRequirement.from_editable(line, comes_from="??")
else:
req = pip_req.InstallRequirement.from_line(line, comes_from="??")
# NOTE(aababilov): req.req.key can look like oslo.config-1.2.0a2,
# so split it
if req.req:
match = PYTHON_KEY_VERSION_RE.match(req.req.key)
if match:
req.req = pkg_resources.Requirement.parse(
"%s>=%s" % (match.group(1), match.group(2)))
return req
def extract_requirement(line):
req = extract(line)
return req.req
def get_directory_details(path, pbr_version=None):
if not sh.isdir(path):
raise IOError("Can not detail non-existent directory %s" % (path))
# Check if we already got the details of this dir previously
with EGGS_DETAILED_LOCK:
path = sh.abspth(path)
cache_key = "d:%s" % (path)
if cache_key in EGGS_DETAILED:
return EGGS_DETAILED[cache_key]
details = None
skip_paths = [
sh.joinpths(path, "PKG-INFO"),
sh.joinpths(path, "EGG-INFO"),
]
skip_paths.extend(glob.glob(sh.joinpths(path, "*.egg-info")))
if any(sh.exists(a_path) for a_path in skip_paths):
# Some packages seem to not support the 'egg_info' call and
# provide their own path/file that contains this information
# already, so just use it if we can get at it...
#
# Ie for pyyaml3.x:
#
# error: invalid command 'egg_info'
details = pkginfo.Develop(path)
if not details or not details.name:
cmd = [sys.executable, 'setup.py', 'egg_info']
if pbr_version:
env_overrides = {
"PBR_VERSION": str(pbr_version),
}
else:
env_overrides = {}
sh.execute(cmd, cwd=path, env_overrides=env_overrides)
details = pkginfo.get_metadata(path)
if not details or not details.name:
raise RuntimeError("No egg detail information discovered"
" at '%s'" % path)
egg_details = {
'req': create_requirement(details.name, version=details.version),
}
for attr_name in ['description', 'author',
'version', 'name', 'summary']:
egg_details[attr_name] = getattr(details, attr_name)
for attr_name in ['description', 'author', 'summary']:
attr_value = egg_details[attr_name]
if isinstance(attr_value, six.text_type):
# Fix any unicode which will cause unicode decode failures...
# versions or names shouldn't be unicode, and the rest
# we don't really care about being unicode (since it's
# just used for logging right now anyway...).
#
# The reason this is done is that 'elasticsearch' seems to
# have a unicode author name, and that causes the log_object
# to blowup, so just avoid that by replacing this information
# in the first place.
egg_details[attr_name] = attr_value.encode("ascii",
errors='replace')
LOG.debug("Extracted '%s' egg detail information:", path)
utils.log_object(egg_details, logger=LOG, level=logging.DEBUG)
EGGS_DETAILED[cache_key] = egg_details
return egg_details
def drop_caches():
with EGGS_DETAILED_LOCK:
EGGS_DETAILED.clear()
with REQUIREMENT_FILE_CACHE_LOCK:
REQUIREMENT_FILE_CACHE.clear()
def get_archive_details(filename, pbr_version=None):
if not sh.isfile(filename):
raise IOError("Can not detail non-existent file %s" % (filename))
# Check if we already got the details of this file previously
with EGGS_DETAILED_LOCK:
cache_key = "f:%s:%s" % (sh.basename(filename), sh.getsize(filename))
if cache_key in EGGS_DETAILED:
return EGGS_DETAILED[cache_key]
# Get pip to get us the egg-info.
with utils.tempdir() as td:
filename = sh.copy(filename, sh.joinpths(td, sh.basename(filename)))
extract_to = sh.mkdir(sh.joinpths(td, 'build'))
pip_util.unpack_file(filename, extract_to, content_type='', link='')
egg_details = get_directory_details(extract_to,
pbr_version=pbr_version)
EGGS_DETAILED[cache_key] = egg_details
return egg_details
def parse_requirements(contents):
with tempfile.NamedTemporaryFile(suffix=".txt") as tmp_fh:
tmp_fh.write(contents)
tmp_fh.write("\n")
tmp_fh.flush()
return read_requirement_files([tmp_fh.name])
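# For example (assumed), parsing two simple pins:
#   pip_reqs, plain_reqs = parse_requirements("six>=1.4.0\nanyjson")
#   [str(r) for r in plain_reqs]  # -> ['six>=1.4.0', 'anyjson']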
def read_requirement_files(files):
pip_requirements = []
session = pip_download.PipSession()
for filename in files:
if sh.isfile(filename):
cache_key = "f:%s:%s" % (sh.abspth(filename), sh.getsize(filename))
with REQUIREMENT_FILE_CACHE_LOCK:
try:
reqs = REQUIREMENT_FILE_CACHE[cache_key]
except KeyError:
reqs = tuple(pip_req.parse_requirements(filename,
session=session))
REQUIREMENT_FILE_CACHE[cache_key] = reqs
pip_requirements.extend(reqs)
return (pip_requirements,
[req.req for req in pip_requirements])
def download_dependencies(download_dir, pips_to_download, output_filename):
if not pips_to_download:
return
# NOTE(aababilov): pip has issues with already downloaded files
if sh.isdir(download_dir):
for filename in sh.listdir(download_dir, files_only=True):
sh.unlink(filename)
else:
sh.mkdir(download_dir)
# Clean out any previous paths that we don't want around.
build_path = sh.joinpths(download_dir, ".build")
if sh.isdir(build_path):
sh.deldir(build_path)
sh.mkdir(build_path)
cmdline = [
PIP_EXECUTABLE, '-v',
'install', '-I', '-U',
'--download', download_dir,
'--build', build_path,
# Don't download wheels since we lack the ability to create
# rpms from them (until some future when we have that ability, if ever)...
"--no-use-wheel",
]
for p in pips_to_download:
for p_seg in _split(p):
if p_seg:
cmdline.append(p_seg)
sh.execute_save_output(cmdline, output_filename)
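# The function above shells out to pip rather than using its internals;
# for pips_to_download=['six>=1.4.0'] and download_dir='/deps/downloads'
# (paths illustrative), the constructed command line is approximately:
#
#   pip -v install -I -U --download /deps/downloads \
#       --build /deps/downloads/.build --no-use-wheel six>=1.4.0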

View File

@ -1,222 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2013 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import collections
import six
from anvil import log as logging
from anvil import settings
from anvil import shell as sh
from anvil import utils
from anvil.packaging.helpers import pip_helper
LOG = logging.getLogger(__name__)
def _fetch_missing_extra(python_names_in, python_names_out):
python_missing_names = set()
python_extra_names = set()
python_reqs_in = set(pip_helper.extract_requirement(n)
for n in python_names_in)
python_reqs_out = set(pip_helper.extract_requirement(n)
for n in python_names_out)
for req in python_reqs_in:
if req not in python_reqs_out:
python_missing_names.add(str(req))
for req in python_reqs_out:
if req not in python_reqs_in:
python_extra_names.add(str(req))
return (python_missing_names, python_extra_names)
class Helper(object):
def __init__(self, epoch_map, package_map, arch_dependent,
rpmbuild_dir, download_dir, deps_dir, log_dir,
build_options):
self._py2rpm_executable = sh.which("py2rpm", ["tools/"])
self._epoch_map = epoch_map
self._package_map = package_map
self._arch_dependent = arch_dependent
# Various paths that are used during operating
self._rpmbuild_dir = rpmbuild_dir
self._download_dir = download_dir
self._deps_dir = deps_dir
self._log_dir = log_dir
self._build_options = build_options
@staticmethod
def _make_value_escape(value):
# Escape things so makefile doesn't puke on us...
value = value.replace(" ", "\ ")
value = value.replace("$", "$$")
value = value.replace("#", "\#")
return value
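# Example (derived from the replacements above):
#   _make_value_escape('a b$c#d')  # -> 'a\ b$$c\#d'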
def _start_cmdline(self, escape_values=False):
if escape_values:
escape_func = self._make_value_escape
else:
escape_func = lambda value: value
cmdline = [
self._py2rpm_executable,
"--rpm-base",
self._rpmbuild_dir
]
if self._epoch_map:
cmdline += [
"--epoch-map"
] + ["%s==%s" % (key, escape_func(value))
for key, value in self._epoch_map.iteritems()]
if self._package_map:
cmdline += [
"--package-map",
] + ["%s==%s" % (key, escape_func(value))
for key, value in self._package_map.iteritems()]
if self._build_options:
build_options = []
for key, values in self._build_options.iteritems():
if values:
for value in values:
build_options.append("%s==%s" % (key,
escape_func(value)))
if build_options:
cmdline.append("--build-options")
cmdline.extend(build_options)
if self._arch_dependent:
cmdline += [
"--arch-dependent",
] + list(self._arch_dependent)
return cmdline
def _execute_make(self, filename, marks_dir, jobs):
cmdline = ["make", "-f", filename, "-j", str(jobs)]
out_filename = sh.joinpths(self._log_dir, "%s.log" % sh.basename(filename))
sh.execute_save_output(cmdline, out_filename, cwd=marks_dir)
def _convert_names_to_rpm(self, python_names, only_name):
if not python_names:
return {}, {}
cmdline = self._start_cmdline() + ["--convert"] + python_names
result = collections.defaultdict(set)
conflicts = collections.defaultdict(set)
current_source = None
for line in sh.execute(cmdline)[0].splitlines():
# NOTE(harlowja): format is "Requires: rpm-name <=> X" or when
# the original requirement is denoted by the following comment
# lines "# Source: python-requirement"
if line.startswith("Requires:"):
line = line[len("Requires:"):]
if only_name:
positions = [line.find(">"), line.find("<"), line.find("=")]
positions = sorted([p for p in positions if p != -1])
if positions:
line = line[0:positions[0]]
result[current_source].add(line.strip())
elif line.startswith("Conflicts:"):
line = line[len("Conflicts:"):]
conflicts[current_source].add(line.strip())
elif line.startswith("# Source:"):
current_source = line[len("# Source:"):].strip()
found_names = set(result.keys())
found_names.update(conflicts.keys())
missing_names, extra_names = _fetch_missing_extra(python_names,
found_names)
if missing_names:
raise AssertionError("Python names were lost during conversion: %s"
% ', '.join(sorted(missing_names)))
if extra_names:
raise AssertionError("Extra python names were found during conversion: %s"
% ', '.join(sorted(extra_names)))
return result, conflicts
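# Illustrative py2rpm --convert output (format assumed from the parser
# above) and the resulting mapping with only_name=True:
#
#   # Source: oslo.config>=1.2
#   Requires: python-oslo-config >= 1.2
#
# -> result == {'oslo.config>=1.2': set(['python-oslo-config'])}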
def names_to_rpm_names(self, python_names):
mapping = self._convert_names_to_rpm(python_names, only_name=True)[0]
result = {}
for k, v in six.iteritems(mapping):
assert len(v) == 1, ('There should be exactly one RPM name for '
'python module %s, but we have: %s'
% (k, sorted(v)))
result[k] = v.pop()
return result
def names_to_rpm_deps(self, python_names):
# Given a set of packages in Python namespace, return the equivalent
# Requires and Conflicts in RPM namespace.
requires, conflicts = self._convert_names_to_rpm(python_names, only_name=False)
requires_list = [req for value in six.itervalues(requires) for req in value]
conflicts_list = [req for value in six.itervalues(conflicts) for req in value]
return requires_list, conflicts_list
def build_all_srpms(self, package_files, tracewriter, jobs):
(_fn, content) = utils.load_template(sh.joinpths("packaging", "makefiles"), "source.mk")
scripts_dir = sh.abspth(sh.joinpths(settings.TEMPLATE_DIR, "packaging", "scripts"))
cmdline = self._start_cmdline(escape_values=True)[1:] + [
"--scripts-dir", scripts_dir,
"--source-only",
"--rpm-base", self._rpmbuild_dir,
"--debug",
]
executable = " ".join(self._start_cmdline()[0:1])
params = {
"DOWNLOADS_DIR": self._download_dir,
"LOGS_DIR": self._log_dir,
"PY2RPM": executable,
"PY2RPM_FLAGS": " ".join(cmdline)
}
marks_dir = sh.joinpths(self._deps_dir, "marks-deps")
if not sh.isdir(marks_dir):
sh.mkdirslist(marks_dir, tracewriter=tracewriter)
makefile_path = sh.joinpths(self._deps_dir, "deps.mk")
sh.write_file(makefile_path, utils.expand_template(content, params),
tracewriter=tracewriter)
utils.log_iterable(package_files,
header="Building %s SRPM packages using %s jobs" %
(len(package_files), jobs),
logger=LOG)
self._execute_make(makefile_path, marks_dir, jobs)
def build_srpm(self, source, log_filename,
release=None, with_tests=False):
cmdline = self._start_cmdline() + ["--source-only", "--debug"]
if release is not None:
cmdline.extend(["--release", release])
if with_tests:
cmdline.append("--with-tests")
cmdline.extend(["--", source])
out_filename = sh.joinpths(self._log_dir,
"py2rpm-build-%s.log" % log_filename)
sh.execute_save_output(cmdline, out_filename, cwd=source)
def build_all_binaries(self, repo_name, src_repo_dir, rpmbuild_flags,
tracewriter, jobs):
makefile_path = sh.joinpths(self._deps_dir, "binary-%s.mk" % repo_name)
marks_dir = sh.joinpths(self._deps_dir, "marks-binary")
if not sh.isdir(marks_dir):
sh.mkdirslist(marks_dir, tracewriter=tracewriter)
params = {
"SRC_REPO_DIR": src_repo_dir,
"RPMBUILD_FLAGS": rpmbuild_flags,
"LOGS_DIR": self._log_dir,
"RPMTOP_DIR": self._rpmbuild_dir,
}
(_fn, content) = utils.load_template(sh.joinpths("packaging", "makefiles"), "binary.mk")
sh.write_file(makefile_path, utils.expand_template(content, params),
tracewriter=tracewriter)
self._execute_make(makefile_path, marks_dir, jobs)

View File

@ -1,167 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import tempfile
import time
from anvil import exceptions as excp
from anvil import log as logging
from anvil import shell as sh
from anvil import utils
LOG = logging.getLogger(__name__)
def _generate_log_filename(arglist):
pieces = ['yyoom-']
for a in arglist:
a = a.strip()
if not a or a.startswith("-") or sh.exists(a):
break
else:
pieces.append(a)
pieces.append("_")
pieces.append(int(time.time()))
pieces.append("_")
pieces.append(utils.get_random_string(4))
pieces.append('.log')
return "".join([str(p) for p in pieces])
class Helper(object):
def __init__(self, log_dir, repos):
# Executables we require to operate
self.yyoom_executable = sh.which("yyoom", ["tools/"])
# Preferred repositories names
self._repos = repos
# Caches of installed and available packages
self._installed = None
self._available = None
self._logs_dir = log_dir
def _yyoom(self, arglist, on_completed=None):
if not on_completed:
on_completed = lambda data, errored: None
if not sh.isdir(self._logs_dir):
sh.mkdirslist(self._logs_dir)
with tempfile.NamedTemporaryFile(suffix=".json") as fh:
cmdline = [
self.yyoom_executable,
"--output-file", fh.name,
"--verbose",
]
cmdline.extend(arglist)
log_filename = sh.joinpths(self._logs_dir,
_generate_log_filename(arglist))
LOG.debug("Running yyoom: log output will be placed in %s",
log_filename)
try:
sh.execute_save_output(cmdline, log_filename)
except excp.ProcessExecutionError:
with excp.reraise():
try:
fh.seek(0)
data = utils.parse_json(fh.read())
except Exception:
LOG.exception("Failed to parse YYOOM output")
else:
on_completed(data, True)
else:
fh.seek(0)
data = utils.parse_json(fh.read())
on_completed(data, False)
return data
def _traced_yyoom(self, arglist, tracewriter):
def on_completed(data, errored):
self._handle_transaction_data(tracewriter, data)
return self._yyoom(arglist, on_completed=on_completed)
@staticmethod
def _handle_transaction_data(tracewriter, data):
if not data:
return
failed_names = None
try:
if tracewriter:
for action in data:
if action['action_type'] == 'install':
tracewriter.package_installed(action['name'])
elif action['action_type'] == 'upgrade':
tracewriter.package_upgraded(action['name'])
failed_names = [action['name']
for action in data
if action['action_type'] == 'error']
except Exception:
LOG.exception("Failed to handle YYOOM transaction data")
else:
if failed_names:
raise RuntimeError("YYOOM failed on %s" % ", ".join(failed_names))
def is_installed(self, name):
return bool(self.find_installed(name))
def find_installed(self, name):
installed = self.list_installed()
return [item for item in installed if item['name'] == name]
def list_available(self):
if self._available is None:
self._available = self._yyoom(['list', 'available'])
return list(self._available)
def list_installed(self):
if self._installed is None:
self._installed = self._yyoom(['list', 'installed'])
return list(self._installed)
def builddep(self, srpm_path, tracewriter=None):
self._traced_yyoom(['builddep', srpm_path], tracewriter)
def _reset(self):
# reset the caches:
self._installed = None
self._available = None
def clean(self):
try:
self._yyoom(['cleanall'])
finally:
self._reset()
def transaction(self, install_pkgs=(), remove_pkgs=(), tracewriter=None):
if not install_pkgs and not remove_pkgs:
return
cmdline = ['transaction']
for pkg in install_pkgs:
cmdline.append('--install')
cmdline.append(pkg)
for pkg in remove_pkgs:
cmdline.append('--erase')
cmdline.append(pkg)
for repo in self._repos:
cmdline.append('--prefer-repo')
cmdline.append(repo)
try:
self._traced_yyoom(cmdline, tracewriter)
finally:
self._reset()
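# For install_pkgs=['python-nova'], remove_pkgs=['python-glance'] and
# preferred repos ['anvil-deps', 'anvil'], the argument list built above
# is:
#
#   ['transaction',
#    '--install', 'python-nova',
#    '--erase', 'python-glance',
#    '--prefer-repo', 'anvil-deps',
#    '--prefer-repo', 'anvil']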

View File

@ -1,275 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2014 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import contextlib
import functools
import itertools
import os
import re
import tarfile
import six
from anvil import async
from anvil import colorizer
from anvil import env
from anvil import exceptions as excp
from anvil import log as logging
from anvil import shell as sh
from anvil import utils
from anvil.packaging import base
from anvil.packaging.helpers import pip_helper
LOG = logging.getLogger(__name__)
def _on_finish(what, time_taken):
LOG.info("%s took %s seconds", what, time_taken)
# TODO(harlowja): think we can remove this...
class VenvInstallHelper(base.InstallHelper):
def pre_install(self, pkg, params=None):
pass
def post_install(self, pkg, params=None):
pass
class VenvDependencyHandler(base.DependencyHandler):
PREREQUISITE_UPGRADE_PKGS = frozenset(['pip'])
def __init__(self, distro, root_dir,
instances, opts, group, prior_groups):
super(VenvDependencyHandler, self).__init__(distro, root_dir,
instances, opts, group,
prior_groups)
self.jobs = max(0, int(opts.get('jobs', 0)))
self.install_counters = {}
def _venv_directory_for(self, instance):
return sh.joinpths(instance.get_option('component_dir'), 'venv')
def _install_into_venv(self, instance, requirements,
upgrade=False, extra_env_overrides=None):
venv_dir = self._venv_directory_for(instance)
base_pip = [sh.joinpths(venv_dir, 'bin', 'pip')]
env_overrides = {
'PATH': os.pathsep.join([sh.joinpths(venv_dir, "bin"),
env.get_key('PATH', default_value='')]),
'VIRTUAL_ENV': venv_dir,
}
if extra_env_overrides:
env_overrides.update(extra_env_overrides)
cmd = list(base_pip) + ['install']
if upgrade:
cmd.append("--upgrade")
if isinstance(requirements, six.string_types):
cmd.extend([
'--requirement',
requirements
])
else:
for req in requirements:
cmd.append(str(req))
count = self.install_counters.get(instance.name, 0)
self.install_counters[instance.name] = count + 1
out_filename = sh.joinpths(self.log_dir, "venv-install-%s-%s.log" % (instance.name, count))
sh.execute_save_output(cmd, out_filename, env_overrides=env_overrides)
def _is_buildable(self, instance):
app_dir = instance.get_option('app_dir')
if app_dir and sh.isdir(app_dir) and hasattr(instance, 'egg_info'):
return True
return False
def _replace_deployment_paths(self, root_dir, replacer):
total_replacements = 0
files_replaced = 0
for path in sh.listdir(root_dir, recursive=True, files_only=True):
new_contents, replacements = replacer(sh.load_file(path))
if replacements:
sh.write_file(path, new_contents)
total_replacements += replacements
files_replaced += 1
return (files_replaced, total_replacements)
def _make_tarball(self, venv_dir, tar_filename, tar_path):
with contextlib.closing(tarfile.open(tar_filename, "w:gz")) as tfh:
for path in sh.listdir(venv_dir, recursive=True):
tarpath = tar_path + path[len(venv_dir):]
tarpath = os.path.abspath(tarpath)
tfh.add(path, recursive=False, arcname=tarpath)
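# Path rewriting example (paths illustrative): with
# venv_dir='/opt/stack/nova/venv' and
# tar_path='/opt/deploy/nova/nova-1.0-1-venv/venv', the member
# '/opt/stack/nova/venv/bin/pip' is stored in the archive as
# '/opt/deploy/nova/nova-1.0-1-venv/venv/bin/pip'.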
def package_finish(self):
super(VenvDependencyHandler, self).package_finish()
for instance in self.instances:
if not self._is_buildable(instance):
continue
venv_dir = sh.abspth(self._venv_directory_for(instance))
release = str(instance.get_option("release", default_value=1))
if release and not release.startswith('-'):
release = '-' + release
version_full = instance.egg_info['version'] + release
# Replace paths with virtualenv deployment directory.
if self.opts.get('venv_deploy_dir'):
deploy_dir = sh.joinpths(self.opts.get('venv_deploy_dir'),
instance.name)
replacer = functools.partial(
re.subn, re.escape(instance.get_option('component_dir')),
deploy_dir)
bin_dir = sh.joinpths(venv_dir, 'bin')
files_replaced, adjustments = self._replace_deployment_paths(bin_dir,
replacer)
if files_replaced:
LOG.info("Adjusted %s deployment path(s) in %s files",
adjustments, files_replaced)
tar_path = sh.joinpths(self.opts.get('venv_deploy_dir'), '%s/%s-%s-venv/venv' % (
instance.name, instance.name, version_full))
else:
tar_path = '%s/%s-%s-venv/venv' % (instance.name, instance.name, version_full)
# Create a tarball containing the virtualenv.
tar_filename = sh.joinpths(venv_dir, '%s-%s-venv.tar.gz' % (instance.name,
version_full))
LOG.info("Making tarball of %s built for %s with version %s at %s", venv_dir,
instance.name, version_full, tar_filename)
utils.time_it(functools.partial(_on_finish, "Tarball creation"),
self._make_tarball, venv_dir, tar_filename, tar_path)
def package_start(self):
super(VenvDependencyHandler, self).package_start()
self.install_counters.clear()
base_cmd = env.get_key('VENV_CMD', default_value='virtualenv')
for instance in self.instances:
if not self._is_buildable(instance):
continue
# Create a virtualenv...
venv_dir = self._venv_directory_for(instance)
sh.mkdirslist(venv_dir, tracewriter=self.tracewriter)
cmd = [base_cmd, '--clear', venv_dir]
LOG.info("Creating virtualenv at %s", colorizer.quote(venv_dir))
out_filename = sh.joinpths(self.log_dir, "venv-create-%s.log" % (instance.name))
sh.execute_save_output(cmd, out_filename)
self._install_into_venv(instance,
self.PREREQUISITE_UPGRADE_PKGS,
upgrade=True)
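# The bootstrap above is roughly equivalent to (paths illustrative):
#
#   virtualenv --clear <component_dir>/venv
#   <component_dir>/venv/bin/pip install --upgrade pip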
def package_instances(self, instances):
if not instances:
return []
LOG.info("Packaging %s instances using %s threads",
len(instances), self.jobs)
results = [None] * len(instances)
if self.jobs >= 1:
executor = async.ChainedWorkerExecutor(self.jobs)
retryable_exceptions = [
excp.ProcessExecutionError,
]
run_funcs = []
for instance in instances:
func = functools.partial(utils.retry,
self.retries, self.retry_delay,
self._package_instance, instance,
retryable_exceptions=retryable_exceptions)
run_funcs.append(func)
futs = executor.run(run_funcs)
executor.wait()
for fut in futs:
if fut.cancelled():
continue
if fut.done():
fut.result()
else:
for instance in instances:
self._package_instance(instance)
return results
def _package_instance(self, instance, attempt=0):
if not self._is_buildable(instance):
# Skip things that aren't python...
LOG.warn("Skipping building %s (not python)",
colorizer.quote(instance.name, quote_color='red'))
return
def gather_extras():
extra_reqs = []
for p in instance.get_option("pips", default_value=[]):
req = pip_helper.create_requirement(p['name'], p.get('version'))
extra_reqs.append(req)
if instance.get_bool_option('use_tests_requires', default_value=True):
for p in instance.get_option("test_requires", default_value=[]):
extra_reqs.append(pip_helper.create_requirement(p))
return extra_reqs
all_requires_what = self._filter_download_requires()
LOG.info("Packaging %s (attempt %s)",
colorizer.quote(instance.name), attempt)
all_requires_mapping = {}
for req in all_requires_what:
if isinstance(req, six.string_types):
req = pip_helper.extract_requirement(req)
all_requires_mapping[req.key] = req
direct_requires_what = []
direct_requires_keys = set()
egg_info = getattr(instance, 'egg_info', None)
if egg_info is not None:
# Ensure we have gotten all the things...
test_dependencies = (egg_info.get('test_dependencies', [])
if instance.get_bool_option(
'use_tests_requires', default_value=True)
else [])
for req in itertools.chain(egg_info.get('dependencies', []),
test_dependencies):
if isinstance(req, six.string_types):
req = pip_helper.extract_requirement(req)
if req.key not in direct_requires_keys:
direct_requires_what.append(req)
direct_requires_keys.add(req.key)
requires_what = []
extra_requires_what = gather_extras()
for req in extra_requires_what:
if req.key in all_requires_mapping:
req = all_requires_mapping[req.key]
requires_what.append(req)
try:
direct_requires_keys.remove(req.key)
except KeyError:
pass
for req in direct_requires_what:
if req.key not in direct_requires_keys:
continue
if req.key in all_requires_mapping:
req = all_requires_mapping[req.key]
requires_what.append(req)
what = 'installation for %s' % colorizer.quote(instance.name)
utils.time_it(functools.partial(_on_finish, "Dependency %s" % what),
self._install_into_venv, instance,
requires_what)
extra_env_overrides = {
'PBR_VERSION': instance.egg_info['version'],
}
utils.time_it(functools.partial(_on_finish, "Instance %s" % what),
self._install_into_venv, instance,
[instance.get_option('app_dir')],
extra_env_overrides=extra_env_overrides)
def download_dependencies(self):
pass

View File

@ -1,966 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import collections
import contextlib
import errno
import json
import pkg_resources
import sys
import tarfile
import six
from anvil import colorizer
from anvil import exceptions as excp
from anvil import log as logging
from anvil.packaging import base
from anvil.packaging.helpers import envra_helper
from anvil.packaging.helpers import pip_helper
from anvil.packaging.helpers import py2rpm_helper
from anvil.packaging.helpers import yum_helper
from anvil import settings
from anvil import shell as sh
from anvil import utils
LOG = logging.getLogger(__name__)
# Certain versions of pbr seem to miss these files, which causes the rpmbuild
# phases to not complete correctly. Ensure that we don't miss them.
ENSURE_NOT_MISSING = [
'doc', # Without this one our rpm doc build won't work
'README.rst', # Without this one pbr won't work (thus killing setup.py)
'babel.cfg',
'HACKING',
'AUTHORS',
'ChangeLog',
'CONTRIBUTING.rst',
'LICENSE',
]
_DEFAULT_SKIP_EPOCHS = ['0']
def _get_lines(filename):
lines = []
for line in sh.load_file(filename).splitlines():
line = line.strip()
if line and not line.startswith("#"):
lines.append(line)
return lines
class YumInstallHelper(base.InstallHelper):
def pre_install(self, pkg, params=None):
"""pre-install is handled in openstack-deps %pre script."""
pass
def post_install(self, pkg, params=None):
"""post-install is handled in openstack-deps %post script."""
pass
class YumDependencyHandler(base.DependencyHandler):
OPENSTACK_EPOCH = 2
SPEC_TEMPLATE_DIR = "packaging/specs"
YUM_REPO_DIR = "/etc/yum.repos.d/"
SRC_REPOS = {
'anvil': 'anvil-source',
"anvil-deps": "anvil-deps-source",
}
REPOS = ["anvil-deps", "anvil"]
JOBS = 2
def __init__(self, distro, root_dir,
instances, opts, group, prior_groups):
super(YumDependencyHandler, self).__init__(distro, root_dir,
instances, opts, group,
prior_groups)
# Various paths we will use while operating
self.rpmbuild_dir = sh.joinpths(self.deps_dir, "rpmbuild")
self.prebuild_dir = sh.joinpths(self.deps_dir, "prebuild")
self.deps_repo_dir = sh.joinpths(self.deps_dir, "openstack-deps")
self.deps_src_repo_dir = sh.joinpths(self.deps_dir, "openstack-deps-sources")
self.rpm_sources_dir = sh.joinpths(self.rpmbuild_dir, "SOURCES")
self.anvil_repo_dir = sh.joinpths(self.root_dir, "repo")
self.generated_srpms_filename = sh.joinpths(self.deps_dir, "generated-srpms-%s" % group)
self.build_requires_filename = sh.joinpths(self.deps_dir, "build-requires-%s" % group)
self.yum_satisfies_filename = sh.joinpths(self.deps_dir, "yum-satisfiable-%s" % group)
self.rpm_build_requires_filename = sh.joinpths(self.deps_dir, "rpm-build-requires-%s" % group)
# Executables we require to operate
self.rpmbuild_executable = sh.which("rpmbuild")
self.specprint_executable = sh.which('specprint', ["tools/"])
# We inspect yum for packages, this helper allows us to do this.
self.helper = yum_helper.Helper(self.log_dir, self.REPOS)
self.envra_helper = envra_helper.Helper()
# See if we are requested to run at a higher make parallelism level
try:
self.jobs = max(self.JOBS, int(self.opts.get('jobs')))
except (TypeError, ValueError):
self.jobs = self.JOBS
def _fetch_epoch_mapping(self):
epoch_map = self.distro.get_dependency_config("epoch_map", quiet=True)
if not epoch_map:
epoch_map = {}
epoch_skips = self.distro.get_dependency_config("epoch_skips",
quiet=True)
if not epoch_skips:
epoch_skips = _DEFAULT_SKIP_EPOCHS
if not isinstance(epoch_skips, (list, tuple)):
epoch_skips = [i.strip() for i in epoch_skips.split(",")]
built_epochs = {}
for instance, egg in self.iter_instance_and_eggs(True):
name = egg['name']
epoch = None
if name in epoch_map:
epoch = str(epoch_map.pop(name))
else:
epoch = instance.get_option('epoch')
if not epoch:
epoch = str(self.OPENSTACK_EPOCH)
built_epochs[name] = epoch
# Ensure epochs discovered via yum searching (that are not in the list
# of epochs to provide) are correctly set when building dependent
# packages...
keep_names = set()
try:
yum_satisfies = sh.load_file(self.yum_satisfies_filename)
except IOError as e:
if e.errno != errno.ENOENT:
raise
else:
for line in yum_satisfies.splitlines():
raw_req_rpm = utils.parse_json(line)
req = pip_helper.extract_requirement(raw_req_rpm['requirement'])
if req.key in epoch_map:
LOG.debug("Ensuring manually set epoch is retained for"
" requirement '%s' with epoch %s", req,
epoch_map[req.key])
keep_names.add(req.key)
else:
rpm_info = raw_req_rpm['rpm']
rpm_epoch = rpm_info.get('epoch')
if rpm_epoch and str(rpm_epoch) not in epoch_skips:
LOG.debug("Adding in yum satisfiable package %s for"
" requirement '%s' with epoch %s from repo %s",
rpm_info['name'], req, rpm_epoch, rpm_info['repo'])
keep_names.add(req.key)
epoch_map[req.key] = str(rpm_epoch)
# Exclude names from the epoch map that we never downloaded in the
# first place or that we did not just set automatically (since these
# are not useful and should not be set in the first place).
try:
_pip_reqs, downloaded_reqs = pip_helper.read_requirement_files([self.build_requires_filename])
except IOError as e:
if e.errno != errno.ENOENT:
raise
else:
downloaded_names = set([req.key for req in downloaded_reqs])
tmp_epoch_map = {}
for (name, epoch) in six.iteritems(epoch_map):
name = name.lower()
if name in downloaded_names or name in keep_names:
tmp_epoch_map[name] = str(epoch)
else:
LOG.debug("Discarding %s:%s from the epoch mapping since"
" it was not part of the downloaded (or automatically"
" included) build requirements", name, epoch)
epoch_map = tmp_epoch_map
epoch_map.update(built_epochs)
return epoch_map
@property
def py2rpm_helper(self):
epoch_map = self._fetch_epoch_mapping()
package_map = self.distro.get_dependency_config("package_map")
arch_dependent = self.distro.get_dependency_config("arch_dependent")
build_options = self.distro.get_dependency_config("build_options")
return py2rpm_helper.Helper(epoch_map=epoch_map,
package_map=package_map,
arch_dependent=arch_dependent,
rpmbuild_dir=self.rpmbuild_dir,
download_dir=self.download_dir,
deps_dir=self.deps_dir,
log_dir=self.log_dir,
build_options=build_options)
def _package_parameters(self, instance):
params = {}
params["release"] = str(instance.get_option("release", default_value=1))
if '-' in params["release"]:
# NOTE(imelnikov): "-" is prohibited in RPM releases
raise ValueError("Malformed package release: %r" % params["release"])
version_suffix = instance.get_option("version_suffix", default_value="")
if version_suffix and not version_suffix.startswith('.'):
version_suffix = '.' + version_suffix
params['version_suffix'] = version_suffix
tests_package = instance.get_option('tests_package', default_value={})
params["no_tests"] = 0 if tests_package.get('enabled', True) else 1
params["exclude_from_test_env"] = ['./bin', './build*']
params["exclude_from_test_env"].extend(
tests_package.get("exclude_from_env", ()))
return params
def _create_rpmbuild_subdirs(self):
for dirname in (sh.joinpths(self.rpmbuild_dir, "SPECS"),
sh.joinpths(self.rpmbuild_dir, "SOURCES")):
sh.mkdirslist(dirname, tracewriter=self.tracewriter)
def _record_srpm_files(self, files):
if not files:
return
buf = six.StringIO()
for f in files:
buf.write(f)
buf.write("\n")
if sh.isfile(self.generated_srpms_filename):
sh.append_file(self.generated_srpms_filename, "\n" + buf.getvalue())
else:
sh.write_file(self.generated_srpms_filename, buf.getvalue())
def package_instance(self, instance):
with sh.remove_before(self.rpmbuild_dir):
self._create_rpmbuild_subdirs()
if instance.name in ["general"]:
self._build_dependencies()
self._record_srpm_files(self._move_srpms("anvil-deps"))
else:
# Meta packages don't get built.
app_dir = instance.get_option("app_dir")
if sh.isdir(app_dir):
self._build_openstack_package(instance)
self._record_srpm_files(self._move_srpms("anvil"))
def _move_rpm_files(self, source_dir, target_dir):
# NOTE(imelnikov): we should create target_dir even if we have
# nothing to move, because later we rely on its existence
if not sh.isdir(target_dir):
sh.mkdirslist(target_dir, tracewriter=self.tracewriter)
if not sh.isdir(source_dir):
return []
moved = []
for filename in sh.listdir(source_dir, recursive=True, files_only=True):
if not filename.lower().endswith(".rpm"):
continue
sh.move(filename, target_dir, force=True)
moved.append(sh.joinpths(target_dir, sh.basename(filename)))
return moved
def build_binary(self):
def is_src_rpm(path):
if not path:
return False
if not sh.isfile(path):
return False
if not path.lower().endswith('.src.rpm'):
return False
return True
def list_src_rpms(path):
path_files = []
restricted = set()
if sh.isdir(path):
path_files = sh.listdir(path, filter_func=is_src_rpm)
try:
# Leave other groups files alone...
restricted = set(_get_lines(self.generated_srpms_filename))
except IOError as e:
if e.errno != errno.ENOENT:
raise
filtered = []
for path in path_files:
if path in restricted:
filtered.append(path)
path_files = filtered
return sorted(path_files)
def move_rpms(repo_name):
repo_dir = sh.joinpths(self.anvil_repo_dir, repo_name)
search_dirs = [
sh.joinpths(self.rpmbuild_dir, "RPMS"),
]
for sub_dir in sh.listdir(self.rpmbuild_dir, dirs_only=True):
search_dirs.append(sh.joinpths(sub_dir, "RPMS"))
moved = []
for source_dir in search_dirs:
moved.extend(self._move_rpm_files(source_dir, repo_dir))
return moved
def build(repo_dir, repo_name, header_tpl, group, built_files):
repo_files = []
for srpm in list_src_rpms(repo_dir):
if srpm not in built_files:
repo_files.append(srpm)
if not repo_files:
return []
utils.log_iterable(repo_files,
header=header_tpl % (len(repo_files),
self.SRC_REPOS[repo_name],
self.jobs),
logger=LOG)
rpmbuild_flags = "--rebuild"
if self.opts.get("usr_only", False):
rpmbuild_flags += " --define 'usr_only 1'"
if self.opts.get("overwrite_configs", False):
rpmbuild_flags += " --define 'overwrite_configs 1'"
with sh.remove_before(self.rpmbuild_dir):
self._create_rpmbuild_subdirs()
# This is needed so that make correctly identifies the right files
# and the right *.mark files, instead of grabbing all the files
# (including ones we don't want to build just yet...)
files_dirname = '%s-%s-build' % (repo_name, group)
files_dir = sh.joinpths(self.deps_dir, files_dirname)
sh.mkdirslist(files_dir)
for srpm in repo_files:
sh.copy(srpm, sh.joinpths(files_dir, sh.basename(srpm)))
try:
self.py2rpm_helper.build_all_binaries(repo_name,
files_dir,
rpmbuild_flags,
self.tracewriter,
self.jobs)
finally:
# If we made any rpms (even if a failure happened, make
# sure that we move them to the right target repo).
moved_rpms = move_rpms(repo_name)
if len(moved_rpms) > 0:
self._create_repo(repo_name)
return repo_files
def pre_build():
build_requirements = self.requirements.get("build-requires")
if build_requirements:
utils.log_iterable(build_requirements,
header="Installing build requirements",
logger=LOG)
self.helper.transaction(install_pkgs=build_requirements,
tracewriter=self.tracewriter)
build_requirements = []
try:
build_requirements.extend(_get_lines(self.rpm_build_requires_filename))
except IOError as e:
if e.errno != errno.ENOENT:
raise
built_files = []
built_requirements = []
for repo_name in self.REPOS:
repo_dir = sh.joinpths(self.anvil_repo_dir, self.SRC_REPOS[repo_name])
matched_paths = []
available_paths = list_src_rpms(repo_dir)
envra_path_details = self.envra_helper.explode(*available_paths)
for (path, envra_detail) in zip(available_paths, envra_path_details):
package_name = envra_detail.get('name')
if package_name in build_requirements:
matched_paths.append(path)
built_requirements.append(package_name)
if matched_paths:
with sh.remove_before(self.prebuild_dir) as prebuild_dir:
sh.mkdirslist(prebuild_dir, tracewriter=self.tracewriter)
for path in matched_paths:
sh.copy(path,
sh.joinpths(prebuild_dir, sh.basename(path)))
built_files.extend(
build(prebuild_dir, repo_name,
'Prebuilding %s RPM packages from their'
' SRPMs for repo %s using %s jobs',
"%s-prebuild" % self.group, built_files))
leftover_requirements = set()
for req in build_requirements:
if req not in built_requirements:
leftover_requirements.add(req)
return (leftover_requirements, built_files)
leftover_requirements, built_files = pre_build()
if leftover_requirements:
utils.log_iterable(sorted(leftover_requirements),
header="%s unsatisfied build requirements (these"
" will need to be satisfied by existing"
" repositories)" % len(leftover_requirements),
logger=LOG)
for repo_name in self.REPOS:
repo_dir = sh.joinpths(self.anvil_repo_dir, self.SRC_REPOS[repo_name])
built_files.extend(
build(repo_dir, repo_name,
'Building %s RPM packages from their SRPMs for repo %s'
' using %s jobs', self.group, built_files))
def _move_srpms(self, repo_name, rpmbuild_dir=None):
if rpmbuild_dir is None:
rpmbuild_dir = self.rpmbuild_dir
src_repo_name = self.SRC_REPOS[repo_name]
src_repo_dir = sh.joinpths(self.anvil_repo_dir, src_repo_name)
search_dirs = [
sh.joinpths(rpmbuild_dir, "SRPMS"),
]
moved = []
for dir_name in search_dirs:
moved.extend(self._move_rpm_files(dir_name, src_repo_dir))
return moved
def _create_repo(self, repo_name):
repo_dir = sh.joinpths(self.anvil_repo_dir, repo_name)
src_repo_dir = sh.joinpths(self.anvil_repo_dir, self.SRC_REPOS[repo_name])
for a_dir in (repo_dir, src_repo_dir):
if not sh.isdir(a_dir):
sh.mkdirslist(a_dir, tracewriter=self.tracewriter)
cmdline = ["createrepo", a_dir]
LOG.info("Creating repo at %s", a_dir)
sh.execute(cmdline)
repo_filename = sh.joinpths(self.anvil_repo_dir, "%s.repo" % repo_name)
LOG.info("Writing %s", repo_filename)
(_fn, content) = utils.load_template("packaging", "common.repo")
params = {
"repo_name": repo_name,
"baseurl_bin": "file://%s" % repo_dir,
"baseurl_src": "file://%s" % src_repo_dir,
}
sh.write_file(repo_filename, utils.expand_template(content, params),
tracewriter=self.tracewriter)
# NOTE(harlowja): Install *.repo file so that anvil deps will be available
# when building openstack core project packages.
system_repo_filename = sh.joinpths(self.YUM_REPO_DIR, "%s.repo" % repo_name)
sh.copy(repo_filename, system_repo_filename, tracewriter=self.tracewriter)
LOG.info("Copied to %s", system_repo_filename)
def _get_known_yum_packages(self):
LOG.info("Determining which yum packages are available or installed...")
yum_map = collections.defaultdict(list)
pkgs = []
pkgs.extend(self.helper.list_available())
pkgs.extend(self.helper.list_installed())
for pkg in pkgs:
for provides in pkg.get('provides', []):
yum_map[provides[0]].append(pkg)
# Note(harlowja): this is done to remove the default lists
# that each entry would previously provide, converting the defaultdict
# into a normal dict.
return dict(yum_map)
@staticmethod
def _find_yum_match(yum_map, req, rpm_name):
yum_versions = yum_map.get(rpm_name, [])
for pkg in yum_versions:
version = pkg['version']
if version in req:
return pkg
return None
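# Matching example (entries assumed): for req parsed from 'six>=1.4' and
# rpm_name 'python-six', a yum_map entry like
#   [{'name': 'python-six', 'version': '1.5.2', 'repo': 'epel'}]
# matches because the version string '1.5.2' is contained in the
# requirement.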
def _filter_download_requires(self):
yum_map = self._get_known_yum_packages()
pip_origins = {}
for line in self.pips_to_install:
req = pip_helper.extract_requirement(line)
pip_origins[req.key] = line
pips_to_download = []
req_to_install = [pip_helper.extract_requirement(line)
for line in self.pips_to_install]
requested_names = [req.key for req in req_to_install]
rpm_names = self.py2rpm_helper.names_to_rpm_names(requested_names)
satisfied_list = []
for req in req_to_install:
rpm_name = rpm_names[req.key]
rpm_info = self._find_yum_match(yum_map, req, rpm_name)
if not rpm_info:
# We need the source requirement in case it's a url.
pips_to_download.append(pip_origins[req.key])
else:
satisfied_list.append((req, rpm_name, rpm_info))
yum_buff = six.StringIO()
if satisfied_list:
# Organize by repo
repos = collections.defaultdict(list)
for (req, rpm_name, rpm_info) in satisfied_list:
repo = rpm_info['repo']
rpm_found = '%s-%s' % (rpm_name, rpm_info['version'])
repos[repo].append("%s as %s" % (colorizer.quote(req),
colorizer.quote(rpm_found)))
dep_info = {
'requirement': str(req),
'rpm': rpm_info,
}
yum_buff.write(json.dumps(dep_info))
yum_buff.write("\n")
for r in sorted(repos.keys()):
header = ("%s Python packages are already available "
"as RPMs from repository %s")
header = header % (len(repos[r]), colorizer.quote(r))
utils.log_iterable(sorted(repos[r]), logger=LOG, header=header,
color=None)
sh.write_file(self.yum_satisfies_filename, yum_buff.getvalue())
return pips_to_download
def _build_dependencies(self):
(pips_downloaded, package_files) = self.download_dependencies()
# Analyze what was downloaded and eject things that were downloaded
# by pip as a dependency of a download but which we do not want to
# build or can satisfy by other means
no_pips = [pkg_resources.Requirement.parse(name).key
for name in self.python_names]
no_pips.extend(self.ignore_pips)
yum_map = self._get_known_yum_packages()
pips_keys = set([p.key for p in pips_downloaded])
package_reqs = []
for filename in package_files:
package_details = pip_helper.get_archive_details(filename)
package_reqs.append((filename, package_details['req']))
def _filter_package_files():
yum_provided = []
req_names = [req.key for (filename, req) in package_reqs]
package_rpm_names = self.py2rpm_helper.names_to_rpm_names(req_names)
filtered_files = []
for filename, req in package_reqs:
rpm_name = package_rpm_names[req.key]
if req.key in no_pips:
LOG.info(("Dependency %s was downloaded additionally "
"but it is disallowed."), colorizer.quote(req))
continue
if req.key in pips_keys:
filtered_files.append(filename)
continue
# See if pip tried to download it but we already can satisfy
# it via yum and avoid building it in the first place...
rpm_info = self._find_yum_match(yum_map, req, rpm_name)
if not rpm_info:
filtered_files.append(filename)
else:
yum_provided.append((req, rpm_info))
LOG.info(("Dependency %s was downloaded additionally "
"but it can be satisfied by %s from repository "
"%s instead."), colorizer.quote(req),
colorizer.quote(rpm_name),
colorizer.quote(rpm_info['repo']))
return (filtered_files, yum_provided)
LOG.info("Filtering %s downloaded files.", len(package_files))
filtered_package_files, yum_provided = _filter_package_files()
if yum_provided:
yum_buff = six.StringIO()
for (req, rpm_info) in yum_provided:
dep_info = {
'requirement': str(req),
'rpm': rpm_info,
}
yum_buff.write(json.dumps(dep_info))
yum_buff.write("\n")
sh.append_file(self.yum_satisfies_filename, yum_buff.getvalue())
if not filtered_package_files:
LOG.info("No SRPM package dependencies to build.")
return
for filename in package_files:
if filename not in filtered_package_files:
sh.unlink(filename)
ensure_prebuilt = self.distro.get_dependency_config("ensure_prebuilt",
quiet=True)
if not ensure_prebuilt:
ensure_prebuilt = {}
build_requires = six.StringIO()
rpm_build_requires = six.StringIO()
for (filename, req) in package_reqs:
if filename in filtered_package_files:
build_requires.write("%s\n" % (req))
prebuilt_reqs = []
for line in ensure_prebuilt.get(req.key, []):
prebuilt_reqs.append(pip_helper.extract_requirement(line))
if prebuilt_reqs:
rpm_build_requires.write("# %s from %s\n" % (req, sh.basename(filename)))
rpm_names = self.py2rpm_helper.names_to_rpm_names(
[r.key for r in prebuilt_reqs])
for r in prebuilt_reqs:
rpm_name = rpm_names[r.key]
LOG.info("Adding %s (%s) as a pre-build time"
" requirement of %s (%s)", r, rpm_name, req,
sh.basename(filename))
rpm_build_requires.write("%s\n" % (rpm_name))
rpm_build_requires.write("\n")
sh.append_file(self.rpm_build_requires_filename, rpm_build_requires.getvalue())
sh.write_file(self.build_requires_filename, build_requires.getvalue())
# Now build them into SRPM rpm files.
package_files = sorted(filtered_package_files)
self.py2rpm_helper.build_all_srpms(package_files=package_files,
tracewriter=self.tracewriter,
jobs=self.jobs)
def _make_spec_functors(self, downloaded_version):
# TODO(harlowja): refactor to just use cmp()
def newer_than(version):
version = pkg_resources.parse_version(version)
if downloaded_version > version:
return True
return False
def newer_than_eq(version):
version = pkg_resources.parse_version(version)
if downloaded_version >= version:
return True
return False
def older_than(version):
version = pkg_resources.parse_version(version)
if downloaded_version < version:
return True
return False
def older_than_eq(version):
version = pkg_resources.parse_version(version)
if downloaded_version <= version:
return True
return False
return {
'older_than_eq': older_than_eq,
'older_than': older_than,
'newer_than_eq': newer_than_eq,
'newer_than': newer_than,
}
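# These comparators are exposed to the spec templates (assumed
# Cheetah-style usage) so a spec can branch on the version that was
# actually downloaded, e.g.:
#
#   #if $newer_than_eq('2014.1')
#   Requires: python-oslo-messaging
#   #end if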
def _write_spec_file(self, instance, rpm_name, template_name, params):
requires_what = params.get('requires', [])
conflicts_what = params.get('conflicts', [])
test_requires_what = params.get('test_requires', [])
test_conflicts_what = params.get('test_conflicts', [])
egg_info = getattr(instance, 'egg_info', None)
if egg_info:
def ei_names(key):
try:
requires_python = [str(req) for req in egg_info[key]]
except KeyError:
return [], []
else:
return self.py2rpm_helper.names_to_rpm_deps(requires_python)
rpm_requires, rpm_conflicts = ei_names('dependencies')
requires_what.extend(rpm_requires)
conflicts_what.extend(rpm_conflicts)
rpm_test_requires, rpm_test_conflicts = ei_names('test_dependencies')
test_requires_what.extend(rpm_test_requires)
test_conflicts_what.extend(rpm_test_conflicts)
params["requires"] = requires_what
params["conflicts"] = conflicts_what
params["test_requires"] = test_requires_what
params["test_conflicts"] = test_conflicts_what
params["epoch"] = self.OPENSTACK_EPOCH
params["part_fn"] = lambda filename: sh.joinpths(
settings.TEMPLATE_DIR,
self.SPEC_TEMPLATE_DIR,
filename)
parsed_version = pkg_resources.parse_version(params["version"])
params.update(self._make_spec_functors(parsed_version))
content = utils.load_template(self.SPEC_TEMPLATE_DIR, template_name)[1]
spec_filename = sh.joinpths(self.rpmbuild_dir, "SPECS", "%s.spec" % rpm_name)
sh.write_file(spec_filename, utils.expand_template(content, params),
tracewriter=self.tracewriter)
return spec_filename
def _copy_startup_scripts(self, instance, spec_details):
common_init_content = utils.load_template("packaging",
"common.init")[1]
daemon_args = instance.get_option('daemon_args', default_value={})
for src in spec_details.get('sources', []):
script = sh.basename(src)
if not (script.endswith(".init")):
continue
target_filename = sh.joinpths(self.rpm_sources_dir, script)
if sh.isfile(target_filename):
continue
bin_name = utils.strip_prefix_suffix(script, "openstack-", ".init")
params = {
"bin": bin_name,
"package": bin_name.split("-", 1)[0],
"daemon_args": daemon_args.get(bin_name, ''),
}
sh.write_file(target_filename,
utils.expand_template(common_init_content, params))
def _copy_systemd_scripts(self, instance, spec_details):
common_init_content = utils.load_template("packaging",
"common.service")[1]
daemon_args = instance.get_option('daemon_args', default_value={})
killmode = instance.get_option('killmode', default_value={})
for src in spec_details.get('sources', []):
script = sh.basename(src)
            if not script.endswith(".service"):
continue
target_filename = sh.joinpths(self.rpm_sources_dir, script)
if sh.isfile(target_filename):
continue
bin_name = utils.strip_prefix_suffix(script, "openstack-", ".service")
kill_mode = killmode.get(bin_name, '') or "control-group"
params = {
"bin": bin_name,
"package": bin_name.split("-", 1)[0],
"daemon_args": daemon_args.get(bin_name, ''),
"killmode": kill_mode,
}
sh.write_file(target_filename,
utils.expand_template(common_init_content, params))
def _copy_sources(self, instance):
other_sources_dir = sh.joinpths(settings.TEMPLATE_DIR,
"packaging", "sources", instance.name)
if sh.isdir(other_sources_dir):
for filename in sh.listdir(other_sources_dir, files_only=True):
sh.copy(filename, self.rpm_sources_dir)
def _copy_patches(self, patches):
for filename in patches:
sh.copy(filename, self.rpm_sources_dir)
def _build_from_spec(self, instance, spec_filename, patches=None):
pkg_dir = instance.get_option('app_dir')
if sh.isfile(sh.joinpths(pkg_dir, "setup.py")):
self._write_python_tarball(instance, pkg_dir, ENSURE_NOT_MISSING)
else:
self._write_git_tarball(instance, pkg_dir, spec_filename)
self._copy_sources(instance)
if patches:
self._copy_patches(patches)
cmdline = [self.specprint_executable]
cmdline.extend(['-f', spec_filename])
spec_details = json.loads(sh.execute(cmdline)[0])
rpm_requires = []
for k in ('requires', 'requirenevrs'):
try:
rpm_requires.extend(spec_details['headers'][k])
except (KeyError, TypeError):
pass
        if rpm_requires:
            buff = six.StringIO()
            buff.write("# %s\n" % instance.name)
            for req in rpm_requires:
                buff.write("%s\n" % req)
            buff.write("\n")
            sh.append_file(self.rpm_build_requires_filename, buff.getvalue())
self._copy_startup_scripts(instance, spec_details)
self._copy_systemd_scripts(instance, spec_details)
cmdline = [
self.rpmbuild_executable,
"-bs",
"--define", "_topdir %s" % self.rpmbuild_dir,
spec_filename,
]
out_filename = sh.joinpths(self.log_dir, "rpmbuild-%s.log" % instance.name)
sh.execute_save_output(cmdline, out_filename)
def _write_git_tarball(self, instance, pkg_dir, spec_filename):
cmdline = [
"rpm",
"-q",
"--specfile", spec_filename,
"--qf", "%{NAME}-%{VERSION}\n"
]
tar_base = sh.execute(cmdline, cwd=pkg_dir)[0].splitlines()[0].strip()
# NOTE(harlowja): git 1.7.1 from RHEL doesn't understand --format=tar.gz
output_filename = sh.joinpths(self.rpm_sources_dir, "%s.tar" % tar_base)
cmdline = [
"git",
"archive",
"--format=tar",
"--prefix=%s/" % tar_base,
"--output=%s" % output_filename,
"HEAD",
]
out_filename = sh.joinpths(self.log_dir, "git-tar-%s.log" % instance.name)
sh.execute_save_output(cmdline, out_filename, cwd=pkg_dir)
sh.gzip(output_filename)
sh.unlink(output_filename)
def _write_python_tarball(self, instance, pkg_dir, ensure_exists=None):
def prefix_exists(text, in_what):
for t in in_what:
if t.startswith(text):
return True
return False
pkg_name = instance.egg_info['name']
version = instance.egg_info['version']
base_name = "%s-%s" % (pkg_name, version)
cmdline = [
sys.executable,
"setup.py",
"sdist",
"--formats=tar",
"--dist-dir", self.rpm_sources_dir,
]
env_overrides = {
'PBR_VERSION': version,
}
out_filename = sh.joinpths(self.log_dir, "sdist-%s.log" % (instance.name))
sh.execute_save_output(cmdline, out_filename,
cwd=pkg_dir, env_overrides=env_overrides)
archive_name = sh.joinpths(self.rpm_sources_dir, "%s.tar" % (base_name))
if ensure_exists:
with contextlib.closing(tarfile.open(archive_name, 'r')) as tfh:
tar_entries = [t.path for t in tfh.getmembers()]
missing_paths = {}
for path in ensure_exists:
tar_path = sh.joinpths(base_name, path)
source_path = sh.joinpths(pkg_dir, path)
if not prefix_exists(tar_path, tar_entries) and sh.exists(source_path):
missing_paths[tar_path] = source_path
if missing_paths:
utils.log_iterable(sorted(missing_paths.keys()),
logger=LOG,
                                   header='%s paths were not archived and will now be added' % (len(missing_paths)))
with contextlib.closing(tarfile.open(archive_name, 'a')) as tfh:
for (tar_path, source_path) in missing_paths.items():
tfh.add(source_path, tar_path)
sh.gzip(archive_name)
sh.unlink(archive_name)
def _find_template_and_rpm_name(self, instance, build_name):
search_names = [(build_name, "%s.spec" % build_name)]
try:
egg_name = instance.egg_info['name']
except AttributeError:
pass
else:
if any(s.endswith("client")
for s in (instance.name, egg_name, build_name)):
                search_names.append((egg_name, "python-commonclient.spec"))
search_names.extend([
("openstack-%s" % (egg_name), "openstack-%s.spec" % (egg_name)),
(egg_name, "%s.spec" % (egg_name)),
])
# Return the first that exists (if any from this list)
for (rpm_name, template_name) in search_names:
spec_filename = sh.joinpths(settings.TEMPLATE_DIR,
self.SPEC_TEMPLATE_DIR, template_name)
if sh.isfile(spec_filename):
return (rpm_name, template_name)
return (None, None)
def _build_openstack_package(self, instance):
params = self._package_parameters(instance)
patches = instance.list_patches("package")
params['patches'] = [sh.basename(fn) for fn in patches]
build_name = instance.get_option('build_name', default_value=instance.name)
(rpm_name, template_name) = self._find_template_and_rpm_name(instance, build_name)
try:
egg_name = instance.egg_info['name']
params["version"] = instance.egg_info["version"]
except AttributeError:
pass
else:
if any(s.endswith("client")
for s in (instance.name, egg_name, build_name)):
client_name = utils.strip_prefix_suffix(egg_name, "python-", "client")
if not client_name:
msg = "Bad client package name %s" % (egg_name)
raise excp.PackageException(msg)
params["clientname"] = client_name
params["apiname"] = instance.get_option(
'api_name', default_value=client_name.title())
if all((rpm_name, template_name)):
spec_filename = self._write_spec_file(instance, rpm_name,
template_name, params)
self._build_from_spec(instance, spec_filename, patches)
else:
self.py2rpm_helper.build_srpm(source=instance.get_option("app_dir"),
log_filename=instance.name,
release=params.get("release"),
with_tests=not params.get("no_tests"))
def _get_rpm_names(self, from_deps=True, from_instances=True):
desired_rpms = []
py_reqs = set()
if from_instances:
inst_packages = list(self.requirements["requires"])
for inst in self.instances:
inst_packages.extend(inst.package_names())
if sh.isdir(inst.get_option("app_dir")):
try:
py_req = inst.egg_info['req']
except AttributeError:
pass
else:
rpm_name, _ = self._find_template_and_rpm_name(
inst, inst.get_option('build_name', default_value=inst.name)
)
if rpm_name is not None:
desired_rpms.append((rpm_name, py_req))
else:
py_reqs.add(py_req)
for rpm_name in inst_packages:
desired_rpms.append((rpm_name, None))
if from_deps:
# This file should have all the requirements (including test ones)
# that we need to install (and which should have been built as rpms
# in the previous build stages).
requires = sh.load_file(self.gathered_requires_filename).splitlines()
for line in [line.strip() for line in requires if line.strip()]:
py_reqs.add(pip_helper.extract_requirement(line))
rpm_names = self.py2rpm_helper.names_to_rpm_names([req.key
for req in py_reqs])
desired_rpms.extend((rpm_names[req.key], req) for req in py_reqs)
def _format_name(rpm_name, py_req):
full_name = str(rpm_name).strip()
if py_req is not None:
full_name += ','.join(''.join(x) for x in py_req.specs)
return full_name
return sorted(_format_name(rpm_name, py_req)
for rpm_name, py_req in desired_rpms)
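    # Hedged illustration of the formatting above (values are hypothetical):
    # given rpm_name "python-requests" and a py_req parsed from
    # "requests>=1.2,<2.0" (whose .specs is [('>=', '1.2'), ('<', '2.0')]),
    # _format_name would yield "python-requests>=1.2,<2.0" (spec ordering
    # may vary between pkg_resources versions).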
def install(self, general):
super(YumDependencyHandler, self).install(general)
self.helper.clean()
install_all_deps = general.get_bool_option('install-all-deps', True)
install_pkgs = self._get_rpm_names(from_deps=install_all_deps,
from_instances=True)
# Erase conflicting packages
remove_pkgs = [pkg_name
for pkg_name in self.requirements["conflicts"]
if self.helper.is_installed(pkg_name)]
self.helper.transaction(install_pkgs=install_pkgs,
remove_pkgs=remove_pkgs,
tracewriter=self.tracewriter)
def install_all_deps(self):
super(YumDependencyHandler, self).install_all_deps()
self.helper.clean()
install_pkgs = self._get_rpm_names(from_deps=True, from_instances=False)
self.helper.transaction(install_pkgs=install_pkgs,
tracewriter=self.tracewriter)
def uninstall(self):
super(YumDependencyHandler, self).uninstall()
if self.tracereader.exists():
remove_pkgs = self.tracereader.packages_installed()
self.helper.transaction(remove_pkgs=remove_pkgs)

@ -1,68 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from anvil import log as logging
from anvil import shell as sh
from anvil import utils
LOG = logging.getLogger(__name__)
PATCH_CMD = ['patch', '-p1']
GIT_PATCH_CMD = ['git', 'am']
def _is_patch(path, patch_ext='.patch'):
if not path.endswith(patch_ext):
return False
if not sh.isfile(path):
return False
return True
def expand_patches(paths, patch_ext='.patch'):
if not paths:
return []
all_paths = []
# Expand patch files/dirs
for path in paths:
path = sh.abspth(path)
if sh.isdir(path):
all_paths.extend([p for p in sh.listdir(path, files_only=True)])
else:
all_paths.append(path)
# Now filter on valid patches
return [p for p in all_paths if _is_patch(p, patch_ext=patch_ext)]
def apply_patches(patch_files, working_dir):
if not sh.isdir(working_dir):
LOG.warn("Can only apply patches 'inside' a directory and not '%s'",
working_dir)
return
already_applied = set()
for patch_ext, patch_cmd in [('.patch', PATCH_CMD), ('.git_patch', GIT_PATCH_CMD)]:
apply_files = expand_patches(patch_files, patch_ext=patch_ext)
apply_files = [p for p in apply_files if p not in already_applied]
if not apply_files:
continue
with utils.chdir(working_dir):
for p in apply_files:
LOG.debug("Applying patch %s using command %s in directory %s",
p, patch_cmd, working_dir)
patch_contents = sh.load_file(p)
if len(patch_contents):
sh.execute(patch_cmd, process_input=patch_contents)
already_applied.add(p)
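# A hypothetical usage sketch (the paths are illustrative only), wrapped in
# a function so nothing runs at import time:
def _example_apply_patches():
    # A directory is expanded into its individual *.patch files (and any
    # *.git_patch files) before they are applied inside the target checkout.
    apply_patches(['conf/patches/nova/'], '/tmp/openstack/nova')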

@ -1,106 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import six
from anvil import colorizer
from anvil import log as logging
from anvil import utils
LOG = logging.getLogger(__name__)
SPECIAL_GROUPS = frozenset(['general'])
class Persona(object):
def __init__(self, supports, components, **kwargs):
self.distro_support = supports or []
self.source = kwargs.pop('source', None)
self.wanted_components = utils.group_builds(components)
self.wanted_subsystems = kwargs.pop('subsystems', {})
self.component_options = kwargs.pop('options', {})
self.no_origins = kwargs.pop('no-origin', [])
self.matched_components = []
self.distro_updates = kwargs
def match(self, distros, origins):
for group in self.wanted_components:
for c in group:
if c not in origins:
if c in self.no_origins:
LOG.debug("Automatically enabling component %s, not"
" present in origins file %s but present in"
" desired persona %s (origin not required).",
c, origins.filename, self.source)
origins[c] = {
'disabled': False,
}
else:
LOG.warn("Automatically disabling %s, not present in"
" origin file but present in desired"
" persona (origin required).",
colorizer.quote(c, quote_color='red'))
origins[c] = {
'disabled': True,
}
disabled_components = set(key
for key, value in six.iteritems(origins)
if value.get('disabled'))
self.matched_components = []
all_components = set()
for group in self.wanted_components:
adjusted_group = utils.Group(group.id)
for c in group:
if c not in disabled_components:
adjusted_group.append(c)
all_components.add(c)
if adjusted_group:
for c in SPECIAL_GROUPS:
if c not in adjusted_group:
adjusted_group.insert(0, c)
all_components.add(c)
self.matched_components.append(adjusted_group)
# Pick which of potentially many distros will work...
distro_names = set()
selected_distro = None
for distro in distros:
distro_names.add(distro.name)
if distro.name not in self.distro_support:
continue
will_work = True
for component in all_components:
if not distro.known_component(component):
will_work = False
LOG.warning("Persona specified component '%s' but"
" distro '%s' does not specify it", component,
distro.name)
break
if will_work:
selected_distro = distro
break
if selected_distro is None:
raise RuntimeError("Persona does not support any of the loaded"
" distros: %s" % list(distro_names))
else:
return selected_distro
def load(fn):
cls_kvs = utils.load_yaml(fn)
cls_kvs['source'] = fn
instance = Persona(**cls_kvs)
return instance
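# A hypothetical driver sketch; 'distros' and 'origins' would come from
# anvil's distro and origins loaders (not shown here) and the persona path
# is illustrative only:
def _example_match(distros, origins):
    persona = load("conf/personas/in-a-box/basic.yaml")
    # Raises RuntimeError when no loaded distro supports the persona.
    return persona.match(distros, origins)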

@ -1,83 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from anvil import log as logging
from anvil import shell as sh
from anvil import utils
from contextlib import contextmanager
LOG = logging.getLogger(__name__)
class PhaseRecorder(object):
def __init__(self, fn):
self.filename = fn
self.state = None
def _format_contents(self, contents):
return utils.prettify_yaml(contents)
@contextmanager
def mark(self, what):
contents = self.list_phases()
contents[what] = utils.iso8601()
yield what
sh.write_file(self.filename, self._format_contents(contents))
def unmark(self, what):
contents = self.list_phases()
contents.pop(what, None)
sh.write_file(self.filename, self._format_contents(contents))
def __contains__(self, what):
phases = self.list_phases()
if what in phases:
return True
return False
def list_phases(self):
if self.state is not None:
return self.state
state = {}
# Shell not used to avoid dry-run capturing
try:
with open(self.filename, 'r') as fh:
state = utils.load_yaml_text(fh.read())
                if not isinstance(state, dict):
raise TypeError("Phase file %s expected dictionary root type" % (self.filename))
except IOError:
pass
self.state = state
return self.state
class NullPhaseRecorder(object):
def __init__(self):
pass
@contextmanager
def mark(self, what):
yield what
def list_phases(self):
return {}
def unmark(self, what):
pass
def __contains__(self, what):
return False
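# A minimal usage sketch (the file name is made up), wrapped in a function
# so nothing runs at import time:
def _example_phases():
    recorder = PhaseRecorder("/tmp/phases.yaml")
    if 'install' not in recorder:
        with recorder.mark('install'):
            pass  # do the phase's work; it is recorded once this block exits
    return recorder.list_phases()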

@ -1,102 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
def center_text(text, fill, max_len):
return '{0:{fill}{align}{size}}'.format(text, fill=fill, align="^", size=max_len)
def _pformat_list(lst, item_max_len):
lines = []
if not lst:
lines.append("+------+")
lines.append("'------'")
return "\n".join(lines)
entries = []
max_len = 0
for i in lst:
e = pformat(i, item_max_len)
for v in e.split("\n"):
max_len = max(max_len, len(v) + 2)
entries.append(e)
lines.append("+%s+" % ("-" * (max_len)))
for e in entries:
for line in e.split("\n"):
lines.append("|%s|" % (center_text(line, ' ', max_len)))
lines.append("'%s'" % ("-" * (max_len)))
return "\n".join(lines)
def _pformat_hash(hsh, item_max_len):
lines = []
if not hsh:
lines.append("+-----+-----+")
lines.append("'-----+-----'")
return "\n".join(lines)
# Figure out the lengths to place items in...
max_key_len = 0
max_value_len = 0
entries = []
for (k, v) in hsh.items():
entry = ("%s" % (_pformat_escape(k, item_max_len)), "%s" % (pformat(v, item_max_len)))
max_key_len = max(max_key_len, len(entry[0]) + 2)
for v in entry[1].split("\n"):
max_value_len = max(max_value_len, len(v) + 2)
entries.append(entry)
# Now actually do the placement since we have the lengths
lines.append("+%s+%s+" % ("-" * max_key_len, "-" * max_value_len))
for (key, value) in entries:
value_lines = value.split("\n")
lines.append("|%s|%s|" % (center_text(key, ' ', max_key_len),
center_text(value_lines[0], ' ', max_value_len)))
if len(value_lines) > 1:
for j in range(1, len(value_lines)):
lines.append("|%s|%s|" % (center_text("-", ' ', max_key_len),
center_text(value_lines[j], ' ', max_value_len)))
lines.append("'%s+%s'" % ("-" * max_key_len, "-" * max_value_len))
return "\n".join(lines)
def _pformat_escape(item, item_max_len):
item = _pformat_simple(item, item_max_len)
item = item.replace("\n", "\\n")
item = item.replace("\t", "\\t")
return item
def _pformat_simple(item, item_max_len):
if item_max_len is None or item_max_len < 0:
return "%s" % (item)
if item_max_len == 0:
return ''
item_str = "%s" % (item)
if len(item_str) > item_max_len:
        # TODO(harlowja) use utf8 ellipsis or '...'??
item_str = item_str[0:item_max_len] + '...'
return item_str
def pformat(item, item_max_len=None):
if isinstance(item, (list, set, tuple)):
return _pformat_list(item, item_max_len)
elif isinstance(item, (dict)):
return _pformat_hash(item, item_max_len)
else:
return _pformat_simple(item, item_max_len)
def pprint(item, item_max_len=None):
print("%s" % (pformat(item, item_max_len)))

@ -1,27 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import os
# Where the configs and templates should be at...
CONFIG_DIR = 'conf'
COMPONENT_CONF_DIR = os.path.join(CONFIG_DIR, "components")
DISTRO_DIR = os.path.join(CONFIG_DIR, "distros")
MESSAGING_DIR = os.path.join(CONFIG_DIR, "messages")
ORIGINS_DIR = os.path.join(CONFIG_DIR, "origins")
PERSONA_DIR = os.path.join(CONFIG_DIR, "personas")
TEMPLATE_DIR = os.path.join(CONFIG_DIR, "templates")

@ -1,607 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
# R0915: Too many statements
# pylint: disable=R0915
import collections
import contextlib
import getpass
import grp
import gzip as gz
import os
import pwd
import shutil
import signal
import socket
import subprocess
import time
import distutils.spawn
import psutil
import anvil
from anvil import env
from anvil import exceptions as excp
from anvil import log as logging
LOG = logging.getLogger(__name__)
# Locally stash these so that they cannot be changed
# by others after they are first fetched...
SUDO_UID = env.get_key('SUDO_UID')
SUDO_GID = env.get_key('SUDO_GID')
# Tail this many lines from a file when a command is executed and its
# output is being piped to said file.
_TRUNCATED_OUTPUT_LINES = 7
# Take over some functions directly from os.path/os/... so that we don't have
# to type as many long function names to access these.
getsize = os.path.getsize
exists = os.path.exists
basename = os.path.basename
dirname = os.path.dirname
canon_path = os.path.realpath
prompt = raw_input
isfile = os.path.isfile
isdir = os.path.isdir
islink = os.path.islink
geteuid = os.geteuid
getegid = os.getegid
class Process(psutil.Process):
def __str__(self):
return "%s (%s)" % (self.pid, self.name)
# Originally borrowed from nova compute execute.
def execute(cmd,
process_input=None,
check_exit_code=True,
cwd=None,
shell=False,
env_overrides=None,
stdout_fh=subprocess.PIPE,
stderr_fh=subprocess.PIPE):
"""Helper method to execute a command through subprocess.
:param cmd: Command passed to subprocess.Popen.
:param process_input: Input send to opened process.
:param check_exit_code: Specifies whether to check process return code.
If return code is other then `0` - exception will
be raised.
:param cwd: The child's current directory will be changed to
`cwd` before it is executed.
:param shell: Specifies whether to use the shell as the program
to execute.
:param env_overrides: Process environment parameters to override.
:param stdout_fh: Stdout file handler.
:param stderr_fh: Stderr file handler.
:returns: A tuple, (stdout, stderr) from the spawned process.
:raises: :class:`exceptions.ProcessExecutionError` when
process ends with other then `0` return code.
"""
    # Ensure all args are strings (i.e. for callers that pass ints, etc.).
cmd = map(str, cmd)
# NOTE(skudriashev): If shell is True, it is recommended to pass args as a
# string rather than as a sequence.
str_cmd = subprocess.list2cmdline(cmd)
if shell:
cmd = str_cmd
LOG.debug('Running shell cmd: %r' % cmd)
else:
LOG.debug('Running cmd: %r' % cmd)
if process_input is not None:
process_input = str(process_input)
LOG.debug('Process input: %s' % process_input)
if cwd:
LOG.debug('Process working directory: %r' % cwd)
    # Override the process environment if needed.
process_env = None
if env_overrides and len(env_overrides):
process_env = env.get()
for k, v in env_overrides.items():
LOG.debug("Using environment override '%s' => '%s'", k, v)
process_env[k] = str(v)
# Run command process.
exec_kwargs = {
'stdin': subprocess.PIPE,
'stdout': stdout_fh,
'stderr': stderr_fh,
'close_fds': True,
'shell': shell,
'cwd': cwd,
'env': process_env,
}
result = ("", "")
try:
obj = subprocess.Popen(cmd, **exec_kwargs)
result = obj.communicate(process_input)
except OSError as e:
raise excp.ProcessExecutionError(
str_cmd,
exec_kwargs=exec_kwargs,
description="%s: [%s, %s]" % (e, e.errno, e.strerror)
)
else:
rc = obj.returncode
# Handle process exit code.
stdout = result[0] or ""
stderr = result[1] or ""
if rc != 0 and check_exit_code:
# Raise exception if return code is not `0`.
e = excp.ProcessExecutionError(str_cmd,
exec_kwargs=exec_kwargs,
stdout=stdout,
stderr=stderr,
exit_code=rc,
where_output="debug log")
LOG.debug("Stdout: %s", e.stdout)
LOG.debug("Stderr: %s", e.stderr)
raise e
return stdout, stderr
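# A hypothetical usage sketch (the command, directory and variable are
# illustrative only), kept in a function so nothing runs at import time:
def _example_execute():
    stdout, stderr = execute(['echo', 'hello'],
                             cwd='/tmp',
                             env_overrides={'FOO': 'bar'})
    return stdout, stderr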
def execute_save_output(cmd, file_name, **kwargs):
"""Helper method to execute a command through subprocess and save stdout
and stderr into a file.
"""
kwargs = kwargs.copy()
mkdirslist(dirname(file_name))
try:
with open(file_name, 'wb') as fh:
return execute(cmd, stdout_fh=fh, stderr_fh=fh, **kwargs)
except excp.ProcessExecutionError:
with excp.reraise():
try:
with open(file_name, 'rb') as fh:
lines = collections.deque(fh,
maxlen=_TRUNCATED_OUTPUT_LINES)
content = "".join(lines)
except IOError:
pass
else:
LOG.debug('Last lines from %s:\n%s', file_name, content)
@contextlib.contextmanager
def remove_before(path):
if isdir(path):
deldir(path)
if isfile(path):
unlink(path)
yield path
def gzip(file_name, gz_archive_name=None):
if not isfile(file_name):
raise IOError("Can not gzip non-existent file: %s" % (file_name))
if not gz_archive_name:
gz_archive_name = "%s.gz" % (file_name)
with contextlib.closing(gz.open(gz_archive_name, 'wb')) as tz:
with open(file_name, 'rb') as fh:
tz.write(fh.read())
return gz_archive_name
def abspth(path):
if not path:
path = "/"
if path == "~":
path = gethomedir()
return os.path.abspath(path)
def hostname(default='localhost'):
try:
return socket.gethostname()
except socket.error:
return default
# Useful for doing progress bars that get told the current progress
# of the transfer every chunk via the chunk callback function that
# will be called after each chunk has been written...
def pipe_in_out(in_fh, out_fh, chunk_size=1024, chunk_cb=None):
bytes_piped = 0
LOG.debug("Transferring the contents of %s to %s in chunks of size %s.", in_fh, out_fh, chunk_size)
while True:
data = in_fh.read(chunk_size)
if data == '':
# EOF
break
else:
out_fh.write(data)
bytes_piped += len(data)
if chunk_cb:
chunk_cb(bytes_piped)
return bytes_piped
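# A sketch of the chunk callback mentioned above (the paths are
# hypothetical):
def _example_pipe():
    def report(bytes_piped):
        LOG.debug("%s bytes transferred so far", bytes_piped)
    with open('/tmp/src.img', 'rb') as in_fh:
        with open('/tmp/dst.img', 'wb') as out_fh:
            return pipe_in_out(in_fh, out_fh, chunk_size=4096, chunk_cb=report)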
def fileperms(path):
return (os.stat(path).st_mode & 0o777)
def listdir(path, recursive=False, dirs_only=False, files_only=False, filter_func=None):
path = abspth(path)
all_contents = []
if not recursive:
all_contents = os.listdir(path)
all_contents = [joinpths(path, f) for f in all_contents]
else:
for (root, dirs, files) in os.walk(path):
for d in dirs:
all_contents.append(joinpths(root, d))
for f in files:
all_contents.append(joinpths(root, f))
if dirs_only:
all_contents = [f for f in all_contents if isdir(f)]
if files_only:
all_contents = [f for f in all_contents if isfile(f)]
if filter_func:
all_contents = [f for f in all_contents if filter_func(f)]
return all_contents
def joinpths(*paths):
return os.path.join(*paths)
def get_suids():
uid = SUDO_UID
if uid is not None:
uid = int(uid)
gid = SUDO_GID
if gid is not None:
gid = int(gid)
return (uid, gid)
def chown(path, uid, gid):
if uid is None:
uid = -1
if gid is None:
gid = -1
if uid == -1 and gid == -1:
return 0
LOG.debug("Changing ownership of %r to %s:%s" % (path, uid, gid))
os.chown(path, uid, gid)
return 1
def chown_r(path, uid, gid):
changed = 0
for (root, dirs, files) in os.walk(path):
changed += chown(root, uid, gid)
for d in dirs:
dir_pth = joinpths(root, d)
changed += chown(dir_pth, uid, gid)
for f in files:
fn_pth = joinpths(root, f)
changed += chown(fn_pth, uid, gid)
return changed
def _explode_path(path):
dirs = []
comps = []
path = abspth(path)
dirs.append(path)
(head, tail) = os.path.split(path)
while tail:
dirs.append(head)
comps.append(tail)
path = head
(head, tail) = os.path.split(path)
dirs.sort()
comps.reverse()
return (dirs, comps)
def explode_path(path):
return _explode_path(path)[0]
def _attempt_kill(proc, signal_type, max_try, wait_time):
try:
if not proc.is_running():
return (True, 0)
except psutil.error.NoSuchProcess:
return (True, 0)
# Be a little more forceful...
killed = False
attempts = 0
for _i in range(0, max_try):
try:
LOG.debug("Attempting to kill process %s" % (proc))
attempts += 1
proc.send_signal(signal_type)
LOG.debug("Sleeping for %s seconds before next attempt to kill process %s" % (wait_time, proc))
sleep(wait_time)
except psutil.error.NoSuchProcess:
killed = True
break
except Exception as e:
LOG.debug("Failed killing %s due to: %s", proc, e)
LOG.debug("Sleeping for %s seconds before next attempt to kill process %s" % (wait_time, proc))
sleep(wait_time)
return (killed, attempts)
def kill(pid, max_try=4, wait_time=1):
if not is_running(pid):
return (True, 0)
proc = Process(pid)
# Try the nicer sig-int first.
(killed, i_attempts) = _attempt_kill(proc, signal.SIGINT,
int(max_try / 2), wait_time)
if killed:
return (True, i_attempts)
# Get aggressive and try sig-kill.
(killed, k_attempts) = _attempt_kill(proc, signal.SIGKILL,
int(max_try / 2), wait_time)
return (killed, i_attempts + k_attempts)
def is_running(pid):
try:
return Process(pid).is_running()
except psutil.error.NoSuchProcess:
return False
def mkdirslist(path, tracewriter=None):
dirs_possible = explode_path(path)
dirs_made = []
for dir_path in dirs_possible:
if not isdir(dir_path):
mkdir(dir_path, recurse=False)
if tracewriter:
tracewriter.dirs_made(dir_path)
dirs_made.append(dir_path)
return dirs_made
def append_file(fn, text, flush=True, quiet=False):
if not quiet:
LOG.debug("Appending to file %r (%d bytes) (flush=%s)", fn, len(text), (flush))
LOG.debug(">> %s" % (text))
with open(fn, "a") as f:
f.write(text)
if flush:
f.flush()
return fn
def write_file(fn, text, flush=True, quiet=False, tracewriter=None):
if not quiet:
LOG.debug("Writing to file %r (%d bytes) (flush=%s)", fn, len(text), (flush))
LOG.debug("> %s" % (text))
mkdirslist(dirname(fn), tracewriter=tracewriter)
with open(fn, "w") as fh:
if isinstance(text, unicode):
text = text.encode("utf-8")
fh.write(text)
if flush:
fh.flush()
if tracewriter:
tracewriter.file_touched(fn)
def touch_file(fn, die_if_there=True, quiet=False, file_size=0, tracewriter=None):
if not isfile(fn):
if not quiet:
LOG.debug("Touching and truncating file %r (truncate size=%s)", fn, file_size)
mkdirslist(dirname(fn), tracewriter=tracewriter)
with open(fn, "w") as fh:
fh.truncate(file_size)
if tracewriter:
tracewriter.file_touched(fn)
else:
if die_if_there:
msg = "Can not touch & truncate file %r since it already exists" % (fn)
raise excp.FileException(msg)
def load_file(fn):
with open(fn, "rb") as fh:
return fh.read()
def mkdir(path, recurse=True):
if not isdir(path):
if recurse:
LOG.debug("Recursively creating directory %r" % (path))
os.makedirs(path)
else:
LOG.debug("Creating directory %r" % (path))
os.mkdir(path)
return path
def deldir(path):
if isdir(path):
LOG.debug("Recursively deleting directory tree starting at %r" % (path))
shutil.rmtree(path)
def rmdir(path, quiet=True):
if not isdir(path):
return
try:
LOG.debug("Deleting directory %r with the cavet that we will fail if it's not empty." % (path))
os.rmdir(path)
LOG.debug("Deleted directory %r" % (path))
except OSError:
if not quiet:
raise
else:
pass
def symlink(source, link, force=True, tracewriter=None):
LOG.debug("Creating symlink from %r => %r" % (link, source))
mkdirslist(dirname(link), tracewriter=tracewriter)
if force and (exists(link) and islink(link)):
unlink(link, True)
os.symlink(source, link)
if tracewriter:
tracewriter.symlink_made(link)
def getuser():
(uid, _gid) = get_suids()
if uid is None:
return getpass.getuser()
return pwd.getpwuid(uid).pw_name
def getuid(username):
return pwd.getpwnam(username).pw_uid
def gethomedir(user=None):
if not user:
user = getuser()
home_dir = os.path.expanduser("~%s" % (user))
return home_dir
def getgid(groupname):
return grp.getgrnam(groupname).gr_gid
def getgroupname():
(_uid, gid) = get_suids()
if gid is None:
gid = os.getgid()
return grp.getgrgid(gid).gr_name
def unlink(path, ignore_errors=True):
LOG.debug("Unlinking (removing) %r" % (path))
try:
os.unlink(path)
except OSError:
if not ignore_errors:
raise
else:
pass
def copy(src, dst, tracewriter=None):
LOG.debug("Copying: %r => %r" % (src, dst))
shutil.copy(src, dst)
if tracewriter:
tracewriter.file_touched(dst)
return dst
def move(src, dst, force=False):
LOG.debug("Moving: %r => %r" % (src, dst))
if force:
if isdir(dst):
dst = joinpths(dst, basename(src))
if isfile(dst):
unlink(dst)
shutil.move(src, dst)
return dst
def write_file_and_backup(path, contents, bk_ext='org'):
perms = None
backup_path = None
if isfile(path):
perms = fileperms(path)
backup_path = "%s.%s" % (path, bk_ext)
if not isfile(backup_path):
LOG.debug("Backing up %s => %s", path, backup_path)
move(path, backup_path)
else:
LOG.debug("Leaving original backup of %s at %s", path, backup_path)
write_file(path, contents)
if perms is not None:
chmod(path, perms)
return backup_path
def chmod(fname, mode):
LOG.debug("Applying chmod: %r to %o" % (fname, mode))
os.chmod(fname, mode)
return fname
def got_root():
e_id = geteuid()
g_id = getegid()
for a_id in [e_id, g_id]:
if a_id != 0:
return False
return True
def sleep(winks):
if winks <= 0:
return
time.sleep(winks)
def which_first(bin_names, additional_dirs=None, ensure_executable=True):
assert bin_names, 'Binary names required'
for b in bin_names:
try:
return which(b,
additional_dirs=additional_dirs,
ensure_executable=ensure_executable)
except excp.FileException:
pass
bin_names = ", ".join(bin_names)
raise excp.FileException("Can't find any of %s" % bin_names)
def which(bin_name, additional_dirs=None, ensure_executable=True):
def check_it(path):
if not path:
return False
if not isfile(path):
return False
if ensure_executable and not os.access(path, os.X_OK):
return False
return True
full_name = distutils.spawn.find_executable(bin_name)
if check_it(full_name):
return full_name
if not additional_dirs:
additional_dirs = []
for dir_name in additional_dirs:
full_name = joinpths(dirname(dirname(abspth(anvil.__file__))),
dir_name,
bin_name)
if check_it(full_name):
return full_name
raise excp.FileException("Can't find %s" % bin_name)
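# Hypothetical lookups (the binary names and directory are illustrative):
def _example_which():
    # Falls back across candidate names, raising FileException if none match.
    return which_first(['rpmbuild', 'rpmbuild-md5'], additional_dirs=['tools'])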

@ -1,75 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2014 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import mock
from testtools import compat
from testtools import matchers
from testtools import testcase
class TestCase(testcase.TestCase):
"""Base test case class for all anvil unit tests."""
def assertRaisesRegexp(self, expected_exception, expected_regexp,
callable_obj=None, *args, **kwargs):
# TODO(harlowja): submit a pull/review request to testtools to add
        # this method to their codebase instead of having it exist in ours
# since it really doesn't belong here.
class ReRaiseOtherTypes(object):
def match(self, matchee):
if not issubclass(matchee[0], expected_exception):
compat.reraise(*matchee)
matcher = matchers.Raises(matchers.MatchesAll(ReRaiseOtherTypes(),
matchers.MatchesException(expected_exception,
expected_regexp)))
our_callable = testcase.Nullary(callable_obj, *args, **kwargs)
self.assertThat(our_callable, matcher)
class MockTestCase(TestCase):
"""Base test case class for all anvil mocking unit tests."""
def setUp(self):
super(MockTestCase, self).setUp()
self.master_mock = mock.MagicMock(name='master_mock')
def _patch_class(self, module, name, autospec=True, attach_as=None):
"""Patch class, create class instance mock and attach them to
the master mock.
"""
if autospec:
instance_mock = mock.MagicMock(spec=getattr(module, name))
else:
instance_mock = mock.MagicMock()
patcher = mock.patch.object(module, name, autospec=autospec)
class_mock = patcher.start()
self.addCleanup(patcher.stop)
class_mock.return_value = instance_mock
if attach_as is None:
attach_class_as = name
attach_instance_as = name.lower()
else:
attach_class_as = attach_as + '_class'
attach_instance_as = attach_as
self.master_mock.attach_mock(class_mock, attach_class_as)
self.master_mock.attach_mock(instance_mock, attach_instance_as)
return class_mock, instance_mock
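# A hypothetical subclass sketch ('downloader'/'GitDownloader' are just an
# example module/class pair from this codebase):
#
#     from anvil import downloader
#
#     class MyMockTest(MockTestCase):
#         def setUp(self):
#             super(MyMockTest, self).setUp()
#             self.git_class, self.git = self._patch_class(
#                 downloader, 'GitDownloader', attach_as='git')
#             # Calls are now recorded, in order, on self.master_mock.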

@ -1,15 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

@ -1,652 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2013 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import mock
import os
import shutil
import tempfile
from anvil import cfg
from anvil import exceptions
from anvil import shell
from anvil import test
from anvil import utils
class TestYamlRefLoader(test.TestCase):
def setUp(self):
super(TestYamlRefLoader, self).setUp()
self.sample = ""
self.sample2 = ""
self.sample3 = ""
self.temp_dir = tempfile.mkdtemp()
self.loader = cfg.YamlRefLoader(self.temp_dir)
def tearDown(self):
shutil.rmtree(self.temp_dir, ignore_errors=True)
del self.loader
super(TestYamlRefLoader, self).tearDown()
def _write_samples(self):
with open(os.path.join(self.temp_dir, 'sample.yaml'), 'w') as f:
f.write(self.sample)
with open(os.path.join(self.temp_dir, 'sample2.yaml'), 'w') as f:
f.write(self.sample2)
with open(os.path.join(self.temp_dir, 'sample3.yaml'), 'w') as f:
f.write(self.sample3)
def test_load__default(self):
self.sample = "default: default_value"
self._write_samples()
processed = self.loader.load('sample')
should_be = utils.OrderedDict({'default': 'default_value'})
self.assertEqual(processed, should_be)
def test_load__empty(self):
self.sample = ""
self._write_samples()
processed = self.loader.load('sample')
should_be = utils.OrderedDict()
self.assertEqual(processed, should_be)
def test_load__empty2(self):
self.sample = "empty: "
self._write_samples()
processed = self.loader.load('sample')
should_be = utils.OrderedDict({'empty': None})
self.assertEqual(processed, should_be)
def test_load__integer(self):
self.sample = "integer: 11"
self._write_samples()
processed = self.loader.load('sample')
should_be = utils.OrderedDict({'integer': 11})
self.assertEqual(processed, should_be)
def test_load__string(self):
self.sample = 'string: "string sample"'
self._write_samples()
processed = self.loader.load('sample')
should_be = utils.OrderedDict({'string': "string sample"})
self.assertEqual(processed, should_be)
def test_load__float(self):
self.sample = "float: 1.1234"
self._write_samples()
processed = self.loader.load('sample')
self.assertAlmostEqual(processed['float'], 1.1234)
def test_load__bool(self):
self.sample = "bool: true"
self._write_samples()
processed = self.loader.load('sample')
should_be = utils.OrderedDict({'bool': True})
self.assertEqual(processed, should_be)
def test_load__list(self):
self.sample = """
list:
- first
- second
- 100
"""
self._write_samples()
processed = self.loader.load('sample')
should_be = utils.OrderedDict({'list': ['first', 'second', 100]})
self.assertEqual(processed, should_be)
def test_load__dict(self):
self.sample = """
dict:
integer: 11
default: default_value
string: "string sample"
"""
self._write_samples()
        # Note: dictionaries are always sorted by option names.
processed = self.loader.load('sample')
should_be = utils.OrderedDict([
('dict',
utils.OrderedDict([
('default', 'default_value'),
('integer', 11),
('string', 'string sample')
]))
])
self.assertEqual(processed, should_be)
def test_load__nested_dict(self):
self.sample = """
dict:
dict1:
default: default_value
integer: 11
dict2:
default: default_value
string: "string sample"
"""
self._write_samples()
processed = self.loader.load('sample')
should_be = utils.OrderedDict({
'dict': {
'dict1': {'default': 'default_value',
'integer': 11},
'dict2': {'default': 'default_value',
'string': 'string sample'}
}
})
self.assertEqual(processed, should_be)
def test_load__complex(self):
self.sample = """
# some comments...
integer: 15
bool-opt: false
bool-opt2: 0
bool-opt3: 1
float: 0.15
list:
- 1st
- 2nd
- 0.1
- 100
- true
dict:
dict1:
default: default_value 1
integer: 11
bool: true
dict2:
default: default_value 2
"""
self._write_samples()
processed = self.loader.load('sample')
self.assertEqual(len(processed), 7)
self.assertEqual(processed['integer'], 15)
self.assertEqual(processed['bool-opt'], False)
self.assertEqual(processed['bool-opt2'], False)
self.assertEqual(processed['bool-opt3'], True)
self.assertAlmostEqual(processed['float'], 0.15)
self.assertEqual(processed['list'], ['1st', '2nd', 0.1, 100, True])
self.assertEqual(processed['dict']['dict1']['integer'], 11)
self.assertEqual(processed['dict']['dict1']['bool'], True)
self.assertEqual(processed['dict']['dict1']['default'],
'default_value 1')
self.assertEqual(processed['dict']['dict2']['default'],
'default_value 2')
def test_load__simple_reference(self):
self.sample = 'opt: $(sample2:opt)'
self.sample2 = 'opt: 10'
self._write_samples()
processed = self.loader.load('sample')
should_be = utils.OrderedDict({'opt': 10})
self.assertEqual(processed, should_be)
def test_load__self_reference(self):
self.sample = """
opt1: "$(sample:opt2)"
opt2: "$(sample:opt3)"
opt3: 10
"""
self._write_samples()
processed = self.loader.load('sample')
should_be = utils.OrderedDict([('opt1', 10), ('opt2', 10), ('opt3', 10)])
self.assertEqual(processed, should_be)
def test_load__auto_reference(self):
self.sample = """
ip: "$(auto:ip)"
host: "$(auto:hostname)"
home: "$(auto:home)"
"""
self._write_samples()
processed = self.loader.load('sample')
self.assertIsInstance(processed, utils.OrderedDict)
self.assertEqual(len(processed), 3)
self.assertEqual(processed['ip'], utils.get_host_ip())
self.assertEqual(processed['host'], shell.hostname())
self.assertEqual(processed['home'], shell.gethomedir())
def test_load__multi_reference(self):
self.sample = """
multi_ref: "9 + $(sample2:opt) + $(sample3:opt) + $(auto:home) + 12"
"""
self.sample2 = """opt: 10"""
self.sample3 = """opt: 11"""
self._write_samples()
processed = self.loader.load('sample')
self.assertIsInstance(processed, utils.OrderedDict)
self.assertEqual(len(processed), 1)
self.assertEqual(processed['multi_ref'],
"9 + 10 + 11 + " + shell.gethomedir() + " + 12")
def test_load__dict_reference(self):
self.sample = """
sample2:
opt: "$(sample2:opt)"
"""
self.sample2 = """opt: 10"""
self._write_samples()
processed = self.loader.load('sample')
should_be = utils.OrderedDict([
('sample2', utils.OrderedDict([
('opt', 10)
]))
])
self.assertEqual(processed, should_be)
def test_load__wrapped_ref(self):
self.sample = """
stable: 23
prefixed: "1$(sample:stable)"
suffixed: "$(sample:stable)4"
wrapped: "1$(sample:stable)4"
"""
self._write_samples()
processed = self.loader.load('sample')
self.assertEqual(processed['prefixed'], "123")
self.assertEqual(processed['suffixed'], "234")
self.assertEqual(processed['wrapped'], "1234")
def test_load__complex_reference(self):
self.sample = """
stable: 9
ref0: "$(sample:stable)"
ref1: "$(sample2:stable)"
ref2: "$(sample2:ref1)"
ref3: "$(sample2:ref2)"
ref4: "$(sample2:ref3)"
ref5: "$(sample3:ref1)"
sample:
stable: "$(sample:stable)"
ref0: "$(sample:ref0)"
ref1: "$(sample:ref1)"
sample2:
stable: "$(sample2:stable)"
ref3: "$(sample2:ref3)"
sample3:
stable: "$(sample3:stable)"
ref1: "$(sample3:ref1)"
list:
- "$(sample:sample2)"
- "$(sample:sample3)"
dict:
sample3: "$(sample:sample3)"
sample2: "$(sample:sample2)"
"""
self.sample2 = """
stable: 10
ref1: "$(sample:stable)"
ref2: "$(sample3:stable)"
ref3: "$(sample3:ref1)"
ref4: "$(sample2:stable)"
"""
self.sample3 = """
stable: 11
ref1: "$(sample:stable)"
"""
self._write_samples()
processed = self.loader.load('sample')
self.assertIsInstance(processed, utils.OrderedDict)
#self.assertEqual(len(processed), 11)
self.assertEqual(processed['stable'], 9)
self.assertEqual(processed['ref0'], 9)
self.assertEqual(processed['ref1'], 10)
self.assertEqual(processed['ref2'], 9)
self.assertEqual(processed['ref3'], 11)
self.assertEqual(processed['ref4'], 9)
self.assertEqual(processed['ref5'], 9)
sample = utils.OrderedDict([
('ref0', 9),
('ref1', 10),
('stable', 9),
])
self.assertEqual(processed['sample'], sample)
sample2 = utils.OrderedDict([
('ref3', 9),
('stable', 10),
])
self.assertEqual(processed['sample2'], sample2)
sample3 = utils.OrderedDict([
('ref1', 9),
('stable', 11),
])
self.assertEqual(processed['sample3'], sample3)
self.assertEqual(processed['list'], [sample2, sample3])
self.assertEqual(
processed['dict'],
utils.OrderedDict([
('sample2', sample2),
('sample3', sample3),
])
)
processed = self.loader.load('sample2')
self.assertEqual(processed, {
'stable': 10,
'ref1': 9,
'ref2': 11,
'ref3': 9,
'ref4': 10,
})
processed = self.loader.load('sample3')
self.assertEqual(len(processed), 2)
self.assertEqual(processed['stable'], 11)
self.assertEqual(processed['ref1'], 9)
def test_load__magic_reference(self):
self.sample = """
magic:
reference: $(sample:reference)
reference: "$(sample:stable)"
stable: 1
"""
self._write_samples()
processed = self.loader.load('sample')
self.assertEqual(processed['stable'], 1)
self.assertEqual(processed['reference'], 1)
self.assertEqual(processed['magic']['reference'], 1)
def test_load__more_complex_ref(self):
"""Test loading references links via dictionaries and lists."""
self.sample = """
stable: 9
ref_to_s1: "$(sample:stable)"
ref_to_s2: "$(sample2:stable)"
ref_to_s3: "$(sample3:stable)"
sample:
stable: "$(sample:stable)"
ref_to_s1: "$(sample:ref_to_s1)"
ref_to_s2: "$(sample:ref_to_s2)"
list:
- "$(sample:stable)"
- "$(sample2:stable)"
- "$(sample3:stable)"
- "$(sample:ref_to_s1)"
- "$(sample:ref_to_s2)"
- "$(sample:ref_to_s3)"
- "$(sample:sample)"
dict:
stable: "$(sample:stable)"
sample: "$(sample:sample)"
list: "$(sample:list)"
"""
self.sample2 = """stable: 10"""
self.sample3 = """stable: 11"""
self._write_samples()
processed = self.loader.load('sample')
self.assertEqual(processed['stable'], 9)
self.assertEqual(processed['ref_to_s1'], 9)
self.assertEqual(processed['ref_to_s2'], 10)
self.assertEqual(processed['ref_to_s3'], 11)
self.assertEqual(
processed['sample'],
utils.OrderedDict([('ref_to_s1', 9),
('ref_to_s2', 10),
('stable', 9)])
)
self.assertEqual(processed['list'], [9, 10, 11, 9, 10, 11,
processed['sample']])
self.assertEqual(
processed['dict'],
utils.OrderedDict([
('list', processed['list']),
('sample', processed['sample']),
('stable', 9),
])
)
def test_load__raises_no_option(self):
self.sample = "ref: $(sample2:no-such-opt)"
self.sample2 = ""
self._write_samples()
self.assertRaises(exceptions.YamlOptionNotFoundException,
self.loader.load, 'sample')
def test_load__raises_no_config(self):
        self.sample = "ref: $(no-such-conf:opt)"
self.sample2 = ""
self._write_samples()
self.assertRaises(exceptions.YamlConfigNotFoundException,
self.loader.load, 'sample')
def test_load__raises_loop(self):
self.sample = "opt: $(sample2:opt)"
self.sample2 = "opt: $(sample:opt)"
self._write_samples()
self.assertRaises(exceptions.YamlLoopException,
self.loader.load, 'sample')
def test_load__raises_self_loop(self):
self.sample = "opt: $(sample:opt)"
self._write_samples()
self.assertRaises(exceptions.YamlLoopException,
self.loader.load, 'sample')
self.sample = """
opt:
- $(sample:opt)
"""
self._write_samples()
self.assertRaises(exceptions.YamlLoopException,
self.loader.load, 'sample')
self.sample = """
opt:
opt: $(sample:opt)
"""
self._write_samples()
self.assertRaises(exceptions.YamlLoopException,
self.loader.load, 'sample')
def test_update_cache(self):
self.sample = """
stable: 9
reference: "$(sample2:stable)"
reference2: "$(sample2:stable)"
reference3: "$(sample2:stable2)"
"""
self.sample2 = """
stable: 10
stable2: 11
"""
self._write_samples()
self.loader.update_cache('sample', dict(reference=20))
self.loader.update_cache('sample2', dict(stable=21))
processed = self.loader.load('sample')
self.assertEqual(processed['stable'], 9)
self.assertEqual(processed['reference'], 20)
self.assertEqual(processed['reference2'], 21)
self.assertEqual(processed['reference3'], 11)
def test_update_cache__few_times(self):
self.sample = "stable: '$(sample2:stable)'"
self.sample2 = "stable: 10"
self._write_samples()
processed = self.loader.load('sample')
self.assertEqual(processed['stable'], 10)
self.loader.update_cache('sample', dict(stable=11))
processed = self.loader.load('sample')
self.assertEqual(processed['stable'], 11)
self.loader.update_cache('sample', dict(stable=12))
processed = self.loader.load('sample')
self.assertEqual(processed['stable'], 12)
class TestYamlMergeLoader(test.TestCase):
def setUp(self):
super(TestYamlMergeLoader, self).setUp()
class Distro(object):
def __init__(self):
self.options = {
'unique-distro': True,
'redefined-in-general': 0,
'redefined-in-component': 0
}
class Persona(object):
def __init__(self):
self.component_options = {
'component': {
'unique-specific': True,
'redefined-in-specific': 1
}
}
self.general = ""
self.component = ""
self.distro = Distro()
self.persona = Persona()
self.temp_dir = tempfile.mkdtemp()
with mock.patch('anvil.settings.COMPONENT_CONF_DIR', self.temp_dir):
self.loader = cfg.YamlMergeLoader(self.temp_dir)
def tearDown(self):
super(TestYamlMergeLoader, self).tearDown()
shutil.rmtree(self.temp_dir, ignore_errors=True)
def _write_samples(self):
with open(os.path.join(self.temp_dir, 'general.yaml'), 'w') as f:
f.write(self.general)
with open(os.path.join(self.temp_dir, 'component.yaml'), 'w') as f:
f.write(self.component)
def test_load(self):
self.general = """
unique-general: True
redefined-in-general: 1
redefined-in-component: 1
"""
self.component = """
unique-component: True
redefined-in-component: 2
redefined-in-specific: 0
"""
self._write_samples()
merged = self.loader.load(self.distro, 'component', self.persona)
should_be = utils.OrderedDict([
('app_dir', os.path.join(self.temp_dir, 'component', 'app')),
('component_dir', os.path.join(self.temp_dir, 'component')),
('root_dir', os.path.join(self.temp_dir)),
('trace_dir', os.path.join(self.temp_dir, 'component', 'traces')),
('unique-distro', True),
('redefined-in-general', 1),
('redefined-in-component', 2),
('redefined-in-specific', 1),
('unique-general', True),
('unique-specific', True),
('unique-component', True),
])
self.assertEqual(merged, should_be)
# yet once loading with changed values.
self.persona.component_options['component']['redefined-in-specific'] = 2
merged = self.loader.load(self.distro, 'component', self.persona)
self.assertEqual(merged['redefined-in-specific'], 2)

@ -1,66 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2013 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from anvil import downloader
from anvil import exceptions
from anvil import test
class TestGitDownloader(test.TestCase):
def setUp(self):
super(TestGitDownloader, self).setUp()
self._uri = 'https://github.com/stackforge/anvil.git'
self._dst = '/root/anvil'
self._sha1 = '0a4d55a8d778e5022fab701977c5d840bbc486d0'
self._tag = '1.0.6'
def test_constructor_basic(self):
d = downloader.GitDownloader(self._uri, self._dst)
self.assertEqual(d._uri, self._uri)
self.assertEqual(d._dst, self._dst)
self.assertEqual(d._branch, 'master')
self.assertEqual(d._tag, None)
self.assertEqual(d._sha1, None)
def test_constructor_branch(self):
branch = 'stable/havana'
d = downloader.GitDownloader(self._uri, self._dst, branch=branch)
self.assertEqual(d._branch, branch)
self.assertEqual(d._tag, None)
self.assertEqual(d._sha1, None)
def test_constructor_string_tag(self):
d = downloader.GitDownloader(self._uri, self._dst, tag=self._tag)
self.assertEqual(d._tag, self._tag)
self.assertEqual(d._sha1, None)
def test_constructor_float_tag(self):
tag = 2013.2
d = downloader.GitDownloader(self._uri, self._dst, tag=tag)
self.assertEqual(d._tag, str(tag))
self.assertEqual(d._sha1, None)
def test_constructor_sha1(self):
d = downloader.GitDownloader(self._uri, self._dst, sha1=self._sha1)
self.assertEqual(d._tag, None)
self.assertEqual(d._sha1, self._sha1)
def test_constructor_raises_exception(self):
kwargs = {"tag": self._tag, 'sha1': self._sha1}
self.assertRaises(exceptions.ConfigException,
downloader.GitDownloader,
self._uri, self._dst, **kwargs)

@ -1,150 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2013 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from anvil import exceptions as exc
from anvil import test
class TestProcessExecutionError(test.TestCase):
def assertExceptionMessage(self, err, cmd, stdout='', stderr='',
exit_code='-', description=None):
if description is None:
description = 'Unexpected error while running command.'
message = ('%s\nCommand: %s\nExit code: %s\nStdout: %s\nStderr: %s' %
(description, cmd, exit_code, stdout, stderr))
self.assertEqual(err.message, message)
def setUp(self):
super(TestProcessExecutionError, self).setUp()
self.cmd = 'test-command'
self.stdout = 'test-stdout'
self.stderr = 'test-stderr'
def test_default(self):
err = exc.ProcessExecutionError(self.cmd)
self.assertExceptionMessage(err, cmd=self.cmd)
def test_stdout(self):
err = exc.ProcessExecutionError(self.cmd, stdout=self.stdout)
self.assertExceptionMessage(err, cmd=self.cmd, stdout=self.stdout)
self.assertEqual(self.stdout, err.stdout)
def test_stdout_empty(self):
err = exc.ProcessExecutionError(self.cmd, stdout='')
self.assertExceptionMessage(err, cmd=self.cmd, stdout='')
self.assertEqual('', err.stdout)
def test_stdout_none(self):
err = exc.ProcessExecutionError(self.cmd, stdout=None)
self.assertExceptionMessage(err, cmd=self.cmd, stdout=None)
def test_stderr(self):
err = exc.ProcessExecutionError(self.cmd, stderr=self.stderr)
self.assertExceptionMessage(err, cmd=self.cmd, stderr=self.stderr)
self.assertEqual(self.stderr, err.stderr)
def test_stderr_none(self):
err = exc.ProcessExecutionError(self.cmd, stderr=None)
self.assertExceptionMessage(err, cmd=self.cmd, stderr=None)
def test_exit_code_int(self):
err = exc.ProcessExecutionError(self.cmd, exit_code=0)
self.assertExceptionMessage(err, self.cmd, exit_code=0)
def test_exit_code_long(self):
err = exc.ProcessExecutionError(self.cmd, exit_code=0L)
self.assertExceptionMessage(err, self.cmd, exit_code=0L)
def test_exit_code_not_valid(self):
err = exc.ProcessExecutionError(self.cmd, exit_code='code')
self.assertExceptionMessage(err, self.cmd, exit_code='-')
err = exc.ProcessExecutionError(self.cmd, exit_code=0.0)
self.assertExceptionMessage(err, self.cmd, exit_code='-')
def test_description(self):
description = 'custom description'
err = exc.ProcessExecutionError(self.cmd, description=description)
self.assertExceptionMessage(err, self.cmd, description=description)
class TestReraise(test.TestCase):
def test_reraise_exception(self):
buff = []
def failure():
raise IOError("Broken")
def activate():
try:
failure()
except Exception:
with exc.reraise():
buff.append(1)
self.assertRaises(IOError, activate)
self.assertEqual([1], buff)
def test_override_reraise_exception(self):
def failure():
raise IOError("Broken")
def activate():
try:
failure()
except Exception:
with exc.reraise():
raise RuntimeError("Really broken")
self.assertRaises(RuntimeError, activate)
class TestYamlException(test.TestCase):
def test_yaml_exception(self):
self.assertTrue(issubclass(exc.YamlException,
exc.ConfigException))
def test_yaml_option_not_found_exception(self):
self.assertTrue(issubclass(exc.YamlOptionNotFoundException,
exc.YamlException))
exc_str = str(exc.YamlOptionNotFoundException(
'conf-sample', 'opt-sample', 'ref-conf', 'ref-opt'
))
self.assertTrue("`conf-sample`" in exc_str)
self.assertTrue("`ref-opt`" in exc_str)
self.assertTrue("opt-sample" in exc_str)
self.assertTrue("ref-conf:ref-opt" in exc_str)
def test_yaml_config_not_found_exception(self):
self.assertTrue(issubclass(exc.YamlConfigNotFoundException,
exc.YamlException))
exc_str = str(exc.YamlConfigNotFoundException("no/such//path/to/yaml"))
self.assertTrue("no/such//path/to/yaml" in exc_str)
def test_yaml_loop_exception(self):
self.assertTrue(issubclass(exc.YamlLoopException, exc.YamlException))
exc_str = str(exc.YamlLoopException('conf-sample', 'opt-sample',
[('s1', 'r1'), ('s2', 'r2')]))
self.assertTrue("`conf-sample`" in exc_str)
self.assertTrue("`opt-sample`" in exc_str)
self.assertTrue("loop found" in exc_str)
self.assertTrue("`s1`=>`r1`" in exc_str)
self.assertTrue("`s2`=>`r2`" in exc_str)

View File

@ -1,253 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2013 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import StringIO
from anvil import ini_parser
from anvil import test
class TestAnvilConfigParser(test.TestCase):
def setUp(self):
super(TestAnvilConfigParser, self).setUp()
self.config_parser = ini_parser.AnvilConfigParser()
def _read_ini(self, ini):
stream = StringIO.StringIO(ini)
self.config_parser.readfp(stream)
def test_commented_option_regexp_simple(self):
regexp = self.config_parser.option_regex
option = "# option1 = True"
result = regexp.match(option)
self.assertNotEqual(result, None)
self.assertEqual(result.group(1), "option1")
def test_commented_option_regexp_no_spaces(self):
regexp = self.config_parser.option_regex
option = "#option1=True"
result = regexp.match(option)
self.assertNotEqual(result, None)
self.assertEqual(result.group(1), "option1")
def test_commented_option_regexp_more_spaces(self):
regexp = self.config_parser.option_regex
option = "# option1 = True"
result = regexp.match(option)
self.assertNotEqual(result, None)
self.assertEqual(result.group(1), "option1")
def test_commented_option_regexp_with_spaces(self):
regexp = self.config_parser.option_regex
option = "# option name = option value"
result = regexp.match(option)
self.assertNotEqual(result, None)
self.assertEqual(result.group(1), "option name")
def test_readfp_comments_no_option(self):
ini = """
[DEFAULT]
# comment line #1
# comment line #2
# comment line #3
"""
self._read_ini(ini)
# 7 global scope elements
global_elements = self.config_parser.data._data.contents
self.assertEqual(len(global_elements), 7)
def test_readfp_comments_one_section(self):
ini = """
[DEFAULT]
# comment line #1
# option1 = value1
# comment line #2
# option2 = value2
"""
self._read_ini(ini)
# 3 global scope elements
global_elements = self.config_parser.data._data.contents
self.assertEqual(len(global_elements), 3)
# 6 lines in default section
default_section = global_elements[1]
self.assertEqual(len(default_section.contents), 6)
def test_readfp_comments_several_section(self):
ini = """
[DEFAULT]
# comment line #1
# option1 = value1
[ANOTHER_SECTION]
# comment line #1
# comment line #2
# option2 = value2
"""
self._read_ini(ini)
# 5 global scope elements
global_elements = self.config_parser.data._data.contents
self.assertEqual(len(global_elements), 5)
# 3 lines in default section
default_section = global_elements[1]
self.assertEqual(len(default_section.contents), 3)
# 4 lines in another section
another_section = global_elements[3]
self.assertEqual(len(another_section.contents), 4)
def test_readfp_no_sections(self):
ini = """
# comment line #1
# option1 = value1
# comment line #2
# option2 = value2
"""
self._read_ini(ini)
# 7 global scope elements
global_elements = self.config_parser.data._data.contents
self.assertEqual(len(global_elements), 7)
def test_readfp_with_global_comment(self):
ini = """
[DEFAULT]
# comment line #1
option1 = value1
# global scope comment
[ANOTHER_SECTION]
# comment line #2
option2 = value2
"""
self._read_ini(ini)
# 7 global scope elements
global_elements = self.config_parser.data._data.contents
self.assertEqual(len(global_elements), 7)
# 3 lines in default section
default_section = global_elements[1]
self.assertEqual(len(default_section.contents), 3)
# 3 lines in another section
another_section = global_elements[5]
self.assertEqual(len(another_section.contents), 3)
def test_set_one_option_simple(self):
ini = """
[DEFAULT]
# option1 = value1
# option2 = value2
"""
pattern = """
[DEFAULT]
# option1 = value1
option1 = True
# option2 = value2
"""
self._read_ini(ini)
self.config_parser.set('DEFAULT', 'option1', 'True')
output = StringIO.StringIO()
self.config_parser.write(output)
self.assertEqual(output.getvalue(), pattern)
def test_set_one_option_same_commented(self):
ini = """
[DEFAULT]
# comment line #1
# option1 = value1
# comment line #2
# option1 = value1
# comment line #3
# option2 = value2
"""
pattern = """
[DEFAULT]
# comment line #1
# option1 = value1
# comment line #2
# option1 = value1
option1 = True
# comment line #3
# option2 = value2
"""
self._read_ini(ini)
self.config_parser.set('DEFAULT', 'option1', 'True')
output = StringIO.StringIO()
self.config_parser.write(output)
self.assertEqual(output.getvalue(), pattern)
def test_set_one_option_non_existent(self):
ini = """
[DEFAULT]
# option1 = value1
# option2 = value2
"""
pattern = """
[DEFAULT]
option3 = False
# option1 = value1
# option2 = value2
"""
self._read_ini(ini)
self.config_parser.set('DEFAULT', 'option3', 'False')
output = StringIO.StringIO()
self.config_parser.write(output)
self.assertEqual(output.getvalue(), pattern)
def test_set_several_options_complex(self):
ini = """
[DEFAULT]
# option1 = value1
# option2 = value2
"""
pattern = """
[DEFAULT]
option3 = False
# option1 = value1
option1 = True
# option2 = value2
option2 = False
"""
self._read_ini(ini)
self.config_parser.set('DEFAULT', 'option1', 'True')
self.config_parser.set('DEFAULT', 'option2', 'False')
self.config_parser.set('DEFAULT', 'option3', 'False')
output = StringIO.StringIO()
self.config_parser.write(output)
self.assertEqual(output.getvalue(), pattern)
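
In short, set() writes the new assignment directly after the last commented occurrence of the option (or prepends it to the section when no such comment exists), and write() preserves every comment line; a sketch using an illustrative ini snippet:

    import StringIO

    from anvil import ini_parser

    parser = ini_parser.AnvilConfigParser()
    parser.readfp(StringIO.StringIO("[DEFAULT]\n# option1 = value1\n"))
    parser.set('DEFAULT', 'option1', 'True')
    out = StringIO.StringIO()
    parser.write(out)
    # out.getvalue() keeps the "# option1 = value1" comment and adds
    # "option1 = True" immediately below it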

View File

@ -1,38 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import os
import tempfile
from anvil import log
from anvil import test
class TestLog(test.TestCase):
def setUp(self):
super(TestLog, self).setUp()
self.test_logger = log.getLogger().logger
self.test_logger.handlers = []
self.log_name = tempfile.mkstemp()[1]
def tearDown(self):
if os.path.isfile(self.log_name):
os.remove(self.log_name)
super(TestLog, self).tearDown()
def test_logger_has_two_handlers(self):
log.setupLogging(log.INFO, tee_filename=self.log_name)
self.assertEqual(len(self.test_logger.handlers), 2)

View File

@ -1,139 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2014 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import mock
import subprocess
from anvil import exceptions as exc
from anvil import shell as sh
from anvil import test
class TestShell(test.MockTestCase):
def setUp(self):
super(TestShell, self).setUp()
self.cmd = ['test', 'command']
self.str_cmd = ' '.join(self.cmd)
self.result = ('stdout', 'stderr')
# patch subprocess.Popen
self.popen_mock, self.popen_inst_mock = self._patch_class(
sh.subprocess, 'Popen')
self.popen_inst_mock.returncode = 0
self.popen_inst_mock.communicate.return_value = self.result
def test_execute_default_params(self):
result = sh.execute(self.cmd)
master_mock_calls = [
mock.call.Popen(self.cmd,
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
close_fds=True,
shell=False,
cwd=None,
env=None),
mock.call.popen.communicate(None)
]
self.assertEqual(self.master_mock.mock_calls, master_mock_calls)
self.assertEqual(result, self.result)
@mock.patch.object(sh.env, 'get')
def test_execute_custom_params(self, mocked_env_get):
mocked_env_get.return_value = {'a': 'a'}
env = {'b': 'b'}
sh.execute(self.cmd,
process_input='input',
cwd='cwd',
shell=True,
env_overrides=env,
stdout_fh='stdout_fh',
stderr_fh='stderr_fh')
env.update({'a': 'a'})
self.assertEqual(self.master_mock.mock_calls, [
mock.call.Popen(self.str_cmd,
stdin=subprocess.PIPE,
stdout='stdout_fh',
stderr='stderr_fh',
close_fds=True,
shell=True,
cwd='cwd',
env=env),
mock.call.popen.communicate('input')
])
def test_execute_with_result_none(self):
self.popen_inst_mock.communicate.return_value = (None, None)
self.assertEqual(sh.execute(self.cmd), ('', ''))
def test_execute_popen_raises(self):
self.popen_mock.side_effect = OSError('Woot!')
self.assertRaises(exc.ProcessExecutionError, sh.execute, self.cmd)
def test_execute_communicate_raises(self):
self.popen_inst_mock.communicate.side_effect = OSError('Woot!')
self.assertRaises(exc.ProcessExecutionError, sh.execute, self.cmd)
def test_execute_bad_return_code_no_check(self):
self.popen_inst_mock.returncode = 1
self.assertEqual(sh.execute(self.cmd, check_exit_code=False),
self.result)
def test_execute_bad_return_code_with_check(self):
self.popen_inst_mock.returncode = 1
self.assertRaisesRegexp(exc.ProcessExecutionError,
"Unexpected error while running command.\n"
"Command: %s\n"
"Exit code: 1\n"
"Stdout: stdout\n"
"Stderr: stderr" % self.str_cmd,
sh.execute, self.cmd)
def test_execute_bad_return_code_with_tail(self):
self.popen_inst_mock.returncode = 1
self.popen_inst_mock.communicate.return_value = (
'0\n1\n2\n3\n4\n5\n6\n7\n8\n', '')
stdout = ('2\n3\n4\n5\n6\n7\n8\n')
expected = (
"Unexpected error while running command.\n"
"Command: %s\n"
"Exit code: 1\n"
"Stdout: %s \(see debug log for more details...\)\n"
"Stderr: " % (self.str_cmd, stdout)
)
self.assertRaisesRegexp(exc.ProcessExecutionError,
expected, sh.execute, self.cmd)
@mock.patch.object(sh, 'mkdirslist')
def test_execute_save_output(self, mocked_mkdirslist):
self.popen_inst_mock.returncode = 1
file_name = 'output.txt'
with mock.patch.object(sh, 'open', mock.mock_open(),
create=True) as fh_mock:
fh_mock.return_value.name = file_name
self.assertRaisesRegexp(
exc.ProcessExecutionError,
"Unexpected error while running command.\n"
"Command: %s\n"
"Exit code: 1\n"
"Stdout: <redirected to %s>\n"
"Stderr: <redirected to %s>" % (self.str_cmd, file_name,
file_name),
sh.execute_save_output, self.cmd, file_name
)
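
From the calls mocked above, sh.execute() takes a command list plus keyword options, returns a (stdout, stderr) pair, and raises ProcessExecutionError on a non-zero exit unless check_exit_code is disabled; a hedged sketch:

    from anvil import exceptions as exc
    from anvil import shell as sh

    stdout, stderr = sh.execute(['uname', '-a'])        # raises on non-zero exit
    stdout, stderr = sh.execute(['ls', '/no/such/dir'],
                                check_exit_code=False)  # failure tolerated
    try:
        sh.execute(['false'])
    except exc.ProcessExecutionError as e:
        print(e.message)  # includes command, exit code, stdout and stderr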

View File

@ -1,104 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2014 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import glob
import re
import sys
from anvil.packaging.helpers import pip_helper
from anvil import shell as sh
from anvil import test
from anvil import utils
from nose_parameterized import parameterized
EXAMPLE_GLOB = sh.joinpths("data", "tests", "requirements*.yaml")
def load_examples():
examples = []
for filename in glob.glob(EXAMPLE_GLOB):
if sh.isfile(filename):
# The test generator will use the first element as the test
# identifier so provide a filename + index based test identifier to
# be able to connect test failures to the example which caused it.
try:
base = sh.basename(filename)
base = re.sub(r"[.\s]", "_", base)
for i, example in enumerate(utils.load_yaml(filename)):
examples.append(("%s_%s" % (base, i), example))
except IOError:
pass
return examples
class TestTools(test.TestCase):
def setUp(self):
super(TestTools, self).setUp()
self.multipip = [sys.executable, sh.which("multipip", ['tools'])]
def _run_multipip(self, versions):
cmd = list(self.multipip)
cmd.extend(versions)
return sh.execute(cmd, check_exit_code=False)
def _extract_conflicts(self, stderr):
conflicts = {}
current_name = None
capturing = False
for line in stderr.splitlines():
if line.endswith(": incompatible requirements"):
capturing = False
current_name = line.split(":", 1)[0].lower().strip()
if current_name not in conflicts:
conflicts[current_name] = []
continue
if line.startswith("Choosing") and current_name:
capturing = False
continue
if line.startswith("Conflicting") and current_name:
capturing = True
continue
if capturing and current_name and line.startswith("\t"):
try:
line = line.lstrip()
_where, req = line.split(":", 1)
req = req.strip()
if req:
conflicts[current_name].append(req)
except ValueError:
pass
return conflicts
def assertEquivalentRequirements(self, expected, created):
self.assertEqual(len(expected), len(created))
for req in created:
self.assertIn(req, expected)
@parameterized.expand(load_examples())
def test_example(self, _name, example):
(stdout, stderr) = self._run_multipip(example['requirements'])
expected_normalized = []
for line in example['expected'].strip().splitlines():
expected_normalized.append(pip_helper.extract_requirement(line))
parsed_normalized = []
for line in stdout.strip().splitlines():
parsed_normalized.append(pip_helper.extract_requirement(line))
self.assertEquivalentRequirements(expected_normalized,
parsed_normalized)
if 'conflicts' in example:
self.assertEqual(example['conflicts'],
self._extract_conflicts(stderr))
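
The _extract_conflicts() parser above keys off three sentinel line shapes; a hypothetical stderr blob in that shape (invented for illustration, not verbatim multipip output) shows the mapping it produces:

    stderr = ("foo: incompatible requirements\n"
              "Conflicting requirements:\n"
              "\tfile1: foo>=1.0\n"
              "\tfile2: foo<1.0\n"
              "Choosing foo>=1.0\n")
    # self._extract_conflicts(stderr) would return:
    #   {'foo': ['foo>=1.0', 'foo<1.0']}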

View File

@ -1,28 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2013 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from anvil import test
from anvil import utils
class TestUtils(test.TestCase):
def test_expand(self):
text = "blah $v"
text = utils.expand_template(text, {
'v': 'blah',
})
self.assertEqual(text, "blah blah")

View File

@ -1,195 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import json
from anvil import exceptions as excp
from anvil import shell as sh
# Common trace actions
AP_STARTED = "AP_STARTED"
CFG_WRITING_FILE = "CFG_WRITING_FILE"
DIR_MADE = "DIR_MADE"
DOWNLOADED = "DOWNLOADED"
FILE_TOUCHED = "FILE_TOUCHED"
PKG_INSTALL = "PKG_INSTALL"
PKG_UPGRADE = "PKG_UPGRADE"
SYMLINK_MAKE = "SYMLINK_MAKE"
def trace_filename(root_dir, base_name):
return sh.joinpths(root_dir, "%s.trace" % (base_name))
class TraceWriter(object):
def __init__(self, trace_fn, break_if_there=True):
self.trace_fn = trace_fn
self.started = False
self.break_if_there = break_if_there
def trace(self, cmd, action=None):
if action is None:
action = ''
if cmd is not None:
sh.append_file(self.trace_fn, "%s - %s\n" % (cmd, action))
def filename(self):
return self.trace_fn
def _start(self):
if self.started:
return
else:
trace_dirs = sh.mkdirslist(sh.dirname(self.trace_fn))
sh.touch_file(self.trace_fn, die_if_there=self.break_if_there)
self.started = True
self.dirs_made(*trace_dirs)
def symlink_made(self, link):
self._start()
self.trace(SYMLINK_MAKE, link)
def download_happened(self, tgt, uri):
self._start()
what = dict()
what['target'] = tgt
what['from'] = uri
self.trace(DOWNLOADED, json.dumps(what))
def dirs_made(self, *dirs):
self._start()
for d in dirs:
self.trace(DIR_MADE, d)
def file_touched(self, fn):
self._start()
self.trace(FILE_TOUCHED, fn)
def package_installed(self, pkg_name):
self._start()
self.trace(PKG_INSTALL, pkg_name)
def package_upgraded(self, pkg_name):
self._start()
self.trace(PKG_UPGRADE, pkg_name)
def app_started(self, name, info_fn, how):
self._start()
data = dict()
data['name'] = name
data['trace_fn'] = info_fn
data['how'] = how
self.trace(AP_STARTED, json.dumps(data))
class TraceReader(object):
def __init__(self, trace_fn):
self.trace_fn = trace_fn
self.contents = None
def filename(self):
return self.trace_fn
def _parse(self):
fn = self.trace_fn
if not sh.isfile(fn):
msg = "No trace found at filename %s" % (fn)
raise excp.NoTraceException(msg)
contents = sh.load_file(fn)
lines = contents.splitlines()
accum = list()
for line in lines:
ep = self._split_line(line)
if ep is None:
continue
accum.append(tuple(ep))
return accum
def read(self):
if self.contents is None:
self.contents = self._parse()
return self.contents
def _split_line(self, line):
pieces = line.split("-", 1)
if len(pieces) == 2:
cmd = pieces[0].rstrip()
action = pieces[1].lstrip()
return (cmd, action)
else:
return None
def exists(self):
return sh.exists(self.trace_fn)
def apps_started(self):
lines = self.read()
apps = list()
for (cmd, action) in lines:
if cmd == AP_STARTED and len(action):
entry = json.loads(action)
if type(entry) is dict:
apps.append((entry.get('name'), entry.get('trace_fn'), entry.get('how')))
return apps
def download_locations(self):
lines = self.read()
locations = list()
for (cmd, action) in lines:
if cmd == DOWNLOADED and len(action):
entry = json.loads(action)
if type(entry) is dict:
locations.append((entry.get('target'), entry.get('from')))  # 'from' is the key written by download_happened
return locations
def _sort_paths(self, pths):
# Ensure in correct order (i.e. /tmp is before /)
pths = list(set(pths))
pths.sort()
pths.reverse()
return pths
def files_touched(self):
lines = self.read()
files = list()
for (cmd, action) in lines:
if cmd == FILE_TOUCHED and len(action):
files.append(action)
return self._sort_paths(files)
def dirs_made(self):
lines = self.read()
dirs = list()
for (cmd, action) in lines:
if cmd == DIR_MADE and len(action):
dirs.append(action)
return self._sort_paths(dirs)
def symlinks_made(self):
lines = self.read()
links = list()
for (cmd, action) in lines:
if cmd == SYMLINK_MAKE and len(action):
links.append(action)
return links
def packages_installed(self):
lines = self.read()
pkg_list = list()
for (cmd, action) in lines:
if cmd == PKG_INSTALL and len(action):
pkg_list.append(action)
return pkg_list
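
A round trip through the trace format (each line is "CMD - action", with JSON payloads for structured actions) looks roughly like the following sketch; the paths are invented and the module is assumed importable as anvil.trace:

    from anvil import trace

    fn = trace.trace_filename("/tmp/traces", "install")  # /tmp/traces/install.trace
    writer = trace.TraceWriter(fn, break_if_there=False)
    writer.package_installed("openstack-nova-api")
    writer.dirs_made("/etc/nova")

    reader = trace.TraceReader(fn)
    print(reader.packages_installed())  # ['openstack-nova-api']
    print(reader.dirs_made())           # includes '/etc/nova' plus any
                                        # directories made for the trace itself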

View File

@ -1,39 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import types
def make_bool(val):
if isinstance(val, bool):
return val
if isinstance(val, types.NoneType):
return False
sval = str(val).lower().strip()
if sval in ['true', '1', 'on', 'yes', 't']:
return True
if sval in ['0', 'false', 'off', 'no', 'f', '', 'none']:
return False
raise TypeError("Unable to convert %r to a boolean" % (val))
def obj_name(obj):
if isinstance(obj, (types.TypeType,
types.ModuleType,
types.FunctionType,
types.LambdaType)):
return str(obj.__name__)
return obj_name(obj.__class__)
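
Both helpers in one sketch (return values follow directly from the code above):

    make_bool("Yes")     # -> True
    make_bool("off")     # -> False
    make_bool(None)      # -> False
    make_bool("maybe")   # raises TypeError
    obj_name(make_bool)  # -> 'make_bool'
    obj_name(42)         # -> 'int' (falls through to the instance's class)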

View File

@ -1,678 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Copyright 2011 OpenStack LLC.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import binascii
import collections
import contextlib
import glob
import inspect
import json
import os
import random
import re
import socket
import sys
import tempfile
import time
try:
# Only in python 2.7+
from collections import OrderedDict
except ImportError:
from ordereddict import OrderedDict
from datetime import datetime
import netifaces
import progressbar
import six
import yaml
from Cheetah.Template import Template
from anvil import colorizer
from anvil import log as logging
from anvil import pprint
from anvil import settings
from anvil import shell as sh
from anvil import version
from anvil.pprint import center_text
MONTY_PYTHON_TEXT_RE = re.compile(r"([a-z0-9A-Z\?!.,'\"]+)")
# Thx cowsay
# See: http://www.nog.net/~tony/warez/cowsay.shtml
COWS = dict()
COWS['happy'] = r'''
{header}
\ {ear}__{ear}
\ ({eye}{eye})\_______
(__)\ )\/\
||----w |
|| ||
'''
COWS['unhappy'] = r'''
{header}
\ || ||
\ __ ||-----mm||
\ ( )/_________)//
({eye}{eye})/
{ear}--{ear}
'''
LOG = logging.getLogger(__name__)
class Group(list):
def __init__(self, id):
super(Group, self).__init__()
self.id = id
class ExponentialBackoff(object):
def __init__(self, attempts=5, start=1.3):
self.start = start
self.attempts = attempts
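# Iterating yields `attempts` values, squaring at each step:
# start, start**2, start**4, ...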
def __iter__(self):
value = self.start
if self.attempts <= 0:
raise StopIteration()
yield value
for _i in xrange(0, self.attempts - 1):
value = value * value
yield value
def __str__(self):
vals = [str(v) for v in self]
return "Backoff %s" % (vals)
def get_callback_name(cb):
"""Tries to get a callbacks fully-qualified name.
If no name can be produced ``repr(cb)`` is called and returned.
"""
segments = []
try:
segments.append(cb.__qualname__)
except AttributeError:
try:
segments.append(cb.__name__)
if inspect.ismethod(cb):
try:
# This attribute doesn't exist on py3.x or newer, so
# we optionally ignore it... (on those versions of
# python `__qualname__` should have been found anyway).
segments.insert(0, cb.im_class.__name__)
except AttributeError:
pass
except AttributeError:
pass
if not segments:
return repr(cb)
else:
try:
# When running under sphinx it appears this can be None?
if cb.__module__:
segments.insert(0, cb.__module__)
except AttributeError:
pass
return ".".join(segments)
def expand_template(contents, params):
if not params:
params = {}
tpl = Template(source=str(contents),
searchList=[params],
compilerSettings={
'useErrorCatcher': True})
return tpl.respond()
def expand_template_deep(root, params):
if isinstance(root, (basestring, str)):
return expand_template(root, params)
if isinstance(root, (list, tuple)):
n_list = []
for i in root:
n_list.append(expand_template_deep(i, params))
return n_list
if isinstance(root, (dict)):
n_dict = {}
for (k, v) in root.items():
n_dict[k] = expand_template_deep(v, params)
return n_dict
if isinstance(root, (set)):
n_set = set()
for v in root:
n_set.add(expand_template_deep(v, params))
return n_set
return root
def get_random_string(length):
"""Get a random hex string of the specified length."""
if length <= 0:
return ''
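# Python 2 integer division: (length + 1) / 2 rounds up, so hexlify()
# always produces at least `length` hex characters before the slice.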
return binascii.hexlify(os.urandom((length + 1) / 2))[:length]
def parse_json(text):
"""Load JSON from string
If string is whitespace-only, returns None
"""
text = text.strip()
if len(text):
return json.loads(text)
else:
return None
def group_builds(components):
if not components:
return []
stages = collections.defaultdict(list)
for c in components:
if isinstance(c, six.string_types):
stages[0].append(c)
elif isinstance(c, dict):
for project_name, stage_id in six.iteritems(c):
stage_id = int(stage_id)
stages[stage_id].append(project_name)
else:
raise TypeError("Unexpected group type %s" % type(c))
groupings = []
for i in sorted(six.iterkeys(stages)):
stage = Group(i)
stage.extend(stages[i])
groupings.append(stage)
return groupings
def load_yaml(path):
return load_yaml_text(sh.load_file(path))
def load_yaml_text(text):
return yaml.safe_load(text)
def has_any(text, *look_for):
if not look_for:
return False
for v in look_for:
if text.find(v) != -1:
return True
return False
def retry(attempts, delay, func, *args, **kwargs):
if delay < 0:
raise ValueError("delay must be >= 0")
if attempts < 1:
raise ValueError("attempts must be >= 1")
func_name = get_callback_name(func)
failures = []
retryable_exceptions = kwargs.pop('retryable_exceptions', [Exception])
retryable_exceptions = tuple(retryable_exceptions)
max_attempts = int(attempts) + 1
for attempt in range(1, max_attempts):
LOG.debug("Attempt %s for calling '%s'", attempt, func_name)
kwargs['attempt'] = attempt
try:
return func(*args, **kwargs)
except retryable_exceptions:
failures.append(sys.exc_info())
LOG.exception("Calling '%s' failed (retryable)", func_name)
if attempt < max_attempts and delay > 0:
LOG.info("Waiting %s seconds before calling '%s' again",
delay, func_name)
sh.sleep(delay)
except BaseException:
failures.append(sys.exc_info())
LOG.exception("Calling '%s' failed (not retryable)", func_name)
break
exc_type, exc, exc_tb = failures[-1]
six.reraise(exc_type, exc, exc_tb)
def add_header(fn, contents, adjusted=True):
lines = []
if not fn:
fn = "???"
if adjusted:
lines.append('# Adjusted source file %s' % (fn.strip()))
else:
lines.append('# Created source file %s' % (fn.strip()))
lines.append("# On %s" % (iso8601()))
lines.append("# By user %s, group %s" % (sh.getuser(), sh.getgroupname()))
lines.append("")
if contents:
lines.append(contents)
return joinlinesep(*lines)
def iso8601():
return datetime.now().isoformat()
def recursive_merge(a, b):
# pylint: disable=C0103
def _merge_lists(a, b):
merged = []
merged.extend(a)
merged.extend(b)
return merged
def _merge_dicts(a, b):
merged = {}
for k in six.iterkeys(a):
if k in b:
merged[k] = recursive_merge(a[k], b[k])
else:
merged[k] = a[k]
for k in six.iterkeys(b):
if k in merged:
continue
merged[k] = b[k]
return merged
def _merge_text(a, b):
return b
def _merge_int(a, b):
return b
def _merge_float(a, b):
return b
def _merge_bool(a, b):
return b
mergers = [
(list, list, _merge_lists),
(list, tuple, _merge_lists),
(tuple, tuple, _merge_lists),
(tuple, list, _merge_lists),
(dict, dict, _merge_dicts),
(six.string_types, six.string_types, _merge_text),
(int, int, _merge_int),
(bool, bool, _merge_bool),
(float, float, _merge_float),
]
merger = None
for (a_type, b_type, func) in mergers:
if isinstance(a, a_type) and isinstance(b, b_type):
merger = func
break
if not merger:
raise TypeError("Unknown how to merge '%s' with '%s'" % (type(a), type(b)))
return merger(a, b)
def merge_dicts(*dicts, **kwargs):
merged = OrderedDict()
for mp in dicts:
for (k, v) in mp.items():
if kwargs.get('preserve') and k in merged:
continue
else:
merged[k] = v
return merged
def get_deep(items, path, quiet=True):
if len(path) == 0:
return items
head = path[0]
remainder = path[1:]
if isinstance(items, (list, tuple)):
index = int(head)
if quiet and not (index < len(items) and index >= 0):
return None
else:
return get_deep(items[index], remainder)
else:
get_method = getattr(items, 'get', None)
if not get_method:
if not quiet:
raise RuntimeError("Can not figure out how to extract an item from %s" % (items))
else:
return None
else:
return get_deep(get_method(head), remainder)
def load_template(component, template_name):
path = sh.joinpths(settings.TEMPLATE_DIR, component, template_name)
return (path, sh.load_file(path))
def execute_template(cmd, *cmds, **kargs):
params = kargs.pop('params', None) or {}
results = []
for info in [cmd] + list(cmds):
run_what_tpl = info["cmd"]
if not isinstance(run_what_tpl, (list, tuple, set)):
run_what_tpl = [run_what_tpl]
run_what = [expand_template(c, params) for c in run_what_tpl]
stdin = None
stdin_tpl = info.get('stdin')
if stdin_tpl:
if not isinstance(stdin_tpl, (list, tuple, set)):
stdin_tpl = [stdin_tpl]
stdin = [expand_template(c, params) for c in stdin_tpl]
stdin = "\n".join(stdin)
result = sh.execute(run_what,
process_input=stdin,
check_exit_code=not info.get(
'ignore_failure', False),
**kargs)
results.append(result)
return results
def to_bytes(text):
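# Converts human-readable sizes, e.g. "2G" -> 2 * 1024 ** 3,
# "512K" -> 512 * 1024, "10" -> 10 (plain bytes).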
byte_val = 0
if not text:
return byte_val
if text[-1].upper() == 'G':
byte_val = int(text[:-1]) * 1024 ** 3
elif text[-1].upper() == 'M':
byte_val = int(text[:-1]) * 1024 ** 2
elif text[-1].upper() == 'K':
byte_val = int(text[:-1]) * 1024
elif text[-1].upper() == 'B':
byte_val = int(text[:-1])
else:
byte_val = int(text)
return byte_val
def truncate_text(text, max_len, from_bottom=False):
if len(text) < max_len:
return text
if not from_bottom:
return (text[0:max_len] + "...")
else:
text = text[::-1]
text = truncate_text(text, max_len)
text = text[::-1]
return text
def log_object(to_log, logger=None, level=logging.INFO, item_max_len=64):
if not to_log:
return
if not logger:
logger = LOG
content = pprint.pformat(to_log, item_max_len)
for line in content.splitlines():
logger.log(level, line)
def log_iterable(to_log, header=None, logger=None, color='blue'):
if not logger:
logger = LOG
if not to_log:
if not header:
return
if header.endswith(":"):
header = header[0:-1]
if not header.endswith("."):
header = header + "."
logger.info(header)
return
if header:
if not header.endswith(":"):
header += ":"
logger.info(header)
for c in to_log:
if color:
c = colorizer.color(c, color)
logger.info("|-- %s", c)
@contextlib.contextmanager
def progress_bar(name, max_am, reverse=False):
widgets = [
'%s: ' % (name),
progressbar.Percentage(),
' ',
]
if reverse:
widgets.append(progressbar.ReverseBar())
else:
widgets.append(progressbar.Bar())
widgets.append(' ')
widgets.append(progressbar.ETA())
p_bar = progressbar.ProgressBar(maxval=max_am, widgets=widgets)
p_bar.start()
try:
yield p_bar
finally:
p_bar.finish()
@contextlib.contextmanager
def tempdir(**kwargs):
# A tempdir context manager like this was only added in python 3.2,
# so make an equivalent here since it's useful...
# See: http://bugs.python.org/file12970/tempdir.patch
tdir = tempfile.mkdtemp(**kwargs)
try:
yield tdir
finally:
sh.deldir(tdir)
def get_host_ip(default_ip='127.0.0.1'):
"""Returns the actual ip of the local machine.
This code figures out what source address would be used if some traffic
were to be sent out to some well known address on the Internet. In this
case, a private address is used, but the specific address does not
matter much. No traffic is actually sent.
Adjusted from nova code...
"""
ip = None
try:
csock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
csock.connect(('8.8.8.8', 80))
with contextlib.closing(csock) as s:
(addr, _) = s.getsockname()
if addr:
ip = addr
except socket.error:
pass
# Attempt to find the first ipv4 with an addr
# and use that as the address
if not ip:
interfaces = get_interfaces()
for (_, net_info) in interfaces.items():
ip_info = net_info.get('IPv4')
if ip_info:
a_ip = ip_info.get('addr')
if a_ip:
ip = a_ip
break
# Just return a default version then
if not ip:
ip = default_ip
return ip
@contextlib.contextmanager
def chdir(where_to):
curr_dir = os.getcwd()
if curr_dir == where_to:
yield where_to
else:
try:
os.chdir(where_to)
yield where_to
finally:
os.chdir(curr_dir)
def get_interfaces():
interfaces = OrderedDict()
for intfc in netifaces.interfaces():
interface_info = {}
interface_addresses = netifaces.ifaddresses(intfc)
ip6 = interface_addresses.get(netifaces.AF_INET6)
if ip6:
# Just take the first
interface_info['IPv6'] = ip6[0]
ip4 = interface_addresses.get(netifaces.AF_INET)
if ip4:
# Just take the first
interface_info['IPv4'] = ip4[0]
# Note: there are others but this is good for now..
interfaces[intfc] = interface_info
return interfaces
def format_time(secs):
return {
'seconds': "%.03f" % (secs),
"minutes": "%.02f" % (secs / 60.0),
}
def time_it(on_finish, func, *args, **kwargs):
start_time = time.time()
result = func(*args, **kwargs)
end_time = time.time()
on_finish(max(0, end_time - start_time))
return result
def joinlinesep(*pieces):
return os.linesep.join(pieces)
def prettify_yaml(obj):
formatted = yaml.safe_dump(obj,
line_break="\n",
indent=4,
explicit_start=True,
explicit_end=True,
default_flow_style=False)
return formatted
def _pick_message(pattern, def_message="This page is intentionally left blank."):
if not pattern:
return def_message
expanded_pattern = sh.joinpths(settings.MESSAGING_DIR, pattern)
file_matches = glob.glob(expanded_pattern)
file_matches = [f for f in file_matches if sh.isfile(f)]
try:
file_selected = random.choice(file_matches)
with open(file_selected, 'r') as fh:
contents = fh.read()
contents = contents.strip("\n\r")
if not contents:
contents = def_message
return contents
except (IndexError, IOError):
return def_message
def _get_welcome_stack():
return _pick_message("stacks.*")
def _welcome_slang():
return _pick_message("welcome.*")
def _color_blob(text, text_color):
def replacer(match):
contents = match.group(1)
return colorizer.color(contents, text_color)
return MONTY_PYTHON_TEXT_RE.sub(replacer, text)
def _goodbye_header(worked):
msg = _pick_message("success.*")
apply_color = 'green'
if not worked:
msg = _pick_message("fails.*")
apply_color = 'red'
return _color_blob(msg, apply_color)
def goodbye(worked):
cow = COWS['happy']
eye_fmt = colorizer.color('o', 'green')
ear = colorizer.color("^", 'green')
if not worked:
cow = COWS['unhappy']
eye_fmt = colorizer.color("o", 'red')
ear = colorizer.color("v", 'red')
cow = cow.strip("\n\r")
header = _goodbye_header(worked)
msg = cow.format(eye=eye_fmt, ear=ear, header=header)
print(msg)
def welcome(prog_name='Anvil', version_text=version.version_string()):
lower = "| %s |" % (version_text)
welcome_header = _get_welcome_stack()
max_line_len = len(max(welcome_header.splitlines(), key=len))
footer = colorizer.color(prog_name, 'green') + ": " + colorizer.color(lower, 'blue', bold=True)
uncolored_footer = prog_name + ": " + lower
if max_line_len - len(uncolored_footer) > 0:
# This format string will center the uncolored text which
# we will then replace with the color text equivalent.
centered_str = center_text(uncolored_footer, " ", max_line_len)
footer = centered_str.replace(uncolored_footer, footer)
print(welcome_header)
print(footer)
real_max = max(max_line_len, len(uncolored_footer))
slang = center_text(_welcome_slang(), ' ', real_max)
print(colorizer.color(slang, 'magenta', bold=True))
return ("-", real_max)
def splitlines_not_empty(text):
for line in text.splitlines():
line = line.strip()
if line:
yield line
def strip_prefix_suffix(line, prefix=None, suffix=None):
if prefix and line.startswith(prefix):
line = line[len(prefix):]
if suffix and line.endswith(suffix):
line = line[:-len(suffix)]
return line
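
Of the helpers above, retry() has the least obvious calling convention: the wrapped function receives an injected `attempt` keyword (starting at 1), and the retryable exception types are supplied through kwargs; a sketch with invented callables:

    from anvil import utils

    def fetch(url, attempt):
        print("attempt %s for %s" % (attempt, url))
        return download(url)  # hypothetical callable that may raise IOError

    # Try up to 3 times, sleeping 2 seconds between retryable failures.
    result = utils.retry(3, 2.0, fetch, "http://example.com",
                         retryable_exceptions=[IOError])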

View File

@ -1,30 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright (C) 2012 Yahoo! Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
ANVIL_VERSION = ['2015', None, None]
YEAR, COUNT, REVISION = ANVIL_VERSION
FINAL = False # May never be final ;)
def canonical_version_string():
return '.'.join(filter(None, ANVIL_VERSION))
def version_string():
if FINAL:
return canonical_version_string()
else:
return '%s-dev' % (canonical_version_string())
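
With the values above this resolves to:

    canonical_version_string()  # -> '2015'     (None entries filtered out)
    version_string()            # -> '2015-dev' (since FINAL is False)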

View File

@ -1,4 +0,0 @@
# Settings for component ceilometer-client
---
...

View File

@ -1,3 +0,0 @@
# Settings for component Ceilometer
---
...

View File

@ -1,3 +0,0 @@
# Settings for component cinder-client
---
...

View File

@ -1,8 +0,0 @@
# Settings for component cinder
---
# Used for associating the client package with a human understandable
# name in its package description (not a code-name, like cinder).
api_name: "Volume"
...

View File

@ -1,4 +0,0 @@
# Settings for component django-openstack-auth
---
...

View File

@ -1,6 +0,0 @@
# Settings for component general
---
ip: "$(auto:ip)"
...

View File

@ -1,4 +0,0 @@
# Settings for component glance-client
---
...

View File

@ -1,11 +0,0 @@
# Settings for component glance
---
# Used by install section in the specfile (conflicts with the client binary...)
remove_file: "/bin/rm -rf %{buildroot}/usr/bin/glance"
# Used for associating the client package with a human understandable
# name in its package description (not a code-name, like glance).
api_name: "Image"
...

View File

@ -1,4 +0,0 @@
# Settings for component global_requirements
---
...

View File

@ -1,4 +0,0 @@
# Settings for component heat-client
---
...

View File

@ -1,3 +0,0 @@
# Settings for component Heat
---
...

View File

@ -1,7 +0,0 @@
# Settings for component horizon
---
# Instead of naming this component's package horizon, it will be named the following:
build_name: "python-django-horizon"
...

View File

@ -1,4 +0,0 @@
# Settings for component ironic-client
---
...

View File

@ -1,3 +0,0 @@
# Settings for component ironic
---
...

View File

@ -1,4 +0,0 @@
# Settings for component keystone-client
---
...

View File

@ -1,8 +0,0 @@
# Settings for component keystone
---
# Used for associating the client package with a human understandable
# name in its package description (not a code-name, like keystone).
api_name: "Identity"
...

View File

@ -1,4 +0,0 @@
# Settings for component magnum-client
---
...

View File

@ -1,8 +0,0 @@
# Settings for component magnum
---
daemon_args:
magnum-api: "/usr/bin/magnum-api --config-file /etc/magnum/magnum.conf --log-dir=/var/log/magnum"
magnum-conductor: "/usr/bin/magnum-conductor --config-file /etc/magnum/magnum.conf --log-dir=/var/log/magnum"
...

View File

@ -1,4 +0,0 @@
# Settings for component neutron-client
---
...

View File

@ -1,4 +0,0 @@
# Settings for component neutron-fwaas
---
...

View File

@ -1,4 +0,0 @@
# Settings for component neutron-lbaas
---
...

View File

@ -1,4 +0,0 @@
# Settings for component neutron-vpnaas
---
...

View File

@ -1,22 +0,0 @@
# Settings for component neutron
---
# Used for associating the client package with a human understandable
# name in its package description (not a code-name, like neutron).
api_name: "Networking"
core_plugin: openvswitch
use_namespaces: True
# When building a package for neutron, the arguments to the individual daemons
# will be expanded to include the following runtime arguments.
daemon_args:
neutron-server: "--config-file=/etc/neutron/plugin.ini --config-file=/etc/neutron/neutron.conf"
neutron-l3-agent: "--config-file=/etc/neutron/l3_agent.ini --config-file=/etc/neutron/neutron.conf"
neutron-dhcp-agent: "--config-file=/etc/neutron/dhcp_agent.ini --config-file=/etc/neutron/neutron.conf"
neutron-metadata-agent: "--config-file=/etc/neutron/metadata_agent.ini --config-file=/etc/neutron/neutron.conf"
neutron-openvswitch-agent: "--config-file=/etc/neutron/plugins/openvswitch/ovs_neutron_plugin.ini --config-file=/etc/neutron/neutron.conf"
killmode:
neutron-server: "process"
...

View File

@ -1,4 +0,0 @@
# Settings for component nova-client
---
...

View File

@ -1,8 +0,0 @@
# Settings for component nova
---
# Used for associating the client package with a human understandable
# name in its package description (not a code-name, like nova).
api_name: "Compute"
...

View File

@ -1,4 +0,0 @@
# Settings for component novnc
---
...

View File

@ -1,4 +0,0 @@
# Settings for component openstack-client
---
...

View File

@ -1,4 +0,0 @@
# Settings for component oslo.config
---
...

View File

@ -1,4 +0,0 @@
# Settings for component oslo.incubator
---
...

View File

@ -1,4 +0,0 @@
# Settings for component oslo.messaging
---
...

View File

@ -1,4 +0,0 @@
# Settings for component pycadf
---
...

View File

@ -1,4 +0,0 @@
# Settings for component swift-client
---
...

View File

@ -1,4 +0,0 @@
# Settings for component trove-client
---
...

View File

@ -1,4 +0,0 @@
# Settings for component trove
---
...

View File

@ -1 +0,0 @@
redhat.yaml

View File

@ -1 +0,0 @@
redhat.yaml

View File

@ -1,235 +0,0 @@
---
# RedHat distros or variants (centos, rhel, fedora)...
platform_pattern: redhat(.*)|centos(.*)|fedora(.*)
install_helper: anvil.packaging.yum:YumInstallHelper
dependency_handler:
name: anvil.packaging.yum:YumDependencyHandler
epoch_map:
flask: 2
package_map:
distribute: python-setuptools
django: Django
IPy: python-IPy
libvirt-python: libvirt-python
m2crypto: m2crypto
pyScss: python-scss
mysql-python: MySQL-python
numpy: numpy
pam: python-pam
pastedeploy: python-paste-deploy
pycrypto: python-crypto
pyflakes: pyflakes
pylint: pylint
pyopenssl: pyOpenSSL
pyparsing: pyparsing
pysendfile: pysendfile
pytz: pytz
sqlalchemy-migrate: python-migrate
qpid-python: python-qpid # Why is this one backwards :-/
PyYAML: PyYAML
pyzmq: python-zmq
pyscss: python-scss
ignoreable_pips:
- distribute # distribute has been replaced by setuptools
- trollius # this project is dead/deprecated...
arch_dependent:
- PuLP
- cryptography
- selenium
- xattr
- numpy
ensure_prebuilt:
# For some reason we can't even touch testtools until the following
# exists/is-built and we can't seem to get this info from the egg_info
# extraction process...
testtools:
- "unittest2>=0.8.0"
build_options:
pylint:
- "%global _python_bytecompile_errors_terminate_build 0"
tablib:
- "%global _python_bytecompile_errors_terminate_build 0"
linecache2:
- "%global _python_bytecompile_errors_terminate_build 0"
components:
ceilometer-client:
python_entrypoints: True
cinder:
python_entrypoints: True
daemon_to_package:
all: openstack-cinder
volume: openstack-cinder
scheduler: openstack-cinder
api: openstack-cinder
pips:
- name: hp3parclient
cinder-client:
python_entrypoints: True
general:
build-requires:
# Build time dependencies
- name: libxml2-devel
removable: false
- name: libxslt-devel
removable: false
- name: mysql-devel
removable: false
- name: postgresql-devel
removable: false
- name: openldap-devel
removable: false
- name: psmisc
removable: false
- name: sudo
removable: false
- name: tcpdump
removable: false
- name: unzip
removable: false
- name: wget
removable: false
# Shared python packages
- name: python
removable: false
- name: python-devel
removable: false
- name: python-distutils-extra
removable: false
- name: python-setuptools
removable: false
- name: sqlite-devel
removable: false
requires:
- name: MySQL-python
# Require extra packages needed to run tests
pips:
- name: "nose"
version: ">=1.3.0"
- name: "coverage"
glance:
python_entrypoints: True
pips:
# pip setup and download of xattr>=0.7 seems to have problems finding cffi
# so let's just restrict the upper bound until this is fixed upstream
# see: https://github.com/xattr/xattr/issues/16
- name: xattr
version: ">=0.6.0,<0.7"
daemon_to_package:
api: openstack-glance
registry: openstack-glance
scrubber: openstack-glance
glance-client:
python_entrypoints: True
heat-client:
python_entrypoints: True
horizon:
python_entrypoints: True
packages:
- name: openstack-dashboard
pips:
- name: pyScss
django-openstack-auth:
python_entrypoints: True
keystone:
python_entrypoints: True
daemon_to_package:
all: openstack-keystone
keystone-client:
python_entrypoints: True
neutron-client:
python_entrypoints: True
magnum:
python_entrypoints: True
magnum-client:
python_entrypoints: True
nova:
python_entrypoints: True
pips:
# This seems to be a core dependency for a 'cas' tool,
# so don't try to remove it, since doing so would also remove
# said 'cas' tool. Unfortunately the version of paramiko
# installed in rhel uses an old version of crypto which
# other components actually can't use. This sucks...
- name: paramiko
test_requires:
# NOTE(imelnikov): nova testcases require importlib, which was not part
# of python standard library as of python 2.6.
- importlib
daemon_to_package:
api: openstack-nova-api
conductor: openstack-nova-conductor
consoleauth: openstack-nova-console
dhcpbridge: openstack-nova-network
network: openstack-nova-network
novncproxy: openstack-nova-novncproxy
scheduler: openstack-nova-scheduler
spicehtml5proxy: openstack-nova-console
xvpvncproxy: openstack-nova-console
serialproxy: openstack-nova-serialproxy
nova-client:
python_entrypoints: True
novnc:
python_entrypoints: True
openstack-client:
python_entrypoints: True
oslo-config:
python_entrypoints: True
oslo-incubator:
python_entrypoints: True
pycadf:
python_entrypoints: True
oslo-messaging:
python_entrypoints: True
neutron:
python_entrypoints: True
daemon_to_package:
dhcp-agent: openstack-neutron
l3-agent: openstack-neutron
linuxbridge-agent: openstack-neutron-linuxbridge
metadata-agent: openstack-neutron
openvswitch-agent: openstack-neutron-openvswitch
ovs-cleanup: openstack-neutron-openvswitch
rpc-zmq-receiver: openstack-neutron
server: openstack-neutron
sriov-nic-agent: openstack-neutron-sriov-nic-agent
mlnx-agent: openstack-neutron-mlnx
metering-agent: openstack-neutron-metering-agent
cisco-cfg-agent: openstack-neutron-cisco
netns-cleanup: openstack-neutron
neutron-lbaas:
python_entrypoints: True
neutron-fwaas:
python_entrypoints: True
neutron-vpnaas:
python_entrypoints: True
swift-client:
python_entrypoints: True
trove:
python_entrypoints: True
trove-client:
python_entrypoints: True
heat:
python_entrypoints: True
daemon_to_package:
api: openstack-heat-api
api-cfn: openstack-heat-api-cfn
api-cloudwatch: openstack-heat-api-cloudwatch
engine: openstack-heat-engine
global-requirements:
python_entrypoints: True
ceilometer:
python_entrypoints: True
daemon_to_package:
api: openstack-ceilometer-api
central: openstack-ceilometer-central
collector: openstack-ceilometer-collector
compute: openstack-ceilometer-compute
ipmi: openstack-ceilometer-ipmi
ironic:
python_entrypoints: True
daemon_to_package:
api: openstack-ironic-api
conductor: openstack-ironic-conductor
ironic-client:
python_entrypoints: True
...

View File

@ -1 +0,0 @@
redhat.yaml

View File

@ -1,68 +0,0 @@
---
# Ubuntu distros or variants...
platform_pattern: ubuntu(.*)
install_helper: anvil.packaging.venv:VenvInstallHelper
dependency_handler:
name: anvil.packaging.venv:VenvDependencyHandler
components:
general:
python_entrypoints: False
ceilometer-client:
python_entrypoints: True
cinder:
python_entrypoints: True
cinder-client:
python_entrypoints: True
glance:
python_entrypoints: True
glance-client:
python_entrypoints: True
heat:
python_entrypoints: True
heat-client:
python_entrypoints: True
horizon:
python_entrypoints: True
django-openstack-auth:
python_entrypoints: True
keystone:
python_entrypoints: True
keystone-client:
python_entrypoints: True
neutron-client:
python_entrypoints: True
nova:
python_entrypoints: True
nova-client:
python_entrypoints: True
novnc:
python_entrypoints: True
openstack-client:
python_entrypoints: True
oslo-config:
python_entrypoints: True
oslo-incubator:
python_entrypoints: True
pycadf:
python_entrypoints: True
oslo-messaging:
python_entrypoints: True
neutron:
python_entrypoints: True
swift-client:
python_entrypoints: True
trove:
python_entrypoints: True
trove-client:
python_entrypoints: True
heat:
python_entrypoints: True
global-requirements:
python_entrypoints: True
ceilometer:
python_entrypoints: True
ironic:
python_entrypoints: True
ironic-client:
python_entrypoints: True
...

View File

@ -1,12 +0,0 @@
[
{
"path": "/install_helper",
"value": "anvil.packaging.venv:VenvInstallHelper",
"op": "replace"
},
{
"path": "/dependency_handler",
"value": { "name": "anvil.packaging.venv:VenvDependencyHandler" },
"op": "replace"
}
]

View File

@ -1,4 +0,0 @@
__________
< Failure! >
----------

View File

@ -1,6 +0,0 @@
_____________________
/ We were in the nick \
| of time. You were |
\ in great peril. /
---------------------

View File

@ -1,8 +0,0 @@
___________________
/ I know a dead \
| parrot when I see |
| one, and I'm |
| looking at one |
\ right now. /
-------------------

View File

@ -1,6 +0,0 @@
_________________
/ Welcome to the \
| National Cheese |
\ Emporium /
-----------------

Some files were not shown because too many files have changed in this diff.