Basic packaging and preparation for Jenkins

Hard-coding the path to the interpreter causes problems when running
inside a virtualenv: for example, if pyyaml is installed inside the
virtualenv but not in the system Python, the script fails with an import
error instead of picking up the virtualenv's copy of pyyaml.
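
For illustration only (not part of this change), a minimal sketch of why the
"env" form behaves better: run from an activated virtualenv, the import below
succeeds because the interpreter is resolved via $PATH::

    #!/usr/bin/env python
    # "env" resolves the interpreter from $PATH, so inside an activated
    # virtualenv this runs the virtualenv's python and sees its
    # site-packages (including pyyaml); a hard-coded /usr/bin/python
    # would only see the system packages.
    import sys

    import yaml  # installed in the virtualenv, not the system python

    print(sys.executable)  # e.g. .../venv/bin/python
    print(yaml.__file__)   # resolves to the virtualenv's copy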

Turn this into a package.

* Update .gitignore to ignore virtualenv cruft
* Add basic requirements
* Basic packaging.
* README has been tweaked to reflect that this is now a more generic tool.
* Further packaging work.
* Add tox.ini and test-requirements.txt.
  This enables testing of patches pushed up to StackForge; for now we only
  run basic pep8 checks.

Change-Id: I9897a4da863dbf61700812cb436bd1d46e92b950
Author: James Polley, 2014-08-20 15:00:59 +10:00
Parent: b6bd9d0b4c
Commit: 0c208d62c3
23 changed files with 376 additions and 222 deletions

5
.gitignore vendored

@@ -1,3 +1,6 @@
.*.swp
*~
repo_refs.yaml.variables
.Python
bin/
include/
lib/

5
AUTHORS Normal file

@@ -0,0 +1,5 @@
Adam Gandelman <adamg@ubuntu.com>
Ghe Rivero <ghe@debian.org>
James Polley <jp@jamezpolley.com>
Robert Collins <rbtcollins@hp.com>
rbtcollins <robertc@robertcollins.net>

17
CONTRIBUTING.rst Normal file

@@ -0,0 +1,17 @@
If you would like to contribute to the development of OpenStack,
you must follow the steps in the "If you're a developer, start here"
section of this page:
http://wiki.openstack.org/HowToContribute
Once those steps have been completed, changes to OpenStack
should be submitted for review via the Gerrit tool, following
the workflow documented at:
http://wiki.openstack.org/GerritWorkflow
Pull requests submitted through GitHub will be ignored.
Bugs should be filed on Launchpad, not GitHub:
https://bugs.launchpad.net/prep-source-repos

67
ChangeLog Normal file

@@ -0,0 +1,67 @@
CHANGES
=======
* Reformat README as RST
* Rename readme
0.0.1
-----
* Add tox.ini
* Remove rack-testing - not relevant to prep_source_repos
* Basic packageization
* Add basic requirements
* Update .gitignore to ignore virtualenv cruft
* Enable use of local reference directories
* Add --clean to reset all repos to upstream master
* Enable use inside a virtualenv
* Show what we're merging
* Late bound references for the win
* Handle repos with wonky origin
* Fix cruft in refs
* Preserve uncommitted changes
* And now make it cool
* Ignore things we don't want checked in
* More README and move rack testing to a subdir
* Overhaul prep-source-repos
* Update devtestrc
* Update repo_refs
* Update repo_refs.yaml
* Update repo_refs.yaml
* Update repo_refs.yaml
* update repo_refs.yaml
* Update repo_refs with current minimum patch set
* Pull in eventlet fix
* Add dib-utils
* Add 104455
* Bump to patchset 6 for 103227
* Add 104407
* Drop patch 97703 (merged)
* devtestrc: Allow SEED_* to be specified
* Bump 97703 to patchset #5
* Drop PYPI_MIRROR_URL_1
* Add LOCAL_DIB_ELEMENTS
* Bump 97703 to patchset 4
* Update repo_refs /w patch for bug #1334905
* Add tripleo-heat-templates review 97703
* Clear repo_refs for new round of testing
* Bump patchset 10 for 96498
* 93806 Merged, drop from repo_refs.yaml
* Bump patchset for 97626
* Set default OVERCLOUD_FIXED_RANGE_CIDR in devtestrc
* Typo
* Adds t-o-i rev. 97626
* Update repo_refs.yaml
* Update repo_refs.yaml
* Add tripleo-ci repo
* Update devtestrc
* repo_refs.yaml: Update context
* Cleanout repo refs, remove timeout, add 93731 as better iscsiadm workaround
* Add 96611, test workaround for bug 1324670
* Update repo_refs: Adds 96498
* Allow FLOATING_* to be set locally
* Set OVERCLOUD_{COMPUTE,CONTROL}_DIB_EXTRA_ARGS
* CONTROLSCALE=1
* Set USE_IRONIC=1
* Set seed vm specs
* INIT


@@ -1,50 +0,0 @@
This repository contains scripts for managing multiple outstanding patches to
TripleO (or other gerrit based projects).
- tooling to combine arbitrary unmerged gerrit patches (prep_source_repos)
which will also export an rc file with git refs based on the combined
branches
- a sample config file that we're using (repo_refs.yaml)
- some example inputs for doing rack testing of tripleo in the rack-testing
subdir
## Usage
* create a repo_refs.yaml in your TRIPLEO_ROOT (see the one in the root of this
repository for inspiration).
* add the tripleo-end-to-end bin directory to your path (or symlink the
specific scripts into place, or $up to you).
* run prep_source_repos $YOUR\_REFS\_FILE $TRIPLEO\_ROOT to checkout and update
the repositories specified by the refs file. Note that local edits are saved
via git stash whenever you refresh your source repos, and restored after the
update (which may, of course, fail). This provides a convenient way to use
local edits / work in progress for repositories that are used directly (vs
e.g. those that are cloned into images).
* source YOUR_REFS_FILE.variables to configure TripleO scripts to use your
freshly integrated branches
* proceed with any tripleo activies you might have (building images, deploying,
etc etc).
## Advanced use
Refs that don't match the xx/yy/zz form of gerrit refs are presumed to be local
work-in-progress branches. These are not fetched, but are merged into the
rollup branch along with all the other patches. With a little care this permits
working effectively with multiple patchsets in one project without them being
made into a stack in gerrit.
Refs of the form xx/yy/0 are late-bound references to gerrit - they will use
the gerrit REST API to find out the latest version and will use that.
When running prep-source-repos any additional arguments after the refs and
output dir are used to filter the repositories to fetch - so when working on
(say) two local orthogonal patches to nova, and you need to update your rollup
branch just do::
prep-source-repos foo bar nova
and only nova will be updated.

61
README.rst Normal file

@@ -0,0 +1,61 @@
prep_source_repos
-----------------
Introduction
============
This repository contains scripts for managing multiple outstanding patches
to a Gerrit-based project. It was initially developed for managing TripleO
deployments, and still makes certain TripleO-ish assumptions (patches are
welcome if you find the tool more generally useful).
The source repo includes:
- tooling to combine arbitrary unmerged gerrit patches (prep_source_repos)
which will also export an rc file with git refs based on the combined
branches
- a sample config file that we're using for our TripleO deployments
(repo_refs.yaml)
Usage
=====
* create a repo_refs.yaml (see the one in the root of this repository
for inspiration).
* run prep_source_repos $YOUR\_REFS\_FILE $DESTINATION\_DIR to check out and
update the repositories specified by the refs file (in a TripleO context,
$DESTINATION\_DIR will usually be "$TRIPLEO\_ROOT").
Note that local edits are saved via git stash whenever you refresh your
source repos, and restored after the update (which may, of course,
fail). This provides a convenient way to use local edits / work in
progress for repositories that are used directly (vs e.g. those that are
cloned into images).
* (optional) source YOUR_REFS_FILE.variables to configure TripleO scripts to
use your freshly integrated branches
* proceed with any TripleO activities you might have (building images,
deploying, and so on).
Advanced use
============
Refs that don't match the xx/yy/zz form of gerrit refs are presumed to be
local work-in-progress branches. These are not fetched, but are merged into
the rollup branch along with all the other patches. With a little care this
permits working effectively with multiple patchsets in one project without
them being made into a stack in gerrit.
Refs of the form xx/yy/0 are late-bound references to gerrit: the tool queries
the gerrit REST API for the change's latest patchset and uses that.
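
For reference, a rough sketch of that lookup, written against the same gerrit
REST endpoint the script itself calls (the helper name here is illustrative,
not part of the tool)::

    import json

    import requests


    def resolve_latest_ref(gerrit_api_base, change_number):
        """Return the fetch ref of a change's current patchset.

        e.g. resolve_latest_ref('https://review.openstack.org', '97626')
        might return something like 'refs/changes/26/97626/7'.
        """
        url = '%s/changes/?q=%s&o=CURRENT_REVISION' % (gerrit_api_base,
                                                       change_number)
        raw = requests.get(url).text
        # Gerrit prefixes JSON responses with a ")]}'" guard line; drop it.
        details = json.loads(raw.split('\n', 1)[1])
        revision = next(iter(details[0]['revisions'].values()))
        fetch = next(iter(revision['fetch'].values()))
        return fetch['ref']
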
When running prep-source-repos, any additional arguments after the refs file
and output dir are used to filter the repositories to fetch; so when working
on (say) two local orthogonal patches to nova and you need to update your
rollup branch, just do::

    prep-source-repos foo bar nova

and only nova will be updated.
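
Illustrative only (this is not the project's actual code), the filtering
described above amounts to something like::

    def select_repos(all_repos, requested):
        # No positional repo arguments: update everything in the refs file.
        # Otherwise, only touch the repositories that were named.
        if not requested:
            return sorted(all_repos)
        return [repo for repo in sorted(all_repos) if repo in requested]

    # select_repos({'nova', 'neutron', 'glance'}, ['nova']) -> ['nova']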

75
doc/source/conf.py Executable file

@@ -0,0 +1,75 @@
# -*- coding: utf-8 -*-
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import sys
sys.path.insert(0, os.path.abspath('../..'))
# -- General configuration ----------------------------------------------------
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = [
    'sphinx.ext.autodoc',
    #'sphinx.ext.intersphinx',
    'oslosphinx'
]
# autodoc generation is a bit aggressive and a nuisance when doing heavy
# text edit cycles.
# execute "export SPHINX_DEBUG=1" in your terminal to disable
# The suffix of source filenames.
source_suffix = '.rst'
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'prep-source-repos'
copyright = u'2014, OpenStack Foundation'
# If true, '()' will be appended to :func: etc. cross-reference text.
add_function_parentheses = True
# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
add_module_names = True
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# -- Options for HTML output --------------------------------------------------
# The theme to use for HTML and HTML Help pages. Major themes that come with
# Sphinx are currently 'default' and 'sphinxdoc'.
# html_theme_path = ["."]
# html_theme = '_theme'
# html_static_path = ['static']
# Output file base name for HTML help builder.
htmlhelp_basename = '%sdoc' % project
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, documentclass
# [howto/manual]).
latex_documents = [
    ('index',
     '%s.tex' % project,
     u'%s Documentation' % project,
     u'OpenStack Foundation', 'manual'),
]
# Example configuration for intersphinx: refer to the Python standard library.
#intersphinx_mapping = {'http://docs.python.org/': None}

4
doc/source/contributing.rst Normal file

@@ -0,0 +1,4 @@
Contributing to prep_source_repos
=================================
.. include:: ../../CONTRIBUTING.rst

24
doc/source/index.rst Normal file

@@ -0,0 +1,24 @@
.. prep-source-repos documentation master file, created by
sphinx-quickstart on Tue Jul 9 22:26:36 2013.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to prep-source-repos's documentation!
========================================================
Contents:
.. toctree::
   :maxdepth: 2

   readme
   installation
   usage
   contributing
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

12
doc/source/installation.rst Normal file

@@ -0,0 +1,12 @@
============
Installation
============
At the command line::

    $ pip install prep-source-repos

Or, if you have virtualenvwrapper installed::

    $ mkvirtualenv prep-source-repos
    $ pip install prep-source-repos

1
doc/source/readme.rst Normal file

@@ -0,0 +1 @@
.. include:: ../../README.rst

5
doc/source/usage.rst Normal file

@@ -0,0 +1,5 @@
========
Usage
========
See README.rst


@@ -1,19 +1,20 @@
#!/usr/bin/python
#!/usr/bin/env python
import argparse
import json
import os.path
import re
from subprocess import check_call, check_output
from subprocess import check_call
from subprocess import check_output
import sys
import yaml
import requests
import yaml
def normalise_conf(conf):
"""generate full paths etc for easy application later.
The resulting structure is:
basename -> (remotebase, gerrit_API_base).
"""
@@ -41,7 +42,8 @@ def normalise_conf(conf):
def main():
parser = argparse.ArgumentParser()
parser.add_argument("refs", help="the yaml config file")
parser.add_argument("output", help="where to put the downloaded repositories")
parser.add_argument("output",
help="where to put the downloaded repositories")
parser.add_argument("repos", help="what repos to update", nargs="*")
args = parser.parse_args()
SRC_ROOT = os.path.abspath(args.output)
@@ -76,7 +78,7 @@ def main():
if segments[2] == '0':
# pull the latest edition
gerrit_url = gerrit + ('/changes/?q=%s&o=CURRENT_REVISION'
% segments[1])
% segments[1])
details = json.loads(session.get(gerrit_url).text[4:])
src = details[0]['revisions'].values()[0]['fetch'].values()[0]
rref = src['ref']
@@ -87,7 +89,7 @@ def main():
git_refs.append(
'+%(rref)s:%(rref)s' % dict(rref=rref))
print 'fetching from %s %s' % (remote, git_refs)
print('fetching from %s %s' % (remote, git_refs))
check_call(['git', 'fetch', remote] + git_refs, cwd=rd)
if not refs:
@@ -106,11 +108,12 @@ def main():
check_call(['git', 'stash'], cwd=rd)
branches = check_output(['git', 'branch', '-a'], cwd=rd)
if ' ' + branch_name in branches:
print 'Resetting existing branch %s...' % branch_name
print('Resetting existing branch %s...' % branch_name)
check_call(['git', 'checkout', branch_name], cwd=rd)
check_call(['git', 'reset', '--hard', 'review/master'], cwd=rd)
else:
check_call(['git', 'checkout', '-b', branch_name, 'review/master'], cwd=rd)
check_call(['git', 'checkout', '-b', branch_name,
'review/master'], cwd=rd)
for ref in refs:
segments = ref.split('/')
if len(segments) == 3:
@@ -118,13 +121,13 @@ def main():
ref = resolved_refs[ref]
else:
ref = 'refs/changes/%s' % ref
print 'merging in %s' % ref
print('merging in %s' % ref)
check_call(['git', 'merge', '--no-edit', ref], cwd=rd)
if dirty:
check_call(['git', 'stash', 'pop'], cwd=rd)
normalised_repo = re.sub('[^A-Za-z0-9_]', '_', repo)
if repo not in CONF['gerrit_refs']:
print 'no refs for %s' % repo
print('no refs for %s' % repo)
variables.append((normalised_repo, rd, None))
else:
variables.append((normalised_repo, rd, branch_name))
@@ -136,7 +139,7 @@ def main():
if ref:
output.write('export DIB_REPOREF_%s=%s\n' % (name, ref))
else:
output.write('unset DIB_REPOREF_%s\n'% name)
output.write('unset DIB_REPOREF_%s\n' % name)
return 0


@@ -1,16 +0,0 @@
Here we find:
* sample nodes.json and networks.json files
* an example devtestrc which we use when testing in one of the HP test racks
To use:
* follow the main readme to get your source repos downloaded and variables
defined.
* create appropriate customised nodes and network json files
* run it::
devtest.sh --trash-my-machine --nodes $PATH_TO/nodes.json --bm-networks $PATH_TO/bm-network.json


@@ -1,108 +0,0 @@
#PROXY="10.22.167.17"; # Using desktop squid
#PYPI_MIRROR="10.22.167.17";
#### Set these specific to your environment!
####
# IP/Hostname of pypi mirror, or leave blank to not use
export PYPI_MIRROR=${PYPI_MIRROR:-''}
# IP/Hostname:Port of HTTP/HTTPS Proxy, or leave blank to not use
export PROXY=${PROXY:-''}
export TRIPLEO_ROOT=${TRIPLEO_ROOT:-"$PWD/tripleo"}
export TE_DATAFILE=${TE_DATAFILE:-"$TRIPLEO_ROOT/testenv.json"}
export NeutronPublicInterface=eth2
# Scale for overcloud compute/control.
export OVERCLOUD_CONTROLSCALE=${OVERCLOUD_CONTROLSCALE:-"1"}
export OVERCLOUD_COMPUTESCALE=${OVERCLOUD_COMPUTESCALE:-"29"}
# Specific to your network
export FLOATING_START=${FLOATING_START:-"10.22.157.225"}
export FLOATING_END=${FLOATING_END:-"10.22.157.254"}
export FLOATING_CIDR=${FLOATING_CIDR:-"10.22.157.244/27"}
# Relies on https://review.openstack.org/97626
export OVERCLOUD_FIXED_RANGE_CIDR=${OVERCLOUD_FIXED_RANGE_CIDR:-"192.168.10.0/24"}
# Be sure to create a large seed vm
export SEED_CPU=${SEED_CPU:-24}
export SEED_MEM=${SEED_MEM:-24576}
##### end
if [[ -n "$PROXY" ]] ; then
export http_proxy="http://$PROXY/"
export https_proxy="https://$PROXY/"
export no_proxy="${PYPI_MIRROR},localhost";
fi
if [[ -n "$PYPI_MIRROR" ]] ; then
export PYPI_MIRROR_URL="http://${PYPI_MIRROR}/pypi/latest"; # point this at the pypi mirror.
export DIB_NO_PYPI_PIP=1
fi
export DIB_COMMON_ELEMENTS="$LOCAL_DIB_ELEMENTS stackuser pypi -u use-ephemeral mellanox"
export DEVTEST_PERF_COMMENT="$(hostname): clean re-run, using USEast pypi mirror."
# same arch as host machine, so the wheels in the mirror work
export NODE_ARCH=amd64
export NODE_DIST=${NODE_DIST:-"ubuntu"}
export DIB_RELEASE=trusty
export USE_IRONIC=1
# NOTE (adam_g): Limit cloud-init data sources to Ec2 to workaround
# trusty cloud-init bug #1316475.
# Relies on https://review.openstack.org/#/c/95598/
export DIB_CLOUD_INIT_DATASOURCES="Ec2"
export UNDERCLOUD_DIB_EXTRA_ARGS="rabbitmq-server cloud-init-datasources"
export OVERCLOUD_CONTROL_DIB_EXTRA_ARGS="rabbitmq-server cloud-init-datasources"
export OVERCLOUD_COMPUTE_DIB_EXTRA_ARGS="cloud-init-datasources"
if [ -z "$NO_SOURCE_PREP" ]; then
cd $TRIPLE_ROOT/tripleo-end-to-end
bin/prep_source_repos
cd -
fi
# Clone our local git copies. Make devtest.sh prefer your local repositories. You'll still need to have stuff checked in to them!
for n in $TRIPLEO_ROOT/*;
do
[ -d "$n" -a -d "$n/.git" ] || continue
nn=$(basename "$n") # work around older bash
bn=${nn//[^A-Za-z0-9_]/_}
DIB_REPOTYPE="DIB_REPOTYPE_${bn}";
DIB_REPOLOCATION="DIB_REPOLOCATION_${bn}";
DIB_REPOREF="DIB_REPOREF_${bn}";
export "${DIB_REPOTYPE}"="git";
export "${DIB_REPOLOCATION}"="${n}";
unset branch;
if branch=$(cd "${n}" && git symbolic-ref --short -q HEAD);
then
export "${DIB_REPOREF}"="${branch}";
else
unset "${DIB_REPOREF}";
fi
printf "%-25s %-5s %-60s %s\n" "${bn}" "${!DIB_REPOTYPE}" "${!DIB_REPOLOCATION}" "${!DIB_REPOREF}";
pushd "${n}" >/dev/null;
if [[ "$(git rev-parse master)" != "$( git rev-parse HEAD)" ]];
then
IFS=$'\n';
for f in $(git log master.. --oneline); do
printf ' \e[1;31m%-60s \e[1;34m%s\e[m\n' "${f}" "$(git show $(echo $f | cut -d" " -f1) | awk '/Change-Id/ {print "http://review.openstack.org/r/" $2}')";
done
fi
popd >/dev/null;
done
source $TRIPLEO_ROOT/tripleo-incubator/scripts/devtest_variables.sh


@@ -1,13 +0,0 @@
{
"cidr": "10.22.157.0/24",
"gateway-ip": "10.22.157.1",
"seed": {
"ip": "10.22.157.150",
"range-start": "10.22.157.151",
"range-end": "10.22.157.152"
},
"undercloud": {
"range-start": "10.22.157.153",
"range-end": "10.22.157.190"
}
}


@@ -1,22 +0,0 @@
[{
"pm_password": "foo",
"mac": ["78:e7:d1:24:99:a5"],
"pm_addr": "10.22.51.66",
"pm_type": "pxe_ipmitool",
"memory": 98304,
"disk": 1600,
"arch": "amd64",
"cpu": 24,
"pm_user": "Administrator"
},
{
"pm_password": "AFDJHTVQ",
"mac": ["78:e7:d1:24:6f:c5"],
"pm_addr": "10.22.51.69",
"pm_type": "pxe_ipmitool",
"memory": 98304,
"disk": 1600,
"arch": "amd64",
"cpu": 24,
"pm_user": "Administrator"
}]

2
requirements.txt Normal file

@@ -0,0 +1,2 @@
PyYAML>=3.1.0
requests>=2.2.0,!=2.4.0

25
setup.cfg Normal file

@@ -0,0 +1,25 @@
[metadata]
name = prep_source_repos
author = OpenStack Foundation
author-email = openstack-dev@lists.openstack.org
summary = Tool to manage a local checkout of a set of unlanded patches on a Gerrit repository
description-file = README.rst
license = Apache-2
classifier =
Development Status :: 4 - Beta
Environment :: Console
Environment :: OpenStack
Intended Audience :: Developers
Intended Audience :: Information Technology
License :: OSI Approved :: Apache Software License
Operating System :: OS Independent
Programming Language :: Python
keywords =
gerrit
[files]
packages =
prep_source_repos
[entry_points]
console_scripts =
prep_source_repos = prep_source_repos.cmd:main

8
setup.py Executable file

@@ -0,0 +1,8 @@
#!/usr/bin/env python
from setuptools import setup
setup(
setup_requires=['pbr'],
pbr=True,
)

15
test-requirements.txt Normal file

@@ -0,0 +1,15 @@
# The order of packages is significant, because pip processes them in the order
# of appearance. Changing the order has an impact on the overall integration
# process, which may cause wedges in the gate later.
hacking<0.11,>=0.10.0
coverage>=3.6
discover
fixtures>=0.3.14
mock>=1.0
python-subunit>=0.0.18
sphinx>=1.1.2,!=1.2.0,!=1.3b1,<1.3
oslosphinx>=2.2.0 # Apache-2.0
testrepository>=0.0.18
testscenarios>=0.4
testtools>=0.9.36,!=1.2.0

36
tox.ini Normal file

@@ -0,0 +1,36 @@
[tox]
minversion = 1.6
envlist = pep8
skipsdist = True
[testenv]
usedevelop = True
install_command = pip install -U {opts} {packages}
setenv =
VIRTUAL_ENV={envdir}
deps = -r{toxinidir}/requirements.txt
-r{toxinidir}/test-requirements.txt
commands = python setup.py testr --slowest --testr-args='{posargs}'
[testenv:pep8]
commands = flake8
[testenv:venv]
commands = {posargs}
[testenv:docs]
commands = python setup.py build_sphinx
[flake8]
# H302 skipped on purpose per IRC discussion involving other TripleO projects.
# H803 skipped on purpose per list discussion.
# E123, E125 skipped as they are invalid PEP-8.
show-source = True
ignore = E123,E125,H302,H803
builtins = _
exclude=.venv,.git,.tox,dist,doc,*openstack/common*,*lib/python*,*egg,build
#Value chosen to allow current complexity, but no more.
#TODO(tchayo) decrease this to 14
max-complexity=22