Farewell oslo-incubator

We need to remove all of the files except the README, as required by:
http://docs.openstack.org/infra/manual/drivers.html#remove-project-content

Change-Id: Ib008533ba1c5be809f4b6ae3f60e9537e022d923
@@ -1,20 +0,0 @@
=============================================
Contributing to: oslo-incubator
=============================================

If you would like to contribute to the development of OpenStack,
you must follow the steps on this page:

   http://docs.openstack.org/infra/manual/developers.html

Once those steps have been completed, changes to OpenStack
should be submitted for review via the Gerrit tool, following
the workflow documented at:

   http://docs.openstack.org/infra/manual/developers.html#development-workflow

Pull requests submitted through GitHub will be ignored.

Bugs should be filed on Launchpad, not GitHub:

   https://bugs.launchpad.net/oslo-incubator
HACKING.rst
@@ -1,73 +0,0 @@
Oslo Style Commandments
=======================

- Step 1: Read the OpenStack Style Commandments
  http://docs.openstack.org/developer/hacking/
- Step 2: Read on

Oslo Specific Commandments
--------------------------
- None So Far


General
-------
- When defining global constants, define them before functions and classes.
- Use 'raise' instead of 'raise e' to preserve the original traceback of the
  exception being re-raised::

    except Exception as e:
        ...
        raise e  # BAD

    except Exception:
        ...
        raise  # OKAY
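The difference is visible when the re-raised exception's traceback is inspected. A minimal sketch (note: bare ``raise`` is the portable form; under Python 2, ``raise e`` discarded the original traceback, while Python 3 preserves it via ``__traceback__``, so the rule predates Python 3):

```python
import traceback

def inner():
    raise ValueError("boom")  # the original failure site

def reraise():
    try:
        inner()
    except Exception:
        pass  # cleanup, logging, etc. would go here
        raise  # OKAY: propagates the original exception and traceback

try:
    reraise()
except ValueError:
    tb = traceback.format_exc()

# The traceback still names inner(), the frame where the error really occurred.
assert "inner" in tb and "boom" in tb
```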
TODO vs FIXME
-------------
- TODO(name): implies that something should be done (cleanup, refactoring,
  etc.), but is expected to be functional.
- FIXME(name): implies that the method/function/etc. shouldn't be used until
  that code is resolved and the bug is fixed.

Text encoding
-------------
- All text within python code should be of type 'unicode'.

    WRONG:

    >>> s = 'foo'
    >>> s
    'foo'
    >>> type(s)
    <type 'str'>

    RIGHT:

    >>> u = u'foo'
    >>> u
    u'foo'
    >>> type(u)
    <type 'unicode'>

- Transitions between internal unicode and external strings should always
  be immediately and explicitly encoded or decoded.

- All external text that is not explicitly encoded (database storage,
  command-line arguments, etc.) should be presumed to be encoded as utf-8.

    WRONG:

    mystring = infile.readline()
    myreturnstring = do_some_magic_with(mystring)
    outfile.write(myreturnstring)

    RIGHT:

    mystring = infile.readline()
    mytext = mystring.decode('utf-8')
    returntext = do_some_magic_with(mytext)
    returnstring = returntext.encode('utf-8')
    outfile.write(returnstring)
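A self-contained Python 3 rendition of the decode-at-the-boundary pattern above. ``io.BytesIO`` stands in for the hypothetical ``infile``/``outfile``, and ``do_some_magic_with`` is a placeholder for any text-only processing:

```python
import io

def do_some_magic_with(text):
    # Placeholder for real processing; operates purely on str (unicode).
    return text.upper()

# Byte-oriented "files" standing in for real external I/O.
infile = io.BytesIO("héllo wörld\n".encode("utf-8"))
outfile = io.BytesIO()

mystring = infile.readline()               # raw bytes from the outside world
mytext = mystring.decode("utf-8")          # decode immediately at the boundary
returntext = do_some_magic_with(mytext)    # all internal work is on unicode text
returnstring = returntext.encode("utf-8")  # encode just before writing out
outfile.write(returnstring)

assert outfile.getvalue() == "HÉLLO WÖRLD\n".encode("utf-8")
```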
LICENSE
@@ -1,233 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

   "License" shall mean the terms and conditions for use, reproduction,
   and distribution as defined by Sections 1 through 9 of this document.

   "Licensor" shall mean the copyright owner or entity authorized by
   the copyright owner that is granting the License.

   "Legal Entity" shall mean the union of the acting entity and all
   other entities that control, are controlled by, or are under common
   control with that entity. For the purposes of this definition,
   "control" means (i) the power, direct or indirect, to cause the
   direction or management of such entity, whether by contract or
   otherwise, or (ii) ownership of fifty percent (50%) or more of the
   outstanding shares, or (iii) beneficial ownership of such entity.

   "You" (or "Your") shall mean an individual or Legal Entity
   exercising permissions granted by this License.

   "Source" form shall mean the preferred form for making modifications,
   including but not limited to software source code, documentation
   source, and configuration files.

   "Object" form shall mean any form resulting from mechanical
   transformation or translation of a Source form, including but
   not limited to compiled object code, generated documentation,
   and conversions to other media types.

   "Work" shall mean the work of authorship, whether in Source or
   Object form, made available under the License, as indicated by a
   copyright notice that is included in or attached to the work
   (an example is provided in the Appendix below).

   "Derivative Works" shall mean any work, whether in Source or Object
   form, that is based on (or derived from) the Work and for which the
   editorial revisions, annotations, elaborations, or other modifications
   represent, as a whole, an original work of authorship. For the purposes
   of this License, Derivative Works shall not include works that remain
   separable from, or merely link (or bind by name) to the interfaces of,
   the Work and Derivative Works thereof.

   "Contribution" shall mean any work of authorship, including
   the original version of the Work and any modifications or additions
   to that Work or Derivative Works thereof, that is intentionally
   submitted to Licensor for inclusion in the Work by the copyright owner
   or by an individual or Legal Entity authorized to submit on behalf of
   the copyright owner. For the purposes of this definition, "submitted"
   means any form of electronic, verbal, or written communication sent
   to the Licensor or its representatives, including but not limited to
   communication on electronic mailing lists, source code control systems,
   and issue tracking systems that are managed by, or on behalf of, the
   Licensor for the purpose of discussing and improving the Work, but
   excluding communication that is conspicuously marked or otherwise
   designated in writing by the copyright owner as "Not a Contribution."

   "Contributor" shall mean Licensor and any individual or Legal Entity
   on behalf of whom a Contribution has been received by Licensor and
   subsequently incorporated within the Work.

2. Grant of Copyright License. Subject to the terms and conditions of
   this License, each Contributor hereby grants to You a perpetual,
   worldwide, non-exclusive, no-charge, royalty-free, irrevocable
   copyright license to reproduce, prepare Derivative Works of,
   publicly display, publicly perform, sublicense, and distribute the
   Work and such Derivative Works in Source or Object form.

3. Grant of Patent License. Subject to the terms and conditions of
   this License, each Contributor hereby grants to You a perpetual,
   worldwide, non-exclusive, no-charge, royalty-free, irrevocable
   (except as stated in this section) patent license to make, have made,
   use, offer to sell, sell, import, and otherwise transfer the Work,
   where such license applies only to those patent claims licensable
   by such Contributor that are necessarily infringed by their
   Contribution(s) alone or by combination of their Contribution(s)
   with the Work to which such Contribution(s) was submitted. If You
   institute patent litigation against any entity (including a
   cross-claim or counterclaim in a lawsuit) alleging that the Work
   or a Contribution incorporated within the Work constitutes direct
   or contributory patent infringement, then any patent licenses
   granted to You under this License for that Work shall terminate
   as of the date such litigation is filed.

4. Redistribution. You may reproduce and distribute copies of the
   Work or Derivative Works thereof in any medium, with or without
   modifications, and in Source or Object form, provided that You
   meet the following conditions:

   (a) You must give any other recipients of the Work or
       Derivative Works a copy of this License; and

   (b) You must cause any modified files to carry prominent notices
       stating that You changed the files; and

   (c) You must retain, in the Source form of any Derivative Works
       that You distribute, all copyright, patent, trademark, and
       attribution notices from the Source form of the Work,
       excluding those notices that do not pertain to any part of
       the Derivative Works; and

   (d) If the Work includes a "NOTICE" text file as part of its
       distribution, then any Derivative Works that You distribute must
       include a readable copy of the attribution notices contained
       within such NOTICE file, excluding those notices that do not
       pertain to any part of the Derivative Works, in at least one
       of the following places: within a NOTICE text file distributed
       as part of the Derivative Works; within the Source form or
       documentation, if provided along with the Derivative Works; or,
       within a display generated by the Derivative Works, if and
       wherever such third-party notices normally appear. The contents
       of the NOTICE file are for informational purposes only and
       do not modify the License. You may add Your own attribution
       notices within Derivative Works that You distribute, alongside
       or as an addendum to the NOTICE text from the Work, provided
       that such additional attribution notices cannot be construed
       as modifying the License.

   You may add Your own copyright statement to Your modifications and
   may provide additional or different license terms and conditions
   for use, reproduction, or distribution of Your modifications, or
   for any such Derivative Works as a whole, provided Your use,
   reproduction, and distribution of the Work otherwise complies with
   the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

6. Trademarks. This License does not grant permission to use the trade
   names, trademarks, service marks, or product names of the Licensor,
   except as required for reasonable and customary use in describing the
   origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or
   agreed to in writing, Licensor provides the Work (and each
   Contributor provides its Contributions) on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
   implied, including, without limitation, any warranties or conditions
   of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
   PARTICULAR PURPOSE. You are solely responsible for determining the
   appropriateness of using or redistributing the Work and assume any
   risks associated with Your exercise of permissions under this License.

8. Limitation of Liability. In no event and under no legal theory,
   whether in tort (including negligence), contract, or otherwise,
   unless required by applicable law (such as deliberate and grossly
   negligent acts) or agreed to in writing, shall any Contributor be
   liable to You for damages, including any direct, indirect, special,
   incidental, or consequential damages of any character arising as a
   result of this License or out of the use or inability to use the
   Work (including but not limited to damages for loss of goodwill,
   work stoppage, computer failure or malfunction, or any and all
   other commercial damages or losses), even if such Contributor
   has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing
   the Work or Derivative Works thereof, You may choose to offer,
   and charge a fee for, acceptance of support, warranty, indemnity,
   or other liability obligations and/or rights consistent with this
   License. However, in accepting such obligations, You may act only
   on Your own behalf and on Your sole responsibility, not on behalf
   of any other Contributor, and only if You agree to indemnify,
   defend, and hold each Contributor harmless for any liability
   incurred by, or claims asserted against, such Contributor by reason
   of your accepting any such warranty or additional liability.

--- License for python-keystoneclient versions prior to 2.1 ---

All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice,
   this list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright
   notice, this list of conditions and the following disclaimer in the
   documentation and/or other materials provided with the distribution.

3. Neither the name of this project nor the names of its contributors may
   be used to endorse or promote products derived from this software without
   specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.


Licenses for incorporated software
==================================

----------------------------------------------------------------------

The db.sqlalchemy.migration code was based on code originally licensed
under the MIT license:

----------------------------------------------------------------------
Copyright (c) 2009 Evan Rosson, Jan Dittberner, Domen Kožar

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
README.rst
@@ -1,18 +1,13 @@
 ------------------
 The Oslo Incubator
 ------------------
+This project is no longer maintained.
+
-The Oslo program produces a set of python libraries containing
-infrastructure code shared by OpenStack projects. The APIs provided by
-these libraries should be high quality, stable, consistent and
-generally useful.
+The contents of this repository are still available in the Git
+source code management system. To see the contents of this
+repository before it reached its end of life, please check out the
+previous commit with "git checkout HEAD^1".
-
-The process of developing a new Oslo API usually begins by taking code
-which is common to some OpenStack projects and moving it into this
-repository. Incubation shouldn't be seen as a long term option for any
-API - it is merely a stepping stone to inclusion into a published Oslo
-library.
+
+For an alternative project, please see oslo.tools at
+http://git.openstack.org/cgit/openstack/oslo.tools
-
-For more information, see our wiki page:
-
-https://wiki.openstack.org/wiki/Oslo
+
+For any further questions, please email
+openstack-dev@lists.openstack.org or join #openstack-dev on
+Freenode.
@@ -1,28 +0,0 @@
[dashboard]
title = Oslo Documentation Sprint
description = Documentation Sprint
foreach = (project:^openstack/.*oslo.* OR project:^openstack-dev/.*oslo.* OR
          project:openstack/debtcollector OR project:openstack/pylockfile OR
          project:openstack/futurist OR project:openstack/automaton OR
          project:openstack/stevedore OR project:openstack/taskflow OR
          project:openstack/tooz OR project:openstack-dev/cookiecutter OR
          project:openstack-dev/pbr OR project:openstack/debtcollector OR
          project:openstack/mox3)
          status:open NOT owner:self NOT label:Workflow<=-1 label:Verified>=1
          NOT label:Code-Review<=-1,self NOT label:Code-Review>=1,self
          (file:^doc/source/.* OR file:^README.*)

[section "Bug Fixes"]
query = topic:^bug/.*

[section "You are a reviewer, but haven't voted in the current revision"]
query = reviewer:self

[section "Needs final +2"]
query = label:Code-Review>=2 limit:50

[section "New Contributors"]
query = reviewer:10068

[section "Passed Jenkins, No Negative Feedback"]
query = NOT label:Code-Review>=2 NOT label:Code-Review<=-1 limit:50
@@ -1,34 +0,0 @@
[dashboard]
title = Oslo Graduation Changes
description = Changes for Oslo libraries being Graduated
# NOTE(dhellmann): The space in the message query results in
# much more accurate query results.
foreach = is:open

[section "Graduating Libraries"]
query = (project:openstack/oslo.versionedobjects OR
        project:openstack/oslo.service OR project:openstack/oslo.reports OR
        project:openstack/oslo.cache OR project:openstack/oslo.service)
        NOT owner:self
        NOT label:Workflow<=-1
        label:Verified>=1

[section "New Libraries"]
query = (project:openstack/futurist OR
        project:openstack/automaton OR
        project:openstack/mox3)
        NOT owner:self
        NOT label:Workflow<=-1
        label:Verified>=1

[section "Incubator"]
query = project:openstack/oslo-incubator topic:^.*graduate.*

[section "openstack-infra/*"]
query = project:^openstack-infra/.* message:" oslo"

[section "openstack-dev/*"]
query = project:^openstack-dev/.* message:" oslo"

[section "governance"]
query = project:openstack/governance message:" oslo"
@@ -1,39 +0,0 @@
[dashboard]
title = Oslo Review Inbox
description = Review Inbox
foreach = (project:^openstack/.*oslo.* OR project:^openstack-dev/.*oslo.* OR
          project:openstack/debtcollector OR project:openstack/pylockfile OR
          project:openstack/futurist OR project:openstack/automaton OR
          project:openstack/stevedore OR project:openstack/taskflow OR
          project:openstack/tooz OR project:openstack-dev/cookiecutter OR
          project:openstack-dev/pbr OR project:openstack/debtcollector OR
          project:openstack/mox3)
          status:open NOT owner:self NOT label:Workflow<=-1 label:Verified>=1
          NOT label:Code-Review<=-1,self NOT label:Code-Review>=1,self

[section "Oslo Specs"]
query = project:openstack/oslo-specs

[section "Bug Fixes"]
query = topic:^bug/.*

[section "Blueprints"]
query = message:"Blueprint"

[section "Needs Feedback (Changes older than 5 days that have not been reviewed by anyone)"]
query = NOT label:Code-Review<=2 age:5d

[section "You are a reviewer, but haven't voted in the current revision"]
query = reviewer:self

[section "Needs final +2"]
query = label:Code-Review>=2 limit:50

[section "New Contributors"]
query = reviewer:10068

[section "Passed Jenkins, No Negative Feedback"]
query = NOT label:Code-Review>=2 NOT label:Code-Review<=-1 limit:50

[section "Wayward Changes (Changes with no code review in the last 2 days)"]
query = NOT label:Code-Review<=2 age:2d
@@ -1,17 +0,0 @@
[dashboard]
title = Oslo Sync Review Inbox
description = Review Inbox
foreach = is:open branch:master file:^.*openstack/common.* status:open
          NOT owner:self NOT label:Workflow<=-1 label:Verified>=1,jenkins
          NOT label:Code-Review<=-1,self NOT label:Code-Review>=1,self

[section "Integrated Projects"]
query = NOT project:openstack/oslo-incubator project:^openstack/.*
        NOT project:^openstack/python-.*client

[section "Clients"]
query = project:^openstack/python-.*client

[section "Stackforge Projects"]
query = project:^stackforge/.*
(binary image files removed: 52 KiB, 18 KiB, 4.6 KiB)
@@ -1,19 +0,0 @@
<?xml version="1.0"?><svg width="640" height="480" xmlns="http://www.w3.org/2000/svg">
<g>
<path d="m414.682617,278.616882c0.018066,60.508209 -49.028503,109.569366 -109.536743,109.569366c-60.508209,0 -109.554749,-49.061157 -109.536713,-109.569366c-0.018036,-60.508224 49.028503,-109.569397 109.536713,-109.569397c60.50824,0 109.55481,49.061172 109.536743,109.569397z" id="path2383" stroke-miterlimit="4" stroke-linejoin="round" stroke-linecap="round" stroke-width="4" stroke="#000000" fill-rule="nonzero" fill="#ffffff"/>
<path d="m312.326172,389.186554c0,9.090759 14.665344,9.381348 32.735107,9.381348c18.069824,0 32.735168,0.693481 32.735168,-9.381348c0,-17.715942 -14.665344,-32.094116 -32.735168,-32.094116c-18.069763,0 -32.735107,14.378174 -32.735107,32.094116z" id="path3172" stroke-miterlimit="4" stroke-linejoin="round" stroke-linecap="round" stroke-width="4" stroke="#000000" fill-rule="nonzero" fill="#ffffff"/>
<path d="m357.651733,397.966949c0,-16.367554 -9.44281,-26.439941 -9.44281,-26.439941c0,0 3.7771,13.849487 3.147583,26.439941l6.295227,0z" id="path3174" stroke-miterlimit="4" stroke-linejoin="round" stroke-width="4" stroke="#000000" fill-rule="evenodd" fill="#ffffff"/>
<path d="m301.484131,148.321594c-30.129211,0.683929 -50.931671,13.052826 -63.624969,32.20285c-24.064438,-2.89859 -44.0625,26.462311 -44.0625,26.462311c0,0 18.891129,10.580811 32.0625,1.715149c-2.399231,9.757996 -3.5625,20.280457 -3.5625,31.257721c0,50.599915 56.125519,91.673096 81.843719,91.673096c26.716614,0 81.84375,-41.073181 81.84375,-91.673096c0,-10.977264 -1.163879,-21.499725 -3.5625,-31.257721c13.171417,8.865662 32.03125,-1.715149 32.03125,-1.715149c0,0 -19.971191,-29.351685 -44.03125,-26.462311c-13.061646,-19.70813 -34.707642,-32.20285 -66.28125,-32.20285c-0.893921,0 -1.778137,-0.019928 -2.65625,0z" id="path3155" stroke-miterlimit="4" stroke-linejoin="round" stroke-linecap="round" stroke-width="4" stroke="#000000" fill-rule="nonzero" fill="#ffffff"/>
<path d="m335.109131,152.078033c22.642822,-2.369171 34.040344,-7.446167 39.5,-12c17.754211,-14.808411 11.325378,-28.549896 22.000031,-25.5c10.499969,3 10.476746,14.694916 17.499969,18c8.500031,4 -2.037109,-35.167847 6.500031,-32.5c8,2.5 8.971832,22.345093 21,27.5c10.499969,4.5 -6.609039,-46.582703 8,-35.5c14.499969,11 35,55 1,67c-34,12 -84.446686,15.486877 -101.000031,15c-17,-0.5 -32.659851,-20.099854 -14.5,-22z" id="path2399" stroke-miterlimit="4" stroke-linejoin="round" stroke-width="4" stroke="#000000" fill-rule="evenodd" fill="#ffffff"/>
<path d="m300.516357,218.146606c0.003723,12.518936 -10.143799,22.669525 -22.662781,22.669525c-12.518921,0 -22.666473,-10.150589 -22.66275,-22.669525c-0.003723,-12.518951 10.143829,-22.669525 22.66275,-22.669525c12.518982,0 22.666504,10.150574 22.662781,22.669525z" id="path3157" stroke-miterlimit="4" stroke-linejoin="round" stroke-linecap="round" stroke-width="3.88889" stroke="#000000" fill-rule="nonzero" fill="#ffffff"/>
<path d="m353.248566,218.146606c0.003754,12.518936 -10.143829,22.669525 -22.66275,22.669525c-12.518951,0 -22.666504,-10.150589 -22.662781,-22.669525c-0.003723,-12.518951 10.143829,-22.669525 22.662781,-22.669525c12.518921,0 22.666504,10.150574 22.66275,22.669525z" id="path3159" stroke-miterlimit="4" stroke-linejoin="round" stroke-linecap="round" stroke-width="3.88889" stroke="#000000" fill-rule="nonzero" fill="#ffffff"/>
<path d="m333.103882,222.146637c0.001251,4.172974 -3.381287,7.556488 -7.55426,7.556488c-4.172974,0 -7.555481,-3.383514 -7.55426,-7.556488c-0.001221,-4.172989 3.381287,-7.556503 7.55426,-7.556503c4.172974,0 7.555511,3.383514 7.55426,7.556503z" id="path3161" stroke-miterlimit="4" stroke-linejoin="round" stroke-linecap="round" stroke-width="47.666691" fill-rule="nonzero" fill="#000000"/>
<path d="m291.073547,222.146637c0.001251,4.172974 -3.381287,7.556488 -7.55426,7.556488c-4.172974,0 -7.555481,-3.383514 -7.55426,-7.556488c-0.001221,-4.172989 3.381287,-7.556503 7.55426,-7.556503c4.172974,0 7.555511,3.383514 7.55426,7.556503z" id="path3163" stroke-miterlimit="4" stroke-linejoin="round" stroke-linecap="round" stroke-width="47.666691" fill-rule="nonzero" fill="#000000"/>
<path d="m297.328369,389.186554c0,9.090759 -14.665344,9.381348 -32.735107,9.381348c-18.069778,0 -32.735123,0.693481 -32.735123,-9.381348c0,-17.715942 14.665344,-32.094116 32.735123,-32.094116c18.069763,0 32.735107,14.378174 32.735107,32.094116z" id="path3329" stroke-miterlimit="4" stroke-linejoin="round" stroke-linecap="round" stroke-width="4" stroke="#000000" fill-rule="nonzero" fill="#ffffff"/>
<path d="m252.002853,397.966949c0,-16.367554 9.442825,-26.439941 9.442825,-26.439941c0,0 -3.777161,13.849487 -3.147644,26.439941l-6.295181,0z" id="path3331" stroke-miterlimit="4" stroke-linejoin="round" stroke-width="4" stroke="#000000" fill-rule="evenodd" fill="#ffffff"/>
<path d="m320.741333,294.277985c-2.292725,3.973083 -6.500244,5.838562 -9.396729,4.16626c-2.896545,-1.672302 -3.384766,-6.248901 -1.090332,-10.221008c2.292725,-3.973083 6.500305,-5.838562 9.39679,-4.16626c2.896545,1.672302 3.384705,6.248901 1.090271,10.221008z" id="path2398" stroke-miterlimit="4" stroke-linejoin="round" stroke-linecap="round" stroke-width="47.666691" fill-rule="nonzero" fill="#000000"/>
<path d="m288.004272,294.277985c2.292725,3.973083 6.500305,5.838562 9.39679,4.16626c2.896545,-1.672302 3.384705,-6.248901 1.090271,-10.221008c-2.292725,-3.973083 -6.500244,-5.838562 -9.39679,-4.16626c-2.896484,1.672302 -3.384705,6.248901 -1.090271,10.221008z" id="path2412" stroke-miterlimit="4" stroke-linejoin="round" stroke-linecap="round" stroke-width="47.666691" fill-rule="nonzero" fill="#000000"/>
<path d="m272.512146,152.078033c-22.642776,-2.369171 -34.040268,-7.446167 -39.499985,-12c-17.754181,-14.808411 -11.325348,-28.549896 -22,-25.5c-10.5,3 -10.4767,14.694916 -17.5,18c-8.5,4 2.037125,-35.167847 -6.5,-32.5c-8,2.5 -8.971848,22.345093 -21,27.5c-10.5,4.5 6.609024,-46.582703 -8,-35.5c-14.5,11 -35,55 -1,67c34,12 84.446655,15.486877 100.999985,15c17,-0.5 32.659912,-20.099854 14.5,-22z" id="path3171" stroke-miterlimit="4" stroke-linejoin="round" stroke-width="4" stroke="#000000" fill-rule="evenodd" fill="#ffffff"/>
<title>Layer 1</title>
</g>
</svg>
(binary image files removed: 6.1 KiB, 56 KiB, 34 KiB, 9.5 KiB)
@@ -1,25 +0,0 @@
<?xml version="1.0"?><svg width="335.62131" height="312.57803" xmlns="http://www.w3.org/2000/svg">

<g>
<title>Layer 1</title>
<g id="layer1">
<path fill="#784421" fill-rule="nonzero" stroke="#000000" stroke-width="3.287356" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path2383" d="m277.681335,190.614655c0.018036,60.508072 -49.028381,109.569092 -109.536438,109.569092c-60.508072,0 -109.554497,-49.06102 -109.536453,-109.569092c-0.018044,-60.508057 49.028381,-109.569122 109.536453,-109.569122c60.508057,0 109.554474,49.061066 109.536438,109.569122z"/>
<path fill="#000000" fill-opacity="0.313725" fill-rule="nonzero" stroke-width="4" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3221" d="m271.060486,153.036011c0.076782,1.68158 0.125,3.393311 0.125,5.09375c0,60.464294 -49.06694,109.53125 -109.531235,109.53125c-47.262177,0 -87.591187,-29.999268 -102.9375,-71.96875c2.653503,58.112183 50.673645,104.46875 109.4375,104.46875c60.464294,0 109.531235,-49.066956 109.531235,-109.53125c0,-13.202209 -2.338135,-25.870056 -6.625,-37.59375z"/>
<path fill="#502d16" fill-rule="nonzero" stroke="#000000" stroke-width="4" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3172" d="m176.326309,301.186523c0,9.090759 14.665344,9.381348 32.735107,9.381348c18.069794,0 32.735138,0.693481 32.735138,-9.381348c0,-17.715942 -14.665344,-32.094116 -32.735138,-32.094116c-18.069763,0 -32.735107,14.378174 -32.735107,32.094116z"/>
<path fill="#000000" fill-opacity="0.313725" fill-rule="nonzero" stroke-width="4" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3236" d="m252.781265,166.743103c0,50.599854 -55.121216,91.666382 -81.837799,91.666382c-25.718262,0 -81.83783,-41.066528 -81.83783,-91.666382c0,-50.599854 24.625031,-91.666382 81.83783,-91.666382c57.222565,0 81.837799,41.066528 81.837799,91.666382z"/>
<path fill="#000000" fill-opacity="0.313725" fill-rule="evenodd" stroke="#000000" stroke-width="4" stroke-linejoin="round" stroke-miterlimit="4" id="path3174" d="m221.65184,309.966919c0,-16.367554 -9.44281,-26.439941 -9.44281,-26.439941c0,0 3.77713,13.849487 3.147614,26.439941l6.295197,0z"/>
<path fill="#593218" fill-rule="nonzero" stroke="#000000" stroke-width="4" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3155" d="m165.484268,60.321564c-30.129242,0.683929 -50.931671,13.052826 -63.625,32.20285c-24.064423,-2.89859 -44.062485,26.462311 -44.062485,26.462311c0,0 18.891129,10.580811 32.062485,1.715149c-2.399231,9.757996 -3.5625,20.280457 -3.5625,31.257721c0,50.599915 56.125519,91.673096 81.84375,91.673096c26.716614,0 81.84375,-41.073181 81.84375,-91.673096c0,-10.977264 -1.163879,-21.499725 -3.5625,-31.257721c13.171371,8.865662 32.031235,-1.715149 32.031235,-1.715149c0,0 -19.971191,-29.351685 -44.031235,-26.462311c-13.061646,-19.70813 -34.707672,-32.20285 -66.28125,-32.20285c-0.893951,0 -1.778168,-0.019928 -2.65625,0z"/>
|
||||
<path fill="#ba8d73" fill-rule="evenodd" stroke="#000000" stroke-width="5" stroke-linejoin="round" stroke-miterlimit="4" id="path2399" d="m199.109268,64.078003c22.642822,-2.369171 34.040314,-7.446167 39.5,-12c17.754196,-14.808411 11.325348,-28.549896 21.999985,-25.5c10.5,3 10.476746,14.694916 17.5,18c8.5,4 -2.037109,-35.167847 6.5,-32.5c8,2.5 8.971863,22.345093 21,27.5c10.5,4.5 -6.609009,-46.582703 8,-35.5c14.5,11 35,55 1,67c-34,12 -84.44664,15.486877 -100.999985,15c-17,-0.5 -32.659882,-20.099854 -14.5,-22z"/>
|
||||
<path fill="#ffffff" fill-rule="nonzero" stroke="#000000" stroke-width="5.88889" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3157" d="m164.516037,130.146606c0.003738,12.518936 -10.143814,22.66951 -22.66275,22.66951c-12.518936,0 -22.666481,-10.150574 -22.66275,-22.66951c-0.003731,-12.518921 10.143814,-22.669495 22.66275,-22.669495c12.518936,0 22.666489,10.150574 22.66275,22.669495z"/>
|
||||
<path fill="#ffffff" fill-rule="nonzero" stroke="#000000" stroke-width="5.88889" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3159" d="m217.249039,130.146606c0.003738,12.518936 -10.143814,22.66951 -22.66275,22.66951c-12.518936,0 -22.666489,-10.150574 -22.66275,-22.66951c-0.003738,-12.518921 10.143814,-22.669495 22.66275,-22.669495c12.518936,0 22.666489,10.150574 22.66275,22.669495z"/>
|
||||
<path fill="#000000" fill-rule="nonzero" stroke-width="23.666691" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3161" d="m197.103683,134.14621c0.001236,4.172989 -3.381271,7.556519 -7.554245,7.556519c-4.172989,0 -7.555511,-3.38353 -7.55426,-7.556519c-0.001251,-4.172974 3.381271,-7.556488 7.55426,-7.556488c4.172974,0 7.555481,3.383514 7.554245,7.556488z"/>
|
||||
<path fill="#000000" fill-rule="nonzero" stroke-width="47.666691" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3163" d="m155.073685,134.14621c0.001236,4.172989 -3.381271,7.556519 -7.554245,7.556519c-4.172989,0 -7.555511,-3.38353 -7.55426,-7.556519c-0.001251,-4.172974 3.381271,-7.556488 7.55426,-7.556488c4.172974,0 7.555481,3.383514 7.554245,7.556488z"/>
|
||||
<path fill="#000000" fill-opacity="0.313725" fill-rule="nonzero" stroke-width="4" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3231" d="m246.703018,121.926971l-1.875,2.975281c0.248901,3.444824 0.34375,6.947571 0.34375,10.500916c0,50.599884 -55.127136,91.673065 -81.84375,91.673065c-20.705811,0 -61.09671,-26.608643 -76.0625,-63.390564c7.615814,45.079773 57.171906,79.947021 80.875,79.947021c26.716614,0 81.84375,-41.073242 81.84375,-91.673096c0,-10.51944 -1.075378,-20.621277 -3.28125,-30.032623z"/>
|
||||
<path fill="#502d16" fill-rule="nonzero" stroke="#000000" stroke-width="4" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3329" d="m161.328506,301.186523c0,9.090759 -14.665344,9.381348 -32.735107,9.381348c-18.069794,0 -32.735138,0.693481 -32.735138,-9.381348c0,-17.715942 14.665344,-32.094116 32.735138,-32.094116c18.069763,0 32.735107,14.378174 32.735107,32.094116z"/>
|
||||
<path fill="#000000" fill-opacity="0.313725" fill-rule="evenodd" stroke="#000000" stroke-width="4" stroke-linejoin="round" stroke-miterlimit="4" id="path3331" d="m116.002975,309.966919c0,-16.367554 9.44281,-26.439941 9.44281,-26.439941c0,0 -3.77713,13.849487 -3.147614,26.439941l-6.295197,0z"/>
|
||||
<path fill="#000000" fill-rule="nonzero" stroke-width="47.666691" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path2398" d="m184.74147,206.277954c-2.292725,3.973083 -6.500244,5.838562 -9.396759,4.16626c-2.896545,-1.672302 -3.384735,-6.248901 -1.090302,-10.221008c2.292725,-3.973083 6.500275,-5.838562 9.39679,-4.16626c2.896545,1.672302 3.384705,6.248901 1.090271,10.221008z"/>
|
||||
<path fill="#000000" fill-rule="nonzero" stroke-width="47.666691" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path2412" d="m152.00441,206.277954c2.292725,3.973083 6.500275,5.838562 9.39679,4.16626c2.896545,-1.672302 3.384705,-6.248901 1.090271,-10.221008c-2.292725,-3.973083 -6.500244,-5.838562 -9.39679,-4.16626c-2.896515,1.672302 -3.384705,6.248901 -1.090271,10.221008z"/>
|
||||
<path fill="#ad7e61" fill-rule="evenodd" stroke="#000000" stroke-width="5" stroke-linejoin="round" stroke-miterlimit="4" id="path3171" d="m136.512283,64.078003c-22.642792,-2.369171 -34.040283,-7.446167 -39.5,-12c-17.754181,-14.808411 -11.325348,-28.549896 -22,-25.5c-10.5,3 -10.476715,14.694916 -17.5,18c-8.5,4 2.037125,-35.167847 -6.5,-32.5c-8,2.5 -8.971848,22.345093 -21,27.5c-10.5,4.5 6.609024,-46.582703 -8,-35.5c-14.5,11 -35,55 -1,67c34,12 84.446655,15.486877 101,15c17,-0.5 32.659882,-20.099854 14.5,-22z"/>
|
||||
</g>
|
||||
</g>
|
||||
</svg>
|
|
@ -1,24 +0,0 @@
|
|||
<svg xmlns="http://www.w3.org/2000/svg" width="335.62" height="312.58">
|
||||
<title>
|
||||
Layer 1
|
||||
</title>
|
||||
<g id="layer1">
|
||||
<path fill="#ffffff" fill-rule="nonzero" stroke="#000000" stroke-width="3.29" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path2383" d="m277.68 190.61c0.02 60.51-49.03 109.57-109.54 109.57 -60.51 0-109.55-49.06-109.54-109.57 -0.02-60.51 49.03-109.57 109.54-109.57 60.51 0 109.55 49.06 109.54 109.57zM271.06 153.04 271.06 153.04 271.06 153.04"/>
|
||||
<path fill="#000000" fill-opacity="0.31" fill-rule="nonzero" stroke-width="4" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3221" d="m271.06 153.04c0.08 1.68 0.13 3.39 0.13 5.09 0 60.46-49.07 109.53-109.53 109.53 -47.26 0-87.59-30-102.94-71.97 2.65 58.11 50.67 104.47 109.44 104.47 60.46 0 109.53-49.07 109.53-109.53 0-13.2-2.34-25.87-6.62-37.59z"/>
|
||||
<path fill="#ffffff" fill-rule="nonzero" stroke="#000000" stroke-width="4" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3172" d="m176.33 301.19c0 9.09 14.67 9.38 32.74 9.38 18.07 0 32.74 0.69 32.74-9.38 0-17.72-14.67-32.09-32.74-32.09 -18.07 0-32.74 14.38-32.74 32.09z"/>
|
||||
<path fill="#000000" fill-opacity="0.31" fill-rule="nonzero" stroke-width="4" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3236" d="m252.78 166.74c0 50.6-55.12 91.67-81.84 91.67 -25.72 0-81.84-41.07-81.84-91.67 0-50.6 24.63-91.67 81.84-91.67 57.22 0 81.84 41.07 81.84 91.67z"/>
|
||||
<path fill="#000000" fill-opacity="0.31" fill-rule="evenodd" stroke="#000000" stroke-width="4" stroke-linejoin="round" stroke-miterlimit="4" id="path3174" d="m221.65 309.97c0-16.37-9.44-26.44-9.44-26.44 0 0 3.78 13.85 3.15 26.44l6.3 0z"/>
|
||||
<path fill="#ffffff" fill-rule="nonzero" stroke="#000000" stroke-width="4" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3155" d="m165.48 60.32c-30.13 0.68-50.93 13.05-63.62 32.2 -24.06-2.9-44.06 26.46-44.06 26.46 0 0 18.89 10.58 32.06 1.72 -2.4 9.76-3.56 20.28-3.56 31.26 0 50.6 56.13 91.67 81.84 91.67 26.72 0 81.84-41.07 81.84-91.67 0-10.98-1.16-21.5-3.56-31.26 13.17 8.87 32.03-1.72 32.03-1.72 0 0-19.97-29.35-44.03-26.46 -13.06-19.71-34.71-32.2-66.28-32.2 -0.89 0-1.78-0.02-2.66 0z"/>
|
||||
<path fill="#ffffff" fill-rule="evenodd" stroke="#000000" stroke-width="5" stroke-linejoin="round" stroke-miterlimit="4" id="path2399" d="m199.11 64.08c22.64-2.37 34.04-7.45 39.5-12 17.75-14.81 11.33-28.55 22-25.5 10.5 3 10.48 14.69 17.5 18 8.5 4-2.04-35.17 6.5-32.5 8 2.5 8.97 22.35 21 27.5 10.5 4.5-6.61-46.58 8-35.5 14.5 11 35 55 1 67 -34 12-84.45 15.49-101 15 -17-0.5-32.66-20.1-14.5-22z"/>
|
||||
<path fill="#ffffff" fill-rule="nonzero" stroke="#000000" stroke-width="5.89" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3157" d="m164.52 130.15c0 12.52-10.14 22.67-22.66 22.67 -12.52 0-22.67-10.15-22.66-22.67 0-12.52 10.14-22.67 22.66-22.67 12.52 0 22.67 10.15 22.66 22.67z"/>
|
||||
<path fill="#ffffff" fill-rule="nonzero" stroke="#000000" stroke-width="5.89" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3159" d="m217.25 130.15c0 12.52-10.14 22.67-22.66 22.67 -12.52 0-22.67-10.15-22.66-22.67 0-12.52 10.14-22.67 22.66-22.67 12.52 0 22.67 10.15 22.66 22.67z"/>
|
||||
<path fill="#000000" fill-rule="nonzero" stroke-width="23.67" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3161" d="m197.1 134.15c0 4.17-3.38 7.56-7.55 7.56 -4.17 0-7.56-3.38-7.55-7.56 0-4.17 3.38-7.56 7.55-7.56 4.17 0 7.56 3.38 7.55 7.56z"/>
|
||||
<path fill="#000000" fill-rule="nonzero" stroke-width="47.67" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3163" d="m155.07 134.15c0 4.17-3.38 7.56-7.55 7.56 -4.17 0-7.56-3.38-7.55-7.56 0-4.17 3.38-7.56 7.55-7.56 4.17 0 7.56 3.38 7.55 7.56z"/>
|
||||
<path fill="#000000" fill-opacity="0.31" fill-rule="nonzero" stroke-width="4" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3231" d="m246.7 121.93l-1.87 2.98c0.25 3.44 0.34 6.95 0.34 10.5 0 50.6-55.13 91.67-81.84 91.67 -20.71 0-61.1-26.61-76.06-63.39 7.62 45.08 57.17 79.95 80.88 79.95 26.72 0 81.84-41.07 81.84-91.67 0-10.52-1.08-20.62-3.28-30.03z"/>
|
||||
<path fill="#ffffff" fill-rule="nonzero" stroke="#000000" stroke-width="4" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path3329" d="m161.33 301.19c0 9.09-14.67 9.38-32.74 9.38 -18.07 0-32.74 0.69-32.74-9.38 0-17.72 14.67-32.09 32.74-32.09 18.07 0 32.74 14.38 32.74 32.09z"/>
|
||||
<path fill="#000000" fill-opacity="0.31" fill-rule="evenodd" stroke="#000000" stroke-width="4" stroke-linejoin="round" stroke-miterlimit="4" id="path3331" d="m116 309.97c0-16.37 9.44-26.44 9.44-26.44 0 0-3.78 13.85-3.15 26.44l-6.3 0z"/>
|
||||
<path fill="#000000" fill-rule="nonzero" stroke-width="47.67" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path2398" d="m184.74 206.28c-2.29 3.97-6.5 5.84-9.4 4.17 -2.9-1.67-3.38-6.25-1.09-10.22 2.29-3.97 6.5-5.84 9.4-4.17 2.9 1.67 3.38 6.25 1.09 10.22z"/>
|
||||
<path fill="#000000" fill-rule="nonzero" stroke-width="47.67" stroke-linecap="round" stroke-linejoin="round" stroke-miterlimit="4" id="path2412" d="m152 206.28c2.29 3.97 6.5 5.84 9.4 4.17 2.9-1.67 3.38-6.25 1.09-10.22 -2.29-3.97-6.5-5.84-9.4-4.17 -2.9 1.67-3.38 6.25-1.09 10.22z"/>
|
||||
<path fill="#ffffff" fill-rule="evenodd" stroke="#000000" stroke-width="5" stroke-linejoin="round" stroke-miterlimit="4" id="path3171" d="m136.51 64.08c-22.64-2.37-34.04-7.45-39.5-12 -17.75-14.81-11.33-28.55-22-25.5 -10.5 3-10.48 14.69-17.5 18 -8.5 4 2.04-35.17-6.5-32.5 -8 2.5-8.97 22.35-21 27.5 -10.5 4.5 6.61-46.58-8-35.5 -14.5 11-35 55-1 67 34 12 84.45 15.49 101 15 17-0.5 32.66-20.1 14.5-22z"/>
|
||||
</g>
|
||||
</svg>
|
|
@ -1,99 +0,0 @@
|
|||
|
||||
from __future__ import print_function
|
||||
|
||||
import subprocess
|
||||
import sys
|
||||
import os
|
||||
import fileinput
|
||||
import fnmatch
|
||||
import warnings
|
||||
|
||||
BASE_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
ROOT = os.path.abspath(os.path.join(BASE_DIR, "..", ".."))
|
||||
|
||||
sys.path.insert(0, ROOT)
|
||||
sys.path.insert(0, BASE_DIR)
|
||||
|
||||
# This is required for ReadTheDocs.org, but isn't a bad idea anyway.
|
||||
os.environ['DJANGO_SETTINGS_MODULE'] = 'openstack_dashboard.settings'
|
||||
|
||||
# -- General configuration ----------------------------------------------------
|
||||
|
||||
# Add any Sphinx extension module names here, as strings. They can be
|
||||
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
|
||||
extensions = ['sphinx.ext.autodoc',
|
||||
'oslosphinx']
|
||||
|
||||
# autodoc generation is a bit aggressive and a nuisance when doing heavy
|
||||
# text edit cycles.
|
||||
# execute "export SPHINX_DEBUG=1" in your terminal to disable it
|
||||
|
||||
# A list of glob-style patterns that should be excluded when looking for source
|
||||
# files.
|
||||
exclude_patterns = [
|
||||
'api/tests.*', # avoid generating docs from the tests
|
||||
'api/openstack.common.cache._backends.*',
|
||||
]
|
||||
|
||||
# Prune the excluded patterns from the autoindex
|
||||
for line in fileinput.input('api/autoindex.rst', inplace=True):
|
||||
found = False
|
||||
for pattern in exclude_patterns:
|
||||
if fnmatch.fnmatch(line, '*' + pattern[4:]):  # drop the 'api/' prefix
|
||||
found = True
|
||||
if not found:
|
||||
print(line.rstrip())
|
||||
|
||||
# Add any paths that contain templates here, relative to this directory.
|
||||
templates_path = ['_templates']
|
||||
|
||||
# The suffix of source filenames.
|
||||
source_suffix = '.rst'
|
||||
|
||||
# The master toctree document.
|
||||
master_doc = 'index'
|
||||
|
||||
# General information about the project.
|
||||
project = 'Oslo'
|
||||
copyright = '2012, OpenStack Foundation'
|
||||
|
||||
# If true, '()' will be appended to :func: etc. cross-reference text.
|
||||
add_function_parentheses = True
|
||||
|
||||
# If true, the current module name will be prepended to all description
|
||||
# unit titles (such as .. function::).
|
||||
add_module_names = True
|
||||
|
||||
# The name of the Pygments (syntax highlighting) style to use.
|
||||
pygments_style = 'sphinx'
|
||||
|
||||
# -- Options for HTML output --------------------------------------------------
|
||||
|
||||
# The theme to use for HTML and HTML Help pages. Major themes that come with
|
||||
# Sphinx are currently 'default' and 'sphinxdoc'.
|
||||
#html_theme_path = ["."]
|
||||
#html_theme = '_theme'
|
||||
#html_static_path = ['static']
|
||||
|
||||
# Output file base name for HTML help builder.
|
||||
htmlhelp_basename = '%sdoc' % project
|
||||
|
||||
try:
|
||||
git_cmd = ["git", "log", "--pretty=format:%ad, commit %h", "--date=local",
|
||||
"-n1"]
|
||||
html_last_updated_fmt = subprocess.Popen(git_cmd,
|
||||
stdout=subprocess.PIPE).\
|
||||
communicate()[0]
|
||||
except Exception:
|
||||
warnings.warn('Cannot get last updated time from git repository. '
|
||||
'Not setting "html_last_updated_fmt".')
|
||||
|
||||
# Grouping the document tree into LaTeX files. List of tuples
|
||||
# (source start file, target name, title, author, documentclass
|
||||
# [howto/manual]).
|
||||
latex_documents = [
|
||||
('index',
|
||||
'%s.tex' % project,
|
||||
'%s Documentation' % project,
|
||||
'OpenStack Foundation', 'manual'),
|
||||
]
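The conf.py above prunes excluded modules out of `api/autoindex.rst` by rewriting the file in place with `fileinput`. The filtering rule it applies can be sketched in isolation like this (the sample autoindex lines are hypothetical, chosen only to illustrate the matching):

```python
import fnmatch

# Same glob patterns as exclude_patterns in conf.py; the loop strips the
# leading 'api/' with pattern[4:] before matching.
exclude_patterns = [
    'api/tests.*',
    'api/openstack.common.cache._backends.*',
]


def keep_line(line, patterns=exclude_patterns):
    """Return True if the autoindex line matches no excluded pattern."""
    return not any(
        fnmatch.fnmatch(line, '*' + pattern[4:]) for pattern in patterns
    )


# Hypothetical autoindex entries, for illustration only.
lines = [
    '   openstack.common.log',
    '   tests.unit.test_log',
    '   openstack.common.cache._backends.memory',
]
kept = [line for line in lines if keep_line(line)]
```

Only the first entry survives: the other two match the `tests.*` and `cache._backends.*` globs and are dropped from the generated index.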
|
|
@ -1,5 +0,0 @@
|
|||
==============
|
||||
Contributing
|
||||
==============
|
||||
|
||||
.. include:: ../../CONTRIBUTING.rst
|
|
@ -1 +0,0 @@
|
|||
.. include:: ../../HACKING.rst
|
|
@ -1,28 +0,0 @@
|
|||
OpenStack Common Code
|
||||
=====================
|
||||
|
||||
Code shared across OpenStack.
|
||||
|
||||
Contents:
|
||||
|
||||
.. toctree::
|
||||
:maxdepth: 1
|
||||
|
||||
contributing
|
||||
hacking
|
||||
|
||||
Code Documentation
|
||||
==================
|
||||
|
||||
.. toctree::
|
||||
:maxdepth: 1
|
||||
|
||||
api/autoindex
|
||||
|
||||
Indices and tables
|
||||
==================
|
||||
|
||||
* :ref:`genindex`
|
||||
* :ref:`modindex`
|
||||
* :ref:`search`
|
||||
|
|
@ -1,2 +0,0 @@
|
|||
This directory is just here to hold the setup.py used to register the
|
||||
Oslo name on PyPI, to avoid project name collisions.
|
|
@ -1,3 +0,0 @@
|
|||
"""Declare namespace package"""
|
||||
|
||||
__import__('pkg_resources').declare_namespace(__name__)
|
|
@ -1,42 +0,0 @@
|
|||
#!/usr/bin/env python
|
||||
# Copyright (c) 2012 OpenStack Foundation.
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
# implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
import setuptools
|
||||
|
||||
setuptools.setup(
|
||||
name='oslo',
|
||||
version='1',
|
||||
description="Namespace for common components for OpenStack",
|
||||
long_description="Namespace for common components for OpenStack",
|
||||
classifiers=[
|
||||
'Development Status :: 4 - Beta',
|
||||
'License :: OSI Approved :: Apache Software License',
|
||||
'Operating System :: POSIX :: Linux',
|
||||
'Programming Language :: Python :: 2.6',
|
||||
'Programming Language :: Python :: 2.7',
|
||||
'Environment :: No Input/Output (Daemon)',
|
||||
'Environment :: OpenStack',
|
||||
],
|
||||
keywords='openstack',
|
||||
author='OpenStack',
|
||||
author_email='openstack@lists.openstack.org',
|
||||
url='http://www.openstack.org/',
|
||||
license='Apache Software License',
|
||||
zip_safe=True,
|
||||
packages=setuptools.find_packages(exclude=['ez_setup',
|
||||
'examples', 'tests']),
|
||||
namespace_packages=['oslo'],
|
||||
)
|
setup.cfg
|
@ -1,48 +0,0 @@
|
|||
[metadata]
|
||||
name = openstack.common
|
||||
summary = OpenStack Common Libraries
|
||||
description-file =
|
||||
README.rst
|
||||
author = OpenStack
|
||||
author-email = openstack-dev@lists.openstack.org
|
||||
home-page = http://www.openstack.org/
|
||||
classifier =
|
||||
Environment :: OpenStack
|
||||
Intended Audience :: Information Technology
|
||||
Intended Audience :: System Administrators
|
||||
License :: OSI Approved :: Apache Software License
|
||||
Operating System :: POSIX :: Linux
|
||||
Programming Language :: Python
|
||||
Programming Language :: Python :: 2
|
||||
Programming Language :: Python :: 2.7
|
||||
Programming Language :: Python :: 3
|
||||
Programming Language :: Python :: 3.4
|
||||
|
||||
[files]
|
||||
packages =
|
||||
openstack
|
||||
namespace_packages =
|
||||
openstack
|
||||
|
||||
[global]
|
||||
setup-hooks =
|
||||
pbr.hooks.setup_hook
|
||||
|
||||
[nosetests]
|
||||
# NOTE(jkoelker) To run the test suite under nose install the following
|
||||
# coverage http://pypi.python.org/pypi/coverage
|
||||
# tissue http://pypi.python.org/pypi/tissue (pep8 checker)
|
||||
# openstack-nose https://github.com/jkoelker/openstack-nose
|
||||
verbosity=2
|
||||
|
||||
[build_sphinx]
|
||||
source-dir = doc/source
|
||||
build-dir = doc/build
|
||||
all_files = 1
|
||||
|
||||
[upload_sphinx]
|
||||
upload-dir = doc/build/html
|
||||
|
||||
[pbr]
|
||||
warnerrors = true
|
||||
autodoc_index_modules = 1
|
setup.py
|
@ -1,29 +0,0 @@
|
|||
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
# implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
# THIS FILE IS MANAGED BY THE GLOBAL REQUIREMENTS REPO - DO NOT EDIT
|
||||
import setuptools
|
||||
|
||||
# In python < 2.7.4, a lazy loading of package `pbr` will break
|
||||
# setuptools if some other modules registered functions in `atexit`.
|
||||
# solution from: http://bugs.python.org/issue15881#msg170215
|
||||
try:
|
||||
import multiprocessing # noqa
|
||||
except ImportError:
|
||||
pass
|
||||
|
||||
setuptools.setup(
|
||||
setup_requires=['pbr>=1.8'],
|
||||
pbr=True)
|
|
@ -1,90 +0,0 @@
|
|||
#!/bin/bash
|
||||
|
||||
# This requires gitinspector to be installed
|
||||
# it can be gotten from:
|
||||
#
|
||||
# - https://pypi.python.org/pypi/gitinspector/0.3.2
|
||||
# - https://github.com/ejwa/gitinspector
|
||||
|
||||
# Check out a new copy of a repository and set it up to be a useful
|
||||
# local copy.
|
||||
function clone_new {
|
||||
typeset repo="$1"
|
||||
typeset url="$2"
|
||||
echo
|
||||
echo "Cloning $repo"
|
||||
git clone $url $repo
|
||||
return 0
|
||||
}
|
||||
|
||||
# Determine the current branch of a local repository.
|
||||
function current_branch {
|
||||
(cd $1 && git rev-parse --abbrev-ref HEAD)
|
||||
}
|
||||
|
||||
# Update an existing copy of a repository, including all remotes and
|
||||
# pulling into the local master branch if we're on that branch
|
||||
# already.
|
||||
function update_existing {
|
||||
typeset repo="$1"
|
||||
echo
|
||||
echo "Updating $repo"
|
||||
(cd $repo && git remote update)
|
||||
RC=$?
|
||||
if [ $RC -ne 0 ]
|
||||
then
|
||||
return $RC
|
||||
fi
|
||||
# Only run git pull for repos where I'm not working in a branch.
|
||||
typeset b=$(current_branch $repo)
|
||||
if [ $b == "master" ]
|
||||
then
|
||||
if (cd $repo && git diff --exit-code >/dev/null)
|
||||
then
|
||||
(cd $repo && git pull)
|
||||
else
|
||||
echo "Skipping pull for master branch with local changes"
|
||||
(cd $repo && git status)
|
||||
fi
|
||||
else
|
||||
echo "Skipping pull for branch $b"
|
||||
branched="$branched $repo"
|
||||
fi
|
||||
}
|
||||
|
||||
# Process a single repository found in gerrit, determining whether it
|
||||
# exists locally already or not.
|
||||
function get_one_repo {
|
||||
typeset repo="$1"
|
||||
typeset url="$2"
|
||||
typeset pardir=$(dirname $repo)
|
||||
if [ ! -z "$pardir" ]
|
||||
then
|
||||
mkdir -p $pardir
|
||||
fi
|
||||
if [ ! -d $repo ] ; then
|
||||
clone_new $repo $url
|
||||
else
|
||||
update_existing $repo
|
||||
fi
|
||||
RC=$?
|
||||
return $RC
|
||||
}
|
||||
|
||||
current_dir=$(pwd)
|
||||
base="git://git.openstack.org"
|
||||
projects=$(ssh review.openstack.org -p 29418 gerrit ls-projects | grep -v 'attic' | grep "oslo")
|
||||
projects="$projects openstack/taskflow openstack/tooz openstack/cliff openstack/debtcollector"
|
||||
projects="$projects openstack/futurist openstack/stevedore openstack-dev/cookiecutter"
|
||||
projects="$projects openstack/automaton"
|
||||
|
||||
for repo in $projects; do
|
||||
get_one_repo "$repo" "$base/$repo"
|
||||
RC=$?
|
||||
if [ $RC -ne 0 ] ; then
|
||||
echo "Unable to obtain $repo"
|
||||
exit 1
|
||||
fi
|
||||
done
|
||||
|
||||
python new_core_analyzer.py $projects > "${current_dir}/oslo_reports.txt"
|
|
@ -1,46 +0,0 @@
|
|||
#!/bin/bash
|
||||
#
|
||||
# Apply the Oslo cookiecutter template to an existing directory,
|
||||
# usually as part of the graduation process.
|
||||
|
||||
COOKIECUTTER_TEMPLATE_REPO=${COOKIECUTTER_TEMPLATE_REPO:-https://git.openstack.org/openstack-dev/oslo-cookiecutter}
|
||||
|
||||
function usage {
|
||||
echo "Usage: apply_cookiecutter.sh newlib" 1>&2
|
||||
}
|
||||
|
||||
if [ $# -lt 1 ]
|
||||
then
|
||||
usage
|
||||
exit 1
|
||||
fi
|
||||
|
||||
new_lib="$1"
|
||||
|
||||
if [[ $new_lib =~ oslo.* ]]
|
||||
then
|
||||
echo "You probably don't want 'oslo' in the lib name." 1>&2
|
||||
exit 2
|
||||
fi
|
||||
|
||||
# Set up a virtualenv with cookiecutter
|
||||
tmpdir=$(mktemp -d -t oslo-cookiecutter.XXXX)
|
||||
echo "Installing cookiecutter..."
|
||||
venv=$tmpdir/venv
|
||||
virtualenv $venv
|
||||
$venv/bin/python -m pip install cookiecutter
|
||||
cookiecutter=$venv/bin/cookiecutter
|
||||
|
||||
# Apply the cookiecutter template by building out a fresh copy using
|
||||
# the name chosen for this library and then copying any parts of the
|
||||
# results into the local tree, without overwriting files that already
|
||||
# exist.
|
||||
git clone $COOKIECUTTER_TEMPLATE_REPO $tmpdir/oslo-cookiecutter
|
||||
|
||||
# FIXME(dhellmann): We need a better non-interactive mode for cookiecutter
|
||||
(cd $tmpdir && $cookiecutter $tmpdir/oslo-cookiecutter) <<EOF
|
||||
$new_lib
|
||||
openstack
|
||||
oslo.${new_lib} library
|
||||
EOF
|
||||
rsync -a --verbose --ignore-existing $tmpdir/oslo.${new_lib}/ .
|
|
@ -1,16 +0,0 @@
|
|||
#!/bin/bash
|
||||
#
|
||||
# Process the dashboard files and emit the URLs
|
||||
|
||||
creator_dir=$1
|
||||
dashboard_dir=$2
|
||||
|
||||
cd $creator_dir
|
||||
|
||||
for f in $dashboard_dir/*.dash
|
||||
do
|
||||
echo '----------------------------------------'
|
||||
echo $(basename $f .dash)
|
||||
echo '----------------------------------------'
|
||||
./gerrit-dash-creator $f
|
||||
done
|
|
@ -1,36 +0,0 @@
|
|||
#!/bin/bash
|
||||
#
|
||||
# Script to replace imports from the 'oslo' namespace package with the
|
||||
# appropriate alternative in the dist-specific packages.
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
|
||||
name=$(python setup.py --name)
|
||||
dir=${1:-$name}
|
||||
|
||||
echo "Updating $dir"
|
||||
sed -i \
|
||||
-e 's/from oslo\./from oslo_/g' \
|
||||
-e 's/import oslo\./import oslo_/g' \
|
||||
-e 's/from oslo import i18n/import oslo_i18n as i18n/g' \
|
||||
-e 's/from oslo import messaging/import oslo_messaging as messaging/g' \
|
||||
-e 's/from oslo import config/import oslo_config as config/g' \
|
||||
-e 's/from oslo import serialization/import oslo_serialization as serialization/g' \
|
||||
-e 's/from oslo import utils/import oslo_utils as utils/g' \
|
||||
-e 's/oslo\.i18n\.TranslatorFactory/oslo_i18n.TranslatorFactory/g' \
|
||||
$(find $dir -name '*.py' | grep -v "$name/tests/unit/test_hacking.py")
|
||||
|
||||
set -x
|
||||
|
||||
git grep 'from oslo import'
|
||||
git grep 'oslo\.'
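The sed expressions above map the old `oslo` namespace-package imports onto the dist-specific `oslo_*` packages. The same rewrite can be expressed in Python; this is a sketch mirroring the sed rules on strings, not a drop-in replacement for the script:

```python
import re

# Ordered (pattern, replacement) pairs mirroring the sed expressions above.
RULES = [
    (r'from oslo import i18n', 'import oslo_i18n as i18n'),
    (r'from oslo import messaging', 'import oslo_messaging as messaging'),
    (r'from oslo import config', 'import oslo_config as config'),
    (r'from oslo import serialization', 'import oslo_serialization as serialization'),
    (r'from oslo import utils', 'import oslo_utils as utils'),
    (r'from oslo\.', 'from oslo_'),
    (r'import oslo\.', 'import oslo_'),
    (r'oslo\.i18n\.TranslatorFactory', 'oslo_i18n.TranslatorFactory'),
]


def rewrite(source):
    """Apply each namespace-rewrite rule to a source string in order."""
    for pattern, replacement in RULES:
        source = re.sub(pattern, replacement, source)
    return source
```

For example, `rewrite('from oslo.config import cfg')` yields `'from oslo_config import cfg'`, and the special-cased `from oslo import messaging` form becomes `import oslo_messaging as messaging`.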
|
|
@ -1,196 +0,0 @@
|
|||
#!/bin/bash
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License"); you may
|
||||
# not use this file except in compliance with the License. You may obtain
|
||||
# a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
# License for the specific language governing permissions and limitations
|
||||
# under the License.
|
||||
#
|
||||
# Check out every active repository from git.openstack.org. For new
|
||||
# copies, set up git-review. For any existing copies, update their
|
||||
# remotes and pull changes up to the local master.
|
||||
#
|
||||
# This script is based on prior art from mordred on the openstack-dev
|
||||
# mailing list.
|
||||
# http://lists.openstack.org/pipermail/openstack-dev/2013-October/017532.html
|
||||
#
|
||||
# Usage:
|
||||
#
|
||||
# Check out everything under the current directory:
|
||||
# $ clone_openstack.sh
|
||||
#
|
||||
# Check out a specific project (you can list multiple names):
|
||||
# $ clone_openstack.sh openstack/oslo-incubator
|
||||
#
|
||||
|
||||
trouble_with=""
|
||||
branched=""
|
||||
|
||||
# Figure out if git-hooks is installed and should be used.
|
||||
# https://github.com/icefox/git-hooks
|
||||
which git-hooks > /dev/null 2>&1
|
||||
USE_GIT_HOOKS=$?
|
||||
|
||||
# Users can set INCLUDE_STACKFORGE=1 if they want to always check out
|
||||
# new copies of stackforge projects.
|
||||
INCLUDE_STACKFORGE=${INCLUDE_STACKFORGE:-0}
|
||||
|
||||
# If we have any trouble at all working with a repository, report that
|
||||
# and then record the name for the summary at the end.
|
||||
function track_trouble {
|
||||
if [ $1 -ne 0 ]
|
||||
then
|
||||
echo "Remembering trouble with $2"
|
||||
trouble_with="$trouble_with $2"
|
||||
fi
|
||||
}
|
||||
|
||||
# Determine the current branch of a local repository.
|
||||
function current_branch {
|
||||
(cd $1 && git rev-parse --abbrev-ref HEAD)
|
||||
}
|
||||
|
||||
# Print a summary report for any repositories that had trouble
|
||||
# updating.
|
||||
function report_trouble {
|
||||
if [ ! -z "$trouble_with" ]
|
||||
then
|
||||
echo
|
||||
echo "Had trouble updating:"
|
||||
for r in $trouble_with
|
||||
do
|
||||
echo " $r - $(current_branch $r)"
|
||||
done
|
||||
fi
|
||||
}
|
||||
|
||||
# Print a summary report for any repositories that were not on the
|
||||
# master branch when we updated them.
|
||||
function report_branched {
|
||||
if [ ! -z "$branched" ]
|
||||
then
|
||||
echo
|
||||
echo "Branched repos:"
|
||||
for r in $branched
|
||||
do
|
||||
echo " $r - $(current_branch $r)"
|
||||
done
|
||||
fi
|
||||
}
|
||||
|
||||
# Check out a new copy of a repository and set it up to be a useful
|
||||
# local copy.
|
||||
function clone_new {
|
||||
typeset repo="$1"
|
||||
typeset url="$2"
|
||||
# Ignore stackforge projects unless told otherwise.
|
||||
if [[ $repo =~ ^stackforge/.* ]]
|
||||
then
|
||||
if [ $INCLUDE_STACKFORGE -ne 1 ]
|
||||
then
|
||||
return 0
|
||||
fi
|
||||
fi
|
||||
echo
|
||||
echo "Cloning $repo"
|
||||
git clone $url $repo
|
||||
(cd $repo && git review -s)
|
||||
if [ $USE_GIT_HOOKS -eq 0 ]
|
||||
then
|
||||
echo "Configuring git hooks"
|
||||
(cd $repo && git hooks --install)
|
||||
fi
|
||||
return 0
|
||||
}
|
||||
|
||||
# Update an existing copy of a repository, including all remotes and
|
||||
# pulling into the local master branch if we're on that branch
|
||||
# already.
|
||||
function update_existing {
|
||||
typeset repo="$1"
|
||||
echo
|
||||
echo "Updating $repo"
|
||||
(cd $repo && git remote update)
|
||||
RC=$?
|
||||
if [ $RC -ne 0 ]
|
||||
then
|
||||
return $RC
|
||||
fi
|
||||
# Only run git pull for repos where I'm not working in a branch.
|
||||
typeset b=$(current_branch $repo)
|
||||
if [ $b == "master" ]
|
||||
then
|
||||
if (cd $repo && git diff --exit-code >/dev/null)
|
||||
then
|
||||
(cd $repo && git pull)
|
||||
else
|
||||
echo "Skipping pull for master branch with local changes"
|
||||
(cd $repo && git status)
|
||||
fi
|
||||
else
|
||||
echo "Skipping pull for branch $b"
|
||||
branched="$branched $repo"
|
||||
fi
|
||||
}
|
||||
|
||||
# Process a single repository found in gerrit, determining whether it
|
||||
# exists locally already or not.
|
||||
function get_one_repo {
|
||||
typeset repo="$1"
|
||||
typeset url="$2"
|
||||
typeset pardir=$(dirname $repo)
|
||||
if [ ! -z "$pardir" ]
|
||||
then
|
||||
mkdir -p $pardir
|
||||
fi
|
||||
if [ ! -d $repo ] ; then
|
||||
clone_new $repo $url
|
||||
else
|
||||
update_existing $repo
|
||||
fi
|
||||
RC=$?
|
||||
return $RC
|
||||
}
|
||||
|
||||
# If we are given a list of projects on the command line, we will only
|
||||
# work on those. Otherwise, ask gerrit for the full list of openstack
|
||||
# projects, ignoring the ones in the attic. Stackforge projects are
|
||||
# ignored if they do not exist locally, so we include them in the
|
||||
# output list and check for them when we decide what to do with each
|
||||
# repository.
|
||||
projects="$*"
|
||||
if [ -z "$projects" ]
|
||||
then
|
||||
projects=$(ssh review.openstack.org -p 29418 gerrit ls-projects | grep '^openstack' | grep -v 'attic')
|
||||
RC=$?
|
||||
if [ $RC -ne 0 ]
|
||||
then
|
||||
echo "Unable to obtain a list of projects from gerrit. Check your ssh credentials for review.openstack.org"
|
||||
userid=$(id -un)
|
||||
gerrit_userid=$(git config --get gitreview.username)
|
||||
if [ $userid != $gerrit_userid ]
|
||||
then
|
||||
echo "Identified a possible userid difference between $userid and $gerrit_userid"
|
||||
fi
|
||||
exit $RC
|
||||
fi
|
||||
else
|
||||
# Go ahead and set things up so we will work with stackforge
|
||||
# repositories, in case the caller has specified one on the
|
||||
# command line.
|
||||
INCLUDE_STACKFORGE=1
|
||||
fi
|
||||
|
||||
for repo in $projects; do
|
||||
get_one_repo $repo git://git.openstack.org/$repo
|
||||
track_trouble $? $repo
|
||||
done
|
||||
|
||||
report_branched
|
||||
report_trouble
|
|
@ -1,333 +0,0 @@
#!/usr/bin/env python

# Copyright (c) 2013, Nebula, Inc.
# Copyright 2010 United States Government as represented by the
# Administrator of the National Aeronautics and Space Administration.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
# Colorizer Code is borrowed from Twisted:
# Copyright (c) 2001-2010 Twisted Matrix Laboratories.
#
# Permission is hereby granted, free of charge, to any person obtaining
# a copy of this software and associated documentation files (the
# "Software"), to deal in the Software without restriction, including
# without limitation the rights to use, copy, modify, merge, publish,
# distribute, sublicense, and/or sell copies of the Software, and to
# permit persons to whom the Software is furnished to do so, subject to
# the following conditions:
#
# The above copyright notice and this permission notice shall be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
# LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
# OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
# WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

"""Display a subunit stream through a colorized unittest test runner."""

import heapq
import sys
import unittest

import six
import subunit
import testtools


class _AnsiColorizer(object):
    """Colorizer allows callers to write text in a particular color.

    A colorizer is an object that loosely wraps around a stream, allowing
    callers to write text to the stream in a particular color.

    Colorizer classes must implement C{supported()} and C{write(text, color)}.
    """
    _colors = dict(black=30, red=31, green=32, yellow=33,
                   blue=34, magenta=35, cyan=36, white=37)

    def __init__(self, stream):
        self.stream = stream

    def supported(cls, stream=sys.stdout):
        """Check if the current platform supports coloring terminal output.

        A class method that returns True if the current platform supports
        coloring terminal output using this method. Returns False otherwise.
        """
        if not stream.isatty():
            return False  # auto color only on TTYs
        try:
            import curses
        except ImportError:
            return False
        else:
            try:
                try:
                    return curses.tigetnum("colors") > 2
                except curses.error:
                    curses.setupterm()
                    return curses.tigetnum("colors") > 2
            except Exception:
                # guess false in case of error
                return False
    supported = classmethod(supported)

    def write(self, text, color):
        """Write the given text to the stream in the given color.

        @param text: Text to be written to the stream.

        @param color: A string label for a color. e.g. 'red', 'white'.
        """
        color = self._colors[color]
        self.stream.write('\x1b[%s;1m%s\x1b[0m' % (color, text))


class _Win32Colorizer(object):
    """See _AnsiColorizer docstring."""
    def __init__(self, stream):
        import win32console
        red, green, blue, bold = (win32console.FOREGROUND_RED,
                                  win32console.FOREGROUND_GREEN,
                                  win32console.FOREGROUND_BLUE,
                                  win32console.FOREGROUND_INTENSITY)
        self.stream = stream
        self.screenBuffer = win32console.GetStdHandle(
            win32console.STD_OUT_HANDLE)
        self._colors = {
            'normal': red | green | blue,
            'red': red | bold,
            'green': green | bold,
            'blue': blue | bold,
            'yellow': red | green | bold,
            'magenta': red | blue | bold,
            'cyan': green | blue | bold,
            'white': red | green | blue | bold,
        }

    def supported(cls, stream=sys.stdout):
        try:
            import win32console
            screenBuffer = win32console.GetStdHandle(
                win32console.STD_OUT_HANDLE)
        except ImportError:
            return False
        import pywintypes
        try:
            screenBuffer.SetConsoleTextAttribute(
                win32console.FOREGROUND_RED |
                win32console.FOREGROUND_GREEN |
                win32console.FOREGROUND_BLUE)
        except pywintypes.error:
            return False
        else:
            return True
    supported = classmethod(supported)

    def write(self, text, color):
        color = self._colors[color]
        self.screenBuffer.SetConsoleTextAttribute(color)
        self.stream.write(text)
        self.screenBuffer.SetConsoleTextAttribute(self._colors['normal'])


class _NullColorizer(object):
    """See _AnsiColorizer docstring."""
    def __init__(self, stream):
        self.stream = stream

    def supported(cls, stream=sys.stdout):
        return True
    supported = classmethod(supported)

    def write(self, text, color):
        self.stream.write(text)


def get_elapsed_time_color(elapsed_time):
    if elapsed_time > 1.0:
        return 'red'
    elif elapsed_time > 0.25:
        return 'yellow'
    else:
        return 'green'


class OpenStackTestResult(testtools.TestResult):
    def __init__(self, stream, descriptions, verbosity):
        super(OpenStackTestResult, self).__init__()
        self.stream = stream
        self.showAll = verbosity > 1
        self.num_slow_tests = 10
        self.slow_tests = []  # this is a fixed-sized heap
        self.colorizer = None
        # NOTE(vish): reset stdout for the terminal check
        stdout = sys.stdout
        sys.stdout = sys.__stdout__
        for colorizer in [_Win32Colorizer, _AnsiColorizer, _NullColorizer]:
            if colorizer.supported():
                self.colorizer = colorizer(self.stream)
                break
        sys.stdout = stdout
        self.start_time = None
        self.last_time = {}
        self.results = {}
        self.last_written = None

    def _writeElapsedTime(self, elapsed):
        color = get_elapsed_time_color(elapsed)
        self.colorizer.write("  %.2f" % elapsed, color)

    def _addResult(self, test, *args):
        try:
            name = test.id()
        except AttributeError:
            name = 'Unknown.unknown'
        test_class, test_name = name.rsplit('.', 1)

        elapsed = (self._now() - self.start_time).total_seconds()
        item = (elapsed, test_class, test_name)
        if len(self.slow_tests) >= self.num_slow_tests:
            heapq.heappushpop(self.slow_tests, item)
        else:
            heapq.heappush(self.slow_tests, item)

        self.results.setdefault(test_class, [])
        self.results[test_class].append((test_name, elapsed) + args)
        self.last_time[test_class] = self._now()
        self.writeTests()

    def _writeResult(self, test_name, elapsed, long_result, color,
                     short_result, success):
        if self.showAll:
            self.stream.write('    %s' % str(test_name).ljust(66))
            self.colorizer.write(long_result, color)
            if success:
                self._writeElapsedTime(elapsed)
            self.stream.writeln()
        else:
            self.colorizer.write(short_result, color)

    def addSuccess(self, test):
        super(OpenStackTestResult, self).addSuccess(test)
        self._addResult(test, 'OK', 'green', '.', True)

    def addFailure(self, test, err):
        if test.id() == 'process-returncode':
            return
        super(OpenStackTestResult, self).addFailure(test, err)
        self._addResult(test, 'FAIL', 'red', 'F', False)

    def addError(self, test, err):
        super(OpenStackTestResult, self).addError(test, err)
        self._addResult(test, 'ERROR', 'red', 'E', False)

    def addSkip(self, test, reason=None, details=None):
        super(OpenStackTestResult, self).addSkip(test, reason, details)
        self._addResult(test, 'SKIP', 'blue', 'S', True)

    def startTest(self, test):
        self.start_time = self._now()
        super(OpenStackTestResult, self).startTest(test)

    def writeTestCase(self, cls):
        if not self.results.get(cls):
            return
        if cls != self.last_written:
            self.colorizer.write(cls, 'white')
            self.stream.writeln()
        for result in self.results[cls]:
            self._writeResult(*result)
        del self.results[cls]
        self.stream.flush()
        self.last_written = cls

    def writeTests(self):
        time = self.last_time.get(self.last_written, self._now())
        if not self.last_written or (self._now() - time).total_seconds() > 2.0:
            diff = 3.0
            while diff > 2.0:
                classes = self.results.keys()
                oldest = min(classes, key=lambda x: self.last_time[x])
                diff = (self._now() - self.last_time[oldest]).total_seconds()
                self.writeTestCase(oldest)
        else:
            self.writeTestCase(self.last_written)

    def done(self):
        self.stopTestRun()

    def stopTestRun(self):
        for cls in list(six.iterkeys(self.results)):
            self.writeTestCase(cls)
        self.stream.writeln()
        self.writeSlowTests()

    def writeSlowTests(self):
        # Pare out 'fast' tests
        slow_tests = [item for item in self.slow_tests
                      if get_elapsed_time_color(item[0]) != 'green']
        if slow_tests:
            slow_total_time = sum(item[0] for item in slow_tests)
            slow = ("Slowest %i tests took %.2f secs:"
                    % (len(slow_tests), slow_total_time))
            self.colorizer.write(slow, 'yellow')
            self.stream.writeln()
            last_cls = None
            # sort by name
            for elapsed, cls, name in sorted(slow_tests,
                                             key=lambda x: x[1] + x[2]):
                if cls != last_cls:
                    self.colorizer.write(cls, 'white')
                    self.stream.writeln()
                last_cls = cls
                self.stream.write('    %s' % str(name).ljust(68))
                self._writeElapsedTime(elapsed)
                self.stream.writeln()

    def printErrors(self):
        if self.showAll:
            self.stream.writeln()
        self.printErrorList('ERROR', self.errors)
        self.printErrorList('FAIL', self.failures)

    def printErrorList(self, flavor, errors):
        for test, err in errors:
            self.colorizer.write("=" * 70, 'red')
            self.stream.writeln()
            self.colorizer.write(flavor, 'red')
            self.stream.writeln(": %s" % test.id())
            self.colorizer.write("-" * 70, 'red')
            self.stream.writeln()
            self.stream.writeln("%s" % err)


test = subunit.ProtocolTestCase(sys.stdin, passthrough=None)

if sys.version_info[0:2] <= (2, 6):
    runner = unittest.TextTestRunner(verbosity=2)
else:
    runner = unittest.TextTestRunner(verbosity=2,
                                     resultclass=OpenStackTestResult)

if runner.run(test).wasSuccessful():
    exit_code = 0
else:
    exit_code = 1
sys.exit(exit_code)
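For reference, the elapsed-time thresholds used by `get_elapsed_time_color` in the deleted colorizer can be exercised on their own. This is an illustrative restatement of just that function, so it runs without the subunit/testtools dependencies of the full file:

```python
# Standalone restatement of the colorizer's timing thresholds,
# for illustration only; the original lives in the file above.
def get_elapsed_time_color(elapsed_time):
    if elapsed_time > 1.0:
        return 'red'      # slow: more than one second
    elif elapsed_time > 0.25:
        return 'yellow'   # borderline: more than a quarter second
    else:
        return 'green'    # fast

print(get_elapsed_time_color(0.1),
      get_elapsed_time_color(0.5),
      get_elapsed_time_color(2.0))
# → green yellow red
```

Tests slower than the red threshold also feed the "Slowest tests" report via the fixed-size heap in `_addResult`.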
@ -1,70 +0,0 @@
#!/bin/sh
#
# Filter the history of a git repository to only include the named
# files.

set -e

if [ $# -lt 1 ]; then
    echo "Usage $0 <files to keep>"
    exit 1
fi

set -x

files_to_keep="$@"

# Build the grep pattern for ignoring files that we want to keep
keep_pattern="\($(echo $files_to_keep | sed -e 's/ /\\|/g')\)"
# Prune all other files in every commit
pruner="git ls-files | grep -v \"$keep_pattern\" | git update-index --force-remove --stdin; git ls-files > /dev/stderr"

# Find all first commits with listed files and find a subset of them that
# predates all others

roots=""
for file in $files_to_keep; do
    file_root=$(git rev-list --reverse HEAD -- $file | head -n1)
    fail=0
    for root in $roots; do
        if git merge-base --is-ancestor $root $file_root; then
            fail=1
            break
        elif ! git merge-base --is-ancestor $file_root $root; then
            new_roots="$new_roots $root"
        fi
    done
    if [ $fail -ne 1 ]; then
        roots="$new_roots $file_root"
    fi
done

# Purge all parents for those commits

set_roots="
if [ 1 -eq 0 $(for root in $roots; do echo " -o \"\$GIT_COMMIT\" = '$root' "; done) ]; then
    echo '';
else
    cat;
fi"

# Enhance git_commit_non_empty_tree to skip merges with:
# a) either two equal parents (commit that was about to land got purged as well
#    as all commits on mainline);
# b) or with second parent being an ancestor to the first one (just as with a)
#    but when there are some commits on mainline).
# In both cases drop the second parent and let git_commit_non_empty_tree decide
# if the commit is worth doing (most likely not).

skip_empty=$(cat << \EOF
if [ $# = 5 ] && git merge-base --is-ancestor $5 $3; then
    git_commit_non_empty_tree $1 -p $3
else
    git_commit_non_empty_tree "$@"
fi
EOF
)

# Filter out commits for unrelated files
echo "Pruning commits for unrelated files..."
git filter-branch --index-filter "$pruner" --parent-filter "$set_roots" --commit-filter "$skip_empty" HEAD
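The `keep_pattern` construction near the top of the script turns a space-separated file list into a single grep alternation. A quick standalone check of that one line, using hypothetical file names:

```shell
# Hypothetical file list; mirrors the keep_pattern line in the script above.
files_to_keep="README.rst setup.py"
keep_pattern="\($(echo $files_to_keep | sed -e 's/ /\\|/g')\)"
echo "$keep_pattern"
# → \(README.rst\|setup.py\)
```

The resulting pattern is what `grep -v` in the `pruner` uses to drop every path that is not in the keep list.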
@ -1,116 +0,0 @@
#!/usr/bin/env python
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""
Look through the openstack-common.conf files for projects to find
any that are using modules that have been deleted from the
incubator.
"""

from __future__ import print_function

import glob
import os
import sys

from oslo_config import cfg

# Extend sys.path to find update.py
my_dir = os.path.dirname(__file__)
incubator_root = os.path.abspath(os.path.dirname(my_dir))
sys.path.append(incubator_root)
import update


def main(argv=sys.argv[1:]):
    repodir = os.path.abspath(
        os.path.join(my_dir, os.pardir, os.pardir, os.pardir)
    )

    main_cfg = cfg.ConfigOpts()
    main_cfg.register_cli_opt(
        cfg.MultiStrOpt(
            # NOTE(dhellmann): We can't call this "project" because
            # that conflicts with another property of the ConfigOpts
            # class.
            'proj',
            default=[],
            positional=True,
            help='list of repo subdirs to scan, e.g. "openstack/nova"',
        )
    )
    main_cfg(argv)

    # If the user gave us project names, turn them into full paths to
    # the project directory. If not, build a full list of all the
    # projects we find.
    projects = main_cfg.proj
    if projects:
        projects = [os.path.join(repodir, p) for p in projects]
    else:
        projects = glob.glob(
            os.path.join(repodir, '*', '*')
        )

    base_dir = os.path.join(
        incubator_root,
        'openstack',
        'common',
    )
    tools_dir = os.path.join(incubator_root, 'tools')

    previous_project = None
    for project_path in projects:
        conf_file = os.path.join(project_path, 'openstack-common.conf')
        if not os.path.exists(conf_file):
            # This is not a directory using oslo-incubator.
            continue

        project_name = '/'.join(project_path.split('/')[-2:])

        # Use a separate parser for each configuration file.
        pcfg = cfg.ConfigOpts()
        pcfg.register_opts(update.opts)
        pcfg(['--config-file', conf_file])

        # The list of modules can come in a couple of different
        # options, so combine the results.
        modules = pcfg.module + pcfg.modules
        for mod in modules:
            # Build a few filenames and patterns for looking for
            # versions of the module being used by the project before
            # testing them all.
            mod_path = os.path.join(
                base_dir,
                mod.replace('.', os.sep),
            )
            mod_file = '%s.py' % mod_path
            tool_pattern = os.path.join(tools_dir, mod + '*')
            tool_subdir_pattern = os.path.join(tools_dir, mod, '*.sh')
            if (os.path.isfile(mod_file)
                    or os.path.isdir(mod_path)
                    or glob.glob(tool_pattern)
                    or glob.glob(tool_subdir_pattern)):
                # Found something we would have copied in update.py.
                continue
            else:
                if project_name != previous_project:
                    previous_project = project_name
                    print()
                print('%s: %s' % (project_name, mod))


if __name__ == '__main__':
    main()
@ -1,70 +0,0 @@
# Copyright 2010 United States Government as represented by the
# Administrator of the National Aeronautics and Space Administration.
# All Rights Reserved.
#
# Copyright 2010 OpenStack Foundation
# Copyright 2013 IBM Corp.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import os
import sys

import install_venv_common as install_venv  # noqa


def print_help(venv, root):
    help = """
OpenStack development environment setup is complete.

OpenStack development uses virtualenv to track and manage Python
dependencies while in development and testing.

To activate the OpenStack virtualenv for the extent of your current shell
session you can run:

$ source %s/bin/activate

Or, if you prefer, you can run commands in the virtualenv on a case by case
basis by running:

$ %s/tools/with_venv.sh <your command>

Also, make test will automatically use the virtualenv.
"""
    print(help % (venv, root))


def main(argv):
    root = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))

    if os.environ.get('TOOLS_PATH'):
        root = os.environ['TOOLS_PATH']
    venv = os.path.join(root, '.venv')
    if os.environ.get('VENV'):
        venv = os.environ['VENV']

    pip_requires = os.path.join(root, 'requirements.txt')
    test_requires = os.path.join(root, 'test-requirements.txt')
    py_version = "python%s.%s" % (sys.version_info[0], sys.version_info[1])
    project = 'OpenStack'
    install = install_venv.InstallVenv(root, venv, pip_requires, test_requires,
                                       py_version, project)
    options = install.parse_args(argv)
    install.check_dependencies()
    install.create_virtualenv(no_site_packages=options.no_site_packages)
    install.install_dependencies()
    print_help(venv, root)


if __name__ == '__main__':
    main(sys.argv)
@ -1,165 +0,0 @@
# Copyright 2013 OpenStack Foundation
# Copyright 2013 IBM Corp.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""Provides methods needed by installation script for OpenStack development
virtual environments.

Synced in from openstack-common
"""

from __future__ import print_function

import optparse
import os
import subprocess
import sys


class InstallVenv(object):

    def __init__(self, root, venv, requirements,
                 test_requirements, py_version,
                 project):
        self.root = root
        self.venv = venv
        self.requirements = requirements
        self.test_requirements = test_requirements
        self.py_version = py_version
        self.project = project

    def die(self, message, *args):
        print(message % args, file=sys.stderr)
        sys.exit(1)

    def run_command_with_code(self, cmd, redirect_output=True,
                              check_exit_code=True):
        """Runs a command in an out-of-process shell.

        Returns the output of that command. Working directory is self.root.
        """
        if redirect_output:
            stdout = subprocess.PIPE
        else:
            stdout = None

        proc = subprocess.Popen(cmd, cwd=self.root, stdout=stdout)
        output = proc.communicate()[0]
        if check_exit_code and proc.returncode != 0:
            self.die('Command "%s" failed.\n%s', ' '.join(cmd), output)
        return (output, proc.returncode)

    def run_command(self, cmd, redirect_output=True, check_exit_code=True):
        return self.run_command_with_code(cmd, redirect_output,
                                          check_exit_code)[0]

    def get_distro(self):
        if (os.path.exists('/etc/fedora-release') or
                os.path.exists('/etc/redhat-release')):
            return Fedora(
                self.root, self.venv, self.requirements,
                self.test_requirements, self.py_version, self.project)
        else:
            return Distro(
                self.root, self.venv, self.requirements,
                self.test_requirements, self.py_version, self.project)

    def check_dependencies(self):
        self.get_distro().install_virtualenv()

    def create_virtualenv(self, no_site_packages=True):
        """Creates the virtual environment and installs PIP.

        Creates the virtual environment and installs PIP only into the
        virtual environment.
        """
        if not os.path.isdir(self.venv):
            print('Creating venv...', end=' ')
            if no_site_packages:
                self.run_command(['virtualenv', '-q', '--no-site-packages',
                                  self.venv])
            else:
                self.run_command(['virtualenv', '-q', self.venv])
            print('done.')
        else:
            print("venv already exists...")

    def pip_install(self, *args):
        self.run_command(['tools/with_venv.sh',
                          'pip', 'install', '--upgrade'] + list(args),
                         redirect_output=False)

    def install_dependencies(self):
        print('Installing dependencies with pip (this can take a while)...')

        # First things first, make sure our venv has the latest pip and
        # setuptools and pbr
        self.pip_install('pip>=1.4')
        self.pip_install('setuptools')
        self.pip_install('pbr')

        self.pip_install('-r', self.requirements, '-r', self.test_requirements)

    def parse_args(self, argv):
        """Parses command-line arguments."""
        parser = optparse.OptionParser()
        parser.add_option('-n', '--no-site-packages',
                          action='store_true',
                          help="Do not inherit packages from global Python "
                               "install.")
        return parser.parse_args(argv[1:])[0]


class Distro(InstallVenv):

    def check_cmd(self, cmd):
        return bool(self.run_command(['which', cmd],
                                     check_exit_code=False).strip())

    def install_virtualenv(self):
        if self.check_cmd('virtualenv'):
            return

        if self.check_cmd('easy_install'):
            print('Installing virtualenv via easy_install...', end=' ')
            if self.run_command(['easy_install', 'virtualenv']):
                print('Succeeded')
                return
            else:
                print('Failed')

        self.die('ERROR: virtualenv not found.\n\n%s development'
                 ' requires virtualenv, please install it using your'
                 ' favorite package management tool' % self.project)


class Fedora(Distro):
    """This covers all Fedora-based distributions.

    Includes: Fedora, RHEL, CentOS, Scientific Linux
    """

    def check_pkg(self, pkg):
        return self.run_command_with_code(['rpm', '-q', pkg],
                                          check_exit_code=False)[1] == 0

    def install_virtualenv(self):
        if self.check_cmd('virtualenv'):
            return

        if not self.check_pkg('python-virtualenv'):
            self.die("Please install 'python-virtualenv'.")

        super(Fedora, self).install_virtualenv()
@ -1,30 +0,0 @@
# Copyright (c) 2013 Intel Corporation.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#

import sys

from pylint import lint


ENABLED_PYLINT_MSGS = ['W0611']


def main(dirpath):
    enable_opt = '--enable=%s' % ','.join(ENABLED_PYLINT_MSGS)
    lint.Run(['--reports=n', '--disable=all', enable_opt, dirpath])


if __name__ == '__main__':
    main(sys.argv[1])
@ -1,55 +0,0 @@
#!/bin/bash
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
# Show the latest tags for all Oslo projects as an approximation for
# reporting on which releases exist.

bindir=$(cd $(dirname $0) && pwd)
repodir=$(cd $bindir/../../.. && pwd)

# Make sure no pager is configured so the output is not blocked
export PAGER=

if [ -z "$*" ]
then
    libs=$($bindir/list_oslo_projects.py | egrep -v -e '(oslo.version|cookiecutter|incubator)')
else
    libs="$*"
fi

function get_last_tag {
    git for-each-ref --sort=taggerdate --format '%(refname)' refs/tags \
        | sed -e 's|refs/tags/||' \
        | ${bindir}/highest_semver.py
}

function list_versions {
    # Show the tag for each library
    for lib in $*
    do
        the_date=""
        cd $repodir/$lib
        highest_tag=$(get_last_tag)
        if [ -z "$highest_tag" ]
        then
            the_date="0000-00-00 00:00:00 +0000"
            highest_tag="UNRELEASED"
        else
            the_date=$(git log -q --format='format:%ci' -n 1 $highest_tag)
        fi
        echo $the_date $lib $highest_tag
    done
}

list_versions $libs | sort -nr
@ -1,96 +0,0 @@
|
|||
#!/usr/bin/env python
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
New core email content generator.
"""

import argparse

import jinja2
import parawrap


CORE_TPL = """
Greetings all stackers,

I propose that we add {{FULL_NAME}}[1] to the {{TEAM_CORE}}[2] team.

{{FIRST_NAME}} has been actively contributing to {{TEAM}} for a while now, both
in helping make {{TEAM}} better via code contribution(s) and by helping with
the review load when {{HE_SHE_LOWER}} can. {{HE_SHE}} has provided quality
reviews and is doing an awesome job with the various {{TEAM}} concepts and
helping make {{TEAM}} the best it can be!

Overall I think {{HE_SHE_LOWER}} would make a great addition to the core
review team.

Please respond with +1/-1.

Thanks much!

- {{ME}}
"""
CORE_TPL = CORE_TPL.strip()


def expand_template(contents, params):
    if not params:
        params = {}
    tpl = jinja2.Template(source=contents, undefined=jinja2.StrictUndefined)
    return tpl.render(**params)


def generate_email(args):
    params = {
        'FULL_NAME': args.who,
        'HE_SHE': args.gender.title(),
        'TEAM_CORE': '%s-core' % args.team,
        'ME': args.sender,
    }
    params['TEAM'] = args.team.strip().lower()
    params['HE_SHE_LOWER'] = params['HE_SHE'].lower()
    params['FIRST_NAME'] = params['FULL_NAME'].split()[0]
    contents = expand_template(CORE_TPL, params)
    contents = parawrap.fill(contents.strip(), width=75)
    # Put the links on after so they are not affected by the wrapping...
    links = [
        'https://launchpad.net/~%s' % args.who_launchpad_id,
        'https://launchpad.net/%s' % params['TEAM'],
    ]
    contents += "\n\n"
    for i, link in enumerate(links, 1):
        contents += "[%s] %s\n" % (i, link)
    return contents.rstrip()


def main():
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument('--adding-who', action="store", dest="who",
                        required=True, metavar="<full-name>")
    parser.add_argument('--adding-who-launchpad-id', action="store",
                        dest="who_launchpad_id",
                        required=True, metavar="<launchpad-id>")
    parser.add_argument('--from-who', action="store", dest="sender",
                        metavar="<full-name>", required=True)
    parser.add_argument('--team', action="store", dest="team",
                        metavar="<team>", required=True)
    parser.add_argument('--gender', action="store", dest="gender",
                        metavar="<he/she>", required=True)
    args = parser.parse_args()
    print(generate_email(args))


if __name__ == '__main__':
    main()
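The `expand_template` helper above passes `undefined=jinja2.StrictUndefined` so that a template referencing a missing parameter fails loudly instead of silently rendering an empty string. A minimal sketch of that behavior (the template text and variable names here are illustrative, not from the original script):

```python
import jinja2


def expand_template(contents, params):
    # StrictUndefined makes any reference to an undeclared variable raise
    # jinja2.UndefinedError at render time, rather than emitting ''.
    tpl = jinja2.Template(source=contents, undefined=jinja2.StrictUndefined)
    return tpl.render(**(params or {}))


print(expand_template("Hi {{name}}!", {'name': 'Ada'}))  # Hi Ada!
try:
    expand_template("Hi {{name}}!", {})
except jinja2.UndefinedError:
    print("missing template parameter detected")
```

Without `StrictUndefined`, the second call would quietly produce "Hi !", which is exactly the kind of half-filled email this tool wants to avoid sending.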
@ -1,165 +0,0 @@
import collections
import contextlib
import datetime
import os
import sys

import tabulate

from gitinspector.changes import Changes
from gitinspector.metrics import MetricsLogic

Repository = collections.namedtuple('Repository', 'name,location')

CORE_SKIPS = frozenset([
    u'Julien Danjou',
    u'Davanum Srinivas',
    u'Ben Nemec',
    u'Joshua Harlow',
    u'Brant Knudson',
    u'Doug Hellmann',
    u'Victor Stinner',
    u'Michael Still',
    u'Flavio Percoco',
    u'Mehdi Abaakouk',
    u'Robert Collins',
])
EMAIL_SKIPS = frozenset([
    'openstack-infra@lists.openstack.org',
    'flaper87@gmail.com',
    'fpercoco@redhat.com',
])
OLDEST_COMMIT_YEAR = 2014


@contextlib.contextmanager
def auto_cwd(target_dir):
    old_dir = os.getcwd()
    if old_dir == target_dir:
        yield
    else:
        os.chdir(target_dir)
        try:
            yield
        finally:
            os.chdir(old_dir)


def new_core_compare(c1, c2):
    # Sort by insertions, deletions...
    c1_info = (c1[3], c1[4], c1[5])
    c2_info = (c2[3], c2[4], c2[5])
    if c1_info == c2_info:
        return 0
    if c1_info < c2_info:
        return -1
    else:
        return 1


def should_discard(change_date, author_name, author_email, author_info):
    if author_name in CORE_SKIPS:
        return True
    if author_email in EMAIL_SKIPS:
        return True
    if change_date is not None:
        if change_date.year < OLDEST_COMMIT_YEAR:
            return True
    return False


def dump_changes(repo):
    with auto_cwd(repo.location):
        print("Analyzing repo %s (%s):" % (repo.name, repo.location))
        print("Please wait...")
        Changes.authors.clear()
        Changes.authors_dateinfo.clear()
        Changes.authors_by_email.clear()
        Changes.emails_by_author.clear()

        changes = Changes(repo)
        # This is needed to flush out changes progress message...
        sys.stdout.write("\n")
        # Force population of this info...
        changes_per_author = changes.get_authordateinfo_list()
        just_authors = changes.get_authorinfo_list()
        better_changes_per_author = {}
        maybe_new_cores = {}
        for c in changes.get_commits():
            change_date = c.timestamp
            author_name = c.author
            author_email = c.email
            change_date = datetime.datetime.fromtimestamp(int(change_date))
            try:
                author_info = changes.authors[author_name]
                better_changes_per_author[(change_date, author_name)] = author_info
            except KeyError:
                pass
        for (change_date, author_name) in better_changes_per_author.keys():
            author_email = changes.get_latest_email_by_author(author_name)
            author_info = better_changes_per_author[(change_date, author_name)]
            author_info.email = author_email
            if not should_discard(change_date, author_name, author_email, author_info):
                if author_name in maybe_new_cores:
                    existing_info = maybe_new_cores[author_name]
                    if existing_info[2] < change_date:
                        existing_info[2] = change_date
                else:
                    maybe_core = [
                        author_name.encode("ascii", errors='replace'),
                        author_email,
                        change_date,
                        author_info.insertions,
                        author_info.deletions,
                        author_info.commits,
                    ]
                    maybe_new_cores[author_name] = maybe_core
        if maybe_new_cores:
            print("%s potential new cores found!!" % len(maybe_new_cores))
            tmp_maybe_new_cores = sorted(list(maybe_new_cores.values()),
                                         cmp=new_core_compare, reverse=True)
            headers = ['Name', 'Email', 'Last change made', 'Insertions', 'Deletions', 'Commits']
            print(tabulate.tabulate(tmp_maybe_new_cores, headers=headers,
                                    tablefmt="grid"))
        else:
            print("No new cores found!!")
        return changes.authors.copy()


def main(repos):
    raw_repos = [os.path.abspath(p) for p in repos]
    parsed_repos = []
    for repo in raw_repos:
        parsed_repos.append(Repository(os.path.basename(repo), repo))
    all_authors = []
    for repo in parsed_repos:
        all_authors.append(dump_changes(repo))
    if all_authors:
        print("Combined changes of %s repos:" % len(parsed_repos))
        maybe_new_cores = {}
        for repo_authors in all_authors:
            for author_name, author_info in repo_authors.items():
                change_date = datetime.datetime.now()
                if not should_discard(None, author_name, author_info.email, author_info):
                    if author_name in maybe_new_cores:
                        prior_author_info = maybe_new_cores[author_name]
                        prior_author_info[3] = prior_author_info[3] + author_info.insertions
                        prior_author_info[4] = prior_author_info[4] + author_info.deletions
                        prior_author_info[5] = prior_author_info[5] + author_info.commits
                    else:
                        maybe_new_cores[author_name] = [
                            author_name.encode("ascii", errors='replace'),
                            author_info.email,
                            u"N/A",
                            author_info.insertions,
                            author_info.deletions,
                            author_info.commits,
                        ]
        tmp_maybe_new_cores = sorted(list(maybe_new_cores.values()),
                                     cmp=new_core_compare, reverse=True)
        headers = ['Name', 'Email', 'Last change made', 'Insertions', 'Deletions', 'Commits']
        print(tabulate.tabulate(tmp_maybe_new_cores, headers=headers,
                                tablefmt="grid"))


if __name__ == '__main__':
    main(sys.argv[1:])
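The `sorted(..., cmp=new_core_compare, reverse=True)` calls above are Python 2 only; Python 3 removed the `cmp=` keyword, but the same comparator plugs in through `functools.cmp_to_key`. A hedged sketch using the row layout assumed above (name, email, last-change, insertions, deletions, commits; the sample rows are invented):

```python
import functools


def new_core_compare(c1, c2):
    # Rank rows by (insertions, deletions, commits), as in the script above.
    c1_info = (c1[3], c1[4], c1[5])
    c2_info = (c2[3], c2[4], c2[5])
    if c1_info == c2_info:
        return 0
    return -1 if c1_info < c2_info else 1


rows = [
    ['alice', 'alice@example.com', 'N/A', 10, 2, 3],
    ['bob', 'bob@example.com', 'N/A', 50, 9, 12],
]
# cmp_to_key wraps the old-style comparator into a key function.
ranked = sorted(rows, key=functools.cmp_to_key(new_core_compare), reverse=True)
print(ranked[0][0])  # bob (most insertions sorts first)
```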
@ -1,70 +0,0 @@
#!/usr/bin/env python
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import random
import sys

import jinja2
import parawrap


def expand_template(contents, params):
    if not params:
        params = {}
    tpl = jinja2.Template(source=contents, undefined=jinja2.StrictUndefined)
    return tpl.render(**params)


chosen_how = [
    'selected',
    'picked',
    'targeted',
]
new_oslo_core_tpl = """
Hi {{firstname}} {{lastname}},

You have been {{chosen_how}} to be a new {{project}} core (if you are
willing to accept this mission). We have been watching your commits and
reviews and have noticed that you may be interested in a core position
that would be granted to you (if you are willing to accept the
responsibility of being a new core member[1] in project {{project}}).

What do you think, are you able (and willing) to accept?

If you have any questions, please feel free to respond or jump on
freenode and chat with the team on channel #openstack-oslo (one of the
other cores in oslo is usually around).

This message will self-destruct in 5 seconds.

Sincerely,

The Oslo Team

[1] http://docs.openstack.org/infra/manual/core.html
"""
firstname = sys.argv[1]
lastname = sys.argv[2]
tpl_args = {
    'firstname': firstname,
    'project': sys.argv[3],
    'lastname': lastname,
    'firstname_title': firstname.title(),
    'lastname_title': lastname.title(),
    'chosen_how': random.choice(chosen_how),
}

tpl_value = expand_template(new_oslo_core_tpl.lstrip(), tpl_args)
tpl_value = parawrap.fill(tpl_value)
print(tpl_value)
@ -1,48 +0,0 @@
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""Utility functions for working with oslo.config from the tool scripts.
"""

import os

from oslo_config import cfg

DEFAULT_CONFIG_FILES = [
    './oslo.conf',
    os.path.expanduser('~/.oslo.conf'),
]


def get_config_parser():
    conf = cfg.ConfigOpts()
    conf.register_cli_opt(
        cfg.StrOpt(
            'repo_root',
            default='.',
            help='directory containing the git repositories',
        )
    )
    return conf


def parse_arguments(conf):
    # Look for a few configuration files, and load the ones we find.
    default_config_files = [
        f
        for f in DEFAULT_CONFIG_FILES
        if os.path.exists(f)
    ]
    return conf(
        project='oslo',
        default_config_files=default_config_files,
    )
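The comment in `parse_arguments` explains the intent: only the candidate config files actually found on disk are handed to oslo.config. That existence filter is a small generic pattern worth isolating; a stdlib-only sketch (the paths are illustrative, and `existing_files` is a hypothetical helper name, not part of the original module):

```python
import os
import tempfile


def existing_files(candidates):
    # Keep only the candidate configuration paths present on disk,
    # mirroring the DEFAULT_CONFIG_FILES filtering in parse_arguments.
    return [f for f in candidates if os.path.exists(f)]


with tempfile.NamedTemporaryFile(suffix='.conf') as real:
    found = existing_files([real.name, '/nonexistent/oslo.conf'])
    print(found == [real.name])  # True
```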
@ -1,18 +0,0 @@
Drop use of 'oslo' namespace package

The Oslo libraries have moved all of their code out of the 'oslo'
namespace package into per-library packages. The namespace package was
retained during kilo for backwards compatibility, but will be removed by
the liberty-2 milestone. This change removes the use of the namespace
package, replacing it with the new package names.

The patches in the libraries will be put on hold until application
patches have landed, or L2, whichever comes first. At that point, new
versions of the libraries without namespace packages will be released as
a major version update.

Please merge this patch, or an equivalent, before L2 to avoid problems
with those library releases.

Blueprint: remove-namespace-packages
https://blueprints.launchpad.net/oslo-incubator/+spec/remove-namespace-packages
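The mechanical shape of the change described above is an import rename: `from oslo.config import cfg` becomes `from oslo_config import cfg`, and likewise for the other libraries. A hedged sketch of the rewrite expressed as a regex (the real patches were hand-reviewed per repository, not an automated pass; `rewrite_namespace_imports` is an illustrative helper, not a tool from the source):

```python
import re


def rewrite_namespace_imports(source):
    # 'from oslo.X import Y'  ->  'from oslo_X import Y'
    source = re.sub(r'\bfrom oslo\.(\w+)', r'from oslo_\1', source)
    # 'import oslo.X'         ->  'import oslo_X'
    source = re.sub(r'\bimport oslo\.(\w+)', r'import oslo_\1', source)
    return source


print(rewrite_namespace_imports("from oslo.config import cfg"))
# from oslo_config import cfg
```

The library APIs themselves were unchanged; only the import path moved, which is why a purely textual rewrite like this captures most of the diff.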
@ -1,7 +0,0 @@
# NOTE(dhellmann): These requirements are just for the tool scripts and
# do not need to be synced.

argparse
oslo.config
jinja2
parawrap
@ -1,248 +0,0 @@
#!/bin/bash

set -eu

function usage {
    echo "Usage: $0 [OPTION]..."
    echo "Run project's test suite(s)"
    echo ""
    echo "  -V, --virtual-env           Always use virtualenv. Install automatically if not present."
    echo "  -N, --no-virtual-env        Don't use virtualenv. Run tests in local environment."
    echo "  -s, --no-site-packages      Isolate the virtualenv from the global Python environment."
    echo "  -r, --recreate-db           Recreate the test database (deprecated, as this is now the default)."
    echo "  -n, --no-recreate-db        Don't recreate the test database."
    echo "  -f, --force                 Force a clean re-build of the virtual environment."
    echo "                              Useful when dependencies have been added."
    echo "  -u, --update                Update the virtual environment with any newer package versions."
    echo "  -p, --pep8                  Just run PEP8 and HACKING compliance check."
    echo "  -P, --no-pep8               Don't run static code checks."
    echo "  -c, --coverage              Generate coverage report."
    echo "  -d, --debug                 Run tests with testtools instead of testr."
    echo "                              This allows you to use the debugger."
    echo "  -h, --help                  Print this usage message."
    echo "  --hide-elapsed              Don't print the elapsed time for each test along with slow test list."
    echo "  --virtual-env-path <path>   Location of the virtualenv directory."
    echo "                              Default: \$(pwd)"
    echo "  --virtual-env-name <name>   Name of the virtualenv directory."
    echo "                              Default: .venv"
    echo "  --tools-path <dir>          Location of the tools directory."
    echo "                              Default: \$(pwd)"
    echo ""
    echo "Note: with no options specified, the script will try to run the tests in a virtual environment."
    echo "      If no virtualenv is found, the script will ask if you would like to create one. If you"
    echo "      prefer to run tests NOT in a virtual environment, simply pass the -N option."
    exit
}

function process_options {
    i=1
    while [ $i -le $# ]; do
        case "${!i}" in
            -h|--help) usage;;
            -V|--virtual-env) ALWAYS_VENV=1; NEVER_VENV=0;;
            -N|--no-virtual-env) ALWAYS_VENV=0; NEVER_VENV=1;;
            -s|--no-site-packages) NO_SITE_PACKAGES=1;;
            -r|--recreate-db) RECREATE_DB=1;;
            -n|--no-recreate-db) RECREATE_DB=0;;
            -f|--force) FORCE=1;;
            -u|--update) UPDATE=1;;
            -p|--pep8) JUST_PEP8=1;;
            -P|--no-pep8) NO_PEP8=1;;
            -c|--coverage) COVERAGE=1;;
            -d|--debug) DEBUG=1;;
            --virtual-env-path)
                (( i++ ))
                VENV_PATH=${!i}
                ;;
            --virtual-env-name)
                (( i++ ))
                VENV_DIR=${!i}
                ;;
            --tools-path)
                (( i++ ))
                TOOLS_PATH=${!i}
                ;;
            -*) TESTOPTS="$TESTOPTS ${!i}";;
            *) TESTRARGS="$TESTRARGS ${!i}"
        esac
        (( i++ ))
    done
}


TOOLS_PATH=${TOOLS_PATH:-${PWD}}
VENV_PATH=${VENV_PATH:-${PWD}}
VENV_DIR=${VENV_DIR:-.venv}
WITH_VENV=${TOOLS_PATH}/tools/with_venv.sh

ALWAYS_VENV=0
NEVER_VENV=0
FORCE=0
NO_SITE_PACKAGES=1
INSTALLVENVOPTS=
TESTRARGS=
TESTOPTS=
WRAPPER=""
JUST_PEP8=0
NO_PEP8=0
COVERAGE=0
DEBUG=0
RECREATE_DB=1
UPDATE=0

# NOTE: the following variables are referenced later in this script but were
# configured per-project; default them here (assumed values) so that
# 'set -eu' does not abort when they are unset.
PROJECT_NAME=${PROJECT_NAME:-oslo}
TESTS_DIR=${TESTS_DIR:-tests}
EGG_INFO_FILE=${EGG_INFO_FILE:-${PROJECT_NAME}.egg-info/PKG-INFO}
WORKERS_COUNT=${WORKERS_COUNT:-0}
OMIT_OSLO_FROM_COVERAGE=${OMIT_OSLO_FROM_COVERAGE:-0}

LANG=en_US.UTF-8
LANGUAGE=en_US:en
LC_ALL=C

process_options "$@"
# Make our paths available to other scripts we call
export VENV_PATH
export TOOLS_PATH
export VENV_DIR
export WITH_VENV
export VENV=${VENV_PATH}/${VENV_DIR}


function run_tests {
    # Cleanup *.pyc
    ${WRAPPER} find . -type f -name "*.pyc" -delete

    if [ ${DEBUG} -eq 1 ]; then
        if [ "${TESTOPTS}" = "" ] && [ "${TESTRARGS}" = "" ]; then
            # Default to running all tests if specific test is not
            # provided.
            TESTRARGS="discover ./${TESTS_DIR}"
        fi
        ${WRAPPER} python -m testtools.run ${TESTOPTS} ${TESTRARGS}

        # Short circuit because all of the testr and coverage stuff
        # below does not make sense when running testtools.run for
        # debugging purposes.
        return $?
    fi

    if [ ${COVERAGE} -eq 1 ]; then
        TESTRTESTS="${TESTRTESTS} --coverage"
    fi

    # Just run the test suites in current environment
    set +e
    TESTRARGS=`echo "${TESTRARGS}" | sed -e 's/^\s*\(.*\)\s*$/\1/'`

    if [ ${WORKERS_COUNT} -ne 0 ]; then
        TESTRTESTS="${TESTRTESTS} --testr-args='--concurrency=${WORKERS_COUNT} --subunit ${TESTOPTS} ${TESTRARGS}'"
    else
        TESTRTESTS="${TESTRTESTS} --testr-args='--subunit ${TESTOPTS} ${TESTRARGS}'"
    fi

    if [ setup.cfg -nt ${EGG_INFO_FILE} ]; then
        ${WRAPPER} python setup.py egg_info
    fi

    echo "Running \`${WRAPPER} ${TESTRTESTS}\`"
    if ${WRAPPER} which subunit-2to1 > /dev/null 2>&1; then
        # subunit-2to1 is present, so the testr subunit stream should be in
        # version 2 format. Convert to version one before colorizing.
        bash -c "${WRAPPER} ${TESTRTESTS} | ${WRAPPER} subunit-2to1 | ${WRAPPER} ${TOOLS_PATH}/tools/colorizer.py"
    else
        bash -c "${WRAPPER} ${TESTRTESTS} | ${WRAPPER} ${TOOLS_PATH}/tools/colorizer.py"
    fi
    RESULT=$?
    set -e

    copy_subunit_log

    if [ $COVERAGE -eq 1 ]; then
        echo "Generating coverage report in covhtml/"
        ${WRAPPER} coverage combine
        # Don't compute coverage for common code, which is tested elsewhere,
        # if we are not in the `oslo-incubator` project
        if [ ${OMIT_OSLO_FROM_COVERAGE} -eq 0 ]; then
            OMIT_OSLO=""
        else
            OMIT_OSLO="--omit='${PROJECT_NAME}/openstack/common/*'"
        fi
        ${WRAPPER} coverage html --include="${PROJECT_NAME}/*" ${OMIT_OSLO} -d covhtml -i
    fi

    return ${RESULT}
}

function copy_subunit_log {
    LOGNAME=`cat .testrepository/next-stream`
    LOGNAME=$((${LOGNAME} - 1))
    LOGNAME=".testrepository/${LOGNAME}"
    cp ${LOGNAME} subunit.log
}

function run_pep8 {
    echo "Running flake8 ..."
    bash -c "${WRAPPER} flake8"
}


TESTRTESTS="lockutils-wrapper python setup.py testr"

if [ ${NO_SITE_PACKAGES} -eq 1 ]; then
    INSTALLVENVOPTS="--no-site-packages"
fi

if [ ${NEVER_VENV} -eq 0 ]; then
    # Remove the virtual environment if -f or --force used
    if [ ${FORCE} -eq 1 ]; then
        echo "Cleaning virtualenv..."
        rm -rf ${VENV}
    fi

    # Update the virtual environment if -u or --update used
    if [ ${UPDATE} -eq 1 ]; then
        echo "Updating virtualenv..."
        python ${TOOLS_PATH}/tools/install_venv.py ${INSTALLVENVOPTS}
    fi

    if [ -e ${VENV} ]; then
        WRAPPER="${WITH_VENV}"
    else
        if [ ${ALWAYS_VENV} -eq 1 ]; then
            # Automatically install the virtualenv
            python ${TOOLS_PATH}/tools/install_venv.py ${INSTALLVENVOPTS}
            WRAPPER="${WITH_VENV}"
        else
            echo -e "No virtual environment found...create one? (Y/n) \c"
            read USE_VENV
            if [ "x${USE_VENV}" = "xY" -o "x${USE_VENV}" = "x" -o "x${USE_VENV}" = "xy" ]; then
                # Install the virtualenv and run the test suite in it
                python ${TOOLS_PATH}/tools/install_venv.py ${INSTALLVENVOPTS}
                WRAPPER=${WITH_VENV}
            fi
        fi
    fi
fi

# Delete old coverage data from previous runs
if [ ${COVERAGE} -eq 1 ]; then
    ${WRAPPER} coverage erase
fi

if [ ${JUST_PEP8} -eq 1 ]; then
    run_pep8
    exit
fi

if [ ${RECREATE_DB} -eq 1 ]; then
    rm -f tests.sqlite
fi

run_tests

# NOTE(sirp): we only want to run pep8 when we're running the full-test suite,
# not when we're running tests individually. To handle this, we need to
# distinguish between options (testropts), which begin with a '-', and
# arguments (testrargs).
if [ -z "${TESTRARGS}" ]; then
    if [ ${NO_PEP8} -eq 0 ]; then
        run_pep8
    fi
fi
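The `process_options` loop above walks the positional parameters with bash indirect expansion: `${!i}` expands to the value of the parameter whose index is stored in `i`, which is what lets a plain counter drive `$1`, `$2`, and so on. A standalone sketch of the idiom (the function name and arguments are illustrative):

```shell
#!/bin/bash
set -eu

function show_args {
    i=1
    while [ $i -le $# ]; do
        # ${!i} is indirect expansion: the value of the positional
        # parameter ($1, $2, ...) selected by the current value of i.
        echo "arg $i is ${!i}"
        (( i++ ))
    done
}

show_args foo bar baz
```

This is why `process_options` can consume an option's argument by incrementing `i` inside a case branch (as the `--virtual-env-path` handling does) before the loop's own increment runs.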
@ -1,80 +0,0 @@
#!/usr/bin/env python
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import sys

import delorean
import jinja2
import parawrap


def expand_template(contents, params):
    if not params:
        params = {}
    tpl = jinja2.Template(source=contents, undefined=jinja2.StrictUndefined)
    return tpl.render(**params)


# NOTE: 'for' is a reserved word in jinja2 expressions, so the template
# variable is named 'for_what' here.
TPL = """
Hi everyone,

The OpenStack {{ team }} team will be hosting a virtual sprint in
the Freenode IRC channel #{{ channel }} for the {{ for_what }}
on {{ when }} starting at {{ starts_at }} and going for ~{{ duration }} hours.

The goal of this sprint is to work on any open reviews, documentation or
any other integration questions, development and so-on, so that we can help
progress the {{ for_what }} forward at a good rate.

A live version of the current documentation is available here:

{{ docs }}

The code itself lives in the openstack/{{ project }} repository.

{{ git_tree }}

Please feel free to join if interested, curious, or able.

Much appreciated,

{{ author }}
"""

# Example:
#
# python tools/virtual_sprint.py "taskflow" "next tuesday" "Joshua Harlow"
if len(sys.argv) != 4:
    print("Usage: %s project when author" % sys.argv[0])
    sys.exit(1)

# Something like 'next tuesday' is expected...
d = delorean.Delorean()
when = getattr(d, sys.argv[2].replace(" ", "_"))
project = sys.argv[1]
author = sys.argv[3]
params = {
    'team': 'oslo',
    'project': project,
    'channel': 'openstack-oslo',
    'docs': 'http://docs.openstack.org/developer/%s/' % project,
    'when': when().datetime.strftime('%A %m-%d-%Y'),
    'starts_at': '16:00 UTC',
    'duration': 8,
    'author': author,
    'git_tree': 'http://git.openstack.org/cgit/openstack/%s/tree' % project,
}
params['for_what'] = params['project'] + ' ' + 'subproject'
for line in parawrap.wrap(expand_template(TPL.strip(), params)):
    print(line)
@ -1,6 +0,0 @@
#!/bin/bash
TOOLS_PATH=${TOOLS_PATH:-$(dirname $0)/../}
VENV_PATH=${VENV_PATH:-${TOOLS_PATH}}
VENV_DIR=${VENV_DIR:-.venv}
VENV=${VENV:-${VENV_PATH}/${VENV_DIR}}
source ${VENV}/bin/activate && "$@"
30
tox.ini
@ -1,30 +0,0 @@
[tox]
minversion = 1.6
envlist = pep8,dashboards,docs
skipsdist = True

[testenv]
sitepackages = False
usedevelop = True
install_command = pip install -U {opts} {packages}
setenv = VIRTUAL_ENV={envdir}
         PYTHONHASHSEED=0

[flake8]
show-source = True
ignore = E123,H405,H904
exclude = .venv,.tox,dist,doc,*.egg,.update-venv

[testenv:pep8]
commands = flake8 {posargs}

[testenv:docs]
commands = python setup.py build_sphinx

[testenv:dashboards]
skipdist = True
usedevelop = True
deps =
commands =
    git clone https://git.openstack.org/openstack/gerrit-dash-creator {envdir}/gerrit-dash-creator
    pip install -r {envdir}/gerrit-dash-creator/requirements.txt
    {toxinidir}/tools/build_dashboards.sh {envdir}/gerrit-dash-creator {toxinidir}/dashboards