Retire Packaging Deb project repos
This commit is part of a series to retire the Packaging Deb project. Step 2 is to remove all content from the project repos, replacing it with a README that explains where to find the ongoing work and how to recover the repo if needed at some future point (as described in https://docs.openstack.org/infra/manual/drivers.html#retiring-a-project).

Change-Id: I467ab5607b11d3fe72f7939f67ee891a3c331801
This commit is contained in:
parent 341d07e298
commit df306dfc51
@@ -1,10 +0,0 @@
*.pyc
*.swp
build
dist
heat_cfntools.egg-info/
.testrepository/
subunit.log
.tox
AUTHORS
ChangeLog
@@ -1,4 +0,0 @@
[gerrit]
host=review.openstack.org
port=29418
project=openstack/heat-cfntools.git
@@ -1,4 +0,0 @@
[DEFAULT]
test_command=OS_STDOUT_CAPTURE=1 OS_STDERR_CAPTURE=1 OS_TEST_TIMEOUT=60 ${PYTHON:-python} -m subunit.run discover -t ./ ./heat_cfntools/tests $LISTOPT $IDOPTION
test_id_option=--load-list $IDFILE
test_list_option=--list
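The test_command above wires testr to Python's standard test discovery (`subunit.run discover` delegates to the same loader). The discovery step it drives can be sketched with unittest's loader against a throwaway tests directory; the layout and test names below are illustrative, not the project's actual tree:

```python
# Minimal sketch of the discovery step behind the testr test_command,
# using unittest's loader against a temporary package of tests.
import os
import tempfile
import unittest

with tempfile.TemporaryDirectory() as top:
    tests = os.path.join(top, 'tests')
    os.makedirs(tests)
    # Make the directory a package so discovery can import from it.
    open(os.path.join(tests, '__init__.py'), 'w').close()
    with open(os.path.join(tests, 'test_sample.py'), 'w') as f:
        f.write('import unittest\n'
                'class T(unittest.TestCase):\n'
                '    def test_ok(self):\n'
                '        self.assertTrue(True)\n')
    # Equivalent of: discover -t ./ ./heat_cfntools/tests
    suite = unittest.TestLoader().discover(start_dir=tests, top_level_dir=top)
    count = suite.countTestCases()

print(count)  # 1
```

testr then runs the discovered cases through subunit so it can parallelize and record results, but the set of tests found is the same.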
@@ -1,16 +0,0 @@
If you would like to contribute to the development of OpenStack,
you must follow the steps in this page:

   http://docs.openstack.org/infra/manual/developers.html

Once those steps have been completed, changes to OpenStack
should be submitted for review via the Gerrit tool, following
the workflow documented at:

   http://docs.openstack.org/infra/manual/developers.html#development-workflow

Pull requests submitted through GitHub will be ignored.

Bugs should be filed on Launchpad, not GitHub:

   https://bugs.launchpad.net/heat
176  LICENSE
@@ -1,176 +0,0 @@

                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.
@@ -1,7 +0,0 @@
include CONTRIBUTING.rst
include MANIFEST.in
include README.rst
include AUTHORS LICENSE
include ChangeLog
graft doc
graft tools
@@ -0,0 +1,14 @@
This project is no longer maintained.

The contents of this repository are still available in the Git
source code management system. To see the contents of this
repository before it reached its end of life, please check out the
previous commit with "git checkout HEAD^1".

For ongoing work on maintaining OpenStack packages in the Debian
distribution, please see the Debian OpenStack packaging team at
https://wiki.debian.org/OpenStack/.

For any further questions, please email
openstack-dev@lists.openstack.org or join #openstack-dev on
Freenode.
38  README.rst
@@ -1,38 +0,0 @@
========================
Team and repository tags
========================

.. image:: http://governance.openstack.org/badges/heat-cfntools.svg
    :target: http://governance.openstack.org/reference/tags/index.html

.. Change things from this point on

=========================
Heat CloudFormation Tools
=========================

There are several bootstrap methods for cloudformations:

1. Create image with application ready to go
2. Use cloud-init to run a startup script passed as userdata to the nova
   server create
3. Use the CloudFormation instance helper scripts

This package contains files required for choice #3.

cfn-init -
    Reads the AWS::CloudFormation::Init for the instance resource,
    installs packages, and starts services
cfn-signal -
    Waits for an application to be ready before continuing, ie:
    supporting the WaitCondition feature
cfn-hup -
    Handle updates from the UpdateStack CloudFormation API call

* Free software: Apache license
* Source: http://git.openstack.org/cgit/openstack/heat-cfntools
* Bugs: http://bugs.launchpad.net/heat-cfntools

Related projects
----------------
* http://wiki.openstack.org/Heat
@@ -1,87 +0,0 @@
#!/usr/bin/env python
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Creates symlinks for the cfn-* scripts in this directory to /opt/aws/bin
"""
import argparse
import glob
import os
import os.path


def create_symlink(source_file, target_file, override=False):
    if os.path.exists(target_file):
        if (override):
            os.remove(target_file)
        else:
            print('%s already exists, will not replace with symlink'
                  % target_file)
            return
    print('%s -> %s' % (source_file, target_file))
    os.symlink(source_file, target_file)


def check_dirs(source_dir, target_dir):
    print('%s -> %s' % (source_dir, target_dir))

    if source_dir == target_dir:
        print('Source and target are the same %s' % target_dir)
        return False

    if not os.path.exists(target_dir):
        try:
            os.makedirs(target_dir)
        except OSError as exc:
            print('Could not create target directory %s: %s'
                  % (target_dir, exc))
            return False
    return True


def create_symlinks(source_dir, target_dir, glob_pattern, override):
    source_files = glob.glob(os.path.join(source_dir, glob_pattern))
    for source_file in source_files:
        target_file = os.path.join(target_dir, os.path.basename(source_file))
        create_symlink(source_file, target_file, override=override)

if __name__ == '__main__':
    description = 'Creates symlinks for the cfn-* scripts to /opt/aws/bin'
    parser = argparse.ArgumentParser(description=description)
    parser.add_argument(
        '-t', '--target',
        dest="target_dir",
        help="Target directory to create symlinks",
        default='/opt/aws/bin',
        required=False)
    parser.add_argument(
        '-s', '--source',
        dest="source_dir",
        help="Source directory to create symlinks from. "
             "Defaults to the directory where this script is",
        default='/usr/bin',
        required=False)
    parser.add_argument(
        '-f', '--force',
        dest="force",
        action='store_true',
        help="If specified, will create symlinks even if "
             "there is already a target file",
        required=False)
    args = parser.parse_args()

    if not check_dirs(args.source_dir, args.target_dir):
        exit(1)

    create_symlinks(args.source_dir, args.target_dir, 'cfn-*', args.force)
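The symlink logic in the script above can be exercised stand-alone. The sketch below reproduces its create_symlink pattern (skip existing targets unless overridden) against temporary directories rather than /opt/aws/bin; the file names used are illustrative:

```python
# Sketch of the create_symlink pattern from cfn-create-aws-symlinks,
# exercised in throwaway directories.
import os
import tempfile


def create_symlink(source_file, target_file, override=False):
    """Link target_file -> source_file, leaving existing targets alone
    unless override is set (mirrors the script's behavior)."""
    if os.path.exists(target_file):
        if override:
            os.remove(target_file)
        else:
            print('%s already exists, will not replace with symlink'
                  % target_file)
            return
    os.symlink(source_file, target_file)


with tempfile.TemporaryDirectory() as src, tempfile.TemporaryDirectory() as dst:
    script = os.path.join(src, 'cfn-init')
    open(script, 'w').close()                 # stand-in for a cfn-* script
    link = os.path.join(dst, 'cfn-init')
    create_symlink(script, link)
    made_link = os.path.islink(link)
    # A second call without override leaves the existing link untouched.
    create_symlink(script, link)

print(made_link)  # True
```

The real script wraps this in a glob over `cfn-*` so every helper script gets a link in the target directory in one pass.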
@@ -1,85 +0,0 @@
#!/usr/bin/env python
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Implements cfn-get-metadata CloudFormation functionality
"""
import argparse
import logging


from heat_cfntools.cfntools import cfn_helper

description = " "
parser = argparse.ArgumentParser(description=description)
parser.add_argument('-s', '--stack',
                    dest="stack_name",
                    help="A Heat stack name",
                    required=True)
parser.add_argument('-r', '--resource',
                    dest="logical_resource_id",
                    help="A Heat logical resource ID",
                    required=True)
parser.add_argument('--access-key',
                    dest="access_key",
                    help="A Keystone access key",
                    required=False)
parser.add_argument('--secret-key',
                    dest="secret_key",
                    help="A Keystone secret key",
                    required=False)
parser.add_argument('--region',
                    dest="region",
                    help="Openstack region",
                    required=False)
parser.add_argument('--credential-file',
                    dest="credential_file",
                    help="credential-file",
                    required=False)
parser.add_argument('-u', '--url',
                    dest="url",
                    help="service url",
                    required=False)
parser.add_argument('-k', '--key',
                    dest="key",
                    help="key",
                    required=False)
args = parser.parse_args()

if not args.stack_name:
    print('The Stack name must not be empty.')
    exit(1)

if not args.logical_resource_id:
    print('The Resource ID must not be empty')
    exit(1)

log_format = '%(levelname)s [%(asctime)s] %(message)s'
logging.basicConfig(format=log_format, level=logging.DEBUG)

LOG = logging.getLogger('cfntools')
log_file_name = "/var/log/cfn-get-metadata.log"
file_handler = logging.FileHandler(log_file_name)
file_handler.setFormatter(logging.Formatter(log_format))
LOG.addHandler(file_handler)

metadata = cfn_helper.Metadata(args.stack_name,
                               args.logical_resource_id,
                               access_key=args.access_key,
                               secret_key=args.secret_key,
                               region=args.region,
                               credentials_file=args.credential_file)
metadata.retrieve()
LOG.debug(str(metadata))
metadata.display(args.key)
108  bin/cfn-hup
@@ -1,108 +0,0 @@
#!/usr/bin/env python
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Implements cfn-hup CloudFormation functionality
"""
import argparse
import logging
import os
import os.path


from heat_cfntools.cfntools import cfn_helper

description = " "
parser = argparse.ArgumentParser(description=description)
parser.add_argument('-c', '--config',
                    dest="config_dir",
                    help="Hook Config Directory",
                    required=False,
                    default='/etc/cfn/hooks.d')
parser.add_argument('-f', '--no-daemon',
                    dest="no_daemon",
                    action="store_true",
                    help="Do not run as a daemon",
                    required=False)
parser.add_argument('-v', '--verbose',
                    action="store_true",
                    dest="verbose",
                    help="Verbose logging",
                    required=False)
args = parser.parse_args()

# Setup logging
log_format = '%(levelname)s [%(asctime)s] %(message)s'
log_file_name = "/var/log/cfn-hup.log"
log_level = logging.INFO
if args.verbose:
    log_level = logging.DEBUG
logging.basicConfig(filename=log_file_name,
                    format=log_format,
                    level=log_level)

LOG = logging.getLogger('cfntools')

main_conf_path = '/etc/cfn/cfn-hup.conf'
try:
    main_config_file = open(main_conf_path)
except IOError as exc:
    LOG.error('Could not open main configuration at %s' % main_conf_path)
    exit(1)

config_files = []
hooks_conf_path = '/etc/cfn/hooks.conf'
if os.path.exists(hooks_conf_path):
    try:
        config_files.append(open(hooks_conf_path))
    except IOError as exc:
        LOG.exception(exc)

if args.config_dir and os.path.exists(args.config_dir):
    try:
        for f in os.listdir(args.config_dir):
            config_files.append(open(os.path.join(args.config_dir, f)))

    except OSError as exc:
        LOG.exception(exc)

if not config_files:
    LOG.error('No hook files found at %s or %s' % (hooks_conf_path,
                                                   args.config_dir))
    exit(1)

try:
    mainconfig = cfn_helper.HupConfig([main_config_file] + config_files)
except Exception as ex:
    LOG.error('Cannot load configuration: %s' % str(ex))
    exit(1)

if not mainconfig.unique_resources_get():
    LOG.error('No hooks were found. Add some to %s or %s' % (hooks_conf_path,
                                                             args.config_dir))
    exit(1)


for r in mainconfig.unique_resources_get():
    LOG.debug('Checking resource %s' % r)
    metadata = cfn_helper.Metadata(mainconfig.stack,
                                   r,
                                   credentials_file=mainconfig.credential_file,
                                   region=mainconfig.region)
    metadata.retrieve()
    try:
        metadata.cfn_hup(mainconfig.hooks)
    except Exception as e:
        LOG.exception("Error processing metadata")
        exit(1)
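cfn-hup's first job, as shown above, is to gather a main config plus every hook file found in a hooks.d-style directory before handing the open files to cfn_helper.HupConfig. That gather-and-parse step can be sketched with the standard configparser; the file names, section names, and keys below are illustrative stand-ins, not the tool's exact schema:

```python
# Sketch of cfn-hup's config gathering: one main file plus all files in a
# hooks.d-style directory, parsed together. Names/keys are illustrative.
import configparser
import os
import tempfile

with tempfile.TemporaryDirectory() as etc:
    main_conf = os.path.join(etc, 'cfn-hup.conf')
    with open(main_conf, 'w') as f:
        f.write('[main]\nstack=teststack\nregion=regionone\n')

    hooks_d = os.path.join(etc, 'hooks.d')
    os.makedirs(hooks_d)
    with open(os.path.join(hooks_d, 'restart.conf'), 'w') as f:
        f.write('[restart-hook]\ntriggers=post.update\naction=/bin/true\n')

    # Collect the main config plus every hook file, like cfn-hup does.
    config_files = [main_conf] + sorted(
        os.path.join(hooks_d, name) for name in os.listdir(hooks_d))
    cfg = configparser.ConfigParser()
    cfg.read(config_files)

    stack = cfg['main']['stack']
    hook_sections = [s for s in cfg.sections() if s != 'main']

print(stack, hook_sections)
```

The real HupConfig additionally deduplicates the resources referenced by the hooks (unique_resources_get) so each resource's metadata is fetched only once per pass.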
71  bin/cfn-init
@@ -1,71 +0,0 @@
#!/usr/bin/python
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Implements cfn-init CloudFormation functionality
"""
import argparse
import logging


from heat_cfntools.cfntools import cfn_helper

description = " "
parser = argparse.ArgumentParser(description=description)
parser.add_argument('-s', '--stack',
                    dest="stack_name",
                    help="A Heat stack name",
                    required=False)
parser.add_argument('-r', '--resource',
                    dest="logical_resource_id",
                    help="A Heat logical resource ID",
                    required=False)
parser.add_argument('--access-key',
                    dest="access_key",
                    help="A Keystone access key",
                    required=False)
parser.add_argument('--secret-key',
                    dest="secret_key",
                    help="A Keystone secret key",
                    required=False)
parser.add_argument('--region',
                    dest="region",
                    help="Openstack region",
                    required=False)
parser.add_argument('-c', '--configsets',
                    dest="configsets",
                    help="An optional list of configSets (default: default)",
                    required=False)
args = parser.parse_args()

log_format = '%(levelname)s [%(asctime)s] %(message)s'
log_file_name = "/var/log/cfn-init.log"
logging.basicConfig(filename=log_file_name,
                    format=log_format,
                    level=logging.DEBUG)

LOG = logging.getLogger('cfntools')

metadata = cfn_helper.Metadata(args.stack_name,
                               args.logical_resource_id,
                               access_key=args.access_key,
                               secret_key=args.secret_key,
                               region=args.region,
                               configsets=args.configsets)
metadata.retrieve()
try:
    metadata.cfn_init()
except Exception as e:
    LOG.exception("Error processing metadata")
    exit(1)
@@ -1,286 +0,0 @@
#!/usr/bin/env python
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Implements cfn-push-stats CloudFormation functionality
"""
import argparse
import logging
import os
import subprocess

# Override BOTO_CONFIG, which makes boto look only at the specified
# config file, instead of the default locations
os.environ['BOTO_CONFIG'] = '/var/lib/heat-cfntools/cfn-boto-cfg'
from boto.ec2 import cloudwatch


log_format = '%(levelname)s [%(asctime)s] %(message)s'
log_file_name = "/var/log/cfn-push-stats.log"
logging.basicConfig(filename=log_file_name,
                    format=log_format)
LOG = logging.getLogger('cfntools')

try:
    import psutil
except ImportError:
    LOG.warning("psutil not available. If you want process and memory "
                "statistics, you need to install it.")

from heat_cfntools.cfntools import cfn_helper

KILO = 1024
MEGA = 1048576
GIGA = 1073741824
unit_map = {'bytes': 1,
            'kilobytes': KILO,
            'megabytes': MEGA,
            'gigabytes': GIGA}

description = " "
parser = argparse.ArgumentParser(description=description)
parser.add_argument('-v', '--verbose', action="store_true",
                    help="Verbose logging", required=False)
parser.add_argument('--credential-file', dest="credential_file",
                    help="credential-file", required=False,
                    default='/etc/cfn/cfn-credentials')
parser.add_argument('--service-failure', required=False, action="store_true",
                    help='Reports a service failure.')
parser.add_argument('--mem-util', required=False, action="store_true",
                    help='Reports memory utilization in percentages.')
parser.add_argument('--mem-used', required=False, action="store_true",
                    help='Reports memory used (excluding cache/buffers) '
                         'in megabytes.')
parser.add_argument('--mem-avail', required=False, action="store_true",
                    help='Reports available memory (including cache/buffers) '
                         'in megabytes.')
parser.add_argument('--swap-util', required=False, action="store_true",
                    help='Reports swap utilization in percentages.')
parser.add_argument('--swap-used', required=False, action="store_true",
                    help='Reports allocated swap space in megabytes.')
parser.add_argument('--disk-space-util', required=False, action="store_true",
                    help='Reports disk space utilization in percentages.')
parser.add_argument('--disk-space-used', required=False, action="store_true",
                    help='Reports allocated disk space in gigabytes.')
parser.add_argument('--disk-space-avail', required=False, action="store_true",
                    help='Reports available disk space in gigabytes.')
parser.add_argument('--memory-units', required=False, default='megabytes',
                    help='Specifies units for memory metrics.')
parser.add_argument('--disk-units', required=False, default='megabytes',
                    help='Specifies units for disk metrics.')
parser.add_argument('--disk-path', required=False, default='/',
                    help='Selects the disk by the path on which to report.')
parser.add_argument('--cpu-util', required=False, action="store_true",
                    help='Reports cpu utilization in percentages.')
parser.add_argument('--haproxy', required=False, action='store_true',
                    help='Reports HAProxy loadbalancer usage.')
parser.add_argument('--haproxy-latency', required=False, action='store_true',
                    help='Reports HAProxy latency')
parser.add_argument('--heartbeat', required=False, action='store_true',
                    help='Sends a Heartbeat.')
parser.add_argument('--watch', required=False,
                    help='the name of the watch to post to.')
parser.add_argument('--metric', required=False,
                    help='name of the metric to post to.')
parser.add_argument('--units', required=False,
                    help='name of the units to be used for the specified'
                         'metric')
parser.add_argument('--value', required=False,
                    help='value to post to the specified metric')
args = parser.parse_args()

LOG.debug('cfn-push-stats called %s ' % (str(args)))

credentials = cfn_helper.parse_creds_file(args.credential_file)

namespace = 'system/linux'
data = {}

# Logging
# =======
if args.verbose:
    LOG.setLevel(logging.DEBUG)

# Generic user-specified metric
# =============================
if args.metric and args.units and args.value:
    data[args.metric] = {
        'Value': args.value,
        'Units': args.units}

# service failure
# ===============
if args.service_failure:
    data['ServiceFailure'] = {
        'Value': 1,
        'Units': 'Counter'}

# heartbeat
# ========
if args.heartbeat:
    data['Heartbeat'] = {
        'Value': 1,
        'Units': 'Counter'}

# memory space
# ============
if args.mem_util or args.mem_used or args.mem_avail:
    mem = psutil.phymem_usage()
    if args.mem_util:
        data['MemoryUtilization'] = {
            'Value': mem.percent,
            'Units': 'Percent'}
    if args.mem_used:
        data['MemoryUsed'] = {
            'Value': mem.used / unit_map[args.memory_units],
            'Units': args.memory_units}
    if args.mem_avail:
        data['MemoryAvailable'] = {
            'Value': mem.free / unit_map[args.memory_units],
            'Units': args.memory_units}

# swap space
# ==========
if args.swap_util or args.swap_used:
    swap = psutil.virtmem_usage()
    if args.swap_util:
        data['SwapUtilization'] = {
            'Value': swap.percent,
            'Units': 'Percent'}
    if args.swap_used:
        data['SwapUsed'] = {
            'Value': swap.used / unit_map[args.memory_units],
            'Units': args.memory_units}

# disk space
# ==========
if args.disk_space_util or args.disk_space_used or args.disk_space_avail:
    disk = psutil.disk_usage(args.disk_path)
    if args.disk_space_util:
        data['DiskSpaceUtilization'] = {
            'Value': disk.percent,
            'Units': 'Percent'}
    if args.disk_space_used:
        data['DiskSpaceUsed'] = {
            'Value': disk.used / unit_map[args.disk_units],
            'Units': args.disk_units}
    if args.disk_space_avail:
        data['DiskSpaceAvailable'] = {
            'Value': disk.free / unit_map[args.disk_units],
            'Units': args.disk_units}

# cpu utilization
# ===============
if args.cpu_util:
    # blocks for 1 second.
    cpu_percent = psutil.cpu_percent(interval=1)
    data['CPUUtilization'] = {
'Value': cpu_percent,
|
||||
'Units': 'Percent'}
|
||||
|
||||
|
||||
# HAProxy
|
||||
# =======
|
||||
def parse_haproxy_unix_socket(res, latency_only=False):
|
||||
# http://docs.amazonwebservices.com/ElasticLoadBalancing/latest
|
||||
# /DeveloperGuide/US_MonitoringLoadBalancerWithCW.html
|
||||
|
||||
type_map = {'FRONTEND': '0', 'BACKEND': '1', 'SERVER': '2', 'SOCKET': '3'}
|
||||
num_map = {'status': 17, 'svname': 1, 'check_duration': 38, 'type': 32,
|
||||
'req_tot': 48, 'hrsp_2xx': 40, 'hrsp_3xx': 41, 'hrsp_4xx': 42,
|
||||
'hrsp_5xx': 43}
|
||||
|
||||
def add_stat(key, value, unit='Counter'):
|
||||
res[key] = {'Value': value,
|
||||
'Units': unit}
|
||||
|
||||
echo = subprocess.Popen(['echo', 'show stat'],
|
||||
stdout=subprocess.PIPE)
|
||||
socat = subprocess.Popen(['socat', 'stdio', '/tmp/.haproxy-stats'],
|
||||
stdin=echo.stdout,
|
||||
stdout=subprocess.PIPE)
|
||||
end_pipe = socat.stdout
|
||||
raw = [l.strip('\n').split(',')
|
||||
for l in end_pipe if l[0] != '#' and len(l) > 2]
|
||||
latency = 0
|
||||
up_count = 0
|
||||
down_count = 0
|
||||
for f in raw:
|
||||
if latency_only is False:
|
||||
if f[num_map['type']] == type_map['FRONTEND']:
|
||||
add_stat('RequestCount', f[num_map['req_tot']])
|
||||
add_stat('HTTPCode_ELB_4XX', f[num_map['hrsp_4xx']])
|
||||
add_stat('HTTPCode_ELB_5XX', f[num_map['hrsp_5xx']])
|
||||
elif f[num_map['type']] == type_map['BACKEND']:
|
||||
add_stat('HTTPCode_Backend_2XX', f[num_map['hrsp_2xx']])
|
||||
add_stat('HTTPCode_Backend_3XX', f[num_map['hrsp_3xx']])
|
||||
add_stat('HTTPCode_Backend_4XX', f[num_map['hrsp_4xx']])
|
||||
add_stat('HTTPCode_Backend_5XX', f[num_map['hrsp_5xx']])
|
||||
else:
|
||||
if f[num_map['status']] == 'UP':
|
||||
up_count = up_count + 1
|
||||
else:
|
||||
down_count = down_count + 1
|
||||
if f[num_map['check_duration']] != '':
|
||||
latency = max(float(f[num_map['check_duration']]), latency)
|
||||
|
||||
# note: haproxy's check_duration is in ms, but Latency is in seconds
|
||||
add_stat('Latency', str(latency / 1000), unit='Seconds')
|
||||
if latency_only is False:
|
||||
add_stat('HealthyHostCount', str(up_count))
|
||||
add_stat('UnHealthyHostCount', str(down_count))
|
||||
|
||||
|
||||
def send_stats(info):
|
||||
|
||||
# Create boto connection, need the hard-coded port/path as boto
|
||||
# can't read these from config values in BOTO_CONFIG
|
||||
# FIXME : currently only http due to is_secure=False
|
||||
client = cloudwatch.CloudWatchConnection(
|
||||
aws_access_key_id=credentials['AWSAccessKeyId'],
|
||||
aws_secret_access_key=credentials['AWSSecretKey'],
|
||||
is_secure=False, port=8003, path="/v1", debug=0)
|
||||
|
||||
# Then we send the metric datapoints passed in "info", note this could
|
||||
# contain multiple keys as the options parsed above are not exclusive
|
||||
# The alarm name is passed as a dimension so the metric datapoint can
|
||||
# be associated with the alarm/watch in the engine
|
||||
metadata = cfn_helper.Metadata('not-used', None)
|
||||
metric_dims = metadata.get_tags()
|
||||
if args.watch:
|
||||
metric_dims['AlarmName'] = args.watch
|
||||
for key in info:
|
||||
LOG.info("Sending metric %s, Units %s, Value %s" %
|
||||
(key, info[key]['Units'], info[key]['Value']))
|
||||
client.put_metric_data(namespace=namespace,
|
||||
name=key,
|
||||
value=info[key]['Value'],
|
||||
timestamp=None, # means use "now" in the engine
|
||||
unit=info[key]['Units'],
|
||||
dimensions=metric_dims,
|
||||
statistics=None)
|
||||
|
||||
|
||||
if args.haproxy:
|
||||
namespace = 'AWS/ELB'
|
||||
lb_data = {}
|
||||
parse_haproxy_unix_socket(lb_data)
|
||||
send_stats(lb_data)
|
||||
elif args.haproxy_latency:
|
||||
namespace = 'AWS/ELB'
|
||||
lb_data = {}
|
||||
parse_haproxy_unix_socket(lb_data, latency_only=True)
|
||||
send_stats(lb_data)
|
||||
else:
|
||||
send_stats(data)
|
bin/cfn-signal
@@ -1,118 +0,0 @@
#!/usr/bin/env python
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Implements cfn-signal CloudFormation functionality
"""
import argparse
import logging
import sys


from heat_cfntools.cfntools import cfn_helper


description = " "
parser = argparse.ArgumentParser(description=description)
parser.add_argument('-s', '--success',
                    dest="success",
                    help="signal status to report",
                    default='true',
                    required=False)
parser.add_argument('-r', '--reason',
                    dest="reason",
                    help="The reason for the failure",
                    default="Configuration Complete",
                    required=False)
parser.add_argument('-d', '--data',
                    dest="data",
                    default="Application has completed configuration.",
                    help="The data to send",
                    required=False)
parser.add_argument('-i', '--id',
                    dest="unique_id",
                    help="the unique id to send back to the WaitCondition",
                    default=None,
                    required=False)
parser.add_argument('-e', '--exit-code',
                    dest="exit_code",
                    help="The exit code from a process to interpret",
                    default=None,
                    required=False)
parser.add_argument('--exit',
                    dest="exit",
                    help="DEPRECATED! Use -e or --exit-code instead.",
                    default=None,
                    required=False)
parser.add_argument('url',
                    help='the url to post to')
parser.add_argument('-k', '--insecure',
                    help="This will make insecure https requests to cfn-api.",
                    action='store_true')
args = parser.parse_args()

log_format = '%(levelname)s [%(asctime)s] %(message)s'
log_file_name = "/var/log/cfn-signal.log"
logging.basicConfig(filename=log_file_name,
                    format=log_format,
                    level=logging.DEBUG)

LOG = logging.getLogger('cfntools')

LOG.debug('cfn-signal called %s' % (str(args)))
if args.exit:
    LOG.warning('--exit DEPRECATED! Use -e or --exit-code instead.')
status = 'FAILURE'
exit_code = args.exit_code or args.exit
if exit_code:
    # "exit_code" takes precedence over "success".
    if exit_code == '0':
        status = 'SUCCESS'
else:
    if args.success == 'true':
        status = 'SUCCESS'

unique_id = args.unique_id
if unique_id is None:
    LOG.debug('No id passed from the command line')
    md = cfn_helper.Metadata('not-used', None)
    unique_id = md.get_instance_id()
    if unique_id is None:
        LOG.error('Could not get the instance id from metadata!')
        import socket
        unique_id = socket.getfqdn()
LOG.debug('id: %s' % (unique_id))

body = {
    "Status": status,
    "Reason": args.reason,
    "UniqueId": unique_id,
    "Data": args.data
}
data = cfn_helper.json.dumps(body)

cmd = ['curl']
if args.insecure:
    cmd.append('--insecure')
cmd.extend([
    '-X', 'PUT',
    '-H', 'Content-Type:',
    '--data-binary', data,
    args.url
])

command = cfn_helper.CommandRunner(cmd).run()
if command.status != 0:
    LOG.error(command.stderr)
sys.exit(command.status)
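The status resolution above gives the exit code precedence over `--success`. A small self-contained sketch of that rule (the function name is mine; the logic mirrors the script, using an explicit `is not None` check where the script relies on string truthiness):

```python
def resolve_status(success='true', exit_code=None):
    """Mirror cfn-signal's rule: an exit code, when given, wins over --success."""
    if exit_code is not None:
        # '0' means the watched process succeeded; anything else is a failure.
        return 'SUCCESS' if exit_code == '0' else 'FAILURE'
    return 'SUCCESS' if success == 'true' else 'FAILURE'


print(resolve_status())                                # SUCCESS (default --success=true)
print(resolve_status(exit_code='1'))                   # FAILURE (non-zero code wins)
print(resolve_status(success='false', exit_code='0'))  # SUCCESS (code overrides success)
```

Note both values arrive as strings from argparse, which is why the comparison is against `'0'` rather than the integer 0.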
@@ -1,2 +0,0 @@
target/
build/
doc/Makefile
@@ -1,153 +0,0 @@
# Makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS    =
SPHINXBUILD   = sphinx-build
PAPER         =
BUILDDIR      = build

# Internal variables.
PAPEROPT_a4     = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS   = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS  = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source

.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man texinfo info changes linkcheck doctest gettext

help:
	@echo "Please use \`make <target>' where <target> is one of"
	@echo "  html       to make standalone HTML files"
	@echo "  dirhtml    to make HTML files named index.html in directories"
	@echo "  singlehtml to make a single large HTML file"
	@echo "  pickle     to make pickle files"
	@echo "  json       to make JSON files"
	@echo "  htmlhelp   to make HTML files and a HTML help project"
	@echo "  qthelp     to make HTML files and a qthelp project"
	@echo "  devhelp    to make HTML files and a Devhelp project"
	@echo "  epub       to make an epub"
	@echo "  latex      to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
	@echo "  latexpdf   to make LaTeX files and run them through pdflatex"
	@echo "  text       to make text files"
	@echo "  man        to make manual pages"
	@echo "  texinfo    to make Texinfo files"
	@echo "  info       to make Texinfo files and run them through makeinfo"
	@echo "  gettext    to make PO message catalogs"
	@echo "  changes    to make an overview of all changed/added/deprecated items"
	@echo "  linkcheck  to check all external links for integrity"
	@echo "  doctest    to run all doctests embedded in the documentation (if enabled)"

clean:
	-rm -rf $(BUILDDIR)/*

html:
	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."

dirhtml:
	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."

singlehtml:
	$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
	@echo
	@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."

pickle:
	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
	@echo
	@echo "Build finished; now you can process the pickle files."

json:
	$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
	@echo
	@echo "Build finished; now you can process the JSON files."

htmlhelp:
	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
	@echo
	@echo "Build finished; now you can run HTML Help Workshop with the" \
	      ".hhp project file in $(BUILDDIR)/htmlhelp."

qthelp:
	$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
	@echo
	@echo "Build finished; now you can run "qcollectiongenerator" with the" \
	      ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
	@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/Heat.qhcp"
	@echo "To view the help file:"
	@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/Heat.qhc"

devhelp:
	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
	@echo
	@echo "Build finished."
	@echo "To view the help file:"
	@echo "# mkdir -p $$HOME/.local/share/devhelp/Heat"
	@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/Heat"
	@echo "# devhelp"

epub:
	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
	@echo
	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."

latex:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo
	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
	@echo "Run \`make' in that directory to run these through (pdf)latex" \
	      "(use \`make latexpdf' here to do that automatically)."

latexpdf:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through pdflatex..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

text:
	$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
	@echo
	@echo "Build finished. The text files are in $(BUILDDIR)/text."

man:
	$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
	@echo
	@echo "Build finished. The manual pages are in $(BUILDDIR)/man."

texinfo:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo
	@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
	@echo "Run \`make' in that directory to run these through makeinfo" \
	      "(use \`make info' here to do that automatically)."

info:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo "Running Texinfo files through makeinfo..."
	make -C $(BUILDDIR)/texinfo info
	@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."

gettext:
	$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
	@echo
	@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."

changes:
	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
	@echo
	@echo "The overview file is in $(BUILDDIR)/changes."

linkcheck:
	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
	@echo
	@echo "Link check complete; look for any errors in the above output " \
	      "or in $(BUILDDIR)/linkcheck/output.txt."

doctest:
	$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
	@echo "Testing of doctests in the sources finished, look at the " \
	      "results in $(BUILDDIR)/doctest/output.txt."
@@ -1,23 +0,0 @@
======================
Building the man pages
======================

Dependencies
============

Sphinx_
  You'll need sphinx (the python one), and if you are
  using the virtualenv you'll need to install it in the virtualenv
  specifically so that it can load the heat_cfntools modules.

::

  sudo yum install python-sphinx
  sudo pip-python install sphinxcontrib-httpdomain

Use `make`
==========

To build the man pages:

  make man
@@ -1,34 +0,0 @@
=======================
cfn-create-aws-symlinks
=======================

.. program:: cfn-create-aws-symlinks

SYNOPSIS
========

``cfn-create-aws-symlinks``

DESCRIPTION
===========
Creates symlinks for the cfn-* scripts in this directory to /opt/aws/bin


OPTIONS
=======
.. cmdoption:: -t, --target

   Target directory to create symlinks, defaults to /opt/aws/bin

.. cmdoption:: -s, --source

   Source directory to create symlinks from. Defaults to the directory where this script is

.. cmdoption:: -f, --force

   If specified, will create symlinks even if there is already a target file


BUGS
====
Heat bugs are managed through Launchpad <https://launchpad.net/heat-cfntools>
@@ -1,55 +0,0 @@
================
cfn-get-metadata
================

.. program:: cfn-get-metadata

SYNOPSIS
========

``cfn-get-metadata``

DESCRIPTION
===========
Implements cfn-get-metadata CloudFormation functionality


OPTIONS
=======
.. cmdoption:: -s, --stack

   A Heat stack name

.. cmdoption:: -r, --resource

   A Heat logical resource ID

.. cmdoption:: --access-key

   A Keystone access key

.. cmdoption:: --secret-key

   A Keystone secret key

.. cmdoption:: --region

   OpenStack region

.. cmdoption:: --credential-file

   credential-file

.. cmdoption:: -u, --url

   service url

.. cmdoption:: -k, --key

   key


BUGS
====
Heat bugs are managed through Launchpad <https://launchpad.net/heat-cfntools>
@@ -1,34 +0,0 @@
=======
cfn-hup
=======

.. program:: cfn-hup

SYNOPSIS
========

``cfn-hup``

DESCRIPTION
===========
Implements cfn-hup CloudFormation functionality


OPTIONS
=======
.. cmdoption:: -c, --config

   Hook Config Directory, defaults to /etc/cfn/hooks.d

.. cmdoption:: -f, --no-daemon

   Do not run as a daemon

.. cmdoption:: -v, --verbose

   Verbose logging


BUGS
====
Heat bugs are managed through Launchpad <https://launchpad.net/heat-cfntools>
@@ -1,46 +0,0 @@
========
cfn-init
========

.. program:: cfn-init

SYNOPSIS
========

``cfn-init``

DESCRIPTION
===========
Implements cfn-init CloudFormation functionality


OPTIONS
=======
.. cmdoption:: -s, --stack

   A Heat stack name

.. cmdoption:: -r, --resource

   A Heat logical resource ID

.. cmdoption:: --access-key

   A Keystone access key

.. cmdoption:: --secret-key

   A Keystone secret key

.. cmdoption:: --region

   OpenStack region

.. cmdoption:: -c, --configsets

   An optional list of configSets (default: default)


BUGS
====
Heat bugs are managed through Launchpad <https://launchpad.net/heat-cfntools>
@@ -1,98 +0,0 @@
==============
cfn-push-stats
==============

.. program:: cfn-push-stats

SYNOPSIS
========

``cfn-push-stats``

DESCRIPTION
===========
Implements cfn-push-stats CloudFormation functionality


OPTIONS
=======
.. cmdoption:: -v, --verbose

   Verbose logging

.. cmdoption:: --credential-file

   credential-file

.. cmdoption:: --service-failure

   Reports a service failure.

.. cmdoption:: --mem-util

   Reports memory utilization in percentages.

.. cmdoption:: --mem-used

   Reports memory used (excluding cache and buffers) in megabytes.

.. cmdoption:: --mem-avail

   Reports available memory (including cache and buffers) in megabytes.

.. cmdoption:: --swap-util

   Reports swap utilization in percentages.

.. cmdoption:: --swap-used

   Reports allocated swap space in megabytes.

.. cmdoption:: --disk-space-util

   Reports disk space utilization in percentages.

.. cmdoption:: --disk-space-used

   Reports allocated disk space in gigabytes.

.. cmdoption:: --disk-space-avail

   Reports available disk space in gigabytes.

.. cmdoption:: --memory-units

   Specifies units for memory metrics.

.. cmdoption:: --disk-units

   Specifies units for disk metrics.

.. cmdoption:: --disk-path

   Selects the disk by the path on which to report.

.. cmdoption:: --cpu-util

   Reports cpu utilization in percentages.

.. cmdoption:: --haproxy

   Reports HAProxy loadbalancer usage.

.. cmdoption:: --haproxy-latency

   Reports HAProxy latency.

.. cmdoption:: --heartbeat

   Sends a heartbeat.

.. cmdoption:: --watch

   the name of the watch to post to.


BUGS
====
Heat bugs are managed through Launchpad <https://launchpad.net/heat-cfntools>
@@ -1,42 +0,0 @@
==========
cfn-signal
==========

.. program:: cfn-signal

SYNOPSIS
========

``cfn-signal``

DESCRIPTION
===========
Implements cfn-signal CloudFormation functionality


OPTIONS
=======
.. cmdoption:: -s, --success

   signal status to report

.. cmdoption:: -r, --reason

   The reason for the failure

.. cmdoption:: -d, --data

   The data to send

.. cmdoption:: -i, --id

   the unique id to send back to the WaitCondition

.. cmdoption:: -e, --exit-code

   The exit code from a process to interpret


BUGS
====
Heat bugs are managed through Launchpad <https://launchpad.net/heat-cfntools>
@@ -1,193 +0,0 @@
# -*- coding: utf-8 -*-
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
# heat-cfntools documentation build configuration file, created by
# sphinx-quickstart on Thu Jul 20 09:19:39 2017.
#
# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))


# -- General configuration ------------------------------------------------

# If your documentation needs a minimal Sphinx version, state it here.
#
# needs_sphinx = '1.0'

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = ['sphinx.ext.autodoc',
              'openstackdocstheme']

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# The suffix(es) of source filenames.
# You can specify multiple suffixes as a list of strings:
#
# source_suffix = ['.rst', '.md']
source_suffix = '.rst'

# The master toctree document.
master_doc = 'index'

# General information about the project.
project = 'heat-cfntools'
copyright = 'OpenStack Foundation'

# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = ''
# The full version, including alpha/beta/rc tags.
release = ''

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = None

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# These patterns also affect html_static_path and html_extra_path.
exclude_patterns = []

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'

# If true, `todo` and `todoList` produce output, else they produce nothing.
# todo_include_todos = False


# -- Options for HTML output ----------------------------------------------

# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
html_theme = 'openstackdocs'

# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#
# html_theme_options = {}

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
# html_static_path = ['_static']

# Custom sidebar templates, must be a dictionary that maps document names
# to template names.
#
# This is required for the alabaster theme
# refs: http://alabaster.readthedocs.io/en/latest/installation.html#sidebars
# html_sidebars = {}

# -- Options for openstackdocstheme --------------------------------------
repository_name = 'openstack/heat-cfntools'
bug_project = 'heat-cfntools'
bug_tag = ''

# -- Options for HTMLHelp output ------------------------------------------

# Output file base name for HTML help builder.
htmlhelp_basename = 'heat-cfntoolsdoc'


# -- Options for LaTeX output ---------------------------------------------

latex_elements = {
    # The paper size ('letterpaper' or 'a4paper').
    #
    # 'papersize': 'letterpaper',

    # The font size ('10pt', '11pt' or '12pt').
    #
    # 'pointsize': '10pt',

    # Additional stuff for the LaTeX preamble.
    #
    # 'preamble': '',

    # Latex figure (float) alignment
    #
    # 'figure_align': 'htbp',
}

# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
#  author, documentclass [howto, manual, or own class]).
latex_documents = [
    (master_doc, 'heat-cfntools.tex', 'heat-cfntools Documentation',
     'OpenStack Foundation', 'manual'),
]


# -- Options for manual page output ---------------------------------------

# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
    (master_doc, 'heat-cfntools', 'heat-cfntools Documentation',
     ['Heat Developers'], 1),
    ('cfn-create-aws-symlinks', 'cfn-create-aws-symlinks',
     u'Creates symlinks for the cfn-* scripts in this directory to /opt/aws/bin',
     [u'Heat Developers'], 1),
    ('cfn-get-metadata', 'cfn-get-metadata',
     u'Implements cfn-get-metadata CloudFormation functionality',
     [u'Heat Developers'], 1),
    ('cfn-hup', 'cfn-hup',
     u'Implements cfn-hup CloudFormation functionality',
     [u'Heat Developers'], 1),
    ('cfn-init', 'cfn-init',
     u'Implements cfn-init CloudFormation functionality',
     [u'Heat Developers'], 1),
    ('cfn-push-stats', 'cfn-push-stats',
     u'Implements cfn-push-stats CloudFormation functionality',
     [u'Heat Developers'], 1),
    ('cfn-signal', 'cfn-signal',
     u'Implements cfn-signal CloudFormation functionality',
     [u'Heat Developers'], 1),
]


# -- Options for Texinfo output -------------------------------------------

# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
#  dir menu entry, description, category)
texinfo_documents = [
    (master_doc, 'heat-cfntools', 'heat-cfntools Documentation',
     'Heat Developers', 'heat-cfntools', 'One line description of project.',
     'Miscellaneous'),
]
@@ -1,17 +0,0 @@
=====================================
Man pages for Heat cfntools utilities
=====================================

-------------
Heat cfntools
-------------

.. toctree::
   :maxdepth: 1

   cfn-create-aws-symlinks
   cfn-get-metadata
   cfn-hup
   cfn-init
   cfn-push-stats
   cfn-signal
File diff suppressed because it is too large
|
@@ -1,91 +0,0 @@
#
# Copyright 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import fixtures
import mock
import tempfile
import testtools

from heat_cfntools.cfntools import cfn_helper


class TestCfnHup(testtools.TestCase):

    def setUp(self):
        super(TestCfnHup, self).setUp()
        self.logger = self.useFixture(fixtures.FakeLogger())
        self.stack_name = self.getUniqueString()
        self.resource = self.getUniqueString()
        self.region = self.getUniqueString()
        self.creds = tempfile.NamedTemporaryFile()
        self.metadata = cfn_helper.Metadata(self.stack_name,
                                            self.resource,
                                            credentials_file=self.creds.name,
                                            region=self.region)
        self.init_content = self.getUniqueString()
        self.init_temp = tempfile.NamedTemporaryFile()
        self.service_name = self.getUniqueString()
        self.init_section = {'AWS::CloudFormation::Init': {
            'config': {
                'services': {
                    'sysvinit': {
                        self.service_name: {
                            'enabled': True,
                            'ensureRunning': True,
                        }
                    }
                },
                'files': {
                    self.init_temp.name: {
                        'content': self.init_content
                    }
                }
            }
        }}

    def _mock_retrieve_metadata(self, desired_metadata):
        with mock.patch.object(
                cfn_helper.Metadata, 'remote_metadata') as mock_method:
            mock_method.return_value = desired_metadata
            with tempfile.NamedTemporaryFile() as last_md:
                self.metadata.retrieve(last_path=last_md.name)

    def _test_cfn_hup_metadata(self, metadata):
        self._mock_retrieve_metadata(metadata)
        FakeServicesHandler = mock.Mock()
        FakeServicesHandler.monitor_services.return_value = None
        self.useFixture(
            fixtures.MonkeyPatch(
                'heat_cfntools.cfntools.cfn_helper.ServicesHandler',
                FakeServicesHandler))

        section = self.getUniqueString()
        triggers = 'post.add,post.delete,post.update'
        path = 'Resources.%s.Metadata' % self.resource
        runas = 'root'
        action = '/bin/sh -c "true"'
        hook = cfn_helper.Hook(section, triggers, path, runas, action)

        with mock.patch.object(cfn_helper.Hook, 'event') as mock_method:
            mock_method.return_value = None
            self.metadata.cfn_hup([hook])

    def test_cfn_hup_empty_metadata(self):
        self._test_cfn_hup_metadata({})

    def test_cfn_hup_cfn_init_metadata(self):
        self._test_cfn_hup_metadata(self.init_section)
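The `Hook` fields exercised by the test above (section name, triggers, metadata path, runas user, action) mirror the ini-style hooks file that cfn-hup conventionally reads at runtime. As a hedged sketch, parsing such a file with the stdlib is straightforward; the section name and action below are illustrative, not taken from this repository:

```python
import configparser

# Hypothetical hooks file in the ini format cfn-hup conventionally reads;
# the section name and action values are made up for illustration.
HOOKS_CONF = """\
[cfn-auto-reloader-hook]
triggers=post.add,post.delete,post.update
path=Resources.WebServerInstance.Metadata
action=/bin/sh -c "true"
runas=root
"""

cfg = configparser.ConfigParser()
cfg.read_string(HOOKS_CONF)
hook = cfg["cfn-auto-reloader-hook"]
# Each field maps onto one positional argument of cfn_helper.Hook(...).
fields = (hook["triggers"], hook["path"], hook["runas"], hook["action"])
```

Each parsed section would yield one `Hook` object, which is exactly the shape the test constructs by hand.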
@@ -1,4 +0,0 @@
pbr>=0.6,!=0.7,<1.0
boto>=2.12.0,!=2.13.0
psutil>=1.1.1,<2.0.0
six>=1.9.0
setup.cfg
@@ -1,43 +0,0 @@
[metadata]
name = heat-cfntools
summary = Tools required to be installed on Heat provisioned cloud instances
description-file =
    README.rst
author = OpenStack
author-email = openstack-dev@lists.openstack.org
home-page = http://www.openstack.org/
classifier =
    Environment :: OpenStack
    Intended Audience :: Information Technology
    Intended Audience :: System Administrators
    License :: OSI Approved :: Apache Software License
    Operating System :: POSIX :: Linux
    Programming Language :: Python
    Programming Language :: Python :: 2
    Programming Language :: Python :: 2.7

[files]
packages =
    heat_cfntools
scripts =
    bin/cfn-create-aws-symlinks
    bin/cfn-get-metadata
    bin/cfn-hup
    bin/cfn-init
    bin/cfn-push-stats
    bin/cfn-signal

[global]
setup-hooks =
    pbr.hooks.setup_hook

[wheel]
universal = 1

[build_sphinx]
source-dir = doc/source
build-dir = doc/build
all_files = 1

[upload_sphinx]
upload-dir = doc/build/html
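pbr derives the install manifest from the declarative sections of setup.cfg. As a rough illustration (not pbr's actual implementation), the `[files]` scripts list is just a newline-separated multi-line option value that stdlib `configparser` can read back:

```python
import configparser

# Excerpt of the [files] section above; parsing it shows how the
# script list is a plain newline-separated multi-line option value.
SETUP_CFG = """\
[files]
packages =
    heat_cfntools
scripts =
    bin/cfn-init
    bin/cfn-signal
"""

cfg = configparser.ConfigParser()
cfg.read_string(SETUP_CFG)
scripts = cfg.get("files", "scripts").split()
# scripts == ['bin/cfn-init', 'bin/cfn-signal']
```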
setup.py
@@ -1,22 +0,0 @@
#!/usr/bin/env python
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# THIS FILE IS MANAGED BY THE GLOBAL REQUIREMENTS REPO - DO NOT EDIT
import setuptools

setuptools.setup(
    setup_requires=['pbr'],
    pbr=True)
@@ -1,9 +0,0 @@
# Hacking already pins down pep8, pyflakes and flake8
hacking>=0.8.0,<0.9

mock>=1.0
discover
openstackdocstheme>=1.11.0  # Apache-2.0
sphinx>=1.6.2  # BSD
testrepository>=0.0.18
testtools>=0.9.34
@@ -1,198 +0,0 @@
#!/usr/bin/env python

# Copyright (c) 2012, AT&T Labs, Yun Mao <yunmao@gmail.com>
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""pylint error checking."""

import cStringIO as StringIO
import json
import re
import sys

from pylint import lint
from pylint.reporters import text

# Note(maoy): E1103 is error code related to partial type inference
ignore_codes = ["E1103"]
# Note(maoy): the error message is the pattern of E0202. It should be ignored
# for nova.tests modules
ignore_messages = ["An attribute affected in nova.tests"]
# Note(maoy): we ignore all errors in openstack.common because it should be
# checked elsewhere. We also ignore nova.tests for now due to high false
# positive rate.
ignore_modules = ["nova/openstack/common/", "nova/tests/"]

KNOWN_PYLINT_EXCEPTIONS_FILE = "tools/pylint_exceptions"


class LintOutput(object):

    _cached_filename = None
    _cached_content = None

    def __init__(self, filename, lineno, line_content, code, message,
                 lintoutput):
        self.filename = filename
        self.lineno = lineno
        self.line_content = line_content
        self.code = code
        self.message = message
        self.lintoutput = lintoutput

    @classmethod
    def from_line(cls, line):
        m = re.search(r"(\S+):(\d+): \[(\S+)(, \S+)?] (.*)", line)
        matched = m.groups()
        filename, lineno, code, message = (matched[0], int(matched[1]),
                                           matched[2], matched[-1])
        if cls._cached_filename != filename:
            with open(filename) as f:
                cls._cached_content = list(f.readlines())
                cls._cached_filename = filename
        line_content = cls._cached_content[lineno - 1].rstrip()
        return cls(filename, lineno, line_content, code, message,
                   line.rstrip())

    @classmethod
    def from_msg_to_dict(cls, msg):
        """From the output of pylint msg, to a dict, where each key
        is a unique error identifier, value is a list of LintOutput
        """
        result = {}
        for line in msg.splitlines():
            obj = cls.from_line(line)
            if obj.is_ignored():
                continue
            key = obj.key()
            if key not in result:
                result[key] = []
            result[key].append(obj)
        return result

    def is_ignored(self):
        if self.code in ignore_codes:
            return True
        if any(self.filename.startswith(name) for name in ignore_modules):
            return True
        if any(msg in self.message for msg in ignore_messages):
            return True
        return False

    def key(self):
        if self.code in ["E1101", "E1103"]:
            # These two types of errors are like Foo class has no member bar.
            # We discard the source code so that the error will be ignored
            # next time another Foo.bar is encountered.
            return self.message, ""
        return self.message, self.line_content.strip()

    def json(self):
        return json.dumps(self.__dict__)

    def review_str(self):
        return ("File %(filename)s\nLine %(lineno)d:%(line_content)s\n"
                "%(code)s: %(message)s" % self.__dict__)


class ErrorKeys(object):

    @classmethod
    def print_json(cls, errors, output=sys.stdout):
        print >>output, "# automatically generated by tools/lintstack.py"
        for i in sorted(errors.keys()):
            print >>output, json.dumps(i)

    @classmethod
    def from_file(cls, filename):
        keys = set()
        for line in open(filename):
            if line and line[0] != "#":
                d = json.loads(line)
                keys.add(tuple(d))
        return keys


def run_pylint():
    buff = StringIO.StringIO()
    reporter = text.ParseableTextReporter(output=buff)
    args = ["--include-ids=y", "-E", "nova"]
    lint.Run(args, reporter=reporter, exit=False)
    val = buff.getvalue()
    buff.close()
    return val


def generate_error_keys(msg=None):
    print "Generating", KNOWN_PYLINT_EXCEPTIONS_FILE
    if msg is None:
        msg = run_pylint()
    errors = LintOutput.from_msg_to_dict(msg)
    with open(KNOWN_PYLINT_EXCEPTIONS_FILE, "w") as f:
        ErrorKeys.print_json(errors, output=f)


def validate(newmsg=None):
    print "Loading", KNOWN_PYLINT_EXCEPTIONS_FILE
    known = ErrorKeys.from_file(KNOWN_PYLINT_EXCEPTIONS_FILE)
    if newmsg is None:
        print "Running pylint. Be patient..."
        newmsg = run_pylint()
    errors = LintOutput.from_msg_to_dict(newmsg)

    print "Unique errors reported by pylint: was %d, now %d." \
        % (len(known), len(errors))
    passed = True
    for err_key, err_list in errors.items():
        for err in err_list:
            if err_key not in known:
                print err.lintoutput
                print
                passed = False
    if passed:
        print "Congrats! pylint check passed."
        redundant = known - set(errors.keys())
        if redundant:
            print "Extra credit: some known pylint exceptions disappeared."
            for i in sorted(redundant):
                print json.dumps(i)
            print "Consider regenerating the exception file if you will."
    else:
        print("Please fix the errors above. If you believe they are false"
              " positives, run 'tools/lintstack.py generate' to overwrite.")
        sys.exit(1)


def usage():
    print """Usage: tools/lintstack.py [generate|validate]
To generate pylint_exceptions file: tools/lintstack.py generate
To validate the current commit: tools/lintstack.py
"""


def main():
    option = "validate"
    if len(sys.argv) > 1:
        option = sys.argv[1]
    if option == "generate":
        generate_error_keys()
    elif option == "validate":
        validate()
    else:
        usage()


if __name__ == "__main__":
    main()
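The `LintOutput.from_line` parser above hinges on a single regular expression over pylint's "parseable" output format. A small self-contained check of that pattern (the sample line is fabricated for illustration):

```python
import re

# The same pattern LintOutput.from_line uses: filename, line number,
# message code (with an optional ", symbol" suffix), and message text.
PATTERN = r"(\S+):(\d+): \[(\S+)(, \S+)?] (.*)"

# Fabricated sample line in pylint's old parseable format.
sample = ("nova/compute/api.py:42: [E1101, Foo.bar] "
          "Instance of 'Foo' has no 'bar' member")
m = re.search(PATTERN, sample)
filename = m.group(1)
lineno = int(m.group(2))
code = m.group(3)
message = m.group(5)
```

Note that the `(, \S+)?` group absorbs the optional symbol name after the code, which is why `from_line` takes the message from the last group rather than counting from the code.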
@@ -1,59 +0,0 @@
#!/usr/bin/env bash

# Copyright (c) 2012-2013, AT&T Labs, Yun Mao <yunmao@gmail.com>
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

# Use lintstack.py to compare pylint errors.
# We run pylint twice, once on HEAD, once on the code before the latest
# commit for review.
set -e
TOOLS_DIR=$(cd $(dirname "$0") && pwd)
# Get the current branch name.
GITHEAD=`git rev-parse --abbrev-ref HEAD`
if [[ "$GITHEAD" == "HEAD" ]]; then
    # In detached head mode, get revision number instead
    GITHEAD=`git rev-parse HEAD`
    echo "Currently we are at commit $GITHEAD"
else
    echo "Currently we are at branch $GITHEAD"
fi

cp -f $TOOLS_DIR/lintstack.py $TOOLS_DIR/lintstack.head.py

if git rev-parse HEAD^2 2>/dev/null; then
    # The HEAD is a Merge commit. Here, the patch to review is
    # HEAD^2, the master branch is at HEAD^1, and the patch was
    # written based on HEAD^2~1.
    PREV_COMMIT=`git rev-parse HEAD^2~1`
    git checkout HEAD~1
    # The git merge is necessary for reviews with a series of patches.
    # If not, this is a no-op so won't hurt either.
    git merge $PREV_COMMIT
else
    # The HEAD is not a merge commit. This won't happen on gerrit.
    # Most likely you are running against your own patch locally.
    # We assume the patch to examine is HEAD, and we compare it against
    # HEAD~1
    git checkout HEAD~1
fi

# First generate tools/pylint_exceptions from HEAD~1
$TOOLS_DIR/lintstack.head.py generate
# Then use that as a reference to compare against HEAD
git checkout $GITHEAD
$TOOLS_DIR/lintstack.head.py
echo "Check passed. FYI: the pylint exceptions are:"
cat $TOOLS_DIR/pylint_exceptions
tox.ini
@@ -1,35 +0,0 @@
[tox]
envlist = py34,py27,pep8

[testenv]
setenv = VIRTUAL_ENV={envdir}
deps = -r{toxinidir}/requirements.txt
       -r{toxinidir}/test-requirements.txt
commands = python setup.py testr --slowest --testr-args='{posargs}'

[testenv:pep8]
commands = flake8
           flake8 --filename=cfn-* bin

[testenv:pylint]
setenv = VIRTUAL_ENV={envdir}
deps = -r{toxinidir}/requirements.txt
       pylint==0.26.0
commands = bash tools/lintstack.sh

[testenv:cover]
commands =
    python setup.py testr --coverage --testr-args='{posargs}'

[testenv:venv]
commands = {posargs}

[flake8]
show-source = true
exclude = .venv,.git,.tox,dist,doc,*lib/python*,*egg,tools

[testenv:docs]
deps = -r{toxinidir}/requirements.txt
       -r{toxinidir}/test-requirements.txt
       sphinxcontrib-httpdomain
commands = python setup.py build_sphinx