StarlingX open source release updates

Signed-off-by: Dean Troyer <dtroyer@gmail.com>
Dean Troyer 2018-05-30 16:16:19 -07:00
parent 4b6bbc9989
commit 98f9250109
26 changed files with 3640 additions and 0 deletions

CONTRIBUTORS.wrs

@@ -0,0 +1,7 @@
The following contributors from Wind River have developed the seed code in this
repository. We look forward to community collaboration and contributions for
additional features, enhancements and refactoring.
Contributors:
=============
Jerry Sun <Jerry.Sun@windriver.com>

LICENSE

@@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

README.rst

@@ -0,0 +1,5 @@
===========
stx-clients
===========
StarlingX Clients


@@ -0,0 +1,3 @@
SRC_DIR=install-log-server
COPY_LIST="$SRC_DIR/*"
TIS_PATCH_VER=4


@@ -0,0 +1,28 @@
Summary: install-log-server
Name: install-log-server
Version: 1.1.2
Release: %{tis_patch_ver}%{?_tis_dist}
License: Apache-2.0
Group: devel
Packager: Wind River <info@windriver.com>
URL: unknown
Source0: %{name}-%{version}.tar.gz
%define cgcs_sdk_deploy_dir /opt/deploy/cgcs_sdk
%description
Titanium Cloud log server installation
%prep
%setup
mv %name wrs-%{name}-%{version}
tar czf wrs-%{name}-%{version}.tgz wrs-%{name}-%{version}
# Install the SDK tarball into the deploy directory
%install
install -D -m 644 wrs-%{name}-%{version}.tgz %{buildroot}%{cgcs_sdk_deploy_dir}/wrs-%{name}-%{version}.tgz
%files
%{cgcs_sdk_deploy_dir}/wrs-%{name}-%{version}.tgz
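# NOTE (illustration only): the Release tag uses the StarlingX-specific macros tis_patch_ver
# (fed from TIS_PATCH_VER in the build data file above) and _tis_dist, which the StarlingX
# build tooling normally defines. Assuming the install-log-server-1.1.2.tar.gz source tarball
# is already in the rpmbuild SOURCES directory, the package could in principle be built outside
# that tooling by defining the macros by hand, e.g.:
#   rpmbuild -bb --define "tis_patch_ver 4" --define "_tis_dist .tis" install-log-server.spec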


@@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.


@@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.


@@ -0,0 +1,93 @@
Copyright © 2016-2017 Wind River Systems, Inc.
SPDX-License-Identifier: Apache-2.0
-----------------------------------------------------------------------
EXAMPLE Titanium Cloud Log Server configuration utility
--------------------------------------------------------
This utility provides an example of how to install and configure
Elasticsearch, Kibana and Logstash on a remote Linux central logging
server, for the purposes of receiving syslogs from a Titanium
Cloud System. This utility will install the open-source ELK packages
with the appropriate network configuration on the Linux server.
Refer to the Titanium Cloud System Administration guide for how to
configure the Titanium Cloud System to send its syslogs to this remote
Linux server.
NOTE: This is ONLY an EXAMPLE of an ELK installation on a remote Linux
central logging server. Wind River does NOT support the ELK installation
itself, only the sending of syslogs to the ELK server from Titanium Cloud.
NOTE: This example installer has only been tested with IPv4.
Supported Operating Systems
---------------------------
Ubuntu 14.04 LTS
Ubuntu 12.04 LTS
Ubuntu 16.04 LTS
CentOS 6
CentOS 7
Installing the utility
----------------------
To install the utility on a Linux machine follow these steps:
1. Untar the provided tarball:
tar xfv wrs-install-log-server-*.tgz
or use your favorite archive manager.
2. cd wrs-install-log-server-*
3. Execute the provided installer script:
sudo ./install-log-server.sh -i <IP Address> [ -u | -t ] [... Other Options ...]
Usage:
-c Path to a ca certificate file that Logstash will use.
-h Show help options.
-i The IP address all Elasticsearch, Logstash, Kibana modules will use to bind and publish to.
-k Path to a server key file that Logstash will use.
-p The port Logstash will bind and listen to. The default port is 514. Privileged ports are redirected to port 10514.
-t Set this system up to receive logs through TCP. (at least one of TCP/TLS or UDP options must be selected)
-u Set this system up to receive logs through UDP. (at least one of TCP/TLS or UDP options must be selected)
This utility installs a remote log server and configures communications with Titanium Cloud.
Refer to the Titanium Cloud System Administration guide and README file for more details.
NOTE: Both TCP and UDP can be configured at the same time. Specifying -c and -k adds TLS encryption on top of TCP.
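For example, assuming the log server's address is 192.0.2.10 (a placeholder for the address passed
with -i) and you want to receive logs over both TCP and UDP, with TLS on top of TCP using a locally
provided certificate and key, a typical invocation might look like:
sudo ./install-log-server.sh -i 192.0.2.10 -t -u -c ./ca-cert.pem -k ./server-key.pem
Omit -c and -k for plain TCP, or pass only -u to receive logs over UDP only.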
Using the Kibana web interface
------------------------------
To begin using the log server, you must enable remote logging on the Titanium
Cloud system. Kibana provides a web-based interface for using an installed
and configured remote log server. Refer to the Titanium Cloud System
Administration guide and README file for additional details on getting
started with Kibana.
1. Open Kibana in your browser http://YOURDOMAIN.com:5601 or
http://<IP Address>:5601
2. To use Kibana dashboard visualizations, open Kibana in a browser, click the Dashboard tab, click the
+ button, click "manage visualizations" on the right above the Visualization Filter text box, click
"Import" and upload the tisElkDashboards.json file found in the directory containing this doc.
After it has loaded, you can view either Dashboards or Visualizations by going to the Settings tab
and clicking the Dashboard or Visualizations tab on that page. To view dashboards, select them and
click the eye icon on the right. Selecting multiple dashboards is not supported because some dashboards
are set to take up the entire page, so other dashboards might not be displayed or some of their
visualizations may be cut off. It is therefore recommended to view only one dashboard at a time, or
to add and check dashboards one-by-one to verify that the selected dashboards can be displayed
together. To view an individual visualization, click the eye icon beside the visualization's name; to view
multiple visualizations, click the Dashboard tab at the top of the page, click the + button, and the
visualizations should appear in a droplist. Select as many as you'd like to view.
Visualizations are graphical representations of the log data grouped by attributes such as field name.
Dashboards are collections of visualizations.
Note: The provided dashboards and visualizations are just samples of dashboards that can be constructed
based on logs collected from Titanium Cloud.
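As a quick post-install sanity check (mirroring the checks the installer script itself performs),
you can query Elasticsearch and Kibana directly; the IP address below is a placeholder for the
address passed with -i:
# Elasticsearch cluster health; "status" should be green or yellow
curl -s "http://192.0.2.10:9200/_cluster/health?pretty=true"
# Kibana web interface; should return an HTTP 200 status code
curl -s -o /dev/null -w "%{http_code}\n" "http://192.0.2.10:5601"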


@@ -0,0 +1,482 @@
#!/bin/bash
#
# Copyright (c) 2016-2017 Wind River Systems, Inc.
#
# SPDX-License-Identifier: Apache-2.0
#
#
#The following paths are using for package installation
ELASTICSEARCH_RPM_URL=https://download.elastic.co/elasticsearch/release/org/elasticsearch/distribution/rpm/elasticsearch/2.3.2/elasticsearch-2.3.2.rpm
ELASTICSEARCH_DEB_URL=https://download.elastic.co/elasticsearch/release/org/elasticsearch/distribution/deb/elasticsearch/2.3.2/elasticsearch-2.3.2.deb
LOGSTASH_RPM_URL=https://download.elastic.co/logstash/logstash/packages/centos/logstash-2.3.4-1.noarch.rpm
LOGSTASH_DEB_URL=https://download.elastic.co/logstash/logstash/packages/debian/logstash_2.3.4-1_all.deb
KIBANA_RPM_URL=https://download.elastic.co/kibana/kibana/kibana-4.5.1-1.x86_64.rpm
KIBANA_DEB_URL=https://download.elastic.co/kibana/kibana/kibana_4.5.1_amd64.deb
printusage () {
echo "Usage:"
echo "install-log-server -i <IP Address> [OPTION...]"
echo "-c Path to a ca certificate file that logstash will use."
echo "-h Show help options."
echo "-i The IP address all Elasticsearch, Logstash, Kibana modules will use to bind and publish to."
echo "-k Path to a server key file that logstash will use."
echo "-p The port Logstash will bind and listen to. Privileged ports are redirected to port 10514."
echo "-t Set this system up to receive logs through TCP. (at least one of TCP/TLS or UDP options must be selected)"
echo "-u Set this system up to receive logs through UDP. (at least one of TCP/TLS or UDP options must be selected)"
echo ""
echo "This utility will install a remote log server and configures communications with Titanium Cloud."
echo "Refer to the Titanium Cloud System Administration guide and README file for more details."
}
PORT=514 # The default port to align with the Titanium Cloud remote logging component port is 514
while getopts ":c:h:i:k:p:tu" opt; do
case $opt in
c)
CERT_FILE=${OPTARG}
;;
h)
printusage
exit 0
;;
i)
IP_ADDRESS=${OPTARG}
;;
k)
KEY_FILE=${OPTARG}
;;
p)
PORT=${OPTARG}
;;
t)
USE_TCP=true
;;
u)
USE_UDP=true
;;
\?)
echo "Invalid option: -$OPTARG, valid options are -h, -i, and -p."
exit 1
;;
esac
done
# The -i option is mandatory
if [[ -z $IP_ADDRESS ]] ; then
echo "The IP Address option is mandatory: install-log-server -i <IP Address>"
# config must set logstash up for SOMETHING
if [ ! $USE_TCP ] && [ ! $USE_UDP ] ; then
echo "and at least one of TCP/TLS or UDP options must also be selected. "
fi
printusage
exit 1
fi
TLS_PARAM_COUNT=0
# To enable TLS, both the certificate and the key must be provided, not one without the other
if [[ ! -z "$CERT_FILE" ]]; then
TLS_PARAM_COUNT=$((TLS_PARAM_COUNT+1))
if [[ ! -e "$CERT_FILE" ]]; then
echo "$CERT_FILE is not a valid file path."
printusage
exit 1
fi
fi
if [[ ! -z "$KEY_FILE" ]]; then
TLS_PARAM_COUNT=$((TLS_PARAM_COUNT+1))
if [[ ! -e "$KEY_FILE" ]]; then
echo "$KEY_FILE is not a valid file path."
printusage
exit 1
fi
fi
if [ $TLS_PARAM_COUNT -eq 1 ]; then
echo "Both cert file and key file must be provided for TLS."
printusage
exit 1
fi
# TLS is on top of TCP
if [ $TLS_PARAM_COUNT -eq 2 ]; then
if [ ! $USE_TCP ] ; then
echo "TLS can only be used with tcp, please also enable TCP by specifying -t"
printusage
exit 1
fi
fi
# config must set logstash up for SOMETHING
if [ ! $USE_TCP ] && [ ! $USE_UDP ] ; then
echo "Please specify at least one of -t for TCP and -u for UDP"
printusage
exit 1
fi
# wget is required and used for package download which is more reliable
# than downloading packages from the elastic.co repositories.
# One of apt-get or yum package managers is required.
# USE_APT is true when the APT package manager is installed.
USE_APT=false
install_wget=false
install_curl=false
install_iptables_save=false
if [[ ! -z "which wget" ]]; then
install_wget=true
fi
if [[ ! -z "which curl" ]]; then
install_curl=true
fi
if [[ "$PORT" -lt "1024" ]]; then
install_iptables_save=true
fi
YUM_CMD=$(which yum)
APT_GET_CMD=$(which apt-get)
if [[ ! -z $YUM_CMD ]]; then
PKG_NAME="yum"
if $install_wget ; then
echo "wget is required for Java installation."
yum install wget
fi
if $install_iptables_save; then
echo "iptables-services is required for Logstash with protected ports under 1024."
yum install iptables-services
fi
dist="$(cat /etc/*-release)"
firewallcmdStatus="$(firewall-cmd --state 2>/dev/null)"
if [[ "$dist" == *"CentOS"* ]] && [[ "$firewallcmdStatus" == *"running"* ]]; then
if [ "$USE_TCP" = true ]; then
firewall-cmd --zone=public --add-port="$PORT"/tcp --permanent
firewall-cmd --zone=public --add-port=10514/tcp --permanent
elif [ "$USE_UDP" = true ]; then
firewall-cmd --zone=public --add-port="$PORT"/udp --permanent
firewall-cmd --zone=public --add-port=10514/udp --permanent
fi
firewall-cmd --reload
fi
elif [[ ! -z $APT_GET_CMD ]]; then
PKG_NAME="apt-get"
USE_APT=true
if $install_wget ; then
echo "wget is required for Java installation."
apt-get install wget
fi
if $install_curl ; then
echo "curl is required for Elasticsearch package download."
apt-get install curl
fi
if $install_iptables_save; then
echo "iptables-persistent is required for Logstash with protected ports under 1024."
apt-get install iptables-persistent
fi
else
echo "No supported package managers detected (apt-get, yum)"
echo "exiting installer..."
exit 0
fi
get_install_package() {
# The URL parameter is required for this function
if [ -z "$1" ]
then
return 1
fi
PACKAGE_URL=$1
PACKAGE_FILE=${1##*/}
echo $PACKAGE_URL
echo $PACKAGE_FILE
if $USE_APT ; then
if [ ! -f "$PACKAGE_FILE" ] ; then
curl -L -O $PACKAGE_URL
fi
dpkg -i $PACKAGE_FILE
else
if [ ! -f "$PACKAGE_FILE" ] ; then
echo Downloading $PACKAGE_URL
curl -L -O $PACKAGE_URL
#wget $PACKAGE_URL
fi
echo Installing $PACKAGE_FILE
yum localinstall --nogpgcheck $PACKAGE_FILE
fi
return 0
}
boot_at_startup () {
if [ -z "$1" ]
then
echo "A URL parameter must be passed to boot_at_startup"
return 1
fi
SYSTEMCTL=$(which systemctl)
if [[ ! -z $SYSTEMCTL ]]; then
systemctl daemon-reload
echo "Starting $1 with systemctl."
systemctl enable $1.service
systemctl restart $1.service
else
update-rc.d $1 defaults 95 10
echo "Starting $1 with update-rc.d."
/etc/init.d/$1 restart
fi
}
echo "Checking for required Java version."
if type -p java; then
_java=java
elif [[ -n "$JAVA_HOME" ]] && [[ -x "$JAVA_HOME/bin/java" ]]; then
_java="$JAVA_HOME/bin/java"
else
install_java=y
fi
if [[ "$_java" ]]; then
# Get java version in format 1.8.0.73
version=$("$_java" -version 2>&1 | awk -F '"' '/version/ {print $2}'| sed -r 's/[_]+/./g')
#minimum java version is 1.8.0.73
min=1.8.0.73
val=${version}
if (( ${val%%.*} < ${min%%.*} || ( ${val%%.*} == ${min%%.*} && ${val##*.} < ${min##*.} ) )) ; then
echo "Elasticsearch recommends that you use the Oracle JDK version 1.8.0_73."
echo "Refer to the current documentation: https://www.elastic.co/guide/en/elasticsearch/reference/current/_installation.html"
read -p "Would you like to install Oracle Java 8? y/n: " PACKMAN_CONTINUE_INPUT
while [[ "$PACKMAN_CONTINUE_INPUT" != "y" && "$PACKMAN_CONTINUE_INPUT" != "n" ]]
do
echo invalid input: $PACKMAN_CONTINUE_INPUT
read -p "Continue with installation? y/n: " PACKMAN_CONTINUE_INPUT
done
if [[ "$PACKMAN_CONTINUE_INPUT" == "y" ]]; then
install_java=y
fi
fi
fi
if [[ "$install_java" == "y" ]]; then
echo "Installing Java:"
if $USE_APT ; then
add-apt-repository ppa:webupd8team/java
apt-get update
apt-get install oracle-java8-installer
apt-get install oracle-java8-set-default
else
wget --no-cookies --no-check-certificate --header "Cookie: gpw_e24=http%3A%2F%2Fwww.oracle.com%2F; oraclelicense=accept-securebackup-cookie" "http://download.oracle.com/otn-pub/java/jdk/8u92-b14/jre-8u92-linux-x64.rpm"
yum localinstall jre-8u92-linux-x64.rpm
fi
echo "Java installation complete."
fi
wait_for_elasticsearch() {
# elasticsearch can take some time to start
ES_ACTIVE=false
echo "Waiting for Elasticsearch to start."
for i in {1..15}; do
sleep 2
response=$(curl -s -XGET "http://${IP_ADDRESS}:9200/_cluster/health?pretty=true")
if [[ ! "$response" =~ "green" ]] && [[ ! "$response" =~ "yellow" ]]; then
echo "Waiting for Elasticsearch to start."
else
ES_ACTIVE=true
break
fi
done
if [ "$ES_ACTIVE" == false ]; then
echo "Elasticsearch is not responding. Please resolve the issue and rerun the script."
echo "More details at: https://www.elastic.co/guide/en/elasticsearch/reference/current/cluster-stats.html"
echo "Refer to the install-log-server README file for more details"
exit 1
fi
echo "Elasticsearch is installed and running."
}
# Install Elasticsearch if necessary
response=$(curl -s -XGET "http://${IP_ADDRESS}:9200/_cluster/health?pretty=true")
if [[ ! "$response" =~ "green" ]] && [[ ! "$response" =~ "yellow" ]]; then
echo "Installing Elasticsearch."
if $USE_APT ; then
get_install_package $ELASTICSEARCH_DEB_URL
else
get_install_package $ELASTICSEARCH_RPM_URL
fi
# Remove a previously configured network host ip address
sed -i "/^network.host:.*/d" /etc/elasticsearch/elasticsearch.yml
# Add the IP address in the elasticsearch config file
sed -i "/# network.host:.*/a network.host: ${IP_ADDRESS}" /etc/elasticsearch/elasticsearch.yml
boot_at_startup elasticsearch
wait_for_elasticsearch
else
echo "Elasticsearch is already installed and running."
fi
config_logstash() {
if [ -f "./wrs-logstash.conf" ] ; then
cp -f ./wrs-logstash.conf /etc/logstash/conf.d/wrs-logstash.conf
else
echo "Error: wrs-logstash.conf is missing from install package!"
exit 1
fi
# Fill in the config file based on what transport the user specified
if [ $USE_TCP ] ; then
TCP_PARAMS="tcp {\n host => \"127.0.0.1\"\n port => 514\n #OPTIONAL_TLS_PARAMS\n }"
sed -i "s/#TCP_PARAMS/${TCP_PARAMS}/g" /etc/logstash/conf.d/wrs-logstash.conf
fi
if [ $USE_UDP ] ; then
UDP_PARAMS="udp {\n host => \"127.0.0.1\"\n port => 514\n }"
sed -i "s/#UDP_PARAMS/${UDP_PARAMS}/g" /etc/logstash/conf.d/wrs-logstash.conf
fi
# Update conf file with IP_ADDRESS
sed -i "s/ host => .*/ host => \"${IP_ADDRESS}\"/" /etc/logstash/conf.d/wrs-logstash.conf
sed -i "s/.*elasticsearch { hosts.*/ elasticsearch { hosts => [\"${IP_ADDRESS}:9200\"] }/" /etc/logstash/conf.d/wrs-logstash.conf
# install certificate, key, and set TLS config
if [ $TLS_PARAM_COUNT -eq 2 ]; then
mkdir -p /etc/pki/tls/certs
mkdir -p /etc/pki/tls/private
cp $CERT_FILE /etc/pki/tls/certs/remote-logging-ca-cert.pem
cp $KEY_FILE /etc/pki/tls/private/remote-logging-server-key.pem
SSL_PARAMS="ssl_enable => true\n ssl_verify => false\n ssl_cert => \"\/etc\/pki\/tls\/certs\/remote-logging-ca-cert.pem\"\n ssl_key => \"\/etc\/pki\/tls\/private\/remote-logging-server-key.pem\""
sed -i "s/#OPTIONAL_TLS_PARAMS/${SSL_PARAMS}/g" /etc/logstash/conf.d/wrs-logstash.conf
fi
# If the user entered a privileged port then redirect to a non-privileged port logstash can use.
if [[ "$PORT" -lt "1024" ]]; then
# Make iptables rules persistent after restart
if [ -f "/bin/systemctl" ] ; then
systemctl enable iptables
else
update-rc.d iptables defaults 95 10
fi
# Delete any pre-existing rules forwarding to port 10514
old_rules_list=$(iptables -t nat --line-numbers -L | grep '^[0-9].*10514' | awk '{ print $1 }' | tac)
old_rules_count=0
for i in $old_rules_list; do
iptables -t nat -D PREROUTING $i
old_rules_count=$((old_rules_count+1))
done
echo "Deleted $old_rules_count pre-existing NAT PREROUTING rules redirecting to the Logstash listening port 10514."
# Update conf file with the non-privileged port
echo "Privileged port $PORT redirected to the Logstash listening port 10514."
sed -i "s/ port =>.*/ port => 10514/" /etc/logstash/conf.d/wrs-logstash.conf
# Use iptables for IPv4 or ip6tables for IPv6
if [[ $IP_ADDRESS =~ ^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}$ ]]; then
if [ "$USE_TCP" = true ]; then
iptables -A INPUT -p tcp --dport "$PORT" -j ACCEPT
elif [ "$USE_UDP" = true ]; then
iptables -A INPUT -p udp --dport "$PORT" -j ACCEPT
fi
iptables -t nat -A PREROUTING -p UDP -m udp --dport $PORT -j REDIRECT --to-ports 10514
iptables -t nat -A PREROUTING -p tcp -m tcp --dport $PORT -j REDIRECT --to-ports 10514
else
if [ "$USE_TCP" = true ]; then
ip6tables -A INPUT -p tcp --dport "$PORT" -j ACCEPT
elif [ "$USE_UDP" = true ]; then
ip6tables -A INPUT -p udp --dport "$PORT" -j ACCEPT
fi
ip6tables -t nat -A PREROUTING -p UDP -m udp --dport $PORT -j REDIRECT --to-ports 10514
ip6tables -t nat -A PREROUTING -p tcp -m tcp --dport $PORT -j REDIRECT --to-ports 10514
fi
# Save iptables rules permanently (after restart)
if [ -f "/usr/sbin/netfilter-persistent" ] ; then
netfilter-persistent save
netfilter-persistent reload
elif [ -f "/etc/init.d/iptables-persistent" ] ; then
/etc/init.d/iptables-persistent save
/etc/init.d/iptables-persistent reload
elif [ -f "/etc/centos-release" ] ; then
# CentOS 7 https://wiki.centos.org/HowTos/Network/IPTables
/sbin/service iptables save
else
iptables-save
fi
fi
}
pidfile="/var/run/logstash.pid"
logstash_running() {
if [ -f "$pidfile" ] ; then
echo "Logstash is installed and running."
else
echo "Logstash is not responding. Please resolve the issue and rerun the script."
echo "More details at: https://www.elastic.co/guide/en/logstash/current/installing-logstash.html"
echo "Refer to the install-log-server README file for more details"
exit 1
fi
}
# Install Logstash if necessary
if [ ! -f "$pidfile" ] ; then
echo "Logstash is being downloaded."
if $USE_APT ; then
get_install_package $LOGSTASH_DEB_URL
else
get_install_package $LOGSTASH_RPM_URL
fi
config_logstash
boot_at_startup logstash
logstash_running
else
config_logstash
boot_at_startup logstash
logstash_running
fi
config_kibana() {
if [ -f "./kibana.svg" ] ; then
mv -f kibana.svg /opt/kibana/optimize/bundles/src/ui/public/images/kibana.svg
fi
sed -i "s/^.*server\.host: .*/server\.host: \"${IP_ADDRESS}\"/" /opt/kibana/config/kibana.yml
sed -i "s/^.*elasticsearch\.url:.*/elasticsearch\.url: \"http:\/\/${IP_ADDRESS}:9200\"/" /opt/kibana/config/kibana.yml
}
kibana_active=false
kibana_running() {
kibana_active=$(curl -s -XGET "http://${IP_ADDRESS}:5601")
if [[ ! "$kibana_active" =~ "false" ]] ; then
echo "Kibana is installed and running. Updating Kibana settings."
curl -XPUT http://$IP_ADDRESS:9200/.kibana/index-pattern/logstash-* -d '{"title" : "logstash-*", "timeFieldName": "@timestamp"}'
curl -XPUT http://$IP_ADDRESS:9200/.kibana/config/4.5.1 -d '{"defaultIndex" : "logstash-*"}'
echo
echo "To begin using the log server, you must enable remote logging on the Titanium Cloud system."
echo "Kibana provides a web-based interface for using an installed and configured remote log server."
echo "Open Kibana in your browser http://YOURDOMAIN.com:5601 or http://${IP_ADDRESS}:5601"
echo "Refer to the Titanium Cloud System Administration guide and README file for additional to start exploring with Kibana."
echo
else
echo "Kibana is not responding. Please resolve the issue and rerun the script."
echo "More details at: https://www.elastic.co/guide/en/kibana/current/setup.html"
echo "Refer to the install-log-server README file for more details"
exit 1
fi
}
# Install Kibana if necessary
cfgfile="/opt/kibana/config/kibana.yml"
if [ ! -f "$cfgfile" ] ; then
echo "Kibana is being downloaded."
if $USE_APT ; then
get_install_package $KIBANA_DEB_URL
else
get_install_package $KIBANA_RPM_URL
fi
fi
config_kibana
boot_at_startup kibana
kibana_running
exit 0


@@ -0,0 +1,91 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
xmlns:xlink="http://www.w3.org/1999/xlink"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
id="svg3354"
version="1.1"
inkscape:version="0.91 r13725"
width="234.375"
height="40.3125"
viewBox="0 0 234.375 40.3125"
sodipodi:docname="kibana.svg">
<metadata
id="metadata3360">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title></dc:title>
</cc:Work>
</rdf:RDF>
</metadata>
<defs
id="defs3358" />
<sodipodi:namedview
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1"
objecttolerance="10"
gridtolerance="10"
guidetolerance="10"
inkscape:pageopacity="0"
inkscape:pageshadow="2"
inkscape:window-width="458"
inkscape:window-height="405"
id="namedview3356"
showgrid="false"
inkscape:zoom="7.5306667"
inkscape:cx="117.1875"
inkscape:cy="20.15625"
inkscape:window-x="1021"
inkscape:window-y="107"
inkscape:window-maximized="0"
inkscape:current-layer="svg3354" />
<image
width="234.375"
height="40.3125"
preserveAspectRatio="none"
style="image-rendering:optimizeQuality"
xlink:href="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAPoAAAArCAIAAADe7ALgAAAAAXNSR0IArs4c6QAAAARnQU1BAACx
jwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAAW3SURBVHhe7ZxhiBVVFMfPK6PCIh+7YEEl3AKj
gu1DHwIliGUDsaggNosks0ADoQLDYoUIiooKCRfaQKFIyowIKypapCwFP/hBPyQGeUBSaOltb3ur
ZBTR/51753Zn3owrOrNuc8+PYc55M3fvvJn3v+eee+dqwxhDihIHFzirKBGgclciQuWuRITKXYkI
lbsSESp3JSJU7kpEqNyViFC5KxGhclciQuWuRERjv3PqzHDxuqDFzKNEJ4gWEN1x2uVDG5hvE+cy
oqGk5GfMf1svYB7RB0TvJ2W+YZ6yXhoUu3umBUufMFsHF11PdFDKr2MetEcLQM2ew0THiA4Q7c67
1gjzAKolwv43orXG2COneWL/a6KI7jcmoullLVGb6C+iX4meLC4GlkoxbNvdgS5/iBAXJqfslmkA
OIsNzSksgw1/i9biChXwp2wofEr2lj6ifqnzkuRsZkPNfltEtIToKbnWZubr0ld8SWS9vtnE/mci
nL1GjteVRlNutWbc3G5vEwVbfiB6oSBcQQTQBPiHaH5xuEWDeVXUc2lBmQeYV4jUwEVEHwXR3fMQ
873O/Q8Uvq/gomC7qBMxCdfdQHSop+QYMxqSD1oTRF8R/UJ0tdzO7ei+gucAcLlJoseDenYwDzWb
4+32czhOtIXo5fpG99ouAH5NApXVweVEy/JuM9QoQLFnk4Qhwybmq6S2TlorIb7lFMkd+DIhJ4ke
LahzRrnfybw6uU1cF1H8aE+ZhcwQcXjdsNHi1p42BqKHxJHJIN7bI/ZszfBxoW4gul/sXJpGbpqX
NtwVaB2g2CrnZrkleVLfy/6sCbNqD8Lw8zNlNefChDEQNxqqB9K3DQlYZdtwbnObumod1FbuCNJI
bT0Y52XoY0banQGy7uUeZiQJAOHz3fKkgBDruUlSHfchwLfYcwedEr6/BzXnXrHe1Fbu4ICzXXqV
jbzFhnaM9rwOkOY+0iMCZAj2MeVOsJw1SJSRPnkeLOiCSuQdZ7vg3oecGxF1ljsGlz6C4tcdTIvp
etljhAoRhLOxy511oBPwKn/b2XJAjvFi0NKQYLxOtKBKxe80JmxgGOPGRp3lPmVMOCe4xtku6Mdt
aL+WaJcxY0HDgP4WB5rbGOT3+8pOavca83mg+LZM2FdKZvCwqOL+ZK5RZ7mDb53Nskz2CO17xZk0
BiEcHwHE/bA4lhucpePOlsxWY35ybpffg1dLVTDprMN2cfFQc7m/mY7Hj4mS/CD1CqJRccB7ybgQ
og8HrNOyRwBG4lERzxiD8YMFvwf8LZUp3vdUlgudjYWayx2E2SoGZ51OZ13yq58KZqm/NsZ29PaJ
LGdGyR3MUL+N+ugBxFbCyqByfIE+WbOAL+AOlceVzjqOOBsL9Zf7K852mS/7W2UPkDeH+MwHo8YV
4qCp4AFh2yMfK2XYmHBgvZTo/lbLR/2yQAsPOVJlG56D1F/uu43xyySg4y9aLft+EdpC3iyu442g
JBKbna2Wz2S2ilM16HZ8X4Qv+UTS2MpiMP1ON5PYxED95Q6OOtvF33DuJPrhJHUBPpVHiJ2alSiI
zGpTMFFTuhzRnEK+czYiopD7ZmdTbHQ2xVtJwuOB6D917myAIcQu55bMSDL3akGjGo0skwFRyP1g
kKVY8LF3KRU4lJ6qt+Qu9qoOqDDsjkoBY+4B53ZBytS7qiIGopA72OdsF+Qn4Zr1DNucdRxzdlYJ
pybPHARvv8D4RKczwLyGeZw5M5GPuL4qvkGqJRa5jwU5MdKVzCA15OP0m/aiF1VVs1ImavxA4gxZ
QvQlM7bxVmsEY9PkH69YUOGHsrx+Ikqtg1jkfjyIlyedLWR/ojPoA+oX9zww2N8/r9Gwfu7K4Vym
ZfMSRyP/UZYPDcsy4PN4O3MB/f/dlYiIJborClC5KxGhclciQuWuRITKXYkIlbsSESp3JSJU7kpE
qNyViFC5K9FA9C9eeaA7E62h+QAAAABJRU5ErkJggg==
"
id="image3362"
x="0"
y="0" />
</svg>


File diff suppressed because one or more lines are too long


@@ -0,0 +1,563 @@
[
{
"_id": "Overview",
"_type": "dashboard",
"_source": {
"title": "Overview",
"hits": 0,
"description": "",
"panelsJSON": "[{\"col\":4,\"id\":\"Unique-Systems-ampersand-Hosts-Counts\",\"panelIndex\":1,\"row\":1,\"size_x\":2,\"size_y\":2,\"type\":\"visualization\"},{\"col\":1,\"id\":\"Alarm-Severity-Summary\",\"panelIndex\":3,\"row\":1,\"size_x\":3,\"size_y\":2,\"type\":\"visualization\"},{\"col\":1,\"id\":\"Log-Activity-PER-HOST\",\"panelIndex\":5,\"row\":3,\"size_x\":12,\"size_y\":3,\"type\":\"visualization\"},{\"col\":6,\"id\":\"Raw-Log-Severity-Pie-Chart\",\"panelIndex\":6,\"row\":1,\"size_x\":3,\"size_y\":2,\"type\":\"visualization\"},{\"col\":9,\"id\":\"Customer-Log-Severity-Summary\",\"panelIndex\":9,\"row\":1,\"size_x\":4,\"size_y\":2,\"type\":\"visualization\"},{\"id\":\"All-Logs\",\"type\":\"search\",\"panelIndex\":10,\"size_x\":12,\"size_y\":6,\"col\":1,\"row\":6,\"columns\":[\"_source\"],\"sort\":[\"@timestamp\",\"desc\"]}]",
"optionsJSON": "{\"darkTheme\":false}",
"uiStateJSON": "{\"P-1\":{\"spy\":{\"mode\":{\"fill\":false,\"name\":null}}},\"P-3\":{\"spy\":{\"mode\":{\"fill\":false,\"name\":null}},\"vis\":{\"legendOpen\":false}},\"P-4\":{\"vis\":{\"legendOpen\":false}},\"P-5\":{\"spy\":{\"mode\":{\"fill\":false,\"name\":null}},\"vis\":{\"legendOpen\":false}},\"P-6\":{\"vis\":{\"legendOpen\":true}},\"P-9\":{\"spy\":{\"mode\":{\"fill\":false,\"name\":null}},\"vis\":{\"legendOpen\":false}}}",
"version": 1,
"timeRestore": false,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"filter\":[{\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}}}]}"
}
}
},
{
"_id": "Host-ampersand-VM-Event-Activity",
"_type": "dashboard",
"_source": {
"title": "Host & VM Customer Log Event Activity",
"hits": 0,
"description": "",
"panelsJSON": "[\n {\n \"col\": 1,\n \"columns\": [\n \"_source\"\n ],\n \"id\": \"Logs-Host-events\",\n \"panelIndex\": 5,\n \"row\": 4,\n \"size_x\": 6,\n \"size_y\": 5,\n \"sort\": [\n \"@timestamp\",\n \"desc\"\n ],\n \"type\": \"search\"\n },\n {\n \"col\": 7,\n \"columns\": [\n \"_source\"\n ],\n \"id\": \"Logs-VM-events\",\n \"panelIndex\": 6,\n \"row\": 4,\n \"size_x\": 6,\n \"size_y\": 5,\n \"sort\": [\n \"@timestamp\",\n \"desc\"\n ],\n \"type\": \"search\"\n },\n {\n \"col\": 1,\n \"id\": \"Customer-Host-slash-VM-Log-Severity-Summary\",\n \"panelIndex\": 7,\n \"row\": 1,\n \"size_x\": 3,\n \"size_y\": 3,\n \"type\": \"visualization\"\n },\n {\n \"id\": \"Host-and-VM-Events-Date-Histogram\",\n \"type\": \"visualization\",\n \"panelIndex\": 8,\n \"size_x\": 9,\n \"size_y\": 3,\n \"col\": 4,\n \"row\": 1\n }\n]",
"optionsJSON": "{\n \"darkTheme\": false\n}",
"uiStateJSON": "{\n \"P-7\": {\n \"vis\": {\n \"legendOpen\": false\n }\n },\n \"P-8\": {\n \"vis\": {\n \"legendOpen\": false\n }\n }\n}",
"version": 1,
"timeRestore": false,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\n \"filter\": [\n {\n \"query\": {\n \"query_string\": {\n \"analyze_wildcard\": true,\n \"query\": \"*\"\n }\n }\n }\n ]\n}"
}
}
},
{
"_id": "Resource-Add-slash-Delete-Activity",
"_type": "dashboard",
"_source": {
"title": "Resource Add / Delete Activity",
"hits": 0,
"description": "",
"panelsJSON": "[{\"col\":1,\"id\":\"Unique-Systems-ampersand-Hosts-Counts\",\"panelIndex\":1,\"row\":1,\"size_x\":2,\"size_y\":3,\"type\":\"visualization\"},{\"col\":3,\"id\":\"Resource-Add-slash-Delete-Activity\",\"panelIndex\":2,\"row\":1,\"size_x\":10,\"size_y\":3,\"type\":\"visualization\"},{\"id\":\"Resource-Add-slash-Delete-Table\",\"type\":\"search\",\"panelIndex\":3,\"size_x\":12,\"size_y\":5,\"col\":1,\"row\":4,\"columns\":[\"_source\"],\"sort\":[\"@timestamp\",\"desc\"]}]",
"optionsJSON": "{\"darkTheme\":false}",
"uiStateJSON": "{}",
"version": 1,
"timeRestore": false,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"filter\":[{\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}}}]}"
}
}
},
{
"_id": "Login-Authentication-Audit-Log",
"_type": "dashboard",
"_source": {
"title": "Login Authentication Audit Log",
"hits": 0,
"description": "",
"panelsJSON": "[{\"id\":\"Horizon-Authentication-Audit-Log\",\"type\":\"search\",\"panelIndex\":1,\"size_x\":6,\"size_y\":4,\"col\":7,\"row\":1,\"columns\":[\"_source\"],\"sort\":[\"@timestamp\",\"desc\"]},{\"id\":\"SSH-slash-SFTP-slash-SUDO-slash-Postgres-Authentication-Audit-Log\",\"type\":\"search\",\"panelIndex\":2,\"size_x\":6,\"size_y\":4,\"col\":7,\"row\":5,\"columns\":[\"_source\"],\"sort\":[\"@timestamp\",\"desc\"]},{\"id\":\"Horizon-Authentication-Audit-Log-Date-Histogram\",\"type\":\"visualization\",\"panelIndex\":3,\"size_x\":6,\"size_y\":4,\"col\":1,\"row\":1},{\"id\":\"SSH-slash-SFTP-slash-sudo-slash-postgres-Authentication-Audit-Log-Date-Histogram\",\"type\":\"visualization\",\"panelIndex\":4,\"size_x\":6,\"size_y\":4,\"col\":1,\"row\":5}]",
"optionsJSON": "{\"darkTheme\":false}",
"uiStateJSON": "{\"P-3\":{\"vis\":{\"legendOpen\":false}},\"P-4\":{\"vis\":{\"legendOpen\":false}}}",
"version": 1,
"timeRestore": false,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"filter\":[{\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}}}]}"
}
}
},
{
"_id": "Command-Audit-Log",
"_type": "dashboard",
"_source": {
"title": "Command Audit Log",
"hits": 0,
"description": "",
"panelsJSON": "[{\"id\":\"Bash-Audit-Log\",\"type\":\"search\",\"panelIndex\":1,\"size_x\":6,\"size_y\":4,\"col\":7,\"row\":1,\"columns\":[\"_source\"],\"sort\":[\"@timestamp\",\"desc\"]},{\"id\":\"REST-API-Audit-Log\",\"type\":\"search\",\"panelIndex\":2,\"size_x\":6,\"size_y\":4,\"col\":7,\"row\":5,\"columns\":[\"_source\"],\"sort\":[\"@timestamp\",\"desc\"]},{\"id\":\"BASH-Audit-Log-Date-Histogram\",\"type\":\"visualization\",\"panelIndex\":3,\"size_x\":6,\"size_y\":4,\"col\":1,\"row\":1},{\"id\":\"REST-API-Audit-Log-Histogram\",\"type\":\"visualization\",\"panelIndex\":4,\"size_x\":6,\"size_y\":4,\"col\":1,\"row\":5}]",
"optionsJSON": "{\"darkTheme\":false}",
"uiStateJSON": "{\"P-3\":{\"vis\":{\"legendOpen\":false}},\"P-4\":{\"vis\":{\"legendOpen\":false}}}",
"version": 1,
"timeRestore": false,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"filter\":[{\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}}}]}"
}
}
},
{
"_id": "System-Troubleshooting",
"_type": "dashboard",
"_source": {
"title": "System Troubleshooting",
"hits": 0,
"description": "",
"panelsJSON": "[{\"id\":\"DEBUG-Maintenance-ampersand-Inventory-Logs-(First-Level)\",\"type\":\"search\",\"panelIndex\":1,\"size_x\":6,\"size_y\":3,\"col\":7,\"row\":3,\"columns\":[\"_source\"],\"sort\":[\"@timestamp\",\"desc\"]},{\"id\":\"DEBUG-Service-Manager-Logs\",\"type\":\"search\",\"panelIndex\":2,\"size_x\":6,\"size_y\":3,\"col\":1,\"row\":3,\"columns\":[\"_source\"],\"sort\":[\"@timestamp\",\"desc\"]},{\"id\":\"DEBUG-VM-Logs-(First-Level-NOVA)\",\"type\":\"search\",\"panelIndex\":3,\"size_x\":12,\"size_y\":4,\"col\":1,\"row\":6,\"columns\":[\"_source\"],\"sort\":[\"@timestamp\",\"desc\"]},{\"id\":\"System-Troubleshooting-Logs-Date-Histogram\",\"type\":\"visualization\",\"panelIndex\":4,\"size_x\":12,\"size_y\":2,\"col\":1,\"row\":1}]",
"optionsJSON": "{\"darkTheme\":false}",
"uiStateJSON": "{\"P-4\":{\"vis\":{\"legendOpen\":false}}}",
"version": 1,
"timeRestore": false,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"filter\":[{\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}}}]}"
}
}
},
{
"_id": "Resource-Add-slash-Delete-Table",
"_type": "search",
"_source": {
"title": "Resource Add / Delete Table",
"description": "",
"hits": 0,
"columns": [
"_source"
],
"sort": [
"@timestamp",
"desc"
],
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"( filename:\\\"fm-event.log\\\" AND message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" AND message:200* AND ( message:\\\"added\\\" OR message:\\\"delete\\\" ) ) OR ( filename:\\\"cinder-volume.log\\\" AND message:\\\"volume successfully\\\" ) OR ( filename:\\\"fm-event.log\\\" AND message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" AND ( message:700.108 OR message:700.114 ))\",\"analyze_wildcard\":true}},\"filter\":[],\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}},\"require_field_match\":false,\"fragment_size\":2147483647}}"
}
}
},
{
"_id": "Alarms",
"_type": "search",
"_source": {
"title": "Alarm Activity",
"description": "",
"hits": 0,
"columns": [
"_source"
],
"sort": [
"@timestamp",
"desc"
],
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"filename:\\\"fm-event.log\\\" AND ( message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"set\\\\\\\"\\\" OR message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"clear\\\\\\\"\\\" )\",\"analyze_wildcard\":true}},\"filter\":[],\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}},\"require_field_match\":false,\"fragment_size\":2147483647}}"
}
}
},
{
"_id": "Logs",
"_type": "search",
"_source": {
"title": "Customer Logs",
"description": "",
"hits": 0,
"columns": [
"_source"
],
"sort": [
"@timestamp",
"desc"
],
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"filename:\\\"fm-event.log\\\" AND message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\"\",\"analyze_wildcard\":true}},\"filter\":[],\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}},\"require_field_match\":false,\"fragment_size\":2147483647}}"
}
}
},
{
"_id": "All-Logs",
"_type": "search",
"_source": {
"title": "All Logs",
"description": "",
"hits": 0,
"columns": [
"_source"
],
"sort": [
"@timestamp",
"desc"
],
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[],\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}},\"require_field_match\":false,\"fragment_size\":2147483647}}"
}
}
},
{
"_id": "DEBUG-Maintenance-ampersand-Inventory-Logs-(First-Level)",
"_type": "search",
"_source": {
"title": "DEBUG - Maintenance & Inventory Logs (First Level)",
"description": "",
"hits": 0,
"columns": [
"_source"
],
"sort": [
"@timestamp",
"desc"
],
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"filename:\\\"mtcAgent.log\\\" OR filename:\\\"mtcClient.log\\\" OR filename:\\\"sysinv.log\\\"\",\"analyze_wildcard\":true}},\"filter\":[],\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}},\"require_field_match\":false,\"fragment_size\":2147483647}}"
}
}
},
{
"_id": "Logs-Host-events",
"_type": "search",
"_source": {
"title": "Customer Logs - Host events",
"description": "",
"hits": 0,
"columns": [
"_source"
],
"sort": [
"@timestamp",
"desc"
],
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"filename:\\\"fm-event.log\\\" AND message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" AND message:200*\",\"analyze_wildcard\":true}},\"filter\":[],\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}},\"require_field_match\":false,\"fragment_size\":2147483647}}"
}
}
},
{
"_id": "DEBUG-VM-Logs-(First-Level-NOVA)",
"_type": "search",
"_source": {
"title": "DEBUG - VM Logs (First Level - NOVA)",
"description": "",
"hits": 0,
"columns": [
"_source"
],
"sort": [
"@timestamp",
"desc"
],
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"filename:\\\"nova-compute\\\" OR filename:\\\"nova-scheduler.log\\\" OR filename:\\\"nova-conductor.log\\\"\",\"analyze_wildcard\":true}},\"filter\":[],\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}},\"require_field_match\":false,\"fragment_size\":2147483647}}"
}
}
},
{
"_id": "DEBUG-Service-Manager-Logs",
"_type": "search",
"_source": {
"title": "DEBUG - Service Manager Logs (First Level)",
"description": "",
"hits": 0,
"columns": [
"_source"
],
"sort": [
"@timestamp",
"desc"
],
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"filename:\\\"sm.log\\\" OR filename:\\\"sm-customer\\\" OR filename:\\\"daemon-ocf.log\\\"\",\"analyze_wildcard\":true}},\"filter\":[],\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}},\"require_field_match\":false,\"fragment_size\":2147483647}}"
}
}
},
{
"_id": "Logs-VM-events",
"_type": "search",
"_source": {
"title": "Customer Logs - VM events",
"description": "",
"hits": 0,
"columns": [
"_source"
],
"sort": [
"@timestamp",
"desc"
],
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"filename:\\\"fm-event.log\\\" AND message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" AND message:700* AND message:instance\",\"analyze_wildcard\":true}},\"filter\":[],\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}},\"require_field_match\":false,\"fragment_size\":2147483647}}"
}
}
},
{
"_id": "Horizon-Authentication-Audit-Log",
"_type": "search",
"_source": {
"title": "Horizon Authentication Audit Log",
"description": "",
"hits": 0,
"columns": [
"_source"
],
"sort": [
"@timestamp",
"desc"
],
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"filter\":[],\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}},\"require_field_match\":false,\"fragment_size\":2147483647},\"query\":{\"query_string\":{\"query\":\"filename:horizon.log\",\"analyze_wildcard\":true}}}"
}
}
},
{
"_id": "REST-API-Audit-Log",
"_type": "search",
"_source": {
"title": "REST API Audit Log",
"description": "",
"hits": 0,
"columns": [
"_source"
],
"sort": [
"@timestamp",
"desc"
],
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"filter\":[],\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}},\"require_field_match\":false,\"fragment_size\":2147483647},\"query\":{\"query_string\":{\"query\":\"filename:api.log\",\"analyze_wildcard\":true}}}"
}
}
},
{
"_id": "SSH-slash-SFTP-slash-SUDO-slash-Postgres-Authentication-Audit-Log",
"_type": "search",
"_source": {
"title": "SSH / SFTP / sudo / postgres Authentication Audit Log",
"description": "",
"hits": 0,
"columns": [
"_source"
],
"sort": [
"@timestamp",
"desc"
],
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\n \"index\": \"logstash-*\",\n \"filter\": [],\n \"highlight\": {\n \"pre_tags\": [\n \"@kibana-highlighted-field@\"\n ],\n \"post_tags\": [\n \"@/kibana-highlighted-field@\"\n ],\n \"fields\": {\n \"*\": {}\n },\n \"require_field_match\": false,\n \"fragment_size\": 2147483647\n },\n \"query\": {\n \"query_string\": {\n \"query\": \"filename:auth.log\",\n \"analyze_wildcard\": true\n }\n }\n}"
}
}
},
{
"_id": "Bash-Audit-Log",
"_type": "search",
"_source": {
"title": "BASH Audit Log",
"description": "",
"hits": 0,
"columns": [
"_source"
],
"sort": [
"@timestamp",
"desc"
],
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\n \"index\": \"logstash-*\",\n \"filter\": [],\n \"highlight\": {\n \"pre_tags\": [\n \"@kibana-highlighted-field@\"\n ],\n \"post_tags\": [\n \"@/kibana-highlighted-field@\"\n ],\n \"fields\": {\n \"*\": {}\n },\n \"require_field_match\": false,\n \"fragment_size\": 2147483647\n },\n \"query\": {\n \"query_string\": {\n \"query\": \"filename:bash.log\",\n \"analyze_wildcard\": true\n }\n }\n}"
}
}
},
{
"_id": "Customer-Log-Severity-Summary",
"_type": "visualization",
"_source": {
"title": "Customer Log Severity Summary",
"visState": "{\"title\":\"Customer Log Severity Summary\",\"type\":\"histogram\",\"params\":{\"shareYAxis\":true,\"addTooltip\":true,\"addLegend\":true,\"scale\":\"linear\",\"mode\":\"stacked\",\"times\":[],\"addTimeMarker\":false,\"defaultYExtents\":false,\"setYExtents\":false,\"yAxis\":{}},\"aggs\":[{\"id\":\"1\",\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"type\":\"filters\",\"schema\":\"segment\",\"params\":{\"filters\":[{\"input\":{\"query\":{\"query_string\":{\"query\":\"message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" AND message:\\\"\\\\\\\"severity\\\\\\\" : \\\\\\\"critical\\\\\\\"\\\"\",\"analyze_wildcard\":true}}},\"label\":\"CRITICAL\"},{\"input\":{\"query\":{\"query_string\":{\"query\":\"message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" AND message:\\\"\\\\\\\"severity\\\\\\\" : \\\\\\\"major\\\\\\\"\\\"\",\"analyze_wildcard\":true}}},\"label\":\"MAJOR\"},{\"input\":{\"query\":{\"query_string\":{\"query\":\"message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" AND message:\\\"\\\\\\\"severity\\\\\\\" : \\\\\\\"minor\\\\\\\"\\\"\",\"analyze_wildcard\":true}}},\"label\":\"Minor\"},{\"input\":{\"query\":{\"query_string\":{\"query\":\"message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" AND message:\\\"\\\\\\\"severity\\\\\\\" : \\\\\\\"warning\\\\\\\"\\\"\",\"analyze_wildcard\":true}}},\"label\":\"Warning\"},{\"input\":{\"query\":{\"query_string\":{\"query\":\"message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" AND message:\\\"\\\\\\\"severity\\\\\\\" : \\\\\\\"not-applicable\\\\\\\"\\\"\",\"analyze_wildcard\":true}}},\"label\":\"Not-Applicable\"}]}}],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"savedSearchId": "Logs",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"filter\":[]}"
}
}
},
{
"_id": "Alarm-Severity-Summary",
"_type": "visualization",
"_source": {
"title": "Alarm Severity Summary",
"visState": "{\"title\":\"Alarm Severity Summary\",\"type\":\"histogram\",\"params\":{\"shareYAxis\":true,\"addTooltip\":true,\"addLegend\":true,\"scale\":\"linear\",\"mode\":\"stacked\",\"times\":[],\"addTimeMarker\":false,\"defaultYExtents\":false,\"setYExtents\":false,\"yAxis\":{}},\"aggs\":[{\"id\":\"1\",\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"type\":\"filters\",\"schema\":\"segment\",\"params\":{\"filters\":[{\"input\":{\"query\":{\"query_string\":{\"query\":\"( message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"set\\\\\\\"\\\" OR message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"clear\\\\\\\"\\\" ) AND message:\\\"\\\\\\\"severity\\\\\\\" : \\\\\\\"critical\\\\\\\"\\\"\",\"analyze_wildcard\":true}}},\"label\":\"CRITICAL\"},{\"input\":{\"query\":{\"query_string\":{\"query\":\"( message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"set\\\\\\\"\\\" OR message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"clear\\\\\\\"\\\" ) AND message:\\\"\\\\\\\"severity\\\\\\\" : \\\\\\\"major\\\\\\\"\\\"\",\"analyze_wildcard\":true}}},\"label\":\"Major\"},{\"input\":{\"query\":{\"query_string\":{\"query\":\"( message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"set\\\\\\\"\\\" OR message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"clear\\\\\\\"\\\" ) AND message:\\\"\\\\\\\"severity\\\\\\\" : \\\\\\\"minor\\\\\\\"\\\"\",\"analyze_wildcard\":true}}},\"label\":\"minor\"},{\"input\":{\"query\":{\"query_string\":{\"query\":\"( message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"set\\\\\\\"\\\" OR message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"clear\\\\\\\"\\\" ) AND message:\\\"\\\\\\\"severity\\\\\\\" : \\\\\\\"warning\\\\\\\"\\\"\",\"analyze_wildcard\":true}}},\"label\":\"warning\"}]}}],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"savedSearchId": "Alarms",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"filter\":[]}"
}
}
},
{
"_id": "Log-Activity-PER-HOST",
"_type": "visualization",
"_source": {
"title": "Log Activity PER HOST",
"visState": "{\"title\":\"Log Activity PER HOST\",\"type\":\"histogram\",\"params\":{\"shareYAxis\":true,\"addTooltip\":true,\"addLegend\":true,\"scale\":\"linear\",\"mode\":\"stacked\",\"times\":[],\"addTimeMarker\":false,\"defaultYExtents\":false,\"setYExtents\":false,\"yAxis\":{}},\"aggs\":[{\"id\":\"1\",\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"type\":\"terms\",\"schema\":\"segment\",\"params\":{\"field\":\"node.raw\",\"size\":100,\"order\":\"desc\",\"orderBy\":\"_term\",\"customLabel\":\"Log Activity per Host\"}}],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "Raw-Log-Severity-Pie-Chart",
"_type": "visualization",
"_source": {
"title": "Raw Log Severity Pie Chart",
"visState": "{\"title\":\"Raw Log Severity Pie Chart\",\"type\":\"pie\",\"params\":{\"shareYAxis\":true,\"addTooltip\":true,\"addLegend\":true,\"isDonut\":false},\"aggs\":[{\"id\":\"1\",\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"type\":\"terms\",\"schema\":\"segment\",\"params\":{\"field\":\"level.raw\",\"size\":5,\"order\":\"desc\",\"orderBy\":\"1\"}}],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "Unique-Systems-ampersand-Hosts-Counts",
"_type": "visualization",
"_source": {
"title": "Unique Systems & Hosts Counts",
"visState": "{\"title\":\"Unique Systems & Hosts Counts\",\"type\":\"metric\",\"params\":{\"handleNoResults\":true,\"fontSize\":60},\"aggs\":[{\"id\":\"2\",\"type\":\"cardinality\",\"schema\":\"metric\",\"params\":{\"field\":\"system_name.raw\",\"customLabel\":\"Systems\"}},{\"id\":\"3\",\"type\":\"cardinality\",\"schema\":\"metric\",\"params\":{\"field\":\"node.raw\",\"customLabel\":\"Hosts\"}}],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "Resource-Add-slash-Delete-Activity",
"_type": "visualization",
"_source": {
"title": "Resource Add & Delete Activity",
"visState": "{\n \"title\": \"Resource Add / Delete Activity\",\n \"type\": \"histogram\",\n \"params\": {\n \"shareYAxis\": true,\n \"addTooltip\": true,\n \"addLegend\": true,\n \"scale\": \"linear\",\n \"mode\": \"stacked\",\n \"times\": [],\n \"addTimeMarker\": false,\n \"defaultYExtents\": false,\n \"setYExtents\": false,\n \"yAxis\": {}\n },\n \"aggs\": [\n {\n \"id\": \"1\",\n \"type\": \"count\",\n \"schema\": \"metric\",\n \"params\": {}\n },\n {\n \"id\": \"2\",\n \"type\": \"date_histogram\",\n \"schema\": \"split\",\n \"params\": {\n \"field\": \"@timestamp\",\n \"interval\": \"d\",\n \"customInterval\": \"2h\",\n \"min_doc_count\": 1,\n \"extended_bounds\": {},\n \"customLabel\": \"\",\n \"row\": false\n }\n },\n {\n \"id\": \"3\",\n \"type\": \"filters\",\n \"schema\": \"group\",\n \"params\": {\n \"filters\": [\n {\n \"input\": {\n \"query\": {\n \"query_string\": {\n \"query\": \"message:200* AND ( message:\\\"added\\\" OR message:\\\"delete\\\" )\",\n \"analyze_wildcard\": true\n }\n }\n },\n \"label\": \"Hosts\"\n },\n {\n \"input\": {\n \"query\": {\n \"query_string\": {\n \"query\": \"filename:\\\"fm-event.log\\\" AND ( message:700.108 OR message:700.114 )\",\n \"analyze_wildcard\": true\n }\n }\n },\n \"label\": \"VMs\"\n },\n {\n \"input\": {\n \"query\": {\n \"query_string\": {\n \"query\": \"message:\\\"volume successfully\\\"\",\n \"analyze_wildcard\": true\n }\n }\n },\n \"label\": \"Volumes\"\n }\n ]\n }\n }\n ],\n \"listeners\": {}\n}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\n \"index\": \"logstash-*\",\n \"query\": {\n \"query_string\": {\n \"query\": \"( filename:\\\"fm-event.log\\\" AND message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" ) OR filename:\\\"cinder-volume.log\\\" OR filename:\\\"openstack.log\\\"\",\n \"analyze_wildcard\": true\n }\n },\n \"filter\": []\n}"
}
}
},
{
"_id": "Resource-Add-ampersand-Delete-Activity",
"_type": "visualization",
"_source": {
"title": "Resource Add & Delete Activity",
"visState": "{\"title\":\"Resource Add & Delete Activity\",\"type\":\"histogram\",\"params\":{\"shareYAxis\":true,\"addTooltip\":true,\"addLegend\":true,\"scale\":\"linear\",\"mode\":\"stacked\",\"times\":[],\"addTimeMarker\":false,\"defaultYExtents\":false,\"setYExtents\":false,\"yAxis\":{}},\"aggs\":[{\"id\":\"1\",\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"type\":\"date_histogram\",\"schema\":\"split\",\"params\":{\"field\":\"@timestamp\",\"interval\":\"auto\",\"customInterval\":\"2h\",\"min_doc_count\":1,\"extended_bounds\":{},\"customLabel\":\"\",\"row\":false}},{\"id\":\"3\",\"type\":\"filters\",\"schema\":\"group\",\"params\":{\"filters\":[{\"input\":{\"query\":{\"query_string\":{\"query\":\"message:200* AND ( message:\\\"added\\\" OR message:\\\"delete\\\" )\",\"analyze_wildcard\":true}}},\"label\":\"Hosts\"},{\"input\":{\"query\":{\"query_string\":{\"query\":\"filename:\\\"fm-event.log\\\" AND ( message:700.108 OR message:700.114 )\",\"analyze_wildcard\":true}}},\"label\":\"VMs\"},{\"input\":{\"query\":{\"query_string\":{\"query\":\"message:\\\"volume successfully\\\"\",\"analyze_wildcard\":true}}},\"label\":\"Volumes\"}]}}],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"( filename:\\\"fm-event.log\\\" AND message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" ) OR filename:\\\"cinder-volume.log\\\" OR filename:\\\"openstack.log\\\"\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "Customer-Host-slash-VM-Log-Severity-Summary",
"_type": "visualization",
"_source": {
"title": "Host & VM Customer Log Severity Summary",
"visState": "{\n \"title\": \"Customer Host / VM Log Severity Summary\",\n \"type\": \"histogram\",\n \"params\": {\n \"shareYAxis\": true,\n \"addTooltip\": true,\n \"addLegend\": true,\n \"scale\": \"linear\",\n \"mode\": \"stacked\",\n \"times\": [],\n \"addTimeMarker\": false,\n \"defaultYExtents\": false,\n \"setYExtents\": false,\n \"yAxis\": {}\n },\n \"aggs\": [\n {\n \"id\": \"1\",\n \"type\": \"count\",\n \"schema\": \"metric\",\n \"params\": {}\n },\n {\n \"id\": \"2\",\n \"type\": \"filters\",\n \"schema\": \"segment\",\n \"params\": {\n \"filters\": [\n {\n \"input\": {\n \"query\": {\n \"query_string\": {\n \"query\": \"message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" AND message:\\\"\\\\\\\"severity\\\\\\\" : \\\\\\\"critical\\\\\\\"\\\" AND ( message:200* OR message:700* )\",\n \"analyze_wildcard\": true\n }\n }\n },\n \"label\": \"CRITICAL\"\n },\n {\n \"input\": {\n \"query\": {\n \"query_string\": {\n \"query\": \"message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" AND message:\\\"\\\\\\\"severity\\\\\\\" : \\\\\\\"major\\\\\\\"\\\" AND ( message:200* OR message:700* )\",\n \"analyze_wildcard\": true\n }\n }\n },\n \"label\": \"MAJOR\"\n },\n {\n \"input\": {\n \"query\": {\n \"query_string\": {\n \"query\": \"message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" AND message:\\\"\\\\\\\"severity\\\\\\\" : \\\\\\\"minor\\\\\\\"\\\" AND ( message:200* OR message:700* )\",\n \"analyze_wildcard\": true\n }\n }\n },\n \"label\": \"Minor\"\n },\n {\n \"input\": {\n \"query\": {\n \"query_string\": {\n \"query\": \"message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" AND message:\\\"\\\\\\\"severity\\\\\\\" : \\\\\\\"warning\\\\\\\"\\\" AND ( message:200* OR message:700* )\",\n \"analyze_wildcard\": true\n }\n }\n },\n \"label\": \"Warning\"\n },\n {\n \"input\": {\n \"query\": {\n \"query_string\": {\n \"query\": \"message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" AND message:\\\"\\\\\\\"severity\\\\\\\" : \\\\\\\"not-applicable\\\\\\\"\\\" AND ( message:200* OR message:700* )\",\n \"analyze_wildcard\": true\n }\n }\n },\n \"label\": \"Not-Applicable\"\n }\n ]\n }\n }\n ],\n \"listeners\": {}\n}",
"uiStateJSON": "{}",
"description": "",
"savedSearchId": "Logs",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\n \"filter\": []\n}"
}
}
},
{
"_id": "Host-and-VM-Events-Date-Histogram",
"_type": "visualization",
"_source": {
"title": "Host & VM Customer Log Events Date Histogram",
"visState": "{\n \"title\": \"New Visualization\",\n \"type\": \"histogram\",\n \"params\": {\n \"shareYAxis\": true,\n \"addTooltip\": true,\n \"addLegend\": true,\n \"scale\": \"linear\",\n \"mode\": \"stacked\",\n \"times\": [],\n \"addTimeMarker\": false,\n \"defaultYExtents\": false,\n \"setYExtents\": false,\n \"yAxis\": {}\n },\n \"aggs\": [\n {\n \"id\": \"1\",\n \"type\": \"count\",\n \"schema\": \"metric\",\n \"params\": {}\n },\n {\n \"id\": \"2\",\n \"type\": \"date_histogram\",\n \"schema\": \"segment\",\n \"params\": {\n \"field\": \"@timestamp\",\n \"interval\": \"auto\",\n \"customInterval\": \"2h\",\n \"min_doc_count\": 1,\n \"extended_bounds\": {}\n }\n }\n ],\n \"listeners\": {}\n}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\n \"index\": \"logstash-*\",\n \"query\": {\n \"query_string\": {\n \"query\": \"filename:\\\"fm-event.log\\\" AND message:\\\"\\\\\\\"state\\\\\\\" : \\\\\\\"msg\\\\\\\"\\\" AND ( message:200* OR ( message:700* AND message:instance ) )\",\n \"analyze_wildcard\": true\n }\n },\n \"filter\": []\n}"
}
}
},
{
"_id": "System-Troubleshooting-Logs-Date-Histogram",
"_type": "visualization",
"_source": {
"title": "System Troubleshooting Logs Date Histogram",
"visState": "{\"title\":\"New Visualization\",\"type\":\"histogram\",\"params\":{\"shareYAxis\":true,\"addTooltip\":true,\"addLegend\":true,\"scale\":\"linear\",\"mode\":\"stacked\",\"times\":[],\"addTimeMarker\":false,\"defaultYExtents\":false,\"setYExtents\":false,\"yAxis\":{}},\"aggs\":[{\"id\":\"1\",\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"type\":\"date_histogram\",\"schema\":\"segment\",\"params\":{\"field\":\"@timestamp\",\"interval\":\"auto\",\"customInterval\":\"2h\",\"min_doc_count\":1,\"extended_bounds\":{}}}],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"filename:\\\"sm.log\\\" OR filename:\\\"sm-customer\\\" OR filename:\\\"daemon-ocf.log\\\" OR filename:\\\"mtcAgent.log\\\" OR filename:\\\"mtcClient.log\\\" OR filename:\\\"sysinv.log\\\" OR filename:\\\"nova-compute\\\" OR filename:\\\"nova-scheduler.log\\\" OR filename:\\\"nova-conductor.log\\\"\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "SSH-slash-SFTP-slash-sudo-slash-postgres-Authentication-Audit-Log-Date-Histogram",
"_type": "visualization",
"_source": {
"title": "SSH / SFTP / sudo / postgres Authentication Audit Log Date Histogram",
"visState": "{\"title\":\"New Visualization\",\"type\":\"histogram\",\"params\":{\"shareYAxis\":true,\"addTooltip\":true,\"addLegend\":true,\"scale\":\"linear\",\"mode\":\"stacked\",\"times\":[],\"addTimeMarker\":false,\"defaultYExtents\":false,\"setYExtents\":false,\"yAxis\":{}},\"aggs\":[{\"id\":\"1\",\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"type\":\"date_histogram\",\"schema\":\"segment\",\"params\":{\"field\":\"@timestamp\",\"interval\":\"auto\",\"customInterval\":\"2h\",\"min_doc_count\":1,\"extended_bounds\":{}}}],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"filename:auth.log\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "Horizon-Authentication-Audit-Log-Date-Histogram",
"_type": "visualization",
"_source": {
"title": "Horizon Authentication Audit Log Date Histogram",
"visState": "{\"title\":\"New Visualization\",\"type\":\"histogram\",\"params\":{\"shareYAxis\":true,\"addTooltip\":true,\"addLegend\":true,\"scale\":\"linear\",\"mode\":\"stacked\",\"times\":[],\"addTimeMarker\":false,\"defaultYExtents\":false,\"setYExtents\":false,\"yAxis\":{}},\"aggs\":[{\"id\":\"1\",\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"type\":\"date_histogram\",\"schema\":\"segment\",\"params\":{\"field\":\"@timestamp\",\"interval\":\"auto\",\"customInterval\":\"2h\",\"min_doc_count\":1,\"extended_bounds\":{}}}],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"filename:horizon.log\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "BASH-Audit-Log-Date-Histogram",
"_type": "visualization",
"_source": {
"title": "BASH Audit Log Date Histogram",
"visState": "{\"title\":\"New Visualization\",\"type\":\"histogram\",\"params\":{\"shareYAxis\":true,\"addTooltip\":true,\"addLegend\":true,\"scale\":\"linear\",\"mode\":\"stacked\",\"times\":[],\"addTimeMarker\":false,\"defaultYExtents\":false,\"setYExtents\":false,\"yAxis\":{}},\"aggs\":[{\"id\":\"1\",\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"type\":\"date_histogram\",\"schema\":\"segment\",\"params\":{\"field\":\"@timestamp\",\"interval\":\"auto\",\"customInterval\":\"2h\",\"min_doc_count\":1,\"extended_bounds\":{}}}],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"filename:bash.log\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "REST-API-Audit-Log-Histogram",
"_type": "visualization",
"_source": {
"title": "REST API Audit Log Histogram",
"visState": "{\"title\":\"New Visualization\",\"type\":\"histogram\",\"params\":{\"shareYAxis\":true,\"addTooltip\":true,\"addLegend\":true,\"scale\":\"linear\",\"mode\":\"stacked\",\"times\":[],\"addTimeMarker\":false,\"defaultYExtents\":false,\"setYExtents\":false,\"yAxis\":{}},\"aggs\":[{\"id\":\"1\",\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"type\":\"date_histogram\",\"schema\":\"segment\",\"params\":{\"field\":\"@timestamp\",\"interval\":\"auto\",\"customInterval\":\"2h\",\"min_doc_count\":1,\"extended_bounds\":{}}}],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-*\",\"query\":{\"query_string\":{\"query\":\"filename:api.log\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
}
]
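The array above is a Kibana saved-objects export (dashboards, saved searches and visualizations keyed by _id and _type). As a purely illustrative sketch, not part of this commit, objects in this older export format can be loaded into the .kibana index of the Elasticsearch instance that the logstash configurations below point at (127.0.0.1:9200); the file name dashboards.json and the use of jq are assumptions:

#!/bin/bash
# Hypothetical loader sketch (assumptions: export saved as dashboards.json,
# Elasticsearch reachable at 127.0.0.1:9200, jq installed).
ES="http://127.0.0.1:9200"
count=$(jq 'length' dashboards.json)
for ((i = 0; i < count; i++)); do
    id=$(jq -r ".[$i]._id" dashboards.json)
    type=$(jq -r ".[$i]._type" dashboards.json)
    # Each entry's _source becomes a document of the given type and id in .kibana
    jq ".[$i]._source" dashboards.json |
        curl -s -XPUT "$ES/.kibana/$type/$id" -d @- > /dev/null
done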

View File

@ -0,0 +1,76 @@
# The logstash configuration below takes Titanium Cloud syslog input and outputs the custom Titanium Cloud log data to elasticsearch.
# Because Titanium Cloud syslog messages extend the openstack log format, grok is required to parse the log data into something structured and queryable:
# - Inconsistent formatting of log level, pid and program
# - custom Titanium Cloud syslog fields and naming
input {
# Do not use syslog input plugin (or type)
#TCP_PARAMS
#UDP_PARAMS
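# A minimal sketch (assumed values, not part of this commit) of what the
# placeholders above are typically replaced with; port 10514 matches the
# redirect target mentioned in locallog.sh, and no "type" is set, per the
# note above:
# tcp { port => 10514 }
# udp { port => 10514 }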
}
filter {
# "Grok is currently the best way in logstash to parse log data into something structured and queryable."
# https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html
# The input plugins above deliver the raw syslog lines; the grok filter below parses them into structured, queryable fields.
grok {
match => {
"message" => [
# The default break_on_match is used so first successful match by grok will result in the filter being finished.
# Use Titanium Cloud term node instead of host.
"<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{NOTSPACE:system_name} %{NOTSPACE:filename} %{SYSLOGHOST:node} %{DATA:program}\: %{TIMESTAMP_ISO8601:syslog_trash} %{POSINT:pid} %{LOGLEVEL:level} %{GREEDYDATA:message}",
"<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{NOTSPACE:system_name} %{NOTSPACE:filename} %{SYSLOGHOST:node} %{DATA:program} %{TIMESTAMP_ISO8601:syslog_trash} %{POSINT:pid} %{LOGLEVEL:level} %{DATA:program} %{GREEDYDATA:message}",
"<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{NOTSPACE:system_name} %{NOTSPACE:filename} %{SYSLOGHOST:node} %{DATA:program}(?:\[%{POSINT:pid}\])?: %{LOGLEVEL:level}?: %{GREEDYDATA:message}",
"<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{NOTSPACE:system_name} %{NOTSPACE:filename} %{SYSLOGHOST:node} %{DATA:program} \[%{POSINT:pid}\] %{GREEDYDATA:message}",
"<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{NOTSPACE:system_name} %{NOTSPACE:filename} %{SYSLOGHOST:node} %{DATA:program}: %{LOGLEVEL:level} %{GREEDYDATA:message}",
"<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{NOTSPACE:system_name} %{NOTSPACE:filename} %{SYSLOGHOST:node} %{DATA:program}: %{GREEDYDATA:message}",
"<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{NOTSPACE:system_name} %{NOTSPACE:filename} %{SYSLOGHOST:node} %{DATA:program}\[%{POSINT:pid}\]?: %{LOGLEVEL:level} %{GREEDYDATA:message}",
"<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{NOTSPACE:system_name} %{NOTSPACE:filename} %{SYSLOGHOST:node} %{DATA:program}\[%{POSINT:pid}\]?: %{GREEDYDATA:message}",
"<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{NOTSPACE:system_name} %{NOTSPACE:filename} %{SYSLOGHOST:node} %{DATA:program}(?:\[%{POSINT:pid}\])?: %{GREEDYDATA:message}",
"<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{NOTSPACE:system_name} %{NOTSPACE:filename} %{SYSLOGHOST:node} %{DATA:program} %{GREEDYDATA:message}",
"<%{POSINT:syslog_pri}>%{TIMESTAMP_ISO8601:syslog_timestamp} %{POSINT:pid} %{LOGLEVEL:level} %{DATA:program} %{GREEDYDATA:message}",
"<%{POSINT:syslog_pri}>%{DATESTAMP:syslog_timestamp} %{POSINT:pid} %{LOGLEVEL:level} %{DATA:program} %{GREEDYDATA:message}"
]
}
overwrite => [ "message" ]
remove_field => [ "syslog_trash" ]
add_field => { "host_timestamp" => "%{syslog_timestamp}" }
}
# https://www.elastic.co/guide/en/logstash/current/plugins-filters-syslog_pri.html
syslog_pri {
facility_labels => ["kernel", "user-level", "mail", "daemon", "security/authorization", "syslogd", "line printer", "network news", "UUCP", "clock", "security/authorization", "FTP", "NTP", "log audit", "log alert", "clock", "postgres", "platform", "openstack", "sm", "local4", "mtce", "sysinv", "horizon"]
severity_labels => [
"emergency", "alert", "crit", "error", "warn", "notice", "info", "debug"
]
# syslog_pri has served its purpose, and syslog_facility_code isn't useful
remove_field => [ "syslog_pri", "syslog_facility_code", "syslog_severity_code", "severity_label", "syslog_severity" ]
}
# https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html
date {
# set @timestamp from the grok'd syslog_timestamp and remove the field
match => [ "syslog_timestamp", "MMM d HH:mm:ss.SSS", "MMM dd HH:mm:ss.SSS" , "yyyy-MM-dd HH:mm:ss.SSS"]
remove_field => [ "syslog_timestamp" ]
timezone => [ "UTC" ]
}
# Rename and remove unwanted syslog fields
# https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html
mutate {
rename => [
"host", "system_address"
]
# https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html#plugins-filters-mutate-remove_field
remove_field => [ "type", "syslog_facility" ]
}
}
output {
if "_grokparsefailure" in [tags] {
file { path => "/var/log/logstash/wrs-grokparsefailure-%{+YYYY-MM-dd}" }
} else {
elasticsearch { hosts => ["127.0.0.1:9200"] }
}
}

View File

@ -0,0 +1,86 @@
input {
file {
path => ""
start_position => beginning
ignore_older => 0
sincedb_path => "/dev/null"
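# The multiline codec below folds lines that do not start with an ISO8601
# timestamp (continuation lines, tracebacks, etc.) into the preceding event.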
codec => multiline {
pattern => "^%{TIMESTAMP_ISO8601}"
negate => true
what => previous
}
}
}
filter {
# "Grok is currently the best way in logstash to parse log data into something structured and queryable."
# https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html
grok {
match => ["path","%{GREEDYDATA:system_name}(?:/(?<node>(.+?(?=_[0-9]{8}\.[0-9]{6}))))(.+?\b)/%{GREEDYDATA}/%{USERNAME:filename}"]
}
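# Illustrative example (path taken from the locallog.sh usage examples):
#   /localdisk/Collect/ALL_NODES_20170215.202328/controller-0_20170215.202328/var/log/sysinv.log
# should yield system_name "/localdisk/Collect/ALL_NODES_20170215.202328",
# node "controller-0" and filename "sysinv.log".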
if [filename] =~ "(\s*)nfv\-vim(.*)" {
grok {
match => [ "message","%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE}(\s*)%{NOTSPACE:program}(\s*)%{NOTSPACE:level}(\s*)%{GREEDYDATA:message}"]
}
} else if [filename] =~ "(\s*)libvirtd(.*)" {
grok {
match => [ "message","%{TIMESTAMP_ISO8601:timestamp} %{POSINT:pid}: %{LOGLEVEL:level} : %{NOTSPACE:program} %{GREEDYDATA:message}"]
}
} else if [filename] =~ "(\s*)horizon(.*)" {
grok {
match => [ "message","%{TIMESTAMP_ISO8601:timestamp} \[(?<level>([a-zA-Z0-9.]*))\](\s*)%{NOTSPACE:program}: %{GREEDYDATA:message}"]
}
} else if [filename] =~ "(\s*)openstack_test(.*)" {
grok {
match => [ "message","%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE}(\s*)%{NOTSPACE:program} %{GREEDYDATA:message}"]
}
} else if [filename] =~ "(\s*)platform(.*)" {
grok {
match => [ "message","%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE}(\s*)%{NOTSPACE:program}(\s*)%{NOTSPACE:level}(\s*)%{GREEDYDATA:message}"]
}
} else if [filename] =~ "(\s*)mtcAgent\_api(.*)" {
grok {
match => [ "message","%{TIMESTAMP_ISO8601:timestamp} \[(?<pid>([0-9]*))\](\s*) %{NOTSPACE}(\s*)%{NOTSPACE:program}(\s*)%{GREEDYDATA:message}"]
}
} else if [filename] =~ "(\s*)mtcAgent\_event(.*)" {
grok {
match => [ "message","%{TIMESTAMP_ISO8601:timestamp} \[(?<pid>([0-9]*))\](\s*)%{GREEDYDATA:message}"]
}
} else {
grok {
match => {
"message" => [
# The default break_on_match is used so first successful match by grok will result in the filter being finished.
# Use Titanium Cloud term node instead of host.
"%{TIMESTAMP_ISO8601:timestamp} (\[[a-zA-Z0-9.]*\])(\s*)%{NOTSPACE}(\s*)%{NOTSPACE}(\s*)%{NOTSPACE}(\s*)%{NOTSPACE:program} %{GREEDYDATA:message}",
"%{TIMESTAMP_ISO8601:timestamp} %{POSINT:pid} %{LOGLEVEL:level} %{DATA:program} %{GREEDYDATA:message}",
"%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE} (?<program>(.+?(?=\()))(\(.+?\))(\[)(?<pid>(.*?))(\]\:) %{NOTSPACE:level} %{GREEDYDATA:message}",
"%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE}(\s*)%{NOTSPACE:program} %{GREEDYDATA:message}"
]
}
}
}
# https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html
date {
# set @timestamp from the grok'd timestamp field and remove it
match => [ "timestamp", "ISO8601", "yyyy-MM-dd HH:mm:ss.SSS", "yyyy/mm/dd/HH/mm/ss.SSS", "MMM d HH:mm:ss.SSS", "MMM dd HH:mm:ss.SSS" , "yyyy-MM-dd HH:mm:ss.SSS", "MMM dd HH:mm:ss,SSS", "yyyy-MM-dd HH:mm:ss,SSS", "yy-MM-dd HH:mm:ss.SSS", "yyyy-MM-dd HH:mm:ss" ]
target => "@timestamp"
timezone => [ "UTC" ]
remove_field => [ "timestamp" ]
}
# Rename and remove unwanted syslog fields
# https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html
mutate {
rename => [
"host", "system_address"
]
}
if "_grokparsefailure" in [tags] {
drop { }
}
}
output {
elasticsearch { hosts => ["127.0.0.1:9200"] }
stdout {}
}

View File

@ -0,0 +1,304 @@
#!/bin/bash
################################################################################
#
# Copyright (c) 2017 Wind River Systems, Inc.
#
# SPDX-License-Identifier: Apache-2.0
#
################################################################################
#
# Description: Changes a running ELK stack system to read in logs from either
# local log files or from a remote logging server.
#
# Behaviour : The script takes in a directory or file location, unzips any .tgz
# files found within the first two directory levels (node files
# within a collect file), and unzips any .gz files found within
# a var/log/ path found inside of the path designated by the user.
# Each ELK service is restarted, and current elasticsearch indices
# can optionally be wiped. A custom config file is modified to
# contain the user-specified filepath, and then logstash is set to
# begin reading in all logs found, starting from the user-specified
# location. Upon completion, the window displaying the logs being
# read into logstash will appear to hang up and no new text will be
# displayed. This is because all files have been read and no new
# information is available. It is not currently possible to detect
# when this happens and terminate the process. The user can manually
# terminate the script at this time without their ELK setup/data being
# affected. Logstash can be set to read from a remote logging server
# as per the settings in wrs-logstash.conf if the remote logging server
# had been set up and working with ELK prior to running this script. To
# return to viewing logs from a remote logger use the --remote command.
#
# This script should be kept in the same directory as the custom config file
# local-logstash.conf, otherwise this script will not be able to edit the config
# file to include the user-specified path.
#
# If, after opening the Kibana webpage, clicking "create" on the "Configure an
# index pattern" page, selecting a Time-field name from the drop-down list and
# then navigating to the Discover page, no logs are seen but no errors are
# displayed either, click the range information at the top right of the page, click
# "Absolute" on the left side, and then select the date range in which you expect
# the logs to have been created. Alternatively, you can click "Quick" instead of
# "Absolute" and choose one of those options. Kibana looking at too recent a
# time range seems to be the most common issue when logs fail to appear after they
# have been read in.
#
# If you are trying to view logs from a local file and are noticing logs from a
# remote logger appearing in Kibana, check that you do not have any UDP port
# forwards set up. If you do, your ELK setup will continue to receive data from
# the remote logger while local logs are also being added, and you will simultaneously
# add data from both sources to the indices and have them viewable in Kibana.
#
# To increase the speed at which logs are read into logstash, near the bottom of
# this script, change "-w 8" to a higher number. This is the number of workers that
# read through the files within the specified location. The number of workers
# should correspond to the number of cores you have available, but numbers greater
# than your number of cores still seem to improve the rate at which logs are read
# and parsed.
#
# Dependencies: This script requires that /etc/logstash/conf.d/wrs-logstash.conf
# exists. This file is initially placed in this location by
# install-log-server.sh which is used to set up ELK on your system
# to receive logs from a remote logger. This file is used to allow
# logs to be received from a remote server when the --remote option
# is specified, and further, the IP designated to receive logstash
# input for remote and local logging is obtained from this file.
# If logs are being read from local files, ensure local-logstash.conf
# exists in the same directory as this script.
#
################################################################################
ip=""
suffix=""
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )" # Directory that this script is contained in
if [ $UID -ne 0 ]; then
echo $'\tWarning: This script must be run as \'root\' user to restart ELK services. Please rerun with sudo.\n'
exit 1
fi
# Check to see if required config files are present
if [[ ! -f "/etc/logstash/conf.d/wrs-logstash.conf" ]]; then
echo $'\tWarning: /etc/logstash/conf.d/wrs-logstash.conf does not exist.'
echo $'\t\t Please make sure you have properly run the install-log-server.sh script.\n'
exit 1
fi
if [[ ! -f "$DIR""/local-logstash.conf" ]]; then
echo $'\tWarning: local-logstash.conf does not exist in the directory containing this script.'
echo $'\t\t Please make sure both of these files are in the same location before re-running.\n'
exit 1
fi
function help {
echo ""
echo "--------------------------------------------------------------------------------------"
echo "ELK Local Log Setup Script"
echo ""
echo "Usage:"
echo ""
echo "sudo ./locallog.sh [-clean] [Parent directory containing collect file(s) or location of specific log] [--remote]"
echo ""
echo " -clean ... wipes all elasticsearch indices clearing the log data shown"
echo " in Kibana. Omitting this will append any newly found log data"
echo " to the log data already seen in Kibana."
echo "[Location of logs] ... omitting the square braces, enter the location of a directory"
echo " containing untarred Collect files, or enter the path to a specific"
echo " log file. Drag and dropping files into terminal to get the location"
echo " is supported."
echo " --remote ... directs logstash to acquire logs remotely over the network."
echo " By default the log server created using the install-log-server.sh"
echo " script is connected to using the original configuration file at"
echo " /etc/logstash/conf.d/wrs-logstash.conf"
echo " to use a different server's .conf file please modify this script."
echo " --help | -h ... this info"
echo ""
echo " As an argument, enter the location of a parent directory containing one or more collect files"
echo " to have all of the logs contained within the specified path's subdirectories loaded into a local"
echo " Kibana server. Individual collect files or log files may also be specified."
echo ""
echo "Note: Only collect files that have already been untarred can be searched for logs. This script will"
echo " take care of unpacking .tgz files found inside of the specified path, as well as unzipping all"
echo " .gz files found in any var/log/ path found within any subdirectories."
echo " So as to only unpack the initial .tgz file for each node in a collect file, .tgz files will only"
echo " be unzipped if they are found within 2 directory-levels from your designated path."
echo " if the -clean option is not used, new and old log data will both be visible in Kibana."
echo ""
echo "Tips: -If the script is run multiple times without using the -clean option, some logs may not appear in Kibana"
echo " initially if their index does not use the same time-field name as the logs added in previous runs of the"
echo " script. To see the new logs, in Kibana go to Settings>Add New> Then select the appropriate time-field name"
echo " and click Create. The time-field name can be found in the grok statements used to parse your logs."
echo " -If you've created an index but no logs appear on the Discover page in Kibana, go to the top right"
echo " and modify the date range to include dates you believe might include when the logs were created on"
echo " their respective node. The date range being set to too recent an interval is the most common reason"
echo " for logs failing to appear."
echo " -To keep Kibana populated with previously read-in logs from either local files or a remote logger, simply"
echo " omit using -clean, and all logs obtained by the script will be appended to an index and kept in Kibana"
echo " -If you feel that log files are being parsed and read too slowly, modify this file at the bottom"
echo " and change -w 8 to a larger number. The number should correspond to the number of cores available,"
echo " but improvements have been seen with a number greater than the number of cores."
echo " -If you use the --remote option and you get an error, make sure that the wrs-logstash.conf file"
echo " is in /etc/logstash/conf.d/ or that you modify this script to point to whichever .conf you "
echo " originally used when setting up ELK to work with a remote logger."
echo " -If you use the --remote option and logs fail to populate, or you get an error about elasticsearch"
echo " make sure that the port your remote logger is using is still being forwarded correctly by re-entering"
echo " iptables -t nat -A PREROUTING -p UDP -m udp --dport $PORT -j REDIRECT --to-ports 10514"
echo " OR"
echo " ip6tables -t nat -A PREROUTING -p tcp -m tcp --dport $PORT -j REDIRECT --to-ports 10514"
echo " make sure you correctly specify tcp or udp, and use iptables for IPV4 and ip6tables for IPV6"
echo " -If you are noticing new logs from a remote logger present in Kibana even though you are populating it"
echo " with local logs, check and delete any UDP port forwards to 514/10514, as these forwards will result in"
echo " log data from remote sources being added into your index, even if you are also reading in local logs."
echo " -If you have stopped the script from reading from a remote logger but new logs from the remote server"
echo " continue to appear in Kibana even though the remote server wasn't connected via UDP, run the -clean"
echo " command on its own, then run -clean --remote to get the connection properly established again. Cancelling"
echo " and cleaning after this should clear up the issue. This issue seems to occur randomly and does not appear"
echo " to result from any particular sequence of events (This issue is not specifc to this script)."
echo ""
echo "Examples:"
echo ""
echo "sudo ./locallog.sh -clean"
echo "sudo ./locallog.sh -clean --remote"
echo "sudo ./locallog.sh -clean /localdisk/Collect/ALL_NODES_20170215.202328/"
echo "sudo ./locallog.sh --remote # Will wipe indices and begin receiving logs from remote logger again"
echo "sudo ./locallog.sh /localdisk/Collect/ALL_NODES_20170215.202328/"
echo "sudo ./locallog.sh /localdisk/Collect/ALL_NODES_20170215.202328/controller-0_20170215.202328/"
echo "sudo ./locallog.sh /localdisk/Collect/ALL_NODES_20170215.202328/controller-0_20170215.202328/var/log/sysinv.log"
echo ""
echo "Refer to the wiki at: http://wiki.wrs.com/PBUeng/LocalLogsInELK"
echo "--------------------------------------------------------------------------------------"
echo ""
exit 0
}
function localLog {
# Address of parent directory for collect files to look through
address="$arg"
address="${address#\'}" # Remove ' from beginning of address if drag n' dropped into terminal
address="${address%\'}" # Remove ' from end of address if drag n' dropped into terminal
# unzips .tgz files within first 2 directory levels from given path. This is intended to unzip the files corresponding
# to each of the nodes contained in a Collect file.
for i in $(find "$address" -maxdepth 2 -type f -path '*/*.tgz'); do
loc="$(readlink -f $i)"
tar -zxvf "$i" -C "${loc%/*}"
done
# This unzips any .gz files found in var/log/ which is where log files are stored (meant for rotated logs)
for i in $(find "$address" -type f -path '*/var/log/*.gz'); do
gunzip "$i"
done
# Changes suffix to designate whether a directory is being looked through or if an individual log file was specified
address="\"${address%\/}""$suffix"
hostAddr="\[\"""$ip""\"\]"
# Changes the input filepath in the custom config file to point to the user-specified location
perl -pi -e 's#(^\s*path\s*=> ).*\"#$1'"$address"'#g' "$confLoc" # Replaces current input path in config file with the new one specified
perl -pi -e 's#(^\s*elasticsearch\s*\{\s*hosts\s*=> ).*#$1'"$hostAddr \}"'#g' "$confLoc" # Replaces current output hosts' address with the one in wrs-logstash.conf
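# Illustrative only: with suffix "/**/*.log*" and a collect directory argument, the
# path line in local-logstash.conf ends up as, e.g.,
#   path => "/localdisk/Collect/ALL_NODES_20170215.202328/**/*.log*"
# and the elasticsearch hosts line is pointed at the address pulled from wrs-logstash.conf.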
}
# Restarts each of the ELK services
function restart {
if [[ "${dist}" == *"CentOS"* ]]; then
echo "Restarting elasticsearch..."
systemctl restart elasticsearch
echo "Restarting logstash..."
systemctl restart logstash
echo "Restarting kibana..."
systemctl restart kibana
elif [[ "${dist}" == *"Ubuntu"* ]]; then
echo "Restarting elasticsearch..."
/etc/init.d/elasticsearch restart
echo "Restarting logstash..."
/etc/init.d/logstash restart
echo "Restarting kibana..."
/etc/init.d/kibana restart
else
# If host OS cannot be determined to be CentOS or Ubuntu, run commands for both systems to see if they will work
echo "Unknown OS detected."
echo "Attempting all solutions. If none pass, please look up how to restart elasticsearch, logstash and kibana"
echo "for your system and continue final steps manually."
echo "Attempting to restart elasticsearch"
systemctl restart elasticsearch
/etc/init.d/elasticsearch restart
echo "Attempting to restart logstash"
systemctl restart logstash
/etc/init.d/logstash restart
echo "Attempting to restart kibana"
systemctl restart kibana
/etc/init.d/kibana restart
sleep 5s # Sleep to give user time to see if any of the restarts passed
fi
}
# Deletes all indices in elasticsearch (clears logs in Kibana)
function wipeIndices {
# Changes index API settings to allow indices to be deleted so past local logs aren't always visible
curl -s -XPUT "$ip"/_cluster/settings -d '{
"persistent" : {
"action.destructive_requires_name" : false
}
}' > /dev/null
curl -s -XDELETE "$ip"/_all > /dev/null # Deletes all elasticsearch indices
echo "Indices wiped"
}
function getIP {
# Your IP since elasticsearch doesn't always get hosted at localhost
origConf="/etc/logstash/conf.d/wrs-logstash.conf"
if [[ "${dist}" == *"CentOS"* ]] || [[ "${dist}" == *"Ubuntu"* ]]; then
# Pulls IP from output specified in wrs-logstash.conf
ip=$(perl -ne'/(?:^\s*elasticsearch\s*\{\s*hosts\s*=> )(.*)/ and print $1' "$origConf")
else
read -p "Enter the IP that ELK modules will bind and publish to: " ip
fi
ip="${ip#[\"}"
ip="${ip%\"] \}}"
}
echo ""
# Determines which OS you are using and runs the corresponding reset commands
dist="$(lsb_release -a)"
while [[ $# -gt 0 ]]; do
arg="$1"
case $arg in
-h|--help)
help
;;
--remote)
confLoc="/etc/logstash/conf.d/wrs-logstash.conf"
restart
break
;;
-clean)
getIP
wipeIndices
if [ -z "$2" ]; then # If no arguments follow -clean then exit
echo "Error: Log path not specified."
exit 1
fi
#exit 0
;;
*)
getIP
confLoc="$DIR""/local-logstash.conf" # Location of the custom config file
# Sets the config file to either look for logs in subdirectories or just a single specified log
if [[ -f "$arg" ]]; then
suffix="\""
elif [[ -d "$arg" ]]; then
suffix="/**/*.log*\""
else
printf "Unknown input.\nTerminating...\n"
exit 1
fi
localLog
restart
break
esac
shift
done
echo "Reading logs..."
# Changes which config file logstash reads in and sets the number of workers to 8
/opt/logstash/bin/logstash -f "$confLoc" -w 8
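# Note: logstash started this way runs in the foreground; once the logs have
# been indexed they should be visible in Kibana, and Ctrl-C stops the run.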
exit

2
mwa-delphi.map Normal file
View File

@ -0,0 +1,2 @@
cgcs/openstack/recipes-remote-clients/install-log-server|install-log-server
cgcs/openstack/recipes-remote-clients/remote-clients|remote-clients

View File

@ -0,0 +1,3 @@
SRC_DIR=remote-clients
COPY_LIST="$SRC_DIR/*"
TIS_PATCH_VER=5

View File

@ -0,0 +1,46 @@
Summary: Remote-Clients
Name: remote-clients
Version: 2.0.1
Release: %{tis_patch_ver}%{?_tis_dist}
License: Apache-2.0
Group: devel
Packager: Wind River <info@windriver.com>
URL: unknown
Source0: %{name}-%{version}.tar.gz
BuildRequires: python-ceilometerclient-sdk
BuildRequires: python-cinderclient-sdk
BuildRequires: python-glanceclient-sdk
BuildRequires: python-heatclient-sdk
BuildRequires: python-keystoneclient-sdk
BuildRequires: python-keystoneauth1-sdk
BuildRequires: python-neutronclient-sdk
BuildRequires: python-novaclient-sdk
BuildRequires: python-openstackclient-sdk
BuildRequires: python-openstacksdk-sdk
BuildRequires: cgts-client-sdk
BuildRequires: python-osc-lib-sdk
BuildRequires: python-muranoclient-sdk
%define cgcs_sdk_deploy_dir /opt/deploy/cgcs_sdk
%define remote_client_dir /usr/share/remote-clients
%description
Remote-Client files
%prep
%setup
mv %{name} wrs-%{name}-%{version}
find %{remote_client_dir} -name "*.tgz" -exec cp '{}' wrs-%{name}-%{version}/ \;
sed -i 's/xxxVERSIONxxx/%{version}/g' wrs-%{name}-%{version}/README
tar czf wrs-%{name}-%{version}.tgz wrs-%{name}-%{version}
# Install the SDK tarball for the remote-clients package
%install
install -D -m 644 wrs-%{name}-%{version}.tgz %{buildroot}%{cgcs_sdk_deploy_dir}/wrs-%{name}-%{version}.tgz
%files
%{cgcs_sdk_deploy_dir}/wrs-%{name}-%{version}.tgz

View File

@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

View File

@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

View File

@ -0,0 +1,86 @@
Copyright © 2016-2017 Wind River Systems, Inc.
SPDX-License-Identifier: Apache-2.0
-----------------------------------------------------------------------
Titanium Cloud Remote CLI Clients
----------------------------------
To enable remote access to the Titanium Cloud CLI, the clients and an
installer script have been packaged for installation on a remote Linux distribution.
This tarball includes several clients that can be used to issue CLI commands
to an existing Titanium Cloud.
Installing the Remote CLI Clients (system install)
--------------------------------------------------
To install the clients to the system packages on a Linux machine follow these
steps:
1. Untar the provided tarball:
tar xfv wrs-remote-clients-xxxVERSIONxxx.tgz
cd wrs-remote-clients-xxxVERSIONxxx
2. Execute the provided installer script:
sudo ./install_clients.sh
NOTE: Please open a new terminal after installing so that the installed bash
completion takes effect.
Installing the Remote CLI Clients (virtualenv)
----------------------------------------------
To install the clients within an isolated virtualenv follow these steps:
1. Create a virtualenv (if it does not already exist)
virtualenv MYENV
2. Activate the virtualenv
source MYENV/bin/activate
3. Untar the provided tarball:
tar xfv wrs-remote-clients-xxxVERSIONxxx.tgz
cd wrs-remote-clients-xxxVERSIONxxx
4. Execute the provided installer script:
./install_clients.sh
5. Remember to deactivate the virtualenv when you are finished using the
remote clients.
deactivate
6. Source /etc/bash_completion (". /etc/bash_completion") to update bash
completion, if your system supports it. You may need to do this again when
opening a new console.
Using the Remote CLI Clients
-----------------------------
1. Download the openrc file from Horizon.
Log in to Horizon and go to:
Project -> API Access -> Download OpenStack RC File
2. On your console, source the file you have just downloaded. You will be
asked for your OpenStack password. You will also be asked for an optional
CA certificate. Enter the path to the CA certificate used for Titanium
Cloud if you have HTTPS configured, or press Enter if you are not using HTTPS.
3. Run CLI commands in the same way you would run them on the Titanium
Cloud Controller.
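For example (the openrc filename below is only a placeholder; use the file
downloaded in step 1):
source ./admin-openrc.sh
openstack server list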
NOTE: The open-source OpenStack CLI Clients can NOT be installed at the same
time as the Titanium Cloud CLI Clients; however, the Titanium Cloud CLI
Clients can be used to manage an open-source OpenStack cloud by sourcing
the appropriate openrc file for that cloud.
NOTE: The remote CLI installer uses pip to install dependencies. The default pip
package that comes with Ubuntu 16.04 has issues. Please upgrade pip to
the latest version before running the remote client installer if you
are installing the remote clients on an Ubuntu 16.04 system.
pip install --upgrade pip
Failure to do so can result in installation issues. The Ubuntu pip
package issue is currently being tracked here:
https://bugs.launchpad.net/ubuntu/+source/salt/+bug/1586381

View File

@ -0,0 +1,165 @@
#!/bin/bash
#
# Copyright (c) 2016-2017 Wind River Systems, Inc.
#
# SPDX-License-Identifier: Apache-2.0
#
#
while getopts ":hs" opt; do
case $opt in
h)
echo "Usage:"
echo "install_clients [OPTION...]"
echo "-h show help options"
echo "-s skip installing of dependencies through package manager"
echo ""
echo "This script installs the remote clients for Titanium Cloud. It automatically"
echo "uses the package manager detected on your system to pull in dependencies. The"
echo "installation process is dependent on the following packages. If your system"
echo "already includes these packages, or you prefer to manage them manually, then"
echo "you can skip this step by specifying the -s option."
echo " python-dev python-setuptools gcc git python-pip libxml2-dev libxslt-dev"
echo " libssl-dev libffi-dev libssl-dev"
echo ""
echo "If this script is run within a virtualenv then dependent packages will not be"
echo "installed and client packages will be installed within the virtualenv."
echo ""
exit 0
;;
s)
skip_req=1
;;
\?)
echo "Invalid option: -$OPTARG, valid options are -h and -s"
exit 1
;;
esac
done
if [ -z "${VIRTUAL_ENV}" ]; then
if [ $EUID != 0 ]; then
echo "Root access is required. Please run with sudo or as root."
exit 1
fi
# install tools for the script, like pip
if [[ ! -v skip_req ]]; then
which apt-get > /dev/null
aptget_missing=$?
which yum > /dev/null
yum_missing=$?
if [[ "$aptget_missing" == "0" ]]; then
apt-get install python-dev python-setuptools gcc git libxml2-dev libxslt-dev libssl-dev libffi-dev --no-upgrade || exit 1
easy_install pip || exit 1
elif [[ "$yum_missing" == "0" ]]; then
yum install python-devel python-setuptools gcc git libxml2-devel libxslt-devel openssl-devel libffi-devel || exit 1
easy_install pip || exit 1
else
echo "No supported package managers detected (apt-get, yum)"
echo "Please ensure the following are installed on your system before continuing:"
echo "python-dev python-setuptools gcc git python-pip"
read -p "Continue with installation? y/n: " PACKMAN_CONTINUE_INPUT
while [[ "$PACKMAN_CONTINUE_INPUT" != "y" && "$PACKMAN_CONTINUE_INPUT" != "n" ]]
do
echo "invalid input: $PACKMAN_CONTINUE_INPUT"
read -p "Continue with installation? y/n: " PACKMAN_CONTINUE_INPUT
done
if [[ "$PACKMAN_CONTINUE_INPUT" == "n" ]]; then
echo "exiting installer..."
exit 0
fi
fi
fi
else
echo "Installing clients to virtual env: ${VIRTUAL_ENV}"
fi
# log standard output and standard error, because there is quite a lot of it
# only output what is being installed and the progress to the console (echo)
SCRIPTDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
exec 3>&1 1>> $SCRIPTDIR/client_installation.log 2>&1
# extract all clients
echo -n Extracting individual clients ... 1>&3
# CentOS 7 has an issue where the "positional" package does not install
# the pbr requirement, so install it manually here.
if ! pip install "pbr>=1.8"; then
echo "Failed to install requirements" 1>&3
exit 1
fi
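# Emit a dot to the console every second while the long-running extract and
# install steps below write their real output to the log file; the trap kills
# the background dot loop when the script exits.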
while true;do echo -n . 1>&3;sleep 1;done &
trap 'kill $! 2>/dev/null' EXIT
for file in *.tgz
do
if ! tar -zxf $file; then
echo "Failed to extract file $file" 1>&3
exit 1
fi
done
if [ -f "requirements.txt" ]
then
if ! pip -q install -r requirements.txt -c upper_constraints.txt; then
echo "Failed to install requirements" 1>&3
exit 1
fi
fi
kill $!
echo [DONE] 1>&3
# first remove any clients already installed
# we need to do this in order to downgrade to the ones we are installing
# because some of our tis clients are older than the most recent openstack clients
pip freeze | grep -wF -f installed_clients.txt | xargs pip uninstall -y
for dir in ./*/
do
cd $dir
if [ -f "setup.py" ]
then
echo -n Installing $(python setup.py --name) ... 1>&3
fi
while true;do echo -n . 1>&3;sleep 1;done &
if [ -f "requirements.txt" ]
then
grep -vwF -f ../installed_clients.txt requirements.txt > requirements.txt.temp
mv requirements.txt.temp requirements.txt
sed -i -e 's/# Apache-2.0//g' requirements.txt
if ! pip -q install -r requirements.txt -c ../upper_constraints.txt; then
echo "Failed to install requirements for $(python setup.py --name)" 1>&3
exit 1
fi
fi
if [ -f "setup.py" ]
then
if ! python setup.py -q install; then
echo "Failed to install $(python setup.py --name)" 1>&3
exit 1
fi
fi
# install bash completion
if [ -d "tools" -a -z "${VIRTUAL_ENV}" ]
then
cd tools
if [ -d "/etc/bash_completion.d" ]
then
count=`ls -1 *.bash_completion 2>/dev/null | wc -l`
if [ $count != 0 ]
then
cp *.bash_completion /etc/bash_completion.d
fi
fi
cd ../
fi
kill $!
echo [DONE] 1>&3
cd ../
done

View File

@ -0,0 +1,13 @@
cgtsclient
python-ceilometerclient
python-cinderclient
python-glanceclient
python-heatclient
python-keystoneclient
python-neutronclient
python-novaclient
python-openstackclient
osc-lib
keystoneauth1
openstacksdk
python-muranoclient

View File

@ -0,0 +1,12 @@
requests==2.11.1
oslo.i18n==3.9.0
tablib
os-client-config==1.21.1
pyopenssl
ndg-httpsclient
pyasn1
httplib2
python-swiftclient==3.1.0
python-muranoclient==0.11.0
python-magnumclient==2.3.1
python-ironicclient==1.7.1

View File

@ -0,0 +1,22 @@
#!/bin/bash
#
# Copyright (c) 2016-2017 Wind River Systems, Inc.
#
# SPDX-License-Identifier: Apache-2.0
#
#
if [ -z "${VIRTUAL_ENV}" ]; then
if [ $EUID != 0 ]; then
echo "Root access is required. Please run with sudo or as root."
exit 1
fi
fi
# log standard output and standard error, because there is quite a lot of it
# only output what is being installed and the progress to the console (echo)
SCRIPTDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
exec 3>&1 1>> $SCRIPTDIR/client_uninstallation.log 2>&1
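# Uninstall any of the clients listed in installed_clients.txt that are
# currently present; --no-run-if-empty prevents pip uninstall from being
# invoked with no arguments when none of them are installed.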
pip freeze | grep -wF -f installed_clients.txt | xargs --no-run-if-empty pip uninstall -y

View File

@ -0,0 +1,542 @@
ntlm-auth===1.0.5
voluptuous===0.10.5
chardet===3.0.4
enum-compat===0.0.2
rsa===3.4.2
restructuredtext-lint===1.1.1
netmiko===1.4.2
instack-undercloud===7.4.5
PasteDeploy===1.5.2
typing===3.6.1
python-saharaclient===1.3.0
python-hnvclient===0.1.0
Routes===2.4.1
rtslib-fb===2.1.63
smmap===0.9.0
XStatic-Angular-Bootstrap===2.2.0.0
paunch===1.5.2
WebOb===1.7.3
sphinxcontrib-actdiag===0.8.5
pecan===1.2.1
ryu===4.15
os-api-ref===1.4.0
oslo.concurrency===3.21.1
websocket-client===0.44.0
osprofiler===1.11.0
bandit===1.4.0
tabulate===0.7.7
python-ironic-inspector-client===2.1.0
lxml===3.8.0
jdcal===1.3
python-kingbirdclient===0.2.0
setproctitle===1.1.10
pytest===3.1.3
python-etcd===0.4.5
cursive===0.1.2
oslo.service===1.25.1
django-appconf===1.0.2
pykerberos===1.1.14
certifi===2017.4.17
sphinxcontrib-nwdiag===0.9.5
requests-aws===0.1.8
alabaster===0.7.10
pbr===3.1.1
munch===2.2.0
microversion-parse===0.1.4
Pint===0.8.1
oslo.i18n===3.17.1
jsonpath-rw-ext===1.1.2
python-mistralclient===3.1.4
oslo.context===2.17.1
python-senlinclient===1.4.0
rcssmin===1.0.6
pycadf===2.6.0
grpcio===1.4.0
pysendfile===2.0.1
fixtures===3.0.0
neutron-lib===1.9.1
pystache===0.5.4
XStatic-Font-Awesome===4.7.0.0
nose===1.3.7
click-spinner===0.1.7
nosehtmloutput===0.0.5
waitress===1.0.2
os-refresh-config===7.1.0
jsbeautifier===1.6.14;python_version=='3.4'
jsbeautifier===1.6.14;python_version=='3.5'
pysnmp===4.3.9
sphinxcontrib-websupport===1.0.1
Mako===1.0.7
XStatic-angular-ui-router===0.3.1.2
pyScss===1.3.4
XStatic-jQuery===1.10.2.1
jsonmodels===2.1.5
ddt===1.1.1
ipaddress===1.0.18
python-freezerclient===1.5.0
os-xenapi===0.2.0
python-vitrageclient===1.4.0
nosexcover===1.0.11
krest===1.3.1
psycopg2===2.7.3
networkx===1.11
bashate===0.5.1
XStatic-Angular===1.5.8.0
pyngus===2.2.1
Pillow===4.2.1
python-mimeparse===1.6.0
tripleo-common===7.6.5
Tempita===0.5.2
ply===3.10
requests-toolbelt===0.8.0
simplejson===3.11.1
suds-jurko===0.6
python-swiftclient===3.4.0
pyOpenSSL===17.2.0
monasca-common===2.3.0
hyperframe===4.0.2;python_version=='3.4'
hyperframe===4.0.2;python_version=='3.5'
cssutils===1.0.2;python_version=='3.4'
cssutils===1.0.2;python_version=='3.5'
scipy===0.19.1
MySQL-python===1.2.5;python_version=='2.7'
XStatic-Jasmine===2.4.1.1
python-glanceclient===2.8.0
pyinotify===0.9.6
debtcollector===1.17.1
requests-unixsocket===0.1.5
odfpy===1.3.5
asn1crypto===0.22.0
croniter===0.3.17
python-watcherclient===1.3.0
MarkupSafe===1.0
pypowervm===1.1.6
doc8===0.8.0
pymongo===3.4.0
sqlparse===0.2.3
oslotest===2.17.1
jsonpointer===1.10
netaddr===0.7.19
pyghmi===1.0.22
sphinxcontrib-blockdiag===1.5.5
kaitaistruct===0.6;python_version=='3.4'
kaitaistruct===0.6;python_version=='3.5'
gnocchiclient===3.3.1
wcwidth===0.1.7
jsonpath-rw===1.4.0
prettytable===0.7.2
vine===1.1.4
taskflow===2.14.1
traceback2===1.4.0
semantic-version===2.6.0
tablib===0.11.5
virtualbmc===1.2.0
deprecation===1.0.1
SQLAlchemy===1.1.12
pyroute2===0.4.21
google-auth===1.0.1
kazoo===2.4.0
XStatic-roboto-fontface===0.5.0.0
pyudev===0.21.0
eventlet===0.20.0
openstack-doc-tools===1.6.0
frozendict===1.2
oslo.messaging===5.30.1
extras===1.0.0
PyJWT===1.5.2
et-xmlfile===1.0.1
paramiko===2.2.1
ordereddict===1.1
reno===2.5.0
unicodecsv===0.14.1
imagesize===0.7.1
pathlib===1.0.1;python_version=='2.7'
urllib3===1.22
graphviz===0.8
PyKMIP===0.6.0
python-subunit===1.2.0
tornado===4.4.3;python_version=='3.4'
tornado===4.4.3;python_version=='3.5'
pycparser===2.18
mock===2.0.0
PyYAML===3.12
beautifulsoup4===4.6.0
os-net-config===7.3.2
ovs===2.7.0
cryptography===2.0.2
backports.ssl-match-hostname===3.5.0.1
pylxd===2.2.4
anyjson===0.3.3
requests-mock===1.3.0
os-apply-config===7.2.1
oslosphinx===4.15.2
mox3===0.23.0
gunicorn===19.7.1
unittest2===1.1.0
django-compressor===2.1.1
libvirt-python===3.5.0
python-zunclient===0.4.1
tzlocal===1.4
python-novaclient===9.1.1
bcrypt===3.1.3
os-client-config===1.28.0
XStatic-Angular-Gettext===2.3.8.0
Pygments===2.2.0
XStatic-Hogan===2.0.0.2
XStatic-objectpath===1.2.1.0
python-manilaclient===1.17.2
requests===2.18.2
snowballstemmer===1.2.1
Jinja2===2.9.6
XStatic-Bootstrap-SCSS===3.3.7.1
pyzabbix===0.7.4
ptyprocess===0.5.2
amqp===2.2.1
ruamel.yaml===0.13.14;python_version=='3.4'
ruamel.yaml===0.13.14;python_version=='3.5'
websockify===0.8.0
html2text===2016.9.19;python_version=='3.4'
html2text===2016.9.19;python_version=='3.5'
XStatic-JQuery.quicksearch===2.0.3.1
mpmath===0.19
XStatic-JQuery-Migrate===1.2.1.1
appdirs===1.4.3
tinyrpc===0.5
Flask-SQLAlchemy===2.2
daiquiri===1.2.1
influxdb===4.1.1
funcparserlib===0.3.6
passlib===1.7.1
dib-utils===0.0.11
xlwt===1.2.0
cliff===2.8.0
os-brick===1.15.4
trollius===2.1
scp===0.10.2
xlrd===1.0.0
python-zaqarclient===1.7.0
funcsigs===1.0.2;python_version=='2.7'
zhmcclient===0.14.0
dnspython3===1.15.0;python_version=='3.4'
dnspython3===1.15.0;python_version=='3.5'
ldappool===2.1.0
termcolor===1.1.0
hpack===3.0.0;python_version=='3.4'
hpack===3.0.0;python_version=='3.5'
hiredis===0.2.0
google-api-python-client===1.6.2
castellan===0.12.1
oslo.versionedobjects===1.26.1
webcolors===1.7
aodhclient===0.9.0
autobahn===17.7.1
SQLAlchemy-Utils===0.32.14
coverage===4.4.1
freezegun===0.3.9
python-pytun===2.2.1
pyperclip===1.5.27
cassandra-driver===3.11.0
mox===0.5.3
XStatic-Angular-Schema-Form===0.8.13.0
gabbi===1.35.0
nwdiag===1.0.4
XStatic-bootswatch===3.3.7.0
XStatic-term.js===0.0.7.0
oslo.log===3.30.1
nodeenv===1.1.4
pylev===1.3.0
python-searchlightclient===1.2.0
oslo.middleware===3.30.1
brotlipy===0.6.0;python_version=='3.4'
brotlipy===0.6.0;python_version=='3.5'
XStatic-mdi===1.4.57.0
django-pyscss===2.0.2
uritemplate===3.0.0
django-babel===0.6.1
docutils===0.13.1
notifier===1.0.3
pycrypto===2.6.1
ujson===1.35
selenium===3.4.3
python-glareclient===0.4.3
mypy===0.521;python_version=='3.4'
mypy===0.521;python_version=='3.5'
mistral-lib===0.3.3
python-masakariclient===3.0.1
dogtag-pki===10.3.5.1
sphinxcontrib-seqdiag===0.8.5
os-win===2.2.0
pydot3===1.0.9
retrying===1.3.3
pathlib2===2.3.0
pydotplus===2.0.2
flask-oslolog===0.1
urwid===1.3.1;python_version=='3.4'
urwid===1.3.1;python_version=='3.5'
singledispatch===3.4.0.3;python_version=='2.7'
oslo.serialization===2.20.1
warlock===1.2.0
exabgp===4.0.2
sphinxcontrib-httpdomain===1.5.0
thriftpy===0.3.9;python_version=='2.7'
murano-pkg-check===0.3.0
oslo.vmware===2.23.1
sqlalchemy-migrate===0.11.0
gitdb===0.6.4
python-monascaclient===1.7.0
ldap3===2.2.4
requests-ntlm===1.0.0
automaton===1.12.1
argh===0.26.2;python_version=='3.4'
argh===0.26.2;python_version=='3.5'
os-service-types===1.0.0
keyring===10.4.0
testscenarios===0.5.0
sphinxcontrib-pecanwsme===0.8.0
enum34===1.1.6
packaging===16.8
flask-keystone===0.2
nose-exclude===0.5.0
psutil===5.2.2
py===1.4.34
txaio===2.8.1
elasticsearch===2.4.1
django-nose===1.4.4
XStatic-JQuery.TableSorter===2.14.5.1
pifpaf===1.9.2
pysmi===0.1.3
blockdiag===1.5.3
testtools===2.3.0
Parsley===1.3
XStatic-tv4===1.2.7.0
positional===1.1.2
XStatic-JSEncrypt===2.3.1.1
python-cinderclient===3.1.0
keystonemiddleware===4.17.0
django-formtools===2.0
python-ceilometerclient===2.9.0
XStatic-Spin===1.2.5.2
os-traits===0.3.3
SecretStorage===2.3.1
XStatic-Rickshaw===1.5.0.0
iso8601===0.1.11
tooz===1.58.0
linecache2===1.0.0
oauth2client===4.1.2
idna===2.5
python-karborclient===0.6.0
weakrefmethod===1.0.3;python_version=='2.7'
PuLP===1.6.8
crc16===0.1.1
protobuf===3.3.0
os-dpm===1.1.0
sushy===1.1.0
python-neutronclient===6.5.0
pika===0.10.0
oslo.cache===1.25.1
WebTest===2.0.27
openstack.nose-plugin===0.11
os-collect-config===7.2.1
python-qpid-proton===0.17.0
python-octaviaclient===1.2.0
pysaml2===4.0.2
oslo.reports===1.22.1
ceilometermiddleware===1.1.0
python-nss===1.0.1
testrepository===0.0.20
sympy===1.1.1
sphinxmark===0.1.19
openpyxl===2.4.8
PyNaCl===1.1.2
osc-lib===1.7.0
python-consul===0.7.0
seqdiag===0.9.5
numpy===1.13.1
repoze.who===2.3
Sphinx===1.6.3
oslo.config===4.11.1
tempest===17.0.0
django-floppyforms===1.7.0
openstackdocstheme===1.16.1
progressbar2===3.32.1
zake===0.2.2
python-solumclient===2.5.0
PyMySQL===0.7.11
kubernetes===2.0.0
httplib2===0.10.3
bottle===0.12.13
betamax===0.8.0
construct===2.8.12
pyparsing===2.2.0
dogpile.cache===0.6.4
python-barbicanclient===4.5.2
blinker===1.4;python_version=='3.4'
blinker===1.4;python_version=='3.5'
tricircleclient===0.1.1
WSME===0.9.2
msgpack-python===0.4.8
proboscis===1.2.6.0
fortiosclient===0.0.2
stevedore===1.25.1
botocore===1.5.89
xmltodict===0.11.0
pyasn1===0.3.1
python-utils===2.2.0
oslo.rootwrap===5.9.1
Django===1.11.3
pexpect===4.2.1
cmd2===0.7.5
redis===2.10.5
jmespath===0.9.3
click===6.7
docker-pycreds===0.2.1
XStatic-smart-table===1.4.13.2
kuryr-lib===0.5.0
scrypt===0.8.0
jsonpatch===1.16
typed-ast===1.0.4;python_version=='3.4'
typed-ast===1.0.4;python_version=='3.5'
os-testr===0.8.2
cotyledon===1.6.8
stomp.py===4.1.18
xattr===0.9.2
systemd-python===234
python-memcached===1.58
openstacksdk===0.9.17
six===1.10.0
h2===2.6.2;python_version=='3.4'
h2===2.6.2;python_version=='3.5'
dulwich===0.17.3
pykafka===2.6.0
kombu===4.1.0
mitmproxy===2.0.2;python_version=='3.4'
mitmproxy===2.0.2;python_version=='3.5'
betamax-matchers===0.4.0
yaql===1.1.3
requestsexceptions===1.3.0
testresources===2.0.1
falcon===1.2.0
etcd3gw===0.1.0
pycryptodome===3.4.6
pyldap===2.4.37
Flask-RESTful===0.3.6
GitPython===2.1.5
python-ironicclient===1.17.0
XStatic===1.0.1
click-repl===0.1.2
XStatic-Angular-FileUpload===12.0.4.0
python-openstackclient===3.12.0
pika-pool===0.1.3
pyzmq===16.0.2
EditorConfig===0.12.1;python_version=='3.4'
EditorConfig===0.12.1;python_version=='3.5'
oslo.db===4.25.1
simplegeneric===0.8.1
abclient===0.2.3
pymemcache===1.4.3
wrapt===1.10.10
oslo.privsep===1.22.1
zope.interface===4.4.2
oslo.policy===1.25.2
python-muranoclient===0.14.0
pyeclib===1.5.0
django-openstack-auth===3.5.0
wsgi-intercept===1.5.0
ndg-httpsclient===0.4.2;python_version=='2.7'
tempest-lib===1.0.0
spec-cleaner===0.9.8
repoze.lru===0.6
rfc3986===1.1.0
tenacity===4.4.0
XStatic-Magic-Search===0.2.5.1
python-designateclient===2.7.0
Paste===2.0.3
boto===2.48.0
functools32===3.2.3.post2;python_version=='2.7'
watchdog===0.8.3;python_version=='3.4'
watchdog===0.8.3;python_version=='3.5'
gevent===1.2.2
os-vif===1.7.0
Werkzeug===0.12.2
pyasn1-modules===0.0.10
APScheduler===3.3.1
monotonic===1.3
python-smaugclient===0.0.8
python-troveclient===2.12.0
etcd3===0.6.2
cliff-tablib===2.0
XStatic-Bootstrap-Datepicker===1.3.1.0
CouchDB===1.1
netifaces===0.10.6
cachetools===2.0.0
ws4py===0.4.2
backports-abc===0.5;python_version=='3.4'
backports-abc===0.5;python_version=='3.5'
keystoneauth1===3.1.0
statsd===3.2.1
XenAPI===1.2
python-keystoneclient===3.13.0
demjson===2.2.4
diskimage-builder===2.7.2
heat-translator===0.9.0
python-magnumclient===2.7.0
docker===2.4.2
prompt-toolkit===1.0.15
pathtools===0.1.2;python_version=='3.4'
pathtools===0.1.2;python_version=='3.5'
qpid-python===1.36.0.post1;python_version=='2.7'
contextlib2===0.5.5
XStatic-Angular-lrdragndrop===1.0.2.2
python-congressclient===1.8.0
ovsdbapp===0.4.1
aniso8601===1.2.1
rjsmin===1.0.12
icalendar===3.11.5
decorator===4.1.2
cffi===1.10.0
futurist===1.3.1
jsonschema===2.6.0
alembic===0.9.3
glance-store===0.22.0
sphinx-testing===0.7.2
dnspython===1.15.0
oauthlib===2.0.2
Babel===2.3.4
logutils===0.3.5
scandir===1.5
sphinxcontrib-fulltoc===1.2.0
smmap2===2.0.3
olefile===0.44
greenlet===0.4.12
xvfbwrapper===0.2.9
futures===3.1.1;python_version=='2.7'
tosca-parser===0.8.1
Flask===0.12.2
pymod2pkg===0.8.4
happybase===1.1.0;python_version=='2.7'
marathon===0.9.0
docker-py===1.10.6
fasteners===0.14.1
sortedcontainers===1.5.7;python_version=='3.4'
sortedcontainers===1.5.7;python_version=='3.5'
python-tackerclient===0.10.0
python-heatclient===1.11.1
kafka-python===1.3.3
oslo.utils===3.28.1
python-editor===1.0.3
gitdb2===2.0.2
requests-kerberos===0.11.0
itsdangerous===0.24
XStatic-jquery-ui===1.12.0.1
monasca-statsd===1.7.0
python-dateutil===2.6.1
virtualenv===15.1.0
colorama===0.3.9
ironic-lib===2.10.0
pytz===2017.2
XStatic-D3===3.5.17.0
actdiag===0.5.4
sysv-ipc===0.7.0
scikit-learn===0.18.2
zuul-sphinx===0.1.2
shade===1.22.2