
Commit bc4c0ac

Authored by moshemorad, nherment, and Avi-Robusta
Precommit checks (#257)
Co-authored-by: Nicolas Herment <[email protected]>
Co-authored-by: Nicolas Herment <[email protected]>
Co-authored-by: Avi-Robusta <[email protected]>
1 parent e04a5cd commit bc4c0ac

File tree

137 files changed

+1454
-1077
lines changed


.gitattributes

+1-1
@@ -1 +1 @@
-holmes/.git_archival.json export-subst
+holmes/.git_archival.json export-subst

.github/workflows/build-binaries-and-brew.yaml

+5-5
@@ -22,7 +22,7 @@ jobs:
         uses: actions/setup-python@v2
         with:
           python-version: '3.11'
-
+
       - name: Install dependencies
         if: matrix.os != 'windows-latest'
         run: |
@@ -43,7 +43,7 @@ jobs:
         if: matrix.os == 'ubuntu-20.04'
         run: |
           sudo apt-get install -y binutils
-
+
       - name: Update package version (Linux)
         if: matrix.os == 'ubuntu-20.04'
         run: sed -i 's/__version__ = .*/__version__ = "${{ github.ref_name }}"/g' holmes/__init__.py
@@ -67,7 +67,7 @@ jobs:
         # regarding the tiktoken part of the command, see https://github.com/openai/tiktoken/issues/80
         # regarding the litellm part of the command, see https://github.com/pyinstaller/pyinstaller/issues/8620#issuecomment-2186540504
         run: |
-          pyinstaller holmes.py --add-data 'holmes/plugins/runbooks/*:holmes/plugins/runbooks' --add-data 'holmes/plugins/prompts/*:holmes/plugins/prompts' --add-data 'holmes/plugins/toolsets/*:holmes/plugins/toolsets' --hidden-import=tiktoken_ext.openai_public --hidden-import=tiktoken_ext --hiddenimport litellm.llms.tokenizers --hiddenimport litellm.litellm_core_utils.tokenizers --collect-data litellm
+          pyinstaller holmes.py --add-data 'holmes/plugins/runbooks/*:holmes/plugins/runbooks' --add-data 'holmes/plugins/prompts/*:holmes/plugins/prompts' --add-data 'holmes/plugins/toolsets/*:holmes/plugins/toolsets' --hidden-import=tiktoken_ext.openai_public --hidden-import=tiktoken_ext --hiddenimport litellm.llms.tokenizers --hiddenimport litellm.litellm_core_utils.tokenizers --collect-data litellm
           ls dist

       - name: Zip the application (Unix)
@@ -91,7 +91,7 @@ jobs:
         env:
           GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
         with:
-          upload_url: ${{ github.event.release.upload_url }}
+          upload_url: ${{ github.event.release.upload_url }}
           asset_path: ./holmes-${{ matrix.os }}-${{ github.ref_name }}.zip
           asset_name: holmes-${{ matrix.os }}-${{ github.ref_name }}.zip
           asset_content_type: application/octet-stream
@@ -105,7 +105,7 @@ jobs:
   check-latest:
     needs: build
     runs-on: ubuntu-20.04
-    outputs:
+    outputs:
       IS_LATEST: ${{ steps.check-latest.outputs.release == github.ref_name }}
     steps:
       - id: check-latest

.github/workflows/build-docker-images.yaml

+1-1
@@ -77,7 +77,7 @@ jobs:
         # Note: this ignores the "Set as latest release" checkbox in the GitHub UI
         # it isn't possible to check whether that was set or not
         # so if you do not want to override the "latest" tag, you should mark the release as a prerelease or a draft
-        # for prereleases and drafts we don't tag latest
+        # for prereleases and drafts we don't tag latest
       - name: Tag and push Docker image as latest if applicable
         if: ${{ github.event.release.prerelease == false && github.event.release.draft == false }}
         run: |

.pre-commit-config.yaml

+13-1
@@ -6,4 +6,16 @@ repos:
       - id: poetry-lock
         pass_filenames: false
         args:
-          - --no-update
+          - --no-update
+  - repo: https://github.com/astral-sh/ruff-pre-commit
+    rev: v0.7.2
+    hooks:
+      - id: ruff
+        entry: ruff check --fix
+      - id: ruff-format
+  - repo: https://github.com/pre-commit/pre-commit-hooks
+    rev: v5.0.0
+    hooks:
+      - id: detect-private-key
+      - id: end-of-file-fixer
+      - id: trailing-whitespace

CONTRIBUTING.md

+2-2
@@ -15,15 +15,15 @@ Please make sure to read and observe our [Code of Conduct](https://github.com/ro

 ## Reporting bugs

-We encourage those interested to contribute code and also appreciate when issues are reported.
+We encourage those interested to contribute code and also appreciate when issues are reported.

 - Create a new issue and label is as `bug`
 - Clearly state how to reproduce the bug:
   - Which LLM you've used
   - Which steps are required to reproduce
   - As LLMs answers may differ between runs - Does it always reproduce, or occasionally?

-
+
 ## Contributing Code

 - Fork the repository and clone it locally.

Dockerfile.dev

+1-1
@@ -59,7 +59,7 @@ ARG PRIVATE_PACKAGE_REGISTRY="none"
 RUN if [ "${PRIVATE_PACKAGE_REGISTRY}" != "none" ]; then \
         pip config set global.index-url "${PRIVATE_PACKAGE_REGISTRY}"; \
     fi \
-    && pip install poetry
+    && pip install poetry
 ARG POETRY_REQUESTS_TIMEOUT
 RUN poetry config virtualenvs.create false
 COPY pyproject.toml poetry.lock /app/

Makefile

+4
@@ -0,0 +1,4 @@
+
+
+check:
+	poetry run pre-commit run -a

README.md

+1-1
@@ -875,7 +875,7 @@ Configure Slack to send notifications to specific channels. Provide your Slack t
 The OpenSearch toolset (`opensearch`) allows Holmes to consult an opensearch cluster for its health, settings and shards information.
 The toolset supports multiple opensearch or elasticsearch clusters that are configured by editing Holmes' configuration file:

-```
+```
 opensearch_clusters:
   - hosts:
     - https://my_elasticsearch.us-central1.gcp.cloud.es.io:443

examples/custom_llm.py

+29-27
@@ -1,17 +1,14 @@
-
 from typing import Any, Dict, List, Optional, Type, Union
-from holmes.config import Config
 from holmes.core.llm import LLM
 from litellm.types.utils import ModelResponse
 from holmes.core.tool_calling_llm import ToolCallingLLM
 from holmes.core.tools import Tool, ToolExecutor
 from holmes.plugins.toolsets import load_builtin_toolsets
-from rich.console import Console
 from pydantic import BaseModel
 from holmes.plugins.prompts import load_and_render_prompt
-import sys
-class MyCustomLLM(LLM):

+
+class MyCustomLLM(LLM):
     def get_context_window_size(self) -> int:
         return 128000

@@ -21,36 +18,41 @@ def get_maximum_output_token(self) -> int:
     def count_tokens_for_message(self, messages: list[dict]) -> int:
         return 1

-    def completion(self, messages: List[Dict[str, Any]], tools: Optional[List[Tool]] = [], tool_choice: Optional[Union[str, dict]] = None, response_format: Optional[Union[dict, Type[BaseModel]]] = None, temperature:Optional[float] = None, drop_params: Optional[bool] = None) -> ModelResponse:
-        return ModelResponse(choices=[{
-            "finish_reason": "stop",
-            "index": 0,
-            "message": {
-                "role": "assistant",
-                "content": "There are no issues with your cluster"
-            }
-        }],
-        usage={
-            "prompt_tokens": 0,  # Integer
-            "completion_tokens": 0,
-            "total_tokens": 0
-        }
-        )
+    def completion(
+        self,
+        messages: List[Dict[str, Any]],
+        tools: Optional[List[Tool]] = [],
+        tool_choice: Optional[Union[str, dict]] = None,
+        response_format: Optional[Union[dict, Type[BaseModel]]] = None,
+        temperature: Optional[float] = None,
+        drop_params: Optional[bool] = None,
+    ) -> ModelResponse:
+        return ModelResponse(
+            choices=[
+                {
+                    "finish_reason": "stop",
+                    "index": 0,
+                    "message": {
+                        "role": "assistant",
+                        "content": "There are no issues with your cluster",
+                    },
+                }
+            ],
+            usage={
+                "prompt_tokens": 0,  # Integer
+                "completion_tokens": 0,
+                "total_tokens": 0,
+            },
+        )


 def ask_holmes():
-    console = Console()
-
     prompt = "what issues do I have in my cluster"

     system_prompt = load_and_render_prompt("builtin://generic_ask.jinja2")

     tool_executor = ToolExecutor(load_builtin_toolsets())
-    ai = ToolCallingLLM(
-        tool_executor,
-        max_steps=10,
-        llm=MyCustomLLM()
-    )
+    ai = ToolCallingLLM(tool_executor, max_steps=10, llm=MyCustomLLM())

     response = ai.prompt_call(system_prompt, prompt)

examples/custom_runbooks.yaml

+1-1
@@ -4,4 +4,4 @@ runbooks:
   instructions: >
     Analyze pod logs for errors and also read the monogodb logs
     Correlate between the two logs and try to find the root cause of the issue.
-    Based on the logs, report the session ids of impacted transactions
+    Based on the logs, report the session ids of impacted transactions

examples/custom_toolset.yaml

+2-2
@@ -11,7 +11,7 @@ toolsets:
     docs_url: "https://kubernetes.io/docs/home/"
     # Icon URL. Used for display in the UI
     icon_url: "https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcRPKA-U9m5BxYQDF1O7atMfj9EMMXEoGu4t0Q&s"
-    # Tags for categorizing toolsets, 'core' will be used for all Holmes features (both cli's commands and chats in UI).
+    # Tags for categorizing toolsets, 'core' will be used for all Holmes features (both cli's commands and chats in UI).
     # The 'cluster' tag is used for UI functionality, while 'cli' is for for command-line specific tools
     tags:
       - core
@@ -24,7 +24,7 @@ toolsets:
       - name: "switch_cluster"
         # The LLM looks at this description when deciding what tools are relevant for each task
         description: "Used to switch between multiple kubernetes contexts(clusters)"
-
+
         # A templated bash command using Jinja2 templates
         # The LLM can only control parameters that you expose as template variables like {{ this_variable }}
         command: "kubectl config use-context {{ cluster_name }}"

helm/holmes/Chart.yaml

+1-1
@@ -7,4 +7,4 @@ type: application
 # we use 0.0.1 as a placeholder for the version` because Helm wont allow `0.0.0` and we want to be able to run
 # `helm install` on development checkouts without updating this file. the version doesn't matter in that case anyway
 version: 0.0.1
-appVersion: 0.0.0
+appVersion: 0.0.0

helm/holmes/templates/holmesgpt-service-account.yaml

+1-1
@@ -229,4 +229,4 @@ subjects:
 - kind: ServiceAccount
   name: {{ .Release.Name }}-holmes-service-account
   namespace: {{ .Release.Namespace }}
-{{- end }}
+{{- end }}

holmes/.git_archival.json

-1
@@ -5,4 +5,3 @@
   "refs": "$Format:%D$",
   "describe": "$Format:%(describe:tags=true,match=v[0-9]*)$"
 }
-

holmes/__init__.py

+30-8
@@ -4,7 +4,7 @@
 import sys

 # For relative imports to work in Python 3.6 - see https://stackoverflow.com/a/49375740
-this_path = os.path.dirname(os.path.realpath(__file__))
+this_path = os.path.dirname(os.path.realpath(__file__))
 sys.path.append(this_path)

 # This is patched by github actions during release
@@ -19,28 +19,50 @@ def get_version() -> str:
     # we are running from an unreleased dev version
     try:
         # Get the latest git tag
-        tag = subprocess.check_output(["git", "describe", "--tags"], stderr=subprocess.STDOUT, cwd=this_path).decode().strip()
+        tag = (
+            subprocess.check_output(
+                ["git", "describe", "--tags"], stderr=subprocess.STDOUT, cwd=this_path
+            )
+            .decode()
+            .strip()
+        )

         # Get the current branch name
-        branch = subprocess.check_output(["git", "rev-parse", "--abbrev-ref", "HEAD"], stderr=subprocess.STDOUT, cwd=this_path).decode().strip()
+        branch = (
+            subprocess.check_output(
+                ["git", "rev-parse", "--abbrev-ref", "HEAD"],
+                stderr=subprocess.STDOUT,
+                cwd=this_path,
+            )
+            .decode()
+            .strip()
+        )

         # Check if there are uncommitted changes
-        status = subprocess.check_output(["git", "status", "--porcelain"], stderr=subprocess.STDOUT, cwd=this_path).decode().strip()
+        status = (
+            subprocess.check_output(
+                ["git", "status", "--porcelain"],
+                stderr=subprocess.STDOUT,
+                cwd=this_path,
+            )
+            .decode()
+            .strip()
+        )
         dirty = "-dirty" if status else ""

         return f"{tag}-{branch}{dirty}"
-
+
     except Exception:
         pass

     # we are running without git history, but we still might have git archival data (e.g. if we were pip installed)
-    archival_file_path = os.path.join(this_path, '.git_archival.json')
+    archival_file_path = os.path.join(this_path, ".git_archival.json")
     if os.path.exists(archival_file_path):
         try:
-            with open(archival_file_path, 'r') as f:
+            with open(archival_file_path, "r") as f:
                 archival_data = json.load(f)
                 return f"{archival_data['refs']}-{archival_data['hash-short']}"
         except Exception:
             pass

-    return f"dev-version"
+    return "dev-version"
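The fallback branch reformatted above reads `.git_archival.json` when git history is unavailable and degrades to a fixed string. A standalone sketch of just that branch (the helper name `version_from_archival` is hypothetical, not in the codebase):

```python
import json
import os


def version_from_archival(archival_file_path: str) -> str:
    """Mimic get_version()'s fallback: read git-archival metadata if
    present, otherwise report a generic dev version."""
    if os.path.exists(archival_file_path):
        try:
            with open(archival_file_path, "r") as f:
                archival_data = json.load(f)
            # export-subst replaces "$Format:...$" with real values at archive time
            return f"{archival_data['refs']}-{archival_data['hash-short']}"
        except Exception:
            pass
    return "dev-version"
```

A missing or unparseable file silently yields `"dev-version"`, matching the broad `except Exception: pass` in the diff.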

holmes/common/env_vars.py

+8-4
@@ -7,10 +7,14 @@ def load_bool(env_var, default: bool):
     return json.loads(s.lower())


-ENABLED_BY_DEFAULT_TOOLSETS = os.environ.get('ENABLED_BY_DEFAULT_TOOLSETS', 'kubernetes/core,kubernetes/logs,robusta,internet')
-HOLMES_HOST = os.environ.get('HOLMES_HOST', '0.0.0.0')
-HOLMES_PORT = int(os.environ.get('HOLMES_PORT', 5050))
-ROBUSTA_CONFIG_PATH = os.environ.get('ROBUSTA_CONFIG_PATH', "/etc/robusta/config/active_playbooks.yaml")
+ENABLED_BY_DEFAULT_TOOLSETS = os.environ.get(
+    "ENABLED_BY_DEFAULT_TOOLSETS", "kubernetes/core,kubernetes/logs,robusta,internet"
+)
+HOLMES_HOST = os.environ.get("HOLMES_HOST", "0.0.0.0")
+HOLMES_PORT = int(os.environ.get("HOLMES_PORT", 5050))
+ROBUSTA_CONFIG_PATH = os.environ.get(
+    "ROBUSTA_CONFIG_PATH", "/etc/robusta/config/active_playbooks.yaml"
+)

 ROBUSTA_ACCOUNT_ID = os.environ.get("ROBUSTA_ACCOUNT_ID", "")
 STORE_URL = os.environ.get("STORE_URL", "")
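This hunk only reflows the `os.environ.get` calls; the underlying pattern, including the `load_bool` helper visible in the hunk's context lines, can be sketched standalone (a minimal sketch with no Holmes imports):

```python
import json
import os


def load_bool(env_var: str, default: bool) -> bool:
    # "True"/"true"/"false" all parse cleanly via json once lower-cased
    s = os.environ.get(env_var, str(default))
    return json.loads(s.lower())


# string setting with a hard-coded default, as in env_vars.py
HOLMES_HOST = os.environ.get("HOLMES_HOST", "0.0.0.0")
# numeric setting: the default may be an int, but a set env var is a string
HOLMES_PORT = int(os.environ.get("HOLMES_PORT", 5050))
```

Note the `int(...)` wrapper: environment variables are always strings, so the default `5050` and a user-supplied `"8080"` both normalize to `int`.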

holmes/config.py

+2-4
@@ -1,15 +1,13 @@
-from functools import lru_cache
 import logging
 import os
 import yaml
 import os.path

 from holmes.core.llm import LLM, DefaultLLM
 from typing import Any, Dict, List, Optional
-from typing import List, Optional


-from pydantic import FilePath, SecretStr, Field
+from pydantic import FilePath, SecretStr
 from pydash.arrays import concat


@@ -496,7 +494,7 @@ def merge_and_override_bultin_toolsets_with_toolsets_config(
     @classmethod
     def load_from_file(cls, config_file: Optional[str], **kwargs) -> "Config":
         if config_file is not None:
-            logging.debug(f"Loading config from file %s", config_file)
+            logging.debug("Loading config from file %s", config_file)
             config_from_file = load_model_from_file(cls, config_file)
         elif os.path.exists(DEFAULT_CONFIG_LOCATION):
             logging.debug(
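The `load_from_file` fix drops a stray `f` prefix: an f-string interpolates eagerly even when DEBUG logging is disabled, while `%s`-style arguments are formatted only if the record is actually emitted. A small demonstration (the logger name is arbitrary):

```python
import io
import logging

# route log output to a string so it can be inspected
stream = io.StringIO()
logger = logging.getLogger("config-demo")
logger.addHandler(logging.StreamHandler(stream))
logger.setLevel(logging.DEBUG)

# lazy %s formatting: the argument is interpolated only when
# the record is emitted (an f-string would format unconditionally)
logger.debug("Loading config from file %s", "holmes.yaml")
```

With an f-string the original line also left a literal, never-filled `%s` in the message; the corrected call lets `logging` do the substitution.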
