Add check for Python version.

Add remove_model_name_local
pull/19/head
rsaryev 9 months ago
parent ebf672cc7c
commit 3d3e2dabd5

README.md
@@ -1,7 +1,11 @@
# talk-codebase
[![Python Package](https://github.com/rsaryev/talk-codebase/actions/workflows/python-publish.yml/badge.svg)](https://github.com/rsaryev/talk-codebase/actions/workflows/python-publish.yml)
Talk-codebase is a tool that allows you to converse with your codebase using Large Language Models (LLMs) to answer your queries. It supports offline code processing using LlamaCpp and [GPT4All](https://github.com/nomic-ai/gpt4all) without sharing your code with third parties, or you can use OpenAI if privacy is not a concern for you. Please note that talk-codebase is still under development and is recommended for educational purposes, not for production use.
Talk-codebase is a tool that allows you to converse with your codebase using Large Language Models (LLMs) to answer your
queries. It supports offline code processing using LlamaCpp and [GPT4All](https://github.com/nomic-ai/gpt4all) without
sharing your code with third parties, or you can use OpenAI if privacy is not a concern for you. Please note that
talk-codebase is still under development and is recommended for educational purposes, not for production use.
<p align="center">
<img src="https://github.com/rsaryev/talk-codebase/assets/70219513/b5d338f9-14a5-417b-9690-83f5cd66facf" width="800" alt="chat">
@@ -9,6 +13,8 @@ Talk-codebase is a tool that allows you to converse with your codebase using Lar
## Installation
Requires Python 3.8.1 or higher
```bash
pip install talk-codebase
```
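The requirement line above matches the interpreter guard this commit adds to `talk_codebase/cli.py`. As a minimal stand-alone sketch (same threshold and exit behaviour as the guard in the cli.py hunk further down), the check amounts to:

```python
# Minimal sketch of the version guard added in this commit: refuse to run
# on interpreters older than Python 3.8.1.
import sys

if sys.version_info < (3, 8, 1):
    print("🤖 Please use Python 3.8.1 or higher")
    sys.exit(1)
```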
@@ -25,7 +31,8 @@ Select model type: Local or OpenAI
OpenAI
If you use the OpenAI model, you need an OpenAI API key. You can get it from [here](https://beta.openai.com/). Then you will be offered a choice of available models.
If you use the OpenAI model, you need an OpenAI API key. You can get it from [here](https://beta.openai.com/). Then you
will be offered a choice of available models.
<img width="300" alt="select" src="https://github.com/rsaryev/talk-codebase/assets/70219513/889ad7c8-a489-4ce8-83af-148b7df09229">
@@ -46,7 +53,8 @@ talk-codebase configure
## Advanced configuration
You can manually edit the configuration by editing the `~/.config.yaml` file. If you cannot find the configuration file, run the tool and it will output the path to the configuration file at the very beginning.
You can edit the configuration manually in the `~/.config.yaml` file. If you cannot find the configuration file,
run the tool and it will print the path to the configuration file at startup.
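As an alternative to editing the file by hand, the config helpers touched in this commit can be reused from Python. A minimal sketch, assuming `get_config()` and `save_config()` read and write the YAML config as they do in `talk_codebase/config.py`, with values stored as strings as in `DEFAULT_CONFIG` (the `"k"` key is the one whose default this commit bumps from "1" to "2"):

```python
# Sketch: adjust one setting without opening ~/.config.yaml in an editor.
# Assumes get_config()/save_config() behave as in talk_codebase/config.py.
from talk_codebase.config import get_config, save_config

config = get_config()
config["k"] = "4"  # values are kept as strings, matching DEFAULT_CONFIG
save_config(config)
```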
## Supported Extensions
@@ -61,6 +69,11 @@ You can manually edit the configuration by editing the `~/.config.yaml` file. If
## Contributing
* If you find a bug in talk-codebase, please report it on the project's issue tracker. When reporting a bug, please include as much information as possible, such as the steps to reproduce the bug, the expected behavior, and the actual behavior.
* If you have an idea for a new feature for Talk-codebase, please open an issue on the project's issue tracker. When suggesting a feature, please include a brief description of the feature, as well as any rationale for why the feature would be useful.
* You can contribute to talk-codebase by writing code. The project is always looking for help with improving the codebase, adding new features, and fixing bugs.
* If you find a bug in talk-codebase, please report it on the project's issue tracker. When reporting a bug, please
include as much information as possible, such as the steps to reproduce the bug, the expected behavior, and the actual
behavior.
* If you have an idea for a new feature for Talk-codebase, please open an issue on the project's issue tracker. When
suggesting a feature, please include a brief description of the feature, as well as any rationale for why the feature
would be useful.
* You can contribute to talk-codebase by writing code. The project is always looking for help with improving the
codebase, adding new features, and fixing bugs.

pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "talk-codebase"
version = "0.1.45"
version = "0.1.46"
description = "talk-codebase is a powerful tool for querying and analyzing codebases."
authors = ["Saryev Rustam <rustam1997@gmail.com>"]
readme = "README.md"

talk_codebase/cli.py
@@ -1,11 +1,19 @@
import sys
import fire
from talk_codebase.config import CONFIGURE_STEPS, save_config, get_config, config_path, remove_api_key, \
    remove_model_type
    remove_model_type, remove_model_name_local
from talk_codebase.consts import DEFAULT_CONFIG
from talk_codebase.llm import factory_llm


def check_python_version():
    if sys.version_info < (3, 8, 1):
        print("🤖 Please use Python 3.8.1 or higher")
        sys.exit(1)


def update_config(config):
    for key, value in DEFAULT_CONFIG.items():
        if key not in config:
@@ -17,6 +25,7 @@ def configure(reset=True):
    if reset:
        remove_api_key()
        remove_model_type()
        remove_model_name_local()
    config = get_config()
    config = update_config(config)
    for step in CONFIGURE_STEPS:
@@ -43,6 +52,7 @@ def chat(root_dir=None):
def main():
    check_python_version()
    print(f"🤖 Config path: {config_path}:")
    try:
        fire.Fire({

talk_codebase/config.py
@@ -102,6 +102,12 @@ def configure_model_name_local(config):
print("🤖 Model name saved!")
def remove_model_name_local():
config = get_config()
config["local_model_name"] = None
save_config(config)
def get_and_validate_api_key():
prompt = "🤖 Enter your OpenAI API key: "
api_key = input(prompt)

talk_codebase/consts.py
@@ -20,7 +20,7 @@ DEFAULT_CONFIG = {
"max_tokens": "2056",
"chunk_size": "2056",
"chunk_overlap": "256",
"k": "1",
"k": "2",
"temperature": "0.7",
"model_path": DEFAULT_MODEL_DIRECTORY,
"n_batch": "8",

talk_codebase/llm.py
@@ -94,6 +94,7 @@ class LocalLLM(BaseLLM):
        model_n_batch = int(self.config.get("n_batch"))
        callbacks = CallbackManager([StreamStdOut()])
        llm = LlamaCpp(model_path=model_path, n_ctx=model_n_ctx, n_batch=model_n_batch, callbacks=callbacks, verbose=False)
        llm.client.verbose = False
        return llm
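A note on the added `llm.client.verbose = False` line: LangChain's `LlamaCpp` wrapper keeps the underlying `llama_cpp.Llama` object on `.client`, and that object carries its own `verbose` flag, which is presumably why the commit flips it off in addition to passing `verbose=False` to the wrapper. A minimal sketch of the same pattern in isolation (the model path is a placeholder):

```python
# Sketch (assumption: LangChain's LlamaCpp exposes the raw llama_cpp.Llama
# instance as `.client`, which is what the change above relies on).
from langchain.llms import LlamaCpp

llm = LlamaCpp(model_path="/path/to/model.bin", verbose=False)  # wrapper-side logging off
llm.client.verbose = False  # also silence llama.cpp's own load/timing output
```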
