# gptify

`gptify` is a command-line tool that transforms a Git repository into a single text file or multiple text chunks suitable for use with Large Language Models (LLMs) like ChatGPT. It preserves the file structure and content, enabling LLMs to understand and process the codebase for tasks such as code review, documentation generation, and answering questions about the code. This project is a fork of [gptrepo](https://github.com/zackess/gptrepo) with added features.

## Relevance

This tool addresses the challenge of effectively using LLMs with codebases. By converting a repository into a digestible format, `gptify` allows developers to leverage the power of LLMs for various development tasks. It simplifies the process of feeding code context into LLMs, avoiding size limitations and formatting issues.

## Installation

The easiest way to install `gptify` is using `pip`:

```bash
pip install gptify
```

Alternatively, you can build the package from source and install the resulting wheel with `pipx`:

```bash
poetry build && pipx install dist/*.whl
```

You can also uninstall older versions using the provided install script:

```bash
./install.sh
```

## Usage

1. **Navigate to the root directory** of your Git repository.
2. **Run the `gptify` command**:

```bash
gptify
```

This will generate a file named `gptify_output.txt` in the current directory containing the formatted repository content. You can then copy and paste the contents of this file into a ChatGPT session.

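If you prefer not to copy and paste by hand, you can also send the generated file to a model programmatically. The sketch below uses the OpenAI Python SDK; the model name and prompts are placeholders, and `gptify` itself does not call any API.

```python
# Sketch only: feed gptify's output to an LLM instead of pasting it manually.
# Assumes the `openai` package is installed and OPENAI_API_KEY is set in the
# environment; the model name below is a placeholder.
from pathlib import Path

from openai import OpenAI

repo_text = Path("gptify_output.txt").read_text(encoding="utf-8")

client = OpenAI()  # picks up OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a careful code reviewer."},
        {"role": "user", "content": f"Review this repository:\n\n{repo_text}"},
    ],
)
print(response.choices[0].message.content)
```
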
### Options

* `--output <filename>`: Specifies the name of the output file (default: `gptify_output.txt`).
* `--clipboard`: Copies the output directly to the clipboard instead of creating an output file.
* `--openfile`: Opens the output file after creation using the default system application.
* `--preamble <filepath>`: Prepends a custom preamble to the output file. This is useful for providing instructions or context to the LLM.
* `--chunk`: Enables chunking of the output into smaller files, useful for handling large repositories that exceed LLM context limits. Used with `--max_tokens` and `--overlap` (see the sketch after this list).
* `--max_tokens <n>`: Sets the maximum number of tokens per chunk when using the `--chunk` option (default: 900000). Requires the `tiktoken` library.
* `--overlap <n>`: Sets the number of overlapping tokens between chunks when using the `--chunk` option (default: 400). Helps maintain context across chunks. Requires the `tiktoken` library.
* `--output_dir <directory>`: Specifies the output directory for chunks when using `--chunk` (default: `gptify_output_chunks`).
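
The sketch below illustrates how token-based chunking with overlap works in principle. It is not `gptify`'s internal implementation, and the encoding name is an assumption; it only shows how `--max_tokens` and `--overlap` interact.

```python
# Illustration of what --max_tokens and --overlap mean, not gptify's own code.
# Requires the `tiktoken` package; the encoding choice is an assumption.
import tiktoken


def chunk_text(text: str, max_tokens: int = 900000, overlap: int = 400) -> list[str]:
    """Split text into chunks of at most max_tokens tokens, overlapping by `overlap`."""
    enc = tiktoken.get_encoding("cl100k_base")
    tokens = enc.encode(text)
    chunks = []
    step = max_tokens - overlap  # consecutive chunks share `overlap` tokens
    for start in range(0, len(tokens), step):
        window = tokens[start:start + max_tokens]
        chunks.append(enc.decode(window))
        if start + max_tokens >= len(tokens):
            break
    return chunks
```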

## Example with custom output file and preamble:

```bash
gptify --output my_repo.txt --preamble instructions.txt
```

This command will generate `my_repo.txt` with the processed repository data, prepended with the content of `instructions.txt`.

## Example with chunking:

```bash
gptify --chunk --max_tokens 4000 --overlap 200
```

This will create multiple files in the `gptify_output_chunks` directory, each containing a chunk of the repository data, with at most 4000 tokens per chunk and an overlap of 200 tokens between consecutive chunks.

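Once the chunks exist, you can iterate over them and query a model one chunk at a time. The sketch below only reads the files; the directory and `.txt` naming are assumptions based on the default `--output_dir`, so adjust the glob if your filenames differ.

```python
# Sketch: walk the generated chunks in order and build one prompt per chunk.
# The directory and file pattern are assumptions based on the default output dir.
from pathlib import Path

chunk_dir = Path("gptify_output_chunks")
for chunk_file in sorted(chunk_dir.glob("*.txt")):
    chunk = chunk_file.read_text(encoding="utf-8")
    prompt = (
        "This is one part of a larger repository. "
        "Summarize what this part does:\n\n" + chunk
    )
    print(f"--- {chunk_file.name}: {len(chunk)} characters ---")
    # send `prompt` to your LLM of choice here
```
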
## Contributing

Contributions are welcome.

## License