Dynamically load requirements.txt in pyproject.toml #15

Merged: 4 commits, Oct 7, 2023
33 changes: 23 additions & 10 deletions README.md
@@ -7,31 +7,41 @@ SPDX-License-Identifier: MIT
[![Release chap](https://github.com/jepler/chap/actions/workflows/release.yml/badge.svg?event=release)](https://github.com/jepler/chap/actions/workflows/release.yml)
[![PyPI](https://img.shields.io/pypi/v/chap)](https://pypi.org/project/chap/)

-# chap - A Python interface to chatgpt, including a terminal user interface (tui)
+# chap - A Python interface to chatgpt and other LLMs, including a terminal user interface (tui)

![Chap screencast](https://github.com/jepler/chap/blob/main/chap.gif)

## System requirements

Chap is developed on Linux with Python 3.11. Due to use of the `list[int]` style of type hints, it is known not to work on 3.8 and older; the target minimum Python version is 3.9 (debian oldstable).

-## installation
+## Installation

-Install with e.g., `pipx install chap`
+Install with e.g., `pipx install chap`, or `pip install chap` in a virtual environment.

-## configuration
+## Installation for development

+Install in developer mode e.g., with `pip install -e .`.
+In this mode, you get the "chap" commandline program installed but can edit the source files in place.
+This is the [recommended practice per PyPA](https://setuptools.pypa.io/en/latest/userguide/development_mode.html).
+
+A shim script `chap.py` is included so that the older development style of `pip install -r requirements.txt` + `python chap.py` (or `./chap.py`) functions as well.
+
+## Configuration

Put your OpenAI API key in the platform configuration directory for chap, e.g., on linux/unix systems at `~/.config/chap/openai_api_key`

-## commandline usage
+## Commandline usage

* `chap ask "What advice would you give a 20th century human visiting the 21st century for the first time?"`

-* `chap render --last`
+* `chap render --last` / `chap cat --last`

* `chap import chatgpt-style-chatlog.json` (for files from pionxzh/chatgpt-exporter)

-## interactive terminal usage
+* `chap grep needle`
+
+## Interactive terminal usage
* chap tui

## Sessions & Commandline Parameters
@@ -49,15 +59,18 @@ You can set the "system message" with the `-S` flag.

You can select the text generating backend with the `-b` flag:
* openai\_chatgpt: the default, paid API, best quality results
-* llama_cpp: Works with (llama.cpp's http server)[https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md] and can run locally with various models. Set the server URL with `-B url:...`.
-* textgen: Works with https://github.com/oobabooga/text-generation-webui and can run locally with various models. Needs the server URL in *$configuration_directory/textgen\_url*.
+* llama\_cpp: Works with [llama.cpp's http server](https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md) and can run locally with various models,
+though it is [optimized for models that use the llama2-style prompting](https://huggingface.co/blog/llama2#how-to-prompt-llama-2).
+Set the server URL with `-B url:...`.
+* textgen: Works with https://github.com/oobabooga/text-generation-webui and can run locally with various models.
+Needs the server URL in *$configuration_directory/textgen\_url*.
* lorem: local non-AI lorem generator for testing

## Environment variables

The backend can be set with `CHAP_BACKEND`.
Backend settings can be set with `CHAP_<backend_name>_<parameter_name>`, with `backend_name` and `parameter_name` all in caps.
-For instance, `CHAP_LLAMA_CPP_URL=http://server.local:8080/completion` changes the default server URL for the llama_cpp back-end.
+For instance, `CHAP_LLAMA_CPP_URL=http://server.local:8080/completion` changes the default server URL for the llama\_cpp back-end.

## Importing from ChatGPT

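The README changes above document backend selection and the new environment-variable overrides. A minimal shell sketch of how they fit together, assuming a llama.cpp server is already running at the placeholder URL:

```sh
# Select the backend and point it at a local llama.cpp server via the
# documented environment variables (the URL is a placeholder).
export CHAP_BACKEND=llama_cpp
export CHAP_LLAMA_CPP_URL=http://server.local:8080/completion

# Ask a question, then re-render the most recent session.
chap ask "What advice would you give a 20th century human visiting the 21st century for the first time?"
chap render --last

# Search earlier sessions for a keyword.
chap grep needle
```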
19 changes: 19 additions & 0 deletions chap.py
@@ -0,0 +1,19 @@
#!/usr/bin/env python3
# SPDX-FileCopyrightText: 2023 Jeff Epler <[email protected]>
#
# SPDX-License-Identifier: MIT

import pathlib
import sys

sys.path[0] = str(pathlib.Path(__file__).parent / "src")

if __name__ == "__main__":
    # pylint: disable=import-error,no-name-in-module
    from chap.core import main

    main()
else:
    raise ImportError(
        "this script exists to facilitate running 'python -mchap' in the top directory; it should not be imported"
    )
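The chap.py shim restores the older development workflow alongside the recommended editable install. A sketch of both setups from a checkout of the repository, inside a virtual environment (the "hello" prompt is just an arbitrary example):

```sh
# Recommended: editable install per PyPA development mode. The `chap`
# entry point is installed while the sources under src/ stay editable.
pip install -e .
chap ask "hello"

# Older style the shim supports: install only the runtime dependencies and
# run chap.py, which points sys.path at src/ before importing chap.core.
pip install -r requirements.txt
python chap.py ask "hello"
```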
16 changes: 3 additions & 13 deletions pyproject.toml
@@ -4,7 +4,7 @@

[build-system]
requires = [
"setuptools>=61",
"setuptools>=68.2.2",
"setuptools_scm[toml]>=6.0",
]
build-backend = "setuptools.build_meta"
@@ -19,18 +19,7 @@ where = ["src"]
name="chap"
authors = [{name = "Jeff Epler", email = "[email protected]"}]
description = "Interact with the OpenAI ChatGPT API (and other text generators)"
dynamic = ["readme","version"]
dependencies = [
"click",
"dataclasses_json",
"httpx",
"lorem-text",
"platformdirs",
"simple_parsing",
"textual>=0.18.0",
"tiktoken",
"websockets",
]
dynamic = ["readme","version","dependencies"]
classifiers = [
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.9",
@@ -50,3 +39,4 @@ chap = "chap.__main__:main"
write_to = "src/chap/__version__.py"
[tool.setuptools.dynamic]
readme = {file = ["README.md"], content-type="text/markdown"}
+dependencies = {file = "requirements.txt"}
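With `dependencies` listed under `dynamic` and sourced from `requirements.txt`, setuptools reads the requirements file at build time, so the published metadata should track edits to `requirements.txt`. One way to spot-check this (a sketch, not part of the PR; assumes the `build` package and `unzip` are available):

```sh
# Build a wheel from the project root...
python -m build --wheel

# ...then confirm the Requires-Dist entries in the wheel metadata match the
# contents of requirements.txt.
unzip -p dist/chap-*.whl '*.dist-info/METADATA' | grep Requires-Dist
```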
2 changes: 1 addition & 1 deletion requirements-dev.txt
@@ -3,6 +3,6 @@
# SPDX-License-Identifier: MIT

build
-setuptools>=45
+setuptools>=68.2.2
twine
wheel
13 changes: 13 additions & 0 deletions requirements.txt
@@ -0,0 +1,13 @@
# SPDX-FileCopyrightText: 2023 Jeff Epler
#
# SPDX-License-Identifier: Unlicense

click
dataclasses_json
httpx
lorem-text
platformdirs
simple_parsing
textual>=0.18.0
tiktoken
websockets