codeassistant is a tool that automates interactions with the OpenAI Completions API and the Vertex AI Predict API.
Prompts are organized in a directory (the prompts library) as YAML configuration files, with documentation written in Markdown. An example of such a library can be found here.
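The prompt file format itself is not shown in this README, but conceptually a prompts library groups commands under libraries, matching how they are addressed with codeassistant run later in this document; the layout below is a purely illustrative sketch with hypothetical names:
prompts-library/
  my-library/
    my-command.yaml   # prompt configuration (hypothetical name)
    my-command.md     # Markdown documentation (hypothetical name)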
We are looking for contributors; please see how you can contribute, and our code of conduct.
It fulfills these purposes:
- Lets prompt engineers prototype prompts and rapidly iterate on them
- Parameterizes prompts with light templating of their input
- Allows prompts to be integrated with other software, such as shell scripts
- Provides a Web UI
It has two main modes of operation:
- CLI: Suitable for shell scripts. Prompt output is written to STDOUT, so it can be redirected; see the example after this list.
- WebUI: Useful for testing prompts.
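For example, the output of a prompt can be captured in a file directly from the shell; the library, command, and variable names below are placeholders for entries in your prompts library:
codeassistant run <library> <command> <var1:value> > output.md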
You will need to configure an OpenAI API Key before usage.
It is recommended that you set up codeassistant with a config file at $HOME/.codeassistant.yaml for default values:
backend: openai
openAiApiKey: "<api key>"
openAiUserId: "<your email address>"
promptsLibraryDir: <directory to load prompts, defaults to $HOME/prompts-library>
You will need to install the gcloud SDK before using Vertex AI, and a Google Cloud project that gives your user access to Vertex AI.
Before use, log in with:
gcloud auth login
Then configure $HOME/.codeassistant.yaml for the Vertex AI backend:
backend: vertexai
vertexAiProjectId: "<project-id>"
vertexAiLocation: "us-central1"
vertexAiModel: "text-bison@001"
promptsLibraryDir: <directory to load prompts, defaults to $HOME/prompts-library>
More keys are available for debugging:
debug:
- configuration
- first-response-time
- last-response-time
- request-header
- request-time
- request-tokens
- response-header
- sent-prompt
- webserver
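For example, to log request timing and the prompt that was sent, you could list a subset of these keys under debug (presumably in the same $HOME/.codeassistant.yaml configuration file):
debug:
- request-time
- sent-prompt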
You can also run codeassistant with Docker:
docker run --rm --name codeassistant \
--volume $HOME/.codeassistant.yaml:/.codeassistant.yaml:ro \
--volume $HOME/prompts-library:/prompts-library:ro \
-p8989:8989 \
ghcr.io/spandigital/codeassistant:latest serve
In this example, .codeassistant.yaml is located at $HOME/.codeassistant.yaml and the prompts library folder is at $HOME/prompts-library.
In the Docker container, $HOME is defined as /.
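With this port mapping, the Web UI started by serve should then be reachable at http://localhost:8989 on the host (assuming the server listens on port 8989, as the -p8989:8989 mapping suggests).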
To install with Homebrew:
brew tap SPANDigital/homebrew-tap
brew install codeassistant
To upgrade:
brew up
brew reinstall codeassistant
To start the Web UI:
codeassistant serve
or to override the default model:
codeassistant serve --openAiCompletionsModel gpt-4
To list the prompts in your library:
codeassistant list
To run a prompt command:
codeassistant run <library> <command> <var1:value> <var2:value>
or to override the default model:
codeassistant run <library> <command> <var1:value> <var2:value> --openAiModel gpt-4
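For example, assuming a library named docs containing a summarize command that takes a topic variable (all names here are hypothetical):
codeassistant run docs summarize topic:golang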
To list available models:
codeassistant list-models
This README.md file is documentation:
SPDX-License-Identifier: MIT