Prompt engineering tooling for the OpenAI (ChatGPT) and Vertex AI (Google Cloud) APIs, with more APIs to follow.

codeassistant

codeassistant is tooling that automates interactions with the OpenAI Completions API and the Vertex AI Predict API.

Prompts are organized in a directory (a prompts library) as YAML configuration files, with documentation written in Markdown. An example of such a library can be found here.

We are looking for contributors; please see how you can contribute, and our code of conduct.

It fulfills these purposes:

  • Lets prompt engineers prototype prompts and iterate on them rapidly
  • Parameterizes prompts with light templating of their input
  • Allows prompts to be integrated with other software, such as shell scripts
  • Provides a web UI

It has two main modes of operation:

  • CLI: Suitable for shell scripts. Prompt output is written to STDOUT, so it can be redirected or piped.
  • Web UI: Useful for testing prompts.
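In CLI mode, a prompt run fits into an ordinary shell pipeline. A minimal sketch, assuming a prompts library of your own: the library name, command name, and variable name below are hypothetical placeholders, not names shipped with codeassistant.

```shell
# Run a prompt command and capture the completion in a file.
# "mylib", "summarize", and "input" are illustrative names only;
# substitute entries from your own prompts library.
codeassistant run mylib summarize input:"text to summarize" > summary.md
```

Because output goes to STDOUT, the same invocation can just as easily be piped into another tool instead of redirected to a file.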

OpenAI Configuration

You will need to configure an OpenAI API Key before usage.

Sample configuration

It is recommended you set up codeassistant with a config file at $HOME/.codeassistant.yaml for default values:

backend: openai
openAiApiKey: "<api key>"
openAiUserId: "<your email address>"
promptsLibraryDir: <directory to load prompts, defaults to $HOME/prompts-library>

VertexAI Configuration

You will need to install the gcloud CLI before using Vertex AI, and a Google Cloud project that grants your user access to Vertex AI.

Before use, log in with:

gcloud auth login

Sample configuration

backend: vertexai
vertexAiProjectId: "<project-id>"
vertexAiLocation: "us-central1"
vertexAiModel: "text-bison@001"
promptsLibraryDir: <directory to load prompts, defaults to $HOME/prompts-library>

Additional configuration

Additional keys are available to enable debug output:

debug:
  - configuration
  - first-response-time
  - last-response-time
  - request-header
  - request-time
  - request-tokens
  - response-header
  - sent-prompt
  - webserver

Installing and running via Docker

docker run --rm --name codeassistant \
  --volume $HOME/.codeassistant.yaml:/.codeassistant.yaml:ro \
  --volume $HOME/prompts-library:/prompts-library:ro \
  -p 8989:8989 \
  ghcr.io/spandigital/codeassistant:latest serve

In this example, the configuration file is $HOME/.codeassistant.yaml and the prompts library is the prompts-library folder in $HOME on the host. In the Docker container, $HOME is defined as /, so they are mounted at /.codeassistant.yaml and /prompts-library.
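The same container can also be described declaratively. A Docker Compose sketch of the docker run command above (the service name is arbitrary; the image, command, port, and read-only mounts are taken directly from that command):

```yaml
services:
  codeassistant:
    image: ghcr.io/spandigital/codeassistant:latest
    command: serve            # same subcommand as the docker run example
    ports:
      - "8989:8989"
    volumes:
      # read-only mounts, mirroring the --volume flags above
      - ${HOME}/.codeassistant.yaml:/.codeassistant.yaml:ro
      - ${HOME}/prompts-library:/prompts-library:ro
```

Start it with docker compose up.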

Installing and running on macOS

Initial installation

brew tap SPANDigital/homebrew-tap
brew install codeassistant

Upgrades

brew update
brew reinstall codeassistant

Usage

Run web front-end

codeassistant serve

or to override the default model

codeassistant serve --openAiCompletionsModel gpt-4

List all the commands in your prompt libraries

codeassistant list

Run a command from a specific prompt library

codeassistant run <library> <command> <var1:value> <var2:value>

or to override the default model

codeassistant run <library> <command> <var1:value> <var2:value> --openAiModel gpt-4

List available models (beta)

codeassistant list-models

This documentation is provided under the MIT license:

SPDX-License-Identifier: MIT
