Conversation

lukehollenback

This pull request adds custom command support to the prompt passed to the LLM when asking it to review files.

In a repository's workflow configuration file for AI Code Reviewer, one can use multiline YAML to specify said custom commands in the custom_prompts input. For example, see the bottom of the following workflow configuration:

name: Code Review with OpenAI
on:
  pull_request:
    types:
      - opened
      - synchronize
      - ready_for_review
permissions: write-all
jobs:
  code_review:
    if: '! github.event.pull_request.draft'
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v3
      - name: Code Review
        uses: lukehollenback/ai-codereviewer@main-luke
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
          OPENAI_API_MODEL: "gpt-4-turbo-preview"
          exclude: "yarn.lock, dist/**, **/*.json, **/*.md, **/*.yaml, **/*.xml"
          custom_prompts: |
            Do not worry about the verbosity of variable names, as long as they are somewhat descriptive.
            Be sure to call out potential null pointer exceptions.
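To illustrate the idea, here is a minimal sketch of how a multiline custom_prompts input could be folded into the prompt sent to the LLM. The function name and prompt wording are illustrative assumptions, not the action's actual code:

```typescript
// Hypothetical helper: append each non-empty line of the custom_prompts
// input to the base review prompt as an additional instruction.
// (Names and prompt text are assumptions for illustration only.)
function buildPrompt(basePrompt: string, customPrompts: string): string {
  const extras = customPrompts
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.length > 0)
    .map((line) => `- ${line}`)
    .join("\n");
  // If no custom prompts were provided, fall back to the base prompt alone.
  return extras
    ? `${basePrompt}\n\nAdditional instructions:\n${extras}`
    : basePrompt;
}
```

With the workflow above, each line of the custom_prompts block would surface as one bullet of additional guidance in the final review prompt.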

Tested in lukehollenback#4; see the executed checks there.
