Initial streaming chat commit #5

Merged: 6 commits, Jul 4, 2025
16 changes: 16 additions & 0 deletions streaming-chat/.gitignore
@@ -0,0 +1,16 @@
target/
pom.xml.tag
pom.xml.releaseBackup
pom.xml.versionsBackup
pom.xml.next
release.properties
dependency-reduced-pom.xml
buildNumber.properties
.mvn/timing.properties
# https://github.com/takari/maven-wrapper#usage-without-binary-jar
.mvn/wrapper/maven-wrapper.jar

.DS_Store
.project
.classpath
.settings
2 changes: 2 additions & 0 deletions streaming-chat/.mvn/wrapper/maven-wrapper.properties
@@ -0,0 +1,2 @@
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.9.6/apache-maven-3.9.6-bin.zip
wrapperUrl=https://repo.maven.apache.org/maven2/org/apache/maven/wrapper/maven-wrapper/3.1.1/maven-wrapper-3.1.1.jar
118 changes: 118 additions & 0 deletions streaming-chat/README.md
@@ -0,0 +1,118 @@
# LangChain4j in Jakarta EE and MicroProfile

This example demonstrates LangChain4j in a Jakarta EE / MicroProfile application on Open Liberty. The application is a chatbot built with LangChain4j that uses Jakarta CDI, Jakarta RESTful Web Services, Jakarta WebSocket, MicroProfile Config, MicroProfile Metrics, and MicroProfile OpenAPI. It lets you use models from GitHub, Ollama, Mistral AI, or Hugging Face.
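Streaming chat means the model's reply arrives token by token over the WebSocket rather than as a single response. As a rough, dependency-free sketch of that callback pattern (the class and method names here are illustrative, not the sample's actual code or the LangChain4j API):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Illustrative sketch of token-by-token streaming: a stand-in "model"
// pushes each token to a callback, the way a streaming chat model would
// push partial responses toward a WebSocket session.
public class StreamingSketch {

    // Stand-in for a streaming model: splits a canned reply into tokens
    // and hands each one to the consumer as it "arrives", then signals
    // completion.
    static void streamReply(String reply, Consumer<String> onToken, Runnable onComplete) {
        for (String token : reply.split("(?<= )")) { // keep the trailing space with each token
            onToken.accept(token);
        }
        onComplete.run();
    }

    public static void main(String[] args) {
        List<String> received = new ArrayList<>();
        StringBuilder transcript = new StringBuilder();

        streamReply("Large language models generate text.",
                received::add, // in the real app: e.g. send each token over the WebSocket
                () -> received.forEach(transcript::append));

        System.out.println(transcript); // prints: Large language models generate text.
    }
}
```

The real application wires a LangChain4j streaming model into a Jakarta WebSocket endpoint the same way: each callback invocation forwards one token to the connected browser client.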

## Prerequisites

- [Java 21](https://developer.ibm.com/languages/java/semeru-runtimes/downloads)
- One of the following model providers:
  - GitHub
    - Sign up and sign in to https://github.com.
    - Go to your [Settings/Developer Settings/Personal access tokens](https://github.com/settings/personal-access-tokens).
    - Generate a new token with the `models` account permission.
  - Ollama
    - Download and install [Ollama](https://ollama.com/download); see the [README.md](https://github.com/ollama/ollama/blob/main/README.md#ollama).
    - Pull the following model:
      - `ollama pull llama3.2`
  - Mistral AI
    - Sign up and log in to https://console.mistral.ai/home.
    - Go to [Your API keys](https://console.mistral.ai/api-keys).
    - Create a new key.
  - Hugging Face
    - Sign up and log in to https://huggingface.co.
    - Go to [Access Tokens](https://huggingface.co/settings/tokens).
    - Create a new access token with the `read` role.

## Environment Setup

To run this example application, navigate to the `streaming-chat` directory:

```
cd sample-langchain4j/streaming-chat
```

Set the `JAVA_HOME` environment variable:

```
export JAVA_HOME=<your Java 21 home path>
```

Set the `GITHUB_API_KEY` environment variable if using GitHub.

```
unset HUGGING_FACE_API_KEY
unset OLLAMA_BASE_URL
unset MISTRAL_AI_API_KEY
export GITHUB_API_KEY=<your GitHub API token>
```

Set the `OLLAMA_BASE_URL` environment variable if using Ollama. Use your Ollama URL if not using the default.

```
unset HUGGING_FACE_API_KEY
unset GITHUB_API_KEY
unset MISTRAL_AI_API_KEY
export OLLAMA_BASE_URL=http://localhost:11434
```

Set the `MISTRAL_AI_API_KEY` environment variable if using Mistral AI.

```
unset HUGGING_FACE_API_KEY
unset GITHUB_API_KEY
unset OLLAMA_BASE_URL
export MISTRAL_AI_API_KEY=<your Mistral AI API key>
```

Set the `HUGGING_FACE_API_KEY` environment variable if using Hugging Face.

```
unset GITHUB_API_KEY
unset OLLAMA_BASE_URL
unset MISTRAL_AI_API_KEY
export HUGGING_FACE_API_KEY=<your Hugging Face read token>
```
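The four blocks above all follow the same pattern: clear the other providers' variables, then export the one you want. If you switch providers often, a small helper function (hypothetical, not part of this sample) can reduce copy-paste mistakes:

```shell
# use_provider — hypothetical convenience helper, not part of the sample.
# Clears all provider variables, then exports only the one selected.
use_provider() {
  unset GITHUB_API_KEY OLLAMA_BASE_URL MISTRAL_AI_API_KEY HUGGING_FACE_API_KEY
  case "$1" in
    github)  export GITHUB_API_KEY="$2" ;;
    ollama)  export OLLAMA_BASE_URL="${2:-http://localhost:11434}" ;;  # default Ollama URL
    mistral) export MISTRAL_AI_API_KEY="$2" ;;
    hf)      export HUGGING_FACE_API_KEY="$2" ;;
    *) echo "unknown provider: $1" >&2; return 1 ;;
  esac
}
```

For example, `use_provider ollama` sets `OLLAMA_BASE_URL` to the default local URL and clears the other three keys in one step.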

## Start the application

Use the Maven wrapper to start the application in [Liberty dev mode](https://openliberty.io/docs/latest/development-mode.html):

```
./mvnw liberty:dev
```

## Try out the streaming chat application

- Navigate to http://localhost:9080
- At the prompt, try the following message examples:
  - ```
    What are large language models?
    ```
  - ```
    Which are the most used models?
    ```
  - ```
    show me the documentation
    ```

## Running the tests

Because you started Liberty in dev mode, you can run the provided tests by pressing the `enter/return` key from the command-line session where you started dev mode.

If the tests pass, you see a similar output to the following example:

```
[INFO] -------------------------------------------------------
[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] Running it.dev.langchan4j.example.StreamingChatServiceIT
[INFO] ...
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.101 s...
[INFO]
[INFO] Results:
[INFO]
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
```

When you are done checking out the service, exit dev mode by pressing `Ctrl+C` in the command-line session where you ran Liberty, or by typing `q` and then pressing the `enter/return` key.