Artificial Intelligence in Elexis #11
Replies: 2 comments 2 replies
@rgwch thank you for your input on this topic. We have thought about something like this for quite a while, yet there still are, and for some time may remain, obstacles. I'm not primarily talking about the hardware, we could host an instance that satisfies privacy requirements - but about the content the AI generates. I would be really interested in an answer to the following questions:
Looking at the code, I see the context size is set to 16384. I am not sure the resulting summary will be complete if the text extracted from the PDF is longer than 16384 tokens. My comment is based on https://medium.com/google-cloud/langchain-chain-types-large-document-summarization-using-langchain-and-google-cloud-vertex-ai-1650801899f6, where different approaches to summarizing texts larger than the context size are discussed.
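One of the approaches discussed in that article is map-reduce summarization: split the document into chunks that fit the context window, summarize each chunk, then summarize the partial summaries. A minimal sketch, assuming a hypothetical `call_llm` function standing in for the actual model call (word counts stand in for tokens here, which is an approximation):

```python
# Sketch of map-reduce summarization for documents longer than the
# model's context window. `call_llm` is a hypothetical stand-in for a
# real LLM call (e.g. to a locally served model); it is left abstract.

def chunk_text(words, max_words, overlap=50):
    """Split a word list into overlapping chunks of at most max_words."""
    chunks = []
    step = max_words - overlap
    for start in range(0, len(words), step):
        chunks.append(words[start:start + max_words])
        if start + max_words >= len(words):
            break
    return chunks

def map_reduce_summary(text, call_llm, max_words=2000):
    """Summarize text of arbitrary length via the map-reduce pattern."""
    words = text.split()
    if len(words) <= max_words:
        return call_llm("Summarize:\n" + text)          # fits in one call
    partials = [call_llm("Summarize:\n" + " ".join(c))
                for c in chunk_text(words, max_words)]  # map step
    return call_llm("Combine these partial summaries:\n"
                    + "\n".join(partials))              # reduce step
```

The overlap between chunks reduces the risk of cutting a sentence or a medication list in half at a chunk boundary. The trade-off versus a single large-context call is extra LLM invocations and some loss of cross-chunk context.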
Here's a proof of concept for using locally running LLMs to save some time when reading incoming documents.
It's just an extension of the global-Inbox feature:
https://elexis.ch/ungrad/features/inbox_de/#ki-zusammenfassung
It uses a proxy (which, for now, must run on the same machine) that hands the document over to an LLM run by node-llama-cpp.
Then it creates a new Konsultation/Encounter and writes the summary there.
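The flow just described can be sketched roughly as follows. All names here are hypothetical illustrations, not the actual Elexis or proxy API; the real setup hands the prompt to a model served by node-llama-cpp:

```python
# Rough sketch of the inbox flow: take the text extracted from an
# incoming document, have a local LLM summarize it, then store the
# summary in a new encounter record. Names are hypothetical.

from dataclasses import dataclass

@dataclass
class Encounter:
    patient_id: str
    text: str

def summarize_document(doc_text, call_llm):
    """Ask the (local) LLM for a short summary of the document text."""
    prompt = ("Summarize the following medical document in a few "
              "sentences:\n\n" + doc_text)
    return call_llm(prompt)

def process_inbox_document(patient_id, doc_text, call_llm):
    """Create a new Encounter carrying the LLM-generated summary."""
    summary = summarize_document(doc_text, call_llm)
    return Encounter(patient_id=patient_id, text=summary)
```

Keeping the LLM behind a local proxy means the document text never leaves the machine, which is the point of running the model locally in the first place.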
Several enhancements are possible: we can ask the AI to extract the medication from hospital letters and insert it into the "Fixmedikation" after the user has checked it. We can automatically bill an "Aktenstudium" when importing the document. And so on.
Some thoughts on this topic are also here (I was using Ollama there instead of node-llama-cpp, but the principle is the same).
Of course we need a somewhat better computer for this. I'm using a Ryzen 7, 48 GB RAM, and an NVIDIA 5060 Ti with 16 GB. NVIDIA's CUDA toolkit must be installed so the graphics card can be used for jobs other than display. Driver and CUDA installation was a bit tricky on Ubuntu Linux at the time, but in the end it worked.