Add Security information to README.md #787
@@ -29,10 +29,25 @@ used within the VM.
Default settings for flags are defined in `ramalama.conf(5)`.

RamaLama supports multiple AI model registry types called transports. Supported transports:
## SECURITY

### Test and run your models more securely
RamaLama defaults to running AI models inside of rootless containers using Podman or Docker. These containers isolate the AI models from information on the underlying host. With RamaLama containers, the AI model is mounted as a volume into the container in read-only mode. This results in the process running the model, llama.cpp or vLLM, being isolated from the host. In addition, since `ramalama run` uses the --network=none option, the container cannot reach the network and leak any information out of the system. Finally, containers are run with the --rm option, which means that any content written during the running of the container is wiped out when the application exits.
**Review comment:** read/only typo again

**Reply:** What is the typo?

**Reply:** I'm wrong, sorry. I'd never seen read/only written with a slash before, it's just a way of writing it I'm unaware of, used to seeing: read-only or readonly. I have seen r/o before though once or twice 😄
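As a rough illustration of the isolation described above, the container setup behind `ramalama run` is comparable to the following sketch. The image name, host model path, mount point, and serving command are illustrative assumptions, not RamaLama's exact command line:

```bash
# Sketch only: a rootless Podman invocation comparable to what `ramalama run` sets up.
# Image, host model path, mount point, and runtime command are illustrative assumptions.
podman run --rm --network=none \
  -v /path/to/ai.model:/mnt/models/model.file:ro \
  quay.io/ramalama/ramalama \
  llama-run /mnt/models/model.file
```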
### Here’s how RamaLama delivers a robust security footprint:
✅ Container Isolation – AI models run within isolated containers, preventing direct access to the host system.
✅ Read-Only Volume Mounts – The AI model is mounted in read-only mode, meaning that processes inside the container cannot modify host files.
✅ No Network Access – `ramalama run` is executed with --network=none, meaning the model has no outbound connectivity through which information could be leaked.
✅ Auto-Cleanup – Containers run with --rm, wiping out any temporary data once the session ends.
✅ Drop All Linux Capabilities – Containers are run with all Linux capabilities dropped, removing privileges that could be used to attack the underlying host.
✅ No New Privileges – A Linux kernel feature prevents container processes from gaining additional privileges.
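If you want to confirm these settings on a container started by RamaLama, one option is to inspect its host configuration with Podman. The container name below is a placeholder; list running containers with `podman ps` to find the real one:

```bash
# <container-name> is a placeholder; find it with `podman ps`.
podman inspect <container-name> --format \
  'network={{ .HostConfig.NetworkMode }} cap-drop={{ .HostConfig.CapDrop }} auto-remove={{ .HostConfig.AutoRemove }} security-opt={{ .HostConfig.SecurityOpt }}'
```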
## MODEL TRANSPORTS
RamaLama supports multiple AI model registry types called transports. Supported transports:
| Transports | Prefix | Web Site |
| ------------- | ------ | --------------------------------------------------- |
| URL based | https://, http://, file:// | `https://web.site/ai.model`, `file://tmp/ai.model` |
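For example, the URL-based transport in the table above can be exercised directly from the command line; the model locations below are placeholders:

```bash
# Placeholder model locations; any reachable URL or local model file works.
ramalama pull https://web.site/ai.model
ramalama run file://tmp/ai.model
```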
@@ -156,6 +171,15 @@ show RamaLama version
## CONFIGURATION FILES
**ramalama.conf** (`/usr/share/ramalama/ramalama.conf`, `/etc/ramalama/ramalama.conf`, `$HOME/.config/ramalama/ramalama.conf`)
RamaLama has built-in defaults for command line options. These defaults can be overridden using the ramalama.conf configuration files.
Distributions ship the `/usr/share/ramalama/ramalama.conf` file with their default settings. Administrators can override fields in this file by creating the `/etc/ramalama/ramalama.conf` file. Users can further modify defaults by creating the `$HOME/.config/ramalama/ramalama.conf` file. RamaLama merges its built-in defaults with the specified fields from these files, if they exist. Fields specified in the user's file override the administrator's file, which overrides the distribution's file, which overrides the built-in defaults.
RamaLama uses built-in defaults if no ramalama.conf file is found.
If the **RAMALAMA_CONFIG** environment variable is set, then its value is used for the ramalama.conf file rather than the default.
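As a quick sketch, the environment variable can point a single invocation at an alternate configuration file; the path and model name here are illustrative:

```bash
# Illustrative path and model name; RAMALAMA_CONFIG overrides the default config lookup.
RAMALAMA_CONFIG=$HOME/test/ramalama.conf ramalama run tinyllama
```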
## SEE ALSO
**[podman(1)](https://github.com/containers/podman/blob/main/docs/podman.1.md)**, **docker(1)**, **[ramalama.conf(5)](ramalama.conf.5.md)**
**Review comment:** "read/only" typo, can catch typos in follow-on PRs though