Problem or motivation
Hello, I configured Codex to use llama-swap / llama.cpp, and it works fine when used through Codex directly. However, when I try to change the model via Telegram, the local models do not show up as an option. Is this expected, and if so, can I request support for it as a feature? Thank you.
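For reference, this is roughly how the local provider is wired up on my side (a minimal sketch of a Codex `config.toml` custom provider entry; the provider id, port, and model name below are assumptions, not my exact setup):

```toml
# ~/.codex/config.toml — minimal sketch, details are placeholders
model = "qwen2.5-coder"        # hypothetical model name served by llama-swap
model_provider = "llamacpp"    # points at the custom provider below

[model_providers.llamacpp]
name = "llama.cpp (via llama-swap)"
base_url = "http://localhost:8080/v1"  # llama-swap's OpenAI-compatible endpoint; port is an assumption
```

Models defined this way are visible to Codex itself, but the Telegram model picker apparently only lists the official ones.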
Proposed solution
Expose non-official models, e.g. local models configured via the AI tool's configuration, as selectable options in the Telegram model picker.
Alternatives considered
No response
Before submitting