
Commit 69cac49

gptel: Add support for the DeepSeek API
* gptel.el: Mention instructions for DeepSeek and PrivateGPT in the package description.
* README.org: Add instructions for the DeepSeek API.
Parent: 95a5716

2 files changed: +42 −3 lines

README.org (+37 lines)
@@ -23,6 +23,7 @@ gptel is a simple Large Language Model chat client for Emacs, with support for m
 | Groq       | ✓ | [[https://console.groq.com/keys][API key]] |
 | OpenRouter | ✓ | [[https://openrouter.ai/keys][API key]] |
 | PrivateGPT | ✓ | [[https://github.com/zylon-ai/private-gpt#-documentation][PrivateGPT running locally]] |
+| DeepSeek   | ✓ | [[https://platform.deepseek.com/api_keys][API key]] |
 #+html: </div>
 
 *General usage*: ([[https://www.youtube.com/watch?v=bsRnh_brggM][YouTube Demo]])
@@ -67,6 +68,7 @@ gptel uses Curl if available, but falls back to url-retrieve to work without ext
 - [[#groq][Groq]]
 - [[#openrouter][OpenRouter]]
 - [[#privategpt][PrivateGPT]]
+- [[#deepseek][DeepSeek]]
 - [[#usage][Usage]]
 - [[#in-any-buffer][In any buffer:]]
 - [[#in-a-dedicated-chat-buffer][In a dedicated chat buffer:]]
@@ -595,6 +597,41 @@ The above code makes the backend available to select. If you want it to be the
 
 #+end_src
 
+#+html: </details>
+#+html: <details><summary>
+**** DeepSeek
+#+html: </summary>
+
+Register a backend with
+#+begin_src emacs-lisp
+;; DeepSeek offers an OpenAI compatible API
+(gptel-make-openai "DeepSeek"     ;Any name you want
+  :host "api.deepseek.com"
+  :endpoint "/chat/completions"
+  :stream t
+  :key "your-api-key"             ;can be a function that returns the key
+  :models '("deepseek-chat" "deepseek-coder"))
+
+#+end_src
+
+You can pick this backend from the menu when using gptel (see [[#usage][Usage]]).
+
+***** (Optional) Set as the default gptel backend
+
+The above code makes the backend available to select. If you want it to be the default backend for gptel, you can set this as the value of =gptel-backend=. Use this instead of the above.
+#+begin_src emacs-lisp
+;; OPTIONAL configuration
+(setq gptel-model "deepseek-chat"
+      gptel-backend
+      (gptel-make-openai "DeepSeek" ;Any name you want
+        :host "api.deepseek.com"
+        :endpoint "/chat/completions"
+        :stream t
+        :key "your-api-key"         ;can be a function that returns the key
+        :models '("deepseek-chat" "deepseek-coder")))
+
+#+end_src
+
 #+html: </details>
 
 ** Usage
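The README hunk above notes in a comment that =:key= "can be a function that returns the key". As an illustrative sketch outside the commit itself: one common pattern is to read the key from =~/.authinfo= via Emacs's built-in auth-source library. The helper name =my/deepseek-api-key= below is hypothetical, and the sketch assumes a matching authinfo entry exists.

#+begin_src emacs-lisp
;; Sketch (not part of this commit): supply :key as a function instead
;; of a literal string, pulling the secret from auth-source.  Assumes
;; ~/.authinfo contains a line like:
;;   machine api.deepseek.com login apikey password <your-key>
(require 'auth-source)

(defun my/deepseek-api-key ()
  "Return the DeepSeek API key stored in auth-source."
  (auth-source-pick-first-password :host "api.deepseek.com"))

(gptel-make-openai "DeepSeek"
  :host "api.deepseek.com"
  :endpoint "/chat/completions"
  :stream t
  :key #'my/deepseek-api-key        ;called at request time
  :models '("deepseek-chat" "deepseek-coder"))
#+end_src

This keeps the secret out of your init file; the function is called when the request is made, so the key is never stored in a defvar.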

gptel.el (+5 −3 lines)
@@ -32,7 +32,8 @@
 ;; gptel supports
 ;;
 ;; - The services ChatGPT, Azure, Gemini, Anthropic AI, Anyscale, Together.ai,
-;;   Perplexity, Anyscale, OpenRouter, Groq and Kagi (FastGPT & Summarizer)
+;;   Perplexity, Anyscale, OpenRouter, Groq, PrivateGPT, DeepSeek and Kagi
+;;   (FastGPT & Summarizer)
 ;; - Local models via Ollama, Llama.cpp, Llamafiles or GPT4All
 ;;
 ;; Additionally, any LLM service (local or remote) that provides an
@@ -60,8 +61,9 @@
 ;; - For Gemini: define a gptel-backend with `gptel-make-gemini', which see.
 ;; - For Anthropic (Claude): define a gptel-backend with `gptel-make-anthropic',
 ;;   which see
-;; - For Together.ai, Anyscale, Perplexity, Groq and OpenRouter: define a
-;;   gptel-backend with `gptel-make-openai', which see.
+;; - For Together.ai, Anyscale, Perplexity, Groq, OpenRouter or DeepSeek: define
+;;   a gptel-backend with `gptel-make-openai', which see.
+;; - For PrivateGPT: define a backend with `gptel-make-privategpt', which see.
 ;; - For Kagi: define a gptel-backend with `gptel-make-kagi', which see.
 ;;
 ;; For local models using Ollama, Llama.cpp or GPT4All:
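The commentary above maps each service family to its backend constructor; everything OpenAI-compatible goes through =gptel-make-openai=. As a sketch of that pattern with deliberately hypothetical values: the service name, host, endpoint and model below are placeholders, not real gptel defaults.

#+begin_src emacs-lisp
;; Illustrative sketch, not part of this commit: any service exposing an
;; OpenAI-compatible chat-completions endpoint can be registered the same
;; way DeepSeek is above.  "MyService", the host and the model name are
;; placeholders.
(gptel-make-openai "MyService"
  :host "api.example.com"
  :endpoint "/v1/chat/completions"
  :stream t                          ;enable streaming responses
  :key "your-api-key"                ;or a function returning the key
  :models '("some-model"))
#+end_src

Only the host, endpoint and model list vary between such services, which is why the commit can add DeepSeek without a dedicated constructor.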
