emacs-elpa-diffs

[elpa] externals/llm 8b7faa68e2: Add ability to set the ollama host


From: ELPA Syncer
Subject: [elpa] externals/llm 8b7faa68e2: Add ability to set the ollama host
Date: Thu, 19 Oct 2023 00:58:39 -0400 (EDT)

branch: externals/llm
commit 8b7faa68e2a511308383901b367de0c976d6a41e
Author: Andrew Hyatt <ahyatt@gmail.com>
Commit: Andrew Hyatt <ahyatt@gmail.com>

    Add ability to set the ollama host
    
    This fixes https://github.com/ahyatt/llm/issues/4.
---
 llm-ollama.el | 8 ++++++--
 1 file changed, 6 insertions(+), 2 deletions(-)

diff --git a/llm-ollama.el b/llm-ollama.el
index 7a8d53e02d..fbce0fa6e0 100644
--- a/llm-ollama.el
+++ b/llm-ollama.el
@@ -47,12 +47,15 @@
 (cl-defstruct llm-ollama
   "A structure for holding information needed by Ollama's API.
 
+HOST is the host that Ollama is running on.  It is optional and
+defaults to localhost.
+
 PORT is the localhost port that Ollama is running on.  It is optional.
 
 CHAT-MODEL is the model to use for chat queries. It is required.
 
 EMBEDDING-MODEL is the model to use for embeddings.  It is required."
-  port chat-model embedding-model)
+  host port chat-model embedding-model)
 
 ;; Ollama's models may or may not be free, we have no way of knowing. There's no
 ;; way to tell, and no ToS to point out here.
@@ -62,7 +65,8 @@ EMBEDDING-MODEL is the model to use for embeddings.  It is required."
 
 (defun llm-ollama--url (provider method)
   "With ollama PROVIDER, return url for METHOD."
-  (format "http://localhost:%d/api/%s" (or (llm-ollama-port provider) 11434) method))
+  (format "http://%s:%d/api/%s" (or (llm-ollama-host provider) "localhost")
+          (or (llm-ollama-port provider) 11434) method))
 
 (defun llm-ollama--embedding-request (provider string)
   "Return the request to the server for the embedding of STRING.

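With this change, a provider running on a remote machine can be configured through the new HOST slot of the `llm-ollama` struct; when HOST or PORT is nil, `llm-ollama--url` falls back to "localhost" and 11434. A minimal sketch (the host address and model names here are illustrative, not from the commit):

```emacs-lisp
;; Sketch: an Ollama provider pointing at a remote host.
;; HOST defaults to "localhost" and PORT to 11434 when left nil.
(require 'llm-ollama)

(setq my-remote-ollama
      (make-llm-ollama :host "192.168.1.10"  ; hypothetical remote machine
                       :port 11434
                       :chat-model "llama2"              ; illustrative
                       :embedding-model "nomic-embed-text"))

;; The URL builder now uses the configured host:
;; (llm-ollama--url my-remote-ollama "generate")
;;   => "http://192.168.1.10:11434/api/generate"
```

Because `make-llm-ollama` is the constructor generated by `cl-defstruct`, the new slot is settable with the `:host` keyword without any further API changes.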

