[elpa] externals/llm ee50f9cd9f 6/9: Delete unused testing code, run conversation testers llm-tester-all
From: ELPA Syncer
Subject: [elpa] externals/llm ee50f9cd9f 6/9: Delete unused testing code, run conversation testers llm-tester-all
Date: Thu, 26 Oct 2023 00:58:44 -0400 (EDT)
branch: externals/llm
commit ee50f9cd9f28724ef7bab6a0847c9bdcbc7d99ee
Author: Andrew Hyatt <ahyatt@gmail.com>
Commit: Andrew Hyatt <ahyatt@gmail.com>
Delete unused testing code, run conversation testers llm-tester-all
---
llm-tester.el | 20 ++++----------------
1 file changed, 4 insertions(+), 16 deletions(-)
diff --git a/llm-tester.el b/llm-tester.el
index 37290d37c8..bae06426d1 100644
--- a/llm-tester.el
+++ b/llm-tester.el
@@ -142,21 +142,6 @@
 (message "ERROR: Provider %s returned a response not in the original buffer" (type-of provider)))
 (message "ERROR: Provider %s returned an error of type %s with message %s" (type-of provider) type message)))))
-(defun llm-tester-chat-conversation (provider chat-func)
- "Test that PROVIDER can handle a conversation via CHAT-FUNC.
-CHAT-FUNC should insert the chat response to the buffer."
- (message "Testing provider %s for conversation" (type-of provider))
- (with-temp-buffer
- (let ((prompt (llm-make-simple-chat-prompt
-                "I'm currently testing conversational abilities. Please respond to each message with the ordinal number of your response, so just '1' for the first response, '2' for the second, and so on. It's important that I can verify that you are working with the full conversation history, so please let me know if you seem to be missing anything.")))
- (push (llm-chat provider prompt) outputs)
- (llm-chat-prompt-append-response prompt "This is the second message.")
- (push (llm-chat provider prompt) outputs)
- (llm-chat-prompt-append-response prompt "This is the third message.")
- (push (llm-chat provider prompt) outputs)
- (message "SUCCESS: Provider %s provided a conversation with responses %s" (type-of provider)
-          (nreverse outputs)))))
-
(defun llm-tester-chat-conversation-sync (provider)
"Test that PROVIDER can handle a conversation."
(message "Testing provider %s for conversation" (type-of provider))
@@ -232,7 +217,10 @@ CHAT-FUNC should insert the chat response to the buffer."
(llm-tester-chat-sync provider)
(llm-tester-embedding-async provider)
(llm-tester-chat-async provider)
- (llm-tester-chat-streaming provider))
+ (llm-tester-chat-streaming provider)
+ (llm-tester-chat-conversation-sync provider)
+ (llm-tester-chat-conversation-async provider)
+ (llm-tester-chat-conversation-streaming provider))
(provide 'llm-tester)
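
With this change, `llm-tester-all` also exercises the synchronous, asynchronous, and streaming conversation testers for a given provider. A minimal usage sketch follows; the `make-llm-openai` constructor and its `:key` argument are assumed from the llm library's provider conventions, so substitute whichever provider module you actually use:

```elisp
;; Run every tester, including the newly added conversation
;; testers, against a single provider instance.
(require 'llm-tester)
(require 'llm-openai)  ; or llm-vertex, etc.

;; `make-llm-openai' and `:key' are assumptions based on the
;; library's provider structs; adapt to your provider.
(llm-tester-all (make-llm-openai :key "sk-..."))
```

Results are reported via `message`, so check the *Messages* buffer for the SUCCESS or ERROR lines each tester emits.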