gnu-emacs-sources

[GNU ELPA] Llm version 0.7.0


From: ELPA update
Subject: [GNU ELPA] Llm version 0.7.0
Date: Mon, 18 Dec 2023 05:03:34 -0500

Version 0.7.0 of package Llm has just been released in GNU ELPA.
You can now find it in M-x list-packages RET.

Llm describes itself as:

  ===================================
  Interface to pluggable llm backends
  ===================================

More at https://elpa.gnu.org/packages/llm.html

## Summary:

                          ━━━━━━━━━━━━━━━━━━━━━━━
                           LLM PACKAGE FOR EMACS
                          ━━━━━━━━━━━━━━━━━━━━━━━





  1 Introduction
  ══════════════

    This is a library for interfacing with Large Language Models.  It
    allows elisp code to use LLMs while giving the end-user the option
    to choose which LLM they prefer.  This is especially useful because
    there are various high-quality LLMs for which API access costs
    money, as well as locally installed ones that are free but of
    lesser quality.  Applications using LLMs can use this library to
    make sure they work regardless of whether the user has a local LLM
    or is paying for API access.
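
    As a sketch of this provider independence (assuming a locally
    installed Ollama server; the model name and variable name below are
    purely illustrative):

    ```emacs-lisp
    ;; The application code depends only on the generic `llm' interface.
    (require 'llm)
    (require 'llm-ollama)

    ;; The user, not the application, decides which provider to use --
    ;; here a local Ollama instance with an illustrative model name.
    (defvar my-llm-provider (make-llm-ollama :chat-model "mistral"))

    ;; Any provider implementing the interface works identically:
    (llm-chat my-llm-provider
              (llm-make-simple-chat-prompt "What is an S-expression?"))
    ```

    Swapping in a paid provider (for example Open AI or Gemini) only
    changes the `defvar'; the `llm-chat' call is untouched.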

## Recent NEWS:

1 Version 0.7
═════════════

  • Upgrade Google Cloud Vertex to Gemini - previous models are no
    longer available.
  • Added `gemini' provider, which is an alternate endpoint with
    alternate (and easier) authentication and setup compared to Cloud
    Vertex.
  • Provide default for `llm-chat-async' to fall back to streaming if
    not defined for a provider.
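
  A minimal setup sketch for the new `gemini' provider, which trades
  Cloud Vertex's service-account setup for a plain API key (the `:key'
  slot name is how the struct constructor is expected to take it; the
  key value is a placeholder):

  ```emacs-lisp
  (require 'llm-gemini)
  ;; The Gemini endpoint needs only an API key, versus the
  ;; service-account configuration required for Cloud Vertex.
  (defvar my-llm-provider (make-llm-gemini :key "YOUR-API-KEY"))
  ```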


2 Version 0.6
═════════════

  • Add provider `llm-llamacpp'.
  • Fix issue with Google Cloud Vertex not responding to messages with a
    system interaction.
  • Fix use of `(pos-eol)' which is not compatible with Emacs 28.1.


3 Version 0.5.2
═══════════════

  • Fix incompatibility with older Emacs introduced in Version 0.5.1.
  • Add support for Google Cloud Vertex model `text-bison' and variants.
  • `llm-ollama' can now be configured with a scheme (http vs https).
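
  The scheme option makes it possible to reach an Ollama server over
  TLS rather than plain http. A configuration sketch (the hostname and
  model name are placeholders; `:host' support arrived in Version 0.5):

  ```emacs-lisp
  (require 'llm-ollama)
  ;; Talk to a remote Ollama server over https instead of the
  ;; default http to localhost.
  (defvar my-remote-ollama
    (make-llm-ollama :scheme "https"
                     :host "ollama.example.com"
                     :chat-model "mistral"))
  ```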


4 Version 0.5.1
═══════════════

  • Implement token counting for Google Cloud Vertex via their API.
  • Fix issue with Google Cloud Vertex erroring on multibyte strings.
  • Fix issue with small bits of missing text in Open AI and Ollama
    streaming chat.


5 Version 0.5
═════════════

  • Fixes for conversation context storage, requiring clients to handle
    ongoing conversations slightly differently.
  • Fix HTTP error-code handling for synchronous requests.
  • `llm-ollama' can now be configured with a different hostname.
  • Callbacks now always attempt to run in the client's original
    buffer.
  • Add provider `llm-gpt4all'.


6 Version 0.4
═════════════

  • Add helper function `llm-chat-streaming-to-point'.
  • Add provider `llm-ollama'.
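
  A usage sketch for the streaming helper, which inserts the response
  incrementally at a buffer position (the provider variable and prompt
  text are illustrative):

  ```emacs-lisp
  ;; Stream a chat response into the current buffer at point,
  ;; then report completion via the final callback.
  (llm-chat-streaming-to-point
   my-llm-provider
   (llm-make-simple-chat-prompt "Summarize the GPL in one sentence.")
   (current-buffer)
   (point)
   (lambda (text) (message "Received %d characters" (length text))))
  ```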


7 Version 0.3
═════════════

  • Streaming support in the API, and for the Open AI and Vertex models.
  • Properly encode and decode in utf-8 so double-width or other
    character sizes don't cause problems.


8 Version 0.2.1
═══════════════

  • Changes in how we make and listen to requests, in preparation for
    streaming functionality.
  • Fix overzealous change hook creation when using async llm requests.


9 Version 0.2
═════════════

  • Remove the dependency on the non-GNU `request' library.


