
bug#70175: closed ([PATCH] gnu: llama-cpp: support OpenBLAS for faster prompt processing)


From: GNU bug Tracking System
Subject: bug#70175: closed ([PATCH] gnu: llama-cpp: support OpenBLAS for faster prompt processing)
Date: Fri, 05 Apr 2024 11:36:02 +0000

Your message dated Fri, 05 Apr 2024 12:35:02 +0100
with message-id <871q7kghzt.fsf@cbaines.net>
and subject line Re: [bug#70175] [PATCH] gnu: llama-cpp: support OpenBLAS for faster prompt processing
has caused the debbugs.gnu.org bug report #70175,
regarding [PATCH] gnu: llama-cpp: support OpenBLAS for faster prompt processing
to be marked as done.

(If you believe you have received this mail in error, please contact
help-debbugs@gnu.org.)


-- 
70175: https://debbugs.gnu.org/cgi/bugreport.cgi?bug=70175
GNU Bug Tracking System
Contact help-debbugs@gnu.org with problems
--- Begin Message ---
Subject: [PATCH] gnu: llama-cpp: support OpenBLAS for faster prompt processing
Date: Wed, 3 Apr 2024 23:46:25 -0400
OpenBLAS is recommended by https://github.com/ggerganov/llama.cpp

Change-Id: Iaf6f22252da13e2d6f503992878b35b0da7de0aa
---
 gnu/packages/machine-learning.scm | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index 225bff0ca2..ea3674ce3e 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -542,6 +542,8 @@ (define-public llama-cpp
       (build-system cmake-build-system)
       (arguments
        (list
+        #:configure-flags
+        '(list "-DLLAMA_BLAS=ON" "-DLLAMA_BLAS_VENDOR=OpenBLAS")
         #:modules '((ice-9 textual-ports)
                     (guix build utils)
                     ((guix build python-build-system) #:prefix python:)
@@ -576,8 +578,9 @@ (define-public llama-cpp
               (lambda _
                 (copy-file "bin/main" (string-append #$output 
"/bin/llama")))))))
       (inputs (list python))
+      (native-inputs (list pkg-config))
       (propagated-inputs
-       (list python-numpy python-pytorch python-sentencepiece))
+       (list python-numpy python-pytorch python-sentencepiece openblas))
      (home-page "https://github.com/ggerganov/llama.cpp")
       (synopsis "Port of Facebook's LLaMA model in C/C++")
       (description "This package provides a port to Facebook's LLaMA collection

base-commit: 1441a205b1ebb610ecfae945b5770734cbe8478c
-- 
2.41.0
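For context, the two values added under #:configure-flags are upstream llama.cpp's own CMake options for enabling a BLAS backend. Outside Guix, a build with the same flags can be sketched roughly as follows (the local checkout and an installed OpenBLAS with pkg-config are assumptions, not part of the patch):

```shell
# Sketch of the equivalent out-of-Guix build. Assumes OpenBLAS and
# pkg-config are installed on the host.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
# Same flags the Guix patch passes via #:configure-flags:
cmake -B build -DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS
cmake --build build --config Release
```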




--- End Message ---
--- Begin Message ---
Subject: Re: [bug#70175] [PATCH] gnu: llama-cpp: support OpenBLAS for faster prompt processing
Date: Fri, 05 Apr 2024 12:35:02 +0100
User-agent: mu4e 1.12.2; emacs 29.3
John Fremlin via Guix-patches via <guix-patches@gnu.org> writes:

> OpenBLAS is recommended by https://github.com/ggerganov/llama.cpp
>
> Change-Id: Iaf6f22252da13e2d6f503992878b35b0da7de0aa
> ---
>  gnu/packages/machine-learning.scm | 5 ++++-
>  1 file changed, 4 insertions(+), 1 deletion(-)

Looks good to me, I tweaked the commit message a bit and pushed this to
master as d8a63bbcee616f224c10462dbfb117ec009c50d8.

Chris

Attachment: signature.asc
Description: PGP signature


--- End Message ---
