ellama
- Description: Tool for interacting with LLMs
- Latest: ellama-1.5.2.tar (.sig), 2025-Mar-10, 280 KiB
- Maintainer: Sergey Kostyaev <sskostyaev@gmail.com>
- Website: http://github.com/s-kostyaev/ellama
To install this package from Emacs, use package-install or list-packages.
Full description
1. Ellama
Ellama is a tool for interacting with large language models from Emacs. It allows you to ask questions and receive responses from the LLMs. Ellama can perform various tasks such as translation, code review, summarization, enhancing grammar/spelling or wording and more through the Emacs interface. Ellama natively supports streaming output, making it effortless to use with your preferred text editor.
The name "ellama" is derived from "Emacs Large LAnguage Model Assistant". The previous sentence was written by Ellama itself.
1.1. Installation
Just M-x package-install Enter ellama Enter. By default it uses the ollama provider. If you are OK with that, install ollama and pull any ollama model like this:
ollama pull qwen2.5:3b
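The configuration examples below also reference an embedding model; if you plan to use them as-is, pull it the same way:

ollama pull nomic-embed-text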
You can use ellama with another model or another llm provider. Without any configuration, the first available ollama model will be used.
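For example, here is a minimal sketch of switching to an OpenAI provider via the llm library; the API key and model name are placeholders, not recommendations:

(require 'llm-openai)
;; Placeholder key and model name - substitute your own.
(setopt ellama-provider
        (make-llm-openai :key "YOUR-OPENAI-API-KEY"
                         :chat-model "gpt-4o-mini"))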
You can customize ellama configuration like this:
(use-package ellama
  :ensure t
  :bind ("C-c e" . ellama-transient-main-menu)
  ;; send last message in chat buffer with C-c C-c
  :hook (org-ctrl-c-ctrl-c-final . ellama-chat-send-last-message)
  :init
  (setopt ellama-auto-scroll t)
  :config
  ;; show ellama context in header line in all buffers
  (ellama-context-header-line-global-mode +1))
A more sophisticated configuration example:
(use-package ellama
  :ensure t
  :bind ("C-c e" . ellama-transient-main-menu)
  ;; send last message in chat buffer with C-c C-c
  :hook (org-ctrl-c-ctrl-c-final . ellama-chat-send-last-message)
  :init
  ;; setup key bindings
  ;; (setopt ellama-keymap-prefix "C-c e")
  ;; language you want ellama to translate to
  (setopt ellama-language "German")
  ;; could be llm-openai for example
  (require 'llm-ollama)
  (setopt ellama-provider
          (make-llm-ollama
           ;; this model should be pulled to use it
           ;; value should be the same as you print in terminal during pull
           :chat-model "llama3:8b-instruct-q8_0"
           :embedding-model "nomic-embed-text"
           :default-chat-non-standard-params '(("num_ctx" . 8192))))
  (setopt ellama-summarization-provider
          (make-llm-ollama
           :chat-model "qwen2.5:3b"
           :embedding-model "nomic-embed-text"
           :default-chat-non-standard-params '(("num_ctx" . 32768))))
  (setopt ellama-coding-provider
          (make-llm-ollama
           :chat-model "qwen2.5-coder:3b"
           :embedding-model "nomic-embed-text"
           :default-chat-non-standard-params '(("num_ctx" . 32768))))
  ;; Predefined llm providers for interactive switching.
  ;; You shouldn't add ollama providers here - it can be selected interactively
  ;; without it. It is just an example.
  (setopt ellama-providers
          '(("zephyr" . (make-llm-ollama
                         :chat-model "zephyr:7b-beta-q6_K"
                         :embedding-model "zephyr:7b-beta-q6_K"))
            ("mistral" . (make-llm-ollama
                          :chat-model "mistral:7b-instruct-v0.2-q6_K"
                          :embedding-model "mistral:7b-instruct-v0.2-q6_K"))
            ("mixtral" . (make-llm-ollama
                          :chat-model "mixtral:8x7b-instruct-v0.1-q3_K_M-4k"
                          :embedding-model "mixtral:8x7b-instruct-v0.1-q3_K_M-4k"))))
  ;; Naming new sessions with llm
  (setopt ellama-naming-provider
          (make-llm-ollama
           :chat-model "llama3:8b-instruct-q8_0"
           :embedding-model "nomic-embed-text"
           :default-chat-non-standard-params '(("stop" . ("\n")))))
  (setopt ellama-naming-scheme 'ellama-generate-name-by-llm)
  ;; Translation llm provider
  (setopt ellama-translation-provider
          (make-llm-ollama
           :chat-model "qwen2.5:3b"
           :embedding-model "nomic-embed-text"
           :default-chat-non-standard-params '(("num_ctx" . 32768))))
  (setopt ellama-extraction-provider
          (make-llm-ollama
           :chat-model "qwen2.5-coder:7b-instruct-q8_0"
           :embedding-model "nomic-embed-text"
           :default-chat-non-standard-params '(("num_ctx" . 32768))))
  ;; customize display buffer behaviour
  ;; see ~(info "(elisp) Buffer Display Action Functions")~
  (setopt ellama-chat-display-action-function #'display-buffer-full-frame)
  (setopt ellama-instant-display-action-function #'display-buffer-at-bottom)
  :config
  ;; show ellama context in header line in all buffers
  (ellama-context-header-line-global-mode +1))
1.2. Commands
1.2.1. ellama-chat
Ask Ellama about something by entering a prompt in an interactive buffer and continue the conversation. If called with a universal argument (C-u), it starts a new session with interactive llm model selection.
1.2.2. ellama-write
This command allows you to generate text using an LLM. When called interactively, it prompts for an instruction that is then used to generate text based on the context. If a region is active, the selected text is added to the context before generating the response.
1.2.3. ellama-chat-send-last-message
Send last user message extracted from current ellama chat buffer.
1.2.4. ellama-ask-about
Ask Ellama about a selected region or the current buffer.
1.2.5. ellama-ask-selection
Send selected region or current buffer to ellama chat.
1.2.6. ellama-ask-line
Send current line to ellama chat.
1.2.7. ellama-complete
Complete text in current buffer with ellama.
1.2.8. ellama-translate
Ask Ellama to translate a selected region or word at the point.
1.2.9. ellama-translate-buffer
Translate current buffer.
1.2.10. ellama-define-word
Find the definition of the current word using Ellama.
1.2.11. ellama-summarize
Summarize a selected region or the current buffer using Ellama.
1.2.12. ellama-summarize-killring
Summarize text from the kill ring.
1.2.13. ellama-code-review
Review code in a selected region or the current buffer using Ellama.
1.2.14. ellama-change
Change text in a selected region or the current buffer according to a provided change.
1.2.15. ellama-make-list
Create a markdown list from the active region or the current buffer using Ellama.
1.2.16. ellama-make-table
Create a markdown table from the active region or the current buffer using Ellama.
1.2.17. ellama-summarize-webpage
Summarize a webpage fetched from a URL using Ellama.
1.2.18. ellama-provider-select
Select ellama provider.
1.2.19. ellama-code-complete
Complete selected code or code in the current buffer according to a provided change using Ellama.
1.2.20. ellama-code-add
Generate and insert new code based on description. This function
prompts the user to describe the code they want to generate. If a
region is active, it includes the selected text as context for code
generation.
1.2.21. ellama-code-edit
Change selected code or code in the current buffer according to a provided change using Ellama.
1.2.22. ellama-code-improve
Improve selected code or code in the current buffer using Ellama.
1.2.23. ellama-generate-commit-message
Generate commit message based on diff.
1.2.24. ellama-proofread
Proofread selected text.
1.2.25. ellama-improve-wording
Enhance the wording in the currently selected region or buffer using Ellama.
1.2.26. ellama-improve-grammar
Enhance the grammar and spelling in the currently selected region or buffer using Ellama.
1.2.27. ellama-improve-conciseness
Make the text of the currently selected region or buffer concise and simple using Ellama.
1.2.28. ellama-make-format
Render the currently selected text or the text in the current buffer as a specified format using Ellama.
1.2.29. ellama-load-session
Load ellama session from file.
1.2.30. ellama-session-delete
Delete ellama session.
1.2.31. ellama-session-switch
Change current active session.
1.2.32. ellama-session-kill
Select and kill one of active sessions.
1.2.33. ellama-session-rename
Rename current ellama session.
1.2.34. ellama-context-add-file
Add file to context.
1.2.35. ellama-context-add-directory
Add all files in directory to the context.
1.2.36. ellama-context-add-buffer
Add buffer to context.
1.2.37. ellama-context-add-selection
Add selected region to context.
1.2.38. ellama-context-add-info-node
Add info node to context.
1.2.39. ellama-context-reset
Clear global context.
1.2.40. ellama-manage-context
Manage the global context. Inside the context management buffer you can see the ellama context elements. Available actions with key bindings:
- n: Move to the next line.
- p: Move to the previous line.
- q: Quit the window.
- g: Update the context management buffer.
- a: Open the transient context menu for adding new elements.
- d: Remove the context element at the current point.
- RET: Preview the context element at the current point.
1.2.41. ellama-preview-context-element-at-point
Preview ellama context element at point. Works inside ellama context management buffer.
1.2.42. ellama-remove-context-element-at-point
Remove ellama context element at point from global context. Works inside ellama context management buffer.
1.2.43. ellama-chat-translation-enable
Enable chat translation.
1.2.44. ellama-chat-translation-disable
Disable chat translation.
1.2.45. ellama-solve-reasoning-problem
Solve a reasoning problem with the Abstraction of Thought technique. It uses a chain of multiple messages to the LLM and helps it provide much better answers to reasoning problems. Even small LLMs like phi3-mini provide much better results on reasoning tasks using AoT.
1.2.46. ellama-solve-domain-specific-problem
Solve domain specific problem with simple chain. It makes LLMs act like a professional and adds a planning step.
1.2.47. ellama-community-prompts-select-blueprint
Select a prompt from the community prompt collection. The user is prompted to choose a role, and then a corresponding prompt is inserted into a blueprint buffer.
1.2.48. ellama-community-prompts-update-variables
Prompt user for values of variables found in current buffer and update them.
1.3. Keymap
In any buffer where there is active ellama streaming, you can press
C-g
and it will cancel current stream.
Here is a table of keybindings and their associated functions in Ellama, using the ellama-keymap-prefix prefix (not set by default; see the snippet after the table for setting it):
Keymap | Function | Description |
---|---|---|
"w" | ellama-write | Write |
"c c" | ellama-code-complete | Code complete |
"c a" | ellama-code-add | Code add |
"c e" | ellama-code-edit | Code edit |
"c i" | ellama-code-improve | Code improve |
"c r" | ellama-code-review | Code review |
"c m" | ellama-generate-commit-message | Generate commit message |
"s s" | ellama-summarize | Summarize |
"s w" | ellama-summarize-webpage | Summarize webpage |
"s c" | ellama-summarize-killring | Summarize killring |
"s l" | ellama-load-session | Session Load |
"s r" | ellama-session-rename | Session rename |
"s d" | ellama-session-delete | Delete delete |
"s a" | ellama-session-switch | Session activate |
"P" | ellama-proofread | Proofread |
"i w" | ellama-improve-wording | Improve wording |
"i g" | ellama-improve-grammar | Improve grammar and spelling |
"i c" | ellama-improve-conciseness | Improve conciseness |
"m l" | ellama-make-list | Make list |
"m t" | ellama-make-table | Make table |
"m f" | ellama-make-format | Make format |
"a a" | ellama-ask-about | Ask about |
"a i" | ellama-chat | Chat (ask interactively) |
"a l" | ellama-ask-line | Ask current line |
"a s" | ellama-ask-selection | Ask selection |
"t t" | ellama-translate | Text translate |
"t b" | ellama-translate-buffer | Translate buffer |
"t e" | ellama-chat-translation-enable | Translation enable |
"t d" | ellama-chat-translation-disable | Translation disable |
"t c" | ellama-complete | Text complete |
"d w" | ellama-define-word | Define word |
"x b" | ellama-context-add-buffer | Context add buffer |
"x f" | ellama-context-add-file | Context add file |
"x d" | ellama-context-add-directory | Context add directory |
"x s" | ellama-context-add-selection | Context add selection |
"x i" | ellama-context-add-info-node | Context add info node |
"x r" | ellama-context-reset | Context reset |
"p s" | ellama-provider-select | Provider select |
1.4. Configuration
The following variables can be customized for the Ellama client:
- ellama-enable-keymap: Enable the Ellama keymap.
- ellama-keymap-prefix: The keymap prefix for Ellama.
- ellama-user-nick: The user nick in logs.
- ellama-assistant-nick: The assistant nick in logs.
- ellama-language: The language for Ollama translation. The default language is English.
- ellama-provider: llm provider for ellama. There are many supported providers: ollama, open ai, vertex, GPT4All. For more information see the llm documentation.
- ellama-providers: Association list of llm providers with a name as key.
- ellama-spinner-enabled: Enable the spinner during text generation.
- ellama-spinner-type: Spinner type for ellama. The default type is progress-bar.
- ellama-ollama-binary: Path to the ollama binary.
- ellama-auto-scroll: If enabled, the ellama buffer scrolls automatically during generation. Disabled by default.
- ellama-fill-paragraphs: Option to customize ellama paragraph filling behaviour.
- ellama-name-prompt-words-count: Count of words from the prompt used to generate a name.
- Prompt templates for every command.
- ellama-chat-done-callback: Callback that will be called when ellama chat response generation is done. It should be a function taking a single argument, the generated text string.
- ellama-nick-prefix-depth: User and assistant nick prefix depth. The default value is 2.
- ellama-sessions-directory: Directory for saved ellama sessions.
- ellama-major-mode: Major mode for ellama commands. Org mode by default.
- ellama-session-auto-save: Automatically save ellama sessions if set. Enabled by default.
- ellama-naming-scheme: How to name new sessions.
- ellama-naming-provider: LLM provider for generating session names by LLM. If not set, ellama-provider will be used.
- ellama-chat-translation-enabled: Enable chat translation if set.
- ellama-translation-provider: LLM translation provider. ellama-provider will be used if not set.
- ellama-coding-provider: LLM provider for coding tasks. ellama-provider will be used if not set.
- ellama-summarization-provider: LLM summarization provider. ellama-provider will be used if not set.
- ellama-show-quotes: Show quote content in the chat buffer. Disabled by default.
- ellama-chat-display-action-function: Display action function for ellama-chat.
- ellama-instant-display-action-function: Display action function for ellama-instant.
- ellama-translate-italic: Translate italic during markdown to org transformations. Enabled by default.
- ellama-extraction-provider: LLM provider for data extraction.
- ellama-text-display-limit: Limit for text display in context elements.
- ellama-context-poshandler: Position handler for displaying the context buffer. posframe-poshandler-frame-top-center will be used if not set.
- ellama-context-border-width: Border width for the context buffer.
- ellama-session-remove-reasoning: Remove internal reasoning from the session after ellama provides an answer. This can improve long-term communication with reasoning models. Enabled by default.
- ellama-session-hide-org-quotes: Hide org quotes in the Ellama session buffer. Think tags are replaced with quote blocks. If this flag is enabled, reasoning steps are collapsed after generation and upon session loading. Enabled by default.
- ellama-output-remove-reasoning: Remove internal reasoning from ellama output to make reasoning models more useful across diverse applications.
- ellama-context-posframe-enabled: Enable showing a posframe with the ellama context.
- ellama-manage-context-display-action-function: Display action function for ellama-render-context. The default value is display-buffer-same-window.
- ellama-preview-context-element-display-action-function: Display action function for ellama-preview-context-element.
- ellama-context-line-always-visible: Make the context header or mode line always visible, even with an empty context.
- ellama-community-prompts-url: The URL of the community prompts collection.
- ellama-community-prompts-file: Path to the CSV file containing community prompts. This file is expected to be located inside an ellama subdirectory within your user-emacs-directory.
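For example, ellama-chat-done-callback can be used to get a notification when a long generation finishes. A minimal sketch; the message text is arbitrary:

;; Notify when a chat response has finished generating.
;; The callback receives the generated text as its single argument.
(setopt ellama-chat-done-callback
        (lambda (generated-text)
          (message "Ellama done: %d characters generated"
                   (length generated-text))))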
1.5. Minor modes
1.5.1. ellama-context-header-line-mode
Description: Toggle the Ellama Context header line mode. This minor mode updates the header line to display context-specific information.
Usage:
To enable or disable ellama-context-header-line-mode, use the command:
M-x ellama-context-header-line-mode
When enabled, this mode adds a hook to window-state-change-hook to update the header line whenever the window state changes. It also calls ellama-context-update-header-line to initialize the header line with context-specific information.
When disabled, it removes the evaluation of (:eval (ellama-context-line)) from header-line-format.
1.5.2. ellama-context-header-line-global-mode
Description:
Globalized version of ellama-context-header-line-mode. This mode ensures that ellama-context-header-line-mode is enabled in all buffers.
Usage:
To enable or disable ellama-context-header-line-global-mode, use the command:
M-x ellama-context-header-line-global-mode
This globalized minor mode provides a convenient way to ensure that context-specific header line information is always available, regardless of the buffer being edited.
1.5.3. ellama-context-mode-line-mode
Description: Toggle the Ellama Context mode line mode. This minor mode updates the mode line to display context-specific information.
Usage:
To enable or disable ellama-context-mode-line-mode, use the command:
M-x ellama-context-mode-line-mode
When enabled, this mode adds a hook to window-state-change-hook to update the mode line whenever the window state changes. It also calls ellama-context-update-mode-line to initialize the mode line with context-specific information.
When disabled, it removes the evaluation of (:eval (ellama-context-line)) from mode-line-format.
1.5.4. ellama-context-mode-line-global-mode
Description:
Globalized version of ellama-context-mode-line-mode. This mode ensures that ellama-context-mode-line-mode is enabled in all buffers.
Usage:
To enable or disable ellama-context-mode-line-global-mode, use the command:
M-x ellama-context-mode-line-global-mode
This globalized minor mode provides a convenient way to ensure that context-specific mode line information is always available, regardless of the buffer being edited.
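To turn either of the global modes on from your init file, call the mode function directly, as in the configuration examples above:

;; Show the ellama context in the header line of all buffers...
(ellama-context-header-line-global-mode +1)
;; ...or in the mode line instead:
;; (ellama-context-mode-line-global-mode +1)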
1.6. Using Blueprints
Blueprints in Ellama refer to predefined templates or structures that facilitate the creation and management of chat sessions. These blueprints are designed to streamline the process of generating consistent and high-quality outputs by providing a structured framework for interactions.
1.6.1. Key Components of Ellama Blueprints
- Act: This is the primary identifier for a blueprint, representing the
action or purpose of the blueprint.
- Prompt: The content that will be used to initiate the chat session. This
can include instructions, context, or any other relevant information needed to guide the conversation.
- For Developers: A flag indicating whether the blueprint is intended for
developers.
1.6.2. Creating and Managing Blueprints
Ellama provides several functions to create, select, and manage blueprints:
- ellama-blueprint-create: This function allows users to create a new blueprint from the current buffer. It prompts for a name and whether the blueprint is for developers, then saves the content of the current buffer as the prompt.
- ellama-blueprint-new: This function creates a new buffer for a blueprint, optionally inserting the content of the current region if active.
- ellama-blueprint-select: This function allows users to select a prompt from the collection of blueprints. It filters prompts based on whether they are for developers and their source (user-defined, community, or all).
1.6.3. Variable Management
Blueprints can include variables that need to be filled before running the chat session. Ellama provides a command to fill these variables:
- ellama-blueprint-fill-variables: Prompts the user to enter values for variables found in the current buffer and fills them.
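For illustration, a blueprint prompt with variables in curly braces might look like the line below (the wording is made up for this example); with such a buffer active, ellama-blueprint-fill-variables prompts for values of character and series:

I want you to act as {character} from {series}. Stay in character in every reply.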
1.6.4. Keymap and Mode
Ellama provides a local keymap ellama-blueprint-mode-map
for managing
blueprints within buffers. The mode includes key bindings for sending the buffer
to a new chat session, killing the current buffer, creating a new blueprint, and
filling variables.
The ellama-blueprint-mode
is a derived mode from text-mode
, providing syntax
highlighting for variables in curly braces and setting up the local keymap.
When in ellama-blueprint-mode, the following keybindings are available:
- C-c C-c: Send the current buffer to a new chat session and kill the current buffer.
- C-c C-k: Kill the current buffer.
- C-c c: Create a blueprint from the current buffer.
- C-c v: Fill variables in the current blueprint.
1.6.5. Transient Menus
Ellama includes transient menus for easy access to blueprint commands. The
ellama-transient-blueprint-menu
provides options for chatting with a selected
blueprint, creating a new blueprint, and quitting the menu.
The ellama-transient-main-menu
integrates the blueprint menu into the main
menu, providing a comprehensive interface for all Ellama commands.
1.6.6. Running Blueprints programmatically
The ellama-blueprint-run
function initiates a chat session using a specified
blueprint. It pre-fills variables based on the provided arguments.
(defun my-chat-with-morpheus ()
  "Start chat with Morpheus."
  (interactive)
  (ellama-blueprint-run "Character"
                        '(:character "Morpheus"
                          :series "Matrix")))

(global-set-key (kbd "C-c e M") #'my-chat-with-morpheus)
1.7. Acknowledgments
Thanks to Jeffrey Morgan for the excellent ollama project. This project could not exist without it.
Thanks to zweifisch - I got some ideas from ollama.el about what an ollama client in Emacs can do.
Thanks to Dr. David A. Kunz - I got more ideas from gen.nvim.
Thanks to Andrew Hyatt for the llm library. Without it, only ollama would be supported.
2. Contributions
To contribute, submit a pull request or report a bug. This library is part of GNU ELPA; major contributions must be from someone with FSF papers. Alternatively, you can write a module and share it on a different archive like MELPA.
Old versions
ellama-1.5.1.tar.lz | 2025-Mar-09 | 49.7 KiB |
ellama-1.5.0.tar.lz | 2025-Mar-08 | 49.3 KiB |
ellama-1.4.5.tar.lz | 2025-Mar-03 | 45.8 KiB |
ellama-1.3.0.tar.lz | 2025-Feb-22 | 42.6 KiB |
ellama-1.2.5.tar.lz | 2025-Feb-21 | 41.4 KiB |
ellama-1.1.7.tar.lz | 2025-Feb-17 | 39.8 KiB |
ellama-1.0.3.tar.lz | 2025-Feb-13 | 38.5 KiB |
ellama-0.13.11.tar.lz | 2025-Feb-09 | 36.5 KiB |
ellama-0.10.0.tar.lz | 2024-Jun-29 | 30.0 KiB |
ellama-0.9.11.tar.lz | 2024-Jun-26 | 29.3 KiB |
ellama-0.9.0.tar.lz | 2024-Apr-03 | 26.7 KiB |
ellama-0.8.13.tar.lz | 2024-Mar-31 | 25.7 KiB |
ellama-0.8.1.tar.lz | 2024-Feb-11 | 24.1 KiB |
ellama-0.7.7.tar.lz | 2024-Feb-10 | 22.9 KiB |
ellama-0.7.0.tar.lz | 2024-Jan-20 | 20.3 KiB |
ellama-0.6.0.tar.lz | 2024-Jan-18 | 20.1 KiB |
ellama-0.5.7.tar.lz | 2024-Jan-16 | 18.7 KiB |
ellama-0.4.13.tar.lz | 2023-Dec-28 | 18.0 KiB |
ellama-0.4.0.tar.lz | 2023-Dec-18 | 17.4 KiB |
ellama-0.3.2.tar.lz | 2023-Dec-18 | 16.9 KiB |
News
1. Version 1.5.2
- Fixed a bug in session delete or kill that deletes or kills the current file or buffer when no session is selected.
2. Version 1.5.1
- Renamed the variable ellama-transient-system to ellama-global-system to reflect its broader scope and updated all references in ellama.el, ellama-transient.el and ellama-blueprint.el. Moved it from ellama-transient to ellama. Fixed a "symbol's value as variable is void" bug.
3. Version 1.5.0
- Code Refactoring and Modularity
  - Moved transient menu-related functions and variables from ellama.el to ellama-transient.el.
  - Created ellama-context.el for context-related functions, variables, and classes.
  - Moved blueprint-related code from ellama.el to ellama-blueprint.el.
- System Message Support
  - Added system message support with new functions and variables in ellama-blueprint.el and ellama-transient.el.
  - Updated keybindings and transient menus to include system message options.
  - Modified ellama-stream to use the transient system message if not provided explicitly.
- Functionality Enhancements
  - Updated ellama-summarize-prompt-template with new summarization instructions.
  - Modified ellama-instant calls in ellama-summarize and ellama-summarize-kill-ring.
  - Updated the translation template for better structure and added Org-mode to Markdown conversion.
  - Added ellama-fix-parens function to remove unnecessary parentheses after template insertion.
  - Refined ellama-complete function for more accurate response trimming.
- Blueprints Support Enhancements
  - Added ellama-blueprint-run function to run a chat with an LLM using a specified blueprint and optional pre-filled variables.
  - Added documentation for blueprints in the README file.
  - Added a new transient prefix command ellama-transient-blueprint-menu for managing blueprint-related commands, including creating blueprints from buffer or as new ones, and chatting with selected blueprints.
  - Added a main menu option for chatting with blueprints.
  - Added custom variable ellama-blueprints to store user-defined blueprints.
  - Created commands ellama-create-blueprint and ellama-new-blueprint for creating new blueprints from an existing one and from scratch.
4. Version 1.4.5
- Fix compat dependency version.
5. Version 1.4.4
- Ensured that the buffer ellama--context-buffer is created if it does not exist before attempting to update and show context, to prevent errors related to non-existent buffers.
- Added calls to ellama-update-context-show in both header line and mode line minor modes to ensure context is shown when these modes are toggled.
6. Version 1.4.3
- Added fallback mechanism to use the first available Ollama chat model when no specific provider is defined for various ellama functions, ensuring that a valid provider is always used and preventing potential errors in scenarios where providers are not explicitly set.
- Refactored ellama--scroll to accept an optional POINT argument, allowing the function to go to that specific point before scrolling. Updated the caller ellama-update-buffer to pass the new point.
- Refactored ellama-preview-context-mode by renaming the quit command key binding to use a custom function ellama-kill-current-buffer. Added a header line format that displays the quit command instruction.
- Added a new function ellama-send-buffer-to-new-chat-then-kill to send the current buffer to a new chat session and then kill it. Updated the keybinding in ellama-blueprint-mode-map to use this new function instead of the old one.
- Added a new function ellama-kill-current-buffer and updated the keymap in ellama-blueprint-mode-map to use this function instead of an anonymous lambda. Also updated the header line format to display the correct command names using substitute-command-keys.
- Removed a redisplay call to prevent flickering.
- Ensured the llm-ollama dependency is loaded.
7. Version 1.4.2
… …