Part of my artificial intelligence learning process involves not only using AI daily, but also exploring ways to improve how I use it. This includes making it easier and faster to use, as well as gradually increasing the complexity of my interactions.
This led me to command-line tools like Fabric, which has a lot of built-in prompts, but the fact that I needed to manually store each output bothered me. Watching "AI tools for software engineers, but without the hype" with Simon Willison (co-creator of Django), from the Pragmatic Engineer Podcast, I discovered LLM, a CLI utility and Python library for interacting with Large Language Models that automatically stores the results in a SQLite database. This was exactly what I was looking for. It took a while for me to try it out, but after watching another video (Language models on the command-line w/ Simon Willison), I finally did. Since this process wasn't without its difficulties, I decided to create a quick guide so you can get up to speed more quickly than I did.
Installation and setup
Installing package
I'm using Ubuntu 24.04, but apart from this section, everything else should work on other operating systems, such as macOS and Windows.
The easiest way to install it is by running:
pipx install llm
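If you prefer not to use pipx, LLM can also be installed with pip or, on a Mac, with Homebrew; use whichever fits your setup:
pip install llm
brew install llm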
Setting up OpenAI
Now let's set up OpenAI. If you need help, I previously wrote a post explaining how to get an OpenAI API key, with some tips to keep costs under control.
llm keys set openai
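To confirm the key is working, you can send a quick test prompt; it will use whatever the current default model is:
llm "Say hello"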
Defining a default model
The following command will list all available models.
llm models
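The list can get long, so if you are looking for a specific family of models you can filter it with grep, for example:
llm models | grep -i gpt-4o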
To define a default model, just run:
llm models default MODELNAME
I'm currently using gpt-4o-mini because it is the cheapest.
To view the current default model, run:
llm models default
Most used commands
Below is a list of the commands I use most. If you want to learn more, check the LLM usage documentation.
Single prompt
llm "MESSAGE"
Or you can pipe the output of another command into it, using -s to pass a system prompt:
cat mycode.py | llm -s "Explain this code"
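The same pattern works with any command that prints to standard output. For example, here is a sketch (assuming you are inside a git repository) that asks for a commit message:
git diff | llm -s "Write a concise commit message for this diff"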
Continue last conversation
llm "MESSAGE" -c
BashTo continue a conversation that is not the most recent one, first get its ID using llm logs
and them type use llm "MESSAGE" --cid ID
.
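By default llm logs only shows the most recent entries; if the conversation you want is older, you can ask for more of them with the -n option, for example:
llm logs -n 10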
Set specific model for a prompt
llm "MESSAGE" -m MODELNAME
Chat mode
llm chat
To open a chat that continues your last conversation, use llm chat -c.
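Chat mode also accepts a specific model via -m, and you can leave the session by typing exit or quit:
llm chat -m MODELNAME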
Templates
Official documentation for Prompt templates.
Create template
llm --system 'PROMPTMESSAGE' --save TEMPLATENAME
Use template
Single prompt:
llm "MESSAGE" -t TEMPLATENAME
Chat:
llm chat -t TEMPLATENAME
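To review which templates you have already saved, and to inspect one of them, you can use:
llm templates list
llm templates show TEMPLATENAME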
Tips to improve your experience
Use with pbpaste and pbcopy
This is very helpful, as it eliminates the need to manually copy and paste inputs and outputs when using LLM. If you don't use a Mac, I've written about how to use pbcopy and pbpaste on Linux (Ubuntu or others).
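One common approach on Ubuntu is to alias both commands to xclip (this assumes xclip is installed; xsel works similarly):
alias pbcopy='xclip -selection clipboard'
alias pbpaste='xclip -selection clipboard -o'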
Here is one example that improves content I've copied to my clipboard (CTRL+C):
pbpaste | llm -s "improve English" | pbcopy
The command above does not display any output, because the result goes straight to your clipboard. If you would also like to see the output, you can run the following command instead:
pbpaste | llm -s "improve English" | tee >(pbcopy)
Format Markdown output
One solution is to use the Python Rich library to render the command's Markdown output:
llm "MESSAGE" | python3 -m rich.markdown -
Explore logs with Datasette
Datasette is another tool created by Simon, built to help explore and publish data. After installing it, the simplest way to use it is:
datasette $(llm logs path)
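If you have not installed Datasette yet, it can be installed the same way as LLM, assuming you are also using pipx:
pipx install datasette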
To check your log status, run llm logs status.
Learn more about LLM logging to SQLite.
Real-life examples
Improve my written English
I've created a template called improve-english with the following content:
system: Correct and improve English writing, explaining why in markdown format. For the output, separate it into sections (##) and for the improved version keep the original format.
Every time I need to use it, I quickly open Guake and run:
pbpaste | llm -t improve-english | python3 -m rich.markdown -
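To save a few keystrokes, you could wrap that pipeline in a shell alias; the name fixen below is just an example:
alias fixen='pbpaste | llm -t improve-english | python3 -m rich.markdown -'  # "fixen" is a made-up alias name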
Translate my blog posts
I'm running some tests translating content from English to Brazilian Portuguese, using:
pbpaste | llm -s "translate this content from English to Brazilian Portuguese without changing it or it's structure, output in markdown" | pbcopy
Conclusion
I hope that this post helped you quickly set up this tool and better understand how you can use it to improve your day-to-day usage of AI in general. There is still a lot that I want to learn, like how to use plugins or run local models, and I expect to write about them in the future.
What I’m most excited about this tool is the ability to use the logging history to build other projects that can help enhance my learning process.