In the realm of Large Language Models (LLMs), Daniel Miessler’s fabric project is a popular choice for collecting and integrating various LLM prompts. However, its default reliance on the OpenAI API can lead to unexpected costs. Enter ollama, an alternative that runs LLMs locally on capable hardware such as Apple Silicon chips or dedicated GPUs. In this guide, we’ll explore how to modify fabric to work with ollama.

Step 1: Install Ollama

To begin, install ollama according to ...
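Once installed, ollama serves an HTTP API on localhost (port 11434 by default), which is what fabric ultimately talks to instead of OpenAI. As a rough sketch of that interaction, the snippet below builds a request body for ollama's `/api/generate` endpoint; the model name `llama3` is an assumption here, and you would substitute whichever model you have pulled locally.

```python
import json

# Default address of a locally running ollama server (an assumption;
# adjust if you changed the port or run ollama on another host).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(prompt: str, model: str = "llama3") -> str:
    """Serialize a request body for ollama's /api/generate endpoint."""
    payload = {
        "model": model,       # hypothetical example model; use one you pulled
        "prompt": prompt,
        "stream": False,      # single JSON response instead of a token stream
    }
    return json.dumps(payload)


body = build_request("Summarize this article in one sentence.")
print(body)
```

With a running server, this body could be POSTed to `OLLAMA_URL` (for example via `urllib.request`), and the response JSON would carry the generated text in its `response` field.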

Bernhard Knasmüller on Software Development