# Installation
## Requirements

- macOS 12+ or Linux (glibc 2.31+)
- No runtime dependencies required
## Download

Purchase a license at fialr.com/licensing to receive a download link and license key.

After purchasing, download the binary for your platform:
| Platform | File | BLAKE3 |
|---|---|---|
| macOS (Apple Silicon) | fialr-1.0.0-macos-arm64 | a1b2c3d4... |
| macOS (Intel) | fialr-1.0.0-macos-x64 | e5f6a7b8... |
| Linux (x64) | fialr-1.0.0-linux-x64 | c9d0e1f2... |
Full checksums and detached signatures are published with each release.
## Verify checksum

```shell
b3sum fialr-1.0.0-macos-arm64
```

Compare the output against the BLAKE3 hash listed above.
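If you script the download, the hash comparison can be automated. The following is a minimal sketch: `FILE` and `EXPECTED` are placeholders (the table above shows truncated hashes), so substitute your platform's binary and its full hash from the release page.

```shell
# Sketch only: FILE and EXPECTED are placeholders. Use your platform's
# binary name and its full BLAKE3 hash from the release page.
FILE="fialr-1.0.0-macos-arm64"
EXPECTED="a1b2c3d4..."   # truncated here; paste the full published hash

if command -v b3sum >/dev/null 2>&1 && [ -f "$FILE" ]; then
  ACTUAL=$(b3sum "$FILE" | awk '{print $1}')
  if [ "$ACTUAL" = "$EXPECTED" ]; then
    echo "checksum OK"
  else
    echo "checksum MISMATCH" >&2
    exit 1
  fi
else
  echo "skipping: b3sum or $FILE not present"
fi
```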
## Verify signature

Each binary ships with a detached Ed25519 signature (`.sig` file). Verify it with the fialr public key:

```shell
minisign -Vm fialr-1.0.0-macos-arm64 -P RWSGmBqXfRPhKPRMN4E+JGuhMGZ6NlEiXPNnA8tY6VDz
```

The public key is also published at fialr.com/signing-key.
## Install

### macOS / Linux

```shell
chmod +x fialr-1.0.0-macos-arm64
sudo mv fialr-1.0.0-macos-arm64 /usr/local/bin/fialr
```
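If you prefer to avoid sudo, a user-local install works too. A sketch, assuming `~/.local/bin` is on your PATH:

```shell
# Alternative to the sudo install above: place the binary in a
# per-user bin directory. Assumes ~/.local/bin is on your PATH.
mkdir -p "$HOME/.local/bin"
if [ -f fialr-1.0.0-macos-arm64 ]; then
  install -m 0755 fialr-1.0.0-macos-arm64 "$HOME/.local/bin/fialr"
else
  echo "fialr binary not found in the current directory"
fi
```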
## Activate

On first run, fialr requires a one-time activation with your license key:
```shell
fialr activate YOUR-LICENSE-KEY
```

This contacts api.fialr.com once to validate the key and bind it to the machine. After activation, fialr operates fully offline; no further network calls are made.
Each license allows activation on up to 3 machines.
## Verify

Confirm the installation:

```shell
fialr --version
```

Expected output:

```
fialr 1.0.0 (build a1b2c3d4)
```

## Optional: AI dependencies

For AI-assisted enrichment, semantic search, and vector embeddings, install Ollama and pull the required models:
```shell
# Install Ollama
brew install ollama   # macOS; see ollama.com for Linux

# LLM for enrichment inference (~2 GB)
ollama pull llama3.2

# Embedding model for semantic search and similarity (~275 MB)
ollama pull nomic-embed-text
```

Both models require approximately 2.3 GB of disk space in total. Ollama downloads them once and caches them locally.
fialr runs without Ollama, but the following features require it: enrichment (AI metadata generation), vector embeddings, semantic search, and embedding-based near-duplicate detection. Without local models, these features are unavailable unless you configure a cloud provider.
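Before relying on these features, you can confirm both models are actually present. A sketch using `ollama list`; it skips cleanly when Ollama is not installed:

```shell
# Check that the two models pulled above are present locally.
# Skips cleanly when Ollama itself is not installed.
if command -v ollama >/dev/null 2>&1; then
  for model in llama3.2 nomic-embed-text; do
    if ollama list 2>/dev/null | grep -q "$model"; then
      echo "$model: present"
    else
      echo "$model: missing (run: ollama pull $model)"
    fi
  done
  AI_STATUS="checked"
else
  echo "ollama not installed; local AI features unavailable"
  AI_STATUS="skipped"
fi
```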
## Cloud-only alternative

If disk space is a constraint, you can skip the local models and use a cloud provider (Claude API) for enrichment instead:

```shell
fialr config ai --provider claude --key sk-ant-...
```

Cloud enrichment works for Tier 2–3 files. Tier 1 files require local AI by default (cloud access needs explicit two-step confirmation). Vector embeddings and semantic search currently require local Ollama; there is no cloud embedding provider.
## OCR for scanned documents

To extract text from scanned PDFs and images, install Tesseract:

```shell
brew install tesseract          # macOS
sudo apt install tesseract-ocr  # Debian/Ubuntu
```

The ocrmypdf Python package (included in `fialr[enrichment]`) handles the OCR pipeline. Without Tesseract, scanned documents produce no extracted text, and enrichment falls back to filename-only metadata. Native PDFs, Office documents, photos, and audio files are unaffected.
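A quick way to confirm the OCR toolchain before processing scanned documents; this check only reports status and skips with a note when Tesseract is absent:

```shell
# Report whether Tesseract is available; OCR for scanned documents
# depends on it, while native PDFs and other formats do not.
if command -v tesseract >/dev/null 2>&1; then
  tesseract --version 2>&1 | head -n 1
  OCR_STATUS="available"
else
  echo "tesseract not found; scanned-document OCR will be unavailable"
  OCR_STATUS="missing"
fi
```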