BUILT FOR PRECISION
Every line of code optimized for one goal: finding exactly what you're looking for on the first try.
THE STACK
RUST
Memory-safe systems programming without garbage collection. Zero-cost abstractions for maximum performance.
- Safe idiomatic code throughout
- No unwrap() in production paths
- Async-first with Tokio runtime
- SIMD optimizations where applicable
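The "no unwrap() in production paths" convention above can be sketched in a few lines: fallible steps return Result and propagate with `?`, while optional values get explicit defaults. The names here (`tokenize`, `run`, `QueryError`) are illustrative, not the actual codebase.

```rust
// Illustrative sketch of the no-unwrap() rule: errors propagate
// with `?`, optional values get explicit defaults via unwrap_or.

#[derive(Debug, PartialEq)]
enum QueryError {
    Empty,
}

/// Split a raw query into lowercase tokens, rejecting empty input
/// instead of panicking on it.
fn tokenize(query: &str) -> Result<Vec<String>, QueryError> {
    let tokens: Vec<String> = query
        .split_whitespace()
        .map(|t| t.to_lowercase())
        .collect();
    if tokens.is_empty() {
        return Err(QueryError::Empty);
    }
    Ok(tokens)
}

fn run(query: &str, limit: Option<usize>) -> Result<Vec<String>, QueryError> {
    let mut tokens = tokenize(query)?; // propagate, never unwrap
    tokens.truncate(limit.unwrap_or(10)); // explicit default, no panic path
    Ok(tokens)
}

fn main() {
    println!("{:?}", run("Rust ASYNC search", None));
}
```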
FOUNDATIONDB
Distributed, transactional key-value store with ACID guarantees. Apple's battle-tested database.
- Horizontal scalability
- Automatic sharding & replication
- Linearizable transactions
- Self-healing architecture
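FoundationDB's linearizable transactions rely on a client-side retry loop: the transaction body is re-run on conflict, so it must be idempotent. A minimal sketch of that shape against a hypothetical `Kv` trait (the real FoundationDB client API is richer; this only shows the retry pattern):

```rust
use std::collections::HashMap;

/// Hypothetical key-value interface; stands in for a real client.
trait Kv {
    fn get(&self, key: &str) -> Option<Vec<u8>>;
    fn set(&mut self, key: &str, value: Vec<u8>);
}

/// Toy in-memory backing store for the sketch.
struct MemKv {
    data: HashMap<String, Vec<u8>>,
}

impl Kv for MemKv {
    fn get(&self, key: &str) -> Option<Vec<u8>> {
        self.data.get(key).cloned()
    }
    fn set(&mut self, key: &str, value: Vec<u8>) {
        self.data.insert(key.to_string(), value);
    }
}

/// Run `body` with retries, mirroring the FDB transact loop: on a
/// retryable error the whole closure is re-executed, so all side
/// effects must stay inside the transaction.
fn transact<K: Kv, T>(
    kv: &mut K,
    max_retries: u32,
    mut body: impl FnMut(&mut K) -> Result<T, &'static str>,
) -> Result<T, &'static str> {
    let mut attempts = 0;
    loop {
        match body(kv) {
            Ok(v) => return Ok(v),
            Err(_) if attempts < max_retries => attempts += 1, // retry
            Err(e) => return Err(e),
        }
    }
}

fn main() {
    let mut kv = MemKv { data: HashMap::new() };
    let mut fail_once = true;
    let res = transact(&mut kv, 3, |kv| {
        if fail_once {
            fail_once = false;
            return Err("not_committed"); // simulated conflict
        }
        kv.set("counter", vec![1]);
        Ok(kv.get("counter"))
    });
    println!("{:?}", res);
}
```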
ARGAND ML
In-house machine learning for ranking search results, understanding query intent, and improving quality continuously.
- Local-first inference (RTX 4070)
- Custom embedding models
- Privacy-preserving training
- Continuous improvement loop
ARGAND PLANE
In-house replacement for Qdrant. Purpose-built vector database for semantic search.
- HNSW indexing
- FDB backend for persistence
- Real-time updates
- Sub-millisecond queries
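HNSW is an approximate nearest-neighbour index; the ground truth it approximates is an exact scan like the toy baseline below (a brute-force sketch, not the index itself), scoring candidates by cosine similarity:

```rust
/// Cosine similarity between two equal-length vectors.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

/// Exact top-k by brute force: O(n) per query. This is the baseline
/// an HNSW graph approximates with far fewer distance computations.
fn top_k(query: &[f32], vectors: &[Vec<f32>], k: usize) -> Vec<usize> {
    let mut scored: Vec<(usize, f32)> = vectors
        .iter()
        .enumerate()
        .map(|(i, v)| (i, cosine(query, v)))
        .collect();
    scored.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap_or(std::cmp::Ordering::Equal));
    scored.into_iter().take(k).map(|(i, _)| i).collect()
}

fn main() {
    let db = vec![
        vec![1.0, 0.0],
        vec![0.0, 1.0],
        vec![0.7, 0.7],
    ];
    println!("{:?}", top_k(&[1.0, 0.1], &db, 2));
}
```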
ARGAND SEARCH MINI
Lightweight search for local deployments, edge computing, and offline-first applications.
- Embedded mode available
- BM25 + semantic hybrid
- Memory-efficient
- WASM compatible
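The lexical half of the BM25 + semantic hybrid is compact enough to sketch directly. This is the standard BM25 per-term formula with the usual tuning constants (k1 = 1.2, b = 0.75 here are illustrative defaults, not the production values):

```rust
/// BM25 score of a single term in a single document.
/// n_docs: corpus size; doc_freq: docs containing the term;
/// tf: term frequency in this doc; dl/avgdl: doc length vs average.
fn bm25_term(n_docs: f64, doc_freq: f64, tf: f64, dl: f64, avgdl: f64) -> f64 {
    const K1: f64 = 1.2; // term-frequency saturation
    const B: f64 = 0.75; // document-length normalization
    // Lucene-style IDF: rarer terms carry more weight, never negative.
    let idf = ((n_docs - doc_freq + 0.5) / (doc_freq + 0.5) + 1.0).ln();
    let norm = tf * (K1 + 1.0) / (tf + K1 * (1.0 - B + B * dl / avgdl));
    idf * norm
}

fn main() {
    // Rare term (in 5 of 1000 docs) appearing twice in an average-length doc.
    println!("{:.3}", bm25_term(1000.0, 5.0, 2.0, 100.0, 100.0));
}
```

A document's full score is the sum of `bm25_term` over the query's terms; the hybrid mode then blends that with the semantic vector score.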
OSM + CUSTOM
OpenStreetMap data enhanced with custom processing for the best mapping experience.
- Self-hosted tile server
- Real-time updates
- Custom routing engine
- Offline map support
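At the core of any routing engine is a shortest-path search over the road graph. A toy Dijkstra sketch (a real engine layers turn restrictions and speedups such as contraction hierarchies on top of this):

```rust
use std::cmp::Reverse;
use std::collections::BinaryHeap;

/// Dijkstra over an adjacency list: edges[u] = list of (v, cost).
/// Returns the cheapest cost from `start` to `goal`, if reachable.
fn shortest_path(edges: &[Vec<(usize, u64)>], start: usize, goal: usize) -> Option<u64> {
    let mut dist = vec![u64::MAX; edges.len()];
    let mut heap = BinaryHeap::new();
    dist[start] = 0;
    heap.push(Reverse((0u64, start)));
    while let Some(Reverse((d, u))) = heap.pop() {
        if u == goal {
            return Some(d);
        }
        if d > dist[u] {
            continue; // stale heap entry, already found a cheaper path
        }
        for &(v, w) in &edges[u] {
            let nd = d + w;
            if nd < dist[v] {
                dist[v] = nd;
                heap.push(Reverse((nd, v)));
            }
        }
    }
    None
}

fn main() {
    // 0 -> 1 (4), 0 -> 2 (1), 2 -> 1 (2), 1 -> 3 (5)
    let edges: Vec<Vec<(usize, u64)>> = vec![
        vec![(1, 4), (2, 1)],
        vec![(3, 5)],
        vec![(1, 2)],
        vec![],
    ];
    println!("{:?}", shortest_path(&edges, 0, 3)); // 0 -> 2 -> 1 -> 3
}
```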
ARCHITECTURE
PERFORMANCE TARGETS
CODE SAMPLES
// search/src/query.rs
// By Nic Weyand!
// Query processing with privacy-preserving design

use crate::{SearchResult, QueryContext, Error};

pub async fn process_query(
    query: &str,
    ctx: QueryContext,
) -> Result<Vec<SearchResult>, Error> {
    // Query processed in memory only - never persisted
    let tokens = tokenize(query)?;
    let embeddings = embed_query(&tokens).await?;

    // Hybrid search: BM25 + semantic vectors
    let results = hybrid_search(
        &tokens,
        &embeddings,
        ctx.limit.unwrap_or(10),
    ).await?;

    // Results returned, query discarded
    Ok(results)
}
# FoundationDB configuration
# Ephemeral data with automatic TTL

[storage]
cluster_file = "/etc/foundationdb/fdb.cluster"
data_dir = "/babas-books/argand/fdb"

[ttl]
# Operational data expires in 1 hour
operational_data_seconds = 3600
# System logs max 30 days (no search content)
system_logs_days = 30

[replication]
mode = "double"
redundancy = "triple"
MACHINE LEARNING
Local-first ML that respects your privacy. All inference runs on our hardware—your queries never touch external AI services.
Query Understanding
Custom models trained to understand search intent. Disambiguates queries, handles typos, and interprets natural language—all locally.
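Typo handling is commonly backed by edit distance as a first pass before any model runs. A minimal Levenshtein sketch (illustrative; the production query-understanding models are separate):

```rust
/// Levenshtein edit distance: minimum single-character insertions,
/// deletions, and substitutions to turn `a` into `b`. A small
/// distance threshold is a common first pass for typo correction.
fn edit_distance(a: &str, b: &str) -> usize {
    let a: Vec<char> = a.chars().collect();
    let b: Vec<char> = b.chars().collect();
    // Rolling single-row DP over the full distance matrix.
    let mut prev: Vec<usize> = (0..=b.len()).collect();
    for (i, &ca) in a.iter().enumerate() {
        let mut cur = vec![i + 1];
        for (j, &cb) in b.iter().enumerate() {
            let sub = prev[j] + if ca == cb { 0 } else { 1 };
            cur.push(sub.min(prev[j + 1] + 1).min(cur[j] + 1));
        }
        prev = cur;
    }
    prev[b.len()]
}

fn main() {
    println!("{}", edit_distance("wether", "weather")); // distance 1: likely typo
}
```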
Semantic Embeddings
Document and query embeddings for semantic search. Find results that match meaning, not just keywords.
Ranking & Relevance
Neural ranking models that learn what makes a result relevant. Continuously improving from aggregate patterns—never individual behavior.
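One standard, model-free way to combine lexical and semantic rankings (and a plausible input feature for a neural ranker) is reciprocal rank fusion, sketched here with the commonly used constant k = 60; this is a generic technique, not necessarily the production ranker:

```rust
use std::collections::HashMap;

/// Reciprocal rank fusion: each ranked list contributes
/// 1 / (k + rank) per document; summing across lists rewards
/// documents that rank well in several signals at once.
fn rrf(rankings: &[Vec<&str>], k: f64) -> Vec<(String, f64)> {
    let mut scores: HashMap<String, f64> = HashMap::new();
    for list in rankings {
        for (rank, doc) in list.iter().enumerate() {
            *scores.entry(doc.to_string()).or_insert(0.0) += 1.0 / (k + rank as f64 + 1.0);
        }
    }
    let mut out: Vec<(String, f64)> = scores.into_iter().collect();
    out.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap_or(std::cmp::Ordering::Equal));
    out
}

fn main() {
    let bm25 = vec!["a", "b", "c"];
    let semantic = vec!["b", "c", "a"];
    let fused = rrf(&[bm25, semantic], 60.0);
    // "b" wins: second in one list, first in the other.
    println!("{:?}", fused.first().map(|(d, _)| d.clone()));
}
```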
Image Understanding
Visual search and image classification for maps, weather imagery, and search results. CLIP-based models running on RTX 4070.
INFRASTRUCTURE
Development Machine
AMD Ryzen 9 7900X (12 cores/24 threads), 64GB DDR5, RTX 4070 12GB. Primary development and ML training workstation.
Storage
Samsung 980 PRO 2TB NVMe for hot data, Seagate 4TB HDD (/babas-books/) for datasets and model weights. FoundationDB on NVMe.
Version Control
Self-hosted Forgejo at git.argand.org. All repositories private by default. Automated backups via systemd timers.
DNS
Quad9 (9.9.9.9) for all DNS resolution. Privacy-focused, no logging, DNSSEC enabled. Never Cloudflare.