LocalAI
The State of Local and Affordable Inference in October 2025
·1327 words·7 mins
An overview of the current landscape of GPUs and AI compute for local inference as of October 2025, from Nvidia and AMD to Intel, Apple, and the cloud.
Testing DeepSeek-OCR: Vision Text Compression for LLMs
·466 words·3 mins
Notes from testing DeepSeek-OCR as a local vision-language model for OCR and text compression on a large archive of screenshots. Includes observations on model performance, visual-token compression, and multilingual results.