IQ Times Media – Smart News for a Smarter You
AI

Osaurus brings both local and cloud AI models to your Mac

By IQ TIMES MEDIA · May 15, 2026 · 5 Mins Read


As AI models increasingly become commoditized, startups are racing to build the software layer that sits on top of them. One interesting entrant into this space is Osaurus, an open-source, Apple-only LLM server that lets users switch between different AI models, whether running locally or in the cloud, while keeping their files and tools on their own hardware.

Osaurus evolved out of the idea for a desktop AI companion, Dinoki, which Osaurus co-founder Terence Pae described as a sort of “AI-powered Clippy.” Dinoki’s customers had asked him why they should buy the app if they still had to pay for tokens — the usage units AI companies charge for processing prompts and generating responses.

That got Pae thinking more deeply about running AI locally.

“That’s how Osaurus started,” Pae, previously a software engineer at Tesla and Netflix, told TechCrunch over a call. The idea, he explained, was to try to run an AI assistant locally. “You can do pretty much everything on your Mac locally, like browsing your files, accessing your browser, accessing your system configurations. I figured this would be a great way to position Osaurus as a personal AI for individuals.”

Pae began building the tool in public as an open-source project, adding features and fixing bugs along the way.

Image Credits: Osaurus, Inc.

Today, Osaurus can flexibly connect to locally hosted AI models or to cloud providers like OpenAI and Anthropic. Users can freely choose which AI models they use while keeping other parts of the AI experience, such as the model's memory and their own files and tools, on their own hardware.
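Servers of this kind typically make backend-swapping possible by exposing one OpenAI-compatible HTTP API, so the same request works whether it targets a local model or a cloud provider. Below is a minimal sketch of that pattern, assuming an OpenAI-style `/v1/chat/completions` endpoint; the URLs, port, and model names are illustrative placeholders, not Osaurus's documented values:

```python
import json
from urllib import request


def build_chat_request(base_url: str, model: str, prompt: str) -> request.Request:
    """Build an OpenAI-style chat-completions request.

    Because the request shape is identical across backends, switching
    between a local model and a cloud provider is just a matter of
    changing `base_url` and `model`.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Same call shape, two different backends (placeholder values):
local_req = build_chat_request("http://localhost:8080", "llama-3.1-8b", "Summarize my notes")
cloud_req = build_chat_request("https://api.openai.com", "gpt-4o", "Summarize my notes")
print(local_req.full_url)  # http://localhost:8080/v1/chat/completions
```

This single-request-shape design is what lets a harness route a prompt to whichever model fits the task without the user's files or tools ever leaving the machine.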

Given that different AI models have different strengths, the advantage of this system is that users can switch to the AI model that best fits their needs.

Such a structure makes Osaurus what’s called a “harness”: a control layer that connects different AI models, tools, and workflows through a single interface, similar to tools like OpenClaw or Hermes. The difference is that those tools are often aimed at developers who know their way around a terminal, and some, as in the case of OpenClaw, can introduce security holes of their own.

Osaurus, meanwhile, offers an interface that's easy for consumers to use, and it addresses security concerns by running things in a hardware-isolated virtual sandbox. This limits the AI to a defined scope, keeping your computer and data safe.

Image Credits: Osaurus, Inc.

Of course, the practice of running AI models on your machine is still in its early days, given that it’s heavily resource-intensive and hardware-dependent. To run local models, your system will need at least 64 GB of RAM. For running larger models, like DeepSeek v4, Pae recommends systems with about 128 GB of RAM.

But Pae believes local AI's hardware requirements will come down in time.

“I can see the potential of it, because the intelligence per wattage — which is like the metric for local AI — has been going up significantly. It’s on its own curve of innovation. Last year, local AI could barely finish sentences, but today it can actually run tools, write code, access your browser, and order stuff from Amazon […] it’s just getting better and better,” he said.

Image Credits: Osaurus, Inc.

Osaurus today can run MiniMax M2.5, Gemma 4, Qwen3.6, GPT-OSS, Llama, DeepSeek V4, and other models. It also supports Apple’s on-device foundation models, Liquid AI’s LFM family of on-device models, and in the cloud, it can connect to OpenAI, Anthropic, Gemini, xAI/Grok, Venice AI, OpenRouter, Ollama, and LM Studio.

Because Osaurus is a full MCP (Model Context Protocol) server, you can give any MCP-compatible client access to your tools as well. It also ships with over 20 native plugins for Mail, Calendar, Vision, macOS Use, XLSX, PPTX, Browser, Music, Git, Filesystem, Search, Fetch, and more.

More recently, Osaurus was updated to include voice capabilities as well.

Since the project went live nearly a year ago, it has been downloaded north of 112,000 times, according to its website.

Currently, Osaurus’ founders (who include co-founder Sam Yoo) are participating in the New York-based startup accelerator Alliance. They’re also thinking about next steps, which could see Osaurus being offered to businesses, like those in the legal space or in healthcare, where running local LLMs could address privacy concerns.

As the power of local AI models grows, the team believes it could lower the demand for AI data centers.

“We’re seeing this explosive growth in the AI space where [cloud AI providers] have to scale up using data centers and infrastructure, but we feel like people haven’t really seen the value of the local AI yet,” Pae said. “Instead of relying on the cloud, they can actually deploy a Mac Studio on-prem, and it should use substantially less power. You still have the capabilities of the cloud, but you will not be dependent on a data center to be able to run that AI,” he added.
