5 Reasons to Use Local AI on Your Desktop – Instead of ChatGPT, Gemini, or Claude



5 reasons why you should always choose local AI first

Jack Wallen / Elyse Betters Picaro / ZDNET



ZDNET's key takeaways

  • The use of AI continues to grow and impact everything.
  • Using cloud-hosted AI has several drawbacks.
  • Locally installed AI is easy to use and free.

AI isn't going anywhere, and everyone knows it by now. People all over the world use AI for just about every reason or task you can imagine. I know people who treat AI chatbots as friends, people who view AI as a research tool, and people who use AI to write communications and other types of documents.

Also: What exactly is an AI PC? And should you buy one in 2025?

When most people use AI, they tend to use services like ChatGPT, Mistral, Copilot, Gemini, or Claude. These services are cloud-hosted and certainly have their advantages. Others (like me) always choose locally installed AI first.

There are reasons for that.

What is locally installed AI?

As the name suggests, locally installed AI means installing everything you need on your personal desktop (or server), so you can use it much the same way you would a cloud-hosted solution.

Also: I tried Sanctum's local AI app, and it's exactly what I need to keep my data private

Yes, this does require a little know-how, but it's much easier than you might think. For example, you can install Ollama and the Ollama desktop app in about five minutes. Take care of that, and you can enjoy your own personal AI solution.
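To give a sense of how little work is involved, here's a minimal quick-start sketch, assuming a Linux desktop with curl available. The install script URL is Ollama's official one; "llama3.2" is just an example model name, and any model Ollama hosts would do:

```shell
# Example model to pull and chat with (swap for any model Ollama hosts).
MODEL="llama3.2"

# Step 1: install the Ollama runtime (official install script; needs a
# network connection this one time):
#   curl -fsSL https://ollama.com/install.sh | sh

# Step 2: download the model and start an interactive chat, entirely on
# your own machine:
#   ollama run "$MODEL"

echo "Quick-start: install Ollama, then run: ollama run $MODEL"
```

Once the model is downloaded, the chat itself runs locally; the network is only needed for the initial install and model pull.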

But why would you?

Let's chat.

1. Privacy

This is a big one for me. I'm a very private person, and I don't want third parties snooping on my AI usage. I don't want companies training their LLMs on my AI interactions, and I don't want those third parties using my chats to build profiles for targeted ads.

Also: How to add AI to your favorite Microsoft Office alternatives easily

That's part of the beauty of local AI: you don't have to worry about any of those things. Locally installed AI solutions don't send your queries or telemetry anywhere. Go this route, and your data stays on your machine, which means nothing you research is available to third parties.

2. Cost

The basic version of ChatGPT Plus costs $20 per month. That may not seem like much to some people, but when you combine it with all your other subscriptions, it adds up. And, like mobile plans in the early 2000s, if you go beyond your usage limits, it costs you more.

(Disclosure: Ziff Davis, the parent company of ZDNET, filed a lawsuit against OpenAI in April 2025, alleging it infringed Ziff Davis' copyrights in training and operating its AI systems.)

Also: My two favorite AI apps on Linux – and how to get more done with them

Locally installed AI is free. Period. I have Ollama and either Msty or the official Ollama app installed on all my machines, and I haven't paid a penny for AI since I started using it. Ask yourself this question: Why am I paying for a service I can have for free? Save the $20 a month and use locally installed AI.

3. Speed

Hear me out on this one. I understand that none of us has a data center at home that could match the power behind ChatGPT. At the same time, cloud-hosted AI introduces other issues, such as the speed of your internet connection. A slow connection can bog down your AI chats.

Also: 8 ways I quickly enhanced my Linux skills – and you can too

With locally installed AI, you don't have to worry about internet speeds, and you get a level of consistency you won't find otherwise. Better yet, if your local AI chat is too slow, you can always upgrade your computer's RAM or GPU to speed it up. That's not something you can do with cloud-hosted AI.

4. Offline functionality

I've had instances where my internet connection went down, yet I could still interact with my locally installed AI. Why? Because locally installed AI doesn't require an internet connection. Everything you need is right there on your machine.
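One way to see why no internet connection is needed: a local runtime like Ollama serves its API on localhost, so requests never leave your machine. Here's a small Python sketch that builds such a request, assuming Ollama's default port (11434) and an example model name:

```python
import json
import urllib.request

# Ollama's default local endpoint -- note "localhost": nothing here
# touches the public internet.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama3.2") -> urllib.request.Request:
    """Build a POST request for a locally running Ollama server."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete JSON reply, not a stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )

if __name__ == "__main__":
    req = build_request("Why is local AI private?")
    print(req.full_url)
    # With Ollama actually running, you'd send it like this:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.loads(resp.read())["response"])
```

Because the endpoint is localhost, pulling the LAN cable changes nothing for this machine; only access from other machines on the network is lost.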

Also: This is the fastest local AI I've tried and it's not even close – how to get it

That offline capability can save you a lot of wasted time when you still need to get your research done but aren't connected to the internet. I've been able to shut down my LAN and still use the AI on my desktop or laptop. Of course, with the LAN down, you can't access the locally installed AI from another machine (which I do sometimes).

5. The environment

For me (and certainly many others), this is a big one: the impact of cloud-hosted AI on the environment. An MIT article put it this way:

"The computational power required to train generative AI models that often have billions of parameters, such as OpenAI's GPT-4, can demand a staggering amount of electricity, which leads to increased carbon dioxide emissions and pressures on the electric grid."

Also: How to feed files to local AI for better, more relevant responses

There are serious concerns about what AI will do to the environment over time. Not only do AI companies consume a lot of electricity, but as a by-product of that energy consumption, AI data centers generate a lot of heat that's released into the environment. It's not sustainable. I'm glad I chose locally installed AI over the cloud-hosted options.

Want more stories about AI? Check out AI Leaderboard, our weekly newsletter.




