
When AI first hit the scene in its current form, I was dead set against it because of the generative nature of what was being sold to the public. I considered shortcuts for creating art to be an offense to the craft.
But then I realized that AI can be used for something traditional search engines are beginning to fail at: research.
Also: Claude AI can do your research and process your emails now – how do you like it?
On both sides of my writing career (fiction and nonfiction), I have to do quite a bit of research, and Google was becoming an obstacle to that process. Instead of being presented with useful information, I was flooded with ads, sponsored content, and Google's own AI-based answers (which were of little use).
I first kicked the tires with Opera's Aria. That experience showed me that AI could actually be useful. At the same time, I realized that AI must be overseen, as it can just as easily be wrong as right.
I also found something else useful about AI: it can lead me down some fun rabbit holes, where I might discover something really cool to investigate. Ultimately, the journey led me to two AI tools. Both were free to install and use on Linux.
These two tools help me get more done every day.
1. Ollama/Msty
Ollama is an open-source AI tool. Its open-source nature is one of the main reasons I was drawn to it: developers around the world can examine its code, and no one has ever reported discovering anything nefarious in it.
In addition to being open source, Ollama is easy to install and use. And the fact that you can download and use several different LLMs is tasty icing on an already sweet cake. You can use Cogito, Gemma 3, DeepSeek R1, Llama 3.3, Llama 3.2, Phi 4, QwQ, and more.
Also: How to send files to your local AI for better, more relevant responses
But the main reason I prefer Ollama over other AI tools is that it runs locally, which means my queries aren't accessible to third parties. I like that level of privacy.
But how do Ollama and Msty help me get things done? First, there's the prompt library, which allows you to access several quick prompts and also create custom ones. One prompt I often type is "Dive deep into the following topic and make sure to explore related side topics." Instead of entering that prompt every time, you can create a quick prompt, so all you need to enter is the subject. Plus, I don't have to remember to prompt Ollama to explore related topics.
Also: How to run DeepSeek AI locally to protect your privacy – two easy ways
You can create that quick prompt in your library and easily invoke it whenever you need it. This saves time and ensures you get consistent results every time. You don't have to think about what the prompt should say, and you can make prompts as simple or as complicated as you need.
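The quick-prompt idea isn't tied to Msty's interface. Here's a minimal sketch of the same pattern against Ollama's local REST API (which listens on http://localhost:11434 by default), assuming Ollama is installed and running. The template mirrors the prompt quoted above; the function and payload names are my own, not part of Msty or Ollama.

```python
import json
import urllib.request

# The reusable "quick prompt" -- only the topic changes between runs.
QUICK_PROMPT = ("Dive deep into the following topic and make sure "
                "to explore related side topics: {topic}")

def build_request(topic: str, model: str = "llama3.2") -> urllib.request.Request:
    """Build a request for Ollama's /api/generate endpoint from the template."""
    payload = {
        "model": model,
        "prompt": QUICK_PROMPT.format(topic=topic),
        "stream": False,  # return one complete response instead of chunks
    }
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_request("open-source licensing")
# To actually send it (requires a running Ollama server):
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

The point is the same one Msty's prompt library makes in its GUI: write the prompt once, then supply only the subject each time.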
Creating a quick prompt in Msty is a surefire way to make your daily work a little more efficient.
Jack Warren/ZDNET
Prompt libraries are very useful, especially if you have more complicated prompts to type regularly.
Also: I tried out the local AI app from Sanctum, and it's exactly what I need to keep my data private
Next is the knowledge stack. This allows you to add your own documents (which always stay local), so the selected LLM can use that information as a source. Let's say you've written a few articles on a single subject and want to use their combined information to answer some questions. I could reread that entire series, or I could add the articles to a knowledge stack and then ask my questions. Ollama searches all the documents added to the stack and uses that information in its response.
It really helps.
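Conceptually, a knowledge stack does two things: it finds the local documents most relevant to your question, then hands them to the model as context. Msty's actual retrieval is more sophisticated than this, but the toy sketch below (my own names and keyword-overlap scoring, purely for illustration) shows the shape of the idea.

```python
def rank_documents(question: str, documents: dict[str, str]) -> list[str]:
    """Return document names ordered by word overlap with the question."""
    q_words = set(question.lower().split())
    scores = {
        name: len(q_words & set(text.lower().split()))
        for name, text in documents.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

# Two "articles" standing in for a local knowledge stack.
docs = {
    "article-1.txt": "Installing Ollama on a Linux desktop",
    "article-2.txt": "Baking sourdough bread at home",
}

# The top-ranked document(s) would be prepended to the prompt as context.
context_order = rank_documents("How do I install Ollama on Linux?", docs)
```

Real systems score relevance with embeddings rather than shared keywords, but the workflow is the same: retrieve first, then answer from what was retrieved.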
2. Perplexity
There's also a desktop app you can use for Perplexity. The desktop app is roughly the same as using perplexity.ai in a browser, but it's a bit more efficient.
There are two main features that help me with my daily tasks: Search and Research.
Also: How Perplexity became the default search engine in my browser (and whether you should make it yours, too)
If you want to do a standard search in Perplexity, click the Search button, type your query, and press Enter on your keyboard. On the other hand, if you need to dive deeper into a subject, click Research and enter your query.
One thing you should know about the Research option is that it can take up to 30 minutes to do a really deep dive and deliver results. But when you really need to dig into a subject, that feature is a must. The cool thing about Research is that you can click on the task while it's running and see the sources being used for the deep dive.
It's fascinating to watch Perplexity's research in action.
Jack Warren/ZDNET
One thing to keep in mind about Research is that the free version limits the number of queries you can run per day. Upgrading to the Pro plan gets you unlimited free searches and 300+ Pro searches per day. The Pro plan costs $20 per month.
Another very useful Perplexity feature is Spaces. This feature allows you to create custom spaces for a variety of topics. You can then switch to a space to run a query and know that the query will be saved in that space. In other words, if you want to revisit a query, you just switch to the relevant space and find it. This makes it much easier to track previous queries without wading through long lists.
Also: I've tried Perplexity's assistant, and only one thing stops it from being my default phone AI
Between these two AI tools, you can accomplish more every day on your Linux desktop. I highly recommend trying one of them (or both).
