The reason behind LLocal is that while there are so many client platforms out there, they are all aimed at developers and enthusiasts; none of them are aimed at the general public.
To address this and make the process of using Large Language Models easier, I built LLocal. Hosting and running the models is handled by Ollama, which is bundled with LLocal. Running LLMs LLocally (haha, I feel like the joke is getting old) ensures that your data stays with you: with LLocal and Ollama, your data is stored safely on your own machine. Further, making LLocal open source means anyone and everyone can contribute, and those who can't contribute can at the very least learn from the project.
That was and is my thought process.
Kartikeya Mishra.