If you want to use Ollama on your PC to run local large language models (LLMs), you have two options on Windows at least. The first is simply to install the Windows app and run Ollama natively.
The second is through the Windows Subsystem for Linux, or WSL, which for many people is a great way to use the Linux tools they need while staying on Windows most of the time. You can just use Windows for most of your tasks, ...
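As a rough sketch of the WSL route, the steps below install Ollama inside a WSL distribution using its official Linux install script and then pull a model. The model name used here (`llama3.2`) is just an example; substitute whatever model you want to run.

```shell
# Inside your WSL distribution (e.g. Ubuntu), install Ollama
# using the official Linux install script:
curl -fsSL https://ollama.com/install.sh | sh

# Then download and start chatting with a model
# (llama3.2 is an example; pick any model from the Ollama library):
ollama run llama3.2
```

Once the Ollama service is running inside WSL, it behaves just as it would on a native Linux machine, listening on localhost for API requests.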