Episode 397 – Local LLMs: Why Every Microsoft 365 & Azure Pro Should Explore Them

Author: Ben Stegink, Scott Hoag
Published: Thu 13 Mar 2025
Episode Link: https://www.msclouditpropodcast.com/episode397/

Welcome to Episode 397 of the Microsoft Cloud IT Pro Podcast. In this episode, Scott and Ben dive into the world of local LLMs: large language models that run entirely on your device. They explore why more IT pros and developers are experimenting with them, the kinds of models you can run, and how you can integrate them directly into your workflow, including in Visual Studio Code for AI-assisted coding.
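To make "running an LLM entirely on your device" concrete: tools like Ollama expose a local REST API (by default at http://localhost:11434), so your own scripts and editor extensions can talk to a model without anything leaving your machine. Here is a minimal Python sketch that builds a request body for Ollama's `/api/generate` endpoint; the model name `llama3` is just an example, substitute whichever model you have pulled locally.

```python
import json

# Default address of a locally running Ollama server (assumed, not from the episode).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str, stream: bool = False) -> str:
    """Build the JSON body that Ollama's /api/generate endpoint expects.

    With stream=False, the server returns one JSON object whose
    "response" field contains the full generated text.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

# Example: "llama3" is a placeholder; use any model you've run `ollama pull` for.
body = build_generate_request("llama3", "Explain in one sentence what a local LLM is.")
print(body)
```

To actually send it, POST `body` to `OLLAMA_URL` with any HTTP client while the Ollama server is running; nothing in the request leaves localhost, which is the privacy argument the episode discusses.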

Your support makes this show possible! Please consider becoming a premium member for access to live shows and more. Check out our membership options.
Show Notes

Ollama
Running LLMs Locally: A Beginner's Guide to Using Ollama
open-webui/open-webui
LM Studio
LM Studio Model Catalog
Why do people like Ollama more than LM Studio?
A Starter Guide for Playing with Your Own Local AI!
host ALL your AI locally
Run your own AI (but private)

About the sponsors

Would you like to become the irreplaceable Microsoft 365 resource for your organization? Let us know!
