
Written by Michael Plis

How to build your own AI chatbot on your computer

Updated: May 16

A small robot sitting on a bench, learning from a book.
Will we teach our own AI models? Photo by Andrea De Santis on Unsplash

In today's AI landscape, the allure of generative AI extends beyond cloud services to local installations on personal computers. This blog delves into the benefits and practicalities of bringing this cutting-edge technology directly to your device, offering a glimpse into the future of AI accessibility and innovation. I'll show you one way to build your own AI chatbot. Read on.


We'll explore the fundamentals of local generative AI, from understanding the underlying models to navigating installation processes with user-friendly tools like LM Studio. By demystifying the complexities and offering practical guidance, we aim to empower readers to embark on their own AI exploration journey. Whether you're a novice or an enthusiast, join us as we unlock the potential of generative AI, right at your fingertips.


Bringing AI Power to Your Device


Many are familiar with generative AI tools like ChatGPT and Google Gemini (formerly Bard), typically accessed through cloud services. However, there's a way to tap into this technology directly on your own computer.


Installing generative AI locally offers privacy benefits and eliminates concerns about capacity or availability issues. Plus, it's just plain cool to have that kind of power at your fingertips.



Understanding the Basics


To embark on this journey, you'll need both a program to run the AI and a Large Language Model (LLM) to generate responses. These LLMs serve as the backbone of text-generation AI: GPT-4 drives ChatGPT, while Gemini powers Google's chatbot of the same name.


While delving into the realm of LLMs may seem daunting, they essentially function as supercharged autocomplete engines, trained on vast amounts of text to recognize relationships between words and sentences.
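To make that concrete, here's a deliberately tiny, illustrative Python sketch (not how a real LLM works internally): it simply counts which word tends to follow which in a scrap of training text and uses those counts to "predict" the next word. Real LLMs replace the counting with neural networks trained on billions of words, but the underlying idea of learning word relationships from data is the same.

from collections import Counter, defaultdict

# Toy "autocomplete": learn which word tends to follow which in some training text.
training_text = (
    "the cat sat on the mat the cat chased the mouse "
    "the dog sat on the rug the dog chased the cat"
)

pair_counts = defaultdict(Counter)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    pair_counts[current_word][next_word] += 1

def predict_next(word):
    # Return the word most often seen after `word` in the training text.
    followers = pair_counts.get(word)
    return followers.most_common(1)[0][0] if followers else "(unknown)"

print(predict_next("the"))  # "cat" - it followed "the" more often than any other word
print(predict_next("dog"))  # "sat" - ties are broken by whichever pair was counted first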


Exploring Available Models


There's a variety of LLMs you can install locally, including those released by Meta (like LLaMa) and others developed by researchers and volunteers. Publicly available LLMs aim to foster innovation and transparency, making them accessible to a broader audience.


For this guide, we'll focus on LM Studio, a user-friendly option for installing LLMs on Windows, macOS, and Linux systems.





LM Studio Capabilities & System Requirements


With LM Studio, you can ...


🤖 - Run LLMs on your laptop, entirely offline


👾 - Use models through the in-app Chat UI or an OpenAI-compatible local server (a small sketch of this appears at the end of this section)


📂 - Download any compatible model files from HuggingFace 🤗 repositories


🔭 - Discover new & noteworthy LLMs on the app's home page


LM Studio supports any ggml Llama, MPT, and StarCoder model on Hugging Face (Llama 2, Orca, Vicuna, Nous Hermes, WizardCoder, MPT, etc.)


Minimum requirements:

M1/M2/M3 Mac, or a Windows PC with a processor that supports AVX2. Linux is available in beta.
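As a taste of the local-server option mentioned above, here is a minimal Python sketch of querying LM Studio's OpenAI-compatible endpoint. It assumes you've already loaded a model and started the server from within LM Studio, that the server is listening on http://localhost:1234/v1 (check the server screen in your own install, as the port may differ), and that the third-party openai package is installed (pip install openai).

from openai import OpenAI

# Point the standard OpenAI client at the local server instead of the cloud.
client = OpenAI(
    base_url="http://localhost:1234/v1",  # adjust if your LM Studio server uses a different port
    api_key="not-needed",                 # the local server doesn't require a real API key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder name; LM Studio answers with whichever model you've loaded
    messages=[{"role": "user", "content": "In one sentence, what is a large language model?"}],
)

print(response.choices[0].message.content)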


Build your own AI chatbot with LM Studio


Getting started with LM Studio involves downloading the software from the official website and ensuring your system meets the minimum requirements, such as sufficient RAM and VRAM.
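If you'd like to check the basics before installing, the rough Python sketch below reports your total RAM and whether your CPU advertises AVX2. It relies on the third-party psutil and py-cpuinfo packages (pip install psutil py-cpuinfo), and the AVX2 check only matters on Windows and Linux machines; Apple Silicon Macs don't use it.

import psutil    # third-party: pip install psutil
import cpuinfo   # third-party: pip install py-cpuinfo

# Report total system memory in gigabytes.
total_ram_gb = psutil.virtual_memory().total / (1024 ** 3)
print(f"Total RAM: {total_ram_gb:.1f} GB")

# Check whether the CPU reports the AVX2 instruction set (needed on Windows/Linux PCs).
cpu_flags = cpuinfo.get_cpu_info().get("flags", [])
print("AVX2 supported:", "avx2" in cpu_flags)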


Once installed, you can explore and download LLMs within the application.


LM Studio simplifies the process by recommending notable LLMs and providing options to filter and manage installed models.


With LM Studio, you can engage in prompt-based interactions with the selected LLM, controlling various settings to tailor the AI's responses to your preferences.
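Those settings (system prompt, temperature, response length and so on) also map onto ordinary request fields if you talk to the model through the local server instead of the chat window. The sketch below is illustrative only: the endpoint, port, and model name are assumptions to check against your own install, and it uses the third-party requests package (pip install requests).

import requests  # third-party: pip install requests

payload = {
    "model": "local-model",  # placeholder; the model you've loaded in LM Studio is used
    "messages": [
        {"role": "system", "content": "You are a concise assistant for a small business."},
        {"role": "user", "content": "Draft a two-sentence welcome email for a new customer."},
    ],
    "temperature": 0.4,  # lower values give steadier answers, higher values more creative ones
    "max_tokens": 150,   # cap the length of the reply
}

reply = requests.post("http://localhost:1234/v1/chat/completions", json=payload, timeout=120)
print(reply.json()["choices"][0]["message"]["content"])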





Embarking on AI Exploration


With local LLMs up and running, the possibilities for AI-driven interactions are vast. While delving deeper into LLM development may require additional learning, LM Studio streamlines the setup process, even for beginners.


Whether you're curious about AI technology or eager to experiment with text generation, harnessing generative AI locally offers a fascinating glimpse into the future of human-computer interaction.





Future of local AI models

Local AI models will keep improving. They'll become lighter and easier to install, and the software that runs them will let you feed in your own data as simply as adding documents.


I envision each person customising their own AI chatbot with the knowledge they have gathered in their personal documents and the beliefs that matter to them.


For example, I would like to feed it the Bible, publications that matter to me, and my own files, so that I have a digital assistant I can talk with, study with, and learn from. It would draw on my own knowledge and the knowledge that matters to me, which otherwise gets lost in files on the computer.
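One simple way to approximate that today, while the tooling matures, is to pull the most relevant passages out of your own files and paste them into the prompt you send to the local model. The Python sketch below is purely illustrative: the my_notes folder is hypothetical, and the keyword-overlap scoring is a stand-in for the embedding-based search that real document-chat tools use.

from pathlib import Path

def load_passages(folder):
    # Split every .txt file in the folder into paragraph-sized passages.
    passages = []
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(encoding="utf-8")
        passages.extend(part.strip() for part in text.split("\n\n") if part.strip())
    return passages

def most_relevant(passages, question, top_n=3):
    # Naive relevance score: how many words a passage shares with the question.
    question_words = set(question.lower().split())
    return sorted(
        passages,
        key=lambda p: len(question_words & set(p.lower().split())),
        reverse=True,
    )[:top_n]

question = "What did I note about backing up customer records?"
context = "\n\n".join(most_relevant(load_passages("my_notes"), question))  # hypothetical folder
prompt = f"Using only the notes below, answer the question.\n\nNotes:\n{context}\n\nQuestion: {question}"
print(prompt)  # this prompt can then be sent to the local model, e.g. via the server sketch above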


Remember: humans, animals, and our surroundings matter more than training AI models. Always find the time to go outside and smell the roses, as it were.


Happy AI learning


Michael Plis


References


LM Studio


You Can Run a Generative AI Locally on Your Computer



About Michael Plis

 

Michael is a technology and cybersecurity professional with over 18 years of experience. He offers unique insights into the benefits and potential risks of technology from a neurodivergent perspective. He believes that technology is a useful servant but a dangerous master. In his blog articles, Michael helps readers better understand and use technology in a beneficial way. He is also a strong supporter of mental health initiatives and advocates for creating business environments that promote good mental health.

Disclaimer: Please note that the opinions expressed by Michael or any blog assistants on this blog are his/their own and may not necessarily reflect the views of Cyberkite. Michael is neurodiverse, so he uses voice typing and AI tools to help him write, edit, and complete blog articles. We also use open-source images from Unsplash and Pixabay and try to credit the artist of each image. Michael shares his opinions based on his extensive experience in the IT and cybersecurity industry, learning from the world's top subject matter experts and passing on this knowledge to his audience in the hopes of benefiting them. If there is a mistake or something needs to be corrected, please send a message using the green chat window in the bottom right-hand corner, or contact him through social media by searching for Michael Plis blogger.

View our full Site Disclaimer

View our Affiliate Statement
