On October 23rd we did our second Artificial Intelligence (AI) class at Castlemakers, this time focusing on practical uses of AI. Our June class covered creating your own AI LLM on your own computer (which not only is a great way to learn about the technology but also addresses privacy concerns), but we realized there was also interest in how AI can be used.
One of the key points in the class, regardless of which AI product you use (and we showed several), is how you phrase the question. Setting up a scenario and asking the AI engine to respond or behave within the context of a person or situation makes a real difference in the results. Ian, who uses AI and develops software, also demonstrated a system he built where you could input a PDF or other documents and it created a video podcast summary with two AI avatars discussing the content. It was kind of like Google’s NotebookLM on steroids.
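As a rough illustration of how framing the question changes the results, here is a sketch using the Ollama command line (the model name `llama3` is just an example; any chat-capable model you have pulled locally will do):

```shell
# A bare question tends to get a generic answer:
ollama run llama3 "How should I back up my files?"

# The same question with a role and scenario attached tends to get
# a more specific, actionable answer:
ollama run llama3 "You are an IT volunteer at a small community \
makerspace with one shared Windows PC and no budget. Recommend a \
simple backup routine our members can actually follow."
```

The extra sentence of context constrains the model, so the response is tailored to the situation rather than a general essay.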
If there’s interest in the future, we could put together another class on the subject, although the field is moving so rapidly we’d have new topics to cover. Next time we could perhaps even have AI teach the class!
We held our first Artificial Intelligence (AI) / local Large Language Model (LLM) workshop at Castlemakers last Thursday. The focus was on downloading and running a local AI LLM model using Ollama on a Windows PC, but given the target audience (those unfamiliar with AI) we used Docker, a container tool, to keep the installation simple, and spent a lot of time explaining all of the terms involved.
Attendees discuss what’s involved in a LLM.
We decided to try a more cutting-edge topic to expose people to AI and open source LLMs – plus help people realize it’s not as hard to do as it might seem. Having someone in our midst, Ian, who has been trying out and using the different models also really helped! Folks attending had the option to take home a USB drive loaded with the software to install and try on their own computer at home.
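For anyone who wants to recreate the setup at home, a minimal sketch of the container approach looks like this (these follow Ollama’s published Docker instructions, not necessarily the exact commands from the class):

```shell
# Start the official Ollama container in the background,
# keeping downloaded models in a named volume so they persist:
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Download a small open source model and chat with it
# inside the running container:
docker exec -it ollama ollama run tinyllama
```

Using a container keeps everything in one place, which is handy for a class: if something goes wrong you can delete the container and start over without touching the rest of the machine.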
If you’re interested in the topic, stop by Castlemakers during our Open Shop hours – we still have the system installed on a surprisingly small computer, and with the Ollama interface you can try different open source LLM models on a local PC. It’s a great way to learn about and experiment with something that is already affecting all of us and will only do so more in the future…
Not surprisingly, given some of our members’ interest in software, there’s been a lot of discussion at Castlemakers about Artificial Intelligence (AI). We started experimenting with machine learning, an application of AI, several years ago with a squirrel-proof bird feeder. And since two of our members work with software, it’s been a hot topic during a lot of Open Shop times recently.
Screenshot of Ollama running TinyLlama on a local i5 Windows PC at Castlemakers to test local AI queries.
Many people don’t realize how rapidly it is becoming practical to run Large Language Models (LLMs – and, for that matter, AI in general) on smaller computers. One of the big changes has been open source software that handles installation and provides a front end for the LLM, and that you can run on a Windows, Linux, or Mac machine.
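To give a sense of how little is involved once Ollama is installed, here is a sketch of pulling and querying a small model like the TinyLlama shown in the screenshot (model choice and prompt are just examples):

```shell
# Download a small open source model (TinyLlama is about 1.1B parameters,
# so it runs comfortably on a modest PC like an i5 with no GPU):
ollama pull tinyllama

# Ask it a question directly from the command line:
ollama run tinyllama "In one sentence, what is a large language model?"

# Ollama also exposes a local REST API (default port 11434),
# so other programs on the machine can use the model too:
curl http://localhost:11434/api/generate -d '{
  "model": "tinyllama",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

Everything here runs entirely on the local machine – no account, no cloud service, and nothing you type leaves the computer.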
That has led to a workshop/class on the subject in June. It started as a way for more of us to learn about running an open source LLM on a workstation or laptop, but we realized there might be more people who would like to find out how easy it is to set one up on a local computer. So on June 20th at 7 pm we’ll be holding a small class on using Ollama to install an open source LLM on a Windows PC – please check our Classes webpage for more information and to make reservations if you’re interested!