Self-Hosting LLMs, Part 2: Deploying Ollama and Appsmith with Docker and Digital Ocean

Friday, December 27
10:00 AM to 10:30 AM EST

Participants

Joseph Petty

About the event

Large Language Models can be extremely powerful business tools, but they also carry significant risk if not used properly. A few weeks ago, Joseph covered how you can self-host your own LLM on local hardware and avoid the privacy and security risks of third-party AI services. In this workshop, we'll move that setup to our own private cloud, using Docker on a Digital Ocean Droplet.
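
To give a sense of what the deployment might look like, here is a minimal Docker Compose sketch pairing Ollama with Appsmith on a single Droplet. The service names, volume names, and port mappings are illustrative assumptions, not the exact files from the workshop:

```yaml
# docker-compose.yml (sketch) -- service and volume names are assumptions
services:
  ollama:
    image: ollama/ollama            # official Ollama image
    ports:
      - "11434:11434"               # Ollama's default API port
    volumes:
      - ollama_data:/root/.ollama   # persist downloaded models across restarts

  appsmith:
    image: appsmith/appsmith-ce     # Appsmith Community Edition
    ports:
      - "80:80"                     # Appsmith web UI
    volumes:
      - appsmith_data:/appsmith-stacks  # persist Appsmith config and data

volumes:
  ollama_data:
  appsmith_data:
```

With a file like this on the Droplet, `docker compose up -d` would start both services, and Appsmith could reach Ollama at `http://ollama:11434` over the Compose network.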