Last update: 16/10/2024

Unleash the Power of Unlimited AI: Build Full-Stack Apps Locally with Any LLM 🚀

Ever heard of Bolt.new? It’s an amazing open-source platform that lets you build full-stack apps right in your browser with the help of AI. It’s like having a supercharged coding buddy!

But what if you could make Bolt.new even BETTER? 🤔 That’s exactly what this guide will help you do! We’ll explore how to unlock the full potential of Bolt.new by running it locally on your computer.

Why This Matters to You 💡

Imagine building apps with the power of AI, without any usage limits and with your favorite LLM! 🤩 This means:

  • Cost Savings: Say goodbye to token limits and subscription fees!
  • Unlimited Creativity: Experiment with any LLM, even those fine-tuned for coding.
  • Total Control: Run everything on your own machine for enhanced privacy and customization.

Ready to dive in? Let’s go!

Breaking Free: Running Bolt.new Locally 🧰

Here’s the secret sauce: Bolt.new is open-source! This means you can download the code and run it on your own computer. Here’s how:

  1. Get the Code: Download the enhanced Bolt.new code from the GitHub repository linked in the resources section. This version is specially modified to support multiple LLMs.
  2. Install Dependencies: Make sure you have Node.js and npm installed on your machine. Then, navigate to the project directory and run npm install to install all the necessary packages.
  3. Set Up Environment Variables: You’ll need API keys for the LLMs you want to use. Don’t worry, the repository provides a clear guide on how to set these up.
  4. Launch Bolt.new: Once everything is set up, run npm run dev and open http://localhost:3000 in your browser. Voilà! You’ve got Bolt.new running locally!
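In a typical Node.js workflow, the four steps above look roughly like this on the command line. The repository URL is a placeholder (use the link from the Resource Toolbox below), and the environment variable names are illustrative examples only; check the repository's own guide for the exact keys it expects:

```shell
# 1. Get the code — placeholder URL, substitute the repo linked in the resources
git clone https://github.com/<user>/<bolt-new-fork>.git
cd <bolt-new-fork>

# 2. Install dependencies (requires Node.js and npm)
npm install

# 3. Set up environment variables — variable names below are examples;
#    follow the repository's setup guide for the real ones
cp .env.example .env
# then edit .env and add keys for the LLM providers you want, e.g.:
#   OPENAI_API_KEY=...
#   OLLAMA_API_BASE_URL=http://localhost:11434

# 4. Launch the dev server, then open http://localhost:3000 in your browser
npm run dev
```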

The Magic of Multiple LLMs ✨

The real game-changer? This enhanced version lets you choose from a variety of powerful LLMs, including local models powered by Ollama. This means:

  • Specialized Coding Assistance: Use LLMs like Qwen 2.5 Coder that are trained specifically on code, making your development process smoother and more efficient.
  • Experiment and Discover: Try out different LLMs to see which one best suits your coding style and project needs.
  • Always Evolving: Easily integrate new and upcoming LLMs as they emerge, ensuring you’re always at the forefront of AI-powered development.

Supercharge Your Workflow with Ollama 💪

Ollama lets you run large language models locally, opening up a world of possibilities:

  • Download and Experiment: Explore a vast library of open-source LLMs, including those optimized for coding.
  • Fine-Tune for Your Needs: Train LLMs on your own datasets to create specialized coding assistants tailored to your specific requirements.
  • Cost-Effective Powerhouse: Run powerful LLMs on your own hardware, eliminating the need for expensive cloud services.
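As a quick sketch of that workflow: once Ollama is installed, pulling and running a code-focused model takes two commands. The model tag below is one example; browse the Ollama model library for alternatives that fit your hardware:

```shell
# Pull an open-source coding model (example tag from the Ollama library)
ollama pull qwen2.5-coder:7b

# Chat with it directly from the terminal
ollama run qwen2.5-coder:7b "Write a TypeScript function that debounces a callback"
```

Ollama also exposes a local HTTP API (http://localhost:11434 by default), which is what a locally running Bolt.new can point at instead of a paid cloud endpoint.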

Ready to Build the Future? Let’s Get Started! 🚀

This is just the beginning! With Bolt.new running locally and a universe of LLMs at your fingertips, you have the power to create amazing applications. Explore the resources below to dive deeper:

Resource Toolbox 🧰

Remember, the future of development is here, and it’s powered by AI. Embrace the possibilities, experiment, and build something extraordinary!