Ever heard of Bolt.new? It’s an amazing open-source platform that lets you build full-stack apps right in your browser with the help of AI. It’s like having a supercharged coding buddy!
But what if you could make Bolt.new even BETTER? 🤔 That’s exactly what this guide will help you do! We’ll explore how to unlock the full potential of Bolt.new by running it locally on your computer.
Why This Matters to You 💡
Imagine building apps with the power of AI, without any usage limits and with your favorite LLM! 🤩 This means:
- Cost Savings: Say goodbye to token limits and subscription fees!
- Unlimited Creativity: Experiment with any LLM, even those fine-tuned for coding.
- Total Control: Run everything on your own machine for enhanced privacy and customization.
Ready to dive in? Let’s go!
Breaking Free: Running Bolt.new Locally 🧰
Here’s the secret sauce: Bolt.new is open-source! This means you can download the code and run it on your own computer. Here’s how:
- Get the Code: Download the enhanced Bolt.new code from the GitHub repository linked in the resources section. This version is specially modified to support multiple LLMs.
- Install Dependencies: Make sure you have Node.js and npm installed on your machine. Then navigate to the project directory and run `npm install` to install all the necessary packages.
- Set Up Environment Variables: You’ll need API keys for the LLMs you want to use. Don’t worry, the repository provides a clear guide on how to set these up (and the sketch after this list shows how those keys typically get wired in).
- Launch Bolt.new: Once everything is set up, run `npm run dev` and open http://localhost:3000 (or whatever URL the dev server prints) in your browser. Voilà! You’ve got Bolt.new running locally!
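Want to see how those environment variables might come together? Here’s a minimal sketch (not the repository’s actual code!) of how an app like this could pick an LLM provider from env vars using the Vercel AI SDK linked in the resources below. The variable names (`OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `OLLAMA_API_BASE_URL`) and the `pickModel` helper are illustrative assumptions; check the repo’s own setup guide for the real names.

```typescript
// Illustrative sketch only: selecting an LLM provider from env vars
// with the Vercel AI SDK. The repo's actual wiring may differ.
import { createOpenAI } from '@ai-sdk/openai';
import { createAnthropic } from '@ai-sdk/anthropic';
import { createOllama } from 'ollama-ai-provider'; // community Ollama provider
import type { LanguageModel } from 'ai';

export function pickModel(): LanguageModel {
  // Variable names are assumptions; check the repo's setup guide.
  if (process.env.OPENAI_API_KEY) {
    const openai = createOpenAI({ apiKey: process.env.OPENAI_API_KEY });
    return openai('gpt-4o');
  }
  if (process.env.ANTHROPIC_API_KEY) {
    const anthropic = createAnthropic({ apiKey: process.env.ANTHROPIC_API_KEY });
    return anthropic('claude-3-5-sonnet-20240620');
  }
  // No cloud key? Fall back to a local Ollama server.
  // If baseURL is undefined, the provider uses its own default
  // (a local Ollama install listens on port 11434).
  const ollama = createOllama({ baseURL: process.env.OLLAMA_API_BASE_URL });
  return ollama('qwen2.5-coder');
}
```

The idea: whichever key you supply decides which provider the app talks to, with a local Ollama server as the no-key fallback.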
The Magic of Multiple LLMs ✨
The real game-changer? This enhanced version lets you choose from a variety of powerful LLMs, including local models powered by Ollama. This means:
- Specialized Coding Assistance: Use LLMs like Qwen 2.5 Coder that are specifically trained on code, making your development process smoother and more efficient.
- Experiment and Discover: Try out different LLMs to see which one best suits your coding style and project needs (the streaming sketch after this list shows how simple swapping models can be).
- Always Evolving: Easily integrate new and upcoming LLMs as they emerge, ensuring you’re always at the forefront of AI-powered development.
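To make “pick your LLM” concrete, here’s a small sketch of streaming a code-generation prompt through whichever model was selected, using the Vercel AI SDK’s `streamText`. The `pickModel` helper is the hypothetical one from the sketch above, not part of the actual repo.

```typescript
// Sketch: stream a completion from whichever LLM was selected.
// `pickModel` is the hypothetical helper from the earlier sketch.
import { streamText } from 'ai';
import { pickModel } from './pick-model';

async function main() {
  const result = await streamText({
    model: pickModel(),
    prompt: 'Write a TypeScript function that debounces another function.',
  });

  // Print tokens as they arrive, the same way Bolt streams code into the editor.
  for await (const chunk of result.textStream) {
    process.stdout.write(chunk);
  }
}

main().catch(console.error);
```

Streaming is the point here: it’s what lets generated code appear token by token instead of all at once, no matter which provider is behind the model.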
Supercharge Your Workflow with Ollama 💪
Ollama lets you run large language models locally, opening up a world of possibilities (there’s a quick API sketch right after this list):
- Download and Experiment: Explore a vast library of open-source LLMs, including those optimized for coding.
- Fine-Tune for Your Needs: Train LLMs on your own datasets to create specialized coding assistants tailored to your specific requirements.
- Cost-Effective Powerhouse: Run powerful LLMs on your own hardware, eliminating the need for expensive cloud services.
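And if you’d rather skip SDKs entirely, Ollama exposes a simple REST API on your own machine. Here’s a minimal sketch that assumes you’ve already pulled a model (e.g. `ollama pull qwen2.5-coder`) and that the server is on its default port, 11434:

```typescript
// Sketch: talk to a local Ollama server directly over its REST API.
// Assumes `ollama pull qwen2.5-coder` has been run and the server
// is listening on its default port, 11434.
async function askOllama(prompt: string): Promise<string> {
  const res = await fetch('http://localhost:11434/api/generate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'qwen2.5-coder',
      prompt,
      stream: false, // request a single JSON response instead of a stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama error: ${res.status}`);
  const data = (await res.json()) as { response: string };
  return data.response;
}

askOllama('Explain what a JavaScript closure is in one paragraph.')
  .then(console.log)
  .catch(console.error);
```

Everything stays on your hardware: no API keys, no per-token bills, just a local HTTP call.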
Ready to Build the Future? Let’s Get Started! 🚀
This is just the beginning! With Bolt.new running locally and a universe of LLMs at your fingertips, you have the power to create amazing applications. Explore the resources below to dive deeper:
Resource Toolbox 🧰
- Enhanced Bolt.new Repository: Download the modified code here: https://github.com/coleam00/bolt.new-any-llm
- Ollama Website: Get started with Ollama and download local LLMs: https://ollama.com/
- Vercel AI SDK Documentation: Learn more about integrating different LLM providers: https://sdk.vercel.ai/docs/foundations/providers-and-models
- Original Bolt.new Repository: Explore the original Bolt.new project: https://github.com/stackblitz/bolt.new
Remember, the future of development is here, and it’s powered by AI. Embrace the possibilities, experiment, and build something extraordinary!