Run a fast ChatGPT-like model locally on your device. This project combines the LLaMA foundation model with an open reproduction of Stanford Alpaca, a fine-tuning of the base model to obey instructions (akin to the RLHF used to train ChatGPT), and a set of modifications to llama.cpp that add a chat interface. To get started, download the zip file corresponding to your operating system from the latest release. The weights are based on the published fine-tunes from alpaca-lora, converted back into a PyTorch checkpoint with a modified script and then quantized with llama.cpp the regular way.
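As a rough sketch of the typical workflow (the archive, binary, and weights file names below — alpaca-linux.zip, chat, and ggml-alpaca-7b-q4.bin — are assumptions for illustration; use whatever your release actually contains):

```sh
# Unpack the release archive for your operating system (name assumed)
unzip alpaca-linux.zip

# Place the quantized weights in the same folder as the chat binary
# (file name assumed), then start the interactive chat interface
./chat -m ggml-alpaca-7b-q4.bin
```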
Features
- Run a fast ChatGPT-like model locally on your device
- If you have more than 10GB of RAM, you can use the higher quality 13B model
- Combines the LLaMA foundation model with an open reproduction of Stanford Alpaca
- The weights are based on the published fine-tunes from alpaca-lora, converted back into a PyTorch checkpoint
- You can pass additional launch options on the command line (see the sketch after this list)
- You can type to the model in the terminal and it will reply
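A hedged sketch of how the larger 13B model and extra launch options might be selected. The flag names below (`-m`, `-t`, `--temp`, `-n`) follow llama.cpp conventions and the 13B weights file name is an assumption; run the binary with `-h` to see what your build actually supports:

```sh
# Assumed file and flag names (llama.cpp-style); verify with ./chat -h.
# Use the higher quality 13B weights if you have more than 10GB of RAM:
./chat -m ggml-alpaca-13b-q4.bin

# Example launch options: thread count, sampling temperature,
# and the number of tokens to generate per reply.
./chat -t 8 --temp 0.7 -n 256
```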
License
MIT License