
Choosing a Model

You can browse a list of models that are compatible with Backyard Desktop on the "Manage Models" page.

Each model has been tested by our team, and the file can be downloaded directly to your hard drive from within the app. This list is updated frequently with new models created by the open source community.

[Image: Model Manager]

File Formats

Backyard AI works with GGUF files. Each model displayed in the Model Manager is a GGUF file.

Other file types, such as FP16, GPTQ, and AWQ, are not supported.
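If you obtain a model file from somewhere other than the in-app list, you can sanity-check that it is really a GGUF file by looking at its header: GGUF files begin with the four ASCII bytes "GGUF". The Python sketch below assumes a placeholder file path and is not part of Backyard AI itself.

```python
# Minimal sketch: verify that a downloaded file is a GGUF container by
# checking its 4-byte magic header ("GGUF"). The path below is a placeholder.
from pathlib import Path

def looks_like_gguf(path: Path) -> bool:
    """Return True if the file starts with the GGUF magic bytes."""
    with open(path, "rb") as f:
        return f.read(4) == b"GGUF"

model_path = Path.home() / "models" / "example-model.Q4_K_M.gguf"  # placeholder
print(looks_like_gguf(model_path))
```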

Quantization

GGUF model files are quantized, meaning their weights are stored at reduced numerical precision so the file is much smaller than the original model. This allows them to run on consumer hardware (such as your computer). There is a trade-off between quantization level and output quality.

GGUF files ending in "Q4_K_M" or "Q5_K_M" are currently recommended for most users. The "Q4" indicates the quantization level (roughly 4 bits per weight), the "K" denotes that it is a "k-quant," and the "M" denotes that it is a medium-size k-quant.
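The quantization tag is usually embedded in the filename (for example, model-name.Q4_K_M.gguf). As a rough illustration, the sketch below pulls that tag out of a filename; the filenames shown are only examples, not specific Backyard AI models, and real files may follow other naming conventions.

```python
# Minimal sketch: extract the quantization suffix (e.g. "Q4_K_M") from a
# GGUF filename. Filenames here are illustrative placeholders.
import re

QUANT_PATTERN = re.compile(r"(Q\d+_K_[SML]|Q\d+_\d+|Q\d+_K)", re.IGNORECASE)

def quant_of(filename: str) -> str | None:
    """Return the quantization tag embedded in a GGUF filename, if any."""
    match = QUANT_PATTERN.search(filename)
    return match.group(1).upper() if match else None

print(quant_of("mythomax-l2-13b.Q4_K_M.gguf"))  # -> "Q4_K_M"
print(quant_of("llama-2-7b.Q5_K_M.gguf"))       # -> "Q5_K_M"
```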

Parameter Size (8B, 13B, 70B, etc.)

The "B" number represents the number of parameters within the base model. If quantization is like changing the resolution of an image, then parameters are the number of colors in the image; 70B is HDR and 3B is a gif from the 90s. Backyard AI will give you guidance on which number of parameters to use.

[Image: Model Quants]
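A rough rule of thumb for relating parameter count and quantization level to download size is: file size ≈ parameter count × bits per weight ÷ 8. The bits-per-weight values in the sketch below are approximate ballpark figures, not exact numbers for any particular model or quant.

```python
# Rough sketch: estimate the on-disk size of a quantized model from its
# parameter count and an approximate bits-per-weight figure.
# These are ballpark numbers, not exact file sizes.

APPROX_BITS_PER_WEIGHT = {
    "Q4_K_M": 4.8,  # approximate
    "Q5_K_M": 5.7,  # approximate
    "Q8_0": 8.5,    # approximate
}

def estimated_size_gb(params_billions: float, quant: str) -> float:
    """Approximate file size in gigabytes (decimal GB)."""
    total_bits = params_billions * 1e9 * APPROX_BITS_PER_WEIGHT[quant]
    return total_bits / 8 / 1e9

for params in (8, 13, 70):
    print(f"{params}B at Q4_K_M: ~{estimated_size_gb(params, 'Q4_K_M'):.1f} GB")
```

By this estimate, an 8B model at Q4_K_M is roughly 5 GB on disk, a 13B model roughly 8 GB, and a 70B model roughly 42 GB.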

Size vs. Perplexity Tradeoff

Perplexity is a metric that measures the quality of a model's text generations; a lower perplexity value is better. Here is a chart comparing the perplexity of different Llama 2 model quantizations and parameter sizes:

[Chart: Perplexity vs. Size]
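For reference, perplexity is computed from a model's average per-token negative log-likelihood on a test text: perplexity = exp(mean NLL). The token probabilities in the sketch below are made up purely to illustrate the formula.

```python
# Minimal sketch of how perplexity relates to per-token probabilities:
# perplexity = exp(average negative log-likelihood). The probabilities
# below are hypothetical values used only to demonstrate the math.
import math

token_probs = [0.25, 0.10, 0.40, 0.05, 0.30]  # hypothetical P(token | context)

nll = [-math.log(p) for p in token_probs]   # negative log-likelihood per token
perplexity = math.exp(sum(nll) / len(nll))  # lower is better

print(f"perplexity = {perplexity:.2f}")
```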
