Using Open Source Models

Ready to try some AI models? The Hugging Face Hub hosts many open-source models you can use right now — no setup required.

What makes this special? You get access to the same models that power many AI applications, completely free. Whether you’re a curious beginner or an experienced developer, you can experiment with text generation, image creation, translation, and more.

New to AI models? Think of them as specialized tools: some write text, some create images, some translate.

Let’s try some models!

First, try a text model (e.g., openai/gpt-oss-120b) in the Hugging Face Playground.

Now try an image editing Space: Qwen Image Edit.

For audio, try a text-to-speech model.

Building with open-source models

Beyond trying models, you can build your own tools and applications.

🌤️ Widgets - Try models directly in their model cards for a quick overview of what each model can do.

🖥️ The Playground - Try models directly on the website. No coding required.

🔌 API Calls - Integrate models via simple HTTP requests.

🐍 Python Libraries - Download and run models locally for full control.

Start simple! Most people begin with browser widgets to understand what models can do, then move to APIs or Python as their projects grow more complex.

🌤️ Widgets - Quick Model Testing

Every model on the Hub has an interactive widget that lets you test it instantly. No account needed, no setup - just type and see what happens!

How to Use Widgets:

  1. Visit any model page, for example openai/gpt-oss-120b
  2. Look for the widget on the right side of the screen
  3. Try the example inputs first, then experiment with your own

What You Can Try: Type a prompt into a text model, upload an image to a vision model, or paste a sentence into a translation model - each widget adapts to its model's task.

Widget not working? Some models are popular and may be busy. Try again in a few minutes, or look for similar models that aren’t as crowded.

🖥️ The Playground - Advanced Experimentation

The Playground offers more controls and settings to understand and compare models.

What Makes Playgrounds Special: You can adjust settings like temperature and maximum output length, switch between models, and compare their responses side by side.

Pro Tips: Change one setting at a time and compare the outputs, so you can see exactly what each parameter does.

🔌 API Calls - Build Real Applications

Ready to integrate AI into your application? APIs let you add AI capabilities with a few lines of code.

How APIs Work:

  1. Send a request to the model with your input
  2. Get back the AI’s response from the API
  3. Use that response in your application

Simple API Example (Python):

```python
from huggingface_hub import InferenceClient

# The client talks to the hosted Inference API - no local model needed
client = InferenceClient()
response = client.text_generation(
    "Write a short story about a robot:",
    model="microsoft/DialoGPT-large",  # any hosted text-generation model id works here
)
print(response)
```

API Benefits: No downloads or local hardware needed, and the same code works across many models - just change the model id.

Go deeper: We’re moving quickly through these topics, but at the end of this chapter we’ll dive deeper into building AI applications with APIs.
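Under the hood, client libraries like the one above are just issuing HTTP requests. Here is a rough sketch of the same call using only the Python standard library - the endpoint shown is the classic serverless Inference API, the model id is illustrative, and the token is a placeholder you must replace with your own:

```python
import json
import urllib.request

# Serverless Inference API endpoint (model id is illustrative)
API_URL = "https://api-inference.huggingface.co/models/microsoft/DialoGPT-large"

def build_request(prompt: str, token: str) -> urllib.request.Request:
    """Step 1: package your input as the JSON body the API expects."""
    return urllib.request.Request(
        API_URL,
        data=json.dumps({"inputs": prompt}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

def generate(prompt: str, token: str) -> str:
    # Step 2: send the request and read back the model's response
    with urllib.request.urlopen(build_request(prompt, token)) as resp:
        # Step 3: extract the generated text for use in your application
        return json.loads(resp.read())[0]["generated_text"]

# usage: print(generate("Write a short story about a robot:", "hf_your_token"))
```

Once you see the request/response shape, swapping in a nicer client library (or another language entirely) is straightforward.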

🖥️ Local Apps

Run models on your computer without sending data to remote servers.

Why Use Local Apps: Your data never leaves your machine, there are no API costs or rate limits, and models keep working offline.

How to Get Started

To get started:

  1. Enable Local Apps in your account settings (if available).
  2. Choose a supported model from the Hub by searching for it.
  3. Select the local app from the “Use this model” dropdown on the model page.
  4. Copy and run the provided command in your terminal.

**New to local apps?** Start with LM Studio or Jan - they have user-friendly interfaces that make getting started easy!

Popular Local Apps

Llama.cpp

High-performance C/C++ library optimized for CPU inference. It uses the GGUF format for efficient model loading.

Best for: CPU-based inference, maximum performance

```bash
# Install with brew (Mac/Linux)
brew install llama.cpp

# Run a model directly from the Hub
llama-cli -hf bartowski/Llama-3.2-3B-Instruct-GGUF:Q8_0
```

LM Studio

Desktop application with an intuitive graphical interface for downloading and running local LLMs.

Best for: Beginners who want a GUI

Jan

Open-source chat application that can run offline with document chat capabilities.

Best for: Privacy-focused users

**Model formats matter.** Look for GGUF format models — they're optimized for local inference and work with most local apps. You can [browse GGUF models](https://huggingface.co/models?library=gguf).
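GGUF repositories usually ship several quantization levels of the same model (e.g. Q4_K_M, Q8_0), and you download one file rather than the whole repo. A small helper sketch for picking a file by quantization preference - the helper and the preference order are my own, and the filenames below are illustrative:

```python
from typing import Iterable, Optional

def pick_gguf(filenames: Iterable[str],
              preferred: tuple = ("Q4_K_M", "Q5_K_M", "Q8_0")) -> Optional[str]:
    """Return the first .gguf file matching the preferred quantizations."""
    ggufs = [name for name in filenames if name.endswith(".gguf")]
    for quant in preferred:
        for name in ggufs:
            if quant in name:
                return name
    # Fall back to any GGUF file if no preferred quantization is present
    return ggufs[0] if ggufs else None

# usage with huggingface_hub (repo id is illustrative):
# from huggingface_hub import HfApi, hf_hub_download
# files = HfApi().list_repo_files("bartowski/Llama-3.2-3B-Instruct-GGUF")
# path = hf_hub_download("bartowski/Llama-3.2-3B-Instruct-GGUF", pick_gguf(files))
```

Lower quantizations (Q4) are smaller and faster but slightly less accurate than higher ones (Q8), which is why a preference order is useful.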

Finding the Right Model

With many models on the Hub, here are some tips:

Browse by Task: Filter the model list by the task you care about - text generation, translation, image classification, and more.

Check the Stats: Download and like counts are a quick signal of which models the community finds useful.

Filter by Your Needs: Narrow results by license, language, or library to match your project.

Your Next Challenge

Ready to put this knowledge into practice? Pick one approach and try it this week:

  1. Widget Explorer: Try 5 different models using their widgets
  2. API Builder: Make your first API call and integrate it into a simple app
  3. Local Runner: Download a model and run it locally with Python
  4. Playground Pro: Experiment with parameters in different playgrounds

**Start small!** Pick the approach that matches your current skill level. You can always progress to more advanced methods later.

The world of open-source AI is waiting for you to explore, experiment, and create. Every expert started as a beginner - your journey starts with trying your first model!


Ready to start building? Visit huggingface.co/models and begin your AI development journey.
