Web UI For AI DeepSeek-R1 32B Model

RUN AI PRIVATELY (Fedora + DeepSeek R1)

Live stream set for 2025-10-29 at 14:00 Eastern

Ask questions in the live chat about any programming or lifestyle topic.

The livestream will be on YouTube, or you can watch it below.

DeepSeek-R1 32B on Fedora: Launching the Local AI Web Chat (Moving Beyond the CLI)

Introduction

Following our previous exploration of the powerful DeepSeek-R1 32B Large Language Model (LLM) via the command line (as detailed here), we’re ready for the next step. While the terminal is excellent for validation, a proper web interface provides a much better and more accessible user experience.

This guide will show you how to move from that initial command-line setup to a custom, clean web application on your Fedora system. We will utilize your existing Ollama installation and leverage Python’s requests and json libraries to build a basic, framework-free chat front-end.

DeepSeek-R1 32B: Open Source Powerhouse

The DeepSeek-R1 LLM remains an impressive and crucial example of the value of open-source AI. Running it locally gives you complete control over privacy and performance.

  • Model: DeepSeek-R1 32B
  • License: The model weights for the DeepSeek-R1 series are released under the highly permissive MIT License. This allows free use, modification, distribution, and commercialization of the weights, provided the license terms (retaining the copyright and license notice) are respected.

Installation: Setting up DeepSeek-R1 32B on Fedora

Since you’ve already installed Ollama from the Fedora repositories, setting up the model and its server endpoint is straightforward.

1. Verify and Start Ollama

Ensure the Ollama service is running to expose the local API endpoint (usually on http://localhost:11434):

ollama serve &

(Run this command if your Ollama service is not already active in the background.)
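
Before wiring anything to this endpoint, it helps to confirm it actually responds. Here is a minimal Python check using the requests library, assuming the default port 11434; the root endpoint normally replies with a short status message:

import requests

# Minimal sanity check: the Ollama root endpoint normally answers with a
# short status message ("Ollama is running") when the server is up.
try:
    reply = requests.get("http://localhost:11434", timeout=5)
    reply.raise_for_status()
    print(f"Ollama says: {reply.text.strip()}")
except requests.exceptions.RequestException as error:
    print(f"Ollama does not appear to be reachable: {error}")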

2. Pull the DeepSeek-R1 32B Model

If you haven’t already, use the Ollama CLI to download the 32 Billion parameter version:

ollama pull deepseek-r1:32b

This ensures the model is downloaded and ready for your new web application to communicate with via the Ollama API.
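
You can also confirm that the model is visible to the API itself by listing the locally installed models through Ollama's /api/tags endpoint. A short Python sketch, again assuming the default local endpoint:

import requests

# Ask the Ollama API which models are installed locally (/api/tags).
tags = requests.get("http://localhost:11434/api/tags", timeout=10)
tags.raise_for_status()

installed = [model["name"] for model in tags.json().get("models", [])]
print("Installed models:", installed)
print("deepseek-r1:32b available:", "deepseek-r1:32b" in installed)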

The Web Interface: Python, HTML, and Custom Styling

We will create a simple, custom application using three files to demonstrate connecting to the local AI without relying on large web frameworks: index.html, style.css, and the Python backend logic.

Step 1: The Python Backend Logic (app.py)

This Python logic is the crucial middleman, handling the communication with the Ollama API using requests and json. Note that in a full web setup, you’d need a web server to expose this logic as an endpoint.

import requests
import json

OLLAMA_API_URL = "http://localhost:11434/api/generate"
MODEL_NAME = "deepseek-r1:32b"

def get_deepseek_response(prompt):
    """Sends a request to the local Ollama API and returns the AI's response."""
    payload = {
        "model": MODEL_NAME,
        "prompt": prompt,
        "stream": False # Request a single, complete response
    }
    headers = {'Content-Type': 'application/json'}
    
    try:
        # Send the prompt to the local Ollama endpoint
        response = requests.post(OLLAMA_API_URL, data=json.dumps(payload), headers=headers)
        response.raise_for_status() # Check for HTTP errors
        
        # Extract the response text from the JSON data
        data = response.json()
        
        if 'response' in data:
            return data['response']
        else:
            return "Error: Could not parse model response."

    except requests.exceptions.RequestException as e:
        return f"Error connecting to Ollama: {e}"

# NOTE: In a complete solution, you would set up an HTTP server 
# to call this function when the web page sends a request.
if __name__ == "__main__":
    test_prompt = "Explain why the MIT License is suitable for open-source LLMs."
    print(f"**Query:** {test_prompt}")
    answer = get_deepseek_response(test_prompt)
    print(f"**DeepSeek-R1 32B:** {answer}")

Step 2: The HTML Front-end (index.html)

This file provides the full page structure: the document head links style.css, and the body holds the chat interface. Custom JavaScript handles sending the input and displaying the AI’s response.

    <div class="chat-container">
        <h1>Local AI Chat: DeepSeek-R1 32B</h1>
        <div id="chat-box">
            <!-- Messages will appear here -->
            <div class="message user-message">Welcome! Ask your local DeepSeek-R1 32B model anything.</div>
        </div>
        <form id="chat-form">
            <input type="text" id="user-input" placeholder="Enter your prompt..." required>
            <button type="submit">Send</button>
        </form>
    </div>

    <script>
        document.getElementById('chat-form').addEventListener('submit', async function(e) {
            e.preventDefault();
            const inputField = document.getElementById('user-input');
            const chatBox = document.getElementById('chat-box');
            const prompt = inputField.value.trim();

            if (prompt === '') return;

            // 1. Display user message
            const userMsg = document.createElement('div');
            userMsg.className = 'message user-message';
            userMsg.textContent = prompt;
            chatBox.appendChild(userMsg);

            inputField.value = ''; // Clear input
            chatBox.scrollTop = chatBox.scrollHeight; // Scroll to bottom

            // 2. Display "Thinking..." message
            const thinkingMsg = document.createElement('div');
            thinkingMsg.className = 'message ai-message thinking';
            thinkingMsg.textContent = 'DeepSeek-R1 32B is thinking...';
            chatBox.appendChild(thinkingMsg);
            chatBox.scrollTop = chatBox.scrollHeight;

            try {
                // *** IMPORTANT: Replace this placeholder with your actual fetch call ***
                // You must configure your Python script (like app.py) to run as an API 
                // endpoint that the browser can contact.
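                // If app.py is exposed through a small Python server (see the
                // server sketch after app.py in Step 1), the real call could
                // look roughly like the commented example below. The '/api/chat'
                // path and the {prompt}/{response} JSON shape are assumptions
                // carried over from that sketch, not a fixed API:
                //
                // const res = await fetch('/api/chat', {
                //     method: 'POST',
                //     headers: { 'Content-Type': 'application/json' },
                //     body: JSON.stringify({ prompt: prompt })
                // });
                // const aiText = (await res.json()).response;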
                
                // Placeholder response for simple demonstration
                const aiText = await new Promise(resolve => setTimeout(() => resolve("The MIT License is great because it grants broad rights to use, copy, modify, and distribute the software with minimal restrictions, fostering an open ecosystem for LLM research and commercial deployment."), 2000));
                
                // 3. Update with AI response
                thinkingMsg.classList.remove('thinking');
                thinkingMsg.textContent = aiText;

            } catch (error) {
                thinkingMsg.classList.remove('thinking');
                thinkingMsg.textContent = 'Error: Could not get a response from the AI backend.';
                console.error('Fetch error:', error);
            }

            chatBox.scrollTop = chatBox.scrollHeight;
        });
    </script>

Step 3: Custom CSS Styling (style.css)

A simple, professional style for your chat application.

body {
    font-family: Arial, sans-serif;
    background-color: #f4f7f6;
    display: flex;
    justify-content: center;
    align-items: center;
    min-height: 100vh;
    margin: 0;
}

.chat-container {
    width: 100%;
    max-width: 700px;
    background-color: #ffffff;
    border-radius: 10px;
    box-shadow: 0 4px 15px rgba(0, 0, 0, 0.1);
    padding: 20px;
}

h1 {
    color: #333;
    text-align: center;
    border-bottom: 2px solid #eee;
    padding-bottom: 10px;
    margin-bottom: 20px;
}

#chat-box {
    height: 400px;
    overflow-y: auto;
    border: 1px solid #ddd;
    padding: 15px;
    margin-bottom: 15px;
    border-radius: 5px;
    display: flex;
    flex-direction: column;
}

.message {
    padding: 10px 15px;
    margin-bottom: 10px;
    border-radius: 15px;
    max-width: 80%;
    line-height: 1.4;
    word-wrap: break-word;
}

.user-message {
    background-color: #007bff;
    color: white;
    align-self: flex-end;
    border-bottom-right-radius: 3px;
}

.ai-message {
    background-color: #e9ecef;
    color: #333;
    align-self: flex-start;
    border-bottom-left-radius: 3px;
}

#chat-form {
    display: flex;
}

#user-input {
    flex-grow: 1;
    padding: 10px;
    border: 1px solid #ddd;
    border-radius: 5px 0 0 5px;
    font-size: 16px;
}

button[type="submit"] {
    padding: 10px 20px;
    background-color: #28a745;
    color: white;
    border: none;
    border-radius: 0 5px 5px 0;
    cursor: pointer;
    font-size: 16px;
    transition: background-color 0.3s;
}

button[type="submit"]:hover {
    background-color: #218838;
}

Screenshots and Screencast

Here’s a visual walkthrough of setting up DeepSeek-R1 32B with Ollama and the custom web UI on your local system:

Ollama loading tensors for AMD GPU
Command Line Ollama Loading Tensors For AMD Instinct Mi60.

Ollama API Endpoint
Command Line Ollama API Endpoint Ready.

Custom Ollama Web UI HTML
Gnome Text Editor Displaying Ollama Web Interface HTML File.

Custom Ollama Web UI CSS
Gnome Text Editor Displaying Ollama Web Interface CSS File.

Ollama API Direct Python Script
Gnome Text Editor Displaying Ollama Endpoint Direct API Python Script.

Python Server
Command Line Python Server.

CoolerControl Showing AMD Instinct Mi60 temperature
Web Browser Running CoolerControl Displaying AMD Instinct Mi60 Temperature And Shroud Fan RPM.

DeepSeek-R1 32B answered question about the Mayor
Command Line DeepSeek-R1 32B Answered Mayor Of Toronto Request.

DeepSeek-R1 32B answered question about PHP code
Command Line DeepSeek-R1 32B Answered PHP Code Request.

DeepSeek-R1 32B answered question about screenshot
Command Line DeepSeek-R1 32B Answered Gnome Desktop Screenshot Request.

DeepSeek-R1 32B answered request for Kotlin code
DeepSeek-R1 32B Answered Kotlin Code Request.

DeepSeek-R1 32B answered request for Blender Blend File
DeepSeek-R1 32B Answered Blender Blend File Request.

Video: Using DeepSeek-R1 32B in the Custom Web UI for the Ollama Client

Results:

Who is the mayor of Toronto?

Produced an inaccurate, outdated answer rather than the current mayor, Olivia Chow.

I need a PHP code snippet to connect to a MySQL database.

Produced a syntactically correct PHP code snippet for connecting to a MySQL database.

I need a 1080p screenshot of the gnome desktop environment.

As a text-based AI that cannot capture images, it instead accurately provided instructions for taking a 1080p screenshot of the GNOME desktop environment.

I need a kotlin code snippet to open the camera using Camera2 API and place the camera view on a TextureView.

Produced an untested Kotlin code snippet.

I need a blender blend file for fire animation.

Accurately recognized that, as a text-based AI, it cannot generate a Blender .blend file for a fire animation.

Level Up Your Python Skills

This simple web integration is just the beginning of what you can do with Python and powerful local LLMs.

Professional AI and Python Services

Need expert help setting up this powerful technology or mastering your coding skills?

  • One-on-One Tutorials: I offer dedicated one-on-one online Python tutorials tailored to your learning pace and goals: https://ojambo.com/contact
  • LLM Installation and Migration: I can professionally install the DeepSeek-R1 32B LLM, or migrate it, to your machine or server environment to ensure smooth, optimal performance. Contact me for professional AI services: https://ojamboservices.com/contact

Recommended Resources:

Disclosure: Some of the links above are referral (affiliate) links. I may earn a commission if you purchase through them - at no extra cost to you.

About Edward

Edward is a software engineer, web developer, and author dedicated to helping people achieve their personal and professional goals through actionable advice and real-world tools.

As the author of impactful books including Learning JavaScript, Learning Python, Learning PHP, Mastering Blender Python API, and the fiction title The Algorithmic Serpent, Edward writes with a focus on personal growth, entrepreneurship, and practical success strategies. His work is designed to guide, motivate, and empower.

In addition to writing, Edward offers professional services in full-stack development, database design, 1-on-1 tutoring, and consulting sessions, all tailored to help you take the next step. Whether you are launching a business, developing a brand, or leveling up your mindset, Edward will be there to support you.

Edward also offers online courses designed to deepen your learning and accelerate your progress. Explore the programming courses on languages like JavaScript, Python, and PHP to find the perfect fit for your journey.

📚 Explore His Books – Visit the Book Shop to grab your copies today.
💼 Need Support? – Learn more about Services and the ways to benefit from his expertise.
🎓 Ready to Learn? – Check out his Online Courses to turn your ideas into results.
