
AI Helper

The goal of this project is to help set up an internal AI tool.

It can be deployed locally using Podman, or on a Kubernetes (k8s) cluster for more scalability and enterprise availability.

Docker deployment

Add the following entry to /etc/hosts:

127.0.0.1 owb.local

If required, create certificates for nginx using the following commands (adjust the -CA and -CAkey paths to point at your own root CA):

cd docker/volumes/nginx/ssl
openssl req -nodes -newkey rsa:2048 -keyout nginx.key -out nginx.csr -subj "/CN=owb.local"
echo "subjectAltName=DNS:*.owb.local,DNS:owb.local,DNS:localhost" > openssl-ext.conf
openssl x509 -req -extfile openssl-ext.conf \
    -CA /Users/francoisrisch/Documents/Docs/ca/rootCa.crt \
    -CAkey /Users/francoisrisch/Documents/Docs/ca/rootCa.key \
    -CAcreateserial -days 365 -sha256 -in nginx.csr -out nginx.crt
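If you have no local root CA, a self-signed certificate is enough for local testing. This is a sketch, not part of the repo: run it from docker/volumes/nginx/ssl instead of the CA-signed flow above (it requires OpenSSL 1.1.1+ for -addext):

```shell
# Self-signed alternative: one command creates both key and certificate,
# with the same SAN entries as the CA-signed variant
openssl req -x509 -nodes -newkey rsa:2048 \
  -keyout nginx.key -out nginx.crt \
  -days 365 -subj "/CN=owb.local" \
  -addext "subjectAltName=DNS:*.owb.local,DNS:owb.local,DNS:localhost"

# Inspect the result: subject and SAN entries should mention owb.local
openssl x509 -in nginx.crt -noout -text | grep owb.local
```

Note that a self-signed certificate has to be trusted by each browser individually, whereas with the CA-signed variant you only need to trust the root CA once.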

Create the nginx configuration file volumes/nginx/conf.d/open-webui.conf with the following content:

server {
    listen 443 ssl;
    server_name nginx;

    ssl_certificate /etc/nginx/ssl/nginx.crt;
    ssl_certificate_key /etc/nginx/ssl/nginx.key;
    ssl_protocols TLSv1.2 TLSv1.3;

    location / {
        proxy_pass http://openwebui:8080;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # (Optional) Disable proxy buffering for better streaming response from models
        proxy_buffering off;

        # (Optional) Increase max request size for large attachments and long audio messages
        client_max_body_size 20M;
        proxy_read_timeout 10m;
    }
}
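If plain-HTTP requests on port 80 should also land on the TLS listener, an additional redirect server block can be added to the same file. This is a sketch, not part of the repo; it assumes port 80 is also published for the nginx container in docker-compose:

```nginx
server {
    listen 80;
    server_name owb.local;

    # Redirect every plain-HTTP request to the HTTPS listener
    return 301 https://$host$request_uri;
}
```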

Go to the deployment/docker directory and run either the start.sh script or the command:

docker compose up -d

To stop it, run either the script stop.sh or the command:

docker compose down

Access open-webui at http://owb.local/ or directly at http://localhost:8001/.

On first login, create the user you want and keep the same user for future access.

K8s Deployment

Requirements

On the ansible controller:

  • Ansible 2.16

  • The kubernetes.core collection, installed with: ansible-galaxy collection install kubernetes.core

The node that ansible connects to requires:

  • Python >= 3.9, e.g.: yum install python3.11 python3.11-pip

  • Python packages: pip3.11 install kubernetes pyyaml jsonpatch

Note: the ansible playbook can optionally install and configure Python on the remote node for you.
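The package check above can be scripted; here is a minimal sketch (it uses python3, so substitute the versioned interpreter, e.g. python3.11, if that is what you installed):

```shell
# Report which of the required Python packages are importable on the node
# (pyyaml installs under the module name "yaml")
python3 - <<'EOF'
import importlib.util
for pkg, mod in [("kubernetes", "kubernetes"),
                 ("pyyaml", "yaml"),
                 ("jsonpatch", "jsonpatch")]:
    status = "ok" if importlib.util.find_spec(mod) else "MISSING"
    print(f"{pkg}: {status}")
EOF
```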

Launch

Example: launch a deployment on node 'mynode' and let the playbook install and configure Python on it:

./deploy-to-k8s.sh \
        --domain-name=my.domain.com \
        --node-to-run-commands=mynode \
        --node-to-run-commands-user="root" \
        --kubeconfig-file="/path/to/kubeconfig/on/mynode" \
        --helm-bin-path="/path/to/helm/binary/on/mynode" \
        --install-python="true"

On first login, create the user you want and keep the same user for future access.

About

AI tool that deploys an open-webui with all prerequisites and tools for production
