ChatGPT Web Mirror using API via Docker

Personal use only. Do not deploy it for commercial purposes.

This is a ChatGPT web mirror based on the gpt-3.5-turbo model.

  • Download the repo
git clone https://github.com/yuezk/chatgpt-mirror.git
  • Create .env file
cd chatgpt-mirror
nano .env

The content is quite simple:

OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
# optional, support http or socks proxy
# HTTP_PROXY=http://proxy-server:port

Note: If you don’t know how to get access to the OpenAI API, please refer to Implement Cloudflare WARP Native IPv4/IPv6 Dual-Stack Networking to Linux Cloud Servers.
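
If you want to confirm the key actually works before building anything, a minimal check against the OpenAI models endpoint (assuming curl is installed; the first line simply exports the variables from .env into the current shell):

# export the variables from .env, then query the models endpoint
export $(grep -v '^#' .env | xargs)
curl -s https://api.openai.com/v1/models -H "Authorization: Bearer $OPENAI_API_KEY" | head

A JSON list of models means the key is valid; an error object usually means a wrong key or a network path that cannot reach api.openai.com.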

  • Copy and modify the configuration file
cp config/example.json config/app.config.json
nano config/app.config.json

Comments are not allowed in JSON files, so you have to delete them (the two lines starting with // below). Also, feel free to change the value of maxTokens.

{
  "openai": {
    "systemMessage": "You are ChatGPT, a large language model trained by OpenAI. Answer as concisely as possible",
    "maxTokens": 4000,
    // Currently, only `gpt-3.5-turbo` and `gpt-3.5-turbo-0301` are supported.
    // default: `gpt-3.5-turbo`
    "model": "gpt-3.5-turbo",
    "errorMapping1": [
      {
        "keyword": "insufficient_quota",
        "message": "The API key has insufficient quota."
      },
      {
        "keyword": "Rate limit reached for",
        "message": "The API key has reached its rate limit."
      },
      {
        "keyword": "context_length_exceeded",
        "message": "The context length exceeds the maximum allowed length."
      }
    ]
  }
}
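
A stray comment or trailing comma will make the JSON unparseable and the app will fail at startup, so it is worth validating the edited file. One quick way, assuming python3 is available on the server:

# pretty-prints the file if it parses, otherwise reports the offending line
python3 -m json.tool config/app.config.json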
  • Create docker-compose.yml
    In your app root directory:
nano docker-compose.yml

My configuration, for example:

version: "3.8"

services:
  chatgpt-mirror:
    image: chatgpt-mirror:latest
    container_name: chatgpt-mirror
    restart: unless-stopped
    env_file:
      - .env
    volumes:
      - ./config/app.config.json:/app/config/app.config.json
    ports:
      - 127.0.0.1:<your_port>:3000
  • Build the Docker image
docker build -t chatgpt-mirror .
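
If the build succeeds, the image should now show up locally:

# list locally built images for this repository
docker images chatgpt-mirror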
  • Bring up the container
docker-compose up -d
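
Before putting Nginx in front of it, you can sanity-check that the container is running and answering on the loopback port you chose (replace <your_port> with the host port from docker-compose.yml):

# container status, recent logs, and a HEAD request against the published port
docker-compose ps
docker-compose logs --tail 20 chatgpt-mirror
curl -I http://127.0.0.1:<your_port>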
  • Configure Nginx
nano /etc/nginx/chatgpt_mirror.conf

My conf, for example (note that the port in the listen directive must be different from the Docker host port that proxy_pass points to, even though both appear as <your_port> placeholders):

server {
  listen 127.0.0.1:<your_port> ssl http2;

  ssl_certificate       <your_cert_absolute_path>/fullchain.pem;
  ssl_certificate_key   <your_cert_absolute_path>/privkey.pem;
  ssl_session_timeout 1d;
  ssl_session_cache shared:MozSSL:10m;
  ssl_session_tickets off;

  ssl_protocols         TLSv1.2 TLSv1.3;
  ssl_ciphers           ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384;
  ssl_prefer_server_ciphers off;

  server_name           your.domain.com;
  add_header Content-Security-Policy upgrade-insecure-requests;
  location / {
    proxy_redirect off;
    proxy_pass http://127.0.0.1:<your_port>; 
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_next_upstream off;
  }
}
  • Check and restart Nginx
nginx -t
systemctl restart nginx
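
Since this server block only listens on 127.0.0.1, you can exercise the full TLS-plus-proxy chain from the box itself by pinning the domain to the loopback address with curl's --resolve option (adjust the port to whatever Nginx listens on):

# send a HEAD request through Nginx, resolving the domain to localhost
curl -I --resolve your.domain.com:<your_port>:127.0.0.1 https://your.domain.com:<your_port>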

Enjoy!


Copyright statement: Unless otherwise stated, all articles on this blog are licensed under CC BY-NC-SA 4.0. For non-commercial reprints and citations, please credit the author (Henry) and the original article URL. For commercial reprints, please contact the author for authorization.