
Podcast About PDFs

Single-user web app that turns a research prompt into:

  1. A deep research report (report.md + report.pdf)
  2. A two-host podcast script
  3. A synthesized MP3 episode (female voices only)

The app is designed for VPS deployment behind HTTPS at pap.unicornfan.com.

Stack

  • FastAPI + Jinja templates (web UI)
  • Celery + Redis (background pipeline jobs)
  • PostgreSQL (episode metadata)
  • Nginx on host (TLS termination + reverse proxy)
  • Docker Compose (deployment)

Pipeline

prompt -> research blueprint -> long-form report -> PDF export -> podcast script -> TTS segments -> merged MP3
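
The stages above can be pictured as a sequential driver. This is an illustrative sketch only; the stage names are hypothetical, and the real implementation runs as Celery tasks in app/services/pipeline.py:

```python
# Hypothetical sketch of the pipeline as sequential stages.
# The actual app dispatches these as Celery background jobs.

def make_blueprint(prompt: str) -> str:
    # Stage 1: turn the research prompt into a blueprint.
    return f"blueprint for: {prompt}"

def write_report(blueprint: str) -> str:
    # Stage 2: expand the blueprint into a long-form report.
    return f"report based on {blueprint}"

# ... followed by PDF export, script writing, TTS, and MP3 merge.
STAGES = [make_blueprint, write_report]

def run_pipeline(prompt: str) -> str:
    result = prompt
    for stage in STAGES:
        result = stage(result)
    return result
```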

Female Voice Enforcement

Voice synthesis is hard-coded to female voices:

  • HOST_A -> nova
  • HOST_B -> shimmer

This is enforced in app/services/pipeline.py.
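
A minimal sketch of what such an enforcement can look like. The mapping below matches the table above, but the `ALLOWED_VOICES` guard and function name are assumptions, not the actual code in app/services/pipeline.py:

```python
# Hypothetical sketch of the hard-coded female-voice mapping.
HOST_VOICES = {"HOST_A": "nova", "HOST_B": "shimmer"}
ALLOWED_VOICES = set(HOST_VOICES.values())

def voice_for(host: str) -> str:
    voice = HOST_VOICES[host]          # KeyError for unknown hosts
    if voice not in ALLOWED_VOICES:    # defensive guard against drift
        raise ValueError(f"voice {voice!r} is not permitted")
    return voice
```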

Quick Start (Server)

  1. Clone the repository on the server.
  2. Copy the env template:
     cp .env.example .env
  3. Set the required values in .env:
     • SECRET_KEY
     • APP_PASSWORD_HASH (or APP_PASSWORD)
     • KIMI_API_KEY
     • POSTGRES_PASSWORD
     • DOMAIN / BASE_URL
  4. Generate a password hash (recommended):
     docker compose run --rm web python app/scripts/hash_password.py
  5. Start the services:
     docker compose up -d --build
  6. Check status:
     docker compose ps
     docker compose logs -f web worker
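
The repo's app/scripts/hash_password.py is the authoritative tool for generating APP_PASSWORD_HASH. As a rough illustration of what a password-hashing helper does, here is a stdlib-only sketch using PBKDF2; the salt size, iteration count, and storage format are assumptions, not the app's actual scheme:

```python
import binascii
import hashlib
import os

def hash_password(password: str, iterations: int = 200_000) -> str:
    # PBKDF2-HMAC-SHA256 with a random 16-byte salt.
    # Stored format (illustrative): salt_hex$iterations$digest_hex
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return (f"{binascii.hexlify(salt).decode()}${iterations}"
            f"${binascii.hexlify(digest).decode()}")

def verify_password(password: str, stored: str) -> bool:
    # Recompute the digest with the stored salt and iteration count.
    salt_hex, iters, digest_hex = stored.split("$")
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                 binascii.unhexlify(salt_hex), int(iters))
    return binascii.hexlify(digest).decode() == digest_hex
```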

Operations

  • Restart app stack: docker compose restart
  • Pull updates + redeploy:
    git pull
    docker compose up -d --build
  • Data persistence volumes:
    • db_data (Postgres)
    • redis_data (queue)
    • pap_data (reports, scripts, audio)

Reverse Proxy

  • docker compose up -d --build starts app + worker + db + redis.
  • Nginx on the host should proxy pap.unicornfan.com to 127.0.0.1:8000.
  • Example Nginx site block for pap.unicornfan.com:
server {
    listen 80;
    server_name pap.unicornfan.com;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
  • After DNS points pap.unicornfan.com to your VPS IP, issue a TLS certificate:
sudo certbot --nginx -d pap.unicornfan.com --redirect

Kimi Code

Set these in .env:

LLM_PROVIDER=auto
KIMI_API_KEY=your_key_here
KIMI_CODE_BASE_URL=https://api.kimi.com/coding/v1
KIMI_CODE_USER_AGENT=KimiCLI/0.77
LLM_MODEL=kimi-for-coding

LLM_PROVIDER=auto means the app uses Kimi Code automatically when KIMI_API_KEY is present.
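
That auto behavior can be pictured as a small selection function. This is a guess at the logic based on the description above, not the app's actual code:

```python
import os

def resolve_provider(env=None):
    # "auto" picks Kimi Code when KIMI_API_KEY is present,
    # otherwise falls back to an OpenAI-compatible provider (assumed default).
    env = env if env is not None else os.environ
    provider = env.get("LLM_PROVIDER", "auto")
    if provider != "auto":
        return provider
    return "kimi_code" if env.get("KIMI_API_KEY") else "openai"
```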

Local TTS Bridge (GPU Box -> VPS)

If you run TTS on a different machine (for example, a GPU box running ROCm), the VPS can call it through an SSH reverse tunnel.

  1. Run your TTS HTTP server locally on that machine at 127.0.0.1:8091, exposing POST /tts.
  2. Keep this reverse tunnel running from the same machine:
     ssh -N -R 127.0.0.1:8091:127.0.0.1:8091 aria@unicornfan.com
  3. On the VPS, set in .env:
     TTS_BACKEND=local_http
     LOCAL_TTS_ENDPOINT=http://127.0.0.1:8091/tts
     LOCAL_TTS_AUTH_TOKEN=

The app then synthesizes audio on your local machine, while orchestration and storage stay on the VPS. A reference setup for a local bridge is in local_tts_bridge/README.md.
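
On the VPS side, the call to the bridge might be assembled like this sketch. The payload fields and bearer-token header are assumptions; check local_tts_bridge/README.md for the real contract:

```python
import json
import urllib.request

def build_tts_request(endpoint: str, text: str, voice: str, token: str = ""):
    # Build a POST /tts request; the Authorization header is added
    # only when LOCAL_TTS_AUTH_TOKEN is non-empty (assumed convention).
    headers = {"Content-Type": "application/json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"
    payload = json.dumps({"text": text, "voice": voice}).encode()
    return urllib.request.Request(endpoint, data=payload,
                                  headers=headers, method="POST")
```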

Notes

  • If your provider is OpenAI-compatible, set OPENAI_BASE_URL accordingly.
  • TTS/audio generation requires ffmpeg in the container (already installed in Dockerfile).
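
For the MP3 merge step, ffmpeg's concat demuxer is a common approach. This sketch only builds the command line; the list-file and output names are illustrative, and the app's actual merge logic may differ:

```python
def ffmpeg_concat_cmd(list_file: str, output: str) -> list:
    # concat demuxer: list_file contains lines like: file 'segment_001.mp3'
    # -c copy stream-copies the segments without re-encoding.
    return ["ffmpeg", "-f", "concat", "-safe", "0",
            "-i", list_file, "-c", "copy", output]
```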