Deploying SerpBear on a Server
This guide covers two ways to deploy SerpBear on a server: running directly with Node.js, or running with Docker. It also covers setting up a reverse proxy with either Nginx or Traefik to expose the app on your domain with SSL.
Prerequisites
- A server (VPS or dedicated) with SSH access.
- A domain name pointed to your server's IP address (e.g. serpbear.yourdomain.com).
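Before proceeding, it's worth confirming that the DNS record actually resolves to your server. One quick check, assuming dig and curl are installed (serpbear.yourdomain.com is a placeholder, and ifconfig.me is just one of several public IP-echo services):

```shell
# The two outputs should match: the domain's A record and this server's public IPv4.
dig +short serpbear.yourdomain.com
curl -4 -s https://ifconfig.me
```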
Option 1: Running with Node.js (Without Docker)
Use this method if you prefer running SerpBear directly with Node.js without Docker.
Step 1: Install Node.js and Git
Make sure Node.js (v18 or later) and Git are installed on your server. You can verify by running:
node -v
git --version
Step 2: Clone the Repository
git clone https://github.com/towfiqi/serpbear.git
cd serpbear
Step 3: Set Up Environment Variables
Create a .env.local file in the project root with the following content:
USER_NAME=admin
PASSWORD=your_password_here
SECRET=your_random_secret_key_here
APIKEY=your_random_api_key_here
SESSION_DURATION=24
NEXT_PUBLIC_APP_URL=https://serpbear.yourdomain.com
Replace the values with your own. The details of all environment variables can be found in the SerpBear documentation.
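SECRET and APIKEY should be long, random strings. If openssl is available on your server, one way to generate suitable values (the variable names simply mirror the .env.local file above):

```shell
# Generate random hex strings suitable for SECRET and APIKEY.
SECRET=$(openssl rand -hex 32)   # 64 hex characters
APIKEY=$(openssl rand -hex 16)   # 32 hex characters
echo "SECRET=$SECRET"
echo "APIKEY=$APIKEY"
```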
Step 4: Create the Data Directory
mkdir -p data
Step 5: Install Dependencies and Build
npm install
npm run build
Step 6: Start the App
To keep SerpBear running in the background, we'll use the PM2 process manager:
npm install -g pm2
pm2 start npm --name "serpbear" -- run start:all
pm2 save
pm2 startup
This starts the Next.js server on port 3000 together with the cron process for scheduled SERP scraping. Note that pm2 startup only prints a platform-specific command; run that printed command once so PM2 (and with it SerpBear) restarts automatically after a reboot.
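You can confirm the process is up and that the app answers locally with a quick sanity check (assuming curl is installed):

```shell
pm2 status serpbear                         # should list the process as "online"
curl -sI http://localhost:3000 | head -n 1  # expect an HTTP status line in response
```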
Updating (Node.js)
When a new version is released, pull the latest changes and rebuild:
cd serpbear
git pull
npm install
npm run build
pm2 restart serpbear
Option 2: Running with Docker
Use this method if you prefer running SerpBear inside a Docker container.
Step 1: Install Docker
Install Docker Engine by following the official Docker installation guide for your OS. Docker Compose is included with modern Docker installations.
Verify the installation:
docker --version
docker compose version
Step 2: Set Up Environment Variables
Create a new directory for SerpBear and add a .env file with your configuration:
mkdir serpbear && cd serpbear
Create a .env file with the following content:
USER=admin
PASSWORD=your_password_here
SECRET=your_random_secret_key_here
APIKEY=your_random_api_key_here
SESSION_DURATION=24
NEXT_PUBLIC_APP_URL=https://serpbear.yourdomain.com
Replace the values with your own. The details of all environment variables can be found in the SerpBear documentation. One caveat: USER is also a standard shell variable (your login name), and Docker Compose gives variables from the shell environment precedence over the .env file. If the container ends up with the wrong username, run Docker Compose with USER unset (e.g. via env -u USER) or rename the variable consistently in both files.
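Typos in variable names are a common source of silent misconfiguration here. A small sanity-check sketch (check_env_file is a hypothetical helper, not part of SerpBear) that verifies the .env file defines every variable the compose file expects:

```shell
# Hypothetical helper: verify a .env file defines every expected variable.
check_env_file() {
  required="USER PASSWORD SECRET APIKEY SESSION_DURATION NEXT_PUBLIC_APP_URL"
  missing=""
  for var in $required; do
    # Look for a "VAR=" assignment at the start of a line.
    grep -q "^${var}=" "$1" || missing="$missing $var"
  done
  if [ -n "$missing" ]; then
    echo "missing:$missing"
    return 1
  fi
  echo "ok"
}

# Check the file you just created (skipped if it doesn't exist):
if [ -f .env ]; then check_env_file .env; fi
```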
Step 3: Create the Docker Compose File
In the same directory, create a docker-compose.yaml file with the following content:
services:
  app:
    image: towfiqi/serpbear:latest
    restart: unless-stopped
    ports:
      - "${PORT:-3000}:3000"
    environment:
      - USER_NAME=${USER:-admin}
      - PASSWORD=${PASSWORD}
      - SECRET=${SECRET}
      - APIKEY=${APIKEY}
      - SESSION_DURATION=${SESSION_DURATION:-24}
      - NEXT_PUBLIC_APP_URL=${NEXT_PUBLIC_APP_URL:-http://localhost:3000}
      # Optional: Google Search Console integration
      - SEARCH_CONSOLE_CLIENT_EMAIL=${SEARCH_CONSOLE_CLIENT_EMAIL:-}
      - SEARCH_CONSOLE_PRIVATE_KEY=${SEARCH_CONSOLE_PRIVATE_KEY:-}
    volumes:
      - serpbear_data:/app/data
    healthcheck:
      test: ["CMD-SHELL", "wget -qO- http://localhost:3000 || exit 1"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 15s
volumes:
  serpbear_data:
Docker Compose will automatically read the .env file and substitute the variable values.
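To see exactly what the container will receive after substitution, you can render the fully resolved configuration before starting anything:

```shell
# Print the compose file with every ${VAR} reference substituted from .env and the shell.
docker compose config
```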
Step 4: Start the Container
docker compose up -d
The -d flag runs the container in the background. You can view the logs with:
docker compose logs -f
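Once the container has been up past the 15-second start period, the healthcheck status can be queried directly (a convenience one-liner; "app" matches the service name in the compose file above):

```shell
# Prints "healthy", "starting", or "unhealthy" for the app service.
docker inspect --format '{{.State.Health.Status}}' "$(docker compose ps -q app)"
```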
Updating (Docker)
When a new version of SerpBear is released, navigate to your serpbear directory and run:
docker compose pull
docker compose up -d
Your data is persisted in the serpbear_data Docker volume and will not be affected by updates.
Setting Up a Reverse Proxy
SerpBear runs on port 3000 by default. To serve it on your domain with SSL, you need a reverse proxy. Choose one of the options below.
Option A: Nginx
Step 1: Install Nginx
# Debian/Ubuntu
sudo apt update && sudo apt install nginx
# RHEL/CentOS/Fedora
sudo dnf install nginx
Step 2: Configure the Virtual Host
Create a new config file for SerpBear:
sudo nano /etc/nginx/sites-available/serpbear
Add the following configuration (replace serpbear.yourdomain.com with your actual domain):
server {
    listen 80;
    server_name serpbear.yourdomain.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_cache_bypass $http_upgrade;
    }
}
Enable the site and restart Nginx (on RHEL-based systems there is no sites-available directory; place the config at /etc/nginx/conf.d/serpbear.conf instead and skip the symlink step):
sudo ln -s /etc/nginx/sites-available/serpbear /etc/nginx/sites-enabled/
sudo nginx -t
sudo systemctl restart nginx
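Before adding SSL, you can confirm Nginx is proxying correctly from the server itself by sending a request with the right Host header:

```shell
# Ask Nginx (port 80) for the site; expect an HTTP status line from SerpBear.
curl -sI -H "Host: serpbear.yourdomain.com" http://127.0.0.1 | head -n 1
```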
Step 3: Add SSL with Let's Encrypt
Install Certbot and obtain an SSL certificate:
sudo apt install certbot python3-certbot-nginx # Debian/Ubuntu
sudo certbot --nginx -d serpbear.yourdomain.com
Certbot will automatically update your Nginx config to redirect HTTP to HTTPS. Verify auto-renewal is set up:
sudo certbot renew --dry-run
Step 4: Configure Firewall (Optional)
If you have a firewall enabled, allow HTTP and HTTPS traffic:
sudo ufw allow 'Nginx Full'
sudo ufw allow ssh
sudo ufw enable
Option B: Traefik
Traefik is a modern reverse proxy that integrates natively with Docker and handles SSL certificates automatically.
Step 1: Create a Docker Network
Create a shared Docker network that Traefik and SerpBear will use to communicate:
docker network create proxy
Step 2: Set Up Traefik
Create a directory for Traefik and add the configuration files:
mkdir traefik && cd traefik
Create a docker-compose.yaml file for Traefik:
services:
  traefik:
    image: traefik:v3.3
    restart: unless-stopped
    command:
      - "--providers.docker=true"
      - "--providers.docker.exposedbydefault=false"
      - "--providers.docker.network=proxy"
      - "--entryPoints.web.address=:80"
      - "--entryPoints.websecure.address=:443"
      - "--certificatesresolvers.letsencrypt.acme.httpchallenge.entrypoint=web"
      - "--certificatesresolvers.letsencrypt.acme.email=your-email@example.com"
      - "--certificatesresolvers.letsencrypt.acme.storage=/letsencrypt/acme.json"
      # Redirect all HTTP traffic to HTTPS
      - "--entryPoints.web.http.redirections.entryPoint.to=websecure"
      - "--entryPoints.web.http.redirections.entryPoint.scheme=https"
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - traefik_letsencrypt:/letsencrypt
    networks:
      - proxy
networks:
  proxy:
    external: true
volumes:
  traefik_letsencrypt:
Replace your-email@example.com with your actual email address for Let's Encrypt expiry notifications.
Start Traefik:
docker compose up -d
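To confirm Traefik started cleanly, check its status and follow the logs for ACME errors (certificate issuance itself happens later, when the first routed container appears):

```shell
docker compose ps               # the traefik service should be "Up"
docker compose logs -f traefik  # watch for ACME / Let's Encrypt errors
```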
Step 3: Configure SerpBear with Traefik
Go back to your SerpBear directory and update the docker-compose.yaml to use Traefik labels instead of publishing ports directly. Replace serpbear.yourdomain.com with your actual domain:
services:
  app:
    image: towfiqi/serpbear:latest
    restart: unless-stopped
    environment:
      - USER_NAME=${USER:-admin}
      - PASSWORD=${PASSWORD}
      - SECRET=${SECRET}
      - APIKEY=${APIKEY}
      - SESSION_DURATION=${SESSION_DURATION:-24}
      - NEXT_PUBLIC_APP_URL=https://serpbear.yourdomain.com
      - SEARCH_CONSOLE_CLIENT_EMAIL=${SEARCH_CONSOLE_CLIENT_EMAIL:-}
      - SEARCH_CONSOLE_PRIVATE_KEY=${SEARCH_CONSOLE_PRIVATE_KEY:-}
    volumes:
      - serpbear_data:/app/data
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.serpbear.rule=Host(`serpbear.yourdomain.com`)"
      - "traefik.http.routers.serpbear.entrypoints=websecure"
      - "traefik.http.routers.serpbear.tls.certresolver=letsencrypt"
      - "traefik.http.services.serpbear.loadbalancer.server.port=3000"
    networks:
      - proxy
    healthcheck:
      test: ["CMD-SHELL", "wget -qO- http://localhost:3000 || exit 1"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 15s
networks:
  proxy:
    external: true
volumes:
  serpbear_data:
Note that the ports section is removed since Traefik handles routing. Make sure NEXT_PUBLIC_APP_URL is set to your full HTTPS domain URL.
Start SerpBear:
docker compose up -d
Traefik will automatically detect the container, obtain an SSL certificate from Let's Encrypt, and route traffic from https://serpbear.yourdomain.com to SerpBear.
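As a final check, request the site over HTTPS from any machine and inspect the issued certificate (substituting your domain):

```shell
# Expect a 200 (or a redirect to the login page) over HTTPS.
curl -sI https://serpbear.yourdomain.com | head -n 1
# Show the certificate's issuer and validity dates.
echo | openssl s_client -connect serpbear.yourdomain.com:443 \
  -servername serpbear.yourdomain.com 2>/dev/null \
  | openssl x509 -noout -issuer -dates
```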