Getting Started

SerpBear uses third-party website scrapers like ScrapingAnt, ScrapingRobot or your own proxy IPs to scrape Google search results and check whether your domain appears in the results for a given keyword.

Requirements

  • Required: A 3rd party scraping account or a proxy IP to scrape Google Search Results. You can get a free API key from either ScrapingRobot (5k/month free) or SpaceSerp (15k/month for a one-time payment of $59), or choose from other providers such as ScrapingAnt.

  • Optional: An SMTP account to receive SERP position notification emails.

  • Optional: A server to deploy the app. You can also run it locally on your PC.

Setting up and running the App

Step 1: Deploy and run the App

You can run the app locally or deploy it on fly.io (free) or your own server. Note that when you run it locally, you won't be able to access the app from your mobile devices or use the API from your data reporting apps.
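
If you just want to try the app on your own machine, running it with Docker is the quickest route. The command below is a minimal sketch: it assumes the image the project publishes on Docker Hub (towfiqi/serpbear) and an env file like the one sketched in Step 2, so verify both against the Running Locally with Docker and Environment Variables pages.

    docker run -d --name serpbear -p 3000:3000 --env-file .env towfiqi/serpbear

Once the container is up, the app is reachable at http://localhost:3000 .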

Step 2: Access your App and Login

After running the app, access it through your browser using the app URL. If you are running locally, it's http://localhost:3000 . If you have deployed it on a server, it's the value you set as NEXT_PUBLIC_APP_URL in your env file.

Once you access the URL, you will be redirected to the login page. Use the username and password set in your env file.
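
As a rough illustration, an env file for a local setup could look like the lines below. The variable names for the credentials, session secret, and API key are assumptions based on the Environment Variables page, so double-check them there, and replace every value with your own.

    USER=admin
    PASSWORD=change_this_password
    SECRET=replace_with_a_random_string
    APIKEY=replace_with_a_random_string
    NEXT_PUBLIC_APP_URL=http://localhost:3000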

Step 3: Add your First Domain

After you log in, you will be asked to add your first domain. Once you do, you will be redirected to the app's main interface.

Step 4: Setting up the Scraper

The core of the app is the scraper, which scrapes Google Search results for a given keyword. Since scraping Google directly will quickly get your IP banned, you will need to use either a third-party scraper service or private proxies.

Using 3rd party Scrapers

You can get a free API key from either ScrapingAnt (10k/month free) or ScrapingRobot (5k/month free). Once you have an API key from any of the aforementioned services, click the Settings menu in the top right corner of the app, select the service, insert your API key, and save. Then refresh the page.

Using Proxies instead

If you decide to use proxy services instead, select Proxy from the scraper list and insert your proxy IP list in the proxy field. The IPs should be formatted this way: http://username:password@IP_ADDRESS:PORT .

Example: http://eizjbatm:mbn4iyhp72la@45.148.125.184:7291

Note that the number of IPs should correlate with the number of keywords you add: if you've added 100 keywords, it's best to use at least 30 proxy IPs, and rotating proxy IPs are even better. You don't need residential IPs; datacenter IPs work fine.
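
A quick way to sanity-check a proxy before adding it is to send a request through it with curl. The address below reuses the made-up example from above; substitute your own proxy details.

    curl -x http://eizjbatm:mbn4iyhp72la@45.148.125.184:7291 -s -o /dev/null -w "%{http_code}\n" https://www.google.com

If the proxy is alive, this prints an HTTP status code instead of hanging or reporting a connection error.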

Step 5: Adding your First Keyword

Once you have set up the scraper, click the Add Keyword button, insert the keywords you want to track, select the keyword's country and device, and click Add. Once they are added, their SERPs will be fetched automatically and their position data will be updated.
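
Since the position data is also exposed through the app's API (useful for the data reporting apps mentioned in Step 1), you can pull it into your own scripts. The request below is only a sketch: the endpoint path and the authorization header are assumptions, so take the real route and header format from the API Documentation page, and use the APIKEY value from your env file.

    curl -H "Authorization: Bearer YOUR_API_KEY" "http://localhost:3000/api/keywords?domain=example.com"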

Step 6: Receiving Daily/Weekly Email Notifications of your Keywords' Position Changes

From the Settings panel, set up your SMTP details to get notified of your keywords' positions through email. You can use free SMTP services such as ElasticEmail or Sendpulse.
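
For illustration, the SMTP details you enter in the Settings panel boil down to a host, a port, a username/password, and the addresses notifications are sent from and to. The values below use ElasticEmail as an example; confirm the exact host, port, and credentials in your own ElasticEmail (or Sendpulse) dashboard.

    SMTP server:   smtp.elasticemail.com
    SMTP port:     2525
    SMTP username: the email address of your ElasticEmail account
    SMTP password: the SMTP password generated in your ElasticEmail account
    From / To:     the addresses the notification emails should be sent from and delivered to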
