SerpBear uses third party website scrapers like ScrapingAnt, ScrapingRobot, or your own proxy IPs to scrape Google search results and check whether your domain appears in the results for a given keyword.
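Conceptually, the rank check boils down to scanning the fetched result URLs for your domain. A minimal sketch of that idea (this is illustrative, not SerpBear's actual code; the function name and sample data are made up):

```python
from urllib.parse import urlparse

def serp_position(domain: str, result_urls: list[str]) -> int:
    """Return the 1-based position of `domain` in a list of
    search-result URLs, or 0 if it does not appear."""
    for position, url in enumerate(result_urls, start=1):
        host = urlparse(url).netloc.lower()
        # Match the bare domain as well as any subdomain (www., blog., ...)
        if host == domain or host.endswith("." + domain):
            return position
    return 0

results = [
    "https://example.org/some-page",
    "https://www.mysite.com/blog/post",
    "https://other.net/",
]
print(serp_position("mysite.com", results))  # -> 2
```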
- Required: A 3rd party scraping account or proxy IPs to scrape Google search results. You can get a free API key from ScrapingAnt (10k requests/month free) or ScrapingRobot (5k/month free), or use SpaceSerp (15k/month for a one-time payment of $59).
- Optional: An SMTP account to receive SERP position notification emails.
- Optional: A server to deploy the app. You can also run it locally on your PC.
Step 1: Deploy & Run the App
You can run the app locally, deploy it on fly.io (free), or host it on your own server. Note that when running locally, you won't be able to access the app from a mobile device or use the API from your data-reporting apps.
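If you deploy with Docker, a compose file along these lines works (the image name and environment variables here are my best guess; check the SerpBear repository for the current ones):

```yaml
version: "3"
services:
  serpbear:
    image: towfiqi/serpbear     # verify the image name in the repo
    ports:
      - "3000:3000"
    environment:
      - USER=admin                  # login username
      - PASSWORD=changeme           # login password
      - SECRET=replace_with_a_random_string
      - APIKEY=replace_with_a_random_api_key
      - NEXT_PUBLIC_APP_URL=http://localhost:3000
    volumes:
      - ./data:/app/data            # persist the keyword database
```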
Step 2: Access your App and Login
After running the app, access it through your browser at the app URL. If you are running locally, it's http://localhost:3000. If you have deployed it on a server, it's the value you set as NEXT_PUBLIC_APP_URL in your env file.
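A minimal env file might look like this (all values are placeholders; the variable names follow the SerpBear conventions as I understand them, so verify against the repo's sample env file):

```
USER=admin
PASSWORD=changeme
SECRET=replace_with_a_random_string
APIKEY=replace_with_a_random_api_key
NEXT_PUBLIC_APP_URL=https://serpbear.example.com
```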
Once you access the URL, you will be redirected to the login page. Use the username and password set in your env file.
SerpBear Login page
Step 3: Add your First domain
Once you log in, you will be asked to add your first domain. After you do, you will be redirected to the app's main interface.
Step 4: Setting up the Scraper
The core of the app is the scraper, which scrapes Google search results for a given keyword. Since scraping Google directly will quickly get your IP banned, you will need either a 3rd party scraping service or private proxies.
Using 3rd party Scrapers
You can get a free API key from either ScrapingAnt (10k requests/month free) or ScrapingRobot (5k/month free). Once you have an API key from one of these services, click the Settings menu in the top right corner of the app, select the service, insert your API key, and save. Then refresh the page.
Using Proxies instead
If you have decided to use proxies, select Proxy from the scraper list, then insert your proxy IP list in the proxy field. The IPs should be formatted this way:
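The original text doesn't show the format, but the widely used convention for proxy lists, and most likely what's expected here, is one proxy URL per line (check the field's placeholder text in the app to confirm):

```
http://username:password@123.45.67.89:8080
http://username:password@98.76.54.32:3128
http://11.22.33.44:8000
```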
Note that the number of IPs should scale with the number of keywords you add. If you've added 100 keywords, it's best to use at least 30 IPs, and better still to use rotating proxy IPs. And no, you don't need residential IPs; datacenter IPs work fine.
Step 5: Adding your First Keyword
Once you have set up the scraper, click the Add Keyword button, insert the keywords you want to track, select the keyword's country and device, and click Add. Once added, their SERPs will be fetched automatically and their position data updated.
Step 6: Receiving Daily/Weekly Email Notification of your Keyword's Position Changes
From the Settings panel, set up SMTP details to get notified of your keywords' positions through email. You can use free SMTP services such as ElasticEmail or Sendpulse.
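As an example, ElasticEmail's SMTP endpoint is typically smtp.elasticemail.com on port 2525 (verify the host, port, and credentials in your ElasticEmail dashboard; the field names below are illustrative):

```
SMTP Server:   smtp.elasticemail.com
SMTP Port:     2525
SMTP Username: your ElasticEmail account email
SMTP Password: the SMTP password generated in the dashboard
From Email:    a verified sender address
Notify Email:  the address that should receive the reports
```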