Keeping tabs on your competitors’ prices is a pain if you do it by hand. Maybe you sell on Shopify, run a DTC brand, or manage an ecommerce team. If you’ve ever spent your morning copying prices into spreadsheets, you know: it’s tedious, and it breaks fast. This guide is for you if you want a straightforward, moderately technical way to automate price monitoring—without spending your weekend learning Python from scratch or getting locked into some “AI-powered” SaaS black box.
We’ll use Apify, a platform that lets you run and schedule web scrapers. You don’t need to be a coder, but basic logic, patience, and attention to detail help a lot. I’ll walk you through the real steps, flag the gotchas, and show you what’s worth automating—and what’s best left alone.
Let’s get your competitor price tracking out of spreadsheets and into autopilot.
Step 1: Get Clear About What You Need
Before you touch Apify or any tool, figure out exactly:
- Which competitor sites or products do you want to track? List the URLs or product SKUs.
- How often do you need fresh data? Hourly? Daily? Weekly? More isn’t always better—don’t create a data firehose you can’t use.
- What data fields matter? Usually, it’s product name, price, maybe stock status or sale flags.
- How will you use the data? Will you just check it, or do you need to trigger pricing actions?
Pro tip: Don’t try to track 1000 products across 30 sites to start. Pick 1-3 competitors and a handful of SKUs. Get that working, then scale up.
Step 2: Create Your Apify Account
Head to Apify and sign up. The free tier is decent for getting started. You’ll get:
- Access to the Apify Console (your dashboard)
- Free monthly credits for running scrapers (“Actors” in Apify-speak)
- The ability to schedule, store, and export runs
What to skip: Don’t get sucked into the paid plans or marketplace actors just yet. You can do a lot for free, and most price tracking doesn’t need enterprise bells and whistles.
Step 3: Pick (or Build) a Scraper
Now to the meat of the job: scraping competitor websites for price data.
Option A: Use an Existing Actor
Apify Store (the platform’s marketplace) has ready-made scrapers (“Actors”) for big sites like Amazon, eBay, and Walmart. Just search for the site you want. If you’re lucky, someone’s already built and shared what you need.
- Click “Actors” in the sidebar, then search for your target site.
- Check the reviews, last update date, and docs.
- Try running the Actor with a sample input (usually a product URL).
What works: Marketplace Actors save a ton of time—if they exist and are maintained.
What doesn’t: For niche ecommerce sites, or if the site layout changes often, you’ll probably have to build your own.
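If you’d rather trigger a marketplace Actor from a script than click “Run” in the Console, Apify’s official JavaScript client (apify-client on npm) can do it. A minimal sketch; the Actor name and input fields below are placeholders, so check the real Actor’s README for its actual input schema:

```javascript
// Sketch: run a marketplace Actor from Node.js with apify-client.
// "someauthor/some-price-scraper" and the input fields are placeholders --
// every Actor documents its own input schema in its README.
const { ApifyClient } = require('apify-client');

const client = new ApifyClient({ token: process.env.APIFY_TOKEN });

async function main() {
  // call() starts the Actor run and waits for it to finish
  const run = await client.actor('someauthor/some-price-scraper').call({
    productUrls: ['https://www.example.com/product/12345'],
  });
  console.log(`Run ${run.id} finished with status ${run.status}`);
}

main().catch(console.error);
```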
Option B: Create a Custom Scraper with Apify’s Web Scraper
If there’s no ready-made Actor, use Apify’s generic “Web Scraper” Actor. It’s mostly point-and-configure, with a small JavaScript page function for anything custom:
- In the Console, click “Actors” → “+ New” → “Web Scraper.”
- Paste in a product page URL.
- Use the visual selector to click and highlight the price, product name, and whatever else you want. The tool will figure out the selectors for you.
- Test the scraper. Does it pull the right data? If not, tweak the selectors.
What to watch for:
- Many modern ecommerce sites use JavaScript to load prices. Apify’s scrapers handle this pretty well, but sometimes you’ll need to tweak timeouts or add “wait for element” steps.
- If the price is inside a popup, tab, or hidden element, scraping it gets trickier.
If you know basic JavaScript: You can add logic to handle tricky pages, loops, or pagination. But don’t overcomplicate it if you’re new.
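For a sense of scale, here’s roughly what a Web Scraper page function looks like once your selectors are filled in. Treat it as a sketch, not a drop-in: the CSS selectors (.product-title, .product-price) are placeholders you’d swap for whatever your target site actually uses.

```javascript
// Sketch of a Web Scraper pageFunction. Selector names are placeholders;
// replace them with the ones you found for your target site.
async function pageFunction(context) {
  const { request, log, jQuery } = context;
  const $ = jQuery;

  // If the price is rendered by JavaScript, wait for the element to appear
  // before reading the DOM (this is the "wait for element" step mentioned above).
  await context.waitFor('.product-price');

  const name = $('.product-title').first().text().trim();
  const price = $('.product-price').first().text().trim();

  log.info(`Scraped ${request.url}: ${name} at ${price}`);

  // Whatever you return here becomes one row in the run's dataset
  return {
    url: request.url,
    name,
    price,
    scrapedAt: new Date().toISOString(),
  };
}
```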
Step 4: Set Up Your Input List
Now you need to tell your scraper which product pages to visit.
- Most Actors accept a list of URLs as input. Paste in your competitor’s product page links.
- For bulk tracking, keep your URLs in a Google Sheet or CSV, then upload or copy-paste them into Apify as needed.
Pro tip: Start with 3-5 products. Confirm it works. Then scale up.
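For reference, the Web Scraper’s input is just a JSON object with a startUrls array. The URLs below are made up; paste in your real product pages:

```json
{
  "startUrls": [
    { "url": "https://competitor-a.example.com/products/widget-pro" },
    { "url": "https://competitor-a.example.com/products/widget-mini" },
    { "url": "https://competitor-b.example.com/item/98765" }
  ]
}
```

Once you’re tracking dozens of products, generating this JSON from your Google Sheet or CSV is a five-minute script.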
Step 5: Run a Manual Test
Never schedule anything before you test it manually.
- Hit “Run” in Apify.
- Wait for it to finish (could be seconds or minutes depending on site and your settings).
- Open the output data (JSON, CSV, or table view).
- Check: Did it grab the right product, right price, right currency? Did it miss anything or pull junk data?
What to do if it fails:
- Double-check your selectors.
- Make sure you’re scraping the actual price, not a “from” or “list” price.
- Some sites block scrapers—if you get blocked, try slowing your scraper down or running less often.
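If you’d rather not eyeball every row, a tiny script can flag the obvious junk. A hypothetical check, assuming you downloaded the run’s output as dataset.json and your fields are named name, price, and url (match them to whatever your scraper actually returns):

```javascript
// Hypothetical sanity check over an exported dataset.json.
// Field names (name, price, url) must match what your scraper returns.
const fs = require('fs');

const items = JSON.parse(fs.readFileSync('dataset.json', 'utf8'));

let suspicious = 0;
for (const item of items) {
  // Strip currency symbols and thousands separators before parsing
  const price = parseFloat(String(item.price).replace(/[^0-9.]/g, ''));
  if (!item.name || !Number.isFinite(price) || price <= 0) {
    suspicious += 1;
    console.warn('Suspicious row:', item.url, item.price);
  }
}
console.log(`Checked ${items.length} rows, ${suspicious} look off.`);
```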
Step 6: Schedule Automated Runs
Once you’re confident your scraper works, automate it:
- In Apify, use the “Schedule” feature on your Actor.
- Choose how often to run (daily is plenty for most).
- Set up email or Slack notifications if a run fails.
Honest take: Don’t run scrapers every 10 minutes unless you absolutely need to. It’ll chew up your Apify credits and make it far more likely the target site blocks you. Once or twice a day is enough for most ecommerce price monitoring.
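Apify schedules are defined with standard cron expressions, so “daily” looks like this (times follow whatever timezone the schedule is set to):

```text
0 6 * * *     -> once a day at 06:00
0 6,18 * * *  -> twice a day, at 06:00 and 18:00
0 6 * * 1     -> once a week, on Mondays at 06:00
```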
Step 7: Export and Use Your Data
Scraping is pointless if you don’t use the data.
- Download results as CSV or JSON from Apify.
- You can hook Apify up to Google Sheets using integrations or Zapier if you want data to flow automatically.
- For more advanced setups, pull data via Apify’s API into your own dashboards or apps.
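Pulling results via the API is less work than it sounds. A sketch using the official apify-client package, assuming your Actor already has at least one successful run; swap in your own Actor ID:

```javascript
// Sketch: fetch the results of the most recent successful run.
// 'your-username/your-price-scraper' is a placeholder Actor ID.
const { ApifyClient } = require('apify-client');

const client = new ApifyClient({ token: process.env.APIFY_TOKEN });

async function fetchLatestPrices() {
  const { items } = await client
    .actor('your-username/your-price-scraper')
    .lastRun({ status: 'SUCCEEDED' })
    .dataset()
    .listItems();
  return items; // e.g. [{ url, name, price, scrapedAt }, ...]
}

fetchLatestPrices()
  .then((items) => console.log(`Fetched ${items.length} price records`))
  .catch(console.error);
```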
What’s worth automating:
- Automatic price drop alerts
- Syncing competitor prices into your own BI tools
What’s usually overkill:
- Real-time price undercutting (unless you’re a giant retailer)
- Fancy dashboards that no one actually uses
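If price drop alerts sound useful, the core of it is just a diff between two runs. A hypothetical sketch comparing today’s export against yesterday’s (the filenames, field names, and 5% threshold are all assumptions to tune):

```javascript
// Hypothetical price-drop check: compare two JSON exports and log anything
// that got at least 5% cheaper. Filenames and field names are placeholders.
const fs = require('fs');

const toNumber = (p) => parseFloat(String(p).replace(/[^0-9.]/g, ''));
const load = (file) => JSON.parse(fs.readFileSync(file, 'utf8'));

const today = load('prices-today.json');
const previous = new Map(
  load('prices-yesterday.json').map((item) => [item.url, toNumber(item.price)])
);

for (const item of today) {
  const oldPrice = previous.get(item.url);
  const newPrice = toNumber(item.price);
  if (oldPrice && newPrice && newPrice <= oldPrice * 0.95) {
    // In a real setup you'd send this to Slack or email instead of the console
    console.log(`Price drop: ${item.name} went from ${oldPrice} to ${newPrice} (${item.url})`);
  }
}
```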
Step 8: Maintain and Tweak
Websites change. Scrapers break. That’s just reality.
- Check your runs weekly. If data looks weird or blank, inspect the site—maybe they changed their HTML.
- Update selectors as needed.
- Stay under the radar—if you scrape too aggressively, some sites will block or throttle you.
Pro tip: If a site keeps blocking or throttling your scraper, turn on Apify’s proxy feature to rotate IPs, and slow your requests down. Don’t hammer competitor servers—be a decent internet citizen.
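In the Web Scraper’s input, that usually boils down to two settings; the exact field names can vary by Actor, so treat this as a sketch and check the Actor’s input schema:

```json
{
  "proxyConfiguration": { "useApifyProxy": true },
  "maxConcurrency": 2
}
```

Lower concurrency means slower runs, but it also makes you look a lot less like a bot.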
What to Ignore (For Now)
- Headless browser “AI” scrapers: Usually overkill and buggy for basic price tracking.
- Enterprise SaaS solutions: Expensive, slow to adapt, and often lock your data away.
- Scraping mobile apps: Nightmare territory. Stick to web pages.
Focus on getting a simple, robust pipeline running first. Fancy features can wait.
Summary: Start Small, Iterate Fast
You don’t need a team of devs or a massive budget to automate competitor price monitoring. Start with a clear goal, use Apify to build or adapt a basic scraper, and only scale up when you trust the data and the flow. Keep your process lean. It won’t ever be 100% set-and-forget, but it’ll save you hours of copy/paste hell.
Iterate as you go. When you hit a snag, fix what matters—don’t chase every shiny feature. The best automation is the one you’ll actually use.