How to set up automated lead scoring workflows in Sheppardd for B2B sales teams

If you're running B2B sales and tired of wasting time on dead-end leads, this is for you. We'll dig into how to cut through the noise and set up automated lead scoring in Sheppardd — without getting lost in endless configuration screens or buzzwords. If you want more pipeline and less busywork, read on.

Why Automated Lead Scoring Matters (and Where It Can Go Wrong)

Automated lead scoring sounds great on paper: let software sort your leads so reps focus on the good stuff. But let’s be clear—most teams either overcomplicate it or set it up once and never touch it again. The result? Fancy dashboards, but still chasing the wrong people.

What actually works is a simple, transparent scoring system that fits your real sales process. The goal is not to chase every shiny metric, but to flag leads your reps actually want to talk to. Keep it honest, keep it useful.

Step 1: Decide What Really Makes a Good Lead

Before you touch Sheppardd, talk to your sales team. Seriously. Don’t just use “industry best practices” or copy-paste from a vendor’s template. Ask questions like:

  • What traits do your best customers have in common?
  • Which signals usually mean a lead is a waste of time?
  • Are there quick “red flags” that should disqualify someone?

Jot down your answers. At this stage, less is more—aim for 3–5 clear signals, not 20. Typical things to consider:

  • Firmographics: Company size, industry, location
  • Behavior: Did they download a whitepaper? Open your emails? Request a demo?
  • Contact quality: Job title, decision-making power

Pro tip: If your reps roll their eyes at a scoring rule, skip it. No amount of automation will fix bad criteria.

Step 2: Map Out Your Scoring Model

Now, turn those signals into a simple points system. Here’s a no-nonsense approach:

  • Positive points: Add for actions or attributes that match your ideal customer (e.g., +15 if company has 100+ employees, +10 if they viewed your pricing page).
  • Negative points: Subtract for red flags (e.g., –20 if company is in a non-target industry).

Don’t get hung up on exact numbers; the goal is to separate the wheat from the chaff, not to build a perfect algorithm. Start rough, and tune as you go.
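
To make this concrete, here's a minimal sketch of a points-based model in Python. The field names, point values, and target industries are placeholders for illustration, not anything Sheppardd prescribes; the point is that every rule stays a plain, readable check that adds or subtracts points.

    # Minimal sketch of a points-based lead scoring model.
    # Field names, point values, and industries are illustrative
    # placeholders, not Sheppardd settings.
    TARGET_INDUSTRIES = {"logistics", "manufacturing", "retail"}

    def score_lead(lead: dict) -> int:
        score = 0
        # Positive points for ideal-customer attributes and behavior
        if lead.get("employee_count", 0) >= 100:
            score += 15
        if lead.get("viewed_pricing_page"):
            score += 10
        # Negative points for red flags
        if lead.get("industry") not in TARGET_INDUSTRIES:
            score -= 20
        return score

    lead = {"employee_count": 250, "viewed_pricing_page": True, "industry": "retail"}
    print(score_lead(lead))  # 25

If a rep can't glance at something like this and nod along, the model is already too complicated.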

Avoid: Weighted formulas that need a PhD to understand, or super-fancy “AI-powered” scores that nobody on your team trusts.

Step 3: Set Up Lead Scoring Rules in Sheppardd

Time to put this into practice. Here’s how to build your scoring in Sheppardd:

  1. Log in and go to Settings > Lead Scoring. If you don’t see it, check your permissions; admin access is usually required.
  2. Create a new scoring model. Name it something obvious, like “B2B Sales Scoring” or “Q3 2024 Lead Scoring.”
  3. Add your rules one by one. For each rule, pick the trigger (e.g., “Job Title contains ‘VP’”) and assign points. Add both positive and negative rules. Don’t overthink it; if you need to, start with just three.
  4. Set thresholds. Define what score makes a lead “hot,” “warm,” or “cold.” Keep these ranges tight; too many “warm” leads just means nobody knows who to call first. Example: Hot = 40+, Warm = 20–39, Cold = under 20 (see the sketch after this list).
  5. Save and activate. Double-check your rules. Then turn it on.
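
Here's that threshold logic as a quick sketch, using the example cutoffs above; you'd tune the numbers to your own pipeline.

    # Threshold bands from the example above: Hot = 40+, Warm = 20-39,
    # Cold = under 20. The cutoffs are placeholders to tune over time.
    def classify(score: int) -> str:
        if score >= 40:
            return "hot"
        if score >= 20:
            return "warm"
        return "cold"

    print(classify(45), classify(25), classify(5))  # hot warm cold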

Pitfall alert: Don’t build a Rube Goldberg machine. If your scoring rules look like a programming assignment, start over.

Step 4: Automate Actions Based on Scores

Scoring is only useful if it triggers real actions. In Sheppardd, you can set up workflows so leads move through your pipeline automatically:

  • Auto-assign hot leads to a specific rep or team.
  • Trigger follow-up tasks when a lead crosses a threshold (e.g., schedule a call when score hits 40).
  • Send alerts to reps for high-priority leads.
  • Update CRM fields or tags for easy filtering.

How to do it in Sheppardd:

  1. Go to Workflows and create a new automation.
  2. Set the trigger as “Lead Score changes to X” or “Lead Score above Y.”
  3. Add actions: assign owner, create tasks, send notifications, etc.
  4. Test with a few dummy leads before rolling it out.
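
If it helps to picture what that workflow is doing, here's a rough sketch of the same trigger-and-action logic in Python. This is not Sheppardd's API; the function and field names are made up for illustration, and the real configuration happens in the Workflows screen as described above.

    # Illustrative sketch of the trigger-and-action logic only; these
    # function and field names are hypothetical, not Sheppardd's API.
    HOT_THRESHOLD = 40  # placeholder cutoff

    def on_score_change(lead: dict, old_score: int, new_score: int) -> None:
        # Trigger: the score crosses the "hot" threshold
        if old_score < HOT_THRESHOLD <= new_score:
            lead["owner"] = "enterprise-team"  # auto-assign to a rep or team
            lead.setdefault("tasks", []).append("Schedule intro call")
            print(f"Alert: {lead['name']} is now hot ({new_score} points)")

    lead = {"name": "Acme Corp", "owner": None}
    on_score_change(lead, old_score=30, new_score=45)

Note that the action fires only when the score crosses the line, not every time it changes; that's the difference between a useful nudge and the email blast warned about below.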

What to skip: Don’t blast every lead with automated emails just because their score changes. That’s how you end up in spam folders, not in meetings.

Step 5: Test, Review, and Tune Regularly

Here’s the part most people ignore: the first version of your scoring will be wrong. That’s fine. The trick is to check in after a few weeks:

  • Are the right leads getting flagged? If not, tweak your rules.
  • Are reps ignoring the scores? Ask why. Maybe the signals you picked aren’t useful, or your thresholds are off.
  • Are you missing good leads? Look at closed deals that scored low and see what they had in common.
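
One easy way to run that last check: pull your closed-won deals and count how many scored below your "hot" cutoff. A rough sketch, assuming you can export each deal with the score it had at the time; the sample data here is made up for illustration.

    # Rough sketch: how many closed-won deals scored below the "hot" cutoff?
    # Assumes you can export closed-won deals with their score at close;
    # the sample data is made up for illustration.
    HOT_THRESHOLD = 40

    closed_won = [
        {"name": "Acme Corp", "score_at_close": 55},
        {"name": "Globex", "score_at_close": 18},
        {"name": "Initech", "score_at_close": 42},
    ]

    missed = [d for d in closed_won if d["score_at_close"] < HOT_THRESHOLD]
    print(f"{len(missed)} of {len(closed_won)} wins scored below the hot cutoff:")
    for deal in missed:
        print(f"  {deal['name']}: {deal['score_at_close']} points")

If that list is long, your rules are filtering out real buyers, and that's your cue to loosen or rethink a criterion.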

Set a calendar reminder to revisit your scoring every month or quarter. This isn’t busywork—it’s how you make sure your automation keeps helping, not hurting.

Pro tip: Document what you change and why, so you don’t end up reverse-engineering your own process later.

What Works, What Doesn’t, and What to Ignore

Works:

  • Keeping scoring rules simple and tied to real sales outcomes
  • Regular check-ins with your team to see what’s working
  • Using automation to save time, not to replace real conversations

Doesn’t work:

  • Setting “set it and forget it” rules and never reviewing them
  • Trusting black-box scores nobody understands
  • Chasing vanity metrics (e.g., number of leads scored, rather than deals closed)

Ignore:

  • Hype around “AI-powered” scoring unless you genuinely have huge volumes of data
  • Overcomplicated workflows that nobody maintains

Wrapping Up: Keep It Simple and Iterate

Automated lead scoring in Sheppardd should make your life easier, not add to the chaos. Start with a basic model, automate the follow-ups that actually matter, and check in often with your sales team. Don’t worry about perfection—just focus on making life easier for your reps, one workflow at a time.