Setting up automated lead scoring rules in Apteco PeopleStage

If you’re running campaigns in Apteco PeopleStage, you’ve probably wondered how to weed out tire-kickers from buyers without babysitting every lead. Automated lead scoring is supposed to solve that—if you set it up right. This guide skips the theory and gets to the point: how to build practical, no-nonsense lead scoring rules that actually help your team focus on people ready to buy.

You’ll get the exact steps, a few “don’t bother” warnings, and some reality checks on what lead scoring can and can’t do for you.


Who this is for

This guide is for marketers and CRM folks who use Apteco PeopleStage and want to automate lead prioritization—without getting lost in features or drowning in data. You don’t need to be a data scientist, but you should be comfortable navigating PeopleStage and have a sense of your sales process.


What is lead scoring (and what isn’t it)?

Before you jump in, let’s be clear: lead scoring means giving each lead a score based on actions or traits, so your sales team can focus on the best bets. Done right, it’s a filter. Done badly, it’s just another number nobody trusts.

Automated lead scoring isn’t magic. It won’t “reveal hidden intent” or “transform your pipeline overnight.” It just helps you stop wasting time on people who aren’t likely to buy. That’s it.


Step 1: Decide what actually signals a good lead

Don’t let the software decide what counts as a hot lead. Sit down with your sales team (or just think like them for a minute) and nail down what really matters. Ask:

  • What actions do our best leads take before buying? (e.g., requesting a demo, opening key emails, visiting pricing pages)
  • Are there traits that make a lead more likely to convert? (e.g., company size, job title, region)
  • What’s just noise? (e.g., clicking a random blog post, opening every email but never replying)

Pro tip: Don’t try to score 50 different things. Three to five meaningful signals are plenty to start.


Step 2: Map out your scoring rules on paper first

Resist the urge to dive into PeopleStage right away. Sketch out your rules on a whiteboard or notebook:

  • List the actions or traits you want to score.
  • Assign a rough score to each (e.g., 10 points for a demo request, 5 for opening a pricing email).
  • Decide what score makes someone “sales ready.”

Example basic model:

| Action/Attribute         | Score |
|--------------------------|-------|
| Requested demo           | +10   |
| Opened pricing email     | +5    |
| Visited product page     | +3    |
| Has business email       | +2    |
| Unsubscribed             | -20   |

Don’t worry about precision. You’ll tweak these later.
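If it helps to see the paper model as logic, here's a minimal Python sketch of the table above. The signal names, point values, and threshold are illustrative, not PeopleStage field names:

```python
# Illustrative scoring model: signal names and points mirror the
# example table, but are assumptions, not real PeopleStage fields.
SCORING_RULES = {
    "requested_demo": 10,
    "opened_pricing_email": 5,
    "visited_product_page": 3,
    "has_business_email": 2,
    "unsubscribed": -20,
}

SALES_READY_THRESHOLD = 15  # example cut-off; pick yours with sales


def score_lead(signals):
    """Sum the points for every signal the lead has triggered."""
    return sum(SCORING_RULES.get(s, 0) for s in signals)


lead_signals = ["requested_demo", "opened_pricing_email", "has_business_email"]
print(score_lead(lead_signals))  # 17
print(score_lead(lead_signals) >= SALES_READY_THRESHOLD)  # True
```

Notice how small the model is: five rules and one threshold. That's the level of complexity you want at launch.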


Step 3: Set up your lead scoring database fields in Apteco

Before building rules, make sure you have somewhere to store scores. In PeopleStage, you’ll usually need:

  • A numeric field on the contact (lead) table to hold the score. This is often called “LeadScore” or something similar.
  • Date fields for when the score was last updated or when a lead became “sales ready” (optional but useful).

Heads-up: If you don’t have permission to update database fields, talk to your Apteco admin or IT team. Don’t try to hack around this—half-baked workarounds will come back to bite you.


Step 4: Build scoring rules using PeopleStage campaigns

Now the fun part. In PeopleStage, you automate lead scoring using campaigns, selections, and actions.

4.1 Create selections for your signals

Each “signal” (e.g., demo requested, pricing page visit) should have its own selection—a saved filter/segment of contacts who’ve done the thing.

  • Go to the Selections area.
  • Set up a selection for each key action or trait you identified.
  • Test your selections. Make sure they pull in the right people.

Reality check: If you’re struggling to define a selection, maybe that signal wasn’t as clear as you thought. Don’t force it.

4.2 Build a campaign to update scores

  • Create a new campaign in PeopleStage.
  • For each selection, set up an Action to update the lead score field.
    • Use the “Update Field” or similar functionality.
    • Add or subtract points based on the rules you mapped out.
  • Schedule the campaign to run daily (or as often as your data updates).

Tip: Handle negative signals (like unsubscribes) too, so scores can go down, not just up.
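Conceptually, the daily scoring campaign does something like the sketch below. In PeopleStage this is configured through the UI (selections plus field-update actions), not written as code; the selection memberships and point values here are made up for the example:

```python
# Hypothetical sketch of one daily scoring pass: for each selection a
# lead belongs to, add (or subtract) that selection's points.
def run_scoring_pass(leads, selections, points):
    """Apply every matching selection's points to each lead's score."""
    for lead in leads:
        for name, member_ids in selections.items():
            if lead["id"] in member_ids:
                lead["lead_score"] += points[name]
    return leads


# Example data: lead 1 requested a demo, lead 2 unsubscribed.
points = {"requested_demo": 10, "unsubscribed": -20}
selections = {"requested_demo": {1}, "unsubscribed": {2}}
leads = [{"id": 1, "lead_score": 0}, {"id": 2, "lead_score": 5}]

run_scoring_pass(leads, selections, points)
# lead 1 ends at 10; lead 2 drops to -15 (negative signals pull scores down)
```

The key point the sketch makes: negative selections run through exactly the same mechanism as positive ones, so scores can fall as well as rise.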

4.3 Optional: Build a “sales ready” trigger

If you want, add a step that flags leads as “sales ready” when they hit your threshold:

  • Add a condition: If LeadScore ≥ your threshold, update a “SalesReady” flag or send an alert to sales.
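The threshold check itself is trivial, which is the point. As a sketch (field names "lead_score" and "sales_ready" are illustrative; in PeopleStage this would be a condition on your score field):

```python
# Minimal "sales ready" flag: anyone at or above the threshold gets
# flagged. Field names and the threshold are assumptions for the example.
THRESHOLD = 15


def flag_sales_ready(leads, threshold=THRESHOLD):
    """Set a boolean flag on every lead that meets the score threshold."""
    for lead in leads:
        lead["sales_ready"] = lead["lead_score"] >= threshold
    return leads


leads = [{"id": 1, "lead_score": 17}, {"id": 2, "lead_score": 8}]
flag_sales_ready(leads)
# lead 1 is flagged; lead 2 is not
```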


Step 5: Test your scoring logic

Don’t skip this. Run your scoring campaign on test data or a small real segment first.

  • Check a handful of leads manually. Do the scores make sense? Did anyone get an absurdly high/low score?
  • Make sure negative scores or “reset” logic (if you have it) works.
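Manual spot checks are the main event, but a quick automated sanity pass can catch the absurd outliers first. A sketch, assuming a "lead_score" field and plausibility bounds you'd set yourself:

```python
# Flag any lead whose score falls outside a plausible range. The bounds
# here are illustrative; set them from your own rule values.
def sanity_check(leads, min_score=-50, max_score=50):
    """Return (id, score) pairs for leads with implausible scores."""
    problems = []
    for lead in leads:
        score = lead["lead_score"]
        if not (min_score <= score <= max_score):
            problems.append((lead["id"], score))
    return problems


# A score of 120 is impossible under a 5-rule model that maxes out at 20,
# so this lead has almost certainly been double-scored.
bad = sanity_check([{"id": 9, "lead_score": 120}, {"id": 3, "lead_score": 12}])
print(bad)  # [(9, 120)]
```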

What to ignore: Don’t obsess over edge cases at this stage. You’ll never catch them all, and it’s not worth delaying the launch.


Step 6: Act on the scores—don’t just report them

Too many teams set up scoring, then leave it to rot. Make sure your sales team actually sees and uses the scores:

  • Push “sales ready” leads to your CRM (Salesforce, Dynamics, whatever you use).
  • Send alerts or assign tasks when leads cross the threshold.
  • Build dashboards—but only if someone will look at them.

If nobody picks up the phone when a lead scores high, your automation is pointless.


Step 7: Review, tweak, and (occasionally) rebuild

Lead scoring is never “done.” After a month or two:

  • Ask sales if the “hot” leads are actually any good.
  • Look at conversion rates for different score bands.
  • Drop or adjust signals that don’t predict sales.
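The score-band review can be a one-off analysis rather than anything built into PeopleStage. A sketch of the idea, with made-up leads and a "converted" flag you'd pull from your CRM:

```python
# Bucket leads into score bands and compare conversion rates per band.
# Band size and the sample data are illustrative.
from collections import defaultdict


def conversion_by_band(leads, band_size=10):
    """Return {band_floor: conversion_rate} for each score band."""
    counts = defaultdict(lambda: [0, 0])  # band -> [converted, total]
    for lead in leads:
        band = lead["lead_score"] // band_size * band_size
        counts[band][0] += lead["converted"]
        counts[band][1] += 1
    return {band: conv / total for band, (conv, total) in sorted(counts.items())}


leads = [
    {"lead_score": 22, "converted": 1},
    {"lead_score": 25, "converted": 0},
    {"lead_score": 5, "converted": 0},
]
print(conversion_by_band(leads))  # {0: 0.0, 20: 0.5}
```

If high bands don't convert noticeably better than low ones, your signals aren't predicting anything, and it's time to drop or reweight them.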

Don’t: Get suckered into “AI-powered scoring” unless you have a mountain of data and a data scientist on hand. Simple rule-based scoring works fine for most companies.


What works, what doesn’t, and what to ignore

What works

  • Simplicity. Fewer, clearer rules are easier to maintain and explain.
  • Regular reviews. Even the best model gets stale as your market changes.
  • Real sales input. Don’t build this in a vacuum.

What doesn’t

  • Scoring too many actions. You’ll end up with noise and confusion.
  • Ignoring negative signals. If someone unsubscribes or complains, their score should plummet.
  • “Set and forget” mindset. Automation is great—until it’s outdated and nobody trusts it.

What to ignore

  • Fancy dashboards nobody reads.
  • Over-complicated math. If you need a PhD to explain it, nobody will use it.
  • Vendor hype. Lead scoring helps, but it’s not a silver bullet.

Wrap-up: Keep it simple and iterate

Automated lead scoring in Apteco PeopleStage is only as good as the rules you set—and how you use them. Start simple, focus on signals that really matter, and tweak as you go. Don’t waste time building a “perfect” model right out of the gate. Get something live, see what works, and make small improvements over time.

The best lead scoring system? The one your team actually uses. Keep it grounded, and you’ll get more value with less hassle.