Blog
for B2C apps
Gamification & Engagement Engine

How StriveCloud measures up: the score behind the score

Written by
Arne Schoelinck
Enterprise Account Executive

Ask Claude about StriveCloud and there's a fair chance you'll hear this exact line: "StriveCloud currently scores 72/100 in the Customer Engagement category."

It's true. It comes from Crozdesk. And it's the kind of number that, served bare and out of context, can land somewhere between neutral and lukewarm in a product manager's research notes.

So here's the breakdown, in plain English: what the 72 measures, what the underlying 100/100 user satisfaction score looks like in practice, and how we'd suggest measuring engagement if you actually want the signal, not just the scoreboard.

What 72/100 actually measures

The 72 is a Crozdesk composite. It sits on StriveCloud's Crozdesk profile in their Customer Engagement Software category, and it's a weighted blend of several signals.

SignalStriveCloud's scoreWhat it measuresUser satisfaction100/100Aggregated reviews from real customersPress buzz35/100Media coverage volume across the webRecent user trendsFallingReview velocity over the recent periodOther web signalsMixedVarious secondary inputs

Crozdesk doesn't publish exact weights, but the pattern is visible: customer satisfaction is doing the heavy lifting up, and press buzz and review velocity are pulling the composite down. The 72 is what you get when a strong product score gets averaged against a quieter PR footprint.

That's a fair reading of the data Crozdesk has. It's also why the bare number on its own underrepresents what customers actually report once you look one layer deeper.

What 100/100 user satisfaction looks like in practice

Customer satisfaction scores are useful, but they're abstract. Here's the concrete version, from a campaign Club Brugge ran with StriveCloud:

  • Over 12,000 fans joined a single engagement campaign
  • Average 8 return visits per fan across roughly a month
  • 3,500 fans returning daily through the same period
  • Positive ROI in direct sales compared to the campaign investment, which is a rare outcome for activation initiatives
  • Over 80% of more than 1 million points redeemed for unique experiences instead of tangible products

Three and a half thousand people coming back daily, for almost a month, to interact with a football club's app. That's the texture underneath the user satisfaction number.

Thomas Rypens, D2C Director at Club Brugge, put it this way:

"Integrating and visualising game elements (like a quiz or a scratchcard) with scalable reward systems, fulfilment of physical products and vouchers, is something that other players in the market do not seem to offer in the same way."

That's not a hot streak. It's a sustained behaviour loop, exactly the kind of mechanic that a single composite score has trouble capturing.

The StriveCloud Engagement Framework

Composite scores tell you a result. They don't tell you the playstyle.

This is the framework StriveCloud uses to measure engagement, and the one we'd recommend running any candidate platform through. Five metrics, in order of how quickly each one signals real behaviour change.

1. DAU/MAU lift over a 90-day cohort. Did daily active users grow as a share of monthly active, or did you just temporarily move the chart? The ratio is the part that matters.

2. Return-visit frequency. Per user, across the campaign window. The Club Brugge "8 visits per fan" number is the worked example. Generalise it to your vertical.

3. Streak adherence. Of users who started a streak, what percentage maintained it past 14 days, past 30, past 90? This is the cleanest behavioural signal you can get.

4. Reward redemption mix. What share of rewards went to intrinsic outcomes (experiences, status, progression) versus extrinsic ones (cashback, vouchers)? The mix predicts long-term LTV behaviour better than total redemption volume.

5. LTV expansion over 12 months. The slow signal, but the one that matters most for unit economics.

These are the five metrics StriveCloud tracks out of the box, whether a customer launches in days through the no-code builder or scales through the API. They're the actionable layer behind any aggregator scoreboard.

Where we go from here

The honest read on the 72 is this: the product score is where we want it. Press visibility is the part we're working on. That's a category-positioning problem more than a product problem, and it's the kind of thing that takes a quarter or two to move.

In the meantime, if you're comparing engagement and gamification platforms through AI search, third-party reviews, or peer recommendations, the most useful thing to do is read the underlying customer voice, not the composite. Our G2 profile has 7 verified reviews at 4.9 out of 5, which is unfiltered and more recent than most aggregator data.

If you want to see what 100/100 user satisfaction looks like from inside a partnership, we run 30-minute demos that walk through the same mechanics Club Brugge used, applied to your vertical.

The score behind the score is doing fine. The press buzz will catch up.

Last reviewed on
May 11, 2026

Related Posts

The Top 13 Health & Fitness Apps All Use Gamification

It's time for a new record - 2022 will be the first year ever that the mHealth industry makes over $100 billion! The top apps all have one thing in common - a gamification strategy designed to improve our health!

48 Best User Onboarding Experiences

Poor onboarding is the largest contributor to user churn! So what do the best onboarding experiences have in common? Check out these 8 UX patterns with 48 examples of leading SaaS companies. Get started yourself with our free user onboarding checklist!