Measuring membership is a headache.
I learned this while at the Shorenstein Center, where I studied how single-topic newsrooms like The Trace and The Hechinger Report can engage and grow their audiences. About a year into the project, I remember a phone call with Hannah Young, director of audience at Reveal. She essentially said: “The traditional marketing funnel just doesn’t work for us — we have 10 funnels!”
Hannah was referring to the many different ways Reveal captures a reader or listener’s attention (events, podcasts, SMS, an award-winning investigative story, email), then engages with them as they become a loyal reader, listener, or attendee, and finally cultivates them into a paying member.
Even across our research cohort of similarly structured newsrooms, understanding how to measure the success of the membership piece was complicated. In our group of nine sites, membership programs were defined in different ways, their members received different benefits, and donations were tracked with different CRMs and across different categories. (Much of this pattern is clear across the field when browsing the Membership Puzzle Project’s database of membership programs.)
After this project at the Shorenstein Center came to an end, a few questions on membership remained: How might a news site know if its membership program is successful? What should a newsroom prioritize in membership metrics, above all? And what are the traps that sites often fall into?
The Membership Puzzle Project has been exploring these questions, and I wanted to dive deeper into one specific area: the metrics and benchmarks that news organizations use to guide membership strategy. I’ll review some of the available data on membership benchmarks, then spotlight how public media stations, digital news sites, and advocacy groups measure the success of their membership programs.
One thing became very clear across all of these groups: a membership program should focus on building repeat activity and member retention, above all. Together, these two cornerstones make up a good sense of member “loyalty” — the North Star for membership programs.
Emily Roseman
Researcher
@emilyroseman1
Benchmarks for membership, or the most important reader revenue metrics to watch
The first people I turned to for a better sense of membership metrics were the pros at the News Revenue Hub. The Hub is a consultancy focused on bolstering revenue-generation among newsrooms, including several with membership programs (see their site for a list of their clients). The Hub compiles and sends monthly metrics reports to its client base that contain a snapshot of how the site is doing compared to the current median of all Hub clients.
It’s fascinating to see the categories of data that the Hub tracks in these monthly reports.
In this sample report we can see the median number of members across all Hub clients falls around 700, and roughly a third of those members are recurring donors. The Hub also highlights member retention, with a median rolling annual retention of 60%.
Email, of course, is an important metric connected to membership and therefore a heavy feature of these reports. (Check out Phillip Smith’s research on leveraging paid lead acquisition via email list growth.) In July, the Hub leveraged their robust set of client data and published a report that found across local newsrooms, each additional 100 email subscribers converted to an additional $102/month in revenue.
As Rebecca Quarls, senior director of membership at the Hub, explained: “our research suggests that actively working to convert site users to newsletter subscribers has a downstream impact on revenue.” She continued: “There’s an even bigger payoff for local newsrooms that aggressively pursue subscribers and build reader-first newsletter products.”
At the bottom of the Hub’s monthly reports there is a standard version of the membership funnel, which notes a Hub client median of 12% for acquisition (the percentage of site visitors who give their email address) and 4% for median conversion rate (the percentage of email readers who convert to membership). In a follow-up note, Rebecca confirmed that the high-performing end of Hub clients are acquiring 15 to 22% of site visitors' email addresses and converting 9 to 12% of those email subscribers into members.
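To make the two-stage funnel concrete, here is a back-of-envelope sketch using the median and high-end rates cited above. The visitor count is purely illustrative; only the acquisition and conversion percentages come from the Hub's reports.

```python
# Rough sketch of the Hub's two-stage membership funnel:
# site visitors -> email subscribers -> members.
# The 100,000-visitor figure is an assumption for illustration.

def funnel(site_visitors, acquisition_rate=0.12, conversion_rate=0.04):
    """Estimate subscribers and members from monthly site visitors.

    acquisition_rate: share of visitors who give an email address.
    conversion_rate: share of email subscribers who become members.
    Defaults are the Hub client medians.
    """
    email_subscribers = site_visitors * acquisition_rate
    members = email_subscribers * conversion_rate
    return round(email_subscribers), round(members)

# Median-performing Hub client:
print(funnel(100_000))              # (12000, 480)

# High-performing Hub client (22% acquisition, 12% conversion):
print(funnel(100_000, 0.22, 0.12))  # (22000, 2640)
```

The compounding is the point: improving both rates from median to high-end multiplies the member count more than fivefold on the same traffic.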
Public Media Metrics and the importance of regularity
Public media knows a thing or two about membership for news. In fact, something I hear often is how public media and digital newsrooms with membership programs should share notes on what works more often. (For more on public media, check out Anika Gupta’s research.)
Starting in spring 2017, Mark Fuerst, director at Public Media Futures Forums, and Steve Mulder, senior director of audience insights at NPR, convened a working group of 15 public media stations to figure out how everyone is measuring their online audiences. Their goal — identify a small set of key performance indicators for public media services.
The groups met over the course of a year and shared their metrics dashboards with Mark and Steve, who copied down every indicator and measurement in use across all 15 stations. In total, they found around 115 different measurements of digital audiences across the working group. Steve arranged the measurements into a matrix and, over time, he led a continuing conversation that winnowed the matrix into the table below, which largely captures the major categories that should be used to inspire audience growth and revenue generation. (See this article in Current for more on their findings).
I spoke with Mark Fuerst about the process of developing this metric regime. Mark routinely referred to a series of research reports by David Giovannoni (among others) that shaped public radio’s listener-support philosophy and membership strategy. In the key membership chapter of his “Audience 88: A Comprehensive Analysis of Public Radio Listeners,” David suggests two major, interconnected indicators that he determined drove membership: 1) frequency of use, and 2) a sense of personal importance.
David found that regular use of a radio station indicates its programming is meeting a listener’s needs. For this reason, the frequency with which a listener tunes into public radio and the amount of time he or she spends with it is strongly associated with public media membership. Extending David’s theory to digital services, Mark encourages stations to stop looking at “top-line” metrics like total visits or weekly and monthly cumulative audience. Instead — and this is the critical point — he suggests focusing on metrics that center on “user loyalty,” especially the number of return visits.
You can see the mantras of frequency of use and sense of personal importance in the four categories of metrics that emerged in Mark and Steve’s process, which was an adaptation of the digital concepts being developed and applied at NPR: “Growing,” “Knowing,” “Engaging,” and “Monetizing” audiences.
Notably, for growing audience, Mark and Steve recommended KPIs that capture both the volume of use and the activity of “loyal” website/digital service users. In practice, that means tracking users with three or more sessions per month.
Measuring loyalty means measuring regularity
I found that “loyalty” is often measured through “regularity” metrics — or the process of measuring the people who return again and again to take an action or use your product.
We can again see the focus on regularity metrics in two successful digital news membership programs: The Guardian and Slate. In Digital Content Next and Lenfest’s case study of these sites’ membership programs, Slate’s data showed that readers who view eight articles per month or more are far more likely to subscribe.
The focus on regularity metrics is clear again in the Lenfest Institute and Shorenstein Center’s report How Today’s News Publishers Can Use Data, Best Practices, And Test-and-Learn Tactics to Build Better Pay-Meters, which included findings from the aggregated data of 500 news organizations between 2011 and 2018. They found that “digital subscriptions require growing the number of users who are highly engaged in a publisher’s content...This shift in the key underlying unit of growth – from page views to engaged readership – incentivizes publishers to invest in content that is valuable to readers.”
Across data provided by 15 metro-area publishers, they noted that only nine percent of users were “regular readers” who view more than five articles in a thirty-day period. In the Lenfest Institute presentation Digital Subscription Reader Revenue, lead researcher Matt Skibinski featured a study of fourteen organizations with membership components. There, he found that on average 4.22% of unique visitors qualified as regular readers. Skibinski noted that better performance on the regular reader metric correlates with higher conversion, retention, and total membership numbers.
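The "regular reader" share is simple to compute once you have pageview data keyed by visitor. Here is a minimal sketch; the log format is an assumption, and real implementations would work from an analytics export rather than an in-memory list.

```python
# Sketch of the "regular reader" metric: the share of unique visitors
# who viewed more than five articles in a thirty-day window.
# The (visitor_id, article_id) log format is hypothetical.
from collections import Counter

def regular_reader_share(pageviews, threshold=5):
    """pageviews: iterable of (visitor_id, article_id) for one 30-day window."""
    views_per_visitor = Counter(visitor for visitor, _ in pageviews)
    regulars = sum(1 for n in views_per_visitor.values() if n > threshold)
    return regulars / len(views_per_visitor)

log = [("a", 1), ("a", 2), ("a", 3), ("a", 4), ("a", 5), ("a", 6),
       ("b", 1), ("c", 2)]
# Visitor "a" read six articles; "b" and "c" read one each.
print(regular_reader_share(log))  # 1 of 3 visitors -> 0.333...
```

Against the benchmarks above, a newsroom would compare this number to the 9% figure for metro publishers or the 4.22% average Skibinski found.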
What tends to drive regular readership? The researchers emphasized engagement. “News organizations with larger-than-average ‘regular readership’ – engaging that critical 9 percent of audiences – tended to prioritize audience engagement efforts, including social media content promotion, in-situ recommendation engines, dedicated newsletters, and dedicated audience development teams.”
It seems the field has reached a consensus here — growing the group of readers or people who use your product regularly is a smart membership strategy, and membership-driven news organizations would do well to benchmark success on this basis. And not only does the work need to be done, but it’s best performed by a dedicated audience or growth team.
But how does this play out in practice? What are the pitfalls, and what are the field’s blindspots?
From public media to advocacy groups, how different groups measure loyalty
I examined several public media stations, digital news sites and advocacy groups to get a sense of how these different types of organizations gauge their membership success. Below, I highlight two public media groups, two digital news sites, and an advocacy organization that shared key metrics and benchmarks. For each group, I highlight one of their key measurement challenges and how they’re working around it now.
KQED
While talking with Tim Olson, Senior Vice President of Strategic Digital Partnerships at KQED (an NPR and PBS member radio station in Northern California) I was struck by how Tim’s philosophy on KQED’s membership was so in line with David Giovannoni’s lesson about creating a sense of personal importance. As Tim described the KQED membership philosophy: “you need to be essential and relevant in people’s lives.”
One dynamic of public media, of course, is the mission attached to the “public” part. For this to work, it’s vital that KQED is free and accessible to everyone no matter the listener’s membership status or donation history.
So when this public media station designed a membership funnel, they wanted a visual that took into account their users who might not have the capacity to give, but are still key and loyal constituents of KQED. Tim gave the example of a teacher living in San Jose who is an ambassador for KQED. This is someone who KQED should acknowledge and champion, even though they might never become a financial supporter of the station.
See KQED’s conceptual framework below on the levels of audience engagement. Instead of taking a “funnel” shape that implies optimizing for the small proportion of users who can and will contribute financially, it emphasizes listeners, viewers, and readers who use KQED’s services but might never become paying members.
In other words, KQED broadened the definition of loyalty to include repeat visits that might never lead to financial contributions, optimizing instead for mission alignment.
One challenge: Loyalty is difficult to measure for a news service that delivers content across radio, the website, TV, and social media.
One thing that worked: Using two tracks of metrics. KQED uses a two-pronged approach for measuring each platform. On the web, for example, kqed.org tracks the number of visitors and the subset of repeat (4x or more a month) visitors. For events, they measure the number of attendees and the number of repeat event attendees. For video, the number of views and then the depth of views through video subscribers.
KUER
Over the summer I worked with KUER (the Salt Lake City NPR member station). I wanted to get a sense of how KUER tracks member loyalty, so I interviewed Michael Toomey, manager of digital fundraising and data at KUER, to learn more about how the station measures the health of its membership program.
One challenge: Over the past 9 years, KUER has prioritized the monthly sustainer model of membership, meaning that they mostly track metrics around recurring, small-dollar donors. One challenge Michael cited when doing so is the lack of public media industry benchmarks. Even with benchmarks, Michael is unsure how Salt Lake's market compares with other stations.
Michael is also working with a “legacy” membership base and systems that require a bit of manual labor to crunch numbers. (For example, in order to track the number of members who have increased their donation, Michael has to manually link the old pledge data for a user to their new pledge data in order for that to be trackable as an upgrade.)
One thing that worked: Focusing on metrics that measure the loyal core. Michael and the KUER team are focused on building out a history of data in order to track their own, internal progress. Under the sustainer model, the metrics that Michael focuses on include: KUER’s member retention rate over 12 months, 3 years, and 5 years (a whopping 93.24%, 69.54%, and 57%, respectively), the upgrade rate of sustainers (the proportion of sustainers who are increasing the amount of their monthly contribution every year or every two years — currently at 9.4%), and the number and proportion of sustainers who also choose to give a one-time gift. For FY 2019, 4.7% of KUER’s sustainers elected to contribute an additional, one-time gift to the station.
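The two core sustainer metrics KUER tracks, cohort retention and upgrade rate, reduce to simple set and dictionary operations once pledge records are linked. This is a minimal sketch under assumed data shapes; KUER's actual CRM schema (and the manual pledge-linking Michael described) will differ.

```python
# Sketch of two sustainer metrics: cohort retention and upgrade rate.
# Member IDs and pledge amounts below are invented for illustration.

def retention_rate(cohort_ids, active_ids):
    """Share of a starting cohort still giving at a later checkpoint."""
    return len(cohort_ids & active_ids) / len(cohort_ids)

def upgrade_rate(old_pledges, new_pledges):
    """Share of sustainers whose monthly amount increased.

    old_pledges / new_pledges: dicts of member_id -> monthly amount,
    i.e. the linked old-pledge and new-pledge records.
    """
    upgrades = sum(1 for member, amount in new_pledges.items()
                   if member in old_pledges and amount > old_pledges[member])
    return upgrades / len(old_pledges)

cohort_2018 = {"m1", "m2", "m3", "m4"}
active_2019 = {"m1", "m2", "m3", "m9"}
print(retention_rate(cohort_2018, active_2019))  # 3 of 4 retained -> 0.75

old = {"m1": 10, "m2": 10, "m3": 15}
new = {"m1": 12, "m2": 10, "m3": 15}
print(upgrade_rate(old, new))  # only m1 increased -> 0.333...
```

The hard part in practice is not the arithmetic but the record linkage: without a stable member ID across pledge systems, an upgrade looks like a lapse plus a new donor.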
WhereBy.Us
The problem of tracking “regularity” when regularity occurs across different platforms is shared by digital media, too. WhereBy.Us is a startup that operates several local news brands and chapters. A recent Nieman article highlighted how membership at WhereBy comes in many forms, including volunteerism, networking, and cash. I spoke with Alexandra Smith, growth director at WhereBy.Us, to catch up on how things are going with all of these member funnels and engagement tactics.
WhereBy launched their membership in December 2018. Since then, they’ve been strategizing on how to grow the total amount of revenue that comes from readers over time.
One challenge: As expected, one challenge Alexandra cited when trying to quantify reader engagement is the dispersed nature of the data. Because different member benefits exist on different platforms (some are forms, the events are on Eventbrite), efforts to track this engagement and subsequent loyalty are currently disjointed and manual.
Alexandra explained: “We interviewed and surveyed our readers before we launched our membership program, then used their insights to develop member benefits along with other components of the program. But once we launched, we didn't know which benefits members were actually using the most or finding the most value in.”
One thing that worked: Making DIY metrics to measure member benefits. Every few weeks, Alexandra and the team run a raw measurement — the number of member benefits used divided by the number of total members. This calculation gives WhereBy a rough calculation on the usefulness of their membership program, although Alexandra noted that this figure is far from ideal.
“For a few months, we continued to manually track how many times our members used each benefit. We learned some interesting things. First, event ticket discounts are one of the most-used benefits, so we've been working to integrate our events and membership strategies more fully.”
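WhereBy's rough measurement, total benefit uses divided by total members, is easy to sketch, and tallying uses per benefit is what surfaced the event-discount finding above. The usage-log shape here is an assumption; their real data is scattered across forms, Eventbrite, and other platforms.

```python
# Sketch of WhereBy.Us's DIY benefit-utilization measurement.
# The (member_id, benefit_name) event log is a hypothetical format.
from collections import Counter

def benefit_utilization(benefit_uses, total_members):
    """Return overall uses-per-member plus a per-benefit tally."""
    per_benefit = Counter(benefit for _, benefit in benefit_uses)
    overall = len(benefit_uses) / total_members
    return overall, per_benefit

uses = [("m1", "event_discount"), ("m2", "event_discount"),
        ("m1", "newsletter"), ("m3", "event_discount")]
overall, per_benefit = benefit_utilization(uses, total_members=10)
print(overall)                     # 0.4 benefit uses per member
print(per_benefit.most_common(1))  # [('event_discount', 3)]
```

As Alexandra noted, the overall ratio is a blunt instrument, but the per-benefit breakdown is what actually drives decisions about which benefits to invest in.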
The Rivard Report
The Rivard Report is a local, nonpartisan and nonprofit news organization that covers San Antonio, Texas. Kassie Kelly, membership and engagement coordinator at the Rivard Report, said that the main goals of the Rivard Report’s membership program are to help underwrite their nonprofit journalism while also creating community buy-in to their journalism. By encouraging financial support of local news media, she said that a goal of their membership program is to “help restore trust in media at large.” They measure this sentiment annually in a member survey.
Rivard Report invites members and readers to participate in their journalism in a few different ways, including tipping off reporters on leads, sharing stories on their own social media accounts and via email, submitting commentaries, and sharing their personal stories in an editorial series Where I Live, in which they publish a weekly San Antonio resident's story detailing their neighborhood and why they live where they do.
One challenge: Rivard Report doesn’t consistently measure reader or member participation in the commentaries or Where I Live project now, though they plan on tracking these metrics soon.
One thing that worked: Tracking member retention and upgrade rates. Kassie and the team use a set of KPIs to measure their membership program, which launched in 2016 shortly after they became a nonprofit the year prior. Kassie flagged member retention and, like KUER, the increases in members' donation levels year-over-year as primary indicators of the health of their membership program.
As of July 2019, Rivard Report had 1,649 individual members, with roughly half recurring. Their rolling annual retention rate is 67% and conversion rate (percent of email subscribers who become members) is 8%, which recently decreased from 13% due to rapid email list growth. This conversion rate falls on the higher end of the News Revenue Hub member spectrum.
Change.org
A few years ago, I attended a summit of groups who classified themselves as “weird big fundraisers.” One of the groups was Change.org, a public benefit company with B Corporation certification that operates a global platform for the general public to create and share petitions.
I wanted to know what an advocacy group like Change.org could teach us about asking their members to take action. I reached out to Amanda Luther, senior director of revenue, and Nicolas Danet, global membership director at Change.org to dig into their strategy of audience engagement and growth. Change.org’s overall company mission is to “empower everyone to create the change that they want to see,” so people can start their own petitions, sign petitions, and share petitions on their platform free of charge.
One of the main on-site membership funnels works like this: a person signs or shares a Change.org petition, is asked to contribute toward the promotion of the petition or share the petition with friends, then is asked to sign additional petitions. Following this, the person is then asked to join as a member. Interestingly, at Change.org, the only way you can become a member is through giving a monthly contribution.
In this user interaction, Amanda and Nico highlighted four crucial areas of metrics: amplification, revenue per signature, retention (both retention of email users and retention of members), and the lifetime value of members.
One challenge: Since Change.org’s mission is to create impact, they need to balance finding resources and building a business model with making change happen. The main questions they have include: what type of business model is best aligned with their mission? And what is an internal metric they should use to make progress on both of these fronts?
One thing that worked: Measuring “powerful campaigns.” Change.org has found that being 100% people-powered keeps their platform open and independent. Independence is important when it comes to petition efficacy because petitions often target big corporations and governments. A funding model that relies on corporate advertising or government grants could undermine the potential impact of certain petitions, whereas being funded by individual users keeps the platform independent. To orient around a company metric, Change.org is aiming to double the number of “powerful campaigns” on their site.
A powerful campaign for Change.org is a petition that gains at least 250 signatures in the first 7 days after launching, where at least 5% of those signers revisit the petition for deeper engagement, and where there is at least one proof point or external validation. (This external validation could be a decision-maker or target of the petition responding on the platform, a media hit about the petition, or an endorsement from a high-profile influencer.)
If the petition meets these criteria, all signs point toward this particular petition having a high likelihood of gaining traction. Perhaps not surprisingly, Amanda and Nico find a strong positive correlation between campaign engagement and membership conversion. In other words, people who are most deeply engaged in campaigns (i.e., people who sign multiple petitions, respond to email, and have promoted at least one petition) are more likely to join as monthly members. Thus, powerful campaigns are aligned with both impact and membership.
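The "powerful campaign" definition is essentially a three-part predicate, and encoding it makes the thresholds explicit. The function signature and counts below are hypothetical; only the thresholds (250 signatures, 5% revisits, one proof point) come from the description above.

```python
# Hypothetical encoding of the "powerful campaign" criteria described above.

def is_powerful(signatures_first_7_days, signers, revisitors, proof_points):
    """True when a petition meets all three 'powerful campaign' thresholds.

    signers / revisitors: counts of unique signers and of signers who
    returned for deeper engagement. proof_points: external validations
    (target response, media hit, influencer endorsement).
    """
    return (signatures_first_7_days >= 250
            and revisitors / signers >= 0.05
            and proof_points >= 1)

# 300 signatures in week one, 20 of 300 signers returned, one media hit:
print(is_powerful(300, signers=300, revisitors=20, proof_points=1))  # True

# Same launch, but only 5 signers returned (under the 5% bar):
print(is_powerful(300, signers=300, revisitors=5, proof_points=1))   # False
```

A predicate like this is what lets Change.org count powerful campaigns as a single company metric and set a goal of doubling it.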
Measuring repeat actions is nothing new for advocacy groups. Take, for example, SumOfUs, a nonprofit advocacy organization. In 2014, they developed a guiding metric for their organization — Members Returning for Action, or MeRa. MeRa is the number of unique members who have taken an action other than their first one. Based on MeRa, SumOfUs developed a set of sub-metrics including 30-Day MeRa (in the past 30 days, the number of unique users who have taken an action other than their first one), Monthly MeRa (in a given month, the number of unique users who have taken an action other than their first one), and the Source of Monthly MeRa (the Monthly MeRa for members who joined from a particular source).
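MeRa and its 30-day variant can be computed from a simple action log, as in this sketch. The log format is hypothetical; SumOfUs's actual definitions and data pipeline may differ in details.

```python
# Sketch of SumOfUs's MeRa metric: unique members who have taken an
# action beyond their first. The (member_id, action_date) log is an
# assumed format.
from datetime import date

def mera(actions):
    """All-time MeRa: members with more than one action on record."""
    counts = {}
    for member, _ in actions:
        counts[member] = counts.get(member, 0) + 1
    return sum(1 for n in counts.values() if n > 1)

def mera_30_day(actions, today):
    """30-day MeRa: members with a non-first action in the last 30 days."""
    first_action = {}
    for member, day in sorted(actions, key=lambda a: a[1]):
        first_action.setdefault(member, day)
    recent = {member for member, day in actions
              if (today - day).days < 30 and day > first_action[member]}
    return len(recent)

log = [("a", date(2019, 1, 1)), ("a", date(2019, 9, 20)),
       ("b", date(2019, 9, 1)), ("c", date(2019, 9, 25)),
       ("c", date(2019, 9, 26))]
print(mera(log))                            # "a" and "c" repeated -> 2
print(mera_30_day(log, date(2019, 10, 1)))  # both repeats were recent -> 2
```

Note the key design choice shared with the regularity metrics above: the first action is explicitly excluded, so the metric rewards return activity rather than raw acquisition.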
Retention matters for subscribers & members
In this research I’ve focused on metrics and benchmarks involving tracking people or groups of people who regularly use a product. Media organizations will be well-served by developing internal metrics that allow them to track the progress of growing this highly engaged subset of users.
Throughout several of the cases above, retention of members was also frequently mentioned as an indicator of membership health. Retention, in many ways, is the much-needed companion to regularity.
This is especially clear in Poynter’s piece about The LA Times’ disappointing subscriber retention rates. Despite adding 52,000 digital subscriptions this year, “significant cancellations during the same stretch” left the Times with a net increase of only 13,000 subscribers. As Josh Benton wrote in the Nieman summary, “It’s also a reminder that getting digital subscriptions right isn’t just about getting people in the door — it’s about keeping them there.”
Looking back at KUER’s retention performance and the data from the News Revenue Hub puts the size of this loss of subscribers in perspective. Hub members have a median annual member retention rate of 60%. KUER’s one year retention rate is 93%, and Michael mentioned that he believes this falls around the public media industry standard. How could digital news organizations achieve the public media standard?
I wanted to understand how nonprofits outside of news perform on retention, so I returned again to a source of aggregated data, the M+R Benchmarks Report. M+R is a group that works with nonprofits to help them mobilize supporters, raise money, and create impact. Their 2019 Benchmarks report includes data from 135 of their nonprofit partners, including public media groups and environmental nonprofits. The report finds that, across their set of nonprofit partners, 37% of donors who made a gift online in 2017 donated online again to that nonprofit in 2018. This includes monthly donors whose sustaining gift continued from one year to the next.
M+R goes further in their analysis of retention; they also break down retention rates by gift amount and giving history. Interestingly, M+R finds that new donors (those who made their first gift in 2017) had a retention rate of just 25% – 34 percentage points lower than the retention rate of prior donors.
They also find that donors at the lowest giving levels tend to have the lowest retention rates (just 10% of donors who made gifts under $25 in 2017 gave again online in 2018), and retention rates tend to increase along with the size of the contribution.
Among groups with membership programs, members had a retention rate about double that of non-member donors. About 33% of 2017 members made another membership gift in 2018, while these organizations retained 17% of non-member donors. (In this study, it’s worth noting that M+R only included membership programs that offer substantial tangible benefits, like free admission, a magazine subscription, or schwag, which contrasts with many of the membership models mentioned earlier in this post).
Based on this dataset, we see clear signs that membership and retention in the non-profit sector are intimately linked.
Concluding thoughts, and the implications for measuring and optimizing loyalty
I’ve found that news organization staff most frequently cite loyalty metrics as the needed “North Stars” to measure the health of their membership programs, and that there is a growing set of benchmarks against which this health or success can be measured. Loyalty can mean both regularity (measuring the return visits or use from audience-members) and the near- and long-term retention of the group. Regularity metrics serve a dual purpose for many of the organizations that I interviewed: not only do they act as KPIs to cultivate donors and drive financial value, but they also measure the success of their missions.
The largest challenges of loyalty metrics are both technical & ethical
For the technical, several groups struggled with measuring the repeat use of their members when the activity was taking place across different platforms (audio, email newsletter, site, events, and social media, among others). How are we to overcome this obstacle?
For the ethical — what if, in order to generate trust and distinguish themselves from others, a site promises not to collect too much data on their audiences?
An example of this emerged just a few weeks ago, when Julia Angwin posted The Markup’s first newsletter on why the publication opted for an email service provider (ESP) that could turn off tracking on their audiences’ email activity.
Certainly, some of the data an ESP can track (like the personal location of a user) feels like a gross privacy violation. But other email metrics — like unique open rates and click rates — are important data points that help newsrooms understand the segments of regular readers on their list. So, where is the line? And how can optimizing for regularity and optimizing for data privacy be compatible?
And here’s another ethical concern — if news organizations prioritize growing their segment of regular readers, listeners, or event attendees, what do we do if we find ourselves optimizing only for a certain type of person who has the time and energy to act and participate? Do we acknowledge where people fall on KQED’s “rainbow of engagement,” and try to ensure that one layer isn’t growing at the expense of another? Or, like Hannah Young at Reveal, do we need to maintain ten different funnels organized by pathways on platforms and optimize regularity for each (and do we have the tools to do that)?
If you or your organization is in the midst of measuring membership and thinking through these questions, I’d like to hear from you! Please get in touch (emilyroseman1@gmail.com) and be sure to follow along with the Membership Puzzle Project’s new research on memberful routines.
Jessica Best, Emily Goligoski, Ariel Zirulnick, Jay Rosen, and Lukas Kouwets contributed to this post.