John Nawn
John Nawn is a strategic advisor and thought leader helping associations turn research and organizational knowledge into strategies that improve decisions, programs, and member outcomes.
Collecting data and using it to make better decisions are very different things.
This article is the first in a three-part series on association research and its role in decision making.
Associations invest heavily in research. Annual member surveys, benchmarking reports, post-event evaluations, industry outlook studies, and pulse polls are now common features of association operations. These efforts are typically well intentioned and often well executed.
From the outside, this level of activity signals seriousness. It conveys that the organization is listening, measuring, and paying attention to its environment.
But an important question often goes unasked: What is all this research actually doing for the organization?
For associations operating in a knowledge economy, insight should function as a strategic asset rather than simply a recurring reporting exercise. Yet many organizations struggle to translate research activity into meaningful strategic clarity.
Research produces information. It generates charts, reports, and summaries. It often confirms things leaders already suspected or provides useful descriptive snapshots of member attitudes and industry trends.
What it does less reliably is change decisions.
It does not always redirect investment, challenge long-held assumptions, or reshape strategic priorities. In many organizations, research outputs circulate widely, appear in board decks, and are referenced in planning discussions, yet the organization’s fundamental direction remains largely unchanged.
Over time, research becomes something associations do because responsible organizations are expected to do it. The result is a subtle but important gap: associations can become data-active without becoming truly data-driven.
Data-active organizations collect information consistently. They measure engagement, track satisfaction, survey members, and publish findings. Data-driven organizations, by contrast, allow what they learn to materially influence the choices they make about programs, investments, and strategic direction.
The difference is not methodological sophistication. It is organizational behavior.
Collecting data does not automatically produce insight. Publishing findings does not guarantee action. In some cases, research simply reinforces existing beliefs rather than challenging them. In others, it identifies problems that the organization lacks the capability or appetite to address.
Sometimes the findings are acknowledged respectfully and then quietly absorbed into the background.
This dynamic creates a dangerous illusion. Leadership teams believe they are operating on evidence because research exists. Boards feel reassured by dashboards and reports. Yet the connection between research activity and enterprise-level decision making may be far weaker than assumed.
The risk is not simply inefficiency. The real risk is misplaced confidence—the belief that data is guiding the organization when, in reality, it may simply be documenting it.
Many associations continue repeating the same studies year after year. Member surveys persist because they are expected. Benchmark reports continue because they always have. New pulse surveys are introduced during periods of uncertainty. Each initiative may be justified on its own merits.
Over time, however, the research portfolio often becomes a collection of inherited practices rather than a deliberately designed strategic capability.
The organization becomes busy with research without necessarily becoming clearer because of it.
For leaders operating in an increasingly complex environment, that distinction matters. Associations exist to help their industries and professions navigate uncertainty. If the research they produce does not sharpen understanding or meaningfully inform direction, its strategic value becomes questionable.
The issue, in other words, is not whether associations are collecting enough data.
The more difficult question is whether the knowledge they generate is actually helping them think better.
If your association suspended all research tomorrow, which strategic decisions would become materially harder to make?
And just as important: Which decisions would likely continue unchanged?