At last year’s ASAE Technology Conference there was a lot of discussion of “big data.” I remember Elizabeth Engel saying on Twitter at the time something to the effect of “Before associations get too into big data, they might want to get their arms around the little data they already have.” Well said. It’s been years now since the 7 Measures of Success report suggested that associations needed to have “data-driven strategies.” Everyone agreed with that, and associations certainly have lots of data about their members, but I’m concerned that too many associations just don’t know what to do with it.
I was discussing this the other day over lunch with Mark Tobias of the tech company Pantheon. We were talking about how the question that matters is not big data versus small data but whether the data are actionable. When do the data you are analyzing give you something that can spur action? And is that action any more likely to generate a positive result than simply guessing (without the data)? I’m scared to hear associations truly answer that question. I have no doubt that some will have great answers, but I doubt they would be anywhere near a majority.
Why? Well, probably for a variety of reasons, but the one that bugs me has to do with learning. In Humanize, Maddie and I identify elements of an organization’s culture that are compatible with the speed and agility of social media. Those elements are decentralization, transparency, inclusion, and learning. I think learning is the hardest. Learning requires experimentation: actually trying new things and learning from them. Learning requires a rigorous (ruthless, almost) understanding of “what is.” Learning requires challenging your own assumptions, rather than just reacting to surface information. Learning, in fact, requires that we be more nuanced about data, because data rarely give you the answer. If they do, then the problem you were trying to solve was pretty simple. To be “actionable,” it turns out, requires more than data. You need data, plus thinking, plus conversation, plus insight, plus some more data, plus some assumption-testing and a healthy dose of experimenting.
We’re just not good at that. We’re good at surveying members and deciding that they want education, networking, and advocacy. We’re good at bringing back speakers who score a 4.3 on their evaluations. We’re good at focus groups that generate tiered sponsorship packages. What we need to be good at is learning.