From the Archives: Scientific Research, Openness and External Validation

[Image: a seine fishing net with a blurred figure at the end of it, weaving together broken bits and pieces. Caption: Weaving it all together!]

My face split into a grin when I read Carl Zimmer’s article, Swine Flu Science: First Wiki, Then Publish in Discover Magazine. This collective mobilization, weaving together emergent scientific findings, is what so many people in international agricultural research and other areas have been evangelizing. This is not to diminish the role of external validation – it is important. Amazingly important. But it is only one end of the spectrum of validating research and application.

First, about the Swine Flu wiki. Then I’ll circle back to external validation. From Carl’s article:

Last month I scrambled to write a story about the evolution of swine flu for the New York Times. I talked to some of the top experts on the evolution of viruses who were, at that very moment, analyzing the genetic material in samples of the virus isolated around the world. One scientist, whom I reached at home, said, “Sure, I’ve got a little time. I’m just making some coffee while my computer crunches some swine flu. What’s up?”

All of the scientists were completely open with me. They didn’t wave me off because they had to wait until their results were published in a big journal. In fact, they were open with the whole world, posting all their results in real-time on a wiki. So everyone who wanted to peruse their analysis could see how it developed as more data emerged and as they used different methods to analyze it.

Carl goes on to write about the wiki work-in-progress, the final publication in the journal Nature, and the Creative Commons license on the article – so we can all read it when it is published.

When should this be the common research pattern, instead of the exception? Carl suggests "With this sort of urgent situation at hand, the patient process of old-fashioned science publishing may have to be upgraded." But what about important work that moves more slowly, like international agricultural research, which has at its core a mission to feed the world? Why should slower, “less sexy” science eschew the new practices of open access research? This work is most often funded by public governmental or private foundation money. In the case of public money, that means you and me, citizens of many countries. And what foundation in its right mind would want to stifle advances that might help it achieve its mission?

So why isn’t this standard practice? I’m no genius, but one barrier is how research science is taught and rewarded – in any sector. The old “publish or perish.” Couple that with the competition for funding, which generates a deep-seated need to say “we invented it here in our institution, give us more money,” and you have a recipe for hoarding.

We are not talking about some pharma’s latest top secret moneymaking designer drug here. We are talking about research supposedly in the public interest.

So what is a facilitator to do about all of this?

First, we can support scientists with practical and straightforward wiki collaboration tips and practices. Open up our wikis to the world. What if every talented online facilitator could be available to support any group of scientists who wanted to collaborate on their pre-publication research work? Some organizations are clearly doing their part to support this effort, but what if we could make our little bit of magic available to help? Are we ready to speak and offer support appropriately in the language of science, research and international development? If not, what do we need to do?

Second, we can support external validation of new ways of doing research intended for the public interest.

Time and again people ask how to gain support from their leadership for strategic learning, knowledge sharing or social media initiatives. They tell me they get big fat “no’s” with a laundry list of excuses. This is often true of the application of social media in scientific research. How do we convince management, they ask? Or, perhaps more relevant: how do we make a cogent case for the researchers and the institutions, and how do we validate those cases?

One tactic is to muster external validation.

By external validation I mean two things: tangible support or recognition from an external voice for work done within an organization, and general recognition from outside the organization of the value of the practice in question. Carl’s article is an example of the latter. We should be pointing to it like crazy in research organizations. When the Nature article comes out, round two!

Getting the former can be something that emerges, or something you stimulate. Let’s look at both ends of the spectrum.