As I write the title of this editorial, I know that I don’t have the answer to it. But it is a question that has been much on my mind lately. Let me tell you why...
This week I reviewed three manuscripts on malaria for scientific journals. One of these made the case for establishing high-tech laboratories in malaria-endemic countries; it almost appeared as if, without gene-sequencing capabilities, such countries would never make headway in malaria control (or elimination). I also reviewed an interesting study in which local plant material was processed by villagers into a potent larvicide, and the case was made that this could be the key to low-cost, community-based, and sustainable malaria elimination. Two widely diverging views, both from prominent malariologists. After submitting my reviews I sat back and thought: ‘What now determines whether the high-tech or the no-tech approach will be taken up?’
A few weeks ago, a study was published that showed the impact of ivermectin on malaria mosquitoes. A drug already in use for river blindness control could be the next tool to combat malaria. A great idea, but will it become mainstream? Or will it stop after a few articles and a review in Parasitology Today? Indeed, with the sharp increase in malaria research funding witnessed over the last decade, it is no surprise that new tools and strategies are surfacing regularly these days. That is a good thing – but what, or who, determines what comes out of the funnel at the end of the day?
It was a Frenchman, Philippe Ranque, who was among the first to experiment with pyrethroids on bednets in West Africa, in the early 1980s. Today, millions of long-lasting pyrethroid-treated nets are produced every month, and hundreds of millions of nets have found their way into African homes. Was it the simplicity of the net (a physical barrier) combined with the repelling and killing properties of the pyrethroids that led to nets becoming mainstream? Or were the relentless and untiring efforts of those who believed strongly in this technology the key to its success? Or was it industry, which saw a market in nets? Or policy advisors? Clearly, a web of stakeholders influences this process.
Industry has worked out this process in more detail. Take drug discovery, or the search for new insecticides. The search for active ingredients is followed by a step-wise process addressing stability, cost, production, toxicity, and so on, with the right people for every step, all the way from scientists to marketers. As scientists working in universities, we don’t have this, and our endpoint is mostly a paper, hopefully in a high-ranking journal.
Is an article the right way to make your discovery, new approach, or tool mainstream? I don’t think so. Is broad press coverage accompanying the paper the answer? Maybe. Or should we as scientists take the plunge, look beyond our papers, and step into the world of valorisation of scientific findings?
Food for thought. How do you ensure that the tool you are developing becomes mainstream?