In the last decade, there have been marked reductions in malaria incidence across sub-Saharan Africa.
These complex observations underscore the importance of population-based studies in evaluating the effects of strong drug selection on Plasmodium falciparum populations.
Although caused by vastly different pathogens, the world's three most serious infectious diseases, tuberculosis, malaria, and HIV-1 infection, share the common problem of drug resistance.
Surveillance for drug-resistant parasites in human blood is a major effort in malaria control.
We review some contributions that early efficacy studies of antimalarial treatment made to clinical pharmacology, including convincing documentation of atebrine-resistant malaria in the 1940s, before the launch of chloroquine and amodiaquine, which soon became the first-choice antimalarials.
The ligase detection reaction-fluorescent microsphere assay (LDR-FMA) described here allows discriminant genotyping of resistance alleles in the pvdhfr, pvdhps, and pvmdr1 genes and can be used in large-scale surveillance studies.
Intermittent preventive treatment of infants (IPTi) with sulphadoxine-pyrimethamine (SP) is recommended as an additional malaria control intervention in high-transmission areas of sub-Saharan Africa, provided its protective efficacy is not compromised by SP resistance.
Multidrug-resistant Plasmodium falciparum malaria parasites pose a threat to effective drug control, even to artemisinin-based combination therapies (ACTs).
This technique for optical mapping of multiple malaria genomes enables whole-genome comparison of multiple strains and can assist in identifying genetic variation and in assembling sequence contigs.
The study showed a significant association of three loci in the human genome with patients' ability to clear drug-resistant P. falciparum, in samples from five countries across sub-Saharan Africa.