27.03.2018

RCTs are data-intensive and hence relatively expensive (though not necessarily more so than alternative designs). Designing an under-powered RCT, one whose sample size is too small to detect statistically significant effects, is therefore not an effective use of resources. If, as unfortunately continues to be the case for many development programmes, we still do not know whether the type of programme/intervention, or some sub-activity of it, 'works' or has important adverse side-effects, an impact evaluation may be called for. But bear in mind that, depending on the effects of interest, these may take time to emerge and to become discernible in the data.
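To make "under-powered" concrete, a minimal sketch of a standard a priori sample-size calculation for a two-arm trial is given below. The use of Python's statsmodels library, and the chosen effect size, significance level, and power target, are illustrative assumptions rather than recommendations from this text.

    # Sketch: how many participants per arm are needed to detect a given
    # effect with conventional thresholds? (Assumed values throughout.)
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()

    # Smallest effect worth detecting, expressed as Cohen's d (assumption).
    effect_size = 0.2

    # Conventional thresholds: 5% significance, 80% power (assumptions).
    n_per_arm = analysis.solve_power(effect_size=effect_size,
                                     alpha=0.05,
                                     power=0.8,
                                     ratio=1.0,
                                     alternative='two-sided')

    print(f"Required sample size per arm: {n_per_arm:.0f}")
    # With fewer participants than this, the trial is under-powered: true
    # effects of the chosen size would often fail to reach statistical
    # significance simply because the sample is too small.

Note that for a small assumed effect (d = 0.2), the required sample runs into the hundreds per arm, which is why power calculations should come before, not after, committing to an RCT.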

To give an example, agricultural productivity effects will take at least a year to detect (the next harvesting season of a similar type), while the effect of early childhood development interventions on employability will take a generation (15-20 years) to become observable. Once we have established that a counterfactual analysis is desired, the next issue to consider is how to construct a counterfactual that best mimics the population targeted by the intervention, while taking into account what is ethical and feasible in the particular context.