Impact Evaluation at Cartier Philanthropy
Assessing what works, where, and why across 22 development partners in 40 countries
March 2024 · Soumita Roy · 10 min read
Cartier Philanthropy, Geneva
Project Overview: As Programme Analyst at Cartier Philanthropy, I conducted impact evaluation across a portfolio of 22 non-profit partners operating in 40 countries. I authored 22 comprehensive due diligence reports, built analytical dashboards in R and Tableau, constructed geographic visualisations in ArcGIS, and tracked the financial health and programme performance of interventions spanning education, health, agriculture, and sustainable livelihoods.
My Contribution
- Authored 22 Due Diligence Summary reports (25-30 pages each), covering organisational review, impact assessment, financial health, and risk analysis for each partner
- Used R for data analysis and Tableau for interactive visualisations, refining indicator selection to strengthen programme evaluations across the portfolio
- Built geographic maps of partner operations using ArcGIS, and tracked organisational evolution over time using the Wayback Machine for longitudinal web analysis
- Tracked financial sustainability of partners' programmes across 40 countries, assessing donor diversification, unit economics, and budget-to-expenditure ratios
- Performed mixed-methods research, triangulating case studies, RCTs, annual reports, and programme officer interviews to produce funding recommendations
The context
Cartier Philanthropy is the philanthropic arm of the Richemont group, channelling funding to high-impact development programmes across Sub-Saharan Africa, South Asia, and the Caribbean. Unlike many foundations that write cheques and wait for annual reports, Cartier takes an unusually hands-on approach: every partner undergoes rigorous due diligence before funding, and every programme is tracked through structured evaluation throughout the grant cycle.
I joined as Programme Analyst at a moment when the portfolio was expanding into new geographies and thematic areas. My mandate was twofold. First, to conduct deep-dive due diligence on prospective and existing partners, assessing everything from their theory of change to their unit economics. Second, to build the analytical infrastructure (dashboards and indicator frameworks) that would allow the team to track programme performance in near real time.
Due diligence and partner evaluation
The core of my work was preparing comprehensive Due Diligence Summaries (DDS) for each partner. These were not perfunctory checklists. A single report could run 25-30 pages, covering the organisation's history and governance, its theory of change, programme model, beneficiary selection process, KPI tracking record, financial health, unit economics, external evaluations, and risk assessment. I authored 22 such reports over six months, covering partners across Haiti, India, Mali, Senegal, Tanzania, Zambia, Sierra Leone, Cambodia, and several other countries.
Each report required triangulating information from multiple sources: partner-submitted documents, annual reports, published RCTs, government data, and conversations with programme officers. The goal was to give the foundation's decision-makers a clear-eyed view of whether a programme was delivering on its promises, and at what cost per beneficiary.
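The "cost per beneficiary" lens mentioned above reduces to a simple unit-cost calculation. A minimal sketch in Python (the actual analysis was done in R, and these figures are illustrative, not partner data):

```python
# Hypothetical sketch: headline unit cost for a single programme year.
# Figures are illustrative only; the original analysis used R.

def cost_per_beneficiary(total_expenditure, beneficiaries_reached):
    """Total programme spend divided by people reached in the period."""
    if beneficiaries_reached <= 0:
        raise ValueError("beneficiaries_reached must be positive")
    return total_expenditure / beneficiaries_reached

# Illustrative numbers: $1.2M spend reaching 15,000 people
print(cost_per_beneficiary(1_200_000, 15_000))  # → 80.0
```

In practice the hard part is not the division but deciding what counts in the numerator (overheads? evaluation costs?) and the denominator (enrolled vs actively served), which is where the triangulation across sources mattered.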
Data analysis and visualisation
Beyond narrative assessment, much of the work was quantitative. I used R to clean and analyse programme data submitted by partners, and Tableau to build interactive dashboards that the team could use to monitor KPIs across the portfolio. This involved designing indicator frameworks that were comparable across very different programme types, from micro-lending in Haiti to school health in Zambia.
A recurring challenge was indicator selection: partners often tracked dozens of metrics, not all equally informative. Part of my contribution was helping programme officers identify the two or three indicators that most meaningfully captured whether a programme was on track, and building the visual infrastructure to make those indicators legible at a glance.
For several partners, I also constructed geographic visualisations using ArcGIS to map the spatial distribution of programme activities. For Imagine Worldwide, for example, I built maps showing the rollout of tablet-based learning across schools in Sierra Leone, overlaid with district-level education statistics. These maps gave programme officers a spatial dimension to complement the tabular KPI data, making it easier to spot geographic gaps in coverage or identify clusters where implementation was particularly strong.
Financial sustainability tracking
A less visible but equally important strand of the work was assessing the financial health of partner organisations. This meant reviewing audited financials, tracking donor diversification, and flagging cases where a partner's budget had outpaced its income, or where reliance on a single funder created vulnerability. In a portfolio spanning 40 countries, from post-conflict Haiti to fast-growing India, the range of financial risks was considerable.
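Two of the checks described above lend themselves to simple formulas: donor concentration (a Herfindahl-style index over funding shares) and a budget-outpacing-income flag. A hedged sketch in Python with invented figures (the original work used R, and the thresholds here are illustrative assumptions):

```python
# Hypothetical sketch of two financial-health checks; all figures invented.

def donor_concentration(donations):
    """Herfindahl index over donor shares: 1.0 = single funder, near 0 = diversified."""
    total = sum(donations)
    return sum((d / total) ** 2 for d in donations)

def budget_outpaces_income(budget, income, tolerance=1.05):
    """Flag a year where budgeted spend exceeds income by more than `tolerance`."""
    return budget > income * tolerance

# Three donors giving $500k, $300k, $200k
shares = [500_000, 300_000, 200_000]
print(round(donor_concentration(shares), 3))  # 0.25 + 0.09 + 0.04 = 0.38
print(budget_outpaces_income(budget=2_400_000, income=2_000_000))  # True
```

A concentration index near 1 is the single-funder vulnerability flagged in the reports; a diversified partner with many small donors scores close to zero.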
Inside a due diligence report
Each Due Diligence Summary followed a standardised six-section structure, designed to give the foundation's programme officers and board a complete picture of a partner's strengths, weaknesses, and trajectory. The structure is worth describing because it shaped how I approached every analytical task: each section demanded a different mix of quantitative and qualitative methods, and the final recommendation had to synthesise across all six.
The analytical toolkit varied by section. Section 2 (Achieving Impact) was the most data-intensive: I built R scripts to ingest partner KPI data, compute year-on-year trends, and flag deviations from targets. For partners with published RCTs, like Educate Girls (IDInsight, 2015-18) and DMI (Burkina Faso child survival trial), I reviewed the evaluation methodology and assessed whether the reported treatment effects were likely to hold at the partner's current scale and in its current operating context.
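The KPI screening described above, year-on-year trends plus deviation-from-target flags, can be sketched as follows. The original scripts were written in R; this is an illustrative Python version with invented enrolment figures and an assumed 10% shortfall threshold:

```python
# Hypothetical sketch of KPI trend screening; figures and threshold invented.

def yoy_change(series):
    """Year-on-year fractional change for a list of annual values."""
    return [(b - a) / a for a, b in zip(series, series[1:])]

def off_target(actual, target, threshold=0.10):
    """True when actual falls short of target by more than `threshold`."""
    return (target - actual) / target > threshold

enrolled = [40_000, 52_000, 55_000]   # illustrative annual enrolment figures
print([round(c, 3) for c in yoy_change(enrolled)])  # [0.3, 0.058]
print(off_target(actual=55_000, target=65_000))     # True: ~15% shortfall
```

The useful signal is usually the second derivative: growth of 30% one year falling to 6% the next is exactly the kind of deceleration the flags were meant to surface for a programme officer's attention.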
Section 3 (Financial Situation) required a different kind of analysis. I tracked unit cost trajectories over multiple fiscal years, computed donor concentration ratios, and compared budget projections against actuals to identify organisations that were growing faster than their revenue base could support. For some partners, I used the Wayback Machine to reconstruct the historical evolution of their public-facing claims, a form of longitudinal web analysis that helped assess whether an organisation's narrative was consistent over time or had shifted to match funder expectations.
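The "growing faster than the revenue base" test in Section 3 amounts to comparing compound growth rates of spending and income. A minimal Python sketch under invented figures (the 5-percentage-point margin is an assumption for illustration, not the foundation's actual threshold):

```python
# Hypothetical sketch of the budget-versus-actuals growth comparison;
# all figures and the margin are illustrative assumptions.

def growth(series):
    """Compound annual growth rate over a multi-year series."""
    years = len(series) - 1
    return (series[-1] / series[0]) ** (1 / years) - 1

def overextended(expenditure, revenue, margin=0.05):
    """Flag an organisation whose spending grows faster than revenue by > `margin`."""
    return growth(expenditure) - growth(revenue) > margin

spend  = [1_000_000, 1_300_000, 1_700_000]  # spending up ~30% a year
income = [1_100_000, 1_200_000, 1_300_000]  # revenue up ~9% a year
print(overextended(spend, income))  # True
```

This kind of mechanical flag never decided anything on its own; it identified which organisations warranted the closer qualitative look, including the Wayback Machine review of how their public narrative had evolved.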
Selected partners
Each partner operated in a different context, addressed a different problem, and measured success differently. What they shared was a commitment to evidence-based intervention and a willingness to be scrutinised. Below is a selection of the organisations for which I authored full due diligence reports, chosen to illustrate the range of programme types, geographies, and evaluation challenges in the portfolio.
Fonkoze
Ultra-poverty graduation programme for Haitian women using BRAC's model. 18-month accompaniment with asset transfers, health services, and case management. 95% graduation rate in 2022.
Educate Girls
Community volunteer model enrolling out-of-school girls in rural Rajasthan, MP, and UP. Ran the world's first education Development Impact Bond. IDInsight RCT showed 28% higher learning gains.
Development Media International
Evidence-based mass-media campaigns reaching 91 million people. Radio and TV spots on child survival, family planning, and nutrition. RCT-validated in Burkina Faso.
myAgro
Mobile layaway savings platform for smallholder farmers. Prepaid seed and fertiliser packages with training. 156% yield increase reported among 100,000+ farmers.
Kheyti
Affordable modular greenhouse ("Greenhouse-in-a-Box") for smallholder farmers. 7x increase in food production, 50x improvement in water efficiency. Targets income doubling.
Healthy Learners
Training teachers as community health workers to reach 763,000 students. Reduced stunting by 52%, cut morbidity by 38%, at $1.62 per child per year.
Imagine Worldwide
Solar-powered tablets for autonomous literacy and numeracy learning. No internet required. Targeting less than $5 per child at scale across six African countries.
iDE
Farm Business Advisors connecting remote farmers to markets in Zambia. Sanitation marketing driving toilet coverage from 23% to 85% in Cambodia since 2009.
A closer look: evaluating Educate Girls
To give a sense of what this work looked like in practice, consider the due diligence on Educate Girls, one of the portfolio's largest partnerships at $3.8 million across three grant cycles. EG operates in some of India's most educationally disadvantaged districts, deploying community volunteers to identify and re-enrol out-of-school girls.
The evaluation had to weigh several competing signals. On one hand, EG's headline numbers were impressive: 245,000 girls enrolled, a successful Development Impact Bond, and IDInsight's RCT showing 28% higher learning gains in programme schools. On the other, certain KPI trajectories warranted scrutiny. Cost per enrolled girl had risen from $59 in FY21 to $100 in FY24 (estimated), partly because saturation in early districts meant reaching increasingly remote, harder-to-serve populations. The pandemic-era community learning camps (Camp Vidya) were discontinued in FY23-24 after limited impact on actual school enrolment.
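The cost trajectory quoted above can be made concrete: a rise from $59 (FY21) to an estimated $100 (FY24) over three years implies roughly 19% compound annual growth in cost per enrolled girl. A quick check:

```python
# Worked check of the quoted cost trajectory: $59 (FY21) to ~$100 (FY24)
# over three fiscal years.
implied_growth = (100 / 59) ** (1 / 3) - 1
print(round(implied_growth, 3))  # 0.192, i.e. about 19% a year
```

Whether ~19% annual unit-cost growth signals trouble depends entirely on context, which is the judgement the report had to make.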
The DDS needed to make a judgement: were rising costs a sign of diminishing returns, or an expected feature of scale-up into harder geographies? The answer, arrived at through a mix of quantitative analysis and qualitative triangulation, was largely the latter, but with the recommendation that EG's upcoming expansion into Uttar Pradesh be monitored closely, given the state's weaker administrative infrastructure and the consequent risk to programme fidelity.
Reflections
The Cartier Philanthropy experience taught me things that a PhD in economics does not. Econometrics trains you to estimate treatment effects; impact evaluation in the field trains you to decide which treatment effects are worth estimating in the first place. Working across 22 partners forced me to develop judgement about what constitutes useful evidence versus performative measurement, a distinction that is surprisingly underappreciated in development practice.
It also sharpened my ability to translate quantitative findings for non-technical audiences. Programme officers at Cartier did not want p-values; they wanted to know whether Fonkoze's graduation rate in an increasingly insecure Haiti was likely to hold, and what would happen to myAgro's unit economics if West African rainfall patterns continued to shift. Learning to answer those questions clearly, without hedging behind confidence intervals, was a skill I continue to draw on.
Perhaps most valuably, the role gave me a panoramic view of how development interventions actually work on the ground, across sectors, across continents, and across scales. That breadth of exposure, from a $400,000 grant to a Haitian NGO to a $3.8 million multi-cycle partnership with an Indian education powerhouse, has informed how I think about innovation policy, impact measurement, and the gap between what works in an RCT and what works in practice.
From impact evaluation to research
The experience of evaluating theories of change across 22 organisations directly shaped how I approach my PhD research on innovation ecosystems. The habits of triangulation and scepticism carry over.
Analytical range
In six months I worked with RCT data, logframes, financial statements, qualitative interviews, ArcGIS maps, and Tableau dashboards. That breadth of method is rare in a single role and hard to replicate in a classroom.
Communicating to decision-makers
Every DDS report ended with a funding recommendation. Writing for a decision rather than for a journal taught me a different kind of rigour: one where clarity and judgement matter more than caveats.
Global perspective
Evaluating programmes across Sub-Saharan Africa, South Asia, and the Caribbean gave me first-hand appreciation for how context shapes implementation. No model travels without adaptation.