GHA_2020-2021_PNPIE_v01_M
Parental Nudges Project Impact Evaluation 2020-2021
All Rounds
Name | Country code |
---|---|
Ghana | GHA |
1-2-3 Survey, phase 1 [hh/123-1]
Baseline, Midline and Endline
The Parental Nudges Project Impact Evaluation was approved by the Strategic Impact Evaluation Fund (SIEF) of the World Bank in October 2020 in the Greater Accra Region of Ghana. The official project title is “Nudges To Improve Learning And Gender Parity: Supporting Parent Engagement And Ghana’s Educational Response To Covid-19 Using Mobile Phones”, known as the “Parental Nudges Project (PNP)”. The project seeks to address inequitable access to education and inequalities by child gender and household background during the COVID-19 pandemic through a household-level intervention designed to improve school-aged children’s outcomes by engaging parents in their children’s learning. Impacts are examined on child outcomes, including school enrollment, attendance, and learning outcomes. In addition, impacts are examined on mediating mechanisms, including parental beliefs about returns to education, educational expectations and aspirations for children, engagement in education, and the rate of returning to school. Finally, with four variations of the treatment arm, impact variation is examined by duration of message receipt (12 vs 24 weeks) and by a focus on general education versus gender parity.
Sample survey data [ssd]
Primary caregivers and school-age children aged 5 to 17 years.
Version 1: Edited, anonymous dataset for public distribution. All Personally Identifying Information (PII) has been removed.
2021-09-30
The data provided is the final version with PII removed.
The scope of the Subject Enrollment and Caregiver Survey includes screening questions to determine eligibility, a household roster, household characteristics, children's educational activities since school closures due to COVID-19, children's time use, caregivers' views about the education of boys and girls, educational aspirations for boys and girls, parental expectations on returns to education, caregiver engagement with children's education, and food security.
The scope of the midline and endline surveys includes a caregiver survey covering household characteristics, the child roster, caregiver engagement in education, emotional supportiveness, discipline practices, educational aspirations, parental self-efficacy, gender bias, implementation quality, food security, and poverty status, and a child survey covering education status, literacy and numeracy assessments, working and short-term memory, gender bias, time use, caregiver engagement, job and education aspirations, school motivation, self-esteem, and food security.
16 districts/municipalities across the 5 regions (Northern, North East, Savannah, Upper East, and Upper West) in the northern part of Ghana.
Five regions in the northern part of Ghana. PNP targets (a) households - parents or guardians - and (b) their respective school-age children between 6 and 17 years in primary school [including kindergarten] and/or junior high school in the Northern, Savannah, North East, Upper East, and Upper West regions of Ghana. We sampled 2,500 households or primary caregivers and 5,000 school-age children (2 per household).
Name | Affiliation |
---|---|
Sharon Wolf | University of Pennsylvania |
Elisabetta Aurino | Imperial College London |
Name | Role |
---|---|
Innovations for Poverty Action | Technical assistance in questionnaire design, sampling methodology, data collection and data processing |
Name | Abbreviation |
---|---|
Strategic Impact Evaluation Fund | SIEF |
Name | Role |
---|---|
Ed Tech Hub | Co-funding |
Baseline:
The sample is drawn from two previously completed studies. First, an impact evaluation of the Communications for Development (C4D) study (2012-2016), launched by the Ghana Health Service with funding from UNICEF in 12 districts of the three poorest regions of Ghana. The sample included mothers with a child aged 0-5 years recruited in 2012. The C4D program relied on voice messages directly delivered to female respondents through their cell phones. The C4D sample has high rates of mobile phone ownership (83%). Second, because as many as 52% of households in the C4D sample may have changed phone numbers, we also relied on a subsample of the Graduating the Ultra Poor (GUP) study from the same regions to obtain our desired sample size.
The samples from the C4D and GUP projects formed the sampling frame, from which we determined the eligibility of households and then sampled from those eligible for the PNP. An initial desktop screening was conducted to screen out households without phone numbers. Households with phone numbers were then contacted through a subject enrollment call to determine their eligibility, seek their consent to participate in the study and their consent to receive the text messages.
The 4,500 households with phone numbers identified in the two datasets were screened to determine their eligibility for the study, with the goal of selecting 2,500 households/primary caregivers (and 2 children per household) into one of the experimental groups.
First, the 4,500 eligible households were identified in the datasets, and randomization was conducted before households were recruited.
Second, each of the 2,500 eligible households was called to confirm eligibility and enroll them in the study. The eligibility criteria were the same as those listed under Step 2 of the midline and endline sampling description below.
Finally, 2 school-aged children were randomly sampled from each eligible household (5,000 total). Additionally, using a reserve list of replacement households, we replaced 128 households that had dropped out of the study after the Subject Enrollment and Caregiver Survey.
Midline and endline:
The sampling frame for the PNP consisted of two previously completed IPA studies. First, an impact evaluation of the Communications for Development (C4D) study (2012-2016), launched by the Ghana Health Service with funding from UNICEF in 12 districts of the three poorest regions of Ghana. The sample included mothers with a child aged 0-5 years recruited in 2012. The C4D program relied on voice messages directly delivered to female respondents through their cell phones. The C4D sample has high rates of mobile phone ownership (83%). Second, even in a very negative scenario in which 52% of households from the C4D sample have changed phone numbers, we relied on a subsample of the Graduating the Ultra Poor (GUP) study from the same regions to obtain our desired sample size.
A target sample of 2,500 primary caregivers/households was set. We assumed an average of 2 eligible school-going children (aged 5 to 17 years) per household and thus planned to recruit approximately 5,000 school-going children (2 per household) from the 2,500 households. Our goal was to randomly sample one younger school-aged child (5-9 years) and one older school-aged child (10-17 years). Taking into account an estimated 10% non-response rate and the pilot sample, we targeted a sample of 2,750 households or primary caregivers. We followed these steps to select the primary caregivers and children for the PNP: screening study participants, assigning them to the intervention and comparison groups, and enrolling them in the study.
Step 1. Desktop screening and random assignment of mobile phone numbers or households. We conducted desktop screening using the existing samples from the C4D and GUP studies to determine whether at least one member of each household had a mobile phone number. Households without at least one member with a mobile phone number were screened out. The desktop screening resulted in 4,500 households with mobile phone numbers. Next, we randomly assigned the 4,500 households to the intervention and comparison groups. The random assignment resulted in an almost equal distribution of households across the two groups.
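For illustration only, the following is a minimal sketch of the kind of random assignment described in Step 1, assuming a simple, unstratified 50/50 split over the 4,500 screened households. The household IDs and seed are placeholders; this is not the project's actual assignment code, and any stratification the study team used is not reflected here.

```python
# Minimal sketch (not the project's actual code) of randomly assigning the
# 4,500 screened households to intervention and comparison groups.
# Household IDs and the random seed are illustrative placeholders.
import random

def assign_groups(household_ids, seed=2020):
    """Shuffle household IDs and split them into two (nearly) equal groups."""
    rng = random.Random(seed)   # fixed seed so the assignment can be reproduced
    ids = list(household_ids)
    rng.shuffle(ids)
    cutoff = len(ids) // 2
    return {"intervention": ids[:cutoff], "comparison": ids[cutoff:]}

if __name__ == "__main__":
    # Placeholder IDs standing in for the C4D/GUP sampling frame
    frame = [f"HH{i:04d}" for i in range(1, 4501)]
    groups = assign_groups(frame)
    print(len(groups["intervention"]), len(groups["comparison"]))  # 2250 2250
```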
Step 2. Screening and enrollment of eligible households or primary caregivers. We implemented a phone-based subject enrollment call to (a) determine the eligibility of the households or primary caregivers for the study, (b) seek their consent and that of their children to be enlisted into the study, (c) inform them about the project, the intervention, and implementation modalities, (d) seek their consent to participate in the intervention, and (e) enroll 2,668 eligible primary caregivers into the study. The additional 168 households were selected to (a) serve as reserve households for replacement (n = 128) and (b) be used for the intervention piloting (n = 40).
We enrolled households or primary caregivers based on the following eligibility criteria:
a. A household with an adult aged 18 years and above. This age criterion satisfies the age requirement as provided in the 2014 Unsolicited Electronic Communications Code of Conduct of the National Communications Authority of Ghana. It also forms the basis for recruiting the primary caregiver, defined as the person who is primarily responsible for the care and education of the school-aged children in the household and could best talk about their experiences in school and at home.
b. A household that is willing to participate in the study and/or to receive text messages.
c. A household with more than one member including at least 1 school-aged child. A child roster was created for school-going children aged 5 to 17 years within the household.
Step 3. Selection of school-going children. Once the primary caregiver had been identified, the SurveyCTO in-built randomization protocol was triggered to randomly select at most 2 school-going children [and 2 replacements] from each household. School-going children who met the eligibility criteria, namely (a) aged 6-17 years, (b) attending school, and (c) meeting the household residency requirement, were randomly selected for the study.
The SurveyCTO in-built randomization code was designed to do the following (an illustrative sketch of this selection logic appears after the list below):
a. Generate two age strata of eligible school-aged children: (a) Stratum A comprised school-going children aged 6 to 9 years, and (b) Stratum B comprised school-going children aged 10 to 17 years. If there was no child aged 6-17, the selection was based on children aged 5, if available.
b. Randomly sample 3 eligible school-going children from each stratum and rank them from 1 to 3. The child with rank 1 in either stratum became the selected child; those with ranks 2 and 3 became the replacement children. If Stratum A (Stratum B) had no eligible child, the children in Stratum B (Stratum A) with ranks 1 and 2 became the selected children, and rank 3 was the replacement child. If either stratum had fewer than 3 eligible children, the child assigned rank 2 became the only replacement child for that stratum. If a stratum had only 1 child, there was no replacement child for that stratum.
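The sketch below re-expresses the selection rules listed above in Python for illustration only; it is not the actual SurveyCTO code, and the child IDs, ages, and seed are placeholder assumptions.

```python
# Illustrative re-expression (not the actual SurveyCTO code) of the
# stratified child-selection rules described above.
import random

def select_children(children, seed=None):
    """children: list of (child_id, age) tuples for eligible school-going
    children in one household. Returns (selected, replacements)."""
    rng = random.Random(seed)
    stratum_a = [c for c in children if 6 <= c[1] <= 9]    # younger children
    stratum_b = [c for c in children if 10 <= c[1] <= 17]  # older children

    def ranked(stratum):
        # Randomly order up to 3 eligible children; list position = rank 1..3
        return rng.sample(stratum, min(len(stratum), 3))

    ranked_a, ranked_b = ranked(stratum_a), ranked(stratum_b)

    if ranked_a and ranked_b:
        # One selected child per stratum; remaining ranks are replacements
        selected = [ranked_a[0], ranked_b[0]]
        replacements = ranked_a[1:] + ranked_b[1:]
    else:
        # Only one stratum has eligible children: ranks 1-2 are selected,
        # rank 3 (if present) is the replacement; a single child means
        # no replacement is available
        ranked_one = ranked_a or ranked_b
        selected = ranked_one[:2]
        replacements = ranked_one[2:]
    return selected, replacements

if __name__ == "__main__":
    household = [("C1", 7), ("C2", 8), ("C3", 12), ("C4", 15), ("C5", 16)]
    print(select_children(household, seed=1))
```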
We identified and sampled 4,685 school-aged children out of the 5,000 targeted for the study. This was largely due to some households not having two children in the target age range. On average, we reported 1 eligible school-going child per household.
Baseline: We recruited and surveyed 100% (n = 2,500 primary caregivers) of the target sample for the study. We also interviewed an additional 5% as replacements.
Midline: We tracked and surveyed all caregivers (n = 2,500), registering a 100% response rate. We recruited and assessed 4,675 out of 5,000 children, indicating a response rate of 92%.
Endline: We tracked and surveyed all caregivers (n = 2,500), registering a 100% response rate. We recruited and assessed 4,635 out of 4,685 children, indicating a response rate of 99%.
Baseline: The core research team developed an evaluation plan, surveys, and guidelines to guide the implementation of the evaluation activities. The first evaluation activity combined the Subject Enrollment Call with a Caregiver Survey into a single instrument, the Subject Enrollment and Caregiver Survey. The caregiver survey portion explored children's educational activities since COVID-19, children's general time use, gender bias, educational aspirations for boys and girls, parental expectations on returns to education, parent engagement in education, and food security. The Caregiver Survey was developed using a mixture of adapted, adopted, and self-designed questions. The Subject Enrollment and Caregiver Survey was piloted by experienced IPA staff on 43 households or primary caregivers within the catchment areas of the study. Feedback from the piloting was used to modify the content of the Subject Enrollment and Caregiver Survey. The survey is published in English.
The Subject Enrollment and Caregiver Survey sought information from primary caregivers on screening questions to determine their eligibility, the household roster, household characteristics, children's educational activities since school closures due to COVID-19, children's time use, caregivers' views about the education of boys and girls, educational aspirations for boys and girls, parental expectations on returns to education, caregiver engagement with children's education, and food security.
Midline: The instruments for the midline survey were designed by the core research team. The midline survey instruments were structured questionnaires administered in person. They comprised a caregiver survey and a child survey, administered to eligible primary caregivers and children, respectively.
A caregiver survey was administered to each primary caregiver to obtain household and personal information, update the child roster, and collect data on caregiver engagement in education, emotional supportiveness, discipline practices, educational aspirations, parental self-efficacy, gender bias, implementation quality, food security, and poverty status.
A child survey was administered to each eligible child within the primary caregiver's household. A child survey was only administered after the caregiver survey had been completed, so that pertinent child-specific information could be drawn from it. The child survey sought information on education status, conflict resolution, relationships, oral vocabulary, non-word reading, spelling, oral reading and comprehension, phonological awareness, counting, number discrimination, missing number, numbers, word problems, operations, working memory, short-term memory, gender bias, time use, caregiver engagement, job and education aspirations, school motivation, self-esteem, and food security.
Endline: The instruments for the endline survey were designed by the core research team. The endline survey instruments were structured questionnaires administered in person. They comprised a caregiver survey and a child survey, administered to eligible primary caregivers and children, respectively.
A caregiver survey was administered to each primary caregiver to obtain household and personal information, update the child roster, and collect data on caregiver engagement in education, emotional supportiveness, discipline practices, educational aspirations, physical health, parent-reported attendance and attitudes on attendance, parent mental health, parental self-efficacy, gender bias, parent perceptions on return to school, implementation quality, and food security.
A child survey was administered to each eligible child within the primary caregiver's household. A child survey was only administered after the caregiver survey had been completed, so that pertinent child-specific information could be drawn from it. The child survey sought information on education status, conflict resolution, relationships, oral vocabulary, non-word reading, spelling, oral reading and comprehension, listening comprehension, phonological awareness, counting, number discrimination, missing number, numbers, word problems, operations, working memory, short-term memory, gender bias, time use, caregiver engagement, job and education aspirations, child perceptions on return to school, school motivation, self-esteem, and food security.
Data were collected electronically using SurveyCTO software (based on ODK). SurveyCTO has enhanced data quality control features, such as automatic skip patterns, relevance conditions, and constraints, which were integrated into the programming to guarantee data quality.
Start | End | Cycle |
---|---|---|
2020-11-14 | 2021-01-25 | Baseline |
2021-04-22 | 2021-06-30 | Midline |
2021-08-12 | 2021-09-30 | Endline |
Name | Abbreviation |
---|---|
Innovations for Poverty Action | IPA |
Baseline: One survey team conducted data collection remotely via phone. The field team was monitored remotely through regular check-ins by the research associate and auditor to understand and address their concerns and challenges and to ensure that they were conducting the survey as intended. There were no field visits, as data were collected remotely.
Midline: The success of the midline survey depended on many different people. The total number of personnel for the midline survey was 46, comprising 35 interviewers, 2 auditors, 7 team leaders, and 2 field supervisors. Each survey team had 5 interviewers and 1 team leader.
Each field supervisor managed 3 to 4 survey teams and was responsible for overseeing the work of their teams.
Project staff including the Field Manager, Research Associate and Research Quality Associates were involved in field visits. Field visits were conducted throughout the data collection phase. Field visits were conducted to (a) monitor and evaluate interviewer performance and (b) observe interviews. Field visits took the form of accompaniments and spot checks. The field visits were structured in a way to ensure that all field teams, team leaders, and interviewers were monitored at least once each week.
Endline: The success of the endline survey depended on many different people. The total number of personnel for the endline survey was 46, comprising 35 interviewers, 2 auditors, 7 team leaders, and 2 field supervisors. Each survey team had 5 interviewers and 1 team leader.
Each field supervisor managed 3 to 4 survey teams and was responsible for overseeing the work of their teams.
Project staff including the Field Manager, Research Associate and Research Quality Associates were involved in field visits. Field visits were conducted throughout the data collection phase. Field visits were conducted to (a) monitor and evaluate interviewer performance and (b) observe interviews. Field visits took the form of accompaniments and spot checks. The field visits were structured in a way to ensure that all field teams, team leaders, and interviewers were monitored at least once each week.
Baseline: IPA recruited 35 remote enumerators through an internal competitive process involving experienced enumerators. The enumerators were trained remotely for the Subject Enrollment and Caregiver Survey through Google Meet. The training was conducted over four days, from 2 November to 5 November 2020. The training was designed using educational methods including presentations, questions and answers, group discussions, role-play, and practice. Following the training evaluation conducted at the end of the training, 34 enumerators were hired for the Subject Enrollment and Caregiver Survey.
The Subject Enrollment and Caregiver Survey was programmed using SurveyCTO and administered via phone. The programmed survey was rigorously bench-tested and fine-tuned before it was finalized. The survey was programmed with quality checks including constraints, skip patterns, and relevance commands to automate the administration process and automatically flag inconsistencies or errors associated with the administration of the survey. We obtained permission from primary caregivers for their participation in the study, the text message intervention, and their children's participation in the study. The survey was administered in the following local languages, based on partial translation of the survey: Dagbani, Manpruli, Buli, Dagaari, Sissala, Guruni, and Wali.
The Subject Enrollment and Caregiver Survey was administered from 13 November 2020 to 25 January 2021. Data collection took 42 working days to complete instead of the 20 working days originally planned. This was due to unlocatable phone numbers, inactive phone numbers, wrong numbers, switched-off phones, and connectivity issues in the catchment areas, which affected sample recruitment and led to in-person tracking of respondents whose numbers could not be located. The research team visited the catchment areas and tracked the households to update their phone numbers. The in-person tracking helped the research team update the records of over 150 primary caregivers. Weekly debriefing sessions were held to get feedback from the enumerators and to provide updates and useful suggestions for improving the quality of data collection.
Data were collected on encrypted Samsung Tab 7 tablets with SurveyCTO Collect installed, with the survey administered via phone. Completed surveys were uploaded to the project's SurveyCTO server and exported to IPA's secure institutional Box account with BoxCryptor-encrypted folders, with a backup in an encrypted folder on the project computer. Raw data were run through IPA's data management system daily to conduct validation checks and provide feedback to the enumerators on the completed surveys.
Midline and endline: IPA recruited 60 enumerators through a competitive process involving experienced enumerators. Six days of in-person training were conducted. The training was designed using educational methods including presentations, questions and answers, group discussions, role-play, and practice. Following the training evaluation conducted at the end of the training, 46 field staff were hired for the midline and endline surveys.
The midline and endline survey instruments were programmed using SurveyCTO and administered in person. The programmed surveys were rigorously bench-tested and fine-tuned before they were finalized. The surveys were programmed with quality checks including constraints, skip patterns, and relevance commands to automate the administration process and automatically flag inconsistencies or errors associated with the administration of the surveys. The surveys were administered in the following local languages, based on partial translation: Dagbani, Manpruli, Buli, Dagaari, Sissala, Guruni, and Wali.
Data collection took 45 working days to complete. Data collection coincided with school activities and the Ramadan period, thereby affecting the tracking of respondents. The surveys were conducted in person at the homes of the study participants, and primary caregivers and children were interviewed based on their availability.
Data were collected on encrypted Samsung Tab 7 tablets with SurveyCTO Collect installed and administered in person. Completed surveys were uploaded to the project's SurveyCTO server and exported to IPA's secure institutional Box account with BoxCryptor-encrypted folders, with a backup in an encrypted folder on the project computer. Raw data were run through IPA's data management system daily to conduct validation checks and provide feedback to the enumerators on the completed surveys.
Data consistency checks, namely high-frequency checks, were conducted remotely for all surveys. Corrections were made during and after data collection, once errors were reconciled.
All checks and cleaning were done using Stata and IPA's data management systems. IPA possesses all the relevant code.
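The project's own checks were implemented in Stata within IPA's data management system and are not reproduced here. The sketch below is only a minimal Python illustration of the kinds of high-frequency checks described above; the column names (caseid, child_age, duration_minutes, consent) and the 15-minute duration threshold are hypothetical assumptions, not the project's actual variables or rules.

```python
# Minimal illustration (not the project's Stata code) of daily high-frequency
# checks on a survey export. All column names and thresholds are hypothetical.
import pandas as pd

def high_frequency_checks(df: pd.DataFrame) -> dict:
    """Return flagged rows, grouped by check, for enumerator feedback."""
    return {
        # Duplicate case IDs: each respondent should appear once per round
        "duplicate_ids": df[df.duplicated("caseid", keep=False)],
        # Out-of-range values: eligible children should be 5-17 years old
        "age_out_of_range": df[~df["child_age"].between(5, 17)],
        # Suspiciously short interviews (possible skipped modules)
        "short_interviews": df[df["duration_minutes"] < 15],
        # Missing consent responses
        "missing_consent": df[df["consent"].isna()],
    }

if __name__ == "__main__":
    daily_export = pd.DataFrame({
        "caseid": ["HH001", "HH002", "HH002", "HH003"],
        "child_age": [7, 21, 12, 9],
        "duration_minutes": [42, 9, 33, 51],
        "consent": [1, 1, None, 1],
    })
    for check, rows in high_frequency_checks(daily_export).items():
        print(check, len(rows))
```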
Name | Affiliation | Email |
---|---|---|
Alaka Holla | World Bank | aholla@worldbank.org |
Sharon Wolf | University of Pennsylvania | wolfs@upenn.edu |
Elisabetta Aurino | Imperial College London | e.aurino@imperial.ac.uk |
Licensed datasets, accessible under conditions and following review
Sharon Wolf, University of Pennsylvania, Elisabetta Aurino, Imperial College London. Ghana: Parental Nudges Project Impact Evaluation 2020-2021, All Rounds, Ref. GHA_2020-2021_PNPIE_v01_M. Dataset downloaded from [URL] on [date]
The user of the data acknowledges that the University of Pennsylvania, Imperial College London, Innovations for Poverty Action, the Strategic Impact Evaluation Fund, and the World Bank bear no responsibility for use of this data or for interpretations or inferences based upon such uses.
(c) 2020, Innovations for Poverty Action
Name | Affiliation | Email |
---|---|---|
Alaka Holla | World Bank | aholla@worldbank.org |
DDI_GHA_2020-2021_PNPIE_v01_M_WB
Name | Abbreviation | Affiliation | Role |
---|---|---|---|
Development Data Group | DECDG | World Bank | Documentation of the study |
2025-01-27
Version 01 (January 2025)