This study is provided by Child Care & Early Education Research Connections.
Head Start Family and Child Experiences Survey (FACES): 2006 Cohort [United States] (ICPSR 28421)
Principal Investigator(s): United States Department of Health and Human Services. Administration for Children and Families. Office of Planning, Research and Evaluation
The Head Start Family and Child Experiences Survey (FACES) is a periodic, ongoing longitudinal study of program performance. Successive nationally representative samples of Head Start children, their families, classrooms, and programs provide descriptive information on the population of children and families served; staff qualifications, credentials, and opinions; Head Start classroom practices and quality measures; and child and family outcomes. FACES includes a battery of child assessments across multiple developmental domains (cognitive, social, emotional, and physical).
For nearly a decade, the Office of Head Start, the Administration for Children and Families, other federal agencies, local programs, and the public have depended on FACES for valid and reliable national information on (1) the skills and abilities of Head Start children, (2) how Head Start children's skills and abilities compare with preschool children nationally, (3) Head Start children's readiness for and subsequent performance in kindergarten, and (4) the characteristics of the children's home and classroom environments. The FACES study is designed to enable researchers to answer a wide range of research questions that are crucial for aiding program managers and policymakers. Some of the questions that are central to FACES include:
- What are the demographic characteristics of the population of children and families served by Head Start? How has the population served by Head Start changed?
- What are the experiences of families and children in the Head Start program? How have they changed?
- What are the cognitive and social skills of Head Start children at the beginning and end of their first year in the program? Has Head Start program performance improved over time?
- Do the gains in cognitive and social skills that Head Start children achieve carry over into kindergarten? Do larger gains (or greater declines in problem behavior) translate into higher achievement at the end of kindergarten?
- What are the qualifications of Head Start teachers in terms of education, experience, and credentials? Are average teacher education levels rising in Head Start?
- What is the observed quality of Head Start classrooms as early learning environments, including the level and range of teaching and interactions, provisions for learning, emotional and instructional support, and classroom organization? How has quality changed over time? What program- and classroom-level factors are related to observed classroom quality? How is observed quality related to children's outcomes and developmental gains?
FACES also supports analyses of subgroups of interest, such as children with disabilities, dual language learners, and children who are performing above or below average on standardized assessments. Its design changes in response to emerging policy and research questions. For example, in response to the growing concern about childhood obesity, measures of children's height and weight were introduced in FACES 2006.
Measures for FACES 2006 were selected to balance the need to support comparisons with previous FACES cohorts (particularly with respect to program performance measures) against the need to update the measurement battery, address emerging policy issues, and benefit from progress in the assessment field. Many of the measures used in FACES 2006 were included in previous cohorts; they are presented below by the five major measurement sources in FACES: (1) child direct assessments; (2) parent interviews; (3) teacher interviews and surveys; (4) classroom observations; and (5) program director, center director, and education coordinator interviews.
- The child direct assessments included the major components of school readiness: a language screener; the Peabody Picture Vocabulary Test, Fourth Edition/Test de Vocabulario en Imágenes Peabody; subtests from the Woodcock-Johnson Tests of Achievement, Third Edition/Batería III Woodcock-Muñoz (letter-word identification, applied problems, spelling, and word attack); a measure of early math literacy based on items from the Early Childhood Longitudinal Study, Birth and Kindergarten Cohorts math assessments (geometry, patterns, and measurement); story and print concepts; and physical measurements (height and weight). At the end of the direct child assessment, interviewers rated the child's attention, organization/impulse control, activity level, and sociability using items from the Leiter-R scales.
- The parent interview was designed to give Head Start a comprehensive understanding of the families it serves, including the demographic characteristics of households and household members; parent-child relationships and the quality of the child's home life; parent ratings of the child's behavior problems, social skills, and competencies; and levels and types of participation in the program and in other community services.
- The Head Start teacher interview was designed to collect information about classroom and teacher characteristics related to the quality of care provided by Head Start programs. Teachers were asked about their classroom activities and use of curricula, as well as their demographic and educational backgrounds and professional experience. They also used a Web survey to rate the social skills, problem behaviors, and competencies of each FACES child in their classroom. Kindergarten teachers used a Web survey to provide information about the schools attended by Head Start children and about their classrooms and school experiences; they also rated each FACES child's social skills, behavior problems, and competencies.
- The classroom observations were designed to measure peer interactions and the extent to which Head Start programs employed skilled teachers and provided developmentally appropriate environments and curricula for their children. The measures used included the Early Childhood Environment Rating Scale-Revised (ECERS-R), the Arnett Scale of Lead Teacher Behavior, and the Instructional Support scale from the Classroom Assessment Scoring System (CLASS). Counts of children and adults were also taken to calculate group size and child-adult ratios.
- The Program Director, Center Director, and Education Coordinator Interviews gathered information about staffing and recruitment, teacher education initiatives and training, waiting lists and program expansion, classroom activities, curriculum, overview of program management, and parent involvement.
The User Guide provides detailed information about the FACES 2006 study design, execution, and data to inform and assist researchers who may be interested in using the data for future analyses. The following items are provided in the User Guide as appendices.
- Appendix A -- Copyright Statements
- Appendix B -- Instrument Content Matrices
- Appendix C -- Questionnaires
- Appendix D -- Center/Program Codebook
- Appendix E -- Classroom/Teacher Codebook
- Appendix F -- Child Codebook
- Appendix G -- Description of Constructed/Derived Variables
This data collection may not be used for any purpose other than statistical reporting and analysis. Use of these data to learn the identity of any person or establishment is prohibited. To protect respondent privacy, the FACES 2006 Cohort data are restricted from general dissemination. Access to parts of this study requires a signed User Agreement. To obtain the file(s), researchers must agree to the terms and conditions of the Restricted Data Use Agreement, which is included with every download and can also be obtained separately on the Browse Documentation page.
Any public-use data files in this collection are available for access by the general public. Access does not require affiliation with an ICPSR member institution.
WARNING: This study is over 150MB in size and may take several minutes to download on a typical internet connection.
United States Department of Health and Human Services. Administration for Children and Families. Office of Planning, Research and Evaluation. Head Start Family and Child Experiences Survey (FACES): 2006 Cohort [United States]. ICPSR28421-v4. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 2013-05-10. http://doi.org/10.3886/ICPSR28421.v4
Persistent URL: http://doi.org/10.3886/ICPSR28421.v4
This study was funded by:
- United States Department of Health and Human Services. Administration for Children and Families. Office of Planning, Research and Evaluation (Contract # HHSP23320052905YC)
Scope of Study
Geographic Coverage: United States
Time Period:
- 2006--2009 (Fall 2006 through Spring 2009)
Date of Collection:
- 2006 (Fall)
- 2007 (Spring)
- 2008 (Spring)
- 2009 (Spring)
Unit of Observation: Head Start program, Head Start center, Head Start classroom/teacher, children new to Head Start (and their families), Kindergarten teacher
Universe: The Head Start programs participating in the FACES 2006 Cohort were a probability sample selected from among 1,639 study-eligible programs on the 2004-2005 Head Start Program Information Report (PIR). To be eligible for the study, a program had to be in one of the 50 states or the District of Columbia, be providing services directly to children ages 3 to 5, and not be in imminent danger of losing its grantee status. Furthermore, programs under the Migrant and Seasonal Worker program or American Indian and Alaska Native program were not eligible. Probability samples of centers were selected within each program, classrooms within each center, and children within each classroom. Teachers associated with selected classrooms were included in the study with certainty, as were parents associated with selected children.
Data Types: observational data, survey data
Data Collection Notes:
The Head Start Family and Child Experiences Survey would like to acknowledge Maria Woolverton's role as the ACF Project Officer.
Louisa Tarullo, Ph.D. (Mathematica) was the Principal Investigator for this project.
Jerry West, Ph.D. (Mathematica) was the Project Director for this project.
Juárez and Associates, Inc. assisted in collecting data for FACES 2006. Psychometric support was provided by Educational Testing Service.
Reports based on this data collection are available at the Administration for Children and Families Web site (http://www.acf.hhs.gov/programs/opre/).
Sample: The sample is a multi-stage clustered sample, with the first three of its four stages (programs, centers, classrooms) selected with probability proportional to size. At the final stage, children were sampled with equal probability within classrooms. Sixty programs were selected, two centers per program, and up to three classrooms per center, for a total of 415 classrooms. Within each classroom, children were sampled with the goal of obtaining 10 children with parental consent per classroom, for a total of 3,817 children. At each stage of sampling, FACES 2006 used implicit and explicit stratification and a sequential sampling technique based on a procedure developed by Chromy (1979).
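Chromy's sequential procedure is more involved than can be shown here, but the core idea of probability-proportional-to-size (PPS) selection can be sketched with ordinary systematic PPS sampling on the cumulative size scale. This is a simplified illustration, not the FACES algorithm; the program names and enrollment sizes below are hypothetical.

```python
import random

def pps_systematic_sample(units, sizes, n, seed=None):
    """Select n units with probability proportional to size, using
    systematic sampling on the cumulative size scale (a simplified
    stand-in for Chromy's sequential procedure)."""
    rng = random.Random(seed)
    total = float(sum(sizes))
    step = total / n                       # sampling interval
    start = rng.uniform(0, step)           # random start within first interval
    targets = [start + i * step for i in range(n)]
    sample, cum, t = [], 0.0, 0
    for unit, size in zip(units, sizes):
        cum += size
        # a unit larger than the interval could be hit more than once
        while t < n and targets[t] < cum:
            sample.append(unit)
            t += 1
    return sample

# Hypothetical sampling frame: 20 programs with varying enrollments
programs = [f"program_{i}" for i in range(20)]
enrollments = [50 + 10 * (i % 5) for i in range(20)]
print(pps_systematic_sample(programs, enrollments, n=5, seed=1))
```

Ordering the frame by stratification variables before drawing the systematic sample yields the implicit stratification mentioned above: selections are spread across the ordered list.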
Weight: The FACES 2006 data include sampling weights to account for variations in the probabilities of selection as well as eligibility and cooperation rates among those selected. Consult the User Guide for a more in-depth explanation of the weights, the weighting procedure, and the specific formulas used for each of the weights.
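To illustrate how the weights enter an estimate, a design-weighted mean multiplies each child's value by that child's sampling weight so that the estimate reflects the population served rather than the raw sample. The scores and weight values below are hypothetical, and real analyses should also use design-appropriate variance estimators as described in the User Guide.

```python
def weighted_mean(values, weights):
    """Design-weighted mean: sum of weight-times-value divided by
    the sum of the weights."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Hypothetical assessment scores and FACES-style sampling weights
scores  = [88, 92, 75, 81]
weights = [120.5, 98.2, 150.0, 110.3]
print(round(weighted_mean(scores, weights), 2))  # → 83.14
```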
Mode of Data Collection: computer-assisted personal interview (CAPI), computer-assisted telephone interview (CATI), coded on-site observation, face-to-face interview, mixed mode, paper and pencil interview (PAPI), self-enumerated questionnaire, telephone interview, web-based survey
In the FACES 2006 study, participation rates were high at each level and each time point of data collection. Among the 63 sampled Head Start programs that met the study eligibility criteria, 60 agreed to participate, for an unweighted response rate of more than 95 percent and a weighted rate of about 92 percent. All participating program directors completed the program director interview at baseline. All 135 sampled eligible centers within these programs participated in the study, and all of the associated center directors and education coordinators completed their respective interviews at baseline. All 410 sampled eligible classes within these centers participated in the study, and teacher interviews were obtained for 407 of the 410 classes, for an unweighted response rate of 99.3 percent and a weighted rate of 98.7 percent. Because some teachers teach two half-day sessions, many ended up with both of their classes selected for the FACES 2006 sample. Among the 368 teachers associated with the 410 eligible classes, 365 responded to the teacher interview, for an unweighted response rate of 99.2 percent and a weighted response rate of 98.4 percent. The cumulative response rate through the class level is 94.5 percent (91.2 percent weighted), and the cumulative rate through the teacher level is 94.5 percent (90.9 percent weighted). At the child level, response rates are presented in several different ways: unweighted and weighted, at each time point, by age cohort and combined, by consent and data collection instrument, and marginal and cumulative. Parental consent was obtained for 92 percent of the 3,817 sampled children. Among these children, a child assessment was obtained for 96 percent, a parent interview for 96.2 percent, and a Teacher Child Report (TCR) for 95.2 percent.
The cumulative response rates, accounting for response at the program, center, class, and parental consent levels, were 83.9, 84.1, and 83.2 percent for the child assessment, parent interview, and TCR, respectively. The comparable weighted response rates were 81.1, 81.4, and 80.5 percent. By spring of the first Head Start year (2007), only 2,914 consented children remained in the sampled Head Start programs. Assuming that some of the eligible but non-consented children from baseline would have become ineligible between fall and spring, it was estimated that 3,177 of the originally sampled children were eligible in spring 2007, for a consent rate of 91.7 percent. Among these children, 97.8 percent completed the child assessment in the spring, 92.2 percent had a completed parent interview, and 95.5 percent had a completed TCR. The cumulative unweighted response rates through spring 2007 were 85.4, 80.4, and 83.4 percent, respectively, and the cumulative weighted response rates were 82.6, 77.8, and 80.7 percent.
Overall, the estimated eligible sample size was down to 2,512 children by spring 2008, and 2,226 were still participating at that time, for a consent rate of 88.6 percent. Among the 3-year-old cohort (most of whom were still in Head Start), child assessments, parent interviews, and TCRs were completed for more than 93 percent of the 1,219 children still enrolled in Head Start. Teacher interviews were obtained for 97 percent of the children still in Head Start. For the 4-year-old cohort (most of whom were in kindergarten in spring 2008), the child assessment was completed for 88 percent of the 1,007 eligible children, parent interviews for 92 percent, and kindergarten teacher TCRs for 64 percent. Additionally, teacher interviews were obtained for 65 percent of the children in kindergarten. In spring 2009, the sample included those children who were in kindergarten and who had been in Head Start in spring 2008, mostly the 3-year-old cohort. It was estimated that 1,462 of the originally sampled children were eligible for this round of data collection, and 1,089 participated, for a consent rate of about 75 percent. Among the 1,089 children, child assessments were completed for 88 percent, parent interviews for 93 percent, and kindergarten teacher TCRs for 72 percent. Teacher interviews were obtained for 73 percent of the children who were in kindergarten in spring 2009.
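The cumulative rates quoted above are products of the marginal rates at each stage. For example, the unweighted cumulative rate through the class level follows directly from the program and class counts reported earlier (all sampled eligible centers participated):

```python
# Unweighted stage-level response rates reported for FACES 2006
program_rate = 60 / 63     # programs that agreed to participate
center_rate  = 135 / 135   # all sampled eligible centers participated
class_rate   = 407 / 410   # classes with a completed teacher interview

cumulative = program_rate * center_rate * class_rate
print(f"{cumulative:.1%}")  # → 94.5%, matching the cumulative class-level rate
```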
Extent of Processing: ICPSR data undergo a confidentiality review and are altered when necessary to limit the risk of disclosure. ICPSR also routinely creates ready-to-go data files along with setups in the major statistical software formats as well as standard codebooks to accompany the data. In addition to these procedures, ICPSR performed the following processing steps for this data collection:
- Performed consistency checks.
- Checked for undocumented or out-of-range codes.
Original ICPSR Release: 2010-10-26
- 2015-08-04 Codebooks were enhanced and a variable search function was added to the study homepage.
- 2013-07-08 The FACES instrument matrix has been updated to include the 2009 cohort. The Child Data files have been updated with missing designations.
- 2013-05-10 The FACES instrument matrix has been updated to include the Spanish Version of the FACES 2006 Parent Interview and the FACES-produced questionnaires are now available in the documentation download.
- 2013-03-05 All of the codebooks have been updated.
- 2012-12-13 The covers for the Parent Interview Questionnaire, Head Start Teacher Interview Questionnaire and Head Start Teacher Child Report in Appendix C of the User Guide have been updated. An explanation page was also added after the cover for the Parent Interview Questionnaire and Head Start Teacher Interview Questionnaire.
- 2012-10-05 The Use Agreement has been updated.
- 2012-07-10 The Child Data files have been updated.
- 2011-02-01 On November 16, 2010, Research Connections (www.researchconnections.org) hosted a Webinar Data Training conducted by Mathematica Policy Research Staff that introduced the FACES 2006 data collection. Topics covered included FACES Instruments, Data File Structure, Tips for Working with the Data and more. The recorded Webinar is now available for download. Users can also view the recording directly from the Research Connections website.
- 2010-12-14 The value labels for variable "P1_3VS4" in the Child Data were modified.