Archives of Scientific Psychology

This dataset is made available in connection to an article in Archives of Scientific Psychology, the first open-access, open-methods journal of the American Psychological Association (APA). Archiving and dissemination of this research is part of APA's commitment to collaborative data sharing.

Raw data for meta-analysis of discriminative validity of caregiver, youth, and teacher report for pediatric bipolar disorder -- all English publications through End of 2014 (ICPSR 36245)

Principal Investigator(s): Youngstrom, Eric A., University of North Carolina-Chapel Hill


Objective: To meta-analyze the diagnostic efficiency of checklists for discriminating pediatric bipolar disorder (PBD) from other conditions. Hypothesized moderators included (a) informant - we predicted caregiver report would produce larger effects than youth or teacher report; (b) scale content - scales that include manic symptoms should be more discriminating; and (c) sample design - samples that include healthy control cases or impose stringent exclusion criteria are likely to produce inflated effect sizes.

Methods: Searches in PsycINFO, PubMed, and Google Scholar generated 4,094 hits. Inclusion criteria were (1) sufficient statistics to estimate a standardized effect size, (2) participants aged 18 years or younger, (3) at least 10 cases with PBD, and (4) PBD diagnoses based on semi-structured diagnostic interview. Multivariate mixed regression models accounted for the nesting of multiple effect sizes from different informants or scales within the same sample.
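The standardized effect size required by criterion (1) is Hedges' g (named under "Presence of Common Scales" below). A minimal sketch of computing g and its sampling variance from two-group summary statistics — assuming the primary study reports means, SDs, and ns for the PBD and comparison groups; this is a standard textbook formula, not the investigators' exact code:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g: Cohen's d with the small-sample bias correction J."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    # Approximate small-sample correction factor J
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)
    g = j * d
    # Large-sample approximation to the sampling variance of g
    var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2 - 2)))
    return g, var_g
```

With equal groups of 50 and a raw mean difference of one pooled SD, g comes out slightly below 1.0 because of the bias correction.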

Results: Data included 63 effect sizes from 8 rating scales across 27 separate samples (N = 11,941 youths; 1,834 with PBD). The average effect size was g = 1.05. Random-effects variance components within and between studies were both significant, ps < .00005. Informant, scale content, and sample design each explained significant unique variance, even after controlling for design and reporting quality.
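The between-study variance component mentioned above is estimated as part of the random-effects model. As a simplified illustration (not the multivariate mixed model the study actually fit), the classic DerSimonian–Laird method-of-moments estimator of between-study variance tau-squared, with hypothetical effect sizes and variances:

```python
def dersimonian_laird_tau2(effects, variances):
    """Method-of-moments estimate of between-study variance tau^2."""
    k = len(effects)
    w = [1.0 / v for v in variances]          # inverse-variance weights
    mean_fe = sum(wi * g for wi, g in zip(w, effects)) / sum(w)
    # Cochran's Q: weighted squared deviations from the fixed-effect mean
    q = sum(wi * (g - mean_fe) ** 2 for wi, g in zip(w, effects))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    # Truncate at zero; Q below its df implies no excess heterogeneity
    return max(0.0, (q - (k - 1)) / c)
```

A nonzero tau-squared indicates that effect sizes vary more across studies than sampling error alone would produce, which is what the significant variance components reported here imply.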

Discussion: Checklists have clinical utility for assessing PBD. Caregiver report discriminated PBD significantly better than teacher report or youth self-report, although all three showed discriminative validity. Studies using "distilled" designs, with healthy control comparison groups or stringent exclusion criteria, produced significantly larger effect size estimates that could lead to inflated false-positive rates if applied as published in clinical practice.

Access Notes

  • This dataset is part of ICPSR's Archives of Scientific Psychology journal database. Users should contact the Editorial Office at the American Psychological Association for information on requesting data access.


No downloadable data files available.

Study Description


Youngstrom, Eric A. Raw data for meta-analysis of discriminative validity of caregiver, youth, and teacher report for pediatric bipolar disorder -- all English publications through End of 2014. ICPSR36245-v1. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 2015-08-17.


Scope of Study

Subject Terms:    adolescents, bipolar disorder, caregivers, child health, children, mental health, pediatrics, psychological evaluation, symptoms, youths

Smallest Geographic Unit:    Effect size (Country of origin)

Geographic Coverage:    Global

Time Period:   

  • 1990--2014

Date of Collection:   

  • 2013-09--2015-06

Unit of Observation:    Effect size for scale (often nested within sample)

Universe:    Youths ages 5-18 evaluated via caregiver, teacher, or youth checklists.


Sample:    Systematic review

Weight:    Inverse variance weighting, as standard when meta-analyzing standardized mean differences such as Hedges' g.
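The inverse-variance weighting named in the Weight field can be sketched as follows — a minimal fixed-effect pooling of Hedges' g values, with hypothetical effect sizes and sampling variances (the study itself used a multilevel mixed model, not this simple pooled mean):

```python
def inverse_variance_pool(effects, variances):
    """Fixed-effect pooled estimate: weight each g by 1 / variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5   # standard error of the pooled estimate
    return pooled, se

# Hypothetical effect sizes and sampling variances for three studies
gs = [0.8, 1.1, 1.3]
vs = [0.05, 0.10, 0.20]
pooled_g, pooled_se = inverse_variance_pool(gs, vs)
```

Weighting by inverse variance gives more precise (larger, lower-variance) studies proportionally more influence on the pooled estimate.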

Presence of Common Scales:    QUADAS-2 and Kowatch criteria for study quality; Hedges' g for effect size.


Original ICPSR Release:   2015-08-17

