Measures of Effective Teaching Longitudinal Database
Welcome
This site enables users to apply for access to quantitative data and classroom videos created by the Measures of Effective Teaching (MET) project, funded by the Bill & Melinda Gates Foundation.
Use of the MET Longitudinal Database is offered to approved researchers via a remote access system. Follow the numbered instructions above to learn about the MET Project and the data files available for secondary analysis, to apply for access, and to review the policies and procedures required for responsible use of these data.
Webinars
ICPSR has prepared a number of video tutorials about the MET data, available on YouTube. In addition, the American Educational Research Association and MET LDB staff prepared an introductory video on MET (the site requires you to create a free account).
MET Early Career Grants Webinars
An introduction for potential grantees to the scope of the MET project and the data collected. The grants program has ended, but the recordings are provided here for informational purposes.
- Measures of Effective Teaching Early Career Grants Program, Part 1
- Measures of Effective Teaching Early Career Grants Program, Part 2
- Measures of Effective Teaching Early Career Grants Program, Part 3
Using the MET LDB
A webinar series offering descriptions and discussion of various facets of the MET LDB.
- Using the MET LDB Video Data: Access, scoring, and linking
- Random Assignment in the MET LDB
- MET Early Career Grantees: Research Projects Underway and Preliminary Findings
- Video Data Within the MET LDB: Video Capture, Scoring Protocols, and Measures Used
MET LDB Lecture Series
A series of three lectures providing in-depth discussion of complex facets of the MET LDB, given by faculty with experience using the database for secondary analysis.
- Understanding the Nested Data Structure, Elizabeth Minor, National Louis University
- Implications of Using the Nested Data, Ben Kelcey, University of Cincinnati
- Using the Randomized Sample, Matthew Steinberg, University of Pennsylvania
Projects Using the MET LDB
Project Name | Principal Investigator | Affiliation
---|---|---
Features of Writing Instruction Across Grade Levels: Instructional Practices and Student Outcomes in 4th and 8th Grades | Alston, Chandra | University of Michigan
Teaching from a social justice perspective in advancing student achievement | Ampaw, Frimpomaa | Central Michigan University
Understanding the Mechanisms behind the Learning Production Function | Aucejo, Esteban; Cooley Fruehwirth, Jane; Coate, Patrick; Azmat, Ghazala; de Chaisemartin, Clement | London School of Economics, University of Cambridge, University of Michigan, Queen Mary University of London, University of Warwick
Evaluation of the Effectiveness of Summer Learning Programs on Student Outcomes | Augustine, Catherine | RAND Corporation
Study of Beginning Mathematics Teaching: Large-scale empirical investigation of the nature and impact of beginning mathematics teaching | Ball, Deborah | University of Michigan
Value Added in Value Added Models: RealVAMs and Kappa Project | Broatch, Jennifer; Green, Jennifer | Arizona State University, Montana State University
School Influences on Teacher Collective Efficacy for Student Engagement | Burden, Paul | Kansas State University
Developing the Validity Argument for the DMT Observation Protocol and a FA Observation Protocol | Carney, Michele | Boise State University
The Expanded Hierarchical Rater Model: A Framework for the Analysis of Ratings | Casabianca, Jodi | University of Texas at Austin
Understanding Achievement Gaps as Students Age: Dynamic Peer Effects in Human Capital Production | Cooley Fruehwirth, Jane; Navarro, Salvador; Takahashi, Yuya | University of Cambridge, Western University, Johns Hopkins University
What teacher quality matters? Understanding the relationships among teachers' professional content knowledge, instruction, and student achievement | Copur-Gencturk, Yasemin | University of Houston
How different value-added models and different weighting schemes affect the teacher ratings | Delandshere, Ginette | Indiana University
Understanding mathematics classroom instruction through students and teachers | Eccles, Jacquelynne | University of California Irvine
Dimensionality of Teaching Quality Measures | Ferguson, Ronald | Harvard University
Teachers as Role Models in Gender-Stereotyped Academic Domains | Gunderson, Elizabeth | Temple University
The Role of Student Participation in Mediating the Relationship Between Teacher Practices and Student Achievement in Mathematics | Ing, Marsha | University of California Riverside
The Student-Teacher Relationship as a Quality Indicator for Effective Teaching | Klopfenstein, Kristin | University of Northern Colorado
Noncognitive Skills & Schooling: An Exploratory Analysis | Kraft, Matthew | Brown University
The effect of time sampling and segment placement on the scoring quality of videotaped performances: Pilot study | Lai, Emily | Pearson
Effective Teaching and Cognitive Science: Pilot Study of Video Scoring Scheme | Laski, Elida | Boston College
An Examination of the Teaching Effectiveness Measures | Li, Hongli | Georgia State University
Investigating the effects of teaching practices in science and mathematics | Maltese, Adam | Indiana University
Impact Evaluation of Teacher and Leader Performance Evaluation Systems | Manzeske, David | American Institutes for Research
Examining Teacher and Teaching Quality through Predictors of Urban Teacher Effectiveness | May, Henry | University of Delaware
Use of Measures of Effective Teaching Longitudinal Database 3 | McCoach, D. Betsy | University of Connecticut
Understanding Science Teaching Effectiveness: Examining the Relationship between Secondary Science Teachers' Instructional Practices and Value-Added Scores | Mikeska, Jamie | Educational Testing Service (ETS)
Implementation of Reformed Teaching Practices and Student Achievement | Miura, Yoko | Wright State University
Teacher Quality | Moffitt, Robert | Central Michigan University
Closing the Teacher Quality Gap: An Empirical Analysis of Teacher Assignment Problems | Ridder, Geert | University of Southern California
Contextual Conditions for Effective Teaching | Rimm-Kaufman, Sara | University of Virginia
Investigating Gender Gaps in Elementary School Mathematics: Connections Among Teachers' Math Knowledge for Teaching, Students' Attitudes, and Students' Problem Solving Approaches | Robinson, Joseph | University of Illinois Urbana-Champaign
Emotional Classroom Climate and Academic Achievement | Schmitigal-Snyder, Linda | Lake Superior State University
Secondary Analysis of Gates MET data | Schoenfeld, Alan | University of California Berkeley
An Exploration of Novice Teachers' Core Competencies: Impacts on Student Achievement, and Effectiveness of Preparation | Seidel, Kent; Green, Kathy | University of Colorado Denver, University of Denver, University of Colorado Boulder
Teacher Talk in the MET database | Silverman, Rebecca; Goodwin, Amanda | University of Maryland, Vanderbilt University
Teacher turnover intentions, teacher effectiveness, and working conditions in high-need vs. low-need schools | Singh, Kusum | Virginia Tech
Validation of RATE (Rapid Assessment of Teacher Effectiveness) | Strong, Michael | University of California Santa Cruz
Synthesizing Intraclass and Covariate Correlations for Measures of Teacher Outcomes | Taylor, Joseph | Biological Sciences Curriculum Study
Identity Intersections in the Classroom: Teacher Effectiveness Across Racial Lines | Tienda, Marta | Princeton University
Evaluating the Diagnostic Value of Teacher Effectiveness Data Obtained from Classroom Observational Protocols | Traynor, Anne | Purdue University
Study of Teacher Preparation Experiences and Early Teacher Effectiveness | Unlu, Fatih | Abt Associates
Investigating Reliability in Rating Effective Teachers: Multicolored Teachers and Grain Size | Ward, John | Millersville University
Use of Measures of Effective Teaching Longitudinal Database | Welsh, Megan; Rhoads, Christopher | University of California Davis, University of Connecticut
Relationships between Student Surveys and Mathematics Classroom Instruction | Wilhelm, Anne | Southern Methodist University
UC Berkeley Graduate School of Education Exploration of MET Project Classroom Observation and Student Survey Instruments | Wilson, Mark | University of California Berkeley
CRT Design Parameter | Xu, Zeyu | American Institutes for Research
Applying for Access
The MET data are restricted from general dissemination to protect the confidentiality of the students, teachers, schools, and school districts. A Confidential Data Use Agreement (DUA) must be established between the University of Michigan and the user's institution.
Download the DUA. This will also be emailed to you once you initiate your application.
- User submits a completed Data Use Agreement application including:
  - Contact information
  - Project summary
  - List of data files requested
  - List of and contact information for research staff
  - IRB documentation (approval, exemption, etc.)
  - Agreement to Data Security Plan procedures
  - Agreement document with PI signature and Institutional Representative signature
- ICPSR reviews and approves the application, or sends it back to the applicant if further work is necessary.
- User submits payment ($350/license, renewed annually) and ICPSR staff configure data access accounts:
  - User submits payment to ICPSR.
  - User indicates which team members, if any, require access to the video data.
  - ICPSR staff will work with your team members to configure University of Michigan accounts.
  - Your team members will download the required software and familiarize themselves with our access systems: the Virtual Data Enclave and, if video access is requested, the Secure Video Player.
Apply for Access (link)
The Principal Investigator of a research project using the MET LDB must hold a PhD or other terminal degree, and hold a faculty position at a university or a research position at a research institution.
In order to support the MET LDB data access systems, project teams are charged an annual fee of $375/user. Each team member with access to the data requires a license.
Your Institutional Representative is an individual who has been delegated the authority to enter your institution into legal contracts and agreements. The MET LDB Confidential Data Use Agreement is an institution-to-institution agreement and so must be signed by both your institution and the University of Michigan.
Yes. Please note, however, that Data Use Agreements are between the University of Michigan and the PI's institution; therefore, if two institutions will collaborate on one project, two Data Use Agreements must be established. Each institution must have a Principal Investigator (as defined in the Data Use Agreement), a Data Use Agreement, and IRB review documentation.
The Agreement for the Use of Confidential Data from the Measures of Effective Teaching Longitudinal Database is a standard template that must be signed by the authorized institutional official for each approved requester. The terms and conditions are written to apply uniformly to all data users and without modification.
About the MET LDB
The MET project was a research partnership between 3,000 teacher volunteers and dozens of independent research teams. The project's goal was to build and test measures of effective teaching to determine how evaluation methods could best be used to tell teachers more about the skills that make them most effective and to help districts identify and develop great teaching. Launched in 2009, the study has identified multiple measures and tools that -- taken together -- can provide an accurate and reliable picture of teaching effectiveness. By understanding what great teachers do and by improving the ways teachers gain insight into their practice, we can help more teachers develop their practice and achieve success for their students.
Research shows that a teacher's contribution matters more than anything else within a school. More than class size. More than school funding. More than technology. For decades, most initiatives to improve public education have focused on improving poor-performing schools. But studies show that there are bigger differences in teaching quality within schools than between schools. This means that in the same school, a child taught by a less effective teacher can receive an education of vastly different quality than a student just down the hall who is taught by a more effective teacher. And the way evaluations are currently conducted doesn't provide a struggling teacher with a roadmap for improvement.
Because teaching is complex, no single measure can capture the complete picture of a teacher's impact; yet many evaluation systems use tools that measure only a few aspects of teaching. The information that results provides teachers with very limited, occasional feedback to help develop their practice. Multiple measures are needed to help school leaders understand how teaching contributes to student success. By evaluating multiple aspects of teaching, instructors and school leaders can create better professional development programs that promote proven techniques and practices that help students learn, and can make better-informed hiring and tenure decisions.
The project was funded by the Bill & Melinda Gates Foundation as part of ongoing efforts to give teachers the tools they need to be successful and to improve student achievement in public schools across the United States.
2.a MET Design
The data collections available through the MET Longitudinal Database have been organized into multiple ICPSR studies to clarify how the many different data files relate to each other. There are data collections, and thus ICPSR studies, from three sample sets within, or related to, the MET project: the full MET sample, which includes all MET teachers and students; the randomization sample, drawn from the full MET sample; and the district-wide census, from which the full MET sample was drawn. The data within the collections, and thus within the different ICPSR studies, are also made available at six levels of analysis: student, teacher, class/section, school, video observation segment, and survey or measurement item. Learn more about the study organization.
2.b MET Sampling
Recruitment of Districts
MET researchers recruited districts into the study during July-November 2009. The final selection of districts was based on a district's interest in the study, staff size sufficient to assure adequate numbers of participating teachers, central office support for the MET program, willingness and capacity to participate in all parts of the data collection process, and broader local political and union support for the project. At the end of recruitment, the following districts were selected for and participated in the study: Charlotte-Mecklenburg (NC) Schools, Dallas (TX) Independent School District, Denver (CO) Public Schools, Hillsborough County (FL) Public Schools, Memphis (TN) City Schools, and the New York City (NY) Department of Education.
Recruitment of Schools
Within each recruited district, certain schools were excluded from participation in the study, including special education schools, alternative schools, community schools, autonomous dropout and pregnancy programs, returning education schools, and vocational schools that did not teach academic courses. Also excluded were schools that employed team teaching or other structural features that made it impossible to assign responsibility for a student's learning to a single, specific teacher.
Recruitment of Teachers
Once a school principal agreed to participate in the study, all teachers assigned to teach MET Study focal grade/subject combinations were invited to participate unless: (a) they were team teaching or looping, making it impossible to assign responsibility for the learning of a given student in a specific subject to that teacher; (b) the teacher indicated that he or she was not planning to stay in the same school and teach the same subject the following year; or (c) there were fewer than two other teachers with the same grade/subject teaching assignments. This last restriction was put into place to assure that each teacher could be put into an "exchange" group for random assignment of classes to teachers in Year Two of the study.
Realized Samples
In Year One of the study, a total of 2,741 teachers in 317 schools took part in the MET Study, distributed across grade/subject groupings as indicated in the table below. By contrast, the Year Two sample includes just 2,086 teachers in 310 schools. The table below shows the realized sample sizes for the MET teacher samples for both years of the study. The Year One sample (left-hand column) shows all teachers who participated in Year One, regardless of their eligibility for the randomization that took place in Year Two. The Year Two columns show the number of teachers who participated in Year Two, broken out by their randomization status.
Year One MET Teacher Sample vs. MET LDB Core Teacher Sample by Focal Grade/Subject

Focal Grade/Subject | Full Sample: All Year One Teachers (AY 2009-2010) | Core Study Sample (Year Two): Randomized | Core Study Sample (Year Two): Non-Randomized
---|---|---|---
4th and 5th Grade English/Language Arts (ELA) | 138 | 98 | 29
4th and 5th Grade Mathematics | 102 | 67 | 31
4th and 5th Grade ELA and Mathematics | 634 | 305 | 52
Grades 6-8 ELA | 606 | 292 | 139
Grades 6-8 Mathematics | 528 | 282 | 120
Grades 6-8 ELA and Mathematics | 18 | 4 | 4
9th Grade Algebra I | 233 | 116 | 44
9th Grade English | 242 | 108 | 48
9th Grade Biology | 240 | 103 | 60
2.c Types of Instruments, Measures and Analysis
Student achievement gains on state standardized tests, supplemental tests, and value-added measures
MET researchers used state test scores and administered supplemental assessments as one measure of student learning. While the state tests are designed to measure how well students have learned the state standards, the supplemental tests tend to measure reasoning skills and conceptual understanding. From these, and the other instruments described below, "value-added" measures of teacher effects on student learning were created.
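The MET User Guide documents the actual value-added specification used by the project; as a rough illustration only (the variable names, toy scores, and the simple prior-score regression below are assumptions, not the MET model), a teacher's value-added can be sketched as the average residual of her students' end-of-year scores after adjusting for their prior-year scores:

```python
from statistics import mean

# Toy records: (teacher_id, prior_year_score, current_year_score).
# All values are fabricated for illustration only.
records = [
    ("T1", 50, 58), ("T1", 60, 67), ("T1", 70, 78),
    ("T2", 50, 52), ("T2", 60, 61), ("T2", 70, 70),
]

x = [r[1] for r in records]
y = [r[2] for r in records]

# Closed-form simple OLS: current score regressed on prior score.
mx, my = mean(x), mean(y)
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
a = my - b * mx

# A teacher's "value-added" here is the mean residual of her students:
# how far her students land above or below the score predicted from prior scores.
resid = {}
for tid, xi, yi in records:
    resid.setdefault(tid, []).append(yi - (a + b * xi))
value_added = {tid: mean(rs) for tid, rs in resid.items()}
```

In this toy data, T1's students consistently beat the prediction and T2's fall short, so T1 gets a positive value-added and T2 a negative one. The MET estimates additionally adjust for demographics and peer composition, which this sketch omits.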
Classroom observations
To see how well different classroom observation tools identify effective teaching, MET project researchers recorded four lessons each year in each participating teacher's classroom. These recordings were then scored using one or more of the six protocols listed below:
- Classroom Assessment Scoring System (CLASS), developed by Robert Pianta, University of Virginia
- Framework for Teaching, developed by Charlotte Danielson
- Mathematical Quality of Instruction (MQI), developed by Heather Hill, Harvard University, and Deborah Loewenberg Ball, University of Michigan
- Protocol for Language Arts Teaching Observations (PLATO), developed by Pam Grossman, Stanford University
- Quality Science Teaching (QST) Instrument, developed by Raymond Pecheone, Stanford University
- UTeach Teacher Observation Protocol (UTOP), developed by Michael Marder, University of Texas at Austin
Teachers' pedagogical content knowledge
In the second year of the project (2010-2011), participating teachers took assessments to measure their ability to choose appropriate strategies and to recognize and diagnose common student errors.
Student perceptions of the classroom instructional environment
All students in participating teachers' classrooms completed surveys about their experience in the classroom and their teachers' ability to engage them in the course material.
Teachers' perceptions of working conditions and support at their schools
All participating teachers completed surveys asking them about the quality of working conditions within their schools and the amount of instructional support they receive.
2.d MET Randomization
When schools joined the study during the 2009-10 school year, principals identified groups of teachers in which all teachers met the following criteria:
- They were teaching the same subject to students in the same grade (for example, teachers teaching math to 6th graders or English language arts to 8th graders or self-contained 4th grade classes).
- They had the necessary certification so they could all teach common classes.
- They were expected to teach the same subject to students in the same grade in the 2010-11 school year.
These groups of teachers were referred to as "exchange groups," and to participate in the MET project a school needed at least one exchange group with two or more teachers who agreed to enroll in the study.
The project requested scheduling information for 2,462 teachers from 865 exchange groups in 316 schools. The project created 668 randomization blocks from 619 exchange groups in 284 of the participating schools. In the remaining schools, either the schedule did not permit randomly swapping rosters among any of the MET project teachers, or all of the school's MET project teachers left the school or the study.
From these randomization blocks, the project randomly assigned rosters to 1,591 teachers. (This includes 386 high school teachers and 24 teachers for whom rosters were later found to be invalid.) Seventy teachers were not eligible for randomization because they were not scheduled to teach the exchange group subject and grade level in 2010-11 or they decided not to participate in Year 2 of the study. The remaining 281 teachers could not be randomized because they did not teach in a period with two or more teachers for exchanging rosters.
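Mechanically, the roster randomization described above amounts to shuffling class rosters among the teachers within each randomization block. A minimal sketch (the block names, teacher names, and seed are invented for illustration; the actual MET procedure is documented in the User Guide):

```python
import random

# Hypothetical randomization blocks: teachers eligible to exchange rosters
# with one another (same school, grade, subject, and certification).
blocks = {
    "school1_grade6_math": ["teacher_a", "teacher_b", "teacher_c"],
    "school2_grade4_ela": ["teacher_d", "teacher_e"],
}

def randomize_rosters(blocks, seed=0):
    """Randomly reassign each block's rosters among its own teachers.

    Rosters never cross block boundaries, mirroring the within-block
    design described above.
    """
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    assignment = {}
    for block_id, teachers in blocks.items():
        rosters = [f"roster_of_{t}" for t in teachers]  # each teacher's original roster
        rng.shuffle(rosters)
        for teacher, roster in zip(teachers, rosters):
            assignment[teacher] = roster
    return assignment

assignment = randomize_rosters(blocks)
```

Because the shuffle happens per block, every teacher ends up with some roster originally scheduled within their own exchange group, which is exactly why a block needed at least two teachers to be usable.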
2.e Video Observation Scoring
Video scoring was conducted in several phases. In an initial summer pilot phase, a subset of 413 teachers with complete data had their 2,000 videos scored with the CLASS protocol; these videos are sometimes called the "Plan B" sample. This scoring occurred prior to use of the web-based coding interface. Phase 1, once that interface was established, scored the same 2,000 videos using the remaining observation protocols. Phase 2 scoring occurred later and focused on scoring both years of videos from teachers who were successfully randomized in Year Two. Phase 3 consisted of scoring videos that were only 25 minutes long with the FFT protocol.
For more information on the phases of video observation scoring, see Chapter 10 of the User Guide.
3.a File Manifest
A file organization chart is available that includes the file name for each file available through the MET Longitudinal Database (LDB), along with the name and number of the ICPSR study in which each file can be found. It also lists the primary IDs included in each file to aid users in merging files or linking data within or across studies. Each file is provided in SPSS, SAS, Stata, R, and ASCII formats. The ASCII version of each file name is used in the File Name column of the chart (da34771-0001.txt); other formats follow the same naming convention but have different file extensions (da34771-0001.sav, da34771-0001.rda, etc.).
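Given that convention, the per-format file name for a study and part number can be derived mechanically. A small helper (hypothetical, not an ICPSR tool) illustrates the pattern; the `.txt`, `.sav`, and `.rda` extensions come from the chart description above, while the Stata and SAS extensions are assumptions:

```python
# Format -> extension map. "ascii", "spss", and "r" extensions are stated
# in the file manifest description; "stata" and "sas" are assumed.
EXTENSIONS = {"ascii": "txt", "spss": "sav", "r": "rda",
              "stata": "dta", "sas": "sas7bdat"}

def met_filename(study_number, part_number, data_format):
    """Build a MET LDB file name, e.g. da34771-0001.txt.

    The part number is zero-padded to four digits, matching the
    naming pattern shown in the chart.
    """
    return f"da{study_number}-{part_number:04d}.{EXTENSIONS[data_format]}"
```

For example, `met_filename(34771, 1, "ascii")` reproduces the `da34771-0001.txt` name cited above, and swapping the format swaps only the extension.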
3.b How the Data Collections are Organized in the MET LDB
Overall
ICPSR has organized the MET LDB data into seven collections, each of which has an ICPSR "Study Number":
- Study Information (ICPSR 34771)
- Core Files, 2009-2011 (ICPSR 34414)
- Base Data: Section-Level Analytical Files, 2009-2011 (ICPSR 34309)
- Base Data: Item-Level Supplemental Test Files, 2009-2011 (ICPSR 34868)
- Base Data: Item-Level Observational Scores, 2009-2011 (ICPSR 34346)
- Base Data: Item-Level Surveys and Assessment Teacher Files, 2009-2011 (ICPSR 34345)
- District-Wide Files, 2009-2011 (ICPSR 34344)
ICPSR Study # 34771 - Study Information
Contained in this release are a video information file, a randomization file, a subject ID crosswalk, and a teacher demographics file. The Video Information File contains descriptive information about the videos captured for the MET project and their availability to users. The Randomization File includes district, school, section, and student IDs, teacher IDs for the teacher a student was randomly assigned to, and the actual teacher the student was recorded as having in October and May of that year. The Subject ID Crosswalk contains only ID variables and is included to describe the associations between districts, schools, teachers, sections, and students. The Teacher Demographics File contains information on gender, ethnicity, work experience, and levels of education for teachers in the MET project.
ICPSR Study # 34414 - Core Files, 2009-2011
The Core Data Files contain a crafted sample of teachers to facilitate longitudinal analyses, and include only MET teachers who participated in Year 1 or in both Years 1 and 2 of the MET Project. No teachers are included who participated only in Year 2 of the MET Project. Additionally, for certain instruments, only aggregated or summary variables are included. No sub-sample instruments or special topic data are included.
ICPSR Study # 34309 -- Base Data: Section-Level Analytical Files, 2009-2011
The Section-Level Analytical Files are a merger of demographic, constructed, and summary variables aggregated to the teacher-section level for elementary and secondary teachers in both years of the MET study. Variables contained in the release include student race, age, and other demographic variables, state test and supplemental test rankings, value-added variables, and student perception survey composite measures.
ICPSR Study # 34868 -- Base Data: Item-Level Supplemental Test Files, 2009-2011
Student achievement was measured in two ways -- through existing state assessments, designed to assess student progress on the state curriculum for accountability purposes, and supplemental assessments, designed to assess higher-order conceptual understanding. The Item-Level Supplemental Test Files release consists of data files for the three supplemental assessments (SAT-9, BAM, and ACT) for both years of the MET study.
ICPSR Study # 34346 -- Base Data: Item-Level Observational Scores,2009-2011
Panoramic digital video of classroom sessions was captured for participating teachers and students; teachers submitted commentary on their lessons (e.g., specifying the learning objective), and trained raters then scored each lesson based on particular classroom observation protocols. This collection consists of item-level scores for each of the observational protocols (CLASS, FFT, MQI, PLATO, QST, and UTOP) used across both years of the MET study.
ICPSR Study # 34345 -- Item-Level Surveys and Assessment Teacher Files, 2009-2011
The Item-Level Surveys and Assessment Teacher release consists of four written-response surveys, from or about the MET teachers, a teacher knowledge assessment, and a survey of curriculum taught by teachers. The written-response surveys were given to principals, to gauge their knowledge of their teachers' effectiveness; to teachers, to gauge their perception of their principals' effectiveness and of their broader working environment; and to students, to analyze the value of student feedback in the effort to improve both teaching and learning. A teacher knowledge assessment was also conducted to test the utility of both newly developed and well-established measures of teacher knowledge to predict measures of teacher effectiveness.
ICPSR Study # 34798 -- District-Wide Files, 2008-2011
The district-wide files comprise one data file per district for each of the two years of the MET project and the school year immediately prior to the MET project. Each file contains information on each student in the school district, including student demographic variables such as race, age, and gender; specialty student status variables such as free lunch, English language learner, and gifted and talented program participation; and student-level test rankings for math and reading, as well as aggregate means of student demographic, specialty status, and test score variables for each teacher.
The scripts used by the MET Project team at RAND to compute value-added estimates are also included in this collection.
3.c Linking/Merging Data Files
The MET LDB includes six primary IDs that can be used to merge data files: district, school, section, teacher, student, and video. Most files include several IDs, and a MET respondent (teacher or student) or video may appear more than once in any particular file, depending on the level of analysis at which that file is provided. For example, a teacher who taught three sections of math may appear three times in a file provided at the class/section level. Files at different levels of analysis can be merged; however, users wishing to avoid files with redundant variables containing excessive missing values, or with repeated cases carrying redundant information, will need to aggregate variables up to, or disaggregate variables down to, a common analysis level prior to merging. For more information on the analysis level of each file, see the MET LDB Study Organization charts.
Any number of data files that contain the same ID can be merged. Two data files have been created to assist users wishing to merge files on an ID that is not common to all of the files to be merged. The Subject ID Crosswalk contains only ID variables and describes the associations between districts, schools, teachers, sections, and students. The Video Information File contains more than just IDs, but it also records the relationship between each video and the district, school, teacher, and section from which it was recorded.
More information about merging files, including counts of the number of teachers common across different combinations of files, can be found in Appendix A of the User Guide.
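As a concrete sketch of the aggregate-then-merge advice above (the IDs and variable names here are invented for illustration, not actual MET LDB variables), section-level records can be rolled up to the teacher level before merging with a teacher-level file:

```python
from statistics import mean

# Hypothetical section-level records: one row per class section.
sections = [
    {"teacher_id": "T01", "section_id": "S1", "class_mean_score": 71.0},
    {"teacher_id": "T01", "section_id": "S2", "class_mean_score": 75.0},
    {"teacher_id": "T02", "section_id": "S3", "class_mean_score": 68.0},
]

# Hypothetical teacher-level records: one row per teacher.
teachers = {"T01": {"years_experience": 9}, "T02": {"years_experience": 3}}

# Step 1: aggregate the section-level variable up to the teacher level,
# so both files share the teacher level of analysis.
by_teacher = {}
for row in sections:
    by_teacher.setdefault(row["teacher_id"], []).append(row["class_mean_score"])

# Step 2: merge on the common teacher ID.
merged = {
    tid: {"mean_score": mean(scores), **teachers[tid]}
    for tid, scores in by_teacher.items()
}
```

Merging the raw section-level rows directly would instead repeat each teacher's `years_experience` once per section, which is exactly the redundancy the guidance above warns about.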
The VDE is a virtual machine launched from the researcher's own desktop but operating on a remote server, similar to remotely logging into another physical computer. The virtual machine is isolated from the user's physical desktop computer, restricting the user from downloading files or parts of files to their physical computer. The virtual machine is also restricted in its external access, preventing users from emailing, copying, or otherwise moving files outside of the secure environment, either accidentally or intentionally.
The VDE operates just like a standard Windows desktop and includes many of the most popular statistical analysis packages. All work with the quantitative data files can only be done within the VDE. The data files and software remain on an ICPSR server. Software tools and support available in the VDE include:
- Geospatial Analysis Tools
- ArcGIS
- GeoDa
- SpatialEcology
- Statistical Analysis Tools
- DimPack
- HLM
- IRTPRO
- JAGS
- MPlus
- R for Windows
- RStudio
- SAS
- SPSS
- Stata
- StatTransfer
- SUDAAN
- WinBUGS
- Documentation Support
- Adobe Acrobat
- GNU Emacs
- Textpad
- Microsoft Office
Once approved for use of the confidential data, access to the streamed video data files will be granted via a secure login, using a web browser on your computer (not within the VDE). A filterable and categorized list of uniquely identified videos will be presented, and researchers may select and stream any videos of interest.
Each video has a four-character alphanumeric Session ID that is referenced in the data files. Note this ID to locate the corresponding entry on the Observation Session List.
The MET-X videos can be streamed along with the MET LDB videos. ICPSR holds no associated metadata or quantitative data about these videos and cannot respond to user inquiries about their content or use. Please visit the MET-X website or contact support@umichsoe.zendesk.com for further information.
Using the VDE and Video Player
Requests for additional software are welcome and will be reviewed in batches every three months. Please email ICPSR-help@umich.edu with requests. Please be aware that we have made every effort to provide a wide range of software for use with the MET LDB; requested software packages with little demand are unlikely to be included.
Researchers interested in uploading existing syntax or other files into the VDE for use with the MET data should email the files to ICPSR-help@umich.edu, and we will vet them in the same manner as requested output. ICPSR staff will not allow researchers to upload data files, as the MET data may not be merged with other data sources.
As a user of the MET data, you are obligated to protect the confidentiality of the students and teachers who appear in the videos and quantitative data. No information may be recorded outside of the VDE that might disclose the identities of districts, teachers, classes, or students, either alone or in combination with other information. For example, attributes of teachers or students appearing in videos, such as race, age, or gender, may not be recorded in any way outside of the VDE. Additionally, attributes of the class session, such as grade or subject, may not be recorded outside of the VDE. Re-identifying schools or districts from information such as classroom setting, student demographics, or material appearing in classrooms (e.g., names of sports teams) is also prohibited.
It is acceptable to record information that does not identify people or places. For example, rubrics describing teaching styles, student reactions to lessons, and classroom tone or climate may be used to score videos. These scores may be recorded outside the VDE along with the video ID.
Research team members who will record non-identifying information about teaching sessions will not need access to the VDE. Any team member who will record identifying information (such as characteristics of teachers or students) or requires access to the quantitative data will need to complete their work within the VDE.
Complying with the DUA
No. The MET database was not designed as a representative sample of teachers in each district. Comparisons across districts are not valid. Analytical models should control for unobserved differences among districts, but conclusions about differences between districts cannot be derived from these data.
All of the data files and videos are covered by parent and teacher consent for secondary analysis, except the District-Wide Files. The District-Wide Files are a collection of administrative data provided to approved researchers in accordance with the Family Educational Rights and Privacy Act (FERPA), which stipulates that student record data may be distributed for secondary analysis only for the purposes of education research. Please see FERPA §99.31 for further information on appropriate uses of student administrative records. All MET LDB Data Use Agreement applications will be evaluated to ensure that the proposed use of the District-Wide Files, if applicable, complies with FERPA requirements and falls within the scope of the goals of the original MET Project: to build and test measures of effective teaching and to help school districts identify and develop great teaching.
Although the names of the school districts included in the MET LDB are known, the identities of those districts should be masked in all publications based on the MET LDB for two reasons.
First, since the MET data are not a representative sample of teachers in each district, identification of districts may lead to comparisons that lack scientific merit.
Second, public identification of districts would increase the risk that individual teachers and students might be identified. As a user of the MET LDB, you are obligated to protect the confidentiality of individuals in the database.
Districts should not be named in any publications or presentations except as part of a statement listing all the districts that participated in MET. This information can already be found publicly on the MET website and in existing MET publications.
Publications should not include tables providing information that could be used to identify MET districts. For example, demographic characteristics (e.g., distribution by race, ethnicity, or free and reduced-price school lunches) may lead to identification of districts and should not be associated with MET District IDs. Readers may be provided background information about the districts from public sources, as long as that information is not associated with MET District IDs.
Working on a laptop or working from home is possible, as long as work is completed in an office that meets the requirements of a Secure Project Office, which are listed below:
- The screen cannot be visible from open doors or through windows
- The door must be closed when using the data
- Only approved research staff can be in the room while the data is in use
- If a data session is active while you are out of the office, the door must be locked
Changes in office locations are not required to be reported. Please note that any new location must still meet the criteria for a Secure Project Office as defined in FAQ #4 above and in the DUA.
Yes. The instructions from the DUA are as follows:
M. To notify ICPSR of a change in institutional affiliation of the Investigator. Notification must be in writing and must be received by ICPSR at least six (6) weeks prior to Investigator's last day of employment with Institution. Investigator's separation from Institution terminates this Agreement. Investigator may reapply for access to Confidential Data as an employee of the new institution. Re-application requires:
- Execution of a new Agreement for the Use of Confidential Data by both the Investigator and the proposed new institution;
- Execution of any Supplemental Agreement(s) with Research Staff and Pledges of Confidentiality by Research Staff and Participants at the proposed new institution;
- Preparation and approval of a new Data Security Plan; and
- Evidence of approval or exemption by the proposed new institution's IRB.
These materials must be approved by ICPSR before Confidential Data or any derivatives or analyses may be accessed at the new institution.
If a staff member leaves the project, their license may be reassigned to a new staff member. However, the license will still expire based on the date it was originally purchased and assigned.
Obtaining and Publishing Output
Researchers interested in removing output or other non-data files from the secure environment may submit a request to ICPSR, and ICPSR staff will review the requested files to ensure that no confidential information leaves the secure environment. The request must include the following information:
- A narrative summary of all output requested for release, compiled into a single document, one-page maximum.
- File names of all documents requested.
ICPSR staff will email files with sufficiently low re-identification risk, or request modifications prior to a second review.
Current vetting requirements and criteria are provided here but may be modified in order to protect the confidentiality of students and teachers.
- No data files, data extracts, or listing of cases will be released.
- No identifying output broken out by district, teacher, classroom, or student will be released. Please review the Data Use Agreement for further information on acceptable output presentations.
- Please limit requested output to results that are needed for presentations and papers. We are unlikely to review output beyond 15 standard pages in length. Be aware that longer output will significantly increase the length of the review period. In most cases, we will only review one set of output files per project.
- Well-documented and explained program files should accompany all output.
- Tabular output:
- Tables must include cell counts and percentages
- Tables with cell sizes smaller than ten (10) will not be released. Please recode variables as necessary before submitting tables for review.
- If applicable, tables must be accompanied by a relevant subset count and description.
- Regression output:
- Regression output must include a description of any subsampling and a description of the variables included in the model
- Regressions based on fewer than 20 cases will not be released
- Logistic regression output should be submitted with a pseudo-R-squared statistic
- District-Wide output files:
- We will look closely at District-Wide output to ensure that it meets the above requirements, meets FERPA regulations, and falls within the scope of the original goals of the MET Project. Please see How can the District-Wide Files be analyzed, and what do I and my Institutional Review Board (IRB) need to know about this group of files? for further information on appropriate and sanctioned uses of the District-Wide Files.
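As a hedged illustration of the cell-size rule above, the following Python sketch flags crosstab cells that fall below the minimum of ten before output is submitted for review. The table layout and category names are hypothetical, not actual MET LDB output.

```python
# Hypothetical pre-submission check: flag table cells below the
# minimum releasable cell size (10) so variables can be recoded
# before the table is requested for release.

MIN_CELL_SIZE = 10

# A crosstab represented as {row_category: {column_category: count}}.
table = {
    "group_a": {"yes": 42, "no": 8},
    "group_b": {"yes": 15, "no": 27},
}

def small_cells(crosstab, minimum=MIN_CELL_SIZE):
    """Return (row, column) pairs whose counts fall below the minimum."""
    return [(r, c) for r, cols in crosstab.items()
            for c, n in cols.items() if n < minimum]

flagged = small_cells(table)
if flagged:
    print("Recode before submitting; small cells:", flagged)
```

A similar check for the 20-case minimum on regression samples could be applied to the analysis sample size before output is compiled.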
MET Additional Documentation
These additional documentation files are a sample of the web-based rater training courses and video scoring guides used to instruct video observation raters for a selection of the video observation scoring measures. This material is provided for the purpose of research and documentation of the Measures of Effective Teaching Longitudinal Database and is not to be used for other purposes. Please be advised that these materials may contain broken links to material no longer available.
Framework for Teaching
MET Rater Training: CLASS Training
Analysis and Problem Solving Cheat Sheet
Behavior Management Cheat Sheet
Content Understanding Cheat Sheet
Instructional Dialogue Cheat Sheet
Instructional Learning Formats Cheat Sheet
Negative Climate Cheat Sheet
Positive Climate Cheat Sheet
Productivity Cheat Sheet
Quality of Feedback Cheat Sheet
Regard for Adolescent Perspectives Cheat Sheet
Regard for Student Perspectives Cheat Sheet
Student Engagement Cheat Sheet
Teacher Sensitivity Cheat Sheet
Scoring Leader Training: CLASS Secondary
CMQI Lite
Quality Science Teaching Scales
- Assigns Tasks to Promote Learning and Addresses the Task Demands
- Demonstrates Content Knowledge
- Elicits Evidence of Students' Knowledge and Conceptual Understanding
- Guides Analysis and Interpretation of Data
- Promotes Students' Interest and Motivation to Learn Science
- Provides Feedback For Learning
- Provides Guidelines for Conducting the Investigation and Gathering Data
- Sets the Context and Focuses Learning on Key Science Concepts
- Uses Modes of Teaching Science Concepts
- Uses Representations
MET Project Instrument Descriptions
- Teacher Instruments
- Teachers' Perceptions and the MET Project (Teacher Working Conditions Survey)
- Content Knowledge for Teaching and the MET Project (Content Knowledge for Teaching Assessment - CKT)
- Classroom Observation Instruments
- The CLASS Protocol for Classroom Observations (The Classroom Assessment Scoring System - CLASS)
- Danielson's Framework for Teaching for Classroom Observations (Framework for Teaching - FFT)
- The MQI Protocol for Classroom Observations (Mathematical Quality of Instruction - MQI lite)
- The PLATO Protocol for Classroom Observations (Protocol for Language Arts Teaching Observation - PLATO Prime)
- Student Instruments
- Student Assessments and the MET Project (Student Supplemental Assessments)