Peer-Reviewed Manuscripts /cadre/ en Using Hierarchical Logistic Regression to Study DIF and DIF Variance in Multilevel Data /cadre/2019/04/15/using-hierarchical-logistic-regression-study-dif-and-dif-variance-multilevel-data <span>Using Hierarchical Logistic Regression to Study DIF and DIF Variance in Multilevel Data</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2019-04-15T13:29:41-06:00" title="Monday, April 15, 2019 - 13:29">Mon, 04/15/2019 - 13:29</time> </span> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/cadre/taxonomy/term/42"> Peer-Reviewed Manuscripts </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cadre/taxonomy/term/53" hreflang="en">Large-scale Assessment</a> </div> <a href="/cadre/benjamin-shear">Benjamin Shear</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div><p><strong>Link to Resource:&nbsp;</strong><a href="/cadre/media/100" data-entity-type="media" data-entity-uuid="09b28de8-45c6-4da0-806e-6145c94b0fe1" data-entity-substitution="canonical" rel="nofollow" title="difhlm-man-submit-rev1-proofed.pdf"><strong>Using Hierarchical Logistic Regression to Study DIF and DIF Variance in Multilevel Data</strong></a></p><p><strong>Authors:&nbsp;</strong>Benjamin R. Shear</p><p><strong>Citation:</strong>&nbsp;Shear, B.R. (2018).&nbsp;Using hierarchical logistic regression to study DIF and DIF variance in multilevel data. 
Journal of Educational Measurement, 55(4), 513. Pre-print:&nbsp;<a href="https://onlinelibrary.wiley.com/doi/full/10.1111/jedm.12190" rel="nofollow">https://onlinelibrary.wiley.com/doi/full/10.1111/jedm.12190</a></p><p><strong>Abstract:</strong>&nbsp;</p><p>When contextual features of test-taking environments differentially affect item responding for different test-takers and these features vary across test administrations, they may cause differential item functioning (DIF) that varies across test administrations. Because many common DIF detection methods ignore potential DIF variance, this paper proposes the use of random coefficient hierarchical logistic regression (RC-HLR) models to test for both uniform DIF and DIF variance simultaneously. A simulation study and real data analysis are used to demonstrate and evaluate the proposed RC-HLR model. Results show the RC-HLR model can detect uniform DIF and DIF variance more accurately than standard logistic regression DIF models in terms of bias and Type I error rates.</p></div> </div> </div> </div> </div> <div>By Benjamin Shear. When contextual features of test-taking environments differentially affect item responding for different test-takers and these features vary across test administrations, they may cause differential item functioning (DIF) that varies across test administrations. 
Because many common DIF detection methods ignore potential DIF variance, this paper proposes the use of random coefficient hierarchical logistic regression (RC-HLR) models to test for both uniform DIF and DIF variance simultaneously.</div> Mon, 15 Apr 2019 19:29:41 +0000 Anonymous 245 at /cadre Learning Progressions and Embedded Assessment /cadre/2019/04/03/learning-progressions-and-embedded-assessment <span>Learning Progressions and Embedded Assessment</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2019-04-03T15:06:58-06:00" title="Wednesday, April 3, 2019 - 15:06">Wed, 04/03/2019 - 15:06</time> </span> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/cadre/taxonomy/term/42"> Peer-Reviewed Manuscripts </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cadre/taxonomy/term/51" hreflang="en">Classroom Assessment</a> <a href="/cadre/taxonomy/term/36" hreflang="en">Learning Progressions</a> </div> <a href="/cadre/derek-briggs">Derek Briggs</a> <span>,&nbsp;</span> <a href="/cadre/erin-furtak">Erin Furtak</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div><p><strong>Link to Resource: </strong><a href="/cadre/media/95" data-entity-type="media" 
data-entity-uuid="75078932-1aad-42c2-a215-8e340fc06f8f" data-entity-substitution="canonical" rel="nofollow" title="bf_ncme_book_110618.pdf">Learning Progressions and Embedded Assessment&nbsp;</a></p><p><strong>Authors:&nbsp;</strong>Derek C. Briggs and Erin Marie Furtak</p><p><strong>Citation:</strong>&nbsp;Briggs, D.C. &amp; Furtak, E. (2018). Learning progressions and embedded assessment.&nbsp;Pre-print from S. Brookhart &amp; J. McMillan (Eds.), <em>Classroom Assessment and Educational Measurement</em>, NCME Book Series.&nbsp;</p><p><strong>Abstract:</strong>&nbsp;</p><p>Learning progressions have great potential as an organizing framework for classroom instruction and assessment. However, successful implementation of this framework hinges upon developing a curriculum-embedded system of student assessment. In this chapter, an approach to meeting this challenge is illustrated in the context of a learning progression in science that crosses the disciplinary boundaries of physics, chemistry and biology in a high school setting. The four key ingredients of our approach are (1) mapping and aligning the scientific content of the learning progression to the curricula of the participating teachers, (2) making the case that assessment activities targeted to the learning progression can provide teachers with relevant insights about their students, (3) bringing teachers together to discuss student ideas that emerge from assessment activities, and (4) linking the assessments within and across the courses taught by participating teachers.</p></div> </div> </div> </div> </div> <div>By Derek C. Briggs and Erin Marie Furtak. Learning progressions have great potential as an organizing framework for classroom instruction and assessment. However, successful implementation of this framework hinges upon developing a curriculum-embedded system of student assessment. 
In this chapter, an approach to meeting this challenge is illustrated in the context of a learning progression in science that crosses the disciplinary boundaries of physics, chemistry and biology in a high school setting.</div> Wed, 03 Apr 2019 21:06:58 +0000 Anonymous 233 at /cadre Making Inferences about Teacher Observation Scores over Time /cadre/2019/04/02/making-inferences-about-teacher-observation-scores-over-time <span>Making Inferences about Teacher Observation Scores over Time</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2019-04-02T15:10:33-06:00" title="Tuesday, April 2, 2019 - 15:10">Tue, 04/02/2019 - 15:10</time> </span> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/cadre/taxonomy/term/42"> Peer-Reviewed Manuscripts </a> </div> <a href="/cadre/derek-briggs">Derek Briggs</a> <span>,&nbsp;</span> <a href="/cadre/jessica-alzen">Jessica Alzen</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div><p><strong>Link to Resource:&nbsp;</strong><a href="/cadre/media/96" data-entity-type="media" data-entity-uuid="c24bc0c7-8370-4a97-8882-44a860c3605b" data-entity-substitution="canonical" rel="nofollow" title="ba_jebs_top_122118_preprint.pdf"><strong>Making Inferences about Teacher Observation Scores over Time</strong></a></p><p><strong>Authors:&nbsp;</strong>Derek C. Briggs and Jessica L. Alzen</p><p><strong>Citation:</strong>&nbsp;Pre-print of Briggs, D. C. 
&amp; Alzen, J. L. (2019). Making inferences about teacher observation scores over time. Educational and Psychological Measurement.&nbsp;https://doi.org/10.1177/0013164419826237</p><p><strong>Abstract:</strong>&nbsp;</p><p>Observation protocol scores are commonly used as status measures to support inferences about teacher practices. When multiple observations are collected for the same teacher over the course of a year, some portion of a teacher’s score on each occasion may be attributable to the rater, lesson and time of year of the observation. All three of these are facets that can threaten the generalizability of teacher scores, but the role of time is easiest to overlook. A generalizability theory framework is used in this study to illustrate the concept of a hidden facet of measurement. When there are many temporally spaced observation occasions, it may be possible to support inferences about the growth in teaching practices over time as an alternative (or complement) to making inferences about status at a single point in time. This study uses longitudinal observation scores from the Measures of Effective Teaching project to estimate the reliability of teacher-level growth parameters for designs that vary in the number and spacing of observation occasions over a two-year span. On the basis of a subsample of teachers scored using the Danielson Framework for Teaching, we show that at least 8 observations over two years are needed to distinguish growth in teaching practices with a reliability coefficient of .38.</p></div> </div> </div> </div> </div> <div>By Derek C. Briggs and Jessica L. Alzen. Observation protocol scores are commonly used as status measures to support inferences about teacher practices. When multiple observations are collected for the same teacher over the course of a year, some portion of a teacher’s score on each occasion may be attributable to the rater, lesson and time of year of the observation. 
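The generalizability-theory logic in the abstract above — partitioning observation-score variance into a teacher component and occasion-linked error, then computing the reliability of a mean over a given number of occasions — can be sketched with simulated data. This is a minimal illustrative sketch, not the authors' analysis: the one-facet (teacher × occasion) design, sample sizes, and variance values are all assumptions, and the paper itself studies growth-slope reliability rather than the mean-score reliability shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative one-facet (teacher x occasion) G-study with simulated scores;
# the true variance components below are assumptions for the demo.
n_teachers, n_occasions = 300, 4
var_t_true, var_e_true = 0.25, 0.49  # teacher variance, residual variance

teacher_eff = rng.normal(0.0, np.sqrt(var_t_true), size=(n_teachers, 1))
scores = 3.0 + teacher_eff + rng.normal(
    0.0, np.sqrt(var_e_true), size=(n_teachers, n_occasions)
)

# Estimate variance components from mean squares, using the expected mean
# squares E[MS_teacher] = n_occasions * var_t + var_e and E[MS_resid] = var_e.
ms_teacher = n_occasions * scores.mean(axis=1).var(ddof=1)
ms_resid = ((scores - scores.mean(axis=1, keepdims=True)) ** 2).sum() / (
    n_teachers * (n_occasions - 1)
)
var_t = (ms_teacher - ms_resid) / n_occasions
var_e = ms_resid

def g_coefficient(var_t, var_e, n_obs):
    """Reliability of a teacher's mean score over n_obs observation occasions."""
    return var_t / (var_t + var_e / n_obs)

rel_4 = g_coefficient(var_t, var_e, 4)
rel_8 = g_coefficient(var_t, var_e, 8)
```

Adding occasions shrinks the error term, so `rel_8` exceeds `rel_4`; the D-study reasoning behind the paper's "at least 8 observations" finding follows the same pattern, with growth slopes in place of mean scores.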
</div> Tue, 02 Apr 2019 21:10:33 +0000 Anonymous 237 at /cadre Examining the Dual Purpose Use of Student Learning Objectives for Classroom Assessment and Teacher Evaluation /cadre/2019/04/01/examining-dual-purpose-use-student-learning-objectives-classroom-assessment-and-teacher <span>Examining the Dual Purpose Use of Student Learning Objectives for Classroom Assessment and Teacher Evaluation</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2019-04-01T15:23:49-06:00" title="Monday, April 1, 2019 - 15:23">Mon, 04/01/2019 - 15:23</time> </span> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/cadre/taxonomy/term/42"> Peer-Reviewed Manuscripts </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cadre/taxonomy/term/51" hreflang="en">Classroom Assessment</a> </div> <a href="/cadre/derek-briggs">Derek Briggs</a> <span>,&nbsp;</span> <a href="/cadre/rajendra-chattergoon">Rajendra Chattergoon</a> <span>,&nbsp;</span> <a href="/cadre/amy-burkhardt">Amy Burkhardt</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div><p><strong>Link to Resource:&nbsp;</strong><a href="/cadre/media/97" data-entity-type="media" data-entity-uuid="13dad11c-3332-4af8-ae11-9b125b5c0100" data-entity-substitution="canonical" 
rel="nofollow" title="bcb_slo_jem_110118_final.pdf"><strong>Examining the Dual Purpose Use of Student Learning Objectives for Classroom Assessment and Teacher Evaluation</strong></a></p><p><strong>Authors:&nbsp;</strong>Derek C. Briggs, Rajendra Chattergoon, and Amy Burkhardt</p><p><strong>Citation:</strong>&nbsp;Briggs, D.C., Chattergoon, R. &amp; Burkhardt, A. (2018). Examining the dual purpose use of student learning objectives for classroom assessment and teacher evaluation.&nbsp;Journal of Educational Measurement (in press).</p><p><strong>Abstract:</strong>&nbsp;</p><p>The process of setting and evaluating Student Learning Objectives (SLOs) has become increasingly popular as an example where classroom assessment is intended to fulfill the dual purpose use of informing instruction and holding teachers accountable. A concern is that the high-stakes purpose may lead to distortions in the inferences about students and teachers that SLOs can support. This concern is explored in the present study by contrasting student SLO scores in a large urban school district to performance on a common objective external criterion. This external criterion is used to evaluate the extent to which student growth scores appear to be inflated. Using two years of data, growth comparisons are also made at the teacher level for teachers who submit SLOs and have students that take the state-administered large-scale assessment. Although the two different measures of growth show similar relationships with demographic covariates and have the same degree of stability across years, they are weakly correlated.</p></div> </div> </div> </div> </div> <div>By Derek C. Briggs, Rajendra Chattergoon, and Amy Burkhardt. 
The process of setting and evaluating Student Learning Objectives (SLOs) has become increasingly popular as an example where classroom assessment is intended to fulfill the dual purpose use of informing instruction and holding teachers accountable.</div> Mon, 01 Apr 2019 21:23:49 +0000 Anonymous 241 at /cadre Using a Learning Progression Framework to Assess and Evaluate Student Growth /cadre/2017/08/25/using-learning-progression-framework-assess-and-evaluate-student-growth <span>Using a Learning Progression Framework to Assess and Evaluate Student Growth</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2017-08-25T15:18:27-06:00" title="Friday, August 25, 2017 - 15:18">Fri, 08/25/2017 - 15:18</time> </span> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/cadre/taxonomy/term/42"> Peer-Reviewed Manuscripts </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cadre/taxonomy/term/32" hreflang="en">Educator Effectiveness</a> <a href="/cadre/taxonomy/term/36" hreflang="en">Learning Progressions</a> </div> <a href="/cadre/derek-briggs">Derek Briggs</a> <span>,&nbsp;</span> <a href="/cadre/elena-diaz-bilello">Elena Diaz-Bilello</a> <span>,&nbsp;</span> <span>Fred Peck</span> <span>,&nbsp;</span> <a href="/cadre/jessica-alzen">Jessica Alzen</a> <span>,&nbsp;</span> <span>Raymond Johnson</span> <div class="ucb-article-content ucb-striped-content"> <div class="container"> 
<div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div><p><strong>Link to Resource:</strong>&nbsp; <a href="/cadre/media/353" rel="nofollow">Using a learning progression framework to assess and evaluate student growth</a>.&nbsp;[<a href="/cadre/media/354" rel="nofollow">Click here for the Executive Summary</a>]</p><p><strong>Authors:</strong> &nbsp;Derek Briggs, Elena Diaz-Bilello, Fred Peck, Jessica Alzen, Raymond Johnson</p><p><strong>Citation:</strong> &nbsp;Briggs, D.C., Diaz-Bilello, E., Peck, F., Alzen, J., Chattergoon, R., &amp; Johnson, R. (2015). Using a learning progression framework to assess and evaluate student growth. Boulder, CO: Center for Assessment, Design, Research and Evaluation (CADRE) and National Center for the Improvement of Educational Assessment.&nbsp;</p></div> </div> </div> </div> </div> Fri, 25 Aug 2017 21:18:27 +0000 Anonymous 142 at /cadre The Prospects of Teacher Pay-for-Performance /cadre/2017/08/25/prospects-teacher-pay-performance <span>The Prospects of Teacher Pay-for-Performance</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2017-08-25T15:14:53-06:00" title="Friday, August 25, 2017 - 15:14">Fri, 08/25/2017 - 15:14</time> </span> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/cadre/taxonomy/term/42"> Peer-Reviewed Manuscripts </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i 
class="fa-solid fa-tags"></i> </div> <a href="/cadre/taxonomy/term/30" hreflang="en">Teacher Labor Markets</a> </div> <a href="/cadre/derek-briggs">Derek Briggs</a> <span>,&nbsp;</span> <span>Michael Turner</span> <span>,&nbsp;</span> <span>Charles Bibilos</span> <span>,&nbsp;</span> <span>Andy Maul</span> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default 3"> <div class="ucb-article-text" itemprop="articleBody"> <div><p><strong>Link to Resource: &nbsp;</strong><a href="/cadre/media/355" rel="nofollow">The prospects of teacher pay-for-performance</a></p><p><strong>Authors:&nbsp;</strong> Derek Briggs, Michael Turner, Charles Bibilos, Andy Maul</p><p><strong>Citation:&nbsp;</strong> Briggs, D. C., Turner, M., Bibilos, C. &amp; Maul, A. (2014). The prospects of teacher pay-for-performance. Boulder, CO: Center for Assessment, Design, Research and Evaluation (CADRE). &nbsp;</p></div> </div> </div> </div> </div> Fri, 25 Aug 2017 21:14:53 +0000 Anonymous 140 at /cadre
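Returning to the first manuscript on this page: the standard logistic regression DIF model that the RC-HLR approach generalizes can be sketched with simulated item responses. Everything here (sample size, effect sizes, the ability proxy, the Newton-Raphson fitting loop) is an illustrative assumption rather than the paper's code; the RC-HLR extension would additionally let the group coefficient vary randomly across test administrations, which this single-level sketch does not do.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

theta = rng.normal(size=n)          # proxy for examinee ability (illustrative)
group = rng.integers(0, 2, size=n)  # reference (0) vs. focal (1) group
true_dif = 0.5                      # illustrative uniform-DIF effect on the logit

# Simulate item responses: logit P(correct) = -0.2 + 1.0*theta + true_dif*group
logit = -0.2 + 1.0 * theta + true_dif * group
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

# Fit the uniform-DIF logistic regression y ~ 1 + theta + group by
# Newton-Raphson; a nonzero `group` coefficient signals uniform DIF.
X = np.column_stack([np.ones(n), theta, group])
beta = np.zeros(3)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    w = p * (1.0 - p)
    grad = X.T @ (y - p)                 # score vector
    hess = X.T @ (X * w[:, None])        # observed information
    beta += np.linalg.solve(hess, grad)

dif_hat = beta[2]  # estimated uniform-DIF coefficient
```

With a large simulated sample the `group` coefficient lands near the generating value; the hierarchical model in the paper would instead report both its mean and its variance across clusters.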