Extended Time Testing Accommodations: What Does the Research Say?
By Benjamin J. Lovett
Extended time is among the most common testing accommodations given to students with a wide range of disabilities (e.g., Bolt & Thurlow, 2004). However, although school psychologists are often involved in accommodation decisions, many are unaware of research from the past decade that has changed our understanding of extended time. Used properly, testing accommodations let students demonstrate their skills, increasing the accuracy of their test scores. But accommodations are a double-edged sword: They break the standardization of testing, and so they can compromise score quality as well. Extended time is no exception to this rule. In this article, I have selected the most important and consistent findings from the research literature, focusing on their relevance for practitioners (for a more technical presentation of these issues, see Lovett, 2010). Four basic facts about this accommodation are of particular importance: (a) extended time often helps students regardless of their disability status, (b) it can change the proper interpretation of students’ test scores, (c) decisions about it are not made consistently, and (d) interventions may help to decrease students’ reliance on it.
Extended time tends to help nondisabled students, too. As Zuriff (2000) noted, providing more time to students with disabilities assumes that nondisabled students display their skills fully under standard time limits. But the most thorough review of literature on this point (Sireci, Scarpati, & Li, 2005) concluded that "extra time appears to improve the performance of all student groups, not just those with disabilities" (p. 483). These authors found that students with disabilities sometimes showed greater benefits, but at other times the groups benefited equally. Since then, the research has confirmed this conclusion, sometimes going even further. In fact, two more recent studies (Lewandowski, Lovett, Parolin, Gordon, & Codding, 2007; Lewandowski, Lovett, & Rogers, 2008) found that under severe time pressure, nondisabled students actually showed more benefit from extended time than students with either reading disabilities or attention deficit hyperactivity disorder (ADHD). Even when students with disabilities show greater benefits from extended time, the problem with nondisabled students benefiting at all should be obvious. Students are provided time extensions when they are thought to require the additional time to have full access to examinations; nondisabled students are assumed to already have full access under standard time limits. If all students benefit from extended time, then the accommodation becomes a privilege conferred on some and not others. Some advocates might still defend current practices by arguing that extended time is given to students who need it most, but this assumes that the end goal of accommodations is higher scores, rather than more accurate scores.
Extended time changes the meaning of students’ test performance. No school psychologist would give extended time accommodations on a curriculum-based measurement probe; the probe is designed to measure fluency, which includes the ability to respond quickly. The importance of fluency in basic academic skills, now widely acknowledged, should make us consider how extended time accommodations affect the nature of all tasks and the test scores that result from them. Even relatively generous test time limits require some degree of automaticity in skills. This automaticity turns out to be important later on; we know that students who respond fluently (and not just accurately) are better able to generalize their skills and retain those skills in the absence of practice, among other benefits (Kubina & Morrison, 2000; Martens & Witt, 2004). More specifically, reading fluency predicts students’ later reading comprehension (e.g., Wood, 2006; Yovanoff, Duesbury, Alonzo, & Tindal, 2005), and time limits encourage a faster rate of work completion (Pariseau, Fabiano, Massetti, Hart, & Pelham, 2010). Of course, if fluency is a desired component of performance on a given test, extended time would never be appropriate on that test, since the accommodation would alter the construct being measured.
Recent research confirms that extended time accommodations can change the interpretation of test scores. For instance, Bolt (2004) examined the psychometric properties of test items completed under standard versus extended time conditions and found that as many as 44% of the items showed differential item functioning (DIF), meaning that the relationship between students’ overall skill levels and their chance of answering an item correctly was different depending on whether they received extended time accommodations. DIF is a threat to the basic fairness of items, since answering an item correctly should mean the same thing for all students. Related worries have arisen in the admissions testing arena, where extended time accommodations have been associated with scores that show less validity in predicting students’ future performance (e.g., Cahalan, Mandinach, & Camara, 2002; Thornton, Reese, Pashley, & Dalessandro, 2001).
Decisions about extended time are not made consistently. Most school psychologists have had occasion to wonder whether a given decision to classify a particular student would have been made the same way in another school. Decisions about extended time accommodations are even less likely than classification decisions to replicate. Even when a disability label commands consensus, there are no set criteria for determining when extended time is appropriate. One common strategy is to focus on students’ processing speed skills, but the little research we have on this point suggests that many measures of processing speed fail to predict which students will benefit from extended time (e.g., Ofiesh, Mather, & Russell, 2005).
Moreover, when students prepare to graduate from high school, the accommodations process changes again. College admissions testing agencies are not part of the special education system, and the Americans with Disabilities Act only requires that they provide reasonable accommodations to individuals who are substantially limited in a major life activity, a much stricter standard than that used by many multidisciplinary teams. Similarly, many colleges and universities now require extensive documentation of those substantial limitations, and so high-functioning students with learning disabilities or ADHD may not qualify for accommodations anymore. When schools err on the side of providing extended time early on, more requests for the accommodation are denied at later stages of the student’s education, since extended time was never really shown to be necessary in the first place.
One detail of accommodation decisions may be especially unreliable: the amount of extended time given. The default amount is usually either 50% or 100% additional time, with occasional "untimed testing" accommodations. However, there is no empirical justification for these particular time amounts, and research suggests that 25% additional time may be enough for students with disabilities to complete tests (Cahalan-Laitusis, King, Cline, & Bridgeman, 2006) and even to normalize their performance to that of nondisabled students (Cohen, Lewandowski, & Lovett, 2010). Unfortunately, decisions about amounts of additional time tend to be "sticky": students who have received 100% additional time in the past often expect to receive it indefinitely, and may even come to believe that they need that much time to take any test.
Accommodations may be taking the focus away from interventions. Students are provided with extended time on the basis of data from psychoeducational evaluations; could these same data suggest effective interventions instead? Consider test anxiety: Many students with disabilities experience it (e.g., Peleg, 2009), and an intensive study of the testing accommodations process at the middle school level (Rickey, 2005) found test anxiety to be a common rationale given for accommodations. Certainly, research has found that students are more relaxed when given extended time (Elliott & Marquart, 2004). But are accommodations the most appropriate response to test anxiety? Validated interventions exist for test anxiety (Zeidner, 1998), and a successful intervention would be far more effective in the long term than accommodations, which may only reinforce students’ belief that they are unable to perform under standard time limits.
Deficits in reading fluency are another student characteristic used to make decisions about extended time. Here, too, might accommodations distract from a more appropriate focus on improving fluency? The past decade has seen the development and dissemination of effective strategies for increasing reading fluency, with effect sizes in the moderate-to-large range (Fletcher, Lyon, Fuchs, & Barnes, 2007). Even in high school populations, where there may be more skepticism about interventions, effective fluency-building strategies exist (Wexler, Vaughn, Edmonds, & Reutebuch, 2008). Certainly, nothing stops schools from providing interventions in addition to accommodations; indeed, New York State’s testing accommodations guidelines recommend that decision makers consider what interventions have been attempted with students (Office of Vocational and Educational Services for Individuals with Disabilities, 2006). However, especially with older students, there is reason to worry that accommodations are overly tempting because they are simply easier than interventions.
More generally, interventions may be avoided because of doubts that students with disabilities can work more quickly. These doubts may be unfounded; in a recent study by Pariseau and colleagues (2010), students with ADHD completed mathematics worksheets and reading comprehension assignments under 30-minute and 45-minute conditions. When they were given only 30 minutes to work, students worked at a faster pace without significantly sacrificing quality; that is, their rate of correct answers increased substantially. Although the context of the assignment was different from that of exams, the study nonetheless suggests that work pace may be more malleable than we assume, even in students with disabilities. Moreover, long before Pariseau’s study, making time limits more salient ("explicit timing") was found to increase academic productivity (e.g., Van Houten & Thompson, 1976). It seems that the same principle underlying committee meetings may apply to completing test items: Work expands to fill the available time!
Implications for Practice
Research shows consistently that (a) the benefits of extended time are not specific to students with disabilities, (b) extended time changes the meaning of students’ test scores, (c) individual decisions about extended time—including the amount of time— are often not replicable or defensible, and (d) extended time can discourage interventions that target underlying sources of poor test performance. Faced with these findings, what can school psychologists do?
Work with teachers to determine when speed is important. Educate teachers about the utility of fluent responding and help them to consider whether fluency should contribute to performance on the classroom tests that they create. If they find speed to be important, extended time accommodations may not be appropriate for any students. If speed is truly unimportant, ensure that time limits are generous enough for all students (with and without disabilities) to finish. The general principle at work here, called universal design (Ketterlin-Geller, 2005), reduces the need for accommodations while enhancing the validity of all students’ assessment results.
Develop an extended time accommodation protocol for your school or district (see page 34). Clarify which student characteristics should affect whether a student is given accommodations. Gather evidence as to whether the student is actually unable to complete exams in the allotted time (self-reports, teacher observations, test performance). Rule out other explanations for slow performance (e.g., anxiety, low motivation). Examine evidence of low fluency in academic skills. Document any steps taken before considering accommodations (e.g., encouragement to work more quickly, reinforcement-based interventions for fluency). Given the literature suggesting that extended time changes the interpretation of test scores, your protocol should not provide extended time as a default; instead, it should require evidence that the accommodation is both necessary and appropriate. Finally, if you do decide to provide extended time, examine the efficacy of small amounts (25%) before providing larger amounts.
Consider whether interventions would be helpful, either in conjunction with or in lieu of extended time. Even if extended time is determined to be necessary, what steps will you take to make it less so? Is the student already receiving interventions to build fluency, and is he or she making progress? Make a note to specifically reevaluate the need for extended time at the annual review. If extended time is being considered because of test anxiety or other emotional or behavioral concerns, an intervention may be more appropriate than the accommodation itself.
A Balanced Perspective
Ultimately, the focus of our decisions should always be on enhancing autonomy through skill development. Used well, testing accommodations do precisely this, giving students with disabilities a chance to show their skills fully and receive feedback on what continues to need improvement. Asking a blind student to take a mathematics test in the same format as other students leads to invalid inferences about the student’s mathematics skills, whereas accommodations (testing in Braille, reading the test aloud, etc.) allow for a more accurate skill estimate. At times, extended time can have a similar effect, giving students with poor academic fluency the opportunity to demonstrate academic competencies on tests where fluency is not a construct of interest.
However, the sobering message from recent research is that extended time accommodations may at times have just the opposite effects, discouraging the development of needed skills and reinforcing the low self-efficacy of slow test takers, all while threatening the accuracy of their test scores. As purveyors of research and recognized experts on assessment, school psychologists are in an excellent position to promote more thoughtful use of accommodations. The first step is spreading the word about the limitations of extended time detailed in the research literature. Admittedly, arguing for the more judicious provision of extended time may be surprising to parents and school staff; on rare occasions, it may even lead to conflicts. But the truest child advocacy involves considering the student’s long-term interests, and so school psychologists must persuade other parties that providing only truly necessary accommodations leads to more valid assessments of a student’s functioning, while habituating the student to the testing conditions (including time pressure) which are often unavoidable in life. Confidently describing the findings of empirical research can go a long way in this persuasion.
References
Bolt, S. E. (2004, April). Using DIF analyses to examine several commonly held beliefs about testing accommodations for students with disabilities. Paper presented at the annual conference of the National Council on Measurement in Education, San Diego, CA.
Bolt, S. E., & Thurlow, M. L. (2004). Five of the most frequently allowed testing accommodations in state policy. Remedial and Special Education, 25, 141–152.
Cahalan, C., Mandinach, E. B., & Camara, W. J. (2002). Predictive validity of SAT I: Reasoning test for test-takers with learning disabilities and extended time accommodations. New York, NY: College Entrance Examination Board.
Cahalan-Laitusis, C., King, T. C., Cline, F., & Bridgeman, B. (2006). Observational timing study on the SAT Reasoning test for test-takers with learning disabilities and/or ADHD (Research Report 2006-4). New York, NY: College Board.
Cohen, J. A., Lewandowski, L. J., & Lovett, B. J. (2010, March). Differences between extended time allotments for learning disabled college students. Poster presented at the convention of the National Association of School Psychologists, Chicago, IL.
Elliott, S. N., & Marquart, A. M. (2004). Extended time as a testing accommodation: Its effects and perceived consequences. Exceptional Children, 349–367.
Fletcher, J. M., Lyon, G. R., Fuchs, L. S., & Barnes, M. A. (2007). Learning disabilities: From identification to intervention. New York, NY: Guilford.
Ketterlin-Geller, L. R. (2005). Knowing what all students know: Procedures for developing universal design for assessment. Journal of Technology, Learning, & Assessment, 4(2). Retrieved from http://www.jtla.org
Kubina, R. M., & Morrison, R. S. (2000). Fluency in education. Behavior and Social Issues, 10, 83–99.
Lewandowski, L. J., Lovett, B. J., Parolin, R. A., Gordon, M., & Codding, R. S. (2007). Extended time accommodations and the mathematics performance of students with and without ADHD. Journal of Psychoeducational Assessment, 25, 17–28.
Lewandowski, L. J., Lovett, B. J., & Rogers, C. L. (2008). Extended time as a testing accommodation for students with reading disabilities: Does a rising tide lift all ships? Journal of Psychoeducational Assessment, 26, 315–324.
Lovett, B. J. (2010). Extended time testing accommodations for students with disabilities: Answers to five fundamental questions. Review of Educational Research, 80, 611–638.
Martens, B. K., & Witt, J. C. (2004). Competence, persistence, and success: The positive psychology of behavioral skill instruction. Psychology in the Schools, 41, 19–30.
Office of Vocational and Educational Services for Individuals with Disabilities. (2006). Test access and accommodations for students with disabilities. Albany, NY: Author.
Ofiesh, N., Mather, N., & Russell, A. (2005). Using speeded cognitive, reading, and academic measures to determine the need for extended test time among university students with learning disabilities. Journal of Psychoeducational Assessment, 23, 35–52.
Pariseau, M. E., Fabiano, G. A., Massetti, G. M., Hart, K. C., & Pelham, W. E. (2010). Extended time on academic assignments: Does increased time lead to improved performance for children with attention-deficit/hyperactivity disorder? School Psychology Quarterly, 25, 236–248.
Peleg, O. (2009). Test anxiety, academic achievement, and self-esteem among Arab adolescents with and without learning disabilities. Learning Disability Quarterly, 32, 11–20.
Rickey, K. M. (2005). Assessment accommodations for students with disabilities: A description of the decision-making process, perspectives of those affected, and current practices. Unpublished doctoral dissertation, University of Iowa.
Sireci, S. G., Scarpati, S. E., & Li, S. (2005). Test accommodations for students with disabilities: An analysis of the interaction hypothesis. Review of Educational Research, 75, 457–490.
Thornton, A. E., Reese, L. M., Pashley, P. J., & Dalessandro, S. P. (2001). Predictive validity of accommodated LSAT scores (Technical Report 01-01). Newtown, PA: Law School Admission Council.
Van Houten, R., & Thompson, C. (1976). The effects of explicit timing on math performance. Journal of Applied Behavior Analysis, 9, 227–230.
Wexler, J., Vaughn, S., Edmonds, M., & Reutebuch, C. K. (2008). A synthesis of fluency interventions for secondary struggling readers. Reading and Writing, 21, 317–347.
Wood, D. E. (2006). Modeling the relationship between oral reading fluency and performance on a statewide reading test. Educational Assessment, 11, 85–104.
Yovanoff, P., Duesbury, L., Alonzo, J., & Tindal, G. (2005). Grade-level invariance of a theoretical causal structure predicting reading comprehension with vocabulary and oral reading fluency. Educational Measurement: Issues and Practice, 24(3), 4–12.
Zeidner, M. (1998). Test anxiety: The state of the art. New York, NY: Plenum.
Zuriff, G. E. (2000). Extra examination time for students with learning disabilities: An examination of the maximum potential thesis. Applied Measurement in Education, 13, 99–117.
Benjamin J. Lovett, PhD, is an assistant professor of psychology at Elmira College in Elmira, NY.