Part I: The ABCs of Accessibility and Accommodations in State Assessments

Led by Martha Thurlow, Ph.D., NCEO Director, and Research Associates Laurene Christensen, Ph.D., and Sheryl Lazarus, Ph.D., this information-packed webinar provides a quick and comprehensive overview of the types of assessments used and discusses the move toward technology-based assessments. (Watch Part 2 of the series)

Transcript: 

- [Voiceover] Good afternoon. My name is Tracy Gray and I'm part of the team at the Center on Technology and Disability. We are very excited to welcome you today to this webinar, which will focus on The ABCs of Accessibility and Accommodations in State Assessments. And we're just thrilled to have the opportunity to hear from Dr. Martha Thurlow and her team of Sheryl Lazarus and Laurene Christensen. They're joining us today from the National Center on Educational Outcomes. We here at the Center on Technology and Disability have the opportunity to work with our colleagues who lead this center at FHI 360, led by Jackie Hess, and we also work and partner with the PACER Center to really focus on this critical issue of accessibility and accommodations, particularly as it relates to state assessments. We have a large group today, which is really exciting. I just wanted to call your attention to the area on the lower right-hand side of your screen, where you can post questions. As Dr. Thurlow and her team go through the presentation, we'll be stopping to address those questions, and we have also built in a Q and A period at the end of the session. So we're very eager to hear from you. We want to make sure that we address your needs today and that you have a sense of what's critical related to this important topic. So Martha, I turn it over to you, and take it away.

- [Voiceover] All right, thank you, Tracy. We're really pleased to be able to participate today on this topic. It's the ABCs, and that means we are not going to get into a lot of depth, but we hope to give you kind of a scan of what's going on in terms of state assessments and current approaches to accessibility and accommodations. And we will definitely leave time for discussion and questions because that's probably gonna be the most important part. If you go to the next, I see there's a slide saying please do the survey at the end of the webinar, so we would like you to do that, so we make sure that these meet your needs. But if you could go to the next slide now. Thank you. I want to highlight that the three of us will be talking today. I will start us out and then I'll turn it over to Laurene, and then Laurene will turn it over to Sheryl, and then we'll come back to me as we get into questions. But accessibility and accommodations and assessments, there's a lot going on, and so we identified these four topics to begin our discussion today. First we want to talk about the types of assessments that are out there, focusing in especially on what's going on in terms of technology-based assessments today. We will talk about the paradigm shift that has really taken place in the field of assessment, related to accessibility and accommodations and what those are. And then we'll touch on challenges and solutions in the technology-based approaches to accessibility and accommodations. I say we'll touch on those because we will just begin to have a discussion of some of the challenges and potential solutions. And we will have a whole 'nother webinar topic that gets in more depth into some of the challenges that have occurred in the past year or so, and other opportunities and solutions that might be out there. So that's the plan for our series of two webinars. So I'm gonna jump right in.
And Tracy will watch for questions, so if I go into some topic and it's making no sense to you, please jot a question in the chat box and Tracy will make sure that we see it, and/or she'll interrupt us so that we be sure to respond to the question. Okay, so the next slide will get us into the types of assessments that are being used today in the world of assessment. So I'm waiting for the slide to turn. Thank you. So you know, there's a new law, the Every Student Succeeds Act, our latest version or reauthorization of the Elementary and Secondary Education Act. This is a very important act in terms of what's happened in the world of state assessments because it gives general guidelines about what is happening out there. Based on what's in that act, there are three main kinds of state assessments. Now this is really a generalization, but these are assessments that are in every state. So there's the general assessment, or the regular assessment, of English language arts and reading, and mathematics, and science. Those are required by the law. And then many states have assessments in other content areas as well. A second kind of assessment is the alternate assessment for students with the most significant cognitive disabilities. Those have been called the alternate assessment based on alternate achievement standards. That clarification, based on alternate achievement standards, may no longer be needed because the Every Student Succeeds Act essentially eliminated previous alternates based on modified achievement standards or grade level achievement standards. And then finally, another type of assessment out there is the English language proficiency assessments. Those are assessments for English learners to determine their proficiency in English. All right, so there are different forms of assessment. Thank you. And the ones we've been talking about so far, and that we will focus on primarily today, are the summative assessments.
Those are the ones that people most often think of when they talk about the state assessment that was used for NCLB or for adequate yearly progress. It's a summative assessment, an end-of-the-year kind of assessment. But there are many other kinds of assessments that are being used at the state or district level or at the classroom level. So there are interim or benchmark assessments. Those are mid-year assessments used to see how students are doing at certain periods of time. There's formative assessment, which is defined in different ways by different people, but is meant to get at the idea of assessing to inform instruction, so that instructional processes and procedures and approaches can be adjusted based on how the student is doing. Progress monitoring is another kind of along-the-way assessment to check on student progress. And for students with disabilities, we often hear about curriculum-based measures. There are other classroom assessments that generally are developed by teachers and/or are provided by textbooks. And then diagnostic assessments, used to diagnose particular challenges that a student might have. And you know, there are probably other names that you can come up with as well, but these are kind of the main forms of assessment out there today. And we are going to focus on the summative assessments. On the next slide, summative assessments also take different forms. The one we are most familiar with is the standardized assessment, which could be norm-referenced, comparing to how all students do, for example, or criterion-referenced, indicating how a student is doing in relationship to a specific standard. The primary type of assessments that are used for summative purposes today are criterion-referenced assessments. They are standards-based assessments. But there are other assessments that are summative assessments. Performance tasks you may hear about.
I think both of the large groups of states working on assessments have performance tasks. We also hear about authentic tasks. We hear about portfolios, which are ways to gather and document information for individual students. Next slide. I know this is a whirlwind through these topics. We could go into depth on any one of them, but just to give you some of the basics. So thinking about assessment formats, a lot of us probably are most familiar with the paper and pencil tests that have been around a long time and have a long history of development. And increasingly over time they became more and more synonymous with a multiple choice test, or a multiple choice test with some short constructed responses. So the nature of these paper and pencil tests shifted over time so that they were easier to score, took less time, et cetera. With some federal funding in 2010, called the Race to the Top Assessment Program, a whole new emphasis was given to technology-based assessments. The desire to use computers or other kinds of platforms, for example, iPads or other tablets, to have the assessments based in technology came from the many perceived benefits of that, including being able to score more quickly, being able to have different kinds of items, et cetera. So I'm gonna move on now to be sure that you are aware of the National Education Technology Plan that was developed to kind of provide guidance or recommendations for what should be happening in the future. They provide a vision for including students with disabilities in technology-based assessment. And in that plan there were four kinds of recommendations. Things like being sure that there was privacy of information at the same time that there's gathering of data that will improve learning and teaching.
Developing and designing and implementing various kinds of learning dashboards and various response systems and other communication pathways that would give everybody kind of timely and actionable feedback about student learning so that they could improve achievement and improve instructional practices. Another kind of recommendation related to the need to really create and validate an integrated system for designing and implementing valid, reliable, cost-effective assessments that really get at the complex skills and academic disciplines for the 21st century. And then the fourth recommendation was related to research and development to make sure that embedded assessment technologies, such as simulations, collaborative environments, virtual worlds, games, cognitive tutors, all could be used to engage and to motivate learners while assessing complex skills. Those are some wonderful recommendations related to a vision of technology that's grounded in a universal design approach, and that's more accessible and valid for greater numbers of students. So we're gonna keep this kind of in our minds as background, because I want us to be thinking, as we go through the rest of today's webinar, and as we go to the second webinar that gets even more in depth into some of the issues, about whether the vision is being realized. Are the recommendations being followed? What needs to happen to make sure that we get to a place where that truly is happening? So with the next slide, I've already mentioned some of these terms, but the technology-based assessments really opened up a discussion about being able to have innovative item types. Those really were promoted. Things like dragging and dropping. Including video stimuli or audios that students would listen to. Allowing students to make multiple responses, and not just a four-choice, one-answer-is-correct kind of item, but items where they could make multiple responses.
These are just a few of the item types that really have been promoted to increase engagement and get a better handle on what students know and are able to do. Other benefits of technology-based assessments that were identified were things like the possibility of adaptivity, adaptive assessments where the items that an individual student would see in the assessment would be based on how that student was responding. They would not be the same for every single student in a classroom, for example. Then that possibility of timely reporting. Getting scores much sooner so that, in fact, something could be done potentially. Not having to wait way into the next academic year to find out how students had done. And then finally, with the little asterisk, embedded accommodations. And this is where the work of our center has spent a lot of its time in the past few years. Really looking at what's going on in state assessments related to embedding accommodations and what that actually means. So, one more slide for me and then I'm gonna turn it over to Laurene. You know, I kinda asterisked that word accommodations because it is to some extent being redefined by the field of assessment. There's been a huge paradigm shift in thinking about meeting the accommodations needs of students, and that means meeting needs of students who may not have been identified as having a disability or who may not be identified as being an English learner. So really thinking about all students, more of a universal design approach, and their needs for interacting with the assessment. There are two major regular assessment consortia of states. And there are other consortia. We can talk about that later if that would be helpful. They really have thought about this paradigm shift. And so we are seeing a range of accessibility features when we look at what those consortia are doing and when we look at what other states not in consortia are doing.
And then, so basically, the point is that all assessments now are trying in some way or another, to a greater extent or a lesser extent, to incorporate that universal design approach as they develop and refine their assessments. So with that, I'm gonna turn it over to Laurene, who will talk a little bit more about that new paradigm.

- [Voiceover] Thank you, Martha, and good afternoon everyone. The states in consortia were working on making accessibility and accommodations available to a wider group of students. One of the things that really changed, I think, was the language that we use to talk about accessibility and accommodations. And you'll see as I walk you through the frameworks of some of the different assessment consortia in the next two slides that the language doesn't always completely align across the various consortia. And so here on this slide where we talk about the new paradigm, we sort of use language that cuts across all of the various consortia and their accessibility work. So universal features are sort of a first tier. Often I think the frameworks you'll see sort of mirror a pyramid-style approach, and the universal features are really the base or the foundation of the pyramid of accessibility. So these are supports that are available to all students. Usually they're part of the technology platform. Sometimes they can be a locally provided accessibility feature, but they're available just sort of as part of the way that the assessment is presented to the student. And you'll see that there's an option of turning these features off. So these kinds of things might be like a digital notepad or some zooming features or highlighting tools that really are available to all students in the classroom, and then presented through a technology platform as part of the assessment. Designated features are sort of the next level up on the pyramid. These are accessibility supports that are also available for all students. But these accessibility supports need to be planned for in advance, and so with that additional planning comes some decision making on the part of an adult who knows the child, or a team of adults. So this doesn't necessarily require an IEP team decision or a 504 team decision. It can be simply an educator in the school who knows the student.
But there does need to be some decision making in advance for this level. The things that are included in this level do vary a bit from consortium to consortium, but I think one example is testing in a separate setting. So you can see that that does require advance planning, but it makes sense that it would be available to all students. Then the last tier is accommodations. So this is the top of the pyramid. And these are accessibility supports that are only available to students with disabilities and, in some cases, English learners. So these are more restricted accessibility tools. In most cases they are limited to students with disabilities, but you will see that in one of the frameworks of the consortia, English learners are also able to get accommodations. So again, I mentioned that there's some variability in the language, and you'll definitely see that come through in the next slides. So we can turn I think to the next slide and look at the Smarter Balanced framework. So here you see the three tiers of Smarter Balanced. With the universal tools, there are both embedded and non-embedded supports. And again, those non-embedded supports are locally provided. So one example of the difference between embedded and non-embedded is you might have a digital notepad that a student can use on the computer to take notes, but you can also have the non-embedded version, which is scratch paper. And then there's also the next tier of designated supports. And again, there are both non-embedded and embedded features. And then finally there are accommodations. And for Smarter Balanced and their framework, accommodations are only available to students who have an IEP or a 504 plan. So that's a little bit about what the Smarter Balanced framework looks like. And then we can turn and look at the PARCC framework. Here you can see a very similar tiered approach, although instead of using squares we've used circles.
These three features are at the base, the features that are available for all students. Then we have the blue level in the middle, accessibility features that are identified in advance. And then at the top we have accommodations. And in the case of PARCC, this does include those tools that are available for students who have IEPs and 504 plans, but also English learners, and of course, English learners who have disabilities. So among these two general education consortia, or assessment consortia there are some differences in, not only in the language, but also who can receive the accommodations at the top. And then we can turn to one more example that we have here for you. This is the ELPA21 framework on the next slide. And this framework, so this is for an English language proficiency assessment. And so here this assessment is only for English learners to begin with. The bottom tier, the universal features, are available to all students. And again, there's listed both embedded and non-embedded features. Then there's designated features. So again, you notice a little difference in the language. But these are also available to all students identified in advance. And then the top tier of accommodations are available only with an IEP or 504 plan. One other difference on this particular framework is you see at the bottom there's also a listing of administrative considerations. And so these are also some, I guess features or considerations that sort of didn't warrant the level of being an accessibility feature. For example, familiar examiner is listed in there. And in the past those kinds of things were sometimes listed as accommodations, but really the ELPA21 consortium felt very strongly that a familiar examiner is more of an administrative consideration and something that is a good testing practice for all students and doesn't need to rise to the level of even being a universal feature. So I think that's some differences across some of the various consortia. 
And now we'll look a little bit more at some of the processes related to documentation. I think one of the big issues when we think about documentation of accessibility features and accommodations for students now is that we've moved in this paradigm shift from really having clear processes in place for our students who are students with disabilities. So the IEP or 504 plan has space in it for assessment and instructional accessibility and accommodations to be included. But for those students who are English learners there may or may not be an English learner team and so making a decision for English learners I think is a challenge. And then in addition, we now also have the designated tier that allows for accessibility for really any student. And so having some mechanisms for both, having a process as well as a place to document that information is really critical. So here's two examples of approaches that are taken in terms of documentation. So, in one instance there's the Personal Needs Profile. And this is really a digital version of the accessibility features that are formatted so that it can be uploaded into the testing engine and also can be included with other student information. So that's PARCC's definition of the Personal Needs Profile, or PNP. And then Smarter Balanced has a similar tool, which I think is really a mouthful. It's the ISAAP, or Individualized Student Assessment Accessibility Profile. And if you're interested in either of these tools, you can Google them under PARCC and Smarter Balanced. The ISAAP tool also takes a needs-based approach to helping educators make determinations for the accessibility tools that a student may need to use, and also creates a form that can be included with student information and uploaded into the testing engine. So these are two ways that individual educators can document the needs really of all students, including students who have disabilities. 
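To make the idea of a needs profile concrete, here is a minimal sketch of how a per-student record of accessibility supports and their tiers might be represented. This is an illustration only, not the actual PARCC PNP or Smarter Balanced ISAAP format; all the field names and tier labels below are assumptions for the example.

```python
# Illustrative sketch of a personal-needs-profile record: each selected
# accessibility support is paired with the tier under which it was chosen,
# so a testing engine could enable features per student.
from dataclasses import dataclass, field

# Tier names follow the cross-consortia language used in the webinar.
TIERS = ("universal", "designated", "accommodation")


@dataclass
class NeedsProfile:
    student_id: str
    # feature name -> tier under which it was selected
    features: dict = field(default_factory=dict)

    def add_feature(self, name: str, tier: str) -> None:
        if tier not in TIERS:
            raise ValueError(f"unknown tier: {tier}")
        self.features[name] = tier

    def requires_advance_planning(self) -> bool:
        # Designated features and accommodations must be selected before
        # test day by an adult (or team) who knows the student.
        return any(t != "universal" for t in self.features.values())


profile = NeedsProfile("S123")
profile.add_feature("digital notepad", "universal")
profile.add_feature("separate setting", "designated")
```

A record like this, serialized and uploaded alongside other student information, captures the documentation role the PNP and ISAAP play: a single place where decisions for any student, not only those with IEPs or 504 plans, are recorded in advance.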
So now I'm going to turn things over to Sheryl, who's going to tell you more about some of the lessons learned that we encountered related to accessibility and accommodations.

- [Voiceover] Thank you, Laurene. What I want to talk a little bit about is an analysis, a study we did following the first administrations, a year ago, of some of these technology-based assessments that the consortia were administering. The consortia had some surveys of teachers following those administrations in 2015. And they asked us, and we did it as part of our work with a community of practice with some other technical assistance providers that we work with, to kind of look at the data, what the findings of those surveys were, about what kids with disabilities found difficult, what they really struggled with with these new technology-based assessments. And coming out of that work, NCEO combined efforts with the National Center on Systemic Improvement. And we did two publications. They are both really short, user-friendly publications. One is Lessons Learned About Instruction from the Inclusion of Students with Disabilities in College and Career Ready Assessments, and the other is Lessons Learned About Assessment. These publications are aimed at technical assistance providers and states. Kind of what are the implications for assessment and instruction, what are some of the technical assistance needs, professional development needs. But there are charts there that anyone would find very useful in thinking through these issues. Let me tell you a little bit about what we found about the lessons learned for instruction. When we looked at the data from those surveys, we identified four major challenges. The first challenge for instruction was reading. The students struggled to read the extended passages that were on those assessments in the time that was available. Also, sometimes the students were not familiar with the types of authentic text that they had to read as part of these assessments, or with the vocabulary. Students also sometimes had difficulty understanding the assessment questions.
You know, there were new types of questions; they may not have ever experienced some of those types of questions that Martha previously described prior to taking the test. And they also sometimes struggled with constructed responses, longer responses that were required. You know, having to extract supporting evidence from text as well as from some videos. So we came up with some implications there related to instruction, and really just a real need for kids to have the opportunity to read and understand multi-paragraph, authentic text. Giving kids opportunities to really increase that reading stamina, as well as kids needing some practice at reading text online; that's really a different experience than what it might be in some classroom situations. The second major challenge related to instruction was writing challenges. And similarly to the reading, students often had not had the opportunity to practice writing extended responses, and they were not used to composing online responses and sometimes lacked the skills in keyboarding, scrolling, just the things that they needed to do to successfully complete these assessments. You know the implications there. Well, give 'em some practice at writing online and the support needed during instruction to help develop the keyboarding skills and the scrolling skills. Also, just in general, more opportunities for extended writing. The third challenge related to instruction was that students often struggled with justifying their answers. And that's very different from the old multiple choice tests, where it just needed the right answer. It didn't really matter whether or not the student knew why they selected it; getting it right was all that mattered. And so figuring out, understanding how to justify answers was a challenge. This was a particular challenge with some of the math problems.
And similarly, students need those opportunities to learn how to identify relevant evidence from text to support their conclusions. And they need that practice in drawing those inferences from the textual evidence as well as from video evidence. Fourth, the last of the major challenges related to instruction that we identified was that students struggled, for some of the constructed responses, the essay, to get the research and the essay done within the time period allowed, within the one day that they were allowed to have to complete the assessment. The kids sometimes didn't have those basic research skills; that had been something they received little instruction or opportunity to practice during the school year. And sometimes they didn't really know how to organize the information on a topic or how to integrate information from more than one source. Again, there's a real need for teachers to think about how to teach students what is important to answer the question, to help them identify keywords, phrases, summarization skills, and how to organize information, those note-taking skills, how to cite things, and just how to get it all pulled together in a timely manner.

- [Voiceover] Sheryl? Sheryl, this is Martha. Before you go on, John in Maine asked about whether the Lessons Learned were only for students with disabilities. I wonder if you could just take a couple seconds--

- [Voiceover] The surveys that we looked at the data for were specific to students with disabilities. So they were sent out to teachers of kids with disabilities. Our very first reaction, which is probably similar to what John was thinking, was that, you know, many of these challenges, all kids are facing exactly the same challenges, exactly the same issues. It doesn't really matter whether or not the kid has an IEP, an identified disability. So this analysis, the data that was collected here, was specific to students with disabilities, but many of, I want to say most of, these suggestions for technical assistance providers and for teachers related to instruction and assessment would help all kids, all struggling learners. Now I'm going to shift to the second publication we did, which was the Lessons Learned About Assessment. In this publication, or this analysis, we identified three major challenges. The first was unfamiliarity with some of the item types. The students, and in some cases the teachers, had never seen some of the item types. The kids had not had opportunities to practice some of these item types. And as previously mentioned, the essay, the constructed responses, were particularly challenging. Giving students that opportunity, whether it be posting high-quality examples online of the different items and what an appropriate response might look like, or getting information out to teachers as to how to familiarize students, could be very helpful, but the bottom line is kids just need more familiarity with some of these item types. The second major challenge that we identified related to assessment was that there were a lot of accessibility challenges. When students have difficulty accessing an assessment, it's just hard. And teachers found it very hard to make good decisions. As Laurene previously discussed, there are new frameworks there.
Things that were once accommodations now may be universally available, or any student may have them as long as they are selected in advance, but there was just a lot of confusion. Teachers felt like they did not know how to confidently select and then really implement some of these accessibility features and accommodations for their students. And then sometimes things just didn't go as smoothly on test day. There were logistical issues. Students for one reason or another did not receive accessibility features and accommodations which they should have had. So there were a lot of implications there for technical assistance providers for improving assessment. And most of these really focused on getting the professional development out there, the information out there to educators and to parents, so that better decisions can be made and implemented, both for instruction and assessment, when it comes to accessibility features and accommodations. There are also some online tools that are available, like online calculators and math tools, online rulers, whatnot. They're cool, but all kids need to know how to use them; some students with disabilities, you know, if they haven't had previous experience with some of these tools, may find them particularly challenging. And then just at a school district level, there's a real need for ensuring that there are sound logistical plans so that all students get the accessibility and accommodations features they need, and to track it and then follow through so kids are getting what they need. The third challenge that we identified was that sometimes the accessibility features and accommodations weren't available or they just plain didn't work right, didn't work as they were intended. As well, as I previously talked a little bit about, there were issues with the embedded technology. Some of the issues were related to compatibility issues for assistive technology.
There were also some challenges last year related to the assessments; some of the features just plain didn't work quite right. Hopefully, a year into it now, more of those technology issues related to accessibility have been resolved, though anecdotally I know there are still some problems out there. But states are aware of those issues, you know, working with their vendors to get those things addressed and fixed, developing processes so that districts and teachers can let them know when there are issues related to some of the accessibility features and accommodations, as well as really helping sort out any issues related to the assistive technology, both compatibility issues and just getting that all figured out. I guess those are the major things that we identified in these two analyses. But you know, there are lots of lessons out there, lots of things that we as NCEO really have noticed just in general. And one of the challenges that continues is that sometimes schools, teachers, educators just want to make decisions about groups of students, like, well, all kids with learning disabilities need this accommodation. But kids vary so much. They have very different characteristics and needs. Disability categories don't indicate which accommodations a student needs. They might provide some insight, but they definitely don't indicate it; as you all know, students are very diverse. So that's been a challenge. You know, it continues to be a challenge, but it's so important that accommodation decision making, accessibility decision making, be made at the individual student level. And then--

- [Voiceover] Laurene? Laurene, let me just jump in here. A lot of good questions have come up. Ruth Ziolkowski from Don Johnston has asked about this issue of security blocking most AT. She was wondering if you've heard anything about that.

- [Voiceover] So I'm gonna jump in and say, we've got one more slide and yes.

- [Voiceover] Okay.

- [Voiceover] We don't have exact data, but we definitely heard about that--

- [Voiceover] That's what we figured.

- [Voiceover] Yeah, and next, in our next webinar, you'll want to be there because we'll talk more in depth about the issues. Sheryl, do you want to make your last point, and then I'll jump in?

- [Voiceover] I don't even know, the last point there says that adults are often more challenged by technology than students. I'm not sure whether that's always true, but it's definitely, you know, that we are, both students and adults, sometimes can find technology challenging. And so sorting through those issues is just so important. I'll turn it back to Martha, then.

- [Voiceover] Thank you. So if you go to the next slide, you've been hearing lots about challenges. We thought it was important to talk about those that have implications for instruction, as well as assessment. But there were lots of other technology challenges. On the other hand, we have heard lots of positives. We've heard about students being much more engaged, really liking the interaction that they have with the assessments, persevering more when it's on a technology device than when it's paper and pencil. So, lots and lots of positives that we haven't really highlighted today. But there are still more challenges, and this slide is kind of a segue into the second webinar, where we'll get more in depth into some of those technology issues that are out there, and some potential solutions or things that states are working on right now. So, you know, this scrolling issue. It was huge, a huge issue for students and sometimes their teachers, because of longer passages. Just getting everything where a student could see it without scrolling was not happening. Then there's the issue that things didn't always work as intended. That was sometimes a general issue, but it particularly happened with accessibility features and accommodations, and probably that's partly because this is a new venture, and things will improve over time. But it was definitely an issue in this past year. And then one that I'll talk about more on the next webinar is the inability to actually obtain data on which accessibility features and accommodations were used by students. It's something we've wanted for a long time, and it was not realized in this past year with the new technology-based assessments. With that, I promised we'd leave time for questions and discussion. So we're turning it over to you all. We can do it both via chat, or, I think, Tracy, you said you would open the lines.

- [Voiceover] Absolutely. And thank you so much to all of you. This has been extremely informative. So now it's your turn, those of you who are on the webinar with us. We'd love to hear from you. What are some of the challenges that you're hearing about or have personally experienced in your efforts? Well, I'll throw one thing out, and it's a bit of a hot potato, Martha and team. It has to do with the fact that, as we know, many states are moving away from the PARCC and Smarter Balanced consortia. And so it raises the question, what is filling the void? And what are the implications for students with special needs and disabilities?

- [Voiceover] So yes, we have observed that happening in large part, but maybe not totally, because of some political issues that are out there. But what I do, and I'm an optimist, so I'll have to say that right off the bat. What I do think is happening is that states who participated in those consortia, and even states who did not participate, but watched what was going on, have learned a lot about accessibility and accommodations. And about providing assessments that are better designed with all students in mind. So I believe that we will see transformations in our assessments even when they are not part of a consortium. We'll see whether that in fact happens, but that's I think, you know the states have seen the advantages of going to a technology-based system and the potential for really opening up the universal design and the accessibility and the provision of accommodations. So, we'll see whether that pans out.

- [Voiceover] That's really helpful. Any other questions? I know it's a Friday. Looks like we've got a question from John from Maine, who's noting that this was the first time he's heard anything positive about those tests. Now, John! Now, John.

- [Voiceover] But that's a good point, because as you said, we kind of focused on what the challenges were, because we have such hopes for getting everything fixed and in better shape. I think, you know, we have learned a lot. I think there were many positives that people don't talk about, because they want to fix what didn't work right. I think that's part of what's going on here. Because we did hear some of those real positive things, even from teachers and school test coordinators who said, "Oh, what a relief not to have all those boxes, which we have to keep straight." But, you know, I think you're right, John, that we have focused on the negative.

- [Voiceover] And this is Laurene. I just wanted to add too, we know that often when we implement a new assessment that there are bumps in the road and challenges just in sort of, you know students are getting used to a new test. And the Common Core state standards are a much more rigorous set of standards. So, I think it's not surprising that there have been a lot of challenges, but I think over time students will become more familiar with the types of items on the assessment and teachers will realize that they need to teach their students strategies to address those items. And I think that I'm optimistic too, that over time we'll see test scores go up and the responses to the assessment improve.

- [Voiceover] So we're getting lots of comments and questions. I see one from Joy that says, "I remember reading somewhere that as a whole, all students who took the test online scored below those taking a paper-based test." We have to remember that the change to online testing occurred at the same time as states moved to more rigorous college and career ready standards. So I don't think we can attribute any lower performance that we saw to a test being online. Let's see, what else have we got here? Was there a concern about AT not even addressing some things like graphic organizers, talking word processors, word prediction? And you mentioned Smarter Balanced. I think, and this is from memory, that some of those were definitely addressed. Maybe decisions were not made. I thought PARCC, in fact, did allow word prediction, is my recollection. I could be wrong. But I believe the other challenge here was that the consortia were groups of states trying to agree on things. And so they are probably taking a more iterative process in coming to recognize some accommodations or approaches that some states have had in the past.

- [Voiceover] Can I add one more thing too--

- [Voiceover] Sure.

- [Voiceover] Since it's specifically related to Smarter Balanced? I know that Smarter Balanced does have a process for including new accessibility tools or accommodations. Their process is essentially that the inclusion of something new can be a state decision for one year only, and then those items go back to Smarter Balanced and their executive committee for a decision, and can then be added to the accessibility and accommodations guidelines for future years. So even though there are some things that perhaps we feel got overlooked, there is a process, at least for Smarter Balanced, for including new items.

- [Voiceover] And Ruth, thank you for confirming that word prediction was allowed but, but it had to be on a separate computer. And I'm sure that was for security reasons. But as you note, that presents new challenges 'cause then you have to be managing two computers. Yeah, we haven't got this all worked out yet. Karen commented that even though we've got more technology, it's not necessarily being used for testing. Still lots being done on paper form. That's probably true as well. It's going to be a process too, this movement. But I think we've made leaps, leaps in the past couple years. Due in part to the consortia, really. Really pushing this. Yeah, two computers is very difficult to manage for using the word prediction. Yeah, and there should be some way to get around that. All our tech people, get on it. We've got more coming in.

- [Voiceover] Yes, we do.

- [Voiceover] Yeah, it's great. Like, Smarter Balanced doesn't allow word prediction as of yet. So that's one example. If you are in the consortium states, or even if you're in your own state and it doesn't allow something, you need to be talking to either your consortium leads or your state department leads about the need for certain kinds of accommodations that may or may not be allowed yet. You know, the more they hear about it, the more they're open to listening and thinking about whether that, in fact, should be provided, as long as it doesn't compromise the construct that the assessment is trying to get at.

- [Voiceover] I see the comments about how we assume students are digital natives, but yet they still have problems scrolling. There definitely are issues with kids having difficulty with the scrolling, but it also makes me think about something that I didn't mention before, which is that students with some physical disabilities can find scrolling and some of the other ways of navigating technology-based assessments to be particularly challenging. And so technology-based assessments can create a whole new set of challenges for some students. Similarly, there are some new challenges related to the assessment of deaf and hard of hearing students and blind and visually impaired students that are introduced with the new tests.

- [Voiceover] So it looks like Jackie Moraes is typing, asking a question.

- [Voiceover] I just have to say, this is great. We so appreciate all the comments and questions.

- [Voiceover] Well, while Jackie is posting her question, I was just wondering, Martha, whether it's your sense that the consortia are more willing to listen to the needs of the field as the number of states gets smaller, or do you think that they're more entrenched and less flexible about these things? Do you have a feel for that?

- [Voiceover] Wow, I don't know. Oh goodness, I don't know that I have a feel for that. You know, I think they've been both, or all of the consortia have been quite open to listening. One of the challenges is that they need to get agreement across all of the states. And as you probably recognize, the states are in very different places on their thinking about accessibility and accommodations, still. So, they've made good progress I think. And I have the sense that they're willing to listen. Yet they have to take it through a process to get it to actually happen.

- [Voiceover] And this is Laurene, I would just add too, I think one of the things we haven't really talked much about, although I alluded to when I was talking about some of the language, is that vendors play a role in this too. And so some of these features are really easy for vendors to develop in their platforms and others are much more complicated and challenging. And so I think that we wanna also keep in mind that some of the issues related to accessibility have to do with what vendors can provide.

- [Voiceover] That's a really important issue.

- [Voiceover] So Jim, that's a great comment. "Salvia and Ysseldyke have cautioned for years that interdependent questions should be avoided, and it is generally bad practice to make the selection of a correct response dependent upon earlier items." And you're right. I'd say both of the consortia, well, maybe all of the consortia, are doing this to some extent, driven by what's in those college and career ready standards. Yeah, dependent answers, you're right. There are definitely issues there.

- [Voiceover] Well, thanks also to Jackie. She has posted information on the next session with Martha and her team. That is set for May 9th at 3:30pm, Eastern time. And Martha and her team will be really delving into these issues, looking at the implications of the new legislation and building on those lessons learned. So we encourage you to join us. It looks like Jackie's typing, and while we're waiting for Jackie to finish, we would love to have you complete the survey at the end. As we know, it's really so important for us to get your feedback. If there are any additional issues in particular that were raised today, or additional issues that you would like to hear us talk about, please feel free to note that in the survey or get back to us. And Devon has just posted the Survey Monkey link. It will just take you a minute or so to complete, so we encourage you to do that.

- [Voiceover] And thank you, everybody, from us here at NCEO. This is a great conversation. I know we didn't get to all the questions, so bring them up next time. We'll have lots of opportunity to dig even more into what's going on.

- [Voiceover] Great. And we send our deep appreciation, Martha, to you and Sheryl and Laurene, and look forward to connecting up again online on May 9th at 3:30 Eastern time.

- [Voiceover] Great, thank you.

- [Voiceover] Everyone have a good weekend. Stay dry and take care, thank you.

- [Voiceover] Bye bye.