
How Can FSA Be Validated When Based on Utah SAGE Test That Has No Validity Data Either?

August 2015


Orlando Sentinel education reporter Leslie Postal revealed evidence that Florida district officials expect few clear answers from the $600,000 taxpayer-funded validity study of the Florida Standards Assessment (FSA).

Postal described what the hired company was evaluating as follows:

The monthly report filed July 31 detailed meetings the testing company officials have held with staff from the Florida Department of Education and with school district administrators, who've been asked for their feedback on the 2015 administration of the FSA.

It noted the companies have collected more than 650 documents related to the FSA and looked at more than 200 test items, studying whether they met "best practices in assessment design," used appropriate language for the grade level, targeted "intended depth of knowledge" and were free of bias.

The firms also have looked at how FSA questions, leased from Utah's state test, were field tested.

The only problem is that abundant evidence has already been submitted to the Florida Senate that no validation studies have been done of the Utah tests. That evidence includes a $100,000 offer from the psychology practice of Dr. Gary Thompson for proof of validation from the Utah State Office of Education. It also includes the following letter sent from Utah district board members to Senator David Simmons (R-Altamonte Springs), who worked with Senator Alan Hays (R-Umatilla) to craft the validity amendment requiring this study after it became obvious that neither the Utah SAGE test nor the FSA was validated prior to administration to Florida students:

[Letter from Utah district board members to Senator Simmons]
The company is not giving any hints about the actual validity of the test, only what it is doing to try to assess it. This led an Orange County testing official to give the most important quote of the article at the end when he admitted that there would likely be little concrete evidence of validity:

But company officials have given no hint about whether they think the FSA is a well-put together and valid exam or if they've spotted problems.

"Those monthly reports haven't really said anything about what they've been finding but what they've been doing," said Brandon McKelvey, the Orange County school district's senior director of accountability, at a recent school board meeting.

McKelvey said the final report should provide some answers but also likely will be more technical than many educators and parents (and reporters) would like.

After all, it's an "independent verification of the psychometric validity" of the FSA.

"The first page will not have in giant red letters valid or not valid," he said.  (Emphasis added).

This kind of statement is being used to dampen expectations. It also mirrors the likely path that occurred with the FCAT. During the 2015 legislative session, when it was supposed to be providing the promised validity data for the FSA/Utah SAGE test, the FLDOE instead sent Senator Simmons a validity document from the end of 2014 regarding the long-used FCAT and EOC tests. That document showed that, even after more than a decade of use, there were still doubts about how valid those established tests were for the high-stakes purposes for which they were used:

Less strong is the empirical evidence for extrapolation and implication. This is due in part to the absence of criterion studies. Because an ideal criterion for the FCAT 2.0 or EOC assessments probably cannot be found, empirical evidence for the extrapolation argument may need to come from several studies showing convergent validity evidence. Further studies are also needed to verify some implication arguments. This is especially true for the inference that the state's accountability program is making a positive impact on student proficiency and school accountability without causing unintended negative consequences. (Emphasis added).

This is a very important admission that the FCAT, after all its years of use, should not have been used to make such high-stakes decisions because of a lack of validity evidence. Multiple pieces of evidence show that even a long-used test is not "making a positive impact on student proficiency." This evidence includes a quote from a letter signed by 500 university professors casting doubt on the efficacy of high-stakes testing, a chart showing that ACT scores have actually declined under the Bush high-stakes accountability regime, and two more charts showing that Florida's ranking on the NAEP in comparison to other states has leveled off or declined:

"The broad consensus among researchers is that this system [NCLB and RTTT] is at best ineffective and at worst counterproductive. The issues now being debated in Washington largely ignore these facts about the failure of test-based educational reform, and the proposals now on the table simply gild, rather than transform, a strategy with little or no promise...

...The ultimate question we should be asking isn't whether test scores are good measures of learning, whether growth modeling captures what we want it to, or even whether test scores are increasing; it is whether the overall impact of the reform approach can improve or is improving education. Boosting test scores can, as we have all learned, be accomplished in lots of different ways, some of which focus on real learning but many of which do not. An incremental increase in reading or math scores means almost nothing, particularly if children's engagement is decreased; if test-prep comes at a substantial cost to science, civics, and the arts; and if the focus of schooling as a whole shifts from learning to testing."
[Chart: ACT scores declining under the high-stakes accountability era]
Florida's well-documented rise in education performance relative to other states ended abruptly around 2009. Our analysis shows that from 1996 to 2009, Florida posted the 2nd-largest gains of any state on the NAEP exams, taking the state from 39th to 24th in the country. Since then, education performance has stagnated relative to other states, leaving Florida stuck at "about average" performance, 26th in 2013. (From Cota and Donalds, Memorandum: High-stakes testing and lost instructional time, 3/24/15, pages 4-5)

[Chart: Florida's NAEP ranking relative to other states, 1996-2013]
"...in 2013 the 4th grade cohort from 2009 took the NAEP as 8th graders and failed to sustain their position relative to other states. These students dropped from 10th when they were 4th graders to 33rd as 8th graders." (From Cota and Donalds, Memorandum: High-stakes testing and lost instructional time, 3/24/15, pages 6-7)

[Chart: Florida's 2009 4th-grade cohort, NAEP ranking in 4th grade vs. 8th grade]
Stating that it is a long process that will be adjusted depending on the results of the validity study, the FLDOE admitted yesterday that it is convening meetings with teachers on August 31st, one day before the validity study is due, to set the cut scores (which scores count for the various achievement levels of 1-5) for the tests; in a prompt and courteous email, however, the department stated that it will wait until the study is completed. We are working to find out whether the public will be able to attend the educator meetings, or only the rule development meetings held after the cut scores are established by the teachers and business people. Hopefully this is not another glaring example of how the department continues to skirt the truth (evidence of validity), flout the law (load testing), or deceptively rebrand the standards and proceed with the implementation of Common Core and its aligned assessments no matter what. The process will have to be followed closely. Stay tuned!

