Toys for Tots is giving away toys at the times and locations below. Please spread the word as best you can for children who may need it. Thank you.
Friday at the VFW from 4:00 to 7:00
Saturday at the Wassaic fire house from 3:00 to 7:00
Sunday at the Wassaic fire house from 11:00 to 4:00
Monday at the Wassaic fire house from 11:00 to 3:00
Tuesday at the Legion Hall Wingdale 11:00 to 4:00
QUESTIONS & ANSWERS REGARDING 2013 GRADES 3-8 NY STATE TEST RESULTS & RELATED COMMON CORE CONCERNS
Many of these questions and answers have been taken from the South Orangetown Central School District with permission and thanks. Parts have been edited and added in order to tailor them for the Dover Union Free School District. Thank you to Dr. Ken Mitchell.
Following the August 8, 2013, letter on the 2013 NYS grades 3-8 assessments, the district has received questions about the test results. The following set of Questions & Answers serves as our initial effort to respond to those questions and to related matters that have been considered after further deliberation:
According to the New York State Education Department in 2013, the state-wide percentage of students reaching proficiency on the new Common Core 3-8 tests averaged only 31% in math and 31% in ELA. Dover Union Free School District students performed as follows:
- 2013 DUFSD 3-8 PROFICIENCY LEVELS AVERAGED 14% IN MATH AND 23% IN ELA.
- 2012 DUFSD 3-8 PROFICIENCY LEVELS AVERAGED 55% IN MATH AND 46% IN ELA.
There is great controversy about the increased volume of testing and the accelerated implementation of the state’s new curriculum and accountability systems. The following provides some commentary about why the scores dropped:
According to Commissioner of Education John King, “These new proficiency rates do not mean that teachers are teaching less or that students are learning less than last year.” So what happened?
The Commissioner and other state officials have stated that the change in scores is related to harder tests that were based on the Common Core, a new set of standards that purportedly will better prepare students for college and careers. This is a theory. There is no evidence to support that statement. There have been no field tests or research to make that determination.
According to many school leaders and teachers, other factors may have contributed to poor test results:
- The tests were administered before districts had all of the information and necessary training to teach the new Common Core-based curriculum;
- Students were asked to answer very challenging questions about content they had never learned and demonstrate skills that were never taught;
- The tests were extremely long – many students were unable to finish;
- The typical elementary school student sat for as long as 10 hours (7 hours of actual testing time) for just the ELA and Math assessments;
- The tests included “field test” questions that did not count toward students’ scores (25-30% of all test items). These added to the length of the test. Many were placed at the beginning or middle of the assessment, adding to the “test fatigue” that may have contributed to students’ failure to complete the assessment or to perform well later in the test;
- Assessments for students with disabilities needing extended time lasted as long as three hours in some cases;
- Many teachers have criticized questions as being poorly worded or ambiguous;
- Some districts that had used Pearson (the test vendor) materials in their classrooms discovered that reading passages from their textbooks were also used on the test, putting districts that did not use Pearson materials at a disadvantage;
- Some of the material is not developmentally appropriate for students who have not reached certain levels of cognitive maturation*. The developmental research is being ignored by many policymakers;
(*The Epstein/Piaget Child Growth and Development Chart shows the cognitive potential for children as they age. Depending on their natural developmental process – not aptitude or potential – some children are simply not ready for certain cognitive tasks. This is analogous to how some children walk sooner than others or grow faster than others. Just as we cannot “force” them to walk sooner or grow faster, we cannot force them to “think more abstractly” sooner than they are ready. Nature takes its course, and this varies for every child. Children should not be held accountable for what is beyond their control.) The State appears instead to subscribe to Vygotsky's theory of the Zone of Proximal Development.
There is debate in the field about the appropriateness of some of the new material that is being introduced in both math and English Language Arts. Because of such concerns many professional organizations, including the International Reading Association and the American Association of School Administrators, have called for a delay in implementation until the new standards on which the curriculum and assessments will be based can be field-tested and validated.
While most professionals welcome curriculum and instruction that is challenging, there are concerns that some of the reading materials and mathematical concepts are beyond the cognitive capabilities of children. This is based on the work of cognitive development researchers, not the opinion of bureaucrats, politicians, and education publishers. There has been no field-testing of the standards to ensure that they are developmentally appropriate.
Educational leaders – superintendents, principals, teachers, and parent groups – have been requesting an extension of time to implement the new curriculum and provide teacher training. However, warnings about such an illogical sequence were ignored and the tests were administered before the curriculum was developed or materials were provided by the state to the districts.
State officials have indicated that they wanted to establish a baseline of what students did not know. They have assured school officials and the public that we will see an improvement in future years.
Units and models were not consistently provided by the State Education Department until mid-year. Resources and training also took time to implement.
Even though districts received some Race to the Top funding, the amount that is required to pay for the mandated reforms is well short of what the new laws require. School officials have testified before Senate hearings and the Governor’s Reform and Mandate Relief Commissions. They have spoken to members of the Board of Regents and State Education Department officials in Albany.
Concerns about funding were ignored. Districts were informed that this was a new priority and that they needed to find local funding to pay for changes. This required many districts to cut programs and staffing or increase class sizes, all of which are counterproductive to providing a comprehensive education that addresses the needs of the “whole child”.
There are many who believe that the tests were designed to achieve the predicted outcomes and thereby create a perception of poor performance. The results appear to “reveal” that our top students are not as good as we thought they were, and that our weakest students are worse than we thought.
New York’s Commissioner of Education, John King, Jr., wrote to districts that “Scores are expected to be significantly lower than the 2011-2012 scores…effectively creating a new baseline measurement of student learning.”
The argument to make reforms to all schools, not just those that have typically underperformed, is strengthened when tests are made impossibly difficult for students.
Common Core changes to the Regents exams will go into effect in the spring of 2014 in the subjects of algebra and English.
NYS Education law requires that 8th grade students taking advanced high school math must also take the Math 8 assessment that all other students are taking, even though the curriculum is not aligned.
When approximately 70% of students in New York State fail an exam, perhaps it is the exam and not the students or the instruction that is the problem.
The Commissioner of Education ultimately determines the cut scores. These were set particularly high and at a time when a new form of test was given and on content that had not been provided to teachers and students before the tests. There is no scientific basis for the determination of the scores. Instead, there is a committee process that includes test company vendors, some educators, and data technicians and results in recommendations that are subjective and influenced by ideology.
These types of changes have been made before but not as radically. In 2010 then Commissioner of Education David Steiner raised the minimum passing scores (so-called “cut” scores) for proficiency, effectively making year-to-year comparisons difficult if not impossible. In South Orangetown, we saw a drop in test scores from an average of 90% of our students performing at or above proficiency to an average of 75% in ELA and math respectively.
Commissioner Steiner stated, "New, higher cut scores have resulted in fewer students scoring at a ‘Proficient’ level. While that is sobering news, it should cause all of us – the State Education Department, schools, administrators, teachers, and parents – to work ever more effectively together to ensure that all children in New York State get the knowledge and skills they need."
At that time John King, Jr., Senior Deputy Commissioner for P-12 Education said, “These newly defined cut scores do not mean that students who were previously scoring at the Proficient standard and are now labeled Basic have learned less. Rather, the lower numbers of students meeting the Proficient standard reflects that we are setting the bar higher and we expect students, teachers, and parents to reach even higher to achieve these new targets."
(Following the release of the 2013 set of scores, Commissioner King stated, “These proficiency scores do not reflect a drop in performance, but rather a raising of standards to reflect college- and career-readiness in the 21st century.”)
(When the passing rates for test scores are regularly adjusted and then vary so widely over a period of seven years, it is difficult for students, parents, teachers, or administrators to have confidence in the meaning, the validity, or the reliability of the information. This is further compounded by the many voices of practitioners and testing experts who declaim the current testing agenda as being flawed.)
In the past teachers were allowed to review tests to make instructional and curricular adjustments. The tests are now collected immediately after administration and destroyed after being examined at NYSED.
As indicated in previous correspondence, we believe that the data does not accurately reflect student ability.
Instead, during the course of the school year, we collect information about student performance through ongoing and more personalized assessment processes, including the use of assessment tools that are valid and reliable. This information is more precise and provides us with what we need to know about addressing student needs. Data that we receive from a State test that is administered in the spring and whose scores arrive in mid-August does nothing to help us improve student learning.
Since most of the assessment items were based on content and skills that had not been taught to our students, Dover teaching staff will be very selective in how we are using any of the state-provided data to make individual student decisions about placement or special support. Primarily, we will use the data that we already have collected.
The significant decrease in this year’s assessment results, experienced across the Hudson Valley and throughout New York State, IS NOT ATTRIBUTABLE TO A DECLINE IN STUDENT PERFORMANCE, but instead reflects the fact that the assessments, which were extremely difficult, were implemented before students could be effectively instructed in the new, more rigorous Common Core Learning Standards curriculum. Because the NYSED established new cut scores, which determined the new proficiency levels, it is impossible to accurately compare student progress against prior years’ assessment results.
We have drafted a new academic intervention service plan to better address our students' needs. I am sure it will change and adjust as the year progresses.
Academic Intervention Service Plan
The district anticipates receiving the individual N.Y. State assessment reports in September, at which time they will be distributed.
Parents, please provide a balanced response to children that explains that these are new assessments and that the school is adjusting what it teaches to respond to areas where students did poorly. Please acknowledge that this happened in every school district in New York, where the average percentage of students reaching proficiency was 31%. Remind them that with continued hard work, they will be fine.
We believe that in spite of the rhetoric that this will be “good” for students, there is a potential that this could be damaging. We are concerned that students may interpret the test results as a reflection of their abilities and potential, rather than the results of a controversial and experimental process to reform education. We will be speaking to teachers and our school psychologists about how to best respond to student questions related to a perception of “failure”.
We have asked the New York State Education Department to provide us guidance on how to speak to children about the results. We have not received a response.
It is extremely important that parents and teachers emphasize to children that these test results have nothing to do with their abilities or what they have learned. The Commissioner of Education has clearly stated this point. They should be assured that they just happened to be in school when a new test was being used as part of an experiment.
I have asked our psychologists to write up helpful suggestions for talking to your children about the results. Please follow the link below to view these ideas and suggestions. We will also share them with the PTSA.
Helping Your Child Understand Tests and Test Results
Many parents have raised concerns that test results and other student data will be shared or used by businesses and other government agencies. These concerns have been presented to the Commissioner of Education and a request for a statement from the New York State Education Department about student data security and access has been made.
While the district is extremely dissatisfied about the validity, process, timing, and support for these new tests, as well as the number, length, and design of the assessments, until this current course is corrected, we must be practical.
The district’s principals and central office team have been and will continue to ensure that staff are prepared to deliver the necessary instruction to implement the Common Core so that students will be prepared for the next round of assessments in the spring of 2014. During the summer of 2013 over 30 teachers worked on Common Core curriculum units.
In the 2014-2015 school year, the PARCC on-line assessments will go into effect. These are part of a national consortium of states that have adopted the Common Core and will use on-line testing. Currently the district is analyzing the cost for the testing hardware and related equipment. This will mean a new battery of assessments and preparation for them. As the district receives new information, we will provide updates to the community.
These other tests will continue to be required under state education law. They will include a pre-test and a post-test. State education law requires formal standardized testing that is linked to teacher performance at both the local and state levels.
The tagline of “college and career readiness” has become the mantra for the national reform movement. It is claimed that the new Common Core standards will prepare kindergarteners through high school seniors for college and careers.
There is no evidence to support this claim. The standards are not based on any scientifically-based research study. They were selected via a “committee” and are based on the opinions of those on the committee, not through a thorough analysis and testing to determine that they will produce the results that have been promised.
In fact, there is no research that shows that standardized testing has improved education. There may be evidence to the contrary. One study shows that in sixteen of eighteen states with high-stakes assessments, the dropout rates increased.
State officials have cited Regents and AP exams or ACT high school exams as indicators of “college and career readiness”. Previously in this Q&A, we cited the contradictory data of students not reaching proficiency on the Common Core-based assessments while achieving the mastery level on a high school Regents exam as an eighth grader.
One of the arguments for the Common Core and the new testing system is that these changes will make the United States more competitive. This is not the first time that such claims have been made. When Sputnik was launched by the Soviet Union in 1957, there was a similar call for reform; however, the United States had already had a satellite ready to be launched. It was not a matter of technological advancement, but rather, a matter of timing. The call for increased funding for math and science education and research came from those who wanted more funding for education and for research.
Similar arguments were made in the 1980’s and again in the 1990’s. Many of these decisions are made for political and not educational reasons.
That said, there are some curricular shifts in the Common Core that represent an improvement. These curricular changes could have been made without wholesale changes and significant increase of the testing of students.
While the United States has never performed well on international assessments going back over 60 years, our country has led in almost every leading economic indicator within the same time period*. This continues today.
The United States is known for its innovation. One piece of evidence is the annual number of approved patents for innovation. In 2011, the United States was granted almost 110,000 patents. All other nations combined were granted almost 116,000. The second leading patent generator was Japan with 46,000. Such creativity and innovation do not come without a sound and productive public education system.
The U.S. gross domestic product (GDP) is three times that of China and five times that of Germany, and just slightly below that of the collective nations of the European Union, even though many of these countries have students who perform better on tests.
There is not nor has there been any correlation between performance on international scholastic tests and economic success. Perhaps one of the reasons that our economy has been so successful has been that schools, under local control, have adapted to the needs of society based on the expertise of local educators and community members.
Some education experts see the comprehensive curriculum, not one narrowly focused on test prep, as a reason for economic success. Many are concerned that over-standardization in order to be competitive on international tests may in fact be counterproductive. There are some who believe that this may not only be an expensive experiment in public education reform but a dangerous one.
(*The United States has one of the highest child poverty rates in the developed world, with around 22% of children living below the poverty level. New York State’s childhood poverty level is comparable to that of the nation; according to 2010 statistics, New York ranked 28th of 50 states in this category. It is also no surprise that in states with the lowest child poverty levels (e.g., Connecticut, Massachusetts, and Maryland), performance on exams and high school graduation rates are consistently among the highest.)