To our readers
This week, we're taking you on a virtual journey to help tackle one of our community's most intractable problems. Now, and occasionally throughout the year, we'll report on urban school districts in the United States where educators have proved most successful in teaching poor, minority students. We hope our “Field Trip” helps the Omaha School Board, its new superintendent, Nancy Sebring, and all educators, policymakers and citizens who seek the best for disadvantaged children.
— Executive Editor Mike Reilly
Sunday: Urban schools playbook
Monday: Measurable goals
Tuesday: Data-driven instruction
Wednesday: Strong principals
Thursday: Effective teachers
Teachers around the country are honing their detective skills as they work to pinpoint why students struggle.
They regularly test whether kids are learning what state officials say they should know. Computers compile results, and teachers sift the data, analyze the root causes of wrong answers and customize their lessons as they reteach. Each student gets precisely the instruction he or she needs.
The approach is called data-driven instruction, and high-performing urban districts such as the Hillsborough County Public Schools in Tampa, Fla., are good at it.
The Omaha Public Schools trail high-performing urban districts in mining test data to improve instruction.
OPS spent $3.36 million in the past three years launching a computer system that gives teachers the capability for data-driven instruction. The CTB/McGraw-Hill Acuity system quickly provides detailed test-score data to help teachers assess whether students are grasping Nebraska's academic standards and diagnose where they need more help.
But some teachers have yet to embrace its capabilities. Others say they're not fully trained to use it. And once Omaha teachers diagnose where students need more help, some say, they still need support in developing alternative lessons and strategies.
Nebraska was the last state in the union to adopt statewide tests that measure whether kids are learning the skills and concepts in the state academic standards.
States such as Florida, North Carolina and Texas did so long ago and have spent years refining their use of data.
“The greatest impact is the ability to pinpoint,” said Anna Brown, a Hillsborough administrator who specializes in assessment.
“We call it root-cause analysis. Pinpoint the actual root cause of either low performance or a gap in learning, whatever it is that you're analyzing,” Brown said. “Once you can get to that root cause, then you can make a good, solid instructional plan that is individualized for that student or group of students to move forward.”
It's working for Hillsborough. On a key national test, the Nation's Report Card, Hillsborough's 2011 math and reading scores exceeded the average for Florida and the nation.
The district's poorest black, Hispanic and white fourth-graders outscored their Nebraska counterparts in math and reading. In eighth-grade math, Hillsborough's black and Hispanic students also topped their peers in Nebraska. Hillsborough's fourth-graders scored highest of all Florida districts on the 2011 state writing test.
The College Board recognized Hillsborough as a national leader for increasing both participation and scores in Advanced Placement exams.
Hillsborough and other high-achieving districts closely and systematically align their curriculum — what's taught in each grade and in what sequence — to the state standards. Frequent testing measures whether students are learning the standards.
Until the Nebraska State Accountability tests began rolling out in 2010, such seamless alignment of state standards, curriculum and testing was hit-and-miss for Nebraska districts.
Before then, OPS and other school districts created and administered their own tests to gauge whether students had learned the state standards, and each district decided what score a student needed to be deemed proficient. In some cases, districts had adopted their own standards as well.
States that have had standardized testing systems in place for a long time offer districts an advantage: everyone involved knows what students are expected to learn, what teachers teach and what the state tests.
In 2002, researchers in the state of Washington reviewed more than 20 studies and concluded that close alignment among curriculum, instruction and testing is an essential characteristic of high-performing schools.
“The match between what is taught and tested with the state standards is critical,” according to their report, “Nine Characteristics of High-Performing Schools.”
In a 2011 report on how low-income and minority students were faring in California schools, the Education Trust concluded that the best districts know how to use data. The trust graded the performance of the state's 146 largest unified districts, highlighting the high-poverty Val Verde, Sanger and Desert Sands districts for their “culture of data use to inform decision-making.”
“Data use at the classroom level is supported by investments in technology, which help deliver a constant stream of information on student achievement to teachers and principals,” the trust said.
Florida began state testing in the 1970s. The Florida Comprehensive Assessment Test — math, reading, writing and science assessments for grades 3 through 11 — began in 1998.
Hillsborough trains teachers to dig deep into data, Brown said, so they don't resort to the “spray and pray” approach: teaching everybody the same thing and hoping they all do better on the test.
Although good teachers have always tried to pinpoint their students' weak points, today's technology lets teachers analyze larger sets of data more quickly, she said.
Hillsborough trains teachers to analyze data to look for matches or mismatches between curriculum and standards to find gaps in learning, she said. When students don't understand something, teachers plan reteaching groups. For students who understand the material, teachers plan enrichment groups to keep them moving forward.
It's called differentiation. It's big in Hillsborough and “definitely the key in an urban poverty area,” Brown said. Poor, minority and immigrant youths enter school with widely varying ability levels. They also change schools more frequently.
“Students don't come to us with everyone in the exact same place, ready to move forward,” she said.
In North Carolina, data use has also been linked to the success of the Charlotte-Mecklenburg Schools.
In a report last year, educators attributed the district's recent test-score gains to “the comprehensive, detailed student data used to set incremental, achievable goals.”
By the end of the 2010-11 school year, the report from the Michael &amp; Susan Dell Foundation said, Charlotte-Mecklenburg had “built a strikingly data-rich culture in which administrators, teachers, parents and students looked to data as critical to the enterprise of learning — and not as a simple score on which any individual's success or failure rode.”
But Charlotte's example also highlights a potential downside to the data push.
To obtain more data to analyze student progress, the district created a series of new tests, resulting in a backlash from parents. Charlotte has since slowed down the changes.
Common criticisms of standardized testing are that it narrows the curriculum to core subjects such as math and reading, at the expense of arts and physical education, and that bubble tests poorly gauge the breadth of knowledge and critical thinking that children need for success.
But the goal of data-driven instruction is not so much to test children more as to do more with test data.
In the Houston area, the Aldine Independent School District has been sharpening its use of data for 17 years, Superintendent Wanda Bamberg said. Texas enacted its first statewide standardized tests in 1980.
Initially, Aldine's scores were among the lowest of 53 school districts in the Houston region, she said.
Aldine used a district-created Excel program to break down test data by student groups and then by individual student. Later, the district used a program called TRIAND, in which tests were computer-scanned and results were color-coded, improving efficiency. Teachers could spend more time working with the data instead of collecting, inputting and organizing it.
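The kind of breakdown Aldine's spreadsheet performed can be sketched in a few lines of Python. The data, field names and proficiency cutoff below are hypothetical illustrations, not the district's actual format:

```python
# A minimal sketch of breaking test data down by student group, then by
# individual student. All names, groups and scores here are made up.
from collections import defaultdict

scores = [
    {"student": "A", "group": "ELL",     "score": 62},
    {"student": "B", "group": "ELL",     "score": 74},
    {"student": "C", "group": "General", "score": 88},
    {"student": "D", "group": "General", "score": 91},
]

# First cut: average score for each student group.
by_group = defaultdict(list)
for row in scores:
    by_group[row["group"]].append(row["score"])
group_averages = {g: sum(v) / len(v) for g, v in by_group.items()}

# Second cut: flag individual students below an assumed proficiency cutoff.
CUTOFF = 70
needs_reteaching = [r["student"] for r in scores if r["score"] < CUTOFF]

print(group_averages)    # averages by group
print(needs_reteaching)  # students flagged for reteaching
```

The same two-level pass — group summaries first, then individual flags — is what moved from hand-built spreadsheets into commercial systems like the ones described below.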
Now Aldine uses the more powerful Eduphoria, a national commercial system produced by a Plano, Texas, firm.
“The data comes up in such a way that you can look at a class roster and you can literally say, ‘OK, tomorrow I'm going to group these students so you can meet their needs. I'm going to group these four kids together, and these six kids together. And this group of four, they made 100 percent, so I'm going to do enrichment for them,'” Bamberg said.
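The roster grouping the quote describes amounts to bucketing students by score. This Python sketch shows the idea; the names and thresholds are assumptions, not Eduphoria's actual output:

```python
# Hypothetical illustration of sorting a class roster into reteaching,
# on-track and enrichment groups based on assessment scores.
roster = {"Ana": 55, "Ben": 60, "Cara": 78, "Dev": 82, "Eli": 100, "Fay": 100}

def group_students(roster, reteach_below=65, enrich_at=100):
    """Bucket each student by score using two assumed cutoffs."""
    groups = {"reteach": [], "on_track": [], "enrichment": []}
    for name, score in sorted(roster.items()):
        if score < reteach_below:
            groups["reteach"].append(name)
        elif score >= enrich_at:
            groups["enrichment"].append(name)
        else:
            groups["on_track"].append(name)
    return groups

print(group_students(roster))
```

A teacher would plan reteaching for the first bucket and enrichment for the last, as Bamberg describes.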
When whole campuses struggle on tests, teachers in those schools get extra training.
Joseph Johnson, executive director of the National Center for Urban School Transformation, said data systems are only as good as the people using them.
“It's not something that you can buy that makes the difference in these places,” said Johnson, a professor of urban education at San Diego State University. “There isn't necessarily a silver bullet. But it is how smart, dedicated educators, working with their communities, use those products or tools in very systematic ways.”
Districts must train teachers to use data to improve instruction, he said.
If four seventh-grade math teachers have similar students, and one gets better results, school leaders can help to explore why, he said. It might be that the other teachers aren't connecting with the students, rely too heavily on the textbook or just don't understand the concept, he said.
It's important for teachers and principals to carve time out of the school day to analyze data.
Very successful schools structure teachers' planning time around logical teams. The fourth-grade teachers might have a shared period, with part of that time devoted to common planning, using data to help them.
School leaders need to set aside time to participate in those meetings, he said. Some school districts release kids early to give teachers such time.
OPS bought the new Acuity software system with federal stimulus money in response to the rollout of new state reading, math and science tests, said ReNae Kehrberg, the district's assistant superintendent for curriculum and instruction.
Kehrberg said the purchase does not reflect a new emphasis on data — officials say they've always made use of it — but rather a “retooling” in light of the new state tests.
After three years, some OPS teachers lean heavily on the Acuity software to diagnose weaknesses.
Dave Alati, assistant principal at OPS's Alice Buffett Magnet Middle School, said Acuity has helped teachers to differentiate math lessons.
Children enrolled at Marrs Magnet Middle School in South Omaha are very familiar with Acuity. They set Acuity goals, and colorful bulletin boards hail their successes.
Principal Pam Cohn credits the system with helping raise achievement, though she said not all teachers have embraced it. Of four teaching teams at Marrs, she said, the two teams who made better use of Acuity showed greater student gains this year.
Elsewhere in the district, some teachers say they have never been trained to use it, though the district has offered training sessions. Some teachers said they still need help identifying alternative teaching strategies and more time to collaborate on solutions.
OPS officials are hopeful that once the bugs are worked out with Acuity, instruction will improve and scores will rise.
Bamberg, the Aldine superintendent, said her district has developed “a culture” of data use. Teachers know that at team meetings, they will be talking about student results, including whose students did the best, and that teachers will receive training based on what's working.
Nancy Sebring, who will take over as OPS superintendent in July, said districts should not overwhelm teachers with data. What data they get should be concise and informative, she said.
While some teachers are rightly concerned about an overemphasis on testing in schools, the standards movement has given teachers a better understanding of what's expected of them, Sebring said.
“We didn't always know what the targets were,” Sebring said. “Our target might be whatever was in the textbook that we were handed. And now, with state content standards in place, that has really helped us define what the target is for all students, and thus, it's somewhat easier to align your work toward that and then to monitor it.”
While data analysis can pinpoint problems, it doesn't solve them, she said.
“Ultimately, you have to turn that information into some kind of a work plan,” Sebring said. “How am I now going to address this student in my classroom?”
World-Herald staff writer Jeffrey Robb contributed to this report.
Contact the writer: 402-444-1077, firstname.lastname@example.org