To some degree, ranking the best high schools of the region is an exercise in predictability. Given the measures we have, the neighborhoods we have, the socio-economic disparity we have, we know the schools that serve largely middle-class or affluent students will generally — as in almost always — rank higher than those serving a low-income student body.
This is a nationwide truth, and one with which we have been wrestling as a society for decades now. The causes do not lie in the innate ability of students. They don’t lie in their desire to learn. Rather, this is a pipeline issue. It’s the accumulation of multiple factors. Those factors include the education level of parents plus the stability and mobility of the family plus the concentration of highly qualified teachers and stability of leadership within a school. It’s the rigor of the curriculum and the range of course offerings plus the size of budgets that can pay for support staff, electives and extracurricular activities — or not.
How do you compare Richmond’s Armstrong High School, where last school year three of four students were classified by the state as economically disadvantaged — they qualified for free or reduced-price meals, public assistance or Medicaid — with Henrico County’s Deep Run, where fewer than 5 percent were considered economically disadvantaged?
Our answer? You don’t.
Yes, all should be performing at high levels no matter the neighborhood, no matter the household income of a student. But our reality does not yet match our rhetoric. So, in an attempt at an apples-to-apples comparison, we divided the 28 traditional high schools in the region (that’s Richmond city and the counties of Chesterfield, Henrico and Hanover) into three categories based on the percentage of economically disadvantaged students enrolled in the last school year. Schools then were ranked relative to their peers. Overall scores are provided, as well.
In the end, the schools with the highest poverty rates still tended to fall to the bottom of the ranking, even among their peers. But Henrico County’s Varina High School climbed among its group, largely because it has one of the smallest achievement gaps between black and white students in the region. A couple of outliers also emerged. Hanover County’s Lee Davis High School scored far below its peers largely because it reported a smaller percentage of students graduating with advanced diplomas and enrolled in advanced coursework over the last three school years. Richmond Public Schools’ George Wythe High School tumbled on the safety score. Over the last three school years, its administrators have reported an average of 1,425 student offenses, most under the category of disruptive or disorderly behavior, which is up to each school to define as it sees fit. No other school came close to reporting as many offenses.
Finally, we interviewed students who graduated in 2014 at the top of their classes. We asked them how well their schools prepared them for their first year of college. Almost all touched upon that which cannot be measured: intrinsic student motivation, student-teacher relationships and the vital importance of a constellation of adults — beyond parents — who believe in the capacity of all students to succeed. - Tina Griego
Accountability 2.0
What Virginia’s testing regimen hasn’t been telling us about our schools and students
In 1996, Stephen Geyer took a job as a seventh-grade English teacher in Greene County Public Schools. That same year, Virginia rolled out its new grade-by-grade curriculum, called the Standards of Learning. The standards, Geyer thought then and still argues now, were a good thing. They set clear goals in need-to-know core subjects, such as reading, math and science. They provided uniformity. They promoted accountability.
Geyer left Greene County four years later for a job in Northern Virginia as an elementary school principal. The following year, President George W. Bush signed No Child Left Behind into law, launching what has come to be known as the high-stakes testing era. Virginia already had phased in its SOL testing, but the new federal mandate created more pressure on teachers and schools to get students to a passing score, Geyer says, and that pressure only increased as more tests were mandated.
Geyer spent a decade wrestling with the fact that public opinion of his school, ultimately, hinged on a single number: what percentage of its students passed the tests. Over time, he and his colleagues grew more convinced that the way in which the state’s accreditation system sought to hold schools accountable wasn’t working — especially for students far ahead of their peers or those who were several grade levels behind. There had to be a better way, he thought.
“The more the pressure around the tests got ratcheted up, the more we were forgetting about those marginalized kids,” he says.
Geyer arrived in Goochland County in 2011 to take the job of assistant superintendent. The county hired James Lane as superintendent a few months later. Ahead of the 2013-14 school year, the pair formed a task force that sought a better way to assess how the county’s schools were performing and what its students were learning. Instead of relying solely on the state’s SOL test pass rates to determine school performance, the district would also measure — and tell parents — how much progress students made during the course of the year, as well as year-over-year.
As it stands, Virginia’s system for accrediting schools does not take into account students’ academic growth. Put another way, a student who achieves a score of 250 on his reading SOL one year may score a 395 the following year. It’s a huge leap, but still below the required 400 score for proficiency.
“The teacher that helped that student is a miracle worker,” Superintendent Lane says. “But in the Standards of Learning era, that kid is considered a failure.”
Which is the better school: One where 85 percent of students pass a standardized test and 25 percent show growth, or one where 60 percent of students pass, but 75 percent show growth?
“There’s a pretty wide consensus that the [SOL] system served us well in many ways, but that it’s time to rebuild it from the ground up,” says Anne Holton, Virginia’s secretary of education. “What the next system is going to look like is still very much in flux. The notion that we should be assessing more growth — where does a student start versus where they end — seems to have been gathering pretty widespread acceptance.”
Welcome to Accountability 2.0.
Throughout the country, states — pushed in part by the federal government — have been moving toward a combined measure of school performance that takes into account both how a group of students performs at a particular point in time and how that same group performs over time.
Virginia has been tinkering with its growth measurement over the last several years. It had to. Part of the American Recovery and Reinvestment Act of 2009 said that states receiving federal education funds were required to provide student growth measures to school divisions to help them evaluate teacher performance. By 2014, about 30 states were using some form of a growth measure called a value-added model, which attempts to measure how much a teacher or school contributes to a student’s academic progress.
In Denver, Colorado, for several years now, the district’s school report cards have tracked each school’s performance by both achievement and year-over-year academic growth. The growth measure is largely — but not entirely — based upon the state’s standardized test scores. Next fall, the district plans to slightly reduce the emphasis on growth and add one more measure of school performance: how well each school is closing achievement gaps. Such gaps measure the disparity in academic performance between groups of students, typically based on race and/or economic status.
In September, Virginia implemented a new growth measure it’s calling progress tables. The new system will not compare an individual student’s growth against a group of like students — as the state’s first attempts at measuring progress did — but instead against his or her past performance on the SOL. The measure divides test ranges into new increments of growth. Move from “high” proficiency to “low” advanced and you grew by one sub-level.
At the earliest, a report on the progress tables could be available by fall 2016, says Charles Pyle, communications director for the Virginia Department of Education. The department has not yet determined whether it will publicly release any data derived from the new system.
But even as Virginia moves toward a new growth measure, that measure still relies upon SOL data, and SOL data was not designed to measure student growth. Or school performance. Or teacher effectiveness, says Meg Gruber, a high school science teacher of 32 years.
“The SOLs were set up originally to measure, at that one point in time, what facts a child knows,” says Gruber, president of the Virginia Education Association and a member of the state’s SOL Innovation Committee. “We’re still using tests that were designed for one function and we’re now using them for another function.”
The SOL isn’t a perfect proxy for growth, acknowledges Goochland superintendent Lane, but he’s encouraged the state is taking steps toward measuring student progress.
Before the start of the 2013-14 school year, Goochland contracted with the Northwest Evaluation Association to provide its Measures of Academic Progress test to students in first through eighth grade in reading and math. The tests, administered in the fall, winter and spring, are computer-adaptive, meaning that questions get harder when a student answers correctly and easier when he or she answers incorrectly. The company generates an individualized report that the district then sends home to parents. The report shows a student’s progress compared to school, district and national averages. Henrico County Public Schools uses the same test to assess growth for students in grades three through eight and at some high schools, and has done so since 2010-2011. Richmond Public Schools uses the test in struggling schools. For Goochland high schoolers, the division administers student growth assessments that Roanoke-based company Interactive Achievement generates for each subject area. (Chesterfield County also uses Interactive Achievement.) For both contracts, the division pays $45,000 annually, Geyer says.
Lane says the district has used the growth results to create targeted instruction for students who were struggling and to show just how much progress they could make in a year, even if that meant not meeting the SOL benchmark.
“We’re able to celebrate successes that in the past were considered failures,” Lane says.
The changes didn’t just help struggling students, Lane says. Parents of gifted students who routinely performed well on the SOLs could see whether the system was challenging them to keep improving.
Virginia’s first SOL benchmark tests were given in the 1997-98 school year. Scored on a scale of 0 to 600, SOLs are taken once a year by students in grades three through eight for reading and math, and periodically in science and history. For a school to achieve full accreditation, at least 75 percent of its students must pass the reading SOL and 70 percent must pass the math, history and science tests. Tougher math and reading tests, introduced in 2012 and 2013 respectively, upped the ante.
At high schools, the stakes are even higher. Students must pass at least six SOL tests in core courses to earn enough verified credits to graduate with a standard diploma. To earn an advanced diploma, the best measure of college readiness, a student must pass nine benchmarks.
A high school must graduate 85 percent of students to be deemed fully accredited.
A pass/fail system does not acknowledge that some schools must reach the benchmarks with students who enter far below the grade level at which they need to test proficient. This is especially true at schools that serve high-poverty populations, have a high percentage of students with disabilities, or have a high percentage of students who are learning English.
All six of the Richmond region’s high schools where fewer than 10 percent of students are economically disadvantaged were fully accredited before the start of the 2014-15 school year. In contrast, only three of the 11 schools where poverty exceeds 40 percent achieved full accreditation last year.
In June 2012, the U.S. Department of Education granted Virginia a waiver to set incremental goals for students at underperforming schools to slowly chip away at the achievement gap between the lowest- and highest-performing schools in the state. By 2016-17, 78 percent of students must be proficient in reading and 73 percent must be proficient in math.
All this revamping is happening against a backdrop of growing resistance to standardized tests as a chief measure of student — and school — performance. The No Child Left Behind Act of 2001 set the ambitious goal that 100 percent of students receiving a public education in the United States should be “proficient” in each subject area by 2014.
Not a single school division in the state met that goal.
Two-thirds of Americans think public schools place too much emphasis on testing, according to a Gallup poll published in August. More than half of the poll’s respondents also said tests were not the best way to measure a school’s success.
Victoria Carll, a teacher at Richmond Public Schools’ Open High, is founder of RVA Opt Out, a group that encourages parents to opt their children out of SOLs. Carll says Virginia’s test-result, data-driven accountability system is “an underrepresentation of what kids, teachers and schools are struggling with and what they’re succeeding at.”
Carll prefers an accreditation process that takes into account a school’s self-evaluation — whether students meet goals set by teachers and administrators — the makeup of its student body, parental involvement and its course offerings.
In recent years, the state has dialed back its testing policy. The General Assembly passed legislation in 2013 eliminating five SOL exams (third-grade science and history; fifth-grade writing and two U.S. history tests in middle school), cutting the number of tests from 34 to 29. In doing so, the state required school divisions to replace them with alternative assessments. Additionally, a measure passed last year that allows expedited retakes for students in third through eighth grade lowers the stakes for teachers and students alike.
Gov. Terry McAuliffe assembled the SOL Innovation Committee in the summer of 2014 to review the state’s testing and accreditation policies. The committee’s initial recommendations in November 2014 included a greater range of accreditation categories for schools that have not reached the state’s pass-rate threshold but have demonstrated significant progress toward it. The committee’s work continues through 2015 and could render more recommendations ahead of the upcoming General Assembly session.
Accountability 2.0, Secretary of Education Holton says, represents a necessary acknowledgment by the state of “the starting points that folks are dealing with while ultimately wanting folks to still get to the same finishing point.”
“The biggest thing we hear from everyone is: ‘lighten up,’” she says. “We’re trying to figure out ways to preserve the essence of accountability, but be more constructive — have a less punitive system.” - Mark Robinson
Methodology
1. Our ranking includes the 28 traditional public high schools in Henrico, Chesterfield and Hanover counties and in the city of Richmond. Excluded are the two regional governor’s schools, Maggie L. Walker and Appomattox, as well as Community and Open high schools and Franklin Military Academy. All five are selective-admission schools.
2. The schools are grouped by the percentage of students who were classified as economically disadvantaged during the last school year. Virginia considers a student economically disadvantaged if he or she receives free or reduced-price meals or Temporary Assistance for Needy Families (public assistance), or is eligible for Medicaid. High-challenge schools had an economically disadvantaged population above 38 percent; low-challenge had an economically disadvantaged population at 10 percent or below. Mid-challenge encompassed those in between.
3. The information used in our ranking was drawn from the Virginia Department of Education’s 2012-2013, 2013-2014 and 2014-2015 School Report Cards and School Cohort reports. Virginia Commonwealth University’s D’Arcy Mays, associate professor and chairman of the Statistical Sciences and Operations Research Department, created the statistical model to calculate rankings. That model required the conversion of all numbers to “positive value” percentages. So, instead of using the percentage of students who drop out of school, the model uses the percentage of students retained. Instead of using the number of disciplinary incidents per 100 students, the model converts to its flipside, a safety rate.
4. We gave the most weight (20 percent) to the percentage of students graduating with advanced diplomas, which is what the state considers the best proxy for college or career readiness. Another 20 percent was assigned to pass/advanced scores on the Standards of Learning (SOL) reading test. We assigned lower weights (15 percent each) to retention rates and to the percentage of students enrolled in advanced coursework or programming: Advanced Placement, International Baccalaureate and dual-enrollment classes. It is possible, even probable, that there is overlap among these three groups, since one student could be taking both an AP class and a dual-enrollment class and would thus be counted in both categories.
We assigned still lower weights (10 percent each) to on-time graduation rates, 2013-2014 reported student misconduct (safety), and to SAT scores, which are based on a three-year average of the combined verbal and math scores converted to a percentage of a perfect 1600 score. A combined average score of 1050 would equate to a 65.6 percent average SAT.
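For readers who want to see how those weights fit together, here is a minimal sketch of the composite calculation, assuming a simple weighted sum of the “positive value” percentages described above. The function and variable names are ours for illustration, and the example school is invented; the actual statistical model built at VCU may differ in its details.

```python
# Illustrative sketch only: the weights come from the methodology above;
# everything else (names, rounding, the example school) is assumed.

def composite_score(measures):
    """measures: 'positive value' percentages (0-100) for one school."""
    weights = {
        "advanced_diploma_rate": 0.20,     # graduating with advanced diplomas
        "sol_reading_rate": 0.20,          # pass/advanced on the reading SOL
        "retention_rate": 0.15,            # flipside of the dropout rate
        "advanced_coursework_rate": 0.15,  # AP, IB and dual enrollment
        "on_time_graduation_rate": 0.10,
        "safety_rate": 0.10,               # flipside of reported misconduct
        "sat_pct_of_1600": 0.10,           # e.g., 1050 / 1600 = 65.6 percent
    }
    return sum(weights[key] * measures[key] for key in weights)

# A hypothetical school, not one from our ranking:
example = {
    "advanced_diploma_rate": 55.0,
    "sol_reading_rate": 88.0,
    "retention_rate": 97.0,
    "advanced_coursework_rate": 40.0,
    "on_time_graduation_rate": 90.0,
    "safety_rate": 95.0,
    "sat_pct_of_1600": 65.6,
}
print(round(composite_score(example), 1))  # 74.2, before any bonus points
```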
Bonus Points: Education experts suggested we give recognition to diversity within a school, so we awarded bonus points for racially and ethnically diverse student populations. Each school earned between 0 and 10 points corresponding to the percentage of students not in the largest racial group in 2014-2015. For example, if 60.5 percent of a school’s students were white, then the remaining 39.5 percent were of other races/ethnicities, and the school received a bonus of 3.95. The greater the diversity, the greater the bonus.
While we were at it, we decided to award — or subtract — points based on how well the school was closing the achievement gap between black and white students. One bonus point is the equivalent of one percentage point. We took the difference in the percentage of black students and white students passing the SOL reading test (averaged over three years). The model establishes a 10-point gap as the baseline. A school with an achievement gap of 13 percentage points would then lose 3 points. A school with an achievement gap of 3 percentage points would gain 7 points. We could not calculate the achievement gap for four schools due to a lack of data, a lack of diversity, or, as in the case of Chesterfield County’s Meadowbrook High, because black students were outperforming white students.
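The two bonus adjustments are simple arithmetic, and the sketch below follows the examples given above. Again, the function names are ours and the sample pass rates are invented; the model’s actual implementation may differ.

```python
# Sketch of the bonus adjustments described above; the formulas follow the
# examples in the methodology, and the sample pass rates are illustrative.

def diversity_bonus(largest_group_pct):
    """0 to 10 points: one point for every 10 percent of students
    outside the school's largest racial/ethnic group."""
    return (100.0 - largest_group_pct) / 10.0

def achievement_gap_adjustment(black_pass_pct, white_pass_pct, baseline=10.0):
    """Points gained or lost relative to the 10-point-gap baseline,
    using three-year average pass rates on the SOL reading test."""
    gap = white_pass_pct - black_pass_pct
    return baseline - gap

print(diversity_bonus(60.5))                   # 3.95, as in the example above
print(achievement_gap_adjustment(72.0, 85.0))  # 13-point gap: loses 3 points
print(achievement_gap_adjustment(82.0, 85.0))  # 3-point gap: gains 7 points
```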
Low-Challenge
Johnny Willing: 2014 Atlee High School valedictorian, now at the College of William & Mary. (Photo by Jay Paul)
“When I got to Atlee, I didn’t have a clue what I wanted to do after I graduated. I thought for a while I was going to study chemistry and engineering. I thought I would study music. I was in the IB (International Baccalaureate) program; it taught me to think outside of the boundaries of the United States. It helped me a lot, getting interested in foreign languages and foreign cultures. For me, enrolling in IB wasn’t a question. I wanted the most challenging course load and I thought IB would be the most challenging, overarching curriculum that Atlee offered. It was very comprehensive. I didn’t have great experiences with some individual classes, and I sometimes disagreed with the teaching philosophies of individual teachers, but overall, with the IB curriculum itself, I thought Atlee did a great job of allowing me to have the academic experience and rigorous curriculum that I wanted.
During my senior year, I applied to colleges and several exchange programs. I was accepted to William & Mary, but I also accepted a scholarship from the (U.S.) State Department to study abroad in South Korea. I spent the past year in the National Security Language Initiative for Youth immersion program, living with a host family while taking Korean language classes and attending a Korean high school.
Now, I’ve kind of settled on some combination of economics and international relations, either working for the State Department or an NGO [nongovernmental organization]. During high school, my future plans changed totally, and I think they changed because of the teachers I had at Atlee who challenged me to study different things.”
John Brownhill: 2014 Deep Run High School salutatorian, now at the University of Virginia. (Photo by Jay Paul)
“At Deep Run, it was about high standards, high scores and the student body is extremely competitive. All that put me in the place I needed to be for college. You can’t depend on the system to push you through. You have to have a strong work ethic, and I do. That’s something that comes first from my parents and from myself. The resources at Deep Run are plentiful, and if you use them efficiently and take advantage of them, you can really succeed.
I was always a strong student — not in the sense of ‘I’m more intelligent,’ but I set a goal, I ask myself: ‘What do I need to do to accomplish my goal? What do I need to sacrifice along the way to achieve it?’ Then I work hard and do it. My peer group was definitely instrumental because the environment is so competitive, it was like, I have to strive, I have to do better than the next guy. I liked that. It’s the way the real world works.
I’d say the aspects of a superior high school are first, a great teaching faculty — teachers willing to go the extra mile for their students. Second, a student body that is driven toward academia — that’s really important, having that community of students who all know they have to do well in school. And the third is parent involvement. Deep Run has an extensive parent-teacher program and just having that involvement of parents making sure things are done the right way makes a huge difference.”
Mid-Challenge
Sean Sequeria: 2014 Godwin High School valedictorian, now at the University of Virginia. (Photo by Jay Paul)
“I really think, academically, Godwin and Henrico schools, in general, help you prepare for college. But I think what goes along with that is the intrinsic motivation of students to take the classes that will prepare you for college.
I chose to go to Godwin for its Math and Science Specialty Center. Essentially, the entire curriculum was very biased toward math and science, which a lot of people would frown on, saying you’re not getting that holistic education you would get in a normal high school. But, for someone like me, it all kind of boils down to recognizing who you are as a student.
When you ask what distinguishes the best high schools, it’s not AP (Advanced Placement) scores or GPAs that really provide effective measures because they are so erroneous and biased from school to school. I don’t want to say SATs are a good measure, either, because so much depends upon the amount of prep you get, and that amount varies, especially with the vast disparity in socioeconomic status from county to county.
One thing that colleges emphasize that high schools don’t is the student-teacher ratio, and in that, there’s a disparity between private and public schools. My private school friends got a lot of extra instruction. When teachers devote more time to each student, students feel as though adults other than their parents care about their future, their education, their dreams, and so they do, too. They want to succeed, not only for themselves, but for the teacher who has invested so much time in them. It’s about the micro-environment — a stimulating one can be instrumental in changing a student’s life.”
Zachary Zumbo: 2014 Matoaca High School salutatorian, now at the Massachusetts Institute of Technology. (Photo courtesy Zachary Zumbo)
“When I got my acceptance letter from MIT, I thought I was kind of screwed, to be honest. I just didn’t think I would be able to live up to the expectations. But once I settled down, I realized that a lot of students were really in the same boat as I was. It wasn’t an easy transition, but it was a good one.
I chose Matoaca for its specialty center, the Center for Teaching and Learning through Technology, and it offered a few classes I don’t think I could have gotten any other place. So, I did fine in my tech classes in my first year of college, but I really wasn’t prepared for how rigorous the humanities classes would be.
In high school, you know, you look at the textbook and there are really good sources for taking the test and passing the test. You read, you synthesize, you write. It was all in front of me. But in college, it was like, here’s the essay you have to write — you find the sources. It’s so open-ended and you might be digging through hundreds of sources and I didn’t know how to do what was being asked of me. So, that kind of skill is something I wish high school would have helped me with.
For me, the difference in high school was the teachers. I had excellent teachers. It wasn’t a matter of them having mastered the material. Just because you’re a good student doesn’t mean you’ll make a good teacher. There is just something different about the best teachers. They understand not just the way a teacher needs to teach, but the way each student needs to be taught. It’s something organic.”
High-Challenge
Mercedes Hanks: 2014 John Marshall High School valedictorian, now at James Madison University. (Photo by Jay Paul)
“My classes in high school did not prepare me in any way for the rigor of my classes in college. I had great teachers at John Marshall, teachers who were more than teachers. They cared about me as a person, but, as far as the curriculum, it was nowhere near rigorous enough.
In high school, I was a mathematical genius, I really was. My first semester at JMU, I took a math class called quantitative reasoning. The second semester I took statistics. I got a B in quantitative reasoning and a C in statistics. I couldn’t believe that — a C! I would have cried in high school if I had gotten a B.
The other thing is that I went to a predominantly black high school, and so I faced a lot of culture shock going to JMU, which is a predominantly white institution. Making high schools more integrated is hard because of the way schools are zoned and who lives in what neighborhoods.
A public-speaking class might have helped me better prepare. Certainly, the opportunity to take more AP (Advanced Placement) classes would have benefited me. Also, in the whole district, we only have one IB (International Baccalaureate) high school and only one governor’s school in the city. So we don’t have as many choices as the counties do. I also think we need to stop putting so much time into standardized testing. I’m not sure it’s preparing us for anything.
Difficult as it has been, quitting has never been a thought in my mind. No, no, no, no. It’s difficult and difficult is good because if it weren’t, then I wouldn’t be growing.” Read more about Mercedes’ academic journey in Tina Griego’s Sunday Story column.
Mounika Bodapati: 2014 John Randolph Tucker High co-valedictorian, now at Columbia University. (Photo courtesy of Mounika Bodapati)
“I was in the first IB (International Baccalaureate) graduating class at Tucker, so a lot of the times, even when we talked to our Henrico High counterparts, our curriculum tended to be very rigorous. But it ended up paying off. Competing at Columbia has been a lot easier because of the rigors the IB program at Tucker put me through. They really tried to bring the concepts of IB and interdisciplinary work to life and to look at how what we were learning in the classroom applied to the real world.
Tucker itself is a very diverse school. At one point when I was there, I think we had kids coming from something like 72 countries to the school. That diversity prepared me very much for life at Columbia, all the different viewpoints, going to classes with different people from such different backgrounds. I already knew how to deal with it, handle it, and make the most of it because of my experience at Tucker.
My main issue was with how Henrico County handled opportunities. When I was trying to do Science Fair, there were a lot of hoops I had to jump through. The kids that needed to be challenged were always challenged; I just wish I was, I guess, a little bit more challenged. I don’t think there’s a good way for students who want to access different opportunities to easily be able to access them.”