Politics

What’s the best way to test for partisan gerrymandering?

[Image: The Wisconsin State Capitol in Madison. Credit: Teemu08/CC BY-SA 2.0 (image cropped)]

In October, the United States Supreme Court heard oral arguments in Gill v. Whitford, a landmark case about gerrymandering in Wisconsin. Democratic plaintiffs proposed a simple formula — called the efficiency gap — to determine whether an electoral map is fairly drawn.


“It's meant to get at this idea of 'wasted votes,'" explains Oliver Roeder, a senior writer for FiveThirtyEight. “So, you take a party's total wasted votes — so, 'wasted' in the sense of votes for a losing candidate or votes for a winning candidate beyond what he or she needed — and you divide that by the total number of votes cast. And a gerrymanderer tries to get one party essentially to waste as many votes as it can, and that's what the efficiency gap tries to get at.”
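Roeder's description translates directly into arithmetic. The sketch below is one plausible reading of it, not an official implementation: for each district, the losing party wastes every vote, and the winning party wastes every vote beyond the bare majority it needed; the gap is the difference in the two parties' wasted votes as a share of all votes cast. (Published versions of the metric sometimes define "needed" as exactly half the votes; the function and variable names here are illustrative.)

```python
def wasted_votes(votes_a, votes_b):
    """Per-party wasted votes across districts: every vote for a losing
    candidate, plus a winner's votes beyond the bare majority needed."""
    wasted_a = wasted_b = 0
    for a, b in zip(votes_a, votes_b):
        needed = (a + b) // 2 + 1  # bare majority in this district
        if a > b:
            wasted_a += a - needed
            wasted_b += b
        elif b > a:
            wasted_b += b - needed
            wasted_a += a
        # exact ties are ignored in this sketch
    return wasted_a, wasted_b


def efficiency_gap(votes_a, votes_b):
    """Net wasted votes as a share of all votes cast; the sign indicates
    which party the map disadvantages."""
    wa, wb = wasted_votes(votes_a, votes_b)
    return (wa - wb) / (sum(votes_a) + sum(votes_b))
```

On a toy map where party A narrowly wins four districts 55–45 and loses one 30–70, party B wastes far more votes, and the gap comes out strongly in A's favor — the signature of a map packing and cracking B's voters.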

But in response to the proposed math, the justices of our nation’s highest court were skeptical. Chief Justice John Roberts described efforts to measure partisan gerrymandering as “sociological gobbledygook.” Justice Neil Gorsuch compared a lower court’s use of several tests to his steak rub recipe: “And so what's this court supposed to do, a pinch of this, a pinch of that?” Even the more liberal Justice Stephen Breyer opined that the standard should be “manageable by a court,” not just social scientists and computer experts.

The reactions frustrated Roeder, who discussed them in an article for FiveThirtyEight called “The Supreme Court Is Allergic to Math.” But Moon Duchin, an associate professor of mathematics at Tufts University who studies gerrymandering, says there’s broader context for the justices’ grumbling.

“The case started out in a district court, then it went to a circuit court, and the Supreme Court, and at every higher level, the plaintiffs — the ones challenging the map in Wisconsin — broadened the scope a little bit of which metrics they wanted to consider,” she explains. “I think part of the complaint was actually that it wasn't just efficiency gap, there were a proliferation of different kinds of methods.”

She explains the efficiency gap metric came about in the first place because, in the past, it seemed like the court was looking for a simple, all-in-one score for gerrymandering.

“They're looking for a standard that can be managed by courts and not just by experts, right,” she says. “The problem is that, just in terms of the information theory, gerrymandering is far too complicated to be captured in a single score, and it'll be hard to find a meaningful threshold for where to cut off permissibility in that score.”

In fact, Duchin thinks there’s a method better than the efficiency gap metric at discerning abusive gerrymandering. It’s called an outlier analysis, and she says the idea is already on the justices’ radar — it came up in the oral arguments.

“So, this approach says, 'Let's use computers to help us get out of this mess; let's try to understand the giant space of all the possible districting plans,'” she explains. “What if you knew something about those many millions of maps and you could sample from that space just the plans that are legally valid, that meet all the criteria that a plan has to meet?

“If you can look at all those along any attribute — you could look at them by efficiency gap, simply by the number of seats for the Democrats or Republicans, by other partisan symmetry metrics, your favorite metric — what they'll do is they'll give you a bell curve and a really easy intuitive test of whether a legislature's proposed map sits in the meaty part of the bell curve, or … way out in the tail.”
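The outlier test Duchin describes can be sketched in a few lines. This is a deliberately simplified toy — it shuffles precincts into equal-size districts at random, ignoring the contiguity, population-balance, and legal constraints that real ensemble methods (typically Markov chain samplers over valid plans) must respect — but it shows the shape of the argument: build a distribution of outcomes over many sampled plans, then ask where the enacted plan falls. All names and data here are hypothetical.

```python
import random

def seats_won(plan, dem_votes, rep_votes):
    """Count districts where Democrats outpoll Republicans, given a plan
    that assigns each precinct (by index) to a district number."""
    n_districts = max(plan) + 1
    dem = [0] * n_districts
    rep = [0] * n_districts
    for precinct, district in enumerate(plan):
        dem[district] += dem_votes[precinct]
        rep[district] += rep_votes[precinct]
    return sum(1 for i in range(n_districts) if dem[i] > rep[i])

def outlier_percentile(enacted_plan, dem_votes, rep_votes,
                       n_samples=1000, seed=0):
    """Fraction of randomly sampled plans whose Democratic seat count is
    at or below the enacted plan's — i.e., where the enacted map sits in
    the ensemble's distribution. Toy version: random precinct shuffles."""
    rng = random.Random(seed)
    n_precincts = len(enacted_plan)
    n_districts = max(enacted_plan) + 1
    size = n_precincts // n_districts
    enacted_seats = seats_won(enacted_plan, dem_votes, rep_votes)
    at_or_below = 0
    for _ in range(n_samples):
        order = list(range(n_precincts))
        rng.shuffle(order)
        plan = [0] * n_precincts
        for position, precinct in enumerate(order):
            plan[precinct] = min(position // size, n_districts - 1)
        if seats_won(plan, dem_votes, rep_votes) <= enacted_seats:
            at_or_below += 1
    return at_or_below / n_samples
```

A legislature's map that lands near 0 or 1 on this scale — far out in the tail of the ensemble's bell curve — is the kind of statistical outlier Duchin argues courts could flag, without committing to any single summary metric.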

That kind of flexible approach doesn’t bind the courts to a particular metric, she says — and it’s mathematically powerful. In an era when gerrymandering is increasingly difficult to detect, that’s just what we need.

“I mean, the current crisis in gerrymandering really is brought on by much more sophisticated uses of data,” Duchin says. “And actually, one of the things that should be alarming is that with all the access to data that we have, down to the household level sometimes, you can make a gerrymander that doesn't look as bad as, you know, your grandparents' gerrymandering.

“It doesn't have to look like a reptile with teeth and claws and wings anymore in order to produce really kind of skewed outcomes. So, we absolutely have to catch up to this problem in robust mathematical and quantitative ways.”

As for the Supreme Court’s seeming aversion to math, Roeder’s reporting uncovered a few suggestions there, too. For one, he says that more empirical analysis is already sliding into law school curriculums. In our increasingly data-driven world, that’s a good thing.

“And two, I think each justice has a number of clerks, right, who help them through the legal thinking and the opinion writing. Why not have empirical clerks? Why not have some trusted advisers for the court, for the justices, to help them understand or parse or think through the empirical evidence in addition to the purely legal arguments?”

There, Duchin adds, the scientific community can contribute, as well. “If you read between the lines in this case, some of what the conservative justices are complaining about is just the very newness of efficiency gap and the idea that it hasn't been around long enough to be vetted,” she says.

“So, what we can do as a scientific community is also take the time to debate openly in peer-reviewed journals about the best ways to measure these things, get more mathematicians in the conversation and come to a kind of consensus.”

This article is based on an interview that aired on PRI’s Science Friday with Ira Flatow.