The Supreme Court Is Allergic To Math

from FiveThirtyEight

The Supreme Court does not compute. Or at least some of its members would rather not. The justices, the most powerful jurists in the land, seem to have a reluctance — even an allergy — to taking math and statistics seriously.

For decades, the court has struggled with quantitative evidence of all kinds in a wide variety of cases. Sometimes justices ignore this evidence. Sometimes they misinterpret it. And sometimes they cast it aside in order to hold on to more traditional legal arguments. (And, yes, sometimes they also listen to the numbers.) Yet the world itself is becoming more computationally driven, and some of those computations will need to be adjudicated before long. Some major artificial intelligence case will likely come across the court’s desk in the next decade, for example. By voicing an unwillingness to engage with data-driven empiricism, justices — and thus the court — are at risk of making decisions without fully grappling with the evidence.

This problem was on full display earlier this month, when the Supreme Court heard arguments in Gill v. Whitford, a case that will determine the future of partisan gerrymandering — and the contours of American democracy along with it. As my colleague Galen Druke has reported, the case hinges on math: Is there a way to measure a map’s partisan bias and to create a standard for when a gerrymandered map infringes on voters’ rights?

More here.


8 Responses to The Supreme Court Is Allergic To Math

  1. Carolyn Wyland November 10, 2017 at 9:16 am #

    The Supreme Court not using statistical data in its decision-making is not that surprising, because law can be a very complicated matter. Every case brought before the Supreme Court is different, and interpreting the law is a very subjective exercise. Not every case warrants statistical data; some do not need numbers to prove an injustice. But some cases do warrant it, and every judge should be able to interpret and use data. Statistical evidence should be drawn on in the same way that case law is.
    It is an excellent point that the gerrymandering case before the Supreme Court requires the analysis of statistical data. The vote totals for each candidate before and after new boundary lines were drawn could be critical in proving that gerrymandering occurred, so statistical data matters in cases that call for hard numbers as evidence. For judges to ignore or set aside statistical data is a careless mistake that can mean missing crucial facts. If the Supreme Court used statistical data, it would set a precedent for all lower courts and could save time and money by letting decisions be reached more quickly.
    Courts at all levels should be using statistical data now more than ever, given how far technology has advanced. Technology has reduced errors in computation, and its capacity to store data keeps growing. Brilliant minds have designed programs used by accountants, financial advisors, and engineers; why shouldn't a less technological profession use them too? In a technology-driven society, courts should keep up with the times and use these tools to their advantage. This does not mean ignoring every other consideration in law, but rather weighing statistical data as part of them.

  2. Ameer Richmond November 10, 2017 at 11:21 am #

    Reading this article, I picked up on a few things that really caught my eye, the biggest being that the Supreme Court is not in favor of dealing with numbers and statistics. I agree this can be tedious on top of everything else the Supreme Court handles, but it is a matter that needs to be looked at, and fast. The more numbers a case involves, the more complicated it becomes as a whole, which can make it even more stressful for the Supreme Court to reach a final decision. The article also points out the anxiety that calculations can provoke. That anxiety is natural, but ignoring the math, or refusing to use it for reasons like that, is not just. In this day and age the issue needs to be taken very seriously, because, as in the gerrymandering situation, more and more cases will turn on math and statistics. The justices must be willing to upgrade, or even find new methods, to keep the situation headed in the right direction.
    Looking ahead, the article mentions multiple remedies, many of them long-term solutions. Fixing the law school curriculum is a smart idea, but how long will that take before it is effective in shaping future judges? I believe what is needed is simply better ongoing practice of statistics among judges, perhaps mandatory refreshers every two or three years. They likely still understand the concepts behind many calculations, but such things are often forgotten without use. Keeping judges up to date with technology can help as well, by making sure they can handle any calculations that need to be done. But just as the article states, only time can fix this math issue.

  3. KM November 10, 2017 at 8:13 pm #

    Oliver Roeder’s article “The Supreme Court Is Allergic To Math” brings to light how a data-driven society has posed new challenges for the Supreme Court. Several key cases being brought before the Court, such as the prominent gerrymandering case Gill v. Whitford, involve the use of statistical analysis and quantitative evidence to support a given position. Cases such as these will only become more common as society grows more dependent on data and statistical analysis to make better informed decisions. The reluctance of the Supreme Court to accept or fully acknowledge these tools as providing satisfactory evidence could have a significant impact not just on the legal system, but on society as a whole.
    One of the things I found most interesting in this article was the attitude some of the justices took toward situations involving unfamiliar territory, where their knowledge may be tested or limited. As the article notes, the justices are all highly educated individuals, yet they seem reluctant to accept that society and how it operates are changing, particularly in regard to the use of data. Cases will only become more complicated as we continue to push the limits of technology and incorporate it ever more into our daily lives. Since the Supreme Court has every right to choose which cases it hears, it would be difficult to challenge why the Court is declining some of these cases involving quantitative data. Yet while the justices legally have the right to decline a case, it raises the question of whether they are doing so for ethically sound reasons. Personally, I find it preferable that they decline to hear a case rather than arrive at a misguided decision because they failed to accept, understand, or correctly interpret data-driven evidence.
    This article highlights the fact that even the Supreme Court has limitations. Nine individuals cannot be expected to be knowledgeable in all areas; rather, the justices need to be experts in interpreting the law and the U.S. Constitution. As Eric McGhee notes in Roeder’s article, maybe it is time the Court had a “trusted staff of social scientists” to assist the justices in cases involving empirical arguments. The President has a group of trusted advisors; why should the top court in our country not have something similar? An independent council of experts from different fields, able to help the justices analyze and understand the methods and data in a case, could enable the Court to arrive at more appropriate decisions and possibly ease its reluctance to accept such cases. These advisors would not be charged with interpreting and applying the law; they would merely be there to make sure the information presented in a case can be understood by the justices. The Supreme Court’s decisions are far-reaching, and it is imperative that they be made with the proper knowledge and all appropriate evidence available.
    While it is difficult to determine whether “an allergy to statistical evidence” is really behind the justices’ avoidance of certain evidence and cases, the situation still needs a resolution, as it poses a threat to the integrity of the legal system. The legal landscape is changing, and learning to incorporate empirical evidence and other data-related concepts is a complex problem that most likely requires a multi-faceted approach. While a dedicated expert panel seems a reasonable response to the issues the justices face, other solutions should be given consideration as well. As the article notes, maybe the current law school curriculum needs to be adjusted so that students learn a different mindset for facing issues outside their expertise. A more difficult adjustment, since a justice holds an appointed position, would be to alter the qualifications, experience, and attitude sought in future justices. Regardless of the actions taken, the “allergy to statistical evidence” needs to be addressed, because complex situations involving data will not be disappearing anytime soon.

  4. Cristina O. November 11, 2017 at 10:39 pm #

    This is a very interesting article that highlights how much has changed over the past few generations. Society has become extremely data-driven, with studies of all kinds being conducted on any number of subjects. That the justices seem wary of trusting such evidence would make anyone question their motives.

    However, as we have seen in the past few presidential elections, even statistics can be manipulated, depending on the samples used to conduct the studies or surveys. I can understand how seeing that kind of cherry-picking would make someone distrust the results they are given, based on the motives of those who conducted the studies.

    The best solution is probably the one offered here: appoint the Supreme Court some kind of committee not only to digest the information but also to test the validity of the studies presented. This could be done in an unbiased fashion by giving the committee set parameters, while possibly withholding anything that would identify which way a study was leaning. If the committee could prove a method sound, the justices should have no issue trusting the results, regardless of their personal biases or motivations.

  5. Jimmy Bedoya November 18, 2017 at 3:19 pm #

    Mathematics, standing as a universal language, is the only language shared by all of humanity regardless of culture, religion, or gender, because its numbers, formulas, and patterns follow set rules that apply everywhere on the planet. In the new age of technology, with a constantly growing population, statistics and computation are necessary to evaluate a situation efficiently and reach a correct decision. Yet, as this article reports, the Supreme Court does not compute. The justices seem reluctant to work through the numbers a case requires and neglect statistical research once it is provided. For quite a while the court has been found to struggle with quantitative evidence of all sorts, as can be seen in a number of the cases it has overseen. Sometimes the court misinterpreted the evidence; in other, more disappointing cases, it simply ignored the evidence in order to follow more traditional legal arguments. With the world becoming more computationally driven, the way the justices act on such findings and research must change if they are to evaluate each and every case efficiently.

    The issue came to a head when Gill v. Whitford, a case about the future of partisan gerrymandering, came before the Supreme Court. The case, in essence, revolves around math: can a map’s partisan bias be measured, and can that measurement be used to create a standard for when a gerrymandered map infringes on voting rights? There is in fact a way to measure this, known as the efficiency gap: a straightforward calculation in which you take the difference between each party’s wasted votes and divide it by the total number of votes cast. The calculation may sound intimidating, but it is simple to apply and yields an exact answer (a short sketch of the arithmetic follows this comment). Even so, half the justices recently spoke of feeling anxious whenever calculations were used to answer questions of bias and partisanship. Some said the math was “unwieldy, complicated, and new-fangled”; some even dismissed such calculations as “baloney” and sociological “gobbledygook.” These opinions come from the minds of Harvard graduates, and Supreme Court justices are supposed to be the fairest and most mindful judges in the nation. In my opinion, it seems these justices are avoiding the calculations as a means of ruling in favor of those who oppose the statistical evidence. If that is not the case then, as the article suggests, the Supreme Court could easily hire a trusted group of social scientists to run these statistics for them, sparing the justices any “anxiety” over the truth. Whatever the Supreme Court’s decisions may be, the justices must learn to treat these calculations as seriously as any legal document, to avoid conflict and injustice.
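A minimal sketch of that calculation in Python, using hypothetical vote totals; the wasted-vote convention here (every vote for the loser, plus the winner’s surplus beyond a bare majority) follows the efficiency gap’s standard definition, and the sign of the result simply indicates which party the map favors:

```python
# A minimal sketch of the efficiency gap calculation.
# All vote totals are hypothetical; a two-party race is assumed.

def wasted_votes(votes_a, votes_b):
    """Wasted votes in one district: every vote for the loser,
    plus the winner's votes beyond a bare majority."""
    threshold = (votes_a + votes_b) // 2 + 1  # votes needed to win
    if votes_a > votes_b:
        return votes_a - threshold, votes_b
    return votes_a, votes_b - threshold

def efficiency_gap(districts):
    """districts: a list of (votes_a, votes_b) pairs, one per district.
    Returns (wasted_b - wasted_a) / total votes, so a positive value
    means Party B wasted more votes (the map favors Party A), and a
    negative value means the map favors Party B."""
    wasted_a = wasted_b = total = 0
    for va, vb in districts:
        wa, wb = wasted_votes(va, vb)
        wasted_a, wasted_b = wasted_a + wa, wasted_b + wb
        total += va + vb
    return (wasted_b - wasted_a) / total

# Hypothetical five-district map: Party A's voters are packed into
# two districts and narrowly outvoted in the other three.
districts = [(70, 30), (70, 30), (35, 65), (45, 55), (45, 55)]
print(f"efficiency gap: {efficiency_gap(districts):+.1%}")
# -> efficiency gap: -16.2% (the map favors Party B, even though
#    Party A won 53% of all votes cast)
```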

  6. Rebecca Hu December 1, 2017 at 3:53 pm #

    I believe the beauty of math lies in the fact that through calculation people can reach a precise answer, and that answer is universal. In simple terms, it is a fact: everyone who uses the formula arrives at the same conclusion. Using math and data in the Supreme Court is a good idea; however, law is a very subjective matter. It comes down to a group of people deciding a case, and humans are not math machines. People are known to be imprecise, which is why we build machines to do the precise work.
    Judging from past cases, the Supreme Court does not feel comfortable applying mathematical calculations to evidence. This is a very interesting topic; I would assume that since the Supreme Court is the highest federal court, its members should be good at everything, or at least have consultants who are experts in different fields. There are a lot of things people are not good at or do not like, but that is no justification for not caring at all. Just ten years ago, lawsuits did not heavily involve technology. Technology has improved at such a fast pace that the law is not keeping up with it, as in the topic we are discussing now regarding workplace privacy. Technology is improving our lives in general, but it is also making lawsuits more complicated.
    The law should be updated with the progress of time, and so should the court system. Math and statistics will become major tools for understanding a world centered on technology, and only they can offer an unbiased view of a topic. Since we rely on the Supreme Court to settle disputes, and its decisions are final and very important, it should be the Court’s responsibility to ensure that every method is looked into. The people on the court are all intelligent individuals; they should accept new knowledge rather than stick with what was taught in an outdated textbook.
    The excuse that law is complicated can always be used to reject new ideas or undiscovered territory. Math will only become more and more important because of its impact on technology; the programs we use are written in different languages that work much like math formulas. Math and data can give us precise information, and we should not shy away from it. Used carefully, data analysis can assist the court and probably lead to higher efficiency. In math there is only correct or incorrect, which matches the character of evidence: the defendant either committed the crime or did not.

  7. Vincent Scorese December 8, 2017 at 9:11 pm #

    I find it very peculiar that something as concrete and simple as mathematical data could be ignored by the Supreme Court in the way it reaches decisions and rules on them. Whenever someone presents an argument or tries to prove a point, data or evidence is the main thing each party offers the deciding body, to show that the point is not merely an opinion but a matter of fact that should be respected by the court in the way it rules. Whether the numbers concern the fact that two plus two equals four or quantitative measures of gerrymandering, numbers are very important.
    I understand that in certain circumstances the data or numbers in a case are not as important, but that does not seem to be true very often, because mathematics is a standard language everyone can understand, and concrete numbers draw a very fine line that is not very subjective when deciding whether something has been violated. Twenty-five miles per hour means twenty-five miles per hour, and failing to stay at or under it is quite obviously a violation with consequences handed out. The measure at issue here is just as concrete: the “efficiency gap is defined as the difference between [political parties’] wasted votes, divided by the total number of votes cast in the election.”
    Gerrymandering is very important to us because of the way it can shape future elections, and those elections will shape the policies and laws we live under for years to come. Even if fewer voters favor one party, the map can be drawn so that the minority wins the vote, depending on where the lines fall. The Supreme Court justices should take the quantitative statistics in this case very seriously because of how important and landmark it is going to be; ignoring the numbers in a scenario where numbers are king is plain negligence, and it will hurt whichever side stood to benefit from those statistics.
    We shall see what happens in such a landmark case and what it shapes for us going forward as a country, with voting being our most fundamental right, one that should not be tampered with to engineer certain outcomes.

  8. Andrew Kuttin February 9, 2018 at 1:26 am #

    With the passage of time comes an inevitable change in the way laws must be interpreted. In its current term, the Supreme Court is having trouble accepting this reality when it comes to using math and advanced statistics in its decisions. Of the several examples the article gives, the incorporation of the efficiency gap into the coming Gill v. Whitford decision receives the most attention. The case concerns gerrymandering charges against the Republican-led Wisconsin legislature. The math at its core is a statistic called the “efficiency gap,” which takes the difference between the two parties’ wasted votes in an election and divides it by the total number of votes cast (a worked illustration follows this comment). The goal of this number is to create a concrete way to measure the partisan bias built into the way districts are drawn on an electoral map. The article mentions that several of the Supreme Court’s justices are not taking this analytic statistic seriously. One justice called it “baloney” and claimed that the difficulty involved in understanding it would “erode the legitimacy of the court.” Another simply opted to refer to it as “gobbledygook.”
    The justices’ aversion to an analytic statistic feeds into the inherent issue I have always had with the Supreme Court: lifelong appointments inevitably leave outdated minds tasked with determining legal meaning in an ever-changing world. However merited the legal minds on the court were at the time of their appointment, there are members well into their 80s currently serving, one of whom was originally appointed by Ronald Reagan. At a certain point, whether a justice can still properly understand and interpret the world they live in needs to be questioned. Today’s world can collect massive amounts of data and analyze it in ways never before possible. In the Whitford gerrymandering case, some sort of statistical standard is the only true way to measure the bias drawn into voting districts. If the court rules that gerrymandering is indeed a constitutional issue subject to judicial review (which I believe it will), then it will need to take into consideration this “computer stuff,” as Justice Breyer calls it.
    This is why I believe a Supreme Court nomination should no longer be valid for life. I acknowledge that at the time of the court’s creation in the 1700s, a lifetime post was needed to entice justices to accept the position. In 2018, all a lifetime appointment does is put our nation’s most delicate legal issues in the hands of potentially outdated legal minds. The article makes a good recommendation for remedying this math phobia in the future by calling for a curriculum change in top law schools. While that is a valid antidote for the future, in the immediate term I think the tenure of Supreme Court justices should be limited in some fashion.
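To make the wasted-vote arithmetic concrete, here is a brief worked illustration with hypothetical numbers. In a 100-vote district that Party A wins 70 to 30, the bare majority is 51, so Party A wastes 70 − 51 = 19 votes (its surplus beyond the majority) and Party B wastes all 30 of its votes (every vote cast for a losing candidate counts as wasted). Summing each party’s wasted votes across all districts and dividing the difference by the total votes cast yields the efficiency gap; a persistently lopsided gap is the statistical signature of a map that packs and cracks one party’s voters.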
