The good, the bad and the ugly big data

Helge Helguson Neumann
ESST MA Student

There is a hurricane on its way. You run to the store to stock up on essentials, preparing for the worst. What do you buy? Strawberry Pop-Tarts, apparently.

Using big data, Walmart found out that Americans buy seven times as many Pop-Tarts whenever a storm is brewing. Also using big data, Google guides you away from traffic jams by collecting millions of cellphone signals. Facebook recommends top stories based on its vision of you; a vision consisting of codified numbers generated by the actions of you and your kind. The atoms of cyber-you. With these data, companies are able to make sense of a world that does not even make sense to us. But is it all put to good use? Here’s a good, a bad and an ugly example of how big data can be used.

Big data works as follows: you take a huge amount of data and run algorithms over it to find some sort of pattern in the mess. It is used for trivial questions, such as fast-food preferences at different times of the day (at night, women like Thai food, men like Turkish, and everybody loves pizza), but also for morally fraught choices, like categorizing people. The data itself is not dangerous, but our interpretation and use of it can have dire consequences. When we choose to use big data for big decisions, we need to take into account that big data can be accurate and helpful, but also unfair and racist.
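In its simplest form, that pattern-hunting is just grouping and comparing. Here is a minimal sketch of the idea in Python; the table, column names and numbers are invented for illustration, not Walmart’s actual data.

import pandas as pd

# Invented transaction log: units of each product sold, with and
# without a storm warning in effect.
sales = pd.DataFrame({
    "product": ["pop_tarts", "pop_tarts", "milk", "pop_tarts", "milk", "pop_tarts"],
    "units": [7, 9, 3, 8, 4, 1],
    "storm_warning": [True, True, True, True, False, False],
})

# Average units sold per product under each condition.
pattern = sales.groupby(["product", "storm_warning"])["units"].mean().unstack()
pattern["ratio"] = pattern[True] / pattern[False]
print(pattern)  # a large ratio flags a storm-linked buying pattern

Real systems churn through billions of rows rather than six, but the logic is the same: split the data by some condition and look for differences too large to be chance.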

“Big data” has become a buzzword in recent years, largely due to its promising possibilities for firms and customers, governments and citizens. It helps Walmart stock up on Pop-Tarts when a storm is brewing. It helps the Norwegian government reveal fraudulent behaviour in the welfare system. But perhaps most promising is its use within the health sector. Using big data from search engines and deep-learning artificial intelligence, Google is showing promising results in predicting cancer before doctors are able to. How do they do it? By looking at tons and tons of search data to find patterns linking search terms to cancer patients. This is perhaps the biggest advantage of big data: the ability to find interesting and useful correlations where we previously didn’t know there were any.

Not all big data reveals such correlations. In some cases, the data is highly uncorrelated but is still used for decision-making. Back in 2012, Sarah Wysocki started her job as a teacher in Washington, D.C. After some time, she was evaluated by her superior and scored highly: she was motivating, good at teaching, and the kids liked her. Two weeks later she was fired. According to a recently implemented teacher evaluation system, IMPACT, she was not suited to be a teacher. IMPACT was put in place as a supplement to human evaluation, using statistics to better decide which teachers to hire and fire. Unfortunately for Wysocki, the evaluation paid more attention to the data than to the person, and the data said she was not effective enough. Yet when looking at the data, Wysocki had reason to be upset. A scatterplot of the “effectiveness” scores showed little sign of a consistent pattern and looked more like a starry night in the desert. The data did not reveal a significant correlation: r = 0.25, which is about the same as the correlation between height and ice-cream preferences.
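To get a feel for how weak an r of 0.25 is, you can generate two mostly unrelated series and measure the correlation yourself. The data below is synthetic, standing in for year-to-year scores in general, not Wysocki’s actual numbers.

import numpy as np

rng = np.random.default_rng(0)
year_one = rng.normal(size=500)                    # e.g. first-year scores
year_two = 0.25 * year_one + rng.normal(size=500)  # mostly independent noise

# Pearson correlation coefficient between the two series.
r = np.corrcoef(year_one, year_two)[0, 1]
print(f"r = {r:.2f}")  # comes out near 0.25; a scatterplot of this looks like noise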

Misusing big data can have a great impact on individuals, costing them their jobs or excluding them from insurance policies. It is all based on the atoms of cyber-you: what you are, what you have done, and, even more importantly, what people similar to you have done. When those characteristics include being black, poor, and American, you are, according to big data, in trouble.

American police departments have started to use predictive analysis based on big data sets. However, the statistics fed into these data sets are not always trustworthy. For example, while black and white Americans smoke marijuana at the same rate, black smokers are four times as likely to be arrested for it. Similarly, the police patrol poor, black neighbourhoods more frequently. More patrols lead to more arrests, and more arrests mean more data points. And once in prison, chances are you will return.

Courtrooms in America are increasingly using big data to determine the likelihood that a defendant will commit future crimes. ProPublica, a nonprofit investigative newsroom, dug into the numbers and found some disturbing news. The data were highly biased against black people, automatically predicting that black defendants were more likely to commit future crimes. Not only were the predictions barely more accurate than a coin toss, they were twice as likely to falsely accuse black people. Recently, American courtrooms have taken it a step further by using algorithms in sentencing. And because the algorithms are proprietary, the defendant is not able to question the sentencing or see how the algorithm works or what data has been put into it.
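The disparity ProPublica measured is a false positive rate: among people who did not go on to reoffend, how many were nonetheless flagged as high risk? A minimal sketch of that calculation, on invented toy numbers rather than the actual case records, looks like this:

def false_positive_rate(flagged_high_risk, reoffended):
    # among those who did NOT reoffend, the share wrongly flagged
    innocent_flags = [f for f, r in zip(flagged_high_risk, reoffended) if not r]
    return sum(innocent_flags) / len(innocent_flags)

# Toy data: 1 = flagged high risk / did reoffend, 0 = not.
group_a_flags = [1, 1, 0, 1, 0, 0]
group_a_truth = [1, 0, 0, 0, 0, 1]
group_b_flags = [1, 0, 0, 0, 0, 1]
group_b_truth = [1, 0, 0, 0, 0, 1]

print(false_positive_rate(group_a_flags, group_a_truth))  # 0.5: half wrongly flagged
print(false_positive_rate(group_b_flags, group_b_truth))  # 0.0: none wrongly flagged

Two groups can face very different rates of false accusation even when the overall accuracy looks similar, which is exactly the pattern ProPublica reported.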

Big data carries an inherent risk of making history repeat itself. We use data from past experiences to predict the future, and in doing so we nudge the future in that direction. If Walmart finds out we bought Pop-Tarts during the last storm, it will put Pop-Tarts by the cashier before the next storm, making us more likely to buy them. If the police use data sets from poor, black neighbourhoods, chances are those same neighbourhoods will be targeted next. In the end, big data is only numbers. It is we who must interpret the numbers and decide which ones to use and which ones to discard.
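A toy simulation makes that loop concrete. In the sketch below (my illustration, not a real policing model), two neighbourhoods have the same underlying crime rate, but patrols follow last year’s recorded arrests, so the initial skew in the data never corrects itself.

crime_rate = 0.1                                      # identical in both neighbourhoods
recorded = {"neighbourhood_a": 12, "neighbourhood_b": 8}  # historical skew in the records

for year in range(1, 6):
    total = sum(recorded.values())
    # patrols are allocated in proportion to last year's recorded arrests
    patrols = {n: 100 * recorded[n] / total for n in recorded}
    # arrests you record depend on where you look, not where crime is
    recorded = {n: patrols[n] * crime_rate for n in recorded}
    print(year, {n: round(p) for n, p in patrols.items()})

# The initial 60/40 split in patrols locks in year after year,
# even though the underlying crime rate is equal everywhere.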

Photo: © posteriori/Shutterstock

Forever a Pornstar, Software Says

Jørgen Tresse
TIK MA student

By any benchmark, Julie is a normal teenager in school. Yet in the course of a single day, a series of unfortunate events left her social and private life in ruins. These events also led to her transferring schools and suffering severe psychological trauma. What happened?

I made up the previous paragraph, but it is nonetheless based on true stories. Unfortunately, this is something that does happen.

It should come as no surprise that teenagers engage in sexual activities. In the twenty-first century, however, these intimate activities do not necessarily stay exclusive to the people involved. With the advent of social media and a growing norm of sharing every detail of your life on the Internet, acts that may have been poorly thought through, or at the very least meant to be private, can be filmed or photographed and shared with hundreds of people within minutes. As Aftenposten shed light on through a series of articles in the fall of 2017, it is not uncommon for youths to share photographs and videos of sexual activities involving their peers. It even seems to happen regularly, and involves a wide variety of young people. A common thread is that the persons exposed, often young girls pressured into an act, are unaware of the sharing. It is also striking how opposite the reactions boys and girls receive are: peers praise the boy as a man while labeling the girl a slut. The photo or video in question can be shared with the whole school within a day, and can spread even further, possibly ruining a person’s social life through crowd judgement in the process.

There are popular porn niches devoted to material where the identities of the persons engaging in sexual acts are known. The allure of recognising an actress is not lost on the Internet: finding out the identity of people in pornographic videos or gifs is a hobby and skill several people pride themselves on. For example, there are communities on the social news aggregation website Reddit where users help each other determine the name of an actress from a specific pornographic clip, and services like Pornstar.id where you can reverse-image-search porn stars. Recently, Pornhub, as of October 2017 ranked the 20th most visited Internet site in the US, announced that it is piloting AI-based software that identifies specific porn actresses in clips. This is supposedly so that users can more easily find their favourite actresses and fetishes, and Pornhub claims that it will only use the software on professional actresses. However, as several privacy advocates have pointed out, these new features should be worrying.
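Pornhub has not published the details of its system, but face identification tools generally follow the same recipe: a neural network turns each face image into an embedding vector, and a new face is matched against a database of known embeddings. The sketch below illustrates that generic technique; the embeddings are random stand-ins and the names invented, none of it from any real system.

import numpy as np

rng = np.random.default_rng(1)
# Random vectors standing in for the output of a real face-embedding model.
known_faces = {
    "performer_a": rng.normal(size=128),
    "performer_b": rng.normal(size=128),
}

def identify(query, database, threshold=0.8):
    # Return the known identity whose embedding is most similar to the
    # query face, measured by cosine similarity; None if no close match.
    best_name, best_score = None, -1.0
    for name, emb in database.items():
        score = np.dot(query, emb) / (np.linalg.norm(query) * np.linalg.norm(emb))
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

query = known_faces["performer_a"] + 0.1 * rng.normal(size=128)  # a noisy new image
print(identify(query, known_faces))  # -> "performer_a"

The worrying part is that nothing in this recipe cares whether the database contains professional actresses or private individuals scraped from social media.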

Teenagers sharing videos and pictures of each other is devastating for those involved, but unfortunately it is far from the only non-consensual sharing occurring. “Revenge porn” is a category of porn in which jilted exes share intimate and private content without their former partner’s consent, with the intent of shaming or hurting them. This does not just affect an unlucky few: some surveys have found that as many as 23% of respondents, overwhelmingly women, have been victims of revenge porn, with pictures and videos spread across an estimated 2,000 websites worldwide dedicated to the genre. Often this is accompanied by doxxing, the release of private information such as full name, address, telephone number and more, opening the door for widespread abuse. While there are efforts underway to limit the damage from such incidents (Twitter, for example, is banning profiles that engage in these activities, and the American Congress is considering making doxxing a federal offence), it is easy to see how software such as Pornhub’s may exacerbate the problem.

It is common for a technology developed for a certain use to be applied in other areas or by actors with other needs. Viagra, for example, was originally intended as a heart medicine, Listerine as a cure for gonorrhea, and the Frisbee as a pie container, but none of these uses are what they are best known for today. Serendipitous discoveries happen a lot in science and technology, but one does not always simply stumble upon a new use; actors with malicious intent can actively search for ways to warp a technology to fit their needs.

If facial recognition AI is applied to private videos like the ones discussed here, it can connect those videos to a person’s full digital profile, making personal information all the more accessible. Aftenposten focused on a youth culture whose actions have ended up in the courtroom, but for Julie, the girl who had to transfer schools, that may be small consolation. Starting over somewhere new, or waiting until the content is forgotten, is hard enough as it is without the content becoming easier to find and tagged to a person so that it can follow them through their whole life.

Common sense tells us that anything shared on the Internet is on the Internet forever. The best we can hope for is privacy through hiding in the massive overflow of content that is out there.

Photo: © Andrii Zastrozhnov/Adobe Stock

Apocalyptic Blindness and the Atomic Bomb

Hannah Monsrud Sandvik
ESST MA Student

The mere existence of the atomic bomb carries with it the possibility of the complete annihilation of all forms of life. Through an investigation of the nature of the bomb, we can better understand technology and the effects machines have on our lives.

Technology is persistently praised for its ability to connect and unite us. In perhaps no case is this more apparent than with the atomic bomb, which in an absolutely inclusive sense affects us all simply by existing. The escalating power struggle between the US and North Korea, and recent reports that the latter has successfully tested hydrogen bombs, only serve to underline that the current atomic situation should be our greatest worry.

Few have written as extensively and profoundly about the atomic bomb as the Austrian philosopher Günther Anders (1902-1992). For Anders, the dropping of the atomic bomb on Hiroshima on August 6th, 1945, marked the beginning of an era where the entire world at any moment could be turned into post-nuclear ashes. The atomic bomb is more than a weapon of mass destruction: because the bomb makes it possible to obliterate all life on earth, we are confronted with a new existential condition. As Anders writes, “the possibility of our final destruction is, even if it never happens, the final destruction of our possibilities.” (My translation.)

In the 1960s, Anders started a correspondence with Claude Eatherly, the American reconnaissance pilot who declared the weather conditions satisfactory for dropping the bomb. Their letters were subsequently published in the book Burning Conscience, a collection of correspondence reflecting on the human condition in the atomic age [1]. Eatherly was the living example of everything Anders thought about the bomb. After Hiroshima, Eatherly was celebrated as a war hero, but he struggled to come to terms with his role in the bombings. He subsequently attempted suicide, went through a divorce and committed several armed robberies, though he never actually stole anything. In Anders’ view, these were acts of repentance: a way of seeking a punishment Eatherly felt he deserved but did not get.

The reason the Eatherly case is so interesting is that it shows how technology turns us into cogs in large machineries and removes us from the relation between cause and effect. Anders calls the gap between our ability to imagine something and our ability to produce it the Promethean gap [2]. The fact that I push the button seems unrelated to the fact that millions of people die as a direct result. It is paradoxical that pushing a button is easier than killing one single person, but this is the case because the larger the possible effect of an act, the more difficult it becomes to imagine that effect. Adolf Eichmann, one of the lead organizers of the Holocaust, used this line of argument to claim that he was not guilty for the role he played in murdering millions of Jews: he was merely following his superiors’ orders. In the Eatherly letters, Anders turns the argument around. Morally speaking, Anders argues, there is no such thing as ‘mere co-acting’: whatever we partake in doing, promoting or provoking is being done by us, and using Eichmann’s excuse amounts to abolishing the freedom of moral decision and the freedom of conscience. Eatherly’s feeling of guilt, therefore, was an entirely appropriate response.