More Than A Number: Bearing the Image of God in the Age of Big Data
by Luke Jacobs
I don’t mean to alarm you, but there’s a drunk driving this car. He is in the driver’s seat only because we put him there. We are headed for destruction, but if we are courageous, there is a way out.
The drunk that society has put at the wheel is a technology called a “Big Data algorithm.” This technology steers culture recklessly, without respect to humanity. As Christians, our job is to transform culture in service to God. We must “not be overcome by evil, but overcome evil with good.” Because this is our role, and because Big Data algorithms are major cultural influences, we must do something about them.
A Big Data algorithm is a defined, mathematical procedure that processes a vast quantity of data and outputs an analysis of that data, whether that be a prediction, a score of value, or some other judgment. One of the biggest strengths of Big Data algorithms is their ability to learn complicated problems through trial and error. They can learn from training problems so that they can be deployed on real problems in the future. This is like a child preparing for a test by flipping through flashcards and attempting practice problems. By doing this, the child can quickly learn the areas where they are confident and where they lack knowledge. Programmers apply this same process to train algorithms to do tasks like facial recognition or answering Jeopardy questions. In those cases, the “practice problems” would be pre-identified faces and Jeopardy answers. A Big Data algorithm can also be used to find patterns in data, without actually knowing what it is looking for. This would be equivalent to showing a child a number sequence and asking him or her to find a pattern in it. Today, algorithms can find patterns in phenomena like traffic congestion and the stock market much faster and with greater accuracy than humans. These uses are just the tip of the iceberg; engineers are applying this technology to new fields every day.
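The flashcard analogy can be made concrete with a toy sketch. Everything below is invented for illustration (the data, the single-threshold model, and the learning rule are far simpler than any real system), but the shape is the same: the algorithm adjusts itself on labeled practice problems, then is deployed on an example it has never seen.

```python
# A toy "learning from practice problems" loop: the algorithm owns a
# single threshold and nudges it every time it misses a training example.

def train_threshold(examples, steps=1000, lr=0.01):
    """examples: list of (value, label) pairs, where label is 0 or 1."""
    threshold = 0.0
    for _ in range(steps):
        for value, label in examples:
            prediction = 1 if value > threshold else 0
            # A missed practice problem nudges the threshold toward a fix.
            if prediction != label:
                threshold += lr if prediction == 1 else -lr
    return threshold

# "Flashcards": values below 3 are labeled 0, values 3 and above are 1.
training = [(1.0, 0), (2.0, 0), (3.0, 1), (4.0, 1)]
t = train_threshold(training)

# Deployment on a "real problem" the algorithm never practiced on:
print(1 if 3.5 > t else 0)
```

The learned threshold settles between the two labeled groups, which is all "learning" means here: the rule was shaped entirely by the practice data it was shown.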
One specific example of a Big Data algorithm is a recommendation engine. A recommendation engine provides customers of a business with tailored recommendations for products they might be interested in purchasing. This algorithm belongs to the “Big Data” category because it requires a large amount of data to make a decision on what to recommend. Most recommendation algorithms require not only all of one customer’s shopping history, but the shopping history of every customer who has bought the products in that one customer’s shopping history. The algorithm needs this information because it needs to be able to look at customers with similar interests to predict what this one customer might be interested in buying in the future. The more data that this algorithm can process, the more customers with similar interests it will generally find. This is one of the reasons why large companies value customer data. The more data they collect, the more fine-tuned their Big Data algorithms will become.
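The "customers with similar interests" idea described above is the core of collaborative filtering. Here is a minimal sketch of it; the customers and products are invented, and real engines use far larger data and more sophisticated similarity measures, but the logic is the same: weight other customers by how much their history overlaps with yours, then recommend what they bought and you have not.

```python
# A minimal collaborative-filtering sketch over invented purchase data.
from collections import Counter

purchases = {
    "alice": {"tent", "stove", "lantern"},
    "bob":   {"tent", "stove", "sleeping bag"},
    "carol": {"novel", "bookmark"},
}

def recommend(customer, purchases):
    """Recommend items bought by customers with overlapping histories."""
    mine = purchases[customer]
    scores = Counter()
    for other, theirs in purchases.items():
        if other == customer:
            continue
        overlap = len(mine & theirs)   # shared purchases = similar interests
        for item in theirs - mine:     # items the customer doesn't own yet
            scores[item] += overlap
    return [item for item, score in scores.most_common() if score > 0]

print(recommend("alice", purchases))
```

Bob shares two purchases with Alice, so his sleeping bag is recommended; Carol shares nothing, so her books never surface. This is also why more data helps: with only three customers, one overlapping shopper decides everything.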
Because of Big Data algorithms, jobs that have only been performed by humans for millennia will soon be performed by a different operator: a computer. Even now, algorithms are performing tasks like agricultural management, a job we have been working since the beginning of time. Because we are passing the responsibility of judgment onto algorithms, we Christians need to turn our eyes to the evaluation of the technologies that are doing our jobs for us.
It is paramount to discuss how Big Data algorithms affect our culture. They will only become more prevalent in this world, so Christians need to know how to respond to them. These algorithms can influence the course of an individual’s life, such as what jobs a person has access to or what hospital treatments they receive, so we must approach them with seriousness and responsibility. They are like any other tool, which means that their end goal must be carefully controlled by human intelligence. We give them the power of making judgments in our absence, so we must make sure they are held to high standards of morality.
Holding these algorithms accountable begins with the ability to see what they are doing, but the growing rift between users and engineers hampers our ability to test them. Because many Big Data algorithms require massive amounts of computing power, they are mostly employed by large companies and government organizations. This means that there are some algorithms that the average person would be unable to test, since the average person does not have access to a supercomputer. This is problematic, because if we cannot test these algorithms, we cannot be certain that they are performing in a moral manner – or even working as their engineers intended. The inability to test Big Data algorithms opens a rift between the average person and the specialists who make the algorithms. The distance between us and the production of these algorithms should give us even more cause for concern about how they are being used. The average American has little to no knowledge of the processes that use their data. This leaves the responsibility of algorithm development solely with engineers. If we fail to apply sufficient restraints to these algorithms, we risk producing a system that harms society and ignores Christian values.
A harmful system would thrive under the policy of neglectful acceptance, the view of technophiles. Neglectful acceptance is the belief that technological progress is inherently good and that the cultural effects of new technology are not important. It is the belief that all technology is beneficial, as long as it is faster, smarter, or more convenient than a previous technology. It is the glorification of technique and progress without regard to long-term consequences. This policy is an extreme, naive view of technology.
The opposite of this belief is the Luddite view, a hatred of technology. During the Industrial Revolution, the Luddites were a group of lower-class workers who destroyed the machines that they were employed to operate. They hated technology because they saw it as the enemy. However, they did not understand that the enemy was actually the greed of capitalists. The Luddites hated technological progress and technophiles. In the modern world, a Luddite would be someone who rejects any technological progress, even if it could be put to good use. They instinctively condemn technology because they do not take the time to assess its uses.
The middle ground between technophiles and Luddites is the centralist position. Centralists believe that technology can be used for good and bad purposes. They believe we should assess technology as we come across it, looking both at the structure of the technology and the effects it would likely cause after being introduced to society.
My position aligns most closely with the centralist position. I believe that technology does not reduce the consequences of sin and death or bring us closer to God, so we should not hope for salvation through technology. However, technology is necessary in a cursed world and useful for loving our neighbors. Big Data algorithms can be used for God-honoring purposes, but we should only implement them if they do not promote a worldview that redefines the Image of God in man, namely Consumerism, Superficiality, or Totalitarianism.
Technology Is Not the End Solution
Technophiles incorrectly believe that technology is the end solution to humanity’s problems. Technology will never repair the rift of sin that is passed down from generation to generation. The Bible makes it obvious that faith is all that is needed to be saved, so technology is irrelevant to our salvation. However, from the beginning of America, we have included technological progress in our salvation narratives, and this error has yielded problematic views of technology. These views are relevant to us because much of the world believes in them, including engineers working on Big Data algorithms.
Christopher Columbus believed that technological and exploratory progress would lead to the second coming of Christ. The Religion of Technology explains that Columbus held the millenarian mentality that humanity could actively bring forth its own recovery. The book states, “Columbus, master of the marine arts, thus identified his epoch-making technical achievement with the ultimate destiny of mankind.” He firmly believed that, by discovering the New World, he was fulfilling a prophecy that needed to be completed before the end of the world. He believed that it was by sailing technologies that he, and humanity by extension, was able to find the Americas, which he referred to as the New Eden. To Columbus, technology was the means to initiate the second coming of Christ, the apocalypse. In his mind it was only a matter of time until the remaining prophecies were completed, and then the world would end. Despite Matthew 24:36, “But concerning that day and hour no one knows,” millenarians like Columbus believed that their efforts in fulfilling prophecies could bring the world to completion.
The belief that we, humanity, have a hold of our own destiny – and that this destiny is achieved through technology – is even more prevalent today. The author and inventor Ray Kurzweil is a passionate proponent of this worldview. His book How to Create a Mind is a scientific work that discusses how we can recreate or simulate the human mind. He argues that a time will soon come “when a thousand dollars’ worth of computation will be trillions of times more powerful than the human brain.” He says that, due to a law that he calls “The Law of Accelerating Returns,” humanity will become more intelligent at an exponential rate and will soon be able to fix enormous problems like poverty, the limits of our planet, and possibly even death. His worldview is rooted in the imperatives of natural selection, that we must adapt to our changing environment or else we will go extinct. Part of this adaptation involves building better computer systems that are more efficient and can solve more problems. The last few lines of the book explain the end goal of humanity from the perspective of a technophile: “If we can transcend the speed of light … it could be achieved within a few centuries. Otherwise, it will take much longer. In either scenario, waking up the universe, and then intelligently deciding its fate by infusing it with our human intelligence in its nonbiological form, is our destiny.” Kurzweil sees our salvation as something that is within our reach and will be accomplished by our own effort.
The technophile belief, exemplified in both Columbus and Kurzweil, is that the advance of technology brings us closer to taking control of the physical world. They believe that it is by human effort that the world will be saved. Nothing could be further from the truth. Every problem that we face spawns from sin, and we can do nothing on our own to reverse it. Ephesians 2:8-9 says, “For by grace you have been saved through faith. And this is not your own doing; it is the gift of God, not a result of works, so that no one may boast.” Our salvation comes through God reaching out to us, not through our own determination. Because only God can rescue us, it is useless to look to technology as our savior.
Some Technology Is Necessary
Although technology will not save us, we cannot adopt the equally radical Luddite view. We need to develop and use some technologies, or else we are not being good stewards of what God has given us. We should not eliminate all technology, because some technologies are necessary in this post-Fall world. To see how we should use technology today, let us look at technology in the time of the Garden, when sin had not entered the world and when we had the blessings God had intended for us to enjoy.
“And out of the ground the Lord God made to spring up every tree that is pleasant to the sight and good for food,” describes Genesis 2:9. This verse addresses the abundance of the Garden of Eden. God placed us in a perfect land that satisfied our every physical need. Because we had our physical needs satisfied, we had no need for technology of any kind. The modern theologian Jacques Ellul explains that because the garden was perfect, tools were unnecessary. He writes:
Just as Adam did not have to institute religion or magic in order to establish or regulate his relationship with God, because he spoke with God face to face; and just as there was no protocol or sacrifices; so Adam did not have to use any method to contact nature, to make use of plants, to lead the animals. While ruling over it, he was in communion with the entire whole to which he belonged.
Ellul says that because creation was completely unified (as anything that God calls “good” would be), “there was no possible distinction between ends and means,” and thus technology was unnecessary. This means that if Adam or Eve wanted food, they just had to reach out and grab some from the plants around them. Because God’s curse of labor had not yet come into effect, their work was easy, and they worked with their hands, having no need for technology to complete their tasks.
This was the ideal, but we chose sin, and God justly took away the Garden. We had to toil so that we had enough to eat, since we no longer had access to the bounty of the Garden. Developing technology became an unfortunate necessity.
Big Data Algorithms Can Help Society
Because having too much faith in technology leads us away from reliance on God, and completely ignoring technology is impossible given the curse of labor, we must carefully choose which technologies we should use and which we should discard. Big Data algorithms are not a technology we should completely discard, because we can use them to love our neighbors. Their most prominent use is in the medical field. IBM’s Watson Artificial Intelligence, the same AI that defeated two of the world’s best Jeopardy players back in 2011, has the valuable ability to search through human-written medical data to gain insight into medical problems. Watson is doing the same type of work it did to win Jeopardy: reading a lot. To be competitive in the game, Watson absorbed millions of documents to build a database of knowledge. Now its abilities are being used to comprehend the vast amount of medical data produced by patients. According to IBM, “each person will generate enough health data in their lifetime to fill 300 million books.” No doctor has enough time to spend with all that data! Watson can look at all of that medical data in the same way, unhindered by human fatigue or bias, and give logical diagnoses or treatments based on the evidence it has been given.
Not only are doctors overwhelmed by all the data they have, but they are also overwhelmed by the diversity of diseases that do not have cures. Amir Husain, the author of The Sentient Machine, was a victim of the disease that some call the “suicide headache.” It is a rare, chronic cluster headache that brings so much pain that many sufferers believe their only relief is suicide. What is even worse is that not much research has been invested in its cure, since it affects only a relatively small group of people. Husain writes in his book, a passionate defense of AI, that it was only by “happenstance” that he was healed. One doctor happened to find the right medical paper detailing his specific condition and offered him an uncommon and drastic treatment. This treatment healed his chronic pain and allowed him to return to his life. He argues in his book that if his doctor had not found the exact paper offering him a treatment, he would still be writhing in pain without any hope of healing. However, if an AI were implemented to scan through medical papers for a solution to his specific problem, it would not be by luck that he would be saved, but by intelligence. As Christians, we need to help people like Husain. God has commanded us to serve the suffering, as it is written in 1 Peter 4:10, “As each has received a gift, use it to serve one another, as good stewards of God’s varied grace.” Those who suffer are created in the Image of God, so out of love for God’s creation, we must heal them if we can. Watson can do the job of hundreds of trained doctors in a fraction of the time, revealing new strategies to fight these rare and deadly diseases. Consulting Big Data algorithms like Watson will give us insight into solutions to the pain of our neighbors.
Big Data algorithms extend beyond the medical field. They can also manage physical resources much better than any human. This is especially useful in agriculture, as a substantial amount of food goes to waste due to farming inefficiencies. According to the management consulting firm McKinsey and Company, “Food waste causes economic losses, harms natural resources, and exacerbates food-security issues. About a third of food produced for human consumption is lost or wasted every year in a world where 795 million people—a ninth of the population—go hungry.” They continue, “Cutting postharvest losses in half would produce enough food to feed a billion more people.” Big Data is one of the best methods to cut these losses. Farmers are beginning to place sensors in their fields to monitor soil, wind, and pest conditions. Each sensor provides a wealth of information about a specific location in a field, so that farmers can visualize the real-time needs of their plants. The insight given by Big Data algorithms can not only boost the health of entire fields, but also determine the optimal way to handle food cargo. Say a farmer wants to ship a large amount of lettuce to a part of the country with high demand. Using weather data, train congestion data, and pricing data, a Big Data algorithm can suggest the best route to the destination, perhaps even rerouting the train on the fly. As the human population grows, so must our food production. Big Data is the new management paradigm that will help us meet growing needs and provide for our neighbors most effectively.
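The lettuce-routing decision above can be sketched as a simple cost comparison. The route names, delay figures, and spoilage rate below are all invented for illustration; a real system would pull live weather and congestion feeds and optimize over far more variables, but the underlying decision is the same: fold every data source into one cost per route, then pick the cheapest.

```python
# A toy routing decision over invented data: combine travel time,
# congestion, weather delays, and price into one cost and pick the best.

routes = [
    # (name, base_hours, congestion_delay_hours, weather_delay_hours, price)
    ("northern line", 30, 2.0, 6.0, 1200),
    ("central line",  26, 5.0, 1.0, 1500),
    ("southern line", 34, 1.0, 0.5, 1100),
]

SPOILAGE_COST_PER_HOUR = 25  # perishable cargo loses value as it travels

def route_cost(route):
    name, hours, congestion, weather, price = route
    total_hours = hours + congestion + weather
    return price + total_hours * SPOILAGE_COST_PER_HOUR

best = min(routes, key=route_cost)
print(best[0])
```

Re-running this whenever the weather or congestion numbers change is what "rerouting the train on the fly" amounts to: the cheapest route is recomputed from fresh data.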
Developing better cures and cutting food losses are two great examples of stewarding God’s gifts well. As it is written in the Parable of the Talents, God has given us gifts that he expects us to invest. We can ignore the needs in the world and bury our gifts in the dirt, or we can get to work and try to multiply them. If we do not embrace the use of Big Data algorithms in medicine and agriculture, we are burying our gifts. There are too many rare, painful diseases to rely on human doctors alone for medical research. There is too much food being wasted in the harvesting process. We need to invest in Big Data to help our neighbors. It is by no means a panacea, but it is a reasonable, practical solution to the demanding problems in front of us.
The Three Worldviews
However, as with any tool, Big Data can be used for evil. It can be used to promote the worldviews of Consumerism, Superficiality, and Totalitarianism, all of which redefine the Image of God in man. However, before analyzing how these worldviews fail to see the Image of God in man, it is necessary to define the most important features of the Image.
The three worldviews redefine two key attributes of the Image. They fail to see that humans receive value and satisfaction from God alone. To look to any other thing for value and satisfaction would be to look at ourselves as lesser than who God calls us to be. Since God is the creator of everything that is valuable, to ignore him and to rely on something else for self-value would be foolish. God values us higher than any other being, since “God shows his love for us in that while we were still sinners, Christ died for us.” God also has a plan for us, and that is for us to love him. C.S. Lewis writes, “[I]t would seem that Our Lord finds our desires not too strong, but too weak.” When we turn our gaze from God to other things, we are falling into the trap of desiring too little. We must cling to these two truths and not buy the lies that Consumerism, Superficiality, and Totalitarianism offer us.
Consumerism redefines the Image of God by claiming that we are valued and satisfied by the products we have. Big Data algorithms can promote Consumerism by enabling companies to target customers with tailored messages. One vehicle for this is the “360° view” of a customer. A 360° customer view is a collection of insights about a customer’s interactions with a company. For example, according to Cloudera, a company that offers a Big Data platform, a 360° customer view can “deliver personalized offers,” “drive down customer churn,” and “deliver proactive care.” Personalized offers sound convenient, but they can also encourage the lie of Consumerism. In the book Being Consumed, William Cavanaugh writes, “Although the customer spirit delights in material things and sees them as good, the thing itself is never enough. Things and brands must be invested with mythologies, with spiritual aspirations; things come to represent freedom, status, and love.” Having “spiritual aspirations” associated with a product warps how we look at ourselves and the world around us. Instead of looking to God for our satisfaction, consumerist ads tell us to look no further than the material world. They tell us that their products give us value and satisfaction. Big Data algorithms can become the means to fuel these product mythologies. With the power to know a specific customer and personalize their ads, companies can target that customer’s weaknesses for profit.
Predatory advertising from for-profit colleges is one example of the destruction that Big Data can cause when it is put to use on human weaknesses. The book Weapons of Math Destruction, written by Cathy O’Neil, explains the goal of these colleges. Because for-profit colleges can make profit without reinvesting that money into the school, they are essentially businesses. Like any other business, they need to find their niche, or else they will die off. O’Neil explains that the most common niche for for-profit colleges is preying on the poor for federal loan money. She writes:
They [for-profit schools] sell them [the poor] the promise of an education and a tantalizing glimpse of upward mobility—while plunging them deeper into debt. They take advantage of the pressing need in poor households, along with their ignorance and their aspirations, then they exploit it. And they do this at great scale.
O’Neil goes on to say that a degree from a for-profit college is actually worth less to employers than one from a community college. If this is true, then why would anyone want to enroll in such a place? These colleges are popular because of their intentional marketing through focused advertising. They target the poor with ads that weave a mythology. The ads promise social mobility, and many people take the chance. While these business tactics are more criminal than consumerist, they show that businesses can profit enormously from advertising that targets specific demographics with tailored messages. O’Neil writes that the students of one schooling company, Corinthian Colleges, had an outstanding federal debt that totaled 3.5 billion dollars! It is likely that much of this debt was taken on by students who enrolled because the school’s advertising swayed them. The effectiveness of this scheme shows the power of targeting demographics with a personal message.
Although ads from for-profit colleges do not necessarily promote consumerist living, they do give us a troubling example of the potential future of business. If Corinthian Colleges can reap billions due to the persuasion of targeted ads, then what is stopping other businesses from following in their footsteps? Using Big Data algorithms, companies like Facebook are able to accurately infer sensitive information, even if we do not want that information disclosed. These algorithms can guess attributes like political affiliation or ethnicity from seemingly insignificant data about our interactions with the website. The more a company knows about us, the more it can customize the ads it shows us. These ads, armed with the knowledge of our background, can tempt us with a product in a way that makes us look at ourselves differently. This is the heart of Consumerism. An example of this is Gillette’s advertising campaign titled “The Best Men Can Be.” The ads revolve around the ideal man that Gillette paints for us, which is intended to leave us with a new definition of masculinity, one that Gillette defines and one that inevitably involves buying its products. Ads that try to make us see ourselves through a consumerist lens are trying to change what we think it means to be human.
Big Data algorithms have the ability to take ads and distribute them to the demographics where they will be most relevant or persuasive. Ultimately, the morality of algorithms depends on the content that they facilitate. A company that serves ads that encourage us to redefine our identity is using those algorithms for a consumerist purpose. However, ads can be informative without being consumerist. An ad that tells the viewer about a product is not the same as an ad that tells the viewer that their identity depends on a product. Consumerism is rooted in the message of the ad. Big Data is the means for the delivery of that message, so if the message spreads lies, then the Big Data algorithms that facilitate those ads will harm our culture.
While Consumerism redefines the Image by claiming that human value and satisfaction is found in products, Superficiality redefines the Image by claiming that human value and satisfaction is found in how we appear. Appearance is defined by what the world sees of us. Appearances are more than just the physical. Our personality, the job we have, our successes and failures, and our human relationships are all appearances we carry around with us. These appearances are important to our lives, but they do not give any indicator of our value. We are valuable because we have the Image, regardless of how we appear to others. Superficiality, on the other hand, teaches that we are our appearances. It lies by claiming that our human worth is defined by how we appear to the world, and that we will find satisfaction in making ourselves appear more valuable to society.
Almost everyone knows that humans can be superficial, but how can a computer be superficial? Computers are inherently logical machines that calculate results without any emotional bias. However, computers can logically execute an algorithm that gives completely incorrect results. This would be like programming an algorithm to tell us that 2+2=5. Just because the computer executed that algorithm logically, performing every step it was told to do, does not mean that the final result was logical. Software engineers and mathematicians determine what algorithms run on computers, which means that their biases and their emotions are inevitably involved in making algorithms. Engineers choose what data to give to a learning algorithm, and that data might be biased. For example, say a programmer makes an algorithm that predicts a person’s chance of being elected to Congress given their political party, age, and gender. If that programmer trained the algorithm on data from before 1917, the algorithm would give any woman a success chance of 0%. This is because Jeannette Rankin, the first woman elected to Congress, won her seat in 1916 and took office in 1917. There was no precedent or visible data up until that point to suggest that congresswomen would ever exist! The algorithm would dismiss any woman trying to reach Congress. Big Data algorithms are strictly tied to their input data, so if their input data is unbalanced and superficial, their results will be as well.
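The Congress example can be shown in a few lines. The records below are invented stand-ins for pre-1917 election data, and a frequency count stands in for a real learned model, but the failure is structural either way: no woman appears as a winner in the training data, so the "predictor" assigns women a 0% chance.

```python
# A sketch of training-data bias: a frequency-based "predictor" built
# only from invented pre-1917-style election records.

historical_records = [
    # (gender, won_seat) -- every candidate in the data is male
    ("male", True), ("male", False), ("male", True),
    ("male", True), ("male", False),
]

def election_chance(gender, records):
    relevant = [won for g, won in records if g == gender]
    if not relevant:
        return 0.0  # no precedent in the data means a zero predicted chance
    return sum(relevant) / len(relevant)

print(election_chance("female", historical_records))  # 0.0
print(election_chance("male", historical_records))    # 0.6
```

The code executes flawlessly, step by logical step, and still produces a result we know to be wrong, which is exactly the distinction the paragraph draws between a logical procedure and a logical conclusion.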
One result of the superficiality of Big Data algorithms is that some people do not receive grace for their mistakes. This is seen in the justice system, where algorithms are advising judges in making the life-altering judgment of assigning sentence time. One such algorithm is named COMPAS, and it has been used on “more than 1 million offenders since it was developed in 1998.” In order to use it, the court provides the algorithm with 137 data points about a criminal. COMPAS then processes the data and outputs a few scores from 1 to 10 that rank the probability that the offender will re-offend in the next two years. These scores are then given to judges for their consideration. Unsurprisingly, according to a recent scientific investigation, COMPAS is no more accurate at predicting the recidivism of criminals than the average person. If this algorithm is no better than the average person, then judges all around the United States have been relying on results that an untrained person could give with the same accuracy! This is even worse if a judge particularly trusts the algorithm and gives its verdict more weight. In a worst-case scenario, the algorithm assesses someone who is innocent of their accused crime and finds them likely to commit another one in the future, so the judge sentences them to more time because of the algorithm’s advice. Because most of the algorithm’s deliberation process is unknown to judges, they will never know whether it reached a logical conclusion. It could be seizing on a single past mistake in someone’s record as a reason why they should have extra prison time. Say a criminal has a strong desire to get back on their feet, but one day they slip back into their bad habits and commit a misdemeanor. If an algorithm were to judge this person, it would see data about their past and condemn them to extra time. To the algorithm, the criminal is just like every other convict who has returned to crime.
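The reductionism described here can be illustrated with a toy risk scorer. To be clear, the fields, weights, and formula below are invented; COMPAS’s actual model is proprietary. The point the sketch makes is that any such score collapses a person into a handful of numbers: two people with very different stories but identical data fields must receive identical scores.

```python
# A toy risk score clamped to a 1-10 range. The inputs and weights are
# invented for illustration; this is NOT the COMPAS model.

def risk_score(prior_offenses, age, employed):
    raw = 1 + prior_offenses * 2 - (age - 18) * 0.1 + (0 if employed else 2)
    return max(1, min(10, round(raw)))

# Person A: slipped once while genuinely rebuilding their life.
# Person B: a habitual offender with no intention of changing.
# Their data fields happen to be identical, so the scorer cannot
# tell them apart -- the story behind the numbers is invisible.
a = risk_score(prior_offenses=2, age=30, employed=False)
b = risk_score(prior_offenses=2, age=30, employed=False)
print(a == b)
```

Whatever weights a real system uses, this blindness is built in: the function only ever sees its arguments.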
It has no understanding of this person’s story outside of the data. The intentions of the human heart cannot be reduced to simple numbers, so Big Data algorithms should not give a score of value to someone’s intentions. Humans should judge human intentions. Big Data algorithms especially do not belong in the justice system, where decisions involve much more than surface-level computations.
COMPAS is a failure not only because of its inaccuracy, but also because of the precedent it sets for algorithms assigning us reductionist scores. For example, the Washington, D.C. school system implemented a teacher ranking system using Big Data algorithms, and the company HireVue applied the same basic principles to score job applicants. Building algorithms to score the value of any human can lead us down a slippery slope of superficial judgment. A score of value provided by a machine can give us the justification to discriminate against a person or a group of people. A perfect example of this phenomenon is Jane Elliott’s experiment on her third-grade class in 1968. She wanted her students to feel the dehumanizing effects of discrimination, so one day she told her class that brown-eyed students were “better people” than blue-eyed students. This statement is as broad as any reached by COMPAS. Without having to say much more, Elliott watched the class segregate on their own into groups of brown-eyed students and blue-eyed students. The brown-eyed students threw insults at the blue-eyed students, because they saw eye color as an indicator of value. Believing that they were better than the blue-eyed students, the brown-eyed students developed more confidence. The opposite happened to the blue-eyed students, to the extent that a “smart blue-eyed girl who had never had problems with multiplication tables started making mistakes.” This shows how an assignment of value becomes a self-fulfilling prophecy. When given a low score by an authority we trust, our minds are prone to believe that it is true. Our confidence lowers, and our brains act according to our score, no matter its veracity. Lies said against our value can also affect our spiritual well-being. Satan wants us to think that we have no value, or that we need to prove our value through works. If we begin to believe this, it will weaken our reliance on God’s grace.
Elliott’s experiment clearly shows that we can be easily torn apart – socially, mentally, and spiritually – by the discrimination that comes with scores of value.
Big Data algorithms view not only people through a superficial lens, but culture as well. These algorithms can encourage a culture that appeals to our base desires and not to God, because they regulate the culture of the internet. Almost every popular media platform – YouTube, Facebook, Twitter, Reddit, and many others – has a media-ranking Big Data algorithm for the content on its site. Because all these websites are businesses, their recommendation algorithms try to maximize the amount of time users spend on the site. To do this, the algorithms promote the content that draws the most views and likes – the content most likely to capture a user’s attention. The effect of having media-ranking algorithms promote the most popular content is that content creators try to produce media that is intentionally provocative. Because algorithms make human attention the currency of media sites, many creators chase the most attention-grabbing content possible. The content that most easily captures our attention is superficial content that appeals to our base desires, like anger, lust, and fear. Videos that feed on these desires captivate our attention and receive more views, which the ranking algorithm interprets as more value. According to a study by the Pew Research Center, 60% of YouTube users say that they have seen videos “engaging in dangerous or troubling behavior,” and 61% say they have seen content that “they felt was unsuitable for children.” These behaviors are all motivated by the algorithm, because sensationalist content appeals to the largest audience. Bearing the Image of God, we were not meant to consume this media, because long exposure to it changes the way we look at ourselves.
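The ranking logic described above can be made concrete with a toy sketch. Everything here – the function names, the metrics, and the weights – is a hypothetical illustration, not any real platform’s formula; the point is only that a system scoring content purely by the attention it captures will mechanically push the most provocative material to the top.

```python
# Toy sketch of an engagement-based ranking algorithm.
# All names and weights are hypothetical; real platforms' formulas
# are proprietary. The key property illustrated: the score rewards
# attention captured, with no regard for the viewer's well-being.

def engagement_score(video):
    """Score a video purely by the attention it captures."""
    return (video["views"] * 1.0
            + video["likes"] * 5.0
            + video["watch_minutes"] * 2.0)

def rank_feed(videos):
    """Recommend the most attention-grabbing content first."""
    return sorted(videos, key=engagement_score, reverse=True)

videos = [
    {"title": "Calm documentary", "views": 900, "likes": 80, "watch_minutes": 4000},
    {"title": "Outrage clip", "views": 5000, "likes": 300, "watch_minutes": 9000},
]

for v in rank_feed(videos):
    print(v["title"], engagement_score(v))
```

Under any weighting of this kind, the sensationalist clip outranks the calmer one, which is the incentive the essay describes: creators learn what the score rewards and produce more of it.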
The more of this degrading media we consume, the more we are tempted to integrate it into our worldview. We start to think that base media is what we should be desiring. As C.S. Lewis describes, we become content with “making mud pies in a slum” when “infinite joy is offered us.” This is especially true of children who grow up with this media. According to Pew Research, “34% of parents say their child watches content on YouTube regularly.” Children are learning how to view themselves and their world from the content that algorithms recommend to them. The algorithm does not care about the well-being of a child if it can get them to watch another hour-long video with five advertisements. There are even videos on YouTube with twisted, almost demonic plots that appear harmless but actually contain disturbing images designed to capture the innocent minds of children. It is a world where they can explore their darkest fantasies. This applies to more than just children. The brains of adults may not be as plastic as those of children, but they are certainly not immutable. Because the media-serving algorithms of the internet give us what we want, not necessarily what we need, we can chase whatever sinful desire we want, to the profit of media companies. The algorithms encourage us to shape our minds with superficial media that God never intended us to consume.
Big Data algorithms do not have the eyes of God. They see people and culture through a superficial lens, and because we rely on their judgments, our culture in turn becomes more superficial. They reduce the value of people to the sum of the data they have left behind, and they reduce the value of media to the number of its viewers. To God, people are more than their past and media is more than the attention it receives. As 1 Samuel 16:7 explains, “For the Lord sees not as man sees: man looks on the outward appearance, but the Lord looks on the heart.” God sees people as made in his Image. He designed us to give grace to one another and create media that glorifies him. The algorithms we have put in places of authority do neither of these things.
While Superficiality promotes the belief that we are valued by our appearances, Totalitarianism promotes the belief that we are valued by having an upstanding relationship to the government. Totalitarianism competes with our relationship with God by redefining our values. Just like with Consumerism and Superficiality, Big Data algorithms can aid the cause of Totalitarianism.
Not all Big Data algorithms employed by governments promote totalitarianism. Some make government run much more efficiently. For example, England’s National Health Service Business Services Authority (NHSBSA) used a chatbot service provided by Amazon that was built using Big Data algorithms. “[T]he chatbot helped NHSBSA respond to approximately 11,000 calls, addressing simple queries and rerouting complicated queries to staff who can provide more support.” It reportedly saved the government $650,000 per year. The US has implemented its own Big Data chatbot to answer immigration questions, also cutting spending on human assistants. The Belgian government has likewise reached out to Amazon to tackle the problem of job searching. If Big Data algorithms could be implemented to solve that public problem, then both the government and unemployed workers would benefit. If the algorithm could automatically serve up a list of ideal jobs, then a candidate could simply apply to their top choices. Greater efficiency in the job application process would be ideal.
The problem with Big Data’s efficiency is that governments can apply it to assert control. Instead of making services that benefit citizens more efficient, they can use the power of Big Data to more effectively promote their agenda. One example of this is China’s “social credit system,” which measures the trustworthiness of its citizens. The government is planning to assign a score to every citizen by 2020. If you obey the rules set by the Chinese government, your score stays high, but the more you disobey, the lower your score drops. Infractions can be anything the government looks down upon, everything from not enlisting for mandatory service in the Chinese military to “buying too many video games.” Punishments include being banned from public transportation, being blacklisted from top schools, and even having your dog taken away.
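The mechanics described above can be sketched in a few lines. The infractions and point values below are hypothetical stand-ins – the real system’s rules are opaque – but the sketch captures the essential feature: a score that drops mechanically with each recorded infraction, leaving no room for context or grace.

```python
# Toy sketch of a rule-based "social credit" score. The infraction
# names and penalty values are hypothetical illustrations, not the
# actual system's rules.

PENALTIES = {
    "missed_military_enlistment": 50,
    "excessive_video_game_purchases": 10,
    "unpaid_taxes": 40,
    "associates_with_low_scorers": 20,
}

def social_credit_score(infractions, base=1000):
    """Start every citizen at an obedient baseline; each recorded
    infraction mechanically lowers the score."""
    score = base
    for infraction in infractions:
        score -= PENALTIES.get(infraction, 0)
    return score

print(social_credit_score([]))  # obedient citizen keeps the baseline
print(social_credit_score(["unpaid_taxes", "excessive_video_game_purchases"]))
```

Note what the function cannot see: why the taxes went unpaid, who the “low scorers” actually are, or whether the person has changed. The reduction is the point.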
Big Data is the power behind this system. It powers the monitoring services dispersed throughout the country, and it processes, in real time, the data produced across the entire government. Without the help of Big Data, the system would be significantly less efficient, because it would require humans to be hands-on. If Big Data algorithms can search for patterns and monitor citizens on their own, then people do not need to be on the ground floor of the operation. This type of automation distances engineers from the consequences of their technology. Engineers do not have to see the emotion on people’s faces or read about their life stories when they set up a system that will punish them. With automation, empathy can be ignored. Big Data algorithms are the lifeblood of China’s social credit system.
Aspects of the system encourage good behavior. It can detect if you have not been paying your taxes. Jesus said to “render to Caesar the things that are Caesar’s,” so it is obvious that we should pay our taxes. The fear of punishment for committing a crime can be a positive motivation. However, Jesus also says in that same sentence to give “to God the things that are God’s.” The government should never ask something of Christians that belongs to God, and we should not give to our government the things that God wants us to give to him. If we put the government before God, we are being unfaithful to him. God intends for us to help those who are suffering and those who have lost their way. The Chinese government stands opposed to this. The system will lower the scores of those who associate with people of low scores. Under this system, Christians will have to oppose the government if they want to do God’s will and help those who are lowly. More importantly, by enforcing this rule, the government is teaching the populace to value others by their scores. If someone wanted to keep their score high, it would be their responsibility to steer clear of those with low scores, lest their own score drop. China is using Big Data algorithms to teach citizens that an arbitrary score is the most important quality about them.
Totalitarianism can go beyond just scoring citizens. Big Data algorithms can help governments monitor certain demographics or even assist them in genocide. In his book IBM and the Holocaust, the author Edwin Black reveals how Thomas Watson, the longtime president of IBM, “cooperated with the Nazis for the sake of profit.” The Holocaust required meticulous organization, which could only be done effectively by machines. IBM leased punch-card machines to help the Nazis complete their mission in exchange for business. The same company today scans healthcare data using Watson AI, which bears Watson’s name. If IBM has misused its machines in the past, what is stopping it from doing so now? The punch-card technology of the early 1900s was terribly weak in comparison to the supercomputers we have today. Can you imagine the damage a modern government could do with an easily accessible registry of citizens, Big Data algorithms, and intentions of genocide? It already looks like a citizen registry could become a reality in the United States. According to The Guardian, the Big Data company Palantir has teamed up with the US government to track and help deport immigrants. Immigration is a complicated and controversial topic that falls outside the scope of this argument, but it should be noted how much power our government wields by consulting Big Data. Big Data could easily be used by our government to target and track certain demographics. This use of Big Data would not treat humans as bearers of the Image of God, but as a simple category that poses a threat.
In order to be more efficient, governments turn to Big Data algorithms. This can save the taxpayer money, but it can also strengthen our governments more than necessary. Christian engineers working for governments need to be aware that the government may want to set up systems that do not treat citizens by the guidelines of the Image. Those engineers may not see the consequences directly, but they should be able to assess the most likely use of their algorithms. Even if the government has good intentions like cutting spending or providing more extensive services, the system it creates can lead to Totalitarian effects. When a government becomes Totalitarian, it becomes harder to live for God first. 1 Samuel 8 explains this exact situation. Looking at all the other nations around them, Israel decides that it wants a strong, human leader. When Samuel tells this to God, he replies, “Listen to all that the people are saying to you; it is not you they have rejected, but they have rejected me as their king.” God knows how Israel’s kings will be a stumbling block to the nation in the future, so he warns the Israelites. God wants their devotion first, and a king – or any government for that matter – might contest that devotion. This is not to say that all government is evil, but that we must remember to give to God the things that are God’s, even when the government demands those things. Christian engineers working for governments need to know that their algorithms may be used to manage people, and they should make sure those algorithms manage people in a Godly way.
What We Can Do
Big Data algorithms are at the wheel, but they are not qualified to drive. Many of them promote a sinful view of humanity that pushes aside the Image of God. Consumerism, Superficiality, and Totalitarianism are at the root of these technologies. These worldviews demand that we change the way we view ourselves. Consumerism demands that we find our identity through the lens of a product. Superficiality demands that our identity is a score, which leaves no room for grace or human value. It promotes a culture of degrading media that further distorts our identity. Totalitarianism demands that we pay allegiance to the government above all. It judges citizens according to flawed values. As culture redeemers, we must do something about this.
These algorithms are unqualified because they were created by engineers without the ethical foundation of the Image. As Karl Marx writes, “Technology discloses man’s mode of dealing with Nature.” When an engineer creates a new technology for a certain purpose, they are professing their worldview. They are personally telling the world their beliefs about humanity. The worldview that is embedded in that technology argues for a certain view of man. Technology can declare the truth of the Image, or declare the three evil worldviews. This is why respect for the Image is critical to making society more human. Without its guidance, engineers are willing to create systems that reduce people to numbers.
Taking algorithms out of the driver’s seat can only be accomplished if we live our lives in a way that professes the truth of the Image. Christian engineers need to profess the Image of God in their workplace. They need to express concerns about the ways their companies or governments treat people. We need to be courageous enough to push back on skewed ethics and, if necessary, to refuse to work on sinful projects. Being a light involves more than just engineers. It is the Spirit working through every Christian to spread the Gospel. Through our God-honoring living, we can influence the engineers working at companies and governments to see the detrimental effects of their systems. Through God’s power and wisdom, we can put our hands back on the wheel.
Ali, Maysam, and Leonardo Quattrucci. “Machine Learning: What’s in It for Government?” Amazon Web Services, February 19, 2019. https://aws.amazon.com/blogs/machine-learning/machine-learning-whats-in-it-for-government/.
Black, Edwin. “IBM and the Holocaust - Home Page.” IBM and The Holocaust. Accessed March 29, 2019. https://ibmandtheholocaust.com/.
Bloom, Stephen. “Lesson of a Lifetime.” Smithsonian. Accessed March 21, 2019. https://www.smithsonianmag.com/science-nature/lesson-of-a-lifetime-72754306/.
Cavanaugh, William T. Being Consumed: Economics and Christian Desire. Grand Rapids: William B. Eerdmans Pub. Co, 2008.
Dressel, Julia, and Hany Farid. “The Accuracy, Fairness, and Limits of Predicting Recidivism.” Science Advances 4, no. 1 (January 1, 2018): eaao5580. https://doi.org/10.1126/sciadv.aao5580.
Ellul, Jacques. “Technique and the Opening Chapters of Genesis,” n.d. https://www.jesusradicals.com/uploads/2/6/3/8/26388433/technique-and-the-opening-chapters-of-genesis.pdf.
English Standard Version. Wheaton, IL: Crossway, 2001.
Gilchrist, Karen. “Future of Employment: Your next Job Interview Could Be with a Robot.” CNBC, October 2, 2018. https://www.cnbc.com/2018/10/03/future-of-jobs-your-next-job-interview-could-be-with-a-robot.html.
Gillette. “The Best Men Can Be | Gillette®.” Accessed April 16, 2019. https://gillette.com/en-us/the-best-men-can-be.
HireVue. “HireVue - Hiring Intelligence | Assessment & Video Interview Software.” Accessed April 17, 2019. https://www.hirevue.com/.
“How Big Data Will Revolutionize the Global Food Chain | McKinsey.” Accessed March 6, 2019. https://www.mckinsey.com/business-functions/digital-mckinsey/our-insights/how-big-data-will-revolutionize-the-global-food-chain.
Husain, Amir. The Sentient Machine: The Coming Age of Artificial Intelligence. New York: Scribner, 2017.
Islam, M. R., N. I. Shahid, D. T. ul Karim, A. A. Mamun, and M. K. Rhaman. “An Efficient Algorithm for Detecting Traffic Congestion and a Framework for Smart Traffic Control System.” In 2016 18th International Conference on Advanced Communication Technology (ICACT), 802–7, 2016. https://doi.org/10.1109/ICACT.2016.7423566.
Janjigian, Lori. “Facebook Can Guess Your Political Preferences - Business Insider.” Business Insider. Accessed April 9, 2019. https://www.businessinsider.com/facebook-can-guess-your-political-preferences-2016-8.
Kent, Jessica. “Top 4 Big Data Analytics Strategies to Reduce Hospital Readmissions.” Health IT Analytics. Accessed April 10, 2019. https://healthitanalytics.com/news/top-4-big-data-analytics-strategies-to-reduce-hospital-readmissions.
Kurzweil, Ray. How to Create a Mind: The Secret of Human Thought Revealed. London: Duckworth, 2014.
Lewis, C. S. The Weight Of Glory. 1st HarperCollins ed., [rev.]. San Francisco: HarperSanFrancisco, 1949.
Ma, Alexandra. “China Has Started Ranking Citizens with a Creepy ‘social Credit’ System — Here’s What You Can Do Wrong, and the Embarrassing, Demeaning Ways They Can Punish You.” Business Insider. Accessed March 28, 2019. https://www.businessinsider.com/china-social-credit-system-punishments-and-rewards-explained-2018-4.
Marx, Karl. Capital: Volume 1, 1867. https://www.marxists.org/archive/marx/works/download/pdf/Capital-Volume-I.pdf.
“Meet Emma, Our Virtual Assistant.” USCIS, April 13, 2018. https://www.uscis.gov/emma.
Noble, David. The Religion of Technology. New York: Penguin Books, 1999.
Northpointe. “A Practitioner’s Guide to COMPAS Core,” n.d. https://assets.documentcloud.org/documents/2840784/Practitioner-s-Guide-to-COMPAS-Core.pdf.
O’Neil, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York: Crown Publishing Group, 2016.
Raja, Vijay. “Using Big Data to Drive a True Customer 360.” Cloudera Blog, January 25, 2016. http://vision.cloudera.com/using-big-data-to-drive-a-true-customer-360/.
Smith, Aaron, Skye Toor, and Patrick Van Kessel. “Many Turn to YouTube for Children’s Content, News, How-To Lessons | Pew Research Center.” Pew Research Center. Accessed April 11, 2019. https://www.pewinternet.org/2018/11/07/many-turn-to-youtube-for-childrens-content-news-how-to-lessons/.
Sparapani, Tim. “How Big Data And Tech Will Improve Agriculture, From Farm To Table.” Forbes. Accessed March 5, 2019. https://www.forbes.com/sites/timsparapani/2017/03/23/how-big-data-and-tech-will-improve-agriculture-from-farm-to-table/#79f2a5e65989.
Stix, Gary. “New Clues to Just How Much the Adult Brain Can Change.” Scientific American Blog Network. Accessed April 13, 2019. https://blogs.scientificamerican.com/talking-back/new-clues-to-just-how-much-the-adult-brain-can-change/.
The Classical Difference. “Life After Graduation: How Classical Ed Prepared Me For Engineering - The Classical Difference.” Accessed April 10, 2019. https://classicaldifference.com/life-after-graduation-how-classical-ed-prepared-me-for-engineering/.
“The Disturbing YouTube Videos That Are Tricking Children.” BBC, March 27, 2017, sec. BBC Trending. https://www.bbc.com/news/blogs-trending-39381889.
“Watson (Computer).” In Wikipedia, March 4, 2019. https://en.wikipedia.org/w/index.php?title=Watson_(computer)&oldid=886216889.
Watson Health. “Watson Health: Get the Facts - Watson Health Perspectives.” Watson Health: Get the Facts. Accessed March 5, 2019. https://www.ibm.com/blogs/watson-health/watson-health-get-facts/.
Woodman, Spencer. “Documents Suggest Palantir Could Help Power Trump’s ‘Extreme Vetting’ of Immigrants - The Verge.” The Verge. Accessed March 28, 2019. https://www.theverge.com/2016/12/21/14012534/palantir-peter-thiel-trump-immigrant-extreme-vetting.
Yong, Ed. “A Popular Algorithm Is No Better at Predicting Crimes Than Random People.” The Atlantic, January 17, 2018. https://www.theatlantic.com/technology/archive/2018/01/equivant-compas-algorithm/550646/.