Hey, Computer Scientists! Stop Hating on the Humanities

from Wired

AS A COMPUTER science PhD student, I am a disciple of big data. I see no ground too sacred for statistics: I have used it to study everything from sex to Shakespeare, and earned angry retorts for these attempts to render the ineffable mathematical. At Stanford I was given, as a teenager, weapons both elegant and lethal—algorithms that could pick out the terrorists most worth targeting in a network, detect someone’s dissatisfaction with the government from their online writing.

Computer science is wondrous. The problem is that many people in Silicon Valley believe that it is all that matters. You see this when recruiters at career fairs make it clear they’re only interested in the computer scientists; in the salary gap between engineering and non-engineering students; in the quizzical looks humanities students get when they dare to reveal their majors. I’ve watched brilliant computer scientists display such woeful ignorance of the populations they were studying that I laughed in their faces. I’ve watched military scientists present their lethal innovations with childlike enthusiasm while making no mention of whom the weapons are being used on. There are few things scarier than a scientist who can give an academic talk on how to shoot a human being but can’t reason about whether you should be shooting them at all.

More here.


16 Responses to Hey, Computer Scientists! Stop Hating on the Humanities

  1. Anthony Laverde April 25, 2017 at 9:43 pm #

    This article makes a sound argument for why computer scientists need to be more ethical. I whole-heartedly believe in the main points. Too often are ethical issues overlooked in big business, let alone big data. The author mentions big data very early on, stating that she is even “a disciple of big data”. My understanding of big data is gathering as much information as possible to build a profile on a person, then selling it to companies who decide what to do with it. This is extremely unethical and must be regulated in this day and age, when, whether we like it or not, everything is monitored. The author of the article completely skips over the fact that she works in one of the most unethical subjects of computer science. This most likely formed her opinion on the ethics of computer science, but why does she not speak about this?
    Big data is one of the most lucrative and biggest sectors of computer science. I read a while back that the entire big data industry is growing exponentially and has already reached a market value of around 100 billion dollars. They are making all this money off information gathered from the public. Everything is monitored, from the things we look up to the people we speak to. Sure, there are laws preventing companies from doing so, but there are ways around them, especially when people do not read the entirety of what they are signing. Of course this is unethical, but people do not care when they are making money. There is nothing stopping them. The article talks about how computer scientists need to learn more about ethics and the best ways to do so.
    The author speaks about how computer scientists need to be taught ethics. She states that even the managers cannot regulate it because they would not know the full extent of the work, so the responsibility falls on the programmers themselves. Schools need to find a way to incorporate ethics into their computer science programs, because programmers have no self-control. Programmers will turn out however many projects, and whatever projects, the business wants, with no thought to their actual implications. These programs will be put into use and harm many people before they are finally stopped. This creates a huge issue, and some problems may be irreversible. So how do you stop this from happening? Schools need to implement classes to help these programmers understand the implications their work can have. Seton Hall University does a great job of this by offering the information technology management major. This major is offered through the business school, forcing the student to take a business ethics class. This puts the student at an advantage to not only write the code but also understand the implications and market it properly. Not all schools have this same major or these requirements. There is also the problem that this is not a computer science major; it is primarily for those who will be the managers of the programmers. I do not believe the computer science major is required to take an ethics class, but having a major that can lead computer scientists toward a more ethical solution is a step in the right direction.

  2. Matt Talarico April 26, 2017 at 12:30 am #

    Technology will only let humans go as far as they explore. There is a common misconception that artificial intelligence will outpace the efficiency and creativity of humans. While this seems logical, it will not happen. Technological advances are made because humans are able to shape code into exactly what they want it to do. A few coders will not randomly type up some code and have something happen that they did not expect. I think that people who believe this will happen do not have a deeper understanding of artificial intelligence and its development.

    On the other hand, people think that robots will take over the job market, and not as many humans will be required for certain jobs. While this is already happening, I do not think it will get to the point where humans are put completely out of work. Yes, businesses want what is most efficient for their company, but maintaining and developing all those robots and artificial intelligence will be costly. Also, robots can cost hundreds of thousands of dollars to create. A $100,000 robot that needs thousands of dollars worth of repairs every year may not be as efficient as just hiring a human. On the same topic, someone has to develop and manage these robots, which will only create more high-level jobs.

    With these technologies, the American economy will skyrocket. The only way for an economy to accelerate at a higher rate is to improve technology and its efficiency. With a more efficient economy, more things can get done in a shorter time without just shoving resources into certain projects. While many people are skeptical of such advancement in the world of artificial intelligence, I think that people should welcome it with open arms, because it will only do what is best for our country, and that is to propel it to greater heights. I think President Donald Trump has a very good plan to expand the artificial intelligence sector of the United States economy. By allowing corporations to grow their brand within United States borders, they will be able to expand faster than under President Obama. I am very excited for the future of the American economy with the accelerated rate of innovation, and everyone else should be too.

  3. Jonathan Cavallone April 26, 2017 at 7:53 pm #

    After reading this article about computer scientists, algorithms, and programming, I was truly amazed at what technology is capable of doing. An algorithm is defined as any well-defined computational procedure that takes some value, or set of values, as input and produces some value, or set of values, as output. That sounds sort of confusing, but basically an algorithm takes a set of data and produces a different set of data based on the data entered into it. Algorithms are like road maps for accomplishing a given, well-defined task. Therefore, a chunk of code that calculates the terms of the Fibonacci sequence is an implementation of a particular algorithm. Even a simple function for adding two numbers is an algorithm in a sense, though a simple one. There are many different types of algorithms, all with different purposes; the different algorithms that people study are as varied as the problems that they solve. Admittedly, I do not understand much more than that about algorithms. Today in class we discussed something similar to what is discussed in this article. In class we talked about how there will be conflicts with ethics as technology continues to advance. We discussed things like whether it would be fair for a treatment that prevents cancers to be given only to those who can afford it. Additionally, if scientists could create robots that put even more Americans out of work, should they be made? There are going to be many controversial decisions that must be made in the future because technology is advancing so quickly. This article talks about how many computer scientists are oblivious and ignorant when it comes to the algorithms and code they are designing. These scientists typically think only one way, as described by the author, and that is mathematically. This can be a real problem, because the scientists do not understand the effects their creations can have on society.
    For example, the author describes instances where military scientists have created lethal weapons that will be used to kill people, yet they do not know who the weapons will be used on or whether the reasoning behind their use is ethical. The fact that so many computer scientists are ignorant or disdainful of non-technical approaches is worrisome, because in computer science there are many questions that cannot be answered with code. It is important for these scientists to take into account the effects of their creations. Many things computer scientists have developed have put millions of people out of work because robots have replaced their jobs. Either the scientists do not care about what they are creating because they are being paid very well, or they are oblivious to the damage that they are doing to the blue-collar working community. It will be interesting to see what happens as computer science continues to advance in the future, and to see the impacts that it will have on society. I cannot tell if I am nervous or excited to see the impacts technology will have on my generation.
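The textbook definition cited in this comment (a well-defined procedure turning input values into output values) can be made concrete with a minimal Python sketch of the two examples mentioned, Fibonacci terms and adding two numbers; the function names here are illustrative, not from the article:

```python
def fib_terms(n):
    """Return the first n terms of the Fibonacci sequence."""
    terms = []
    a, b = 0, 1
    for _ in range(n):
        terms.append(a)   # output the current term
        a, b = b, a + b   # step the recurrence forward
    return terms

def add(x, y):
    """Even adding two numbers is an algorithm in a trivial sense:
    well-defined input, well-defined output."""
    return x + y

print(fib_terms(8))  # [0, 1, 1, 2, 3, 5, 8, 13]
print(add(2, 3))     # 5
```

Both functions fit the definition exactly: given the same input, each follows a fixed sequence of steps and produces the same output.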

  4. Andrew Imbesi April 28, 2017 at 12:14 pm #

    Last class we discussed “What’s Next” for millennials, the lives ahead of them during and after college. For starters, I felt this presentation connected with me, since the discussion touched on interests similar to mine. For years, I have been watching these events unfold and thinking to myself, “where do I see myself in all of this?”, and now I am starting to reconsider where I should take my next steps towards success.
    I called my mom immediately after class, spitting these words out of my mouth: “Mom, I have no idea what I want to major in anymore”. Ideally, I thought accounting would be a good path for me. I know I will never leave Stillman, and accounting is more like the king of all business concentrations. However, after class I realized that business is not the only path; I also feel as if I need more knowledge than just an IT certificate. I want to be “business brilliant”, and I know I have the potential to be a successful investor and entrepreneur, but I see myself going nowhere unless I am technologically advanced.
    Computers truly are taking over: they are taking over people’s jobs and being integrated into pretty much everything humans do. I believe learning this technology inside and out would bring nothing but advantages to me in the future. Being a tech-savvy person, I know my way around my devices well enough to protect and manage myself, and browsing is no problem for me.
    Knowing this is not enough; I am a late bloomer to coding and I really do not know where to begin. Coding will become essential to life for many humans worldwide over the next few decades. I reconsidered my major to have something to do with computer science, and I believe dedicating more time to this at Seton Hall will definitely be useful to me.
    Technology is growing, and I am glad I go to a school that values that future. It is extremely beneficial how Seton Hall incorporates technology into its students’ education. I did not realize I would be receiving a laptop, and I had no clue that it would be so incorporated into the curriculum (except religion). I am not saying this is a bad thing; it is definitely a blessing in disguise.
    Towards the end of this article, the author suggests that tech companies should hire non-computer scientists. I would not rule this idea out; maybe a slower transition towards technology would be better for the world. A quick economic shift, a quick change for the world, could come with repercussions. Slowly bringing in people unfamiliar with technology and computer science will give humans more time to understand it.
    I wonder if there will ever be a time in human history where computers simply do everything and make all the money (if it still exists) for humans. The overwhelming force of technology can bring good and bad times ahead. I hope that humans do not allow technology to destroy them.

  5. Thomas Dellisanti April 28, 2017 at 4:29 pm #

    This article brings up some very interesting points regarding the ethics involved in technological decisions. With the advancement of technology being so significant in recent years, there is an incredible urgency to do anything that could not be done before. However, there seems to be no consideration as to whether programs or various parts of the Internet affect other people. The article notes that the programmers are in charge of writing the code for programs, and nothing more. It is scary to think that people who are in charge of creating programs that affect hundreds of thousands of people do not care or worry about the ethical consequences that come with those programs.

    One of the more significant examples of these ethical dilemmas mentioned in the article is Pierson’s personal projects that involve downloading communications and posting messages for potentially depressed and suicidal teens. In theory, the intentions of the program were good because the end result was to help these teens. However, further thought on this project can show that it can actually have an extremely negative effect, especially if these communications are mishandled or misused in any way. This situation can be an example of what could happen if ideas are not regulated. An idea might sound beneficial at first, but programmers might run with the idea by creating a program without thinking about how the program might negatively affect its users.

    Programmers are not the only ones that struggle with the ethics of new and advanced technology. Companies such as Vizio have admitted that they monitor their consumers’ viewing habits in order to make a bigger profit. This technique is a blatant invasion of their consumers’ privacy, and ethical considerations obviously did not take place. I think that because new technology allows situations like this to happen, people are getting caught up in the fact that it could be done. Again, the recent decision by Republicans to allow Internet service providers to sell their consumers’ data to third parties shows that the ethics behind a decision are not considered. Although this situation is mostly motivated by making extra profits, ethical considerations are disturbingly absent.

    Politicians who are concerned with strictly making profits cannot understand the seriousness of the ethical situations that come with advanced technology. However, programmers who are in charge of creating different programs for thousands of different users must understand the degree to which they affect their lives. A potential solution could be to include stricter or more extensive ethical training in programming majors to show how much they can affect other people’s lives. New technology can be incredibly beneficial to everyone, but it can also be very dangerous if not used properly. In addition, considering that everyone uses technology in some way, programmers play an unseen but important part in people’s lives, so they must recognize the influence of their projects. Their job is to type lines of code, but they do not see the direct effects and consequences of what they create. If they do become aware of how much they can affect people’s lives, the world of technology can become a much more secure place.

  6. Nicholas Thomas April 28, 2017 at 5:56 pm #

    The article raises an important relationship in society: the relationship between technology and ethics. Technology is concerned with meeting some need in society. Ethics is concerned with protecting humanity and developing what it means to be human. With this said, as new technology emerges there are obvious risks and concerns. I argue that as technology pushes forward to “improvement,” ethics slows down the integration of technological advances. Many may see this relationship as burdensome, but I agree with the author that the humanities are not valued enough. This relationship between technology and ethics is needed in society to protect people.
    When I use the word “technology” I am referring both to objects such as robots, and to programs and the development of such programs. Technology, and the constant improvement upon it, is a fantastic and needed thing for society to grow. For example, the author relies heavily on the example of algorithms. Algorithms are a basis for programs, and for how certain technology “thinks.” Algorithms are what allow companies such as Amazon to analyze shopping trends and suggest other products to users. However, algorithms are not perfect, and depending on how they work they can have serious flaws. For example, certain algorithms are susceptible to trends in society, such as the program used in FaceApp. FaceApp is an app that allows people to “transform” pictures of themselves. The app allows people to make themselves look older, younger, a different gender, and “hotter.” The issue with the “hotter” function of the app is that it gives the person in the photo a lighter skin tone, and therefore implies lighter skin means more attractive. The author uses similar examples, such as programs in jails “determining” that black inmates are higher risk, and that women should be steered away from careers in engineering. The point is that technology contains the biases of the people of a particular period. Ethics is in part the reason why people are able to become aware of these biases and adjust the technology accordingly. Ethics is not meant to stop the progress of technology, but to make coders and computer scientists question whether something is morally correct, why such technology is needed, and whether people are ready for it. For example, there is technology to start putting chips in people’s heads that could improve communication between people. This technology is possible, but there are many people who simply are not ready to take such a risk and make the transition. Moreover, there is technology for gene modification, but making it easily accessible may create detrimental consequences.
    For example, with positive therapy, gene modification that makes “improvements” to genes, modification may be done to follow race and gender roles, such as undergoing germline therapy so that children are male and white (white males have the most opportunity in society). As a result, racism is reinforced and the gene pool may shrink, so ethically, should gene modification be allowed? Are people ready for such technology? I think the key word is “people”; people are humans, not computers or robots to be programmed. Advancements should integrate into society, but only when society is ready.
    We live in an extremely competitive society, so people try to get every edge they can to be “successful.” In this society, part of what it means to be successful is to have money, and the high-paying jobs are technology related. Moreover, if one does not pursue a job in technology, he or she is often mocked and looked down upon. However, I believe that ethics acts as a safety mechanism for technology. Coders and computer scientists become so caught up in whether they can do something that they do not ask if what they are doing is safe or needed. For the stigma around the humanities to change, and for them to be valued more, I think colleges should require more humanities classes. Moreover, I think that if humanities such as philosophy are taught at a much younger age, we may have more coders and computer scientists who are ethical. Ethics protects people from the technology people create.

  7. Matthew Radman April 28, 2017 at 8:20 pm #

    I agree with this author that computers, and the scientists who work with them, are incredible. They are helping to usher in a whole new world of possibilities for the future. But computers are not the end-all, be-all. In an interesting dynamic, it often seems as though humans are an afterthought to a computer’s superiority. We hear so often now that machines are smarter, faster, and better than anything humans could do. It is impossible to deny that some of those claims are true, yes; however, we are ultimately in charge. Humans should not be counted out. The strange land of Silicon Valley seems to undermine this fact. I agree with the author that the superiority felt by computer scientists and engineers is unjustified. Even though technology is heading in a direction in which AI rules and will replace many tedious jobs, what is a society without humans? The answer is nothing. Therefore, there is no reason why the end goal should be to replace people with computers. The fact is that a world aided dramatically by AI will still include people in the humanities. The reason for this is the same reason why the world still has artists, architects, salespeople, managers, philosophers, historians, etc.: because it is natural.

    Computer scientists and engineers often have the opinion that their jobs are the only safe ones in the future. I would argue otherwise. In fact, AI is, by its nature, able to learn on its own. That is why, at a certain point, programs will be able to code themselves. Coders and computer engineers will be as replaceable in the future as truck drivers are today. Even at its current pace, the computer will never be as good at being human as humans are. Careers in the humanities will continue to be in demand, because it would not make sense for people to decide to create programs to replace humans.

    Humans are here to stay, and therefore so are the humanities. Hypothetically, in an AI-controlled world, the humanities are the only safe careers. It is scary to think about the future. Entrepreneurs, scientists, engineers, and the like (mostly from the Bay Area) have predicted that coding is the only career of the future. It seems like if you can’t code, there is no future for you. Everyone in and around Silicon Valley codes and wants everyone else to code. People travel far and wide to land a job at one of the billion-dollar tech companies so that they can continue to code. Coding is a competition; hackathons have sprouted up everywhere lately, in which people go on 48-hour or longer “coding binges.” We are living in a society that is beginning to reward making computers a priority; however, as computers become smarter and more ubiquitous, humans will realize the importance of being human. The humanities will be more of a necessity, and more appreciated, as AI becomes more integrated into society. If not, the population risks losing something special. As a business student, I believe that no matter how smart AI gets, nothing can replace human interactions and human nature.

  8. Ryan Appello April 28, 2017 at 8:26 pm #

    It’s undeniable that the entire world is shifting towards engineering-geared economies filled with engineering jobs. However, it is important to preserve fields like the humanities that make us human. I mean, it’s our culture! It’s us. How can we blame a college student for studying something like this? Why is it that students who go into something like a literature field are automatically seen as inferior to others and assumed to go nowhere in life? If we are to preserve who we are as a collective species, we need to maintain the historic importance we place on our own culture. It’s only logical to think this. If our entire population only knows how to be engineers or coders, we open the door for so many problems to arise.

    The humanities allow us as people to improve our understanding of who we are and to become more robust members of society overall. If we all only specialize in one field, we might as well kiss our futures goodbye. I’m not denying the importance of computer science and engineering fields; I’m just pointing out that they aren’t the only things that are important to us as a collective group. In order to remain the multi-faceted and successful society we are, we need to continue to place importance on the humanities as we know them. Because if we don’t, we are denying our own culture and history.

    One main issue that the humanities can address is ethical problems. As technology continues to take over more of our lives, more and more ethical questions will be raised. This is one of the main reasons the humanities need to be protected. People who specialize in ethics can consult on these issues and help find the best solution. This would help avoid other potential problems that could prove to be very destructive. The people who specialize in coding and other computer sciences still need to educate themselves in these ethical fields in order to ensure that what they are doing is the morally correct thing to do and that it won’t hurt anyone. Everyone should receive training and education in ethics at some point in their higher education. It’s fundamental to being intelligent and compassionate human beings, and it goes much further than knowing right from wrong. Ethics deals with plenty of incredibly difficult problems, and if we aren’t prepared to solve them, the consequences could be very negative for us as people. In the end, it’s clear that computer science is the field that many people are going into. It is the future, without a doubt. But the importance of the humanities must never be undermined. In order to retain our morals as civilized people, everyone should have at least a basic knowledge of ethics and other similar fields. And looking down upon those who specialize in them isn’t right either. They remain an integral part of our society that needs to be carried on if we are to remain cultured and civilized people.

  9. Benjamin Jaros April 28, 2017 at 8:33 pm #

    In a way, Emma Pierson is discussing what philosophers have discussed for centuries: what is education?
    It seems that for those in the technology sector, and in many other parts of our world, utilitarianism is indeed the prime driver (thanks, Machiavelli). Further, I think that outside of this class in the business school, most students do not see the importance of the credits that do not pertain to their particular major.
    I believe that education is about the formation of the mind, not the content retained. Now, this is not an original view, it is just a summary more or less of John Henry Newman. Therefore, what courses should be taken in order for a student to fully form their mind?
    Now, the philosopher analyzing this essay is asking me to define what a fully formed mind looks like, which I cannot do, not only because I do not have one, but because I think the formation of the mind is a pursuit, and therefore I cannot define it. All I can say is that a well-formed mind involves broad and diverse study. Further, it involves a lot of mental gymnastics that forces the mind to grow beyond its limits on a regular basis.
    Therefore, part of this pursuit will certainly involve fields outside of our interests. I am interested in many things, yet I find difficulty, and sometimes even boredom, in studying dense philosophical theology (Lonergan). It forces one to open the mind to such a degree that I actually found myself in great perspiration upon reading the work. The formation of one’s mind is mentally and physically exhausting. Therefore, I see the appeal of being sloth-like and not devoting oneself to study. Work is hard.
    However, currently in the United States, education is looked at and approached by many students as a burden, not as a love. There needs to be a cultural shift in our approach to education. We need to spend less time focusing on getting through school and more time enjoying the process.
    Beyond this cultural shift, we need to incentivize those in the tech industry to study things beyond mathematics and computer science. Human beings are not the sum of their online presence. Technology is an extension of the life and dignity of the human person. Until computer science majors see it that way, we will continue to have moral and ethical problems arising from technology (cough, cough, Zuckerberg).

  10. Adara Gonzalez April 28, 2017 at 8:34 pm #

    The article brings up a good point about the distance between the numbers and the significance behind the numbers. It is scary to think that the people in charge of the behind-the-scenes data are ignoring the human side of their work. In addition, the plea is not for scientists to ignore the data and statistics of what they do, but instead to find a human perspective and think twice before acting on said data.
    The article also speaks about the intellectual mindset of these data seekers, claiming that they believe they know the answer to everything. As the Wired article perfectly puts it: “Professors need to scare their students, to make them feel they have been given the skills not just to get rich, but to wreck lives; they need to humble them, to make them realize that however good they might be at math, there is still so much they do not know.” The people in this position need a sense of the power their knowledge carries. Although I know I speak from an outside position, I personally feel that these scientists and data seekers are so knowledgeable that they do not realize how valuable and how dangerous that knowledge can be. Knowing too much can be quite dangerous, and in this situation, these scientists and math geeks knowing too much could affect all humans.
    The article also discusses how to fix this and provide a humanistic approach in situations where everything just seems to be zeros and ones. The author commends the companies that recognize the problem of data seekers distancing themselves from the numbers they are paid to find. Companies like Google and Microsoft are directly approaching the situation. The article states, “Google and Microsoft deserve credit for researching algorithmic discrimination, for example, and Facebook for investigating echo chambers. Make it easier for external researchers to evaluate the impacts of your products: be transparent about how your algorithms work and provide access to data under appropriate data use agreements.” These companies are making the attempt to bring scientists face to face with the direct impact their work has, keeping them from being blind to it.
    This situation reminds me of the work politicians do. Although their work is dedicated to the people, most of them do not realize the direct impact their words and actions have on the people they work for. Even though it seems to be somewhat complicated, that is because it is. In addition, if these people’s job is to serve humanity and society and benefit them, how can that be accomplished without them realizing the potential they and their work have? Now, I am not speaking for all politicians, and I do not speak for all data scientists; I just speak on the fact that this is a common occurrence that, as this article states, needs to be addressed.

  11. Daniel Anglim April 28, 2017 at 8:48 pm #

    In today’s world it seems almost impossible to obtain a well-paying job without higher education. Since elementary school my parents have prioritized education as the path to success. I decided to become a Business major with a concentration in mathematical finance at Seton Hall University. The difficulty of my degree will hopefully set me apart from the millions of other Business majors I will be competing with for internships and jobs. A degree from a university no longer guarantees employment. That being said, when choosing a school, people are looking at the university’s ability to create a well-rounded individual who will be ready to work after four years of education. Colleges are looking for ways to entice students by offering guarantees ranging from “a student will finish in four years, or the college will swallow the cost of additional semesters,” to guarantees that graduates will be able to find employment with a certain income. These guarantees are something new, not previously offered by colleges. People want assurance that they will be getting their money’s worth for the education they receive. Many colleges claim that they prepare students for the workforce, but when business leaders are asked about the performance of college graduates, “only 11% believe college graduates are properly equipped to enter the workforce.” People are not going to willingly spend tens of thousands of dollars on an education that does not prove useful. College guarantees are going to become essential for attracting students in the future.

    Colleges are working with employers to see exactly what they expect from new employees. Colleges are creating programs like “College for America, part of Southern New Hampshire University, which has customized partnerships with a broad range of employers, including Aetna, the Gap and the District of Columbia government.” The growth of programs like College for America is critical in transforming students into employees. More universities will construct these co-op programs to offer to students. These programs could be revolutionary, perhaps changing the standard for all college programs. One of the reasons I chose Seton Hall University is that it is highly ranked for getting its students internships in New York City. The purpose of college is to increase one’s education so that one will be able to get a job in the future. Many colleges must adapt to the demands of the workforce to make graduates more attuned to real-life application of the education they receive. This calls for an “overhaul of accreditation.” Stuart Butler, author of “Business Is Likely to Reshape Higher Education,” says that “accreditation has become a poor indicator of quality and increasingly a barrier to innovative and less costly forms of higher education. One big reason it stays in place, however, is that federal student aid is tied to accredited institutions.” Basically, accreditation is setting the bar low for universities, allowing them to maintain their current education plans instead of thinking of innovative forms of course delivery. Fortunately, it seems that Congress and other administrations are looking for alternative ways to break free from the traditional accreditation system. I look forward to seeing the transformation of higher education.

  12. Jill Coleman May 26, 2017 at 10:57 am #

    I found it incredibly perplexing that of all industries, the fastest-growing and most rapidly evolving one did not think to leverage the skill sets and ideas of those outside technology. In my experience in corporate America, the most valued employees are those with diverse backgrounds that differentiate them in the fields they practice. Companies nowadays spend a lot of money acquiring talent that addresses their weaknesses. For instance, almost all Fortune 500 companies offer extensive training designed to generate subject-matter experts. This begins with developing young talent; many have co-op programs that immerse students in a full-time workload while connecting it to the practical knowledge obtained in the classroom. This bridges the generational gap and the faults in the company by giving millennials a platform to voice their opinions. In most cases, those who work primarily in their field of knowledge for an extended period of time lose their touch. That is not to say they do not perform to the best of their ability or standards, but that they no longer look at information through the eyes of an outsider. They view life through the same lens and only consider a limited set of options based on their past experiences. As such, the fresh eyes of those who do not perform that job’s function offer valuable insight. Questioning why things are done and providing a different interpretation of data can pinpoint faults in business plans and proposals. Based on this, most companies in the supply chain industry tend to leverage the mindsets of students whose majors are unrelated to supply chain and logistics. For instance, in leadership development programs they weigh candidates’ strengths against their weaknesses; by identifying this gap they can begin to train and compensate for the lapse, further developing employees where they falter to build a broader understanding of the business and make informed decisions.
Much like a liberal arts major would be able to deduce the ethical and social impacts of developed technology. By the same token, a liberal arts major would not only be able to provide this other insight, but even be able to identify opportunities and gaps where technology is not being developed. For instance, the article mentions code that contributes to potentially harmful websites that affect others. But this other perspective could use its knowledge to develop technology where it is lacking in ways that assist others. For example, if there is a defined business need for a universal platform to share and store product-based information, this will not only aid the business but also keep the best interest of the consumer in mind, as ease of use will increase along with quality of information.
    This oversight and lack of consumer interest drives the negative stereotype of the pharmaceutical and healthcare service industries as being primarily focused on profits. Many corporations, unlike the tech companies listed in the article, have whole departments dedicated to consumer experience and other qualitative measures of success. Many of these same companies offer the professional development opportunities detailed in the first paragraph. I am fortunate to attend a university that offers a major equivalent to computer science with a focus on business. Computer information systems focuses on a strong coding and analytical background while also covering a wide range of courses on corporate social responsibility, ethics, law, and economics. This allows more holistically cultivated students to understand the varying reasons why business needs drive technological advancements, and to understand the ethical limits on business when it impedes the best interest of consumers. Perhaps the best value for students would be to attend a primarily liberal-arts-focused university with a strong business program, thus fulfilling an ethical and creative course load as a requirement for obtaining a degree in computer science or computer information systems.

  13. Meagan E Finnerty May 27, 2017 at 10:00 am #

    I enjoyed the article, Hey, Computer Scientists! Stop Hating on the Humanities, written by Dr. John Shannon. This article grasps the rising issues within technology, especially within a computer science career. There are a lot of issues with technology; since it is so new to our generation, the morals and ethics behind it are becoming a huge problem, whether it be kids on Facebook or other social media, privacy concerns, or individuals in their careers. Computer ethics is something that must be addressed and enforced within our nation before it gets out of hand.
    During my time as an undergraduate student at Rider University, I have had many majors: criminal justice, teaching, and finally healthcare management. Through these majors I have had to take classes on ethics and behavior to ensure that when I am placed in a workplace, I know the proper actions to take when certain situations arise. When I was a teaching major, I had to take a class on internet ethics, which I found extremely interesting. As discussed in the article, Dr. Shannon believes that students should take a class on social issues in computer science, which correlates with that class I took. I learned that there are many issues within social media. For example, a teacher lost her job after posting a picture of a drink on a beach over the summer when her students found her Facebook. Both ethics and privacy were violated, and this led to an uproar at the Board of Ed in that school district.
    Another point brought up by Dr. Shannon that I felt strongly about is when he stated, “Professors need to scare their students, to make them feel they’ve been given the skills not just to get rich but to wreck lives; they need to humble them, to make them realize that however good they might be at math, there’s still so much they don’t know” (shannonweb). This is such a powerful statement. We spend so much money on our educations to work our way through and get jobs, but often we are just skating through to get our degrees, not learning or putting in the time and effort we were supposed to. I spend a lot of time in class, due to changing my major so many times and needing to catch up. I also work an internship and play softball at a Division One level. Therefore, the amount of time I can work on schoolwork outside of the classroom is very limited. I will be the first person to admit that school falls close to last for me, but in my internship I have learned so much about healthcare, ethics, life lessons, and, just like in the article, much about technology, since everything I do is based on the computer. Walking into the first day of my internship, I had no fear that I was doing the right thing, that everything I had learned in class about Medicare and Medicaid was right there in front of me, and that I knew what I was talking about. About an hour into my internship, I was so overwhelmed. My supervisor was handing me terms that I should have studied but was too lazy to, and on top of that I had to learn a new computer system and what ‘my job’ consisted of. I wish that a professor would have scared me and forced me to face the outside world so that I hadn’t had to play catch-up the first month of my internship.
Ultimately, this links together the ethics of schoolwork and the workplace. I did not properly follow those ethics and had to relearn everything I should have been learning in class.
    Computers have changed our lives for the better, but repercussions follow shortly behind. It is important for us to realize that although computers are extremely helpful to our nation, there is more to individuals than what is shown behind their emojis and the screen of the computer, as discussed within the article. We must meet people, get to know them, and interact with them, ultimately building a rapport to ensure that they are being hired for the correct reasons and doing their jobs for the right reasons, not just the price tag at the end of the year. Computer science is growing because of demand, and that growth needs to be managed responsibly.

  14. Greg D'Ottavi September 7, 2017 at 2:52 pm #

    With technology on the rise and at the forefront of our society, it has become certain that some of the highest-paying jobs in the world today revolve around computers and knowledge thereof. After reading this article by Emma Pierson, it has become even clearer to me how technology is dictating our world, sometimes in a good way but possibly in a bad way as well. Pierson explores the field of computer science, stating, “Computer science is wondrous. The problem is that many people in Silicon Valley believe that it is all that matters.” Pierson goes on to make the claim that these advanced computer scientists, who essentially run various aspects of society, do not factor simple ethics into their innovations. The problem of ethics within the never-ending technological advancement of humanity is one that has been debated since the beginning. This is the focus of Pierson’s article, and reading it made me realize the impact computer science has and the importance of the ethical issues it presents.
    One of the key points Pierson brings up, which caught my attention, is the suggestion that universities offer broader training for computer science students. She explains that of the top eight computer science programs in the country, most do not require students to take a course on ethical or social issues. “Professors need to scare their students, to make them feel they’ve been given the skills not just to get rich but to wreck lives; they need to humble them, to make them realize that however good they might be at math, there’s still so much they don’t know” (Pierson). Someone who has enough knowledge to create the next big technological innovation, whether it is for societal use or not, should know its ethical effects as well. Pierson makes a great point and calls for a realization of the impact technology has. Technology has effects on multiple platforms, and the more people begin to realize that, the safer our society can be overall. She goes on to describe how and why large tech corporations should explore and monitor the social problems their products create. Companies such as Google and Microsoft create so many products that people use on a day-to-day basis, yet most people do not remotely understand how any of it works. Pierson suggests, “Make it easier for external researchers to evaluate the impacts of your products: be transparent about how your algorithms work and provide access to data under appropriate data use agreements.” The products and services these companies provide can be largely beneficial to society, but at the same time they can be rather frightening. The average person who consumes these products has no knowledge of their full capabilities, and in reality, no one except those who created them may know. With that said, there is a very large gray area, which raises questions about the morality and even the legality of these devices and services.
    Pierson wraps up her article by suggesting what I believe to be its most important point: begin hiring those outside the computer science bubble. The general public is not part of the computer science realm and therefore is the majority that struggles to understand technology’s capabilities. By bringing in those on the outside, computer scientists and tech companies may be able to limit the social and ethical issues their products create. In a world where technology continues to move forward at an exponential rate, it will be imperative that the public remain informed and safe. Technology is a great thing and has been at the forefront my entire life, but it is also a very scary thing that can certainly raise questions and cause problems in the future.

  15. Lucas Nieves-Violet September 7, 2017 at 4:14 pm #

    Reading this article made me reflect on the case of Edward Snowden. Snowden itself is a terrific movie, but I found the events that unfolded in May of 2013 to be particularly interesting and relevant as well. I believe that this exposure put a big dent in the U.S. government and its secrecy; not only that, but it also woke people up across the globe to the true potential computer science can have. The case of Edward Snowden is similar to the topic discussed in this article. Throughout his life, Snowden worked with both the CIA and the NSA. He built numerous spying algorithms, radars, viruses, and even cyber weapons powerful enough to shut down a country like Japan. While Snowden thought that all of his inventions and work would help the U.S. government in case of an attack, the government instead used them to spy on its own people and the rest of the world.
    This breach is the same one Emma Pierson talks about in her article. Computer science has the power to do incredible good, but it can also be invasive and dangerous. In Snowden’s case, it was used to spy on the country’s own people. This lack of privacy and control is exactly what Pierson talks about. She explains that young people studying computer science should also be taught an ethics class in school. Having the ability and power to create such instruments, and in some cases weapons, can have an enormous impact on society. It is important, but hard, for every computer science geek out there to understand what is right and wrong. In their view, what they do is for the safety of the country and their government. However, the government will not tell them what their servers or algorithms are being used for when the time comes. Governments like ours prosper on secrecy and try to hide as much as they can from their people.
    I believe that Edward Snowden’s act was right; he informed the people of what was going on in front of them and exposed the U.S. government as invasive and controlling. The Constitution states that we are to be free, but after Snowden’s actions we were able to see just how little the government trusted us. Now the question remains: is Snowden a traitor, or is he a hero to the people and civilians of the country? I think that to us he is indeed a sort of hero; we would never have known what the government was up to if it weren’t for him. With this information, we have new insight into what computers and technology are capable of. To the U.S. government, I have no doubt that he is considered a traitor; this was classified information that was meant to be kept from the U.S. public, and more importantly, the world.
    Emma Pierson argues that computer science majors should be made to realize that the algorithms they write will have an impact not only on society but on the world. Pierson writes: “Professors need to scare their students, to make them feel they’ve been given the skills not just to get rich but to wreck lives; they need to humble them, to make them realize that however good they might be at math, there’s still so much they don’t know.” This point is crucial; without understanding the power of their skills, computer scientists won’t be able to realize the magnitude of what they have created for others. I join Pierson again when she adds that computer scientists should be able to build platforms, but that no computer scientist alone should decide what happens with the product they created. While this may slow the process, Silicon Valley giants and kids coming out of college with those very skills need to communicate with non-computer scientists to decide together what kinds of information should be allowed and published to the public and society.

  16. Kunj Darji February 9, 2018 at 8:18 pm #

    The article claims that most computer scientists are deeply oblivious to humanities issues such as ethics or cultural context. I quite agree. However, I think the issue is a bit more complicated than this. People in computer systems are scientists. They employ the scientific method, organize their research projects in terms of hypotheses and experiments, and often deal with research problems that aim to understand the world.
    People in software engineering are, well, engineers. They rarely employ the scientific method, they organize their projects in terms of problem-solving and often deal with research issues that aim to change the world.
    Finally, people in information systems act as business specialists or sociologists with a strong IT background. They often employ ethnographic approaches and action research and organize their projects in terms of the observation of complex socio-technical phenomena that happen to be mediated by IT.
    Of course, this is a simplification, and the divide between the three communities is rarely as neat as I describe above. However, this simplification is very useful to understand the biases and expectations of anyone in one of the three communities. For example, I am a software engineer and, as such, I tend to formulate research issues in terms of problem-solving, and I tend to focus more on the creation of artifacts that solve problems rather than the pure generation of knowledge or the understanding of society.
    I also must say that the terminology that I am using is not always observed. For example, American and Canadian universities often employ the term “computer science” to refer to the whole of IT, which encompasses the three communities that I described above. This is unfortunate because anyone hearing or reading “computer science” would probably assume that this is about science, the scientific method, and the production of knowledge. However, “computer science” as used by Americans and Canadians is supposed to go beyond that to also include engineering and social issues of IT.
    Software engineers are trained to develop systems that solve meaningful problems by using minimum resources. They know how to make trade-offs between performance, quality, features, and cost. They can understand your requirements (because they have understood much more complex requirements in much more awkward situations) and present you with alternative ways of decomposing your problem into manageable chunks. They know how to work to a deadline and provide value from day one.
    Or they should. Of course, not every engineer is a good engineer. And I must admit that some computer scientists may have a good understanding of these issues. However, you are likely to get better results by recruiting engineers than scientists for your next digital humanities project.
    I often voice my concerns about the digital humanities not being truly trans-disciplinary and lacking a spirit of co-research. By this, I mean that most projects in the digital humanities employ IT as a service provider to address research issues in the humanities, which is what I have been assuming in this post. However, this doesn’t need to be the case. In fact, I often advocate a different flavor of digital humanities where IT and the humanities benefit from each other equally, and every project produces results that advance the state of the art in both fields. More often than not, this is a utopian dream. But when it happens, things may be different: in a project where IT issues as well as humanities ones are being addressed, computer scientists may genuinely be needed in addition to engineers.

Leave a Reply