Face it: Facebook is not a social networking site.
Maybe it started out that way. But now Facebook is truly a data-collecting machine. What data is it collecting? Human data – the personalities, preferences, and lives of 500 million people all over the world. It is the largest-scale anthropological study ever undertaken, a Panopticon worthy of Orwell's Oceania.
Dr. Anthony Beavers from the University of Evansville explored this more sinister side of Facebook in his talk at CPSU yesterday. He traced his own story, from the original allure of social networking, to his "Facebook addiction," to how he came to understand Facebook's dark side and eventually quit the site.
In his lecture, Beavers addressed several ethical issues raised by Facebook and emerging technology, and I would like to discuss them here.
The true “privacy” problem
A public concern today is lack of privacy. People are worried that everyone from creepy internet stalkers to CIA officials can spy on them through their Facebook pages. However, this is not the true privacy invasion. The only entity with complete access to your Facebook data is Facebook. Their computers record and analyze every Like you make, every Friend you fake, every Relationship Status you break.
How Facebook uses your information
The data on your page, as well as all your recorded activity on the website, is analyzed using computer algorithms. As a result, Facebook may know you better than you know yourself. They can find out what movies you like based on comparison data from 500 million other users. They can guess from your age, gender, geography, and Groups joined whether you will vote Democrat or Republican. They know how you think based on the kind of Friends you collect. This information can be used to sell exposure space to advertisers. It can also be used to influence you directly, because computer algorithms determine your news feed and Friend suggestions.
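To make the "comparison data" idea concrete, here is a deliberately tiny sketch of how taste can be guessed from similar users' Likes. Everything in it – the names, the movies, the overlap measure – is invented for illustration; Facebook's real systems are obviously far more sophisticated.

```python
# Hypothetical sketch: guessing a user's taste from similar users' Likes.
# All data here is invented; this only illustrates the general idea.

def similarity(a, b):
    """Fraction of overlap between two sets of Likes (Jaccard index)."""
    return len(a & b) / len(a | b) if a | b else 0.0

profiles = {
    "alice": {"Inception", "The Matrix", "Interstellar"},
    "bob":   {"Inception", "The Matrix", "Blade Runner"},
    "carol": {"The Notebook", "Titanic"},
}

def predict_likes(user, profiles):
    """Score items Liked by the user's most similar peers."""
    mine = profiles[user]
    scores = {}
    for other, likes in profiles.items():
        if other == user:
            continue
        s = similarity(mine, likes)
        for item in likes - mine:
            scores[item] = scores.get(item, 0.0) + s
    return sorted(scores, key=scores.get, reverse=True)

print(predict_likes("alice", profiles))
```

Since alice's Likes overlap heavily with bob's and not at all with carol's, the sketch ranks "Blade Runner" – bob's one movie alice hasn't Liked – at the top. With 500 million profiles instead of three, the same principle becomes eerily accurate.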
Why we should be worried
The data that Facebook collects is an information bank of human behavior. This gives the company enormous socio-political power. After running your profile through an algorithm, Facebook can allow advertisers to target you efficiently. The worry isn't consumer product advertising, but political campaign advertising. Facebook can guess whether you are wavering on a political issue, in which case they can sell political ads targeting you at a premium. This intensifies the corporate and economic bias of elections. Facebook can potentially affect public policy and opinion; Beavers attributes the extreme polarization of American politics in the past few elections to social networking. In short, filling out a Facebook profile gives Facebook (a company wielding the data of 500 million people) even more power.
What else is wrong with Facebook’s “social networking” model
The other social problem with Facebook is called "information siloing." To silo means to store in an isolated place. On Facebook, the user is siloed from most information on the website. Imagine having to sort through 500 million profiles every day to see what your friends were up to – hopelessly inefficient. To fix this, Facebook's algorithms figure out what you would prefer to see on your page and news feed, using your previous activity as data. As a result, you are siloed off from potential new experiences. Although this does happen in real life with real friends, Facebook amplifies the effects of information siloing. The sheer volume of accessible information online demands narrowed attention and actually shields you from diversity. It's not social networking: it's antisocial networking.
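The siloing mechanism can be caricatured in a few lines. This is an oversimplified sketch with invented topics and click counts, not a description of Facebook's actual ranking, but it shows how filtering by past engagement quietly hides whole categories of content:

```python
# Oversimplified sketch of "information siloing": a feed ranked purely by
# past engagement. Topics the user never clicked sink to the bottom and
# effectively disappear. All data here is invented.

past_clicks = {"politics": 12, "sports": 0, "science": 3, "art": 0}

stories = [
    ("politics", "Election roundup"),
    ("science", "New exoplanet found"),
    ("art", "Local gallery opening"),
    ("sports", "Championship recap"),
]

# Rank stories by how often the user engaged with the topic before...
feed = sorted(stories, key=lambda s: past_clicks.get(s[0], 0), reverse=True)

# ...and show only the top of the feed, so unfamiliar topics never surface.
visible = feed[:2]
print(visible)
```

The user who never clicked on art or sports will never be shown art or sports again, which is exactly the narrowing effect Beavers warns about.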
What we can do to solve these ethical issues
As of now, not much. There is no legislation to regulate Facebook and its corporate activities. Far too many people are "addicted" to Facebook, and internet-mediated interaction is only becoming more entrenched with time. The first step would be to recognize that Facebook is unethical in many ways and has a negative influence on the very people it is supposed to be serving.
Why quitting Facebook is not a solution
The miracle of Facebook is that, even if you quit, they have already collected tons of data about you. They can use data projection to predict your future behavior based on your behavior today. So once you join Facebook, quitting is ineffective. Even if you never join, they still have 500 million members – a huge number of statistical subjects, backed by seven years of solid data. Beavers told us that, after quitting Facebook last year, he recently rejoined. I believe this is his justification: if Facebook already knows everything about him, he might as well stay on and watch its development from within.
The future as Beavers sees it is dim. If we cannot stop the unethical aspects of social networking, then what is in store for humanity? Will we all become antisocial internet addicts at the mercy of advertising and information siloing, the embodiment of 1984? Beavers ended with optimism; he envisions a "Star Trek-like world" with an entirely new economic and legal system that better protects its people from corporate and political corruption. But until then, it appears that we have a kind of Big Brother watching over us.
Thanks to Cal Poly, Dr. Beavers, and all my fellow non-members of Facebook.
Cal Poly Website http://cla.calpoly.edu/cla_ethicsoffacebook.html
Dr. Beavers Bio http://faculty.evansville.edu/tb2/