I can remember when Facebook was still "The Facebook," before its doors were open to anyone who wanted to join, and before you wanted to put your head through a wall because of never-ending FarmVille and Candy Crush requests. Not so long ago, Facebook served the simple purpose of connecting you with the people you went to school with and maybe a few friends.
Since then, Facebook has grown and evolved, though not for the better. Now, it bombards users with unnecessary ads for things that no one really wants and useless sponsored stories that look more like spyware than anything anyone wants to read. Oh, and it experiments on its users.
In spite of the criticism it rightfully gets, Facebook still manages to connect its more than 1 billion users.
But the revelations last week that it conducted a social experiment on users by manipulating their feeds should hasten its decline, putting it on course to follow the likes of MySpace, Friendster and similar social sites.
This coming decline is no secret at Facebook. The company is clearly worried that it may be losing its influence with younger users. Earlier this year, Princeton University released a study (PDF) that concluded, "Facebook will undergo a rapid decline in the coming years, losing 80% of its peak user base between 2015 and 2017." And yet the company doesn't seem to truly understand what's brought it to this point.
It's not just the experiment -- which the company apologized for -- that's causing users to flee.
Although Facebook has attempted to curb illegal activity on its site, criminals continue to penetrate it. Both Fox News and the Daily Mail have reported on a number of crimes committed on Facebook, from pedophilia and the dissemination of child pornography to malware attacks on users through shady backdoor links. (Facebook has at least partnered with internet security firm McAfee to resolve malware threats.)
Because Facebook does not verify the profiles of everyday people, literally anyone can create a profile -- including those with malicious intent, those in prison and even members of the world's oldest profession. Yes, prostitutes can advertise their services without standing on a street corner. Facebook paid no attention to these profiles until it was notified of their existence.
Privacy? What privacy?
Privacy concerns have been one of the biggest thorns in the company's side, and they have the potential to become the main reason Facebook may eventually cease to exist. A study from the University of Vienna (PDF) looked at people who decided to leave Facebook and found that almost half (48%) said they committed "virtual identity suicide" because of privacy concerns.
As a Facebook user myself, I've thought about shutting down my account multiple times, and with every new breach of trust Facebook puts me through, doing so becomes more likely. Facebook has consistently used its own Terms and Conditions to justify using your profile pictures for advertising content and claiming full rights to any pictures you post. The company has always kept security settings buried deep in the user interface, a major issue that has left many users angry and pushed some to simply pick up and leave.
In January 2012, Facebook ran a sociological experiment as part of a joint venture between Cornell University and the company's own Core Data Science Team. The experiment, which involved 689,003 users and more than three million status updates, was designed to determine whether the emotional content of those updates could influence Facebook friends' emotional states. What it did instead was demolish any sense of trust among users.
Cornell has since released a statement putting some distance between it and Facebook, directing any ethics questions to Facebook itself. "Professor [Jeffrey] Hancock and Dr. [Jamie] Guillory [now at the University of California -- San Francisco] did not participate in data collection and did not have access to user data. Their work was limited to initial discussions, analyzing the research results and working with colleagues from Facebook to prepare the peer-reviewed paper."
Adam Kramer, the Facebook researcher who worked on the study, posted his own statement -- on Facebook, of course -- to try to reassure users that the company's internal self-policing ensured that the study was ethical.
"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product," Kramer wrote. "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook.
"...I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused," Kramer said in his post. "In hindsight, the research benefits of the paper may not have justified all of this anxiety."
Now, let's be real: studies like this can be useful, given that social media is a major part of our day-to-day lives. But manipulating people's emotions in any way can be extremely dangerous, and the possibility of hurting someone, no matter how slightly, is too high a price to pay.
Facebook's senior leadership -- in particular, those involved with the study -- apparently slept through their ethics courses in college. A company technically has the right to hide behind its Terms and Conditions as much as it wants, but that doesn't make dubious actions right. In fact, it just reinforces the growing belief that Facebook acts according to its own rules, users be damned. And younger users -- the very ones it needs to stay alive -- are the ones who feel this the most.
It's not too late to keep those and other users on board. A Forrester study showed that 57% of surveyed users between the ages of 12 and 17 said Facebook is their most often-used social site, and nearly half of 12- and 13-year-olds said they now use Facebook more than they did a year ago.
Whether Facebook will continue to be a powerhouse by 2017 or will by then be on its way down, as Princeton predicted, remains to be seen. What is certain is that a sizable number of people will continue to leave as more privacy issues arise. Clearly, Facebook hasn't learned from previous mistakes and the backlash from the public. It is a company that does not care about its users; otherwise, it would respect them more by creating a profile verification system that legitimizes user accounts, giving users the ability to opt out of any "studies" and doing a better job of protecting them from criminals.
The lack of respect it's shown of late, if left unchecked, will be the company's downfall.
Alex Burinskiy is a technical analyst at IDG, Computerworld's parent company. He was previously an Apple Store genius for four years and has worked with a range of IT systems, from personal to enterprise, for nine years. You can find him on Twitter (@aburinskiy).