FACEBOOK has been accused of conducting another bizarre experiment with its users' trust, and it could completely change your view on the hugely popular social network.

Facebook is the most popular social network in the world, with more than 1.5 billion monthly active users.
The US social network, which was launched by CEO Mark Zuckerberg in 2004, is now valued at a staggering £197 billion.
But details of an experiment to test social media users' loyalty to Facebook have now emerged online.
The Californian social network is believed to be preparing for the eventuality that Google one day removes Facebook's apps from its Play Store marketplace for competitive reasons.
As a result, Facebook tried to test the loyalty and patience of its Android users to the limit.
The US firm secretly rolled out a slew of artificial errors within the Android app that would automatically crash the mobile app for hours at a time.
The experiment was designed to test at what point a Facebook user would give up and ditch the Facebook app from their device altogether.
Speaking anonymously to The Information, a source familiar with the one-time test, which is believed to have taken place a few years ago, said Facebook was never able to reach this threshold.
"People never stopped coming back," the source said.
Facebook wanted to see whether users would abandon the social network or simply switch to the far-inferior mobile website while their Android app was artificially broken.
Former Facebook data scientist JJ Maxwell defended the move, saying tests like these are "hugely valuable" to the company and "their prerogative," The Verge reports.
Admittedly, Facebook is not alone – many technology firms quietly test new features on users. Google famously cycled between 41 different shades of blue on its homepage, to see which prompted the best response from its users.
But tweaking a shade of blue is very different to testing the loyalty of your users by deliberately crashing their access to the service.
Especially when you state your company mission is to "connect the world" and you have a feature – dubbed Safety Check – to allow users to log in and signal to one another that they are safe in a time of disaster. It's critical to ensure people can stay connected.

The latest revelation follows the controversial 2014 experiment which manipulated users' emotions using the Facebook News Feed.
Devised by the social network's in-house data scientists, the experiment tweaked the News Feeds of hundreds of thousands of users.
Some were shown an onslaught of upsetting or negative posts, while others were given a barrage of positive posts.
A number of critics highlighted the potential dangers of this type of manipulation, following the publication of two separate studies from the University of Houston which linked Facebook to depression.
Entitled "Seeing Everyone Else's Highlight Reels: How Facebook Usage is Linked to Depressive Symptoms," the study provided evidence that Facebook users felt depressed when comparing themselves to others.


But Facebook data scientist Adam Kramer, a co-author of the 2014 study, said: "The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product.
"We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. 
"At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."
The type of data manipulation used during this controversial experiment is completely sanctioned by Facebook's Terms of Use.
The news comes after Facebook changed its News Feed algorithm to account for the amount of time you spend reading posts, statuses, comments or browsing a friend's photos.
As a result, if you linger on a particular person's status, or read through the comments under a certain kind of video – Facebook will populate your timeline with similar content.
The changes mean you no longer have to comment or hit the Like button for Facebook to begin to learn who you are interested in hearing from online.
However, monitoring how long you spend reading an ex-partner's statuses, or lingering over their photos, could easily be perceived as a little creepy.
But it's not all bad news: Facebook CEO Mark Zuckerberg last month announced that he and his wife Priscilla Chan would donate 99 per cent of their Facebook shares – currently valued at a staggering $45 billion, some £30 billion – to "advancing human potential" and "promoting equality".
The hugely successful couple, who recently welcomed their first child Max, will donate the majority of their wealth over the course of their lives.
  • NewsNewsBlog.blogspot.com has reached out to Facebook for comment on this story
