New York: If your Facebook News Feed is devoid of variety and diverse views, most of the blame lies with you and not with the Facebook algorithm, says a new study by data scientists at Facebook. The researchers analysed the accounts of 10 million users over six months to reach the conclusion that the so-called "echo chamber" isn't as impermeable as it was thought to be.
The study found that liberals and conservatives are regularly exposed to at least some content that doesn't conform to their political or religious views, and that almost 29 percent of the stories displayed by Facebook's News Feed present views that conflict with an individual's ideology.
"You would think that if there was an echo chamber, you would not be exposed to any conflicting information, but that's not the case here," Eytan Bakshy, a data scientist at Facebook who led the study, was quoted as saying by The New York Times. The researchers found that individuals' choices about which stories to click on had a larger effect than Facebook's filtering mechanism in determining whether people encountered news that conflicted with their professed ideology.
Facebook's algorithm serves users stories based in part on the content they have clicked on in the past. The researchers found that people's networks of friends, and the stories they see, are skewed toward their ideological preferences. But that effect is more limited than the worst case some theorists had predicted, in which people would see almost no information from the other side.
On average, about 23 percent of users' friends are of an opposing political affiliation, according to the study. However, some observers argued that the Facebook study was flawed because of sampling problems and interpretation issues.
The study appeared in the journal Science.
© 2024 Hyderabad Media House Limited/The Hans India. All rights reserved. Powered by hocalwire.com