Role of libraries in a social world

When it comes to sharing news, social media drives an increasingly large part of our news consumption, with 62 per cent of adults in America getting some of their news from social media. Of those platforms, Facebook is by far the largest player, and it has consistently resisted calls to make the kind of editorial decisions that newspapers and other media outlets traditionally made (Segreti, 2016). Yet the same algorithms that power the rest of Facebook have many editorial implications embedded within them (Lee, 2016). These algorithms are designed to do certain things and to favor sharing certain kinds of content, with a strong incentive toward encouraging engagement. Facebook wants users to spend as much time as possible on the platform, and everything about it is designed to encourage people to engage with the system. The end result is that, while most users will see only a selected portion of their friends’ posts, the content that Facebook favors and promotes is content that is attention-grabbing – regardless of its accuracy.
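
To make the editorial implications concrete, consider a deliberately simplified sketch of an engagement-driven ranker. Everything here is hypothetical – the field names, the weights and the feed size are invented for illustration and do not describe Facebook’s actual system – but it captures the basic logic: posts are ordered by predicted engagement, and accuracy never enters the calculation.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    predicted_clicks: float    # model's estimate of click-through
    predicted_comments: float  # model's estimate of comment volume
    predicted_shares: float    # model's estimate of re-shares
    is_accurate: bool          # known to a fact-checker, not to the ranker

def engagement_score(post: Post) -> float:
    # Hypothetical weights: reactions that keep users on the platform
    # (comments, shares) count for more than passive clicks.
    return (1.0 * post.predicted_clicks
            + 3.0 * post.predicted_comments
            + 5.0 * post.predicted_shares)

def build_feed(posts: list[Post], limit: int = 20) -> list[Post]:
    # Users see only a selected portion of their friends' posts:
    # the ones predicted to generate the most engagement.
    # Note that is_accurate is never consulted.
    return sorted(posts, key=engagement_score, reverse=True)[:limit]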

It is easy to see why. Social media posts are tied up with an individual’s emotional experience as well as the user’s personal identity. Sharing content online is in part a means of maintaining and expressing an individual’s identity (Blommaert and Varis, 2017). What this filter means can vary widely, as people have many different identities. An esoteric article about foreign relations may be of interest to me (or of particular interest to someone for whom that issue is a key part of their identity), but if that interest is not part of the identity I want to display to my social group on Facebook, I am unlikely to post it or comment upon it – even though a newspaper may feel it is important to put that same story on its front page for any number of reasons. Similarly, I am even less likely to share a complicated article that challenges me, or an article with great content but a poorly worded title.

If the original post is generated in part because of personal identity, what others react to (and thus what social media promotes) is often driven by both identity and whatever elicits a strong emotional reaction (Libert, 2014). That reaction can be positive (think of a cute picture of a kitten) or negative (think of an article about someone hurting kittens); the important thing is that it makes many people want to comment on it and let the world know what they think about it – ideally, relatively quickly. Viral content like the lists Buzzfeed gained fame for was designed and tested to accomplish exactly this task by packaging material as easily shareable lists of items. An article about the Pope endorsing Trump has a number of features that tap into this paradigm. It touches several identities (e.g. religious affiliation, political affiliation) along with raw emotional engagement (e.g. shock, bemusement, excitement, outrage). Shares intended to mock the story or to express outrage over it were just as valuable to the social media companies and the host site as shares from people who believed it.

Content that prompts people to engage with the site is a large part of what makes social media platforms valuable. Importantly, technology companies are legitimately reluctant to engage in obvious censorship. However, these limitations take on a different set of implications when the content is not personal photos or cute animals but facts and news. Moreover, this shift is not happenstance – Facebook has moved aggressively to incorporate news into its site, launching programs such as the trending stories feature and an Instant Articles program in partnership with key trusted news sources. Given social media’s evolving role in the larger information ecosystem, as well as these active efforts to incorporate news content, the policies of large social media companies will continue to have practical implications for news consumers. The result is that they cannot absolve themselves entirely of any obligation to make editorial decisions (or escape the fact that they have already, if inadvertently, made them), even if that endangers their ability to be seen as neutral, algorithm-driven technology companies.

In the aftermath of the recent coverage of this topic, many technology companies have acknowledged the need to change how they operate. Facebook is experimenting with ways to remove known propagators of fake news and is offering users ways to flag stories as problematic. It is also changing how the platform itself operates to discourage fake news (e.g. removing the ability to edit link previews, adding a “disputed” label and minimizing the impact of high-frequency posters, which have been associated with fake news stories).
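
Building on the hypothetical ranker sketched earlier, these countermeasures can be pictured as adjustments to the same scoring logic. The penalty factors and the posting-frequency threshold below are invented for illustration, not taken from Facebook’s implementation; the point is that demotion, rather than outright removal, is the typical lever.

def adjusted_score(post: Post,
                   disputed: bool,
                   author_posts_per_day: float) -> float:
    # Start from the raw predicted engagement of the earlier sketch.
    score = engagement_score(post)
    # A "disputed" flag from fact-checking partners demotes the post
    # rather than removing it outright (illustrative penalty).
    if disputed:
        score *= 0.1
    # High-frequency posters, associated with fake news operations,
    # are down-weighted past an illustrative threshold.
    if author_posts_per_day > 50:
        score *= 0.5
    return score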
