5/31/2023
Twitter 4.3.2

Online social networks (OSNs) have lowered barriers to publication, allowing ever larger numbers of people to share information online and engage in public speech. The downside of unfettered content production is the over-abundance of information, which creates information overload. To mitigate information overload, OSNs curate information for their users, recommending a smaller, more tractable subset of the content shared by their friends. For instance, OSNs like Twitter, Facebook, Instagram, and LinkedIn all create a personalized “social feed” (also known as a timeline or newsfeed) from the content generated by the people the users “follow”. The social feed mediates users’ exposure to information deemed sufficiently relevant or interesting by the platform’s proprietary curation algorithm. By surfacing the messages the curation algorithm predicts users will find most interesting, the platforms hope to reduce information overload and improve user experience.

While it is useful for mitigating information overload, the algorithmic curation of content also has downsides. Algorithms may trap users within “filter bubbles” by presenting them with content from like-minded people, limiting the diversity of information to which they are exposed and amplifying selective exposure to information. Moreover, existing cognitive biases, including social influence and position bias, coupled with algorithmic recommendations, can amplify online trends, creating “irrational herding” and distorting perceptions of the underlying value of content.

Despite the growing importance of social media in news consumption, the role of algorithmic curation of content by OSNs has been only partially explored, and the findings are conflicting. Prior research has shown that algorithmic recommendations combine with individual decisions to alter the information ecosystem. For example, the introduction of the “who to follow” friend recommendation on Twitter disproportionately accelerated the growth of already-popular accounts, and Google search results for political queries differ significantly based on users’ previous browsing history. News stories automatically curated by Apple News come from less diverse sources than the human-curated editorial picks. An experimental study using bots showed how Twitter’s environment can steer nonpartisan accounts into partisan echo chambers. However, another study has shown that newsfeed curation can also counteract the self-reinforcing effects of ideological homophily and expose individuals to more diverse viewpoints.

Quantifying the algorithmic biases of social media curation and recommendation systems is limited by many empirical challenges. We can only observe people’s explicit behaviors, e.g., likes or shares of news. Retrieving the information as seen by a user in their social feed cannot normally be done through the API provided by the platform. At best, researchers reconstruct users’ timelines by collecting, through an API, the messages shared by all the accounts they follow. Similarly, confounding factors, such as homophily of preferences and social influence, make the effects of algorithmic recommendations hard to estimate and disentangle.

Compared to observational studies, randomized field experiments offer a unique opportunity to address these empirical challenges. In this paper we describe a methodology for auditing a real-world algorithmic curation system, using synthetic accounts to run a field experiment. Note that our approach is similar to the “sock-puppet audit” described in Sandvig et al., 2014. We use our methodology to audit Twitter, a popular OSN where many people consume and discuss news. We created matched pairs of user accounts, which we refer to as audit bots, or bots for short, both following the same set of accounts, i.e., friends. One account within the pair selects to see personalized content, while the other chooses to see the timeline in reverse chronological order, i.e., without explicit personalization.
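The matched-pair audit design described above pairs two bots that follow the identical set of friends and differ only in the timeline setting, so any divergence between their feeds can be attributed to algorithmic curation rather than to differences in the underlying content stream. The following is a minimal illustrative sketch of that design; all class, function, and account names are hypothetical, not taken from the paper's actual tooling.

```python
# Hypothetical sketch of the matched-pair ("sock-puppet") audit design.
# Names and data here are illustrative only.
from dataclasses import dataclass, field

@dataclass
class AuditBot:
    handle: str
    personalized: bool            # True: algorithmic timeline; False: reverse-chronological
    friends: frozenset            # accounts the bot follows
    timeline: list = field(default_factory=list)  # tweet IDs observed in the feed

def make_matched_pair(pair_id: int, friends: frozenset) -> tuple:
    """Create two bots that differ ONLY in the timeline setting.

    Because both bots follow the same friend set, differences in what they
    see can be attributed to algorithmic curation."""
    treatment = AuditBot(f"bot{pair_id}_personalized", True, friends)
    control = AuditBot(f"bot{pair_id}_chronological", False, friends)
    return treatment, control

def curation_effect(treatment: AuditBot, control: AuditBot) -> set:
    """Symmetric difference of the two observed feeds: tweets surfaced to
    the personalized bot but absent from the chronological baseline, and
    vice versa."""
    return set(treatment.timeline) ^ set(control.timeline)

# Example: one pair following the same three (hypothetical) news accounts.
friends = frozenset({"@news_a", "@news_b", "@news_c"})
t, c = make_matched_pair(1, friends)
t.timeline = ["t1", "t3", "t5"]   # what the algorithm chose to show
c.timeline = ["t1", "t2", "t3"]   # everything from friends, newest first
print(sorted(curation_effect(t, c)))  # → ['t2', 't5']
```

In a real audit, the `timeline` lists would be populated by periodically logging each bot's feed; comparing matched pairs then isolates the curation algorithm's contribution, which is the point of the randomized design.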