Instagram testing user ability to reset content recommendations
Published: 14:27, 19 November 2024
Updated: 15:20, 19 November 2024
Instagram is testing the ability for users to reset the recommended content they see on the app and “start fresh” as part of efforts to ensure users have positive experiences on the platform.
The Meta-owned platform said the aim of the feature was to give users “new ways to shape their Instagram experience”.
Many online safety campaigners and social media critics have argued that recommendation algorithms on such platforms are a key factor in exposing users, particularly young people, to potentially harmful content, because they serve users content related to whatever the systems believe they may be interested in.
With this new tool, users will be able to reset their recommendations completely and clear all previously recommended content across the various Instagram feeds.
The company said user recommendations would then start to personalise again over time, showing users new content based on who and what they interact with.
“We want to make sure everyone on Instagram – especially teens – has safe, positive, age-appropriate experiences and feels the time they’re spending on Instagram is valuable,” an Instagram blog post said on the new tool.
“In addition to providing built-in protections from sensitive content with Teen Accounts, we want to give teens new ways to shape their Instagram experience, so it can continue to reflect their passions and interests as they evolve.
“That’s why we’ve started testing the ability for everyone on Instagram – including teens – to reset their recommendations.
“In just a few taps, you’ll be able to clear your recommended content across Explore, Reels and Feed and start fresh.”
Instagram confirmed that, while the tool is currently only being tested, it will “soon roll out globally”.
Ofcom, the new online safety regulator, said it welcomed the move from the tech giant ahead of the new safety duties that will be imposed on social media firms when the Online Safety Act comes into force.
“It’s good to see Instagram bringing these changes in before regulation starts to bite, and we’ll be pressing for companies to do more to protect and empower their users,” an Ofcom spokesperson said.
“When the UK’s online safety laws are fully in force, the largest sites and apps will have to give people more control over what they see.”
Andy Burrows, chief executive of suicide prevention charity the Molly Rose Foundation, said: “Our research shows how young people often feel trapped by harmful content recommended by platform algorithms, so this step to give them more control is a potentially positive move.
“While this can empower young people, it should be no replacement for further action to prevent harmful content being algorithmically suggested in the first place.
“We now need to see a step change in transparency from Meta so we can see for ourselves if this makes a genuine impact or is just another PR-driven move.”