WASHINGTON (Reuters) - A Facebook Inc executive said on Sunday that the company would introduce new measures on its apps to steer teens away from harmful content, as U.S. lawmakers scrutinize how Facebook and subsidiaries like Instagram affect young people's mental health.
Nick Clegg, Facebook's vice president of global affairs, also expressed openness to the idea of letting regulators access the Facebook algorithms used to amplify content. But Clegg said he could not say whether those algorithms amplified the voices of people who attacked the U.S. Capitol on Jan. 6.