Meta Is (Again) Choosing Profit Over Protecting Children
New reporting has revealed that Meta suppressed internal research showing risks to children in virtual reality spaces. As we’ve seen too many times before, tech companies are prioritizing growth and profit over children’s safety. Tech executives and lawmakers alike have been warned for years about the dangers children face online, yet for too long, lawmakers have failed to pass meaningful legislation and companies have failed to implement effective guardrails that proactively protect children. In this case, Meta actively suppressed research on potential harms to children in order to avoid regulation and protect its profits.
Read the news coverage of this story in The Washington Post or The Guardian.
With virtual reality (VR), we actually have the foresight that we lacked with social media. We know that VR platforms are being designed for social connection, and that more children will be entering immersive digital worlds in the years ahead. That doesn’t make VR inherently bad — but it does make accountability and safety urgent. If companies and lawmakers don’t take action now, children will once again pay the price for our delay.
The questions before us are clear: What kind of virtual worlds are being built? What protections will exist for children? What consequences will there be for tech companies that turn a blind eye and for people who exploit kids? And how will lawmakers put limits and accountability in place?
At Love146, we believe that waiting for our children to be harmed before acting is not an option. Now is the time to act: we must continue to scrutinize tech companies and pressure both companies and lawmakers to protect our children.
Because let’s also be clear: The solution is not to keep putting profit over children’s safety and then take the “easy road” of blaming parents for the gaps and dangers that will proliferate.