January is sometimes referred to as "Divorce Month" in the legal world. That isn't to say it has the highest rates of divorce filings, but rather, it's the time when couples examine (or reexamine) divorce options. As the law firm Goldberg Jones writes, "In some instances, people put off acting on a decision they already made, waiting to get past the stress and chaos of the holidays… In other cases, holidays may be the proverbial straw that broke the camel's back. All the additional anxiety and activity put increased pressure on a marriage."
- In light of rising billing rates, clients seem to be changing the game
- Women pass a major milestone in the legal field
- Meta announces changes to how minors use its platforms
📱 SOCIAL MEDIA
Increasing evidence shows that teens who consume high levels of social media also exhibit higher rates of self-harm, mental health crises, eating disorders, and feelings of social isolation. The connection has put Meta under pressure to act, and the company faces multiple lawsuits and even bipartisan legislation aimed at changing how minors use its platforms.
Now, the company has announced new policies to further restrict what teens see and have access to across Meta's apps. “We will start to hide more types of content for teens on Instagram and Facebook, in line with expert guidance,” a company statement reads. “We’re automatically placing all teens into the most restrictive content control settings” and limiting search terms on Instagram. Beyond that, the company offered few details on how the new policy will work.
But how effective will these new changes be? "You do not need parental permission to sign up for a social media account," Jean Twenge, a psychology professor at San Diego State, told NPR. "You check a box saying that you're 13, or you choose a different birth year and, boom, you're on." It's an obvious flaw, and one that underscores the challenges Meta faces in trying to keep minors on its platforms safe. Some critics go further, accusing the tech giant of playing a cat-and-mouse game: gesturing at self-regulation while continuing to profit off its youngest users.
In 2021, company whistleblower Frances Haugen testified before a Senate Commerce subcommittee: “I saw that Facebook repeatedly encountered conflicts between its own profits and our safety.” She added that “Facebook consistently resolved those conflicts in favor of its own profits. The result has been a system that amplifies division, extremism and polarization — and undermining societies around the world.”
In October, attorneys general from 40 states sued Meta, accusing it of using "dopamine-manipulating" features to "exploit and manipulate" children. As NPR notes, Meta's main defense rests on Section 230, the 1996 federal law shielding websites from liability for user-posted content. That defense has met with mixed reactions in courts in recent years because of the sweeping protections the law provides tech firms.
"I think that it's a very close call as to whether Meta would succeed with a Section 230 defense," Jeff Kosseff, a cybersecurity law professor at the US Naval Academy, told NPR. "Courts are increasingly willing to conclude that Section 230 is not a defense in lawsuits arising from claims about product design, though the line is not always clear."
It feels like we've been here before: Meta says it will change its ways, and we just have to take the company at its word. But this time, the shifting cultural attitude toward Silicon Valley and Big Tech, backed by heavy-hitting lawsuits and legislation, may finally hold Meta to it.