The FDA should regulate Instagram’s algorithm as a drug – TechCrunch

The Wall Street Journal on Tuesday reported Silicon Valley’s worst-kept secret: Instagram harms teens’ mental health. In fact, its effects can be so negative that it introduces suicidal thoughts.

Thirty-two percent of teenage girls who feel bad about their bodies report that Instagram makes them feel worse. Among teens with suicidal thoughts, 13% of British users and 6% of American users trace those thoughts to Instagram, according to the WSJ report. And this is Facebook’s own internal data. The truth is surely worse.

President Theodore Roosevelt and Congress formed the Food and Drug Administration in 1906 because Big Food and Big Pharma proved incapable of protecting the general welfare. As its executives parade at the Met Gala in celebration of the lifestyles and bodies of the 0.01% that we mere mortals will never attain, Instagram’s unwillingness to do what is right is a clarion call for regulation: The FDA should assert its codified right to regulate the algorithm powering the drug of Instagram.

The FDA should consider algorithms a drug impacting our nation’s mental health: The Federal Food, Drug, and Cosmetic Act gives the FDA the right to regulate drugs, defining drugs in part as “articles (other than food) intended to affect the structure or any function of the body of man or other animals.” Instagram’s internal data shows that its technology is an article that alters our brains. If this effort fails, Congress and President Joe Biden should create a mental health FDA.

The public needs to understand what Facebook and Instagram’s algorithms prioritize. Our government is equipped to run clinical trials on products that may physically harm the public. Researchers can likewise study what content these companies privilege and how those decisions affect our minds. How do we know this? Because Facebook is already doing the research – it’s just burying the results.

In “An Ugly Truth,” Sheera Frenkel and Cecilia Kang report that in November 2020, in response to the election crisis, Facebook modified its News Feed to place more emphasis on “News Ecosystem Quality” scores (NEQs). High-NEQ sources were trustworthy; low-NEQ sources were not. Facebook altered the algorithm to privilege content with high NEQ scores. As a result, for five days around the election, users saw a “nicer News Feed” with less fake news and fewer conspiracy theories. But Mark Zuckerberg reversed the change because it reduced engagement and risked a backlash. The public suffered for it.

Facebook has similarly studied what happens when the algorithm privileges content that is “good for the world” over content that is “bad for the world.” Lo and behold, engagement decreases. Facebook knows that its algorithm has a significant impact on the minds of the American public. How can the government permit one man to set the standard based on the needs of his business rather than the general welfare?

Upton Sinclair memorably exposed dangerous abuses in “The Jungle,” causing a public uproar. The free market had failed. Consumers needed protection. The Pure Food and Drug Act of 1906 established safety standards for the first time, regulating consumer goods that affect our physical health. Today, we need to regulate the algorithms that affect our mental health. Teen depression has risen alarmingly since 2007. Likewise, the suicide rate among those aged 10 to 24 rose nearly 60% between 2007 and 2018.

It is impossible to prove that social media alone is responsible for this increase, but it is absurd to argue that it has not contributed. Filter bubbles distort our views and make them more extreme. Online bullying is easier and more persistent. Regulators should audit the algorithm and interrogate Facebook’s choices.

When it comes to Facebook’s biggest problem – what its product does to us – regulators have struggled to articulate the issue. Section 230 is righteous in its intent and application; the internet cannot function if platforms are liable for every user utterance. And a private company like Facebook loses the trust of its community if it imposes arbitrary rules that target users based on their background or political beliefs. Facebook as a company has no explicit obligation to uphold the First Amendment, but public perception of its fairness is essential to the brand.

Thus, Zuckerberg equivocated for years before imposing belated bans on Holocaust deniers, Donald Trump, anti-vaccine activists and other bad actors. In deciding which speech is privileged or permitted on its platform, Facebook will always be too slow, too cautious and ineffective in its response. Zuckerberg cares only about engagement and growth. Our hearts and minds are caught in the balance.

The most frightening part of “An Ugly Truth,” the part that got everyone in Silicon Valley talking, was the eponymous memo: Andrew “Boz” Bosworth’s 2016 “The Ugly.”

In the memo, Bosworth, Zuckerberg’s longtime deputy, writes:

“So we connect more people. That can be bad if they make it negative. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good.”

Zuckerberg and Sheryl Sandberg made Bosworth walk back his statements when employees objected, but to outsiders the memo represents Facebook’s unvarnished id, the ugly truth. Facebook’s monopoly, its stranglehold on our social and political fabric, its mantra of “connection” at all costs – none of it is de facto good. As Bosworth admits, Facebook causes suicides and allows terrorists to organize. This much power, concentrated in a corporation run by one man, is a threat to our democracy and our way of life.

Critics of FDA regulation of social media will claim it is a Big Brother invasion of our personal liberties. But what is the alternative? Why would it be wrong for our government to demand that Facebook account for its internal calculations? Are sessions, time spent and revenue growth the only metrics that matter? What about the collective mental health of the country and the world?

Refusing to study a problem does not mean it does not exist. In the absence of action, we are left to decide for ourselves how much a “connection” is worth. That decision should not rest with Zuckerberg. The FDA should decide.
