
Instagram should be regulated by the FDA as a drug

The Wall Street Journal reported Tuesday that Instagram is a danger to teens’ mental health. In fact, it can even lead to suicidal thoughts.

Three-quarters of teenage girls who feel unhappy about their bodies say Instagram makes them feel worse. According to the WSJ, among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the urge back to Instagram. These figures come from Facebook’s own internal data. And it gets worse.

In 1906, President Theodore Roosevelt and Congress created the Food and Drug Administration because Big Food and Big Pharma had failed to protect the public welfare. Today, Instagram’s executives march to the Met Gala to celebrate the unattainable 0.01% whose lives and bodies are beyond our reach. This is why regulation is necessary: the FDA should assert its codified authority to regulate Instagram’s algorithm.

The FDA should declare the algorithm a drug. The Federal Food, Drug and Cosmetic Act gives the agency authority to regulate drugs, and Instagram’s own internal data shows that its technology alters how our brains work. If that authority proves too thin, Congress should create a mental-health FDA.

The public deserves to know what Instagram and Facebook actually prioritize. The government already requires clinical trials for products that could physically harm people; researchers should likewise be able to study Facebook’s priorities and their impact on our minds. That is how we will find out. Facebook already runs these studies; it just hides the results.
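A rough sketch of what such a “clinical trial for an algorithm” might look like, assuming a simple two-arm experiment with an invented self-reported well-being score; none of the numbers, names, or methodology below come from Facebook.

```python
# Hypothetical two-arm comparison: users who got the standard feed vs. users
# who got an adjusted one, measured on a made-up well-being survey (1-10).
from statistics import mean, stdev
from math import sqrt

control = [5.8, 6.1, 5.5, 6.0, 5.9, 5.7, 6.2, 5.6]    # standard engagement-ranked feed
treatment = [6.4, 6.6, 6.1, 6.8, 6.3, 6.5, 6.2, 6.7]  # adjusted feed variant

def welch_t(a, b):
    """Welch's t statistic for the difference in means between two samples."""
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    return (mean(b) - mean(a)) / sqrt(va + vb)

print(f"mean change in well-being: {mean(treatment) - mean(control):+.2f}")
print(f"Welch t statistic: {welch_t(control, treatment):.2f}")
```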

As Sheera Frenkel and Cecilia Kang report in “An Ugly Truth,” Facebook made an emergency change to its News Feed five days after the 2020 election, placing more emphasis than ever on “News Ecosystem Quality” (NEQ) scores, an internal measure of how trustworthy a source is. Boosting high-NEQ sources produced what employees called a “nicer News Feed,” with less fake news and fewer conspiracy theories. Mark Zuckerberg rolled the change back because it lowered engagement and risked a conservative backlash. The public paid for that mistake.
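To make the mechanics concrete, here is a toy illustration of how a single weight on a publisher-quality signal can change what a feed surfaces. Facebook’s real ranking system is not public; every name, score, and number below (the Post class, rank_feed, neq_weight) is invented for the sketch.

```python
# Illustrative only: blend predicted engagement with a publisher-quality score
# (called "NEQ" here, after the reported News Ecosystem Quality signal).
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    engagement_score: float  # predicted clicks/likes/shares, 0..1
    publisher_neq: float     # trustworthiness of the source, 0..1

def rank_feed(posts: list[Post], neq_weight: float) -> list[Post]:
    """Order posts by a blend of engagement and publisher quality.

    neq_weight = 0.0 ranks purely on engagement; raising it pushes
    high-quality (high-NEQ) sources up the feed.
    """
    def score(p: Post) -> float:
        return (1 - neq_weight) * p.engagement_score + neq_weight * p.publisher_neq
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("Outrage-bait conspiracy post", engagement_score=0.9, publisher_neq=0.1),
    Post("Careful mainstream report", engagement_score=0.4, publisher_neq=0.9),
]

# Engagement-first ranking: the conspiracy post wins.
print([p.title for p in rank_feed(posts, neq_weight=0.0)])
# "Nicer News Feed": a higher NEQ weight puts the trustworthy source on top.
print([p.title for p in rank_feed(posts, neq_weight=0.6)])
```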

Facebook has also studied what happens when the algorithm favors content that is “good for people” over content that is “bad for them”: engagement drops. In other words, Facebook knows full well the extraordinary impact its algorithm has on the minds of Americans. Why should one person get to decide which standard applies, based on his business interests rather than the public welfare?

Upton Sinclair’s famous exposé of dangerous abuses in “The Jungle” prompted a massive public outcry. The market had failed; consumers needed protection. The Pure Food and Drug Act of 1906 established safety standards for consumable goods that affect our physical and mental health. We now need to regulate the algorithms that affect our mental health. Since 2007, teens have suffered depression in alarming numbers, and the suicide rate among 10- to 24-year-olds rose nearly 60% between 2007 and 2018.

Although it is impossible to prove that social media was the sole cause of this rise, it would be absurd to claim it played no part. Filter bubbles distort our views and push them toward extremes. Online bullying is easier and more common. Regulators must be able to audit Facebook and question it about its algorithm.

Regulators have struggled to articulate the problem Facebook presents, the greatest harm its product causes. The intent behind Section 230 is right: the internet cannot function if platforms are liable for every word their users post. And a private company such as Facebook risks losing its users’ trust if it imposes rules that seem arbitrary or tied to their politics or background. Facebook, as a company, has no First Amendment obligation to its users, but the public’s perception of the brand’s fairness is crucial.

Zuckerberg equivocated for years before finally banning Holocaust deniers, Donald Trump and anti-vaccine activists. Facebook should not be the one deciding which speech is privileged; it will keep being slow, making mistakes and doing too little. Zuckerberg’s focus is engagement and growth. Our hearts and minds hang in the balance.

The scariest part of “An Ugly Truth,” the section that made Silicon Valley shudder, is the infamous 2016 memo by Andrew “Boz” Bosworth.

Bosworth, a longtime Zuckerberg lieutenant, wrote in the memo:

“So we connect more people. That can be bad if they make it negative. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good.”

When employees objected, Zuckerberg and Sheryl Sandberg made Bosworth walk the memo back. To outsiders, though, it reads as Facebook’s unvarnished, ugly truth. Facebook’s stranglehold on our social and political fabric, and its grow-at-any-cost mantra of “connection,” are not de facto good. By Bosworth’s own admission, the platform can expose people to bullying that costs lives and let terrorists organize. That concentration of power in one company, run by one man, is a danger to democracy and to our way of life.

Critics will attack FDA regulation of social media as a Big Brother intrusion on personal liberty. But what is the alternative? Would it really be so bad for the government to require Facebook to disclose the internal calculations behind its product? It is safe to assume that revenue growth, sessions and time spent are the results that matter most today. What about the mental well-being of the nation, and of the world?

Doing nothing does not mean the problem goes unstudied; it means we are left with one man deciding what is right. How much should we have to pay for “connection”? Zuckerberg should not be the one to decide. The FDA should.

Published at Wed, 15 Sep 2021 18:18:46 +0000
