Contributed by Lauren Taylor
Originally, YouTube only took down videos claiming the coronavirus vaccine was untrustworthy or dangerous. According to the Washington Post, the site will now remove any video discrediting vaccines, no matter what disease the vaccine is meant to prevent.
The Google-owned company has been weathering backlash from lawmakers and regulators. It has long defended the value of an open platform for free speech, fearing that policing its content or content creators would undermine that openness. YouTube conceded and changed its policies once the company was blamed for hosting this pool of misinformation.
Hany Farid, a computer science professor and misinformation researcher at the University of California at Berkeley, said of YouTube's policies, "You create this breeding ground and when you deplatform it doesn't go away, they just migrate." He believes these policies should have been enacted before the vaccine launched. On the other hand, Matt Halprin, YouTube's vice president of global trust and safety, says changes like that take time.
The tension between censoring content too much and not censoring it enough is an ongoing battle, especially for YouTube.