The EU defines new requirements for the digital responsibility of large tech companies
Big Tech companies will have to meet new EU requirements to curb illegal content and misinformation on their platforms, after the European Parliament and the Council of Ministers reached a landmark agreement on how Europe regulates the internet, known as the Digital Services Act.
The new rules are designed to create greater safety on the internet, counteract disinformation and secure the rights of children and young people, in recognition that the existing rules date from a time before giants like Meta (Facebook, Instagram and more), Alphabet (Google, YouTube) and TikTok.
If the tech giants do not comply with the rules, they risk fines of up to six percent of their annual turnover, which can amount to very large sums.
What exactly is the problem?
There are, of course, several challenges that the new set of rules must address, but in our view the most important is the requirement that "the platforms change their algorithms so that they do not prioritize content among users that has a negative impact on the democratic processes or on the health of citizens".
The big tech companies have gradually built up gigantic user bases that rely on them as their primary source of news and information about the world around them. The younger generation in particular is impressionable and tends to take content read online, for example on Facebook, as the ultimate truth without much critical reflection. But objectivity and public service through balanced content is simply not the purpose of these companies; it is making money. They do this by selling advertisements, and the longer we stay on their platforms, the more advertisements we are exposed to and the more money they make.

To keep us on the platform for as long as possible, they have built algorithms that constantly present new content that we either agree with or find interesting and exciting. We are thereby constantly affirmed in our own attitudes and interests, which reduces our acceptance that there are several sides to the same issue and narrows our worldview. Ultimately, this can be a huge challenge for our Western democracies, which are built on dialogue and on the acceptance that there are several attitudes and perspectives on the same issue.
For a long time, tech companies have not been responsible for the content posted on their platforms unless it is directly illegal. So everything from right-wing extremism and hatred to the more comical notions, such as that the Earth is flat, has been given free rein. Research suggests this has helped create a markedly more polarized world, with less understanding of perspectives other than one's own.
Why did it not happen before?
It takes a very long time to make legislation, and the new set of rules has not actually been adopted yet; that will take a few more months. The concern about the long processing time is that technology moves much faster than legislation can keep up with, and some voices argue that the rules risk becoming obsolete before they have even been adopted. But instead of dwelling on the challenges of regulation and legislation in general, we should see the new rules as a start, a kind of "tech constitution" on which new legislation can be built and adopted much faster than is the case today. So even though it has taken the EU a very long time, it is a positive sign that we have arrived at a comprehensive set of rules at all.
As parents of a new generation, we need to recognize and understand our children's world, and that world is largely digital. It is incredibly difficult to help and guide them in the digital world when we ourselves struggle to understand it. Today's agreement is a step in the right direction, toward a better future for us and our children, where the rules and laws of the real world are translated and applied in the digital one. But despite this progress in the legislative institutions, solutions like Zoe remain highly relevant for giving you and your family a safe and secure life online, because legislation and regulation will always lag behind technological development.
Here is a selection of the points in the new rules:
- The platforms will be required to design their functions to protect children’s privacy.
- The platforms will be required to change their algorithms so that they do not prioritize content among users that has a negative impact on the democratic processes or on the health of the citizens.
- The platforms must give users a greater insight into what “recommended content” is based on.
- The Commission, the authorities and researchers must have access to algorithms and data.