Anonymous 08/11/2024 (Sun) 07:15 Id: c4606f No.144209
>>144208
How the UK Online Safety Act will be enforced

Ofcom is now the regulator of online safety and must make sure that platforms are protecting their users. Once the new duties are in effect, following Ofcom’s publication of final codes and guidance, platforms will have to show they have processes in place to meet the requirements set out by the Act. Ofcom will monitor how effective those processes are at protecting internet users from harm. Ofcom will have powers to take action against companies which do not follow their new duties.
Companies can be fined up to £18 million or 10 percent of their qualifying worldwide revenue, whichever is greater. Criminal action can be taken against senior managers who fail to ensure companies follow information requests from Ofcom. Ofcom will also be able to hold companies and senior managers (where they are at fault) criminally liable if the provider fails to comply with Ofcom’s enforcement notices in relation to specific child safety duties or to child sexual abuse and exploitation on their service.
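
To make the "whichever is greater" cap concrete, here is a minimal sketch of the calculation in Python (the revenue figure is hypothetical, and this is an illustration of the arithmetic only, not an official formula):

FIXED_CAP_GBP = 18_000_000      # the £18 million figure described above
REVENUE_SHARE = 0.10            # 10 percent of qualifying worldwide revenue

def maximum_fine(qualifying_worldwide_revenue_gbp: float) -> float:
    # The maximum penalty is the greater of the two amounts described above.
    return max(FIXED_CAP_GBP, REVENUE_SHARE * qualifying_worldwide_revenue_gbp)

print(maximum_fine(500_000_000))  # 50000000.0 - for £500m revenue, the 10% figure exceeds £18m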
In the most extreme cases, with the agreement of the courts, Ofcom will be able to require payment providers, advertisers and internet service providers to stop working with a site, preventing it from generating money or being accessed from the UK.

How the Act affects companies that are not based in the UK

The Act gives Ofcom the powers they need to take appropriate action against all companies in scope, no matter where they are based, where services have relevant links with the UK. This means services with a significant number of UK users or where UK users are a target market, as well as other services which have in-scope content that presents a risk of significant harm to people in the UK.

How the Act tackles harmful algorithms

The Act requires providers to specifically consider how algorithms could impact users’ exposure to illegal content – and children’s exposure to content that is harmful to children – as part of their risk assessments.
Providers will then need to take steps to mitigate and effectively manage any identified risks. This includes considering their platform’s design, functionalities, algorithms and any other relevant features in order to meet the illegal content and child safety duties.
The law also makes it clear that harm can arise from the way content is disseminated, such as when an algorithm repeatedly pushes content to a child in large volumes over a short space of time.
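
The Act does not prescribe any particular engineering fix, but as a purely illustrative sketch, one common mitigation for this kind of dissemination risk is a per-user frequency cap on sensitive content categories in the recommender; the window, threshold and category label below are hypothetical:

from collections import deque
import time

WINDOW_SECONDS = 3600        # rolling 1-hour window (illustrative value)
MAX_ITEMS_PER_WINDOW = 3     # cap within that window (illustrative value)

class FrequencyCap:
    def __init__(self):
        self._history = {}   # (user_id, category) -> deque of recommendation timestamps

    def allow(self, user_id: str, category: str, now: float | None = None) -> bool:
        # Return True if another item from this category may be recommended to the user.
        now = time.time() if now is None else now
        timestamps = self._history.setdefault((user_id, category), deque())
        while timestamps and now - timestamps[0] > WINDOW_SECONDS:
            timestamps.popleft()                 # drop events outside the rolling window
        if len(timestamps) >= MAX_ITEMS_PER_WINDOW:
            return False                         # cap reached: suppress further items
        timestamps.append(now)
        return True

cap = FrequencyCap()
print([cap.allow("child_123", "sensitive_category") for _ in range(5)])
# [True, True, True, False, False] - repeated pushes beyond the cap are suppressed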
Some platforms will be required to publish annual transparency reports containing online safety-related information, such as information about the algorithms they use and their effect on users’ experience, including that of children.
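
As a hedged sketch of what one machine-readable entry in such a report might look like (the schema and field names are hypothetical; the Act leaves the exact contents to Ofcom's guidance):

from dataclasses import dataclass, asdict
import json

@dataclass
class AlgorithmTransparencyEntry:
    algorithm_name: str
    purpose: str
    user_facing_effect: str              # plain-language effect on users' experience
    child_safety_mitigations: list[str]
    reporting_year: int

entry = AlgorithmTransparencyEntry(
    algorithm_name="home_feed_ranker",
    purpose="Orders posts on the home feed by predicted engagement.",
    user_facing_effect="Determines which posts users, including children, see first.",
    child_safety_mitigations=["frequency caps on sensitive categories", "age-appropriate feed defaults"],
    reporting_year=2024,
)
print(json.dumps(asdict(entry), indent=2))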
