The UK's Online Safety Act is now live and in effect. All residents in the UK - minors and adults alike - are now required to verify their identity to services or websites that provide content not specifically intended for children under the age of 18 (i.e. 17 years or younger). Depending on the service, this may mean scanning an 'identity' document that includes a photo of the owner, such as a passport, or verifying via a 'face scan'. In either case, this personally identifiable information (PII) is uploaded or sent to the verifying party.
Important note: this isn't specifically about content that would otherwise be considered 'adult', '18' or 'R' rated, but rather about content the Online Safety Act considers "harmful" or "inappropriate", e.g. "this includes [but is not limited to] taking specific action to prevent children from encountering pornographic content, and content that encourages, promotes or provides instructions for suicide, self-harm and eating disorders." ["How the Online Safety Act will help to protect children"]
In a nutshell, the Online Safety Act places a "legal duty" - a "burden of responsibility" to keep children safe - in the hands of third parties and other entities or people, not the parents or those directly responsible for children's general well-being. In addition, and precisely because of that third-party burden, uploading or sending identifying documentation does not guarantee the process is being handled by a legitimate entity. This requirement won't necessarily stop bad actors hijacking it for their own ends; it's astonishing this doesn't appear to have been a consideration, the Government instead relying on bad actors not acting badly ("criminals better not criminal") - in other words, law-abiding citizens are the most affected.