https://ipfs.tech/
https://private.storage/
https://nostr.com/
https://glowinglist.linksta.cc/@notafbihoneypot.bsky.social
https://simplex.chat/
if you wanna donate to me because you love the xmr circular economy and appreciate my work, i believe in value4value
notafbihoneypot.xmr
xmr: 86gTQFcgz9FPyHzj3kyKqtHXqV1MYsWYRfDYVSjrk6etAGpSbnvyZs7BuT9Urhfdzx9PRG2Em8t317dKQ2m66fSr6UsV8y1
The Online Safety Act 2023 (c. 50) is an Act of the Parliament of the United Kingdom designed to regulate online content and protect users, particularly children, from illegal and harmful material. It received Royal Assent on 26 October 2023, establishing a new legal framework for online safety. The Act places a duty of care on online platforms, requiring them to act against illegal content and against legal content that could be harmful to children, especially where a service is likely to be accessed by minors. This duty applies extraterritorially: it covers services with a significant number of UK users, services targeting UK users, and services accessible in the UK where there is a material risk of significant harm.
Ofcom, the UK's communications regulator, is the independent body responsible for enforcing the Act. It can investigate non-compliance, impose fines of up to 10% of a provider's annual worldwide revenue, and, in severe cases, apply to the courts to block access to a service. The Act is being implemented in phases. Since 17 March 2025, platforms have had a legal duty to protect users from illegal content, and Ofcom has been actively enforcing those duties. The child safety regime, which requires platforms to prevent children from accessing harmful content, came fully into force on 25 July 2025. It mandates "highly effective age assurance" to keep children away from pornography and from content promoting self-harm, suicide, or eating disorders, as well as other harmful and age-inappropriate material such as bullying and dangerous stunts.