Wikipedia

Online Safety Bill

Article snapshot taken from Wikipedia under the Creative Commons Attribution-ShareAlike license. Give it a read and then ask your questions in the chat. We can research this topic together.

Online Safety Bill may refer to: Online Safety Act 2023, 2023 United Kingdom legislation; Online Safety Bill (Sri Lanka), 2024 Sri Lanka legislation; Kids Online Safety Act, 2023 United States legislation. This disambiguation page lists articles associated with the title Online Safety Bill. If an internal link led you here, you may wish to change

A person under the age of 18 and is based in or serves users within the United Kingdom. The Code requires that services be designed in "the best interests" of children, including their physical and mental health, protecting them from being exploited commercially or sexually, and acknowledging parents' and caregivers' roles in protecting and supporting their child's best interests. The Code specifies that when used by

A child, online services must use their highest privacy settings by default, unless there is a compelling reason not to, taking into account the best interests of the child. This includes not allowing access to data by other users, location tracking, or behavioural profiling (such as algorithmic curation and targeted advertising, or using data "in a way that incentivises children to stay engaged"). The amount of data collected from children must be minimized, only collecting data that
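As a loose illustration of the defaults described above, the sketch below shows how a service might apply the Code's highest-privacy-by-default and data-minimisation requirements for accounts belonging to children. This is a hypothetical sketch only; the class, function, and field names are illustrative and do not come from the Code itself.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # Hypothetical settings object; field names are illustrative only.
    visible_to_other_users: bool
    location_tracking: bool
    behavioural_profiling: bool      # e.g. algorithmic curation, targeted ads
    data_fields_collected: tuple

def default_privacy_settings(is_under_18: bool) -> PrivacySettings:
    """Sketch of the Code's approach: children get the highest privacy
    settings by default, and only minimal data is collected."""
    if is_under_18:
        return PrivacySettings(
            visible_to_other_users=False,
            location_tracking=False,
            behavioural_profiling=False,
            data_fields_collected=("account_id",),  # only what is strictly needed
        )
    # Adult defaults shown here are arbitrary placeholders.
    return PrivacySettings(True, False, False, ("account_id", "email"))
```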

A democratic society" and was incompatible with Article 8 of the European Convention on Human Rights. This decision may form part of the basis of legal challenges to the Online Safety Act 2023.

Age Appropriate Design Code

The Age Appropriate Design Code, also known as the Children's Code, is a British internet safety and privacy code of practice created by the Information Commissioner's Office (ICO). The draft Code

A prominent supporter of the act, saying it will help protect children from abuse. The Samaritans, which had made strengthening the act one of its key campaigns "to ensure no one is left unprotected from harmful content under the new law", gave the final act its qualified support, while also saying the act fell short of the promise to make the UK the safest place to be online. The international human rights organization Article 19 stated that they saw

A significant number of United Kingdom users, or which target UK users, or those which are capable of being used in the United Kingdom where there are reasonable grounds to believe that there is a material risk of significant harm. The idea of a duty of care for Internet intermediaries was first proposed in Thompson (2016) and made popular in the UK by the work of Woods and Perrin (2019). The duty of care in

A tweet that scanning everyone's messages would destroy privacy. Ciaran Martin, a former head of the UK National Cyber Security Centre, accused the government of "magical thinking" and said that scanning for child abuse content would necessarily require weakening the privacy of encrypted messages. In February 2024, the European Court of Human Rights ruled, in an unrelated case, that requiring degraded end-to-end encryption "cannot be regarded as necessary in

Is an act of the Parliament of the United Kingdom to regulate online speech and media. It received royal assent on 26 October 2023 and gives the relevant Secretary of State the power, subject to parliamentary approval, to designate and suppress or record a wide range of speech and media deemed "harmful". The act requires platforms, including end-to-end encrypted messengers, to scan for child pornography, despite warnings from experts that it

Is justification. It also requires privacy policies and controls to be presented in a manner that is clear and accessible to children, including prohibiting dark patterns. Baroness Beeban Kidron sponsored the amendment to the DPA that mandated the development of the Code. Upon the implementation of the Code in 2021, she explained that "[the Code] shows tech companies are not exempt. This exceptionalism that has defined

Is likely to be accessed by a person under the age of 18. It requires online services to be designed in the "best interests" of children and their health, safety, and privacy, requiring that they be afforded the strongest privacy settings by default, that only data strictly necessary to deliver individual service elements is collected from children unless there is justification, and that children's personal data not be disclosed to third parties unless there

Is not possible to implement such a scanning mechanism without undermining users' privacy. The act creates a new duty of care for online platforms, requiring them to take action against illegal, or legal but "harmful", content from their users. Platforms failing this duty would be liable to fines of up to £18 million or 10% of their annual turnover, whichever is higher. It also empowers Ofcom to block access to particular websites. It obliges large social media platforms not to remove, and to preserve access to, journalistic or "democratically important" content such as user comments on political parties and issues. The bill that became
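As a rough arithmetic illustration of the fine ceiling mentioned above (a sketch only; the function name is hypothetical and the calculation ignores how "annual turnover" is actually defined in the act):

```python
def maximum_fine_gbp(annual_turnover_gbp: float) -> float:
    """Upper bound on a fine under the duty of care: the greater of
    £18 million or 10% of annual turnover."""
    return max(18_000_000, 0.10 * annual_turnover_gbp)

# A platform with £1 billion annual turnover could face a fine of up to £100 million.
print(maximum_fine_gbp(1_000_000_000))  # 100000000.0
```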


Is strictly necessary to deliver service elements that a child is "actively and knowingly engaged" in. A service may not disclose a child's personal data to a third party without a compelling reason to do so. Services must present their privacy policy, privacy options, and data export and erasure tools in a clear and age-appropriate manner. They must not use dark patterns to nudge children toward options that reduce their privacy. The Code recommends that privacy settings and tools be tailored to

The House of Commons and peers from the House of Lords. The Opposition Spokesperson, Lord Ponsonby of Shulbrede, in the House of Lords said, "My understanding is that we now have a timeline for the online harms Bill, with pre-legislative scrutiny expected immediately after the Queen's Speech—before the Summer Recess—and that Second Reading would be expected after the Summer Recess." But

The "harmful" content they do not want to see. The act grants significant powers to the Secretary of State to direct Ofcom, the media regulator, on the exercise of its functions, which includes the power to direct Ofcom as to the content of codes of practice. This has raised concerns about the government's intrusion into the regulation of speech with unconstrained emergency-like powers that could undermine Ofcom's authority and independence. Within

The Digital Economy Minister, Chris Philp, announced that the act would be amended to bring commercial pornographic websites within its scope. The act adds two new offences to the Sexual Offences Act 2003: sending images of a person's genitals (cyberflashing), or sharing or threatening to share intimate images. The draft bill was given pre-legislative scrutiny by a joint committee of Members of

The Minister replying refused to pre-empt the Queen's Speech by confirming this. In early February 2022, ministers planned to add to their existing proposal several criminal offences against those who send death threats online or deliberately share dangerous disinformation about fake cures for COVID-19. Other new offences, such as revenge porn, posts advertising people-smuggling, and messages encouraging people to commit suicide, would fall under

The Online Safety Act 2023 as a potential threat to human rights, describing it as an "extremely complex and incoherent piece of legislation". The Open Rights Group described the Online Safety Bill (OSB) as a "censor's charter". During an interview for the BBC, Rebecca MacKinnon, the vice president for global advocacy at the Wikimedia Foundation, criticised the OSB, saying the threat of "harsh" new criminal penalties for tech bosses would affect "not only big corporations, but also public interest websites, such as Wikipedia". In

The UK government to "amend the Bill to protect strong end-to-end encryption". Meta Platforms has criticised the plan, saying, "We don't think people want us reading their private messages ... The overwhelming majority of Brits already rely on apps that use encryption to keep them safe from hackers, fraudsters and criminals". Head of WhatsApp Will Cathcart voiced his opposition to the OSB, stating that

The act and Ofcom can at any time issue notices requiring the breaking of end-to-end encryption technology. This followed statements from several tech firms, including Signal, suggesting they would withdraw from the UK market rather than weaken their encryption. The UK National Crime Agency, part of the Home Office, has said the act is necessary to protect children. The NSPCC has been

The act sets out a number of specific duties that apply to all services within scope: For services 'likely to be accessed by children', adopting the same scope as the Age Appropriate Design Code, two additional duties are imposed: For category 1 services, which will be defined in secondary legislation but are limited to the largest global platforms, there are four further new duties: This would empower Ofcom,

The act was criticised for its proposals to restrain the publication of "lawful but harmful" speech, effectively creating a new form of censorship of otherwise legal speech. As a result, in November 2022, measures that were intended to force big technology platforms to take down "legal but harmful" materials were removed from the bill. Instead, tech platforms are obliged to introduce systems that will allow users to better filter out


The direction and submit a revised draft to the Secretary of State. The Secretary of State may give Ofcom further directions to modify the draft, and once satisfied, must lay the modified draft before Parliament. Additionally, the Secretary of State can remove or obscure information before laying the review statement before Parliament. The act has provisions to impose legal requirements ensuring that content removals do not arbitrarily remove or infringe access to what it defines as journalistic content. Large social networks would be required to protect "democratically important" content, such as user-submitted posts supporting or opposing particular political parties or policies. The government stated that news publishers' own websites, as well as reader comments on such websites, are not within

The display of advertising on a regulated service (for example, an ad server or an ad network). Ofcom must apply to a court for both Access Restriction and Service Restriction Orders. Section 44 of the act also gives the Secretary of State the power to direct Ofcom to modify a draft code of practice for online safety if deemed necessary for reasons of public policy, national security or public safety. Ofcom must comply with

The draft published by the government. Addressing the House of Commons DCMS Select Committee, the Secretary of State, Oliver Dowden, confirmed he would be happy to consider a proposal during pre-legislative scrutiny of the act by a joint committee of both Houses of Parliament to extend the scope of the act to all commercial pornographic websites. According to the government, the act addresses

The evening and nighttime hours, while YouTube stated that it would treat all videos "made for kids" (a designation introduced in 2020 following a ruling and fine under the U.S. Children's Online Privacy Protection Act) under the assumption they were being viewed by a child, including disabling autoplay, personalization, targeted advertising, and social features. In March 2023, a complaint

The intended scope of the law. Section 212 of the act repeals part 3 of the Digital Economy Act 2017, which required mandatory age verification to access online pornography but was never enforced by the government. The act will include within scope any pornographic site which has functionality to allow for user-to-user services, but those which do not have this functionality, or choose to remove it, would not be in scope based on

The last decade, that they are different, just disappears in a puff of smoke when you say, 'actually, this is business.' And business has to be safe, equitable, run along rules that at a minimum protect vulnerable users." The Children's Code is a code of practice enforceable under the Data Protection Act 2018, and is consistent with GDPR and the Convention on the Rights of the Child. It specifies design standards for any information society services (ISS, which includes websites, software and apps, and connected toys) that are likely to be used by

The link to point directly to the intended article. Retrieved from "https://en.wikipedia.org/w/index.php?title=Online_Safety_Bill&oldid=1200002284"

Online Safety Act 2023

The Online Safety Act 2023 (c. 50)

The major concern expressed by campaigners such as the Open Rights Group about the risk to user privacy with the Digital Economy Act 2017's requirement for age verification by creating, on services within scope of the legislation, "A duty to have regard to the importance of... protecting users from unwarranted infringements of privacy, when deciding on, and implementing, safety policies and procedures." In February 2022

The national communications regulator, to block access to particular user-to-user services or search engines from the United Kingdom, including through interventions by internet access providers and app stores. The regulator will also be able to impose, through "service restriction orders", requirements on ancillary services which facilitate the provision of the regulated services. The act lists in section 92 as examples (i) services which enable funds to be transferred, (ii) search engines which generate search results displaying or promoting content and (iii) services which facilitate

The needs of specific age groups. Per GDPR, a user must be at least 13 years old to consent to data processing themselves; otherwise, verifiable consent must be given by the child's parent or custodian. Social media services adjusted their services to comply with the Code; on Instagram, all accounts created by under-18s began to be marked as private by default, and adults may not direct message them unless they are followers. TikTok stated that it will not send push notifications to children during
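A minimal sketch of the consent rule described above, assuming the UK GDPR threshold of 13; the function and its parameters are illustrative and not drawn from the regulation's text:

```python
def can_rely_on_consent(age_years: int, has_verifiable_parental_consent: bool) -> bool:
    """A user aged 13 or over may consent to data processing themselves;
    younger children need verifiable consent from a parent or guardian."""
    return age_years >= 13 or has_verifiable_parental_consent

print(can_rely_on_consent(15, False))  # True: old enough to consent themselves
print(can_rely_on_consent(11, False))  # False: parental consent required
```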


The responsibilities of online platforms like Facebook and Twitter to tackle. In September 2023, during the third reading in the Lords, Lord Parkinson presented a ministerial statement from the government claiming the controversial powers allowing Ofcom to break end-to-end encryption would not be used immediately. Despite the government's claim that the powers will not be used, the provisions pertaining to end-to-end encryption weakening were not removed from

The same instance, MacKinnon argued the act should have been based on the European Union's Digital Services Act, which reportedly distinguishes between centralised content moderation and community-based moderation. In April 2023, both MacKinnon and the chief executive of Wikimedia UK, Lucy Crompton-Reid, announced that the WMF did not intend to apply the age-check requirements of the act to Wikipedia users, stating that it would violate their commitment to collect minimal data about readers and contributors. On 29 June of

The same year, WMUK and the WMF officially published an open letter, asking the government and Parliament to exempt "public interest projects", including Wikipedia itself, from the OSB before it entered its report stage, starting on 6 July. Apple Inc. criticised legal powers in the OSB which threatened end-to-end encryption on messaging platforms in an official statement, describing the act as "a serious threat" to end-to-end encryption, and urging

The scope of the act is any "user-to-user service". This is defined as an Internet service by means of which content that is generated by a user of the service, or uploaded to or shared on the service by a user of the service, may be read, viewed, heard or otherwise experienced ("encountered") by another user, or other users. Content includes written material or messages, oral communications, photographs, videos, visual images, music and data of any description. The duty of care applies globally to services with

The service would not compromise its encryption for the proposed law and saying "The reality is, our users all around the world want security – ninety-eight percent of our users are outside the UK, they do not want us to lower the security of the product and just as a straightforward matter, it would be an odd choice for us to choose to lower the security of the product in a way that would affect those ninety-eight percent of users." He also stated in

Was filed against YouTube alleging violations of the Code, as the service can track children via devices shared by multiple users. The code was adapted by the U.S. state of California as AB 2273, The California Age-Appropriate Design Code Act, and passed in August 2022. Kidron's charity 5Rights Foundation was credited as a supporter and "co-source" of the bill. In September 2023, the bill was ruled unconstitutional by Federal Judge Beth Labson Freeman as

Was published in April 2019, as required by the Data Protection Act 2018 (DPA). The final regulations were published on 27 January 2020 and took effect on 2 September 2020, with a one-year grace period before the beginning of enforcement. The Children's Code is written to be consistent with GDPR and the DPA, meaning that compliance with the Code is enforceable under the latter. It applies to any internet-connected product or service that
