The web is global, and so is the regulatory landscape that governs it. For web professionals serving international clients or working for organizations with a global footprint, understanding legislative developments around the world is essential to staying compliant and competitive.
This month, we’re looking at significant legislative and policy updates from around the globe that affect how we build, design, and deploy web experiences. From Australia’s groundbreaking under-16 social media ban to evolving AI regulations across Europe, these developments deserve your attention whether you’re developing e-commerce sites for European customers, building applications for the Asia-Pacific market, or creating accessible experiences for users worldwide.
Europe
European Union: AI Act Implementation and the Digital Omnibus
The European Union continues to lead global technology regulation, and 2025 has been a pivotal year for implementation. The EU AI Act, which entered into force in August 2024, is now in active deployment. As of February 2025, the prohibitions on unacceptable-risk AI practices took effect, banning practices such as untargeted scraping of facial images to build recognition databases, emotion recognition in workplaces and schools, and real-time biometric identification in public spaces.
In a significant development last month, the European Commission unveiled its “Digital Omnibus” proposal on November 19, 2025, aimed at simplifying the EU’s sweeping digital regulations. The proposal includes extending the timeline for full AI Act compliance for high-risk systems from August 2026 to December 2027, giving businesses more time to prepare. The Commission estimates these simplifications could save businesses approximately 5 billion euros annually.
However, the Digital Omnibus faces opposition. The proposed changes require modifications to the General Data Protection Regulation (GDPR), and many Members of the European Parliament have already announced their opposition to weakening data protection standards.
The Digital Services Act (DSA) continues its enforcement phase, with Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs)—those with more than 45 million monthly active users in the EU—subject to the most stringent requirements, including annual independent third-party audits. Non-compliance can result in penalties of up to 6% of annual worldwide revenue.
For web professionals, the practical impact is significant. If you’re building AI-powered features, you need to understand the risk classification system. If you’re creating platforms that allow user-generated content, you need DSA-compliant content moderation systems. And if you’re handling any personal data of EU residents, GDPR compliance remains paramount.
The European AI Office provides detailed guidance on compliance requirements and implementation timelines.
European Accessibility Act: Now in Effect
The European Accessibility Act (EAA) came into full effect on June 28, 2025, representing one of the most significant accessibility mandates for private-sector websites globally. Unlike previous EU accessibility requirements that focused primarily on public sector websites, the EAA extends to private businesses, including e-commerce platforms, banking services, transportation, and consumer electronics.
The technical standard for compliance is EN 301 549, which currently incorporates WCAG 2.1 and is being updated to include WCAG 2.2. Web professionals building sites for European audiences should be designing to WCAG 2.1 Level AA at minimum, with WCAG 2.2 becoming the benchmark as standards evolve.
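To make one of those WCAG 2.1 Level AA requirements concrete, here is a minimal TypeScript sketch of the contrast-ratio calculation behind success criterion 1.4.3, which requires at least 4.5:1 between normal text and its background. The formulas (relative luminance and contrast ratio) come from the WCAG definition; the function names are our own.

```typescript
// Sketch of the WCAG 2.1 relative-luminance and contrast-ratio formulas.
// Success criterion 1.4.3 requires a ratio of at least 4.5:1 for normal text.

function channelLuminance(value: number): number {
  // Normalize an 8-bit sRGB channel, then linearize it per the WCAG definition.
  const c = value / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

function relativeLuminance(hex: string): number {
  const n = parseInt(hex.replace("#", ""), 16);
  const r = (n >> 16) & 0xff;
  const g = (n >> 8) & 0xff;
  const b = n & 0xff;
  return (
    0.2126 * channelLuminance(r) +
    0.7152 * channelLuminance(g) +
    0.0722 * channelLuminance(b)
  );
}

function contrastRatio(fg: string, bg: string): number {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  const [lighter, darker] = l1 > l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

function passesAA(fg: string, bg: string): boolean {
  return contrastRatio(fg, bg) >= 4.5;
}

// Black on white yields the maximum possible ratio of 21:1.
console.log(contrastRatio("#000000", "#ffffff").toFixed(2)); // "21.00"
console.log(passesAA("#767676", "#ffffff")); // true (~4.54:1)
```

Automated checks like this catch only a slice of WCAG; full EAA compliance still requires manual testing against the complete EN 301 549 criteria.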
Enforcement varies by member state, but penalties for non-compliance can reach up to €1,000,000 depending on the severity of the infraction. Perhaps more importantly, products and services that fail to meet accessibility requirements can be removed from the European market entirely.
This guide from Level Access provides practical steps for achieving EAA compliance.
United Kingdom: Online Safety Act Enters Phase 2
The UK’s Online Safety Act has entered its most consequential phase. As of March 17, 2025, platforms have a legal duty to protect users from illegal content online, and as of July 25, 2025, they must protect children from harmful content including pornography, self-harm content, and eating disorder content.
The requirement that’s generated the most attention is “highly effective age assurance”—platforms can no longer rely on self-declaration of age or simply prohibiting under-18s in their terms of service. Instead, they must implement robust verification methods such as photo-ID matching, credit card checks, facial age estimation, or digital identity services.
Ofcom, the UK’s online safety regulator, has already opened enforcement actions against pornography providers without effective age assurance, file-sharing services lacking child sexual abuse material protections, and smaller services that haven’t completed risk assessments. Penalties can reach up to £18 million or 10% of global turnover, whichever is greater.
The Act has generated controversy, particularly around its implications for encryption. Apple called it a “serious threat” to end-to-end encryption, while Meta stated it would rather have WhatsApp blocked in the UK than weaken encryption standards. The Wikimedia Foundation, which operates Wikipedia, launched a judicial review (which it lost in August 2025), arguing the Act’s requirements would compromise its open editing model.
For web professionals building services that UK users can access, the message is clear: understand your obligations under the OSA, conduct required risk assessments, and implement appropriate safety measures. The regulatory guidance runs to over 3,000 pages—compliance requires dedicated attention.
GOV.UK’s Online Safety Act explainer provides an official overview of requirements and timelines.
Asia-Pacific
India: Digital Personal Data Protection Rules 2025
India officially operationalized its first comprehensive data protection regime on November 14, 2025, with the notification of the Digital Personal Data Protection (DPDP) Rules, 2025. These rules give effect to the Digital Personal Data Protection Act, 2023, which was passed by Parliament in August 2023 but waited over two years for implementing regulations.
The framework introduces consent-driven data governance for the first time in India. Companies must now provide clear, plain-language notices explaining what data is being collected, why it’s being processed, how complaints can be raised, and how consent can be withdrawn. Consent must be “free, specific, informed, unconditional and unambiguous.”
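Those consent requirements translate fairly directly into data design. Here is a hypothetical TypeScript sketch of a consent record that captures a specific purpose, what was collected, which notice the user saw, and a withdrawal trail; the field names are illustrative, not an official DPDP schema.

```typescript
// Hypothetical consent record reflecting the DPDP Rules' requirements:
// plain-language purpose, notice of what is collected, and an auditable
// trail of grant and withdrawal. Field names are illustrative only.

interface ConsentRecord {
  userId: string;
  purpose: string;          // specific purpose, stated in plain language
  dataCategories: string[]; // what personal data is being collected
  noticeVersion: string;    // which notice the user actually saw
  grantedAt: Date;
  withdrawnAt?: Date;       // withdrawal must be as easy as granting
}

function grantConsent(
  userId: string,
  purpose: string,
  dataCategories: string[],
  noticeVersion: string
): ConsentRecord {
  return { userId, purpose, dataCategories, noticeVersion, grantedAt: new Date() };
}

function withdrawConsent(record: ConsentRecord): ConsentRecord {
  // Keep the original record immutable; return a withdrawn copy for audit.
  return { ...record, withdrawnAt: new Date() };
}

function isConsentActive(record: ConsentRecord): boolean {
  return record.withdrawnAt === undefined;
}

const consent = grantConsent("u123", "order fulfilment", ["name", "shipping address"], "v2");
console.log(isConsentActive(consent));                  // true
console.log(isConsentActive(withdrawConsent(consent))); // false
```

Storing the notice version alongside each grant matters: if a regulator asks what a user agreed to, you can reproduce the exact notice they saw.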
For Significant Data Fiduciaries—organizations designated based on the volume and sensitivity of data they process—additional requirements include annual Data Protection Impact Assessments, mandatory audits, and the appointment of a Data Protection Officer based in India.
The rules are being implemented in phases: Data Protection Board provisions take effect immediately, the consent manager framework in 12 months, and broader compliance obligations in 18 months. Cross-border data transfers are generally permitted unless specifically restricted by the government, offering more flexibility than GDPR—though sector-specific localization requirements remain.
Critics have raised concerns about Rule 23, which gives the state broad power to demand personal data from any data fiduciary without user consent for reasons including national security and “any lawful function of the government.” The Internet Freedom Foundation has characterized these access categories as so wide they invite misuse.
For web professionals serving Indian users or working with Indian companies, this represents a significant shift toward formal privacy requirements. Begin mapping your data practices now to ensure compliance within the phased timelines.
India Briefing’s guide to the DPDP Rules 2025 provides detailed compliance information.
Japan: Platform Regulation and Active Cyber Defense
Japan has been actively developing its digital regulatory framework throughout 2025. The Act on Promoting Competition for Specified Smartphone Software, enacted in June 2024, is expected to become fully effective by late 2025 or early 2026. Inspired by the EU’s Digital Markets Act, this law targets designated providers of mobile operating systems, app stores, browsers, and search engines—primarily Apple and Google—with requirements around interoperability, alternative payment systems, and anti-self-preferencing rules.
The Information Distribution Platform Act came into force on April 1, 2025, requiring large social media providers to implement systems for removing illegal or harmful content, including defamation. Meanwhile, Japan’s Act on Improving Transparency and Fairness of Digital Platforms continues to impose annual self-assessment and disclosure requirements on major e-commerce platforms and app stores.
In a major cybersecurity development, Japan passed the Active Cyber Defense Act in May 2025, which will come into effect in 2026. This legislation moves Japan from a “passive” defense posture—relying on firewalls and antivirus measures—to an “active” approach that allows authorities to monitor communications data for threat detection and take counter-measures against cyberattack sources. Critical infrastructure operators will be legally required to inform the government of cyberattacks.
Japan’s Personal Information Protection Commission is also planning amendments to the Act on the Protection of Personal Information (APPI) for 2025 or 2026, with topics including stronger protections for children’s data and enhanced enforcement penalties similar to GDPR.
Japan’s Digital Agency maintains current information on digital legislation and notices.
Australia: World-First Social Media Ban for Under-16s
Australia has become the first country to enforce a nationwide ban on social media access for children under 16, with the law taking effect on December 10, 2025. The Online Safety Amendment (Social Media Minimum Age) Act 2024, passed by Parliament on November 29, 2024, represents the world’s strictest approach to protecting children from online harms.
The legislation targets 10 major platforms: Facebook, Instagram, YouTube, TikTok, Snapchat, Reddit, X, Threads, Kick, and Twitch. The law places sole responsibility on platforms to take “reasonable steps” to prevent under-16s from creating or maintaining accounts, with penalties reaching up to AU$49.5 million (approximately $32 million USD) for systemic failures. Notably, there are no penalties for children or parents who circumvent the restrictions.
Implementation and Age Assurance
More than 1 million social media accounts held by users under 16 are set to be deactivated as platforms implement age assurance measures. These include AI-powered age estimation from video selfies, email verification, and government ID checks—though platforms are prohibited from compelling users to provide government-issued identity documents or demanding digital identification through government systems. Platforms must delete age verification data after use to address privacy concerns.
Several major platforms have already begun compliance efforts. Meta started removing under-16 users from Facebook, Instagram, and Threads on December 4, ahead of the December 10 deadline. Users can reactivate accounts when they turn 16, with their data stored until then or available for download. Even X, which has been vocal in its opposition based on free speech concerns, announced it would comply using a multi-faceted approach including self-attested age, identification documents, account creation date, and email addresses.
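The deletion requirement above suggests a useful pattern: retain the age decision, never the evidence. The following TypeScript sketch is purely illustrative (the types and thresholds are our assumptions, not anything specified in the Act), but it shows the shape of an age-assurance check that discards the verification payload once a pass/fail outcome is recorded.

```typescript
// Hypothetical sketch of an age-assurance check that keeps only the
// boolean outcome of verification, in the spirit of the requirement to
// delete age verification data after use. Types are illustrative.

interface AgeEvidence {
  estimatedAge: number; // e.g. from facial age estimation or an ID check
  method: "estimation" | "id-check" | "email-signals";
}

interface AgeDecision {
  allowed: boolean;
  method: string;
  decidedAt: Date;
  // Deliberately no raw evidence field: store the outcome, not the data.
}

function decideAndDiscard(evidence: AgeEvidence, minimumAge: number): AgeDecision {
  const decision: AgeDecision = {
    allowed: evidence.estimatedAge >= minimumAge,
    method: evidence.method,
    decidedAt: new Date(),
  };
  // In a real system, any captured selfie or ID payload would be securely
  // deleted at this point; in this sketch it is simply never retained.
  return decision;
}

console.log(decideAndDiscard({ estimatedAge: 14, method: "estimation" }, 16).allowed); // false
console.log(decideAndDiscard({ estimatedAge: 19, method: "id-check" }, 16).allowed);   // true
```

Designing the decision record so it physically cannot hold the evidence is a simple way to make the privacy guarantee structural rather than procedural.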
Public Support and Controversy
A YouGov poll found that 77% of Australians support the ban, driven largely by parental concerns about the impact of social media on children’s mental health. Prime Minister Anthony Albanese stated the ban aims to “give kids back their childhood and parents their peace of mind”, citing rising suicide and self-harm rates among Gen-Z Australians.
The push for the ban intensified following grassroots campaigns including “Let Them Be Kids,” launched by News Corp alongside parents and child safety advocates, which garnered over 54,000 petition signatures. A government-commissioned national study found that 96% of children ages 10 to 15 use social media, with seven out of 10 exposed to harmful content including misogynistic material, fight videos, and content promoting eating disorders and suicide.
However, the legislation has generated significant criticism. Mental health experts and child welfare advocates worry that banning young children from social media will dangerously isolate many who use these platforms to find support, particularly LGBTQI youth, children in regional communities, and those from marginalized backgrounds. Critics also argue the ban could drive children to less safe parts of the internet or reduce platforms’ incentives to improve online safety.
The bill’s passage was criticized as rushed, with only one day allowed for public submissions even though 15,000 responses were received. Technology companies including Google and Meta urged Australia to delay passage, arguing more time was needed to assess potential impacts.
Legal Challenges and Global Impact
The Digital Freedom Project announced in November 2025 it would commence legal action in the High Court of Australia, arguing the laws violate the implied right to political communication. Implementation challenges are already emerging, with reports of young users circumventing AI age estimation systems.
The world is watching Australia’s experiment closely. Denmark, Norway, France, Spain, Malaysia, and New Zealand are all considering similar age-based restrictions, positioning Australia as a critical test case for global youth social media policy. Australian officials have described the landmark ban as the world’s “first domino” in what may become a broader international movement toward stricter child protection measures online.
For web professionals building platforms that could be accessed by children, Australia’s ban represents a new regulatory frontier. Understanding age assurance technologies, privacy-preserving verification methods, and the balance between child protection and user rights will become increasingly important as other jurisdictions consider similar measures.
Africa
Continental Progress on Data Protection
Africa’s data protection landscape continues to mature rapidly. As of 2024, 39 out of 55 African nations have implemented data protection laws, with 34 having established Data Protection Authorities. Several countries made significant progress in 2024: Cameroon, Ethiopia, and Malawi enacted new data protection laws; Botswana amended its existing legislation; and the Democratic Republic of Congo, Somalia, Togo, and Tanzania established or launched their Data Protection Authorities.
South Africa’s Protection of Personal Information Act (POPIA) saw important updates in April 2025, with stricter rules around consent, breach reporting, and data handling. Notably, compliance violations are now publicly visible through the CIPC BizPortal, meaning non-compliance affects not just potential fines but also business reputation and relationships.
Kenya’s Office of the Data Protection Commissioner has been particularly active, releasing sector-specific guidelines for healthcare, education, and digital lending. Nigeria now enforces its Data Protection Act 2023 through the Nigeria Data Protection Commission, which superseded the earlier Nigeria Data Protection Regulation (NDPR) regime administered by the National Information Technology Development Agency.
AI governance is gaining traction across the continent. The African Union adopted its Continental AI Strategy in 2024, while individual countries including Egypt, Kenya, Morocco, Nigeria, Uganda, and Zimbabwe have developed proposals for AI-specific regulations. South Africa published its AI Policy Framework, and Kenya is developing a national AI strategy for publication in 2025.
Looking ahead, expect enhanced regulatory sophistication, sector-specific regulations (particularly in healthcare, finance, and education), increased enforcement actions, and growing emphasis on child online protection.
This roundup from Tech Hive Advisory provides detailed analysis of African data protection trends.
South America
Brazil: LGPD Matures and AI Regulation Advances
Brazil’s Lei Geral de Proteção de Dados (LGPD) continues to mature as the country’s comprehensive data protection framework. A significant development for web professionals: companies utilizing Standard Contractual Clauses for international data transfers must incorporate ANPD-approved clauses into their contractual instruments by August 23, 2025, as mandated by Resolução CD/ANPD 19/2024.
The National Data Protection Authority (ANPD) has been active on multiple fronts. In mid-2025, it launched public consultations on the treatment of sensitive biometric data, signaling upcoming regulations for this high-risk data category. Resolução CD/ANPD 15/2024 mandates notification of data breaches to both the ANPD and affected individuals when risks arise.
AI regulation is advancing rapidly. After years of debate, Brazil’s Senate approved an AI bill in December 2024. The bill sets rights and obligations for developers, deployers, and distributors of AI systems, taking a human rights, risk management, and transparency approach. A Special Committee was formed in April 2025 to advance its examination in the Chamber of Deputies. As of July 2025, the bill (PL 2338/2023) remains pending but progressing.
At the state level, Goiás became the first Brazilian state to introduce an AI law (Complementary Law 205/2025) in May 2025, establishing an ethics council, AI sandbox, auditability requirements, and environmental standards for data centers.
In a notable enforcement action, the ANPD suspended Meta’s data training policy in July 2024 and imposed a daily penalty for improper use of AI training data, demonstrating that existing LGPD provisions already govern AI-related data use even before dedicated AI legislation passes.
Covington’s overview of Brazil’s digital policy in 2025 provides comprehensive analysis of the regulatory landscape.
What Web Professionals Can Do
The global regulatory landscape is complex, but there are practical steps you can take:
Conduct a jurisdiction audit. Understand where your users are located and which regulations apply to your projects. A website accessible to EU users needs to comply with GDPR and potentially the EAA; one serving Indian users now needs to comply with the DPDP Rules.
Build accessibility into your workflow. With the EAA in effect, ADA litigation continuing in the US, and accessibility requirements emerging globally, WCAG compliance should be standard practice, not an afterthought.
Understand age verification requirements. Australia’s under-16 social media ban represents the most stringent youth protection measure globally, and other countries are watching closely. If you’re building platforms that could be accessed by children, familiarize yourself with age assurance technologies and privacy-preserving verification methods. Even if your jurisdiction hasn’t implemented similar bans, expect increased scrutiny around child safety measures.
Implement privacy by design. Multiple jurisdictions now require or encourage privacy considerations from the earliest stages of development. Document your data practices, implement appropriate consent mechanisms, and be prepared to demonstrate compliance.
Stay informed. Regulations evolve, and implementation timelines shift. Follow the regulatory bodies relevant to your work and consider joining industry associations that track legislative developments.
Consult specialists. For complex compliance questions, especially around cross-border data transfers or sector-specific requirements, legal counsel with expertise in technology regulation is invaluable.
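The jurisdiction audit described above can start as something very simple: a mapping from where your users are to the regimes discussed in this article. The region keys and regime names in this TypeScript sketch are illustrative and non-exhaustive, and it is a planning aid, not legal advice.

```typescript
// Minimal sketch of a jurisdiction audit: map user regions to the
// regulatory regimes covered in this roundup. Illustrative and
// non-exhaustive — a starting point for a compliance review, not legal advice.

const regimesByRegion: Record<string, string[]> = {
  EU: ["GDPR", "AI Act", "Digital Services Act", "European Accessibility Act"],
  UK: ["UK GDPR", "Online Safety Act"],
  IN: ["DPDP Act 2023 + DPDP Rules 2025"],
  AU: ["Privacy Act", "Online Safety Amendment (Social Media Minimum Age) Act"],
  BR: ["LGPD"],
};

function applicableRegimes(userRegions: string[]): string[] {
  const result = new Set<string>();
  for (const region of userRegions) {
    for (const regime of regimesByRegion[region] ?? []) {
      result.add(regime);
    }
  }
  return [...result].sort();
}

// A site serving EU and Indian users picks up both regimes at once.
console.log(applicableRegimes(["EU", "IN"]));
```

Even a table like this makes gaps visible early: if your analytics show traffic from a region with no entry, that is the next research task.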
The web connects us globally—and so do the laws that govern it. By staying informed about international regulatory developments, web professionals can build experiences that serve users everywhere while respecting local requirements and expectations.
What international regulatory developments are you tracking? We’d love to hear your thoughts in the comments below. As always, feel free to reach out to learn more about Web Professionals Global and our mission of Community, Education, Certification.