This Wednesday, September 17, Brazilian President Luiz Inácio Lula da Silva sanctioned, with a few vetoes, Statute #15.211/25, which establishes rules for the protection of children and adolescents in digital environments.
The Bill, originally introduced in 2022, gained momentum after a viral video by influencer Felipe Bressanim Pereira, known as “Felca”, denounced the sexualization of minors on social media platforms. Its approval reflects growing public demand for stricter social media regulation, driven by the use of these platforms in recent years to influence electoral processes and conduct harmful activities.
The most relevant change introduced by President Lula was moving the law’s effective date up to six months from its enactment, instead of the one-year period originally proposed in the bill.
President Lula also issued Decree #12.622/25 and Provisional Measure #1.319/25, transforming the Brazilian National Data Protection Authority into a regulatory agency and appointing it as the entity responsible for overseeing enforcement and issuing guidelines regarding the statute.
The new statute introduces extensive obligations for Online Service Providers (“OSPs”) operating in Brazil. Key provisions include:
- Scope and applicability: The rules apply to all OSPs whose products or services are targeted at or likely to be accessed by minors, regardless of where they are located, developed, manufactured, supplied, marketed, or operated. “Likely to be accessed” is defined as: (i) sufficient probability of use by and attractiveness to minors; (ii) considerable ease of access and use by minors; and (iii) a significant degree of risk to privacy, safety, or biopsychosocial development.
- Priority protection and best interests: Digital products and services targeted at or likely to be accessed by minors must ensure priority protection of these users. They must be guided by the best interests of minors and incorporate adequate and proportionate measures to ensure a high level of privacy, data protection, and safety.
- Risk prevention by design: Providers of digital products and services targeted at or likely to be accessed by minors must adopt reasonable measures from the design stage to prevent and mitigate risks of exposure to harmful content, including content related to sexual exploitation and abuse, physical violence, cyberbullying, incitement to self-harm or substance abuse, gambling, predatory advertising, and pornography.
- Age verification and restrictions:
- Providers of digital products and services targeted at or likely to be accessed by minors must implement mechanisms that ensure age-appropriate user experiences. Accounts belonging to minors up to 16 years of age must be linked to the account of one of their legal guardians.
- Providers of app stores and operating systems must implement proportional, auditable, and technically secure measures to verify users’ age or age range, and enable, through an API, the transmission of age signals to OSPs.
- OSPs offering content, products, or services deemed inappropriate for minors must implement effective measures to prevent access by minors. This includes deploying reliable age verification mechanisms at each access point. Self-declaration is explicitly prohibited. For the purposes of this rule, “inappropriate” content, products, or services are those containing pornographic material, as well as others restricted under current legislation.
- Parental supervision:
- OSPs must offer accessible and user-friendly parental supervision features allowing parents to: (i) view, configure, and manage account and privacy settings; (ii) restrict purchases and financial transactions; (iii) identify adult profiles interacting with the minor; (iv) access consolidated metrics on total usage time; (v) activate or deactivate safeguards through accessible controls; and (vi) access all information and control options in Portuguese.
- Providers of app stores and operating systems must allow parents to configure voluntary parental supervision mechanisms and actively monitor minors’ access to applications and content.
- Reporting and removal obligations: OSPs must report any detected content involving apparent sexual exploitation, abuse, abduction, or grooming of minors to the appropriate national and international authorities, and must also remove such content, in accordance with the requirements and deadlines to be defined in further regulation.
- Data preservation: OSPs are required to retain, for six months, relevant data from reported cases of sexual exploitation. This includes the reported content, its associated metadata, and data and metadata from the users involved.
- Notice-and-takedown: OSPs must promptly remove content that violates the rights of children and adolescents upon notification, regardless of a court order. For the purposes of this rule, content related to sexual exploitation and abuse, physical violence, cyberbullying, incitement to self-harm or substance abuse, gambling, predatory advertising, and pornography is considered a violation. Providers must also make available channels for receiving complaints. Users whose content is removed must be notified and granted the right to appeal, with clear justification, access to the appeal process, and defined procedural deadlines. This rule does not apply to content of a journalistic nature that is subject to editorial control.
- Prohibition of loot boxes: Loot boxes are prohibited in electronic games targeted at or likely to be accessed by minors.
- Transparency reports: OSPs with over one million underage users must publish semiannual transparency reports in Portuguese. These must include, at least: (i) the channels for receiving complaints; (ii) the number of complaints received; (iii) the number of content or account moderation actions, by type; (iv) the measures to identify child accounts and detect illicit activities; (v) the technical improvements for data protection and privacy; (vi) the mechanisms for verifying parental consent; and (vii) the methods and results of impact assessments and risk management related to the safety and health of minors.
- Local legal representation: OSPs operating in Brazil must maintain legal representation in the country with the authority to receive judicial and administrative summons, notices, and communications, to respond to Executive, Judiciary, and Public Prosecutor’s Office authorities, and to act on behalf of the company before the public administration.
- Sanctions: Non-compliance with legal obligations may result in: (i) warning; (ii) fines of up to 10% of the economic group’s revenue in Brazil or up to BRL 50 million per violation; (iii) temporary suspension of activities; or (iv) permanent ban (the last two require a court order).
- Governance: As mentioned, the government appointed the Brazilian National Data Protection Authority as the regulatory authority to oversee enforcement and issue guidelines.
In view of Provisional Measure #1.319/25, the statute will come into force on March 17, 2026.
Our team is fully prepared to assist clients with this matter. Should you have any questions, please contact us at info@lickslegal.com.