AMNESTY INTERNATIONAL KENYA AND 5RIGHTS FOUNDATION COMMENTS ON THE DRAFT GUIDANCE NOTE FOR PROCESSING CHILDREN’S DATA

The ODPC’s draft Guidance Note on the Processing of Children’s Data is a significant step towards protecting children’s privacy in the digital age. While the document demonstrates several strengths that provide a strong starting point for safeguarding children’s data, it also highlights critical areas that require further refinement to align more closely with international best practices and ensure robust, comprehensive protection.

The Guidance Note comprehensively addresses definitions, privacy concerns, legislative frameworks, data protection principles, consent mechanisms, safeguards, and obligations for data handlers.

Amnesty International Kenya and 5Rights Foundation recommend that the ODPC enhance the Guidance Note by clearly defining and regulating the commercial use of children’s data; strengthening provisions for children’s evolving capacities and direct participation in data rights; emphasising tech companies’ responsibility to protect children’s rights rather than relying on parental consent mechanisms; addressing the additional risks presented by AI systems; and explicitly integrating a broader human rights impact assessment framework that considers the full spectrum of online harms.

Each entry below gives: the Page Number and Section of the Guidance Note; the Statement as Currently Captured; the Proposed Statement; and Comments and Justification.
Page 9, Section 6

Statement as Currently Captured: Overview of the national legislative frameworks.

Proposed Statement: Include explicit references to relevant international instruments to which Kenya is a party. These notably include the UN Convention on the Rights of the Child (UNCRC) and its accompanying General Comment No. 25; the African Charter on the Rights and Welfare of the Child (ACRWC); and the African Union Child Online Safety and Empowerment Policy.

Comments and Justification: To build on existing progress and international best practices, we recommend grounding the Guidance Note in established international children’s rights frameworks.
Page 2, Section 9.8; Page 7, Section 4; Page 10, Section 7.1; Page 22, Section 13.1

Statement as Currently Captured: Section 9.8, “Commercial Use of Children’s Data,” Regulation 13 prohibits profiling a child in relation to direct marketing. Profiling is identified as a major privacy concern that can lead to “unjust exclusion or discrimination” and requires a Data Protection Impact Assessment (DPIA).

Proposed Statement: Develop and integrate a detailed Section 9.8 on “Commercial Use of Children’s Data,” explicitly prohibiting manipulative design (dark patterns), limiting targeted advertising, and regulating economic exploitation (e.g. prohibiting loot boxes, excessive in-app purchases, and nudge techniques that manipulate children into sharing more personal data than necessary). Furthermore, expand the prohibition on profiling to include any profiling that is likely to lead to discrimination, exclusion, or unfair treatment, regardless of whether it is for direct marketing. Mandate regular independent audits and Child Rights Impact Assessments of algorithms and AI systems likely to impact children’s rights, to identify and mitigate risks, including bias, discriminatory outcomes, privacy violations, and commercial exploitation. Require human oversight and intervention for any automated decision-making that is likely to affect a child’s rights or well-being.

Comments and Justification: This addresses a critical regulatory gap, as the section is currently missing. It protects children from financial exploitation and manipulative online practices that undermine their autonomy and well-being, aligning with UNCRC General Comment No. 25, which warns against “exploitative contractual arrangements, i.e. dark patterns, profiling and automated processing for user retention and information filtering,” and with OECD recommendations on harmful commercial practices. The current note only prohibits direct marketing profiling, which is narrow.
Page 14, Section 9.1; Page 17, Section 9.6; Page 15, Section 9.2

Statement as Currently Captured: Section 9.1 states that data handlers should “recognise the evolving capacity of the child to form their view and give due weight to that view”. However, Section 9.6 states that the rights to access, rectify, restrict, or delete personal data “should be handled by a parent or legal guardian”. The note requires informing parents about processing details.

Proposed Statement: Develop a tiered approach for children to directly exercise some data rights, with parental oversight, for older children (e.g. 13–17 years), considering their evolving capacities. Mandate the provision of clear, concise, and age-appropriate privacy notices directly to children, in addition to comprehensive information for parents. Develop and publicise clear, simple, and accessible mechanisms for children and parents to lodge complaints and seek redress.

Comments and Justification: Empowers children as active rights-holders, fosters digital literacy, and aligns with UNCRC General Comment No. 25’s emphasis on evolving capacities and participation. The current guidance acknowledges evolving capacity but does not provide mechanisms for the direct exercise of rights by children. The GDPR calls for “clear, child-friendly notices”.
Page 13, Section 8

Statement as Currently Captured: “Regardless of the legal basis relied upon by the data handler, parental consent must be obtained when processing children’s data. The parent or guardian’s consent primarily serves as a means of meeting the notification obligation rather than being a Consent as a lawful basis which can be withdrawn.”

Proposed Statement: Emphasise that relying on children or their parents to understand complex privacy policies and “consenting” to companies using data has significant limitations, as it shifts accountability from businesses, who employ behavioural psychologists for the specific purpose of maximising engagement, to an overwhelmed parent or even a child who may be less aware of the risks to their privacy and less able to understand complicated terms and conditions. Instead, a proactive and upstream approach that ensures a high level of privacy by default is needed to meaningfully protect children’s privacy, and data minimisation and purpose limitation remain paramount. Children and their parents must have clear avenues for redress if these principles are violated. Clarify that while parental/guardian consent may serve as a notification obligation when processing is based on a legal obligation, parents/guardians must still retain the right to request rectification, restriction, or deletion of data, and to object to certain processing activities, in line with the “best interest of the child” and general data subject rights.

Comments and Justification: Prevents an “illusion of choice” and ensures parents/guardians maintain effective control over their child’s data, enhancing accountability and redress. The current phrasing limits parental control over data when the lawful basis is not consent. Moreover, relying solely on consent for data processing is a fundamentally flawed approach, given the imbalance between the willingness of tech companies to capture greater quantities of data and children’s lower awareness of the risks and consequences of the collection and processing of their data. Instead, a robust data protection framework is crucial to ensure children’s right to privacy, as well as to safety and security in the digital environment.
Page 16, Sections 9.4 & 9.5; Page 15, Section 9.2

Statement as Currently Captured: Sections 9.4 and 9.5 discuss age verification methods, stating they must be “proportionate, privacy preserving and adhere to the principle of data minimisation” and “grounded on a risk-based approach” with “greater stringency… where the processing… is of higher risk”. Section 9.2 requires informing parents about processing details.

Proposed Statement: Emphasise that age assurance/age verification is not a complete solution on its own, but should form part of a broader strategy for child online privacy and safety alongside other approaches, such as setting a high level of privacy configuration by design and default. Any age assurance mechanisms in use must be privacy-preserving, proportionate to the risks, effective, age-appropriate, accessible, transparent, and secure. Provide a tiered approach to age verification, clearly linking the required level of stringency to the sensitivity of the data and the potential for harm, drawing from COPPA’s detailed methods and the GDPR’s “reasonable efforts” within a risk-based framework. Mandate that privacy notices and consent forms be presented in plain language, with visual aids where appropriate, and be easily accessible. Consider requiring a “summary” version for children and parents that highlights key data practices in simple terms, alongside the full legal policy.

Comments and Justification: Age assurance does not prevent children from being exposed to online harm, but it ensures that the sector does not continue to ignore the presence of children on their platforms. As such, tech companies have an opportunity to develop age-appropriate experiences for children. To support the protection of children’s rights, age assurance must be privacy-preserving, proportionate to the risks, and effective, as set out in the IEEE 2089.1 Standard for Online Age Verification. This ensures verification methods are proportionate to risk, reduces barriers to beneficial services, and makes information truly understandable, fostering informed consent. The current guidance lacks granular detail on tiered verification. The GDPR emphasises “clear, child-friendly notices”; COPPA provides specific methods for verifiable parental consent.
Page 18, Section 10.1; Page 10, Section 7.3; Page 15, Section 9.2; Page 22, Section 13.1

Statement as Currently Captured: Section 10.1 outlines security measures, including encryption, pseudonymisation, access controls, and regular audits. Section 7.3 mentions conducting regular internal audits. Section 9.2 requires informing parents about automated decision-making and human intervention. Profiling is identified as a high-risk activity that requires a DPIA.

Proposed Statement: Add a requirement for data handlers to regularly review and update their security measures based on the latest cybersecurity best practices and emerging threat intelligence, emphasising continuous risk assessment and adaptation. Explicitly prohibit profiling children for purposes that are likely to lead to discrimination, commercial exploitation, exclusion, or unfair treatment in access to services, resources, or opportunities. Mandate regular, independent audits and Child Rights Impact Assessments of algorithms and AI systems likely to impact children’s rights, to identify and mitigate risks, including bias, discriminatory outcomes, privacy violations, and commercial exploitation. Require human oversight and intervention for any automated decision-making that is likely to affect a child’s rights or well-being.

Comments and Justification: Protects against emerging cyber threats and prevents algorithmic harms to children’s rights, such as discrimination or unfair treatment, ensuring the ethical and equitable use of children’s data. The security measures in the current note are sound, but the note lacks a requirement for proactive, ongoing updating. UNCRC General Comment No. 25 warns against algorithmic discrimination and exploitative processing.
Page 17, Section 9.6; Page 19, Sections 10.5 & 10.6; Page 5, Section 2

Statement as Currently Captured: Section 9.6 states that rights “should be handled by a parent or legal guardian”. Section 10.5 outlines accountability mechanisms and regular internal audits. Section 10.6 mandates data breach notification to the ODPC and parents. Section 2 describes.

Proposed Statement: Develop and publicise clear, simple, age-appropriate, and accessible mechanisms for children and parents to lodge complaints with the ODPC or directly with data handlers regarding data protection violations. This could include online portals, dedicated helplines, or simplified complaint forms. The ODPC should adopt a more proactive regulatory stance, including conducting regular sector-specific audits of child-directed online services, issuing proactive warnings about emerging data protection risks for children, and engaging in continuous research to understand the evolving digital landscape and its impact on children’s data rights.
Page 19, Section 11

Statement as Currently Captured: N/A

Proposed Statement: Add a section addressing the responsibility of businesses to respect children’s rights in the context of the heightened risks presented by AI systems. Tech companies must provide children with a high level of privacy, safety, and security by design and default. This notably necessitates embedding Safety- and Privacy-by-Design in the design, development, and deployment of AI systems. Such a proactive approach involves addressing known or anticipated risks during product development to prevent or significantly reduce potential harm.

Key Abbreviations

COPPA: Children’s Online Privacy Protection Act
GDPR: General Data Protection Regulation
OECD: Organisation for Economic Co-operation and Development
UNCRC: United Nations Convention on the Rights of the Child