MEMORANDUM ON THE DRAFT INDUSTRY GUIDELINES FOR CHILD ONLINE PROTECTION AND SAFETY IN KENYA, 2025, TO THE COMMUNICATIONS AUTHORITY OF KENYA

JUNE 2025

This memorandum identifies potential gaps and proposes actionable improvements. The goal is to enhance the effectiveness of Kenya’s framework in safeguarding children in the evolving digital environment. Upon careful analysis of the Draft Industry Guidelines for Child Online Protection and Safety in Kenya, 2025, Civil Society Organisations call upon the Communications Authority of Kenya to take into account the following considerations:

  1. The draft lumps all children and young people under 18 into one group and makes no mention of their evolving capacities and abilities to provide consent when accessing apps and platforms. In fact, it does not mention consent at all.
  2. The guidelines need to reference existing statutes such as the Children’s Act, Cap. 141, and the Data Protection Act, Cap. 411C. Further, the draft does not mention the relevant international instruments applicable to children’s rights, namely the African Charter on the Rights and Welfare of the Child and the United Nations Convention on the Rights of the Child (UNCRC), both ratified by Kenya. These instruments address issues relevant to this draft, such as the child’s right to privacy and to protection from abuses and violations, among others.
  3. The guidelines are very process-heavy, with very little detail about what is expected of companies and the wider industry. For example, they stipulate the need to have a child online safety and protection policy and mechanisms for implementation, but do not include any details on what this policy should cover or what the mechanisms are.
  4. Below are the specifics submitted for review:
Each entry below identifies the Guideline, the relevant Provision or comment, and the Recommendation and Justification.
Guideline 1.1
Provision: These Guidelines may be cited as “Industry Guidelines for Child Online Protection and Safety in Kenya, 2025”.
Recommendation and Justification: Amend the citation to read “Communications Industry Guidelines for Child Online Protection and Safety in Kenya, 2025”, to clarify which industry these guidelines apply to. The Guidelines should also define the term ‘industry’ for the purposes of their application, to mean ‘a particular form or branch of productive labour; an aggregate of enterprises employing similar production and marketing facilities to produce items having markedly similar characteristics’.
Section 2.1 (definition of disability)
Comment: Despite the definition of disability, there is no requirement for accessible interfaces or alternative reporting tools. There is no mandate for actionable inclusion measures in Section 4.1.2, and Section 8.2 lacks specific accessibility requirements (e.g., sign language, accessible formats), thus excluding many children.
Recommendation and Justification: Introduce in Sections 8.1 and 8.2 a requirement for inclusive design: digital services must comply with recognised accessibility standards (e.g., WCAG), and reporting mechanisms must accommodate sign language, voice/SMS options, and easy-to-read formats.
Guideline 4.1
Provision: These Guidelines are premised on the following principles (a list of principles follows).
Recommendation and Justification: Introduce the principle of evolving capacities. This principle recognises that as children grow and develop, they gain increased understanding, autonomy, and the ability to make decisions about their own lives.
Guideline 4.1.3
Provision: A multi-stakeholder approach is required: Child online protection and safety requires a multi-stakeholder approach.
Recommendation and Justification: Provide a contextual definition of the multi-stakeholder approach, i.e. a method that brings stakeholders together to participate in dialogue, decision making and the implementation of responses to jointly perceived problems.
Sections 4.1.5 & 8.1.1.9 emphasise stakeholder roles and empowering childrenThese sections however fail to offer a specific mechanism. Additionally, there is no provision for formal channels which children can use to contribute to policy, design or review.Proposed amendment to introduce a Child Advisory Council, enabling periodic consultation (every 6 months) to inform CA policies, auditing and tool development. 
A clause to be added to Section 4 (Principles) affirming that “children must be participants, not just beneficiaries, in all stages of COP policy development. 
In Section 8.1.1, require creation of a Child Advisory Council, whose feedback must be integrated into product/tool design, policy revisions, and implementation reviews.
Guideline 4.1.6
Provision: Data protection and safety by design: This is crucial for the development of a safer internet and online experience for children.
Recommendation and Justification: Expand this clause to include privacy by design and by default. Providers of online platforms accessible to children should integrate the highest standards of privacy, safety and security into the design, development and operation of their services, with safety by design and by default as a foundational principle. Privacy by design and by default means that privacy and data protection are embedded throughout the entire lifecycle of the platform, ensuring that children’s personal data is automatically protected. Platform risk assessments and mitigation strategies should be mandated, deployed and periodically reviewed to address and remedy risks and online harms.
Guideline 5.1.2
Comment: We are not sure this provision achieves its objective. As noted above, the draft is very process-heavy about putting policies and mechanisms in place without detailing what those policies should look like. The danger of leaving this to companies or industry bodies is that the policies and mechanisms will almost certainly be weaker.
Section 7 (Application)

Section 7.1.3
Comment: There is no established body to coordinate, monitor, or evaluate the adoption of these Guidelines.
Recommendation and Justification: Establish a Children’s Online Safety Council under the CA, including the tech industry, child rights NGOs, educators, and law enforcement, to meet quarterly, review compliance, and publish annual reports.

Categories of ICT products and services
Comment: The categories of ICT products and services should include all stakeholders.
Recommendation and Justification: Amend this to state “All state and non-state ICT products and services that target and/or are accessible to children”.
Guideline 8.1
Provision: Guidelines for Implementation of Organisational Measures by the ICT Industry.
Comment: This lacks a contextual definition of organisational measures, and the mechanisms should be simplified.
Recommendation and Justification: Define organisational measures (i.e. the actions and processes put in place to address child online protection and safety).
Guideline 8.1.1
Provision: Develop, publish, and implement a corporate child online protection and safety policy and strategy.
Comment: The clause is unclear about what constitutes proof of commitment (e.g., signing or other actions), and does not provide for a review period for the policy.
Recommendation and Justification: The provision should:
– Clarify how organisations confirm their commitment (e.g., a signed declaration or policy publication).
– Specify a regular review period for the policy (e.g., annually).
Guidelines 8.1.1.9 and 8.1.1.12
Comment: Sections 8.1.1.9 and 8.1.1.12 broadly require consumer education and capacity building but lack structured delivery mechanisms, content standards or targets to ensure consistency and reach.
Recommendation and Justification: Section 8.1.1.9 should include the development of standard curricula (a National Digital Literacy Curriculum) and free downloadable resource toolkits for parents, guardians, and educators. Licensees must report the frequency and reach of workshops, e-learning modules, and public awareness campaigns, with content co-created with child safety experts.

Guideline 8.1.1.10
Provision: ‘Mechanism to clearly communicate how and where to report complaints.’
Comment: This provision does not mention what is to be done after a complaint is filed (investigation, response, duration).
Recommendation and Justification: This provision should guide the industry on the speed, transparency and accuracy necessary in addressing online violations of children’s rights.
Guideline 8.1.1.11
Provision: Align business practices with relevant legislation on marketing and advertising to children, including the Data Protection Act 2019.
Comment: The Data Protection Act does not specifically address marketing and advertising with regard to children. The Data Protection Act 2019, Section 37, provides regulations for the commercial use of data, including direct marketing.
Guideline 8.1.1.13
Provision: Capacity building initiatives within the organisation.
Comment: The clause is unclear whether training targets the entire organisation or only focal points. It is also unclear what the role of the Communications Authority (CA) is with regard to the capacity building of licensees.
Recommendation and Justification: The provision should:
– Specify who is required to be trained (e.g., all staff, specific departments).
– Provide that the CA will provide or support training for licensees.
Guideline 8.1.1.14
Provision: Designate a focal point for Child Online Protection and Safety.
Comment: The provision does not provide:
– Guidance on the role and its scope relative to organisation size.
– Clarity on whether the role is full-time or combined with other responsibilities.
– Any mention of required qualifications.
Recommendation and Justification: Provide guidance on:
– The role, scope and responsibilities of the focal point.
– The qualifications for the position, including counselling qualifications and knowledge of child protection escalation procedures.
Guideline 8.2
Provision: Guidelines for Implementation of Technical Measures by the Industry.
Comment: The clauses use vague language, especially 8.2.7, which leaves room for multiple interpretations. Subclauses 8.2.1 and 8.2.2 are repetitive.
Recommendation and Justification:
– Use specific, actionable language in technical requirements, including a contextual definition of technical measures.
– Merge 8.2.1 and 8.2.2 for clarity and efficiency.
Comment:
– No outlined penalties for non-compliance (after the lapse of the stipulated six-month duration).
– Communication of legal expectations is complex and inaccessible.
– No distinction between local and international legal obligations.
Recommendation and Justification:
– Introduce enforceable penalties (e.g., license revocation for serious non-compliance).
– Simplify and repackage the legal language to be easily understood (e.g., use Kenya Revenue Authority-style communication).
– Clearly state which local and international laws apply.
Comment: Compliance language is punitive rather than supportive.
Recommendation and Justification:
– Promote compliance and adoption through positive messaging and support.
– Still include appropriate penalties to ensure seriousness and accountability.
Comment: Lack of internal guidance standards.
Recommendation and Justification:
– Specify internal parameters for developing standard operating procedures.
– Provide examples of best practices to guide implementation.
9. Specific Guidelines on Broadcast Content and Broadcasters  
10. Specific Guidelines for Application Service Providers and Content Service Providers
11. Specific Guidelines for Mobile Operators
Comment: Is there a specific service provision or parental control tools embedded in SIM cards for children?
Recommendation and Justification: Ensure SIM cards for children are registered per the Kenya Information and Communications Act, 1998 and the Registration of SIM-cards Regulations, 2015.
12. Specific Guidelines for Hardware Manufacturers, Communication Devices and Equipment Vendors
Recommendation and Justification:
– Provide information on activating built-in technical safety mechanisms.
– Activate heightened security features by default before sale, especially for children’s devices.
13. Complaints Reporting Mechanisms
Comment:
– No clause on child-friendly reporting.
– Children may not be aware of the mechanism.
– Complex formalities.
– Children cannot submit complaints directly.
Recommendation and Justification: Add a clause on child-friendly reporting mechanisms, covering:
– Multiple channels: toll-free helplines, SMS, WhatsApp, web forms, mobile apps, school desks, or in-person reporting.
– Inclusive formats: use of sign language, braille, easy-to-read formats, and translations into local languages.
– 24/7 availability, especially for high-risk or urgent cases.
– Simple, non-technical language.
– Friendly, trained staff who know how to communicate with children of various ages and abilities, using a standard template for reporting.
– Clear data protection and confidentiality protocols.
– Anonymous reporting options.
– Liaison with already existing children’s helplines.
Section 13.4
Comment: Section 13.4 requires quarterly reports on complaints, but lacks key performance indicators (KPIs) or evaluation metrics to assess impact (e.g., resolution rates, CSAM takedowns, user training reach).
Recommendation and Justification: Amend Section 13.4 to require standardised KPIs such as the number of CSAM incidents detected/removed, turnaround time for complaints, training sessions held, and percentage of employees trained. Include periodic third-party audits or evaluations, and draft a standard template for reporting.
14.1 Compliance Implementation
Comment: No specified enforcement mechanisms or consequences of non-compliance.
Recommendation and Justification: Add consequences of non-compliance, such as fines and license suspension clauses.
14.2 Submission of COP Policies
Comment: No standardised documents or approval standards.
Recommendation and Justification: Develop templates for COP policies and complaint handling procedures.
14.3 Monitoring Compliance
Comment: No consequences for non-compliance.
Recommendation and Justification: Introduce follow-up actions: notices, warnings, penalties.
Section 14.3 (self-assessment)
Comment: Section 14.3 requires self-assessment to establish the level of compliance with the Guidelines, but fails to specify measurable targets or an independent evaluation of outcomes.
Recommendation and Justification: Amend to include timeline-based targets (e.g., 80% of content takedown requests processed within 48 hours) and independent audits every two years, with summaries published publicly.
14.4 Self-Auditing
Comment: Self-assessments are optional.
Recommendation and Justification: Make self-assessments mandatory and periodic, using a standardised tool provided by the Authority.
14.5 Public Complaints
Comment: Assumes the public is informed and empowered.
Recommendation and Justification: Launch digital literacy and privacy rights awareness campaigns, and require licensees to display complaint procedures publicly.
15.1 Review of Guidelines
Comment: Non-committal review frequency (‘from time to time’).
Recommendation and Justification: Set fixed review timelines and allow emergency reviews in response to changes.

SIGNED OFF BY:

  1. Watoto Watch Network
  2. Amnesty International Kenya
  3. Rights Click Alliance
  4. Child Tech Counties Consortium
  5. KenSafeSpace
  6. Internews
  7. Kictanet 
  8. Kenya Society for Deaf Children (KSDC)
  9. Childline Kenya
  10. Akili Network - Akili Kids! TV
  11. Mzalendo
  12. Africomm
  13. Midrift Hurinet 
  14. Action for Children Development Center
  15. Women Voice
  16. Kwa Wote Initiative
  17. Bringing Smiles Foundation
  18. Tabasamu Na Upendo
  19. Legal Sister
  20. Child Space Organization
  21. Tunza Safeguarding