Child Safety Standards
Our commitment to preventing Child Sexual Abuse and Exploitation (CSAE)
Last Updated: February 2026
Avokindo Digital Sdn Bhd - Child Safety Policy
🛡️ Zero Tolerance Policy
Avokindo maintains a strict zero-tolerance policy against Child Sexual Abuse Material (CSAM) and any form of Child Sexual Abuse and Exploitation (CSAE). We are committed to creating a safe environment and will take immediate action against any violations.
1. Our Commitment
Avokindo is dedicated to protecting children from sexual abuse and exploitation. As a social platform connecting adults for hospitality and cultural exchange, we:
- Restrict access to users 18 years and older, verified through mandatory identity verification during registration
- Employ advanced content moderation systems to detect and remove prohibited content
- Maintain trained safety teams to review reports and take swift action
- Cooperate fully with law enforcement agencies worldwide
- Report all instances of CSAM to appropriate authorities, including the National Center for Missing & Exploited Children (NCMEC)
2. Prohibited Content and Behavior
The following are strictly prohibited on Avokindo and will result in immediate account termination and reporting to authorities:
- Any sexual content involving minors (under 18 years of age)
- Solicitation of minors for sexual purposes
- Sharing, distributing, or possessing child sexual abuse material
- Grooming behavior or attempts to establish inappropriate relationships with minors
- Discussions promoting or normalizing child exploitation
- Any content that sexualizes children, including fictional or AI-generated content
- Links to external sites containing CSAM or exploitative content
Note: Avokindo is an adults-only platform. Users must be 18 years or older to create an account, and we employ facial verification technology to confirm age.
3. Prevention Measures
3.1 Age Verification
All users must complete our verification process which includes:
- Government ID verification confirming the user is 18+
- Live facial recognition (liveness detection) to prevent identity fraud
- Photo comparison between ID and selfie
3.2 Content Moderation
We employ multiple layers of content moderation:
- Automated detection systems using industry-standard hash-matching technology (PhotoDNA) to identify known CSAM
- AI-powered content analysis for detecting suspicious content patterns
- Human review teams trained to identify and escalate potential violations
- 24/7 monitoring of reported content with priority handling for child safety reports
3.3 Staff Training
All Avokindo team members with access to user content receive specialized training on:
- Recognizing indicators of child exploitation
- Proper reporting procedures to law enforcement
- Trauma-informed content review practices
- Evolving threats and tactics, with regular refresher updates
4. Reporting Child Safety Concerns
If you encounter any content or behavior that may involve the exploitation or endangerment of children, report it immediately using one of the following methods:
In-App Reporting
Tap the report button (⚠️) on any user profile, message, or piece of content to flag it for review.
Email Our Safety Team
Send details to ask@avokindo.com with the subject line "URGENT: Child Safety Report".
Contact Authorities
If a child is in immediate danger, contact local law enforcement. You can also report directly to NCMEC at CyberTipline.org.
All reports are treated with the highest priority and confidentiality. We will never disclose the identity of reporters to accused users.
🛡️ Designated Child Safety Contact
For questions about our child safety policies or to report concerns:
ask@avokindo.com
Avokindo Digital Sdn Bhd
Kuala Lumpur, Malaysia
5. Response and Enforcement
When we receive a report or detect potential CSAE content, we take immediate action:
- Immediate content removal: suspected CSAM is removed within minutes of detection
- Account suspension: accounts involved in CSAE are immediately and permanently suspended
- Evidence preservation: relevant data is preserved for law enforcement investigation
- Authority notification: we report all confirmed CSAM to NCMEC and cooperate with local law enforcement
- No reinstatement: users terminated for CSAE violations are permanently banned
6. Cooperation with Law Enforcement
Avokindo is committed to full cooperation with law enforcement agencies investigating child exploitation. We will:
- Respond promptly to valid legal requests for user information
- Provide technical assistance to support investigations
- Submit reports to NCMEC's CyberTipline for all identified CSAM
- Work with international organizations like INTERPOL when appropriate
- Comply with all applicable mandatory reporting laws, including US federal law (18 U.S.C. § 2258A) and the Malaysian Communications and Multimedia Act 1998
7. Transparency and Accountability
We believe in transparency about our child safety efforts. Avokindo commits to:
- Publishing annual transparency reports on child safety actions taken
- Regular review and updates to our child safety policies
- Participation in industry initiatives, such as the Tech Coalition, to combat child exploitation
- Ongoing investment in detection and prevention technology
8. Contact Information
For questions about these Child Safety Standards, to report a concern, or to request additional information about our CSAE prevention practices:
- Email: ask@avokindo.com
- Company: Avokindo Digital Sdn Bhd
- Location: Kuala Lumpur, Malaysia
We are prepared to discuss our child sexual abuse material (CSAM) prevention practices and our compliance with applicable laws and regulations.