Standards Against Child Sexual Abuse and Exploitation (CSAE)

 

AIS Social’s Commitment to Safety

At AIS Social, we have zero tolerance for child sexual abuse and exploitation (CSAE). We are committed to ensuring that our platform is not used to harm children in any way, and we proactively enforce safety measures to detect, prevent, and report such abuse.

1. Clear Prohibition

We explicitly prohibit:

  • The upload, distribution, or sharing of any content that depicts, promotes, or glorifies the sexual exploitation or abuse of children.

  • The use of our platform to communicate with or groom minors for abusive purposes.

  • Attempts to solicit sexually explicit material from minors or share such material.

 

Violations result in immediate account termination and mandatory reporting to the appropriate authorities.

2. Detection and Reporting

We use a combination of:

  • Industry-standard tools (e.g. PhotoDNA, CSAI Match) to detect known CSAM (Child Sexual Abuse Material).

  • Automated scanning for suspicious patterns in text, image, and file uploads.

  • User reporting mechanisms that allow the community to flag suspicious content or behavior.

 

All verified CSAE content or activity is:

  • Immediately removed

  • Reported to relevant authorities

3. Preventive Design and Access Controls

We actively design against exploitation risks by:

  • Restricting account creation to verified adult users (e.g. sailors with AIS-enabled vessels).

  • Offering no general-purpose chat rooms or child-directed content.

  • Monitoring account behavior for grooming signals or repeat boundary violations.

  • Educating users on safe conduct and reporting procedures.

 

AIS Social is not intended for anyone under 18. Any account identified as belonging to an underage user will be removed.

4. Collaboration and Accountability

We participate in industry collaborations to stay current on CSAE detection and prevention best practices, and we:

  • Review and update policies regularly in line with international CSAE standards (e.g. WeProtect, ECPAT, EU CSA Regulation).

  • Cooperate fully with law enforcement investigations.

  • Conduct periodic internal audits of moderation, escalation, and reporting workflows.

5. Survivor-Centered Approach

We recognize that CSAE causes lasting harm. Our platform prioritizes:

  • Swift removal of harmful content

  • Respectful handling of reports

  • Non-retaliation and anonymity for whistleblowers or survivors who come forward

If you believe a child is at risk, or you have encountered CSAE content or behavior on our platform, please report it immediately via info@humeko.com or our in-app safety reporting tool. We act on all reports with urgency and care.
