
    Child Safety Policy

    Our commitment to protecting children and preventing exploitation on our platform.

Last Updated: December 2, 2025

    Paxillin Interface Private Limited ("we," "us," or "our") is committed to maintaining a safe environment and protecting children from exploitation, abuse, and harmful content. This Child Safety Policy outlines our standards, procedures, and commitment to preventing Child Sexual Abuse and Exploitation (CSAE) and ensuring our platform remains safe for all users.

    Zero Tolerance Policy: Paxillin maintains an absolute zero-tolerance stance toward any content or behavior that exploits, endangers, or sexualizes children. Any violation will result in immediate account termination and reporting to relevant law enforcement authorities.

    1. Our Commitment to Child Safety

    At Paxillin, child safety is a fundamental priority. We are dedicated to:

    • Preventing the creation, distribution, and consumption of child sexual abuse material (CSAM) on our platform.
    • Implementing robust systems to detect and remove exploitative content immediately.
    • Cooperating fully with law enforcement agencies in investigations related to child exploitation.
    • Providing clear reporting mechanisms for users to flag concerning content or behavior.
    • Continuously improving our safety measures through regular policy reviews and technology updates.
    • Training our team members on child safety protocols and best practices.

    2. Age Restrictions

    Paxillin is strictly intended for healthcare professionals aged 18 years and above.

    Our age verification and restriction measures include:

    • Mandatory age verification during the registration process.
    • Professional credential verification that inherently confirms adult status.
    • Immediate account termination if a user is discovered to be under 18 years of age.
    • Regular audits of user accounts to ensure compliance with age requirements.
    • Clear terms of service stating the minimum age requirement.

    If we discover that personal information has been collected from anyone under 18, we will delete that information immediately and take appropriate action.

    3. Child Sexual Abuse and Exploitation (CSAE) Prevention Standards

    Prohibited Content and Behavior:

    The following content and behaviors are strictly prohibited and will result in immediate action:

    • Child Sexual Abuse Material (CSAM) including any imagery, video, or text depicting minors in sexual situations.
    • Grooming behaviors or any attempts to establish inappropriate relationships with minors.
    • Solicitation or trafficking of minors for any purpose.
    • Content that sexualizes or objectifies minors in any way.
    • Sharing or requesting personal information of minors for exploitative purposes.
    • Any content promoting, glorifying, or normalizing child abuse or exploitation.
    • Artificially generated (AI/deepfake) sexual content involving minors.

    Our Prevention Measures:

    • Implementation of automated detection systems to identify and remove CSAM and related content.
    • Use of PhotoDNA and similar hash-matching technologies to detect known CSAM.
    • Human review teams trained in identifying exploitative content and behavior patterns.
    • Proactive monitoring of platform activity for suspicious patterns.
    • Regular updates to detection systems based on emerging threats and techniques.

    4. Content Moderation

    Our content moderation approach includes multiple layers of protection:

    • Automated Scanning: All uploaded content is automatically scanned using industry-standard detection technologies.
    • Human Review: Flagged content is reviewed by trained moderators within 24 hours of detection.
    • User Reports: All user reports are investigated promptly, with priority given to child safety concerns.
    • Proactive Monitoring: We actively monitor for patterns and behaviors associated with child exploitation.
    • Swift Removal: Violating content is removed immediately upon detection, and accounts are terminated.

    5. Reporting Mechanisms

    We provide multiple channels for reporting child safety concerns:

    In-App Reporting:

    Use the "Report" feature available on all content and user profiles to flag concerning material or behavior.

    Email Reporting:

Send reports directly to our Child Safety Team at safety@paxillin.com.

    External Reporting:

    You may also report to the National Center for Missing & Exploited Children (NCMEC) CyberTipline or your local law enforcement authorities.

    Response Commitment: All child safety reports are treated as high priority and are reviewed within 24 hours. We cooperate fully with law enforcement investigations and report confirmed CSAM to NCMEC and relevant authorities.

    6. Data Protection for Minors

    While our platform is not intended for minors, we maintain strict data protection measures:

    • We do not knowingly collect personal information from anyone under 18 years of age.
    • If we become aware that we have collected personal data from a minor, we immediately delete such information.
    • We do not share, sell, or use any data from minors for any purpose.
    • Our advertising and analytics systems are configured to exclude targeting of minors.
    • We comply with COPPA (Children's Online Privacy Protection Act) and similar international regulations.

    7. Information for Parents and Guardians

    We encourage parents and guardians to be involved in their children's online activities:

    • Paxillin is a professional healthcare networking platform and is not designed for users under 18.
    • If you believe your child has created an account on our platform, please contact us immediately.
    • We will promptly investigate and delete any account belonging to a minor.
    • We recommend using parental control software to monitor and restrict your child's online activities.
• We encourage educating children about online safety and the importance of not sharing personal information.

    8. Regulatory Compliance

    We comply with all applicable child safety laws and regulations, including:

    • Google Play Child Safety Standards: Full compliance with Google's CSAE prevention requirements.
    • Apple App Store Guidelines: Adherence to Apple's child safety and content moderation standards.
    • COPPA: Children's Online Privacy Protection Act compliance.
    • DPDPA 2023: India's Digital Personal Data Protection Act requirements.
    • IT Act 2000: Compliance with India's Information Technology Act provisions on child protection.
    • POCSO Act: Protection of Children from Sexual Offences Act compliance.

    We regularly review and update our policies to ensure ongoing compliance with evolving regulations and industry best practices.

    9. Child Safety Point of Contact

    For any child safety concerns or inquiries, please contact our dedicated Child Safety Team:

    Paxillin Interface Private Limited

    Child Safety Team

    Email: safety@paxillin.com

    General Inquiries: info@paxillin.com

    Response Time: All child safety reports are prioritized and will receive a response within 24 hours.

    Policy Updates

    This Child Safety Policy may be updated periodically to reflect changes in regulations, technology, or our practices. We will notify users of significant changes through our platform or via email. Continued use of the Service after updates constitutes acceptance of the revised policy.

    Report a Child Safety Concern

If you encounter any content or behavior that endangers children, please report it immediately to safety@paxillin.com.