Abuse Complaints Guide
⚠️ This guide applies under both EU and UK law.
🌐 To report, send an email to support@um.ink.
1. Purpose
This guide explains how users can report abusive or policy-violating content on UMInk Cloud (DTNETWORK LTD), including child sexual abuse material, violent or gory content involving weapons, illegal drug content, hate speech, and other material harmful to minors. It also sets out the legal framework under UK law and EU digital services regulation.
2. Legal Framework
| Legal Framework | Relevant Content | Key Obligations |
|---|---|---|
| UK: Online Safety Act 2023 | All content classified as ‘illegal’, including CSAM, terrorism, and self-harm content, as well as content accessible to children. Platforms must remove it quickly, conduct risk assessments, and implement effective age verification (from 25 July 2025). Penalties of up to £18 million or 10% of global turnover, whichever is greater, apply. | Maintain a complaints procedure; swift takedown; keep records and risk assessments; age assurance for children. |
| UK: Crime & Policing Bill 2025 | Includes AI-generated CSAM (even if no real child depicted) as illegal. Hosts and moderators can be criminally liable. | CSAM must be removed promptly and reported to NCA / IWF; platforms must not provide safe havens. |
| EU: e-Commerce Directive (2000/31/EC) | Platforms benefit from liability exemption unless they have actual knowledge or fail to act. | Must establish a clear “notice-and-action” mechanism so users can report illegal content efficiently. |
| EU: Digital Services Act (DSA) | Applies to very large platforms or intermediary services operating in EU. | Requires risk assessments, emergency removal, priority channels for “trusted flaggers”, and cooperation with enforcement authorities. |
3. Reportable Content Types
You may report content if it clearly includes (but is not limited to):
- Child Sexual Abuse Material (CSAM) – Real or AI-generated images, videos, pseudo-photographs.
- Graphic Violence, Gore, or Weapons – particularly content involving minors.
- Illicit Drug-related Material – sales, manufacture, instructions, promotion.
- Terrorist or Hate Speech – incitement to violence against protected groups.
- Self-Harm / Suicide Inducement – content accessible to minors that encourages self-harm, eating disorders, etc.
- Fraud, Spam, Phishing, especially if targeting minors or involving deception.
- Other clearly illegal content, such as large-scale copyright infringement.
4. Reporting Instructions
Email to: support@um.ink
| Field | What to Include |
|---|---|
| Required | Links or screenshots of the content; date/time when seen; why it violates policy (e.g. “CSAM – AI-generated image of child sexual abuse”) |
| Optional | Contact info or alias; affected age group (e.g. minor victim) |
Process after receipt:
- Acknowledge within 48 hours.
- Triage and assess:
  - If content is outright illegal (e.g. CSAM, drug sales, extremism), we remove it immediately and notify enforcement agencies.
  - In disputed cases, content may be preserved pending expert or legal review.
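For reporters who submit repeatedly (e.g. child-safety organizations), the required fields above can be assembled programmatically. The sketch below is purely illustrative: the function name, field labels, and report layout are our own assumptions, not a required format, and a plain free-text email to support@um.ink is equally valid.

```python
from email.message import EmailMessage

def build_abuse_report(links, seen_at, reason, contact=None):
    """Assemble an abuse report email covering the required fields.

    links:   URLs (or descriptions of screenshots) of the content
    seen_at: date/time the content was seen
    reason:  why it violates policy
    contact: optional contact info or alias (reports may be anonymous)
    """
    msg = EmailMessage()
    msg["To"] = "support@um.ink"
    msg["Subject"] = "Abuse report"
    body_lines = [
        "Content links: " + ", ".join(links),
        "Date/time seen: " + seen_at,
        "Policy violation: " + reason,
    ]
    if contact:
        body_lines.append("Contact (optional): " + contact)
    msg.set_content("\n".join(body_lines))
    return msg

report = build_abuse_report(
    links=["https://um.ink/example-page"],
    seen_at="2025-08-01 14:30 UTC",
    reason="Illegal drug sales content",
)
print(report["To"])  # support@um.ink
```

Omitting the `contact` argument leaves the report anonymous, consistent with the FAQ below: no name or identifying details are required.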
5. Response Standards & Timelines
| Timeline | Platform Action |
|---|---|
| T₀ + 48 hours | Acknowledge receipt of complaint |
| T₀ + 5 business days | Provide status: removed / pending / rejected (with rationale) |
| Within 14 days of rejection | Complainant may request a second review by a different compliance officer |
| Retention | All records (notice, review, decision rationale) are kept at least 12 months (per Ofcom illegal content codes) |
6. Data Protection & Privacy
- Comply with UK GDPR / Data Protection Act 2018 and EU GDPR.
- Personal data in reports is used only for complaint handling, not marketing.
- Where personal data of third parties is involved, appropriate safeguards are applied.
7. Special Protections for Minors
- Content involving children is prioritized.
- Material accessible to minors that promotes self-harm, substance abuse, or explicit violence is fast-tracked under UK and EU rules.
8. Appeals & Regulator Escalation
- UK users may escalate unresolved serious complaints to Ofcom.
- EU users may escalate to their national Digital Services Coordinator under the DSA.
9. FAQ Highlights
- What is a “trusted flagger”?
  Certain entities (child protection bodies, law enforcement partners) have priority reporting status.
- Why retain illegal content temporarily?
  Evidence must be preserved securely until removal or law enforcement escalation.
- Can I file anonymously?
  Yes. We do not require names; email addresses are kept confidential.
📌 Additional Resources
- Ofcom: Online Safety Act & Illegal Content Codes of Practice
- EU Commission: Tackling Illegal Content Online
- UK Crime & Policing Bill (2025) Fact Sheet on CSAM
Thank you for helping keep UMInk safe and compliant.