Abuse Complaints Guide
⚠️ This guide addresses the laws applicable to platform operation in both the EU and the UK. The English version follows the Chinese version.
🌐 How to report: send an email to support@um.ink.
Chinese Version
1. Purpose of This Guide
This guide helps users identify and report abusive content that violates the UMInk / DTNETWORK LTD terms of service, in particular child sexual abuse material, violent and gory content, weapons, drugs, hate speech, and self-harm content, so that the platform meets its regulatory obligations and can act quickly.
2. Legal and Policy Basis
| Legal Framework | Content Covered | Summary of Requirements / Obligations |
|---|---|---|
| UK: Online Safety Act 2023 | Material classified as “illegal content” (e.g. CSAM, terrorist content, incitement of child self-harm) must be proactively detected and removed quickly; “highly effective” age assurance is required for content harmful to children, in force from 25 July 2025; breaches carry fines of up to £18 million or 10% of global turnover. | Provide a statutory reporting channel, rapid-handling mechanisms, risk assessments, and documented complaint procedures |
| UK: Crime & Policing Bill 2025 | AI-generated images can constitute CSAM and are criminal even where no real child is involved; online platforms have a duty to report CSAM to the national authorities as soon as it appears. | CSAM must be removed promptly and reported to the NCA / IWF |
| EU: e-Commerce Directive (2000/31/EC) | Platforms must act against illegal content once notified, but enjoy a liability exemption while they have no actual knowledge; a “notice-and-action” path must also be provided. | Maintain a “trusted flagger” mechanism and transparent handling procedures |
| EU: Digital Services Act (DSA) | Imposes strict obligations on very large platforms, including prompt response to reports, cooperation with law enforcement, and risk assessments to prevent illegal content from reappearing. | If UMInk’s service falls within the scope of the DSA, the platform will comply with that law. |
3. Reportable Content Types
Please report the following clearly illegal or high-risk content:
- Child sexual abuse material (CSAM), including real or AI-generated images, videos, or pseudo-photographs. (UK Protection of Children Act 1978 / Coroners and Justice Act 2009; AI-generated material is equally illegal)
- Violent, gory, and weapons-related content, especially depictions of abuse, harm to children, firearm use, or killing.
- Drug-related content: illegal sale, manufacturing or usage guides, or discussion of deals.
- Terrorist and hate speech: content promoting extremism, inciting violence, or expressing hatred toward groups on grounds of race, gender, religion, and so on.
- Encouragement of self-harm or suicide: content accessible to minors that encourages self-harm, eating disorders, and the like.
- Fraud or phishing, especially where the target is vulnerable or minors could be harmed.
- Other illegal activity, including but not limited to violations of personal privacy or reproductive rights, or extreme forms of intellectual property infringement.
4. Reporting Process
| Item | What to Include |
|---|---|
| Required | – Link to or screenshot of the content – Date and approximate time seen – Brief description of the violation and/or the terms breached – A note where minors or children are involved |
| Optional | Reporter contact details (anonymous reports accepted) |
Process after a report is received
- We acknowledge the complaint within 48 hours and open an internal ticket.
- Initial assessment and handling (see the sketch after this list):
- If the content is CSAM, violent, drug-related, or otherwise classified as illegal under UK/EU law, it is taken down and quarantined immediately and reported to law enforcement.
- Content that may be illegal but is genuinely disputed is preserved unchanged for review by judicial or compliance experts.
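To make this intake-and-triage flow concrete, below is a minimal sketch in Python of how a complaint ticket and the initial assessment decision might be modelled. All names here (AbuseReport, Category, triage, and the category list) are illustrative assumptions, not UMInk's actual implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
from typing import Optional


class Category(Enum):
    CSAM = "csam"
    VIOLENCE = "violence"
    DRUGS = "drugs"
    HATE_OR_TERROR = "hate_or_terror"
    SELF_HARM = "self_harm"
    FRAUD = "fraud"
    OTHER = "other"


# Categories treated as outright illegal under UK/EU law: immediate
# takedown, quarantine, and referral to law enforcement.
IMMEDIATE_TAKEDOWN = {Category.CSAM, Category.VIOLENCE,
                      Category.DRUGS, Category.HATE_OR_TERROR}


@dataclass
class AbuseReport:
    """One complaint ticket; fields mirror the 'Required' / 'Optional' table."""
    content_url: str                         # link or screenshot reference
    seen_at: str                             # date and approximate time observed
    violation_summary: str                   # description / terms breached
    involves_minor: bool = False
    reporter_contact: Optional[str] = None   # optional; anonymity allowed
    received_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))


def triage(report: AbuseReport, category: Category) -> str:
    """Initial assessment step from section 4 of this guide."""
    if category in IMMEDIATE_TAKEDOWN:
        # Take down and quarantine at once, then refer to law
        # enforcement (e.g. NCA / IWF for CSAM).
        return "takedown_quarantine_and_report"
    # Disputed or unclear cases: preserve the content unchanged for
    # review by judicial or compliance experts.
    return "preserve_for_expert_review"
```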
5. Platform Response Standards & Timeline
- Receipt (T₀): an acknowledgement is sent within 48 hours of the email being received.
- Initial handling (T₀ + 5 business days): a status update is provided, e.g. “removed / in progress / rejected with reasons”.
- If a complaint is rejected, the user may request a re-review within 14 days, and the platform will assign a different compliance reviewer.
- All complaint records (original report, review, decision rationale) are retained for at least 12 months to meet Ofcom requirements (illegal content codes of practice). (A worked sketch of these time windows follows this list.)
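A small worked sketch, using only Python's standard library, of how these windows could be computed from the time of receipt (T₀). The business-day helper ignores public holidays, and the 14-day re-review window really runs from the rejection date rather than T₀, as the comment notes.

```python
from datetime import datetime, timedelta, timezone


def add_business_days(start: datetime, days: int) -> datetime:
    """Advance `days` business days (Mon-Fri), skipping weekends."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 ... Friday=4
            days -= 1
    return current


def complaint_deadlines(received_at: datetime) -> dict:
    """Time windows from section 5, measured from receipt (T0)."""
    return {
        "acknowledge_by": received_at + timedelta(hours=48),
        "status_update_by": add_business_days(received_at, 5),
        # The re-review window actually opens on the rejection date;
        # it is anchored to T0 here purely for illustration.
        "review_request_window_ends": received_at + timedelta(days=14),
        # "At least 12 months" of retention, approximated as 365 days.
        "retain_records_until": received_at + timedelta(days=365),
    }


t0 = datetime(2025, 7, 25, 9, 0, tzinfo=timezone.utc)
for name, due in complaint_deadlines(t0).items():
    print(f"{name}: {due:%Y-%m-%d %H:%M}")
```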
6. Data Protection & Privacy
- We comply with the UK GDPR / UK Data Protection Act 2018 and the EU GDPR.
- Reporters’ personal information (e.g. email address, IP) is used only as needed for compliance, never for commercial or otherwise unauthorised purposes (a masking sketch follows this list).
- Where reported content contains another person’s personal data, it is handled as the law requires.
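As one illustration of how reporter identifiers might be kept out of internally shared records, here is a minimal masking sketch. The two patterns are deliberate simplifications assumed for the example; a real PII scrubber would need far broader coverage (IPv6, names, phone numbers, and so on).

```python
import re

# Simplified patterns for the identifiers named above: email addresses
# and IPv4 addresses. Assumed for illustration only.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
IPV4_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")


def redact(text: str) -> str:
    """Mask personal identifiers before a record is shared internally."""
    text = EMAIL_RE.sub("[email redacted]", text)
    text = IPV4_RE.sub("[ip redacted]", text)
    return text


note = "Report from jane@example.org (logged in from 203.0.113.7)."
print(redact(note))
# -> Report from [email redacted] (logged in from [ip redacted]).
```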
7. Special Protections for Children
- Abusive content involving children that is publicly shared or discoverable through search is handled with priority.
- Under the EU framework, implied sexual or violent content in comics or animation that is widely viewed by minors is given a higher handling priority, consistent with EU child-safety rules.
8. Escalation Channels for Unresolved Complaints
- UK users: may complain to Ofcom if the platform fails to handle seriously illegal content as the law requires.
- EU users: may complain to the national Digital Services Coordinator or DSA enforcement authority of their member state.
9. Frequently Asked Questions (FAQ)
- 📌 What is a “trusted flagger”?
  Under the EU Recommendation and the DSA, reports from certain sources, such as child-protection bodies or law-enforcement liaison groups, can be prioritised for faster handling. UMInk applies this principle where appropriate.
- 📌 Why is illegal content preserved?
  To comply with the UK and EU “notice-and-action” mechanisms and law-enforcement preservation requirements, the platform must preserve evidence until removal is confirmed or the material has been referred to the relevant authority.
- 📌 Is anonymous reporting safe?
  Yes. The reporting mailbox does not require your name, and the system redacts the reporter’s email address when the report is handled.
English Version
1. Purpose
This guide explains how users can report abusive or policy‑violating content on UMInk Cloud (DTNETWORK LTD), including child sexual abuse material, violent or gory content involving weapons, illegal drug content, hate speech, and other material harmful to minors. It also sets out the legal framework under UK law and the EU Digital Services Act.
2. Legal Framework
| Legal Framework | Relevant Content | Key Obligations |
|---|---|---|
| UK: Online Safety Act 2023 | Content classified as ‘illegal’, including CSAM, terrorist material, and content encouraging self‑harm, must be removed quickly; platforms must conduct risk assessments and implement highly effective age assurance for children (from 25 July 2025). Penalties of up to £18 million or 10% of global turnover apply. | Maintain a complaints procedure; swift takedown; keep records and risk assessments; age assurance for children. |
| UK: Crime & Policing Bill 2025 | Includes AI‑generated CSAM (even if no real child depicted) as illegal. Hosts and moderators can be criminally liable. | CSAM must be removed promptly and reported to NCA / IWF; platforms must not provide safe havens. |
| EU: e‑Commerce Directive (2000/31/EC) | Platforms benefit from liability exemption unless they have actual knowledge or fail to act. | Must establish a clear “notice‑and‑action” mechanism so users can report illegal content efficiently. |
| EU: Digital Services Act (DSA) | Applies to intermediary services operating in the EU, with the strictest obligations on very large platforms. | Requires risk assessments, emergency removal, priority channels for “trusted flaggers”, and cooperation with enforcement authorities. |
3. Reportable Content Types
You may report content if it clearly includes (but is not limited to):
- Child Sexual Abuse Material (CSAM) – Real or AI-generated images, videos, pseudo-photographs. (UK law criminalizes AI-generated CSAM as of 2025).
- Violence, Blood, and Weapons – particularly content involving minors.
- Illicit Drug-related Material – sales, manufacture, instructions, promotion.
- Terrorist or Hate Speech – incitement of violence against protected groups.
- Self-Harm / Suicide Inducement – content accessible to minors that encourages self-harm, eating disorders, etc.
- Fraud, Spam, Phishing, especially if targeting minors or involving deception.
- Other clearly illegal content, such as extreme copyright violation.
4. Reporting Instructions
Email to: support@um.ink
| Field | What to Include |
|---|---|
| Required | – Links or screenshots of the content – Date/time when seen – Why it violates policy (e.g. “CSAM—AI image of minor sexual abuse”) |
| Optional | – Contact info or alias – Affected age group (e.g. minor victim) |
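For readers who script their reporting, the sketch below assembles a report email containing the required fields from the table above, using only Python's standard library. Every address, URL, and server name is a placeholder assumed for the example.

```python
from email.message import EmailMessage
import smtplib

# Hypothetical report values; the body layout follows the
# "Required" / "Optional" rows of the table above.
msg = EmailMessage()
msg["To"] = "support@um.ink"
msg["From"] = "reporter@example.org"      # optional; aliases are fine
msg["Subject"] = "Abuse report: suspected CSAM"
msg.set_content(
    "Content link: https://example.org/offending-page\n"
    "Seen at: 2025-08-01, around 14:30 UTC\n"
    "Violation: CSAM - AI image of minor sexual abuse\n"
    "Affected age group: minor victim\n"
)

# Send via your own mail provider (server name is a placeholder).
with smtplib.SMTP("smtp.example.org", 587) as smtp:
    smtp.starttls()
    smtp.send_message(msg)
```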
Upon receipt:
- We acknowledge within 48 hours.
- Triage and assess:
- If content is outright illegal (e.g. CSAM, drug sales, extremism), we remove immediately and notify relevant enforcement agencies.
- For disputed cases, preservation may occur pending legal review.
5. Response Standards & Timelines
| Timeline | Platform Action |
|---|---|
| T₀ + 48 hours | Acknowledge receipt of complaint |
| T₀ + 5 business days | Provide status: removed / pending / rejected (with rationale) |
| Within 14 days of rejection | You may request a second review by a different compliance officer |
| Record Retention | Keep full logs of complaint & actions for at least 12 months, per Ofcom’s illegal content codes |
6. Data Protection & Privacy
- Comply with UK GDPR / UK Data Protection Act 2018 and EU GDPR.
- Personal data in reports will be used only for handling the complaint and will not be used for marketing or disclosed unnecessarily.
- Where reports involve personal data of others, appropriate legal redaction and handling will occur.
7. Special Protections for Minors
- UGC involving children (sexual, violent, exploitative) is prioritized.
- Materials accessible to minors that could cause harm (self-harm content, substance abuse, violent acts) are expedited under both UK and EU safety codes and age-appropriate design standards.
8. Appeals & Regulator Escalation
- UK-based users dissatisfied with how a serious complaint was handled can escalate to Ofcom under the Online Safety Act.
- EU-based users may complain to their national Digital Services Coordinator or DSA enforcement authority if platform obligations under EU law are not met.
9. FAQ Highlights
- What is a “trusted flagger”?
  A flagging entity (such as a child-protection organization or law-enforcement liaison body) granted priority processing status under EU rules or platform policy.
- Why retain illegal content temporarily?
  Preservation enables audit and compliance with “notice‑and‑action” frameworks; content must be kept securely until official takedown or legal referral.
- Can I file anonymously?
  Yes. Emails are not publicly disclosed, and reporters may use pseudonyms.
📌 Additional Resources
- Ofcom: Online Safety Act & Illegal Content Codes of Practice, updated July 2025.
- EU Commission: Recommendation on tackling illegal content online (incl. CSAM, hate speech).
- UK Crime & Policing Bill Fact Sheet on CSAM (2025) detailing classification of AI-generated content.
Thank you for helping maintain a safe and lawful community with UMInk.
If you have any questions, feel free to reach out by email at any time.