DSA-Required Policies

These policies outline our procedures for notice-and-action, appeals, and transparency reporting in accordance with Regulation (EU) 2022/2065 (Digital Services Act).

1. Notice-and-Action Procedure

1.1. Scope and Legal Basis

This Notice-and-Action Procedure is adopted pursuant to Article 16 of Regulation (EU) 2022/2065 (Digital Services Act) and applies to all content hosted on the Platform, including Public Cases, Private Cases (to the extent lawfully reviewable), comments, petitions, attachments, and Representative communications. This Procedure supplements, and shall be interpreted consistently with, the other relevant provisions. Nothing herein imposes a general monitoring obligation, consistent with Article 8 DSA.

1.2. Submission of Notices of Allegedly Illegal Content

Any natural or legal person may submit a notice alleging that specific content hosted on the Platform constitutes illegal content. A notice shall be considered sufficiently precise and adequately substantiated where it includes: (a) a reasoned explanation of why the content is alleged to be illegal; (b) a clear identification of the content, including URL or unique identifier; (c) contact details of the notifying party; (d) a statement of good faith belief that the information provided is accurate and complete. The Platform shall provide an accessible electronic submission mechanism available without requiring account registration. Submission of a notice does not create any contractual relationship between the notifying party and the Platform.
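The four notice elements (a)–(d) above can be sketched as a simple validation structure. This is a purely illustrative sketch; the class and field names are assumptions for this example, not the Platform's actual submission schema:

```python
from dataclasses import dataclass

# Hypothetical sketch of the notice elements described in Section 1.2.
# Field names and the completeness check are illustrative assumptions only.
@dataclass
class IllegalContentNotice:
    explanation: str            # (a) reasoned explanation of alleged illegality
    content_url: str            # (b) URL or unique identifier of the content
    contact_details: str        # (c) contact details of the notifying party
    good_faith_statement: bool  # (d) confirmation of good-faith belief

    def is_sufficiently_substantiated(self) -> bool:
        """A notice is actionable only where all four elements are present."""
        return (bool(self.explanation.strip())
                and bool(self.content_url.strip())
                and bool(self.contact_details.strip())
                and self.good_faith_statement)
```

Under this sketch, a notice missing any element would be treated as insufficiently substantiated and could prompt a request for further information rather than immediate assessment.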

1.3. Processing of Notices and Decision Framework

Upon receipt of a sufficiently substantiated notice, the Platform shall assess the content without undue delay, taking into account: ● the nature and gravity of the alleged illegality; ● freedom of expression under Article 11 of the Charter of Fundamental Rights of the European Union; ● Irish constitutional proportionality principles; ● applicable statutory provisions (e.g., Defamation Act 2009, Criminal Justice legislation). The Platform may remove, disable access to, restrict visibility of, or otherwise limit content where there is a reasonable basis to consider the content unlawful. Where the content concerns Citizen Case data controlled by a Representative, the Platform shall assess only hosting-related illegality and shall not substitute its judgment for that of the Representative acting in their official capacity, consistent with the controller allocation under the Privacy Policy.

1.4. Notification of Outcome

Following assessment of the notice, the Platform shall inform the notifying party of the decision taken in respect of the reported content. Where applicable, such notification shall indicate: ● whether content has been removed, restricted, or left available; ● whether the decision was based on illegality or incompatibility with Platform rules; and ● the availability of redress mechanisms, including internal complaint-handling procedures and, where relevant, out-of-court dispute settlement mechanisms pursuant to applicable intermediary service regulations. Nothing in this section shall require disclosure of information restricted by law, legal privilege, confidentiality obligations, or law enforcement confidentiality requirements.

1.5. Trusted Flaggers and Authority Orders

Where a notice is submitted by a designated Trusted Flagger, it shall be prioritised and processed without undue delay. Orders issued by judicial or competent administrative authorities shall be complied with in accordance with applicable law and recorded for transparency reporting.

1.6. Record-Keeping and Logging Obligations

All notices, moderation decisions, timestamps, and supporting rationale shall be securely logged in accordance with the logging framework described in the Platform Rules & Safety document and retained in accordance with the Data Retention Policy.
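The record-keeping obligation above could be realised as a structured log entry capturing the notice, the decision, its rationale, and a timestamp. This is a minimal sketch under assumed key names; the actual logging framework is the one defined in the Platform Rules & Safety document:

```python
import json
from datetime import datetime, timezone

# Hypothetical moderation-log entry for Section 1.6. The keys and the
# JSON serialisation are illustrative assumptions, not the real schema.
def log_moderation_decision(notice_id: str, action: str, rationale: str) -> str:
    entry = {
        "notice_id": notice_id,
        "action": action,          # e.g. "removed", "restricted", "no_action"
        "rationale": rationale,    # supporting legal/factual reasoning
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    # Serialised entries would be appended to a tamper-evident store.
    return json.dumps(entry, sort_keys=True)
```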

2. Appeal & Internal Complaint-Handling Mechanism

2.1. Right to Challenge Moderation Decisions

Users whose content has been removed, restricted, demoted, or whose accounts have been suspended or terminated may challenge such decisions through an internal complaint-handling system.

2.2. Procedural Standards and Timelines

Complaints must be submitted within six (6) months of notification of the moderation decision. Complaints shall be processed diligently, objectively, in a non-discriminatory manner, and within a reasonable timeframe proportionate to the complexity of the issue.

2.3. Human Review and Safeguards

All appeals shall be subject to meaningful human review by a person other than the one solely responsible for the initial decision. Where automated tools were used in the initial detection process, the reviewer shall assess whether (1) the automated flagging was contextually accurate, (2) relevant lawful exceptions apply, and (3) freedom of expression considerations were adequately weighed.

2.4. Outcome, Remedies, and Out-of-Court Dispute Settlement

Following review, the User shall receive a reasoned decision confirming, reversing, or modifying the original measure. Users shall be informed of the possibility of referring disputes to a certified out-of-court dispute settlement body, without prejudice to judicial remedies under Irish law.

2.5. Abuse of the Complaints Mechanism

Where a User repeatedly submits manifestly unfounded complaints, the Platform may temporarily suspend access to the complaint mechanism.

3. Annual Transparency Report

3.1. Publication Obligation and Format

The Platform shall publish, at least annually, a publicly accessible transparency report. The report shall be made available in a clear, accessible and, where feasible, machine-readable format.

3.2. Categories of Data Disclosed

The report shall include aggregated information on (1) number of notices received; (2) categories of alleged illegal content; (3) actions taken (removal, restriction, suspension); (4) average decision timeframes; (5) number of authority orders received; and (6) number of appeals submitted and outcomes. No personal data shall be disclosed.
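The aggregation of the six disclosure categories into a machine-readable report (per Section 3.1) can be sketched as follows. The input record format and output keys are assumptions for illustration; only aggregated counts appear, so no personal data is disclosed:

```python
import json
from collections import Counter

# Hypothetical aggregation for Section 3.2. Each input record is assumed to
# carry a "category" and an "action" field; key names are illustrative only.
def build_transparency_report(notices: list[dict]) -> str:
    categories = Counter(n["category"] for n in notices)
    actions = Counter(n["action"] for n in notices)
    report = {
        "notices_received": len(notices),     # (1) number of notices received
        "categories": dict(categories),       # (2) categories of alleged illegality
        "actions_taken": dict(actions),       # (3) removals, restrictions, suspensions
        # (4)-(6) (timeframes, authority orders, appeals) would be
        # aggregated in the same way; no personal data is included.
    }
    return json.dumps(report)
```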

3.3. Use of Automated Tools

The report shall disclose (1) whether automated tools are used for detection; (2) the general type of tools (e.g., keyword flagging, spam detection); and (3) whether final decisions involve human review.

3.4. Complaints and Outcomes

Aggregated statistics shall be provided regarding the number of internal complaints submitted, the number upheld, the number rejected, and suspensions imposed for abusive notice or complaint activity.

3.5. Regulatory Cooperation and Audit Trail

The Platform shall cooperate with the Irish Digital Services Coordinator (as designated under Irish law implementing the DSA) and retain records necessary to demonstrate compliance.