The MIC is beginning to hold large platforms to account for the spread of harmful and illegal content online, broadly reflecting designations made under the DSA
The MIC has designated Google, Yahoo, Meta, X and TikTok under Japan’s online safety legislation
On 30 April 2025, the Ministry of Internal Affairs and Communications (MIC) in Japan designated Google, Yahoo, Meta, X and TikTok as “large specified telecommunications providers” under the Information Distribution and Platform Handling Act (IDPHA) – the country’s equivalent to the EU’s Digital Services Act (DSA). The IDPHA contains provisions addressing the following key issues relating to the removal of harmful content online:
Points of contact for removal requests have been unclear;
Harmful posts have been left unattended, allowing them to spread and gain significant attention;
Users submitting removal requests have not been notified of whether content was removed; and
Platform guidelines and criteria for the removal of content have been abstract and unclear.
The designations impose new obligations on these providers to implement clearer procedures for removing illegal and harmful content from their platforms and to improve operational transparency.
The designated platforms must appoint specialist investigators for content removal procedures alongside other obligations focused on transparency
The five designated platforms will now have to establish new measures, including:
Accelerating removal responses – platforms must establish a window for responding to removal requests and notify users of it;
Establishing a system for responding to removal requests;
Operational transparency – platforms must publicise clear criteria for the deletion of content;
Platforms must notify users who have submitted a request of their measures for preventing illegal content and of whether the relevant content has been deleted;
Platforms will have to appoint specialist investigators to handle content removal requests; and
Platforms must publicise the status of their implementation of prevention measures for illegal content.
Alongside these obligations, the platforms must notify the MIC of their details within three months of the date of their designation.
Key decisions on the implementation of obligations under the IDPHA will be made by the MIC
While platforms are obliged to appoint specialist investigators to handle content deletion processes, they do not control how many investigators they must appoint. The MIC will determine the necessary number based on a platform’s type and its average monthly number of users. The obligation to keep users better informed of the progress of their deletion requests also carries further detail, requiring platforms to notify users, within 14 days of a request being made, whether action will be taken. The imposition of these obligations on the first designated platforms marks an important turn in the regulation of online platforms in Japan, where until now the removal and deletion of harmful and illegal content by large platforms has been voluntary.
The IDPHA bears close similarity to the DSA, but there are some differences in platform designations
The MIC’s designations follow similar action in the EU under the DSA. The European Commission (EC) has, for the most part, designated the same platforms as the MIC as very large online platforms (VLOPs); the only difference is that Yahoo has not been designated under the DSA. Interestingly, the MIC has specifically designated Meta’s newest platform, Threads, under the IDPHA, whereas the EC designated only Meta’s Instagram and Facebook. The DSA’s obligations on designated platforms also bear a number of similarities to those of the IDPHA, such as rules on the removal of illegal content and transparency measures relating to platforms’ algorithms and operations.