Japan has drawn a clear line in the global debate over AI training data: it will prioritize development over consent restrictions, at least for statistical research purposes.
The Diet enacted an amendment to Japan’s Personal Information Protection Act (APPI) on April 14, 2026, one week after the Cabinet approved the bill on April 8. The core change: AI developers can now use personal data for statistical research without obtaining opt-in consent, provided the use poses low risk to individuals. The Register reported on the exemption’s scope, confirming that it extends to sensitive data categories, including health and biometric data.
This is a deliberate policy choice, and the contrast with Brussels is sharp. The EU AI Act requires transparency about training data for certain AI systems and imposes obligations on GPAI providers. Japan is moving in the opposite direction, reducing friction for developers who want to use personal data in their training pipelines. The amendment is the privacy reform companion to Japan’s broader AI Promotion Law, which Tech Jacks Solutions covered when that law established Japan’s AI policy direction and its 502.7 billion yen strategic budget.
The amendment isn’t a wholesale removal of protections. According to Japanese press reports, mandatory parental consent for data collection from individuals under 16 is retained, and the bill introduces penalties calibrated to the profits gained from malicious data violations, though the precise enforcement mechanism awaits full regulatory guidance. Those profit-based penalties are a meaningful addition: they make data misuse financially painful in proportion to what bad actors gained, rather than applying a flat fine.
For AI developers with Japanese operations or data subjects: the consent exemption applies to statistical research use, not to commercial applications generally. The compliance question isn’t whether you can use Japanese personal data without consent; it’s whether your specific use case qualifies as low-risk statistical research under the amendment’s framework. That determination will depend on regulatory guidance that hasn’t yet been published.
Full implementation is expected following the current Diet session, with observers anticipating a transition period through late 2026. Plan for October 2026 as a working target, but treat it as an estimate until formal regulatory guidance sets the timeline.
Watch for the implementing regulations. The Cabinet approval and Diet enactment establish the legal authority; the regulatory guidance will specify exactly which use cases qualify for the research exemption, how “low risk” is defined, and what documentation developers must maintain. The implementing rules are where the compliance details live, and they’re not published yet.
TJS perspective: Japan’s APPI amendment gives AI developers operating in or sourcing data from Japan a meaningful new capability: training on sensitive data categories that were previously off-limits without consent. But “you can” and “you should, without further legal review” are different questions. Until the implementing regulations arrive, the practical scope of the exemption remains uncertain. Begin your use-case assessment now; don’t wait for the October implementation date to start the analysis.