Thule advises governments, foundation model developers, deployers, and institutional investors on the regulatory regimes that are reshaping the artificial intelligence industry. We translate novel statutes, executive actions, and standards into clear operating direction.
The legal architecture for advanced AI has, in three years, moved from white papers to enforceable obligation. The EU AI Act now imposes binding duties on general-purpose model providers. The Trump administration's AI Action Plan recast federal procurement as a regulatory lever. State legislatures have stepped into the space left by the rescission of Executive Order 14110. Frontier labs face simultaneous scrutiny from competition authorities, copyright plaintiffs, and national security regulators.
Thule was built for this regime. Our advisors have served in senior roles at competition agencies, trade ministries, and constitutional courts; led safety and policy teams inside foundation model laboratories; and negotiated standards at the OECD, the G7 Hiroshima Process, and the AI Action Summit track. We work in small, partner-led teams. We do not publish unless we have something new to say.
GPAI obligations under Articles 53 and 55 of the EU AI Act, Code of Practice negotiations, model evaluation regimes, systemic risk designations, and post-deployment monitoring duties.
Federal procurement under the AI Action Plan, Commerce and BIS rules on dual-use foundation models, California SB 53 transparency obligations, and the patchwork of state deployer regimes.
Training data provenance, fair use posture after Bartz, licensing strategy across publishers, stock media, and code repositories, and EU AI Act copyright transparency templates.
Diffusion framework counsel, advanced computing item classifications, model weight controls, and bilateral licensing for sovereign AI initiatives.
Engagement with the EU AI Office, ISO/IEC JTC 1/SC 42, the UK AI Security Institute network, the OECD, and the G7 Hiroshima Process secretariat.
Strategic counsel in copyright, consumer protection, and competition matters; preparation for regulator inquiries and parliamentary hearings; crisis response.
A first audit of how the AI Office, signatories, and holdouts are handling obligations under Articles 53 and 55.
The Transparency in Frontier Artificial Intelligence Act, fourteen months after SB 1047 was vetoed.
What the July 2025 White House plan tells us about preemption, procurement, and the new federal posture.
Thule operates on a partner-led model. We do not staff engagements with junior associates, and we decline mandates that we cannot personally supervise. New clients are accepted by referral or after a preliminary call with a partner.
For inquiries:
counsel@thule-advisors.example
For press:
press@thule-advisors.example
For speaking and committee appearances:
speaking@thule-advisors.example