As artificial intelligence reshapes the business landscape, the SEC is gearing up for a new era of oversight. With a handful of cases already on the books and warnings from top officials, the message is clear: AI isn’t just disrupting industries — it’s disrupting regulatory enforcement. From startup pitches to shareholder meetings, Wolters Kluwer’s Mark S. Nelson unpacks the SEC’s sharpening focus on AI.
March 2024 marked the SEC’s awakening to the myriad enforcement issues the agency can pursue regarding artificial intelligence (AI). The SEC has brought only a small number of enforcement cases so far; the question is whether this is the beginning of a selective effort to pursue the worst bad actors or the start of a torrent of enforcement.
SEC Chairman Gary Gensler has warned of AI enforcement risk, and Gurbir Grewal, the agency’s recently departed enforcement director, spoke of the same risk: “While perhaps not quite yet a perfect storm, there’s certainly one brewing around AI. And it is incumbent on each of us to make sure it does not come to pass and that investors are not harmed by noncompliance with the securities laws when it comes to this new technology.” The agency’s Division of Examinations also recently released its 2025 priorities, which include understanding how registrants are using AI.
Public companies that offer AI products and services, or that use AI in their business operations, should prepare to make public disclosures about AI in their SEC filings. Disclosures could appear in annual and other periodic reports, including in the business description, the risk factors, or the management’s discussion and analysis (MD&A). The following discussion suggests topics to consider:
- Existing laws: Do not discount the SEC’s or other federal agencies’ ability to invoke existing authorities in the AI space. In one early SEC case, Grewal said the case was about “an old school fraud using new school buzzwords.”
- Venture funding: Companies seeking funding for business ventures should be cautious about their sales pitches during early funding rounds. In a parallel criminal case to an SEC AI action, the U.S. Attorney for the Southern District of New York noted his office’s plans to focus on startup funding rounds.
- SEC examination priorities: The SEC Division of Examinations’ 2024 priorities also included AI among the several risks the division is monitoring. “The Division remains focused on certain services, including automated investment tools, artificial intelligence, and trading algorithms or platforms, and the risks associated with the use of emerging technologies and alternative sources of data,” according to its priorities report, which also recalled that the division has created specialized teams within its examination programs to address emerging risks, including AI.
- Investor alerts: A January 2024 joint investor alert issued by the SEC’s Office of Investor Education and Advocacy, the North American Securities Administrators Association (NASAA), and the Financial Industry Regulatory Authority (FINRA) urged investors to, among other things, beware of registered and unregistered firms claiming to use proprietary AI to boost investment returns, and to conduct due diligence by verifying facts and contact information in SEC filings. Two other alerts warned about IPO pitches touting emerging technologies and AI, and about scams that use “AI-related buzzwords,” promise outsized returns, or claim that bots can find good investments.
- Broker-dealers: FINRA has issued guidance on the use of generative AI (genAI) by broker-dealers. Regulatory Notice 24-09, for example, reiterated that FINRA’s rulebook is technology-neutral, that SEC regulations for broker-dealers continue to apply to AI, and that FINRA rules apply both to AI developed in-house by broker-dealers and to AI provided to broker-dealers by third parties. The guidance also reminds broker-dealers of their obligations under FINRA Rule 3110 (supervisory duties) and FINRA Rule 2210 (communications).
- SEC filings: Companies should decide what must be disclosed about AI in their SEC filings. Many companies using AI will address AI in the business, risk factors, and MD&A portions of their Forms 10-K. Typical business section disclosures may tout a company’s status as a technology pioneer, note strategic partnerships, or discuss competitors. Risk factors vary widely but often mention that AI products or services may fail or may fail to gain consumer acceptance. Some companies note that foreign, state, and local AI regulations may increase the risk of enforcement actions. Companies that train AI models may disclose lawsuits against them for copyright infringement. Filings also may include forward-looking statements that mention AI. Forms 8-K may disclose AI-themed mergers or detail votes on shareholders’ AI proposals.
- Shareholder proposals: Shareholders are asking companies to take actions regarding AI at annual shareholder meetings. During the most recently completed proxy season, SEC staff told Disney, Apple, and Paramount that the staff could not concur in the companies’ requests to exclude proposals seeking transparency reports on the companies’ use of AI (and, in some instances, also seeking information about board oversight of AI and the companies’ ethical guidelines for using AI). In the staff’s view, under Exchange Act Rule 14a-8(i)(7), “the Proposal[s] transcend … ordinary business matters and do … not seek to micromanage the Company.”
- Biden Administration and Congress: Last year, the Biden Administration issued an executive order on the safe use of AI in which it directed executive branch agencies to issue guidance; most independent agencies were merely encouraged to use existing powers to do the same. The Treasury Department also issued a request for information seeking public comment on AI issues affecting financial institutions, including explainability and bias, illicit finance, data privacy and data protection, and risk management.
- NIST guidance: The National Institute of Standards and Technology (NIST) has published an AI governance framework that is a starting point for U.S. companies seeking to adopt policies and procedures to guide their use of AI. In June 2024, partly in response to the executive order, NIST issued additional AI guidance.
- State AI laws: Several states have enacted AI laws, and companies may need to disclose the impact of these laws in their SEC filings. Most state and local AI laws focus on discrete topics, such as automated employment decision tools, but some states have sought to enact broad-based AI laws. Colorado’s Consumer Protections In Interactions With Artificial Intelligence legislation, for example, has similarities to the EU AI Act, especially its focus on regulating the highest-risk AI systems. California’s Safe and Secure Innovation for Frontier Artificial Intelligence Models Act would have applied to the largest and riskiest AI systems, but that measure was vetoed by Gov. Gavin Newsom.
- EU AI Act: The European Union adopted the EU AI Act in June 2024. The act applies to providers located in the EU, as well as to providers outside the EU that place AI systems on the EU market, whose AI system outputs are used in the EU, or whose AI products affect persons located in the EU. U.S.-based companies, especially those with business in or affecting the EU, may need to disclose in their SEC filings how they are impacted by the EU AI Act. At the PLI SEC Speaks event in Washington, D.C., in April 2024, an SEC official speaking during the Division of Corporation Finance Workshop observed that some companies had already begun to make disclosures about the impact of the then-forthcoming EU AI Act, but that future disclosures about the act should be more specific.
Looking ahead
The SEC appears ready to increase its enforcement of AI-related activities at companies and investment funds. While the U.S. lags much of the world in formal AI regulation, that does not mean U.S. enforcement is equally lagging. As a result, companies need to be conversant with SEC staff views on AI issues, make key decisions about what to disclose in SEC filings, monitor the SEC’s and federal prosecutors’ AI-themed enforcement cases, and try to anticipate foreign, state, and local legislation that may affect their AI disclosures.