Compliance & Regulation

SEC Rules on AI Tools: A Plain-English Guide for RIAs

Spencer Gauta

June 26, 2025

The SEC hasn't published a rule titled "AI for Financial Advisors." They haven't issued an official list of approved AI tools. And they haven't created a special compliance framework just for AI.

But that doesn't mean the SEC is silent on AI. It means existing regulations apply, and many advisors don't realize how.

If you're using AI tools in your advisory practice, here's what the SEC actually requires, in plain English.

The SEC's Position on AI (In 50 Words)

AI is a tool. If you use it with client data, for client communications, or to support investment advice, you're responsible for ensuring it complies with existing regulations, including Regulation S-P (data privacy), the Marketing Rule (advertising), and your fiduciary duty (duty of care). The technology doesn't change your obligations.

That's it. The SEC isn't anti-AI. They're anti-reckless adoption.

The Four SEC Regulations That Apply to AI

1. Regulation S-P: Safeguarding Client Information

What it says: Advisors must adopt policies to protect client information and provide notice before sharing it with third parties.

How it applies to AI: When you use an AI tool (meeting assistant, document analyzer, email drafter), you're sending client information to a third party: the AI vendor.

Under Reg S-P, you must:

  • Conduct due diligence on the vendor's data security practices
  • Provide notice to clients that their information may be shared with service providers
  • Implement safeguards to ensure the vendor protects client data
  • Have a data disposal policy that covers data held by third-party vendors

What this means in practice:

If you're using an AI meeting assistant, your Form ADV Part 2A should disclose:

"We use third-party technology services to assist with meeting documentation and client communications. These services may access client information as part of their function."

And your privacy policy should address:

  • What data is shared with AI vendors
  • How long vendors retain data
  • What security measures vendors employ

Red flag: If you're using an AI tool and your Form ADV doesn't mention third-party service providers, you're likely out of compliance.

2. Investment Advisers Act Section 206: Fiduciary Duty

What it says: Advisors owe clients a duty of care and a duty of loyalty. This includes an obligation to provide advice in the client's best interest, disclose conflicts, and avoid negligent or reckless conduct.

How it applies to AI: If you use AI to support investment advice, you're still responsible for the accuracy and suitability of that advice, even if the AI generated it.

SEC guidance (February 2024):

"An adviser may not disclaim or limit its fiduciary obligations by attributing advice or investment decisions to technology. The adviser remains responsible for the output."

What this means in practice:

You can use AI to:

  • Summarize client meeting notes
  • Draft financial plan summaries
  • Generate investment policy statements
  • Identify tax optimization opportunities

But you must review and verify the output before presenting it to clients. If the AI makes a mistake (e.g., recommends an unsuitable investment, misinterprets tax law, hallucinates a fact), you're liable, not the AI vendor.

Red flag: If you're copying AI-generated advice directly into client deliverables without review, you're breaching your fiduciary duty.

3. Marketing Rule (Rule 206(4)-1): AI in Advertising

What it says: Advisors can't make false or misleading statements in advertisements. All claims must be substantiated.

How it applies to AI: If you mention AI in your marketing ("We use AI to optimize portfolios," "AI-powered financial planning"), the SEC expects you to back it up.

SEC guidance:

  • If you claim "AI-driven performance," you must be able to demonstrate how AI influenced results
  • If you advertise "AI-enhanced client service," you must actually be using AI in a meaningful way (not just buzzword marketing)
  • If you use AI to generate testimonials or social media posts, those must still comply with the Marketing Rule (no cherry-picking, no hypothetical performance)

What this means in practice:

Acceptable:

"We use AI meeting assistants to ensure accurate documentation of client conversations." (True, verifiable, specific)

Not acceptable:

"Our AI-powered investment strategies outperform the market." (Implies performance claim without substantiation)

Red flag: If you're marketing AI capabilities that you don't actually use (or can't explain), the SEC will view that as misleading advertising.

4. Rule 206(4)-7: Compliance Policies and Procedures

What it says: Advisors must adopt written policies reasonably designed to prevent violations of the Advisers Act.

How it applies to AI: If you're using AI tools, your compliance program should address:

  • Vendor oversight: How do you evaluate AI vendors before adoption?
  • Data handling: What client data can be input into AI tools?
  • Supervision: How do you monitor advisors' use of AI?
  • Training: Do advisors understand the compliance risks of AI?
  • Incident response: What happens if an AI tool causes a compliance breach?

What this means in practice:

Your compliance manual should include:

  • A list of approved AI tools
  • An AI acceptable use policy
  • A vendor due diligence process
  • Annual training on AI risks

If you don't have policies addressing AI, the SEC will view that as a gap, especially if AI usage has become widespread at your firm.

Red flag: If your compliance manual was last updated before AI tools were adopted, it's outdated.

The SEC's Cybersecurity Rule: Why It Matters for AI

The SEC has proposed a cybersecurity risk management rule that would require RIAs to:

  1. Adopt cybersecurity policies tailored to the firm's specific risks
  2. Report significant cybersecurity incidents to the SEC within 48 hours
  3. Conduct periodic risk assessments of vendors that access client data

This rule directly impacts AI adoption.

Why: AI vendors are third-party service providers with access to sensitive client information. If the vendor suffers a breach, the RIA is responsible for:

  • Notifying affected clients
  • Reporting the incident to the SEC
  • Documenting the incident and remediation steps

What advisors must do:

  1. Risk Assessment: Before adopting an AI tool, assess:

    • What data the tool accesses
    • Where data is stored and for how long
    • What happens if the vendor is breached
    • Whether the vendor has cyber insurance
  2. Incident Response Plan: Your plan should address:

    • How you'll be notified if the AI vendor is breached
    • Who at your firm is responsible for responding
    • What timeline you'll follow for SEC and client notification
  3. Vendor Monitoring: Review your AI vendors at least annually:

    • Have their security certifications changed?
    • Have their terms of service changed?
    • Have they had any disclosed security incidents?

Real-world scenario:

An RIA uses an AI transcription service for client meetings. The vendor suffers a data breach. Attackers access 12 months of client meeting recordings.

Under the SEC's cybersecurity rule:

  • The RIA must report the incident to the SEC within 48 hours (if deemed "significant")
  • The RIA must notify affected clients under state breach notification laws
  • The RIA must document the incident, including what data was exposed and what remediation steps were taken

If the RIA can't demonstrate that they conducted due diligence on the vendor before adoption, the SEC may cite inadequate cybersecurity policies.

Common SEC Exam Questions About AI

Based on recent exams and sweep letters, here's what SEC examiners are asking:

"What AI tools does your firm use?"

What they're really asking: Do you even know? Can you produce a list?

How to prepare: Maintain an inventory of all AI tools used at your firm, including:

  • Tool name and vendor
  • What it's used for (meeting notes, email drafting, portfolio analysis, etc.)
  • Who has access
  • Date adopted and approval documentation
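One low-effort way to keep that inventory exam-ready is a structured record you can export on request. Below is a minimal sketch in Python; the field names, example tool, vendor, and file paths are illustrative, not a prescribed format:

```python
import csv
from dataclasses import dataclass, asdict, fields

@dataclass
class AIToolRecord:
    """One row in the firm's AI tool inventory (illustrative fields)."""
    tool_name: str
    vendor: str
    use_case: str            # e.g., meeting notes, email drafting
    users_with_access: str   # roles or named advisors
    date_adopted: str        # ISO date
    approval_doc: str        # where the approval memo lives

# Hypothetical example entry, not a real vendor.
inventory = [
    AIToolRecord("ExampleNotes AI", "Example Vendor Inc.",
                 "meeting transcription and summaries",
                 "all advisors", "2024-03-01",
                 "compliance/approvals/examplenotes.pdf"),
]

# Export the inventory so it can be produced during an exam.
with open("ai_tool_inventory.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(AIToolRecord)])
    writer.writeheader()
    writer.writerows(asdict(rec) for rec in inventory)
```

A spreadsheet accomplishes the same thing; what matters is that every tool has a row and the file can be handed to an examiner on request.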

"How do these tools comply with Regulation S-P?"

What they're really asking: Did you conduct vendor due diligence? Do you know where client data is stored?

How to prepare: For each AI tool, document:

  • Vendor security certifications (SOC 2, ISO 27001)
  • Data storage location and retention policy
  • How you ensure data is properly deleted
  • What safeguards the vendor has in place

"Does your Form ADV disclose the use of third-party service providers?"

What they're really asking: Are you providing required disclosures to clients?

How to prepare: Review your Form ADV Part 2A. If you use AI tools that access client data, ensure you disclose this under:

  • Item 15 (Custody)
  • Item 8 (Methods of Analysis, Investment Strategies, and Risk of Loss) if AI supports investment decisions
  • Privacy Policy (if separate from ADV)

"How do you supervise advisors' use of AI tools?"

What they're really asking: Do you have policies and procedures? Are they being followed?

How to prepare: Document your AI supervision framework:

  • AI acceptable use policy
  • Training records (showing advisors were trained on AI risks)
  • Spot-check audits (reviewing AI-generated outputs for accuracy)
  • Incident log (any issues related to AI usage)

"What happens if an AI tool generates inaccurate advice?"

What they're really asking: Do you have a review process? Are you blindly trusting AI?

How to prepare: Demonstrate that AI-generated content is reviewed before being shared with clients:

  • Email templates drafted by AI are reviewed by advisors
  • Meeting summaries are spot-checked for accuracy
  • Investment recommendations are verified against firm models
  • If an error is found, document it and implement corrective action

What the SEC Has Said About AI (Directly)

The SEC has issued public statements and enforcement actions related to AI. Here are the key takeaways:

SEC Chair Gary Gensler (July 2023):

"AI is not a get-out-of-jail-free card. Advisers using AI tools must still comply with all existing regulations, including fiduciary duty, disclosure requirements, and marketing rules."

Translation: Don't assume that "the AI did it" is a valid defense. You're responsible for your firm's actions, even if AI is involved.

SEC Enforcement Action: Delphia (March 2024)

The SEC fined Delphia, an investment adviser that marketed itself as AI-driven, for making false and misleading statements about its AI capabilities in marketing materials.

Key finding: Delphia claimed it used machine learning on client data to inform its investment strategies, but it did not actually use the data as advertised. The SEC found this violated the Marketing Rule.

Lesson for advisors: If you market AI, you'd better be able to prove what the AI actually does, and that it works as advertised.

SEC Risk Alert (February 2024): Outsourcing and Third-Party Vendors

The SEC issued a risk alert highlighting deficiencies in RIAs' oversight of third-party vendors, including:

  • Failure to conduct due diligence before engaging vendors
  • Lack of ongoing monitoring
  • Inadequate incident response plans
  • Failure to update Form ADV when vendors change

How this applies to AI: AI vendors are third-party service providers. If you're not overseeing them the way you'd oversee a custodian or broker-dealer relationship, you're exposed.

Practical Compliance Steps for RIAs Using AI

Step 1: Update Your Form ADV

Add or revise disclosures to include:

Item 8 (Methods of Analysis):

"We use AI-powered tools to assist with data analysis, meeting documentation, and administrative tasks. All AI-generated outputs are reviewed by our advisors before being presented to clients. Final investment decisions are made by human advisors, not AI systems."

Item 15 (Custody):

"We use third-party service providers, including AI-based tools, that may access client information as part of their function. We conduct due diligence on all vendors and implement safeguards to protect client data."

Step 2: Draft an AI Acceptable Use Policy

Your policy should cover:

  • What AI tools are approved
  • What data can be input
  • Review requirements for AI-generated content
  • Consequences for using unapproved tools

See our free template: AI Acceptable Use Policy for RIAs

Step 3: Conduct Vendor Due Diligence

For every AI tool, document:

  • What due diligence you conducted
  • What security certifications the vendor holds
  • Where client data is stored and for how long
  • Whether the vendor uses your data to train models
  • What happens if the vendor is breached or goes out of business

Step 4: Train Your Team

Include AI in your annual compliance training:

  • What AI tools are approved
  • The risks of shadow AI (using unapproved tools)
  • How to use AI tools in compliance with fiduciary duty
  • What to do if AI generates an error

Step 5: Audit and Monitor

Quarterly:

  • Review AI-generated content for accuracy
  • Spot-check that advisors are following policies
  • Monitor for use of unapproved tools

Annually:

  • Review vendor security and terms of service
  • Update Form ADV if AI usage has changed
  • Re-train team on AI policies
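The annual vendor-review cadence is the easiest item above to let slip. A small tickler script can flag overdue reviews; this is a sketch under the assumption that you track the date each vendor was last reviewed (the vendor names and 365-day threshold are illustrative):

```python
from datetime import date, timedelta

# Illustrative records: vendor name -> date of last security/terms review.
last_reviewed = {
    "Example Transcription Vendor": date(2024, 5, 1),
    "Example Drafting Vendor": date(2025, 1, 15),
}

def overdue_reviews(records, today, max_age_days=365):
    """Return vendors whose last review is older than max_age_days."""
    return sorted(name for name, reviewed in records.items()
                  if (today - reviewed) > timedelta(days=max_age_days))

print(overdue_reviews(last_reviewed, today=date(2025, 6, 26)))
# -> ['Example Transcription Vendor']
```

The same check works as a calendar reminder or a compliance-software task; the point is that "annually" is a deadline someone is accountable for, not an aspiration.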

The Bottom Line: The SEC Expects Thoughtfulness, Not Perfection

The SEC isn't trying to ban AI. They're not requiring advisors to build perfect compliance programs overnight.

What they expect:

  • Awareness of the risks
  • Documented policies
  • Reasonable due diligence
  • Ongoing supervision

If you can demonstrate those four things, you're in good standing, even if your AI program isn't flawless.

The advisors who will struggle are those who:

  • Adopt AI tools without thinking about compliance
  • Can't explain where client data is stored
  • Assume "SOC 2" means "SEC-approved"
  • Have no policies addressing AI usage

What to Do This Week

  1. Inventory your AI tools. Make a list of every AI tool your firm uses.
  2. Review your Form ADV. Does it disclose third-party service providers? If not, update it.
  3. Check your compliance manual. Does it address AI? If not, add an AI acceptable use policy.
  4. Document vendor due diligence. For each AI tool, create a one-page summary of your due diligence.
  5. Train your team. Add AI compliance to your next team meeting agenda.

The SEC is watching. The firms that respond proactively will be ahead. The ones that wait will be explaining gaps during exams.


Free Resource

SEC AI Compliance Checklist

A step-by-step checklist covering all SEC and FINRA requirements for AI tool adoption. Includes recordkeeping, Books and Records Rule, and supervision obligations.
