
Why We’re Pressing Pause on AI Note Takers and Why Your Organization Should Too

AI is moving fast. Tools that promise to make meetings easier and more productive are everywhere, especially AI note takers that can capture and summarize conversations in seconds.

At AZ Impact for Good, we made a deliberate decision to pause the use of external AI note-taking tools in our public meetings and trainings.

This is not about rejecting innovation. It is about understanding the real risks and putting the right guardrails in place before those risks become problems.

Why We Made This Decision

Our meetings are spaces where nonprofit leaders, grantmakers, and partners share ideas, strategies, and sometimes sensitive information. That requires trust, clarity, and control over how conversations are captured.

What we are seeing, and what recent reporting has highlighted, is that AI note takers do not always behave the way people expect.

In some cases, AI assistants continue recording even after participants think a meeting has ended, capturing offhand comments or sensitive conversations and distributing them more widely than intended. (AOL)

That is not a hypothetical risk. It is already happening in workplaces, creating legal, HR, and reputational challenges.

As outlined in our policy, allowing multiple tools to generate their own versions of a meeting also increases the risk of “errors, misinterpretations, or omissions,” and undermines the need for a single, accurate, official record.

This is where control becomes critical. If you cannot fully control what is being recorded, where it is stored, and who has access to it, you are introducing risk into every conversation. Just ask your HR department.


Why This Matters for the Nonprofit Sector

For nonprofits, grantmakers, and mission-driven businesses, these risks are amplified.

You are often working with sensitive community data, funding strategies, and policy discussions. In some cases, even informal conversations can carry real consequences if they are captured and shared out of context. That risk is heightened at a time when the nonprofit sector is already under attack.

AI note takers can also shift behavior in meetings. When everything is being recorded and potentially distributed, conversations become more cautious. That can limit honest dialogue, especially in spaces focused on collaboration, advocacy, and problem solving.

There is also a growing compliance challenge. Organizations now need to think about consent, data storage, and who can access transcripts, not just whether a tool is helpful.

The Bigger Picture: Every Organization Needs an AI Policy

The lesson here is not to avoid AI. It is to use it intentionally.

Organizations that are getting ahead of this are asking key questions:

  • What is being recorded, and when?
  • Who has access to that information?
  • How long is it stored?
  • When should recording be turned off entirely?

Without clear answers, even well-intentioned tools can create unnecessary exposure.

Where to Start

You do not need a complex policy to begin. Start with clarity.

Define when AI tools are appropriate, require transparency and consent, and establish a single source of truth for official records. Most importantly, ensure your use of AI aligns with your mission and the trust your community places in you.

At AZ Impact for Good, we believe innovation should strengthen our sector, not introduce avoidable risk. Setting boundaries is part of that responsibility.
