Free Resource

AI Readiness Checklist for Teams

25 questions across 5 areas to find out where your team actually stands — before you buy any tools or book any training.

Takes about 5 minutes. Print it out or work through it on screen.

Scoring: give your team 1 point for each statement below that's true. 25 points possible — see "What your score means" at the end.

1. Team Skills & Awareness

Most team members have tried at least one AI tool in the last 3 months
ChatGPT, Copilot, Gemini, Claude, or similar — even casually
Your team can describe a specific work task where AI would (or does) help
Not "AI can help with everything" — a specific, named task
People on your team know what AI hallucination is and why it matters
Not just "it makes stuff up" — they understand it's confident, not random
Your team has had at least one structured conversation about AI at work
A meeting, training, or shared resource — not just individual dabbling
Team members feel comfortable raising concerns about AI outputs with their manager
No "everyone acts like AI is perfect so I don't say anything" culture

2. Tool Use & Access

Your organization has decided which AI tools are approved for work use
Even if that list is "ChatGPT is fine, nothing else yet" — that's a decision
Team members with Microsoft 365 licenses know whether Copilot is available to them
Not "I think we have it somewhere" — someone actually knows
There's no significant gap between who uses AI tools and who has access to them
If some people have licenses but aren't using them, that's a gap
Your team knows which tools NOT to use for sensitive work or client data
Clear guidance exists, not just "use your judgment"
Someone is responsible for keeping the approved tool list updated as new tools emerge
The AI landscape changes fast — a stale list is almost as bad as no list

3. Data & Privacy Practices

Team members know they should not paste client names, personal data, or confidential content into a public AI tool
This is one of the most common and consequential gaps
Your organization has documented what types of data are off-limits for AI tools
PHI, PII, financial records, trade secrets — at minimum
Staff know the difference between enterprise-grade AI tools and consumer versions
Microsoft 365 Copilot and consumer ChatGPT (Free/Plus) handle your data very differently
There's a reporting process if someone accidentally inputs sensitive data into an AI tool
Even a simple "tell your manager" is a process
AI-generated content that goes external (reports, emails, posts) is reviewed before sending
Not "we trust it" — someone is actually reading and checking

4. Governance & Policy

Your organization has a written AI use policy (even a one-pager counts)
Not "we're working on it" — something documented and shared
Employees know who to ask when they have a question about whether AI is appropriate for a task
A named person or team — not "probably HR"
Your AI policy addresses disclosure — when does your team need to tell someone that AI was used?
Client deliverables, external content, legal documents — the rules vary by context
Leadership has communicated a clear stance on AI — encouraging, cautious, or somewhere defined
Silence from leadership is a policy. Just not a useful one.
Your AI policy is reviewed at least once a year (the landscape changes fast)
A policy written in 2023 is probably already missing critical context

5. Workflow Integration

At least one team workflow has been meaningfully improved using AI in the last 6 months
"We used it once for a project" is not a workflow improvement
There's a way to share what's working across the team (not just individual discovery)
Slack channel, Notion doc, monthly share-out — anything shared
Your team can identify 3+ repetitive tasks that could be partially automated or accelerated with AI
This is the core question of any AI strategy session
Team members know when NOT to use AI — tasks that require human judgment, sensitivity, or accountability
Knowing the limits is as important as knowing the uses
Someone on your team owns the AI strategy — not just "everyone figuring it out individually"
Could be a manager, an ops lead, an IT person, or the ED — just someone

What your score means

0–10
Early stage. Your team is dabbling but doesn't have a foundation yet. A strategy session is the right first step.
11–19
Building. You have some of the pieces. Targeted training in the areas with the most gaps will close the distance quickly.
20–25
Ready to level up. Your foundation is solid. A strategy session can help you find the next layer of efficiency and capability.

Want help acting on your results?

The Roadmap is a 90-minute strategy session that maps your team's specific gaps, identifies the highest-value AI opportunities, and leaves you with a written plan. Flat fee: $400–$600.

Central Florida in-person or virtual — same price.