
Your Staff Are Using AI Tools. Does Your Business Know That?

9th April 2026

There’s a quiet revolution happening in offices across South Africa, and most business owners have no idea it’s underway. Their staff — good, productive, well-meaning staff — are using artificial intelligence tools to get their work done faster. They’re drafting emails with ChatGPT, summarising meeting notes with Claude, generating reports with Copilot, and transcribing calls with tools they found on the internet. 

None of them asked for permission. Most didn’t think they needed to. 

This isn’t a story about bad employees. It’s a story about a policy gap that almost every SMB has right now — and why it matters more than it might seem. 

What’s Actually Happening in Your Business Right Now 

The numbers are not subtle. Microsoft and LinkedIn’s 2024 Work Trend Index — a survey of 31,000 workers across 31 countries — found that 75% of knowledge workers are already using AI tools at work. More tellingly, 78% of those users are bringing their own tools rather than using anything their company has officially approved or provided. 

In practical terms, this means your staff are very likely feeding business information into external platforms right now. A customer service manager pastes a client complaint into ChatGPT to draft a response. A salesperson uploads a proposal into an AI summariser to pull out the key points. An administrator uses an AI tool to help format a spreadsheet. Each of these actions sends data — sometimes sensitive data — to a third-party server. The question is: what happens to it there? 

The Risk Isn’t Malice. It’s Ignorance. 

Most AI platforms used on free or basic plans have data retention policies that allow them to use your inputs to improve their models. This became very public in 2023 when Italy’s data protection authority temporarily banned ChatGPT over concerns that OpenAI had no lawful basis for collecting and storing personal data at the scale required to train its models. Italy was the first Western country to take such a step — but the concern it raised is relevant far beyond Europe. 

For casual personal use, the implications are limited. For a business, the picture changes considerably when the data being entered includes client details, financial information, or proprietary processes — particularly under South Africa’s Protection of Personal Information Act (POPIA). 

Under POPIA, your business is responsible for how personal information is processed — including by third parties acting on your behalf. An employee pasting client data into a free AI tool without your knowledge doesn’t transfer that responsibility away from the business. It just means the exposure happened without a policy or a paper trail. 

This Isn’t an Argument Against AI 

To be clear: AI tools are genuinely useful. Businesses that use them thoughtfully will outpace those that don’t. The goal isn’t to ban ChatGPT — it’s to make sure that when your team uses AI, they’re working in an environment your business controls. 

That’s exactly what Microsoft 365 Copilot is designed to do. Unlike third-party AI tools that operate outside your IT environment, Copilot is built directly into the Microsoft 365 apps your team already uses — Word, Excel, Outlook, Teams — and runs entirely within your Microsoft tenant. Your data doesn’t leave your environment. It doesn’t train external models. It’s subject to the same security and compliance controls as the rest of your Microsoft setup. 

Your staff get the productivity benefits they’re already seeking. The business doesn’t carry unknown data exposure risks. 

What a Sensible AI Policy Looks Like for an SMB 

You don’t need a 40-page document. You need three things: 

Visibility. Know which AI tools your staff are currently using. A straightforward conversation or short internal survey is usually enough to start. 

A clear position. Decide which tools are acceptable for which types of work. Consumer AI tools on free plans may be fine for low-sensitivity tasks. Client data, financial records, and anything covered by POPIA should only ever go into approved, controlled environments. 

A sanctioned alternative. If you tell staff they can’t use ChatGPT for work tasks, give them something better. Microsoft 365 Copilot, properly licensed and configured, provides a capable AI assistant that works within your existing tools and your existing security perimeter — without the exposure. 

The Window to Get Ahead of This Is Now 

AI adoption in the workplace isn’t slowing down. The businesses that will navigate it well are the ones that act now — have the conversation, set a policy, and equip their teams with the right tools before a data incident forces the issue. 

Dial a Nerd helps South African businesses set up and manage Microsoft 365 properly — including Copilot licensing, configuration, and onboarding. If you want your team to have the benefits of AI without the risks of unmanaged tools, we can help you get there. 
