AI and Your Data - Privacy and Security
Understand what happens to your data when you use AI tools, your obligations under the NZ Privacy Act, and how to keep customer information safe.
In this lesson, you will:
- Understand what happens to your data when you use AI tools
- Know your obligations under the New Zealand Privacy Act 2020
- Take practical steps to keep customer information safe
You run a small tour operator in Rotorua and have just signed up for a new AI tool to help manage your customer bookings. The tool asks for access to your customer emails and phone numbers. You’re not sure whether this is safe. You want to use AI to save time, but you’re worried about exposing your customers’ personal information. This situation is common for small business owners in New Zealand who are balancing the benefits of AI with the need to protect sensitive data.
All company names and scenarios used in this course are fictitious and created for illustration and training purposes only. Any resemblance to real businesses or organisations is coincidental.
When you use AI tools, your data becomes part of the process. Whether it’s customer details, sales records, or internal documents, these tools often need access to your data to function. That doesn’t mean your data has to be at risk, provided you take the right steps. Here’s how to keep your data safe and legal under New Zealand’s Privacy Act 2020.
1. Understand the Privacy Act 2020
New Zealand’s Privacy Act 2020 protects personal information, including names, contact details, and payment information. If an AI tool handles this data on your behalf, you remain responsible for how it is used, so the tool must meet the Act’s requirements. Key requirements include:
- Lawful use: The tool must only use your data for the purpose you agreed to (e.g., managing bookings, not selling to third parties).
- Data minimisation: The tool should only collect the data it needs. For example, if your AI tool is for customer service, it shouldn’t ask for your customers’ bank details.
- Storage and security: The tool must store your data securely. If data is held overseas, the Act (Information Privacy Principle 12) expects comparable privacy safeguards to apply.
If you’re unsure, check the tool’s privacy policy — more on this in the Try This section.
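One practical way to apply the data minimisation principle is to strip personal details out of text before it ever reaches an AI tool. The sketch below is illustrative only; the `redact` function and its patterns are our own simplified examples, not part of any particular tool:

```python
import re

def redact(text: str) -> str:
    """Replace email addresses and NZ-style phone numbers with placeholders
    so an external AI tool never sees them. Patterns are deliberately
    simplified for illustration, not production-grade."""
    # Match anything that looks like an email address
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.-]+", "[EMAIL]", text)
    # Match NZ-style phone numbers starting with 0, e.g. "03 555 0192"
    text = re.sub(r"\b0\d[\d\s-]{6,}\d\b", "[PHONE]", text)
    return text
```

For example, `redact("Contact jane@tours.co.nz or call 03 555 0192")` would pass only `"Contact [EMAIL] or call [PHONE]"` to the tool, so a booking query can still be answered without exposing the customer’s contact details.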
2. Use Encryption to Protect Data
Encryption is a way of turning data into a code that only authorised people can read. For example, if you use a cloud-based AI tool to store customer emails, the emails should be encrypted both when they’re being sent (in transit) and when they’re stored (at rest). This prevents hackers from accessing your data if they intercept it or break into the system.
A local accounting firm in Christchurch used an AI tool to manage client tax documents. They chose a provider that used end-to-end encryption, which meant only their team and the AI tool could access the files. This gave them peace of mind and avoided potential legal issues.
3. Limit Access to Your Data
Not everyone in your business needs access to all your data. For example, only the team member who works with sales data may need access to your AI tool’s sales features, not the entire office. You can:
- Use strong passwords and two-factor authentication for AI tools.
- Assign roles in your AI platform (e.g., “admin” for managers, “viewer” for others).
- Review access regularly; if an employee leaves, revoke their access promptly.
A small retail store in Dunedin used an AI inventory system. They gave access only to the manager and the stock clerk, ensuring that sensitive supplier contracts and pricing data stayed secure.
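The role assignments described above can be sketched in a few lines of code. This is a minimal illustration, assuming hypothetical “admin” and “viewer” roles and made-up user names rather than any specific AI platform’s permission system:

```python
# Minimal sketch of role-based access control.
# User names and roles here are hypothetical examples.
ROLES = {
    "manager@example.co.nz": "admin",   # can view and change records
    "clerk@example.co.nz": "viewer",    # can view records only
}

def can_edit(user: str) -> bool:
    """Only users with the 'admin' role may change records.
    Unknown users (e.g. departed staff removed from ROLES) get no access."""
    return ROLES.get(user) == "admin"

def can_view(user: str) -> bool:
    """Both admins and viewers may read records; anyone else may not."""
    return ROLES.get(user) in ("admin", "viewer")
```

With this setup, revoking a departed employee’s access is a one-line change: delete their entry from the role list and every check fails closed.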
4. Choose Tools That Store Data Locally
Some AI tools store data overseas, which can be a risk under the Privacy Act. If a tool stores your data in a country with weaker privacy laws, it might be harder to protect your information if there’s a breach. Look for tools that:
- Store data in New Zealand (e.g., hosted by a local provider).
- Have clear policies about data transfers to other countries.
A local IT consultancy in Auckland chose an AI customer support tool that stored data on New Zealand servers. This aligned with their commitment to protecting client information and avoided potential compliance issues.
Common Pitfalls
Even well-meaning small businesses can make mistakes when using AI tools. Here are three common pitfalls to avoid:
- Assuming free tools are safe: Just because a tool is free doesn’t mean it’s secure. Some free AI tools collect your data to train models or sell it to third parties. Always read the fine print.
- Ignoring compliance with the Privacy Act: Using a tool that doesn’t comply with the Privacy Act could result in fines or legal action. For example, sending customer data to a tool that stores it in a country without comparable privacy safeguards can breach the Act’s rules on overseas disclosure.
- Not preparing for a data breach: If your AI tool is hacked, you need a plan. Under the Privacy Act 2020, a breach likely to cause serious harm must be reported to the Privacy Commissioner, and affected customers must be notified. Many small businesses skip this planning, leaving them unprepared.
Try This
Check the privacy policy of an AI tool you’re considering using. Here’s how to do it:
- Look for clear explanations of how the tool handles your data. Does it say where your data is stored? Who can access it?
- Check for encryption details. Does the tool mention encryption for data in transit and at rest?
- Review data usage policies. Does the tool use your data for purposes beyond what you agreed to (e.g., training AI models)?
- Look for compliance with the Privacy Act 2020. Some tools will explicitly state that they follow New Zealand privacy standards.
For example, if you’re considering a customer relationship management (CRM) AI tool, search for “privacy policy” on their website. A reputable tool will have a clear, easy-to-read policy that answers these questions.
You can also check the Privacy Commissioner’s website (privacy.org.nz) for guidance on assessing AI tools and for any published compliance actions involving a provider.
Key Takeaway
When using AI tools, always ensure they comply with the Privacy Act 2020, protect your data with encryption, and limit access to only those who need it. Check the tool’s privacy policy for details on data storage, usage, and security. Small businesses can use AI safely by being proactive about data protection and choosing tools that align with New Zealand’s privacy standards. In the next lesson, we’ll look at the broader ethical considerations of using AI in your business.
AI for Good: Protecting People, Not Just Data
Data privacy isn’t just a legal requirement — it’s about respecting the people who trust you with their information. When you choose AI tools that protect your customers’ data, you’re building a business that people can rely on. In a small community, that trust is everything. Using AI responsibly means treating your customers’ information with the same care you’d want for your own.