As the summer holidays become a distant memory, it’s a great time to reassess your privacy and AI governance strategies and operations.
So we’ve taken inspiration from the Future of Privacy Forum’s “Twelve Privacy Investments for Your Company for a Stronger 2025” and developed a Kiwi version!
Here’s the Simply Privacy view of some of the key things you should be prioritising and investing in this year.
- Revisit your privacy notice
- Ensure your privacy notices are up to date and reflect any new data uses planned for 2025, especially around AI and secondary uses of personal data. You should be aiming for “trusted transparency”.
- You’ll need to prepare for the introduction of IPP 3A by (a) working out where and how you indirectly collect personal information through third parties and (b) ensuring the collecting agency has told individuals who will be receiving their information and what it will be used for. It’s also a good idea to place contractual obligations on the collecting agency to do this. The OPC will be releasing guidance for organisations on the requirements of IPP 3A and in the meantime there is some helpful guidance at govt.nz. We’ve already helped numerous clients update their privacy notices in preparation for IPP 3A so reach out if you have any questions.
- Likewise, agencies that collect information directly and pass it on to other agencies should review their own privacy statements to make sure they are accurate and up to date. That may involve checking with the recipient agency as to their intended uses of the information. See here for more detail.
- Start preparing for the Biometric Processing Privacy Code of Practice
- The Office of the Privacy Commissioner is still consulting on the draft code, and you have until 14 March 2025 to submit comments. As we note elsewhere in this newsletter, it’s important New Zealand takes a risk-based approach to this issue, taking care not to over-regulate in a way that could prejudice beneficial and safe uses of biometric information, or under-regulate in a way that could leave individuals and communities open to harm.
- Identify where you are using any biometric processing and start planning your approach. You’ll probably need to update your privacy notices, conduct PIAs (or at least “necessary and proportionate” assessments – best done via a PIA in our view) and consider whether your use is restricted because it’s particularly high risk. That’s also true for biometric processing that’s been in place for a while, as the code will have retrospective effect.
- Understand your AI risk profile
- The first step to managing potential AI risks is to understand what AI tools and systems are being used across your organisation. Are you using any machine learning systems? Biometric technologies that use AI? Or is your organisation focusing purely on generative AI?
- From there, you can work out your AI risk profile to help inform the best, risk-based approach for your business (AI Impact Assessments can be helpful here). You don’t need an all-singing-and-dancing Responsible AI framework if you only have a few staff using Copilot. But if your business is leaning heavily into numerous types of generative AI, as well as more traditional machine learning models, then you’ll want something a bit more robust. It’s all about understanding the extent of risk and then designing what’s best for your business.
- Refresh your approach to privacy requests
- Every year, access and correction requests are the biggest source of complaints to the OPC and the Human Rights Review Tribunal. Key steps you should take include making sure you have appropriate policies and procedures in place, ensuring your staff know people are entitled to access their personal information and knowing where your personal information is located.
- And don’t forget to think about how you will meet your IPP 6 obligations when you’re procuring new systems and services, especially when it comes to AI. Which brings us to…
- Strengthen vendor due diligence and management
- Don’t just rely on contracts to ensure your vendors and service providers are doing the right thing. Develop a robust approach to upfront vendor due diligence to understand exactly what you’re getting into from both a privacy and AI governance perspective. You can even use the answers from vendors to help inform your approach to contractual risk allocation.
- Implement appropriate technical monitoring and accountability measures in your relationships with service providers. Remember, you remain responsible for personal information processed by service providers who aren’t using that information for their own purposes.
- Make sure you’re protecting children’s information
- Children’s privacy is a big focus around the world right now. Under IPP 4, we need to take particular care when collecting information from children and young people. And it may not be fair to collect information from children in the same manner as we would from an adult.
- Keep an eye out for the OPC’s best practice guides on children’s information. These will focus on different sectors, starting with education.
- Collaborate across the business & with other privacy professionals
- We know you know this already. But just a reminder that developing constructive, transparent and friendly relationships with key business teams will make your privacy and AI governance work much easier. Engage early with the likes of sales, marketing and product teams to understand their future plans, enabling you to support them while managing privacy and reputation risks.
- Plus, a problem shared is a problem halved. Get involved in Privacy Week, attend an IAPP KnowledgeNet event or submit a proposal to speak at the IAPP ANZ Summit in December. However you participate, you’re likely to learn something useful, build connections with other privacy professionals and generally hang out with some great people!
- Invest in privacy and AI literacy education
- Empowering your staff to understand privacy and AI is one of the best investments an organisation can make. And in our experience, it provides one of the biggest returns on privacy investment. If your staff understand what privacy and Responsible AI are, why they matter and what they mean to their roles, then privacy and AI risks can be identified, understood and managed before anything really bad happens.
- That’s why we developed our “Privacy Made Simple” and “Gen AI Guardrails Made Simple” e-learning modules. There’s also plenty that staff can learn from our in-person and live online training, including our Privacy Officer Focus, Privacy Officer Toolkit and IAPP AI Governance Professional training courses.