AI is quickly becoming an integral part of daily operations. However, as we meet with organisations, it is clear that there is some confusion over whether it is being used and how it should be used. This article explains why it's important to understand what you are using and why, with some ideas about how to start training your staff.
Why AI Literacy Matters
Misinformation - it's important to be able to recognise AI-generated content, and this is not always obvious. AI doesn't have to be completely accurate to produce a result, so outputs can contain falsehoods. Always fact check and cross-reference!
Productivity - AI tools can help with repetitive tasks but shouldn't be over-relied upon. Consider defining which AI tools your organisation allows and in what capacity.
Ethics - understanding how AI tools can generate biased results is important. Bias happens when a tool produces unfair, skewed or discriminatory outcomes because of its data, design or deployment. AI models and results should be checked for bias regularly as part of normal procedure. If you are using AI to help automate decision making, adding a regular 'human element' check is best practice (see the sketch after this list for a simple starting point).
Cyber security - training staff regularly about phishing attacks, and warning them that AI-powered attacks will become increasingly sophisticated, is best practice. Keep awareness constant through short briefings at staff meetings, staff email newsletters, posters around the office and regular training conversations. Using a phishing simulation tool is a good way to do this. Don't forget that the power of AI can also be used in cyber attack recovery by speeding up response times, minimising damage and strengthening future security; AI has a place in rapid detection, automated incident response, forensic analysis and more.
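If you do use AI to support automated decisions, even a simple comparison of outcomes across groups can surface problems early enough for a human to step in. The sketch below is a minimal illustration only, assuming you can export decisions to a hypothetical decisions.csv file with made-up "group" and "approved" columns; it compares approval rates between groups and flags large gaps for human review. It is a starting point, not a substitute for a proper bias audit.

```python
# Minimal sketch: compare outcome rates across groups in exported decision data.
# Assumes a hypothetical decisions.csv with columns "group" and "approved" (1 or 0).
import csv
from collections import defaultdict

counts = defaultdict(lambda: {"approved": 0, "total": 0})

with open("decisions.csv", newline="") as f:
    for row in csv.DictReader(f):
        group = row["group"]
        counts[group]["total"] += 1
        counts[group]["approved"] += int(row["approved"])

rates = {g: c["approved"] / c["total"] for g, c in counts.items() if c["total"]}
for group, rate in sorted(rates.items()):
    print(f"{group}: {rate:.1%} approved")

# Flag for human review if the gap between groups looks large.
if rates and max(rates.values()) - min(rates.values()) > 0.2:  # illustrative threshold
    print("Warning: large difference in approval rates - escalate for human review.")
```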
How do you train staff about AI?
While it's important to schedule regular structured training and workshops, this can be very difficult to fit in for small organisations and schools where time is short. Here are some ideas:
Workshops - for all staff. Include AI training in data protection and cyber security training. Given that cyber security training should be run annually for anyone with access to the network, this is a good time to talk about AI. Data protection training should be run regularly for all staff who have access to personal data, and including AI training within it helps staff to recognise personal data and avoid inputting it into any AI model prompts. Staff should also be trained on the extra risk of sharing special category data with AI tools.
AI might also be used in filtering and monitoring reporting, so check with your provider. If you are a school or multi academy trust, it's important that the DSL understands how those filtering and monitoring decisions are made when AI is involved.
AI might be used in your MIS tool. For example, Arbor has Arbor AI, a collection of AI-powered features being added to the MIS on a rolling basis. This type of AI helps users draft communications and provides summaries of students' performance. While Arbor advises that student and staff data processed by Arbor AI is not stored or used to train the OpenAI model, staff and parents should be made aware that an AI model is being used; transparency is always best. You might want to update your privacy notices and due diligence information to reflect this.
AI tools - if you plan to introduce AI tools to the workforce, ensure they come with appropriate training (check with the provider), particularly if prompts are required. Understanding how to recognise personal data so that it is not entered into the AI model is crucial (a simple illustrative check follows this list) - ensure data protection training is up to date for all staff. If you are planning to use AI tools, ensure all due diligence has been completed. It is accepted that processing data with AI is likely to result in a high risk to individuals' rights and freedoms, so a DPIA is advised. Always contact your DPO and whoever is in charge of information security in your organisation if you are procuring AI tools or products that use AI. If you are a school or multi academy trust, review the DfE Digital Standards for Cloud Solutions, as the framework will help with procurement.
Ethics - ensure staff understand how to use AI tools ethically and responsibly, which means understanding data privacy, bias mitigation and your organisation's rules on acceptable use.
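To reinforce the training point about keeping personal data out of prompts, some organisations add a simple technical backstop before text is sent to an AI tool. The sketch below is illustrative only: the patterns and the check_prompt helper are our own invention, not part of any particular AI product, and they flag obvious identifiers such as email addresses, UK phone numbers and National Insurance numbers so a member of staff can review the prompt first. It will not catch everything, so it supports rather than replaces staff training.

```python
# Minimal sketch: flag obvious personal data in a prompt before it is sent to an AI tool.
# The patterns are deliberately simple and will not catch all personal data.
import re

PATTERNS = {
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "UK phone number": re.compile(r"\b(?:\+44\s?|0)\d{4}\s?\d{6}\b"),
    "National Insurance number": re.compile(r"\b[A-Za-z]{2}\s?\d{2}\s?\d{2}\s?\d{2}\s?[A-Za-z]\b"),
}

def check_prompt(prompt: str) -> list[str]:
    """Return a list of the personal-data types spotted in the prompt."""
    return [label for label, pattern in PATTERNS.items() if pattern.search(prompt)]

if __name__ == "__main__":
    prompt = "Draft a letter to jane.smith@example.com about her son's attendance."
    findings = check_prompt(prompt)
    if findings:
        print("Possible personal data found:", ", ".join(findings))
        print("Remove or anonymise it before sending this to an AI tool.")
    else:
        print("No obvious personal data found - still double-check before sending.")
```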
If you're just getting started with using AI then take a look at our AI Best Practice Area and our AI checklist: