
Ensuring Generative AI Apps Are Secure for Educational Institutions to Use


Generative AI is revolutionizing countless industries. While it might once have been the stuff of sci-fi dreams, today generative AI is a very real tool with very real possibilities.

Education could benefit from AI more than almost any other sector, but learning institutions are also among the most vulnerable to cyberattacks, which is why schools are increasingly being targeted.

According to an NPR report, the number of K-12 school systems hit by cyberattacks more than doubled from 45 in 2022 to 108 in 2023. Schools are often major employers, and the more people you employ, the higher the chances that someone makes a mistake.

One cyberattack on Atlanta Public Schools (APS) in 2017 started with a few employees clicking a link in a phishing email and ended up with their salaries being rerouted to hackers. APS spent $300,000 dealing with the aftermath of the attack, the Atlanta Journal-Constitution reported.

Schools tend not to have strong cybersecurity: they often run on older computer systems and lack the resources to employ dedicated security experts.

At the same time, they hold lots of very sensitive data about young people. That data is particularly valuable because a child's identity can be easier to use fraudulently than an adult's: a child has no bank account or credit card, so anomalies take far longer to be spotted.

However, while hackers have identified schools as easy targets, this doesn't have to rule out innovative generative AI applications for educational institutions. Picking the right platform and understanding how it works can keep even brand-new technology safe.

Why should schools be excited about Generative AI?

While robot teaching assistants might be some way off, GenAI applications could already deliver very advanced learning assistance.

This goes well beyond automated yes-no, right-wrong responses. Instead, educational institutions can make use of AI that’s able to give students accurate, contextually relevant advice as they learn, helping them correct course when they start to get things wrong – all thanks to RAG, or retrieval augmented generation.

As MongoDB’s post on retrieval augmented generation explains, RAG complements LLMs’ training data with domain-specific data that is not publicly available. This could allow AI to respond to a particular student’s past grades, look out for repeat mistakes, or spot signs of misunderstandings before they become embedded.
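To make that flow concrete, here is a minimal, illustrative sketch in Python. The helper functions and data shapes (embed_text, search_student_records, call_llm) are assumptions standing in for whatever embedding model, vector store, and LLM an institution actually uses; they are not any real library's API.

```python
from typing import Dict, List

def embed_text(text: str) -> List[float]:
    # Stand-in for a real embedding model (hypothetical helper).
    return [float(len(text))]

def search_student_records(student_id: str, query_vector: List[float],
                           top_k: int = 5) -> List[Dict[str, str]]:
    # Stand-in for a vector search over the school's private data store.
    return [{"text": f"Student {student_id}: repeated sign errors when solving inequalities."}]

def call_llm(prompt: str) -> str:
    # Stand-in for the generative model call.
    return "Remember: multiplying both sides by a negative number flips the inequality."

def answer_student_question(student_id: str, question: str) -> str:
    query_vector = embed_text(question)                      # 1. embed the question
    docs = search_student_records(student_id, query_vector)  # 2. retrieve private, student-specific context
    context = "\n".join(d["text"] for d in docs)
    prompt = (
        "You are a teaching assistant. Use the student's record below to tailor your answer.\n\n"
        f"Student record:\n{context}\n\n"
        f"Question: {question}"
    )
    return call_llm(prompt)                                   # 3. generate the augmented answer

print(answer_student_question("s-1024", "Why was my answer to -2x < 6 marked wrong?"))
```

The important step is the retrieval in step 2: it pulls in private, institution-specific data, which is exactly what makes the answers personal, and exactly why the security questions below matter.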

Is this a security risk?

It’s perhaps not hard to see how this could lead to a security risk, at least in principle.

An AI system plugged into live data about students might seem like an obvious target for hackers trying to scrape students’ personal details.

At the same time, an AI that can hold increasingly familiar conversations with students and staff might also make them more willing to hand over security information to what they assume is the friendly AI teaching assistant.

RAG is a revolutionary technology, but it can also leave organizations in a security nightmare, as The Stack’s RAG security deep-dive explains.

How to manage these threats

These might be justifiable concerns in principle, but in reality, they can all be negated by picking the right system and asking the right questions.

Build your RAG-powered system on a platform with proper data governance capabilities: one that is secure by default, offers end-to-end encryption, and complies with data protection best practices at the highest levels.
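In practice, proper data governance also means access controls are enforced inside the retrieval step itself, not just at the login screen. The sketch below uses an illustrative role model and in-memory record set rather than any specific vendor's API: every query is filtered by the caller's identity and role before anything reaches the LLM prompt, so a student's chatbot session can never surface another student's records.

```python
from typing import Dict, List

# Illustrative in-memory "record store"; a real deployment would apply the
# same rules server-side in the platform's own database.
RECORDS: List[Dict[str, str]] = [
    {"owner": "s-1024", "visibility": "student", "text": "Grade history for student s-1024"},
    {"owner": "s-2048", "visibility": "student", "text": "Grade history for student s-2048"},
    {"owner": "*",      "visibility": "staff",   "text": "Cohort-level pass rates"},
]

def allowed(record: Dict[str, str], user_id: str, role: str) -> bool:
    # Students may only see their own records; staff may also see cohort data.
    if record["visibility"] == "student":
        return record["owner"] == user_id or role == "staff"
    return role == "staff"

def retrieve_for_user(user_id: str, role: str, query: str) -> List[str]:
    # The governance filter runs before anything is added to the LLM prompt,
    # so retrieval can never leak another student's data into a response.
    return [r["text"] for r in RECORDS if allowed(r, user_id, role)]

print(retrieve_for_user("s-1024", "student", "how am I doing?"))
# -> ['Grade history for student s-1024']
```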

Consider cloud-based platforms, as many of the major cloud providers are leaders in AI infrastructure innovation. That means you'll be able to take advantage of the latest security improvements as they arrive.

Pick a provider that's proven itself at the grandest scale, with a track record of international enterprise-level customers and examples to back up its claims. If you want to ensure security, go to the best of the best.

By sticking to these principles you can develop GenAI-powered applications that give students cutting-edge learning experiences without making the school system vulnerable.

Education stands to benefit hugely from the emergence of AI, as we’ve explored here in our post on ‘The Benefits of Using AI in Academic Institutions’. Don’t let poor security get in the way of this.
