We could have started offering MOAT simply to address those AI topics, which it does. MOAT acknowledges the velocity of change, the need for a clear AI policy and plan, the importance of cross-departmental communication, the challenges of weaving AI into compliance conversations, the protection of personal and proprietary research information that AI can access, and ways to balance freedom and privacy so institutions can promote learning while easing security concerns.
But the biggest reason we started and continue to offer MOAT has nothing to do with the technology itself. We started MOAT for the kids: the young people who will live with AI, and all of its potentially messy consequences, for the rest of their lives if they don't get the training they need right now.
WATCH: Master operational artificial intelligence transformation with CDW’s MOAT engagement.
The Greatest Threat to AI Security Is the People Using It
Most college students are technically adults. But, as anyone who has been around one can attest, they don't always act like it. College is known as much for being a place of higher learning and academic prowess as it is for being a place where nights can get wild, parties happen daily and mistakes are made. Helping young people learn to limit the consequences of those mistakes is part of what higher education institutions should provide.
In the context of AI, and especially generative AI, users are making mistakes left and right. It’s hard to blame them. This is a new, mostly misunderstood but highly tantalizing technology that gives students and adults alike the power of creation in a way they’ve never experienced.
But much as the advent of the internet enabled new and unforeseen violations of our privacy, especially in its early years, the risks of generative AI are being underestimated in some of the same ways.
Consider this scenario in today’s AI world: Students are having fun at a party one night and create a deepfake image of a friend doing something nefarious. They laugh at the image, maybe share it with their friends and forget about it a few days later.
But what has the LLM remembered? Where does that image live online? Who has access to it? And, when the technology improves, what happens if someone can’t tell the difference between the deepfake and a real person doing that nefarious thing? Does the image show up in Google searches when the student is looking for a job? Does it haunt them into adulthood?
Perhaps even more troubling is the most recent research and reporting on how generative AI is being used. A disquieting Harvard Business Review piece published this spring notes that the No. 1 use of generative AI is now therapy and companionship, followed by organizing one's life and finding purpose. Just 12 months earlier, the top use cases were generating ideas (which fell to No. 6 in 2025), therapy and companionship, and specific search (which dropped to No. 13 in 2025).
The research paints a bleak picture of AI “relationships” becoming more intimate, and anecdotal reporting from the New York Times adds context and examples of people who have very quickly gotten in far too deep with an AI companion.
Combine this with the fact that today's college students lived through a significant childhood trauma, the isolation and disconnection of the COVID-19 pandemic, which was at its peak when they were high school age or younger, and the pull of an always-available AI companion becomes even easier to understand.
What Is MOAT and How Can It Help Higher Education Institutions?
MOAT is a holistic and ongoing engagement offered by CDW to higher education institutions. We strongly believe our expertise, experience and position as a solutions integrator can deliver results for whatever your college is looking to do with AI.
Through MOAT, we guide universities through an understanding of AI’s present and potential future, work together with campus leadership to identify what gaps AI can fill, put stakeholders across campus on the same page, and develop a roadmap to integration.
READ MORE: The importance of data management in higher education.
There will also be conversations about security, and that's where the well-being of students comes to the forefront. No piece of IT security training is more important than educating the people who will be using the technology; in the context of generative AI, that's pretty much everyone on campus. And as more AI-enabled devices get plugged in, that audience will only grow.
AI literacy and ethics training are critically important in this realm. A good starting point is understanding that what goes into an AI system never goes away. Educating students on the risks of AI companions is also key, whether that means providing adequate counseling services or conducting mandatory training on the dangers of AI companionship.
There is no doubt that AI is powerful, and with that power comes great responsibility. Colleges and universities hold that responsibility as caretakers for their students, nurturing them and putting them in a position to lead fulfilling lives. Making sure those institutions unleash AI's power with the necessary guardrails in place to keep students on track is what MOAT was built to do.