CSU’s adoption of ChatGPT Edu is, in many ways, a welcome move. The System has recognized that generative AI is no longer optional or experimental. It is part of the work students, researchers, and educators do across disciplines. Providing a dedicated version of the platform with institutional controls makes sense. But the way it has been implemented has led to a diminished version of what could have been a powerful tool.
The most immediate concern is the complete ban on third-party custom bots. Students and faculty cannot use them, and, more frustrating still, they cannot share the bots they create beyond their own campus. The motivation is likely grounded in cybersecurity and privacy concerns, but the result is a blunt solution that restricts access to useful tools and forecloses opportunities for creativity and professional development.
Some of the most valuable GPTs in use today come from third-party developers who specialize in specific domains. Bots that incorporate Wolfram, for instance, have become essential in areas like physics, engineering, and data science. ScholarAI and ScholarGPT have become valuable research assistants that are not easy to replicate in-house, and hundreds of other potentially useful tools exist. Losing access to them on the CSU platform is not a minor technical gap. It is an educational limitation.
The problem becomes even clearer when considering what students are allowed to do with their own work. If someone builds a custom GPT in a course project, they cannot share it publicly. There is no way to include it in a digital portfolio or present it to a potential employer. The result is that their work remains trapped inside the university’s system, unable to circulate or generate value beyond the classroom.
This limitation also weakens CSU’s ability to serve the public. Take, for example, an admissions advisor who wants to create a custom bot to help prospective or transfer students explore majors or understand credit transfers. That bot cannot be shared with anyone outside the CSU environment, so in practice the people who most need the information are blocked from using it. This cuts against the mission of outreach and access that most universities claim to support.
Faced with these limits, faculty and staff are left to find workarounds. Some of us, myself included, now juggle two accounts: one tied to CSU’s system and a personal one, paid for out of pocket, that allows access to third-party tools. This is not sustainable, and it introduces friction into the very work the platform was meant to support.
Higher education functions best when it remains open to the world. It thrives on collaboration across institutions, partnerships with industry, and the free exchange of ideas and tools. When platforms are locked down and creativity is siloed, that spirit is lost. We are left with a version of academic life that is narrower, more cautious, and less connected.
Of course, privacy and security matter. But so does trust in the people who make the university what it is. By preventing sharing and disabling custom bots, the policy sends a message that students and faculty cannot be trusted to use these tools responsibly. It puts caution ahead of creativity and treats containment as a form of care.
The solution is not difficult. Other platforms already support safer modes of sharing, such as read-only access, limited-time links, or approval workflows. CSU could adopt similar measures and preserve both privacy and openness. What is needed is not better technology but a shift in priorities.
Custom GPTs are not distractions. They are how people are beginning to build, explain, and share knowledge. If we expect students to thrive in that environment, they need access to the real tools of the present, not a constrained version from the past.