I was honored to receive the EDUCAUSE Professional Pipeline Student Scholarship to attend the EDUCAUSE annual conference as a student representative from UW–Madison. It was a precious opportunity to share a student perspective, listen to national conversations about teaching and technology, and bring ideas back to support instructors and students in the College of Engineering.
Over several days I attended general sessions, breakout sessions, and many informal conversations that circled around one big theme: higher education is still figuring out what AI means for learning. In a surprising way, that made me appreciate the work we are already doing at CEETE even more.
What I noticed about AI across campuses
In many sessions, people shared how their institutions are approaching AI in teaching and student support. My honest impression is that most schools are still at a vague stage. One speaker estimated that nine out of ten campuses have only high-level AI statements, with little concrete guidance for instructors or students.
At CEETE we are already working on very practical questions. For example, how to talk with students about AI in a syllabus, how to design assignments where AI is allowed but must be documented, and how to support faculty who want to try AI in a small, safe way. We are not perfect, but we are not standing still.
I also met people who are thinking about AI from very different contexts. Staff from a Boston school serving disabled students talked about AI in a very honest way. They worry about being left behind if they do not explore AI for accessibility and support, yet at the same time they are careful about bias and harm. That mix of urgency and care felt important.
On the expo floor, I saw how quickly tools are changing. Dell showed an advanced printer that can handle large, low-cost print jobs while still producing very clear visuals, even with small images. Kahoot now lets instructors upload a document and automatically generate quiz questions. Tools like this can save time, but they also raise new design questions. If the quiz is easy to generate, what should we spend our human effort on instead?
Learning from “Unmasking AI”
One of the most powerful sessions for me was a talk by Joy Buolamwini, a computer scientist and founder of the Algorithmic Justice League, based on her book Unmasking AI.
Her message was very clear. AI systems are not neutral. They reflect the data and values they are built on, which means they can repeat and even amplify existing bias. When that happens, people who are already marginalized are the ones who get hurt first. For higher education, that matters a lot, because we are making choices about which tools to use and who they affect.
Joy shared facial recognition as one concrete example. She showed that many systems perform worse on darker-skinned faces and on women, even though they are already used in airports and other public spaces. She also asked us to think about consent when someone wants to scan our faces. We do not have to quietly accept every AI system we are offered.
For me, this connected back to classroom technology. When we bring AI tools into our courses, we also bring their values, their training data, and their blind spots. That means instructors and support units like CEETE have a responsibility to ask hard questions and to keep human dignity at the center.
Sharing the student voice on the EDUCAUSE panel

One of the highlights of EDUCAUSE for me was serving on a student panel about cultivating the next generation of higher ed IT leaders. The session was recorded, and other recorded sessions from EDUCAUSE 2025 are available through the link as well. Being on that stage as a student from UW–Madison felt special because I could speak directly to leaders about what students are actually experiencing with technology and AI.
On the panel I talked about how important open communication is between students and higher ed leaders. Students are already using technology and AI in many different ways, but decision makers do not always see that or ask about it. I shared that I use AI every day, from simple things like asking which wine pairs best with my pasta to bigger tasks in my coursework and my work at CEETE. I also said that higher ed often talks about AI only in terms of cheating instead of giving clear guidance and designing assignments that test real understanding and creativity. I told the audience that students like me want more training and direction on AI, not less, because it is becoming part of everyday life as quickly as smartphones did.
What this means for instructors
For instructors, the biggest takeaway for me is to talk about AI out loud and on purpose. Students are already using these tools, whether we write it into the syllabus or not. When a course has a clear note about where AI is welcome, how to cite it, and where it crosses a line, it removes a lot of fear on both sides. It also signals to students that you know AI exists and that you are interested in how they are learning, not just in catching them doing something wrong.
AI can also change how we design assignments. Instead of only looking at the final answer, we can ask students to explain how they used AI, what it got right, what it missed, and how they fixed it. A short reflection like that already pushes them to think more deeply and turns AI into a partner they have to manage, not a secret shortcut. At the same time, some of the more repetitive work in a course can move to AI, like drafting quiz questions or giving first-pass comments. That extra time can go back into conversations, feedback on thinking, and redesigning activities so they feel more authentic.
I also hope we keep equity and accessibility in mind when we try AI in courses. These tools can help some students by rewriting difficult text, giving extra practice, or breaking down code and math in a different way. They can also introduce bias or privacy concerns if we are not careful. Piloting tools in a small way, asking students how it felt, and looping in groups like CEETE can help us find a balance that is both creative and responsible.
What this means for students
Students are not just on the receiving end of AI. We are already living with it. For me, a healthy habit is to treat AI as a place to start, not the place to end. It is great for brainstorming, planning a project, reviewing code, or hearing a concept explained in a new way. After that, your own judgment, your class notes, and your instructor’s expectations still matter. That extra step of checking and editing is where real learning happens.
It also helps a lot to know the AI rules for each course. Every instructor is in a slightly different place. Some are excited, some are cautious, some are still deciding. Asking early about what is allowed, and being honest about how you are using AI, builds more trust than trying to hide it. Employers said the same thing at EDUCAUSE. They do not just want to know that you used AI. They want to hear how you combined tools, how you checked the output, and how you owned the final result.
Finally, students have a voice in how AI shows up on campus. When we share our experiences with instructors, advisors, and support units like CEETE, it gives them real stories to work with instead of only headlines or fears. AI in teaching is not a finished thing. It is still being shaped, and our everyday choices and feedback are part of that story.
Leaving EDUCAUSE, I felt a mix of worry and hope. Many places are still unsure what to do with AI, and the technology is moving very fast. At the same time, I felt proud to represent UW–Madison as a student and to share the work we are doing at CEETE. If we keep coming back to human values, clear communication, and real collaboration between instructors, students, and staff, I believe we can make AI part of deeper learning instead of a replacement for it.