
In 2014, Bradley Bryant had just started in a role at Sony’s software division when North Korean hackers breached the company’s systems, stealing and leaking confidential data from Sony’s film studio and demanding the cancellation of the film “The Interview.” The Sony hack made international headlines amid what Bryant called a “wild” and “stressful” time at the company.
“That was one of the first cyberattacks that clearly crossed from the digital realm into the human realm,” Bryant explained. “It affected the release of the film, and it scared some people out of going to the movies.” At the same time, the hack represented an “a-ha” moment in the field of cybersecurity, he said—a recognition of how wide-ranging the human impacts of technical breaches can be. “That moment kicked off a mind shift in the way that we look at cybersecurity and cyberattacks in general,” Bryant explained. “We now consider cybersecurity an interdisciplinary field that sits at the intersection of humans and technology.”
More than a decade after the Sony hack, Bryant has dedicated his career to teaching the critical lessons in cybersecurity that episodes like the hack reveal. After teaching part-time at the iSchool since 2023, Bryant has been hired into a full-time Teaching Faculty I role, in which he is set to teach courses on Human Factors in Information Security (LIS 510) and Database Design (LIS 464) this coming fall. Bryant recently sat down to discuss his unique career path, the centrality of the human element in cybersecurity, the emergence of artificial intelligence (AI) in the classroom, and more. The conversation has been edited for length and clarity.
What led you from industry to academia, and how did you come to UW–Madison?
I was working in IT systems administration for about 10 years, starting in software support in 2011. Then I decided to pursue a master’s in cybersecurity at Georgia Tech, and after serving as a TA, I started considering teaching as a career path. I happened to be browsing job listings casually online, saw a posting at Madison College, and applied. I got the job, and it confirmed for me that I really enjoy teaching.
After about four years at Madison College, I saw a part-time posting for the iSchool at UW–Madison. The description of the course they were hiring an instructor for (LIS 706) was similar to one I helped teach at Georgia Tech, so I thought it could be a great fit. And that ended up being true!
Talk a bit about your experience teaching at the iSchool so far and the courses you’ll be teaching moving forward.
I taught LIS 706 (Data Mining Planning and Management) online in the fall of 2023, then taught an in-person section the following spring. That course covers how to conduct real-world data mining projects by uncovering meaningful patterns and insights from data, as well as principles of data ethics related to data mining. This coming fall, I’ll be teaching LIS 510 (Human Factors in Information Security). This aligns really well with my industry experience and previous teaching at Madison College. Also, with my database experience, I’ll be teaching LIS 464 (Applied Database Design).
The third class I’ll teach is the capstone course for MS Information students, where we have an industry partner, and students work with that partner to deliver on a real, impactful project. At the end of the course they go out to the headquarters and give a data-driven presentation on their final results. This course launched last year for MS Information students, but it has already been highly successful. Our partner, ABC Supply, has been very impressed by our students’ ability to analyze and present data and contribute valuable perspectives. Some students have been offered interviews for full-time roles based on their performance in the capstone.
The MS capstone is a win-win for students and industry partners because the students get applicable experience and the industry partners get deliverables from talented students who can address their challenges in new and creative ways.
How would you describe your teaching philosophy or approach?
My teaching philosophy changes quite often, actually. I pull from experiences in the tech industry, but also before that when I was working other jobs as a line cook or a warehouse worker. I see myself as a trainer, sometimes a little more so than a teacher, and I lean on the process of “tell, show, do, review.” When you’re training somebody, you tell them what you’re going to be doing, you then show them how to do that, you have the trainee do whatever that thing may be, and then you review what they did. In addition, I recognize that the way each person learns is as unique as their fingerprint, and that influences my course design, the way I teach, and my assessments.
Cybersecurity has recently grown as an area of research and education at the iSchool. How do you see the field fitting squarely at the intersection of technology and humanity?
There are many ways to answer this, but let’s take one example: phishing. Phishing is enabled by a process of social engineering—or convincing someone to click on an email or a link that is not actually in their interest. Phishing requires some technical abilities, but it is fundamentally a social and psychological undertaking. In other words, sometimes it doesn’t matter what sort of technical security or defenses you have—if somebody’s going to click on a link in an email, you cannot stop them.
So phishing, as a microcosm of cybersecurity, is both technical and human at its core. And it’s one of the biggest attack vectors for bad actors to get into systems.
Changing gears, can you discuss how AI factors into your teaching, and how your approach to it has evolved in recent years?
After the initial freakout around generative AI in the classroom, I realized we have to incorporate it into the curriculum. We have to get students using it and using it in the right way, so that they’re not putting in personal information or violating academic integrity. That has proved more challenging than many people thought.
I recently had a class in which several students gave an incorrect answer for a quiz question that seemed pretty straightforward. I couldn’t figure out why so many students answered incorrectly, until I entered the question into ChatGPT. It gave me the same incorrect answer that many of the students gave. I then had a conversation with the chatbot to help coax out the correct answer, and it quickly admitted that it too had answered incorrectly. The next class session, I brought up the quiz question to the students and shared my discussion with ChatGPT. This highlighted how these bots can, and often do, make mistakes. In addition to domain knowledge, context and verification are extremely important when interacting with these models.
One last thing I convey to students about not exposing their personal information is that UW–Madison offers supported AI tools, such as Microsoft Copilot, with agreements built in saying that they will not use student data or student prompts to train their models. This can help protect students’ privacy and the security of their information.
Finally, what are some of your hobbies or interests outside of work?
Sports are big in our household—all the Wisconsin sports: Badgers, Packers, Brewers, Bucks and recently WNBA as well. The Caitlin Clark effect is real. Playing video games casually, reading, checking out restaurants in the area, and my pandemic purchase was a motorcycle, so I am trying to get out on that as much as possible.
We also love our two little dogs, so walking them accounts for much of my time spent outdoors. They’re a great reason to get outside and off the couch.