Today’s discussions of artificial intelligence (AI) tend to focus on its most visible presence, such as the chatbot ChatGPT.
Yet, as two doctoral students discovered during their past year as Lender Center for Social Justice student fellows, AI exists in society in many forms, some readily apparent and others far less visible.
ParKer Bryant and Aren Burnside found that the presence of AI technologies in communities affects people in many ways. They were part of a five-student research team working with Mona Bhan, professor of anthropology in the Maxwell School of Citizenship and Public Affairs, who was chosen as the 2022-2024 Lender Center faculty fellow to study how AI impacts weapons systems, communities and issues of social justice.
Bryant has worked in education since 2008. She holds a bachelor’s degree in psychology and a master’s degree in education leadership, and she moved to Syracuse from Georgia to pursue her doctorate in literacy education in the School of Education. Now in her third year, she wants to work as a faculty member or education researcher after graduation so she can stay involved with students while using data to ensure that educational policies are structured to benefit them.
Burnside is a third-year Ph.D. student in anthropology at the Maxwell School. He grew up in the Syracuse area and obtained dual bachelor’s degrees in anthropology and philosophy from Syracuse University in 2020. He wants to become a professor because he especially enjoys teaching.
Here, Bryant and Burnside discuss how their thinking about AI evolved after investigating its social intricacies.
Q: Why were you interested in this research opportunity?
Bryant: I wanted to study how to achieve balance between AI and education, given the implications of relinquishing our memory and cognition to technology. Studying AI and education positions me to understand where technology in education is headed and gives me a platform to help teachers address current fears and uncertainties and start healthy conversations about AI’s benefits and consequences. The end goal is learning how to make peace with this new technology while striving for a balanced relationship and equitable futures through education.
Burnside: This experience has allowed me to contextualize my focus on the defense contracting and militarization processes that we’re seeing locally. My dissertation, “Just Defense: Whiteness, Settler Colonialism and Environmental Devastation in Defense Contracting in Syracuse, NY,” looks at the physical materials and land required to support the materiality of defense contracting here.
Q: How did your research change or affirm your expectations about AI?
Bryant: We came to understand AI as a multifaceted concept with many components making up its expansive interdisciplinary terrain. I focused on science, technology, engineering, and math (STEM) education and investigated educational systems and preparedness programs within the context of technology and the city of Syracuse. I was not surprised so much as taken aback by how wide and expansive AI networks are across society, and by how far up and down the chain their operations are supported. You can’t look at just one thing; it’s all connected to something else. The networks are deep and wide.
Burnside: It definitely changed some of my conceptions. Before this, the only time I encountered AI was in telling my students to not write their papers with it. I think this project has helped me divorce my conceptions about AI from an easy scapegoat form, like ChatGPT, to understand that we’re essentially in a moment in which AI is becoming embedded in systems of state power to surveil, target, and restrict people on a global scale. I’ve really begun to shift my focus to how AI is transforming the way we should think about the enactment of state power, violence, and social justice.
Q: What were some of your key findings?
Bryant: Before this experience I was interested in emerging technologies and how they impact thinking and the capacity to generate thought and focused attention. Now I see the consequences on both sides. My exploration of knowledge theory perspectives reinforced my understanding of AI’s consequences on either side of the pendulum. I think people should know just how entrenched AI-driven data is in STEM education, and what the implications are.
There are massive networks of corporatization, visible systems of inequality, the targeting of our youth, and the brilliance of marginalized communities living within those social orders. People should start asking questions: Who are the major stakeholders in the policies being pushed, especially those that target marginalized communities? AI is bigger than what you may understand, and that’s okay, yet students, educators and family members need to ask specific questions of those in leadership and require transparency before giving consent.
Burnside: One of the reasons we’re seeing a lot of investment in semiconductor processing and manufacturing right now through things like the CHIPS Act is a national push to make the U.S. self-reliant for its technological needs, especially in security and defense.
When I think about militarization in Syracuse and my own work, you can see it in the city through the proliferation of defense contracting firms and military investment, but you can also see it through this new investment in semiconductors and AI. Both operate around a central framework of securing and defending the nation. I think this recognition really helped me think through some of my own research and ideas for my dissertation as well.