A student recently showed me a video of their favorite influencer promoting a new health product. I was struck by how natural it seemed, until the student told me it was a deepfake.
The influencer had never endorsed the product. The video was polished, persuasive and entirely false.
Moments like this are no longer rare. They are becoming a regular part of the digital experience, leaving users to navigate a chaotic information landscape shaped by opaque algorithms built for engagement, not accuracy.
When social media, search and digital news companies raced to reduce or abandon their role in fact-checking and monitoring the information posted on their platforms, I recognized a significant shift. This shift is especially consequential for young people still developing the skills to question sources, identify manipulation and evaluate the technologies shaping their worldview.
We cannot afford to take our chances with weak crowdsourced fact-checking systems that are vulnerable to manipulation, harassment and partisan influence. Those of us in technology education must prepare all internet users to confront these challenges with clarity, responsibility and ethical awareness.
Technical expertise and ethical reflection
Higher education has a responsibility to ensure that every graduate, regardless of major, understands the influence of algorithms, artificial intelligence and digital infrastructures on daily life.
In my work bringing computational thinking to broader audiences, I have seen how even a foundational fluency in software design and data-driven decision-making can empower students to critique automated systems and foresee unintended consequences.
This understanding is especially important as AI becomes deeply integrated into daily workflows, where professionals in every field, from journalists and policymakers to business leaders and health care providers, confront complex technologies that influence their decisions.
As we consider the broad-based adoption of technology literacy, we must also recognize the need for computer science programs to balance technical expertise with ethical reflection. The ongoing trend toward reduced moderation and monitoring underscores how important it is for computer science educators to help students evaluate the power dynamics behind these systems and understand their own role in creating ethical, inclusive and responsible technologies.
Guidelines for computer scientists, such as the ACM Code of Ethics, provide a foundational framework, but on their own they cannot equip students to navigate the multifaceted, context-specific challenges they face in practice.
No technology is value-neutral
At Allegheny College, our ethiCS initiative for responsible computer science embeds ethical reasoning, fairness and accountability into technical coursework.
Hands-on exercises that build on real-world case studies or scrutinize platform policies foster a deeper understanding of how code choices can reinforce or challenge societal values. These experiences encourage students to consider social implications alongside performance metrics, preparing them to tackle current topics such as data bias and platform moderation with ethical responsibility.
Highlighting the human impact of technology shows students how unchecked decisions during software and hardware development can lead to serious and often unintended consequences, reinforcing the need for responsible computing.
Students must understand that no technology is value-neutral. Even seemingly mundane algorithms can embed assumptions or biases, and future technologists need to anticipate how their creations might be misused or inadvertently harm vulnerable communities.
This awareness is even more important as major tech companies exert influence on policy, regulation and cultural norms. Educators must address these dynamics directly, encouraging students to question how the software, hardware and data-driven systems they develop can reinforce or disrupt existing power hierarchies—whether in the form of recommendation algorithms favoring certain viewpoints, hiring systems reflecting historical biases, or data policies shaping user autonomy.
Such inquiry prepares students to navigate dilemmas involving privacy, justice and public discourse, and to become thoughtful creators and consumers of technology.
Empowering a new generation
Liberal arts institutions play a key role by integrating ethics and social perspectives into computer science curricula. This interdisciplinary approach helps computer science students gain cultural awareness while giving non-computer science students the tools to critically assess technological innovations in their own fields.
At the same time, higher education must ensure that curricular initiatives extend beyond theoretical learning. Structural dynamics—such as corporate influence on public policy, the consolidation of tech giants, and the use of black-box algorithms in decisions from content moderation to financial lending—require direct attention rather than being treated as background context.
To address these dynamics, higher education must promote collaboration among faculty, students, policymakers, industry leaders and advocacy groups. Through academic research, public scholarship and internships, students gain direct insight into the interplay between institutional power and ethical development in the tech sector.
Ultimately, both a campus-wide technology literacy requirement and a rigorous, ethically informed computer science education play distinct but complementary roles in preparing students to navigate our changing technological world.
By integrating issues of power, accountability and values into a technology literacy distribution requirement, higher education leaders can position their institutions to bridge the gap between technical practices and ethical principles, empowering a new generation of graduates to thoughtfully navigate the complexities of technology.