We need a GI Bill for cyberskills at the college level
Colleges and universities are in crisis mode.
The digital age continues to transform disciplines—dissolving some, and merging and reconfiguring others.
Yet a highly trained and college-educated workforce, one grounded in the ethics of how to best harness new technologies to benefit humanity, is needed more than ever to tackle issues such as cyberwarfare and artificial intelligence.
However, a college degree can cost as much as, or more than, a house. Who can divorce one of the most serious investments of a lifetime from its potential to return better career opportunities?
Few people question the need for a digitally literate workforce. Yet a recent study indicated that more than half of four-year degree undergraduates don’t believe that the value of a college education is keeping up with its cost.
Clearly, higher education institutions need to better demonstrate the outcomes and return on investment that added knowledge brings—while also keeping costs down. Some colleges have even introduced “income-share agreements” that lower students’ debt burden with the help of outside investors.
But even with creative financial aid packages, colleges—like Silicon Valley startups—need to be nimble, entrepreneurial and a step ahead.
Some are incorporating blockchain courses and dedicating research to address the rise of cryptocurrencies. And many are focusing more on how to combat cyberthreats that potentially can create economic, political and social upheaval.
While the National Science Foundation provides some support for cybersecurity scholarships, it is insufficient. One report (cybersecurityventures.com/jobs) indicates that there will be approximately 3.5 million unfilled jobs for cybersecurity professionals by 2021.
Another focus for higher education—especially from an ethical standpoint—is machine learning and AI. It’s a task far too important to be left to the tech industry, which has a tendency to build first and think about the impact later.
Though companies such as Amazon and Google have been leaders in developing AI, investments through the National Science Foundation’s Intelligent Systems Program dropped by 10 percent in 2018. This occurred shortly after China directly challenged America’s lead by declaring its intention to become the world leader in AI by 2030.
But AI is not the only technology with serious ethical concerns. Does it really make sense to permit one privately owned social network to collect and sell more data about more people than all the totalitarian regimes of the 20th century combined? How about the emerging field of synthetic biology and home-editing of the genome of a human or pathogen?
At Harvard, MIT, Stanford and elsewhere, formal reflection on the means and motives of technology applications is entering the curriculum.
This kind of study is at the spiritual core of higher education. It must be preserved and nurtured, and academics must ask if current institutional structures make that possible.
Meeting the challenge
Because universities will be crucial in keeping America on a competitive footing and in helping to ensure that students understand the moral and societal impact of tech on humanity, we need an ambitious program similar in scope to the GI Bill.
American students should know that if they become qualified in tech skills, the government will help pay for their education. The private sector should match these efforts, offering to pay off loans for students who accept in-house security positions.
Higher education may be facing a crisis, but that crisis also presents an opportunity for universities to demonstrate why they must remain a cornerstone of a society fueled by technology.
Nada Marie Anid is vice president for strategic communications and external affairs at New York Institute of Technology, and is co-editor of The Internet of Women: Accelerating Culture Change.