Computer use policies cover a range of regulatory demands while also protecting students, the college or university, and the data they generate.
In the face of security audits and digital vulnerabilities, today’s policies must be driven by risk management principles—yet remain flexible enough for updates when tech teams identify new threats.
The language of these policies is under particular scrutiny: FIRE (Foundation for Individual Rights in Education), a First Amendment advocacy organization, recently called on Dartmouth College to revise its computer policy.
Subjective words like “hateful” and “offensive” were cited as problematic, and the new policy will include more concrete definitions in various contexts, according to an article in Valley News. (When contacted by UB, Dartmouth declined to comment.)
Campus IT and computer rules are maturing as universities work proactively to mitigate security breaches, inappropriate use of technology and other risks.
“Before, universities would start with a policy arising from an event or an audit—individuals were driving policy development,” says Stan Waddell, associate vice president of IT and CIO at the University of New Hampshire.
“When external reviews and audits occur, these organizations want to have a sense that the policy is programmatic.”
To achieve this, specifying the roles of system administrators in policy language is important.
Federal regulations now set stricter requirements for data use and classification, including mandates for storing Social Security numbers and credit card information. As a result, officials must write policies in straightforward language and conduct periodic reviews.
“You want to keep policy as simple as, ‘The university will encrypt its sensitive data,’” says Waddell. “It helps to avoid nuance. You want policy to be living, but you don’t want to be constantly editing.”
A schedule for reviews and potential updates signals that policy management is a priority and sets a tone of respect for policy practice.
When followed by all on campus, comprehensive policies help prevent attacks and user errors that can disrupt networks, says Mitch Davidson, CIO at Purdue University Fort Wayne in Indiana.
Forming ethical policy as the future arrives on campus
A wealth of new data captured from institutions’ native apps and classroom technology offers a detailed picture of how often students use resources, and how that may connect to their success.
“Universities can use data to set up technology and other resources in a more useful fashion,” says Waddell.
Many students are bringing their own artificial intelligence devices into their dorms, introducing new entry points to a school’s network. Some institutions, such as Saint Louis University in Missouri and Arizona State University, provide them to students.
Saint Louis is now stocking every dorm room with an Echo Dot programmed especially for the school: Alexa has answers to 100 questions frequently asked on campus.
Given the omnipresence of smart devices, students should be made aware that these tools are run by AI, not people. A complete computer policy would inform students how much and what kind of data the university gathers from the technology.
For example, a privacy statement on Saint Louis’ website shares that organizations have access to anonymous engagement metrics gathered from campus Dots, but not to voice recordings.
Users should also be made aware of the importance of computer policies, and how they align with a university’s values and obligations, says Nicholas Tella, director of information security at Johnson & Wales University in Rhode Island.
For guidance in meeting federal regulations, IT teams can look to FERPA, HIPAA and standards set by other organizations. The university’s legal counsel can be particularly helpful in navigating these protocols.