Digital transformation entails several security risks. Digitalising processes entails the risk that systems can be attacked and rendered unusable by cybercriminals; gathering and storing data on individuals harbours risks to privacy; and data from sensitive research projects can be stolen. For a university, it is crucial that its services are secure and reliable and that users know their data to be safe. The notion that data systems and stored data could be at risk is one of the issues that has brought digital transformation to the attention of top university leadership and made it a common concern.
Attacks by cybercriminals are a very serious threat to universities. The aim is often to render key services – for example, exams – inoperable and to demand a ransom for making the system usable again. As universities are complex organisations with many points of access, they can be vulnerable, although they might not be able to pay as large ransoms as other types of organisations. Research units in particular will often build their own software or choose to use systems other than the institutional standard. There are also many devices in laboratories or in university hospitals that are connected to the network and potentially vulnerable to attack.
The large amounts of data held by universities also pose a risk. Universities collect data on learners for administrative purposes, and this might be sensitive depending on the data collection practices in the system – it could, for example, be data on ethnic background, refugee status or (in rare cases) sexual orientation (see EUA report on diversity, equity and inclusion, page 33). There is also the possibility that providers of online learning tools collect data when learners log in to their platforms. This could, for example, be data on the hardware that learners use or where they log in from, both pieces of information that can have commercial value. Such data could be sold or used by the company, if its business model relies on data. Cybercriminals could also gain access to personal data from educational technology. There has been an extensive debate about this in the US regarding primary and secondary schools, but many of the issues could also be relevant for universities. In Europe, the General Data Protection Regulation (GDPR) makes the non-consensual commercial use of data illegal, but there must be trust that non-EU companies, too, remain compliant with the GDPR. Here, there remains an unease about the amount of data handled by large, especially American, companies. In the past, audits have shown excessive data collection practices by big companies, although many have changed these in the wake of the GDPR.
Much research data is meant to be shared, and there is a culture of making data openly available. Nevertheless, some research data and results need protection, following the principle of “as open as possible, as closed as necessary”. Research data that is deemed sensitive, personal and/or private could be data obtained from cooperation with industrial partners, who would like to keep it away from competitors. Research data could also be deemed sensitive for security reasons, for instance regarding dual-use technologies or technologies that are considered critical in other regards. It is also important to note that, while it might not be suitable for sharing for ethical, legal or commercial reasons, sensitive research data can still benefit from good FAIR research data management.
As cybersecurity is becoming a societal challenge, the area is becoming increasingly regulated. At the European level, the Network and Information Security Directive (NIS2) sets requirements for the quality assurance of cybersecurity systems in sensitive sectors. Universities are not defined as a sensitive sector, but some areas of university activity, such as health research, are. In addition, this regulation needs to be implemented at the national level, and there will be differences between EU member states. As with other legislation, these requirements add to the compliance obligations of universities beyond existing security standards. Some institutions, for example, define which areas are critical and should follow the NIS2 requirements, and which are not, applying a modular approach to cybersecurity.
See also under Regulatory Framework and European Digital Regulation.
There are various arguments and strategies for universities to meet the challenge of securing their systems. One common aspect is the lack of resources: the requirements in terms of skilled staff are very difficult to meet, even for large universities. There is a persistent issue of competition for IT skills with the private sector, which can pay much more than universities.
Instead, some universities choose to buy external solutions that can guarantee a high level of security. These systems are preferably applied uniformly across the institution, to ensure that there is an overview of which solutions are in use and that they do not pose a significant security threat. This is not always popular with staff, as solutions developed within the university or within individual units are matched to local processes and needs, and they have a dedicated group of users. Here, it can be necessary to enforce institutional policies firmly.
Moreover, many feel that external solutions, often provided by big technology companies, might ensure a good level of cybersecurity in the sense of preventing attacks, but they do not always ensure the privacy of students and staff. The trade-off between security from attacks and security in terms of privacy makes some universities choose different solutions for different purposes, for example, keeping data related to student management internal, while using external cloud solutions for less sensitive information. Generally, the lack of resources makes it sensible to have a diversified security policy, with different levels of security, so that the ‘golden nuggets’ of core data and services are protected, while the tolerance level and the acceptance of risk are more lenient in other areas.
The human factor is highly important for security. Staff need to take responsibility for security and have the knowledge to do so. Often, an attack will begin by manipulating one person, for example through phishing. Here, policies that prevent non-secure practices are important, together with training to make learners and staff aware of good practice and able to avoid scams and attempts to gain access to institutional systems. This training can, for example, be done through simulation games for individuals or large-scale simulations of different kinds of attacks at the institutional or national level.
One way to solve the challenge of limited resources within one institution can be to pool resources at the national level. This can be done where national-level organisations already exist, such as SURF in the Netherlands or Jisc in the UK. Such collaborative IT organisations allow universities to have a permanent service, 24 hours a day, with experts to assist with immediate problems – something that would be very expensive for individual institutions. Common services also enable common training or simulations to enhance preparedness.