Anyone who works with electronic devices that can process information knows about computer security. As a concept, computer security is not very different from what is legislated in other areas, such as workplace safety. As IT systems are adopted within a company, both by employees and by automation systems related to, for example, Industry 4.0, the risk associated with using them and with the information they handle grows. For this reason, companies need measures to limit possible problems.
Among the various standards that define best practices related to computer security there are also company certifications, including ISO 27002:2007 and ISO 27001:2005. To obtain these certifications, or even just to assess the state of corporate IT security, a risk plan needs to be drawn up, in which the various uses of the devices and the information that passes through them are analysed.
The process leading to the definition of a proper risk plan normally involves three main phases: risk assessment, risk limitation, and measurement and evaluation. Risk assessment is a recurring activity covering the analysis, planning, implementation, control and measurement of the implemented information security measures. It is usually executed at set intervals (every 6 months, every year, ...) and provides a snapshot of the current corporate security system. Risk limitation focuses on prioritising, evaluating and implementing the appropriate actions emerging from the risk assessment. In the third phase, the processes and the results obtained through the first two phases are analysed. In practice, however, it is often very difficult to draw up an effective risk and action plan in a timely manner, especially today, when corporate computer systems evolve very quickly.
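The handover between the first two phases can be pictured with a minimal sketch. The risk names, the 1-5 likelihood and impact scales, and the classic score = likelihood x impact formula below are illustrative assumptions, not part of any specific standard or of the product described later:

```python
# Hypothetical sketch of risk prioritisation: the assessment phase scores
# risks, and the limitation phase addresses the highest scores first.
# All risk names and numeric values here are illustrative assumptions.

def risk_score(likelihood: int, impact: int) -> int:
    """Qualitative scoring on a 1-5 scale: score = likelihood x impact."""
    return likelihood * impact

risks = [
    {"name": "malware on workstation", "likelihood": 4, "impact": 4},
    {"name": "weak user password",     "likelihood": 3, "impact": 5},
    {"name": "unpatched OS",           "likelihood": 2, "impact": 4},
]

# Rank the assessed risks so limitation actions can be prioritised.
ranked = sorted(
    risks,
    key=lambda r: risk_score(r["likelihood"], r["impact"]),
    reverse=True,
)
for r in ranked:
    print(r["name"], risk_score(r["likelihood"], r["impact"]))
```

Re-running the same scoring at each assessment interval (every 6 months, every year) gives the recurring "snapshot" the text describes, and makes changes between snapshots measurable in the third phase.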
Considering a small subset of this picture, and trying to imagine what the risks associated with the workstation might be, it is immediately clear that finding good rules that can make workstations safe in all contexts is very important. Exploring the various risk plans published on the net, it is clear that very different types of criticality lie within workstation security. There are risks associated with improper use, perhaps due to a mistake or to poor training of the user, and external threats such as viruses or malicious users who may try to obtain confidential information or to interrupt the functionality of the systems simply to cause damage.
In a by now almost historical scenario, each employee is entrusted with a personal computer that contains all the tools needed to carry out his or her work. A commonly used practice is requiring a password to access the system. The password will have its own rules, such as an expiration date or a complexity requirement. This type of use is typically tied to operating systems of the Microsoft Windows family, which also require frequent security updates (bug fixes) and a valid antivirus that must always be updated to the latest virus definitions.
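The two password rules mentioned above, an expiration date and a complexity requirement, can be sketched as simple checks. The thresholds (10-character minimum, 90-day maximum age, four character classes) are assumptions for illustration; a real policy would come from the corporate risk plan:

```python
import re
from datetime import date, timedelta

# Illustrative policy thresholds: these values are assumptions,
# not taken from any specific standard or vendor default.
MIN_LENGTH = 10
MAX_AGE_DAYS = 90

def is_complex_enough(password: str) -> bool:
    """Require minimum length plus upper-case, lower-case, digit and symbol."""
    return (len(password) >= MIN_LENGTH
            and re.search(r"[A-Z]", password) is not None
            and re.search(r"[a-z]", password) is not None
            and re.search(r"[0-9]", password) is not None
            and re.search(r"[^A-Za-z0-9]", password) is not None)

def is_expired(last_change: date, today: date) -> bool:
    """Flag passwords older than the maximum allowed age."""
    return today - last_change > timedelta(days=MAX_AGE_DAYS)
```

In a Windows domain, rules of this kind would normally be enforced centrally via Group Policy rather than in application code; the sketch only makes the logic of the two rules explicit.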
Using a Microsoft system also allows the various personal computers to provide their users with a simple system for sharing and searching resources within the corporate network. This has led to managing access and permissions to the various resources, creating the need for centralised control over the users who can access and work on the workstations. Domain usage has also made it easier to configure the various workstations through the tools provided directly by Microsoft itself. From the security point of view, this has changed the perspective: the access gained by a malicious user or a computer virus is no longer confined to the single workstation but extends to all the shared network resources available, effectively widening the scope of the threat and inevitably altering the risk.
Virtualization, first server-side and then desktop, has substantially changed the business security model. The user still has his own resources available but is no longer aware of their physical location. In fact, they can be local, virtual and therefore delivered from within the company, or even served directly from the cloud. This model is tied to centralising the user configuration, no longer the workstation configuration. The workstation becomes only a point of access, whose role is to reach the resources of the user who is using it at that moment. Virtualization, then, has completely disconnected the workstations from the users who use them. Accepting this concept, it is easy to understand how the configuration and typology of the workstation device have changed, since it is no longer necessary to store applications or data (except for some configuration) on the device side.
This is one of the reasons that led to the birth of Thin Client devices, which do not store data on the machine and have a very simple and quick configuration. Often, this has also made device-side access to shared resources unnecessary, effectively opening the way to non-Microsoft operating systems. From the security point of view, this lowers the probability of virus attacks, since viruses are far less widespread on these operating systems. Also, having no access to shared resources, an attacker will not be able to get information beyond the single machine, which actually contains no data, only device configurations.
However, it is clear that this model does not fit all contexts. What to do, then, in other contexts, where perhaps there are still local applications, maybe kept for historical reasons? Is it really necessary to authenticate devices to access shared and local resources? Agile is a new product that allows secure access to the device by removing all the options that are not provided for by the business risk plan. You do not have to join your devices to the domain, effectively reducing the range of malicious access, but you can still authenticate to the resources you need through the ThinMan management console.