Big Brother is watching you

It’s almost eighty years since George Orwell penned those immortal words symbolising oppressive control: “Big Brother is watching you”. Would he smile, eight decades later, to see the phrase used to describe the seemingly relentless march of advanced-technology monitoring?
In the field of employment, such monitoring undoubtedly has its place. Beware, though: it’s very easy to be swept along, enthusiastically embracing new toys, including AI. Ignore at your peril the legal obligations and important principles that must underpin their use.
The Information Commissioner’s survey
It’s five years since the Covid pandemic led to a vast increase in home and hybrid working. In many cases, changes were made overnight with little thought given to the consequences; the urgent overtook the important. Although Covid has thankfully receded, many of those measures simply stayed in place. Meanwhile, managers began to wonder about all those people no longer visible in the office, and ad hoc monitoring emerged.
Two years ago, the Information Commissioner undertook a survey on monitoring in the workplace. Some results were pretty stark. For instance:
- 70% of respondents considered monitoring by an employer intrusive (60% of young workers and 76% of those over 55).
- 57% felt uncomfortable taking a job where their employer would monitor them.
And so the ICO published detailed guidance on monitoring workers, to help employers avoid infringing data protection legislation.
As the Information Commissioner said at the time:
“While data protection law does not prevent monitoring… it must be necessary, proportionate, and respect the rights of workers. We will take action if we believe people’s privacy is being threatened.”
And they did. For instance, early in 2024, the ICO ordered ten employers to stop using facial recognition technology and fingerprint scanning to monitor attendance. They concluded there are less risky ways to record attendance than processing highly sensitive biometric data.
Let’s revisit that statistic about employee monitoring: 57% of people feel uncomfortable taking a job where their employer would monitor them. Could that be the catalyst that leads your favoured candidate to reject an employment offer? Or results in them walking away during probation?
So, maybe just don’t tell anybody you’re monitoring? Mistake! Not only are legality, transparency and fairness cornerstones of data protection law, they’re key to building trust and a positive environment where staff feel valued and respected. What’s more, the ICO says any monitoring must be both necessary and proportionate.
This isn’t to say that monitoring isn’t helpful. Often it is. For instance:
- An effective record of staff movement in and out of a building is valuable for fire safety.
- Giving a lone cleaner a mobile phone provides a lifeline in an otherwise unoccupied building.
- Delivery vehicles with GPS trackers can avoid traffic jams and offer safety reassurance to drivers.
Key areas for monitoring performance
With artificial intelligence developing exponentially, opportunities to monitor performance have exploded. You’re probably bombarded with offers of software to monitor just about every aspect of employment. Frequently encountered areas include:
- Emails, phone calls and messaging.
- Timekeeping, clocking in and out, building access.
- Internet activity, keyboard usage and keystrokes.
- Screenshots and webcam footage.
- Productivity tools and software.
- Audio and video recording.
- Location tracking (e.g. GPS).
- Monitoring personal phones and devices.
- Monitoring personal social media use.
Remember, the more intrusive and detailed the monitoring, the more you must be ready to justify it. Always look for the least intrusive approach that achieves your necessary objective.
Effective policies
At the heart of all employment monitoring sit legality, transparency, and fairness. You must be able to demonstrate that each process is necessary. Building and maintaining a positive and trusting environment in which staff feel valued and respected is critical. Covert monitoring is always problematic, and often unlawful.
Similarly, action triggered by artificial intelligence without human involvement is potentially unlawful. An example would be sickness monitoring where hitting a particular score triggers an automatic sanction without a manager first reviewing the circumstances.
If you utilise Moorepay’s employment documentation, it may well reference key monitoring provisions, such as:
- The circumstances in which emails, phone calls, text messages etc. can be recorded and monitored.
- Details about who’s on or off the premises and rules for ‘clocking’ in/out.
- Whether, where and how CCTV is used. And the exceptional circumstances in which it can be used covertly.
- How personal data is confidentially processed, retained and only disclosed on a need-to-know basis.
- How you deal with the most sensitive, special category data such as ethnicity, sexual orientation, health etc.
- That your trackers do not monitor company vehicles during authorised private use.
- That social media searches during recruitment are strictly limited and appropriate.
- That appropriate procedures are used when checking bags, personal property etc.
- That a manager is always involved in decision making, and outcomes are never automatic or left to artificial intelligence software.
Implementation of the Data Use and Access Act 2025 is imminent. Among other changes, it relaxes restrictions on automated decision making. In future, only automated decisions involving ‘special category’ data (e.g. ethnicity, sexual orientation, health) will remain restricted. Updated ICO guidance (including to the document we’ve hyperlinked) is expected later this year.
The new legislation potentially makes it easier to harness artificial intelligence monitoring software. However, it does not override key data protection principles – legality, necessity, transparency and fairness.