On 11 March the Council of the EU confirmed the provisional agreement reached on the Platform Workers Directive (the Directive). The Directive aims to improve the working conditions of those who work on platforms in the gig economy and will also regulate the use of algorithms by digital labour platforms.
Employment protection
The EU estimates that there are more than 28 million people working through digital labour platforms in the EU, sometimes known as “gig economy” workers. One of the key issues regarding these individuals is correctly determining their employment status in order to understand the minimum standards of employment protection to which they are entitled. Under the agreed text of the Directive, Member States will establish a legal presumption in their legal systems to help determine the correct employment status of persons working through digital platforms. The legal presumption will be triggered when facts indicating control and direction are found. People working through digital platforms, their representatives or national authorities may invoke the legal presumption and claim that they have been misclassified, and the burden of proof will then be on the digital platform to prove that there is no employment relationship. In addition, Member States will provide guidance to digital platforms and national authorities when the new measures are put in place.
This is a departure from the original drafting, which provided that individuals would be presumed to be employees if a certain number of criteria were met. The compromise reached means that the Directive will not set out the conditions for determining employment status; instead, this responsibility is given to each EU Member State, taking into account national law, collective agreements and EU case law. In the UK there has been a significant amount of case law considering the employment status of such gig economy workers, and while the UK is not bound by the Directive, it will be interesting to see how it influences any UK determinations. In addition, the Labour party in the UK has said that it will consult on its proposal to create a single “worker” status for all but the genuinely self-employed and to review the rights available to such workers, potentially increasing the employment protections that gig economy workers may receive.
Regulating algorithmic management
The Directive is interesting on algorithmic management because it covers, with more specificity, ground that is already covered by the GDPR and that will also be covered by the EU AI Act when it finally comes into force. All three instruments can apply to automated monitoring and decision-making relating to platform workers, and platform operators will have to take account of them all.
The Directive prohibits automated monitoring or decision-making based on a person’s psychological or emotional state, private conversations, activity outside the performance of platform work, data used to predict the person’s exercise of fundamental rights (such as collective bargaining or association), inferences relating to certain sensitive characteristics, and one-to-many biometric identification of the person performing platform work. The intention appears to be to catch decision support systems as well as wholly automated systems, and these activities will therefore be banned.
Outside these prohibited areas, all automated monitoring and decision-making concerning platform workers is deemed to trigger the requirement to undertake a DPIA under the GDPR. Workers’ representatives must be consulted before a system’s introduction (and have a right to be assisted by an expert of their choice) and must be provided with comprehensive and detailed information about how the system will be used and what parameters and weightings are used to make decisions, while workers or applicants themselves must receive the same information in a concise form. A right to human review within two weeks (quicker than under the GDPR) is included, and platform workers are entitled to require monitored data relating to their activities to be moved to other platforms.
The system’s operation must be reviewed at least every two years and the results shared with workers’ representatives. Private rights of action are created, and GDPR penalties can be applied for breach of these provisions.
Next steps
The text of the Directive must now be finalised and then formally adopted. Member States will have two years after formal adoption to incorporate the provisions into their national legislation. The intention is to make these platforms’ decision-making processes much more transparent, and to make it easier for workers to anticipate how they will be treated and to challenge practices that they consider to be unfair. As many of these rules could be derived from the principles in the GDPR, platform operators should beware of DPAs applying them in practice sooner than the implementation date.
This post has also been published on our Global Employment blog, Global Workplace Insider.