On August 27, 2020, the Dutch Data Protection Authority (Dutch DPA) announced that it had approved the first ‘code of conduct’ in the Netherlands, the Data Pro Code. The Data Pro Code was drafted by NL Digital, the Dutch industry association for organizations in the ICT sector.

What is a ‘Code of Conduct’?

Under the EU General Data Protection Regulation (GDPR), organizations must implement ‘appropriate measures’ on an organizational, technical, and legal level and be able to demonstrate their compliance with the GDPR. To help companies from particular sectors meet this obligation, the GDPR allows associations and other bodies representing categories of controllers or processors to prepare codes of conduct that specify what data controllers and processors need to do in order to be GDPR compliant.

By codifying best practices, such codes of conduct clarify the obligations of controllers and processors, taking into account the risks likely to result from the processing for the rights and freedoms of natural persons. Once drafted, a code must be approved by the relevant national data protection authority.

Why apply ‘Codes of Conduct’?

Companies that apply codes of conduct can thereby ensure that they conform with the GDPR effectively. In addition, adherence to a code of conduct means that the company follows GDPR requirements in a manner that is considered good practice within its sector.

What does the Data Pro Code entail?

The Data Pro Code focuses on the ICT sector in the Netherlands and provides further explanation of data processors’ obligations under the GDPR. In particular, the code offers the relevant Dutch processors practical information about open standards from the GDPR.

An important element is compliance with GDPR information obligations which require a data processor to inform its customer (the data controller) about its security measures. Such information must be provided in a way which allows the customer to assess whether the measures are sufficient, given the intended use of the service or product by the customer.

Data processors which apply the Data Pro Code may comply with this obligation by completing a Data Pro Statement which is then made part of the data processing agreement between the processor and the customer. The data processor thereby informs its customer (i) how it has implemented the GDPR’s security measures, (ii) what certification it holds and (iii) how it is processing the customer’s data (incl. duration, possible ways of deletion and retention period).

Supervision of the Data Pro Code

Compliance with the Data Pro Code is supervised by an independent body, the Data Pro Supervisor. A data processor that wishes to apply the Data Pro Code must accept an independent assessment of its activities. In addition, the processor can be certified as an adherent to the Data Pro Code and be included in the Data Pro Code Register, which is managed by the Data Pro Supervisor. This enables potential customers to verify the processor’s code membership and ensures that the processor’s compliance with the GDPR is monitored by the Data Pro Supervisor. This monitoring, in turn, provides assurance that the code of conduct can be trusted.

Next steps

The criteria that the Data Pro Supervisor must meet have been submitted to the European Data Protection Board for advice. The Dutch DPA expects a definitive answer later this year.


As many countries reach the second stage of the Coronavirus Disease 2019 (COVID-19) outbreak, privacy protections may be relaxed under certain circumstances. The European Data Protection Board (EDPB) issued a statement on the processing of personal data in this period, and several national data protection authorities have issued COVID-19-specific guidelines and advice. As there are considerable differences between the various guidelines, it is essential that organizations subject to these EU data protection laws familiarize themselves with the relevant national guidance.

This guidance – amongst other things – touches upon the question of what employers are allowed to do and say when an employee has, or appears to have, contracted COVID-19.

This blog covers the following topics:

  • European Data Protection Board Guidance
  • Dutch Guidance
  • UK Guidance
  • Irish Guidance
  • Spanish Guidance

EDPB Statement: General Information About Legal Bases for Data Processing

The EDPB adopted a ‘Statement on the processing of personal data in the context of the COVID-19 outbreak’ on March 19, 2020.

In this statement, the EDPB reiterates that employers and public health authorities may process personal data without the consent of the data subject in the event of a pandemic. The EDPB refers to several legal bases in articles 6 and 9 of the General Data Protection Regulation (GDPR).

Article 6 (1) (e) GDPR contains a legal basis for processing if it is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller (the public interest legal basis).

Article 6(1)(d) GDPR provides for the processing of personal data if it is necessary to protect the vital interests of the data subject or another natural person (the vital interest legal basis). Such a vital interest can be found in the protection of the data subject’s or another natural person’s life, but only where the processing cannot be based on another legal basis. Generally, it is regarded as a ‘last resort,’ for instance when an individual is unconscious and in mortal danger.

Recital 46 of the GDPR explains that the above-mentioned legal bases can go hand in hand:

‘Some types of processing may serve both important grounds of public interest and the vital interests of the data subject as for instance when processing is necessary for humanitarian purposes, including for monitoring epidemics and their spread or in situations of humanitarian emergencies, in particular in situations of natural and man-made disasters.’

Where it concerns health data, which qualify as a special category of personal data, the employer or public health organization can rely on the same legal bases; articles 9(2)(g) and 9(2)(c) GDPR are the counterparts of the public interest and vital interest legal bases, respectively, as referred to above.

In addition, pursuant to article 9 (2) (i) health data may also be processed if this is necessary ‘for reasons of public interest in the area of public health, such as protecting against serious cross-border threats to health or ensuring high standards of quality and safety of health care and of medicinal products or medical devices’ (the public health legal basis).

The EDPB also lists several core principles that must be taken into account in the context of the COVID-19 outbreak. Personal data which is necessary to pursue the objectives should only be processed for specified and explicit purposes and the data subject must receive transparent information on the processing activities and their main features.

Lastly, the EDPB answers specific questions, which are particularly relevant for employers. The answers to these questions demonstrate that the national law of the member states is decisive to determine the employer’s obligations.

The EDPB’s answer to the question ‘Can an employer require visitors or employees to provide specific health information in the context of COVID-19?’ is as follows:

‘The application of the principle of proportionality and data minimization is particularly relevant here. The employer should only require health information to the extent that national law allows it.’

In addition, the question ‘Can an employer disclose that an employee is infected with COVID-19 to his colleagues or to externals?’ is answered as follows:

‘Employers should inform staff about COVID-19 cases and take protective measures but should not communicate more information than necessary. In cases where it is necessary to reveal the name of the employee(s) who contracted the virus (e.g. in a preventive context) and the national law allows it, the concerned employees shall be informed in advance and their dignity and integrity shall be protected.’

The Netherlands

The Dutch data protection authority (AP) published guidance on employers’ obligations on its website and has updated it several times. The guidance has been criticized by some for limiting an employer’s options to obtain information and for prioritizing employee privacy.

Generally, an employer is not allowed to check whether an employee is infected with COVID-19. This is only allowed if the employee works in the health sector.

In addition, an employer may not request any information about the nature and causes of a sickness notification. Consequently, it may also not register such information, as it is not considered necessary. This applies to a bone fracture and a cold as much as to COVID-19. While most employees may feel obliged to inform their employer about the nature of their sickness (especially in times of a pandemic), this limitation may preclude an employer from ensuring the health and safety of its employees.

The AP advises that, instead, an employer may request a company doctor to test an employee in cases of suspected COVID-19. The AP specifically states that an employer may also request its employee to contact such a company doctor.

If the doctor suspects an infection on the basis of this test, the doctor will file a report with the regional health service. This service will then discuss the required next steps with the employer. Note that, under this mechanism, the employer only indirectly receives information about the employee’s illness. The company doctor is allowed to process and register health data, as required under Dutch law, because this is necessary for reasons of public interest in the area of public health (article 9(2)(i) GDPR).

Recently, the AP added that an employer may also request employees to check their health during work hours, for example by measuring their temperature. This applies specifically where the employee does not work from home.

In addition, the AP notes that under the current circumstances, an employer may send employees home if they are sick or if the employer suspects that they are sick (for example, if the employee has flu or cold symptoms).

According to the AP, the answer to the question ‘what can I say about someone’s absence to their colleagues?’ is that it remains up to the employee to decide what they want to share. The employer may inform colleagues about the expected duration of the absence, but the AP reiterates that the employer/employee relationship is not considered an equal one. The AP advises that an employer must make sure the employee does not feel pressured to provide further information (such as the name or details of the illness).

The AP adds that the current situation requires specific and far-reaching measures. The employer is encouraged to monitor the advice of national and regional health services closely.

United Kingdom

The UK’s Information Commissioner’s Office (ICO) provides information on its website about ‘Data protection and coronavirus: what you need to know.’ The information is presented in Q&A form.

The ICO answers the question ‘As a healthcare organization, can we contact individuals in relation to COVID-19 without having prior consent?’ as follows:

‘Data protection and electronic communication laws do not stop Government, the National Health Service or any other health professionals from sending public health messages to people, either by phone, text or email as these messages are not direct marketing. Nor does it stop them using the latest technology to facilitate safe and speedy consultations and diagnoses. Public bodies may require additional collection and sharing of personal data to protect against serious threats to public health’.

The ICO also confirms that an employer can tell its staff that a colleague may have contracted COVID-19. However, the ICO emphasizes that ‘you probably don’t need to name individuals and you shouldn’t provide more information than necessary.’

It makes it clear that the organization has ‘an obligation to ensure the health and safety of your employees, as well as a duty of care. Data protection doesn’t prevent you doing this.’

Furthermore, the ICO advises that an employer is allowed to share employees’ health information with authorities – even though it is unlikely that this will be requested. The ICO emphasizes that it remains of importance to collect only strictly necessary data and to ensure that the information is treated with appropriate safeguards.

Interestingly, the ICO notes that it will not ‘penalize organizations that we know need to prioritize other areas (than data protection practices, ed.) or adapt their usual approach during this extraordinary period.’


Ireland

The Irish Data Protection Commission (Commission) emphasizes the employer’s obligation to protect its employees.

With regard to article 9 GDPR, the Commission notes that it is likely that Article 9(2)(i) GDPR and Section 53 of the Irish Data Protection Act 2018 will permit the processing of personal data, including health data, once suitable safeguards are implemented. Such safeguards may include limitation on access to the data, strict time limits for erasure, and other measures such as adequate staff training to protect the data protection rights of individuals.

The Commission concludes that an employer may process health data of its employees, if it is necessary and proportionate to do so. The legal basis for this processing is found in article 9(2)(b) and the obligation of an employer to protect its employees under the Irish Safety, Health and Welfare at Work Act 2005.

In addition, the Commission explicitly states that in light of the employer’s duty of care:

  • employers would be justified in asking employees and visitors to inform them if they have visited an affected area and/or are experiencing symptoms; and
  • employers would be justified in requiring employees to inform them if they have a medical diagnosis of COVID-19 in order to allow necessary steps to be taken.

Lastly, the Commission notes that an employer should avoid naming any individual employee who has contracted the virus. Rather, an employer should inform its staff that there has been a (suspected) case and request that staff work from home. This last part was clearly written in the first stage of the pandemic.


Spain

In its extensive report, the Spanish data protection agency (AEPD) touches upon the vital interest legal bases in articles 6 and 9 GDPR.

With regard to this legal basis in article 6 GDPR, the AEPD notes that it also aims to protect the vital interests of ‘another natural person.’ Consequently, this legal basis may be sufficient for the processing of personal data ‘aimed at protecting all those persons susceptible to being infected in the spread of an epidemic, which would justify, […] in the widest possible way, the measures adopted to this end […].’ This interpretation is clearly broader than the generally accepted one.

The AEPD notes that this legal basis does not work where health data are concerned. The AEPD is the first national authority to refer to article 9(2)(b) GDPR, which provides a legal basis for processing health data without consent for the purposes of carrying out the obligations and exercising the specific rights of the controller or of the data subject in the field of employment law.

A Spanish employer is subject to the Spanish law on the prevention of occupational risks. This law requires each worker to ensure their own and others’ safety and health at work. Consequently, workers must immediately report any situation that reasonably involves a risk to that safety and health, including any suspected contact with the virus. The employer must then process such data in line with the GDPR.

As possible legal bases for data processing in light of the COVID-19 pandemic, the AEPD lists:

  • articles 9(2)(g) and 9(2)(i) GDPR (the public interest and public health legal bases);
  • article 9(2)(h) GDPR – processing necessary to carry out a medical diagnosis or to evaluate the worker’s working capacity; and
  • article 9(2)(c) GDPR (the vital interest legal basis), but only in the event that the data subject is not physically or legally capable of giving consent.

The AEPD reiterates that Spanish laws contain provisions that allow the processing of data in emergency situations. Thus, in order to protect public health, the different public administrations ‘may adopt the necessary measures to safeguard said essential public interests in public health sanitary emergency situations.’ This is an assuring message from the AEPD, which also notes that all data protection principles (laid down in article 5 GDPR) must be respected.

The AEPD concludes with the highly relevant recital 54 GDPR:

‘The processing of special categories of personal data, without the consent of the interested party, may be necessary for reasons of public interest in the field of public health. Such processing must be subject to appropriate and specific measures in order to protect the rights and freedoms of natural persons. […] This processing of health-related data for reasons of public interest should not result in third parties, such as businessmen, insurance companies or banks, treating personal data for other purposes.’

Final note

It is expected that other national data protection authorities will also provide specific Coronavirus Disease guidance. Naturally, we will keep you updated on the latest developments.

On Sept. 24, 2019, the Court of Justice of the European Union (CJEU) decided that the “right to be forgotten” does not require a search engine operator to carry out de-referencing on non-EU member state versions of its search engine. The case relates to a penalty of €100,000 that the French data protection authority, CNIL, had imposed on Google in March 2016. In granting a de-referencing request, the search engine – on free speech grounds – declined to apply the de-referencing worldwide to all domain-name extensions of its search engine. Arguing for global freedom of expression, Google appealed the penalty and filed an application for the annulment of CNIL’s decision with the French Council of State. The French court then referred several questions concerning the territorial scope of the “right to be forgotten” to the CJEU for a preliminary ruling.

The CJEU reviewed the case both under the former Data Protection Directive 95/46/EC (Privacy Directive) and the General Data Protection Regulation (GDPR), which replaced the Privacy Directive on May 25, 2018.

Click here for the full GT Alert, which discusses the CJEU ruling.

On July 29, 2019, the Court of Justice of the European Union (CJEU) found that a website operator using a social media plugin is a joint controller with the social media company providing the plugin and can be held jointly liable in relation to such processing activities. Although the case was decided under the Privacy Directive 95/46, since the ruling concerns definitions that also exist under the General Data Protection Regulation (GDPR), website operators should take note and may want to review their previous legal bases determinations and notices as well as their existing contractual arrangements with the social media company to ensure they are in compliance with GDPR.

The case arose when a German consumer protection association sued a German online fashion retailer, Fashion ID, for allegedly breaching the then-existing national data protection laws when it enabled the transfer of visitors’ personal data to a third party via a social plugin. The German Higher Regional Court referred the matter to the CJEU.

In the proceedings it became apparent that the social media plugin (a “like” button) on Fashion ID’s website caused the visitor’s browser to request content from the company providing the plugin; then the browser transmitted the visitor’s personal data to the social plugin company. This happened as soon as the visitor consulted the website and regardless of whether or not the visitor:

  • was aware of such an operation;
  • was a member of the social media platform; or
  • had clicked on the plugin.

Click here for the full GT Alert on the CJEU’s finding, the website operator’s responsibilities, and key takeaways for website operators.


Modern technologies and personal data are increasingly important for real estate businesses. Robotics, Wi-Fi tracking, augmented and virtual reality, sensor technology, and the Internet of Things (e.g., physical smart objects connected in an internet-based structure) are some of the technologies being used. Through such modern technologies, a landlord has access to a large amount of data with respect to its property.

Collecting, sorting, and analyzing such data can provide the landlord with new insights on the building and its users, and can enable the landlord to predict their behavior. The accuracy of such predictions will generally improve if the landlord can develop a large dataset and combine a variety of information (e.g., by using data from a real estate portfolio). Predicting the behavior of a building’s users can, amongst other things, improve the service level, help retain tenants, and reduce maintenance costs.

The technological possibilities for data processing in real estate seem endless. However, the legislature has put in place certain limits.

Statutory limits to personal data processing in the EU

Under the EU General Data Protection Regulation (GDPR), processing of personal data requires a legal basis (e.g., consent or the execution of a contract). Personal data can in principle only be processed for specified and legitimate purposes. Data subjects must be informed about all personal data processing, and the data controller cannot freely share the personal information with third parties. In addition, using personal data to predict a person’s behavior and for decision-making may qualify as “profiling” and “automated individual decision-making” under the GDPR.

Profiling and automated individual decision-making have a somewhat negative connotation, as they are believed to create unfair stereotypes and social division. As a result, profiling and automated individual decision-making are subject to scrutiny. Profiling under the GDPR is “any form of automated processing carried out on personal data for the purpose of evaluating certain personal aspects relating to a natural person, in particular to analyze or predict aspects with regard to work performance, economic situation, health, personal preferences, interests, reliability, behavior, location or movements”.

Automated decision-making under the GDPR is defined as “making a decision by technological means without human involvement”.

Data subjects must be informed about any profiling or automated individual decision-making that occurs, the logic involved, and the expected consequences of the processing. In addition, data controllers must consider objections to personal data processing, which can be made at any time.

The Dutch Data Protection Authority has stated (link in Dutch) that the tracking of people in the street, in shopping centers, or at stations via their mobile devices is only allowed in a few rare cases and under strict conditions. It is only allowed, according to the Dutch Data Protection Authority, if explicit prior consent is obtained or if there is a legitimate purpose. Based on this decision by the Dutch Data Protection Authority, tracking activities are only allowed if limited to specific periods and areas and where truly necessary. At other times and places, this measuring equipment should be turned off (link in Dutch). The Dutch Data Protection Authority has already imposed an order (link in Dutch) on Bluetrace, subject to penalty for noncompliance (last onder dwangsom), under the former Dutch Data Protection Act. The company was providing technology which could be used to track Wi-Fi signals of mobile devices around stores. Bluetrace had to stop collecting personal data from neighboring residents, erase or anonymize data from shopping passers-by, and provide information in and around the stores about the data processing.

Although the use of data in real estate has much broader applications than Wi-Fi tracking, the examples mentioned illustrate the fine line between the technical possibilities for processing personal data and the statutory limits.


Non-compliance with GDPR requirements may lead to severe fines. The regulatory limits to personal data processing do not mean, however, that modern technologies can no longer be used. While the benefits of modern technologies remain available for both landlords and tenants, such technologies must be used in a transparent, fair, and lawful manner. Landlords, amongst other affected parties, will have to address the use of such modern technologies in their lease agreements and privacy policies.

Click here for more on GDPR.

While many are still digesting the changes brought about by the EU General Data Protection Regulation (GDPR), a new privacy regulation is already on its way. The Regulation Concerning the Respect for Private Life and the Protection of Personal Data in Electronic Communications – in short, the ePrivacy Regulation – is currently a draft under discussion (the latest version, by the EU Council, was published on 13 March 2019).

Unlike the GDPR, the draft ePrivacy Regulation focuses on privacy with respect to electronic communication services and on the data processed by electronic communication services. This means that in relation to such communication services, the ePrivacy Regulation provides the specific obligations that flesh out the more general provisions of the GDPR. The draft ePrivacy Regulation covers more than just data protection law; it also relates to non-personal data, such as metadata. Lastly, the draft ePrivacy Regulation contains provisions on telecommunication confidentiality.

Click here to read the full GT Alert.

In a period of ongoing modernization of European legislation concerning the European Digital Single Market, the regulation of online copyright is a continuing concern. The proposed new copyright directive (‘the Copyright Directive’) would bring far-reaching changes to European copyright law and has been heavily debated by the member states over the last two years. It has also been intensely criticized in the media.

On Wednesday, 13 February 2019, however, a breakthrough was achieved when the three branches of European government – the European Commission, the Parliament, and the Council of the EU – reached a political agreement on the Copyright Directive. In the coming months, the European Parliament and the Council of the EU will have a final vote on the Copyright Directive.

At the same time, the media have published extensively about the directive. The most common complaint concerns Article 13 and the duty of ‘online content sharing providers’ to filter content. These online content sharing providers (‘Content Sharing Providers’) include household-name tech giants. Some argue that such filtering could violate people’s freedom of expression, as defined in Article 10 of the European Convention on Human Rights.

But where does this fear of ‘filtering of the internet’ come from? Does it really pose a threat to human rights? And are there no countervailing advantages to be provided by the Copyright Directive?

Protection of the Creative Content Sector

To start with the last question: Yes, there are advantages, but only for a limited group of stakeholders. One of the objectives of the Copyright Directive is to create a ‘fairer and sustainable marketplace for authors, performers, the creative industries and the press’.

While these parties are at the heart of content creation and the creative sector, their remunerations are not considered reflective of the extensive online use of their content by Content Sharing Providers. This use is generally not addressed in agreements between creators and such providers.

Consequently, if content published on the Internet infringes a creator’s copyright, the content can only be removed afterwards, and no fixed arrangements on remuneration and/or compensation for damages are in place to make the creators whole. Uncertainty about the specific use of content creators’ material negatively affects their ability to determine appropriate use and remuneration. The European Union therefore finds it important to ‘foster the development of the licensing market between rightholders and the Content Sharing Providers’. These licensing agreements should be ‘fair and keep a reasonable balance for both parties’. (Recital 37 of the last proposal for the Copyright Directive).

Article 13 Copyright Directive – Conclusion of License Agreements

Article 13 of the Copyright Directive says that Content Sharing Providers shall:

‘obtain an authorization from the rightholders […], for instance by concluding a licensing agreement’.

The complete text of the amended (and agreed upon) Article 13 can be found here.

Subparagraph 2 of Article 13 sets out one main element of the license agreement: ‘acts carried out by users of the services’. This means that the license agreement would need to cover the possible acts of the platform’s users. This would impose a great burden on online platforms to control and manage user-generated content, thus incentivizing the filtering of user-generated content on these online platforms.

Article 13 Copyright Directive – Liability, unless…

Subparagraph 4 of Article 13 makes the duty to filter content explicitly clear, because if the rightholder does not grant the required authorization, the Content Sharing Provider is liable for the publication of the copyrighted work.

There is an exception to this strict liability, but only if the Content Sharing Provider complies with the following obligations.

The Content Sharing Provider must demonstrate that:

  a. it made best efforts to obtain an authorization; and
  b. it made best efforts to ensure the unavailability of the specific work; and in any event
  c. it acted expeditiously (upon receiving a notice from the rightholders to remove the specific work).

The most efficient way to ensure the “unavailability of the specific work” under (b) would likely be the filtering of all uploaded works on a platform. Using a filter would enable the practical detection of potentially infringing specific works.

Possible Infringements of Human Rights

While Content Sharing Providers are likely to filter user-generated content to protect themselves, automated content filters often fail to recognize the context and actual content of the specific material (link in Dutch). Such failures would contravene other provisions of the Copyright Directive, which explicitly allow specific forms of expression that may not be recognized by a content filter.

The Copyright Directive even contains an obligation for the member states to ensure that users in the EU have the freedom for expressions such as:

  • quotation, criticism and review; and
  • use for the purpose of caricature, parody or pastiche

(Article 13 subparagraph 5 Copyright Directive)

This language would seem to require that a filter draw a clear distinction between the content and its specific purpose. This may be an impossible task in practical terms.

A further challenge for adequate filtering is the fact that a copyrighted work, like a video clip, may have multiple authors, each of whom would need to authorize a license to use the work. Consequently, a Content Sharing Provider’s filter system would need to be precise and fully accurate as to attribution, since a mistake regarding a single author’s authorization could lead to a claim.

Lastly, the use of Internet filters poses threats to user privacy. The filtering of content could easily result in the monitoring of users and their personal data. Objective and clear criteria for content filtering are thus required to prevent infringements of the General Data Protection Regulation (GDPR).

All of the above will likely lead online platforms to implement risk-averse policies, given the threat of numerous large claims. Such policies are likely to result in the strict application of filters that block all content posing a potential risk.

Thus, the risk of strict upload filters is that ‘safe’ content is filtered out, limiting the free flow of information and freedom of expression. The fact that ‘new platforms’ (i.e., platforms on the market for less than three years and with an annual turnover below EUR 10 million) are exempted from obligations (b) and (c) above may seem positive for start-ups, but it also means that there is no way out for large-scale platforms with millions of European users.

To be continued…

The European Parliament and the Council of the European Union are expected to take their final votes in March and April 2019. Subsequently, the member states will need to implement the Copyright Directive in national legislation. The future of the Copyright Directive and its potential impact remain uncertain, and it remains to be seen whether (the members of) these institutions are willing to block the current proposal after years of numerous and lengthy negotiations.


The European Union continues to roll out regulations in furtherance of the EU Digital Single Market, a strategy that covers digital marketing, e-commerce, and telecommunications. The GDPR went into force in May 2018, the new ePrivacy Regulation is planned for 2019, and on Friday, 9 November 2018, a new milestone was reached when the Council of the European Union approved another regulation that provides a framework for the free flow of non-personal data in the EU (the Regulation). The EU Parliament already approved the Regulation on 4 October 2018.


Electronic data are at the center of modern, innovative economic systems and societies. You could say a new “data economy” is emerging.

Besides personal data (whose protection has been significantly strengthened by the GDPR), non-personal data are becoming a greater source of value. Non-personal data include, for example, data sets used for big data analytics with the aim of improving website functionality; clickstream data, for instance, are analyzed and used to make desired improvements.

At the moment, the effective and efficient processing of non-personal data and the development of the data economy in the EU are hampered by national laws. For example, some member states impose data localization requirements mandating that data be stored on devices physically present within the borders of a specific country. Furthermore, “vendor lock-in” practices in the private sector limit the free flow of non-personal data. “Vendor lock-in” is a situation in which a customer who uses a product or service cannot easily transition to the products or services of a competitor.

The Regulation aims to remove these restrictions and to provide for the free movement of non-personal data within the EU through the following provisions:


Scope
The Regulation applies to the processing in the EU of electronic data other than personal data. “Processing” covers a wide range of data operations, such as collecting, filing, storing, using, and transferring. The Regulation covers such processing when it is provided as a service to users residing or having an establishment in the EU, regardless of whether the provider itself is established in the EU, as well as processing carried out by a natural or legal person residing or having an establishment in the EU for its own needs.

Free movement of data within the EU

The data localization requirements shall no longer apply: under the Regulation, the storage or processing of non-personal data within the EU may not be restricted to the territory of a particular member state. This establishes the free movement of such data. In practice, it means that a cloud service provider in the EU may decide for itself where it stores non-personal data.

Porting of data

To eliminate vendor lock-in practices, the Regulation provides for and encourages the development of codes of conduct for service providers. With these codes of conduct, consumers should be able to switch to other service providers more easily. Such codes of conduct focus, for example, on the provision of sufficient and clear information to consumers before a contract is concluded.

Single points of contact

Each member state shall appoint a single point of contact regarding the application of the Regulation. Such point of contact shall liaise with the designated points of contact in the other member states to discuss any issues regarding the Regulation.


The Regulation is another step towards the Digital Single Market and the free flow of data. It should make it easier for service providers to store and process non-personal data within the EU, while consumers should be able to switch between service providers more easily. It remains to be seen whether the Regulation will make a difference, considering that many data sets contain both non-personal and personal data, and a completely different regulation (the GDPR) applies to the latter.

The Regulation will be published in the Official Journal of the EU within a couple of weeks and will apply six months after the date of publication, without further implementation by the member states being required.

The European General Data Protection Regulation (GDPR) has brought important changes to the legal grounds for data transfers between the EU and the United States. Simultaneously, a new act has come into force in the United States that also affects data transfers between the United States and the EU. This act, the Clarifying Lawful Overseas Use of Data Act (the CLOUD Act), creates legal uncertainty and could lead to violations of the GDPR.
