16 October 2025

AP and ACM concerned about AI chatbots

AI chatbots are rapidly taking over customer service functions. Companies are increasingly referring customers directly to a chatbot on their websites. In a joint opinion issued on 2 October 2025, the Autoriteit Persoonsgegevens (AP) and the Autoriteit Consument & Markt (ACM) expressed their concerns about this development. What are the risks and obligations for businesses?

What are the concerns?

Company contact pages increasingly lead not to a phone number, email address or postal address, but directly to a chatbot. The AP and ACM note, among other things, that customers often receive lower-quality or inaccurate answers. In addition, people with visual impairments or limited computer skills may find it difficult, or even impossible, to ask questions when communication is only possible through a chatbot.

Another concern is that customers may mistake chatbots for humans. The technology is becoming increasingly human-like: chatbots often use personal names, emojis, and an informal tone of voice. As a result, it can be difficult for users to tell whether they are communicating with a person or a chatbot. The regulators also warn of the risk of data breaches. Customers may share confidential information with chatbots without being aware of the risks. If a website is not adequately secured, a hack could easily expose personal data through the chatbot.

Legal obligations

Under Article 3:15d of the Dutch Civil Code, businesses operating online, such as webshops, must provide direct means of contact. Article 6:230m of the Dutch Civil Code further requires businesses selling to consumers to provide their contact details before concluding a contract.

From August 2026, new obligations under the European AI Act will come into effect. Websites using AI chatbots must then clearly inform customers at the very first point of contact that they are communicating with a chatbot. For certain companies, so-called “intermediary service providers” (e.g. cloud platforms or internet providers), this transparency obligation already applies under the Digital Services Act (Regulation 2022/2065).

Key considerations for chatbots

What does this mean for businesses? The AP and ACM emphasize the importance of human contact. Companies using chatbots must ensure that customers can easily switch to a human representative. In anticipation of the new obligations in August 2026, the regulators already recommend clearly disclosing at the start of a conversation that the customer is speaking with a chatbot.
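The two recommendations above can be sketched in code. The following is a purely illustrative Python sketch, not a reference implementation: all names (`SupportChat`, `HUMAN_KEYWORDS`) are hypothetical, and real chatbot platforms will expose their own APIs for disclosure messages and human handoff.

```python
# Illustrative sketch of the regulators' two recommendations:
# 1) disclose at the very first point of contact that the user is
#    talking to a chatbot, and
# 2) let the customer switch to a human representative at any time.
# All names below are hypothetical.

HUMAN_KEYWORDS = {"human", "agent", "representative"}

class SupportChat:
    def __init__(self):
        self.started = False    # have we sent the first message yet?
        self.escalated = False  # has the user been handed to a human?

    def respond(self, user_message: str) -> str:
        # Disclosure at the first point of contact (AI Act-style).
        if not self.started:
            self.started = True
            return ("Hi! You are chatting with an automated assistant "
                    "(a chatbot). Type 'human' at any time to reach a "
                    "human representative.")
        # Easy switch to a human, as the AP and ACM recommend.
        if any(word in user_message.lower() for word in HUMAN_KEYWORDS):
            self.escalated = True
            return "Connecting you to a human colleague now."
        return "Automated answer: " + user_message  # placeholder bot logic

chat = SupportChat()
print(chat.respond("Hello"))           # first reply carries the disclosure
print(chat.respond("I want a human"))  # escalates to a person
```

The point of the sketch is structural: the disclosure is unconditional and sent before any other reply, and escalation is always one message away rather than buried in a menu.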

Although chatbots can be convenient, businesses remain fully responsible for the information communicated through them. Mistakes can damage customer trust, the regulators warn. It is therefore essential to implement strong quality control measures for chatbot communications.

Finally, robust security is crucial: protect the personal data customers share with your chatbot. Through a clear privacy policy, companies can inform customers about their privacy measures and the use of AI within their organization.
