What are the concerns?
Company contact pages increasingly no longer lead to a phone number, email address or postal address, but directly to a chatbot. The Autoriteit Persoonsgegevens (AP, the Dutch Data Protection Authority) and the Autoriteit Consument & Markt (ACM, the Netherlands Authority for Consumers and Markets) note, among other things, that customers often receive lower-quality or inaccurate answers. In addition, people with visual impairments or limited computer skills may find it difficult, or even impossible, to ask questions if communication is only possible through a chatbot.
Another concern is that customers may mistake chatbots for humans. The technology is becoming increasingly human-like: chatbots often use personal names, emojis, and an informal tone of voice. As a result, it can be difficult for users to tell whether they are communicating with a person or a chatbot. The regulators also warn of the risk of data breaches. Customers may share confidential information with chatbots without being aware of the risks. If a website is not adequately secured, a hack could easily expose personal data through the chatbot.
Legal obligations
Under Article 3:15d of the Dutch Civil Code, businesses operating online, such as webshops, must provide contact details that allow rapid and direct communication, such as an email address. Article 6:230m of the Dutch Civil Code further requires businesses selling to consumers to provide their identity and contact details before a contract is concluded.
From 2 August 2026, new transparency obligations under the European AI Act (Regulation 2024/1689) will come into effect. Websites using AI chatbots must then clearly inform customers at the very first point of contact that they are communicating with a chatbot. For certain companies, so-called “intermediary service providers” (e.g. cloud platforms or internet providers), this transparency obligation already applies under the Digital Services Act (Regulation 2022/2065).
Key considerations for chatbots
What does this mean for businesses? The AP and ACM emphasize the importance of human contact. Companies using chatbots must ensure that customers can easily switch to a human representative. In anticipation of the new obligations in August 2026, the regulators already recommend clearly disclosing at the start of a conversation that the customer is speaking with a chatbot.
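The two recommendations above, disclose the chatbot's nature at the start and offer an easy switch to a human, can be sketched in code. This is a minimal illustration only: the class and keyword list (`ChatSession`, `HUMAN_KEYWORDS`) are hypothetical names, not prescribed by any regulation or derived from the AP/ACM guidance.

```python
# Hypothetical sketch of a chat widget that follows the AP/ACM recommendations:
# (1) disclose at the first point of contact that the user is talking to a bot,
# (2) let the customer switch to a human representative at any time.

# Hypothetical trigger words for requesting a human agent.
HUMAN_KEYWORDS = {"human", "agent", "representative", "medewerker"}


class ChatSession:
    def __init__(self, company: str):
        self.company = company
        self.escalated = False  # becomes True once a human takes over

    def opening_message(self) -> str:
        # Disclosure at the very first point of contact.
        return (
            f"Hi, I am the {self.company} chatbot, not a human employee. "
            "Type 'human' at any time to speak with a representative."
        )

    def handle(self, user_input: str) -> str:
        # Easy switch to a human, as the regulators advise.
        if any(word in user_input.lower() for word in HUMAN_KEYWORDS):
            self.escalated = True
            return "Connecting you to a human representative..."
        return "I can help with common questions, or type 'human' for a person."
```

In a real deployment the escalation branch would hand the conversation, including its history, to a live-support queue rather than just set a flag.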
Although chatbots can be convenient, businesses remain fully responsible for the information communicated through them. Mistakes can damage customer trust, the regulators warn. It is therefore essential to implement strong quality control measures for chatbot communications.
Finally, robust security is crucial: the personal data that customers share with a chatbot must be protected. Through a clear privacy policy, companies can inform customers about their privacy measures and the use of AI within their organization.