On-Prem vs Cloud LLMs for GDPR-Compliant Customer-Service Chatbots in the Hotel Industry

Nagy, Zsombor (Gazdaságinformatika Tanszék, BGE / PSZK); Szabó, László (Gazdaságinformatika Tanszék, BGE / PSZK)

English-language conference paper (book chapter), scientific publication
    Identifiers
    • MTMT: 36253764
    Mid-size hotels in Central Europe are increasingly adopting AI chatbots to enhance customer service, yet they must balance innovation with strict GDPR requirements. This study compares a cloud-based large language model (OpenAI’s GPT-4 API) against an on-premises deployment of an open-source LLM (Llama 3–13B) for hotel customer-service chatbots. We evaluate both solutions in the context of Cogniforce Labs deploying chatbots for regional hotels, focusing on GDPR compliance, operational cost, response latency, and customer satisfaction. Using three common use cases (booking modification, late check-in, and local recommendations), we simulate chatbot interactions and measure performance. The results show that the on-premises LLM offers superior data privacy (all guest data remains in-house, aiding GDPR compliance), lower latency (up to ~35% faster responses), and a more predictable cost structure. The cloud GPT-4 solution delivers slightly higher answer quality and greater customer-satisfaction scores, but at the expense of transmitting personal data to a third party and incurring usage-based fees. Our findings suggest a trade-off between compliance and cost on one side and service quality on the other: hotels prioritizing privacy may favor on-premises LLMs, while those emphasizing customer experience might opt for cloud AI with proper safeguards. We discuss hybrid strategies and provide recommendations for hospitality businesses navigating this choice.
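    The comparison described above can be illustrated with a minimal sketch. The code below is not from the paper: the backends are stand-in functions (a real comparison would wrap the GPT-4 API and a local Llama server), and `redact_pii` is a hypothetical example of the "proper safeguards" mentioned for the cloud path, redacting obvious personal data before a prompt leaves the hotel's network. The latency harness shows how per-response timings of the kind reported (up to ~35% differences) could be collected.

```python
import re
import time

# Hypothetical safeguard: redact obvious personal data (emails, phone
# numbers) before a prompt is sent to a third-party cloud API.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\+?\d[\d\s-]{7,}\d")

def redact_pii(text: str) -> str:
    text = EMAIL.sub("[EMAIL]", text)
    return PHONE.sub("[PHONE]", text)

def timed_reply(backend, prompt: str):
    """Call any chatbot backend and measure its response latency."""
    start = time.perf_counter()
    reply = backend(prompt)
    return reply, time.perf_counter() - start

# Stand-in backends; real code would call the GPT-4 API here and a
# locally hosted Llama model there.
def cloud_backend(prompt: str) -> str:   # prompt must be redacted first
    return "cloud reply to: " + prompt

def local_backend(prompt: str) -> str:   # on-prem: raw guest data stays in-house
    return "local reply to: " + prompt

prompt = "Guest anna@example.com asks to move her booking to Friday."
cloud_reply, t_cloud = timed_reply(cloud_backend, redact_pii(prompt))
local_reply, t_local = timed_reply(local_backend, prompt)
print(cloud_reply)
# → cloud reply to: Guest [EMAIL] asks to move her booking to Friday.
```

    The design point the sketch makes concrete: the on-premises path never transforms the guest's data, while the cloud path must pass through a redaction (or equivalent contractual/technical safeguard) step before personal data crosses to a third party.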