Introduction: AI and Legal Responsibility
The adoption of artificial intelligence in medical practices raises legitimate questions about data protection and legal compliance. In Switzerland, the regulatory framework is particularly rigorous when it comes to health data, and rightly so: patient trust is the foundation of the medical profession.
This guide clarifies what Swiss law actually says and how to implement AI in a compliant manner.
The Swiss Legal Framework
1. The new Federal Act on Data Protection (nFADP/revDPA)
The new Federal Act on Data Protection (nFADP) has been in force since September 1, 2023, modernizing personal data protection in Switzerland to align with European standards.
Fundamental principles applicable to medical AI:
- Privacy by design: Data protection must be integrated from system design
- Data minimization: Collect only strictly necessary data
- Transparency: Patients must be informed about the use of automated systems
- Right to erasure: Data deletion when no longer necessary
- Automated individual decisions: Decisions that significantly affect a person may not be based solely on automated processing; the data subject must be informed and can request human review
For medical AI, this means:
- Patients must be informed that calls may be handled by an AI assistant
- AI cannot make diagnostic or therapeutic medical decisions autonomously
- Conversation data must be protected with end-to-end encryption
- Transcripts must be deleted according to defined policies
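The deletion requirement in the last point can be implemented as a simple scheduled purge. A minimal sketch in Python, assuming an in-memory transcript store and a 90-day retention period (both illustrative, the names and the store are hypothetical):

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # illustrative default; align with your practice's policy

def purge_expired(transcripts, now=None):
    """Keep only transcripts younger than the retention period.

    Each transcript is a dict with a timezone-aware 'created_at' datetime.
    """
    now = now or datetime.now(timezone.utc)
    return [t for t in transcripts if now - t["created_at"] < RETENTION]

# Example: one fresh and one expired transcript
now = datetime.now(timezone.utc)
store = [
    {"id": 1, "created_at": now - timedelta(days=10)},
    {"id": 2, "created_at": now - timedelta(days=120)},
]
store = purge_expired(store, now)
print([t["id"] for t in store])  # → [1]
```

In production the purge would run against the database on a schedule (e.g. a nightly job), but the retention logic is the same.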
2. Article 321 of the Swiss Criminal Code: Medical Professional Secrecy
Art. 321 SCC is the cornerstone protecting professional secrecy in Switzerland. It states:
"Ecclesiastics, lawyers, defence counsel, notaries, patent attorneys, auditors bound to professional secrecy under the Code of Obligations, doctors, dentists, chiropractors, pharmacists, midwives and their assistants [...] who disclose secrets entrusted to them by virtue of their profession or of which they became aware in the exercise thereof, shall be liable to a custodial sentence not exceeding three years or to a monetary penalty, on complaint."
Implications for AI:
- "Assistants" (auxiliaries) are generally understood to include technological systems and service providers acting on behalf of the physician
- AI solution providers must ensure data is not accessible to unauthorized third parties
- Servers must be in Switzerland or jurisdictions with equivalent protection
- Technical staff who might access data must be bound to professional secrecy
3. GDPR: Applicability for EU Patients
Although Switzerland is not an EU member, the GDPR (General Data Protection Regulation) can apply when a Swiss medical practice offers services to patients in the EU or monitors their behaviour (Art. 3(2) GDPR), for example when treating EU-resident patients.
Main GDPR obligations:
- Legal basis for processing: Explicit consent or legitimate interest
- Data Protection Impact Assessment (DPIA): Mandatory for large-scale automated processing of health data
- Data Protection Officer (DPO): Recommended for practices processing large volumes of sensitive data
- Breach notification: Obligation to notify the competent supervisory authority of personal data breaches within 72 hours of becoming aware of them
- Right of access: Patients can request copies of all their data
Data Residency: Where Data Must Be
Location Requirements
To comply with Swiss legislation and medical best practices, health data hosting should meet the following requirements:
- Primary servers in Switzerland: Hosting at ISO 27001 certified data centers on Swiss territory
- Backups in recognized jurisdictions: If necessary, backups in EU or countries with EU "adequacy decision"
- NO to US public clouds: Following the Schrems II ruling, use of US cloud providers (AWS, Google Cloud, Microsoft Azure US) is problematic for health data
Encryption: Required Standards
- In transit: Minimum TLS 1.3 for all communications
- At rest: AES-256 for database data
- End-to-end: Application-level encryption, not just transport
- Key management: Hardware Security Module (HSM) recommended for critical encryption keys
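The "minimum TLS 1.3" requirement can be enforced in application code rather than relying on library defaults. A minimal sketch using Python's standard-library ssl module (a client-side context; server-side configuration is analogous):

```python
import ssl

def strict_client_context():
    """Client-side TLS context that refuses anything below TLS 1.3."""
    ctx = ssl.create_default_context()  # enables cert verification + hostname checks
    ctx.minimum_version = ssl.TLSVersion.TLSv1_3
    return ctx

ctx = strict_client_context()
print(ctx.minimum_version.name)  # TLSv1_3
```

Pinning the minimum version in code means a misconfigured peer or downgraded library default fails loudly at handshake time instead of silently negotiating an older protocol.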
What AIAgens Does to Ensure Compliance
1. Swiss-First Architecture
- Hosting: Primary servers at certified Swiss data centers (Interxion, Green Datacenter)
- Database: PostgreSQL on Swiss infrastructure with replication to separate data center
- Backup: Encrypted backups on Swiss storage, with configurable retention per practice policy
2. Multi-Layer Encryption
- Voice communications: TLS 1.3 encryption for calls via SIP/WebRTC
- Database: AES-256 encryption for all sensitive fields (patient name, phone number, call notes)
- API: All communications via HTTPS with certificate pinning
- HSM integration: Key management via HSM for enterprise clients
3. Tenant Separation and Data Isolation
- Multi-tenant database: Each practice has a logically isolated space
- No cross-customer sharing: One practice's data is never accessible to others
- AI model isolation: No training on production data - AI models are pre-trained on public datasets
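Logical tenant isolation of this kind is typically enforced at the query layer by scoping every statement to a practice ID. A minimal sketch using Python's built-in sqlite3 (table and column names are illustrative; a production PostgreSQL setup would more likely use row-level security policies):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE calls (practice_id TEXT, patient TEXT, note TEXT)")
db.executemany(
    "INSERT INTO calls VALUES (?, ?, ?)",
    [("practice_a", "Anna", "booking"), ("practice_b", "Beat", "reschedule")],
)

def calls_for(practice_id):
    """Every read is scoped to one tenant; no query path omits the filter."""
    return db.execute(
        "SELECT patient, note FROM calls WHERE practice_id = ?", (practice_id,)
    ).fetchall()

print(calls_for("practice_a"))  # [('Anna', 'booking')]
```

The design point is that the tenant filter lives in one place rather than being repeated ad hoc, so a forgotten WHERE clause cannot leak another practice's data.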
4. Access Control and Audit Trail
- Data access: Only authorized practice staff can access their own data
- Audit log: Every access, modification or deletion is tracked with timestamp and user
- Retention policy: Automatic deletion of transcripts after configurable period (default 90 days)
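An append-only audit trail of the kind described boils down to recording who did what, and when. A minimal Python sketch (field names and the in-memory list are illustrative; production systems write to a tamper-evident store):

```python
from datetime import datetime, timezone

audit_log = []  # append-only; in production this would be a write-once store

def record(user, action, record_id):
    """Append one immutable audit entry with a UTC timestamp."""
    entry = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "action": action,  # e.g. "read", "update", "delete"
        "record_id": record_id,
    }
    audit_log.append(entry)
    return entry

record("dr.mueller", "read", "call-1042")
record("assistant.keller", "delete", "call-0999")
print(len(audit_log))  # 2
```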
5. Contractual Compliance
- Data Processing Agreement (DPA): Standard contract compliant with nFADP and GDPR provided to all customers
- EU Standard Contractual Clauses: For clients with EU patients
- Confidentiality agreement: AIAgens staff bound to professional secrecy
Implementation Compliance Checklist
Before activating an AI system in your practice, make sure to:
- Updated privacy notice
  - Add a section on the use of the phone AI assistant
  - Specify purpose, legal basis and retention period
  - Publish on the website and provide to patients on request
- Patient consent (if needed)
  - For call recording: clear notice before recording begins
  - For advanced analysis: separate explicit consent
  - Ability to opt out without penalty
- Agreement with the provider
  - Sign a Data Processing Agreement
  - Verify server and backup locations
  - Request ISO 27001 / SOC 2 certifications
- Staff training
  - Explain how the AI works and what data it processes
  - Instruct staff on what to tell patients who ask for information
  - Define procedures for escalation to a human when necessary
- Data Protection Impact Assessment (DPIA)
  - Required under the GDPR for high-risk processing, such as large-scale handling of health data (the 250-employee threshold concerns records of processing, not DPIAs)
  - Analyze the risks specific to your context
  - Document mitigation measures
Frequently Asked Questions
Can AI diagnose or prescribe?
No. By law, diagnostic and therapeutic decisions must be made by authorized healthcare professionals. AI may only assist with administrative tasks such as appointment booking, provide general information, and perform pre-clinical triage (directing callers to the appropriate specialty).
Do I need to notify the Federal Data Protection and Information Commissioner (FDPIC)?
No prior notification is required for administrative AI use. However, in the event of a data breach that is likely to result in a high risk to patients' personality or fundamental rights, the FDPIC must be notified as soon as possible (Art. 24 nFADP).
Can I use ChatGPT or Google Bard to respond to patients?
Not for health data. Public AI models such as ChatGPT, Bard or Claude do not guarantee Swiss data residency, and submitted data may be used for training. Use only healthcare-specific solutions backed by a signed DPA.
Do patients need to explicitly consent to AI?
It depends on the use case. For basic administrative activities (bookings, opening-hours information), a notice in the privacy policy is sufficient. For recording and deep conversation analysis, explicit opt-in consent is recommended.
Conclusions: Compliant AI is Transparent AI
Data protection is not an obstacle to innovation, but a fundamental requirement to maintain patient trust. A well-implemented AI system:
- Respects privacy by design, not as an afterthought
- Is transparent: patients know when they're talking to AI
- Minimizes data: collects only what's needed for stated purpose
- Protects technically: encryption, isolation, audit trail
- Respects rights: access, deletion, portability
AIAgens is designed with these principles from day one. For more technical information on our security architecture, visit our healthcare security page.
To implement AI in your practice compliantly, explore our plans or contact our team for personalized consultation.