1. Current sub-processors
Infrastructure
| Provider | Purpose | Location | BAA |
|---|---|---|---|
| Amazon Web Services (AWS) | Cloud infrastructure: Lambda, API Gateway, S3, DynamoDB, RDS, EventBridge, KMS, CloudFront, CloudWatch. | United States (us-east-1 default; EU/UK on Enterprise request) | Yes |
| Amazon Cognito | Authentication, MFA, and user pool management. Holds account-level PII (email, name, MFA secret). | Co-located with your AI2BI region | Yes (under AWS BAA) |
AI inference
| Provider | Purpose | Location | BAA |
|---|---|---|---|
| Amazon Bedrock — Anthropic Claude Haiku 4.5 | Generative AI inference. Processes prompts + relevant data context to answer questions and build dashboards. No model training on your data. | AWS regions (data does not leave the AWS network boundary) | Yes (under AWS BAA — Bedrock is HIPAA-eligible) |
Operations
| Provider | Purpose | Location | BAA |
|---|---|---|---|
| Stripe | Payment processing for paid plans. Card numbers handled by Stripe (PCI-DSS Level 1); we never see PANs. | United States | No (no PHI processed) |
| Amazon SES | Transactional email delivery (verification, password reset, MFA codes, notifications). | United States | No |
| Sentry | Error tracking for product and website. Stack traces and request metadata. PII redaction enabled. | United States | No |
| GitHub | Source control + CI/CD. Does not access Customer Data. Used to build and ship code. | United States | No (no Customer Data) |
2. A note on AI inference
AI inference for AI2BI Hub runs on Amazon Bedrock using Anthropic’s Claude Haiku 4.5 model. Because Bedrock runs inside AWS, your prompts and responses never leave the AWS network boundary for AI processing, and the AWS BAA covers Bedrock as a HIPAA-eligible service. Anthropic does not retain prompts or responses sent via Bedrock, and your data is not used to train any foundation model.
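For illustration only, the shape of such an in-boundary request can be sketched as follows. This is a hypothetical sketch, not AI2BI Hub's actual implementation: the model ID is a placeholder (check the Bedrock console for the exact identifier in your region), and the live call would use boto3's `bedrock-runtime` client with your AWS credentials, so the request never traverses a third-party network.

```python
# Hypothetical model identifier -- replace with the exact Bedrock
# model ID available in your account and region.
MODEL_ID = "anthropic.claude-haiku-example"

def build_converse_request(prompt: str) -> dict:
    """Build the keyword arguments for a Bedrock Converse call.

    The request is sent to the Bedrock endpoint inside your chosen
    AWS region, so the prompt and response stay within the AWS
    network boundary.
    """
    return {
        "modelId": MODEL_ID,
        "messages": [
            {"role": "user", "content": [{"text": prompt}]},
        ],
        "inferenceConfig": {"maxTokens": 512},
    }

# Actually invoking the model requires AWS credentials, e.g.:
#   import boto3
#   client = boto3.client("bedrock-runtime", region_name="us-east-1")
#   response = client.converse(**build_converse_request("Summarize Q3 revenue"))
```

The point of the sketch is the trust boundary, not the API details: the endpoint is an AWS service in your region, covered by the AWS BAA, rather than a vendor-hosted API.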
If you optionally enable MCP connector access (querying AI2BI from Claude Desktop, ChatGPT, Cursor, etc.), data flows through the LLM provider you choose. You are responsible for ensuring your users access AI2BI from LLM accounts with appropriate agreements (e.g., Claude Enterprise with a customer BAA for healthcare workloads). See our trust page for details.
3. Change notification
We will give Customer at least 30 days’ notice before engaging a new sub-processor that processes Customer Data. Notification is provided via:
- An update to this page (with a new Last updated date)
- A notice email to the billing contact on each Customer’s account
- A banner inside the AI2BI Hub product for active users
You may subscribe to sub-processor change notifications by emailing compliance@ai2bihub.com with subject “Subscribe — sub-processor changes”.
4. Right to object
If you have a reasonable, data-protection-related objection to a new sub-processor (e.g., a regulator has issued an adverse finding against the provider), tell us within the 30-day notice window. We will work in good faith to do one of the following:
- Demonstrate that your concern is addressed by the contractual / technical safeguards in place,
- Offer a configuration that doesn’t use the new sub-processor for your tenant (where feasible), or
- Allow you to terminate the affected services without penalty if no resolution is reached.
Send objections to legal@ai2bihub.com.
