Sub-processors

Last updated: April 25, 2026

A “sub-processor” is a third party that processes Customer Data on our behalf to deliver AI2BI Hub. We use the providers below; this page is the authoritative, current list. We commit to giving you at least 30 days’ notice before adding a new sub-processor that touches Customer Data, and you may object on reasonable grounds.

1. Current sub-processors

Infrastructure

Amazon Web Services (AWS)
  • Purpose: Cloud infrastructure: Lambda, API Gateway, S3, DynamoDB, RDS, EventBridge, KMS, CloudFront, CloudWatch.
  • Location: United States (us-east-1 by default; EU/UK regions available on Enterprise request)
  • BAA: Yes

Amazon Cognito
  • Purpose: Authentication, MFA, and user pools. Holds account-level PII (email, name, MFA secret).
  • Location: Co-located with your AI2BI region
  • BAA: Yes (under the AWS BAA)

AI inference

AWS Bedrock — Anthropic Claude Haiku 4.5
  • Purpose: Generative AI inference. Processes prompts and relevant data context to answer questions and build dashboards. No model training on your data.
  • Location: AWS regions (data does not leave the AWS network boundary)
  • BAA: Yes (under the AWS BAA; Bedrock is a HIPAA-eligible service)

Operations

Stripe
  • Purpose: Payment processing for paid plans. Card numbers are handled by Stripe (PCI-DSS Level 1); we never see PANs.
  • Location: United States
  • BAA: No (no PHI processed)

Amazon SES
  • Purpose: Transactional email delivery (verification, password reset, MFA codes, notifications).
  • Location: United States
  • BAA: No

Sentry
  • Purpose: Error tracking for the product and website. Receives stack traces and request metadata; PII redaction is enabled.
  • Location: United States
  • BAA: No

GitHub
  • Purpose: Source control and CI/CD; used to build and ship code. Does not access Customer Data.
  • Location: United States
  • BAA: No (no Customer Data)

2. A note on AI inference

AI inference for AI2BI Hub runs on AWS Bedrock using Anthropic’s Claude Haiku 4.5 model. Because Bedrock runs inside AWS, your prompts and responses never leave the AWS network boundary for AI processing, and the AWS BAA covers Bedrock as a HIPAA-eligible service. Anthropic does not retain your prompts or responses sent via Bedrock, and your data is not used to train any foundation model.

If you optionally enable MCP connector access (querying AI2BI from Claude Desktop, ChatGPT, Cursor, etc.), data flows through the LLM provider you choose. You are responsible for ensuring your users access AI2BI from LLM accounts with appropriate agreements (e.g., Claude Enterprise with a customer BAA for healthcare workloads). See our trust page for details.

3. Change notification

We will give Customer at least 30 days’ notice before engaging a new sub-processor that processes Customer Data. Notification is provided via:

  • An update to this page (with a new “Last updated” date)
  • A notice email to the billing contact on each Customer’s account
  • A banner inside the AI2BI Hub product for active users

You may subscribe to sub-processor change notifications by emailing compliance@ai2bihub.com with subject “Subscribe — sub-processor changes”.

4. Right to object

If you have a reasonable, data-protection-related objection to a new sub-processor (e.g., a regulator has issued an adverse finding against the provider), tell us within the 30-day notice window. We will work in good faith to do one of the following:

  • Demonstrate that your concern is addressed by the contractual and technical safeguards in place,
  • Offer a configuration that does not use the new sub-processor for your tenant (where feasible), or
  • Allow you to terminate the affected services without penalty if no resolution is reached.

Send objections to legal@ai2bihub.com.