---
title: Regional Segment (Europe)
---
{% include content/plan-grid.md name="data-residency" %}

Segment provides regional infrastructure across Europe, the Middle East, Africa, and Asia Pacific, with [rate limits and SLA](/docs/connections/rate-limits/).

Because Schrems II restricts the transfer of personal data to processors established in countries outside of Europe, all data in European workspaces is ingested, stored, processed, and delivered locally within the EU. You can set up EU workspaces in which data for all sources is received through an endpoint hosted in Dublin, Ireland, with the exception of [cloud object sources](/docs/connections/sources/#object-cloud-sources) and cloud event sources, which aren't supported. Segment-hosted archives in AWS S3 in Dublin, Ireland process, filter, validate, deduplicate, and archive data for EU workspaces.

> info ""
> Regional Segment for Europe is currently in beta. Segment’s [First-Access and Beta terms](https://segment.com/legal/first-access-beta-preview/) govern this feature.

## Create a new workspace with a different region
To create a workspace with a different data processing region:
1. Log in to your Segment account.
2. Click **New Workspace**.
3. Select your **Data processing region**. This determines the location in which Segment collects, processes, and stores data that’s sent to and from your workspace. You can choose from *US West* or *EU Central*.
4. Click **Create workspace**.

## Regional Data Ingestion
Regional Data Ingestion enables you to send data to Segment from both Device-mode and Cloud-mode sources through regionally hosted API ingest points. The regional infrastructure can fail over across locations within a region, but never across regions.

Segment's EU instance only supports data ingestion from Dublin, Ireland through the `in.eu2.segmentapis.com/v1` endpoint.
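
If you post events to this endpoint directly over HTTP, the request looks like a standard Segment HTTP API `track` call aimed at the EU host. The snippet below is a minimal sketch, assuming the usual `/track` route and Basic auth with a write key; the write key and event payload are placeholders.

```ts
// Minimal sketch: send one track call to the EU ingestion endpoint (Node 18+ for global fetch).
// YOUR_WRITE_KEY and the event payload are placeholders.
const WRITE_KEY = "YOUR_WRITE_KEY";

async function sendTrack(): Promise<void> {
  const response = await fetch("https://in.eu2.segmentapis.com/v1/track", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // The HTTP API authenticates with Basic auth: write key as username, empty password.
      Authorization: "Basic " + Buffer.from(`${WRITE_KEY}:`).toString("base64"),
    },
    body: JSON.stringify({
      userId: "user-123",
      event: "Order Completed",
      properties: { revenue: 42 },
      timestamp: new Date().toISOString(),
    }),
  });
  console.log(response.status); // 200 means the event was accepted
}

sendTrack();
```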

### Set your Data Ingestion Region
To set your Data Ingestion Region:
1. Go to your source.
2. Select the **Settings** tab.
3. Click **Regional Settings**.
4. Choose your **Data Ingestion Region**.
    * If you’re in the US region, you can select from: Dublin, Singapore, Oregon, and Sydney.
    * Segment’s EU instance only supports data ingestion from Dublin with the `events.eu1.segmentapis.com` endpoint.

### Client-side sources
You can configure Segment’s client-side SDKs for JavaScript, iOS, Android, and React Native sources to send data to a regional host after you’ve updated the Data Ingestion Region in that source’s settings.

All regions are configured on a per-source basis. Configure the region for each source separately if you don't want to use the default region.

> info ""
> Dublin is the only Data Ingestion Region available for EU workspaces, and it's selected by default.

All Segment client-side SDKs read this setting and automatically update themselves to send data to the new endpoint when the app reloads. You don't need to change any code when you switch regions.
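
If you load analytics.js yourself and want to pin the regional host in code (for example, to confirm routing in a test build), you can pass integration options for `Segment.io` when you call `analytics.load`. The snippet below is a hedged sketch: the write key is a placeholder, and the `apiHost` value should match the endpoint shown in your source's Regional Settings.

```ts
// Optional sketch: pin the ingestion host when loading analytics.js manually.
// The write key is a placeholder; apiHost should match your source's Regional Settings.
declare const analytics: any; // provided by the Segment snippet on the page

analytics.load("YOUR_WRITE_KEY", {
  integrations: {
    "Segment.io": {
      apiHost: "events.eu1.segmentapis.com/v1",
      protocol: "https",
    },
  },
});
```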

### Server-side and project sources
When you send data from a server-side or project source, use the `host` configuration parameter to send data to the desired region. For EU workspaces this is Dublin: `in.eu2.segmentapis.com/v1`.
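
For example, with Segment's Node.js library the host is a constructor option. This is a minimal sketch, assuming the `analytics-node` package; the write key is a placeholder, and whether the library appends the `/v1` path itself depends on the library, so check its documentation for the exact host format.

```ts
// Minimal sketch: point a server-side library at the EU ingestion host.
// YOUR_WRITE_KEY is a placeholder; confirm the exact host format for your library version.
import Analytics from "analytics-node";

const analytics = new Analytics("YOUR_WRITE_KEY", {
  host: "https://in.eu2.segmentapis.com", // EU (Dublin) ingestion host from this page
});

analytics.track({
  userId: "user-123",
  event: "Order Completed",
  properties: { revenue: 42 },
});
```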

## Regional Data Storage
[Regional Data Storage](/docs/connections/data-residency/#regional-data-storage) isn't supported in EU workspaces.

## EU Updates
### Data Lakes
Regional Segment in the EU changes the way you [configure the Data Lakes AWS environment](/docs/connections/storage/data-lakes/data-lakes-manual-setup/#iam-role).

### Warehouse Public IP Range
Allow Segment's custom CIDR `3.251.148.96/29` when you authorize Segment to write to your Redshift or Postgres port. [BigQuery](/docs/connections/storage/catalog/bigquery/#getting-started) doesn't require you to allow a custom IP address.
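
If your warehouse runs in AWS and you manage network access with the AWS CDK, an ingress rule for that CIDR might look like the sketch below. The stack name, VPC lookup, and port are placeholders (5439 is Redshift's default; use 5432 for Postgres), and this is only one of several ways to authorize the range.

```ts
// Sketch: allow Segment's EU CIDR to reach a warehouse port via an EC2 security group (AWS CDK v2).
import { Stack, StackProps } from "aws-cdk-lib";
import * as ec2 from "aws-cdk-lib/aws-ec2";
import { Construct } from "constructs";

export class SegmentWarehouseAccessStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Placeholder VPC lookup; replace with the VPC that hosts your warehouse.
    const vpc = ec2.Vpc.fromLookup(this, "WarehouseVpc", { vpcId: "vpc-0123456789abcdef0" });

    const sg = new ec2.SecurityGroup(this, "SegmentIngress", {
      vpc,
      description: "Allow Segment's EU egress range to reach the warehouse",
    });

    // 5439 is Redshift's default port; use 5432 for Postgres.
    sg.addIngressRule(ec2.Peer.ipv4("3.251.148.96/29"), ec2.Port.tcp(5439), "Segment EU CIDR");
  }
}
```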