articles/data-factory/connector-azure-sql-data-warehouse.md (15 additions, 12 deletions)
@@ -12,7 +12,7 @@ ms.workload: data-services
ms.tgt_pltfrm: na
ms.topic: conceptual
-ms.date: 04/16/2019
+ms.date: 04/19/2019
ms.author: jingwang
---
@@ -396,22 +396,29 @@ Learn more about how to use PolyBase to efficiently load SQL Data Warehouse in t
Using [PolyBase](https://docs.microsoft.com/sql/relational-databases/polybase/polybase-guide) is an efficient way to load a large amount of data into Azure SQL Data Warehouse with high throughput. You'll see a large gain in the throughput by using PolyBase instead of the default BULKINSERT mechanism. See [Performance reference](copy-activity-performance.md#performance-reference) for a detailed comparison. For a walkthrough with a use case, see [Load 1 TB into Azure SQL Data Warehouse](https://docs.microsoft.com/azure/data-factory/v1/data-factory-load-sql-data-warehouse).
-* If your source data is in Azure Blob storage or Azure Data Lake Store, and the format is compatible with PolyBase, copy direct to Azure SQL Data Warehouse by using PolyBase. For details, see **[Direct copy by using PolyBase](#direct-copy-by-using-polybase)**.
+* If your source data is in **Azure Blob, Azure Data Lake Storage Gen1, or Azure Data Lake Storage Gen2**, and the **format is PolyBase compatible**, you can use copy activity to directly invoke PolyBase to let Azure SQL Data Warehouse pull the data from the source. For details, see **[Direct copy by using PolyBase](#direct-copy-by-using-polybase)**.
* If your source data store and format aren't natively supported by PolyBase, use the **[Staged copy by using PolyBase](#staged-copy-by-using-polybase)** feature instead. The staged copy feature also provides better throughput: it automatically converts the data into a PolyBase-compatible format, stores it in Azure Blob storage, and then loads it into SQL Data Warehouse.
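To make the two options above concrete, the following is a minimal sketch of a copy activity that invokes PolyBase through the sink settings. The activity, dataset, and linked service names (`CopyToSqlDw`, `BlobSourceDataset`, `SqlDwDataset`, `StagingBlobLinkedService`) are placeholders, and the reject settings are illustrative values rather than recommendations:

```json
{
    "name": "CopyToSqlDw",
    "type": "Copy",
    "inputs": [ { "referenceName": "BlobSourceDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "SqlDwDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": { "type": "BlobSource" },
        "sink": {
            "type": "SqlDWSink",
            "allowPolyBase": true,
            "polyBaseSettings": {
                "rejectType": "percentage",
                "rejectValue": 10.0,
                "rejectSampleValue": 100,
                "useTypeDefault": true
            }
        },
        "enableStaging": true,
        "stagingSettings": {
            "linkedServiceName": { "referenceName": "StagingBlobLinkedService", "type": "LinkedServiceReference" },
            "path": "stagingcontainer/path"
        }
    }
}
```

For direct copy, omit `enableStaging` and `stagingSettings`; for staged copy, keep them so the data is converted and staged in Azure Blob storage before PolyBase loads it into SQL Data Warehouse.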
### Direct copy by using PolyBase
-SQL Data Warehouse PolyBase directly supports Azure Blob and Azure Data Lake Store. It uses service principal as a source and has specific file format requirements. If your source data meets the criteria described in this section, use PolyBase to copy direct from the source data store to Azure SQL Data Warehouse. Otherwise, use [Staged copy by using PolyBase](#staged-copy-by-using-polybase).
+SQL Data Warehouse PolyBase directly supports Azure Blob, Azure Data Lake Storage Gen1, and Azure Data Lake Storage Gen2. If your source data meets the criteria described in this section, use PolyBase to copy directly from the source data store to Azure SQL Data Warehouse. Otherwise, use [Staged copy by using PolyBase](#staged-copy-by-using-polybase).
> [!TIP]
-> To copy data efficiently from Data Lake Store to SQL Data Warehouse, learn more from [Azure Data Factory makes it even easier and convenient to uncover insights from data when using Data Lake Store with SQL Data Warehouse](https://blogs.msdn.microsoft.com/azuredatalake/2017/04/08/azure-data-factory-makes-it-even-easier-and-convenient-to-uncover-insights-from-data-when-using-data-lake-store-with-sql-data-warehouse/).
+> To copy data efficiently to SQL Data Warehouse, learn more from [Azure Data Factory makes it even easier and convenient to uncover insights from data when using Data Lake Store with SQL Data Warehouse](https://blogs.msdn.microsoft.com/azuredatalake/2017/04/08/azure-data-factory-makes-it-even-easier-and-convenient-to-uncover-insights-from-data-when-using-data-lake-store-with-sql-data-warehouse/).
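For illustration, a source that qualifies for direct copy can be as simple as an Azure Blob storage linked service with account key authentication, sketched below. The linked service name and the connection string values are placeholders:

```json
{
    "name": "SourceBlobLinkedService",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            "connectionString": {
                "type": "SecureString",
                "value": "DefaultEndpointsProtocol=https;AccountName=<accountname>;AccountKey=<accountkey>;EndpointSuffix=core.windows.net"
            }
        }
    }
}
```

The criteria that such a source and its dataset must meet are listed next.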
If the requirements aren't met, Azure Data Factory checks the settings and automatically falls back to the BULKINSERT mechanism for the data movement.
-1. The **Source linked service** type is Azure Blob storage (**AzureBlobStorage**/**AzureStorage**) with **account key authentication** or Azure Data Lake Storage Gen1 (**AzureDataLakeStore**) with **service principal authentication**.
-2. The **input dataset** type is **AzureBlob** or **AzureDataLakeStoreFile**. The format type under `type` properties is **OrcFormat**, **ParquetFormat**, or **TextFormat**, with the following configurations:
+1. The **source linked service** uses one of the following types and authentication methods:
-   1. `fileName` doesn't contain wildcard filter.
+| Supported source data store type | Supported source authentication type |