articles/data-factory/connector-azure-data-lake-storage.md (+3 −3)

@@ -9,7 +9,7 @@ ms.reviewer: douglasl
 ms.service: data-factory
 ms.workload: data-services
 ms.topic: conceptual
-ms.date: 05/24/2019
+ms.date: 06/10/2019
 ms.author: jingwang
 
 ---
@@ -105,7 +105,7 @@ To use service principal authentication, follow these steps:
 - **As sink**, in Storage Explorer, grant at least **Write + Execute** permission to create child items in the folder. Alternatively, in Access control (IAM), grant at least the **Storage Blob Data Contributor** role.
 
 >[!NOTE]
->To list folders starting from the account level or to test the connection, the service principal must be granted **"Execute" permission on the storage account in IAM**. This is true when you use the:
+>To list folders starting from the account level or to test the connection, the service principal must be granted **"Storage Blob Data Reader" permission on the storage account in IAM**. This is true when you use the:
 >- **Copy Data Tool** to author a copy pipeline.
 >- **Data Factory UI** to test the connection and navigate folders during authoring.
 >If you have concerns about granting permission at the account level, you can skip the test connection and enter the path manually during authoring. The copy activity will still work as long as the service principal is granted the proper permission on the files to be copied.
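For readers following along, a linked service that authenticates with the service principal discussed in this hunk might look like the sketch below. This is not taken from the diff: the `AzureBlobFS` type and property names follow the ADLS Gen2 connector's documented pattern, and all angle-bracket values are placeholders.

```json
{
    "name": "AzureDataLakeStorageGen2LinkedService",
    "properties": {
        "type": "AzureBlobFS",
        "typeProperties": {
            "url": "https://<accountname>.dfs.core.windows.net",
            "servicePrincipalId": "<service principal id>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<service principal key>"
            },
            "tenant": "<tenant name or tenant id>"
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```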
@@ -159,7 +159,7 @@ To use managed identities for Azure resources authentication, follow these steps:
 - **As sink**, in Storage Explorer, grant at least **Write + Execute** permission to create child items in the folder. Alternatively, in Access control (IAM), grant at least the **Storage Blob Data Contributor** role.
 
 >[!NOTE]
->To list folders starting from the account level or to test the connection, the managed identity must be granted **"Execute" permission on the storage account in IAM**. This is true when you use the:
+>To list folders starting from the account level or to test the connection, the managed identity must be granted **"Storage Blob Data Reader" permission on the storage account in IAM**. This is true when you use the:
 >- **Copy Data Tool** to author a copy pipeline.
 >- **Data Factory UI** to test the connection and navigate folders during authoring.
 >If you have concerns about granting permission at the account level, you can skip the test connection and enter the path manually during authoring. The copy activity will still work as long as the managed identity is granted the proper permission on the files to be copied.
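By contrast, a managed identity linked service stores no credential at all; when none is supplied, Data Factory authenticates with its own identity. A minimal sketch, assuming the same `AzureBlobFS` connector type as above:

```json
{
    "name": "AzureDataLakeStorageGen2LinkedService",
    "properties": {
        "type": "AzureBlobFS",
        "typeProperties": {
            "url": "https://<accountname>.dfs.core.windows.net"
        }
    }
}
```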
articles/data-factory/connector-azure-sql-database-managed-insance.md (+3 −31)

@@ -12,7 +12,7 @@ ms.workload: data-services
 ms.tgt_pltfrm: na
 
 ms.topic: conceptual
-ms.date: 04/08/2019
+ms.date: 06/10/2019
 ms.author: jingwang
 
 ---
@@ -26,7 +26,7 @@ You can copy data from Azure SQL Database Managed Instance to any supported sink
 
 Specifically, this Azure SQL Database Managed Instance connector supports:
 
-- Copying data by using SQL or Windows authentication.
+- Copying data by using SQL authentication.
 - As a source, retrieving data by using a SQL query or stored procedure.
 - As a sink, appending data to a destination table or invoking a stored procedure with custom logic during copy.
 
@@ -51,9 +51,7 @@ The following properties are supported for the Azure SQL Database Managed Instance linked service:
 | Property | Description | Required |
 |:--- |:--- |:--- |
 | type | The type property must be set to **SqlServer**. | Yes |
-| connectionString | This property specifies the connectionString information needed to connect to the managed instance by using either SQL authentication or Windows authentication. For more information, see the following examples. <br/>Mark this field as a SecureString to store it securely in Data Factory. You can also put the password in Azure Key Vault and, if it's SQL authentication, pull the `password` configuration out of the connection string. See the JSON example below the table and the [Store credentials in Azure Key Vault](store-credentials-in-key-vault.md) article for more details. | Yes |
-| userName | This property specifies a user name if you use Windows authentication. An example is **domainname\\username**. | No |
-| password | This property specifies a password for the user account you specified for the user name. Select **SecureString** to store the connectionString information securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | No |
+| connectionString | This property specifies the connectionString information needed to connect to the managed instance by using SQL authentication. For more information, see the following examples. <br/>Mark this field as a SecureString to store it securely in Data Factory. You can also put the password in Azure Key Vault and pull the `password` configuration out of the connection string. See the JSON example below the table and the [Store credentials in Azure Key Vault](store-credentials-in-key-vault.md) article for more details. | Yes |
 | connectVia | This [integration runtime](concepts-integration-runtime.md) is used to connect to the data store. Provision the self-hosted integration runtime in the same virtual network as your managed instance. | Yes |
 
 >[!TIP]
@@ -109,32 +107,6 @@ The following properties are supported for the Azure SQL Database Managed Instance linked service:
 }
 ```
 
-**Example 3: Use Windows authentication**
-
-```json
-{
-    "name": "AzureSqlMILinkedService",
-    "properties": {
-        "type": "SqlServer",
-        "typeProperties": {
-            "connectionString": {
-                "type": "SecureString",
-                "value": "Data Source=<servername>\\<instance name if using named instance>;Initial Catalog=<databasename>;Integrated Security=True;"
-            },
-            "userName": "<domain\\username>",
-            "password": {
-                "type": "SecureString",
-                "value": "<password>"
-            }
-        },
-        "connectVia": {
-            "referenceName": "<name of Integration Runtime>",
-            "type": "IntegrationRuntimeReference"
-        }
-    }
-}
-```
-
 ## Dataset properties
 
 For a full list of sections and properties available for use to define datasets, see the datasets article. This section provides a list of properties supported by the Azure SQL Database Managed Instance dataset.
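With the Windows authentication example removed above, SQL authentication is the remaining pattern for this connector. A minimal linked service along those lines might look like the following sketch; it is modeled on the removed example but with SQL credentials in the connection string, and every angle-bracket value is a placeholder, not a quote from the article's surviving examples.

```json
{
    "name": "AzureSqlMILinkedService",
    "properties": {
        "type": "SqlServer",
        "typeProperties": {
            "connectionString": {
                "type": "SecureString",
                "value": "Data Source=<servername>\\<instance name if using named instance>;Initial Catalog=<databasename>;Integrated Security=False;User ID=<username>;Password=<password>;"
            }
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```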
articles/data-factory/connector-salesforce-marketing-cloud.md (+2 −2)

@@ -12,7 +12,7 @@ ms.workload: data-services
 ms.tgt_pltfrm: na
 
 ms.topic: conceptual
-ms.date: 01/09/2019
+ms.date: 06/10/2019
 ms.author: jingwang
 
 ---
@@ -27,7 +27,7 @@ This article outlines how to use the Copy Activity in Azure Data Factory to copy
 
 You can copy data from Salesforce Marketing Cloud to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
 
-Azure Data Factory provides a built-in driver to enable connectivity, so you don't need to manually install any driver to use this connector.
+The Salesforce Marketing Cloud connector supports OAuth 2 authentication. It is built on top of the [Salesforce Marketing Cloud REST API](https://developer.salesforce.com/docs/atlas.en-us.mc-apis.meta/mc-apis/index-api.htm).
 
 >[!NOTE]
 >This connector doesn't support retrieving custom objects or custom data extensions.
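The OAuth 2 support called out in this change surfaces in the linked service as a client ID/secret pair. A sketch follows, with property names assumed from the connector's documented pattern rather than taken from this diff:

```json
{
    "name": "SalesforceMarketingCloudLinkedService",
    "properties": {
        "type": "SalesforceMarketingCloud",
        "typeProperties": {
            "clientId": "<client id>",
            "clientSecret": {
                "type": "SecureString",
                "value": "<client secret>"
            }
        }
    }
}
```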
articles/data-factory/connector-sap-table.md (+21 −3)

@@ -12,7 +12,7 @@ ms.workload: data-services
 ms.tgt_pltfrm: na
 
 ms.topic: conceptual
-ms.date: 05/24/2018
+ms.date: 06/10/2018
 ms.author: jingwang
 
 ---
@@ -26,7 +26,13 @@ You can copy data from SAP Table to any supported sink data store. For a list of
 
 Specifically, this SAP Table connector supports:
 
-- Copying data from SAP Table in **SAP Business Suite with version 7.01 or higher** (in a recent SAP Support Package Stack released after the year 2015) or **S/4HANA**.
+- Copying data from SAP Table in:
+
+  - **SAP ECC** with version 7.01 or higher (in a recent SAP Support Package Stack released after the year 2015)
+  - **SAP BW** with version 7.01 or higher
+  - **SAP S/4HANA**
+  - **Other products in SAP Business Suite** with version 7.01 or higher
+
 - Copying data from both **SAP Transparent Table** and **View**.
 - Copying data using **basic authentication** or **SNC** (Secure Network Communications) if SNC is configured.
 - Connecting to **Application Server** or **Message Server**.
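For context, connecting to an Application Server with basic authentication typically looks like the sketch below. The `SapTable` type and property names are assumed from the connector's linked-service pattern, not quoted from this diff; Message Server connections use a different set of properties (message server host, system ID, logon group).

```json
{
    "name": "SapTableLinkedService",
    "properties": {
        "type": "SapTable",
        "typeProperties": {
            "server": "<server name>",
            "systemNumber": "<system number, e.g. 00>",
            "clientId": "<client id, e.g. 100>",
            "userName": "<SAP user name>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```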
@@ -200,7 +206,7 @@ To copy data from SAP Table, the following properties are supported.
 | type | The type property must be set to **SapTableSource**. | Yes |
 | rowCount | Number of rows to be retrieved. | No |
 | rfcTableFields | Fields to copy from the SAP table. For example, `column0, column1`. | No |
-| rfcTableOptions | Options to filter the rows in SAP Table. For example, `COLUMN0 EQ 'SOMEVALUE'`. | No |
+| rfcTableOptions | Options to filter the rows in SAP Table. For example, `COLUMN0 EQ 'SOMEVALUE'`. For more details, see the description below this table. | No |
 | customRfcReadTableFunctionModule | Custom RFC function module that can be used to read data from SAP Table. | No |
 | partitionOption | The partition mechanism to read from an SAP table. The supported options include: <br/>- **None**<br/>- **PartitionOnInt** (normal integer or integer values with zero padding on the left, such as 0000012345)<br/>- **PartitionOnCalendarYear** (4 digits in the format "YYYY")<br/>- **PartitionOnCalendarMonth** (6 digits in the format "YYYYMM")<br/>- **PartitionOnCalendarDate** (8 digits in the format "YYYYMMDD") | No |
 | partitionColumnName | The name of the column used to partition the data. | No |
@@ -213,6 +219,18 @@ To copy data from SAP Table, the following properties are supported.
 >- Taking `partitionOption` as `partitionOnInt` as an example, the number of rows in each partition is calculated by (total rows falling between *partitionUpperBound* and *partitionLowerBound*)/*maxPartitionsNumber*.
 >- If you want to further run partitions in parallel to speed up the copy, it is strongly recommended to make `maxPartitionsNumber` a multiple of the value of `parallelCopies` (learn more from [Parallel Copy](copy-activity-performance.md#parallel-copy)).
 
+In `rfcTableOptions`, you can use, for example, the following common SAP query operators to filter the rows:
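To tie the source properties above together, a copy activity source that combines a row filter with calendar-year partitioning might look like this fragment. It is a sketch based on the property table in this hunk: the column name `GJAHR`, the bounds, and the filter expression are purely illustrative.

```json
"source": {
    "type": "SapTableSource",
    "rfcTableOptions": "COLUMN0 EQ 'SOMEVALUE'",
    "partitionOption": "PartitionOnCalendarYear",
    "partitionColumnName": "GJAHR",
    "partitionLowerBound": "2010",
    "partitionUpperBound": "2019",
    "maxPartitionsNumber": 10
}
```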
0 commit comments