Commit 64a7ad6

Merge pull request #79151 from linda33wj/master
Address a few ADF doc feedback on connector & quickstart
2 parents: 0524bd8 + 12152e8

5 files changed (+31, -41)

articles/data-factory/connector-azure-data-lake-storage.md (+3, -3)

@@ -9,7 +9,7 @@ ms.reviewer: douglasl
 ms.service: data-factory
 ms.workload: data-services
 ms.topic: conceptual
-ms.date: 05/24/2019
+ms.date: 06/10/2019
 ms.author: jingwang

 ---

@@ -105,7 +105,7 @@ To use service principal authentication, follow these steps:
 - **As sink**, in Storage Explorer, grant at least **Write + Execute** permission to create child items in the folder. Alternatively, in Access control (IAM), grant at least **Storage Blob Data Contributor** role.

 >[!NOTE]
->To list folders starting from the account level or to test connection, you need to set the permission of the service principal being granted to **storage account with "Execute" permission in IAM**. This is true when you use the:
+>To list folders starting from the account level or to test connection, you need to set the permission of the service principal being granted to **storage account with "Storage Blob Data Reader" permission in IAM**. This is true when you use the:
 >- **Copy Data Tool** to author copy pipeline.
 >- **Data Factory UI** to test connection and navigating folders during authoring.
 >If you have concern on granting permission at account level, you can skip test connection and input path manually during authoring. Copy activity will still work as long as the service principal is granted with proper permission at the files to be copied.
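
For orientation, the note above applies to a service-principal-based linked service of roughly the following shape. This is a minimal sketch with placeholder values that assumes the AzureBlobFS linked service type described in the article; the article itself, not this diff, defines the exact schema.

```json
{
    "name": "AzureDataLakeStorageGen2LinkedService",
    "properties": {
        "type": "AzureBlobFS",
        "typeProperties": {
            "url": "https://<accountname>.dfs.core.windows.net",
            "servicePrincipalId": "<service principal id>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<service principal key>"
            },
            "tenant": "<tenant info, e.g. microsoft.onmicrosoft.com>"
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```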

@@ -159,7 +159,7 @@ To use managed identities for Azure resources authentication, follow these steps
 - **As sink**, in Storage Explorer, grant at least **Write + Execute** permission to create child items in the folder. Alternatively, in Access control (IAM), grant at least **Storage Blob Data Contributor** role.

 >[!NOTE]
->To list folders starting from the account level or to test connection, you need to set the permission of the managed identity being granted to **storage account with "Execute" permission in IAM**. This is true when you use the:
+>To list folders starting from the account level or to test connection, you need to set the permission of the managed identity being granted to **storage account with "Storage Blob Data Reader" permission in IAM**. This is true when you use the:
 >- **Copy Data Tool** to author copy pipeline.
 >- **Data Factory UI** to test connection and navigating folders during authoring.
 >If you have concern on granting permission at account level, you can skip test connection and input path manually during authoring. Copy activity will still work as long as the managed identity is granted with proper permission at the files to be copied.
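
The managed identity note applies to the variant that carries no explicit credential, so Data Factory authenticates with its own managed identity. A minimal sketch under the same assumptions, with placeholder values:

```json
{
    "name": "AzureDataLakeStorageGen2LinkedService",
    "properties": {
        "type": "AzureBlobFS",
        "typeProperties": {
            "url": "https://<accountname>.dfs.core.windows.net"
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```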

articles/data-factory/connector-azure-sql-database-managed-insance.md (+3, -31)

@@ -12,7 +12,7 @@ ms.workload: data-services
 ms.tgt_pltfrm: na

 ms.topic: conceptual
-ms.date: 04/08/2019
+ms.date: 06/10/2019
 ms.author: jingwang

 ---

@@ -26,7 +26,7 @@ You can copy data from Azure SQL Database Managed Instance to any supported sink

 Specifically, this Azure SQL Database Managed Instance connector supports:

-- Copying data by using SQL or Windows authentication.
+- Copying data by using SQL authentication.
 - As a source, retrieving data by using a SQL query or stored procedure.
 - As a sink, appending data to a destination table or invoking a stored procedure with custom logic during copy.


@@ -51,9 +51,7 @@ The following properties are supported for the Azure SQL Database Managed Instan
 | Property | Description | Required |
 |:--- |:--- |:--- |
 | type | The type property must be set to **SqlServer**. | Yes. |
-| connectionString |This property specifies the connectionString information that's needed to connect to the managed instance by using either SQL authentication or Windows authentication. For more information, see the following examples. <br/>Mark this field as a SecureString to store it securely in Data Factory. You can also put password in Azure Key Vault,and if it's SQL authentication pull the `password` configuration out of the connection string. See the JSON example below the table and [Store credentials in Azure Key Vault](store-credentials-in-key-vault.md) article with more details. |Yes. |
-| userName |This property specifies a user name if you use Windows authentication. An example is **domainname\\username**. |No. |
-| password |This property specifies a password for the user account you specified for the user name. Select **SecureString** to store the connectionString information securely in Data Factory, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). |No. |
+| connectionString |This property specifies the connectionString information that's needed to connect to the managed instance by using SQL authentication. For more information, see the following examples. <br/>Mark this field as a SecureString to store it securely in Data Factory. You can also put password in Azure Key Vault,and if it's SQL authentication pull the `password` configuration out of the connection string. See the JSON example below the table and [Store credentials in Azure Key Vault](store-credentials-in-key-vault.md) article with more details. |Yes. |
 | connectVia | This [integration runtime](concepts-integration-runtime.md) is used to connect to the data store. Provision the self-hosted integration runtime in the same virtual network as your managed instance. |Yes. |

 >[!TIP]
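
For reference, the SQL-authentication pattern that the revised table points to looks roughly like the sketch below, with placeholder values. The article's own retained example, which this diff does not show, is the authoritative version.

```json
{
    "name": "AzureSqlMILinkedService",
    "properties": {
        "type": "SqlServer",
        "typeProperties": {
            "connectionString": {
                "type": "SecureString",
                "value": "Data Source=<servername>;Initial Catalog=<databasename>;Integrated Security=False;User ID=<username>;Password=<password>;"
            }
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```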

@@ -109,32 +107,6 @@ The following properties are supported for the Azure SQL Database Managed Instan
 }
 ```

-**Example 3: Use Windows authentication**
-
-```json
-{
-    "name": "AzureSqlMILinkedService",
-    "properties": {
-        "type": "SqlServer",
-        "typeProperties": {
-            "connectionString": {
-                "type": "SecureString",
-                "value": "Data Source=<servername>\\<instance name if using named instance>;Initial Catalog=<databasename>;Integrated Security=True;"
-            },
-            "userName": "<domain\\username>",
-            "password": {
-                "type": "SecureString",
-                "value": "<password>"
-            }
-        },
-        "connectVia": {
-            "referenceName": "<name of Integration Runtime>",
-            "type": "IntegrationRuntimeReference"
-        }
-    }
-}
-```
-
 ## Dataset properties

 For a full list of sections and properties available for use to define datasets, see the datasets article. This section provides a list of properties supported by the Azure SQL Database Managed Instance dataset.
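
The connectionString row also mentions putting the password in Azure Key Vault and pulling the `password` configuration out of the connection string. A hedged sketch of that variant, assuming an existing Azure Key Vault linked service and placeholder names; the article's own examples, not shown in this diff, are authoritative:

```json
{
    "name": "AzureSqlMILinkedService",
    "properties": {
        "type": "SqlServer",
        "typeProperties": {
            "connectionString": {
                "type": "SecureString",
                "value": "Data Source=<servername>;Initial Catalog=<databasename>;Integrated Security=False;User ID=<username>;"
            },
            "password": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "<Azure Key Vault linked service name>",
                    "type": "LinkedServiceReference"
                },
                "secretName": "<name of the secret that holds the password>"
            }
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```

Keeping the secret out of the connection string lets you rotate it in Key Vault without editing the linked service definition.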

articles/data-factory/connector-salesforce-marketing-cloud.md (+2, -2)

@@ -12,7 +12,7 @@ ms.workload: data-services
 ms.tgt_pltfrm: na

 ms.topic: conceptual
-ms.date: 01/09/2019
+ms.date: 06/10/2019
 ms.author: jingwang

 ---

@@ -27,7 +27,7 @@ This article outlines how to use the Copy Activity in Azure Data Factory to copy

 You can copy data from Salesforce Marketing Cloud to any supported sink data store. For a list of data stores that are supported as sources/sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.

-Azure Data Factory provides a built-in driver to enable connectivity, therefore you don't need to manually install any driver using this connector.
+The Salesforce Marketing Cloud connector supports OAuth 2 authentication. It is built on top of the [Salesforce Marketing Cloud REST API](https://developer.salesforce.com/docs/atlas.en-us.mc-apis.meta/mc-apis/index-api.htm).

 >[!NOTE]
 >This connector doesn't support retrieving custom objects or custom data extensions.
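
Since the changed line now leads with OAuth 2 authentication, here is a rough sketch of what a linked service for this connector could look like. The property names (clientId, clientSecret) are assumptions for illustration; the connector article defines the real schema.

```json
{
    "name": "SalesforceMarketingCloudLinkedService",
    "properties": {
        "type": "SalesforceMarketingCloud",
        "typeProperties": {
            "clientId": "<client id>",
            "clientSecret": {
                "type": "SecureString",
                "value": "<client secret>"
            }
        }
    }
}
```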

articles/data-factory/connector-sap-table.md (+21, -3)

@@ -12,7 +12,7 @@ ms.workload: data-services
 ms.tgt_pltfrm: na

 ms.topic: conceptual
-ms.date: 05/24/2018
+ms.date: 06/10/2018
 ms.author: jingwang

 ---

@@ -26,7 +26,13 @@ You can copy data from SAP Table to any supported sink data store. For a list of

 Specifically, this SAP Table connector supports:

-- Copying data from SAP Table in **SAP Business Suite with version 7.01 or higher** (in a recent SAP Support Package Stack released after the year 2015) or **S/4HANA**.
+- Copying data from SAP Table in:
+
+  - **SAP ECC** with version 7.01 or higher (in a recent SAP Support Package Stack released after the year 2015)
+  - **SAP BW** with version 7.01 or higher
+  - **SAP S/4HANA**
+  - **Other products in SAP Business Suite** with version 7.01 or higher
+
 - Copying data from both **SAP Transparent Table** and **View**.
 - Copying data using **basic authentication** or **SNC** (Secure Network Communications) if SNC is configured.
 - Connecting to **Application Server** or **Message Server**.
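
The expanded list above covers which SAP products the connector reaches; the connection itself is defined by an SAP Table linked service. A rough, hedged sketch with basic authentication follows; every property name here is an assumption for illustration only, and the connector article defines the actual schema.

```json
{
    "name": "SapTableLinkedService",
    "properties": {
        "type": "SapTable",
        "typeProperties": {
            "server": "<server name>",
            "systemNumber": "<system number, for example 00>",
            "clientId": "<client id, for example 100>",
            "userName": "<SAP user name>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```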

@@ -200,7 +206,7 @@ To copy data from SAP Table, the following properties are supported.
 | type | The type property must be set to **SapTableSource**. | Yes |
 | rowCount | Number of rows to be retrieved. | No |
 | rfcTableFields | Fields to copy from the SAP table. For example, `column0, column1`. | No |
-| rfcTableOptions | Options to filter the rows in SAP Table. For example, `COLUMN0 EQ 'SOMEVALUE'`. | No |
+| rfcTableOptions | Options to filter the rows in SAP Table. For example, `COLUMN0 EQ 'SOMEVALUE'`. See more description below this table. | No |
 | customRfcReadTableFunctionModule | Custom RFC function module that can be used to read data from SAP Table. | No |
 | partitionOption | The partition mechanism to read from SAP table. The supported options include: <br/>- **None**<br/>- **PartitionOnInt** (normal integer or integer values with zero padding on the left, such as 0000012345)<br/>- **PartitionOnCalendarYear** (4 digits in format "YYYY")<br/>- **PartitionOnCalendarMonth** (6 digits in format "YYYYMM")<br/>- **PartitionOnCalendarDate** (8 digits in format "YYYYMMDD") | No |
 | partitionColumnName | The name of the column to partition the data. | No |

@@ -213,6 +219,18 @@ To copy data from SAP Table, the following properties are supported.
 >- Taking `partitionOption` as `partitionOnInt` as an example, the number of rows in each partition is calculated by (total rows falling between *partitionUpperBound* and *partitionLowerBound*)/*maxPartitionsNumber*.<br/>
 >- If you want to further run partitions in parallel to speed up copy, it is strongly recommended to make `maxPartitionsNumber` as a multiple of the value of `parallelCopies` (learn more from [Parallel Copy](copy-activity-performance.md#parallel-copy)).

+In `rfcTableOptions`, you can use e.g. the following common SAP query operators to filter the rows:
+
+| Operator | Description |
+| :------- | :------- |
+| EQ | Equal to |
+| NE | Not equal to |
+| LT | Less than |
+| LE | Less than or equal to |
+| GT | Greater than |
+| GE | Greater than or equal to |
+| LIKE | As in LIKE 'Emma%' |
+
 **Example:**

 ```json
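
The article's own example continues after the fence above and is not captured in this diff. As a rough illustration of where `rfcTableOptions` and the partition settings sit, here is a hedged sketch of a copy activity that uses the operators from the new table; the names, dataset references, field names, and sink type are placeholders, not the article's example.

```json
{
    "name": "CopyFromSapTable",
    "type": "Copy",
    "inputs": [
        {
            "referenceName": "<SAP table input dataset name>",
            "type": "DatasetReference"
        }
    ],
    "outputs": [
        {
            "referenceName": "<output dataset name>",
            "type": "DatasetReference"
        }
    ],
    "typeProperties": {
        "parallelCopies": 5,
        "source": {
            "type": "SapTableSource",
            "rfcTableOptions": "COLUMN0 EQ 'SOMEVALUE'",
            "partitionOption": "PartitionOnCalendarDate",
            "partitionColumnName": "<date column, 8 digits in format YYYYMMDD>",
            "partitionLowerBound": "20190101",
            "partitionUpperBound": "20190610",
            "maxPartitionsNumber": 10
        },
        "sink": {
            "type": "<sink type, depends on the destination store>"
        }
    }
}
```

Note that `maxPartitionsNumber` (10) is a multiple of `parallelCopies` (5), matching the recommendation in the note above.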

articles/data-factory/quickstart-create-data-factory-rest-api.md (+2, -2)

@@ -12,7 +12,7 @@ ms.workload: data-services
 ms.tgt_pltfrm:
 ms.devlang: rest-api
 ms.topic: quickstart
-ms.date: 02/20/2019
+ms.date: 06/10/2019
 ms.author: jingwang
 ---
 # Quickstart: Create an Azure data factory and pipeline by using the REST API

@@ -75,7 +75,7 @@ Run the following commands to authenticate with Azure Active Directory (AAD):
 ```powershell
 $AuthContext = [Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext]"https://login.microsoftonline.com/${tenantId}"
 $cred = New-Object -TypeName Microsoft.IdentityModel.Clients.ActiveDirectory.ClientCredential -ArgumentList ($appId, $authKey)
-$result = $AuthContext.AcquireToken("https://management.core.windows.net/", $cred)
+$result = $AuthContext.AcquireTokenAsync("https://management.core.windows.net/", $cred).GetAwaiter().GetResult()
 $authHeader = @{
 'Content-Type'='application/json'
 'Accept'='application/json'
