1. In the **Storage account** window, select **Access keys**.
1. In the **Storage account name** and **key1** boxes, copy the values, and then paste them into Notepad or another editor for later use in the tutorial.
#### Create the adftutorial container

In this section, you create a blob container named **adftutorial** in your Blob storage.
1. In the **Storage account** window, switch to **Overview**, and then select **Blobs**.
1. Keep the **Container** window for **adftutorial** open. You use it to verify the output at the end of the tutorial. Data Factory automatically creates the output folder in this container, so you don't need to create one.
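If you prefer to script the storage steps, the following Azure PowerShell sketch retrieves the key and creates the container. It assumes the Az module is installed, and the resource group and storage account names are placeholders for your own.

```powershell
# Sign in to Azure (opens a browser prompt).
Connect-AzAccount

# Placeholder names: replace with your resource group and storage account.
$resourceGroup  = "yourResourceGroup"
$storageAccount = "yourstorageaccount"

# Retrieve key1 for the storage account.
$key = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroup -Name $storageAccount)[0].Value

# Create the adftutorial container. Data Factory creates the output folder for you later.
$context = New-AzStorageContext -StorageAccountName $storageAccount -StorageAccountKey $key
New-AzStorageContainer -Name "adftutorial" -Context $context
```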
## Create a data factory
1. On the menu on the left, select **+ Create a resource** > **Analytics** > **Data Factory**.
1. On the **New data factory** page, under **Name**, enter **ADFTutorialDataFactory**.
The name of the data factory must be *globally unique*. If you see an error message about the name value, enter a different name for the data factory (for example, yournameADFTutorialDataFactory). For naming rules for Data Factory artifacts, see [Data Factory naming rules](naming-rules.md).
1. Select the Azure **subscription** in which you want to create the data factory.
1. For **Resource Group**, take one of the following steps:
- Select **Use existing**, and select an existing resource group from the drop-down list.
- Select **Create new**, and enter the name of a resource group.
To learn about resource groups, see [Use resource groups to manage your Azure resources](../azure-resource-manager/resource-group-overview.md).
1. Under **Version**, select **V2**.
1. Under **Location**, select the location for the data factory. Only locations that are supported are displayed in the drop-down list. The data stores (for example, Azure Storage and SQL Database) and computes (for example, Azure HDInsight) used by Data Factory can be in other locations/regions.
1. Select **Create**.
1. After the creation is finished, you see the **Data Factory** page.
1. Select **Author & Monitor** to launch the Data Factory user interface in a separate tab.
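If you'd rather create the factory from a script, here's a minimal Azure PowerShell sketch. The resource group and location are placeholders, and the factory name must be globally unique, so append your own suffix.

```powershell
# Placeholder values: adjust to your environment.
$resourceGroup   = "yourResourceGroup"
$dataFactoryName = "yournameADFTutorialDataFactory"   # must be globally unique
$location        = "East US"

# Create (or update) a V2 data factory.
Set-AzDataFactoryV2 -ResourceGroupName $resourceGroup -Name $dataFactoryName -Location $location
```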
## Use the Copy Data tool to create a pipeline
1. On the **Let's get started** page, select **Copy Data** to launch the Copy Data tool.
1. On the **Properties** page of the Copy Data tool, under **Task name**, enter **CopyFromOnPremSqlToAzureBlobPipeline**. Then select **Next**. The Copy Data tool creates a pipeline with the name you specify for this field.
1. On the **Source data store** page, select **Create new connection**.
1. Under **New Linked Service**, search for **SQL Server**, and then select **Continue**.
1. In the **New Linked Service (SQL Server)** dialog box, under **Name**, enter **SqlServerLinkedService**. Select **+ New** under **Connect via integration runtime**. You must create a self-hosted integration runtime, download it to your machine, and register it with Data Factory. The self-hosted integration runtime copies data between your on-premises environment and the cloud.
1. In the **Integration Runtime Setup** dialog box, under **Name**, enter **TutorialIntegrationRuntime**. Then select **Next**.
1. In the **Integration Runtime Setup** dialog box, select **Click here to launch the express setup for this computer**. This action installs the integration runtime on your machine and registers it with Data Factory. Alternatively, you can use the manual setup option to download the installation file, run it, and use the key to register the integration runtime.
1. Run the downloaded application. You see the status of the express setup in the window.
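If you choose the manual setup option, creating the runtime and retrieving its authentication key can also be scripted. The following Azure PowerShell sketch assumes the resource group and factory names used earlier in this tutorial.

```powershell
# Create a self-hosted integration runtime in the data factory.
Set-AzDataFactoryV2IntegrationRuntime -ResourceGroupName "yourResourceGroup" `
    -DataFactoryName "yournameADFTutorialDataFactory" `
    -Name "TutorialIntegrationRuntime" `
    -Type SelfHosted

# Retrieve an authentication key; paste it into the installed runtime to register it.
Get-AzDataFactoryV2IntegrationRuntimeKey -ResourceGroupName "yourResourceGroup" `
    -DataFactoryName "yournameADFTutorialDataFactory" `
    -Name "TutorialIntegrationRuntime"
```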
1. In the **New Linked Service (SQL Server)** dialog box, confirm that **TutorialIntegrationRuntime** is selected for the Integration Runtime field. Then, take the following steps:
a. Under **Name**, enter **SqlServerLinkedService**.
b. Under **Server name**, enter the name of your on-premises SQL Server instance.

c. Under **Database name**, enter the name of your on-premises database.

d. Under **Authentication type**, select the type of authentication to use to connect to the database.
e. Under **User name**, enter the name of a user with access to the on-premises SQL Server instance.
f. Enter the **password** for the user. Select **Finish**.
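For reference, the same linked service can also be deployed from a JSON definition file with Azure PowerShell. This is only a sketch: the Copy Data tool generates the definition for you, and the server, database, and credential values below are placeholders.

```powershell
# Sketch: write a SQL Server linked service definition (placeholder connection values).
@'
{
    "name": "SqlServerLinkedService",
    "properties": {
        "type": "SqlServer",
        "typeProperties": {
            "connectionString": "Server=<your server>;Database=<your database>;Integrated Security=False;User ID=<your user>;Password=<your password>;"
        },
        "connectVia": {
            "referenceName": "TutorialIntegrationRuntime",
            "type": "IntegrationRuntimeReference"
        }
    }
}
'@ | Set-Content -Path ".\SqlServerLinkedService.json"

# Deploy the definition to the data factory (tutorial names assumed).
Set-AzDataFactoryV2LinkedService -ResourceGroupName "yourResourceGroup" `
    -DataFactoryName "yournameADFTutorialDataFactory" `
    -Name "SqlServerLinkedService" `
    -DefinitionFile ".\SqlServerLinkedService.json"
```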
1. On the **Source data store** page, select **Next**.
1. On the **Select tables from which to copy the data or use a custom query** page, select the **[dbo].[emp]** table in the list, and select **Next**. You can select any other table based on your database.
1. On the **Destination data store** page, select **Create new connection**.
1. In the **Choose the output file or folder** dialog, under **Folder path**, enter **adftutorial/fromonprem**. You created the **adftutorial** container as part of the prerequisites. If the output folder doesn't exist (in this case, **fromonprem**), Data Factory automatically creates it. You can also use the **Browse** button to browse the blob storage and its containers/folders. If you don't specify a value under **File name**, the name from the source is used by default (in this case, **dbo.emp**).
1. On the **File format settings** dialog, select **Next**.
1. On the **Monitor** tab, you can view the status of the pipeline you created. You can use the links in the **Actions** column to view activity runs associated with the pipeline run and to rerun the pipeline.
1. Select the **View Activity Runs** link in the **Actions** column to see activity runs associated with the pipeline run. To see details about the copy operation, select the **Details** link (eyeglasses icon) in the **Actions** column. To switch back to the **Pipeline Runs** view, select **Pipeline Runs** at the top.
1. Confirm that you see the output file in the **fromonprem** folder of the **adftutorial** container.
1. Select the **Edit** tab on the left to switch to the editor mode. You can update the linked services, datasets, and pipelines created by the tool by using the editor. Select **Code** to view the JSON code associated with the entity opened in the editor. For details on how to edit these entities in the Data Factory UI, see [the Azure portal version of this tutorial](tutorial-copy-data-portal.md).
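You can also verify the run and its output from Azure PowerShell. The sketch below assumes the tutorial's names and a storage context like the `$context` created earlier.

```powershell
# List pipeline runs from the last day (tutorial names assumed).
Get-AzDataFactoryV2PipelineRun -ResourceGroupName "yourResourceGroup" `
    -DataFactoryName "yournameADFTutorialDataFactory" `
    -LastUpdatedAfter (Get-Date).AddDays(-1) `
    -LastUpdatedBefore (Get-Date)

# Confirm that the copied file landed in the fromonprem folder of the adftutorial container.
Get-AzStorageBlob -Container "adftutorial" -Prefix "fromonprem/" -Context $context
```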