1. In the list of containers, select **adftutorial**.
1. On the menu on the left, select **+ Create a resource** > **Analytics** > **Data Factory**.
1. On the **New data factory** page, under **Name**, enter **ADFTutorialDataFactory**.
    The name of the data factory must be *globally unique*. If you see an error message for the name field, change the name of the data factory (for example, **yournameADFTutorialDataFactory**) and try again. For naming rules for Data Factory artifacts, see [Data Factory naming rules](naming-rules.md).
1. Select the Azure **subscription** in which you want to create the data factory.
1. For **Resource Group**, take one of the following steps:
1. After the creation is finished, you see the **Data Factory** page.
1. Select **Author & Monitor** to launch the Data Factory user interface in a separate tab.
## Use the Copy Data tool to create a pipeline
1. On the **Let's get started** page, select **Copy Data** to launch the Copy Data tool.
1. On the **Properties** page of the Copy Data tool, under **Task name**, enter **CopyFromOnPremSqlToAzureBlobPipeline**. Then select **Next**. The Copy Data tool creates a pipeline with the name you specify for this field.
1. On the **Source data store** page, select **Create new connection**.
1. Under **New Linked Service**, search for **SQL Server**, and then select **Continue**.
1. In the **New Linked Service (SQL Server)** dialog box, under **Name**, enter **SqlServerLinkedService**. Select **+New** under **Connect via integration runtime**. You must create a self-hosted integration runtime, download it to your machine, and register it with Data Factory. The self-hosted integration runtime copies data between your on-premises environment and the cloud.
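    If you open this linked service later in the JSON editor (see the **Edit** tab step at the end of this tutorial), its definition resembles the following sketch. The connection values and the integration runtime name are placeholders, not values generated by the tool:

    ```json
    {
        "name": "SqlServerLinkedService",
        "properties": {
            "description": "Sketch only; replace the placeholder values",
            "type": "SqlServer",
            "typeProperties": {
                "connectionString": "Server=<your server>;Database=<your database>;User ID=<user name>;Password=<password>;"
            },
            "connectVia": {
                "referenceName": "<your self-hosted integration runtime>",
                "type": "IntegrationRuntimeReference"
            }
        }
    }
    ```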
1. In the **Integration Runtime Setup** dialog box, select **Click here to launch the express setup for this computer**. This action installs the integration runtime on your machine and registers it with Data Factory. Alternatively, you can use the manual setup option to download the installation file, run it, and use the key to register the integration runtime.
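    The integration runtime itself is also a Data Factory entity. Its JSON definition is small, because the compute is your own machine; a sketch follows, with an illustrative name and description:

    ```json
    {
        "name": "MySelfHostedIR",
        "properties": {
            "type": "SelfHosted",
            "description": "Illustrative self-hosted integration runtime for copying from on-premises SQL Server"
        }
    }
    ```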
1. Run the downloaded application. You see the status of the express setup in the window.
1. On the **Source data store** page, select **Next**.
1. On the **Select tables from which to copy the data or use a custom query** page, select the **[dbo].[emp]** table in the list, and then select **Next**. You can select any other table from your database instead.
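    If you choose a custom query instead of a table, the tool stores the query in the copy activity's source. The following is a sketch of that JSON fragment; the source type name is an assumption and can differ by service version:

    ```json
    "source": {
        "type": "SqlServerSource",
        "sqlReaderQuery": "SELECT * FROM [dbo].[emp]"
    }
    ```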
1. In the **Destination data store** dialog, make sure that **Azure Blob Storage** is selected. Then select **Next**.
1. In the **Choose the output file or folder** dialog, under **Folder path**, enter **adftutorial/fromonprem**. You created the **adftutorial** container as part of the prerequisites. If the output folder (in this case, **fromonprem**) doesn't exist, Data Factory creates it automatically. You can also use the **Browse** button to browse the blob storage and its containers and folders. If you don't specify a value under **File name**, the name from the source (**dbo.emp** in this case) is used by default.
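    The tool records this choice in the sink dataset's JSON. The following is a sketch of what that definition might look like, assuming a delimited-text output; the dataset name, linked service name, and format settings are placeholders:

    ```json
    {
        "name": "SinkDataset",
        "properties": {
            "description": "Sketch only; names and format settings are illustrative",
            "linkedServiceName": {
                "referenceName": "<your blob storage linked service>",
                "type": "LinkedServiceReference"
            },
            "type": "DelimitedText",
            "typeProperties": {
                "location": {
                    "type": "AzureBlobStorageLocation",
                    "container": "adftutorial",
                    "folderPath": "fromonprem"
                },
                "columnDelimiter": ",",
                "firstRowAsHeader": true
            }
        }
    }
    ```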
1. On the **File format settings** dialog, select **Next**.
1. On the **Monitor** tab, you can view the status of the pipeline you created. You can use the links in the **Actions** column to view activity runs associated with the pipeline run and to rerun the pipeline.
1. Select the **View Activity Runs** link in the **Actions** column to see activity runs associated with the pipeline run. To see details about the copy operation, select the **Details** link (eyeglasses icon) in the **Actions** column. To switch back to the **Pipeline Runs** view, select **Pipeline Runs** at the top.
1. Confirm that you see the output file in the **fromonprem** folder of the **adftutorial** container.
1. Select the **Edit** tab on the left to switch to the editor mode. You can update the linked services, datasets, and pipelines created by the tool by using the editor. Select **Code** to view the JSON code associated with the entity opened in the editor. For details on how to edit these entities in the Data Factory UI, see [the Azure portal version of this tutorial](tutorial-copy-data-portal.md).
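    For example, the pipeline the tool generated has roughly the shape of the following JSON. This is a sketch: the activity name, dataset references, and source and sink type names are illustrative and can differ from what the tool emits:

    ```json
    {
        "name": "CopyFromOnPremSqlToAzureBlobPipeline",
        "properties": {
            "activities": [
                {
                    "name": "CopyFromSqlServerToBlob",
                    "description": "Sketch only; entity names are illustrative",
                    "type": "Copy",
                    "inputs": [
                        { "referenceName": "SourceDataset", "type": "DatasetReference" }
                    ],
                    "outputs": [
                        { "referenceName": "SinkDataset", "type": "DatasetReference" }
                    ],
                    "typeProperties": {
                        "source": { "type": "SqlServerSource" },
                        "sink": { "type": "DelimitedTextSink" }
                    }
                }
            ]
        }
    }
    ```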