
Commit 44feb14

Update Tutorials - Copy on-premises data to cloud - Copy Data tool
1 parent 7915940 commit 44feb14

10 files changed: +13 -41 lines

articles/data-factory/tutorial-hybrid-copy-data-tool.md

Lines changed: 13 additions & 41 deletions
@@ -77,7 +77,7 @@ You use the name and key of your storage account in this tutorial. To get the na
1. In the left pane, select **All services**. Filter by using the **Storage** keyword, and then select **Storage accounts**.

-![Storage account search](media/tutorial-hybrid-copy-powershell/search-storage-account.png)
+![Storage account search](media/doc-common-process/search-storage-account.png)

1. In the list of storage accounts, filter for your storage account, if needed. Then select your storage account.
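The step in this hunk locates the storage account whose name and key the tutorial uses later. For readers who prefer to script that lookup instead of copying the key from the portal, a minimal sketch with the azure-identity and azure-mgmt-storage packages might look like this; the subscription, resource group, and account names are placeholders.

```python
# Sketch (optional): fetch the storage account key from code instead of the
# portal. Assumes `pip install azure-identity azure-mgmt-storage` and an
# authenticated session (for example, via `az login`). Names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
account_name = "<storage-account-name>"

storage_client = StorageManagementClient(DefaultAzureCredential(), subscription_id)
keys = storage_client.storage_accounts.list_keys(resource_group, account_name)
print(account_name, keys.keys[0].value)  # the name and key used later in the tutorial
```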
@@ -95,8 +95,6 @@ In this section, you create a blob container named **adftutorial** in your Blob
1. In the **New container** window, under **Name**, enter **adftutorial**, and then select **OK**.

-![New container](media/tutorial-hybrid-copy-powershell/new-container-dialog.png)
-
1. In the list of containers, select **adftutorial**.
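This hunk only drops the **New container** screenshot; the step itself creates the **adftutorial** blob container. As an optional alternative to the portal clicks, a small sketch with the azure-storage-blob (v12) package could create the same container; the connection string is a placeholder.

```python
# Sketch (optional): create the **adftutorial** container without the portal.
# Assumes `pip install azure-storage-blob` (v12); the connection string is a
# placeholder copied from the storage account's Access keys blade.
from azure.core.exceptions import ResourceExistsError
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-account-connection-string>")
try:
    service.create_container("adftutorial")  # container name used throughout the tutorial
except ResourceExistsError:
    pass  # the container already exists, which is fine here
```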
@@ -107,13 +105,13 @@ In this section, you create a blob container named **adftutorial** in your Blob
1. On the menu on the left, select **+ Create a resource** > **Analytics** > **Data Factory**.

-![New data factory creation](./media/tutorial-hybrid-copy-data-tool/new-azure-data-factory-menu.png)
+![New data factory creation](./media/doc-common-process/new-azure-data-factory-menu.png)

1. On the **New data factory** page, under **Name**, enter **ADFTutorialDataFactory**.

The name of the data factory must be *globally unique*. If you see the following error message for the name field, change the name of the data factory (for example, yournameADFTutorialDataFactory). For naming rules for Data Factory artifacts, see [Data Factory naming rules](naming-rules.md).

-![New data factory name](./media/tutorial-hybrid-copy-data-tool/name-not-available-error.png)
+![New data factory name](./media/doc-common-process/name-not-available-error.png)
1. Select the Azure **subscription** in which you want to create the data factory.
1. For **Resource Group**, take one of the following steps:
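The steps above create the data factory in the portal and note that the name must be globally unique. A hedged equivalent with the azure-mgmt-datafactory SDK is sketched below; the region and the prefixed factory name are assumptions, not part of the tutorial.

```python
# Sketch (optional): create the data factory with the azure-mgmt-datafactory SDK.
# Assumes `pip install azure-identity azure-mgmt-datafactory`; the factory name
# must still be globally unique, so the prefixed name below is a placeholder.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import Factory

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)
factory = adf_client.factories.create_or_update(
    resource_group,
    "yournameADFTutorialDataFactory",  # replace "yourname" to keep the name globally unique
    Factory(location="eastus"),        # region is an assumption; pick one that suits you
)
print(factory.provisioning_state)
```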

@@ -128,43 +126,35 @@ In this section, you create a blob container named **adftutorial** in your Blob
1. After the creation is finished, you see the **Data Factory** page as shown in the image.

-![Data factory home page](./media/tutorial-hybrid-copy-data-tool/data-factory-home-page.png)
+![Data factory home page](./media/doc-common-process/data-factory-home-page.png)

1. Select **Author & Monitor** to launch the Data Factory user interface in a separate tab.

## Use the Copy Data tool to create a pipeline

1. On the **Let's get started** page, select **Copy Data** to launch the Copy Data tool.

-![Copy Data tool tile](./media/tutorial-hybrid-copy-data-tool/copy-data-tool-tile.png)
+![Get started page](./media/doc-common-process/get-started-page.png)

1. On the **Properties** page of the Copy Data tool, under **Task name**, enter **CopyFromOnPremSqlToAzureBlobPipeline**. Then select **Next**. The Copy Data tool creates a pipeline with the name you specify for this field.
-
-![Task name](./media/tutorial-hybrid-copy-data-tool/properties-page.png)
+![Task name](./media/tutorial-hybrid-copy-data-tool/properties-page.png)

1. On the **Source data store** page, click on **Create new connection**.

-![Create new linked service](./media/tutorial-hybrid-copy-data-tool/create-new-source-data-store.png)

1. Under **New Linked Service**, search for **SQL Server**, and then select **Continue**.

-![SQL Server selection](./media/tutorial-hybrid-copy-data-tool/select-source-data-store.png)
-
1. In the **New Linked Service (SQL Server)** dialog box, under **Name**, enter **SqlServerLinkedService**. Select **+New** under **Connect via integration runtime**. You must create a self-hosted integration runtime, download it to your machine, and register it with Data Factory. The self-hosted integration runtime copies data between your on-premises environment and the cloud.

-![Create self-hosted integration runtime](./media/tutorial-hybrid-copy-data-tool/create-integration-runtime-link.png)

1. In the **Integration Runtime Setup** dialog box, Select **Self-Hosted**. Then select **Next**.

-![](./media/tutorial-hybrid-copy-data-tool/create-integration-runtime-dialog0.png)
+![Create integration runtime](./media/tutorial-hybrid-copy-data-tool/create-integration-runtime-dialog0.png)

1. In the **Integration Runtime Setup** dialog box, under **Name**, enter **TutorialIntegrationRuntime**. Then select **Next**.
-![Integration runtime name](./media/tutorial-hybrid-copy-data-tool/create-integration-runtime-dialog.png)

1. In the **Integration Runtime Setup** dialog box, select **Click here to launch the express setup for this computer**. This action installs the integration runtime on your machine and registers it with Data Factory. Alternatively, you can use the manual setup option to download the installation file, run it, and use the key to register the integration runtime.

-![lLaunch express setup on this computer link](./media/tutorial-hybrid-copy-data-tool/launch-express-setup-link.png)
-
1. Run the downloaded application. You see the status of the express setup in the window.

![Express setup status](./media/tutorial-hybrid-copy-data-tool/express-setup-status.png)
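These steps create a self-hosted integration runtime and the **SqlServerLinkedService** through the Copy Data tool. The sketch below shows roughly the same resources defined with the azure-mgmt-datafactory SDK; the connection string and names are placeholders, and creating the IR resource still leaves the local install-and-register step to the downloaded runtime.

```python
# Sketch (optional): a self-hosted integration runtime plus a SQL Server linked
# service that connects through it, defined via the SDK. Creating the IR
# resource does not install anything locally; you still run the downloaded
# runtime on the on-premises machine and register it with one of the
# authentication keys printed below. Names and the connection string are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeReference,
    IntegrationRuntimeResource,
    LinkedServiceResource,
    SelfHostedIntegrationRuntime,
    SqlServerLinkedService,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
resource_group = "<resource-group>"
factory_name = "<your-data-factory-name>"

# Create the self-hosted IR resource and read back its auth keys.
adf_client.integration_runtimes.create_or_update(
    resource_group,
    factory_name,
    "TutorialIntegrationRuntime",
    IntegrationRuntimeResource(properties=SelfHostedIntegrationRuntime()),
)
keys = adf_client.integration_runtimes.list_auth_keys(
    resource_group, factory_name, "TutorialIntegrationRuntime"
)
print(keys.auth_key1)  # use this key if you register the local runtime manually

# Linked service to the on-premises SQL Server, routed via the self-hosted IR.
adf_client.linked_services.create_or_update(
    resource_group,
    factory_name,
    "SqlServerLinkedService",
    LinkedServiceResource(
        properties=SqlServerLinkedService(
            connection_string="Server=<server>;Database=<db>;User ID=<user>;Password=<password>;",
            connect_via=IntegrationRuntimeReference(reference_name="TutorialIntegrationRuntime"),
        )
    ),
)
```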
@@ -187,19 +177,14 @@ In this section, you create a blob container named **adftutorial** in your Blob
![Integration runtime selected](./media/tutorial-hybrid-copy-data-tool/integration-runtime-selected.png)

-1. Select **Next**.
-
-![](./media/tutorial-hybrid-copy-data-tool/select-source-linked-service.png)
+1. On the **Source data store** page, select **Next**.

1. On the **Select tables from which to copy the data or use a custom query** page, select the **[dbo].[emp]** table in the list, and select **Next**. You can select any other table based on your database.

-![The Product table selection](./media/tutorial-hybrid-copy-data-tool/select-emp-table.png)
-
1. On the **Destination data store** page, select **Create new connection**

-![Create Destination linked service](./media/tutorial-hybrid-copy-data-tool/create-new-sink-connection.png)

-1. In **New Linked Service**, Search and Select **Azure Blob**, then **Continue**.
+1. In **New Linked Service**, Search and Select **Azure Blob**, and then select **Continue**.

![Blob storage selection](./media/tutorial-hybrid-copy-data-tool/select-destination-data-store.png)
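The destination side of the wizard creates an Azure Blob storage linked service. For reference, a minimal SDK sketch of a comparable sink linked service is below; the linked-service name and connection string are illustrative placeholders, not necessarily what the Copy Data tool generates.

```python
# Sketch (optional): a comparable Blob storage sink linked service via the SDK.
# The linked-service name and connection string are illustrative placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureStorageLinkedService,
    LinkedServiceResource,
    SecureString,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
blob_conn = SecureString(
    value="DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"
)
adf_client.linked_services.create_or_update(
    "<resource-group>",
    "<your-data-factory-name>",
    "AzureStorageLinkedService",
    LinkedServiceResource(properties=AzureStorageLinkedService(connection_string=blob_conn)),
)
```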
@@ -211,43 +196,30 @@ In this section, you create a blob container named **adftutorial** in your Blob
c. Under **Storage account name**, select your storage account from the drop-down list.

-d. Select **Next**.
-
-1. In **Destination data store** dialog, select **Next**. In **Connection properties**, select **Azure storage service** as **Azure Blob Storage**. Select **Next**.
+d. Select **Finish**.

-![connection properties](./media/tutorial-hybrid-copy-data-tool/select-connection-properties.png)
+1. In **Destination data store** dialog, make sure that **Azure Blob Storage** is selected. Then select **Next**.

1. In the **Choose the output file or folder** dialog, under **Folder path**, enter **adftutorial/fromonprem**. You created the **adftutorial** container as part of the prerequisites. If the output folder doesn't exist (in this case **fromonprem**), Data Factory automatically creates it. You can also use the **Browse** button to browse the blob storage and its containers/folders. If you do not specify any value under **File name**, by default the name from the source would be used (in this case **dbo.emp**).

![Choose the output file or folder](./media/tutorial-hybrid-copy-data-tool/choose-output-file-folder.png)

1. On the **File format settings** dialog, select **Next**.

-![File format settings page](./media/tutorial-hybrid-copy-data-tool/file-format-settings-page.png)
-
1. On the **Settings** dialog, select **Next**.

-![Settings page](./media/tutorial-hybrid-copy-data-tool/settings-page.png)
-
1. On the **Summary** dialog, review values for all the settings, and select **Next**.

-![Summary page](./media/tutorial-hybrid-copy-data-tool/summary-page.png)
-
1. On the **Deployment** page, select **Monitor** to monitor the pipeline or task you created.

![Deployment page](./media/tutorial-hybrid-copy-data-tool/deployment-page.png)

1. On the **Monitor** tab, you can view the status of the pipeline you created. You can use the links in the **Action** column to view activity runs associated with the pipeline run and to rerun the pipeline.
-
-![Monitor pipeline runs](./media/tutorial-hybrid-copy-data-tool/monitor-pipeline-runs.png)
-
-1. Select the **View Activity Runs** link in the **Actions** column to see activity runs associated with the pipeline run. To see details about the copy operation, select the **Details** link (eyeglasses icon) in the **Actions** column. To switch back to the **Pipeline Runs** view, select **Pipelines** at the top.
-
-![Monitor activity runs](./media/tutorial-hybrid-copy-data-tool/monitor-activity-runs.png)
+
+1. Select the **View Activity Runs** link in the **Actions** column to see activity runs associated with the pipeline run. To see details about the copy operation, select the **Details** link (eyeglasses icon) in the **Actions** column. To switch back to the **Pipeline Runs** view, select **Pipeline Runs** at the top.

1. Confirm that you see the output file in the **fromonprem** folder of the **adftutorial** container.

-![Output blob](./media/tutorial-hybrid-copy-data-tool/output-blob.png)

1. Select the **Edit** tab on the left to switch to the editor mode. You can update the linked services, datasets, and pipelines created by the tool by using the editor. Select **Code** to view the JSON code associated with the entity opened in the editor. For details on how to edit these entities in the Data Factory UI, see [the Azure portal version of this tutorial](tutorial-copy-data-portal.md).
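The Monitor tab steps in this last hunk can also be reproduced from code. A hedged sketch that queries recent pipeline runs for the factory is shown below; the subscription, resource group, and factory names are placeholders, and the one-day lookback window is arbitrary.

```python
# Sketch (optional): query recent pipeline runs from code, mirroring the
# Monitor tab. Subscription, resource group, and factory name are placeholders.
from datetime import datetime, timedelta

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
filters = RunFilterParameters(
    last_updated_after=datetime.utcnow() - timedelta(days=1),
    last_updated_before=datetime.utcnow() + timedelta(minutes=5),
)
runs = adf_client.pipeline_runs.query_by_factory("<resource-group>", "<your-data-factory-name>", filters)
for run in runs.value:
    print(run.pipeline_name, run.status, run.run_id)
```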
