Commit d963d1e: Merge pull request MicrosoftDocs#72668 from mamccrea/userstory1473518

Databricks: Cosmos DB connector

2 parents: ccc38aa + 198aeb4
19 files changed: +173, -0 lines
articles/azure-databricks/TOC.yml: 2 additions & 0 deletions

```diff
@@ -17,6 +17,8 @@
 items:
 - name: Query SQL Server running in Docker container
   href: vnet-injection-sql-server.md
+- name: Use Cosmos DB service endpoint
+  href: service-endpoint-cosmosdb.md
 - name: Perform ETL operations
   href: databricks-extract-load-sql-data-warehouse.md
 - name: Stream data using Event Hubs
```
articles/azure-databricks/service-endpoint-cosmosdb.md: 171 additions & 0 deletions

---
title: Implement Azure Databricks with a Cosmos DB endpoint
description: This tutorial describes how to implement Azure Databricks in a virtual network with a Service Endpoint enabled for Cosmos DB.
services: azure-databricks
author: mamccrea
ms.author: mamccrea
ms.reviewer: jasonh
ms.topic: tutorial
ms.date: 04/17/2019
#Customer intent: As a data scientist, I want to use the Cosmos DB Spark connector so that I can access Cosmos DB data from Azure Databricks.
---
# Tutorial: Implement Azure Databricks with a Cosmos DB endpoint

This tutorial describes how to implement a VNet-injected Databricks environment with a Service Endpoint enabled for Cosmos DB.

In this tutorial, you learn how to:

> [!div class="checklist"]
> * Create an Azure Databricks workspace in a virtual network
> * Create a Cosmos DB service endpoint
> * Create a Cosmos DB account and import data
> * Create an Azure Databricks cluster
> * Query Cosmos DB from an Azure Databricks notebook
## Prerequisites

Before you start, do the following:

* Create an [Azure Databricks workspace in a virtual network](quickstart-create-databricks-workspace-vnet-injection.md).

* Download the [Spark connector](https://search.maven.org/remotecontent?filepath=com/microsoft/azure/azure-cosmosdb-spark_2.4.0_2.11/1.3.4/azure-cosmosdb-spark_2.4.0_2.11-1.3.4-uber.jar).

* Download sample data from the [NOAA National Centers for Environmental Information](https://www.ncdc.noaa.gov/stormevents/). Select a state or area and select **Search**. On the next page, accept the defaults and select **Search**. Then select **CSV Download** on the left side of the page to download the results.

* Download the [pre-compiled binary](https://aka.ms/csdmtool) of the Azure Cosmos DB Data Migration Tool.
## Create a Cosmos DB service endpoint

1. Once you have deployed an Azure Databricks workspace to a virtual network, navigate to the virtual network in the [Azure portal](https://portal.azure.com). Notice the public and private subnets that were created through the Databricks deployment.

![Virtual network subnets](./media/service-endpoint-cosmosdb/virtual-network-subnets.png)

2. Select the *public-subnet* and create a Cosmos DB service endpoint. Then select **Save**.

![Add a Cosmos DB service endpoint](./media/service-endpoint-cosmosdb/add-cosmosdb-service-endpoint.png)
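If you script your infrastructure, the same endpoint can also be enabled from the Azure CLI. A minimal sketch; the resource group and virtual network names are placeholders, not values from this tutorial:

```shell
# Enable the Cosmos DB service endpoint on the public subnet
# (replace the placeholder names with your own resources).
az network vnet subnet update \
  --resource-group <your resource group> \
  --vnet-name <your virtual network> \
  --name public-subnet \
  --service-endpoints Microsoft.AzureCosmosDB
```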
## Create a Cosmos DB account

1. Open the Azure portal. On the upper-left side of the screen, select **Create a resource > Databases > Azure Cosmos DB**.

2. Fill out the **Instance Details** on the **Basics** tab with the following settings:

|Setting|Value|
|-------|-----|
|Subscription|*your subscription*|
|Resource Group|*your resource group*|
|Account Name|db-vnet-service-endpoint|
|API|Core (SQL)|
|Location|West US|
|Geo-Redundancy|Disable|
|Multi-region Writes|Enable|

![Create a Cosmos DB account - Basics tab](./media/service-endpoint-cosmosdb/create-cosmosdb-account-basics.png)

3. Select the **Network** tab and configure your virtual network.

   a. Choose the virtual network you created as a prerequisite, and then select *public-subnet*. Notice that *private-subnet* has the note *'Microsoft.AzureCosmosDB' endpoint is missing*. This is because you only enabled the Cosmos DB service endpoint on the *public-subnet*.

   b. Ensure you have **Allow access from Azure portal** enabled. This setting allows you to access your Cosmos DB account from the Azure portal. If this option is set to **Deny**, you will receive errors when attempting to access your account.

   > [!NOTE]
   > It is not necessary for this tutorial, but you can also enable *Allow access from my IP* if you want the ability to access your Cosmos DB account from your local machine. For example, if you are connecting to your account using the Cosmos DB SDK, you need to enable this setting. If it is disabled, you will receive "Access Denied" errors.

![Cosmos DB account network settings](./media/service-endpoint-cosmosdb/create-cosmosdb-account-network.png)

4. Select **Review + Create**, and then **Create** to create your Cosmos DB account inside the virtual network.

5. Once your Cosmos DB account has been created, navigate to **Keys** under **Settings**. Copy the primary connection string and save it in a text editor for later use.

![Cosmos DB account keys page](./media/service-endpoint-cosmosdb/cosmos-keys.png)

6. Select **Data Explorer** and **New Collection** to add a new database and collection to your Cosmos DB account.

![Cosmos DB new collection](./media/service-endpoint-cosmosdb/new-collection.png)
## Upload data to Cosmos DB

1. Open the graphical interface version of the [data migration tool for Cosmos DB](https://aka.ms/csdmtool), **Dtui.exe**.

![Cosmos DB Data Migration Tool](./media/service-endpoint-cosmosdb/cosmos-data-migration-tool.png)

2. On the **Source Information** tab, select **CSV File(s)** in the **Import from** dropdown. Then select **Add Files** and add the storm data CSV you downloaded as a prerequisite.

![Cosmos DB Data Migration Tool source information](./media/service-endpoint-cosmosdb/cosmos-source-information.png)

3. On the **Target Information** tab, input your connection string. The connection string format is `AccountEndpoint=<URL>;AccountKey=<key>;Database=<database>`. The AccountEndpoint and AccountKey are included in the primary connection string you saved in the previous section. Append `Database=<your database name>` to the end of the connection string, and select **Verify**. Then, add the Collection name and partition key.

![Cosmos DB Data Migration Tool target information](./media/service-endpoint-cosmosdb/cosmos-target-information.png)
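The `Database` segment can also be appended to the saved primary connection string programmatically. A minimal sketch; the helper name and the sample database name `stormdb` are illustrative, not part of the tool:

```python
def with_database(primary_connection_string: str, database: str) -> str:
    """Append a Database=<name> segment to a Cosmos DB connection string."""
    base = primary_connection_string.rstrip(";")
    return f"{base};Database={database};"

# Placeholder values, not real credentials:
primary = "AccountEndpoint=https://db-vnet-service-endpoint.documents.azure.com:443/;AccountKey=<key>;"
target = with_database(primary, "stormdb")
```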
4. Select **Next** until you get to the Summary page. Then, select **Import**.
## Create a cluster and add library

1. Navigate to your Azure Databricks service in the [Azure portal](https://portal.azure.com) and select **Launch Workspace**.

![Launch Databricks workspace](./media/service-endpoint-cosmosdb/launch-workspace.png)

2. Create a new cluster. Choose a Cluster Name and accept the remaining default settings.

![New cluster settings](./media/service-endpoint-cosmosdb/create-cluster.png)

3. After your cluster is created, navigate to the cluster page and select the **Libraries** tab. Select **Install New** and upload the Spark connector jar file to install the library.

![Install Spark connector library](./media/service-endpoint-cosmosdb/install-cosmos-connector-library.png)

You can verify that the library was installed on the **Libraries** tab.

![Databricks cluster Libraries tab](./media/service-endpoint-cosmosdb/installed-library.png)
## Query Cosmos DB from a Databricks notebook

1. Navigate to your Azure Databricks workspace and create a new Python notebook.

![Create new Databricks notebook](./media/service-endpoint-cosmosdb/new-python-notebook.png)

2. Run the following Python code to set the Cosmos DB connection configuration. Change the **Endpoint**, **Masterkey**, **Database**, and **Collection** accordingly.
```python
connectionConfig = {
  "Endpoint" : "https://<your Cosmos DB account name>.documents.azure.com:443/",
  "Masterkey" : "<your Cosmos DB primary key>",
  "Database" : "<your database name>",
  "preferredRegions" : "West US 2",
  "Collection": "<your collection name>",
  "schema_samplesize" : "1000",
  "query_pagesize" : "200000",
  "query_custom" : "SELECT * FROM c"
}
```
3. Use the following Python code to load the data and create a temporary view.

```python
users = spark.read.format("com.microsoft.azure.cosmosdb.spark").options(**connectionConfig).load()
users.createOrReplaceTempView("storm")
```
4. Use the following magic command to execute a SQL statement that returns data.

```python
%sql
select * from storm
```
156+
You have successfully connected your VNet-injected Databricks workspace to a service-endpoint enabled Cosmos DB resource. To read more about how to connect to Cosmos DB, see [Azure Cosmos DB Connector for Apache Spark](https://github.com/Azure/azure-cosmosdb-spark).
157+
158+
## Clean up resources

When no longer needed, delete the resource group, the Azure Databricks workspace, and all related resources. Deleting the resource group avoids unnecessary billing. If you plan to use the Azure Databricks workspace in the future, you can stop the cluster and restart it later. If you are not going to continue to use this Azure Databricks workspace, delete all resources you created in this tutorial by using the following steps:

1. From the left-hand menu in the Azure portal, select **Resource groups** and then select the name of the resource group you created.

2. On your resource group page, select **Delete**, type the name of the resource group to delete in the text box, and then select **Delete** again.
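Resource group deletion can also be done from the Azure CLI. A one-line sketch with a placeholder group name (an assumption for illustration):

```shell
# Delete the resource group and everything in it (irreversible).
az group delete --name <your resource group> --yes --no-wait
```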
## Next steps

In this tutorial, you've deployed an Azure Databricks workspace to a virtual network, and used the Cosmos DB Spark connector to query Cosmos DB data from Databricks. To learn more about working with Azure Databricks in a virtual network, continue to the tutorial for using SQL Server with Azure Databricks.

> [!div class="nextstepaction"]
> [Tutorial: Query a SQL Server Linux Docker container in a virtual network from an Azure Databricks notebook](vnet-injection-sql-server.md)
