
Commit a4de1a3

Composer: initial workflow and REST samples (GoogleCloudPlatform#1468)
* Add quickstart example DAG. Add first Airflow example DAG. This DAG demonstrates how to call Dataproc from Airflow. Change-Id: I2f0464da37f11085dfc927f4f4d96d6a99634aad

* Add region tags for cloud.google.com. Also adjusts indentation to 4 spaces so that `flake8` passes. Change-Id: Id3e1fbfa5bd41a4e11e0cd7991b058b063241e6b

* Add composer_quickstart_steps region tag. Change-Id: Ib0bbacd91529fac272e2da0c72abca5db9b4fad8

* Test Composer workflow sample. The GCP Airflow operators only work on Python 2.7 (a constraint they inherited from Apache Beam), so I've made some changes to nox.py to exclude them from the Python 3 builds. Change-Id: Ia10971dd5a7b14279b236041836e317e79693258

* Add DAG that runs BigQuery and notifies user. Add a sample Airflow DAG that will create a BigQuery dataset, run a BigQuery query, export results to Cloud Storage, notify the user by email when results are ready, and clean up the dataset. Change-Id: Ia8242df29223d910b2d269a9bb93720b35470b7a

* Composer: add sample to get GCS bucket for DAGs. Since there are no Cloud client libraries for Composer yet, this sample uses the REST API directly via the google-auth and requests libraries. Sample to be used at https://cloud.google.com/composer/docs/how-to/using/managing-dags Also enables Kokoro testing for Composer samples. (Uses Python 2.7 since Cloud Composer is currently restricted to Python 2.7.) Change-Id: Icb37e079992c88eedc06cdcc3d72db5106d10ef5

* Add tests for BQ notify DAG. Requires the master copy of Airflow for the `bigquery_get_data` operator. Change-Id: I73cd2cfb2458b67bed1a77e65966d5018e8bb45d

* Composer: fix flake8 errors. Change-Id: I2856bc6cb866bd6f7abbac8de3323797a83c9857

* Composer: add region tags to notification DAG sample. Change-Id: I657e052fa851daa7c72045762090a2e27dd406d3

* Set machine type in quickstart to n1-standard-1. The default machine type was n1-standard-4, which exceeds trial project quota. This CL changes the machine type to n1-standard-1 since a more powerful machine is not necessary for the quickstart sample data. Change-Id: I46af68c29145f7a7ce303afdad4708bda7d9e6dd

* Add composer config to Travis test env. Change-Id: I9c27c75cbea8c5ed4edf859d26980e24ea270eea
1 parent 895cbf5 commit a4de1a3

18 files changed, +610 −8 lines changed

.kokoro/presubmit_tests_composer.cfg

Lines changed: 15 additions & 0 deletions
@@ -0,0 +1,15 @@
# Format: //devtools/kokoro/config/proto/build.proto

# Download secrets from Cloud Storage.
gfile_resources: "/bigstore/cloud-devrel-kokoro-resources/python-docs-samples"

# Tell the trampoline which build file to use.
env_vars: {
  key: "TRAMPOLINE_BUILD_FILE"
  value: "github/python-docs-samples/.kokoro/system_tests.sh"
}

env_vars: {
  key: "NOX_SESSION"
  value: "composer and py27 and not appengine"
}

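The `NOX_SESSION` value is a pytest-style keyword expression used to select which nox sessions run. As an illustration (this is not nox's actual implementation, and `matches` is a hypothetical helper), such an expression can be evaluated against a session's keywords roughly like this:

```python
import re


def matches(expression, keywords):
    """Evaluate a keyword expression such as
    'composer and py27 and not appengine' against a set of keywords."""
    def to_bool(match):
        word = match.group(0)
        if word in ('and', 'or', 'not'):
            return word  # Keep boolean operators as-is.
        # Replace each other identifier with its membership test result.
        return str(word in keywords)

    # Rewrite the expression into pure boolean literals, then evaluate it.
    boolean_expr = re.sub(r'[\w.-]+', to_bool, expression)
    return eval(boolean_expr, {'__builtins__': {}})


# A session tagged composer+py27 matches; one also tagged appengine would not.
print(matches('composer and py27 and not appengine', {'composer', 'py27'}))
```

A session selected by this config must be tagged both `composer` and `py27`, and must not be tagged `appengine`.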
.kokoro/system_tests_composer.cfg

Lines changed: 15 additions & 0 deletions
@@ -0,0 +1,15 @@
# Format: //devtools/kokoro/config/proto/build.proto

# Download secrets from Cloud Storage.
gfile_resources: "/bigstore/cloud-devrel-kokoro-resources/python-docs-samples"

# Tell the trampoline which build file to use.
env_vars: {
  key: "TRAMPOLINE_BUILD_FILE"
  value: "github/python-docs-samples/.kokoro/system_tests.sh"
}

env_vars: {
  key: "NOX_SESSION"
  value: "composer and py27 and not appengine"
}

.travis.yml

Lines changed: 3 additions & 3 deletions
@@ -9,10 +9,10 @@ branches:
   - master
 cache:
   directories:
-  - $HOME/.cache
+  - "$HOME/.cache"
 env:
   global:
-    secure: RPClDmwFZnpBIKtMvlzjzZVam4flvJtSvxFD8mHCQVQ//KqyBbQxl970kqStOK7p0RXkOB3XuFDvVixqyuptoQ8wTdgSBEPAub4DwRpcmCc1exzErHIt9zep3cQhSUuzl8N/tNl3o6GG04NsZTeErORqxfDvk3WbqFa9593G358=
+    secure: fsBH64/WqTe7lRcn4awZU7q6+euS/LHgMq2Ee2ubaoxUei2GbK5jBgnGHxOKVL5sZ4KNfTc7d6NR5BB1ZouYr2v4q1ip7Il9kFG4g5qV4cIXzHusXkrjvIzQLupNpcD9JJZr1fmYh4AqXRs2kP/nZqb7xB6Jm/O+h+aeC1bhhBg=
 addons:
   apt:
     sources:
@@ -26,4 +26,4 @@ install:
 - pip install --upgrade nox-automation
 - pip install --upgrade git+https://github.com/dhermes/ci-diff-helper.git
 script:
-- ./scripts/travis.sh
+- "./scripts/travis.sh"

composer/rest/README.rst

Lines changed: 92 additions & 0 deletions
@@ -0,0 +1,92 @@
.. This file is automatically generated. Do not edit this file directly.

Google Cloud Composer Python Samples
===============================================================================

.. image:: https://gstatic.com/cloudssh/images/open-btn.png
   :target: https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/GoogleCloudPlatform/python-docs-samples&page=editor&open_in_editor=composer/rest/README.rst

This directory contains samples for Google Cloud Composer. `Google Cloud Composer`_ is a managed Apache Airflow service that helps you create, schedule, monitor and manage workflows. Cloud Composer automation helps you create Airflow environments quickly and use Airflow-native tools, such as the powerful Airflow web interface and command line tools, so you can focus on your workflows and not your infrastructure.

.. _Google Cloud Composer: https://cloud.google.com/composer/docs

Setup
-------------------------------------------------------------------------------

Authentication
++++++++++++++

This sample requires you to have authentication set up. Refer to the
`Authentication Getting Started Guide`_ for instructions on setting up
credentials for applications.

.. _Authentication Getting Started Guide:
    https://cloud.google.com/docs/authentication/getting-started

Install Dependencies
++++++++++++++++++++

#. Install `pip`_ and `virtualenv`_ if you do not already have them. You may want to refer to the `Python Development Environment Setup Guide`_ for Google Cloud Platform for instructions.

   .. _Python Development Environment Setup Guide:
       https://cloud.google.com/python/setup

#. Create a virtualenv. Samples are compatible with Python 2.7 and 3.4+.

   .. code-block:: bash

       $ virtualenv env
       $ source env/bin/activate

#. Install the dependencies needed to run the samples.

   .. code-block:: bash

       $ pip install -r requirements.txt

.. _pip: https://pip.pypa.io/
.. _virtualenv: https://virtualenv.pypa.io/

Samples
-------------------------------------------------------------------------------

Determine Cloud Storage path for DAGs
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

.. image:: https://gstatic.com/cloudssh/images/open-btn.png
   :target: https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/GoogleCloudPlatform/python-docs-samples&page=editor&open_in_editor=composer/rest/get_dag_prefix.py,composer/rest/README.rst

To run this sample:

.. code-block:: bash

    $ python get_dag_prefix.py

    usage: get_dag_prefix.py [-h] project_id location composer_environment

    Get a Cloud Composer environment via the REST API.

    This code sample gets a Cloud Composer environment resource and prints the
    Cloud Storage path used to store Apache Airflow DAGs.

    positional arguments:
      project_id            Your Project ID.
      location              Region of the Cloud Composer environment.
      composer_environment  Name of the Cloud Composer environment.

    optional arguments:
      -h, --help            show this help message and exit

.. _Google Cloud SDK: https://cloud.google.com/sdk/
composer/rest/README.rst.in

Lines changed: 26 additions & 0 deletions
@@ -0,0 +1,26 @@
# This file is used to generate README.rst

product:
  name: Google Cloud Composer
  short_name: Cloud Composer
  url: https://cloud.google.com/composer/docs
  description: >
    `Google Cloud Composer`_ is a managed Apache Airflow service that helps
    you create, schedule, monitor and manage workflows. Cloud Composer
    automation helps you create Airflow environments quickly and use
    Airflow-native tools, such as the powerful Airflow web interface and
    command line tools, so you can focus on your workflows and not your
    infrastructure.

setup:
- auth
- install_deps

samples:
- name: Determine Cloud Storage path for DAGs
  file: get_dag_prefix.py
  show_help: True

cloud_client_library: false

folder: composer/rest

composer/rest/__init__.py

Whitespace-only changes.

composer/rest/get_dag_prefix.py

Lines changed: 63 additions & 0 deletions
@@ -0,0 +1,63 @@
# Copyright 2018 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Get a Cloud Composer environment via the REST API.

This code sample gets a Cloud Composer environment resource and prints the
Cloud Storage path used to store Apache Airflow DAGs.
"""

import argparse


def get_dag_prefix(project_id, location, composer_environment):
    # [START composer_get_environment_dag_prefix]
    import google.auth
    import google.auth.transport.requests

    # Authenticate with Google Cloud.
    # See: https://cloud.google.com/docs/authentication/getting-started
    credentials, _ = google.auth.default(
        scopes=['https://www.googleapis.com/auth/cloud-platform'])
    authed_session = google.auth.transport.requests.AuthorizedSession(
        credentials)

    # project_id = 'YOUR_PROJECT_ID'
    # location = 'us-central1'
    # composer_environment = 'YOUR_COMPOSER_ENVIRONMENT_NAME'

    environment_url = (
        'https://composer.googleapis.com/v1beta1/projects/{}/locations/{}'
        '/environments/{}').format(project_id, location, composer_environment)
    response = authed_session.request('GET', environment_url)
    environment_data = response.json()

    # Print the Cloud Storage path for DAGs from the response body.
    print(environment_data['config']['dagGcsPrefix'])
    # [END composer_get_environment_dag_prefix]


if __name__ == '__main__':
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter)
    parser.add_argument('project_id', help='Your Project ID.')
    parser.add_argument(
        'location', help='Region of the Cloud Composer environment.')
    parser.add_argument(
        'composer_environment', help='Name of the Cloud Composer environment.')

    args = parser.parse_args()
    get_dag_prefix(
        args.project_id, args.location, args.composer_environment)

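The printed `dagGcsPrefix` value has the form `gs://bucket-name/dags`. If only the bucket name is needed, for example to pass to a Cloud Storage client, it can be split out of the prefix; the `bucket_from_dag_prefix` helper and the bucket name below are hypothetical, not part of the sample:

```python
def bucket_from_dag_prefix(dag_gcs_prefix):
    # 'gs://bucket-name/dags' -> 'bucket-name'
    without_scheme = dag_gcs_prefix[len('gs://'):]
    return without_scheme.split('/', 1)[0]


# prints: us-central1-example-bucket
print(bucket_from_dag_prefix('gs://us-central1-example-bucket/dags'))
```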
composer/rest/get_dag_prefix_test.py

Lines changed: 28 additions & 0 deletions
@@ -0,0 +1,28 @@
# Copyright 2018 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import os

from .get_dag_prefix import get_dag_prefix


PROJECT = os.environ['GOOGLE_CLOUD_PROJECT']
COMPOSER_LOCATION = os.environ['COMPOSER_LOCATION']
COMPOSER_ENVIRONMENT = os.environ['COMPOSER_ENVIRONMENT']


def test_get_dag_prefix(capsys):
    get_dag_prefix(PROJECT, COMPOSER_LOCATION, COMPOSER_ENVIRONMENT)
    out, _ = capsys.readouterr()
    assert 'gs://' in out

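The test above relies on pytest's `capsys` fixture to capture what `get_dag_prefix` prints. Outside pytest, the same stdout-capture pattern can be sketched with the standard library; `fake_get_dag_prefix` and the bucket name are placeholders standing in for the real sample, which needs a live Composer environment:

```python
import contextlib
import io


def fake_get_dag_prefix():
    # Stand-in for the real sample, which prints the environment's
    # dagGcsPrefix fetched from the Composer REST API.
    print('gs://us-central1-example-bucket/dags')


# Capture stdout, analogous to what pytest's capsys fixture does.
buffer = io.StringIO()
with contextlib.redirect_stdout(buffer):
    fake_get_dag_prefix()

out = buffer.getvalue()
assert 'gs://' in out
```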
composer/rest/requirements.txt

Lines changed: 2 additions & 0 deletions
@@ -0,0 +1,2 @@
google-auth==1.4.1
requests==2.18.4

composer/workflows/__init__.py

Whitespace-only changes.
