Cloud Composer: Copying BigQuery Tables Across Different Locations
In this advanced lab, you will learn how to create and run an Apache Airflow workflow in Cloud Composer that completes the following tasks (a sketch of such a DAG follows the list):
- Reads the list of tables to copy from a config file
- Exports those tables from a BigQuery dataset located in the US to Cloud Storage
- Copies the exported tables from the US Cloud Storage bucket to an EU bucket
- Imports the tables into the target BigQuery dataset in the EU
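The workflow maps naturally onto a small Airflow DAG. The sketch below is illustrative only, assuming a recent Airflow 2.x environment with the Google provider package installed; the project, dataset, bucket, and table names are placeholders, and the lab's own DAG reads the table list from a config file rather than hard-coding it.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.bigquery_to_gcs import BigQueryToGCSOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.transfers.gcs_to_gcs import GCSToGCSOperator

# In the lab the table list comes from a config file; it is hard-coded here for brevity.
TABLES = ["table_a", "table_b"]

with DAG(
    dag_id="bq_copy_us_to_eu",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,  # run on demand
    catchup=False,
) as dag:
    for table in TABLES:
        # 1. Export the US table to the US Cloud Storage bucket
        #    (one Avro file per table; tables over 1 GB would need a wildcard URI).
        export = BigQueryToGCSOperator(
            task_id=f"export_{table}",
            source_project_dataset_table=f"my-project.us_dataset.{table}",
            destination_cloud_storage_uris=[f"gs://my-us-bucket/{table}.avro"],
            export_format="AVRO",
        )
        # 2. Copy the exported file from the US bucket to the EU bucket.
        copy = GCSToGCSOperator(
            task_id=f"copy_{table}",
            source_bucket="my-us-bucket",
            source_object=f"{table}.avro",
            destination_bucket="my-eu-bucket",
        )
        # 3. Load the copied file into the target EU dataset.
        load = GCSToBigQueryOperator(
            task_id=f"load_{table}",
            bucket="my-eu-bucket",
            source_objects=[f"{table}.avro"],
            destination_project_dataset_table=f"my-project.eu_dataset.{table}",
            source_format="AVRO",
            write_disposition="WRITE_TRUNCATE",
        )
        export >> copy >> load
```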
Create a Cloud Composer environment.
Create two Cloud Storage buckets.
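As an illustration of this step, the snippet below uses the google-cloud-storage Python client to create the two buckets. The bucket names are placeholders (bucket names must be globally unique), and the lab may instead have you create the buckets in the Cloud Console or with gsutil.

```python
# Hypothetical sketch: create a US source bucket and an EU destination bucket.
from google.cloud import storage

client = storage.Client()
# Source bucket co-located with the US dataset.
client.create_bucket("my-project-us-source", location="US")
# Destination bucket in the EU, where the copied exports will land.
client.create_bucket("my-project-eu-dest", location="EU")
```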
Create a BigQuery dataset.
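A minimal sketch of creating the destination dataset with the google-cloud-bigquery client, assuming the placeholder project and dataset IDs below; the lab may instead use the bq command-line tool or the Cloud Console.

```python
# Hypothetical sketch: create the EU dataset that the workflow loads into.
from google.cloud import bigquery

client = bigquery.Client()
dataset = bigquery.Dataset("my-project.eu_dataset")
dataset.location = "EU"  # the target dataset must live in the EU location
client.create_dataset(dataset, exists_ok=True)
```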
Upload the DAG and dependencies to Cloud Storage.
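Cloud Composer deploys whatever lands in the dags/ folder of the environment's own Cloud Storage bucket, so uploading the DAG file together with the config file it reads is what activates the workflow. The sketch below assumes a placeholder bucket and file names; gsutil cp or gcloud composer environments storage dags import can be used for the same step.

```python
# Hypothetical sketch: push the DAG and its config file into the Composer
# environment's dags/ folder. The bucket name and file names are placeholders.
from google.cloud import storage

client = storage.Client()
dags_bucket = client.bucket("us-central1-my-composer-env-bucket")
for path in ["bq_copy_us_to_eu.py", "table_list.csv"]:
    dags_bucket.blob(f"dags/{path}").upload_from_filename(path)
```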