Cloud Composer: Copying BigQuery Tables Across Different Locations

1 hour · 7 Credits

GSP283

Google Cloud Self-Paced Labs

Overview

In this advanced lab, you will learn how to create and run an Apache Airflow workflow in Cloud Composer that performs the following tasks (a minimal DAG sketch follows the list):

  • Reads a list of tables to copy from a config file
  • Exports those tables from a BigQuery dataset located in the US to Cloud Storage
  • Copies the exported tables from the US Cloud Storage bucket to an EU bucket
  • Imports the tables into the target BigQuery dataset in the EU
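
To make the workflow concrete, here is a minimal sketch of such a DAG, written for Airflow 2 with the Google provider package. The bucket names, dataset names, table names, and config-file path are illustrative assumptions, not the lab's actual files.

    # A minimal sketch of the kind of DAG this lab builds. All resource
    # names here (buckets, datasets, config path) are assumptions.
    import csv
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.bigquery_to_gcs import BigQueryToGCSOperator
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
    from airflow.providers.google.cloud.transfers.gcs_to_gcs import GCSToGCSOperator

    US_BUCKET = "my-project-us"        # assumption: source bucket in the US
    EU_BUCKET = "my-project-eu"        # assumption: destination bucket in the EU
    SOURCE_DATASET = "nyc_tlc_us"      # assumption: source dataset (US)
    TARGET_DATASET = "nyc_tlc_eu"      # assumption: target dataset (EU)

    # Read the list of tables to copy from a CSV config file shipped next
    # to the DAG (Composer mounts the dags/ bucket folder at this path).
    with open("/home/airflow/gcs/dags/table_list.csv") as config:
        TABLES = [row[0] for row in csv.reader(config)]

    with DAG(
        dag_id="bq_copy_us_to_eu",
        start_date=datetime(2024, 1, 1),
        schedule=None,   # run on demand only
        catchup=False,
    ) as dag:
        for table in TABLES:
            # Export the table from the US dataset to the US bucket as Avro.
            export = BigQueryToGCSOperator(
                task_id=f"export_{table}",
                source_project_dataset_table=f"{SOURCE_DATASET}.{table}",
                destination_cloud_storage_uris=[f"gs://{US_BUCKET}/{table}.avro"],
                export_format="AVRO",
            )
            # Copy the exported file across regions, US bucket to EU bucket.
            copy = GCSToGCSOperator(
                task_id=f"copy_{table}",
                source_bucket=US_BUCKET,
                source_object=f"{table}.avro",
                destination_bucket=EU_BUCKET,
                destination_object=f"{table}.avro",
            )
            # Load the copied file into the target dataset in the EU.
            load = GCSToBigQueryOperator(
                task_id=f"load_{table}",
                bucket=EU_BUCKET,
                source_objects=[f"{table}.avro"],
                destination_project_dataset_table=f"{TARGET_DATASET}.{table}",
                source_format="AVRO",
                write_disposition="WRITE_EMPTY",
            )
            export >> copy >> load

Avro is a convenient interchange format here because it carries its own schema, so the load step needs no explicit schema definition.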

This lab has four graded checkpoints, worth 25 points each (100 points total); a Python sketch of the setup follows the list:

  • Create a Cloud Composer environment (25 points)
  • Create two Cloud Storage buckets (25 points)
  • Create a dataset (25 points)
  • Upload the DAG and dependencies to Cloud Storage (25 points)
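
In the lab, these checkpoints are completed in the Cloud Console or Cloud Shell (the Composer environment itself is created there). Purely as an illustration, the following sketch performs the bucket, dataset, and upload steps with the Google Cloud Python client libraries; every resource name in it (project, buckets, dataset, DAG bucket) is an assumption.

    # Illustrative sketch only: the lab performs these steps in the
    # Console/Cloud Shell. All resource names below are assumptions.
    from google.cloud import bigquery, storage

    PROJECT = "my-project"  # assumption: your lab project ID

    # Checkpoint 2: create two Cloud Storage buckets, one per location.
    gcs = storage.Client(project=PROJECT)
    gcs.create_bucket(f"{PROJECT}-us", location="US")
    gcs.create_bucket(f"{PROJECT}-eu", location="EU")

    # Checkpoint 3: create the target BigQuery dataset in the EU.
    bq = bigquery.Client(project=PROJECT)
    dataset = bigquery.Dataset(f"{PROJECT}.nyc_tlc_eu")  # assumption: dataset name
    dataset.location = "EU"
    bq.create_dataset(dataset, exists_ok=True)

    # Checkpoint 4: upload the DAG and its config into the Composer
    # environment's dags/ folder (the bucket name comes from the environment).
    dag_bucket = gcs.bucket("us-central1-my-env-bucket")  # assumption
    dag_bucket.blob("dags/bq_copy_us_to_eu.py").upload_from_filename("bq_copy_us_to_eu.py")
    dag_bucket.blob("dags/table_list.csv").upload_from_filename("table_list.csv")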