Cloud Composer: Copying a BigQuery Table to a Different Location
Need to try again.
Only "us" could be confirmed under "Viewing Variables". Where did I go wrong?
Getting an error in the Airflow UI: Broken DAG: [/home/airflow/gcs/dags/bq_copy_across_locations.py] list index out of range. I also didn't understand this step: "To define the Cloud Storage plugin, the class CloudStoragePlugin(AirflowPlugin) is defined, mapping the hook and operator downloaded from the Airflow 1.10-stable branch."
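For what it's worth, the plugin step described there can be sketched roughly as below. This is an illustrative sketch only: the class name, plugin name, and module paths are assumptions based on the lab's wording, not verified against the lab's actual files.

```python
# Hedged sketch of the plugin-definition step, assuming the hook and
# operator files (e.g. gcs_hook.py, gcs_to_gcs.py) from the Airflow
# 1.10-stable branch were copied into the environment's plugins/ folder.
# All names here are illustrative placeholders.
from airflow.plugins_manager import AirflowPlugin

from hooks.gcs_hook import GoogleCloudStorageHook
from operators.gcs_to_gcs import (
    GoogleCloudStorageToGoogleCloudStorageOperator,
)


class CloudStoragePlugin(AirflowPlugin):
    """Registers the downloaded hook and operator with Airflow so that
    DAGs can import them through the plugin namespace."""

    name = "gcs_plugin"
    hooks = [GoogleCloudStorageHook]
    operators = [GoogleCloudStorageToGoogleCloudStorageOperator]
```

If the copied hook/operator files are missing or the folder layout differs from what the DAG expects, DAG parsing can fail, which may surface as a "Broken DAG" error like the one above.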
I had a big problem setting the location of the destination bucket to Europe, as Cloud Shell would not accept "EU" as a predefined multi-region location in Europe. I was not even able to continue with the lab.
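If the bucket-creation command in Cloud Shell rejects the location, one possible workaround is creating the destination bucket from Python instead. This is a sketch under assumptions: it uses the google-cloud-storage client library, and "my-project" and "my-dest-bucket-eu" are placeholder names, not values from the lab.

```python
# Sketch only: create the destination bucket in the EU multi-region.
# "my-project" and "my-dest-bucket-eu" are placeholders; requires
# google-cloud-storage and valid GCP credentials.
from google.cloud import storage

client = storage.Client(project="my-project")

# The multi-region location code is the uppercase string "EU";
# lowercase "eu" or a region name like "europe-west1" are different values.
bucket = client.create_bucket("my-dest-bucket-eu", location="EU")
print(bucket.location)
```

The same location string ("EU", uppercase) should also work with `gsutil mb -l EU`, which may be the quicker fix inside Cloud Shell.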
Cloud Composer: Copying BigQuery Tables Across Different Locations Completed