
Note

MAKE SURE THAT YOUR BIGQUERY REGION IS “Multi-Region EU”
and not a single region like “europe-west6”.

Set up Google Ads data transfer

Google Documentation: https://cloud.google.com/bigquery/docs/merchant-center-transfer#set_up_a_google_merchant_center_transfer

  • If the BigQuery Data Transfer API is not yet enabled, enable it now.
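
    If you work from the command line, the same API can be enabled with
    gcloud services enable bigquerydatatransfer.googleapis.com (assuming
    the gcloud CLI is installed and authenticated against your project).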


  • Create a new transfer.


  • For the source, select Google Ads. Give the transfer a name and leave the default (daily) schedule options. A scripted equivalent is sketched below.
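
    The same transfer can also be created programmatically. A minimal
    sketch, assuming the google-cloud-bigquery-datatransfer Python client
    library; PROJECT_ID, DATASET_ID and CUSTOMER_ID are placeholders for
    your own values, and the data source id is assumed to be "google_ads"
    (legacy transfers used "adwords"):

        from google.cloud import bigquery_datatransfer

        client = bigquery_datatransfer.DataTransferServiceClient()

        transfer_config = bigquery_datatransfer.TransferConfig(
            destination_dataset_id="DATASET_ID",    # the EU multi-region dataset
            display_name="Google Ads transfer",     # any descriptive name
            data_source_id="google_ads",            # assumed; "adwords" on legacy transfers
            params={"customer_id": "CUSTOMER_ID"},  # your Google Ads customer ID
            schedule="every 24 hours",              # the default daily schedule
        )

        transfer_config = client.create_transfer_config(
            parent=client.common_project_path("PROJECT_ID"),
            transfer_config=transfer_config,
        )
        print("Created transfer config:", transfer_config.name)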

...

  • In the destination settings, create a new dataset in the EU multi-region (see the sketch after the note below).

Note

MAKE SURE THAT YOUR BIGQUERY REGION IS “Multi-Region EU”
and not a single region like “europe-west6”.
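
    A minimal sketch of creating the destination dataset in the EU
    multi-region, assuming the google-cloud-bigquery Python client
    library; PROJECT_ID and DATASET_ID are placeholders:

        from google.cloud import bigquery

        client = bigquery.Client(project="PROJECT_ID")

        dataset = bigquery.Dataset("PROJECT_ID.DATASET_ID")
        dataset.location = "EU"  # the multi-region, not a single region like europe-west6

        dataset = client.create_dataset(dataset, exists_ok=True)
        print("Created dataset", dataset.dataset_id, "in", dataset.location)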

  • In the Data source details, add your Merchant ID and enable the desired inventories/insights.

...

  • Add notifications as you like and save the transfer. To verify the saved transfer, you can trigger a manual run, as sketched below.
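
    A minimal sketch of triggering a manual run, assuming the
    google-cloud-bigquery-datatransfer Python client library and the
    transfer_config.name of the saved transfer:

        import time

        from google.cloud import bigquery_datatransfer
        from google.protobuf import timestamp_pb2

        client = bigquery_datatransfer.DataTransferServiceClient()

        response = client.start_manual_transfer_runs(
            request={
                # e.g. "projects/.../locations/eu/transferConfigs/..."
                "parent": "TRANSFER_CONFIG_NAME",
                "requested_run_time": timestamp_pb2.Timestamp(seconds=int(time.time())),
            }
        )
        for run in response.runs:
            print(run.name, run.state)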

Share access to your datasets with the Boxalino service account emails and the Data Studio email listed below (a scripted example follows the list):

  1. boxalino.datastudio@gmail.com

  2. 55483703770-compute@developer.gserviceaccount.com

  3. 285267441728-compute@developer.gserviceaccount.com

  4. Access to daily processing: 473358938625-compute@developer.gserviceaccount.com

  5. Access to full processing: 748704719320-compute@developer.gserviceaccount.com

  6. Access to delta processing: 178258671205-compute@developer.gserviceaccount.com

  7. Access for reports processing: connection-daily@reports-290318.iam.gserviceaccount.com

  8. Access for lab processing: connection-daily@lab-daily.iam.gserviceaccount.com

  9. (optional) Access to mailchimp processing: 44011570818-compute@developer.gserviceaccount.com

  10. (optional) Access for p13nsync processing: 57928973091-compute@developer.gserviceaccount.com
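
A minimal sketch of granting one of the accounts above read access on a
dataset, assuming the google-cloud-bigquery Python client library;
PROJECT_ID and DATASET_ID are placeholders:

    from google.cloud import bigquery

    client = bigquery.Client(project="PROJECT_ID")
    dataset = client.get_dataset("PROJECT_ID.DATASET_ID")

    entries = list(dataset.access_entries)
    entries.append(
        bigquery.AccessEntry(
            role="READER",              # dataset-level read access
            entity_type="userByEmail",  # also used for service accounts
            entity_id="boxalino.datastudio@gmail.com",
        )
    )
    dataset.access_entries = entries

    client.update_dataset(dataset, ["access_entries"])

Repeat the AccessEntry for each email in the list above.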

Provide read access (Data Viewer/Metadata Viewer) on all relevant datasets in your custom GCP projects.

Let Boxalino know the name of the project and dataset, and we will integrate it into your data lake.