The purpose of the GCP Deployment Request is to give our client's Data Science team access to Boxalino datasets, so they can run jupyter/notebook processes in the designated anaconda environments.
...
Make a GCP Project Deployment Request with the Required Information. Boxalino will provision the project shortly after.
Your user email (as the requestor) will be given the editor role.
Prepare the Required Files (project structure) and load them in a GCS bucket from the project.
Prepare the information required for the Application Launch.
Launch the application.
Tip:
The application is launched in a VM in the project, where the commands from commands.txt are executed. Additionally, you can SSH into the VM to update or check its content.
Further tools will be provided which you can use to update the code running in the VM.
...
Because the application is launched in the scope of the project, the following Google Cloud tools can be used:
the Compute Engine, to launch more applications: https://console.cloud.google.com/compute/instances
the project's BigQuery dataset, to store results if required: https://console.cloud.google.com/bigquery
Google Cloud Storage (GCS), to load files & store logs: https://console.cloud.google.com/storage/browser
the Cloud Scheduler, to create events for automatic runs: https://console.cloud.google.com/cloudscheduler/start
Required Information
When contacting Boxalino with a GCP project deployment request, please provide the following information:
1 | project name | as it will appear in your projects list
2 | requestor email | the requestor is the one managing the applications running on the project; this email will receive messages (alerts and notifications) when the project is ready to be used
3 | client name | (also known as the Boxalino account name) this ensures access to the views, core & reports datasets
4 | labels | optional; the labels are used as project meta-information; see Labels
5 | members | optional; by default, the requestor will have full access and can further share the project with others; see Permissions
...
access the project from Google Console https://console.cloud.google.com/
share access to the project with other members in the IAM Admin panel
create GCS buckets and load different applications, which can be triggered
...
Labels (optional)
Labels are key-value pairs meant to better organize the projects.
...
More information on labels: https://cloud.google.com/resource-manager/docs/creating-managing-labels
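GCP enforces a strict format on labels: keys and values may use lowercase letters, digits, hyphens and underscores, are limited to 63 characters, and a key must start with a lowercase letter. A minimal pre-submission check can be sketched as below; the example labels are hypothetical, and the check is simplified (GCP also permits certain international characters):

```python
import re

# Simplified GCP label constraints (see the resource-manager docs):
#   keys:   1-63 chars, start with a lowercase letter,
#           only lowercase letters, digits, '-' and '_'
#   values: 0-63 chars, same character set, may be empty
_KEY_RE = re.compile(r"^[a-z][a-z0-9_-]{0,62}$")
_VALUE_RE = re.compile(r"^[a-z0-9_-]{0,63}$")

def validate_labels(labels: dict) -> list:
    """Return a list of problems; an empty list means the labels look valid."""
    problems = []
    for key, value in labels.items():
        if not _KEY_RE.match(key):
            problems.append(f"invalid key: {key!r}")
        if not _VALUE_RE.match(value):
            problems.append(f"invalid value for {key!r}: {value!r}")
    return problems

# Hypothetical labels for a deployment request
labels = {"team": "data-science", "env": "production"}
assert validate_labels(labels) == []
```

Running the check before sending the request avoids a rejected project creation due to a malformed label.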
Permissions (optional)
The permissions are added when the project is created.
...
Code Block:
name: gcp-application-name
channels:
  - defaults
dependencies:
  - ca-certificates=2020.1.1=0
  - <a list of dependencies>
  - pip:
    - google-api-core==1.22.2
    - google-api-python-client==1.9.3
    - google-auth==1.17.2
    - <more-libraries required for the application>
prefix: /opt/conda/envs/gcp-application-env
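A file like the one above is typically produced with `conda env export`, but a minimal one can also be assembled programmatically. The sketch below builds such a file from a list of pip requirements; the environment name, package versions and prefix are placeholders, not the values your project will use:

```python
def build_environment_yml(name: str, pip_packages: list, prefix: str) -> str:
    """Assemble a minimal conda environment.yml as a string."""
    lines = [
        f"name: {name}",
        "channels:",
        "  - defaults",
        "dependencies:",
        "  - pip:",
    ]
    # one indented entry per pip requirement, e.g. "package==version"
    lines += [f"    - {pkg}" for pkg in pip_packages]
    lines.append(f"prefix: {prefix}")
    return "\n".join(lines) + "\n"

# Placeholder values mirroring the sample file above
yml = build_environment_yml(
    "gcp-application-env",
    ["google-api-core==1.22.2", "google-auth==1.17.2"],
    "/opt/conda/envs/gcp-application-env",
)
print(yml)
```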
Application Launch
...
Note:
Before launching the application, make sure that the Required Files are uploaded in a GCS bucket.
...
To launch the application, complete the form in the Application Launch service https://gcp-deploy-du3do2ydza-ew.a.run.app/application.
Provide the following information:
1 | project ID | the project ID is unique; it is displayed on the dashboard of your project https://console.cloud.google.com/home/dashboard
2 | GCS bucket name | the bucket name where the Required Files are located (ex: gs://project-name); the contents will be made available on the application as well
3 | launch date | optional; projects can be scheduled for launch at a later day (tomorrow, etc.)
4 | access code | as provided by Boxalino
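Before filling in the form, it is worth sanity-checking the values locally. The sketch below captures the constraints that apply to the first fields; the function name is an assumption for illustration, not part of the launch service:

```python
import re
from datetime import date

def validate_launch_request(project_id: str, bucket: str, launch_date=None):
    """Raise ValueError for an obviously malformed form value (sketch only)."""
    # GCP project IDs: 6-30 chars, lowercase letters, digits and hyphens,
    # starting with a letter and not ending with a hyphen
    if not re.match(r"^[a-z][a-z0-9-]{4,28}[a-z0-9]$", project_id):
        raise ValueError(f"invalid project ID: {project_id!r}")
    # the bucket name is given in gs:// form, e.g. gs://project-name
    if not bucket.startswith("gs://"):
        raise ValueError(f"bucket must start with gs:// : {bucket!r}")
    # the optional launch date must not be in the past
    if launch_date is not None and launch_date < date.today():
        raise ValueError("launch date is in the past")

# Placeholder values; passes silently when everything is well-formed
validate_launch_request("my-project-123", "gs://my-project-123")
```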
BigQuery access
As a data scientist, chances are that you have been provided with a Service Account (SA) to access the client's private projects.
The application is run by the project's own Compute Engine Service Account (CE SA).
Because the project is in the scope of Boxalino, it will have direct read access to the client's datasets.
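In practice this means a notebook running on the VM can query BigQuery without configuring credentials: the client library picks up the CE SA automatically. A hedged sketch, assuming the `google-cloud-bigquery` package from the environment file; the dataset and table names are placeholders:

```python
# Inside the project's VM, the google-cloud-bigquery client uses the
# Compute Engine Service Account credentials automatically.
# "client_name.reports" and "daily_summary" are placeholder names.
dataset = "client_name.reports"
query = f"""
    SELECT *
    FROM `{dataset}.daily_summary`
    LIMIT 10
"""

# On the VM this would run without any explicit credential setup:
# from google.cloud import bigquery
# client = bigquery.Client()
# rows = client.query(query).result()
```

Results can be written back to the project's own BigQuery dataset, as noted in the tools list above.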
...