Terraform simplifies deploying Generative AI for Marketing. The Terraform deployment includes all necessary requirements.
Note: The Terraform Provider for Google Cloud cannot create some of the GenAI resources, so `null_resource` blocks are used to create those resources with the Google Cloud SDK.
You'll need to create a Google Cloud project and link a billing account before you begin. It is strongly recommended you deploy Generative AI for Marketing in its own, new project. Existing resources in a project may be impacted by the deployment, and the deployment itself may fail.
These instructions have been tested by a Google Cloud user with the Owner role on the project; installation may not work if the installing user does not have the Owner role.
In certain Google Cloud Organizations, organization policies may block installation steps. The Known Issues section provides help changing these policies, which requires the Organization Administrator Role.
Make sure you have sufficient free space in your terminal environment before you begin installation--4GB is recommended. Having insufficient free space can cause installation steps to fail in a state that makes recovery especially difficult. This is especially important when installing the frontend, which requires a large number of npm packages.
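If you're unsure how much space is available, a quick check such as the following works in Cloud Shell or any Linux terminal:
# Show free space on the home filesystem (Cloud Shell's home directory is a persistent disk).
df -h $HOME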
If you encounter problems during deployment see the Known Issues section for workarounds to common issues.
Before executing Terraform, follow these steps to enable some services:
- Request access to Imagen through this form. Note that this can take up to a week. You can still use Generative AI for Marketing while awaiting allowlisting, but image generation capabilities will not work. Generative AI for Marketing currently uses Imagen 3.
- Go to https://console.developers.google.com/apis/api/cloudresourcemanager.googleapis.com/overview and enable the API.
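If you prefer the command line, the same API can be enabled with gcloud:
# Enable the Cloud Resource Manager API for your project.
gcloud services enable cloudresourcemanager.googleapis.com --project <your_project_id>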
The frontend of Generative AI for Marketing is hosted on Firebase. Before beginning deployment, you need to enable Firebase.
- Go to https://console.firebase.google.com/.
- Select "Create a project" and enter the name of your Google Cloud Platform project, then click "Continue".
- If you're using Firebase for the first time, you'll have to add Firebase to one of your existing Google Cloud projects and confirm the Firebase billing plan.
- When prompted to set up Google Analytics, respond as you'd like.
- Continue and complete.
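If you already use the Firebase CLI, adding Firebase to an existing Google Cloud project can also be done from the terminal. This is a sketch; the console flow above remains the documented path:
# Requires the Firebase CLI to be installed and authenticated.
firebase projects:addfirebase <your_project_id>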
The chat agent and search features of Generative AI for Marketing require Vertex AI Agent Builder.
- Go to https://console.cloud.google.com/gen-app-builder/start .
- Click the button to accept TOS and enable.
Cloud Shell is the recommended environment for running the deployment. If you are deploying from outside Cloud Shell, set up your Google Cloud SDK credentials:
gcloud config set project <your_project_id>
gcloud auth application-default set-quota-project <your_project_id>
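If you haven't generated Application Default Credentials on this machine before, you will likely also need to authenticate first (a sketch for a local workstation; Cloud Shell is already authenticated):
gcloud auth login
gcloud auth application-default login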
You'll also need to install Terraform and the gcloud CLI.
Note: The deployment requires Terraform 1.7 or higher.
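You can confirm which version you have installed with:
terraform version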
Before proceeding, Generative AI for Marketing requires that your organization has Google Workspace set up and that you have a Workspace account.
- Clone the GitHub repo.
- In Cloud Shell, navigate to the git repo root.
- Run `gcloud config set project YOUR_PROJECT_ID` to ensure you're installing into the expected project.
- In the cloned project root, run the following to start the Terraform deployment:
# Move to the infra folder.
cd infra/
export USER_PROJECT_OVERRIDE=true
export GOOGLE_BILLING_PROJECT=$(gcloud config get project)
terraform init
terraform apply -var=project_id=$(gcloud config get project)
When `terraform apply` completes successfully, you'll see the message `Apply complete!` along with outputs specifying config values. Save this output somewhere; you'll need these values later.
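If you misplace the output, you can print it again at any time by running the following from the infra/ folder:
terraform output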
After the Terraform deployment successfully completes, enable at least one authentication provider in Firebase. You can enable it using the following steps:
- Go to https://console.firebase.google.com/project/your_project_id/authentication/providers (change the `your_project_id` value in this URL to your project ID).
- Click on Get Started (if needed).
- Select Google and flip the enable switch on.
- Set the name for the project and the support email.
- Click the "Save" button.
Generative AI for Marketing uses Google Drive to store created marketing materials. This step creates a Google Drive folder, populates it with templates for the marketing materials, and then returns Google Drive IDs for these templates (you'll need these later). You'll then give the Generative AI for Marketing application access to the Google Drive folder.
Execute the following script from the `infra` subfolder, replacing `<cloud_run_backend_sa>` with the `cloud_run_backend_sa` value output by `terraform apply` (without quotes) in step 1.
Note: if you already have a `genai-marketing-assets` folder in your top-level Google Drive, you must use a different folder name.
echo "{}" >> gdrive_folder_results.json
python scripts/create_gdrive_folder.py --folder-name="genai-marketing-assets" --service-account-email=<cloud_run_backend_sa>
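After the script finishes, the Google Drive IDs it returns are stored in gdrive_folder_results.json; you can inspect them with, for example:
cat gdrive_folder_results.json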
In most Workspace setups, the Generative AI for Marketing application needs to be granted access to the Google Drive folder you just created. To do this, share the folder with the service account created for the application backend during the Terraform installation. This will allow the service account to access and manage files within the designated folder.
- In your web browser, open drive.google.com.
- Find the folder created in the previous step (default is genai-marketing-assets).
- Click on the three dots menu at the far right of the row with the folder.
- Highlight "Share" in the menu that pops up, then click "Share" in the submenu.
- In the "Add people, groups, and calendar events" field, enter the email address of the service account (
cloud_run_backend_sa
) that was provided as output from theterraform apply
. - Set the permissions for the service account to "Editor".
- Clock "Send" to share the folder and grant permissions. If a popup appears asking for confirmation, click "Share anyway".
Terraform uses the template at `infra/templates/config.toml.tftpl` to generate `config.toml`. During deployment, key sections are replaced with actual infrastructure details, and the final `config.toml` is written to `infra/output_config/`.
Next, we'll incorporate Google Drive details into this file. Use your preferred text editor (nano is shown here).
- Open `gdrive_folder_results.json` in the `infra` directory.
- Copy the values for `folder_gdrive_id`, `slide_gdrive_id`, `doc_gdrive_id`, and `sheet_gdrive_id`, and save them somewhere safe outside of Cloud Shell (a jq sketch for printing these values follows this list).
- In the `infra` directory, open `output_config/config.toml` (e.g., `nano output_config/config.toml`).
- Search for `drive_folder_id`. You'll see placeholders for the 4 values you copied. (In nano, use Ctrl+w to search.)
- Replace the placeholders:
  - `folder_gdrive_id` -> `drive_folder_id`
  - `slide_gdrive_id` -> `slides_template_id`
  - `doc_gdrive_id` -> `doc_template_id`
  - `sheet_gdrive_id` -> `sheet_template_id`
- Save the file. (In nano, Ctrl+x, then Y, then Enter.)
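If jq is available (it is preinstalled in Cloud Shell), here is a quick sketch for printing the four IDs mentioned above, assuming they are stored under those key names in the JSON file:
# Run from the infra/ directory.
jq -r '.folder_gdrive_id, .slide_gdrive_id, .doc_gdrive_id, .sheet_gdrive_id' gdrive_folder_results.json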
The automated deployment process created all the resources needed to use the Vertex AI Search service with a Dialogflow CX agent. However, additional steps are required to provide the chat agent with data to use in the frontend:
- Indexing Data: Two data stores were created during the automated deployment: a `Website` type for indexing information from your existing website (if applicable), and an `Unstructured data` type for indexing information from files such as PDFs in Google Cloud Storage (GCS); the latter is what we'll be doing here. To learn more about Agent Builder data stores, see here. IMPORTANT: Indexing data from a website requires domain verification of your website in order to use the advanced features. Domain verification is out of scope for this demo, but you can find the steps here.
Follow the steps below:
- GCS Bucket Creation: Create a GCS bucket if you don't have one already. The following commands create a GCS bucket with uniform bucket-level access. Change the value of `BUCKET_LOCATION` if you don't want the bucket created in `us-central1`.
export PROJECT_ID=$(gcloud config get project)
export BUCKET_NAME="$PROJECT_ID-vais-unstructured-data"
export BUCKET_LOCATION="us-central1"
export STORAGE_CLASS="STANDARD"
gcloud config set project $PROJECT_ID
gcloud storage buckets create gs://$BUCKET_NAME --project=$PROJECT_ID --default-storage-class=$STORAGE_CLASS --location=$BUCKET_LOCATION --uniform-bucket-level-access
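# Optional: confirm the bucket was created (a quick check, assuming the variables above are still set in your shell).
gcloud storage buckets describe gs://$BUCKET_NAME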
- Copying Data: We need to copy some PDFs (Alphabet earnings reports from 2004 to 2023) into the newly created bucket using `gsutil`:
gsutil -m cp -r "gs://cloud-samples-data-us-central1/gen-app-builder/search/alphabet-investor-pdfs" "gs://$BUCKET_NAME/data"
You can also download the folder manually and upload it to your bucket.
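# Alternative sketch using the newer gcloud storage command instead of gsutil; either copies the same sample PDFs.
gcloud storage cp -r "gs://cloud-samples-data-us-central1/gen-app-builder/search/alphabet-investor-pdfs" "gs://$BUCKET_NAME/data"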
- Indexing Data Store: Follow the instructions here to index the data store with the PDF documents we just copied over.
- Add Datastore to Dialogflow CX: Once the PDF documents are copied to your GCS bucket and indexed, you need to connect the Dialogflow CX agent to your data store. Follow the steps below to do that.
- Connect Agent to Data: Go to your Dialogflow CX agent and click on Build > Default Start Flow > Start Page.
- Under Data stores, click on Edit Data Store and select your indexed data store (type: Unstructured documents) from the drop-down.
- Click Save.
- Test your agent from the Dialogflow CX UI to make sure it responds with the right data. If it doesn't, ensure you followed all the steps above.
- Publish Agent: In order to access your chat agent from the GAIM frontend, you will need to publish it. Follow the steps below:
- Click on Publish.
- Under Access, ensure `Unauthenticated API (anonymous access)` is checked.
- Set your UI style as Side Panel.
- Finally, click on Enable the Unauthenticated API. This will generate some HTML code that can be added to your website to display your agent. You can ignore this, as the provided frontend already has the chat UI built for you.
- Now, click Done and exit.
To deploy the backend of the application, run the following command from the `/infra` folder. You need to use values output by `terraform apply` (`region` and `cloud_run_backend_sa`, both without quotes and with the `<` and `>` removed) for this step.
sh scripts/backend_deployment.sh --project $(gcloud config get project) --region <region> --sa <cloud_run_backend_sa>
In a fresh project, you'll be asked to create an Artifact Registry Docker repository. Enter `Y` to confirm.
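For illustration only, here is the same command with a hypothetical region and a hypothetical service account email filled in; substitute the actual values from your `terraform apply` output:
# Example only: the region and service account email below are placeholders.
sh scripts/backend_deployment.sh --project $(gcloud config get project) --region us-central1 --sa genai-marketing-run@my-project.iam.gserviceaccount.com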
The backend deployment pushes the backend APIs into a Cloud Run container that will be called by the frontend UI. The APIs are implemented in Python using FastAPI.
The frontend is an Angular application deployed in Firebase.
Please validate that you're logged in with the correct account; if not, log out and then log in again with the Firebase command line.
# To list your existing logins.
firebase login:list
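If the wrong account is listed, you can switch accounts with the standard Firebase CLI commands:
# Log out of the current account, then log in with the account that has access to the Firebase project.
firebase logout
firebase login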
Then, to deploy the frontend, execute the following from the `/infra` folder:
sh scripts/frontend_deployment.sh --project $(gcloud config get project)
Once this script completes, Generative AI for Marketing is deployed!
When frontend deployment is complete, the 'Hosting URL' printed in the terminal is your link to the UI. You can also see this value in the `frontend_deployment` value output by `terraform apply`.
The backend is located at the address in the `backend_deployment` value in the `terraform apply` output. It should look something like "https://genai-for-marketing-xxxxxxxx.a.run.app". If you append `/marketing-api/docs` to this URL (i.e., "https://genai-for-marketing-xxxxxxxx.a.run.app/marketing-api/docs"), you can access the FastAPI interface for exploring the backend APIs.
The deployment creates all the resources described in the main README.md file. The following is a list of the created resources:
- Required Google Cloud services
- BigQuery dataset and tables (populated with sample data)
- Google Drive folder and template files
- Service Account with the required permissions
- Search engine and Chat engine with datastores
- Cloud Run for backend APIs
- Firebase for frontend deployment
Workarounds for known issues.
Note that some of the workarounds require modifying organization policies, which can only be done by a user with the `orgpolicy.policyAdmin` role. If you have a Google Cloud organization administrator, you should work with them on issues requiring organization policy changes.
Error creating service account key: googleapi: Error 400: Key creation is not allowed on this service account.
Resolution: Disable the disableServiceAccountKeyCreation organization policy in your project.
gcloud resource-manager org-policies disable-enforce constraints/iam.disableServiceAccountKeyCreation --project $(gcloud config get project)
After this, run the `terraform apply` command again. Note that fixing this may fix other errors that were raised during deployment.
After the service account is successfully created, you should consider reenabling this organization policy:
gcloud resource-manager org-policies enable-enforce constraints/iam.disableServiceAccountKeyCreation --project $(gcloud config get project)
Error setting IAM policy for cloudrun service: googleapi: Error 400: One or more users named in the policy do not belong to a permitted customer, perhaps due to an organization policy.
Resolution: Disable the iam.allowedPolicyMemberDomains organizational policy in your project.
First, create a policy file, replacing `<your_project_number>` with your project number:
# policy.yaml
name: projects/<your_project_number>/policies/iam.allowedPolicyMemberDomains
spec:
  rules:
  - allowAll: true
  inheritFromParent: true
and then apply the policy:
`gcloud org-policies set-policy policy.yaml`
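If you don't have your project number handy, you can look it up with gcloud:
gcloud projects describe $(gcloud config get project) --format="value(projectNumber)"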
After this, run the `terraform apply` command again. It may take a few minutes for the policy change to take effect; if you keep getting errors, wait a few minutes and retry. Once the error is resolved, you should re-enable this organization policy:
gcloud resource-manager org-policies delete constraints/iam.allowedPolicyMemberDomains --project $(gcloud config get project)
Error creating Database: googleapi: Error 400: Database ID '(default)' is not available in project 'your_project_id'. Please retry in 134 seconds.
Resolution: This issue typically occurs when the Firestore database has been repeatedly created and deleted. To resolve it, you can either:
- Wait: Firestore has built-in mechanisms to handle this. Wait a few minutes, and the system should automatically resolve the conflict.
- Manual Deletion (If Necessary): If the problem persists, you may need to manually delete the Firestore database through the Google Cloud Console (or from the command line, as sketched below). Please note that this will erase all data within the database.
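If you prefer the command line to the Console for that manual deletion, gcloud has a command for it. This is a sketch, and it permanently removes all data in the database, so double-check which project is active first:
# Deletes the default Firestore database in the currently configured project.
gcloud firestore databases delete --database="(default)"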
FileNotFoundError: [Errno 2] No such file or directory: PATH_TO_SOME_FILE
Resolution: When running `create_gdrive_folder.py`, make sure you are running it from the `infra` subdirectory.