CalData Infrastructure Template


This repository is a copier template which can be used to quickly seed a modern data stack project. Instructions may vary depending on whether the repo is hosted on GitHub or Azure DevOps, so we make some distinctions below.

This repo consists of:

  1. A dbt project.
  2. pre-commit checks for enforcing code quality.
  3. A documentation skeleton using mkdocs-material.
  4. GitHub Actions workflows for running quality checks in continuous integration (CI).

Usage

Start with a Python environment. Install copier and poetry:

python -m pip install copier poetry

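To keep the tooling isolated, you may want to create a project-local virtual environment before installing. A minimal sketch (the `.venv` directory name is our choice):

```shell
# Create and activate an isolated environment for the template tooling
python -m venv .venv
source .venv/bin/activate
# Confirm the environment's interpreter and pip are the ones in use
python -m pip --version
```

With the environment active, the pip install command above places copier and poetry inside `.venv` rather than in your global Python.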
Create a directory into which the project will be rendered:

mkdir <your-project-name>
cd <your-project-name>
git init

Create a new project using the copier command-line tool, with...

GitHub repo

HTTPS:

copier copy https://github.com/cagov/caldata-infrastructure-template .

OR with SSH:

copier copy [email protected]:cagov/caldata-infrastructure-template.git .

Azure DevOps repo

Install Git Credential Manager (with Homebrew on macOS; on Windows it is included by default with the Git installation). Then run the following three commands:

brew install git-credential-manager

git remote add origin <Azure DevOps repo url e.g. https://[email protected]/caldata-sandbox/mdsa-test/_git/mdsa-test>

copier copy https://github.com/cagov/caldata-infrastructure-template .

This will ask you a series of questions, the answers to which will be used to populate the project.

Once the project is rendered, commit it to the git repository you initialized above:

git add .
git commit -m "Initial commit"

Finally, install the Python dependencies and commit the poetry.lock file:

poetry install
git add poetry.lock
git commit -m "Add poetry.lock"

dbt Cloud setup

For Azure DevOps repos you'll follow the instructions here.

For GitHub repos you'll follow the instructions here.

GitHub-based Snowflake setup

Projects generated from our infrastructure template need read access to the Snowflake account in order to do two things from GitHub Actions:

  1. Verify that dbt models in branches compile and pass linter checks
  2. Generate dbt docs upon merge to main.

The Terraform configurations deployed above create two service accounts for GitHub Actions: a production one for docs and a dev one for CI checks.

Add key pairs to the GitHub service accounts

This repository assumes two service accounts in Snowflake for usage with GitHub Actions. Set up key pairs for the two GitHub actions service accounts (GITHUB_ACTIONS_SVC_USER_DEV and GITHUB_ACTIONS_SVC_USER_PRD) following the instructions given here.
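Snowflake service accounts typically authenticate from CI via key pairs. A minimal sketch of generating an unencrypted PKCS#8 key pair for one of the accounts (the file names are our choice; repeat for the second account):

```shell
# Generate a 2048-bit RSA private key in unencrypted PKCS#8 format
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out rsa_key_dev.p8 -nocrypt

# Derive the matching public key to register on the Snowflake user
openssl rsa -in rsa_key_dev.p8 -pubout -out rsa_key_dev.pub
```

The public key is then attached to the service account user in Snowflake (see the linked instructions), while the private key is stored as a GitHub Actions secret in the next step.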

Set up GitHub actions secrets

In order for the service accounts to connect to your Snowflake account, you need to configure secrets in GitHub Actions. From the repository page, go to "Settings", then to "Secrets and variables", then to "Actions".

Add the following repository secrets:

| Variable | Value |
| --- | --- |
| SNOWFLAKE_ACCOUNT | new account locator |
| SNOWFLAKE_USER_DEV | GITHUB_ACTIONS_SVC_USER_DEV |
| SNOWFLAKE_USER_PRD | GITHUB_ACTIONS_SVC_USER_PRD |
| SNOWFLAKE_PRIVATE_KEY_DEV | dev service account private key |
| SNOWFLAKE_PRIVATE_KEY_PRD | prd service account private key |

Enable GitHub pages for the repository

The repository must have GitHub Pages enabled in order for the documentation site to deploy and be viewable.

  1. From the repository page, go to "Settings", then to "Pages".
  2. Under "GitHub Pages visibility" select "Private" (unless the project is public!).
  3. Under "Build and deployment" select "Deploy from a branch" and choose "gh-pages" as your branch.

Testing

Continuous integration for this template creates a new project from the template, then verifies that the pre-commit checks pass. Future versions might do additional checks (e.g., running sample dbt models or orchestration DAGs).

To run the tests locally, change directories to the parent of the template, then run

./caldata-infrastructure-template/ci/test.sh