Integrations
Tunnel
Tunnel is a collaboration layer for your development process. It integrates nicely with preview environments, giving different stakeholders an easy way to leave feedback on work in progress. Check out their docs for more info.
Neon
Neon offers the first "serverless" Postgres. It is a fast and cost-effective alternative to cloud-native databases such as Amazon RDS or Google Cloud SQL, and it is especially well suited to preview environments because each environment's database can branch from existing data, such as a staging environment.
Coherence offers a no-code, low-configuration integration for using Neon databases in your Coherence environments. It automatically creates a branch for each environment and applies Coherence best practices such as secrets management for database passwords, toolbox access for a cloud shell in each environment, and CI steps such as database migrations.
The lifecycle of the integration and how environments relate to branches:
- Coherence will create a new branch based on the base_branch provided for each environment, including for the first environment we create.
- If you want to, you can change the base_branch to be based on another Coherence environment after creating one.
- Get in touch if you'd like to change how branches map to environments for your use case!
To use a Neon database:
- The version can be any Postgres version supported by Neon.
- The Coherence yml resource looks like:
resources:
  - name: db1
    engine: postgres
    version: 15
    provider: neon
    base_branch: your-neon-branch-123
    role_name: yourrolename
- Set 2 environment variables in your Preview tab: NEON_API_KEY for your Neon API key (found in your Neon settings), and NEON_PASSWORD for the database password that the role_name from the yml above can connect with.
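For example, the two variables could look like this (the values shown are placeholders for your own):
NEON_API_KEY=<your-neon-api-key>           # from your Neon account settings
NEON_PASSWORD=<password-for-yourrolename>  # the password the role_name above connects with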
That's it - now each new environment in Coherence will get its own Neon branch, and when the environment is archived its branch will be deleted.
Docker Hub
Docker Hub enforces rate limits on unauthenticated image pulls.
- On GCP, Google runs a caching proxy that hides this problem from you.
- On AWS, CodeBuild project executions use shared IPs, so by default you will quickly be rate limited when pulling public images hosted on Docker Hub in your account.
- Coherence hides the problem by pulling images with a Coherence-provided Docker Hub account that has increased rate limits. However, this still leaves you vulnerable to our account being used aggressively by other applications, or to us forgetting to pay our bills :) Therefore, we also allow you to provide your own Docker Hub credentials to be used in your build pipelines.
- To provide your Docker Hub credentials to your pipelines, you just need to set the environment variables DOCKER_USERNAME and DOCKER_PASSWORD. Like all variables, these can be set at the project-level default scope or per-environment scope.
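For example (placeholder values; a Docker Hub access token can generally be used in place of your account password):
DOCKER_USERNAME=<your-dockerhub-username>
DOCKER_PASSWORD=<your-dockerhub-password-or-access-token>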
Slack
To integrate Coherence with Slack you will first need to click on the settings tab at the top of the Coherence dashboard.
Scroll to the bottom of the page where you will see the "Integrations" section. Click on the Install button next to Slack.
Next, you will need to enter the name of your Slack workspace. Then click the Continue button.
Once Slack finds your workspace you will need to grant Coherence access to your workspace.
If everything worked correctly, you should see the "Successfully installed Slack" screen.
Finally, you will need to select which Slack channel you would like Coherence to post to on the settings page of the Coherence dashboard.
DataDog
AWS-only
Datadog is only supported on AWS at this time.
Datadog is a popular monitoring and security service. To use Datadog in your application, you will need to do the following.
First, you will need to add the following environment variables to your application:
DD_API_KEY # your datadog API key
DD_APP_KEY # your datadog application key (created in the Datadog UI)
If you use a non-default site or API URL host for your Datadog account, you can provide those values with the following variables (use the correct values for your account):
DD_HOST=https://api.us5.datadoghq.com/
DD_SITE=us5.datadoghq.com
Environment variables
We recommend you add these variables at the cloud project level in Coherence so that all environments will inherit them.
Finally, you will need to add the following to your coherence.yml:
backend:
  type: backend
  # ...
  integrations:
    datadog:
      enabled: true
If you've already installed the Datadog AWS integration into your AWS accounts, you'll also need to add install_integration: False to the datadog dictionary.
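For example, building on the snippet above, the datadog dictionary would then look like:
backend:
  type: backend
  # ...
  integrations:
    datadog:
      enabled: true
      install_integration: False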
Cypress
Cypress is a popular end-to-end and component testing tool.
To learn how to run your Cypress tests within Coherence, see:
GitHub
Our GitHub integration does the following:
- Installs the app for webhooks
- Posts check status back on each commit when we run the build
- Adds a comment on each PR with the Coherence preview URL
- Auto-creates Preview environments from GitHub pull requests. Enable "Create Preview via Pull Requests" in the Settings tab in Coherence, and when we receive a webhook from GitHub about a pull request in your repo for a branch that is not present in Coherence, we will create a new Preview environment, submit a build, and leave a comment with a link to the Preview on your pull request.
You can also integrate GitHub Actions CI/CD pipelines with Coherence by reading our docs: How to integrate GitHub Actions CI/CD pipelines.
Graphite
Coming soon.
Reflect
Reflect is an automated testing framework that can be integrated into Coherence via integration_test in your coherence.yml.
name_of_your_integration_tests:
  type: integration_test
  command: ['curl', '-X', 'POST', '-H', 'X-API-KEY: <API-KEY>', 'https://api.reflect.run/v1/suites/<suite-id>/executions']
  image: 'curlimages/curl:7.85.0'
Include your integration tests as a top-level block along with your application's services, with a type of integration_test. Replace <suite-id> with the Suite ID from the suites page and <API-KEY> with your Reflect API key.
See Reflect's official docs for more details.
Doppler
Doppler is a secrets manager platform. To integrate with Coherence, you can place the command doppler run --token=$VAR_NAME before your dev/prod/build commands in coherence.yml, and choose a different VAR_NAME for the token in each environment as appropriate. One good practice is to use a different variable in the dev and prod commands so that you can provide a unique set of secrets for workspaces and deployed environments.
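As a rough sketch only (the service name, the npm commands, and the DOPPLER_TOKEN_DEV / DOPPLER_TOKEN_PROD variable names below are placeholders to adapt to your own coherence.yml), wrapping the commands might look like:
backend:
  type: backend
  dev:
    # workspaces pull secrets with a dev-scoped Doppler token
    command: ["doppler", "run", "--token=$DOPPLER_TOKEN_DEV", "--", "npm", "run", "dev"]
  prod:
    # deployed environments use a separate token, and therefore a separate secret set
    command: ["doppler", "run", "--token=$DOPPLER_TOKEN_PROD", "--", "npm", "run", "start"]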