---
title: Use Self-Hosted CodeRabbit With Bitbucket Datacenter
sidebar_label: Bitbucket Datacenter
description: Instructions to self-host CodeRabbit and integrate it with Bitbucket Datacenter.
sidebar_position: 4
---

:::note
The self-hosted option is only available for CodeRabbit Enterprise customers with 500 user seats or more. Please contact [CodeRabbit Sales](mailto:sales@coderabbit.ai) to learn more about the CodeRabbit Enterprise plan.
:::

## Create a Bitbucket User

- **Username**: Set the username to "CodeRabbit" for easier identification (optional).
- **Profile Image**: Use the CodeRabbitAI logo for the user image (optional).

## Add User to Projects

Add the CodeRabbit user to each project where you want CodeRabbit to post reviews, with permissions to:

- Post reviews
- Open pull requests

## Create a Personal Access Token for the CodeRabbit User

Generate a personal access token for the CodeRabbit user; it is added to the `.env` file as `BITBUCKET_SERVER_BOT_TOKEN`.

## Add a Webhook to Each Project

1. **Navigate to Webhook Settings**: Go to the repository settings and locate the webhooks configuration page.
2. **Configure Events**: Enable the following pull request events:
   - "Opened"
   - "Modified"
   - "Comment Added"
3. **Add Webhook URL**: Enter the URL pointing to the CodeRabbit service, followed by `/bitbucket_server_webhooks` (for example, `http://127.0.0.1:8080/bitbucket_server_webhooks`).

## Prepare a `.env` File

Create a `.env` file with the following content:

```bash
# if using OpenAI
LLM_PROVIDER=openai
LLM_TIMEOUT=360000
OPENAI_API_KEYS=
OPENAI_BASE_URL=[]
OPENAI_ORG_ID=[]
OPENAI_PROJECT_ID=[]

# if using Azure OpenAI
LLM_PROVIDER=azure-openai
LLM_TIMEOUT=360000
AZURE_OPENAI_ENDPOINT=
AZURE_OPENAI_API_KEY=
## it is recommended to use gpt-4o-mini, o3-mini, and o1 deployments.
AZURE_GPT4OMINI_DEPLOYMENT_NAME=
AZURE_O3MINI_DEPLOYMENT_NAME=
AZURE_O1_DEPLOYMENT_NAME=
# optionally, you can swap o3-mini with o1-mini
AZURE_O1MINI_DEPLOYMENT_NAME=[]

# OAuth2 Configuration (optional)
# This will use the client credentials grant flow to get an access token,
# and use that token in headers while making requests to AZURE_OPENAI_ENDPOINT.
OAUTH2_ENDPOINT=[]
OAUTH2_CLIENT_ID=[]
OAUTH2_CLIENT_SECRET=[]

# if using AWS Bedrock
LLM_PROVIDER=bedrock-anthropic
LLM_TIMEOUT=360000
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_REGION=

# System Configuration
TEMP_PATH=/cache
SELF_HOSTED=bitbucket-server
BITBUCKET_SERVER_URL=<bitbucket-server-url>/rest
BITBUCKET_SERVER_WEBHOOK_SECRET=
BITBUCKET_SERVER_BOT_TOKEN=
BITBUCKET_SERVER_BOT_USERNAME=
CODERABBIT_LICENSE_KEY=
CODERABBIT_API_KEY=
ENABLE_LEARNINGS=[true]
ENABLE_METRICS=[true]
JIRA_HOST=[]
JIRA_PAT=[]
LINEAR_PAT=[]
```

:::note

- If you are using Azure OpenAI, verify that the model deployment names are set in the `.env` file.
- Values marked with `[]` are optional and can be omitted if the feature is not needed.
- You can generate `CODERABBIT_API_KEY` from the CodeRabbit UI -> Organization Settings -> API Keys.

:::

## Pull the CodeRabbit Docker Image

Authenticate and pull the Docker image using the provided credentials file:

```bash
cat coderabbit.json | docker login -u _json_key --password-stdin us-docker.pkg.dev
docker pull us-docker.pkg.dev/coderabbitprod/self-hosted/coderabbit-agent:latest
```

### Verify the Image Is Up

You can query the `/health` endpoint to verify that the `coderabbit-agent` service is up and running.

```bash
curl 127.0.0.1:8080/health
```

## Host the Image

You can host the image on a server, serverless function, or container environment and expose port `8080`.
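Before launching the container, it can help to confirm that the required (non-optional) settings from the example `.env` file are present. The sketch below is not part of the CodeRabbit tooling; it is a minimal check you could run yourself, using the variable names from the example above:

```shell
# check_env FILE: report required settings that are missing or empty;
# prints "ok" when everything required is present. The key list below is
# taken from the example .env in this guide (values marked [] are optional
# and are therefore not checked).
check_env() {
  file="$1"
  ok=1
  for key in SELF_HOSTED BITBUCKET_SERVER_URL BITBUCKET_SERVER_BOT_TOKEN \
             BITBUCKET_SERVER_BOT_USERNAME CODERABBIT_LICENSE_KEY; do
    # Look for "KEY=value" with a non-empty value.
    if ! grep -Eq "^${key}=.+" "$file" 2>/dev/null; then
      echo "missing or empty: $key"
      ok=0
    fi
  done
  [ "$ok" -eq 1 ] && echo "ok"
}

# Demo against an intentionally incomplete file:
printf 'SELF_HOSTED=bitbucket-server\n' > /tmp/demo.env
check_env /tmp/demo.env  # reports the settings still missing
```

Running the same check against your real `.env` file before `docker run` catches configuration gaps earlier than a failed container start would.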
Run the Docker image with the equivalent command on your chosen platform, replacing `.env` with the path to your actual `.env` file:

```bash
docker run --env-file .env --publish 127.0.0.1:8080:8080 us-docker.pkg.dev/coderabbitprod/self-hosted/coderabbit-agent:latest
```
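If you manage many repositories, the webhook step above can also be scripted rather than clicked through. The sketch below assumes Bitbucket Data Center's repository webhooks REST endpoint and its pull request event keys; `BITBUCKET_URL`, `PROJECT`, `REPO`, and `ADMIN_TOKEN` are placeholders you must supply, not values from this guide:

```shell
# Webhook payload: the event keys correspond to the "Opened", "Modified",
# and "Comment Added" pull request events enabled in the webhook step above.
PAYLOAD='{
  "name": "CodeRabbit",
  "url": "http://127.0.0.1:8080/bitbucket_server_webhooks",
  "events": ["pr:opened", "pr:modified", "pr:comment:added"],
  "active": true
}'

# Only call the API when the placeholder variables are actually set.
if [ -n "${BITBUCKET_URL:-}" ] && [ -n "${ADMIN_TOKEN:-}" ]; then
  curl -X POST \
    -H "Authorization: Bearer ${ADMIN_TOKEN}" \
    -H "Content-Type: application/json" \
    -d "$PAYLOAD" \
    "${BITBUCKET_URL}/rest/api/1.0/projects/${PROJECT}/repos/${REPO}/webhooks"
else
  echo "set BITBUCKET_URL, ADMIN_TOKEN, PROJECT, and REPO first" >&2
fi
```

Loop this over your repositories to register the webhook everywhere CodeRabbit should post reviews; verify the result in each repository's webhook settings page afterwards.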