Bitbucket to GitLab
Migration planning, methods, limits & step-by-step procedures for moving from Bitbucket Server/Data Center to GitLab
Overview
Migrating from Bitbucket Server/Data Center to GitLab involves moving Git repositories, pull request history, CI/CD pipelines, and user permissions. The scope of migration depends on how much metadata (beyond raw Git data) needs to be preserved.
Bitbucket Server stores projects and repositories in a PROJECT/repo hierarchy. GitLab uses Groups and Subgroups with a similar hierarchy, so structural mapping is straightforward. The main complexity lies in migrating pull request discussions, inline comments, and CI/CD pipelines (Bamboo/Jenkins to GitLab CI).
Decide early whether you need full-fidelity migration (preserving PR history, comments, reviewers) or code-only migration (Git history + branches). Full-fidelity requires the GitLab Importer or API scripting. Code-only can be done with git clone --mirror + git push --mirror in minutes.
Migration Methods
Simplest Git Mirror
git clone --mirror from Bitbucket, git push --mirror to GitLab. Migrates all Git data: commits, branches, tags, refs.
- Does not migrate PRs, comments, or metadata
- Fastest method — suitable for bulk migration of many repos
- Can be scripted with Bitbucket and GitLab REST APIs
Built-in GitLab Importer
GitLab's built-in Bitbucket Server importer accessed via New Project → Import → Bitbucket Server.
- Imports repos, PRs (with comments), and branch info
- Requires Bitbucket personal access token
- Available on GitLab.com and self-managed
Professional Services (Congregate)
Congregate, GitLab's Professional Services migration tool, automates large-scale migrations from Bitbucket, GitHub, and other platforms to GitLab.
- Handles bulk migration of hundreds/thousands of repos
- Preserves PRs, comments, labels, and user mapping
- Requires GitLab PS engagement or Premium/Ultimate license
Custom API-Based Migration
Script the migration using both the Bitbucket REST API and GitLab REST/GraphQL API for maximum control.
- Full control over what gets migrated and how
- Can map users, transform data, handle edge cases
- Significant development effort
Method comparison
| Method | Git History | PRs/Comments | CI/CD | Effort |
|---|---|---|---|---|
| Git Mirror | Full | No | No | Low |
| GitLab Importer | Full | Yes | No | Low |
| Congregate | Full | Yes | No | Medium (PS engagement) |
| API Scripting | Full | Yes (custom) | Possible (custom) | High |
Concept Mapping
Bitbucket Server and GitLab use different terminology for similar concepts. This mapping helps plan the structural migration.
| Bitbucket Server | GitLab | Notes |
|---|---|---|
| Project | Group | Top-level container for repos |
| Repository | Project | Individual Git repository |
| Pull Request | Merge Request | Code review workflow |
| Branch Permissions | Protected Branches | Similar but different granularity |
| Personal Repo | Personal Project | User namespace ~username → username/ |
| Hooks (pre-receive) | Push Rules / Server Hooks | Server hooks require admin access |
| Bamboo/Jenkins CI | GitLab CI/CD | Requires pipeline rewrite — .gitlab-ci.yml |
| Access Keys (SSH) | Deploy Keys | Per-repo read/write SSH keys |
| Webhooks | Webhooks | Different payload format |
Limits & Constraints
GitLab repository limits
| Limit | Value | Notes |
|---|---|---|
| Max repo size (GitLab.com Free) | 5 GB | Includes Git history + LFS |
| Max repo size (self-managed) | No default limit | Configurable by admin via application_settings API |
| Max file size (push, GitLab.com) | 100 MB | Blocked at receive-pack level |
| Max file size (self-managed) | Configurable | Settings → Repository → Max file size |
| Max import file size | Configurable | Default 50 MB for file uploads; no limit for direct Git transfer |
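On self-managed instances, the push and import caps in the table can be raised through the application settings API before a bulk migration. The sketch below assumes an admin personal access token in `$ADMIN_TOKEN` and a placeholder host; it uses the `receive_max_input_size` and `max_import_size` attributes (values in MB) from GitLab's settings API — verify the attribute names against your GitLab version before relying on them.

```shell
# Sketch: raise push/import size caps on self-managed GitLab.
# $ADMIN_TOKEN and the host are placeholders; values are illustrative.
raise_size_limits() {
  curl -s --request PUT \
    -H "PRIVATE-TOKEN: $ADMIN_TOKEN" \
    "https://gitlab.example.com/api/v4/application/settings" \
    -d "receive_max_input_size=500" \
    -d "max_import_size=10240"
}
# raise_size_limits   # run once before starting large pushes/imports
```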
GitLab LFS limits
| Limit | Value | Notes |
|---|---|---|
| LFS per-file max (GitLab.com) | 5 GB | Same as repo storage pool |
| LFS storage (Free tier) | 5 GB | Shared with repo, packages, etc. |
| LFS storage (Premium) | 50 GB | Namespace-level quota |
| LFS storage (Ultimate) | 250 GB | Namespace-level quota |
| LFS storage (self-managed) | Unlimited | Limited only by disk/object storage |
Bitbucket Server export limits
| Limit | Value | Notes |
|---|---|---|
| API pagination | 25–1000 per page | Default 25; max configurable in bitbucket.properties |
| REST API rate limit | No built-in limit | Bitbucket Server has no rate limiter by default; be careful not to overwhelm it |
| Git clone (large repo) | No hard limit | Limited by JVM heap and disk I/O |
GitLab import rate limits
| Limit | Value | Notes |
|---|---|---|
| API rate limit (authenticated) | 2,000 req/min | GitLab.com default; self-managed is configurable |
| Import workers (GitLab.com) | 5 concurrent | Per user; queued imports wait |
| Import workers (self-managed) | Configurable | Sidekiq concurrency for import jobs |
| Git push size | No explicit limit | Limited by HTTP timeout and client_max_body_size in Nginx |
Bitbucket Server does not enforce API rate limits by default. If you script a bulk export, you can easily overwhelm Bitbucket with concurrent API calls. Throttle to 5–10 concurrent requests and add a short delay between pages to avoid impacting other users on the instance.
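The throttling advice above can be combined with Bitbucket's paging protocol in one helper. Bitbucket Server's paged responses carry `isLastPage` and `nextPageStart` fields; this sketch (placeholder token variable, repo-listing response shape assumed) walks every page with a delay between requests:

```shell
# Sketch: throttled pagination against the Bitbucket Server REST API.
# $BB_TOKEN is a placeholder personal access token.
bb_list_all() {
  # $1 = collection URL, e.g. "$BB_URL/rest/api/1.0/projects/KEY/repos"
  local start=0 page
  while :; do
    page=$(curl -s -H "Authorization: Bearer $BB_TOKEN" \
      "$1?limit=100&start=$start") || return 1
    echo "$page" | jq -r '.values[].slug'
    # isLastPage / nextPageStart drive Bitbucket's paging
    [ "$(echo "$page" | jq -r '.isLastPage')" = "true" ] && break
    start=$(echo "$page" | jq -r '.nextPageStart')
    sleep 0.5   # throttle so other Bitbucket users are not impacted
  done
}
# bb_list_all "https://bitbucket.example.com/rest/api/1.0/projects/MYPROJECT/repos"
```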
Git Mirror Method
The simplest migration path. Clone each Bitbucket repo as a bare mirror and push to a new GitLab project. This preserves all commits, branches, tags, and refs.
Single repo
# Clone bare mirror from Bitbucket
git clone --mirror https://bitbucket.example.com/scm/PROJECT/repo.git
# Create the project in GitLab (via API or UI)
# Then push the mirror
cd repo.git
git push --mirror https://gitlab.example.com/group/repo.git
Bulk migration script
#!/bin/bash
set -euo pipefail
# Bulk migrate all repos from a Bitbucket project to a GitLab group
BB_URL="https://bitbucket.example.com"
BB_PROJECT="MYPROJECT"
BB_TOKEN="your-bitbucket-pat"
GL_HOST="gitlab.example.com"
GL_URL="https://$GL_HOST"
GL_GROUP_ID="42"
GL_GROUP_PATH="my-group"   # URL path of the group with ID $GL_GROUP_ID
GL_TOKEN="your-gitlab-pat"
# List all repos in the Bitbucket project (raise limit or paginate for >1000)
repos=$(curl -s -H "Authorization: Bearer $BB_TOKEN" \
  "$BB_URL/rest/api/1.0/projects/$BB_PROJECT/repos?limit=1000" \
  | jq -r '.values[].slug')
for repo in $repos; do
  echo "Migrating: $repo"
  # Clone bare mirror
  git clone --mirror "$BB_URL/scm/$BB_PROJECT/$repo.git" "/tmp/$repo.git"
  # Create the GitLab project in the target group
  curl -s --request POST \
    -H "PRIVATE-TOKEN: $GL_TOKEN" \
    "$GL_URL/api/v4/projects" \
    -d "name=$repo&namespace_id=$GL_GROUP_ID" > /dev/null
  # Push the mirror, authenticating with the token
  git -C "/tmp/$repo.git" push --mirror \
    "https://oauth2:$GL_TOKEN@$GL_HOST/$GL_GROUP_PATH/$repo.git"
  # Cleanup
  rm -rf "/tmp/$repo.git"
done
Run the bulk script during a maintenance window. Set Bitbucket repos to read-only during migration to prevent commits being pushed to the old location after the mirror is taken.
GitLab Bitbucket Server Importer
GitLab includes a built-in importer for Bitbucket Server that can migrate repositories along with pull request metadata.
What it migrates
- Repository — full Git history, all branches and tags
- Pull requests — converted to merge requests, including title, description, state (open/merged/declined)
- PR comments — general and inline comments preserved with author attribution
- PR reviewers — mapped to GitLab reviewers where users match
What it does NOT migrate
- Branch permissions / protected branch rules
- Webhooks
- Repository settings (default branch must be set manually if not main/master)
- CI/CD pipelines (Bamboo/Jenkins configs)
- Git LFS objects (must be pushed separately)
- Attachments on PR comments
Setup steps
- In Bitbucket, create a Personal Access Token with PROJECT_READ and REPO_READ permissions
- In GitLab, go to New Project → Import project → Bitbucket Server
- Enter Bitbucket Server URL and the personal access token
- Select repositories to import — choose target namespace/group
- Click Import and monitor progress
The GitLab importer maps Bitbucket users to GitLab users by email address. Ensure users exist in GitLab with matching email addresses before starting the import. Unmatched users will be attributed to the importing admin user.
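It is worth verifying the email mapping before starting the import. This sketch (placeholder host and token variables; searching users by email generally requires an admin token) flags Bitbucket emails that have no matching GitLab account:

```shell
# Sketch: pre-flight check that each Bitbucket email has a GitLab user.
# $GL_URL / $GL_TOKEN are placeholders for your instance and token.
check_user() {
  local email="$1" count
  count=$(curl -s -H "PRIVATE-TOKEN: $GL_TOKEN" \
    "$GL_URL/api/v4/users?search=$email" | jq 'length') || return 1
  if [ "$count" = "0" ]; then
    echo "MISSING: $email"   # would be attributed to the importing admin
  else
    echo "OK: $email"
  fi
}
# while read -r email; do check_user "$email"; done < bitbucket-emails.txt
```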
API-Based Migration
For maximum control, use the Bitbucket Server REST API to export data and the GitLab REST API to import it. This approach is useful when you need to transform data, handle complex user mapping, or migrate metadata the built-in importer doesn't support.
Bitbucket Server API endpoints
| Resource | Endpoint |
|---|---|
| List projects | GET /rest/api/1.0/projects |
| List repos in project | GET /rest/api/1.0/projects/{key}/repos |
| List pull requests | GET /rest/api/1.0/projects/{key}/repos/{slug}/pull-requests |
| PR comments | GET /rest/api/1.0/projects/{key}/repos/{slug}/pull-requests/{id}/comments |
| PR activities | GET /rest/api/1.0/projects/{key}/repos/{slug}/pull-requests/{id}/activities |
| Branch permissions | GET /rest/branch-permissions/2.0/projects/{key}/repos/{slug}/restrictions |
| Webhooks | GET /rest/api/1.0/projects/{key}/repos/{slug}/webhooks |
GitLab API endpoints
| Resource | Endpoint |
|---|---|
| Create project | POST /api/v4/projects |
| Create merge request | POST /api/v4/projects/{id}/merge_requests |
| Add MR note | POST /api/v4/projects/{id}/merge_requests/{iid}/notes |
| Protected branches | POST /api/v4/projects/{id}/protected_branches |
| Deploy keys | POST /api/v4/projects/{id}/deploy_keys |
| Webhooks | POST /api/v4/projects/{id}/hooks |
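As a sketch of how the export and import endpoints fit together, the snippet below builds the JSON payload for one merge request with jq and shows the corresponding create call. All field values are illustrative placeholders, not data from a real pull request:

```shell
# Sketch: recreate one Bitbucket pull request as a GitLab merge request.
# Branch names, title, and description are illustrative placeholders.
payload=$(jq -n \
  --arg src "feature/login" \
  --arg dst "main" \
  --arg title "Add login flow" \
  --arg desc "Imported from Bitbucket" \
  '{source_branch: $src, target_branch: $dst, title: $title, description: $desc}')
echo "$payload"

# Then POST it (placeholder host, token, and project ID):
# curl --request POST -H "PRIVATE-TOKEN: $GL_TOKEN" \
#   -H "Content-Type: application/json" -d "$payload" \
#   "https://gitlab.example.com/api/v4/projects/123/merge_requests"
```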
LFS Considerations
Git LFS objects are not included in a standard git clone --mirror and are not migrated by the GitLab Bitbucket Server importer. They must be handled separately.
Migrating LFS objects
# Clone with LFS from Bitbucket
GIT_LFS_SKIP_SMUDGE=0 git clone https://bitbucket.example.com/scm/PROJECT/repo.git
cd repo
# Fetch all LFS objects for all refs
git lfs fetch --all origin
# Add GitLab as new remote
git remote add gitlab https://gitlab.example.com/group/repo.git
# Push all LFS objects to GitLab
git lfs push --all gitlab
# Push branches and tags (avoid --mirror from a non-bare clone,
# which would also push remote-tracking refs)
git push gitlab --all
git push gitlab --tags
On GitLab.com, LFS storage counts toward namespace storage quotas (5 GB Free, 50 GB Premium, 250 GB Ultimate). On self-managed GitLab, configure an object storage backend (S3, GCS, MinIO) for LFS to avoid filling local disk. Check total LFS usage on Bitbucket before migrating: du -sh /var/atlassian/application-data/bitbucket/shared/data/git-lfs reports the total size, and find /var/atlassian/application-data/bitbucket/shared/data/git-lfs -type f | wc -l counts the objects.
CI/CD Migration
CI/CD pipelines do not migrate automatically. Bamboo plans and Jenkins jobs must be manually converted to .gitlab-ci.yml files. This is typically the most time-consuming part of the migration.
Bamboo to GitLab CI mapping
| Bamboo Concept | GitLab CI Equivalent |
|---|---|
| Plan | Pipeline (.gitlab-ci.yml) |
| Stage | stages: keyword |
| Job | Job (within a stage) |
| Task | script: steps within a job |
| Agent | Runner |
| Agent capability | Runner tags |
| Plan variable | CI/CD variable |
| Artifact | artifacts: keyword |
| Deployment project | environment: keyword |
| Trigger (after plan) | trigger: keyword (downstream pipelines) |
Example conversion
Bamboo (conceptual)
Plan: Build & Deploy
Stage: Build
Job: Maven Build
Task: mvn clean package
Task: Archive artifact
Stage: Deploy
Job: Deploy to Staging
Task: scp target/*.jar
GitLab CI (.gitlab-ci.yml)
stages:
- build
- deploy
maven-build:
stage: build
image: maven:3.9-eclipse-temurin-21
script:
- mvn clean package
artifacts:
paths:
- target/*.jar
deploy-staging:
stage: deploy
script:
- scp target/*.jar user@staging:/app/
environment:
name: staging
If using Jenkins rather than Bamboo, the same principle applies — Jenkinsfile stages/steps map to GitLab CI stages/jobs. GitLab also offers the Jenkins integration as a transitional step, allowing GitLab to trigger existing Jenkins jobs while pipelines are being rewritten.
Migration Checklist
Pre-migration
- Inventory all Bitbucket projects and repositories — note sizes, LFS usage, and activity
- Identify which repos are active vs. archived — archive inactive repos rather than migrating them
- Create GitLab groups matching Bitbucket project structure
- Ensure all users exist in GitLab with matching email addresses for attribution
- Estimate total storage needed (Git repos + LFS objects)
- Plan maintenance window for cutover
- Choose migration method based on metadata requirements
During migration
- Set Bitbucket repos to read-only to prevent new commits
- Run migration (mirror, importer, or API scripts)
- Migrate LFS objects separately if applicable
- Verify commit counts, branch counts, and tag counts match
- Spot-check PR/MR history on critical repos
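The ref-verification step above can be automated. This sketch (placeholder URLs in the commented example) compares branch and tag counts between the Bitbucket source and the GitLab target using git ls-remote, which works against both remote URLs and local paths:

```shell
# Sketch: compare branch/tag counts between migration source and target.
verify_refs() {
  # $1 = source clone URL, $2 = target clone URL
  local sh dh st dt
  sh=$(git ls-remote --heads "$1" | wc -l | tr -d ' ')
  dh=$(git ls-remote --heads "$2" | wc -l | tr -d ' ')
  st=$(git ls-remote --tags "$1" | wc -l | tr -d ' ')
  dt=$(git ls-remote --tags "$2" | wc -l | tr -d ' ')
  echo "branches: $sh vs $dh, tags: $st vs $dt"
  [ "$sh" = "$dh" ] && [ "$st" = "$dt" ]   # nonzero exit on mismatch
}
# verify_refs https://bitbucket.example.com/scm/PROJ/repo.git \
#             https://gitlab.example.com/group/repo.git
```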
Post-migration
- Configure protected branches to match Bitbucket branch permissions
- Set up webhooks for integrations (Slack, Jira, etc.)
- Convert CI/CD pipelines to .gitlab-ci.yml
- Update any hardcoded Bitbucket URLs in documentation, scripts, and configs
- Update DNS/proxy if migrating from self-hosted Bitbucket to self-hosted GitLab
- Communicate new repo URLs to all developers
- Keep Bitbucket running in read-only mode for a transition period
- Decommission Bitbucket after confirming all teams have switched
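The protected-branches step in the checklist can be scripted with the API endpoint listed earlier. This sketch (placeholder host, token, and project ID) recreates one Bitbucket branch restriction as a GitLab protected branch; access levels 40 (Maintainer) and 30 (Developer) are GitLab's documented values, but map your actual Bitbucket permissions case by case:

```shell
# Sketch: recreate a branch restriction as a GitLab protected branch.
# $GL_URL / $GL_TOKEN are placeholders for your instance and token.
protect_branch() {
  # $1 = project ID, $2 = branch name or wildcard (e.g. "release/*")
  curl -s --request POST \
    -H "PRIVATE-TOKEN: $GL_TOKEN" \
    "$GL_URL/api/v4/projects/$1/protected_branches" \
    --data-urlencode "name=$2" \
    -d "push_access_level=40" \
    -d "merge_access_level=30"
}
# protect_branch 123 main
# protect_branch 123 "release/*"
```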
Run a dry-run migration with a small subset of repos first. This validates your process, uncovers edge cases (large files, special characters in repo names, LFS quirks), and gives you a realistic time estimate for the full migration.