GitLab Object Storage: Local Storage to S3 Migration Documentation

This document outlines the steps to modify GitLab's object storage configuration from local storage to an AWS S3 bucket. It includes creating an S3 bucket, configuring IAM roles, and updating GitLab's configuration file with specific parameters for various object types such as LFS, packages, and artifacts. Additionally, it provides migration commands and testing procedures for each object type to ensure successful migration to S3.

Modifying GitLab object storage configuration from local to an S3 bucket for all objects:

LINK:
https://docs.gitlab.com/ee/administration/object_storage.html

1. Create an AWS S3 bucket

➢ Public access is disabled
➢ Versioning is enabled
➢ Encryption is enabled
➢ Create folders inside the S3 bucket with the names below:

[ artifacts, dependency_proxy, external_diffs, lfs, packages, pages, terraform_state, uploads ]
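The folders above can also be created from the command line; a minimal sketch, assuming the bucket is named git-objects-migration (the name used later in gitlab.rb) and the AWS CLI is configured on the host:

```shell
# Assumptions: bucket name git-objects-migration; AWS CLI installed and authenticated.
BUCKET=git-objects-migration
for prefix in artifacts dependency_proxy external_diffs lfs packages pages terraform_state uploads; do
  # An S3 "folder" is just a zero-byte object whose key ends in "/".
  aws s3api put-object --bucket "$BUCKET" --key "${prefix}/"
done
```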

2. Create an AWS VPC endpoint for S3 [already configured in the internal account] -
no need to do this
3. Create an IAM role having an S3 policy and attach it to the EC2 instance [the
gitlab-opensearch-role-poc role is already attached to the live GitLab instance; the
S3 policy needs to be attached to that role]
---------------------------------------------------------------------------------------------------------------
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::<bucket_name>/*"
    }
  ]
}
----------------------------------------------------------------
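Before touching gitlab.rb, it is worth confirming from the EC2 instance that the attached role actually grants the three actions in the policy. A sketch, assuming the bucket name git-objects-migration and the AWS CLI installed on the instance:

```shell
# Assumption: bucket name git-objects-migration; run on the EC2 instance using the IAM role.
BUCKET=git-objects-migration
echo test > /tmp/s3-probe.txt
aws s3 cp /tmp/s3-probe.txt "s3://$BUCKET/uploads/s3-probe.txt"       # exercises s3:PutObject
aws s3 cp "s3://$BUCKET/uploads/s3-probe.txt" /tmp/s3-probe-back.txt  # exercises s3:GetObject
aws s3 rm "s3://$BUCKET/uploads/s3-probe.txt"                         # exercises s3:DeleteObject
```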
4. Configure the below parameters for the object storage migration from local to s3:

ssh to gitlab server


cd /etc/gitlab/
sudo vim gitlab.rb

------------------------------------------------------------------------------------------------------------------------

gitlab_rails['object_store']['enabled'] = true
gitlab_rails['object_store']['proxy_download'] = true
gitlab_rails['object_store']['connection'] = {
  'provider' => 'AWS',
  'region' => 'ap-south-1',
  'use_iam_profile' => true, # Use IAM role instead of access keys
  'path_style' => true
}
gitlab_rails['object_store']['objects']['artifacts']['bucket'] = 'git-objects-migration/artifacts'
gitlab_rails['object_store']['objects']['external_diffs']['bucket'] = 'git-objects-migration/external_diffs'
gitlab_rails['object_store']['objects']['lfs']['bucket'] = 'git-objects-migration/lfs'
gitlab_rails['object_store']['objects']['uploads']['bucket'] = 'git-objects-migration/uploads'
gitlab_rails['object_store']['objects']['terraform_state']['bucket'] = 'git-objects-migration/terraform_state'
gitlab_rails['object_store']['objects']['packages']['bucket'] = 'git-objects-migration/packages'
gitlab_rails['object_store']['objects']['dependency_proxy']['bucket'] = 'git-objects-migration/dependency_proxy'
gitlab_rails['object_store']['objects']['pages']['bucket'] = 'git-objects-migration/pages'

------------------------------------------------------------------------------------------------------------------------

5. sudo gitlab-ctl reconfigure

_________________________________________________________________________

Follow the individual steps below to migrate each object type to S3:

LFS objects:

check in db:
sudo -i -u postgres psql -d etglab
command:
SELECT count(*) AS total,
       sum(case when file_store = '1' then 1 else 0 end) AS filesystem,
       sum(case when file_store = '2' then 1 else 0 end) AS objectstg
FROM lfs_objects;
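The same check can be run non-interactively with the Omnibus psql wrapper; a sketch, using the database name from this document:

```shell
# gitlab-psql is the Omnibus wrapper around the bundled psql;
# extra arguments are passed through, so -d and -c work as in plain psql.
sudo gitlab-psql -d etglab -c "SELECT count(*) AS total,
  sum(case when file_store = '1' then 1 else 0 end) AS filesystem,
  sum(case when file_store = '2' then 1 else 0 end) AS objectstg
FROM lfs_objects;"
```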

migrate command:
sudo gitlab-rake gitlab:lfs:migrate

check on the local EC2 instance:

sudo find /var/opt/gitlab/gitlab-rails/shared/lfs-objects -type f | grep -v tmp | wc -l

How to test:

Install the git-lfs Debian package and enable it: git lfs install

Note: If the above command does not work, use the following link to install
git-lfs (https://askubuntu.com/questions/799341/how-to-install-git-lfs-on-ubuntu-16-04)
Clone the respective project, then:
git lfs track "*.iso" ["*.mp4", "*.psd"]
git add .
git commit -m "<commit message>"
git push
Then, in the GitLab UI for that project, you will see LFS enabled for the tracked
files.
Example:

---------------------------------------------------------------------------------------------------------------------------

Packages:
check in db:

sudo -i -u postgres psql -d etglab

command:
SELECT count(*) AS total,
       sum(case when file_store = '1' then 1 else 0 end) AS filesystem,
       sum(case when file_store = '2' then 1 else 0 end) AS objectstg
FROM packages_package_files;

migrate command:
sudo gitlab-rake "gitlab:packages:migrate"

check on the local EC2 instance:

sudo find /var/opt/gitlab/gitlab-rails/shared/packages -type f | grep -v tmp | wc -l

How to test:
Run any pipeline that includes a package job, then check the S3
bucket to see whether the package was stored there.
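A quick way to exercise the packages path without a pipeline is the generic packages API; a sketch, where the host, project ID 1, and <token> (a personal access token with api scope) are all placeholders:

```shell
# Placeholders: gitlab.example.com, project ID 1, <token>.
echo "hello" > hello.txt
# PUT /projects/:id/packages/generic/:package_name/:package_version/:file_name
curl --header "PRIVATE-TOKEN: <token>" \
     --upload-file hello.txt \
     "https://gitlab.example.com/api/v4/projects/1/packages/generic/test_pkg/1.0.0/hello.txt"
```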
---------------------------------------------------------------------------------------------------------------------------

Terraform state files:

check in db:
sudo -i -u postgres psql -d etglab
command:
SELECT count(*) AS total,
       sum(case when file_store = '1' then 1 else 0 end) AS filesystem,
       sum(case when file_store = '2' then 1 else 0 end) AS objectstg
FROM terraform_state_versions;

migrate command:
sudo gitlab-rake gitlab:terraform_states:migrate

check on the local EC2 instance:

sudo find /var/opt/gitlab/gitlab-rails/shared/terraform_state -type f | grep -v tmp | wc -l

How to test:
Run any new or existing pipeline and check whether the state files are stored in S3
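For a test outside CI, GitLab-managed Terraform state can be exercised directly through its HTTP backend. A sketch, assuming a Terraform config containing an empty `backend "http" {}` block; <project_id>, <state_name>, <user>, <token>, and the host are placeholders:

```shell
# Placeholders throughout; the state is pushed to GitLab, which then writes it to S3.
TF_ADDRESS="https://gitlab.example.com/api/v4/projects/<project_id>/terraform/state/<state_name>"
terraform init \
  -backend-config="address=${TF_ADDRESS}" \
  -backend-config="lock_address=${TF_ADDRESS}/lock" \
  -backend-config="unlock_address=${TF_ADDRESS}/lock" \
  -backend-config="username=<user>" \
  -backend-config="password=<token>" \
  -backend-config="lock_method=POST" \
  -backend-config="unlock_method=DELETE"
```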
---------------------------------------------------------------------------------------------------------------------------
Job artifacts:

check in db:
sudo -i -u postgres psql -d etglab
command:
SELECT count(*) AS total,
       sum(case when file_store = '1' then 1 else 0 end) AS filesystem,
       sum(case when file_store = '2' then 1 else 0 end) AS objectstg
FROM ci_job_artifacts;

migrate command:
sudo gitlab-rake gitlab:artifacts:migrate

check on the local EC2 instance:

sudo find /var/opt/gitlab/gitlab-rails/shared/artifacts -type f | grep -v tmp | wc -l

How to test:
Run any new or existing pipeline and check whether the artifact files are stored in S3
---------------------------------------------------------------------------------------------------------------------------
Uploads:

check in db:
sudo -i -u postgres psql -d etglab
command:
SELECT count(*) AS total,
       sum(case when store = '1' then 1 else 0 end) AS filesystem,
       sum(case when store = '2' then 1 else 0 end) AS objectstg
FROM uploads;

migrate command:
sudo gitlab-rake "gitlab:uploads:migrate:all"

check on the local EC2 instance:

sudo find /var/opt/gitlab/gitlab-rails/uploads -type f | grep -v tmp | wc -l

How to test:
Create a GitLab issue, upload some files to it, and check whether they are stored in S3

---------------------------------------------------------------------------------------------------------------------------
Merge request diffs:

ssh to gitlab server


cd /etc/gitlab/
sudo vim gitlab.rb
gitlab_rails['external_diffs_enabled'] = true
gitlab_rails['external_diffs_when'] = 'outdated'

migrate command:
sudo gitlab-rake gitlab:external_diffs:force_object_storage
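Unlike the other sections, no db check is given here; one possible sketch, assuming the merge_request_diffs table has a boolean stored_externally column (verify against your GitLab version's schema):

```shell
# Assumption: stored_externally marks diffs kept outside the database.
sudo -i -u postgres psql -d etglab -c \
  "SELECT count(*) AS total,
          sum(case when stored_externally then 1 else 0 end) AS external
   FROM merge_request_diffs;"
```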

NOTE:-
To keep merge request diff objects stored in the GitLab database, do not
add the configuration line below to the gitlab.rb file:
gitlab_rails['external_diffs_enabled'] = true

How to test:
Create a merge request on any branch in any project and check whether the diff is
stored in S3

---------------------------------------------------------------------------------------------------------------------------
Dependency Proxy:

check in db:
sudo -i -u postgres psql -d etglab

command:
SELECT count(*) AS total,
       sum(case when file_store = '1' then 1 else 0 end) AS filesystem,
       sum(case when file_store = '2' then 1 else 0 end) AS objectstg
FROM dependency_proxy_blobs;

SELECT count(*) AS total,
       sum(case when file_store = '1' then 1 else 0 end) AS filesystem,
       sum(case when file_store = '2' then 1 else 0 end) AS objectstg
FROM dependency_proxy_manifests;
migrate command:
sudo gitlab-rake "gitlab:dependency_proxy:migrate"

check on the local EC2 instance:

sudo find /var/opt/gitlab/gitlab-rails/shared/dependency_proxy -type f | grep -v tmp | wc -l

Disable this feature globally:


ssh to gitlab server
cd /etc/gitlab/
sudo vim gitlab.rb
gitlab_rails['dependency_proxy_enabled'] = false
sudo gitlab-ctl reconfigure
----------- We are not using this object; ignore this -----------

---------------------------------------------------------------------------------------------------------------------------

Pages content:

ssh to gitlab server


cd /etc/gitlab/
sudo vim gitlab.rb

pages_external_url "http://pages.thecollective.energy:8008/"
gitlab_pages['enable'] = true

check in db:
sudo -i -u postgres psql -d etglab

command:
SELECT count(*) AS total,
       sum(case when file_store = '1' then 1 else 0 end) AS filesystem,
       sum(case when file_store = '2' then 1 else 0 end) AS objectstg
FROM pages_deployments;

migrate command:
sudo gitlab-rake gitlab:pages:deployments:migrate_to_object_storage

check on the local EC2 instance:

sudo find /var/opt/gitlab/gitlab-rails/shared/pages -type f | grep -v tmp | wc -l

How to test:
Configure DNS:
● Point the DNS record for pages.thecollective.energy to the
secondary IP address where GitLab Pages will be hosted. This is
typically done through your DNS provider.

Update gitlab.rb with the External URL:

● In your GitLab server, locate the gitlab.rb configuration file
(commonly found at /etc/gitlab/gitlab.rb).
● Update the pages_external_url setting to match your DNS.

Go to your GitLab project, navigate to "Deploy" > "Pages", ensure that your HTML file
has been deployed, and that the GitLab Pages configuration is correct.

---------------------------------------------------------------------------------------------------------------------------
