In this article, we will see how to paginate through all databases present in AWS Glue.
Example
Problem Statement: Use the boto3 library in Python to paginate through all the databases from the AWS Glue Data Catalog that are created in your account.
Approach/Algorithm to solve this problem
Step 1: Import boto3 and botocore exceptions to handle exceptions.
Step 2: max_items, page_size and starting_token are parameters for this function.
max_items denotes the total number of records to return. If the number of available records > max_items, then a NextToken is provided in the response to resume pagination.
page_size denotes the size of each page.
starting_token helps to paginate, and it uses the NextToken from a previous response.
Step 3: Create an AWS session using the boto3 library. Make sure region_name is mentioned in the default profile. If it is not mentioned, then explicitly pass the region_name while creating the session, as shown in the sketch after this list.
Step 4: Create an AWS client for glue.
Step 5: Create a paginator object that contains details of all databases using get_databases.
Step 6: Call the paginate function and pass the max_items, page_size and starting_token as PaginationConfig
Step 7: It returns the number of records based on max_items and page_size.
Step 8: Handle the generic exception if something goes wrong while paginating.
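For instance, if the default profile does not define a region, the session for Step 3 and Step 4 can be created with an explicit region_name. This is a minimal sketch; the value 'us-east-1' is only illustrative.

import boto3

# Pass the region explicitly when the default profile does not define one.
# 'us-east-1' is only an illustrative value here.
session = boto3.session.Session(region_name='us-east-1')
glue_client = session.client('glue')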
Example Code
Use the following code to paginate through all the databases created in your account −
import boto3
from botocore.exceptions import ClientError

def paginate_through_databases(max_items=None, page_size=None, starting_token=None):
   # Create a session and a low-level Glue client
   session = boto3.session.Session()
   glue_client = session.client('glue')
   try:
      # Paginator for the get_databases operation
      paginator = glue_client.get_paginator('get_databases')
      response = paginator.paginate(PaginationConfig={
         'MaxItems': max_items,
         'PageSize': page_size,
         'StartingToken': starting_token}
      )
      return response
   except ClientError as e:
      raise Exception("boto3 client error in paginate_through_databases: " + e.__str__())
   except Exception as e:
      raise Exception("Unexpected error in paginate_through_databases: " + e.__str__())

a = paginate_through_databases(2, 5)
print(*a)
Output
{'DatabaseList': [ {'Name': 'aurora_glue_catalog', 'CreateTime': datetime.datetime(2020, 11, 18, 14, 24, 46, tzinfo=tzlocal())}, {'Name': 'custdb', 'CreateTime': datetime.datetime(2020, 8, 31, 20, 30, 9, tzinfo=tzlocal())}], 'NextToken': 'eyJsYXN0RXZhbHVhdGVkS2V5Ijp7IkhBU0hfS0VZIjp7InMiOiJuLjc4MjI1ODQ4NTg0MSJ9LCJSQU5HRV9LRVkiOnsicyI6ImRldjEtZnJlYV9lZGxfZ2x1ZV9kYXRhYmFzZSJ9fSwiZXhwaXJhdGlvbiI6eyJzZWNvbmRzIjoxNjE3NDUwNDQxLCJuYW5vcyI6ODcwMDAwMDB9LCJzaGFyZWRDb250ZXh0IjpmYWxzZSwidGFnQ29udGV4dCI6ZmFsc2V9', 'ResponseMetadata': {'RequestId': '3e1c4f54-d573-4ba9-9948-832273ecca02', 'HTTPStatusCode': 200, 'HTTPHeaders': {'date': 'Fri, 02 Apr 2021 11:47:21 GMT', 'content-type': 'application/x-amz-json-1.1', 'content-length': '1617', 'connection': 'keep-alive', 'x-amzn-requestid': '3e1c4f54-d573-4ba9-9948-832273ecca02'}, 'RetryAttempts': 0}}
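Since paginate returns a PageIterator, the pages can also be looped over, and the NextToken of the last page can be passed back as starting_token to resume pagination in a later call. The following is a minimal sketch built on the function defined above; the database names printed depend on your account.

pages = paginate_through_databases(2, 5)

next_token = None
for page in pages:
   # Each page carries a DatabaseList and, if more records remain, a NextToken
   for database in page.get('DatabaseList', []):
      print(database['Name'])
   next_token = page.get('NextToken')

# Resume pagination from where the previous call stopped
if next_token:
   more_pages = paginate_through_databases(2, 5, next_token)
   print(*more_pages)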