There are more AWS SDK examples available in the AWS Doc SDK Examples GitHub repo.
The following code examples show you how to perform actions and implement common scenarios by using the AWS SDK for Python (Boto3) with DynamoDB.
Basics are code examples that show you how to perform the essential operations within a service.
Actions are code excerpts from larger programs and must be run in context. While actions show you how to call individual service functions, you can see actions in context in their related scenarios.
Scenarios are code examples that show you how to accomplish specific tasks by calling multiple functions within a service or combined with other AWS services.
Each example includes a link to the complete source code, where you can find instructions on how to set up and run the code in context.
Get started
The following code examples show how to get started using DynamoDB.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

import boto3

# Create a DynamoDB client using the default credentials and region
dynamodb = boto3.client("dynamodb")

# Initialize a paginator for the list_tables operation
paginator = dynamodb.get_paginator("list_tables")

# Create a PageIterator from the paginator
page_iterator = paginator.paginate(Limit=10)

# List the tables in the current AWS account
print("Here are the DynamoDB tables in your account:")

# Use pagination to list all tables
table_names = []
for page in page_iterator:
    for table_name in page.get("TableNames", []):
        print(f"- {table_name}")
        table_names.append(table_name)

if not table_names:
    print("You don't have any DynamoDB tables in your account.")
else:
    print(f"\nFound {len(table_names)} tables.")
For API details, see ListTables in AWS SDK for Python (Boto3) API Reference.
Basics
The following code example shows how to:
Create a table that can hold movie data.
Put, get, and update a single movie in the table.
Write movie data to the table from a sample JSON file.
Query for movies that were released in a given year.
Scan for movies that were released in a range of years.
Delete a movie from the table, then delete the table.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Create a class that encapsulates a DynamoDB table.
from decimal import Decimal from io import BytesIO import json import logging import os from pprint import pprint import requests from zipfile import ZipFile import boto3 from boto3.dynamodb.conditions import Key from botocore.exceptions import ClientError from question import Question logger = logging.getLogger(__name__) class Movies: """Encapsulates an Amazon DynamoDB table of movie data. Example data structure for a movie record in this table: { "year": 1999, "title": "For Love of the Game", "info": { "directors": ["Sam Raimi"], "release_date": "1999-09-15T00:00:00Z", "rating": 6.3, "plot": "A washed up pitcher flashes through his career.", "rank": 4987, "running_time_secs": 8220, "actors": [ "Kevin Costner", "Kelly Preston", "John C. Reilly" ] } } """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource # The table variable is set during the scenario in the call to # 'exists' if the table exists. Otherwise, it is set by 'create_table'. self.table = None def exists(self, table_name): """ Determines whether a table exists. As a side effect, stores the table in a member variable. :param table_name: The name of the table to check. :return: True when the table exists; otherwise, False. """ try: table = self.dyn_resource.Table(table_name) table.load() exists = True except ClientError as err: if err.response["Error"]["Code"] == "ResourceNotFoundException": exists = False else: logger.error( "Couldn't check for existence of %s. Here's why: %s: %s", table_name, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: self.table = table return exists def create_table(self, table_name): """ Creates an Amazon DynamoDB table that can be used to store movie data. The table uses the release year of the movie as the partition key and the title as the sort key. :param table_name: The name of the table to create. :return: The newly created table. """ try: self.table = self.dyn_resource.create_table( TableName=table_name, KeySchema=[ {"AttributeName": "year", "KeyType": "HASH"}, # Partition key {"AttributeName": "title", "KeyType": "RANGE"}, # Sort key ], AttributeDefinitions=[ {"AttributeName": "year", "AttributeType": "N"}, {"AttributeName": "title", "AttributeType": "S"}, ], BillingMode='PAY_PER_REQUEST', ) self.table.wait_until_exists() except ClientError as err: logger.error( "Couldn't create table %s. Here's why: %s: %s", table_name, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return self.table def list_tables(self): """ Lists the Amazon DynamoDB tables for the current account. :return: The list of tables. """ try: tables = [] for table in self.dyn_resource.tables.all(): print(table.name) tables.append(table) except ClientError as err: logger.error( "Couldn't list tables. Here's why: %s: %s", err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return tables def write_batch(self, movies): """ Fills an Amazon DynamoDB table with the specified data, using the Boto3 Table.batch_writer() function to put the items in the table. Inside the context manager, Table.batch_writer builds a list of requests. On exiting the context manager, Table.batch_writer starts sending batches of write requests to Amazon DynamoDB and automatically handles chunking, buffering, and retrying. :param movies: The data to put in the table. Each item must contain at least the keys required by the schema that was specified when the table was created. 
""" try: with self.table.batch_writer() as writer: for movie in movies: writer.put_item(Item=movie) except ClientError as err: logger.error( "Couldn't load data into table %s. Here's why: %s: %s", self.table.name, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise def add_movie(self, title, year, plot, rating): """ Adds a movie to the table. :param title: The title of the movie. :param year: The release year of the movie. :param plot: The plot summary of the movie. :param rating: The quality rating of the movie. """ try: self.table.put_item( Item={ "year": year, "title": title, "info": {"plot": plot, "rating": Decimal(str(rating))}, } ) except ClientError as err: logger.error( "Couldn't add movie %s to table %s. Here's why: %s: %s", title, self.table.name, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise def get_movie(self, title, year): """ Gets movie data from the table for a specific movie. :param title: The title of the movie. :param year: The release year of the movie. :return: The data about the requested movie. """ try: response = self.table.get_item(Key={"year": year, "title": title}) except ClientError as err: logger.error( "Couldn't get movie %s from table %s. Here's why: %s: %s", title, self.table.name, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return response["Item"] def update_movie(self, title, year, rating, plot): """ Updates rating and plot data for a movie in the table. :param title: The title of the movie to update. :param year: The release year of the movie to update. :param rating: The updated rating to the give the movie. :param plot: The updated plot summary to give the movie. :return: The fields that were updated, with their new values. """ try: response = self.table.update_item( Key={"year": year, "title": title}, UpdateExpression="set info.rating=:r, info.plot=:p", ExpressionAttributeValues={":r": Decimal(str(rating)), ":p": plot}, ReturnValues="UPDATED_NEW", ) except ClientError as err: logger.error( "Couldn't update movie %s in table %s. Here's why: %s: %s", title, self.table.name, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return response["Attributes"] def query_movies(self, year): """ Queries for movies that were released in the specified year. :param year: The year to query. :return: The list of movies that were released in the specified year. """ try: response = self.table.query(KeyConditionExpression=Key("year").eq(year)) except ClientError as err: logger.error( "Couldn't query for movies released in %s. Here's why: %s: %s", year, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return response["Items"] def scan_movies(self, year_range): """ Scans for movies that were released in a range of years. Uses a projection expression to return a subset of data for each movie. :param year_range: The range of years to retrieve. :return: The list of movies released in the specified years. 
""" movies = [] scan_kwargs = { "FilterExpression": Key("year").between( year_range["first"], year_range["second"] ), "ProjectionExpression": "#yr, title, info.rating", "ExpressionAttributeNames": {"#yr": "year"}, } try: done = False start_key = None while not done: if start_key: scan_kwargs["ExclusiveStartKey"] = start_key response = self.table.scan(**scan_kwargs) movies.extend(response.get("Items", [])) start_key = response.get("LastEvaluatedKey", None) done = start_key is None except ClientError as err: logger.error( "Couldn't scan for movies. Here's why: %s: %s", err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise return movies def delete_movie(self, title, year): """ Deletes a movie from the table. :param title: The title of the movie to delete. :param year: The release year of the movie to delete. """ try: self.table.delete_item(Key={"year": year, "title": title}) except ClientError as err: logger.error( "Couldn't delete movie %s. Here's why: %s: %s", title, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise def delete_table(self): """ Deletes the table. """ try: self.table.delete() self.table = None except ClientError as err: logger.error( "Couldn't delete table. Here's why: %s: %s", err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise
Create a helper function to download and extract the sample JSON file.
def get_sample_movie_data(movie_file_name): """ Gets sample movie data, either from a local file or by first downloading it from the Amazon DynamoDB developer guide. :param movie_file_name: The local file name where the movie data is stored in JSON format. :return: The movie data as a dict. """ if not os.path.isfile(movie_file_name): print(f"Downloading {movie_file_name}...") movie_content = requests.get( "https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/samples/moviedata.zip" ) movie_zip = ZipFile(BytesIO(movie_content.content)) movie_zip.extractall() try: with open(movie_file_name) as movie_file: movie_data = json.load(movie_file, parse_float=Decimal) except FileNotFoundError: print( f"File {movie_file_name} not found. You must first download the file to " "run this demo. See the README for instructions." ) raise else: # The sample file lists over 4000 movies, return only the first 250. return movie_data[:250]
Run an interactive scenario to create the table and perform actions on it.
def run_scenario(table_name, movie_file_name, dyn_resource): logging.basicConfig(level=logging.INFO, format="%(levelname)s: %(message)s") print("-" * 88) print("Welcome to the Amazon DynamoDB getting started demo.") print("-" * 88) movies = Movies(dyn_resource) movies_exists = movies.exists(table_name) if not movies_exists: print(f"\nCreating table {table_name}...") movies.create_table(table_name) print(f"\nCreated table {movies.table.name}.") my_movie = Question.ask_questions( [ Question( "title", "Enter the title of a movie you want to add to the table: " ), Question("year", "What year was it released? ", Question.is_int), Question( "rating", "On a scale of 1 - 10, how do you rate it? ", Question.is_float, Question.in_range(1, 10), ), Question("plot", "Summarize the plot for me: "), ] ) movies.add_movie(**my_movie) print(f"\nAdded '{my_movie['title']}' to '{movies.table.name}'.") print("-" * 88) movie_update = Question.ask_questions( [ Question( "rating", f"\nLet's update your movie.\nYou rated it {my_movie['rating']}, what new " f"rating would you give it? ", Question.is_float, Question.in_range(1, 10), ), Question( "plot", f"You summarized the plot as '{my_movie['plot']}'.\nWhat would you say now? ", ), ] ) my_movie.update(movie_update) updated = movies.update_movie(**my_movie) print(f"\nUpdated '{my_movie['title']}' with new attributes:") pprint(updated) print("-" * 88) if not movies_exists: movie_data = get_sample_movie_data(movie_file_name) print(f"\nReading data from '{movie_file_name}' into your table.") movies.write_batch(movie_data) print(f"\nWrote {len(movie_data)} movies into {movies.table.name}.") print("-" * 88) title = "The Lord of the Rings: The Fellowship of the Ring" if Question.ask_question( f"Let's move on...do you want to get info about '{title}'? (y/n) ", Question.is_yesno, ): movie = movies.get_movie(title, 2001) print("\nHere's what I found:") pprint(movie) print("-" * 88) ask_for_year = True while ask_for_year: release_year = Question.ask_question( f"\nLet's get a list of movies released in a given year. Enter a year between " f"1972 and 2018: ", Question.is_int, Question.in_range(1972, 2018), ) releases = movies.query_movies(release_year) if releases: print(f"There were {len(releases)} movies released in {release_year}:") for release in releases: print(f"\t{release['title']}") ask_for_year = False else: print(f"I don't know about any movies released in {release_year}!") ask_for_year = Question.ask_question( "Try another year? (y/n) ", Question.is_yesno ) print("-" * 88) years = Question.ask_questions( [ Question( "first", f"\nNow let's scan for movies released in a range of years. Enter a year: ", Question.is_int, Question.in_range(1972, 2018), ), Question( "second", "Now enter another year: ", Question.is_int, Question.in_range(1972, 2018), ), ] ) releases = movies.scan_movies(years) if releases: count = Question.ask_question( f"\nFound {len(releases)} movies. How many do you want to see? ", Question.is_int, Question.in_range(1, len(releases)), ) print(f"\nHere are your {count} movies:\n") pprint(releases[:count]) else: print( f"I don't know about any movies released between {years['first']} " f"and {years['second']}." ) print("-" * 88) if Question.ask_question( f"\nLet's remove your movie from the table. Do you want to remove " f"'{my_movie['title']}'? (y/n)", Question.is_yesno, ): movies.delete_movie(my_movie["title"], my_movie["year"]) print(f"\nRemoved '{my_movie['title']}' from the table.") print("-" * 88) if Question.ask_question(f"\nDelete the table? 
(y/n) ", Question.is_yesno): movies.delete_table() print(f"Deleted {table_name}.") else: print( "Don't forget to delete the table when you're done or you might incur " "charges on your account." ) print("\nThanks for watching!") print("-" * 88) if __name__ == "__main__": try: run_scenario( "doc-example-table-movies", "moviedata.json", boto3.resource("dynamodb") ) except Exception as e: print(f"Something went wrong with the demo! Here's what: {e}")
This scenario uses the following helper class to ask questions at a command prompt.
class Question: """ A helper class to ask questions at a command prompt and validate and convert the answers. """ def __init__(self, key, question, *validators): """ :param key: The key that is used for storing the answer in a dict, when multiple questions are asked in a set. :param question: The question to ask. :param validators: The answer is passed through the list of validators until one fails or they all pass. Validators may also convert the answer to another form, such as from a str to an int. """ self.key = key self.question = question self.validators = Question.non_empty, *validators @staticmethod def ask_questions(questions): """ Asks a set of questions and stores the answers in a dict. :param questions: The list of questions to ask. :return: A dict of answers. """ answers = {} for question in questions: answers[question.key] = Question.ask_question( question.question, *question.validators ) return answers @staticmethod def ask_question(question, *validators): """ Asks a single question and validates it against a list of validators. When an answer fails validation, the complaint is printed and the question is asked again. :param question: The question to ask. :param validators: The list of validators that the answer must pass. :return: The answer, converted to its final form by the validators. """ answer = None while answer is None: answer = input(question) for validator in validators: answer, complaint = validator(answer) if answer is None: print(complaint) break return answer @staticmethod def non_empty(answer): """ Validates that the answer is not empty. :return: The non-empty answer, or None. """ return answer if answer != "" else None, "I need an answer. Please?" @staticmethod def is_yesno(answer): """ Validates a yes/no answer. :return: True when the answer is 'y'; otherwise, False. """ return answer.lower() == "y", "" @staticmethod def is_int(answer): """ Validates that the answer can be converted to an int. :return: The int answer; otherwise, None. """ try: int_answer = int(answer) except ValueError: int_answer = None return int_answer, f"{answer} must be a valid integer." @staticmethod def is_letter(answer): """ Validates that the answer is a letter. :return The letter answer, converted to uppercase; otherwise, None. """ return ( answer.upper() if answer.isalpha() else None, f"{answer} must be a single letter.", ) @staticmethod def is_float(answer): """ Validate that the answer can be converted to a float. :return The float answer; otherwise, None. """ try: float_answer = float(answer) except ValueError: float_answer = None return float_answer, f"{answer} must be a valid float." @staticmethod def in_range(lower, upper): """ Validate that the answer is within a range. The answer must be of a type that can be compared to the lower and upper bounds. :return: The answer, if it is within the range; otherwise, None. """ def _validate(answer): return ( answer if lower <= answer <= upper else None, f"{answer} must be between {lower} and {upper}.", ) return _validate
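If you want to try the scenario without touching tables in your AWS account, one option is to point the Boto3 resource at a local DynamoDB endpoint. The following is a minimal sketch under the assumption that DynamoDB Local (or another compatible emulator) is already listening on http://localhost:8000; the table and file names are the same ones used by the scenario above, and the placeholder credentials are only there to satisfy request signing for the local endpoint.

import boto3

# Assumes DynamoDB Local is running on port 8000 (hypothetical local setup).
# Placeholder credentials and region are fine for a local endpoint.
local_resource = boto3.resource(
    "dynamodb",
    endpoint_url="http://localhost:8000",
    region_name="us-west-2",
    aws_access_key_id="dummy",
    aws_secret_access_key="dummy",
)

# Reuse the scenario's entry point with the local resource instead of the default one.
run_scenario("doc-example-table-movies", "moviedata.json", local_resource)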
For API details, see the following topics in AWS SDK for Python (Boto3) API Reference.
Actions
The following code example shows how to use BatchExecuteStatement.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository
. class PartiQLBatchWrapper: """ Encapsulates a DynamoDB resource to run PartiQL statements. """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource def run_partiql(self, statements, param_list): """ Runs a PartiQL statement. A Boto3 resource is used even though `execute_statement` is called on the underlying `client` object because the resource transforms input and output from plain old Python objects (POPOs) to the DynamoDB format. If you create the client directly, you must do these transforms yourself. :param statements: The batch of PartiQL statements. :param param_list: The batch of PartiQL parameters that are associated with each statement. This list must be in the same order as the statements. :return: The responses returned from running the statements, if any. """ try: output = self.dyn_resource.meta.client.batch_execute_statement( Statements=[ {"Statement": statement, "Parameters": params} for statement, params in zip(statements, param_list) ] ) except ClientError as err: if err.response["Error"]["Code"] == "ResourceNotFoundException": logger.error( "Couldn't execute batch of PartiQL statements because the table " "does not exist." ) else: logger.error( "Couldn't execute batch of PartiQL statements. Here's why: %s: %s", err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return output
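A minimal usage sketch for the wrapper above, assuming a table named doc-example-table-movies with a year (number) partition key and a title (string) sort key already exists; each statement gets its own parameter list, supplied in the same order as the statements.

import boto3

wrapper = PartiQLBatchWrapper(boto3.resource("dynamodb"))

# One PartiQL INSERT per item; the ? placeholders are filled from the matching parameter list.
statements = [
    "INSERT INTO \"doc-example-table-movies\" VALUE {'title': ?, 'year': ?}",
    "INSERT INTO \"doc-example-table-movies\" VALUE {'title': ?, 'year': ?}",
]
param_list = [["Example Movie One", 2020], ["Example Movie Two", 2021]]

output = wrapper.run_partiql(statements, param_list)
print(output)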
For API details, see BatchExecuteStatement in AWS SDK for Python (Boto3) API Reference.
The following code example shows how to use BatchGetItem.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository
. import decimal import json import logging import os import pprint import time import boto3 from botocore.exceptions import ClientError logger = logging.getLogger(__name__) dynamodb = boto3.resource("dynamodb") MAX_GET_SIZE = 100 # Amazon DynamoDB rejects a get batch larger than 100 items. def do_batch_get(batch_keys): """ Gets a batch of items from Amazon DynamoDB. Batches can contain keys from more than one table. When Amazon DynamoDB cannot process all items in a batch, a set of unprocessed keys is returned. This function uses an exponential backoff algorithm to retry getting the unprocessed keys until all are retrieved or the specified number of tries is reached. :param batch_keys: The set of keys to retrieve. A batch can contain at most 100 keys. Otherwise, Amazon DynamoDB returns an error. :return: The dictionary of retrieved items grouped under their respective table names. """ tries = 0 max_tries = 5 sleepy_time = 1 # Start with 1 second of sleep, then exponentially increase. retrieved = {key: [] for key in batch_keys} while tries < max_tries: response = dynamodb.batch_get_item(RequestItems=batch_keys) # Collect any retrieved items and retry unprocessed keys. for key in response.get("Responses", []): retrieved[key] += response["Responses"][key] unprocessed = response["UnprocessedKeys"] if len(unprocessed) > 0: batch_keys = unprocessed unprocessed_count = sum( [len(batch_key["Keys"]) for batch_key in batch_keys.values()] ) logger.info( "%s unprocessed keys returned. Sleep, then retry.", unprocessed_count ) tries += 1 if tries < max_tries: logger.info("Sleeping for %s seconds.", sleepy_time) time.sleep(sleepy_time) sleepy_time = min(sleepy_time * 2, 32) else: break return retrieved
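A sketch of how the helper above might be called, assuming the movie table from the other examples exists and is already populated; batch_keys maps each table name to the keys you want, and the result comes back grouped by table name.

batch_keys = {
    "doc-example-table-movies": {
        "Keys": [
            {"year": 2001, "title": "The Lord of the Rings: The Fellowship of the Ring"},
            {"year": 1999, "title": "For Love of the Game"},
        ]
    }
}

retrieved = do_batch_get(batch_keys)
for table_name, items in retrieved.items():
    print(f"Got {len(items)} items from {table_name}.")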
For API details, see BatchGetItem in AWS SDK for Python (Boto3) API Reference.
The following code example shows how to use BatchWriteItem.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository
. class Movies: """Encapsulates an Amazon DynamoDB table of movie data. Example data structure for a movie record in this table: { "year": 1999, "title": "For Love of the Game", "info": { "directors": ["Sam Raimi"], "release_date": "1999-09-15T00:00:00Z", "rating": 6.3, "plot": "A washed up pitcher flashes through his career.", "rank": 4987, "running_time_secs": 8220, "actors": [ "Kevin Costner", "Kelly Preston", "John C. Reilly" ] } } """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource # The table variable is set during the scenario in the call to # 'exists' if the table exists. Otherwise, it is set by 'create_table'. self.table = None def write_batch(self, movies): """ Fills an Amazon DynamoDB table with the specified data, using the Boto3 Table.batch_writer() function to put the items in the table. Inside the context manager, Table.batch_writer builds a list of requests. On exiting the context manager, Table.batch_writer starts sending batches of write requests to Amazon DynamoDB and automatically handles chunking, buffering, and retrying. :param movies: The data to put in the table. Each item must contain at least the keys required by the schema that was specified when the table was created. """ try: with self.table.batch_writer() as writer: for movie in movies: writer.put_item(Item=movie) except ClientError as err: logger.error( "Couldn't load data into table %s. Here's why: %s: %s", self.table.name, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise
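Because the class above is an excerpt, a little setup is needed before write_batch can run. A minimal sketch, assuming the doc-example-table-movies table already exists and uses the year/title key schema from the CreateTable example; the table handle is assigned directly since this excerpt doesn't include the exists() helper.

import boto3

movies = Movies(boto3.resource("dynamodb"))
movies.table = movies.dyn_resource.Table("doc-example-table-movies")

# Each item must include at least the key attributes (year and title).
movies.write_batch(
    [
        {"year": 2020, "title": "Example Movie One", "info": {"rating": 7}},
        {"year": 2021, "title": "Example Movie Two", "info": {"rating": 8}},
    ]
)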
For API details, see BatchWriteItem in AWS SDK for Python (Boto3) API Reference.
The following code example shows how to use CreateTable.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Create a table for storing movie data.
class Movies: """Encapsulates an Amazon DynamoDB table of movie data. Example data structure for a movie record in this table: { "year": 1999, "title": "For Love of the Game", "info": { "directors": ["Sam Raimi"], "release_date": "1999-09-15T00:00:00Z", "rating": 6.3, "plot": "A washed up pitcher flashes through his career.", "rank": 4987, "running_time_secs": 8220, "actors": [ "Kevin Costner", "Kelly Preston", "John C. Reilly" ] } } """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource # The table variable is set during the scenario in the call to # 'exists' if the table exists. Otherwise, it is set by 'create_table'. self.table = None def create_table(self, table_name): """ Creates an Amazon DynamoDB table that can be used to store movie data. The table uses the release year of the movie as the partition key and the title as the sort key. :param table_name: The name of the table to create. :return: The newly created table. """ try: self.table = self.dyn_resource.create_table( TableName=table_name, KeySchema=[ {"AttributeName": "year", "KeyType": "HASH"}, # Partition key {"AttributeName": "title", "KeyType": "RANGE"}, # Sort key ], AttributeDefinitions=[ {"AttributeName": "year", "AttributeType": "N"}, {"AttributeName": "title", "AttributeType": "S"}, ], BillingMode='PAY_PER_REQUEST', ) self.table.wait_until_exists() except ClientError as err: logger.error( "Couldn't create table %s. Here's why: %s: %s", table_name, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return self.table
For API details, see CreateTable in AWS SDK for Python (Boto3) API Reference.
The following code example shows how to use DeleteItem.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository
. class Movies: """Encapsulates an Amazon DynamoDB table of movie data. Example data structure for a movie record in this table: { "year": 1999, "title": "For Love of the Game", "info": { "directors": ["Sam Raimi"], "release_date": "1999-09-15T00:00:00Z", "rating": 6.3, "plot": "A washed up pitcher flashes through his career.", "rank": 4987, "running_time_secs": 8220, "actors": [ "Kevin Costner", "Kelly Preston", "John C. Reilly" ] } } """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource # The table variable is set during the scenario in the call to # 'exists' if the table exists. Otherwise, it is set by 'create_table'. self.table = None def delete_movie(self, title, year): """ Deletes a movie from the table. :param title: The title of the movie to delete. :param year: The release year of the movie to delete. """ try: self.table.delete_item(Key={"year": year, "title": title}) except ClientError as err: logger.error( "Couldn't delete movie %s. Here's why: %s: %s", title, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise
You can specify a condition so that an item is deleted only when it meets certain criteria.
class UpdateQueryWrapper:
    def __init__(self, table):
        self.table = table

    def delete_underrated_movie(self, title, year, rating):
        """
        Deletes a movie only if it is rated below a specified value. By using a
        condition expression in a delete operation, you can specify that an item
        is deleted only when it meets certain criteria.

        :param title: The title of the movie to delete.
        :param year: The release year of the movie to delete.
        :param rating: The rating threshold to check before deleting the movie.
        """
        try:
            self.table.delete_item(
                Key={"year": year, "title": title},
                ConditionExpression="info.rating <= :val",
                ExpressionAttributeValues={":val": Decimal(str(rating))},
            )
        except ClientError as err:
            if err.response["Error"]["Code"] == "ConditionalCheckFailedException":
                logger.warning(
                    "Didn't delete %s because its rating is greater than %s.",
                    title,
                    rating,
                )
            else:
                logger.error(
                    "Couldn't delete movie %s. Here's why: %s: %s",
                    title,
                    err.response["Error"]["Code"],
                    err.response["Error"]["Message"],
                )
            raise
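A short usage sketch for the conditional delete, assuming the movie table and an item added by the earlier examples; if the item's rating is above the threshold, the wrapper logs a warning and re-raises the ConditionalCheckFailedException.

import boto3
from botocore.exceptions import ClientError

table = boto3.resource("dynamodb").Table("doc-example-table-movies")
wrapper = UpdateQueryWrapper(table)

try:
    # Deletes the movie only if info.rating <= 5.
    wrapper.delete_underrated_movie("Example Movie One", 2020, 5)
except ClientError:
    print("The movie was not deleted because its rating is above the threshold.")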
For API details, see DeleteItem in AWS SDK for Python (Boto3) API Reference.
The following code example shows how to use DeleteTable.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository
. class Movies: """Encapsulates an Amazon DynamoDB table of movie data. Example data structure for a movie record in this table: { "year": 1999, "title": "For Love of the Game", "info": { "directors": ["Sam Raimi"], "release_date": "1999-09-15T00:00:00Z", "rating": 6.3, "plot": "A washed up pitcher flashes through his career.", "rank": 4987, "running_time_secs": 8220, "actors": [ "Kevin Costner", "Kelly Preston", "John C. Reilly" ] } } """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource # The table variable is set during the scenario in the call to # 'exists' if the table exists. Otherwise, it is set by 'create_table'. self.table = None def delete_table(self): """ Deletes the table. """ try: self.table.delete() self.table = None except ClientError as err: logger.error( "Couldn't delete table. Here's why: %s: %s", err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise
For API details, see DeleteTable in AWS SDK for Python (Boto3) API Reference.
The following code example shows how to use DescribeTable.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository
. class Movies: """Encapsulates an Amazon DynamoDB table of movie data. Example data structure for a movie record in this table: { "year": 1999, "title": "For Love of the Game", "info": { "directors": ["Sam Raimi"], "release_date": "1999-09-15T00:00:00Z", "rating": 6.3, "plot": "A washed up pitcher flashes through his career.", "rank": 4987, "running_time_secs": 8220, "actors": [ "Kevin Costner", "Kelly Preston", "John C. Reilly" ] } } """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource # The table variable is set during the scenario in the call to # 'exists' if the table exists. Otherwise, it is set by 'create_table'. self.table = None def exists(self, table_name): """ Determines whether a table exists. As a side effect, stores the table in a member variable. :param table_name: The name of the table to check. :return: True when the table exists; otherwise, False. """ try: table = self.dyn_resource.Table(table_name) table.load() exists = True except ClientError as err: if err.response["Error"]["Code"] == "ResourceNotFoundException": exists = False else: logger.error( "Couldn't check for existence of %s. Here's why: %s: %s", table_name, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: self.table = table return exists
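The Table.load() call above is backed by the DescribeTable operation. If you want the raw table description instead, for example to check the table status or approximate item count, a minimal sketch with the low-level client looks like this; the table name is just a placeholder.

import boto3

dynamodb = boto3.client("dynamodb")
response = dynamodb.describe_table(TableName="doc-example-table-movies")

# The description includes status, key schema, item count, and more.
table_info = response["Table"]
print(f"Status: {table_info['TableStatus']}")
print(f"Approximate item count: {table_info['ItemCount']}")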
For API details, see DescribeTable in AWS SDK for Python (Boto3) API Reference.
The following code example shows how to use DescribeTimeToLive.
- SDK for Python (Boto3)
Describe TTL configuration on an existing DynamoDB table using AWS SDK for Python (Boto3).
import boto3


def describe_ttl(table_name, region):
    """
    Describes the TTL configuration of an existing table in a given AWS Region.

    :param table_name: String representing the name of the table.
    :param region: AWS Region of the table - example `us-east-1`.
    :return: Time to live description.
    """
    try:
        # DescribeTimeToLive is a client-level operation, so use a client
        # rather than the resource interface.
        dynamodb = boto3.client("dynamodb", region_name=region)
        ttl_description = dynamodb.describe_time_to_live(TableName=table_name)
        print(
            f"TimeToLive for table {table_name} is status "
            f"{ttl_description['TimeToLiveDescription']['TimeToLiveStatus']}"
        )
        return ttl_description
    except Exception as e:
        print(f"Error describing table: {e}")
        raise


# Enter your own table name and AWS Region
describe_ttl("your-table-name", "us-east-1")
For API details, see DescribeTimeToLive in AWS SDK for Python (Boto3) API Reference.
The following code example shows how to use ExecuteStatement.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository
. class PartiQLWrapper: """ Encapsulates a DynamoDB resource to run PartiQL statements. """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource def run_partiql(self, statement, params): """ Runs a PartiQL statement. A Boto3 resource is used even though `execute_statement` is called on the underlying `client` object because the resource transforms input and output from plain old Python objects (POPOs) to the DynamoDB format. If you create the client directly, you must do these transforms yourself. :param statement: The PartiQL statement. :param params: The list of PartiQL parameters. These are applied to the statement in the order they are listed. :return: The items returned from the statement, if any. """ try: output = self.dyn_resource.meta.client.execute_statement( Statement=statement, Parameters=params ) except ClientError as err: if err.response["Error"]["Code"] == "ResourceNotFoundException": logger.error( "Couldn't execute PartiQL '%s' because the table does not exist.", statement, ) else: logger.error( "Couldn't execute PartiQL '%s'. Here's why: %s: %s", statement, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return output
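A usage sketch for the wrapper above, assuming the populated movie table from the other examples; the parameters fill the ? placeholders in order, and items come back as plain Python values because the wrapper goes through the resource's client.

import boto3

wrapper = PartiQLWrapper(boto3.resource("dynamodb"))

# Select a single movie by its full primary key (year and title).
output = wrapper.run_partiql(
    'SELECT title, info.rating FROM "doc-example-table-movies" WHERE year = ? AND title = ?',
    [2001, "The Lord of the Rings: The Fellowship of the Ring"],
)
for item in output.get("Items", []):
    print(item)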
For API details, see ExecuteStatement in AWS SDK for Python (Boto3) API Reference.
The following code example shows how to use GetItem.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository
. class Movies: """Encapsulates an Amazon DynamoDB table of movie data. Example data structure for a movie record in this table: { "year": 1999, "title": "For Love of the Game", "info": { "directors": ["Sam Raimi"], "release_date": "1999-09-15T00:00:00Z", "rating": 6.3, "plot": "A washed up pitcher flashes through his career.", "rank": 4987, "running_time_secs": 8220, "actors": [ "Kevin Costner", "Kelly Preston", "John C. Reilly" ] } } """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource # The table variable is set during the scenario in the call to # 'exists' if the table exists. Otherwise, it is set by 'create_table'. self.table = None def get_movie(self, title, year): """ Gets movie data from the table for a specific movie. :param title: The title of the movie. :param year: The release year of the movie. :return: The data about the requested movie. """ try: response = self.table.get_item(Key={"year": year, "title": title}) except ClientError as err: logger.error( "Couldn't get movie %s from table %s. Here's why: %s: %s", title, self.table.name, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return response["Item"]
For API details, see GetItem in AWS SDK for Python (Boto3) API Reference.
The following code example shows how to use ListTables.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository
. class Movies: """Encapsulates an Amazon DynamoDB table of movie data. Example data structure for a movie record in this table: { "year": 1999, "title": "For Love of the Game", "info": { "directors": ["Sam Raimi"], "release_date": "1999-09-15T00:00:00Z", "rating": 6.3, "plot": "A washed up pitcher flashes through his career.", "rank": 4987, "running_time_secs": 8220, "actors": [ "Kevin Costner", "Kelly Preston", "John C. Reilly" ] } } """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource # The table variable is set during the scenario in the call to # 'exists' if the table exists. Otherwise, it is set by 'create_table'. self.table = None def list_tables(self): """ Lists the Amazon DynamoDB tables for the current account. :return: The list of tables. """ try: tables = [] for table in self.dyn_resource.tables.all(): print(table.name) tables.append(table) except ClientError as err: logger.error( "Couldn't list tables. Here's why: %s: %s", err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return tables
For API details, see ListTables in AWS SDK for Python (Boto3) API Reference.
The following code example shows how to use PutItem.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository
. class Movies: """Encapsulates an Amazon DynamoDB table of movie data. Example data structure for a movie record in this table: { "year": 1999, "title": "For Love of the Game", "info": { "directors": ["Sam Raimi"], "release_date": "1999-09-15T00:00:00Z", "rating": 6.3, "plot": "A washed up pitcher flashes through his career.", "rank": 4987, "running_time_secs": 8220, "actors": [ "Kevin Costner", "Kelly Preston", "John C. Reilly" ] } } """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource # The table variable is set during the scenario in the call to # 'exists' if the table exists. Otherwise, it is set by 'create_table'. self.table = None def add_movie(self, title, year, plot, rating): """ Adds a movie to the table. :param title: The title of the movie. :param year: The release year of the movie. :param plot: The plot summary of the movie. :param rating: The quality rating of the movie. """ try: self.table.put_item( Item={ "year": year, "title": title, "info": {"plot": plot, "rating": Decimal(str(rating))}, } ) except ClientError as err: logger.error( "Couldn't add movie %s to table %s. Here's why: %s: %s", title, self.table.name, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise
For API details, see PutItem in AWS SDK for Python (Boto3) API Reference.
The following code example shows how to use Query.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Query items by using a key condition expression.
class Movies: """Encapsulates an Amazon DynamoDB table of movie data. Example data structure for a movie record in this table: { "year": 1999, "title": "For Love of the Game", "info": { "directors": ["Sam Raimi"], "release_date": "1999-09-15T00:00:00Z", "rating": 6.3, "plot": "A washed up pitcher flashes through his career.", "rank": 4987, "running_time_secs": 8220, "actors": [ "Kevin Costner", "Kelly Preston", "John C. Reilly" ] } } """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource # The table variable is set during the scenario in the call to # 'exists' if the table exists. Otherwise, it is set by 'create_table'. self.table = None def query_movies(self, year): """ Queries for movies that were released in the specified year. :param year: The year to query. :return: The list of movies that were released in the specified year. """ try: response = self.table.query(KeyConditionExpression=Key("year").eq(year)) except ClientError as err: logger.error( "Couldn't query for movies released in %s. Here's why: %s: %s", year, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return response["Items"]
Query items and project them to return a subset of data.
class UpdateQueryWrapper: def __init__(self, table): self.table = table def query_and_project_movies(self, year, title_bounds): """ Query for movies that were released in a specified year and that have titles that start within a range of letters. A projection expression is used to return a subset of data for each movie. :param year: The release year to query. :param title_bounds: The range of starting letters to query. :return: The list of movies. """ try: response = self.table.query( ProjectionExpression="#yr, title, info.genres, info.actors[0]", ExpressionAttributeNames={"#yr": "year"}, KeyConditionExpression=( Key("year").eq(year) & Key("title").between( title_bounds["first"], title_bounds["second"] ) ), ) except ClientError as err: if err.response["Error"]["Code"] == "ValidationException": logger.warning( "There's a validation error. Here's the message: %s: %s", err.response["Error"]["Code"], err.response["Error"]["Message"], ) else: logger.error( "Couldn't query for movies. Here's why: %s: %s", err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return response["Items"]
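A brief usage sketch for the projection query above, assuming the movie table is populated with the sample data; title_bounds sets the inclusive range of titles used in the sort key condition.

import boto3

table = boto3.resource("dynamodb").Table("doc-example-table-movies")
wrapper = UpdateQueryWrapper(table)

# Movies from 2001 whose titles sort between "A" and "L".
movies = wrapper.query_and_project_movies(2001, {"first": "A", "second": "L"})
for movie in movies:
    print(movie["title"])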
For API details, see Query in AWS SDK for Python (Boto3) API Reference.
The following code example shows how to use Scan.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository
. class Movies: """Encapsulates an Amazon DynamoDB table of movie data. Example data structure for a movie record in this table: { "year": 1999, "title": "For Love of the Game", "info": { "directors": ["Sam Raimi"], "release_date": "1999-09-15T00:00:00Z", "rating": 6.3, "plot": "A washed up pitcher flashes through his career.", "rank": 4987, "running_time_secs": 8220, "actors": [ "Kevin Costner", "Kelly Preston", "John C. Reilly" ] } } """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource # The table variable is set during the scenario in the call to # 'exists' if the table exists. Otherwise, it is set by 'create_table'. self.table = None def scan_movies(self, year_range): """ Scans for movies that were released in a range of years. Uses a projection expression to return a subset of data for each movie. :param year_range: The range of years to retrieve. :return: The list of movies released in the specified years. """ movies = [] scan_kwargs = { "FilterExpression": Key("year").between( year_range["first"], year_range["second"] ), "ProjectionExpression": "#yr, title, info.rating", "ExpressionAttributeNames": {"#yr": "year"}, } try: done = False start_key = None while not done: if start_key: scan_kwargs["ExclusiveStartKey"] = start_key response = self.table.scan(**scan_kwargs) movies.extend(response.get("Items", [])) start_key = response.get("LastEvaluatedKey", None) done = start_key is None except ClientError as err: logger.error( "Couldn't scan for movies. Here's why: %s: %s", err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise return movies
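A usage sketch for the paginated scan above, again assuming the populated movie table; year_range supplies the inclusive bounds used in the filter expression, and the table handle is assigned directly because this excerpt doesn't include the exists() helper.

import boto3

movies = Movies(boto3.resource("dynamodb"))
movies.table = movies.dyn_resource.Table("doc-example-table-movies")

# Scans every page of results and collects the projected attributes.
results = movies.scan_movies({"first": 1990, "second": 1999})
print(f"Found {len(results)} movies released in the 1990s.")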
For API details, see Scan in AWS SDK for Python (Boto3) API Reference.
The following code example shows how to use UpdateItem.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Update an item by using an update expression.
class Movies: """Encapsulates an Amazon DynamoDB table of movie data. Example data structure for a movie record in this table: { "year": 1999, "title": "For Love of the Game", "info": { "directors": ["Sam Raimi"], "release_date": "1999-09-15T00:00:00Z", "rating": 6.3, "plot": "A washed up pitcher flashes through his career.", "rank": 4987, "running_time_secs": 8220, "actors": [ "Kevin Costner", "Kelly Preston", "John C. Reilly" ] } } """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource # The table variable is set during the scenario in the call to # 'exists' if the table exists. Otherwise, it is set by 'create_table'. self.table = None def update_movie(self, title, year, rating, plot): """ Updates rating and plot data for a movie in the table. :param title: The title of the movie to update. :param year: The release year of the movie to update. :param rating: The updated rating to the give the movie. :param plot: The updated plot summary to give the movie. :return: The fields that were updated, with their new values. """ try: response = self.table.update_item( Key={"year": year, "title": title}, UpdateExpression="set info.rating=:r, info.plot=:p", ExpressionAttributeValues={":r": Decimal(str(rating)), ":p": plot}, ReturnValues="UPDATED_NEW", ) except ClientError as err: logger.error( "Couldn't update movie %s in table %s. Here's why: %s: %s", title, self.table.name, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return response["Attributes"]
Update an item by using an update expression that includes an arithmetic operation.
class UpdateQueryWrapper:
    def __init__(self, table):
        self.table = table

    def update_rating(self, title, year, rating_change):
        """
        Updates the quality rating of a movie in the table by using an arithmetic
        operation in the update expression. By specifying an arithmetic operation,
        you can adjust a value in a single request, rather than first getting its
        value and then setting its new value.

        :param title: The title of the movie to update.
        :param year: The release year of the movie to update.
        :param rating_change: The amount to add to the current rating for the movie.
        :return: The updated rating.
        """
        try:
            response = self.table.update_item(
                Key={"year": year, "title": title},
                UpdateExpression="set info.rating = info.rating + :val",
                ExpressionAttributeValues={":val": Decimal(str(rating_change))},
                ReturnValues="UPDATED_NEW",
            )
        except ClientError as err:
            logger.error(
                "Couldn't update movie %s in table %s. Here's why: %s: %s",
                title,
                self.table.name,
                err.response["Error"]["Code"],
                err.response["Error"]["Message"],
            )
            raise
        else:
            return response["Attributes"]
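A usage sketch for the arithmetic update above; the change can be negative to lower a rating. This assumes the movie table, an existing item with the placeholder key shown, and that info.rating is already set on that item (otherwise the update expression fails).

import boto3

table = boto3.resource("dynamodb").Table("doc-example-table-movies")
wrapper = UpdateQueryWrapper(table)

# Subtract 0.5 from the current rating in a single request.
updated = wrapper.update_rating("Example Movie One", 2020, -0.5)
print(updated)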
Update an item only when it meets certain conditions.
class UpdateQueryWrapper: def __init__(self, table): self.table = table def remove_actors(self, title, year, actor_threshold): """ Removes an actor from a movie, but only when the number of actors is greater than a specified threshold. If the movie does not list more than the threshold, no actors are removed. :param title: The title of the movie to update. :param year: The release year of the movie to update. :param actor_threshold: The threshold of actors to check. :return: The movie data after the update. """ try: response = self.table.update_item( Key={"year": year, "title": title}, UpdateExpression="remove info.actors[0]", ConditionExpression="size(info.actors) > :num", ExpressionAttributeValues={":num": actor_threshold}, ReturnValues="ALL_NEW", ) except ClientError as err: if err.response["Error"]["Code"] == "ConditionalCheckFailedException": logger.warning( "Didn't update %s because it has fewer than %s actors.", title, actor_threshold + 1, ) else: logger.error( "Couldn't update movie %s. Here's why: %s: %s", title, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return response["Attributes"]
For API details, see UpdateItem in AWS SDK for Python (Boto3) API Reference.
The following code example shows how to use UpdateTimeToLive.
- SDK for Python (Boto3)
Enable TTL on an existing DynamoDB table.
import boto3


def enable_ttl(table_name, ttl_attribute_name):
    """
    Enables TTL on a DynamoDB table for a given attribute name.
    On success, returns the response with a status code of 200; on error, raises an exception.

    :param table_name: Name of the DynamoDB table.
    :param ttl_attribute_name: The name of the TTL attribute being provided to the table.
    """
    try:
        dynamodb = boto3.client("dynamodb")

        # Enable TTL on an existing DynamoDB table
        response = dynamodb.update_time_to_live(
            TableName=table_name,
            TimeToLiveSpecification={"Enabled": True, "AttributeName": ttl_attribute_name},
        )

        # In the returned response, check for a successful status code.
        if response["ResponseMetadata"]["HTTPStatusCode"] == 200:
            print("TTL has been enabled successfully.")
        else:
            print(
                f"Failed to enable TTL, status code {response['ResponseMetadata']['HTTPStatusCode']}"
            )
        return response
    except Exception as ex:
        print("Couldn't enable TTL in table %s. Here's why: %s" % (table_name, ex))
        raise


# Enter your own table name and TTL attribute name
enable_ttl("your-table-name", "expireAt")
Disable TTL on an existing DynamoDB table.
import boto3


def disable_ttl(table_name, ttl_attribute_name):
    """
    Disables TTL on a DynamoDB table for a given attribute name.
    On success, returns the response with a status code of 200; on error, raises an exception.

    :param table_name: Name of the DynamoDB table being modified.
    :param ttl_attribute_name: The name of the TTL attribute being provided to the table.
    """
    try:
        dynamodb = boto3.client("dynamodb")

        # Disable TTL on an existing DynamoDB table
        response = dynamodb.update_time_to_live(
            TableName=table_name,
            TimeToLiveSpecification={"Enabled": False, "AttributeName": ttl_attribute_name},
        )

        # In the returned response, check for a successful status code.
        if response["ResponseMetadata"]["HTTPStatusCode"] == 200:
            print("TTL has been disabled successfully.")
        else:
            print(
                f"Failed to disable TTL, status code {response['ResponseMetadata']['HTTPStatusCode']}"
            )
    except Exception as ex:
        print("Couldn't disable TTL in table %s. Here's why: %s" % (table_name, ex))
        raise


# Enter your own table name and TTL attribute name
disable_ttl("your-table-name", "expireAt")
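DynamoDB expects the TTL attribute to hold a Unix epoch timestamp in seconds, stored as a number; items whose timestamp has passed become eligible for deletion. The following is a minimal sketch of writing an item that expires roughly a day after it is written, assuming TTL is enabled on the expireAt attribute as shown above and that the table uses a simple string partition key named pk (placeholder names).

import time

import boto3

table = boto3.resource("dynamodb").Table("your-table-name")

# Expire the item about 24 hours from now; TTL values are epoch seconds.
expire_at = int(time.time()) + 24 * 60 * 60

table.put_item(Item={"pk": "session#123", "expireAt": expire_at})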
For API details, see UpdateTimeToLive in AWS SDK for Python (Boto3) API Reference.
Scenarios
The following code example shows how to:
Create and write data to a table with both the DAX and SDK clients.
Get, query, and scan the table with both clients and compare their performance.
For more information, see Developing with the DynamoDB Accelerator Client.
- SDK for Python (Boto3)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Create a table with either the DAX or Boto3 client.
import boto3


def create_dax_table(dyn_resource=None):
    """
    Creates a DynamoDB table.

    :param dyn_resource: Either a Boto3 or DAX resource.
    :return: The newly created table.
    """
    if dyn_resource is None:
        dyn_resource = boto3.resource("dynamodb")

    table_name = "TryDaxTable"
    params = {
        "TableName": table_name,
        "KeySchema": [
            {"AttributeName": "partition_key", "KeyType": "HASH"},
            {"AttributeName": "sort_key", "KeyType": "RANGE"},
        ],
        "AttributeDefinitions": [
            {"AttributeName": "partition_key", "AttributeType": "N"},
            {"AttributeName": "sort_key", "AttributeType": "N"},
        ],
        "BillingMode": "PAY_PER_REQUEST",
    }
    table = dyn_resource.create_table(**params)
    print(f"Creating {table_name}...")
    table.wait_until_exists()
    return table


if __name__ == "__main__":
    dax_table = create_dax_table()
    print("Created table.")
Write test data to the table.
import boto3 def write_data_to_dax_table(key_count, item_size, dyn_resource=None): """ Writes test data to the demonstration table. :param key_count: The number of partition and sort keys to use to populate the table. The total number of items is key_count * key_count. :param item_size: The size of non-key data for each test item. :param dyn_resource: Either a Boto3 or DAX resource. """ if dyn_resource is None: dyn_resource = boto3.resource("dynamodb") table = dyn_resource.Table("TryDaxTable") some_data = "X" * item_size for partition_key in range(1, key_count + 1): for sort_key in range(1, key_count + 1): table.put_item( Item={ "partition_key": partition_key, "sort_key": sort_key, "some_data": some_data, } ) print(f"Put item ({partition_key}, {sort_key}) succeeded.") if __name__ == "__main__": write_key_count = 10 write_item_size = 1000 print( f"Writing {write_key_count*write_key_count} items to the table. " f"Each item is {write_item_size} characters." ) write_data_to_dax_table(write_key_count, write_item_size)
Get items for a number of iterations for both the DAX client and the Boto3 client and report the time spent for each.
import argparse import sys import time import amazondax import boto3 def get_item_test(key_count, iterations, dyn_resource=None): """ Gets items from the table a specified number of times. The time before the first iteration and the time after the last iteration are both captured and reported. :param key_count: The number of items to get from the table in each iteration. :param iterations: The number of iterations to run. :param dyn_resource: Either a Boto3 or DAX resource. :return: The start and end times of the test. """ if dyn_resource is None: dyn_resource = boto3.resource("dynamodb") table = dyn_resource.Table("TryDaxTable") start = time.perf_counter() for _ in range(iterations): for partition_key in range(1, key_count + 1): for sort_key in range(1, key_count + 1): table.get_item( Key={"partition_key": partition_key, "sort_key": sort_key} ) print(".", end="") sys.stdout.flush() print() end = time.perf_counter() return start, end if __name__ == "__main__": # pylint: disable=not-context-manager parser = argparse.ArgumentParser() parser.add_argument( "endpoint_url", nargs="?", help="When specified, the DAX cluster endpoint. Otherwise, DAX is not used.", ) args = parser.parse_args() test_key_count = 10 test_iterations = 50 if args.endpoint_url: print( f"Getting each item from the table {test_iterations} times, " f"using the DAX client." ) # Use a with statement so the DAX client closes the cluster after completion. with amazondax.AmazonDaxClient.resource(endpoint_url=args.endpoint_url) as dax: test_start, test_end = get_item_test( test_key_count, test_iterations, dyn_resource=dax ) else: print( f"Getting each item from the table {test_iterations} times, " f"using the Boto3 client." ) test_start, test_end = get_item_test(test_key_count, test_iterations) print( f"Total time: {test_end - test_start:.4f} sec. Average time: " f"{(test_end - test_start)/ test_iterations}." )
Query the table for a number of iterations for both the DAX client and the Boto3 client and report the time spent for each.
import argparse import time import sys import amazondax import boto3 from boto3.dynamodb.conditions import Key def query_test(partition_key, sort_keys, iterations, dyn_resource=None): """ Queries the table a specified number of times. The time before the first iteration and the time after the last iteration are both captured and reported. :param partition_key: The partition key value to use in the query. The query returns items that have partition keys equal to this value. :param sort_keys: The range of sort key values for the query. The query returns items that have sort key values between these two values. :param iterations: The number of iterations to run. :param dyn_resource: Either a Boto3 or DAX resource. :return: The start and end times of the test. """ if dyn_resource is None: dyn_resource = boto3.resource("dynamodb") table = dyn_resource.Table("TryDaxTable") key_condition_expression = Key("partition_key").eq(partition_key) & Key( "sort_key" ).between(*sort_keys) start = time.perf_counter() for _ in range(iterations): table.query(KeyConditionExpression=key_condition_expression) print(".", end="") sys.stdout.flush() print() end = time.perf_counter() return start, end if __name__ == "__main__": # pylint: disable=not-context-manager parser = argparse.ArgumentParser() parser.add_argument( "endpoint_url", nargs="?", help="When specified, the DAX cluster endpoint. Otherwise, DAX is not used.", ) args = parser.parse_args() test_partition_key = 5 test_sort_keys = (2, 9) test_iterations = 100 if args.endpoint_url: print(f"Querying the table {test_iterations} times, using the DAX client.") # Use a with statement so the DAX client closes the cluster after completion. with amazondax.AmazonDaxClient.resource(endpoint_url=args.endpoint_url) as dax: test_start, test_end = query_test( test_partition_key, test_sort_keys, test_iterations, dyn_resource=dax ) else: print(f"Querying the table {test_iterations} times, using the Boto3 client.") test_start, test_end = query_test( test_partition_key, test_sort_keys, test_iterations ) print( f"Total time: {test_end - test_start:.4f} sec. Average time: " f"{(test_end - test_start)/test_iterations}." )
Scan the table for a number of iterations for both the DAX client and the Boto3 client and report the time spent for each.
import argparse import time import sys import amazondax import boto3 def scan_test(iterations, dyn_resource=None): """ Scans the table a specified number of times. The time before the first iteration and the time after the last iteration are both captured and reported. :param iterations: The number of iterations to run. :param dyn_resource: Either a Boto3 or DAX resource. :return: The start and end times of the test. """ if dyn_resource is None: dyn_resource = boto3.resource("dynamodb") table = dyn_resource.Table("TryDaxTable") start = time.perf_counter() for _ in range(iterations): table.scan() print(".", end="") sys.stdout.flush() print() end = time.perf_counter() return start, end if __name__ == "__main__": # pylint: disable=not-context-manager parser = argparse.ArgumentParser() parser.add_argument( "endpoint_url", nargs="?", help="When specified, the DAX cluster endpoint. Otherwise, DAX is not used.", ) args = parser.parse_args() test_iterations = 100 if args.endpoint_url: print(f"Scanning the table {test_iterations} times, using the DAX client.") # Use a with statement so the DAX client closes the cluster after completion. with amazondax.AmazonDaxClient.resource(endpoint_url=args.endpoint_url) as dax: test_start, test_end = scan_test(test_iterations, dyn_resource=dax) else: print(f"Scanning the table {test_iterations} times, using the Boto3 client.") test_start, test_end = scan_test(test_iterations) print( f"Total time: {test_end - test_start:.4f} sec. Average time: " f"{(test_end - test_start)/test_iterations}." )
Delete the table.
import boto3 def delete_dax_table(dyn_resource=None): """ Deletes the demonstration table. :param dyn_resource: Either a Boto3 or DAX resource. """ if dyn_resource is None: dyn_resource = boto3.resource("dynamodb") table = dyn_resource.Table("TryDaxTable") table.delete() print(f"Deleting {table.name}...") table.wait_until_not_exists() if __name__ == "__main__": delete_dax_table() print("Table deleted!")
-
For API details, see the following topics in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to compare multiple values with a single attribute in DynamoDB.
Use the IN operator to compare multiple values with a single attribute.
Compare the IN operator with multiple OR conditions.
Understand the performance and expression complexity benefits of using IN.
- SDK for Python (Boto3)
-
Compare multiple values with a single attribute using AWS SDK for Python (Boto3).
import boto3 from boto3.dynamodb.conditions import Attr, Key from typing import Any, Dict, List, Optional def compare_multiple_values( table_name: str, attribute_name: str, values_list: List[Any], partition_key_name: Optional[str] = None, partition_key_value: Optional[str] = None, ) -> Dict[str, Any]: """ Query or scan a DynamoDB table to find items where an attribute matches any value from a list. This function demonstrates the use of the IN operator to compare a single attribute against multiple possible values, which is more efficient than using multiple OR conditions. Args: table_name (str): The name of the DynamoDB table. attribute_name (str): The name of the attribute to compare against the values list. values_list (List[Any]): List of values to compare the attribute against. partition_key_name (Optional[str]): The name of the partition key attribute for query operations. partition_key_value (Optional[str]): The value of the partition key to query. Returns: Dict[str, Any]: The response from DynamoDB containing the matching items. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Create the filter expression using the is_in method filter_expression = Attr(attribute_name).is_in(values_list) # If partition key is provided, perform a query operation if partition_key_name and partition_key_value: key_condition = Key(partition_key_name).eq(partition_key_value) response = table.query( KeyConditionExpression=key_condition, FilterExpression=filter_expression ) else: # Otherwise, perform a scan operation response = table.scan(FilterExpression=filter_expression) # Handle pagination if there are more results items = response.get("Items", []) while "LastEvaluatedKey" in response: if partition_key_name and partition_key_value: response = table.query( KeyConditionExpression=key_condition, FilterExpression=filter_expression, ExclusiveStartKey=response["LastEvaluatedKey"], ) else: response = table.scan( FilterExpression=filter_expression, ExclusiveStartKey=response["LastEvaluatedKey"] ) items.extend(response.get("Items", [])) # Return the complete result return {"Items": items, "Count": len(items)} def compare_with_or_conditions( table_name: str, attribute_name: str, values_list: List[Any], partition_key_name: Optional[str] = None, partition_key_value: Optional[str] = None, ) -> Dict[str, Any]: """ Alternative implementation using multiple OR conditions instead of the IN operator. This function is provided for comparison to show why using the IN operator is preferable. With many values, this approach becomes verbose and less efficient. Args: table_name (str): The name of the DynamoDB table. attribute_name (str): The name of the attribute to compare against the values list. values_list (List[Any]): List of values to compare the attribute against. partition_key_name (Optional[str]): The name of the partition key attribute for query operations. partition_key_value (Optional[str]): The value of the partition key to query. Returns: Dict[str, Any]: The response from DynamoDB containing the matching items. 
""" # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Create a filter expression with multiple OR conditions filter_expression = None for value in values_list: condition = Attr(attribute_name).eq(value) if filter_expression is None: filter_expression = condition else: filter_expression = filter_expression | condition # If partition key is provided, perform a query operation if partition_key_name and partition_key_value and filter_expression: key_condition = Key(partition_key_name).eq(partition_key_value) response = table.query( KeyConditionExpression=key_condition, FilterExpression=filter_expression ) elif filter_expression: # Otherwise, perform a scan operation response = table.scan(FilterExpression=filter_expression) else: # Return empty response if no values provided return {"Items": [], "Count": 0} # Handle pagination if there are more results items = response.get("Items", []) while "LastEvaluatedKey" in response: if partition_key_name and partition_key_value: response = table.query( KeyConditionExpression=key_condition, FilterExpression=filter_expression, ExclusiveStartKey=response["LastEvaluatedKey"], ) else: response = table.scan( FilterExpression=filter_expression, ExclusiveStartKey=response["LastEvaluatedKey"] ) items.extend(response.get("Items", [])) # Return the complete result return {"Items": items, "Count": len(items)}
Example usage of comparing multiple values with AWS SDK for Python (Boto3).
def example_usage(): """Example of how to use the compare_multiple_values function.""" # Example parameters table_name = "Products" attribute_name = "Category" values_list = ["Electronics", "Computers", "Accessories"] print(f"Searching for products in any of these categories: {values_list}") # Using the IN operator (recommended approach) print("\nApproach 1: Using the IN operator") response = compare_multiple_values( table_name=table_name, attribute_name=attribute_name, values_list=values_list ) print(f"Found {response['Count']} products in the specified categories") # Using multiple OR conditions (alternative approach) print("\nApproach 2: Using multiple OR conditions") response2 = compare_with_or_conditions( table_name=table_name, attribute_name=attribute_name, values_list=values_list ) print(f"Found {response2['Count']} products in the specified categories") # Example with a query operation print("\nQuerying a specific manufacturer's products in multiple categories") partition_key_name = "Manufacturer" partition_key_value = "Acme" response3 = compare_multiple_values( table_name=table_name, attribute_name=attribute_name, values_list=values_list, partition_key_name=partition_key_name, partition_key_value=partition_key_value, ) print(f"Found {response3['Count']} Acme products in the specified categories") # Explain the benefits of using the IN operator print("\nBenefits of using the IN operator:") print("1. More concise expression compared to multiple OR conditions") print("2. Better readability and maintainability") print("3. Potentially better performance with large value lists") print("4. Simpler code that's less prone to errors") print("5. Easier to modify when adding or removing values")
The following code example shows how to conditionally update an item's TTL.
- SDK for Python (Boto3)
-
Update the TTL on an existing DynamoDB item in a table, with a condition.
from datetime import datetime, timedelta

import boto3
from botocore.exceptions import ClientError


def update_dynamodb_item_ttl(table_name, region, primary_key, sort_key, ttl_attribute):
    """
    Updates an existing record in a DynamoDB table with a new or updated TTL attribute.

    :param table_name: Name of the DynamoDB table
    :param region: AWS Region of the table - example `us-east-1`
    :param primary_key: One attribute known as the partition key.
    :param sort_key: Also known as a range attribute.
    :param ttl_attribute: Name of the TTL attribute in the target DynamoDB table
    :return: The HTTP status code of the update, ideally a 200 OK.
    """
    try:
        dynamodb = boto3.resource("dynamodb", region_name=region)
        table = dynamodb.Table(table_name)

        # Get the current time and the new expiration time (90 days from now),
        # both in epoch second format
        current_time = int(datetime.now().timestamp())
        updated_expiration_time = int((datetime.now() + timedelta(days=90)).timestamp())

        # Define the update expression that sets the TTL attribute to the new expiration time
        update_expression = "SET #ttl = :new_expiry"

        # Define the condition expression: only update if the item's TTL has not already expired
        condition_expression = "#ttl > :now"

        response = table.update_item(
            Key={"primaryKey": primary_key, "sortKey": sort_key},
            UpdateExpression=update_expression,
            ConditionExpression=condition_expression,
            ExpressionAttributeNames={"#ttl": ttl_attribute},
            ExpressionAttributeValues={
                ":new_expiry": updated_expiration_time,
                ":now": current_time,
            },
        )
        print("Item updated successfully.")
        return response["ResponseMetadata"]["HTTPStatusCode"]  # Ideally a 200 OK
    except ClientError as e:
        if e.response["Error"]["Code"] == "ConditionalCheckFailedException":
            print("Condition check failed: the item's TTL has already expired.")
        else:
            print(f"Error updating item: {e}")
    except Exception as e:
        print(f"Error updating item: {e}")


# Replace with your values
update_dynamodb_item_ttl(
    "your-table-name",
    "us-east-1",
    "your-partition-key-value",
    "your-sort-key-value",
    "your-ttl-attribute-name",
)
-
For API details, see UpdateItem in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to count expression operators in DynamoDB.
Understand DynamoDB's 300 operator limit.
Count operators in complex expressions.
Optimize expressions to stay within limits.
- SDK for Python (Boto3)
-
Demonstrate expression operator counting using AWS SDK for Python (Boto3).
import boto3 from botocore.exceptions import ClientError from typing import Any, Dict, List, Optional, Tuple def create_complex_filter_expression( attribute_name: str, values: List[Any], use_or: bool = True ) -> Tuple[str, Dict[str, Any], Dict[str, str], int]: """ Create a complex filter expression with multiple conditions. This function demonstrates how to build a complex filter expression and count the number of operators used. Args: attribute_name (str): The name of the attribute to filter on. values (List[Any]): List of values to compare against. use_or (bool, optional): Whether to use OR between conditions. Defaults to True. Returns: Tuple[str, Dict[str, Any], Dict[str, str], int]: A tuple containing: - The filter expression string - Expression attribute values - Expression attribute names - The number of operators used """ if not values: return "", {}, {}, 0 # Initialize expression components filter_expression = "" expression_attribute_values = {} expression_attribute_names = {"#attr": attribute_name} operator_count = 0 # Build the filter expression for i, value in enumerate(values): value_placeholder = f":val{i}" expression_attribute_values[value_placeholder] = value if i > 0: # Add OR or AND operator between conditions filter_expression += " OR " if use_or else " AND " operator_count += 1 # Count the OR/AND operator # Add the condition filter_expression += f"#attr = {value_placeholder}" operator_count += 1 # Count the = operator return ( filter_expression, expression_attribute_values, expression_attribute_names, operator_count, ) def create_nested_filter_expression( depth: int, conditions_per_level: int ) -> Tuple[str, Dict[str, Any], Dict[str, str], int]: """ Create a deeply nested filter expression with multiple conditions. This function demonstrates how to build a complex nested filter expression and count the number of operators used. Args: depth (int): The depth of nesting. conditions_per_level (int): Number of conditions at each level. 
Returns: Tuple[str, Dict[str, Any], Dict[str, str], int]: A tuple containing: - The filter expression string - Expression attribute values - Expression attribute names - The number of operators used """ if depth <= 0 or conditions_per_level <= 0: return "", {}, {}, 0 # Initialize expression components expression_attribute_values = {} expression_attribute_names = {} operator_count = 0 def build_nested_expression(current_depth: int, prefix: str) -> str: nonlocal operator_count if current_depth <= 0: return "" # Build conditions at this level conditions = [] for i in range(conditions_per_level): attr_name = f"attr{prefix}_{i}" attr_placeholder = f"#attr{prefix}_{i}" val_placeholder = f":val{prefix}_{i}" expression_attribute_names[attr_placeholder] = attr_name expression_attribute_values[val_placeholder] = i conditions.append(f"{attr_placeholder} = {val_placeholder}") operator_count += 1 # Count the = operator # Join conditions with AND level_expression = " AND ".join(conditions) operator_count += max(0, len(conditions) - 1) # Count the AND operators # If not at the deepest level, add nested expressions if current_depth > 1: nested_expr = build_nested_expression(current_depth - 1, f"{prefix}_{current_depth}") if nested_expr: level_expression = f"({level_expression}) OR ({nested_expr})" operator_count += 1 # Count the OR operator return level_expression # Build the expression starting from the top level filter_expression = build_nested_expression(depth, "1") return ( filter_expression, expression_attribute_values, expression_attribute_names, operator_count, ) def count_operators_in_update_expression(update_expression: str) -> int: """ Count the number of operators in an update expression. This function demonstrates how to count operators in an update expression based on DynamoDB's rules. Args: update_expression (str): The update expression to analyze. Returns: int: The number of operators in the expression. """ operator_count = 0 # Count SET operations if "SET" in update_expression: set_section = ( update_expression.split("SET")[1].split("REMOVE")[0].split("ADD")[0].split("DELETE")[0] ) # Count assignment operators (=) operator_count += set_section.count("=") # Count arithmetic operators (+, -) operator_count += set_section.count("+") operator_count += set_section.count("-") # Count list_append function calls (each counts as 1 operator) operator_count += set_section.lower().count("list_append") # Count if_not_exists function calls (each counts as 1 operator) operator_count += set_section.lower().count("if_not_exists") # Count REMOVE operations (no additional operators) # Count ADD operations (each ADD counts as 1 operator) if "ADD" in update_expression: add_section = ( update_expression.split("ADD")[1].split("DELETE")[0].split("SET")[0].split("REMOVE")[0] ) operator_count += add_section.count(",") + 1 # Count DELETE operations (each DELETE counts as 1 operator) if "DELETE" in update_expression: delete_section = ( update_expression.split("DELETE")[1].split("SET")[0].split("ADD")[0].split("REMOVE")[0] ) operator_count += delete_section.count(",") + 1 return operator_count def count_operators_in_condition_expression(condition_expression: str) -> int: """ Count the number of operators in a condition expression. This function demonstrates how to count operators in a condition expression based on DynamoDB's rules. Args: condition_expression (str): The condition expression to analyze. Returns: int: The number of operators in the expression. 
""" operator_count = 0 # Count comparison operators comparison_operators = ["=", "<>", "<", "<=", ">", ">="] for op in comparison_operators: operator_count += condition_expression.count(op) # Count logical operators operator_count += condition_expression.upper().count(" AND ") operator_count += condition_expression.upper().count(" OR ") operator_count += condition_expression.upper().count("NOT ") # Count BETWEEN operator (counts as 2: BETWEEN + AND) between_count = condition_expression.upper().count(" BETWEEN ") operator_count += between_count * 2 # Count IN operator (counts as 1 regardless of number of values) operator_count += condition_expression.upper().count(" IN ") # Count functions (each counts as 1 operator) functions = [ "attribute_exists", "attribute_not_exists", "attribute_type", "begins_with", "contains", "size", ] for func in functions: operator_count += condition_expression.lower().count(func) return operator_count # Note: This function is for demonstration purposes only and should be called from example_usage() # It's not meant to be used directly as a test function def _test_expression_limit( table_name: str, key: Dict[str, Any], operator_count: int, attribute_name: str = "TestAttribute" ) -> Tuple[bool, Optional[str]]: """ Test if an expression with a specific number of operators exceeds the limit. This function demonstrates how to test the 300 operator limit by creating an expression with a specified number of operators. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. operator_count (int): The number of operators to include in the expression. attribute_name (str, optional): The name of the attribute to update. Defaults to "TestAttribute". Returns: Tuple[bool, Optional[str]]: A tuple containing: - A boolean indicating if the operation succeeded - The error message if it failed, None otherwise """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Create an update expression with the specified number of operators update_expression = f"SET #{attribute_name} = :val0" expression_attribute_names = {f"#{attribute_name}": attribute_name} expression_attribute_values = {":val0": 0} # Add additional SET operations to reach the desired operator count # Each assignment adds 1 operator for i in range(1, operator_count): attr_name = f"{attribute_name}{i}" attr_placeholder = f"#attr{i}" val_placeholder = f":val{i}" update_expression += f", {attr_placeholder} = {val_placeholder}" expression_attribute_names[attr_placeholder] = attr_name expression_attribute_values[val_placeholder] = i try: # Attempt the update operation table.update_item( Key=key, UpdateExpression=update_expression, ExpressionAttributeNames=expression_attribute_names, ExpressionAttributeValues=expression_attribute_values, ) return True, None except ClientError as e: error_message = e.response["Error"]["Message"] if "expression contains too many operators" in error_message.lower(): return False, error_message else: # Other error occurred raise
Example usage of expression operator counting with AWS SDK for Python (Boto3).
def example_usage(): """Example of how to use the expression operator counting functions.""" print("Example 1: Creating a complex filter expression with multiple conditions") attribute_name = "Status" values = ["Active", "Pending", "Processing", "Shipped", "Delivered"] filter_expr, expr_attr_vals, expr_attr_names, op_count = create_complex_filter_expression( attribute_name=attribute_name, values=values, use_or=True ) print(f"Filter Expression: {filter_expr}") print(f"Expression Attribute Values: {expr_attr_vals}") print(f"Expression Attribute Names: {expr_attr_names}") print(f"Operator Count: {op_count}") print("\nExample 2: Creating a nested filter expression") nested_expr, nested_vals, nested_names, nested_count = create_nested_filter_expression( depth=3, conditions_per_level=2 ) print(f"Nested Filter Expression: {nested_expr}") print(f"Operator Count: {nested_count}") print("\nExample 3: Counting operators in an update expression") update_expression = "SET #name = :name, #age = :age + :increment, #address.#city = :city, #status = if_not_exists(#status, :default_status) REMOVE #old_field ADD #counter :value DELETE #set_attr :set_val" update_op_count = count_operators_in_update_expression(update_expression) print(f"Update Expression: {update_expression}") print(f"Operator Count: {update_op_count}") print("\nExample 4: Counting operators in a condition expression") condition_expression = "(#status = :active OR #status = :pending) AND #price BETWEEN :min_price AND :max_price AND attribute_exists(#category) AND NOT (#stock <= :min_stock)" condition_op_count = count_operators_in_condition_expression(condition_expression) print(f"Condition Expression: {condition_expression}") print(f"Operator Count: {condition_op_count}") print("\nExample 5: Testing the 300 operator limit") # This is just for demonstration - in a real application, you would use your actual table # Note: This function is renamed to _test_expression_limit to avoid pytest trying to run it print("In a real application, you would test with _test_expression_limit function") print("Expression with 250 operators would be under the limit") print("Expression with 350 operators would exceed the 300 operator limit") print("\nOperator Counting Rules in DynamoDB:") print("1. Comparison Operators (=, <>, <, <=, >, >=): 1 operator each") print("2. Logical Operators (AND, OR, NOT): 1 operator each") print("3. BETWEEN: 2 operators (BETWEEN + AND)") print("4. IN: 1 operator (regardless of number of values)") print("5. Functions (attribute_exists, begins_with, etc.): 1 operator each") print("6. Arithmetic Operators (+, -): 1 operator each") print("7. SET assignments (=): 1 operator each") print("8. ADD and DELETE operations: 1 operator each") print("\nStrategies for Working Within the 300 Operator Limit:") print("1. Break operations into multiple requests") print("2. Use DynamoDB Transactions for complex operations") print("3. Optimize data model to reduce query complexity") print("4. Use application-side filtering for less critical filters") print("5. Consider using IN operator instead of multiple OR conditions")
-
For API details, see UpdateItem in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to create a REST API that simulates a system to track daily cases of COVID-19 in the United States, using fictional data.
- SDK for Python (Boto3)
-
Shows how to use AWS Chalice with the AWS SDK for Python (Boto3) to create a serverless REST API that uses Amazon API Gateway, AWS Lambda, and Amazon DynamoDB. The REST API simulates a system that tracks daily cases of COVID-19 in the United States, using fictional data. Learn how to:
Use AWS Chalice to define routes in Lambda functions that are called to handle REST requests that come through API Gateway.
Use Lambda functions to retrieve and store data in a DynamoDB table to serve REST requests.
Define table structure and security role resources in an AWS CloudFormation template.
Use AWS Chalice and CloudFormation to package and deploy all necessary resources.
Use CloudFormation to clean up all created resources.
For complete source code and instructions on how to set up and run, see the full example on GitHub. A minimal sketch of the route pattern appears after the list of services below.
Services used in this example
API Gateway
AWS CloudFormation
DynamoDB
Lambda
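The complete Chalice application is in the GitHub example. The fragment below is not that code; it is only a minimal sketch of the pattern the scenario describes, assuming a hypothetical table and route (the TABLE_NAME environment variable, the "covid-cases" default, and the "state" partition key are placeholders), to show how a Chalice route backed by Lambda can read from DynamoDB to serve a REST request.

import os
from decimal import Decimal

import boto3
from boto3.dynamodb.conditions import Key
from chalice import Chalice

app = Chalice(app_name="covid-tracker-sketch")

# In the real example the table name comes from the CloudFormation template;
# here it is read from a placeholder environment variable.
TABLE_NAME = os.environ.get("TABLE_NAME", "covid-cases")
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(TABLE_NAME)


@app.route("/states/{state}", methods=["GET"])
def get_state_cases(state):
    """Return the daily case records stored for a single state."""
    response = table.query(KeyConditionExpression=Key("state").eq(state))
    # Convert DynamoDB Decimal values so Chalice can serialize the response.
    # This assumes flat items with simple numeric or string attributes.
    return {
        "state": state,
        "cases": [
            {k: int(v) if isinstance(v, Decimal) else v for k, v in item.items()}
            for item in response.get("Items", [])
        ],
    }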
The following code example shows how to create an AWS Step Functions messenger application that retrieves message records from a database table.
- SDK for Python (Boto3)
-
Shows how to use the AWS SDK for Python (Boto3) with AWS Step Functions to create a messenger application that retrieves message records from an Amazon DynamoDB table and sends them with Amazon Simple Queue Service (Amazon SQS). The state machine integrates with an AWS Lambda function to scan the database for unsent messages.
Create a state machine that retrieves and updates message records from an Amazon DynamoDB table.
Update the state machine definition to also send messages to Amazon Simple Queue Service (Amazon SQS).
Start and stop state machine runs.
Connect to Lambda, DynamoDB, and Amazon SQS from a state machine by using service integrations.
For complete source code and instructions on how to set up and run, see the full example on GitHub. A minimal sketch of a state machine service integration appears after the list of services below.
Services used in this example
DynamoDB
Lambda
Amazon SQS
Step Functions
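The full state machine definition and the Lambda scanner live in the GitHub example. The sketch below is not that code; it only illustrates, under assumed names (the Messages table, the state machine name, and the IAM role ARN are placeholders), how a Step Functions state machine can call DynamoDB directly through a service integration and how Boto3 can create it.

import json

import boto3

# Placeholder names for illustration only.
STATE_MACHINE_NAME = "MessengerSketch"
TABLE_NAME = "Messages"
ROLE_ARN = "arn:aws:iam::111122223333:role/StepFunctionsMessengerRole"

# A minimal Amazon States Language definition with a DynamoDB service
# integration. The real scenario also scans for unsent messages with a
# Lambda function and sends them with an Amazon SQS service integration.
definition = {
    "StartAt": "GetMessage",
    "States": {
        "GetMessage": {
            "Type": "Task",
            "Resource": "arn:aws:states:::dynamodb:getItem",
            "Parameters": {
                "TableName": TABLE_NAME,
                "Key": {"user_name": {"S.$": "$.user_name"}},
            },
            "End": True,
        }
    },
}

stepfunctions = boto3.client("stepfunctions")
response = stepfunctions.create_state_machine(
    name=STATE_MACHINE_NAME,
    definition=json.dumps(definition),
    roleArn=ROLE_ARN,
)
print(f"Created state machine: {response['stateMachineArn']}")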
The following code example shows how to create a table with warm throughput enabled.
- SDK for Python (Boto3)
-
Create a DynamoDB table with a warm throughput setting using AWS SDK for Python (Boto3).
from boto3 import client from botocore.exceptions import ClientError def create_dynamodb_table_warm_throughput( table_name, partition_key, sort_key, misc_key_attr, non_key_attr, table_provisioned_read_units, table_provisioned_write_units, table_warm_reads, table_warm_writes, gsi_name, gsi_provisioned_read_units, gsi_provisioned_write_units, gsi_warm_reads, gsi_warm_writes, region_name="us-east-1", ): """ Creates a DynamoDB table with a warm throughput setting configured. :param table_name: The name of the table to be created. :param partition_key: The partition key for the table being created. :param sort_key: The sort key for the table being created. :param misc_key_attr: A miscellaneous key attribute for the table being created. :param non_key_attr: A non-key attribute for the table being created. :param table_provisioned_read_units: The newly created table's provisioned read capacity units. :param table_provisioned_write_units: The newly created table's provisioned write capacity units. :param table_warm_reads: The read units per second setting for the table's warm throughput. :param table_warm_writes: The write units per second setting for the table's warm throughput. :param gsi_name: The name of the Global Secondary Index (GSI) to be created on the table. :param gsi_provisioned_read_units: The configured Global Secondary Index (GSI) provisioned read capacity units. :param gsi_provisioned_write_units: The configured Global Secondary Index (GSI) provisioned write capacity units. :param gsi_warm_reads: The read units per second setting for the Global Secondary Index (GSI)'s warm throughput. :param gsi_warm_writes: The write units per second setting for the Global Secondary Index (GSI)'s warm throughput. :param region_name: The AWS Region name to target. 
defaults to us-east-1 """ try: ddb = client("dynamodb", region_name=region_name) # Define the table attributes attribute_definitions = [ {"AttributeName": partition_key, "AttributeType": "S"}, {"AttributeName": sort_key, "AttributeType": "S"}, {"AttributeName": misc_key_attr, "AttributeType": "N"}, ] # Define the table key schema key_schema = [ {"AttributeName": partition_key, "KeyType": "HASH"}, {"AttributeName": sort_key, "KeyType": "RANGE"}, ] # Define the provisioned throughput for the table provisioned_throughput = { "ReadCapacityUnits": table_provisioned_read_units, "WriteCapacityUnits": table_provisioned_write_units, } # Define the global secondary index gsi_key_schema = [ {"AttributeName": sort_key, "KeyType": "HASH"}, {"AttributeName": misc_key_attr, "KeyType": "RANGE"}, ] gsi_projection = {"ProjectionType": "INCLUDE", "NonKeyAttributes": [non_key_attr]} gsi_provisioned_throughput = { "ReadCapacityUnits": gsi_provisioned_read_units, "WriteCapacityUnits": gsi_provisioned_write_units, } gsi_warm_throughput = { "ReadUnitsPerSecond": gsi_warm_reads, "WriteUnitsPerSecond": gsi_warm_writes, } global_secondary_indexes = [ { "IndexName": gsi_name, "KeySchema": gsi_key_schema, "Projection": gsi_projection, "ProvisionedThroughput": gsi_provisioned_throughput, "WarmThroughput": gsi_warm_throughput, } ] # Define the warm throughput for the table warm_throughput = { "ReadUnitsPerSecond": table_warm_reads, "WriteUnitsPerSecond": table_warm_writes, } # Create the DynamoDB client and create the table response = ddb.create_table( TableName=table_name, AttributeDefinitions=attribute_definitions, KeySchema=key_schema, ProvisionedThroughput=provisioned_throughput, GlobalSecondaryIndexes=global_secondary_indexes, WarmThroughput=warm_throughput, ) print(response) return response except ClientError as e: print(f"Error creating table: {e}") raise e
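A short usage sketch for the function above. The table, index, attribute, and capacity values here are illustrative placeholders, not part of the original example; warm throughput values must meet the service minimums for your account.

create_dynamodb_table_warm_throughput(
    table_name="OrdersWarm",
    partition_key="customer_id",
    sort_key="order_id",
    misc_key_attr="order_total",
    non_key_attr="order_notes",
    table_provisioned_read_units=5,
    table_provisioned_write_units=5,
    table_warm_reads=15000,
    table_warm_writes=5000,
    gsi_name="OrderTotalIndex",
    gsi_provisioned_read_units=5,
    gsi_provisioned_write_units=5,
    gsi_warm_reads=13000,
    gsi_warm_writes=4500,
    region_name="us-east-1",
)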
-
For API details, see CreateTable in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to create a web application that tracks work items in an Amazon DynamoDB table and uses Amazon Simple Email Service (Amazon SES) to send reports.
- SDK for Python (Boto3)
-
Shows how to use the AWS SDK for Python (Boto3) to create a REST service that tracks work items in Amazon DynamoDB and emails reports by using Amazon Simple Email Service (Amazon SES). This example uses the Flask web framework to handle HTTP routing and integrates with a React webpage to present a fully functional web application.
Build a Flask REST service that integrates with AWS services.
Read, write, and update work items that are stored in a DynamoDB table.
Use Amazon SES to send email reports of work items.
For complete source code and instructions on how to set up and run, see the full example in the AWS Code Examples Repository on GitHub. A minimal sketch of the Flask route pattern appears after the list of services below.
Services used in this example
DynamoDB
Amazon SES
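The full Flask and React application is in the code examples repository. The route below is only a minimal sketch of the pattern, with assumed names (the WorkItems table and the archived attribute are placeholders), showing how a Flask endpoint can read work items from DynamoDB.

from decimal import Decimal

import boto3
from boto3.dynamodb.conditions import Attr
from flask import Flask, jsonify

app = Flask(__name__)

# "WorkItems" is a placeholder table name for this sketch.
dynamodb = boto3.resource("dynamodb")
work_items_table = dynamodb.Table("WorkItems")


@app.route("/api/items", methods=["GET"])
def list_work_items():
    """Return all work items that are not archived."""
    response = work_items_table.scan(FilterExpression=Attr("archived").eq(False))
    # Convert DynamoDB Decimal values so the payload is JSON serializable.
    items = [
        {k: float(v) if isinstance(v, Decimal) else v for k, v in item.items()}
        for item in response.get("Items", [])
    ]
    return jsonify(items)


if __name__ == "__main__":
    app.run(debug=True)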
The following code example shows how to create a chat application that is served by a websocket API built on Amazon API Gateway.
- SDK for Python (Boto3)
-
Shows how to use the AWS SDK for Python (Boto3) with Amazon API Gateway V2 to create a websocket API that integrates with AWS Lambda and Amazon DynamoDB.
Create a websocket API served by API Gateway.
Define a Lambda handler that stores connections in DynamoDB and posts messages to other chat participants.
Connect to the websocket chat application and send messages with the Websockets package.
For complete source code and instructions on how to set up and run, see the full example on GitHub. A minimal sketch of the connection handler appears after the list of services below.
Services used in this example
API Gateway
DynamoDB
Lambda
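The deployable handler and the API Gateway setup are in the GitHub example. The handler below is only a minimal sketch, assuming a hypothetical connection table whose name arrives in a "table_name" environment variable, of how a Lambda function can store and remove websocket connection IDs in DynamoDB on the $connect and $disconnect routes.

import os

import boto3

dynamodb = boto3.resource("dynamodb")
# The table name is assumed to be supplied by the deployment as an
# environment variable; "table_name" is a placeholder.
table = dynamodb.Table(os.environ["table_name"])


def lambda_handler(event, context):
    """Store or remove a websocket connection ID based on the route key."""
    request_context = event.get("requestContext", {})
    route_key = request_context.get("routeKey")
    connection_id = request_context.get("connectionId")
    if route_key is None or connection_id is None:
        return {"statusCode": 400}

    if route_key == "$connect":
        # Remember the connection so other handlers can post messages to it.
        params = event.get("queryStringParameters") or {}
        user_name = params.get("name", "guest")
        table.put_item(Item={"connection_id": connection_id, "user_name": user_name})
    elif route_key == "$disconnect":
        table.delete_item(Key={"connection_id": connection_id})

    return {"statusCode": 200}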
The following code example shows how to create an item with TTL.
- SDK for Python (Boto3)
-
from datetime import datetime, timedelta import boto3 def create_dynamodb_item(table_name, region, primary_key, sort_key): """ Creates a DynamoDB item with an attached expiry attribute. :param table_name: Table name for the boto3 resource to target when creating an item :param region: string representing the AWS region. Example: `us-east-1` :param primary_key: one attribute known as the partition key. :param sort_key: Also known as a range attribute. :return: Void (nothing) """ try: dynamodb = boto3.resource("dynamodb", region_name=region) table = dynamodb.Table(table_name) # Get the current time in epoch second format current_time = int(datetime.now().timestamp()) # Calculate the expiration time (90 days from now) in epoch second format expiration_time = int((datetime.now() + timedelta(days=90)).timestamp()) item = { "primaryKey": primary_key, "sortKey": sort_key, "creationDate": current_time, "expireAt": expiration_time, } response = table.put_item(Item=item) print("Item created successfully.") return response except Exception as e: print(f"Error creating item: {e}") raise e # Use your own values create_dynamodb_item( "your-table-name", "us-west-2", "your-partition-key-value", "your-sort-key-value" )
-
For API details, see PutItem in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to perform advanced query operations in DynamoDB.
Query tables using various filtering and condition techniques.
Implement pagination for large result sets.
Use Global Secondary Indexes for alternate access patterns.
Apply consistency controls based on application requirements.
- SDK for Python (Boto3)
-
Query with strongly consistent reads using AWS SDK for Python (Boto3).
import time import boto3 from boto3.dynamodb.conditions import Key def query_with_consistent_read( table_name, partition_key_name, partition_key_value, sort_key_name=None, sort_key_value=None, consistent_read=True, ): """ Query a DynamoDB table with the option for strongly consistent reads. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. sort_key_name (str, optional): The name of the sort key attribute. sort_key_value (str, optional): The value of the sort key to query. consistent_read (bool, optional): Whether to use strongly consistent reads. Defaults to True. Returns: dict: The response from DynamoDB containing the query results. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Build the key condition expression key_condition = Key(partition_key_name).eq(partition_key_value) if sort_key_name and sort_key_value: key_condition = key_condition & Key(sort_key_name).eq(sort_key_value) # Perform the query with the consistent read option response = table.query(KeyConditionExpression=key_condition, ConsistentRead=consistent_read) return response
Query using a Global Secondary Index with AWS SDK for Python (Boto3).
import boto3 from boto3.dynamodb.conditions import Key def query_table(table_name, partition_key_name, partition_key_value): """ Query a DynamoDB table using its primary key. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. Returns: dict: The response from DynamoDB containing the query results. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Perform the query on the table's primary key response = table.query(KeyConditionExpression=Key(partition_key_name).eq(partition_key_value)) return response def query_gsi(table_name, index_name, partition_key_name, partition_key_value): """ Query a Global Secondary Index (GSI) on a DynamoDB table. Args: table_name (str): The name of the DynamoDB table. index_name (str): The name of the Global Secondary Index. partition_key_name (str): The name of the GSI's partition key attribute. partition_key_value (str): The value of the GSI's partition key to query. Returns: dict: The response from DynamoDB containing the query results. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Perform the query on the GSI response = table.query( IndexName=index_name, KeyConditionExpression=Key(partition_key_name).eq(partition_key_value) ) return response
Query with pagination using AWS SDK for Python (Boto3).
import boto3 from boto3.dynamodb.conditions import Key def query_with_pagination( table_name, partition_key_name, partition_key_value, page_size=25, max_pages=None ): """ Query a DynamoDB table with pagination to handle large result sets. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. page_size (int, optional): The number of items to return per page. Defaults to 25. max_pages (int, optional): The maximum number of pages to retrieve. If None, retrieves all pages. Returns: list: All items retrieved from the query across all pages. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Initialize variables for pagination last_evaluated_key = None page_count = 0 all_items = [] # Paginate through the results while True: # Check if we've reached the maximum number of pages if max_pages is not None and page_count >= max_pages: break # Prepare the query parameters query_params = { "KeyConditionExpression": Key(partition_key_name).eq(partition_key_value), "Limit": page_size, } # Add the ExclusiveStartKey if we have a LastEvaluatedKey from a previous query if last_evaluated_key: query_params["ExclusiveStartKey"] = last_evaluated_key # Execute the query response = table.query(**query_params) # Process the current page of results items = response.get("Items", []) all_items.extend(items) # Update pagination tracking page_count += 1 # Get the LastEvaluatedKey for the next page, if any last_evaluated_key = response.get("LastEvaluatedKey") # If there's no LastEvaluatedKey, we've reached the end of the results if not last_evaluated_key: break return all_items def query_with_pagination_generator( table_name, partition_key_name, partition_key_value, page_size=25 ): """ Query a DynamoDB table with pagination using a generator to handle large result sets. This approach is memory-efficient as it yields one page at a time. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. page_size (int, optional): The number of items to return per page. Defaults to 25. 
Yields: tuple: A tuple containing (items, page_number, last_page) where: - items is a list of items for the current page - page_number is the current page number (starting from 1) - last_page is a boolean indicating if this is the last page """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Initialize variables for pagination last_evaluated_key = None page_number = 0 # Paginate through the results while True: # Prepare the query parameters query_params = { "KeyConditionExpression": Key(partition_key_name).eq(partition_key_value), "Limit": page_size, } # Add the ExclusiveStartKey if we have a LastEvaluatedKey from a previous query if last_evaluated_key: query_params["ExclusiveStartKey"] = last_evaluated_key # Execute the query response = table.query(**query_params) # Get the current page of results items = response.get("Items", []) page_number += 1 # Get the LastEvaluatedKey for the next page, if any last_evaluated_key = response.get("LastEvaluatedKey") # Determine if this is the last page is_last_page = last_evaluated_key is None # Yield the current page of results yield (items, page_number, is_last_page) # If there's no LastEvaluatedKey, we've reached the end of the results if is_last_page: break
Query with complex filters using AWS SDK for Python (Boto3).
import boto3
from boto3.dynamodb.conditions import Attr, Key


def query_with_complex_filter(
    table_name,
    partition_key_name,
    partition_key_value,
    min_rating=None,
    status_list=None,
    max_price=None,
):
    """
    Query a DynamoDB table with a complex filter expression.

    Args:
        table_name (str): The name of the DynamoDB table.
        partition_key_name (str): The name of the partition key attribute.
        partition_key_value (str): The value of the partition key to query.
        min_rating (float, optional): Minimum rating value for filtering.
        status_list (list, optional): List of status values to include.
        max_price (float, optional): Maximum price value for filtering.

    Returns:
        dict: The response from DynamoDB containing the query results.
    """
    # Initialize the DynamoDB resource
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table(table_name)

    # Start with the key condition expression
    key_condition = Key(partition_key_name).eq(partition_key_value)

    # Build the filter expression from the provided parameters. Boto3 condition
    # objects generate their own expression attribute values, so no separate
    # ExpressionAttributeValues dictionary is needed.
    filter_expression = None

    if min_rating is not None:
        filter_expression = Attr("rating").gte(min_rating)

    if status_list:
        status_condition = None
        for status in status_list:
            if status_condition is None:
                status_condition = Attr("status").eq(status)
            else:
                status_condition = status_condition | Attr("status").eq(status)

        if filter_expression is None:
            filter_expression = status_condition
        else:
            filter_expression = filter_expression & status_condition

    if max_price is not None:
        price_condition = Attr("price").lte(max_price)
        if filter_expression is None:
            filter_expression = price_condition
        else:
            filter_expression = filter_expression & price_condition

    # Prepare the query parameters
    query_params = {"KeyConditionExpression": key_condition}
    if filter_expression is not None:
        query_params["FilterExpression"] = filter_expression

    # Execute the query
    response = table.query(**query_params)

    return response


def query_with_complex_filter_and_or(
    table_name,
    partition_key_name,
    partition_key_value,
    category=None,
    min_rating=None,
    max_price=None,
):
    """
    Query a DynamoDB table with a complex filter expression using AND and OR operators.

    Args:
        table_name (str): The name of the DynamoDB table.
        partition_key_name (str): The name of the partition key attribute.
        partition_key_value (str): The value of the partition key to query.
        category (str, optional): Category value for filtering.
        min_rating (float, optional): Minimum rating value for filtering.
        max_price (float, optional): Maximum price value for filtering.

    Returns:
        dict: The response from DynamoDB containing the query results.
    """
    # Initialize the DynamoDB resource
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table(table_name)

    # Start with the key condition expression
    key_condition = Key(partition_key_name).eq(partition_key_value)

    # Build a complex filter expression with AND and OR operators. The boto3
    # condition objects generate the expression attribute values automatically.
    filter_expression = None

    # Build the category condition
    if category:
        filter_expression = Attr("category").eq(category)

    # Build the rating and price condition (rating >= min_rating OR price <= max_price)
    rating_price_condition = None
    if min_rating is not None:
        rating_price_condition = Attr("rating").gte(min_rating)

    if max_price is not None:
        price_condition = Attr("price").lte(max_price)
        if rating_price_condition is None:
            rating_price_condition = price_condition
        else:
            rating_price_condition = rating_price_condition | price_condition

    # Combine the conditions
    if rating_price_condition is not None:
        if filter_expression is None:
            filter_expression = rating_price_condition
        else:
            filter_expression = filter_expression & rating_price_condition

    # Prepare the query parameters
    query_params = {"KeyConditionExpression": key_condition}
    if filter_expression is not None:
        query_params["FilterExpression"] = filter_expression

    # Execute the query
    response = table.query(**query_params)

    return response
Query with a dynamically constructed filter expression using AWS SDK for Python (Boto3).
import boto3 from boto3.dynamodb.conditions import Attr, Key def query_with_dynamic_filter( table_name, partition_key_name, partition_key_value, filter_conditions=None ): """ Query a DynamoDB table with a dynamically constructed filter expression. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. filter_conditions (dict, optional): A dictionary of filter conditions where keys are attribute names and values are dictionaries with 'operator' and 'value'. Example: {'rating': {'operator': '>=', 'value': 4}, 'status': {'operator': '=', 'value': 'active'}} Returns: dict: The response from DynamoDB containing the query results. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Start with the key condition expression key_condition = Key(partition_key_name).eq(partition_key_value) # Initialize variables for the filter expression and attribute values filter_expression = None expression_attribute_values = {":pk_val": partition_key_value} # Dynamically build the filter expression if filter conditions are provided if filter_conditions: for attr_name, condition in filter_conditions.items(): operator = condition.get("operator") value = condition.get("value") attr_value_name = f":{attr_name}" expression_attribute_values[attr_value_name] = value # Create the appropriate filter expression based on the operator current_condition = None if operator == "=": current_condition = Attr(attr_name).eq(value) elif operator == "!=": current_condition = Attr(attr_name).ne(value) elif operator == ">": current_condition = Attr(attr_name).gt(value) elif operator == ">=": current_condition = Attr(attr_name).gte(value) elif operator == "<": current_condition = Attr(attr_name).lt(value) elif operator == "<=": current_condition = Attr(attr_name).lte(value) elif operator == "contains": current_condition = Attr(attr_name).contains(value) elif operator == "begins_with": current_condition = Attr(attr_name).begins_with(value) # Combine with existing filter expression using AND if current_condition: if filter_expression is None: filter_expression = current_condition else: filter_expression = filter_expression & current_condition # Perform the query with the dynamically built filter expression query_params = {"KeyConditionExpression": key_condition} if filter_expression: query_params["FilterExpression"] = filter_expression response = table.query(**query_params) return response
Query with a filter expression and limit using AWS SDK for Python (Boto3).
import boto3
from boto3.dynamodb.conditions import Attr, Key


def query_with_filter_and_limit(
    table_name,
    partition_key_name,
    partition_key_value,
    filter_attribute=None,
    filter_value=None,
    limit=10,
):
    """
    Query a DynamoDB table with a filter expression and limit the number of results.

    Args:
        table_name (str): The name of the DynamoDB table.
        partition_key_name (str): The name of the partition key attribute.
        partition_key_value (str): The value of the partition key to query.
        filter_attribute (str, optional): The attribute name to filter on.
        filter_value (any, optional): The value to compare against in the filter.
        limit (int, optional): The maximum number of items to evaluate. Defaults to 10.

    Returns:
        dict: The response from DynamoDB containing the query results.
    """
    # Initialize the DynamoDB resource
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table(table_name)

    # Build the key condition expression
    key_condition = Key(partition_key_name).eq(partition_key_value)

    # Prepare the query parameters
    query_params = {"KeyConditionExpression": key_condition, "Limit": limit}

    # Add the filter expression if filter attributes are provided. The boto3
    # condition object generates its own expression attribute values, so no
    # separate ExpressionAttributeValues dictionary is needed.
    if filter_attribute and filter_value is not None:
        query_params["FilterExpression"] = Attr(filter_attribute).gt(filter_value)

    # Execute the query
    response = table.query(**query_params)

    return response
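Example usage of the advanced query helpers above. This sketch is not part of the original example: the Orders table, its customer_id partition key, and the OrderStatusIndex GSI are hypothetical names used only for illustration.

# Strongly consistent read of one customer's orders.
response = query_with_consistent_read(
    table_name="Orders",
    partition_key_name="customer_id",
    partition_key_value="customer-123",
    consistent_read=True,
)
print(f"Found {response['Count']} orders with a strongly consistent read.")

# Eventually consistent query against a GSI for all shipped orders.
gsi_response = query_gsi(
    table_name="Orders",
    index_name="OrderStatusIndex",
    partition_key_name="status",
    partition_key_value="SHIPPED",
)
print(f"Found {gsi_response['Count']} shipped orders through the index.")

# Page through a large result set, 50 items at a time.
all_orders = query_with_pagination(
    table_name="Orders",
    partition_key_name="customer_id",
    partition_key_value="customer-123",
    page_size=50,
)
print(f"Retrieved {len(all_orders)} orders across all pages.")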
-
For API details, see Query in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to perform list operations in DynamoDB.
Add elements to a list attribute.
Remove elements from a list attribute.
Update specific elements in a list by index.
Use list append and list index functions.
- SDK for Python (Boto3)
-
Demonstrate list operations using AWS SDK for Python (Boto3).
import boto3 import json from typing import Any, Dict, List, Optional, Union def create_list_attribute( table_name: str, key: Dict[str, Any], list_name: str, list_values: List[Any] ) -> Dict[str, Any]: """ Create a new list attribute or replace an existing one. This function demonstrates how to create a new list attribute or replace an existing list with new values. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. list_name (str): The name of the list attribute. list_values (List[Any]): The values to set in the list. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Use the SET operation to create or replace the list response = table.update_item( Key=key, UpdateExpression=f"SET {list_name} = :list_values", ExpressionAttributeValues={":list_values": list_values}, ReturnValues="UPDATED_NEW", ) return response def append_to_list( table_name: str, key: Dict[str, Any], list_name: str, values_to_append: List[Any] ) -> Dict[str, Any]: """ Append values to the end of a list attribute. This function demonstrates how to use the list_append function to add elements to the end of a list attribute. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. list_name (str): The name of the list attribute. values_to_append (List[Any]): The values to append to the list. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Use list_append to add values to the end of the list response = table.update_item( Key=key, UpdateExpression=f"SET {list_name} = list_append({list_name}, :values)", ExpressionAttributeValues={":values": values_to_append}, ReturnValues="UPDATED_NEW", ) return response def prepend_to_list( table_name: str, key: Dict[str, Any], list_name: str, values_to_prepend: List[Any] ) -> Dict[str, Any]: """ Prepend values to the beginning of a list attribute. This function demonstrates how to use the list_append function to add elements to the beginning of a list attribute. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. list_name (str): The name of the list attribute. values_to_prepend (List[Any]): The values to prepend to the list. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Use list_append with reversed order to add values to the beginning of the list response = table.update_item( Key=key, UpdateExpression=f"SET {list_name} = list_append(:values, {list_name})", ExpressionAttributeValues={":values": values_to_prepend}, ReturnValues="UPDATED_NEW", ) return response def update_list_element( table_name: str, key: Dict[str, Any], list_name: str, index: int, new_value: Any ) -> Dict[str, Any]: """ Update a specific element in a list attribute. This function demonstrates how to update a specific element in a list attribute using the index notation. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. list_name (str): The name of the list attribute. 
index (int): The zero-based index of the element to update. new_value (Any): The new value for the element. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Use the index notation to update a specific element response = table.update_item( Key=key, UpdateExpression=f"SET {list_name}[{index}] = :value", ExpressionAttributeValues={":value": new_value}, ReturnValues="UPDATED_NEW", ) return response def remove_list_element( table_name: str, key: Dict[str, Any], list_name: str, index: int ) -> Dict[str, Any]: """ Remove a specific element from a list attribute. This function demonstrates how to remove a specific element from a list attribute using the REMOVE action with index notation. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. list_name (str): The name of the list attribute. index (int): The zero-based index of the element to remove. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Use the REMOVE action with index notation to remove a specific element response = table.update_item( Key=key, UpdateExpression=f"REMOVE {list_name}[{index}]", ReturnValues="UPDATED_NEW" ) return response def update_nested_list_element( table_name: str, key: Dict[str, Any], path: str, new_value: Any ) -> Dict[str, Any]: """ Update an element in a nested list structure. This function demonstrates how to update an element in a nested list structure using expression attribute names for the path components. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. path (str): The path to the nested element (e.g., "parent[0].child[1]"). new_value (Any): The new value for the element. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Define a type for path parts path_part = Dict[str, Union[str, int]] # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Parse the path to extract attribute names and indices path_parts: List[path_part] = [] current_part = "" in_bracket = False for char in path: if char == "[": if current_part: path_parts.append({"type": "attribute", "value": current_part}) current_part = "" in_bracket = True elif char == "]": if current_part: # Fix for mypy: Use a properly typed dictionary with Union type path_parts.append({"type": "index", "value": int(current_part)}) current_part = "" in_bracket = False elif char == "." and not in_bracket: if current_part: path_parts.append({"type": "attribute", "value": current_part}) current_part = "" else: current_part += char if current_part: path_parts.append({"type": "attribute", "value": current_part}) # Build the update expression and attribute names update_expression = "SET " expression_attribute_names = {} # Build the path expression path_expression = "" for i, part in enumerate(path_parts): if part["type"] == "attribute": name_placeholder = f"#attr{i}" expression_attribute_names[name_placeholder] = part["value"] if path_expression: path_expression += "." 
path_expression += name_placeholder elif part["type"] == "index": path_expression += f"[{part['value']}]" # Complete the update expression update_expression += f"{path_expression} = :value" # Execute the update response = table.update_item( Key=key, UpdateExpression=update_expression, ExpressionAttributeNames=expression_attribute_names, ExpressionAttributeValues={":value": new_value}, ReturnValues="UPDATED_NEW", ) return response def create_list_if_not_exists( table_name: str, key: Dict[str, Any], list_name: str, default_values: List[Any] ) -> Dict[str, Any]: """ Create a list attribute if it doesn't exist. This function demonstrates how to use if_not_exists to create a list attribute with default values if it doesn't already exist. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. list_name (str): The name of the list attribute. default_values (List[Any]): The default values for the list. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Use if_not_exists to create the list if it doesn't exist response = table.update_item( Key=key, UpdateExpression=f"SET {list_name} = if_not_exists({list_name}, :default)", ExpressionAttributeValues={":default": default_values}, ReturnValues="UPDATED_NEW", ) return response def append_to_list_safely( table_name: str, key: Dict[str, Any], list_name: str, values_to_append: List[Any], default_values: Optional[List[Any]] = None, ) -> Dict[str, Any]: """ Append values to a list, creating it if it doesn't exist. This function demonstrates how to safely append values to a list attribute, creating the list with default values if it doesn't exist. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. list_name (str): The name of the list attribute. values_to_append (List[Any]): The values to append to the list. default_values (Optional[List[Any]]): The default values if the list doesn't exist. If not provided, values_to_append will be used as the default. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # If default_values is not provided, use values_to_append if default_values is None: default_values = values_to_append # Use if_not_exists with list_append to safely append to the list response = table.update_item( Key=key, UpdateExpression=f"SET {list_name} = list_append(if_not_exists({list_name}, :default), :values)", ExpressionAttributeValues={ ":default": default_values if default_values else [], ":values": values_to_append, }, ReturnValues="UPDATED_NEW", ) return response
Example usage of list operations with AWS SDK for Python (Boto3).
def example_usage(): """Example of how to use list operations in DynamoDB.""" # Example parameters table_name = "UserData" key = {"UserId": "user123"} print("Example 1: Creating a list attribute") try: response = create_list_attribute( table_name=table_name, key=key, list_name="Interests", list_values=["Reading", "Hiking", "Photography"], ) print( f"List attribute created successfully: {json.dumps(response.get('Attributes', {}), default=str)}" ) except Exception as e: print(f"Error creating list attribute: {e}") print("\nExample 2: Appending values to a list") try: response = append_to_list( table_name=table_name, key=key, list_name="Interests", values_to_append=["Cooking", "Gardening"], ) print( f"Values appended to list successfully: {json.dumps(response.get('Attributes', {}), default=str)}" ) except Exception as e: print(f"Error appending to list: {e}") print("\nExample 3: Prepending values to a list") try: response = prepend_to_list( table_name=table_name, key=key, list_name="Interests", values_to_prepend=["Travel", "Music"], ) print( f"Values prepended to list successfully: {json.dumps(response.get('Attributes', {}), default=str)}" ) except Exception as e: print(f"Error prepending to list: {e}") print("\nExample 4: Updating a specific list element") try: response = update_list_element( table_name=table_name, key=key, list_name="Interests", index=2, new_value="Mountain Hiking", ) print( f"List element updated successfully: {json.dumps(response.get('Attributes', {}), default=str)}" ) except Exception as e: print(f"Error updating list element: {e}") print("\nExample 5: Removing a list element") try: response = remove_list_element( table_name=table_name, key=key, list_name="Interests", index=0 ) print( f"List element removed successfully: {json.dumps(response.get('Attributes', {}), default=str)}" ) except Exception as e: print(f"Error removing list element: {e}") print("\nExample 6: Working with nested lists") try: # First, create an item with a nested structure dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) table.update_item( Key={"UserId": "user456"}, UpdateExpression="SET #skills = :skills", ExpressionAttributeNames={"#skills": "Skills"}, ExpressionAttributeValues={ ":skills": [ {"Category": "Programming", "Languages": ["Python", "Java", "JavaScript"]}, {"Category": "Database", "Systems": ["DynamoDB", "MongoDB", "PostgreSQL"]}, ] }, ) # Now update a nested element response = update_nested_list_element( table_name=table_name, key={"UserId": "user456"}, path="Skills[0].Languages[1]", new_value="TypeScript", ) print( f"Nested list element updated successfully: {json.dumps(response.get('Attributes', {}), default=str)}" ) except Exception as e: print(f"Error working with nested lists: {e}") print("\nExample 7: Creating a list if it doesn't exist") try: response = create_list_if_not_exists( table_name=table_name, key={"UserId": "user789"}, list_name="Preferences", default_values=["Default1", "Default2", "Default3"], ) print( f"List created with default values: {json.dumps(response.get('Attributes', {}), default=str)}" ) except Exception as e: print(f"Error creating list with default values: {e}") print("\nExample 8: Safely appending to a list") try: response = append_to_list_safely( table_name=table_name, key={"UserId": "user789"}, list_name="Notifications", values_to_append=["New message received"], default_values=[], ) print(f"Safely appended to list: {json.dumps(response.get('Attributes', {}), default=str)}") except Exception as e: print(f"Error safely appending 
to list: {e}") print("\nKey Points About Working with Lists in DynamoDB:") print("1. Lists are ordered collections of elements that can be of different types") print("2. Use the SET operation with direct assignment to create or replace a list") print("3. Use list_append() to add elements to a list without replacing the entire list") print("4. To append to the end: list_append(list_name, :values)") print("5. To prepend to the beginning: list_append(:values, list_name)") print("6. Use index notation list_name[index] to access or update specific elements") print("7. Use the REMOVE action with index notation to remove specific elements") print("8. Lists can contain nested structures like maps and other lists") print("9. Use if_not_exists() to create a list with default values if it doesn't exist") print("10. List indices are zero-based (the first element is at index 0)") print("11. Attempting to access an index beyond the list bounds will result in an error")
-
For API details, see UpdateItem in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to perform map operations in DynamoDB.
Add and update nested attributes in map structures.
Remove specific fields from maps.
Work with deeply nested map attributes.
- SDK for Python (Boto3)
-
Demonstrate map operations using AWS SDK for Python (Boto3).
""" Example of updating map attributes in DynamoDB. This module demonstrates how to update map attributes in DynamoDB, including handling cases where the map attribute might not exist yet. """ import boto3 from typing import Any, Dict, Optional def update_map_attribute_safe( table_name: str, key: Dict[str, Any], map_name: str, map_key: str, value: Any ) -> Dict[str, Any]: """ Update a specific key in a map attribute, creating the map if it doesn't exist. This function demonstrates how to safely update a key within a map attribute, even if the map doesn't exist yet in the item. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. map_name (str): The name of the map attribute. map_key (str): The key within the map to update. value (Any): The value to set for the map key. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Use SET with attribute_not_exists to safely update the map response = table.update_item( Key=key, UpdateExpression="SET #map.#key = :value", ExpressionAttributeNames={"#map": map_name, "#key": map_key}, ExpressionAttributeValues={":value": value}, ReturnValues="UPDATED_NEW", ) return response def add_to_nested_map( table_name: str, key: Dict[str, Any], path: str, value: Any ) -> Dict[str, Any]: """ Add or update a value in a deeply nested map structure. This function demonstrates how to update a value at a specific path in a nested map structure, creating any intermediate maps as needed. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. path (str): The path to the nested attribute (e.g., "user.preferences.theme"). value (Any): The value to set at the specified path. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Split the path into components path_parts = path.split(".") # Build the update expression and attribute names update_expression = "SET " expression_attribute_names = {} # Build the path expression path_expression = "" for i, part in enumerate(path_parts): name_placeholder = f"#attr{i}" expression_attribute_names[name_placeholder] = part if i == 0: path_expression = name_placeholder else: path_expression += f".{name_placeholder}" # Complete the update expression update_expression += f"{path_expression} = :value" # Execute the update response = table.update_item( Key=key, UpdateExpression=update_expression, ExpressionAttributeNames=expression_attribute_names, ExpressionAttributeValues={":value": value}, ReturnValues="UPDATED_NEW", ) return response def update_map_with_if_not_exists( table_name: str, key: Dict[str, Any], map_name: str, map_key: str, value: Any, default_map: Optional[Dict[str, Any]] = None, ) -> Dict[str, Any]: """ Update a key in a map, creating the map with default values if it doesn't exist. This function demonstrates how to use if_not_exists to initialize a map with default values if it doesn't exist yet, and then update a specific key. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. map_name (str): The name of the map attribute. map_key (str): The key within the map to update. value (Any): The value to set for the map key. 
default_map (Optional[Dict[str, Any]]): Default map values if the map doesn't exist. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Set default map if not provided if default_map is None: default_map = {} # Create a map with the new key-value pair updated_map = default_map.copy() updated_map[map_key] = value # Use if_not_exists to initialize the map if it doesn't exist response = table.update_item( Key=key, UpdateExpression="SET #map = if_not_exists(#map, :default_map)", ExpressionAttributeNames={"#map": map_name}, ExpressionAttributeValues={":default_map": updated_map}, ReturnValues="UPDATED_NEW", ) return response def merge_into_map( table_name: str, key: Dict[str, Any], map_name: str, values_to_merge: Dict[str, Any] ) -> Dict[str, Any]: """ Merge multiple key-value pairs into a map attribute. This function demonstrates how to update multiple keys in a map attribute in a single operation, without overwriting the entire map. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. map_name (str): The name of the map attribute. values_to_merge (Dict[str, Any]): Key-value pairs to merge into the map. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Build the update expression for each key-value pair update_expression = "SET " expression_attribute_names = {"#map": map_name} expression_attribute_values = {} # Add each key-value pair to the update expression for i, (k, v) in enumerate(values_to_merge.items()): key_placeholder = f"#key{i}" value_placeholder = f":value{i}" expression_attribute_names[key_placeholder] = k expression_attribute_values[value_placeholder] = v if i > 0: update_expression += ", " update_expression += f"#map.{key_placeholder} = {value_placeholder}" # Execute the update response = table.update_item( Key=key, UpdateExpression=update_expression, ExpressionAttributeNames=expression_attribute_names, ExpressionAttributeValues=expression_attribute_values, ReturnValues="UPDATED_NEW", ) return response def example_usage(): """Example of how to use the map attribute update functions.""" # Example parameters table_name = "UserProfiles" key = {"UserId": "user123"} print("Example 1: Updating a specific key in a map attribute") try: response = update_map_attribute_safe( table_name=table_name, key=key, map_name="Preferences", map_key="Theme", value="Dark" ) print(f"Map attribute updated successfully: {response.get('Attributes', {})}") except Exception as e: print(f"Error updating map attribute: {e}") print("\nExample 2: Adding a value to a deeply nested map") try: response = add_to_nested_map( table_name=table_name, key=key, path="Settings.Notifications.Email", value=True ) print(f"Nested map updated successfully: {response.get('Attributes', {})}") except Exception as e: print(f"Error updating nested map: {e}") print("\nExample 3: Initializing a map with default values if it doesn't exist") try: default_map = {"Language": "English", "Currency": "USD", "TimeZone": "UTC"} response = update_map_with_if_not_exists( table_name=table_name, key={"UserId": "newuser456"}, map_name="Preferences", map_key="Theme", value="Light", default_map=default_map, ) print(f"Map initialized with defaults: {response.get('Attributes', 
{})}") except Exception as e: print(f"Error initializing map: {e}") print("\nExample 4: Merging multiple values into a map") try: values_to_merge = { "NotificationsEnabled": True, "EmailFrequency": "Daily", "PushNotifications": False, } response = merge_into_map( table_name=table_name, key=key, map_name="NotificationSettings", values_to_merge=values_to_merge, ) print(f"Multiple values merged into map: {response.get('Attributes', {})}") except Exception as e: print(f"Error merging values into map: {e}") print("\nBest practices for working with map attributes in DynamoDB:") print("1. Use dot notation to access and update nested attributes") print("2. Use ExpressionAttributeNames to handle reserved words and special characters") print("3. Use if_not_exists() to handle cases where attributes might not exist") print("4. Update specific map keys rather than overwriting the entire map") print("5. Use a single update operation to modify multiple map keys for better performance") print("6. Consider your data model carefully to minimize the need for deeply nested attributes") if __name__ == "__main__": example_usage()
-
For API details, see UpdateItem in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to perform set operations in DynamoDB.
Add elements to a set attribute.
Remove elements from a set attribute.
Use ADD and DELETE operations with sets.
- SDK for Python (Boto3)
-
Demonstrate set operations using AWS SDK for Python (Boto3).
import boto3 from typing import Any, Dict, List def create_set_attribute( table_name: str, key: Dict[str, Any], set_name: str, set_values: List[Any], set_type: str = "string", ) -> Dict[str, Any]: """ Create a new set attribute or add elements to an existing set. This function demonstrates how to use the ADD operation to create a new set or add elements to an existing set. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. set_name (str): The name of the set attribute. set_values (List[Any]): The values to add to the set. set_type (str, optional): The type of set to create: "string", "number", or "binary". Defaults to "string". Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Convert the list to a DynamoDB set based on the specified type if set_type == "string": dynamo_set = set(str(value) for value in set_values) elif set_type == "number": # We need to use actual float values for the DynamoDB API # but mypy expects strings in sets, so we need to use type: ignore dynamo_set = set(float(value) for value in set_values) # type: ignore else: # binary set is not directly supported in high-level API, handled differently raise ValueError("Binary sets are not supported in this example") # Use the ADD operation to create or update the set response = table.update_item( Key=key, UpdateExpression="ADD #set_attr :set_values", ExpressionAttributeNames={"#set_attr": set_name}, ExpressionAttributeValues={":set_values": dynamo_set}, ReturnValues="UPDATED_NEW", ) return response def add_to_set( table_name: str, key: Dict[str, Any], set_name: str, values_to_add: List[Any] ) -> Dict[str, Any]: """ Add elements to an existing set attribute. This function demonstrates how to use the ADD operation to add elements to an existing set. If the set doesn't exist, it will be created. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. set_name (str): The name of the set attribute. values_to_add (List[Any]): The values to add to the set. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Convert the list to a set (assuming string set for simplicity) dynamo_set = set(str(value) for value in values_to_add) # Use the ADD operation to add values to the set response = table.update_item( Key=key, UpdateExpression="ADD #set_attr :values_to_add", ExpressionAttributeNames={"#set_attr": set_name}, ExpressionAttributeValues={":values_to_add": dynamo_set}, ReturnValues="UPDATED_NEW", ) return response def remove_from_set( table_name: str, key: Dict[str, Any], set_name: str, values_to_remove: List[Any] ) -> Dict[str, Any]: """ Remove elements from a set attribute. This function demonstrates how to use the DELETE operation to remove elements from a set. If the last element is removed, the attribute will be deleted entirely. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. set_name (str): The name of the set attribute. values_to_remove (List[Any]): The values to remove from the set. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. 
""" # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Convert the list to a set (assuming string set for simplicity) dynamo_set = set(str(value) for value in values_to_remove) # Use the DELETE operation to remove values from the set response = table.update_item( Key=key, UpdateExpression="DELETE #set_attr :values_to_remove", ExpressionAttributeNames={"#set_attr": set_name}, ExpressionAttributeValues={":values_to_remove": dynamo_set}, ReturnValues="UPDATED_NEW", ) return response def check_if_set_exists(table_name: str, key: Dict[str, Any], set_name: str) -> bool: """ Check if a set attribute exists in an item. This function demonstrates how to check if a set attribute exists after potentially removing all elements from it. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to check. set_name (str): The name of the set attribute. Returns: bool: True if the set attribute exists, False otherwise. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Get the item response = table.get_item( Key=key, ProjectionExpression="#set_attr", ExpressionAttributeNames={"#set_attr": set_name} ) # Check if the item exists and has the set attribute return "Item" in response and set_name in response["Item"] def demonstrate_last_element_removal( table_name: str, key: Dict[str, Any], set_name: str ) -> Dict[str, Any]: """ Demonstrate what happens when you remove the last element from a set. This function creates a set with a single element, then removes that element, showing that the attribute is completely removed when the last element is deleted. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. set_name (str): The name of the set attribute. Returns: Dict[str, Any]: A dictionary containing the results of the demonstration. """ # Step 1: Create a set with a single element create_response = create_set_attribute( table_name=table_name, key=key, set_name=set_name, set_values=["last_element"], set_type="string", ) # Step 2: Check that the set exists exists_before = check_if_set_exists(table_name, key, set_name) # Step 3: Remove the last element delete_response = remove_from_set( table_name=table_name, key=key, set_name=set_name, values_to_remove=["last_element"] ) # Step 4: Check if the set still exists exists_after = check_if_set_exists(table_name, key, set_name) # Return the results return { "create_response": create_response, "exists_before": exists_before, "delete_response": delete_response, "exists_after": exists_after, } def work_with_number_set( table_name: str, key: Dict[str, Any], set_name: str, initial_values: List[float], values_to_add: List[float], values_to_remove: List[float], ) -> Dict[str, Any]: """ Demonstrate working with a number set in DynamoDB. This function shows how to create and manipulate a set of numbers. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. set_name (str): The name of the set attribute. initial_values (List[float]): The initial values for the set. values_to_add (List[float]): Values to add to the set. values_to_remove (List[float]): Values to remove from the set. Returns: Dict[str, Any]: A dictionary containing the responses from each operation. 
""" # Step 1: Create the number set create_response = create_set_attribute( table_name=table_name, key=key, set_name=set_name, set_values=initial_values, set_type="number", ) # Step 2: Add more numbers to the set add_response = add_to_set( table_name=table_name, key=key, set_name=set_name, values_to_add=values_to_add ) # Step 3: Remove some numbers from the set remove_response = remove_from_set( table_name=table_name, key=key, set_name=set_name, values_to_remove=values_to_remove ) # Step 4: Get the final state dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) get_response = table.get_item( Key=key, ProjectionExpression=f"#{set_name}", ExpressionAttributeNames={f"#{set_name}": set_name}, ) # Return all responses return { "create_response": create_response, "add_response": add_response, "remove_response": remove_response, "final_state": get_response.get("Item", {}), }
Example usage of set operations with AWS SDK for Python (Boto3).
def example_usage(): """Example of how to use the set operations functions.""" # Example parameters table_name = "UserPreferences" key = {"UserId": "user123"} print("Example 1: Creating a string set attribute") try: response = create_set_attribute( table_name=table_name, key=key, set_name="FavoriteTags", set_values=["AWS", "DynamoDB", "NoSQL"], set_type="string", ) print(f"Set attribute created successfully: {response.get('Attributes', {})}") except Exception as e: print(f"Error creating set attribute: {e}") print("\nExample 2: Adding elements to an existing set") try: response = add_to_set( table_name=table_name, key=key, set_name="FavoriteTags", values_to_add=["Database", "Serverless"], ) print(f"Elements added to set successfully: {response.get('Attributes', {})}") except Exception as e: print(f"Error adding to set: {e}") print("\nExample 3: Removing elements from a set") try: response = remove_from_set( table_name=table_name, key=key, set_name="FavoriteTags", values_to_remove=["NoSQL"] ) print(f"Elements removed from set successfully: {response.get('Attributes', {})}") except Exception as e: print(f"Error removing from set: {e}") print("\nExample 4: Demonstrating what happens when you remove the last element from a set") try: results = demonstrate_last_element_removal( table_name=table_name, key={"UserId": "tempUser"}, set_name="SingleElementSet" ) print(f"Set exists before removal: {results['exists_before']}") print(f"Set exists after removal: {results['exists_after']}") if not results["exists_after"]: print("The set attribute was completely removed when the last element was deleted.") else: print("The set attribute still exists after removing the last element.") except Exception as e: print(f"Error in last element removal demonstration: {e}") print("\nExample 5: Working with a number set") try: results = work_with_number_set( table_name=table_name, key={"UserId": "user456"}, set_name="LuckyNumbers", initial_values=[7, 13, 42], values_to_add=[99, 100], values_to_remove=[13], ) print(f"Initial number set: {results['create_response'].get('Attributes', {})}") print(f"After adding numbers: {results['add_response'].get('Attributes', {})}") print(f"After removing numbers: {results['remove_response'].get('Attributes', {})}") print(f"Final state: {results['final_state']}") except Exception as e: print(f"Error working with number set: {e}") print("\nKey Points About DynamoDB Sets:") print("1. Sets can only contain elements of the same type (string, number, or binary)") print("2. Sets automatically eliminate duplicate values") print("3. The ADD operation creates a set if it doesn't exist") print("4. The DELETE operation removes specified elements from a set") print("5. When the last element is removed from a set, the entire attribute is deleted") print("6. Empty sets are not allowed in DynamoDB") print("7. Sets are unordered collections") print("8. The ADD operation is atomic for sets")
-
For API details, see UpdateItem in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to:
Get a batch of items by running multiple SELECT statements.
Add a batch of items by running multiple INSERT statements.
Update a batch of items by running multiple UPDATE statements.
Delete a batch of items by running multiple DELETE statements.
- SDK for Python (Boto3)
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Create a class that can run batches of PartiQL statements.
from datetime import datetime from decimal import Decimal import logging from pprint import pprint import boto3 from botocore.exceptions import ClientError from scaffold import Scaffold logger = logging.getLogger(__name__) class PartiQLBatchWrapper: """ Encapsulates a DynamoDB resource to run PartiQL statements. """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource def run_partiql(self, statements, param_list): """ Runs a PartiQL statement. A Boto3 resource is used even though `execute_statement` is called on the underlying `client` object because the resource transforms input and output from plain old Python objects (POPOs) to the DynamoDB format. If you create the client directly, you must do these transforms yourself. :param statements: The batch of PartiQL statements. :param param_list: The batch of PartiQL parameters that are associated with each statement. This list must be in the same order as the statements. :return: The responses returned from running the statements, if any. """ try: output = self.dyn_resource.meta.client.batch_execute_statement( Statements=[ {"Statement": statement, "Parameters": params} for statement, params in zip(statements, param_list) ] ) except ClientError as err: if err.response["Error"]["Code"] == "ResourceNotFoundException": logger.error( "Couldn't execute batch of PartiQL statements because the table " "does not exist." ) else: logger.error( "Couldn't execute batch of PartiQL statements. Here's why: %s: %s", err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return output
Run a scenario that creates a table and runs PartiQL queries in batches.
def run_scenario(scaffold, wrapper, table_name): logging.basicConfig(level=logging.INFO, format="%(levelname)s: %(message)s") print("-" * 88) print("Welcome to the Amazon DynamoDB PartiQL batch statement demo.") print("-" * 88) print(f"Creating table '{table_name}' for the demo...") scaffold.create_table(table_name) print("-" * 88) movie_data = [ { "title": f"House PartiQL", "year": datetime.now().year - 5, "info": { "plot": "Wacky high jinks result from querying a mysterious database.", "rating": Decimal("8.5"), }, }, { "title": f"House PartiQL 2", "year": datetime.now().year - 3, "info": { "plot": "Moderate high jinks result from querying another mysterious database.", "rating": Decimal("6.5"), }, }, { "title": f"House PartiQL 3", "year": datetime.now().year - 1, "info": { "plot": "Tepid high jinks result from querying yet another mysterious database.", "rating": Decimal("2.5"), }, }, ] print(f"Inserting a batch of movies into table '{table_name}.") statements = [ f'INSERT INTO "{table_name}" ' f"VALUE {{'title': ?, 'year': ?, 'info': ?}}" ] * len(movie_data) params = [list(movie.values()) for movie in movie_data] wrapper.run_partiql(statements, params) print("Success!") print("-" * 88) print(f"Getting data for a batch of movies.") statements = [f'SELECT * FROM "{table_name}" WHERE title=? AND year=?'] * len( movie_data ) params = [[movie["title"], movie["year"]] for movie in movie_data] output = wrapper.run_partiql(statements, params) for item in output["Responses"]: print(f"\n{item['Item']['title']}, {item['Item']['year']}") pprint(item["Item"]) print("-" * 88) ratings = [Decimal("7.7"), Decimal("5.5"), Decimal("1.3")] print(f"Updating a batch of movies with new ratings.") statements = [ f'UPDATE "{table_name}" SET info.rating=? ' f"WHERE title=? AND year=?" ] * len(movie_data) params = [ [rating, movie["title"], movie["year"]] for rating, movie in zip(ratings, movie_data) ] wrapper.run_partiql(statements, params) print("Success!") print("-" * 88) print(f"Getting projected data from the table to verify our update.") output = wrapper.dyn_resource.meta.client.execute_statement( Statement=f'SELECT title, info.rating FROM "{table_name}"' ) pprint(output["Items"]) print("-" * 88) print(f"Deleting a batch of movies from the table.") statements = [f'DELETE FROM "{table_name}" WHERE title=? AND year=?'] * len( movie_data ) params = [[movie["title"], movie["year"]] for movie in movie_data] wrapper.run_partiql(statements, params) print("Success!") print("-" * 88) print(f"Deleting table '{table_name}'...") scaffold.delete_table() print("-" * 88) print("\nThanks for watching!") print("-" * 88) if __name__ == "__main__": try: dyn_res = boto3.resource("dynamodb") scaffold = Scaffold(dyn_res) movies = PartiQLBatchWrapper(dyn_res) run_scenario(scaffold, movies, "doc-example-table-partiql-movies") except Exception as e: print(f"Something went wrong with the demo! Here's what: {e}")
-
For API details, see BatchExecuteStatement in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to:
Get an item by running a SELECT statement.
Add an item by running an INSERT statement.
Update an item by running an UPDATE statement.
Delete an item by running a DELETE statement.
- SDK for Python (Boto3)
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.
Create a class that can run PartiQL statements.
from datetime import datetime from decimal import Decimal import logging from pprint import pprint import boto3 from botocore.exceptions import ClientError from scaffold import Scaffold logger = logging.getLogger(__name__) class PartiQLWrapper: """ Encapsulates a DynamoDB resource to run PartiQL statements. """ def __init__(self, dyn_resource): """ :param dyn_resource: A Boto3 DynamoDB resource. """ self.dyn_resource = dyn_resource def run_partiql(self, statement, params): """ Runs a PartiQL statement. A Boto3 resource is used even though `execute_statement` is called on the underlying `client` object because the resource transforms input and output from plain old Python objects (POPOs) to the DynamoDB format. If you create the client directly, you must do these transforms yourself. :param statement: The PartiQL statement. :param params: The list of PartiQL parameters. These are applied to the statement in the order they are listed. :return: The items returned from the statement, if any. """ try: output = self.dyn_resource.meta.client.execute_statement( Statement=statement, Parameters=params ) except ClientError as err: if err.response["Error"]["Code"] == "ResourceNotFoundException": logger.error( "Couldn't execute PartiQL '%s' because the table does not exist.", statement, ) else: logger.error( "Couldn't execute PartiQL '%s'. Here's why: %s: %s", statement, err.response["Error"]["Code"], err.response["Error"]["Message"], ) raise else: return output
Run a scenario that creates a table and runs PartiQL queries.
def run_scenario(scaffold, wrapper, table_name): logging.basicConfig(level=logging.INFO, format="%(levelname)s: %(message)s") print("-" * 88) print("Welcome to the Amazon DynamoDB PartiQL single statement demo.") print("-" * 88) print(f"Creating table '{table_name}' for the demo...") scaffold.create_table(table_name) print("-" * 88) title = "24 Hour PartiQL People" year = datetime.now().year plot = "A group of data developers discover a new query language they can't stop using." rating = Decimal("9.9") print(f"Inserting movie '{title}' released in {year}.") wrapper.run_partiql( f"INSERT INTO \"{table_name}\" VALUE {{'title': ?, 'year': ?, 'info': ?}}", [title, year, {"plot": plot, "rating": rating}], ) print("Success!") print("-" * 88) print(f"Getting data for movie '{title}' released in {year}.") output = wrapper.run_partiql( f'SELECT * FROM "{table_name}" WHERE title=? AND year=?', [title, year] ) for item in output["Items"]: print(f"\n{item['title']}, {item['year']}") pprint(output["Items"]) print("-" * 88) rating = Decimal("2.4") print(f"Updating movie '{title}' with a rating of {float(rating)}.") wrapper.run_partiql( f'UPDATE "{table_name}" SET info.rating=? WHERE title=? AND year=?', [rating, title, year], ) print("Success!") print("-" * 88) print(f"Getting data again to verify our update.") output = wrapper.run_partiql( f'SELECT * FROM "{table_name}" WHERE title=? AND year=?', [title, year] ) for item in output["Items"]: print(f"\n{item['title']}, {item['year']}") pprint(output["Items"]) print("-" * 88) print(f"Deleting movie '{title}' released in {year}.") wrapper.run_partiql( f'DELETE FROM "{table_name}" WHERE title=? AND year=?', [title, year] ) print("Success!") print("-" * 88) print(f"Deleting table '{table_name}'...") scaffold.delete_table() print("-" * 88) print("\nThanks for watching!") print("-" * 88) if __name__ == "__main__": try: dyn_res = boto3.resource("dynamodb") scaffold = Scaffold(dyn_res) movies = PartiQLWrapper(dyn_res) run_scenario(scaffold, movies, "doc-example-table-partiql-movies") except Exception as e: print(f"Something went wrong with the demo! Here's what: {e}")
-
For API details, see ExecuteStatement in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to query a table using a Global Secondary Index.
Query a DynamoDB table using its primary key.
Query a Global Secondary Index (GSI) for alternate access patterns.
Compare table queries and GSI queries.
- SDK for Python (Boto3)
-
Query a DynamoDB table using its primary key and a Global Secondary Index (GSI) with AWS SDK for Python (Boto3).
import boto3 from boto3.dynamodb.conditions import Key def query_table(table_name, partition_key_name, partition_key_value): """ Query a DynamoDB table using its primary key. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. Returns: dict: The response from DynamoDB containing the query results. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Perform the query on the table's primary key response = table.query(KeyConditionExpression=Key(partition_key_name).eq(partition_key_value)) return response def query_gsi(table_name, index_name, partition_key_name, partition_key_value): """ Query a Global Secondary Index (GSI) on a DynamoDB table. Args: table_name (str): The name of the DynamoDB table. index_name (str): The name of the Global Secondary Index. partition_key_name (str): The name of the GSI's partition key attribute. partition_key_value (str): The value of the GSI's partition key to query. Returns: dict: The response from DynamoDB containing the query results. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Perform the query on the GSI response = table.query( IndexName=index_name, KeyConditionExpression=Key(partition_key_name).eq(partition_key_value) ) return response
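Demonstrates how the query functions above might be called. This usage sketch is not part of the original example; the GameScores table, the GameTitleIndex index, and the UserId and GameTitle key names are hypothetical placeholders that you would replace with your own table and GSI definitions.
def example_usage():
    """Example of how to compare a base-table query with a GSI query."""
    # Hypothetical table and index names; replace with your own.
    table_name = "GameScores"
    gsi_name = "GameTitleIndex"

    # Query the base table by its partition key (UserId).
    table_response = query_table(
        table_name=table_name,
        partition_key_name="UserId",
        partition_key_value="user123",
    )
    print(f"Base table query returned {len(table_response.get('Items', []))} items")

    # Query the GSI by its own partition key (GameTitle) for an alternate access pattern.
    gsi_response = query_gsi(
        table_name=table_name,
        index_name=gsi_name,
        partition_key_name="GameTitle",
        partition_key_value="Galaxy Invaders",
    )
    print(f"GSI query returned {len(gsi_response.get('Items', []))} items")


if __name__ == "__main__":
    example_usage()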
-
For API details, see Query in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to query a table using a begins_with condition.
Use the begins_with function in a key condition expression.
Filter items based on a prefix pattern in the sort key.
- SDK for Python (Boto3)
-
Query a DynamoDB table using a begins_with condition on the sort key with AWS SDK for Python (Boto3).
import boto3 from boto3.dynamodb.conditions import Key def query_with_begins_with( table_name, partition_key_name, partition_key_value, sort_key_name, prefix ): """ Query a DynamoDB table with a begins_with condition on the sort key. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. sort_key_name (str): The name of the sort key attribute. prefix (str): The prefix to match at the beginning of the sort key. Returns: dict: The response from DynamoDB containing the query results. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Perform the query with a begins_with condition on the sort key key_condition = Key(partition_key_name).eq(partition_key_value) & Key( sort_key_name ).begins_with(prefix) response = table.query(KeyConditionExpression=key_condition) return response
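Demonstrates how the begins_with query might be called. This usage sketch is not part of the original example; the CustomerOrders table, the CustomerId and OrderId key names, and the ORDER#2025 prefix are hypothetical placeholders for a sort key that encodes a prefix pattern.
def example_usage():
    """Example of how to use the query_with_begins_with function."""
    # Hypothetical table and key names; replace with your own.
    response = query_with_begins_with(
        table_name="CustomerOrders",
        partition_key_name="CustomerId",
        partition_key_value="customer123",
        sort_key_name="OrderId",
        prefix="ORDER#2025",  # Match sort keys that start with this prefix
    )

    items = response.get("Items", [])
    print(f"Found {len(items)} orders with the requested prefix")
    for item in items:
        print(item)


if __name__ == "__main__":
    example_usage()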
-
For API details, see Query in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to query a table using a date range in the sort key.
Query items within a specific date range.
Use comparison operators on date-formatted sort keys.
- SDK for Python (Boto3)
-
Query a DynamoDB table for items within a date range with AWS SDK for Python (Boto3).
from datetime import datetime, timedelta

import boto3
from boto3.dynamodb.conditions import Key


def query_with_date_range(
    table_name, partition_key_name, partition_key_value, sort_key_name, start_date, end_date
):
    """
    Query a DynamoDB table with a date range on the sort key.

    Args:
        table_name (str): The name of the DynamoDB table.
        partition_key_name (str): The name of the partition key attribute.
        partition_key_value (str): The value of the partition key to query.
        sort_key_name (str): The name of the sort key attribute (containing date values).
        start_date (datetime): The start date for the query range.
        end_date (datetime): The end date for the query range.

    Returns:
        dict: The response from DynamoDB containing the query results.
    """
    # Initialize the DynamoDB resource
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table(table_name)

    # Format the date values as ISO 8601 strings
    # DynamoDB works well with ISO format for date values
    start_date_str = start_date.isoformat()
    end_date_str = end_date.isoformat()

    # Perform the query with a date range on the sort key using the BETWEEN operator.
    # The Key condition objects supply their own placeholder values, so no separate
    # ExpressionAttributeValues parameter is needed.
    key_condition = Key(partition_key_name).eq(partition_key_value) & Key(sort_key_name).between(
        start_date_str, end_date_str
    )

    response = table.query(KeyConditionExpression=key_condition)

    return response


def query_with_date_range_by_month(
    table_name, partition_key_name, partition_key_value, sort_key_name, year, month
):
    """
    Query a DynamoDB table for a specific month's data.

    Args:
        table_name (str): The name of the DynamoDB table.
        partition_key_name (str): The name of the partition key attribute.
        partition_key_value (str): The value of the partition key to query.
        sort_key_name (str): The name of the sort key attribute (containing date values).
        year (int): The year to query.
        month (int): The month to query (1-12).

    Returns:
        dict: The response from DynamoDB containing the query results.
    """
    # Initialize the DynamoDB resource
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table(table_name)

    # Calculate the start and end dates for the specified month
    if month == 12:
        next_year = year + 1
        next_month = 1
    else:
        next_year = year
        next_month = month + 1

    start_date = datetime(year, month, 1)
    end_date = datetime(next_year, next_month, 1) - timedelta(microseconds=1)

    # Format the date values as ISO 8601 strings
    start_date_str = start_date.isoformat()
    end_date_str = end_date.isoformat()

    # Perform the query with a date range on the sort key
    key_condition = Key(partition_key_name).eq(partition_key_value) & Key(sort_key_name).between(
        start_date_str, end_date_str
    )

    response = table.query(KeyConditionExpression=key_condition)

    return response
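Demonstrates how a date range query might be called. This usage sketch is not part of the original example; the SensorReadings table, the DeviceId partition key, and the Timestamp sort key (stored as ISO 8601 strings) are hypothetical placeholders. The datetime and timedelta helpers come from the module imports above.
def example_usage():
    """Example of how to query the last 30 days of data for one partition key."""
    # Hypothetical table and attribute names; replace with your own.
    end_date = datetime.now()
    start_date = end_date - timedelta(days=30)

    response = query_with_date_range(
        table_name="SensorReadings",
        partition_key_name="DeviceId",
        partition_key_value="device-001",
        sort_key_name="Timestamp",
        start_date=start_date,
        end_date=end_date,
    )

    items = response.get("Items", [])
    print(f"Found {len(items)} readings in the last 30 days")


if __name__ == "__main__":
    example_usage()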
-
For API details, see Query in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to query a table with a complex filter expression.
Apply complex filter expressions to query results.
Combine multiple conditions using logical operators.
Filter items based on non-key attributes.
- SDK for Python (Boto3)
-
Query a DynamoDB table with a complex filter expression using AWS SDK for Python (Boto3).
import boto3 from boto3.dynamodb.conditions import Attr, Key def query_with_complex_filter( table_name, partition_key_name, partition_key_value, min_rating=None, status_list=None, max_price=None, ): """ Query a DynamoDB table with a complex filter expression. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. min_rating (float, optional): Minimum rating value for filtering. status_list (list, optional): List of status values to include. max_price (float, optional): Maximum price value for filtering. Returns: dict: The response from DynamoDB containing the query results. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Start with the key condition expression key_condition = Key(partition_key_name).eq(partition_key_value) # Initialize the filter expression and expression attribute values filter_expression = None expression_attribute_values = {} # Build the filter expression based on provided parameters if min_rating is not None: filter_expression = Attr("rating").gte(min_rating) expression_attribute_values[":min_rating"] = min_rating if status_list and len(status_list) > 0: status_condition = None for i, status in enumerate(status_list): status_value_name = f":status{i}" expression_attribute_values[status_value_name] = status if status_condition is None: status_condition = Attr("status").eq(status) else: status_condition = status_condition | Attr("status").eq(status) if filter_expression is None: filter_expression = status_condition else: filter_expression = filter_expression & status_condition if max_price is not None: price_condition = Attr("price").lte(max_price) expression_attribute_values[":max_price"] = max_price if filter_expression is None: filter_expression = price_condition else: filter_expression = filter_expression & price_condition # Prepare the query parameters query_params = {"KeyConditionExpression": key_condition} if filter_expression: query_params["FilterExpression"] = filter_expression if expression_attribute_values: query_params["ExpressionAttributeValues"] = expression_attribute_values # Execute the query response = table.query(**query_params) return response def query_with_complex_filter_and_or( table_name, partition_key_name, partition_key_value, category=None, min_rating=None, max_price=None, ): """ Query a DynamoDB table with a complex filter expression using AND and OR operators. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. category (str, optional): Category value for filtering. min_rating (float, optional): Minimum rating value for filtering. max_price (float, optional): Maximum price value for filtering. Returns: dict: The response from DynamoDB containing the query results. 
""" # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Start with the key condition expression key_condition = Key(partition_key_name).eq(partition_key_value) # Build a complex filter expression with AND and OR operators filter_expression = None expression_attribute_values = {} # Build the category condition if category: filter_expression = Attr("category").eq(category) expression_attribute_values[":category"] = category # Build the rating and price condition (rating >= min_rating OR price <= max_price) rating_price_condition = None if min_rating is not None: rating_price_condition = Attr("rating").gte(min_rating) expression_attribute_values[":min_rating"] = min_rating if max_price is not None: price_condition = Attr("price").lte(max_price) expression_attribute_values[":max_price"] = max_price if rating_price_condition is None: rating_price_condition = price_condition else: rating_price_condition = rating_price_condition | price_condition # Combine the conditions if rating_price_condition: if filter_expression is None: filter_expression = rating_price_condition else: filter_expression = filter_expression & rating_price_condition # Prepare the query parameters query_params = {"KeyConditionExpression": key_condition} if filter_expression: query_params["FilterExpression"] = filter_expression if expression_attribute_values: query_params["ExpressionAttributeValues"] = expression_attribute_values # Execute the query response = table.query(**query_params) return response
-
For API details, see Query in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to query a table with a dynamic filter expression.
Build filter expressions dynamically at runtime.
Construct filter conditions based on user input or application state.
Add or remove filter criteria conditionally.
- SDK for Python (Boto3)
-
Query a DynamoDB table with a dynamically constructed filter expression using AWS SDK for Python (Boto3).
import boto3 from boto3.dynamodb.conditions import Attr, Key def query_with_dynamic_filter( table_name, partition_key_name, partition_key_value, filter_conditions=None ): """ Query a DynamoDB table with a dynamically constructed filter expression. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. filter_conditions (dict, optional): A dictionary of filter conditions where keys are attribute names and values are dictionaries with 'operator' and 'value'. Example: {'rating': {'operator': '>=', 'value': 4}, 'status': {'operator': '=', 'value': 'active'}} Returns: dict: The response from DynamoDB containing the query results. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Start with the key condition expression key_condition = Key(partition_key_name).eq(partition_key_value) # Initialize variables for the filter expression and attribute values filter_expression = None expression_attribute_values = {":pk_val": partition_key_value} # Dynamically build the filter expression if filter conditions are provided if filter_conditions: for attr_name, condition in filter_conditions.items(): operator = condition.get("operator") value = condition.get("value") attr_value_name = f":{attr_name}" expression_attribute_values[attr_value_name] = value # Create the appropriate filter expression based on the operator current_condition = None if operator == "=": current_condition = Attr(attr_name).eq(value) elif operator == "!=": current_condition = Attr(attr_name).ne(value) elif operator == ">": current_condition = Attr(attr_name).gt(value) elif operator == ">=": current_condition = Attr(attr_name).gte(value) elif operator == "<": current_condition = Attr(attr_name).lt(value) elif operator == "<=": current_condition = Attr(attr_name).lte(value) elif operator == "contains": current_condition = Attr(attr_name).contains(value) elif operator == "begins_with": current_condition = Attr(attr_name).begins_with(value) # Combine with existing filter expression using AND if current_condition: if filter_expression is None: filter_expression = current_condition else: filter_expression = filter_expression & current_condition # Perform the query with the dynamically built filter expression query_params = {"KeyConditionExpression": key_condition} if filter_expression: query_params["FilterExpression"] = filter_expression response = table.query(**query_params) return response
Demonstrates how to use dynamic filter expressions with AWS SDK for Python (Boto3).
def example_usage(): """Example of how to use the query_with_dynamic_filter function.""" # Example parameters table_name = "Products" partition_key_name = "Category" partition_key_value = "Electronics" # Define dynamic filter conditions based on user input or runtime conditions user_min_rating = 4 # This could come from user input user_status_filter = "active" # This could come from user input filter_conditions = {} # Only add conditions that are actually specified if user_min_rating is not None: filter_conditions["rating"] = {"operator": ">=", "value": user_min_rating} if user_status_filter: filter_conditions["status"] = {"operator": "=", "value": user_status_filter} print( f"Querying products in category '{partition_key_value}' with filter conditions: {filter_conditions}" ) # Execute the query with dynamic filter response = query_with_dynamic_filter( table_name, partition_key_name, partition_key_value, filter_conditions ) # Process the results items = response.get("Items", []) print(f"Found {len(items)} items") for item in items: print(f"Product: {item}")
-
For API details, see Query in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to query a table with a filter expression and limit.
Apply filter expressions to query results with a limit on items evaluated.
Understand how limit affects filtered query results.
Control the maximum number of items processed in a query.
- SDK for Python (Boto3)
-
Query a DynamoDB table with a filter expression and limit using AWS SDK for Python (Boto3).
import boto3
from boto3.dynamodb.conditions import Attr, Key


def query_with_filter_and_limit(
    table_name,
    partition_key_name,
    partition_key_value,
    filter_attribute=None,
    filter_value=None,
    limit=10,
):
    """
    Query a DynamoDB table with a filter expression and limit the number of results.

    Args:
        table_name (str): The name of the DynamoDB table.
        partition_key_name (str): The name of the partition key attribute.
        partition_key_value (str): The value of the partition key to query.
        filter_attribute (str, optional): The attribute name to filter on.
        filter_value (any, optional): The value to compare against in the filter.
        limit (int, optional): The maximum number of items to evaluate. Defaults to 10.

    Returns:
        dict: The response from DynamoDB containing the query results.
    """
    # Initialize the DynamoDB resource
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table(table_name)

    # Build the key condition expression
    key_condition = Key(partition_key_name).eq(partition_key_value)

    # Prepare the query parameters
    query_params = {"KeyConditionExpression": key_condition, "Limit": limit}

    # Add the filter expression if filter attributes are provided.
    # The Attr condition object supplies its own placeholder values, so no
    # separate ExpressionAttributeValues parameter is needed.
    if filter_attribute and filter_value is not None:
        query_params["FilterExpression"] = Attr(filter_attribute).gt(filter_value)

    # Execute the query
    response = table.query(**query_params)

    return response
Demonstrates how to use filter expressions with limits in AWS SDK for Python (Boto3).
def example_usage(): """Example of how to use the query_with_filter_and_limit function.""" # Example parameters table_name = "ProductReviews" partition_key_name = "ProductId" partition_key_value = "P123456" filter_attribute = "Rating" filter_value = 3 # Filter for ratings > 3 limit = 5 print(f"Querying reviews for product '{partition_key_value}' with rating > {filter_value}") print(f"Limiting to {limit} evaluated items") # Execute the query with filter and limit response = query_with_filter_and_limit( table_name, partition_key_name, partition_key_value, filter_attribute, filter_value, limit ) # Process the results items = response.get("Items", []) print(f"\nReturned {len(items)} items that passed the filter") for item in items: print(f"Review: {item}") # Explain the difference between Limit and actual results explain_limit_vs_results(response) # Check if there are more results if "LastEvaluatedKey" in response: print("\nThere are more results available. Use the LastEvaluatedKey for pagination.") else: print("\nAll matching results have been retrieved.")
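The example_usage function above calls a helper named explain_limit_vs_results that is not shown in this excerpt. The following is a minimal sketch of what such a helper might look like; it relies only on the standard Count and ScannedCount fields of a query response, but the helper itself is illustrative rather than part of the original example.
def explain_limit_vs_results(response):
    """
    Explain the difference between the Limit parameter and the number of returned items.

    The Limit parameter caps how many items DynamoDB evaluates (ScannedCount).
    The filter expression is applied after that evaluation, so the number of
    items actually returned (Count) can be smaller than the limit.
    """
    scanned = response.get("ScannedCount", 0)
    returned = response.get("Count", len(response.get("Items", [])))
    filtered_out = scanned - returned

    print(f"\nDynamoDB evaluated {scanned} items (bounded by Limit).")
    print(f"{returned} items passed the filter expression and were returned.")
    if filtered_out > 0:
        print(f"{filtered_out} evaluated items were removed by the filter after being read.")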
-
For API details, see Query in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to query a table with nested attributes.
Access and filter by nested attributes in DynamoDB items.
Use document path expressions to reference nested elements.
- SDK for Python (Boto3)
-
Query a DynamoDB table with nested attributes using AWS SDK for Python (Boto3).
from typing import Any, Dict, List import boto3 from boto3.dynamodb.conditions import Attr, Key def query_with_nested_attributes( table_name: str, partition_key_name: str, partition_key_value: str, nested_path: str, comparison_operator: str, comparison_value: Any, ) -> Dict[str, Any]: """ Query a DynamoDB table and filter by nested attributes. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. nested_path (str): The path to the nested attribute (e.g., 'specs.weight'). comparison_operator (str): The comparison operator to use ('=', '!=', '<', '<=', '>', '>='). comparison_value (any): The value to compare against. Returns: dict: The response from DynamoDB containing the query results. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Build the key condition expression key_condition = Key(partition_key_name).eq(partition_key_value) # Build the filter expression based on the nested attribute path and comparison operator filter_expression = None if comparison_operator == "=": filter_expression = Attr(nested_path).eq(comparison_value) elif comparison_operator == "!=": filter_expression = Attr(nested_path).ne(comparison_value) elif comparison_operator == "<": filter_expression = Attr(nested_path).lt(comparison_value) elif comparison_operator == "<=": filter_expression = Attr(nested_path).lte(comparison_value) elif comparison_operator == ">": filter_expression = Attr(nested_path).gt(comparison_value) elif comparison_operator == ">=": filter_expression = Attr(nested_path).gte(comparison_value) elif comparison_operator == "contains": filter_expression = Attr(nested_path).contains(comparison_value) elif comparison_operator == "begins_with": filter_expression = Attr(nested_path).begins_with(comparison_value) # Execute the query with the filter expression response = table.query(KeyConditionExpression=key_condition, FilterExpression=filter_expression) return response def query_with_multiple_nested_attributes( table_name: str, partition_key_name: str, partition_key_value: str, nested_conditions: List[Dict[str, Any]], ) -> Dict[str, Any]: """ Query a DynamoDB table and filter by multiple nested attributes. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. nested_conditions (list): A list of dictionaries, each containing: - path (str): The path to the nested attribute - operator (str): The comparison operator - value (any): The value to compare against Returns: dict: The response from DynamoDB containing the query results. 
""" # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Build the key condition expression key_condition = Key(partition_key_name).eq(partition_key_value) # Build the combined filter expression for all nested attributes combined_filter = None for condition in nested_conditions: if not isinstance(condition, dict): continue path = condition.get("path", "") operator = condition.get("operator", "") value = condition.get("value") if not path or not operator: continue # Build the individual filter expression current_filter = None if operator == "=": current_filter = Attr(path).eq(value) elif operator == "!=": current_filter = Attr(path).ne(value) elif operator == "<": current_filter = Attr(path).lt(value) elif operator == "<=": current_filter = Attr(path).lte(value) elif operator == ">": current_filter = Attr(path).gt(value) elif operator == ">=": current_filter = Attr(path).gte(value) elif operator == "contains": current_filter = Attr(path).contains(value) elif operator == "begins_with": current_filter = Attr(path).begins_with(value) # Combine with the existing filter using AND if current_filter: if combined_filter is None: combined_filter = current_filter else: combined_filter = combined_filter & current_filter # Execute the query with the combined filter expression response = table.query(KeyConditionExpression=key_condition, FilterExpression=combined_filter) return response
-
For API details, see Query in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to query a table with pagination.
Implement pagination for DynamoDB query results.
Use the LastEvaluatedKey to retrieve subsequent pages.
Control the number of items per page with the Limit parameter.
- SDK for Python (Boto3)
-
Query a DynamoDB table with pagination using AWS SDK for Python (Boto3).
import boto3 from boto3.dynamodb.conditions import Key def query_with_pagination( table_name, partition_key_name, partition_key_value, page_size=25, max_pages=None ): """ Query a DynamoDB table with pagination to handle large result sets. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. page_size (int, optional): The number of items to return per page. Defaults to 25. max_pages (int, optional): The maximum number of pages to retrieve. If None, retrieves all pages. Returns: list: All items retrieved from the query across all pages. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Initialize variables for pagination last_evaluated_key = None page_count = 0 all_items = [] # Paginate through the results while True: # Check if we've reached the maximum number of pages if max_pages is not None and page_count >= max_pages: break # Prepare the query parameters query_params = { "KeyConditionExpression": Key(partition_key_name).eq(partition_key_value), "Limit": page_size, } # Add the ExclusiveStartKey if we have a LastEvaluatedKey from a previous query if last_evaluated_key: query_params["ExclusiveStartKey"] = last_evaluated_key # Execute the query response = table.query(**query_params) # Process the current page of results items = response.get("Items", []) all_items.extend(items) # Update pagination tracking page_count += 1 # Get the LastEvaluatedKey for the next page, if any last_evaluated_key = response.get("LastEvaluatedKey") # If there's no LastEvaluatedKey, we've reached the end of the results if not last_evaluated_key: break return all_items def query_with_pagination_generator( table_name, partition_key_name, partition_key_value, page_size=25 ): """ Query a DynamoDB table with pagination using a generator to handle large result sets. This approach is memory-efficient as it yields one page at a time. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. page_size (int, optional): The number of items to return per page. Defaults to 25. 
Yields: tuple: A tuple containing (items, page_number, last_page) where: - items is a list of items for the current page - page_number is the current page number (starting from 1) - last_page is a boolean indicating if this is the last page """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Initialize variables for pagination last_evaluated_key = None page_number = 0 # Paginate through the results while True: # Prepare the query parameters query_params = { "KeyConditionExpression": Key(partition_key_name).eq(partition_key_value), "Limit": page_size, } # Add the ExclusiveStartKey if we have a LastEvaluatedKey from a previous query if last_evaluated_key: query_params["ExclusiveStartKey"] = last_evaluated_key # Execute the query response = table.query(**query_params) # Get the current page of results items = response.get("Items", []) page_number += 1 # Get the LastEvaluatedKey for the next page, if any last_evaluated_key = response.get("LastEvaluatedKey") # Determine if this is the last page is_last_page = last_evaluated_key is None # Yield the current page of results yield (items, page_number, is_last_page) # If there's no LastEvaluatedKey, we've reached the end of the results if is_last_page: break
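Demonstrates how the pagination helpers above might be called. This usage sketch is not part of the original example; the Orders table and the CustomerId partition key are hypothetical placeholders.
def example_usage():
    """Example of how to use the pagination helpers."""
    # Hypothetical table and key names; replace with your own.
    all_items = query_with_pagination(
        table_name="Orders",
        partition_key_name="CustomerId",
        partition_key_value="customer123",
        page_size=25,
        max_pages=None,  # Retrieve every page
    )
    print(f"Retrieved {len(all_items)} items in total")

    # The generator version yields one page at a time, which keeps memory usage low.
    for items, page_number, is_last_page in query_with_pagination_generator(
        table_name="Orders",
        partition_key_name="CustomerId",
        partition_key_value="customer123",
        page_size=25,
    ):
        suffix = " (last page)" if is_last_page else ""
        print(f"Page {page_number}: {len(items)} items{suffix}")


if __name__ == "__main__":
    example_usage()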
-
For API details, see Query in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to query a table with strongly consistent reads.
Configure the consistency level for DynamoDB queries.
Use strongly consistent reads to get the most up-to-date data.
Understand the tradeoffs between eventual consistency and strong consistency.
- SDK for Python (Boto3)
-
Query a DynamoDB table with the option for strongly consistent reads using AWS SDK for Python (Boto3).
import time import boto3 from boto3.dynamodb.conditions import Key def query_with_consistent_read( table_name, partition_key_name, partition_key_value, sort_key_name=None, sort_key_value=None, consistent_read=True, ): """ Query a DynamoDB table with the option for strongly consistent reads. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. sort_key_name (str, optional): The name of the sort key attribute. sort_key_value (str, optional): The value of the sort key to query. consistent_read (bool, optional): Whether to use strongly consistent reads. Defaults to True. Returns: dict: The response from DynamoDB containing the query results. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Build the key condition expression key_condition = Key(partition_key_name).eq(partition_key_value) if sort_key_name and sort_key_value: key_condition = key_condition & Key(sort_key_name).eq(sort_key_value) # Perform the query with the consistent read option response = table.query(KeyConditionExpression=key_condition, ConsistentRead=consistent_read) return response
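A short usage sketch for the function above; the table and key values are hypothetical. It runs the same query twice, once with eventual consistency and once with strong consistency, so you can compare the two read modes.
# Hypothetical usage of query_with_consistent_read defined above.
# Assumes a table named "Orders" with partition key "CustomerId".

# Eventually consistent read: lower cost, but might not reflect a very recent write.
eventual = query_with_consistent_read(
    table_name="Orders",
    partition_key_name="CustomerId",
    partition_key_value="customer-123",
    consistent_read=False,
)
print(f"Eventually consistent read returned {len(eventual.get('Items', []))} items")

# Strongly consistent read: reflects all writes acknowledged before the query,
# at roughly twice the read capacity cost of an eventually consistent read.
strong = query_with_consistent_read(
    table_name="Orders",
    partition_key_name="CustomerId",
    partition_key_value="customer-123",
    consistent_read=True,
)
print(f"Strongly consistent read returned {len(strong.get('Items', []))} items")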
-
For API details, see Query in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to query for TTL items.
- SDK for Python (Boto3)
-
Query with a filter expression to return only unexpired TTL items in a DynamoDB table using AWS SDK for Python (Boto3).
from datetime import datetime

import boto3
from boto3.dynamodb.conditions import Attr, Key


def query_dynamodb_items(table_name, partition_key):
    """
    Query a table and filter out items whose TTL attribute has already expired.

    :param table_name: Name of the DynamoDB table.
    :param partition_key: Value of the partition key ("partitionKey") to query.
    :return: Void (nothing)
    """
    try:
        # Initialize a DynamoDB resource
        dynamodb = boto3.resource("dynamodb", region_name="us-east-1")

        # Specify your table
        table = dynamodb.Table(table_name)

        # Get the current time in epoch second format
        current_time = int(datetime.now().timestamp())

        # Perform the query operation with a filter expression to exclude expired items
        response = table.query(
            KeyConditionExpression=Key("partitionKey").eq(partition_key),
            FilterExpression=Attr("expireAt").gt(current_time),
        )

        # Print the items that are not expired
        for item in response["Items"]:
            print(item)
    except Exception as e:
        print(f"Error querying items: {e}")


# Call the function with your values
query_dynamodb_items("Music", "your-partition-key-value")
-
For API details, see Query in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to query tables using date and time patterns.
Store and query date/time values in DynamoDB.
Implement date range queries using sort keys.
Format date strings for effective querying.
- SDK for Python (Boto3)
-
Query using date ranges in sort keys with AWS SDK for Python (Boto3).
from datetime import datetime, timedelta

import boto3
from boto3.dynamodb.conditions import Key


def query_with_date_range(
    table_name, partition_key_name, partition_key_value, sort_key_name, start_date, end_date
):
    """
    Query a DynamoDB table with a date range on the sort key.

    Args:
        table_name (str): The name of the DynamoDB table.
        partition_key_name (str): The name of the partition key attribute.
        partition_key_value (str): The value of the partition key to query.
        sort_key_name (str): The name of the sort key attribute (containing date values).
        start_date (datetime): The start date for the query range.
        end_date (datetime): The end date for the query range.

    Returns:
        dict: The response from DynamoDB containing the query results.
    """
    # Initialize the DynamoDB resource
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table(table_name)

    # Format the date values as ISO 8601 strings
    # DynamoDB works well with ISO format for date values
    start_date_str = start_date.isoformat()
    end_date_str = end_date.isoformat()

    # Perform the query with a date range on the sort key using the BETWEEN operator.
    # The Key condition objects supply their own placeholder values, so no separate
    # ExpressionAttributeValues parameter is needed.
    key_condition = Key(partition_key_name).eq(partition_key_value) & Key(sort_key_name).between(
        start_date_str, end_date_str
    )

    response = table.query(KeyConditionExpression=key_condition)

    return response


def query_with_date_range_by_month(
    table_name, partition_key_name, partition_key_value, sort_key_name, year, month
):
    """
    Query a DynamoDB table for a specific month's data.

    Args:
        table_name (str): The name of the DynamoDB table.
        partition_key_name (str): The name of the partition key attribute.
        partition_key_value (str): The value of the partition key to query.
        sort_key_name (str): The name of the sort key attribute (containing date values).
        year (int): The year to query.
        month (int): The month to query (1-12).

    Returns:
        dict: The response from DynamoDB containing the query results.
    """
    # Initialize the DynamoDB resource
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table(table_name)

    # Calculate the start and end dates for the specified month
    if month == 12:
        next_year = year + 1
        next_month = 1
    else:
        next_year = year
        next_month = month + 1

    start_date = datetime(year, month, 1)
    end_date = datetime(next_year, next_month, 1) - timedelta(microseconds=1)

    # Format the date values as ISO 8601 strings
    start_date_str = start_date.isoformat()
    end_date_str = end_date.isoformat()

    # Perform the query with a date range on the sort key
    key_condition = Key(partition_key_name).eq(partition_key_value) & Key(sort_key_name).between(
        start_date_str, end_date_str
    )

    response = table.query(KeyConditionExpression=key_condition)

    return response
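A brief usage sketch for the two date-range helpers above; the table name, key names, and key value are hypothetical placeholders.
from datetime import datetime, timedelta

# Hypothetical usage of the date-range helpers defined above.
# Assumes a table named "Events" with partition key "EventType" and a
# sort key "Timestamp" that stores ISO 8601 date strings.

# Query a rolling 30-day window ending now.
end = datetime.now()
start = end - timedelta(days=30)
recent = query_with_date_range(
    table_name="Events",
    partition_key_name="EventType",
    partition_key_value="UserLogin",
    sort_key_name="Timestamp",
    start_date=start,
    end_date=end,
)
print(f"Last 30 days: {len(recent.get('Items', []))} items")

# Query everything recorded in May 2025.
monthly = query_with_date_range_by_month(
    table_name="Events",
    partition_key_name="EventType",
    partition_key_value="UserLogin",
    sort_key_name="Timestamp",
    year=2025,
    month=5,
)
print(f"May 2025: {len(monthly.get('Items', []))} items")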
Query using date-time variables with AWS SDK for Python (Boto3).
from datetime import datetime, timedelta

import boto3
from boto3.dynamodb.conditions import Key


def query_with_datetime(
    table_name, partition_key_name, partition_key_value, sort_key_name, start_date, end_date
):
    """
    Query a DynamoDB table with a date range filter on the sort key.

    Args:
        table_name (str): The name of the DynamoDB table.
        partition_key_name (str): The name of the partition key attribute.
        partition_key_value (str): The value of the partition key to query.
        sort_key_name (str): The name of the sort key attribute (containing date/time values).
        start_date (datetime): The start date/time for the query range.
        end_date (datetime): The end date/time for the query range.

    Returns:
        dict: The response from DynamoDB containing the query results.
    """
    # Initialize the DynamoDB resource
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table(table_name)

    # Format the date/time values as ISO 8601 strings
    # DynamoDB works well with ISO format for date/time values
    start_date_str = start_date.isoformat()
    end_date_str = end_date.isoformat()

    # Perform the query with a date range on the sort key. The Key condition
    # objects supply their own placeholder values, so no separate
    # ExpressionAttributeValues parameter is needed.
    key_condition = Key(partition_key_name).eq(partition_key_value) & Key(sort_key_name).between(
        start_date_str, end_date_str
    )

    response = table.query(KeyConditionExpression=key_condition)

    return response


def example_usage():
    """Example of how to use the query_with_datetime function."""
    # Example parameters
    table_name = "Events"
    partition_key_name = "EventType"
    partition_key_value = "UserLogin"
    sort_key_name = "Timestamp"

    # Create date/time variables for the query
    end_date = datetime.now()
    start_date = end_date - timedelta(days=7)

    # Query events from the last 7 days
    print(f"Querying events from {start_date.isoformat()} to {end_date.isoformat()}")

    # Execute the query
    response = query_with_datetime(
        table_name, partition_key_name, partition_key_value, sort_key_name, start_date, end_date
    )

    # Process the results
    items = response.get("Items", [])
    print(f"Found {len(items)} items")
    for item in items:
        print(f"Event: {item}")
-
For API details, see Query in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to work with the order of operations in update expressions.
Learn how DynamoDB processes update expressions.
Understand the order of operations in update expressions.
Avoid unexpected results by understanding expression evaluation.
- SDK for Python (Boto3)
-
Demonstrate update expression order using AWS SDK for Python (Boto3).
import boto3 import json from typing import Any, Dict, Optional def update_with_multiple_actions( table_name: str, key: Dict[str, Any], update_expression: str, expression_attribute_names: Optional[Dict[str, str]] = None, expression_attribute_values: Optional[Dict[str, Any]] = None, ) -> Dict[str, Any]: """ Update an item with multiple actions in a single update expression. This function demonstrates how to use multiple actions in a single update expression and how DynamoDB processes these actions. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. update_expression (str): The update expression with multiple actions. expression_attribute_names (Optional[Dict[str, str]]): Expression attribute name placeholders. expression_attribute_values (Optional[Dict[str, Any]]): Expression attribute value placeholders. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Prepare the update parameters update_params = { "Key": key, "UpdateExpression": update_expression, "ReturnValues": "UPDATED_NEW", } # Add expression attribute names if provided if expression_attribute_names: update_params["ExpressionAttributeNames"] = expression_attribute_names # Add expression attribute values if provided if expression_attribute_values: update_params["ExpressionAttributeValues"] = expression_attribute_values # Execute the update response = table.update_item(**update_params) return response def demonstrate_value_copying(table_name: str, key: Dict[str, Any]) -> Dict[str, Any]: """ Demonstrate that variables hold copies of existing values before modifications. This function creates an item with initial values, then updates it with an expression that uses the values of attributes before they are modified in the same expression. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to create and update. Returns: Dict[str, Any]: A dictionary containing the results of the demonstration. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Step 1: Create an item with initial values initial_item = key.copy() initial_item.update({"a": 1, "b": 2, "c": 3}) table.put_item(Item=initial_item) # Step 2: Get the item to verify initial state response_before = table.get_item(Key=key) item_before = response_before.get("Item", {}) # Step 3: Update the item with an expression that uses values before they are modified # This expression removes 'a', then sets 'b' to the value of 'a', and 'c' to the value of 'b' update_response = table.update_item( Key=key, UpdateExpression="REMOVE a SET b = a, c = b", ReturnValues="UPDATED_NEW" ) # Step 4: Get the item to verify final state response_after = table.get_item(Key=key) item_after = response_after.get("Item", {}) # Return the results return { "initial_state": item_before, "update_response": update_response, "final_state": item_after, } def demonstrate_action_order(table_name: str, key: Dict[str, Any]) -> Dict[str, Any]: """ Demonstrate the order in which different action types are processed. This function creates an item with initial values, then updates it with an expression that includes multiple action types (SET, REMOVE, ADD, DELETE) to show the order in which they are processed. Args: table_name (str): The name of the DynamoDB table. 
key (Dict[str, Any]): The primary key of the item to create and update. Returns: Dict[str, Any]: A dictionary containing the results of the demonstration. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Step 1: Create an item with initial values initial_item = key.copy() initial_item.update( { "counter": 10, "set_attr": set(["A", "B", "C"]), "to_remove": "This will be removed", "to_modify": "Original value", } ) table.put_item(Item=initial_item) # Step 2: Get the item to verify initial state response_before = table.get_item(Key=key) item_before = response_before.get("Item", {}) # Step 3: Update the item with multiple action types # The actions will be processed in this order: REMOVE, SET, ADD, DELETE update_response = table.update_item( Key=key, UpdateExpression="REMOVE to_remove SET to_modify = :new_value ADD counter :increment DELETE set_attr :elements", ExpressionAttributeValues={ ":new_value": "Updated value", ":increment": 5, ":elements": set(["B"]), }, ReturnValues="UPDATED_NEW", ) # Step 4: Get the item to verify final state response_after = table.get_item(Key=key) item_after = response_after.get("Item", {}) # Return the results return { "initial_state": item_before, "update_response": update_response, "final_state": item_after, } def update_with_multiple_set_actions( table_name: str, key: Dict[str, Any], attributes: Dict[str, Any] ) -> Dict[str, Any]: """ Update multiple attributes with a single SET action. This function demonstrates how to update multiple attributes in a single SET action, which is more efficient than using multiple separate update operations. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. attributes (Dict[str, Any]): The attributes to update and their new values. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Build the update expression and expression attribute values update_expression = "SET " expression_attribute_values = {} # Add each attribute to the update expression for i, (attr_name, attr_value) in enumerate(attributes.items()): value_placeholder = f":val{i}" if i > 0: update_expression += ", " update_expression += f"{attr_name} = {value_placeholder}" expression_attribute_values[value_placeholder] = attr_value # Execute the update response = table.update_item( Key=key, UpdateExpression=update_expression, ExpressionAttributeValues=expression_attribute_values, ReturnValues="UPDATED_NEW", ) return response def update_with_conditional_value_copying( table_name: str, key: Dict[str, Any], source_attribute: str, target_attribute: str, default_value: Any, ) -> Dict[str, Any]: """ Update an attribute with a value from another attribute or a default value. This function demonstrates how to use if_not_exists to conditionally copy a value from one attribute to another, or use a default value if the source doesn't exist. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. source_attribute (str): The attribute to copy the value from. target_attribute (str): The attribute to update. default_value (Any): The default value to use if the source attribute doesn't exist. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. 
""" # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Use if_not_exists to conditionally copy the value response = table.update_item( Key=key, UpdateExpression=f"SET {target_attribute} = if_not_exists({source_attribute}, :default)", ExpressionAttributeValues={":default": default_value}, ReturnValues="UPDATED_NEW", ) return response
Example usage of update expression order with AWS SDK for Python (Boto3).
def example_usage(): """Example of how to use update expression order of operations in DynamoDB.""" # Example parameters table_name = "OrderProcessing" key = {"OrderId": "order123"} print("Example 1: Demonstrating value copying in update expressions") try: results = demonstrate_value_copying(table_name=table_name, key=key) print(f"Initial state: {json.dumps(results['initial_state'], default=str)}") print(f"Update response: {json.dumps(results['update_response'], default=str)}") print(f"Final state: {json.dumps(results['final_state'], default=str)}") print("\nExplanation:") print("1. The initial state had a=1, b=2, c=3") print("2. The update expression 'REMOVE a SET b = a, c = b' did the following:") print(" - Copied the value of 'a' (which was 1) to be used for 'b'") print(" - Copied the value of 'b' (which was 2) to be used for 'c'") print(" - Removed the attribute 'a'") print("3. The final state has b=1, c=2, and 'a' is removed") print( "4. This demonstrates that DynamoDB uses the values of attributes as they were BEFORE any modifications" ) except Exception as e: print(f"Error demonstrating value copying: {e}") print("\nExample 2: Demonstrating the order of different action types") try: results = demonstrate_action_order(table_name=table_name, key={"OrderId": "order456"}) print(f"Initial state: {json.dumps(results['initial_state'], default=str)}") print(f"Update response: {json.dumps(results['update_response'], default=str)}") print(f"Final state: {json.dumps(results['final_state'], default=str)}") print("\nExplanation:") print("1. The update expression contained multiple action types: REMOVE, SET, ADD, DELETE") print("2. DynamoDB processes these actions in this order: REMOVE, SET, ADD, DELETE") print("3. First, 'to_remove' was removed") print("4. Then, 'to_modify' was set to a new value") print("5. Next, 'counter' was incremented by 5") print("6. Finally, 'B' was removed from the set attribute") except Exception as e: print(f"Error demonstrating action order: {e}") print("\nExample 3: Updating multiple attributes in a single SET action") try: response = update_with_multiple_set_actions( table_name=table_name, key={"OrderId": "order789"}, attributes={ "Status": "Shipped", "ShippingDate": "2025-05-14", "TrackingNumber": "1Z999AA10123456784", }, ) print( f"Multiple attributes updated successfully: {json.dumps(response.get('Attributes', {}), default=str)}" ) except Exception as e: print(f"Error updating multiple attributes: {e}") print("\nExample 4: Conditional value copying with if_not_exists") try: response = update_with_conditional_value_copying( table_name=table_name, key={"OrderId": "order101"}, source_attribute="PreferredShippingMethod", target_attribute="ShippingMethod", default_value="Standard", ) print( f"Conditional value copying result: {json.dumps(response.get('Attributes', {}), default=str)}" ) except Exception as e: print(f"Error with conditional value copying: {e}") print("\nKey Points About Update Expression Order of Operations:") print( "1. Variables in expressions hold copies of attribute values as they existed BEFORE any modifications" ) print( "2. Multiple actions in an update expression are processed in this order: REMOVE, SET, ADD, DELETE" ) print("3. Within each action type, operations are processed from left to right") print("4. You can reference the same attribute multiple times in an expression") print("5. You can use if_not_exists() to conditionally set values based on attribute existence") print( "6. 
Using a single update expression with multiple actions is more efficient than multiple separate updates" ) print("7. The update expression is atomic - either all actions succeed or none do")
-
For API details, see UpdateItem in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to update a table's warm throughput setting.
- SDK for Python (Boto3)
-
Update warm throughput setting on an existing DynamoDB table using AWS SDK for Python (Boto3).
from boto3 import client from botocore.exceptions import ClientError def update_dynamodb_table_warm_throughput( table_name, table_read_units, table_write_units, gsi_name, gsi_read_units, gsi_write_units, region_name="us-east-1", ): """ Updates the warm throughput of a DynamoDB table and a global secondary index. :param table_name: The name of the table to update. :param table_read_units: The new read units per second for the table's warm throughput. :param table_write_units: The new write units per second for the table's warm throughput. :param gsi_name: The name of the global secondary index to update. :param gsi_read_units: The new read units per second for the GSI's warm throughput. :param gsi_write_units: The new write units per second for the GSI's warm throughput. :param region_name: The AWS Region name to target. defaults to us-east-1 :return: The response from the update_table operation """ try: ddb = client("dynamodb", region_name=region_name) # Update the table's warm throughput table_warm_throughput = { "ReadUnitsPerSecond": table_read_units, "WriteUnitsPerSecond": table_write_units, } # Update the global secondary index's warm throughput gsi_warm_throughput = { "ReadUnitsPerSecond": gsi_read_units, "WriteUnitsPerSecond": gsi_write_units, } # Construct the global secondary index update global_secondary_index_update = [ {"Update": {"IndexName": gsi_name, "WarmThroughput": gsi_warm_throughput}} ] # Construct the update table request update_table_request = { "TableName": table_name, "GlobalSecondaryIndexUpdates": global_secondary_index_update, "WarmThroughput": table_warm_throughput, } # Update the table response = ddb.update_table(**update_table_request) print("Table updated successfully!") return response # Make sure to return the response except ClientError as e: print(f"Error updating table: {e}") raise e
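A short usage sketch for the function above; the table name, index name, and throughput values are hypothetical placeholders.
# Hypothetical usage of update_dynamodb_table_warm_throughput defined above.
# Warm throughput values describe how many read and write units per second the
# table and its global secondary index should be able to serve instantly.
response = update_dynamodb_table_warm_throughput(
    table_name="GameScores",
    table_read_units=15000,
    table_write_units=5000,
    gsi_name="GameTitleIndex",
    gsi_read_units=12000,
    gsi_write_units=4000,
    region_name="us-east-1",
)
print(response["TableDescription"]["TableStatus"])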
-
For API details, see UpdateTable in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to update an item's TTL.
- SDK for Python (Boto3)
-
from datetime import datetime, timedelta import boto3 def update_dynamodb_item(table_name, region, primary_key, sort_key): """ Update an existing DynamoDB item with a TTL. :param table_name: Name of the DynamoDB table :param region: AWS Region of the table - example `us-east-1` :param primary_key: one attribute known as the partition key. :param sort_key: Also known as a range attribute. :return: Void (nothing) """ try: # Create the DynamoDB resource. dynamodb = boto3.resource("dynamodb", region_name=region) table = dynamodb.Table(table_name) # Get the current time in epoch second format current_time = int(datetime.now().timestamp()) # Calculate the expireAt time (90 days from now) in epoch second format expire_at = int((datetime.now() + timedelta(days=90)).timestamp()) table.update_item( Key={"partitionKey": primary_key, "sortKey": sort_key}, UpdateExpression="set updatedAt=:c, expireAt=:e", ExpressionAttributeValues={":c": current_time, ":e": expire_at}, ) print("Item updated successfully.") except Exception as e: print(f"Error updating item: {e}") # Replace with your own values update_dynamodb_item( "your-table-name", "us-west-2", "your-partition-key-value", "your-sort-key-value" )
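The expireAt attribute only causes items to expire if TTL is enabled on the table for that attribute. The following is a minimal sketch of one way to enable it, using placeholder table and Region values.
import boto3


def enable_ttl(table_name, region, ttl_attribute_name="expireAt"):
    """Enable TTL so DynamoDB deletes items whose TTL attribute time has passed."""
    client = boto3.client("dynamodb", region_name=region)
    response = client.update_time_to_live(
        TableName=table_name,
        TimeToLiveSpecification={"Enabled": True, "AttributeName": ttl_attribute_name},
    )
    print(f"TTL specification: {response['TimeToLiveSpecification']}")


# Replace with your own values
enable_ttl("your-table-name", "us-west-2")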
-
For API details, see UpdateItem in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to create an AWS Lambda function invoked by Amazon API Gateway.
- SDK for Python (Boto3)
-
This example shows how to create and use an Amazon API Gateway REST API that targets an AWS Lambda function. The Lambda handler demonstrates how to route based on HTTP methods; how to get data from the query string, header, and body; and how to return a JSON response.
Deploy a Lambda function.
Create an API Gateway REST API.
Create a REST resource that targets the Lambda function.
Grant permission to let API Gateway invoke the Lambda function.
Use the Requests package to send requests to the REST API.
Clean up all resources created during the demo.
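The complete code lives on GitHub; the following is only a minimal, hypothetical sketch of the kind of Lambda handler the scenario describes, routing on the HTTP method and reading the query string, headers, and body of an API Gateway proxy event.
import json


def lambda_handler(event, context):
    """Hypothetical handler sketch for an API Gateway proxy integration."""
    method = event.get("httpMethod", "GET")
    query = event.get("queryStringParameters") or {}
    headers = event.get("headers") or {}
    body = json.loads(event["body"]) if event.get("body") else {}

    if method == "GET":
        # Read a value from the query string, for example ?name=World.
        message = f"Hello, {query.get('name', 'World')}!"
    elif method == "POST":
        # Read a value from the JSON body and a header.
        message = f"Received '{body.get('action', 'nothing')}' from {headers.get('User-Agent', 'an unknown client')}"
    else:
        return {"statusCode": 405, "body": json.dumps({"error": f"{method} not allowed"})}

    # Return a JSON response in the shape API Gateway proxy integrations expect.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": message}),
    }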
This example is best viewed on GitHub. For complete source code and instructions on how to set up and run, see the full example on GitHub.
Services used in this example
API Gateway
DynamoDB
Lambda
Amazon SNS
The following code example shows how to use atomic counter operations in DynamoDB.
Increment counters atomically using ADD and SET operations.
Safely increment counters that might not exist.
Implement optimistic locking for counter operations.
- SDK for Python (Boto3)
-
Demonstrate atomic counter operations using AWS SDK for Python (Boto3).
import boto3 from botocore.exceptions import ClientError from typing import Any, Dict, Union def increment_counter_with_add( table_name: str, key: Dict[str, Any], counter_name: str, increment_value: int = 1 ) -> Dict[str, Any]: """ Increment a counter attribute using the ADD operation. This function demonstrates the atomic ADD operation, which is ideal for incrementing counters without the risk of race conditions. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. counter_name (str): The name of the counter attribute. increment_value (int, optional): The value to increment by. Defaults to 1. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Use the ADD operation to atomically increment the counter response = table.update_item( Key=key, UpdateExpression="ADD #counter :increment", ExpressionAttributeNames={"#counter": counter_name}, ExpressionAttributeValues={":increment": increment_value}, ReturnValues="UPDATED_NEW", ) return response def increment_counter_with_set( table_name: str, key: Dict[str, Any], counter_name: str, increment_value: int = 1 ) -> Dict[str, Any]: """ Increment a counter attribute using the SET operation with an expression. This function demonstrates using SET with an expression to increment a counter. While this works, it's generally recommended to use ADD for simple increments. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. counter_name (str): The name of the counter attribute. increment_value (int, optional): The value to increment by. Defaults to 1. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Use the SET operation with an expression to increment the counter response = table.update_item( Key=key, UpdateExpression="SET #counter = #counter + :increment", ExpressionAttributeNames={"#counter": counter_name}, ExpressionAttributeValues={":increment": increment_value}, ReturnValues="UPDATED_NEW", ) return response def increment_counter_safely( table_name: str, key: Dict[str, Any], counter_name: str, increment_value: int = 1, initial_value: int = 0, ) -> Dict[str, Any]: """ Increment a counter attribute safely, handling the case where it might not exist. This function demonstrates a best practice for incrementing counters by using the if_not_exists function to handle the case where the counter doesn't exist yet. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. counter_name (str): The name of the counter attribute. increment_value (int, optional): The value to increment by. Defaults to 1. initial_value (int, optional): The initial value if the counter doesn't exist. Defaults to 0. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. 
""" # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Use SET with if_not_exists to safely increment the counter response = table.update_item( Key=key, UpdateExpression="SET #counter = if_not_exists(#counter, :initial) + :increment", ExpressionAttributeNames={"#counter": counter_name}, ExpressionAttributeValues={":increment": increment_value, ":initial": initial_value}, ReturnValues="UPDATED_NEW", ) return response def atomic_conditional_increment( table_name: str, key: Dict[str, Any], counter_name: str, condition_attribute: str, condition_value: Any, increment_value: int = 1, ) -> Union[Dict[str, Any], None]: """ Atomically increment a counter only if a condition is met. This function demonstrates combining atomic counter operations with conditional expressions for more complex update scenarios. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. counter_name (str): The name of the counter attribute. condition_attribute (str): The attribute to check in the condition. condition_value (Any): The value to compare against. increment_value (int, optional): The value to increment by. Defaults to 1. Returns: Optional[Dict[str, Any]]: The response from DynamoDB if successful, None if condition failed. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) try: # Use ADD with a condition expression response = table.update_item( Key=key, UpdateExpression="ADD #counter :increment", ConditionExpression="#condition = :value", ExpressionAttributeNames={"#counter": counter_name, "#condition": condition_attribute}, ExpressionAttributeValues={":increment": increment_value, ":value": condition_value}, ReturnValues="UPDATED_NEW", ) return response except ClientError as e: if e.response["Error"]["Code"] == "ConditionalCheckFailedException": # Condition was not met return None else: # Other error occurred raise
Example usage of atomic counter operations with AWS SDK for Python (Boto3).
def example_usage():
    """Example of how to use the atomic counter operations functions."""
    # Example parameters
    table_name = "GameScores"
    key = {"UserId": "user123", "GameId": "game456"}
    counter_name = "Score"

    print("Example 1: Incrementing a counter with ADD operation")
    try:
        response = increment_counter_with_add(
            table_name=table_name, key=key, counter_name=counter_name, increment_value=10
        )
        print(
            f"Counter incremented successfully. New value: {response.get('Attributes', {}).get(counter_name)}"
        )
    except Exception as e:
        print(f"Error incrementing counter with ADD: {e}")

    print("\nExample 2: Incrementing a counter with SET operation")
    try:
        response = increment_counter_with_set(
            table_name=table_name, key=key, counter_name=counter_name, increment_value=5
        )
        print(
            f"Counter incremented successfully. New value: {response.get('Attributes', {}).get(counter_name)}"
        )
    except Exception as e:
        print(f"Error incrementing counter with SET: {e}")

    print("\nExample 3: Safely incrementing a counter that might not exist")
    try:
        new_key = {"UserId": "newuser789", "GameId": "game456"}
        response = increment_counter_safely(
            table_name=table_name,
            key=new_key,
            counter_name=counter_name,
            increment_value=15,
            initial_value=100,
        )
        print(
            f"Counter safely incremented. New value: {response.get('Attributes', {}).get(counter_name)}"
        )
    except Exception as e:
        print(f"Error safely incrementing counter: {e}")

    print("\nExample 4: Conditional counter increment")
    try:
        # The conditional increment returns None when the condition is not met.
        result = atomic_conditional_increment(
            table_name=table_name,
            key=key,
            counter_name="Achievements",
            condition_attribute="Level",
            condition_value=5,
            increment_value=1,
        )
        if result is not None:
            print(
                f"Conditional increment succeeded. New value: {result.get('Attributes', {}).get('Achievements')}"
            )
        else:
            print("Conditional increment failed because condition was not met.")
    except Exception as e:
        print(f"Error with conditional increment: {e}")

    print("\nComparison of ADD vs SET for counter operations:")
    print("1. ADD is specifically designed for atomic numeric increments and set operations")
    print("2. SET with an expression can be used for more complex calculations")
    print("3. Both operations are atomic, preventing race conditions")
    print("4. ADD is more concise for simple increments")
    print("5. SET with if_not_exists() is recommended when the attribute might not exist")
    print("6. For counters, ADD is generally preferred for clarity and simplicity")
-
For API details, see UpdateItem in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to use conditional operations in DynamoDB.
Implement conditional writes to prevent overwriting data.
Use condition expressions to enforce business rules.
Handle conditional check failures gracefully.
- SDK for Python (Boto3)
-
Demonstrate conditional operations using AWS SDK for Python (Boto3).
import boto3 from botocore.exceptions import ClientError from typing import Any, Dict, Optional, Tuple, Union def conditional_update( table_name: str, key: Dict[str, Any], condition_attribute: str, condition_value: Any, update_attribute: str, update_value: Any, ) -> Tuple[bool, Optional[Dict[str, Any]]]: """ Update an item only if a condition is met. This function demonstrates how to perform a conditional update operation and determine if the condition was met. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. condition_attribute (str): The attribute to check in the condition. condition_value (Any): The value to compare against. update_attribute (str): The attribute to update. update_value (Any): The new value to set. Returns: Tuple[bool, Optional[Dict[str, Any]]]: A tuple containing: - A boolean indicating if the update succeeded - The response from DynamoDB if successful, None otherwise """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) try: # Perform the conditional update response = table.update_item( Key=key, UpdateExpression="SET #update_attr = :update_val", ConditionExpression="#cond_attr = :cond_val", ExpressionAttributeNames={ "#update_attr": update_attribute, "#cond_attr": condition_attribute, }, ExpressionAttributeValues={":update_val": update_value, ":cond_val": condition_value}, ReturnValues="UPDATED_NEW", ) # Update succeeded, condition was met return True, response except ClientError as e: if e.response["Error"]["Code"] == "ConditionalCheckFailedException": # Condition was not met return False, None else: # Other error occurred raise def conditional_delete( table_name: str, key: Dict[str, Any], condition_attribute: str, condition_value: Any ) -> bool: """ Delete an item only if a condition is met. This function demonstrates how to perform a conditional delete operation and determine if the condition was met. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to delete. condition_attribute (str): The attribute to check in the condition. condition_value (Any): The value to compare against. Returns: bool: True if the delete succeeded (condition was met), False otherwise. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) try: # Perform the conditional delete table.delete_item( Key=key, ConditionExpression="#attr = :val", ExpressionAttributeNames={"#attr": condition_attribute}, ExpressionAttributeValues={":val": condition_value}, ) # Delete succeeded, condition was met return True except ClientError as e: if e.response["Error"]["Code"] == "ConditionalCheckFailedException": # Condition was not met return False else: # Other error occurred raise def optimistic_locking_update( table_name: str, key: Dict[str, Any], version_attribute: str, update_attribute: str, update_value: Any, ) -> Tuple[bool, Optional[Dict[str, Any]]]: """ Update an item using optimistic locking with a version attribute. This function demonstrates how to implement optimistic locking using a version attribute that is incremented with each update. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. version_attribute (str): The name of the version attribute. update_attribute (str): The attribute to update. update_value (Any): The new value to set. 
Returns: Tuple[bool, Optional[Dict[str, Any]]]: A tuple containing: - A boolean indicating if the update succeeded - The response from DynamoDB if successful, None otherwise """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # First, get the current version try: response = table.get_item( Key=key, ProjectionExpression=f"#{version_attribute}", ExpressionAttributeNames={f"#{version_attribute}": version_attribute}, ) item = response.get("Item", {}) current_version = item.get(version_attribute, 0) # Now, try to update with a condition on the version try: update_response = table.update_item( Key=key, UpdateExpression=f"SET #{update_attribute} = :update_val, #{version_attribute} = :new_version", ConditionExpression=f"#{version_attribute} = :current_version", ExpressionAttributeNames={ f"#{update_attribute}": update_attribute, f"#{version_attribute}": version_attribute, }, ExpressionAttributeValues={ ":update_val": update_value, ":current_version": current_version, ":new_version": current_version + 1, }, ReturnValues="UPDATED_NEW", ) # Update succeeded return True, update_response except ClientError as e: if e.response["Error"]["Code"] == "ConditionalCheckFailedException": # Version has changed, optimistic locking failed return False, None else: # Other error occurred raise except ClientError: # Error getting the item raise def conditional_check_and_update( table_name: str, key: Dict[str, Any], check_attribute: str, check_value: Any, update_attribute: str, update_value: Any, create_if_not_exists: bool = False, ) -> Union[Dict[str, Any], None]: """ Check if an attribute has a specific value and update another attribute if it does. This function demonstrates a more complex conditional update that can also create the item if it doesn't exist. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. check_attribute (str): The attribute to check in the condition. check_value (Any): The value to compare against. update_attribute (str): The attribute to update. update_value (Any): The new value to set. create_if_not_exists (bool, optional): Whether to create the item if it doesn't exist. Returns: Union[Dict[str, Any], None]: The response from DynamoDB if successful, None otherwise. 
""" # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) try: if create_if_not_exists: # Use attribute_not_exists to create the item if it doesn't exist condition_expression = "attribute_not_exists(#pk) OR #check_attr = :check_val" update_expression = "SET #update_attr = :update_val, #check_attr = if_not_exists(#check_attr, :check_val)" # Get the partition key name from the key dictionary pk_name = next(iter(key)) expression_attribute_names = { "#pk": pk_name, "#check_attr": check_attribute, "#update_attr": update_attribute, } else: # Only update if the check attribute has the expected value condition_expression = "#check_attr = :check_val" update_expression = "SET #update_attr = :update_val" expression_attribute_names = { "#check_attr": check_attribute, "#update_attr": update_attribute, } # Perform the conditional update response = table.update_item( Key=key, UpdateExpression=update_expression, ConditionExpression=condition_expression, ExpressionAttributeNames=expression_attribute_names, ExpressionAttributeValues={":check_val": check_value, ":update_val": update_value}, ReturnValues="UPDATED_NEW", ) return response except ClientError as e: if e.response["Error"]["Code"] == "ConditionalCheckFailedException": # Condition was not met return None else: # Other error occurred raise
Example usage of conditional operations with AWS SDK for Python (Boto3).
def example_usage(): """Example of how to use the conditional operations functions.""" # Example parameters table_name = "Products" key = {"ProductId": "prod123"} print("Example 1: Conditional Update") try: # Update the price only if the current stock is greater than 10 success, response = conditional_update( table_name=table_name, key=key, condition_attribute="Stock", condition_value=10, update_attribute="Price", update_value=99.99, ) if success: # Fix for mypy: Handle the case where response might be None attributes = {} if response is None else response.get("Attributes", {}) print(f"Update succeeded! New values: {attributes}") else: print("Update failed because the condition was not met.") except Exception as e: print(f"Error during conditional update: {e}") print("\nExample 2: Conditional Delete") try: # Delete the product only if it's discontinued success = conditional_delete( table_name=table_name, key=key, condition_attribute="Status", condition_value="Discontinued", ) if success: print("Delete succeeded! The item was deleted.") else: print("Delete failed because the condition was not met.") except Exception as e: print(f"Error during conditional delete: {e}") print("\nExample 3: Optimistic Locking") try: # Update with optimistic locking using a version attribute success, response = optimistic_locking_update( table_name=table_name, key=key, version_attribute="Version", update_attribute="Description", update_value="Updated product description", ) if success: # Fix for mypy: Handle the case where response might be None attributes = {} if response is None else response.get("Attributes", {}) print(f"Optimistic locking update succeeded! New values: {attributes}") else: print("Optimistic locking update failed because the version has changed.") except Exception as e: print(f"Error during optimistic locking update: {e}") print("\nExample 4: Conditional Check and Update") try: # Update the featured status if the product is in stock response = conditional_check_and_update( table_name=table_name, key=key, check_attribute="InStock", check_value=True, update_attribute="Featured", update_value=True, create_if_not_exists=True, ) if response: print( f"Conditional check and update succeeded! New values: {response.get('Attributes', {})}" ) else: print("Conditional check and update failed because the condition was not met.") except Exception as e: print(f"Error during conditional check and update: {e}") print("\nUnderstanding Conditional Operations in DynamoDB:") print("1. Conditional operations help maintain data integrity") print("2. They prevent race conditions in concurrent environments") print("3. Failed conditions result in ConditionalCheckFailedException") print("4. No DynamoDB capacity is consumed when conditions fail") print("5. Optimistic locking is a common pattern using version attributes") print("6. Conditions can be combined with logical operators (AND, OR, NOT)") print("7. Conditions can use comparison operators (=, <>, <, <=, >, >=)") print( "8. attribute_exists() and attribute_not_exists() are useful for checking attribute presence" )
-
For API details, see the following topics in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to use expression attribute names in DynamoDB.
Work with reserved words in DynamoDB expressions.
Use expression attribute name placeholders.
Handle special characters in attribute names.
- SDK for Python (Boto3)
-
Demonstrate expression attribute names using AWS SDK for Python (Boto3).
import boto3 from botocore.exceptions import ClientError from typing import Any, Dict, List def use_reserved_word_attribute( table_name: str, key: Dict[str, Any], reserved_word: str, value: Any ) -> Dict[str, Any]: """ Update an attribute whose name is a DynamoDB reserved word. This function demonstrates how to use expression attribute names to work with attributes that have names that are DynamoDB reserved words. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. reserved_word (str): The reserved word to use as an attribute name. value (Any): The value to set for the attribute. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Use expression attribute names to handle the reserved word response = table.update_item( Key=key, UpdateExpression="SET #reserved_attr = :value", ExpressionAttributeNames={"#reserved_attr": reserved_word}, ExpressionAttributeValues={":value": value}, ReturnValues="UPDATED_NEW", ) return response def use_special_character_attribute( table_name: str, key: Dict[str, Any], attribute_with_special_chars: str, value: Any ) -> Dict[str, Any]: """ Update an attribute whose name contains special characters. This function demonstrates how to use expression attribute names to work with attributes that have names containing special characters like spaces, dots, or hyphens. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. attribute_with_special_chars (str): The attribute name with special characters. value (Any): The value to set for the attribute. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Use expression attribute names to handle special characters response = table.update_item( Key=key, UpdateExpression="SET #special_attr = :value", ExpressionAttributeNames={"#special_attr": attribute_with_special_chars}, ExpressionAttributeValues={":value": value}, ReturnValues="UPDATED_NEW", ) return response def query_with_attribute_names( table_name: str, partition_key_name: str, partition_key_value: str, filter_attribute_name: str, filter_value: Any, ) -> Dict[str, Any]: """ Query a table using expression attribute names for both key and filter attributes. This function demonstrates how to use expression attribute names in a query operation for both the key condition expression and filter expression. Args: table_name (str): The name of the DynamoDB table. partition_key_name (str): The name of the partition key attribute. partition_key_value (str): The value of the partition key to query. filter_attribute_name (str): The name of the attribute to filter on. filter_value (Any): The value to compare against in the filter. Returns: Dict[str, Any]: The response from DynamoDB containing the query results. 
""" # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Use expression attribute names for both key condition and filter response = table.query( KeyConditionExpression="#pk = :pk_val", FilterExpression="#filter_attr = :filter_val", ExpressionAttributeNames={"#pk": partition_key_name, "#filter_attr": filter_attribute_name}, ExpressionAttributeValues={":pk_val": partition_key_value, ":filter_val": filter_value}, ) return response def update_nested_attribute_with_dots( table_name: str, key: Dict[str, Any], path_with_dots: str, value: Any ) -> Dict[str, Any]: """ Update a nested attribute using a path with dot notation. This function demonstrates how to use expression attribute names to work with nested attributes specified using dot notation. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. path_with_dots (str): The path to the nested attribute using dot notation (e.g., "a.b.c"). value (Any): The value to set for the nested attribute. Returns: Dict[str, Any]: The response from DynamoDB containing the updated attribute values. """ # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Split the path into components path_parts = path_with_dots.split(".") # Build the update expression and attribute names update_expression = "SET " expression_attribute_names = {} # Build the path expression path_expression = "" for i, part in enumerate(path_parts): name_placeholder = f"#attr{i}" expression_attribute_names[name_placeholder] = part if i == 0: path_expression = name_placeholder else: path_expression += f".{name_placeholder}" # Complete the update expression update_expression += f"{path_expression} = :value" # Execute the update response = table.update_item( Key=key, UpdateExpression=update_expression, ExpressionAttributeNames=expression_attribute_names, ExpressionAttributeValues={":value": value}, ReturnValues="UPDATED_NEW", ) return response def demonstrate_attribute_name_requirements(table_name: str, key: Dict[str, Any]) -> Dict[str, Any]: """ Demonstrate the requirements and allowed characters for attribute names. This function shows examples of valid and invalid attribute names and how to handle them using expression attribute names. Args: table_name (str): The name of the DynamoDB table. key (Dict[str, Any]): The primary key of the item to update. Returns: Dict[str, Any]: A dictionary containing the results of the demonstration. 
""" # Initialize the DynamoDB resource dynamodb = boto3.resource("dynamodb") table = dynamodb.Table(table_name) # Examples of attribute names with different characteristics examples = { "valid_standard": "NormalAttribute", # Standard attribute name (no placeholder needed) "valid_with_underscore": "Normal_Attribute", # Underscore is allowed "valid_with_number": "Attribute123", # Numbers are allowed "reserved_word": "Timestamp", # Reserved word (requires placeholder) "starts_with_number": "123Attribute", # Starts with number (valid but may need placeholder in some contexts) "with_space": "Attribute Name", # Contains space (requires placeholder) "with_dot": "Attribute.Name", # Contains dot (requires placeholder) "with_hyphen": "Attribute-Name", # Contains hyphen (requires placeholder) "with_special_chars": "Attribute#$%", # Contains special characters (requires placeholder) } results = {} # Try to update each attribute type for example_type, attr_name in examples.items(): try: # For attributes that don't need placeholders, try direct reference if example_type in ["valid_standard", "valid_with_underscore", "valid_with_number"]: try: # Try without expression attribute names first response = table.update_item( Key=key, UpdateExpression=f"SET {attr_name} = :value", ExpressionAttributeValues={":value": f"Value for {attr_name}"}, ReturnValues="UPDATED_NEW", ) results[example_type] = { "attribute_name": attr_name, "success": True, "needed_placeholder": False, "response": response, } except ClientError: # If direct reference fails, try with placeholder response = table.update_item( Key=key, UpdateExpression="SET #attr = :value", ExpressionAttributeNames={"#attr": attr_name}, ExpressionAttributeValues={":value": f"Value for {attr_name}"}, ReturnValues="UPDATED_NEW", ) results[example_type] = { "attribute_name": attr_name, "success": True, "needed_placeholder": True, "response": response, } else: # For attributes that definitely need placeholders response = table.update_item( Key=key, UpdateExpression="SET #attr = :value", ExpressionAttributeNames={"#attr": attr_name}, ExpressionAttributeValues={":value": f"Value for {attr_name}"}, ReturnValues="UPDATED_NEW", ) results[example_type] = { "attribute_name": attr_name, "success": True, "needed_placeholder": True, "response": response, } except ClientError as e: results[example_type] = {"attribute_name": attr_name, "success": False, "error": str(e)} return results
Example usage of expression attribute names with AWS SDK for Python (Boto3).
def example_usage(): """Example of how to use expression attribute names in DynamoDB.""" # Example parameters table_name = "Products" key = {"ProductId": "prod123"} print("Example 1: Using a reserved word as an attribute name") try: response = use_reserved_word_attribute( table_name=table_name, key=key, reserved_word="Timestamp", value="2025-05-14T12:00:00Z" ) print(f"Reserved word attribute updated successfully: {response.get('Attributes', {})}") except Exception as e: print(f"Error updating reserved word attribute: {e}") print("\nExample 2: Using an attribute name with special characters") try: response = use_special_character_attribute( table_name=table_name, key=key, attribute_with_special_chars="Product Info", value="Special product information", ) print(f"Special character attribute updated successfully: {response.get('Attributes', {})}") except Exception as e: print(f"Error updating special character attribute: {e}") print("\nExample 3: Querying with expression attribute names") try: response = query_with_attribute_names( table_name=table_name, partition_key_name="Category", partition_key_value="Electronics", filter_attribute_name="Price", filter_value=500, ) print( f"Query with expression attribute names returned {len(response.get('Items', []))} items" ) except Exception as e: print(f"Error querying with expression attribute names: {e}") print("\nExample 4: Updating a nested attribute with dot notation") try: response = update_nested_attribute_with_dots( table_name=table_name, key=key, path_with_dots="Product.Details.Specifications", value={"Weight": "2.5 kg", "Dimensions": "30x20x10 cm"}, ) print(f"Nested attribute updated successfully: {response.get('Attributes', {})}") except Exception as e: print(f"Error updating nested attribute: {e}") print("\nExample 5: Demonstrating attribute name requirements") try: results = demonstrate_attribute_name_requirements(table_name=table_name, key=key) print("Attribute Name Requirements Results:") for example_type, result in results.items(): if result.get("success", False): needed_placeholder = result.get("needed_placeholder", True) print( f" - {example_type}: '{result['attribute_name']}' - {'Requires' if needed_placeholder else 'Does not require'} placeholder" ) else: print( f" - {example_type}: '{result['attribute_name']}' - Failed: {result.get('error', 'Unknown error')}" ) except Exception as e: print(f"Error demonstrating attribute name requirements: {e}") print("\nCommon DynamoDB Reserved Words (sample):") reserved_words = get_common_reserved_words() print(", ".join(reserved_words[:20]) + "... (and many more)") print("\nWhen to Use Expression Attribute Names:") print("1. When the attribute name is a DynamoDB reserved word") print("2. When the attribute name contains special characters (spaces, dots, hyphens)") print("3. When the attribute name begins with a number") print("4. When working with nested attributes using dot notation") print("5. When you need to reference the same attribute multiple times in an expression") print("\nExpression Attribute Name Requirements:") print("1. Must begin with a pound sign (#)") print("2. After the pound sign, must contain at least one character") print("3. Can contain alphanumeric characters and underscore (_)") print("4. Are case-sensitive") print("5. Must be unique within a single expression") print("\nAttribute Name Requirements in DynamoDB:") print("1. Can begin with a-z, A-Z, or 0-9") print("2. Can contain a-z, A-Z, 0-9, underscore (_), dash (-), and dot (.)") print("3. 
Are case-sensitive") print("4. No length restrictions, but practical limits apply") print("5. Cannot be a DynamoDB reserved word if used directly in expressions")
-
For API details, see the following topics in AWS SDK for Python (Boto3) API Reference.
-
The following code example shows how to create an AWS Lambda function invoked by an Amazon EventBridge scheduled event.
- SDK for Python (Boto3)
-
This example shows how to register an AWS Lambda function as the target of a scheduled Amazon EventBridge event. The Lambda handler writes a friendly message and the full event data to Amazon CloudWatch Logs for later retrieval.
Deploys a Lambda function.
Creates an EventBridge scheduled event and makes the Lambda function the target.
Grants permission to let EventBridge invoke the Lambda function.
Prints the latest data from CloudWatch Logs to show the result of the scheduled invocations.
Cleans up all resources created during the demo.
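The complete code lives on GitHub; the following is only a minimal, hypothetical sketch of the kind of handler the scenario describes, writing a friendly message and the full event to CloudWatch Logs.
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def lambda_handler(event, context):
    """Hypothetical handler sketch for a scheduled EventBridge invocation."""
    # Anything written with the logger ends up in CloudWatch Logs for later retrieval.
    logger.info("Hello from a scheduled EventBridge invocation!")
    logger.info("Full event: %s", json.dumps(event))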
This example is best viewed on GitHub. For complete source code and instructions on how to set up and run, see the full example on GitHub.
Services used in this example
CloudWatch Logs
DynamoDB
EventBridge
Lambda
Amazon SNS
Serverless examples
The following code example shows how to implement a Lambda function that receives an event triggered by receiving records from a DynamoDB stream. The function retrieves the DynamoDB payload and logs the record contents.
- SDK for Python (Boto3)
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the Serverless examples repository.
Consuming a DynamoDB event with Lambda using Python.
import json def lambda_handler(event, context): print(json.dumps(event, indent=2)) for record in event['Records']: log_dynamodb_record(record) def log_dynamodb_record(record): print(record['eventID']) print(record['eventName']) print(f"DynamoDB Record: {json.dumps(record['dynamodb'])}")
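To try the handler above locally, you can feed it a hand-built event shaped like a DynamoDB stream batch; the record values below are hypothetical.
# Hypothetical test event shaped like a DynamoDB stream batch with one INSERT record.
sample_event = {
    "Records": [
        {
            "eventID": "1",
            "eventName": "INSERT",
            "eventSource": "aws:dynamodb",
            "dynamodb": {
                "Keys": {"Id": {"N": "101"}},
                "NewImage": {"Id": {"N": "101"}, "Message": {"S": "New item!"}},
                "SequenceNumber": "111",
                "StreamViewType": "NEW_AND_OLD_IMAGES",
            },
        }
    ]
}

# Invoke the handler directly; the context argument is unused in this sketch.
lambda_handler(sample_event, None)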
The following code example shows how to implement partial batch response for Lambda functions that receive events from a DynamoDB stream. The function reports the batch item failures in the response, signaling to Lambda to retry those messages later.
- SDK for Python (Boto3)
-
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the Serverless examples repository.
Reporting DynamoDB batch item failures with Lambda using Python.
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. # SPDX-License-Identifier: Apache-2.0 def handler(event, context): records = event.get("Records") curRecordSequenceNumber = "" for record in records: try: # Process your record curRecordSequenceNumber = record["dynamodb"]["SequenceNumber"] except Exception as e: # Return failed record's sequence number return {"batchItemFailures":[{"itemIdentifier": curRecordSequenceNumber}]} return {"batchItemFailures":[]}
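Partial batch response only takes effect when the stream event source mapping is configured to report batch item failures. The following is a minimal sketch of that configuration with Boto3; the stream ARN and function name are placeholders.
import boto3

lambda_client = boto3.client("lambda")

# Hypothetical event source mapping that lets the handler above report
# individual failed records instead of failing the whole batch.
lambda_client.create_event_source_mapping(
    EventSourceArn="arn:aws:dynamodb:us-east-1:111122223333:table/ExampleTable/stream/2025-01-01T00:00:00.000",
    FunctionName="example-stream-handler",
    StartingPosition="LATEST",
    BatchSize=100,
    FunctionResponseTypes=["ReportBatchItemFailures"],
)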