ref(span-buffer): Reduce zadd/zrem calls to Redis #88463
Conversation
Builds up temporary data structures in Python to relieve load on Redis. The result is more CPU usage in the consumers, but that trade-off is likely worth it.
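A minimal sketch of the batching idea described above, with hypothetical names (not the PR's actual implementation): members to add or remove are accumulated per key in plain Python containers, then flushed with one `zadd`/`zrem` call per key instead of one call per span.

```python
from collections import defaultdict


class SpanBufferBatcher:
    """Hypothetical sketch: buffer zadd/zrem members per key in Python,
    then flush each key with a single zadd/zrem call."""

    def __init__(self):
        self._adds = defaultdict(dict)    # key -> {member: score}
        self._removes = defaultdict(set)  # key -> {member, ...}

    def add(self, key, member, score):
        self._adds[key][member] = score
        self._removes[key].discard(member)  # an add supersedes a pending remove

    def remove(self, key, member):
        self._removes[key].add(member)
        self._adds[key].pop(member, None)  # a remove supersedes a pending add

    def flush(self, pipeline):
        # One zadd/zrem per key rather than one per member.
        for key, members in self._adds.items():
            if members:
                pipeline.zadd(key, members)
        for key, members in self._removes.items():
            if members:
                pipeline.zrem(key, *members)
        self._adds.clear()
        self._removes.clear()
```

The cost is the extra CPU and memory spent building the dictionaries in the consumer; the win is far fewer round-trip commands queued against Redis.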
Looks great, thank you! Good to merge once this has passed benchmarks.
p.expire(key, self.redis_ttl)
has_root_span_count = 0
min_hole_size = float("inf")
max_hole_size = float("-inf")
Note I'm renaming these in #88453, so we will need to resolve a conflict there.
assert len(queue_keys) == len(results)

for queue_key, (hole_size, delete_item, add_item, has_root_span) in zip(
nit: Can we type the Redis script result and then pass the per-queue part through a sub-function for readability?
Hole sizes could be iterated over and logged entirely separately from the rest of the result.
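The suggestion above might look something like this; all names are hypothetical and not taken from the PR:

```python
from typing import NamedTuple, Optional


class QueueScriptResult(NamedTuple):
    """Hypothetical typed view of the per-queue Redis script result."""
    hole_size: int
    delete_item: Optional[bytes]
    add_item: Optional[bytes]
    has_root_span: bool


def process_queue(queue_key: bytes, result: QueueScriptResult) -> None:
    # Sub-function keeps the zip() loop body small and readable.
    if result.has_root_span:
        ...


# The loop would then unpack into the named type:
# for queue_key, raw in zip(queue_keys, results):
#     process_queue(queue_key, QueueScriptResult(*raw))
```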
discussed offline, let's keep as-is due to unrelated metrics being recorded in this loop