Limit number of diffs per worker process in server #210
This adds a new `MAX_DIFFS_PER_WORKER` environment variable that limits the number of diffs performed by a single worker. After a worker process performs this many diffs, it is shut down and replaced with a fresh process.

This is an ugly first cut and needs a lot of cleaning up. It's probably time to address the longstanding to-do here:
web-monitoring-diff/web_monitoring_diff/server/server.py
Lines 503 to 507 in b3cf524
This also doesn't really account for things on a per-worker basis; it just restarts the pool after `DIFFER_PARALLELISM * MAX_DIFFS_PER_WORKER` diffs. Newer versions of Python have an API that does this right, and we should eventually switch to it when compatible (although it seems like there may be some bugs in it: #202 (comment)).

Fixes #202.