Commit 2c81086

chore: increase stale cloud functions cleanup rate (#863)
* chore: increase stale cloud functions cleanup rate * reword the comment
Parent: 823c0ce

1 file changed: +9 −5 lines


tests/system/conftest.py

Lines changed: 9 additions & 5 deletions
@@ -43,11 +43,15 @@
 
 # Use this to control the number of cloud functions being deleted in a single
 # test session. This should help soften the spike of the number of mutations per
-# minute tracked against a quota limit (default 60, increased to 120 for
-# bigframes-dev project) by the Cloud Functions API.
-# We are running pytest with "-n 20". Let's say each session lasts about a
-# minute, so we are setting a limit of 120/20 = 6 deletions per session.
-MAX_NUM_FUNCTIONS_TO_DELETE_PER_SESSION = 6
+# minute tracked against the quota limit:
+#   Cloud Functions API -> Per project mutation requests per minute per region
+#   (default 60, increased to 1000 for the test projects)
+# We are running pytest with "-n 20". For a rough estimate, let's say all
+# sessions run in parallel, which allows 1000/20 = 50 mutations per minute.
+# One session takes about a minute to create a remote function, leaving
+# 50-1 = 49 deletions per session. As a heuristic, let's use half of that
+# potential for the cleanup.
+MAX_NUM_FUNCTIONS_TO_DELETE_PER_SESSION = 25
 
 CURRENT_DIR = pathlib.Path(__file__).parent
 DATA_DIR = CURRENT_DIR.parent / "data"
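The budget arithmetic behind the new comment can be sketched as a quick calculation. The constant names below are illustrative, not from conftest.py; the quota and worker count are the values stated in the comment, and rounding the half-budget up is an assumption that reproduces the committed value of 25:

```python
import math

# Values taken from the commit's comment; names here are hypothetical.
QUOTA_MUTATIONS_PER_MINUTE = 1000  # Cloud Functions API per-project, per-region quota
PYTEST_WORKERS = 20                # tests run with pytest "-n 20"

# Assume all workers run in parallel, so each gets an equal quota share.
mutations_per_session = QUOTA_MUTATIONS_PER_MINUTE // PYTEST_WORKERS  # 50

# Each session spends one mutation creating its own remote function.
deletions_budget = mutations_per_session - 1  # 49

# Heuristic: use half of the remaining budget for cleanup (rounded up).
max_deletions_per_session = math.ceil(deletions_budget / 2)

print(max_deletions_per_session)  # 25
```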

0 commit comments
