Update Vercel queue delay cap for longer sleeps #1604
Base: main
Commits: 563c84b, 6b2568f, 407954e, da34354
Changeset (new file):

```diff
@@ -0,0 +1,5 @@
+---
+'@workflow/world-vercel': patch
+---
+
+Update the Vercel queue delay cap for longer sleeps.
```
Queue tests (vitest):

```diff
@@ -1,5 +1,7 @@
 import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest';
 
+const MAX_DELAY_SECONDS = 7 * 24 * 60 * 60 - 60 * 60;
+
 const {
   mockSend,
   MockDuplicateMessageError,
```

> **Member:** re-use constant from actual file?
>
> **Author:** Done in
```diff
@@ -404,7 +406,7 @@ describe('createQueue', () => {
     }
   });
 
-  it('should clamp delaySeconds to max 23 hours for long sleeps', async () => {
+  it('should clamp delaySeconds to 1 hour less than 7 days for long sleeps', async () => {
     mockSend.mockResolvedValue({ messageId: 'new-msg-123' });
 
     let capturedHandler: (
```
```diff
@@ -422,7 +424,7 @@ describe('createQueue', () => {
     try {
       const queue = createQueue();
       queue.createQueueHandler('__wkf_workflow_', async () => ({
-        timeoutSeconds: 100000,
+        timeoutSeconds: 700000,
       }));
 
       await capturedHandler!(
```
```diff
@@ -443,7 +445,7 @@ describe('createQueue', () => {
       expect(mockSend).toHaveBeenCalledTimes(1);
       // send(topicName, payload, options)
       const sendOpts = mockSend.mock.calls[0][2];
-      expect(sendOpts.delaySeconds).toBe(82800); // MAX_DELAY_SECONDS
+      expect(sendOpts.delaySeconds).toBe(MAX_DELAY_SECONDS);
     } finally {
       if (originalEnv !== undefined) {
         process.env.VERCEL_DEPLOYMENT_ID = originalEnv;
```
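The updated test values can be checked with quick arithmetic. A standalone sketch, where `MAX_DELAY_SECONDS` mirrors the test's constant and `clampDelay` is a hypothetical stand-in for the clamping the queue performs (not part of the package's API):

```typescript
// MAX_DELAY_SECONDS mirrors the constant added to the test file.
const MAX_DELAY_SECONDS = 7 * 24 * 60 * 60 - 60 * 60; // 601200 (6d 23h)

// Hypothetical helper reproducing the clamp the test asserts on.
function clampDelay(timeoutSeconds: number): number {
  return Math.min(timeoutSeconds, MAX_DELAY_SECONDS);
}

// 700000s (~8.1 days) exceeds the new cap, so it clamps. The old value of
// 100000s (~1.2 days) would fall under the 7-day window and no longer
// exercise the clamping path, which is why the test bumps it.
console.log(clampDelay(700000)); // 601200
console.log(clampDelay(100000)); // 100000
```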
Queue implementation (`@workflow/world-vercel`):

```diff
@@ -1,5 +1,5 @@
 import { AsyncLocalStorage } from 'node:async_hooks';
-import { QueueClient, DuplicateMessageError } from '@vercel/queue';
+import { DuplicateMessageError, QueueClient } from '@vercel/queue';
 import {
   MessageId,
   type Queue,
```
```diff
@@ -32,12 +32,12 @@ const MessageWrapper = z.object({
  * rather than using visibility timeouts on the same message.
  *
  * Benefits of this approach:
- * - Fresh 24-hour lifetime with each message (no message age tracking needed)
+ * - Fresh delay window with each message (no message age tracking needed)
  * - Messages fire at the scheduled time (no short-circuit + recheck pattern)
  * - Simpler conceptual model: messages are triggers with delivery schedules
  *
- * For sleeps > 24 hours (max delay), we use chaining:
- * 1. Schedule message with max delay (~23h, leaving buffer)
+ * For sleeps > 7 days (max delay), we use chaining:
+ * 1. Schedule message with max delay (~6d 23h, leaving 1h buffer)
  * 2. When it fires, workflow checks if sleep is complete
  * 3. If not, another delayed message is queued for remaining time
  * 4. Process repeats until the full sleep duration has elapsed
```
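The chaining described in that comment can be sketched as splitting a long sleep into a series of delays, each at most the cap. `planChain` below is illustrative only; the real implementation re-enqueues lazily when each message fires rather than precomputing the schedule:

```typescript
// Cap matching the new constants: 7-day window minus a 1-hour buffer.
const MAX_DELAY_SECONDS = 7 * 24 * 60 * 60 - 60 * 60; // 601200

// Hypothetical helper: break a total sleep into chained message delays.
function planChain(totalSleepSeconds: number): number[] {
  const delays: number[] = [];
  let remaining = totalSleepSeconds;
  while (remaining > 0) {
    const delay = Math.min(remaining, MAX_DELAY_SECONDS);
    delays.push(delay);
    remaining -= delay;
  }
  return delays;
}

// A 20-day sleep chains three messages: two at the cap, one for the rest.
console.log(planChain(20 * 24 * 60 * 60)); // [601200, 601200, 525600]
```

Each chained message gets a fresh delay window, which is why no message-age tracking is needed.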
```diff
@@ -48,8 +48,11 @@ const MessageWrapper = z.object({
  *
  * These constants can be overridden via environment variables for testing.
  */
+const SECONDS_PER_HOUR = 60 * 60;
+const MAX_QUEUE_DELAY_WINDOW_SECONDS = 7 * 24 * SECONDS_PER_HOUR;
 const MAX_DELAY_SECONDS = Number(
-  process.env.VERCEL_QUEUE_MAX_DELAY_SECONDS || 82800 // 23 hours - leave 1h buffer before 24h retention limit
+  process.env.VERCEL_QUEUE_MAX_DELAY_SECONDS ||
+    MAX_QUEUE_DELAY_WINDOW_SECONDS - SECONDS_PER_HOUR
 );
 
 /**
```

> **Member** (on the 1-hour buffer): I think we should actually make this one day. Practically there'll be no performance or cost difference, but if there's infra downtime and it happens to hit the "right" minute, this gives the queue a bigger margin to re-drive before it gives up. I'd have to think through this more deeply, but it seems like a generally safer approach.
>
> **Author:** Makes sense — applied in
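The new default works out as follows. This sketch mirrors the diff's constant derivation, with a plain variable standing in for the `process.env.VERCEL_QUEUE_MAX_DELAY_SECONDS` override so the snippet stays deterministic:

```typescript
// Mirrors the constant derivation in the diff; the numbers are exact.
const SECONDS_PER_HOUR = 60 * 60;                                 // 3600
const MAX_QUEUE_DELAY_WINDOW_SECONDS = 7 * 24 * SECONDS_PER_HOUR; // 604800 (7 days)
const DEFAULT_MAX_DELAY = MAX_QUEUE_DELAY_WINDOW_SECONDS - SECONDS_PER_HOUR; // 601200

// Same `Number(override || default)` shape as the diff's env-var hook;
// `override` here is a stand-in for the env var, left unset.
const override: string | undefined = undefined;
const maxDelaySeconds = Number(override || DEFAULT_MAX_DELAY);

// Old cap: 82800 (23h of a 24h window). New cap: 601200 (6d 23h of a
// 7d window). The 1-hour safety buffer is preserved in both.
console.log(maxDelaySeconds); // 601200
```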
```diff
@@ -191,8 +194,9 @@ export function createQueue(config?: APIConfig): Queue {
 
       if (typeof result?.timeoutSeconds === 'number') {
         // When timeoutSeconds is 0, skip delaySeconds entirely for immediate re-enqueue.
-        // Otherwise, clamp to max delay (23h) - for longer sleeps, the workflow will chain
-        // multiple delayed messages until the full sleep duration has elapsed.
+        // Otherwise, clamp to the queue delay window minus a 1h buffer (6d 23h).
+        // For longer sleeps, the workflow will chain multiple delayed messages until
+        // the full sleep duration has elapsed.
         const delaySeconds =
           result.timeoutSeconds > 0
             ? Math.min(result.timeoutSeconds, MAX_DELAY_SECONDS)
```
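The selection logic in that hunk can be sketched in isolation. `pickDelaySeconds` is a hypothetical extraction of the ternary, with `undefined` standing in for "omit the delaySeconds option entirely":

```typescript
// Cap matching the diff: 7-day window minus a 1-hour buffer.
const MAX_DELAY_SECONDS = 7 * 24 * 60 * 60 - 60 * 60; // 601200

// Hypothetical helper: 0 means immediate re-enqueue (no delaySeconds at
// all); any positive timeout is clamped to the cap.
function pickDelaySeconds(timeoutSeconds: number): number | undefined {
  return timeoutSeconds > 0
    ? Math.min(timeoutSeconds, MAX_DELAY_SECONDS)
    : undefined;
}

console.log(pickDelaySeconds(0));       // undefined (immediate re-enqueue)
console.log(pickDelaySeconds(3600));    // 3600 (passes through)
console.log(pickDelaySeconds(1000000)); // 601200 (clamped; chaining takes over)
```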
> **Author:** Applied in 407954ea.