✅ Deploy Preview for gitfastnow ready!
Co-authored-by: amkayondo <[email protected]>
Pull request overview
This PR adds explicit timeouts to GitHub API calls and adjusts Next.js route handler time limits to reduce hanging requests and premature termination in serverless environments.
Changes:
- Add a per-request timeout to `ghFetch()` (via `AbortSignal.timeout`) for GitHub API calls.
- Configure Octokit with a 30s request timeout in the standalone scraper.
- Export `maxDuration` from API route handlers to align with expected execution time.
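For context, the `ghFetch()` change described above has roughly this shape. This is a minimal sketch assuming the names from the PR summary; the `headers()` body here is hypothetical:

```typescript
// Sketch of the ghFetch() timeout change, using the names from the PR summary.
const GITHUB_API_TIMEOUT_MS = 30_000;

function headers(): Record<string, string> {
  // Hypothetical: the real helper likely also sets an Authorization token.
  return { Accept: "application/vnd.github+json" };
}

async function ghFetch(url: string): Promise<Response> {
  // AbortSignal.timeout() aborts the request after the given delay,
  // rejecting the fetch with a DOMException named "TimeoutError".
  return fetch(url, {
    headers: headers(),
    signal: AbortSignal.timeout(GITHUB_API_TIMEOUT_MS),
  });
}
```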
Reviewed changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| `lib/github.ts` | Adds a 30s per-request timeout to `fetch()` calls used for GitHub API access. |
| `src/scraper.js` | Adds a 30s request timeout to the Octokit client used by the CLI scraper. |
| `app/api/scrape/route.ts` | Sets `maxDuration = 60` to allow longer-running scrape requests. |
| `app/api/export/json/route.ts` | Sets `maxDuration = 30` for the JSON export route. |
| `app/api/export/csv/route.ts` | Sets `maxDuration = 30` for the CSV export route. |
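The `maxDuration` changes are one-line route segment config exports. For `app/api/scrape/route.ts` the relevant part would look roughly like this sketch; the handler body is a hypothetical placeholder, not the PR's actual scrape logic:

```typescript
// Next.js App Router segment config: the platform reads this export to cap
// this route's serverless execution time at 60 seconds.
export const maxDuration = 60;

export async function GET(): Promise<Response> {
  // Hypothetical placeholder for the scrape logic, which may run close to
  // maxDuration because of sequential GitHub API calls plus courtesy delays.
  return Response.json({ ok: true });
}
```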
```ts
const res = await fetch(url, {
  headers: headers(),
  signal: AbortSignal.timeout(GITHUB_API_TIMEOUT_MS),
});
```
`fetch()` will throw on timeout (a `TimeoutError` from `AbortSignal.timeout`) or other network errors, which bypasses the current retry loop entirely. Since this PR introduces timeouts, consider wrapping the `fetch()` call in a try/catch and retrying (or at least converting the timeout error into a clearer one) so a transient timeout doesn't immediately fail the whole scrape.
```diff
-const res = await fetch(url, {
-  headers: headers(),
-  signal: AbortSignal.timeout(GITHUB_API_TIMEOUT_MS),
-});
+let res: Response;
+try {
+  res = await fetch(url, {
+    headers: headers(),
+    signal: AbortSignal.timeout(GITHUB_API_TIMEOUT_MS),
+  });
+} catch (err) {
+  // AbortSignal.timeout() rejects with "TimeoutError"; a manually aborted
+  // controller rejects with "AbortError". Treat both as timeout-like.
+  const isTimeoutError =
+    err instanceof Error &&
+    (err.name === "TimeoutError" || err.name === "AbortError");
+  if (attempt < maxRetries) {
+    // Simple backoff for transient network/timeout errors.
+    const backoffMs = 1000 * (attempt + 1);
+    await sleep(backoffMs);
+    continue;
+  }
+  const baseMessage = isTimeoutError
+    ? `GitHub API request timed out after ${GITHUB_API_TIMEOUT_MS}ms`
+    : "GitHub API request failed";
+  const detail =
+    err instanceof Error && err.message ? `: ${err.message}` : "";
+  throw new Error(`${baseMessage}${detail} (${url})`);
+}
```
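One wrinkle worth noting: in current runtimes `AbortSignal.timeout()` rejects with a DOMException named `TimeoutError`, while a manually aborted `AbortController` yields `AbortError`, so a defensive check can cover both. A small sketch (the helper name is made up for illustration):

```typescript
// Hypothetical helper: classify fetch() rejections caused by aborts/timeouts.
// AbortSignal.timeout() produces "TimeoutError"; AbortController.abort()
// produces "AbortError". Checking both keeps a retry path robust across
// runtimes and abort sources.
function isTimeoutLikeError(err: unknown): boolean {
  return (
    err instanceof Error &&
    (err.name === "TimeoutError" || err.name === "AbortError")
  );
}
```

A retry loop can then branch on `isTimeoutLikeError(err)` to decide between backing off and rethrowing with a clearer message.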
GitHub API calls had no per-request timeout, allowing requests to hang indefinitely. API routes also lacked `maxDuration` exports, causing premature termination on serverless platforms.

Changes:

- `lib/github.ts` — add `AbortSignal.timeout` (30s) to `ghFetch()` via a `GITHUB_API_TIMEOUT_MS` constant
- `src/scraper.js` — add `request.timeout: 30_000` to the Octokit client config
- `maxDuration` on all route handlers:
  - `/api/scrape` → 60s (multiple sequential API calls + courtesy delays)
  - `/api/export/json`, `/api/export/csv` → 30s
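Based on the summary above, the scraper-side change would look roughly like the following. This is a sketch, not the PR's actual code; the `auth` source is a hypothetical assumption, while `request: { timeout: 30_000 }` is the option the PR description names:

```typescript
// Sketch of the src/scraper.js change: pass a request-level timeout when
// constructing the Octokit client, per the PR summary.
import { Octokit } from "octokit";

const octokit = new Octokit({
  auth: process.env.GITHUB_TOKEN, // hypothetical: the token source may differ
  request: { timeout: 30_000 },   // abort any single API call after 30s
});
```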