Conversation
Pull request overview
Adds a new TypeScript SDK package for interacting with the Drivebase API (GraphQL + REST proxy), alongside repo-wide version bumps and Biome tooling updates.
Changes:
- Introduce `@drivebase/sdk` with client/resources, HTTP layer, operations, utilities, and Bun tests.
- Bump Drivebase workspace package versions to `3.3.0`.
- Update Biome version/schema references and lockfile.
Reviewed changes
Copilot reviewed 39 out of 40 changed files in this pull request and generated 8 comments.
| File | Description |
|---|---|
| packages/webdav/package.json | Bump package version to 3.3.0 |
| packages/utils/package.json | Bump package version to 3.3.0 |
| packages/telegram/package.json | Bump package version to 3.3.0 |
| packages/sdk/tsup.config.ts | Add SDK build config (tsup) |
| packages/sdk/tsconfig.json | Add SDK TypeScript config |
| packages/sdk/tests/utils.test.ts | Add tests for retry/chunk utilities |
| packages/sdk/tests/resources.test.ts | Add tests for files/folders resources |
| packages/sdk/tests/http.test.ts | Add tests for HttpClient behavior |
| packages/sdk/tests/errors.test.ts | Add tests for SDK error classes |
| packages/sdk/tests/client.test.ts | Add tests for DrivebaseClient construction |
| packages/sdk/src/utils/retry.ts | Add retry utility |
| packages/sdk/src/utils/chunk.ts | Add chunk slicing utility/constants |
| packages/sdk/src/types.ts | Add SDK public types |
| packages/sdk/src/resources/upload.ts | Add upload manager (simple + chunked) |
| packages/sdk/src/resources/folders.ts | Add folders resource methods |
| packages/sdk/src/resources/files.ts | Add files resource methods (incl. download/upload) |
| packages/sdk/src/operations/folder-queries.ts | Add folder GraphQL query strings |
| packages/sdk/src/operations/folder-mutations.ts | Add folder GraphQL mutation strings |
| packages/sdk/src/operations/file-queries.ts | Add file GraphQL query strings |
| packages/sdk/src/operations/file-mutations.ts | Add file GraphQL mutation strings |
| packages/sdk/src/index.ts | Export SDK public surface |
| packages/sdk/src/http.ts | Add HTTP client wrapper (GraphQL + REST) |
| packages/sdk/src/errors.ts | Add SDK error hierarchy |
| packages/sdk/src/client.ts | Add DrivebaseClient entry point |
| packages/sdk/package.json | Add SDK package metadata/exports |
| packages/s3/package.json | Bump package version to 3.3.0 |
| packages/nextcloud/package.json | Bump package version to 3.3.0 |
| packages/local/package.json | Bump package version to 3.3.0 |
| packages/google-drive/package.json | Bump package version to 3.3.0 |
| packages/ftp/package.json | Bump package version to 3.3.0 |
| packages/dropbox/package.json | Bump package version to 3.3.0 |
| packages/db/package.json | Bump package version to 3.3.0 |
| packages/darkibox/package.json | Bump package version to 3.3.0 |
| packages/core/package.json | Bump package version to 3.3.0 |
| package.json | Pin Biome devDependency version |
| bun.lock | Lockfile updates for version bumps + new SDK deps |
| biome.json | Update Biome schema URL |
| apps/web/src/features/files/components/FilesToolbar.tsx | Adjust breadcrumb React key |
| apps/web/biome.json | Update Biome schema URL |
| apps/docs/biome.json | Update Biome schema URL |
```ts
if (useDirectDownload && downloadUrl) {
  return {
    url: downloadUrl,
    stream: async () => {
      const response = await fetch(downloadUrl);
      if (!response.ok || !response.body) {
        throw new Error(`Download failed: ${response.status}`);
      }
      return response.body;
    },
```
Direct download uses the global `fetch(downloadUrl)` instead of the injected client fetch, which makes custom fetch implementations ineffective and can break in non-browser runtimes; use the configured fetch (e.g., via `HttpClient`) for direct downloads too.
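A minimal sketch of what this comment asks for, with the configured fetch threaded through as a parameter. `FetchLike` and `makeDirectDownload` are illustrative names, not the SDK's actual API:

```typescript
// Hypothetical sketch: accept the client's configured fetch instead of
// reaching for the global one, so custom transports keep working.
type FetchLike = (input: string) => Promise<Response>;

function makeDirectDownload(downloadUrl: string, fetchImpl: FetchLike) {
  return {
    url: downloadUrl,
    stream: async () => {
      // Uses the injected fetch, never the global.
      const response = await fetchImpl(downloadUrl);
      if (!response.ok || !response.body) {
        throw new Error(`Download failed: ${response.status}`);
      }
      return response.body;
    },
  };
}
```

Tests and Node/SSR consumers can then supply their own `FetchLike` and observe every request the SDK makes.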
```ts
export function sliceIntoChunks(
  data: Blob | ArrayBufferView | ArrayBuffer,
  chunkSize: number,
): Blob[] {
  let blob: Blob;
  if (data instanceof Blob) {
    blob = data;
  } else if (data instanceof ArrayBuffer) {
    blob = new Blob([data]);
  } else {
    // ArrayBufferView (e.g. Uint8Array, Buffer)
    blob = new Blob([
      new Uint8Array(
        data.buffer as ArrayBuffer,
        data.byteOffset,
        data.byteLength,
      ),
    ]);
  }

  const chunks: Blob[] = [];
  let offset = 0;

  while (offset < blob.size) {
    chunks.push(blob.slice(offset, offset + chunkSize));
    offset += chunkSize;
  }
```
`sliceIntoChunks` will enter an infinite loop if `chunkSize` is `<= 0` (since `offset` never increases); validate `chunkSize` and throw a clear error when it's not a positive integer.
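A sketch of the guard this comment suggests, reduced to the Blob-only case for brevity (`sliceBlobIntoChunks` is an illustrative name):

```typescript
// Reject non-positive or non-integer chunk sizes up front, so the
// slicing loop below is guaranteed to terminate.
function sliceBlobIntoChunks(blob: Blob, chunkSize: number): Blob[] {
  if (!Number.isInteger(chunkSize) || chunkSize <= 0) {
    throw new RangeError(
      `chunkSize must be a positive integer, got ${chunkSize}`,
    );
  }
  const chunks: Blob[] = [];
  let offset = 0;
  while (offset < blob.size) {
    chunks.push(blob.slice(offset, offset + chunkSize));
    offset += chunkSize;
  }
  return chunks;
}
```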
@copilot open a new pull request to apply changes based on this feedback
```diff
  const canDrop = !isLast && crumb.folderId !== null;
  return (
-   <div key={`${crumb.name}-${index}`} className="flex items-center">
+   <div key={`${crumb.folderId}`} className="flex items-center">
```
``key={`${crumb.folderId}`}`` will be non-unique for intermediate crumbs where `folderId` is null (see `useBreadcrumbs`), which can cause React key collisions; use a stable unique key per crumb (e.g., include the index or the path segment) when `folderId` is null.
Suggested change:
```diff
- <div key={`${crumb.folderId}`} className="flex items-center">
+ <div key={crumb.folderId ?? `crumb-${index}`} className="flex items-center">
```
```ts
}
formData.append("file", blob, options.name);

const response = await this.http.rest("POST", uploadUrl, {
```
When `uploadFields` is present this uses `this.http.rest(...)`, which always attaches Drivebase auth/workspace headers; on S3-style presigned POSTs those extra headers can trigger CORS preflights or signature/CORS failures. Send this request with the SDK's underlying fetch (no extra headers) instead of the Drivebase REST helper.
Suggested change:
```diff
- const response = await this.http.rest("POST", uploadUrl, {
+ const response = await fetch(uploadUrl, {
+   method: "POST",
```
```ts
const response = await fetch(uploadUrl, {
  method: "PUT",
  headers: { "Content-Type": options.mimeType },
  body: blob,
```
Direct uploads here call the global `fetch(...)` instead of the user-provided fetch configured on `DrivebaseClient`, so SDK consumers (Node, SSR, custom transports/tests) can't intercept/override these requests; route direct uploads through the injected fetch implementation.
Suggested change:
```diff
- const response = await fetch(uploadUrl, {
-   method: "PUT",
-   headers: { "Content-Type": options.mimeType },
-   body: blob,
+ const response = await this.http.rest("PUT", uploadUrl, {
+   body: blob,
+   headers: { "Content-Type": options.mimeType },
```
```ts
// Session completed — the file should now exist.
// Unfortunately we don't have the fileId from the session query,
// so we return a minimal file object. Caller can re-fetch if needed.
break;
```
`waitForFile()` breaks out of the polling loop on completion but then always falls through to returning the sentinel/placeholder `File` object, so `upload()` never returns the real server file record even on success; return the actual file (e.g., refetch by `fileId`) instead of breaking-and-falling-through.
Suggested change:
```diff
- break;
+ const now = new Date().toISOString();
+ return {
+   id: sessionId,
+   name: "",
+   virtualPath: "",
+   mimeType: "",
+   size: totalSize,
+   hash: null,
+   remoteId: "",
+   providerId: "",
+   folderId: null,
+   uploadedBy: "",
+   isDeleted: false,
+   starred: false,
+   createdAt: now,
+   updatedAt: now,
+   lifecycle: {
+     state: "HOT",
+     storageClass: null,
+     restoreRequestedAt: null,
+     restoreExpiresAt: null,
+     lastCheckedAt: null,
+   },
+ };
```
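The reviewer's preferred fix ("refetch instead of fabricating") can be sketched generically. `lookupFile` is a hypothetical callback standing in for whatever query the SDK exposes (by session, path, or id); the real query surface may differ:

```typescript
// Hedged sketch: once the session reports completion, resolve the real
// server record rather than returning a placeholder with empty fields.
interface UploadedFile {
  id: string;
  name: string;
  size: number;
}

async function resolveUploadedFile(
  sessionId: string,
  lookupFile: (sessionId: string) => Promise<UploadedFile | null>,
): Promise<UploadedFile> {
  const file = await lookupFile(sessionId);
  if (!file) {
    // Completion was reported but the record can't be found: fail loudly
    // instead of handing the caller a fake File.
    throw new Error(`Upload session ${sessionId} completed but file not found`);
  }
  return file;
}
```

This keeps `upload()`'s return type honest: callers either get a real file or an error, never a sentinel that looks like success.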
```ts
const maxAttempts = 60;
const pollIntervalMs = 2000;

for (let attempt = 0; attempt < maxAttempts; attempt++) {
  try {
    const data = await this.http.graphql<{
      activeUploadSessions: Array<{
        sessionId: string;
        status: string;
        errorMessage: string | null;
      }>;
    }>(
      `query ActiveSessions { activeUploadSessions { sessionId status errorMessage } }`,
    );

    const session = data.activeUploadSessions.find(
      (s) => s.sessionId === sessionId,
    );

    if (!session || session.status === "completed") {
      onProgress?.({
        loaded: totalSize,
        total: totalSize,
        percent: 100,
        phase: "complete",
      });
      // Session completed — the file should now exist.
      // Unfortunately we don't have the fileId from the session query,
      // so we return a minimal file object. Caller can re-fetch if needed.
      break;
    }

    if (session.status === "failed") {
      throw new UploadError(
        session.errorMessage ?? "Upload transfer failed",
        sessionId,
      );
    }
  } catch (error) {
    if (error instanceof UploadError) throw error;
    // Ignore transient query errors during polling
  }

  await new Promise((resolve) => setTimeout(resolve, pollIntervalMs));
}

// Return a best-effort file object — the upload completed but
// we can't easily resolve the fileId from the session alone.
// Return a sentinel that callers can use.
return {
  id: sessionId,
  name: "",
  virtualPath: "",
  mimeType: "",
  size: totalSize,
  hash: null,
  remoteId: "",
  providerId: "",
  folderId: null,
  uploadedBy: "",
  isDeleted: false,
  starred: false,
  createdAt: new Date().toISOString(),
  updatedAt: new Date().toISOString(),
  lifecycle: {
    state: "HOT",
    storageClass: null,
    restoreRequestedAt: null,
    restoreExpiresAt: null,
    lastCheckedAt: null,
  },
};
}
```
After `maxAttempts` the function returns a placeholder `File` even if the session is still in progress (or stuck), which will report success incorrectly; this should throw an `UploadError` timeout (or return a result type that indicates pending) rather than returning a fake `File`.
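A sketch of the timeout behavior this comment asks for. `pollUntilComplete`, `checkStatus`, and `UploadTimeoutError` are illustrative stand-ins for the SDK's session poll and error hierarchy:

```typescript
// Fail loudly when the poll budget is exhausted instead of returning a
// placeholder that looks like success.
class UploadTimeoutError extends Error {
  constructor(public sessionId: string, attempts: number) {
    super(`Upload session ${sessionId} still pending after ${attempts} polls`);
    this.name = "UploadTimeoutError";
  }
}

async function pollUntilComplete(
  sessionId: string,
  checkStatus: () => Promise<"pending" | "completed" | "failed">,
  maxAttempts = 60,
  pollIntervalMs = 2000,
): Promise<void> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const status = await checkStatus();
    if (status === "completed") return;
    if (status === "failed") throw new Error("Upload transfer failed");
    await new Promise((resolve) => setTimeout(resolve, pollIntervalMs));
  }
  // Budget exhausted while still pending: surface it as an error, never
  // as a successful-looking File.
  throw new UploadTimeoutError(sessionId, maxAttempts);
}
```

Callers can then distinguish "timed out, possibly still uploading" from "completed" and decide whether to keep waiting or surface the failure.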
This pull request introduces the new TypeScript SDK for the Drivebase API and updates package versions and configuration files across the repository. The main focus is the addition of the SDK, which includes client, error handling, HTTP communication, and GraphQL operation definitions. Additionally, all Drivebase-related packages are bumped to version 3.3.0, and the Biome configuration and dependency versions are updated for consistency.