feat(duckdb): add support for duckdb dialect based on duckdb-node-neo #18080
base: main
Conversation
…hey are not well supported in duckdb, schema-proof sequence creation
Walkthrough

Adds a new DuckDB dialect package and integrates DuckDB support across core runtime and tests: connection manager, dialect, query generator/interface, query execution, data type overrides, package metadata, and widespread test adjustments and feature gates.
Sequence Diagram(s)

sequenceDiagram
participant App as Application
participant Core as Sequelize Core
participant Dialect as DuckDbDialect
participant ConnMgr as DuckDbConnectionManager
participant DB as DuckDB Instance
participant Query as DuckDbQueryGenerator
App->>Core: instantiate Sequelize(dialect: 'duckdb')
Core->>Dialect: register/initialize dialect
Dialect->>ConnMgr: init connection manager
App->>Core: execute ORM query (e.g., findAll)
Core->>Dialect: route query
Dialect->>ConnMgr: get connection (connect / from cache)
ConnMgr->>DB: open or reuse DuckDBInstance
DB-->>ConnMgr: connection
ConnMgr-->>Dialect: connection
Dialect->>Query: generate SQL (attributesToSQL / createTable / updateQuery)
Query-->>Dialect: SQL string
Dialect->>DB: execute SQL via DuckDbQuery.executeQuery (queued)
DB-->>Dialect: rows / metadata
Dialect->>Dialect: parse results via data-type overrides
Dialect-->>Core: return deserialized result set
Core-->>App: resolved models / data
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
Pre-merge checks: 1 failed check (warning), 4 passed.
Actionable comments posted: 13
Note
Due to the large number of review comments, Critical severity comments were prioritized as inline comments.
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (6)
packages/core/test/unit/query-interface/decrement.test.ts (1)
50-56: Split DuckDB and PostgreSQL expectations—DuckDB doesn't support `UPDATE ... RETURNING`.

Integration tests explicitly gate this with the comment: "DuckDB dialect disables RETURNING on update queries." The test should separate these dialects since DuckDB only supports `RETURNING` on `INSERT`, not `UPDATE`. The `increment.test.ts` file (line 55) has the same issue.

```diff
 expectsql(firstCall.args[0], {
   default: `UPDATE [Users] SET [age]=[age]- ':age',[name]=':name' WHERE [firstName] = ':firstName'`,
   mssql: `UPDATE [Users] SET [age]=[age]- N':age',[name]=N':name' OUTPUT INSERTED.[:data] WHERE [firstName] = N':firstName'`,
   sqlite3: "UPDATE `Users` SET `age`=`age`- ':age',`name`=':name' WHERE `firstName` = ':firstName' RETURNING `:data`",
-  'postgres duckdb': `UPDATE "Users" SET "age"="age"- ':age',"name"=':name' WHERE "firstName" = ':firstName' RETURNING ":data"`,
+  postgres: `UPDATE "Users" SET "age"="age"- ':age',"name"=':name' WHERE "firstName" = ':firstName' RETURNING ":data"`,
+  duckdb: `UPDATE "Users" SET "age"="age"- ':age',"name"=':name' WHERE "firstName" = ':firstName'`,
 });
```

packages/core/test/unit/sql/change-column.test.js (1)
44-60: DuckDB expectation has a suspicious trailing space (`TYPE FLOAT ;`)—please keep generator/tests consistent.

If the space is accidental, it's better to normalize the DuckDB query output (or the expected string) to avoid brittle diffs.

packages/core/test/unit/query-generator/create-table-query.test.ts (1)
383-395: DuckDB autoincrement: multi-statement SQL in one call will fail—duckdb-node-neo does not support executing multiple semicolon-separated statements via a single `runAndReadAll()` call.

The generated SQL concatenates `CREATE SEQUENCE` and `CREATE TABLE` in one string, but the DuckDB Node API requires using the `extractStatements()` API to split and execute statements individually. The current implementation at packages/duckdb/src/query.js passes the entire SQL string directly to `duckdbConnection.runAndReadAll(sql)`, which will throw an error at runtime. Emit sequence creation separately (e.g., via an explicit sequence creation step before table creation) or use `extractStatements()` and execute each statement in sequence.

packages/core/src/sequelize.internals.ts (1)
47-53: Update the "Native dialects" error message to include `duckdb`.

```diff
 default:
   throw new Error(
-    `The dialect ${dialect} is not natively supported. Native dialects: mariadb, mssql, mysql, postgres, sqlite3, ibmi, db2 and snowflake.`,
+    `The dialect ${dialect} is not natively supported. Native dialects: mariadb, mssql, mysql, postgres, sqlite3, ibmi, db2, snowflake and duckdb.`,
   );
```

packages/core/src/model.js (1)
2315-2324: Documentation and design issue: `updateOnDuplicate` is enabled for DuckDB but poorly documented and relies on implicitly-disabled features.

This change adds DuckDB to the list of dialects supporting `updateOnDuplicate` (line 2321). However:

Documentation gap: `model.d.ts` (lines 1185–1186) documents `updateOnDuplicate` as supported only for MySQL, MariaDB, SQLite, and Postgres—DuckDB is not mentioned.

Missing conflict field specification: DuckDB's `dialect.ts` declares `conflictFields: false` (line 36), meaning users cannot explicitly specify conflict targets. The upsert key must be inferred automatically.

Inference relies on disabled constraints: DuckDB disables unique constraints and primary keys in `supports.constraints` due to over-eager constraint checking, yet `query-interface.ts` (lines 80–98) attempts to infer upsert keys from these disabled features. While primary keys are still physically created (despite being marked unsupported), this creates a fragile implicit contract that `updateOnDuplicate` only works reliably with explicit primary keys.

Required actions:
- Update `model.d.ts` line 1185 to include DuckDB in the supported dialects list, or
- Clarify in code comments that DuckDB's `updateOnDuplicate` requires an explicit primary key and does not support the `conflictFields` parameter.

packages/core/test/integration/associations/has-many.test.js (1)
1041-1226: Consider using the `supports.constraints.onUpdate` flag for the cascade update test.

The FK-constraint gating at line 1041 is correct. However, the "can cascade updates" test (lines ~1095–1120) should check `current.dialect.supports.constraints.onUpdate` instead of relying only on dialect name exclusion. The codebase already defines this flag (query-generator-internal.ts:181), and the restrict tests below correctly use `supports.constraints.restrict`. For consistency and maintainability, apply the same capability-flag pattern to the cascade update test.
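The capability-flag pattern can be shown in miniature. This is a standalone sketch with a mocked `supports` object mirroring the shape of `current.dialect.supports.constraints`; the values and helper name are illustrative, not Sequelize's actual dialect instance:

```javascript
// Mocked dialect capability flags (illustrative values only).
const dialect = {
  supports: { constraints: { onUpdate: false, restrict: true } },
};

// Gate behavior on the capability, not on the dialect's name:
function shouldRunCascadeUpdateTest(d) {
  return Boolean(d.supports.constraints.onUpdate);
}

console.log(shouldRunCascadeUpdateTest(dialect)); // false
console.log(shouldRunCascadeUpdateTest({ supports: { constraints: { onUpdate: true } } })); // true
```

The advantage over name exclusion is that a future dialect with the same limitation is handled automatically once it declares the flag.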
🟠 Major comments (10)
packages/core/test/integration/json.test.ts-21-23 (1)
21-23: Reconsider adding DuckDB to `dialectSupportsJsonEquality`.

According to DuckDB's official documentation, JSON equality comparisons have inconsistent behavior—they sometimes compare raw text (whitespace/formatting matters) and sometimes compare logical JSON content. The docs explicitly warn about this caveat with examples showing differing behavior.

This inconsistency contradicts the assumption made by the `dialectSupportsJsonEquality` constant, which gates tests that rely on consistent JSON-to-JSON comparison. The test at line 122 (should be able to compare JSON to JSON directly) assumes reliable equality semantics, which DuckDB does not guarantee.

Either:
- Verify with the DuckDB dialect maintainers that this behavior is acceptable/resolved, or
- Remove DuckDB from this list and gate the test on a more specific check, or
- Document the known limitations if DuckDB's inconsistency is acceptable for Sequelize's use case.
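The distinction the docs draw (raw-text vs. logical JSON comparison) can be illustrated in plain JavaScript. This sketch only mimics the two modes; it says nothing about DuckDB's internals:

```javascript
// Two JSON documents that are logically equal but textually different.
const a = '{"x": 1}';
const b = '{ "x" : 1 }';

// Raw-text comparison is sensitive to whitespace/formatting:
console.log(a === b); // false

// Logical comparison normalizes first, so formatting is irrelevant:
console.log(JSON.stringify(JSON.parse(a)) === JSON.stringify(JSON.parse(b))); // true
```

A dialect that switches between these two modes depending on context cannot satisfy a test suite that assumes one consistent equality semantic.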
packages/duckdb/src/_internal/data-types-overrides.ts-86-99 (1)
86-99: Fix `parseDatabaseValue` typing—parameter should be `unknown`, not `BigInt`.

The current signature `value: BigInt` is incorrect (`BigInt` is a constructor). More importantly, @duckdb/node-api may return different types for BIGINT columns—string, number, or bigint—as evidenced by the comment in query.js: "BigInt binds as null in duckdb-node, convert to string". The parameter should accept `unknown` and handle all possible driver return types defensively, matching the pattern used in Postgres BIGINT and the JSON class in this same file.

```diff
 export class BIGINT extends BaseTypes.BIGINT {
   protected _checkOptionSupport(dialect: AbstractDialect) {
     super._checkOptionSupport(dialect);
   }

   toSql(): string {
     return 'BIGINT';
   }

-  parseDatabaseValue(value: BigInt): unknown {
+  parseDatabaseValue(value: unknown): unknown {
     // Sequelize expects bigint to be represented as string
+    if (typeof value === 'bigint') {
+      return value.toString();
+    }
+
+    if (typeof value === 'number') {
+      return String(value);
+    }
+
+    if (typeof value === 'string') {
+      return value;
+    }
+
+    // fallback (covers boxed BigInt or other driver-specific representations)
+    return (value as { toString?: () => string } | null)?.toString?.() ?? value;
   }
 }
```

packages/core/test/integration/index.test.ts-180-185 (1)
180-185: Fix overly-permissive error regexes (`\s|^...$` matches almost anything).

As written, any message containing whitespace will satisfy the assertion.

```diff
- expect(error.message).to.match(
-   /\s|^Cannot use duplicate column names in index. Column name 'username' listed more than once.$/,
- );
+ expect(error.message).to.match(
+   /^\s*Cannot use duplicate column names in index\. Column name 'username' listed more than once\.\s*$/,
+ );

- expect(error.message).to.match(
-   /\s|^Column name 'email' does not exist in the target table or view.$/,
- );
+ expect(error.message).to.match(
+   /^\s*Column name 'email' does not exist in the target table or view\.\s*$/,
+ );

- expect(error.message).to.match(
-   /\s|^Column 'username' in table 'users' is of a type that is invalid for use as a key column in an index.$/,
- );
+ expect(error.message).to.match(
+   /^\s*Column 'username' in table 'users' is of a type that is invalid for use as a key column in an index\.\s*$/,
+ );
```

Also applies to: 211-216, 242-247
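The failure mode is easy to demonstrate in isolation. A standalone sketch using the first message; the "unrelated error" string is illustrative:

```javascript
// The alternation in /\s|^...$/ means "any whitespace character anywhere"
// OR "the exact message", so any string containing a space passes.
const permissive =
  /\s|^Cannot use duplicate column names in index. Column name 'username' listed more than once.$/;
const strict =
  /^\s*Cannot use duplicate column names in index\. Column name 'username' listed more than once\.\s*$/;

console.log(permissive.test('some unrelated error')); // true (!)
console.log(strict.test('some unrelated error')); // false
console.log(
  strict.test("Cannot use duplicate column names in index. Column name 'username' listed more than once."),
); // true
```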
packages/core/test/integration/index.test.ts-219-237 (1)
219-237: "invalid column type" test also references a missing `email` column.

If the intent is to test key-column type validation, make the include list valid (or add `username` being `TEXT`).

```diff
 sequelize.define(
   'user',
   {
     username: DataTypes.TEXT,
     first_name: DataTypes.STRING,
     last_name: DataTypes.STRING,
+    email: DataTypes.STRING,
   },
   {
     indexes: [
       {
         name: 'user_username',
         fields: ['username'],
         include: ['first_name', 'last_name', 'email'],
         unique: true,
       },
     ],
   },
 );
```

packages/core/test/integration/model/bulk-create.test.js-499-520 (1)
499-520: "Should throw" ignoreDuplicates test can pass without asserting (missing failure on success).

If `bulkCreate` doesn't throw, the test currently succeeds.

```diff
 } else {
   it('should throw an error when the ignoreDuplicates option is passed', async function () {
     const data = [
       { uniqueName: 'Peter', secretValue: '42' },
       { uniqueName: 'Paul', secretValue: '23' },
     ];

     await this.User.bulkCreate(data, { fields: ['uniqueName', 'secretValue'] });
     data.push({ uniqueName: 'Michael', secretValue: '26' });

-    try {
-      await this.User.bulkCreate(data, {
-        fields: ['uniqueName', 'secretValue'],
-        ignoreDuplicates: true,
-      });
-    } catch (error) {
-      expect(error.message).to.equal(
-        `${dialectName} does not support the ignoreDuplicates option.`,
-      );
-    }
+    await expect(
+      this.User.bulkCreate(data, {
+        fields: ['uniqueName', 'secretValue'],
+        ignoreDuplicates: true,
+      }),
+    ).to.be.rejectedWith(`${dialectName} does not support the ignoreDuplicates option.`);
   });
 }
```

packages/core/test/integration/model/bulk-create.test.js-523-555 (1)
523-555: Gate primary key upsert tests by `dialect.supports.constraints.primaryKey`.

Three tests using primary keys for conflict detection are not gated by any constraint check:
- "when the primary key column names and model field names are the same" (lines 558–586)
- "when the primary key column names and model field names are different" (lines 587–614)
- "when the composite primary key column names and model field names are different" (lines 616–660)
DuckDB is a concrete example: it supports `inserts.updateOnDuplicate` but disables `constraints.primaryKey` due to over-eager constraint checking. These tests will run and fail on such dialects. Wrap them with `if (dialect.supports.constraints.primaryKey)` to match the gating pattern used for unique constraint tests.

packages/duckdb/src/connection-manager.ts-7-10 (1)
7-10: Remove unused `mode` parameter and add `database` validation.

The `mode` option is defined in the interface but never used in the `connect()` method. Since @duckdb/node-api's `fromCache()` doesn't support a mode parameter, either implement full mode support or remove it from the interface. Additionally, validate that `config.database` is provided before passing it to `fromCache()`.

```diff
 async connect(config: ConnectionOptions<DuckDbDialect>): Promise<DuckDbConnection> {
+  if (!config.database) {
+    throw new Error('DuckDB connection requires a non-empty "database" path.');
+  }
+
   const instance = await DuckDBInstanceClass.fromCache(config.database, {
     custom_user_agent: 'sequelize',
   });
```

Committable suggestion skipped: line range outside the PR's diff.
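A minimal sketch of the suggested guard; the helper name is illustrative, not the connection manager's actual API:

```javascript
// Reject missing or empty database paths before handing them to the driver
// (a hypothetical standalone version of the guard suggested above).
function assertDatabasePath(config) {
  if (typeof config.database !== 'string' || config.database.length === 0) {
    throw new Error('DuckDB connection requires a non-empty "database" path.');
  }
  return config.database;
}

console.log(assertDatabasePath({ database: ':memory:' })); // :memory:
try {
  assertDatabasePath({});
} catch (error) {
  console.log(error.message);
}
```

Failing early here produces a clear error instead of whatever the driver reports for an undefined path.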
packages/duckdb/src/query-generator.js-100-109 (1)
100-109: addIndexQuery: escape the comment payload + beware overwriting existing comments.

`actualIndexQuery` is inserted into a single-quoted string without escaping; a single quote in the generated SQL would break the statement. Also, each index attempt overwrites the table comment (losing previous "no-op index" notes).

Safer quoting:

```diff
- return `COMMENT ON TABLE ${table} IS '${actualIndexQuery}'`;
+ return `COMMENT ON TABLE ${table} IS ${this.escape(actualIndexQuery)}`;
```

packages/duckdb/src/dialect.ts-28-118 (1)
28-118: PRIMARY KEY flag contradicts actual constraint generation; RETURNING disabled for UPDATE despite "returning" support flag.

The supports flag sets `constraints.primaryKey: false` but `query-generator.js` (lines 43-55, 86-88) actively emits `PRIMARY KEY (...)` table constraints. Either the flag should be `true` or the constraint generation should be prevented.

Additionally, `returnValues: 'returning'` doesn't reflect that DuckDB can only use RETURNING with INSERT statements—`updateQuery()` (lines 111-115) and `arithmeticQuery()` (lines 137-138) explicitly strip or disable it due to constraint violation issues. The code works correctly as a runtime workaround, but the supports flag should communicate this limitation.

packages/duckdb/src/query-generator.js-11-60 (1)
11-60: `createTableQuery`: sequence handling has a critical bug with multiple AUTOINCREMENT columns.

- `sequence_sql` is reassigned on each loop iteration (line 38), so only the last AUTOINCREMENT column's sequence is created. Earlier sequences are discarded, causing the second and subsequent columns' `nextval()` calls to fail at runtime.
- Line 87 comment says "will be replaced with a PRIMARY KEY comment in createTableQuery," but line 54 emits an actual `PRIMARY KEY (...)` table constraint, not a comment. The terminology is misleading—update the comment to say "table constraint" instead of "comment."

Fix the sequence accumulation issue:

```diff
- let sequence_sql = '';
+ const sequenceSql: string[] = [];
@@
- if (dataType.includes('AUTOINCREMENT')) {
-   sequence_sql = `CREATE SEQUENCE IF NOT EXISTS ${this.quoteIdentifier(sequence_name)} START 1; `;
+ if (dataType.includes('AUTOINCREMENT')) {
+   sequenceSql.push(
+     `CREATE SEQUENCE IF NOT EXISTS ${this.quoteIdentifier(sequence_name)} START 1; `,
+   );
    // this could be done in attributesToSQL but better keep it with sequence_name generation in case it changes
    dataType = dataType.replace('AUTOINCREMENT', `DEFAULT nextval('${sequence_name}')`);
  }
@@
- const sql = `${sequence_sql}CREATE TABLE IF NOT EXISTS ${table} (${attrStr});`;
+ const sql = `${sequenceSql.join('')}CREATE TABLE IF NOT EXISTS ${table} (${attrStr});`;
```

Also verify that @duckdb/node-api supports multiple SQL statements concatenated in a single string via `runAndReadAll()`.

Also applies to: 62-98
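The reassignment bug reproduces in isolation. A sketch with hypothetical column data, not the actual generator code:

```javascript
const columns = [
  { name: 'id', autoIncrement: true },
  { name: 'seq_no', autoIncrement: true },
  { name: 'label', autoIncrement: false },
];

// Buggy: a plain reassignment keeps only the last column's sequence.
let sequenceSqlBuggy = '';
for (const col of columns) {
  if (col.autoIncrement) {
    sequenceSqlBuggy = `CREATE SEQUENCE IF NOT EXISTS "t_${col.name}_seq" START 1; `;
  }
}
console.log((sequenceSqlBuggy.match(/CREATE SEQUENCE/g) || []).length); // 1

// Fixed: accumulate every sequence statement, then join.
const sequenceSql = [];
for (const col of columns) {
  if (col.autoIncrement) {
    sequenceSql.push(`CREATE SEQUENCE IF NOT EXISTS "t_${col.name}_seq" START 1; `);
  }
}
console.log(sequenceSql.length); // 2
```

With two AUTOINCREMENT columns, the buggy version emits one `CREATE SEQUENCE` statement instead of two, so the first column's `nextval()` would reference a sequence that was never created.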
🟡 Minor comments (13)
packages/core/test/unit/query-generator/rollback-savepoint-query.test.ts-10-15 (1)
10-15: Misleading test name (“committing” vs rollback)
This test calls `rollbackSavepointQuery` and expects `ROLLBACK...`; the `it(...)` title should reflect rollback to avoid confusion when reading failures.

packages/core/test/unit/query-generator/get-constraint-snippet.test.ts (1)
468-468: Fix inconsistent error message.

The error message uses "Unique key constraints" while all other unique constraint tests in this file use "Unique constraints" (lines 341, 357, 371, 381, 393, 407, 421, 435, 449). This inconsistency should be corrected for uniformity.
Apply this diff:
```diff
- duckdb: new Error('Unique key constraints are not supported by duckdb dialect'),
+ duckdb: new Error('Unique constraints are not supported by duckdb dialect'),
```

packages/core/test/unit/transaction.test.ts-69-69 (1)
69-69: Remove DuckDB from isolation level test expectations.

DuckDB's dialect declares `isolationLevels: false` at packages/duckdb/src/dialect.ts:100. This means the test transaction at line 73 will throw an error for DuckDB, caught at line 78, and the expectation at line 74 (referencing line 69) will never be evaluated. The expectation at line 69 should be removed to align with the test's error-handling logic and coding guidelines for skipping unsupported behavior.

packages/core/test/unit/query-generator/remove-constraint-query.test.ts-120-120 (1)
120-120: Fix inconsistent dialect ordering.

This line uses `'sqlite3 duckdb'` while all other occurrences in this file use `'duckdb sqlite3'`. Maintain consistent ordering to avoid potential matching issues and improve readability.

Apply this diff to fix the ordering:

```diff
- 'sqlite3 duckdb': notSupportedError,
+ 'duckdb sqlite3': notSupportedError,
```

packages/core/test/integration/error.test.ts-694-695 (1)
694-695: Typo in test description. "assocated" should be "associated".

```diff
- it('should throw a foreign key constraint error when deleting a parent row that has assocated child rows', async () => {
+ it('should throw a foreign key constraint error when deleting a parent row that has associated child rows', async () => {
```

packages/core/test/integration/error.test.ts-849-850 (1)
849-850: Typo in test description. "constranit" should be "constraint".

```diff
- it('should throw an unknown constranit error for duplicate constraint names', async () => {
+ it('should throw an unknown constraint error for duplicate constraint names', async () => {
```

packages/core/test/unit/data-types/temporal-types.test.ts-29-29 (1)
29-29: Add precision option validation to DuckDB's DATE type.

DuckDB's DATE implementation silently ignores the precision parameter without warning, unlike other unsupported options in the dialect which either implement `_checkOptionSupport` with a warning (e.g., TEXT) or declare the limitation in `dialect.ts` (e.g., `TIME.precision: false`).

Add a `_checkOptionSupport` method to the DATE class in packages/duckdb/src/_internal/data-types-overrides.ts to warn when precision is provided, or add `DATE: { precision: false }` to the `supports.dataTypes` section in packages/duckdb/src/dialect.ts to explicitly declare that DuckDB does not support DATE precision.

packages/core/test/integration/query-interface/remove-column.test.ts-168-236 (1)
168-236: Test name/setup mismatch: "retain ON UPDATE" but the constraint isn't created with `onUpdate`.

Either add `onUpdate: 'CASCADE'` to the constraint creation, or rename the test to only mention ON DELETE / default update behavior.

```diff
 await queryInterface.addConstraint('actors', {
   name: 'actors_level_id_fkey',
   type: 'FOREIGN KEY',
   fields: ['level_id'],
   references: { field: 'id', table: 'level' },
   onDelete: 'CASCADE',
+  onUpdate: 'CASCADE',
 });
```

packages/duckdb/src/_internal/data-types-overrides.ts-171-196 (1)
171-196: Keep the `JSON` override, but silence the "shadow global JSON" lint locally.

The export name is likely intentional for Sequelize's datatype override map; suppressing is cleaner than renaming.

```diff
-export class JSON extends BaseTypes.JSON {
+// biome-ignore lint/suspicious/noShadowRestrictedNames: Sequelize override type name; use globalThis.JSON explicitly below.
+export class JSON extends BaseTypes.JSON {
   toSql(): string {
     return 'JSON';
   }
 @@
 }
```

packages/core/test/integration/utils.test.ts-12-14 (1)
12-14: Use `recursive: true` to make directory creation idempotent.

`fs.mkdirSync(DUCKDB_DATABASES_DIR)` will throw an `EEXIST` error if the directory already exists (e.g., on test re-runs or if cleanup didn't occur). This could cause spurious test failures.

```diff
 if (dialectName === 'duckdb') {
-  fs.mkdirSync(DUCKDB_DATABASES_DIR);
+  fs.mkdirSync(DUCKDB_DATABASES_DIR, { recursive: true });
 }
```

packages/duckdb/src/dialect.ts-12-20 (1)
12-20: Prefer a type alias over an empty interface (Biome lint).

```diff
-export interface DuckDbDialectOptions {}
+export type DuckDbDialectOptions = Record<string, never>;
```

packages/duckdb/src/query-interface.ts-123-153 (1)
123-153: bulkInsert: typo + replacements handling looks fine, but keep option names consistent with upsert.
- Variable name typo: `attrubutesWithCoercedType`.
- You delete `options.replacements` here (plural), but upsert deletes `replacement` (singular). After fixing upsert, this section is consistent.

```diff
- const attrubutesWithCoercedType = attributes as {
+ const attributesWithCoercedType = attributes as {
    [columnName: string]: NormalizedAttributeOptions;
  };
 @@
-  attrubutesWithCoercedType,
+  attributesWithCoercedType,
```

packages/duckdb/src/query-generator.js-110-139 (1)
110-139: RETURNING removal is brittle—handle multiple clause formats.

The `arithmeticQuery` method removes only the exact substring `RETURNING *`. If the base query generator emits `RETURNING "id"`, `RETURNING id`, or similar variations, this replacement will fail and the problematic RETURNING clause will remain.

Consider using a more flexible pattern:

```diff
- return query.replace('RETURNING *', '');
+ return query.replace(/\s+RETURNING\s+.*$/i, '');
```

This handles any RETURNING format at the end of the query (case-insensitive). Verify this aligns with how the base `arithmeticQuery` constructs its output.
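The suggested regex can be sanity-checked on its own; the sample queries are illustrative, not Sequelize's actual generator output:

```javascript
// Exact-substring replacement misses variant RETURNING clauses,
// while a trailing-clause regex strips them all.
const stripExact = query => query.replace('RETURNING *', '');
const stripRegex = query => query.replace(/\s+RETURNING\s+.*$/i, '');

console.log(stripExact('UPDATE "t" SET "n"="n"+1 RETURNING "id"'));
// UPDATE "t" SET "n"="n"+1 RETURNING "id"   <- clause survives

console.log(stripRegex('UPDATE "t" SET "n"="n"+1 RETURNING "id"'));
// UPDATE "t" SET "n"="n"+1

console.log(stripRegex('UPDATE "t" SET "n"="n"+1 returning id'));
// UPDATE "t" SET "n"="n"+1
```

The anchored `$` keeps the regex from touching anything except a trailing clause, which matches how the base generator appends RETURNING at the end of the statement.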
🧹 Nitpick comments (34)
packages/core/test/unit/sql/group.test.js (2)
39-45: Confirm combined fixture key `'postgres duckdb'` is the intended `expectsql` convention.

If DuckDB is expected to match Postgres here, this is fine; just be aware that if DuckDB diverges later you'll likely want a dedicated `duckdb` entry to avoid coupling expectations. Based on learnings, tests should cleanly express dialect differences.
58-64: Same note for the empty GROUP BY case: consider a separate `duckdb` fixture if behavior may diverge.

If the goal is "DuckDB matches Postgres" for now, the combined key is OK; otherwise, a dedicated `duckdb` expectation keeps future edits localized.

packages/core/test/unit/query-generator/rollback-savepoint-query.test.ts (2)
6-18: Potentially lost coverage for "unsupported" dialects due to the outer `if` gate.

Gating by `dialect.supports.savepoints` matches the general pattern, but it also means dialects that should raise `notSupportedError` (e.g., the explicit `snowflake` case) won't execute this test if they report `supports.savepoints === false`. Consider keeping the suite always-defined and conditionally skipping/asserting inside the test so "unsupported" behavior is still covered. As per coding guidelines / learnings, feature-gating is correct, but ensure it doesn't hide expected failures.

```diff
-if (dialect.supports.savepoints) {
-  describe('QueryGenerator#rollbackSavepointQuery', () => {
+describe('QueryGenerator#rollbackSavepointQuery', () => {
   const queryGenerator = sequelize.queryGenerator;

-  it('should generate a query for committing a transaction', () => {
+  it('should generate a query for rolling back to a savepoint', () => {
+    if (!dialect.supports.savepoints) {
+      return;
+    }
     expectsql(() => queryGenerator.rollbackSavepointQuery('mySavepoint'), {
       default: 'ROLLBACK TO SAVEPOINT [mySavepoint]',
       mssql: 'ROLLBACK TRANSACTION [mySavepoint]',
       snowflake: notSupportedError,
     });
   });
-  });
-}
+});
```

(If you prefer explicit reporting over silent return, use `this.skip()` with a message.)
7-7: Consider using `Support.getTestDialectTeaser()` in descriptions (if this suite follows that convention).

Coding guidelines mention using `Support.getTestDialectTeaser()` for dialect-specific test descriptions; even though this test isn't dialect-specific in name, it's feature-specific and may benefit from the standardized teaser format used elsewhere in tests.

Also applies to: 10-10
packages/core/test/unit/query-generator/describe-table-query.test.ts (1)
78-83: DuckDB describe query: qualify inner columns + consider deterministic ordering (and verify `array_contains`).

The correlated subquery reads clearer/safer if `duckdb_constraints` is aliased and its columns are qualified (today `database_oid`/`schema_oid`/`table_oid` are unqualified). Also consider adding an `ORDER BY` on a stable column from `duckdb_columns()` (if available) to avoid any nondeterminism in `describeTable` output, and please double-check that `array_contains` is the intended DuckDB function name for the supported version(s).

Example shape (illustrative):

```diff
- (select count(*) from duckdb_constraints where col.database_oid = database_oid
-   and col.schema_oid = schema_oid and col.table_oid = table_oid
-   and array_contains(constraint_column_names, col.column_name)
-   and constraint_type='PRIMARY KEY') > 0 as is_primary_key
+ exists (
+   select 1
+   from duckdb_constraints con
+   where col.database_oid = con.database_oid
+     and col.schema_oid = con.schema_oid
+     and col.table_oid = con.table_oid
+     and array_contains(con.constraint_column_names, col.column_name)
+     and con.constraint_type = 'PRIMARY KEY'
+ ) as is_primary_key
+ -- optionally: order by col.<stable_position_column>
```

Also applies to: 159-165, 241-246, 320-325, 405-410, 488-493
packages/core/test/unit/query-generator/arithmetic-query.test.ts (1)
65-78: Consider adding a duckdb expectation for completeness.

While DuckDB naturally omits RETURNING for UPDATE operations, this test explicitly sets `returning: false`. For consistency with other tests in this file and to validate DuckDB's identifier quoting (double quotes vs. square brackets in `default`), consider adding a duckdb entry.

```diff
 expectsql(sql, {
   default: `UPDATE [myTable] SET [foo]=[foo]+ 3`,
+  duckdb: `UPDATE "myTable" SET "foo"="foo"+ 3`,
 });
```

packages/core/test/integration/query-interface/changeColumn.test.js (2)
229-232: Consider removing redundant variable assignment.

The intermediate assignment `const firstForeignKeys = keys` can be simplified by using `keys` directly in the assertions below, or by renaming in the destructuring.

```diff
- const keys = await this.queryInterface.showConstraints('users', {
+ const firstForeignKeys = await this.queryInterface.showConstraints('users', {
    constraintType: 'FOREIGN KEY',
  });
- const firstForeignKeys = keys;
```
255-271: Consider relocating this test outside the FK describe block.

This test verifies column comment functionality rather than foreign key behavior. While it works correctly here (using tables from `beforeEach`), placing it inside the `describe('should support foreign keys', ...)` block is semantically misleading.

If relocating would require duplicating table setup, the current placement is acceptable.
packages/core/test/integration/model/sync.test.js (2)
167-168: Consider using a feature flag instead of a hardcoded dialect name check.

The guard uses `sequelize.dialect.name !== 'duckdb'`, which is inconsistent with the recommended pattern used elsewhere in this file (e.g., `sequelize.dialect.supports.constraints.foreignKey` at lines 581 and 653). As per coding guidelines, tests should skip unsupported behavior with `dialect.supports.featureName` checks.

Consider adding a `supports.indexes` feature flag in the DuckDB dialect configuration and using it here:

```diff
- // DuckDB dialect does not support indexes
- if (sequelize.dialect.name !== 'duckdb') {
+ if (sequelize.dialect.supports.indexes) {
```

This makes the code more maintainable if other dialects also lack index support in the future. Based on learnings, the pattern should be `dialect.supports.featureName` checks.
439-440: Duplicate dialect name check pattern.

Same concern as the earlier guard at line 168 - this uses a hardcoded dialect name check rather than a feature flag. If a `supports.indexes` feature flag is added, this guard should also be updated.

Additionally, consider whether these two separate guard blocks (lines 168-434 and 440-547) could be consolidated into a single block to reduce code duplication and improve readability.
packages/core/test/integration/instance.validations.test.js (2)
95-100: Consider using the dialect variable for consistency.

The error message branching logic is correct. However, for consistency with the feature gating at line 80, consider using the `dialect` variable defined at line 9 rather than calling `Support.getTestDialect()` directly.

Optional refactor:

```diff
- if (Support.getTestDialect() === 'ibmi') {
+ if (dialect.name === 'ibmi') {
    expect(err.errors[0].message).to.include('Duplicate key value specified');
  } else {
    expect(err.errors[0].path).to.include('uniqueName');
    expect(err.errors[0].message).to.include('must be unique');
  }
```
103-156: LGTM! Well-structured tests for custom unique constraint messages.

The nested gating correctly handles:
- First-level check: dialect supports unique constraints at all
- Second-level check: dialect supports custom error messages (excluding ibmi)
Test coverage for both single and multiple unique constraints with custom messages is thorough.
Optional: For consistency with line 80, consider using `dialect.name !== 'ibmi'` instead of `Support.getTestDialect() !== 'ibmi'` at lines 103 and within the nested structure.

packages/core/test/integration/associations/multiple-level-filters.test.js (2)
62-71: Make `order` unambiguous + explicit (ASC) to avoid dialect-specific parsing/SQL ambiguity.

With joins in play, `order: ['title']` can be (a) parsed differently than expected and/or (b) produce an ambiguous `ORDER BY title` on some dialects. Safer to qualify the column and be explicit about direction.

```diff
 const tasks = await Task.findAll({
   include: [
     {
       model: Project,
       include: [{ model: User, where: { username: 'leia' } }],
       required: true,
     },
   ],
-  order: ['title'],
+  order: [[Task, 'title', 'ASC']],
 });
```

130-147: Same here: qualify the ordered column to avoid an ambiguous `ORDER BY title`.

```diff
 const tasks = await Task.findAll({
   include: [
     {
       model: Project,
       include: [
         {
           model: User,
           where: {
             username: 'leia',
             id: 1,
           },
         },
       ],
       required: true,
     },
   ],
-  order: ['title'],
+  order: [[Task, 'title', 'ASC']],
 });
```

packages/core/test/unit/data-types/temporal-types.test.ts (1)
144-144: Consider matching the error message format for consistency.

The error message for DuckDB's unsupported `TIME(precision)` is shorter than the equivalent db2 error (lines 141-142) and lacks the documentation URL. For consistency and better user guidance, consider using the same format.

Apply this diff to align the error message format:

```diff
- duckdb: new Error('duckdb does not support the TIME(precision) data type.'),
+ duckdb: new Error(`duckdb does not support the TIME(precision) data type.
+See https://sequelize.org/docs/v7/models/data-types/ for a list of supported data types.`),
```

packages/core/test/unit/sql/insert.test.js (1)
44-44: Consider adding feature flag checks for DuckDB-specific limitations.

Multiple test expectations now combine `'postgres duckdb'`, assuming identical SQL generation. While this is efficient, consider whether any of these scenarios should be gated by `dialect.supports` checks given the documented limitations (no constraints, no savepoints, limited RETURNING support). This would make the test suite more resilient and explicit about feature support.

As per coding guidelines: "In tests, skip unsupported behavior with dialect.supports.featureName checks."
Also applies to: 70-70, 211-211, 255-255, 295-295, 352-352, 381-381
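Where a supports-based gate is adopted, the pattern is simple to express. Below is a minimal sketch; the `ConstraintSupport` shape, flag values, and function name are illustrative stand-ins, not the real `dialect.supports` type from Sequelize:

```typescript
// Illustrative subset of a dialect's feature-flag matrix; the real
// `dialect.supports` object in Sequelize is far richer.
interface ConstraintSupport {
  constraints: { unique: boolean; foreignKey: boolean };
}

// Gate tests on declared capabilities instead of dialect names, so a new
// dialect opts in or out purely through its support matrix.
function shouldRunUniqueConstraintTests(supports: ConstraintSupport): boolean {
  return supports.constraints.unique;
}

const duckdbLike: ConstraintSupport = { constraints: { unique: false, foreignKey: false } };
const postgresLike: ConstraintSupport = { constraints: { unique: true, foreignKey: true } };

console.log(shouldRunUniqueConstraintTests(duckdbLike)); // false
console.log(shouldRunUniqueConstraintTests(postgresLike)); // true
```

The advantage over `dialect !== 'duckdb'` checks is that any future dialect with the same limitation is covered automatically by its own support matrix.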
packages/core/test/integration/query-interface/add-show-remove-constraint.test.ts (2)
10-12: Consider switching the suite title to the dialect teaser helper for consistency with other integration tests.
E.g. use the existing test helpers (getTestDialectTeaser/Support.getTestDialectTeaser) if already standard in this area. As per coding guidelines, tests should use the dialect teaser.
8-10: Refactor to use `sequelize.dialect.supports.constraints` flags instead of an `if (dialect !== 'duckdb')` check.

DuckDB explicitly declares `supports.constraints.add: false` and `supports.constraints.remove: false`. Use feature-flag gating to align with project patterns and avoid hard-coding dialect names. This applies to both the top-level suite wrapper (line 8) and any similar patterns elsewhere (lines 703-704).

packages/core/src/abstract-dialect/query-interface.js (1)
78-85: Update the nearby TODO/comment to mention DuckDB too (since it is now excluded alongside sqlite3).

The behavior is fine, but the comment is now misleading.

```diff
-    // TODO: the sqlite implementation of createTableQuery should be improved so it also generates a CREATE UNIQUE INDEX query
+    // TODO: sqlite3 & duckdb: createTableQuery handling around uniqueKeys is special-cased.
+    //  - sqlite3: does not respect index names when created through CREATE TABLE
+    //  - duckdb: unique constraints are intentionally not created (see dialect limitations)
```

packages/core/test/integration/query-interface/remove-column.test.ts (1)
91-102: The DuckDB skip for "remove PK column" looks consistent with dialect limitations.

packages/core/test/unit/query-interface/create-table.test.ts (2)
73-99: Same concern for `sql.uuidV1`: the DuckDB output omits the default.

If intentional, consider documenting it in the dialect limitations.

13-44: The DuckDB expectation drops `DEFAULT` for `sql.uuidV4`; confirm this is an intentional limitation (vs a missing feature).

If DuckDB can support UUID defaults, it may be worth adding a dialect implementation rather than silently omitting the default.

DuckDB SQL: what built-in functions exist for generating UUIDs (v4/v1), and can they be used in DEFAULT clauses?

packages/core/test/unit/sql/index.test.js (1)
20-169: DuckDB deliberately converts index queries to `COMMENT ON TABLE ... IS 'CREATE INDEX ...'` because indexes make columns impossible to update (a DuckDB limitation). While this is an intentional workaround with documented rationale (lines 101-108 in packages/duckdb/src/query-generator.js), the approach has a real drawback: it will overwrite legitimate user table comments.

Consider using a truly inert statement instead, such as a PRAGMA, `SELECT 1`, or raising an informative error, that does not risk corrupting user metadata. The current implementation works functionally but sacrifices user table comments as collateral damage.

packages/duckdb/package.json (1)
1-44: Consider adding an `engines` field for consistency.

The core package specifies `"engines": { "node": ">=18.20.8" }`. For consistency across the monorepo, and to ensure users are aware of Node.js version requirements, consider adding the same engines constraint to this package.

```diff
   "type": "commonjs",
   "version": "7.0.0-alpha.41",
+  "engines": {
+    "node": ">=18.20.8"
+  },
   "publishConfig": {
```

packages/duckdb/src/query-generator.internal.ts (1)
5-12: Return type should be readonly to match the frozen constant.

Small polish: declare the return type as `readonly string[]` (or `readonly (typeof TECHNICAL_SCHEMAS)[number][]`) to reflect immutability.

packages/duckdb/src/_internal/data-types-overrides.ts (1)
38-85: DuckDB type names are correct. `UTINYINT`, `USMALLINT`, `UINTEGER`, and `TIMESTAMPTZ` are all official DuckDB type names and match the correct mappings for their respective Sequelize types.

However, consider mapping `DATEONLY` to DuckDB's native `DATE` type instead of `VARCHAR`. DuckDB's DATE type provides proper date semantics (comparison, date arithmetic, special values), whereas VARCHAR loses these semantics and requires explicit casting for date operations.

packages/duckdb/src/query-generator-typescript.internal.ts (1)
105-107: Remove this redundant override method. The parent class's `startTransactionQuery` already validates the `readOnly` and `transactionType` options against `dialect.supports.startTransaction`, which DuckDB correctly sets to `false` for both. Calling `super.startTransactionQuery(options)` provides no additional logic or safety.

packages/core/test/support.ts (1)
614-618: Clarify or remove the "TBD: clean up later" comment.

The comment on line 616 is unclear. What aspect needs to be cleaned up later? If this refers to a known follow-up task, consider creating an issue and referencing it here, or remove the comment if the implementation is complete.

```diff
   return Promise.all([
     fs.promises.rm(SQLITE_DATABASES_DIR, { recursive: true, force: true }),
-    // TBD: clean up later
     fs.promises.rm(DUCKDB_DATABASES_DIR, { recursive: true, force: true }),
   ]);
```

packages/core/test/integration/associations/has-one.test.js (1)
177-301: FK-dependent tests correctly gated.

Gating the FK-constraint error expectations behind `current.dialect.supports.constraints.foreignKey` is the right way to keep the suite meaningful across dialects. Based on learnings, this is the preferred approach for unsupported constraint behavior.

Minor: `describe('foreign key')` becomes empty when unsupported; consider wrapping the whole `describe` in the `if` to avoid empty suites.

packages/core/test/integration/model/upsert.test.js (1)
86-108: Good capability gate for unique-constraint/index-based upsert tests.

Wrapping composite-key upsert behind `supports.constraints.unique` and skipping unique-index-specific tests for duckdb matches the stated dialect limitations and keeps test intent accurate. Based on learnings, capability-gating is preferred over dialect-name checks; if DuckDB's support flags already encode this, consider switching the `dialectName !== 'duckdb'` block to a supports check in the future.

Also applies to: 395-539
packages/core/test/config/config.ts (1)
4-23: Config helpers look fine; ensure the DuckDB databases dir is created during tests.

Exporting `DUCKDB_DATABASES_DIR` + `getDuckDbDatabasePath` is a good parallel to sqlite. Just ensure the test harness creates/cleans that directory (otherwise the first run can fail on some environments).

packages/duckdb/src/connection-manager.ts (1)
34-38: Prefer camelCase for added connection fields (`db_path` → `dbPath`).

Minor, but this leaks into typings and any downstream usage.

Also applies to: 56-58
packages/core/test/integration/index.test.ts (1)
9-12: Consider adding a top-level `supports.indexes` flag to the dialect support matrix.

The current code gates the entire indexes test suite with `dialect !== 'duckdb'`, which hard-codes dialect names. While this is a common pattern in the test suite, the learnings emphasize using `dialect.supports.*` checks instead. DuckDB has all granular index features disabled (`supports.index.include`, `supports.index.type`, etc. are all false), making it suitable for a feature-level flag. Add `indexes: false` to DuckDB's supports configuration and update the test to use `if (sequelize.dialect.supports.indexes) { describe(...) }` for consistency with the testing guidelines.

packages/duckdb/src/dialect.ts (1)
160-176: escapeBuffer: consider building via array join to avoid quadratic concatenation.

Not a blocker, but this is an easy micro-optimization for large buffers:

```diff
-    let escaped = '';
+    const parts: string[] = [];
     for (const element of buffer) {
       @@
-      escaped += element ? `\\x${hex}` : '\\x00';
+      parts.push(element ? `\\x${hex}` : '\\x00');
     }
-
-    return `'${escaped}'::BLOB`;
+    return `'${parts.join('')}'::BLOB`;
```
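A runnable sketch of the array-join variant, as a standalone function outside the dialect class (the hex-padding approach here is assumed to match the intent of the diff above, including the `\x00` case):

```typescript
// Build a DuckDB BLOB literal byte-by-byte, collecting fragments in an
// array and joining once instead of repeatedly concatenating strings.
function escapeBuffer(buffer: Uint8Array): string {
  const parts: string[] = [];
  for (const element of buffer) {
    // Two lowercase hex digits per byte; zero bytes become \x00.
    parts.push(`\\x${element.toString(16).padStart(2, '0')}`);
  }

  return `'${parts.join('')}'::BLOB`;
}

console.log(escapeBuffer(new Uint8Array([0, 255, 16]))); // '\x00\xff\x10'::BLOB
```

Array-join keeps each iteration O(1); repeated `+=` on a growing string can degrade to quadratic time on engines that do not optimize string builders.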
```js
      case 'duckdb':
        // eslint-disable-next-line import/no-extraneous-dependencies -- legacy function, will be removed. User needs to install the dependency themselves
        return require('@sequelize/duckdb').DuckDbDialect;
```
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Find the duckdb dialect package in the sequelize monorepo
find . -type d -name "*duckdb*" | head -20

Repository: sequelize/sequelize
Length of output: 182
🌐 Web query:
@sequelize/duckdb package exports DuckDbDialect npm
💡 Result:
Short answer: There is no official @sequelize/duckdb package that exports a DuckDbDialect for Sequelize. Sequelize’s official dialect packages and docs do not list DuckDB as a supported dialect, and a scoped package named @sequelize/duckdb is not published. [1][2]
If you want to use DuckDB from Node.js you can use node-duckdb or DuckDB WASM + community adapters (e.g., Kysely DuckDB dialects) instead; but there’s no built-in Sequelize DuckDB dialect at this time. [3][4]
Sources:
- Sequelize docs (supported/official dialects). [1]
- npm / package metadata (no @sequelize/duckdb published package). [2]
- node-duckdb (Node client for DuckDB). [3]
- Kysely community DuckDB dialect examples (DuckDB WASM dialect). [4]
Remove or fix the broken DuckDB dialect case—@sequelize/duckdb is not a published package.
The code attempts to require @sequelize/duckdb, but this package does not exist on npm or as an official Sequelize dialect. This will fail at runtime if the 'duckdb' case is reached. Either remove this case, implement a proper DuckDB dialect following the patterns in packages/{dialect}/src/dialect.ts, or clarify if this is placeholder code pending future implementation.
🤖 Prompt for AI Agents
In packages/core/src/sequelize.internals.ts around lines 47-49, the 'duckdb'
case requires a non-existent package '@sequelize/duckdb' which will crash at
runtime; remove or replace this broken case. Replace it either by deleting the
'duckdb' branch entirely (and removing any related typings/tests), or implement
a real DuckDB dialect following the pattern used in
packages/{dialect}/src/dialect.ts and export it here; as a minimal stopgap,
change the branch to throw a clear UnsupportedDialectError (or similar) that
states DuckDB is not supported and points to docs, and ensure you update any
related type declarations and unit tests to reflect the removal or new behavior.
Either remove this case, implement a proper DuckDB dialect following the patterns
What do you think we are doing here, bot?
Seems like the humans are having a chat. I'll hop back into my burrow for now. If you need me again, just tag @coderabbitai in a new comment, and I'll come hopping out!
```ts
  async disconnect(connection: DuckDbConnection) {
    if (connection.closed) {
      return;
    }

    connection.closed = true;
    connection.closeSync();
    connection.instance.closeSync();
  }
```
🧩 Analysis chain
🏁 Script executed:
# First, explore the repository structure
find packages/duckdb -type f -name "*.ts" | head -20

Repository: sequelize/sequelize
Length of output: 529
🏁 Script executed:
# Look at connection-manager.ts in detail
cat -n packages/duckdb/src/connection-manager.ts

Repository: sequelize/sequelize
Length of output: 2431
🏁 Script executed:
# Check the connection class definition
cat -n packages/duckdb/src/connection.ts

Repository: sequelize/sequelize
Length of output: 128
🏁 Script executed:
# Search for instance caching/sharing logic
rg -i "cache|fromcache|instance" packages/duckdb/src --type ts -A 2 -B 2

Repository: sequelize/sequelize
Length of output: 3140
🏁 Script executed:
# Check package.json for duckdb dependencies and versions
cat packages/duckdb/package.json

Repository: sequelize/sequelize
Length of output: 1274
🌐 Web query:
@duckdb/node-api DuckDBInstance.fromCache reference counting closeSync
💡 Result:
Short answer — fromCache returns a cached DuckDBInstance (shared via the instance cache); the cache prevents multiple instances attaching the same file and the native library uses explicit close semantics / reference management, so you must explicitly close when finished to release resources and the file lock.
Key points and actions
- DuckDBInstance.fromCache(...) / DuckDBInstanceCache.getOrCreateInstance(...) returns a shared instance from the default or a custom instance cache (so callers get the same underlying DB instance rather than creating a new one). [1]
- Connections obtained from that instance should be closed when done (connection.disconnectSync() / connection.closeSync()). [1]
- The native layer requires an explicit close to free the underlying DuckDB instance (duckdb_close in the C API). If you don’t close the instance/connection you can leave file handles/locks open; call instance.closeSync() (or close the cache / destroy it) to fully release resources. [2]
- There are existing reports of DB files staying locked if not all handles/instances are properly closed — ensure every connection and the instance/cache are closed to avoid EBUSY/locked-file errors. [3]
Sources
- DuckDB Node (Neo) client docs — instance cache, fromCache, and connection close APIs. [1]
- DuckDB C API / bindings notes: cached instances must be closed (duckdb_close) to free resources. [2]
- GitHub issue reporting file-locking when instances/handles aren’t fully closed. [3]
If you want, tell me the exact code pattern you’re using and I’ll show the minimal correct sequence (fromCache → connect → closeSync) to avoid locks.
🏁 Script executed:
# Search for any tests related to connection lifecycle
rg -i "disconnect|closeSync" packages/duckdb --type ts -B 3 -A 3

Repository: sequelize/sequelize
Length of output: 1042
🏁 Script executed:
# Check if there are any usage examples or tests in the repo
find . -name "*test*" -o -name "*spec*" | grep -i duckdb

Repository: sequelize/sequelize
Length of output: 45
Lifecycle bug: disconnect() unconditionally closes a shared instance on every connection close.
Since DuckDBInstanceClass.fromCache(config.database) returns a cached instance shared across all connections to the same database, calling connection.instance.closeSync() on the first disconnected connection will close the shared instance and break all other active connections to that database. The code needs reference counting to track how many connections are active for each shared instance, calling closeSync() only when the last connection is closed. Currently, the implementation contradicts its own comment about native cache eviction of instances "when all references are released."
🤖 Prompt for AI Agents
In packages/duckdb/src/connection-manager.ts around lines 46 to 54, disconnect()
currently always calls connection.instance.closeSync(), which closes a shared
cached DuckDBInstance and breaks other active connections; change this to use
reference counting on the shared instance: increment the instance's refCount
when a connection is created/checked out, decrement refCount when disconnecting,
and only call instance.closeSync() (and remove the instance from the cache) when
refCount reaches zero; ensure refCount updates are atomic/thread-safe (or
guarded by the existing cache lock) and set connection.closed = true and
closeSync() on the connection itself as before.
```ts
  describeTableQuery(tableName: TableOrModel) {
    const table = this.extractTableDetails(tableName);

    return `select col.column_name, col.data_type as column_type, col.is_nullable, col.column_default as default_value,
        (select count(*) from duckdb_constraints where col.database_oid = database_oid and col.schema_oid = schema_oid
            and col.table_oid = table_oid and array_contains(constraint_column_names, col.column_name)
            and constraint_type='PRIMARY KEY') > 0 as is_primary_key
        from duckdb_columns() col
        where col.table_name = '${table.tableName}' and col.schema_name = '${table.schema}'`;
  }
```
🧩 Analysis chain
🏁 Script executed:
# First, let's see the full file structure and context
head -100 packages/duckdb/src/query-generator-typescript.internal.ts

Repository: sequelize/sequelize
Length of output: 3678
🏁 Script executed:
# Search for the escape method definition
rg "escape\s*\(" packages/duckdb/src/query-generator-typescript.internal.ts -A 3

Repository: sequelize/sequelize
Length of output: 1351
🏁 Script executed:
# Check parent class or related implementations
rg "class.*QueryGenerator" packages/duckdb/src/ -A 5

Repository: sequelize/sequelize
Length of output: 1760
🏁 Script executed:
# Look for other describeTableQuery or similar methods to see escaping patterns
rg "describeTableQuery|escape\(" packages/duckdb/src/ -B 2 -A 2

Repository: sequelize/sequelize
Length of output: 5572
🏁 Script executed:
# Check the extractTableDetails method
rg "extractTableDetails" packages/duckdb/src/query-generator-typescript.internal.ts -A 10

Repository: sequelize/sequelize
Length of output: 2206
🌐 Web query:
DuckDB duckdb_constraints table or function system catalog
💡 Result:
duckdb_constraints() is a built‑in DuckDB meta table function that returns metadata about constraints defined in the current DuckDB instance (NOT an actual catalog table). It appears in the DuckDB meta functions list and exposes one row per constraint with columns such as database_name, database_oid, schema_name, schema_oid, table_name, table_oid, constraint_index and constraint_type (possible types: CHECK, FOREIGN KEY, PRIMARY KEY, NOT NULL, UNIQUE). Example:
SELECT * FROM duckdb_constraints();
Notes:
- This function is documented in the DuckDB meta table functions docs. [1]
- There are known limitations/feature requests (e.g., some referenced details for foreign keys — referenced table/constraint may only appear in the constraint_text column), tracked on the DuckDB GitHub. [2]
- duckdb_constraints() is one of several meta functions listed in the DuckDB environment docs (duckdb_columns(), duckdb_tables(), etc.). [3]
Sources:
[1] DuckDB — duckdb_constraints() (meta table functions).
[2] GitHub issue: “Does duckdb_constraints() contain all relevant information?” (duckdb/duckdb).
[3] DuckDB — DuckDB environment / meta table functions.
🌐 Web query:
DuckDB SQL injection prevention escaping identifiers parameterized queries
💡 Result:
Short answer:
- For values: use DuckDB prepared/parameterized statements (?, $1, or named $name) — they prevent SQL injection for data values. [1][2]
- For identifiers (table/ or column-names) you cannot bind them as parameters. Either (A) restrict them to a server-side whitelist/allowlist, or (B) safely quote them by wrapping in double quotes and escaping any embedded double-quote by doubling it (replace " with ""). Do NOT concatenate unchecked user input into SQL. [3][1]
Examples
-
Value parameter (Python):
duckdb.execute("SELECT * FROM t WHERE id = ?", [user_id]) -- safe for user_id. [2] -
Identifier safe handling (pseudo-Python):
if user_col not in ALLOWED_COLUMNS: raise an error
escaped = user_col.replace('"', '""')
sql = f'SELECT "{escaped}" FROM my_table' -- only after the whitelist check. [1][3]
Sources
[1] DuckDB — Keywords and Identifiers (identifier quoting/escaping).
[2] DuckDB — Prepared Statements (parameter syntax and examples).
[3] DuckDB — Securing DuckDB (recommendation to use prepared statements; avoid string concatenation).
🏁 Script executed:
# Check exact line numbers for describeTableQuery
rg -n "describeTableQuery" packages/duckdb/src/query-generator-typescript.internal.ts -A 15

Repository: sequelize/sequelize
Length of output: 915
SQL injection risk in describeTableQuery (unescaped table/schema names).
Line 52 directly interpolates table.tableName and table.schema into the query string without escaping. All other query methods in this file (showConstraintsQuery, showIndexesQuery, listTablesQuery) properly use this.escape() for these values. Update both parameters to use this.escape():
```diff
 describeTableQuery(tableName: TableOrModel) {
   const table = this.extractTableDetails(tableName);

   return `select col.column_name, col.data_type as column_type, col.is_nullable, col.column_default as default_value,
     (select count(*) from duckdb_constraints where col.database_oid = database_oid and col.schema_oid = schema_oid
       and col.table_oid = table_oid and array_contains(constraint_column_names, col.column_name)
       and constraint_type='PRIMARY KEY') > 0 as is_primary_key
     from duckdb_columns() col
-    where col.table_name = '${table.tableName}' and col.schema_name = '${table.schema}'`;
+    where col.table_name = ${this.escape(table.tableName)} and col.schema_name = ${this.escape(table.schema)}`;
 }
```

Additionally, line 48 uses duckdb_constraints without parentheses; it should be duckdb_constraints() to match the DuckDB function call syntax used in other methods.
🤖 Prompt for AI Agents
In packages/duckdb/src/query-generator-typescript.internal.ts around lines 44 to
53, the SQL built in describeTableQuery directly interpolates table.tableName
and table.schema causing an injection risk and also references
duckdb_constraints without parentheses; update the query to call
this.escape(table.tableName) and this.escape(table.schema) where those values
are inserted, and change duckdb_constraints to duckdb_constraints() so the
function call matches DuckDB syntax; ensure the rest of the query string remains
unchanged and use the escaped values in the where clause.
```ts
const TECHNICAL_SCHEMAS = Object.freeze(['information_schema', 'pg_catalog']);

export class DuckDbQueryGeneratorInternal<
  Dialect extends DuckDbDialect = DuckDbDialect,
> extends AbstractQueryGeneratorInternal<Dialect> {
  getTechnicalSchemaNames() {
    return TECHNICAL_SCHEMAS;
  }

  addLimitAndOffset(options: AddLimitOffsetOptions) {
    let fragment = '';
    if (options.limit != null) {
      fragment += ` LIMIT ${this.queryGenerator.escape(options.limit, options)}`;
    }

    if (options.offset) {
      fragment += ` OFFSET ${this.queryGenerator.escape(options.offset, options)}`;
    }

    return fragment;
  }
```
Fix OFFSET 0 being dropped (truthiness check).
Line 20 uses if (options.offset), which omits OFFSET 0.
```diff
 addLimitAndOffset(options: AddLimitOffsetOptions) {
   let fragment = '';
   if (options.limit != null) {
     fragment += ` LIMIT ${this.queryGenerator.escape(options.limit, options)}`;
   }
-  if (options.offset) {
+  if (options.offset != null) {
     fragment += ` OFFSET ${this.queryGenerator.escape(options.offset, options)}`;
   }
   return fragment;
 }
```

🤖 Prompt for AI Agents
}🤖 Prompt for AI Agents
packages/duckdb/src/query-generator.internal.ts around lines 5 to 25: the
current truthiness check `if (options.offset)` drops valid offset 0; change the
condition to explicitly check for null/undefined (e.g. `if (options.offset !=
null)`) so that 0 is handled, keeping the existing escape call and fragment
construction otherwise unchanged.
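The difference between the two checks is easy to demonstrate with a stripped-down version of the method (escaping omitted for brevity; the `LimitOffset` shape is illustrative):

```typescript
interface LimitOffset {
  limit?: number | null;
  offset?: number | null;
}

// `!= null` excludes only null/undefined, so OFFSET 0 is preserved;
// a plain truthiness check would silently drop it.
function addLimitAndOffset(options: LimitOffset): string {
  let fragment = '';
  if (options.limit != null) {
    fragment += ` LIMIT ${options.limit}`;
  }

  if (options.offset != null) {
    fragment += ` OFFSET ${options.offset}`;
  }

  return fragment;
}

console.log(addLimitAndOffset({ limit: 10, offset: 0 })); // " LIMIT 10 OFFSET 0"
console.log(addLimitAndOffset({ limit: 10 })); // " LIMIT 10"
```

`OFFSET 0` is a no-op for the database, but dropping it changes the generated SQL, which matters for snapshot tests and query logging.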
```js
'use strict';

import { CREATE_TABLE_QUERY_SUPPORTABLE_OPTIONS } from '@sequelize/core/_non-semver-use-at-your-own-risk_/abstract-dialect/query-generator.js';
import { rejectInvalidOptions } from '@sequelize/core/_non-semver-use-at-your-own-risk_/utils/check.js';
import { defaultValueSchemable } from '@sequelize/core/_non-semver-use-at-your-own-risk_/utils/query-builder-utils.js';
import isObject from 'lodash/isObject';

const { DuckDbQueryGeneratorTypeScript } = require('./query-generator-typescript.internal');

export class DuckDbQueryGenerator extends DuckDbQueryGeneratorTypeScript {
```
Blocker: mixed ESM/CJS + Biome parse errors — make this file consistently TS/ESM (preferred) or CJS.
Static analysis indicates this won’t parse as-is, and the import/export + require mix is fragile across tooling. As per coding guidelines, new src code should be TypeScript.
A minimal ESM fix (if the package is ESM) would look like:
```diff
-'use strict';
-
-import { CREATE_TABLE_QUERY_SUPPORTABLE_OPTIONS } from '@sequelize/core/_non-semver-use-at-your-own-risk_/abstract-dialect/query-generator.js';
+import { CREATE_TABLE_QUERY_SUPPORTABLE_OPTIONS } from '@sequelize/core/_non-semver-use-at-your-own-risk_/abstract-dialect/query-generator.js';
 import { rejectInvalidOptions } from '@sequelize/core/_non-semver-use-at-your-own-risk_/utils/check.js';
 import { defaultValueSchemable } from '@sequelize/core/_non-semver-use-at-your-own-risk_/utils/query-builder-utils.js';
 import isObject from 'lodash/isObject';
-
-const { DuckDbQueryGeneratorTypeScript } = require('./query-generator-typescript.internal');
+import { DuckDbQueryGeneratorTypeScript } from './query-generator-typescript.internal.js';
```

Based on coding guidelines, prefer moving this implementation to packages/duckdb/src/query-generator.ts.
🧰 Tools
🪛 Biome (2.1.2)
[error] 2-3: Illegal use of an import declaration outside of a module
not allowed inside scripts
(parse)
[error] 3-4: Illegal use of an import declaration outside of a module
not allowed inside scripts
(parse)
[error] 4-5: Illegal use of an import declaration outside of a module
not allowed inside scripts
(parse)
[error] 5-6: Illegal use of an import declaration outside of a module
not allowed inside scripts
(parse)
```js
  changeColumnQuery(tableName, attributes) {
    const query = subQuery => `ALTER TABLE ${this.quoteTable(tableName)} ALTER COLUMN ${subQuery};`;
    const sql = [];
    const fields = this.attributesToSQL(attributes, { context: 'addColumn' });

    for (const attributeName in attributes) {
      const definition = fields[attributeName];
      let attrSql = '';

      if (definition.includes('DEFAULT')) {
        attrSql += query(
          `${this.quoteIdentifier(attributeName)} SET DEFAULT ${definition.match(/DEFAULT ([^;]+)/)[1]}`,
        );
      } else if (definition.includes('NOT NULL')) {
        // adding/removing constraints in ALTER TABLE is not supported
        attrSql += query(
          `${this.quoteIdentifier(attributeName)} TYPE ${definition.replace('NOT NULL', '')}`,
        );
      } else {
        attrSql += query(`${this.quoteIdentifier(attributeName)} TYPE ${definition}`);
      }

      sql.push(attrSql);
    }

    return sql.join('');
  }

  renameColumnQuery(tableName, attrBefore, attributes) {
    const attrString = [];

    for (const attributeName in attributes) {
      attrString.push(
        `${this.quoteIdentifier(attrBefore)} TO ${this.quoteIdentifier(attributeName)}`,
      );
    }

    return `ALTER TABLE ${this.quoteTable(tableName)} RENAME COLUMN ${attrString.join(', ')};`;
  }
```
renameColumnQuery likely generates invalid SQL for multiple renames (repeats the same “from” column).
As written, if attributes has more than one key, you generate old TO new1, old TO new2, which is almost certainly invalid and not what Sequelize expects.
If this is meant to rename a single column, enforce exactly one target:
```diff
 renameColumnQuery(tableName, attrBefore, attributes) {
-  const attrString = [];
-
-  for (const attributeName in attributes) {
-    attrString.push(
-      `${this.quoteIdentifier(attrBefore)} TO ${this.quoteIdentifier(attributeName)}`,
-    );
-  }
-
-  return `ALTER TABLE ${this.quoteTable(tableName)} RENAME COLUMN ${attrString.join(', ')};`;
+  const [attrAfter] = Object.keys(attributes);
+  if (!attrAfter || Object.keys(attributes).length !== 1) {
+    throw new Error('DuckDB renameColumnQuery expects exactly one target column name.');
+  }
+  return `ALTER TABLE ${this.quoteTable(tableName)} RENAME COLUMN ${this.quoteIdentifier(attrBefore)} TO ${this.quoteIdentifier(attrAfter)};`;
 }
```

🤖 Prompt for AI Agents
In packages/duckdb/src/query-generator.js around lines 157 to 195,
renameColumnQuery currently iterates all keys in `attributes` and repeats the
same `attrBefore` for each, producing invalid SQL like `old TO new1, old TO
new2`; enforce that `attributes` contains exactly one target column (check
Object.keys(attributes).length === 1), extract that single target name and build
a single `ALTER TABLE ... RENAME COLUMN old TO new;` statement, and throw a
clear error if more than one or zero targets are provided.
```ts
  // A copy of the one in core, except conflict columns do not get updated
  // to work around DuckDB's overly strict unique constraint enforcement
  async upsert<M extends Model>(
    tableName: TableName,
    insertValues: object,
    updateValues: object,
    where: object,
    inputOptions: QiUpsertOptions<M>,
  ): Promise<object> {
    if (inputOptions?.bind) {
      assertNoReservedBind(inputOptions.bind);
    }

    const options: UpsertOptionsWithAllProperties<M> = { ...inputOptions };

    const model = options.model;
    const modelDefinition = model.modelDefinition;

    options.type = QueryTypes.UPSERT;
    options.updateOnDuplicate = Object.keys(updateValues);
    options.upsertKeys = options.conflictFields || [];

    if (options.upsertKeys.length === 0) {
      const primaryKeys = [
        ...map(
          modelDefinition.primaryKeysAttributeNames,
          pkAttrName => modelDefinition.attributes?.get(pkAttrName)?.columnName || pkAttrName,
        ),
      ];

      const uniqueColumnNames = Object.values(model.getIndexes())
        .filter(c => c.unique && c.fields?.length && c.fields?.length > 0)
        .map(c => c.fields as string[]);
      // For fields in updateValues, try to find a constraint or unique index
      // that includes given field. Only first matching upsert key is used.
      for (const field of options.updateOnDuplicate) {
        const indexKey = uniqueColumnNames.find(fields => fields.includes(field));
        if (indexKey && typeof indexKey === 'string') {
          options.upsertKeys.push(indexKey);
          break;
        }
      }

      // Always use PK, if no constraint available OR update data contains PK
      if (
        options.upsertKeys.length === 0 ||
        intersection(options.updateOnDuplicate, primaryKeys).length > 0
      ) {
        options.upsertKeys = primaryKeys;
      }

      options.upsertKeys = uniq(options.upsertKeys);
      // This is the only real difference from the built-in upsert -- the keys do not participate
      // in the DO UPDATE SET clause. This should ideally be done in abstract-dialect/query-interface
      // since those keys should not ever need to get updated, but since this is only causing problems
      // in DuckDB, for now the difference lives here.
      options.updateOnDuplicate = difference(options.updateOnDuplicate, options.upsertKeys);
    }

    const { bind, query } = this.queryGenerator.insertQuery(
      tableName,
      insertValues,
      getObjectFromMap(modelDefinition.attributes),
      options,
    ) as any as { query: string; bind?: { [key: string]: unknown } };

    // unlike bind, replacements are handled by QueryGenerator, not QueryRaw
    delete options.replacement;
    options.bind = combineBinds(options.bind || {}, bind || {});

    return this.sequelize.queryRaw(query, options);
  }
```
DuckDB upsert workaround is broken due to incorrect type check, conditional scope, and property deletion mismatch.

Issues:

- Line 87: `indexKey` is an array (from `.map(c => c.fields as string[])`), so `typeof indexKey === 'string'` will always be false, preventing unique index detection.
- Line 106: The workaround that removes conflict keys from `updateOnDuplicate` only executes when `upsertKeys.length === 0` (inside the if-block). When `conflictFields` is provided, the if-block is skipped and the workaround never runs.
- Line 117: Deletes `options.replacement` (singular), but bulkInsert deletes `options.replacements` (plural). Given the comment about replacements being handled by QueryGenerator, this inconsistency needs clarification.
Suggested fixes:

```diff
-      if (indexKey && typeof indexKey === 'string') {
-        options.upsertKeys.push(indexKey);
+      if (indexKey && Array.isArray(indexKey)) {
+        options.upsertKeys = indexKey;
         break;
       }

       options.upsertKeys = uniq(options.upsertKeys);
-      // This is the only real difference...
-      options.updateOnDuplicate = difference(options.updateOnDuplicate, options.upsertKeys);
     }

+    // Ensure conflict keys never get updated (even if conflictFields was provided).
+    options.updateOnDuplicate = difference(options.updateOnDuplicate, options.upsertKeys);

-    delete options.replacement;
+    delete (options as any).replacements;
```

Additionally, verify that `fields.includes(field)` uses consistent naming (attribute names or column names) with the values in `updateOnDuplicate`.
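The `typeof`-on-array pitfall above is easy to reproduce in isolation. The helper below is illustrative only (the name `findUpsertKeys` is hypothetical, not the PR's code), but mirrors the shape `uniqueColumnNames` has after `.map(c => c.fields as string[])`:

```typescript
// Illustrative reproduction of the typeof-vs-Array.isArray pitfall.
function findUpsertKeys(uniqueColumnNames: string[][], field: string): string[] {
  const indexKey = uniqueColumnNames.find(fields => fields.includes(field));

  // `typeof indexKey` is 'object' for an array, never 'string', so a
  // `typeof indexKey === 'string'` guard silently disables this branch.
  if (indexKey !== undefined && Array.isArray(indexKey)) {
    return [...indexKey]; // spread the columns; don't push the array itself
  }

  return [];
}
```

With a composite unique index such as `['tenant_id', 'slug']`, the corrected check returns all of its columns as conflict keys, whereas the original `typeof` guard would fall through to the primary-key fallback every time.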
🤖 Prompt for AI Agents
In packages/duckdb/src/query-interface.ts around lines 50 to 121, the DuckDB
upsert workaround has three issues: the unique-index detection uses typeof on an
array, the removal of conflict keys from updateOnDuplicate only runs inside the
if-block, and the wrong replacements property is deleted. Fix by (1) treating
indexKey as an array (use Array.isArray(indexKey) and push its fields into
options.upsertKeys with options.upsertKeys.push(...indexKey) or otherwise add
each column, not push the array itself), (2) move the line
options.updateOnDuplicate = difference(options.updateOnDuplicate,
options.upsertKeys) outside of the if-block so it always runs (including when
conflictFields is provided), and (3) delete the plural options.replacements
(delete options.replacements) instead of options.replacement; also ensure the
comparison fields.includes(field) uses the same naming convention (column names
vs attribute names) as options.updateOnDuplicate by converting names
consistently before the check.
```typescript
async dropSchema(schema: string, options?: DropSchemaOptions): Promise<void> {
  if (schema !== 'main') {
    // main schema cannot be dropped
    return super.dropSchema(schema, options);
  }
}
```
dropSchema is inverted and falls through (should no-op for "main", delegate otherwise).
Current code calls super.dropSchema for non-main schemas and does nothing for main, contradicting the comment and likely callers’ expectations.
```diff
 async dropSchema(schema: string, options?: DropSchemaOptions): Promise<void> {
-  if (schema !== 'main') {
-    // main schema cannot be dropped
-    return super.dropSchema(schema, options);
-  }
+  if (schema === 'main') {
+    // main schema cannot be dropped
+    return;
+  }
+
+  return super.dropSchema(schema, options);
 }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```typescript
async dropSchema(schema: string, options?: DropSchemaOptions): Promise<void> {
  if (schema === 'main') {
    // main schema cannot be dropped
    return;
  }

  return super.dropSchema(schema, options);
}
```
🤖 Prompt for AI Agents
In packages/duckdb/src/query-interface.ts around lines 171-176, the logic for
dropSchema is inverted: it currently delegates to super.dropSchema when schema
!== 'main' and no-ops for 'main', which contradicts the comment; change the
control flow to no-op (return immediately) when schema === 'main' and delegate
to/return the parent implementation for all other schemas (e.g., call and return
await super.dropSchema(schema, options) or return super.dropSchema(schema,
options) to preserve the async Promise<void> behavior).
Actionable comments posted: 8
♻️ Duplicate comments (1)
packages/duckdb/src/connection-manager.ts (1)
53-61: Don’t unconditionally `closeSync()` a potentially shared cached instance on every `disconnect()`. With `DuckDBInstanceClass.fromCache(...)`, multiple Sequelize “connections” can share the same underlying instance for a db path; closing the instance here risks breaking other active connections and/or flapping the cache. This matches a previously raised issue for this file.

A safer shape is: always close the connection; only close the instance when you can prove it’s no longer in use (ref-count per db path / instance, or rely on a documented ref-counted close API if `fromCache` provides one).

```diff
   async disconnect(connection: DuckDbConnection) {
     if (connection.closed) {
       return;
     }

     connection.closed = true;
     connection.closeSync();
-    connection.instance.closeSync();
+    // NOTE: only close the cached instance when it is no longer referenced
+    // (e.g., ref-count per db_path / instance), otherwise other active connections may break.
   }
```
🧹 Nitpick comments (2)

packages/duckdb/src/query.js (1)

23-71: `databaseQueues` can leak entries (no cleanup when a db is no longer used). Since keys are db paths, this Map will grow across ephemeral databases (tests, temp files) unless you delete entries when the last connection for a db closes. Consider deleting the key on final disconnect (this ties nicely into the instance ref-count fix in `connection-manager.ts`).

packages/duckdb/src/connection-manager.ts (1)

53-61: Use the async close API instead of sync calls in an async function. The `disconnect()` method is `async` but calls `closeSync()`, which blocks the event loop. The DuckDB Node.js API provides async alternatives: `connection.close()` or `connection.disconnect()`. Replace the sync calls with their async equivalents and use `await`.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- packages/duckdb/src/connection-manager.ts (1 hunks)
- packages/duckdb/src/query.js (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
packages/**/src/**/*.js
📄 CodeRabbit inference engine (.github/copilot-instructions.md)
Write all new implementations in TypeScript; avoid creating new .js files in src
Files:
packages/duckdb/src/query.js
🧠 Learnings (4)
📓 Common learnings
Learnt from: CR
Repo: sequelize/sequelize PR: 0
File: .github/copilot-instructions.md:0-0
Timestamp: 2025-09-29T19:49:24.715Z
Learning: Applies to packages/*/src/query-generator.{js,ts} : Implement dialect feature SQL in query-generator.ts (preferred) or legacy query-generator.js
Learnt from: CR
Repo: sequelize/sequelize PR: 0
File: .github/copilot-instructions.md:0-0
Timestamp: 2025-09-29T19:49:24.715Z
Learning: Applies to packages/*/src/dialect.ts : When adding a dialect feature, update packages/{dialect}/src/dialect.ts to declare feature support flags
📚 Learning: 2025-09-29T19:49:24.715Z
Learnt from: CR
Repo: sequelize/sequelize PR: 0
File: .github/copilot-instructions.md:0-0
Timestamp: 2025-09-29T19:49:24.715Z
Learning: Applies to packages/*/src/query-interface.{js,ts} : Add schema operations for new features in query-interface.ts (preferred) or legacy query-interface.js
Applied to files:
- packages/duckdb/src/query.js
- packages/duckdb/src/connection-manager.ts
📚 Learning: 2025-09-29T19:49:24.715Z
Learnt from: CR
Repo: sequelize/sequelize PR: 0
File: .github/copilot-instructions.md:0-0
Timestamp: 2025-09-29T19:49:24.715Z
Learning: Applies to packages/*/src/query-generator.{js,ts} : Implement dialect feature SQL in query-generator.ts (preferred) or legacy query-generator.js
Applied to files:
- packages/duckdb/src/query.js
- packages/duckdb/src/connection-manager.ts
📚 Learning: 2025-09-29T19:49:24.715Z
Learnt from: CR
Repo: sequelize/sequelize PR: 0
File: .github/copilot-instructions.md:0-0
Timestamp: 2025-09-29T19:49:24.715Z
Learning: Applies to packages/**/src/**/query-generator.{js,ts} : When modifying query generation, update both the base query generator and all dialect-specific implementations
Applied to files:
packages/duckdb/src/query.js
🧬 Code graph analysis (2)
packages/duckdb/src/query.js (1)
packages/duckdb/src/query.d.ts (1)
DuckDbQuery(3-3)
packages/duckdb/src/connection-manager.ts (1)
packages/core/src/abstract-dialect/connection-manager.ts (1)
AbstractConnection(15-19)
🪛 Biome (2.1.2)
packages/duckdb/src/query.js
[error] 2-3: Illegal use of an import declaration outside of a module
not allowed inside scripts
(parse)
[error] 3-4: Illegal use of an import declaration outside of a module
not allowed inside scripts
(parse)
[error] 4-5: Illegal use of an import declaration outside of a module
not allowed inside scripts
(parse)
[error] 5-6: Illegal use of an import declaration outside of a module
not allowed inside scripts
(parse)
[error] 72-263: Illegal use of an export declaration outside of a module
not allowed inside scripts
(parse)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: Upload install and build artifact (Node 18)
- GitHub Check: Upload install and build artifact (Node 20)
```typescript
async connect(config: ConnectionOptions<DuckDbDialect>): Promise<DuckDbConnection> {
  const instance = await DuckDBInstanceClass.fromCache(config.database, {
    custom_user_agent: 'sequelize',
  });

  const connection = await instance.connect();
  const duckDbConnection = connection as DuckDbConnection;
  duckDbConnection.closed = false;
  duckDbConnection.db_path = config.database;
  duckDbConnection.instance = instance;

  return duckDbConnection;
}
```
🧩 Analysis chain
🏁 Script executed:
# Search for DuckDbConnectionOptions and mode definition
rg "DuckDbConnectionOptions" --type ts -A 10

Repository: sequelize/sequelize
Length of output: 2727
🏁 Script executed:
# Search for DuckDbConnection type definition
rg "DuckDbConnection" --type ts -B 2 -A 10 | head -80
🏁 Script executed:
# Find the DuckDBInstanceClass implementation
fd "connection-manager" --type f
rg "class DuckDBInstanceClass\|DuckDBInstanceClass" --type ts -B 2 -A 5
🏁 Script executed:
# Search for any mode property usage in duckdb-related files
rg "\.mode\s*=" --type ts -B 3 -A 3
🏁 Script executed:
# Search for ConnectionOptions type definition
rg "interface ConnectionOptions\|type ConnectionOptions" --type ts -A 15
🏁 Script executed:
# Find where DuckDBInstanceClass comes from
rg "DuckDBInstanceClass\|DuckDBInstance" packages/duckdb/src/connection-manager.ts -B 5 -A 2 | head -30
🏁 Script executed:
# Check the imports at the top of connection-manager.ts
head -20 packages/duckdb/src/connection-manager.ts
🏁 Script executed:
# Check package.json for @duckdb/node-api version
rg "@duckdb/node-api" packages/duckdb/package.json
🏁 Script executed:
# Search for any DuckDB options usage or documentation in the codebase
rg "fromCache\|\.connect\(" packages/duckdb/src/connection-manager.ts -B 2 -A 5
🌐 Web query:
@duckdb/node-api DuckDBInstance.fromCache options 1.4.2 readonly readwrite mode
💡 Result:
Short answer: The Neo client (@duckdb/node-api, v1.x) does not expose a usable READONLY/READWRITE open flag on DuckDBInstance.fromCache — the instance-cache helpers (fromCache / DuckDBInstanceCache.getOrCreateInstance) take a path and return an instance; the old numeric open flags (duckdb.OPEN_READONLY / OPEN_READWRITE) are documented as “ignored” in the Node API. In other words, you cannot reliably set a read-only vs read-write mode via fromCache in the Neo Node API (1.x); access-mode flags are effectively ignored. [1][2][3]
Sources
- @duckdb/node-api (Neo) docs — examples showing DuckDBInstance.create / fromCache usage. [1]
- Node.js API reference noting duckdb.OPEN_READONLY / OPEN_READWRITE (currently ignored). [2]
- Chinese mirror / documentation pages showing instance cache and that create/fromCache are used to get instances. [3]
References:
[1] DuckDB Node (Neo) docs — Node.js Client (examples: create, fromCache). (duckdb.org)
[2] Node.js API reference — OPEN_READONLY / OPEN_READWRITE documented as “currently ignored”. (duckdb.net.cn)
[3] DuckDB docs (instance cache example). (duckdb.net.cn)
If you want, I can: (a) show a small code snippet that enforces read-only behavior at the application level, or (b) check the exact behavior in DuckDB v1.4.2 source/releases if you need definitive confirmation for that exact patch release. Which do you prefer?
Citations:
- 1: https://duckdb.org/docs/1.3/clients/node_neo/overview.html?utm_source=openai
- 2: https://duckdb.net.cn/docs/stable/clients/nodejs/reference.html?utm_source=openai
- 3: https://duckdb.net.cn/docs/stable/clients/node_neo/overview.html?utm_source=openai
Remove unused mode option from DuckDbConnectionOptions — not supported by @duckdb/node-api
The mode: 'readonly' | 'readwrite' option is defined in DuckDbConnectionOptions and listed in CONNECTION_OPTION_NAMES but is never passed to DuckDBInstance.fromCache() or instance.connect(). The @duckdb/node-api library (v1.4.2) does not expose usable READONLY/READWRITE flags in fromCache—the old numeric open flags are documented as "ignored" in the Node API. Users can pass this option with no effect, creating a misleading public API. Either remove the option or document it as unsupported.
🤖 Prompt for AI Agents
In packages/duckdb/src/connection-manager.ts around lines 33 to 45, the
DuckDbConnectionOptions includes a `mode` option that is not passed to
DuckDBInstance.fromCache() or instance.connect() and is unsupported by
@duckdb/node-api; remove `mode` from the public DuckDbConnectionOptions/type
definitions and from CONNECTION_OPTION_NAMES so it is no longer accepted or
advertised by the API, and update any related JSDoc/type comments and tests that
reference `mode` to avoid exposing a no-op option (alternatively, if backward
compatibility is required, explicitly mark `mode` as unsupported/ignored in the
docs and runtime validation but do not attempt to pass it to the node-api).
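If backward compatibility is the chosen path, the "document as unsupported" alternative could look like the sketch below. `validateConnectionOptions` and its message wording are hypothetical, not code from this PR; `warn` stands in for the dialect's logger.

```javascript
// Hypothetical sketch: surface a warning for the no-op `mode` option
// instead of silently accepting it.
function validateConnectionOptions(options, warn = console.warn) {
  if (options.mode !== undefined) {
    warn(
      'The "mode" connection option is ignored: @duckdb/node-api does not ' +
        'expose read-only/read-write open flags.',
    );
  }

  return options;
}
```

Removing the option outright is still the cleaner fix, since a warning keeps dead surface area in the public API.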
'use strict';

import { blobValue } from '@duckdb/node-api';
import { AbstractQuery, DatabaseError, UniqueConstraintError } from '@sequelize/core';
import { logger } from '@sequelize/core/_non-semver-use-at-your-own-risk_/utils/logger.js';
import { isBigInt } from '@sequelize/utils';
🧩 Analysis chain
🏁 Script executed:
cat packages/duckdb/package.json
🏁 Script executed:
# Check the full query.js file to see lines 72-74 mentioned
wc -l packages/duckdb/src/query.js
🏁 Script executed:
# View lines 1-10 and 72-80 to verify the issues
sed -n '1,10p;72,80p' packages/duckdb/src/query.js
🏁 Script executed:
# Check if query.ts exists
ls -la packages/duckdb/src/query.* 2>/dev/null || echo "No query.* files found"
🏁 Script executed:
# View lines 70-75 to see the issues mentioned at 72-74
sed -n '70,75p' packages/duckdb/src/query.js
🏁 Script executed:
# Check git log to see if this file is new or has history
git log --oneline -n 5 -- packages/duckdb/src/query.js 2>/dev/null | head -10
🏁 Script executed:
# Check the full file to understand the scope of ESM usage
grep -n "^export\|^import" packages/duckdb/src/query.js | head -20
File should be converted to TypeScript, not kept as CommonJS with ESM syntax. This file uses ESM import/export syntax but the packages/duckdb package is configured with "type": "commonjs" in its package.json. Additionally, 'use strict' is incompatible with ESM (strict mode is implicit in modules). Per repo guidance, new implementations in packages/**/src/**/*.js should be written in TypeScript.
Recommended: Convert to query.ts with TypeScript syntax. This applies to the export class declaration at line 73 as well.
-'use strict';
-
-import { blobValue } from '@duckdb/node-api';
+import { blobValue } from '@duckdb/node-api';

Also applies to: 73
🧰 Tools
🪛 Biome (2.1.2)
[error] 2-3: Illegal use of an import declaration outside of a module
not allowed inside scripts
(parse)
[error] 3-4: Illegal use of an import declaration outside of a module
not allowed inside scripts
(parse)
[error] 4-5: Illegal use of an import declaration outside of a module
not allowed inside scripts
(parse)
[error] 5-6: Illegal use of an import declaration outside of a module
not allowed inside scripts
(parse)
🤖 Prompt for AI Agents
In packages/duckdb/src/query.js around lines 1 to 73, the file is using ESM
imports and 'use strict' while the package is configured as CommonJS and per
repo guidance new src files must be TypeScript; convert this file to
packages/duckdb/src/query.ts: remove the 'use strict' line, rename the file to
.ts, change any CommonJS-specific usage to TypeScript module format, add
appropriate type annotations/import types for blobValue, AbstractQuery,
DatabaseError, UniqueConstraintError, logger and isBigInt, and convert the
`export class` at line ~73 to a typed TypeScript class export; ensure the
project compiles by updating any local imports referencing ./query.js to ./query
and run type-check/tsc to fix remaining typing errors.
function executeQuery(duckdbConnection, sql, parameters) {
  const hasParameters = parameters && parameters.length > 0;
  const dbPath = duckdbConnection.db_path;

  // Chain this query to the database's execution queue
  const currentQueue = getDatabaseQueue(dbPath);

  const resultPromise = currentQueue.then(async () => {
    const reader = await duckdbConnection.runAndReadAll(
      sql,
      hasParameters ? parameters : undefined,
    );

    return reader.getRowObjectsJS();
  });
🧩 Analysis chain
🌐 Web query:
@duckdb/node-api runAndReadAll reader close dispose API documentation
💡 Result:
Short answer — yes:
- @duckdb/node-api (Neo) exposes connection.runAndReadAll (and prepared.runAndReadAll / streamAndReadAll) which returns a DuckDB data reader containing all rows/columns. [1]
- The reader exposes getters like getRows(), getRowObjects(), getColumns(), getRowsJson()/getRowObjectsJson()/getColumnsJson(), and chunk/column access helpers. [1]
- Results/readers are heap-backed resources; free them when done by calling dispose() on the result/reader (examples showing result.dispose()/prepared.dispose()). Connections/prepared objects also expose dispose/close-style methods. [2][1]
Sources:
- @duckdb/node-api README / API snippets (runAndReadAll, reader methods, examples). [1]
- DuckDB Node (Neo) docs / Node client overview (examples showing dispose/close usage). [2]
References:
[1] duckdb-node-neo API snippets (runAndReadAll, reader methods).
[2] DuckDB Node (Neo) documentation (connection/prepared lifecycle and dispose/close examples).
🏁 Script executed:
# Search for other usages of runAndReadAll in the codebase to see patterns
rg "runAndReadAll" --type js --type ts -B 2 -A 5
🏁 Script executed:
# Check if there's duckdb package documentation or source in the repo
find . -type f -name "*.md" -o -name "*.json" | xargs grep -l "runAndReadAll\|getRowObjectsJS" 2>/dev/null | head -20
Close the reader after consuming results to prevent resource leaks. The reader returned by runAndReadAll is a heap-backed resource that must be explicitly disposed. Add reader.dispose() after calling getRowObjectsJS():
const resultPromise = currentQueue.then(async () => {
const reader = await duckdbConnection.runAndReadAll(
sql,
hasParameters ? parameters : undefined,
);
const result = reader.getRowObjectsJS();
reader.dispose();
return result;
});

🤖 Prompt for AI Agents
In packages/duckdb/src/query.js around lines 48 to 62, the DuckDB reader
returned by runAndReadAll is not disposed, causing a heap-backed resource leak;
after calling reader.getRowObjectsJS() capture its result, call
reader.dispose(), then return the captured result so the reader is explicitly
released.
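For context, the per-database serialization that `getDatabaseQueue(dbPath)` implies can be sketched as a promise chain keyed by database path. This is a hypothetical reconstruction (all names included), not the PR's actual implementation:

```javascript
// Sketch: serialize queries per database file by chaining onto a promise
// queue keyed by path. All names here are assumptions for illustration.
const databaseQueues = new Map();

function getDatabaseQueue(dbPath) {
  return databaseQueues.get(dbPath) ?? Promise.resolve();
}

function enqueue(dbPath, task) {
  // Chain after whatever is currently queued for this database. A failed
  // task is swallowed in the chain link so it does not poison later tasks,
  // while the caller still sees the rejection through the returned promise.
  const next = getDatabaseQueue(dbPath).catch(() => {}).then(task);
  databaseQueues.set(dbPath, next);
  return next;
}
```

A queue like this is also why the reader should be disposed inside the queued task, before the next query gets its turn.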
if (sql.startsWith('DROP TABLE')) {
  const sequence_prefix = sql
    .match(/^DROP TABLE IF EXISTS "([^ ]+)"/)[1]
    .replaceAll('.', '_')
    .replaceAll('"', '');

  // clean up all the table's sequences
  const seqResult = await executeQuery(
    this.duckdbConnection,
    'SELECT sequence_name FROM duckdb_sequences() WHERE starts_with(sequence_name, $1)',
    [sequence_prefix],
  );

  if (seqResult && seqResult.length > 0 && 'sequence_name' in seqResult[0]) {
    await executeQuery(
      this.duckdbConnection,
      `DROP SEQUENCE ${seqResult[0].sequence_name} CASCADE`,
    );
  }

  return this.runQueryInternal(sql, parameters, complete);
}
DROP TABLE special-case can throw + sequence cleanup only drops one sequence.
- sql.match(...)[1] will throw if the SQL doesn’t exactly match the regex (even though it starts with DROP TABLE).
- The comment says “clean up all the table’s sequences”, but it only drops seqResult[0].
- DROP SEQUENCE ${seqName} should be properly quoted/escaped as an identifier.
- const sequence_prefix = sql
- .match(/^DROP TABLE IF EXISTS "([^ ]+)"/)[1]
+ const match = sql.match(/^DROP TABLE IF EXISTS "([^ ]+)"/);
+ if (!match) {
+ return this.runQueryInternal(sql, parameters, complete);
+ }
+
+ const sequence_prefix = match[1]
.replaceAll('.', '_')
.replaceAll('"', '');
@@
- if (seqResult && seqResult.length > 0 && 'sequence_name' in seqResult[0]) {
- await executeQuery(
- this.duckdbConnection,
- `DROP SEQUENCE ${seqResult[0].sequence_name} CASCADE`,
- );
- }
+ for (const row of seqResult ?? []) {
+ if (!row || typeof row.sequence_name !== 'string') continue;
+ // Best: quote via dialect queryGenerator if accessible; otherwise minimally double-quote escape.
+ const seqName = `"${row.sequence_name.replaceAll('"', '""')}"`;
+ await executeQuery(this.duckdbConnection, `DROP SEQUENCE ${seqName} CASCADE`);
+  }

🤖 Prompt for AI Agents
In packages/duckdb/src/query.js around lines 91 to 112, the DROP TABLE
special-case assumes the regex match always succeeds, only drops the first
sequence returned, and injects sequence names without proper identifier-quoting;
fix by first defensively matching the DROP TABLE regex and returning or falling
through if match failed, compute sequence_prefix only when match exists, then
iterate over all rows in seqResult and drop each sequence rather than only
seqResult[0], and when composing the DROP SEQUENCE statement properly
quote/escape the sequence identifier (e.g. wrap in double quotes and escape any
internal double quotes) or use a parameterized identifier-safe API if available
to avoid SQL injection.
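The "minimally double-quote escape" step from the diff above can be isolated into a tiny helper. This is a sketch; the real dialect should prefer its queryGenerator's identifier quoting when accessible:

```javascript
// Quote a DuckDB-style identifier: wrap in double quotes and double any
// embedded double-quote characters, so the name cannot break out of the
// identifier position.
function quoteIdentifier(name) {
  return `"${name.replaceAll('"', '""')}"`;
}

console.log(quoteIdentifier('users_id_seq')); // "users_id_seq"
console.log(quoteIdentifier('odd"name'));     // "odd""name"
```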
formatError(err) {
  if (
    err.errorType === 'Constraint' &&
    (err.message.includes('Duplicate key') || err.message.includes('duplicate key'))
  ) {
    // retry 'properly bind parameters on extra retries' test has a hardcoded condition with "Validation"
    return new UniqueConstraintError({ message: `Validation error: ${err.message}`, cause: err });
  }

  return new DatabaseError(err);
}
formatError should be defensive about error shape. If errorType/message aren’t present (or error isn’t an object), this can throw and mask the original error.
formatError(err) {
+ if (!err || typeof err !== 'object') {
+ return new DatabaseError(err);
+ }
if (
err.errorType === 'Constraint' &&
- (err.message.includes('Duplicate key') || err.message.includes('duplicate key'))
+ typeof err.message === 'string' &&
+ (err.message.includes('Duplicate key') || err.message.includes('duplicate key'))
) {

🤖 Prompt for AI Agents
In packages/duckdb/src/query.js around lines 117 to 127, the formatError
function assumes err is an object with errorType and message and may throw if
those properties are missing or err is not an object; update the logic to first
validate err (e.g., if (!err || typeof err !== 'object') return new
DatabaseError(err)), then read errorType and message defensively (const
errorType = err.errorType ?? ''; const message = typeof err.message === 'string'
? err.message : String(err.message ?? '');) and only call message.includes(...)
when message is a string; keep the existing UniqueConstraintError path when
conditions match and otherwise return new DatabaseError(err).
async runQueryInternal(sql, parameters, loggingCompleteCallback) {
  let convertedParameters;
  if (parameters) {
    // Convert parameters to DuckDB-compatible types
    convertedParameters = parameters.map(p => {
      if (isBigInt(p)) {
        // BigInt binds as null in duckdb-node, convert to string
        return p.toString();
      }

      // Buffer/Uint8Array must be wrapped in blobValue for @duckdb/node-api
      if (Buffer.isBuffer(p) || p instanceof Uint8Array) {
        return blobValue(p);
      }

      // Numbers exceeding INT32_MAX are interpreted as signed INT32 by the DuckDB Node.js bindings,
      // causing overflow (e.g., 4294967295 becomes -1). Convert to BigInt to preserve value.
      // This is needed for UINTEGER columns that can hold values > 2147483647.
      if (typeof p === 'number' && Number.isInteger(p) && p > 2_147_483_647) {
        return BigInt(p);
      }

      return p;
    });
  }

  try {
    const data = await executeQuery(this.duckdbConnection, sql, convertedParameters);
    loggingCompleteCallback();

    if (this.isSelectQuery()) {
      return this.handleSelectQuery(this.postprocessData(data, this.model?.modelDefinition));
    }

    return this.processResults(data);
  } catch (error) {
    throw this.formatError(error);
  }
}
🧩 Analysis chain
🌐 Web query:
@duckdb/node-api BigInt parameter binding documentation
💡 Result:
Short answer: @duckdb/node-api accepts JavaScript BigInt values for BIGINT/HUGEINT parameter binding — pass a BigInt as the parameter value (or in prepared.bind({...}, { name: BIGINT }) to force the SQL type). The Node "Neo" docs state JS primitives include bigint and that types not natively represented in JS can be provided by specifying the DuckDB type when binding; the DuckDB C API exposes hugeint binding for 128-bit integers. [1][2]
Sources
- DuckDB Node (Neo) client — parameter/value and type handling (shows JS primitives include bigint, inference and explicit-type binding examples). [duckdb.org]. [1]
- DuckDB C API header — duckdb_bind_hugeint / duckdb_bind_int64 (shows HUGEINT/INT64 binding in C API used by the node bindings). [2]
🌐 Web query:
duckdb-node BigInt UINTEGER parameter binding
💡 Result:
Short answer: UINTEGER is an unsigned 32-bit (0..4294967295). In duckdb-node you should pass a JavaScript Number for UINTEGER (Numbers safely represent all uint32 values). If you currently have a BigInt, convert it to Number (Number(myBigInt)) before binding, or cast the parameter in SQL (e.g. INSERT ... VALUES (CAST($1 AS UINTEGER))). DuckDB’s Node “Neo” client also exposes explicit type-specific bind helpers for lossless binding of types without native JS equivalents (see docs) — but there’s no native JS-only unsigned-32 BigInt type, so Number or an explicit CAST is the practical solution. [1][2][3]
Sources:
- DuckDB numeric types (UINTEGER = UINT32). [1]
- DuckDB C API / binding primitives (integer binding functions). [2]
- duckdb-node / Neo client docs (explicit type-specific binding helpers). [3]
🏁 Script executed:
# Search for related tests or examples in the codebase
fd -e js -e ts | xargs grep -l "BigInt\|blobValue\|duckdb" | head -20
🏁 Script executed:
# Check package.json for @duckdb/node-api version and dependencies
cat packages/duckdb/package.json
🏁 Script executed:
# Search for tests related to query parameters and UINTEGER
rg -i "uinteger|uint32" --type js --type ts -B 2 -A 2
🏁 Script executed:
# Search for test files related to query.js
fd -e test.js -e spec.js -e test.ts -e spec.ts | xargs grep -l "query\|parameter" 2>/dev/null | head -10
🏁 Script executed:
# Look for any existing tests for runQueryInternal or parameter conversion
rg "runQueryInternal|convertParameters" --type js --type ts -B 2 -A 5
🏁 Script executed:
# Search for DuckDB-specific integration tests with large numbers
fd "duckdb" -t d | xargs -I {} find {} -name "*.test.js" -o -name "*.test.ts" 2>/dev/null | head -10
🏁 Script executed:
# Look for any comments or git history about parameter binding issues
rg "overflow|binding.*null|4294967295" --type js --type ts -B 3 -A 3
🏁 Script executed:
# Check if there are any issues or comments about BigInt conversion
rg "BigInt.*string|toString\(\)" packages/duckdb/src/query.js -B 5 -A 5
🏁 Script executed:
# Check if there are any version-specific comments or issues in the query.js file
cat -n packages/duckdb/src/query.js | head -200
🏁 Script executed:
# Look for executeQuery to understand how parameters are passed
rg "executeQuery|duckdbConnection" packages/duckdb/src/query.js -B 2 -A 8
Parameter conversion is internally inconsistent for BigInt / large integers. Lines 150–152 convert BigInt to string with the comment "BigInt binds as null in duckdb-node", but lines 163–164 convert large JS numbers to BigInt for UINTEGER overflow protection. If BigInt truly binds as null, then converting numbers to BigInt would reintroduce the same issue. According to @duckdb/node-api 1.4.2 documentation, BigInt is natively supported for BIGINT/HUGEINT, and UINTEGER (0–4,294,967,295) should use JS Number type, not BigInt. Pick one approach that aligns with the current API's actual binding behavior and apply it consistently.
As a secondary note: this file is written in .js but the coding guidelines require new implementations in packages/**/src/**/*.js to be written in TypeScript.
🤖 Prompt for AI Agents
In packages/duckdb/src/query.js around lines 145–183, the parameter conversion
is inconsistent: currently BigInt values are coerced to strings while large
integers are coerced to BigInt, which conflicts with @duckdb/node-api behavior.
Remove the unconditional BigInt->string coercion and instead allow BigInt to
pass through natively; stop coercing large JS Numbers to BigInt unconditionally
— leave Numbers as Number (UINTEGER expects Number), and only convert to BigInt
when you can deterministically detect the target column type is BIGINT/HUGEINT
(use this.model.modelDefinition to map parameter positions/names to column types
and convert there). Keep the Buffer/Uint8Array -> blobValue handling. Finally,
move/rename this implementation to TypeScript per project guidelines
(packages/**/src/**/*.ts) or rewrite the file as a .ts module.
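An internally consistent version of the conversion the review asks for might look like the sketch below. `blobValue` is stubbed here (in the dialect it comes from @duckdb/node-api), and the pass-through choices assume the library's documented binding behavior rather than the PR's current workarounds:

```javascript
// Sketch of internally consistent parameter conversion. `blobValue` is a
// stand-in stub for the @duckdb/node-api helper of the same name.
const blobValue = bytes => ({ kind: 'blob', bytes });

function convertParameters(parameters) {
  if (!parameters) {
    return undefined;
  }

  return parameters.map(p => {
    // BigInt binds natively to BIGINT/HUGEINT in current @duckdb/node-api,
    // so it passes through unchanged instead of being stringified.
    if (typeof p === 'bigint') {
      return p;
    }

    // Buffer/Uint8Array must still be wrapped for blob binding.
    // (Node's Buffer is a Uint8Array subclass, so this check covers both.)
    if (p instanceof Uint8Array) {
      return blobValue(p);
    }

    // UINTEGER (0..4294967295) expects a plain JS number; no BigInt coercion.
    return p;
  });
}
```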
try {
  const data = await executeQuery(this.duckdbConnection, sql, convertedParameters);
  loggingCompleteCallback();

  if (this.isSelectQuery()) {
    return this.handleSelectQuery(this.postprocessData(data, this.model?.modelDefinition));
  }

  return this.processResults(data);
} catch (error) {
  throw this.formatError(error);
}
Always call loggingCompleteCallback (use finally). Right now, if executeQuery throws, the “complete” callback never runs, which can skew logging/benchmarks and leave “in-flight” logs.
try {
const data = await executeQuery(this.duckdbConnection, sql, convertedParameters);
- loggingCompleteCallback();
-
if (this.isSelectQuery()) {
return this.handleSelectQuery(this.postprocessData(data, this.model?.modelDefinition));
}
return this.processResults(data);
} catch (error) {
throw this.formatError(error);
+ } finally {
+ loggingCompleteCallback();
}

🤖 Prompt for AI Agents
In packages/duckdb/src/query.js around lines 171 to 182, the
loggingCompleteCallback is only called inside the try block so it is skipped
when executeQuery throws; move the loggingCompleteCallback invocation into a
finally block that always runs, remove the in-try invocation, and keep the
existing control flow so the finally runs whether the function returns a handled
result (select or non-select) or throws the formatted error — i.e., wrap the
execute/handle logic in try, catch and use finally { loggingCompleteCallback();
} ensuring the catch still throws this.formatError(error).
if (this.isInsertQuery(data, {}) || this.isUpsertQuery()) {
  this.handleInsertQuery(data, {});
  const modelDefinition = this.model?.modelDefinition;

  if (!this.instance) {
    // return autogenerated fields, so a model can be constructed
    result = data;
  } else if (Array.isArray(data) && data.length > 0) {
    // update model with values returned from the database
    for (const column of Object.keys(data[0])) {
      const modelColumn = modelDefinition.columns.get(column);
      if (modelColumn) {
        const val = data[0][column]
          ? modelColumn.type.parseDatabaseValue(data[0][column])
          : data[0][column];
        this.instance.set(modelColumn.attributeName, val, {
          raw: true,
          comesFromDatabase: true,
        });
      }
    }
  }
Returned values parsing is wrong for falsy values + modelDefinition may be undefined.
- data[0][column] ? parseDatabaseValue(...) : data[0][column] skips parsing for 0, '', false.
- modelDefinition.columns can throw if this.model is unexpectedly unset.
const modelDefinition = this.model?.modelDefinition;
@@
- } else if (Array.isArray(data) && data.length > 0) {
+ } else if (modelDefinition && Array.isArray(data) && data.length > 0) {
// update model with values returned from the database
for (const column of Object.keys(data[0])) {
const modelColumn = modelDefinition.columns.get(column);
if (modelColumn) {
- const val = data[0][column]
- ? modelColumn.type.parseDatabaseValue(data[0][column])
- : data[0][column];
+ const rawVal = data[0][column];
+ const val = rawVal === null || rawVal === undefined
+ ? rawVal
+ : modelColumn.type.parseDatabaseValue(rawVal);
this.instance.set(modelColumn.attributeName, val, {
raw: true,
comesFromDatabase: true,
});
}
}
}

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
if (this.isInsertQuery(data, {}) || this.isUpsertQuery()) {
  this.handleInsertQuery(data, {});
  const modelDefinition = this.model?.modelDefinition;

  if (!this.instance) {
    // return autogenerated fields, so a model can be constructed
    result = data;
  } else if (modelDefinition && Array.isArray(data) && data.length > 0) {
    // update model with values returned from the database
    for (const column of Object.keys(data[0])) {
      const modelColumn = modelDefinition.columns.get(column);
      if (modelColumn) {
        const rawVal = data[0][column];
        const val = rawVal === null || rawVal === undefined
          ? rawVal
          : modelColumn.type.parseDatabaseValue(rawVal);
        this.instance.set(modelColumn.attributeName, val, {
          raw: true,
          comesFromDatabase: true,
        });
      }
    }
  }
🤖 Prompt for AI Agents
In packages/duckdb/src/query.js around lines 201 to 223, the returned-value
parsing and model access are incorrect: change the falsy-value check so parsing
runs for values like 0, '', and false by testing for null/undefined (e.g., value
!= null) instead of a truthy check, and guard against a missing modelDefinition
by verifying this.model and this.model.modelDefinition exist before accessing
.columns; if modelDefinition is absent skip column mapping or handle gracefully.
Update the logic to extract the value once, use the null/undefined check to
decide whether to call parseDatabaseValue, and add a conditional around
modelDefinition.columns access to avoid runtime errors.
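The truthiness pitfall called out above is easy to demonstrate in isolation; `parse` here is a stand-in for `type.parseDatabaseValue`:

```javascript
// Why `raw ? parse(raw) : raw` is wrong: 0, '' and false are valid column
// values but falsy, so a truthy guard silently skips parsing for them.
const parse = raw => `parsed(${String(raw)})`;

const truthyGuard = raw => (raw ? parse(raw) : raw);
const nullishGuard = raw => (raw === null || raw === undefined ? raw : parse(raw));

console.log(truthyGuard(0));     // 0 — parsing silently skipped
console.log(nullishGuard(0));    // parsed(0)
console.log(nullishGuard(null)); // null — still passed through untouched
```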
Pull Request Checklist
Description of Changes
This is a full implementation of the DuckDB dialect, also compatible with MotherDuck. DuckDB is an in-process OLAP database. MotherDuck is a cloud data warehouse based on and compatible with DuckDB.
This dialect is based on DuckDB Node Neo - a modern Node library for DuckDB. I've previously sent a PR based on duckdb-async, which is the async wrapper around https://github.com/duckdb/duckdb-node. Both packages are getting deprecated at the end of the year in favor of DuckDB Node Neo.

All unit and integration tests pass.
Limitations:
Changes in this PR outside of the packages/duckdb dialect folder:
- core/src changes are limited to places where there already was an exclusion for another dialect.
- core/test changes are limited to constraints.foreignKey, constraints.unique, isolationLevels.
- Made a findAll order explicit since the test relies on the ordering (integration/associations/multiple-level-filters.test.js).

Fixes #18079
List of Breaking Changes
n/a
Summary by CodeRabbit
New Features
Tests
Chores
Documentation
✏️ Tip: You can customize this high-level summary in your review settings.