feat: support time travel queries, upsert, enums, spatial types in cockroachdb (#9128)

* feature: adds support for enum type (fixes #9068)

* temporarily ran package to test on different repo

* playing around - bumped version

* Revert "playing around - bumped version"

This reverts commit 7df4adb3e698419c174c2daee88614f8dafdbb6c.

* Revert "temporarily ran package to test on different repo"

This reverts commit 48f394e8eb32c22fe757010b446c85740bf80b5f.

* feat: add support for geometry data type

* feature: properly hydrate enum array values

* feature: adds support for geography and geometry for cockroachdb

* bugfix: fixes issue with primary generated columns being invalid column type (fixes #8532)

* Revert "bugfix: fixes issue with primary generated columns being invalid column type (fixes #8532)"

This reverts commit e00cdb090638d34668e3e10acd5f8267fe3bd028.

* bugfix: type casts to string when using ANY

* feature: cast geometry/geography to geojson

* feature: added references to srid

* bugfix: prevent error if trying to close already closed connection

* feature: added cockroachdb as part of the postgres family

* feature: ensures support for spatial columns for cockroachdb

* feature: adds support for UPSERT for CockroachDB (fixes #9199)

* minor: added TODO; unsure how to achieve this

* feature: adds support for time travelling queries for cockroachdb

* bugfix: only run time travel query on SELECT statements

* refactor: changed UpsertType from 'upsert' to 'primary-key' since this is more logical

* feature: added possibility to set timeTravelQuery to false instead of the parameter function; helps with disabling time travel queries during tests

* feature: allow timeTravelQueries in find* queries

* bugfix: when using time travel queries with joinAttributes, it now prevents the 'AS OF SYSTEM TIME must be in top level' error

* lint

* minor fix

* fixed failing test

* implemented ENUM type;
added tests;

* fixed failing tests

* fixed failing test

* fixed spatial types synchronization;
implemented spatial indices;
added tests for spatial columns;

* refactored Time Travel Query functionality;
removed TTQ from find options;
added tests for TTQ;

* added docs for Time Travel Queries

* minor changes

* added GeoJSON types;
other minor fixes;

* updated docs

* updated docs

Co-authored-by: Dmitry Zotov <dmzt08@gmail.com>
Mattias Fjellvang 2023-01-03 15:25:22 +01:00 committed by GitHub
parent 3e1caf0ff3
commit defb409f56
53 changed files with 2557 additions and 299 deletions


@ -206,8 +206,8 @@ export class User {
- `hstoreType: "object"|"string"` - Return type of `HSTORE` column. Returns value as string or as object. Used only in [Postgres](https://www.postgresql.org/docs/9.6/static/hstore.html).
- `array: boolean` - Used for postgres and cockroachdb column types which can be array (for example int[]).
- `transformer: ValueTransformer|ValueTransformer[]` - Specifies a value transformer (or array of value transformers) that is to be used to (un)marshal this column when reading or writing to the database. In case of an array, the value transformers will be applied in the natural order from entityValue to databaseValue, and in reverse order from databaseValue to entityValue.
- `spatialFeatureType: string` - Optional feature type (`Point`, `Polygon`, `LineString`, `Geometry`) used as a constraint on a spatial column. If not specified, it will behave as though `Geometry` was provided. Used only in PostgreSQL.
- `srid: number` - Optional [Spatial Reference ID](https://postgis.net/docs/using_postgis_dbmanagement.html#spatial_ref_sys) used as a constraint on a spatial column. If not specified, it will default to `0`. Standard geographic coordinates (latitude/longitude in the WGS84 datum) correspond to [EPSG 4326](http://spatialreference.org/ref/epsg/wgs-84/). Used only in PostgreSQL.
- `spatialFeatureType: string` - Optional feature type (`Point`, `Polygon`, `LineString`, `Geometry`) used as a constraint on a spatial column. If not specified, it will behave as though `Geometry` was provided. Used only in PostgreSQL and CockroachDB.
- `srid: number` - Optional [Spatial Reference ID](https://postgis.net/docs/using_postgis_dbmanagement.html#spatial_ref_sys) used as a constraint on a spatial column. If not specified, it will default to `0`. Standard geographic coordinates (latitude/longitude in the WGS84 datum) correspond to [EPSG 4326](http://spatialreference.org/ref/epsg/wgs-84/). Used only in PostgreSQL and CockroachDB.
Learn more about [entity columns](entities.md#entity-columns).


@ -198,7 +198,7 @@ There are several special column types with additional functionality available:
### Spatial columns
MS SQL, MySQL / MariaDB, and PostgreSQL all support spatial columns. TypeORM's
MS SQL, MySQL, MariaDB, PostgreSQL and CockroachDB all support spatial columns. TypeORM's
support for each varies slightly between databases, particularly as the column
names vary between databases.
@ -207,10 +207,85 @@ be provided as [well-known text
(WKT)](https://en.wikipedia.org/wiki/Well-known_text), so geometry columns
should be tagged with the `string` type.
TypeORM's PostgreSQL support uses [GeoJSON](http://geojson.org/) as an
```typescript
import { Entity, PrimaryColumn, Column } from "typeorm"
@Entity()
export class Thing {
@PrimaryColumn()
id: number
@Column("point")
point: string
@Column("linestring")
linestring: string
}
...
const thing = new Thing()
thing.point = "POINT(1 1)"
thing.linestring = "LINESTRING(0 0,1 1,2 2)"
```
TypeORM's PostgreSQL and CockroachDB support uses [GeoJSON](http://geojson.org/) as an
interchange format, so geometry columns should be tagged either as `object` or
`Geometry` (or subclasses, e.g. `Point`) after importing [`geojson`
types](https://www.npmjs.com/package/@types/geojson).
types](https://www.npmjs.com/package/@types/geojson) or using TypeORM's built-in [GeoJSON types](../src/driver/types/GeoJsonTypes.ts).
```typescript
import {
Entity,
PrimaryColumn,
Column,
Point,
LineString,
MultiPoint
} from "typeorm"
@Entity()
export class Thing {
@PrimaryColumn()
id: number
@Column("geometry")
point: Point
@Column("geometry")
linestring: LineString
@Column("geometry", {
spatialFeatureType: "MultiPoint",
srid: 4326,
})
multiPointWithSRID: MultiPoint
}
...
const thing = new Thing()
thing.point = {
type: "Point",
coordinates: [116.443987, 39.920843],
}
thing.linestring = {
type: "LineString",
coordinates: [
[-87.623177, 41.881832],
[-90.199402, 38.627003],
[-82.446732, 38.413651],
[-87.623177, 41.881832],
],
}
thing.multiPointWithSRID = {
type: "MultiPoint",
coordinates: [
[100.0, 0.0],
[101.0, 1.0],
],
}
```
TypeORM tries to do the right thing, but it's not always possible to determine
when a value being inserted or the result of a PostGIS function should be
@ -219,7 +294,9 @@ to the following, where values are converted into PostGIS `geometry`s from
GeoJSON and into GeoJSON as `json`:
```typescript
const origin = {
import { Point } from "typeorm"
const origin: Point = {
type: "Point",
coordinates: [0, 0],
}


@ -103,7 +103,7 @@ export class User {
## Spatial Indices
MySQL and PostgreSQL (when PostGIS is available) both support spatial indices.
MySQL, CockroachDB and PostgreSQL (when PostGIS is available) all support spatial indices.
To create a spatial index on a column in MySQL, add an `Index` with `spatial: true` on a column that uses a spatial type (`geometry`, `point`, `linestring`,
`polygon`, `multipoint`, `multilinestring`, `multipolygon`,
@ -118,7 +118,7 @@ export class Thing {
}
```
To create a spatial index on a column in PostgreSQL, add an `Index` with `spatial: true` on a column that uses a spatial type (`geometry`, `geography`):
To create a spatial index on a column add an `Index` with `spatial: true` on a column that uses a spatial type (`geometry`, `geography`):
```typescript
export interface Geometry {


@ -178,6 +178,7 @@ await repository.upsert(
{
conflictPaths: ["externalId"],
skipUpdateIfNoValuesChanged: true, // supported by postgres, skips update if it would not change row values
upsertType: "upsert", // "on-conflict-do-update" | "on-duplicate-key-update" | "upsert" - optionally provide an UpsertType - 'upsert' is currently only supported by CockroachDB
},
)
/** executes
@ -211,7 +212,7 @@ await repository.upsert(
* ON CONFLICT (externalId) WHERE ( dateAdded > 2021-01-01 ) DO UPDATE
* SET firstName = EXCLUDED.firstName,
* SET dateAdded = EXCLUDED.dateAdded,
* WHERE user.firstName IS DISTINCT FROM EXCLUDED.firstName OR user.dateAdded IS DISTINCT FROM EXCLUDED.dateAdded
* WHERE user.firstName IS DISTINCT FROM EXCLUDED.firstName OR user.dateAdded IS DISTINCT FROM EXCLUDED.dateAdded
**/
```
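For orientation, the three `UpsertType` values listed in the snippet above roughly correspond to different SQL shapes. A hedged sketch (the SQL strings are illustrative placeholders, not exact TypeORM output):

```typescript
// Illustrative mapping from UpsertType to the SQL syntax each implies.
// The SQL strings are sketches, not literal TypeORM-generated queries.
type UpsertType = "on-conflict-do-update" | "on-duplicate-key-update" | "upsert"

const upsertSyntax: Record<UpsertType, string> = {
    "on-conflict-do-update": "INSERT ... ON CONFLICT (col) DO UPDATE SET ...", // Postgres family
    "on-duplicate-key-update": "INSERT ... ON DUPLICATE KEY UPDATE ...", // MySQL / MariaDB
    upsert: "UPSERT INTO table (cols) VALUES (...)", // CockroachDB native UPSERT
}
```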


@ -1238,3 +1238,38 @@ const users = await connection.getRepository(User)
.where(`user.id IN (SELECT "id" FROM 'insert_results')`)
.getMany();
```
## Time Travel Queries
[Time Travel Queries](https://www.cockroachlabs.com/blog/time-travel-queries-select-witty_subtitle-the_future/)
are currently supported only by the `CockroachDB` database.
```typescript
const repository = connection.getRepository(Account)
// create a new account
const account = new Account()
account.name = "John Smith"
account.balance = 100
await repository.save(account)
// imagine we update the account balance 1 hour after creation
account.balance = 200
await repository.save(account)
// outputs { name: "John Smith", balance: "200" }
console.log(account)
// load the account state from 1 hour ago
const accountOneHourAgo = await repository
    .createQueryBuilder("account")
    .timeTravelQuery(`'-1h'`)
    .getOneOrFail()
// outputs { name: "John Smith", balance: "100" }
console.log(accountOneHourAgo)
```
By default, `timeTravelQuery()` uses the `follower_read_timestamp()` function if no argument is passed.
For other supported timestamp arguments and additional information, please refer to the
[CockroachDB](https://www.cockroachlabs.com/docs/stable/as-of-system-time.html) docs.
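Time travel queries can also be disabled globally through the connection options, which, per the commit notes, is useful during tests. A minimal sketch (the option shape follows the `CockroachConnectionOptions` diff in this commit; host, port, and credential values are placeholders):

```typescript
// Sketch of CockroachDB connection options with time travel queries disabled.
// Every value other than `type` and `timeTravelQueries` is a placeholder.
const options = {
    type: "cockroachdb" as const,
    host: "localhost",
    port: 26257,
    username: "root",
    database: "defaultdb",
    // skips `SET default_transaction_use_follower_reads = 'on'` after connect,
    // keeping reads current (e.g. for deterministic test runs)
    timeTravelQueries: false,
}
```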


@ -1,3 +1,5 @@
import { Geometry } from "../../driver/types/GeoJsonTypes"
/**
* Options for spatial columns.
*/
@ -6,7 +8,7 @@ export interface SpatialColumnOptions {
* Column type's feature type.
* Geometry, Point, Polygon, etc.
*/
spatialFeatureType?: string
spatialFeatureType?: Geometry["type"]
/**
* Column type's SRID.


@ -66,7 +66,7 @@ export interface Driver {
/**
* Returns type of upsert supported by driver if any
*/
supportedUpsertType?: UpsertType
supportedUpsertTypes: UpsertType[]
/**
* Default values of length, precision and scale depends on column data type.


@ -41,7 +41,9 @@ export class DriverUtils {
}
static isPostgresFamily(driver: Driver): boolean {
return ["postgres", "aurora-postgres"].includes(driver.options.type)
return ["postgres", "aurora-postgres", "cockroachdb"].includes(
driver.options.type,
)
}
/**


@ -23,6 +23,7 @@ import { Table } from "../../schema-builder/table/Table"
import { View } from "../../schema-builder/view/View"
import { TableForeignKey } from "../../schema-builder/table/TableForeignKey"
import { InstanceChecker } from "../../util/InstanceChecker"
import { UpsertType } from "../types/UpsertType"
/**
* Organizes communication with MySQL DBMS.
@ -152,7 +153,7 @@ export class AuroraMysqlDriver implements Driver {
/**
* Returns type of upsert supported by driver if any
*/
readonly supportedUpsertType = "on-duplicate-key-update"
supportedUpsertTypes: UpsertType[] = ["on-duplicate-key-update"]
/**
* Gets list of spatial column data types.


@ -12,6 +12,12 @@ export interface CockroachConnectionOptions
*/
readonly type: "cockroachdb"
/**
* Enable time travel queries on cockroachdb.
* https://www.cockroachlabs.com/docs/stable/as-of-system-time.html
*/
readonly timeTravelQueries: boolean
/**
* Schema name.
*/


@ -1,32 +1,33 @@
import { Driver } from "../Driver"
import { ConnectionIsNotSetError } from "../../error/ConnectionIsNotSetError"
import { ObjectLiteral } from "../../common/ObjectLiteral"
import { DataSource } from "../../data-source/DataSource"
import { TypeORMError } from "../../error"
import { ConnectionIsNotSetError } from "../../error/ConnectionIsNotSetError"
import { DriverPackageNotInstalledError } from "../../error/DriverPackageNotInstalledError"
import { DriverUtils } from "../DriverUtils"
import { ColumnMetadata } from "../../metadata/ColumnMetadata"
import { EntityMetadata } from "../../metadata/EntityMetadata"
import { PlatformTools } from "../../platform/PlatformTools"
import { QueryRunner } from "../../query-runner/QueryRunner"
import { RdbmsSchemaBuilder } from "../../schema-builder/RdbmsSchemaBuilder"
import { Table } from "../../schema-builder/table/Table"
import { TableColumn } from "../../schema-builder/table/TableColumn"
import { TableForeignKey } from "../../schema-builder/table/TableForeignKey"
import { View } from "../../schema-builder/view/View"
import { ApplyValueTransformers } from "../../util/ApplyValueTransformers"
import { DateUtils } from "../../util/DateUtils"
import { InstanceChecker } from "../../util/InstanceChecker"
import { ObjectUtils } from "../../util/ObjectUtils"
import { OrmUtils } from "../../util/OrmUtils"
import { Driver } from "../Driver"
import { DriverUtils } from "../DriverUtils"
import { ColumnType } from "../types/ColumnTypes"
import { CteCapabilities } from "../types/CteCapabilities"
import { DataTypeDefaults } from "../types/DataTypeDefaults"
import { MappedColumnTypes } from "../types/MappedColumnTypes"
import { ReplicationMode } from "../types/ReplicationMode"
import { UpsertType } from "../types/UpsertType"
import { CockroachConnectionCredentialsOptions } from "./CockroachConnectionCredentialsOptions"
import { CockroachConnectionOptions } from "./CockroachConnectionOptions"
import { DateUtils } from "../../util/DateUtils"
import { PlatformTools } from "../../platform/PlatformTools"
import { DataSource } from "../../data-source/DataSource"
import { RdbmsSchemaBuilder } from "../../schema-builder/RdbmsSchemaBuilder"
import { MappedColumnTypes } from "../types/MappedColumnTypes"
import { ColumnType } from "../types/ColumnTypes"
import { QueryRunner } from "../../query-runner/QueryRunner"
import { DataTypeDefaults } from "../types/DataTypeDefaults"
import { TableColumn } from "../../schema-builder/table/TableColumn"
import { EntityMetadata } from "../../metadata/EntityMetadata"
import { OrmUtils } from "../../util/OrmUtils"
import { CockroachQueryRunner } from "./CockroachQueryRunner"
import { ApplyValueTransformers } from "../../util/ApplyValueTransformers"
import { ReplicationMode } from "../types/ReplicationMode"
import { TypeORMError } from "../../error"
import { Table } from "../../schema-builder/table/Table"
import { View } from "../../schema-builder/view/View"
import { TableForeignKey } from "../../schema-builder/table/TableForeignKey"
import { ObjectUtils } from "../../util/ObjectUtils"
import { InstanceChecker } from "../../util/InstanceChecker"
/**
* Organizes communication with Cockroach DBMS.
@ -119,6 +120,9 @@ export class CockroachDriver implements Driver {
"bytea",
"blob",
"date",
"enum",
"geometry",
"geography",
"numeric",
"decimal",
"dec",
@ -158,12 +162,15 @@ export class CockroachDriver implements Driver {
/**
* Returns type of upsert supported by driver if any
*/
readonly supportedUpsertType = "on-conflict-do-update"
supportedUpsertTypes: UpsertType[] = [
"on-conflict-do-update",
"primary-key",
]
/**
* Gets list of spatial column data types.
*/
spatialTypes: ColumnType[] = []
spatialTypes: ColumnType[] = ["geometry", "geography"]
/**
* Gets list of column data types that support length by a driver.
@ -315,6 +322,18 @@ export class CockroachDriver implements Driver {
* Makes any action after connection (e.g. create extensions in Postgres driver).
*/
async afterConnect(): Promise<void> {
// enable time travel queries
if (this.options.timeTravelQueries) {
await this.connection.query(
`SET default_transaction_use_follower_reads = 'on';`,
)
}
// enable experimental alter column type support (we need it to alter enum types)
await this.connection.query(
"SET enable_experimental_alter_column_type_general = true",
)
return Promise.resolve()
}
@ -427,6 +446,43 @@ export class CockroachDriver implements Driver {
value = DateUtils.stringToSimpleArray(value)
} else if (columnMetadata.type === "simple-json") {
value = DateUtils.stringToSimpleJson(value)
} else if (
columnMetadata.type === "enum" ||
columnMetadata.type === "simple-enum"
) {
if (columnMetadata.isArray) {
if (value === "{}") return []
if (Array.isArray(value)) return value
// manually convert enum array to array of values (pg does not support, see https://github.com/brianc/node-pg-types/issues/56)
value = (value as string)
.substr(1, (value as string).length - 2)
.split(",")
.map((val) => {
// replace double quotes from the beginning and from the end
if (val.startsWith(`"`) && val.endsWith(`"`))
val = val.slice(1, -1)
// replace double escaped backslash to single escaped e.g. \\\\ -> \\
val = val.replace(/(\\\\)/g, "\\")
// replace escaped double quotes to non-escaped e.g. \"asd\" -> "asd"
return val.replace(/(\\")/g, '"')
})
// convert to number if that exists in possible enum options
value = value.map((val: string) => {
return !isNaN(+val) &&
columnMetadata.enum!.indexOf(parseInt(val)) >= 0
? parseInt(val)
: val
})
} else {
// convert to number if that exists in possible enum options
value =
!isNaN(+value) &&
columnMetadata.enum!.indexOf(parseInt(value)) >= 0
? parseInt(value)
: value
}
}
if (columnMetadata.transformer)
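The enum-array hydration in the hunk above can be sketched as a standalone function. This is a simplified rendering of the diff's logic under assumed names (`parseEnumArray`, `enumOptions`), not the driver's actual API:

```typescript
// Simplified sketch of CockroachDB enum-array hydration: pg returns enum
// arrays as a raw string like `{a,"b c"}`, so we split and unescape manually.
function parseEnumArray(
    value: string,
    enumOptions: (string | number)[],
): (string | number)[] {
    if (value === "{}") return []
    return value
        .substring(1, value.length - 1) // strip the surrounding braces
        .split(",")
        .map((val) => {
            // remove surrounding double quotes, if present
            if (val.startsWith(`"`) && val.endsWith(`"`)) val = val.slice(1, -1)
            val = val.replace(/(\\\\)/g, "\\") // unescape backslashes
            val = val.replace(/(\\")/g, `"`) // unescape double quotes
            // convert to a number only when that number is a declared enum option
            return !isNaN(+val) && enumOptions.indexOf(parseInt(val)) >= 0
                ? parseInt(val)
                : val
        })
}
```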
@ -618,6 +674,8 @@ export class CockroachDriver implements Driver {
return "float4"
} else if (column.type === "character") {
return "char"
} else if (column.type === "simple-enum") {
return "enum"
} else if (column.type === "json") {
return "jsonb"
} else {
@ -630,11 +688,33 @@ export class CockroachDriver implements Driver {
*/
normalizeDefault(columnMetadata: ColumnMetadata): string | undefined {
const defaultValue = columnMetadata.default
const arrayCast = columnMetadata.isArray
? `::${columnMetadata.type}[]`
: ""
if (typeof defaultValue === "number") {
if (
(columnMetadata.type === "enum" ||
columnMetadata.type === "simple-enum") &&
defaultValue !== undefined
) {
if (defaultValue === null) return "NULL"
if (columnMetadata.isArray) {
const enumName = this.buildEnumName(columnMetadata)
let arrayValue = defaultValue
if (typeof defaultValue === "string") {
if (defaultValue === "{}") return `ARRAY[]::${enumName}[]`
arrayValue = defaultValue
.replace("{", "")
.replace("}", "")
.split(",")
}
if (Array.isArray(arrayValue)) {
const expr = `ARRAY[${arrayValue
.map((it) => `'${it}'`)
.join(",")}]`
return `${expr}::${enumName}[]`
}
} else {
return `'${defaultValue}'`
}
} else if (typeof defaultValue === "number") {
return `(${defaultValue})`
}
@ -653,6 +733,9 @@ export class CockroachDriver implements Driver {
}
if (typeof defaultValue === "string") {
const arrayCast = columnMetadata.isArray
? `::${columnMetadata.type}[]`
: ""
return `'${defaultValue}'${arrayCast}`
}
@ -703,6 +786,14 @@ export class CockroachDriver implements Driver {
column.precision !== undefined
) {
type += "(" + column.precision + ")"
} else if (this.spatialTypes.indexOf(column.type as ColumnType) >= 0) {
if (column.spatialFeatureType != null && column.srid != null) {
type = `${column.type}(${column.spatialFeatureType},${column.srid})`
} else if (column.spatialFeatureType != null) {
type = `${column.type}(${column.spatialFeatureType})`
} else {
type = column.type
}
}
if (column.isArray) type += " array"
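The spatial branch added above composes the column type string from the optional `spatialFeatureType` and `srid` constraints. A standalone sketch of that composition (function name assumed for illustration):

```typescript
// Sketch of how a CockroachDB spatial column type string is composed from
// the optional feature-type and SRID constraints, mirroring the diff above.
function createSpatialType(
    type: string,
    spatialFeatureType?: string,
    srid?: number,
): string {
    if (spatialFeatureType != null && srid != null)
        return `${type}(${spatialFeatureType},${srid})`
    if (spatialFeatureType != null) return `${type}(${spatialFeatureType})`
    return type
}
```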
@ -802,6 +893,7 @@ export class CockroachDriver implements Driver {
tableColumn.name !== columnMetadata.databaseName ||
tableColumn.type !== this.normalizeType(columnMetadata) ||
tableColumn.length !== columnMetadata.length ||
tableColumn.isArray !== columnMetadata.isArray ||
tableColumn.precision !== columnMetadata.precision ||
(columnMetadata.scale !== undefined &&
tableColumn.scale !== columnMetadata.scale) ||
@ -815,10 +907,20 @@ export class CockroachDriver implements Driver {
tableColumn.isNullable !== columnMetadata.isNullable ||
tableColumn.isUnique !==
this.normalizeIsUnique(columnMetadata) ||
tableColumn.enumName !== columnMetadata.enumName ||
(tableColumn.enum &&
columnMetadata.enum &&
!OrmUtils.isArraysEqual(
tableColumn.enum,
columnMetadata.enum.map((val) => val + ""),
)) || // enums in postgres are always strings
tableColumn.isGenerated !== columnMetadata.isGenerated ||
tableColumn.generatedType !== columnMetadata.generatedType ||
(tableColumn.asExpression || "").trim() !==
(columnMetadata.asExpression || "").trim()
(columnMetadata.asExpression || "").trim() ||
(tableColumn.spatialFeatureType || "").toLowerCase() !==
(columnMetadata.spatialFeatureType || "").toLowerCase() ||
tableColumn.srid !== columnMetadata.srid
)
})
}
@ -980,4 +1082,21 @@ export class CockroachDriver implements Driver {
return comment
}
/**
* Builds ENUM type name from given table and column.
*/
protected buildEnumName(column: ColumnMetadata): string {
const { schema, tableName } = this.parseTableName(column.entityMetadata)
let enumName = column.enumName
? column.enumName
: `${tableName}_${column.databaseName.toLowerCase()}_enum`
if (schema) enumName = `${schema}.${enumName}`
return enumName
.split(".")
.map((i) => {
return `"${i}"`
})
.join(".")
}
}
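The `buildEnumName` helper above derives and quotes the generated type name. A self-contained sketch of the same naming scheme (flattened into a free function with assumed parameters, since the original reads table/schema from `ColumnMetadata`):

```typescript
// Sketch of CockroachDB enum type naming: user-provided enumName wins,
// otherwise `<table>_<column>_enum`; each dotted part is double-quoted.
function buildEnumName(
    tableName: string,
    columnName: string,
    enumName?: string,
    schema?: string,
): string {
    let name = enumName ?? `${tableName}_${columnName.toLowerCase()}_enum`
    if (schema) name = `${schema}.${name}`
    return name
        .split(".")
        .map((part) => `"${part}"`)
        .join(".")
}
```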


@ -25,6 +25,7 @@ import { ReplicationMode } from "../types/ReplicationMode"
import { TypeORMError } from "../../error"
import { MetadataTableType } from "../types/MetadataTableType"
import { InstanceChecker } from "../../util/InstanceChecker"
import { VersionUtils } from "../../util/VersionUtils.js"
/**
* Runs queries on a single postgres database connection.
@ -122,6 +123,10 @@ export class CockroachQueryRunner
* You cannot use query runner methods once its released.
*/
release(): Promise<void> {
if (this.isReleased) {
return Promise.resolve()
}
this.isReleased = true
if (this.releaseCallback) this.releaseCallback()
@ -496,6 +501,24 @@ export class CockroachQueryRunner
const upQueries: Query[] = []
const downQueries: Query[] = []
// if the table has a column with an ENUM type, we must create this type in the database first.
const enumColumns = table.columns.filter(
(column) => column.type === "enum" || column.type === "simple-enum",
)
const createdEnumTypes: string[] = []
for (const column of enumColumns) {
// TODO: Should also check if values of existing type matches expected ones
const hasEnum = await this.hasEnumType(table, column)
const enumName = this.buildEnumName(table, column)
// if an enum with the same "enumName" is defined more than once, we must prevent double creation
if (!hasEnum && createdEnumTypes.indexOf(enumName) === -1) {
createdEnumTypes.push(enumName)
upQueries.push(this.createEnumTypeSql(table, column, enumName))
downQueries.push(this.dropEnumTypeSql(table, column, enumName))
}
}
table.columns
.filter(
(column) =>
@ -900,6 +923,39 @@ export class CockroachQueryRunner
foreignKey.name = newForeignKeyName
})
// rename ENUM types
const enumColumns = newTable.columns.filter(
(column) => column.type === "enum" || column.type === "simple-enum",
)
for (let column of enumColumns) {
// skip renaming for user-defined enum name
if (column.enumName) continue
const oldEnumType = await this.getUserDefinedTypeName(
oldTable,
column,
)
upQueries.push(
new Query(
`ALTER TYPE "${oldEnumType.schema}"."${
oldEnumType.name
}" RENAME TO ${this.buildEnumName(
newTable,
column,
false,
)}`,
),
)
downQueries.push(
new Query(
`ALTER TYPE ${this.buildEnumName(
newTable,
column,
)} RENAME TO "${oldEnumType.name}"`,
),
)
}
await this.executeQueries(upQueries, downQueries)
}
@ -923,6 +979,14 @@ export class CockroachQueryRunner
)
}
if (column.type === "enum" || column.type === "simple-enum") {
const hasEnum = await this.hasEnumType(table, column)
if (!hasEnum) {
upQueries.push(this.createEnumTypeSql(table, column))
downQueries.push(this.dropEnumTypeSql(table, column))
}
}
upQueries.push(
new Query(
`ALTER TABLE ${this.escapePath(
@ -1143,6 +1207,7 @@ export class CockroachQueryRunner
let clonedTable = table.clone()
const upQueries: Query[] = []
const downQueries: Query[] = []
let defaultValueChanged = false
const oldColumn = InstanceChecker.isTableColumn(oldTableColumnOrName)
? oldTableColumnOrName
@ -1157,6 +1222,7 @@ export class CockroachQueryRunner
if (
oldColumn.type !== newColumn.type ||
oldColumn.length !== newColumn.length ||
newColumn.isArray !== oldColumn.isArray ||
oldColumn.generatedType !== newColumn.generatedType ||
oldColumn.asExpression !== newColumn.asExpression
) {
@ -1184,6 +1250,36 @@ export class CockroachQueryRunner
),
)
// rename ENUM type
if (
oldColumn.type === "enum" ||
oldColumn.type === "simple-enum"
) {
const oldEnumType = await this.getUserDefinedTypeName(
table,
oldColumn,
)
upQueries.push(
new Query(
`ALTER TYPE "${oldEnumType.schema}"."${
oldEnumType.name
}" RENAME TO ${this.buildEnumName(
table,
newColumn,
false,
)}`,
),
)
downQueries.push(
new Query(
`ALTER TYPE ${this.buildEnumName(
table,
newColumn,
)} RENAME TO "${oldEnumType.name}"`,
),
)
}
// rename column primary key constraint
if (
oldColumn.isPrimary === true &&
@ -1603,6 +1699,158 @@ export class CockroachQueryRunner
}
}
if (
(newColumn.type === "enum" ||
newColumn.type === "simple-enum") &&
(oldColumn.type === "enum" ||
oldColumn.type === "simple-enum") &&
(!OrmUtils.isArraysEqual(newColumn.enum!, oldColumn.enum!) ||
newColumn.enumName !== oldColumn.enumName)
) {
const arraySuffix = newColumn.isArray ? "[]" : ""
// "public"."new_enum"
const newEnumName = this.buildEnumName(table, newColumn)
// "public"."old_enum"
const oldEnumName = this.buildEnumName(table, oldColumn)
// "old_enum"
const oldEnumNameWithoutSchema = this.buildEnumName(
table,
oldColumn,
false,
)
//"public"."old_enum_old"
const oldEnumNameWithSchema_old = this.buildEnumName(
table,
oldColumn,
true,
false,
true,
)
//"old_enum_old"
const oldEnumNameWithoutSchema_old = this.buildEnumName(
table,
oldColumn,
false,
false,
true,
)
// rename old ENUM
upQueries.push(
new Query(
`ALTER TYPE ${oldEnumName} RENAME TO ${oldEnumNameWithoutSchema_old}`,
),
)
downQueries.push(
new Query(
`ALTER TYPE ${oldEnumNameWithSchema_old} RENAME TO ${oldEnumNameWithoutSchema}`,
),
)
// create new ENUM
upQueries.push(
this.createEnumTypeSql(table, newColumn, newEnumName),
)
downQueries.push(
this.dropEnumTypeSql(table, newColumn, newEnumName),
)
// if the column has a default value, we must drop it to avoid issues with type casting
if (
oldColumn.default !== null &&
oldColumn.default !== undefined
) {
// mark default as changed to prevent double update
defaultValueChanged = true
upQueries.push(
new Query(
`ALTER TABLE ${this.escapePath(
table,
)} ALTER COLUMN "${oldColumn.name}" DROP DEFAULT`,
),
)
downQueries.push(
new Query(
`ALTER TABLE ${this.escapePath(
table,
)} ALTER COLUMN "${oldColumn.name}" SET DEFAULT ${
oldColumn.default
}`,
),
)
}
// build column types
const upType = `${newEnumName}${arraySuffix} USING "${newColumn.name}"::"text"::${newEnumName}${arraySuffix}`
const downType = `${oldEnumNameWithSchema_old}${arraySuffix} USING "${newColumn.name}"::"text"::${oldEnumNameWithSchema_old}${arraySuffix}`
upQueries.push(
new Query(
`ALTER TABLE ${this.escapePath(table)} ALTER COLUMN "${
newColumn.name
}" TYPE ${upType}`,
),
)
// we add a delay here since for some reason cockroachdb fails with a
// "cannot drop type because other objects still depend on it" error
// if we try to drop the type right after altering it.
upQueries.push(new Query(`SELECT pg_sleep(0.1)`))
downQueries.push(new Query(`SELECT pg_sleep(0.1)`))
downQueries.push(
new Query(
`ALTER TABLE ${this.escapePath(table)} ALTER COLUMN "${
newColumn.name
}" TYPE ${downType}`,
),
)
// restore column default or create new one
if (
newColumn.default !== null &&
newColumn.default !== undefined
) {
upQueries.push(
new Query(
`ALTER TABLE ${this.escapePath(
table,
)} ALTER COLUMN "${newColumn.name}" SET DEFAULT ${
newColumn.default
}`,
),
)
downQueries.push(
new Query(
`ALTER TABLE ${this.escapePath(
table,
)} ALTER COLUMN "${newColumn.name}" DROP DEFAULT`,
),
)
}
// remove old ENUM
upQueries.push(
this.dropEnumTypeSql(
table,
oldColumn,
oldEnumNameWithSchema_old,
),
)
downQueries.push(
this.createEnumTypeSql(
table,
oldColumn,
oldEnumNameWithSchema_old,
),
)
}
if (
oldColumn.isGenerated !== newColumn.isGenerated &&
newColumn.generationStrategy !== "uuid"
@ -1652,7 +1900,10 @@ export class CockroachQueryRunner
}
}
if (newColumn.default !== oldColumn.default) {
if (
newColumn.default !== oldColumn.default &&
!defaultValueChanged
) {
if (
newColumn.default !== null &&
newColumn.default !== undefined
@ -1715,6 +1966,27 @@ export class CockroachQueryRunner
}
}
if (
(newColumn.spatialFeatureType || "").toLowerCase() !==
(oldColumn.spatialFeatureType || "").toLowerCase() ||
newColumn.srid !== oldColumn.srid
) {
upQueries.push(
new Query(
`ALTER TABLE ${this.escapePath(table)} ALTER COLUMN "${
newColumn.name
}" TYPE ${this.driver.createFullType(newColumn)}`,
),
)
downQueries.push(
new Query(
`ALTER TABLE ${this.escapePath(table)} ALTER COLUMN "${
newColumn.name
}" TYPE ${this.driver.createFullType(oldColumn)}`,
),
)
}
await this.executeQueries(upQueries, downQueries)
this.replaceCachedTable(table, clonedTable)
}
@ -1922,6 +2194,24 @@ export class CockroachQueryRunner
downQueries.push(insertQuery)
}
// drop enum type
if (column.type === "enum" || column.type === "simple-enum") {
const hasEnum = await this.hasEnumType(table, column)
if (hasEnum) {
const enumType = await this.getUserDefinedTypeName(
table,
column,
)
const escapedEnumName = `"${enumType.schema}"."${enumType.name}"`
upQueries.push(
this.dropEnumTypeSql(table, column, escapedEnumName),
)
downQueries.push(
this.createEnumTypeSql(table, column, escapedEnumName),
)
}
}
await this.executeQueries(upQueries, downQueries)
clonedTable.removeColumn(column)
@ -2457,6 +2747,7 @@ export class CockroachQueryRunner
const isAnotherTransactionActive = this.isTransactionActive
if (!isAnotherTransactionActive) await this.startTransaction()
try {
const version = await this.getVersion()
const selectViewDropsQuery =
`SELECT 'DROP VIEW IF EXISTS "' || schemaname || '"."' || viewname || '" CASCADE;' as "query" ` +
`FROM "pg_views" WHERE "schemaname" IN (${schemaNamesString})`
@ -2481,6 +2772,11 @@ export class CockroachQueryRunner
sequenceDropQueries.map((q) => this.query(q["query"])),
)
// drop enum types. Supported starting from v20.2.19.
if (VersionUtils.isGreaterOrEqual(version, "20.2.19")) {
await this.dropEnumTypes(schemaNamesString)
}
if (!isAnotherTransactionActive) await this.commitTransaction()
} catch (error) {
try {
@ -2559,7 +2855,6 @@ export class CockroachQueryRunner
if (!tableNames) {
const tablesSql = `SELECT "table_schema", "table_name" FROM "information_schema"."tables"`
dbTables.push(...(await this.query(tablesSql)))
} else {
const tablesCondition = tableNames
@ -2646,16 +2941,30 @@ export class CockroachQueryRunner
`INNER JOIN "pg_class" "cl" ON "cl"."oid" = "con"."confrelid" ` +
`INNER JOIN "pg_namespace" "ns" ON "cl"."relnamespace" = "ns"."oid" ` +
`INNER JOIN "pg_attribute" "att2" ON "att2"."attrelid" = "con"."conrelid" AND "att2"."attnum" = "con"."parent"`
const tableSchemas = dbTables
.map((dbTable) => `'${dbTable.table_schema}'`)
.join(", ")
const enumsSql =
`SELECT "t"."typname" AS "name", string_agg("e"."enumlabel", '|') AS "value" ` +
`FROM "pg_enum" "e" ` +
`INNER JOIN "pg_type" "t" ON "t"."oid" = "e"."enumtypid" ` +
`INNER JOIN "pg_namespace" "n" ON "n"."oid" = "t"."typnamespace" ` +
`WHERE "n"."nspname" IN (${tableSchemas}) ` +
`GROUP BY "t"."typname"`
const [
dbColumns,
dbConstraints,
dbIndices,
dbForeignKeys,
dbEnums,
]: ObjectLiteral[][] = await Promise.all([
this.query(columnsSql),
this.query(constraintsSql),
this.query(indicesSql),
this.query(foreignKeysSql),
this.query(enumsSql),
])
// create tables for loaded tables
@ -2785,17 +3094,50 @@ export class CockroachQueryRunner
}
}
// docs: https://www.postgresql.org/docs/current/xtypes.html
// When you define a new base type, PostgreSQL automatically provides support for arrays of that type.
// The array type typically has the same name as the base type with the underscore character (_) prepended.
// ----
// so, we must remove this underscore character from enum type name
let udtName = dbColumn["udt_name"]
if (udtName.indexOf("_") === 0) {
udtName = udtName.substr(1, udtName.length)
}
const enumType = dbEnums.find((dbEnum) => {
return dbEnum["name"] === udtName
})
if (enumType) {
// check if `enumName` is specified by user
const builtEnumName = this.buildEnumName(
table,
tableColumn,
false,
true,
)
const enumName =
builtEnumName !== enumType["name"]
? enumType["name"]
: undefined
tableColumn.type = "enum"
tableColumn.enum = enumType["value"].split("|")
tableColumn.enumName = enumName
}
if (
dbColumn["data_type"].toLowerCase() === "array"
) {
tableColumn.isArray = true
const type = dbColumn["crdb_sql_type"]
.replace("[]", "")
.toLowerCase()
tableColumn.type =
this.connection.driver.normalizeType({
type: type,
})
if (!enumType) {
const type = dbColumn["crdb_sql_type"]
.replace("[]", "")
.toLowerCase()
tableColumn.type =
this.connection.driver.normalizeType({
type: type,
})
}
}
// check only columns that have length property
@@ -2920,6 +3262,14 @@ export class CockroachQueryRunner
/^(-?[\d\.]+)$/,
"($1)",
)
if (enumType) {
tableColumn.default =
tableColumn.default.replace(
`.${enumType["name"]}`,
"",
)
}
}
}
@@ -2959,6 +3309,32 @@ export class CockroachQueryRunner
tableColumn.charset =
dbColumn["character_set_name"]
if (
tableColumn.type === "geometry" ||
tableColumn.type === "geography"
) {
const sql =
`SELECT * FROM (` +
`SELECT "f_table_schema" "table_schema", "f_table_name" "table_name", ` +
`"f_${tableColumn.type}_column" "column_name", "srid", "type" ` +
`FROM "${tableColumn.type}_columns"` +
`) AS _ ` +
`WHERE "column_name" = '${dbColumn["column_name"]}' AND ` +
`"table_schema" = '${dbColumn["table_schema"]}' AND ` +
`"table_name" = '${dbColumn["table_name"]}'`
const results: ObjectLiteral[] =
await this.query(sql)
if (results.length > 0) {
tableColumn.spatialFeatureType =
results[0].type
tableColumn.srid = results[0].srid
? parseInt(results[0].srid)
: undefined
}
}
return tableColumn
}),
)
@@ -3269,6 +3645,17 @@ export class CockroachQueryRunner
return new Query(sql)
}
/**
* Loads CockroachDB version.
*/
protected async getVersion(): Promise<string> {
const result = await this.query(`SELECT version()`)
return result[0]["version"].replace(
/^CockroachDB CCL v([\d\.]+) .*$/,
"$1",
)
}
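For illustration, the regex in `getVersion()` reduces a typical `version()` banner to its bare semver (the banner text below is a made-up example; real output varies by build):

```typescript
// Hypothetical version() banner; the exact text varies by build.
const raw =
    "CockroachDB CCL v21.2.10 (x86_64-unknown-linux-gnu, built 2022/05/02)"

// The same regex as getVersion(): keep only the captured semver.
const version = raw.replace(/^CockroachDB CCL v([\d\.]+) .*$/, "$1")
```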
/**
* Builds drop table sql.
*/
@@ -3337,6 +3724,68 @@ export class CockroachQueryRunner
})
}
/**
* Drops ENUM types from the given schemas.
*/
protected async dropEnumTypes(schemaNames: string): Promise<void> {
const selectDropsQuery =
`SELECT 'DROP TYPE IF EXISTS "' || n.nspname || '"."' || t.typname || '";' as "query" FROM "pg_type" "t" ` +
`INNER JOIN "pg_enum" "e" ON "e"."enumtypid" = "t"."oid" ` +
`INNER JOIN "pg_namespace" "n" ON "n"."oid" = "t"."typnamespace" ` +
`WHERE "n"."nspname" IN (${schemaNames}) GROUP BY "n"."nspname", "t"."typname"`
const dropQueries: ObjectLiteral[] = await this.query(selectDropsQuery)
await Promise.all(dropQueries.map((q) => this.query(q["query"])))
}
/**
* Checks if an enum with the given name exists in the database.
*/
protected async hasEnumType(
table: Table,
column: TableColumn,
): Promise<boolean> {
let { schema } = this.driver.parseTableName(table)
if (!schema) {
schema = await this.getCurrentSchema()
}
const enumName = this.buildEnumName(table, column, false, true)
const sql =
`SELECT "n"."nspname", "t"."typname" FROM "pg_type" "t" ` +
`INNER JOIN "pg_namespace" "n" ON "n"."oid" = "t"."typnamespace" ` +
`WHERE "n"."nspname" = '${schema}' AND "t"."typname" = '${enumName}'`
const result = await this.query(sql)
return result.length > 0
}
/**
* Builds create ENUM type sql.
*/
protected createEnumTypeSql(
table: Table,
column: TableColumn,
enumName?: string,
): Query {
if (!enumName) enumName = this.buildEnumName(table, column)
const enumValues = column
.enum!.map((value) => `'${value.replace(/'/g, "''")}'`)
.join(", ")
return new Query(`CREATE TYPE ${enumName} AS ENUM(${enumValues})`)
}
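A brief standalone sketch of the label quoting for `CREATE TYPE ... AS ENUM`: embedded single quotes are doubled so they survive inside the SQL string literal (a global regex is used here so every occurrence is escaped; the type and label names are made up):

```typescript
// Sketch of quoting enum labels; names are illustrative only.
const labels = ["A", "it's"]
const enumValues = labels
    .map((value) => `'${value.replace(/'/g, "''")}'`)
    .join(", ")
const createSql = `CREATE TYPE "post_enum_enum" AS ENUM(${enumValues})`
```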
/**
* Builds drop ENUM type sql.
*/
protected dropEnumTypeSql(
table: Table,
column: TableColumn,
enumName?: string,
): Query {
if (!enumName) enumName = this.buildEnumName(table, column)
return new Query(`DROP TYPE ${enumName}`)
}
/**
* Builds create index sql.
* UNIQUE indices are created as UNIQUE constraints.
@@ -3346,9 +3795,11 @@ export class CockroachQueryRunner
.map((columnName) => `"${columnName}"`)
.join(", ")
return new Query(
`CREATE INDEX "${index.name}" ON ${this.escapePath(
table,
)} (${columns}) ${index.where ? "WHERE " + index.where : ""}`,
`CREATE ${index.isUnique ? "UNIQUE " : ""}INDEX "${
index.name
}" ON ${this.escapePath(table)} ${
index.isSpatial ? "USING GiST " : ""
}(${columns}) ${index.where ? "WHERE " + index.where : ""}`,
)
}
@@ -3546,6 +3997,57 @@ export class CockroachQueryRunner
: this.buildSequenceName(table, columnOrName)
}
/**
* Builds ENUM type name from given table and column.
*/
protected buildEnumName(
table: Table,
column: TableColumn,
withSchema: boolean = true,
disableEscape?: boolean,
toOld?: boolean,
): string {
const { schema, tableName } = this.driver.parseTableName(table)
let enumName = column.enumName
? column.enumName
: `${tableName}_${column.name.toLowerCase()}_enum`
if (schema && withSchema) enumName = `${schema}.${enumName}`
if (toOld) enumName = enumName + "_old"
return enumName
.split(".")
.map((i) => {
return disableEscape ? i : `"${i}"`
})
.join(".")
}
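The naming rule above can be sketched as a standalone helper: a default name derived from the table and column, an optional schema prefix, an `_old` suffix, and per-part double-quote escaping. The helper name and option shape are illustrative, not from the codebase:

```typescript
// Standalone sketch of buildEnumName's naming rule; illustrative only.
function sketchEnumName(
    tableName: string,
    columnName: string,
    options: {
        enumName?: string
        schema?: string
        disableEscape?: boolean
        toOld?: boolean
    } = {},
): string {
    let name =
        options.enumName ?? `${tableName}_${columnName.toLowerCase()}_enum`
    if (options.schema) name = `${options.schema}.${name}`
    if (options.toOld) name += "_old"
    return name
        .split(".")
        .map((part) => (options.disableEscape ? part : `"${part}"`))
        .join(".")
}
```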
protected async getUserDefinedTypeName(table: Table, column: TableColumn) {
let { schema, tableName: name } = this.driver.parseTableName(table)
if (!schema) {
schema = await this.getCurrentSchema()
}
const result = await this.query(
`SELECT "udt_schema", "udt_name" ` +
`FROM "information_schema"."columns" WHERE "table_schema" = '${schema}' AND "table_name" = '${name}' AND "column_name"='${column.name}'`,
)
// docs: https://www.postgresql.org/docs/current/xtypes.html
// When you define a new base type, PostgreSQL automatically provides support for arrays of that type.
// The array type typically has the same name as the base type with the underscore character (_) prepended.
// ----
// so, we must remove this underscore character from enum type name
let udtName = result[0]["udt_name"]
if (udtName.indexOf("_") === 0) {
udtName = udtName.substr(1, udtName.length)
}
return {
schema: result[0]["udt_schema"],
name: udtName,
}
}
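The underscore-stripping step above can be isolated as a tiny sketch: arrays of a user-defined type are reported with an `_`-prefixed `udt_name`, so the prefix is removed to recover the base type name (the helper name is illustrative):

```typescript
// Sketch: strip the array-type "_" prefix from a udt_name.
function baseUdtName(udtName: string): string {
    return udtName.startsWith("_") ? udtName.slice(1) : udtName
}
```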
/**
* Escapes a given comment so it's safe to include in a query.
*/
@@ -3590,8 +4092,12 @@ export class CockroachQueryRunner
}
}
if (!column.isGenerated)
if (column.type === "enum" || column.type === "simple-enum") {
c += " " + this.buildEnumName(table, column)
if (column.isArray) c += " array"
} else if (!column.isGenerated) {
c += " " + this.connection.driver.createFullType(column)
}
if (column.asExpression) {
c += ` AS (${column.asExpression}) ${


@@ -24,6 +24,7 @@ import { Table } from "../../schema-builder/table/Table"
import { View } from "../../schema-builder/view/View"
import { TableForeignKey } from "../../schema-builder/table/TableForeignKey"
import { InstanceChecker } from "../../util/InstanceChecker"
import { UpsertType } from "../types/UpsertType.js"
/**
* Organizes communication with MongoDB.
@@ -78,6 +79,11 @@ export class MongoDriver implements Driver {
*/
supportedDataTypes: ColumnType[] = []
/**
* Returns type of upsert supported by driver if any
*/
supportedUpsertTypes: UpsertType[]
/**
* Gets list of spatial column data types.
*/


@@ -26,6 +26,7 @@ import { View } from "../../schema-builder/view/View"
import { TableForeignKey } from "../../schema-builder/table/TableForeignKey"
import { VersionUtils } from "../../util/VersionUtils"
import { InstanceChecker } from "../../util/InstanceChecker"
import { UpsertType } from "../types/UpsertType"
/**
* Organizes communication with MySQL DBMS.
@@ -156,7 +157,7 @@ export class MysqlDriver implements Driver {
/**
* Returns type of upsert supported by driver if any
*/
readonly supportedUpsertType = "on-duplicate-key-update"
supportedUpsertTypes: UpsertType[] = ["on-duplicate-key-update"]
/**
* Gets list of spatial column data types.


@@ -25,6 +25,7 @@ import { View } from "../../schema-builder/view/View"
import { TableForeignKey } from "../../schema-builder/table/TableForeignKey"
import { TypeORMError } from "../../error"
import { InstanceChecker } from "../../util/InstanceChecker"
import { UpsertType } from "../types/UpsertType.js"
/**
* Organizes communication with Oracle RDBMS.
@@ -127,6 +128,11 @@ export class OracleDriver implements Driver {
"urowid",
]
/**
* Returns type of upsert supported by driver if any
*/
supportedUpsertTypes: UpsertType[] = []
/**
* Gets list of spatial column data types.
*/


@@ -27,6 +27,7 @@ import { Table } from "../../schema-builder/table/Table"
import { View } from "../../schema-builder/view/View"
import { TableForeignKey } from "../../schema-builder/table/TableForeignKey"
import { InstanceChecker } from "../../util/InstanceChecker"
import { UpsertType } from "../types/UpsertType"
/**
* Organizes communication with PostgreSQL DBMS.
@@ -188,7 +189,7 @@ export class PostgresDriver implements Driver {
/**
* Returns type of upsert supported by driver if any
*/
readonly supportedUpsertType = "on-conflict-do-update"
supportedUpsertTypes: UpsertType[] = ["on-conflict-do-update"]
/**
* Gets list of spatial column data types.


@@ -3537,48 +3537,22 @@ export class PostgresQueryRunner
}
}
if (tableColumn.type === "geometry") {
const geometryColumnSql = `SELECT * FROM (
SELECT
"f_table_schema" "table_schema",
"f_table_name" "table_name",
"f_geometry_column" "column_name",
"srid",
"type"
FROM "geometry_columns"
) AS _
WHERE
"column_name" = '${dbColumn["column_name"]}' AND
"table_schema" = '${dbColumn["table_schema"]}' AND
"table_name" = '${dbColumn["table_name"]}'`
if (
tableColumn.type === "geometry" ||
tableColumn.type === "geography"
) {
const sql =
`SELECT * FROM (` +
`SELECT "f_table_schema" "table_schema", "f_table_name" "table_name", ` +
`"f_${tableColumn.type}_column" "column_name", "srid", "type" ` +
`FROM "${tableColumn.type}_columns"` +
`) AS _ ` +
`WHERE "column_name" = '${dbColumn["column_name"]}' AND ` +
`"table_schema" = '${dbColumn["table_schema"]}' AND ` +
`"table_name" = '${dbColumn["table_name"]}'`
const results: ObjectLiteral[] =
await this.query(geometryColumnSql)
if (results.length > 0) {
tableColumn.spatialFeatureType =
results[0].type
tableColumn.srid = results[0].srid
}
}
if (tableColumn.type === "geography") {
const geographyColumnSql = `SELECT * FROM (
SELECT
"f_table_schema" "table_schema",
"f_table_name" "table_name",
"f_geography_column" "column_name",
"srid",
"type"
FROM "geography_columns"
) AS _
WHERE
"column_name" = '${dbColumn["column_name"]}' AND
"table_schema" = '${dbColumn["table_schema"]}' AND
"table_name" = '${dbColumn["table_name"]}'`
const results: ObjectLiteral[] =
await this.query(geographyColumnSql)
await this.query(sql)
if (results.length > 0) {
tableColumn.spatialFeatureType =


@@ -25,6 +25,7 @@ import { ReplicationMode } from "../types/ReplicationMode"
import { DriverUtils } from "../DriverUtils"
import { View } from "../../schema-builder/view/View"
import { InstanceChecker } from "../../util/InstanceChecker"
import { UpsertType } from "../types/UpsertType.js"
/**
* Organizes communication with SAP Hana DBMS.
@@ -129,6 +130,11 @@ export class SapDriver implements Driver {
"st_point",
]
/**
* Returns type of upsert supported by driver if any
*/
supportedUpsertTypes: UpsertType[] = []
/**
* Gets list of spatial column data types.
*/


@@ -20,6 +20,7 @@ import { Table } from "../../schema-builder/table/Table"
import { View } from "../../schema-builder/view/View"
import { TableForeignKey } from "../../schema-builder/table/TableForeignKey"
import { CteCapabilities } from "../types/CteCapabilities"
import { UpsertType } from "../types/UpsertType.js"
/**
* Organizes communication with Spanner DBMS.
@@ -99,7 +100,7 @@ export class SpannerDriver implements Driver {
/**
* Returns type of upsert supported by driver if any
*/
readonly supportedUpsertType = undefined
supportedUpsertTypes: UpsertType[] = []
/**
* Gets list of spatial column data types.


@@ -21,6 +21,7 @@ import { Table } from "../../schema-builder/table/Table"
import { View } from "../../schema-builder/view/View"
import { TableForeignKey } from "../../schema-builder/table/TableForeignKey"
import { InstanceChecker } from "../../util/InstanceChecker"
import { UpsertType } from "../types/UpsertType"
type DatabasesMap = Record<
string,
@@ -131,7 +132,7 @@ export abstract class AbstractSqliteDriver implements Driver {
/**
* Returns type of upsert supported by driver if any
*/
readonly supportedUpsertType = "on-conflict-do-update"
supportedUpsertTypes: UpsertType[] = ["on-conflict-do-update"]
/**
* Gets list of column data types that support length by a driver.


@@ -26,6 +26,7 @@ import { View } from "../../schema-builder/view/View"
import { TableForeignKey } from "../../schema-builder/table/TableForeignKey"
import { TypeORMError } from "../../error"
import { InstanceChecker } from "../../util/InstanceChecker"
import { UpsertType } from "../types/UpsertType.js"
/**
* Organizes communication with SQL Server DBMS.
@@ -142,6 +143,11 @@ export class SqlServerDriver implements Driver {
"rowversion",
]
/**
* Returns type of upsert supported by driver if any
*/
supportedUpsertTypes: UpsertType[] = []
/**
* Gets list of spatial column data types.
*/


@@ -0,0 +1,109 @@
/**
* Point geometry object.
* https://tools.ietf.org/html/rfc7946#section-3.1.2
*/
export type Point = {
type: "Point"
coordinates: number[]
}
/**
* LineString geometry object.
* https://tools.ietf.org/html/rfc7946#section-3.1.4
*/
export type LineString = {
type: "LineString"
coordinates: number[][]
}
/**
* Polygon geometry object.
* https://tools.ietf.org/html/rfc7946#section-3.1.6
*/
export type Polygon = {
type: "Polygon"
coordinates: number[][][]
}
/**
* MultiPoint geometry object.
* https://tools.ietf.org/html/rfc7946#section-3.1.3
*/
export type MultiPoint = {
type: "MultiPoint"
coordinates: number[][]
}
/**
* MultiLineString geometry object.
* https://tools.ietf.org/html/rfc7946#section-3.1.5
*/
export type MultiLineString = {
type: "MultiLineString"
coordinates: number[][][]
}
/**
* MultiPolygon geometry object.
* https://tools.ietf.org/html/rfc7946#section-3.1.7
*/
export type MultiPolygon = {
type: "MultiPolygon"
coordinates: number[][][][]
}
/**
* Geometry Collection
* https://tools.ietf.org/html/rfc7946#section-3.1.8
*/
export type GeometryCollection = {
type: "GeometryCollection"
geometries: (
| Point
| LineString
| Polygon
| MultiPoint
| MultiLineString
| MultiPolygon
)[]
}
/**
* Union of Geometry objects.
*/
export type Geometry =
| Point
| LineString
| Polygon
| MultiPoint
| MultiLineString
| MultiPolygon
| GeometryCollection
export type Geography = Geometry
/**
* A feature object which contains a geometry and associated properties.
* https://tools.ietf.org/html/rfc7946#section-3.2
*/
export type Feature = {
type: "Feature"
geometry: Geometry
id?: string | number
bbox?: number[]
properties: { [name: string]: any } | null
}
/**
* A collection of feature objects.
* https://tools.ietf.org/html/rfc7946#section-3.3
*/
export type FeatureCollection = {
type: "FeatureCollection"
bbox?: number[]
features: Feature[]
}
/**
* Union of GeoJSON objects.
*/
export type GeoJSON = Geometry | Feature | FeatureCollection
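For illustration, values conforming to two of the shapes above (the types are re-declared locally so the snippet stands alone, with `Feature.geometry` narrowed to `Point` for brevity; the coordinates and properties are made-up example data):

```typescript
// Local re-declarations of two shapes above; example data is made up.
type Point = { type: "Point"; coordinates: number[] }
type Feature = {
    type: "Feature"
    geometry: Point
    id?: string | number
    bbox?: number[]
    properties: { [name: string]: any } | null
}

const point: Point = { type: "Point", coordinates: [-73.97, 40.77] }
const feature: Feature = {
    type: "Feature",
    geometry: point,
    properties: { name: "example location" },
}
```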


@@ -1 +1,4 @@
export type UpsertType = "on-conflict-do-update" | "on-duplicate-key-update"
export type UpsertType =
| "on-conflict-do-update"
| "on-duplicate-key-update"
| "primary-key"
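The selection rule applied by `EntityManager.upsert()` can be sketched in isolation: an explicitly requested type wins, otherwise the driver's first supported type is used (the helper name is illustrative, not from the codebase):

```typescript
type UpsertType =
    | "on-conflict-do-update"
    | "on-duplicate-key-update"
    | "primary-key"

// Sketch of the EntityManager.upsert fallback; name is illustrative.
function pickUpsertType(
    supported: UpsertType[],
    requested?: UpsertType,
): UpsertType | undefined {
    return requested ?? supported[0]
}
```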


@@ -727,6 +727,9 @@ export class EntityManager {
skipUpdateIfNoValuesChanged:
options.skipUpdateIfNoValuesChanged,
indexPredicate: options.indexPredicate,
upsertType:
options.upsertType ||
this.connection.driver.supportedUpsertTypes[0],
},
)
.execute()


@@ -127,6 +127,7 @@ export * from "./schema-builder/options/TableUniqueOptions"
export * from "./schema-builder/options/ViewOptions"
export * from "./driver/mongodb/typings"
export * from "./driver/types/DatabaseType"
export * from "./driver/types/GeoJsonTypes.js"
export * from "./driver/types/ReplicationMode"
export * from "./driver/sqlserver/MssqlParameter"


@@ -1,3 +1,5 @@
import { UpsertType } from "../driver/types/UpsertType"
export type InsertOrUpdateOptions = {
/**
* If true, postgres will skip the update if no values would be changed (reduces writes)
@@ -7,4 +9,5 @@ export type InsertOrUpdateOptions = {
* If included, postgres will apply the index predicate to a conflict target (partial index)
*/
indexPredicate?: string
upsertType?: UpsertType
}


@@ -1,22 +1,22 @@
import { QueryBuilder } from "./QueryBuilder"
import { ObjectLiteral } from "../common/ObjectLiteral"
import { EntityTarget } from "../common/EntityTarget"
import { QueryDeepPartialEntity } from "./QueryPartialEntity"
import { MysqlDriver } from "../driver/mysql/MysqlDriver"
import { InsertResult } from "./result/InsertResult"
import { ReturningStatementNotSupportedError } from "../error/ReturningStatementNotSupportedError"
import { InsertValuesMissingError } from "../error/InsertValuesMissingError"
import { ColumnMetadata } from "../metadata/ColumnMetadata"
import { ReturningResultsEntityUpdator } from "./ReturningResultsEntityUpdator"
import { BroadcasterResult } from "../subscriber/BroadcasterResult"
import { TypeORMError } from "../error"
import { v4 as uuidv4 } from "uuid"
import { InsertOrUpdateOptions } from "./InsertOrUpdateOptions"
import { SqlServerDriver } from "../driver/sqlserver/SqlServerDriver"
import { EntityTarget } from "../common/EntityTarget"
import { ObjectLiteral } from "../common/ObjectLiteral"
import { AuroraMysqlDriver } from "../driver/aurora-mysql/AuroraMysqlDriver"
import { DriverUtils } from "../driver/DriverUtils"
import { ObjectUtils } from "../util/ObjectUtils"
import { MysqlDriver } from "../driver/mysql/MysqlDriver"
import { SqlServerDriver } from "../driver/sqlserver/SqlServerDriver"
import { TypeORMError } from "../error"
import { InsertValuesMissingError } from "../error/InsertValuesMissingError"
import { ReturningStatementNotSupportedError } from "../error/ReturningStatementNotSupportedError"
import { ColumnMetadata } from "../metadata/ColumnMetadata"
import { BroadcasterResult } from "../subscriber/BroadcasterResult"
import { InstanceChecker } from "../util/InstanceChecker"
import { ObjectUtils } from "../util/ObjectUtils"
import { InsertOrUpdateOptions } from "./InsertOrUpdateOptions"
import { QueryBuilder } from "./QueryBuilder"
import { QueryDeepPartialEntity } from "./QueryPartialEntity"
import { InsertResult } from "./result/InsertResult"
import { ReturningResultsEntityUpdator } from "./ReturningResultsEntityUpdator"
/**
* Allows building and executing complex SQL queries in an elegant way.
@@ -381,6 +381,7 @@ export class InsertQueryBuilder<
overwrite: statementOrOverwrite?.overwrite,
skipUpdateIfNoValuesChanged:
orUpdateOptions?.skipUpdateIfNoValuesChanged,
upsertType: orUpdateOptions?.upsertType,
}
return this
}
@@ -391,6 +392,7 @@ export class InsertQueryBuilder<
skipUpdateIfNoValuesChanged:
orUpdateOptions?.skipUpdateIfNoValuesChanged,
indexPredicate: orUpdateOptions?.indexPredicate,
upsertType: orUpdateOptions?.upsertType,
}
return this
}
@@ -413,6 +415,10 @@ export class InsertQueryBuilder<
const columnsExpression = this.createColumnNamesExpression()
let query = "INSERT "
if (this.expressionMap.onUpdate?.upsertType === "primary-key") {
query = "UPSERT "
}
if (
DriverUtils.isMySQLFamily(this.connection.driver) ||
this.connection.driver.options.type === "aurora-mysql"
@@ -471,118 +477,132 @@ export class InsertQueryBuilder<
query += ` DEFAULT VALUES`
}
}
if (
this.connection.driver.supportedUpsertType ===
"on-conflict-do-update"
) {
if (this.expressionMap.onIgnore) {
query += " ON CONFLICT DO NOTHING "
} else if (this.expressionMap.onConflict) {
query += ` ON CONFLICT ${this.expressionMap.onConflict} `
} else if (this.expressionMap.onUpdate) {
const {
overwrite,
columns,
conflict,
skipUpdateIfNoValuesChanged,
indexPredicate,
} = this.expressionMap.onUpdate
if (this.expressionMap.onUpdate?.upsertType !== "primary-key") {
if (
this.connection.driver.supportedUpsertTypes.includes(
"on-conflict-do-update",
)
) {
if (this.expressionMap.onIgnore) {
query += " ON CONFLICT DO NOTHING "
} else if (this.expressionMap.onConflict) {
query += ` ON CONFLICT ${this.expressionMap.onConflict} `
} else if (this.expressionMap.onUpdate) {
const {
overwrite,
columns,
conflict,
skipUpdateIfNoValuesChanged,
indexPredicate,
} = this.expressionMap.onUpdate
let conflictTarget = "ON CONFLICT"
let conflictTarget = "ON CONFLICT"
if (Array.isArray(conflict)) {
conflictTarget += ` ( ${conflict
.map((column) => this.escape(column))
.join(", ")} )`
if (
indexPredicate &&
!DriverUtils.isPostgresFamily(this.connection.driver)
) {
throw new TypeORMError(
`indexPredicate option is not supported by the current database driver`,
)
if (Array.isArray(conflict)) {
conflictTarget += ` ( ${conflict
.map((column) => this.escape(column))
.join(", ")} )`
if (
indexPredicate &&
!DriverUtils.isPostgresFamily(
this.connection.driver,
)
) {
throw new TypeORMError(
`indexPredicate option is not supported by the current database driver`,
)
}
if (
indexPredicate &&
DriverUtils.isPostgresFamily(this.connection.driver)
) {
conflictTarget += ` WHERE ( ${this.escape(
indexPredicate,
)} )`
}
} else if (conflict) {
conflictTarget += ` ON CONSTRAINT ${this.escape(
conflict,
)}`
}
if (Array.isArray(overwrite)) {
query += ` ${conflictTarget} DO UPDATE SET `
query += overwrite
?.map(
(column) =>
`${this.escape(
column,
)} = EXCLUDED.${this.escape(column)}`,
)
.join(", ")
query += " "
} else if (columns) {
query += ` ${conflictTarget} DO UPDATE SET `
query += columns
.map(
(column) =>
`${this.escape(column)} = :${column}`,
)
.join(", ")
query += " "
}
if (
indexPredicate &&
Array.isArray(overwrite) &&
skipUpdateIfNoValuesChanged &&
DriverUtils.isPostgresFamily(this.connection.driver)
) {
conflictTarget += ` WHERE ( ${this.escape(
indexPredicate,
)} )`
query += ` WHERE (`
query += overwrite
.map(
(column) =>
`${tableName}.${this.escape(
column,
)} IS DISTINCT FROM EXCLUDED.${this.escape(
column,
)}`,
)
.join(" OR ")
query += ") "
}
} else if (conflict) {
conflictTarget += ` ON CONSTRAINT ${this.escape(conflict)}`
}
if (Array.isArray(overwrite)) {
query += ` ${conflictTarget} DO UPDATE SET `
query += overwrite
?.map(
(column) =>
`${this.escape(
column,
)} = EXCLUDED.${this.escape(column)}`,
)
.join(", ")
query += " "
} else if (columns) {
query += ` ${conflictTarget} DO UPDATE SET `
query += columns
.map((column) => `${this.escape(column)} = :${column}`)
.join(", ")
query += " "
}
if (
Array.isArray(overwrite) &&
skipUpdateIfNoValuesChanged &&
DriverUtils.isPostgresFamily(this.connection.driver)
) {
query += ` WHERE (`
query += overwrite
.map(
(column) =>
`${tableName}.${this.escape(
column,
)} IS DISTINCT FROM EXCLUDED.${this.escape(
column,
)}`,
)
.join(" OR ")
query += ") "
}
}
} else if (
this.connection.driver.supportedUpsertType ===
"on-duplicate-key-update"
) {
if (this.expressionMap.onUpdate) {
const { overwrite, columns } = this.expressionMap.onUpdate
if (Array.isArray(overwrite)) {
query += " ON DUPLICATE KEY UPDATE "
query += overwrite
.map(
(column) =>
`${this.escape(column)} = VALUES(${this.escape(
column,
)})`,
)
.join(", ")
query += " "
} else if (Array.isArray(columns)) {
query += " ON DUPLICATE KEY UPDATE "
query += columns
.map((column) => `${this.escape(column)} = :${column}`)
.join(", ")
query += " "
}
}
} else {
if (this.expressionMap.onUpdate) {
throw new TypeORMError(
`onUpdate is not supported by the current database driver`,
} else if (
this.connection.driver.supportedUpsertTypes.includes(
"on-duplicate-key-update",
)
) {
if (this.expressionMap.onUpdate) {
const { overwrite, columns } = this.expressionMap.onUpdate
if (Array.isArray(overwrite)) {
query += " ON DUPLICATE KEY UPDATE "
query += overwrite
.map(
(column) =>
`${this.escape(
column,
)} = VALUES(${this.escape(column)})`,
)
.join(", ")
query += " "
} else if (Array.isArray(columns)) {
query += " ON DUPLICATE KEY UPDATE "
query += columns
.map(
(column) =>
`${this.escape(column)} = :${column}`,
)
.join(", ")
query += " "
}
}
} else {
if (this.expressionMap.onUpdate) {
throw new TypeORMError(
`onUpdate is not supported by the current database driver`,
)
}
}
}
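The net effect of the new "primary-key" branch can be sketched as a tiny helper: that upsert type swaps the statement keyword, producing e.g. `UPSERT INTO "post" ...` on CockroachDB, while all other types keep `INSERT` and append a conflict clause. The helper name is illustrative, not from the codebase:

```typescript
type UpsertType =
    | "on-conflict-do-update"
    | "on-duplicate-key-update"
    | "primary-key"

// Sketch: "primary-key" swaps the keyword; name is illustrative.
function statementKeyword(upsertType?: UpsertType): "INSERT" | "UPSERT" {
    return upsertType === "primary-key" ? "UPSERT" : "INSERT"
}
```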


@@ -795,6 +795,20 @@ export abstract class QueryBuilder<Entity extends ObjectLiteral> {
return `/* ${this.expressionMap.comment.replace("*/", "")} */ `
}
/**
* Time travel queries for CockroachDB
*/
protected createTimeTravelQuery(): string {
if (
this.expressionMap.queryType === "select" &&
this.expressionMap.timeTravel
) {
return ` AS OF SYSTEM TIME ${this.expressionMap.timeTravel}`
}
return ""
}
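The behavior of `createTimeTravelQuery()` can be restated as a standalone sketch: only SELECT queries get the clause, and a disabled (falsy) `timeTravel` value yields nothing (the helper name is illustrative):

```typescript
// Sketch of createTimeTravelQuery(); helper name is illustrative.
function timeTravelClause(
    queryType: string,
    timeTravel?: boolean | string,
): string {
    if (queryType === "select" && timeTravel) {
        return ` AS OF SYSTEM TIME ${timeTravel}`
    }
    return ""
}
```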
/**
* Creates "WHERE" expression.
*/
@@ -848,13 +862,20 @@ export abstract class QueryBuilder<Entity extends ObjectLiteral> {
conditionsArray.push(condition)
}
let condition = ""
// time travel
condition += this.createTimeTravelQuery()
if (!conditionsArray.length) {
return ""
condition += ""
} else if (conditionsArray.length === 1) {
return ` WHERE ${conditionsArray[0]}`
condition += ` WHERE ${conditionsArray[0]}`
} else {
return ` WHERE ( ${conditionsArray.join(" ) AND ( ")} )`
condition += ` WHERE ( ${conditionsArray.join(" ) AND ( ")} )`
}
return condition
}
/**
@@ -1042,6 +1063,10 @@ export abstract class QueryBuilder<Entity extends ObjectLiteral> {
.slice(1)
.join(", ")})`
case "any":
if (driver.options.type === "cockroachdb") {
return `${condition.parameters[0]}::STRING = ANY(${condition.parameters[1]}::STRING[])`
}
return `${condition.parameters[0]} = ANY(${condition.parameters[1]})`
case "isNull":
return `${condition.parameters[0]} IS NULL`
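The CockroachDB special case for the `any` operator above can be sketched in isolation: both sides of the comparison are cast to `STRING`, which avoids type-mismatch errors with enum arrays (the helper name is illustrative, not from the codebase):

```typescript
// Sketch of the ANY(...) condition with CockroachDB STRING casts.
function anyCondition(
    lhs: string,
    rhs: string,
    isCockroach: boolean,
): string {
    return isCockroach
        ? `${lhs}::STRING = ANY(${rhs}::STRING[])`
        : `${lhs} = ANY(${rhs})`
}
```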


@@ -14,6 +14,8 @@ import { RelationMetadata } from "../metadata/RelationMetadata"
import { SelectQueryBuilderOption } from "./SelectQueryBuilderOption"
import { TypeORMError } from "../error"
import { WhereClause } from "./WhereClause"
import { UpsertType } from "../driver/types/UpsertType"
import { CockroachConnectionOptions } from "../driver/cockroachdb/CockroachConnectionOptions"
/**
* Contains all properties of the QueryBuilder that are needed to build a final query.
@@ -115,6 +117,7 @@ export class QueryExpressionMap {
overwrite?: string[]
skipUpdateIfNoValuesChanged?: boolean
indexPredicate?: string
upsertType?: UpsertType
}
/**
@@ -316,6 +319,12 @@ export class QueryExpressionMap {
*/
useTransaction: boolean = false
/**
* Indicates if the query should be a time travel query.
* https://www.cockroachlabs.com/docs/stable/as-of-system-time.html
*/
timeTravel?: boolean | string
/**
* Extra parameters.
*
@@ -349,6 +358,10 @@ export class QueryExpressionMap {
if (connection.options.relationLoadStrategy) {
this.relationLoadStrategy = connection.options.relationLoadStrategy
}
this.timeTravel =
(connection.options as CockroachConnectionOptions)
?.timeTravelQueries || false
}
// -------------------------------------------------------------------------
@@ -524,6 +537,7 @@ export class QueryExpressionMap {
map.updateEntity = this.updateEntity
map.callListeners = this.callListeners
map.useTransaction = this.useTransaction
map.timeTravel = this.timeTravel
map.nativeParameters = Object.assign({}, this.nativeParameters)
map.comment = this.comment
map.commonTableExpressions = this.commonTableExpressions.map(


@@ -1323,6 +1323,21 @@ export class SelectQueryBuilder<Entity extends ObjectLiteral>
return this
}
/**
* Enables time travel for the current query (currently only supported by CockroachDB).
*/
timeTravelQuery(timeTravelFn?: string | boolean): this {
if (this.connection.driver.options.type === "cockroachdb") {
if (timeTravelFn === undefined) {
this.expressionMap.timeTravel = "follower_read_timestamp()"
} else {
this.expressionMap.timeTravel = timeTravelFn
}
}
return this
}
/**
* Sets ORDER BY condition in the query builder.
* If you had previously ORDER BY expression defined,
@@ -3385,13 +3400,26 @@ export class SelectQueryBuilder<Entity extends ObjectLiteral>
},
)
const originalQuery = this.clone()
// preserve the original timeTravel value since it is set to "false" in the subquery
const originalQueryTimeTravel =
originalQuery.expressionMap.timeTravel
rawResults = await new SelectQueryBuilder(
this.connection,
queryRunner,
)
.select(`DISTINCT ${querySelects.join(", ")}`)
.addSelect(selects)
.from(`(${this.clone().orderBy().getQuery()})`, "distinctAlias")
.from(
`(${originalQuery
.orderBy()
.timeTravelQuery(false) // set to "false" since the time travel clause must appear at the very end and applies to the entire SELECT statement
.getQuery()})`,
"distinctAlias",
)
.timeTravelQuery(originalQueryTimeTravel)
.offset(this.expressionMap.skip)
.limit(this.expressionMap.take)
.orderBy(orderBys)


@@ -585,8 +585,9 @@ export class UpdateQueryBuilder<Entity extends ObjectLiteral>
expression = `${geomFromText}(${paramName})`
}
} else if (
this.connection.driver.options.type ===
"postgres" &&
DriverUtils.isPostgresFamily(
this.connection.driver,
) &&
this.connection.driver.spatialTypes.indexOf(
column.type,
) !== -1


@@ -1,9 +1,9 @@
import { InsertOrUpdateOptions } from "../query-builder/InsertOrUpdateOptions"
import { UpsertType } from "../driver/types/UpsertType"
/**
* Special options passed to Repository#upsert
*/
// eslint-disable-next-line @typescript-eslint/no-unused-vars
export interface UpsertOptions<Entity> extends InsertOrUpdateOptions {
conflictPaths: string[] | { [P in keyof Entity]?: true }
@@ -11,4 +11,11 @@ export interface UpsertOptions<Entity> extends InsertOrUpdateOptions {
* If true, postgres will skip the update if no values would be changed (reduces writes)
*/
skipUpdateIfNoValuesChanged?: boolean
/**
* Define the type of upsert to use (currently, CockroachDB only).
*
* If none is provided, the database default (the first type in the list) is used.
*/
upsertType?: UpsertType
}


@@ -0,0 +1,448 @@
import "reflect-metadata"
import { DataSource } from "../../../../../src/data-source/DataSource"
import { expect } from "chai"
import {
closeTestingConnections,
createTestingConnections,
reloadTestingDatabases,
} from "../../../../utils/test-utils"
import { Post } from "./entity/Post"
import { Table, TableColumn } from "../../../../../src"
describe("database schema > column types > cockroachdb-enum", () => {
let connections: DataSource[]
before(async () => {
connections = await createTestingConnections({
entities: [__dirname + "/entity/*{.js,.ts}"],
enabledDrivers: ["cockroachdb"],
})
})
beforeEach(() => reloadTestingDatabases(connections))
after(() => closeTestingConnections(connections))
it("should create table with ENUM column and save data to it", () =>
Promise.all(
connections.map(async (connection) => {
const postRepository = connection.getRepository(Post)
const queryRunner = connection.createQueryRunner()
const table = await queryRunner.getTable("post")
await queryRunner.release()
const post = new Post()
post.enum = "A"
post.enumArray = ["A", "B"]
post.enumArray2 = ["A", "C"]
post.simpleEnum = "A"
post.name = "Post #1"
await postRepository.save(post)
const loadedPost = (await postRepository.findOneBy({
id: 1,
}))!
loadedPost.enum.should.be.equal(post.enum)
loadedPost.enumArray.should.be.deep.equal(post.enumArray)
loadedPost.enumArray2.should.be.deep.equal(post.enumArray2)
loadedPost.simpleEnum.should.be.equal(post.simpleEnum)
table!.findColumnByName("enum")!.type.should.be.equal("enum")
table!
.findColumnByName("enumArray")!
.type.should.be.equal("enum")
table!
.findColumnByName("enumArray2")!
.type.should.be.equal("enum")
table!.findColumnByName("enumArray")!.isArray.should.be.true
table!
.findColumnByName("simpleEnum")!
.type.should.be.equal("enum")
}),
))
it("should create ENUM column and revert creation", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
await queryRunner.addColumn(
"post",
new TableColumn({
name: "newEnum",
type: "enum",
enum: ["Apple", "Pineapple"],
}),
)
let table = await queryRunner.getTable("post")
table!.findColumnByName("newEnum")!.type.should.be.equal("enum")
await queryRunner.executeMemoryDownSql()
table = await queryRunner.getTable("post")
expect(table!.findColumnByName("newEnum")).to.be.undefined
await queryRunner.release()
}),
))
it("should drop ENUM column and revert drop", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
let table = await queryRunner.getTable("post")
const enumColumn = table!.findColumnByName("enum")!
await queryRunner.dropColumn(table!, enumColumn)
expect(table!.findColumnByName("enum")).to.be.undefined
await queryRunner.executeMemoryDownSql()
table = await queryRunner.getTable("post")
table!.findColumnByName("enum")!.type.should.be.equal("enum")
await queryRunner.release()
}),
))
it("should create table with ENUM column and revert creation", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
await queryRunner.createTable(
new Table({
name: "question",
columns: [
{
name: "enum",
type: "enum",
enum: ["Apple", "Banana", "Cherry"],
},
],
}),
)
let table = await queryRunner.getTable("question")
const enumColumn = table!.findColumnByName("enum")!
enumColumn.type.should.be.equal("enum")
enumColumn.enum!.should.be.eql(["Apple", "Banana", "Cherry"])
await queryRunner.executeMemoryDownSql()
table = await queryRunner.getTable("question")
expect(table).to.be.undefined
await queryRunner.release()
}),
))
it("should drop table with ENUM column and revert drop", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
await queryRunner.dropTable("post")
let table = await queryRunner.getTable("post")
expect(table).to.be.undefined
await queryRunner.executeMemoryDownSql()
table = await queryRunner.getTable("post")
expect(table).to.be.not.undefined
await queryRunner.release()
}),
))
it("should change non-enum column into ENUM and revert change", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
let table = await queryRunner.getTable("post")
let nameColumn = table!.findColumnByName("name")!
let changedColumn = nameColumn.clone()
changedColumn.type = "enum"
changedColumn.enum = ["Apple", "Banana", "Cherry"]
await queryRunner.changeColumn(
table!,
nameColumn,
changedColumn,
)
table = await queryRunner.getTable("post")
changedColumn = table!.findColumnByName("name")!
changedColumn.type.should.be.equal("enum")
changedColumn.enum!.should.be.eql(["Apple", "Banana", "Cherry"])
await queryRunner.executeMemoryDownSql()
table = await queryRunner.getTable("post")
nameColumn = table!.findColumnByName("name")!
nameColumn.type.should.be.equal("varchar")
expect(nameColumn.enum).to.be.undefined
await queryRunner.release()
}),
))
it("should change ENUM column into non-enum and revert change", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
let table = await queryRunner.getTable("post")
let enumColumn = table!.findColumnByName("enum")!
let changedColumn = enumColumn.clone()
changedColumn.type = "varchar"
changedColumn.enum = undefined
await queryRunner.changeColumn(
table!,
enumColumn,
changedColumn,
)
table = await queryRunner.getTable("post")
changedColumn = table!.findColumnByName("enum")!
changedColumn.type.should.be.equal("varchar")
expect(changedColumn.enum).to.be.undefined
await queryRunner.executeMemoryDownSql()
table = await queryRunner.getTable("post")
enumColumn = table!.findColumnByName("enum")!
enumColumn.type.should.be.equal("enum")
enumColumn.enum!.should.be.eql(["A", "B", "C"])
await queryRunner.release()
}),
))
it("should change ENUM array column into non-array and revert change", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
let table = await queryRunner.getTable("post")
let enumColumn = table!.findColumnByName("enumArray")!
let changedColumn = enumColumn.clone()
changedColumn.isArray = false
await queryRunner.changeColumn(
table!,
enumColumn,
changedColumn,
)
table = await queryRunner.getTable("post")
changedColumn = table!.findColumnByName("enumArray")!
changedColumn.isArray.should.be.false
await queryRunner.executeMemoryDownSql()
table = await queryRunner.getTable("post")
enumColumn = table!.findColumnByName("enumArray")!
enumColumn.isArray.should.be.true
await queryRunner.release()
}),
))
it("should change ENUM value and revert change", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
let table = await queryRunner.getTable("post")
const enumColumn = table!.findColumnByName("enum")!
const changedColumn = enumColumn.clone()
changedColumn.enum = ["C", "D", "E"]
await queryRunner.changeColumn(
table!,
enumColumn,
changedColumn,
)
table = await queryRunner.getTable("post")
table!
.findColumnByName("enum")!
.enum!.should.be.eql(["C", "D", "E"])
await queryRunner.executeMemoryDownSql()
table = await queryRunner.getTable("post")
table!
.findColumnByName("enum")!
.enum!.should.be.eql(["A", "B", "C"])
await queryRunner.release()
}),
))
it("should change `enumName` and revert change", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
// add `enumName`
let table = await queryRunner.getTable("post")
const column = table!.findColumnByName("enum")!
const newColumn = column.clone()
newColumn.enumName = "PostTypeEnum"
// change column
await queryRunner.changeColumn(table!, column, newColumn)
// check if `enumName` changed
table = await queryRunner.getTable("post")
let changedColumn = table!.findColumnByName("enum")!
expect(changedColumn.enumName).to.equal("PostTypeEnum")
// revert changes
await queryRunner.executeMemoryDownSql()
// check if `enumName` reverted
table = await queryRunner.getTable("post")
changedColumn = table!.findColumnByName("enum")!
expect(changedColumn.enumName).to.be.undefined
await queryRunner.release()
}),
))
it("should not create new type if same `enumName` is used more than once", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
const table = new Table({
name: "my_table",
columns: [
{
name: "enum1",
type: "enum",
enum: ["Apple", "Banana", "Cherry"],
enumName: "Fruits",
},
{
name: "enum2",
type: "enum",
enum: ["Apple", "Banana", "Cherry"],
enumName: "Fruits",
},
{
name: "enum3",
type: "enum",
enumName: "Fruits",
},
],
})
await queryRunner.createTable(table)
// revert changes
await queryRunner.executeMemoryDownSql()
await queryRunner.release()
}),
))
it("should change both ENUM value and ENUM name and revert change", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
let table = await queryRunner.getTable("post")
const enumColumn = table!.findColumnByName("enum")!
const changedColumn = enumColumn.clone()
changedColumn.enum = ["C", "D", "E"]
changedColumn.enumName = "my_enum_type"
await queryRunner.changeColumn(
table!,
enumColumn,
changedColumn,
)
table = await queryRunner.getTable("post")
const columnAfterChange = table!.findColumnByName("enum")!
columnAfterChange.enum!.should.be.eql(["C", "D", "E"])
columnAfterChange.enumName!.should.be.eql("my_enum_type")
await queryRunner.executeMemoryDownSql()
table = await queryRunner.getTable("post")
const columnAfterRevert = table!.findColumnByName("enum")!
columnAfterRevert.enum!.should.be.eql(["A", "B", "C"])
expect(columnAfterRevert.enumName).to.be.undefined
await queryRunner.release()
}),
))
it("should rename ENUM when column renamed and revert rename", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
const currentSchemaQuery = await queryRunner.query(
`SELECT * FROM current_schema()`,
)
const currentSchema = currentSchemaQuery[0]["current_schema"]
const table = await queryRunner.getTable("post")
const enumColumn = table!.findColumnByName("enum")!
const changedColumn = enumColumn.clone()
changedColumn.name = "enumerable"
await queryRunner.changeColumn(
table!,
enumColumn,
changedColumn,
)
let result = await queryRunner.query(
`SELECT "n"."nspname", "t"."typname" FROM "pg_type" "t" ` +
`INNER JOIN "pg_namespace" "n" ON "n"."oid" = "t"."typnamespace" ` +
`WHERE "n"."nspname" = '${currentSchema}' AND "t"."typname" = 'post_enumerable_enum'`,
)
result.length.should.be.equal(1)
await queryRunner.executeMemoryDownSql()
result = await queryRunner.query(
`SELECT "n"."nspname", "t"."typname" FROM "pg_type" "t" ` +
`INNER JOIN "pg_namespace" "n" ON "n"."oid" = "t"."typnamespace" ` +
`WHERE "n"."nspname" = '${currentSchema}' AND "t"."typname" = 'post_enum_enum'`,
)
result.length.should.be.equal(1)
await queryRunner.release()
}),
))
it("should rename ENUM when table renamed and revert rename", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
const currentSchemaQuery = await queryRunner.query(
`SELECT * FROM current_schema()`,
)
const currentSchema = currentSchemaQuery[0]["current_schema"]
const table = await queryRunner.getTable("post")
await queryRunner.renameTable(table!, "question")
let result = await queryRunner.query(
`SELECT "n"."nspname", "t"."typname" FROM "pg_type" "t" ` +
`INNER JOIN "pg_namespace" "n" ON "n"."oid" = "t"."typnamespace" ` +
`WHERE "n"."nspname" = '${currentSchema}' AND "t"."typname" = 'question_enum_enum'`,
)
result.length.should.be.equal(1)
await queryRunner.executeMemoryDownSql()
result = await queryRunner.query(
`SELECT "n"."nspname", "t"."typname" FROM "pg_type" "t" ` +
`INNER JOIN "pg_namespace" "n" ON "n"."oid" = "t"."typnamespace" ` +
`WHERE "n"."nspname" = '${currentSchema}' AND "t"."typname" = 'post_enum_enum'`,
)
result.length.should.be.equal(1)
await queryRunner.release()
}),
))
})

View File

@@ -0,0 +1,28 @@
import { Column } from "../../../../../../src/index"
import { Entity } from "../../../../../../src/index"
import { PrimaryGeneratedColumn } from "../../../../../../src"
@Entity()
export class Post {
@PrimaryGeneratedColumn()
id: number
@Column("enum", { enum: ["A", "B", "C"] })
enum: string
@Column("enum", { enum: ["A", "B", "C"], array: true })
enumArray: string[]
@Column("enum", {
enum: ["A", "B", "C"],
enumName: "enum_array",
array: true,
})
enumArray2: string[]
@Column("simple-enum", { enum: ["A", "B", "C"] })
simpleEnum: string
@Column()
name: string
}

View File

@@ -1,5 +1,14 @@
import "reflect-metadata"
import { DataSource } from "../../../../../src"
import {
DataSource,
GeometryCollection,
LineString,
MultiLineString,
MultiPoint,
MultiPolygon,
Point,
Polygon,
} from "../../../../../src"
import {
closeTestingConnections,
createTestingConnections,
@@ -28,6 +37,84 @@ describe("database schema > column types > cockroachdb", () => {
const table = await queryRunner.getTable("post")
await queryRunner.release()
const point: Point = {
type: "Point",
coordinates: [116.443987, 39.920843],
}
const linestring: LineString = {
type: "LineString",
coordinates: [
[-87.623177, 41.881832],
[-90.199402, 38.627003],
[-82.446732, 38.413651],
[-87.623177, 41.881832],
],
}
const polygon: Polygon = {
type: "Polygon",
coordinates: [
[
[-87.906471, 43.038902],
[-95.992775, 36.15398],
[-75.704722, 36.076944],
[-87.906471, 43.038902],
],
[
[-87.623177, 41.881832],
[-90.199402, 38.627003],
[-82.446732, 38.413651],
[-87.623177, 41.881832],
],
],
}
const multipoint: MultiPoint = {
type: "MultiPoint",
coordinates: [
[100.0, 0.0],
[101.0, 1.0],
],
}
const multilinestring: MultiLineString = {
type: "MultiLineString",
coordinates: [
[
[170.0, 45.0],
[180.0, 45.0],
],
[
[-180.0, 45.0],
[-170.0, 45.0],
],
],
}
const multipolygon: MultiPolygon = {
type: "MultiPolygon",
coordinates: [
[
[
[180.0, 40.0],
[180.0, 50.0],
[170.0, 50.0],
[170.0, 40.0],
[180.0, 40.0],
],
],
[
[
[-170.0, 40.0],
[-170.0, 50.0],
[-180.0, 50.0],
[-180.0, 40.0],
[-170.0, 40.0],
],
],
],
}
const geometrycollection: GeometryCollection = {
type: "GeometryCollection",
geometries: [point, linestring, polygon],
}
const post = new Post()
post.id = 1
post.name = "Post"
@@ -72,6 +159,20 @@ describe("database schema > column types > cockroachdb", () => {
post.boolean = true
post.bool = false
post.inet = "192.168.100.128"
post.point = point
post.linestring = linestring
post.polygon = polygon
post.multipoint = multipoint
post.multilinestring = multilinestring
post.multipolygon = multipolygon
post.geometrycollection = geometrycollection
post.point_geography = point
post.linestring_geography = linestring
post.polygon_geography = polygon
post.multipoint_geography = multipoint
post.multilinestring_geography = multilinestring
post.multipolygon_geography = multipolygon
post.geometrycollection_geography = geometrycollection
post.uuid = "0e37df36-f698-11e6-8dd4-cb9ced3df976"
post.jsonb = { id: 1, name: "Post" }
post.json = { id: 1, name: "Post" }
@@ -113,12 +214,12 @@ describe("database schema > column types > cockroachdb", () => {
.should.be.equal(post.bytea.toString())
loadedPost.blob.toString().should.be.equal(post.blob.toString())
loadedPost.date.should.be.equal(post.date)
// loadedPost.interval.years.should.be.equal(1);
// loadedPost.interval.months.should.be.equal(2);
// loadedPost.interval.days.should.be.equal(3);
// loadedPost.interval.hours.should.be.equal(4);
// loadedPost.interval.minutes.should.be.equal(5);
// loadedPost.interval.seconds.should.be.equal(6);
loadedPost.interval.years.should.be.equal(1)
loadedPost.interval.months.should.be.equal(2)
loadedPost.interval.days.should.be.equal(3)
loadedPost.interval.hours.should.be.equal(4)
loadedPost.interval.minutes.should.be.equal(5)
loadedPost.interval.seconds.should.be.equal(6)
loadedPost.time.should.be.equal(post.time)
loadedPost.timeWithoutTimeZone.should.be.equal(
post.timeWithoutTimeZone,
@@ -138,6 +239,38 @@ describe("database schema > column types > cockroachdb", () => {
loadedPost.boolean.should.be.equal(post.boolean)
loadedPost.bool.should.be.equal(post.bool)
loadedPost.inet.should.be.equal(post.inet)
loadedPost.point.should.deep.include(post.point)
loadedPost.linestring.should.deep.include(post.linestring)
loadedPost.polygon.should.deep.include(post.polygon)
loadedPost.multipoint.should.deep.include(post.multipoint)
loadedPost.multilinestring.should.deep.include(
post.multilinestring,
)
loadedPost.multipolygon.should.deep.include(post.multipolygon)
loadedPost.geometrycollection.should.deep.include(
post.geometrycollection,
)
loadedPost.point_geography.should.deep.include(
post.point_geography,
)
loadedPost.linestring_geography.should.deep.include(
post.linestring_geography,
)
loadedPost.polygon_geography.should.deep.include(
post.polygon_geography,
)
loadedPost.multipoint_geography.should.deep.include(
post.multipoint_geography,
)
loadedPost.multilinestring_geography.should.deep.include(
post.multilinestring_geography,
)
loadedPost.multipolygon_geography.should.deep.include(
post.multipolygon_geography,
)
loadedPost.geometrycollection_geography.should.deep.include(
post.geometrycollection_geography,
)
loadedPost.uuid.should.be.eql(post.uuid)
loadedPost.jsonb.should.be.eql(post.jsonb)
loadedPost.json.should.be.eql(post.json)
@@ -223,6 +356,48 @@ describe("database schema > column types > cockroachdb", () => {
table!.findColumnByName("boolean")!.type.should.be.equal("bool")
table!.findColumnByName("bool")!.type.should.be.equal("bool")
table!.findColumnByName("inet")!.type.should.be.equal("inet")
table!
.findColumnByName("point")!
.type.should.be.equal("geometry")
table!
.findColumnByName("linestring")!
.type.should.be.equal("geometry")
table!
.findColumnByName("polygon")!
.type.should.be.equal("geometry")
table!
.findColumnByName("multipoint")!
.type.should.be.equal("geometry")
table!
.findColumnByName("multilinestring")!
.type.should.be.equal("geometry")
table!
.findColumnByName("multipolygon")!
.type.should.be.equal("geometry")
table!
.findColumnByName("geometrycollection")!
.type.should.be.equal("geometry")
table!
.findColumnByName("point_geography")!
.type.should.be.equal("geography")
table!
.findColumnByName("linestring_geography")!
.type.should.be.equal("geography")
table!
.findColumnByName("polygon_geography")!
.type.should.be.equal("geography")
table!
.findColumnByName("multipoint_geography")!
.type.should.be.equal("geography")
table!
.findColumnByName("multilinestring_geography")!
.type.should.be.equal("geography")
table!
.findColumnByName("multipolygon_geography")!
.type.should.be.equal("geography")
table!
.findColumnByName("geometrycollection_geography")!
.type.should.be.equal("geography")
table!.findColumnByName("uuid")!.type.should.be.equal("uuid")
table!.findColumnByName("jsonb")!.type.should.be.equal("jsonb")
table!.findColumnByName("json")!.type.should.be.equal("jsonb")

View File

@@ -1,6 +1,15 @@
import { Entity } from "../../../../../../src"
import { PrimaryColumn } from "../../../../../../src"
import { Column } from "../../../../../../src"
import {
Column,
Entity,
GeometryCollection,
LineString,
MultiLineString,
MultiPoint,
MultiPolygon,
Point,
Polygon,
PrimaryColumn,
} from "../../../../../../src"
@Entity()
export class Post {
@@ -142,6 +151,56 @@ export class Post {
@Column("inet")
inet: string
// -------------------------------------------------------------------------
// Geometry Type
// -------------------------------------------------------------------------
@Column("geometry")
point: Point
@Column("geometry")
polygon: Polygon
@Column("geometry")
multipoint: MultiPoint
@Column("geometry")
linestring: LineString
@Column("geometry")
multilinestring: MultiLineString
@Column("geometry")
multipolygon: MultiPolygon
@Column("geometry")
geometrycollection: GeometryCollection
// -------------------------------------------------------------------------
// Geography Type
// -------------------------------------------------------------------------
@Column("geography")
point_geography: Point
@Column("geography")
polygon_geography: Polygon
@Column("geography")
multipoint_geography: MultiPoint
@Column("geography")
linestring_geography: LineString
@Column("geography")
multilinestring_geography: MultiLineString
@Column("geography")
multipolygon_geography: MultiPolygon
@Column("geography")
geometrycollection_geography: GeometryCollection
// -------------------------------------------------------------------------
// UUID Type
// -------------------------------------------------------------------------

View File

@@ -91,6 +91,14 @@ export class EnumArrayEntity {
})
enumWithoutDefault: StringEnum[]
@Column({
type: "enum",
enum: StringEnum,
array: true,
default: "{a,e}",
})
defaultArrayAsString: StringEnum[]
@Column({
type: "enum",
enum: StringEnum,

View File

@@ -18,7 +18,7 @@ describe("database schema > enum arrays", () => {
before(async () => {
connections = await createTestingConnections({
entities: [__dirname + "/entity/*{.js,.ts}"],
enabledDrivers: ["postgres"],
enabledDrivers: ["postgres", "cockroachdb"],
})
})
beforeEach(() => reloadTestingDatabases(connections))

View File

@@ -18,7 +18,7 @@ describe("database schema > enums", () => {
before(async () => {
connections = await createTestingConnections({
entities: [__dirname + "/entity/*{.js,.ts}"],
enabledDrivers: ["postgres", "mysql", "mariadb"],
enabledDrivers: ["postgres", "mysql", "mariadb", "cockroachdb"],
})
})
beforeEach(() => reloadTestingDatabases(connections))

View File

@@ -47,7 +47,7 @@ describe("entity-model", () => {
it("should upsert successfully", async () => {
// These must run sequentially as we have the global context of the `Post` ActiveRecord class
for (const connection of connections.filter(
(c) => c.driver.supportedUpsertType != null,
(c) => c.driver.supportedUpsertTypes.length > 0,
)) {
Post.useDataSource(connection) // change connection each time because of AR specifics

View File

@@ -14,7 +14,7 @@ describe("find options > find operators > ArrayContainedBy", () => {
async () =>
(connections = await createTestingConnections({
__dirname,
enabledDrivers: ["postgres"],
enabledDrivers: ["postgres", "cockroachdb"],
// logging: true,
})),
)

View File

@@ -14,7 +14,7 @@ describe("find options > find operators > ArrayContains", () => {
async () =>
(connections = await createTestingConnections({
__dirname,
enabledDrivers: ["postgres"],
enabledDrivers: ["postgres", "cockroachdb"],
// logging: true,
})),
)

View File

@@ -307,8 +307,9 @@ describe("query builder > insertion > on conflict", () => {
Promise.all(
connections.map(async (connection) => {
if (
connection.driver.supportedUpsertType !==
"on-duplicate-key-update"
!connection.driver.supportedUpsertTypes.includes(
"on-duplicate-key-update",
)
)
return
const post1 = new Post()

View File

@@ -820,20 +820,6 @@ describe("query builder > locking", () => {
])
})
if (connection.driver.options.type === "cockroachdb")
return connection.manager.transaction((entityManager) => {
return Promise.all([
entityManager
.createQueryBuilder(PostWithVersion, "post")
.setLock("pessimistic_read")
.where("post.id = :id", { id: 1 })
.getOne()
.should.be.rejectedWith(
LockNotSupportedOnGivenDriverError,
),
])
})
return
}),
))
@@ -868,7 +854,7 @@ describe("query builder > locking", () => {
it("should throw error if for key share locking not supported by given driver", () =>
Promise.all(
connections.map(async (connection) => {
if (!(connection.driver.options.type === "postgres")) {
if (!DriverUtils.isPostgresFamily(connection.driver)) {
return connection.manager.transaction((entityManager) => {
return Promise.all([
entityManager

View File

@@ -0,0 +1,17 @@
import {
Column,
Entity,
PrimaryGeneratedColumn,
} from "../../../../../src/index"
@Entity()
export class Account {
@PrimaryGeneratedColumn()
id: number
@Column()
name: string
@Column()
balance: number
}

View File

@@ -0,0 +1,17 @@
import {
Entity,
JoinColumn,
OneToOne,
PrimaryGeneratedColumn,
} from "../../../../../src/index"
import { Account } from "./Account"
@Entity()
export class Person {
@PrimaryGeneratedColumn()
id: number
@OneToOne(() => Account)
@JoinColumn()
account: Account
}

View File

@@ -0,0 +1,271 @@
import "reflect-metadata"
import {
closeTestingConnections,
createTestingConnections,
reloadTestingDatabases,
sleep,
} from "../../../utils/test-utils"
import { DataSource } from "../../../../src/index"
import { Account } from "./entity/Account"
import { Person } from "./entity/Person"
describe("query builder > time-travel-query", () => {
// -------------------------------------------------------------------------
// Prepare
// -------------------------------------------------------------------------
let connections: DataSource[]
before(
async () =>
(connections = await createTestingConnections({
entities: [__dirname + "/entity/*{.js,.ts}"],
enabledDrivers: ["cockroachdb"],
})),
)
beforeEach(() => reloadTestingDatabases(connections))
after(() => closeTestingConnections(connections))
// -------------------------------------------------------------------------
// Tests
// -------------------------------------------------------------------------
it("should execute time travel query without options", () =>
Promise.all(
connections.map(async (connection) => {
const repository = connection.getRepository(Account)
// create account
let account = new Account()
account.name = "Edna Barath"
account.balance = 100
await repository.save(account)
// wait for 5 seconds
await sleep(5000)
// update account balance
account.balance = 200
await repository.save(account)
// check if balance updated
account = await repository
.createQueryBuilder("account")
.getOneOrFail()
account.balance.should.be.equal(200)
// load the account state from 5 seconds ago
account = await repository
.createQueryBuilder("account")
.timeTravelQuery()
.getOneOrFail()
account.balance.should.be.equal(100)
}),
))
it("should execute time travel query with options", () =>
Promise.all(
connections.map(async (connection) => {
const repository = connection.getRepository(Account)
// create account
let account = new Account()
account.name = "Edna Barath"
account.balance = 100
await repository.save(account)
// wait for 2 seconds
await sleep(2000)
// update account balance
account.balance = 200
await repository.save(account)
// load current account state
account = await repository
.createQueryBuilder("account")
.getOneOrFail()
account.balance.should.be.equal(200)
// load the account state from 2 seconds ago
account = await repository
.createQueryBuilder("account")
.timeTravelQuery("'-2s'")
.getOneOrFail()
account.balance.should.be.equal(100)
}),
))
it("should execute time travel query with 'skip' and 'take' options", () =>
Promise.all(
connections.map(async (connection) => {
const repository = connection.getRepository(Account)
// create accounts
for (let i = 1; i < 6; i++) {
const account = new Account()
account.name = `Person_${i}`
account.balance = 100 * i
await repository.save(account)
}
// wait for 2 seconds
await sleep(2000)
let accounts = await repository
.createQueryBuilder("account")
.getMany()
// update accounts
for (let account of accounts) {
account.balance = account.balance + 100
await repository.save(account)
}
// load current accounts state
accounts = await repository
.createQueryBuilder("account")
.skip(2)
.take(3)
.getMany()
accounts.length.should.be.equal(3)
accounts[0].balance.should.be.equal(400)
accounts[1].balance.should.be.equal(500)
accounts[2].balance.should.be.equal(600)
// load the accounts' state from 2 seconds ago
accounts = await repository
.createQueryBuilder("account")
.timeTravelQuery(`'-2s'`)
.skip(2)
.take(3)
.getMany()
accounts.length.should.be.equal(3)
accounts[0].balance.should.be.equal(300)
accounts[1].balance.should.be.equal(400)
accounts[2].balance.should.be.equal(500)
}),
))
it("should execute time travel query with JOIN and skip/take options", () =>
Promise.all(
connections.map(async (connection) => {
const accountRepository = connection.getRepository(Account)
const personRepository = connection.getRepository(Person)
// create persons and accounts
for (let i = 1; i < 6; i++) {
const account = new Account()
account.name = `Person_${i}`
account.balance = 100 * i
await accountRepository.save(account)
const person = new Person()
person.account = account
await personRepository.save(person)
}
// wait for 2 seconds
await sleep(2000)
const accounts = await accountRepository
.createQueryBuilder("account")
.getMany()
// update accounts
for (let account of accounts) {
account.balance = account.balance + 100
await accountRepository.save(account)
}
// load current accounts state
let persons = await personRepository
.createQueryBuilder("person")
.innerJoinAndSelect("person.account", "account")
.skip(2)
.take(3)
.getMany()
persons.length.should.be.equal(3)
persons[0].account.balance.should.be.equal(400)
persons[1].account.balance.should.be.equal(500)
persons[2].account.balance.should.be.equal(600)
// load the accounts' state from 2 seconds ago
persons = await personRepository
.createQueryBuilder("person")
.innerJoinAndSelect("person.account", "account")
.timeTravelQuery(`'-2s'`)
.skip(2)
.take(3)
.getMany()
persons.length.should.be.equal(3)
persons[0].account.balance.should.be.equal(300)
persons[1].account.balance.should.be.equal(400)
persons[2].account.balance.should.be.equal(500)
}),
))
it("should execute time travel query with JOIN and limit/offset options", () =>
Promise.all(
connections.map(async (connection) => {
const accountRepository = connection.getRepository(Account)
const personRepository = connection.getRepository(Person)
// create persons and accounts
for (let i = 1; i < 6; i++) {
const account = new Account()
account.name = `Person_${i}`
account.balance = 100 * i
await accountRepository.save(account)
const person = new Person()
person.account = account
await personRepository.save(person)
}
// wait for 2 seconds
await sleep(2000)
const accounts = await accountRepository
.createQueryBuilder("account")
.getMany()
// update accounts
for (let account of accounts) {
account.balance = account.balance + 100
await accountRepository.save(account)
}
// load current accounts state
let persons = await personRepository
.createQueryBuilder("person")
.innerJoinAndSelect("person.account", "account")
.offset(2)
.limit(3)
.getMany()
persons.length.should.be.equal(3)
persons[0].account.balance.should.be.equal(400)
persons[1].account.balance.should.be.equal(500)
persons[2].account.balance.should.be.equal(600)
// load the accounts' state from 2 seconds ago
persons = await personRepository
.createQueryBuilder("person")
.innerJoinAndSelect("person.account", "account")
.timeTravelQuery(`'-2s'`)
.offset(2)
.limit(3)
.getMany()
persons.length.should.be.equal(3)
persons[0].account.balance.should.be.equal(300)
persons[1].account.balance.should.be.equal(400)
persons[2].account.balance.should.be.equal(500)
}),
))
})
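The behaviour these tests exercise comes down to CockroachDB's `AS OF SYSTEM TIME` clause, which is only valid on SELECT statements and must sit at the top level of the query. A minimal sketch of the clause construction — illustrative only, not TypeORM's actual query builder; treating `follower_read_timestamp()` as the no-argument default is an assumption based on the first test above:

```typescript
// Illustrative helper: append CockroachDB's AS OF SYSTEM TIME clause to a
// top-level SELECT. A string such as `'-2s'` is passed through verbatim,
// matching the timeTravelQuery(`'-2s'`) calls in the tests; `false` disables
// time travel entirely (useful during tests, per the PR notes).
function withTimeTravel(sql: string, timestamp?: string | false): string {
    if (timestamp === false) return sql
    // Only SELECT statements may carry the clause.
    if (!/^\s*SELECT\b/i.test(sql)) return sql
    const asOf = timestamp ?? "follower_read_timestamp()"
    return `${sql} AS OF SYSTEM TIME ${asOf}`
}

console.log(withTimeTravel("SELECT * FROM account", "'-2s'"))
// → SELECT * FROM account AS OF SYSTEM TIME '-2s'
```

Because the clause is only accepted at the top level, a builder that wraps joined selections in subqueries has to attach it to the outermost statement, which is what the "AS OF SYSTEM TIME must be in top level" fix mentioned in this PR addresses.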

View File

@@ -440,7 +440,7 @@ describe("repository > basic methods", () => {
it("should first create then update an entity", () =>
Promise.all(
connections.map(async (connection) => {
if (connection.driver.supportedUpsertType == null) return
if (!connection.driver.supportedUpsertTypes.length) return
const externalIdObjects = connection.getRepository(
ExternalIdPrimaryKeyEntity,
)
@@ -541,7 +541,7 @@ describe("repository > basic methods", () => {
it("should bulk upsert", () =>
Promise.all(
connections.map(async (connection) => {
if (connection.driver.supportedUpsertType == null) return
if (!connection.driver.supportedUpsertTypes.length) return
const externalIdObjects = connection.getRepository(
ExternalIdPrimaryKeyEntity,
@@ -589,7 +589,7 @@ describe("repository > basic methods", () => {
it("should not overwrite unspecified properties", () =>
Promise.all(
connections.map(async (connection) => {
if (connection.driver.supportedUpsertType == null) return
if (!connection.driver.supportedUpsertTypes.length) return
const postObjects = connection.getRepository(Post)
const externalId = "external-no-overwrite-unrelated"
@@ -688,7 +688,7 @@ describe("repository > basic methods", () => {
it("should upsert with embedded columns", () =>
Promise.all(
connections.map(async (connection) => {
if (connection.driver.supportedUpsertType == null) return
if (!connection.driver.supportedUpsertTypes.length) return
const externalIdObjects = connection.getRepository(
ExternalIdPrimaryKeyEntity,
@@ -746,7 +746,7 @@ describe("repository > basic methods", () => {
it("should upsert on one-to-one relation", () =>
Promise.all(
connections.map(async (connection) => {
if (connection.driver.supportedUpsertType == null) return
if (!connection.driver.supportedUpsertTypes.length) return
const oneToOneRepository = connection.getRepository(
OneToOneRelationEntity,
@@ -784,7 +784,7 @@ describe("repository > basic methods", () => {
it("should bulk upsert with embedded columns", () =>
Promise.all(
connections.map(async (connection) => {
if (connection.driver.supportedUpsertType == null) return
if (!connection.driver.supportedUpsertTypes.length) return
const embeddedConstraintObjects =
connection.getRepository(EmbeddedUQEntity)
@@ -837,7 +837,7 @@ describe("repository > basic methods", () => {
it("should throw if using an unsupported driver", () =>
Promise.all(
connections.map(async (connection) => {
if (connection.driver.supportedUpsertType != null) return
if (connection.driver.supportedUpsertTypes.length) return
const postRepository = connection.getRepository(Post)
const externalId = "external-2"
@@ -851,9 +851,12 @@ describe("repository > basic methods", () => {
it("should throw if using indexPredicate with an unsupported driver", () =>
Promise.all(
connections.map(async (connection) => {
+ // does not throw for cockroachdb, just returns a result
+ if (connection.driver.options.type === "cockroachdb") return
if (
-     connection.driver.supportedUpsertType !==
-         "on-conflict-do-update"
+     !connection.driver.supportedUpsertTypes.includes(
+         "on-conflict-do-update",
+     )
)
return
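
The guards in the hunks above replace the old nullable `supportedUpsertType` with a `supportedUpsertTypes` array, so drivers can advertise more than one strategy. A minimal sketch of the two checks (the `DriverLike` interface and sample drivers are illustrative stand-ins, not TypeORM's actual declarations):

```typescript
// Possible upsert strategies; "primary-key" is the variant this PR
// introduces for CockroachDB (renamed from "upsert" per the commit log).
type UpsertType = "on-conflict-do-update" | "on-duplicate-key-update" | "primary-key"

interface DriverLike {
    supportedUpsertTypes: UpsertType[]
}

// Old check: `supportedUpsertType == null`; new check: empty array.
function supportsUpsert(driver: DriverLike): boolean {
    return driver.supportedUpsertTypes.length > 0
}

// indexPredicate only makes sense for ON CONFLICT ... DO UPDATE drivers.
function supportsIndexPredicate(driver: DriverLike): boolean {
    return driver.supportedUpsertTypes.includes("on-conflict-do-update")
}

const postgresLike: DriverLike = { supportedUpsertTypes: ["on-conflict-do-update"] }
const unsupportedDriver: DriverLike = { supportedUpsertTypes: [] }

console.log(supportsUpsert(postgresLike))
console.log(supportsUpsert(unsupportedDriver))
console.log(supportsIndexPredicate(postgresLike))
```

The array form lets a single driver (e.g. CockroachDB) pass the general upsert tests while still being skipped by the `indexPredicate` test, which is exactly the shape of the guards above.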


@@ -547,21 +547,6 @@ describe("repository > find options > locking", () => {
])
})
- if (connection.driver.options.type === "cockroachdb")
-     return connection.manager.transaction((entityManager) => {
-         return Promise.all([
-             entityManager
-                 .getRepository(PostWithVersion)
-                 .findOne({
-                     where: { id: 1 },
-                     lock: { mode: "pessimistic_read" },
-                 })
-                 .should.be.rejectedWith(
-                     LockNotSupportedOnGivenDriverError,
-                 ),
-         ])
-     })
- return
}),
))


@@ -0,0 +1,41 @@
import {
    Column,
    Entity,
    Geography,
    Geometry,
    Index,
    Point,
    PrimaryGeneratedColumn,
} from "../../../../../src"

@Entity()
export class Post {
    @PrimaryGeneratedColumn()
    id: number

    @Column("geometry", {
        nullable: true,
    })
    @Index({
        spatial: true,
    })
    geom: Geometry

    @Column("geometry", {
        nullable: true,
        spatialFeatureType: "Point",
    })
    pointWithoutSRID: Point

    @Column("geometry", {
        nullable: true,
        spatialFeatureType: "Point",
        srid: 4326,
    })
    point: Point

    @Column("geography", {
        nullable: true,
    })
    geog: Geography
}
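
The `spatialFeatureType` and `srid` options on the entity above correspond to a PostGIS-style type modifier such as `geometry(Point,4326)`, which the schema tests later assert against. A hypothetical helper showing how such a column type string can be composed (an illustration of the mapping, not TypeORM's actual normalization code):

```typescript
// Compose a PostGIS-style spatial column type string from the
// options used in the entity above. Illustrative only.
function buildSpatialType(
    base: "geometry" | "geography",
    spatialFeatureType?: string,
    srid?: number,
): string {
    if (!spatialFeatureType && srid == null) return base
    // PostGIS defaults the subtype to "Geometry" when only an SRID is given
    const feature = spatialFeatureType ?? "Geometry"
    return srid != null
        ? `${base}(${feature},${srid})`
        : `${base}(${feature})`
}

console.log(buildSpatialType("geometry", "Point", 4326)) // geometry(Point,4326)
console.log(buildSpatialType("geometry", "Point")) // geometry(Point)
console.log(buildSpatialType("geography")) // geography
```

So `point` above maps to `geometry(Point,4326)`, `pointWithoutSRID` to `geometry(Point)`, and `geom`/`geog` to the bare base types.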


@@ -0,0 +1,249 @@
import "reflect-metadata"
import { expect } from "chai"
import { DataSource, Point } from "../../../../src/index"
import {
closeTestingConnections,
createTestingConnections,
reloadTestingDatabases,
} from "../../../utils/test-utils"
import { Post } from "./entity/Post"
describe("spatial-cockroachdb", () => {
let connections: DataSource[]
before(async () => {
connections = await createTestingConnections({
entities: [__dirname + "/entity/*{.js,.ts}"],
enabledDrivers: ["cockroachdb"],
})
})
beforeEach(async () => {
try {
await reloadTestingDatabases(connections)
} catch (err) {
console.warn(err.stack)
throw err
}
})
after(async () => {
try {
await closeTestingConnections(connections)
} catch (err) {
console.warn(err.stack)
throw err
}
})
it("should create correct schema with geometry type", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
const schema = await queryRunner.getTable("post")
await queryRunner.release()
expect(schema).not.to.be.undefined
const pointColumn = schema!.columns.find(
(tableColumn) =>
tableColumn.name === "point" &&
tableColumn.type === "geometry",
)
expect(pointColumn).to.not.be.undefined
expect(pointColumn!.spatialFeatureType!.toLowerCase()).to.equal(
"point",
)
expect(pointColumn!.srid).to.equal(4326)
}),
))
it("should create correct schema with geography type", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
const schema = await queryRunner.getTable("post")
await queryRunner.release()
expect(schema).not.to.be.undefined
expect(
schema!.columns.find(
(tableColumn) =>
tableColumn.name === "geog" &&
tableColumn.type === "geography",
),
).to.not.be.undefined
}),
))
it("should create correct schema with geometry indices", () =>
Promise.all(
connections.map(async (connection) => {
const queryRunner = connection.createQueryRunner()
const schema = await queryRunner.getTable("post")
await queryRunner.release()
expect(schema).not.to.be.undefined
expect(
schema!.indices.find(
(tableIndex) =>
tableIndex.isSpatial === true &&
tableIndex.columnNames.length === 1 &&
tableIndex.columnNames[0] === "geom",
),
).to.not.be.undefined
}),
))
it("should persist geometry correctly", () =>
Promise.all(
connections.map(async (connection) => {
const geom: Point = {
type: "Point",
coordinates: [0, 0],
}
const recordRepo = connection.getRepository(Post)
const post = new Post()
post.geom = geom
const persistedPost = await recordRepo.save(post)
const foundPost = await recordRepo.findOne({
where: {
id: persistedPost.id,
},
})
expect(foundPost).to.exist
expect(foundPost!.geom).to.deep.equal(geom)
}),
))
it("should persist geography correctly", () =>
Promise.all(
connections.map(async (connection) => {
const geom: Point = {
type: "Point",
coordinates: [0, 0],
}
const recordRepo = connection.getRepository(Post)
const post = new Post()
post.geog = geom
const persistedPost = await recordRepo.save(post)
const foundPost = await recordRepo.findOne({
where: {
id: persistedPost.id,
},
})
expect(foundPost).to.exist
expect(foundPost!.geog).to.deep.equal(geom)
}),
))
it("should update geometry correctly", () =>
Promise.all(
connections.map(async (connection) => {
const geom: Point = {
type: "Point",
coordinates: [0, 0],
}
const geom2: Point = {
type: "Point",
coordinates: [45, 45],
}
const recordRepo = connection.getRepository(Post)
const post = new Post()
post.geom = geom
const persistedPost = await recordRepo.save(post)
await recordRepo.update(
{
id: persistedPost.id,
},
{
geom: geom2,
},
)
const foundPost = await recordRepo.findOne({
where: {
id: persistedPost.id,
},
})
expect(foundPost).to.exist
expect(foundPost!.geom).to.deep.equal(geom2)
}),
))
it("should re-save geometry correctly", () =>
Promise.all(
connections.map(async (connection) => {
const geom: Point = {
type: "Point",
coordinates: [0, 0],
}
const geom2: Point = {
type: "Point",
coordinates: [45, 45],
}
const recordRepo = connection.getRepository(Post)
const post = new Post()
post.geom = geom
const persistedPost = await recordRepo.save(post)
persistedPost.geom = geom2
await recordRepo.save(persistedPost)
const foundPost = await recordRepo.findOne({
where: {
id: persistedPost.id,
},
})
expect(foundPost).to.exist
expect(foundPost!.geom).to.deep.equal(geom2)
}),
))
it("should be able to order geometries by distance", () =>
Promise.all(
connections.map(async (connection) => {
const geoJson1: Point = {
type: "Point",
coordinates: [139.9341032213472, 36.80798008559315],
}
const geoJson2: Point = {
type: "Point",
coordinates: [139.933053, 36.805711],
}
const origin: Point = {
type: "Point",
coordinates: [139.933227, 36.808005],
}
const post1 = new Post()
post1.geom = geoJson1
const post2 = new Post()
post2.geom = geoJson2
await connection.manager.save([post1, post2])
const posts1 = await connection.manager
.createQueryBuilder(Post, "post")
.where(
"ST_Distance(post.geom, ST_GeomFromGeoJSON(:origin)) > 0",
)
.orderBy({
"ST_Distance(post.geom, ST_GeomFromGeoJSON(:origin))": {
order: "ASC",
nulls: "NULLS FIRST",
},
})
.setParameters({ origin: JSON.stringify(origin) })
.getMany()
const posts2 = await connection.manager
.createQueryBuilder(Post, "post")
.orderBy(
"ST_Distance(post.geom, ST_GeomFromGeoJSON(:origin))",
"DESC",
)
.setParameters({ origin: JSON.stringify(origin) })
.getMany()
expect(posts1[0].id).to.be.equal(post1.id)
expect(posts2[0].id).to.be.equal(post2.id)
}),
))
})
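
The ordering test above relies on `post1.geom` being nearer the origin than `post2.geom`. A self-contained sketch that checks this with plain planar distance (a simplification: `ST_Distance` on `geography` columns uses geodesic math, but for points this close the ordering is the same):

```typescript
// GeoJSON Point shape as used in the tests above (TypeORM re-exports
// a compatible `Point` type).
interface GeoPoint {
    type: "Point"
    coordinates: number[]
}

// Planar (Euclidean) distance between two GeoJSON points; a stand-in
// for ST_Distance, sufficient to reason about the ordering assertions.
function planarDistance(a: GeoPoint, b: GeoPoint): number {
    const dx = a.coordinates[0] - b.coordinates[0]
    const dy = a.coordinates[1] - b.coordinates[1]
    return Math.hypot(dx, dy)
}

const origin: GeoPoint = { type: "Point", coordinates: [139.933227, 36.808005] }
const p1: GeoPoint = { type: "Point", coordinates: [139.9341032213472, 36.80798008559315] }
const p2: GeoPoint = { type: "Point", coordinates: [139.933053, 36.805711] }

// p1 is nearer the origin than p2, which is why the ASC query expects
// post1 first and the DESC query expects post2 first.
console.log(planarDistance(origin, p1) < planarDistance(origin, p2))
```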


@@ -1,7 +1,12 @@
- import { PrimaryGeneratedColumn } from "../../../../../src/decorator/columns/PrimaryGeneratedColumn"
- import { Entity } from "../../../../../src/decorator/entity/Entity"
- import { Column } from "../../../../../src/decorator/columns/Column"
- import { Index } from "../../../../../src/decorator/Index"
+ import {
+     Column,
+     Entity,
+     Geography,
+     Geometry,
+     Index,
+     Point,
+     PrimaryGeneratedColumn,
+ } from "../../../../../src"
@Entity()
export class Post {
@@ -14,23 +19,23 @@ export class Post {
@Index({
spatial: true,
})
- geom: object
+ geom: Geometry
@Column("geometry", {
nullable: true,
spatialFeatureType: "Point",
})
- pointWithoutSRID: object
+ pointWithoutSRID: Point
@Column("geometry", {
nullable: true,
spatialFeatureType: "Point",
srid: 4326,
})
- point: object
+ point: Point
@Column("geography", {
nullable: true,
})
- geog: object
+ geog: Geography
}


@@ -1,6 +1,6 @@
import "reflect-metadata"
import { expect } from "chai"
- import { DataSource } from "../../../../src/data-source/DataSource"
+ import { DataSource, Point } from "../../../../src"
import {
closeTestingConnections,
createTestingConnections,
@@ -91,7 +91,7 @@ describe("spatial-postgres", () => {
it("should persist geometry correctly", () =>
Promise.all(
connections.map(async (connection) => {
- const geom = {
+ const geom: Point = {
type: "Point",
coordinates: [0, 0],
}
@@ -112,7 +112,7 @@ describe("spatial-postgres", () => {
it("should persist geography correctly", () =>
Promise.all(
connections.map(async (connection) => {
- const geom = {
+ const geom: Point = {
type: "Point",
coordinates: [0, 0],
}
@@ -133,11 +133,11 @@ describe("spatial-postgres", () => {
it("should update geometry correctly", () =>
Promise.all(
connections.map(async (connection) => {
- const geom = {
+ const geom: Point = {
type: "Point",
coordinates: [0, 0],
}
- const geom2 = {
+ const geom2: Point = {
type: "Point",
coordinates: [45, 45],
}
@@ -168,11 +168,11 @@ describe("spatial-postgres", () => {
it("should re-save geometry correctly", () =>
Promise.all(
connections.map(async (connection) => {
- const geom = {
+ const geom: Point = {
type: "Point",
coordinates: [0, 0],
}
- const geom2 = {
+ const geom2: Point = {
type: "Point",
coordinates: [45, 45],
}
@@ -197,17 +197,17 @@ describe("spatial-postgres", () => {
it("should be able to order geometries by distance", () =>
Promise.all(
connections.map(async (connection) => {
- const geoJson1 = {
+ const geoJson1: Point = {
type: "Point",
coordinates: [139.9341032213472, 36.80798008559315],
}
- const geoJson2 = {
+ const geoJson2: Point = {
type: "Point",
coordinates: [139.933053, 36.805711],
}
- const origin = {
+ const origin: Point = {
type: "Point",
coordinates: [139.933227, 36.808005],
}