Mirror of https://github.com/typeorm/typeorm.git, synced 2025-12-08 21:26:23 +00:00
feat: Cloud Spanner support (#8730)
* working on Cloud Spanner driver implementation
* working on DDL synchronization
* working on DDL synchronization
* fixed failing test
* working on VIEW implementation
* fixed query parameters
* lint
* added transaction support; added streaming support
* fixed column types
* fixes after merge
* prettier
* added support for generated columns
* added escaping for distinct alias
* working on generated columns
* changed failing test
* updated tests for Spanner; bugfixes
* updated tests for Spanner; bugfixes
* updated tests for Spanner; bugfixes
* fixed failing test
* fixed failing test
* fixing failing tests
* fixing failing tests
* fixing failing tests
* added support for typeorm-generated uuid; fixed caching
* fixing failing tests
* fixing failing tests
* fixing failing tests
* fixing failing tests
* fixing failing tests
* fixing failing tests
* debugging failing test
* debugging failing test
* fixed bug in @PrimaryColumn decorator
* fixed failing tests
* fixed VIEW functionality; fixed failing tests
* updated docs
This commit is contained in: parent 4687be8b77, commit 62518ae122
1 .github/ISSUE_TEMPLATE/bug-report.md (vendored)

@@ -98,6 +98,7 @@ assignees: ''

| `postgres` | no |
| `react-native` | no |
| `sap` | no |
| `spanner` | no |
| `sqlite` | no |
| `sqlite-abstract` | no |
| `sqljs` | no |
1 .github/ISSUE_TEMPLATE/feature-request.md (vendored)

@@ -76,6 +76,7 @@ assignees: ''

| `postgres` | no |
| `react-native` | no |
| `sap` | no |
| `spanner` | no |
| `sqlite` | no |
| `sqlite-abstract` | no |
| `sqljs` | no |
37 README.md

@@ -214,12 +214,41 @@ await timber.remove()

-   for **SAP Hana**

    ```
    npm i @sap/hana-client
    npm i hdb-pool
    npm install @sap/hana-client
    npm install hdb-pool
    ```

    _SAP Hana support made possible by the sponsorship of [Neptune Software](https://www.neptune-software.com/)._

-   for **Google Cloud Spanner**

    ```
    npm install @google-cloud/spanner --save
    ```

    Provide authentication credentials to your application code
    by setting the environment variable `GOOGLE_APPLICATION_CREDENTIALS`:

    ```shell
    # Linux/macOS
    export GOOGLE_APPLICATION_CREDENTIALS="KEY_PATH"

    # Windows
    set GOOGLE_APPLICATION_CREDENTIALS=KEY_PATH

    # Replace KEY_PATH with the path of the JSON file that contains your service account key.
    ```

    To use Spanner with the emulator, set the `SPANNER_EMULATOR_HOST` environment variable:

    ```shell
    # Linux/macOS
    export SPANNER_EMULATOR_HOST=localhost:9010

    # Windows
    set SPANNER_EMULATOR_HOST=localhost:9010
    ```
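    A minimal data source configuration for Spanner could then look like the sketch below. This is an illustration, not part of the commit; the project, instance and database ids are placeholders, and the option names come from the `SpannerConnectionOptions` interface added here:

    ```typescript
    import { DataSource } from "typeorm"

    // Sketch only: the ids below are placeholders.
    const AppDataSource = new DataSource({
        type: "spanner",
        projectId: "my-project",
        instanceId: "my-instance",
        databaseId: "my-database",
        synchronize: true,
        entities: [],
    })

    AppDataSource.initialize()
        .then(() => console.log("Connected to Cloud Spanner"))
        .catch((error) => console.log(error))
    ```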
-   for **MongoDB** (experimental)

    `npm install mongodb@^3.6.0 --save`
@@ -255,7 +284,7 @@ npx typeorm init --name MyProject --database postgres

```

Where `name` is the name of your project and `database` is the database you'll use.
Database can be one of the following values: `mysql`, `mariadb`, `postgres`, `cockroachdb`, `sqlite`, `mssql`, `oracle`, `mongodb`,
Database can be one of the following values: `mysql`, `mariadb`, `postgres`, `cockroachdb`, `sqlite`, `mssql`, `sap`, `spanner`, `oracle`, `mongodb`,
`cordova`, `react-native`, `expo`, `nativescript`.

This command will generate a new project in the `MyProject` directory with the following files:

@@ -553,7 +582,7 @@ AppDataSource.initialize()

We are using Postgres in this example, but you can use any other supported database.
To use another database, simply change the `type` in the options to the database type you are using:
`mysql`, `mariadb`, `postgres`, `cockroachdb`, `sqlite`, `mssql`, `oracle`, `cordova`, `nativescript`, `react-native`,
`mysql`, `mariadb`, `postgres`, `cockroachdb`, `sqlite`, `mssql`, `oracle`, `sap`, `spanner`, `cordova`, `nativescript`, `react-native`,
`expo`, or `mongodb`.
Also make sure to use your own host, port, username, password and database settings.
@@ -79,6 +79,14 @@ services:
            - default
            - typeorm

    # google cloud spanner
    spanner:
        image: alexmesser/spanner-emulator
        container_name: "typeorm-spanner"
        ports:
            - "9010:9010"
            - "9020:9020"

    # sap hana (works only on linux)
    # hanaexpress:
    #     image: "store/saplabs/hanaexpress:2.00.040.00.20190729.1"
@@ -25,7 +25,7 @@ Different RDBMS-es have their own specific options.

-   `type` - RDBMS type. You must specify what database engine you use.
    Possible values are:
    "mysql", "postgres", "cockroachdb", "sap", "mariadb", "sqlite", "cordova", "react-native", "nativescript", "sqljs", "oracle", "mssql", "mongodb", "aurora-mysql", "aurora-postgres", "expo", "better-sqlite3", "capacitor".
    "mysql", "postgres", "cockroachdb", "sap", "spanner", "mariadb", "sqlite", "cordova", "react-native", "nativescript", "sqljs", "oracle", "mssql", "mongodb", "aurora-mysql", "aurora-postgres", "expo", "better-sqlite3", "capacitor".
    This option is **required**.

-   `extra` - Extra options to be passed to the underlying driver.
@@ -337,6 +337,10 @@ or

`timestamp with local time zone`, `interval year to month`, `interval day to second`, `bfile`, `blob`, `clob`,
`nclob`, `rowid`, `urowid`

### Column types for `spanner`

`bool`, `int64`, `float64`, `numeric`, `string`, `json`, `bytes`, `date`, `timestamp`, `array`
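As an illustration (not part of this commit), an entity using several of these Spanner types could look like the sketch below; the entity and column names are placeholders:

```typescript
import { Entity, PrimaryGeneratedColumn, Column } from "typeorm"

@Entity()
export class Account {
    // Spanner cannot generate ids itself; TypeORM generates the uuid on the client.
    @PrimaryGeneratedColumn("uuid")
    id: string

    @Column("string", { length: 255 })
    name: string

    @Column("int64")
    balance: number

    @Column("bool")
    closed: boolean

    @Column("timestamp", { nullable: true })
    closedAt: Date | null
}
```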
### `enum` column type

`enum` column type is supported by `postgres` and `mysql`. There are various possible column definitions:
17 package.json
@ -74,12 +74,15 @@
|
||||
"postgresql-orm",
|
||||
"mariadb",
|
||||
"mariadb-orm",
|
||||
"spanner",
|
||||
"sqlite",
|
||||
"sqlite-orm",
|
||||
"sql-server",
|
||||
"sql-server-orm",
|
||||
"oracle",
|
||||
"oracle-orm"
|
||||
"oracle-orm",
|
||||
"cloud-spanner",
|
||||
"cloud-spanner-orm"
|
||||
],
|
||||
"devDependencies": {
|
||||
"@types/app-root-path": "^1.2.4",
|
||||
@ -121,6 +124,7 @@
|
||||
"mysql2": "^2.2.5",
|
||||
"pg": "^8.5.1",
|
||||
"pg-query-stream": "^4.0.0",
|
||||
"prettier": "^2.5.1",
|
||||
"redis": "^3.1.1",
|
||||
"remap-istanbul": "^0.13.0",
|
||||
"rimraf": "^3.0.2",
|
||||
@ -131,10 +135,10 @@
|
||||
"sqlite3": "^5.0.2",
|
||||
"ts-node": "^10.7.0",
|
||||
"typeorm-aurora-data-api-driver": "^2.0.0",
|
||||
"typescript": "^4.6.2",
|
||||
"prettier": "^2.5.1"
|
||||
"typescript": "^4.6.2"
|
||||
},
|
||||
"peerDependencies": {
|
||||
"@google-cloud/spanner": "^5.18.0",
|
||||
"@sap/hana-client": "^2.11.14",
|
||||
"better-sqlite3": "^7.1.2",
|
||||
"hdb-pool": "^0.1.6",
|
||||
@ -153,6 +157,9 @@
|
||||
"typeorm-aurora-data-api-driver": "^2.0.0"
|
||||
},
|
||||
"peerDependenciesMeta": {
|
||||
"@google-cloud/spanner": {
|
||||
"optional": true
|
||||
},
|
||||
"@sap/hana-client": {
|
||||
"optional": true
|
||||
},
|
||||
@ -208,6 +215,7 @@
|
||||
"buffer": "^6.0.3",
|
||||
"chalk": "^4.1.0",
|
||||
"cli-highlight": "^2.1.11",
|
||||
"date-fns": "^2.28.0",
|
||||
"debug": "^4.3.3",
|
||||
"dotenv": "^16.0.0",
|
||||
"glob": "^7.2.0",
|
||||
@ -218,8 +226,7 @@
|
||||
"tslib": "^2.3.1",
|
||||
"uuid": "^8.3.2",
|
||||
"xml2js": "^0.4.23",
|
||||
"yargs": "^17.3.1",
|
||||
"date-fns": "^2.28.0"
|
||||
"yargs": "^17.3.1"
|
||||
},
|
||||
"scripts": {
|
||||
"test": "rimraf ./build && tsc && mocha --file ./build/compiled/test/utils/test-setup.js --bail --recursive --timeout 60000 ./build/compiled/test",
|
||||
|
||||
@@ -4,7 +4,7 @@ import { Generated } from "../../../src/decorator/Generated"

@Entity("sample01_post")
export class Post {
    @PrimaryColumn("integer")
    @PrimaryColumn()
    @Generated()
    id: number
14 src/cache/DbQueryResultCache.ts (vendored)

@@ -5,6 +5,7 @@ import { QueryRunner } from "../query-runner/QueryRunner"
import { Table } from "../schema-builder/table/Table"
import { QueryResultCache } from "./QueryResultCache"
import { QueryResultCacheOptions } from "./QueryResultCacheOptions"
import { v4 as uuidv4 } from "uuid"

/**
 * Caches query result into current database, into separate table called "query-result-cache".

@@ -80,7 +81,10 @@ export class DbQueryResultCache implements QueryResultCache {
                    type: driver.normalizeType({
                        type: driver.mappedDataTypes.cacheId,
                    }),
                    generationStrategy: "increment",
                    generationStrategy:
                        driver.options.type === "spanner"
                            ? "uuid"
                            : "increment",
                    isGenerated: true,
                },
                {

@@ -256,6 +260,14 @@ export class DbQueryResultCache implements QueryResultCache {

            await qb.execute()
        } else {
            // Spanner does not support auto-generated columns
            if (
                this.connection.driver.options.type === "spanner" &&
                !insertedValues.id
            ) {
                insertedValues.id = uuidv4()
            }

            // otherwise insert
            await queryRunner.manager
                .createQueryBuilder()
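// Illustration (not part of the commit): the pattern used above, written as a
// standalone helper. Because Spanner has no auto-increment, an id is generated
// in application code before the row is inserted. The helper is hypothetical.
import { v4 as uuidv4 } from "uuid"

function ensureCacheId(row: { id?: string }, databaseType: string): void {
    if (databaseType === "spanner" && !row.id) {
        row.id = uuidv4() // client-side uuid instead of a database-generated value
    }
}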
@@ -15,6 +15,7 @@ import { SapConnectionOptions } from "../driver/sap/SapConnectionOptions"
import { AuroraPostgresConnectionOptions } from "../driver/aurora-postgres/AuroraPostgresConnectionOptions"
import { BetterSqlite3ConnectionOptions } from "../driver/better-sqlite3/BetterSqlite3ConnectionOptions"
import { CapacitorConnectionOptions } from "../driver/capacitor/CapacitorConnectionOptions"
import { SpannerConnectionOptions } from "../driver/spanner/SpannerConnectionOptions"

/**
 * DataSourceOptions is an interface with settings and options for specific DataSource.

@@ -37,3 +38,4 @@ export type DataSourceOptions =
    | ExpoConnectionOptions
    | BetterSqlite3ConnectionOptions
    | CapacitorConnectionOptions
    | SpannerConnectionOptions
@@ -135,6 +135,7 @@ export function Index(
            unique: options && options.unique ? true : false,
            spatial: options && options.spatial ? true : false,
            fulltext: options && options.fulltext ? true : false,
            nullFiltered: options && options.nullFiltered ? true : false,
            parser: options ? options.parser : undefined,
            sparse: options && options.sparse ? true : false,
            background: options && options.background ? true : false,
@@ -41,8 +41,13 @@ export function PrimaryColumn(
    return function (object: Object, propertyName: string) {
        // normalize parameters
        let type: ColumnType | undefined
        if (typeof typeOrOptions === "string") {
            type = typeOrOptions
        if (
            typeof typeOrOptions === "string" ||
            typeOrOptions === String ||
            typeOrOptions === Boolean ||
            typeOrOptions === Number
        ) {
            type = typeOrOptions as ColumnType
        } else {
            options = Object.assign({}, <PrimaryColumnOptions>typeOrOptions)
        }
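// Illustration (not part of the commit): with the broadened check above, a
// constructor passed as the first argument is treated as the column type rather
// than being mistaken for an options object. The entity below is hypothetical.
import { Entity, PrimaryColumn } from "typeorm"

@Entity()
export class Country {
    // Resolves to the driver's string type (for example "string" on Spanner).
    @PrimaryColumn(String)
    code: string
}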
@@ -19,6 +19,15 @@ export interface IndexOptions {
     */
    fulltext?: boolean

    /**
     * NULL_FILTERED indexes are particularly useful for indexing sparse columns, where most rows contain a NULL value.
     * In these cases, the NULL_FILTERED index can be considerably smaller and more efficient to maintain than
     * a normal index that includes NULL values.
     *
     * Works only in Spanner.
     */
    nullFiltered?: boolean

    /**
     * Fulltext parser.
     * Works only in MySQL.
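// Illustration (not part of the commit): declaring the new option from a
// decorator. The entity and column names are hypothetical.
import { Column, Entity, Index, PrimaryColumn } from "typeorm"

@Entity()
// NULL_FILTERED index on a sparse, mostly-NULL column.
@Index(["externalRef"], { nullFiltered: true })
export class Order {
    @PrimaryColumn("string", { length: 36 })
    id: string

    @Column("string", { length: 64, nullable: true })
    externalRef: string | null
}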
@@ -18,6 +18,7 @@ import { DataSource } from "../data-source/DataSource"
import { SapDriver } from "./sap/SapDriver"
import { BetterSqlite3Driver } from "./better-sqlite3/BetterSqlite3Driver"
import { CapacitorDriver } from "./capacitor/CapacitorDriver"
import { SpannerDriver } from "./spanner/SpannerDriver"

/**
 * Helps to create drivers.

@@ -65,6 +66,8 @@ export class DriverFactory {
                return new AuroraPostgresDriver(connection)
            case "capacitor":
                return new CapacitorDriver(connection)
            case "spanner":
                return new SpannerDriver(connection)
            default:
                throw new MissingDriverError(type, [
                    "aurora-mysql",

@@ -85,6 +88,7 @@ export class DriverFactory {
                    "sap",
                    "sqlite",
                    "sqljs",
                    "spanner",
                ])
        }
    }
19 src/driver/spanner/SpannerConnectionCredentialsOptions.ts (new file)

@@ -0,0 +1,19 @@
/**
 * Spanner specific connection credential options.
 */
export interface SpannerConnectionCredentialsOptions {
    /**
     * Cloud Spanner instance id to connect to.
     */
    readonly instanceId?: string

    /**
     * Google Cloud project id.
     */
    readonly projectId?: string

    /**
     * Cloud Spanner database id.
     */
    readonly databaseId?: string
}
147 src/driver/spanner/SpannerConnectionOptions.ts (new file)
@ -0,0 +1,147 @@
|
||||
import { BaseConnectionOptions } from "../../connection/BaseConnectionOptions"
|
||||
import { SpannerConnectionCredentialsOptions } from "./SpannerConnectionCredentialsOptions"
|
||||
|
||||
/**
|
||||
* Spanner specific connection options.
|
||||
*/
|
||||
export interface SpannerConnectionOptions
|
||||
extends BaseConnectionOptions,
|
||||
SpannerConnectionCredentialsOptions {
|
||||
/**
|
||||
* Database type.
|
||||
*/
|
||||
readonly type: "spanner"
|
||||
|
||||
/**
|
||||
* The driver object
|
||||
* This defaults to require("@google-cloud/spanner").
|
||||
*/
|
||||
readonly driver?: any
|
||||
|
||||
// todo
|
||||
readonly database?: string
|
||||
|
||||
// todo
|
||||
readonly schema?: string
|
||||
|
||||
/**
|
||||
* The charset for the connection. This is called "collation" in the SQL-level of MySQL (like utf8_general_ci).
|
||||
* If a SQL-level charset is specified (like utf8mb4) then the default collation for that charset is used.
|
||||
* Default: 'UTF8_GENERAL_CI'
|
||||
*/
|
||||
readonly charset?: string
|
||||
|
||||
/**
|
||||
* The timezone configured on the MySQL server.
|
||||
* This is used to type cast server date/time values to JavaScript Date object and vice versa.
|
||||
* This can be 'local', 'Z', or an offset in the form +HH:MM or -HH:MM. (Default: 'local')
|
||||
*/
|
||||
readonly timezone?: string
|
||||
|
||||
/**
|
||||
* The milliseconds before a timeout occurs during the initial connection to the MySQL server. (Default: 10000)
|
||||
*/
|
||||
readonly connectTimeout?: number
|
||||
|
||||
/**
|
||||
* The milliseconds before a timeout occurs during the initial connection to the MySQL server. (Default: 10000)
|
||||
* This difference between connectTimeout and acquireTimeout is subtle and is described in the mysqljs/mysql docs
|
||||
* https://github.com/mysqljs/mysql/tree/master#pool-options
|
||||
*/
|
||||
readonly acquireTimeout?: number
|
||||
|
||||
/**
|
||||
* Allow connecting to MySQL instances that ask for the old (insecure) authentication method. (Default: false)
|
||||
*/
|
||||
readonly insecureAuth?: boolean
|
||||
|
||||
/**
|
||||
* When dealing with big numbers (BIGINT and DECIMAL columns) in the database, you should enable this option (Default: false)
|
||||
*/
|
||||
readonly supportBigNumbers?: boolean
|
||||
|
||||
/**
|
||||
* Enabling both supportBigNumbers and bigNumberStrings forces big numbers (BIGINT and DECIMAL columns) to be always
|
||||
* returned as JavaScript String objects (Default: false). Enabling supportBigNumbers but leaving bigNumberStrings
|
||||
* disabled will return big numbers as String objects only when they cannot be accurately represented with
|
||||
* [JavaScript Number objects](http://ecma262-5.com/ELS5_HTML.htm#Section_8.5) (which happens when they exceed the [-2^53, +2^53] range),
|
||||
* otherwise they will be returned as Number objects. This option is ignored if supportBigNumbers is disabled.
|
||||
*/
|
||||
readonly bigNumberStrings?: boolean
|
||||
|
||||
/**
|
||||
* Force date types (TIMESTAMP, DATETIME, DATE) to be returned as strings rather then inflated into JavaScript Date objects.
|
||||
* Can be true/false or an array of type names to keep as strings.
|
||||
*/
|
||||
readonly dateStrings?: boolean | string[]
|
||||
|
||||
/**
|
||||
* Prints protocol details to stdout. Can be true/false or an array of packet type names that should be printed.
|
||||
* (Default: false)
|
||||
*/
|
||||
readonly debug?: boolean | string[]
|
||||
|
||||
/**
|
||||
* Generates stack traces on Error to include call site of library entrance ("long stack traces").
|
||||
* Slight performance penalty for most calls. (Default: true)
|
||||
*/
|
||||
readonly trace?: boolean
|
||||
|
||||
/**
|
||||
* Allow multiple mysql statements per query. Be careful with this, it could increase the scope of SQL injection attacks.
|
||||
* (Default: false)
|
||||
*/
|
||||
readonly multipleStatements?: boolean
|
||||
|
||||
/**
|
||||
* Use spatial functions like GeomFromText and AsText which are removed in MySQL 8.
|
||||
* (Default: true)
|
||||
*/
|
||||
readonly legacySpatialSupport?: boolean
|
||||
|
||||
/**
|
||||
* List of connection flags to use other than the default ones. It is also possible to blacklist default ones.
|
||||
* For more information, check https://github.com/mysqljs/mysql#connection-flags.
|
||||
*/
|
||||
readonly flags?: string[]
|
||||
|
||||
/**
|
||||
* Replication setup.
|
||||
*/
|
||||
readonly replication?: {
|
||||
/**
|
||||
* Master server used by orm to perform writes.
|
||||
*/
|
||||
readonly master: SpannerConnectionCredentialsOptions
|
||||
|
||||
/**
|
||||
* List of read-from severs (slaves).
|
||||
*/
|
||||
readonly slaves: SpannerConnectionCredentialsOptions[]
|
||||
|
||||
/**
|
||||
* If true, PoolCluster will attempt to reconnect when connection fails. (Default: true)
|
||||
*/
|
||||
readonly canRetry?: boolean
|
||||
|
||||
/**
|
||||
* If connection fails, node's errorCount increases.
|
||||
* When errorCount is greater than removeNodeErrorCount, remove a node in the PoolCluster. (Default: 5)
|
||||
*/
|
||||
readonly removeNodeErrorCount?: number
|
||||
|
||||
/**
|
||||
* If connection fails, specifies the number of milliseconds before another connection attempt will be made.
|
||||
* If set to 0, then node will be removed instead and never re-used. (Default: 0)
|
||||
*/
|
||||
readonly restoreNodeTimeout?: number
|
||||
|
||||
/**
|
||||
* Determines how slaves are selected:
|
||||
* RR: Select one alternately (Round-Robin).
|
||||
* RANDOM: Select the node by random function.
|
||||
* ORDER: Select the first node available unconditionally.
|
||||
*/
|
||||
readonly selector?: "RR" | "RANDOM" | "ORDER"
|
||||
}
|
||||
}
|
||||
794 src/driver/spanner/SpannerDriver.ts (new file)
@ -0,0 +1,794 @@
|
||||
import { Driver, ReturningType } from "../Driver"
|
||||
import { DriverPackageNotInstalledError } from "../../error/DriverPackageNotInstalledError"
|
||||
import { SpannerQueryRunner } from "./SpannerQueryRunner"
|
||||
import { ObjectLiteral } from "../../common/ObjectLiteral"
|
||||
import { ColumnMetadata } from "../../metadata/ColumnMetadata"
|
||||
import { DateUtils } from "../../util/DateUtils"
|
||||
import { PlatformTools } from "../../platform/PlatformTools"
|
||||
import { Connection } from "../../connection/Connection"
|
||||
import { RdbmsSchemaBuilder } from "../../schema-builder/RdbmsSchemaBuilder"
|
||||
import { SpannerConnectionOptions } from "./SpannerConnectionOptions"
|
||||
import { MappedColumnTypes } from "../types/MappedColumnTypes"
|
||||
import { ColumnType } from "../types/ColumnTypes"
|
||||
import { DataTypeDefaults } from "../types/DataTypeDefaults"
|
||||
import { TableColumn } from "../../schema-builder/table/TableColumn"
|
||||
import { EntityMetadata } from "../../metadata/EntityMetadata"
|
||||
import { OrmUtils } from "../../util/OrmUtils"
|
||||
import { ApplyValueTransformers } from "../../util/ApplyValueTransformers"
|
||||
import { ReplicationMode } from "../types/ReplicationMode"
|
||||
import { Table } from "../../schema-builder/table/Table"
|
||||
import { View } from "../../schema-builder/view/View"
|
||||
import { TableForeignKey } from "../../schema-builder/table/TableForeignKey"
|
||||
import { CteCapabilities } from "../types/CteCapabilities"
|
||||
|
||||
/**
|
||||
* Organizes communication with Spanner DBMS.
|
||||
*/
|
||||
export class SpannerDriver implements Driver {
|
||||
// -------------------------------------------------------------------------
|
||||
// Public Properties
|
||||
// -------------------------------------------------------------------------
|
||||
|
||||
/**
|
||||
* Connection used by driver.
|
||||
*/
|
||||
connection: Connection
|
||||
|
||||
/**
|
||||
* Cloud Spanner underlying library.
|
||||
*/
|
||||
spanner: any
|
||||
|
||||
/**
|
||||
* Cloud Spanner instance.
|
||||
*/
|
||||
instance: any
|
||||
|
||||
/**
|
||||
* Cloud Spanner database.
|
||||
*/
|
||||
instanceDatabase: any
|
||||
|
||||
/**
|
||||
* Database name.
|
||||
*/
|
||||
database?: string
|
||||
|
||||
// -------------------------------------------------------------------------
|
||||
// Public Implemented Properties
|
||||
// -------------------------------------------------------------------------
|
||||
|
||||
/**
|
||||
* Connection options.
|
||||
*/
|
||||
options: SpannerConnectionOptions
|
||||
|
||||
/**
|
||||
* Indicates if replication is enabled.
|
||||
*/
|
||||
isReplicated: boolean = false
|
||||
|
||||
/**
|
||||
* Indicates if tree tables are supported by this driver.
|
||||
*/
|
||||
treeSupport = true
|
||||
|
||||
/**
|
||||
* Represent transaction support by this driver
|
||||
*/
|
||||
transactionSupport = "none" as const
|
||||
|
||||
/**
|
||||
* Gets list of supported column data types by a driver.
|
||||
*
|
||||
* @see https://cloud.google.com/spanner/docs/reference/standard-sql/data-types
|
||||
*/
|
||||
supportedDataTypes: ColumnType[] = [
|
||||
"bool",
|
||||
"int64",
|
||||
"float64",
|
||||
"numeric",
|
||||
"string",
|
||||
"json",
|
||||
"bytes",
|
||||
"date",
|
||||
"timestamp",
|
||||
"array",
|
||||
]
|
||||
|
||||
/**
|
||||
* Returns type of upsert supported by driver if any
|
||||
*/
|
||||
readonly supportedUpsertType = undefined
|
||||
|
||||
/**
|
||||
* Gets list of spatial column data types.
|
||||
*/
|
||||
spatialTypes: ColumnType[] = []
|
||||
|
||||
/**
|
||||
* Gets list of column data types that support length by a driver.
|
||||
*/
|
||||
withLengthColumnTypes: ColumnType[] = ["string", "bytes"]
|
||||
|
||||
/**
|
||||
* Gets list of column data types that support length by a driver.
|
||||
*/
|
||||
withWidthColumnTypes: ColumnType[] = []
|
||||
|
||||
/**
|
||||
* Gets list of column data types that support precision by a driver.
|
||||
*/
|
||||
withPrecisionColumnTypes: ColumnType[] = []
|
||||
|
||||
/**
|
||||
* Gets list of column data types that supports scale by a driver.
|
||||
*/
|
||||
withScaleColumnTypes: ColumnType[] = []
|
||||
|
||||
/**
|
||||
* ORM has special columns and we need to know what database column types should be for those columns.
|
||||
* Column types are driver dependant.
|
||||
*/
|
||||
mappedDataTypes: MappedColumnTypes = {
|
||||
createDate: "timestamp",
|
||||
createDateDefault: "",
|
||||
updateDate: "timestamp",
|
||||
updateDateDefault: "",
|
||||
deleteDate: "timestamp",
|
||||
deleteDateNullable: true,
|
||||
version: "int64",
|
||||
treeLevel: "int64",
|
||||
migrationId: "int64",
|
||||
migrationName: "string",
|
||||
migrationTimestamp: "int64",
|
||||
cacheId: "string",
|
||||
cacheIdentifier: "string",
|
||||
cacheTime: "int64",
|
||||
cacheDuration: "int64",
|
||||
cacheQuery: "string",
|
||||
cacheResult: "string",
|
||||
metadataType: "string",
|
||||
metadataDatabase: "string",
|
||||
metadataSchema: "string",
|
||||
metadataTable: "string",
|
||||
metadataName: "string",
|
||||
metadataValue: "string",
|
||||
}
|
||||
|
||||
/**
|
||||
* Default values of length, precision and scale depends on column data type.
|
||||
* Used in the cases when length/precision/scale is not specified by user.
|
||||
*/
|
||||
dataTypeDefaults: DataTypeDefaults = {}
|
||||
|
||||
/**
|
||||
* Max length allowed by MySQL for aliases.
|
||||
* @see https://dev.mysql.com/doc/refman/5.5/en/identifiers.html
|
||||
*/
|
||||
maxAliasLength = 63
|
||||
|
||||
cteCapabilities: CteCapabilities = {
|
||||
enabled: true,
|
||||
}
|
||||
|
||||
/**
|
||||
* Supported returning types
|
||||
*/
|
||||
private readonly _isReturningSqlSupported: Record<ReturningType, boolean> =
|
||||
{
|
||||
delete: false,
|
||||
insert: false,
|
||||
update: false,
|
||||
}
|
||||
|
||||
// -------------------------------------------------------------------------
|
||||
// Constructor
|
||||
// -------------------------------------------------------------------------
|
||||
|
||||
constructor(connection: Connection) {
|
||||
this.connection = connection
|
||||
this.options = connection.options as SpannerConnectionOptions
|
||||
this.isReplicated = this.options.replication ? true : false
|
||||
|
||||
// load mysql package
|
||||
this.loadDependencies()
|
||||
}
|
||||
|
||||
// -------------------------------------------------------------------------
|
||||
// Public Methods
|
||||
// -------------------------------------------------------------------------
|
||||
|
||||
/**
|
||||
* Performs connection to the database.
|
||||
*/
|
||||
async connect(): Promise<void> {
|
||||
this.instance = this.spanner.instance(this.options.instanceId)
|
||||
this.instanceDatabase = this.instance.database(this.options.databaseId)
|
||||
}
|
||||
|
||||
/**
|
||||
* Makes any action after connection (e.g. create extensions in Postgres driver).
|
||||
*/
|
||||
afterConnect(): Promise<void> {
|
||||
return Promise.resolve()
|
||||
}
|
||||
|
||||
/**
|
||||
* Closes connection with the database.
|
||||
*/
|
||||
async disconnect(): Promise<void> {
|
||||
this.instanceDatabase.close()
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a schema builder used to build and sync a schema.
|
||||
*/
|
||||
createSchemaBuilder() {
|
||||
return new RdbmsSchemaBuilder(this.connection)
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a query runner used to execute database queries.
|
||||
*/
|
||||
createQueryRunner(mode: ReplicationMode) {
|
||||
return new SpannerQueryRunner(this, mode)
|
||||
}
|
||||
|
||||
/**
|
||||
* Replaces parameters in the given sql with special escaping character
|
||||
* and an array of parameter names to be passed to a query.
|
||||
*/
|
||||
escapeQueryWithParameters(
|
||||
sql: string,
|
||||
parameters: ObjectLiteral,
|
||||
nativeParameters: ObjectLiteral,
|
||||
): [string, any[]] {
|
||||
const escapedParameters: any[] = Object.keys(nativeParameters).map(
|
||||
(key) => nativeParameters[key],
|
||||
)
|
||||
if (!parameters || !Object.keys(parameters).length)
|
||||
return [sql, escapedParameters]
|
||||
|
||||
sql = sql.replace(
|
||||
/:(\.\.\.)?([A-Za-z0-9_.]+)/g,
|
||||
(full, isArray: string, key: string): string => {
|
||||
if (!parameters.hasOwnProperty(key)) {
|
||||
return full
|
||||
}
|
||||
|
||||
let value: any = parameters[key]
|
||||
|
||||
if (value === null) {
|
||||
return full
|
||||
}
|
||||
|
||||
if (isArray) {
|
||||
return value
|
||||
.map((v: any) => {
|
||||
escapedParameters.push(v)
|
||||
return this.createParameter(
|
||||
key,
|
||||
escapedParameters.length - 1,
|
||||
)
|
||||
})
|
||||
.join(", ")
|
||||
}
|
||||
|
||||
if (value instanceof Function) {
|
||||
return value()
|
||||
}
|
||||
escapedParameters.push(value)
|
||||
return this.createParameter(key, escapedParameters.length - 1)
|
||||
},
|
||||
) // todo: make replace only in value statements, otherwise problems
|
||||
|
||||
sql = sql.replace(
|
||||
/([ ]+)?=([ ]+)?:(\.\.\.)?([A-Za-z0-9_.]+)/g,
|
||||
(
|
||||
full,
|
||||
emptySpaceBefore: string,
|
||||
emptySpaceAfter: string,
|
||||
isArray: string,
|
||||
key: string,
|
||||
): string => {
|
||||
if (!parameters.hasOwnProperty(key)) {
|
||||
return full
|
||||
}
|
||||
|
||||
let value: any = parameters[key]
|
||||
if (value === null) {
|
||||
return " IS NULL"
|
||||
}
|
||||
|
||||
return full
|
||||
},
|
||||
)
|
||||
return [sql, escapedParameters]
|
||||
}
|
||||
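// Illustration (not part of the commit) of what the rewriting above produces.
// Given   sql        = "SELECT * FROM user WHERE id = :id AND status IN (:...statuses)"
// and     parameters = { id: 1, statuses: ["a", "b"] }
// the result is roughly
//         "SELECT * FROM user WHERE id = @param0 AND status IN (@param1, @param2)"
// with escapedParameters = [1, "a", "b"], since createParameter() below returns
// Spanner-style positional placeholders of the form "@param" + index.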
|
||||
/**
|
||||
* Escapes a column name.
|
||||
*/
|
||||
escape(columnName: string): string {
|
||||
return `\`${columnName}\``
|
||||
}
|
||||
|
||||
/**
|
||||
* Build full table name with database name, schema name and table name.
|
||||
* E.g. myDB.mySchema.myTable
|
||||
*/
|
||||
buildTableName(
|
||||
tableName: string,
|
||||
schema?: string,
|
||||
database?: string,
|
||||
): string {
|
||||
let tablePath = [tableName]
|
||||
|
||||
if (database) {
|
||||
tablePath.unshift(database)
|
||||
}
|
||||
|
||||
return tablePath.join(".")
|
||||
}
|
||||
|
||||
/**
|
||||
* Parse a target table name or other types and return a normalized table definition.
|
||||
*/
|
||||
parseTableName(
|
||||
target: EntityMetadata | Table | View | TableForeignKey | string,
|
||||
): { database?: string; schema?: string; tableName: string } {
|
||||
const driverDatabase = this.database
|
||||
const driverSchema = undefined
|
||||
|
||||
if (target instanceof Table || target instanceof View) {
|
||||
const parsed = this.parseTableName(target.name)
|
||||
|
||||
return {
|
||||
database: target.database || parsed.database || driverDatabase,
|
||||
schema: target.schema || parsed.schema || driverSchema,
|
||||
tableName: parsed.tableName,
|
||||
}
|
||||
}
|
||||
|
||||
if (target instanceof TableForeignKey) {
|
||||
const parsed = this.parseTableName(target.referencedTableName)
|
||||
|
||||
return {
|
||||
database:
|
||||
target.referencedDatabase ||
|
||||
parsed.database ||
|
||||
driverDatabase,
|
||||
schema:
|
||||
target.referencedSchema || parsed.schema || driverSchema,
|
||||
tableName: parsed.tableName,
|
||||
}
|
||||
}
|
||||
|
||||
if (target instanceof EntityMetadata) {
|
||||
// EntityMetadata tableName is never a path
|
||||
|
||||
return {
|
||||
database: target.database || driverDatabase,
|
||||
schema: target.schema || driverSchema,
|
||||
tableName: target.tableName,
|
||||
}
|
||||
}
|
||||
|
||||
const parts = target.split(".")
|
||||
|
||||
return {
|
||||
database:
|
||||
(parts.length > 1 ? parts[0] : undefined) || driverDatabase,
|
||||
schema: driverSchema,
|
||||
tableName: parts.length > 1 ? parts[1] : parts[0],
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Prepares given value to a value to be persisted, based on its column type and metadata.
|
||||
*/
|
||||
preparePersistentValue(value: any, columnMetadata: ColumnMetadata): any {
|
||||
if (columnMetadata.transformer)
|
||||
value = ApplyValueTransformers.transformTo(
|
||||
columnMetadata.transformer,
|
||||
value,
|
||||
)
|
||||
|
||||
if (value === null || value === undefined) return value
|
||||
|
||||
if (columnMetadata.type === "numeric") {
|
||||
const lib = this.options.driver || PlatformTools.load("spanner")
|
||||
return lib.Spanner.numeric(value)
|
||||
} else if (columnMetadata.type === "date") {
|
||||
return DateUtils.mixedDateToDateString(value)
|
||||
} else if (columnMetadata.type === "json") {
|
||||
return value
|
||||
} else if (
|
||||
columnMetadata.type === "timestamp" ||
|
||||
columnMetadata.type === Date
|
||||
) {
|
||||
return DateUtils.mixedDateToDate(value)
|
||||
}
|
||||
|
||||
return value
|
||||
}
|
||||
|
||||
/**
|
||||
* Prepares given value to a value to be persisted, based on its column type or metadata.
|
||||
*/
|
||||
prepareHydratedValue(value: any, columnMetadata: ColumnMetadata): any {
|
||||
if (value === null || value === undefined)
|
||||
return columnMetadata.transformer
|
||||
? ApplyValueTransformers.transformFrom(
|
||||
columnMetadata.transformer,
|
||||
value,
|
||||
)
|
||||
: value
|
||||
|
||||
if (columnMetadata.type === Boolean || columnMetadata.type === "bool") {
|
||||
value = value ? true : false
|
||||
} else if (
|
||||
columnMetadata.type === "timestamp" ||
|
||||
columnMetadata.type === Date
|
||||
) {
|
||||
value = new Date(value)
|
||||
} else if (columnMetadata.type === "numeric") {
|
||||
value = value.value
|
||||
} else if (columnMetadata.type === "date") {
|
||||
value = DateUtils.mixedDateToDateString(value)
|
||||
} else if (columnMetadata.type === "json") {
|
||||
value = typeof value === "string" ? JSON.parse(value) : value
|
||||
}
|
||||
|
||||
if (columnMetadata.transformer)
|
||||
value = ApplyValueTransformers.transformFrom(
|
||||
columnMetadata.transformer,
|
||||
value,
|
||||
)
|
||||
|
||||
return value
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates a database type from a given column metadata.
|
||||
*/
|
||||
normalizeType(column: {
|
||||
type: ColumnType
|
||||
length?: number | string
|
||||
precision?: number | null
|
||||
scale?: number
|
||||
}): string {
|
||||
if (column.type === Number) {
|
||||
return "int64"
|
||||
} else if (column.type === String || column.type === "uuid") {
|
||||
return "string"
|
||||
} else if (column.type === Date) {
|
||||
return "timestamp"
|
||||
} else if ((column.type as any) === Buffer) {
|
||||
return "bytes"
|
||||
} else if (column.type === Boolean) {
|
||||
return "bool"
|
||||
} else {
|
||||
return (column.type as string) || ""
|
||||
}
|
||||
}
|
||||
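// Illustration (not part of the commit) of how JS constructors map to Spanner types:
//   normalizeType({ type: Number })  -> "int64"
//   normalizeType({ type: String })  -> "string"   (also used for "uuid")
//   normalizeType({ type: Date })    -> "timestamp"
//   normalizeType({ type: Boolean }) -> "bool"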
|
||||
/**
|
||||
* Normalizes "default" value of the column.
|
||||
*
|
||||
* Spanner does not support default values.
|
||||
*/
|
||||
normalizeDefault(columnMetadata: ColumnMetadata): string | undefined {
|
||||
return columnMetadata.default === ""
|
||||
? `"${columnMetadata.default}"`
|
||||
: `${columnMetadata.default}`
|
||||
}
|
||||
|
||||
/**
|
||||
* Normalizes "isUnique" value of the column.
|
||||
*/
|
||||
normalizeIsUnique(column: ColumnMetadata): boolean {
|
||||
return column.entityMetadata.indices.some(
|
||||
(idx) =>
|
||||
idx.isUnique &&
|
||||
idx.columns.length === 1 &&
|
||||
idx.columns[0] === column,
|
||||
)
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns default column lengths, which is required on column creation.
|
||||
*/
|
||||
getColumnLength(column: ColumnMetadata | TableColumn): string {
|
||||
if (column.length) return column.length.toString()
|
||||
if (column.generationStrategy === "uuid") return "36"
|
||||
|
||||
switch (column.type) {
|
||||
case String:
|
||||
case "string":
|
||||
case "bytes":
|
||||
return "max"
|
||||
default:
|
||||
return ""
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates column type definition including length, precision and scale
|
||||
*/
|
||||
createFullType(column: TableColumn): string {
|
||||
let type = column.type
|
||||
|
||||
// used 'getColumnLength()' method, because Spanner requires column length for `string` and `bytes` data types
|
||||
if (this.getColumnLength(column)) {
|
||||
type += `(${this.getColumnLength(column)})`
|
||||
} else if (column.width) {
|
||||
type += `(${column.width})`
|
||||
} else if (
|
||||
column.precision !== null &&
|
||||
column.precision !== undefined &&
|
||||
column.scale !== null &&
|
||||
column.scale !== undefined
|
||||
) {
|
||||
type += `(${column.precision},${column.scale})`
|
||||
} else if (
|
||||
column.precision !== null &&
|
||||
column.precision !== undefined
|
||||
) {
|
||||
type += `(${column.precision})`
|
||||
}
|
||||
|
||||
if (column.isArray) type = `array<${type}>`
|
||||
|
||||
return type
|
||||
}
|
||||
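// Illustration (not part of the commit) of the full types this builds:
//   a plain "string" column with no explicit length -> "string(max)"
//   a "uuid"-generated primary key                   -> "string(36)"
//   an "int64" column with isArray set               -> "array<int64>"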
|
||||
/**
|
||||
* Obtains a new database connection to a master server.
|
||||
* Used for replication.
|
||||
* If replication is not setup then returns default connection's database connection.
|
||||
*/
|
||||
obtainMasterConnection(): Promise<any> {
|
||||
return this.instanceDatabase
|
||||
}
|
||||
|
||||
/**
|
||||
* Obtains a new database connection to a slave server.
|
||||
* Used for replication.
|
||||
* If replication is not setup then returns master (default) connection's database connection.
|
||||
*/
|
||||
obtainSlaveConnection(): Promise<any> {
|
||||
return this.instanceDatabase
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates generated map of values generated or returned by database after INSERT query.
|
||||
*/
|
||||
createGeneratedMap(
|
||||
metadata: EntityMetadata,
|
||||
insertResult: any,
|
||||
entityIndex: number,
|
||||
) {
|
||||
if (!insertResult) {
|
||||
return undefined
|
||||
}
|
||||
|
||||
if (insertResult.insertId === undefined) {
|
||||
return Object.keys(insertResult).reduce((map, key) => {
|
||||
const column = metadata.findColumnWithDatabaseName(key)
|
||||
if (column) {
|
||||
OrmUtils.mergeDeep(
|
||||
map,
|
||||
column.createValueMap(insertResult[key]),
|
||||
)
|
||||
// OrmUtils.mergeDeep(map, column.createValueMap(this.prepareHydratedValue(insertResult[key], column))); // TODO: probably should be like there, but fails on enums, fix later
|
||||
}
|
||||
return map
|
||||
}, {} as ObjectLiteral)
|
||||
}
|
||||
|
||||
const generatedMap = metadata.generatedColumns.reduce(
|
||||
(map, generatedColumn) => {
|
||||
let value: any
|
||||
if (
|
||||
generatedColumn.generationStrategy === "increment" &&
|
||||
insertResult.insertId
|
||||
) {
|
||||
// NOTE: When multiple rows is inserted by a single INSERT statement,
|
||||
// `insertId` is the value generated for the first inserted row only.
|
||||
value = insertResult.insertId + entityIndex
|
||||
// } else if (generatedColumn.generationStrategy === "uuid") {
|
||||
// console.log("getting db value:", generatedColumn.databaseName);
|
||||
// value = generatedColumn.getEntityValue(uuidMap);
|
||||
}
|
||||
|
||||
return OrmUtils.mergeDeep(
|
||||
map,
|
||||
generatedColumn.createValueMap(value),
|
||||
)
|
||||
},
|
||||
{} as ObjectLiteral,
|
||||
)
|
||||
|
||||
return Object.keys(generatedMap).length > 0 ? generatedMap : undefined
|
||||
}
|
||||
|
||||
/**
|
||||
* Differentiate columns of this table and columns from the given column metadatas columns
|
||||
* and returns only changed.
|
||||
*/
|
||||
findChangedColumns(
|
||||
tableColumns: TableColumn[],
|
||||
columnMetadatas: ColumnMetadata[],
|
||||
): ColumnMetadata[] {
|
||||
return columnMetadatas.filter((columnMetadata) => {
|
||||
const tableColumn = tableColumns.find(
|
||||
(c) => c.name === columnMetadata.databaseName,
|
||||
)
|
||||
if (!tableColumn) return false // we don't need new columns, we only need exist and changed
|
||||
|
||||
const isColumnChanged =
|
||||
tableColumn.name !== columnMetadata.databaseName ||
|
||||
tableColumn.type !== this.normalizeType(columnMetadata) ||
|
||||
tableColumn.length !== this.getColumnLength(columnMetadata) ||
|
||||
tableColumn.asExpression !== columnMetadata.asExpression ||
|
||||
tableColumn.generatedType !== columnMetadata.generatedType ||
|
||||
tableColumn.isPrimary !== columnMetadata.isPrimary ||
|
||||
tableColumn.isNullable !== columnMetadata.isNullable ||
|
||||
tableColumn.isUnique !== this.normalizeIsUnique(columnMetadata)
|
||||
|
||||
// DEBUG SECTION
|
||||
if (isColumnChanged) {
|
||||
console.log("table:", columnMetadata.entityMetadata.tableName)
|
||||
console.log(
|
||||
"name:",
|
||||
tableColumn.name,
|
||||
columnMetadata.databaseName,
|
||||
)
|
||||
console.log(
|
||||
"type:",
|
||||
tableColumn.type,
|
||||
this.normalizeType(columnMetadata),
|
||||
)
|
||||
console.log(
|
||||
"length:",
|
||||
tableColumn.length,
|
||||
this.getColumnLength(columnMetadata),
|
||||
)
|
||||
console.log(
|
||||
"asExpression:",
|
||||
tableColumn.asExpression,
|
||||
columnMetadata.asExpression,
|
||||
)
|
||||
console.log(
|
||||
"generatedType:",
|
||||
tableColumn.generatedType,
|
||||
columnMetadata.generatedType,
|
||||
)
|
||||
console.log(
|
||||
"isPrimary:",
|
||||
tableColumn.isPrimary,
|
||||
columnMetadata.isPrimary,
|
||||
)
|
||||
console.log(
|
||||
"isNullable:",
|
||||
tableColumn.isNullable,
|
||||
columnMetadata.isNullable,
|
||||
)
|
||||
console.log(
|
||||
"isUnique:",
|
||||
tableColumn.isUnique,
|
||||
this.normalizeIsUnique(columnMetadata),
|
||||
)
|
||||
console.log("==========================================")
|
||||
}
|
||||
|
||||
return isColumnChanged
|
||||
})
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns true if driver supports RETURNING / OUTPUT statement.
|
||||
*/
|
||||
isReturningSqlSupported(returningType: ReturningType): boolean {
|
||||
return this._isReturningSqlSupported[returningType]
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns true if driver supports uuid values generation on its own.
|
||||
*/
|
||||
isUUIDGenerationSupported(): boolean {
|
||||
return false
|
||||
}
|
||||
|
||||
/**
|
||||
* Returns true if driver supports fulltext indices.
|
||||
*/
|
||||
isFullTextColumnTypeSupported(): boolean {
|
||||
return false
|
||||
}
|
||||
|
||||
/**
|
||||
* Creates an escaped parameter.
|
||||
*/
|
||||
createParameter(parameterName: string, index: number): string {
|
||||
return "@param" + index
|
||||
}
|
||||
|
||||
// -------------------------------------------------------------------------
|
||||
// Protected Methods
|
||||
// -------------------------------------------------------------------------
|
||||
|
||||
/**
|
||||
* Loads all driver dependencies.
|
||||
*/
|
||||
protected loadDependencies(): void {
|
||||
try {
|
||||
const lib = this.options.driver || PlatformTools.load("spanner")
|
||||
this.spanner = new lib.Spanner({
|
||||
projectId: this.options.projectId,
|
||||
})
|
||||
} catch (e) {
|
||||
console.error(e)
|
||||
throw new DriverPackageNotInstalledError(
|
||||
"Spanner",
|
||||
"@google-cloud/spanner",
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Checks if "DEFAULT" values in the column metadata and in the database are equal.
|
||||
*/
|
||||
protected compareDefaultValues(
|
||||
columnMetadataValue: string | undefined,
|
||||
databaseValue: string | undefined,
|
||||
): boolean {
|
||||
if (
|
||||
typeof columnMetadataValue === "string" &&
|
||||
typeof databaseValue === "string"
|
||||
) {
|
||||
// we need to cut out "'" because in mysql we can understand returned value is a string or a function
|
||||
// as result compare cannot understand if default is really changed or not
|
||||
columnMetadataValue = columnMetadataValue.replace(/^'+|'+$/g, "")
|
||||
databaseValue = databaseValue.replace(/^'+|'+$/g, "")
|
||||
}
|
||||
|
||||
return columnMetadataValue === databaseValue
|
||||
}
|
||||
|
||||
/**
|
||||
* If parameter is a datetime function, e.g. "CURRENT_TIMESTAMP", normalizes it.
|
||||
* Otherwise returns original input.
|
||||
*/
|
||||
protected normalizeDatetimeFunction(value?: string) {
|
||||
if (!value) return value
|
||||
|
||||
// check if input is datetime function
|
||||
const isDatetimeFunction =
|
||||
value.toUpperCase().indexOf("CURRENT_TIMESTAMP") !== -1 ||
|
||||
value.toUpperCase().indexOf("NOW") !== -1
|
||||
|
||||
if (isDatetimeFunction) {
|
||||
// extract precision, e.g. "(3)"
|
||||
const precision = value.match(/\(\d+\)/)
|
||||
return precision
|
||||
? `CURRENT_TIMESTAMP${precision[0]}`
|
||||
: "CURRENT_TIMESTAMP"
|
||||
} else {
|
||||
return value
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Escapes a given comment.
|
||||
*/
|
||||
protected escapeComment(comment?: string) {
|
||||
if (!comment) return comment
|
||||
|
||||
comment = comment.replace(/\u0000/g, "") // Null bytes aren't allowed in comments
|
||||
|
||||
return comment
|
||||
}
|
||||
}
|
||||
2103 src/driver/spanner/SpannerQueryRunner.ts (new file). File diff suppressed because it is too large.
@ -15,7 +15,7 @@ export type PrimaryGeneratedColumnType =
|
||||
| "decimal" // mysql, postgres, mssql, sqlite, sap
|
||||
| "smalldecimal" // sap
|
||||
| "fixed" // mysql
|
||||
| "numeric" // postgres, mssql, sqlite
|
||||
| "numeric" // postgres, mssql, sqlite, spanner
|
||||
| "number" // oracle
|
||||
|
||||
/**
|
||||
@ -47,7 +47,7 @@ export type WithPrecisionColumnType =
|
||||
| "time" // mysql, postgres, mssql, cockroachdb
|
||||
| "time with time zone" // postgres, cockroachdb
|
||||
| "time without time zone" // postgres
|
||||
| "timestamp" // mysql, postgres, mssql, oracle, cockroachdb
|
||||
| "timestamp" // mysql, postgres, mssql, oracle, cockroachdb, spanner
|
||||
| "timestamp without time zone" // postgres, cockroachdb
|
||||
| "timestamp with time zone" // postgres, oracle, cockroachdb
|
||||
| "timestamp with local time zone" // oracle
|
||||
@ -74,7 +74,7 @@ export type WithLengthColumnType =
|
||||
| "raw" // oracle
|
||||
| "binary" // mssql
|
||||
| "varbinary" // mssql, sap
|
||||
| "string" // cockroachdb
|
||||
| "string" // cockroachdb, spanner
|
||||
|
||||
export type WithWidthColumnType =
|
||||
| "tinyint" // mysql
|
||||
@ -97,17 +97,18 @@ export type SimpleColumnType =
|
||||
| "integer" // postgres, oracle, sqlite, cockroachdb
|
||||
| "int4" // postgres, cockroachdb
|
||||
| "int8" // postgres, sqlite, cockroachdb
|
||||
| "int64" // cockroachdb
|
||||
| "int64" // cockroachdb, spanner
|
||||
| "unsigned big int" // sqlite
|
||||
| "float" // mysql, mssql, oracle, sqlite, sap
|
||||
| "float4" // postgres, cockroachdb
|
||||
| "float8" // postgres, cockroachdb
|
||||
| "float64" // spanner
|
||||
| "smallmoney" // mssql
|
||||
| "money" // postgres, mssql
|
||||
|
||||
// boolean types
|
||||
| "boolean" // postgres, sqlite, mysql, cockroachdb
|
||||
| "bool" // postgres, mysql, cockroachdb
|
||||
| "bool" // postgres, mysql, cockroachdb, spanner
|
||||
|
||||
// text/binary types
|
||||
| "tinyblob" // mysql
|
||||
@ -123,7 +124,7 @@ export type SimpleColumnType =
|
||||
| "longtext" // mysql
|
||||
| "alphanum" // sap
|
||||
| "shorttext" // sap
|
||||
| "bytes" // cockroachdb
|
||||
| "bytes" // cockroachdb, spanner
|
||||
| "bytea" // postgres, cockroachdb
|
||||
| "long" // oracle
|
||||
| "raw" // oracle
|
||||
@ -138,7 +139,7 @@ export type SimpleColumnType =
|
||||
| "timestamptz" // postgres, cockroachdb
|
||||
| "timestamp with local time zone" // oracle
|
||||
| "smalldatetime" // mssql
|
||||
| "date" // mysql, postgres, mssql, oracle, sqlite
|
||||
| "date" // mysql, postgres, mssql, oracle, sqlite, spanner
|
||||
| "interval year to month" // oracle
|
||||
| "interval day to second" // oracle
|
||||
| "interval" // postgres, cockroachdb
|
||||
@ -184,7 +185,7 @@ export type SimpleColumnType =
|
||||
| "tsquery" // postgres
|
||||
| "uuid" // postgres, cockroachdb
|
||||
| "xml" // mssql, postgres
|
||||
| "json" // mysql, postgres, cockroachdb
|
||||
| "json" // mysql, postgres, cockroachdb, spanner
|
||||
| "jsonb" // postgres, cockroachdb
|
||||
| "varbinary" // mssql, sap
|
||||
| "hierarchyid" // mssql
|
||||
@ -193,7 +194,7 @@ export type SimpleColumnType =
|
||||
| "urowid" // oracle
|
||||
| "uniqueidentifier" // mssql
|
||||
| "rowversion" // mssql
|
||||
| "array" // cockroachdb, sap
|
||||
| "array" // cockroachdb, sap, spanner
|
||||
| "cube" // postgres
|
||||
| "ltree" // postgres
|
||||
|
||||
|
||||
@ -20,3 +20,4 @@ export type DatabaseType =
|
||||
| "expo"
|
||||
| "better-sqlite3"
|
||||
| "capacitor"
|
||||
| "spanner"
|
||||
|
||||
@ -38,6 +38,15 @@ export interface EntitySchemaIndexOptions {
|
||||
*/
|
||||
fulltext?: boolean
|
||||
|
||||
/**
|
||||
* NULL_FILTERED indexes are particularly useful for indexing sparse columns, where most rows contain a NULL value.
|
||||
* In these cases, the NULL_FILTERED index can be considerably smaller and more efficient to maintain than
|
||||
* a normal index that includes NULL values.
|
||||
*
|
||||
* Works only in Spanner.
|
||||
*/
|
||||
nullFiltered?: boolean
|
||||
|
||||
/**
|
||||
* Fulltext parser.
|
||||
* Works only in MySQL.
|
||||
|
||||
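// Illustration (not part of the commit): the same option declared through an
// EntitySchema. The schema, table and column names below are hypothetical.
import { EntitySchema } from "typeorm"

export const OrderSchema = new EntitySchema({
    name: "order",
    columns: {
        id: { type: "string", length: 36, primary: true },
        externalRef: { type: "string", length: 64, nullable: true },
    },
    indices: [{ columns: ["externalRef"], nullFiltered: true }],
})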
@ -240,6 +240,7 @@ export class EntitySchemaTransformer {
|
||||
unique: index.unique === true ? true : false,
|
||||
spatial: index.spatial === true ? true : false,
|
||||
fulltext: index.fulltext === true ? true : false,
|
||||
nullFiltered: index.nullFiltered === true ? true : false,
|
||||
parser: index.parser,
|
||||
synchronize: index.synchronize === false ? false : true,
|
||||
where: index.where,
|
||||
|
||||
@ -146,7 +146,7 @@ export class AdvancedConsoleLogger implements Logger {
|
||||
|
||||
/**
|
||||
* Converts parameters to a string.
|
||||
* Sometimes parameters can have circular objects and therefor we are handle this case too.
|
||||
* Sometimes parameters can have circular objects and therefore we are handle this case too.
|
||||
*/
|
||||
protected stringifyParams(parameters: any[]) {
|
||||
try {
|
||||
|
||||
@ -34,6 +34,15 @@ export interface IndexMetadataArgs {
|
||||
*/
|
||||
fulltext?: boolean
|
||||
|
||||
/**
|
||||
* NULL_FILTERED indexes are particularly useful for indexing sparse columns, where most rows contain a NULL value.
|
||||
* In these cases, the NULL_FILTERED index can be considerably smaller and more efficient to maintain than
|
||||
* a normal index that includes NULL values.
|
||||
*
|
||||
* Works only in Spanner.
|
||||
*/
|
||||
nullFiltered?: boolean
|
||||
|
||||
/**
|
||||
* Fulltext parser.
|
||||
* Works only in MySQL.
|
||||
|
||||
@ -197,7 +197,9 @@ export class EntityMetadataBuilder {
|
||||
"aurora-mysql" ||
|
||||
this.connection.driver.options.type ===
|
||||
"mssql" ||
|
||||
this.connection.driver.options.type === "sap"
|
||||
this.connection.driver.options.type === "sap" ||
|
||||
this.connection.driver.options.type ===
|
||||
"spanner"
|
||||
) {
|
||||
const index = new IndexMetadata({
|
||||
entityMetadata:
|
||||
@ -224,6 +226,13 @@ export class EntityMetadataBuilder {
|
||||
.join(" AND ")
|
||||
}
|
||||
|
||||
if (
|
||||
this.connection.driver.options.type ===
|
||||
"spanner"
|
||||
) {
|
||||
index.isNullFiltered = true
|
||||
}
|
||||
|
||||
if (relation.embeddedMetadata) {
|
||||
relation.embeddedMetadata.indices.push(
|
||||
index,
|
||||
@ -615,7 +624,7 @@ export class EntityMetadataBuilder {
|
||||
propertyName: "mpath",
|
||||
options: /*tree.column || */ {
|
||||
name: namingStrategy.materializedPathColumnName,
|
||||
type: "varchar",
|
||||
type: String,
|
||||
nullable: true,
|
||||
default: "",
|
||||
},
|
||||
@ -635,7 +644,7 @@ export class EntityMetadataBuilder {
|
||||
propertyName: left,
|
||||
options: /*tree.column || */ {
|
||||
name: left,
|
||||
type: "integer",
|
||||
type: Number,
|
||||
nullable: false,
|
||||
default: 1,
|
||||
},
|
||||
@ -653,7 +662,7 @@ export class EntityMetadataBuilder {
|
||||
propertyName: right,
|
||||
options: /*tree.column || */ {
|
||||
name: right,
|
||||
type: "integer",
|
||||
type: Number,
|
||||
nullable: false,
|
||||
default: 2,
|
||||
},
|
||||
@ -764,11 +773,12 @@ export class EntityMetadataBuilder {
|
||||
})
|
||||
}
|
||||
|
||||
// Mysql and SAP HANA stores unique constraints as unique indices.
|
||||
// This drivers stores unique constraints as unique indices.
|
||||
if (
|
||||
DriverUtils.isMySQLFamily(this.connection.driver) ||
|
||||
this.connection.driver.options.type === "aurora-mysql" ||
|
||||
this.connection.driver.options.type === "sap"
|
||||
this.connection.driver.options.type === "sap" ||
|
||||
this.connection.driver.options.type === "spanner"
|
||||
) {
|
||||
const indices = this.metadataArgsStorage
|
||||
.filterUniques(entityMetadata.inheritanceTree)
|
||||
|
||||
@ -207,6 +207,7 @@ export class JunctionEntityMetadataBuilder {
|
||||
|
||||
// create junction table foreign keys
|
||||
// Note: UPDATE CASCADE clause is not supported in Oracle.
|
||||
// Note: UPDATE/DELETE CASCADE clauses are not supported in Spanner.
|
||||
entityMetadata.foreignKeys = relation.createForeignKeyConstraints
|
||||
? [
|
||||
new ForeignKeyMetadata({
|
||||
@ -214,9 +215,13 @@ export class JunctionEntityMetadataBuilder {
|
||||
referencedEntityMetadata: relation.entityMetadata,
|
||||
columns: junctionColumns,
|
||||
referencedColumns: referencedColumns,
|
||||
onDelete: relation.onDelete || "CASCADE",
|
||||
onDelete:
|
||||
this.connection.driver.options.type === "spanner"
|
||||
? "NO ACTION"
|
||||
: relation.onDelete || "CASCADE",
|
||||
onUpdate:
|
||||
this.connection.driver.options.type === "oracle"
|
||||
this.connection.driver.options.type === "oracle" ||
|
||||
this.connection.driver.options.type === "spanner"
|
||||
? "NO ACTION"
|
||||
: relation.onUpdate || "CASCADE",
|
||||
}),
|
||||
@ -225,11 +230,15 @@ export class JunctionEntityMetadataBuilder {
|
||||
referencedEntityMetadata: relation.inverseEntityMetadata,
|
||||
columns: inverseJunctionColumns,
|
||||
referencedColumns: inverseReferencedColumns,
|
||||
onDelete: relation.inverseRelation
|
||||
? relation.inverseRelation.onDelete
|
||||
: "CASCADE",
|
||||
onDelete:
|
||||
this.connection.driver.options.type === "spanner"
|
||||
? "NO ACTION"
|
||||
: relation.inverseRelation
|
||||
? relation.inverseRelation.onDelete
|
||||
: "CASCADE",
|
||||
onUpdate:
|
||||
this.connection.driver.options.type === "oracle"
|
||||
this.connection.driver.options.type === "oracle" ||
|
||||
this.connection.driver.options.type === "spanner"
|
||||
? "NO ACTION"
|
||||
: relation.inverseRelation
|
||||
? relation.inverseRelation.onUpdate
|
||||
|
||||
@ -40,6 +40,15 @@ export class IndexMetadata {
|
||||
*/
|
||||
isFulltext: boolean = false
|
||||
|
||||
/**
|
||||
* NULL_FILTERED indexes are particularly useful for indexing sparse columns, where most rows contain a NULL value.
|
||||
* In these cases, the NULL_FILTERED index can be considerably smaller and more efficient to maintain than
|
||||
* a normal index that includes NULL values.
|
||||
*
|
||||
* Works only in Spanner.
|
||||
*/
|
||||
isNullFiltered: boolean = false
|
||||
|
||||
/**
|
||||
* Fulltext parser.
|
||||
* Works only in MySQL.
|
||||
@ -134,6 +143,7 @@ export class IndexMetadata {
|
||||
this.isUnique = !!options.args.unique
|
||||
this.isSpatial = !!options.args.spatial
|
||||
this.isFulltext = !!options.args.fulltext
|
||||
this.isNullFiltered = !!options.args.nullFiltered
|
||||
this.parser = options.args.parser
|
||||
this.where = options.args.where
|
||||
this.isSparse = options.args.sparse
|
||||
|
||||
@ -35,8 +35,13 @@ export class PlatformTools {
|
||||
|
||||
try {
|
||||
// switch case to explicit require statements for webpack compatibility.
|
||||
|
||||
switch (name) {
|
||||
/**
|
||||
* spanner
|
||||
*/
|
||||
case "spanner":
|
||||
return require("@google-cloud/spanner")
|
||||
|
||||
/**
|
||||
* mongodb
|
||||
*/
|
||||
|
||||
@ -626,6 +626,7 @@ export class InsertQueryBuilder<Entity> extends QueryBuilder<Entity> {
|
||||
if (
|
||||
column.isGenerated &&
|
||||
column.generationStrategy === "increment" &&
|
||||
!(this.connection.driver.options.type === "spanner") &&
|
||||
!(this.connection.driver.options.type === "oracle") &&
|
||||
!DriverUtils.isSQLiteFamily(this.connection.driver) &&
|
||||
!DriverUtils.isMySQLFamily(this.connection.driver) &&
|
||||
@ -778,7 +779,8 @@ export class InsertQueryBuilder<Entity> extends QueryBuilder<Entity> {
|
||||
DriverUtils.isSQLiteFamily(
|
||||
this.connection.driver,
|
||||
) ||
|
||||
this.connection.driver.options.type === "sap"
|
||||
this.connection.driver.options.type === "sap" ||
|
||||
this.connection.driver.options.type === "spanner"
|
||||
) {
|
||||
// unfortunately sqlite does not support DEFAULT expression in INSERT queries
|
||||
if (
|
||||
@ -796,6 +798,11 @@ export class InsertQueryBuilder<Entity> extends QueryBuilder<Entity> {
|
||||
} else {
|
||||
expression += "DEFAULT"
|
||||
}
|
||||
} else if (
|
||||
value === null &&
|
||||
this.connection.driver.options.type === "spanner"
|
||||
) {
|
||||
expression += "NULL"
|
||||
|
||||
// support for SQL expressions in queries
|
||||
} else if (typeof value === "function") {
|
||||
@ -929,16 +936,22 @@ export class InsertQueryBuilder<Entity> extends QueryBuilder<Entity> {
|
||||
// if value for this column was not provided then insert default value
|
||||
} else if (value === undefined) {
|
||||
if (
|
||||
(this.connection.driver.options.type === "oracle" &&
|
||||
valueSets.length > 1) ||
|
||||
DriverUtils.isSQLiteFamily(
|
||||
this.connection.driver,
|
||||
) ||
|
||||
this.connection.driver.options.type === "sap"
|
||||
this.connection.driver.options.type === "sap" ||
|
||||
this.connection.driver.options.type === "spanner"
|
||||
) {
|
||||
expression += "NULL"
|
||||
} else {
|
||||
expression += "DEFAULT"
|
||||
}
|
||||
|
||||
} else if (
|
||||
value === null &&
|
||||
this.connection.driver.options.type === "spanner"
|
||||
) {
|
||||
// just any other regular value
|
||||
} else {
|
||||
expression += this.createParameter(value)
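To summarize the branch above: when a column value is missing, Spanner joins SQLite and SAP HANA in receiving an explicit `NULL` instead of `DEFAULT`, and a literal `null` is written as `NULL` rather than bound as a parameter. A rough sketch of that decision (names are illustrative, not the builder's internals):

```typescript
// Hypothetical helper mirroring the INSERT value handling shown above.
function insertValueExpression(value: unknown, driverType: string): string {
    if (value === undefined) {
        // Spanner, SQLite-family and SAP HANA do not accept DEFAULT inside
        // INSERT VALUES, so an omitted column becomes an explicit NULL.
        return ["spanner", "sap"].includes(driverType) ? "NULL" : "DEFAULT"
    }
    if (value === null && driverType === "spanner") {
        // Spanner gets the literal NULL instead of a bound parameter.
        return "NULL"
    }
    // Everything else is passed as a query parameter.
    return "@param"
}

// insertValueExpression(undefined, "spanner")  -> "NULL"
// insertValueExpression(undefined, "postgres") -> "DEFAULT"
// insertValueExpression(null, "spanner")       -> "NULL"
```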
|
||||
|
||||
@ -2427,7 +2427,8 @@ export class SelectQueryBuilder<Entity>
|
||||
} else if (
|
||||
DriverUtils.isMySQLFamily(this.connection.driver) ||
|
||||
this.connection.driver.options.type === "aurora-mysql" ||
|
||||
this.connection.driver.options.type === "sap"
|
||||
this.connection.driver.options.type === "sap" ||
|
||||
this.connection.driver.options.type === "spanner"
|
||||
) {
|
||||
if (limit && offset) return " LIMIT " + limit + " OFFSET " + offset
|
||||
if (limit) return " LIMIT " + limit
|
||||
@ -2805,6 +2806,27 @@ export class SelectQueryBuilder<Entity>
|
||||
return `COUNT(DISTINCT(CONCAT(${columnsExpression})))`
|
||||
}
|
||||
|
||||
if (this.connection.driver.options.type === "spanner") {
|
||||
// spanner also has gotta be different from everyone else.
|
||||
// they do not support concatenation of different column types without casting them to string
|
||||
|
||||
if (primaryColumns.length === 1) {
|
||||
return `COUNT(DISTINCT(${distinctAlias}.${this.escape(
|
||||
primaryColumns[0].databaseName,
|
||||
)}))`
|
||||
}
|
||||
|
||||
const columnsExpression = primaryColumns
|
||||
.map(
|
||||
(primaryColumn) =>
|
||||
`CAST(${distinctAlias}.${this.escape(
|
||||
primaryColumn.databaseName,
|
||||
)} AS STRING)`,
|
||||
)
|
||||
.join(", '|;|', ")
|
||||
return `COUNT(DISTINCT(CONCAT(${columnsExpression})))`
|
||||
}
|
||||
|
||||
// If all else fails, fall back to a `COUNT` and `DISTINCT` across all the primary columns concatenated.
|
||||
// Per the SQL spec, this is the canonical string concatenation mechanism which is most
|
||||
// likely to work across servers implementing the SQL standard.
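For reference, a sketch of the expression the Spanner branch above builds for a composite primary key (the column names are made up; the `'|;|'` separator comes from the diff):

```typescript
const distinctAlias = "distinctAlias"
const primaryColumns = ["userId", "photoId"]

// Spanner's CONCAT() rejects mixed argument types, so every primary key
// column is CAST to STRING before being concatenated for COUNT(DISTINCT ...).
const countExpression = `COUNT(DISTINCT(CONCAT(${primaryColumns
    .map((name) => `CAST(${distinctAlias}.${name} AS STRING)`)
    .join(", '|;|', ")})))`

// -> COUNT(DISTINCT(CONCAT(CAST(distinctAlias.userId AS STRING), '|;|', CAST(distinctAlias.photoId AS STRING))))
```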
|
||||
@ -3218,7 +3240,9 @@ export class SelectQueryBuilder<Entity>
|
||||
primaryColumn.databaseName,
|
||||
)
|
||||
|
||||
return `${distinctAlias}.${columnAlias} as "${alias}"`
|
||||
return `${distinctAlias}.${columnAlias} AS ${this.escape(
|
||||
alias,
|
||||
)}`
|
||||
},
|
||||
)
|
||||
|
||||
|
||||
@ -478,11 +478,19 @@ export class UpdateQueryBuilder<Entity>
|
||||
? this.expressionMap.mainAlias!.metadata
|
||||
: undefined
|
||||
|
||||
// it doesn't make sense to update undefined properties, so just skip them
|
||||
const valuesSetNormalized: ObjectLiteral = {}
|
||||
for (let key in valuesSet) {
|
||||
if (valuesSet[key] !== undefined) {
|
||||
valuesSetNormalized[key] = valuesSet[key]
|
||||
}
|
||||
}
|
||||
|
||||
// prepare columns and values to be updated
|
||||
const updateColumnAndValues: string[] = []
|
||||
const updatedColumns: ColumnMetadata[] = []
|
||||
if (metadata) {
|
||||
this.createPropertyPath(metadata, valuesSet).forEach(
|
||||
this.createPropertyPath(metadata, valuesSetNormalized).forEach(
|
||||
(propertyPath) => {
|
||||
// todo: make this and other query builder to work with properly with tables without metadata
|
||||
const columns =
|
||||
@ -506,10 +514,11 @@ export class UpdateQueryBuilder<Entity>
|
||||
updatedColumns.push(column)
|
||||
|
||||
//
|
||||
let value = column.getEntityValue(valuesSet)
|
||||
let value = column.getEntityValue(valuesSetNormalized)
|
||||
if (
|
||||
column.referencedColumn &&
|
||||
typeof value === "object" &&
|
||||
value !== null &&
|
||||
!Buffer.isBuffer(value)
|
||||
) {
|
||||
value =
|
||||
@ -531,7 +540,9 @@ export class UpdateQueryBuilder<Entity>
|
||||
value(),
|
||||
)
|
||||
} else if (
|
||||
this.connection.driver.options.type === "sap" &&
|
||||
(this.connection.driver.options.type === "sap" ||
|
||||
this.connection.driver.options.type ===
|
||||
"spanner") &&
|
||||
value === null
|
||||
) {
|
||||
updateColumnAndValues.push(
|
||||
@ -614,7 +625,7 @@ export class UpdateQueryBuilder<Entity>
|
||||
// Don't allow calling update only with columns that are `update: false`
|
||||
if (
|
||||
updateColumnAndValues.length > 0 ||
|
||||
Object.keys(valuesSet).length === 0
|
||||
Object.keys(valuesSetNormalized).length === 0
|
||||
) {
|
||||
if (
|
||||
metadata.versionColumn &&
|
||||
@ -636,8 +647,8 @@ export class UpdateQueryBuilder<Entity>
|
||||
) // todo: fix issue with CURRENT_TIMESTAMP(6) being used, can "DEFAULT" be used?!
|
||||
}
|
||||
} else {
|
||||
Object.keys(valuesSet).map((key) => {
|
||||
let value = valuesSet[key]
|
||||
Object.keys(valuesSetNormalized).map((key) => {
|
||||
let value = valuesSetNormalized[key]
|
||||
|
||||
// todo: duplication zone
|
||||
if (typeof value === "function") {
|
||||
@ -646,7 +657,8 @@ export class UpdateQueryBuilder<Entity>
|
||||
this.escape(key) + " = " + value(),
|
||||
)
|
||||
} else if (
|
||||
this.connection.driver.options.type === "sap" &&
|
||||
(this.connection.driver.options.type === "sap" ||
|
||||
this.connection.driver.options.type === "spanner") &&
|
||||
value === null
|
||||
) {
|
||||
updateColumnAndValues.push(this.escape(key) + " = NULL")
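Two behaviours from the UPDATE changes above, condensed into a sketch (plain objects instead of the real builder state): `undefined` properties are dropped before the SET clause is built, and on SAP HANA and Spanner a `null` value is emitted as a literal `NULL` instead of a bound parameter.

```typescript
const driverType: string = "spanner" // illustrative
const valuesSet: Record<string, unknown> = { title: "New title", text: null, draft: undefined }

// 1. it doesn't make sense to update undefined properties, so skip them
const valuesSetNormalized: Record<string, unknown> = {}
for (const key in valuesSet) {
    if (valuesSet[key] !== undefined) {
        valuesSetNormalized[key] = valuesSet[key]
    }
}

// 2. build the SET fragments; null becomes a literal NULL on sap/spanner
const updateColumnAndValues = Object.keys(valuesSetNormalized).map((key) => {
    const value = valuesSetNormalized[key]
    if (value === null && (driverType === "sap" || driverType === "spanner")) {
        return `${key} = NULL`
    }
    return `${key} = @param_${key}`
})

console.log(updateColumnAndValues) // [ 'title = @param_title', 'text = NULL' ]
```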
|
||||
|
||||
@ -66,8 +66,11 @@ export class RdbmsSchemaBuilder implements SchemaBuilder {
|
||||
|
||||
// CockroachDB implements asynchronous schema sync operations which can not been executed in transaction.
|
||||
// E.g. if you try to DROP column and ADD it again in the same transaction, crdb throws error.
|
||||
// In Spanner queries against the INFORMATION_SCHEMA can be used in a read-only transaction,
|
||||
// but not in a read-write transaction.
|
||||
const isUsingTransactions =
|
||||
!(this.connection.driver.options.type === "cockroachdb") &&
|
||||
!(this.connection.driver.options.type === "spanner") &&
|
||||
this.connection.options.migrationsTransactionMode !== "none"
|
||||
|
||||
await this.queryRunner.beforeMigration()
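The same no-transaction behaviour can also be requested explicitly through connection options; a minimal sketch (other options abridged):

```typescript
import { DataSource } from "typeorm"

// With "none", schema synchronization and migrations are not wrapped in a
// transaction, which is what the builder now forces for cockroachdb and spanner.
const dataSource = new DataSource({
    type: "postgres", // any driver; shown only to illustrate the option
    host: "localhost",
    username: "test",
    password: "test",
    database: "test",
    migrationsTransactionMode: "none",
})
```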
|
||||
@ -813,7 +816,8 @@ export class RdbmsSchemaBuilder implements SchemaBuilder {
|
||||
if (
|
||||
!(
|
||||
DriverUtils.isMySQLFamily(this.connection.driver) ||
|
||||
this.connection.driver.options.type === "aurora-mysql"
|
||||
this.connection.driver.options.type === "aurora-mysql" ||
|
||||
this.connection.driver.options.type === "spanner"
|
||||
)
|
||||
) {
|
||||
for (const changedColumn of changedColumns) {
|
||||
@ -1181,6 +1185,10 @@ export class RdbmsSchemaBuilder implements SchemaBuilder {
|
||||
database,
|
||||
)
|
||||
|
||||
// Spanner requires at least one primary key in a table.
|
||||
// Since we don't have unique column in "typeorm_metadata" table
|
||||
// and we should avoid breaking changes, we mark all columns as primary for Spanner driver.
|
||||
const isPrimary = this.connection.driver.options.type === "spanner"
|
||||
await queryRunner.createTable(
|
||||
new Table({
|
||||
database: database,
|
||||
@ -1194,6 +1202,7 @@ export class RdbmsSchemaBuilder implements SchemaBuilder {
|
||||
.metadataType,
|
||||
}),
|
||||
isNullable: false,
|
||||
isPrimary,
|
||||
},
|
||||
{
|
||||
name: "database",
|
||||
@ -1202,6 +1211,7 @@ export class RdbmsSchemaBuilder implements SchemaBuilder {
|
||||
.metadataDatabase,
|
||||
}),
|
||||
isNullable: true,
|
||||
isPrimary,
|
||||
},
|
||||
{
|
||||
name: "schema",
|
||||
@ -1210,6 +1220,7 @@ export class RdbmsSchemaBuilder implements SchemaBuilder {
|
||||
.metadataSchema,
|
||||
}),
|
||||
isNullable: true,
|
||||
isPrimary,
|
||||
},
|
||||
{
|
||||
name: "table",
|
||||
@ -1218,6 +1229,7 @@ export class RdbmsSchemaBuilder implements SchemaBuilder {
|
||||
.metadataTable,
|
||||
}),
|
||||
isNullable: true,
|
||||
isPrimary,
|
||||
},
|
||||
{
|
||||
name: "name",
|
||||
@ -1226,6 +1238,7 @@ export class RdbmsSchemaBuilder implements SchemaBuilder {
|
||||
.metadataName,
|
||||
}),
|
||||
isNullable: true,
|
||||
isPrimary,
|
||||
},
|
||||
{
|
||||
name: "value",
|
||||
@ -1234,6 +1247,7 @@ export class RdbmsSchemaBuilder implements SchemaBuilder {
|
||||
.metadataValue,
|
||||
}),
|
||||
isNullable: true,
|
||||
isPrimary,
|
||||
},
|
||||
],
|
||||
}),
|
||||
|
||||
@ -33,6 +33,15 @@ export interface TableIndexOptions {
|
||||
*/
|
||||
isFulltext?: boolean
|
||||
|
||||
/**
|
||||
* NULL_FILTERED indexes are particularly useful for indexing sparse columns, where most rows contain a NULL value.
|
||||
* In these cases, the NULL_FILTERED index can be considerably smaller and more efficient to maintain than
|
||||
* a normal index that includes NULL values.
|
||||
*
|
||||
* Works only in Spanner.
|
||||
*/
|
||||
isNullFiltered?: boolean
|
||||
|
||||
/**
|
||||
* Fulltext parser.
|
||||
* Works only in MySQL.
|
||||
|
||||
@ -38,6 +38,15 @@ export class TableIndex {
|
||||
*/
|
||||
isFulltext: boolean
|
||||
|
||||
/**
|
||||
* NULL_FILTERED indexes are particularly useful for indexing sparse columns, where most rows contain a NULL value.
|
||||
* In these cases, the NULL_FILTERED index can be considerably smaller and more efficient to maintain than
|
||||
* a normal index that includes NULL values.
|
||||
*
|
||||
* Works only in Spanner.
|
||||
*/
|
||||
isNullFiltered: boolean
|
||||
|
||||
/**
|
||||
* Fulltext parser.
|
||||
* Works only in MySQL.
|
||||
@ -59,6 +68,7 @@ export class TableIndex {
|
||||
this.isUnique = !!options.isUnique
|
||||
this.isSpatial = !!options.isSpatial
|
||||
this.isFulltext = !!options.isFulltext
|
||||
this.isNullFiltered = !!options.isNullFiltered
|
||||
this.parser = options.parser
|
||||
this.where = options.where ? options.where : ""
|
||||
}
|
||||
@ -77,6 +87,7 @@ export class TableIndex {
|
||||
isUnique: this.isUnique,
|
||||
isSpatial: this.isSpatial,
|
||||
isFulltext: this.isFulltext,
|
||||
isNullFiltered: this.isNullFiltered,
|
||||
parser: this.parser,
|
||||
where: this.where,
|
||||
})
|
||||
@ -98,6 +109,7 @@ export class TableIndex {
|
||||
isUnique: indexMetadata.isUnique,
|
||||
isSpatial: indexMetadata.isSpatial,
|
||||
isFulltext: indexMetadata.isFulltext,
|
||||
isNullFiltered: indexMetadata.isNullFiltered,
|
||||
parser: indexMetadata.parser,
|
||||
where: indexMetadata.where,
|
||||
})
|
||||
|
||||
9 test/__spanner-test/Dockerfile (new file)
@ -0,0 +1,9 @@
FROM google/cloud-sdk:slim

RUN apt-get install -y google-cloud-sdk google-cloud-sdk-spanner-emulator

COPY ./start_spanner.bash start_spanner.bash

RUN ["chmod", "+x", "./start_spanner.bash"]

CMD ./start_spanner.bash
|
||||
81 test/__spanner-test/spanner-test.ts (new file)
@ -0,0 +1,81 @@
|
||||
// import { Spanner } from "@google-cloud/spanner"
|
||||
//
|
||||
// process.env.SPANNER_EMULATOR_HOST = "localhost:9010"
|
||||
// // process.env.GOOGLE_APPLICATION_CREDENTIALS="/Users/messer/Documents/google/astute-cumulus-342713-80000a3b5bdb.json"
|
||||
//
|
||||
// async function main() {
|
||||
// const projectId = "test-project"
|
||||
// const instanceId = "test-instance"
|
||||
// const databaseId = "test-db"
|
||||
//
|
||||
// const spanner = new Spanner({
|
||||
// projectId: projectId,
|
||||
// })
|
||||
//
|
||||
// const instance = spanner.instance(instanceId)
|
||||
// const database = instance.database(databaseId)
|
||||
//
|
||||
// // const [operation] = await database.updateSchema(
|
||||
// // `CREATE TABLE \`test\` (\`id\` INT64, \`name\` STRING(MAX)) PRIMARY KEY (\`id\`)`,
|
||||
// // )
|
||||
// // await operation.promise()
|
||||
// // const [tx] = await database.getTransaction()
|
||||
// // await tx.runUpdate(`INSERT INTO \`book\`(\`ean\`) VALUES ('asd')`)
|
||||
// // await tx.commit()
|
||||
//
|
||||
// // await database.run(`INSERT INTO \`book\`(\`ean\`) VALUES ('asd')`)
|
||||
//
|
||||
// const [session] = await database.createSession({})
|
||||
// const sessionTransaction = await session.transaction()
|
||||
//
|
||||
// // await sessionTransaction.begin()
|
||||
// // await sessionTransaction.run({
|
||||
// // sql: `INSERT INTO \`test\`(\`id\`, \`name\`) VALUES (@param0, @param1)`,
|
||||
// // params: {
|
||||
// // param0: 2,
|
||||
// // param1: null,
|
||||
// // },
|
||||
// // types: {
|
||||
// // param0: "int64",
|
||||
// // param1: "string",
|
||||
// // },
|
||||
// // })
|
||||
// // await sessionTransaction.commit()
|
||||
//
|
||||
// await sessionTransaction.begin()
|
||||
// const [rows] = await sessionTransaction.run({
|
||||
// sql: `SELECT * FROM test WHERE name = @name AND id = @id`,
|
||||
// params: {
|
||||
// id: Spanner.int(2),
|
||||
// name: null,
|
||||
// },
|
||||
// types: {
|
||||
// id: "int64",
|
||||
// name: "string",
|
||||
// },
|
||||
// json: true,
|
||||
// })
|
||||
// await sessionTransaction.commit()
|
||||
// console.log(rows)
|
||||
//
|
||||
// // const first = async () => {
|
||||
// // const sessionTransaction = await session.transaction()
|
||||
// // await sessionTransaction.begin()
|
||||
// // await sessionTransaction.run(`INSERT INTO \`category\`(\`id\`, \`name\`) VALUES (1, 'aaa')`)
|
||||
// // await sessionTransaction.commit()
|
||||
// // }
|
||||
// //
|
||||
// // const second = async () => {
|
||||
// // const sessionTransaction = await session.transaction()
|
||||
// // await sessionTransaction.begin()
|
||||
// // await sessionTransaction.run(`INSERT INTO \`category\`(\`id\`, \`name\`) VALUES (2, 'bbb')`)
|
||||
// // await sessionTransaction.commit()
|
||||
// // }
|
||||
// //
|
||||
// // await Promise.all([
|
||||
// // first(),
|
||||
// // second()
|
||||
// // ])
|
||||
// }
|
||||
//
|
||||
// main()
|
||||
22 test/__spanner-test/start_spanner.bash (new file)
@ -0,0 +1,22 @@
#!/bin/bash

set -m

gcloud beta emulators spanner start --host-port=0.0.0.0:9010 &

# configure gcloud cli to connect to emulator
gcloud config set auth/disable_credentials true
gcloud config set project test-project
gcloud config set api_endpoint_overrides/spanner http://localhost:9020/

# create spanner instance
gcloud spanner instances create test-instance \
    --config=emulator-config \
    --description="Test Instance" \
    --nodes=1

# create spanner database with the given schema
gcloud spanner databases create test-db \
    --instance=test-instance

fg %1
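With the emulator container above running and `SPANNER_EMULATOR_HOST` pointing at it, a data source can be created against the test instance. A minimal sketch, assuming the Spanner driver options are named `projectId`, `instanceId` and `databaseId` (check the driver's connection options for the exact names):

```typescript
import { DataSource } from "typeorm"

// Hypothetical option names; values match the test-project/test-instance/test-db
// created by start_spanner.bash above.
const dataSource = new DataSource({
    type: "spanner",
    projectId: "test-project",
    instanceId: "test-instance",
    databaseId: "test-db",
})

dataSource.initialize().then(() => console.log("connected to the Spanner emulator"))
```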
|
||||
19 test/functional/cache/custom-cache-provider.ts (vendored)
@ -29,6 +29,10 @@ describe("custom cache provider", () => {
|
||||
it("should be used instead of built-ins", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
if (connection.driver.options.type === "spanner") {
|
||||
return
|
||||
}
|
||||
|
||||
const queryResultCache: any = connection.queryResultCache
|
||||
expect(queryResultCache).to.have.property(
|
||||
"queryResultCacheTable",
|
||||
@ -43,6 +47,9 @@ describe("custom cache provider", () => {
|
||||
it("should cache results properly", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
if (connection.driver.options.type === "spanner") {
|
||||
return
|
||||
}
|
||||
// first prepare data - insert users
|
||||
const user1 = new User()
|
||||
user1.firstName = "Timber"
|
||||
@ -108,6 +115,10 @@ describe("custom cache provider", () => {
|
||||
it("should cache results with pagination enabled properly", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
if (connection.driver.options.type === "spanner") {
|
||||
return
|
||||
}
|
||||
|
||||
// first prepare data - insert users
|
||||
const user1 = new User()
|
||||
user1.firstName = "Timber"
|
||||
@ -185,6 +196,10 @@ describe("custom cache provider", () => {
|
||||
it("should cache results with custom id and duration supplied", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
if (connection.driver.options.type === "spanner") {
|
||||
return
|
||||
}
|
||||
|
||||
// first prepare data - insert users
|
||||
const user1 = new User()
|
||||
user1.firstName = "Timber"
|
||||
@ -265,6 +280,10 @@ describe("custom cache provider", () => {
|
||||
it("should cache results with custom id and duration supplied", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
if (connection.driver.options.type === "spanner") {
|
||||
return
|
||||
}
|
||||
|
||||
// first prepare data - insert users
|
||||
const user1 = new User()
|
||||
user1.firstName = "Timber"
|
||||
|
||||
@ -22,6 +22,10 @@ describe("columns > update and insert control", () => {
|
||||
it("should respect column update and insert properties", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
if (connection.driver.options.type === "spanner") {
|
||||
return
|
||||
}
|
||||
|
||||
const postRepository = connection.getRepository(Post)
|
||||
|
||||
// create and save a post first
|
||||
|
||||
@ -0,0 +1,160 @@
|
||||
import "reflect-metadata"
|
||||
import { DataSource } from "../../../../../src"
|
||||
import { Post } from "./entity/Post"
|
||||
import {
|
||||
closeTestingConnections,
|
||||
createTestingConnections,
|
||||
reloadTestingDatabases,
|
||||
} from "../../../../utils/test-utils"
|
||||
import { PostWithoutTypes } from "./entity/PostWithoutTypes"
|
||||
import { PostWithOptions } from "./entity/PostWithOptions"
|
||||
|
||||
describe("database schema > column types > spanner", () => {
|
||||
let connections: DataSource[]
|
||||
before(async () => {
|
||||
connections = await createTestingConnections({
|
||||
entities: [__dirname + "/entity/*{.js,.ts}"],
|
||||
enabledDrivers: ["spanner"],
|
||||
})
|
||||
})
|
||||
beforeEach(() => reloadTestingDatabases(connections))
|
||||
after(() => closeTestingConnections(connections))
|
||||
|
||||
it("all types should work correctly - persist and hydrate", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
const postRepository = connection.getRepository(Post)
|
||||
const queryRunner = connection.createQueryRunner()
|
||||
const table = await queryRunner.getTable("post")
|
||||
await queryRunner.release()
|
||||
|
||||
const post = new Post()
|
||||
post.id = 1
|
||||
post.name = "Post"
|
||||
post.int64 = 2147483647
|
||||
post.string = "This is string"
|
||||
post.bytes = Buffer.from("This is bytes")
|
||||
post.float64 = 10.53
|
||||
post.numeric = "10"
|
||||
post.bool = true
|
||||
post.date = "2022-03-16"
|
||||
post.timestamp = new Date()
|
||||
post.json = { param: "VALUE" }
|
||||
post.array = ["A", "B", "C"]
|
||||
await postRepository.save(post)
|
||||
|
||||
const loadedPost = (await postRepository.findOneBy({ id: 1 }))!
|
||||
loadedPost.id.should.be.equal(post.id)
|
||||
loadedPost.name.should.be.equal(post.name)
|
||||
loadedPost.int64.should.be.equal(post.int64)
|
||||
loadedPost.string.should.be.equal(post.string)
|
||||
loadedPost.bytes
|
||||
.toString()
|
||||
.should.be.equal(post.bytes.toString())
|
||||
loadedPost.float64.should.be.equal(post.float64)
|
||||
loadedPost.numeric.should.be.equal(post.numeric)
|
||||
loadedPost.bool.should.be.equal(post.bool)
|
||||
loadedPost.date.should.be.equal(post.date)
|
||||
loadedPost.timestamp
|
||||
.valueOf()
|
||||
.should.be.equal(post.timestamp.valueOf())
|
||||
loadedPost.json.should.be.eql(post.json)
|
||||
loadedPost.array[0].should.be.equal(post.array[0])
|
||||
loadedPost.array[1].should.be.equal(post.array[1])
|
||||
loadedPost.array[2].should.be.equal(post.array[2])
|
||||
|
||||
table!.findColumnByName("id")!.type.should.be.equal("int64")
|
||||
table!.findColumnByName("name")!.type.should.be.equal("string")
|
||||
table!.findColumnByName("int64")!.type.should.be.equal("int64")
|
||||
table!
|
||||
.findColumnByName("string")!
|
||||
.type.should.be.equal("string")
|
||||
table!.findColumnByName("bytes")!.type.should.be.equal("bytes")
|
||||
table!
|
||||
.findColumnByName("float64")!
|
||||
.type.should.be.equal("float64")
|
||||
table!
|
||||
.findColumnByName("numeric")!
|
||||
.type.should.be.equal("numeric")
|
||||
table!.findColumnByName("date")!.type.should.be.equal("date")
|
||||
table!.findColumnByName("bool")!.type.should.be.equal("bool")
|
||||
table!.findColumnByName("date")!.type.should.be.equal("date")
|
||||
table!
|
||||
.findColumnByName("timestamp")!
|
||||
.type.should.be.equal("timestamp")
|
||||
table!.findColumnByName("json")!.type.should.be.equal("json")
|
||||
table!.findColumnByName("array")!.type.should.be.equal("string")
|
||||
table!.findColumnByName("array")!.isArray.should.be.true
|
||||
}),
|
||||
))
|
||||
|
||||
it("all types should work correctly - persist and hydrate when options are specified on columns", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
const postRepository = connection.getRepository(PostWithOptions)
|
||||
const queryRunner = connection.createQueryRunner()
|
||||
const table = await queryRunner.getTable("post_with_options")
|
||||
await queryRunner.release()
|
||||
|
||||
const post = new PostWithOptions()
|
||||
post.id = 1
|
||||
post.string = "This is string"
|
||||
post.bytes = Buffer.from("This is bytes")
|
||||
await postRepository.save(post)
|
||||
|
||||
const loadedPost = (await postRepository.findOneBy({ id: 1 }))!
|
||||
loadedPost.id.should.be.equal(post.id)
|
||||
loadedPost.string.should.be.equal(post.string)
|
||||
loadedPost.bytes
|
||||
.toString()
|
||||
.should.be.equal(post.bytes.toString())
|
||||
|
||||
table!.findColumnByName("id")!.type.should.be.equal("int64")
|
||||
table!
|
||||
.findColumnByName("string")!
|
||||
.type.should.be.equal("string")
|
||||
table!.findColumnByName("string")!.length!.should.be.equal("50")
|
||||
table!.findColumnByName("bytes")!.type.should.be.equal("bytes")
|
||||
table!.findColumnByName("bytes")!.length!.should.be.equal("50")
|
||||
}),
|
||||
))
|
||||
|
||||
it("all types should work correctly - persist and hydrate when types are not specified on columns", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
const postRepository =
|
||||
connection.getRepository(PostWithoutTypes)
|
||||
const queryRunner = connection.createQueryRunner()
|
||||
const table = await queryRunner.getTable("post_without_types")
|
||||
await queryRunner.release()
|
||||
|
||||
const post = new PostWithoutTypes()
|
||||
post.id = 1
|
||||
post.name = "Post"
|
||||
post.bool = true
|
||||
post.bytes = Buffer.from("A")
|
||||
post.timestamp = new Date()
|
||||
post.timestamp.setMilliseconds(0)
|
||||
await postRepository.save(post)
|
||||
|
||||
const loadedPost = (await postRepository.findOneBy({ id: 1 }))!
|
||||
loadedPost.id.should.be.equal(post.id)
|
||||
loadedPost.name.should.be.equal(post.name)
|
||||
loadedPost.bool.should.be.equal(post.bool)
|
||||
loadedPost.bytes
|
||||
.toString()
|
||||
.should.be.equal(post.bytes.toString())
|
||||
loadedPost.timestamp
|
||||
.valueOf()
|
||||
.should.be.equal(post.timestamp.valueOf())
|
||||
|
||||
table!.findColumnByName("id")!.type.should.be.equal("int64")
|
||||
table!.findColumnByName("name")!.type.should.be.equal("string")
|
||||
table!.findColumnByName("bool")!.type.should.be.equal("bool")
|
||||
table!.findColumnByName("bytes")!.type.should.be.equal("bytes")
|
||||
table!
|
||||
.findColumnByName("timestamp")!
|
||||
.type.should.be.equal("timestamp")
|
||||
}),
|
||||
))
|
||||
})
|
||||
@ -0,0 +1,68 @@
|
||||
import { Column, Entity, PrimaryColumn } from "../../../../../../src"
|
||||
|
||||
@Entity()
|
||||
export class Post {
|
||||
@PrimaryColumn()
|
||||
id: number
|
||||
|
||||
@Column()
|
||||
name: string
|
||||
|
||||
// -------------------------------------------------------------------------
|
||||
// Integer Types
|
||||
// -------------------------------------------------------------------------
|
||||
|
||||
@Column("int64")
|
||||
int64: number
|
||||
|
||||
// -------------------------------------------------------------------------
|
||||
// Character Types
|
||||
// -------------------------------------------------------------------------
|
||||
|
||||
@Column("string")
|
||||
string: string
|
||||
|
||||
// -------------------------------------------------------------------------
|
||||
// Float Types
|
||||
// -------------------------------------------------------------------------
|
||||
|
||||
@Column("float64")
|
||||
float64: number
|
||||
|
||||
// -------------------------------------------------------------------------
|
||||
// Binary Types
|
||||
// -------------------------------------------------------------------------
|
||||
|
||||
@Column("bytes")
|
||||
bytes: Buffer
|
||||
|
||||
// -------------------------------------------------------------------------
|
||||
// Numeric Types
|
||||
// -------------------------------------------------------------------------
|
||||
|
||||
@Column("numeric")
|
||||
numeric: string
|
||||
|
||||
// -------------------------------------------------------------------------
|
||||
// Date Types
|
||||
// -------------------------------------------------------------------------
|
||||
|
||||
@Column("date")
|
||||
date: string
|
||||
|
||||
@Column("timestamp")
|
||||
timestamp: Date
|
||||
|
||||
// -------------------------------------------------------------------------
|
||||
// Other Types
|
||||
// -------------------------------------------------------------------------
|
||||
|
||||
@Column("bool")
|
||||
bool: boolean
|
||||
|
||||
@Column("json")
|
||||
json: Object
|
||||
|
||||
@Column("string", { array: true })
|
||||
array: string[]
|
||||
}
|
||||
@ -0,0 +1,13 @@
import { Column, Entity, PrimaryColumn } from "../../../../../../src"

@Entity()
export class PostWithOptions {
    @PrimaryColumn()
    id: number

    @Column({ length: 50 })
    string: string

    @Column({ length: 50 })
    bytes: Buffer
}
|
||||
@ -0,0 +1,19 @@
import { Column, Entity, PrimaryColumn } from "../../../../../../src"

@Entity()
export class PostWithoutTypes {
    @PrimaryColumn()
    id: number

    @Column()
    name: string

    @Column()
    bool: boolean

    @Column()
    bytes: Buffer

    @Column()
    timestamp: Date
}
|
||||
@ -4,7 +4,7 @@ import { PrimaryColumn } from "../../../../../src/decorator/columns/PrimaryColum
|
||||
|
||||
@Entity()
|
||||
export class Post {
|
||||
@PrimaryColumn("int")
|
||||
@PrimaryColumn()
|
||||
id: number
|
||||
|
||||
@Column({ nullable: true })
|
||||
|
||||
@ -432,7 +432,7 @@ describe("embedded > embedded-many-to-many-case1", () => {
|
||||
|
||||
const loadedUsers2 = (await connection
|
||||
.getRepository(User)
|
||||
.find())!
|
||||
.find({ order: { name: "ASC" } }))!
|
||||
expect(loadedUsers2.length).to.be.equal(2)
|
||||
expect(loadedUsers2[0].name).to.be.equal("Bob")
|
||||
expect(loadedUsers2[1].name).to.be.equal("Clara")
|
||||
|
||||
@ -240,7 +240,7 @@ describe("embedded > embedded-many-to-many-case2", () => {
|
||||
|
||||
const loadedUsers2 = (await connection
|
||||
.getRepository(User)
|
||||
.find())!
|
||||
.find({ order: { name: "ASC" } }))!
|
||||
expect(loadedUsers2.length).to.be.equal(2)
|
||||
expect(loadedUsers2[0].name).to.be.equal("Bob")
|
||||
expect(loadedUsers2[1].name).to.be.equal("Clara")
|
||||
|
||||
@ -428,7 +428,9 @@ describe("embedded > embedded-many-to-many-case3", () => {
|
||||
|
||||
await connection.getRepository(User).remove(loadedUser!)
|
||||
|
||||
loadedUsers = (await connection.getRepository(User).find())!
|
||||
loadedUsers = (await connection
|
||||
.getRepository(User)
|
||||
.find({ order: { name: "ASC" } }))!
|
||||
expect(loadedUsers.length).to.be.equal(2)
|
||||
expect(loadedUsers[0].name).to.be.equal("Bob")
|
||||
expect(loadedUsers[1].name).to.be.equal("Clara")
|
||||
|
||||
@ -445,7 +445,9 @@ describe("embedded > embedded-many-to-many-case4", () => {
|
||||
|
||||
await connection.getRepository(User).remove(loadedUser!)
|
||||
|
||||
loadedUsers = (await connection.getRepository(User).find())!
|
||||
loadedUsers = (await connection
|
||||
.getRepository(User)
|
||||
.find({ order: { name: "ASC" } }))!
|
||||
expect(loadedUsers.length).to.be.equal(2)
|
||||
expect(loadedUsers[0].name).to.be.equal("Bob")
|
||||
expect(loadedUsers[1].name).to.be.equal("Clara")
|
||||
|
||||
@ -445,7 +445,9 @@ describe("embedded > embedded-many-to-many-case5", () => {
|
||||
|
||||
await connection.getRepository(User).remove(loadedUser!)
|
||||
|
||||
loadedUsers = (await connection.getRepository(User).find())!
|
||||
loadedUsers = (await connection
|
||||
.getRepository(User)
|
||||
.find({ order: { name: "ASC" } }))!
|
||||
expect(loadedUsers.length).to.be.equal(2)
|
||||
expect(loadedUsers[0].name).to.be.equal("Bob")
|
||||
expect(loadedUsers[1].name).to.be.equal("Clara")
|
||||
|
||||
@ -184,7 +184,9 @@ describe("embedded > embedded-many-to-one-case2", () => {
|
||||
|
||||
await connection.getRepository(User).remove(loadedUser!)
|
||||
|
||||
loadedUsers = (await connection.getRepository(User).find())!
|
||||
loadedUsers = (await connection
|
||||
.getRepository(User)
|
||||
.find({ order: { name: "ASC" } }))!
|
||||
expect(loadedUsers.length).to.be.equal(1)
|
||||
expect(loadedUsers[0].name).to.be.equal("Bob")
|
||||
}),
|
||||
|
||||
@ -346,7 +346,9 @@ describe("embedded > embedded-one-to-one", () => {
|
||||
|
||||
await connection.getRepository(User).remove(loadedUser!)
|
||||
|
||||
loadedUsers = (await connection.getRepository(User).find())!
|
||||
loadedUsers = (await connection
|
||||
.getRepository(User)
|
||||
.find({ order: { name: "ASC" } }))!
|
||||
expect(loadedUsers.length).to.be.equal(1)
|
||||
expect(loadedUsers[0].name).to.be.equal("Bob")
|
||||
}),
|
||||
|
||||
@ -25,6 +25,7 @@ describe("entity-model", () => {
|
||||
Post.useDataSource(connection) // change connection each time because of AR specifics
|
||||
|
||||
const post = Post.create()
|
||||
post.id = 1
|
||||
post.title = "About ActiveRecord"
|
||||
post.text = "Huge discussion how good or bad ActiveRecord is."
|
||||
await post.save()
|
||||
@ -52,16 +53,18 @@ describe("entity-model", () => {
|
||||
|
||||
const externalId = "external-entity"
|
||||
|
||||
await Post.upsert({ externalId, title: "External post" }, [
|
||||
"externalId",
|
||||
])
|
||||
await Post.upsert(
|
||||
{ externalId, id: 1, title: "External post" },
|
||||
["externalId"],
|
||||
)
|
||||
const upsertInsertedExternalPost = await Post.findOneByOrFail({
|
||||
externalId,
|
||||
})
|
||||
|
||||
await Post.upsert({ externalId, title: "External post 2" }, [
|
||||
"externalId",
|
||||
])
|
||||
await Post.upsert(
|
||||
{ externalId, id: 1, title: "External post 2" },
|
||||
["externalId"],
|
||||
)
|
||||
const upsertUpdatedExternalPost = await Post.findOneByOrFail({
|
||||
externalId,
|
||||
})
|
||||
@ -89,6 +92,7 @@ describe("entity-model", () => {
|
||||
await category.save()
|
||||
|
||||
const post = Post.create()
|
||||
post.id = 1
|
||||
post.title = "About ActiveRecord"
|
||||
post.categories = [category]
|
||||
await post.save()
|
||||
@ -126,6 +130,7 @@ describe("entity-model", () => {
|
||||
Category.useDataSource(connection)
|
||||
|
||||
const post1 = Post.create()
|
||||
post1.id = 1
|
||||
post1.title = "About ActiveRecord 1"
|
||||
post1.externalId = "some external id 1"
|
||||
await post1.save()
|
||||
@ -146,6 +151,7 @@ describe("entity-model", () => {
|
||||
})
|
||||
|
||||
const post2 = Post.create()
|
||||
post2.id = 2
|
||||
post2.title = "About ActiveRecord 2"
|
||||
post2.externalId = "some external id 2"
|
||||
await post2.save()
|
||||
|
||||
@ -4,13 +4,13 @@ import {
|
||||
Entity,
|
||||
JoinTable,
|
||||
ManyToMany,
|
||||
PrimaryGeneratedColumn,
|
||||
PrimaryColumn,
|
||||
} from "../../../../src"
|
||||
import { Category } from "./Category"
|
||||
|
||||
@Entity()
|
||||
export class Post extends BaseEntity {
|
||||
@PrimaryGeneratedColumn()
|
||||
@PrimaryColumn()
|
||||
id: number
|
||||
|
||||
@Column({
|
||||
|
||||
@ -24,6 +24,7 @@ describe("entity schemas > basic functionality", () => {
|
||||
connections.map(async (connection) => {
|
||||
const postRepository = connection.getRepository(PostEntity)
|
||||
const post = postRepository.create({
|
||||
id: 1,
|
||||
title: "First Post",
|
||||
text: "About first post",
|
||||
})
|
||||
|
||||
@ -7,7 +7,6 @@ export const CategoryEntity = new EntitySchema<Category>({
|
||||
id: {
|
||||
type: Number,
|
||||
primary: true,
|
||||
generated: true,
|
||||
},
|
||||
name: {
|
||||
type: String,
|
||||
|
||||
@ -7,7 +7,6 @@ export const PostEntity = new EntitySchema<Post>({
|
||||
id: {
|
||||
type: Number,
|
||||
primary: true,
|
||||
generated: true,
|
||||
},
|
||||
title: {
|
||||
type: String,
|
||||
|
||||
@ -6,30 +6,59 @@ import {
|
||||
} from "../../../utils/test-utils"
|
||||
import { DataSource } from "../../../../src/data-source/DataSource"
|
||||
import { PersonSchema } from "./entity/Person"
|
||||
import { DriverUtils } from "../../../../src/driver/DriverUtils"
|
||||
import { PersonSchema2 } from "./entity/Person2"
|
||||
|
||||
describe("entity-schema > checks", () => {
|
||||
let connections: DataSource[]
|
||||
before(
|
||||
async () =>
|
||||
(connections = await createTestingConnections({
|
||||
entities: [<any>PersonSchema],
|
||||
})),
|
||||
)
|
||||
beforeEach(() => reloadTestingDatabases(connections))
|
||||
after(() => closeTestingConnections(connections))
|
||||
describe("entity-schema > checks > postgres, cockroachdb, oracle, mssql", () => {
|
||||
let connections: DataSource[]
|
||||
before(
|
||||
async () =>
|
||||
(connections = await createTestingConnections({
|
||||
entities: [<any>PersonSchema],
|
||||
enabledDrivers: [
|
||||
"postgres",
|
||||
"cockroachdb",
|
||||
"oracle",
|
||||
"mssql",
|
||||
],
|
||||
})),
|
||||
)
|
||||
beforeEach(() => reloadTestingDatabases(connections))
|
||||
after(() => closeTestingConnections(connections))
|
||||
|
||||
it("should create a check constraints", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
// Mysql does not support check constraints.
|
||||
if (DriverUtils.isMySQLFamily(connection.driver)) return
|
||||
it("should create a check constraints", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
const queryRunner = connection.createQueryRunner()
|
||||
const table = await queryRunner.getTable("person")
|
||||
await queryRunner.release()
|
||||
|
||||
const queryRunner = connection.createQueryRunner()
|
||||
const table = await queryRunner.getTable("person")
|
||||
await queryRunner.release()
|
||||
table!.checks.length.should.be.equal(2)
|
||||
}),
|
||||
))
|
||||
})
|
||||
|
||||
table!.checks.length.should.be.equal(2)
|
||||
}),
|
||||
))
|
||||
describe("entity-schema > checks > spanner", () => {
|
||||
let connections: DataSource[]
|
||||
before(
|
||||
async () =>
|
||||
(connections = await createTestingConnections({
|
||||
entities: [<any>PersonSchema2],
|
||||
enabledDrivers: ["spanner"],
|
||||
})),
|
||||
)
|
||||
beforeEach(() => reloadTestingDatabases(connections))
|
||||
after(() => closeTestingConnections(connections))
|
||||
|
||||
it("should create a check constraints", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
const queryRunner = connection.createQueryRunner()
|
||||
const table = await queryRunner.getTable("person")
|
||||
await queryRunner.release()
|
||||
|
||||
table!.checks.length.should.be.equal(2)
|
||||
}),
|
||||
))
|
||||
})
|
||||
})
|
||||
|
||||
@ -5,7 +5,7 @@ export const PersonSchema = new EntitySchema<any>({
|
||||
columns: {
|
||||
Id: {
|
||||
primary: true,
|
||||
type: "int",
|
||||
type: Number,
|
||||
generated: "increment",
|
||||
},
|
||||
FirstName: {
|
||||
|
||||
29
test/functional/entity-schema/checks/entity/Person2.ts
Normal file
29
test/functional/entity-schema/checks/entity/Person2.ts
Normal file
@ -0,0 +1,29 @@
|
||||
import { EntitySchema } from "../../../../../src/index"
|
||||
|
||||
export const PersonSchema2 = new EntitySchema<any>({
|
||||
name: "Person",
|
||||
columns: {
|
||||
Id: {
|
||||
primary: true,
|
||||
type: Number,
|
||||
generated: "increment",
|
||||
},
|
||||
FirstName: {
|
||||
type: String,
|
||||
length: 30,
|
||||
},
|
||||
LastName: {
|
||||
type: String,
|
||||
length: 50,
|
||||
nullable: false,
|
||||
},
|
||||
Age: {
|
||||
type: Number,
|
||||
nullable: false,
|
||||
},
|
||||
},
|
||||
checks: [
|
||||
{ expression: `\`FirstName\` <> 'John' AND \`LastName\` <> 'Doe'` },
|
||||
{ expression: `\`Age\` > 18` },
|
||||
],
|
||||
})
|
||||
@ -13,6 +13,7 @@ describe("entity-schema > exclusions", () => {
|
||||
async () =>
|
||||
(connections = await createTestingConnections({
|
||||
entities: [<any>MeetingSchema],
|
||||
enabledDrivers: ["postgres"],
|
||||
})),
|
||||
)
|
||||
beforeEach(() => reloadTestingDatabases(connections))
|
||||
@ -21,9 +22,6 @@ describe("entity-schema > exclusions", () => {
|
||||
it("should create an exclusion constraint", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
// Only PostgreSQL supports exclusion constraints.
|
||||
if (!(connection.driver.options.type === "postgres")) return
|
||||
|
||||
const queryRunner = connection.createQueryRunner()
|
||||
const table = await queryRunner.getTable("meeting")
|
||||
await queryRunner.release()
|
||||
|
||||
@ -5,7 +5,7 @@ export const PersonSchema = new EntitySchema<any>({
|
||||
columns: {
|
||||
Id: {
|
||||
primary: true,
|
||||
type: "int",
|
||||
type: Number,
|
||||
generated: "increment",
|
||||
},
|
||||
FirstName: {
|
||||
|
||||
@ -5,7 +5,7 @@ export const PersonSchema = new EntitySchema<any>({
|
||||
columns: {
|
||||
Id: {
|
||||
primary: true,
|
||||
type: "int",
|
||||
type: Number,
|
||||
generated: "increment",
|
||||
},
|
||||
FirstName: {
|
||||
|
||||
@ -29,7 +29,8 @@ describe("entity-schema > uniques", () => {
|
||||
|
||||
if (
|
||||
DriverUtils.isMySQLFamily(connection.driver) ||
|
||||
connection.driver.options.type === "sap"
|
||||
connection.driver.options.type === "sap" ||
|
||||
connection.driver.options.type === "spanner"
|
||||
) {
|
||||
expect(table!.indices.length).to.be.equal(1)
|
||||
expect(table!.indices[0].name).to.be.equal("UNIQUE_TEST")
|
||||
|
||||
@ -58,7 +58,11 @@ describe("entity subscriber > transaction flow", () => {
|
||||
|
||||
it("transactionStart", async () => {
|
||||
for (let connection of connections) {
|
||||
if (connection.driver.options.type === "mssql") return
|
||||
if (
|
||||
connection.driver.options.type === "mssql" ||
|
||||
connection.driver.options.type === "spanner"
|
||||
)
|
||||
return
|
||||
|
||||
beforeTransactionStart.resetHistory()
|
||||
afterTransactionStart.resetHistory()
|
||||
@ -140,7 +144,11 @@ describe("entity subscriber > transaction flow", () => {
|
||||
|
||||
it("transactionCommit", async () => {
|
||||
for (let connection of connections) {
|
||||
if (connection.driver.options.type === "mssql") return
|
||||
if (
|
||||
connection.driver.options.type === "mssql" ||
|
||||
connection.driver.options.type === "spanner"
|
||||
)
|
||||
return
|
||||
|
||||
beforeTransactionCommit.resetHistory()
|
||||
afterTransactionCommit.resetHistory()
|
||||
@ -204,7 +212,11 @@ describe("entity subscriber > transaction flow", () => {
|
||||
|
||||
it("transactionRollback", async () => {
|
||||
for (let connection of connections) {
|
||||
if (connection.driver.options.type === "mssql") return
|
||||
if (
|
||||
connection.driver.options.type === "mssql" ||
|
||||
connection.driver.options.type === "spanner"
|
||||
)
|
||||
return
|
||||
|
||||
beforeTransactionRollback.resetHistory()
|
||||
afterTransactionRollback.resetHistory()
|
||||
|
||||
@ -1,14 +1,9 @@
|
||||
import {
|
||||
Column,
|
||||
Entity,
|
||||
OneToMany,
|
||||
PrimaryGeneratedColumn,
|
||||
} from "../../../../../src"
|
||||
import { Column, Entity, OneToMany, PrimaryColumn } from "../../../../../src"
|
||||
import { Photo } from "./Photo"
|
||||
|
||||
@Entity()
|
||||
export class Author {
|
||||
@PrimaryGeneratedColumn()
|
||||
@PrimaryColumn()
|
||||
id: number
|
||||
|
||||
@Column()
|
||||
|
||||
@ -1,14 +1,9 @@
|
||||
import {
|
||||
Column,
|
||||
Entity,
|
||||
ManyToOne,
|
||||
PrimaryGeneratedColumn,
|
||||
} from "../../../../../src"
|
||||
import { Column, Entity, ManyToOne, PrimaryColumn } from "../../../../../src"
|
||||
import { Author } from "./Author"
|
||||
|
||||
@Entity()
|
||||
export class Photo {
|
||||
@PrimaryGeneratedColumn()
|
||||
@PrimaryColumn()
|
||||
id: number
|
||||
|
||||
@Column()
|
||||
|
||||
@ -4,7 +4,7 @@ import {
|
||||
JoinTable,
|
||||
ManyToMany,
|
||||
ManyToOne,
|
||||
PrimaryGeneratedColumn,
|
||||
PrimaryColumn,
|
||||
} from "../../../../../src"
|
||||
import { Tag } from "./Tag"
|
||||
import { Author } from "./Author"
|
||||
@ -12,7 +12,7 @@ import { Counters } from "./Counters"
|
||||
|
||||
@Entity()
|
||||
export class Post {
|
||||
@PrimaryGeneratedColumn()
|
||||
@PrimaryColumn()
|
||||
id: number
|
||||
|
||||
@Column()
|
||||
|
||||
@ -1,14 +1,9 @@
|
||||
import {
|
||||
Column,
|
||||
Entity,
|
||||
ManyToMany,
|
||||
PrimaryGeneratedColumn,
|
||||
} from "../../../../../src"
|
||||
import { Column, Entity, ManyToMany, PrimaryColumn } from "../../../../../src"
|
||||
import { Post } from "./Post"
|
||||
|
||||
@Entity()
|
||||
export class Tag {
|
||||
@PrimaryGeneratedColumn()
|
||||
@PrimaryColumn()
|
||||
id: number
|
||||
|
||||
@Column()
|
||||
|
||||
@ -393,6 +393,7 @@ describe("find options > order", () => {
|
||||
},
|
||||
},
|
||||
order: {
|
||||
id: "asc",
|
||||
author: {
|
||||
id: "desc",
|
||||
},
|
||||
|
||||
@ -8,16 +8,19 @@ import { Counters } from "./entity/Counters"
|
||||
|
||||
export async function prepareData(manager: EntityManager) {
|
||||
const photo1 = new Photo()
|
||||
photo1.id = 1
|
||||
photo1.filename = "saw.jpg"
|
||||
photo1.description = "Me and saw"
|
||||
await manager.save(photo1)
|
||||
|
||||
const photo2 = new Photo()
|
||||
photo2.id = 2
|
||||
photo2.filename = "chain.jpg"
|
||||
photo2.description = "Me and chain"
|
||||
await manager.save(photo2)
|
||||
|
||||
const user1 = new Author()
|
||||
user1.id = 1
|
||||
user1.firstName = "Timber"
|
||||
user1.lastName = "Saw"
|
||||
user1.age = 25
|
||||
@ -25,6 +28,7 @@ export async function prepareData(manager: EntityManager) {
|
||||
await manager.save(user1)
|
||||
|
||||
const user2 = new Author()
|
||||
user2.id = 2
|
||||
user2.firstName = "Gyro"
|
||||
user2.lastName = "Copter"
|
||||
user2.age = 52
|
||||
@ -32,18 +36,22 @@ export async function prepareData(manager: EntityManager) {
|
||||
await manager.save(user2)
|
||||
|
||||
const tag1 = new Tag()
|
||||
tag1.id = 1
|
||||
tag1.name = "category #1"
|
||||
await manager.save(tag1)
|
||||
|
||||
const tag2 = new Tag()
|
||||
tag2.id = 2
|
||||
tag2.name = "category #2"
|
||||
await manager.save(tag2)
|
||||
|
||||
const tag3 = new Tag()
|
||||
tag3.id = 3
|
||||
tag3.name = "category #3"
|
||||
await manager.save(tag3)
|
||||
|
||||
const post1 = new Post()
|
||||
post1.id = 1
|
||||
post1.title = "Post #1"
|
||||
post1.text = "About post #1"
|
||||
post1.author = user1
|
||||
@ -54,6 +62,7 @@ export async function prepareData(manager: EntityManager) {
|
||||
await manager.save(post1)
|
||||
|
||||
const post2 = new Post()
|
||||
post2.id = 2
|
||||
post2.title = "Post #2"
|
||||
post2.text = "About post #2"
|
||||
post2.author = user1
|
||||
@ -64,6 +73,7 @@ export async function prepareData(manager: EntityManager) {
|
||||
await manager.save(post2)
|
||||
|
||||
const post3 = new Post()
|
||||
post3.id = 3
|
||||
post3.title = "Post #3"
|
||||
post3.text = "About post #3"
|
||||
post3.author = user2
|
||||
|
||||
@ -16,7 +16,9 @@ describe("find options > where", () => {
|
||||
let connections: DataSource[]
|
||||
before(
|
||||
async () =>
|
||||
(connections = await createTestingConnections({ __dirname })),
|
||||
(connections = await createTestingConnections({
|
||||
__dirname,
|
||||
})),
|
||||
)
|
||||
beforeEach(() => reloadTestingDatabases(connections))
|
||||
after(() => closeTestingConnections(connections))
|
||||
@ -207,6 +209,9 @@ describe("find options > where", () => {
|
||||
},
|
||||
},
|
||||
},
|
||||
order: {
|
||||
id: "asc",
|
||||
},
|
||||
})
|
||||
.getMany()
|
||||
posts.should.be.eql([
|
||||
@ -271,6 +276,9 @@ describe("find options > where", () => {
|
||||
likes: 1,
|
||||
},
|
||||
},
|
||||
order: {
|
||||
id: "asc",
|
||||
},
|
||||
})
|
||||
.getMany()
|
||||
posts.should.be.eql([
|
||||
@ -305,6 +313,9 @@ describe("find options > where", () => {
|
||||
},
|
||||
},
|
||||
},
|
||||
order: {
|
||||
id: "asc",
|
||||
},
|
||||
})
|
||||
.getMany()
|
||||
posts.should.be.eql([
|
||||
@ -351,6 +362,9 @@ describe("find options > where", () => {
|
||||
},
|
||||
},
|
||||
],
|
||||
order: {
|
||||
id: "asc",
|
||||
},
|
||||
})
|
||||
.getMany()
|
||||
posts.should.be.eql([
|
||||
@ -426,6 +440,9 @@ describe("find options > where", () => {
|
||||
photos: MoreThan(1),
|
||||
},
|
||||
},
|
||||
order: {
|
||||
id: "asc",
|
||||
},
|
||||
})
|
||||
.getMany()
|
||||
posts3.should.be.eql([
|
||||
@ -461,6 +478,9 @@ describe("find options > where", () => {
|
||||
where: {
|
||||
posts: MoreThan(1),
|
||||
},
|
||||
order: {
|
||||
id: "asc",
|
||||
},
|
||||
})
|
||||
.getMany()
|
||||
tags1.should.be.eql([
|
||||
@ -486,6 +506,7 @@ describe("find options > where", () => {
|
||||
await prepareData(connection.manager)
|
||||
|
||||
const post4 = new Post()
|
||||
post4.id = 4
|
||||
post4.title = "Post #4"
|
||||
post4.text = "About post #4"
|
||||
post4.counters = new Counters()
|
||||
@ -501,6 +522,9 @@ describe("find options > where", () => {
|
||||
firstName: undefined,
|
||||
},
|
||||
},
|
||||
order: {
|
||||
id: "asc",
|
||||
},
|
||||
})
|
||||
.getMany()
|
||||
posts.should.be.eql([
|
||||
@ -538,6 +562,7 @@ describe("find options > where", () => {
|
||||
await prepareData(connection.manager)
|
||||
|
||||
const post4 = new Post()
|
||||
post4.id = 4
|
||||
post4.title = "Post #4"
|
||||
post4.text = "About post #4"
|
||||
post4.counters = new Counters()
|
||||
@ -550,6 +575,9 @@ describe("find options > where", () => {
|
||||
where: {
|
||||
author: true,
|
||||
},
|
||||
order: {
|
||||
id: "asc",
|
||||
},
|
||||
})
|
||||
.getMany()
|
||||
posts.should.be.eql([
|
||||
|
||||
@ -7,7 +7,6 @@ import {
|
||||
createTestingConnections,
|
||||
reloadTestingDatabases,
|
||||
} from "../../../utils/test-utils"
|
||||
// import {expect} from "chai";
|
||||
|
||||
describe("persistence > bulk-insert-remove-optimization", function () {
|
||||
// -------------------------------------------------------------------------
|
||||
@ -32,12 +31,15 @@ describe("persistence > bulk-insert-remove-optimization", function () {
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
const category1 = new Category()
|
||||
category1.id = 1
|
||||
category1.name = "cat#1"
|
||||
|
||||
const category2 = new Category()
|
||||
category2.id = 2
|
||||
category2.name = "cat#2"
|
||||
|
||||
const post = new Post()
|
||||
post.id = 1
|
||||
post.title = "about post"
|
||||
post.categories = [category1, category2]
|
||||
|
||||
|
||||
@ -1,12 +1,12 @@
|
||||
import { Entity } from "../../../../../src/decorator/entity/Entity"
|
||||
import { PrimaryGeneratedColumn } from "../../../../../src/decorator/columns/PrimaryGeneratedColumn"
|
||||
import { Post } from "./Post"
|
||||
import { Column } from "../../../../../src/decorator/columns/Column"
|
||||
import { ManyToMany } from "../../../../../src/decorator/relations/ManyToMany"
|
||||
import { PrimaryColumn } from "../../../../../src"
|
||||
|
||||
@Entity()
|
||||
export class Category {
|
||||
@PrimaryGeneratedColumn()
|
||||
@PrimaryColumn()
|
||||
id: number
|
||||
|
||||
@Column()
|
||||
|
||||
@ -1,13 +1,13 @@
|
||||
import { Category } from "./Category"
|
||||
import { Entity } from "../../../../../src/decorator/entity/Entity"
|
||||
import { PrimaryGeneratedColumn } from "../../../../../src/decorator/columns/PrimaryGeneratedColumn"
|
||||
import { Column } from "../../../../../src/decorator/columns/Column"
|
||||
import { ManyToMany } from "../../../../../src/decorator/relations/ManyToMany"
|
||||
import { JoinTable } from "../../../../../src/decorator/relations/JoinTable"
|
||||
import { PrimaryColumn } from "../../../../../src"
|
||||
|
||||
@Entity()
|
||||
export class Post {
|
||||
@PrimaryGeneratedColumn()
|
||||
@PrimaryColumn()
|
||||
id: number
|
||||
|
||||
@Column()
|
||||
|
||||
@ -24,11 +24,16 @@ describe("persistence > cascades > example 1", () => {
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
const photo = new Photo()
|
||||
photo.id = 1
|
||||
if (connection.driver.options.type === "spanner")
|
||||
photo.name = "My photo"
|
||||
|
||||
const profile = new Profile()
|
||||
profile.id = 1
|
||||
profile.photo = photo
|
||||
|
||||
const user = new User()
|
||||
user.id = 1
|
||||
user.name = "Umed"
|
||||
user.profile = profile
|
||||
|
||||
|
||||
@ -1,10 +1,10 @@
|
||||
import { Column } from "../../../../../../src/decorator/columns/Column"
|
||||
import { Entity } from "../../../../../../src/decorator/entity/Entity"
|
||||
import { PrimaryGeneratedColumn } from "../../../../../../src/decorator/columns/PrimaryGeneratedColumn"
|
||||
import { PrimaryColumn } from "../../../../../../src"
|
||||
|
||||
@Entity()
|
||||
export class Photo {
|
||||
@PrimaryGeneratedColumn()
|
||||
@PrimaryColumn()
|
||||
id: number
|
||||
|
||||
@Column({ default: "My photo" })
|
||||
|
||||
@ -1,13 +1,13 @@
|
||||
import { Entity } from "../../../../../../src/decorator/entity/Entity"
|
||||
import { PrimaryGeneratedColumn } from "../../../../../../src/decorator/columns/PrimaryGeneratedColumn"
|
||||
import { User } from "./User"
|
||||
import { Photo } from "./Photo"
|
||||
import { OneToOne } from "../../../../../../src/decorator/relations/OneToOne"
|
||||
import { JoinColumn } from "../../../../../../src/decorator/relations/JoinColumn"
|
||||
import { PrimaryColumn } from "../../../../../../src"
|
||||
|
||||
@Entity()
|
||||
export class Profile {
|
||||
@PrimaryGeneratedColumn()
|
||||
@PrimaryColumn()
|
||||
id: number
|
||||
|
||||
@OneToOne((type) => User, (user) => user.profile, {
|
||||
|
||||
@ -1,12 +1,12 @@
|
||||
import { Column } from "../../../../../../src/decorator/columns/Column"
|
||||
import { Entity } from "../../../../../../src/decorator/entity/Entity"
|
||||
import { PrimaryGeneratedColumn } from "../../../../../../src/decorator/columns/PrimaryGeneratedColumn"
|
||||
import { Profile } from "./Profile"
|
||||
import { OneToOne } from "../../../../../../src/decorator/relations/OneToOne"
|
||||
import { PrimaryColumn } from "../../../../../../src"
|
||||
|
||||
@Entity()
|
||||
export class User {
|
||||
@PrimaryGeneratedColumn()
|
||||
@PrimaryColumn()
|
||||
id: number
|
||||
|
||||
@Column()
|
||||
|
||||
@ -24,6 +24,9 @@ describe("persistence > cascades > example 2", () => {
|
||||
it("should insert everything by cascades properly", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
// not supported in Spanner
|
||||
if (connection.driver.options.type === "spanner") return
|
||||
|
||||
const photo = new Photo()
|
||||
const user = new User()
|
||||
|
||||
|
||||
@ -54,6 +54,9 @@ describe("persistence > entity updation", () => {
|
||||
it("should update default values after saving", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
// Spanner does not support DEFAULT values
|
||||
if (connection.driver.options.type === "spanner") return
|
||||
|
||||
const post = new PostDefaultValues()
|
||||
post.title = "Post #1"
|
||||
await connection.manager.save(post)
|
||||
@ -69,6 +72,9 @@ describe("persistence > entity updation", () => {
|
||||
it("should update special columns after saving", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
// Spanner does not support DEFAULT values
|
||||
if (connection.driver.options.type === "spanner") return
|
||||
|
||||
const post = new PostSpecialColumns()
|
||||
post.title = "Post #1"
|
||||
await connection.manager.save(post)
|
||||
@ -95,6 +101,9 @@ describe("persistence > entity updation", () => {
|
||||
it("should update even with embeddeds", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
// Spanner does not support DEFAULT values
|
||||
if (connection.driver.options.type === "spanner") return
|
||||
|
||||
const post = new PostComplex()
|
||||
post.firstId = 1
|
||||
post.embed = new PostEmbedded()
|
||||
|
||||
@ -252,6 +252,9 @@ describe("persistence > many-to-one bi-directional relation", function () {
|
||||
it("should set category's post to NULL when post is removed from the database (database ON DELETE)", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
// Spanner does not support ON DELETE clause
|
||||
if (connection.driver.options.type === "spanner") return
|
||||
|
||||
const post = new Post(1, "Hello Post")
|
||||
await connection.manager.save(post)
|
||||
|
||||
|
||||
@ -252,6 +252,9 @@ describe("persistence > many-to-one uni-directional relation", function () {
|
||||
it("should set category's post to NULL when post is removed from the database (database ON DELETE)", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
// Spanner does not support ON DELETE clause
|
||||
if (connection.driver.options.type === "spanner") return
|
||||
|
||||
const post = new Post(1, "Hello Post")
|
||||
await connection.manager.save(post)
|
||||
|
||||
|
||||
@ -8,7 +8,7 @@ import { DataSource } from "../../../../src/data-source/DataSource"
|
||||
import { Post } from "./entity/Post"
|
||||
import { Category } from "./entity/Category"
|
||||
|
||||
describe("persistence > multi primary keys", () => {
|
||||
describe("persistence > multi primary keys on both sides", () => {
|
||||
let connections: DataSource[]
|
||||
before(
|
||||
async () =>
|
||||
|
||||
@ -3,12 +3,10 @@ import { PrimaryColumn } from "../../../../../src/decorator/columns/PrimaryColum
|
||||
import { Column } from "../../../../../src/decorator/columns/Column"
|
||||
import { Post } from "./Post"
|
||||
import { OneToMany } from "../../../../../src/decorator/relations/OneToMany"
|
||||
import { Generated } from "../../../../../src/decorator/Generated"
|
||||
|
||||
@Entity()
|
||||
export class Category {
|
||||
@PrimaryColumn("int")
|
||||
@Generated()
|
||||
@PrimaryColumn()
|
||||
categoryId: number
|
||||
|
||||
@Column()
|
||||
|
||||
@ -38,6 +38,7 @@ describe("persistence > multi primary keys", () => {
|
||||
|
||||
// create first category and post and save them
|
||||
const category1 = new Category()
|
||||
category1.categoryId = 1
|
||||
category1.name = "Category saved by cascades #1"
|
||||
category1.posts = [post1]
|
||||
|
||||
|
||||
@ -1,12 +1,12 @@
|
||||
import { Entity } from "../../../../../src/decorator/entity/Entity"
|
||||
import { PrimaryGeneratedColumn } from "../../../../../src/decorator/columns/PrimaryGeneratedColumn"
|
||||
import { ManyToOne } from "../../../../../src/decorator/relations/ManyToOne"
|
||||
import { Post } from "./Post"
|
||||
import { Column } from "../../../../../src/decorator/columns/Column"
|
||||
import { PrimaryColumn } from "../../../../../src"
|
||||
|
||||
@Entity()
|
||||
export class Category {
|
||||
@PrimaryGeneratedColumn()
|
||||
@PrimaryColumn()
|
||||
id: number
|
||||
|
||||
@ManyToOne((type) => Post, (post) => post.categories)
|
||||
|
||||
@@ -1,19 +1,17 @@
import { Category } from "./Category"
import { Entity } from "../../../../../src/decorator/entity/Entity"
import { PrimaryGeneratedColumn } from "../../../../../src/decorator/columns/PrimaryGeneratedColumn"
import { OneToMany } from "../../../../../src/decorator/relations/OneToMany"
import { Column } from "../../../../../src/decorator/columns/Column"
import { PrimaryColumn } from "../../../../../src"

@Entity()
export class Post {
@PrimaryGeneratedColumn()
@PrimaryColumn()
id: number

@OneToMany((type) => Category, (category) => category.post)
categories: Category[] | null

@Column({
default: "supervalue",
})
@Column()
title: string
}
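Several of these entity fixtures replace `@PrimaryGeneratedColumn()` (or `@PrimaryColumn("int")` plus `@Generated()`) with a plain `@PrimaryColumn()` so the tests can run on Spanner, which has no auto-increment columns. A minimal sketch of the alternative this commit enables ("typeorm-generated uuid" in the change notes), assuming the uuid strategy behaves as on other drivers without native key generation; the entity and field names below are illustrative only.

```typescript
import { Column, Entity, PrimaryGeneratedColumn } from "typeorm"

@Entity()
export class Photo {
    // Spanner has no identity/auto-increment columns; with the "uuid"
    // strategy TypeORM generates the key on the client before the INSERT.
    @PrimaryGeneratedColumn("uuid")
    id: string

    @Column()
    title: string
}
```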
@ -34,10 +34,12 @@ describe("persistence > one-to-many", function () {
|
||||
const categoryRepository = connection.getRepository(Category)
|
||||
|
||||
const newCategory = categoryRepository.create()
|
||||
newCategory.id = 1
|
||||
newCategory.name = "Animals"
|
||||
await categoryRepository.save(newCategory)
|
||||
|
||||
const newPost = postRepository.create()
|
||||
newPost.id = 1
|
||||
newPost.title = "All about animals"
|
||||
await postRepository.save(newPost)
|
||||
|
||||
@ -65,10 +67,12 @@ describe("persistence > one-to-many", function () {
|
||||
const categoryRepository = connection.getRepository(Category)
|
||||
|
||||
const newCategory = categoryRepository.create()
|
||||
newCategory.id = 1
|
||||
newCategory.name = "Animals"
|
||||
await categoryRepository.save(newCategory)
|
||||
|
||||
const newPost = postRepository.create()
|
||||
newPost.id = 1
|
||||
newPost.title = "All about animals"
|
||||
newPost.categories = [newCategory]
|
||||
await postRepository.save(newPost)
|
||||
@ -94,14 +98,17 @@ describe("persistence > one-to-many", function () {
|
||||
const categoryRepository = connection.getRepository(Category)
|
||||
|
||||
const firstNewCategory = categoryRepository.create()
|
||||
firstNewCategory.id = 1
|
||||
firstNewCategory.name = "Animals"
|
||||
await categoryRepository.save(firstNewCategory)
|
||||
|
||||
const secondNewCategory = categoryRepository.create()
|
||||
secondNewCategory.id = 2
|
||||
secondNewCategory.name = "Insects"
|
||||
await categoryRepository.save(secondNewCategory)
|
||||
|
||||
const newPost = postRepository.create()
|
||||
newPost.id = 1
|
||||
newPost.title = "All about animals"
|
||||
await postRepository.save(newPost)
|
||||
|
||||
@ -136,14 +143,17 @@ describe("persistence > one-to-many", function () {
|
||||
const categoryRepository = connection.getRepository(Category)
|
||||
|
||||
let firstNewCategory = categoryRepository.create()
|
||||
firstNewCategory.id = 1
|
||||
firstNewCategory.name = "Animals"
|
||||
await categoryRepository.save(firstNewCategory)
|
||||
|
||||
let secondNewCategory = categoryRepository.create()
|
||||
secondNewCategory.id = 2
|
||||
secondNewCategory.name = "Insects"
|
||||
await categoryRepository.save(secondNewCategory)
|
||||
|
||||
let newPost = postRepository.create()
|
||||
newPost.id = 1
|
||||
newPost.title = "All about animals"
|
||||
await postRepository.save(newPost)
|
||||
|
||||
@ -176,14 +186,17 @@ describe("persistence > one-to-many", function () {
|
||||
const categoryRepository = connection.getRepository(Category)
|
||||
|
||||
let firstNewCategory = categoryRepository.create()
|
||||
firstNewCategory.id = 1
|
||||
firstNewCategory.name = "Animals"
|
||||
await categoryRepository.save(firstNewCategory)
|
||||
|
||||
let secondNewCategory = categoryRepository.create()
|
||||
secondNewCategory.id = 2
|
||||
secondNewCategory.name = "Insects"
|
||||
await categoryRepository.save(secondNewCategory)
|
||||
|
||||
let newPost = postRepository.create()
|
||||
newPost.id = 1
|
||||
newPost.title = "All about animals"
|
||||
await postRepository.save(newPost)
|
||||
|
||||
|
||||
@@ -7,7 +7,7 @@ import { Generated } from "../../../../../src/decorator/Generated"

@Entity()
export class AccessToken {
@PrimaryColumn("int")
@PrimaryColumn()
@Generated()
primaryKey: number

@@ -7,7 +7,7 @@ import { Generated } from "../../../../../src/decorator/Generated"

@Entity()
export class User {
@PrimaryColumn("int")
@PrimaryColumn()
@Generated()
primaryKey: number
@@ -1,12 +1,14 @@
import { BeforeInsert } from "../../../../../../src/decorator/listeners/BeforeInsert"
import { Entity } from "../../../../../../src/decorator/entity/Entity"
import { PrimaryGeneratedColumn } from "../../../../../../src/decorator/columns/PrimaryGeneratedColumn"
import { Column } from "../../../../../../src/decorator/columns/Column"
import { AfterRemove } from "../../../../../../src/decorator/listeners/AfterRemove"
import {
AfterRemove,
BeforeInsert,
Column,
Entity,
PrimaryColumn,
} from "../../../../../../src"

@Entity()
export class Post {
@PrimaryGeneratedColumn()
@PrimaryColumn()
id: number

@Column()
@@ -7,7 +7,6 @@ import {
import { Post } from "./entity/Post"
import { DataSource } from "../../../../../src/data-source/DataSource"
import { PostWithDeleteDateColumn } from "./entity/PostWithDeleteDateColumn"
// import {expect} from "chai";

describe("persistence > persistence options > listeners", () => {
// -------------------------------------------------------------------------

@@ -30,6 +29,7 @@ describe("persistence > persistence options > listeners", () => {
Promise.all(
connections.map(async (connection) => {
const post = new Post()
post.id = 1
post.title = "Bakhrom"
post.description = "Hello"
await connection.manager.save(post)

@@ -41,6 +41,7 @@ describe("persistence > persistence options > listeners", () => {
Promise.all(
connections.map(async (connection) => {
const post = new Post()
post.id = 1
post.title = "Bakhrom"
post.description = "Hello"
await connection.manager.save(post, { listeners: false })

@@ -52,6 +53,7 @@ describe("persistence > persistence options > listeners", () => {
Promise.all(
connections.map(async (connection) => {
const post = new Post()
post.id = 1
post.title = "Bakhrom"
post.description = "Hello"
await connection.manager.save(post)

@@ -64,6 +66,7 @@ describe("persistence > persistence options > listeners", () => {
Promise.all(
connections.map(async (connection) => {
const post = new Post()
post.id = 1
post.title = "Bakhrom"
post.description = "Hello"
await connection.manager.save(post)
@@ -7,6 +7,6 @@ export class Foo {
@PrimaryColumn()
id: number

@Column("varchar")
@Column()
bar: string
}
@ -25,6 +25,9 @@ describe("query builder > cte > recursive", () => {
|
||||
connections
|
||||
.filter(filterByCteCapabilities("enabled"))
|
||||
.map(async (connection) => {
|
||||
// CTE cannot reference itself in Spanner
|
||||
if (connection.options.type === "spanner") return
|
||||
|
||||
const qb = await connection
|
||||
.createQueryBuilder()
|
||||
.select([])
|
||||
|
||||
@ -35,18 +35,28 @@ describe("query builder > cte > simple", () => {
|
||||
const cteQuery = connection
|
||||
.createQueryBuilder()
|
||||
.select()
|
||||
.addSelect(`foo.bar`)
|
||||
.addSelect(`foo.bar`, "bar")
|
||||
.from(Foo, "foo")
|
||||
.where(`foo.bar = :value`, { value: "2" })
|
||||
|
||||
// Spanner does not support column names in CTE
|
||||
const cteOptions =
|
||||
connection.driver.options.type === "spanner"
|
||||
? undefined
|
||||
: {
|
||||
columnNames: ["raz"],
|
||||
}
|
||||
const cteSelection =
|
||||
connection.driver.options.type === "spanner"
|
||||
? "qaz.bar"
|
||||
: "qaz.raz"
|
||||
|
||||
const qb = await connection
|
||||
.createQueryBuilder()
|
||||
.addCommonTableExpression(cteQuery, "qaz", {
|
||||
columnNames: ["raz"],
|
||||
})
|
||||
.addCommonTableExpression(cteQuery, "qaz", cteOptions)
|
||||
.from("qaz", "qaz")
|
||||
.select([])
|
||||
.addSelect("qaz.raz", "raz")
|
||||
.addSelect(cteSelection, "raz")
|
||||
|
||||
expect(await qb.getRawMany()).to.deep.equal([{ raz: "2" }])
|
||||
}),
|
||||
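The pattern in this hunk (and the next two) works around Spanner's WITH clause not accepting a column list after the CTE name: the alias is applied inside the CTE query itself, and the outer query selects that alias. A condensed sketch of the same branching, assuming the `connection` DataSource and the `Foo` entity used by these tests:

```typescript
const isSpanner = connection.driver.options.type === "spanner"

// Alias the column inside the CTE body so no column list is needed on WITH.
const cteQuery = connection
    .createQueryBuilder()
    .select()
    .addSelect("foo.bar", "bar")
    .from(Foo, "foo")
    .where("foo.bar = :value", { value: "2" })

const rows = await connection
    .createQueryBuilder()
    .addCommonTableExpression(
        cteQuery,
        "qaz",
        isSpanner ? undefined : { columnNames: ["raz"] },
    )
    .from("qaz", "qaz")
    .select([])
    .addSelect(isSpanner ? "qaz.bar" : "qaz.raz", "raz")
    .getRawMany()
```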
@ -65,16 +75,26 @@ describe("query builder > cte > simple", () => {
|
||||
const cteQuery = connection
|
||||
.createQueryBuilder()
|
||||
.select()
|
||||
.addSelect("bar")
|
||||
.addSelect("bar", "bar")
|
||||
.from(Foo, "foo")
|
||||
.where(`foo.bar = '2'`)
|
||||
|
||||
// Spanner does not support column names in CTE
|
||||
const cteOptions =
|
||||
connection.driver.options.type === "spanner"
|
||||
? undefined
|
||||
: {
|
||||
columnNames: ["raz"],
|
||||
}
|
||||
const cteSelection =
|
||||
connection.driver.options.type === "spanner"
|
||||
? "qaz.bar"
|
||||
: "qaz.raz"
|
||||
|
||||
const results = await connection
|
||||
.createQueryBuilder(Foo, "foo")
|
||||
.addCommonTableExpression(cteQuery, "qaz", {
|
||||
columnNames: ["raz"],
|
||||
})
|
||||
.innerJoin("qaz", "qaz", "qaz.raz = foo.bar")
|
||||
.addCommonTableExpression(cteQuery, "qaz", cteOptions)
|
||||
.innerJoin("qaz", "qaz", `${cteSelection} = foo.bar`)
|
||||
.getMany()
|
||||
|
||||
expect(results).to.have.length(1)
|
||||
@ -121,21 +141,41 @@ describe("query builder > cte > simple", () => {
|
||||
connections
|
||||
.filter(filterByCteCapabilities("enabled"))
|
||||
.map(async (connection) => {
|
||||
const results = await connection
|
||||
.createQueryBuilder()
|
||||
.select()
|
||||
.addCommonTableExpression(
|
||||
`
|
||||
SELECT 1
|
||||
UNION
|
||||
SELECT 2
|
||||
`,
|
||||
"cte",
|
||||
{ columnNames: ["foo"] },
|
||||
)
|
||||
.from("cte", "cte")
|
||||
.addSelect("foo", "row")
|
||||
.getRawMany<{ row: any }>()
|
||||
// Spanner does not support column names in CTE
|
||||
|
||||
let results: { row: any }[] = []
|
||||
if (connection.driver.options.type === "spanner") {
|
||||
results = await connection
|
||||
.createQueryBuilder()
|
||||
.select()
|
||||
.addCommonTableExpression(
|
||||
`
|
||||
SELECT 1 AS foo
|
||||
UNION ALL
|
||||
SELECT 2 AS foo
|
||||
`,
|
||||
"cte",
|
||||
)
|
||||
.from("cte", "cte")
|
||||
.addSelect("foo", "row")
|
||||
.getRawMany<{ row: any }>()
|
||||
} else {
|
||||
results = await connection
|
||||
.createQueryBuilder()
|
||||
.select()
|
||||
.addCommonTableExpression(
|
||||
`
|
||||
SELECT 1
|
||||
UNION
|
||||
SELECT 2
|
||||
`,
|
||||
"cte",
|
||||
{ columnNames: ["foo"] },
|
||||
)
|
||||
.from("cte", "cte")
|
||||
.addSelect("foo", "row")
|
||||
.getRawMany<{ row: any }>()
|
||||
}
|
||||
|
||||
const [rowWithOne, rowWithTwo] = results
|
||||
|
||||
|
||||
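In the Spanner branch of this hunk, the raw CTE also aliases each column in the SELECT list and spells out `UNION ALL`, since GoogleSQL requires `ALL` or `DISTINCT` after `UNION` and, as noted, accepts no column list on the WITH clause. A stripped-down sketch of just that Spanner-friendly raw-SQL CTE, with `connection` assumed from the test above:

```typescript
// Spanner-friendly raw CTE: columns aliased in the SELECT, UNION ALL spelled out.
const spannerCte = `
    SELECT 1 AS foo
    UNION ALL
    SELECT 2 AS foo
`

const rows = await connection
    .createQueryBuilder()
    .select()
    .addCommonTableExpression(spannerCte, "cte")
    .from("cte", "cte")
    .addSelect("foo", "row")
    .getRawMany<{ row: any }>()
```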
@ -133,7 +133,8 @@ describe("query builder > delete", () => {
|
||||
const result = await connection
|
||||
.createQueryBuilder()
|
||||
.delete()
|
||||
.from(User)
|
||||
.from(User, "user")
|
||||
.where("name IS NOT NULL")
|
||||
.execute()
|
||||
|
||||
expect(result.affected).to.equal(2)
|
||||
|
||||
@ -43,6 +43,11 @@ describe("query builder > entity updation", () => {
|
||||
it("should not update entity model after insertion if updateEntity is set to false", () =>
|
||||
Promise.all(
|
||||
connections.map(async (connection) => {
|
||||
// for spanner we skip this test, because it's not possible to do it right considering we faked primary generated column
|
||||
// for the spanner and we have updateEntity(false) in this test, but we cannot disable subscriber defined in the tests setup
|
||||
// for the spanner and it updates the entity with it's id anyway
|
||||
if (connection.driver.options.type === "spanner") return
|
||||
|
||||
const post = new Post()
|
||||
post.title = "about entity updation in query builder"
|
||||
|
||||
|
||||
@ -51,7 +51,11 @@ describe("query builder > insert", () => {
|
||||
.values({ name: "Muhammad Mirzoev" })
|
||||
.execute()
|
||||
|
||||
const users = await connection.getRepository(User).find()
|
||||
const users = await connection.getRepository(User).find({
|
||||
order: {
|
||||
id: "ASC",
|
||||
},
|
||||
})
|
||||
users.should.be.eql([
|
||||
{ id: 1, name: "Alex Messer" },
|
||||
{ id: 2, name: "Dima Zotov" },
|
||||
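Here and in the next hunk the assertions add an explicit `order` to `find()`: without an ORDER BY, rows may come back in any order, and on Spanner they often do not match insertion order. A minimal sketch of the deterministic form, assuming the `User` fixture from this test:

```typescript
// Pin the result order so the deep-equality assertion is stable across drivers.
const users = await connection.getRepository(User).find({
    order: { id: "ASC" },
})
```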
@ -81,7 +85,11 @@ describe("query builder > insert", () => {
|
||||
])
|
||||
.execute()
|
||||
|
||||
const users = await connection.getRepository(User).find()
|
||||
const users = await connection.getRepository(User).find({
|
||||
order: {
|
||||
id: "ASC",
|
||||
},
|
||||
})
|
||||
users.should.be.eql([
|
||||
{ id: 1, name: "Umed Khudoiberdiev" },
|
||||
{ id: 2, name: "Bakhrom Baubekov" },
|
||||
|
||||
@ -117,6 +117,7 @@ describe("query builder > joins", () => {
|
||||
.leftJoinAndSelect("post.categories", "categories")
|
||||
.leftJoinAndSelect("categories.images", "images")
|
||||
.where("post.id = :id", { id: 1 })
|
||||
.orderBy("post.id, categories.id")
|
||||
.getOne()
|
||||
|
||||
expect(loadedPost!.tag).to.not.be.undefined
|
||||
|
||||
@ -35,7 +35,8 @@ describe("query builder > locking", () => {
|
||||
connections.map(async (connection) => {
|
||||
if (
|
||||
DriverUtils.isSQLiteFamily(connection.driver) ||
|
||||
connection.driver.options.type === "sap"
|
||||
connection.driver.options.type === "sap" ||
|
||||
connection.driver.options.type === "spanner"
|
||||
)
|
||||
return
|
||||
|
||||
@ -56,7 +57,8 @@ describe("query builder > locking", () => {
|
||||
connections.map(async (connection) => {
|
||||
if (
|
||||
DriverUtils.isSQLiteFamily(connection.driver) ||
|
||||
connection.driver.options.type === "sap"
|
||||
connection.driver.options.type === "sap" ||
|
||||
connection.driver.options.type === "spanner"
|
||||
)
|
||||
return
|
||||
|
||||
@ -87,7 +89,8 @@ describe("query builder > locking", () => {
|
||||
connections.map(async (connection) => {
|
||||
if (
|
||||
DriverUtils.isSQLiteFamily(connection.driver) ||
|
||||
connection.driver.options.type === "sap"
|
||||
connection.driver.options.type === "sap" ||
|
||||
connection.driver.options.type === "spanner"
|
||||
)
|
||||
return
|
||||
|
||||
@ -371,7 +374,8 @@ describe("query builder > locking", () => {
|
||||
if (
|
||||
DriverUtils.isSQLiteFamily(connection.driver) ||
|
||||
connection.driver.options.type === "cockroachdb" ||
|
||||
connection.driver.options.type === "sap"
|
||||
connection.driver.options.type === "sap" ||
|
||||
connection.driver.options.type === "spanner"
|
||||
)
|
||||
return
|
||||
|
||||
@ -414,7 +418,8 @@ describe("query builder > locking", () => {
|
||||
connections.map(async (connection) => {
|
||||
if (
|
||||
DriverUtils.isSQLiteFamily(connection.driver) ||
|
||||
connection.driver.options.type === "sap"
|
||||
connection.driver.options.type === "sap" ||
|
||||
connection.driver.options.type === "spanner"
|
||||
)
|
||||
return
|
||||
|
||||
@ -433,7 +438,8 @@ describe("query builder > locking", () => {
|
||||
connections.map(async (connection) => {
|
||||
if (
|
||||
DriverUtils.isSQLiteFamily(connection.driver) ||
|
||||
connection.driver.options.type === "sap"
|
||||
connection.driver.options.type === "sap" ||
|
||||
connection.driver.options.type === "spanner"
|
||||
)
|
||||
return
|
||||
|
||||
@ -789,7 +795,8 @@ describe("query builder > locking", () => {
|
||||
connections.map(async (connection) => {
|
||||
if (
|
||||
DriverUtils.isSQLiteFamily(connection.driver) ||
|
||||
connection.driver.options.type === "sap"
|
||||
connection.driver.options.type === "sap" ||
|
||||
connection.driver.options.type === "spanner"
|
||||
)
|
||||
return connection.manager.transaction((entityManager) => {
|
||||
return Promise.all([
|
||||
|
||||
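All of the locking hunks extend the same guard: like the SQLite family and SAP HANA in these tests, the Spanner driver does not emit pessimistic lock clauses, so the tests return early for it. A minimal sketch of the kind of query being skipped, assuming a `PostWithVersion` entity like the one used elsewhere in this suite:

```typescript
// Pessimistic locks must run inside a transaction; drivers without
// SELECT ... FOR UPDATE support (filtered out by the guard above) never reach this.
await connection.manager.transaction((entityManager) =>
    entityManager
        .createQueryBuilder(PostWithVersion, "post")
        .setLock("pessimistic_write")
        .where("post.id = :id", { id: 1 })
        .getOne(),
)
```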
Some files were not shown because too many files have changed in this diff.