Compare commits

...

14 Commits

Author SHA1 Message Date
Naor Peled
555ffa15be
Merge branch 'master' into feature/returning-selection 2025-11-29 02:07:32 +02:00
Mohammed Gomaa
d0b54544e9
fix: typesense doc sync (#11807)
Co-authored-by: Giorgio Boa <35845425+gioboa@users.noreply.github.com>
2025-11-28 15:03:18 +01:00
Lucian Mocanu
cfb3d6c015
feat(mysql): add support for vector columns on MariaDB and MySQL (#11670) 2025-11-27 15:28:49 +01:00
Piotr Kuczynski
dd55218648
fix(cli): init command reading package.json from two folders up (#11789) 2025-11-25 14:13:25 +01:00
Henry Chan
cb1284c8c0
feat: init version in postgres driver only if not set (#11373) 2025-11-24 13:10:30 +01:00
Naor Peled
9383799b3d
chore: add Qodo config (#11791) 2025-11-24 08:18:00 +02:00
Oleg "OSA413" Sokolov
ea0f155532
ci(oracle): add extra sleep after container starts (#11795) 2025-11-24 10:54:49 +05:00
Daniel Harvey
ade198c77c
feat: export QueryPartialEntity and QueryDeepPartialEntity types (#11748)
Co-authored-by: Giorgio Boa <35845425+gioboa@users.noreply.github.com>
Co-authored-by: Piotr Kuczynski <piotr.kuczynski@gmail.com>
Co-authored-by: Oleg "OSA413" Sokolov <OSA413@users.noreply.github.com>
2025-11-23 18:18:51 +05:00
Copilot
5fa8a0bf6c
chore: add GitHub Copilot instructions (#11781)
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: Naor Peled <me@naor.dev>
2025-11-21 23:03:26 +02:00
Piotr Kuczynski
6eda13884e
docs: fix build status badge url (#11790) 2025-11-21 20:27:19 +01:00
Mike Guida
6ed24f8235
ci: run tests on commits to master and next (#11783)
Co-authored-by: Oleg "OSA413" Sokolov <OSA413@users.noreply.github.com>
2025-11-21 01:11:49 +05:00
dependabot[bot]
cad0921827
build(deps): bump js-yaml in /docs (#11779)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-20 19:52:17 +01:00
Pablo Thiele
dc74f5374e
fix(deps): upgrade glob to fix CVE-2025-64756 (#11784)
Co-authored-by: Oleg "OSA413" Sokolov <OSA413@users.noreply.github.com>
2025-11-20 23:20:05 +05:00
Piotr Kuczynski
bec548a7d4
ci: migrate from nyc to c8 (#11759)
Co-authored-by: Oleg "OSA413" Sokolov <OSA413@users.noreply.github.com>
2025-11-20 22:15:53 +05:00
30 changed files with 805 additions and 1685 deletions


@ -1,8 +1,9 @@
{
"all": true,
"cache": false,
"exclude": ["**/*.d.ts"],
"exclude": ["node_modules", "**/*.d.ts"],
"exclude-after-remap": true,
"extension": [".ts"],
"include": ["build/compiled/src/**", "src/**"],
"reporter": "lcov"
"reporter": ["lcov"]
}

283
.github/copilot-instructions.md vendored Normal file

@ -0,0 +1,283 @@
# GitHub Copilot Instructions for TypeORM
This document provides guidance for GitHub Copilot when working with the TypeORM codebase.
## Project Overview
TypeORM is a TypeScript-based Object-Relational Mapping (ORM) library that supports multiple databases including MySQL/MariaDB, PostgreSQL, MS SQL Server, Oracle, SAP HANA, SQLite, MongoDB, and Google Spanner. It implements both Active Record and Data Mapper patterns and runs on Node.js, Browser, React Native, and Electron platforms.
## Architecture & Structure
### Core Components
- **`src/data-source/`** - DataSource (formerly Connection) management
- **`src/entity-manager/`** - Entity management and operations
- **`src/repository/`** - Repository pattern implementation
- **`src/query-builder/`** - SQL query building
- **`src/decorator/`** - TypeScript decorators for entities, columns, relations
- **`src/driver/`** - Database-specific drivers
- **`src/metadata/`** - Entity metadata management
- **`src/schema-builder/`** - Schema creation and migration
- **`src/migration/`** - Database migration system
- **`src/subscriber/`** - Event subscriber system
- **`src/persistence/`** - Entity persistence logic
### Design Patterns
- **Active Record Pattern**: Entities have methods to save, remove, and query themselves
- **Data Mapper Pattern**: Repositories handle entity persistence separately from business logic
- **Decorator Pattern**: Extensive use of TypeScript decorators for metadata definition
- **Builder Pattern**: QueryBuilder for constructing complex queries
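The two persistence patterns can be contrasted in a minimal, TypeORM-free sketch — the in-memory `Map` stands in for a real driver, and all names here are illustrative rather than TypeORM APIs:

```typescript
// In-memory "database" shared by both sketches below
const db = new Map<number, { id: number; name: string }>()

// Active Record: the entity persists and queries itself
class ActiveUser {
    constructor(
        public id: number,
        public name: string,
    ) {}

    save(): void {
        db.set(this.id, { id: this.id, name: this.name })
    }

    static findById(id: number): ActiveUser | undefined {
        const row = db.get(id)
        return row && new ActiveUser(row.id, row.name)
    }
}

// Data Mapper: a plain entity plus a repository that owns persistence
class User {
    constructor(
        public id: number,
        public name: string,
    ) {}
}

class UserRepository {
    save(user: User): void {
        db.set(user.id, { id: user.id, name: user.name })
    }

    findById(id: number): User | undefined {
        const row = db.get(id)
        return row && new User(row.id, row.name)
    }
}
```

In TypeORM itself, `BaseEntity` subclasses give the Active Record style, while `Repository`/`EntityManager` give the Data Mapper style.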
## Coding Standards
### TypeScript Configuration
- Target: ES2021+ with CommonJS modules
- Decorators: `experimentalDecorators` and `emitDecoratorMetadata` enabled
### Code Style
- **Formatting**: Use Prettier with these settings:
- No semicolons (`"semi": false`)
- Arrow function parentheses always (`"arrowParens": "always"`)
- Trailing commas everywhere (`"trailingComma": "all"`)
- **Linting**: ESLint with TypeScript support
- Use `@typescript-eslint` rules
- Warnings allowed for some `@typescript-eslint/no-*` rules
- Unused variables starting with `_` are ignored
- **Naming Conventions**:
- Classes: PascalCase (e.g., `DataSource`, `EntityManager`)
- Interfaces: PascalCase (e.g., `ColumnOptions`, `RelationOptions`)
- Variables/functions: camelCase
- Constants: UPPER_SNAKE_CASE for true constants
- Private members: Use standard camelCase (no underscore prefix)
### TypeScript Patterns
- Use explicit types for public APIs
- Prefer interfaces over type aliases for object shapes
- Use generics for reusable components
- Avoid `any` where possible; use `unknown` or proper types
- Use optional chaining (`?.`) and nullish coalescing (`??`) operators
- Leverage TypeScript utility types (`Partial<T>`, `Required<T>`, `Pick<T>`, etc.)
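Several of these patterns can be seen together in a small self-contained snippet (the `ColumnOptions` shape here is a simplified stand-in, not TypeORM's real interface):

```typescript
// Simplified stand-in for an options interface
interface ColumnOptions {
    name: string
    nullable?: boolean
    default?: string
}

// Utility types derive related shapes instead of redefining them
type PartialColumnOptions = Partial<ColumnOptions>

function resolveDefault(options: PartialColumnOptions | undefined): string {
    // Optional chaining tolerates a missing object; nullish coalescing
    // falls back only on null/undefined, so an empty string is preserved
    return options?.default ?? "NULL"
}
```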
## Testing
### Test Structure
Tests are organized in `test/` directory:
- **`test/functional/`** - Feature and integration tests organized by functionality (preferred)
- **`test/github-issues/`** - Tests for specific GitHub issues
- **`test/unit/`** - Unit tests for individual components
- **`test/utils/`** - Test utilities and helpers
**Note**: Prefer writing functional tests over per-issue tests.
### Test Writing Guidelines
1. **Use the standard test template**:
```typescript
import "reflect-metadata"
import { createTestingConnections, closeTestingConnections, reloadTestingDatabases } from "../../utils/test-utils"
import { DataSource } from "../../../src/data-source/DataSource"
import { expect } from "chai"
describe("description of functionality", () => {
let dataSources: DataSource[]
before(async () => dataSources = await createTestingConnections({
entities: [__dirname + "/entity/*{.js,.ts}"],
schemaCreate: true,
dropSchema: true,
}))
beforeEach(() => reloadTestingDatabases(dataSources))
after(() => closeTestingConnections(dataSources))
it("should do something specific", () => Promise.all(dataSources.map(async dataSource => {
// Test implementation
})))
})
```
2. **Test Configuration**:
- Tests run against multiple databases (as configured in `ormconfig.json`)
- Each test should work across all supported databases unless database-specific
- Place entity files in `./entity/` relative to test file for automatic loading
- Use `Promise.all(dataSources.map(...))` pattern to test against all databases
3. **Test Naming**:
- Use descriptive `describe()` blocks for features
- Use "should..." format for `it()` descriptions
- Reference GitHub issue numbers when fixing specific issues
4. **Running Tests**:
- Full test suite: `npm test` (compiles then runs tests)
- Fast iteration: `npm run test:fast` (runs without recompiling)
- Specific tests: `npm run test:fast -- --grep "pattern"`
- Watch mode: `npm run compile -- --watch` + `npm run test:fast`
## Database-Specific Considerations
### Multi-Database Support
When writing code or tests:
- Ensure compatibility across all supported databases
- Use driver-specific code only in `src/driver/` directory
- Test database-agnostic code against multiple databases
- Use `DataSource.options.type` to check database type when needed
- Be aware of SQL dialect differences (LIMIT vs TOP, etc.)
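The dialect-difference point can be illustrated with a hypothetical helper — this is not a TypeORM API, and real branching of this kind belongs in `src/driver/`:

```typescript
// Hypothetical sketch of per-dialect row limiting
type DatabaseType = "mysql" | "postgres" | "mssql" | "oracle"

function limitClause(
    type: DatabaseType,
    n: number,
): { prefix: string; suffix: string } {
    switch (type) {
        case "mssql":
            // SQL Server uses SELECT TOP (n) instead of LIMIT
            return { prefix: `TOP (${n}) `, suffix: "" }
        case "oracle":
            // Oracle uses the standard FETCH FIRST clause
            return { prefix: "", suffix: ` FETCH FIRST ${n} ROWS ONLY` }
        default:
            // MySQL/MariaDB and PostgreSQL use LIMIT
            return { prefix: "", suffix: ` LIMIT ${n}` }
    }
}

function selectAll(type: DatabaseType, table: string, n: number): string {
    const { prefix, suffix } = limitClause(type, n)
    return `SELECT ${prefix}* FROM ${table}${suffix}`
}
```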
### Driver Implementation
Each driver in `src/driver/` implements common interfaces:
- Connection management
- Query execution
- Schema synchronization
- Type mapping
- Transaction handling
## Common Development Tasks
### Adding a New Feature
1. Create entities in appropriate test directory
2. Write tests first (TDD approach encouraged)
3. Implement feature in `src/`
4. Ensure tests pass across all databases
5. Update documentation if public API changes
6. Follow commit message conventions
### Adding a New Decorator
1. Create decorator file in `src/decorator/`
2. Create metadata args in `src/metadata-args/`
3. Update metadata builder in `src/metadata-builder/`
4. Export from `src/index.ts`
5. Add comprehensive tests
6. Update TypeScript type definitions if needed
### Working with Migrations
- Migrations are in `src/migration/`
- Migration files should be timestamped
- Support both up and down migrations
- Test migrations against all supported databases
- Ensure schema changes are reversible
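A reversible migration can be sketched with a minimal stand-in for TypeORM's query runner — the `Runner` interface and class name here are illustrative; real migrations implement `MigrationInterface`:

```typescript
// Minimal stand-in for TypeORM's QueryRunner (illustrative)
interface Runner {
    query(sql: string): Promise<void>
}

// Timestamp in the class name mirrors the timestamped file convention
export class AddAgeColumn1732500000000 {
    async up(runner: Runner): Promise<void> {
        await runner.query(`ALTER TABLE "user" ADD "age" integer`)
    }

    // down() exactly reverses up(), keeping the schema change reversible
    async down(runner: Runner): Promise<void> {
        await runner.query(`ALTER TABLE "user" DROP COLUMN "age"`)
    }
}
```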
## Build & Development Workflow
### Commands
- **Build**: `npm run compile` - Compiles TypeScript to `build/compiled/`
- **Package**: `npm run package` - Creates distribution in `build/package/`
- **Pack**: `npm run pack` - Creates `.tgz` file in `build/`
- **Test**: `npm test` - Compile and run all tests
- **Lint**: `npm run lint` - Run ESLint
- **Format**: `npm run format` - Run Prettier
- **Watch**: `npm run watch` - Watch mode for TypeScript compilation
### Development Setup
1. Install dependencies: `npm install`
2. Copy config: `cp ormconfig.sample.json ormconfig.json`
3. Configure database connections in `ormconfig.json`
4. Optionally use Docker: `docker-compose up` for database services
### Pre-commit Hooks
- Husky runs pre-commit hooks
- Lint-staged runs on staged files
- Format and lint checks must pass
## Contribution Guidelines
### Commit Message Format
Follow conventional commits:
```
<type>: <subject>
<body>
<footer>
```
**Types**: `feat`, `fix`, `docs`, `style`, `refactor`, `perf`, `test`, `build`, `chore`, `revert`
**Subject**:
- Use imperative, present tense
- Don't capitalize first letter
- No period at the end
- Max 100 characters per line
### Pull Request Requirements
- All tests must pass
- Include appropriate tests for changes
- Follow existing code style
- Update documentation for API changes
- Reference related GitHub issues
- Get approval before merging
## Common Patterns & Idioms
### Entity Definition
```typescript
@Entity()
export class User {
@PrimaryGeneratedColumn()
id: number
@Column()
name: string
@OneToMany(() => Photo, photo => photo.user)
photos: Photo[]
}
```
### Repository Usage
```typescript
const userRepository = dataSource.getRepository(User)
const user = await userRepository.findOne({ where: { id: 1 } })
```
### QueryBuilder
```typescript
const users = await dataSource
.getRepository(User)
.createQueryBuilder("user")
.leftJoinAndSelect("user.photos", "photo")
.where("user.name = :name", { name: "John" })
.getMany()
```
### Transactions
```typescript
await dataSource.transaction(async (manager) => {
await manager.save(user)
await manager.save(photo)
})
```
## Important Notes
- Always import `reflect-metadata` before TypeORM
- Be careful with circular dependencies between entities
- Use lazy relations or forward references for circular entity references
- Connection pooling is handled automatically by drivers
- Be mindful of N+1 query problems; use joins or eager loading
- Repository methods are async; always use `await`
- Entity instances should be plain objects, not class instances with methods (Data Mapper pattern)
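The forward-reference technique can be sketched without TypeORM — the thunk mirrors the `() => Photo` style used in relation decorators, deferring the class reference until it is actually needed:

```typescript
// A thunk defers evaluation, so User can reference Photo
// before Photo's class definition has executed
class User {
    photos: Photo[] = []
    // Deferred reference, analogous to @OneToMany(() => Photo, ...)
    static photoType = () => Photo
}

class Photo {
    constructor(public user: User) {}
    static userType = () => User
}
```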
## Resources
- [Main Documentation](https://typeorm.io)
- [Contributing Guide](../CONTRIBUTING.md)
- [Developer Guide](../DEVELOPER.md)
- [GitHub Repository](https://github.com/typeorm/typeorm)
- [Issue Tracker](https://github.com/typeorm/typeorm/issues)


@ -12,7 +12,35 @@ jobs:
steps:
- uses: actions/checkout@v5
- name: Delete unaliased collections
env:
TYPESENSE_API_KEY: ${{ secrets.TYPESENSE_API_KEY }}
TYPESENSE_HOST: ${{ secrets.TYPESENSE_HOST }}
TYPESENSE_PROTOCOL: https
TYPESENSE_PORT: 443
run: |
ALIAS_COLLECTION=$(curl -s -H "X-TYPESENSE-API-KEY: $TYPESENSE_API_KEY" \
"$TYPESENSE_PROTOCOL://$TYPESENSE_HOST:$TYPESENSE_PORT/aliases/typeorm-docs" \
| jq -r '.collection_name')
if [ "$ALIAS_COLLECTION" = "null" ] || [ -z "$ALIAS_COLLECTION" ]; then
echo "Alias does not exist; skipping collection cleanup."
exit 0
fi
echo "Alias currently points to: $ALIAS_COLLECTION"
COLLECTIONS=$(curl -s -H "X-TYPESENSE-API-KEY: $TYPESENSE_API_KEY" \
"$TYPESENSE_PROTOCOL://$TYPESENSE_HOST:$TYPESENSE_PORT/collections" \
| jq -r '.[].name')
for col in $COLLECTIONS; do
if [ "$col" != "$ALIAS_COLLECTION" ]; then
echo "Deleting unaliased collection: $col"
curl -s -X DELETE -H "X-TYPESENSE-API-KEY: $TYPESENSE_API_KEY" \
"$TYPESENSE_PROTOCOL://$TYPESENSE_HOST:$TYPESENSE_PORT/collections/$col"
fi
done
- run: |
docker run \
-e TYPESENSE_API_KEY=${{ secrets.TYPESENSE_API_KEY }} \


@ -31,7 +31,7 @@ jobs:
path: build/
- run: npm ci
- run: cp .github/workflows/test/cockroachdb.ormconfig.json ormconfig.json
- run: npx nyc npm run test:ci
- run: npx c8 npm run test:ci
- name: Coveralls Parallel
uses: coverallsapp/github-action@v2
with:
@ -58,7 +58,7 @@ jobs:
path: build/
- run: npm ci
- run: cp .github/workflows/test/mongodb.ormconfig.json ormconfig.json
- run: npx nyc npm run test:ci
- run: npx c8 npm run test:ci
- name: Coveralls Parallel
uses: coverallsapp/github-action@v2
with:
@ -89,7 +89,7 @@ jobs:
path: build/
- run: npm ci
- run: cp .github/workflows/test/mssql.ormconfig.json ormconfig.json
- run: npx nyc npm run test:ci
- run: npx c8 npm run test:ci
- name: Coveralls Parallel
uses: coverallsapp/github-action@v2
with:
@ -130,7 +130,7 @@ jobs:
path: build/
- run: npm ci
- run: cp .github/workflows/test/mysql-mariadb.ormconfig.json ormconfig.json
- run: npx nyc npm run test:ci
- run: npx c8 npm run test:ci
- name: Coveralls Parallel
uses: coverallsapp/github-action@v2
with:
@ -171,7 +171,7 @@ jobs:
path: build/
- run: npm ci
- run: cp .github/workflows/test/mysql-mariadb-latest.ormconfig.json ormconfig.json
- run: npx nyc npm run test:ci
- run: npx c8 npm run test:ci
- name: Coveralls Parallel
uses: coverallsapp/github-action@v2
with:
@ -192,7 +192,7 @@ jobs:
path: build/
- run: npm ci
- run: cp .github/workflows/test/better-sqlite3.ormconfig.json ormconfig.json
- run: npx nyc npm run test:ci
- run: npx c8 npm run test:ci
- name: Coveralls Parallel
uses: coverallsapp/github-action@v2
with:
@ -213,7 +213,7 @@ jobs:
path: build/
- run: npm ci
- run: cp .github/workflows/test/sqlite.ormconfig.json ormconfig.json
- run: npx nyc npm run test:ci
- run: npx c8 npm run test:ci
- name: Coveralls Parallel
uses: coverallsapp/github-action@v2
with:
@ -234,7 +234,7 @@ jobs:
path: build/
- run: npm ci
- run: cp .github/workflows/test/sqljs.ormconfig.json ormconfig.json
- run: npx nyc npm run test:ci
- run: npx c8 npm run test:ci
- name: Coveralls Parallel
uses: coverallsapp/github-action@v2
with:
@ -274,7 +274,7 @@ jobs:
path: build/
- run: npm ci
- run: cp .github/workflows/test/postgres.ormconfig.json ormconfig.json
- run: npx nyc npm run test:ci
- run: npx c8 npm run test:ci
- name: Coveralls Parallel
uses: coverallsapp/github-action@v2
with:
@ -300,7 +300,8 @@ jobs:
- run: npm ci
- run: cat ormconfig.sample.json | jq 'map(select(.name == "oracle"))' > ormconfig.json
- run: docker compose up oracle --no-recreate --wait
- run: npx nyc npm run test:ci
- run: sleep 10
- run: npx c8 npm run test:ci
- name: Coveralls Parallel
uses: coverallsapp/github-action@v2
@ -327,7 +328,7 @@ jobs:
- run: npm ci
- run: cat ormconfig.sample.json | jq 'map(select(.name == "hanaexpress"))' > ormconfig.json
- run: docker compose up hanaexpress --no-recreate --wait
- run: npx nyc npm run test:ci
- run: npx c8 npm run test:ci
- name: Coveralls Parallel
uses: coverallsapp/github-action@v2


@ -22,7 +22,7 @@ jobs:
- run: npm ci
- run: cp .github/workflows/test/better-sqlite3.ormconfig.json ormconfig.json
- run: npx nyc npm run test:ci
- run: npx c8 npm run test:ci
- name: Coveralls Parallel
uses: coverallsapp/github-action@v2
@ -45,7 +45,7 @@ jobs:
- run: npm ci
- run: cp .github/workflows/test/sqlite.ormconfig.json ormconfig.json
- run: npx nyc npm run test:ci
- run: npx c8 npm run test:ci
- name: Coveralls Parallel
uses: coverallsapp/github-action@v2
@ -68,7 +68,7 @@ jobs:
- run: npm ci
- run: cp .github/workflows/test/sqljs.ormconfig.json ormconfig.json
- run: npx nyc npm run test:ci
- run: npx c8 npm run test:ci
- name: Coveralls Parallel
uses: coverallsapp/github-action@v2


@ -1,5 +1,9 @@
name: Tests
on:
push:
branches:
- master
- next
pull_request:
jobs:

5
.gitignore vendored

@ -3,12 +3,11 @@
._*
### Node ###
npm-debug.log*
build/
coverage/
*.lcov
.nyc_output/
node_modules/
npm-debug.log*
*.lcov
### VisualStudioCode ###
.vscode/*

14
.pr_agent.toml Normal file

@ -0,0 +1,14 @@
[github_app]
pr_commands = [
"/review",
"/improve",
]
handle_push_trigger = true
push_commands = [
"/improve",
]
[auto_best_practices]
enable_auto_best_practices = true
utilize_auto_best_practices = true


@ -8,11 +8,11 @@
</a>
<br>
<br>
<a href="https://www.npmjs.com/package/typeorm"><img src="https://img.shields.io/npm/v/typeorm" alt="NPM Version" /></a>
<a href="https://www.npmjs.com/package/typeorm"><img src="https://img.shields.io/npm/dm/typeorm" alt="NPM Downloads" /></a>
<a href="https://github.com/typeorm/typeorm/actions/workflows/commit-validation.yml?query=branch%3Amaster"><img src="https://github.com/typeorm/typeorm/actions/workflows/commit-validation.yml/badge.svg?branch=master" alt="Commit Validation"/></a>
<a href="https://coveralls.io/github/typeorm/typeorm?branch=master"><img src="https://coveralls.io/repos/github/typeorm/typeorm/badge.svg?branch=master" alt="Coverage Status" /></a>
<a href=""><img src="https://img.shields.io/badge/License-MIT-teal.svg" alt="MIT License" /></a>
<a href="https://www.npmjs.com/package/typeorm"><img src="https://img.shields.io/npm/v/typeorm" alt="NPM Version"/></a>
<a href="https://www.npmjs.com/package/typeorm"><img src="https://img.shields.io/npm/dm/typeorm" alt="NPM Downloads"/></a>
<a href="https://github.com/typeorm/typeorm/actions/workflows/tests.yml?query=branch%3Amaster"><img src="https://github.com/typeorm/typeorm/actions/workflows/tests.yml/badge.svg?branch=master" alt="Commit Validation"/></a>
<a href="https://coveralls.io/github/typeorm/typeorm?branch=master"><img src="https://coveralls.io/repos/github/typeorm/typeorm/badge.svg?branch=master" alt="Coverage Status"/></a>
<a href=""><img src="https://img.shields.io/badge/License-MIT-teal.svg" alt="MIT License"/></a>
<br>
<br>
</div>


@ -12,7 +12,7 @@ services:
MYSQL_DATABASE: "test"
mysql-9:
image: "mysql:9.4.0"
image: "mysql:9.5.0"
container_name: "typeorm-mysql-9"
ports:
- "3306:3306"
@ -24,7 +24,7 @@ services:
# mariadb
mariadb-10:
image: "mariadb:10.6.22-jammy"
image: "mariadb:10.6.24-jammy"
container_name: "typeorm-mariadb-10"
ports:
- "3307:3306"
@ -35,7 +35,7 @@ services:
MYSQL_DATABASE: "test"
mariadb-12:
image: "mariadb:12.0.1-rc"
image: "mariadb:12.1.2"
container_name: "typeorm-mariadb-12"
ports:
- "3307:3306"


@ -207,7 +207,7 @@ const queryEmbedding = [
const results = await dataSource.query(
`
DECLARE @question AS VECTOR (1998) = @0;
SELECT TOP (10) dc.*,
SELECT TOP (10) dc.*,
VECTOR_DISTANCE('cosine', @question, embedding) AS distance
FROM document_chunk dc
ORDER BY VECTOR_DISTANCE('cosine', @question, embedding)


@ -139,3 +139,7 @@ export class User {
roles: UserRoleType[]
}
```
### Vector Types
MySQL supports the [VECTOR type](https://dev.mysql.com/doc/refman/en/vector.html) since version 9.0, and MariaDB supports [vectors](https://mariadb.com/docs/server/reference/sql-structure/vectors/vector-overview) since 11.7.
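What a vector similarity search computes can be illustrated in plain TypeScript — this cosine distance is the kind of measure the database evaluates server-side over `VECTOR` columns; the function itself is illustrative, not a TypeORM API:

```typescript
// Cosine distance = 1 - cosine similarity; smaller means more similar
function cosineDistance(a: number[], b: number[]): number {
    if (a.length !== b.length) throw new Error("dimension mismatch")
    let dot = 0
    let normA = 0
    let normB = 0
    for (let i = 0; i < a.length; i++) {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return 1 - dot / (Math.sqrt(normA) * Math.sqrt(normB))
}
```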


@ -60,7 +60,7 @@ Additional options can be added to the `extra` object and will be passed directl
### Column types for `postgres`
`int`, `int2`, `int4`, `int8`, `smallint`, `integer`, `bigint`, `decimal`, `numeric`, `real`, `float`, `float4`, `float8`, `double precision`, `money`, `character varying`, `varchar`, `character`, `char`, `text`, `citext`, `hstore`, `bytea`, `bit`, `varbit`, `bit varying`, `timetz`, `timestamptz`, `timestamp`, `timestamp without time zone`, `timestamp with time zone`, `date`, `time`, `time without time zone`, `time with time zone`, `interval`, `bool`, `boolean`, `enum`, `point`, `line`, `lseg`, `box`, `path`, `polygon`, `circle`, `cidr`, `inet`, `macaddr`, `macaddr8`, `tsvector`, `tsquery`, `uuid`, `xml`, `json`, `jsonb`, `jsonpath`, `int4range`, `int8range`, `numrange`, `tsrange`, `tstzrange`, `daterange`, `int4multirange`, `int8multirange`, `nummultirange`, `tsmultirange`, `tstzmultirange`, `multidaterange`, `geometry`, `geography`, `cube`, `ltree`
`int`, `int2`, `int4`, `int8`, `smallint`, `integer`, `bigint`, `decimal`, `numeric`, `real`, `float`, `float4`, `float8`, `double precision`, `money`, `character varying`, `varchar`, `character`, `char`, `text`, `citext`, `hstore`, `bytea`, `bit`, `varbit`, `bit varying`, `timetz`, `timestamptz`, `timestamp`, `timestamp without time zone`, `timestamp with time zone`, `date`, `time`, `time without time zone`, `time with time zone`, `interval`, `bool`, `boolean`, `enum`, `point`, `line`, `lseg`, `box`, `path`, `polygon`, `circle`, `cidr`, `inet`, `macaddr`, `macaddr8`, `tsvector`, `tsquery`, `uuid`, `xml`, `json`, `jsonb`, `jsonpath`, `int4range`, `int8range`, `numrange`, `tsrange`, `tstzrange`, `daterange`, `int4multirange`, `int8multirange`, `nummultirange`, `tsmultirange`, `tstzmultirange`, `multidaterange`, `geometry`, `geography`, `cube`, `ltree`, `vector`, `halfvec`.
### Column types for `cockroachdb`
@ -68,6 +68,33 @@ Additional options can be added to the `extra` object and will be passed directl
Note: CockroachDB returns all numeric data types as `string`. However, if you omit the column type and define your property as `number`, the ORM will `parseInt` the string into a number.
### Vector columns
Vector columns can be used for similarity searches using PostgreSQL's vector operators:
```typescript
// L2 distance (Euclidean) - <->
const results = await dataSource.sql`
SELECT id, embedding
FROM post
ORDER BY embedding <-> ${"[1,2,3]"}
LIMIT 5`
// Cosine distance - <=>
const results = await dataSource.sql`
SELECT id, embedding
FROM post
ORDER BY embedding <=> ${"[1,2,3]"}
LIMIT 5`
// Inner product - <#>
const results = await dataSource.sql`
SELECT id, embedding
FROM post
ORDER BY embedding <#> ${"[1,2,3]"}
LIMIT 5`
```
### Spatial columns
TypeORM's PostgreSQL and CockroachDB support uses [GeoJSON](http://geojson.org/) as an interchange format, so geometry columns should be tagged either as `object` or `Geometry` (or subclasses, e.g. `Point`) after importing [`geojson` types](https://www.npmjs.com/package/@types/geojson) or using the TypeORM built-in GeoJSON types:


@ -37,15 +37,16 @@ SAP HANA 2.0 and SAP HANA Cloud support slightly different data types. Check the
- [SAP HANA 2.0 Data Types](https://help.sap.com/docs/SAP_HANA_PLATFORM/4fe29514fd584807ac9f2a04f6754767/20a1569875191014b507cf392724b7eb.html?locale=en-US)
- [SAP HANA Cloud Data Types](https://help.sap.com/docs/hana-cloud-database/sap-hana-cloud-sap-hana-database-sql-reference-guide/data-types)
TypeORM's `SapDriver` supports `tinyint`, `smallint`, `integer`, `bigint`, `smalldecimal`, `decimal`, `real`, `double`, `date`, `time`, `seconddate`, `timestamp`, `boolean`, `char`, `nchar`, `varchar`, `nvarchar`, `text`, `alphanum`, `shorttext`, `array`, `varbinary`, `blob`, `clob`, `nclob`, `st_geometry`, `st_point`, `real_vector`, `half_vector`, `vector`, and `halfvec`. Some of these data types have been deprecated or removed in SAP HANA Cloud, and will be converted to the closest available alternative when connected to a Cloud database.
TypeORM's `SapDriver` supports `tinyint`, `smallint`, `integer`, `bigint`, `smalldecimal`, `decimal`, `real`, `double`, `date`, `time`, `seconddate`, `timestamp`, `boolean`, `char`, `nchar`, `varchar`, `nvarchar`, `text`, `alphanum`, `shorttext`, `array`, `varbinary`, `blob`, `clob`, `nclob`, `st_geometry`, `st_point`, `real_vector` and `half_vector`. Some of these data types have been deprecated or removed in SAP HANA Cloud, and will be converted to the closest available alternative when connected to a Cloud database.
### Vector Types
The `real_vector` and `half_vector` data types were introduced in SAP HANA Cloud (2024Q1 and 2025Q2 respectively), and require a supported version of `@sap/hana-client` as well.
The `real_vector` and `half_vector` data types were introduced in SAP HANA Cloud (2024Q1 and 2025Q2 respectively), and require a supported version of `@sap/hana-client` as well.
For consistency with PostgreSQL's vector support, TypeORM also provides aliases:
- `vector` (alias for `real_vector`) - stores vectors as 4-byte floats
- `halfvec` (alias for `half_vector`) - stores vectors as 2-byte floats for memory efficiency
- `vector` (alias for `real_vector`) - stores vectors as 4-byte floats
- `halfvec` (alias for `half_vector`) - stores vectors as 2-byte floats for memory efficiency
```typescript
@Entity()
@ -70,3 +71,5 @@ export class Document {
```
By default, the client will return a `Buffer` in the `fvecs`/`hvecs` format, which is more efficient. It is possible to let the driver convert the values to a `number[]` by adding `{ extra: { vectorOutputType: "Array" } }` to the connection options. Check the SAP HANA Client documentation for more information about [REAL_VECTOR](https://help.sap.com/docs/SAP_HANA_CLIENT/f1b440ded6144a54ada97ff95dac7adf/0d197e4389c64e6b9cf90f6f698f62fe.html) or [HALF_VECTOR](https://help.sap.com/docs/SAP_HANA_CLIENT/f1b440ded6144a54ada97ff95dac7adf/8bb854b4ce4a4299bed27c365b717e91.html).
Use the appropriate [vector functions](https://help.sap.com/docs/hana-cloud-database/sap-hana-cloud-sap-hana-database-sql-reference-guide/vector-functions) for similarity searches.
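The `Buffer` decoding can be sketched in plain TypeScript, assuming the conventional fvecs layout — a little-endian `uint32` dimension header followed by that many `float32` values; consult the SAP HANA Client documentation linked above for the authoritative format:

```typescript
// Decode one fvecs-encoded vector: [uint32 dim][float32 x dim], little-endian
// (assumed layout; verify against the SAP HANA Client docs)
function decodeFvec(bytes: ArrayBuffer): number[] {
    const view = new DataView(bytes)
    const dim = view.getUint32(0, true)
    const values: number[] = []
    for (let i = 0; i < dim; i++) {
        values.push(view.getFloat32(4 + 4 * i, true))
    }
    return values
}
```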


@ -180,88 +180,6 @@ There are several special column types with additional functionality available:
each time you call `save` of entity manager or repository, or during `upsert` operations when an update occurs.
You don't need to set this column - it will be automatically set.
### Vector columns
Vector columns are supported on PostgreSQL (via [`pgvector`](https://github.com/pgvector/pgvector) extension), Microsoft SQL Server, and SAP HANA Cloud, enabling storing and querying vector embeddings for similarity search and machine learning applications.
TypeORM supports both `vector` and `halfvec` column types across databases:
- `vector` - stores vectors as 4-byte floats (single precision)
- PostgreSQL: native `vector` type via pgvector extension
- SQL Server: native `vector` type
- SAP HANA: alias for `real_vector` type
- `halfvec` - stores vectors as 2-byte floats (half precision) for memory efficiency
- PostgreSQL: native `halfvec` type via pgvector extension
- SAP HANA: alias for `half_vector` type
You can specify the vector dimensions using the `length` option:
```typescript
@Entity()
export class Post {
@PrimaryGeneratedColumn()
id: number
// Vector without specified dimensions (works on PostgreSQL and SAP HANA; SQL Server requires explicit dimensions)
@Column("vector")
embedding: number[] | Buffer
// Vector with 3 dimensions: vector(3)
@Column("vector", { length: 3 })
embedding_3d: number[] | Buffer
// Half-precision vector with 4 dimensions: halfvec(4) (PostgreSQL and SAP HANA only)
@Column("halfvec", { length: 4 })
halfvec_embedding: number[] | Buffer
}
```
**PostgreSQL** - Vector columns can be used for similarity searches using vector operators:
```typescript
// L2 distance (Euclidean) - <->
const results = await dataSource.query(
`SELECT id, embedding FROM post ORDER BY embedding <-> $1 LIMIT 5`,
["[1,2,3]"],
)
// Cosine distance - <=>
const results = await dataSource.query(
`SELECT id, embedding FROM post ORDER BY embedding <=> $1 LIMIT 5`,
["[1,2,3]"],
)
// Inner product - <#>
const results = await dataSource.query(
`SELECT id, embedding FROM post ORDER BY embedding <#> $1 LIMIT 5`,
["[1,2,3]"],
)
```
**SQL Server** - Use the `VECTOR_DISTANCE` function for similarity searches:
```typescript
const queryEmbedding = [1, 2, 3]
// Cosine distance
const results = await dataSource.query(
`
DECLARE @question AS VECTOR(3) = @0;
SELECT TOP (5) id, embedding,
VECTOR_DISTANCE('cosine', @question, embedding) AS distance
FROM post
ORDER BY VECTOR_DISTANCE('cosine', @question, embedding)
`,
[JSON.stringify(queryEmbedding)],
)
```
> **Note**:
>
> - **PostgreSQL**: Vector columns require the `pgvector` extension to be installed. The extension provides the vector data types and similarity operators.
> - **SQL Server**: Vector type support requires a compatible SQL Server version with vector functionality enabled.
> - **SAP HANA**: Vector columns require SAP HANA Cloud (2024Q1+) and a supported version of `@sap/hana-client`. Use the appropriate [vector similarity functions](https://help.sap.com/docs/hana-cloud-database/sap-hana-cloud-sap-hana-database-sql-reference-guide/vector-functions) for similarity searches.
## Column types
TypeORM supports all of the most commonly used database-supported column types.
@ -414,6 +332,50 @@ Besides "uuid" there is also "increment", "identity" (Postgres 10+ only) and "ro
on some database platforms with this type of generation (for example some databases can only have one increment column,
or some of them require increment to be a primary key).
### Vector columns
Vector columns are supported on MariaDB/MySQL, Microsoft SQL Server, PostgreSQL (via [`pgvector`](https://github.com/pgvector/pgvector) extension) and SAP HANA Cloud, enabling storing and querying vector embeddings for similarity search and machine learning applications.
TypeORM supports both `vector` and `halfvec` column types across databases:
- `vector` - stores vectors as 4-byte floats (single precision)
- MariaDB/MySQL: native `vector` type
- Microsoft SQL Server: native `vector` type
- PostgreSQL: `vector` type, available via `pgvector` extension
- SAP HANA Cloud: alias for `real_vector` type
- `halfvec` - stores vectors as 2-byte floats (half precision) for memory efficiency
- PostgreSQL: `halfvec` type, available via `pgvector` extension
- SAP HANA Cloud: alias for `half_vector` type
You can specify the number of vector dimensions using the `length` option:
```typescript
@Entity()
export class Post {
@PrimaryGeneratedColumn()
id: number
// Vector without specified dimensions
@Column("vector")
embedding: number[] | Buffer
// Vector with 3 dimensions: vector(3)
@Column("vector", { length: 3 })
embedding_3d: number[] | Buffer
// Half-precision vector with 4 dimensions: halfvec(4) (works on PostgreSQL and SAP HANA only)
@Column("halfvec", { length: 4 })
halfvec_embedding: number[] | Buffer
}
```
> **Note**:
>
> - **MariaDB/MySQL**: Vectors are supported since MariaDB 11.7 and MySQL 9
> - **Microsoft SQL Server**: Vector type support requires SQL Server 2025 (17.x) or newer.
> - **PostgreSQL**: Vector columns require the `pgvector` extension to be installed. The extension provides the vector data types and similarity operators.
> - **SAP HANA**: Vector columns require SAP HANA Cloud (2024Q1+) and a supported version of `@sap/hana-client`.
### Spatial columns
Microsoft SQL Server, MySQL/MariaDB, PostgreSQL/CockroachDB and SAP HANA all support spatial columns. TypeORM's support for each varies slightly between databases, particularly as the column names vary between databases.


@ -27,12 +27,7 @@
"strip_chars": " .,;:#",
"custom_settings": {
"separatorsToIndex": "_",
"attributesForFaceting": [
"language",
"version",
"type",
"docusaurus_tag"
],
"attributesForFaceting": [],
"attributesToRetrieve": [
"hierarchy",
"content",
@ -46,4 +41,4 @@
"833762294"
],
"nb_hits": 0
}
}

12
docs/package-lock.json generated

```diff
@@ -8639,9 +8639,9 @@
       }
     },
     "node_modules/gray-matter/node_modules/js-yaml": {
-      "version": "3.14.1",
-      "resolved": "https://registry.npmjs.org/js-yaml/-/js-yaml-3.14.1.tgz",
-      "integrity": "sha512-okMH7OXXJ7YrN9Ok3/SXrnu4iX9yOk+25nqX4imS2npuvTYDmo/QEZoqwZkYaIDk3jVvBOTOIEgEhaLOynBS9g==",
+      "version": "3.14.2",
+      "resolved": "https://registry.npmjs.org/js-yaml/-/js-yaml-3.14.2.tgz",
+      "integrity": "sha512-PMSmkqxr106Xa156c2M265Z+FTrPl+oxd/rgOQy2tijQeK5TxQ43psO1ZCwhVOSdnn+RzkzlRz/eY4BgJBYVpg==",
       "license": "MIT",
       "dependencies": {
         "argparse": "^1.0.7",
@@ -9981,9 +9981,9 @@
       "license": "MIT"
     },
     "node_modules/js-yaml": {
-      "version": "4.1.0",
-      "resolved": "https://registry.npmjs.org/js-yaml/-/js-yaml-4.1.0.tgz",
-      "integrity": "sha512-wpxZs9NoxZaJESJGIZTyDEaYpl0FKSA+FB9aJiyemKhMwkxQg63h4T1KJgUGHpTqPDNRcmmYLugrRjJlBtWvRA==",
+      "version": "4.1.1",
+      "resolved": "https://registry.npmjs.org/js-yaml/-/js-yaml-4.1.1.tgz",
+      "integrity": "sha512-qQKT4zQxXl8lLwBtHMWwaTcGfFOZviOJet3Oy/xmGk2gZH677CJM9EvtfdSkgWcATZhj/55JZ0rmy3myCT5lsA==",
       "license": "MIT",
       "dependencies": {
         "argparse": "^2.0.1"
```

package-lock.json (generated):

File diff suppressed because it is too large.

package.json:

```diff
@@ -103,7 +103,7 @@
         "debug": "^4.4.3",
         "dedent": "^1.7.0",
         "dotenv": "^16.6.1",
-        "glob": "^10.4.5",
+        "glob": "^10.5.0",
         "reflect-metadata": "^0.2.2",
         "sha.js": "^2.4.12",
         "sql-highlight": "^6.1.0",
@@ -129,6 +129,7 @@
         "@types/source-map-support": "^0.5.10",
         "@types/yargs": "^17.0.33",
         "better-sqlite3": "^8.7.0",
+        "c8": "^10.1.3",
         "chai": "^4.5.0",
         "chai-as-promised": "^7.1.2",
         "class-transformer": "^0.5.1",
@@ -150,7 +151,6 @@
         "mssql": "^11.0.1",
         "mysql": "^2.18.1",
         "mysql2": "^3.15.0",
-        "nyc": "^17.1.0",
         "oracledb": "^6.9.0",
         "pg": "^8.16.3",
         "pg-query-stream": "^4.10.3",
```

```diff
@@ -6,8 +6,6 @@ import { TypeORMError } from "../error"
 import { PlatformTools } from "../platform/PlatformTools"
 import { CommandUtils } from "./CommandUtils"
-import ourPackageJson from "../../package.json"
-
 /**
  * Generates a new project with TypeORM.
  */
@@ -117,7 +115,7 @@
             )
             await CommandUtils.createFile(
                 basePath + "/package.json",
-                InitCommand.appendPackageJson(
+                await InitCommand.appendPackageJson(
                     packageJsonContents,
                     database,
                     isExpress,
@@ -673,13 +671,16 @@ Steps to run this project:
     /**
      * Appends to a given package.json template everything needed.
      */
-    protected static appendPackageJson(
+    protected static async appendPackageJson(
         packageJsonContents: string,
         database: string,
         express: boolean,
         projectIsEsm: boolean /*, docker: boolean*/,
-    ): string {
+    ): Promise<string> {
         const packageJson = JSON.parse(packageJsonContents)
+        const ourPackageJson = JSON.parse(
+            await CommandUtils.readFile(`${__dirname}/../package.json`),
+        )
 
         if (!packageJson.devDependencies) packageJson.devDependencies = {}
         packageJson.devDependencies = {
```

```diff
@@ -157,6 +157,8 @@ export class MysqlDriver implements Driver {
         "multilinestring",
         "multipolygon",
         "geometrycollection",
+        // vector data types
+        "vector",
         // additional data types for mariadb
         "uuid",
         "inet4",
@@ -191,6 +193,7 @@
         "nvarchar",
         "binary",
         "varbinary",
+        "vector",
     ]
 
     /**
@@ -280,6 +283,7 @@
         char: { length: 1 },
         binary: { length: 1 },
         varbinary: { length: 255 },
+        vector: { length: 2048 }, // default length MySQL uses if not provided a value
         decimal: { precision: 10, scale: 0 },
         dec: { precision: 10, scale: 0 },
         numeric: { precision: 10, scale: 0 },
```

```diff
@@ -2802,17 +2802,19 @@ export class MysqlQueryRunner extends BaseQueryRunner implements QueryRunner {
                         ) !== -1 &&
                         dbColumn["CHARACTER_MAXIMUM_LENGTH"]
                     ) {
-                        const length =
-                            dbColumn[
-                                "CHARACTER_MAXIMUM_LENGTH"
-                            ].toString()
+                        let length: number =
+                            dbColumn["CHARACTER_MAXIMUM_LENGTH"]
+                        if (tableColumn.type === "vector") {
+                            // MySQL and MariaDb store the vector length in bytes, not in number of dimensions.
+                            length = length / 4
+                        }
                         tableColumn.length =
                             !this.isDefaultColumnLength(
                                 table,
                                 tableColumn,
-                                length,
+                                length.toString(),
                             )
-                                ? length
+                                ? length.toString()
                                 : ""
                     }
```
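The bytes-to-dimensions conversion in the hunk above can be sketched in isolation. This is a hypothetical helper for illustration, not part of TypeORM's API; it assumes, as the diff's comment states, that MariaDB/MySQL report a vector column's `CHARACTER_MAXIMUM_LENGTH` in bytes and that each dimension is stored as a 4-byte float32:

```typescript
// Convert the byte length reported by INFORMATION_SCHEMA into the number of
// vector dimensions (hypothetical helper, mirroring the conversion above).
function vectorDimensions(characterMaximumLength: number): number {
    // each dimension is stored as a 4-byte float32
    return characterMaximumLength / 4
}

// A vector(16) column is reported as 64 bytes:
console.log(vectorDimensions(64)) // 16
```

This also explains the driver's `vector: { length: 2048 }` default: MySQL's default of 2048 refers to dimensions, while the schema tables report `2048 * 4` bytes.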

```diff
@@ -73,7 +73,8 @@ export class PostgresDriver implements Driver {
     options: PostgresConnectionOptions
 
     /**
-     * Version of Postgres. Requires a SQL query to the DB, so it is not always set
+     * Version of Postgres. Requires a SQL query to the DB, so it is set on the first
+     * connection attempt.
      */
     version?: string
@@ -362,20 +363,24 @@
             this.master = await this.createPool(this.options, this.options)
         }
 
-        const queryRunner = this.createQueryRunner("master")
-        this.version = await queryRunner.getVersion()
-
-        if (!this.database) {
-            this.database = await queryRunner.getCurrentDatabase()
-        }
-
-        if (!this.searchSchema) {
-            this.searchSchema = await queryRunner.getCurrentSchema()
-        }
-
-        await queryRunner.release()
+        if (!this.version || !this.database || !this.searchSchema) {
+            const queryRunner = this.createQueryRunner("master")
+
+            if (!this.version) {
+                this.version = await queryRunner.getVersion()
+            }
+
+            if (!this.database) {
+                this.database = await queryRunner.getCurrentDatabase()
+            }
+
+            if (!this.searchSchema) {
+                this.searchSchema = await queryRunner.getCurrentSchema()
+            }
+
+            await queryRunner.release()
+        }
 
         if (!this.schema) {
             this.schema = this.searchSchema
         }
```
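The change above makes the driver initialize `version`, `database`, and `searchSchema` only when they are not already set, so repeated connects (or externally supplied values) skip the extra SQL round trips. The guard pattern can be sketched in isolation; `FakeDriver` and its query counter are hypothetical, standing in for TypeORM's actual driver class:

```typescript
// Minimal sketch of the "initialize only if not set" guard from the diff.
class FakeDriver {
    version?: string
    queries = 0

    private async queryVersion(): Promise<string> {
        this.queries++ // stands in for a real SQL round trip
        return "17.0"
    }

    async connect(): Promise<void> {
        // only pay for the query when the value is missing
        if (!this.version) {
            this.version = await this.queryVersion()
        }
    }
}

async function demo() {
    const driver = new FakeDriver()
    await driver.connect()
    await driver.connect() // second connect reuses the cached version
    console.log(driver.version, driver.queries) // 17.0 1
}
demo()
```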

```diff
@@ -75,7 +75,7 @@ export type WithLengthColumnType =
     | "binary" // mssql
     | "varbinary" // mssql, sap
     | "string" // cockroachdb, spanner
-    | "vector" // postgres, mssql, sap
+    | "vector" // mariadb, mysql, mssql, postgres, sap
     | "halfvec" // postgres, sap
     | "half_vector" // sap
     | "real_vector" // sap
```

```diff
@@ -163,6 +163,10 @@ export { InsertResult } from "./query-builder/result/InsertResult"
 export { UpdateResult } from "./query-builder/result/UpdateResult"
 export { DeleteResult } from "./query-builder/result/DeleteResult"
 export { ReturningOption } from "./query-builder/ReturningOption"
+export {
+    QueryPartialEntity,
+    QueryDeepPartialEntity,
+} from "./query-builder/QueryPartialEntity"
 export { QueryResult } from "./query-runner/QueryResult"
 export { QueryRunner } from "./query-runner/QueryRunner"
 export { MongoEntityManager } from "./entity-manager/MongoEntityManager"
```

New file:

```typescript
import {
    Column,
    Entity,
    PrimaryColumn,
    ValueTransformer,
} from "../../../../../../src"

/*
 * The mysql2 client partially supports the vector type. Newer versions support
 * only deserializing from binary format. Currently mysql2 only accepts binary
 * parameters for vector values, and not numeric arrays.
 */
const vectorTransformer: ValueTransformer = {
    to: (value: number[]) => {
        const length = value.length
        const arrayBuffer = new ArrayBuffer(length * 4)
        const dataView = new DataView(arrayBuffer)
        for (let index = 0; index < length; index++) {
            dataView.setFloat32(index * 4, value[index], true)
        }
        return Buffer.from(arrayBuffer)
    },
    from: (value: Buffer | number[]) => {
        if (Array.isArray(value)) {
            // newer versions of mysql2 already deserialize vector as number[]
            return value
        }
        const dataView = new DataView(
            value.buffer,
            value.byteOffset,
            value.byteLength,
        )
        const length = value.byteLength / 4
        const array = new Array<number>(length)
        for (let index = 0; index < length; index++) {
            array[index] = dataView.getFloat32(index * 4, true)
        }
        return array
    },
}

@Entity()
export class Embedding {
    @PrimaryColumn()
    id: number

    @Column()
    content: string

    @Column()
    metadata: string

    @Column("vector", {
        length: 16,
        transformer: vectorTransformer,
    })
    vector: number[]
}
```
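The transformer's binary round trip can be checked outside any database. This standalone sketch re-implements the same little-endian float32 packing with plain Node.js `Buffer` methods (the helper names are illustrative, not TypeORM API):

```typescript
// Pack a number[] as consecutive little-endian float32 values (same layout
// as the `to` side of the transformer above).
function packVector(values: number[]): Buffer {
    const buffer = Buffer.alloc(values.length * 4)
    values.forEach((value, index) => buffer.writeFloatLE(value, index * 4))
    return buffer
}

// Unpack back into a number[] (same layout as the `from` side).
function unpackVector(buffer: Buffer): number[] {
    const values: number[] = []
    for (let offset = 0; offset < buffer.byteLength; offset += 4) {
        values.push(buffer.readFloatLE(offset))
    }
    return values
}

// Values chosen to be exactly representable as float32, so the round trip
// is lossless:
const vector = [1, -0.5, 0.25, 2048]
console.log(unpackVector(packVector(vector))) // [ 1, -0.5, 0.25, 2048 ]
```

Note that arbitrary doubles lose precision when narrowed to float32, which is why the test fixture below uses values that survive the float32 round trip.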

New file:

```typescript
import { expect } from "chai"
import { DataSource, DeepPartial } from "../../../../../src"
import { DriverUtils } from "../../../../../src/driver/DriverUtils"
import {
    closeTestingConnections,
    createTestingConnections,
} from "../../../../utils/test-utils"
import { Embedding } from "./entity/Embedding"

describe("database-schema > vectors > mysql", () => {
    describe("with vector output type Array", () => {
        let dataSources: DataSource[]

        before(async () => {
            dataSources = await createTestingConnections({
                entities: [Embedding],
                enabledDrivers: ["mariadb", "mysql"],
                driverSpecific: {
                    synchronize: false,
                },
            })
        })

        after(() => closeTestingConnections(dataSources))

        it("should work correctly - create, persist and hydrate", () =>
            Promise.all(
                dataSources.map(async (dataSource) => {
                    if (
                        (dataSource.options.type === "mysql" &&
                            !DriverUtils.isReleaseVersionOrGreater(
                                dataSource.driver,
                                "9.0",
                            )) ||
                        (dataSource.options.type === "mariadb" &&
                            !DriverUtils.isReleaseVersionOrGreater(
                                dataSource.driver,
                                "11.7",
                            ))
                    ) {
                        return
                    }

                    await dataSource.synchronize()

                    // Verify column metadata
                    const queryRunner = dataSource.createQueryRunner()
                    const table = (await queryRunner.getTable(
                        dataSource.getMetadata(Embedding).tableName,
                    ))!
                    await queryRunner.release()

                    expect(table.findColumnByName("vector")).to.contain({
                        type: "vector",
                        length: "16",
                    })

                    const vector = [
                        0.004318627528846264, -0.008295782841742039,
                        0.011462775990366936, -0.03171011060476303,
                        -0.003404685528948903, 0.018827877938747406,
                        0.010692788287997246, 0.014154385775327682,
                        -0.026206370443105698, -0.03977154940366745,
                        -0.008630559779703617, 0.040039367973804474,
                        0.0019048830727115273, 0.01347813569009304,
                        -0.02147931419312954, -0.004211498890072107,
                    ]
                    const plainEmbedding = {
                        id: 1,
                        content: "This is a sample text to be analyzed by AI",
                        metadata: `{"client":"typeorm"}`,
                        vector,
                    } satisfies DeepPartial<Embedding>

                    const embeddingRepository =
                        dataSource.getRepository(Embedding)
                    const embedding = embeddingRepository.create(plainEmbedding)
                    await embeddingRepository.save(embedding)

                    const loadedEmbedding = await embeddingRepository.findOneBy(
                        { id: 1 },
                    )
                    expect(loadedEmbedding).to.deep.equal(plainEmbedding)
                }),
            ))
    })
})
```

```diff
@@ -1,4 +1,38 @@
-import { Column, Entity, PrimaryColumn } from "../../../../../../src"
+import {
+    Column,
+    Entity,
+    PrimaryColumn,
+    ValueTransformer,
+} from "../../../../../../src"
+
+const vectorTransformer: ValueTransformer = {
+    to: (value: number[]) => {
+        const length = value.length
+        const arrayBuffer = new ArrayBuffer(4 + length * 4)
+        const dataView = new DataView(arrayBuffer)
+        dataView.setUint32(0, length, true)
+        for (let index = 0; index < length; index++) {
+            dataView.setFloat32(4 + index * 4, value[index], true)
+        }
+        return Buffer.from(arrayBuffer)
+    },
+    from: (value: Buffer) => {
+        const dataView = new DataView(
+            value.buffer,
+            value.byteOffset,
+            value.byteLength,
+        )
+        const length = dataView.getUint32(0, true)
+        const array = new Array<number>(length)
+        for (let index = 0; index < length; index++) {
+            array[index] = dataView.getFloat32(4 + index * 4, true)
+        }
+        return array
+    },
+}
 
 @Entity()
 export class BufferEmbedding {
@@ -11,6 +45,8 @@ export class BufferEmbedding {
     @Column("nclob")
     metadata: string
 
-    @Column("real_vector")
-    realVector: Buffer
+    @Column("real_vector", {
+        transformer: vectorTransformer,
+    })
+    realVector: number[]
 }
```

```diff
@@ -119,34 +119,6 @@ describe("database-schema > vectors > sap", () => {
     })
 
     after(() => closeTestingConnections(dataSources))
 
-    function deserializeFvecs(buffer: Buffer) {
-        const dataView = new DataView(
-            buffer.buffer,
-            buffer.byteOffset,
-            buffer.byteLength,
-        )
-        const length = dataView.getUint32(0, true)
-        const array = new Array<number>(length)
-        for (let index = 0; index < length; index++) {
-            array[index] = dataView.getFloat32(4 + index * 4, true)
-        }
-        return array
-    }
-
-    function serializeFvecs(array: number[]) {
-        const length = array.length
-        const arrayBuffer = new ArrayBuffer(4 + length * 4)
-        const dataView = new DataView(arrayBuffer)
-        dataView.setUint32(0, length, true)
-        for (let index = 0; index < length; index++) {
-            dataView.setFloat32(4 + index * 4, array[index], true)
-        }
-        return Buffer.from(arrayBuffer)
-    }
-
     it("should work correctly - persist and hydrate ", () =>
         Promise.all(
             dataSources.map(async (dataSource) => {
@@ -177,7 +149,7 @@
                     content:
                         "This is a sample text to be analyzed by SAP Joule AI",
                     metadata: `{"client":"typeorm"}`,
-                    realVector: serializeFvecs(plainVector),
+                    realVector: plainVector,
                 } satisfies DeepPartial<BufferEmbedding>
 
                 const embeddingRepository =
@@ -188,10 +160,9 @@
                 const loadedEmbedding = await embeddingRepository.findOneBy(
                     { id: 1 },
                 )
-                const loadedVector = deserializeFvecs(
-                    loadedEmbedding!.realVector,
-                )
-                expect(loadedVector).to.deep.equal(plainVector)
+                expect(loadedEmbedding!.realVector).to.deep.equal(
+                    plainVector,
+                )
             }),
         ))
```

```diff
@@ -1,10 +1,9 @@
 import { expect } from "chai"
 import { exec } from "child_process"
-import { readFile, writeFile, chmod, unlink, rmdir } from "fs/promises"
-import { dirname } from "path"
+import { readFile, rm, unlink, writeFile } from "fs/promises"
 
 describe("cli init command", () => {
-    const cliPath = `${dirname(dirname(dirname(__dirname)))}/src/cli.js`
+    const cliPath = `${__dirname}/../../../src/cli.js`
     const databaseOptions = [
         "mysql",
         "mariadb",
@@ -20,21 +19,18 @@ describe("cli init command", () => {
     const builtSrcDirectory = "build/compiled/src"
 
     before(async () => {
-        const copyPackageJson = async () => {
-            // load package.json from the root of the project
-            const packageJson = JSON.parse(
-                await readFile("./package.json", "utf8"),
-            )
-            packageJson.version = `0.0.0` // install no version but
-            packageJson.installFrom = `file:../${builtSrcDirectory}` // use the built src directory
-            // write the modified package.json to the build directory
-            await writeFile(
-                `./${builtSrcDirectory}/package.json`,
-                JSON.stringify(packageJson, null, 4),
-            )
-        }
-
-        await Promise.all([chmod(cliPath, 0o755), copyPackageJson()])
+        // load package.json from the root of the project
+        const packageJson = JSON.parse(await readFile("./package.json", "utf8"))
+
+        // init command is taking typeorm version from package.json
+        // so ensure we are working against local build
+        packageJson.version = `file:../${builtSrcDirectory}`
+
+        // write the modified package.json to the build directory
+        await writeFile(
+            `./${builtSrcDirectory}/package.json`,
+            JSON.stringify(packageJson, null, 4),
+        )
     })
 
     after(async () => {
@@ -42,7 +38,7 @@
     })
 
     afterEach(async () => {
-        await rmdir(`./${testProjectPath}`, { recursive: true })
+        await rm(`./${testProjectPath}`, { recursive: true, force: true })
     })
 
     for (const databaseOption of databaseOptions) {
```