---
title: Features | Guide
outline: deep
---

# Features

Learn how to write your first test by video.

## Shared Config between Test, Dev and Build

Vite's config, transformers, resolvers, and plugins. Use the same setup from your app to run the tests.

Learn more at [Configuring Vitest](/guide/#configuring-vitest).
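As a minimal sketch of what this looks like in practice, assuming a single `vite.config.ts` at the project root (the `test` field is Vitest-specific and not used by Vite itself):

```ts [vite.config.ts]
/// <reference types="vitest/config" />
import { defineConfig } from 'vite'

export default defineConfig({
  // plugins, aliases, transforms, etc. are shared between dev, build and test
  plugins: [],
  test: {
    // Vitest-specific options live under the `test` field
    environment: 'node',
  },
})
```

Alternatively, you can import `defineConfig` from `vitest/config` instead of adding the reference comment.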
## Watch Mode

```bash
$ vitest
```

When you modify your source code or the test files, Vitest smartly searches the module graph and only reruns the related tests, just like how HMR works in Vite!

`vitest` starts in **watch mode by default in a development environment** and in **run mode in a CI environment** (when `process.env.CI` is present). You can use `vitest watch` or `vitest run` to explicitly specify the desired mode.

Start Vitest with the `--standalone` flag to keep it running in the background. It won't run any tests until they change. If only the source code changes, Vitest will not rerun the affected tests until the test files that import that source have been run at least once.

## Common Web Idioms Out-Of-The-Box

Out-of-the-box support for ES Modules, TypeScript, JSX, and PostCSS.

## Threads

By default, Vitest runs test files in [multiple processes](/guide/parallelism) using [`node:child_process`](https://nodejs.org/api/child_process.html), allowing tests to run simultaneously. If you want to speed up your test suite even further, consider enabling `--pool=threads` to run tests using [`node:worker_threads`](https://nodejs.org/api/worker_threads.html) (beware that some packages might not work with this setup).

To run tests in a single thread or process, see [`fileParallelism`](/config/#fileParallelism). Vitest also isolates each file's environment so env mutations in one file don't affect others. Isolation can be disabled by passing `--no-isolate` to the CLI (trading correctness for run performance).
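The same choices can also be made in the config file instead of on the command line. A minimal sketch using the options mentioned above (`pool`, `fileParallelism`, `isolate`); the values shown are illustrative, not recommendations:

```ts [vitest.config.ts]
import { defineConfig } from 'vitest/config'

export default defineConfig({
  test: {
    // 'forks' (child processes) is the default; 'threads' uses node:worker_threads
    pool: 'threads',
    // set to false to run all test files in a single worker
    fileParallelism: true,
    // set to false to disable per-file isolation (same as --no-isolate)
    isolate: true,
  },
})
```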
## Test Filtering

Vitest provides many ways to narrow down the tests to run in order to speed up testing so you can focus on development.

Learn more about [Test Filtering](/guide/filtering).

## Running Tests Concurrently

Use `.concurrent` in consecutive tests to start them in parallel.

```ts
import { describe, it } from 'vitest'

// The two tests marked with concurrent will be started in parallel
describe('suite', () => {
  it('serial test', async () => { /* ... */ })
  it.concurrent('concurrent test 1', async ({ expect }) => { /* ... */ })
  it.concurrent('concurrent test 2', async ({ expect }) => { /* ... */ })
})
```

If you use `.concurrent` on a suite, every test in it will be started in parallel.

```ts
import { describe, it } from 'vitest'

// All tests within this suite will be started in parallel
describe.concurrent('suite', () => {
  it('concurrent test 1', async ({ expect }) => { /* ... */ })
  it('concurrent test 2', async ({ expect }) => { /* ... */ })
  it.concurrent('concurrent test 3', async ({ expect }) => { /* ... */ })
})
```

You can also use `.skip`, `.only`, and `.todo` with concurrent suites and tests. Read more in the [API Reference](/api/#test-concurrent).

::: warning
When running concurrent tests, Snapshots and Assertions must use `expect` from the local [Test Context](/guide/test-context) to ensure the right test is detected.
:::

## Snapshot

[Jest-compatible](https://jestjs.io/docs/snapshot-testing) snapshot support.

```ts
import { expect, it } from 'vitest'

it('renders correctly', () => {
  const result = render()
  expect(result).toMatchSnapshot()
})
```

Learn more at [Snapshot](/guide/snapshot).

## Chai and Jest `expect` Compatibility

[Chai](https://www.chaijs.com/) is built-in for assertions with [Jest `expect`](https://jestjs.io/docs/expect)-compatible APIs.

Notice that if you are using third-party libraries that add matchers, setting [`test.globals`](/config/#globals) to `true` will provide better compatibility.
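As a small sketch of what this compatibility means in practice, Jest-style matchers and Chai-style chains can be mixed on the same built-in `expect` (the example data is arbitrary):

```ts
import { expect, it } from 'vitest'

it('accepts both assertion styles', () => {
  const user = { name: 'Alice', roles: ['admin'] }

  // Jest-style matchers
  expect(user.name).toBe('Alice')
  expect(user).toEqual({ name: 'Alice', roles: ['admin'] })

  // Chai-style chainable assertions
  expect(user.roles).to.have.lengthOf(1)
  expect(user.name).to.be.a('string')
})
```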
## Mocking

[Tinyspy](https://github.com/tinylibs/tinyspy) is built-in for mocking with `jest`-compatible APIs on the `vi` object.

```ts
import { expect, vi } from 'vitest'

const fn = vi.fn()

fn('hello', 1)

expect(vi.isMockFunction(fn)).toBe(true)
expect(fn.mock.calls[0]).toEqual(['hello', 1])

fn.mockImplementation((arg: string) => arg)

fn('world', 2)

expect(fn.mock.results[1].value).toBe('world')
```

Vitest supports both [happy-dom](https://github.com/capricorn86/happy-dom) and [jsdom](https://github.com/jsdom/jsdom) for mocking DOM and browser APIs. They don't come with Vitest, so you will need to install them separately:

::: code-group
```bash [happy-dom]
$ npm i -D happy-dom
```
```bash [jsdom]
$ npm i -D jsdom
```
:::

After that, change the `environment` option in your config file:

```ts [vitest.config.ts]
import { defineConfig } from 'vitest/config'

export default defineConfig({
  test: {
    environment: 'happy-dom', // or 'jsdom', 'node'
  },
})
```

Learn more at [Mocking](/guide/mocking).

## Coverage

Vitest supports native code coverage via [`v8`](https://v8.dev/blog/javascript-code-coverage) and instrumented code coverage via [`istanbul`](https://istanbul.js.org/).

```json [package.json]
{
  "scripts": {
    "test": "vitest",
    "coverage": "vitest run --coverage"
  }
}
```

Learn more at [Coverage](/guide/coverage).

## In-Source Testing

Vitest also provides a way to run tests within your source code along with the implementation, similar to [Rust's module tests](https://doc.rust-lang.org/book/ch11-03-test-organization.html#the-tests-module-and-cfgtest).

This lets the tests share the same closure as the implementation and test against private state without exporting it. Meanwhile, it also brings the feedback loop closer to development.

```ts [src/index.ts]
// the implementation
export function add(...args: number[]): number {
  return args.reduce((a, b) => a + b, 0)
}

// in-source test suites
if (import.meta.vitest) {
  const { it, expect } = import.meta.vitest
  it('add', () => {
    expect(add()).toBe(0)
    expect(add(1)).toBe(1)
    expect(add(1, 2, 3)).toBe(6)
  })
}
```

Learn more at [In-source testing](/guide/in-source).

## Benchmarking (Experimental) {#benchmarking}

You can run benchmark tests with the [`bench`](/api/#bench) function via [Tinybench](https://github.com/tinylibs/tinybench) to compare performance results.

```ts [sort.bench.ts]
import { bench, describe } from 'vitest'

describe('sort', () => {
  bench('normal', () => {
    const x = [1, 5, 4, 2, 3]
    x.sort((a, b) => {
      return a - b
    })
  })

  bench('reverse', () => {
    const x = [1, 5, 4, 2, 3]
    x.reverse().sort((a, b) => {
      return a - b
    })
  })
})
```

## Type Testing (Experimental) {#type-testing}

You can [write tests](/guide/testing-types) to catch type regressions. Vitest comes with the [`expect-type`](https://github.com/mmkal/expect-type) package to provide you with a similar and easy-to-understand API.

```ts [types.test-d.ts]
import { assertType, expectTypeOf, test } from 'vitest'
import { mount } from './mount.js'

test('my types work properly', () => {
  expectTypeOf(mount).toBeFunction()
  expectTypeOf(mount).parameter(0).toExtend<{ name: string }>()

  // @ts-expect-error name is a string
  assertType(mount({ name: 42 }))
})
```

## Sharding

Run tests on different machines using the [`--shard`](/guide/cli#shard) and [`--reporter=blob`](/guide/reporters#blob-reporter) flags.

All test and coverage results can be merged at the end of your CI pipeline using the `--merge-reports` command:

```bash
vitest --shard=1/2 --reporter=blob --coverage
vitest --shard=2/2 --reporter=blob --coverage
vitest --merge-reports --reporter=junit --coverage
```

See [`Improving Performance | Sharding`](/guide/improving-performance#sharding) for more information.

## Environment Variables

Vitest exclusively autoloads environment variables prefixed with `VITE_` from `.env` files to maintain compatibility with frontend-related tests, adhering to [Vite's established convention](https://vitejs.dev/guide/env-and-mode.html#env-files).

To load every environment variable from `.env` files anyway, you can use the `loadEnv` method from `vite`:

```ts [vitest.config.ts]
import { loadEnv } from 'vite'
import { defineConfig } from 'vitest/config'

export default defineConfig(({ mode }) => ({
  test: {
    // mode defines which ".env.{mode}" file to pick, if it exists
    env: loadEnv(mode, process.cwd(), ''),
  },
}))
```

## Unhandled Errors

By default, Vitest catches and reports all [unhandled rejections](https://developer.mozilla.org/en-US/docs/Web/API/Window/unhandledrejection_event), [uncaught exceptions](https://nodejs.org/api/process.html#event-uncaughtexception) (in Node.js) and [error](https://developer.mozilla.org/en-US/docs/Web/API/Window/error_event) events (in the [browser](/guide/browser/)).

You can disable this behaviour by catching them manually. Vitest assumes the callback is handled by you and won't report the error.

::: code-group
```ts [setup.node.js]
// in Node.js
process.on('unhandledRejection', () => {
  // your own handler
})

process.on('uncaughtException', () => {
  // your own handler
})
```
```ts [setup.browser.js]
// in the browser
window.addEventListener('error', () => {
  // your own handler
})

window.addEventListener('unhandledrejection', () => {
  // your own handler
})
```
:::

Alternatively, you can also ignore reported errors with the [`dangerouslyIgnoreUnhandledErrors`](/config/#dangerouslyignoreunhandlederrors) option. Vitest will still report them, but they won't affect the test result (the exit code won't be changed).

If you need to test that an error was not caught, you can create a test that looks like this:

```ts
test('my function throws uncaught error', async ({ onTestFinished }) => {
  const unhandledRejectionListener = vi.fn()
  process.on('unhandledRejection', unhandledRejectionListener)
  onTestFinished(() => {
    process.off('unhandledRejection', unhandledRejectionListener)
  })

  callMyFunctionThatRejectsError()

  await expect.poll(unhandledRejectionListener).toHaveBeenCalled()
})
```