mirror of
https://github.com/tailwindlabs/tailwindcss.git
synced 2025-12-08 21:36:08 +00:00
Closes #13694
Closes #13591

# Source Maps Support for Tailwind CSS

This PR adds support for source maps to Tailwind CSS v4, allowing us to track where styles come from, whether that be user CSS, imported stylesheets, or generated utilities. This improves debuggability in browser dev tools and gives us a good foundation for producing better error messages.

I'll go over the details on how end users can enable source maps, the limitations in our implementation, changes to the internal `compile(…)` API, and some details and reasoning around the implementation we chose.

## Usage

### CLI

Source maps can be enabled in the CLI by using the command line argument `--map`, which will generate an inline source map comment at the bottom of your CSS. A separate file may be generated by passing a file name to `--map`:

```bash
# Generates an inline source map
npx tailwindcss -i input.css -o output.css --map

# Generates a separate source map file
npx tailwindcss -i input.css -o output.css --map output.css.map
```

### PostCSS

Source maps are supported when using Tailwind as a PostCSS plugin *in development mode only*. They may or may not be enabled by default depending on your build tool. If they are not, you may be able to configure them within your PostCSS config:

```jsonc
// package.json
{
  // …
  "postcss": {
    "map": { "inline": true },
    "plugins": {
      "@tailwindcss/postcss": {},
    },
  }
}
```

### Vite

Source maps are supported when using the Tailwind CSS Vite plugin in *development mode only* by enabling the `css.devSourcemap` setting:

```js
import tailwindcss from "@tailwindcss/vite";
import { defineConfig } from "vite";

export default defineConfig({
  plugins: [tailwindcss()],
  css: {
    devSourcemap: true,
  },
})
```

Now when a CSS file is requested by the browser it'll have an inline source map comment that the browser can use.

## Limitations

- Production build source maps are currently disabled due to a bug in Lightning CSS.
  See https://github.com/parcel-bundler/lightningcss/pull/971 for more details.
- In Vite, minified CSS build source maps are not supported at all. See https://github.com/vitejs/vite/issues/2830 for more details.
- In PostCSS, minified CSS source maps are not supported. This is due to the complexity required around re-associating every AST node with a location in the generated, optimized CSS. This complexity would also have a non-trivial performance impact.

## Testing

Here's how to test the source map functionality in different environments:

### Testing the CLI

1. Set up a typical project that the CLI can use, with sources to scan:

   ```css
   @import "tailwindcss";

   @utility my-custom-utility {
     color: red;
   }

   /* to test `@apply` */
   .card {
     @apply bg-white text-center shadow-md;
   }
   ```

2. Build with source maps:

   ```bash
   bun /path/to/tailwindcss/packages/@tailwindcss-cli/src/index.ts --input input.css -o output.css --map
   ```

3. Open Chrome DevTools, inspect an element with utility classes, and you should see rules pointing to `input.css` or `node_modules/tailwindcss/index.css`.

### Testing with Vite

Testing in Vite requires building and installing the necessary files under `dist/*.tgz`.

1. Create a Vite project and enable source maps in `vite.config.js`:

   ```js
   import tailwindcss from "@tailwindcss/vite";
   import { defineConfig } from "vite";

   export default defineConfig({
     plugins: [tailwindcss()],
     css: {
       // This line is required for them to work
       devSourcemap: true,
     },
   })
   ```

2. Add a component that uses Tailwind classes and custom CSS:

   ```jsx
   // ./src/app.jsx
   export default function App() {
     return (
       <div className="bg-blue-500 my-custom-class">
         Hello World
       </div>
     )
   }
   ```

   ```css
   /* ./src/styles.css */
   @import "tailwindcss";

   @utility my-custom-utility {
     color: red;
   }

   /* to test `@apply` */
   .card {
     @apply bg-white text-center shadow-md;
   }
   ```

3. Run `npm run dev`, open DevTools, and inspect elements to verify source mapping works for both utility classes and custom CSS.
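Independent of DevTools, the inline comment produced by the steps above can be sanity-checked by decoding it directly. This is an illustrative sketch: the `decodeInlineMap` helper and the synthetic `css` string are not part of the PR.

```typescript
// Extract and decode an inline source map comment from generated CSS.
// `decodeInlineMap` is a hypothetical helper for illustration only.
function decodeInlineMap(css: string): { version: number; sources: string[] } {
  let match = css.match(/sourceMappingURL=data:application\/json;base64,([A-Za-z0-9+\/=]+)/)
  if (!match) throw new Error('No inline source map comment found')
  return JSON.parse(atob(match[1]))
}

// Stand-in for compiler output: a rule followed by an inline map comment.
let map = { version: 3, sources: ['input.css'], names: [], mappings: 'AAAA' }
let css = `.foo{color:red}\n/*# sourceMappingURL=data:application/json;base64,${btoa(
  JSON.stringify(map),
)} */`

console.log(decodeInlineMap(css).sources) // → [ 'input.css' ]
```

If `sources` lists your input files (rather than only the output file), the map made it through the pipeline.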
### Testing with PostCSS CLI

1. Create a test file and update your PostCSS config:

   ```css
   /* input.css */
   @import "tailwindcss";

   @layer components {
     .card {
       @apply p-6 rounded-lg shadow-lg;
     }
   }
   ```

   ```jsonc
   // package.json
   {
     // …
     "postcss": {
       "map": { "inline": true },
       "plugins": {
         "/path/to/tailwindcss/packages/@tailwindcss-postcss/src/index.ts": {}
       }
     }
   }
   ```

2. Run PostCSS through Bun:

   ```bash
   bunx --bun postcss ./src/index.css -o out.css
   ```

3. Inspect the output CSS; it should include an inline source map comment at the bottom.

### Testing with PostCSS + Next.js

Testing in Next.js requires building and installing the necessary files under `dist/*.tgz`. However, I've not been able to get CSS source maps to work in Next.js without this hack:

```js
const nextConfig: NextConfig = {
  // next.js overwrites config.devtool so we prevent it from doing so
  // please don't actually do this…
  webpack: (config) =>
    Object.defineProperty(config, "devtool", {
      get: () => "inline-source-map",
      set: () => {},
    }),
};
```

This is definitely not supported and also doesn't work with Turbopack. It can be used to test source maps temporarily, but I suspect that they just don't work there.

### Manual source map analysis

You can analyze source maps using Evan Wallace's [Source Map Visualization](https://evanw.github.io/source-map-visualization/) tool, which helps verify the accuracy and quality of source maps. This is what I used extensively while developing this implementation. It helps verify that custom user CSS maps back to itself in the input, that generated utilities all map back to `@tailwind utilities;`, that source locations from imported files are handled correctly, etc… It also highlights the mapped ranges, so it's easy to see if there are off-by-one errors.

It's easiest to use inline source maps with this tool: you can take the CSS file and drop it on the page, and it'll analyze it while showing the file content.
If you're using Vite, you'll want to access the CSS file with `?direct` at the end so you don't get a JS module back.

## Implementation

The source map implementation follows the ECMA-426 specification and includes several key components to aid in that goal:

### Source Location Tracking

Each emittable AST node in the compilation pipeline tracks two types of source locations:

- `src`: Original source location: [source file, start offset, end offset]
- `dst`: Generated source location: [output file, start offset, end offset]

This dual tracking allows us to maintain mappings between the original source and generated output for things like user CSS, generated utilities, uses of `@apply`, and tracking theme variables.

It is important to note that source locations for nodes _never overlap_ within a file, which helps simplify source map generation. As such, each type of node tracks a specific piece of itself rather than its entire "block":

| Node        | What a `SourceLocation` represents                               |
| ----------- | ---------------------------------------------------------------- |
| Style Rule  | The selector                                                     |
| At Rule     | Rule name and params, includes the `@`                           |
| Declaration | Property name and value, excludes the semicolon                  |
| Comment     | The entire comment, includes the start `/*` and end `*/` markers |

### Windows line endings when parsing CSS

Because our AST tracks nodes through offsets, we must ensure that any mutations to the file do *not* change the length of the string. We were previously replacing `\r\n` with `\n` (see [filter code points](https://drafts.csswg.org/css-syntax/#css-filter-code-points) from the spec), which changes the length of the string, so all offsets could end up incorrect. The CSS parser was updated to handle the CRLF token directly by skipping over the `\r` and letting the remaining code handle `\n` as it did previously. Some additional tweaks were required when "peeking" the input, but those changes were fairly small.
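The dual `src`/`dst` tracking can be illustrated with a small sketch. The `Declaration` shape and the concrete offsets here are illustrative only, not the real AST types:

```typescript
// Sketch of dual source-location tracking: a node remembers where it came
// from (`src`) and where it was emitted (`dst`), as offset triples.
type SourceLocation = [file: string, start: number, end: number]

interface Declaration {
  kind: 'declaration'
  property: string
  value: string
  src?: SourceLocation // location in the original input
  dst?: SourceLocation // location in the generated output
}

let input = '.card { color: red; }'
let output = '.card{color:red}'

// Per the table above, a declaration's location covers the property name
// and value, excluding the trailing semicolon.
let decl: Declaration = {
  kind: 'declaration',
  property: 'color',
  value: 'red',
  src: ['input.css', 8, 18],
  dst: ['output.css', 6, 15],
}

console.log(input.slice(decl.src![1], decl.src![2])) // → "color: red"
console.log(output.slice(decl.dst![1], decl.dst![2])) // → "color:red"
```

Because spans never overlap, a sorted list of these triples is already in the shape a source map generator needs.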
### Tracking of imports

Source maps need paths to the actual imported stylesheets, but the resolve step for stylesheets happens inside the call to `loadStylesheet`, which makes the file path unavailable to us. Because of this, the `loadStylesheet` API was augmented such that it has to return a `path` property that we can then use to identify imported sources. I've also made the same change to the `loadModule` API for consistency, but nothing currently uses this property. The `path` property likely makes `base` redundant, but eliminating that (if we even want to) is a future task.

### Optimizing the AST

Our optimization pass may introduce some nodes, for example, the fallbacks we create for `@property`. These nodes are linked back to `@tailwind utilities`, as ultimately that is what is responsible for creating them.

### Line Offset Tables

A key component of our source map generation is the line offset table, which was inspired by some ESBuild internals. It stores a sorted list of offsets for the start of each line, allowing us to translate offsets to line/column `Position`s in `O(log N)` time and from `Position`s to offsets in `O(1)` time. Creation of the table takes `O(N)` time.

This means that we can store code point offsets for source locations and not have to worry about computing or tracking line/column numbers during parsing and serialization. Only when a source map is generated do these offsets need to be computed. This ensures the performance penalty when not using source maps is minimal.

### Source Map Generation

The source map returned by `buildSourceMap()` is designed to follow the [ECMA-426 spec](https://tc39.es/ecma426). Because that spec is not completely finalized, we consider the result of `buildSourceMap()` to be internal API that may change as the spec changes. The produced source map is a "decoded" map, such that all sources and mappings are in an object graph.
A library like `source-map-js` must be used to convert this to an encoded source map of the right version, where mappings are encoded with base64 VLQs. Any specific integration (Vite, PostCSS, etc…) can then use `toSourceMap()` from `@tailwindcss/node` to convert the internal source map to a spec-compliant, encoded source map that can be understood by other tools.

### Handling minification in Lightning

Since we use Lightning CSS for optimization, and it takes in an input map, we generate an encoded source map that we then pass to Lightning. The output source map *from Lightning itself* is then passed back in during the second optimization pass. The final map is then passed from Lightning to the CLI (but not Vite or PostCSS; see the limitations section for details).

In some cases we have to "fix up" the output CSS. When this happens, we use `magic-string` to do the replacement in a way that is trackable, and `@ampproject/remapping` to map that change back onto the original source map. Once the need for these fix-ups disappears, these dependencies can go away.

Notes:

- The accuracy of source maps run through Lightning is reduced, as it only tracks locations on a per-rule level. This is sufficient for browser dev tools, so it should be fine.
- Source maps during optimization do not function properly at this time because of a bug in Lightning CSS regarding license comments. Once this bug is fixed they will start working as expected.

### How source locations flow through the system

1. During initial CSS parsing, source locations are preserved.
2. During parsing, these source locations are also mapped to their destinations, which supports an optimization for when no utilities are generated.
3. Throughout the compilation process, transformations maintain source location data.
4. Generated utilities are explicitly pointed at `@tailwind utilities`, unless generated by `@apply`.
5. When optimization is enabled, source maps are remapped through Lightning CSS.
6. Final source maps are written in the requested format (inline or separate file).
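The line offset table described under "Line Offset Tables" can be sketched as follows. This is an illustrative reimplementation, not the actual `line-table` module: a sorted array of line-start offsets gives offset-to-position lookups in `O(log N)` via binary search and position-to-offset lookups in `O(1)`.

```typescript
// Minimal line offset table sketch: translate between absolute string
// offsets and line/column positions without tracking them during parsing.
interface Position {
  line: number // 1-based
  column: number // 0-based
}

function createLineTable(source: string) {
  // Offset at which each line starts; line 1 starts at offset 0. O(N).
  let lineStarts = [0]
  for (let i = 0; i < source.length; i++) {
    if (source[i] === '\n') lineStarts.push(i + 1)
  }

  return {
    // O(log N): binary search for the last line start <= offset
    find(offset: number): Position {
      let lo = 0
      let hi = lineStarts.length - 1
      while (lo < hi) {
        let mid = (lo + hi + 1) >> 1
        if (lineStarts[mid] <= offset) lo = mid
        else hi = mid - 1
      }
      return { line: lo + 1, column: offset - lineStarts[lo] }
    },

    // O(1): direct lookup of the line's start offset
    findOffset({ line, column }: Position): number {
      return lineStarts[line - 1] + column
    },
  }
}

let lineTable = createLineTable('a {\n  color: red;\n}\n')
console.log(lineTable.find(6)) // → { line: 2, column: 2 }
```

Since positions are only needed at source map generation time, the table is built once per file at the end rather than maintained throughout parsing.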
691 lines
22 KiB
TypeScript
import dedent from 'dedent'
import fastGlob from 'fast-glob'
import { exec, spawn } from 'node:child_process'
import fs from 'node:fs/promises'
import { platform, tmpdir } from 'node:os'
import path from 'node:path'
import { stripVTControlCharacters } from 'node:util'
import { RawSourceMap, SourceMapConsumer } from 'source-map-js'
import { test as defaultTest, type ExpectStatic } from 'vitest'
import { createLineTable } from '../packages/tailwindcss/src/source-maps/line-table'
import { escape } from '../packages/tailwindcss/src/utils/escape'

const REPO_ROOT = path.join(__dirname, '..')
const PUBLIC_PACKAGES = (await fs.readdir(path.join(REPO_ROOT, 'dist'))).map((name) =>
  name.replace('tailwindcss-', '@tailwindcss/').replace('.tgz', ''),
)

interface SpawnedProcess {
  dispose: () => void
  flush: () => void
  onStdout: (predicate: (message: string) => boolean) => Promise<void>
  onStderr: (predicate: (message: string) => boolean) => Promise<void>
}

interface ChildProcessOptions {
  cwd?: string
  env?: Record<string, string>
}

interface ExecOptions {
  ignoreStdErr?: boolean
  stdin?: string
}

interface TestConfig {
  fs: {
    [filePath: string]: string | Uint8Array
  }

  installDependencies?: boolean
}
interface TestContext {
  root: string
  expect: ExpectStatic
  exec(command: string, options?: ChildProcessOptions, execOptions?: ExecOptions): Promise<string>
  spawn(command: string, options?: ChildProcessOptions): Promise<SpawnedProcess>
  parseSourceMap(opts: string | SourceMapOptions): SourceMap
  fs: {
    write(filePath: string, content: string, encoding?: BufferEncoding): Promise<void>
    create(filePaths: string[]): Promise<void>
    read(filePath: string): Promise<string>
    glob(pattern: string): Promise<[string, string][]>
    dumpFiles(pattern: string): Promise<string>
    expectFileToContain(
      filePath: string,
      contents: string | RegExp | (string | RegExp)[],
    ): Promise<void>
    expectFileNotToContain(filePath: string, contents: string | string[]): Promise<void>
  }
}

type TestCallback = (context: TestContext) => Promise<void> | void

interface TestFlags {
  only?: boolean
  skip?: boolean
  debug?: boolean
}

type SpawnActor = { predicate: (message: string) => boolean; resolve: () => void }

export const IS_WINDOWS = platform() === 'win32'

const TEST_TIMEOUT = IS_WINDOWS ? 120000 : 60000
const ASSERTION_TIMEOUT = IS_WINDOWS ? 10000 : 5000

// On Windows CI, tmpdir returns a path containing a weird RUNNER~1 folder that
// apparently causes the vite builds to not work.
const TMP_ROOT =
  process.env.CI && IS_WINDOWS ? path.dirname(process.env.GITHUB_WORKSPACE!) : tmpdir()
export function test(
  name: string,
  config: TestConfig,
  testCallback: TestCallback,
  { only = false, skip = false, debug = false }: TestFlags = {},
) {
  return defaultTest(
    name,
    {
      timeout: TEST_TIMEOUT,
      retry: process.env.CI ? 2 : 0,
      only: only || (!process.env.CI && debug),
      skip,
      concurrent: true,
    },
    async (options) => {
      let rootDir = debug ? path.join(REPO_ROOT, '.debug') : TMP_ROOT
      await fs.mkdir(rootDir, { recursive: true })

      let root = await fs.mkdtemp(path.join(rootDir, 'tailwind-integrations'))

      if (debug) {
        console.log('Running test in debug mode. File system will be written to:')
        console.log(root)
        console.log()
      }

      let context = {
        root,
        expect: options.expect,
        parseSourceMap,
        async exec(
          command: string,
          childProcessOptions: ChildProcessOptions = {},
          execOptions: ExecOptions = {},
        ) {
          let cwd = childProcessOptions.cwd ?? root
          if (debug && cwd !== root) {
            let relative = path.relative(root, cwd)
            if (relative[0] !== '.') relative = `./${relative}`
            console.log(`> cd ${relative}`)
          }
          if (debug) console.log(`> ${command}`)
          return new Promise((resolve, reject) => {
            let child = exec(
              command,
              {
                cwd,
                ...childProcessOptions,
                env: childProcessOptions.env,
              },
              (error, stdout, stderr) => {
                if (error) {
                  if (execOptions.ignoreStdErr !== true) console.error(stderr)
                  if (only || debug) {
                    console.error(stdout)
                  }
                  reject(error)
                } else {
                  if (only || debug) {
                    console.log(stdout.toString() + '\n\n' + stderr.toString())
                  }
                  resolve(stdout.toString() + '\n\n' + stderr.toString())
                }
              },
            )
            if (execOptions.stdin) {
              child.stdin?.write(execOptions.stdin)
              child.stdin?.end()
            }
          })
        },
        async spawn(command: string, childProcessOptions: ChildProcessOptions = {}) {
          let resolveDisposal: (() => void) | undefined
          let rejectDisposal: ((error: Error) => void) | undefined
          let disposePromise = new Promise<void>((resolve, reject) => {
            resolveDisposal = resolve
            rejectDisposal = reject
          })

          let cwd = childProcessOptions.cwd ?? root
          if (debug && cwd !== root) {
            let relative = path.relative(root, cwd)
            if (relative[0] !== '.') relative = `./${relative}`
            console.log(`> cd ${relative}`)
          }
          if (debug) console.log(`>& ${command}`)
          let child = spawn(command, {
            cwd,
            shell: true,
            ...childProcessOptions,
            env: {
              ...process.env,
              ...childProcessOptions.env,
            },
          })

          function dispose() {
            if (!child.kill()) {
              child.kill('SIGKILL')
            }

            let timer = setTimeout(
              () =>
                rejectDisposal?.(new Error(`spawned process (${command}) did not exit in time`)),
              ASSERTION_TIMEOUT,
            )
            disposePromise.finally(() => {
              clearTimeout(timer)
            })
            return disposePromise
          }
          disposables.push(dispose)
          function onExit() {
            resolveDisposal?.()
          }

          let stdoutMessages: string[] = []
          let stderrMessages: string[] = []

          let stdoutActors: SpawnActor[] = []
          let stderrActors: SpawnActor[] = []

          function notifyNext(actors: SpawnActor[], messages: string[]) {
            if (actors.length <= 0) return
            let [next] = actors

            for (let [idx, message] of messages.entries()) {
              if (next.predicate(message)) {
                messages.splice(0, idx + 1)
                let actorIdx = actors.indexOf(next)
                actors.splice(actorIdx, 1)
                next.resolve()
                break
              }
            }
          }

          let combined: ['stdout' | 'stderr', string][] = []

          child.stdout.on('data', (result) => {
            let content = result.toString()
            if (debug || only) console.log(content)
            combined.push(['stdout', content])
            for (let line of content.split('\n')) {
              stdoutMessages.push(stripVTControlCharacters(line))
            }
            notifyNext(stdoutActors, stdoutMessages)
          })
          child.stderr.on('data', (result) => {
            let content = result.toString()
            if (debug || only) console.error(content)
            combined.push(['stderr', content])
            for (let line of content.split('\n')) {
              stderrMessages.push(stripVTControlCharacters(line))
            }
            notifyNext(stderrActors, stderrMessages)
          })
          child.on('exit', onExit)
          child.on('error', (error) => {
            if (error.name !== 'AbortError') {
              throw error
            }
          })

          options.onTestFailed(() => {
            // In only or debug mode, messages are logged to the console
            // immediately.
            if (only || debug) return

            for (let [type, message] of combined) {
              if (type === 'stdout') {
                console.log(message)
              } else {
                console.error(message)
              }
            }
          })

          return {
            dispose,
            flush() {
              stdoutActors.splice(0)
              stderrActors.splice(0)

              stdoutMessages.splice(0)
              stderrMessages.splice(0)
            },
            onStdout(predicate: (message: string) => boolean) {
              return new Promise<void>((resolve) => {
                stdoutActors.push({ predicate, resolve })
                notifyNext(stdoutActors, stdoutMessages)
              })
            },
            onStderr(predicate: (message: string) => boolean) {
              return new Promise<void>((resolve) => {
                stderrActors.push({ predicate, resolve })
                notifyNext(stderrActors, stderrMessages)
              })
            },
          }
        },
        fs: {
          async write(
            filename: string,
            content: string | Uint8Array,
            encoding: BufferEncoding = 'utf8',
          ): Promise<void> {
            let full = path.join(root, filename)
            let dir = path.dirname(full)
            await fs.mkdir(dir, { recursive: true })

            if (typeof content !== 'string') {
              return await fs.writeFile(full, content)
            }

            if (filename.endsWith('package.json')) {
              content = await overwriteVersionsInPackageJson(content)
            }

            // Ensure that files written on Windows use \r\n line endings
            if (IS_WINDOWS) {
              content = content.replace(/\n/g, '\r\n')
            }

            await fs.writeFile(full, content, encoding)
          },

          async create(filenames: string[]): Promise<void> {
            for (let filename of filenames) {
              let full = path.join(root, filename)

              let dir = path.dirname(full)
              await fs.mkdir(dir, { recursive: true })
              await fs.writeFile(full, '')
            }
          },

          async read(filePath: string) {
            let content = await fs.readFile(path.resolve(root, filePath), 'utf8')

            // Ensure that files read on Windows have \r\n line endings removed
            if (IS_WINDOWS) {
              content = content.replace(/\r\n/g, '\n')
            }

            return content
          },
          async glob(pattern: string) {
            let files = await fastGlob(pattern, { cwd: root })
            return Promise.all(
              files.map(async (file) => {
                let content = await fs.readFile(path.join(root, file), 'utf8')
                return [
                  file,
                  // Drop license comment
                  content.replace(/[\s\n]*\/\*![\s\S]*?\*\/[\s\n]*/g, ''),
                ]
              }),
            )
          },
          async dumpFiles(pattern: string) {
            let files = await context.fs.glob(pattern)
            return `\n${files
              .slice()
              .sort((a: [string], z: [string]) => {
                let aParts = a[0].split('/')
                let zParts = z[0].split('/')

                let aFile = aParts.at(-1)
                let zFile = zParts.at(-1)

                // Sort by depth, shallow first
                if (aParts.length < zParts.length) return -1
                if (aParts.length > zParts.length) return 1

                // Sort by folder names, alphabetically
                for (let i = 0; i < aParts.length - 1; i++) {
                  let diff = aParts[i].localeCompare(zParts[i])
                  if (diff !== 0) return diff
                }

                // Sort by filename, sort files named `index` before others
                if (aFile?.startsWith('index') && !zFile?.startsWith('index')) return -1
                if (zFile?.startsWith('index') && !aFile?.startsWith('index')) return 1

                // Sort by filename, alphabetically
                return a[0].localeCompare(z[0])
              })
              .map(([file, content]) => `--- ${file} ---\n${content || '<EMPTY>'}`)
              .join('\n\n')
              .trim()}\n`
          },
          async expectFileToContain(filePath, contents) {
            return retryAssertion(async () => {
              let fileContent = await this.read(filePath)
              for (let content of Array.isArray(contents) ? contents : [contents]) {
                if (content instanceof RegExp) {
                  options.expect(fileContent).toMatch(content)
                } else {
                  options.expect(fileContent).toContain(content)
                }
              }
            })
          },
          async expectFileNotToContain(filePath, contents) {
            return retryAssertion(async () => {
              let fileContent = await this.read(filePath)
              for (let content of contents) {
                options.expect(fileContent).not.toContain(content)
              }
            })
          },
        },
      } satisfies TestContext
      config.fs['.gitignore'] ??= txt`
        node_modules/
      `

      for (let [filename, content] of Object.entries(config.fs)) {
        await context.fs.write(filename, content)
      }

      let shouldInstallDependencies = config.installDependencies ?? true

      try {
        // In debug mode, the directory is going to be inside the pnpm workspace
        // of the tailwindcss package. This means that `pnpm install` will run
        // pnpm install on the workspace instead (except if the root dir defines
        // a separate workspace). We work around this by using the
        // `--ignore-workspace` flag.
        if (shouldInstallDependencies) {
          let ignoreWorkspace = debug && !config.fs['pnpm-workspace.yaml']
          await context.exec(`pnpm install${ignoreWorkspace ? ' --ignore-workspace' : ''}`)
        }
      } catch (error: any) {
        console.error(error)
        console.error(error.stdout?.toString())
        console.error(error.stderr?.toString())
        throw error
      }

      let disposables: (() => Promise<void>)[] = []

      async function dispose() {
        await Promise.all(disposables.map((dispose) => dispose()))

        if (!debug) {
          await gracefullyRemove(root)
        }
      }

      options.onTestFinished(dispose)

      // Make it a git repository, and commit all files
      if (only || debug) {
        try {
          await context.exec('git init', { cwd: root })
          await context.exec('git add --all', { cwd: root })
          await context.exec('git commit -m "before migration"', { cwd: root })
        } catch (error: any) {
          console.error(error)
          console.error(error.stdout?.toString())
          console.error(error.stderr?.toString())
          throw error
        }
      }

      return await testCallback(context)
    },
  )
}

test.only = (name: string, config: TestConfig, testCallback: TestCallback) => {
  return test(name, config, testCallback, { only: true })
}
test.skip = (name: string, config: TestConfig, testCallback: TestCallback) => {
  return test(name, config, testCallback, { skip: true })
}
test.debug = (name: string, config: TestConfig, testCallback: TestCallback) => {
  return test(name, config, testCallback, { debug: true })
}
// Maps package names to their tarball filenames. See scripts/pack-packages.ts
// for more details.
function pkgToFilename(name: string) {
  return `${name.replace('@', '').replace('/', '-')}.tgz`
}

async function overwriteVersionsInPackageJson(content: string): Promise<string> {
  let json = JSON.parse(content)

  // Resolve all workspace:^ versions to local tarballs
  for (let key of ['dependencies', 'devDependencies', 'peerDependencies', 'optionalDependencies']) {
    let dependencies = json[key] || {}
    for (let dependency in dependencies) {
      if (dependencies[dependency] === 'workspace:^') {
        dependencies[dependency] = resolveVersion(dependency)
      }
    }
  }

  // Inject transitive dependency overwrite. This is necessary because
  // @tailwindcss/vite internally depends on a specific version of
  // @tailwindcss/oxide and we instead want to resolve it to the locally built
  // version.
  json.pnpm ||= {}
  json.pnpm.overrides ||= {}
  for (let pkg of PUBLIC_PACKAGES) {
    if (pkg === 'tailwindcss') {
      // We want to be explicit about the `tailwindcss` package so our tests can
      // also import v3 without conflicting v4 tarballs.
      json.pnpm.overrides['@tailwindcss/node>tailwindcss'] = resolveVersion(pkg)
      json.pnpm.overrides['@tailwindcss/upgrade>tailwindcss'] = resolveVersion(pkg)
      json.pnpm.overrides['@tailwindcss/cli>tailwindcss'] = resolveVersion(pkg)
      json.pnpm.overrides['@tailwindcss/postcss>tailwindcss'] = resolveVersion(pkg)
      json.pnpm.overrides['@tailwindcss/vite>tailwindcss'] = resolveVersion(pkg)
    } else {
      json.pnpm.overrides[pkg] = resolveVersion(pkg)
    }
  }

  return JSON.stringify(json, null, 2)
}

function resolveVersion(dependency: string) {
  let tarball = path.join(REPO_ROOT, 'dist', pkgToFilename(dependency))
  return `file:${tarball}`
}

export function stripTailwindComment(content: string) {
  return content.replace(/\/\*! tailwindcss .*? \*\//g, '').trim()
}
export let svg = dedent
export let css = dedent
export let html = dedent
export let ts = dedent
export let js = dedent
export let jsx = dedent
export let json = dedent
export let yaml = dedent
export let txt = dedent

export function binary(str: string | TemplateStringsArray, ...values: unknown[]): Uint8Array {
  let base64 = typeof str === 'string' ? str : String.raw(str, ...values)

  return Uint8Array.from(atob(base64), (c) => c.charCodeAt(0))
}

export function candidate(strings: TemplateStringsArray, ...values: any[]) {
  let output: string[] = []
  for (let i = 0; i < strings.length; i++) {
    output.push(strings[i])
    if (i < values.length) {
      output.push(values[i])
    }
  }

  return `.${escape(output.join('').trim())}`
}

export async function retryAssertion<T>(
  fn: () => Promise<T>,
  { timeout = ASSERTION_TIMEOUT, delay = 5 }: { timeout?: number; delay?: number } = {},
) {
  let end = Date.now() + timeout
  let error: any
  while (Date.now() < end) {
    try {
      return await fn()
    } catch (err) {
      error = err
      await new Promise((resolve) => setTimeout(resolve, delay))
    }
  }
  throw error
}
export async function fetchStyles(base: string, path = '/'): Promise<string> {
  while (base.endsWith('/')) {
    base = base.slice(0, -1)
  }

  let index = await fetch(`${base}${path}`)
  let html = await index.text()

  let linkRegex = /<link rel="stylesheet" href="([a-zA-Z0-9\/_\.\?=%-]+)"/gi
  let styleRegex = /<style\b[^>]*>([\s\S]*?)<\/style>/gi

  let stylesheets: string[] = []

  let paths: string[] = []
  for (let match of html.matchAll(linkRegex)) {
    let path: string = match[1]
    if (path.startsWith('./')) {
      path = path.slice(1)
    }
    paths.push(path)
  }
  stylesheets.push(
    ...(await Promise.all(
      paths.map(async (path) => {
        let css = await fetch(`${base}${path}`, {
          headers: {
            Accept: 'text/css',
          },
        })
        return await css.text()
      }),
    )),
  )

  for (let match of html.matchAll(styleRegex)) {
    stylesheets.push(match[1])
  }

  return stylesheets.reduce((acc, css) => {
    if (acc.length > 0) acc += '\n'
    acc += css
    return acc
  }, '')
}

async function gracefullyRemove(dir: string) {
  // Skip removing the directory in CI because it can stall on Windows
  if (!process.env.CI) {
    await fs.rm(dir, { recursive: true, force: true }).catch((error) => {
      console.log(`Failed to remove ${dir}`, error)
    })
  }
}
const SOURCE_MAP_COMMENT = /^\/\*# sourceMappingURL=data:application\/json;base64,(.*) \*\/$/

export interface SourceMap {
  at(
    line: number,
    column: number,
  ): {
    source: string | null
    original: string
    generated: string
  }
}

interface SourceMapOptions {
  /**
   * A raw source map
   *
   * This may be a string or an object. Strings will be decoded.
   */
  map: string | object

  /**
   * The content of the generated file the source map is for
   */
  content: string

  /**
   * The encoding of the source map
   *
   * Can be used to decode a base64 map (e.g. an inline source map URI)
   */
  encoding?: BufferEncoding
}

function parseSourceMap(opts: string | SourceMapOptions): SourceMap {
  if (typeof opts === 'string') {
    let lines = opts.trimEnd().split('\n')
    let comment = lines.at(-1) ?? ''
    let map = String(comment).match(SOURCE_MAP_COMMENT)?.[1] ?? null
    if (!map) throw new Error('No source map comment found')

    return parseSourceMap({
      map,
      content: lines.slice(0, -1).join('\n'),
      encoding: 'base64',
    })
  }

  let rawMap: RawSourceMap
  let content = opts.content

  if (typeof opts.map === 'object') {
    rawMap = opts.map as RawSourceMap
  } else {
    rawMap = JSON.parse(Buffer.from(opts.map, opts.encoding ?? 'utf-8').toString())
  }

  let map = new SourceMapConsumer(rawMap)
  let generatedTable = createLineTable(content)

  return {
    at(line: number, column: number) {
      let pos = map.originalPositionFor({ line, column })
      let source = pos.source ? map.sourceContentFor(pos.source) : null
      let originalTable = createLineTable(source ?? '')
      let originalOffset = originalTable.findOffset(pos)
      let generatedOffset = generatedTable.findOffset({ line, column })

      return {
        source: pos.source,
        original: source
          ? source.slice(originalOffset, originalOffset + 10).trim() + '...'
          : '(none)',
        generated: content.slice(generatedOffset, generatedOffset + 10).trim() + '...',
      }
    },
  }
}