mirror of
https://github.com/tailwindlabs/tailwindcss.git
synced 2025-12-08 21:36:08 +00:00
8 Commits
| Author | SHA1 | Message | Date | |
|---|---|---|---|---|
|
|
0d975f5f06
|
Update dedent 1.5.3 → 1.6.0 (minor) (#17965)
Here is everything you need to know about this upgrade. Please take a good look at what changed and the test results before merging this pull request.

### What changed?

#### ✳️ dedent (1.5.3 → 1.6.0) · [Repo](https://github.com/dmnd/dedent) · [Changelog](https://github.com/dmnd/dedent/blob/main/CHANGELOG.md)

Release notes for [1.6.0](https://github.com/dmnd/dedent/releases/tag/v1.6.0):

- feat: add `trimWhitespace` option by [@43081j](https://bounce.depfu.com/github.com/43081j) in [#97](https://bounce.depfu.com/github.com/dmnd/dedent/pull/97)

New contributors:

- [@43081j](https://bounce.depfu.com/github.com/43081j) made their first contribution in [#97](https://bounce.depfu.com/github.com/dmnd/dedent/pull/97)

**Full Changelog**: [v1.5.3...v1.6.0](https://bounce.depfu.com/github.com/dmnd/dedent/compare/v1.5.3...v1.6.0)

*Does any of this look wrong? [Please let us know.](https://depfu.com/packages/npm/dedent/feedback)* |
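For context, a minimal sketch of what the new option enables, assuming dedent's documented `withOptions` API; the exact semantics of `trimWhitespace` come from dedent's changelog, not this repository:

```ts
import dedent from "dedent";

// Default behaviour: common indentation and surrounding blank lines are stripped.
const trimmed = dedent`
  .btn {
    color: red;
  }
`;

// Assumed usage of the new 1.6.0 option: trimWhitespace: false keeps the
// leading/trailing whitespace while still removing the common indentation.
const untrimmed = dedent.withOptions({ trimWhitespace: false })`
  .btn {
    color: red;
  }
`;
```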
||
|
|
56b22bb1d3
|
Add support for source maps (#17775)
Closes #13694
Closes #13591

# Source Maps Support for Tailwind CSS

This PR adds support for source maps to Tailwind CSS v4, allowing us to track where styles come from, whether that be user CSS, imported stylesheets, or generated utilities. This improves debuggability in browser dev tools and gives us a good foundation for producing better error messages.

I'll go over the details of how end users can enable source maps, the limitations in our implementation, changes to the internal `compile(…)` API, and some details and reasoning around the implementation we chose.

## Usage

### CLI

Source maps can be enabled in the CLI with the command line argument `--map`, which generates an inline source map comment at the bottom of your CSS. A separate file may be generated by passing a file name to `--map`:

```bash
# Generates an inline source map
npx tailwindcss -i input.css -o output.css --map

# Generates a separate source map file
npx tailwindcss -i input.css -o output.css --map output.css.map
```

### PostCSS

Source maps are supported when using Tailwind as a PostCSS plugin *in development mode only*. They may or may not be enabled by default depending on your build tool. If they are not, you may be able to configure them within your PostCSS config:

```jsonc
// package.json
{
  // …
  "postcss": {
    "map": { "inline": true },
    "plugins": {
      "@tailwindcss/postcss": {},
    },
  }
}
```

### Vite

Source maps are supported when using the Tailwind CSS Vite plugin in *development mode only* by enabling the `css.devSourcemap` setting:

```js
import tailwindcss from "@tailwindcss/vite";
import { defineConfig } from "vite";

export default defineConfig({
  plugins: [tailwindcss()],
  css: {
    devSourcemap: true,
  },
})
```

Now, when a CSS file is requested by the browser, it'll have an inline source map comment that the browser can use.

## Limitations

- Production build source maps are currently disabled due to a bug in Lightning CSS. See https://github.com/parcel-bundler/lightningcss/pull/971 for more details.
- In Vite, minified CSS build source maps are not supported at all. See https://github.com/vitejs/vite/issues/2830 for more details.
- In PostCSS, minified CSS source maps are not supported. This is due to the complexity required around re-associating every AST node with a location in the generated, optimized CSS. This complexity would also have a non-trivial performance impact.

## Testing

Here's how to test the source map functionality in different environments:

### Testing the CLI

1. Set up a typical project that the CLI can use, with sources to scan:

```css
@import "tailwindcss";

@utility my-custom-utility {
  color: red;
}

/* to test `@apply` */
.card {
  @apply bg-white text-center shadow-md;
}
```

2. Build with source maps:

```bash
bun /path/to/tailwindcss/packages/@tailwindcss-cli/src/index.ts --input input.css -o output.css --map
```

3. Open Chrome DevTools, inspect an element with utility classes, and you should see rules pointing to `input.css` or `node_modules/tailwindcss/index.css`.

### Testing with Vite

Testing in Vite requires building and installing the necessary files under `dist/*.tgz`.

1. Create a Vite project and enable source maps in `vite.config.js`:

```js
import tailwindcss from "@tailwindcss/vite";
import { defineConfig } from "vite";

export default defineConfig({
  plugins: [tailwindcss()],
  css: {
    // This line is required for them to work
    devSourcemap: true,
  },
})
```

2. Add a component that uses Tailwind classes and custom CSS:

```jsx
// ./src/app.jsx
export default function App() {
  return (
    <div className="bg-blue-500 my-custom-class">
      Hello World
    </div>
  )
}
```

```css
/* ./src/styles.css */
@import "tailwindcss";

@utility my-custom-utility {
  color: red;
}

/* to test `@apply` */
.card {
  @apply bg-white text-center shadow-md;
}
```

3. Run `npm run dev`, open DevTools, and inspect elements to verify source mapping works for both utility classes and custom CSS.

### Testing with PostCSS CLI

1. Create a test file and update your PostCSS config:

```css
/* input.css */
@import "tailwindcss";

@layer components {
  .card {
    @apply p-6 rounded-lg shadow-lg;
  }
}
```

```jsonc
// package.json
{
  // …
  "postcss": {
    "map": { "inline": true },
    "plugins": {
      "/path/to/tailwindcss/packages/@tailwindcss-postcss/src/index.ts": {}
    }
  }
}
```

2. Run PostCSS through Bun:

```bash
bunx --bun postcss ./src/index.css -o out.css
```

3. Inspect the output CSS - it should include an inline source map comment at the bottom.

### Testing with PostCSS + Next.js

Testing in Next.js requires building and installing the necessary files under `dist/*.tgz`. However, I've not been able to get CSS source maps to work in Next.js without this hack:

```js
const nextConfig: NextConfig = {
  // next.js overwrites config.devtool so we prevent it from doing so
  // please don't actually do this…
  webpack: (config) =>
    Object.defineProperty(config, "devtool", {
      get: () => "inline-source-map",
      set: () => {},
    }),
};
```

This is definitely not supported and also doesn't work with Turbopack. It can be used to test source maps temporarily, but I suspect they just don't work there.

### Manual source map analysis

You can analyze source maps using Evan Wallace's [Source Map Visualization](https://evanw.github.io/source-map-visualization/) tool, which helps verify the accuracy and quality of source maps. This is what I used extensively while developing this implementation. It helps verify that custom, user CSS maps back to itself in the input, that generated utilities all map back to `@tailwind utilities;`, that source locations from imported files are also handled correctly, etc… It also highlights the ranges of things, so it's easy to see if there are off-by-one errors.

It's easiest to use inline source maps with this tool because you can take the CSS file and drop it on the page and it'll analyze it while showing the file content. If you're using Vite, you'll want to access the CSS file with `?direct` at the end so you don't get a JS module back.

## Implementation

The source map implementation follows the ECMA-426 specification and includes several key components to aid in that goal:

### Source Location Tracking

Each emittable AST node in the compilation pipeline tracks two types of source locations:

- `src`: Original source location - [source file, start offset, end offset]
- `dst`: Generated source location - [output file, start offset, end offset]

This dual tracking allows us to maintain mappings between the original source and generated output for things like user CSS, generated utilities, uses of `@apply`, and tracking theme variables. It is important to note that source locations for nodes _never overlap_ within a file, which helps simplify source map generation.
As such, each type of node tracks a specific piece of itself rather than its entire "block":

| Node        | What a `SourceLocation` represents                                |
| ----------- | ----------------------------------------------------------------- |
| Style Rule  | The selector                                                      |
| At Rule     | Rule name and params, includes the `@`                            |
| Declaration | Property name and value, excludes the semicolon                   |
| Comment     | The entire comment, includes the start `/*` and end `*/` markers  |

### Windows line endings when parsing CSS

Because our AST tracks nodes through offsets, we must ensure that any mutations to the file do *not* change the length of the string. We were previously replacing `\r\n` with `\n` (see [filter code points](https://drafts.csswg.org/css-syntax/#css-filter-code-points) from the spec), which changes the length of the string and can leave all offsets incorrect. The CSS parser was updated to handle the CRLF token directly by skipping over the `\r` and letting the remaining code handle `\n` as it did previously. Some additional tweaks were required when "peeking" the input, but those changes were fairly small.

### Tracking of imports

Source maps need paths to the actual imported stylesheets, but the resolve step for stylesheets happens inside the call to `loadStylesheet`, which makes the file path unavailable to us. Because of this, the `loadStylesheet` API was augmented such that it has to return a `path` property that we can then use to identify imported sources. I've also made the same change to the `loadModule` API for consistency, but nothing currently uses this property. The `path` property likely makes `base` redundant, but eliminating that (if we even want to) is a future task.

### Optimizing the AST

Our optimization pass may introduce some nodes, for example, fallbacks we create for `@property`. These nodes are linked back to `@tailwind utilities`, as ultimately that is what is responsible for creating them.

### Line Offset Tables

A key component of our source map generation is the line offset table, which was inspired by some ESBuild internals. It stores a sorted list of offsets for the start of each line, allowing us to translate offsets to line/column `Position`s in `O(log N)` time and from `Position`s to offsets in `O(1)` time. Creation of the table takes `O(N)` time.

This means that we can store code point offsets for source locations and not have to worry about computing or tracking line/column numbers during parsing and serialization. Only when a source map is generated do these offsets need to be computed. This ensures the performance penalty when not using source maps is minimal.

### Source Map Generation

The source map returned by `buildSourceMap()` is designed to follow the [ECMA-426 spec](https://tc39.es/ecma426). Because that spec is not completely finalized, we consider the result of `buildSourceMap()` to be internal API that may change as the spec changes.

The produced source map is a "decoded" map such that all sources and mappings are in an object graph. A library like `source-map-js` must be used to convert this to an encoded source map of the right version where mappings are encoded with base64 VLQs. Any specific integration (Vite, PostCSS, etc…) can then use `toSourceMap()` from `@tailwindcss/node` to convert the internal source map to a spec-compliant, encoded source map that can be understood by other tools.

### Handling minification in Lightning

Since we use Lightning CSS for optimization, and it takes in an input map, we generate an encoded source map that we then pass to Lightning. The output source map *from Lightning itself* is then passed back in during the second optimization pass. The final map is then passed from Lightning to the CLI (but not Vite or PostCSS — see the limitations section for details).

In some cases we have to "fix up" the output CSS. When this happens, we use `magic-string` to do the replacement in a way that is trackable and `@ampproject/remapping` to map that change back onto the original source map. Once the need for these fix-ups disappears, these dependencies can go away.

Notes:

- The accuracy of source maps run through Lightning CSS is reduced, as it only tracks on a per-rule level. This is sufficient for browser dev tools, so it should be fine.
- Source maps during optimization do not function properly at this time because of a bug in Lightning CSS regarding license comments. Once this bug is fixed, they will start working as expected.

### How source locations flow through the system

1. During initial CSS parsing, source locations are preserved.
2. During parsing, these source locations are also mapped to the destinations, which supports an optimization for when no utilities are generated.
3. Throughout the compilation process, transformations maintain source location data.
4. Generated utilities explicitly point to `@tailwind utilities` unless generated by `@apply`.
5. When optimization is enabled, source maps are remapped through Lightning CSS.
6. Final source maps are written in the requested format (inline or separate file). |
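To make the "Line Offset Tables" section above more concrete, here is a minimal TypeScript sketch of the data structure it describes; the names (`createLineOffsetTable`, `Position`) are illustrative, not Tailwind's actual internals:

```ts
// Store the start offset of every line once (O(N)), then convert offsets to
// line/column positions with a binary search (O(log N)) and positions back to
// offsets with simple array indexing (O(1)).
interface Position {
  line: number   // 1-based
  column: number // 0-based
}

function createLineOffsetTable(source: string) {
  // lineStarts[i] is the offset of the first character of line i (0-based index).
  let lineStarts: number[] = [0]
  for (let i = 0; i < source.length; i++) {
    if (source[i] === '\n') lineStarts.push(i + 1)
  }

  // Offset -> Position: binary search for the last line start <= offset.
  function find(offset: number): Position {
    let lo = 0
    let hi = lineStarts.length - 1
    while (lo < hi) {
      let mid = (lo + hi + 1) >> 1
      if (lineStarts[mid] <= offset) lo = mid
      else hi = mid - 1
    }
    return { line: lo + 1, column: offset - lineStarts[lo] }
  }

  // Position -> Offset: direct lookup.
  function findOffset({ line, column }: Position): number {
    return lineStarts[line - 1] + column
  }

  return { find, findOffset }
}
```

With a table like this, AST nodes only need to carry the `[file, start offset, end offset]` tuples described above, and line/column pairs are derived only when a source map is actually emitted.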
||
|
|
ae8fb146a7
|
Update fast-glob 3.3.2 → 3.3.3 (patch) (#15607) | ||
|
|
2a29c29441
|
Improve integration tests (stability + performance) (#15125)
This PR improves the integration tests in two ways:

1. Make the integration tests more reliable and thus less flaky
2. Make the integration tests faster (by introducing concurrency)

We tried a lot of different things to make sure that these tests are fast and stable.

---

The biggest issue we noticed is that some tests are flaky; these are tests with long-running dev-mode processes where watchers are being used and/or dev servers are created. To solve this, all the tests that spawn a process look at stdout/stderr and wait for a message from the process to know when we can start making changes. For example, in case of an Astro project you get a `watching for file changes` message, in case of a Nuxt project you can wait for a `server warmed up in` message, and in case of Next.js there is a `Ready in` message. These depend on the tools being used, so this is hardcoded per test instead of a magically automatic solution. These messages allow us to wait until all the initial necessary work is done and internal watchers and/or dev servers are set up, so we don't start making changes to files or requesting CSS stylesheets before the server(s) are ready.

---

Another improvement is how we set up the dev servers. Before, we used to try to get a free port on the system and use a `--port` flag or a `PORT` environment variable. Instead of doing this (which is slow), we rely on the process itself to show a URL with a port. Basically all tools will try to find a free port if the default port is in use. We can then use the stdout/stderr messages to get the URL and the port to use.

To reduce the amount of potential port conflicts, we used to run every test and every file sequentially to basically guarantee that ports are free. With this new approach, where we rely on the process, I noticed that we don't really run into this issue anymore (I reran the tests multiple times and they were always stable).

<img width="316" alt="image" src="https://github.com/user-attachments/assets/b75ddab4-f919-4995-85d0-f212b603e5c2" />

Note: these tests run on Linux, Windows, and macOS in this branch just for testing purposes. Once this is done, we will only run Linux tests on PRs and run all 3 of them on the `next` branch.

We do make the tests concurrent by default now, which in theory means that there could be conflicts (which in practice means that the process has to do a few more tries to find a free port). To reduce these conflicts, we split up the integration tests such that Vite, PostCSS, CLI, … tests all run in a separate job in the GitHub Actions workflow.

<img width="312" alt="image" src="https://github.com/user-attachments/assets/fe9a58a1-98eb-4d9b-8845-a7c8a7af5766" />

Comparing this branch against the `next` branch, this is what CI looks like right now:

| `next` | `feat/improve-integration-tests` |
| --- | --- |
| <img width="594" alt="image" src="https://github.com/user-attachments/assets/540d21eb-ab03-42e8-9f6f-b3a071fc7635" /> | <img width="672" alt="image" src="https://github.com/user-attachments/assets/8ef2e891-08a1-464b-9954-4153174ebce7" /> |

There also was a point in time where I introduced sequential tests such that all spawned processes still run after each other, but so far I didn't run into issues keeping them concurrent, so I dropped that code.

Some small changes I made to make things more reliable (see the sketch after this description):

1. When relying on stdout/stderr messages, we split lines on `\n` and strip all the ANSI escapes, which allows us to not worry about special ANSI characters when finding the URL or a specific message to wait for.
2. Once a test is done, we `child.kill()` the spawned process. If that doesn't work, for whatever reason, we run `child.kill('SIGKILL')` to force-kill the process. This could technically lead to some memory or files not being cleaned up properly, but once CI is done everything is thrown away anyway.
3. As you can see in the screenshots, I used some nicer names for the workflows.

| `next` | `feat/improve-integration-tests` |
| --- | --- |
| <img width="276" alt="image" src="https://github.com/user-attachments/assets/e574bb53-e21b-4619-9cdb-515431b255b9" /> | <img width="179" alt="image" src="https://github.com/user-attachments/assets/8bc75119-fb91-4500-a1d0-bd09f74c93ad" /> |

They also look a bit nicer in the PR overview as well:

<img width="929" alt="image" src="https://github.com/user-attachments/assets/04fc71fc-74b0-4e7c-9047-2aada664efef" />

The very last commit just filters out Windows and macOS tests again for PRs (but they are still executed on the `next` branch).

---

### Next steps

I think for now we are in a pretty good state, but there are some things we can do to further improve everything (mainly make things faster) that aren't strictly necessary. I also ran into issues while trying them, so there is more work to do.

1. More splits — instead of having a Vite folder and PostCSS folder, we can go a step further and have folders for Next.js, Astro, Nuxt, Remix, …
2. Caching — right now we have to run the build step for every OS on every "job". We can re-use the work here by introducing a setup job that the other jobs rely on. @thecrypticace and I tried it already, but were running into some Bun-specific Standalone CLI issues when doing that.
3. Remote caching — we could re-enable remote caching such that the `build` step can be full turbo (e.g. after a PR is merged into `next` and we run everything again) |
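As a rough TypeScript sketch of the approach described above (wait for a tool-specific ready message on stdout/stderr, strip ANSI escapes, and force-kill if a plain kill fails); the helper names `waitForReady` and `stop` are made up for illustration and are not the actual test utilities:

```ts
import { spawn, type ChildProcess } from 'node:child_process'

// Strip ANSI escape sequences so message and URL matching isn't confused by colored output.
const ANSI = /\u001b\[[0-9;]*m/g
const stripAnsi = (line: string) => line.replace(ANSI, '')

// Resolve once the process prints a line matching `ready`, e.g. /Ready in/ for
// Next.js or /watching for file changes/ for Astro (hardcoded per test).
function waitForReady(child: ChildProcess, ready: RegExp): Promise<string> {
  return new Promise((resolve) => {
    function onData(chunk: Buffer) {
      for (let line of chunk.toString().split('\n')) {
        let clean = stripAnsi(line)
        if (ready.test(clean)) resolve(clean)
      }
    }
    child.stdout?.on('data', onData)
    child.stderr?.on('data', onData)
  })
}

// Kill the spawned process when the test is done; fall back to SIGKILL if the
// default signal could not be delivered.
function stop(child: ChildProcess) {
  if (!child.kill()) child.kill('SIGKILL')
}

// Example: spawn a dev server and wait for its ready line, which for most tools
// also contains the URL (and therefore the port) the server actually picked.
async function example() {
  let child = spawn('pnpm', ['vite', 'dev'], { stdio: 'pipe' })
  let line = await waitForReady(child, /http:\/\/localhost:\d+/)
  console.log('dev server ready:', line)
  stop(child)
}
```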
||
|
|
27912f9bb5
|
Add integration test setup and tests for the Vite integration (#14089)
This PR adds a new root `/integrations` folder that will be the home of
integration tests. The idea of these tests is to use Tailwind in various
setups just like our users would (by only using the publishable npm
builds).
To avoid issues with concurrent tests making changes to the file system,
to make it very easy to test through a range of versions, and to avoid
changing configuration objects over and over in test runs, we decided to
inline the scaffolding completely into the test file and have no
examples checked into the repo.
Here's an example of what this can look like for a simple Vite test:
```ts
test('works with production builds', {
    fs: {
      'package.json': json`
        {
          "type": "module",
          "dependencies": {
            "@tailwindcss/vite": "workspace:^",
            "tailwindcss": "workspace:^"
          },
          "devDependencies": {
            "vite": "^5.3.5"
          }
        }
      `,
      'vite.config.ts': ts`
        import tailwindcss from '@tailwindcss/vite'
        import { defineConfig } from 'vite'

        export default defineConfig({
          build: { cssMinify: false },
          plugins: [tailwindcss()],
        })
      `,
      'index.html': html`
        <head>
          <link rel="stylesheet" href="./src/index.css">
        </head>
        <body>
          <div class="underline m-2">Hello, world!</div>
        </body>
      `,
      'src/index.css': css`
        @import 'tailwindcss/theme' reference;
        @import 'tailwindcss/utilities';
      `,
    },
  },
  async ({ fs, exec }) => {
    await exec('pnpm vite build')

    expect.assertions(2)

    for (let [path, content] of await fs.glob('dist/**/*.css')) {
      expect(path).toMatch(/\.css$/)
      expect(stripTailwindComment(content)).toMatchInlineSnapshot(
        `
        ".m-2 {
          margin: var(--spacing-2, .5rem);
        }
        .underline {
          text-decoration-line: underline;
        }"
        `,
      )
    }
  },
)
```
By defining all dependencies this way, we never have to worry about
which fixtures are checked in and can more easily describe changes to
the setup.
For ergonomics, we've also added the [`embed` prettier
plugin](https://github.com/Sec-ant/prettier-plugin-embed). This will
mean that files inlined in the `fs` setup are properly indented. No
extra work needed!
If you're using VS Code, I can also recommend the [Language
Literals](https://marketplace.visualstudio.com/items?itemName=sissel.language-literals)
extension so that syntax highlighting also _just works_.
A neat feature of inlining the scaffolding like this is that it makes it
very simple to test across a variety of versions. For example, here's
how we can set up a test against Vite 5 and Vite 4:
```js
;['^4.5.3', '^5.3.5'].forEach(viteVersion => {
  test(`works with production builds for Vite ${viteVersion}`, {
      fs: {
        'package.json': json`
          {
            "type": "module",
            "devDependencies": {
              "vite": "${viteVersion}"
            }
          }
        `,
      },
    },
    async () => {
      // Do something
    },
  )
})
```
## Philosophy
Before we dive into the specifics, I want to clearly state the design
considerations we have chosen for this new test suite:
- All file mutations should be done in temp folders; nothing should ever
mess with your working directory
- Windows as a first-class citizen
- Have a clean and simple API that describes the test setup only using
public APIs
- Focus on reliability (make sure cleanup scripts work and are tolerant
of various error scenarios)
- If a user reports an issue with a specific configuration, we want to
be able to reproduce it with integration tests, no matter how obscure
the setup (this means the tests need to be in control of most of the
variables)
- Tests should be reasonably fast (obviously this depends on the
integration. If we use a slow build tool, we can't magically speed it
up, but our overhead should be minimal).
## How it works
The current implementation provides a custom `test` helper function
that, when used, sets up the environment according to the configuration.
It'll create a new temporary directory and create all files, ensuring
things like proper `\r\n` line endings on Windows.
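A minimal sketch of that fixture-creation step, purely for illustration; the real helper lives in the integration test setup and the name `createFixture` is an assumption:

```ts
import { mkdtemp, mkdir, writeFile } from 'node:fs/promises'
import { tmpdir } from 'node:os'
import path from 'node:path'

// Write the inline `fs` fixture into a fresh temp directory, using CRLF line
// endings on Windows so the tests exercise the same input users would have.
async function createFixture(files: Record<string, string>): Promise<string> {
  let root = await mkdtemp(path.join(tmpdir(), 'tailwind-integration-'))
  for (let [name, content] of Object.entries(files)) {
    let file = path.join(root, name)
    await mkdir(path.dirname(file), { recursive: true })
    let data = process.platform === 'win32' ? content.replace(/\n/g, '\r\n') : content
    await writeFile(file, data, 'utf8')
  }
  return root
}
```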
We do have to patch the `package.json` specifically, since we cannot
use public versions of the tailwindcss packages: we want to be able to
test against a development build. To make this happen, every `pnpm
build` run now creates tarballs of the npm modules (that contain only
the files that would also be in the published build). We then patch the
`package.json` to rewrite `workspace:^` versions to link to those
tarballs. We found this to work reliably on Windows and macOS as well as
being fast enough to not cause any issues. Furthermore, we also decided
to use `pnpm` as the package manager for integration tests because of
its global module cache (so installing `vite` is fast once it has been
installed once).
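As a sketch of that `package.json` patching step; the helper name `patchWorkspaceDeps` and the tarball naming are assumptions for illustration, not the actual build output:

```ts
import { readFile, writeFile } from 'node:fs/promises'
import path from 'node:path'

// Rewrite `workspace:^` dependency ranges to the tarballs produced by
// `pnpm build`, so the integration project installs the development build
// exactly like a published package.
async function patchWorkspaceDeps(pkgPath: string, tarballDir: string) {
  let pkg = JSON.parse(await readFile(pkgPath, 'utf8'))
  for (let field of ['dependencies', 'devDependencies'] as const) {
    for (let [name, range] of Object.entries<string>(pkg[field] ?? {})) {
      if (range.startsWith('workspace:')) {
        // e.g. "@tailwindcss/vite" -> "tailwindcss-vite.tgz" (assumed naming)
        let tarball = name.replace(/^@/, '').replace('/', '-') + '.tgz'
        pkg[field][name] = `file:${path.join(tarballDir, tarball)}`
      }
    }
  }
  await writeFile(pkgPath, JSON.stringify(pkg, null, 2))
}
```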
The test function will receive a few utilities that it can use to more
easily interact with the temp dir. Examples include a `fs.glob` function
that you can use to find files in eventual `dist/` directories, and
helpers around `spawn` and `exec` that make sure processes are cleaned
up correctly.
Because we use tarballs from our build dependencies, working on changes
requires a workflow where you run `pnpm build` before running `pnpm
test:integrations`. However, it also means we can run clients like our
CLI client with no additional overhead—just install the dependency like
any user would and set up your test cases this way.
## Test plan
This PR also includes two Vite specific integration tests: One testing a
static build (`pnpm vite build`) and one a dev mode build (`pnpm vite
dev`) that also makes changes to the file system and asserts that the
resources properly update.
---------
Co-authored-by: Robin Malfait <malfait.robin@gmail.com>
|
||
|
|
32cf8aa0fb
|
remove v3 codebase | ||
|
|
516ba530f0
|
Setup integration tests (#5466)
* setup integration tests
* fix rgb color syntax
* ensure integration tests always exit. If for any reason the integration tests fail, they will run forever on CI (~2 hours or so). The `--forceExit` flag is not ideal, but it prevents long-running processes.
* fix incorrect test. We were never properly waiting for the command to finish.
* handle AbortError properly. In CI, when an AbortController gets aborted, an error is thrown (AbortError). If we don't catch it properly, it will "leak" and the test will fail.
* improve IO functions
* quit integration tests after 10 seconds
* only test a few integrations
* test all integrations using a matrix. This will cancel other builds when one fails, and it also separates the output per integration, which is useful especially now that we are still figuring things out.
* rename `build` to `test`
* add a `--verbose` flag to receive output in the console
* when reading stdout or stderr, wait a certain amount of time to ensure stability. Debouncing for 200 ms means that if another message comes in within those 200 ms, we delay the execution of the callback.
* simplify workflow
* use terminal output instead of disk events
* cache node_modules for integrations
* empty commit, to test cache hits |
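A tiny sketch of the 200 ms debounce mentioned above; the helper name is illustrative and not the actual implementation:

```ts
// Delay the callback until stdout/stderr has been quiet for 200 ms; every new
// chunk within that window resets the timer, so we only react once output settles.
function debounce<T extends unknown[]>(fn: (...args: T) => void, wait = 200) {
  let timer: ReturnType<typeof setTimeout> | undefined
  return (...args: T) => {
    if (timer) clearTimeout(timer)
    timer = setTimeout(() => fn(...args), wait)
  }
}
```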
||
|
|
7565099c1f
|
Integrations setup (#4354)
* add integration test tools * setup jest in the integrations folder * add `test:integrations` script The default `npm test` script will ignore all the tests in the `integrations` folder. * add integration tests with `webpack 5` * add integration tests with `postcss-cli` * add `npm run install:integrations` script This script will run `npm install` in every integration, and in the integrations folder itself (to setup Jest for example). * add `toIncludeCss` custom matcher * increate Jest timeout for integration tests * add integration tests with `vite` * add integration tests with `webpack 4` * add isomorphic fetch * add the ability to wait for specific stdout/stderr output * write vite tests, assert using API calls We will wait for the correct stdout/stderr output, once we know that we can request the fresh css, we will fetch it and make assertions accordingly. Port is currently hardcoded, maybe we should use a packaage to ensure that we use a free port. * add integration tests with `rollup` * add integration tests with `parcel` * run all integration tests in band * add .gitignore in integrations folder |