Fixes #4495
Here's my approach for replacing the XHR on-demand-entries pinger (#1364, #4495). I'm not sure if this is the way everyone wants to accomplish this, since I saw mention of using a separate server and port for the dynamic entries WebSocket, but I thought this would be a fairly clean solution since it doesn't need that.
With this method, the only change when using a custom server is that you have to listen for the `upgrade` event and pass it to `next.getRequestHandler()`. Example:
```js
const server = app.listen(port)
const handleRequest = next.getRequestHandler()

if (dev) {
  server.on('upgrade', handleRequest)
}
```
# Fixes https://github.com/zeit/next.js/issues/5674
This adds a config option:
```js
// next.config.js
module.exports = {
  crossOrigin: 'anonymous'
}
```
This config option is defined via webpack's `DefinePlugin` at build time.
`Head` and `NextScript` now use the config option if it's not explicitly set on the element.
This value is now passed to webpack so it can add it to the scripts that it loads.
The value is now used in `PageLoader` (on the client) so it can add it to the scripts and links that it loads.
Using `<Head crossOrigin>` or `<NextScript crossOrigin>` is now deprecated.
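For illustration, a minimal custom `_document.js` sketch, assuming the `anonymous` value from the config above (this is not code from the PR; the deprecated explicit props are shown as comments):

```jsx
// pages/_document.js (sketch): with `crossOrigin` set in next.config.js,
// the explicit props shown in the comments are no longer needed and are deprecated.
import Document, { Head, Main, NextScript } from 'next/document'

export default class MyDocument extends Document {
  render() {
    return (
      <html>
        {/* Deprecated: <Head crossOrigin="anonymous"> */}
        <Head />
        <body>
          <Main />
          {/* Deprecated: <NextScript crossOrigin="anonymous" /> */}
          <NextScript />
        </body>
      </html>
    )
  }
}
```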
* Convert render.js to typescript
* Compile tsx files too
* Remove internal renderErrorToHTML function
* Interop component result
* requirePage doesn't need async
* Move enhancing logic out into its own function
* Remove buildManifest from renderPage
* Move render into its own function
* Change let to const
* Move renderDocument into its own function
This PR will
- allow `next export` to use all available CPU cores for rendering & writing pages by using `child_process`
- make use of `async-sema` to allow each child process to concurrently write multiple paths
- show a fancy progress bar while processing pages (with a non-TTY fallback for CI web consoles)
On my MacBook with 4 CPU cores, throughput went from ~25 pages per second to ~75 pages per second. Beefy CI machines with lots of cores should benefit even more.
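A minimal sketch of that concurrency model, assuming hypothetical `renderAndWritePage` and `tickProgressBar` helpers and a made-up message shape (this is not the PR's actual code):

```js
// Parent side (sketch): fork one worker per CPU core and split the paths between them.
const { fork } = require('child_process')
const { cpus } = require('os')

function exportInParallel(outDir, allPaths) {
  const workers = cpus().length
  const chunkSize = Math.ceil(allPaths.length / workers)
  for (let i = 0; i < workers; i++) {
    const child = fork(require.resolve('./export-worker'))
    child.send({ outDir, paths: allPaths.slice(i * chunkSize, (i + 1) * chunkSize) })
    child.on('message', msg => {
      if (msg.type === 'progress') tickProgressBar() // hypothetical progress-bar hook
    })
  }
}

// export-worker.js (sketch): each child renders several paths concurrently,
// throttled by async-sema so a bounded number of writes are in flight at once.
const { Sema } = require('async-sema')

process.on('message', async ({ outDir, paths }) => {
  const sema = new Sema(10) // at most 10 paths in flight per child
  await Promise.all(
    paths.map(async p => {
      await sema.acquire()
      try {
        await renderAndWritePage(outDir, p) // hypothetical: render the page HTML and write it to disk
        process.send({ type: 'progress' })
      } finally {
        sema.release()
      }
    })
  )
  process.exit(0)
})
```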
This PR fixes #4920
So the problem is that when a Next.js application is built on Windows, the `pages-manifest.json` file is created with backslashes. If this built application is then deployed to a Linux hosting environment, the server will fail when trying to load the modules:
```
Error: Cannot find module '/user_code/next/server/bundles\pages\index.js'
```
My simple solution is to modify `pages-manifest.json` so it always uses the Linux separator (`/`), and to also
modify `server/require.js` so that, when requiring a page, it replaces any separator (`\` or `/`) with the current platform-specific file separator (`require('path').sep`).
The fix in `server/require.js` alone would be sufficient, but in my opinion having some cross-platform consistency in the manifest is nice.
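A rough sketch of the `server/require.js` part (the function names and wiring here are assumptions, not the actual patch):

```js
// server/require.js (sketch): normalize whatever separator the manifest was
// built with to the current platform's separator before requiring the page module.
const { sep } = require('path')

function normalizePagePath(manifestPath) {
  return manifestPath.replace(/[/\\]/g, sep)
}

function requirePage(manifestPath) {
  return require(normalizePagePath(manifestPath))
}
```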
This change was tested by building an application on Windows and running it on Linux and Windows, as well as building an application on Linux and running it on Linux and Windows. The related tests were also run.
* Move send-html function and rewrite in typescript
* Move getPageFiles and convert to ts (#5841)
* Fix unit tests
We don't have to check if the file already exists here, since this code only runs in production mode (dev overrides the `readBuildId` method to always return `development`). If reading the file throws, we check whether the file exists; if it doesn't, we throw a helpful error, otherwise we re-throw the original error.
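Roughly, the resulting flow (a sketch assuming the `BUILD_ID` file in the dist directory; not the exact implementation or error message):

```js
// Sketch: read BUILD_ID and only inspect the filesystem further when the read fails.
const fs = require('fs')
const path = require('path')

function readBuildId(distDir) {
  const buildIdPath = path.join(distDir, 'BUILD_ID')
  try {
    return fs.readFileSync(buildIdPath, 'utf8').trim()
  } catch (err) {
    if (!fs.existsSync(buildIdPath)) {
      // The file really is missing: the app probably hasn't been built yet.
      throw new Error(`Could not find a valid build in "${distDir}". Did you run \`next build\`?`)
    }
    // The file exists but couldn't be read: surface the original error.
    throw err
  }
}
```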
Fixes #3705, fixes #4656
- No longer automatically dedupe certain tags; only dedupe the ones we know are *never* going to be duplicated, like `charSet`, `title`, etc.
- Fix `key=""` behavior, making sure that if a unique key is provided, tags are deduped based on it.
For example:
```jsx
<meta property='fb:pages' content='one' />
<meta property='fb:pages' content='two' />
```
Would currently cause
```jsx
<meta property='fb:pages' content='two' />
```
### After this change:
```jsx
<meta property='fb:pages' content='one' />
<meta property='fb:pages' content='two' />
```
Then if you use next/head multiple times and want to be able to override a tag:
```jsx
<meta property='fb:pages' content='one' key="not-unique-key" />
<meta property='fb:pages' content='two' key="not-unique-key" />
```
Would cause:
```jsx
<meta property='fb:pages' content='two' />
```
This is because `key` gets deduped correctly after this PR, similar to how React itself handles keys.
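For instance, a page-level sketch of that override pattern (the component and key name are illustrative, not from the PR):

```jsx
// Sketch: both <Head> usages share the same key, so the later one wins and
// only content='two' ends up in the rendered <head>.
import Head from 'next/head'

export default () => (
  <div>
    <Head>
      <meta property='fb:pages' content='one' key='fb-pages' />
    </Head>
    <Head>
      <meta property='fb:pages' content='two' key='fb-pages' />
    </Head>
  </div>
)
```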
- Replaces taskr-babel with taskr-typescript for the `next` package
- Makes sure Node 8+ is used, no unneeded transpilation
- Compile Next.js client-side files through Babel the same way pages are
- Compile Next.js client-side files to ES modules, not CommonJS, so that tree shaking works.
- Move error-debug.js out of next-server as it's only used/required in development
- Drop ansi-html as a dependency from next-server
- Make next/link an ES module (for tree-shaking)
- Make next/router an ES module (for tree-shaking)
- Add TypeScript compilation to next-server
- Remove last remains of Flow
- Move hoist-non-react-statics to next, out of next-server
- Move htmlescape to next, out of next-server
- Remove runtime-corejs2 from next-server
* Remove flow-typed
* Remove flow types
* Remove the last types
* Bring back taskr dependency
* Revert "Bring back taskr dependency"
This reverts commit 38cb95d7274d63fe63c6ac3c95ca358a28c17895.
* Bring back preset-flow as it’s used for tests
* Revert "Revert "Bring back taskr dependency""
This reverts commit b4c933ef133f4039f544fb10bf31d5c95d3b27a2.
Extracts the logic that determines whether a page is blocked into utils.
If that refactor makes sense, I will create a follow-up PR to cover both of the functions inside utils with tests.
* Add node_modules bundling under the `--lambdas` flag for next build
* Run minifier when lambdas mode is enabled
* Add lambdas option to next.config.js (see the sketch after this list)
* Add test for lambdas option
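A minimal sketch of that next.config.js option, assuming it is a simple boolean toggle (the changelog above doesn't spell out its exact shape):

```js
// next.config.js (sketch): opt in to the lambdas build mode added above
module.exports = {
  lambdas: true
}
```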
Takes advantage of caching between builds for Terser, and also makes writing babel-loader caches faster by disabling cache compression.
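In webpack terms, the relevant knobs look roughly like this (a sketch using public terser-webpack-plugin and babel-loader options, not the PR's exact wiring inside Next.js):

```js
// Sketch: persistent Terser cache plus uncompressed babel-loader cache entries.
const TerserPlugin = require('terser-webpack-plugin')

module.exports = {
  optimization: {
    minimizer: [
      new TerserPlugin({
        cache: true, // reuse minification results between builds
        parallel: true // minify with multiple worker processes when cores are available
      })
    ]
  },
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        use: {
          loader: 'babel-loader',
          options: {
            cacheDirectory: true,
            cacheCompression: false // writing the cache is faster when entries aren't gzipped
          }
        }
      }
    ]
  }
}
```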
Results for zeit.co (350 pages):
Without cache:
[4:16:22 PM] Compiled server in 1m
[4:16:57 PM] Compiled client in 2m
✨ Done in 125.83s.
With cache:
[4:19:38 PM] Compiled client in 17s
[4:19:50 PM] Compiled server in 29s
✨ Done in 31.79s.
Note: these results are from my multi-core MacBook Pro 2017, exact specs:
MacBook Pro (13-inch, 2017, Four Thunderbolt 3 Ports)
- 3.3 GHz Intel Core i5
- 16 GB 2133 MHz LPDDR3
- Intel Iris Plus Graphics 650 1536 MB
The `without cache` build runs uglify in parallel, so the uncached build is likely to take longer in environments where only 1 core is available.
The `with cache` build, however, runs in a single thread, so its results should be similar.