**This does not change existing behavior.** Building for the serverless target is completely opt-in.
- Implements `target: 'serverless'` in `next.config.js`
- Removes `next build --lambdas` (was only available on next@canary so far)
This implements the concept of build targets. Currently there are two build targets:
- server (the target that already existed / the default; no changes here)
- serverless (new target aimed at compiling pages to serverless handlers; see the config sketch below)
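For example, opting in is a single change in `next.config.js` (a minimal sketch; any other options you already have can live alongside it):
```js
// next.config.js
module.exports = {
  // Compile each page in `pages/` to a standalone serverless handler
  target: 'serverless'
}
```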
The serverless target will output a single file per `page` in the `pages` directory:
- `pages/index.js` => `.next/serverless/index.js`
- `pages/about.js` => `.next/serverless/about.js`
So what is inside `.next/serverless/about.js`? All the code needed to render that specific page. It has the Node.js `http.Server` request handler function signature:
```ts
(req: http.IncomingMessage, res: http.ServerResponse) => void
```
So how do you use it? Generally you **don't** want to use the example below, but for illustration purposes here is how the handler can be called from a plain `http.Server`:
```js
const http = require('http')
// Note that `.default` is needed because the compiled page is an ES module
const handler = require('./.next/serverless/about.js').default
const server = new http.Server((req, res) => handler(req, res))
server.listen(3000, () => console.log('Listening on http://localhost:3000'))
```
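Since every page compiles to its own file, a deployment platform (or a quick local test) can route incoming paths to the matching handler. A minimal sketch, assuming only the two compiled pages listed above exist:
```js
const http = require('http')

// Map URL paths to the compiled per-page handlers
// (only index and about exist in this sketch)
const routes = {
  '/': require('./.next/serverless/index.js').default,
  '/about': require('./.next/serverless/about.js').default
}

const server = new http.Server((req, res) => {
  // Strip the query string before looking up the route
  const pathname = req.url.split('?')[0]
  const handler = routes[pathname]
  if (handler) return handler(req, res)
  res.statusCode = 404
  res.end('Not Found')
})

server.listen(3000, () => console.log('Listening on http://localhost:3000'))
```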
Generally you'll upload this handler function to an external service like [Now v2](https://zeit.co/now-2); the `@now/next` builder will be updated to reflect these changes. This means it will no longer be necessary for `@now/next` to do guesswork in creating smaller handler functions, as Next.js will output the smallest possible serverless handler function automatically.
The function has zero dependencies, so no node_modules are required to run it, and it is generally very small: 45 KB zipped is the baseline, but I'm sure we can make it even smaller in the future.
One important thing to note is that the function won't try to load `next.config.js`, so `publicRuntimeConfig` / `serverRuntimeConfig` are not supported. Reasons are outlined here: #5846
So to summarize:
- every page becomes a serverless function
- the serverless function has 0 dependencies (they're all inlined)
- "just" uses the `req` and `res` coming from Node.js
- opt-in using `target: 'serverless'` in `next.config.js`
- does not load `next.config.js` when executing the function
TODO:
- [x] Compile next/dynamic / `import()` into the function file, so that no extra files have to be uploaded (see the illustration after this list)
- [x] Setting `assetPrefix` at build time for serverless target
- [x] Support custom /_app
- [x] Support custom /_document
- [x] Support custom /_error
- [x] Add `next.config.js` property for `target`
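For reference, `next/dynamic` refers to dynamically imported components like the one below (an illustrative page, not part of this PR); such imports are now bundled into the single per-page handler file rather than emitted as extra files that would need to be uploaded alongside it:
```js
// pages/about.js (illustrative)
import React from 'react'
import dynamic from 'next/dynamic'

// This dynamically imported component gets compiled into
// .next/serverless/about.js instead of requiring a separate file to upload
const Hello = dynamic(() => import('../components/hello'))

export default () => <Hello />
```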
Needs discussion:
- [ ] Since the serverless target won't support `publicRuntimeConfig` / `serverRuntimeConfig` (they're runtime values), I think we should support build-time env var replacement with `webpack.DefinePlugin` or similar (see the sketch after this list).
- [ ] Serving static files with the correct cache-control, as there is no static file serving in the serverless target
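For context, build-time replacement with `webpack.DefinePlugin` could look roughly like the following (a sketch of the general technique only; which env vars get inlined and how they are configured is exactly what needs discussion, and `API_URL` is a hypothetical variable name):
```js
const webpack = require('webpack')

// Inline selected environment variables at build time so the serverless
// handler doesn't need runtime config; the values are baked into the bundle.
new webpack.DefinePlugin({
  'process.env.API_URL': JSON.stringify(process.env.API_URL)
})
```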
* Remove flow-typed
* Remove flow types
* Remove the last types
* Bring back taskr dependency
* Revert "Bring back taskr dependency"
This reverts commit 38cb95d7274d63fe63c6ac3c95ca358a28c17895.
* Bring back preset-flow as it’s used for tests
* Revert "Revert "Bring back taskr dependency""
This reverts commit b4c933ef133f4039f544fb10bf31d5c95d3b27a2.
Takes advantage of caching between builds for Terser, and makes writing caches for babel-loader faster by disabling cache compression.
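Roughly, the relevant options look like this (a sketch of the general webpack technique, not the exact config Next.js generates; the option names are the standard ones from `babel-loader` and `terser-webpack-plugin`):
```js
const TerserPlugin = require('terser-webpack-plugin')

module.exports = {
  module: {
    rules: [
      {
        test: /\.js$/,
        use: {
          loader: 'babel-loader',
          options: {
            cacheDirectory: true,   // persist transpile results between builds
            cacheCompression: false // skip gzipping cache entries, so writes are faster
          }
        }
      }
    ]
  },
  optimization: {
    minimizer: [
      new TerserPlugin({
        cache: true,   // reuse minification results between builds
        parallel: true // minify across multiple cores on the uncached build
      })
    ]
  }
}
```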
Results for zeit.co (350 pages):

Without cache:

```
[4:16:22 PM] Compiled server in 1m
[4:16:57 PM] Compiled client in 2m
✨ Done in 125.83s.
```

With cache:

```
[4:19:38 PM] Compiled client in 17s
[4:19:50 PM] Compiled server in 29s
✨ Done in 31.79s.
```
Note: these results are from my multi-core MacBook Pro (2017); exact specs:
MacBook Pro (13-inch, 2017, Four Thunderbolt 3 Ports)
- 3.3 GHz Intel Core i5
- 16 GB 2133 MHz LPDDR3
- Intel Iris Plus Graphics 650 1536 MB
The `without cache` build runs uglify in parallel, so it is likely to take longer in environments where only one core is available.
The `with cache` build, however, runs in a single thread, so those results should be similar across environments.