Dev (#12)
* health check
* Update Dockerfile
* simplifying the deployment
* Update Bot.js: makes the find team command public
* test (#9)
* Dev (#7)
* health check
* Update Dockerfile
* simplifying the deployment
* Dev (#8)
* health check
* Update Dockerfile
* simplifying the deployment
* Update Bot.js: makes the find team command public
* Update PlayerService.js
* massive update???? could break stuff
* Update Bot.js: update
1
node_modules/undici/README.md
generated
vendored
@@ -84,6 +84,7 @@ The `body` mixins are the most common way to format the request/response body. M

- [`.arrayBuffer()`](https://fetch.spec.whatwg.org/#dom-body-arraybuffer)
- [`.blob()`](https://fetch.spec.whatwg.org/#dom-body-blob)
- [`.bytes()`](https://fetch.spec.whatwg.org/#dom-body-bytes)
- [`.json()`](https://fetch.spec.whatwg.org/#dom-body-json)
- [`.text()`](https://fetch.spec.whatwg.org/#dom-body-text)

260
node_modules/undici/docs/docs/api/Dispatcher.md
generated
vendored
@@ -488,11 +488,13 @@ The `RequestOptions.method` property should not be value `'CONNECT'`.

`body` contains the following additional [body mixin](https://fetch.spec.whatwg.org/#body-mixin) methods and properties:

- `text()`
- `json()`
- `arrayBuffer()`
- `body`
- `bodyUsed`
* [`.arrayBuffer()`](https://fetch.spec.whatwg.org/#dom-body-arraybuffer)
* [`.blob()`](https://fetch.spec.whatwg.org/#dom-body-blob)
* [`.bytes()`](https://fetch.spec.whatwg.org/#dom-body-bytes)
* [`.json()`](https://fetch.spec.whatwg.org/#dom-body-json)
* [`.text()`](https://fetch.spec.whatwg.org/#dom-body-text)
* `body`
* `bodyUsed`

`body` can not be consumed twice. For example, calling `text()` after `json()` throws `TypeError`.

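A minimal sketch of the single-consumption rule described above; the endpoint URL is illustrative and not part of the diff:

```js
const { request } = require("undici");

const { body } = await request("http://localhost:3030/resource"); // illustrative URL

const data = await body.json(); // first consumption works
await body.text();              // throws TypeError: the body was already consumed
```
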
@@ -984,6 +986,254 @@ client.dispatch(
);
```

##### `dns`

The `dns` interceptor enables you to cache DNS lookups for a given duration, per origin.

> It is well suited for scenarios where you want to cache DNS lookups to avoid the overhead of resolving the same domain multiple times.

**Options**

- `maxTTL` - The maximum time-to-live (in milliseconds) of the DNS cache. It should be a positive integer. Default: `10000`.
  - Set `0` to disable TTL.
- `maxItems` - The maximum number of items to cache. It should be a positive integer. Default: `Infinity`.
- `dualStack` - Whether to resolve both IPv4 and IPv6 addresses. Default: `true`.
  - It will also attempt a happy-eyeballs-like approach to connect to the available addresses in case of a connection failure.
- `affinity` - Whether to use IPv4 or IPv6 addresses. Default: `4`.
  - It can be either `4` or `6`.
  - It will only take effect if `dualStack` is `false`.
- `lookup: (hostname: string, options: LookupOptions, callback: (err: NodeJS.ErrnoException | null, addresses: DNSInterceptorRecord[]) => void) => void` - Custom lookup function. Default: `dns.lookup`.
  - For more info see [dns.lookup](https://nodejs.org/api/dns.html#dns_dns_lookup_hostname_options_callback).
- `pick: (origin: URL, records: DNSInterceptorRecords, affinity: 4 | 6) => DNSInterceptorRecord` - Custom pick function. Default: `RoundRobin`.
  - The function should return a single record from the records array.
  - By default a simplified version of Round Robin is used.
  - The `records` property can be mutated to store the state of the balancing algorithm.

> The `Dispatcher#options` also gets extended with the options `dns.affinity`, `dns.dualStack`, `dns.lookup` and `dns.pick`, which can be used to configure the interceptor on a per-request basis.

**DNSInterceptorRecord**

It represents a DNS record.

- `family` - (`number`) The IP family of the address. It can be either `4` or `6`.
- `address` - (`string`) The IP address.

**DNSInterceptorOriginRecords**

It represents a map of DNS IP address records for a single origin.

- `4.ips` - (`DNSInterceptorRecord[] | null`) The IPv4 addresses.
- `6.ips` - (`DNSInterceptorRecord[] | null`) The IPv6 addresses.

**Example - Basic DNS Interceptor**

```js
const { Agent, interceptors } = require("undici");
const { dns } = interceptors;

const client = new Agent().compose([
  dns({ ...opts })
])

const response = await client.request({
  origin: `http://localhost:3030`,
  ...requestOpts
})
```

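Building on the documented `pick` signature, the sketch below shows a custom balancing function. The IPv6 preference and the option values are illustrative assumptions, not part of the library documentation; the `records` argument is assumed to follow the `DNSInterceptorOriginRecords` shape described above.

```js
const { Agent, interceptors } = require("undici");
const { dns } = interceptors;

// Assumed shape per the DNSInterceptorOriginRecords description above:
// records[4].ips / records[6].ips hold DNSInterceptorRecord entries.
function preferIPv6 (origin, records, affinity) {
  const v6 = records[6]?.ips;
  if (v6 && v6.length > 0) {
    return v6[0];
  }
  return records[4].ips[0]; // fall back to the first IPv4 record
}

const agent = new Agent().compose([
  dns({
    maxTTL: 60_000,   // cache lookups for up to a minute (illustrative)
    dualStack: true,
    pick: preferIPv6
  })
]);
```
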
##### `Response Error Interceptor`

**Introduction**

The Response Error Interceptor is designed to handle HTTP response errors efficiently. It intercepts responses and throws detailed errors for responses with status codes indicating failure (4xx, 5xx). This interceptor enhances error handling by providing structured error information, including response headers, data, and status codes.

**ResponseError Class**

The `ResponseError` class extends the `UndiciError` class and encapsulates detailed error information. It captures the response status code, headers, and data, providing a structured way to handle errors.

**Definition**

```js
class ResponseError extends UndiciError {
  constructor (message, code, { headers, data }) {
    super(message);
    this.name = 'ResponseError';
    this.message = message || 'Response error';
    this.code = 'UND_ERR_RESPONSE';
    this.statusCode = code;
    this.data = data;
    this.headers = headers;
  }
}
```

**Interceptor Handler**

The interceptor's handler class extends `DecoratorHandler` and overrides methods to capture response details and handle errors based on the response status code.

**Methods**

- **onConnect**: Initializes response properties.
- **onHeaders**: Captures headers and status code. Decodes body if content type is `application/json` or `text/plain`.
- **onData**: Appends chunks to the body if status code indicates an error.
- **onComplete**: Finalizes error handling, constructs a `ResponseError`, and invokes the `onError` method.
- **onError**: Propagates errors to the handler.

**Definition**

```js
class Handler extends DecoratorHandler {
  // Private properties
  #handler;
  #statusCode;
  #contentType;
  #decoder;
  #headers;
  #body;

  constructor (opts, { handler }) {
    super(handler);
    this.#handler = handler;
  }

  onConnect (abort) {
    this.#statusCode = 0;
    this.#contentType = null;
    this.#decoder = null;
    this.#headers = null;
    this.#body = '';
    return this.#handler.onConnect(abort);
  }

  onHeaders (statusCode, rawHeaders, resume, statusMessage, headers = parseHeaders(rawHeaders)) {
    this.#statusCode = statusCode;
    this.#headers = headers;
    this.#contentType = headers['content-type'];

    if (this.#statusCode < 400) {
      return this.#handler.onHeaders(statusCode, rawHeaders, resume, statusMessage, headers);
    }

    if (this.#contentType === 'application/json' || this.#contentType === 'text/plain') {
      this.#decoder = new TextDecoder('utf-8');
    }
  }

  onData (chunk) {
    if (this.#statusCode < 400) {
      return this.#handler.onData(chunk);
    }
    this.#body += this.#decoder?.decode(chunk, { stream: true }) ?? '';
  }

  onComplete (rawTrailers) {
    if (this.#statusCode >= 400) {
      this.#body += this.#decoder?.decode(undefined, { stream: false }) ?? '';
      if (this.#contentType === 'application/json') {
        try {
          this.#body = JSON.parse(this.#body);
        } catch {
          // Do nothing...
        }
      }

      let err;
      const stackTraceLimit = Error.stackTraceLimit;
      Error.stackTraceLimit = 0;
      try {
        err = new ResponseError('Response Error', this.#statusCode, this.#headers, this.#body);
      } finally {
        Error.stackTraceLimit = stackTraceLimit;
      }

      this.#handler.onError(err);
    } else {
      this.#handler.onComplete(rawTrailers);
    }
  }

  onError (err) {
    this.#handler.onError(err);
  }
}

module.exports = (dispatch) => (opts, handler) => opts.throwOnError
  ? dispatch(opts, new Handler(opts, { handler }))
  : dispatch(opts, handler);
```

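As a usage sketch: the exported value above already has the `(dispatch) => (opts, handler) => ...` interceptor shape, so it can be composed directly. The module path, endpoint, and option values below are assumptions for illustration only.

```js
const { Agent } = require("undici");

// Assumed path to the module defined above.
const responseError = require("undici/lib/interceptor/response-error");

const agent = new Agent().compose([responseError]);

try {
  // throwOnError opts in to wrapping 4xx/5xx responses in a ResponseError.
  await agent.request({
    origin: "http://localhost:3030", // illustrative endpoint
    path: "/missing",
    method: "GET",
    throwOnError: true
  });
} catch (err) {
  console.error(err.name, err.statusCode, err.message);
}
```
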
**Tests**

Unit tests ensure the interceptor functions correctly, handling both error and non-error responses appropriately.

**Example Tests**

- **No Error if `throwOnError` is False**:

```js
test('should not error if request is not meant to throw error', async (t) => {
  const opts = { throwOnError: false };
  const handler = { onError: () => {}, onData: () => {}, onComplete: () => {} };
  const interceptor = createResponseErrorInterceptor((opts, handler) => handler.onComplete());
  assert.doesNotThrow(() => interceptor(opts, handler));
});
```

- **Error if Status Code is in Specified Error Codes**:

```js
test('should error if request status code is in the specified error codes', async (t) => {
  const opts = { throwOnError: true, statusCodes: [500] };
  const response = { statusCode: 500 };
  let capturedError;
  const handler = {
    onError: (err) => { capturedError = err; },
    onData: () => {},
    onComplete: () => {}
  };

  const interceptor = createResponseErrorInterceptor((opts, handler) => {
    if (opts.throwOnError && opts.statusCodes.includes(response.statusCode)) {
      handler.onError(new Error('Response Error'));
    } else {
      handler.onComplete();
    }
  });

  interceptor({ ...opts, response }, handler);

  await new Promise(resolve => setImmediate(resolve));

  assert(capturedError, 'Expected error to be captured but it was not.');
  assert.strictEqual(capturedError.message, 'Response Error');
  assert.strictEqual(response.statusCode, 500);
});
```

- **No Error if Status Code is Not in Specified Error Codes**:

```js
test('should not error if request status code is not in the specified error codes', async (t) => {
  const opts = { throwOnError: true, statusCodes: [500] };
  const response = { statusCode: 404 };
  const handler = {
    onError: () => {},
    onData: () => {},
    onComplete: () => {}
  };

  const interceptor = createResponseErrorInterceptor((opts, handler) => {
    if (opts.throwOnError && opts.statusCodes.includes(response.statusCode)) {
      handler.onError(new Error('Response Error'));
    } else {
      handler.onComplete();
    }
  });

  assert.doesNotThrow(() => interceptor({ ...opts, response }, handler));
});
```

**Conclusion**

The Response Error Interceptor provides a robust mechanism for handling HTTP response errors by capturing detailed error information and propagating it through a structured `ResponseError` class. This enhancement improves error handling and debugging capabilities in applications using the interceptor.

## Instance Events

### Event: `'connect'`

1
node_modules/undici/docs/docs/api/Fetch.md
generated
vendored
@@ -28,6 +28,7 @@ This API is implemented as per the standard, you can find documentation on [MDN]

- [`.arrayBuffer()`](https://fetch.spec.whatwg.org/#dom-body-arraybuffer)
- [`.blob()`](https://fetch.spec.whatwg.org/#dom-body-blob)
- [`.bytes()`](https://fetch.spec.whatwg.org/#dom-body-bytes)
- [`.formData()`](https://fetch.spec.whatwg.org/#dom-body-formdata)
- [`.json()`](https://fetch.spec.whatwg.org/#dom-body-json)
- [`.text()`](https://fetch.spec.whatwg.org/#dom-body-text)

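A minimal sketch of consuming a body with the newly documented `.bytes()` mixin via undici's `fetch`; the URL is illustrative:

```js
const { fetch } = require("undici");

const res = await fetch("http://localhost:3030/resource"); // illustrative URL
const bytes = await res.bytes(); // Uint8Array of the raw response body
console.log(bytes.byteLength);
```
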
2
node_modules/undici/docs/docs/api/RetryHandler.md
generated
vendored
@@ -19,7 +19,7 @@ Extends: [`Dispatch.DispatchOptions`](Dispatcher.md#parameter-dispatchoptions).

#### `RetryOptions`

- **retry** `(err: Error, context: RetryContext, callback: (err?: Error | null) => void) => void` (optional) - Function to be called after every retry. It should pass error if no more retries should be performed.
- **retry** `(err: Error, context: RetryContext, callback: (err?: Error | null) => void) => number | null` (optional) - Function to be called after every retry. It should pass error if no more retries should be performed.
- **maxRetries** `number` (optional) - Maximum number of retries. Default: `5`
- **maxTimeout** `number` (optional) - Maximum number of milliseconds to wait before retrying. Default: `30000` (30 seconds)
- **minTimeout** `number` (optional) - Minimum number of milliseconds to wait before retrying. Default: `500` (half a second)

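As a hedged sketch, these options can be supplied when composing the `retry` interceptor that undici exports; the specific values are illustrative:

```js
const { Agent, interceptors } = require("undici");

// Values map to the RetryOptions listed above and are illustrative.
const agent = new Agent().compose([
  interceptors.retry({
    maxRetries: 3,
    minTimeout: 500,   // wait at least 500 ms before retrying
    maxTimeout: 10_000 // never wait longer than 10 s between retries
  })
]);
```
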
3
node_modules/undici/index.js
generated
vendored
@@ -41,7 +41,8 @@ module.exports.createRedirectInterceptor = createRedirectInterceptor
module.exports.interceptors = {
  redirect: require('./lib/interceptor/redirect'),
  retry: require('./lib/interceptor/retry'),
  dump: require('./lib/interceptor/dump')
  dump: require('./lib/interceptor/dump'),
  dns: require('./lib/interceptor/dns')
}

module.exports.buildConnector = buildConnector

2
node_modules/undici/lib/api/api-request.js
generated
vendored
@@ -73,7 +73,7 @@ class RequestHandler extends AsyncResource {
    this.removeAbortListener = util.addAbortListener(this.signal, () => {
      this.reason = this.signal.reason ?? new RequestAbortedError()
      if (this.res) {
        util.destroy(this.res, this.reason)
        util.destroy(this.res.on('error', util.nop), this.reason)
      } else if (this.abort) {
        this.abort(this.reason)
      }

4
node_modules/undici/lib/api/api-upgrade.js
generated
vendored
@@ -50,9 +50,9 @@ class UpgradeHandler extends AsyncResource {
  }

  onUpgrade (statusCode, rawHeaders, socket) {
    const { callback, opaque, context } = this
    assert(statusCode === 101)

    assert.strictEqual(statusCode, 101)
    const { callback, opaque, context } = this

    removeSignal(this)

42
node_modules/undici/lib/api/readable.js
generated
vendored
@@ -121,6 +121,11 @@ class BodyReadable extends Readable {
|
||||
return consume(this, 'blob')
|
||||
}
|
||||
|
||||
// https://fetch.spec.whatwg.org/#dom-body-bytes
|
||||
async bytes () {
|
||||
return consume(this, 'bytes')
|
||||
}
|
||||
|
||||
// https://fetch.spec.whatwg.org/#dom-body-arraybuffer
|
||||
async arrayBuffer () {
|
||||
return consume(this, 'arrayBuffer')
|
||||
@@ -306,6 +311,31 @@ function chunksDecode (chunks, length) {
|
||||
return buffer.utf8Slice(start, bufferLength)
|
||||
}
|
||||
|
||||
/**
|
||||
* @param {Buffer[]} chunks
|
||||
* @param {number} length
|
||||
* @returns {Uint8Array}
|
||||
*/
|
||||
function chunksConcat (chunks, length) {
|
||||
if (chunks.length === 0 || length === 0) {
|
||||
return new Uint8Array(0)
|
||||
}
|
||||
if (chunks.length === 1) {
|
||||
// fast-path
|
||||
return new Uint8Array(chunks[0])
|
||||
}
|
||||
const buffer = new Uint8Array(Buffer.allocUnsafeSlow(length).buffer)
|
||||
|
||||
let offset = 0
|
||||
for (let i = 0; i < chunks.length; ++i) {
|
||||
const chunk = chunks[i]
|
||||
buffer.set(chunk, offset)
|
||||
offset += chunk.length
|
||||
}
|
||||
|
||||
return buffer
|
||||
}
|
||||
|
||||
function consumeEnd (consume) {
|
||||
const { type, body, resolve, stream, length } = consume
|
||||
|
||||
@@ -315,17 +345,11 @@ function consumeEnd (consume) {
|
||||
} else if (type === 'json') {
|
||||
resolve(JSON.parse(chunksDecode(body, length)))
|
||||
} else if (type === 'arrayBuffer') {
|
||||
const dst = new Uint8Array(length)
|
||||
|
||||
let pos = 0
|
||||
for (const buf of body) {
|
||||
dst.set(buf, pos)
|
||||
pos += buf.byteLength
|
||||
}
|
||||
|
||||
resolve(dst.buffer)
|
||||
resolve(chunksConcat(body, length).buffer)
|
||||
} else if (type === 'blob') {
|
||||
resolve(new Blob(body, { type: stream[kContentType] }))
|
||||
} else if (type === 'bytes') {
|
||||
resolve(chunksConcat(body, length))
|
||||
}
|
||||
|
||||
consumeFinish(consume)
|
||||
|
||||
109
node_modules/undici/lib/core/connect.js
generated
vendored
@@ -4,6 +4,9 @@ const net = require('node:net')
|
||||
const assert = require('node:assert')
|
||||
const util = require('./util')
|
||||
const { InvalidArgumentError, ConnectTimeoutError } = require('./errors')
|
||||
const timers = require('../util/timers')
|
||||
|
||||
function noop () {}
|
||||
|
||||
let tls // include tls conditionally since it is not always available
|
||||
|
||||
@@ -91,9 +94,11 @@ function buildConnector ({ allowH2, maxCachedSessions, socketPath, timeout, sess
|
||||
servername = servername || options.servername || util.getServerName(host) || null
|
||||
|
||||
const sessionKey = servername || hostname
|
||||
assert(sessionKey)
|
||||
|
||||
const session = customSession || sessionCache.get(sessionKey) || null
|
||||
|
||||
assert(sessionKey)
|
||||
port = port || 443
|
||||
|
||||
socket = tls.connect({
|
||||
highWaterMark: 16384, // TLS in node can't have bigger HWM anyway...
|
||||
@@ -104,7 +109,7 @@ function buildConnector ({ allowH2, maxCachedSessions, socketPath, timeout, sess
|
||||
// TODO(HTTP/2): Add support for h2c
|
||||
ALPNProtocols: allowH2 ? ['http/1.1', 'h2'] : ['http/1.1'],
|
||||
socket: httpSocket, // upgrade socket connection
|
||||
port: port || 443,
|
||||
port,
|
||||
host: hostname
|
||||
})
|
||||
|
||||
@@ -115,11 +120,14 @@ function buildConnector ({ allowH2, maxCachedSessions, socketPath, timeout, sess
|
||||
})
|
||||
} else {
|
||||
assert(!httpSocket, 'httpSocket can only be sent on TLS update')
|
||||
|
||||
port = port || 80
|
||||
|
||||
socket = net.connect({
|
||||
highWaterMark: 64 * 1024, // Same as nodejs fs streams.
|
||||
...options,
|
||||
localAddress,
|
||||
port: port || 80,
|
||||
port,
|
||||
host: hostname
|
||||
})
|
||||
}
|
||||
@@ -130,12 +138,12 @@ function buildConnector ({ allowH2, maxCachedSessions, socketPath, timeout, sess
|
||||
socket.setKeepAlive(true, keepAliveInitialDelay)
|
||||
}
|
||||
|
||||
const cancelTimeout = setupTimeout(() => onConnectTimeout(socket), timeout)
|
||||
const clearConnectTimeout = setupConnectTimeout(new WeakRef(socket), { timeout, hostname, port })
|
||||
|
||||
socket
|
||||
.setNoDelay(true)
|
||||
.once(protocol === 'https:' ? 'secureConnect' : 'connect', function () {
|
||||
cancelTimeout()
|
||||
queueMicrotask(clearConnectTimeout)
|
||||
|
||||
if (callback) {
|
||||
const cb = callback
|
||||
@@ -144,7 +152,7 @@ function buildConnector ({ allowH2, maxCachedSessions, socketPath, timeout, sess
|
||||
}
|
||||
})
|
||||
.on('error', function (err) {
|
||||
cancelTimeout()
|
||||
queueMicrotask(clearConnectTimeout)
|
||||
|
||||
if (callback) {
|
||||
const cb = callback
|
||||
@@ -157,36 +165,75 @@ function buildConnector ({ allowH2, maxCachedSessions, socketPath, timeout, sess
|
||||
}
|
||||
}
|
||||
|
||||
function setupTimeout (onConnectTimeout, timeout) {
|
||||
if (!timeout) {
|
||||
return () => {}
|
||||
}
|
||||
|
||||
let s1 = null
|
||||
let s2 = null
|
||||
const timeoutId = setTimeout(() => {
|
||||
// setImmediate is added to make sure that we prioritize socket error events over timeouts
|
||||
s1 = setImmediate(() => {
|
||||
if (process.platform === 'win32') {
|
||||
// Windows needs an extra setImmediate probably due to implementation differences in the socket logic
|
||||
s2 = setImmediate(() => onConnectTimeout())
|
||||
} else {
|
||||
onConnectTimeout()
|
||||
/**
|
||||
* @param {WeakRef<net.Socket>} socketWeakRef
|
||||
* @param {object} opts
|
||||
* @param {number} opts.timeout
|
||||
* @param {string} opts.hostname
|
||||
* @param {number} opts.port
|
||||
* @returns {() => void}
|
||||
*/
|
||||
const setupConnectTimeout = process.platform === 'win32'
|
||||
? (socketWeakRef, opts) => {
|
||||
if (!opts.timeout) {
|
||||
return noop
|
||||
}
|
||||
})
|
||||
}, timeout)
|
||||
return () => {
|
||||
clearTimeout(timeoutId)
|
||||
clearImmediate(s1)
|
||||
clearImmediate(s2)
|
||||
}
|
||||
}
|
||||
|
||||
function onConnectTimeout (socket) {
|
||||
let s1 = null
|
||||
let s2 = null
|
||||
const fastTimer = timers.setFastTimeout(() => {
|
||||
// setImmediate is added to make sure that we prioritize socket error events over timeouts
|
||||
s1 = setImmediate(() => {
|
||||
// Windows needs an extra setImmediate probably due to implementation differences in the socket logic
|
||||
s2 = setImmediate(() => onConnectTimeout(socketWeakRef.deref(), opts))
|
||||
})
|
||||
}, opts.timeout)
|
||||
return () => {
|
||||
timers.clearFastTimeout(fastTimer)
|
||||
clearImmediate(s1)
|
||||
clearImmediate(s2)
|
||||
}
|
||||
}
|
||||
: (socketWeakRef, opts) => {
|
||||
if (!opts.timeout) {
|
||||
return noop
|
||||
}
|
||||
|
||||
let s1 = null
|
||||
const fastTimer = timers.setFastTimeout(() => {
|
||||
// setImmediate is added to make sure that we prioritize socket error events over timeouts
|
||||
s1 = setImmediate(() => {
|
||||
onConnectTimeout(socketWeakRef.deref(), opts)
|
||||
})
|
||||
}, opts.timeout)
|
||||
return () => {
|
||||
timers.clearFastTimeout(fastTimer)
|
||||
clearImmediate(s1)
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* @param {net.Socket} socket
|
||||
* @param {object} opts
|
||||
* @param {number} opts.timeout
|
||||
* @param {string} opts.hostname
|
||||
* @param {number} opts.port
|
||||
*/
|
||||
function onConnectTimeout (socket, opts) {
|
||||
// The socket could be already garbage collected
|
||||
if (socket == null) {
|
||||
return
|
||||
}
|
||||
|
||||
let message = 'Connect Timeout Error'
|
||||
if (Array.isArray(socket.autoSelectFamilyAttemptedAddresses)) {
|
||||
message += ` (attempted addresses: ${socket.autoSelectFamilyAttemptedAddresses.join(', ')})`
|
||||
message += ` (attempted addresses: ${socket.autoSelectFamilyAttemptedAddresses.join(', ')},`
|
||||
} else {
|
||||
message += ` (attempted address: ${opts.hostname}:${opts.port},`
|
||||
}
|
||||
|
||||
message += ` timeout: ${opts.timeout}ms)`
|
||||
|
||||
util.destroy(socket, new ConnectTimeoutError(message))
|
||||
}
|
||||
|
||||
|
||||
13
node_modules/undici/lib/core/errors.js
generated
vendored
@@ -195,6 +195,18 @@ class RequestRetryError extends UndiciError {
|
||||
}
|
||||
}
|
||||
|
||||
class ResponseError extends UndiciError {
|
||||
constructor (message, code, { headers, data }) {
|
||||
super(message)
|
||||
this.name = 'ResponseError'
|
||||
this.message = message || 'Response error'
|
||||
this.code = 'UND_ERR_RESPONSE'
|
||||
this.statusCode = code
|
||||
this.data = data
|
||||
this.headers = headers
|
||||
}
|
||||
}
|
||||
|
||||
class SecureProxyConnectionError extends UndiciError {
|
||||
constructor (cause, message, options) {
|
||||
super(message, { cause, ...(options ?? {}) })
|
||||
@@ -227,5 +239,6 @@ module.exports = {
|
||||
BalancedPoolMissingUpstreamError,
|
||||
ResponseExceededMaxSizeError,
|
||||
RequestRetryError,
|
||||
ResponseError,
|
||||
SecureProxyConnectionError
|
||||
}
|
||||
|
||||
2
node_modules/undici/lib/core/util.js
generated
vendored
@@ -233,7 +233,7 @@ function getServerName (host) {
|
||||
return null
|
||||
}
|
||||
|
||||
assert.strictEqual(typeof host, 'string')
|
||||
assert(typeof host === 'string')
|
||||
|
||||
const servername = getHostname(host)
|
||||
if (net.isIP(servername)) {
|
||||
|
||||
117
node_modules/undici/lib/dispatcher/client-h1.js
generated
vendored
@@ -85,35 +85,35 @@ async function lazyllhttp () {
|
||||
return 0
|
||||
},
|
||||
wasm_on_status: (p, at, len) => {
|
||||
assert.strictEqual(currentParser.ptr, p)
|
||||
assert(currentParser.ptr === p)
|
||||
const start = at - currentBufferPtr + currentBufferRef.byteOffset
|
||||
return currentParser.onStatus(new FastBuffer(currentBufferRef.buffer, start, len)) || 0
|
||||
},
|
||||
wasm_on_message_begin: (p) => {
|
||||
assert.strictEqual(currentParser.ptr, p)
|
||||
assert(currentParser.ptr === p)
|
||||
return currentParser.onMessageBegin() || 0
|
||||
},
|
||||
wasm_on_header_field: (p, at, len) => {
|
||||
assert.strictEqual(currentParser.ptr, p)
|
||||
assert(currentParser.ptr === p)
|
||||
const start = at - currentBufferPtr + currentBufferRef.byteOffset
|
||||
return currentParser.onHeaderField(new FastBuffer(currentBufferRef.buffer, start, len)) || 0
|
||||
},
|
||||
wasm_on_header_value: (p, at, len) => {
|
||||
assert.strictEqual(currentParser.ptr, p)
|
||||
assert(currentParser.ptr === p)
|
||||
const start = at - currentBufferPtr + currentBufferRef.byteOffset
|
||||
return currentParser.onHeaderValue(new FastBuffer(currentBufferRef.buffer, start, len)) || 0
|
||||
},
|
||||
wasm_on_headers_complete: (p, statusCode, upgrade, shouldKeepAlive) => {
|
||||
assert.strictEqual(currentParser.ptr, p)
|
||||
assert(currentParser.ptr === p)
|
||||
return currentParser.onHeadersComplete(statusCode, Boolean(upgrade), Boolean(shouldKeepAlive)) || 0
|
||||
},
|
||||
wasm_on_body: (p, at, len) => {
|
||||
assert.strictEqual(currentParser.ptr, p)
|
||||
assert(currentParser.ptr === p)
|
||||
const start = at - currentBufferPtr + currentBufferRef.byteOffset
|
||||
return currentParser.onBody(new FastBuffer(currentBufferRef.buffer, start, len)) || 0
|
||||
},
|
||||
wasm_on_message_complete: (p) => {
|
||||
assert.strictEqual(currentParser.ptr, p)
|
||||
assert(currentParser.ptr === p)
|
||||
return currentParser.onMessageComplete() || 0
|
||||
}
|
||||
|
||||
@@ -131,9 +131,17 @@ let currentBufferRef = null
|
||||
let currentBufferSize = 0
|
||||
let currentBufferPtr = null
|
||||
|
||||
const TIMEOUT_HEADERS = 1
|
||||
const TIMEOUT_BODY = 2
|
||||
const TIMEOUT_IDLE = 3
|
||||
const USE_NATIVE_TIMER = 0
|
||||
const USE_FAST_TIMER = 1
|
||||
|
||||
// Use fast timers for headers and body to take eventual event loop
|
||||
// latency into account.
|
||||
const TIMEOUT_HEADERS = 2 | USE_FAST_TIMER
|
||||
const TIMEOUT_BODY = 4 | USE_FAST_TIMER
|
||||
|
||||
// Use native timers to ignore event loop latency for keep-alive
|
||||
// handling.
|
||||
const TIMEOUT_KEEP_ALIVE = 8 | USE_NATIVE_TIMER
|
||||
|
||||
class Parser {
|
||||
constructor (client, socket, { exports }) {
|
||||
@@ -164,26 +172,39 @@ class Parser {
|
||||
this.maxResponseSize = client[kMaxResponseSize]
|
||||
}
|
||||
|
||||
setTimeout (value, type) {
|
||||
this.timeoutType = type
|
||||
if (value !== this.timeoutValue) {
|
||||
timers.clearTimeout(this.timeout)
|
||||
if (value) {
|
||||
this.timeout = timers.setTimeout(onParserTimeout, value, this)
|
||||
// istanbul ignore else: only for jest
|
||||
if (this.timeout.unref) {
|
||||
this.timeout.unref()
|
||||
}
|
||||
} else {
|
||||
setTimeout (delay, type) {
|
||||
// If the existing timer and the new timer are of different timer type
|
||||
// (fast or native) or have different delay, we need to clear the existing
|
||||
// timer and set a new one.
|
||||
if (
|
||||
delay !== this.timeoutValue ||
|
||||
(type & USE_FAST_TIMER) ^ (this.timeoutType & USE_FAST_TIMER)
|
||||
) {
|
||||
// If a timeout is already set, clear it with clearTimeout of the fast
|
||||
// timer implementation, as it can clear fast and native timers.
|
||||
if (this.timeout) {
|
||||
timers.clearTimeout(this.timeout)
|
||||
this.timeout = null
|
||||
}
|
||||
this.timeoutValue = value
|
||||
|
||||
if (delay) {
|
||||
if (type & USE_FAST_TIMER) {
|
||||
this.timeout = timers.setFastTimeout(onParserTimeout, delay, new WeakRef(this))
|
||||
} else {
|
||||
this.timeout = setTimeout(onParserTimeout, delay, new WeakRef(this))
|
||||
this.timeout.unref()
|
||||
}
|
||||
}
|
||||
|
||||
this.timeoutValue = delay
|
||||
} else if (this.timeout) {
|
||||
// istanbul ignore else: only for jest
|
||||
if (this.timeout.refresh) {
|
||||
this.timeout.refresh()
|
||||
}
|
||||
}
|
||||
|
||||
this.timeoutType = type
|
||||
}
|
||||
|
||||
resume () {
|
||||
@@ -288,7 +309,7 @@ class Parser {
|
||||
this.llhttp.llhttp_free(this.ptr)
|
||||
this.ptr = null
|
||||
|
||||
timers.clearTimeout(this.timeout)
|
||||
this.timeout && timers.clearTimeout(this.timeout)
|
||||
this.timeout = null
|
||||
this.timeoutValue = null
|
||||
this.timeoutType = null
|
||||
@@ -363,20 +384,19 @@ class Parser {
|
||||
const { upgrade, client, socket, headers, statusCode } = this
|
||||
|
||||
assert(upgrade)
|
||||
assert(client[kSocket] === socket)
|
||||
assert(!socket.destroyed)
|
||||
assert(!this.paused)
|
||||
assert((headers.length & 1) === 0)
|
||||
|
||||
const request = client[kQueue][client[kRunningIdx]]
|
||||
assert(request)
|
||||
|
||||
assert(!socket.destroyed)
|
||||
assert(socket === client[kSocket])
|
||||
assert(!this.paused)
|
||||
assert(request.upgrade || request.method === 'CONNECT')
|
||||
|
||||
this.statusCode = null
|
||||
this.statusText = ''
|
||||
this.shouldKeepAlive = null
|
||||
|
||||
assert(this.headers.length % 2 === 0)
|
||||
this.headers = []
|
||||
this.headersSize = 0
|
||||
|
||||
@@ -433,7 +453,7 @@ class Parser {
|
||||
return -1
|
||||
}
|
||||
|
||||
assert.strictEqual(this.timeoutType, TIMEOUT_HEADERS)
|
||||
assert(this.timeoutType === TIMEOUT_HEADERS)
|
||||
|
||||
this.statusCode = statusCode
|
||||
this.shouldKeepAlive = (
|
||||
@@ -466,7 +486,7 @@ class Parser {
|
||||
return 2
|
||||
}
|
||||
|
||||
assert(this.headers.length % 2 === 0)
|
||||
assert((this.headers.length & 1) === 0)
|
||||
this.headers = []
|
||||
this.headersSize = 0
|
||||
|
||||
@@ -523,7 +543,7 @@ class Parser {
|
||||
const request = client[kQueue][client[kRunningIdx]]
|
||||
assert(request)
|
||||
|
||||
assert.strictEqual(this.timeoutType, TIMEOUT_BODY)
|
||||
assert(this.timeoutType === TIMEOUT_BODY)
|
||||
if (this.timeout) {
|
||||
// istanbul ignore else: only for jest
|
||||
if (this.timeout.refresh) {
|
||||
@@ -556,11 +576,12 @@ class Parser {
|
||||
return
|
||||
}
|
||||
|
||||
assert(statusCode >= 100)
|
||||
assert((this.headers.length & 1) === 0)
|
||||
|
||||
const request = client[kQueue][client[kRunningIdx]]
|
||||
assert(request)
|
||||
|
||||
assert(statusCode >= 100)
|
||||
|
||||
this.statusCode = null
|
||||
this.statusText = ''
|
||||
this.bytesRead = 0
|
||||
@@ -568,7 +589,6 @@ class Parser {
|
||||
this.keepAlive = ''
|
||||
this.connection = ''
|
||||
|
||||
assert(this.headers.length % 2 === 0)
|
||||
this.headers = []
|
||||
this.headersSize = 0
|
||||
|
||||
@@ -587,7 +607,7 @@ class Parser {
|
||||
client[kQueue][client[kRunningIdx]++] = null
|
||||
|
||||
if (socket[kWriting]) {
|
||||
assert.strictEqual(client[kRunning], 0)
|
||||
assert(client[kRunning] === 0)
|
||||
// Response completed before request.
|
||||
util.destroy(socket, new InformationalError('reset'))
|
||||
return constants.ERROR.PAUSED
|
||||
@@ -613,19 +633,19 @@ class Parser {
|
||||
}
|
||||
|
||||
function onParserTimeout (parser) {
|
||||
const { socket, timeoutType, client } = parser
|
||||
const { socket, timeoutType, client, paused } = parser.deref()
|
||||
|
||||
/* istanbul ignore else */
|
||||
if (timeoutType === TIMEOUT_HEADERS) {
|
||||
if (!socket[kWriting] || socket.writableNeedDrain || client[kRunning] > 1) {
|
||||
assert(!parser.paused, 'cannot be paused while waiting for headers')
|
||||
assert(!paused, 'cannot be paused while waiting for headers')
|
||||
util.destroy(socket, new HeadersTimeoutError())
|
||||
}
|
||||
} else if (timeoutType === TIMEOUT_BODY) {
|
||||
if (!parser.paused) {
|
||||
if (!paused) {
|
||||
util.destroy(socket, new BodyTimeoutError())
|
||||
}
|
||||
} else if (timeoutType === TIMEOUT_IDLE) {
|
||||
} else if (timeoutType === TIMEOUT_KEEP_ALIVE) {
|
||||
assert(client[kRunning] === 0 && client[kKeepAliveTimeoutValue])
|
||||
util.destroy(socket, new InformationalError('socket idle timeout'))
|
||||
}
|
||||
@@ -646,10 +666,10 @@ async function connectH1 (client, socket) {
|
||||
socket[kParser] = new Parser(client, socket, llhttpInstance)
|
||||
|
||||
addListener(socket, 'error', function (err) {
|
||||
const parser = this[kParser]
|
||||
|
||||
assert(err.code !== 'ERR_TLS_CERT_ALTNAME_INVALID')
|
||||
|
||||
const parser = this[kParser]
|
||||
|
||||
// On Mac OS, we get an ECONNRESET even if there is a full body to be forwarded
|
||||
// to the user.
|
||||
if (err.code === 'ECONNRESET' && parser.statusCode && !parser.shouldKeepAlive) {
|
||||
@@ -803,8 +823,8 @@ function resumeH1 (client) {
|
||||
}
|
||||
|
||||
if (client[kSize] === 0) {
|
||||
if (socket[kParser].timeoutType !== TIMEOUT_IDLE) {
|
||||
socket[kParser].setTimeout(client[kKeepAliveTimeoutValue], TIMEOUT_IDLE)
|
||||
if (socket[kParser].timeoutType !== TIMEOUT_KEEP_ALIVE) {
|
||||
socket[kParser].setTimeout(client[kKeepAliveTimeoutValue], TIMEOUT_KEEP_ALIVE)
|
||||
}
|
||||
} else if (client[kRunning] > 0 && socket[kParser].statusCode < 200) {
|
||||
if (socket[kParser].timeoutType !== TIMEOUT_HEADERS) {
|
||||
@@ -840,7 +860,10 @@ function writeH1 (client, request) {
|
||||
const expectsPayload = (
|
||||
method === 'PUT' ||
|
||||
method === 'POST' ||
|
||||
method === 'PATCH'
|
||||
method === 'PATCH' ||
|
||||
method === 'QUERY' ||
|
||||
method === 'PROPFIND' ||
|
||||
method === 'PROPPATCH'
|
||||
)
|
||||
|
||||
if (util.isFormDataLike(body)) {
|
||||
@@ -1119,7 +1142,7 @@ function writeBuffer (abort, body, client, request, socket, contentLength, heade
|
||||
socket.uncork()
|
||||
request.onBodySent(body)
|
||||
|
||||
if (!expectsPayload) {
|
||||
if (!expectsPayload && request.reset !== false) {
|
||||
socket[kReset] = true
|
||||
}
|
||||
}
|
||||
@@ -1149,7 +1172,7 @@ async function writeBlob (abort, body, client, request, socket, contentLength, h
|
||||
request.onBodySent(buffer)
|
||||
request.onRequestSent()
|
||||
|
||||
if (!expectsPayload) {
|
||||
if (!expectsPayload && request.reset !== false) {
|
||||
socket[kReset] = true
|
||||
}
|
||||
|
||||
@@ -1250,7 +1273,7 @@ class AsyncWriter {
|
||||
socket.cork()
|
||||
|
||||
if (bytesWritten === 0) {
|
||||
if (!expectsPayload) {
|
||||
if (!expectsPayload && request.reset !== false) {
|
||||
socket[kReset] = true
|
||||
}
|
||||
|
||||
|
||||
91
node_modules/undici/lib/dispatcher/client-h2.js
generated
vendored
@@ -24,11 +24,15 @@ const {
|
||||
kOnError,
|
||||
kMaxConcurrentStreams,
|
||||
kHTTP2Session,
|
||||
kResume
|
||||
kResume,
|
||||
kSize,
|
||||
kHTTPContext
|
||||
} = require('../core/symbols.js')
|
||||
|
||||
const kOpenStreams = Symbol('open streams')
|
||||
|
||||
let extractBody
|
||||
|
||||
// Experimental
|
||||
let h2ExperimentalWarned = false
|
||||
|
||||
@@ -160,11 +164,10 @@ async function connectH2 (client, socket) {
|
||||
version: 'h2',
|
||||
defaultPipelining: Infinity,
|
||||
write (...args) {
|
||||
// TODO (fix): return
|
||||
writeH2(client, ...args)
|
||||
return writeH2(client, ...args)
|
||||
},
|
||||
resume () {
|
||||
|
||||
resumeH2(client)
|
||||
},
|
||||
destroy (err, callback) {
|
||||
if (closed) {
|
||||
@@ -183,6 +186,20 @@ async function connectH2 (client, socket) {
|
||||
}
|
||||
}
|
||||
|
||||
function resumeH2 (client) {
|
||||
const socket = client[kSocket]
|
||||
|
||||
if (socket?.destroyed === false) {
|
||||
if (client[kSize] === 0 && client[kMaxConcurrentStreams] === 0) {
|
||||
socket.unref()
|
||||
client[kHTTP2Session].unref()
|
||||
} else {
|
||||
socket.ref()
|
||||
client[kHTTP2Session].ref()
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
function onHttp2SessionError (err) {
|
||||
assert(err.code !== 'ERR_TLS_CERT_ALTNAME_INVALID')
|
||||
|
||||
@@ -210,17 +227,33 @@ function onHttp2SessionEnd () {
|
||||
* along with the socket right away
|
||||
*/
|
||||
function onHTTP2GoAway (code) {
|
||||
const err = new RequestAbortedError(`HTTP/2: "GOAWAY" frame received with code ${code}`)
|
||||
// We cannot recover, so best to close the session and the socket
|
||||
const err = this[kError] || new SocketError(`HTTP/2: "GOAWAY" frame received with code ${code}`, util.getSocketInfo(this))
|
||||
const client = this[kClient]
|
||||
|
||||
// We need to trigger the close cycle right away
|
||||
// We need to destroy the session and the socket
|
||||
// Requests should be failed with the error after the current one is handled
|
||||
this[kSocket][kError] = err
|
||||
this[kClient][kOnError](err)
|
||||
client[kSocket] = null
|
||||
client[kHTTPContext] = null
|
||||
|
||||
this.unref()
|
||||
if (this[kHTTP2Session] != null) {
|
||||
this[kHTTP2Session].destroy(err)
|
||||
this[kHTTP2Session] = null
|
||||
}
|
||||
|
||||
util.destroy(this[kSocket], err)
|
||||
|
||||
// Fail head of pipeline.
|
||||
if (client[kRunningIdx] < client[kQueue].length) {
|
||||
const request = client[kQueue][client[kRunningIdx]]
|
||||
client[kQueue][client[kRunningIdx]++] = null
|
||||
util.errorRequest(client, request, err)
|
||||
client[kPendingIdx] = client[kRunningIdx]
|
||||
}
|
||||
|
||||
assert(client[kRunning] === 0)
|
||||
|
||||
client.emit('disconnect', client[kUrl], [client], err)
|
||||
|
||||
client[kResume]()
|
||||
}
|
||||
|
||||
// https://www.rfc-editor.org/rfc/rfc7230#section-3.3.2
|
||||
@@ -230,17 +263,14 @@ function shouldSendContentLength (method) {
|
||||
|
||||
function writeH2 (client, request) {
|
||||
const session = client[kHTTP2Session]
|
||||
const { body, method, path, host, upgrade, expectContinue, signal, headers: reqHeaders } = request
|
||||
const { method, path, host, upgrade, expectContinue, signal, headers: reqHeaders } = request
|
||||
let { body } = request
|
||||
|
||||
if (upgrade) {
|
||||
util.errorRequest(client, request, new Error('Upgrade not supported for H2'))
|
||||
return false
|
||||
}
|
||||
|
||||
if (request.aborted) {
|
||||
return false
|
||||
}
|
||||
|
||||
const headers = {}
|
||||
for (let n = 0; n < reqHeaders.length; n += 2) {
|
||||
const key = reqHeaders[n + 0]
|
||||
@@ -283,6 +313,8 @@ function writeH2 (client, request) {
|
||||
// We do not destroy the socket as we can continue using the session
|
||||
// the stream get's destroyed and the session remains to create new streams
|
||||
util.destroy(body, err)
|
||||
client[kQueue][client[kRunningIdx]++] = null
|
||||
client[kResume]()
|
||||
}
|
||||
|
||||
try {
|
||||
@@ -293,6 +325,10 @@ function writeH2 (client, request) {
|
||||
util.errorRequest(client, request, err)
|
||||
}
|
||||
|
||||
if (request.aborted) {
|
||||
return false
|
||||
}
|
||||
|
||||
if (method === 'CONNECT') {
|
||||
session.ref()
|
||||
// We are already connected, streams are pending, first request
|
||||
@@ -304,10 +340,12 @@ function writeH2 (client, request) {
|
||||
if (stream.id && !stream.pending) {
|
||||
request.onUpgrade(null, null, stream)
|
||||
++session[kOpenStreams]
|
||||
client[kQueue][client[kRunningIdx]++] = null
|
||||
} else {
|
||||
stream.once('ready', () => {
|
||||
request.onUpgrade(null, null, stream)
|
||||
++session[kOpenStreams]
|
||||
client[kQueue][client[kRunningIdx]++] = null
|
||||
})
|
||||
}
|
||||
|
||||
@@ -347,6 +385,16 @@ function writeH2 (client, request) {
|
||||
|
||||
let contentLength = util.bodyLength(body)
|
||||
|
||||
if (util.isFormDataLike(body)) {
|
||||
extractBody ??= require('../web/fetch/body.js').extractBody
|
||||
|
||||
const [bodyStream, contentType] = extractBody(body)
|
||||
headers['content-type'] = contentType
|
||||
|
||||
body = bodyStream.stream
|
||||
contentLength = bodyStream.length
|
||||
}
|
||||
|
||||
if (contentLength == null) {
|
||||
contentLength = request.contentLength
|
||||
}
|
||||
@@ -428,17 +476,20 @@ function writeH2 (client, request) {
|
||||
// Present specially when using pipeline or stream
|
||||
if (stream.state?.state == null || stream.state.state < 6) {
|
||||
request.onComplete([])
|
||||
return
|
||||
}
|
||||
|
||||
// Stream is closed or half-closed-remote (6), decrement counter and cleanup
|
||||
// It does not have sense to continue working with the stream as we do not
|
||||
// have yet RST_STREAM support on client-side
|
||||
if (session[kOpenStreams] === 0) {
|
||||
// Stream is closed or half-closed-remote (6), decrement counter and cleanup
|
||||
// It does not have sense to continue working with the stream as we do not
|
||||
// have yet RST_STREAM support on client-side
|
||||
|
||||
session.unref()
|
||||
}
|
||||
|
||||
abort(new InformationalError('HTTP/2: stream half-closed (remote)'))
|
||||
client[kQueue][client[kRunningIdx]++] = null
|
||||
client[kPendingIdx] = client[kRunningIdx]
|
||||
client[kResume]()
|
||||
})
|
||||
|
||||
stream.once('close', () => {
|
||||
|
||||
10
node_modules/undici/lib/dispatcher/client.js
generated
vendored
@@ -63,6 +63,8 @@ let deprecatedInterceptorWarned = false
|
||||
|
||||
const kClosedResolve = Symbol('kClosedResolve')
|
||||
|
||||
const noop = () => {}
|
||||
|
||||
function getPipelining (client) {
|
||||
return client[kPipelining] ?? client[kHTTPContext]?.defaultPipelining ?? 1
|
||||
}
|
||||
@@ -385,6 +387,10 @@ function onError (client, err) {
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* @param {Client} client
|
||||
* @returns
|
||||
*/
|
||||
async function connect (client) {
|
||||
assert(!client[kConnecting])
|
||||
assert(!client[kHTTPContext])
|
||||
@@ -438,7 +444,7 @@ async function connect (client) {
|
||||
})
|
||||
|
||||
if (client.destroyed) {
|
||||
util.destroy(socket.on('error', () => {}), new ClientDestroyedError())
|
||||
util.destroy(socket.on('error', noop), new ClientDestroyedError())
|
||||
return
|
||||
}
|
||||
|
||||
@@ -449,7 +455,7 @@ async function connect (client) {
|
||||
? await connectH2(client, socket)
|
||||
: await connectH1(client, socket)
|
||||
} catch (err) {
|
||||
socket.destroy().on('error', () => {})
|
||||
socket.destroy().on('error', noop)
|
||||
throw err
|
||||
}
|
||||
|
||||
|
||||
6
node_modules/undici/lib/dispatcher/pool-base.js
generated
vendored
@@ -113,9 +113,9 @@ class PoolBase extends DispatcherBase {
|
||||
|
||||
async [kClose] () {
|
||||
if (this[kQueue].isEmpty()) {
|
||||
return Promise.all(this[kClients].map(c => c.close()))
|
||||
await Promise.all(this[kClients].map(c => c.close()))
|
||||
} else {
|
||||
return new Promise((resolve) => {
|
||||
await new Promise((resolve) => {
|
||||
this[kClosedResolve] = resolve
|
||||
})
|
||||
}
|
||||
@@ -130,7 +130,7 @@ class PoolBase extends DispatcherBase {
|
||||
item.handler.onError(err)
|
||||
}
|
||||
|
||||
return Promise.all(this[kClients].map(c => c.destroy(err)))
|
||||
await Promise.all(this[kClients].map(c => c.destroy(err)))
|
||||
}
|
||||
|
||||
[kDispatch] (opts, handler) {
|
||||
|
||||
14
node_modules/undici/lib/dispatcher/pool.js
generated
vendored
@@ -73,6 +73,20 @@ class Pool extends PoolBase {
|
||||
? { ...options.interceptors }
|
||||
: undefined
|
||||
this[kFactory] = factory
|
||||
|
||||
this.on('connectionError', (origin, targets, error) => {
|
||||
// If a connection error occurs, we remove the client from the pool,
|
||||
// and emit a connectionError event. They will not be re-used.
|
||||
// Fixes https://github.com/nodejs/undici/issues/3895
|
||||
for (const target of targets) {
|
||||
// Do not use kRemoveClient here, as it will close the client,
|
||||
// but the client cannot be closed in this state.
|
||||
const idx = this[kClients].indexOf(target)
|
||||
if (idx !== -1) {
|
||||
this[kClients].splice(idx, 1)
|
||||
}
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
[kGetDispatcher] () {
|
||||
|
||||
4
node_modules/undici/lib/dispatcher/proxy-agent.js
generated
vendored
@@ -23,6 +23,8 @@ function defaultFactory (origin, opts) {
|
||||
return new Pool(origin, opts)
|
||||
}
|
||||
|
||||
const noop = () => {}
|
||||
|
||||
class ProxyAgent extends DispatcherBase {
|
||||
constructor (opts) {
|
||||
super()
|
||||
@@ -81,7 +83,7 @@ class ProxyAgent extends DispatcherBase {
|
||||
servername: this[kProxyTls]?.servername || proxyHostname
|
||||
})
|
||||
if (statusCode !== 200) {
|
||||
socket.on('error', () => {}).destroy()
|
||||
socket.on('error', noop).destroy()
|
||||
callback(new RequestAbortedError(`Proxy response (${statusCode}) !== 200 when HTTP Tunneling`))
|
||||
}
|
||||
if (opts.protocol !== 'https:') {
|
||||
|
||||
20
node_modules/undici/lib/handler/retry-handler.js
generated
vendored
@@ -192,8 +192,18 @@ class RetryHandler {
|
||||
if (this.resume != null) {
|
||||
this.resume = null
|
||||
|
||||
if (statusCode !== 206) {
|
||||
return true
|
||||
// Only Partial Content 206 supposed to provide Content-Range,
|
||||
// any other status code that partially consumed the payload
|
||||
// should not be retry because it would result in downstream
|
||||
// wrongly concatanete multiple responses.
|
||||
if (statusCode !== 206 && (this.start > 0 || statusCode !== 200)) {
|
||||
this.abort(
|
||||
new RequestRetryError('server does not support the range header and the payload was partially consumed', statusCode, {
|
||||
headers,
|
||||
data: { count: this.retryCount }
|
||||
})
|
||||
)
|
||||
return false
|
||||
}
|
||||
|
||||
const contentRange = parseRangeHeader(headers['content-range'])
|
||||
@@ -219,7 +229,7 @@ class RetryHandler {
|
||||
return false
|
||||
}
|
||||
|
||||
const { start, size, end = size } = contentRange
|
||||
const { start, size, end = size - 1 } = contentRange
|
||||
|
||||
assert(this.start === start, 'content-range mismatch')
|
||||
assert(this.end == null || this.end === end, 'content-range mismatch')
|
||||
@@ -242,7 +252,7 @@ class RetryHandler {
|
||||
)
|
||||
}
|
||||
|
||||
const { start, size, end = size } = range
|
||||
const { start, size, end = size - 1 } = range
|
||||
assert(
|
||||
start != null && Number.isFinite(start),
|
||||
'content-range mismatch'
|
||||
@@ -256,7 +266,7 @@ class RetryHandler {
|
||||
// We make our best to checkpoint the body for further range headers
|
||||
if (this.end == null) {
|
||||
const contentLength = headers['content-length']
|
||||
this.end = contentLength != null ? Number(contentLength) : null
|
||||
this.end = contentLength != null ? Number(contentLength) - 1 : null
|
||||
}
|
||||
|
||||
assert(Number.isFinite(this.start))
|
||||
|
||||
4
node_modules/undici/lib/mock/mock-utils.js
generated
vendored
@@ -118,6 +118,10 @@ function matchKey (mockDispatch, { path, method, body, headers }) {
|
||||
function getResponseData (data) {
|
||||
if (Buffer.isBuffer(data)) {
|
||||
return data
|
||||
} else if (data instanceof Uint8Array) {
|
||||
return data
|
||||
} else if (data instanceof ArrayBuffer) {
|
||||
return data
|
||||
} else if (typeof data === 'object') {
|
||||
return JSON.stringify(data)
|
||||
} else {
|
||||
|
||||
410
node_modules/undici/lib/util/timers.js
generated
vendored
@@ -1,99 +1,423 @@
|
||||
'use strict'
|
||||
|
||||
const TICK_MS = 499
|
||||
/**
|
||||
* This module offers an optimized timer implementation designed for scenarios
|
||||
* where high precision is not critical.
|
||||
*
|
||||
* The timer achieves faster performance by using a low-resolution approach,
|
||||
* with an accuracy target of within 500ms. This makes it particularly useful
|
||||
* for timers with delays of 1 second or more, where exact timing is less
|
||||
* crucial.
|
||||
*
|
||||
* It's important to note that Node.js timers are inherently imprecise, as
|
||||
* delays can occur due to the event loop being blocked by other operations.
|
||||
* Consequently, timers may trigger later than their scheduled time.
|
||||
*/
|
||||
|
||||
let fastNow = Date.now()
|
||||
/**
|
||||
* The fastNow variable contains the internal fast timer clock value.
|
||||
*
|
||||
* @type {number}
|
||||
*/
|
||||
let fastNow = 0
|
||||
|
||||
/**
|
||||
* RESOLUTION_MS represents the target resolution time in milliseconds.
|
||||
*
|
||||
* @type {number}
|
||||
* @default 1000
|
||||
*/
|
||||
const RESOLUTION_MS = 1e3
|
||||
|
||||
/**
|
||||
* TICK_MS defines the desired interval in milliseconds between each tick.
|
||||
* The target value is set to half the resolution time, minus 1 ms, to account
|
||||
* for potential event loop overhead.
|
||||
*
|
||||
* @type {number}
|
||||
* @default 499
|
||||
*/
|
||||
const TICK_MS = (RESOLUTION_MS >> 1) - 1
|
||||
|
||||
/**
|
||||
* fastNowTimeout is a Node.js timer used to manage and process
|
||||
* the FastTimers stored in the `fastTimers` array.
|
||||
*
|
||||
* @type {NodeJS.Timeout}
|
||||
*/
|
||||
let fastNowTimeout
|
||||
|
||||
/**
|
||||
* The kFastTimer symbol is used to identify FastTimer instances.
|
||||
*
|
||||
* @type {Symbol}
|
||||
*/
|
||||
const kFastTimer = Symbol('kFastTimer')
|
||||
|
||||
/**
|
||||
* The fastTimers array contains all active FastTimers.
|
||||
*
|
||||
* @type {FastTimer[]}
|
||||
*/
|
||||
const fastTimers = []
|
||||
|
||||
function onTimeout () {
|
||||
fastNow = Date.now()
|
||||
/**
|
||||
* These constants represent the various states of a FastTimer.
|
||||
*/
|
||||
|
||||
let len = fastTimers.length
|
||||
/**
|
||||
* The `NOT_IN_LIST` constant indicates that the FastTimer is not included
|
||||
* in the `fastTimers` array. Timers with this status will not be processed
|
||||
* during the next tick by the `onTick` function.
|
||||
*
|
||||
* A FastTimer can be re-added to the `fastTimers` array by invoking the
|
||||
* `refresh` method on the FastTimer instance.
|
||||
*
|
||||
* @type {-2}
|
||||
*/
|
||||
const NOT_IN_LIST = -2
|
||||
|
||||
/**
|
||||
* The `TO_BE_CLEARED` constant indicates that the FastTimer is scheduled
|
||||
* for removal from the `fastTimers` array. A FastTimer in this state will
|
||||
* be removed in the next tick by the `onTick` function and will no longer
|
||||
* be processed.
|
||||
*
|
||||
* This status is also set when the `clear` method is called on the FastTimer instance.
|
||||
*
|
||||
* @type {-1}
|
||||
*/
|
||||
const TO_BE_CLEARED = -1
|
||||
|
||||
/**
|
||||
* The `PENDING` constant signifies that the FastTimer is awaiting processing
|
||||
* in the next tick by the `onTick` function. Timers with this status will have
|
||||
* their `_idleStart` value set and their status updated to `ACTIVE` in the next tick.
|
||||
*
|
||||
* @type {0}
|
||||
*/
|
||||
const PENDING = 0
|
||||
|
||||
/**
|
||||
* The `ACTIVE` constant indicates that the FastTimer is active and waiting
|
||||
* for its timer to expire. During the next tick, the `onTick` function will
|
||||
* check if the timer has expired, and if so, it will execute the associated callback.
|
||||
*
|
||||
* @type {1}
|
||||
*/
|
||||
const ACTIVE = 1
|
||||
|
||||
/**
|
||||
* The onTick function processes the fastTimers array.
|
||||
*
|
||||
* @returns {void}
|
||||
*/
|
||||
function onTick () {
|
||||
/**
|
||||
* Increment the fastNow value by the TICK_MS value, despite the actual time
|
||||
* that has passed since the last tick. This approach ensures independence
|
||||
* from the system clock and delays caused by a blocked event loop.
|
||||
*
|
||||
* @type {number}
|
||||
*/
|
||||
fastNow += TICK_MS
|
||||
|
||||
/**
|
||||
* The `idx` variable is used to iterate over the `fastTimers` array.
|
||||
* Expired timers are removed by replacing them with the last element in the array.
|
||||
* Consequently, `idx` is only incremented when the current element is not removed.
|
||||
*
|
||||
* @type {number}
|
||||
*/
|
||||
let idx = 0
|
/**
 * The len variable will contain the length of the fastTimers array
 * and will be decremented when a FastTimer should be removed from the
 * fastTimers array.
 *
 * @type {number}
 */
let len = fastTimers.length

while (idx < len) {
/**
 * @type {FastTimer}
 */
const timer = fastTimers[idx]

if (timer.state === 0) {
timer.state = fastNow + timer.delay - TICK_MS
} else if (timer.state > 0 && fastNow >= timer.state) {
timer.state = -1
timer.callback(timer.opaque)
// If the timer is in the ACTIVE state and the timer has expired, it will
// be processed in the next tick.
if (timer._state === PENDING) {
// Set the _idleStart value to the fastNow value minus the TICK_MS value
// to account for the time the timer was in the PENDING state.
timer._idleStart = fastNow - TICK_MS
timer._state = ACTIVE
} else if (
timer._state === ACTIVE &&
fastNow >= timer._idleStart + timer._idleTimeout
) {
timer._state = TO_BE_CLEARED
timer._idleStart = -1
timer._onTimeout(timer._timerArg)
}

if (timer.state === -1) {
timer.state = -2
if (idx !== len - 1) {
fastTimers[idx] = fastTimers.pop()
} else {
fastTimers.pop()
if (timer._state === TO_BE_CLEARED) {
timer._state = NOT_IN_LIST

// Move the last element to the current index and decrement len if it is
// not the only element in the array.
if (--len !== 0) {
fastTimers[idx] = fastTimers[len]
}
len -= 1
} else {
idx += 1
++idx
}
}

if (fastTimers.length > 0) {
// Set the length of the fastTimers array to the new length and thus
// removing the excess FastTimers elements from the array.
fastTimers.length = len

// If there are still active FastTimers in the array, refresh the Timer.
// If there are no active FastTimers, the timer will be refreshed again
// when a new FastTimer is instantiated.
if (fastTimers.length !== 0) {
refreshTimeout()
}
}

function refreshTimeout () {
if (fastNowTimeout?.refresh) {
// If the fastNowTimeout is already set, refresh it.
if (fastNowTimeout) {
fastNowTimeout.refresh()
// fastNowTimeout is not instantiated yet, create a new Timer.
} else {
clearTimeout(fastNowTimeout)
fastNowTimeout = setTimeout(onTimeout, TICK_MS)
fastNowTimeout = setTimeout(onTick, TICK_MS)

// If the Timer has an unref method, call it to allow the process to exit if
// there are no other active handles.
if (fastNowTimeout.unref) {
fastNowTimeout.unref()
}
}
}

class Timeout {
constructor (callback, delay, opaque) {
this.callback = callback
this.delay = delay
this.opaque = opaque
/**
 * The `FastTimer` class is a data structure designed to store and manage
 * timer information.
 */
class FastTimer {
[kFastTimer] = true

// -2 not in timer list
// -1 in timer list but inactive
// 0 in timer list waiting for time
// > 0 in timer list waiting for time to expire
this.state = -2
/**
 * The state of the timer, which can be one of the following:
 * - NOT_IN_LIST (-2)
 * - TO_BE_CLEARED (-1)
 * - PENDING (0)
 * - ACTIVE (1)
 *
 * @type {-2|-1|0|1}
 * @private
 */
_state = NOT_IN_LIST

/**
 * The number of milliseconds to wait before calling the callback.
 *
 * @type {number}
 * @private
 */
_idleTimeout = -1

/**
 * The time in milliseconds when the timer was started. This value is used to
 * calculate when the timer should expire.
 *
 * @type {number}
 * @default -1
 * @private
 */
_idleStart = -1

/**
 * The function to be executed when the timer expires.
 * @type {Function}
 * @private
 */
_onTimeout

/**
 * The argument to be passed to the callback when the timer expires.
 *
 * @type {*}
 * @private
 */
_timerArg

/**
 * @constructor
 * @param {Function} callback A function to be executed after the timer
 * expires.
 * @param {number} delay The time, in milliseconds that the timer should wait
 * before the specified function or code is executed.
 * @param {*} arg
 */
constructor (callback, delay, arg) {
this._onTimeout = callback
this._idleTimeout = delay
this._timerArg = arg

this.refresh()
}

/**
 * Sets the timer's start time to the current time, and reschedules the timer
 * to call its callback at the previously specified duration adjusted to the
 * current time.
 * Using this on a timer that has already called its callback will reactivate
 * the timer.
 *
 * @returns {void}
 */
refresh () {
if (this.state === -2) {
// In the special case that the timer is not in the list of active timers,
// add it back to the array to be processed in the next tick by the onTick
// function.
if (this._state === NOT_IN_LIST) {
fastTimers.push(this)
if (!fastNowTimeout || fastTimers.length === 1) {
refreshTimeout()
}
}

this.state = 0
// If the timer is the only active timer, refresh the fastNowTimeout for
// better resolution.
if (!fastNowTimeout || fastTimers.length === 1) {
refreshTimeout()
}

// Setting the state to PENDING will cause the timer to be reset in the
// next tick by the onTick function.
this._state = PENDING
}

/**
 * The `clear` method cancels the timer, preventing it from executing.
 *
 * @returns {void}
 * @private
 */
clear () {
this.state = -1
// Set the state to TO_BE_CLEARED to mark the timer for removal in the next
// tick by the onTick function.
this._state = TO_BE_CLEARED

// Reset the _idleStart value to -1 to indicate that the timer is no longer
// active.
this._idleStart = -1
}
}

/**
 * This module exports a setTimeout and clearTimeout function that can be
 * used as a drop-in replacement for the native functions.
 */
module.exports = {
setTimeout (callback, delay, opaque) {
return delay <= 1e3
? setTimeout(callback, delay, opaque)
: new Timeout(callback, delay, opaque)
/**
 * The setTimeout() method sets a timer which executes a function once the
 * timer expires.
 * @param {Function} callback A function to be executed after the timer
 * expires.
 * @param {number} delay The time, in milliseconds that the timer should
 * wait before the specified function or code is executed.
 * @param {*} [arg] An optional argument to be passed to the callback function
 * when the timer expires.
 * @returns {NodeJS.Timeout|FastTimer}
 */
setTimeout (callback, delay, arg) {
// If the delay is less than or equal to the RESOLUTION_MS value return a
// native Node.js Timer instance.
return delay <= RESOLUTION_MS
? setTimeout(callback, delay, arg)
: new FastTimer(callback, delay, arg)
},
/**
 * The clearTimeout method cancels an instantiated Timer previously created
 * by calling setTimeout.
 *
 * @param {NodeJS.Timeout|FastTimer} timeout
 */
clearTimeout (timeout) {
if (timeout instanceof Timeout) {
// If the timeout is a FastTimer, call its own clear method.
if (timeout[kFastTimer]) {
/**
 * @type {FastTimer}
 */
timeout.clear()
// Otherwise it is an instance of a native NodeJS.Timeout, so call the
// Node.js native clearTimeout function.
} else {
clearTimeout(timeout)
}
}
},
/**
 * The setFastTimeout() method sets a fastTimer which executes a function once
 * the timer expires.
 * @param {Function} callback A function to be executed after the timer
 * expires.
 * @param {number} delay The time, in milliseconds that the timer should
 * wait before the specified function or code is executed.
 * @param {*} [arg] An optional argument to be passed to the callback function
 * when the timer expires.
 * @returns {FastTimer}
 */
setFastTimeout (callback, delay, arg) {
return new FastTimer(callback, delay, arg)
},
/**
 * The clearTimeout method cancels an instantiated FastTimer previously
 * created by calling setFastTimeout.
 *
 * @param {FastTimer} timeout
 */
clearFastTimeout (timeout) {
timeout.clear()
},
/**
 * The now method returns the value of the internal fast timer clock.
 *
 * @returns {number}
 */
now () {
return fastNow
},
/**
 * Trigger the onTick function to process the fastTimers array.
 * Exported for testing purposes only.
 * Marking as deprecated to discourage any use outside of testing.
 * @deprecated
 * @param {number} [delay=0] The delay in milliseconds to add to the now value.
 */
tick (delay = 0) {
fastNow += delay - RESOLUTION_MS + 1
onTick()
onTick()
},
/**
 * Reset FastTimers.
 * Exported for testing purposes only.
 * Marking as deprecated to discourage any use outside of testing.
 * @deprecated
 */
reset () {
fastNow = 0
fastTimers.length = 0
clearTimeout(fastNowTimeout)
fastNowTimeout = null
},
/**
 * Exporting for testing purposes only.
 * Marking as deprecated to discourage any use outside of testing.
 * @deprecated
 */
kFastTimer
}
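The timer module's exported surface is easiest to see in use. Below is a minimal usage sketch, not part of the diff; the relative require path is an assumption about where this module sits in the undici tree.

```js
// Usage sketch for the timer API shown above; the path is assumed.
const timers = require('./lib/util/timers')

// Delays above the internal resolution return a FastTimer; shorter delays
// fall back to a native Node.js timeout (see the setTimeout wrapper above).
const handle = timers.setTimeout(() => console.log('fired'), 5000)
timers.clearTimeout(handle) // works for both native timeouts and FastTimers

// setFastTimeout()/clearFastTimeout() always use the FastTimer path.
const fast = timers.setFastTimeout(() => console.log('fast fired'), 250)
timers.clearFastTimeout(fast)
```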
1
node_modules/undici/lib/web/cache/cache.js
generated
vendored
@@ -37,6 +37,7 @@ class Cache {
webidl.illegalConstructor()
}

webidl.util.markAsUncloneable(this)
this.#relevantRequestResponseList = arguments[1]
}
2
node_modules/undici/lib/web/cache/cachestorage.js
generated
vendored
@@ -16,6 +16,8 @@ class CacheStorage {
if (arguments[0] !== kConstruct) {
webidl.illegalConstructor()
}

webidl.util.markAsUncloneable(this)
}

async match (request, options = {}) {
2
node_modules/undici/lib/web/eventsource/eventsource.js
generated
vendored
@@ -105,6 +105,8 @@ class EventSource extends EventTarget {
// 1. Let ev be a new EventSource object.
super()

webidl.util.markAsUncloneable(this)

const prefix = 'EventSource constructor'
webidl.argumentLengthCheck(arguments, 1, prefix)
15
node_modules/undici/lib/web/fetch/body.js
generated
vendored
@@ -20,6 +20,14 @@ const { isErrored, isDisturbed } = require('node:stream')
const { isArrayBuffer } = require('node:util/types')
const { serializeAMimeType } = require('./data-url')
const { multipartFormDataParser } = require('./formdata-parser')
let random

try {
const crypto = require('node:crypto')
random = (max) => crypto.randomInt(0, max)
} catch {
random = (max) => Math.floor(Math.random(max))
}

const textEncoder = new TextEncoder()
function noop () {}
@@ -113,7 +121,7 @@ function extractBody (object, keepalive = false) {
// Set source to a copy of the bytes held by object.
source = new Uint8Array(object.buffer.slice(object.byteOffset, object.byteOffset + object.byteLength))
} else if (util.isFormDataLike(object)) {
const boundary = `----formdata-undici-0${`${Math.floor(Math.random() * 1e11)}`.padStart(11, '0')}`
const boundary = `----formdata-undici-0${`${random(1e11)}`.padStart(11, '0')}`
const prefix = `--${boundary}\r\nContent-Disposition: form-data`

/*! formdata-polyfill. MIT License. Jimmy Wärting <https://jimmy.warting.se/opensource> */
@@ -154,7 +162,10 @@ function extractBody (object, keepalive = false) {
}
}

const chunk = textEncoder.encode(`--${boundary}--`)
// CRLF is appended to the body to function with legacy servers and match other implementations.
// https://github.com/curl/curl/blob/3434c6b46e682452973972e8313613dfa58cd690/lib/mime.c#L1029-L1030
// https://github.com/form-data/form-data/issues/63
const chunk = textEncoder.encode(`--${boundary}--\r\n`)
blobParts.push(chunk)
length += chunk.byteLength
if (hasUnknownSizeValue) {
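The new `random` helper above prefers `crypto.randomInt` and only falls back when `node:crypto` is unavailable. A standalone sketch of the same pattern, not part of the diff (note the fallback here uses the conventional `Math.random() * max` form):

```js
// Sketch of the boundary-randomness pattern introduced above.
let random
try {
  const crypto = require('node:crypto')
  random = (max) => crypto.randomInt(0, max) // integer in [0, max)
} catch {
  random = (max) => Math.floor(Math.random() * max) // non-cryptographic fallback
}

// Eleven zero-padded digits, matching the multipart boundary template above.
const boundary = `----formdata-undici-0${`${random(1e11)}`.padStart(11, '0')}`
console.log(boundary) // e.g. ----formdata-undici-012345678901
```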
61
node_modules/undici/lib/web/fetch/constants.js
generated
vendored
@@ -1,27 +1,30 @@
'use strict'

const corsSafeListedMethods = ['GET', 'HEAD', 'POST']
const corsSafeListedMethods = /** @type {const} */ (['GET', 'HEAD', 'POST'])
const corsSafeListedMethodsSet = new Set(corsSafeListedMethods)

const nullBodyStatus = [101, 204, 205, 304]
const nullBodyStatus = /** @type {const} */ ([101, 204, 205, 304])

const redirectStatus = [301, 302, 303, 307, 308]
const redirectStatus = /** @type {const} */ ([301, 302, 303, 307, 308])
const redirectStatusSet = new Set(redirectStatus)

// https://fetch.spec.whatwg.org/#block-bad-port
const badPorts = [
/**
 * @see https://fetch.spec.whatwg.org/#block-bad-port
 */
const badPorts = /** @type {const} */ ([
'1', '7', '9', '11', '13', '15', '17', '19', '20', '21', '22', '23', '25', '37', '42', '43', '53', '69', '77', '79',
'87', '95', '101', '102', '103', '104', '109', '110', '111', '113', '115', '117', '119', '123', '135', '137',
'139', '143', '161', '179', '389', '427', '465', '512', '513', '514', '515', '526', '530', '531', '532',
'540', '548', '554', '556', '563', '587', '601', '636', '989', '990', '993', '995', '1719', '1720', '1723',
'2049', '3659', '4045', '4190', '5060', '5061', '6000', '6566', '6665', '6666', '6667', '6668', '6669', '6679',
'6697', '10080'
]

])
const badPortsSet = new Set(badPorts)

// https://w3c.github.io/webappsec-referrer-policy/#referrer-policies
const referrerPolicy = [
/**
 * @see https://w3c.github.io/webappsec-referrer-policy/#referrer-policies
 */
const referrerPolicy = /** @type {const} */ ([
'',
'no-referrer',
'no-referrer-when-downgrade',
@@ -31,29 +34,31 @@ const referrerPolicy = [
'origin-when-cross-origin',
'strict-origin-when-cross-origin',
'unsafe-url'
]
])
const referrerPolicySet = new Set(referrerPolicy)

const requestRedirect = ['follow', 'manual', 'error']
const requestRedirect = /** @type {const} */ (['follow', 'manual', 'error'])

const safeMethods = ['GET', 'HEAD', 'OPTIONS', 'TRACE']
const safeMethods = /** @type {const} */ (['GET', 'HEAD', 'OPTIONS', 'TRACE'])
const safeMethodsSet = new Set(safeMethods)

const requestMode = ['navigate', 'same-origin', 'no-cors', 'cors']
const requestMode = /** @type {const} */ (['navigate', 'same-origin', 'no-cors', 'cors'])

const requestCredentials = ['omit', 'same-origin', 'include']
const requestCredentials = /** @type {const} */ (['omit', 'same-origin', 'include'])

const requestCache = [
const requestCache = /** @type {const} */ ([
'default',
'no-store',
'reload',
'no-cache',
'force-cache',
'only-if-cached'
]
])

// https://fetch.spec.whatwg.org/#request-body-header-name
const requestBodyHeader = [
/**
 * @see https://fetch.spec.whatwg.org/#request-body-header-name
 */
const requestBodyHeader = /** @type {const} */ ([
'content-encoding',
'content-language',
'content-location',
@@ -63,18 +68,22 @@ const requestBodyHeader = [
// removed in the Headers implementation. However, undici doesn't
// filter out headers, so we add it here.
'content-length'
]
])

// https://fetch.spec.whatwg.org/#enumdef-requestduplex
const requestDuplex = [
/**
 * @see https://fetch.spec.whatwg.org/#enumdef-requestduplex
 */
const requestDuplex = /** @type {const} */ ([
'half'
]
])

// http://fetch.spec.whatwg.org/#forbidden-method
const forbiddenMethods = ['CONNECT', 'TRACE', 'TRACK']
/**
 * @see http://fetch.spec.whatwg.org/#forbidden-method
 */
const forbiddenMethods = /** @type {const} */ (['CONNECT', 'TRACE', 'TRACK'])
const forbiddenMethodsSet = new Set(forbiddenMethods)

const subresource = [
const subresource = /** @type {const} */ ([
'audio',
'audioworklet',
'font',
@@ -87,7 +96,7 @@ const subresource = [
'video',
'xslt',
''
]
])
const subresourceSet = new Set(subresource)

module.exports = {
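The only functional change in constants.js is wrapping the literal arrays in `/** @type {const} */ (...)`, the JSDoc spelling of TypeScript's `as const` assertion, so type-checked JavaScript can infer readonly literal tuples instead of plain `string[]`/`number[]`. A small sketch of the effect, assuming the file is checked with TypeScript's `checkJs`:

```js
// Sketch of the JSDoc const-assertion pattern used above.
const redirectStatus = /** @type {const} */ ([301, 302, 303, 307, 308])
const redirectStatusSet = new Set(redirectStatus)

// Under checkJs, redirectStatus is inferred as a readonly tuple of those
// exact numbers rather than number[], so the Set's element type is the
// union 301 | 302 | 303 | 307 | 308.
console.log(redirectStatusSet.has(302)) // true
```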
14
node_modules/undici/lib/web/fetch/formdata-parser.js
generated
vendored
@@ -87,11 +87,21 @@ function multipartFormDataParser (input, mimeType) {
// the first byte.
const position = { position: 0 }

// Note: undici addition, allow \r\n before the body.
if (input[0] === 0x0d && input[1] === 0x0a) {
// Note: undici addition, allows leading and trailing CRLFs.
while (input[position.position] === 0x0d && input[position.position + 1] === 0x0a) {
position.position += 2
}

let trailing = input.length

while (input[trailing - 1] === 0x0a && input[trailing - 2] === 0x0d) {
trailing -= 2
}

if (trailing !== input.length) {
input = input.subarray(0, trailing)
}

// 5. While true:
while (true) {
// 5.1. If position points to a sequence of bytes starting with 0x2D 0x2D
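The change above generalizes the old single leading-CRLF check: any number of leading and trailing CRLF pairs is now skipped before parsing. A standalone sketch of that trimming applied to a `Uint8Array` (not undici's own helper):

```js
// Sketch of the CRLF trimming logic added above.
function trimCrlf (input) {
  let start = 0
  // Skip any number of leading \r\n pairs (0x0d 0x0a).
  while (input[start] === 0x0d && input[start + 1] === 0x0a) {
    start += 2
  }

  let end = input.length
  // Skip any number of trailing \r\n pairs.
  while (input[end - 1] === 0x0a && input[end - 2] === 0x0d) {
    end -= 2
  }

  return start === 0 && end === input.length ? input : input.subarray(start, end)
}

console.log(trimCrlf(new TextEncoder().encode('\r\nabc\r\n'))) // bytes of 'abc'
```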
2
node_modules/undici/lib/web/fetch/formdata.js
generated
vendored
@@ -14,6 +14,8 @@ const File = globalThis.File ?? NativeFile
// https://xhr.spec.whatwg.org/#formdata
class FormData {
constructor (form) {
webidl.util.markAsUncloneable(this)

if (form !== undefined) {
throw webidl.errors.conversionFailed({
prefix: 'FormData constructor',
2
node_modules/undici/lib/web/fetch/headers.js
generated
vendored
@@ -359,6 +359,8 @@ class Headers {
#headersList

constructor (init = undefined) {
webidl.util.markAsUncloneable(this)

if (init === kConstruct) {
return
}
22
node_modules/undici/lib/web/fetch/index.js
generated
vendored
@@ -2137,7 +2137,7 @@ async function httpNetworkFetch (

// https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Encoding
if (codings.length !== 0 && request.method !== 'HEAD' && request.method !== 'CONNECT' && !nullBodyStatus.includes(status) && !willFollow) {
for (let i = 0; i < codings.length; ++i) {
for (let i = codings.length - 1; i >= 0; --i) {
const coding = codings[i]
// https://www.rfc-editor.org/rfc/rfc9112.html#section-7.2
if (coding === 'x-gzip' || coding === 'gzip') {
@@ -2150,9 +2150,15 @@ async function httpNetworkFetch (
finishFlush: zlib.constants.Z_SYNC_FLUSH
}))
} else if (coding === 'deflate') {
decoders.push(createInflate())
decoders.push(createInflate({
flush: zlib.constants.Z_SYNC_FLUSH,
finishFlush: zlib.constants.Z_SYNC_FLUSH
}))
} else if (coding === 'br') {
decoders.push(zlib.createBrotliDecompress())
decoders.push(zlib.createBrotliDecompress({
flush: zlib.constants.BROTLI_OPERATION_FLUSH,
finishFlush: zlib.constants.BROTLI_OPERATION_FLUSH
}))
} else {
decoders.length = 0
break
@@ -2160,13 +2166,19 @@ async function httpNetworkFetch (
}
}

const onError = this.onError.bind(this)

resolve({
status,
statusText,
headersList,
body: decoders.length
? pipeline(this.body, ...decoders, () => { })
: this.body.on('error', () => { })
? pipeline(this.body, ...decoders, (err) => {
if (err) {
this.onError(err)
}
}).on('error', onError)
: this.body.on('error', onError)
})

return true
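Two things change above: decoders are now created in reverse `Content-Encoding` order (the header lists codings in the order they were applied, so they must be undone back to front), and decompression errors now propagate through `onError` instead of being swallowed. A trimmed sketch of the ordering logic, handling only gzip and brotli (the deflate branch in the diff uses undici's own `createInflate`):

```js
// Sketch: build zlib decoders for a Content-Encoding header, last coding first.
const zlib = require('node:zlib')

function buildDecoders (contentEncoding) {
  const codings = contentEncoding.toLowerCase().split(',').map((c) => c.trim())
  const decoders = []
  for (let i = codings.length - 1; i >= 0; --i) {
    const coding = codings[i]
    if (coding === 'gzip' || coding === 'x-gzip') {
      decoders.push(zlib.createGunzip({
        flush: zlib.constants.Z_SYNC_FLUSH,
        finishFlush: zlib.constants.Z_SYNC_FLUSH
      }))
    } else if (coding === 'br') {
      decoders.push(zlib.createBrotliDecompress({
        flush: zlib.constants.BROTLI_OPERATION_FLUSH,
        finishFlush: zlib.constants.BROTLI_OPERATION_FLUSH
      }))
    } else {
      return [] // unknown coding: leave the body untouched
    }
  }
  // For "gzip, br" this yields [BrotliDecompress, Gunzip], the order needed
  // when piping the response body through stream.pipeline().
  return decoders
}
```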
1
node_modules/undici/lib/web/fetch/request.js
generated
vendored
@@ -82,6 +82,7 @@ let patchMethodWarning = false
class Request {
// https://fetch.spec.whatwg.org/#dom-request
constructor (input, init = {}) {
webidl.util.markAsUncloneable(this)
if (input === kConstruct) {
return
}
1
node_modules/undici/lib/web/fetch/response.js
generated
vendored
@@ -110,6 +110,7 @@ class Response {

// https://fetch.spec.whatwg.org/#dom-response
constructor (body = null, init = {}) {
webidl.util.markAsUncloneable(this)
if (body === kConstruct) {
return
}
20
node_modules/undici/lib/web/fetch/util.js
generated
vendored
@@ -1340,6 +1340,14 @@ function buildContentRange (rangeStart, rangeEnd, fullLength) {
// interpreted as a zlib stream, otherwise it's interpreted as a
// raw deflate stream.
class InflateStream extends Transform {
#zlibOptions

/** @param {zlib.ZlibOptions} [zlibOptions] */
constructor (zlibOptions) {
super()
this.#zlibOptions = zlibOptions
}

_transform (chunk, encoding, callback) {
if (!this._inflateStream) {
if (chunk.length === 0) {
@@ -1347,8 +1355,8 @@ class InflateStream extends Transform {
return
}
this._inflateStream = (chunk[0] & 0x0F) === 0x08
? zlib.createInflate()
: zlib.createInflateRaw()
? zlib.createInflate(this.#zlibOptions)
: zlib.createInflateRaw(this.#zlibOptions)

this._inflateStream.on('data', this.push.bind(this))
this._inflateStream.on('end', () => this.push(null))
@@ -1367,8 +1375,12 @@ class InflateStream extends Transform {
}
}

function createInflate () {
return new InflateStream()
/**
 * @param {zlib.ZlibOptions} [zlibOptions]
 * @returns {InflateStream}
 */
function createInflate (zlibOptions) {
return new InflateStream(zlibOptions)
}

/**
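`InflateStream` now threads `zlibOptions` through to whichever inflater it lazily creates. The choice between the two is made by sniffing the first byte: a zlib-wrapped deflate stream starts with a CMF byte whose low nibble is 8, while raw deflate carries no such header. A standalone sketch of that detection:

```js
// Sketch of the zlib-vs-raw-deflate sniffing used by InflateStream above.
const zlib = require('node:zlib')

function createDeflateDecoder (firstChunk, zlibOptions) {
  const looksZlibWrapped = (firstChunk[0] & 0x0f) === 0x08
  return looksZlibWrapped
    ? zlib.createInflate(zlibOptions)
    : zlib.createInflateRaw(zlibOptions)
}

// zlib.deflateSync output starts with 0x78, so the wrapped path is taken here.
const decoder = createDeflateDecoder(zlib.deflateSync('hi'), {
  flush: zlib.constants.Z_SYNC_FLUSH
})
console.log(decoder.constructor.name) // 'Inflate'
```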
2
node_modules/undici/lib/web/fetch/webidl.js
generated
vendored
@@ -1,6 +1,7 @@
'use strict'

const { types, inspect } = require('node:util')
const { markAsUncloneable } = require('node:worker_threads')
const { toUSVString } = require('../../core/util')

/** @type {import('../../../types/webidl').Webidl} */
@@ -86,6 +87,7 @@ webidl.util.Type = function (V) {
}
}

webidl.util.markAsUncloneable = markAsUncloneable || (() => {})
// https://webidl.spec.whatwg.org/#abstract-opdef-converttoint
webidl.util.ConvertToInt = function (V, bitLength, signedness, opts) {
let upperBound
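The webidl change is a feature detection: `markAsUncloneable` only exists in newer Node.js releases, so older runtimes fall back to a no-op. A minimal sketch of that pattern:

```js
// Sketch of the markAsUncloneable feature detection shown above.
const { markAsUncloneable } = require('node:worker_threads')

// Older Node.js versions export no such function; destructuring yields undefined.
const safeMarkAsUncloneable = markAsUncloneable || (() => {})

class Example {
  constructor () {
    // Where supported, structuredClone(new Example()) will now throw.
    safeMarkAsUncloneable(this)
  }
}
```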
4
node_modules/undici/lib/web/websocket/events.js
generated
vendored
@@ -14,6 +14,7 @@ class MessageEvent extends Event {
constructor (type, eventInitDict = {}) {
if (type === kConstruct) {
super(arguments[1], arguments[2])
webidl.util.markAsUncloneable(this)
return
}

@@ -26,6 +27,7 @@ class MessageEvent extends Event {
super(type, eventInitDict)

this.#eventInit = eventInitDict
webidl.util.markAsUncloneable(this)
}

get data () {
@@ -112,6 +114,7 @@ class CloseEvent extends Event {
super(type, eventInitDict)

this.#eventInit = eventInitDict
webidl.util.markAsUncloneable(this)
}

get wasClean () {
@@ -142,6 +145,7 @@ class ErrorEvent extends Event {
webidl.argumentLengthCheck(arguments, 1, prefix)

super(type, eventInitDict)
webidl.util.markAsUncloneable(this)

type = webidl.converters.DOMString(type, prefix, 'type')
eventInitDict = webidl.converters.ErrorEventInit(eventInitDict ?? {})
2
node_modules/undici/lib/web/websocket/websocket.js
generated
vendored
@@ -51,6 +51,8 @@ class WebSocket extends EventTarget {
constructor (url, protocols = []) {
super()

webidl.util.markAsUncloneable(this)

const prefix = 'WebSocket constructor'
webidl.argumentLengthCheck(arguments, 1, prefix)
7
node_modules/undici/package.json
generated
vendored
@@ -1,6 +1,6 @@
{
"name": "undici",
"version": "6.19.8",
"version": "6.21.3",
"description": "An HTTP/1.1 client, written from scratch for Node.js",
"homepage": "https://undici.nodejs.org",
"bugs": {
@@ -78,6 +78,9 @@
"test:fuzzing": "node test/fuzzing/fuzzing.test.js",
"test:fetch": "npm run build:node && npm run test:fetch:nobuild",
"test:fetch:nobuild": "borp --timeout 180000 --expose-gc --concurrency 1 -p \"test/fetch/*.js\" && npm run test:webidl && npm run test:busboy",
"test:h2": "npm run test:h2:core && npm run test:h2:fetch",
"test:h2:core": "borp -p \"test/http2*.js\"",
"test:h2:fetch": "npm run build:node && borp -p \"test/fetch/http2*.js\"",
"test:interceptors": "borp -p \"test/interceptors/*.js\"",
"test:jest": "cross-env NODE_V8_COVERAGE= jest",
"test:unit": "borp --expose-gc -p \"test/*.js\"",
@@ -105,7 +108,7 @@
"@fastify/busboy": "2.1.1",
"@matteo.collina/tspl": "^0.1.1",
"@sinonjs/fake-timers": "^11.1.0",
"@types/node": "^18.0.3",
"@types/node": "~18.19.50",
"abort-controller": "^3.0.0",
"borp": "^0.15.0",
"c8": "^10.0.0",
1
node_modules/undici/types/dispatcher.d.ts
generated
vendored
@@ -244,6 +244,7 @@ declare namespace Dispatcher {
readonly bodyUsed: boolean;
arrayBuffer(): Promise<ArrayBuffer>;
blob(): Promise<Blob>;
bytes(): Promise<Uint8Array>;
formData(): Promise<never>;
json(): Promise<unknown>;
text(): Promise<string>;
2
node_modules/undici/types/eventsource.d.ts
generated
vendored
@@ -2,8 +2,6 @@ import { MessageEvent, ErrorEvent } from './websocket'
import Dispatcher from './dispatcher'

import {
EventTarget,
Event,
EventListenerOptions,
AddEventListenerOptions,
EventListenerOrEventListenerObject
2
node_modules/undici/types/filereader.d.ts
generated
vendored
@@ -1,7 +1,7 @@
/// <reference types="node" />

import { Blob } from 'buffer'
import { DOMException, Event, EventInit, EventTarget } from './patch'
import { DOMException, EventInit } from './patch'

export declare class FileReader {
__proto__: EventTarget & FileReader
19
node_modules/undici/types/interceptors.d.ts
generated
vendored
@@ -1,3 +1,5 @@
import { LookupOptions } from 'node:dns'

import Dispatcher from "./dispatcher";
import RetryHandler from "./retry-handler";

@@ -7,9 +9,24 @@ declare namespace Interceptors {
export type DumpInterceptorOpts = { maxSize?: number }
export type RetryInterceptorOpts = RetryHandler.RetryOptions
export type RedirectInterceptorOpts = { maxRedirections?: number }

export type ResponseErrorInterceptorOpts = { throwOnError: boolean }

// DNS interceptor
export type DNSInterceptorRecord = { address: string, ttl: number, family: 4 | 6 }
export type DNSInterceptorOriginRecords = { 4: { ips: DNSInterceptorRecord[] } | null, 6: { ips: DNSInterceptorRecord[] } | null }
export type DNSInterceptorOpts = {
maxTTL?: number
maxItems?: number
lookup?: (hostname: string, options: LookupOptions, callback: (err: NodeJS.ErrnoException | null, addresses: DNSInterceptorRecord[]) => void) => void
pick?: (origin: URL, records: DNSInterceptorOriginRecords, affinity: 4 | 6) => DNSInterceptorRecord
dualStack?: boolean
affinity?: 4 | 6
}

export function createRedirectInterceptor(opts: RedirectInterceptorOpts): Dispatcher.DispatcherComposeInterceptor
export function dump(opts?: DumpInterceptorOpts): Dispatcher.DispatcherComposeInterceptor
export function retry(opts?: RetryInterceptorOpts): Dispatcher.DispatcherComposeInterceptor
export function redirect(opts?: RedirectInterceptorOpts): Dispatcher.DispatcherComposeInterceptor
export function responseError(opts?: ResponseErrorInterceptorOpts): Dispatcher.DispatcherComposeInterceptor
export function dns (opts?: DNSInterceptorOpts): Dispatcher.DispatcherComposeInterceptor
}
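The new `dns()` export and `DNSInterceptorOpts` shape above are meant to be composed onto a dispatcher. A hedged usage sketch (option values are illustrative only):

```js
// Usage sketch for the dns interceptor typed above.
const { Agent, interceptors } = require('undici')

const agent = new Agent().compose(
  interceptors.dns({
    maxTTL: 30_000, // cache resolved addresses for up to 30 seconds
    maxItems: 100,  // cap the number of cached origins
    dualStack: true // keep both IPv4 and IPv6 records
  })
)

// The composed dispatcher resolves DNS once per origin and reuses the result:
// await agent.request({ origin: 'https://example.com', path: '/', method: 'GET' })
```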
38
node_modules/undici/types/patch.d.ts
generated
vendored
@@ -6,44 +6,6 @@ export type DOMException = typeof globalThis extends { DOMException: infer T }
? T
: any

export type EventTarget = typeof globalThis extends { EventTarget: infer T }
? T
: {
addEventListener(
type: string,
listener: any,
options?: any,
): void
dispatchEvent(event: Event): boolean
removeEventListener(
type: string,
listener: any,
options?: any | boolean,
): void
}

export type Event = typeof globalThis extends { Event: infer T }
? T
: {
readonly bubbles: boolean
cancelBubble: () => void
readonly cancelable: boolean
readonly composed: boolean
composedPath(): [EventTarget?]
readonly currentTarget: EventTarget | null
readonly defaultPrevented: boolean
readonly eventPhase: 0 | 2
readonly isTrusted: boolean
preventDefault(): void
returnValue: boolean
readonly srcElement: EventTarget | null
stopImmediatePropagation(): void
stopPropagation(): void
readonly target: EventTarget | null
readonly timeStamp: number
readonly type: string
}

export interface EventInit {
bubbles?: boolean
cancelable?: boolean
5
node_modules/undici/types/readable.d.ts
generated
vendored
@@ -25,6 +25,11 @@ declare class BodyReadable extends Readable {
*/
blob(): Promise<Blob>

/** Consumes and returns the body as an Uint8Array
 * https://fetch.spec.whatwg.org/#dom-body-bytes
 */
bytes(): Promise<Uint8Array>

/** Consumes and returns the body as an ArrayBuffer
 * https://fetch.spec.whatwg.org/#dom-body-arraybuffer
 */
2
node_modules/undici/types/retry-handler.d.ts
generated
vendored
@@ -32,7 +32,7 @@ declare namespace RetryHandler {
};
},
callback: OnRetryCallback
) => number | null;
) => void

export interface RetryOptions {
/**
6
node_modules/undici/types/webidl.d.ts
generated
vendored
@@ -67,6 +67,12 @@ interface WebidlUtil {
 * Stringifies {@param V}
 */
Stringify (V: any): string

/**
 * Mark a value as uncloneable for Node.js.
 * This is only effective in some newer Node.js versions.
 */
markAsUncloneable (V: any): void
}

interface WebidlConverters {
2
node_modules/undici/types/websocket.d.ts
generated
vendored
@@ -3,8 +3,6 @@
import type { Blob } from 'buffer'
import type { MessagePort } from 'worker_threads'
import {
EventTarget,
Event,
EventInit,
EventListenerOptions,
AddEventListenerOptions,