Request HTTP URLs in a complex world - basic and digest authentication, redirections, timeout and more.
npm install urllib
import { request } from 'urllib';
const { data, res } = await request('http://cnodejs.org/');
// result: { data: Buffer, res: Response }
console.log('status: %s, body size: %d, headers: %j', res.status, data.length, res.headers);
const { request } = require('urllib');
const { data, res } = await request('http://cnodejs.org/');
// result: { data: Buffer, res: Response }
console.log('status: %s, body size: %d, headers: %j', res.status, data.length, res.headers);
- url String | Object - The URL to request, either a String or an Object returned by url.parse.
- options Object - Optional
  - method String - Request method, defaults to GET. Could be GET, POST, DELETE or PUT. Alias: 'type'.
  - data Object - Data to be sent. Will be stringified automatically.
  - content String | Buffer - Manually set the payload content. If set, data will be ignored.
  - stream stream.Readable - Stream to be piped to the remote. If set, data and content will be ignored.
  - writeStream stream.Writable - A writable stream to be piped by the response stream. Response data will be written to this stream and the result will come back with data set to null after writing finishes (see the download sketch after this list).
  - files Array<ReadStream|Buffer|String> | Object | ReadStream | Buffer | String - The files to send in multipart/form-data format, based on formstream. If method is not set, POST will be used by default.
  - contentType String - Type of request data. Could be json (note: not application/json here). If it's json, the Content-Type: application/json header will be set automatically.
  - dataType String - Type of response data. Could be text or json. If it's text, the returned data will be a String. If it's json, the returned data will be a parsed JSON Object and the Accept: application/json header will be set automatically. By default the returned data is a Buffer.
  - fixJSONCtlChars Boolean - Fix control characters (U+0000 through U+001F) before JSON-parsing the response. Default is false.
  - headers Object - Request headers.
  - timeout Number | Array - Request timeout in milliseconds for the connecting phase and the response-receiving phase. Default is 5000. Use timeout: 5000 to apply the same timeout to both phases, or set them separately, such as timeout: [3000, 5000], which sets the connect timeout to 3s and the response timeout to 5s (see the example after this list).
  - keepAliveTimeout Number | null - Default is 4000 (4 seconds). The timeout after which a socket without active requests will time out; it monitors the time between activity on a connected socket. This value may be overridden by keep-alive hints from the server. See MDN: HTTP - Headers - Keep-Alive directives for more details.
  - auth String - username:password used in HTTP Basic Authorization.
  - digestAuth String - username:password used in HTTP Digest Authorization.
  - followRedirect Boolean - Follow HTTP 3xx responses as redirects. Defaults to true.
  - maxRedirects Number - The maximum number of redirects to follow. Defaults to 10.
  - formatRedirectUrl Function - Format the redirect URL yourself. Default is url.resolve(from, to).
  - beforeRequest Function - Before-request hook, you can change everything here.
  - streaming Boolean - Lets you get the res object as soon as the request is connected. Default is false. Alias: customResponse.
  - compressed Boolean - Accept gzip, br response content and decode it automatically. Default is false.
  - timing Boolean - Enable timing or not. Default is true.
  - socketPath String | null - Request a Unix domain socket service. Default is null.
  - highWaterMark Number - Default is 67108864 (64 MiB).
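A minimal sketch combining several of the options above, with a placeholder endpoint and placeholder credentials: a split connect/response timeout, JSON response parsing via dataType, and HTTP Basic Authorization via auth.

import { request } from 'urllib';

// Placeholder endpoint and credentials, for illustration only.
const { data, res } = await request('https://example.com/api/user', {
  method: 'GET',
  dataType: 'json',      // parse the response body as JSON and send Accept: application/json
  timeout: [3000, 5000], // 3s connect timeout, 5s response timeout
  auth: 'user:password', // HTTP Basic Authorization
  headers: {
    'user-agent': 'my-app/1.0.0',
  },
});
console.log(res.status, data);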
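And a sketch of the writeStream option: the response body is piped to a local file (the target path is a placeholder), so data in the result should be null once writing finishes.

import { createWriteStream } from 'node:fs';
import { request } from 'urllib';

// Pipe the response body straight to disk instead of buffering it in memory.
const { data, res } = await request('https://example.com/large-file', {
  writeStream: createWriteStream('./large-file.bin'), // placeholder target path
});
console.log('status: %s, data: %s', res.status, data); // data is null after the stream finishes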
When making a request with options.data:
await request('https://example.com', {
method: 'GET',
data: {
'a': 'hello',
'b': 'world',
},
});
For a GET request, data will be stringified into a query string, e.g. http://example.com/?a=hello&b=world.

For other methods such as POST, PATCH or PUT, data will by default be stringified into application/x-www-form-urlencoded format if the content-type header is not set.
If content-type is application/json, data will be serialized with JSON.stringify into a JSON body.
options.content is useful when you wish to construct the request body yourself, for example when making a content-type: application/json request. Note that if you want to send a JSON body this way, you should stringify it yourself:
await request('https://example.com', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
content: JSON.stringify({
a: 'hello',
b: 'world',
}),
});
It will make an HTTP request like this:
POST / HTTP/1.1
host: example.com
content-type: application/json
{
"a": "hello",
"b": "world"
}
This example can also be written with options.data and the application/json content type:
await request('https://example.com', {
method: 'POST',
headers: {
'content-type': 'application/json'
},
data: {
a: 'hello',
b: 'world',
}
});
Upload a file with a hello field.
await request('https://example.com/upload', {
method: 'POST',
files: __filename,
data: {
hello: 'hello urllib',
},
});
Upload multiple files with a hello field.
await request('https://example.com/upload', {
method: 'POST',
files: [
__filename,
fs.createReadStream(__filename),
Buffer.from('mock file content'),
],
data: {
hello: 'hello urllib with multi files',
},
});
Use a custom file field name, uploadfile.
await request('https://example.com/upload', {
method: 'POST',
files: {
uploadfile: __filename,
},
});
The response is a normal object. It contains:

- status or statusCode: response status code.
  - -1 means a network error such as ENOTFOUND
  - -2 means ConnectionTimeoutError
- headers: response HTTP headers, default is {}
- size: response size
- aborted: whether the response was aborted
- rt: total request and response time in ms
- timing: timing object if timing is enabled
- socket: socket info
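A minimal sketch (placeholder URL) that reads a few of these fields from the awaited result:

const { data, res } = await request('https://example.com/');
console.log(res.status);  // e.g. 200; -1 / -2 indicate a network error or connect timeout
console.log(res.headers); // plain object of response headers
console.log(res.size);    // response size
console.log(res.rt);      // total request and response time in ms
console.log(res.timing);  // per-phase timing, present when the timing option is enabled
console.log(data.length); // body is a Buffer by default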
To enable the urllib debug log, set NODE_DEBUG=urllib:*, for example:

NODE_DEBUG=urllib:* npm test
Create an HttpClient with options.allowH2 = true to enable HTTP/2 requests:
import { HttpClient } from 'urllib';
const httpClient = new HttpClient({
allowH2: true,
});
const response = await httpClient.request('https://nodejs.org');
console.log(response.status);
console.log(response.headers);
MockAgent and setGlobalDispatcher are re-exported from undici and can be used to mock requests in tests:
import { strict as assert } from 'assert';
import { MockAgent, setGlobalDispatcher, request } from 'urllib';
const mockAgent = new MockAgent();
setGlobalDispatcher(mockAgent);
const mockPool = mockAgent.get('http://localhost:7001');
mockPool.intercept({
path: '/foo',
method: 'POST',
}).reply(400, {
message: 'mock 400 bad request',
});
const response = await request('http://localhost:7001/foo', {
method: 'POST',
dataType: 'json',
});
assert.equal(response.status, 400);
assert.deepEqual(response.data, { message: 'mock 400 bad request' });
ProxyAgent is also re-exported from undici and can be used to send requests through an HTTP proxy:
import { ProxyAgent, request } from 'urllib';
const proxyAgent = new ProxyAgent('http://my.proxy.com:8080');
const response = await request('https://www.npmjs.com/package/urllib', {
dispatcher: proxyAgent,
});
console.log(response.status, response.headers);
undici@6.19.2
Node.js v18.20.3
| (index) | Tests | Samples | Result | Tolerance | Difference with slowest |
| --- | --- | --- | --- | --- | --- |
| 0 | urllib2 - request | 10 | 321.53 req/sec | ± 0.38 % | - |
| 1 | http - no keepalive | 10 | 607.77 req/sec | ± 0.80 % | + 89.02 % |
| 2 | got | 101 | 7929.51 req/sec | ± 4.46 % | + 2366.15 % |
| 3 | node-fetch | 40 | 8651.95 req/sec | ± 2.99 % | + 2590.84 % |
| 4 | request | 101 | 8864.09 req/sec | ± 7.81 % | + 2656.82 % |
| 5 | undici - fetch | 101 | 9607.01 req/sec | ± 4.23 % | + 2887.87 % |
| 6 | axios | 55 | 10378.80 req/sec | ± 2.94 % | + 3127.91 % |
| 7 | superagent | 75 | 11286.74 req/sec | ± 2.90 % | + 3410.29 % |
| 8 | http - keepalive | 60 | 11288.96 req/sec | ± 2.95 % | + 3410.98 % |
| 9 | urllib4 - request | 101 | 11352.65 req/sec | ± 10.20 % | + 3430.79 % |
| 10 | urllib3 - request | 40 | 13831.19 req/sec | ± 2.89 % | + 4201.64 % |
| 11 | undici - pipeline | 60 | 14562.44 req/sec | ± 2.91 % | + 4429.06 % |
| 12 | undici - request | 70 | 19630.64 req/sec | ± 2.87 % | + 6005.32 % |
| 13 | undici - stream | 55 | 20843.50 req/sec | ± 2.90 % | + 6382.54 % |
| 14 | undici - dispatch | 55 | 21233.10 req/sec | ± 2.82 % | + 6503.70 % |
Node.js v20.15.0
| (index) | Tests | Samples | Result | Tolerance | Difference with slowest |
| --- | --- | --- | --- | --- | --- |
| 0 | urllib2 - request | 10 | 332.91 req/sec | ± 1.13 % | - |
| 1 | http - no keepalive | 10 | 615.50 req/sec | ± 2.25 % | + 84.88 % |
| 2 | got | 55 | 7658.39 req/sec | ± 2.98 % | + 2200.42 % |
| 3 | node-fetch | 30 | 7832.96 req/sec | ± 2.96 % | + 2252.86 % |
| 4 | axios | 40 | 8607.27 req/sec | ± 2.79 % | + 2485.44 % |
| 5 | request | 35 | 8703.49 req/sec | ± 2.84 % | + 2514.35 % |
| 6 | undici - fetch | 65 | 9971.24 req/sec | ± 2.96 % | + 2895.15 % |
| 7 | superagent | 30 | 11006.46 req/sec | ± 2.90 % | + 3206.11 % |
| 8 | http - keepalive | 55 | 11610.14 req/sec | ± 2.87 % | + 3387.44 % |
| 9 | urllib3 - request | 25 | 13873.38 req/sec | ± 2.96 % | + 4067.27 % |
| 10 | urllib4 - request | 25 | 14291.36 req/sec | ± 2.92 % | + 4192.82 % |
| 11 | undici - pipeline | 45 | 14617.69 req/sec | ± 2.84 % | + 4290.85 % |
| 12 | undici - dispatch | 101 | 18716.29 req/sec | ± 3.97 % | + 5521.98 % |
| 13 | undici - request | 101 | 19165.16 req/sec | ± 3.25 % | + 5656.81 % |
| 14 | undici - stream | 30 | 21816.28 req/sec | ± 2.99 % | + 6453.15 % |
Node.js v22.3.0
| (index) | Tests | Samples | Result | Tolerance | Difference with slowest |
| --- | --- | --- | --- | --- | --- |
| 0 | urllib2 - request | 15 | 297.46 req/sec | ± 2.65 % | - |
| 1 | http - no keepalive | 10 | 598.25 req/sec | ± 1.94 % | + 101.12 % |
| 2 | axios | 30 | 8487.94 req/sec | ± 2.91 % | + 2753.52 % |
| 3 | got | 50 | 10054.46 req/sec | ± 2.89 % | + 3280.16 % |
| 4 | request | 45 | 10306.02 req/sec | ± 2.87 % | + 3364.73 % |
| 5 | node-fetch | 55 | 11160.02 req/sec | ± 2.87 % | + 3651.83 % |
| 6 | superagent | 80 | 11302.28 req/sec | ± 2.85 % | + 3699.66 % |
| 7 | undici - fetch | 60 | 11357.87 req/sec | ± 2.89 % | + 3718.35 % |
| 8 | http - keepalive | 60 | 13782.10 req/sec | ± 2.97 % | + 4533.34 % |
| 9 | urllib4 - request | 70 | 15965.62 req/sec | ± 2.88 % | + 5267.40 % |
| 10 | urllib3 - request | 55 | 16010.37 req/sec | ± 2.90 % | + 5282.45 % |
| 11 | undici - pipeline | 35 | 17969.37 req/sec | ± 2.95 % | + 5941.03 % |
| 12 | undici - dispatch | 101 | 18765.50 req/sec | ± 3.01 % | + 6208.68 % |
| 13 | undici - request | 85 | 20091.12 req/sec | ± 2.95 % | + 6654.33 % |
| 14 | undici - stream | 45 | 21599.12 req/sec | ± 2.81 % | + 7161.30 % |