Streams in Node.js
Created on: Aug 16, 2024
Streams are objects that let you read data from a source or write data to a destination in a continuous manner. They are particularly useful for working with large amounts of data, such as files or network requests, where it might not be efficient or even possible to load the entire data into memory at once.
Let's understand this with an example, using a Node.js TypeScript project. Clone the project from GitHub.
Create a server that accepts incoming requests.
```ts
import { createServer, IncomingMessage, ServerResponse } from "http";

const server = createServer((req: IncomingMessage, res: ServerResponse) => {
  let body = "";
  req.on("data", (chunk: Uint8Array) => {
    console.log(chunk);
    body += chunk.toString();
  });
  req.on("end", () => {
    console.log("Received body:", body);
    res.end("ok");
  });
});

server.listen(3000, () => {
  console.log("Server is listening on port 3000");
});
```
Add a script to package.json:
"httpMessage": "ts-node src/httpMessage.ts"
Then run the following command in a terminal:
curl -X POST http://localhost:3000 -d 'message=Hello, world!'
In the console you will see:
<Buffer 6d 65 73 73 61 67 65 3d 48 65 6c 6c 6f 2c 20 77 6f 72 6c 64 21>
Received body: message=Hello, world!
In the above code, req is a Readable stream, res is a Writable stream, and each chunk is a Buffer, whose parent class is Uint8Array.
Uint8Array is a typed array in JavaScript that represents an array of 8-bit unsigned integers. Each element in a Uint8Array is an integer between 0 and 255. This array type is primarily used for handling binary data in a more efficient and structured way than general-purpose arrays.
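To see this relationship in practice, here is a quick sketch (run with ts-node, like the server above) showing that every Buffer is also a Uint8Array, and that raw bytes can be wrapped in a Buffer for decoding:

```ts
// Buffer is a subclass of Uint8Array, so every Buffer is also a Uint8Array.
const buf = Buffer.from("Hi");
console.log(buf instanceof Uint8Array); // true

// A raw Uint8Array of byte values can be wrapped in a Buffer to decode it.
const bytes = new Uint8Array([72, 105]); // 72 = "H", 105 = "i"
console.log(Buffer.from(bytes).toString()); // Hi
```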
Streams are commonly used in file I/O, network communication, data transformation, and more. There are four types of streams in Node.js:
- Readable Streams: Streams from which data can be read. Example: fs.createReadStream().
- Writable Streams: Streams to which data can be written. Example: fs.createWriteStream().
- Duplex Streams: Streams that are both readable and writable. Example: net.Socket.
- Transform Streams: A type of Duplex Stream where the output is computed based on the input. Example: zlib.createGzip().
Benefits of using streams
- Memory Efficiency: Streams allow you to process data piece by piece (in chunks) rather than loading the entire dataset into memory. This is particularly useful when dealing with large files or data sets that could exceed the available memory.
- Time Efficiency: Streams process data as soon as it is available, allowing you to start working with data immediately without waiting for the entire data to be loaded. This leads to lower latency.
- Non-Blocking I/O: Streams in Node.js are built on non-blocking I/O principles, allowing other operations to continue while data is being processed, which enhances the overall performance of your application.
- Composable Pipelines: Streams can be easily piped together, creating powerful data processing pipelines. For example, you can read data from a file, compress it, and then write it to another file—all using a chain of piped streams.
- Granular Error Management: Streams provide built-in error handling mechanisms that allow you to respond to issues (like file read/write errors, network timeouts, etc.) immediately as they occur, rather than waiting until the entire data is processed.
You can find the completed code here.