Node.js Streams with TypeScript

  • Why Do Streams Matter?
  • The Four Types of Streams
  • Setting Up Your TypeScript Environment
  • Real-World Use Case: Streaming API Responses
  • Advanced Tips: Handling Backpressure
  • Conclusion

Node.js is renowned for its ability to handle I/O operations efficiently, and at the heart of this capability lies the concept of streams. Streams allow you to process data piece by piece, rather than loading everything into memory at once—perfect for handling large files, network requests, or real-time data. When you pair streams with TypeScript’s strong typing, you get a powerful combo: performance meets safety.

In this guide, we’ll dive deep into Node.js streams, explore their types, and walk through practical examples using TypeScript. Whether you’re a Node.js newbie or a TypeScript enthusiast looking to level up, this post has you covered.

Why Do Streams Matter?

Picture this: you’re tasked with processing a 50GB log file. Loading it entirely into memory would exhaust your server’s resources, leading to crashes or sluggish performance. Streams solve this by letting you handle data as it flows, like sipping from a straw instead of chugging a gallon jug.

This efficiency is why streams are a cornerstone of Node.js, powering everything from file operations to HTTP servers. TypeScript enhances this by adding type definitions, catching errors at compile time, and improving code readability. Let’s dive into the fundamentals and see how this synergy works in practice.

The Four Types of Streams

Node.js offers four main stream types, each with a specific purpose:

  1. Readable Streams: Data sources you can read from (e.g., files, HTTP responses).
  2. Writable Streams: Destinations you can write to (e.g., files, HTTP requests).
  3. Duplex Streams: Both readable and writable (e.g., TCP sockets).
  4. Transform Streams: A special duplex stream that modifies data as it passes through (e.g., compression).

TypeScript enhances this by allowing us to define interfaces for the data flowing through them. Let’s break them down with examples.

Setting Up Your TypeScript Environment

Before we dive into code, ensure you have Node.js and TypeScript installed.

Create a new project:
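Something along these lines works (the exact package choices here are typical rather than required; ts-node is optional but convenient for running the examples directly):

```bash
mkdir node-streams-ts && cd node-streams-ts
npm init -y
npm install --save-dev typescript @types/node ts-node
npx tsc --init
```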

Update your tsconfig.json to include:
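A sensible baseline for these examples (these specific options are a reasonable assumption, not the only valid configuration):

```json
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "CommonJS",
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true
  }
}
```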

Create a src folder and let’s start coding!

Example 1: Reading a File with a Readable Stream

Let’s read a text file chunk by chunk. First, create a file named data.txt in the root directory of your project with some sample text (e.g., “Hello, streams!”).

Now, in src/readStream.ts:
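A minimal sketch that matches this description (the exact shape of the Chunk interface is illustrative):

```typescript
// src/readStream.ts
import { createReadStream } from 'fs';

// Illustrative shape for the data we log per chunk
interface Chunk {
  data: string;
  length: number;
}

const readStream = createReadStream('data.txt', { encoding: 'utf8' });

readStream.on('data', (chunk: string | Buffer) => {
  const typed: Chunk = { data: chunk.toString(), length: chunk.length };
  console.log(`Received ${typed.length} characters:`, typed.data);
});

readStream.on('end', () => {
  console.log('Finished reading data.txt');
});

readStream.on('error', (err: Error) => {
  console.error('Stream error:', err.message);
});
```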

Run it with npx ts-node src/readStream.ts (assuming ts-node from the setup step), or compile with npx tsc and run the emitted dist/readStream.js with node.

Here, TypeScript ensures the chunk adheres to our Chunk interface, and the error event handler expects an Error type. This stream reads data.txt in chunks (default 64KB for files) and logs them.

Example 2: Writing Data with a Writable Stream

Now, let’s write data to a new file. In src/writeStream.ts:
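One way to write it (the sample lines are placeholders):

```typescript
// src/writeStream.ts
import { createWriteStream } from 'fs';

const writeStream = createWriteStream('output.txt');

const lines: string[] = ['First line', 'Second line', 'Third line'];

lines.forEach((line: string) => {
  // write() pushes each line into the stream's internal buffer
  writeStream.write(`${line}\n`);
});

// end() flushes the buffer and closes the file
writeStream.end(() => {
  console.log('Finished writing to output.txt');
});

writeStream.on('error', (err: Error) => {
  console.error('Write error:', err.message);
});
```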

Compile and run it with npx tsc followed by node dist/writeStream.js.

This creates output.txt with three lines. TypeScript ensures the line is a string and provides autocompletion for stream methods.

Example 3: Piping with a Transform Stream

Piping is where streams shine, connecting a readable stream to a writable stream. Let’s add a twist with a Transform stream to uppercase our text.

In src/transformStream.ts:
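A sketch of the idea, implementing _transform on a Transform subclass:

```typescript
// src/transformStream.ts
import { createReadStream, createWriteStream } from 'fs';
import { Transform, TransformCallback } from 'stream';

// A Transform stream that uppercases every chunk passing through it
class UpperCaseTransform extends Transform {
  _transform(chunk: Buffer, _encoding: BufferEncoding, callback: TransformCallback): void {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
}

createReadStream('data.txt')
  .pipe(new UpperCaseTransform())
  .pipe(createWriteStream('output_upper.txt'))
  .on('finish', () => console.log('Wrote output_upper.txt'));
```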

Run it with npx ts-node src/transformStream.ts.

This reads data.txt, transforms the text to uppercase, and writes it to output_upper.txt.

TypeScript’s TransformCallback type ensures our _transform method is correctly implemented.

Example 4: Compressing Files with a Duplex Stream

Let’s tackle a more advanced scenario: compressing a file using the built-in zlib module, which exposes compression as a duplex (transform) stream. Its type definitions come with the ‘@types/node’ package we installed earlier.

In src/compressStream.ts:
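A minimal version using stream.pipeline and zlib.createGzip:

```typescript
// src/compressStream.ts
import { createReadStream, createWriteStream } from 'fs';
import { createGzip } from 'zlib';
import { pipeline } from 'stream';

const source = createReadStream('data.txt');
const gzip = createGzip();
const destination = createWriteStream('data.txt.gz');

// pipeline wires the streams together and handles errors and cleanup for us
pipeline(source, gzip, destination, (err: NodeJS.ErrnoException | null) => {
  if (err) {
    console.error('Compression failed:', err.message);
  } else {
    console.log('Compressed data.txt to data.txt.gz');
  }
});
```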

Run it with npx ts-node src/compressStream.ts.

Here, the pipeline ensures proper error handling and cleanup. The gzip stream compresses data.txt into data.txt.gz. TypeScript’s type inference keeps our code clean and safe.

Example 5: Streaming HTTP Responses

Streams shine in network operations. Let’s simulate streaming data from an HTTP server using axios. Install it with npm install axios.

In src/httpStream.ts:
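A sketch of the idea, using axios with responseType: 'stream' and piping the response body to a file (the URL is a placeholder):

```typescript
// src/httpStream.ts
import axios from 'axios';
import { createWriteStream } from 'fs';
import { pipeline } from 'stream/promises';

async function streamToFile(url: string, outputFile: string): Promise<void> {
  // responseType: 'stream' makes axios hand back a Node.js readable stream
  const response = await axios.get(url, { responseType: 'stream' });
  await pipeline(response.data, createWriteStream(outputFile));
}

streamToFile('https://example.com', 'example.html')
  .then(() => console.log('Saved example.html'))
  .catch((err: Error) => console.error('Download failed:', err.message));
```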

Run it with npx ts-node src/httpStream.ts.

This streams an HTTP response (e.g., a web page) to example.html. TypeScript ensures the url and outputFile parameters are strings, and the Promise typing adds clarity.

We can also use Node.js’s built-in Fetch API (available since Node v18) or libraries like node-fetch, which also support streaming responses, although the stream types differ (Web Streams vs. Node.js streams).

Example:
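A rough sketch using the built-in fetch and Readable.fromWeb() to bridge the Web stream into a Node.js stream (requires Node 18+; the URL and file name are placeholders):

```typescript
// src/fetchStream.ts
import { createWriteStream } from 'fs';
import { Readable } from 'stream';
import { pipeline } from 'stream/promises';
import type { ReadableStream } from 'stream/web';

async function fetchToFile(url: string, outputFile: string): Promise<void> {
  const response = await fetch(url); // built-in fetch, available since Node 18
  if (!response.ok || !response.body) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  // Convert the Web ReadableStream into a Node.js Readable before piping
  const nodeStream = Readable.fromWeb(response.body as unknown as ReadableStream);
  await pipeline(nodeStream, createWriteStream(outputFile));
}

fetchToFile('https://example.com', 'example_fetch.html')
  .then(() => console.log('Saved example_fetch.html'))
  .catch((err: Error) => console.error('Fetch failed:', err.message));
```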

Example 6: Real-Time Data Processing with a Custom Readable Stream

Let’s create a custom readable stream to simulate real-time data, such as sensor readings. In src/customReadable.ts:
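One way to model it, with the reading shape defined as an interface (the SensorReading fields are made up for illustration):

```typescript
// src/customReadable.ts
import { Readable } from 'stream';

// Illustrative shape for a single simulated reading
interface SensorReading {
  sensorId: number;
  value: number;
  timestamp: string;
}

// Custom Readable that emits 10 simulated sensor readings, then ends
class SensorStream extends Readable {
  private readingsLeft = 10;

  constructor() {
    super({ objectMode: true }); // push objects instead of Buffers
  }

  _read(): void {
    if (this.readingsLeft === 0) {
      this.push(null); // signal end of stream
      return;
    }
    const reading: SensorReading = {
      sensorId: 1,
      value: Math.round(Math.random() * 100),
      timestamp: new Date().toISOString(),
    };
    this.readingsLeft -= 1;
    this.push(reading);
  }
}

const sensorStream = new SensorStream();

sensorStream.on('data', (reading: SensorReading) => {
  console.log('Sensor reading:', reading);
});

sensorStream.on('end', () => console.log('No more readings.'));
```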

Run it with npx ts-node src/customReadable.ts.

This generates 10 random “sensor readings” and streams them. TypeScript’s class typing ensures our implementation aligns with the Readable interface.

Example 7: Chaining Multiple Transform Streams

Let’s chain transforms to process text in stages: uppercase it, then prepend a timestamp. In src/chainTransform.ts:
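A sketch with two inline Transform stages:

```typescript
// src/chainTransform.ts
import { createReadStream, createWriteStream } from 'fs';
import { Transform, TransformCallback } from 'stream';

// Stage 1: uppercase each chunk
const upperCase = new Transform({
  transform(chunk: Buffer, _encoding: BufferEncoding, callback: TransformCallback) {
    callback(null, chunk.toString().toUpperCase());
  },
});

// Stage 2: prepend a timestamp to each chunk
const addTimestamp = new Transform({
  transform(chunk: Buffer, _encoding: BufferEncoding, callback: TransformCallback) {
    callback(null, `[${new Date().toISOString()}] ${chunk.toString()}`);
  },
});

createReadStream('data.txt')
  .pipe(upperCase)
  .pipe(addTimestamp)
  .pipe(createWriteStream('output_chain.txt'))
  .on('finish', () => console.log('Wrote output_chain.txt'));
```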

Run it with npx ts-node src/chainTransform.ts.

This reads data.txt, uppercases the data, adds a timestamp, and writes the result to output_chain.txt. Chaining transforms showcases streams’ modularity.

Best Practices for Streams in TypeScript

  1. Type Your Data: Define interfaces for chunks to catch type errors early.
  2. Handle Errors: Always attach error event listeners to avoid unhandled exceptions.
  3. Use Pipes Wisely: Piping reduces manual event handling and improves readability.
  4. Backpressure: For large data, check the return value of write() (and compare writableLength against writableHighWaterMark) to avoid overwhelming the destination.

Real-World Use Case: Streaming API Responses

Imagine you’re building an API that streams a large dataset. Using express and streams:
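One way the server could look (the generated dataset is invented for illustration; the /stream-data route matches the URL below):

```typescript
// src/server.ts
import express, { Request, Response } from 'express';
import { Readable } from 'stream';

const app = express();

app.get('/stream-data', (_req: Request, res: Response) => {
  // Simulate a large dataset with an async generator and stream it to the client
  async function* generateRows(): AsyncGenerator<string> {
    for (let i = 1; i <= 1000; i++) {
      yield `Row ${i}\n`;
    }
  }

  res.setHeader('Content-Type', 'text/plain');
  Readable.from(generateRows()).pipe(res);
});

app.listen(3000, () => console.log('Listening on http://localhost:3000'));
```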

Install the dependencies (npm install express @types/express), start the server (for example with npx ts-node src/server.ts), then visit http://localhost:3000/stream-data to see the data stream in your browser!

Advanced Tips: Handling Backpressure

When a writable stream can’t keep up with a readable stream, backpressure occurs. Node.js handles this automatically with pipes, but you can monitor it manually:
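A sketch of the manual pattern, pausing the source until the destination drains:

```typescript
// src/backpressure.ts
import { createReadStream, createWriteStream } from 'fs';

const readStream = createReadStream('data.txt');
const writeStream = createWriteStream('output_backpressure.txt');

readStream.on('data', (chunk: Buffer | string) => {
  // write() returns false once the internal buffer passes the high-water mark
  const canContinue = writeStream.write(chunk);
  if (!canContinue) {
    readStream.pause(); // stop reading until the buffer drains
    writeStream.once('drain', () => readStream.resume());
  }
});

readStream.on('end', () => writeStream.end());
```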

This ensures your app stays responsive under heavy loads.

Precautions when handling backpressure: when writing large amounts of data, the readable stream may produce data faster than the writable stream can consume it. pipe and pipeline handle this automatically, but if you write manually, check whether write() returns false and wait for the ‘drain’ event before writing more.

Additionally, async iterators (for await…of) offer a modern alternative for consuming readable streams, often simplifying code compared to using .on(‘data’) and .on(‘end’).

Example:
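A minimal sketch of consuming a readable stream with for await…of:

```typescript
// src/asyncIterate.ts
import { createReadStream } from 'fs';

async function readWithAsyncIterator(): Promise<void> {
  const readStream = createReadStream('data.txt', { encoding: 'utf8' });

  // for await...of pulls chunks one at a time and respects backpressure for us
  for await (const chunk of readStream) {
    console.log('Chunk:', chunk);
  }
  console.log('Done reading.');
}

readWithAsyncIterator().catch((err: Error) => console.error(err));
```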

Additional points:

Ensure Resource Cleanup: This is especially important in custom stream implementations or when using stream.pipeline. Explicitly call stream.destroy() in error scenarios or when the stream is no longer needed to release underlying resources and prevent leaks. stream.pipeline handles this automatically for piped streams.
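For example, a reader that only needs the start of a (hypothetical) large file can destroy the stream early to free the underlying file descriptor:

```typescript
import { createReadStream } from 'fs';

const readStream = createReadStream('big-file.txt'); // hypothetical large file

let bytesSeen = 0;
readStream.on('data', (chunk: Buffer | string) => {
  bytesSeen += chunk.length;
  if (bytesSeen > 1024) {
    // We only needed the first kilobyte; destroy to release the resources
    readStream.destroy();
  }
});

readStream.on('close', () => console.log('Stream closed.'));
```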

Use Readable.from() for Convenience: When you need to create a stream from an existing iterable (such as an array) or an async iterable, Readable.from() is often the simplest and most modern approach, requiring less boilerplate code than creating a custom Readable class.
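For instance:

```typescript
import { Readable } from 'stream';

// Stream the items of an ordinary array without writing a custom Readable class
const namesStream = Readable.from(['Ada', 'Grace', 'Linus']);

namesStream.on('data', (name: string) => console.log(name));
namesStream.on('end', () => console.log('Done.'));
```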

Conclusion

Streams are a game-changer in Node.js, and TypeScript enhances them further by introducing type safety and clarity. From reading files to transforming data in real time, mastering streams opens up a world of efficient I/O possibilities. The examples here—reading, writing, transforming, compressing, and streaming over HTTP—only scratch the surface of what’s possible.

Experiment with your own pipelines: try streaming logs, processing CSV files, or building a live chat system. The more you explore, the more you’ll appreciate the versatility of streams.

Raju Dandigam is an Engineering Manager and Staff Engineer with over 14 years of experience in full-stack development, AI integration, and building scalable web applications. He has led major projects at Navan, eBay, and Comcast, delivering innovative solutions across travel, commerce, and enterprise platforms. Raju specializes in Angular, React, Node.js, and modern JavaScript, focusing strongly on front-end modernization and AI-powered features. Outside work, he built a mindful breathing meditation app, contributes technical articles to platforms like Hackernoon, DZone, Tutorialspoint, Medium, and Dev.to, and judges industry awards and hackathons.
