
How to read a large file one line at a time in Node.js


To read a large file one line at a time in Node.js, you can use the readline module and the readline.createInterface() function.

The readline module provides an interface for reading data from a Readable stream (such as a file) one line at a time. The readline.createInterface() function takes an options object whose input property is the Readable stream to read from; an output stream is only needed for interactive prompts, so it can be omitted when reading a file. The function returns an instance of the readline Interface.

Here’s an example of how to use the readline module to read a large file one line at a time:

const fs = require('fs');
const readline = require('readline');

// Stream the file instead of loading it into memory all at once
const fileStream = fs.createReadStream('./large-file.txt');

const rl = readline.createInterface({
  input: fileStream,
  crlfDelay: Infinity // treat \r\n as a single line break
});

// 'line' fires once for every line read from the file
rl.on('line', (line) => {
  console.log(`Line from file: ${line}`);
});

// 'close' fires when the whole file has been read
rl.on('close', () => {
  console.log('File reading completed');
});

In this example, we use the fs.createReadStream() function to create a read stream for the large-file.txt file, and we pass this stream to the readline.createInterface() function as the input stream. We also set the crlfDelay option to Infinity so that a Windows-style \r\n sequence is always treated as a single line break, which makes the code behave the same regardless of which platform produced the file.

Then, we register a listener for the line event, which is emitted every time a new line is read from the file. Inside the listener, we log the line to the console.

Finally, we register a listener for the close event, which is emitted when the end of the file is reached. Inside the listener, we log a message indicating that the file reading has completed.

Note: The readline approach above already streams the file rather than loading it into memory at once. If the file contains structured records, such as CSV or newline-delimited JSON, you may want to use a streaming parser like csv-parser or ndjson, which parses each record as it is read instead of requiring the whole file to be parsed in one pass.
