
fs

Questions about the fs (file system) module in Node.js.

Is there any difference between fs.ReadStream and fs.createReadStream in Node.js?

In the fs module in Node.js, is there any difference between fs.ReadStream and fs.createReadStream? As far as I know, both take a filename and then create a stream object...right?
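For what it's worth, a minimal sketch (the file name is just an example) of how the two relate: createReadStream is the documented factory function, and the object it returns is an instance of fs.ReadStream.

var fs = require('fs');

// createReadStream is the documented way to obtain a stream...
var a = fs.createReadStream('example.txt');
// ...and what it returns is an fs.ReadStream instance, so constructing one
// directly "works" but relies on an undocumented constructor signature.
var b = new fs.ReadStream('example.txt');

console.log(a instanceof fs.ReadStream); // true
console.log(b instanceof fs.ReadStream); // true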


Source: (StackOverflow)

What are the pros and cons of fs.createReadStream vs fs.readFile in node.js?

I'm mucking about with node.js and have discovered two ways of reading a file and sending it down the wire, once I've established that it exists and have sent the proper MIME type with writeHead:

// read the entire file into memory and then spit it out

fs.readFile(filename, function(err, data){
  if (err) throw err;
  response.write(data, 'utf8');
  response.end();
});

// read and pass the file as a stream of chunks

fs.createReadStream(filename, {
  'flags': 'r',
  'encoding': 'binary',
  'mode': 0666,
  'bufferSize': 4 * 1024
}).addListener( "data", function(chunk) {
  response.write(chunk, 'binary');
}).addListener( "close",function() {
  response.end();
});

Am I correct in assuming that fs.createReadStream might provide a better user experience if the file in question was something large, like a video? It feels like it might be less block-ish; is this true? Are there other pros, cons, caveats, or gotchas I need to know?
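For comparison, a minimal sketch of the streaming approach using pipe (the file name and content type are placeholders): pipe handles backpressure and ends the response when the file has been read, and memory use stays roughly constant regardless of file size, whereas readFile buffers the whole file before the first byte is sent.

var fs = require('fs');
var http = require('http');

http.createServer(function (request, response) {
  response.writeHead(200, { 'Content-Type': 'video/mp4' });
  fs.createReadStream('movie.mp4')
    .on('error', function (err) {
      response.end(); // headers are already sent, so just terminate the response
    })
    .pipe(response);  // ends the response when the file has been fully read
}).listen(8080);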


Source: (StackOverflow)


Node.js: get file name from absolute path?

Is there any API that could retrieve the file name from an absolute file path?

e.g. "foo.txt" from "/var/www/foo.txt"

I know it works with a string operation, like fullpath.replace(/.+\//, ''), but I want to know whether there is a more 'formal' way to do it, like file.getName() in Java.
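A short sketch using path.basename, which is the built-in equivalent of file.getName():

var path = require('path');

console.log(path.basename('/var/www/foo.txt'));         // "foo.txt"
console.log(path.basename('/var/www/foo.txt', '.txt')); // "foo" (extension stripped)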



Source: (StackOverflow)

Node.js: check if a file exists

How do I check for the existence of a file?

In the documentation for the fs module there is a description of the method fs.exists(path, callback). But, as I understand it, it only checks for the existence of directories, and I need to check a file!

How can this be done?
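A minimal sketch (the path is a placeholder) using fs.stat, which works for files as well as directories; the returned stats object tells you which of the two you got:

var fs = require('fs');

fs.stat('/path/to/file.txt', function (err, stats) {
    if (err) {
        console.log('does not exist (or is not accessible):', err.code);
    } else {
        console.log('exists; regular file?', stats.isFile());
    }
});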


Source: (StackOverflow)

Find absolute base path of the project directory (after Meteor 0.6.5)

Until now we could get the absolute path of a file, to open later as a readStream, with this code snippet:

   var base = path.resolve('.');
   var file = base + '/data/test.csv';

   fs.createReadStream(file)

Since Meteor 0.6.5, the base path points to .meteor/local/build/programs/...

There is also the Assets API, but it cannot give us back a path, only the file contents it has read. We need a stream, though, to process some bigger data files.


Source: (StackOverflow)

NodeJS File Statistics

I don't know if this is a valid question, but is there any documentation out there describing each property of the result of fs.stat() in Node.js? I am trying to find the meaning of each of those properties, but with no luck.

Thanks!
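In the meantime, a sketch of the most commonly used properties (the file name is a placeholder); fs.stat mirrors the POSIX stat(2) structure:

var fs = require('fs');

fs.stat('file.txt', function (err, stats) {
    if (err) throw err;
    console.log(stats.size);   // size in bytes
    console.log(stats.mode);   // permission bits plus file type
    console.log(stats.mtime);  // last modification time (a Date)
    console.log(stats.atime);  // last access time
    console.log(stats.ctime);  // last status change time
    console.log(stats.isFile(), stats.isDirectory()); // type helpers
});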


Source: (StackOverflow)

Node.js: How to check if a folder is empty or not without loading the list of files

I am using Node.js.

I want to check whether a folder is empty or not. One option is to use fs.readdir, but it loads the whole list of files into an array. I have more than 10000 files in the folder, and loading all the file names is wasteful just to check whether the folder is empty. So I'm looking for an alternative solution.
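A sketch of one alternative, assuming a newer Node version that has fs.promises.opendir: it reads a single directory entry instead of materialising all 10000+ names in an array.

var fs = require('fs');

async function isEmptyDir(dirPath) {
    var dir = await fs.promises.opendir(dirPath);
    var entry = await dir.read(); // null when the directory has no entries
    await dir.close();
    return entry === null;
}

isEmptyDir('/some/dir').then(function (empty) {
    console.log('empty?', empty);
});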


Source: (StackOverflow)

How do I use chmod with Node.js

How do I use chmod with Node.js?

There is a method in the fs package that should do this, but I don't know what it takes as the second argument.

fs.chmod(path, mode, [callback])

Asynchronous chmod(2). No arguments other than a possible exception are given to the completion callback.

fs.chmodSync(path, mode)

Synchronous chmod(2).

(from the Node.js documentation)

If I do something like

fs.chmodSync('test', 0755);

nothing happens (the file isn't changed to that mode).

fs.chmodSync('test', '+x');

doesn't work either.

I'm working on a Windows machine btw.
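For reference, a short sketch of how the mode argument is normally expressed; note that on Windows only the write permission can be changed, so chmod can appear to do nothing there.

var fs = require('fs');

// mode is a numeric permission mask, conventionally written in octal
fs.chmodSync('test', 0755);               // rwxr-xr-x (octal literal)
fs.chmodSync('test', parseInt('755', 8)); // the same mask, built from a string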


Source: (StackOverflow)

How to close a readable stream (before end)?

How to close a readable stream in Node.js?

var input = fs.createReadStream('lines.txt');

input.on('data', function(data) {
   // after closing the stream, this will not
   // be called again

   if (gotFirstLine) {
      // close this stream and continue the
      // instructions from this if
      console.log("Closed.");
   }
});

This would be better than:

input.on('data', function(data) {
   if (isEnded) { return; }

   if (gotFirstLine) {
      isEnded = true;
      console.log("Closed.");
   }
});

But this would not stop the reading process...
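A minimal sketch, assuming the stream's destroy() method (present on fs.ReadStream): it stops further reads and releases the underlying file descriptor.

var fs = require('fs');

var gotFirstLine = false;
var input = fs.createReadStream('lines.txt');

input.on('data', function (chunk) {
    if (!gotFirstLine && chunk.toString().indexOf('\n') !== -1) {
        gotFirstLine = true;
        input.destroy(); // stop reading and release the file descriptor
        console.log("Closed.");
    }
});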


Source: (StackOverflow)

fs.writeFileSync gives Error:UNKNOWN, correct way to make synchronous file write in nodejs

I have a NodeJS server application. I have this line of code for my logging:

fs.writeFileSync(__dirname + "/../../logs/download.html.xml", doc.toString());

Sometimes it works correctly, but under heavy load it gives this exception:

Error: UNKNOWN, unknown error 'download.html.xml'

PS: I've found a link here: http://www.daveeddy.com/2013/03/26/synchronous-file-io-in-nodejs/ The blogger describes that writeFileSync doesn't really finish writing by the time it returns. Is there a correct way to do it synchronously, i.e. without callbacks?
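One possible sketch of a synchronous write that also flushes to disk before returning, using the lower-level fs calls (the helper name here is made up):

var fs = require('fs');

function writeFileSyncFlushed(path, data) {
    var fd = fs.openSync(path, 'w');
    fs.writeSync(fd, data);
    fs.fsyncSync(fd);  // ask the OS to flush its buffers to disk
    fs.closeSync(fd);
}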


Source: (StackOverflow)

Node.js from fs.readFileSync() to fs.readFile()

I'm trying to get my head around synchronous vs asynchronous in Node.js, in particular for reading an HTML file.

In a request handler, the synchronous version that I'm using, which works, is the following:

var fs = require("fs");
var filename = "./index.html";
var buf = fs.readFileSync(filename, "utf8");

function start(resp) {
    resp.writeHead(200, {"Content-type":"text/html"});
    resp.write(buf);
    resp.end();
    }

exports.start = start;

  1. What would the version using readFile() look like?
  2. I understand that readFile is asynchronous, so theoretically I should wait until the entire file has been read before rendering it. Should I introduce a listener? I might be confusing different things.

Edit: I have tried to refactor the code like this:

var fs = require("fs");
var filename = "./index.html";
function start (resp) {
    resp.writeHead(200, {"Content-Type":"text/html"});
    fs.readFile(filename, "utf8", function (err, data) {
        if (err) throw err;
        resp.write(data);
        });
    resp.end();
    }

I get a blank page. I guess it's because it should wait until all the data has been read before resp.write(data); how do I signal this?
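One way to restructure it is to keep everything that depends on the data, including resp.end(), inside the callback; nothing should touch the response after the handler returns except the callback itself.

var fs = require("fs");
var filename = "./index.html";

function start(resp) {
    fs.readFile(filename, "utf8", function (err, data) {
        if (err) {
            resp.writeHead(500);
            resp.end("error reading file");
            return;
        }
        resp.writeHead(200, {"Content-Type": "text/html"});
        resp.write(data);
        resp.end(); // end only after the data has been written
    });
}

exports.start = start;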


Source: (StackOverflow)

Promises with fs and bluebird

I'm currently learning how to use promises in Node.js.

My first challenge was to list the files in a directory and then get the content of each, with both steps using asynchronous functions. I came up with the following solution, but have a strong feeling that this is not the most elegant way to do it, especially the first part, where I am "turning" the asynchronous methods into promises.

// purpose is to get the contents of all files in a directory
// using the asynchronous methods fs.readdir() and fs.readFile()
// and chaining them via Promises using the bluebird promise library [1]
// [1] https://github.com/petkaantonov/bluebird 

var Promise = require("bluebird");
var fs = require("fs");
var directory = "templates"

// turn fs.readdir() into a Promise
var getFiles = function(name) {
    var promise = Promise.pending();

    fs.readdir(directory, function(err, list) {
        promise.fulfill(list)
    })

    return promise.promise;
}

// turn fs.readFile() into a Promise
var getContents = function(filename) {
    var promise = Promise.pending();

    fs.readFile(directory + "/" + filename, "utf8", function(err, content) {
        promise.fulfill(content)
    })

    return promise.promise
}

Now chain both promises:

getFiles()    // returns Promise for directory listing 
.then(function(list) {
    console.log("We got " + list)
    console.log("Now reading those files\n")

    // took me a while until i figured this out:
    var listOfPromises = list.map(getContents)
    return Promise.all(listOfPromises)

})
.then(function(content) {
    console.log("so this is what we got: ", content)
})

As I wrote above, it returns the desired result, but I'm pretty sure there is a more elegant way to do this.
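For comparison, a sketch of the same flow using bluebird's promisifyAll, which generates readdirAsync / readFileAsync wrappers and also propagates errors instead of silently swallowing them:

var Promise = require("bluebird");
var fs = Promise.promisifyAll(require("fs"));
var directory = "templates";

fs.readdirAsync(directory)
    .then(function (list) {
        console.log("We got " + list);
        return Promise.all(list.map(function (filename) {
            return fs.readFileAsync(directory + "/" + filename, "utf8");
        }));
    })
    .then(function (content) {
        console.log("so this is what we got: ", content);
    })
    .catch(function (err) {
        console.error(err);
    });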


Source: (StackOverflow)

Download file from url and upload it to AWS S3 without saving - node.js

I'm writing an application which downloads images from a URL and then uploads them to an S3 bucket using the aws-sdk.

Previously I was just downloading images and saving them to disk like this.

request.head(url, function(err, res, body){

    request(url).pipe(fs.createWriteStream(image_path));

});

And then I was uploading the images to AWS S3 like this:

fs.readFile(image_path, function(err, data){
    s3.client.putObject({
        Bucket: 'myBucket',
        Key: image_path,
        Body: data,
        ACL:'public-read'
    }, function(err, resp) {
        if(err){
            console.log("error in s3 put object cb");
        } else { 
            console.log(resp);
            console.log("successfully added image to s3");
        }
    });
});

But I would like to skip the part where I save the image to disk. Is there some way I can pipe the response from request(url) to a variable and then upload that?
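A sketch of one possibility, assuming the aws-sdk's s3.upload method (which, unlike putObject, accepts a stream of unknown length as the Body); the url, bucket and key here are placeholders:

var request = require('request');
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

var url = 'http://example.com/some-image.jpg';

s3.upload({
    Bucket: 'myBucket',
    Key: 'images/some-image.jpg',
    Body: request(url),   // the download stream is used directly as the upload body
    ACL: 'public-read'
}, function (err, data) {
    if (err) {
        console.log("error uploading to s3", err);
    } else {
        console.log("successfully added image to s3", data.Location);
    }
});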


Source: (StackOverflow)

NodeJS fs.watch on directory only fires when changed by editor, but not shell or fs module

When the code below is run, the watch is only triggered if I edit and save tmp.txt manually, using either my IDE, TextEditor.app, or vim.

It doesn't trigger when the file is changed via the write stream or manual shell output redirection (typing echo "test" > /path/to/tmp.txt).

However, if I watch the file itself, and not its dirname, then it works.

var fs, Path, file, watchPath, w;

fs = require('fs' );
Path = require('path');
file = __dirname + '/tmp.txt';
watchPath = Path.dirname(file); // changing this to just file makes it trigger

w = fs.watch ( watchPath, function (e,f) {
    console.log("will not get here by itself");
    w.close();
});
fs.writeFileSync(file,"test","utf-8");

fs.createWriteStream(file, {
    flags:'w',
    mode: 0777
} )
.end('the_date="'+new Date+'";' ); // another method fails as well

setTimeout (function () {
    fs.writeFileSync(file,"test","utf-8");
},500); // as does this one
// child_process exec and spawn fail the same way with or without timeout

So the questions are: why? and how to trigger this event programmatically from a node script?

Thanks!
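As a side note, a sketch using the polling-based fs.watchFile, which reports changes regardless of whether they came from an editor, a shell redirect, or another Node script (at the cost of polling latency):

var fs = require('fs');
var file = __dirname + '/tmp.txt';

fs.watchFile(file, { interval: 500 }, function (curr, prev) {
    if (curr.mtime.getTime() !== prev.mtime.getTime()) {
        console.log('file changed');
        fs.unwatchFile(file);
    }
});

fs.writeFileSync(file, "test", "utf-8");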


Source: (StackOverflow)

Node.js Error ENOENT, open "file/path" when nothing has been changed

Ok, first off... I'm new to Node.js. I'm trying to convert a Word document to HTML, then scrape it to obtain the content, and then pump it into an existing engine.

With that being said, everything was running fairly smoothly until today. I just got fs.writeFile working last night and stepped away from it. This morning, without touching it, I tried to run it and received the ENOENT error from the title (screenshot in the original post).

Here's the block where the error is raised:

//COPY TEMPLATE AND PASTE
fs.readFile("./Templates/TextBasedEvent.xml", function (err, data){
    if (err) {
        throw err;
    }       
    var contentHolder = data.toString(),            
        contentHolder = contentHolder.replace(/%EVENTNUMBER%/gi, id),
        contentHolder = contentHolder.replace(/%CONTENT%/gi, contents);
    fs.writeFile("./bin/xml/" + id + ".xml", contentHolder, function (err){
        if (err) {
            throw err;
        }
    });
});

Does it have something to do with how the variable is placed in the file path? Also, in the thrown error it seems odd that there is a soft return (line break) right where the variable is.

Thanks!

Edit: The issue was a newline being pulled in with the variable.
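Given that edit, a minimal sketch of the fix (assuming id is the scraped value): strip the trailing newline/whitespace before building the path.

var cleanId = String(id).trim();

fs.writeFile("./bin/xml/" + cleanId + ".xml", contentHolder, function (err) {
    if (err) {
        throw err;
    }
});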


Source: (StackOverflow)