
node-postgres

PostgreSQL client for node.js.

SSL for a PostgreSQL connection from Node.js

I am trying to connect to my Heroku PostgreSQL DB and I keep getting an SSL error. Does anyone have an idea on how to enable SSL in the connection string?

postgres://user:pass@host:port/database;

Been looking for it everywhere, but it does not seem to be a very popular topic. By the way, I am running Node.js with the node-pg module and its connection-pooled method:

pg.connect(connString, function(err, client, done) { /// Should work. });

Comments are much appreciated.
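
One commonly suggested approach (a sketch; whether SSL can be requested this way depends on the node-postgres version in use, so treat both options as assumptions to verify) is to ask for SSL either in the connection string or through the driver defaults before connecting:

var pg = require('pg');

// Option 1: request SSL via the connection string (query-string form).
var connString = 'postgres://user:pass@host:port/database?ssl=true';

// Option 2: turn SSL on for every connection the driver makes.
pg.defaults.ssl = true;

pg.connect(connString, function(err, client, done) {
  // same pooled usage as above
});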


Source: (StackOverflow)

node-postgres will not insert data, but doesn't throw errors either

I'm using the node-postgres module for node.js and I have a problem inserting data.

The function:

function addItems(listId, listItems, handle) {
    if (!listItems) {
        handle(listId);
        return;
    }
    var client = dbConnector.getClient(),
        important,
        dateArray,
        dateString,
        i,
        prepStatement;
    client.connect();
    for (i in listItems) {
        console.log(listItems[i]);
        dateArray = listItems[i].itemdate.split('-');
        dateString = dateArray[1] + '-' + dateArray[0] + '-' + dateArray[2];
        if (listItems[i].important) {
            important = 'true';
        } else {
            important = 'false';
        }
        prepStatement = {
            name: 'insert task',
            text: 'INSERT INTO listitem (todolist_id, name, deadline, description, important, done, created) VALUES ($1, $2, $3, $4, $5, $6, now()) RETURNING listitem_id',
            values: [ listId, listItems[i].itemname, dateString, listItems[i].itemdesc, important, listItems[i].done ]
        };
        var query = client.query(prepStatement);
        console.log("Adding item " + i);
        query.on('error', function(error) {
            console.log(error);
        });
        query.on('end', function(result) {
           console.log("Query ended");
           if (result) {
               console.log("Added listitem no " + result.rows[0].listitem_id);
           } 
        });
    }
    client.end();
    handle(listId);
}

No new data appears in the database. The query.on('error') and query.on('end') events never fire. Come to think of it, I'm beginning to doubt whether the query is even triggered (though I can't see why it shouldn't be).

The only log I get is:

{  itemname: 'Task 1',
   itemdate: '08-05-2012',
   important: 'on',
   itemdesc: 'A task',
   done: 'false' }
Adding item 0
{  itemname: 'Task 2',
   itemdate: '22-05-2012',
   important: 'on',
   itemdesc: 'Another one',
   done: 'false' }
Adding item 1

So how should I proceed in debugging this?
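
One thing worth checking (a sketch, assuming an older node-postgres release that queues queries on the client and still emits a client-level 'drain' event): client.end() is called synchronously right after the loop, which can tear the connection down before the queued INSERTs ever run, and client.connect() is never given a callback, so a failed connection would also be silent. Deferring the shutdown until the query queue empties makes the 'error'/'end' handlers observable:

var client = dbConnector.getClient(); // helper from the question

client.connect(function(err) {
  if (err) { console.log('connect failed:', err); }
});

// ... queue the INSERT queries exactly as above, but do not call client.end() yet ...

client.on('drain', function() {
  // fires once the client's internal query queue is empty (older pg releases)
  client.end();
  handle(listId);
});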


Source: (StackOverflow)


pg doesn't seem to close the connection correctly

I am writing an application using Node.js and PostgreSQL. When a user authenticates, the app hits the DB to check the user's password and then supposedly closes the connection. However, if the user enters the wrong password and tries to authenticate again, I get this error on the server console:

UNCAUGHT EXCEPTION...
{ [Error: write EPIPE] code: 'EPIPE', errno: 'EPIPE', syscall: 'write' }
Error: write EPIPE

I believe that it is not closing the connection correctly the first time. This is the part of the script that is supposed to close the connection:

query.on('end', function(){
    client.end();
    if (pass == this.dbpass){
        console.log(pass);
        console.log(this.dbpass);
        return callback (200, "OK", {}, true);
    } else {
        console.log(pass);
        return callback(200, "OK", {}, false);
    }
});

Is there another way to close the connection that I am unaware of?
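
If the same pg.Client instance is reused after client.end(), the next query is written to a socket that is already closed, which typically surfaces as EPIPE. A common alternative (a sketch, assuming the pooled pg.connect API; user, pass, connString and the callback signature are taken from the question) is to check a client out of the pool per request and return it with done() instead of ending it:

var pg = require('pg');

pg.connect(connString, function(err, client, done) {
  if (err) { return console.error('could not connect', err); }
  client.query('SELECT password FROM users WHERE username = $1', [user], function(err, result) {
    done(); // return the client to the pool; do not call client.end()
    if (err || result.rows.length === 0) {
      return callback(200, "OK", {}, false);
    }
    return callback(200, "OK", {}, result.rows[0].password === pass);
  });
});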


Source: (StackOverflow)

Storing a file in postgres using node-postgres

I'm trying to store a small file in a Postgres DB using the node-postgres module. I understand that I should use the bytea data type for this. The problem I'm having is that when I do something like:

fs.readFile path, (err, data) ->
    client.query 'UPDATE file_table SET file = $1 WHERE key = $2', [data, key], (e, result) ->
    ....

The content of the file column in the DB is \x, and nothing else is stored. If I convert the data buffer to hex, i.e. data.toString('hex'), the file is stored, but all formatting is lost when I read the file back out.

What is the correct way of storing a file into postgres using the node-postgres module?
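
For what it's worth, node-postgres can generally take a Node Buffer directly as the parameter for a bytea column and serialize it itself; a bare \x in the column often means the Buffer was turned into a string somewhere before it reached the query. A sketch in plain JavaScript, assuming the same table and key:

var fs = require('fs');

fs.readFile(path, function(err, data) { // data is a Buffer, not a string
  if (err) { return console.error(err); }
  client.query('UPDATE file_table SET file = $1 WHERE key = $2', [data, key],
    function(e, result) {
      // reading it back with SELECT file ... returns a Buffer as well
    });
});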


Source: (StackOverflow)

node.js, pg, postgresql and insert queries (app hangs)

I have the following simple node application for inserting data into a Postgres database:

var pg = require('pg');
var dbUrl = 'tcp://user:psw@localhost:5432/test-db';

pg.connect(dbUrl, function(err, client, done) {
    for (var i = 0; i < 1000; i++) {
        client.query(
            'INSERT into post1 (title, body, created_at) VALUES($1, $2, $3) RETURNING id', 
            ['title', 'long... body...', new Date()], 
            function(err, result) {
                if (err) {
                    console.log(err);
                } else {
                    console.log('row inserted with id: ' + result.rows[0].id);
                }

            });
    }
});

After I run node app.js in the terminal it inserts 1000 rows into the database, then the application hangs and does not terminate. What am I doing wrong? I have looked at the pg module examples but didn't spot anything I'm doing differently…
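
The pooled client keeps the event loop alive until it is released and the pool is shut down, so node has nothing telling it to exit. A sketch of one way to let the script finish (assuming the same pg API as above): count the callbacks, call done() to return the client to the pool, and pg.end() to close the pool once everything has completed.

pg.connect(dbUrl, function(err, client, done) {
    if (err) { return console.error(err); }
    var remaining = 1000;
    for (var i = 0; i < 1000; i++) {
        client.query(
            'INSERT into post1 (title, body, created_at) VALUES($1, $2, $3) RETURNING id',
            ['title', 'long... body...', new Date()],
            function(err, result) {
                if (err) { console.log(err); }
                if (--remaining === 0) {
                    done();   // release the client back to the pool
                    pg.end(); // close all pooled connections so the process can exit
                }
            });
    }
});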


Source: (StackOverflow)

Why does this bluebird pg code hang?

I'm trying to wrap my head around bluebird Promises, and going through some examples in the documentation. My current code is based on this example:

var Promise = require('bluebird');
var pg = Promise.promisifyAll(require('pg'));
var using = Promise.using;

function getConnection(string) {
    var close;
    return pg.connectAsync(string).spread(function(client, done) {
        close = done;
        return client;
    }).disposer(function() {
        console.log('In disposer');
        try {
            if (close) close();
        } catch(e) {};
    });
};

using(getConnection('/var/run/postgresql dc'), function(conn) {
    console.log('Got a connection');
    return conn.queryAsync('SELECT 1');
})
.then(function(rows) {
    console.log('Retrieved %s row(s)',rows.rowCount);
});

The output is as expected:

Got a connection
In disposer
Retrieved 1 row(s)

However, the program never terminates. What's the hang-up (pun intended)?
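
pg.connectAsync here is the promisified form of the pooled pg.connect, and the disposer's done() only returns the client to the pool; the pool itself stays open and keeps the process alive. A sketch of one way to let it exit (assuming pg.end() is available, as in stock node-postgres):

using(getConnection('/var/run/postgresql dc'), function(conn) {
    return conn.queryAsync('SELECT 1');
})
.then(function(rows) {
    console.log('Retrieved %s row(s)', rows.rowCount);
})
.finally(function() {
    pg.end(); // shut the connection pool down so node can terminate
});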


Source: (StackOverflow)

What kind of object is a node-postgres Error? Why do node's console.log and JSON.stringify handle it differently?

console.log outputs it like this,

{ [error: syntax error at or near "step"]
  length: 86,
  name: 'error',
  severity: 'ERROR',
  code: '42601',
  detail: undefined,
  hint: undefined,
  position: '62',
  internalPosition: undefined,
  internalQuery: undefined,
  where: undefined,
  file: 'scan.l',
  line: '1001',
  routine: 'scanner_yyerror' }

but JSON.stringify doesn't see the narrative part of the error:

{"length":86,"name":"error","severity":"ERROR","code":"42601","position":"62","file":"scan.l","line":"1001","routine":"scanner_yyerror"}

I can't figure out how to get at this narrative part (e.g. error: column "undefined" does not exist) from reading the wikis (https://github.com/brianc/node-postgres/wiki/Error-handling, http://nodejs.ru/doc/v0.4.x/stdio.html#console.log).

The code is like this,

client.query(selstring, function(err, result) {
  if (err) {
    res.send(JSON.stringify(err));
    console.log(err);
  } else {

Thanks.

UPDATE: err.toString() shows error: syntax error at or near "step"
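
The object is an ordinary JavaScript Error with extra PostgreSQL fields attached. The narrative part lives on the message property, which Error makes non-enumerable, so console.log (via util.inspect) shows it while JSON.stringify skips it. A sketch of pulling it out explicitly before sending:

if (err) {
  res.send(JSON.stringify({
    message: err.message, // the part JSON.stringify was dropping
    code: err.code,
    position: err.position
  }));
  console.log(err.message);
}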


Source: (StackOverflow)

Retrieving multiple rows with Sequelizejs

I am beginning to learn SequelizeJS; however, I have run into a small issue. I have a model defined as follows:

var ex_table= sequelize.define("ex_table", {
   concept1: Sequelize.STRING(5),
   concept2: Sequelize.STRING(80),
   concept3: Sequelize.STRING,
   weight: Sequelize.INTEGER,
   value: Sequelize.DECIMAL(20,2)
}, {
   tableName: "samble_table"});

Retrieving a single row from the table works; I do it like this:

ex_table
  .find({where: Sequelize.and({weight: 320193}, {concept1: 'AGLOK'}, {concept2: 'lambda'}, {concept2: 'Industry Group'})})
  .complete(function (err, result) {
    if (!!err) {
      console.log("An error occurred while creating the table:", err);
    } else {
      console.log(result.values);
    }
  })

This gives me one row of data, as expected. However, when I try to retrieve multiple rows like this:

ex_table
  .findAll({where: Sequelize.and({weight: 320193}, {concept1: 'AGLOK'}, {concept2: 'lambda'}, {concept2: 'Industry Group'})})
  .complete(function (err, result) {
    if (!!err) {
      console.log("An error occurred while creating the table:", err);
    } else {
      console.log(result.values);
    }
  })

Here I get undefined. Any ideas on how to get the multiple rows?
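
A likely explanation (assuming the same Sequelize version in which result.values works for a single instance): findAll hands the callback an array of instances rather than a single instance, so the array itself has no values property. Iterating over the array gets at each row, for example:

ex_table
  .findAll({where: Sequelize.and({weight: 320193}, {concept1: 'AGLOK'}, {concept2: 'lambda'})})
  .complete(function (err, results) {
    if (!!err) {
      console.log("An error occurred while querying the table:", err);
    } else {
      results.forEach(function (row) {
        console.log(row.values); // one instance per matching row
      });
    }
  });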


Source: (StackOverflow)

Return a value from an event emitter callback

I am using the postgresql module 'pg' in node. I would like to do some processing when a row is returned from the database, but I do not know how to return the value to the calling function.

var caller = function(){
    var query = client.query(qString);
    query.on('row', function(row){
        if (row.foo == bar) {
            return true;
        } else {
            return false;
        }
    });
};

when I call:

var boo = caller();

I want boo to equal the returned bool of the callback.

I tried declaring an uninitialized variable and then assigning it the value of the returned bool:

var caller = function(){
    var fido;
    var query = client.query(qString);
    query.on('row', function(row){
        if (row.foo == bar) {
            fido = true;
        } else {
            fido =  false;
        }
    });
    return fido;
};

but the function doesn't wait until 'pg' returns a row, so the variable is returned while it is still undefined. I'm thinking I need to do part of this synchronously so that the assignment happens before the function returns, but I'm not sure how to do this for a normal function.

EDIT:

Enough with the pseudocode. I am new to Node and trying to get better at asking questions. This is an application for a class I am in. I wrote most of it in Python with web.py, but because of a conflict with the angular.js framework I am rewriting it in node.js, which I don't have much experience with. The part I'm having trouble with is authenticating users. Here is the module that gets called:

var postAuth = function(params, callback){
    var user = params['username'];
    var pass = params['password'];
    var result;
    util['dispatch']['auth'](user, function(blah){
        if (pass == blah) {
            result = true;
        } else {
            result = false;
        }
    });
    console.log(result);
    return callback(200, "OK", {}, result);
}

and the function that gets the password from the DB for the username:

var auth = function(user, callback) {
    client.connect();
    var qString = "SELECT password FROM users WHERE username='" + user + "'";
    var query = client.query(qString);
    query.on('row', function(row) {
            callback(row.password);
    }); 
};  

I know that this is an insecure way to authenticate. I am wondering if there is a better way to do this. I am trying to return a bool based on whether or not the password matches.
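
The same timing problem from the first half of the question applies here: postAuth logs and returns result before the auth callback has fired, so result is still undefined. A sketch that keeps the question's callback signature, moves the comparison inside the callback, and passes the user name as a query parameter instead of concatenating it into the SQL (it assumes the client is connected once at startup):

var postAuth = function(params, callback) {
    var user = params['username'];
    var pass = params['password'];
    util['dispatch']['auth'](user, function(dbpass) {
        // only here do we know the stored password, so respond here
        return callback(200, "OK", {}, pass === dbpass);
    });
};

var auth = function(user, callback) {
    var query = client.query('SELECT password FROM users WHERE username = $1', [user]);
    query.on('row', function(row) {
        callback(row.password);
    });
};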


Source: (StackOverflow)

postgres composite type on node-postgres

Say I have the following PostgreSQL composite type:

CREATE TYPE myType AS(
  id  bigint,
  name  text
);

and a stored procedure that accepts that type:

CREATE FUNCTION myFunction(mt myType){
//do some stuff
}

I would like to call this procedure from Node.js using the node-postgres module.

var pg = require('pg');
var connectionString = "connection string";
pg.connect(connectionString, function(err, client, done) {
   client.query('SELECT myFunction($1)', [some value],   
      function(err, result) {
      // do stuff
      done();
    });
});

How do I create such a type in JS? Is there a way to pass a composite type from Node to a Postgres stored procedure?
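
One approach that generally works (a sketch; myType and myFunction are the question's own names, and someId/someName are placeholders) is to build the composite value inside the SQL with ROW(...) and a cast, passing the individual fields as ordinary parameters:

pg.connect(connectionString, function(err, client, done) {
  client.query('SELECT myFunction(ROW($1, $2)::myType)', [someId, someName],
    function(err, result) {
      // result.rows[0] holds whatever myFunction returns
      done();
    });
});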


Source: (StackOverflow)

node server can't connect to postgres db

I recently switched from MySQL to Postgres as the database for a node.js project. While I'm able to reach my remote Postgres database from my local pgAdmin III (OS X) client, so far I've been unable to connect to the database through node.js. I'm sure the credentials I entered in pgAdmin and in node.js are exactly the same. The other thing I've tried was setting my local IP address to trust instead of md5 in the pg_hba.conf on my database server. Is there anything I did wrong? My favourite search engine came up with some worrying hits about resetting my local OS. I just used the example from the node-postgres GitHub repo docs:

var pg = require('pg');

var conString = "postgres://myusername:mypassword@hostname:5432/dbname";

var client = new pg.Client(conString);
client.connect(function(err) {
  if(err) {
    return console.error('could not connect to postgres', err);
  }
  client.query('SELECT NOW() AS "theTime"', function(err, result) {
    if(err) {
      return console.error('error running query', err);
    }
    console.log(result.rows[0].theTime);
    client.end();
  });
});

And these are the errors I get every time I try to start my server:

could not connect to postgres { [Error: getaddrinfo ENOTFOUND] code: 'ENOTFOUND', errno: 'ENOTFOUND', syscall: 'getaddrinfo' }

Help would be greatly appreciated
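
getaddrinfo ENOTFOUND means the DNS lookup of the host part of the connection string failed before any PostgreSQL handshake happened, so it is worth confirming the real hostname (not a placeholder) ends up in conString. Passing a config object instead of a URL also sidesteps URL-parsing surprises with special characters in the password (a sketch; all field values are placeholders):

var client = new pg.Client({
  user: 'myusername',
  password: 'mypassword',
  host: 'db.example.com', // the actual DNS name or IP of the database server
  port: 5432,
  database: 'dbname'
});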


Source: (StackOverflow)

Why can't I delete from db using node-postgres?

Is there something special I need to do with a parameterized query?

The following seems to succeed (I'm using a promisified client.query; see the end):

console.log('cancel for', data);
var cancelInviteQuery = 'delete from friendship where user_1=$1 and user_2_canonical=$2';
return query(cancelInviteQuery, [data.id, data.friend])
  .then(function (results) {
    console.log('ran delete frienship', results);
  })
  .catch(function (error) {
    console.error('failed to drop friendship', error);
  });

because I get the output:

cancel for {"id":3,"friend":"m"}
ran delete frienship []

But then the following query of the database shows the record still exists:

select * from friendship;
 id | user_1 | user_2 | user_2_canonical | confirmed | ignored 
----+--------+--------+------------------+-----------+---------
  8 |      3 |      9 | m                | f         | f

And the following query succeeds when I run it directly against the database (using the psql client):

delete from friendship where user_1=3 and user_2_canonical='m'

My query function (a wrapper around node-postgres's client.query):

function query(sql, params) {
  var promise = new RSVP.Promise(function (resolve, reject) {
    pg.connect(CONNECTIONSTRING, function (error, client, done) {
      if (error) { reject(error); return; }

      client.query(sql, params, function (error, result) {
        done(); // Greedy close.
        if (error) {
          reject(error);
          return;
        }
        else if (result.hasOwnProperty('rows')) {
          resolve(result.rows);
        } else { resolve(result.rows); }
      });
    });
  });
  return promise;
}
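
One quick check (a sketch against the same wrapper): the empty array in the log is just result.rows, which is always empty for a DELETE. result.rowCount reports how many rows were actually removed, which distinguishes "the delete matched nothing" from "the delete never reached the database":

client.query(sql, params, function (error, result) {
  done(); // Greedy close.
  if (error) { reject(error); return; }
  // rowCount is populated for DELETE/UPDATE even though rows is empty
  console.log('deleted', result.rowCount, 'row(s)');
  resolve(result.rows);
});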

Source: (StackOverflow)

postgres connection from node.js

I am using node-postgres in my application. I would like to know the best practices to follow to ensure a stable connection.

Following is the code I'm using right now:

exports.getAll = function (args, callback) {
    helper.client = new pg.Client('tcp://postgres:system6:5432@192.168.143.11/abc_dev');
    helper.client.connect();
    helper.client.query('select count(1) as total_records from facilities', function(err, response){
        helper.client.query('select * ,'+response.rows[0].total_records+' as total_records from facilities',
            function(err, response){
                callback(response);
                helper.client.end.bind(helper.client);
            });
    });
};

As you can see in the code, I'm connecting to the DB for every request and disconnecting once the query has executed. Another idea is to connect to the DB globally only once and execute queries over the open connection. That code looks like:

helper.client = new pg.Client('tcp://postgres:system6:5432@192.168.143.11/abc_dev');
helper.client.connect();

exports.getAll = function (args, callback) {
    helper.client.query('select count(1) as total_records from facilities', function(err, response){
        helper.client.query('select * ,'+response.rows[0].total_records+' as total_records from facilities',
            function(err, response){
                callback(response);
            });
    });
};

Here the connection is never closed. I am unable to decide which approach is best. Please suggest.

Thanks.
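
For reference, the pattern usually recommended with node-postgres sits between the two: let the driver's built-in pool manage connections via pg.connect, check a client out per request, and return it with done() (a sketch, keeping the queries and callback shape from the question):

var pg = require('pg');
var conString = 'tcp://postgres:system6:5432@192.168.143.11/abc_dev';

exports.getAll = function (args, callback) {
    pg.connect(conString, function (err, client, done) {
        if (err) { return console.error(err); }
        client.query('select count(1) as total_records from facilities', function (err, response) {
            client.query('select * ,' + response.rows[0].total_records + ' as total_records from facilities',
                function (err, response) {
                    done(); // hand the client back to the pool
                    callback(response);
                });
        });
    });
};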


Source: (StackOverflow)

Node-Postgres SELECT WHERE IN dynamic query optimization

We're working on a Node/Express web app with a Postgres database, using the node-postgres package. We followed the instructions in this question and have our query working, written this way:

exports.getByFileNameAndColName = function query(data, cb) {

  const values = data.columns.map(function map(item, index) {
    return '$' + (index + 2);
  });

  const params = [];
  params.push(data.fileName);
  data.columns.forEach(function iterate(element) {
    params.push(element);
  });

  db.query('SELECT * FROM columns ' +
    'INNER JOIN files ON columns.files_id = files.fid ' +
    'WHERE files.file_name = $1 AND columns.col_name IN (' + values.join(', ') + ')',
    params, cb
  );

};

data is an object containing a string fileName and an array of column names columns. We want this query to extract information from our 'columns' and 'files' tables from a dynamic number of columns. db.query takes as parameters (query, args, cb), where query is the SQL query, args is an array of parameters to pass into the query, and cb is the callback function executed with the database results.

So the code written in this way returns the correct data, but (we think) it's ugly. We've tried different ways of passing the parameters into the query, but this is the only format that has successfully returned data.

Is there a cleaner/simpler way to pass in our parameters? (e.g. any way to pass parameters in a way the node-postgres will accept without having to create an additional array from my array + non-array elements.)

Asking this because:

  1. perhaps there's a better way to use the node-postgres package/we're using it incorrectly, and
  2. if this is the correct way to solve this type of issue, then this code supplements the answer in the question referenced above.
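
For what it's worth, the parameter array itself can be built a bit more compactly without changing what node-postgres sees (a sketch, same data shape and db.query signature as above):

exports.getByFileNameAndColName = function query(data, cb) {

  // $1 is the file name; $2..$n are the column names
  const params = [data.fileName].concat(data.columns);
  const placeholders = data.columns.map(function map(item, index) {
    return '$' + (index + 2);
  });

  db.query('SELECT * FROM columns ' +
    'INNER JOIN files ON columns.files_id = files.fid ' +
    'WHERE files.file_name = $1 AND columns.col_name IN (' + placeholders.join(', ') + ')',
    params, cb
  );

};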

Source: (StackOverflow)

node-postgres with massive amount of queries

I just started playing around with node.js and Postgres, using node-postgres. One of the things I tried was writing a short script to populate my database from a file with about 200,000 entries.

I noticed that after some time (less than 10 seconds) I start to get "Error: Connection terminated". I am not sure whether this is a problem with how I use node-postgres or whether it's because I was spamming Postgres.

Anyway, here is a simple piece of code that shows this behaviour:

var pg = require('pg');
var connectionString = "postgres://xxxx:xxxx@localhost/xxxx";

pg.connect(connectionString, function(err,client,done){
  if(err) {
    return console.error('could not connect to postgres', err);
  }

  client.query("DROP TABLE IF EXISTS testDB");
  client.query("CREATE TABLE IF NOT EXISTS testDB (id int, first int, second int)");
  done();

  for (i = 0; i < 1000000; i++){
    client.query("INSERT INTO testDB VALUES (" + i.toString() + "," + (1000000-i).toString() + "," + (-i).toString() + ")",   function(err,result){
      if (err) {
         return console.error('Error inserting query', err);
      }
      done();
    });
  }
});

It fails after about 18,000-20,000 queries. Is this the wrong way to use client.query? I tried changing the default client number, but it didn't seem to help.

client.connect() doesn't seem to help either, but that was because I had too many clients, so I definitely think client pooling is the way to go.

Thanks for any help!
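
One way to avoid piling a million queries into a single client's queue (a sketch, using the same pooled pg.connect plus parameterized values instead of string concatenation): submit the next INSERT only after the previous one has called back, and close the pool when finished.

pg.connect(connectionString, function(err, client, done) {
  if (err) { return console.error('could not connect to postgres', err); }

  var total = 1000000;

  function insert(i) {
    if (i >= total) {
      done();
      return pg.end(); // shut the pool down once everything is in
    }
    client.query('INSERT INTO testDB VALUES ($1, $2, $3)', [i, total - i, -i],
      function(err) {
        if (err) { return console.error('Error inserting query', err); }
        insert(i + 1); // only queue the next INSERT after this one finishes
      });
  }

  insert(0);
});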


Source: (StackOverflow)