
gcloud-node

Google Cloud Client Library for Node.js: an idiomatic client for Google Cloud services.

Cannot download Google Cloud Storage file with Node gcloud package

I have gcloud set up in my application and I seem to be successfully authenticated. I can run the following without any problem:

var GoogleCloud = require('gcloud')({ /* my credentials */ });
GoogleCloud.storage().createBucket('myBucket', function (err, res) {});

I can also do the following to retrieve my bucket and file without any problem:

var bucket = GoogleCloud.storage().bucket('myBucket');
var file = bucket.file('myFileName.ext');

And when I call file.getMetadata(), the initially empty metadata object is populated with the expected information.

However, when I attempt to download the file, I get problems.

var pathToFileName = 'public/test.pdf';
file.download({ destination: pathToFileName }, function (err) { console.log(err); });

The callback logs:

{ [Error: ENOENT, open '/public/test.pdf'] errno: 34, code: 'ENOENT', path: '/public/test.pdf' }

I have tried prepending the pathToFileName with /, ./, and ../ with no success. This is the same error I have gotten in the past when my configuration was not authorized or the file name was incorrect (although in those cases I was also unable to create buckets or get metadata). Could there still be an issue with an ACL or some other permission that I'm not aware of?
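For reference, a minimal sketch that rules out a missing local directory; the error is a local fs open on '/public/test.pdf', which points at the destination path rather than GCS permissions (the mkdir guard is an assumption about the cause, not confirmed):

var fs = require('fs');

// ENOENT comes from opening the local destination path, so make sure
// the directory exists and is writable relative to process.cwd().
if (!fs.existsSync('public')) {
  fs.mkdirSync('public');
}

file.download({ destination: 'public/test.pdf' }, function (err) {
  if (err) { console.log(err); }
});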


Source: (StackOverflow)

Google Cloud PubSub - can't seem to get topics

I'm using Heroku to run a Node.js app that uses gcloud to create a topic and then subscribe to it. I'm using the following code, as taken from here: https://googlecloudplatform.github.io/gcloud-node/#/docs/v0.16.0/pubsub

var gcloud = require('gcloud')({
  keyFilename: 'pubsub_key.json',
  projectId: 'pipedrivesekoul'
});

var pubsub = gcloud.pubsub();

// create a new topic, then publish to it from the callback
pubsub.createTopic('projects/pipedrivesekoul/my-new-topic', function(err, topic, apiResponse) {
  var topic = pubsub.topic('my-new-topic');
  topic.publish({
    data: 'New message!'
  }, function(err) { if (err) { console.log(err); } });
});

var topic = pubsub.topic('my-new-topic');

// Without specifying any options.
topic.subscribe('newMessages', function(err, subscription, apiResponse) {});

var alltopics = pubsub.getTopics({}, function(err, topics, nextQuery, apiResponse) {});

console.log(pubsub.getTopics({}, function(err, topics, nextQuery, apiResponse) {}));

However, when I deploy on Heroku (HTTPS server, registered in the Google Developers Console, with the correct APIs enabled and the appropriate key in a JSON file), instead of seeing a list of topics, it just prints 'undefined':

2015-07-24T18:06:05.321079+00:00 app[web.1]: undefined

2015-07-24T18:06:05.337947+00:00 app[web.1]: Node app is running on port 36252

Not sure why this might be happening and not too sure how to debug this issue. Any suggestions would be greatly appreciated!
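For comparison, a minimal sketch of reading the topic list inside the callback; getTopics itself returns nothing, so logging its return value will always print undefined (the topic.name property is assumed from the gcloud-node docs of this version):

pubsub.getTopics({}, function (err, topics) {
  if (err) {
    console.log(err);
    return;
  }
  // topics is an array of Topic objects.
  topics.forEach(function (topic) {
    console.log(topic.name);
  });
});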


Source: (StackOverflow)


Node.js node-gcloud synchronous call

I'm using node-gcloud https://github.com/GoogleCloudPlatform/gcloud-node to interact with Google Cloud Storage.

I'm developing a Node.js server (my first Node.js project) to provide a small set of APIs to clients. Basically, when a user uploads a file, the API call returns the signed URL to view that file.

The getSignedUrl function is asynchronous (https://googlecloudplatform.github.io/gcloud-node/#/docs/v0.8.1/storage?method=getSignedUrl) and I can't find a way to return its result from another function.

I've started playing with Bluebird promises but I can't get the hang of it. Here is my code:

var _signedUrl = function(bucket,url,options) {
  new Promise(function (resolve, reject) {
    var signed_url
    bucket.getSignedUrl(options, function(err, url) {
      signed_url = err || url;
      console.log("This is defined: " + signed_url)

      return signed_url   
    })
  })
}


var _getSignedUrl = function(url) {
  new Promise(function(resolve) {
    var   options = config.gs
      ,   expires = Math.round(Date.now() / 1000) + (60 * 60 * 24 * 14)
      ,   bucket  = project.storage.bucket({bucketName: config.gs.bucket, credentials: config.gs })
      ,   signed_url = null

    options.action = 'read'
    options.expires = expires // 2 weeks.
    options.resource = url
    signed_url = resolve(_signedUrl(bucket,url,options))

    console.log("This is undefined: " + signed_url)

    return JSON.stringify( {url: signed_url, expires: expires} );

  });
}

I think that I'm missing the basics of how it is supposed to work, so any hint will be appreciated.

Edit:

I have reworked my solution as for the first comment:

getSignedUrl: function() {
    var   options = config.gs
      ,   expires = Math.round(Date.now() / 1000) + (60 * 60 * 24 * 14)
      ,   bucket  = project.storage.bucket({bucketName: config.gs.bucket, credentials: config.gs })
      ,   signed_url = null

    options.action = 'read'
    options.expires = expires // 2 weeks.
    options.resource = this.url

    Promise.promisifyAll(bucket);

    return bucket.getSignedUrlAsync(options).catch(function(err) {
        return url; // ignore errors and use the url instead
    }).then(function(signed_url) {
        return JSON.stringify( {url: signed_url, expires: expires} );
    });
}

It's not clear to me how the double return is supposed to work, but if I keep the return bucket.getSignedUrlAsync(...) chain, what I get is this output:

{ url: { _bitField: 0, _fulfillmentHandler0: undefined, _rejectionHandler0: undefined, _promise0: undefined, _receiver0: undefined, _settledValue: undefined, _boundTo: undefined } }

and if I remove it and keep only the

return JSON.stringify( {url: signed_url, expires: expires} );

I get undefined as before. What am I missing?
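A sketch of the piece that seems to be missing (the entity receiver here is hypothetical): the stringified result only ever exists inside a .then on the returned promise, never as a synchronous return value at the call site:

entity.getSignedUrl().then(function (json) {
  // json is the JSON.stringify({url, expires}) value resolved above.
  console.log(json);
}).catch(function (err) {
  console.log(err);
});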


Source: (StackOverflow)

Adding entity property with more than 500 characters/setting unindexed

I'm trying to create an entity with a property of more than 500 characters by setting the property's indexed value to false.

How would I go about doing this with the gcloud-node library with the save function?

Thanks
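A hedged sketch, assuming gcloud-node's array form of data, where each property can carry an excludeFromIndexes flag (the kind, property name, and veryLongString are placeholders):

var dataset = gcloud.datastore.dataset({ projectId: 'my-project' });

dataset.save({
  key: dataset.key('Kind'),
  data: [{
    name: 'description',
    value: veryLongString, // more than 500 characters
    excludeFromIndexes: true // unindexed, so the 500-character limit doesn't apply
  }]
}, function (err) {
  if (err) { console.log(err); }
});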


Source: (StackOverflow)

does gcloud-node support assigning a http proxy?

I am using this SDK for the BigQuery service. I want to assign a custom HTTP proxy to it. I found a third-party lib named proxy-agent that can work as an agent, but I didn't find any config fields that can be used to assign the agent. Does anybody here know how to implement it?
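One speculative workaround, assuming gcloud-node's underlying HTTP layer (the request module) honors the standard proxy environment variables; this is an untested assumption, not a documented SDK option:

// Speculative: set the proxy env var before the first API call.
process.env.HTTPS_PROXY = 'http://my-proxy.example.com:3128';

var gcloud = require('gcloud')({
  projectId: 'my-project',
  keyFilename: 'key.json'
});

var bigquery = gcloud.bigquery();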


Source: (StackOverflow)

Logging on gcloud node.js

I'm experimenting with Node.js and managed servers. So far so good. I just can't figure out how to log to the console on my development server. I'm talking about the console I see in the terminal after running gcloud preview app run . (not the browser console). console.log() doesn't appear to do anything at all.


Source: (StackOverflow)

gcloud-node to access bearer token?

Will the gcloud-node API give me the bearer token it is using?

I'm able to create signed URLs with gcloud-node and the keyfile.json, but now I'm trying to follow the resumable upload docs. They suggest starting an upload on the server and passing the session to the client. It looks easy except for the header:

Authorization: Bearer your_auth_token

Can I do something like gcs.getAuth after some kind of init?

Thanks for any help.
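I don't know of a documented gcloud-node getter for its token, but a sketch of minting one yourself with google-auth-library and the same keyfile (the scope shown is an assumption; pick the one your upload needs):

var GoogleAuth = require('google-auth-library');
var key = require('./keyfile.json');

var authFactory = new GoogleAuth();
var jwtClient = new authFactory.JWT(
  key.client_email,
  null,
  key.private_key,
  ['https://www.googleapis.com/auth/devstorage.read_write']
);

jwtClient.authorize(function (err, tokens) {
  if (err) { console.log(err); return; }
  // tokens.access_token is the value for "Authorization: Bearer ..."
  console.log(tokens.access_token);
});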


Source: (StackOverflow)

NodeJS gcloud - Upload to google storage with public-read property/custom cache-expire

I am trying to upload to Google Cloud Storage using the gcloud library (Node.js).

I need to enable the public-read property and also set the cache expiration to 5 minutes.

I am using this (simplified) code:

var storage = gcloud.storage({ /* options */ });
var bucket = storage.bucket('name');

fs.createReadStream(srcPath)
  .pipe(bucket.file(targetFile).createWriteStream())
  .on('error', function (err) { console.log(err); });

How do I go about setting the appropriate ACL/cache expiry? (I found this but am not sure what to make of it: https://googlecloudplatform.github.io/gcloud-node/#/docs/v0.11.0/storage?method=acl)

Thanks for the help
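A sketch, assuming createWriteStream accepts a metadata option and the file.acl.add signature from the linked docs (the 'complete' event name is also an assumption from this era of the API, so verify against your version):

var file = bucket.file(targetFile);

fs.createReadStream(srcPath)
  .pipe(file.createWriteStream({
    metadata: { cacheControl: 'public, max-age=300' } // 5 minutes
  }))
  .on('error', function (err) { console.log(err); })
  .on('complete', function () {
    // Grant read access to everyone.
    file.acl.add({
      entity: 'allUsers',
      role: 'READER'
    }, function (err) {
      if (err) { console.log(err); }
    });
  });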


Source: (StackOverflow)

How to disable Docker container to restart for `gcloud preview app run`

I start my Node.js app with the gcloud preview app run . command.

Every time I update any file inside the project, gcloud restarts the Docker container. It always takes a lot of time.

In the Google Compute Engine docs I found a restartPolicy configuration, but adding it to my app.yaml throws an error when running the app:

ERROR: Configuration is not valid: Unexpected attribute 'restartPolicy' for object of type AppInfoExternal.

That probably means it isn't supposed to work with custom VMs on Google Cloud.

I am wondering if there's a way to tell gcloud to disable the automatic Docker container restart, but still let me trigger a restart manually when I need it?


Source: (StackOverflow)

Google Datastore slow using gcloud-node

I'm trying to call Google Datastore using gcloud-node and am consistently getting response times of around 400ms for very simple queries. The Google App Engine status page says the service is responding to gets in about 10ms at the moment, but I never go below 200ms.

I've tried running both on a Google Compute Engine instance and locally, and it doesn't make any significant difference (I'm not running on Google App Engine).

Am I doing something wrong or is Datastore really this slow?

Here is a trivial program I'm using to test performance:

var gcloud = require('gcloud');

var dataset = gcloud.datastore.dataset({
    projectId: 'myProject',
    keyFilename: 'key.json'
});

var calls = 0;

for (var i = 0; i < 10; i++) {
    var call = 'get' + calls++;
    console.time(call);

    dataset.get(dataset.key(['Kind', 'Name']),
        (function(call) {
            return function(err, entities, nextQuery) {
                if (err) { console.log(err); }

                console.timeEnd(call);
            }
        })(call)
    )
}

I get the following output:

get1: 654ms
get2: 656ms
get4: 657ms
get0: 668ms
get3: 793ms
get5: 916ms
get6: 919ms
get7: 933ms
get8: 952ms
get9: 1055ms

I've tried googling Datastore performance but apart from complaints dating back to 2011 I find very little.

Thankful for any pointers!
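One follow-up experiment worth noting (a sketch, not from the original post): timing a get after the client has already completed a request separates one-time connection and auth handshake cost from steady-state latency:

// Issue one throwaway get first, then time a second one on the
// already-warm connection.
dataset.get(dataset.key(['Kind', 'Name']), function (err) {
  if (err) { console.log(err); }

  console.time('warm get');
  dataset.get(dataset.key(['Kind', 'Name']), function (err) {
    if (err) { console.log(err); }
    console.timeEnd('warm get');
  });
});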


Source: (StackOverflow)

Master/Slave pattern on Google Cloud using Pub/Sub

We want to build a master/slave pattern on Google Cloud. We planned to use Pub/Sub for that (similar to the JMS pattern), letting each worker grab a task from the queue and ack it when done.

But it seems like a subscriber can't get messages sent before it started, and we're not sure how to make sure each message will be processed by a single 'slave'.

Is there a way to do it? Or another mechanism on google cloud for that?
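A sketch under two assumptions: Pub/Sub only delivers messages published after a subscription exists, so the subscription must be created up front; and workers attached to the same named subscription each receive a disjoint share of the messages. The reuseExisting option, the message event, and the ack signature are taken from the gcloud-node docs of this era, so verify against your version:

var pubsub = gcloud.pubsub();
var topic = pubsub.topic('tasks');

// Every worker attaches to the SAME named subscription; Pub/Sub then
// distributes each message to only one of them.
topic.subscribe('worker-pool', { reuseExisting: true }, function (err, subscription) {
  if (err) { console.log(err); return; }

  subscription.on('error', function (err) { console.log(err); });

  subscription.on('message', function (message) {
    // ...process the task, then ack so it is not redelivered...
    subscription.ack(message.ackId, function (err) {
      if (err) { console.log(err); }
    });
  });
});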


Source: (StackOverflow)

Is there a caching library for google datastore api in node.js

I'm looking into using the gcloud node API to access the Datastore API, but was curious whether it supports query caching in a similar manner to ndb. If not, what's the best way to make sure repeated queries are cached?
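To my knowledge gcloud-node has no built-in ndb-style cache, so one option is a small wrapper; a hypothetical sketch with an in-process 60-second TTL (cachedRunQuery and cacheKey are made-up names):

var cache = {}; // cacheKey -> { at: timestamp, entities: [...] }
var TTL_MS = 60 * 1000;

function cachedRunQuery(dataset, cacheKey, query, callback) {
  var hit = cache[cacheKey];
  if (hit && Date.now() - hit.at < TTL_MS) {
    return callback(null, hit.entities); // served from memory
  }
  dataset.runQuery(query, function (err, entities) {
    if (err) { return callback(err); }
    cache[cacheKey] = { at: Date.now(), entities: entities };
    callback(null, entities);
  });
}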


Source: (StackOverflow)

Same Google Cloud Storage upload script works from one PC, but not the other, Why?

I'm trying to upload a file to Google Cloud Storage from Node.js (using the gcloud package) with Service Account credentials, and I get an "invalid_grant" error (probably an authorization error).

When I try to do this from another computer it works fine; the error only occurs on my PC.

var gcloud = require('gcloud')({
    projectId: 'xxxxxxxxxxxxxx31032015',
    keyFilename: './keyfile.json'
});

var storage = gcloud.storage();

var bucket = storage.bucket('test.testBucket.com');

bucket.upload('test.png', function (err, file) {
    console.log(err);
});

Source: (StackOverflow)

Can you update metadata with gcloud-node

I'm using gcloud-node to store files on Google Cloud Storage. I would like to update metadata, like the timestamp, a la touch. Is that possible?
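A sketch using file.setMetadata, assuming the gcloud-node file API; note that GCS manages timestamps like updated itself, so only custom metadata keys (such as the touchedAt shown here, a made-up name) can be set directly:

var bucket = gcloud.storage().bucket('my-bucket');
var file = bucket.file('my-file.txt');

file.setMetadata({
  metadata: { touchedAt: new Date().toISOString() } // custom key/value pair
}, function (err, metadata) {
  if (err) { console.log(err); }
});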


Source: (StackOverflow)