EzDevInfo.com

google-api-python-client

How to get Google Analytics credentials without gflags - using run_flow() instead?

This may take a second to explain so please bear with me:

I'm working on a project for work that requires me to pull in Google Analytics data. I originally did this following this link; after installing the API client (pip install --upgrade google-api-python-client) and setting things up like the client_secrets.json, it wanted gflags to be installed in order to execute the run() statement (i.e. credentials = run(FLOW, storage)).

Now, I was getting an error message telling me to install gflags, or better, to use run_flow() (the exact error message was this):

NotImplementedError: The gflags library must be installed to use tools.run(). Please install gflags or preferably switch to using tools.run_flow().

I originally used gflags (a few months ago), but it wasn't compatible with our framework (Pyramid), so we removed it until we could figure out what the issue was. The reason it's preferable to switch from gflags to run_flow() is that gflags has been deprecated, so I don't want to keep using it as I had. What I'm trying to do now is switch over to using run_flow().

The issue with this is that run_flow() expects command-line arguments to be sent to it, and this is not a command-line application. I found some documentation that was helpful, but I'm stuck on building the flags for the run_flow() function.

Before showing code, one more thing to explain.

run_flow() takes three arguments (documentation here). It takes the flow and storage just like run() does, but it also takes a flags object. The gflags library built a flags ArgumentParser object that was used in the oauth2client execution method.

A few other links that were helpful in building the ArgumentParser object:

The second link is very helpful for seeing how it would be executed. When I try to do something similar, sys.argv pulls in the location of the virtual environment that is running (i.e. pserve) and also pulls in my .ini file (which stores credentials for my machine to run the virtual environment). That throws an error because the parser is expecting something else, and this is where I'm stuck.

  • I don't know what flags object I need to build to send through run_flow()
  • I don't know what argv arguments need to be passed for the statement flags = parser.parse_args(argv[1:]) to retrieve the correct information (I don't even know what the correct information is supposed to be)

Code:

CLIENT_SECRETS = client_file.uri
MISSING_CLIENT_SECRETS_MESSAGE = '%s is missing' % CLIENT_SECRETS
FLOW = flow_from_clientsecrets(
    CLIENT_SECRETS,
    scope='https://www.googleapis.com/auth/analytics.readonly',
    message=MISSING_CLIENT_SECRETS_MESSAGE
)
TOKEN_FILE_NAME = 'analytics.dat'

def prepare_credentials(self, argv):
    storage = Storage(self.TOKEN_FILE_NAME)
    credentials = storage.get()

    if credentials is None or credentials.invalid:
        parser = argparse.ArgumentParser(description=__doc__,
            formatter_class=argparse.RawDescriptionHelpFormatter,
            parents=[tools.argparser])
        flags = parser.parse_args(argv[1:]) # i could also do just argv, both error
        credentials = run_flow(self.FLOW, storage, flags) 
    return credentials

def initialize_service(self, argv):
    http = httplib2.Http()
    credentials = self.prepare_credentials(argv)  # self is passed implicitly
    http = credentials.authorize(http)
    return build('analytics', 'v3', http=http)

I call a main function passing sys.argv that calls the initialize_service

def main(self, argv):
    service = self.initialize_service(argv)

    try:
        #do a query and stuff here

I knew this wouldn't work because my application is not a command-line application but rather a fully integrated service, but I figured it was worth a shot. Any thoughts on how to build the flags object correctly?
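
For reference, since run_flow() only reads attributes off the flags object, one workaround in a non-CLI app is to build the namespace directly instead of parsing sys.argv. A minimal sketch; the attribute names mirror what oauth2client's tools.argparser defines by default and are an assumption, so check them against your installed version:

```python
import argparse

# Build the flags object run_flow() expects without touching sys.argv.
# Attribute names are assumed to match oauth2client's tools.argparser
# defaults; verify against your installed oauth2client version.
flags = argparse.Namespace(
    auth_host_name='localhost',
    auth_host_port=[8080, 8090],
    logging_level='ERROR',
    noauth_local_webserver=True,  # don't try to open a browser from pserve
)

print(flags.noauth_local_webserver)
```

Equivalently, tools.argparser.parse_args([]) should yield the same defaults without ever consulting the real command line, sidestepping the pserve/.ini arguments entirely.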


Source: (StackOverflow)

Create an empty spreadsheet in Google Drive using the Google Spreadsheet API (Python)?

I want to create an empty spreadsheet (metadata only) in Google Drive. When I referred to the Google Spreadsheet API, it said to use the Documents List API, but that API is deprecated in favor of the Google Drive API, and in the Drive API I could not find a way to create an empty spreadsheet (metadata only). Any clue on this?
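
For what it's worth, the Drive API can create a native, empty Google Sheet from metadata alone: insert a file whose mimeType is the Google Sheets type and attach no media body. A hedged sketch; the service object and title are assumptions, not taken from the question:

```python
# Metadata-only insert: this special mimeType tells Drive to create an
# empty native Google Sheet rather than an ordinary file.
body = {
    'title': 'empty sheet',  # any title you like
    'mimeType': 'application/vnd.google-apps.spreadsheet',
}

# With an authorized Drive v2 service (not constructed here):
# created = service.files().insert(body=body).execute()
print(body['mimeType'])
```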


Source: (StackOverflow)


ImportError: No module named apiclient.discovery

I got this error in a Google App Engine Python app that uses the Google Translate API, but I don't know how to fix it:

<module>
from apiclient.discovery import build
ImportError: No module named apiclient.discovery

I tried setting the environment to point to the Google App Engine SDK and uploading to Google App Engine again, but I always get the error:

Error: Server Error

The server encountered an error and could not complete your request. If the problem persists, please report your problem and mention this error message and the query that caused it.

Please tell me how to fix this.

Thanks

UPDATE: Fixed. Following Nijjin's help, I fixed the problem by adding the following folders to my project:

apiclient, gflags, httplib2, oauth2client, uritemplate


Source: (StackOverflow)

Python bigquery - ImportError: No module named google.apputils

I am trying to use the Google BigQuery Python library, but whenever I run import bq I get the following error:

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-21-923a4eec0450> in <module>()
----> 1 import bq

/Users/tahirfayyaz/anaconda/python.app/Contents/lib/python2.7/site-packages/bq.py in      <module>()
     31 import oauth2client.tools
     32 
---> 33 from google.apputils import app
     34 from google.apputils import appcommands
     35 import gflags as flags

ImportError: No module named google.apputils

I have installed and even upgraded google-apputils, but I still get this error.


Source: (StackOverflow)

Are there published specs for Google API bandwidth?

I am developing a Google App Engine (or possibly just direct GCS) application. The client captures video streams from cameras and would like to selectively stream these to the cloud.

Possible targets are:

  • A Google App Engine app
  • Google Cloud Storage (JSON)
  • Google Drive API
  • Google Hangouts (incl. Hangouts on Air)

When considering the different architecture choices (which all have great features), we need to know what the maximum upload rate is.

Maybe this is a very naive question and the rate is "faster than you can give it data". If so, then that's all I need to know, and that's excellent. But if not, I'd love to have some idea of the throughput capabilities for a single connected client.

Is that a reasonable question? Thanks in advance for your help!


Source: (StackOverflow)

Google API Python client error

I want to have a script for getting the home feed of Google+; I am using Google's script for that. The client-secrets.json file is:

{
 "web": {
   "client_id": "##########",
   "client_secret": "############",
   "redirect_uris": ["http://localhost:8080/oauth2callback/"],
   "auth_uri": "https://accounts.google.com/o/oauth2/auth",
   "token_uri": "https://accounts.google.com/o/oauth2/token",
   "client_email":"##########@developer.gserviceaccount.com",
   "javascript_origins":["http://localhost:8080/"]
        }
}

But when I start this app, it opens a page with an error and a broken robot:

The redirect URI in the request: http://localhost:8080/ did not match a registered redirect URI

Please help me with my problem.


Source: (StackOverflow)

Errno 185090050 _ssl.c:343: error:0B084002:x509 certificate routines:X509_load_cert_crl_file:system lib, after packaging to exe by PyInstaller

I wrote a Python script to check files in GCS; it uses wxPython to generate the GUI. To authenticate, I followed the approach in Google's sample code -> http://code.google.com/p/google-cloud-platform-samples/source/browse/file-transfer-json/chunked_transfer.py?repo=storage:

flow = flow_from_clientsecrets(CLIENT_SECRETS_FILE, 
                                # the secrets file I put in same folder
                               scope=scope,
                               message=MISSING_CLIENT_SECRETS_MESSAGE)

credential_storage = CredentialStorage(CREDENTIALS_FILE) # the file to store
                                                    # authentication credentials
credentials = credential_storage.get()
if credentials is None or credentials.invalid:
    credentials = run_oauth2(flow, credential_storage)

self.printLog('Constructing Google Cloud Storage service...')
http = credentials.authorize(httplib2.Http())
return discovery_build('storage', 'v1beta1', http=http)

The code above is contained in my Python script, which works very well as a plain .py file. Later I converted it to an .exe on Windows 7 64-bit with PyInstaller (I also put the secrets file in the same folder as the .exe file), using the command below:

C:\gcs_file_check>python pyinstaller-2.0\pyinstaller.py -w gcs_file_check.py

When clicking on the .exe, the Request for Permission page from Google is launched correctly (just as when running the Python script rather than the .exe), but after I click Accept, the code above throws an exception:

[Errno 185090050] _ssl.c:343: error:0B084002:x509 certificate routines:X509_load_cert_crl_file:system lib

And I can see that the credentials.json file is not created by the .exe, while the Python script creates this file correctly.

Does anyone know why this happens and how to fix it? I appreciate every answer!

===================

updated on 04/16:

I added debugging code and found that the exception comes exactly from the code below:

if credentials is None or credentials.invalid:
    credentials = run_oauth2(flow, credential_storage)

===================

updated:

To add more detail: previously I was using oauth2client.tools.run()

from oauth2client.tools import run as run_oauth2

Now I change to run_flow() as the source code suggested -> https://google-api-python-client.googlecode.com/hg/docs/epy/oauth2client.tools-pysrc.html#run

from oauth2client.tools import run_flow as run_oauth2

Now this part of code is:

parser = argparse.ArgumentParser(
    description=__doc__,
    formatter_class=argparse.RawDescriptionHelpFormatter,
    parents=[tools.argparser])
flags = parser.parse_args(sys.argv[1:])

if credentials is None or credentials.invalid:
    credentials = run_oauth2(flow, credential_storage, flags)

But still: the Python code works well, while the .exe packaged by PyInstaller throws the same [Errno 185090050] exception.
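
One cause worth checking (an assumption, not a confirmed diagnosis): httplib2 loads its CA bundle, cacerts.txt, from its own package directory at runtime, and PyInstaller does not bundle package data files by default, so certificate loading can fail only inside the .exe. A sketch of a .spec addition that packages the file; the paths and spec layout are illustrative for PyInstaller 2.0 and should be adapted to your generated spec:

```python
# In gcs_file_check.spec, after the Analysis object `a` is created:
import os
import httplib2

# httplib2 resolves cacerts.txt relative to its own __file__, so the
# frozen app must carry the file at the same relative location.
cacerts = os.path.join(os.path.dirname(httplib2.__file__), 'cacerts.txt')
a.datas += [(os.path.join('httplib2', 'cacerts.txt'), cacerts, 'DATA')]
```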


Source: (StackOverflow)

Cannot get service account authorization to work on GCS script using Python client lib APIs

In attempting to write a python script to access GCS using service-based authorization, I have come up with the following. Note that 'key' is the contents of my p12 file.

I am attempting to just read the list of buckets on my account. I have successfully created one bucket using the web interface to GCS, and can see that with gsutil.

When I execute the code below I get a 403 error. At first I thought I was not authorized correctly, but I tried from this very useful web page (which uses web-based authorization), and it works correctly. https://developers.google.com/apis-explorer/#p/storage/v1beta1/storage.buckets.list?projectId=&_h=2&

When I look at the headers and query string and compare them to the headers and query of the website-generated request, I see that there is no authorization header and no key= tag in the query string. I had assumed that the credential authorization would take care of this for me.

What am I doing wrong?

code:

credentials = SignedJwtAssertionCredentials(
      'xxx-my-long-email-from-the-console@developer.gserviceaccount.com',
      key,
      scope='https://www.googleapis.com/auth/devstorage.full_control')
http = httplib2.Http()
http = credentials.authorize(http)

service = build("storage", "v1beta1", http=http)

# Build the request

request = service.buckets().list(projectId="159910083329")

# Diagnostic

pprint.pprint(request.headers)
pprint.pprint(request.to_json())

# Do it!

response = request.execute()

When I try to execute I get the 403.


Source: (StackOverflow)

from oauth2client.appengine import oauth2decorator_from_clientsecrets ImportError: No module named appengine

I'm getting the following error while trying to run my application:

from oauth2client.appengine import oauth2decorator_from_clientsecrets
ImportError: No module named appengine

Here is my main.py code

import httplib2
import os
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app
from oauth2client.appengine import oauth2decorator_from_clientsecrets

CLIENT_SECRETS = os.path.join(os.path.dirname(__file__), 'client_secrets.json')

decorator = oauth2decorator_from_clientsecrets(CLIENT_SECRETS,
'https://www.googleapis.com/auth/bigquery')

class MainHandler(webapp.RequestHandler):
    @decorator.oauth_required
    def get(self):
        self.response.out.write("Hello Dashboard!\n")

application = webapp.WSGIApplication([
     ('/', MainHandler),
     ], debug=True)

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
   main()

And here is my app.yaml

application: hellomydashboard
version: 1
runtime: python27
api_version: 1
threadsafe: false

handlers:
- url: /favicon\.ico
  static_files: favicon.ico
  upload: favicon\.ico

- url: /oauth2callback
  script: oauth2client/appengine.py

- url: .*
  script: main.app

Source: (StackOverflow)

When attempting to upload a UTF-8 text file to Google Drive with the Google API client for python, I get a UnicodeDecodeError

This question is still unsolved! Please answer if you know the solution.

Bug

I have filed a bug here

http://code.google.com/p/google-api-python-client/issues/detail?id=131&thanks=131&ts=1335708962

While working on my gdrive-cli project, I ran into this error when attempting to upload a UTF-8 markdown file using the "text/plain" MIME type. I also tried "text/plain;charset=utf-8" and got the same result.

Here is the stacktrace I got:

Traceback (most recent call last):
  File "./gdrive-cli", line 155, in <module>
    handle_args(args)
  File "./gdrive-cli", line 92, in handle_args
    handle_insert(args.insert)
  File "./gdrive-cli", line 126, in handle_insert
    filename)
  File "/home/tom/Github/gdrive-cli/gdrive/gdrive.py", line 146, in insert_file
    media_body=media_body).execute()
  File "/usr/local/lib/python2.7/dist-packages/apiclient/http.py", line 393, in execute
    headers=self.headers)
  File "/usr/local/lib/python2.7/dist-packages/oauth2client/client.py", line 401, in new_request
    redirections, connection_type)
  File "/usr/local/lib/python2.7/dist-packages/httplib2/__init__.py", line 1544, in request
    (response, content) = self._request(conn, authority, uri, request_uri, method, body, headers, redirections, cachekey)
  File "/usr/local/lib/python2.7/dist-packages/httplib2/__init__.py", line 1294, in _request
    (response, content) = self._conn_request(conn, request_uri, method, body, headers)
  File "/usr/local/lib/python2.7/dist-packages/httplib2/__init__.py", line 1231, in _conn_request
    conn.request(method, request_uri, body, headers)
  File "/usr/lib/python2.7/httplib.py", line 955, in request
    self._send_request(method, url, body, headers)
  File "/usr/lib/python2.7/httplib.py", line 989, in _send_request
    self.endheaders(body)
  File "/usr/lib/python2.7/httplib.py", line 951, in endheaders
    self._send_output(message_body)
  File "/usr/lib/python2.7/httplib.py", line 809, in _send_output
    msg += message_body
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 4518: ordinal not in range(128)

And the command I had to issue to generate it was:

gdrive-cli --insert README.md "readme file" none "text/plain" README.md

You can get the exact README.md file from the time this problem occurred here: http://tomdignan.com/files/README.md
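
The bottom frame (msg += message_body) is the usual tell: somewhere a unicode string gets concatenated with UTF-8 encoded bytes, and Python 2 then attempts an implicit ASCII decode of the bytes; 0xe2 is the lead byte of a multi-byte UTF-8 sequence. A small illustration of the mismatch and the conventional remedy of decoding explicitly so both operands are text (the header string here is hypothetical, not taken from httplib):

```python
# -*- coding: utf-8 -*-
headers = u'Content-Type: text/plain; charset=utf-8\r\n\r\n'
body_bytes = b'\xe2\x80\x94'  # UTF-8 encoding of an em dash (starts with 0xe2)

# In Python 2, `headers + body_bytes` triggers the implicit ASCII decode
# seen in the traceback. Decoding explicitly avoids it:
body_text = body_bytes.decode('utf-8')
msg = headers + body_text
print(len(msg))
```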

The relevant code from the SDK examples follows. The parameters going in are in order:

a service instance, "README.md", "readme file", None (python keyword), "text/plain", and "README.md"

def insert_file(service, title, description, parent_id, mime_type, filename):
    """Insert new file.

    Args:
        service: Drive API service instance.
        title: Title of the file to insert, including the extension.
        description: Description of the file to insert.
        parent_id: Parent folder's ID.
        mime_type: MIME type of the file to insert.
        filename: Filename of the file to insert.
    Returns:
        Inserted file metadata if successful, None otherwise.
    """
    media_body = MediaFileUpload(filename, mimetype=mime_type)
    body = {
        'title': title,
        'description': description,
        'mimeType': mime_type
    }

    # Set the parent folder.
    if parent_id:
        body['parentsCollection'] = [{'id': parent_id}]


    try:
        file = service.files().insert(
                body=body,
                media_body=media_body).execute()

        # Uncomment the following line to print the File ID
        # print 'File ID: %s' % file['id']

        return file
    except errors.HttpError, error:
        print "TRACEBACK"
        print traceback.format_exc()
        print 'An error occurred: %s' % error
        return None

Source: (StackOverflow)

Insert a file into Google Drive using Google App Engine (Python client API)

Using Google App Engine, I am trying to insert a file "a.txt" into my Google Drive. The error I get when I view the page source of the InsertDrive page is:

HttpError 401 "Login Required" bound method InsertDrive.error of main.InsertDrive object at 0x10f884b0

Note: I am calling the InsertDrive class from my MainHandler class by rendering its URL in the Jinja template for the MainHandler class.

import httplib2
import logging
import os
import sys


from os import path
from apiclient.discovery import build
from apiclient.http import MediaFileUpload
from oauth2client.client import flow_from_clientsecrets
from oauth2client.file import Storage
from oauth2client.tools import run
from apiclient import discovery
from oauth2client import appengine
from oauth2client import client
from google.appengine.api import memcache

from apiclient import errors
from apiclient.http import MediaFileUpload

import webapp2
import jinja2

CREDENTIAL = 'drive.credential'
CLIENT_SECRET_JSON = 'client_secrets.json'
SCOPE = 'https://www.googleapis.com/auth/drive'

FILE_NAME = 'a.txt'




JINJA_ENVIRONMENT = jinja2.Environment(
        loader=jinja2.FileSystemLoader(os.path.dirname(__file__)),
        autoescape=True,
        extensions=['jinja2.ext.autoescape'])


CLIENT_SECRETS = os.path.join(os.path.dirname(__file__), 'client_secrets.json')


MISSING_CLIENT_SECRETS_MESSAGE = """
Warning: Please configure OAuth 2.0 using the client secrets file: %s

""" % CLIENT_SECRETS

http = httplib2.Http(memcache)
service = discovery.build('drive', 'v2', http=http)
decorator = appengine.oauth2decorator_from_clientsecrets(
        CLIENT_SECRETS,
        scope=[
            'https://www.googleapis.com/auth/drive',
            'https://www.googleapis.com/auth/drive.appdata',
            'https://www.googleapis.com/auth/drive.apps.readonly',
            'https://www.googleapis.com/auth/drive.file',
            'https://www.googleapis.com/auth/drive.metadata.readonly',
            'https://www.googleapis.com/auth/drive.readonly',
            'https://www.googleapis.com/auth/drive.scripts',
            ],
        message=MISSING_CLIENT_SECRETS_MESSAGE)
title="a.txt"
description="none"
mime_type="text/*"
filename="a.txt"
parent_id=None

class MainHandler(webapp2.RequestHandler):

    @decorator.oauth_aware
    def get(self):
        insert_url = "/InsertDrive"
        if not decorator.has_credentials():
            url = decorator.authorize_url()
            self.redirect(url)
            self.response.write("Hello")
        # variables = {
        #     'url': decorator.authorize_url(),
        #     'has_credentials': decorator.has_credentials(),
        #     'insert_url': "/InsertDrive"
        # }
        template = JINJA_ENVIRONMENT.get_template('main.html')
        self.response.write(template.render(insert_url=insert_url))

class InsertDrive(webapp2.RequestHandler):
    # ADDED FUNCTION TO UPLOAD #

    def get(self):
        self.response.out.write('<h1>entered</h1>')
        media_body = MediaFileUpload(filename, mimetype=mime_type, resumable=True)
        self.response.write(media_body)
        body = {
            'title': title,
            'description': description,
            'mimeType': mime_type
        }
        self.response.write(body)

        # Set the parent folder.
        if parent_id:
            body['parents'] = [{'id': parent_id}]
        self.response.write(parent_id)

        try:
            file = service.files().insert(
                body=body,
                media_body=media_body).execute()
            self.response.write(file)

            # Uncomment the following line to print the File ID
            # print 'File ID: %s' % file['id']

        except errors.HttpError, error:
            self.response.write('<h1>checking if error</h1>: %s' % error)
            self.response.write(self.error)
            print 'An error occurred: %s' % error

app = webapp2.WSGIApplication(
        [
            ('/', MainHandler),
            ('/InsertDrive' , InsertDrive),
            (decorator.callback_path, decorator.callback_handler()),
            ],
        debug=True)

Any help would be greatly appreciated. Thanks, kira_111


Source: (StackOverflow)

Google Calendar API - recurring events instances are not fetched

I'm trying to fetch all events from a Google calendar using V3 API. I've noticed an issue regarding recurring events. For some recurring events, on some calendars, only the first instances are fetched (for example - first 5 out of total of 8 instances are fetched).

A few additional details:

  • I've double checked that the query date-range is correct.

  • Problem occurred few times - for different Google Apps organizations.

  • Problem occurs also if I try to fetch the events from the calendar of the event creator.
  • We are using Google's python library for fetching.

Any inputs will be appreciated. Thanks!


Source: (StackOverflow)

JWTAssertionCredentials with service account fails with asn1 not enough data error

When attempting to use SignedJwtAssertionCredentials() with a Google service account, I receive the following error on one Windows Server 2008 machine, but not on a Windows 7 machine locally.

Error: [('asn1 encoding routines', 'ASN1_D2I_READ_BIO', 'not enough data')]

I am reading the p12 key file as follows before passing it to SignedJwtAssertionCredentials().

    with open(path_to_key_file, 'r') as f:
        private_key = f.read()
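
One platform-dependent suspect worth ruling out (an assumption, not a confirmed diagnosis): a .p12 file is binary, and opening it in text mode ('r') on Windows applies newline translation and treats 0x1A as end-of-file, which can corrupt the DER data enough to produce exactly this kind of "not enough data" ASN.1 error. Reading in binary mode ('rb') sidesteps it; a self-contained sketch with throwaway bytes:

```python
import os
import tempfile

# Throwaway "key" bytes containing 0x0d 0x0a (CRLF) and 0x1a, the bytes
# Windows text mode translates or truncates on.
data = b'\x30\x82\x0d\x0a\x1a\x04\x01\x02'
path = os.path.join(tempfile.mkdtemp(), 'key.p12')
with open(path, 'wb') as f:
    f.write(data)

# Binary mode returns the bytes untouched on every platform.
with open(path, 'rb') as f:
    private_key = f.read()

print(private_key == data)
```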

Source: (StackOverflow)

How can I log in to an arbitrary user in appengine for use with the Drive SDK?

I have an application that needs to log into a singular Drive account and perform operations on the files automatically using a cron job. Initially, I tried to use the domain administrator login to do this, however I am unable to do any testing with the domain administrator as it seems that you cannot use the test server with a domain administrator account, which makes testing my application a bit impossible!

As such, I started looking at storing arbitrary OAuth tokens (especially the refresh token) to log into this account automatically after the initial setup. However, all of the APIs and documentation assume that multiple individual users are logging in manually, and I cannot find functionality in the OAuth APIs that allows or accounts for logging in as anything but the currently logged-in user.

How can I achieve this in a way that I can test my code on a test domain? Can I do it without writing my own oauth library and doing the oauth requests by hand? Or is there a way to get the domain administrator authorization to work on a local test server?


Source: (StackOverflow)

How do you retrieve files over 64KB in size with Cloud Storage JSON API?

I need to retrieve files in my web application that are greater than 64KB in size. Right now, in v1beta1 of the JSON API, Google only allows uploads/downloads of 64KB via the JSON API. I've figured out how to upload files over 64KB using a "resumable" upload (not via the interface v1beta1 provides in its JSON API, but manually).

What I can't figure out is a good way to download. Right now, I make the ACL public for the object I want to download, download the file, and then remove the public ACL on the object. This is not only inefficient, it is not very clean. Is there a better method I could use or am I stuck until Google provides a better means in a future version of their API?

Background information: I am writing a GAE application and I know of the google.appengine.api.files interface. Unfortunately this does not work on live buckets from the local dev environment, and for testing purposes my team and I need a way to test locally (it is too cumbersome to deploy to GAE, among other limiting/security factors). We can interact with all other APIs except Cloud Storage, so I'm writing a class that will use either the JSON API or App Engine's files interface when reading/writing/deleting from Cloud Storage. I have a working implementation, but I'm unhappy with the way I retrieve files.

Clarification from a comment below: We are downloading large amounts of information, massaging it, and storing it in Cloud Storage for consumption by BigQuery. We need to use live buckets from the dev environment because otherwise BigQuery won't be able to consume the data we want to test. There is no need to serve these files, just to process them.

Solution from comment on accepted Answer below: I was able to reuse my authenticated httplib2 object from my code that interacts with the JSON API to do an authenticated GET request against the https://{bucket_name}.storage.googleapis.com/{object_name} URL endpoint, adding only the Content-Length: 0 and x-goog-api-version: 2 headers.
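
Spelling that solution out slightly: the authenticated GET targets the bucket-scoped endpoint directly. A sketch of assembling the request (bucket/object names and the authorized http object are placeholders):

```python
bucket = 'my-bucket'
obj = 'path/to/object.csv'
url = 'https://%s.storage.googleapis.com/%s' % (bucket, obj)

# The two headers the author reports adding to the request:
headers = {
    'Content-Length': '0',
    'x-goog-api-version': '2',
}

# With the httplib2 object already authorized for the JSON API:
# resp, content = http.request(url, 'GET', headers=headers)
print(url)
```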


Source: (StackOverflow)