
eventlet

Concurrent networking library for Python (official mirror)

ab is erroring out with apr_socket_recv: Connection refused (61)

I am testing eventlet out, and I am getting this error:

~>ab -n 10 -c 1 http://localhost:8090/
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking localhost (be patient)...apr_socket_recv: Connection reset by peer (54)
Total of 2 requests completed

The website works at localhost:8090/ and returns 200 OK.

I had the same issue with Tomcat; again, the website itself worked fine.

What could the issue be?


Source: (StackOverflow)

What are the benefits of using the Eventlet module in python over the threading module? [closed]

Specifically, the GreenPool class in Eventlet. I have tested some code that uploads large files to S3 as individual parts of a multipart upload. What I have noticed so far is that CPU usage is much lower when using Eventlet. I am just looking for other pros and cons of Eventlet over plain threading. Thanks.
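
A minimal sketch of the pattern the question describes, with a hypothetical upload_part standing in for the real S3 multipart call:

    import eventlet
    eventlet.monkey_patch()  # make socket/ssl cooperative

    def upload_part(part_number):
        # Hypothetical stand-in for the real S3 multipart-upload call;
        # any blocking socket I/O in here yields to other greenthreads.
        pass

    pool = eventlet.GreenPool(50)   # cap concurrency at 50 greenthreads
    for part in range(200):
        pool.spawn_n(upload_part, part)
    pool.waitall()                  # wait until every part is done

All of these greenthreads are cooperatively scheduled inside a single OS thread, so there is none of the per-thread stack and context-switch overhead of hundreds of preemptive threads, which is consistent with the lower CPU usage observed.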


Source: (StackOverflow)


error: command 'gcc' failed with exit status 1 while installing eventlet

I wanted to install eventlet on my system in order to have "Herd" for software deployment, but the terminal is showing a gcc error:

    root@agrover-OptiPlex-780:~# easy_install -U eventlet
    Searching for eventlet
    Reading http://pypi.python.org/simple/eventlet/
    Reading http://wiki.secondlife.com/wiki/Eventlet
    Reading http://eventlet.net
    Best match: eventlet 0.9.16
    Processing eventlet-0.9.16-py2.7.egg
    eventlet 0.9.16 is already the active version in easy-install.pth
    Using /usr/local/lib/python2.7/dist-packages/eventlet-0.9.16-py2.7.egg
    Processing dependencies for eventlet
    Searching for greenlet>=0.3
    Reading http://pypi.python.org/simple/greenlet/
    Reading https://github.com/python-greenlet/greenlet
    Reading http://bitbucket.org/ambroff/greenlet
    Best match: greenlet 0.3.4
    Downloading http://pypi.python.org/packages/source/g/greenlet/greenlet-0.3.4.zip#md5=530a69acebbb0d66eb5abd83523d8272
    Processing greenlet-0.3.4.zip
    Writing /tmp/easy_install-_aeHYm/greenlet-0.3.4/setup.cfg
    Running greenlet-0.3.4/setup.py -q bdist_egg --dist-dir /tmp/easy_install-_aeHYm/greenlet-0.3.4/egg-dist-tmp-t9_gbW
    In file included from greenlet.c:5:0:
    greenlet.h:8:20: fatal error: Python.h: No such file or directory
    compilation terminated.
    error: Setup script exited with error: command 'gcc' failed with exit status 1

Why can't Python.h be found?


Source: (StackOverflow)

What happens when a single request takes a long time with these non-blocking IO servers?

With nodejs, eventlet, or any other non-blocking server, what happens when a given request takes a long time? Does it then block all other requests?

For example, a request comes in and takes 200 ms to compute; this will block other requests, since e.g. nodejs uses a single thread.

That would mean your 15K requests per second drop substantially because of the actual time it takes to compute the response for a given request.

But this just seems wrong to me, so I'm asking what really happens as I can't imagine that is how things work.
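
The intuition is right for CPU-bound work: a cooperative server only switches greenthreads when one of them yields (at I/O or an explicit sleep), so a 200 ms computation really does stall every other request. A minimal eventlet sketch of the usual escape hatch, offloading the computation to a real OS thread with tpool (cpu_heavy is illustrative):

    import time
    import eventlet
    from eventlet import tpool

    def cpu_heavy():
        # Purely CPU-bound: it never touches a socket, so on its own
        # it would never yield, stalling every other greenthread.
        end = time.time() + 0.2
        while time.time() < end:
            pass

    def handle_request(n):
        # Offload the blocking work to eventlet's OS thread pool;
        # this greenthread suspends until the result is ready.
        tpool.execute(cpu_heavy)

    pool = eventlet.GreenPool()
    for n in range(10):
        pool.spawn_n(handle_request, n)
    pool.waitall()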


Source: (StackOverflow)

Python consumes 99% of CPU running eventlet

I have already posted to the Python and eventlet mailing lists, so I apologize if I seem impatient.

I am running eventlet 0.9.16 on a Small (not Micro) reserved Ubuntu 11.10 AWS instance.

I have a socket server that is similar to the echo server from the examples in the eventlet documentation. When I first start running the code, everything seems fine, but I have noticed that after 10 or 15 hours the CPU usage goes from about 1% to 99+%. At that point I am unable to make further connections to the socket server.

This is the code that I am running:

    def socket_listener(self, port, socket_type):
        L.LOGG(self._CONN, 0, H.func(), 'Action:Starting|SocketType:%s' % socket_type)
        listener = eventlet.listen((self._host, port))
        listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        pool = eventlet.GreenPool(20000)
        while True:
            connection, address = listener.accept()
            connection.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            L.LOGG(self._CONN, 0, H.func(), 'IPAddress:%s|GreenthreadsFree:%s|GreenthreadsRunning:%s' % (str(address[0]), str(pool.free()), str(pool.running())))
            pool.spawn_n(self.spawn_socketobject, connection, address, socket_type)
        # Unreachable: the accept loop above never breaks.
        listener.shutdown(socket.SHUT_RDWR)
        listener.close()

The L.LOGG method simply logs the supplied parameters to a mysql table.

I am running the socket_listener in a thread like so:

from threading import Thread

def listen_phones(self):
    self.socket_listener(self._port_phone, 'phone')

t_phones = Thread(target=self.listen_phones)
t_phones.start()

From my initial Google searches I thought the issue might be similar to the bug reported at https://lists.secondlife.com/pipermail/eventletdev/2008-October/000140.html, but I am using a newer version of eventlet, so surely that cannot be it?


Source: (StackOverflow)

Is a greenthread equal to a "real" thread?

I've taken sample code from Understanding eventlet.wsgi.server.

from eventlet import wsgi
import eventlet
from eventlet.green import time
import threading

def hello_world(env, start_response):
    print "got request", eventlet.greenthread.getcurrent(), threading.currentThread()
    time.sleep(10)
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return ['Hello, World!\n']

wsgi.server(eventlet.listen(('', 8090)), hello_world)

When I access the web server from different client IP addresses, I can see the requests are processed in parallel. And from the print in hello_world, I can also see that they are processed in two different greenthreads but in the same OS thread.

I'm new to Python, and I'm curious whether each greenthread is tied to an underlying OS thread.


Source: (StackOverflow)

Celery + Eventlet + non blocking requests

I am using Python requests in Celery workers to make a large number (~10/sec) of API calls (GET, POST, PUT, DELETE). Each request takes around 5-10 s to complete.

I tried running the Celery workers with the eventlet pool at a concurrency of 1000.

Since requests is a blocking library, each concurrent connection is waiting on one request.

How do I make requests asynchronous?
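
For reference, requests itself never becomes asynchronous; with eventlet the usual approach is to monkey-patch the socket layer so each blocking call merely suspends its own greenthread. A minimal standalone sketch (the URLs are placeholders, and Celery's eventlet pool normally applies this patching for you):

    import eventlet
    eventlet.monkey_patch()  # patch socket/ssl before importing requests

    import requests

    def call_api(url):
        return url, requests.get(url, timeout=10).status_code

    pool = eventlet.GreenPool(1000)
    urls = ['http://example.com/'] * 20   # placeholder URLs
    for url, status in pool.imap(call_api, urls):
        print url, status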


Source: (StackOverflow)

Eventlet and Python daemon, Foo not called?

I am trying to build a Python daemon which listens to a queue (Redis via Kombu), grabs each task, and spawns a greenthread to process it.

I can receive the task and consume it without trouble, but when I try to spawn a greenthread with eventlet, it does not seem to do anything at all.

No print, no logging is shown.

import time
import eventlet
from kombu import BrokerConnection
from Queue import Empty  # what SimpleQueue.get raises on timeout

# Daemon, mainLogger and agentConfig come from the surrounding project.
class agent(Daemon):
    """
    Agent
    """
    def run(self):
        # Setup connection
        mainLogger.debug('Connecting to Redis')
        connection = BrokerConnection(
            hostname=agentConfig['redis_host'],
            transport="redis",
            virtual_host=agentConfig['redis_db'],
            port=int(agentConfig['redis_port']))
        connection.connect()

        # Create an eventlet pool of size 5
        pool = eventlet.GreenPool(5)
        q = connection.SimpleQueue("myq")
        while True:
            try:
                message = q.get(block=True, timeout=1)
                print "GOT A MESSAGE FROM Q !"
                pool.spawn_n(self.foo, 'x')
                print "END SPAWN !"
            except Empty:
                mainLogger.debug('No tasks, going to sleep')
                time.sleep(1)

    def foo(self, x):
        mainLogger.debug('\o/')
        print "HELLO FROM SPAWN"

Is there anything I am doing wrong?
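
One eventlet behavior worth illustrating here: spawn_n only schedules the greenthread; it actually runs when the current greenthread yields to the hub, and neither Kombu's blocking q.get nor the unpatched time.sleep ever yields. A standalone sketch of the difference:

    import eventlet

    def foo(x):
        print "HELLO FROM SPAWN", x

    pool = eventlet.GreenPool(5)
    pool.spawn_n(foo, 'x')

    # time.sleep(1) here would NOT run foo: it blocks the whole OS
    # thread without ever handing control to the eventlet hub.
    eventlet.sleep(0)  # cooperative yield; the hub now runs foo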


Source: (StackOverflow)

python AsyncIO for UDP send/receive server?

Can someone recommend a framework that uses Python and eventlet to build a simple but fast UDP receive/ack server?

Note: I don't want to use twisted.
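
Eventlet can do this on its own, without a separate framework: its green socket module makes a plain UDP loop cooperative. A minimal receive/ack sketch (the port and payload are arbitrary):

    import eventlet
    from eventlet.green import socket  # green sockets yield instead of blocking

    def udp_ack_server(port):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(('0.0.0.0', port))
        while True:
            data, addr = sock.recvfrom(4096)  # yields to other greenthreads
            sock.sendto('ack', addr)          # acknowledge immediately

    server = eventlet.spawn(udp_ack_server, 9999)
    server.wait()  # serve until interrupted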


Source: (StackOverflow)

high performance (yet dumb) web server

I'm trying to write a very simple web server that does the following:

  1. Receive request.
  2. Respond with a tiny file; close the connection.
  3. Handle the request data.

In other words, the response doesn't depend on the request information, but the request information is still important. The data will be persisted and then used for analytics.

I've tried to do this with some event-driven networking frameworks, but they all seem to hold the connection until the handling code returns. This makes sense, because generally a server doesn't have to do any work after responding, but in my case there's no need for this particular way of doing things.

Ideally, the server should keep responding to requests, whilst the request data is added to a stack which is emptied as it is persisted.
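
A minimal eventlet sketch of exactly that shape, where persist is a placeholder for the real persistence/analytics work:

    import eventlet
    from eventlet import wsgi

    def persist(data):
        # Placeholder for the real persistence/analytics work.
        pass

    def app(env, start_response):
        # Hand the request data to another greenthread and respond at
        # once; the connection closes without waiting on persist().
        eventlet.spawn_n(persist, env.get('QUERY_STRING', ''))
        start_response('200 OK', [('Content-Type', 'text/plain')])
        return ['ok\n']

    wsgi.server(eventlet.listen(('', 8090)), app)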

We expect to handle thousands of requests per second. Is event-driven programming really the way to go, or should I stick with (traditional) threads? Which language or framework is more appropriate for this kind of work?

Thanks.


Source: (StackOverflow)

Speed of fetching web pages with Eventlet and Python?

I am writing a relatively simple crawler in Python, and I want to use an asynchronous networking library to fetch multiple pages concurrently. I saw the examples on their page, but when I apply the same logic that works for ~200 web pages to ~1000-2000 URLs, the performance slows down. (Most of the URLs are from different domains, and I have shuffled them.) What is the fastest way to crawl this many pages with Eventlet, and what speed can I get (in fetches/s)?

Here is the example:


urls = ["http://www.google.com/intl/en_ALL/images/logo.gif",
        "https://wiki.secondlife.com/w/images/secondlife.jpg",
        "http://us.i1.yimg.com/us.yimg.com/i/ww/beta/y3.gif"]

import eventlet
from eventlet.green import urllib2

def fetch(url):
    return urllib2.urlopen(url).read()

pool = eventlet.GreenPool()

for body in pool.imap(fetch, urls):
    print "got body", len(body)

Source: (StackOverflow)

gevent breaks requests/urllib2 timeouts

Some time ago I wrote code for my Django project that interacts with an external service:

    try:
        response = requests.get('some host', timeout=TIMEOUT)
    except:
        log.warning('timeout')
        return None

I tested it in my development environment (python manage.py runserver) and the timeouts work fine. Then I decided to patch it with gevent by specifying the pool implementation for gunicorn, like so:

python manage.py run_gunicorn -k gevent 

Now the get call is not interrupted by a timeout exception. I changed the pool implementation to eventlet and it works as expected:

python manage.py run_gunicorn -k eventlet

Is there any way to fix gevent?


Source: (StackOverflow)

Lots of socket errors with celery eventlet tasks

I'm getting a lot of "IOError: Socket closed" exceptions from amqplib.client_0_8.method_framing.read_method when running my celery workers with the --pool=eventlet option. I'm also seeing a lot of timeout exceptions from eventlet.hubs.hub.switch.

I'm using an async_manage.py script similar to the one at https://gist.github.com/821848, running the workers like:

./async_manage.py celeryd_detach -E --pool=eventlet --concurrency=120 --logfile=<path>

Is this a known issue, or is there something wrong with my configuration or setup?

I'm running djcelery 2.2.4, Django 1.3, and eventlet 0.9.15.


Source: (StackOverflow)

eventlet.spawn doesn't work as expected

I'm writing a web UI for data analysis tasks.

Here's the way it's supposed to work:

After a user specifies parameters like the dataset and learning rate, I create a new task record, an executor for this task is started asynchronously (the executor may take a long time to run), and the user is redirected to some other page.

After searching for an async library for Python, I started with eventlet; here's what I wrote in a Flask view function:

db.save(task)
eventlet.spawn(executor, task)
return redirect("/show_tasks")

With the code above, the executor didn't execute at all.

What might be the problem with my code? Or should I try something else?
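
One eventlet detail that is easy to illustrate: a greenthread created with eventlet.spawn only runs once the spawning greenthread yields to the hub, so if the surrounding server never yields (for example, it is not monkey-patched), the executor never gets scheduled. A standalone sketch:

    import eventlet

    def executor(task):
        print "working on", task

    eventlet.spawn(executor, "task-1")
    # Without a yield, this program can exit (or a view can return its
    # response) before the hub ever schedules executor.
    eventlet.sleep(0)  # yield to the hub; executor now runs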


Source: (StackOverflow)

How do I truly implement a timeout in Python?

How do I truly implement a timeout in Python? I am using http://eventlet.net/doc/modules/timeout.html

Code looks like:

#!/usr/bin/python
import eventlet
import time
import sys
import random

while True:
    try:
        with eventlet.timeout.Timeout(1, False):
            print 'limited by timeout execution'
            while True:
                print '\r' + str(random.random()),
                sys.stdout.flush()
                eventlet.sleep(0)
            print ' Never printed Secret! '
    except Exception as e:
        print ' Exception: ', e
    finally:
        print ''
        print ' Timeout reached '
        print ''

The timeout is never reached. Where am I wrong?

P.S. I replaced:

    time.sleep(0.1)

with:

    eventlet.sleep(0)

and added False to suppress the timeout exception, so now it works well:

    with eventlet.timeout.Timeout(1):

changed to:

    with eventlet.timeout.Timeout(1, False):

But it only works together with eventlet.sleep(0.1).

E.g. this code is wrong:

#!/usr/bin/python
import eventlet
import time

start_time = time.time()
data = 0
with eventlet.timeout.Timeout(1, False):
    while True:
        data += 1   # never yields, so the timeout never gets a chance to fire
print 'Catch data ', data, ' in ', time.time() - start_time

I simply added a zero-second sleep inside the loop:

    eventlet.sleep(0)

and it works like a charm. The sleep yields control to the eventlet hub, and the Timeout can only fire when the hub gets control; a greenthread that never yields can never be interrupted.

Solved


Source: (StackOverflow)