
redis-py

Redis Python Client

redis-py raises AttributeError

In what circumstances would redis-py raise the following AttributeError exception?
Isn't redis-py designed to raise only redis.exceptions.RedisError-based exceptions?
What would be a reasonable handling logic?

Traceback (most recent call last):
  File "c:\Python27\Lib\threading.py", line 551, in __bootstrap_inner
    self.run()
  File "c:\Python27\Lib\threading.py", line 504, in run
    self.__target(*self.__args, **self.__kwargs)
  File "C:\Users\Administrator\Documents\my_proj\my_module.py", line 33, in inner
    ret = protected_func(*args, **kwargs)
  File "C:\Users\Administrator\Documents\my_proj\my_module.py", line 104, in _listen
    for message in _pubsub.listen():
  File "C:\Users\Administrator\virtual_environments\my_env\lib\site-packages\redis\client.py", line 1555, in listen
    r = self.parse_response()
  File "C:\Users\Administrator\virtual_environments\my_env\lib\site-packages\redis\client.py", line 1499, in parse_response
    response = self.connection.read_response()
  File "C:\Users\Administrator\virtual_environments\my_env\lib\site-packages\redis\connection.py", line 306, in read_response
    response = self._parser.read_response()
  File "C:\Users\Administrator\virtual_environments\my_env\lib\site-packages\redis\connection.py", line 104, in read_response
    response = self.read()
  File "C:\Users\Administrator\virtual_environments\my_env\lib\site-packages\redis\connection.py", line 89, in read
    return self._fp.readline()[:-2]
AttributeError: 'NoneType' object has no attribute 'readline'
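The AttributeError here comes from the parser's buffered socket file being None after a disconnect, so in practice it behaves like a connection error even though it escapes the RedisError hierarchy. A hedged sketch of handling logic, assuming `make_pubsub` is a hypothetical factory that returns a fresh, not-yet-subscribed PubSub-like object and `handle` processes each message:

```python
import time

try:
    from redis.exceptions import ConnectionError as RedisConnectionError
except ImportError:  # let the sketch run even without redis installed
    RedisConnectionError = ConnectionError

def listen_with_retry(make_pubsub, handle, channel, max_retries=5, delay=0):
    """Consume pub/sub messages, treating AttributeError the same as a
    connection error: in this traceback it means the connection's file
    handle was None after the socket dropped mid-read."""
    retries = 0
    while True:
        pubsub = make_pubsub()          # rebuild connection from scratch
        pubsub.subscribe(channel)
        try:
            for message in pubsub.listen():
                retries = 0             # a successful read resets the budget
                handle(message)
            return                      # generator ended cleanly
        except (AttributeError, RedisConnectionError):
            retries += 1
            if retries >= max_retries:
                raise
            time.sleep(delay)           # back off, then reconnect
```

Whether retrying is the right policy depends on why the socket dropped; the retry budget above just keeps a permanently-dead server from spinning forever.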

Source: (StackOverflow)

Use sorted set to notifications system

I am using Redis sorted sets to save user notifications, but since I have never built a notification system before, I would like a sanity check on my logic.

I need to save 4 things for each notification.

  • post_id
  • post_type - A/B
  • visible - Y/N
  • checked - Y/N

My question is how can I store this type of structure in sorted sets?

ZADD users_notifications:1 10 1_A_Y_Y 
ZADD users_notifications:1 20 2_A_Y_N
....

Is there a better way to do this kind of thing in Redis? In the scheme above I am packing all four fields into each member, and I have to split on the underscore in the application language.
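If the packed-member scheme is kept, a pair of small helpers can at least centralise the encoding and decoding; the names here are illustrative, not part of redis-py:

```python
def encode_member(post_id, post_type, visible, checked):
    """Pack the four fields into one sorted-set member, using Y/N for
    the boolean flags, matching the 1_A_Y_Y scheme from the question."""
    return '%d_%s_%s_%s' % (post_id, post_type,
                            'Y' if visible else 'N',
                            'Y' if checked else 'N')

def decode_member(member):
    """Split a member back into a dict; the inverse of encode_member."""
    post_id, post_type, visible, checked = member.split('_')
    return {'post_id': int(post_id), 'post_type': post_type,
            'visible': visible == 'Y', 'checked': checked == 'Y'}
```

A common alternative is to keep only the notification id in the sorted set and put the four fields in a per-notification hash (e.g. a hypothetical `notification:<id>` key): then a flag like `checked` can be flipped with a single HSET instead of removing and re-adding the packed member.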


Source: (StackOverflow)


Redis: Finding the SCARD of an SINTER result without storing the intermediate set

I need the length (SCARD) of the intersection of two large sets in Redis.

So this achieves what I want:

> SINTERSTORE intermediate s:1 s:2
> SCARD intermediate

However the sets are large, so I don't want to store the intermediate value. Conceptually I want:

> SCARD (SINTER s:1 s:2)

Is there a way to achieve this in a single command, perhaps with Lua scripting? Or is my best bet to script it out in my application language and delete the intermediate value when I'm done? e.g. using python and redis-py:

>>> r = redis.Redis(...)
>>> pipe = r.pipeline()
>>> res = pipe.sinterstore('intermediate', 's:1', 's:2').scard('intermediate').delete('intermediate').execute()
>>> print res[1]
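One way to get this in a single round trip is a tiny Lua script that runs SINTER server-side and returns only the length, so the intersection is never stored under a key or shipped to the client (assumes Redis >= 2.6 for scripting; `sinter_card` is an illustrative helper name):

```python
# SINTER runs server-side; '#' takes the length of the returned table,
# so only an integer crosses the wire.
SINTER_CARD_LUA = "return #redis.call('SINTER', KEYS[1], KEYS[2])"

def sinter_card(r, key_a, key_b):
    """Cardinality of the intersection of two sets, without storing an
    intermediate key. `r` is a redis-py client (eval(script, numkeys,
    *keys) is the standard redis-py signature)."""
    return r.eval(SINTER_CARD_LUA, 2, key_a, key_b)
```

Redis 7.0+ also ships a native SINTERCARD command that does exactly this without scripting.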

Source: (StackOverflow)

redis-py "ConnectionError: Socket closed on remote end"

Using redis-py's PubSub class I sometimes get the following exception:

Exception in thread listener_2013-10-24 12:50:31.687000:
Traceback (most recent call last):
  File "c:\Python27\Lib\threading.py", line 551, in __bootstrap_inner
    self.run()
  File "c:\Python27\Lib\threading.py", line 504, in run
    self.__target(*self.__args, **self.__kwargs)
  File "C:\Users\Administrator\Documents\my_proj\my_module.py", line 69, in _listen
    for message in _pubsub.listen():
  File "C:\Users\Administrator\virtual_environments\spyker\lib\site-packages\redis\client.py", line 1555, in listen
    r = self.parse_response()
  File "C:\Users\Administrator\virtual_environments\spyker\lib\site-packages\redis\client.py", line 1499, in parse_response
    response = self.connection.read_response()
  File "C:\Users\Administrator\virtual_environments\spyker\lib\site-packages\redis\connection.py", line 306, in read_response
    response = self._parser.read_response()
  File "C:\Users\Administrator\virtual_environments\spyker\lib\site-packages\redis\connection.py", line 106, in read_response
    raise ConnectionError("Socket closed on remote end")
ConnectionError: Socket closed on remote end

What would cause such an event?
If I catch this exception, what would be a reasonable handling logic? Would retrying listen() be futile?

The reason for asking and not simply trying is that I do not know how to reproduce this problem. It's rare but it's detrimental, so I must create some logic before this error strikes again.
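Retrying listen() on the same PubSub object is indeed futile once this fires: its underlying socket is gone. A minimal sketch of the recovery step, assuming the PubSub object exposes `channels` and `patterns` collections (true for the redis-py versions in this traceback, but worth checking against yours) and that `make_pubsub` is a hypothetical factory for a fresh connection:

```python
def resubscribe(old_pubsub, make_pubsub):
    """Build a fresh PubSub connection and re-subscribe to everything
    the dead one was listening to, so the caller can resume listen()."""
    new_pubsub = make_pubsub()
    for channel in old_pubsub.channels:
        new_pubsub.subscribe(channel)
    for pattern in old_pubsub.patterns:
        new_pubsub.psubscribe(pattern)
    return new_pubsub
```

Messages published while the socket was down are lost either way; pub/sub has no replay, so anything that must survive a drop needs a backing list or stream.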


Source: (StackOverflow)

How many commands could redis-py pipeline have?

I want to use a pipeline to reduce the number of interactions between my program and the Redis server.
I may queue many commands in a pipeline, but I couldn't find any documentation describing the maximum number of commands a pipeline can hold.

Is there any advice? Thanks in advance.
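There is no documented hard cap: redis-py buffers the queued commands in client memory and the server buffers the replies, so memory on both ends is the practical limit. A hedged sketch of keeping batches bounded, where `execute_in_batches` and its (method_name, args) command format are illustrative inventions, not redis-py API:

```python
def execute_in_batches(pipe_factory, commands, batch_size=10000):
    """Run an iterable of (method_name, args) pairs through pipelines
    of at most batch_size commands each, returning all results in
    order. pipe_factory returns a fresh pipeline object per batch."""
    results = []
    batch = []
    for name, args in commands:
        batch.append((name, args))
        if len(batch) >= batch_size:
            results.extend(_run(pipe_factory(), batch))
            batch = []
    if batch:                       # flush the final partial batch
        results.extend(_run(pipe_factory(), batch))
    return results

def _run(pipe, batch):
    for name, args in batch:
        getattr(pipe, name)(*args)  # queue the command on the pipeline
    return pipe.execute()
```

A batch size in the low tens of thousands is a common rule of thumb, but the right number depends on command and reply sizes, so it is worth measuring.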


Source: (StackOverflow)

redis client pipeline does not work in twemproxy environment

I use redis-py to talk to Redis, and our environment uses twemproxy as a Redis proxy. But the client pipeline doesn't seem to work when connecting through twemproxy.

import redis

client = redis.StrictRedis(host=host, port=port, db=0)
pipe = client.pipeline()
pipe.smembers('key')
print pipe.execute()

It throws an exception when the execute method runs:

redis.exceptions.ConnectionError: Socket closed on remote end

Does the client pipeline simply not work in a twemproxy environment, or is this an issue with redis-py?
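One likely culprit, offered as a diagnosis to verify rather than a certainty: redis-py pipelines wrap their queued commands in MULTI/EXEC by default, and twemproxy does not support transactions, so it closes the connection, which the client reports as "Socket closed on remote end". A sketch of the workaround:

```python
def make_twemproxy_pipeline(client):
    """Return a non-transactional pipeline: transaction=False skips the
    MULTI/EXEC wrapper and sends the queued commands as a plain batch,
    which twemproxy can route."""
    return client.pipeline(transaction=False)
```

twemproxy pipelines requests on its own, so plain batched commands still get the round-trip savings; only the atomicity of MULTI/EXEC is given up.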


Source: (StackOverflow)

How to set the redis timeout waiting for the response with pipeline in redis-py?

In the code below, is the pipeline timeout 2 seconds?

client = redis.StrictRedis(host=host, port=port, db=0, socket_timeout=2)
pipe = client.pipeline(transaction=False)
for name in namelist:
    key = "%s-%s-%s-%s" % (key_sub1, key_sub2, name, key_sub3)
    pipe.smembers(key)
pipe.execute()

In Redis, there are a lot of members in the set for each key. The code above always returns the error below:

error Error while reading from socket: ('timed out',)

If I raise socket_timeout to 10, it returns fine.
Doesn't the socket_timeout parameter mean the connection timeout? It behaves like a response timeout.
The redis-py version is 2.6.7.


Source: (StackOverflow)

ZREM on Redis Sorted Set

What will happen if 2 workers call ZREM on the same element of a sorted set at the same time? Will it return true to the worker which actually removes the element and false to the other to indicate it doesn't exist or will it return true to both? In other words is ZREM atomic internally?
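ZREM is atomic: Redis executes each command to completion before starting the next, so if two workers race on the same member, exactly one gets a return value of 1 (it removed the member) and the other gets 0. That makes the return value usable as a claim, sketched here with an illustrative `claim` helper:

```python
def claim(r, key, member):
    """Atomically claim a member of a sorted set. Returns True for the
    one caller whose ZREM actually removed it, False for everyone else."""
    return r.zrem(key, member) == 1
```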


Source: (StackOverflow)

Is there a NUMSUB command for redis-py?

Is there some equivalent to the NUMSUB command in redis for the python client?

I've looked through the documentation and can't find anything other than the publish() method itself, which returns the number of subscribers on that channel. Knowing how many subscribers there are after-the-fact is not very useful to me though.
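Recent redis-py versions expose this as pubsub_numsub(); on older versions the raw command can still be sent directly (the server must be Redis >= 2.8, which introduced PUBSUB). A version-tolerant sketch with an illustrative `numsub` helper:

```python
def numsub(r, channel):
    """Subscriber count for one channel, before publishing anything."""
    if hasattr(r, 'pubsub_numsub'):
        # pubsub_numsub returns a list of (channel, count) pairs
        return dict(r.pubsub_numsub(channel))[channel]
    # Fallback for older redis-py: send PUBSUB NUMSUB by hand.
    # The reply is a flat list alternating channel, count.
    reply = r.execute_command('PUBSUB', 'NUMSUB', channel)
    return reply[1]
```

Note that with decode_responses off, the channel names in the reply may come back as bytes, so the dict lookup may need a matching bytes key.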


Source: (StackOverflow)

How can I implement an atomic get or set&get key in redis using python?

I have a Redis server and I want to implement an atomic (or pseudo-atomic) method that does the following (note: my system has multiple sessions open to the Redis server):

  1. If some key K exists get the value for it
  2. Otherwise, call SETNX function with a random value that is generated by some function F(that generates salts)
  3. Ask redis for the value of key K that was just generated by the current session (or by another session "simultaneously" - a short moment before the current session generated it)

The reasons I don't want to pre-generate a value with F (before checking whether the key exists) and use it if the key doesn't exist are:

  1. I don't want to call F without justification (it might cause intensive CPU behaviour).
  2. I want to avoid the following race condition:
     T1: Session 1 generates a random value VAL1
     T2: Session 1 asks if key K exists and gets "False"
     T3: Session 2 generates a random value VAL2
     T4: Session 2 asks if key K exists and gets "False"
     T5: Session 2 calls SETNX with VAL2 and uses VAL2 from now on
     T6: Session 1 calls SETNX with VAL1 and uses VAL1 from now on, even though the actual value of key K is VAL2

A python pseudo-code that I created is :

    import redis
    r = redis.StrictRedis(host='localhost', port=6379, db=0)
    ''' Get the value of key K if it exists (r.get(K) if r.exists(K));
    otherwise get the value of key K if calling SETNX returned True
    (else r.get(K) if r.setnx(K, F())), meaning the value just sent really
    stuck; otherwise get the value of key K that was generated by another
    session a short moment ago (the final r.get(K)).
    The last clause is redundant and I could write "r.setnx(K, F()) or True"
    to get the latest value instead, but the ternary syntax requires an
    "else" clause at the end. '''
    r.get(K) if r.exists(K) else r.get(K) if r.setnx(K,F()) else r.get(K)

Is there another solution?
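A flatter version of the same idea, written as an illustrative `get_or_set` helper: GET first so F is only invoked on a miss, SETNX so only the first writer wins, then GET again to read back whichever value actually stuck:

```python
def get_or_set(r, key, factory):
    """Return the value of key, generating it at most once per miss.
    factory (F in the question) is only called when the key is absent,
    and the final get() returns the winner even if another session's
    SETNX beat ours in the race."""
    value = r.get(key)
    if value is not None:
        return value                 # common path: no F call at all
    r.setnx(key, factory())          # only the first SETNX sticks
    return r.get(key)                # read back whoever won
```

This matches the scenario in the question; if the key could also expire or be deleted between the SETNX and the final GET, a Lua script doing the whole sequence server-side would close that remaining gap.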


Source: (StackOverflow)

Redis too many open files error

I am getting "too many open files error" when a certain number of users exceeds (its around 1200 concurrent users).

I increased the limit using this, but I was still getting the same error.

Then I followed this, and still no change; the same error persists.

To create the connection I put the following in my Django settings and use REDIS wherever I need it:

REDIS = redis.StrictRedis(host='localhost', port=6379, db=0)

I did it that way because it was suggested on the Redis mailing list, as below:

a. create a global redis client instance and have your code use that.

Is that the right approach for connection pooling? And how do I avoid the "too many open files" error? In the Django response I am getting:

Connection Error (Caused by : [Errno 24] Too many open files)",),)'

Thanks.


Source: (StackOverflow)

Redis Python - how to delete all keys matching a specific pattern, without iterating in Python

I'm writing a Django management command to handle some of our Redis caching. Basically I need to select all keys that conform to a certain pattern (for example: "prefix:*") and delete them. I know I can do that with the CLI:

redis-cli KEYS "prefix:*" | xargs redis-cli DEL

But I need to do this from within the app, so I need to use the Python binding (I'm using py-redis). I tried feeding a list into delete, but it fails:

from common.redis_client import get_redis_client
cache = get_redis_client()
x = cache.keys('prefix:*') 

x == ['prefix:key1','prefix:key2'] # True

# And now

cache.delete(x) 

# returns 0 . nothing is deleted

I know I can iterate over x:

for key in x:
   cache.delete(key)

But that would throw away Redis's speed and misuse its capabilities. Is there a Pythonic solution with py-redis, without iterating and/or the CLI?
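The list version fails because redis-py's delete() takes each key as a separate positional argument; passing the list itself sends one bogus key. Unpacking with * fixes it, sketched here as an illustrative helper:

```python
def delete_pattern(cache, pattern):
    """Delete every key matching pattern in one DEL round trip.
    Returns the number of keys deleted. delete(*keys) is the fix:
    delete(keys) would send the list as a single nonexistent key."""
    keys = cache.keys(pattern)
    if keys:
        return cache.delete(*keys)
    return 0
```

Note that KEYS blocks the server while it walks the whole keyspace; on large databases an incremental SCAN (scan_iter in redis-py) with batched deletes is usually kinder.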

Thanks!


Source: (StackOverflow)

Redis-python: setting multiple key/values in one operation

Currently I use the basic mset feature to store key/value pairs:

from common.redis_client import get_redis_client
cache = get_redis_client()
for k, v in some_dict.items():
   kw = {k: v}
   cache.mset(kw)

#later:
   cache.get('key')

I store each key/value separately (not in one JSON blob, for example), since storing the whole dict would turn it into a string and require me to serialize/deserialize on every store and retrieve, and I really need access to the separate key/values.

My question: is there a way to mset multiple key/values at once, instead of making multiple writes to the Redis DB? And vice versa, can I batch multiple reads (get) into one access? (And yes, I have a lot of Redis activity under heavy load, so I do care about this.)
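mset() already accepts a whole mapping, and mget() batches the reads the same way, so both directions can be one round trip each. A sketch with an illustrative `store_and_fetch` helper:

```python
def store_and_fetch(cache, some_dict):
    """Write all pairs with one MSET, then read them all back with one
    MGET, returning them as a dict again. The per-item loop in the
    question collapses into a single call each way."""
    cache.mset(some_dict)                     # one write for all pairs
    keys = list(some_dict)
    return dict(zip(keys, cache.mget(keys)))  # one read for all pairs
```

For even more mixed traffic (different commands, not just gets and sets), a pipeline batches arbitrary commands into one round trip the same way.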


Source: (StackOverflow)

redis-py - ConnectionError: Socket closed on remote end - overload?

I'm using Redis from Python via redis-py to store JSON in a sorted set.

Everything works fine until I try to get a certain amount of data out of Redis.


redis_client = redis.StrictRedis(host='localhost', port=6379, db=12)
redis_client.zrange('key', 0, 20, 'desc')

Will work fine as I'm only requesting 20 entries.

As soon as I try anything above 35 I get:

ConnectionError: Socket closed on remote end

I've tried working around it by "chunking" the queries into batches of 5, but it seems I'm hitting Redis with so many small queries that the exception can still occur.

Am I somehow DDoSing Redis?


I've tried it on both Windows and Ubuntu.

Last week I actually got away with up to 100 entries at once and chunking worked if I did it in groups of 10, but it seems since then my Redis server has gotten even more sensitive.


Here is a little script that reproduces the error.

import redis
import ujson as json

r = redis.StrictRedis(host="localhost", port=6379, db=12)
dummy_json = {"data":"hfoiashflkasdjaisdäjpagufeiaghaifhaspdas", 
          "more": "sdasdpjapsfdjapsofjaspofjsapojfpoasjfpoajfp",
          "more1": "sdasdpjapsfdjapsofjaspofjsapojfpoasjfpoajfp",
          "more2": "sdasdpjapsfdjapsofjaspofjsapojfpoasjfpoajfp",
          "more3": "sdasdpjapsfdjapsofjaspofjsapojfpoasjfpoajfp",
          "more4": "sdasdpjapsfdjapsofjaspofjsapojfpoasjfpoajfp"}

for score in xrange(0, 6000):
    dummy_json["score"]=score
    r.zadd("test", score, json.dumps(dummy_json))

result = r.zrange('test', 0, 200, 'desc')
print result

You'll see that if you make dummy_json hold less data or request fewer entries at once the exception will be gone.


Source: (StackOverflow)

redis-py and hgetall behavior

I played around with the Flask microframework and wanted to cache some stats in Redis. Let's say I have this dict:

mydict = {}
mydict["test"] = "test11"

I saved it to redis with

redis.hmset("test:key", mydict)

However, after reading it back:

stored = redis.hgetall("test:key")
print(str(stored))

I see the odd-looking {b'test': b'test11'}, so stored.get("test") gives me None.

mydict's str() output looks fine: {'test': 'test11'}. So why is this binary marker added to the restored data? I also checked in redis-cli and don't see explicit b markers there. Is something wrong with hgetall?
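Nothing is wrong with hgetall(): under Python 3 redis-py returns bytes unless asked to decode, and redis-cli shows no b'' markers because the bytes/str split is a Python concern, not a Redis one. The usual fix is to build the client with decode_responses=True so every reply is already str; for data already fetched as bytes, a small illustrative helper:

```python
def decode_hash(stored, encoding='utf-8'):
    """Turn an hgetall() result of bytes keys/values into str, so
    stored.get("test") lookups work as expected. The cleaner fix is
    redis.StrictRedis(..., decode_responses=True) at construction."""
    return {k.decode(encoding): v.decode(encoding)
            for k, v in stored.items()}
```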


Source: (StackOverflow)