EzDevInfo.com

rpyc

RPyC (Remote Python Call): a transparent, symmetric RPC library for Python.

What are the pros and cons of PyRo and RPyC python libs?

I am looking for a remote procedure call engine for Python and I've found that PyRo (Python Remote Object) and RPyC (Remote Python Call) are both the kind of thing I am searching for.

However, I am curious to know how they compare to each other and what their respective pros and cons are.


Source: (StackOverflow)

passing object in rpyc fails

I am trying to pass an object as a parameter from the client to the server using RPyC, but the server is unable to access the object and I receive an AttributeError.

Server Code:

class AgentService(rpyc.Service):
  def exposed_func(self, obj):
    return obj.name

Client Code:

self._conn = connect(agent_host, agent_port, config = {"allow_public_attrs" : True})
return self._conn.root.func(obj)

returns: AttributeError: cannot access 'name'.

I am using RPyC services and, according to the website, this should work.

Any ideas?
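For what it's worth, the error usually comes from the attribute check on the side that owns the object: RPyC only serves names that are explicitly exposed unless allow_public_attrs is in effect on that connection (setting it in the server's protocol_config as well as the client's connect config is worth trying). A simplified, hypothetical sketch of the gating logic (not rpyc's real implementation):

```python
# Simplified sketch of RPyC-style attribute gating (not the real code):
# names must start with "exposed_" unless allow_public_attrs is enabled.
def rpyc_getattr(obj, name, allow_public_attrs=False):
    if name.startswith("exposed_") or allow_public_attrs:
        return getattr(obj, name)
    raise AttributeError("cannot access %r" % name)

class Payload(object):
    name = "widget"
    exposed_name = "widget"

obj = Payload()
print(rpyc_getattr(obj, "exposed_name"))                   # explicitly exposed: ok
print(rpyc_getattr(obj, "name", allow_public_attrs=True))  # ok with the config
try:
    rpyc_getattr(obj, "name")                              # mirrors the question's error
except AttributeError as e:
    print(e)
```

So either rename the attribute to exposed_name on the class, or make sure allow_public_attrs is enabled on the connection that owns the object (safest: both sides).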


Source: (StackOverflow)


rpyc.Service takes 10 seconds to receive a 150kB object (on localhost, no LAN issue)

I am building a big (150kB when pickled) dummy dictionary and running a dummy function on it that runs quickly and smoothly.

When the same function is exposed via a rpyc.Service, the time taken becomes 10 seconds (instead of 0.0009 seconds), even if my client and server stand on the same host (no issue with the LAN latency here).

Any idea why it takes so long for my 150kB object to be communicated from the client to the server on the same host?

And why is the function dummy.dummy() called even though the input object is not yet "available"? (If it were, the time spent in the function would be the same in the two test cases.)

See my Python (3.2) code below. I measure the time spent in dummy.dummy(d).

  1. Case 1: dummy.dummy is called by the client; exec time = 0.0009 seconds
  2. Case 2: dummy.dummy is called by the rpyc service; exec time = 10 seconds

mini_service.py

import rpyc
from rpyc.utils.server import ThreadedServer
import dummy

class miniService(rpyc.Service):
    def exposed_myfunc(self,d):
        #Test case 2: call dummy.dummy from the service
        dummy.dummy(d)

if __name__=='__main__':
    t = ThreadedServer(miniService,protocol_config = {"allow_public_attrs" : True}, port = 19865)
    t.start()

mini_client.py

import rpyc
import sys
import pickle
import dummy

def makedict(n):
    d={x:x for x in range(n)}
    return d

if __name__ == "__main__":
    d=makedict(20000)
    print(sys.getsizeof(d))             #result = 393356

#   output = open("C:\\rd\\non_mc_test_files\\mini.pkl",'wb') #117kB object for n=20k
#   pickle.dump(d,output)
#   output.close()

#RUN1 : dummy.dummy(d) out of rpyc takes 0.00099 seconds
#   dummy.dummy(d)

#RUN2 : dummy.dummy(d) via RPYC on localhost takes 9.346 seconds
    conn=rpyc.connect('localhost',19865,config={"allow_pickle":True})
    conn.root.myfunc(d)

    print('Done.')  

dummy.py

import time

def dummy(d):
    start_ = time.time()
    for key in d:
        d[key]=0
    print('Time spent in dummy in seconds: ' + str(time.time()-start_)) 
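A plausible explanation for the 10 seconds: `d` is never copied to the server; it arrives as a netref, so the loop in dummy.dummy performs one synchronous network round trip per key fetched and another per item stored, roughly 40,000 round trips for 20,000 keys. That also answers the second question: the function is called immediately because only a reference is passed, not the data. A stand-in class illustrates the accounting (rpyc itself is not needed for the sketch):

```python
class CountingDict(dict):
    """Stand-in for an rpyc netref: count what would be network round trips."""
    trips = 0

    def __iter__(self):
        for key in super().__iter__():
            CountingDict.trips += 1      # fetching each key crosses the wire
            yield key

    def __setitem__(self, key, value):
        CountingDict.trips += 1          # so does storing each value
        super().__setitem__(key, value)

d = CountingDict({x: x for x in range(20000)})
for key in d:        # same loop as dummy.dummy
    d[key] = 0
print(CountingDict.trips)   # 40000 simulated round trips
```

The usual fix is to materialize the data on the serving side in one shot, e.g. `d = rpyc.utils.classic.obtain(d)` at the top of exposed_myfunc (or send a pickled copy), so the loop runs over a local dict.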

Source: (StackOverflow)

Returning remote object in rpyc

I have a remote server like the one below, which already has an initialized class, and I have set allow_public_attrs to True in the protocol config.

import rpyc

class SharedClass(object):
    def __init__(self,string):
        print string

    def action(self):
        print 'i am doing something'

s=SharedClass('hi')

class MyService(rpyc.Service):
    def on_connect(self):
        pass

    def on_disconnect(self):
        pass

    def exposed_get_shared(self):
        return s

if __name__=='__main__':
    from rpyc.utils.server import ThreadedServer
    t=ThreadedServer(MyService,port=18861,protocol_config={"allow_public_attrs":True})
    t.start()

On the client side, connecting directly works, whereas when I make the connection inside a function and return the object, I get an error.

**Client**

**Direct connection**

Python 2.7.2 (default, Jun 12 2011, 15:08:59) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>>conn=rpyc.connect('localhost',18861)
>>>share=getattr(conn.root,'get_shared')
>>>share
<bound method MyService.exposed_get_shared of <__main__.MyService
object at 0x011BA698>>
>>>share=getattr(conn.root,'get_shared')()
>>>share
<__main__.SharedClass object at 0x00B6ED30>
>>>share.action()
i am doing something

If I try to do it in a function, I get an error:

>>>def returnObject(objName, host, port):
...    conn = rpyc.connect(host, port)
...    print conn
...    attr = getattr(conn.root, 'get_' + objName)()
...    print attr
...    return attr
>>>share=returnObject('shared','localhost',18861)
<rpyc.core.protocol.Connection 'conn2' object at 0x0108AAD0>
<__main__.SharedClass object at 0x00B6ED30>
>>>share
Traceback (most recent call last):
 File "<stdin>", line 1, in <module>
 File "C:\Python27\lib\site-packages\rpyc\core\netref.py", line 168,
in __repr__
   return syncreq(self, consts.HANDLE_REPR)
 File "C:\Python27\lib\site-packages\rpyc\core\netref.py", line 69,
in syncreq
   return conn().sync_request(handler, oid, *args)
AttributeError: 'NoneType' object has no attribute 'sync_request'

My purpose is to have an object initialized on the server and to have many clients access it. The initialized class is thread safe, so multiple clients can use it.

I realize that I am missing something here.
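The traceback itself points at the cause: in syncreq, `conn` is a weak reference (note the call `conn().sync_request(...)`), because a netref does not keep its connection alive. When returnObject returns, its local `conn` was the last strong reference, so the connection is garbage-collected and the netref is orphaned. Keep the connection alive, for example by returning `conn, attr` and holding on to both. The mechanism in miniature, using only the stdlib:

```python
import weakref

class Connection(object):
    pass

def return_object():
    conn = Connection()
    ref = weakref.ref(conn)   # a netref holds its connection weakly, like this
    return ref                # conn's last strong reference dies on return

ref = return_object()
print(ref() is None)          # True: the "connection" was collected
```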

--

Adhithya


Source: (StackOverflow)

Python rpyc "socket.error: [Errno 113] No route to host"

I have two machines using Python rpyc: one is the server (ip: 10.0.3.120), the other is the client (ip: 10.0.3.197). The code is shown below:

Server (ip:10.0.3.120)

from rpyc import Service
from rpyc.utils.server import ThreadedServer

class TestService(Service):

    def exposed_test(self, num):
        return num + 1

sr = ThreadedServer(TestService, port=9999, auto_register=False)
sr.start()

Client (ip:10.0.3.197)

import rpyc
conn = rpyc.connect('10.0.3.120', 9999)
cResult = conn.root.test(11)
conn.close()

print cResult

Client shows this error when I run server and client:

Traceback (most recent call last):
  File "rpyc_client.py", line 4, in <module>
    conn = rpyc.connect('10.0.3.120', 9999)
  File "/usr/local/lib/python2.7/site-packages/rpyc-3.2.3-py2.7.egg/rpyc/utils/factory.py", line 89, in connect
    s = SocketStream.connect(host, port, ipv6 = ipv6)
  File "/usr/local/lib/python2.7/site-packages/rpyc-3.2.3-py2.7.egg/rpyc/core/stream.py", line 114, in connect
    return cls(cls._connect(host, port, **kwargs))
  File "/usr/local/lib/python2.7/site-packages/rpyc-3.2.3-py2.7.egg/rpyc/core/stream.py", line 92, in _connect
    s.connect((host, port))
  File "/usr/local/lib/python2.7/socket.py", line 224, in meth
    return getattr(self._sock,name)(*args)
socket.error: [Errno 113] No route to host
How should the connect method be used here? And if I set up an rpyc server on a public IP, could I connect to it from home? Thanks a lot!
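Errno 113 ("No route to host") is raised at the socket layer, before rpyc is involved at all; in practice it is very often the server machine's firewall rejecting the port. On an iptables-based Linux server (an assumption about your setup), opening the port looks like this:

```shell
# allow inbound TCP on the rpyc port (adjust for your firewall/distro)
sudo iptables -I INPUT -p tcp --dport 9999 -j ACCEPT
```

As for connecting from home: yes, provided the server is reachable on a public IP and the port is open/forwarded along the way.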


Source: (StackOverflow)

Using rpyc to connect to database once and serve multiple queries

I am trying to serve database query results to ad-hoc client requests, but I do not want to open a connection for each individual query. I'm not sure if I'm doing it right.

Current solution is something like this on the "server" side (heavily cut down for clarity):

import rpyc
from rpyc.utils.server import ThreadedServer
import cx_Oracle

conn = cx_Oracle.connect('whatever connect string')
cursor = conn.cursor()

def get_some_data(barcode):
    # do something
    return cursor.execute("whatever query",{'barcode':barcode})

class data_service(rpyc.Service):
    def exposed_get_some_data(self, brcd):
        return get_some_data(brcd)


if __name__ == '__main__':
    s = ThreadedServer(data_service, port=12345, auto_register=False)
    s.start()

This runs okay for a while. However, from time to time the program crashes, and so far I haven't been able to track down when or why.

Note how the database connection is created outside of the data_service class. Is this in itself likely to cause problems?

Many thanks; any thoughts appreciated.
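Sharing one connection/cursor across threads is a likely culprit for the intermittent crashes: ThreadedServer serves each client on its own thread, and cx_Oracle cursors are not safe for concurrent use. Serializing access with a lock (or giving each thread its own connection) is the usual remedy. A sketch of the locking approach, using stdlib sqlite3 as a stand-in for cx_Oracle (an assumption; the pattern carries over):

```python
import sqlite3
import threading

# stand-in database; with cx_Oracle the locking pattern is the same
conn = sqlite3.connect(":memory:", check_same_thread=False)
conn.execute("CREATE TABLE items (barcode TEXT, name TEXT)")
conn.execute("INSERT INTO items VALUES ('123', 'widget')")

db_lock = threading.Lock()   # one query at a time through the shared connection

def get_some_data(barcode):
    with db_lock:            # every service thread must go through the lock
        cur = conn.execute("SELECT name FROM items WHERE barcode = ?", (barcode,))
        return cur.fetchall()

print(get_some_data("123"))  # [('widget',)]
```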


Source: (StackOverflow)

How can I get the list of clients connected to a server in RPyC?

I want to connect two clients and one server in RPyC: client1 calls a method on the server, and that server method in turn calls a method on client2. This is my code:

import rpyc
#server:
class ServerService(rpyc.Service):
    def on_connect(self):
        print "Connected To Server\n"
    def on_disconnect(self):
        print "Disconnected From Server\n"
    def exposed_command(self, cmd):
        self._cmd = cmd
        self._conn.root.run_command(self._cmd)
#client1:
class AppService(rpyc.Service):
    def exposed_foo(self):
        return "foo"
conn = rpyc.connect("localhost", 2014, service = AppService)
conn.root.command(self._cmd)
#client2:
class ClientService(rpyc.Service):
    def exposed_run_command(self, cmd):
        eval(cmd)
# connect to server
conn = rpyc.connect("localhost", 2014, service = ClientService)

I have 3 separate files; I want to connect the 2 clients through the server.

Update

Let me explain my situation in more detail and add the code of the three modules I use (please also advise about client IDs, getting IDs and so on). I have one server and two clients. Client1 runs a simple PyQt program that takes a Maya Python command; clicking its button should run that command on client2, which runs inside Maya. Beyond that, I want to connect both clients together and call their methods from each other, as in a peer-to-peer connection. But I don't know how to call the run_command of client2 (Maya) from client1 (PyQt).

Server side:

import rpyc    
class ServerService(rpyc.Service):
    def on_connect(self):
        print "Connected To Server\n"    
    def on_disconnect(self):
        print "Disconnected From Server\n"    
    def exposed_command(self, cmd):
        self._cmd = cmd
        self._conn.root.run_command(self._cmd)   

if __name__ == "__main__":
    from rpyc.utils.server import ThreadedServer
    t = ThreadedServer(ServerService, port = 2014)
    print "Server is ready ..."
    t.start()

Client1 (PyQt) :

import sys
from PyQt4.QtCore import *
from PyQt4.QtGui import *
import rpyc
class ClientApp(QDialog):
    def __init__(self, parent = None):
        super(ClientApp, self).__init__(parent)

        self.label = QLabel("CMD: ")
        self.textBox = QLineEdit()
        self.button = QPushButton("Run Command")    
        layout = QHBoxLayout(self)
        layout.addWidget(self.label)
        layout.addWidget(self.textBox)
        layout.addWidget(self.button)    
        self.setWindowTitle("Client App")
        # SIGNALS
        self.button.clicked.connect(self.sendCommand)           
        self._cmd = str(self.textBox.text())
        self.sendCommand()    
    def sendCommand(self):
        print "Printed Command : "
        self._cmd = str(self.textBox.text())
        conn.root.command(self._cmd)    
class AppService(rpyc.Service):
    def exposed_foo2(self):
        return "foo2"    
conn = rpyc.connect("localhost", 2014, service = AppService)    
if __name__ == '__main__':
    app = QApplication(sys.argv)
    window = ClientApp()
    window.show()
    sys.exit(app.exec_())

Client2 (maya):

import rpyc
from maya import cmds    
class ClientService(rpyc.Service):
    def exposed_run_command(self, cmd):
        eval(cmd)    
# connect to server
conn = rpyc.connect("localhost", 2014, service = ClientService)
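The missing piece is a registry on the server: remember each client's connection (self._conn) in on_connect, keyed by some ID the client supplies, and have exposed_command look up the target client and forward the call. The routing idea without any networking, with plain callables standing in for conn.root (all names here are hypothetical):

```python
# registry of connected clients; in the rpyc service this would live at
# module level (or on the server object), keyed by an ID each client registers
clients = {}

def on_connect(client_id, run_command):
    clients[client_id] = run_command       # in rpyc: self._conn.root.run_command

def exposed_command(target_id, cmd):
    return clients[target_id](cmd)         # forward to the other client

on_connect("maya", lambda cmd: "maya ran %r" % cmd)
print(exposed_command("maya", "sphere()"))  # maya ran 'sphere()'
```

In the real service, the PyQt client would call conn.root.command("maya", cmd), and the server would forward it to the Maya client's service.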

Source: (StackOverflow)

Multiprocessing Python with RPYC "ValueError: pickling is disabled"

I am trying to use the multiprocessing package within an rpyc service, but I get ValueError: pickling is disabled when I call the exposed function from the client. I understand that the multiprocessing package uses pickling to pass information between processes, and that pickling is not allowed in rpyc because it is an insecure protocol. So I am unsure of the best way (or whether there is any way) to use multiprocessing within an rpyc service. Here is the server-side code:

import rpyc
from multiprocessing import Pool

class MyService(rpyc.Service):

    def exposed_RemotePool(self, function, arglist):

        pool = Pool(processes = 8)
        result = pool.map(function, arglist)
        pool.close()
        return result


if __name__ == "__main__":
    from rpyc.utils.server import ThreadedServer
    t = ThreadedServer(MyService, port = 18861)
    t.start()

And here is the client side code that produces the error:

import rpyc

def square(x):
    return x*x

c = rpyc.connect("localhost", 18861)
result = c.root.exposed_RemotePool(square, [1,2,3,4])
print(result)
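One workaround is to ship the function's source text rather than the function object: the netref the server receives cannot be pickled, but a string can, and the server can rebuild a genuinely local function from it (the client could obtain the text with inspect.getsource). A sketch of the rebuild step; here the map runs locally to keep the example self-contained, but in the service you would hand the rebuilt function to Pool.map:

```python
# client side: ship the function's source as plain text (picklable)
src = "def square(x):\n    return x * x\n"

# server side: rebuild a local function from the shipped source
namespace = {}
exec(src, namespace)
square = namespace["square"]

result = list(map(square, [1, 2, 3, 4]))   # in the service: pool.map(square, ...)
print(result)                              # [1, 4, 9, 16]
```

This is a sketch, not a security recommendation: executing client-supplied source is exactly the kind of thing rpyc disables pickling to avoid, so only do this with trusted clients.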

Source: (StackOverflow)

Handle computationally-intensive requests to a Django web application, possibly using a pre-forking RPC server

I am running a Django-based webservice with Gunicorn behind nginx as a reverse proxy.

My webservice provides a Django view which performs calculations using an external instance of MATLAB. Because the MATLAB startup takes some seconds on its own, even requests incurring only very simple MATLAB calculations require this amount of time to be answered.

Moreover, due to the MATLAB sandboxing done in my code, it is important that only one MATLAB instance runs at a time per webserver process. (Therefore, I am currently using the Gunicorn sync worker model, which implements a pre-forking webserver but does not use any multithreading.)

To improve user experience, I now want to eliminate the waiting time for MATLAB startup by keeping some (e.g. 3-5) "ready" MATLAB instances running and using them as requests come in. After a request has been serviced, the MATLAB process would be terminated and a new one would be started immediately, to be ready for another request.

I have been evaluating two ways to do this:

  1. Continue using Gunicorn sync worker model and keep one MATLAB instance per webserver process.

    The problem with this seems to be that incoming requests are not distributed to the webserver worker processes in a round-robin fashion. Therefore, it could happen that all computationally-intensive requests hit the same process and the users still have to wait because that single MATLAB instance cannot be restarted as fast as necessary.

  2. Outsource the MATLAB computation to a backend server which does the actual work and is queried by the webserver processes via RPC.

    In my conception, there would be a number of RPC server processes running, each hosting a running MATLAB process. After a request has been processed, the MATLAB process would be restarted. Because the RPC server processes are queried round-robin, a user would never have to wait for MATLAB to start (except when there are too many requests overall, but that is inevitable).

Because of the issues described with the first approach, I think the RPC server (approach 2) would be the better solution to my problem.

I have already looked at some RPC solutions for Python (especially Pyro and RPyC), however I cannot find an implementation that uses a pre-forking server model for the RPC server. Remember, due to the sandbox, multithreading is not possible and if the server only forks after a connection has been accepted, I would still need to start MATLAB after that which would thwart the whole idea.

Does anybody know a better solution to my problem? Or is the RPC server actually the best solution? But then I would need a pre-forking RPC server (= fork some processes and let them all spin on accept() on the same socket) or at least a RPC framework that can be easily modified (monkey-patched?) to be pre-forking.

Thanks in advance.
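The pre-fork pattern described above (bind once, fork N workers, let each block in accept() on the shared socket) can be sketched with the stdlib alone; in each child is where a MATLAB instance would be started before serving. This is a minimal POSIX-only sketch, one request per worker, not a production server:

```python
import os
import socket

def prefork_server(workers=2):
    """Bind once, fork workers, and let each block in accept() on the shared socket."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))      # kernel picks a free port
    srv.listen(5)
    for _ in range(workers):
        if os.fork() == 0:          # child: start MATLAB here, then serve
            conn, _ = srv.accept()  # all workers sleep on the same listen socket
            conn.sendall(b"ready\n")
            conn.close()
            os._exit(0)             # one request per worker in this sketch
    return srv.getsockname()[1]     # parent: report the chosen port

port = prefork_server()
replies = []
for _ in range(2):                  # each connect is served by a pre-forked child
    c = socket.create_connection(("127.0.0.1", port))
    replies.append(c.recv(16))
    c.close()
print(replies)
```

A real front end would respawn a worker (and a fresh MATLAB) after each request. Note that rpyc's ForkingServer forks only after accepting a connection, which is the post-accept model the question rules out, so a custom pre-fork wrapper along these lines would be needed.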


Source: (StackOverflow)

Passing functions as parameters in RPyC

I'm trying to encapsulate a whole bunch of logic into one Python function and pass it as an argument to a remote RPyC method, where I want to execute it on the server. But RPyC treats the function as a callback and executes it locally on the client. Is there a way around that? Can I force the server to execute it on the server side instead?

Thank you.


Source: (StackOverflow)

Redirecting stdout for RPyC to local client without using Classic RPyC

How do I redirect stdout to my local client when using RPyC services? I know it is possible when using Classic RPyC by using

c = rpyc.classic.connect(ip)

c.modules.sys.stdout = sys.stdout

Does anyone have equivalent code that I can use on my client side for a user-defined service like the one below?

import os
import rpyc
from rpyc.utils.server import ThreadedServer

class remoteServer(rpyc.Service):

    def on_connect(self):
        # code that runs when a connection is created
        # (to init the service, if needed)
        if not os.path.exists(self.LOG_DIRECTORY_PATH):
            os.mkdir(self.LOG_DIRECTORY_PATH)
            file = open(self.LOG_FILENAME, 'a')
        else:
            print 'Directory: ', self.LOG_DIRECTORY_PATH, 'already exists'

if __name__ == "__main__":
    server = ThreadedServer(remoteServer, port = 12345)
    server.start()
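For a user-defined service, the same idea works in the other direction: have the client pass its sys.stdout into an exposed method (it arrives at the server as a netref), and let the server redirect into it while the call runs. The mechanism, with a local file-like object standing in for the client's stream so the sketch runs without rpyc:

```python
import contextlib
import io

def exposed_run(task, client_stdout):
    # over rpyc, client_stdout would be the client's sys.stdout netref;
    # every write the task makes is sent back to the client
    with contextlib.redirect_stdout(client_stdout):
        task()

buf = io.StringIO()   # stands in for the client's real sys.stdout
exposed_run(lambda: print("hello from the server"), buf)
print(buf.getvalue().strip())   # hello from the server
```

On the client you would then call conn.root.run(..., sys.stdout) (names assumed; adapt to your service).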

Source: (StackOverflow)

What would be the pythonic way to simulate communication with an external equipment? RPyC?

Think of a situation where a Python script sends commands to an external device, let's say via serial port. The script resides on some Linux machine and the equipment reacts to the commands. The idea is to simulate this whole chain on my machine by "sending" commands to a file which contains the dump that the equipment would otherwise generate. This is the first time I'm trying such a thing, so it would be good to hear from someone who has experience with this kind of situation. I read on the web that people talk about a few directions:

  • multiprocessing
  • RPyC
  • Threading
  • zmq

There may be more options I am not aware of. Any suggestions?
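A lightweight, pythonic starting point needs none of those libraries: run the fake equipment in a thread and talk to it over queues, replaying canned responses from the dump the question mentions. Swapping the queues for a serial port (or an RPyC service, if the simulator must live in another process) later keeps the script unchanged. All names and commands below are made up for illustration:

```python
import queue
import threading

# a daemon thread stands in for the serial-attached equipment, answering
# commands from a canned dump
def fake_device(cmd_q, resp_q, dump):
    while True:
        cmd = cmd_q.get()
        if cmd is None:           # shutdown sentinel
            break
        resp_q.put(dump.get(cmd, "ERR"))

dump = {"*IDN?": "FAKE-EQUIPMENT-1.0"}
cmd_q, resp_q = queue.Queue(), queue.Queue()
threading.Thread(target=fake_device, args=(cmd_q, resp_q, dump), daemon=True).start()

cmd_q.put("*IDN?")                # the script under test "sends" a command
reply = resp_q.get()              # and reads the simulated reply
print(reply)                      # FAKE-EQUIPMENT-1.0
cmd_q.put(None)
```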


Source: (StackOverflow)

rpyc: difference between root.getmodule("module_name") and manually returning a module reference?

I want to use a python module that is accessible on a remote rpyc server only. Is there a difference between the following two ways of accessing modules on a remote machine:


""" on the client side: """

  1. my_local_mod_ref = my_rpyc_connection.root.getmodule("remote_module_name")
  2. my_local_mod_ref = my_rpyc_connection.root.a_func_returning_the_module_ref()


""" on the server side: """

def exposed_a_func_returning_the_module_ref():
    import my_remote_module_name
    return my_remote_module_name

If there is a difference, which of the two alternatives is cleaner or preferable?


Source: (StackOverflow)

How to start an rpyc server?

I recently installed rpyc on my Ubuntu 14.04 Virtual machine using:

pip install rpyc

I cannot find the file that can be used to start the server. I have looked in many places, but to no avail.
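pip installs a classic-mode launcher script alongside the package; its exact name and location can vary by rpyc version and platform (treat this as an assumption), but the usual entry point is:

```shell
# classic rpyc server, listening on rpyc's default port 18812
rpyc_classic.py --port 18812
```

Alternatively, for service-mode use there is no file to find: you write a short script yourself that constructs a ThreadedServer around your rpyc.Service subclass and calls start(), like the server examples in the other questions on this page.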


Source: (StackOverflow)

how to get the output of a command via rpyc?

I need to run a command via rpyc and get its result back.

But whenever I run a command, its output is printed on the remote server and I cannot get the result on the client.

What I am doing is the following:

import os, sys
import rpyc

conn = rpyc.classic.connect('server_remote')
a = conn.modules.os.system( "ls -lh")

print a

The output of my command is 0 for any command I run.

python get_output.py
0
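That 0 is the command's exit status: os.system only ever returns the shell's return code, never the command's text. To capture output, use subprocess.check_output instead; over a classic connection that would be conn.modules.subprocess.check_output(...) with the same arguments (the sketch below runs locally):

```python
import subprocess

# os.system("ls -lh") returns the exit status (0 on success), not the listing.
# check_output captures the command's stdout instead.
out = subprocess.check_output("echo hello", shell=True)
print(out.decode().strip())   # hello
```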

Source: (StackOverflow)