Discussion:
[Fab-user] Timeout opening channel with fabric 1.13.2
Rob Marshall
2018-06-15 15:31:31 UTC
Hi,

I have a "wrapper" for Fabric 1.13.2 that I use to make running Fabric
commands a little easier in some respects. I can set up an object that
has the host_string, etc., all configured when I instantiate it.
I'm attempting to use it to run multiple processes simultaneously on
multiple hosts. This works fine for some things, but I'm running into
a problem with one particular command: it runs just fine by itself,
i.e. without multiprocessing, but gets a connection timeout when run
via multiprocessing.Pool.apply_async(). Is there a known issue with
running Fabric with multiprocessing pools?

Thanks,

Rob
Jeff Forcier
2018-06-15 16:17:09 UTC
Hi Rob,

Fabric 1 has its own multiprocessing setup, but it needs to do various
things to the objects getting cloned across subprocess barriers - e.g.
socket cleanup, fabric.state.env contents tweaking, and the like - and I'm
guessing if you are doing your own multiprocessing work, you may be
skipping all of that.

See e.g. some of the stuff done here:
https://github.com/fabric/fabric/blob/d91b86ecc0c91357e7befe3dd5b67efc00682aeb/fabric/tasks.py#L225
- you might need to look higher up in the call stack too but that's the
guts of it re: disconnecting client objects in the cache.

That said, it's possible that Fabric 1's design straight up precludes the
approach you're taking - it's not thread safe, and it's not
greenlet/coroutine/etc. async safe, because of its emphasis on global
module state. But since you're using multiprocessing, it depends on exactly
what 'async' means in this context - I'm guessing it's more about the
behavior of launching the subprocesses from the master, in which case it
might work fine.
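[To illustrate the general hazard described above - this is a toy sketch, not Fabric's actual code: connection objects opened in the parent process get inherited by forked workers, so the usual fix is to drop any cached connections before forking and let each worker build its own. The `connections` cache and helpers here are stand-ins for things like fabric.state.connections.]

```python
import multiprocessing

# Stand-in for a module-level connection cache (cf. fabric.state.connections).
connections = {}

def disconnect_all(cache):
    """Drop every cached connection before forking workers.

    A real implementation would also call .close() on each client so the
    parent's sockets aren't shared with (and corrupted by) the children.
    """
    for key in list(cache):
        cache.pop(key)

def task(host):
    # Each worker builds its own fresh "connection" rather than reusing
    # one inherited across the fork boundary.
    return (host, "connection-to-%s" % host)

def run_parallel(hosts):
    disconnect_all(connections)        # mimic pre-fork cleanup
    pool = multiprocessing.Pool(2)
    try:
        return dict(pool.map(task, hosts))
    finally:
        pool.close()
        pool.join()
```
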

I don't have a ton of spare time at the moment to get deep into this if you
encounter more problems, but hopefully the above gets you pointed in the
right direction!

Best,
Jeff
_______________________________________________
Fab-user mailing list
https://lists.nongnu.org/mailman/listinfo/fab-user
--
Jeff Forcier
Unix sysadmin; Python engineer
http://bitprophet.org
Rob Marshall
2018-06-15 17:12:29 UTC
Hi Jeff,

What I've been trying is:

Python 2.7.12 (default, Dec 4 2017, 14:50:18)
[GCC 5.4.0 20160609] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> from fabric.api import *
>>> def gethostnames():
...     with settings(warn_only=True, user='username', password='mypassword'):
...         run('hostname')
...
>>> results = execute(parallel(gethostnames), hosts=['10.10.0.1', '10.10.0.2', '10.10.0.3'])
[10.10.0.1] Executing task 'gethostnames'
[10.10.0.2] Executing task 'gethostnames'
[10.10.0.3] Executing task 'gethostnames'
[10.10.0.3] run: hostname
[10.10.0.1] run: hostname
[10.10.0.2] run: hostname
[10.10.0.2] out: hostname2
[10.10.0.2] out:

[10.10.0.3] out: hosname3
[10.10.0.3] out:

[10.10.0.1] out: hostname1
[10.10.0.1] out:

However if I try to run a "task" that requires an argument it fails, e.g.:

Python 2.7.12 (default, Dec 4 2017, 14:50:18)
[GCC 5.4.0 20160609] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> from fabric.api import *
>>> def runcommand(cmd):
...     with settings(warn_only=True, user='username', password='mypassword'):
...         run(cmd)
...
>>> results = execute(parallel(runcommand('hostname')), hosts=['10.10.0.1', '10.10.0.2', '10.10.0.3'])
No hosts found. Please specify (single) host string for connection:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 3, in runcommand
File "/usr/local/lib/python2.7/dist-packages/fabric/network.py",
line 684, in host_prompting_wrapper
host_string = raw_input("No hosts found. Please specify (single)"
KeyboardInterrupt

How do I run a task with execute that requires arguments?

Thanks,

Rob
Jeff Forcier
2018-06-15 17:34:31 UTC
execute() uses a semi-common functional-programming tactic where you hand
in a callable object and (usually) the arguments with which to call it.
So the stumbling block here was your use of
`runcommand('hostname')` - which calls `runcommand` immediately and gives
execute() the _result_ - instead of handing in `runcommand` itself!

See the latter few paragraphs here:
http://docs.fabfile.org/en/1.14/api/core/tasks.html#fabric.tasks.execute

So, you probably just need to do this:

results = execute(parallel(runcommand), cmd='hostname', hosts=[...])

execute() will take out the 'hosts' kwarg for its own use, leaving 'cmd'
around to be handed to runcommand() each time it's called.
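[A toy model of that dispatch pattern - a hypothetical simplification, not Fabric's implementation - makes the callable-vs-result distinction concrete:]

```python
def execute(task, *args, **kwargs):
    # Simplified stand-in for fabric.tasks.execute: pop 'hosts' for the
    # dispatcher's own use, then call the task once per host with
    # whatever arguments remain.
    hosts = kwargs.pop('hosts')
    results = {}
    for host in hosts:
        results[host] = task(*args, **kwargs)
    return results

def runcommand(cmd):
    return "ran %r" % cmd

# Correct: hand in the callable; 'cmd' is forwarded to each per-host call.
ok = execute(runcommand, cmd='hostname', hosts=['10.10.0.1', '10.10.0.2'])

# Wrong: runcommand('hostname') would run immediately, and execute()
# would receive the returned string - not a callable - hence the failure.
```
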

Best,
Jeff
--
Jeff Forcier
Unix sysadmin; Python engineer
http://bitprophet.org
Rob Marshall
2018-06-15 19:19:56 UTC
Hi,

Yup, that works. And after some monkeying around to figure out how to
make the command host-specific each time, it seems to be working.
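[Rob doesn't show his approach, but one plausible way to make the command host-specific - sketched here with stand-ins for fabric.state.env and execute(), since Fabric sets env.host_string to the host currently being run against - is to have the task look up its command by host:]

```python
class Env(object):
    """Stand-in for fabric.state.env; Fabric sets env.host_string to the
    host the current task invocation targets."""
    host_string = None

env = Env()

def runcommand(commands):
    # Pick the command for whichever host this invocation targets.
    cmd = commands[env.host_string]
    return "would run %r on %s" % (cmd, env.host_string)

def execute(task, commands, hosts):
    # Toy dispatcher: set env.host_string per host, as Fabric does,
    # then call the task once per host.
    results = {}
    for host in hosts:
        env.host_string = host
        results[host] = task(commands)
    return results
```
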

Thanks,

Rob