Discussion:
[Fab-user] Run multiple commands on single host in parallel
Rob Marshall
2018-06-18 18:54:11 UTC
Hi,

I'm trying to run multiple commands on the same host in parallel, but if I
run a list of commands based on env.host_string they don't run in
parallel. Is there a way to do that?

I guess, in essence, I'd like to "nest" parallel commands. I
originally attempted to place the host in the hosts list multiple
times, but it looks like parallel removes duplicates (I assume this
has to do with separating results by host).
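
A minimal sketch of that kind of attempt, for illustration (assuming
Fabric 1.x; the host names and the show_date task here are made up):

# Illustrative only: duplicate entries in the host list are collapsed by
# execute(), so 'date' would run just once on host1 here.
from fabric.api import execute, run, settings, task

@task
def show_date():
    return run('date')

@task
def naive_attempt():
    with settings(parallel=True):
        # ['host1', 'host1', 'host2'] is effectively reduced to ['host1', 'host2']
        execute(show_date, hosts=['host1', 'host1', 'host2'])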

Thanks,

Rob
Brandon Whaley
2018-06-18 20:57:21 UTC
Hi Rob, I've done this as a hack in the past by adding data to the host
list and parsing it before execution to determine what to run. I've built
a simple example to give you an idea:

from fabric.api import env, execute, run, settings, task

@task
def hostname():
    return run('hostname')

@task
def uname():
    return run('uname -a')

@task
def task_chooser():
    # only consider up to the first underscore to be host data
    host, task = env.host_string.split('_', 1)
    return execute(task, hosts=[host])[host]

@task
def parallel_runner():
    host_list = [
        'host1_hostname',
        'host1_uname',
        'host2_hostname',
        'host2_uname'
    ]
    with settings(parallel=True):
        execute(task_chooser, hosts=host_list)

[host1_hostname] Executing task 'task_chooser'
[host1_uname] Executing task 'task_chooser'
[host2_hostname] Executing task 'task_chooser'
[host2_uname] Executing task 'task_chooser'
[host2] Executing task 'uname'
[host2] Executing task 'hostname'
[host1] Executing task 'uname'
[host2] run: uname -a
[host1] Executing task 'hostname'
[host2] run: hostname
[host1] run: uname -a
[host1] run: hostname
[host1] out: Linux host1 4.4.0-104-generic #127-Ubuntu SMP Mon Dec 11 12:16:42 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
[host1] out:

[host2] out: host2
[host2] out:

[host2] out: Linux host2 4.4.0-63-generic #84-Ubuntu SMP Wed Feb 1 17:20:32 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
[host2] out:

[host1] out: host1
[host1] out:


Done.
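
For completeness, a sketch of how the results could be collected
(illustrative only; it assumes the task_chooser definition above, and the
parallel_runner_collect name is made up). execute() returns a dict keyed
by host string, and here the keys are the synthetic 'host_task' entries,
so they can be regrouped per real host:

# Illustrative sketch: regroup the per-'host_task' results from the
# dispatcher above into {host: {task: output}}.
@task
def parallel_runner_collect():
    host_list = ['host1_hostname', 'host1_uname', 'host2_hostname', 'host2_uname']
    with settings(parallel=True):
        results = execute(task_chooser, hosts=host_list)

    by_host = {}
    for key, output in results.items():
        host, taskname = key.split('_', 1)
        by_host.setdefault(host, {})[taskname] = output
    # e.g. by_host['host1']['hostname'] -> 'host1'
    return by_host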
Rob Marshall
2018-06-19 05:43:15 UTC
Hi,

So I modified your code a bit and ended up with something like this:

@task
def monitor_task(rackname):
    cmd = [
        'run_rack_monitor',
        '--rack', rackname
    ]

    return run(' '.join(cmd))

@task
def run_load(load_node, load_base, load_max):
    cmd = [
        'run_system_load',
        '--datanode', load_node,
        '--base-value', str(load_base),
        '--max-value', str(load_max),
    ]

    return run(' '.join(cmd))

@task
def task_choser():
    host, values, task = env.host_string.split('__')
    for value in values.split(','):
        exec(value)

    if task == 'monitor_task':
        return execute(task, hosts=[host], rackname=rackname)
    else:
        return execute(task, hosts=[host], load_node=load_node, load_base=load_base, load_max=load_max)

@task
def run_parallel():
    host_list = [
        '10.10.0.2__rackname="rackname01"__monitor_task',
        '10.10.0.2__rackname="rackname02"__monitor_task',
        '10.10.0.2__rackname="rackname03"__monitor_task',
        '10.10.0.1__load_node="10.10.0.1",load_base=0,load_max=1000__run_load',
        '10.10.0.2__load_node="10.10.0.2",load_base=1000,load_max=2000__run_load',
        '10.10.0.3__load_node="10.10.0.3",load_base=2000,load_max=3000__run_load',
        '10.10.0.4__load_node="10.10.0.4",load_base=3000,load_max=4000__run_load',
        '10.10.0.5__load_node="10.10.0.5",load_base=4000,load_max=5000__run_load',
        '10.10.0.6__load_node="10.10.0.6",load_base=5000,load_max=6000__run_load',
    ]

    with settings(parallel=True):
        results = execute(task_choser, hosts=host_list)

    return results

This allows me to pass arguments to the tasks. I did run into one odd
thing: if I just tried to run run_parallel() as a plain function I got an
error:

Fatal error: '...' is not callable or a valid task name

So what I ended up doing (not sure if there's a better way) was:

from fabric.main import load_fabfile
from fabric.state import commands
...

docstring, callables, default = load_fabfile(__file__)
commands.update(callables)

with settings(hide('everything'), user='username', password='password1'):
    results = execute('run_parallel')

That seemed to work.
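
For what it's worth, the reason the workaround appears to be needed is
that execute() resolves string task names through the
fabric.state.commands registry, which the fab command line normally
populates by loading the fabfile; a standalone script has to fill it in
itself, which is what the load_fabfile()/commands.update() calls above
do. A quick illustrative check:

# Illustrative check: after the workaround, the registry consulted by
# string-based execute() lookups should list the tasks.
from fabric.state import commands
print(sorted(commands.keys()))  # expect 'monitor_task', 'run_load', 'run_parallel', 'task_choser'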

Thanks,

Rob
Brandon Whaley
2018-06-19 06:47:18 UTC
Hmm, I'm not sure why run_parallel would throw an error like that. I'd be
interested to see the full stack trace. You actually shouldn't need to use
load_fabfile or commands.update, just using execute(run_parallel) should
work. I'll take some time tomorrow and try to replicate your issue.


P.S.
As a personal favor for my sanity, I ask that you not use exec(). Here's
an example of parsing an argument list like the one you're using:

>>> import json
>>> values = 'load_node=10.10.0.1,load_base=0,load_max=1000'
>>> args = { k: v for k, v in [ arg.split('=', 1) for arg in values.split(',') ] }
>>> print json.dumps(args, indent=4)
{
    "load_node": "10.10.0.1",
    "load_base": "0",
    "load_max": "1000"
}

You'd then check for args['load_node'] instead of using the local variable
load_node.
Rob Marshall
2018-06-19 14:14:54 UTC
Hi Brandon,

As a personal favor :-) It actually makes it easier to check the args...

@task
def task_choser():
    host, values, task = env.host_string.split('__')
    args = { k: v for k, v in [ arg.split('=', 1) for arg in values.split(',') ] }

    if task == 'monitor_task':
        if not args.has_key('rackname'):
            raise ValueError('A rackname is required for monitor_task')
        return execute(task, hosts=[host], rackname=args['rackname'])[host]
    else:
        for argkey in ('load_node', 'load_base', 'load_max'):
            if not args.has_key(argkey):
                raise ValueError('A %s is required for run_load' % argkey)
        return execute(task, hosts=[host], load_node=args['load_node'],
                       load_base=args['load_base'], load_max=args['load_max'])[host]
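
One small caveat with parsing this way (an observation, not from the
thread): the values stay strings and keep any literal quote characters
from host-list entries like rackname="rackname01", so a little
normalization may be wanted before passing them on. An illustrative
sketch (the normalize helper is made up):

# Illustrative helper: strip quotes carried over from the host string and
# cast purely numeric values to int.
def normalize(args):
    cleaned = {}
    for key, value in args.items():
        value = value.strip('\'"')
        cleaned[key] = int(value) if value.isdigit() else value
    return cleaned

# normalize({'rackname': '"rackname01"', 'load_base': '1000'})
# -> {'rackname': 'rackname01', 'load_base': 1000}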

I essentially took your test script as-is (BTW, I'm running Python 2.7.12
and Fabric 1.13.2):

***@robs-xubuntu2: [Projects]$ cat test_brandon.py
#!/usr/bin/python

from fabric.api import *
from pprint import pprint

@task
def hostname():
    return run('hostname')

@task
def uname():
    return run('uname -a')

@task
def task_chooser():
    # only consider up to the first underscore to be host data
    host, task = env.host_string.split('_', 1)
    results = execute('%s' % task, hosts=[host])
    return results

@task
def parallel_runner():
    host_list = [
        '10.245.129.185_hostname',
        '10.245.129.185_uname',
        '10.245.129.186_hostname',
        '10.245.129.186_uname'
    ]
    with settings(parallel=True):
        results = execute(task_chooser, hosts=host_list)

    pprint(results)
    return results

if __name__ == '__main__':
    execute(parallel_runner)

When I run it I get:

***@robs-xubuntu2: [Projects]$ test_brandon.py
[10.245.129.185_hostname] Executing task 'task_chooser'
[10.245.129.185_uname] Executing task 'task_chooser'
[10.245.129.186_hostname] Executing task 'task_chooser'
[10.245.129.186_uname] Executing task 'task_chooser'

Fatal error: 'uname' is not callable or a valid task name

Aborting.

Fatal error: 'hostname' is not callable or a valid task name

Aborting.

Fatal error: 'uname' is not callable or a valid task name

Aborting.

Fatal error: 'hostname' is not callable or a valid task name

Aborting.

Fatal error: One or more hosts failed while executing task 'task_chooser'

Aborting.
Brandon Whaley
2018-06-19 14:59:20 UTC
Thanks for dropping exec :)

I see what you mean about the task lookup problem. I guess you do need to
load the file as a fabfile if you want string-based task lookups in
execute() and don't intend to use the "fab" command line tool. Sorry for
not realizing that!

If you hadn't found the fabfile loading workaround, you could also have
tried to look up the task function itself in globals(), since execute()
can take the function object as well as a task name:

>>> def test():
...     pass
...
>>> globals()['test']
<function test at 0x7ff6150b5c80>

I'm partial to your solution though.
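
Applied to the earlier dispatcher, that idea would look roughly like this
(illustrative only; it assumes the hostname/uname tasks from the first
example and relies on execute() accepting the task object itself):

# Illustrative variant of task_chooser: look the task up in globals() and
# hand execute() the object, so no fabfile loading is needed outside `fab`.
@task
def task_chooser():
    # only consider up to the first underscore to be host data
    host, taskname = env.host_string.split('_', 1)
    return execute(globals()[taskname], hosts=[host])[host]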
Rob Marshall
2018-06-19 17:58:28 UTC
Hi Brandon,

As for the solution to the task lookup problem, I found that via a Google
search, so it's not really "mine". :-)

Thank you very much. You've been very helpful.

Rob