makeClusterFunctionsSSH {BatchJobs}		R Documentation

Create an SSH cluster to execute jobs.

Description

Worker nodes must share the same file system and be accessible via ssh without manually entering passwords (e.g. via ssh-agent or a passwordless public key). Note that you can also use this function to parallelize on multiple cores of your local machine, but in that case you still have to run an ssh server and provide passwordless access to localhost.
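The passwordless access described above can be prepared with standard OpenSSH tooling; a minimal sketch (the node name curley is illustrative, and the verification commands are shown commented out because they require a reachable host):

```shell
# Generate a key pair without a passphrase, if one does not exist yet.
mkdir -p "$HOME/.ssh"
test -f "$HOME/.ssh/id_ed25519" || \
  ssh-keygen -t ed25519 -N "" -f "$HOME/.ssh/id_ed25519" -q

# Install the public key on each worker node (and on localhost if you
# want to parallelize on your local machine):
#   ssh-copy-id curley
#   ssh-copy-id localhost

# Verify that the login needs no password; BatchMode makes ssh fail
# instead of prompting interactively:
#   ssh -o BatchMode=yes curley true && echo "passwordless OK"
```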


Usage

makeClusterFunctionsSSH(..., workers)



Arguments

...
[SSHWorker]
Worker objects, all created with makeSSHWorker.

workers
[list of SSHWorker]
Alternative way to pass workers.



See Also


Other clusterFunctions: makeClusterFunctionsInteractive, makeClusterFunctionsLSF, makeClusterFunctionsLocal, makeClusterFunctionsMulticore, makeClusterFunctionsOpenLava, makeClusterFunctionsSGE, makeClusterFunctionsSLURM, makeClusterFunctionsTorque, makeClusterFunctions


Examples

## Not run: 

# Assume you have three nodes larry, curley and moe. All have 6
# CPU cores. On curley and moe, R is installed under
# "/opt/R/R-current"; on larry, R is installed under
# "/usr/local/R/". larry should not be used extensively because
# somebody else wants to compute there as well.
# Then a call to 'makeClusterFunctionsSSH'
# might look like this:

cluster.functions = makeClusterFunctionsSSH(
  makeSSHWorker(nodename = "larry", rhome = "/usr/local/R", max.jobs = 2),
  makeSSHWorker(nodename = "curley", rhome = "/opt/R/R-current"),
  makeSSHWorker(nodename = "moe", rhome = "/opt/R/R-current"))

## End(Not run)
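The workers argument offers an equivalent list-based form of the example above; a sketch under the same assumptions (node names and R homes are illustrative, and the call is not run here because it needs reachable ssh workers):

```r
library(BatchJobs)

# Same three-node setup, passed as a list via the 'workers' argument
# instead of via '...':
workers = list(
  makeSSHWorker(nodename = "larry", rhome = "/usr/local/R"),
  makeSSHWorker(nodename = "curley", rhome = "/opt/R/R-current"),
  makeSSHWorker(nodename = "moe", rhome = "/opt/R/R-current"))
cluster.functions = makeClusterFunctionsSSH(workers = workers)
```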

[Package BatchJobs version 1.8 Index]