High python CPU usage associated w/ Gurobi Cloud?
Hi all--I moved to using Gurobi Instant Cloud in an attempt to ease the burden of running my optimization model on my computer, and to open up opportunities for performing computations in parallel on multiple servers. However, when I run three separate instances of my code (using a multiple-server pool in Gurobi Cloud), the CPU usage of each of my three Python processes is really high (87% each, so well over 100% in total).
Am I missing something about what it takes to actually offload the computational burden to the cloud? Most of my code runs quite quickly; the optimization is what's burdensome. So I would have expected the optimization burden to move to the cloud, but my computer still seems to be bearing a lot of it. Resources or thoughts on understanding what happens locally versus in the cloud would be appreciated.
Thanks,
Margaret
Official comment
This post is more than three years old. Some information may not be up to date. For current information, please check the Gurobi Documentation or Knowledge Base. If you need more help, please create a new post in the community forum. Or why not try our AI Gurobot?
2 things to check:
- Ensure your local machine configuration was updated correctly so that the computation is running on the cloud. The console and the log should show something like:
Capacity available on '999999-default' cloud pool - connecting...
Established HTTPS encrypted connection with Compute Server
- Only the solve itself takes place on the cloud; everything else, including data processing and model building, is still done in Python on your local computer (see the sketch below).
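For illustration, here is a minimal gurobipy sketch of that split. The gurobipy API is only used as an example here, and it assumes your gurobi.lic already holds your Instant Cloud credentials; the same division of work applies if you build the model through Pyomo. Everything before optimize() runs on your own machine:

import time
import gurobipy as gp
from gurobipy import GRB

# Model building happens locally and uses your machine's CPU.
t0 = time.time()
m = gp.Model("demo")
x = m.addVars(1000, lb=0, ub=1, name="x")
m.setObjective(gp.quicksum(x[i] for i in range(1000)), GRB.MAXIMIZE)
m.addConstr(gp.quicksum(x[i] for i in range(1000)) <= 500, name="budget")
build_time = time.time() - t0

# Only this call is shipped to the Instant Cloud pool for solving.
t0 = time.time()
m.optimize()
solve_time = time.time() - t0

print(f"local build: {build_time:.2f}s, remote solve: {solve_time:.2f}s")

If most of your local CPU time shows up in the build phase, that is expected; the cloud only takes over at optimize().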
Thanks, Greg. On point #1, I don't see anything like that, and I'm not quite sure where I would expect to. This may be because my call to Gurobi is not directly in the script I'm running (it's called within a module); I don't know if it matters that I'm using a Jupyter notebook instead of running a .py file in the terminal... (I guess this is an issue with me understanding which console/log you're talking about.)
However, when I run my script, I see the pool in the Cloud Manager turn from yellow to green and show "Ready". Is this enough to tell that it's running in the cloud?
Hi Margaret,
By default, the messages Greg mentioned would appear in the logfile and output produced by Gurobi when you solve a model. E.g.:
Gurobi 9.0.2 (mac64, gurobi_cl) logging started Fri Jun 26 12:42:43 2020
Using license file /Users/towle/gurobi.lic
Set parameter CloudAccessID
Set parameter CloudSecretKey
Set parameter CloudPool to value 999999-default
Set parameter LogFile to value gurobi.log
Waiting for cloud server to start (pool 999999-default)...
Starting...
Starting...
Starting...
Starting...
Compute Server job ID: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
Capacity available on '999999-default' cloud pool - connecting...
Established HTTPS encrypted connection
Gurobi Optimizer version 9.0.2 build v9.0.2rc0 (mac64)
Copyright (c) 2020, Gurobi Optimization, LLC
...
That said, I think you're using Pyomo, so the output is probably disabled by default. You can enable it by setting tee=True in the solve() call:
results = opt.solve(model, tee=True)
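For a bit more context, a minimal Pyomo sketch would look something like this (assuming you create the solver with SolverFactory('gurobi'), which is the usual setup for Pyomo + Gurobi):

from pyomo.environ import ConcreteModel, Var, Objective, SolverFactory, NonNegativeReals, maximize

# A tiny placeholder model, just to show where tee=True goes.
model = ConcreteModel()
model.x = Var(within=NonNegativeReals, bounds=(0, 10))
model.obj = Objective(expr=model.x, sense=maximize)

opt = SolverFactory('gurobi')
# tee=True streams Gurobi's log, including the cloud connection messages,
# to your console or notebook instead of suppressing it.
results = opt.solve(model, tee=True)

With tee=True you should see the "Capacity available ... connecting" lines from the log above whenever the job is actually sent to the cloud.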
If the pool status icon turns green and is "Ready", then the pool successfully launched and is ready to solve models. After you begin optimizing, a new job should show up in the Jobs view of the Cloud Manager interface. Do you see your jobs there?
Thanks,
Eli