Difference between Gurobi solution and Python SciPy nnls solution for a non-negative least squares problem
Solve the non-negative least squares problem
argmin_x || Ax - b ||_2
subject to x >= 0.
For the Python solution, simply call the nnls function:
from scipy.optimize import nnls
x_nnls, rnorm = nnls(A, B)  # x_nnls is the solution, rnorm is the residual norm ||A x_nnls - B||_2
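For context, here is a self-contained sketch of the SciPy call on small placeholder data (the random A and B below are illustrative stand-ins, not the actual problem data):
import numpy as np
from scipy.optimize import nnls

# Placeholder data for illustration only; replace with the real A and B.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
B = rng.standard_normal(20)

x_nnls, rnorm = nnls(A, B)
print(x_nnls)   # every entry is >= 0
print(rnorm)    # equals np.linalg.norm(A @ x_nnls - B)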
For the Gurobi solution, define the model as:
import gurobipy as gp
from gurobipy import GRB

A_dot_A = A.T.dot(A)
A_dot_B = A.T.dot(B)
b_dot_b = B.T.dot(B)

m = gp.Model("qp")
x = m.addMVar(shape=int(A.shape[1]), name="x")  # lower bound defaults to 0, so x >= 0
obj = x @ (A_dot_A @ x - 2.0 * A_dot_B)
m.setObjective(obj, GRB.MINIMIZE)
m.addConstr(x @ A_dot_A @ x - 2.0 * A_dot_B @ x + b_dot_b >= 0, "c0")  # states ||Ax - B||^2 >= 0
m.optimize()
The results are completely different; it seems nnls gives the right solution while Gurobi does not. Any thoughts?
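One quick check is to compare the residual norms of the two candidate solutions directly; a minimal sketch, assuming x_nnls comes from nnls and x.X holds the optimized Gurobi values:
import numpy as np

# Evaluate ||Ax - B|| for both candidate solutions; the smaller value is the better fit.
print("nnls  :", np.linalg.norm(A @ x_nnls - B))
print("Gurobi:", np.linalg.norm(A @ x.X - B))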
Could you paste a full example with A and b? I just tried the small example from the scipy documentation and Gurobi gave me the same result as nnls.
Moreover, I don't think you need to add the constraint since it simply says that the Euclidean norm of Ax-b should be non-negative (which should always be true by definition). What do you think?
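For what it's worth, here is a minimal sketch of the model with that constraint dropped (assuming A and B are the arrays from the question; the MVar lower bound defaults to 0, which already enforces x >= 0, and the remaining objective is a convex QP):
import gurobipy as gp
from gurobipy import GRB

A_dot_A = A.T.dot(A)
A_dot_B = A.T.dot(B)

m2 = gp.Model("nnls_qp")
x2 = m2.addMVar(shape=int(A.shape[1]), name="x")   # lb defaults to 0.0
m2.setObjective(x2 @ (A_dot_A @ x2 - 2.0 * A_dot_B), GRB.MINIMIZE)
m2.optimize()
print(x2.X)   # should match the nnls solution up to solver tolerances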
Thank you for your response! My case is a non-convex optimization case.
A sample of the coefficients is at this link:
https://drive.google.com/open?id=16v7_iQn3sVGumuIl9wsr0E3Nz3eC2SN3zUa_DBLW1Pg
I have already set the solver to non-convex mode with
m.setParam("NonConvex", 2)