RE: [Help-glpk] optimality tolerance


From: Ciyou Zhu
Subject: RE: [Help-glpk] optimality tolerance
Date: Fri, 20 Feb 2004 09:48:37 -0500

I did it for GLPK 3.2.3, reading LPX_K_TOLOBJ, LPX_K_TOLPIV and LPX_K_TMLIM
from environment variables.  You need to edit glpsol.c for that purpose, and
you also need to edit glplpx3.c if you want a TOLOBJ larger than 0.001.  See
the attached files.
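
For the archive, here is roughly what the glpsol.c change amounts to.  This
is only a sketch, assuming the LPX control-parameter interface
(lpx_set_real_parm) and the parameter codes LPX_K_TOLOBJ, LPX_K_TOLPIV and
LPX_K_TMLIM; the environment variable names GLPK_TOLOBJ etc. are my own
choice and not part of GLPK:

/* sketch: read solver tolerances from the environment and apply them
 * to the problem object before glpsol calls the solver */
#include <stdlib.h>          /* getenv, atof */
#include "glpk.h"            /* LPX, lpx_set_real_parm, LPX_K_* codes */

static void parm_from_env(LPX *lp, int parm, const char *name)
{
    const char *s = getenv(name);
    if (s != NULL) lpx_set_real_parm(lp, parm, atof(s));
}

/* to be called from main() in glpsol.c once the problem object exists */
static void apply_env_parms(LPX *lp)
{
    parm_from_env(lp, LPX_K_TOLOBJ, "GLPK_TOLOBJ");
    parm_from_env(lp, LPX_K_TOLPIV, "GLPK_TOLPIV");
    parm_from_env(lp, LPX_K_TMLIM,  "GLPK_TMLIM");
}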

I have been using GLPK to solve IP problems of up to 300,000 variables and
1,000 constraints derived from crew pairing optimization.

Ciyou Zhu

-----Original Message-----
From: address@hidden
[mailto:address@hidden On Behalf Of Michael
Hennebry
Sent: Thursday, February 19, 2004 9:58 PM
To: Brady Hunsaker
Cc: address@hidden
Subject: Re: [Help-glpk] optimality tolerance

On 19 Feb 2004, Brady Hunsaker wrote:

> On Mon, 2004-02-09 at 10:49, Duilio Foschi wrote:
> > is there a way to set the optimality tolerance of a fairly large MIP 
> > problem in GLPK ?
> >
>
> I do not believe there is currently a way to do this.  This should be 
> easier with the Integer Optimization Suite that is being developed, 
> but for the time being, large MIPs are probably not appropriate for GLPK.

There is a parameter for it (LPX_K_TOLOBJ), but to do what you want you will
probably have to edit the GLPK source.  The parameter appears to be intended
to deal with the vagaries of approximate arithmetic, so a value larger than
0.001 is flagged as an error.
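
To make that concrete, a minimal sketch assuming the LPX interface (the
check that rejects large values sits in glplpx3.c; the exact limit and
error text may differ by version):

/* sketch: setting the optimality tolerance through the LPX API */
#include "glpk.h"

void set_tolobj(LPX *lp)
{
    lpx_set_real_parm(lp, LPX_K_TOLOBJ, 1e-7);  /* accepted: within range */
    lpx_set_real_parm(lp, LPX_K_TOLOBJ, 0.01);  /* rejected: values above
                                                   0.001 are flagged as an
                                                   error unless the check
                                                   in glplpx3.c is relaxed */
}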

-- 
Mike   address@hidden
"Two roads divurged in a woods, and I took the road less travelled by...
and they CANCELLED MY FRIKKIN' SHOW. I totally shoulda took that road
with all those people on it. Damn."                           --  Joss
Whedon



_______________________________________________
Help-glpk mailing list
address@hidden
http://mail.gnu.org/mailman/listinfo/help-glpk

Attachment: glpsol.c
Description: Binary data

Attachment: glplpx3.c
Description: Binary data

