Re: [Help-glpk] Knapsack problem using GLPK


From: Shubhashis Ghosh
Subject: Re: [Help-glpk] Knapsack problem using GLPK
Date: Thu, 15 Sep 2005 11:16:10 -0600 (MDT)

Hi,
  The number of variables does not always reflect the underlying hardness
of an instance. There are knapsack instances with fewer than 100 variables
that are practically intractable.
  Regarding GLPK settings, I don't think there is anything special for
large instances, but perhaps one of the GLPK developers can clarify.

thanks,
-Shu
On Thu, 15 Sep 2005, Lingzi Li wrote:

> I am using GLPKMEX to develop my project. It is a branch-and-price 
> approach, and I use GLPK to solve the Lagrangian relaxation's master and 
> subproblems. The subproblem is the following:
> 
> min  y - sum(ui*xi)
> s.t. sum(wi*xi) <= c*y
>      xi + xj <= 1 (for some (i,j))
>      xi, y are binary
> 
> (variables are xi and y)
> 
> When there are 80 variables, it takes 9 minutes to produce a solution, but 
> with 100 variables it takes more than 3 hours, and I believe that most of 
> the time is spent in the subproblem above. Do you think it is normal for 
> 100 variables to take that long? Or do I need to apply special settings 
> for large-scale problems? Actually, I don't think 100 variables is "large" 
> scale. Thank you very much in advance for your help!
> 
> 
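For concreteness, the quoted subproblem can be checked on tiny instances with a brute-force sketch (plain Python with hypothetical data; a real run would of course go through GLPK/GLPKMEX rather than enumeration):

```python
from itertools import product

def solve_subproblem(u, w, c, conflicts):
    """Brute-force the subproblem:
        min  y - sum(u[i] * x[i])
        s.t. sum(w[i] * x[i]) <= c * y
             x[i] + x[j] <= 1  for (i, j) in conflicts
             x[i], y binary
    Exponential in len(u) -- for illustration on tiny instances only."""
    n = len(u)
    best_val, best_sol = None, None
    for y in (0, 1):
        for x in product((0, 1), repeat=n):
            # Capacity constraint: total weight must fit within c * y.
            if sum(w[i] * x[i] for i in range(n)) > c * y:
                continue
            # Conflict constraints: at most one item from each pair.
            if any(x[i] + x[j] > 1 for i, j in conflicts):
                continue
            val = y - sum(u[i] * x[i] for i in range(n))
            if best_val is None or val < best_val:
                best_val, best_sol = val, (x, y)
    return best_val, best_sol

# Example (hypothetical data): u = [3, 2], w = [1, 1], c = 2, conflict (0, 1)
# -> optimal value -2 with x = (1, 0), y = 1
val, (x, y) = solve_subproblem([3, 2], [1, 1], 2, [(0, 1)])
```

This also illustrates the point above: running time for such enumeration is driven by the feasible structure (capacity and conflicts), not just the raw variable count.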

-- 
-----------------------------------------------------
Shubhashis Ghosh
Ph.D. Student
Department of Computing Science
2-21 Athabasca Hall
University of Alberta
Edmonton, AB T6G 2E8
Canada
------------------------------------------------------
Phone:   780-492-2737 (office)
         780-988-0378 (residence)
------------------------------------------------------




