
Conversation

@dourouc05
Contributor

Follows #2004 and #2008. Supersedes #2006.

I cleaned up the history from the previous PR and took the remaining comments into account. There is one outstanding issue (line 140 of this PR; previously raised at https://github.com/JuliaOpt/JuMP.jl/pull/2006/files#diff-46229f75b87dbb3f6099492776c892edR143):

Why does a dual value of 0.0 cause GLPK to crash? That shouldn't happen.

Actually, it's GLPK.jl that raises the error (Cbc.jl does too):

ERROR: LoadError: AssertionError: dual >= 0.0
Stacktrace:
 [1] __assert_dual_sense__ at C:\Users\…\.julia\packages\LinQuadOptInterface\ZMx9f\src\solve.jl:210 [inlined]
 [2] get at C:\Users\…\.julia\packages\LinQuadOptInterface\ZMx9f\src\solve.jl:217 [inlined]
 [3] get(::MathOptInterface.Bridges.LazyBridgeOptimizer{GLPK.Optimizer,MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Bridges.AllBridgedConstraints{Float64}}}, ::MathOptInterface.ConstraintDual, ::MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}) at C:\Users\…\.julia\packages\MathOptInterface\C3lip\src\Bridges\bridgeoptimizer.jl:236
 [4] get(::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer,MathOptInterface.Utilities.UniversalFallback{JuMP._MOIModel{Float64}}}, ::MathOptInterface.ConstraintDual, ::MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}}) at C:\Users\…\.julia\packages\MathOptInterface\C3lip\src\Utilities\cachingoptimizer.jl:453
 [5] _moi_get_result(::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer,MathOptInterface.Utilities.UniversalFallback{JuMP._MOIModel{Float64}}}, ::MathOptInterface.ConstraintDual, ::Vararg{Any,N} where N) at C:\Users\…\.julia\dev\JuMP\src\JuMP.jl:618
 [6] get(::Model, ::MathOptInterface.ConstraintDual, ::ConstraintRef{Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}},ScalarShape}) at C:\Users\…\.julia\dev\JuMP\src\JuMP.jl:649
 [7] _constraint_dual(::ConstraintRef{Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}},ScalarShape}) at C:\Users\…\.julia\dev\JuMP\src\constraints.jl:545
 [8] dual(::ConstraintRef{Model,MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64},MathOptInterface.GreaterThan{Float64}},ScalarShape}) at C:\Users\…\.julia\dev\JuMP\src\constraints.jl:538
 [9] _broadcast_getindex_evalf at .\broadcast.jl:625 [inlined]
 [10] _broadcast_getindex at .\broadcast.jl:598 [inlined]
 [11] getindex at .\broadcast.jl:558 [inlined]
 [12] copyto_nonleaf!(::Array{Float64,1}, ::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1},Tuple{Base.OneTo{Int64}},typeof(dual),Tuple{Base.Broadcast.Extruded{Array{ConstraintRef{Model,C,Shape} where Shape<:AbstractShape where C,1},Tuple{Bool},Tuple{Int64}}}}, ::Base.OneTo{Int64}, ::Int64, ::Int64) at .\broadcast.jl:982
 [13] copy at .\broadcast.jl:836 [inlined]
 [14] materialize at .\broadcast.jl:798 [inlined]
 [15] #example_cutting_stock#3(::Int64, ::typeof(example_cutting_stock)) at C:\Users\…\Desktop\a.jl:126
 [16] example_cutting_stock() at C:\Users\…\Desktop\a.jl:88
 [17] top-level scope at C:\Users\…\Desktop\a.jl:175
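For context, here is a minimal sketch of the failure mode behind the stack trace above (all names here are illustrative, not LinQuadOptInterface's actual internals): LP solvers can report duals that are zero only up to floating-point tolerance, e.g. -1e-12, so a value that prints as 0.0 can still fail a strict sign assertion like `dual >= 0.0`. A tolerance-based check avoids the spurious failure:

```julia
# Strict sign check, analogous to the assertion that fails above (illustrative).
strict_dual_check(dual) = dual >= 0.0

# Tolerance-based variant: treats tiny negative values as numerically zero.
# `atol` is a hypothetical tolerance parameter, not a real LQOI option.
tolerant_dual_check(dual; atol = 1e-8) = dual >= -atol

d = -1e-12                    # a dual that is "zero" up to solver round-off
strict_dual_check(d)          # false -> would trigger the AssertionError
tolerant_dual_check(d)        # true
```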

@codecov

codecov bot commented Jul 18, 2019

Codecov Report

Merging #2010 into master will increase coverage by 0.43%.
The diff coverage is n/a.


@@            Coverage Diff             @@
##           master    #2010      +/-   ##
==========================================
+ Coverage   91.41%   91.84%   +0.43%     
==========================================
  Files          33       33              
  Lines        4217     4771     +554     
==========================================
+ Hits         3855     4382     +527     
- Misses        362      389      +27
Impacted Files Coverage Δ
src/constraints.jl 90.5% <0%> (-1.11%) ⬇️
src/objective.jl 91.07% <0%> (-0.11%) ⬇️
src/operators.jl 86.89% <0%> (-0.05%) ⬇️
src/_Derivatives/forward.jl 97.61% <0%> (-0.01%) ⬇️
src/nlp.jl 92.2% <0%> (+0.27%) ⬆️
src/variables.jl 97.66% <0%> (+2.42%) ⬆️
src/JuMP.jl 87.01% <0%> (+2.45%) ⬆️
src/copy.jl 97.67% <0%> (+2.67%) ⬆️
src/parse_nlp.jl 95.19% <0%> (+4.22%) ⬆️

Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 6968fbe...470ee99.

@mlubin
Member

mlubin commented Jul 20, 2019

The assertion error will be fixed by jump-dev/GLPK.jl#101, so let's hold off on merging this before that's released.

@dourouc05
Contributor Author

Nice! What about Cbc/Clp, since they have the same assertion?

@mlubin
Member

mlubin commented Jul 20, 2019

Clp will be fixed when either a new LinQuadOptInterface is released or the wrapper is rewritten without LinQuadOptInterface, as we're doing for GLPK.

@chriscoey
Contributor

This would also be great in notebook format in https://github.com/barpit20/JuMPTutorials.jl (which will be moved under the JuMP organization's umbrella soon).

@mlubin
Member

mlubin commented Sep 6, 2019

Just reran the tests, and it looks like this PR has drifted a bit from master:

LoadError: UndefVarError: set_standard_form_coefficient not defined
Stacktrace:
 [1] #example_cutting_stock#8(::Int64, ::Function) at /home/travis/build/JuliaOpt/JuMP.jl/examples/cutting_stock_column_generation.jl:158
 [2] example_cutting_stock() at /home/travis/build/JuliaOpt/JuMP.jl/examples/cutting_stock_column_generation.jl:98

I'll merge once CI passes.
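(For context: the `UndefVarError` above is API drift; the function appears to have been renamed on master to `set_normalized_coefficient`. A hedged sketch of how a column-generation example uses it, assuming the post-rename JuMP API, with a toy model rather than the actual cutting-stock code:)

```julia
using JuMP

# Toy model: one variable, one demand-style constraint.
model = Model()
@variable(model, x >= 0)
@constraint(model, demand, 2x >= 1)

# When a new column is priced in, its coefficient in an existing
# constraint is updated in place; this call replaces the old
# `set_standard_form_coefficient`.
set_normalized_coefficient(demand, x, 3)

normalized_coefficient(demand, x)  # 3.0
```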

@dourouc05
Contributor Author

This should be corrected.

@mlubin mlubin merged commit 3caea82 into jump-dev:master Sep 7, 2019
@mlubin mlubin mentioned this pull request Sep 7, 2019
@dourouc05 dourouc05 deleted the example-colgen-new branch September 7, 2019 01:56
