Commit 5b23b16

Preparing for 2K25 edition
1 parent 878e102 commit 5b23b16

30 files changed: +1701 -34 lines changed

README.md (+1 -18)

@@ -1,4 +1,4 @@
-# Operations Research 2K24
+# Operations Research 2K25
 
 This repository contains the Notebook used during the Operations Research (#orms) course at the [Department of Mathematics](https://matematica.unipv.it/) at the University of Pavia.
 
@@ -10,28 +10,11 @@ This repository is maintained by the [Computational Optimization Research Group]
 
 | Data | Notebook | Link |
 |:-|:-|:-|
-|**[2024/05/10]**|*Traveling Student Problem (TSP)*|[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mathcoding/opt4ds/blob/master/notebooks/TSP.ipynb)|
-|**[2024/04/19]**|*Training Binary Neural Networks*|[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mathcoding/opt4ds/blob/master/notebooks/TrainingBNN.ipynb)|
-|**[2024/03/22]**|*Linear Regression with Gurobi*|[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mathcoding/opt4ds/blob/master/notebooks/linear_regression.ipynb)|
-|**[2024/03/15]**|*Modeling exercises with Gurobi*|[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mathcoding/opt4ds/blob/master/notebooks/Python_and_Gurobi.ipynb)|
-|**[2024/03/11]**|*Steel Production Planning (intro to Gurobi)*|[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mathcoding/opt4ds/blob/master/notebooks/Steel_Planning.ipynb)|
-|**[2024/03/01]**|*Python in a Nutshell*|[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mathcoding/opt4ds/blob/master/notebooks/Python_in_a_Nutshell.ipynb)|
 
 ## Homeworks solutions
 
 | Data | Solution | Link |
 |:-|:-|:-|
-|**[2024/05/17]**|*Asymmetric TSP*|[atsp.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/atsp.py)|
-|**[2024/05/17]**|*Symmetric TSP*|[tsp.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/tsp.py)|
-|**[2024/05/03]**|*Training a BNN for the XOR logical function (non linearly separable*|[NN_xor.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/Xor_aula.py)|
-|**[2024/04/22]**|*Training a BNN for the AND logical function*|[NN_and.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/NN_and.py)|
-|**[2024/04/15]**|*Optimal Color Transfer*|[colorTransfer.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/colorTransfer.py)|
-|**[2024/04/12]**|*Linear regression Diabete dataset*|[regression_diabete.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/regression_diabete.py)|
-|**[2024/04/12]**|*Linear regression noisy $sin(x)$*|[regression_sin.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/regression_sin.py)|
-|**[2024/03/15]**|*Exercise 2.6: Square Magic*|[square_magic.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/square_magic.py)|
-|**[2024/03/15]**|*Exercise 2.5: Steel Recycle Bleending Problem*|[steel.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/steel.py)|
-|**[2024/03/01]**|*Python in a Nutshell: Solutions to exercises*|[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mathcoding/opt4ds/blob/master/notebooks/Python_in_a_Nutshell_solutions.ipynb)|
-
 
 
 ### License

aa2024/README.md (new file, +38)

# Operations Research 2K24

This repository contains the Notebook used during the Operations Research (#orms) course at the [Department of Mathematics](https://matematica.unipv.it/) at the University of Pavia.

Every notebook can be opened directly on the web using Google Colab, by clicking on the corresponding Colab icon.

This repository is maintained by the [Computational Optimization Research Group](https://www.compopt.it/). You are welcome to contribute!

## Python Notebooks

| Data | Notebook | Link |
|:-|:-|:-|
|**[2024/05/10]**|*Traveling Student Problem (TSP)*|[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mathcoding/opt4ds/blob/master/notebooks/TSP.ipynb)|
|**[2024/04/19]**|*Training Binary Neural Networks*|[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mathcoding/opt4ds/blob/master/notebooks/TrainingBNN.ipynb)|
|**[2024/03/22]**|*Linear Regression with Gurobi*|[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mathcoding/opt4ds/blob/master/notebooks/linear_regression.ipynb)|
|**[2024/03/15]**|*Modeling exercises with Gurobi*|[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mathcoding/opt4ds/blob/master/notebooks/Python_and_Gurobi.ipynb)|
|**[2024/03/11]**|*Steel Production Planning (intro to Gurobi)*|[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mathcoding/opt4ds/blob/master/notebooks/Steel_Planning.ipynb)|
|**[2024/03/01]**|*Python in a Nutshell*|[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mathcoding/opt4ds/blob/master/notebooks/Python_in_a_Nutshell.ipynb)|

## Homeworks solutions

| Data | Solution | Link |
|:-|:-|:-|
|**[2024/05/17]**|*Asymmetric TSP*|[atsp.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/atsp.py)|
|**[2024/05/17]**|*Symmetric TSP*|[tsp.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/tsp.py)|
|**[2024/05/03]**|*Training a BNN for the XOR logical function (non linearly separable*|[NN_xor.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/Xor_aula.py)|
|**[2024/04/22]**|*Training a BNN for the AND logical function*|[NN_and.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/NN_and.py)|
|**[2024/04/15]**|*Optimal Color Transfer*|[colorTransfer.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/colorTransfer.py)|
|**[2024/04/12]**|*Linear regression Diabete dataset*|[regression_diabete.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/regression_diabete.py)|
|**[2024/04/12]**|*Linear regression noisy $sin(x)$*|[regression_sin.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/regression_sin.py)|
|**[2024/03/15]**|*Exercise 2.6: Square Magic*|[square_magic.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/square_magic.py)|
|**[2024/03/15]**|*Exercise 2.5: Steel Recycle Bleending Problem*|[steel.py](https://github.com/mathcoding/opt4ds/blob/master/scripts/steel.py)|
|**[2024/03/01]**|*Python in a Nutshell: Solutions to exercises*|[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mathcoding/opt4ds/blob/master/notebooks/Python_in_a_Nutshell_solutions.ipynb)|


### License
<a rel="license" href="http://creativecommons.org/licenses/by/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by/4.0/88x31.png" /></a><br /><span xmlns:dct="http://purl.org/dc/terms/" property="dct:title"><b>#ORMS Notebooks</b></span> by <a xmlns:cc="http://creativecommons.org/ns#" href="http://matematica.unipv.it/gualandi" property="cc:attributionName" rel="cc:attributionURL">Stefano Gualandi</a> is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>.<br />Based on a work at <a xmlns:dct="http://purl.org/dc/terms/" href="https://github.com/mathcoding/opt4ds" rel="dct:source">https://github.com/mathcoding/opt4ds</a>.

notebooks/Lego_Problems.ipynb renamed to aa2024/notebooks/Lego_Problems.ipynb (+1 -1)

@@ -5,7 +5,7 @@
    "id": "specified-functionality",
    "metadata": {},
    "source": [
-    "<a rel=\"license\" href=\"http://creativecommons.org/licenses/by/4.0/\"><img alt=\"Creative Commons License\" style=\"border-width:0\" src=\"https://i.creativecommons.org/l/by/4.0/88x31.png\" /></a><br /><span xmlns:dct=\"http://purl.org/dc/terms/\" property=\"dct:title\"><b>Solving the Lego Planning Problem with Pyomo and Glpk</b></span> by <a xmlns:cc=\"http://creativecommons.org/ns#\" href=\"http://mate.unipv.it/gualandi\" property=\"cc:attributionName\" rel=\"cc:attributionURL\">Stefano Gualandi</a> is licensed under a <a rel=\"license\" href=\"http://creativecommons.org/licenses/by/4.0/\">Creative Commons Attribution 4.0 International License</a>. Based on a project at <a xmlns:dct=\"http://purl.org/dc/terms/\" href=\"https://github.com/mathcoding/opt4ds\" rel=\"dct:source\">https://github.com/mathcoding/opt4ds</a>."
+    "<a rel=\"license\" href=\"http://creativecommons.org/licenses/by/4.0/\"><img alt=\"Creative Commons License\" style=\"border-width:0\" src=\"https://i.creativecommons.org/l/by/4.0/88x31.png\" /></a><br /><span xmlns:dct=\"http://purl.org/dc/terms/\" property=\"dct:title\"><b>Solving the Lego Planning Problem with Gurobi</b></span> by <a xmlns:cc=\"http://creativecommons.org/ns#\" href=\"http://mate.unipv.it/gualandi\" property=\"cc:attributionName\" rel=\"cc:attributionURL\">Stefano Gualandi</a> is licensed under a <a rel=\"license\" href=\"http://creativecommons.org/licenses/by/4.0/\">Creative Commons Attribution 4.0 International License</a>. Based on a project at <a xmlns:dct=\"http://purl.org/dc/terms/\" href=\"https://github.com/mathcoding/opt4ds\" rel=\"dct:source\">https://github.com/mathcoding/opt4ds</a>."
    ]
   },
  {
File renamed without changes.
File renamed without changes.
File renamed without changes.

aa2024/scripts/B-SVM.py (new file, +261)

from gurobipy import Model, GRB, quicksum
import numpy as np
from math import sqrt
import matplotlib.pyplot as plt

import logging
logging.basicConfig(filename='test-all.log',
                    filemode='a',
                    format='%(asctime)s,%(msecs)d %(message)s',
                    datefmt='%H:%M:%S',
                    level=logging.INFO)

def Parse(filename):
    # Read a ';'-separated data file: the first column is the label (class 4
    # is mapped to -1, every other class to +1), the remaining columns are
    # pixel intensities rescaled to [0, 1]
    fh = open(filename, 'r')
    fh.readline()  # skip the header line
    Xs, Ys = [], []
    for row in fh:
        line = row.replace('\n', '').split(';')
        v = int(line[0])
        Ys.append(-1 if v == 4 else 1)
        Xs.append(list(map(int, line[1:])))
    Xs = np.matrix(Xs)/255
    return Xs, np.array(Ys)

class BatchLearner(object):
    def __init__(self, Xtrain, Ytrain) -> None:
        self.Xtrain = Xtrain
        self.Ytrain = Ytrain

        # Set mu > 0 to enable warm starts from the mean of previous solutions
        self.mu = 0
        self.wp_hint = np.zeros(784)
        self.wn_hint = np.zeros(784)
        self.wbp_hint = 0
        self.wbn_hint = 0

        self.accuracy = 0.0
        self.Predict = None
        self.Results = []

        self.Sol = []

    def evalTrain(self, F):
        # Accuracy (in percent) of predictor F on the training data
        Xs, Ys = self.Xtrain, self.Ytrain
        acc = 0
        n = len(Ys)
        for i in range(n):
            if F(Xs[i]) == Ys[i]:
                acc += 1
        return (acc/n*100)

    def solve(self, Xs, Ys, timelimit=10):
        model = Model()
        model.setParam(GRB.Param.OutputFlag, 0)
        model.setParam(GRB.Param.TimeLimit, timelimit)
        model.setParam(GRB.Param.BestObjStop, 1.00)
        # model.setParam(GRB.Param.Method, 2)    # Barrier method
        # model.setParam(GRB.Param.Crossover, 0) # No crossover
        model.setParam(GRB.Param.Method, 1)      # Dual Simplex

        # Number of samples (n) and of features (m)
        n, m = Xs.shape
        N = int(sqrt(m))  # image side length (28 for MNIST)

        # Variables: weights split into positive and negative parts
        wp, wn = {}, {}
        for i in range(m):
            wp[i] = model.addVar(obj=0.1, vtype=GRB.CONTINUOUS)
            wn[i] = model.addVar(obj=0.1, vtype=GRB.CONTINUOUS)

        # Bias variable, also split into positive and negative parts
        wbp = model.addVar(obj=0.1, vtype=GRB.CONTINUOUS)
        wbn = model.addVar(obj=0.1, vtype=GRB.CONTINUOUS)

        # Slack variables for the margin violations
        z = [model.addVar(obj=1, lb=0.0, vtype=GRB.CONTINUOUS) for k in range(n)]

        # Soft-margin constraints: y_k (w x_k + b) >= 1 - z_k
        for k in range(n):
            model.addConstr(quicksum([Ys[k]*Xs[k,i]*(wp[i] - wn[i]) for i in range(m)]) + Ys[k]*(wbp - wbn) >= 1-z[k])

        model.update()

        # Warm start: hint and start values from the mean of previous solutions
        if self.mu > 0:
            muWbar, muBias = self.meanWeights()
            if muBias >= 0.0001:
                wbp.VarHintVal = muBias
                wbn.VarHintVal = 0.0
            elif muBias <= -0.0001:
                wbp.VarHintVal = 0.0
                wbn.VarHintVal = -muBias
            else:
                wbp.VarHintVal = 0.0
                wbn.VarHintVal = 0.0

            for i in wp:
                if muWbar[i] >= 0.0001:
                    wp[i].VarHintVal = muWbar[i]
                    wn[i].VarHintVal = 0.0
                elif muWbar[i] <= -0.0001:
                    wp[i].VarHintVal = 0.0
                    wn[i].VarHintVal = -muWbar[i]
                else:
                    wp[i].VarHintVal = 0.0
                    wn[i].VarHintVal = 0.0

            if muBias >= 0.0001:
                wbp.PStart = muBias
                wbn.PStart = 0.0
            elif muBias <= -0.0001:
                wbp.PStart = 0.0
                wbn.PStart = -muBias
            else:
                wbp.PStart = 0.0
                wbn.PStart = 0.0
            for i in wp:
                if muWbar[i] >= 0.0001:
                    wp[i].PStart = muWbar[i]
                    wn[i].PStart = 0.0
                elif muWbar[i] <= -0.0001:
                    wp[i].PStart = 0.0
                    wn[i].PStart = -muWbar[i]
                else:
                    wp[i].PStart = 0.0
                    wn[i].PStart = 0.0

        model.optimize()

        if model.status != GRB.Status.OPTIMAL and model.status != GRB.TIME_LIMIT and model.status != GRB.USER_OBJ_LIMIT:
            return None

        if model.SolCount == 0:
            return None

        if model.status == GRB.USER_OBJ_LIMIT:
            # Objective target reached: polish the solution for a few more seconds
            model.setParam(GRB.Param.BestObjStop, 0.0)
            model.setParam(GRB.Param.TimeLimit, 5)
            model.optimize()

        # Recover the weight vector and the bias
        wbar = np.array([wp[i].x - wn[i].x for i in wp])
        wbias = wbp.x - wbn.x

        # Build predictor function
        def Predict(x):
            return 1 if (wbias + np.dot(x, wbar)) >= 0 else -1

        acc = self.evalTrain(Predict)

        # Update internal values
        print('accuracy:', round(acc, 3), 'obj:', round(model.objVal, 3), 'runtime', round(model.runtime, 2), 'status:', model.status)
        # Keep the best solution found so far
        if acc > self.accuracy:
            self.accuracy = acc
            self.Predict = Predict

        self.Results.append( (acc, model.objVal) )
        self.Sol.append( (wbar, wbias) )

        # Return prediction function
        return Predict

    def meanWeights(self):
        # Average the weight vectors and biases of all stored solutions
        muWbar = np.zeros(784)
        muBias = 0.0
        for wbar, wbias in self.Sol:
            muWbar += wbar
            muBias += wbias
        muWbar = muWbar/len(self.Sol)
        muBias = muBias/len(self.Sol)
        return muWbar, muBias

    def meanPredict(self):
        muWbar, muBias = self.meanWeights()
        return lambda x: 1 if (muBias + np.dot(x, muWbar)) >= 0 else -1

    def showWeights(self):
        # Display the averaged weight vector as a 28x28 image
        muWbar, _ = self.meanWeights()
        A = muWbar.reshape(28, 28).copy()
        A[np.abs(A) < 1e-09] = np.nan
        plt.imshow(A, cmap='rainbow')
        plt.colorbar()
        plt.show()

if __name__ == "__main__":
    import keras

    # Load MNIST and flatten each 28x28 image into a vector of 784 pixels in [0, 1]
    (x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
    x_train = x_train.astype(float)/255
    x_train = x_train.reshape(x_train.shape[0], 784)

    x_test = x_test.astype(float)/255
    x_test = x_test.reshape(x_test.shape[0], 784)

    np.random.seed(13)
    batch_size = 512
    epochs = 30
    # One binary classification problem for every pair of digits (a, b)
    for a in range(10):
        for b in range(a+1, 10):
            I = np.where((y_train == a) | (y_train == b))
            Xs = x_train[I]
            Ys = y_train[I]
            DigitOne = Ys[0]
            Ys = np.where(Ys == DigitOne, 1, -1)
            n, m = Xs.shape

            bsvm = BatchLearner(Xs, Ys)

            from time import perf_counter
            time_start = perf_counter()
            for i in range(epochs):
                # Reshuffle the training samples at every epoch
                Sample = np.random.choice(n, n, replace=False)
                bsvm.solve(Xs[Sample], Ys[Sample], timelimit=3600)

            # Record end time and compute the training duration
            time_end = perf_counter()
            train_duration = time_end - time_start

            mu_acc, min_acc, max_acc = 0, 100, 0
            for acc, obj in bsvm.Results:
                mu_acc += acc
                min_acc = min(min_acc, acc)
                max_acc = max(max_acc, acc)
            if len(bsvm.Results) > 0:
                mu_acc = round(mu_acc/len(bsvm.Results), 3)

            # Record end time and compute the overall duration
            time_end = perf_counter()
            all_duration = time_end - time_start

            # bsvm.showWeights()

            print('LOG', a, b, 'END mu accuracy:', mu_acc, round(min_acc, 2), round(max_acc, 2), ' over', len(bsvm.Results), 'time:', round(all_duration, 3), 'train time:', round(train_duration, 3))

            # Evaluate the best single predictor on the test set
            I = np.where((y_test == a) | (y_test == b))
            Xs = x_test[I]
            Ys = y_test[I]
            Ys = np.where(Ys == DigitOne, 1, -1)

            acc = 0
            n = len(Ys)
            for i in range(n):
                if bsvm.Predict(Xs[i]) == Ys[i]:
                    acc += 1
            acc = (acc/n*100)

            print('LOG', a, b, 'Best Test accuracy:', round(acc, 2))

            # Evaluate the averaged (mean) predictor on the test set
            acc = 0
            n = len(Ys)
            F = bsvm.meanPredict()
            for i in range(n):
                if F(Xs[i]) == Ys[i]:
                    acc += 1
            acc = (acc/n*100)

            print('LOG', a, b, 'Mean Test accuracy:', round(acc, 2))
            logging.info('LOG %d %d %d TestAccuracy %f MuAcc %f time %f t_train %f', a, b, batch_size, acc, mu_acc, all_duration, train_duration)
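
For reference, the model assembled in `solve()` above is the L1-regularized soft-margin linear SVM written as a linear program: the weight vector and the bias are split into nonnegative parts (`wp`/`wn` and `wbp`/`wbn`, i.e. $w = w^+ - w^-$ and $b = b^+ - b^-$) so that the 1-norm regularization stays linear, and each slack $z_k$ measures the margin violation of sample $k$. A sketch of the LP the script builds, with the objective coefficients used in the code (0.1 on the regularization terms, 1 on the slacks):

$$
\min_{w^\pm,\, b^\pm,\, z \ge 0} \; 0.1 \sum_{i=1}^{m} \big(w_i^+ + w_i^-\big) + 0.1\,\big(b^+ + b^-\big) + \sum_{k=1}^{n} z_k
\quad \text{s.t.} \quad
y_k \Big( \sum_{i=1}^{m} x_{ki}\,\big(w_i^+ - w_i^-\big) + b^+ - b^- \Big) \ge 1 - z_k, \qquad k = 1, \dots, n.
$$

The predictor returned by `solve()` is then $\hat y(x) = \mathrm{sign}\big(w^\top x + b\big)$, and `meanPredict()` applies the same rule with the weights averaged over all solutions collected in `self.Sol`.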
