pyOpt Quick Reference Guide
===========================
Copyright (c) 2008-2014, pyOpt Developers
This is a quick guide to begin solving optimization problems with pyOpt.
Optimization Problem Definition
-------------------------------
pyOpt is designed to solve general constrained nonlinear optimization problems of the form:
    min  f(x)
     x

    s.t. g_j(x)  = 0,  j = 1, ..., m_e
         g_j(x) <= 0,  j = m_e + 1, ..., m

         x_i_L <= x_i <= x_i_U,  i = 1, ..., n
where:
    * x is the vector of design variables
    * f(x) is a nonlinear objective function
    * g(x) is a vector of linear or nonlinear constraint functions
    * n is the number of design variables
    * m_e is the number of equality constraints
    * m is the total number of constraints (the number of inequality
      constraints is m_i = m - m_e)
Optimization Class
------------------
Instantiating an Optimization Problem:
>>> opt_prob = Optimization('name',obj_fun,var_set={},obj_set={},con_set={})
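
For example, with an objective function obj_fun following the template
below (the problem name is arbitrary):

>>> opt_prob = Optimization('Constrained Problem', obj_fun)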
Notes on Objective Functions:
General Objective Function Template
::
    def obj_fun(x, *args, **kwargs):
        fail = 0
        f = function(x, *args, **kwargs)
        g = function(x, *args, **kwargs)
        return f, g, fail
where:
    f    - objective value
    g    - array (or list) of constraint values
    fail - 0 for a successful function evaluation
           1 for an unsuccessful function evaluation (the test must be
           provided by the user)
If the optimization problem is unconstrained, g must be returned as an
empty list or array: g = []
Inequality constraints are handled as <=.
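
For example, a minimal objective function for a two-variable problem with
one inequality constraint (the functions themselves are illustrative):

::

    def obj_fun(x):
        f = (x[0] - 1.0)**2 + (x[1] - 2.0)**2   # objective value
        g = [x[0] + x[1] - 2.0]                  # g[0] <= 0 enforces x[0] + x[1] <= 2
        fail = 0
        return f, g, fail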
Assigning Objective:
>>> opt_prob.addObj('name', value=0.0, optimum=0.0)
Assigning Design Variables:
Single Design variable:
>>> opt_prob.addVar('name', type='c', value=0.0, lower=-inf, upper=inf, choices=listochoices)
A Group of Design Variables:
>>> opt_prob.addVarGroup('name', numberinGroup, type='c', value=value, lower=lb, upper=ub, choices=listochoices)
where:
    value, lb, ub - (float or int or list or 1D array).
Supported Types:
    'c' - continuous design variable.
    'i' - integer design variable.
    'd' - discrete design variable (based on choices, e.g.: list/dict of materials).
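
For example (bounds and initial values are illustrative):

>>> opt_prob.addVar('x1', type='c', value=10.0, lower=0.0, upper=42.0)
>>> opt_prob.addVarGroup('x', 3, type='c', value=10.0, lower=0.0, upper=42.0)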
Assigning Constraints:
Single Constraint:
>>> opt_prob.addCon('name', type='i', lower=-inf, upper=inf, equal=0.0)
A Group of Constraints:
>>> opt_prob.addConGroup('name', numberinGroup, type='i', lower=lb, upper=up, equal=eq)
where:
    lb, ub, eq - (float or int or list or 1D array).
Supported Types:
    'i' - inequality constraint.
    'e' - equality constraint.
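
For example, one inequality constraint and a group of two (names are
illustrative):

>>> opt_prob.addCon('g1', type='i')
>>> opt_prob.addConGroup('g', 2, type='i')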
Optimizer Class
---------------
Instantiating an Optimizer (e.g.: SNOPT):
>>> opt = pySNOPT.SNOPT()
Setting Optimizer Options:
either during instantiation:
>>> opt = pySNOPT.SNOPT(options={'name':value,...})
or one by one:
>>> opt.setOption('name',value)
Getting Optimizer Options/Attributes:
>>> opt.getOption('name')
>>> opt.ListAttributes()
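
For example, using the bundled SLSQP optimizer (the option names here
follow pyOpt's SLSQP wrapper; check the optimizer's documentation for
the full list):

>>> from pyOpt import SLSQP
>>> opt = SLSQP(options={'IPRINT': -1})
>>> opt.setOption('MAXIT', 100)
>>> opt.getOption('ACC')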
Optimizing
----------
Solving the Optimization Problem:
>>> opt(opt_prob, sens_type='FD', disp_opts=False, sens_mode='', *args, **kwargs)
where:
    disp_opts - flag for displaying the options in the solution output.
    sens_type - sensitivity type:
                'FD' = finite differences.
                'CS' = complex step.
                grad_function = user-provided function with
                format grad_function(x, f, g), returning g_obj, g_con, fail.
    sens_mode - parallel sensitivity flag ('' = serial, 'pgc' = parallel).
Additional arguments and keyword arguments (e.g.: parameters) can be passed to the objective function.
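
Putting it together, a complete sketch using the bundled SLSQP optimizer
(the problem itself is illustrative):

::

    from pyOpt import Optimization
    from pyOpt import SLSQP

    def obj_fun(x):
        f = x[0]**2 + x[1]**2        # objective value
        g = [1.0 - x[0] - x[1]]      # g[0] <= 0 enforces x[0] + x[1] >= 1
        fail = 0
        return f, g, fail

    opt_prob = Optimization('Example Problem', obj_fun)
    opt_prob.addVarGroup('x', 2, type='c', value=0.5, lower=-10.0, upper=10.0)
    opt_prob.addObj('f')
    opt_prob.addCon('g1', type='i')

    opt = SLSQP()
    opt(opt_prob, sens_type='FD')
    print(opt_prob.solution(0))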
Output:
* Prompt output of the optimization problem with initial values:
>>> print(opt_prob)
* Prompt output of a specific solution of the optimization problem:
>>> print(opt_prob._solutions[key])
key - index in order of optimizer call.
* File output of the Optimization problem:
>>> opt_prob.write2file(outfile='', disp_sols=False, solutions=[])
where:
outfile - filename or file instance (defaults to the problem name with a .txt extension).
disp_sols - True will display all the stored solutions.
solutions - list of indices of stored solutions to display.
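
For example (the filename is illustrative):

>>> opt_prob.write2file(outfile='example.txt', disp_sols=True)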
Output as Input:
The solution can be used directly as an optimization problem for
refinement by the same or a new optimizer:
>>> optimizer(opt_prob._solutions[key])
key - index in order of optimizer call.
The new solution will be stored as a sub-solution of the previous solution:
e.g.: print(opt_prob._solutions[key]._solutions[nkey])
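
For example, refining the first solution with the same optimizer:

>>> opt(opt_prob._solutions[0])
>>> print(opt_prob._solutions[0]._solutions[0])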
History and Hot Start:
The history flag stores all function evaluations from an optimizer in
binary format in a .bin and a .cue file:
>>> optimizer(opt_prob, store_hst=True)
    True - uses the default problem name for the file names.
    str  - a user-provided filename.
The binary history file can be used to hot start the optimizer if the
optimization was interrupted. The flag needs the filename of the
history (True will use the default name):
>>> optimizer(opt_prob, store_hst=True, hot_start=True)
If the store-history flag is set to the same name as the hot-start flag,
a temporary file will be created during the run and the original file
will be overwritten at the end.
For hot start to work properly, all options must be the same as when
the history was created.
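
For example, storing the first run's history and later restarting from
it into a new history file (the filenames are illustrative):

>>> optimizer(opt_prob, store_hst='run1')
>>> optimizer(opt_prob, store_hst='run2', hot_start='run1')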