scipy L-BFGS-B optimizer with different step size per dimension
How can I adjust the optimizer to use a different step size for each DOF? When I print the parameters, the step size seems to be the same for every dimension.
Is there any alternative optimizer that can facilitate this?
from scipy.optimize import minimize

result = minimize(
    error_func,
    x0,
    method='L-BFGS-B',
    options={
        'disp': verbose,
        # 'maxiter': 1000,
        # 'ftol': 1.0E-8,
    },
)
optimization gradient-descent scipy
migrated from stats.stackexchange.com 2 days ago
This question came from our site for people interested in statistics, machine learning, data analysis, data mining, and data visualization.
took me a little while to figure out what a "spicy" optimizer was (now edited to "scipy" ...)
– Ben Bolker
Nov 9 at 3:14
hail to autocomplete!
– El Dude
2 days ago
The step-size is determined by line-search. If you are not familiar with it, don't expect it to be possible (even in theory). Intro. (also gradient-descent != L-BFGS-B)
– sascha
2 days ago
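Since L-BFGS-B's line-search picks a single scalar step along the search direction, there is no per-dimension step-size option to set. A common workaround, not mentioned in this thread but sketched here as an illustration, is to rescale the variables so that each DOF has a similar sensitivity; `error_func`, `scale`, and the toy objective below are all hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

# Toy objective with very different curvature per dimension.
def error_func(x):
    return (x[0] - 3.0) ** 2 + 100.0 * (x[1] + 2.0) ** 2

# Per-DOF scale factors: the optimizer works in z, where x = scale * z.
# A larger scale entry makes a unit move in z a larger move in that DOF.
scale = np.array([1.0, 0.1])

def scaled_error(z):
    return error_func(scale * z)

x0 = np.array([0.0, 0.0])
res = minimize(scaled_error, x0 / scale, method='L-BFGS-B')
x_opt = scale * res.x  # map back to the original variables
print(x_opt)  # close to [3.0, -2.0]
```

This is just diagonal preconditioning: the single line-search step in `z` translates into different effective step sizes in the original `x` coordinates.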
asked Nov 9 at 2:42
El Dude