
Add Julia equivalent of "nargout" to docs #4227

Closed
lsorber opened this issue Sep 8, 2013 · 10 comments
Labels
docs This change adds or pertains to documentation

Comments


lsorber commented Sep 8, 2013

What is the best way to translate MATLAB functions to Julia when the output, and the computations that produce it, depend on nargout?

In my case, I would like to know the best workaround for a hypothetical optimization suite written in Julia, in which the user is asked to supply a function that computes an objective function value and, as an optional second output, its derivative. The advantage of a nargout approach is that the user need only write one function and, importantly, that in many cases some of the computation can be shared between the objective function and its derivative.

Contributor

lindahua commented Sep 8, 2013

I don't think Julia is going to take the MATLAB approach, i.e. letting a function decide its behavior based on the number of output arguments. And as far as I can tell, there is no way to determine "the number of outputs" from within a function.
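To illustrate the point above: a Julia function always computes and returns the same values regardless of what the caller does with them, so there is nothing like nargout to branch on. A minimal sketch (the function name and objective are made up for illustration):

```julia
# Hypothetical example: the callee cannot observe how many outputs
# the caller wants; it always returns the full tuple.
function objective_and_grad(x)
    fx = sum(abs2, x)   # objective value: sum of squares
    g  = 2 .* x         # its gradient
    return fx, g        # both are always computed and returned
end

fx, g = objective_and_grad([1.0, 2.0])       # caller destructures both
fx_only = objective_and_grad([1.0, 2.0])[1]  # gradient was still computed
```

Destructuring fewer values on the caller's side does not change what the callee computes, which is exactly why a nargout-style optimization is not expressible this way.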

Author

lsorber commented Sep 8, 2013

@lindahua So it seems. However, I think it would be a good idea to add a section to the docs with the Julian way of handling such situations. For example, in my case I am interested in the best solution to the optimization suite problem I described above.

Member

timholy commented Sep 8, 2013

The standard solution is to pass additional outputs as pre-allocated inputs, and have your function modify the inputs in-place.

FYI: both Optim and NLopt handle this precise issue (optionally asking for the gradient) in just this way. NLopt's approach is slightly better: rather than passing nothing to indicate that you don't want the gradient, it's better from a type-consistency perspective to pass an empty gradient vector. The optimization function can pre-construct an empty vector so that you don't pay the overhead of allocating one on each iteration.
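The in-place convention described above can be sketched as follows. This is a hypothetical illustration of the pattern, not the actual Optim or NLopt API; the function name and objective are made up:

```julia
# The caller pre-allocates gradient storage; an empty vector means
# "objective value only", mirroring the NLopt-style convention.
function objective!(x, grad)
    if !isempty(grad)        # gradient requested?
        grad .= 2 .* x       # fill pre-allocated storage in place
    end
    return sum(abs2, x)      # computation shared with the gradient
end

x  = [1.0, 2.0, 3.0]
g  = similar(x)
fx = objective!(x, g)            # objective value + gradient
fx2 = objective!(x, Float64[])   # objective value only
```

Because the empty vector has the same type as a real gradient, the function stays type-stable, and the optimizer can reuse one pre-constructed empty vector across all iterations.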

Member

timholy commented Sep 8, 2013

From the perspective of the docs, I think we need a Julia <--> Matlab page somewhere. Want to start one?

Author

lsorber commented Sep 8, 2013

@timholy Thanks, that's the answer I was looking for. I would love to get involved more actively with Julia, but time is a continuously depleted resource of mine.

Member

timholy commented Sep 8, 2013

time is a continuously depleted resource of mine

join the crowd :-)

@JeffBezanson
Member

Of course there is the list at http://docs.julialang.org/en/latest/manual/getting-started/#noteworthy-differences-from-matlab but maybe we're at the point where a list like this no longer cuts it.

Member

timholy commented Sep 8, 2013

It's certainly useful, and fairly complete (although this particular issue doesn't appear). Someone recently pointed out http://wiki.scipy.org/NumPy_for_Matlab_Users which has some attractive features.


ViralBShah added commits that referenced this issue Jan 6, 2015:

(cherry picked from commit d2ed565) [av skip]

Docs: fix formatting error from d2ed565 (cherry picked from commit 38f1c36)
Contributor

tkelman commented Jan 6, 2015

backported (squashed along with the formatting fix) in 4baf8e4


6 participants