Waveguide crossing inverse design example #2800
Conversation
Codecov Report

Attention: Patch coverage is …

```diff
@@            Coverage Diff             @@
##           master    #2800      +/-   ##
==========================================
- Coverage   74.06%   73.75%   -0.32%
==========================================
  Files          18       18
  Lines        5395     5422      +27
==========================================
+ Hits         3996     3999       +3
- Misses       1399     1423      +24
```
Is it really necessary to have the two functions `run_shape_optimization` and `run_topology_optimization` be separate? They share a lot of code and can probably be combined into a single function.
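For what it's worth, a sketch of one possible consolidation: thin public wrappers around a shared core, where only the design parameterization differs. `shape_mapping`, `topology_mapping`, and the body of `_run_optimization` are hypothetical placeholders, not the PR's actual helpers:

```python
def _run_optimization(resolution, beta, maxeval, mapping):
    """Shared core: build the simulation, objective, and optimizer once;
    only the design parameterization (mapping) differs between modes."""
    ...  # common setup and optimization loop, calling mapping(p) as needed

def run_shape_optimization(resolution, beta, maxeval):
    return _run_optimization(resolution, beta, maxeval, mapping=shape_mapping)

def run_topology_optimization(resolution, beta, maxeval):
    return _run_optimization(resolution, beta, maxeval, mapping=topology_mapping)
```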
```python
x
+ npa.rot90(x)
+ npa.rot90(npa.rot90(x))
+ npa.rot90(npa.rot90(npa.rot90(x)))
```
Would be cleaner to replace `npa.rot90(npa.rot90(x))` with `npa.rot90(x, 2)`, and similar.
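For concreteness, a sketch of the suggested cleanup, assuming `x` is a 2-D design array and `npa` is `autograd.numpy` as in the diff above:

```python
import autograd.numpy as npa

x = npa.ones((10, 10))  # stand-in for the design weights

# Sum x with its 90-, 180-, and 270-degree rotations to impose C4 symmetry,
# using rot90's second (count) argument instead of nested calls.
x_sym = x + npa.rot90(x, 1) + npa.rot90(x, 2) + npa.rot90(x, 3)
```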
```python
else:
    x = mpa.tanh_projection(x, beta=beta, eta=eta)

x = npa.clip(x, 0, 1)
```
Is this really necessary, or is it just meant to ensure the weights are bounded? Since the return values of `smoothed_projection` and `tanh_projection` should already have this property, it might be better to place this statement inside those functions rather than make it a requirement for the user to include in their own functions.
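A sketch of that refactor, moving the safeguard into the projection step so user code doesn't need the extra line (`projected_weights` is a hypothetical wrapper name; `mpa.tanh_projection` is the existing `meep.adjoint` function):

```python
import autograd.numpy as npa
import meep.adjoint as mpa

def projected_weights(x, beta, eta):
    """Project the design weights and guarantee the result stays in [0, 1]."""
    x = mpa.tanh_projection(x, beta=beta, eta=eta)
    return npa.clip(x, 0, 1)
```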
```python
#
# Importantly, this particular example highlights some of the ways one can use
# the novel smoothed projection function to perform both shape and topology
# optimization.
```
nit: can we use the docstring style for modules, which uses triple quotes (`"""`) and starts with a one-line summary?
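For example, the comment block above could become a module docstring along these lines (the one-line summary is illustrative):

```python
"""Inverse design of a waveguide crossing.

Importantly, this particular example highlights some of the ways one can use
the novel smoothed projection function to perform both shape and topology
optimization.
"""
```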
```python
if __name__ == "__main__":
    run_shape_optimization(resolution=25.0, beta=np.inf, maxeval=30)
```
How are the return values handled?
Not used here, so N/A.
As discussed, if it is a greyscale issue, you can try using the …
I wonder if it's getting stuck from the subpixel-smoothed structure being only piecewise differentiable? In principle, we could fix that by convolving with something like a Gaussian rather than a sphere.
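A minimal sketch of that idea, assuming a 2-D design array with periodic boundaries; the convolution is done in Fourier space so it stays differentiable under autograd, and `sigma` is in pixels:

```python
import numpy as np
import autograd.numpy as npa

def gaussian_smooth(x, sigma):
    """Convolve x with a Gaussian of width sigma (in pixels) via the FFT."""
    fx = np.fft.fftfreq(x.shape[0])[:, None]
    fy = np.fft.fftfreq(x.shape[1])[None, :]
    # The Fourier transform of a Gaussian is a Gaussian, so convolution
    # reduces to a pointwise multiplication in frequency space.
    kernel_hat = np.exp(-2.0 * (np.pi * sigma) ** 2 * (fx**2 + fy**2))
    return npa.real(npa.fft.ifft2(npa.fft.fft2(x) * kernel_hat))
```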
The alternating signs of the gradient would cause the changes to cancel out to first order, so if that's due to a sign error it would be a problem.
@stevengj If I monitor the output of CCSA during the optimization, then during the stalled region it's not necessarily stuck in the same outer iteration. But …
Here we add a Python inverse-design example that maximizes the transmission of a waveguide crossing. In particular, it's meant to showcase the new (first-order accurate) smoothed projection functionality. Using this example, you can (see the usage sketch after this list):

- Perform shape optimization.
- Perform topology optimization.
- Analyze the norm of the gradient as $\beta \to \infty$ for both the smoothed and non-smoothed cases.
- Analyze the convergence properties of a shape-optimization problem at a finite $\beta$ ($\beta = 64$ here) for both the smoothed and non-smoothed cases.
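A hedged sketch of driving the two modes (the shape call mirrors the `__main__` block in the diff above; the topology call's arguments are assumptions, with `beta=64` taken from the convergence study just described):

```python
import numpy as np

# Shape optimization with effectively infinite beta (sharp interfaces).
run_shape_optimization(resolution=25.0, beta=np.inf, maxeval=30)

# Topology optimization at a finite beta.
run_topology_optimization(resolution=25.0, beta=64.0, maxeval=30)
```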