1/16/2025 at 5:56:34 PM
Every time I see stuff like this it makes me think about optical design software. There are applications (Zemax, for example) that are used to design optical systems (lens arrangements for cameras, etc). These applications are eye-wateringly expensive -- similar in pricing to top-class EDA software licenses.
With the abundance of GPUs and modern UIs, I wonder how much work would be involved for someone to make optical design software that blows away the old tools. It would be ray-tracing, but with interesting complications like accounting for polarization, diffraction, scattering, fluorescence, and media effects beyond refraction like birefringence and the Kerr and Pockels effects.
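For a flavor of what the polarization bookkeeping involves, here is a minimal sketch (the function name and structure are invented for illustration, not taken from any real package) of the Fresnel equations, which give different reflectances for s- and p-polarized light at a dielectric interface:

```python
import math

def fresnel_reflectance(n1, n2, theta_i):
    """Fresnel power reflectances (R_s, R_p) at a dielectric interface
    going from index n1 to n2, for angle of incidence theta_i (radians)."""
    sin_t = n1 / n2 * math.sin(theta_i)
    if abs(sin_t) >= 1.0:
        return 1.0, 1.0            # total internal reflection
    theta_t = math.asin(sin_t)
    ci, ct = math.cos(theta_i), math.cos(theta_t)
    r_s = (n1 * ci - n2 * ct) / (n1 * ci + n2 * ct)
    r_p = (n2 * ci - n1 * ct) / (n2 * ci + n1 * ct)
    return r_s ** 2, r_p ** 2
```

At normal incidence on glass (n = 1.5) both polarizations come out to the familiar ~4%, and R_p vanishes at Brewster's angle (arctan 1.5) -- exactly the kind of behavior a graphics-style RGB ray tracer ignores but optical design software must track per surface.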
by crispyambulance
1/16/2025 at 6:27:56 PM
This, very much this! I do research in a subfield of optics called nonimaging optics (optics for energy transfer, e.g. solar concentrators or lighting systems). We typically use these optical design applications, and your observations are absolutely correct. Make some optical design software that uses GPUs for raytracing and reverse-mode autodiff for optimization, sprinkle in some other modern techniques, and you may blow these older tools out of the water.
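To make the "reverse-mode autodiff for optimization" point concrete, here is a toy sketch (everything here, the tiny Var class and the thin-lens loss, is invented for illustration and not from any real tool): backpropagating through a trivial thin-lens spot-size loss and recovering by gradient descent the focal length that focuses parallel rays on the sensor.

```python
class Var:
    """Minimal scalar reverse-mode autodiff node (illustration only)."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents          # pairs of (parent Var, local derivative)
        self.grad = 0.0

    def _wrap(self, other):
        return other if isinstance(other, Var) else Var(other)

    def __add__(self, other):
        other = self._wrap(other)
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __sub__(self, other):
        other = self._wrap(other)
        return Var(self.value - other.value, ((self, 1.0), (other, -1.0)))

    def __mul__(self, other):
        other = self._wrap(other)
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def __truediv__(self, other):
        other = self._wrap(other)
        return Var(self.value / other.value,
                   ((self, 1.0 / other.value),
                    (other, -self.value / other.value ** 2)))

    def backward(self, seed=1.0):
        # Push the adjoint down every path to the leaves.
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)


def spot_loss(f, heights, sensor_z):
    """Squared spot radius at the sensor plane for rays parallel to the
    axis, refracted by an ideal thin lens of focal length f."""
    loss = Var(0.0)
    for h in heights:
        radius = Var(h) * (Var(1.0) - Var(sensor_z) / f)
        loss = loss + radius * radius
    return loss


f = Var(8.0)                          # initial focal-length guess
for _ in range(100):
    f.grad = 0.0
    loss = spot_loss(f, heights=[1.0, 2.0, 3.0], sensor_z=10.0)
    loss.backward()
    f.value -= 3.0 * f.grad           # plain gradient descent
# f.value converges to sensor_z = 10: parallel rays focus at the sensor.
```

A real tool would of course trace refractive surfaces rather than an ideal thin lens, but the pattern -- differentiable trace, scalar merit function, gradient step -- is the same one frameworks like Mitsuba 3 / Dr.Jit industrialize.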
I am hoping to get some projects going in this direction (feel free to reach out if anyone is interested).
PS: I help organize an academic conference in my subfield of optics. We are running a design competition this year [1,2]. It would be super cool if someone submitted a design made by drawing inspiration from modern computer graphics tools (maybe using Mitsuba 3, by one of the authors of this book?), instead of using the classical applications in our field.
[1] https://news.ycombinator.com/item?id=42609892
[2] https://nonimaging-conference.org/competition-2025/upload/
by hakonjdjohnsen
1/17/2025 at 1:02:48 AM
> I am hoping to get some projects going in this direction (feel free to reach out if anyone is interested).
This does sound interesting! I’ve just finished a Masters degree, also in non-imaging optics (in my case oceanographic lidar systems). I have experience in raytracing for optical simulation, though not quite in the same sense as optical design software. How should I contact you to learn more?
by bradrn
1/17/2025 at 6:38:29 AM
Interesting! I added an email address to my profile now.
by hakonjdjohnsen
1/17/2025 at 7:41:36 AM
Great! I’ll send you an email now.
by bradrn
1/17/2025 at 3:57:41 AM
Sounds a bit like https://github.com/mitsuba-renderer/mitsuba2
by accurrent
1/17/2025 at 6:36:37 AM
Yes, exactly. I have not looked at Mitsuba 2, but Mitsuba 3 is absolutely along these lines. It is just starting to be picked up by some of the nonimaging/illumination community, e.g. there was a paper last year from Aurele Adam's group at TU Delft where they used it for optimizing a "magic window" [1]. Some tradeoffs and constraints are a bit different when doing optical design versus doing (inverse) rendering, but it definitely shows what is possible.
by hakonjdjohnsen
1/17/2025 at 9:26:30 AM
Shameless plug: we use Mitsuba 3/Dr.Jit for image optimization around volumetric 3D printing: https://github.com/rgl-epfl/drtvam
by roflmaostc
1/17/2025 at 12:00:24 PM
It looks quite interesting, especially the part about scripting everything in Python with a JIT, instead of traditionally having to do everything in either C or C++. Looking forward to some weekend paper reading.
by pjmlp
1/17/2025 at 9:58:12 AM
Looks really cool! I look forward to reading your paper. Do you know if a recording of the talk is/will be posted somewhere?
by hakonjdjohnsen
1/17/2025 at 4:11:47 PM
We presented this work at SIGGRAPH Asia 2024, but I don't think they recorded it? Maybe at some point we will also do an online workshop about it.
by roflmaostc
1/17/2025 at 9:26:27 AM
I don't know much about optical engineering, but this sounds super exciting. I think I meant to point to Mitsuba 3, not 2.
by accurrent
1/17/2025 at 4:12:56 PM
This is one example of an area where economic incentives make it difficult to shift.
- There aren't that many people willing to pay for such software, but those that do *really* need it, and will pay quite a bit (passing that cost on, of course).
- The technical domain knowledge needed to do it properly is a barrier to many.
- It needs to be pretty robust.
As a result, you end up with a small handful of players who provide it. They have little incentive to modernize, and the opportunity cost for a new player is high enough to chase most of them off to other avenues. I think the main way this changes is when someone has already spent the money in an adjacent area and realizes, "huh, with a little effort here we could probably eat X's lunch."
Beyond that, you at most get toy systems from enthusiasts and grad students (same group?) ...
by ska
1/16/2025 at 8:52:33 PM
You’d be surprised! Everywhere I’ve worked, academic or industry, has typically written its own simulation software. Sometimes it’s entirely handwritten (i.e., end-to-end, preprocessing to simulation to evaluation), sometimes it leverages a pre-existing open source package. I imagine this will become more and more common if for no other reason than that you can’t back-propagate an OpticStudio project, and open source automatic differentiation packages are unbeatable.
by Q6T46nT668w6i3m
1/16/2025 at 10:31:12 PM
If you're interested in the equivalent of "backprop through zemax" there are a few projects going on to jointly optimize optical designs with the image processing, e.g. check out: https://vccimaging.org/Publications/Wang2022DiffOptics/by lcrs
1/16/2025 at 10:28:05 PM
I've been working on something similar, although I'm more interested in replicating the effects of existing lenses than designing new ones: https://x.com/dearlensform/status/1858229457430962318
PBRT 3rd edition actually has a great section on the topic, but it's one of the parts that wasn't implemented for the GPU (by the authors, anyway): https://pbr-book.org/3ed-2018/Camera_Models/Realistic_Camera...
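The heart of that PBRT chapter is just sequential surface intersection plus vector Snell refraction. Here is a stripped-down sketch (single spherical surface; both helper names are invented here, this is not PBRT's API) that reproduces the textbook paraxial focus n2·R/(n2−n1) and shows spherical aberration appearing for marginal rays:

```python
import math

def refract(d, n, eta):
    """Vector Snell's law: refract unit direction d at unit normal n
    (pointing back toward the incoming ray); eta = n_in / n_out.
    Returns None on total internal reflection."""
    cos_i = -(d[0]*n[0] + d[1]*n[1] + d[2]*n[2])
    k = 1.0 - eta*eta*(1.0 - cos_i*cos_i)
    if k < 0.0:
        return None
    return tuple(eta*d[i] + (eta*cos_i - math.sqrt(k))*n[i] for i in range(3))

def axis_crossing(h, R=50.0, n_glass=1.5):
    """Trace a ray parallel to the optical axis at height h (mm) through
    one convex spherical surface (vertex at z=0, center at z=R) into
    glass, and return the z where the refracted ray crosses the axis."""
    z0 = R - math.sqrt(R*R - h*h)         # ray/sphere intersection point
    normal = (0.0, h/R, (z0 - R)/R)       # unit normal at the hit point
    t = refract((0.0, 0.0, 1.0), normal, 1.0/n_glass)
    steps = -h / t[1]                     # march until y reaches 0
    return z0 + steps * t[2]
```

A near-axis ray lands at about n2·R/(n2−n1) = 150 mm behind the vertex, while a marginal ray at h = 10 mm crosses the axis noticeably earlier -- spherical aberration falling straight out of the geometry, no extra modeling needed.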
by lcrs
1/16/2025 at 6:25:42 PM
I once saw a youtube video of a guy who first modeled a pinhole camera in something like Blender3D and then went on to design and simulate an entire SLR camera.
by amelius
1/16/2025 at 6:44:57 PM
https://youtu.be/YE9rEQAGpLw
by Tomte
1/16/2025 at 8:04:00 PM
Thanks, but it was a different video. I remember he had a lot of problems with the pinhole camera because the small size of the pinhole meant that rays had trouble going into the box, so to speak, and thus he needed an insane number of rays.
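That "insane number of rays" has a simple geometric explanation: when tracing forward from the scene, the chance that a random ray happens to pass through the pinhole is just the fraction of solid angle the hole subtends. A rough Monte Carlo sketch (everything here is invented for illustration):

```python
import math, random

def pinhole_hit_fraction(r, d, n_rays=200_000, seed=1):
    """Fraction of rays from a point source, emitted uniformly over the
    hemisphere facing the pinhole plane, that pass through a hole of
    radius r at perpendicular distance d."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_rays):
        z = rng.random()                  # uniform hemisphere: z ~ U[0, 1)
        phi = 2.0 * math.pi * rng.random()
        s = math.sqrt(1.0 - z*z)
        x, y = s*math.cos(phi), s*math.sin(phi)
        # project the ray onto the pinhole plane at distance d
        if z > 1e-12 and (x*d/z)**2 + (y*d/z)**2 <= r*r:
            hits += 1
    return hits / n_rays

# Analytically the fraction is 1 - d/sqrt(d*d + r*r), roughly r^2/(2*d^2)
# for a small hole: a 0.25 mm pinhole at 100 mm catches only about 3e-6
# of the rays, i.e. on the order of a million rays per useful sample.
```

This is presumably why renderers prefer tricks like sampling the aperture directly (next-event estimation) or tracing from the camera side instead of brute-forcing rays at a tiny hole.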
by amelius
1/16/2025 at 7:16:48 PM
I'd imagine there is a fairly wide gap between having a simulation engine core and a useful engineering application.
From the academic side, I've found the work of Steinberg in this area extremely impressive. They are pushing the frontier to include more wave-optical phenomena in rendering. E.g. https://ssteinberg.xyz/2023/03/27/rtplt/
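As a tiny example of what "wave-optical phenomena" means in practice: Fraunhofer single-slit diffraction, a textbook effect that a purely geometric ray tracer cannot produce at all (the function below is invented here for illustration):

```python
import math

def slit_intensity(theta, slit_width, wavelength):
    """Normalized Fraunhofer single-slit diffraction intensity:
    I(theta)/I(0) = sinc^2(beta) with beta = pi * a * sin(theta) / lambda."""
    beta = math.pi * slit_width * math.sin(theta) / wavelength
    if abs(beta) < 1e-12:
        return 1.0                       # central maximum
    return (math.sin(beta) / beta) ** 2

# First dark fringe where sin(theta) = lambda / slit_width; a geometric
# tracer would instead predict uniform brightness across the slit's shadow.
```

Capturing this inside a path tracer (rather than as a post-hoc formula like the one above) is essentially what the Steinberg line of work is about.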
by zokier
1/17/2025 at 11:20:52 AM
> eye-wateringly expensive
For you. Anyone doing design and manufacturing of optics will not blink at paying for software.
by liontwist
1/16/2025 at 8:45:26 PM
Well, everyone who can build this niche software is already employed to build it.
by fooker
1/16/2025 at 8:56:49 PM
I think you’re overthinking this, e.g., Zemax’s optimization isn’t that different from the ray-tracing presented in this book. The sophistication truly comes from the users.
by Q6T46nT668w6i3m
1/16/2025 at 9:31:51 PM
Yeah, perhaps. But the heavy-hitters in this field all seem to have very old-timey UIs and out-of-this-world pricing.
Meanwhile, raytracing for computer graphics on GPUs is soooo performant -- it makes me wonder how much work needs to be done to make the equivalent of KiCad for optical design.
by crispyambulance
1/17/2025 at 6:57:42 PM
You're missing the point. The difficulty is not in the ray tracing, etc. It is in understanding the domain of the software and what needs to be done to make it useful. I completely agree that whatever simulation they have could easily be done better with modern GPUs.
by fooker
1/17/2025 at 1:24:58 AM
> I wonder how much work would be involved for someone to make optical design software that blows away the old tools
Depending on the use case, it already exists for gem cutters. We can simulate everything from RI to fluorescence excitation.
by lightedman
1/16/2025 at 11:07:04 PM
I predict PBR is going to fall to neural rendering. Diffusion models have been shown to learn all of the rules of optics and shaders, and they're instructable and generalizable. It's god mode for physics and is intuitive for laypersons to manipulate. We're only at the leading edge of this, too.
by echelon
1/16/2025 at 11:24:12 PM
Can you link the neural rendering animation you're talking about, with some info on the total rendering times without any precomputation?
by CyberDildonics