Here's an animation generated by simulating the 2D wave equation, kicked off with a Dirac delta pulse and using periodic boundary conditions.
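(Not the code behind the animation, just a minimal sketch of such a simulation in NumPy, assuming a standard leapfrog finite-difference scheme; the grid size, wave speed, and step count are made up:)

```python
import numpy as np

def laplacian(u):
    # Five-point stencil; np.roll wraps around, giving periodic
    # boundary conditions for free.
    return (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
            np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)

def simulate(n=64, steps=200, c=0.5):
    # Leapfrog in time: keep two consecutive slices of the field.
    u_prev = np.zeros((n, n))
    u_curr = np.zeros((n, n))
    u_curr[n // 2, n // 2] = 1.0  # Dirac delta pulse at the centre
    for _ in range(steps):
        # u_next = 2u - u_prev + (c*dt/dx)^2 * lap(u), with dt = dx = 1
        u_next = 2 * u_curr - u_prev + c**2 * laplacian(u_curr)
        u_prev, u_curr = u_curr, u_next
    return u_curr
```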

But let's say we want the field value to be as high as possible at a particular point in space at a chosen future moment in time...

Let's use gradient descent to optimise the "refractive index" at each point in a square in the middle. After a few iterations we get the following (grey) refractive index, leading to the simulation below. Note how it all conspires to arrive at the right point...

You could solve this problem using automatic differentiation. It's a lot like training a convolutional neural net, because a single time step here is essentially a (spatially varying) convolution and you're optimising the parameters that determine a long sequence of such convolutions...
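(A toy sketch of this setup, not the code that produced these figures: the objective is the field value at a chosen target point after a fixed number of steps, and a finite-difference gradient acts as a slow stand-in for reverse-mode AD. All sizes and step counts here are made up.)

```python
import numpy as np

def lap(u):
    # Periodic five-point Laplacian.
    return (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
            np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)

def field_at_target(c2, steps=16, target=(3, 3)):
    # Objective: field value at `target` after `steps` leapfrog steps,
    # as a function of the spatially varying squared wave speed c2
    # (which plays the role of the refractive index).
    n = c2.shape[0]
    u_prev = np.zeros((n, n))
    u_curr = np.zeros((n, n))
    u_curr[n // 2, n // 2] = 1.0  # source pulse
    for _ in range(steps):
        u_prev, u_curr = u_curr, 2 * u_curr - u_prev + c2 * lap(u_curr)
    return u_curr[target]

def fd_grad(c2, eps=1e-5):
    # Finite-difference gradient: one simulation pair per parameter.
    # Reverse-mode AD would get the same gradient in a single backward
    # pass, which is what makes this look like training a conv net.
    g = np.zeros_like(c2)
    for idx in np.ndindex(c2.shape):
        d = np.zeros_like(c2)
        d[idx] = eps
        g[idx] = (field_at_target(c2 + d) - field_at_target(c2 - d)) / (2 * eps)
    return g

c2 = np.full((12, 12), 0.25)
start = field_at_target(c2)
for _ in range(3):
    step = fd_grad(c2)
    step /= np.linalg.norm(step) + 1e-12
    trial = np.clip(c2 + 1e-3 * step, 0.05, 0.45)  # clip to keep the scheme stable
    if field_at_target(trial) > field_at_target(c2):
        c2 = trial  # accept only improving steps
end = field_at_target(c2)
```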

In this example I think (not 100% sure) the optimization has tried to construct elliptical mirrors with the start and end at the foci to focus the "direct" wave and parabolic mirrors at the ends to focus the waves that wrap around the domain...

But this is much the same problem seismologists solve: finding the material properties based on what signals arrive where from a source. And seismologists were doing this long before convolutional neural nets were invented. So I thought I'd try something closer to what they do. It's also more insightful than just using AD, which is something of a black box.

Anyway, here's the result of solving the same problem but getting the focus to happen much later. There's a sort of fractally refractive index now...

@dpiponi So cool – the kind of thing I would ask my (math major) mom about as a kid and she’d say “Sure, in theory, maybe one day when computers are better.”

Two brainstorms just in case they’re interesting:

1. I’ve been reading about shearlets, which are near-optimal at representing the wavefront set, and I can feel things in the back of my brain churning here. (E.g., the “fractal” looks a bit like a noiselet.)

2. Would the results be subjectively different if trained with a little noise?

@vruba I'm actually very interested in the noise aspect but I haven't looked closely at it yet. This came up recently when a friend asked me about simulating audio reverb in a room. I think that simulations like this might give incorrect results because they're too perfect; real rooms, with variations everywhere, will give qualitatively different results. In particular, I think an overly perfect simulation will have clear paths of definite length, causing constructive or destructive interference at particular wavelengths, and I think this would be audible as some pitches getting boosted or damped. I'm not sure of the effect on this particular optimization problem, though.
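(A back-of-envelope version of that interference argument, with made-up path lengths: two fixed-length paths from source to listener sum to a comb filter, nulling frequencies where the path difference is an odd number of half wavelengths.)

```python
import numpy as np

c = 343.0            # speed of sound in air, m/s
L1, L2 = 5.0, 6.715  # two hypothetical path lengths, metres
dt = (L2 - L1) / c   # extra delay along the longer path

def gain(f):
    # Magnitude response of two equal-strength paths summed at the
    # listener: |1 + exp(-2*pi*i*f*dt)|, a comb filter in frequency.
    return np.abs(1 + np.exp(-2j * np.pi * f * dt))

# Destructive when f*dt is a half-integer, constructive when an integer:
f_null = c / (2 * (L2 - L1))  # 100 Hz: paths half a wavelength apart
f_peak = c / (L2 - L1)        # 200 Hz: paths a full wavelength apart
```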

@dpiponi That makes sense.

I suppose what I’m wondering could be taken from a different angle as: is that fractal “about” waves or is it “about” float32s (or other implementation details)? Presumably at some point in the complexity and time of the system, it changes from the former to the latter, but is that after, say, 100 timesteps? Or 100 billion?

When I said noise, I imagined AWGN (additive white Gaussian noise) in the air, but I suppose another option would be jittering the room dimensions.

@vruba It's definitely the case here that you're seeing effects of the grid being discretized. However, in my code I use a well-behaved reversible time step: I can run the entire simulation forward and then backward, returning close to my starting point. So I don't think we're seeing major issues from the use of floating-point types, which typically show up as irreversible dissipation.
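(A quick check of that reversibility claim in a toy setting, not the original code: the leapfrog update is time-symmetric, so feeding the final two time slices back in reversed order runs the simulation backwards, and the recovered initial state matches to within rounding error.)

```python
import numpy as np

def lap(u):
    # Periodic five-point Laplacian.
    return (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
            np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)

def run(u_prev, u_curr, c2, steps):
    # The same update that advances (u_prev, u_curr) also retreats
    # the simulation if you swap the two slices: it is time-symmetric.
    for _ in range(steps):
        u_prev, u_curr = u_curr, 2 * u_curr - u_prev + c2 * lap(u_curr)
    return u_prev, u_curr

n = 32
rng = np.random.default_rng(0)
c2 = 0.2 + 0.1 * rng.random((n, n))  # spatially varying wave speed
u0 = np.zeros((n, n))
u1 = np.zeros((n, n))
u1[n // 2, n // 2] = 1.0             # delta pulse

a, b = run(u0, u1, c2, 100)   # forward 100 steps
b2, a2 = run(b, a, c2, 100)   # swap the slices to run backward
```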

@dpiponi Ooh, see, that’s interesting to me!
