ML BS
This was a clever idea! The signals should be approximately standard normal, and you can mix two Gaussians with the same σ by taking a·√t + b·√(1−t); since t + (1−t) = 1, the output keeps the same σ for any t in [0, 1] (assuming a and b are independent). This gives you a nice compromise between concatenation (information-preserving but wasteful) and addition (compact but lossy). Great!
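A minimal sketch of the mix, assuming PyTorch tensors and independent inputs; `vp_merge` is my own name for it, not from the original write-up:

```python
import math
import torch

def vp_merge(a: torch.Tensor, b: torch.Tensor, t: float) -> torch.Tensor:
    # Variance-preserving mix: if a and b are independent with variance
    # sigma^2, then t*sigma^2 + (1 - t)*sigma^2 = sigma^2 for any t in [0, 1].
    return math.sqrt(t) * a + math.sqrt(1.0 - t) * b

a = torch.randn(8, 64, 32, 32)
b = torch.randn(8, 64, 32, 32)
out = vp_merge(a, b, 0.5)  # out.std() stays close to 1
```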
It didn’t work at all!
ML BS
I'm actually giving it one more chance as the merge block for upsamples and residuals in a U-Net, but when it was the final head-to-tail residual merge, early training of the network was almost flat, so my hopes are not super high.
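A sketch of what such a U-Net merge block might look like; the learnable t and the sigmoid parameterization are my assumptions, not necessarily the setup described above:

```python
import torch
import torch.nn as nn

class VPMerge(nn.Module):
    """Variance-preserving merge for a U-Net skip/upsample junction.
    t is learnable; the sigmoid keeps it in (0, 1) so both sqrt terms are real."""

    def __init__(self, init_t: float = 0.5):
        super().__init__()
        # store the logit of t so the raw parameter is unconstrained
        self.logit_t = nn.Parameter(torch.logit(torch.tensor(init_t)))

    def forward(self, skip: torch.Tensor, up: torch.Tensor) -> torch.Tensor:
        t = torch.sigmoid(self.logit_t)
        return torch.sqrt(t) * skip + torch.sqrt(1.0 - t) * up
```

Unlike concatenation, this requires the skip and upsample paths to have matching shapes (same channel count), but unlike plain addition the magnitude of the merged signal stays controlled.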