
Image→image ANN design opinions 

Based on informal tinkering and reading. Happy to discuss, within reason.

1. QKV attention transformers work but are gross – O(n²) in token count, and no built-in translation equivariance – and will not last.

2. Most – not all! – learned convolutions are wasted and should be replaced by fixed bases or frames, or at least grouped convolutions. (See the first sketch after this list.)

3. Diffusion and flow training approaches are immature but way more elegant than direct x→y regression. (See the second sketch after this list.)

4. MMA regularization is good.
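
A minimal sketch of what item 2 could look like in practice, assuming PyTorch and a DCT-II filter bank as the fixed basis – the basis choice, the `FixedBasisConv` name, and every parameter here are my illustration, not anything specified above. The spatial filters are frozen; only a cheap 1×1 channel mix is learned.

```python
import math
import torch
import torch.nn as nn

def dct2_basis(k: int) -> torch.Tensor:
    """All k*k separable 2-D DCT-II filters of size k x k, shape (k*k, 1, k, k)."""
    n = torch.arange(k, dtype=torch.float32)
    # 1-D orthonormal DCT-II basis; rows are frequencies, columns are positions
    b = torch.cos(math.pi * (n[None, :] + 0.5) * n[:, None] / k)
    b[0] /= math.sqrt(2.0)
    b *= math.sqrt(2.0 / k)
    # outer products over all row/column frequency pairs
    return torch.einsum('ia,jb->ijab', b, b).reshape(k * k, 1, k, k)

class FixedBasisConv(nn.Module):
    """Depthwise conv with frozen DCT filters; only the 1x1 mixing is learned."""
    def __init__(self, channels: int, out_channels: int, k: int = 3):
        super().__init__()
        self.depthwise = nn.Conv2d(channels, channels * k * k, k,
                                   padding=k // 2, groups=channels, bias=False)
        # copy the fixed basis into every channel group and freeze it
        self.depthwise.weight.data.copy_(dct2_basis(k).repeat(channels, 1, 1, 1))
        self.depthwise.weight.requires_grad_(False)
        self.mix = nn.Conv2d(channels * k * k, out_channels, 1)  # the learned part

    def forward(self, x):
        return self.mix(self.depthwise(x))
```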
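
And a bare-bones sketch of the kind of objective item 3 prefers over direct x→y regression – a conditional flow-matching loss with a straight-line path from noise to target. The `model(x_t, t, x)` signature and the 4-D (B, C, H, W) tensors are my assumptions:

```python
import torch

def flow_matching_loss(model, x, y):
    """Conditional flow matching with a straight path from noise z to target y.

    model(x_t, t, x) predicts a velocity field; at sampling time you would
    integrate dx/dt = model(x_t, t, x) from t=0 (noise) to t=1 (image).
    """
    z = torch.randn_like(y)                      # source: pure Gaussian noise
    t = torch.rand(y.shape[0], device=y.device)  # one time per batch element
    t_ = t.view(-1, 1, 1, 1)                     # broadcast over (B, C, H, W)
    x_t = (1 - t_) * z + t_ * y                  # point on the straight path
    v_target = y - z                             # the path's constant velocity
    v_pred = model(x_t, t, x)                    # velocity, conditioned on x
    return (v_pred - v_target).pow(2).mean()
```

The direct alternative would be `((model(x) - y) ** 2).mean()`, which commits the network to a single deterministic answer per input; the flow objective instead learns a velocity field you can integrate from noise to a sample, conditioned on x.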

@vruba@everything.happens.here because I am trying to imagine what mixed-martial arts regularization would look like :p

@secretasianman Their podcasters are terrible.
