The long-forgotten technique? Normalizing flows. It feels like it was just last year that they were all the rage. It’s insane how fast the field is moving.
It looks like Laurent Dinh (dude who originally came up with normalizing flows) is one of the authors of this work.
There doesn’t seem to be much going on here. What’s the actual advantage of this technique? It seems they’re doing things more or less the same as everyone else.
I mean, the article talks about decoders as if they were some brand new innovation.
Generation quality seems to be about the same as any other model. The advantage of normalizing flows is that the transformations are invertible. That lets you not just generate samples, but also compute the exact likelihood of any given sample.
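To make that concrete, here's a minimal sketch of the idea (not the paper's actual model): a single invertible affine map applied to a standard-normal base variable. The parameter names `A`, `B` and the helper functions are purely illustrative. Because the map is invertible, the same object supports both sampling and exact log-likelihood via the change-of-variables formula.

```python
import math

# Toy normalizing flow: x = A*z + B with z ~ N(0, 1).
# A single affine layer, just to illustrate invertibility + exact likelihood.
A, B = 2.0, 1.0  # assumed fixed scale and shift for this sketch

def forward(z):
    """Push a base sample into data space: x = A*z + B."""
    return A * z + B

def inverse(x):
    """Exact inverse of the flow: z = (x - B) / A."""
    return (x - B) / A

def base_log_prob(z):
    """Log-density of the standard normal base distribution."""
    return -0.5 * (z * z + math.log(2 * math.pi))

def log_prob(x):
    """Change of variables: log p(x) = log p(z) - log|dx/dz|."""
    z = inverse(x)
    return base_log_prob(z) - math.log(abs(A))

# Generation: sample z from the base and push it forward.
x = forward(0.5)
# Invertibility: we can map any x back to its exact z...
assert abs(inverse(x) - 0.5) < 1e-12
# ...which is what makes log_prob(x) computable exactly, something
# GAN-style generators can't do.
```

A GAN or a plain decoder only gives you the forward direction; the invertible structure is what buys the density evaluation, at the cost of constraining every layer to be bijective with a tractable Jacobian.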