
Yes, but they're not max-pooling over the last dimension. They're max-pooling over the sequence length [0] (the other way doesn't really make sense in this context).

The output size is 1024 and the hidden vector size is 512, but they're using bidirectional LSTMs, which concatenate the outputs of the two directions -- so the total is 2 × 512 = 1024 [1].

[0] https://github.com/facebookresearch/LASER/blob/fec5c7d63daa2...

[1] https://pytorch.org/docs/stable/nn.html#lstm
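To make the shapes concrete, here's a minimal PyTorch sketch of the idea (not the actual LASER code; the dimensions are illustrative): a bidirectional LSTM with hidden size 512 emits 1024 features per timestep, and max-pooling over the sequence dimension collapses that to a fixed-size sentence vector.

```python
import torch
import torch.nn as nn

# Illustrative sizes; hidden=512 matches the comment above.
seq_len, batch, input_dim, hidden = 7, 3, 320, 512

# Bidirectional LSTM: per-timestep output is 2 * hidden = 1024.
lstm = nn.LSTM(input_dim, hidden, bidirectional=True)

x = torch.randn(seq_len, batch, input_dim)  # seq-first layout (PyTorch default)
out, _ = lstm(x)                            # shape: (seq_len, batch, 2 * hidden)

# Max-pool over the sequence dimension (dim=0), NOT the feature dimension,
# giving one 1024-dim embedding per sentence in the batch.
emb = out.max(dim=0).values                 # shape: (batch, 1024)
print(emb.shape)
```

Pooling over `dim=0` keeps, for each of the 1024 features, its maximum activation across all timesteps, which is what makes the embedding independent of sentence length.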



Gotcha, that makes sense. (I'm less familiar with dimension ordering in PyTorch.)



