The rotations of some word bounding boxes do not line up properly with the words themselves.
This happens quite often, in roughly 20% of cases. It usually occurs when the word itself is rotated or has been projected onto a surface, giving it unusual proportions. It is not limited to extreme projections, though; it also shows up on only slightly projected text. The mismatch makes it hard for the fully convolutional network to converge well on the sin/cos pose parameters.
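For illustration, here is a minimal sketch of how such misaligned annotations could be detected, assuming each word box is stored as four ordered corner points plus a recorded rotation angle. The helpers `box_angle` and `angle_to_sincos`, the corner ordering, and the 10-degree threshold are all assumptions, not the repo's actual API:

```python
import numpy as np

def box_angle(quad):
    """Estimate the in-plane rotation of a word box from its four
    corner points, assumed ordered (tl, tr, br, bl). The angle of
    the top edge (tl -> tr) is taken as the box rotation."""
    (x0, y0), (x1, y1) = quad[0], quad[1]
    return np.arctan2(y1 - y0, x1 - x0)

def angle_to_sincos(theta):
    """Encode an angle as the (sin, cos) pair the network regresses;
    the unit-norm pair avoids the wrap-around discontinuity of raw angles."""
    return np.sin(theta), np.cos(theta)

# Hypothetical example: a box rotated ~20 degrees whose stored
# rotation in the annotation disagrees with its geometry.
quad = np.array([[0.0, 0.0], [9.4, 3.4], [8.4, 6.2], [-1.0, 2.8]])
stored_theta = 0.0                 # rotation recorded in the annotation
measured_theta = box_angle(quad)   # rotation implied by the corners

# Flag annotations whose stored rotation deviates from the geometry
# by more than some tolerance, so they can be inspected or dropped.
if abs(measured_theta - stored_theta) > np.deg2rad(10):
    print(f"misaligned box: stored {stored_theta:.2f} rad, "
          f"measured {measured_theta:.2f} rad")
```

A filter along these lines could at least quantify how many boxes are affected and keep the worst offenders out of the sin/cos regression targets during training.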