size mismatch for embedding.weight: copying a param with shape torch.Size([4922, 512]) from checkpoint, the shape in current model is torch.Size([2234, 512]).
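This error means the checkpoint was saved from a model whose embedding table had 4922 rows (vocabulary entries), while the model you are loading into only has 2234. A minimal sketch reproducing the mismatch and two common fixes is below; it uses a bare `nn.Embedding` (so the state-dict key is `weight` rather than `embedding.weight`), and the vocabulary sizes are simply the ones from the error message:

```python
import torch
import torch.nn as nn

# Checkpoint side: embedding saved with a 4922-token vocabulary.
ckpt_model = nn.Embedding(4922, 512)
state_dict = ckpt_model.state_dict()

# Current model: built with a 2234-token vocabulary -> shapes disagree.
model = nn.Embedding(2234, 512)
try:
    model.load_state_dict(state_dict)
except RuntimeError as e:
    # load_state_dict raises because weight shapes differ.
    assert "size mismatch" in str(e)

# Fix 1: rebuild the model with the checkpoint's vocabulary size.
vocab_size, embed_dim = state_dict["weight"].shape
model = nn.Embedding(vocab_size, embed_dim)
model.load_state_dict(state_dict)  # shapes now match

# Fix 2: keep the smaller model and copy only the rows that fit
# (only sensible if the first 2234 token ids mean the same thing
# in both vocabularies).
small = nn.Embedding(2234, 512)
with torch.no_grad():
    small.weight.copy_(state_dict["weight"][:2234])
```

Which fix applies depends on why the vocabularies differ: if the current model was simply built from a smaller tokenizer/vocab file than the one used at training time, rebuilding with the checkpoint's size (Fix 1) is usually the right answer.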