multi_head_attention_forward produces RuntimeError: 'mask' in call to _th_masked_fill_bool_

I found a similar question on GitHub, but I still have this problem after upgrading PyTorch to 1.5. I don't know if it's related to the Python version. Do you have any suggestions? Thank you very much!

Information

File "/home/ynos/anaconda3/envs/pytorch/lib/python3.6/site-packages/torch/nn/functional.py", line 3937, in multi_head_attention_forward
    float('-inf'),
RuntimeError: Expected object of scalar type Bool but got scalar type Long for argument #2 'mask' in call to _th_masked_fill_bool_

Environment info:

Python version: 3.6.2  
PyTorch version (GPU): 1.5
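Since PyTorch 1.2, `masked_fill_` (which `multi_head_attention_forward` calls internally) expects the mask to have dtype `torch.bool`; passing an integer (Long/Byte) mask raises exactly this error. A minimal sketch of the usual fix, converting the mask with `.bool()` before calling `nn.MultiheadAttention` (the tensor shapes and values below are illustrative assumptions, not taken from the original post):

```python
import torch
import torch.nn as nn

mha = nn.MultiheadAttention(embed_dim=8, num_heads=2)

seq_len, batch, embed_dim = 4, 3, 8
q = k = v = torch.randn(seq_len, batch, embed_dim)

# A padding mask built from Python ints has dtype torch.int64 (Long),
# which triggers the RuntimeError in PyTorch >= 1.2 ...
key_padding_mask = torch.tensor([[0, 0, 1, 1],
                                 [0, 0, 0, 1],
                                 [0, 0, 0, 0]])  # shape (batch, seq_len)

# ... so convert it to bool first (True = position is ignored):
out, attn_weights = mha(q, k, v, key_padding_mask=key_padding_mask.bool())
print(out.shape)  # torch.Size([4, 3, 8])
```

The same conversion applies to `attn_mask` if you pass one; any mask reaching `masked_fill_` must be boolean.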


Read more here: https://stackoverflow.com/questions/67018277/multi-head-attention-forward-produces-mask-in-call-to-th-masked-fill-bool

Content Attribution

This content was originally published by wangcui at Recent Questions - Stack Overflow, and is syndicated here via their RSS feed. You can read the original post over there.
