multi_head_attention_forward raises: RuntimeError for argument #2 'mask' in call to _th_masked_fill_bool_

I found a similar question on GitHub, but I still have this problem after upgrading PyTorch to 1.5. I don't know if it's related to the Python version. Do you have any suggestions? Thank you very much!


File "/home/ynos/anaconda3/envs/pytorch/lib/python3.6/site-packages/torch/nn/", line 3937, in multi_head_attention_forward
    float('-inf'),
RuntimeError: Expected object of scalar type Bool but got scalar type Long for argument #2 'mask' in call to _th_masked_fill_bool_
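The error message says that masked_fill_ received a Long (integer) mask where PyTorch 1.5 expects a Bool one, which typically happens when the attn_mask passed to multi-head attention is built from 0/1 integers. A minimal sketch of one way to avoid it, assuming the mask is a 0/1 integer tensor that can simply be cast with .bool() (names and shapes here are illustrative, not from the question):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

embed_dim, num_heads, seq_len, batch = 8, 2, 4, 1
mha = nn.MultiheadAttention(embed_dim, num_heads)

# Inputs are (seq_len, batch, embed_dim) for nn.MultiheadAttention.
x = torch.randn(seq_len, batch, embed_dim)

# A 0/1 integer (dtype torch.long) causal mask; on PyTorch 1.5 passing this
# directly as attn_mask triggers the Bool-vs-Long RuntimeError above.
int_mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.long), diagonal=1)

# Casting to bool (True = position may NOT be attended to) avoids the error.
bool_mask = int_mask.bool()
out, weights = mha(x, x, x, attn_mask=bool_mask)
print(out.shape)
```

If the mask is instead an additive float mask (0.0 for keep, -inf for masked), it can be left as a float tensor; the error is specific to integer masks where a boolean one is expected.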

Environment info:

Python version: 3.6.2  
PyTorch version (GPU): 1.5

Content Attribution

This content was originally published by wangcui at Recent Questions - Stack Overflow, and is syndicated here via their RSS feed. You can read the original post over there.
