ANCE encoders #11
Comments
Kudos! You asked the exact question I have. In the paper, it keeps using "BERT-Siamese". To my understanding, Siamese here means a shared encoder between query and document. In fact, if two encoders are used, the Dense Retriever doubles the parameter size compared to models like BERT Reranker or ColBERT.
hhhhhh! Bingo!
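As a rough illustration of the parameter doubling mentioned above, here is a minimal sketch. It assumes a plain bert-base-uncased model loaded through HuggingFace transformers rather than the repo's own HFBertEncoder:

from transformers import BertModel

def count_params(model):
    # Total number of weights in a module.
    return sum(p.numel() for p in model.parameters())

# Shared (Siamese) encoder: one BERT serves both queries and documents.
shared = BertModel.from_pretrained("bert-base-uncased")
print(count_params(shared))  # roughly 110M parameters

# Two independent encoders, as in the repo's BiEncoder: the size roughly doubles.
q_enc = BertModel.from_pretrained("bert-base-uncased")
d_enc = BertModel.from_pretrained("bert-base-uncased")
print(count_params(q_enc) + count_params(d_enc))  # roughly 220M parameters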
Besides, the hyperparameters are too sensitive. See the table in the Appendix:
if you change the lr from 1e-6 to 2e-6, the accuracy decreases significantly!

Zhiqi ***@***.***> wrote on Thursday, September 23, 2021, 05:02:
Hello, I have a question about the BERT encoders. In the paper, it is said
that "ANCE can be used to train any dense retrieval model. For simplicity,
we use a simple set up in recent research (Luan et al., 2020) with BERT
Siamese/Dual Encoder (shared between q and d), dot product similarity, and
negative log likelihood (NLL) loss." So actually, only *one* encoder is
used to encode queries and documents separately. However, in the
"model.py", the "BiEncoder" is as follows:
class BiEncoder(nn.Module):
    """ Bi-Encoder model component. Encapsulates query/question and context/passage encoders. """
    def __init__(self, args):
        super(BiEncoder, self).__init__()
        self.question_model = HFBertEncoder.init_encoder(args)  # query encoder
        self.ctx_model = HFBertEncoder.init_encoder(args)  # separate, independently initialized passage encoder
There are *two* encoders defined.
--
KAISHUAI XU
Hello, I have a question about the BERT encoders. In the paper, it is said that "ANCE can be used to train any dense retrieval model. For simplicity, we use a simple set up in recent research (Luan et al., 2020) with BERT Siamese/Dual Encoder (shared between q and d), dot product similarity, and negative log likelihood (NLL) loss." So actually, only one encoder is used to encode queries and documents separately. However, in "model.py", the "BiEncoder" (quoted above) defines two encoders.
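If the goal is to make the code match the paper's "shared between q and d" description, one possible sketch is to reuse a single encoder instance for both fields. This is only an illustration written against HFBertEncoder from the repo's model.py, not the project's official implementation:

import torch.nn as nn

class SharedBiEncoder(nn.Module):
    """ Hypothetical Siamese variant: query and passage share one encoder. """
    def __init__(self, args):
        super(SharedBiEncoder, self).__init__()
        # Assumes HFBertEncoder from the repo's model.py is importable in this scope.
        shared_encoder = HFBertEncoder.init_encoder(args)
        self.question_model = shared_encoder
        # Same module object, so weights and gradients are shared between q and d.
        self.ctx_model = shared_encoder

Because question_model and ctx_model point at the same module, the parameter count stays that of a single BERT encoder.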