
[Question]: How the StackedEmbeddings function actually works? #3313

Open
ijazul-haq opened this issue Sep 8, 2023 · 3 comments
Labels
question Further information is requested

Comments

@ijazul-haq

Question

How does the StackedEmbeddings function actually work?
Is it concatenating the two word embeddings, i.e. using torch.cat([emb1, emb2])?

I want to concatenate BytePairEmbeddings with TransformerWordEmbeddings, so I'm doing it like this:

from flair.embeddings import (
    BytePairEmbeddings,
    StackedEmbeddings,
    TransformerWordEmbeddings,
)

bert_emb = TransformerWordEmbeddings(
    model='xlm-roberta-base',
    layers="-1",
    subtoken_pooling="mean",
    fine_tune=True,
    use_context=True,
)

bpe_emb = BytePairEmbeddings('en')
stacked_embeddings = StackedEmbeddings([bert_emb, bpe_emb])

So will the resulting word embeddings (stacked_embeddings) be a concatenation of the two embeddings, an element-wise mean of them, or something else?

Thank you

@ijazul-haq ijazul-haq added the question Further information is requested label Sep 8, 2023
@helpmefindaname
Collaborator

Hi @ijazul-haq

StackedEmbeddings uses the stacking/concatenation operator, as the name implies.
Note that element-wise pooling would not be possible, as the embeddings usually don't have the same embedding length.
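To illustrate, here is a minimal dependency-free sketch (not Flair's actual implementation) of what per-token concatenation means: each model's vector is appended, so the stacked length is the sum of the individual lengths, and the vectors never need to match in size. The 768/100 dimensions below are just illustrative values.

```python
def stack_embeddings(vectors):
    """Concatenate per-token vectors from several embedding models.

    Sketch of the concatenation behavior: the result has
    length = sum of the input lengths (no element-wise pooling).
    """
    stacked = []
    for v in vectors:
        stacked.extend(v)
    return stacked

# e.g. a 768-dim transformer vector and a 100-dim BytePair vector
bert_vec = [0.1] * 768
bpe_vec = [0.2] * 100
token_vec = stack_embeddings([bert_vec, bpe_vec])
assert len(token_vec) == 868  # 768 + 100
```

An element-wise mean of a 768-dim and a 100-dim vector would be undefined, which is why concatenation is the natural operation here.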

@ijazul-haq
Author

Does that mean StackedEmbeddings cannot be used for sequence tagging, only for sequence classification? Right?

@helpmefindaname
Collaborator

Sorry, I cannot follow how you came to that conclusion, but I can assure you that StackedEmbeddings works for both TokenEmbeddings and DocumentEmbeddings.
