Use as a Tagging Model? #13

Open
simonhughes22 opened this issue Mar 1, 2015 · 3 comments

@simonhughes22

This is more of a feature request.

I'd love to try using an LSTM model as a tagging model. I have per-word tags in my training data (not POS tags or any other common NLP tagging problem), and the previous tags can influence the current word's tags. Is it possible to use this library as a word tagger? Right now it looks like it trains on an entire document, sequentially, but with one target label per document.

@jacobmenick

I'd like to piggyback on this question. It's unclear how to do multiclass classification from the example given. Presumably you change the activation to 'softmax', change the cost to 'CategoricalCrossEntropy', and change the size of the output 'Dense' layer to the number of classes.

Another question: is the example given predicting a binary random variable at every step in the sequence, or is the binary random variable predicted from the sequence as a whole? That is, how many output nodes are there, and how are they connected to the hidden layer, in the example?

Thanks,
Jacob

@Slater-Victoroff
Contributor

@Newmu Bumping this up on your radar.

Apologies for the slow responses; Alec has been pretty underwater for the past couple of weeks.

We're always thrilled to see more examples, and changing to SoftMax + CCE should work (trying it out on the blogger dataset would be a good example, imo). Let us know if you run into any issues, but we'd love to see a PR for a multiclass classification example if you wouldn't mind contributing one.
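
For reference, a rough sketch of what those changes might look like on top of the README example (the Tokenizer / Embedding / GatedRecurrent / Dense / RNN names follow the README and may differ by version; train_text, train_labels, and n_classes are placeholders for your own data):

```python
from passage.preprocessing import Tokenizer
from passage.layers import Embedding, GatedRecurrent, Dense
from passage.models import RNN

# train_text: list of documents, train_labels: class targets (placeholders)
tokenizer = Tokenizer()
train_tokens = tokenizer.fit_transform(train_text)

n_classes = 10  # set to however many classes your labels have

layers = [
    Embedding(size=128, n_features=tokenizer.n_features),
    GatedRecurrent(size=128),
    # the binary example uses Dense(size=1, activation='sigmoid');
    # for multiclass, widen the output layer and switch to softmax
    Dense(size=n_classes, activation='softmax'),
]

# and swap the cost from binary to categorical cross-entropy
model = RNN(layers=layers, cost='CategoricalCrossEntropy')
model.fit(train_tokens, train_labels)
```

The targets would need to be in whatever format the categorical cross-entropy cost expects (e.g. one-hot vectors), so check that against your version as well.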

@youralien

I added a multi-class classification example using the blogger gender dataset here: #39
This addresses the multi-class case with a single output at the end of the sequence.

However, I think the tagging model that @simonhughes22 is really asking about is a classifier at each step of the RNN. That problem is more akin to a Neural Language Model. I see the similarity in that the softmax emitted at each step is predicting:

  • tags, in the case of the tagging model
  • the next word, in the case of a language model

This sequence output support is being addressed in a similar, but separate, PR: #5
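
To make the per-step vs. end-of-sequence distinction concrete, here is a small shape-only sketch in plain numpy (nothing Passage-specific; all names and sizes are illustrative):

```python
import numpy as np

# Illustrative sizes only
batch, timesteps, hidden, n_tags = 2, 5, 8, 4

H = np.random.randn(batch, timesteps, hidden)  # RNN hidden states, one per step
W = np.random.randn(hidden, n_tags)            # shared output projection

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Tagging / language-model style: a softmax over tags at every step
per_step = softmax(H @ W)                # shape (batch, timesteps, n_tags)

# Sequence-classification style: only the last hidden state feeds the output layer
per_sequence = softmax(H[:, -1, :] @ W)  # shape (batch, n_tags)
```

The per-step case is what the tagging model (and PR #5) needs; the per-sequence case is what #39 covers.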
