Use as a Tagging Model? #13
Comments
I'd like to piggyback on this question. It's unclear how to do multiclass classification from the example given. Presumably you change the activation to 'softmax', change the cost to 'CategoricalCrossEntropy', and change the size of the output 'Dense' layer to the number of classes. Another question: is the example predicting a binary random variable for every step in the sequence, or is the binary random variable predicted from the sequence as a whole? I.e., how many output nodes are there, and what are the connections from the hidden layer, in the example? Thanks.
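For concreteness, here is an untested sketch of the presumed changes, following the README-style Passage API; `n_classes`, `train_texts`, and `train_Y` are placeholders, not names from the library:

```python
from passage.preprocessing import Tokenizer
from passage.layers import Embedding, GatedRecurrent, Dense
from passage.models import RNN

n_classes = 10  # hypothetical: number of target classes

tokenizer = Tokenizer()
train_tokens = tokenizer.fit_transform(train_texts)  # train_texts: list of raw documents

layers = [
    Embedding(size=128, n_features=tokenizer.n_features),
    GatedRecurrent(size=128),
    # one output unit per class, softmax instead of sigmoid
    Dense(size=n_classes, activation='softmax'),
]

# swap BinaryCrossEntropy for CategoricalCrossEntropy
model = RNN(layers=layers, cost='CategoricalCrossEntropy')
model.fit(train_tokens, train_Y)  # train_Y: one-hot targets, shape (n_docs, n_classes)
```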
@Newmu Bumping this up on your radar. Apologies for the slow responses; Alec has been pretty underwater for the past couple of weeks. We're always thrilled to have more examples, and changing to softmax + CCE should work (trying it out on the blogger dataset would be a good example, imo). Let us know if you run into any issues, but we'd love to see a PR for a multiclass classification example if you wouldn't mind contributing one.
I added a multi-class classification example using the blogger gender data set here: #39. However, I think the tagging model that @simonhughes22 is really talking about is a classifier at each step in the RNN. This problem is more akin to something like a neural language model: the similarity is that a softmax emitted at each step predicts an output for that position in the sequence.
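To illustrate the distinction in plain numpy (not Passage's API; the shapes and names are made up for the example):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

T, H, C = 20, 128, 5            # timesteps, hidden size, number of tags/classes
hidden = np.random.randn(T, H)  # RNN hidden state at every timestep
W = np.random.randn(H, C)       # output projection

# Sequence classifier (the current examples): one prediction per document,
# computed from the final hidden state only.
doc_pred = softmax(hidden[-1].dot(W))       # shape (C,)

# Tagging model (what this issue asks for): a prediction at every timestep,
# like a neural language model emitting a distribution at each position.
tag_preds = softmax(hidden.dot(W), axis=-1)  # shape (T, C)
```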
This sequence output support is being addressed in a similar, but separate, PR: #5
This is more of a feature request.
I'd love to try to use an LSTM model as a tagging model. I have tagged words for my training data (not POS tags or any common NLP tagging problem). The previous tags can influence the current word's tags. Is it possible to use this library as a word tagger? Right now it looks like it trains on an entire document, sequentially, but with one target label per document.
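Roughly, the difference in data layout being asked for looks like this (illustrative Python only; the tag names and variables are invented to show the shape of the data, not a supported Passage input format):

```python
# Current supervised setup: one label per document.
train_texts = ["the first example document", "another short document"]
train_labels = [0, 1]

# Requested tagging setup: one label per word, aligned with the token sequence,
# so that earlier tags can inform later ones.
train_tagged = [
    (["the", "first", "example", "document"], ["TAG_A", "TAG_B", "TAG_B", "TAG_A"]),
    (["another", "short", "document"], ["TAG_C", "TAG_B", "TAG_A"]),
]
```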