[Combiner] Add a class for computing variance distributedly #235

Open · wants to merge 1 commit into master
Conversation

Kelvin-Ng
Contributor

This was originally part of Random Forest in the old Husky.

Now it is also useful for @Gyingguo's data visualization project.

@kygx-legend
Member

I don't quite understand why this class is placed in combiner. Is it itself a combiner, or is it used by a combiner? Also, could you add a unit test to help users understand it?

@Kelvin-Ng
Contributor Author

Very rough pseudo code:

auto ch = create_push_combine_channel<SumCombiner<VarianceMeanNum>>();
VarianceMeanNum a(xx, yy, zz);
for (auto& x : list)
    a += x;                    // accumulate each element into the local statistic
ch.push(0, a);
ch.flush();
ch.get(0).get_variance();      // variance of the whole list

@ddmbr
Member

ddmbr commented Feb 6, 2017

This is to calculate variance/standard deviation incrementally. Obviously it's not easy to give a good name...
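For reference, the usual incremental (Welford-style) update that such a class would implement when absorbing a single new sample $x$, given running count $n$, mean $\mu$, and population variance $\sigma^2$, is (symbols here are illustrative, not taken from the PR):

$$
n' = n + 1, \qquad
\mu' = \mu + \frac{x - \mu}{n'}, \qquad
\sigma'^2 = \frac{n\,\sigma^2 + (x - \mu)(x - \mu')}{n'}
$$

This avoids the catastrophic cancellation of the naive $E[x^2] - E[x]^2$ formula.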

@kygx-legend (Member) left a comment:


But I think there should be a unit test for this.


#include "boost/sort/spreadsort/spreadsort.hpp"

#include "base/log.hpp"

namespace husky {

class VarianceMeanNum {
public:
VarianceMeanNum() {
@kygx-legend (Member), Feb 6, 2017:


VarianceMeanNum() : variance_(.0), mean_(.0), num_(0) {}
