
MISP to multiple Sentinel #2

Open
Sp-TT opened this issue Apr 4, 2023 · 15 comments
Assignees
Labels
enhancement (New feature or request), roadmap (To implement in a future release)

Comments


Sp-TT commented Apr 4, 2023

Hi, this is actually a feature request rather than an issue.

Have you considered adding a function that pushes to multiple Sentinel workspaces with a single MISP pull, instead of running multiple copies of the script with different config files?

Cheers

@cudeso cudeso added the enhancement New feature or request label Apr 5, 2023
cudeso (Owner) commented Apr 13, 2023

This is actually a good suggestion for an enhancement @Sp-TT. I'll look into it for a future release.
Ideally it would also take the misp_event_filters into consideration, meaning each Sentinel workspace can have its own set of filters.

As a short-term workaround, you can install the script in different directories / venvs and run it from each with its own config.
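For the per-workspace filters idea, a hypothetical config shape could look like the sketch below. The nested misp_event_filters key per tenant is an assumption for illustration; nothing like it exists in the current config.

```python
# Hypothetical sketch: per-workspace auth plus per-workspace MISP filters.
# The 'misp_event_filters' key per tenant is an assumption, not an existing option.
ms_auth = {
    '<tenant_1_id>': {
        'client_id': '<client_id>',
        'client_secret': '<client_secret>',
        'workspace_id': '<workspace_id>',
        'misp_event_filters': {'published': 1, 'tags': ['tlp:clear']},
    },
    '<tenant_2_id>': {
        'client_id': '<client_id>',
        'client_secret': '<client_secret>',
        'workspace_id': '<workspace_id>',
        'misp_event_filters': {'published': 1},
    },
}
```

Each pull for a tenant could then pass its own filter dict to the MISP query, instead of a single global filter.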

@cudeso cudeso added the roadmap To implement in a future release label May 6, 2023
lnfernux (Collaborator) commented:

Just did a quick PoC for this.

I think if you redo the main function in script.py:

def push_to_sentinel(tenant, client_id, secret, workspace):
    logger.info("Using Microsoft Upload Indicator API")
    # Overwrite the auth settings with the current tenant's values
    config.ms_auth[TENANT] = tenant
    config.ms_auth[CLIENT_ID] = client_id
    config.ms_auth[CLIENT_SECRET] = secret
    config.ms_auth[WORKSPACE_ID] = workspace
    parsed_indicators, total_indicators = _get_misp_events_stix()
    logger.info("Found {} indicators in MISP".format(total_indicators))

    with RequestManager(total_indicators, logger) as request_manager:
        logger.info("Start uploading indicators")
        request_manager.upload_indicators(parsed_indicators)
        logger.info("Finished uploading indicators")
        if config.write_parsed_indicators:
            json_formatted_str = json.dumps(parsed_indicators, indent=4)
            with open("parsed_indicators.txt", "w") as fp:
                fp.write(json_formatted_str)

def main():
    # config.ms_auth is already a dict keyed by tenant id,
    # so iterate over it directly (no json.loads() needed)
    for tenant, value in config.ms_auth.items():
        push_to_sentinel(tenant, value['client_id'], value['client_secret'], value['workspace_id'])

and config.py:

ms_auth = {
    '<tenant_1_Id>': {
        'client_id': '<client_id>',
        'client_secret': '<client_secret>',
        'graph_api': False,                                 # Set to False to use Upload Indicators API   
        'scope': 'https://management.azure.com/.default',   # Scope for Upload Indicators API
        'workspace_id': '<workspace_id>'
    },
    '<tenant_2_Id>': {
        'client_id': '<client_id>',
        'client_secret': '<client_secret>',
        'graph_api': False,                                 # Set to False to use Upload Indicators API   
        'scope': 'https://management.azure.com/.default',   # Scope for Upload Indicators API
        'workspace_id': '<workspace_id>'
    },
    '<tenant_n_Id>': {
        'client_id': '<client_id>',
        'client_secret': '<client_secret>',
        'graph_api': False,                                 # Set to False to use Upload Indicators API   
        'scope': 'https://management.azure.com/.default',   # Scope for Upload Indicators API
        'workspace_id': '<workspace_id>'
    }
}

It would allow looping through the different workspaces. You would also need to add more SPNs for all the different workspaces, or a single enterprise app that's added to all workspaces as a Microsoft Sentinel Contributor.

This was just a basic test, so a proper PoC with an actual push is still needed. But the idea is there 👯

cudeso (Owner) commented Jul 20, 2023

Good approach!
I've put it on the list to include once I merge the upload_indicators branch into main.

@cudeso cudeso self-assigned this Jul 20, 2023
Kaloszer (Contributor) commented:

Does this mean that currently, when there is more than one entry in tenants, the script will pull from MISP once per workspace (API request to MISP > generate REST request > send to workspace 1, then API request to MISP > generate REST request > send to workspace 2), rather than pulling once and fanning out (API request to MISP > generate REST request > send to workspace 1 > send to workspace 2)?

If it's the former, this would be low-hanging fruit to fix and would improve performance.

lnfernux (Collaborator) commented:

The Azure Function currently supports the multiple-Sentinel mode, but as you said @Kaloszer, I think we can improve performance by getting the indicators only once and then sending them out, instead of downloading the indicators in every loop iteration.

Kaloszer (Contributor) commented:

So something really 'dumb' like this should work, I guess:

#69 (nice)

Just put a global var, check if it already exists, use it if it does, else just parse it. I don't have a way to test this year, but I don't see why this wouldn't do the trick.
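A minimal sketch of that global-variable cache idea, where fetch_indicators stands in for the repo's _get_misp_events_stix() (the function and variable names here are illustrative, not the actual patch):

```python
# Module-level cache shared by all workspace pushes in one run.
_CACHED_INDICATORS = None

def get_indicators_once(fetch_indicators):
    """Call the MISP fetch function only on the first invocation; reuse after.

    fetch_indicators: a zero-argument callable returning the parsed indicators,
    standing in for _get_misp_events_stix().
    """
    global _CACHED_INDICATORS
    if _CACHED_INDICATORS is None:
        _CACHED_INDICATORS = fetch_indicators()
    return _CACHED_INDICATORS
```

Each per-workspace push would then reuse the cached result instead of re-querying MISP. The trade-off is that the full parsed indicator list stays in memory for the whole run.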

cudeso (Owner) commented Dec 20, 2023

Simple, but I like it ;-)
I don't see a reason why this wouldn't work. I will check memory consumption on a large instance (running the script 'locally', without the Azure Function).
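For that memory check, one generic way to measure peak usage locally is the stdlib tracemalloc module. This is a standalone sketch, not code from the repo; the simulated indicator list just stands in for a large parsed MISP result.

```python
import tracemalloc

tracemalloc.start()
# Simulate holding a large list of parsed indicators in memory at once
indicators = [
    {"pattern": f"[ipv4-addr:value = '10.0.{i % 256}.{i // 256}']"}
    for i in range(100_000)
]
# current = memory allocated right now, peak = high-water mark since start()
current, peak = tracemalloc.get_traced_memory()
print(f"current={current / 1_048_576:.1f} MiB, peak={peak / 1_048_576:.1f} MiB")
tracemalloc.stop()
```

Running this around the caching change (once with the cache, once with per-workspace re-fetching) would show whether holding the indicators for the whole run is acceptable on a large instance.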

Kaloszer (Contributor) commented:

@cudeso

Looking into fixing the code that was submitted, I was wondering: why is there such a big drift between init.py / script.py? Shouldn't they be pretty much the same in the grand scheme of things?

Two files that execute pretty much the same logic both need to be maintained.

lnfernux (Collaborator) commented:

@Kaloszer

This is my fault; I stripped out all the unnecessary functions related to Graph API support.

lnfernux (Collaborator) commented:

Also, logging in Azure Functions works differently from local execution if you want it to print, so that's another difference.

jusso-dev (Contributor) commented:

I think this can be closed, @cudeso?

cudeso (Owner) commented Mar 4, 2024

If I'm not mistaken, this is only implemented in the Azure Function, not in the locally hosted Python version.
(Unfortunately I haven't gotten around to bringing both versions more in sync.)

Parasdeepkohli commented:

Hi,

Is there any update on this? The workaround of installing the repo in multiple directories is quite wasteful of resources.

cudeso (Owner) commented Oct 1, 2024

Hi @Parasdeepkohli , I have been completely overwhelmed with $dayjob tasks and have not been able to work on it.


Parasdeepkohli commented Oct 1, 2024

Hey @cudeso

Ahhh, that's understandable! Thanks for all the work you've been putting in despite your day job workload :-)
