Feature: Integrity check for maven, npm and pypi #261

Closed
wants to merge 149 commits into from
Commits
59becf3
Improve BOM processing performance and make it transactional (#218)
nscuro Jul 10, 2023
ad2205e
added stack trace (#231)
mehab Jul 10, 2023
3de61e7
Bump debian in /src/main/docker
dependabot[bot] Jul 10, 2023
354ee41
Bump docker/setup-buildx-action from 2.8.0 to 2.9.0
dependabot[bot] Jul 10, 2023
a39b913
Feature/authentication fix (#235)
mehab Jul 12, 2023
55569fe
Remove Lucene
nscuro Jul 10, 2023
d4cf6a0
Fix memory leak issue in vex upload processing task
Shawyeok Jul 5, 2023
f10e6af
Fix memory leak issue in policy evaluation
Shawyeok Jul 5, 2023
3782a7c
intermediate changes
mehab Jul 18, 2023
fbb55f8
intermediate commit
mehab Jul 20, 2023
8dc7a18
db query stage not working
mehab Jul 20, 2023
f27bce2
clean up
mehab Jul 20, 2023
cb00cf6
adding default false for authentication required (#238)
mehab Jul 14, 2023
2bfd76c
initial commit
Jul 18, 2023
70fe857
initial commit
Jul 18, 2023
3fcefcd
inital commit
Jul 18, 2023
e6c2ffb
added enum for step and status
Jul 18, 2023
7e98f63
added uniqueness constraint and modified cte query
Jul 18, 2023
a2d8e7a
added update query to update states of descendants
Jul 19, 2023
73543c9
addressed feedback
Jul 19, 2023
693bec3
inital commit for parsing cron expression
Jul 11, 2023
2f79579
minor tidying
Jul 11, 2023
9f22e0b
removed static instance
Jul 11, 2023
8eb589e
use local variable
Jul 11, 2023
d2c283b
remove fully qualified package name
Jul 11, 2023
44001dc
resolved merge conflicts
mehab Sep 12, 2023
b84ade7
updated default values
Jul 13, 2023
ad2e874
cleanup
mehab Jul 13, 2023
e235493
Bump docker/setup-buildx-action from 2.9.0 to 2.9.1
dependabot[bot] Jul 17, 2023
0ff1803
Issue-352 : CDX for vuln analysis result (#236)
sahibamittal Jul 18, 2023
bd30711
Re-implement `recursivelyDelete` for `Component` to be more performan…
nscuro Jul 10, 2023
841e7a7
Re-implement `recursivelyDelete` for `Project` to be more performant …
nscuro Jul 10, 2023
1b63950
Re-implement `recursivelyDelete` for `ServiceComponent` to be more pe…
nscuro Jul 17, 2023
1f54021
fix merge conflicts issue
Jul 19, 2023
ab7b86e
use test containers
Jul 19, 2023
b75be66
renamed method
Jul 19, 2023
504ebf3
Add Kafka Streams exception handlers
nscuro Jul 17, 2023
7797a8e
Reduce code duplication
nscuro Jul 19, 2023
7290d35
correcting merge conflict
mehab Jul 21, 2023
fbb8e22
component and project deletes corrected
mehab Jul 26, 2023
a7d1d14
topic segregation tested
mehab Jul 31, 2023
15b77f8
changed key for new topic and verified functionality
mehab Aug 1, 2023
ccd347f
Bump lib.net.javacrumbs.shedlock.version from 5.5.0 to 5.6.0
dependabot[bot] Jul 21, 2023
98e07cd
Bump org.apache.kafka:kafka-clients from 3.5.0 to 3.5.1
dependabot[bot] Jul 21, 2023
8648d01
Bump lib.kafka-streams.version from 3.5.0 to 3.5.1
dependabot[bot] Jul 21, 2023
cc486d9
Bump actions/setup-java from 3.11.0 to 3.12.0
dependabot[bot] Jul 24, 2023
5a5743d
Add REST endpoint for workflow status (#243)
sahibamittal Jul 26, 2023
2d96a73
Bom consumption and processing workflow (#247)
VithikaS Jul 26, 2023
e643502
Update IntelliJ run configurations
nscuro Jul 19, 2023
7c02667
add workflow update for metrics calculation
Jul 27, 2023
fd383af
remove unused imports
Jul 27, 2023
cc7a222
minor refactoring
Jul 27, 2023
1ebe1d7
fix failing test
Jul 27, 2023
d00551a
Fix flakiness of KS exception handler tests
nscuro Jul 27, 2023
c827590
add workflow state for policy evaluation
Jul 27, 2023
22ba990
changed test class name
Jul 28, 2023
1ed9479
create workflow steps on reanalysis
Jul 28, 2023
c3af160
reduce delays in task schedular
Jul 28, 2023
80f3ec8
change delays in task schedular test
Jul 28, 2023
8b396ad
Initial implementation and tests of workflow state reaper task
nscuro Jul 25, 2023
644e0f1
Fix deletion step; Add more tests; Add properties
nscuro Jul 26, 2023
9e3afbb
Make use of `AbstractPostgresEnabledTest`
nscuro Jul 26, 2023
7b51ef1
Resolve TODOs about step cancellation not updating `updatedAt`
nscuro Jul 26, 2023
61e4304
Populate `updatedAt` field when initially creating the workflow
nscuro Jul 27, 2023
216ed44
Do not require steps to be started in order for them to get timed out
nscuro Jul 27, 2023
4692707
Remove redundant `makePersistent`
nscuro Jul 27, 2023
1063fa8
Fix test failures due to non-`NULL` constraints of `UPDATED_AT`
nscuro Jul 28, 2023
770449a
Fix `BomUploadProcessingTaskTest`
nscuro Jul 28, 2023
04dc3dc
Wording: reaper -> cleanup
nscuro Jul 28, 2023
c7e7b49
Fix failures in `WorkflowQueryManagerTest` & `WorkflowResourceTest`
nscuro Jul 29, 2023
fcf5be1
Bump debian in /src/main/docker
dependabot[bot] Jul 31, 2023
bec9291
Implement workflow state for vulnerability analysis step (#252)
sahibamittal Aug 1, 2023
b463ca7
Bump Redpanda testcontainers to v23.2.2 (#256)
nscuro Aug 1, 2023
f2d0b05
Add status mapping in `NotificationModelConverter` (#257)
sahibamittal Aug 2, 2023
e886f59
Bump org.apache.maven:maven-artifact from 3.9.3 to 3.9.4
dependabot[bot] Aug 3, 2023
99a8d9e
Fix failure of `VULN_ANALYSIS` step not cancelling `POLICY_EVALUATION…
nscuro Aug 3, 2023
f9ec32e
Increase timeout for `KafkaStreamsTopologyTest#vulnScanResultProcessi…
nscuro Aug 4, 2023
c8b94c0
added unit tests for integrity check maven
mehab Aug 7, 2023
576584d
corrected code documentation
mehab Aug 9, 2023
ce8f9ef
addressed review comments
mehab Aug 12, 2023
155081c
Remove unused `org.hyades.vuln.v1` proto
nscuro Aug 7, 2023
8fb3f24
clean up logs
Aug 9, 2023
f24bc20
Don't log warning when rolling back BOM processing transaction
nscuro Aug 9, 2023
d0d0b63
Avoid redundant copies of uploaded BOM content in-memory
nscuro Aug 9, 2023
cf8ad9c
Fix inconsistent severities when updating vulnerabilities from scanne…
nscuro Aug 10, 2023
7e79f54
extend lock only when it is about to expire
Aug 11, 2023
90777f9
extend lock for inetrnal component identification when it is about to…
Aug 11, 2023
cb26e1c
lock repo meta analysis and vuln analysis
Aug 11, 2023
65cc935
remove stored procs script
Aug 11, 2023
372b4e3
Fix NPE when project doesn't exist during assembly of `PROJECT_VULN_A…
nscuro Aug 11, 2023
023acb4
prepare-release: set version to 5.0.0
dependencytrack-bot Aug 11, 2023
6c92d1e
changing column names for sync
mehab Aug 12, 2023
586881c
increased timeout
mehab Aug 12, 2023
74da717
increased timeout to avoid test failure
mehab Aug 12, 2023
f88bd5d
Increase timeouts in `KafkaStreamsTopologyTest`
nscuro Aug 14, 2023
81bc34c
add notification when project is created (#271)
VithikaS Aug 14, 2023
56395b5
Fix `VulnerabilityAnalysisTask` not acquiring a lock for processing t…
nscuro Aug 14, 2023
8595847
Refresh `WorkflowStatus` objects prior to asserting their status
nscuro Aug 14, 2023
4c80cb3
Issue-749 : add vulnerability property mapping (#274)
sahibamittal Aug 17, 2023
102dd19
Reduce volume of records to be processed for vulnerability scan compl…
nscuro Aug 17, 2023
feb5c48
Drop dependencies on MySQL and MSSQL JDBC drivers (#275)
nscuro Aug 17, 2023
18f9d73
Import components that are located under `metadata.component.componen…
malice00 Aug 19, 2023
e2072ce
Port https://github.com/DependencyTrack/dependency-track/pull/2549
nscuro Aug 20, 2023
462bddc
Bump version to `5.0.1-SNAPSHOT` in preparation of bugfix release
nscuro Aug 22, 2023
999c060
prepare-release: set version to 5.0.1
dependencytrack-bot Aug 22, 2023
a2d3202
Bump Redpanda images to `v23.2.6`
nscuro Aug 22, 2023
dfdab14
Bump debian in /src/main/docker
dependabot[bot] Aug 21, 2023
b08b361
Bump lib.protobuf-java.version from 3.23.4 to 3.24.1
dependabot[bot] Aug 21, 2023
3959daf
Bump lib.net.javacrumbs.shedlock.version from 5.6.0 to 5.7.0
dependabot[bot] Aug 25, 2023
f6145e1
Fix various bugs in parsing of mirrored vulnerabilities
nscuro Aug 18, 2023
b4f4f2f
Add more tests to verify mirrored vulnerability processing
nscuro Aug 18, 2023
8d76945
Add test case for vers range parsing
nscuro Aug 21, 2023
34db3d8
Un-ignore wildcard ranges
nscuro Aug 22, 2023
398903c
Bump docker/setup-buildx-action from 2.9.1 to 2.10.0
dependabot[bot] Aug 28, 2023
cce288a
Bump actions/checkout from 3.5.3 to 3.6.0
dependabot[bot] Aug 28, 2023
70d9303
Fix breaking change in `NEW_VULNERABILITY` notification JSON format
nscuro Aug 29, 2023
5d35cea
Fix usage of incorrect `timestamp` field in Webhook notification temp…
nscuro Aug 29, 2023
bb32262
Fix more JSON field names deviating from vanilla DT's
nscuro Aug 29, 2023
e031315
Ensure drainage of internal event system queue, Kafka Streams, and Ka…
nscuro Aug 22, 2023
8add1fe
Ensure Kafka producer exceptions are always logged
nscuro Aug 30, 2023
921de18
Set version to `5.0.2-SNAPSHOT` in preparation of bugfix release
nscuro Aug 31, 2023
bf9f019
prepare-release: set version to 5.0.2
dependencytrack-bot Aug 31, 2023
9ca8988
Fix grammatical number of `vulnerabilities` in `ProjectVulnAnalysisCo…
nscuro Sep 4, 2023
bff90b2
Ensure `protoc` version matches that of `protobuf-java`
nscuro Aug 14, 2023
3db3c51
Bump `protobuf-java` from `3.24.1` to `3.24.2`
nscuro Sep 4, 2023
22d34a8
fix order of setting severity
sahibamittal Sep 4, 2023
7fe903e
Bump actions/checkout from 3.6.0 to 4.0.0
dependabot[bot] Sep 4, 2023
53e21ae
Bump aquasecurity/trivy-action from 0.11.2 to 0.12.0
dependabot[bot] Sep 4, 2023
5f0cacd
Add temporary feature to delay `BOM_PROCESSED` notification until vul…
nscuro Sep 6, 2023
eb58b98
Bump com.github.tomakehurst:wiremock-jre8 from 2.35.0 to 2.35.1
dependabot[bot] Sep 6, 2023
c75f1c2
Fix BOM spec version missing in `Project#lastBomImportFormat`
nscuro Sep 8, 2023
ae6be26
Bump lib.protobuf-java.version from 3.24.2 to 3.24.3
dependabot[bot] Sep 8, 2023
938f69c
Bump org.slf4j:log4j-over-slf4j from 2.0.7 to 2.0.9
dependabot[bot] Sep 4, 2023
3652ea7
version bump
Sep 8, 2023
6101e1c
Update pom.xml
VithikaS Sep 10, 2023
81136b3
prepare-release: set version to 5.0.3
dependencytrack-bot Sep 11, 2023
1304d45
Bump debian in /src/main/docker
dependabot[bot] Sep 11, 2023
36addc4
Bump actions/upload-artifact from 3.1.2 to 3.1.3
dependabot[bot] Sep 11, 2023
fbd4aa8
Bump docker/build-push-action from 4.1.1 to 4.2.1
dependabot[bot] Sep 11, 2023
051461f
Bump org.apache.commons:commons-compress from 1.23.0 to 1.24.0
dependabot[bot] Sep 11, 2023
f3716dc
PR comments addressed
sahibamittal Aug 21, 2023
811ac23
refactor proto changes
sahibamittal Aug 21, 2023
9758161
api endpoint to get integrity analysis for component uuid
sahibamittal Aug 24, 2023
ea9cb89
add integrity analysis in component model
sahibamittal Aug 28, 2023
acdd3d4
Update ComponentIntegrityQueryManager.java
sahibamittal Aug 28, 2023
66d2fec
Update ComponentIntegrityQueryManager.java
sahibamittal Aug 28, 2023
8f0f325
update repository_url to identifier
sahibamittal Aug 29, 2023
7947e6e
resolved project build
mehab Sep 12, 2023
eb05934
resolved more conflicts
mehab Sep 12, 2023
@@ -0,0 +1,17 @@
package org.dependencytrack.event;

import alpine.event.framework.Event;
import com.github.packageurl.PackageURL;
import org.dependencytrack.model.Component;

import java.util.Optional;
import java.util.UUID;

public record ComponentIntegrityCheckEvent(String purl, Boolean internal, String md5, String sha1,
String sha256, UUID uuid, long componentId, String purlCoordinates) implements Event {
public ComponentIntegrityCheckEvent(final Component component) {
this(Optional.ofNullable(component.getPurl()).map(PackageURL::canonicalize).orElse(null), component.isInternal(), component.getMd5(),
component.getSha1(), component.getSha256(), component.getUuid(), component.getId(), Optional.ofNullable(component.getPurlCoordinates()).map(PackageURL::canonicalize).orElse(null));
}

}
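The record above maps a `Component` to the event's fields, using `Optional` so that a component without a PURL yields `null` rather than a `NullPointerException`. A minimal self-contained sketch of that null-safe mapping pattern (the `canonicalize` helper below is a hypothetical stand-in for `PackageURL::canonicalize`, included only so the snippet runs on its own):

```java
import java.util.Optional;

public class PurlMappingSketch {

    // Hypothetical stand-in for PackageURL::canonicalize, for illustration only.
    static String canonicalize(String purl) {
        return purl.trim().toLowerCase();
    }

    // Mirrors Optional.ofNullable(component.getPurl()).map(PackageURL::canonicalize).orElse(null):
    // returns the canonical form when a PURL is present, and null otherwise.
    static String canonicalPurlOrNull(String rawPurl) {
        return Optional.ofNullable(rawPurl)
                .map(PurlMappingSketch::canonicalize)
                .orElse(null);
    }
}
```

The same chain is applied twice in the constructor, once for `getPurl()` and once for `getPurlCoordinates()`, so either field may independently be absent.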
@@ -1,5 +1,7 @@
package org.dependencytrack.event.kafka;

import alpine.common.logging.Logger;
import org.dependencytrack.event.ComponentIntegrityCheckEvent;
import org.dependencytrack.event.ComponentRepositoryMetaAnalysisEvent;
import org.dependencytrack.event.ComponentVulnerabilityAnalysisEvent;
import org.dependencytrack.event.kafka.KafkaTopics.Topic;
@@ -19,6 +21,8 @@
*/
final class KafkaEventConverter {

    private static final Logger LOGGER = Logger.getLogger(KafkaEventConverter.class);

private KafkaEventConverter() {
}

@@ -55,14 +59,36 @@ static KafkaEvent<String, AnalysisCommand> convert(final ComponentRepositoryMeta
final var componentBuilder = org.hyades.proto.repometaanalysis.v1.Component.newBuilder()
.setPurl(event.purlCoordinates());
Optional.ofNullable(event.internal()).ifPresent(componentBuilder::setInternal);

final var analysisCommand = AnalysisCommand.newBuilder()
.setComponent(componentBuilder)
.build();

LOGGER.debug("Dispatching repo meta analysis event for component:" + componentBuilder.getUuid());
return new KafkaEvent<>(KafkaTopics.REPO_META_ANALYSIS_COMMAND, event.purlCoordinates(), analysisCommand, null);
}

static KafkaEvent<String, AnalysisCommand> convert(final ComponentIntegrityCheckEvent event) {
if (event == null || event.purl() == null) {
return null;
}

final var componentBuilder = org.hyades.proto.repometaanalysis.v1.Component.newBuilder()
.setPurl(event.purl());
Optional.ofNullable(event.internal()).ifPresent(componentBuilder::setInternal);
        Optional.ofNullable(event.uuid()).map(UUID::toString).ifPresent(componentBuilder::setUuid);
Optional.ofNullable(event.md5()).ifPresent(componentBuilder::setMd5Hash);
Optional.ofNullable(event.sha1()).ifPresent(componentBuilder::setSha1Hash);
Optional.ofNullable(event.sha256()).ifPresent(componentBuilder::setSha256Hash);
final var analysisCommand = AnalysisCommand.newBuilder()
.setComponent(componentBuilder)
.build();
LOGGER.debug("Dispatching integrity check event for component:" + componentBuilder.getUuid());
return new KafkaEvent<>(KafkaTopics.INTEGRITY_ANALYSIS_COMMAND, event.purlCoordinates(), analysisCommand, null);
}

static KafkaEvent<String, Notification> convert(final UUID projectUuid, final alpine.notification.Notification alpineNotification) {
final Notification notification = NotificationModelConverter.convert(alpineNotification);

@@ -10,6 +10,7 @@
import org.apache.kafka.common.KafkaException;
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Serde;
import org.dependencytrack.event.ComponentIntegrityCheckEvent;
import org.dependencytrack.event.ComponentRepositoryMetaAnalysisEvent;
import org.dependencytrack.event.ComponentVulnerabilityAnalysisEvent;
import org.dependencytrack.event.GitHubAdvisoryMirrorEvent;
@@ -69,6 +70,9 @@ public Future<RecordMetadata> dispatchAsync(final Event event, final Callback ca
} else if (event instanceof final ComponentRepositoryMetaAnalysisEvent e) {
LOGGER.debug("Dispatch internal called for component: " + e.purlCoordinates() + " Component is internal: " + e.internal());
return dispatchAsyncInternal(KafkaEventConverter.convert(e), callback);
} else if (event instanceof final ComponentIntegrityCheckEvent e) {
LOGGER.debug("Dispatching integrity check event for : " + e.purl() + " Component id is: " + e.componentId());
return dispatchAsyncInternal(KafkaEventConverter.convert(e), callback);
} else if (event instanceof final OsvMirrorEvent e) {
return dispatchAsyncInternal(new KafkaEvent<>(KafkaTopics.VULNERABILITY_MIRROR_COMMAND, Vulnerability.Source.OSV.name(), e.ecosystem(), null), callback);
} else if (event instanceof NistMirrorEvent) {
@@ -22,6 +22,7 @@
import org.dependencytrack.event.ProjectMetricsUpdateEvent;
import org.dependencytrack.event.ProjectPolicyEvaluationEvent;
import org.dependencytrack.event.kafka.processor.DelayedBomProcessedNotificationProcessor;
import org.dependencytrack.event.kafka.processor.IntegrityAnalysisResultProcessor;
import org.dependencytrack.event.kafka.processor.MirrorVulnerabilityProcessor;
import org.dependencytrack.event.kafka.processor.RepositoryMetaResultProcessor;
import org.dependencytrack.event.kafka.processor.VulnerabilityScanResultProcessor;
@@ -235,6 +236,12 @@ Topology createTopology() {
.withName("consume_from_%s_topic".formatted(KafkaTopics.REPO_META_ANALYSIS_RESULT.name())))
.process(RepositoryMetaResultProcessor::new, Named.as("process_repo_meta_analysis_result"));

streamsBuilder
.stream(KafkaTopics.INTEGRITY_ANALYSIS_RESULT.name(),
Consumed.with(KafkaTopics.INTEGRITY_ANALYSIS_RESULT.keySerde(), KafkaTopics.INTEGRITY_ANALYSIS_RESULT.valueSerde())
.withName("consume_from_%s_topic".formatted(KafkaTopics.INTEGRITY_ANALYSIS_RESULT.name())))
.process(IntegrityAnalysisResultProcessor::new, Named.as("process_component_integrity_analysis_result"));

streamsBuilder
.stream(KafkaTopics.NEW_VULNERABILITY.name(),
Consumed.with(KafkaTopics.NEW_VULNERABILITY.keySerde(), KafkaTopics.NEW_VULNERABILITY.valueSerde())
@@ -7,6 +7,7 @@
import org.dependencytrack.common.ConfigKey;
import org.dependencytrack.event.kafka.serialization.KafkaProtobufSerde;
import org.hyades.proto.notification.v1.Notification;
import org.hyades.proto.repointegrityanalysis.v1.IntegrityResult;
import org.hyades.proto.repometaanalysis.v1.AnalysisCommand;
import org.hyades.proto.repometaanalysis.v1.AnalysisResult;
import org.hyades.proto.vulnanalysis.v1.ScanCommand;
@@ -31,10 +32,13 @@ public final class KafkaTopics {
public static final Topic<String, String> VULNERABILITY_MIRROR_COMMAND;
public static final Topic<String, Bom> NEW_VULNERABILITY;
public static final Topic<String, AnalysisCommand> REPO_META_ANALYSIS_COMMAND;

public static final Topic<String, AnalysisCommand> INTEGRITY_ANALYSIS_COMMAND;
public static final Topic<String, AnalysisResult> REPO_META_ANALYSIS_RESULT;
public static final Topic<ScanKey, ScanCommand> VULN_ANALYSIS_COMMAND;
public static final Topic<ScanKey, ScanResult> VULN_ANALYSIS_RESULT;

    public static final Topic<String, IntegrityResult> INTEGRITY_ANALYSIS_RESULT;
public static final Topic<String, Notification> NOTIFICATION_PROJECT_VULN_ANALYSIS_COMPLETE;
private static final Serde<Notification> NOTIFICATION_SERDE = new KafkaProtobufSerde<>(Notification.parser());

@@ -55,6 +59,8 @@ public final class KafkaTopics {
VULNERABILITY_MIRROR_COMMAND = new Topic<>("dtrack.vulnerability.mirror.command", Serdes.String(), Serdes.String());
NEW_VULNERABILITY = new Topic<>("dtrack.vulnerability", Serdes.String(), new KafkaProtobufSerde<>(Bom.parser()));
REPO_META_ANALYSIS_COMMAND = new Topic<>("dtrack.repo-meta-analysis.component", Serdes.String(), new KafkaProtobufSerde<>(AnalysisCommand.parser()));
INTEGRITY_ANALYSIS_COMMAND = new Topic<>("dtrack.integrity-analysis.component", Serdes.String(), new KafkaProtobufSerde<>(AnalysisCommand.parser()));
INTEGRITY_ANALYSIS_RESULT = new Topic<>("dtrack.integrity-analysis.result", Serdes.String(), new KafkaProtobufSerde<>(IntegrityResult.parser()));
REPO_META_ANALYSIS_RESULT = new Topic<>("dtrack.repo-meta-analysis.result", Serdes.String(), new KafkaProtobufSerde<>(AnalysisResult.parser()));
VULN_ANALYSIS_COMMAND = new Topic<>("dtrack.vuln-analysis.component", new KafkaProtobufSerde<>(ScanKey.parser()), new KafkaProtobufSerde<>(ScanCommand.parser()));
VULN_ANALYSIS_RESULT = new Topic<>("dtrack.vuln-analysis.result", new KafkaProtobufSerde<>(ScanKey.parser()), new KafkaProtobufSerde<>(ScanResult.parser()));
@@ -0,0 +1,116 @@
package org.dependencytrack.event.kafka.processor;

import alpine.common.logging.Logger;
import alpine.common.metrics.Metrics;
import com.github.packageurl.MalformedPackageURLException;
import com.github.packageurl.PackageURL;
import io.micrometer.core.instrument.Timer;
import org.apache.kafka.streams.processor.api.Processor;
import org.apache.kafka.streams.processor.api.Record;
import org.dependencytrack.model.Component;
import org.dependencytrack.model.ComponentIntegrityAnalysis;
import org.dependencytrack.persistence.QueryManager;
import org.hyades.proto.repointegrityanalysis.v1.HashMatchStatus;
import org.hyades.proto.repointegrityanalysis.v1.IntegrityResult;

import javax.jdo.JDODataStoreException;
import javax.jdo.PersistenceManager;
import javax.jdo.Query;
import javax.jdo.Transaction;
import java.util.Date;
import java.util.UUID;

public class IntegrityAnalysisResultProcessor implements Processor<String, IntegrityResult, Void, Void> {
private static final Logger LOGGER = Logger.getLogger(IntegrityAnalysisResultProcessor.class);
private static final Timer TIMER = Timer.builder("integrity_analysis_result_processing")
.description("Time taken to process integrity analysis results")
.register(Metrics.getRegistry());

@Override
public void process(Record<String, IntegrityResult> record) {
final Timer.Sample timerSample = Timer.start();
try (final var qm = new QueryManager()) {
synchronizeComponentIntegrity(qm.getPersistenceManager(), record);
} catch (Exception e) {
LOGGER.error("An unexpected error occurred while processing record %s".formatted(record), e);
} finally {
timerSample.stop(TIMER);
}
}

private void synchronizeComponentIntegrity(final PersistenceManager pm, final Record<String, IntegrityResult> record) {

final IntegrityResult result = record.value();

final PackageURL purl;
try {
purl = new PackageURL(result.getComponent().getPurl());
} catch (MalformedPackageURLException e) {
LOGGER.warn("""
Received repository integrity information with invalid PURL,\s
will not be able to correlate; Dropping
""", e);
return;
}

final Transaction trx = pm.currentTransaction();
try {
trx.begin();
final Query<ComponentIntegrityAnalysis> query = pm.newQuery(ComponentIntegrityAnalysis.class);
query.setFilter("repositoryIdentifier == :repository && component.uuid == :uuid");
query.setParameters(
record.value().getRepositoryIdentifier(),
UUID.fromString(record.value().getComponent().getUuid())
);
ComponentIntegrityAnalysis persistentIntegrityResult = query.executeUnique();
if (persistentIntegrityResult == null || persistentIntegrityResult.getComponent() == null) {
persistentIntegrityResult = new ComponentIntegrityAnalysis();
}

if (persistentIntegrityResult.getLastCheck() != null
&& persistentIntegrityResult.getLastCheck().after(new Date(record.timestamp()))) {
LOGGER.warn("""
Received integrity check information for %s that is older\s
than what's already in the database; Discarding
""".formatted(purl));
return;
}
final Query<Component> queryComponent = pm.newQuery(Component.class);
queryComponent.setFilter("uuid == :uuid");
queryComponent.setParameters(UUID.fromString(record.value().getComponent().getUuid()));
Component component = queryComponent.executeUnique();
if (component != null) {
persistentIntegrityResult.setRepositoryIdentifier(record.value().getRepositoryIdentifier());
HashMatchStatus md5HashMatch = record.value().getMd5HashMatch();
HashMatchStatus sha1HashMatch = record.value().getSha1HashMatch();
HashMatchStatus sha256HashMatch = record.value().getSha256HashMatch();
persistentIntegrityResult.setMd5HashMatched(md5HashMatch.name());
persistentIntegrityResult.setSha256HashMatched(sha256HashMatch.name());
persistentIntegrityResult.setSha1HashMatched(sha1HashMatch.name());
persistentIntegrityResult.setComponent(component);
persistentIntegrityResult.setLastCheck(new Date(record.timestamp()));
if (md5HashMatch.equals(HashMatchStatus.HASH_MATCH_STATUS_FAIL) || sha1HashMatch.equals(HashMatchStatus.HASH_MATCH_STATUS_FAIL) || sha256HashMatch.equals(HashMatchStatus.HASH_MATCH_STATUS_FAIL)) {
persistentIntegrityResult.setIntegrityCheckPassed(false);
} else if (md5HashMatch.equals(HashMatchStatus.HASH_MATCH_STATUS_UNKNOWN) && sha1HashMatch.equals(HashMatchStatus.HASH_MATCH_STATUS_UNKNOWN) && sha256HashMatch.equals(HashMatchStatus.HASH_MATCH_STATUS_UNKNOWN)) {
persistentIntegrityResult.setIntegrityCheckPassed(false);
} else if (md5HashMatch.equals(HashMatchStatus.HASH_MATCH_STATUS_COMPONENT_MISSING_HASH) && sha1HashMatch.equals(HashMatchStatus.HASH_MATCH_STATUS_COMPONENT_MISSING_HASH) && sha256HashMatch.equals(HashMatchStatus.HASH_MATCH_STATUS_COMPONENT_MISSING_HASH)) {
persistentIntegrityResult.setIntegrityCheckPassed(false);
} else {
boolean flag = (md5HashMatch.equals(HashMatchStatus.HASH_MATCH_STATUS_PASS) || md5HashMatch.equals(HashMatchStatus.HASH_MATCH_STATUS_UNKNOWN))
&& (sha1HashMatch.equals(HashMatchStatus.HASH_MATCH_STATUS_PASS) || sha1HashMatch.equals(HashMatchStatus.HASH_MATCH_STATUS_UNKNOWN))
&& (sha256HashMatch.equals(HashMatchStatus.HASH_MATCH_STATUS_PASS) || sha256HashMatch.equals(HashMatchStatus.HASH_MATCH_STATUS_UNKNOWN));
persistentIntegrityResult.setIntegrityCheckPassed(flag);
}
pm.makePersistent(persistentIntegrityResult);

trx.commit();
}
} catch (JDODataStoreException e) {
LOGGER.error("An unexpected error occurred while executing JDO query %s".formatted(record), e);
} finally {
if (trx.isActive()) {
trx.rollback();
}
}
}
}
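The hash-match branching in `synchronizeComponentIntegrity` reduces to a small decision rule: any `FAIL` fails the check; all-`UNKNOWN` or all-`COMPONENT_MISSING_HASH` also fail (nothing could be verified); otherwise the check passes only if every hash is `PASS` or `UNKNOWN`. A self-contained sketch of that rule, for illustration (the enum values are shortened stand-ins for the proto's `HASH_MATCH_STATUS_*` names):

```java
enum HashMatchStatus { PASS, FAIL, UNKNOWN, COMPONENT_MISSING_HASH }

public class IntegrityStatusSketch {

    // Same branching as the processor above: FAIL anywhere loses, a fully
    // inconclusive result loses, and otherwise every hash must be PASS or UNKNOWN.
    static boolean integrityCheckPassed(HashMatchStatus md5,
                                        HashMatchStatus sha1,
                                        HashMatchStatus sha256) {
        if (md5 == HashMatchStatus.FAIL
                || sha1 == HashMatchStatus.FAIL
                || sha256 == HashMatchStatus.FAIL) {
            return false; // a single mismatch is conclusive
        }
        if (md5 == HashMatchStatus.UNKNOWN
                && sha1 == HashMatchStatus.UNKNOWN
                && sha256 == HashMatchStatus.UNKNOWN) {
            return false; // nothing could be compared at all
        }
        if (md5 == HashMatchStatus.COMPONENT_MISSING_HASH
                && sha1 == HashMatchStatus.COMPONENT_MISSING_HASH
                && sha256 == HashMatchStatus.COMPONENT_MISSING_HASH) {
            return false; // the component supplied no hashes to check
        }
        return (md5 == HashMatchStatus.PASS || md5 == HashMatchStatus.UNKNOWN)
                && (sha1 == HashMatchStatus.PASS || sha1 == HashMatchStatus.UNKNOWN)
                && (sha256 == HashMatchStatus.PASS || sha256 == HashMatchStatus.UNKNOWN);
    }
}
```

Note the asymmetry this encodes: a single `UNKNOWN` alongside passing hashes is tolerated, but an `UNKNOWN` alongside a `COMPONENT_MISSING_HASH` falls through to the final condition and fails.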
@@ -130,4 +130,5 @@ private void synchronizeRepositoryMetaComponent(final PersistenceManager pm, fin
}
}


}
9 changes: 9 additions & 0 deletions src/main/java/org/dependencytrack/model/Component.java
@@ -355,6 +355,7 @@ public enum FetchGroup {
private transient String licenseId;
private transient DependencyMetrics metrics;
private transient RepositoryMetaComponent repositoryMeta;
private transient ComponentIntegrityAnalysis integrityAnalysis;
private transient boolean isNew;
private transient int usedBy;
private transient Set<String> dependencyGraph;
@@ -741,6 +742,14 @@ public void setRepositoryMeta(RepositoryMetaComponent repositoryMeta) {
this.repositoryMeta = repositoryMeta;
}

public ComponentIntegrityAnalysis getIntegrityAnalysis() {
return integrityAnalysis;
}

public void setIntegrityAnalysis(ComponentIntegrityAnalysis integrityAnalysis) {
this.integrityAnalysis = integrityAnalysis;
}

public boolean isNew() {
return isNew;
}