Hello everyone,

Since the related issue #76 has not been updated in two years, I am opening a new one.

When I try to read a GTFS zip file of about 30 MB, I get this error:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.base/java.util.regex.Matcher.<init>(Matcher.java:249)
at java.base/java.util.regex.Pattern.matcher(Pattern.java:1133)
at org.onebusaway.gtfs.serialization.mappings.StopTimeFieldMappingFactory.getStringAsSeconds(StopTimeFieldMappingFactory.java:70)
at org.onebusaway.gtfs.serialization.mappings.StopTimeFieldMappingFactory$StopTimeFieldMapping.convert(StopTimeFieldMappingFactory.java:123)
at org.onebusaway.gtfs.serialization.mappings.StopTimeFieldMappingFactory$StopTimeFieldMapping.translateFromCSVToObject(StopTimeFieldMappingFactory.java:100)
at org.onebusaway.csv_entities.IndividualCsvEntityReader.readEntity(IndividualCsvEntityReader.java:131)
at org.onebusaway.csv_entities.IndividualCsvEntityReader.handleLine(IndividualCsvEntityReader.java:98)
at org.onebusaway.csv_entities.CsvEntityReader.readEntities(CsvEntityReader.java:157)
at org.onebusaway.csv_entities.CsvEntityReader.readEntities(CsvEntityReader.java:120)
at org.onebusaway.csv_entities.CsvEntityReader.readEntities(CsvEntityReader.java:115)
at org.onebusaway.gtfs.serialization.GtfsReader.run(GtfsReader.java:171)
at org.onebusaway.gtfs.serialization.GtfsReader.run(GtfsReader.java:159)
at de.blackforestsolutions.dravelopsstationimporter.service.communicationservice.GtfsApiServiceImpl.extractStopsFrom(GtfsApiServiceImpl.java:65)
at de.blackforestsolutions.dravelopsstationimporter.service.communicationservice.GtfsApiServiceImpl.executeApiCall(GtfsApiServiceImpl.java:49)
at de.blackforestsolutions.dravelopsstationimporter.service.communicationservice.GtfsApiServiceImpl$$Lambda$351/0x0000000100410040.apply(Unknown Source)
at reactor.core.publisher.FluxFlatMap$FlatMapMain.onNext(FluxFlatMap.java:378)
at reactor.core.publisher.FluxIterable$IterableSubscription.slowPath(FluxIterable.java:267)
at reactor.core.publisher.FluxIterable$IterableSubscription.request(FluxIterable.java:225)
at reactor.core.publisher.FluxFlatMap$FlatMapMain.onSubscribe(FluxFlatMap.java:363)
at reactor.core.publisher.FluxIterable.subscribe(FluxIterable.java:161)
at reactor.core.publisher.FluxIterable.subscribe(FluxIterable.java:86)
at reactor.core.publisher.Flux.subscribe(Flux.java:8325)
at reactor.test.DefaultStepVerifierBuilder$DefaultStepVerifier.toVerifierAndSubscribe(DefaultStepVerifierBuilder.java:868)
at reactor.test.DefaultStepVerifierBuilder$DefaultStepVerifier.verify(DefaultStepVerifierBuilder.java:824)
at reactor.test.DefaultStepVerifierBuilder$DefaultStepVerifier.verify(DefaultStepVerifierBuilder.java:816)
at reactor.test.DefaultStepVerifierBuilder.verifyComplete(DefaultStepVerifierBuilder.java:683)
at de.blackforestsolutions.dravelopsstationimporter.service.communicationservice.GtfsApiServiceTest.test_(GtfsApiServiceTest.java:65)
This is the code where the error happens:
private Collection<Stop> extractStopsFrom(File gtfsZip) throws IOException {
    GtfsReader gtfsReader = new GtfsReader();
    // set the GTFS zip file as input
    gtfsReader.setInputLocation(gtfsZip);
    // the store collects all entities read from the feed
    GtfsDaoImpl store = new GtfsDaoImpl();
    gtfsReader.setEntityStore(store);
    // read the whole feed (this is where the OutOfMemoryError is thrown)
    gtfsReader.run();
    return store.getAllStops();
}
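If the problem is simply that gtfsReader.run() loads the whole feed (including stop_times.txt) into memory, would reading only the files I actually need be the intended approach? Something like the following untested sketch, assuming readEntities(Class) inherited from CsvEntityReader can be called directly and that Agency and Stop come from org.onebusaway.gtfs.model:

private Collection<Stop> extractOnlyStopsFrom(File gtfsZip) throws IOException {
    GtfsReader gtfsReader = new GtfsReader();
    gtfsReader.setInputLocation(gtfsZip);
    GtfsDaoImpl store = new GtfsDaoImpl();
    gtfsReader.setEntityStore(store);
    // read agency.txt first so stop ids get the correct agency prefix
    gtfsReader.readEntities(Agency.class);
    // read only stops.txt; stop_times.txt (the largest file) is never loaded
    gtfsReader.readEntities(Stop.class);
    return store.getAllStops();
}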
I am using a Maven project with Java 11.
How much memory do I need to fix this error? Am I doing something wrong?
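In case the answer is just "give the JVM more heap": the error happens in a test (see the stack trace), so I assume the heap for the test run could be raised roughly like this in the pom.xml, provided the tests run through the Surefire plugin (the 2g value is only a guess on my side):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- maximum heap for the forked test JVM; value is arbitrary -->
    <argLine>-Xmx2g</argLine>
  </configuration>
</plugin>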