
Multiprocessing Error When Creating Portfolio #729

Open
daviddwlee84 opened this issue Jul 8, 2024 · 1 comment

Comments


daviddwlee84 commented Jul 8, 2024

When I tried to parallelize the backtesting, I found two different kinds of errors.

First, it seems a VectorBT Portfolio is not picklable out of the box: AttributeError: Can't pickle local object 'cached_method.<locals>.decorator.<locals>.wrapper.<locals>.partial_func' (yet, somehow, it can be saved as a pickle with pf.save()).
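For context, CPython's pickle can only serialize functions it can look up by qualified name at module level, so any function defined inside another function fails with exactly this shape of error. A minimal stdlib sketch (make_adder/add are illustrative names, not vectorbt code):

```python
import pickle

def make_adder(n):
    # A local function: pickle cannot find it by qualified name,
    # the same situation as vectorbt's cached_method ... partial_func.
    def add(x):
        return x + n
    return add

f = make_adder(1)
try:
    pickle.dumps(f)
except Exception as e:
    # e.g. AttributeError: Can't pickle local object 'make_adder.<locals>.add'
    print(type(e).__name__, e)
```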

Second, if I use a package like pathos to bypass the pickling issue, the extended VectorBT objects seem not to initialize normally: AttributeError: 'Config' object has no attribute '_readonly_'

@property
def readonly_(self) -> bool:
"""Whether to deny any updates to the config."""
return self._readonly_
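This second error is consistent with how deserializers rebuild objects: pickle (and dill, which pathos uses) typically creates the instance with __new__ and then restores state, bypassing __init__, so an attribute normally set during construction can be missing if the state transfer goes wrong. A minimal sketch with a stand-in Config class, not vectorbt's actual implementation:

```python
class Config:
    def __init__(self):
        self._readonly_ = False

    @property
    def readonly_(self) -> bool:
        """Whether to deny any updates to the config."""
        return self._readonly_

# Deserializers build instances via __new__ and then restore __dict__;
# if that restore misfires, __init__-time attributes like _readonly_
# never exist on the rebuilt object.
broken = Config.__new__(Config)  # bypasses __init__, as a deserializer would
try:
    broken.readonly_
except AttributeError as e:
    print(e)  # 'Config' object has no attribute '_readonly_'
```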

Here is how to reproduce the errors:

# 1. Import packages, create data, and try basic operations
import pandas as pd
import numpy as np
import vectorbt as vbt

from tqdm import tqdm
tqdm.pandas()

from pandarallel import pandarallel
pandarallel.initialize(progress_bar=True)

series_of_dfs = pd.Series(
    [
        pd.DataFrame(
            {"close": np.abs(np.random.randn(10)), "signal": np.random.randn(10)}
        )
        for _ in range(10)
    ]
)

def add(df: pd.DataFrame) -> pd.Series:
    return df["close"] + df["signal"]

print(series_of_dfs.apply(add))
print(series_of_dfs.progress_apply(add))
print(series_of_dfs.parallel_apply(add))

# 2. Multiprocessing
def backtest(df: pd.DataFrame) -> vbt.Portfolio:
    long_signal = df["signal"] > 0.5
    short_signal = df["signal"] < -0.5
    return vbt.Portfolio.from_signals(
        close=df["close"], entries=long_signal, exits=short_signal
    )

# fine
print(series_of_dfs.progress_apply(backtest))

# BUG: AttributeError: Can't pickle local object 'cached_method.<locals>.decorator.<locals>.wrapper.<locals>.partial_func'
print(series_of_dfs.parallel_apply(backtest))

# 3. Try a multi-process pool
from multiprocessing import Pool
# or
from pathos.pools import _ProcessPool as Pool

with Pool(60) as pool:
    futures = []
    results = []
    for df in series_of_dfs:
        futures.append(pool.apply_async(backtest, args=(df,)))
    for future in futures:
        # BUG (when using multiprocessing): multiprocessing.pool.MaybeEncodingError: Error sending result: '<vectorbt.portfolio.base.Portfolio object at 0x7f613c772040>'. Reason: 'AttributeError("Can't pickle local object 'cached_method.<locals>.decorator.<locals>.wrapper.<locals>.partial_func'")'
        # BUG (when using pathos): AttributeError: 'Config' object has no attribute '_readonly_'
        results.append(future.get())

Version

  • Python 3.8.13
  • vectorbt 0.26.1
  • pathos 0.3.2
  • pandas 1.5.3
  • numpy 1.23.5
  • pandarallel 1.6.5

daviddwlee84 commented Dec 5, 2024

Calculating pf.stats() per column in multiple processes hits the same issue:

# fine
all_column_stats = pf.wrapper.columns.to_series().progress_map(
    lambda column: pf.stats(column=column, group_by=False)
)

# BUG: AttributeError: 'Config' object has no attribute '_readonly_'
all_column_stats = pf.wrapper.columns.to_series().parallel_map(
    lambda column: pf.stats(column=column, group_by=False)
)
# same error with a shallow copy
all_column_stats = pf.wrapper.columns.to_series().parallel_map(
    lambda column: pf.copy().stats(column=column, group_by=False)
)
# same error with a deep copy
all_column_stats = pf.wrapper.columns.to_series().parallel_map(
    lambda column: pf.copy("deep", nested=True).stats(column=column, group_by=False)
)
