Evaluate module

pyro_risks.pipeline.evaluate.evaluate_pipeline(X: pandas.core.frame.DataFrame, y: pandas.core.series.Series, pipeline: Union[imblearn.pipeline.Pipeline, str], threshold: str, prefix: Optional[str] = None, destination: Optional[str] = None) → None

Build and save binary classification evaluation reports.

Parameters
  • X – Training dataset features, as a pd.DataFrame.

  • y – Training dataset target, as a pd.Series.

  • pipeline – imbalanced-learn preprocessing pipeline, or path to a serialized pipeline.

  • threshold – Path to the classification pipeline's optimal threshold.

  • prefix – Classification reports prefix, i.e. the pipeline name. Defaults to None.

  • destination – Folder where the reports should be saved. Defaults to METADATA_REGISTRY.
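
A minimal usage sketch; the toy features, the artifact paths, and the "RFC" prefix are hypothetical placeholders, not part of the library:

    import pandas as pd

    from pyro_risks.pipeline.evaluate import evaluate_pipeline

    # Toy feature matrix and binary target; substitute the real training data.
    X = pd.DataFrame({"ffmc": [86.2, 90.1, 78.5, 91.0], "isi": [5.1, 7.3, 2.4, 8.0]})
    y = pd.Series([0, 1, 0, 1])

    # The pipeline and threshold artifact paths below are illustrative only.
    evaluate_pipeline(
        X=X,
        y=y,
        pipeline="model_registry/RFC.joblib",
        threshold="metadata_registry/RFC_threshold.json",
        prefix="RFC",
    )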

pyro_risks.pipeline.evaluate.save_classification_plots(y_true: numpy.ndarray, y_proba: numpy.ndarray, threshold: numpy.float64, prefix: Optional[str] = None, destination: Optional[str] = None) → None

Build and save binary classification performance evaluation plots.

Parameters
  • y_true – Ground truth (correct) labels.

  • y_proba – Predicted probabilities of the positive class, as returned by a classifier.

  • threshold – The classification pipeline's optimal threshold.

  • prefix – Classification plots prefix, i.e. the pipeline name. Defaults to None.

  • destination – Folder where the plots should be saved. Defaults to METADATA_REGISTRY.
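
A minimal sketch with toy labels and probabilities; the 0.5 threshold and the "RFC" prefix are illustrative assumptions:

    import numpy as np

    from pyro_risks.pipeline.evaluate import save_classification_plots

    # Toy ground-truth labels and predicted positive-class probabilities.
    y_true = np.array([0, 1, 1, 0, 1])
    y_proba = np.array([0.12, 0.81, 0.65, 0.30, 0.55])

    # Omitting destination saves the plots under METADATA_REGISTRY.
    save_classification_plots(
        y_true=y_true,
        y_proba=y_proba,
        threshold=np.float64(0.5),
        prefix="RFC",
    )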

pyro_risks.pipeline.evaluate.save_classification_reports(y_true: numpy.ndarray, y_pred: numpy.ndarray, prefix: Optional[str] = None, destination: Optional[str] = None) → None

Build and save binary classification metrics reports.

Parameters
  • y_true – Ground truth (correct) labels.

  • y_pred – Predicted labels, as returned by a calibrated classifier.

  • prefix – Classification reports prefix, i.e. the pipeline name. Defaults to None.

  • destination – Folder where the reports should be saved. Defaults to METADATA_REGISTRY.
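
A minimal sketch with toy hard labels (probabilities already thresholded); the "RFC" prefix is an illustrative assumption:

    import numpy as np

    from pyro_risks.pipeline.evaluate import save_classification_reports

    # Toy ground-truth and predicted binary labels.
    y_true = np.array([0, 1, 1, 0, 1])
    y_pred = np.array([0, 1, 0, 0, 1])

    # Reports land in METADATA_REGISTRY unless destination is given.
    save_classification_reports(y_true=y_true, y_pred=y_pred, prefix="RFC")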