transformer/regressor API question #28744
-
Hi everyone in the scikit-learn community. First of all, thanks for the amazing package!

Lately, I have been developing a package for statistical analysis of time series in neuroscience, and I wanted to conform more to the scikit-learn API. However, the main class in the package constructs features via a call like `features = Basis.compute_features(x1, x2, ...)`. I would like this constructor to conform to the transformer API, so that one could use it in sklearn pipelines. I ended up with a sort of hack: a class that basically wraps my constructor as a transformer:

```python
class TransformerBasis:
    def __init__(self, basis):
        self._basis = basis

    def fit(self, X: NDArray, y=None):
        self._basis.set_kernel(*X.T)
        return self

    def transform(self, X: NDArray, y=None) -> NDArray:
        return self._basis._compute_features(*X.T)

    def fit_transform(self, X, y=None):
        return self._basis.compute_features(*X.T)
```

Q1: is there a better way to comply with the transformer API than the one I have currently adopted (without changing the API for `Basis` drastically)?
Q2: why is the signature for the transformer's

Thanks in advance!
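For context, the conventional scikit-learn pattern for a wrapper like this is to inherit from `TransformerMixin` and `BaseEstimator`, which supplies `fit_transform`, `get_params`/`set_params`, and pipeline compatibility for free. The sketch below is a hypothetical stand-in (the `basis_func` callable is an assumption, not the actual `Basis` API from the package in question):

```python
import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin


class BasisTransformer(TransformerMixin, BaseEstimator):
    """Hypothetical sketch: wrap a column-wise basis function as a transformer.

    ``basis_func`` stands in for the package's ``Basis.compute_features``;
    it takes one 1-D array per column and returns a 2-D feature array.
    """

    def __init__(self, basis_func=None):
        # sklearn convention: store constructor params unmodified so that
        # get_params()/set_params() and cloning work out of the box.
        self.basis_func = basis_func

    def fit(self, X, y=None):
        X = np.asarray(X)
        self.n_features_in_ = X.shape[1]  # fitted attrs end with underscore
        return self

    def transform(self, X):
        X = np.asarray(X)
        # Unpack columns, mirroring the compute_features(x1, x2, ...) call.
        return self.basis_func(*X.T)
```

With `TransformerMixin`, a separate `fit_transform` is unnecessary: the mixin implements it as `fit(X, y).transform(X)`.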
Replies: 1 comment 1 reply
-
Your code makes me think about the `GaussianProcess` in scikit-learn: `basis` would be equivalent to the `kernel` parameter (https://github.com/scikit-learn/scikit-learn/blob/f07e0138b/sklearn/gaussian_process/kernels.py), and the transformer would play the role of the regressor/classifier (`GaussianProcessRegressor` or `GaussianProcessClassifier`) that takes `kernel` as a parameter. So I would probably check whether the design pattern is close to this.

Scikit-learn was thought out and designed for structured data such as tabular data. Generally, it is a bad idea to aim for an overly generic design and/or to think about use cases that one doesn't have at hand.
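To make the suggested analogy concrete, here is a minimal example of the pattern being pointed at: the kernel is a configurable object passed as a constructor parameter, and the estimator owns the fit/predict logic (a sketch using scikit-learn's actual `GaussianProcessRegressor` API):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy 1-D regression data.
X = np.linspace(0, 1, 20).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel()

# The kernel object (analogous to `basis`) is just a parameter of the
# estimator; the estimator (analogous to the transformer) does the work.
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0))
gpr.fit(X, y)
```

In this design, swapping the kernel never requires subclassing the estimator, which is the property the reply suggests checking against the `Basis`/transformer split.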