I'm stuck on the problem of scaling new data. In my scheme, I have trained and tested the model, and both x_train and x_test were scaled with sklearn's MinMaxScaler(). Now, when applying the model in a real-time process, how can I scale the new input onto the same scale as the training and testing data? The steps are as below:
from sklearn.preprocessing import MinMaxScaler

featuresData = df[features].values  # array of all features, thousands of rows long
sc = MinMaxScaler(feature_range=(-1, 1), copy=False)
featuresData = sc.fit_transform(featuresData)  # scaler is fitted on the training features

# Train the final model (X, Y and X_test are built from the scaled featuresData)
model.fit(X, Y)
model.predict(X_test)

# Save the model to abcxyz.h5
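For context, here is roughly how I imagine the saving step on my side — a minimal sketch; using joblib for the scaler and the file name 'scaler.save' are just my assumptions:

import joblib

# Save the trained Keras model and the fitted scaler together,
# so the real-time process can reload both later
model.save('abcxyz.h5')
joblib.dump(sc, 'scaler.save')  # placeholder file name for the fitted MinMaxScaler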
Then, when implementing with new data:
#Load the model abcxyz.h5
#Catch the new data
#Scale the new data to feed into the loaded model  << this is the step I'm stuck on
#...
So how do I scale the new data for prediction and then inverse-transform the result? From my logic, the new data needs to be scaled in the same manner as the original scaler that was fitted before training the model.
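In other words, something like the sketch below is what I have in mind, assuming the scaler was persisted with joblib as above and that new_df holds the incoming real-time rows (both names are placeholders):

import joblib
from tensorflow.keras.models import load_model

# Reload the model and the scaler that was fitted on the training data
model = load_model('abcxyz.h5')
sc = joblib.load('scaler.save')

# transform (not fit_transform), so the new rows are mapped with the
# min/max values learned from the training data
newFeatures = sc.transform(new_df[features].values)

prediction = model.predict(newFeatures)

# Map the prediction back to the original scale; this only works if the
# prediction has the same number of columns the scaler was fitted on
result = sc.inverse_transform(prediction)

Is that the correct way to handle it?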