
Normalisation transforms data into a fixed range: $$X_i = \dfrac{X_i - Min}{Max - Min}$$
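For reference, a minimal sketch of this min-max scaling applied column-wise with NumPy (the array values are made up purely for illustration):

```python
import numpy as np

X = np.array([[1.0, 200.0],
              [2.0, 400.0],
              [3.0, 600.0]])

# Min-max normalisation: scale each column into [0, 1]
X_min = X.min(axis=0)
X_max = X.max(axis=0)
X_norm = (X - X_min) / (X_max - X_min)

# Note: a new value outside [X_min, X_max] maps outside [0, 1]
```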

In practice, I have found that the model doesn't generalise well when the input data are normalised rather than standardised (the standardisation formula is shown below).

Before training a neural net, data are usually standardised or normalised. Standardising seems preferable because it makes the model generalise better, whereas normalisation may leave the model unable to handle values outside the training data range.

So I'm using standardisation for the input data (X); however, I'm confused about whether I should standardise the expected output values too.

For a column of the input data: $$X_i = \dfrac{X_i - \mu}{\sigma}$$ where $\mu$ and $\sigma$ are the mean and standard deviation of that column.
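A minimal sketch of this column-wise standardisation in NumPy (again with made-up values), keeping the training statistics so they can be reused on new data:

```python
import numpy as np

X_train = np.array([[1.0, 200.0],
                    [2.0, 400.0],
                    [3.0, 600.0]])

# Standardisation: zero mean, unit standard deviation per column
mean = X_train.mean(axis=0)
std = X_train.std(axis=0)
X_train_std = (X_train - mean) / std

# New data must be transformed with the *training* mean and std
X_new = np.array([[4.0, 800.0]])
X_new_std = (X_new - mean) / std
```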

Should I apply this formula to the expected output values (labels) too?

Dan D

1 Answer


It depends, as mentioned in the comments, on your model and labels. For example, how would you use standardization in a multi-class classification problem, where the labels are categories rather than continuous values?

Generally, standardization is more favorable for input data because it centers the data around a mean of 0.

I assume you have a regression model, and in that case standardizing the targets could be better than normalizing them.
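As a sketch of what that could look like in a regression setting (illustrative only; it assumes scikit-learn and uses a made-up synthetic dataset), note that the target scaling has to be inverted to get predictions back in the original units:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Tiny synthetic regression problem, purely for illustration
rng = np.random.default_rng(0)
X = rng.uniform(0, 100, size=(200, 2))
y = 3.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 1, size=200)

X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

# Separate scalers for inputs and targets, so the target
# scaling can be undone after prediction
x_scaler = StandardScaler()
y_scaler = StandardScaler()

X_train_std = x_scaler.fit_transform(X_train)
y_train_std = y_scaler.fit_transform(y_train.reshape(-1, 1)).ravel()

model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(X_train_std, y_train_std)

# Transform test inputs with the *training* statistics, then
# invert the target scaling to get predictions in original units
y_pred_std = model.predict(x_scaler.transform(X_test))
y_pred = y_scaler.inverse_transform(y_pred_std.reshape(-1, 1)).ravel()
```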

Shayan Shafiq
pedrum