
I have data with columns A, B, and C as inputs and columns D, E, F, and G as outputs. The table has shape (1000, 7). I would like to train a model, then validate and test it.

My data:

A = [100, 120, 140, 160, 180, 200, 220, 240, 260, 280];
B = [300, 320, 340, 360, 380, 400, 420, 440, 460, 480];
C = [500, 520, 540, 560, 580, 600, 620, 640, 660, 680]; 

My desired outcome:

For each combination of A, B, C --> I get D, E, F, G as outputs (for example):

D = 2.846485609 
E = 5.06656901
F = 3.255358183
G = 5.464482379

Also, for each different combination of A, B, C, I have a different set of outputs (D, E, F, G).

My question: Is it possible to train a neural network on this data and then use the trained network to predict new values of D, E, F, G for new combinations of A, B, C?

  • just use multiple nodes in the final layer. Don't put any squashing function. – rawwar Sep 24 '19 at 05:42
  • I am very new to this concept. I am trying to understand what every function does. I will try to figure out these things in more detail. Thank you for the suggestion! – Mitra Lanka Sep 25 '19 at 01:01

1 Answer


The problem falls into the multivariate regression category since the outputs are continuous values. Therefore, you can train a neural network (NN) with 4 output nodes and an input feature vector of size 3. A sample NN model with one hidden layer, using TensorFlow, is as follows:

import itertools
import numpy as np
from sklearn.preprocessing import StandardScaler
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

A = [100, 120, 140, 160, 180, 200, 220, 240, 260, 280]
B = [300, 320, 340, 360, 380, 400, 420, 440, 460, 680]
C = [500, 520, 540, 560, 580, 600, 620, 640, 660, 680]

# All 10*10*10 = 1000 combinations of A, B, C -> shape (1000, 3)
X_train = np.array(list(itertools.product(A, B, C)))

# Standardize the inputs; keep the fitted scaler to transform test data later
scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)

# Dummy targets of shape (1000, 4); replace with your real D, E, F, G columns
Y_train = np.random.randint(0, 100, size=(1000, 4)).astype(float)

# Dummy test inputs; real test data must be scaled with the same fitted scaler
X_test = np.random.random(size=(100, 3))
X_test_scaled = scaler.transform(X_test)
Y_test = np.random.randint(0, 100, size=(100, 4)).astype(float)

# One hidden layer; the output layer has 4 linear units, one per target (D, E, F, G)
inputs = Input(shape=(3,))
hidden_layer_1 = Dense(25, activation='relu')(inputs)
outputs = Dense(4)(hidden_layer_1)

model = Model(inputs=inputs, outputs=outputs)
model.compile(
    optimizer='adam',
    loss='mean_squared_error'
)

# Train on the scaled inputs
history = model.fit(X_train_scaled, Y_train, epochs=1000, batch_size=8)

result = model.predict(X_test_scaled)
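
Since the question also mentions validating and testing, here is a minimal sketch of one way to hold out data. It continues from the snippet above (reusing X_train_scaled, Y_train, and model); using scikit-learn's train_test_split and Keras' validation_split argument is my assumption, not something prescribed by the original answer.

from sklearn.model_selection import train_test_split

# Hold out 10% of the scaled data as a final test set
X_tr, X_te, Y_tr, Y_te = train_test_split(
    X_train_scaled, Y_train, test_size=0.1, random_state=42)

# Use 20% of the remaining data for validation during training
history = model.fit(X_tr, Y_tr, validation_split=0.2, epochs=1000, batch_size=8)

# Evaluate on the held-out test set
test_loss = model.evaluate(X_te, Y_te)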
  • Hi, I got the results by running this code and was able to compute the loss, accuracy, etc. However, every time I run this code I get a new result. Is it possible to get the same result every time? – Mitra Lanka Oct 04 '19 at 17:36
  • Have you trained with your dataset? In this code I have used random dummy outputs/labels. You may need to fine-tune the model by adding more hidden layers or increasing the number of nodes in `hidden_layer_1`. – Kaushik Roy Oct 04 '19 at 17:46
  • Yes, I have used my dataset... and also increased the number of hidden layers to 4. – Mitra Lanka Oct 04 '19 at 17:51
  • The dataset which I have used is not random at all. – Mitra Lanka Oct 04 '19 at 17:52
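
Regarding getting the same result on every run: a minimal sketch of seeding the random number generators before building and training the model, assuming TensorFlow 2.x (older 1.x versions use tf.set_random_seed instead). Fixing the seeds makes weight initialization and shuffling repeatable, though fully deterministic results can also depend on threading and GPU ops.

import random
import numpy as np
import tensorflow as tf

# Fix the seeds before creating the model so each run starts from the same state
random.seed(0)
np.random.seed(0)
tf.random.set_seed(0)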