
I'm trying to write my first simple neural net with my own data, based on various tutorials and information. I'm stuck at the point where I think the model is prepared and I try to run it, but when I print the cost function at every epoch, it returns NaN.

My code is:

import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf

df = pd.read_excel("mydataset.xlsx")

# Preparing the dataset: drop rows with a missing 'wl' value and shuffle
df2 = df.dropna(subset=['wl'])
df2 = df2.sample(frac=1)
df2_X = df2[['param1','param2','param3','param4','param5','param6','param7']]
df2_y = df2[['numerical_result_param']]

# Splitting the dataset into train and test sets
train_X, test_X, train_y, test_y = df2_X[:210], df2_X[210:], df2_y[:210], df2_y[210:]

# Creating model:
X = tf.placeholder("float", shape=[None, train_X.shape[1]])
y = tf.placeholder("float", shape=[None, train_y.shape[1]])

hl_size = 256 # Number of neurons in hidden layer

weights = {
    'hl': tf.Variable(tf.random_normal([train_X.shape[1], hl_size])),
    'out': tf.Variable(tf.random_normal([hl_size, train_y.shape[1]]))
}

biases = {
    'hl': tf.Variable(tf.random_normal([hl_size])),
    'out': tf.Variable(tf.random_normal([train_y.shape[1]]))
}

# Model: one ReLU hidden layer followed by a linear output layer
def multilayer_perceptron(x):
    hl_layer = tf.add(tf.matmul(x, weights['hl']), biases['hl'])
    hl_layer = tf.nn.relu(hl_layer)
    out_layer = tf.matmul(hl_layer, weights['out']) + biases['out']
    return out_layer

logits = multilayer_perceptron(X)

hm_epochs = 100 # Number of epochs

cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=logits))
optimizer = tf.train.AdamOptimizer(0.01).minimize(cost) # Training optimizer

# Running the session
with tf.Session() as sess:
    init = tf.global_variables_initializer()
    sess.run(init)

    for epoch in range(hm_epochs):
        epoch_loss = 0
        _, c = sess.run([optimizer, cost], feed_dict={X: train_X, y: train_y})
        epoch_loss += c
        print('Epoch',epoch,'out of',hm_epochs,'loss:',epoch_loss)

And it returns:

Epoch 0 out of 100 loss: nan
Epoch 1 out of 100 loss: nan

etc.
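
In case it helps, this is the kind of sanity check I could run on the inputs before feeding them (just a sketch using the dataframes defined above; I'm not sure it's the right place to look):

# Sanity check (sketch): verify that nothing fed into the network is NaN or infinite
print(df2_X.isna().sum())                 # per-column count of missing feature values
print(df2_y.isna().sum())                 # count of missing target values
print(np.isfinite(train_X.values).all())  # True only if every training feature is finite
print(np.isfinite(train_y.values).all())  # True only if every training target is finite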

I'd appreciate any help and ideas about what I did wrong!
