I'm very new to Python; this is actually the first thing I've ever written, so I'd be grateful if someone could explain this to me. I followed a tutorial and built a simple artificial neural network using TensorFlow. I used the PyCharm Community Edition to do this.
import numpy as np
import pandas as pd
import tensorflow as tf
from sklearn.preprocessing import LabelEncoder, OneHotEncoder, StandardScaler
from sklearn.compose import ColumnTransformer
from sklearn.model_selection import train_test_split
dataset = pd.read_csv('file.csv')
X = dataset.iloc[:, 3:-1].values  # features: all columns from the 4th up to (not including) the last
y = dataset.iloc[:, -1].values    # target: the last column

# label-encode the categorical column at index 2
le = LabelEncoder()
X[:, 2] = le.fit_transform(X[:, 2])

# one-hot encode the categorical column at index 1, keep the rest unchanged
ct = ColumnTransformer(transformers=[('encoder', OneHotEncoder(), [1])], remainder='passthrough')
X = np.array(ct.fit_transform(X))

# 80/20 train/test split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# standardize the features; the scaler is fitted on the training set only
sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)

# build and train the network
ann = tf.keras.models.Sequential()
ann.add(tf.keras.layers.Dense(units=6, activation='relu'))
ann.add(tf.keras.layers.Dense(units=6, activation='relu'))
ann.add(tf.keras.layers.Dense(units=1, activation='sigmoid'))
ann.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
ann.fit(X_train, y_train, batch_size=32, epochs=100)
Now I'd like to predict results using this ANN. My question is: can I execute the line below on its own, without putting it at the end of the script and re-running everything from the beginning?
ann.predict(sc.transform([[0, 1, 0, 0, 0, 60, 1, 100, 1, 0, 0, 10000]]))
From what I understand, every time I run the script a new neural network is created and then trained. I'd like to skip that whole process: train the network only once, and then use it to make predictions without re-running everything from the beginning. Is this possible? I tried executing the predict line in the Python console, but it didn't work (I guess because the program had already stopped running by then).
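From skimming the TensorFlow and scikit-learn docs, I'm guessing the answer involves `model.save` / `tf.keras.models.load_model` for the network and `joblib` for the fitted scaler, something like the sketch below. I'm not sure this is the recommended way, though. (The file names `my_model.keras` and `my_scaler.joblib` are made up, and I'm using a tiny throwaway model on random placeholder data here instead of my real dataset.)

```python
import numpy as np
import tensorflow as tf
import joblib
from sklearn.preprocessing import StandardScaler

# placeholder data standing in for my real 12-feature dataset
rng = np.random.default_rng(0)
X_train = rng.random((20, 12))
y_train = rng.integers(0, 2, 20)

# fit the scaler and train a tiny stand-in network
sc = StandardScaler()
X_train = sc.fit_transform(X_train)

ann = tf.keras.models.Sequential([
    tf.keras.layers.Dense(units=6, activation='relu'),
    tf.keras.layers.Dense(units=1, activation='sigmoid'),
])
ann.compile(optimizer='adam', loss='binary_crossentropy')
ann.fit(X_train, y_train, epochs=1, verbose=0)

# save both the trained network AND the fitted scaler to disk ...
ann.save('my_model.keras')
joblib.dump(sc, 'my_scaler.joblib')

# ... then, in a later session, load them back and predict without retraining
ann2 = tf.keras.models.load_model('my_model.keras')
sc2 = joblib.load('my_scaler.joblib')
pred = ann2.predict(sc2.transform([[0, 1, 0, 0, 0, 60, 1, 100, 1, 0, 0, 10000]]))
```

Is saving the scaler alongside the model like this actually necessary, or is there a simpler way?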