How can I make a neural network in which some input variables are "more important" than others? For example, say my input layer has 2 neurons, and I want to stress that the first input is 70% important and the second only 30%, because although formula-wise they measure the same thing, the first one contributes more to the final outcome than the other. It's something like weighting samples, except I want to weight individual inputs globally.
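To illustrate, here is roughly what I have in mind (just a sketch in NumPy; the 0.7/0.3 `importance` vector is the hypothetical global weighting I'm describing, applied to the input columns before the network sees them):

```python
import numpy as np

# Hypothetical global importance weights for my 2 inputs
importance = np.array([0.7, 0.3])

# A batch of samples, shape (n_samples, 2)
X = np.array([[1.0, 1.0],
              [2.0, 4.0]])

# Scale each input column by its importance before feeding it to the network
X_weighted = X * importance
print(X_weighted)
```

Would scaling the inputs like this actually make the first feature "count more" during training, or does the network just learn to undo it in the first layer's weights?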
Is that even possible, and does it make sense?