0 votes
323 views
in Technique by (71.8m points)

python 3.x - Negative infinity or positive infinity in loss function in regression

My loss function is evaluating to positive or negative infinity. My true label y is either 0 or 1.

import numpy as np

w = np.random.randn(6)
x = np.random.randn(6)
y = 1  # true label, either 0 or 1
b = 1
z = np.dot(w, x) + b
# intended to be the sigmoid of z
a = 1/1+np.exp(-z)
loss_function = -(np.dot(a,np.log(y) + np.dot((1-a),np.log(1-y))))

Whatever value I use for y, the loss comes out as

-inf

or inf.

Can someone explain this to me?



1 Answer

0 votes
by (71.8m points)

You are applying the cross-entropy loss formula backwards; that is why you get infinity.

Since your true label y is either 0 or 1, np.log(y) is either 0 or -inf, so putting y inside the log will never work. It is the prediction a that belongs inside the log:

loss_function = -(np.dot(y, np.log(a)) + np.dot(1 - y, np.log(1 - a)))
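
As an aside, the question's sigmoid line has a second, unrelated bug: because of Python operator precedence, 1/1+np.exp(-z) computes 1 + np.exp(-z), not the sigmoid; the denominator needs parentheses. Here is a minimal runnable sketch putting both fixes together. The fixed random seed and the clipping epsilon are assumptions added here for reproducibility and numerical safety; they are not part of the original posts.

import numpy as np

np.random.seed(0)          # assumption: fixed seed for a reproducible example

w = np.random.randn(6)     # weights
x = np.random.randn(6)     # one input sample
b = 1                      # bias
y = 1                      # true label, must be 0 or 1

z = np.dot(w, x) + b
a = 1 / (1 + np.exp(-z))   # sigmoid, note the parentheses around the denominator

# binary cross-entropy; y and a are scalars here, so plain * works
eps = 1e-12                # assumption: clip to avoid log(0) when a saturates
a = np.clip(a, eps, 1 - eps)
loss = -(y * np.log(a) + (1 - y) * np.log(1 - a))
print(loss)                # a finite, non-negative number

With y restricted to {0, 1} and a clipped strictly inside (0, 1), both logarithms stay finite, so the loss can no longer evaluate to inf or -inf.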
