
I am using the following perceptron formula $\text{step}\left(\sum(w_ix_i)-\theta \right)$.

Is $\theta$ supposed to be updated in a perceptron, like the weights $w_i$? If so, what is the formula for this?

I'm trying to make the perceptron learn AND and OR, but without updating $\theta$, I don't see how it can learn the case where both inputs are $0$. In that case the weighted sum is zero regardless of the weights, so the output is $\text{step}(-\theta)$, meaning $\theta$ (which is initialized randomly) alone determines the output.

nbro
Mr. Eivind

1 Answer


I found the answer to my question.

Treat $\theta$ as a normal weight, associated with an input that always equals $-1$. Since $\sum(w_ix_i) - \theta = \sum(w_ix_i) + \theta \cdot (-1)$, appending a constant $-1$ input and making $\theta$ its weight means $\theta$ is updated by the same perceptron learning rule as every other weight.
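Here is a minimal sketch of this bias trick in Python/NumPy, assuming the standard perceptron update rule $w \leftarrow w + \eta\,(t - y)\,x$ (the learning rate `lr`, epoch count, and random initialization range are illustrative choices, not from the question):

```python
import numpy as np

def step(z):
    """Heaviside step: 1 if z >= 0, else 0."""
    return 1 if z >= 0 else 0

def train_perceptron(samples, labels, lr=0.1, epochs=100, seed=0):
    # Three weights: w[0], w[1] for the two inputs, and w[2] playing
    # the role of theta, whose "input" is fixed at -1.
    rng = np.random.default_rng(seed)
    w = rng.uniform(-1.0, 1.0, size=3)
    for _ in range(epochs):
        for x, t in zip(samples, labels):
            xi = np.append(x, -1.0)          # append the constant -1 bias input
            y = step(np.dot(w, xi))          # step(w1*x1 + w2*x2 - theta)
            w += lr * (t - y) * xi           # theta is updated like any weight
    return w

def predict(w, x):
    return step(np.dot(w, np.append(x, -1.0)))

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
and_w = train_perceptron(X, [0, 0, 0, 1])
or_w = train_perceptron(X, [0, 1, 1, 1])
```

Because AND and OR are linearly separable, the perceptron convergence theorem guarantees this training loop finds weights (including the learned $\theta$) that classify all four input pairs correctly, including $(0, 0)$.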
