



Examples and Intuitions I — Neural Networks as Logical Gates

A simple application of neural networks is computing logical operations such as AND and OR. By choosing appropriate weights and a bias, a single logistic neuron can simulate these gates. This illustrates the power of neural networks: stacking simple units lets them represent much more complex functions.

Written by Hitesh Sahu, a passionate developer and blogger.

Fri Feb 27 2026



Example 1: Implementing the AND Operator

A simple example of applying neural networks is predicting:

$$x_1 \land x_2$$

The logical AND operator is true only when:

  • $x_1 = 1$
  • $x_2 = 1$

Otherwise, it is false.


Network Structure

Our small neural network looks like:

$$\begin{bmatrix} x_0 \\ x_1 \\ x_2 \end{bmatrix} \rightarrow g(z^{(2)}) \rightarrow h_\Theta(x)$$

Remember:

$$x_0 = 1$$

This is the bias unit.


Choosing the Weights

Let us define the weight matrix:

$$\Theta^{(1)} = \begin{bmatrix} -30 & 20 & 20 \end{bmatrix}$$

The hypothesis becomes:

$$h_\Theta(x) = g(-30 + 20x_1 + 20x_2)$$

Evaluating All Input Combinations

Case 1

$$x_1 = 0, \quad x_2 = 0 \quad\Rightarrow\quad g(-30) \approx 0$$

Case 2

$$x_1 = 0, \quad x_2 = 1 \quad\Rightarrow\quad g(-10) \approx 0$$

Case 3

$$x_1 = 1, \quad x_2 = 0 \quad\Rightarrow\quad g(-10) \approx 0$$

Case 4

$$x_1 = 1, \quad x_2 = 1 \quad\Rightarrow\quad g(10) \approx 1$$

Conclusion

With this choice of weights:

$$\Theta^{(1)} = \begin{bmatrix} -30 & 20 & 20 \end{bmatrix}$$

the neural network behaves exactly like an AND gate.

We constructed a fundamental logical operation using a logistic neuron.
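The four cases above can be checked numerically. Here is a minimal sketch using the standard sigmoid; the function names are illustrative, not from the text:

```python
import math

def sigmoid(z):
    # Logistic activation: g(z) = 1 / (1 + e^{-z})
    return 1.0 / (1.0 + math.exp(-z))

def and_neuron(x1, x2):
    # Theta = [-30, 20, 20]; the bias unit x0 = 1 contributes the -30 term
    z = -30 + 20 * x1 + 20 * x2
    return sigmoid(z)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(and_neuron(x1, x2)))
# 0 0 0
# 0 1 0
# 1 0 0
# 1 1 1
```

Only the (1, 1) input pushes $z$ above zero, so only that case rounds to 1, matching the AND truth table.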


Example 2: Implementing the OR Operator

The logical OR operator is true when:

  • $x_1 = 1$, or
  • $x_2 = 1$, or both

We can implement OR using a different set of weights:

$$\Theta^{(1)} = \begin{bmatrix} -10 & 20 & 20 \end{bmatrix}$$

The hypothesis becomes:

$$h_\Theta(x) = g(-10 + 20x_1 + 20x_2)$$

Evaluating OR

Case 1

$$x_1 = 0, \quad x_2 = 0 \quad\Rightarrow\quad g(-10) \approx 0$$

Case 2

$$x_1 = 0, \quad x_2 = 1 \quad\Rightarrow\quad g(10) \approx 1$$

Case 3

$$x_1 = 1, \quad x_2 = 0 \quad\Rightarrow\quad g(10) \approx 1$$

Case 4

$$x_1 = 1, \quad x_2 = 1 \quad\Rightarrow\quad g(30) \approx 1$$
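The OR cases can be verified the same way; only the bias changes from $-30$ to $-10$. Again, the function names are illustrative:

```python
import math

def sigmoid(z):
    # Logistic activation: g(z) = 1 / (1 + e^{-z})
    return 1.0 / (1.0 + math.exp(-z))

def or_neuron(x1, x2):
    # Theta = [-10, 20, 20]; a single positive input is now enough
    # to make z = +10, pushing the sigmoid toward 1
    z = -10 + 20 * x1 + 20 * x2
    return sigmoid(z)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(or_neuron(x1, x2)))
# 0 0 0
# 0 1 1
# 1 0 1
# 1 1 1
```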

Key Intuition

A single logistic neuron can simulate logical gates.

By adjusting:

  • Bias (threshold)
  • Weights (importance of inputs)

we can model:

  • AND
  • OR
  • NAND
  • NOR

Neural networks are powerful because stacking these simple units allows us to represent much more complex functions.
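All four gates fit one generic logistic neuron, differing only in their weight vectors. The AND and OR weights are the ones derived above; the NAND and NOR weights are one common choice (negating the AND and OR weights, respectively), not given in the text, so treat them as an assumption:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def gate(weights, x1, x2):
    # weights = [bias, w1, w2]; the bias is the weight on x0 = 1
    b, w1, w2 = weights
    return round(sigmoid(b + w1 * x1 + w2 * x2))

GATES = {
    "AND":  [-30,  20,  20],
    "OR":   [-10,  20,  20],
    "NAND": [ 30, -20, -20],  # assumed: AND weights negated
    "NOR":  [ 10, -20, -20],  # assumed: OR weights negated
}

for name, w in GATES.items():
    table = [gate(w, x1, x2) for x1 in (0, 1) for x2 in (0, 1)]
    print(f"{name:4} {table}")
# AND  [0, 0, 0, 1]
# OR   [0, 1, 1, 1]
# NAND [1, 1, 1, 0]
# NOR  [1, 0, 0, 0]
```

Negating a neuron's weights flips its output for every input, which is exactly what turns AND into NAND and OR into NOR.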
