AI-DeepLearning Index
📚 13 Posts
🕒 Last Updated: Fri Feb 27 2026
This folder contains posts related to AI and deep learning.
| # | Blog Link | Date | Excerpt | Tags |
|---|---|---|---|---|
| 1 | AI-DeepLearning Index | Fri Feb 27 2026 | Index of AI-DeepLearning posts (generated from Git) | |
| 2 | Neural Networks Introduction | Fri Feb 27 2026 | A concise cheat sheet covering core concepts, dimensions, activation functions, forward propagation, the cost function, backpropagation, gradient checking, random initialization, the training pipeline, and key intuitions for neural networks. | Data Science, Machine Learning, Deep Learning, Neural Networks, Artificial Intelligence, Computational Graphs |
| 3 | Deep Learning Path 🤖 | Fri Feb 27 2026 | A comprehensive learning path for deep learning, covering foundational concepts, optimization techniques, project structuring, convolutional neural networks, and sequence models. This guide provides a structured approach to mastering deep learning through the Deep Learning Specialization (DLS). | Data Science, Machine Learning, Deep Learning, Neural Networks, Artificial Intelligence, Computational Graphs |
| 4 | Neural Network Hypothesis and Intuition | Fri Feb 27 2026 | Explore the hypothesis and intuition behind neural networks, including their structure, activation functions, and how they process inputs to produce outputs. | Data Science, Machine Learning, Deep Learning, Neural Networks, Artificial Intelligence, Computational Graphs |
| 5 | Vectorized Neural Networks Model Representation | Fri Feb 27 2026 | Learn how to represent neural networks in vectorized form, transforming scalar equations into efficient matrix operations for scalable, optimized computation. | Data Science, Machine Learning, Deep Learning, Neural Networks, Artificial Intelligence, Computational Graphs, Vectorization, Matrix Operations |
| 6 | Examples and Intuitions I — Neural Networks as Logical Gates | Fri Feb 27 2026 | A simple application of neural networks is computing logical operations such as AND and OR. By choosing appropriate weights and biases, a single logistic neuron can simulate these gates, illustrating how neural networks represent complex functions by stacking simple units. | Data Science, Machine Learning, Deep Learning, Neural Networks, Artificial Intelligence, Computational Graphs |
| 7 | Examples and Intuitions II — Building XNOR with a Hidden Layer | Fri Feb 27 2026 | The previous post showed how to implement basic logical gates (AND, OR, NOR) with single neurons. Some functions, like XOR and XNOR, cannot be represented by a single neuron; this post shows how adding a hidden layer makes it possible to model XNOR. | Data Science, Machine Learning, Deep Learning, Neural Networks, Artificial Intelligence, Computational Graphs |
| 8 | Multiclass Classification with Neural Networks | Fri Feb 27 2026 | Learn how to extend binary classification to multiclass classification using neural networks, where the output layer consists of multiple units representing different classes and the final prediction is the class with the highest output value. | Data Science, Machine Learning, Deep Learning, Neural Networks, Artificial Intelligence, Computational Graphs |
| 9 | Backpropagation Algorithm | Fri Feb 27 2026 | Backpropagation is the algorithm used to minimize the neural network cost function. It computes the gradients of the cost function with respect to the parameters, allowing us to perform gradient descent and update the model. | Data Science, Machine Learning, Deep Learning, Neural Networks, Artificial Intelligence, Computational Graphs |
| 10 | Backpropagation Intuition | Fri Feb 27 2026 | Backpropagation computes the gradients of the cost function with respect to the parameters of a neural network. This post builds an intuitive understanding of how backpropagation works and why it is essential for training deep learning models. | Data Science, Machine Learning, Deep Learning, Neural Networks, Artificial Intelligence, Computational Graphs |
| 11 | Cost Function for Neural Networks | Fri Feb 27 2026 | The cost function for neural networks generalizes the logistic regression cost to multiple output units and adds regularization over all weights in the network. This post breaks down the double and triple summations and builds intuition for how the cost function works. | Data Science, Machine Learning, Deep Learning, Neural Networks, Artificial Intelligence, Computational Graphs |
| 12 | Gradient Checking and Random Initialization | Fri Feb 27 2026 | Gradient checking is a technique for verifying the correctness of a backpropagation implementation. Random initialization is crucial for breaking symmetry so the network can learn effectively. | Data Science, Machine Learning, Deep Learning, Neural Networks, Artificial Intelligence, Computational Graphs |
| 13 | Putting It Together — Training a Neural Network | Fri Feb 27 2026 | This post brings together everything covered so far to show how to train a neural network end to end: the cost function, backpropagation, gradient checking, and random initialization, along with the key intuition behind each step. | Data Science, Machine Learning, Deep Learning, Neural Networks, Artificial Intelligence, Computational Graphs |
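As a taste of the "neural networks as logical gates" posts above, here is a minimal sketch of a single logistic neuron simulating the AND gate. The specific weights (20, 20) and bias (−30) are one common illustrative choice, not taken from the posts themselves: they push the sigmoid's input far negative unless both inputs are 1.

```python
import math

def logistic_neuron(x1, x2, w1=20.0, w2=20.0, b=-30.0):
    """Sigmoid of a weighted sum. With these (illustrative) weights,
    the output is close to 1 only when both binary inputs are 1,
    so the neuron approximates logical AND."""
    z = b + w1 * x1 + w2 * x2
    return 1.0 / (1.0 + math.exp(-z))

# Truth table for AND: only (1, 1) activates the neuron.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, round(logistic_neuron(x1, x2)))
```

Running this prints `0` for the first three input pairs and `1` for `(1, 1)`, matching the AND truth table.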
