Overview

This repository contains a Jupyter Notebook demonstrating the approximation of continuous functions using Rectified Linear Units (ReLUs).
The notebook explores how ReLU activation functions, the piecewise-linear building blocks commonly used in neural networks, can be stacked to approximate continuous mathematical functions.
It includes detailed visualizations and code examples that illustrate the idea.
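
As a rough illustration of the idea (a minimal sketch, not code from the notebook; the helper names `relu` and `relu_approximation` and the target function `sin` are assumptions made here), a continuous function can be approximated by a constant plus a weighted sum of shifted ReLUs, one per knot of a piecewise-linear interpolant:

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: max(0, x), applied elementwise."""
    return np.maximum(0.0, x)

def relu_approximation(f, a, b, n_pieces):
    """Approximate f on [a, b] with a weighted sum of shifted ReLUs.

    The result is the piecewise-linear interpolant of f at n_pieces + 1
    uniformly spaced knots, written in the basis {1, relu(x - k_i)}.
    """
    knots = np.linspace(a, b, n_pieces + 1)
    values = f(knots)
    slopes = np.diff(values) / np.diff(knots)   # slope of each linear piece
    weights = np.diff(slopes, prepend=0.0)      # change of slope at each left knot

    def approx(x):
        x = np.asarray(x, dtype=float)
        # Start from the value at the left endpoint, then add one ReLU per knot.
        return values[0] + sum(w * relu(x - k) for w, k in zip(weights, knots[:-1]))

    return approx

# Example: approximate sin on [0, 2*pi] with 8 linear pieces.
g = relu_approximation(np.sin, 0.0, 2 * np.pi, 8)
xs = np.linspace(0.0, 2 * np.pi, 200)
print("max error:", np.max(np.abs(np.sin(xs) - g(xs))))
```

Each ReLU term switches on at one knot and contributes the change in slope there, so the sum matches the target function exactly at the knots and linearly in between; adding more ReLUs (finer knots) drives the error toward zero for any continuous target.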

Example

The following images show function approximations using multiple stacked ReLUs over different intervals (a plotting sketch follows the panels below).

(Panels a–d: function approximations with stacked ReLUs over different intervals.)
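
To give a sense of how panels like a–d might be produced (again a hedged sketch rather than the notebook's code; the piece counts and the `stacked_relu` helper are assumptions, restated here so the cell runs on its own), one can plot the target function against approximations built from increasing numbers of ReLUs:

```python
import numpy as np
import matplotlib.pyplot as plt

def relu(x):
    return np.maximum(0.0, x)

def stacked_relu(f, a, b, n_pieces, x):
    """Evaluate a piecewise-linear ReLU interpolant of f at x."""
    knots = np.linspace(a, b, n_pieces + 1)
    slopes = np.diff(f(knots)) / np.diff(knots)
    weights = np.diff(slopes, prepend=0.0)
    return f(knots[0]) + sum(w * relu(x - k) for w, k in zip(weights, knots[:-1]))

xs = np.linspace(0.0, 2 * np.pi, 400)
fig, axes = plt.subplots(1, 4, figsize=(16, 3), sharey=True)
for ax, n in zip(axes, [2, 4, 8, 16]):
    ax.plot(xs, np.sin(xs), label="sin(x)")
    ax.plot(xs, stacked_relu(np.sin, 0.0, 2 * np.pi, n, xs), label=f"{n} ReLUs")
    ax.set_title(f"{n} pieces")
    ax.legend()
plt.tight_layout()
plt.show()
```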