Why is ReLU better and more often used than Sigmoid in Neural Networks? Explain with code.
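In short: ReLU (max(0, x)) keeps a gradient of exactly 1 for every active (positive) input, while sigmoid's gradient is at most 0.25 and shrinks toward 0 when the input is large in magnitude. Backpropagating through many sigmoid layers therefore multiplies many small factors together and the gradient vanishes, which slows or stalls training of deep networks; ReLU does not shrink the signal on active paths. ReLU is also cheaper to compute (a threshold instead of an exponential) and produces sparse activations. Below is a minimal sketch using NumPy; the framework choice and the specific sample inputs are assumptions, since the question does not specify them.

```python
# Minimal comparison of sigmoid vs. ReLU gradients (NumPy chosen as an assumption).
import numpy as np

def sigmoid(x):
    # Sigmoid squashes inputs into (0, 1); it saturates for large |x|.
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative s * (1 - s) peaks at 0.25 and approaches 0 in the saturated regions.
    s = sigmoid(x)
    return s * (1.0 - s)

def relu(x):
    # ReLU is a simple threshold: max(0, x). No exponential, so it is cheap.
    return np.maximum(0.0, x)

def relu_grad(x):
    # Gradient is 1 for positive inputs and 0 otherwise, so active units
    # pass the backpropagated signal through unchanged.
    return (x > 0).astype(float)

x = np.array([-10.0, -1.0, 0.5, 5.0, 10.0])
print("sigmoid grad:", sigmoid_grad(x))  # near zero for |x| >= 5 (vanishing gradient)
print("relu grad:   ", relu_grad(x))     # exactly 1 wherever the unit is active

# Stacking layers multiplies per-layer gradient factors: with sigmoid each factor
# is at most 0.25, so the signal reaching early layers decays roughly as 0.25**depth.
depth = 20
print("upper bound on 20-layer sigmoid gradient factor:", 0.25 ** depth)
print("20-layer ReLU gradient factor on an active path: ", 1.0 ** depth)
```

Running this shows sigmoid gradients on the order of 1e-5 for inputs of magnitude 10 and a 20-layer upper bound of about 9e-13, versus a constant factor of 1 for ReLU on active units. One caveat worth noting: ReLU units whose inputs stay negative get zero gradient ("dying ReLU"), which is why variants such as Leaky ReLU exist; sigmoid still sees use in output layers for binary classification.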