ReLU
2026/3/9
Problem Statement
Implement a program that performs the Rectified Linear Unit (ReLU) activation function on a vector of 32-bit floating point numbers. The ReLU function sets all negative values to zero and leaves positive values unchanged:
ReLU(x) = max(0, x)
Implementation Requirements
- External libraries are not permitted
- The solve function signature must remain unchanged
- The final result must be stored in output
Examples
Example 1:
Input: input = [-2.0, -1.0, 0.0, 1.0, 2.0]
Output: output = [0.0, 0.0, 0.0, 1.0, 2.0]
Example 2:
Input: input = [-3.5, 0.0, 4.2]
Output: output = [0.0, 0.0, 4.2]
Constraints
- 1 ≤ N ≤ 100,000,000
- Performance is measured with N = 25,000,000
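The required `solve` signature is not reproduced in this statement, so the following is only a minimal CPU reference sketch, assuming a pointer-and-length signature; the actual submission interface may differ.

```cpp
#include <cstddef>

// Hypothetical signature (the real one is not shown in the statement):
// applies ReLU element-wise, writing max(0, input[i]) into output[i].
void solve(const float* input, float* output, size_t n) {
    for (size_t i = 0; i < n; ++i) {
        // Negative values become 0.0f; zero and positive values pass through.
        output[i] = input[i] > 0.0f ? input[i] : 0.0f;
    }
}
```

Element-wise ReLU is memory-bound at N = 25,000,000, so a correct single pass over the data with no extra allocations is the main performance concern; the branch above typically compiles to a branch-free select.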