Categorical Cross Entropy Loss
2026/3/9
Problem Statement
Implement batched categorical cross-entropy using the numerically stable, logits-based formulation. Given an N×C logits matrix and N true labels, compute the mean loss:
L_j = log( Σ_k e^{z_{j,k}} ) − z_{j,y_j},  L = (1/N) Σ_j L_j
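Evaluated naively, the log-sum-exp term overflows once any logit is large (e^z exceeds float range around z ≈ 710). The standard stabilization subtracts the row maximum m = max_k z_{j,k}, using the identity log Σ_k e^{z_k} = m + log Σ_k e^{z_k − m}. A minimal sketch in pure Python (the helper name `logsumexp` is ours, not part of the problem):

```python
import math

def logsumexp(row):
    # Shift by the row maximum so every exponent is <= 0,
    # which prevents overflow for arbitrarily large logits.
    m = max(row)
    return m + math.log(sum(math.exp(z - m) for z in row))
```

For logits bounded by ±10 (as in the constraints) the naive form would also work, but the shifted form is the conventional "stable" version and costs essentially nothing extra.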
Implementation Requirements
- External libraries are not permitted
- The solve function signature must remain unchanged
- The final result (average loss) must be stored in `loss`
Examples
Example 1: N=2, C=3 → loss ≈ 0.3548926
Example 2: N=3, C=4 → loss ≈ 0.98820376

Constraints
- 1 ≤ N ≤ 10,000
- 2 ≤ C ≤ 1,000
- -10.0 ≤ logits[i,j] ≤ 10.0
- 0 ≤ true_labels[i] < C
- Performance is measured with N = 10,000
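Putting the pieces together, one possible shape for `solve` is sketched below. The exact signature is not shown in the statement, so this assumes `logits` arrives as a list of N rows of C floats and `true_labels` as a list of N ints; adapt the parameter handling to the actual stub.

```python
import math

def solve(logits, true_labels):
    # logits: N x C list of lists; true_labels: N ints in [0, C-1].
    n = len(logits)
    total = 0.0
    for row, y in zip(logits, true_labels):
        m = max(row)  # row maximum, subtracted for numerical stability
        lse = m + math.log(sum(math.exp(z - m) for z in row))
        total += lse - row[y]  # L_j = logsumexp(z_j) - z_{j, y_j}
    loss = total / n  # mean over the batch, stored in `loss` as required
    return loss
```

A single pass over the N×C matrix is O(N·C), which is comfortably fast for N = 10,000 and C ≤ 1,000 even in pure Python.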