Problem description:
After INT8 quantization with deploy_backend set to tensorrt, the model was exported to ONNX and then converted to TensorRT, and the engine build prints a large number of "Missing scale and zero-point for tensor" warnings.
More information:
Link: https://pan.baidu.com/s/1qiNlDMr0Haq3RUWTFhqe9g?pwd=585d
Extraction code: 585d
The model is PaddleSeg's RTFormer, quantized with QAT (quantization-aware training).
paddle2onnx command:
paddle2onnx --model_dir=./output_quant_for_trt/ --model_filename=model.pdmodel --params=model.pdiparams --save_file=./output_quant_for_trt/model.onnx --opset_version=16 --enable_onnx_checker=True --deploy_backend tensorrt --save_calibration_file ./output_quant_for_trt/calibration.cache
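The TensorRT build step itself is not shown in this report. For context, below is a minimal sketch of how an INT8 engine might be built from the exported model.onnx together with the calibration.cache written by paddle2onnx; the CacheOnlyCalibrator helper and the output engine path are illustrative assumptions, not taken from the issue. The warnings in the log are emitted during this build whenever TensorRT cannot find an INT8 scale for a tensor and falls back to a higher-precision implementation for the layers that touch it.

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)


class CacheOnlyCalibrator(trt.IInt8EntropyCalibrator2):
    """Calibrator that only replays an existing calibration cache (no new data)."""

    def __init__(self, cache_path):
        super().__init__()
        self.cache_path = cache_path

    def get_batch_size(self):
        return 1

    def get_batch(self, names):
        # Returning None tells TensorRT there is no calibration data to run;
        # all scales are expected to come from the cache file.
        return None

    def read_calibration_cache(self):
        with open(self.cache_path, "rb") as f:
            return f.read()

    def write_calibration_cache(self, cache):
        pass


def build_int8_engine(onnx_path, cache_path, engine_path):
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)

    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("Failed to parse ONNX model")

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.INT8)
    # Allow FP16 fallback for layers/tensors that have no INT8 scale.
    config.set_flag(trt.BuilderFlag.FP16)
    config.int8_calibrator = CacheOnlyCalibrator(cache_path)

    serialized = builder.build_serialized_network(network, config)
    if serialized is None:
        raise RuntimeError("Engine build failed")
    with open(engine_path, "wb") as f:
        f.write(serialized)


build_int8_engine("./output_quant_for_trt/model.onnx",
                  "./output_quant_for_trt/calibration.cache",
                  "./output_quant_for_trt/model.engine")
```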
Error message:
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_0.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_0.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_1.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_1.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_2.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_3.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.5, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_4.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_5.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.7, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_6.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_7.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_8.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.9, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_9.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_10.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.11, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_11.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_12.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_13.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.13, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_14.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_15.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.15, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_16.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor bilinear_interp_v2_0.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.17, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_17.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_18.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.19, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor relu_16.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor relu_17.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor relu_18.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_20.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_20.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_2.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 75) [Softmax]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor softmax_0.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor ReduceSum.1, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Div.1, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_3.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_21.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.21, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_22.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 84) [Constant]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 85) [Shuffle]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Div.3, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Erf.1, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 88) [Constant]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 89) [Shuffle]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.25, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Mul.1, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 92) [Constant]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 93) [Shuffle]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_23.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.29, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_22.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor bilinear_interp_v2_1.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.31, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_23.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor split_0.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor split_0.tmp_1, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor transpose_0.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_6.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_7.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_24.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_8.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_26.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_9.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 118) [Constant]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 119) [Shuffle]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Mul.4, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 121) [Softmax]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor softmax_1.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_10.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_27.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_11.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.33, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_28.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 129) [Constant]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 130) [Shuffle]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Div.5, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Erf.3, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 133) [Constant]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 134) [Shuffle]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.37, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Mul.6, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 137) [Constant]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 138) [Shuffle]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_29.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.41, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_26.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_27.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.43, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor relu_22.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor relu_23.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_28.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_32.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_14.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 154) [Softmax]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor softmax_2.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor ReduceSum.3, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Div.7, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_15.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_33.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.45, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_34.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 163) [Constant]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 164) [Shuffle]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Div.9, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Erf.5, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 167) [Constant]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 168) [Shuffle]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.49, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Mul.9, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 171) [Constant]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 172) [Shuffle]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_35.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.53, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_30.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor bilinear_interp_v2_2.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.55, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_31.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor split_1.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor split_1.tmp_1, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor transpose_1.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_18.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_19.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_32.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_20.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_38.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_21.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 197) [Constant]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 198) [Shuffle]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Mul.12, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 200) [Softmax]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor softmax_3.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_22.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_39.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor reshape2_23.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.57, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_40.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 208) [Constant]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 209) [Shuffle]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Div.11, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Erf.7, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 212) [Constant]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 213) [Shuffle]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.61, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Mul.14, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 216) [Constant]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 217) [Shuffle]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_41.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.65, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_34.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_35.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.67, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_36.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor pool2d_0.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_37.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor bilinear_interp_v2_3.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.69, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_38.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor pool2d_1.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_39.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor bilinear_interp_v2_4.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.71, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_40.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor pool2d_2.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_41.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor bilinear_interp_v2_5.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.73, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_42.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor pool2d_3.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_43.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor bilinear_interp_v2_6.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.75, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_44.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Concat.8, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_45.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_46.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Add.77, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor bilinear_interp_v2_7.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor Concat.11, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_47.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor batch_norm_48.tmp_2, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor conv2d_56.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor bilinear_interp_v2_8.tmp_0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 308) [TopK]_output, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor (Unnamed Layer* 308) [TopK]_output_1, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor ArgMax.1, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
[01/03/2025-05:34:56] [TRT] [W] Missing scale and zero-point for tensor save_infer_model/scale_0.tmp_0.0, expect fall back to non-int8 implementation for any layer consuming or producing given tensor
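All of these warnings say the same thing: the engine is built with INT8 enabled, but no scale/zero-point is available for the listed tensors, so TensorRT falls back to a non-INT8 implementation for every layer that consumes or produces them. As a rough illustration only (this assumes direct use of the TensorRT 8.x Python API rather than the Paddle Inference wrapper used here, and the fallback range of 2.0 is a made-up placeholder), missing ranges could in principle be filled in explicitly before building:

```python
# Minimal sketch, not the actual setup in this report: supply a symmetric
# dynamic range for any network tensor that has no calibration scale, so
# INT8 kernels are not skipped for the layers around it.
import tensorrt as trt

def set_missing_dynamic_ranges(network: trt.INetworkDefinition,
                               default_range: float = 2.0) -> None:
    """Assign a placeholder dynamic range (hypothetical value) to every
    layer output that currently has no scale/zero-point."""
    for i in range(network.num_layers):
        layer = network.get_layer(i)
        for j in range(layer.num_outputs):
            t = layer.get_output(j)
            if t.dynamic_range is None:  # no scale/zero-point recorded
                t.set_dynamic_range(-default_range, default_range)
```

In practice the real fix is to make the calibration step actually produce scales for these tensors; the hard-coded range above only shows where the missing information would plug in.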
Exported calibration.cache:
TRT-8XXX-EntropyCalibration2
x.quantized.dequantized: 40008102
x: 40008102
relu_9.tmp_0.quantized.dequantized.0: 3e06f7a3
relu_9.tmp_0.quantized.dequantized: 3e06f7a3
relu_9.tmp_0: 3e06f7a3
relu_8.tmp_0.quantized.dequantized: 3d3cd8d8
relu_8.tmp_0: 3d3cd8d8
relu_7.tmp_0.quantized.dequantized: 3db6ed15
relu_7.tmp_0: 3db6ed15
relu_6.tmp_0.quantized.dequantized: 3d6f53ad
relu_6.tmp_0: 3d6f53ad
relu_5.tmp_0.quantized.dequantized.0: 3e3d5019
relu_5.tmp_0.quantized.dequantized: 3e3d5019
relu_5.tmp_0: 3e3d5019
relu_4.tmp_0.quantized.dequantized: 3dbe8e70
relu_4.tmp_0: 3dbe8e70
relu_39.tmp_0.quantized.dequantized: 3da30a54
relu_39.tmp_0: 3da30a54
relu_38.tmp_0.quantized.dequantized: 3e24fd50
relu_38.tmp_0: 3e24fd50
relu_37.tmp_0.quantized.dequantized: 3dba6e2e
relu_37.tmp_0: 3dba6e2e
relu_36.tmp_0.quantized.dequantized: 3d6c3bb1
relu_36.tmp_0: 3d6c3bb1
relu_35.tmp_0.quantized.dequantized: 3d8375fb
relu_35.tmp_0: 3d8375fb
relu_34.tmp_0.quantized.dequantized: 3cee0f49
relu_34.tmp_0: 3cee0f49
relu_33.tmp_0.quantized.dequantized: 3d75091e
relu_33.tmp_0: 3d75091e
relu_32.tmp_0.quantized.dequantized: 3ce96e64
relu_32.tmp_0: 3ce96e64
relu_31.tmp_0.quantized.dequantized: 3da0d844
relu_31.tmp_0: 3da0d844
relu_30.tmp_0.quantized.dequantized: 3d5d1e2f
relu_30.tmp_0: 3d5d1e2f
relu_3.tmp_0.quantized.dequantized: 3e2e2a50
relu_3.tmp_0: 3e2e2a50
relu_29.tmp_0.quantized.dequantized: 3d8e8cef
relu_29.tmp_0: 3d8e8cef
relu_28.tmp_0.quantized.dequantized: 3d34d6e5
relu_28.tmp_0: 3d34d6e5
relu_27.tmp_0.quantized.dequantized: 3da917c3
relu_27.tmp_0: 3da917c3
relu_26.tmp_0.quantized.dequantized: 3e055356
relu_26.tmp_0: 3e055356
relu_25.tmp_0.quantized.dequantized: 3dd3d1c5
relu_25.tmp_0: 3dd3d1c5
relu_24.tmp_0.quantized.dequantized: 3de9ebfa
relu_24.tmp_0: 3de9ebfa
relu_21.tmp_0.quantized.dequantized: 3e31cd00
relu_21.tmp_0: 3e31cd00
relu_20.tmp_0.quantized.dequantized: 3e1bdd95
relu_20.tmp_0: 3e1bdd95
relu_2.tmp_0.quantized.dequantized: 3e04a01b
relu_2.tmp_0: 3e04a01b
relu_19.tmp_0.quantized.dequantized: 3cf084b5
relu_19.tmp_0: 3cf084b5
relu_15.tmp_0.quantized.dequantized: 3d54e7e6
relu_15.tmp_0: 3d54e7e6
relu_14.tmp_0.quantized.dequantized: 3e2c78a6
relu_14.tmp_0: 3e2c78a6
relu_13.tmp_0.quantized.dequantized: 3dac3acc
relu_13.tmp_0: 3dac3acc
relu_12.tmp_0.quantized.dequantized: 3d23f37b
relu_12.tmp_0: 3d23f37b
relu_11.tmp_0.quantized.dequantized: 3d6ff790
relu_11.tmp_0: 3d6ff790
relu_10.tmp_0.quantized.dequantized: 3d3fce83
relu_10.tmp_0: 3d3fce83
relu_1.tmp_0.quantized.dequantized: 3e26c865
relu_1.tmp_0: 3e26c865
relu_0.tmp_0.quantized.dequantized: 3e42f76a
relu_0.tmp_0: 3e42f76a
max_pool2d_with_index_1.tmp_0.quantized.dequantized: 3da0fb91
max_pool2d_with_index_1.tmp_0: 3da0fb91
max_pool2d_with_index_0.tmp_0.quantized.dequantized: 3cfc7d2f
max_pool2d_with_index_0.tmp_0: 3cfc7d2f
gelu_3.tmp_0.quantized.dequantized: 3df07eff
gelu_3.tmp_0: 3df07eff
gelu_2.tmp_0.quantized.dequantized: 3eab25ef
gelu_2.tmp_0: 3eab25ef
gelu_1.tmp_0.quantized.dequantized: 3ddf3ba1
gelu_1.tmp_0: 3ddf3ba1
gelu_0.tmp_0.quantized.dequantized: 3ec5ba52
gelu_0.tmp_0: 3ec5ba52
conv2d_9.tmp_0.tmp: 3ef6b973
conv2d_9.tmp_0: 3ef6b973
conv2d_8.tmp_0.tmp: 3e4498ce
conv2d_8.tmp_0: 3e4498ce
conv2d_7.tmp_0.tmp: 3e605dde
conv2d_7.tmp_0: 3e605dde
conv2d_6.tmp_0.tmp: 3f1dc67e
conv2d_6.tmp_0: 3f1dc67e
conv2d_56.tmp_1.tmp: 3eabf0a5
conv2d_56.tmp_1: 3eabf0a5
conv2d_55.tmp_0.tmp: 4007a1a0
conv2d_55.tmp_0: 4007a1a0
conv2d_54.tmp_0.tmp: 3e91fac6
conv2d_54.tmp_0: 3e91fac6
conv2d_53.tmp_0.tmp: 3e95402c
conv2d_53.tmp_0: 3e95402c
conv2d_52.tmp_0.tmp: 3f323ef3
conv2d_52.tmp_0: 3f323ef3
conv2d_51.tmp_0.tmp: 3ddfb51d
conv2d_51.tmp_0: 3ddfb51d
conv2d_50.tmp_0.tmp: 3f0989c8
conv2d_50.tmp_0: 3f0989c8
conv2d_5.tmp_0.tmp: 3e688fa3
conv2d_5.tmp_0: 3e688fa3
conv2d_49.tmp_0.tmp: 3da07903
conv2d_49.tmp_0: 3da07903
conv2d_48.tmp_0.tmp: 3f1bf9a8
conv2d_48.tmp_0: 3f1bf9a8
conv2d_47.tmp_0.tmp: 3e6aa4fc
conv2d_47.tmp_0: 3e6aa4fc
conv2d_46.tmp_0.tmp: 3f714f69
conv2d_46.tmp_0: 3f714f69
conv2d_45.tmp_0.tmp: 3e1485f2
conv2d_45.tmp_0: 3e1485f2
conv2d_44.tmp_0.tmp: 3e8f0dfa
conv2d_44.tmp_0: 3e8f0dfa
conv2d_43.tmp_0.tmp: 40427270
conv2d_43.tmp_0: 40427270
conv2d_42.tmp_0.tmp: 3f31707a
conv2d_42.tmp_0: 3f31707a
conv2d_41.tmp_1.tmp: 3e5baa22
conv2d_41.tmp_1: 3e5baa22
conv2d_40.tmp_1.tmp: 3eb15a38
conv2d_40.tmp_1: 3eb15a38
conv2d_4.tmp_0.tmp: 3ef83403
conv2d_4.tmp_0: 3ef83403
conv2d_37.tmp_0.tmp: 3e42f995
conv2d_37.tmp_0: 3e42f995
conv2d_36.tmp_0.tmp: 3e885d32
conv2d_36.tmp_0: 3e885d32
conv2d_35.tmp_1.tmp: 3f82e625
conv2d_35.tmp_1: 3f82e625
conv2d_34.tmp_1.tmp: 3f415c20
conv2d_34.tmp_1: 3f415c20
conv2d_31.tmp_0.tmp: 402dc4c6
conv2d_31.tmp_0: 402dc4c6
conv2d_30.tmp_0.tmp: 3f39c246
conv2d_30.tmp_0: 3f39c246
conv2d_3.tmp_0.tmp: 3ec4cf30
conv2d_3.tmp_0: 3ec4cf30
conv2d_29.tmp_1.tmp: 3e4b392b
conv2d_29.tmp_1: 3e4b392b
conv2d_28.tmp_1.tmp: 3e8991fa
conv2d_28.tmp_1: 3e8991fa
conv2d_25.tmp_0.tmp: 3e4c862c
conv2d_25.tmp_0: 3e4c862c
conv2d_24.tmp_0.tmp: 3d09cdf2
conv2d_24.tmp_0: 3d09cdf2
conv2d_23.tmp_1.tmp: 403cb514
conv2d_23.tmp_1: 403cb514
conv2d_22.tmp_1.tmp: 3f11b66d
conv2d_22.tmp_1: 3f11b66d
conv2d_2.tmp_0.tmp: 3edad7af
conv2d_2.tmp_0: 3edad7af
conv2d_19.tmp_0.tmp: 3e67af05
conv2d_19.tmp_0: 3e67af05
conv2d_18.tmp_0.tmp: 3e4dd594
conv2d_18.tmp_0: 3e4dd594
conv2d_17.tmp_0.tmp: 4044946a
conv2d_17.tmp_0: 4044946a
conv2d_16.tmp_0.tmp: 3e2543c3
conv2d_16.tmp_0: 3e2543c3
conv2d_15.tmp_0.tmp: 3e5a8758
conv2d_15.tmp_0: 3e5a8758
conv2d_14.tmp_0.tmp: 3eba2e5e
conv2d_14.tmp_0: 3eba2e5e
conv2d_13.tmp_0.tmp: 3e19d493
conv2d_13.tmp_0: 3e19d493
conv2d_12.tmp_0.tmp: 3e4b4de5
conv2d_12.tmp_0: 3e4b4de5
conv2d_11.tmp_0.tmp: 3f256097
conv2d_11.tmp_0: 3f256097
conv2d_10.tmp_0.tmp: 3e4fc34e
conv2d_10.tmp_0: 3e4fc34e
conv2d_1.tmp_1.tmp: 3ef71da4
conv2d_1.tmp_1: 3ef71da4
conv2d_0.tmp_1.tmp: 4102bafe
conv2d_0.tmp_1: 4102bafe
batch_norm_33.tmp_2.quantized.dequantized: 3da6047f
batch_norm_33.tmp_2: 3da6047f
batch_norm_29.tmp_2.quantized.dequantized: 3e2ff57b
batch_norm_29.tmp_2: 3e2ff57b
batch_norm_25.tmp_2.quantized.dequantized: 3d98a698
batch_norm_25.tmp_2: 3d98a698
batch_norm_21.tmp_2.quantized.dequantized: 3d1b5e66
batch_norm_21.tmp_2: 3d1b5e66
batch_norm_19.tmp_2.quantized.dequantized: 3e1d2261
batch_norm_19.tmp_2: 3e1d2261
auto.cast.9: 3f0989c8
auto.cast.8: 3f1bf9a8
auto.cast.7: 3f714f69
auto.cast.6: 3e8f0dfa
auto.cast.5: 3eb15a38
auto.cast.3: 3f415c20
auto.cast.2: 3e8991fa
auto.cast.10: 3f323ef3
auto.cast.0: 3f11b66d
Add.79: 3eabf0a5
Add.63: 3e5baa22
Add.59: 3eb15a38
Add.51: 3f82e625
Add.47: 3f415c20
Add.39: 3e4b392b
Add.35: 3e8991fa
Add.3: 3ef71da4
Add.27: 403cb514
Add.23: 3f11b66d
Add.1: 4102bafe
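For reference, each line of the cache after the TRT-...-EntropyCalibration2 header is a tensor name followed by its per-tensor scale, stored as the hex encoding of a big-endian IEEE-754 float32. A small sketch (plain Python, the helper name is mine) for decoding the values:

```python
# Decode the hex scale values from a TensorRT calibration cache like the one above.
# Each entry is "<tensor name>: <IEEE-754 float32 scale in hex>".
import struct

def parse_calibration_cache(text: str) -> dict:
    scales = {}
    for line in text.splitlines()[1:]:  # skip the "TRT-...-EntropyCalibration2" header
        if ":" not in line:
            continue
        name, hex_scale = line.rsplit(":", 1)
        scales[name.strip()] = struct.unpack(">f", bytes.fromhex(hex_scale.strip()))[0]
    return scales

# Example: the input entry "x: 40008102" decodes to a scale of ~2.0079
print(struct.unpack(">f", bytes.fromhex("40008102"))[0])
```

This makes it easy to check whether the exported scales look reasonable for the tensors that TensorRT complained about.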