In this paper, we propose a novel Wasserstein distance-based knowledge distillation (WKD) method, extending beyond the classical Kullback-Leibler divergence-based approach pioneered by Hinton et al.
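To make the contrast concrete, the following is a minimal sketch (not the paper's actual objective) of the two distillation losses over temperature-softened class distributions. The KL loss follows Hinton et al.'s classical formulation; the Wasserstein-1 variant is a simplifying assumption that treats class indices as linearly ordered, so W1 reduces to a summed CDF difference. A full WKD objective would instead solve an optimal-transport problem with a ground-cost matrix between categories.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    m = max(logits)
    exps = [math.exp((z - m) / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl_distill_loss(t_logits, s_logits, T=4.0):
    # Classical KD (Hinton et al.): KL(p_teacher || p_student)
    # between temperature-softened class distributions.
    p = softmax(t_logits, T)
    q = softmax(s_logits, T)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def w1_distill_loss(t_logits, s_logits, T=4.0):
    # Hypothetical Wasserstein-1 variant: assumes class indices are
    # linearly ordered, so W1 equals the sum of |CDF_p - CDF_q|.
    # (An assumption for illustration; general categories need a
    # ground-cost matrix and an optimal-transport solver.)
    p = softmax(t_logits, T)
    q = softmax(s_logits, T)
    cp = cq = total = 0.0
    for pi, qi in zip(p, q):
        cp += pi
        cq += qi
        total += abs(cp - cq)
    return total
```

Unlike KL divergence, which compares probabilities bin by bin, the Wasserstein loss is sensitive to how far mass must move between classes, which is the geometric property the proposed method exploits.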