Researchers propose a self-distillation fix for ‘catastrophic forgetting’ in LLMs
LLMs tend to lose prior skills when fine-tuned for new tasks. A new self-distillation approach aims to reduce ...
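The snippet does not spell out the method, so the following is only a minimal sketch of one common form of self-distillation against forgetting: fine-tune on the new task while adding a KL penalty that keeps the model's output distribution close to a frozen copy of its pre-fine-tuning weights. The model name, `alpha`, and `temperature` below are illustrative assumptions, not details from the article.

```python
# Sketch (assumption): self-distillation as a KL regularizer toward a frozen copy
# of the original model, added to the standard fine-tuning loss.
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # hypothetical base model for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
student = AutoModelForCausalLM.from_pretrained(model_name)        # being fine-tuned on the new task
teacher = AutoModelForCausalLM.from_pretrained(model_name).eval()  # frozen original weights
for p in teacher.parameters():
    p.requires_grad_(False)

optimizer = torch.optim.AdamW(student.parameters(), lr=1e-5)
alpha, temperature = 0.5, 2.0  # illustrative distillation weight and temperature

def training_step(batch):
    """One step: new-task cross-entropy plus KL toward the frozen teacher."""
    outputs = student(**batch, labels=batch["input_ids"])
    task_loss = outputs.loss  # next-token loss on the new-task data

    with torch.no_grad():
        teacher_logits = teacher(**batch).logits

    # Temperature-scaled KL between teacher and student token distributions
    s = F.log_softmax(outputs.logits / temperature, dim=-1)
    t = F.softmax(teacher_logits / temperature, dim=-1)
    distill_loss = F.kl_div(s, t, reduction="batchmean") * temperature**2

    loss = task_loss + alpha * distill_loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return task_loss.item(), distill_loss.item()
```

The distillation term anchors the fine-tuned model to its own prior behavior on the new-task inputs, which is what mitigates the loss of earlier skills; the actual paper may compute this penalty differently or on different data.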