
Description
I have a naive question. If I have a pre-trained transformer model (the pre-training task is a regression problem), can I use the embeddings from this model to train a decoder that maps the embeddings back to sequences, much like a translation task? I wonder whether seq2seq can help me with this. Thanks in any case.
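To make the idea concrete, here is a rough sketch of what I have in mind (PyTorch-style pseudocode; the module names, sizes, and the dummy data are placeholders I made up, not code from this repository):

```python
# Sketch: freeze a pre-trained encoder, train only a decoder that
# reconstructs token sequences from the encoder's embeddings.
import torch
import torch.nn as nn

class FrozenEncoder(nn.Module):
    """Stands in for the pre-trained transformer (regression pre-training)."""
    def __init__(self, vocab_size=1000, d_model=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)

    @torch.no_grad()  # frozen: only the decoder below gets trained
    def forward(self, src_tokens):
        return self.encoder(self.embed(src_tokens))  # (batch, src_len, d_model)

class EmbeddingDecoder(nn.Module):
    """Decoder trained to map the frozen embeddings back to token sequences."""
    def __init__(self, vocab_size=1000, d_model=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerDecoderLayer(d_model, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=4)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, tgt_tokens, memory):
        tgt_len = tgt_tokens.size(1)
        # causal mask so each position only attends to earlier positions
        causal_mask = torch.triu(
            torch.full((tgt_len, tgt_len), float("-inf")), diagonal=1
        )
        hidden = self.decoder(self.embed(tgt_tokens), memory, tgt_mask=causal_mask)
        return self.out(hidden)  # (batch, tgt_len, vocab_size)

# One training step with teacher forcing and cross-entropy on shifted targets.
encoder, decoder = FrozenEncoder(), EmbeddingDecoder()
optimizer = torch.optim.Adam(decoder.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

src = torch.randint(0, 1000, (8, 20))   # dummy source sequences
tgt = torch.randint(0, 1000, (8, 20))   # dummy target sequences

memory = encoder(src)                    # embeddings from the pre-trained model
logits = decoder(tgt[:, :-1], memory)    # predict the next token at each step
loss = criterion(logits.reshape(-1, 1000), tgt[:, 1:].reshape(-1))

optimizer.zero_grad()
loss.backward()
optimizer.step()
```

So the question is basically whether this encoder-as-fixed-feature-extractor plus trainable decoder setup is something seq2seq supports, or whether there is a better way to do it.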