TransformerLm
Packed Transformer LM with position embedding and shared softmax layer.
Abstract Signature:
TransformerLm(vocab_size: int, model_dims: int)
PyTorch
API:
Strategy: Custom / Partial
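Since the abstract signature only fixes the two constructor arguments, the sketch below is a minimal PyTorch illustration of the architecture described above: token plus learned position embeddings, a causal Transformer stack, and an output projection that shares weights with the input embedding (the "shared softmax layer"). The class name TransformerLmSketch and the extra hyperparameters (num_layers, num_heads, max_seq_len) are assumptions for illustration, not part of the documented API, and the sequence-packing (segment) masks implied by "packed" are omitted for brevity.

```python
# Illustrative sketch only; everything beyond vocab_size/model_dims is an assumption.
import torch
import torch.nn as nn


class TransformerLmSketch(nn.Module):
    def __init__(self, vocab_size: int, model_dims: int,
                 num_layers: int = 6, num_heads: int = 8, max_seq_len: int = 2048):
        super().__init__()
        self.token_embed = nn.Embedding(vocab_size, model_dims)
        self.pos_embed = nn.Embedding(max_seq_len, model_dims)  # learned position embedding
        layer = nn.TransformerEncoderLayer(
            d_model=model_dims, nhead=num_heads,
            dim_feedforward=4 * model_dims, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        # "Shared softmax layer": the output projection reuses the token embedding matrix.
        self.lm_head = nn.Linear(model_dims, vocab_size, bias=False)
        self.lm_head.weight = self.token_embed.weight

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) integer ids
        seq_len = token_ids.size(1)
        positions = torch.arange(seq_len, device=token_ids.device)
        x = self.token_embed(token_ids) + self.pos_embed(positions)
        # Causal mask so each position attends only to itself and earlier positions.
        causal_mask = torch.triu(
            torch.full((seq_len, seq_len), float("-inf"), device=token_ids.device),
            diagonal=1)
        x = self.encoder(x, mask=causal_mask)
        return self.lm_head(x)  # (batch, seq_len, vocab_size) logits


# Instantiation matching the two required arguments of the abstract signature.
model = TransformerLmSketch(vocab_size=32000, model_dims=512)
logits = model(torch.randint(0, 32000, (2, 16)))
```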