TransformerLm
=============

Packed Transformer LM with position embedding and shared softmax layer.

**Abstract Signature:** ``TransformerLm(vocab_size: int, model_dims: int)``

.. list-table::
   :header-rows: 1

   * - Framework
     - API
     - Strategy
   * - PyTorch
     -
     - Custom / Partial
   * - PaxML / Praxis
     - ``praxis.layers.TransformerLm``
     - Direct Mapping
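
Because the PyTorch strategy is marked Custom / Partial, the sketch below illustrates one way such a custom layer could be written: a decoder-only LM with learned position embeddings and a softmax (output) layer whose weights are shared with the token embedding. It is a minimal sketch, not the Praxis implementation; it omits the packed-input handling (segment ids / paddings) implied by the description, and names such as ``MiniTransformerLm``, ``num_layers``, ``num_heads``, and ``max_seq_len`` are illustrative assumptions rather than part of the abstract signature.

.. code-block:: python

   # Minimal sketch of the "Custom / Partial" PyTorch strategy (not the Praxis
   # implementation): learned position embeddings plus an output projection
   # tied to the token embedding. Packed-input handling is omitted.
   import torch
   import torch.nn as nn


   class MiniTransformerLm(nn.Module):
       def __init__(self, vocab_size: int, model_dims: int,
                    num_layers: int = 2, num_heads: int = 4, max_seq_len: int = 512):
           super().__init__()
           self.tok_emb = nn.Embedding(vocab_size, model_dims)   # token embedding
           self.pos_emb = nn.Embedding(max_seq_len, model_dims)  # learned positions
           block = nn.TransformerEncoderLayer(
               d_model=model_dims, nhead=num_heads, batch_first=True)
           self.blocks = nn.TransformerEncoder(block, num_layers=num_layers)
           # Shared softmax layer: output projection weights tied to the embedding.
           self.logits = nn.Linear(model_dims, vocab_size, bias=False)
           self.logits.weight = self.tok_emb.weight

       def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
           # token_ids: [batch, seq_len] integer ids.
           seq_len = token_ids.size(1)
           positions = torch.arange(seq_len, device=token_ids.device)
           x = self.tok_emb(token_ids) + self.pos_emb(positions)
           # Causal mask so each position attends only to itself and earlier tokens.
           causal = nn.Transformer.generate_square_subsequent_mask(seq_len)
           x = self.blocks(x, mask=causal.to(token_ids.device))
           return self.logits(x)  # [batch, seq_len, vocab_size]


   lm = MiniTransformerLm(vocab_size=1000, model_dims=64)
   ids = torch.randint(0, 1000, (2, 16))
   assert lm(ids).shape == (2, 16, 1000)

On the PaxML / Praxis side the mapping is direct: the layer corresponds to ``praxis.layers.TransformerLm``, which also covers the packed-input behavior named in the description.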