15:08
Add Code Sample Docstrings and Checkpoint Reference for GLM Models
What is the Issue?
The GLM models in the huggingface/transformers repository lacked code sample docstrings and checkpoint references, which made it hard for developers to learn how to use them.
What does the PR do?
This PR adds code sample docstrings and checkpoint references for the GLM models, giving developers working examples and concrete checkpoints to follow.
Why is it Important?
Code samples and checkpoint references let developers see working usage at a glance, so this PR improves the usability of the GLM models through clear, concise documentation.
Code Snippet
Here is a code snippet that shows the changes made in the PR:
# Before
# No code sample docstrings and checkpoint references
# After
# Added code sample docstrings and checkpoint references
@add_code_sample_docstrings(
    checkpoint=_CHECKPOINT_FOR_DOC,
    output_type=TokenClassifierOutput,
    config_class=_CONFIG_FOR_DOC,
)
def forward(self, input_ids: Optional[torch.LongTensor] = None, ...):
    # Model forward pass
    ...

You can view the full PR here.
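To make the mechanism concrete, here is a minimal, self-contained sketch of how a decorator like `add_code_sample_docstrings` can inject a usage example and checkpoint reference into a method's docstring. This is an illustration only, not the actual transformers implementation: the `GlmForTokenClassification` class, the stand-in `TokenClassifierOutput` dataclass, and the checkpoint name `THUDM/glm-4-9b` are assumptions made for the example.

```python
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class TokenClassifierOutput:
    """Stand-in for the transformers output dataclass (illustration only)."""
    logits: Optional[list] = None


# Hypothetical checkpoint and config names, used only for this sketch.
_CHECKPOINT_FOR_DOC = "THUDM/glm-4-9b"
_CONFIG_FOR_DOC = "GlmConfig"


def add_code_sample_docstrings(checkpoint: str, output_type: type, config_class: str) -> Callable:
    """Build a decorator that appends a usage example to a method's docstring."""
    def decorator(fn: Callable) -> Callable:
        sample = (
            "\n\nExample:\n\n"
            ">>> from transformers import AutoModelForTokenClassification\n"
            f'>>> model = AutoModelForTokenClassification.from_pretrained("{checkpoint}")\n'
            f"\nReturns: `{output_type.__name__}` (configured by `{config_class}`)\n"
        )
        # Append the generated sample to whatever docstring already exists.
        fn.__doc__ = (fn.__doc__ or "") + sample
        return fn
    return decorator


class GlmForTokenClassification:
    @add_code_sample_docstrings(
        checkpoint=_CHECKPOINT_FOR_DOC,
        output_type=TokenClassifierOutput,
        config_class=_CONFIG_FOR_DOC,
    )
    def forward(self, input_ids=None):
        """Run the token-classification head."""
        ...


# The docstring now carries the checkpoint reference and usage example.
print(GlmForTokenClassification.forward.__doc__)
```

The payoff is that `help(model.forward)` and the generated docs show a copy-pasteable snippet tied to a real checkpoint, which is exactly the gap the PR closes for the GLM models.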
Links : Transformers
Tags :
Date : 16th March, Sunday, 2025
Category : Others