Instructions for using mrm8488/CodeBERTaPy with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use mrm8488/CodeBERTaPy with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="mrm8488/CodeBERTaPy")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("mrm8488/CodeBERTaPy")
model = AutoModelForMaskedLM.from_pretrained("mrm8488/CodeBERTaPy")
```

- Notebooks
- Google Colab
- Kaggle
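Once the pipeline is loaded, the model can fill a masked token in Python code. The snippet below is a minimal sketch: the masked code fragment and the top-k setting are illustrative assumptions, not taken from the model card. Note that RoBERTa-style models such as this one use `<mask>` as the mask token.

```python
from transformers import pipeline

# Load the fill-mask pipeline for this model (downloads weights on first run).
pipe = pipeline("fill-mask", model="mrm8488/CodeBERTaPy")

# Hypothetical masked Python snippet; "<mask>" is the RoBERTa mask token.
masked_code = "for i in<mask>(10):\n    print(i)"

# Each prediction has a candidate token, its score, and the completed sequence.
for prediction in pipe(masked_code, top_k=5):
    print(prediction["token_str"], round(prediction["score"], 4))
```

Each entry returned by the pipeline is a dict with `token_str` (the predicted token), `score` (its probability), and `sequence` (the input with the mask replaced).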