Instructions for using mrm8488/CodeBERTaPy with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use mrm8488/CodeBERTaPy with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="mrm8488/CodeBERTaPy")

# Or load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("mrm8488/CodeBERTaPy")
model = AutoModelForMaskedLM.from_pretrained("mrm8488/CodeBERTaPy")
```
- Notebooks
- Google Colab
- Kaggle
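As a sketch of what the fill-mask pipeline above does in practice: CodeBERTaPy is a RoBERTa-style masked-language model, so its mask token is `<mask>`. The Python snippet below is illustrative, and the predicted completions are not guaranteed; the first call downloads the model weights.

```python
from transformers import pipeline

# Loads mrm8488/CodeBERTaPy for masked-token prediction
# (downloads the weights on first use).
pipe = pipeline("fill-mask", model="mrm8488/CodeBERTaPy")

# A hypothetical Python snippet with one token masked out.
code = "for i in <mask>(10):\n    print(i)"

# Each prediction carries the proposed token and a confidence score.
for pred in pipe(code):
    print(pred["token_str"], round(pred["score"], 3))
```

The pipeline returns the top candidate tokens ranked by probability, which makes it easy to inspect what the model considers likely completions for a given code context.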