Instructions for using neulab/codebert-python with the Transformers library.
How to use neulab/codebert-python with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="neulab/codebert-python")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("neulab/codebert-python")
model = AutoModelForMaskedLM.from_pretrained("neulab/codebert-python")
```
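As a minimal sketch of what the fill-mask pipeline does with this model: mask one token in a Python snippet and ask the model for the most likely completions. The `add` function below is a made-up example, not from the model card; the mask token is read from the tokenizer rather than hardcoded, since it depends on the underlying tokenizer.

```python
from transformers import pipeline

fill = pipeline("fill-mask", model="neulab/codebert-python")

# Hypothetical snippet: mask the token between "a" and "b"
code = f"def add(a, b):\n    return a {fill.tokenizer.mask_token} b"

# Print the top-3 predicted tokens with their scores
for pred in fill(code)[:3]:
    print(pred["token_str"], round(pred["score"], 3))
```

Each prediction is a dict with the filled-in token (`token_str`), its probability (`score`), and the completed sequence.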
Usable with oobabooga?
#2
by deleted - opened
Hi @Nurb432 ,
Thank you for your interest in our work!
I don't know what oobabooga is.
This model is usable with anything that bert is usable with.
Best,
Uri
Ooba = "text-generation-webui", one of the most popular Gradio-based interfaces at the moment. I did see the model was based on BERT, so I'll just look around for BERT GUIs. Thanks.
My use case for LLMs is Python generation. Most models are overkill since my use case is so targeted, so finding a small, targeted model like this is a great thing (if I can get it to work right :) ).