This repository contains the model described in the paper A Systematic Investigation of Distilling Large Language Models into Cross-Encoders for Passage Re-ranking.
The code for training and evaluation can be found at https://github.com/webis-de/msmarco-llm-distillation.
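For context, the sketch below shows one way a cross-encoder checkpoint like this can be used for passage re-ranking. It assumes the checkpoint is compatible with the sentence-transformers `CrossEncoder` API; the repository linked above remains the authoritative reference for loading, training, and evaluation.

```python
# Minimal re-ranking sketch (assumption: the checkpoint loads via
# sentence-transformers' CrossEncoder; the linked repository may use a
# different, model-specific loading path).
from sentence_transformers import CrossEncoder

model = CrossEncoder("webis/monoelectra-large")

query = "how do cross-encoders re-rank passages?"
passages = [
    "Cross-encoders jointly encode a query and a passage and output a relevance score.",
    "Bi-encoders embed queries and passages separately for fast retrieval.",
]

# Score each (query, passage) pair and sort passages by descending relevance.
scores = model.predict([(query, p) for p in passages])
ranked = sorted(zip(passages, scores), key=lambda x: x[1], reverse=True)
for passage, score in ranked:
    print(f"{score:.3f}\t{passage}")
```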
Base model: google/electra-large-discriminator