Named Entity Linking, or disambiguating named entities by linking them to a knowledge base, is an important Natural Language Processing task, especially in the humanities. In this paper, we examine the performance of the state-of-the-art entity linking model BLINK in connecting Ancient Greek person mentions to a domain-specific German-language knowledge base. To train the model, we create both gold-standard data through manual annotation and noisier silver-standard data through automatic extraction, and then evaluate whether incorporating the latter improves performance. Our findings suggest that overall performance remains suboptimal for Ancient Greek. Increasing the amount of training data, even through automatic methods, shows promise. However, as it stands, BLINK used out of the box is ill-suited for Named Entity Linking in the target setting. We discuss possible causes and suggest areas for improvement.
