Commit 632d698: make error more informative

1 parent 996fb14 commit 632d698

1 file changed, 1 addition & 1 deletion

File changed: lm_eval/tasks/humanevalpack.py
@@ -226,7 +226,7 @@ def get_prompt(self, prompt_base, instruction, context=None):
         elif self.prompt == "codellama":
             prompt = f"[INST] {inp.strip()} [/INST] {prompt_base}"
         else:
-            raise NotImplementedError
+            raise ValueError(f"The --prompt argument {self.prompt} wasn't provided or isn't supported")
         # Strip off the final \n to make the tokens more natural
         # Essentially, we want to make sure that if there was no distinction between
         # input & output, the tokens would be the same
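The change swaps a bare `NotImplementedError` for a `ValueError` that echoes the offending `--prompt` value back to the user. A minimal standalone sketch of that dispatch-and-fail pattern is below; the `codellama` branch and the error message follow the diff, while the function signature and variable names are simplified assumptions (the real method lives on a class and takes `prompt_base`, `instruction`, and `context`):

```python
def get_prompt(prompt_style, inp, prompt_base):
    """Build a prompt string, failing with an informative error for unknown styles.

    Sketch only: signature simplified from the class method shown in the diff.
    """
    if prompt_style == "codellama":
        # Branch taken verbatim from the diff context.
        return f"[INST] {inp.strip()} [/INST] {prompt_base}"
    else:
        # The commit's change: name the bad value instead of a bare NotImplementedError.
        raise ValueError(
            f"The --prompt argument {prompt_style} wasn't provided or isn't supported"
        )
```

Raising `ValueError` with the rejected value in the message tells the user immediately which CLI argument to fix, whereas an argument-less `NotImplementedError` gives no hint about what was wrong.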
