Error code: 400 - {'error': {'message': "Sorry! We've encountered an issue with repetitive patterns in your prompt. Please try again with a different prompt.", 'type': 'invalid_request_error', 'param': 'prompt', 'code': 'invalid_prompt'}} #3
Comments
Hi! 👋 Thank you for reporting this issue! 🚀

Regarding the error you're encountering: this is actually the first time I've come across this specific issue. From what I found, this error typically occurs with GPT-3.5-turbo when a certain threshold is exceeded, specifically when a prompt contains more than 200 repeated words (see https://community.openai.com/t/repetitive-prompt-error-400/716240).

Here are a couple of suggestions that might help resolve the issue:
Let me know if you need any further assistance!
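Based on the 200-repeated-words threshold mentioned above, one possible mitigation is to cap how many times any single word may appear in a prompt before sending it to the API. This is my own sketch, not code from LogicCheckGPT; it splits on whitespace, so original spacing and newlines are not preserved:

```python
from collections import Counter

def cap_word_repeats(prompt: str, max_repeats: int = 200) -> str:
    """Drop occurrences of a word beyond `max_repeats`, which is the
    threshold reported to trigger OpenAI's 'repetitive patterns' 400 error.

    Note: joins tokens with single spaces, so whitespace is not preserved.
    """
    seen = Counter()
    kept = []
    for word in prompt.split():
        seen[word] += 1
        if seen[word] <= max_repeats:
            kept.append(word)
    return " ".join(kept)
```

Calling `cap_word_repeats(prompt)` just before the OpenAI request should keep the prompt under the reported threshold, at the cost of silently dropping the excess repetitions.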
Hi,

I am evaluating LLaVA on binary (yes/no) questions such as:

```json
{"question_id": 1, "image": "COCO_val2014_000000016631.jpg", "text": "Is there a person in the image?", "label": "yes"}
{"question_id": 2, "image": "COCO_val2014_000000016631.jpg", "text": "Is there a refrigerator in the image?", "label": "no"}
```

Is it possible to use LogicCheckGPT to further improve LLaVA's performance on these types of binary questions? Below is the current implementation I am using for LLaVA to generate answers from an image:

```python
import torch
from transformers import AutoProcessor, LlavaForConditionalGeneration

model_id = "llava-hf/llava-1.5-7b-hf"
model = LlavaForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    low_cpu_mem_usage=True,
).to(0)
processor = AutoProcessor.from_pretrained(model_id)

# question_text and raw_image come from the JSONL entries above
conversation = [
    {
        "role": "user",
        "content": [
            {"type": "text", "text": question_text},
            {"type": "image"},
        ],
    },
]
prompt = processor.apply_chat_template(conversation, add_generation_prompt=True)
inputs = processor(images=raw_image, text=prompt, return_tensors='pt').to(0, torch.float16)
output = model.generate(**inputs, max_new_tokens=20, do_sample=False)
answer_text = processor.decode(output[0][2:], skip_special_tokens=True)
```
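To score these binary questions against the `label` field, the free-form `answer_text` has to be mapped to a yes/no label. Below is my own minimal helper (not part of LogicCheckGPT), assuming `answer_text` holds only the generated answer; since the snippet above decodes the full sequence, the echoed prompt may need to be sliced off first:

```python
def parse_binary_answer(answer_text: str) -> str:
    """Map a free-form model answer to a 'yes'/'no' label.

    LLaVA usually leads with 'Yes' or 'No'; fall back to scanning
    the answer's words, defaulting to 'no' if neither appears.
    """
    text = answer_text.strip().lower()
    if text.startswith("yes"):
        return "yes"
    if text.startswith("no"):
        return "no"
    return "yes" if "yes" in text.split() else "no"
```

Comparing `parse_binary_answer(answer_text)` to the JSONL `label` then gives a simple accuracy metric over the dataset.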
Yes, LogicCheckGPT can definitely be integrated with LLaVA to enhance its performance on binary questions. To do this, you can replace the function
Hi,
My transformers version is
Hi,

We have revisited the code and made a few adjustments to the implementation details during the reorganization process. The error you encountered is likely due to recent updates in the LLaVA repository, which may have introduced changes in the model's behavior or input/output formats. To address this, we have:
Hi, currently I am running `check_mplug.py` on the coco val2014 dataset. I followed everything in the README. At first it ran successfully; however, after a few minutes it shows:

```
Error code: 400 - {'error': {'message': "Sorry! We've encountered an issue with repetitive patterns in your prompt. Please try again with a different prompt.", 'type': 'invalid_request_error', 'param': 'prompt', 'code': 'invalid_prompt'}}
```

Can you please take a look? Thank you
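Since the error appears only after a few minutes, a single bad sample is likely aborting the whole run. A stopgap (my own sketch, not from this repository; `query_fn` is a hypothetical stand-in for the script's OpenAI call) is to catch this specific 400 and skip the offending sample so the evaluation can finish:

```python
def run_with_skip(samples, query_fn):
    """Query each sample, skipping those that raise the
    'repetitive patterns' / invalid_prompt 400 error."""
    results, skipped = [], []
    for sample in samples:
        try:
            results.append((sample, query_fn(sample)))
        except Exception as exc:  # e.g. openai.BadRequestError
            msg = str(exc)
            if "invalid_prompt" in msg or "repetitive patterns" in msg:
                skipped.append(sample)  # log and continue
            else:
                raise
    return results, skipped
```

Logging the `skipped` list afterwards would also reveal which prompts trip the repetition threshold.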