Getting zero-byte files using Abris #307
Hi @kpr9991, thanks for your examples. I ran your code with Abris and consumed the records as expected. I don't see any errors in your provided code. What versions of Spark and Abris are you running? I see that you used the same output and checkpoint directory for both examples. Is it possible that you forgot to clear the checkpoints before running the example with Abris?
I ran it for 2.5 hours, and the whole time it was filling the directory with 0-byte files. The culprit was the trigger. I thought the trigger would process the unbounded table every x milliseconds and write the output, but instead it created a 0-byte file every x milliseconds and only started writing real files after some hours. Since the trigger was not behaving as expected, I removed it and the streaming job ran perfectly. It still fails for batch data, though. Can you try once with spark.read instead of spark.readStream and spark.write instead of spark.writeStream? I am getting 0-byte files in that case as well, and since it has no trigger, there is nothing to remove from that code to test.
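For reference, the batch variant being asked about would look roughly like the following. This is only a minimal sketch, assuming the Abris 5.x+ configuration builder and Spark 3.x with the Kafka source; the topic name, bootstrap servers, Schema Registry URL, and output path are placeholders rather than values from the original code.

```scala
// Minimal sketch (not the original code): batch read/write with Abris, spark-shell style.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col
import za.co.absa.abris.avro.functions.from_avro
import za.co.absa.abris.config.AbrisConfig

val spark = SparkSession.builder().appName("abris-batch-sketch").getOrCreate()

// Placeholder connection details.
val topic = "my-topic"
val bootstrapServers = "localhost:9092"
val schemaRegistryUrl = "http://localhost:8081"

// Download the latest reader schema for the subject "<topic>-value".
val abrisConfig = AbrisConfig
  .fromConfluentAvro
  .downloadReaderSchemaByLatestVersion
  .andTopicNameStrategy(topic)
  .usingSchemaRegistry(schemaRegistryUrl)

// Batch read: spark.read instead of spark.readStream.
val batchDf = spark.read
  .format("kafka")
  .option("kafka.bootstrap.servers", bootstrapServers)
  .option("subscribe", topic)
  .option("startingOffsets", "earliest")
  .option("endingOffsets", "latest")
  .load()

// Batch write: spark.write instead of spark.writeStream; no trigger is involved here.
batchDf
  .select(from_avro(col("value"), abrisConfig).as("record"))
  .write
  .mode("overwrite")
  .parquet("/tmp/abris-batch-output")
```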
Hi @kpr9991, I was able to run the example using spark.read.
I was using Abris with the Confluent Schema Registry to deserialize Avro records received from a Kafka source.
When I use the Confluent Schema Registry directly, fetch the schema manually, and pass it to Spark's default from_avro function while skipping the first 6 bytes of each message, I am able to read the records. I would like to do the same using Abris, since that is what the library does internally. But when I use Abris, 0-byte files are written. Is this an issue with Abris?
Working code without Abris:
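As a rough illustration of the approach described above (not the original snippet), here is a minimal sketch assuming Spark 3.x with the spark-avro module and the Confluent schema-registry client on the classpath; the topic, URLs, and paths are placeholders. The standard Confluent wire format prepends 5 header bytes (a magic byte plus a 4-byte schema id), so the substring below starts at position 6; adjust the offset if your payload differs.

```scala
// Minimal sketch: manual schema fetch + Spark's built-in from_avro, skipping the Confluent header.
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.avro.functions.from_avro
import org.apache.spark.sql.functions.{col, expr}

val spark = SparkSession.builder().appName("manual-from-avro-sketch").getOrCreate()

// Placeholder connection details.
val topic = "my-topic"
val bootstrapServers = "localhost:9092"
val schemaRegistryUrl = "http://localhost:8081"

// Fetch the latest value schema manually (TopicNameStrategy => subject "<topic>-value").
val registryClient = new CachedSchemaRegistryClient(schemaRegistryUrl, 128)
val avroSchemaJson = registryClient.getLatestSchemaMetadata(s"$topic-value").getSchema

val kafkaStream = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", bootstrapServers)
  .option("subscribe", topic)
  .option("startingOffsets", "earliest")
  .load()

// Strip the Confluent wire-format header (magic byte + schema id) before decoding;
// SQL substring is 1-based, so starting at 6 drops the first 5 bytes.
val decoded = kafkaStream
  .select(expr("substring(value, 6, length(value) - 5)").as("avroPayload"))
  .select(from_avro(col("avroPayload"), avroSchemaJson).as("record"))

decoded.writeStream
  .format("parquet")
  .option("path", "/tmp/manual-avro-output")
  .option("checkpointLocation", "/tmp/manual-avro-checkpoint")
  .start()
```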
With Abris:
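Again as an illustration rather than the original code, the Abris-based streaming read is typically wired up as below, assuming the Abris 5.x+ configuration API. Connection details and paths are placeholders, and the trigger is omitted, matching the earlier comment.

```scala
// Minimal sketch: streaming read with Abris handling the Confluent wire format.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col
import za.co.absa.abris.avro.functions.from_avro
import za.co.absa.abris.config.AbrisConfig

val spark = SparkSession.builder().appName("abris-streaming-sketch").getOrCreate()

// Placeholder connection details.
val topic = "my-topic"
val bootstrapServers = "localhost:9092"
val schemaRegistryUrl = "http://localhost:8081"

// Abris strips the magic byte and schema id itself, so no manual byte skipping is needed.
val abrisConfig = AbrisConfig
  .fromConfluentAvro
  .downloadReaderSchemaByLatestVersion
  .andTopicNameStrategy(topic)
  .usingSchemaRegistry(schemaRegistryUrl)

val kafkaStream = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", bootstrapServers)
  .option("subscribe", topic)
  .option("startingOffsets", "earliest")
  .load()

val decoded = kafkaStream.select(from_avro(col("value"), abrisConfig).as("record"))

// Use a fresh checkpoint directory when switching decoders; as noted above, reusing
// old checkpoints can make the query appear to produce no new output.
decoded.writeStream
  .format("parquet")
  .option("path", "/tmp/abris-output")
  .option("checkpointLocation", "/tmp/abris-checkpoint")
  .start()
```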