Not able to send kafka json messages to bigquery #406
Any update on this, @nassereddinebelghith?
Unfortunately, there is no update on this subject at the moment. The problem is that dynamic Kafka/BigQuery schema updates are not supported by the connector right now. I am developing a new branch to add this feature. I will be working on a project with a similar need, so if there is any update I will be happy for the community to share it. Meanwhile, I will do my best to clean up my code and share it with the community.
@arabaaoui I know that you are working on a similar project. Are there any updates on your work on data ingestion from SQL DB --> Kafka --> BigQuery?
I was able to resolve my own issue by changing the configuration of the connector. I can't remember exactly what I did, but after changing some settings I received a different error message.
The problem I'm facing is related to schema compatibility in Kafka Connect. The error message indicates that the top-level Kafka Connect schema must be of type 'struct', but the schema being received does not adhere to this requirement. This suggests that there's a mismatch between the schema expected by the connector and the schema being provided with the Kafka message.
Here's what I've tried:
Checking Kafka Message Schema: I've verified the schema of my Kafka message to ensure it's correctly structured. However, it seems that the schema is not compliant with the 'struct' type requirement.
Connector Configuration: I've checked the configuration of my Kafka connector to ensure that schema-related parameters are correctly set. Specifically, I've ensured that the key.converter.schemas.enable and value.converter.schemas.enable parameters are both configured to true if schemas are being used.
Compatibility of Versions: I've verified the compatibility of versions between my Kafka connector, Kafka cluster, and schema.
Data Examination: I've examined the incoming data to ensure it adheres to the expected schema. Incorrect data can lead to schema conversion errors.
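To make steps 1 and 4 above concrete: when JsonConverter is used with schemas.enable=true, Kafka Connect expects each message value to be a JSON envelope with a top-level "schema" of type "struct" and a matching "payload". A bare JSON object without that envelope is one common way to hit the "must be of type 'struct'" error. A minimal sketch (the field names here are made up, not from my actual topic):

```python
import json

# Hypothetical message value in the envelope format JsonConverter expects
# when value.converter.schemas.enable=true: a top-level "schema" declaring
# a struct type, plus a "payload" that matches it field by field.
message_value = {
    "schema": {
        "type": "struct",
        "fields": [
            {"field": "id", "type": "int64", "optional": False},
            {"field": "name", "type": "string", "optional": True},
        ],
        "optional": False,
    },
    "payload": {"id": 1, "name": "alice"},
}

# This is the string that would be produced to the Kafka topic; sending only
# the inner "payload" object instead would leave the top-level type undeclared.
serialized = json.dumps(message_value)
print(serialized)
```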
Despite these efforts, the error persists, which indicates there is still a mismatch between the schema the connector expects and the schema delivered with the Kafka message. To resolve this, I need to investigate the schema configuration and the format of the incoming data further to make sure they align with what the connector requires.
Here is an example of the configuration of the Connect worker and the connector:
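For reference, a hedged sketch of what such a connector configuration might look like, assuming the wepay/confluent kafka-connect-bigquery sink connector. The name, topic, project, dataset, and keyfile path below are placeholders, not my actual values:

```json
{
  "name": "bigquery-sink",
  "config": {
    "connector.class": "com.wepay.kafka.connect.bigquery.BigQuerySinkConnector",
    "topics": "my-topic",
    "project": "my-gcp-project",
    "defaultDataset": "my_dataset",
    "keyfile": "/path/to/service-account.json",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "true"
  }
}
```

The StringConverter key plus JsonConverter value matches the string-key / JSON-value message shape described below.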
PS: The data sent to Kafka must be in JSON format, but the schema is dynamic. Therefore, I cannot specify the schema by having a static BigQuery table already created. Instead, I need the connector to create the table dynamically while injecting the JSON data. My Kafka message must have a string key and a JSON-formatted value.
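The dynamic-table requirement maps to the connector's table-management options. A hedged fragment, assuming kafka-connect-bigquery 2.x (option names changed between major versions, so check the version actually deployed before relying on these):

```json
{
  "autoCreateTables": "true",
  "allowNewBigQueryFields": "true",
  "allowBigQueryRequiredFieldRelaxation": "true",
  "allowSchemaUnionization": "true"
}
```

With these enabled, the connector attempts to create missing tables and evolve their BigQuery schemas as new fields arrive, rather than requiring a pre-created static table.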
Can anyone help, please?