
Conversation

@david-leifker
Collaborator

@david-leifker david-leifker commented May 30, 2025

When upgrading the Kafka library from an older version, we discovered a missing class that had been excluded in an earlier pull request (PR #1693). The older Kafka library was likely more lenient and didn't attempt to load all referenced classes, effectively masking the issue.

The newer version of the Kafka library has stricter class-loading behavior, causing this long-standing but previously unnoticed problem to surface. Removing the class exclusion resolves the immediate compilation and runtime issue with the newer library. However, the original reason for the exclusion remains unclear, since the context from the initial implementation five years ago has been lost.

Removing the exclusion addresses the immediate technical barrier to upgrading the library, but there may be underlying implications that are not immediately apparent. The fix solves the current problem, but without the original context we cannot be certain whether there are side effects, or why the class was excluded in the first place.
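As a hedged, standalone illustration (not project code; the excluded class name below is invented), the masking behavior described above comes down to lazy versus eager class loading: a class missing from the artifact only fails when something actually tries to load it, so a lenient library hides the gap until a stricter one eagerly resolves all references.

```java
// Standalone illustration; org.example.ExcludedClass is a hypothetical name.
public class StrictLoadingDemo {
    // Returns true if the named class can be loaded on the current classpath.
    static boolean canLoad(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // With lenient (lazy) loading, an excluded class causes no error until
        // this point; a library that eagerly resolves all referenced classes
        // fails at startup instead, surfacing the long-masked exclusion.
        System.out.println(canLoad("org.example.ExcludedClass")); // false: not on classpath
        System.out.println(canLoad("java.util.ArrayList"));       // true: part of the JDK
    }
}
```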

The initial PR this is based on is here.

[x] Tested to ensure message compatibility with previously published messages.

@alwaysmeticulous

alwaysmeticulous bot commented May 30, 2025

🔴 Meticulous spotted visual differences in 7 of 1367 screens tested: view and approve the detected differences.

Meticulous evaluated ~10 hours of user flows against your PR.

Last updated for commit d68fd0d. This comment will update as new commits are pushed.

@codecov

codecov bot commented May 30, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

❌ Unsupported file format

Upload processing failed due to unsupported file format. Please review the parser error message:

Error parsing JUnit XML in /home/runner/work/datahub/datahub/metadata-io/build/test-results/test/TEST-com.linkedin.metadata.graph.search.elasticsearch.SearchGraphServiceElasticSearchTest.xml at 117:1057

Caused by:
    RuntimeError: Error converting computed name to ValidatedString
    
    Caused by:
        string is too long

For more help, visit our troubleshooting guide.


@github-actions github-actions bot added ingestion PR or Issue related to the ingestion of metadata devops PR or Issue related to DataHub backend & deployment labels May 30, 2025
@datahub-cyborg datahub-cyborg bot added the needs-review Label for PRs that need review from a maintainer. label May 30, 2025
@datahub-cyborg datahub-cyborg bot added pending-submitter-merge and removed needs-review Label for PRs that need review from a maintainer. labels May 30, 2025
update schema reg api spec
@codecov

codecov bot commented May 30, 2025

Bundle Report

Changes will increase total bundle size by 1.73kB (0.01%) ⬆️. This is within the configured threshold ✅

Detailed changes

| Bundle name | Size | Change |
| --- | --- | --- |
| datahub-react-web-esm | 19.65MB | 1.73kB (0.01%) ⬆️ |

Affected Assets, Files, and Routes (bundle: datahub-react-web-esm):

| Asset Name | Size Change | Total Size | Change (%) |
| --- | --- | --- | --- |
| assets/index-*.js | 1.73kB | 16.02MB | 0.01% |

@david-leifker david-leifker merged commit 1194f91 into master May 31, 2025
63 of 65 checks passed
@david-leifker david-leifker deleted the bump-kafka-version branch May 31, 2025 02:51
@RafaelFranciscoLuqueCerezo
Contributor

Hello @david-leifker, we have found a problem trying to configure OAuth for Kafka in the frontend project. We see this error in the logs:

(screenshot: error log)

As you can see, despite having previously configured the OAuth bearer token endpoint correctly, the configuration is not applied; it appears as null:

(screenshot: token endpoint resolved as null)

Our configuration is:

```yaml
springKafkaConfigurationOverrides:
  security.protocol: SASL_SSL
  sasl.mechanism: OAUTHBEARER
  sasl.login.callback.handler.class: org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginCallbackHandler
  sasl.oauthbearer.token.endpoint.url: https://login.microsoftonline.com/xxxxxx/oauth2/v2.0/token
  sasl.oauthbearer.method: oidc
```
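To pinpoint the suspected gap, here is a minimal sketch (hypothetical names; this is not the actual KafkaTrackingProducer code) of what forwarding these overrides into the producer's Properties could look like. The point is that every override key has to be copied through explicitly, or it never reaches the Kafka client:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

// Hypothetical sketch of forwarding configuration overrides to a producer.
public class ProducerConfigSketch {
    // Copies every override (e.g. the springKafkaConfigurationOverrides above)
    // into the Properties handed to the KafkaProducer constructor. If this
    // copy step is missing, keys such as sasl.oauthbearer.token.endpoint.url
    // never reach the client and are reported as null.
    static Properties withOverrides(Properties base, Map<String, String> overrides) {
        Properties props = new Properties();
        props.putAll(base);
        overrides.forEach(props::setProperty);
        return props;
    }

    public static void main(String[] args) {
        Properties base = new Properties();
        base.setProperty("bootstrap.servers", "broker:9092"); // placeholder value

        Map<String, String> overrides = new HashMap<>();
        overrides.put("security.protocol", "SASL_SSL");
        overrides.put("sasl.mechanism", "OAUTHBEARER");
        overrides.put("sasl.oauthbearer.token.endpoint.url",
                "https://login.microsoftonline.com/xxxxxx/oauth2/v2.0/token");

        Properties props = withOverrides(base, overrides);
        // The endpoint URL now survives into the producer configuration.
        System.out.println(props.getProperty("sasl.oauthbearer.token.endpoint.url"));
    }
}
```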

We think it may be necessary to assign this property in these files:

  • datahub-frontend/app/client/KafkaTrackingProducer.java

  • datahub-frontend/conf/application.conf

@david-leifker
Collaborator Author

@RafaelFranciscoLuqueCerezo - Good callout! Most of our application stack uses Spring and benefits from the Spring Kafka plugin. The frontend component, however, does not use Spring and appears to be configured manually. Let me look over the configuration there; I can potentially refactor the code to use the Spring properties even though it is not a Spring application, replacing all of this manual configuration of the frontend Kafka producer.
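One possible shape for that refactor, sketched under assumptions (the prefix below is illustrative of Spring-style flat keys, not necessarily the key DataHub would use), is to translate prefixed flat properties into plain Kafka client properties in the non-Spring frontend:

```java
import java.util.Map;
import java.util.Properties;

// Sketch: reuse Spring-style flat keys in a non-Spring app by stripping an
// assumed prefix before handing the entries to the plain Kafka producer.
public class SpringStyleProps {
    static final String PREFIX = "spring.kafka.properties."; // illustrative prefix

    // Keeps only entries under PREFIX, with the prefix removed, so the
    // remaining keys match what the Kafka client expects natively.
    static Properties stripPrefix(Map<String, String> flat) {
        Properties props = new Properties();
        flat.forEach((key, value) -> {
            if (key.startsWith(PREFIX)) {
                props.setProperty(key.substring(PREFIX.length()), value);
            }
        });
        return props;
    }

    public static void main(String[] args) {
        Map<String, String> flat = Map.of(
                PREFIX + "security.protocol", "SASL_SSL",
                PREFIX + "sasl.mechanism", "OAUTHBEARER",
                "some.unrelated.key", "ignored");
        Properties props = stripPrefix(flat);
        System.out.println(props.getProperty("security.protocol")); // SASL_SSL
        System.out.println(props.containsKey("some.unrelated.key")); // false
    }
}
```

This keeps a single source of truth for Kafka settings across the Spring and non-Spring components, at the cost of maintaining the translation step by hand.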

kartikey-visa pushed a commit to kartikey-visa/datahub that referenced this pull request Jul 23, 2025