Publishing Non spark Automic/Airflow Job information in spline #1150
Not sure I understand your question correctly, but the execution plan name (aka application name, or job name) is provided by an agent in an optional string property. On the UI, the execution event list is formed from the items stored in the `progress` collection. In the database, the application name is additionally stored under the `applicationName` key. Everything that is under `extra` is […]. Not sure if I answered your question.
I'm not sure about the question either but we are planning to use the Airflow task name as the spark app name to achieve this. Basically we will be calling spark like this:
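(The original snippet was not captured; here is a minimal sketch of what such a call could look like. The helper name `build_spark_submit_cmd` and the task/jar names are hypothetical, not Spline or Airflow API; the point is just that the Airflow task id is passed to `spark-submit --name`, which sets `spark.app.name` and is what Spline picks up as the application name.)

```python
# Sketch: build a spark-submit command that uses the Airflow task id as the
# Spark application name. All names below are illustrative placeholders.

def build_spark_submit_cmd(task_id: str, app_jar: str, main_class: str) -> list[str]:
    """Return a spark-submit argv with --name set to the scheduler task id."""
    return [
        "spark-submit",
        "--name", task_id,      # becomes spark.app.name, which Spline records
        "--class", main_class,
        app_jar,
    ]

cmd = build_spark_submit_cmd("daily_sales_etl", "etl-job.jar", "com.example.EtlJob")
print(cmd)
```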
You should be able to pass the name of the Airflow task to your module, though. Hope this helps.
Background [Optional]
We have a requirement to capture the name of the scheduler job (an Airflow job, an Automic job, or any other scheduler) that submitted the Spark job. Right now we capture the Spark application name, which shows up in the `progress` collection against the `applicationName` key. Every Spark job is submitted by a scheduler (Airflow or Automic). We would like to capture this information as the top-level source and show the Spark `applicationName` under it. I do see an `extra` key in the `progress` document, but I don't see any reserved field there that gets displayed in the UI. It's not only the Automic/Airflow job name; in the future we would like to add some additional information as we progress further. Please give some insights into the right way of showing this non-Spark information. Also, we would like to be able to search in the UI using the same job name.
The reason for this requirement is to identify which scheduler job a given Spark job was submitted from. Adding this feature would make it easier for more users to adopt Spline.
Question
Please give some insights into the right way to show non-Spark, job-level information, and how to make that job searchable in the UI.
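For context, a rough sketch of how a `progress` document could carry both the standard application name and scheduler details under the `extra` key. Only `applicationName` and `extra` come from the thread above; the keys inside `extra` are assumptions for illustration, not the actual Spline schema:

```json
{
  "applicationName": "daily_sales_etl",
  "extra": {
    "scheduler": "airflow",
    "schedulerJobName": "daily_sales_dag"
  }
}
```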