I am using spark-listeners-loganalytics_3.0.1_2.12 and spark-listeners_3.0.1_2.12 in my Spark application running on Databricks. With the default configuration, too many logs are sent to Azure Log Analytics, and I want to limit the logs that are forwarded there.

I tried the configuration below, added to the spark-monitoring.sh file that runs as an init script on my Databricks job cluster.
```shell
tee -a "$SPARK_CONF_DIR/spark-env.sh" << EOF
export DB_CLUSTER_ID=$DB_CLUSTER_ID
export LOG_ANALYTICS_WORKSPACE_ID=xxxx
export LOG_ANALYTICS_WORKSPACE_KEY=xxxx
export AZ_SUBSCRIPTION_ID=xxxxx
export LA_SPARKLOGGINGEVENT_NAME_REGEX="com.demo.sample.*"
EOF
```
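For reference, these are the other filtering variables I found in the library's README (this is my reading of it, so the names may need checking against your version); the regex values below are illustrative placeholders, not my real configuration:

```shell
# Other filter variables documented by the spark-monitoring library
# (names per its README; the regex values here are placeholders for illustration)
export LA_SPARKLISTENEREVENT_REGEX="SparkListenerJobStart|SparkListenerJobEnd"  # filter Spark listener events by type
export LA_SPARKMETRIC_REGEX="app.*driver.*"                                     # filter Spark metrics by name
export LA_SPARKLOGGINGEVENT_MESSAGE_REGEX=".*Exception.*"                       # filter logging events by message text
```

None of these appear to filter on log *level*, though, which is why my requirement below is still open.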
With this in place, I only get logs whose logger_name_s matches the regex "com.demo.sample.*".
My requirement is to forward all logs from the "com.demo.sample.*" packages, plus ERROR-level logs from every other package, to Log Analytics. Is there any way to configure this?
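One workaround I am considering (a sketch only, untested): since LA_SPARKLOGGINGEVENT_NAME_REGEX matches only on the logger name, control what reaches the appender via log4j levels instead, with the root logger at ERROR and an explicit INFO level for com.demo.sample. The file path and the `publicFile` appender name below are assumptions about the Databricks runtime, and the stock-config line is simulated for illustration:

```shell
# Sketch of a level-based filter in the init script.
# Assumption: the driver log4j config lives at
# /databricks/spark/dbconf/log4j/driver/log4j.properties on the cluster;
# LOG4J_CONF falls back to a local file here for illustration.
LOG4J_CONF="${LOG4J_CONF:-driver-log4j.properties}"

# Simulate the stock config line if the file does not exist (illustration only;
# the appender name publicFile is an assumption about the Databricks default)
[ -f "$LOG4J_CONF" ] || echo "log4j.rootCategory=INFO, publicFile" > "$LOG4J_CONF"

# Drop the root logger to ERROR so other packages only emit ERROR and above
sed -i 's/^log4j\.rootCategory=INFO/log4j.rootCategory=ERROR/' "$LOG4J_CONF"

# Keep INFO and above from our own package; it inherits the root appenders
echo "log4j.logger.com.demo.sample=INFO" >> "$LOG4J_CONF"

grep rootCategory "$LOG4J_CONF"  # -> log4j.rootCategory=ERROR, publicFile
```

The obvious downside is that this lowers driver logging to ERROR for all other packages everywhere (including local log files), not just for what is shipped to Log Analytics, so I would still prefer a filter on the Log Analytics side if one exists.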