Events for Servers
I can see logs under Advanced Hunting in the portal below for workstations. How can I see the same data for on-prem servers that have been onboarded and connected to a LAW (Log Analytics Workspace)? I want IdentityLogonEvents and DeviceNetworkEvents Microsoft 365…
How to purchase more Storage for Azure Data Explorer
We are planning to use the Standard_E8a_v4 SKU for our requirement, but the storage capacity for that SKU is only 127 GB SSD and 64 GB cache. Is there a way to purchase more storage space for the mentioned SKU?
Project/Extract from additional data fields
How can I project CVE titles from this? Some have more than one, as in the example below, and some don't have any. I ultimately want server names and a CVE column that is either empty or holds the Title value. Below is an example of what I am running in Log Analytics to extract…
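A common KQL pattern for this is to pad empty arrays before expanding, so servers with no CVEs still appear with an empty title. A minimal sketch, assuming a hypothetical table Vulns with a ServerName column and a dynamic CveTitles array:

```kusto
Vulns
// replace missing/empty arrays with a single empty string so mv-expand keeps the row
| extend CveTitles = iff(coalesce(array_length(CveTitles), 0) == 0, dynamic([""]), CveTitles)
| mv-expand Title = CveTitles to typeof(string)
| project ServerName, Title
```

A plain mv-expand drops rows whose array is empty, which is why the padding step is needed.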
Ingest XML from blob into Azure Data Explorer
I have some test XML files stored as Azure Storage blobs that I would like to ingest into Azure Data Explorer. I created a table with a single raw column. I'm having issues ingesting the blob because there is no RAW format in the One Click wizard. Is…
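There is indeed no XML ingestion format; one workaround is to ingest the blob as plain text into the single-column table and parse it later with parse_xml(). A sketch, assuming a table RawXml with one string column and placeholder values for the blob URL and SAS token (run each command separately):

```kusto
.create table RawXml (Raw: string)

// format 'txt' ingests the blob line by line into the single string column;
// multi-line XML is therefore split across rows and may need reassembly before parsing
.ingest into table RawXml (h'https://<account>.blob.core.windows.net/<container>/<file>.xml;<SAS-token>') with (format='txt')
```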
What is CoreUtilizationCoefficient with respect to the export capacity of ADX?
Hi all, I am updating the value of CoreUtilizationCoefficient to increase the export capacity of ADX. As I increase this value, the export capacity increases. I went through the documentation that explains what CoreUtilizationCoefficient…
Azure Data Explorer function with a parameter for the 'by' condition
Hi, is there a way to add a parameter to my ADX function in order to summarize a value by different dimensions? For example, let's say I have country and city in my table. I would like to create a function with a parameter to specify whether it needs to summarize…
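Since KQL function parameters can't reference columns directly, one workaround is a string parameter dispatched through case(). A sketch, assuming a hypothetical table Sales with country, city, and value columns:

```kusto
.create-or-alter function SummarizeBy(dim:string) {
    Sales
    // pick the grouping column based on the string parameter
    | extend Dimension = case(dim == 'country', country,
                              dim == 'city', city,
                              'other')
    | summarize total = sum(value) by Dimension
}
```

It would then be called as SummarizeBy('country') or SummarizeBy('city').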
Create a Power BI dataset using C# code, with an Azure Data Explorer connection as the data source
Our requirement is to create a Power BI dataset programmatically using C#, with an Azure Data Explorer connection as its data source. Can someone please help with sample code for connecting to Azure Data Explorer while creating the PBI dataset…
Data load to ADX (Kusto) fails with error - UserErrorKustoWriteFailed
We are loading data from OData into a Kusto table. In the OData source, a query is given to pull specific columns into Kusto. If the query has a $top=10000 clause, ADF loads the data into the Kusto table, but when it is removed, or 10001 or more is given, the…
Trying to link Azure Log Analytics with AKS
I am trying to enable Log Analytics for an AKS instance through the portal, but it throws an error saying to try enabling it using the Azure CLI. I am trying to enable it using the command below: az aks enable-addons -a monitoring -n ExistingManagedCluster -g…
Improve Logic App Execution time
We have created a Logic App that reads data from a staging table in Azure Data Explorer and expands the data using Azure Data Explorer. Once expanded, it pushes the data back into another Azure Data Explorer table. This…
Ingest into Azure Data Explorer failing through Azure Data Factory pipeline
I have created an Azure Data Factory pipeline to copy data from an on-premises SQL Server to a Kusto cluster. I have created a service principal and given it admin permission on the Kusto database in Azure Data Explorer. I also used this service principal for…
Is it possible to create a connection in a Power BI dataset programmatically (using C#) with a table existing in Azure Data Explorer?
We have tried the following approaches, but nothing helped. Approach 1: Using the ADOMD and AMO libraries, create the dataset with Azure Data Explorer as the data source connection. Blocker: The challenge is that, using the ADOMD library, the dataset is being created…
Azure Data Factory V2 bug: scheduled-trigger Kusto result is wrong, but manually triggered result is right
Hi all, I am a dev at MS, and recently I created a Kusto pipeline in ADF. According to the design, Kusto will execute at 9 AM UTC every day and return some data. However, I found that the scheduled-trigger result is always wrong and larger than the…
Push a large quantity of events to Event Hub
A service is generating 7.2 B events in a 30-minute window. We need to store these events in Azure Data Explorer. To send the data we are using Event Hubs, since we can stream the data in real time with them. We are not able to send this many…
Custom aggregate function in summarize
Hi all, I want to calculate the statistical mode of a column while summarizing a table. The CalculateMode functions that I have tried are: .create function CalculateMode(Action:int, Asset:string, Start:long, End:long) { Event | where JsonPath…
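For the mode specifically, a custom aggregate may not be needed: counting occurrences and taking the top row gives the same result. A sketch, assuming the value of interest is in a hypothetical column Value of the Event table:

```kusto
Event
| summarize occurrences = count() by Value   // frequency of each distinct value
| top 1 by occurrences                       // most frequent value = mode
| project Mode = Value
```

Wrapping this in a function with the same Action/Asset/Start/End filters as in the snippet would reproduce CalculateMode.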
Ingesting JSON events with various schemas from Blob Storage into Azure Data Explorer
Hi, we're a game company using PlayFab for our backend. Play session events are created by PlayFab and are automatically sent to a blob storage account in Azure. I created an Azure Data Explorer database which ingests these events whenever the blob storage…
My partitions are returning inconsistent results (Mapped JSON file data)
I have a set of tables based on a number of JSON files taken from a Data Lake Gen2 container. I'm building an external table using data derived from a JSON mapping, then using derived columns based on those columns to generate an identifier. When I go to…
Azure: querying KQL from SSMS
Hi, just wondering whether we have the ability to call Graph queries from on-prem SSMS, for example a simple query enumerating all the resources under one subscription. Thanks for your input.
Abnormal behaviour when inserting a very large value into a decimal column in Azure Data Explorer
So I tried to ingest very large values into a decimal column in Azure Data Explorer, and it succeeded. But when I then tried to fetch the rows, they show as empty. I ran the following queries: .create table test_decimal_table(id:decimal) .ingest inline…
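If the ingested values exceeded the representable range of the 128-bit decimal type, they may have been stored as nulls, which would render as empty cells. A quick check against the test_decimal_table from the snippet:

```kusto
test_decimal_table
| where isnull(id)   // rows whose value could not be represented
| count
```

If this count matches the number of "empty" rows, the values were out of range rather than lost during ingestion.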
Is Azure Data Explorer cluster storage and compute dedicated?
Hi there, we are working in a highly compliant environment and we want to store data in Azure Data Explorer. When I tried to create a cluster, it did not give me an option to create a dedicated cluster for our subscription. Maybe it is dedicated by default.…