- Watch the introductory video.
- Test drive Analytics on our simulated data if your app isn't sending data to Application Insights yet.
From your app's home resource in Application Insights, click Analytics.
The inline tutorial gives you some ideas about what you can do.
For a more extensive tour, see the Tour of Analytics.
Query your telemetry
Write a query
IntelliSense prompts you with the operators and the expression elements that you can use. Click the information icon (or press CTRL+Space) to get a longer description and examples of how to use each element.
Run a query
- You can use single line breaks in a query.
- Put the cursor inside or at the end of the query you want to run.
- Check the time range of your query. (You can change it, or override it by including your own `where ... timestamp ...` clause in your query.)
- Click Go to run the query.
- Don't put blank lines within a query. You can keep several separate queries in one query tab by separating them with blank lines; only the query that contains the cursor runs.
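Putting those rules together, a single tab can hold two independent queries separated by a blank line; clicking Go runs only the one that contains the cursor. A minimal sketch, using the standard requests and exceptions tables:

```kusto
// Query 1: request count over the past day
requests
| where timestamp > ago(1d)
| count

// Query 2: exceptions by type over the past day (runs only if the cursor is here)
exceptions
| where timestamp > ago(1d)
| summarize count() by type
```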
Save a query
- Save the current query file.
- Open a saved query file.
- Create a new query file.
See the details
Expand any row in the results to see its complete list of properties. You can further expand any property that is a structured value - for example, custom dimensions, or the stack listing in an exception.
Arrange the results
You can sort, filter, paginate, and group the results returned from your query.
Sorting, grouping, and filtering in the browser don't re-run your query. They only rearrange the results that were returned by your last query.
Pick the columns you'd like to see, drag column headers to rearrange them, and resize columns by dragging their borders.
Sort and filter items
Sort your results by clicking a column header. Click again to sort the other way, and click a third time to revert to the original ordering returned by your query.
Use the filter icon to narrow your search.
To sort by more than one column, use grouping. First enable it, and then drag column headers into the space above the table.
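If you want an ordering that reflects the full data set rather than a browser-side rearrangement of the rows already returned, build the sorting into the query itself. A sketch using the standard requests table:

```kusto
// Sort server-side, returning only the 100 slowest requests
requests
| top 100 by duration desc
```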
Missing some results?
If you think you're not seeing all the results you expected, there are a couple of possible reasons.
Time range filter. By default, you will only see results from the last 24 hours. There is an automatic filter that limits the range of results that are retrieved from the source tables.
However, you can change the time range filter by using the drop-down menu.
Or you can override the automatic range by including your own `where ... timestamp ...` clause in your query. For example:
requests | where timestamp > ago(2d)
Results limit. There's a limit of about 10k rows on the results returned from the portal. A warning shows if you go over the limit. If that happens, sorting your results in the table won't always show you all the actual first or last results.
It's good practice to avoid hitting the limit. Use the time range filter, or use operators such as `take`, `top`, or `summarize` to reduce your results.
(Want more than 10k rows? Consider using Continuous Export instead. Analytics is designed for analysis, rather than retrieving raw data.)
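For instance, aggregating before returning rows keeps the result set small no matter how much telemetry there is. A sketch using the standard requests table:

```kusto
// Hourly request counts for the past day: at most 24 result rows
requests
| where timestamp > ago(1d)
| summarize requestCount = count() by bin(timestamp, 1h)
```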
Select the type of diagram you'd like:
If you have several columns of the right types, you can choose the x and y axes, and a column of dimensions to split the results by.
Results are initially displayed as a table, and you select the diagram manually. Alternatively, you can use the render directive at the end of a query to select a diagram.
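For example, a query such as the following sketch selects a time chart directly, with no manual diagram selection needed:

```kusto
// Plot hourly request counts as a time chart
requests
| where timestamp > ago(1d)
| summarize count() by bin(timestamp, 1h)
| render timechart
```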
On a timechart, if there is a sudden spike or step in your data, you may see a highlighted point on the line. This indicates that Analytics Diagnostics has identified a combination of properties that filter out the sudden change. Click the point to get more detail on the filter, and to see the filtered version. This may help you identify what caused the change.
Pin to dashboard
This means that, when you put together a dashboard to help you monitor the performance or usage of your web services, you can include quite complex analysis alongside the other metrics.
You can pin a table to the dashboard, if it has four or fewer columns. Only the top seven rows are displayed.
The chart pinned to the dashboard is refreshed automatically by re-running the query at regular intervals. You can also click the Refresh button.
Certain simplifications are applied to a chart when you pin it to a dashboard.
Time restriction: Queries are automatically limited to the past 14 days. The effect is the same as if your query included `where timestamp > ago(14d)`.
Bin count restriction: If you display a chart that has a lot of discrete bins (typically a bar chart), the less populated bins are automatically grouped into a single "others" bin. For example, this query:
requests | summarize count_search = count() by client_CountryOrRegion
looks like this in Analytics:
but when you pin it to a dashboard, it looks like this:
Export to Excel
After you've run a query, you can download a .csv file. Click Export, Excel.
Export to Power BI
Put the cursor in a query and choose Export, Power BI.
You run the query in Power BI. You can set it to refresh on a schedule.
With Power BI, you can create dashboards that bring together data from a wide variety of sources.
Get a link under Export, Share link that you can send to another user. Provided the user has access to your resource group, the query will open in the Analytics UI.
(In the link, the query text appears after "?q=", gzip compressed and base-64 encoded. You could write code to generate deep links that you provide to users. However, the recommended way to run Analytics from code is by using the REST API.)
curl "https://api.applicationinsights.io/beta/apps/DEMO_APP/query?query=requests%7C%20where%20timestamp%20%3E%3D%20ago(24h)%7C%20count" -H "x-api-key: DEMO_KEY"
Unlike the Analytics UI, the REST API does not automatically add any timestamp limitation to your queries. Remember to add your own where-clause, to avoid getting huge responses.
You can import data from a CSV file. A typical usage is to import static data that you can join with tables from your telemetry.
For example, if authenticated users are identified in your telemetry by an alias or obfuscated id, you could import a table that maps aliases to real names. By performing a join on the request telemetry, you can identify users by their real names in the Analytics reports.
Define your data schema
- Click Settings (at top left) and then Data Sources.
- Add a data source, following the instructions. You are asked to supply a sample of the data, which should include at least ten rows; a schema is inferred from the sample, which you can then review and correct.
This defines a data source, which you can then use to import individual tables.
Import a table
- Open your data source definition from the list.
- Click "Upload" and follow the instructions to upload the table. This involves a call to a REST API, and so it is easy to automate.
Your table is now available for use in Analytics queries; it appears in the schema listing in Analytics.
Use the table
Let's suppose your data source definition is called usermap, and that it has two fields, one of which is named user_AuthenticatedId. The requests table also has a field named user_AuthenticatedId, so it's easy to join them:
requests | where notempty(user_AuthenticatedId) | take 10 | join kind=leftouter ( usermap ) on user_AuthenticatedId
The resulting table of requests has an additional column taken from your usermap table.
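Building on that join, you could then report by the friendly name instead of the obfuscated id. A sketch, assuming the second usermap field is called realName (that column name is hypothetical):

```kusto
// Count requests per real user name; realName is an assumed column from usermap
requests
| where notempty(user_AuthenticatedId)
| join kind=leftouter (usermap) on user_AuthenticatedId
| summarize requestCount = count() by realName
```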
Import from LogStash
- Tour of Analytics: start here. A tutorial covering the main features.
- Use operators such as count to build queries.
- Use aggregation operators such as summarize to compute statistics over groups of records.
- Use numbers, strings, and other scalar expressions to form query parameters.
- Using Analytics.
- Language Reference: a one-page reference.