Log analysis with SQL Server


In any application you will need some kind of log where you write events, errors, debug messages, and other information. One of the biggest problems is how to analyze the information in those log files. If you have a predefined format, you can use log analysis tools or languages such as Perl scripts, Hadoop MapReduce (if you have big log files), Graylog2 (open source), LogStash (open source), Loggly (SaaS), etc.

The problem in this case might be that you have limited query/analysis capabilities, because you will need to code your own logic for filtering and aggregating log information.

If you format the information in your log files as JSON, you get a good trade-off between the flexibility of the information you can store and the ability to process it with some kind of query language. A common choice is a NoSQL database such as MongoDB or Azure DocumentDB, where you can store JSON messages and search them using a platform-specific query language.

In this article we will see how you can analyze log files containing JSON messages.


We have log files containing log messages formatted as JSON. How can we analyze these log files?


Load the text files into SQL Server and analyze them with the OPENJSON function.

How to analyze JSON logs with SQL Server?

SQL Server enables you to load the content of a file formatted as JSON and transform it into a relational format that can be analyzed with standard SQL. We will start with an example of a JSON log file, shown in the following snippet:

[
 {"time":"2015-11-27T02:33:05.063","ip":"","request":"/", "method":"GET", "status":"200"},
 {"time":"2015-11-27T02:33:06.003","ip":"","request":"/", "method":"GET", "status":"200"},
 {"time":"2015-11-27T02:33:06.032","ip":"","request":"/contact", "method":"POST", "status":"500", "exception":"Object reference not set to an instance of object","stackTrace":"…"},
 …
 {"time":"2015-11-27T02:37:06.203","ip":"","request":"/login", "method":"GET", "status":"200"},
 {"time":"2015-11-27T02:37:12.016","ip":"","request":"/login", "method":"POST", "status":"404", "exception":"Potentially dangerous value was detected in request"}
]

Here we have standard information such as the time, IP address, requested URL, HTTP method, etc. If an error occurred, we also have additional data such as the exception message, stack trace, etc.

In SQL Server, we can easily read this log file and query the results:

 SELECT log.*
 FROM OPENROWSET (BULK 'C:\logs\json-log-2015-11-27.txt', SINGLE_CLOB) AS log_file
 CROSS APPLY OPENJSON(BulkColumn)
 WITH( [time] datetime, status int, method varchar(10),
       exception nvarchar(200)) AS log

For better manageability and performance, it might be good to split your log files by date or size (using something like the rolling file appender in Log4j). Now we can run any type of analysis on the returned data set. The following report returns the list of pages, and the number of server errors detected on them, for each HTTP method:

 SELECT request, method, COUNT(*)
 FROM OPENROWSET (BULK N'C:\logs\json-log-2015-11-27.txt', SINGLE_CLOB) AS log_file
 CROSS APPLY OPENJSON(BulkColumn)
 WITH( [time] datetime, status int, method varchar(10),
       request varchar(20), exception nvarchar(200)) AS log
 WHERE status >= 500
 GROUP BY request, method
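Any other aggregation works the same way. As one more sketch (the column aliases here are illustrative, not from the original article), this query breaks server errors down by hour of the day:

 SELECT DATEPART(hour, [time]) AS log_hour, COUNT(*) AS num_errors
 FROM OPENROWSET (BULK N'C:\logs\json-log-2015-11-27.txt', SINGLE_CLOB) AS log_file
 CROSS APPLY OPENJSON(BulkColumn)
 WITH( [time] datetime, status int ) AS log
 WHERE status >= 500
 GROUP BY DATEPART(hour, [time])
 ORDER BY log_hour

Note that the WITH clause only needs to list the properties the query actually uses; OPENJSON simply ignores the rest.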

As you can see, we have the full power of T-SQL over JSON log files. You can also load the JSON data into a standard table and create reports on that table.
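A sketch of that approach (the table and column names here are illustrative, not from the original article) is to parse the file once into a permanent table and then report from the table without re-reading the file:

 CREATE TABLE dbo.WebLog (
     [time] datetime,
     status int,
     method varchar(10),
     request varchar(20),
     exception nvarchar(200)
 );

 -- Load the parsed JSON log entries once
 INSERT INTO dbo.WebLog ([time], status, method, request, exception)
 SELECT log.*
 FROM OPENROWSET (BULK 'C:\logs\json-log-2015-11-27.txt', SINGLE_CLOB) AS log_file
 CROSS APPLY OPENJSON(BulkColumn)
 WITH( [time] datetime, status int, method varchar(10),
       request varchar(20), exception nvarchar(200)) AS log;

 -- Report from the table without re-parsing the file
 SELECT request, COUNT(*) AS hits
 FROM dbo.WebLog
 GROUP BY request;

Persisting the parsed rows also lets you add indexes, which matters once the log files grow.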


Logging information to traditional text files is the fastest way to log. Formatting log messages as JSON gives you a simple, human-readable log format with the ability to query and analyze the log data. SQL Server, with its new JSON support, enables you to easily load log files and analyze them with standard T-SQL. Log files can be stored on local or network drives; in addition, they can be loaded from Azure File Storage.

There is another format of JSON called line-delimited JSON. In this format, every line of the file contains one valid JSON object. If you have files of that kind, you can see how they can be parsed in this post - Loading line-delimited JSON files in SQL Server 2016.