How big is "Not Big Enough" for Synapse?

John Mihalko 1 Reputation point
2020-07-26T17:33:53.627+00:00

We are looking to move our data warehouse to the cloud. We don't have "big data". The biggest table we have has 200 million rows. Is it cost effective to run a small data warehouse in Synapse? Is there a scenario where you would put a small data warehouse, one that doesn't come close to being "big data", in Synapse?

Azure Synapse Analytics

1 answer

  1. HarithaMaddi-MSFT 10,136 Reputation points
    2020-07-27T11:23:01.497+00:00

    Hi @JohnMihalko-2725,

    Welcome to Microsoft Q&A Platform.

    Big Data is often characterized by the 7 V's: Volume, Velocity, Variety, Variability, Veracity, Visualization, and Value. Volume is only one of the qualities that make data "Big Data". Synapse is an extended version of SQL DW (a Massively Parallel Processing architecture), which is ideal for scenarios where large queries hit the system and concurrent data loads are needed to refresh the data. Synapse also gives you the ability to do analytic processing through integration with Azure ML, Databricks, ADLS, and ADF, so it is a strong choice if there are future plans to build analytic solutions on the data. A Synapse dedicated SQL pool can be provisioned at a low service level of DW100c, at which point it behaves much like a SQL Database, so it can be used even for small data. Synapse is also preferred when the processing involves unstructured data alongside structured data. SQL Server, by contrast, is geared primarily towards OLTP (Online Transaction Processing) requirements, although it can store large volumes of data as well.
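    On the cost question: dedicated SQL pool compute is billed per hour at the provisioned DWU level, and a paused pool accrues only storage charges, so pausing outside business hours is the main lever for keeping a small warehouse cheap. A minimal Python sketch of that arithmetic, where `RATE_PER_HOUR` is a placeholder assumption (the actual DW100c rate varies by region; check the Azure pricing page):

    ```python
    # Rough monthly compute-cost estimate for a small dedicated SQL pool.
    # RATE_PER_HOUR is a hypothetical placeholder, not a published price --
    # substitute the real DW100c rate for your region.
    RATE_PER_HOUR = 1.20  # USD/hour, assumed for illustration

    def monthly_compute_cost(active_hours_per_day: float,
                             days_per_month: int = 30,
                             rate_per_hour: float = RATE_PER_HOUR) -> float:
        """Estimate compute cost assuming the pool is paused outside
        active hours (paused pools stop compute billing)."""
        return active_hours_per_day * days_per_month * rate_per_hour

    always_on = monthly_compute_cost(24)       # never paused
    business_hours = monthly_compute_cost(10)  # paused nights; weekends ignored

    print(f"Always on:        ${always_on:,.2f}/month")
    print(f"10 active h/day:  ${business_hours:,.2f}/month")
    ```

    Pause and resume can be automated (for example on a schedule), so a small warehouse that is only queried during working hours can cost a fraction of an always-on one.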

    Hope this helps!