question

BernzweigBruce-9887 asked DuaneArnold-0443 commented

What are the pros and cons of using stored procedures vs C# code for DB work?

If there is a better place to post this, please advise. Based on the available tags, I feel as if I'm not in the correct forum.

My company has a long history of using stored procedures for even the most trivial of DB queries. I am trying to convince them to use EF and do some of the processing in C#.

I'm trying to come up with a list of pros and cons for using stored procedures and would like feedback.

We are about to develop a new application using .NET Framework 4.6 and the DB is SQL Server.

Pros:
1. The company (developers) has lots of experience working with stored procedures.
2. The deployment process is easier when we only have to deploy .sql scripts vs server side code.
3. Stored procedures handle large quantities of data much better; in fact EF has some limitations on how much data can be handled.

Cons:
1. The typical stored procedure with business logic can be a single file with 1000+ lines of code. This is very difficult to work with and debug, and it is very hard to work on a small section of code that may not be at the beginning of the sp. In C# I can easily encapsulate a small piece of logic into its own class/method and work with it in isolation.
2. I can't mock data in a sp. I can only mock the results.
3. The DB's are too large for us to have multiple copies at a given time. We are not permitted to have local copies.
4. Multiple users can access the same table and interfere with a test if the test writes directly to the DB.
5. It may not be feasible or even possible to change data in some tables to set up certain scenarios.
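To make cons #1 and #2 concrete, here is a minimal sketch (all names below are hypothetical) of what I mean: the business rule sits in a plain class behind an interface, so a unit test can feed it fake in-memory data without touching any database.

```csharp
using System.Collections.Generic;
using System.Linq;

public class Order
{
    public decimal Total { get; set; }
}

// The only thing that ever touches the database.
public interface IOrderRepository
{
    IEnumerable<Order> GetOrdersForCustomer(int customerId);
}

// The business rule lives in a small, focused class
// instead of the middle of a 1000-line stored procedure.
public class DiscountCalculator
{
    private readonly IOrderRepository _repository;

    public DiscountCalculator(IOrderRepository repository)
    {
        _repository = repository;
    }

    // Hypothetical rule: customers who have spent over 1000 get 10% off.
    public decimal GetDiscountRate(int customerId)
    {
        decimal spent = _repository.GetOrdersForCustomer(customerId).Sum(o => o.Total);
        return spent > 1000m ? 0.10m : 0m;
    }
}

// In a unit test the repository is replaced by an in-memory fake,
// so no shared tables or other users are involved.
public class FakeOrderRepository : IOrderRepository
{
    private readonly List<Order> _orders;

    public FakeOrderRepository(List<Order> orders)
    {
        _orders = orders;
    }

    public IEnumerable<Order> GetOrdersForCustomer(int customerId)
    {
        return _orders;
    }
}
```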

What are your thoughts, and is there anything I'm missing?
Have you ever been in this situation? If so, how did you resolve it?
Are there solutions to the cons that I'm unaware of (ideally ones that don't cost a lot of money)?

Thanks,


sql-server-general · dotnet-csharp · dotnet-entity-framework-core

I take exception to the "We are not permitted to have local copies [of the DB]". Now, I understand not having a copy of the production database with real, possibly confidential, data. But you should have a local database with the same schema, populated with fake data. It will make a huge difference from the point of view of testing. The number of hours of development work saved in this way will be much larger than the (comparatively) small amount of time devoted to writing a few scripts to generate fake data.
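As a sketch of how cheap those scripts can be (table and column names here are hypothetical), a single T-SQL statement using a numbers-table technique can fill a schema-only copy with thousands of fake rows:

```sql
-- Populate a schema-only copy of a hypothetical Customers table
-- with 10,000 fake rows. No real or confidential data involved.
SET NOCOUNT ON;

WITH Numbers AS (
    SELECT TOP (10000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
    FROM sys.all_objects a CROSS JOIN sys.all_objects b
)
INSERT INTO dbo.Customers (CustomerId, Name, Email)
SELECT n,
       CONCAT('Customer ', n),
       CONCAT('customer', n, '@example.test')
FROM Numbers;
```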

ErlandSommarskog answered

I am not sure that I get all the points about mocking data and not having local copies. Those problems would be the same whether the logic is in stored procedures or in C#.

Yes, it is true that when it comes to modularisation, T-SQL is not fantastic. The query language is fantastic, but as a general programming environment, yeah, we've all seen better.

Nevertheless, I much prefer using stored procedures, since working where the data is means fewer round trips, and keep this in mind: when you work with data, you may be working with lots of data: GB or even TB.

As for EF and other ORMs, I don't see the point of a tool that generates SQL queries - SQL is a fairly high-level language itself. This often leads to problems where the DBA and the devs talk different languages. If you want an ORM, try something like Dapper, which does not hide the SQL.
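A minimal Dapper sketch (connection string, table and column names are hypothetical) of what I mean by not hiding the SQL - the query stays in plain sight where a DBA can read and tune it, and Dapper only handles parameterization and mapping:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using Dapper;

public class Customer
{
    public int CustomerId { get; set; }
    public string Name { get; set; }
}

public static class CustomerQueries
{
    // The SQL is written by hand; Dapper maps the rows to Customer objects
    // and sends @Region as a proper parameter (no injection risk).
    public static IEnumerable<Customer> GetByRegion(string connectionString, string region)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            return connection.Query<Customer>(
                "SELECT CustomerId, Name FROM dbo.Customers WHERE Region = @Region",
                new { Region = region });
        }
    }
}
```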

However, if your organisation already has the business logic in stored procedures, I think you should stick to that. EF and the like may give shorter turnaround times in development initially, but for a large system that is going to live a long time, it may not be a fantastic choice.


TimonYang-MSFT answered

The purpose of the ORM framework is to simplify the work of developers at the expense of some performance, but I personally think that in most cases, it is worthwhile.

Most database operations in a project should be simple CRUD. It is not a good idea to spend energy on these tasks. We should let the ORM framework complete these trivial tasks for us.

If some tasks are complex or have high performance requirements, then it is certainly better to use stored procedures.

No one asked us to choose one of the two.

We should choose suitable methods in suitable places according to the actual situation.
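In EF6 terms (matching .NET Framework 4.6; entity and procedure names below are hypothetical), that mix could look like this: the ORM handles the trivial CRUD, and a stored procedure is called for the heavy work.

```csharp
using System.Data.Entity;       // EF6, the EF generation for .NET Framework 4.6
using System.Data.SqlClient;

public class Customer
{
    public int CustomerId { get; set; }
    public string Name { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
}

public static class MixedApproach
{
    public static void Run()
    {
        using (var db = new ShopContext())
        {
            // Trivial CRUD: let the ORM write the SQL.
            db.Customers.Add(new Customer { Name = "Contoso" });
            db.SaveChanges();

            // Complex or performance-critical work: call a stored procedure.
            // (dbo.RecalculateInvoices is a hypothetical name.)
            db.Database.ExecuteSqlCommand(
                "EXEC dbo.RecalculateInvoices @Year",
                new SqlParameter("@Year", 2021));
        }
    }
}
```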




MartinCairney-6481 answered

For the testing side, I suggest you investigate the tSQLt framework. It allows testing without affecting the data in the tables and without other developers affecting you.
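A small tSQLt sketch (the table, function, and column names are hypothetical): FakeTable swaps the real table for an empty, constraint-free copy, so the test never reads or writes real data and cannot be disturbed by other sessions.

```sql
EXEC tSQLt.NewTestClass 'OrderTests';
GO
CREATE PROCEDURE OrderTests.[test GetOrderTotal sums line amounts]
AS
BEGIN
    -- Replace dbo.OrderLines with an isolated fake for this test only.
    EXEC tSQLt.FakeTable 'dbo.OrderLines';

    INSERT INTO dbo.OrderLines (OrderId, Amount) VALUES (1, 10.00), (1, 15.00);

    -- dbo.GetOrderTotal is a hypothetical scalar function under test.
    DECLARE @actual money = dbo.GetOrderTotal(1);

    EXEC tSQLt.AssertEquals @Expected = 25.00, @Actual = @actual;
END;
GO
EXEC tSQLt.Run 'OrderTests';
```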

Also, unless you need to test performance specifically, you can mock the production data using many tools (e.g. Redgate SQL Data Generator) which can give the same consistent patterns and distributions without the volume and without including potentially PII data in test databases.

There are a number of areas where stored procedures hugely outperform ORM and client side code. With a Stored Proc you are unlikely to ever encounter the N+1 Selects issue, you tend not to ever return columns that are not needed, and the data is almost always filtered to only the required rows being returned. Processing this at the client side involves more round trips (higher latency) and more data volume moving between server and client.
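To illustrate the N+1 issue in EF6 terms (entity names hypothetical): the slow version issues one query for the customers plus one more per customer when the lazy-loaded navigation property is touched; the fast version pushes the whole aggregation into a single SQL statement.

```csharp
using System.Collections.Generic;
using System.Data.Entity;   // EF6
using System.Linq;

public class Customer
{
    public int CustomerId { get; set; }
    // virtual => EF lazy loading: first access to Orders triggers a query.
    public virtual ICollection<Order> Orders { get; set; }
}

public class Order
{
    public int OrderId { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
}

public static class NPlusOneDemo
{
    // N+1: one query for the customers, then one more per customer.
    public static void Slow(ShopContext db)
    {
        foreach (var customer in db.Customers.ToList())     // 1 query
        {
            int count = customer.Orders.Count;              // +1 query each
        }
    }

    // One round trip: the counting happens inside SQL Server.
    public static void Fast(ShopContext db)
    {
        var counts = db.Customers
            .Select(c => new { c.CustomerId, OrderCount = c.Orders.Count })
            .ToList();
    }
}
```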

Additionally I find that I can better secure the data using Stored Procs - no need to allow any users to have SELECT/UPDATE/DELETE permissions on any tables when using ownership chains with the procs.
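A sketch of that permission model (object names hypothetical): because the proc and the table share an owner, the ownership chain lets the proc read the table even though the application user has no rights on it.

```sql
-- The proc filters and projects; the caller never sees the table directly.
CREATE PROCEDURE dbo.GetCustomerOrders
    @CustomerId int
AS
    SELECT OrderId, OrderDate, Total
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
GO

-- AppUser can run the proc; SELECT on dbo.Orders is never granted.
GRANT EXECUTE ON dbo.GetCustomerOrders TO AppUser;
```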


Darius-2625 answered

You definitely have the right thinking about this. Stored procedures have two uses, in my opinion, and they are performance and feature related. One use is to take advantage of the features of the SQL language that are not supported by any ORM. The other is to do higher volume processing where it makes no sense to transfer the data back and forth to a calling application.

All other data access should be done using a decent ORM, like EF or Nfinity.Data (new kid on the block).

The advantages of ORMs are many, some of which you mentioned, like encapsulation, and they are essential for code maintainability and quality: you get compile-time safety in your application, the ability to write integration tests, and so on. And you can avoid the hell that many, often nested, stored procedures and functions inevitably create.

ORMs don't perform any worse than stored procedures for regular operations, but be careful with EF, as it has some gotchas, like creating Cartesian products, which will kill an app; and always use AsNoTracking for read-only queries (I've personally benchmarked the difference, and it's marked). I prefer simplicity, and Nfinity.Data has many advantages in code quality and performance: bulk inserts are at least 5x faster than EF, and it has full support for stored procedures (strongly typed, too).
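The AsNoTracking point, as an EF6 sketch (entity names hypothetical): for read-only queries it skips the change tracker, so EF does not snapshot every materialized entity, which saves CPU and memory on large result sets.

```csharp
using System.Collections.Generic;
using System.Data.Entity;   // EF6; AsNoTracking also exists in EF Core
using System.Linq;

public class Order
{
    public int OrderId { get; set; }
    public decimal Total { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Order> Orders { get; set; }
}

public static class ReadOnlyQueries
{
    // Entities returned here are not tracked: cheaper to materialize,
    // but changes to them will NOT be picked up by SaveChanges().
    public static List<Order> LargeOrders(ShopContext db)
    {
        return db.Orders
            .AsNoTracking()
            .Where(o => o.Total > 1000m)
            .ToList();
    }
}
```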


DuaneArnold-0443 answered ErlandSommarskog commented

Myself, I avoid sprocs like the plague, because they multiply like rabbits and grow out of control, and they end up holding business logic that belongs in the behavior/methods of the domain/business objects of the application, written in an OO language. Oracle shops have the same bad habit: the Oracle packages get out of control, business logic lives in them, and no one knows what all those packages are doing or even how to test one when needed.

I have faced the frustration and nightmare of working in a shop that has gone MS SQL Server stored procedure or Oracle package happy.

It seems to me that your software development department and the DBA(s) are stuck around the year 2000 or so, prior to ORMs becoming popular. Any solution's interaction with the underlying database technology should be restricted to simple CRUD operations.

Using an ORM lets the software developer use OOP for the simple CRUD operations against the underlying DB technology. EF can use raw T-SQL or a sproc that is limited to simple CRUD operations with the underlying database.

There is a new trend going on where developers use Dapper to query the database for speed, and they use EF for the data persistence in the same solution.




It seems to me that your software development department and the DBA(s) are stuck around the year 2000 or so, prior to ORMs becoming popular.

Oh, ORMs are still not popular among database people in the year 2021. We were told that ORMs would remove the impedance mismatch between object-oriented and relational. But they didn't. They just moved the impedance mismatch so that it sits between developers and DBAs instead of in the code.


DBA(s) don't write enterprise-level software solutions; software developers do that. And software developers shouldn't be shackled to what a DBA thinks. If a software developer wants to use an ORM, or wants to use inline parameterized T-SQL in the DAL with the DAO pattern or the repository pattern, then who is the DBA to block him or her based on outdated legacy thinking?


In the end, what matters is the user experience. And a not unimportant part of the user experience is performance. Remember that when you work with data, you may be working with lots of data, gigabytes or even terabytes. If you do it the wrong way, it will be very slow.

If the DBA and the developers cannot understand each other because they live in different realms, the users will suffer.

For the record, I am a software developer. And all my experience tells me that the business logic belongs where the data is. In the database. And I care about my end users.

DuaneArnold-0443 answered DuaneArnold-0443 commented

@ErlandSommarskog

I am replying to your last comment. Maybe I will start using Chrome, since replying in a 'comment' thread seems to have problems in Edge and one can't submit.

You are talking about exceptions to the norm. Remember where you are: you are in an EF tag, for an ORM that is used primarily in the back-end infrastructure of client/service applications such as ASP.NET web solutions, or even Windows desktop solutions such as WPF and Windows Forms using a client/service architecture such as layered or N-tier, where the concerns are speed and responsiveness in front of the user.

The data access lives in that environment, and there is no large amount of data, gigabytes' worth, traveling over the wire that would bring a client/service solution to its knees, and the business logic sits in the domain or business-logic layer.

For what you are talking about, I would find a way to use the SQLCLR with .NET code that can be unit-tested with a test harness outside MS SQL Server, and have the whole thing hosted by MS SQL Server Service Broker in production, running long-running tasks in an async manner.

https://en.wikipedia.org/wiki/SQL_CLR
https://www.sqlservercentral.com/articles/service-broker-part-1-service-broker-basics


You are talking about exceptions to the norm.

Well, it's quite a typical example of something I have developed many times in my career. Process a lot of data and produce a result, either to be stored in the database, or to be returned to the user. The amount of data to process largely exceeds what needs to be returned.

Remember where you are: you are in an EF tag for an ORM

Not sure what you mean by this, but if you are referring to forum tags, I'm coming from the tag sql-server-general.


Well, EF and other ORMs do not shine in batch processing and processing huge amounts of data. ORMs shine in the high-performance client/service solutions I have discussed. And I cringe when I see sprocs, Oracle packages, DataSets and DataTables being used in those solutions.
