• 0

[ASP.NET, Design] Business Logic Layer or Stored procedures?


Question

Hello,

I'm trying to see which route is best: having a Business Logic Layer, or having all the logic as stored procedures in the database.

The application architecture looks like this: Data Access Layer (LINQ to SQL) -> Business Logic Layer (Needed?) -> UI Layer (Web).

Does it make sense to have the middle layer if everything is done via stored procedures? If not, what do you recommend for the BLL? Could you advise on both routes, with their advantages and disadvantages? Kindly note that the application is married to the database (SQL Server 2005), so there is no need to take into consideration that the DBMS will change in the future.

Thanks.

14 answers to this question

Recommended Posts

  • 0

I generally stick to the 3-layer architecture for my (admittedly quite small) ASP.NET systems. The business logic layer (even with stored procedures) lets you add exception handling and more advanced input validation between the presentation layer and the LINQ to SQL layer, and for that reason I would recommend keeping it.
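To illustrate, here's a minimal sketch of the kind of method that layer might hold. ItemService, ItemsDataContext and the Item properties are placeholder names for the example, not code from an actual project:

    using System;
    using System.Data.SqlClient;

    // Placeholder business-layer class; ItemsDataContext and Item stand in for
    // whatever LINQ to SQL generated from your DBML.
    public class ItemService
    {
        public void AddItem(string name, decimal price)
        {
            // Input validation that belongs neither in the page nor in a stored procedure
            if (string.IsNullOrEmpty(name))
                throw new ArgumentException("Item name is required.", "name");
            if (price < 0)
                throw new ArgumentException("Price cannot be negative.", "price");

            try
            {
                using (var db = new ItemsDataContext())
                {
                    db.Items.InsertOnSubmit(new Item { Name = name, Price = price });
                    db.SubmitChanges();
                }
            }
            catch (SqlException ex)
            {
                // Translate data-access failures into something the UI layer can show sensibly
                throw new ApplicationException("The item could not be saved.", ex);
            }
        }
    }

The UI then calls ItemService.AddItem rather than touching the DataContext directly.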

  • 0

Thanks for your input. What kind of class naming do you recommend for this layer? The data access layer already has the entity objects (like Item, Subject).

  On 25/01/2010 at 08:21, Majesticmerc said:

I generally stick to the 3-layer architecture for my (admittedly quite small) ASP.NET systems. The business logic layer (even with stored procedures) lets you add exception handling and more advanced input validation between the presentation layer and the LINQ to SQL layer, and for that reason I would recommend keeping it.

  • 0

I always use both:

1) Presentation layer

2) Business layer (most of the time I still use ADO.NET) that calls the stored procedures (see the sketch after this list)

3) Stored procedures that contain most of the transactional stuff

4) Database with foreign keys for data integrity
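A rough sketch of how layers 2 and 3 meet. OrderService, usp_Order_Insert and the parameters are invented for the example; the procedure is assumed to handle its own transaction and return the new key:

    using System;
    using System.Data;
    using System.Data.SqlClient;

    // Business-layer class calling a stored procedure through plain ADO.NET.
    // usp_Order_Insert is assumed to wrap its inserts in a transaction (layer 3)
    // and to SELECT the new order id at the end.
    public class OrderService
    {
        private readonly string _connectionString;

        public OrderService(string connectionString)
        {
            _connectionString = connectionString;
        }

        public int CreateOrder(int customerId, decimal total)
        {
            using (var connection = new SqlConnection(_connectionString))
            using (var command = new SqlCommand("usp_Order_Insert", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                command.Parameters.AddWithValue("@CustomerId", customerId);
                command.Parameters.AddWithValue("@Total", total);

                connection.Open();
                return Convert.ToInt32(command.ExecuteScalar());
            }
        }
    }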

  • 0

One thing I cannot understand is why some developers have a layer which talks to the DB using one entity type and then converts to another so the main application can use it... all you're doing is adding more work to your already long list of things to do! Designing the database properly will combat this, so you can in a sense merge the entity layer and business logic layer into one.

Have a look at this: http://www.west-wind.com/WebLog/posts/160237.aspx

The code files are here: http://www.west-wind.com/files/conferences/conn_LinqToSqlBusiness.zip

Basically it's a DataContext lifetime management and factory wrapper class.

How to use it:

Create a new project, and create two folders within the project called "BusinessFramework" and "BusinessObjects". Grab the "wwDataContext.cs" and "wwBusinessObject.cs" files and put them into the "BusinessFramework" folder. I rename them to "<shortProjectName>BusinessObject.cs" and "<shortProjectName>BusinessDataContext.cs", but rename them as you see fit.

Create a DBML file (called YourData, which then becomes YourDataDataContext) in the "BusinessFramework" folder, then right-click WITHIN the DBML and select Properties. Under "Base Class", enter "<shortProjectName>BusinessDataContext"; under Connection, set "Application Settings" to False and "Connection String" to None.

Right-click on the DBML and select "View Code".

You *may* need to change the namespace on this class first (it'll depend); it should be the full namespace of where the file is, e.g.:

someNamespace.projectName.Data.BusinessFramework

rather than

someNamespace.projectName.Data

It should look something like this:

    public partial class DBMLNameDataContext         
    {

        public DBMLNameDataContext()
            : base(ConfigurationManager.ConnectionStrings["YourConnectionStringNameFromApp.Config"].ConnectionString, new AttributeMappingSource())
        {
            OnCreated();
        }

    }

Create an app.config file within the root of the Project. Add a new Connection String, putting the correct name into the class above.

Add a table into the DBML; below, this will be signified by <entityName>.

Create a new class file (within the "BusinessObjects" folder) called "Bus<entityName>.cs". Here we have tbl_Engineer, which is named "Engineer" in the DBML, so the file becomes "BusEngineer.cs".

    public class BusEngineers : <shortProjectName>BusinessObject<tbl_Engineer, YourDataDataContext>
    {
    }

Within the root of the project, create a new class file called "<shortProjectName>DataFactory.cs", and within it add:

        public static BusEngineers GetEngineers
        {
            get
            {
                return new BusEngineers();
            }
        }

So, to use the whole data layer in another project, add:

Type <shortProjectName>DataFactory.GetEngineers. and IntelliSense will then list all the public methods within your BusEngineers class.

You could even move all the Business Objects out into another project if you needed to.

Done - I think.....any questions feel free to PM me.

GE

  • 0

Thanks a lot, I'll look into that :).

  On 25/01/2010 at 11:04, garethevans1986 said:

One thing I cannot understand is why some developers have a layer which talks to the DB using one entity type and then converts to another so the main application can use it... all you're doing is adding more work to your already long list of things to do! Designing the database properly will combat this, so you can in a sense merge the entity layer and business logic layer into one.

Have a look at this: http://www.west-wind...sts/160237.aspx


  • 0

Yeah, I should probably mention too that I work mostly in ADO.NET, where business logic layers are pretty standard, so the method that Gareth mentioned might be better suited to you, although only you can make that decision :)

  • 0

A correction to the above post, because of this:

https://connect.microsoft.com/VisualStudio/feedback/ViewFeedback.aspx?FeedbackID=361577&wa=wsignin1.0

namespace CompanyName.ProjectName.Data.BusinessFramework
{

    // DO NOT REMOVE
    using System.Configuration;
    using System.Data.Linq.Mapping;
    // SEE HERE - https://connect.microsoft.com/VisualStudio/feedback/ViewFeedback.aspx?FeedbackID=361577&wa=wsignin1.0
    // GE, 26/01/10

    public partial class DBMLNameDataContext
    {


        public DBMLNameDataContext()
            : base(ConfigurationManager.ConnectionStrings["YourConnectionStringNameFromApp.Config"].ConnectionString, new AttributeMappingSource())
        {
            OnCreated();
        }

    }
}

  • 0

I rarely use stored procedures anymore. I used to use them religiously. I had one for every function in the system. Looking back now I wonder what I was thinking. What a waste that was.

In most of my applications, I typically use an architecture that many people call the Onion Architecture. I design my apps so that they're testable (automated testing, not F5 testing) because it's important to me.

I'll have a bunch of repository classes that interface with my database via NHibernate, plus domain classes and service-layer classes when they're required. Outside of those, I'll have presentation model classes for strongly-typed views.
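Roughly, the shape looks like this; all type names below are placeholders for the example rather than real project code. The domain sits in the middle and defines the contracts, and the NHibernate and presentation pieces live in the outer rings and depend inward:

    using NHibernate;

    // Core (domain): entities and repository contracts, with no infrastructure references
    public class Customer
    {
        public virtual int Id { get; set; }
        public virtual string LastName { get; set; }
    }

    public interface ICustomerRepository
    {
        Customer Get(int id);
    }

    // Infrastructure (outer ring): NHibernate-backed implementation of the domain contract
    public class NHibernateCustomerRepository : ICustomerRepository
    {
        private readonly ISession _session;

        public NHibernateCustomerRepository(ISession session)
        {
            _session = session;
        }

        public Customer Get(int id)
        {
            return _session.Get<Customer>(id);
        }
    }

    // Presentation (outer ring): a presentation model for a strongly-typed view
    public class CustomerDetailsModel
    {
        public string LastName { get; set; }
    }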

  • 0
  On 26/01/2010 at 15:59, sbauer said:

I rarely use stored procedures anymore. I used to use them religiously. I had one for every function in the system. Looking back now I wonder what I was thinking. What a waste that was.

In most of my applications, I typically use an architecture that many people call the Onion Architecture. I design my apps so that they're testable (automated testing, not F5 testing) because it's important to me.

I'll have a bunch of repository classes that interface with my database via NHibernate, plus domain classes and service-layer classes when they're required. Outside of those, I'll have presentation model classes for strongly-typed views.

Looks like a nice article, gonna give that a read. Cheers!

I must admit that I used to use stored procedures for ALL my SQL and, to be honest, it did little to nothing to help the structure of my systems. Now, since I'm fortunate enough to develop in ASP.NET (no MVC yet though, which is a bummer), I tend to take advantage of the ADO.NET dataset designer to handle all my SQL statements, which makes life much easier since I can manage the entire system from inside Visual Studio.
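For anyone unfamiliar with that route, a rough sketch of what the dataset-designer approach looks like from a page's code-behind. NorthwindDataSet and EmployeesTableAdapter stand in for whatever the designer generates from your .xsd, and GetData() is the default query method it emits:

    // In a WebForms code-behind; NorthwindDataSet / EmployeesTableAdapter are hypothetical
    // designer-generated types, and the SQL behind GetData() lives in the .xsd designer.
    protected void Page_Load(object sender, EventArgs e)
    {
        var adapter = new NorthwindDataSetTableAdapters.EmployeesTableAdapter();
        NorthwindDataSet.EmployeesDataTable employees = adapter.GetData();

        foreach (NorthwindDataSet.EmployeesRow row in employees)
        {
            Response.Write(row.LastName);   // strongly-typed column access, no magic strings
        }
    }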

  • 0
  Quote

I rarely use stored procedures anymore. I used to use them religiously. I had one for every function in the system. Looking back now I wonder what I was thinking. What a waste that was.

Could you tell us why it was a waste? I'm interested in knowing.

  • 0
  On 26/01/2010 at 23:58, Majesticmerc said:

Looks like a nice article, gonna give that a read. Cheers!

I must admit that I used to use stored procedures for ALL my SQL and, to be honest, it did little to nothing to help the structure of my systems. Now, since I'm fortunate enough to develop in ASP.NET (no MVC yet though, which is a bummer), I tend to take advantage of the ADO.NET dataset designer to handle all my SQL statements, which makes life much easier since I can manage the entire system from inside Visual Studio.

Stored procedures are such a fun topic because many people love them. I did too. I loved how my code looked. I loved how I had SQL sitting in the database, not in my code. Eventually, though, I got tired of it, like you. I don't use the ADO.NET dataset designer because I don't feel it scales well to larger systems; it starts to break down.

It's unfortunate that you can't use MVC. If I never work in a WebForms project again, I'll be happy.

  On 27/01/2010 at 07:35, Ali Koubeissi said:

Could you tell us why it was a waste? I'm interested in knowing.

  On 27/01/2010 at 10:02, garethevans1986 said:

Because your "business logic" should not be in the database...

That's part of the reasoning behind it.

We use unit tests on our projects. Our unit tests ensure that 1) our features work as designed and 2) our features still work even after we've added new features. When you add business logic to stored procedures, you're beyond unit tests now. Unit tests need to be fast and isolated - a small unit. Testing the database is not quick, and it's not a small unit. You'll need to create integration tests, and those will be much slower. When things run slowly, people tend not to use them.
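To make that concrete, here is a rough sketch of the kind of fast, in-process test I mean: a business rule exercised against a hand-rolled fake repository, with no database involved. The types and the NUnit usage are illustrative only; the repository interface just mirrors the SearchByLastName method shown further down:

    using System.Collections.Generic;
    using NUnit.Framework;

    // Placeholder domain/contract types so the sketch stands alone.
    public class Employee { public string LastName { get; set; } }

    public interface IEmployeeRepository
    {
        IList<Employee> SearchByLastName(string lastName);
    }

    // Business rule kept out of the database: blank searches never hit the repository.
    public class EmployeeSearchService
    {
        private readonly IEmployeeRepository _repository;

        public EmployeeSearchService(IEmployeeRepository repository)
        {
            _repository = repository;
        }

        public IList<Employee> Search(string lastName)
        {
            if (string.IsNullOrEmpty(lastName))
                return new List<Employee>();
            return _repository.SearchByLastName(lastName);
        }
    }

    // In-memory fake instead of the NHibernate-backed repository, so the test never touches SQL Server.
    public class FakeEmployeeRepository : IEmployeeRepository
    {
        public IList<Employee> SearchByLastName(string lastName)
        {
            return new List<Employee> { new Employee { LastName = "Smith" } };
        }
    }

    [TestFixture]
    public class EmployeeSearchServiceTests
    {
        [Test]
        public void Blank_search_term_returns_no_employees()
        {
            var service = new EmployeeSearchService(new FakeEmployeeRepository());
            Assert.That(service.Search(""), Is.Empty);
        }

        [Test]
        public void Real_search_term_is_delegated_to_the_repository()
        {
            var service = new EmployeeSearchService(new FakeEmployeeRepository());
            Assert.That(service.Search("Smi").Count, Is.EqualTo(1));
        }
    }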

Another reason is that I stopped writing most of my SQL by hand. No more CRUD sprocs or ad hoc statements; NHibernate takes care of that for us. If we have a very complicated query to execute and it looks pretty ugly through the NHibernate API, then we'll write the SQL by hand.

So far, so good. We'll have a repository that inherits from a base repository for each entity in the system.

    using System.Collections.Generic;
    using NHibernate;
    using NHibernate.Criterion;

    public abstract class RepositoryWithTypedKey<KEY, ENTITY> : IRepositoryWithTypedKey<KEY, ENTITY> where ENTITY : class
    {
        protected ISession session;

        protected RepositoryWithTypedKey(ISession session)
        {
            this.session = session;
        }

        public ENTITY Get(KEY id)
        {
            return Session.Get<ENTITY>(id);
        }

        public ENTITY Load(KEY id)
        {
            return Session.Load<ENTITY>(id);
        }

        public IEnumerable<ENTITY> GetAll()
        {
            return Session.CreateCriteria<ENTITY>().List<ENTITY>();
        }

        public void Save(ENTITY entity)
        {
            Session.Save(entity);
        }

        public void Update(ENTITY entity)
        {
            Session.Update(entity);
        }

        public void SaveOrUpdate(ENTITY entity)
        {
            Session.SaveOrUpdate(entity);
        }

        public ISession Session { get { return session; } }
    }

    // Repository<ENTITY> (not shown here) presumably derives from RepositoryWithTypedKey<int, ENTITY>
    public class EmployeeRepository : Repository<Employee>, IEmployeeRepository
    {
        public EmployeeRepository(ISession session) : base(session)
        {
        }

        public IList<Employee> SearchByLastName(string lastName)
        {
            var criteria = Session.CreateCriteria<Employee>();
            criteria.Add(Expression.Like("LastName", lastName + "%"));
            criteria.AddOrder(Order.Asc("LastName"));
            return criteria.List<Employee>();
        }
    }

And the controller:

    using System.Web.Mvc;

    public class HomeController : Controller
    {
        private readonly IEmployeeRepository _employeeRepository;

        public HomeController(IEmployeeRepository employeeRepository)
        {
            _employeeRepository = employeeRepository;
        }

        [HttpGet]
        public ActionResult Index()
        {
            return View();
        }

        [HttpPost]
        [ActionName("Index")]
        public ActionResult IndexSearch(string search)
        {
            return RedirectToAction("Search", new { searchTerm = search });
        }

        [Transaction]   // custom attribute (not shown) that presumably manages the NHibernate transaction
        public ActionResult Search(string searchTerm)
        {
            if (string.IsNullOrEmpty(searchTerm))
                return RedirectToAction("Index");

            var model = new HomeControllerViewModel();
            model.Employees = _employeeRepository.SearchByLastName(searchTerm);
            model.Search = searchTerm;
            return View(model);
        }
    }
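For completeness, a rough sketch of how those pieces get wired together. In the real application a DI container (plus whatever action filter sits behind the [Transaction] attribute) does this per request; written out by hand, and assuming a standard hibernate.cfg.xml, it amounts to something like:

    using NHibernate;
    using NHibernate.Cfg;

    // Hand-wired composition sketch; a DI container would normally do this for you.
    public static class CompositionRoot
    {
        // Built once per application; reads the NHibernate settings from hibernate.cfg.xml
        private static readonly ISessionFactory SessionFactory =
            new Configuration().Configure().BuildSessionFactory();

        public static HomeController CreateHomeController()
        {
            ISession session = SessionFactory.OpenSession();          // one session per request
            IEmployeeRepository repository = new EmployeeRepository(session);
            return new HomeController(repository);
        }
    }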

  • 0

I've come across NHibernate, Fluent NHibernate and Dependency Injection before.

The only problem I have with NHibernate is: how do you script the schema changes to keep your SQL Server databases up to date?

GE

This topic is now closed to further replies.