Compilation TargetFramework vs Runtime TargetFramework in ASP.NET MVC
If you're an ASP.NET programmer, you may have noticed various settings in the web.config that left you confused, but since everything worked, you never bothered to dig deeper. One of the things that often confuses new web application developers is the difference between the compilation and run-time target frameworks.
For example, you might see this in your web.config:
<system.web>
  <compilation debug="true" targetFramework="4.7.1" />
  <httpRuntime targetFramework="4.6" />
</system.web>
What are these two elements and what do they do?
The run-time target framework configuration was introduced in .NET 4.5 as a way to opt in to newer functionality. If the run-time target framework is lower than the framework the application was compiled against, ASP.NET applies a "quirks mode": it emulates the behavior of the lower version to preserve backwards compatibility (where possible). This matters, for example, when the application is deployed to a server running a lower (but compatible) version of the .NET Framework.
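As an illustration (the version numbers here are chosen for the example, not taken from any particular project), a web.config like the following would compile the application against 4.7.1 but run it with 4.5-era quirks behavior:

```xml
<system.web>
  <!-- Compiled against 4.7.1 APIs... -->
  <compilation debug="false" targetFramework="4.7.1" />
  <!-- ...but the run-time emulates 4.5 behavior ("quirks mode") -->
  <httpRuntime targetFramework="4.5" />
</system.web>
```

All 4.7.1 APIs remain callable in this configuration; what changes is the run-time's behavioral defaults, which stay pinned to what a 4.5-targeted application would have seen.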
The compilation target framework, on the other hand, is supposed to represent the framework that was targeted at compile time. The ASP.NET run-time can actually infer this value, but it often appears in the web.config anyway because Visual Studio requires it.
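Unless you deliberately want quirks-mode behavior, a reasonable default (this is a recommendation, not something mandated by ASP.NET) is to keep the two values in sync so the run-time behavior matches what the code was compiled against:

```xml
<system.web>
  <!-- Keeping both values aligned avoids unintentionally
       opting in to an older version's quirks behavior -->
  <compilation debug="false" targetFramework="4.7.1" />
  <httpRuntime targetFramework="4.7.1" />
</system.web>
```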