<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>sql database Archives - Blog IT</title>
	<atom:link href="https://blogit.create.pt/tag/sql-database/feed/" rel="self" type="application/rss+xml" />
	<link>https://blogit.create.pt/tag/sql-database/</link>
	<description>Create IT blogger community</description>
	<lastBuildDate>Mon, 07 Nov 2022 11:39:06 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.1</generator>
	<item>
		<title>Entity Framework &#8211; Update Database using Azure Pipelines</title>
		<link>https://blogit.create.pt/vini/2022/11/07/entity-framework-update-database-using-azure-pipelines/</link>
					<comments>https://blogit.create.pt/vini/2022/11/07/entity-framework-update-database-using-azure-pipelines/#respond</comments>
		
		<dc:creator><![CDATA[Vinícius Biavatti]]></dc:creator>
		<pubDate>Mon, 07 Nov 2022 11:18:06 +0000</pubDate>
				<category><![CDATA[Architecture]]></category>
		<category><![CDATA[Deployment]]></category>
		<category><![CDATA[Databases]]></category>
		<category><![CDATA[Microsoft Azure]]></category>
		<category><![CDATA[azure]]></category>
		<category><![CDATA[DevOps]]></category>
		<category><![CDATA[entity framework]]></category>
		<category><![CDATA[migrations]]></category>
		<category><![CDATA[pipelines]]></category>
		<category><![CDATA[sql database]]></category>
		<guid isPermaLink="false">https://blogit.create.pt/?p=12770</guid>

					<description><![CDATA[<p>Introduction Pipelines give us the opportunity to automate development operations such as deployments, migrations, and tests. Focusing on the deployment routine, some applications need additional operations to ensure consistency between the application and other resources, such as the database. During development, developers usually customize and improve the [&#8230;]</p>
<p>The post <a href="https://blogit.create.pt/vini/2022/11/07/entity-framework-update-database-using-azure-pipelines/">Entity Framework &#8211; Update Database using Azure Pipelines</a> appeared first on <a href="https://blogit.create.pt">Blog IT</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<h2 class="wp-block-heading">Introduction</h2>



<p>Pipelines give us the opportunity to automate development operations such as deployments, migrations, and tests. Focusing on the deployment routine, some applications need additional operations to ensure consistency between the application and other resources, such as the database.</p>



<p>During the development process, developers usually customize and improve the application database within the development environment, which can be done easily with frameworks that generate source files responsible for keeping the history of schema changes and for updating the database to a given version. These files are known as migrations.</p>



<p>In this post, we will see how to apply migrations to databases located in different environments (staging, production, etc.) using the Code-First approach and Entity Framework Core.</p>



<h2 class="wp-block-heading">Migrations</h2>



<p>To generate database migrations, we can execute the &#8220;Add-Migration &lt;name&gt;&#8221; command from Entity Framework. This command generates source files containing the code to apply the migration to the database. To execute these migration files, you just need to run the &#8220;Update-Database&#8221; command. However,&nbsp;<a href="https://learn.microsoft.com/en-us/ef/core/managing-schemas/migrations/applying?tabs=dotnet-core-cli#sql-scripts" target="_blank" rel="noreferrer noopener">Microsoft recommends</a>&nbsp;generating a standalone SQL script and executing it directly against the database, so no intermediary application (C# + EF) needs database access. To do this, we will follow the steps below:</p>
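<p><em>For reference, the same operations are available from the cross-platform dotnet CLI, which is what the pipeline tasks below use; the migration name and output file here are examples only:</em></p>

```shell
# Hypothetical sketch: dotnet CLI equivalents of the Package Manager Console commands.
dotnet ef migrations add AddCustomerTable                        # same as Add-Migration AddCustomerTable
dotnet ef database update                                        # same as Update-Database
dotnet ef migrations script --idempotent --output script.sql     # generate the standalone SQL script instead
```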



<ul class="wp-block-list">
<li>1. Create a Pipeline in Azure DevOps;</li>



<li>2. Install the dotnet-ef tool;</li>



<li>3. Generate SQL script;</li>



<li>4. Execute the SQL script against the database;</li>



<li>5. Script File as a Pipeline Artifact.</li>
</ul>



<p><em><strong>NOTE</strong>: We don&#8217;t need to worry about which migrations need to be applied to the database. The dotnet-ef tool will generate a script containing the needed logic. Check the third step for more details.</em></p>



<h2 class="wp-block-heading">1. Create a Pipeline in Azure DevOps</h2>



<p>We will create a common deployment pipeline in Azure DevOps for an ASP.NET Core backend application. Check the end of the post to see the full pipeline.</p>



<h2 class="wp-block-heading">2. Install the dotnet-ef tool</h2>



<p>To execute Entity Framework tool commands, we have to install the dotnet-ef tool. It must be available in the pipeline&#8217;s context so it can be used for the SQL script generation. To install it from a pipeline task, we can use the task below:</p>


<div class="wp-block-syntaxhighlighter-code "><pre class="brush: yaml; title: ; notranslate">
- task: DotNetCoreCLI@2
  displayName: &#039;Install EF tool&#039;
  inputs:
    command: &#039;custom&#039;
    custom: &#039;tool&#039;
    arguments: &#039;install --global dotnet-ef&#039;
</pre></div>


<h2 class="wp-block-heading">3. Generate SQL script</h2>



<p>To generate the SQL script containing the logic to apply migrations, we will use the dotnet-ef tool. By default, the command does not generate the logic that checks which migrations have already been applied, so we enable this behavior with the --idempotent parameter. With it, we don&#8217;t need to worry about which migrations will be applied to the database, since the generated script already handles this. Below, you can see the task that could be used to generate the script file:</p>


<div class="wp-block-syntaxhighlighter-code "><pre class="brush: yaml; title: ; notranslate">
- task: DotNetCoreCLI@2
  displayName: &#039;Generate SQL Script&#039;
  inputs:
    command: &#039;custom&#039;
    custom: &#039;ef&#039;
    arguments: &#039;migrations script --idempotent --project $(Build.SourcesDirectory)\Data.csproj --output $(System.ArtifactsDirectory)/script.sql&#039;
</pre></div>


<p><em><strong>NOTE</strong>: We don&#8217;t need to establish any database connection for the script generation: since we follow the Code-First approach for entity creation, the command only looks at the project&#8217;s entity source code to generate the SQL file. If we don&#8217;t use the --idempotent argument, the generated SQL lacks the conditional logic that determines which migrations should be applied, causing an error if the database already exists.</em></p>
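<p><em>To make this concrete, below is a hypothetical excerpt of an idempotent script (the migration ID and table are invented for illustration): each statement is wrapped in a check against the __EFMigrationsHistory table, so only migrations missing from the target database actually run.</em></p>

```sql
-- Apply the migration only if it is not yet recorded in the history table
IF NOT EXISTS (SELECT * FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20221107101800_AddCustomerTable')
BEGIN
    CREATE TABLE [Customers] (
        [Id] int NOT NULL IDENTITY,
        [Name] nvarchar(max) NULL,
        CONSTRAINT [PK_Customers] PRIMARY KEY ([Id])
    );
END;
GO

-- Record the migration so it is skipped on the next run
IF NOT EXISTS (SELECT * FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20221107101800_AddCustomerTable')
BEGIN
    INSERT INTO [__EFMigrationsHistory] ([MigrationId], [ProductVersion])
    VALUES (N'20221107101800_AddCustomerTable', N'6.0.0');
END;
GO
```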



<h2 class="wp-block-heading">4. Execute the SQL script against the database</h2>



<p>Now, with the SQL script available, we just need to execute it against the database. In this post we use a common Azure SQL Database as an example, so we can use the following task to accomplish this requirement:</p>


<div class="wp-block-syntaxhighlighter-code "><pre class="brush: yaml; title: ; notranslate">
- task: SqlAzureDacpacDeployment@1
  displayName: &#039;Update Database&#039;
  inputs:
    azureSubscription: &#039;&lt;Service Connection Identifier&gt;&#039;
    AuthenticationType: &#039;connectionString&#039; # You can use other method to authenticate to the database if you want
    ConnectionString: &#039;&lt;Database Connection String&gt;&#039;
    deployType: &#039;SqlTask&#039;
    SqlFile: &#039;$(System.ArtifactsDirectory)/script.sql&#039;
</pre></div>


<h2 class="wp-block-heading">5. Script File as a Pipeline Artifact</h2>



<p>Well done. The last task makes the generated script.sql file available to the user running the pipeline by adding it to the file bundle (artifacts). To check the artifacts, just open the artifacts option in the pipeline details after the job&#8217;s execution.</p>


<div class="wp-block-syntaxhighlighter-code "><pre class="brush: yaml; title: ; notranslate">
- task: PublishBuildArtifacts@1
  displayName: &#039;Publish Artifacts&#039;
  inputs:
    PathtoPublish: &#039;$(System.ArtifactsDirectory)/script.sql&#039;
    ArtifactName: &#039;$(artifactName)&#039;
    publishLocation: &#039;Container&#039;
</pre></div>


<h2 class="wp-block-heading">Conclusion</h2>



<p>In this post, we learned an easy way to apply database migrations to a remote environment using automation (pipelines). This method is simple and performs no database validation before script execution, so if you have an active and sensitive environment, make sure the necessary validations run before the database update to ensure the database is not updated incorrectly.</p>



<h2 class="wp-block-heading">Pipeline Result</h2>


<div class="wp-block-syntaxhighlighter-code "><pre class="brush: yaml; title: ; notranslate">
trigger:
- none

pool:
  vmImage: windows-latest

variables:
  solution: &#039;**/*.sln&#039;
  buildPlatform: &#039;AnyCPU&#039;
  buildConfiguration: &#039;Release&#039;
  artifactName: &#039;artifacts&#039;
  environment: &#039;Development&#039;

steps:
- task: NuGetToolInstaller@1
  displayName: &#039;NuGet Installation&#039;

- task: NuGetCommand@2
  displayName: &#039;NuGet Restore&#039;
  inputs:
    restoreSolution: &#039;$(solution)&#039;

- task: VSBuild@1
  displayName: &#039;Build Project&#039;
  inputs:
    solution: &#039;backend/WebAPI&#039;
    msbuildArgs: &#039;/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation=&quot;$(build.artifactStagingDirectory)&quot;&#039;
    platform: &#039;$(buildPlatform)&#039;
    configuration: &#039;$(buildConfiguration)&#039;

- task: PublishSymbols@2
  displayName: &#039;Publish Symbols&#039;
  inputs:
    SearchPattern: &#039;**/bin/**/*.pdb&#039;
    SymbolServerType: &#039;FileShare&#039;
    CompressSymbols: false

- task: PublishBuildArtifacts@1
  displayName: &#039;Publish Artifacts&#039;
  inputs:
    PathtoPublish: &#039;$(Build.ArtifactStagingDirectory)&#039;
    ArtifactName: &#039;artifacts&#039;
    publishLocation: &#039;Container&#039;

- task: DownloadBuildArtifacts@0
  displayName: &#039;Build Artifacts&#039;
  inputs:
    buildType: &#039;current&#039;
    downloadType: &#039;single&#039;
    artifactName: &#039;$(artifactName)&#039;
    downloadPath: &#039;$(System.ArtifactsDirectory)&#039;

- task: DotNetCoreCLI@2
  displayName: &#039;Install EF tool&#039;
  inputs:
    command: &#039;custom&#039;
    custom: &#039;tool&#039;
    arguments: &#039;install --global dotnet-ef&#039;

- task: DotNetCoreCLI@2
  displayName: &#039;Generate SQL Script&#039;
  inputs:
    command: &#039;custom&#039;
    custom: &#039;ef&#039;
    arguments: &#039;migrations script --idempotent --project $(Build.SourcesDirectory)\Data.csproj --output $(System.ArtifactsDirectory)/script.sql&#039;

- task: SqlAzureDacpacDeployment@1
  displayName: &#039;Update Database&#039;
  inputs:
    azureSubscription: &#039;&lt;Service Connection Identifier&gt;&#039;
    AuthenticationType: &#039;connectionString&#039;
    ConnectionString: &#039;&lt;Connection String&gt;&#039;
    deployType: &#039;SqlTask&#039;
    SqlFile: &#039;$(System.ArtifactsDirectory)/script.sql&#039;

- task: AzureRmWebAppDeployment@4
  displayName: &#039;Deploy Application&#039;
  inputs:
    ConnectionType: &#039;AzureRM&#039;
    azureSubscription: &#039;&lt;Service Connection Identifier&gt;&#039;
    appType: &#039;webApp&#039;
    WebAppName: &#039;armazemlegalapi&#039;
    deployToSlotOrASE: true
    ResourceGroupName: &#039;&lt;Resource Group Identifier&gt;&#039;
    SlotName: &#039;production&#039;
    packageForLinux: &#039;$(System.ArtifactsDirectory)/Project.zip&#039;

- task: PublishBuildArtifacts@1
  displayName: &#039;Publish Artifacts&#039;
  inputs:
    PathtoPublish: &#039;$(System.ArtifactsDirectory)/script.sql&#039;
    ArtifactName: &#039;$(artifactName)&#039;
    publishLocation: &#039;Container&#039;
</pre></div>


<h2 class="wp-block-heading">References</h2>



<ul class="wp-block-list">
<li><a href="https://learn.microsoft.com/en-us/ef/core/managing-schemas/migrations/applying?tabs=dotnet-core-cli#sql-scripts">Microsoft Managing Schemas Article</a></li>
</ul>
<p>The post <a href="https://blogit.create.pt/vini/2022/11/07/entity-framework-update-database-using-azure-pipelines/">Entity Framework &#8211; Update Database using Azure Pipelines</a> appeared first on <a href="https://blogit.create.pt">Blog IT</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blogit.create.pt/vini/2022/11/07/entity-framework-update-database-using-azure-pipelines/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>KendoUI Grid, OData, WebAPI and Entity Framework</title>
		<link>https://blogit.create.pt/andresantos/2017/07/05/kendoui-grid-odata-webapi-and-entity-framework/</link>
					<comments>https://blogit.create.pt/andresantos/2017/07/05/kendoui-grid-odata-webapi-and-entity-framework/#respond</comments>
		
		<dc:creator><![CDATA[André Santos]]></dc:creator>
		<pubDate>Wed, 05 Jul 2017 10:42:53 +0000</pubDate>
				<category><![CDATA[Umbraco]]></category>
		<category><![CDATA[.NET]]></category>
		<category><![CDATA[azure]]></category>
		<category><![CDATA[Backoffice extension]]></category>
		<category><![CDATA[C#]]></category>
		<category><![CDATA[entity framework]]></category>
		<category><![CDATA[grid]]></category>
		<category><![CDATA[kendoui]]></category>
		<category><![CDATA[odata]]></category>
		<category><![CDATA[sql database]]></category>
		<category><![CDATA[webapi]]></category>
		<category><![CDATA[webapp]]></category>
		<guid isPermaLink="false">http://blogit.create.pt/andresantos/?p=1764</guid>

					<description><![CDATA[<p>In the latest project I was involved with, there was a need to present database table data directly in Umbraco&#8217;s backoffice, where all CRUD operations must be supported. These tables could hold anything from 1,000 to 1 million rows and both the website and database would be hosted in Azure, in a WebApp and [&#8230;]</p>
<p>The post <a href="https://blogit.create.pt/andresantos/2017/07/05/kendoui-grid-odata-webapi-and-entity-framework/">KendoUI Grid, OData, WebAPI and Entity Framework</a> appeared first on <a href="https://blogit.create.pt">Blog IT</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><span class="dropcap dropcap3">I</span>n the latest project I was involved with, there was a need to present database table data directly in Umbraco&#8217;s backoffice, where all CRUD operations must be supported. These tables could hold anything from 1,000 to 1 million rows, and both the website and the database would be hosted in Azure, in a WebApp and a SQL Database respectively. Users should be able to create complex filters and order the data as they wish, as well as update the data individually and in batch.</p>
<p>From the get-go there were some challenges we needed to overcome in order to create a usable system that was user-friendly, fast, and reliable, and where no conflicts would come up when data was being updated by multiple users.</p>
<p>The solution was a new Umbraco Package we called <em>Search Manager</em> (since the database tables ultimately feed an Azure Search index) that uses <a href="http://demos.telerik.com/kendo-ui/grid/index">Telerik&#8217;s Kendo UI Grid</a>, WebAPI and Entity Framework. The following presents the process it took to reach the end result.</p>
<p><span id="more-6483"></span></p>
<h2>Step 1</h2>
<p>We created a simple Umbraco Package that implemented a Kendo UI Grid which got its data from a WebAPI method that returned all rows from a table using PetaPoco.</p>
<p>This allowed us to have the complete table data in memory using a Kendo UI Grid. It worked great but had major problems:</p>
<ul>
<li>A table with around 40k rows meant a roughly 40 MB JSON download just to populate the grid</li>
<li>Users could be working with outdated data, since it was loaded into memory and changes could have been made to the source in the meantime</li>
</ul>
<h2>Step 2</h2>
<p>We started using the server paging feature of the Kendo Grid. This dropped the download size greatly, but we still had problems:</p>
<ul>
<li>Even though we were working with only a subset of results client-side, we were still fetching the complete table data server-side using PetaPoco.</li>
</ul>
<p>We needed an easy way to translate the client-side queries, such as filtering, paging, and ordering, into SQL queries.</p>
<h2>Step 3</h2>
<p>We changed the type of queries generated by the Grid to OData. Using the <a href="https://www.nuget.org/packages/Microsoft.AspNet.WebApi.OData">OData WebAPI NuGet package</a>, we are able to map the OData query to a C# object and then apply it directly to any IQueryable data source.</p>
<p>PetaPoco does not provide IQueryable support, so we changed our ORM to Entity Framework, which does, and <em>voilà</em>!</p>
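<p><em>As an illustration of the client side of this contract, the sketch below (hypothetical code, not part of the package) shows how the grid&#8217;s paging and sorting state maps onto OData query options such as $top, $skip and $orderby:</em></p>

```javascript
// Hypothetical sketch: building the OData query string from the grid's state.
function toODataQuery(state) {
    var parts = ["$inlinecount=allpages"];                      // ask the server for the total row count
    parts.push("$top=" + state.pageSize);                       // page size
    parts.push("$skip=" + (state.page - 1) * state.pageSize);   // rows to skip before this page
    if (state.orderBy) {
        parts.push("$orderby=" + encodeURIComponent(state.orderBy));
    }
    if (state.filter) {
        parts.push("$filter=" + encodeURIComponent(state.filter));
    }
    return parts.join("&");
}

// Page 3 of 20 rows, ordered by ProviderName descending:
console.log(toODataQuery({ page: 3, pageSize: 20, orderBy: "ProviderName desc" }));
// → $inlinecount=allpages&$top=20&$skip=40&$orderby=ProviderName%20desc
```

<p><em>The server then hands these same options to ODataQueryOptions, as in the WebAPI sample further down.</em></p>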
<h2>Finally</h2>
<p>With everything in place, we have a filtered subset of results in a Grid that fetches exactly what it needs to display through a dynamically generated SQL query.</p>
<ul>
<li>It is fast because we only get exactly what we need:</li>
</ul>
<p>An OData query generated by the Kendo UI Grid looks like this:</p>
<pre class="brush: plain; title: ; notranslate">
http://localhost:14231/umbraco/backoffice/SearchManager/PricesApi/Read?%24inlinecount=allpages&%24top=20&%24skip=40&%24orderby=ProviderName+desc&%24filter=(NatureProcCode+eq+%27exames%27+and+AvgAmtClaimed+gt+10)
</pre>
<p>This is translated to the following SQL query:</p>
<pre class="brush: sql; title: ; notranslate">
exec sp_executesql N'SELECT 
    &#x5B;Project1].&#x5B;Approved] AS &#x5B;Approved], 
    &#x5B;Project1].&#x5B;DiffWeightedRate] AS &#x5B;DiffWeightedRate], 
    &#x5B;Project1].&#x5B;SubmitPrvId] AS &#x5B;SubmitPrvId], 
    &#x5B;Project1].&#x5B;PracticeSeq] AS &#x5B;PracticeSeq], 
    &#x5B;Project1].&#x5B;NatureProcCode] AS &#x5B;NatureProcCode], 
    &#x5B;Project1].&#x5B;GroupProcCode] AS &#x5B;GroupProcCode], 
    &#x5B;Project1].&#x5B;Network] AS &#x5B;Network], 
    &#x5B;Project1].&#x5B;AmtClaimedWeighted] AS &#x5B;AmtClaimedWeighted], 
    &#x5B;Project1].&#x5B;ManualPrice] AS &#x5B;ManualPrice], 
    &#x5B;Project1].&#x5B;IsImported] AS &#x5B;IsImported], 
    &#x5B;Project1].&#x5B;DiffCalculatedRate] AS &#x5B;DiffCalculatedRate], 
    &#x5B;Project1].&#x5B;AmtContract] AS &#x5B;AmtContract], 
    &#x5B;Project1].&#x5B;AvgAmtClaimed] AS &#x5B;AvgAmtClaimed], 
    &#x5B;Project1].&#x5B;ProviderName] AS &#x5B;ProviderName], 
    &#x5B;Project1].&#x5B;ClinicName] AS &#x5B;ClinicName]
    FROM ( SELECT 
        &#x5B;Extent1].&#x5B;Approved] AS &#x5B;Approved], 
        &#x5B;Extent1].&#x5B;DiffWeightedRate] AS &#x5B;DiffWeightedRate], 
        &#x5B;Extent1].&#x5B;SubmitPrvId] AS &#x5B;SubmitPrvId], 
        &#x5B;Extent1].&#x5B;PracticeSeq] AS &#x5B;PracticeSeq], 
        &#x5B;Extent1].&#x5B;NatureProcCode] AS &#x5B;NatureProcCode], 
        &#x5B;Extent1].&#x5B;GroupProcCode] AS &#x5B;GroupProcCode], 
        &#x5B;Extent1].&#x5B;Network] AS &#x5B;Network], 
        &#x5B;Extent1].&#x5B;AmtClaimedWeighted] AS &#x5B;AmtClaimedWeighted], 
        &#x5B;Extent1].&#x5B;ManualPrice] AS &#x5B;ManualPrice], 
        &#x5B;Extent1].&#x5B;IsImported] AS &#x5B;IsImported], 
        &#x5B;Extent1].&#x5B;DiffCalculatedRate] AS &#x5B;DiffCalculatedRate], 
        &#x5B;Extent1].&#x5B;AmtContract] AS &#x5B;AmtContract], 
        &#x5B;Extent1].&#x5B;AvgAmtClaimed] AS &#x5B;AvgAmtClaimed], 
        &#x5B;Extent1].&#x5B;ProviderName] AS &#x5B;ProviderName], 
        &#x5B;Extent1].&#x5B;ClinicName] AS &#x5B;ClinicName]
        FROM (SELECT 
    &#x5B;PriceGrid].&#x5B;Approved] AS &#x5B;Approved], 
    &#x5B;PriceGrid].&#x5B;DiffWeightedRate] AS &#x5B;DiffWeightedRate], 
    &#x5B;PriceGrid].&#x5B;SubmitPrvId] AS &#x5B;SubmitPrvId], 
    &#x5B;PriceGrid].&#x5B;PracticeSeq] AS &#x5B;PracticeSeq], 
    &#x5B;PriceGrid].&#x5B;NatureProcCode] AS &#x5B;NatureProcCode], 
    &#x5B;PriceGrid].&#x5B;GroupProcCode] AS &#x5B;GroupProcCode], 
    &#x5B;PriceGrid].&#x5B;Network] AS &#x5B;Network], 
    &#x5B;PriceGrid].&#x5B;AmtClaimedWeighted] AS &#x5B;AmtClaimedWeighted], 
    &#x5B;PriceGrid].&#x5B;ManualPrice] AS &#x5B;ManualPrice], 
    &#x5B;PriceGrid].&#x5B;IsImported] AS &#x5B;IsImported], 
    &#x5B;PriceGrid].&#x5B;DiffCalculatedRate] AS &#x5B;DiffCalculatedRate], 
    &#x5B;PriceGrid].&#x5B;AmtContract] AS &#x5B;AmtContract], 
    &#x5B;PriceGrid].&#x5B;AvgAmtClaimed] AS &#x5B;AvgAmtClaimed], 
    &#x5B;PriceGrid].&#x5B;ProviderName] AS &#x5B;ProviderName], 
    &#x5B;PriceGrid].&#x5B;ClinicName] AS &#x5B;ClinicName]
    FROM &#x5B;dbo].&#x5B;PriceGrid] AS &#x5B;PriceGrid]) AS &#x5B;Extent1]
        WHERE (&#x5B;Extent1].&#x5B;NatureProcCode] = @p__linq__0) AND (&#x5B;Extent1].&#x5B;AvgAmtClaimed] &gt; @p__linq__1)
    )  AS &#x5B;Project1]
    ORDER BY &#x5B;Project1].&#x5B;ProviderName] DESC, &#x5B;Project1].&#x5B;AmtClaimedWeighted] ASC, &#x5B;Project1].&#x5B;AmtContract] ASC, &#x5B;Project1].&#x5B;Approved] ASC, &#x5B;Project1].&#x5B;AvgAmtClaimed] ASC, &#x5B;Project1].&#x5B;ClinicName] ASC, &#x5B;Project1].&#x5B;DiffCalculatedRate] ASC, &#x5B;Project1].&#x5B;DiffWeightedRate] ASC, &#x5B;Project1].&#x5B;GroupProcCode] ASC, &#x5B;Project1].&#x5B;IsImported] ASC, &#x5B;Project1].&#x5B;ManualPrice] ASC, &#x5B;Project1].&#x5B;NatureProcCode] ASC, &#x5B;Project1].&#x5B;Network] ASC, &#x5B;Project1].&#x5B;PracticeSeq] ASC, &#x5B;Project1].&#x5B;SubmitPrvId] ASC
    OFFSET @p__linq__2 ROWS FETCH NEXT @p__linq__3 ROWS ONLY  option (Recompile)',N'@p__linq__0 varchar(8000),@p__linq__1 decimal(2,0),@p__linq__2 int,@p__linq__3 int',@p__linq__0='exames',@p__linq__1=10,@p__linq__2=40,@p__linq__3=20
</pre>
<ul>
<li>It&#8217;s user-friendly and reliable because users can filter the data as if they were using an Excel file on steroids</li>
<li>It avoids conflicts because we only fetch a subset of the results each time and each change is committed in a single transaction</li>
</ul>
<h2>Code samples:</h2>
<p>&nbsp;</p>
<pre class="brush: jscript; title: grid setup; notranslate">
grid = $(&quot;#grid&quot;).kendoGrid({
                dataSource: new kendo.data.DataSource({
                    transport: transport,
                    schema: schema,
                    serverFiltering: true,
                    serverPaging: true,
                    serverSorting: true,
                    type: &quot;odata&quot;,
                    pageSize: 20
                }),
                dataBound: onDataBound,
                pageable: true,
                filterable: true,
                sortable: true,
                scrollable: false,
                navigatable: true,
                resizable: true,
                batch: true,
                columnMenu: true,
                editable: true,
                toolbar: &#x5B;
					{ template: '&lt;a class=&quot;k-button k-button-icontext k-grid-save-batch&quot; href=&quot;\\#&quot;&gt;&lt;span class=&quot;k-icon k-i-update&quot;&gt;&lt;/span&gt;Guardar Alterações&lt;/a&gt;' },
					{ name: &quot;cancel&quot;, text: &quot;Cancelar&quot; },
                ],
                columns: &#x5B;...]
            });
</pre>
<p>&nbsp;</p>
<pre class="brush: csharp; title: WebAPI; notranslate">
&#x5B;HttpGet]
        public IHttpActionResult Read(ODataQueryOptions&lt;PriceGrid&gt; opts)
        {
            Mapper.CreateMap&lt;PriceGrid, PriceGridBO&gt;();

            using (var context = new EntitiesContext())
            {
                using (var qh = new HintScope(context, QueryHintType.Recompile))
                {
                    List&lt;PriceGrid&gt; results = opts
                        .ApplyTo(context.PriceGrids)
                        .Cast&lt;PriceGrid&gt;()
                        .ToList();

                    List&lt;PriceGridBO&gt; boEntities = results.Select(x =&gt; Mapper.Map&lt;PriceGrid, PriceGridBO&gt;(x)).ToList();

                    PriceResponse response = new PriceResponse()
                    {
                        Entities = boEntities,
                        Total = Request.ODataProperties().TotalCount.HasValue ? Request.ODataProperties().TotalCount.Value : 0
                    };

                    return Ok(response);
                }
            }
        }
</pre>
<p>The post <a href="https://blogit.create.pt/andresantos/2017/07/05/kendoui-grid-odata-webapi-and-entity-framework/">KendoUI Grid, OData, WebAPI and Entity Framework</a> appeared first on <a href="https://blogit.create.pt">Blog IT</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blogit.create.pt/andresantos/2017/07/05/kendoui-grid-odata-webapi-and-entity-framework/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
