

# With recompile how to

At the beginning, as an analyst, you learn to write standard SQL queries to get your desired results. With time your skills evolve, and you learn how to optimize those queries, maybe using a temporary table or a common table expression.
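
Just to make those two constructs concrete for readers who haven't met them yet, here is a tiny, made-up example; the table and column names (dbo.Orders, CustomerId, Amount, OrderDate) are placeholders, not anything from a real model.

```sql
-- Common table expression: define an intermediate result inline and use it once.
WITH RecentOrders AS (
    SELECT CustomerId, Amount
    FROM dbo.Orders
    WHERE OrderDate >= DATEADD(MONTH, -1, SYSDATETIME())
)
SELECT CustomerId, SUM(Amount) AS MonthlyAmount
FROM RecentOrders
GROUP BY CustomerId;

-- Temporary table: materialize the intermediate result so it can be reused
-- in several statements and gives the optimizer real row counts to work with.
SELECT CustomerId, Amount
INTO #RecentOrders
FROM dbo.Orders
WHERE OrderDate >= DATEADD(MONTH, -1, SYSDATETIME());

SELECT CustomerId, SUM(Amount) AS MonthlyAmount
FROM #RecentOrders
GROUP BY CustomerId;
```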

The more you learn, the more of these techniques you are going to use. And suddenly it appears as if your queries are stuck. You rewrite parts of the query and try to ease the pain of the SQL engine. And at some point you come across OPTIONs: hints which can be added to your table, your joins, your query, or your stored procedure, and which tell the SQL engine how it should optimize the execution plans.
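
To make that a bit more concrete, here is a small, made-up illustration of where such hints can sit. The table and column names are placeholders I have invented; only the hint syntax itself is standard T-SQL.

```sql
-- Table hint: read without taking shared locks (dirty reads become possible).
SELECT s.CustomerId, s.LoadDate
FROM stg.Customer AS s WITH (NOLOCK);

-- Join hint: force a hash join instead of letting the optimizer choose.
SELECT s.CustomerId, c.Name
FROM stg.Customer AS s
INNER HASH JOIN dim.Customer AS c
    ON c.CustomerId = s.CustomerId;

-- Query hints: appended to the whole statement with OPTION (...).
SELECT s.CustomerId, COUNT(*) AS LoadedRows
FROM stg.Customer AS s
GROUP BY s.CustomerId
OPTION (MAXDOP 1, RECOMPILE);
```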

Reading and optimizing execution plans is a huge playing field, and I'm far from being an expert in that field. I would like to discuss and explain certain behaviors I discovered in BI, and BI only: no OLTP optimization or single-record stuff. In BI I work with data sets. As you have discovered, my loading functions are all created as stored procedures. My typical layout looks roughly like the sketch below.
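
This is only a minimal sketch of that layout: the schema, table, and column names (stg.Sales, bv.SatSales, HashDiff, and so on) are placeholders assumed for illustration. The shape is what matters: compute the hashes into a temporary table first, then filter against the target satellite.

```sql
CREATE PROCEDURE bv.LoadSatSales
    @LoadDate DATETIME2
AS
BEGIN
    SET NOCOUNT ON;

    -- Step 1: materialize the staging rows and their hash into a temporary
    -- table, so hashing is done exactly once and up front.
    SELECT
        s.SalesHashKey,
        CONVERT(CHAR(32),
                HASHBYTES('MD5', CONCAT(s.Quantity, '|', s.Price, '|', s.Currency)),
                2) AS HashDiff,
        s.Quantity,
        s.Price,
        s.Currency
    INTO #Stage
    FROM stg.Sales AS s;

    -- Step 2: insert only the delta, i.e. rows whose HashDiff is not already
    -- present for that key in the satellite.
    INSERT INTO bv.SatSales (SalesHashKey, LoadDate, HashDiff, Quantity, Price, Currency)
    SELECT t.SalesHashKey, @LoadDate, t.HashDiff, t.Quantity, t.Price, t.Currency
    FROM #Stage AS t
    WHERE NOT EXISTS (
        SELECT 1
        FROM bv.SatSales AS sat
        WHERE sat.SalesHashKey = t.SalesHashKey
          AND sat.HashDiff     = t.HashDiff
    );
END;
```
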
The temporary table is needed because sometimes SQL Server doesn't know whether it should calculate 100 million hashes and afterwards filter out the 100,000 that are actually needed, or vice versa. When I compile the stored procedure, SQL Server already tries to create an execution plan, which is saved together with the stored procedure. If the tables are already around, it gets the row counts and other metadata and tries to figure out the best way to process them. Usually, in my business vault satellites, the first load will be the biggest. So SQL Server sees all the data and thinks: wow, a lot to process, and THIS execution plan would be the best to process THAT lot of data. On the following loads it sees the data and the already existing execution plan and thinks: okay, we have a plan, so let's get it done. It takes the assumptions of the previous run and applies them to the small delta load. And this can lead to longer-running queries, or even "hanging" queries.
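
This is where the RECOMPILE options from the title come in. The two standard places to put them in T-SQL are sketched below, reusing the placeholder names from the sketch above; treat it as an illustration, not the exact procedure.

```sql
-- Variant 1: recompile the whole procedure on every execution. No plan is
-- cached, so the big initial load and the small delta loads each get a plan
-- built for their own row counts.
ALTER PROCEDURE bv.LoadSatSales
    @LoadDate DATETIME2
WITH RECOMPILE
AS
BEGIN
    SET NOCOUNT ON;
    -- ... same body as in the sketch above ...
END;
GO

-- Variant 2: recompile only the statement that suffers from the reused plan,
-- by appending a query hint to it (shown standalone here for brevity).
INSERT INTO bv.SatSales (SalesHashKey, LoadDate, HashDiff, Quantity, Price, Currency)
SELECT t.SalesHashKey, SYSDATETIME(), t.HashDiff, t.Quantity, t.Price, t.Currency
FROM #Stage AS t
WHERE NOT EXISTS (
    SELECT 1
    FROM bv.SatSales AS sat
    WHERE sat.SalesHashKey = t.SalesHashKey
      AND sat.HashDiff     = t.HashDiff
)
OPTION (RECOMPILE);
```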
