This is one topic on which I hold a slightly different opinion from my colleagues. In every shop I have visited, the practice has been to clear out the cache before each benchmark run – whether on Oracle or SQL Server. In SQL Server, you can use DBCC commands to flush the cache – DBCC FREEPROCCACHE does that for you. It is granular enough to let you remove even a specific plan, if you provide it with a plan_handle or a sql_handle. This command clears the plan cache, so if you are benchmarking a stored procedure, it will be recompiled on its next execution since nothing is in the cache. Some people do this for every iteration of a benchmark in order to get “real world” numbers. But ask yourself: do you clear out the cache in your production system every day as well? What (IMHO) needs to be tracked is plan re-use and how plans change as you supply different input values, put load on the system, and grow the data volume – not numbers obtained by clearing the cache each time to make a VP happy 🙂
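As a sketch of that granularity, here is how you would find and evict a single cached plan rather than the whole cache. The procedure name is just an example, and the plan_handle shown is a placeholder – you would substitute the value returned by the query:

```sql
-- Flush the ENTIRE plan cache (every plan recompiles on next use):
DBCC FREEPROCCACHE;

-- Or target one plan: first look up its plan_handle in the DMVs...
SELECT cp.plan_handle, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%usp_MyBenchmarkProc%';  -- hypothetical procedure name

-- ...then evict just that plan (placeholder handle shown):
DBCC FREEPROCCACHE (0x060006001ECA270EC0215D05000000000000000000000000);
```

The targeted form is far less disruptive on a shared instance, since every other cached plan stays put.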
Other DBCC commands of this kind include DBCC FREESESSIONCACHE and DBCC FREESYSTEMCACHE. You can look up their syntax and usage in BOL.
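For reference, these two look like the following – FREESESSIONCACHE clears the cache used for distributed queries, while FREESYSTEMCACHE with 'ALL' clears all the system cache stores (which includes the plan cache, so it is a superset of FREEPROCCACHE):

```sql
-- Flush the distributed-query session cache:
DBCC FREESESSIONCACHE;

-- Flush all system cache stores, plan cache included:
DBCC FREESYSTEMCACHE ('ALL');
```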
In Oracle 9i (i.e. prior to 10g), one could use the “alter tablespace offline/online” commands to flush from the buffer cache the blocks belonging to that tablespace. In Oracle 10g and 11g, there is an alter system command to flush the cache:
alter system flush buffer_cache;
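For completeness, the pre-10g workaround can be sketched as below. USERS is just an example tablespace name, and note the obvious caveat: the tablespace is unavailable to sessions while it is offline, so this is only viable on a private test instance:

```sql
-- Oracle 9i: cycle a tablespace offline/online to purge its blocks
-- from the buffer cache (tablespace is inaccessible while offline):
ALTER TABLESPACE users OFFLINE;
ALTER TABLESPACE users ONLINE;
```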
Benchmarks should reflect reality, and unless your benchmark consists of a single query for which you deliberately want to discount what is in the cache, I don’t see any reason to use these commands in a benchmark. In the real world, the buffer cache is never 100% empty. So where are these commands useful, then? For a developer or DBA who has their own instance and is doing performance testing on some code.