Sometimes I get questions about how to get better performance from a database. Working with SQL Server over the years, and now SQL Azure, this is not an uncommon question. SQL Server 2008 and later include the Database Engine Tuning Advisor, which is great, but it relies on capturing a realistic sample of database activity, which you can get with SQL Profiler. Just run the tool and save the captured trace to a table in SQL Server so you can look at it later and do some analytics.
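As a minimal sketch, if you saved the trace to a file instead of straight to a table, fn_trace_gettable will pull it into a table you can query; the file path and table name here are just placeholders:

    -- Load the saved trace file into a table (path is a placeholder)
    SELECT * INTO dbo.ProfilerTrace
    FROM ::fn_trace_gettable('C:\Traces\MyAppTrace.trc', DEFAULT);

    -- Find the longest-running statements in the sample
    SELECT TOP 20 TextData, Duration, Reads, Writes, CPU
    FROM dbo.ProfilerTrace
    WHERE EventClass IN (10, 12)  -- RPC:Completed, SQL:BatchCompleted
    ORDER BY Duration DESC;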
Here are some thoughts and ideas, for what they’re worth. The first thing I would look at is taking a profile sample of the application running; this captures the queries, along with statistics about which tables are being used, and can be fed into the tuning utility to suggest indexes and keys. The second thing I would consider is whether a permanent working table would work better than a temp table. The advantage is that you get full indexing capabilities; the downside is having to truncate and reload it each time you need it.
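Here is what that truncate-and-reload pattern might look like; all the table and column names below are hypothetical:

    -- One-time setup: a permanent working table with an index
    CREATE TABLE dbo.ReportWork
    (
        OrderID    INT      NOT NULL,
        CustomerID INT      NOT NULL,
        OrderTotal MONEY    NOT NULL,
        OrderDate  DATETIME NOT NULL
    );
    CREATE NONCLUSTERED INDEX IX_ReportWork_Customer
        ON dbo.ReportWork (CustomerID, OrderDate);

    -- Each time the report runs: truncate, then reload
    TRUNCATE TABLE dbo.ReportWork;
    INSERT INTO dbo.ReportWork (OrderID, CustomerID, OrderTotal, OrderDate)
    SELECT OrderID, CustomerID, OrderTotal, OrderDate
    FROM dbo.Orders
    WHERE OrderDate >= DATEADD(MONTH, -1, GETDATE());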
Do you have flexibility with the schema to look at ways to pre-populate the data you need for the report during the normal runtime of your system? For example, if you are doing validations and transformations, could these be scheduled to run periodically, or even run as the data transactions occur, so the work doesn’t have to be done ad hoc to generate the report? A sketch of the scheduled approach follows.
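One way to do the periodic version is a stored procedure that refreshes a summary table, wired up as a SQL Server Agent job; the procedure and table names here are made up for illustration:

    -- Hypothetical refresh of a precomputed summary the report reads from;
    -- schedule this with a SQL Server Agent job.
    CREATE PROCEDURE dbo.RefreshCustomerTotals
    AS
    BEGIN
        SET NOCOUNT ON;

        MERGE dbo.CustomerTotals AS tgt
        USING (
            SELECT CustomerID, SUM(OrderTotal) AS TotalSpent, COUNT(*) AS OrderCount
            FROM dbo.Orders
            GROUP BY CustomerID
        ) AS src
        ON tgt.CustomerID = src.CustomerID
        WHEN MATCHED THEN
            UPDATE SET tgt.TotalSpent = src.TotalSpent,
                       tgt.OrderCount = src.OrderCount
        WHEN NOT MATCHED THEN
            INSERT (CustomerID, TotalSpent, OrderCount)
            VALUES (src.CustomerID, src.TotalSpent, src.OrderCount);
    END;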
As to the query syntax, I’ve found the NOT EXISTS clause gives better performance than IN or NOT IN because of the way the optimizer creates and executes the plan.
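For instance, here is the same “customers with no orders” check written both ways, with hypothetical table names. NOT EXISTS also sidesteps the surprise where NOT IN returns no rows at all if the subquery produces any NULLs:

    -- Often slower, and returns nothing if Orders.CustomerID contains NULLs
    SELECT c.CustomerID, c.Name
    FROM dbo.Customers AS c
    WHERE c.CustomerID NOT IN (SELECT o.CustomerID FROM dbo.Orders AS o);

    -- Usually optimizes to a more efficient anti-semi-join
    SELECT c.CustomerID, c.Name
    FROM dbo.Customers AS c
    WHERE NOT EXISTS (SELECT 1 FROM dbo.Orders AS o
                      WHERE o.CustomerID = c.CustomerID);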
Finally, if you have complex queries, are you generating them on the fly, or can you create functions or stored procedures, where the execution plan is compiled once and reused from the plan cache?
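As a sketch (the table and parameter names are made up), moving an on-the-fly query into a parameterized stored procedure lets SQL Server cache one plan and reuse it:

    -- Instead of building this string in application code on every call...
    -- "SELECT ... FROM Orders WHERE CustomerID = " + customerId
    CREATE PROCEDURE dbo.GetCustomerOrders
        @CustomerID INT
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT OrderID, OrderDate, OrderTotal
        FROM dbo.Orders
        WHERE CustomerID = @CustomerID;
    END;

    -- The first EXEC compiles and caches the plan; later calls reuse it
    EXEC dbo.GetCustomerOrders @CustomerID = 42;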