Infographic – The Modern Data Warehouse Framework

Check out our latest infographic, The Modern Data Warehouse Framework!

As organizations collect and process increasing amounts of data from a growing number of data sources, data systems must evolve to keep up. To make the best data-driven decisions possible, you need to reimagine the way you look at data warehousing.

We took a look at how to transition your data warehouse to the cloud and put together our top 8 recommendations for building a modern data warehouse on Azure.


Download the PDF here






Power BI for Mobile: Take Your Data on the Road

One area where the Power BI software stack really shines is the mobile space. The Power BI product line includes three apps: one each for iOS, Windows Phone and Android. These apps allow you to take anything you can generate in Power BI and make it readily available to any stakeholder with a mobile phone or tablet. With a couple of swipes, users can quickly interact with all your analysis. Power BI brings together the advantages of mobile devices, big data systems and compelling visualizations in a way that permits everyone involved to make better decisions.

The Power of the Dashboard

It’s one thing to produce an informative chart, but it’s quite another to deploy a fully interactive dashboard that can fetch real-time updates. Power BI permits you to tie together data from a variety of sources, including numerous non-Microsoft products. For the end user, the guy in marketing who just needs to see today’s report, the component that makes it all accessible is the dashboard.

Power BI dashboards allow you to publish any type of common infographic, geospatial information or visualization. If you need a bubble chart that displays the YTD performance of your company’s retail outlets, there’s an out-of-the-box solution for that with Power BI. It also allows you to create maps and overlay existing information onto those maps. Instead of just seeing that Store #325 is performing well, an app user can pull up the dashboard and see on the map whether that’s a one-off phenomenon or a regional trend.

Making Information Accessible

In the world of data analytics, a lot of work goes into empowering decision makers who may not have a strong technical background. It’s extremely beneficial to give those people an app that allows them to quickly sort through the available data in a clear format. If your buyers can quickly bounce between multiple years’ worth of data and make comparisons, they can make important decisions faster.

Power BI also allows you to determine how the dashboard accesses the available information. Rather than simply presenting users a static set of reports, you can configure queries that allow them to sift through in a self-guided fashion. If someone needs access to a real-time inventory report, your dashboard can be configured to fetch that information from the company’s SQL Server installations. This allows members of your organization who might not be data scientists to rapidly develop insights that can guide their choices. 
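As a sketch of the kind of self-guided query a dashboard tile might issue on each refresh, the snippet below uses Python's sqlite3 as a stand-in for SQL Server; the inventory table, column names and figures are hypothetical:

```python
import sqlite3

# Stand-in for a SQL Server inventory table; the schema and data are
# invented purely to illustrate the self-service query pattern.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE inventory (store_id INTEGER, sku TEXT, on_hand INTEGER)")
conn.executemany(
    "INSERT INTO inventory VALUES (?, ?, ?)",
    [(325, "WIDGET-A", 40), (325, "WIDGET-B", 0), (410, "WIDGET-A", 12)],
)

def inventory_report(store_id):
    """Parameterized query a dashboard tile might run on each data refresh."""
    rows = conn.execute(
        "SELECT sku, on_hand FROM inventory WHERE store_id = ? ORDER BY sku",
        (store_id,),
    ).fetchall()
    return dict(rows)

print(inventory_report(325))  # {'WIDGET-A': 40, 'WIDGET-B': 0}
```

The dashboard author configures the query once; end users only pick the parameter (here, the store) and see fresh results.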

Cross-Platform Compatibility

Keeping everyone in your business on the same page can be a challenge. Microsoft has gone to great lengths to ensure that the Power BI apps display information faithfully on every platform and function in a similar fashion. The hypothetical users from our examples will have no trouble grabbing an art department iPhone and finding everything they need.

Data Sources

Any data source that can be accessed inside Office or Power BI can be presented within the app’s dashboard. If you need to present data from an Excel sheet in an appealing manner to someone on the other side of the planet, the app can make that happen. It also allows you to connect to commonly used data sources, such as SQL Server reports, and outside sources, such as Google Analytics, Salesforce or MailChimp. You can even mix and match functionality, for example, overlaying Salesforce data on Google Maps.


Business intelligence is about putting the right information in the right hands, in a format that makes a visually compelling case. Your company will likely invest a lot of effort in the coming years into producing analysis and generating insights. With Power BI’s mobile app, you can ensure that the people who need those insights have access to them with the touch of a finger. The app allows you to pass along analysis to stakeholders in a secure environment that makes interacting with the data easy. In short, it makes all your data analytics faster, more appealing and more accessible.

If you have questions about getting started with Power BI or want to push the toolset further, give us a call. We’re always happy to answer any questions.



Building an On-Site Data Warehouse Using the Microsoft Stack

Most IT specialists are aware of individual components within SQL Server and may use one or two of them regularly for data analytics or general operations. Outside of IT, others may be familiar with one of several analytics-oriented SQL Server add-ons such as those for Excel that are targeted at data analysis or data mining in general.

It is likely that far fewer staff, if any, have combined SQL Server components into a complete data analytics stack, one that enables the organization to build a high-performance data analytics solution from on-premise and external data sources and deliver meaningful analytics across the enterprise.

Fortunately, Microsoft provides all the tools and services necessary for a traditional, on-site data analytics system. This article outlines these components and their general usage. Of necessity, it cannot cover all the intricacies that may arise out of business-specific requirements.

Normalized vs. De-Normalized Databases

Relational databases use a normalized data organization optimized for rapid insert and update operations while minimizing data reads, which makes them ideal for OLTP workloads. These databases are not suitable for data warehouses, which use an OLAP design that de-normalizes the data. This organization is optimized for the high-volume read performance required by analytical processing.
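A minimal illustration of the difference, using plain Python structures (the table and column names are invented for the example):

```python
# Normalized (OLTP): facts split across tables, joined at query time.
customers = {1: {"name": "Acme"}}
orders = [{"order_id": 100, "customer_id": 1, "amount": 250.0}]

# De-normalized (OLAP): the join is performed once, up front, so analytical
# reads scan a single wide fact table with no joins.
fact_sales = [
    {"order_id": o["order_id"],
     "customer_name": customers[o["customer_id"]]["name"],
     "amount": o["amount"]}
    for o in orders
]

print(fact_sales[0]["customer_name"])  # Acme
```

The OLTP layout keeps each fact in exactly one place, which makes updates cheap; the OLAP layout duplicates data to make large read scans cheap.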

Other Reasons to Use a Separate Data Analytics Database

In addition to optimizing read performance, creating a separate OLAP database versus utilizing the OLTP corporate database has several advantages:

  • Analytical queries that summarize large quantities of data and gather related items from disparate sources will not bog down corporate database servers
  • Historical data can be retained rather than archived, as is typical on OLTP databases
  • You are free to consolidate data from several sources, including off-premise databases, social media sites and document formats such as XML or Excel
  • Cleansing and optimizing your data for analysis can be automated using your business-specific rules, queries and schedule

The Microsoft Data Analytics Stack

SQL Server 2016 contains the entire suite of components necessary to build your first data warehouse. The three critical components are Integration Services, Analysis Services and Reporting Services.

SQL Server Integration Services (SSIS)

SSIS is SQL Server’s ETL component: it connects to your data sources, migrates data, restructures it according to business-specific rules and populates the database. It manages acquiring data from multiple sources, cleansing that data, and performing type conversions and any other transformations required to streamline analysis. SQL Server Agent permits automatic scheduling of these processes so the database always contains the most relevant and timely data.
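The extract-transform-load flow that SSIS manages declaratively can be sketched in a few lines of Python; the source rows and cleansing rules below are hypothetical stand-ins for what an SSIS package would define:

```python
# Extract: rows as they might arrive from a source system, with messy
# string-typed values (this sample data is invented for illustration).
raw_rows = [
    {"sale_date": "2016-03-01", "amount": " 1,250.00 ", "region": "west"},
    {"sale_date": "2016-03-02", "amount": "980.50", "region": "WEST"},
]

def transform(row):
    """Apply business-specific cleansing rules and type conversions."""
    return {
        "sale_date": row["sale_date"],
        "amount": float(row["amount"].strip().replace(",", "")),  # string -> numeric
        "region": row["region"].strip().upper(),                  # normalize casing
    }

# Load: in a real pipeline this would insert into the warehouse tables.
warehouse = [transform(r) for r in raw_rows]
```

In SSIS these steps correspond to source connections, data flow transformations and destinations, with SQL Server Agent handling the scheduling.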

Use the particular business problems you are solving to guide ETL preparation and specification. A user-centric approach pares down and organizes the data being collected for the most efficient extraction and presentation.

SQL Server Analysis Services (SSAS)

SSAS is critical to creating the most useful and highest-performing database queries. This component transforms normalized or relational data into multidimensional database “cubes” that are tuned to deliver rapid results to ad hoc queries. It does so through data indexing, data aggregation and enabling complex data calculations to be embedded at the cube level where processing is most efficient.
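The pre-aggregation idea behind cubes can be sketched as a group-by over fact rows, with roll-ups cached along each dimension; the dimensions and figures here are invented for illustration:

```python
from collections import defaultdict

# Fact rows as they might land in the warehouse (sample data).
facts = [
    {"region": "WEST", "year": 2015, "amount": 100.0},
    {"region": "WEST", "year": 2016, "amount": 150.0},
    {"region": "EAST", "year": 2016, "amount": 75.0},
]

# Pre-aggregate along the (region, year) dimensions, as a cube does, so that
# ad hoc queries read cached totals instead of rescanning the fact table.
cube = defaultdict(float)
for f in facts:
    cube[(f["region"], f["year"])] += f["amount"]
    cube[(f["region"], None)] += f["amount"]  # roll-up across all years

print(cube[("WEST", None)])  # 250.0
```

SSAS builds and indexes these aggregations at cube-processing time, which is why ad hoc queries against a cube return so quickly.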

SQL Server Reporting Services (SSRS)

SSRS also runs queries against the SQL Server database and offers everything required to build detailed, complex reports with visualizations, plus facilities to manage and distribute reports securely. Users may view reports via the SSRS web portal, Power BI or a SharePoint server, and end users can export reports directly to Excel or PowerPoint.

SQL Server Data Tools

This package works within Visual Studio (VS) and provides development tools for all the integration, analysis and reporting steps outlined previously. It is free and installs a VS shell if you do not already have VS. SSDT lets you design and build your OLAP database and SQL Server content in much the same manner as you would a VS application. It replaces the previous Business Intelligence Development Studio.


Microsoft’s SQL Server plus SSDT support development of a complete data analytics system that includes an efficient OLAP database, an optimized data schema and a full pipeline of data acquisition, pre-processing, analysis, delivery and visualization. Many organizations without a current data analytics solution may lack expertise in all the required SQL Server components, and those that have it typically have applied their skills only to OLTP-type solutions.

Thus, many businesses employ an experienced partner for architecting a data analytics system that is scalable, extendable and highly secure. The most capable consultants are able to train and guide in-house staff through the entire development and maintenance lifecycle such that the enterprise becomes self-sufficient at further expanding and optimizing the system.

If you need help or advice on your data analytics system, let us know. Our team of data experts has worked with businesses large and small across all industries and has the specialized technical skills to guide you toward a solution.




Data Analytics in the Cloud: Where to Start?

An enterprise-wide data analytics system pulling data from multiple sources, correlating and presenting results in relevant, insightful visualizations to enable prompt, informed decisions is the dream of many organizations. Those in the lead already reap the benefits of faster, high-accuracy, proactive decisions that data analytics provides.

Getting to that point requires exquisite planning and execution by stakeholders from top to bottom and across departments in order to achieve an implementation that is useful, maintainable and flexible enough to handle ever-improving data analytics technology and an increasingly competitive business environment.

Don’t Go Too Big Too Fast

Data analytics systems are most valuable when shared across the organization. Therefore, cross-departmental input and commitments are vital, as is a high degree of project transparency. Collecting requirements and then delivering an all-at-once solution months or quarters later is courting disaster. That sometimes works for limited software development projects, but data analytics initiatives necessarily demand a much larger scope in both duration and breadth.

Adopt an incremental mindset from the start by applying a less risky, phased and flexible tack to your data initiative’s development. The project should produce regular, continuous adjustments to functionality, features and metrics without unnecessary churn. This paradigm is most likely to produce a quality product with high acceptance across the business.

Gain Executive Stakeholder Buy-In

With the correct attitude toward initiative progression, gain C-suite buy-in via a detailed assessment and quantification of the business needs to which data analytics can add measurable value. These come from conversations with department heads, people managers, project managers and line workers involved in operational activities that would likely benefit from insights provided through readily accessible business analytics.

Collect Technical Requirements

After executive endorsement, pinpoint technical needs, features and functions required to create a system meeting the project’s strategic goals. These requirements are derived from various operational aspects:

  • Existing data analytics-related processes, e.g. CRM, and their supporting software and infrastructure
  • Existing data sources, and a baseline of the what, when and how of data storage and processing
  • Where applicable, data-sharing patterns, especially where data transformation steps are required
  • The current collection of in-house, open-source or cloud-based tools and services in use

Turn Technical Requirements into KPIs

Concurrently with technical requirements acquisition, work closely with stakeholders to develop meaningful metrics and KPIs. Examples include metrics around analytics data acquisition, storage and cleaning. Marketing metrics might measure campaign response. High-interest sales metrics center on conversions and ROI, whereas support metrics include customer satisfaction and retention. Be open to innovative metrics that a new data analytics system could enable.
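As a sketch of how such metrics might be computed once the data is flowing (the formulas and figures are illustrative, not prescriptive):

```python
def conversion_rate(leads, conversions):
    """Sales KPI: share of leads that converted (0 when there are no leads)."""
    return conversions / leads if leads else 0.0

def retention_rate(start_customers, end_customers, new_customers):
    """Support KPI: customers kept over a period, net of new acquisitions."""
    return (end_customers - new_customers) / start_customers

print(conversion_rate(200, 18))      # 0.09
print(retention_rate(500, 520, 60))  # 0.92
```

The value of agreeing on formulas like these up front is that every department reads the same number the same way once the dashboards go live.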

Ask for Resource Commitments

While collaborating on KPIs, initiate frank discussions about the staff and resources that stakeholder departments or teams are willing to commit to the project. The benefits of such commitments should already be reflected in the specific KPIs that serve those stakeholders.

Choosing a Software Model

Inconsistent use of software models, such as open source, in-house or cloud-based SaaS, is common in companies, often the result of acquiring tools and projects organically over time. Your data analytics implementation will not correct old problems, but its software model should be chosen based on technology availability, costs and the ability to scale and adapt as your company expands.

For instance, even with a strong in-house IT development capability, the benefits of basing your data analytics implementation on cloud-based SaaS are compelling.

First, removing the constraint of large capital outlays and their approval alone makes a strong argument for pay-as-you-go cloud SaaS. Furthermore, it complements your phased approach, since infrastructure and services are easily scaled and maintenance is automatic. Finally, today’s cloud SaaS from the best providers is fully customizable, which enables rapid functionality development and ease of modification during ongoing development.

Additional Tips

  • Expect dirty data, especially from social media, and deal with it at the source where possible. Employ tools such as Diffbot and ScraperWiki in this battle. Especially during testing, consider importing customized, on-demand data sets.
  • Be sure data analytics dashboards and reports are highly customizable but easy to learn. This area is the face of your initiative for the majority of users. Also, ensure dashboard functionality works for your mobile users.
  • Build in extensibility. This means anticipating new data sources and leaving room for the latest in predictive analysis technology.
  • If you are using a phased, results-oriented approach, you will have plenty of opportunities to celebrate small victories. Relish and share these milestones.


Data analytics has a proven track record of providing enterprises with streamlined, on-target decision-making improvements based on “right-time” data flows from inside and outside the company. Implementing the best system for your company requires intense, thorough planning followed by step-wise development and deployment.

Realize that even as your project begins achieving its end goals, ongoing business needs and changing markets will call for continued growth of your data analytics capability. If you chose cloud-based SaaS as your software core, the next growth and adjustment phase will be much easier than the first, especially if you stick to your iterative development paradigm.

If you have questions about how to get started working in the cloud, let us know. We’re happy to share our knowledge and set you on the right path.



Automated Testing for SSRS Reports

Motivations for SSRS Report Testing

Both data warehouse developers and end users of data analytics reports have a keen interest in the accuracy and appearance of report content. As SSRS reports are developed, both aspects are typically tested piecemeal during construction and as a whole upon completion.

However, it is always possible to overlook certain report items or their representation after report deployment. Furthermore, later changes in data sources or the ETL flow may introduce anomalies or errors that affect data values or their presentation.

SSRS Report Testing Automation Issues

To increase report testing efficiency and accuracy, automated testing is key during both development and maintenance of SSRS reports, especially for organizations utilizing dozens or hundreds of reports on a regular basis. Automation can decrease development time and play an important role in correcting discrepancies in report data post-deployment, which could otherwise negatively impact confidence in the data warehouse.

The complexity of interactions within SSRS itself and with the many other components of SQL Server, however, makes fully automated testing a tricky business. Furthermore, the high degree of customization possible for SSRS reports means that customized testing approaches are probably required, in addition to standardizing report content and layout wherever feasible.

SSRS Testing Areas for Automation

Unit Testing

A large number of report characteristics can be tested during development via unit tests, and those unit tests can migrate into test suites for quickly troubleshooting post-deployment bugs. These are a few example coverage areas for such automated tests:

  • If data is not reaching a report, validate the data source, check that the correct mappings to the dataset are being used and that each column’s Visibility property is set correctly.
  • If data is not present and the report uses stored procedures, validate the SP parameters, their data types, conversions and formatting requirements.
  • The failure of data to group as expected can be tested by examining the grouping expressions and confirming that an aggregate function is applied to numeric columns.
  • Create a query test framework that can take as input individual queries or query scripts. Such a framework would contain validated comparison datasets including metadata to run against actual queries/datasets with each report’s specific data source, parameters or configuration.
  • Test report rendering into formats used by the organization. Initial tests require manual inspection of results, but subsequent tests could be automated using input reports under version control. The most useful of these tests is to output the report in XML format, which is likely to be complete, free of data conversions and most amenable to automated comparison tests.
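A minimal sketch of the XML comparison idea, assuming reports rendered to XML and a version-controlled baseline (the report fragments below are invented):

```python
import xml.etree.ElementTree as ET

def reports_match(rendered_xml, baseline_xml):
    """Compare a freshly rendered report against a version-controlled baseline,
    ignoring cosmetic differences such as attribute order, quoting style and
    whitespace-only text between elements."""
    def canonical(text):
        # C14N canonicalization; strip_text drops insignificant whitespace.
        return ET.canonicalize(text, strip_text=True)
    return canonical(rendered_xml) == canonical(baseline_xml)

# Hypothetical report fragments; real SSRS XML exports are far larger.
baseline = "<Report><Row Sales='1250.0' Store='325'/></Report>"
rendered = "<Report>\n  <Row Store='325' Sales='1250.0'/>\n</Report>"
print(reports_match(rendered, baseline))  # True
```

A comparison at this level catches data regressions while tolerating harmless serialization differences between rendering runs.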

A possible starting point for building a permanent RS unit testing framework can be found here:

Layout Testing

Automating report layout tests typically presents the greatest difficulties. If your organization uses the SSRS web portal, however, you can take advantage of a number of web UI automation tools that facilitate such testing.

Selenium WebDriver is a free tool for testing web page layout and functionality that works with Firefox, IE, Chrome and Safari. Automation test scripts are written in Java, C#, Ruby, Python or JavaScript. Scripts use the WebDriver API, which invokes a live browser or runs headless.

Other UI-based testing tools are also available such as the open-source Sikuli, freeware AutoIt or TestComplete, which is a proprietary tool by SmartBear.


Test automation has its limits, of course, especially with regard to SSRS report testing. For instance, usability testing is clearly out of scope for automation. Automation is further complicated in SSRS because data is often manipulated under the covers, beyond the control of the report itself, for example in embedded VB.Net code or in external class libraries.

Even so, automating report testing wherever feasible always pays off. Choose the low-hanging fruit first before expanding into other areas, such as troubleshooting suites and GUI-based layout testing. As you progress, you are likely to find that developing a mindset aimed at automation frequently instills a virtuous cycle of discovering new opportunities for further test automation.

In the SSRS testing areas that present the most difficult obstacles, employ third-party tools or engage an experienced automation consultant who can demonstrate the automation methods most appropriate for your SSRS development and usage scenarios.

Cloud Data Warehousing with Azure SQL

If you are running an on-premise data analytics stack on Microsoft’s SQL Server but running into maintenance, cost and scaling issues, consider moving your data system to a cloud-based database service such as Azure SQL Database. Especially for your first data analytics stack, Azure SQL Database provides low startup costs with the ability to expand easily as the business grows.

Advantages of Azure SQL Database

There are several benefits to moving on-premise SQL Server infrastructure to Azure:

  • Physical acquisition, provisioning and maintenance of SQL Server deployments are a thing of the past. Furthermore, scaling data infrastructure up or down is instantaneous with SQL Database elastic pools.
  • Azure assists existing database migration to Azure SQL Database with wizard-based tools.
  • All stored and transmitted data are encrypted via client-side keys.
  • Microsoft accommodates third-party and open-source technologies such as Python, Java, Node.js and PHP.
  • SQL developers feel right at home using SQLCMD or SQL Server Management Studio for development.

SQL Database Limitations

Although all the SQL Server components (SSIS, SSAS and SSRS) are available on Azure, there are still areas where the Azure version is not completely fleshed out. For instance, only a growing subset of T-SQL features is available so far, including cursors, transactions, triggers, all data types, all operators, and logical, arithmetic and string functions.

Additionally, many T-SQL statements in SQL Database do not support every option available in SQL Server 2016, such as CREATE/ALTER for databases, logins, tables, users and views. Also missing are collation of system objects, cross-database queries with three- or four-part names, the data collector, database diagrams and mail, some events, and certain database characteristics that were managed manually in SQL Server but are automatic in SQL Database.

For a full list of differences, see Azure SQL Database Transact-SQL differences.

Additional Azure Capabilities

SSRS is replaced with a separate service, SQL Reporting, which incurs a separate charge for reports. It is not a general reporting service, since it only works with SQL databases, but it offers a nearly identical development interface to traditional SSRS.

Azure Tables is a storage service targeted at non-relational data, a type preferred for many data analysis processes. It stores up to 100 TB of data via an Azure Storage account and supplies data in row form. Additional advantages include lower cost than raw Azure storage and easy scaling.

Built on top of Hadoop, HDInsight offers unstructured data storage plus a number of tools, such as Sqoop, Pig and Hive for query processing. Your in-house SQL Server, Excel or SQL Database are all able to connect to this service.

Data Factory is Microsoft’s SaaS analogue to SSIS. It visually coordinates other services to transform raw, unstructured data via data flow pipelines into clean, transformed data ready for analysis engines such as HDInsight or Azure Machine Learning for predictive analytics.

In lieu of SQL Reporting, you can utilize Microsoft’s SaaS Power BI for report, dashboard and visualization creation. You can use this tool in conjunction with your on-premise SQL Server installation or stored spreadsheets too.

Steps to Migrating from SQL Server to SQL Database

SQL Database is, in theory, backward-compatible all the way to SQL Server 2005. In spite of this, the first step in migration is to test and fix any compatibility issues that may exist with SQL Database V12.

There are several methods to determine compatibility, including the use of SQL Server Data Tools, the SqlPackage utility, SQL Server Management Studio’s Export Data-tier Application wizard and the Azure SQL Migration Wizard. SSDT, SSMS and SAMW can also be used to fix migration issues with your database.

The next step is to create an Azure SQL Database logical server and migrate your existing data to it. Although other methods exist, SQL Server transactional replication is the recommended approach since it minimizes live database downtime. Alternatives are to export/import BACPAC files when connection bandwidth is low or unreliable, or to use the SSMS database deploy wizard for smaller databases.

Running SQL Server in the Cloud Directly

There is nothing stopping you from beginning or continuing your SQL Server-based data analytics development and deployment without Azure SQL Database. Amazon AWS provides any level of SQL Server instantiation online with the advantages of computational, networking and storage elasticity on a pay-as-you-go basis. With a bit more lifting, you could do the same thing by utilizing Azure or AWS virtual machine services directly for your own SQL Server deployment.


Running data analytics in the public cloud brings all the usual benefits of cloud-based operation, the most important of which are elastic storage for big data crunching systems and high availability in-house or mobile across the enterprise.

Whether or not your business should consider a cloud-based data analytics deployment depends on several factors including TCO, data volume, bandwidth requirements, security and the need to scale operations up or down quickly.

A wise approach is to work with an experienced Optimus data analytics consultant to collate all factors and develop a full data architectural solution. Our experts can guide you towards the best solution for your needs.




Power BI and SSRS: How Do They Pair?

SQL Server Reporting Services is a powerful reporting tool in its own right with strong programmability, extensibility and scalability features that have led to its widespread adoption. As a report authoring tool, however, SSRS has lacked the powerful dashboard creation features of its competitors. With the release of SQL Server 2016, it now has the capability to produce stunning, information-packed visualizations when paired with Microsoft’s Power BI suite of business analytics tools.

Pairing Up SSRS and Power BI

The only prerequisites for getting SSRS and Power BI to play together are the installation of SQL Server 2016 CTP 3, a Power BI account of any level, registration of SSRS with Power BI and a running SQL Server Agent. Register SSRS with Power BI via Reporting Services Configuration Manager by entering the SQL Server name and RS instance and connecting. Then, under Power BI Integration, register a work or school Power BI account. The account must have permissions for registering apps with Azure Active Directory.

Basic SSRS/Power BI Interactions

Once registration is complete, you can upload any Power BI file to the Reporting Services web portal. After uploading, you can open a Power BI file, interact with the report, update its model, refresh the data and save it back to RS web portal when operations are complete. All other consumers of the report will see the changes instantly.

Power BI Mobile App

There is also a Power BI mobile application that connects to Reporting Services as long as network access to the RS server is available from your location. Once connected, you can interact with reports or already-defined KPIs in the web portal just as you would from your desktop.

BI Visualization Pinning

Where the partnership between SSRS and Power BI really shines is the ability for users to pin any report visualization, such as charts, matrices and maps, to a Power BI dashboard. These pins can carry additional annotations. To take advantage of this feature, you need to configure RS with a BI account that has application-add rights in Azure Active Directory.

Once you are signed into RS with Power BI credentials, you can begin pinning. The most common errors during this interaction occur because SQL Agent is not running, since the RS subscription subsystem is utilized for this feature. When the pairing is set up correctly, a Power BI icon appears in the RS toolbar.

You access “pinnable” items via an additional tab, which darkens everything in the display that cannot be pinned. Selecting an item brings up a dialog from which you select the group containing the dashboard to which you will add the pin. You also specify an update frequency so the pin is kept current with regular data refreshes.

Note that pins may “expire,” meaning they fail to show up at a later date. This occurs if you have not refreshed the authentication token during that period. The refresh is accomplished simply by signing in to Power BI every 90 days or less, at which time all your pins are updated.

The Future of SSRS and Power BI

One of the clear drawbacks to Power BI is that, for now, it is strictly a cloud-based solution. Users are clamoring for the capabilities of Power BI for their on-premise and hybrid systems. Microsoft is responding by rewriting SSRS to extend Power BI-like features to on-premise installations. This also requires that legacy SSRS users be accommodated to provide them with advanced features and an upgrade path.

The new SSRS, which has been making an impact since SQL Server 2016 RC1, supports all modern web browsers using HTML5. A single SSRS mobile app runs on all three major platforms. The RS interface and reports can be custom-branded for a more professional look.


As SSRS has become the de facto standard for modern BI analytics systems, its weakness in data visualization has come to the fore, which Microsoft is addressing by creating ways for SSRS to pair up with Power BI. This pairing, for now, is loosely coupled: the ability to interact with Power BI reports, a mobile Power BI app that connects to an RS server and the ability to attach SSRS visuals as tiles inside a Power BI dashboard.

Microsoft is now adding more capabilities within SSRS itself for big customers with on-premise BI operations, including another mobile app and better visuals. It seems logical that at some point Microsoft will unify the UIs, report formats and internal subsystems of SSRS and Power BI in order to reach its eventual goal of one billion Power BI users.


Check out our latest interactive Power BI demos. Play with real data to see how it translates into interactive visualizations:


Migrating Crystal Reports to SSRS

Most legacy ERP systems still utilize Crystal Reports for creating detailed, flexible and dynamic business reports. However, multiple acquisitions of the software over the years have left it behind current trends in data analytics systems. Even its latest owner, SAP, is moving to Microsoft’s SQL Server Reporting Services as its go-to solution.

The SSRS add-on to SQL Server has evolved into a web-based report server featuring essential data analytics functions such as data-driven subscriptions and a multi-format rendering engine. Its scalability also far exceeds that of Crystal Reports. Since many companies already own a SQL Server license, with which SSRS comes bundled for free, many are eager to convert legacy Crystal Reports resources to SSRS.

Migration Process Overview

If your organization is fortunate enough to have only a handful of relatively simple reports to convert, you might employ one of several third-party tools that automatically convert Crystal Reports RPT files to SSRS RDL. If you are using SQL Server 2016, you will need to further upgrade the RDL output via SQL Server Data Tools, as Microsoft has recently changed the internal report object model.

Most companies, however, will have dozens or hundreds of reports of varying complexity for conversion and many of these will not lend themselves well to automatic conversion.

Step One: Report Analysis

In order to reduce the number of reports to convert and prioritize the remainder, follow these guidelines:

  • Determine whether the report is still used, directly or indirectly via another report
  • Record how the report is accessed and how often
  • Discover which individuals or groups use the report
  • Prioritize report items by their value to business decisions

The results of this analysis will reveal the shortest path to converting the highest value reports and report data. This step often suggests ways in which reports can be consolidated or standardized by sharing common information across reports. This presents opportunities to share SQL stored procedures across reports as well.

In preparation for migration, also document technical details such as the specific data sources in use, which may include online and offline databases or software products such as OLAP Intelligence or MicroStrategy Intelligence Server.

Step Two: Migration Implementation

You should now be prepared to recreate report characteristics as new SSRS reports. Proceed in this order:

  1. Create new SQL queries and then fetch and verify the data sets for accuracy
  2. Set up the SSRS report layouts, noting that Crystal Reports uses a banded report design whereas SSRS uses a Table object
  3. Incorporate row-level data while building grouping relationships and subtotals
  4. Devise formulas and procedures to derive condensed results from primary data
  5. Create visual reporting artifacts such as charts, graphs and matrices
  6. Finally, set up any required sub-reports and linked reports
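Step one of the sequence above can be sketched in T-SQL. A common pattern is to wrap the dataset query in a stored procedure so that several converted reports can share it, as suggested in the analysis step. All table, column and procedure names here are hypothetical:

```sql
-- Illustrative dataset query for a converted report, wrapped in a
-- stored procedure so multiple SSRS reports can reuse it.
CREATE PROCEDURE dbo.rpt_SalesByRegion
    @StartDate date,
    @EndDate   date
AS
BEGIN
    SET NOCOUNT ON;
    SELECT r.RegionName,
           s.StoreId,
           SUM(o.OrderTotal) AS TotalSales
    FROM   dbo.Orders  AS o
    JOIN   dbo.Stores  AS s ON s.StoreId  = o.StoreId
    JOIN   dbo.Regions AS r ON r.RegionId = s.RegionId
    WHERE  o.OrderDate >= @StartDate
      AND  o.OrderDate <  DATEADD(day, 1, @EndDate)   -- inclusive end date
    GROUP BY r.RegionName, s.StoreId;
END
```

Once the procedure returns verified data, the SSRS report layout, grouping and subtotals in the later steps are built on top of this dataset.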

Unfortunately, completing these steps accurately may require serious sleuthing skills if the original reports are not well documented, which is often the case.

Difficulties of Automation

Although some conversion steps above may appear amenable to automation, such tools often run into serious obstacles even with reports of relatively low complexity. These problems often derive from the distinct software architectures used by each product. Furthermore, no Crystal Reports conversion tool performs the initial analysis step that’s so vital to ensuring your upgraded reporting system is cost-effective and efficient.

There are numerous examples of automated conversion issues that are difficult to debug and correct:

  • The use of shared data sources often generates exceptions
  • Differences in parameter names may interfere with data source connections
  • Parameter renaming often breaks dependencies in the same or other reports
  • Required parameters in SSRS are sometimes missing in Crystal Reports
  • Layout fields and blocks often intersect, and overlapped objects render poorly
  • The Keep Together feature in Crystal Reports frequently translates incorrectly
  • Expressions and VB code in Crystal Reports often break or produce incorrect results in SSRS
  • Multicolumn reports and text boxes frequently require manual adjustment in SSRS

Even in cases where auto-conversion may appear to work at a 95% level, it is highly recommended that the conversion be overseen by a trained specialist who can uncover hidden flaws and correct them before they reach end users who depend on a clean, fully functional reporting system.

Employ a Phased Approach

If your Crystal Reports to SSRS conversion appears even moderately complex, take a phased approach to the migration, including detailed upfront analysis to uncover unforeseen, time-consuming obstacles.

After initial testing, deliver reports to a select group of knowledgeable users for beta testing before widespread deployment. Reduce inevitable end-user issues by thoroughly documenting any changes in appearance or usage.


Migrating from Crystal Reports to SSRS provides an ideal opportunity to evaluate and streamline your entire reporting solution to ensure BI decision makers are getting the information they require in the most effective manner. SSRS will continue to offer more advanced features and grow with your business needs due to its superior scalability.

You will likely find that automated migration presents as many problems as it solves and that using an experienced conversion partner to implement a detailed manual conversion process pays the highest dividends.

SSRS Best Practices

Installing, configuring and running SQL Server Reporting Services is a non-trivial set of tasks. To successfully perform these, a company needs people with skills related to databases, system administration, report creation and data analysis. Most initial deployments and their usage can take advantage of several best practices in order to improve performance, maintenance and the production of quality reports.

Deployment Tips

Safeguard the Encryption Key

A number of items such as passwords and connection strings are encrypted within SSRS and stored within the Report Server database or configuration files. Many an installer has forgotten to back up the encryption key and later regretted it. Keep a copy of the key in a password-protected file and place copies in both the SSRS server and another safe spot outside the server. When migrating SSRS to another server, restoring this key will avoid having to re-create encrypted items such as your data sources.

Simplify Security Control

Sysadmins should be in the habit of granting access privileges to Active Directory groups rather than to individuals. Furthermore, reports themselves should be categorized into folders, and access privileges should be assigned per folder. Hierarchical access can be created by cascading privileges down through subfolders.

Use SQL Authentication in Lieu of Windows Authentication

Windows Authentication for connecting to report data sources is only reliable when the SQL Server database and the SSRS server reside on the same machine, which is often inconvenient for users accessing remote machines. To enable access from another machine, use SQL Authentication instead, or supply account credentials that are valid on the remote machine.
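For illustration, a data source connection string using SQL Authentication against a remote server might look like the following; the server, database and account names are hypothetical, and the password itself should be stored in the SSRS data source's credential settings rather than in plain text:

```
Data Source=REMOTESQL01;Initial Catalog=SalesDW;User ID=ssrs_reader;Password=********
```

Using a dedicated low-privilege SQL login such as a read-only reporting account keeps report queries from acquiring more rights than they need.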

Performance Improvement Tips

Optimize SELECT Statements

A bit of fine-tuning of SELECT statements goes a long way to improving SSRS performance by reducing the amount of data fetched to only that which is necessary. Get in the habit of specifying only the column names relevant to your query instead of using SELECT * statements.
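As a minimal sketch (table and column names are illustrative), the difference is simply:

```sql
-- Avoid: fetches every column, whether the report uses it or not
SELECT * FROM dbo.Orders;

-- Prefer: fetch only the columns the report actually renders
SELECT OrderId, OrderDate, CustomerId, OrderTotal
FROM   dbo.Orders;
```

Beyond reducing network and memory load, an explicit column list also keeps the report from silently breaking when columns are added to or reordered in the underlying table.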

Profile Slow Queries

Take advantage of the GUI-based SQL Server Profiler to analyze long-running queries and stored procedures and see whether you can wring out even better performance. Note that Profiler is deprecated in later releases of SQL Server except for Analysis Services workloads.
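With Profiler on its way out, a lightweight alternative (not mentioned in the original tip) is to query the plan-cache statistics DMVs directly. A sketch, assuming you have VIEW SERVER STATE permission:

```sql
-- Top 10 cached statements by average elapsed time (microseconds).
SELECT TOP (10)
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
       qs.execution_count,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                     WHEN -1 THEN DATALENGTH(st.text)
                     ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM   sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_elapsed_us DESC;
```

Statements feeding slow report datasets tend to surface near the top of this list and are good candidates for tuning.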

Use WHERE to Filter at the Database

Learn to use the WHERE clause in your queries for data filtering whenever possible. It is far more efficient for data filtering to take place at the database level and not at the report level. Filtering at the report level should only be used when multiple report items, such as matrices, lists and charts, are sharing a single dataset and each requires different methods of filtering.
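A minimal sketch of pushing a report parameter into the query so the filtering happens at the database (names are illustrative):

```sql
-- @StartDate and @Region are SSRS report parameters mapped to
-- query parameters, so only matching rows ever leave the database.
SELECT StoreId, OrderDate, OrderTotal
FROM   dbo.Orders
WHERE  OrderDate >= @StartDate
  AND  Region     = @Region;
```

The equivalent report-level filter would pull every row across the wire and discard most of them inside SSRS, which is exactly the overhead this tip avoids.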

Maintenance Tips

Shared Data Sources

Expert programmers know the old maxim that any problem can be solved by introducing another level of indirection, and it applies to SSRS reports as well. If several of your reports use a common data source connection, do not refer to that connection directly within each report. Instead, set up a project-level Shared Data Source and refer to it within the reports. If the source of report data changes later, you need only modify it in one place for all reports to pick up the new data source.

Maintaining the SSRS Report Server Catalog

The SSRS architecture includes a report server catalog, which caches data used to serve up previous reports without re-creating them afresh from the SQL Server database. Regularly re-index the catalog tables and update database statistics to keep query performance high.
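A sketch of that maintenance, assuming the default catalog database name of ReportServer (adjust to match your instance):

```sql
-- Illustrative maintenance for the SSRS catalog database.
USE ReportServer;

-- Rebuild all indexes on the main catalog table
ALTER INDEX ALL ON dbo.Catalog REBUILD;

-- Refresh statistics across the catalog database
EXEC sp_updatestats;
```

Running this as a scheduled SQL Server Agent job during a quiet window keeps the overhead away from report users.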

Report Design Tips

Improving Readability

There are several ways in which you can make SSRS reports more accessible to end users, such as maintaining uniformity of the font type and its size throughout the report, avoiding too many colors, saving space by proper textbox orientation and using alternating row colors. If the report is exported into multiple formats, always perform each export and check the output for readability.

Improving Content Quality

For reports that require the most up-to-date information, prominently display the time and date of the last refresh. This practice helps end users evaluate the report’s relevance. Also, display the data filter parameters, which assists readers’ report comprehension. When presenting visuals, such as pie charts and bar graphs, avoid using too many data categories as the smaller ones will clump together. If necessary, break out the smaller categories into a secondary visual.


SQL Server Reporting Services is a key component for modern enterprise Business Intelligence solutions. It provides a huge array of features essential to extracting and presenting the most relevant data to business decision makers. As such, its complexity requires careful deployment and optimization to achieve the best performance possible and high quality, informative reports for its end users. The tips presented here will help your organization utilize SSRS to best advantage.

SSIS Best Practices

SSIS is a complete rewrite of its predecessor, Data Transformation Services. Usability, parallelism and performance have all been vastly improved over the years, resulting in a SQL Server component aimed at high-volume, high-performance ETL applications. Here are several additional tuning steps that can further increase SSIS performance for your particular jobs.

Managing Parallel Execution

MaxConcurrentExecutables and EngineThreads are two SSIS properties used to optimize SSIS' parallel execution capabilities. The first specifies the number of tasks the SSIS runtime engine runs in parallel. Its default value is -1, which allocates a number of threads equal to the number of processors/cores plus 2. If MaxConcurrentExecutables is less than the number of tasks available to run in parallel, only that many tasks run simultaneously; when one completes, another starts.

EngineThreads limits the number of source and worker threads the data flow pipeline engine uses to execute data flow tasks. Its effect is thus "nested" within the overall parallelism governed by MaxConcurrentExecutables. Its default value is 10, meaning up to 10 source threads and 10 worker threads can be allocated.

In order to tune these values for an ETL job, turn on PipelineExecutionTrees event logging, which tells you how many execution trees are used per data flow task. Use this number to fine-tune the setting of EngineThreads.

One caveat here is that these values cannot override precedence constraints between tasks, which may decrease the amount of execution parallelism available.

Use Logging Judiciously

SSIS logging is very helpful in tracking down bugs in package code or for monitoring packages already debugged and deployed. It allows exclusive selection of events of interest and permits logging them to any of several destinations.

The LoggingMode property within a package enables and disables logging easily without package modification. Use it wisely, as logging adds SSIS runtime overhead. Be aware also that a slowdown caused by excessive logging can mask race conditions in code execution, the opposite of what is desired when debugging intermittent package problems.

Taking Pressure Off TempDB and the Transaction Log

During high-volume data transfers, changing the defaults for two SSIS settings will avoid uncontrolled growth of the transaction log and tempdb. By default, Rows per batch and Maximum insert commit size are set to their maximum values of -1 and 2 GB, respectively, meaning all incoming rows are processed as a single batch and committed in one go. For high-volume transfers, set these to more reasonable values that match your execution environment.

Avoiding Defragmentation When Loading Large Data Volumes with Keys

When transferring several million to hundreds of millions of rows from one table to another, you may see a sharp slowdown in execution if the target table carries a clustered index or a large number of non-clustered indexes, because the load fragments them. Fragmentation may reach as high as 90 percent.

It is possible to periodically halt the transfer and rebuild the indices, but a more efficient solution is to drop the indices completely, turn the target table into a heap and transfer all remaining data to this heap. Once the raw transfer is complete, rebuild the indices. This will decrease the overall execution time of the entire transfer several fold.
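The drop-load-rebuild pattern can be sketched as follows; table and index names are hypothetical:

```sql
-- 1. Drop the non-clustered indexes, then the clustered index
--    (the table becomes a heap). If the clustered index backs a
--    primary key, use ALTER TABLE ... DROP CONSTRAINT instead.
DROP INDEX IX_Target_Customer ON dbo.TargetTable;
DROP INDEX CIX_Target_Id      ON dbo.TargetTable;

-- 2. Bulk-load the remaining rows into the heap
--    (e.g. via the SSIS data flow destination).

-- 3. Rebuild the indexes once the load completes.
CREATE CLUSTERED INDEX    CIX_Target_Id      ON dbo.TargetTable (TargetId);
CREATE NONCLUSTERED INDEX IX_Target_Customer ON dbo.TargetTable (CustomerId);
```

One bulk rebuild at the end is far cheaper than maintaining the indexes row by row throughout the load, which is where the several-fold saving comes from.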

Save Time with Checkpoints

The Checkpoint feature lets a package restart from the point at which it failed instead of from the very beginning. When Checkpoints are enabled on a package, execution status is logged to the file named in the CheckpointFileName property. Deleting this file causes the package to start over from the beginning.

Additionally, set CheckpointUsage, SaveCheckpoints on the package and FailPackageOnFailure for data flow tasks you want to have restart capability. Note that Checkpoint only works at the package level and not inside data flow tasks, which will start at the beginning of their internal sequence.

When performing long, high-volume ETL operations, Checkpoint can save many hours. It also is a great assist in debugging new packages. In the latter case, use it in conjunction with the Boolean DelayValidation property, which controls SSIS early and late validation of packages and components prior to execution.


SQL Server Integration Services performs high-volume ETL operations, typically to populate data warehouses. The focal point of SSIS development is the SSIS package, created within Business Intelligence Development Studio. Packages are used for various workflows including data cleansing, standardization, integration and administrative tasks.

At the package, task and system levels, a number of tweaks are possible to optimize SSIS performance for the job at hand or to assist in debugging new or modified packages. Bear in mind that these tips are not hard and fast rules, because SSIS performs dynamic optimizations internally at runtime. Thorough testing and monitoring is always a wise approach when applying these practices to your particular tasks.

This is just the tip of the iceberg for optimizing your SSIS, and there are many more tips and tricks to make your data systems clean and efficient. If you need help or have questions about your current system’s performance, our data analytics specialists will be able to guide you. Give us a call, we’re always happy to help.