4 Ways Azure is Rising to Meet Data Warehouse Demands

In today’s data-first world, IT infrastructure is the foundation for strategic decision-making, with companies requiring larger quantities of data in shorter periods of time. This is putting the traditional data model – where data from systems like CRM, ERP and LOB applications is extracted, transformed and loaded (ETL) into the data warehouse – under pressure. The problem is compounded by increased data volumes from social apps, connected devices (IoT) and other emerging data sources.

The need to gather data from traditional, transactional systems like ERP, CRM and LOB applications, and then integrate that data with social, mobile and connected-device sources, has driven the adoption of big data storage technologies such as Hadoop. At Optimus, we’re finding that more and more users demand predictive, real-time analytics to make use of their data, something that can’t be done with traditional data warehouse tools. Consequently, organizations are considering cloud-based solutions such as Azure to transform their data warehouse infrastructure.

Microsoft knows this and is growing its solution portfolio accordingly. Below are four ways in which Microsoft Azure is adapting to meet the demands of today’s modern data warehouse.

1. Consistently High Performance for All Volumes of Data

Microsoft is working to solve the problem of achieving high performance on large datasets through MPP technologies, in-memory columnstore indexes and optimizations to the core query engine. In particular, Optimus is seeing SQL Server emerge as a leader in performance and scalability. SQL Server can exploit servers with large numbers of cores supporting complex vector instructions, terabytes of memory and local flash storage that provides high I/O bandwidth. When a workload is optimized for this inherent parallelism and concurrency, it is not uncommon for a single server to outperform large distributed databases.

In one example, Microsoft and Intel teamed up to create a 100 terabyte data warehouse using a single server, four Xeon E7 processors and SQL Server 2016. According to the report, “The system was able to load a complex schema derived from TPC-H at 1.6TB/hour, and it took just 5.3 seconds to run a complex query (the minimum cost supplier query) on the entire 100TB database.”

2. Storing Integrated Data

Companies are looking for ways to store integrated – both relational and non-relational – data of any size, type and speed without forcing changes to applications as data scales.

Enter the Azure Data Lake Store. Data Lake makes it simple for everyone, from analysts to developers and data scientists, to access, add and modify data, regardless of its state.

Facilitating all of this is Azure HDInsight, a cloud-based Hadoop and Spark service. HDInsight lets your team create analytics clusters that turn data into actionable insights. In addition to a fully managed Hadoop service, Microsoft offers PolyBase, which provides the ability to query relational data and non-relational data in Hadoop with a single, T-SQL-based query model.

3. Built with Hybrid Data Storage at the Core

While the cloud continues to gain popularity, companies are realizing that they still need to keep at least some information on-premises. Microsoft is acutely aware of this and has built Azure accordingly. Their data warehousing and big data tools are designed to span on-premises and cloud warehouses. Microsoft’s hybrid deployment is designed to provide the control and performance of on-premises with the scalability and redundancy of the cloud. Optimus is seeing users access and integrate data seamlessly, while leveraging advanced analytics capabilities, all through Azure.

4. Machine Learning and Big Data in Real-Time

Traditional advanced analytics applications use outdated methods of transferring data from the warehouse into the application tier to derive intelligence, resulting in unacceptably high latency and poor scalability.

In contrast, Microsoft has transformed integrated analytics with machine learning in the cloud. The Cortana Intelligence Suite, coupled with R Server, can be deployed both on-premises with SQL Server and in the cloud with HDInsight. The resultant solution is one that solves for hybrid, scales seamlessly and enables real-time analytics.

There are many factors driving companies to consider an Azure Cloud data warehouse migration. To learn more, check out our e-Book, Building a Modern Data Warehouse on Azure.

Does Your Data Warehouse Belong in the Azure Cloud? Here are Some Things to Consider

It’s no secret: Microsoft Azure is hot right now, as demonstrated by its 97% growth in Q2 2017. With more organizations migrating their data infrastructure to the cloud every day, some companies are asking themselves: does my data warehouse belong in Azure? While there’s no simple answer, there are ways to begin assessing your current data infrastructure’s suitability for an Azure Cloud migration.

The Cost Factor

The team at Optimus has found cost to be one of the top drivers, if not the top driver, for cloud adoption. There are several factors businesses should consider where cost in the cloud is concerned:

  • If your business is cyclical (e.g. retail, with high volumes throughout the holiday season), the cloud pay-as-you-go model makes strong financial sense. Cyclical companies can burst to the cloud when they need to, saving them from buying new servers that may only be required a few weeks per year. Conversely, it may not be cost-effective to move workloads that must run at a stable level 24/7/365 to the cloud, especially if they are running on equipment that does not need upgrading in the foreseeable future.
  • At Optimus, we have found that many organizations prefer opex over capex. Opex tends to be easier to manage over the long term, especially for fast-growing businesses where a significant capex could stall growth. The more a business transitions to the Azure pay-as-you-go model, the more it shifts its data warehouse costs from capex to opex.
  • The apportioning of data costs across departments is significantly simplified in Azure. Pricing for individual workloads is made transparent, and data usage is easily tracked.
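The burst-versus-buy arithmetic behind the first point can be sketched with purely hypothetical numbers (server prices, cloud rates and burst duration below are illustrative assumptions, not Azure pricing):

```python
# Hypothetical figures for illustration only -- not real Azure pricing.
servers = 10             # extra capacity needed during the holiday burst
burst_weeks = 6          # weeks per year the extra capacity is actually used
capex_per_server = 8000  # purchase price, amortized over 3 years
cloud_rate = 120         # pay-as-you-go cost per server-week

# Owning the burst capacity: pay for it all year, every year.
owned_cost_per_year = servers * capex_per_server / 3

# Bursting to the cloud: pay only for the weeks actually used.
cloud_cost_per_year = servers * cloud_rate * burst_weeks

print(round(owned_cost_per_year))  # 26667
print(cloud_cost_per_year)         # 7200
```

Under these assumptions the burst workload is cheaper in the cloud by a wide margin; a workload running at a steady level all year would tip the same comparison the other way.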

When considering leveraging Azure for your data warehouse, it is important to remember that a cloud migration is not an all-or-nothing endeavour. Every business will have certain workloads that make financial sense in the cloud and certain workloads that should remain on-premises. Perform an accurate assessment of your current data infrastructure to determine your cloud suitability.

What are Your Data Governance Requirements?

Meeting data governance and regulatory requirements is at the forefront of the mind of anyone considering an Azure migration, and for good reason. Moving legacy, on-premises data infrastructure to the cloud presents a difficult regulatory landscape to navigate.

Your industry may determine your suitability for an Azure Cloud data warehouse migration. Certain sectors, such as finance and healthcare, have strict data governance laws to comply with. You need to make sure your – and your clients’ – data remains within certain jurisdictions, something that may prove challenging and will influence your choice of what data to move to Azure.

Do you need to retain control over user authentication? If yes, you’ll need to look at the feasibility of this with various applications. Your service provider will be able to assess this with you and make the right recommendations.

Latency: Still a Consideration?

The short answer is yes. In particular instances where the speed of data transaction is mission-critical, an internal data warehouse may be required. This is common in the financial industry, where trading companies are under increasing pressure to host their servers physically close to a stock exchange’s computers. In an industry where transactions are conducted in microseconds, speed is priority number one.

While Azure has made significant improvements to latency times, the fact remains that the closer two computers are to each other, the faster they can communicate. At Optimus, we have seen companies with these types of operational concerns benefit from leaving some of their data on-premises. However, because the amount of data required to perform at a high level is typically small, leveraging the public cloud is still a viable option for most organizations.

There are many factors to keep in mind when considering a data warehouse migration to Azure. To learn more, check out our e-Book, Building a Modern Data Warehouse on Azure.

Infographic – The Modern Data Warehouse Framework

Check out our latest infographic, The Modern Data Warehouse Framework!

As organizations collect and process increasing amounts of data from a growing number of data sources, data systems must evolve to keep up. To make the best data-driven decisions possible, you need to reimagine the way you look at data warehousing.

We took a look at how to transition your data warehouse to the cloud and put together our top 8 recommendations for building a modern data warehouse on Azure.

 

Download the PDF here

 


Power BI for Mobile: Take Your Data on the Road

One area where the Power BI software stack really shines is the mobile space. The Power BI product line includes three apps: one each for iOS, Windows Phone and Android. These apps allow you to take anything you can generate in Power BI and make it readily available to any stakeholder with a mobile phone or tablet. With a couple of swipes, users can quickly interact with all of your analysis. Power BI brings together the advantages of mobile devices, big data systems and compelling visualizations in a way that permits everyone involved to make better decisions.

The Power of the Dashboard

It’s one thing to produce an informative chart, but it’s quite another to deploy a fully interactive dashboard that can fetch real-time updates. Power BI permits you to tie together data from a variety of sources, including numerous non-Microsoft products. For the end user, such as the person in marketing who just needs to see today’s report, the component that makes it all accessible is the dashboard.

Power BI dashboards allow you to publish any type of common infographic, geospatial information or visualization. If you need a bubble chart that displays the YTD performance of your company’s retail outlets, there’s an out-of-the-box solution for that with Power BI. It also allows you to create maps and overlay existing information onto those maps. Instead of just seeing that Store #325 is performing well, an app user can pull up the dashboard and see on the map whether that’s a one-off phenomenon or a regional trend.

Making Information Accessible

In the world of data analytics, a lot of work goes into empowering decision makers who may not have a strong technical background. It’s extremely beneficial to give those people an app that allows them to quickly sort through the available data in a clear format. If your buyers can quickly bounce between multiple years’ worth of data and make comparisons, they can make important decisions faster.

Power BI also allows you to determine how the dashboard accesses the available information. Rather than simply presenting users a static set of reports, you can configure queries that allow them to sift through in a self-guided fashion. If someone needs access to a real-time inventory report, your dashboard can be configured to fetch that information from the company’s SQL Server installations. This allows members of your organization who might not be data scientists to rapidly develop insights that can guide their choices. 

Cross-Platform Compatibility

Keeping everyone in your business on the same page can be a challenge. Microsoft has gone to great lengths to ensure that the Power BI apps display information faithfully and function in a similar fashion on every platform. The hypothetical users in our example will have no trouble grabbing an art department iPhone and finding everything they need.

Data Sources

Any data source that can be accessed inside Office or Power BI can be presented within the app’s dashboard. If you need to present data from an Excel sheet in an appealing manner to someone on the other side of the planet, the app can make that happen. It also allows you to connect to commonly used data sources, such as SQL Server Reporting Services, and outside sources, such as Google Analytics, Salesforce or MailChimp. You can even mix and match functionality, for example overlaying Salesforce data on Google Maps.

Conclusion

Business intelligence is about putting the right information in the right hands, in a format that makes a visually compelling case. Your company will likely invest a lot of effort in the coming years into producing analysis and generating insights. With Power BI’s mobile app, you can ensure that the people who need those insights have access to them with the touch of a finger. The app allows you to pass analysis along to stakeholders in a secure environment that makes interacting with the data easy. In short, it makes all your data analytics faster, more appealing and more accessible.

If you have questions about getting started with Power BI or want to push the toolset further, give us a call. We’re always happy to answer any questions.

 

 

Building an On-Site Data Warehouse Using the Microsoft Stack

Most IT specialists are aware of individual components within SQL Server and may use one or two of them regularly for data analytics or general operations. Outside of IT, others may be familiar with one of several analytics-oriented SQL Server add-ons such as those for Excel that are targeted at data analysis or data mining in general.

It is likely that far fewer staff, if any, have combined SQL Server components into a complete data analytics stack that enables the organization to build a high-performance solution, using on-premises and external data sources to deliver meaningful analytics across the enterprise.

Fortunately, Microsoft provides all the tools and services necessary for a traditional, on-site data analytics system. This article outlines these components and their general usage. Of necessity, it cannot cover all the intricacies that may arise out of business-specific requirements.

Normalized vs. De-Normalized Databases

Relational databases use a normalized data organization optimized for rapid insert and update operations, which makes them ideal for OLTP workloads. Normalization minimizes redundancy at the cost of more complex reads, so these databases are not well suited to data warehouses. Warehouses instead use a de-normalized OLAP design optimized for the high-volume read performance that analytical processing requires.
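The trade-off can be shown in miniature with SQLite (standing in for any relational engine; the schema and table names are invented for illustration). The normalized read needs a join back to its dimension table; the de-normalized fact table answers the same question with a single-table scan:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized (OLTP-style): facts reference dimension tables by key.
cur.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'West'), (2, 'East');
INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);
""")

# Analytical reads must join back to the dimension table.
normalized = cur.execute("""
SELECT c.region, SUM(o.amount)
FROM orders o JOIN customers c ON o.customer_id = c.id
GROUP BY c.region
""").fetchall()

# De-normalized (OLAP-style): the region is copied into each fact row,
# so the same summary is a single-table scan with no join.
cur.executescript("""
CREATE TABLE sales_fact (region TEXT, amount REAL);
INSERT INTO sales_fact
SELECT c.region, o.amount
FROM orders o JOIN customers c ON o.customer_id = c.id;
""")
denormalized = cur.execute(
    "SELECT region, SUM(amount) FROM sales_fact GROUP BY region"
).fetchall()

print(sorted(normalized) == sorted(denormalized))  # True: same answer, simpler read path
```

At warehouse scale, eliminating joins like this across billions of fact rows is what makes the de-normalized design pay off.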

Other Reasons to Use a Separate Data Analytics Database

In addition to optimizing read performance, creating a separate OLAP database versus utilizing the OLTP corporate database has several advantages:

  • Analytical queries that summarize large quantities of data, collected from disparate sources, will not bog down the corporate database servers.
  • Historical data can be cached rather than archived, as is typical on OLTP databases.
  • You are free to consolidate data from several sources, including off-premises databases, social media sites or a variety of document formats such as XML or Excel.
  • Cleansing and optimizing your data for analysis can be automated using your business-specific rules, queries and schedule.

The Microsoft Data Analytics Stack

SQL Server 2016 contains the entire suite of components necessary to build your first data warehouse. The three critical components are Integration Services, Analysis Services and Reporting Services.

SQL Server Integration Services (SSIS)

SSIS is SQL Server’s ETL component. It connects to your data sources, migrates data, restructures it according to business-specific rules and populates the database. It manages acquiring data from multiple sources, cleansing that data, and performing type conversions and any other transformations required to streamline analysis. SQL Server Agent permits automatic scheduling of these processes so the database always contains the most relevant and timely data.

Use the particular business problems you are solving to guide ETL preparation and specification. A user-centric approach pares down and organizes the data being collected for the most efficient extraction and presentation.
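SSIS itself is configured visually, but the extract-transform-load pattern it automates can be sketched in a few lines of plain Python. The CSV content, column names and cleansing rules below are invented for illustration:

```python
import csv
import io
import sqlite3

# Extract: a raw CRM export (an in-memory CSV standing in for a real source).
raw = io.StringIO("customer,amount,currency\nAcme, 1200 ,usd\nGlobex,950,USD\n")
rows = list(csv.DictReader(raw))

# Transform: apply business-specific rules -- trim whitespace, cast amounts
# to numbers, normalize currency codes.
clean = [
    {"customer": r["customer"].strip(),
     "amount": float(r["amount"]),
     "currency": r["currency"].strip().upper()}
    for r in rows
]

# Load: populate a warehouse staging table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE staging_sales (customer TEXT, amount REAL, currency TEXT)")
db.executemany(
    "INSERT INTO staging_sales VALUES (:customer, :amount, :currency)", clean
)
total = db.execute("SELECT SUM(amount) FROM staging_sales").fetchone()[0]
print(total)  # 2150.0
```

In a real SSIS package, each of these three stages becomes a configured data-flow task, and SQL Server Agent runs the whole pipeline on a schedule.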

SQL Server Analysis Services (SSAS)

SSAS is critical to creating the most useful and highest-performing database queries. This component transforms normalized or relational data into multidimensional database “cubes” that are tuned to deliver rapid results to ad hoc queries. It does so through data indexing, data aggregation and enabling complex data calculations to be embedded at the cube level where processing is most efficient.
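The core idea behind a cube's aggregations can be sketched without SSAS at all: pre-compute every rollup combination once, so ad hoc queries become lookups instead of scans. The dimensions and figures below are invented for illustration:

```python
from collections import defaultdict
from itertools import product

# A tiny fact table: (region, year, sales). In SSAS these would be two
# dimension attributes and one measure.
facts = [
    ("West", 2016, 100), ("West", 2017, 150),
    ("East", 2016, 75),  ("East", 2017, 125),
]

# Pre-aggregate every combination of dimension levels, including the
# "ALL" rollup level, exactly once.
cube = defaultdict(int)
for region, year, sales in facts:
    for r, y in product((region, "ALL"), (year, "ALL")):
        cube[(r, y)] += sales

# Ad hoc queries are now constant-time lookups.
print(cube[("West", "ALL")])  # 250: West across all years
print(cube[("ALL", 2016)])    # 175: all regions in 2016
```

A real cube adds hierarchies, indexing and embedded calculations on top of this, but the pay-once-query-many aggregation strategy is the same.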

SQL Server Reporting Services (SSRS)

SSRS completes the stack by providing the query and reporting layer on the SQL Server database. It offers everything required to build detailed, complex reports with visualizations, along with facilities to manage and distribute reports securely. Users may view reports via the SSRS Web Portal, Power BI or a SharePoint server, and end users can import reports directly into Excel or PowerPoint.

SQL Server Data Tools

This package works within Visual Studio (VS) and provides development tools for all the integration, analysis and reporting steps outlined above. It is free, and it installs a VS shell if you do not already have VS installed. SSDT lets you design and build your OLAP database and SQL Server content in much the same manner as you would a VS application. It replaces the earlier Business Intelligence Development Studio.

Conclusion

Microsoft’s SQL Server plus SSDT support development of a complete data analytics system that includes an efficient OLAP database, an optimized data schema and a full pipeline of data acquisition, pre-processing, analysis, delivery and visualization. Many organizations without a current data analytics solution may not have expertise in all the SQL Server components required, and those that do have typically applied their skills only toward OLTP-type solutions.

Thus, many businesses employ an experienced partner for architecting a data analytics system that is scalable, extendable and highly secure. The most capable consultants are able to train and guide in-house staff through the entire development and maintenance lifecycle such that the enterprise becomes self-sufficient at further expanding and optimizing the system.

If you need help or advice on your data analytics system, let us know. Our team of data experts has worked with businesses large and small across all industries and has the right specialized technical skills to guide you towards a solution.

 

 

 

Data Analytics in the Cloud: Where to Start?

An enterprise-wide data analytics system pulling data from multiple sources, correlating and presenting results in relevant, insightful visualizations to enable prompt, informed decisions is the dream of many organizations. Those in the lead already reap the benefits of faster, high-accuracy, proactive decisions that data analytics provides.

Getting to that point requires careful planning and execution by stakeholders from top to bottom and across departments in order to achieve an implementation that is useful, maintainable and flexible enough to handle ever-improving data analytics technology and an increasingly competitive business environment.

Don’t Go Too Big Too Fast

Data analytics systems are most valuable when shared across the organization. Therefore, cross-departmental input and commitments are vital, as is a high degree of project transparency. Collecting requirements and then delivering an all-at-once solution months or quarters later is courting disaster. That sometimes works for limited software development projects, but data analytics initiatives necessarily demand a much larger scope in both duration and breadth.

Adopt an incremental mindset from the start by applying a less risky, phased and flexible tack to your data initiative’s development. The project should produce regular, continuous adjustments to functionality, features and metrics without unnecessary churn. This paradigm is most likely to produce a quality product with high acceptance across the business.

Gain Executive Stakeholder Buy-In

With the correct attitude regarding the initiative’s progression, gain C-suite buy-in via a detailed assessment and quantification of the business needs to which data analytics capabilities can add measurable value. These come from conversations with department heads, people managers, project managers and line workers involved in operational activities that would likely benefit from insights delivered through readily accessible business analytics.

Collect Technical Requirements

After executive endorsement, pinpoint technical needs, features and functions required to create a system meeting the project’s strategic goals. These requirements are derived from various operational aspects:

  • Existing data analytics-related processes, e.g. CRM, and their supporting software and infrastructure
  • Existing data sources, with a baseline of the what, when and how of data storage and processing
  • Where applicable, data sharing patterns, especially where data transformation steps are required
  • The current collection of in-house, open-source or cloud-based tools and services in use

Turn Technical Requirements into KPIs

Concurrently with gathering technical requirements, work closely with stakeholders to develop meaningful metrics and KPIs. Examples include metrics around analytics data acquisition, storage and cleansing. Marketing metrics might measure campaign response; high-interest sales metrics centre on conversions and ROI, whereas support metrics include customer satisfaction and retention. Be open to innovative metrics that a new data analytics system could enable.

Ask for Resource Commitments

While collaborating on KPIs, initiate frank discussions about the workers and material that stakeholder departments or teams are willing to commit to the project. The benefits of such commitments should already have been incorporated into the specific KPIs that benefit those teams.

Choosing a Software Model

Inconsistent use of software models, such as open source, in-house or cloud-based SaaS, is common in companies, often the result of an organic accumulation of tools and projects over time. Your data analytics implementation will not correct old problems, but its software model should be chosen based on technology availability, costs and the ability to scale and adapt as your company expands.

For instance, even with a strong in-house IT development capability, the benefits of basing your data analytics implementation on cloud-based SaaS are compelling.

First, removing the constraint of large capital outlays and their approval alone makes a forceful argument for choosing pay-as-you-go cloud SaaS. Furthermore, this complements your phased approach, as infrastructure and services are easily scaled and maintenance is automatic. Finally, today’s cloud SaaS from the best providers is highly customizable, which enables rapid functionality development and ease of modification during ongoing development.

Additional Tips

  • Expect dirty data, especially from social media, and deal with it at the source where possible. Employ tools such as import.io, diffbot and ScraperWiki in this battle. Especially during testing, consider importing customized, on-demand data sets. 
  • Be sure data analytics dashboards and reports are highly customizable but easy to learn. This area is the face of your initiative for the majority of users. Also, ensure dashboard functionality works for your mobile users.
  • Build in extensibility. This means anticipating new data sources and leaving room for the latest in predictive analysis technology.
  • If you are using a phased, results-oriented approach, you will have plenty of opportunities to celebrate small victories. Relish and share these milestones.

Conclusion

Data analytics has a proven track record of providing enterprises with streamlined, on-target decision-making improvements based on “right-time” data flows from inside and outside the company. Implementing the best system for your company requires intense, thorough planning followed by step-wise development and deployment.

Realize that even as your project begins achieving its end goals, ongoing business needs and changing markets will call for continued growth of your data analytics capability. If you chose cloud-based SaaS as your software core, then the next growth and adjustment phase will be much easier than the first, especially if you stick to your iterative development paradigm.

If you have questions about how to get started working in the cloud, let us know. We’re happy to share our knowledge and set you on the right path.

 

 

Automated Testing for SSRS Reports

Motivations for SSRS Report Testing

Both data warehouse developers and end users of data analytics reports have a keen interest in the accuracy and appearance of report content. As SSRS reports are developed, they are typically tested piecemeal during construction, and as a whole on completion, for both of these aspects.

However, it is always possible to overlook certain report items or their representation after report deployment. Furthermore, issues with changes in data sources or the ETL flow may introduce anomalies or errors that affect data values or their presentation at a later date.

SSRS Report Testing Automation Issues

To increase report testing efficiency and accuracy, automated testing is key during both development and maintenance of SSRS reports, especially for organizations utilizing dozens or hundreds of reports on a regular basis. Automation can decrease development time and play an important role in correcting discrepancies in report data post-deployment, which could otherwise negatively impact confidence in the data warehouse.

The complexity of interactions within SSRS itself and with the many other components of SQL Server, however, makes the creation of fully automated testing a tricky business. Furthermore, the high degree of customization possible for SSRS reports means that customized testing approaches are probably required, in addition to standardizing report content and layout wherever feasible.

SSRS Testing Areas for Automation

Unit Testing

There are a large number of report characteristics that can be tested during development via unit tests, and those tests can later migrate into suites for quickly troubleshooting post-deployment bugs. These are a few example coverage areas for such automated tests:

  • If data is not reaching a report, validate the data source, check that the correct mappings to the dataset are being used and that each column’s Visibility property is set correctly.
  • If data is not present and the report uses stored procedures, validate the SP parameters, their data types, conversions and formatting requirements.
  • The failure of data to group as expected can be tested by examining the grouping expressions and verifying that the aggregate function is applied to numeric columns.
  • Create a query test framework that can take as input individual queries or query scripts. Such a framework would contain validated comparison datasets including metadata to run against actual queries/datasets with each report’s specific data source, parameters or configuration.
  • Test report rendering into the formats used by the organization. Initial tests require manual inspection of results, but subsequent tests can be automated using input reports under version control. The most useful of these tests outputs the report in XML format, which is likely to be complete, free of data conversions and most amenable to automated comparison testing.
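The XML-based comparison in the last bullet can be automated with the standard library alone. This is a minimal sketch (the report markup, tag names and helper are invented for illustration) that normalizes attribute order and whitespace before comparing a rendered report against a validated baseline:

```python
import xml.etree.ElementTree as ET

def canonical(xml_text):
    """Reduce a rendered report to comparable (path, text, attributes)
    tuples, ignoring attribute order and insignificant whitespace."""
    root = ET.fromstring(xml_text)
    items = []
    def walk(node, path):
        p = f"{path}/{node.tag}"
        items.append((p, (node.text or "").strip(),
                      tuple(sorted(node.attrib.items()))))
        for child in node:
            walk(child, p)
    walk(root, "")
    return items

# Baseline (validated) vs. freshly rendered output: attribute order and
# whitespace differ, but the data does not, so the comparison passes.
baseline = '<Report><Row total="100" region="West"/></Report>'
rendered = '<Report><Row region="West" total="100" /></Report>'
assert canonical(baseline) == canonical(rendered)

# A data discrepancy is caught immediately.
bad = '<Report><Row region="West" total="99"/></Report>'
print(canonical(baseline) == canonical(bad))  # False
```

A query test framework like the one described above would feed each report's actual XML output through such a normalizer and diff it against the validated comparison dataset.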

A possible starting point for building a permanent RS unit testing framework can be found here: http://rsunit.codeplex.com/

Layout Testing

Automating report layout tests typically presents the greatest difficulties. If your organization uses the SSRS web portal, however, you can take advantage of a number of web UI automation tools that facilitate such testing.

Selenium WebDriver is a free tool for testing web page layout and functionality; it works with Firefox, IE, Chrome and Safari. Automation test scripts are written in Java, C#, Ruby, Python or JavaScript. Scripts use the WebDriver API, which drives a live browser or runs headless.

Other UI-based testing tools are also available such as the open-source Sikuli, freeware AutoIt or TestComplete, which is a proprietary tool by SmartBear.

Conclusion

Test automation has its limits, of course, and this is especially true of SSRS report testing. For instance, usability testing is clearly out of scope for automation. Automation is further complicated in SSRS because data is often manipulated under the covers, beyond the control of the report itself; for example, data may be manipulated outside of queries in embedded VB.NET code or in external class libraries.

Even so, automating report testing wherever feasible always pays off. Choose the low-hanging fruit first before expanding into other areas, such as troubleshooting suites and GUI-based layout testing. As you progress, you are likely to find that developing a mindset aimed at automation frequently instills a virtuous cycle of discovering new opportunities for further test automation.

In the SSRS testing areas that present the most difficult obstacles, employ third-party tools or engage an experienced automation consultant who can demonstrate the automation methods most appropriate for your SSRS development and usage scenarios.

Cloud Data Warehousing with Azure SQL

If you are running an on-premises data analytics stack on Microsoft’s SQL Server but running into maintenance, cost and scaling issues, consider moving your data system to a cloud-based database service such as Azure SQL Database. Especially for your first data analytics stack, Azure SQL Database offers low startup costs with the ability to expand easily as the business grows.

Advantages of Azure SQL Database

There are several benefits to moving on-premises SQL Server infrastructure to Azure:

  • Physical acquisition, provisioning and maintenance of SQL Server deployments are a thing of the past. Furthermore, decreasing or increasing data infrastructure is instantaneous with SQL Database elastic pools.
  • Azure assists existing database migration to Azure SQL Database with wizard-based tools.
  • All stored and transmitted data are encrypted via client-side keys.
  • Microsoft accommodates third-party and open-source technologies such as Python, Java, Node.js and PHP.
  • SQL developers feel right at home using SQLCMD or SQL Server Management Studio for development.

SQL Database Limitations

Although counterparts to the SQL Server components SSIS, SSAS and SSRS are available on Azure, there are still areas where the Azure versions are not completely fleshed out. For instance, only a growing subset of T-SQL features is available so far, including cursors, transactions, triggers, all data types, all operators, and the logical, arithmetic and string functions.

Additionally, many T-SQL statements in SQL Database do not support every option available in SQL Server 2016, such as CREATE/ALTER for databases, logins, tables, users and views. Also missing are collation of system objects, cross-database queries with three- or four-part names, the data collector, database diagrams, Database Mail, some events, and certain database characteristics that were managed manually in SQL Server but are automatic in SQL Database.

For a full list of deficiencies, see Azure SQL Database Transact-SQL differences.

Additional Azure Capabilities

In Azure, SSRS is replaced by a separate service, SQL Reporting, which incurs a separate charge for reports. It is not a general reporting service, since it works only with SQL databases, but it offers a nearly identical development interface to traditional SSRS.

Azure Tables is a storage service for non-relational data, a form preferred for many data analysis processes. It stores up to 100TB of data per Azure Storage account and supplies data in row form. Additional advantages include lower cost than general Azure storage and easy scaling.

Built on top of Hadoop, HDInsight offers unstructured data storage plus a number of tools, such as Sqoop, Pig and Hive for query processing. Your in-house SQL Server, Excel or SQL Database are all able to connect to this service.
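As an illustrative sketch, a Hive query over raw log files stored in HDInsight might look like the following (the table, columns and storage path are hypothetical):

```sql
-- Hive: define a table over raw, tab-delimited log files in storage
CREATE EXTERNAL TABLE weblogs (ts STRING, url STRING, status INT)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/weblogs';

-- Aggregate server errors by page
SELECT url, COUNT(*) AS hits
FROM weblogs
WHERE status = 500
GROUP BY url;
```

The results of such queries can then be pulled into SQL Server, Excel or SQL Database for further analysis.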

Data Factory is Microsoft’s SaaS analogue to SSIS. It visually coordinates other services to transform raw, unstructured data via data flow pipelines into clean, transformed data ready for analysis engines such as HDInsight or Azure Machine Learning for predictive analytics.

In lieu of SQL Reporting, you can use Microsoft’s SaaS Power BI for report, dashboard and visualization creation. You can also use this tool in conjunction with an on-premise SQL Server installation or stored spreadsheets.

Steps to Migrating from SQL Server to SQL Database

SQL Database is, in theory, backward-compatible all the way to SQL Server 2005. Even so, the first step in migration is to test for, and fix, any compatibility issues that may exist with SQL Database V12.

There are several methods to determine compatibility, including the use of SQL Server Data Tools (SSDT), the SqlPackage utility, SQL Server Management Studio’s (SSMS) Export Data-tier Application wizard and the SQL Azure Migration Wizard (SAMW). SSDT, SSMS and SAMW can also be used to fix any migration issues with your database.

The next step is to create an Azure SQL Database logical server and migrate your existing data to it. Although other methods exist, SQL Server transactional replication is the recommended solution, since it minimizes live database downtime. Alternatives are to export/import BACPAC files when connection bandwidth is low or unreliable, or to use the SSMS database deploy wizard for smaller databases.
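A heavily abbreviated sketch of the transactional replication route is shown below; all names are hypothetical, and a real setup also involves configuring a distributor, snapshot agent and logins:

```sql
-- On the on-premise publisher: enable publishing for the source database
EXEC sp_replicationdboption
     @dbname = N'SalesDb', @optname = N'publish', @value = N'true';

-- Create a transactional publication and add a table to it
EXEC sp_addpublication
     @publication = N'AzureMigration', @status = N'active';
EXEC sp_addarticle
     @publication = N'AzureMigration',
     @article = N'Orders', @source_object = N'Orders';

-- Push the publication to the Azure SQL Database logical server
EXEC sp_addsubscription
     @publication = N'AzureMigration',
     @subscriber  = N'myserver.database.windows.net',
     @destination_db = N'SalesDb',
     @subscription_type = N'push';
```

Once the subscriber has caught up, applications can be repointed to the Azure database with minimal downtime.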

Running SQL Server in the Cloud Directly

There is nothing stopping you from beginning or continuing your SQL Server-based data analytics development and deployment without Azure SQL Database. Amazon AWS provides any level of SQL Server instantiation online with the advantages of computational, networking and storage elasticity on a pay-as-you-go basis. With a bit more lifting, you could do the same on Azure or AWS by using their virtual machine services directly for your own SQL Server deployment.

Conclusion

Running data analytics in the public cloud brings all the usual benefits of cloud-based operation, the most important of which are elastic storage for big data crunching systems and high availability to in-house and mobile users across the enterprise.

Whether or not your business should consider a cloud-based data analytics deployment depends on several factors including TCO, data volume, bandwidth requirements, security and the need to scale operations up or down quickly.

A wise approach is to work with an experienced Optimus data analytics consultant to collate all factors and develop a full data architectural solution. Our experts can guide you towards the best solution for your needs.


Power BI and SSRS: How Do They Pair?

SQL Server Reporting Services is a powerful reporting tool in its own right with strong programmability, extensibility and scalability features that have led to its widespread adoption. As a report authoring tool, however, SSRS has lacked the powerful dashboard creation features of its competitors. With the release of SQL Server 2016, it now has the capability to produce stunning, information-packed visualizations when paired with Microsoft’s Power BI suite of business analytics tools.

Pairing Up SSRS and Power BI

The only prerequisites for getting SSRS and Power BI to work together are an installation of SQL Server 2016 CTP 3, a Power BI account of any level, registration of SSRS with Power BI and a running SQL Server Agent. Register SSRS with Power BI via Reporting Services Configuration Manager by entering the SQL Server name and RS instance and connecting. Then, under Power BI Integration, register a work or school Power BI account. The account must have permissions for registering apps with Azure Active Directory.

Basic SSRS/Power BI Interactions

Once registration is complete, you can upload any Power BI file to the Reporting Services web portal. After uploading, you can open a Power BI file, interact with the report, update its model, refresh the data and save it back to RS web portal when operations are complete. All other consumers of the report will see the changes instantly.

Power BI Mobile App

There is also a Power BI mobile application that connects to Reporting Services as long as network access to the RS server is available from your location. Once connected, you can interact with reports and already-defined KPIs in the web portal just as you would from your desktop.

BI Visualization Pinning

Where the partnership between SSRS and Power BI really shines is in the ability to pin any report visualization, such as charts, matrices and maps, to a Power BI dashboard. These pins can carry additional annotations. To take advantage of this new feature, you need to configure RS with a Power BI account that has application add rights in Azure Active Directory.

Once you are signed into RS with Power BI credentials, you can begin pinning. The most common errors during this interaction occur because SQL Agent is not running, since the RS subscription subsystem is utilized for this feature. When the pairing is set up correctly, a Power BI icon appears in the RS toolbar.

You access “pinnable” items via an additional tab, which darkens everything in the display that you cannot pin. Selecting an item brings up a dialog from which you select the group workspace containing the dashboard to which you will add the pin. You also specify an update frequency so the pin is kept current with regular data refreshes.

Note that pins may “expire,” meaning they fail to show up at a later date. This occurs if the authentication token has not been refreshed in that period. Refreshing is accomplished simply by signing in to Power BI at least once every 90 days, at which time all your pins are updated.

The Future of SSRS and Power BI

One of the clear drawbacks to Power BI is that, for now, it is strictly a cloud-based solution. Users are clamoring for the capabilities of Power BI for their on-premise and hybrid systems. Microsoft is responding by rewriting SSRS to extend Power BI-like features to on-premise installations. This also requires that legacy SSRS users be accommodated to provide them with advanced features and an upgrade path.

The new SSRS, which has been making an impact since SQL Server 2016 RC1, supports all modern web browsers using HTML5. A single SSRS mobile app runs on all three major platforms. The RS interface and reports can be custom-branded for a more professional look.

Conclusion

As SSRS has become the de facto standard for BI analytics systems, its weakness in data visualization has come to the fore, a weakness Microsoft is softening by creating ways for SSRS to pair with Power BI. The pairing, for now, is loosely coupled: the ability to interact with Power BI reports, a mobile Power BI app that connects to an RS server and the ability to pin SSRS visuals as tiles inside a Power BI dashboard.

Microsoft is now adding more capabilities within SSRS itself for large customers with on-premise BI operations, including another mobile app and better visuals. It seems logical that at some point Microsoft will unify the UIs, report formats and internal subsystems of SSRS and Power BI in order to reach its eventual goal of one billion Power BI users.


Migrating Crystal Reports to SSRS

Most legacy ERP systems still utilize Crystal Reports for creating detailed, flexible and dynamic business reports. However, multiple acquisitions of the software over the years have resulted in its falling behind trends in Data Analytics systems. Even its latest owner, SAP, is moving to Microsoft’s SQL Server Reporting Services as its go-to solution.

The SSRS add-on to SQL Server has evolved into a web-based report server featuring essential Data Analytics functions such as data-driven subscriptions and a multi-format rendering engine. Additionally, its scalability potential far exceeds that of Crystal Reports. Since many companies already own an SQL Server license, and SSRS comes bundled for free, many are eager to convert legacy Crystal Reports resources to SSRS.

Migration Process Overview

If your organization is fortunate enough to have only a handful of relatively simple reports to convert, you might employ one of several third-party tools that automatically convert CR/RPT files to SSRS/RDL. If you are using SQL Server 2016, you will need to further upgrade the RDL output via SQL Server Data Tools, as Microsoft has recently made changes to the internal report object model.

Most companies, however, will have dozens or hundreds of reports of varying complexity for conversion and many of these will not lend themselves well to automatic conversion.

Step One: Report Analysis

In order to reduce the number of reports to convert and prioritize the remainder, follow these guidelines:

  • Determine if the report is still used directly or indirectly via another report
  • Record how the report is accessed and how often
  • Discover which individuals or group users are using the report
  • Prioritize report items according to their business decision value

The results of this analysis will reveal the shortest path to converting the highest value reports and report data. This step often suggests ways in which reports can be consolidated or standardized by sharing common information across reports. This presents opportunities to share SQL stored procedures across reports as well.
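For instance, common data shared across several reports can be centralized in one stored procedure that each new SSRS report calls as its dataset; the procedure and schema below are hypothetical:

```sql
-- Shared dataset procedure reused by multiple SSRS reports
CREATE PROCEDURE dbo.usp_MonthlySalesSummary
    @Year  INT,
    @Month INT
AS
BEGIN
    SET NOCOUNT ON;
    SELECT Region, SUM(Amount) AS TotalSales
    FROM   dbo.Sales
    WHERE  YEAR(SaleDate) = @Year
      AND  MONTH(SaleDate) = @Month
    GROUP BY Region;
END;
```

Centralizing logic this way means a business rule changes in one place rather than in dozens of converted reports.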

In preparation for migration, also document various technical aspects such as the specific data sources being used, which may be databases offline and online or software products such as OLAP Intelligence or MicroStrategy Intelligence Server.

Step Two: Migration Implementation

You should now be prepared to recreate report characteristics as new SSRS reports. Proceed in this order:

  1. Create new SQL queries and then fetch and verify the data sets for accuracy
  2. Set up the SSRS report layouts, noting that Crystal Reports uses a banded report design whereas SSRS uses a Table object
  3. Incorporate row-level data while building grouping relationships and subtotals
  4. Devise formulas and procedures to derive condensed results from primary data
  5. Create visual reporting artifacts such as charts, graphs and matrices
  6. Finally, set up any required sub-reports and linked reports
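Steps 1 through 4 often reduce to queries like the following, where GROUP BY with ROLLUP produces the group subtotals and grand total that Crystal Reports generated in its banded sections (the schema is hypothetical):

```sql
-- Row-level data rolled up into per-region subtotals and a grand total,
-- replacing Crystal Reports group footers with a single query
SELECT Region, Product, SUM(Amount) AS Sales
FROM   dbo.Sales
GROUP BY ROLLUP (Region, Product);
```

The rows where Product (or both columns) come back NULL are the subtotal and grand-total rows, which the SSRS Table object can then render as group footers.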

Unfortunately, completing these steps accurately may require serious sleuthing skills if the original reports are not well documented, which is often the case.

Difficulties of Automation

Although some conversion steps above may appear amenable to automation, such tools often run into serious obstacles even with reports of relatively low complexity. These problems often derive from the distinct software architectures used by each product. Furthermore, no Crystal Reports conversion tool performs the initial analysis step that’s so vital to ensuring your upgraded reporting system is cost-effective and efficient.

There are numerous examples of automated conversion issues that are difficult to debug and correct:

  • The use of shared data sources often generates exceptions
  • Differences in parameter names may interfere with data source connections
  • Parameter renaming often breaks dependencies in the same or other reports
  • Required parameters in SSRS are sometimes missing in Crystal Reports
  • Layout fields and blocks often intersect, and overlapped objects render poorly
  • The Keep Together feature in Crystal Reports frequently translates incorrectly
  • Expressions and VB code in Crystal Reports often break or produce incorrect results in SSRS
  • Multicolumn reports and text boxes frequently require manual adjustment in SSRS

Even in cases where auto-conversion may appear to work at a 95% level, it is highly recommended that the conversion be overseen by a trained specialist who can uncover hidden flaws and correct them before they reach end users who depend on a clean, fully functional reporting system.

Employ a Phased Approach

If your Crystal Reports to SSRS conversion appears even moderately complex, take a phased approach to the migration, including upfront detailed analysis to uncover unforeseen, time-consuming obstacles.

After initial testing, deliver reports to a select group of knowledgeable users for beta testing before widespread deployment. Reduce inevitable end-user issues by thoroughly documenting changes in appearance or usage.

Conclusion

Migrating from Crystal Reports to SSRS provides an ideal opportunity to evaluate and streamline your entire reporting solution to ensure BI decision makers are getting the information they require in the most effective manner. SSRS will continue to offer more advanced features and grow with your business needs due to its superior scalability.

You will likely find that automated migration presents as many problems as it solves and that using an experienced conversion partner to implement a detailed manual conversion process pays the highest dividends.