Posts

Integrate Data Silos with Azure Synapse Analytics

The Roadblock for Digital Transformation


Source: Harvard Business Review

Digital transformation is widely recognized as the key to success, and even survival, for organizations, all the more so amid the current global COVID-19 crisis. In a Harvard Business Review survey of global executives, 64% said they have less than four years to complete their digital transformation or they will go out of business, and 91% said that effective data and analytics strategies are essential to digital transformation. This data-driven culture sparks innovation and drives efficiencies, both crucial for survival.

But 80% of the respondents also say that their organizations are struggling to become mature users of data and analytics, even though 79% of employees use data and analytics at least once a week. What gets in the way of organizations effectively using data and analytics for business decisions?

More than half (55%) of the executives say the key roadblock stems from data silos and the difficulty of managing data coming from multiple systems. Digital transformation captures a great deal of potentially valuable data across various systems, yet less than 20% of that data can ever be analyzed because of the silos. This is mainly due to the disconnect between Big Data analytics, enterprise Data Warehousing, and Artificial Intelligence/Machine Learning.

Simplifying Analytics

The need of the hour is to simplify analytics in a manner that breaks down these silos and makes the most of the data available for analysis without having to jump through hoops. In an ideal world, streaming operational data should be available for immediate analysis to generate reports and run models on the data. This is not a trivial problem.

Operational data is a mix of structured and unstructured data, generally stored in a Data Lake, which is not directly suitable for analytics. The operational data therefore needs to be imported into a Data Warehouse, where the reporting and analytics services can run.
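As a toy sketch of that lake-to-warehouse step, the snippet below parses semi-structured records as they might sit in a lake and loads the flattened rows into a relational table. The field names are invented, and SQLite stands in for the warehouse; a real Synapse pipeline would use its own ingestion tooling.

```python
# Minimal ELT sketch: parse newline-delimited JSON "lake" records, then load
# the flattened rows into a relational "warehouse" table for analytics.
# All names here (sales, region, amount) are invented for illustration.
import json
import sqlite3

def extract(raw_lines):
    """Parse semi-structured events as they sit in the lake."""
    return [json.loads(line) for line in raw_lines]

def load(events, conn):
    """Flatten events into a warehouse-friendly table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales (region, amount) VALUES (?, ?)",
        [(e["region"], e["amount"]) for e in events],
    )

raw = ['{"region": "west", "amount": 120.0}', '{"region": "east", "amount": 80.0}']
conn = sqlite3.connect(":memory:")
load(extract(raw), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 200.0
```

Every hop like this adds delay between the operational store and the analytics store, which is exactly the lag issue listed below.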

This creates three key issues. 

  1. Lag between the operational and analytics data stores due to the ELT pipeline. 
  2. Balancing the operational, ELT, reporting, and analytics workloads in the cloud. 
  3. Efficient and effective model management.

Organizations would really benefit from a framework which effectively addresses these issues and removes the roadblocks to data maturity. Azure Synapse Analytics is a step in the right direction with a big promise – Limitless Analytics Services in the cloud.

Azure Synapse Analytics to the Rescue


Source: Microsoft

Microsoft has launched Azure Synapse Analytics to fulfill the promise of limitless analytics services. The service creates a single place where Data Engineers, Database Administrators, Data Scientists, Business Intelligence Analysts, and Business Users can collaborate, with everyone accessing the same data.

The service offers a distributed query processing engine, a versatile compute form factor (provisioned clusters or serverless), and a single experience for users to manage the end-to-end process. This provides much-needed flexibility in scaling and a great user experience, which promotes collaboration.

Many features of Azure Synapse Analytics are now generally available with many more in the pipeline. We believe that this service will evolve rapidly into the standard for analytics at scale for organizations.

Benefits of Azure Synapse Analytics

Azure Synapse Analytics allows teams to seamlessly work together. However, the benefits go beyond this. Some additional benefits are:

1. Unified Experience

 


Source: Microsoft

Azure Synapse Analytics allows users to ingest, prepare, manage, serve, visualize, and analyze data through a unified experience. Users can bring their analytics to where the data is located rather than switching to a different interface, which gives a big boost to productivity.


2. Limitless Scale

Azure Synapse Analytics enables limitless scaling for data and analytics in the cloud. Data professionals can derive insights at speed from all the data across data warehouses and big data analytics systems. They can query both relational and non-relational data at petabyte scale using the T-SQL language. Furthermore, they can benefit from a versatile compute form factor spanning provisioned clusters and serverless pools. Finally, they can run analytics alongside mission-critical workloads with intelligent workload management, workload isolation, and limitless concurrency.
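To make the T-SQL point concrete, here is a sketch of the kind of OPENROWSET query a serverless SQL pool accepts for reading Parquet files straight out of the lake. The storage account URL and folder layout below are placeholders, not a real environment.

```python
# Assemble a hypothetical serverless-pool query over lake files.
# The account URL and partition layout are invented for illustration.
def lake_query(container_url: str, file_pattern: str) -> str:
    return (
        "SELECT TOP 10 *\n"
        "FROM OPENROWSET(\n"
        f"    BULK '{container_url}/{file_pattern}',\n"
        "    FORMAT = 'PARQUET'\n"
        ") AS rows;"
    )

sql = lake_query(
    "https://myaccount.dfs.core.windows.net/sales", "year=2020/*.parquet"
)
print(sql)
```

The resulting statement would be submitted to the serverless endpoint through any T-SQL client; no cluster has to be provisioned first, which is the point of the serverless form factor.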


3. Integrate Business Intelligence and Machine Learning

Azure Synapse Analytics allows users to integrate Power BI and Azure Machine Learning within Azure Synapse Studio. BI professionals and Data Scientists can then tap into the available data immediately to generate insights faster.


4. Cloud-Native HTAP Implementation

Azure Synapse Link (in preview) brings cloud-native hybrid transactional and analytical processing (HTAP) to Azure Cosmos DB, with plans to expand to other data stores in the future. It creates a tight, seamless integration between Azure Cosmos DB and Azure Synapse Analytics, enabling users to run near real-time analytics over operational data stored in Azure Cosmos DB.
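A minimal sketch of what querying operational data through Synapse Link can look like from a serverless pool, using the 'CosmosDB' OPENROWSET provider. The account, database, container, and key below are placeholders, and the exact connection options may differ in your environment.

```python
# Assemble a hypothetical OPENROWSET query against the Cosmos DB
# analytical store. All identifiers here are invented for illustration.
def cosmos_link_query(account: str, database: str, container: str) -> str:
    conn = f"Account={account};Database={database};Key=<account-key>"
    return (
        "SELECT COUNT(*) AS order_count\n"
        f"FROM OPENROWSET('CosmosDB', '{conn}', {container}) AS orders;"
    )

sql = cosmos_link_query("retailaccount", "store", "Orders")
print(sql)
```

Because the query reads the analytical store rather than the transactional store, it does not consume the request units serving the operational workload, which is what makes the near real-time analytics practical.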


Want to learn more? Click here for a quick and informative video that demonstrates the power of Azure Synapse Link.

5. Price-Performance

Price-performance is also a critical part of data solutions. According to Microsoft, Azure Synapse Analytics offers better price-performance than Google BigQuery and Amazon Redshift, based on field tests conducted by GigaOm.

Chart: GigaOm price-performance comparison of Azure Synapse Analytics, Google BigQuery, and Amazon Redshift

Source: GigaOm Report

The TPC-H and TPC-DS results published by Microsoft show a significantly lower price for Azure Synapse Analytics compared to the others, as shown in the preceding and following graphics.

Chart: TPC-H and TPC-DS price comparison published by Microsoft

Source: Microsoft

Speed

As demonstrated in this video from Ignite 2019, Azure Synapse Analytics can be blazing fast in a petabyte-scale environment combining relational and non-relational data. This can be a game changer for organizations where faster decision making can lead to substantial profit increase.


Getting Started


We have multiple offers that make it easy for organizations to get started with Azure Synapse Analytics no matter what stage they are at in the process. 

  1. Just getting started? We offer a free two-hour lunch-and-learn workshop to help you understand this service. 
  2. Do you already know about the service but need help figuring out your next step? We can conduct an assessment, strategy, and roadmap workshop that will provide your organization with a plan for moving forward. 
  3. Do you have a roadmap but need help with implementation? We can get you started with a first pilot, which can be completed in 2-4 weeks. Once you have experienced the value from the pilot, we can help you with the implementation as per the roadmap.

 

Contact us at info@optimusinfo.com to get started.

Digital Collaboration: Superhighway to Innovation and Economic Transformation

Federal and provincial governments in Canada, along with the private sector, are undertaking a major initiative to establish the nation as a global centre for digital transformation and innovation, generating 50,000 jobs and boosting GDP by some $15 billion over the next 10 years. A total of $1.4 billion is being committed in a collaboration of large and small companies working together to generate solutions to some of the world’s most pressing health, productivity, and sustainability issues.

Given the media attention being paid to technology and the Internet of Things today, it would be easy to assume that digital transformation throughout the Canadian economy – and elsewhere in the world – was happening quickly. But that’s not always the case.

Much of this digital transformation, of course, is taking place in the private sector, so one reason that the pace of change may be slower than we would assume is because the money to make it happen is coming from the private sector, itself. In other words, if and when corporate funds are available – and they’re not siphoned off to other projects – then investment in digital technologies can occur.

Another fundamental roadblock to speedier adoption of digital technology is a lack of leadership within a company[i]; there is often no single individual clearly in charge of leading such a project. Frequently, this is because corporate management lacks a firm knowledge and grasp of what is at stake.

Maintaining Software Integrity in Mission-Critical Applications

Optimus has developed rigorous standards for software testing along with a proprietary tool – the Azure Test Harness – which ensures integrity and faster time-to-market for companies rolling out applications.

For one client, a Finnish firm that provides detailed, real-time data on severe weather, like lightning strikes, to the airline industry, the ability to roll out error-free software on a regular basis is critical.

The company frequently updated its software, which meant manual testing to ensure the application remained error-free – a very time-consuming operation, especially with airlines needing the latest data instantly.

Optimus deployed an automated test framework on the Azure Test Harness which now allows the company to reduce its test cycles significantly while expanding its test coverage and maintaining the high integrity of its web-based app.

Azure Test Harness is one example of the innovative thinking Optimus is bringing to market.

It boils down to one simple fact: a failure to realize the powerful business advantages that digital technology brings to the table.

Digital Transformation Won’t Wait

International Data Corporation (IDC), a global research firm that provides marketing intelligence to technology markets, says that in 2018, only 30% of manufacturers investing in digital transformation will be able to maximize their advantages; the rest are held back by outdated business models and technology[ii].

There is one point on which even the late adopters can agree: digital transformation won’t wait.

In its 2017 budget, the federal government of Canada put forward a bold plan to supercharge innovation in the Canadian economy. The government earmarked nearly $1 billion, to be supplemented by money from the private sector, for a combined total of $1.4 billion[iii]. The money is being utilized across five “superclusters”, one of which is the digital technology supercluster.

At the Forefront of Innovation and Made in Canada

This cluster of companies, large and small – even start-ups – hails from healthcare, mining, financial services, telecom, and other sectors of the economy. More than 270 companies are part of the supercluster, and they’ll collaborate on more than 100 projects, sharing talent and ideas. The goal is to generate commercial opportunities and economic prosperity that will keep Canada at the forefront of global innovation.

Optimus Information will be integrally involved, through our longstanding partnership with Microsoft, in assisting the independent software vendors that are part of the digital technology supercluster, as well as the large enterprise organizations that are participating. Many of the leading companies in these areas are already our clients. Optimus will offer expertise in technologies like blockchain, one of our growing areas of strength, as well as in application development, another Optimus area of expertise.

What is exciting about this digital supercluster project is that the participants are operating at the edge of innovation, exploring and developing what’s never been done before. Optimus intends to leverage powerful AI and data analytics capabilities with which we work on a daily basis.

Optimus has breadth and depth of talent and experience in full-stack and mobile development, software testing and business intelligence, and data analytics and machine learning, for industries from aerospace to transportation.

Contact us today to learn how we can bring value to your company and help you stay on the leading edge of innovation, as well.

Think Big: How Design Plus Data Will Change Your Business

Is design thinking catching your attention? It should. Data insights not available before now can transform your business models and allow you to lead in your industry when you incorporate elements such as predictive analytics, mobile dashboards, and machine learning. This wave of change is forcing data architects to re-think and re-design how programs and applications must be built. To truly innovate, design teams need to push the design-thinking envelope on almost every project.

“You can have data without information, but you cannot have information without data.”
– Daniel Keys Moran, computer programmer and science fiction writer.

Since the invention of the first computer, the world has been on a digital light-speed journey – one that has seen massive change in how we interact with our world and with each other. Today, there are more than 2.5 billion[i] smartphones carried in people’s pockets – each more powerful than the computers used to run the spacecraft that landed the first men on the Moon.[ii] In particular, how we interact with and gain insight from data has gone through an incredible transformation: we have evolved from relying on simple historical reporting to real-time, predictive analytics.

The Way It Was

Reporting has always been a critical element for a business to thrive and we have been accustomed to seeing our reports – our data – in fairly standard and historic terms. Let’s take a straightforward quarterly sales report at a consumer retail company, for example. Simple data, like units sold, prices received, cost of goods, volume of shipments and so forth, would be gathered and stored over a three-month period and then used to generate a few charts and graphs. Conclusions would be drawn from this static data and the company would shift strategy based on the conclusions.

Perhaps the conclusions were accurate and maybe they weren’t. Regardless, that’s how it’s been done for a long time: based on the data available.

The Way It Is

Today, the capability exists to break data down in far greater detail and in real time and, through disciplines like machine learning and artificial intelligence, to draw highly focused and accurate conclusions, not at the end of a business quarter but at the end of each day and, in many cases, as events happen.

IoT Changes Shipping Industry – Reduces Risk and Cost

A client that operates a fleet of tankers equipped with IoT sensors wanted to move beyond its basic data reports and drill deeper into the technical data gathered aboard its vessels. Optimus utilized elements from Microsoft’s IoT Suite, including Azure Data Factory, to create visually appealing reports and dashboards that contained information gathered from thousands of sensors throughout the fleet.

The results meant a far more in-depth data analysis than the company had been getting, delivering more accurate insight for more accurate business decisions. When it comes to tankers, a simple mistake can cost millions in terms of lost time, environmental disasters, financial penalties, missed deadlines and more.

Optimus solved the client’s existing problem while building a platform for continuous improvement with data analysis using Microsoft Azure tools. Because the data can be aggregated in the cloud, the client can analyze greater amounts of data over an extended period of time, thus further enhancing their shipboard operational analysis and implementing global cost saving efforts as a result.

Now, a business can make highly informed decisions immediately and adjust accordingly. Of course, it’s not simply analyzing a few traditional data points, like sales; it’s analyzing where those sales took place, in which store locations, even in which aisles or departments, at what time of day, from which shelf the customer chose a purchase, and what the customer’s likely income level is. In other words, the more highly specialized the data, the more highly specialized and precise the conclusions that can be drawn.
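As a toy illustration of that kind of drill-down, the snippet below aggregates the same sales records along two dimensions at once (store and hour of day); the records themselves are invented.

```python
# Group sales by more than one dimension at once using only the
# standard library; real drill-downs add many more dimensions.
from collections import defaultdict

sales = [
    {"store": "downtown", "hour": 9,  "amount": 40.0},
    {"store": "downtown", "hour": 9,  "amount": 25.0},
    {"store": "airport",  "hour": 17, "amount": 60.0},
]

totals = defaultdict(float)
for s in sales:
    totals[(s["store"], s["hour"])] += s["amount"]

print(totals[("downtown", 9)])  # 65.0
```

Adding a dimension is just widening the grouping key, which is why the specificity of the conclusions scales with the specificity of the data captured.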

Because it’s possible to generate highly detailed data and analyze it from so many different perspectives, every sector of the economy is making use of data analysis.

In the manufacturing sector, factory operations are being revolutionized[iii] by both big data and analytics. Sensors generate endless streams of data on the health of production line equipment, data that’s being examined by the minute for the slightest indication of a potential problem or defect. Conclusions are drawn and actions implemented immediately to avoid any breakdown and disruption in the production process. There’s a positive ripple effect to this: customers don’t experience delays and the company doesn’t experience a loss of revenue.
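One simple form of that minute-by-minute equipment check is a rolling-window rule: flag any reading that sits far outside the recent window. The window size, threshold, and readings below are illustrative only, not a production detector.

```python
# Rolling-window anomaly sketch for a sensor stream: flag a reading
# that lies more than k standard deviations from the mean of the
# previous `window` readings. Parameters are illustrative.
import statistics
from collections import deque

def anomalies(stream, window=4, k=3.0):
    recent = deque(maxlen=window)
    flagged = []
    for i, x in enumerate(stream):
        if len(recent) == window:
            mu = statistics.mean(recent)
            sigma = statistics.pstdev(recent)
            if sigma > 0 and abs(x - mu) > k * sigma:
                flagged.append(i)
        recent.append(x)
    return flagged

temps = [70.1, 70.3, 69.9, 70.2, 95.0, 70.0]
print(anomalies(temps))  # [4] – the 95.0 spike
```

A production system would add smoothing and per-machine baselines, but the ripple effect is the same: the spike is caught before it becomes a breakdown.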

The virtually unlimited storage capacity in the cloud, coupled with highly sophisticated computer algorithms that can perform serious analysis in, literally, seconds, is placing tremendous demands on data architects. Programs and applications must be agile enough to allow for updates, added features, and improvements without delay. This has meant developing new architecture that can not only run a program at lightning speed but can also be altered or updated in the areas where it needs improvement, much like making incremental improvements to a car model without re-designing the whole car every time.

Gone are the days of a monolithic software structure, where data warehouses needed a year or more to be designed and several more months for data to be loaded. If missing data was discovered, it would mean rebuilding the entire program.

Microservices and Teams

Today, Optimus Information designs architecture so that updates, changes or improvements can be made to one area of a program or application without having to open up the whole program. By using microservices in our software development, Optimus has created functional teams whose responsibility is to just one area of a program. A team focuses only on its specific area and generates improvements without impacting other teams or resulting in an overhaul of an entire software product. Tremendous amounts of time are saved for our clients and the cost of updates or re-designs is driven down dramatically.

Optimus applies the same method to data gathering. By means of advanced tooling, our clients can store raw data, without pre-aggregating it, run a query on that raw data and have the answers they need in a matter of seconds. Previously, it would take weeks to get a result because the data would have to be assessed and compartmentalized as it was gathered and placed into structured environments before a query could be run. This is what we call modern data warehousing. The focus is on agility and speed.
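The idea of storing raw rows and deciding the aggregation only at query time can be sketched with SQLite standing in for a cloud-scale engine; the schema and events below are invented.

```python
# "Modern data warehousing" sketch: raw, un-aggregated rows are stored
# as-is, and an ad-hoc query aggregates them at read time.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (device TEXT, metric TEXT, value REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("pump-1", "temp", 61.0), ("pump-1", "temp", 63.0), ("pump-2", "temp", 58.0)],
)

# The aggregation is decided at query time, not when the data was captured.
rows = conn.execute(
    "SELECT device, AVG(value) FROM events WHERE metric = 'temp' "
    "GROUP BY device ORDER BY device"
).fetchall()
print(rows)  # [('pump-1', 62.0), ('pump-2', 58.0)]
```

Because nothing was pre-aggregated, tomorrow's question can use a completely different grouping over the same raw rows, which is where the agility and speed come from.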

Down the Road from Microsoft by Design

Optimus specializes in working with the IT departments of companies that don’t or can’t spend the time and money to develop the cloud-based software architecture needed today. Optimus uses a suite of leading-edge services on the Microsoft Azure platform that allow us to select exactly the right components to solve a client’s problem. We are physically located close to Microsoft’s Vancouver and Redmond development centres.

Optimus is a Microsoft Gold Partner and, in that role, we work very closely with Microsoft on new product previews and trials that are in development, giving feedback that improves our customers’ end product. Optimus employees have often already kicked the tires on new Azure features before they are released. This keeps us at the forefront of rapidly changing technology and lets us give feedback as enhancements are designed.

If you want to enhance and sharpen the results of your data analysis, we invite you to contact us. We are happy to explore some “what-if” scenarios with you to help propel your data insights – and your business – forward exponentially. Reach out and schedule a virtual coffee anytime.

Game Changers: The Role of Big Data in the Future of Credit Unions

In 2002, Billy Beane was the general manager of the Oakland Athletics in Major League Baseball. Oakland was a small-market club with a budget to match, and it struggled to be competitive.

Because Oakland didn’t have the money of big market teams like the New York Yankees or Los Angeles Dodgers, Beane knew he couldn’t hope to attract the high-priced talent – the superstars – to play in Oakland.

Enter Paul DePodesta, aged 27, an economics graduate from Harvard with an analytical mind and a love of baseball. His arrival on the doorstep of the Oakland A’s helped give birth to data analysis in professional sports.

He analyzed player stats, using computer algorithms, and his results allowed Oakland to sign inexpensive players that other teams dismissed. The A’s were propelled into the stratosphere of success, thanks to big data.

The A’s finished the 2002 season with 103 wins, the same number as the New York Yankees – but with a budget about a tenth the size.

This is the “secret sauce” in data analytics: the ability to take substantial amounts of information – in the case of Oakland, endless baseball player statistics – look for patterns and capitalize on what is found.

Credit Unions, Machine Learning and Data Analytics

Credit unions in Canada are rapidly embarking on the same exploration. Using machine learning and data analytics, these financial firms are finding ways to improve service to their clients while, at the same time, discovering, in the vast amounts of data they collect, nuggets of information that can be turned into business opportunities.

Virtually every customer transaction within a credit union is electronic, and the amounts of data being collected are staggering. The need to analyze this information is what drives credit unions today to embrace machine learning and data analytics.

Matthew Maguire is the Chief Data Officer at Co-Op Financial Services, a California-based company that operates an interlinked system of ATMs throughout the U.S. and Canada. He argues that machine learning and data analysis are critical for mid-sized credit unions as they work to reinforce current customer relationships and build new ones.

“Data is coming in from different places and the challenge is… how do you make it all connect?[i]” he said.

Credit unions are moving quickly into data analysis. Through machine learning, which unearths customer transaction patterns by using algorithms, credit unions are learning a great deal about their customers and are designing strategies to capitalize on that in order to drive sales.

But, for credit unions, data enables other capabilities. Patterns of fraud can be easier to spot and shut down through data analysis.

When a client invests with a credit union, regulations require the client to complete what’s called a Know Your Client form, which essentially draws a profile of risk tolerance and investment objectives. If the client’s portfolio strays from that profile and becomes riskier, big data can alert the financial institution and the problem can be corrected before any monetary loss accrues to the client – or to hundreds of thousands of clients.
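A toy version of that drift check might compare a portfolio's weighted risk score against the band implied by the client's declared profile. The risk bands, asset scores, and weights below are invented for illustration, not a regulatory model.

```python
# Sketch of a Know Your Client drift alert: compare the portfolio's
# weighted risk score against the declared profile's band.
# Bands and per-asset risk scores (0 = safest, 1 = riskiest) are invented.
RISK_BANDS = {"conservative": (0.0, 0.3), "balanced": (0.3, 0.6), "growth": (0.6, 1.0)}

def portfolio_risk(holdings):
    """Weighted average of per-asset risk scores."""
    total = sum(h["weight"] for h in holdings)
    return sum(h["weight"] * h["risk"] for h in holdings) / total

def drift_alert(profile, holdings):
    """True when the portfolio has strayed outside the profile's band."""
    low, high = RISK_BANDS[profile]
    score = portfolio_risk(holdings)
    return score < low or score > high

holdings = [
    {"weight": 0.7, "risk": 0.8},  # equities
    {"weight": 0.3, "risk": 0.2},  # bonds
]
print(drift_alert("conservative", holdings))  # True: too risky for the profile
```

Run across an entire book of business, the same comparison is what lets an institution correct hundreds of thousands of portfolios before losses accrue.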

Chris Catliff is the president and CEO of Blueshore Financial, a B.C.-based credit union with more than $3 billion in assets. His vision of the future of credit unions is predicated on the power of data analytics in combination with machine learning.

He envisions the day, very soon, when a client approaching a branch receives a text message saying they are already checked in. As they walk through the door, their customer profile and picture pop up on a screen[ii] at a concierge desk and they’re greeted by name.

Blueshore’s ATMs will respond to a customer’s biometrics and offer a transaction based on a pattern of previous transactions. Up-sell opportunities will present themselves, so staff can suggest options – situations that might never occur without data analysis.

Service, he said, “has to be electronic transactions with the introduction of superior, human touch at various critical points. It’s high tech and high touch.”

Explore Your Data Potential

Like the members they serve, every credit union is unique. It is imperative for a credit union to work with data specialists who can marry the individual needs of each credit union with high levels of expertise across big data, data analysis and machine learning.

One of our strengths here at Optimus is our track-record in the areas of data gathering, analysis, machine learning, dashboarding and data visualization, through which we help our clients tailor data mining and analysis to their business goals.

At the end of the day, it’s all about staying competitive and, like the Oakland Athletics, reaching the pinnacle of success by embracing and employing new strategies to achieve that success.

 

[i] https://www.pymnts.com/big-data/2018/credit-unions-big-data-authentication-aml-kyc/
[ii] http://enterprise-magazine.com/features/betting-big-on-big-data/