Posts

Digital Collaboration: Superhighway to Innovation and Economic Transformation

Canada's federal and provincial governments, along with the private sector, are undertaking a major initiative to establish the nation as a global centre for digital transformation and innovation, with the goal of generating 50,000 jobs and boosting GDP by some 15 billion dollars over the next 10 years. A total of 1.4 billion dollars is being committed to a collaboration of large and small companies working together to generate solutions to some of the world's most pressing health, productivity and sustainability issues.

Given the media attention being paid to technology and the Internet of Things today, it would be easy to assume that digital transformation throughout the Canadian economy – and elsewhere in the world – was happening quickly. But that’s not always the case.

Much of this digital transformation, of course, is taking place in the private sector, so one reason the pace of change may be slower than we would assume is that the money to make it happen comes from the private sector itself. In other words, investment in digital technologies happens only if and when corporate funds are available – and not siphoned off to other projects.

Another fundamental roadblock to speedier adoption of digital technology is a lack of leadership within a company[i]; there is often no single individual clearly in charge of leading such a project. Frequently, this is because corporate management lacks a firm grasp of what is at stake.

Maintaining Software Integrity in Mission-Critical Applications

Optimus has developed rigorous standards for software testing along with a proprietary tool – the Azure Test Harness – which ensures integrity and faster time-to-market for companies rolling out applications.

For one client, a Finnish firm that provides detailed, real-time data on severe weather, such as lightning strikes, to the airline industry, the ability to roll out error-free software on a regular basis is critical.

The company updated its software frequently, which meant manual testing after every change to ensure the application remained error-free – a very time-consuming operation, especially with airlines needing the latest data instantly.

Optimus deployed an automated test framework on the Azure Test Harness which now allows the company to reduce its test cycles significantly while expanding its test coverage and maintaining the high integrity of its web-based app.
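
The Azure Test Harness itself is proprietary, but the underlying pattern – codify the old manual checks as automated tests that run on every release – can be sketched in a few lines. The record format and validation rules below are hypothetical placeholders, not the client's actual schema; a minimal pytest sketch might look like this:

```python
# A minimal, hypothetical sketch of an automated regression test in pytest.
# The record format and validation rules are illustrative only.
import pytest

def validate_strike_record(record: dict) -> bool:
    """Return True if a lightning-strike record has plausible values."""
    required = {"timestamp", "latitude", "longitude", "peak_current_ka"}
    if not required.issubset(record):
        return False
    return (-90 <= record["latitude"] <= 90
            and -180 <= record["longitude"] <= 180
            and abs(record["peak_current_ka"]) < 1000)

@pytest.mark.parametrize("record,expected", [
    ({"timestamp": "2018-06-01T12:00:00Z", "latitude": 49.3,
      "longitude": -123.1, "peak_current_ka": -23.5}, True),   # typical strike
    ({"timestamp": "2018-06-01T12:00:00Z", "latitude": 123.0,
      "longitude": -123.1, "peak_current_ka": -23.5}, False),  # bad latitude
    ({"latitude": 49.3, "longitude": -123.1}, False),          # missing fields
])
def test_validate_strike_record(record, expected):
    # Each release runs the full suite automatically instead of manual checks.
    assert validate_strike_record(record) is expected
```

Once checks like these are automated, expanding coverage becomes a matter of adding cases to the suite rather than adding hours of manual testing to each release.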

Azure Test Harness is one example of the innovative thinking Optimus is bringing to market.

That knowledge gap boils down to one simple fact: a failure to realize the powerful business advantages that digital technology brings to the table.

Digital Transformation Won’t Wait

International Data Corporation (IDC), a global research firm that provides market intelligence on technology markets, says that in 2018 only 30% of manufacturers investing in digital transformation will be able to maximize their advantages; the rest are held back by outdated business models and technology[ii].

There is one point on which even the late adopters can agree: digital transformation won't wait.

In its 2017 budget, the federal government of Canada put forward a bold plan to supercharge innovation in the Canadian economy. The government earmarked nearly one billion dollars to be supported by money from the private sector for a combined total of 1.4 billion dollars[iii]. The money is being utilized across five “superclusters”, one of which is the digital technology supercluster.

At the Forefront of Innovation and Made in Canada

This cluster of companies, large and small – even start-ups – hails from healthcare, mining, financial services, telecom and other sectors of the economy. More than 270 companies are part of the supercluster, and they'll collaborate on more than 100 projects, sharing talent and ideas. The goal is to generate commercial opportunities and economic prosperity that will keep Canada at the forefront of global innovation.

Optimus Information will be integrally involved, through our longstanding partnership with Microsoft, in assisting both the independent software vendors and the large enterprise organizations participating in the digital technology supercluster. Many of the leading companies in these areas are already our clients. Optimus will offer expertise in technologies like blockchain, one of our growing areas of strength, as well as in application development, another area of Optimus expertise.

What is exciting about this digital supercluster project is that the participants are operating at the edge of innovation, exploring and developing what’s never been done before. Optimus intends to leverage powerful AI and data analytics capabilities with which we work on a daily basis.

Optimus has breadth and depth of talent and experience in full stack and mobile development, software testing and business intelligence, data analytics and machine learning for industries from aerospace to transportation.

Contact us today to learn how we can bring value to your company and help you stay on the leading edge of innovation, as well.

Think Big: How Design Plus Data Will Change Your Business

Is design thinking catching your attention? It should. Data insights that weren't available before can transform your business model and allow you to lead your industry when you incorporate elements such as predictive analytics, mobile dashboards and machine learning. This wave of change is forcing data architects to re-think and re-design how programs and applications must be built. To truly innovate, design teams need to push the design thinking envelope on almost every project.

“You can have data without information, but you cannot have information without data.”
– Daniel Keys Moran, computer programmer and science fiction writer.

Since the invention of the first computer, the world has been on a digital journey at light speed – one that has seen massive change in how we interact with our world and with each other. Today, there are more than 2.5 billion[i] smartphones in people's pockets – each more powerful than the computers that ran the spacecraft that landed the first men on the Moon.[ii] In particular, how we interact with and gain insight from data has gone through an incredible transformation: we have evolved from the days of simple historical reporting to drawing insight from data in real time.

The Way It Was

Reporting has always been a critical element for a business to thrive and we have been accustomed to seeing our reports – our data – in fairly standard and historic terms. Let’s take a straightforward quarterly sales report at a consumer retail company, for example. Simple data, like units sold, prices received, cost of goods, volume of shipments and so forth, would be gathered and stored over a three-month period and then used to generate a few charts and graphs. Conclusions would be drawn from this static data and the company would shift strategy based on the conclusions.

Perhaps the conclusions were accurate and maybe they weren’t. Regardless, that’s how it’s been done for a long time: based on the data available.

The Way It Is

Today, the capability exists to break data down in far greater detail, to do it in real time and, through disciplines like machine learning and artificial intelligence, to draw highly focused and accurate conclusions – not at the end of a business quarter but at the end of each day and, in many cases, as events happen.

IoT Changes Shipping Industry – Reduces Risk and Cost

A client that operates a fleet of tankers equipped with IoT sensors wanted to move beyond its basic data reports and drill deeper into the technical data gathered aboard its vessels. Optimus utilized elements from Microsoft’s IoT Suite, including Azure Data Factory, to create visually appealing reports and dashboards that contained information gathered from thousands of sensors throughout the fleet.

The results meant a far more in-depth data analysis than the company had been getting, delivering more accurate insight for more accurate business decisions. When it comes to tankers, a simple mistake can cost millions in terms of lost time, environmental disasters, financial penalties, missed deadlines and more.

Optimus solved the client’s existing problem while building a platform for continuous improvement with data analysis using Microsoft Azure tools. Because the data can be aggregated in the cloud, the client can analyze greater amounts of data over an extended period of time, thus further enhancing their shipboard operational analysis and implementing global cost saving efforts as a result.

Now, a business can make highly informed decisions immediately and adjust accordingly. Of course, it's not simply analyzing a few traditional data points, like sales; it's analyzing where those sales took place, in which store locations, even in which aisles or departments, at what time of day, from which shelf the customer chose a purchase, what the customer's likely income level is – in other words, the more highly specialized the data, the more highly specialized and precise the conclusions that can be drawn.
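
To make that concrete, here is a minimal sketch, in pandas, of how a few extra dimensions turn a flat sales total into something more precise. The column names and figures are invented for illustration:

```python
# Hypothetical sketch: slicing retail sales by store, department and hour.
# Column names and data are illustrative, not from any client system.
import pandas as pd

sales = pd.DataFrame({
    "store":      ["Downtown", "Downtown", "Airport", "Airport"],
    "department": ["Bakery",   "Coffee",   "Bakery",  "Coffee"],
    "hour":       [8,          8,          16,        16],
    "units":      [120,        340,        45,        210],
    "revenue":    [480.0,      1190.0,     180.0,     735.0],
})

# The same numbers, viewed by store, department and time of day.
detail = (sales.groupby(["store", "department", "hour"])
               [["units", "revenue"]].sum())
print(detail)
```

Group the same raw records by different combinations of dimensions and they answer very different questions – which is exactly the point.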

Because it’s possible to generate highly detailed data and analyze it from so many different perspectives, every sector of the economy is making use of data analysis.

In the manufacturing sector, factory operations are being revolutionized[iii] by both big data and analytics. Sensors generate endless streams of data on the health of production line equipment, data that’s being examined by the minute for the slightest indication of a potential problem or defect. Conclusions are drawn and actions implemented immediately to avoid any breakdown and disruption in the production process. There’s a positive ripple effect to this: customers don’t experience delays and the company doesn’t experience a loss of revenue.
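
A deliberately simplified sketch of that idea – flagging a sensor reading that drifts away from its recent rolling average – might look like the following; the sensor name and threshold are invented:

```python
# Hypothetical sketch: flag production-line sensor readings that drift
# too far from their recent rolling average.
import pandas as pd

readings = pd.Series(
    [71.2, 71.4, 71.1, 71.3, 71.2, 78.9, 71.3],  # one spike at index 5
    name="bearing_temperature_c",
)

rolling_mean = readings.rolling(window=3, min_periods=1).mean()
deviation = (readings - rolling_mean).abs()

# Anything more than 3 degrees off its short-term average gets flagged
# for maintenance before it becomes a breakdown.
alerts = readings[deviation > 3.0]
print(alerts)
```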

The virtually unlimited storage capacity of the cloud, coupled with highly sophisticated computer algorithms that can perform serious analysis in seconds, is placing tremendous demands on data architects. Programs and applications must be agile enough to allow for updates, added features and improvements without delay. This has meant developing new architecture that can not only run a program at lightning speed but can also be altered or updated in the areas where it needs improvement, much like making incremental improvements to a car model without re-designing the whole car every time.

Gone are the days of a monolithic software structure where data warehouses needed a year or more to be designed and several more months for data to be loaded. If missing data was discovered, it would mean rebuilding the entire program.

Microservices and Teams

Today, Optimus Information designs architecture so that updates, changes or improvements can be made to one area of a program or application without having to open up the whole program. By using microservices in our software development, Optimus has created functional teams whose responsibility is to just one area of a program. A team focuses only on its specific area and generates improvements without impacting other teams or resulting in an overhaul of an entire software product. Tremendous amounts of time are saved for our clients and the cost of updates or re-designs is driven down dramatically.
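
As a hedged illustration of the idea rather than the architecture of any particular client system, one team might own nothing more than a small, independently deployable service like the one below (built here with Flask purely as an example), which can be changed and redeployed without touching the rest of the product:

```python
# Hypothetical sketch of a single microservice owned by one team.
# It exposes one narrow capability and can be redeployed on its own.
from flask import Flask, jsonify

app = Flask(__name__)

# In a real system this would come from the service's own data store.
_PRICES = {"SKU-001": 19.99, "SKU-002": 4.50}

@app.route("/prices/<sku>")
def get_price(sku: str):
    """Return the current price for a SKU, or a 404 if unknown."""
    if sku not in _PRICES:
        return jsonify({"error": "unknown sku"}), 404
    return jsonify({"sku": sku, "price": _PRICES[sku]})

if __name__ == "__main__":
    # Each team runs and deploys its own service independently.
    app.run(port=5001)
```

Other teams call the service over HTTP and never need to know how it stores or computes its answers.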

Optimus applies the same method to data gathering. By means of advanced tooling, our clients can store raw data, without pre-aggregating it, run a query on that raw data and have the answers they need in a matter of seconds. Previously, it would take weeks to get a result because the data would have to be assessed and compartmentalized as it was gathered and placed into structured environments before a query could be run. This is what we call modern data warehousing. The focus is on agility and speed.
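
Tooling varies from client to client, but the pattern itself – query the raw files directly instead of loading them into a rigid schema first – can be sketched with an embedded engine such as DuckDB, used here purely as a stand-in:

```python
# Hypothetical sketch: querying raw, un-aggregated data directly.
# DuckDB is used here only as a convenient stand-in for the pattern.
import duckdb

# Pretend this is raw event data landed in storage, never pre-aggregated.
with open("raw_events.csv", "w") as f:
    f.write("region,product,amount\n"
            "west,coffee,4.50\n"
            "west,muffin,3.25\n"
            "east,coffee,4.75\n")

con = duckdb.connect()
result = con.execute("""
    SELECT region, SUM(amount) AS total
    FROM read_csv_auto('raw_events.csv')
    GROUP BY region
    ORDER BY total DESC
""").fetchall()
print(result)  # answers in seconds, straight from the raw file
```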

Down the Road from Microsoft by Design

Optimus specializes in working with the IT departments of companies that don't or can't spend the time and money to develop the cloud-based software architecture needed today. Optimus uses a suite of leading-edge services on the Microsoft Azure platform that allow us to select exactly the right components to solve a client's problem. We are also physically located close to Microsoft's Vancouver and Redmond development centres.

Optimus is a Microsoft Gold Partner and, in that role, we work very closely with Microsoft on new product previews and trials that are in development, giving feedback that improves our customers' end product. Optimus employees have often already kicked the tires on new Azure features before they are released. This keeps us at the forefront of rapidly changing technology and lets us give feedback as enhancements are designed.

If you want to enhance and sharpen the results of your data analysis, we invite you to contact us. We are happy to explore some “what-if” scenarios with you to help propel your data insights – and your business – forward exponentially. Reach out and schedule a virtual coffee anytime.

Game Changers: The Role of Big Data in the Future of Credit Unions

In 2002, Billy Beane was the general manager of the Oakland Athletics in Major League Baseball. Oakland was a small-market club with a budget to match, and it struggled to be competitive.

Because Oakland didn’t have the money of big market teams like the New York Yankees or Los Angeles Dodgers, Beane knew he couldn’t hope to attract the high-priced talent – the superstars – to play in Oakland.

Enter Paul DePodesta, a 27-year-old Harvard economics graduate with an analytical mind and a love of baseball. His arrival on the doorstep of the Oakland A's helped usher in the era of data analysis in professional sports.

He analyzed player stats, using computer algorithms, and his results allowed Oakland to sign inexpensive players that other teams dismissed. The A’s were propelled into the stratosphere of success, thanks to big data.

The A's finished the 2002 season with 103 wins, the same number as the New York Yankees – but with a budget roughly a third the size.

This is the “secret sauce” in data analytics: the ability to take substantial amounts of information – in the case of Oakland, endless baseball player statistics – look for patterns and capitalize on what is found.
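
The figures below are invented, but a toy sketch of that kind of pattern-hunting – ranking players by production per salary dollar rather than by reputation – looks something like this:

```python
# Hypothetical sketch: rank players by on-base percentage per salary dollar.
# Names and figures are invented for illustration.
players = [
    {"name": "Player A", "obp": 0.390, "salary": 950_000},
    {"name": "Player B", "obp": 0.330, "salary": 6_500_000},
    {"name": "Player C", "obp": 0.370, "salary": 1_200_000},
]

for p in players:
    # On-base percentage delivered per million dollars of salary.
    p["obp_per_million"] = p["obp"] / (p["salary"] / 1_000_000)

for p in sorted(players, key=lambda p: p["obp_per_million"], reverse=True):
    print(f'{p["name"]}: {p["obp_per_million"]:.3f} OBP per $1M')
```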

Credit Unions, Machine Learning and Data Analytics

Credit unions in Canada are rapidly embarking on the same exploration. Using machine learning and data analytics, these financial firms are finding ways to improve service to their clients while, at the same time, uncovering nuggets of information in the vast amounts of data they collect that can then be turned into business opportunities.

Virtually every customer transaction within a credit union is electronic, and the amounts of data being collected are staggering. The need to analyze this information is what drives credit unions today to embrace machine learning and data analytics.

Matthew Maguire is the Chief Data Officer at Co-Op Financial Services, a California-based company that operates an interlinked network of ATMs throughout the U.S. and Canada. He argues that machine learning and data analysis are critical for mid-sized credit unions as they work to reinforce current customer relationships and build new ones.

“Data is coming in from different places and the challenge is… how do you make it all connect?[i]” he said.

Credit unions are moving quickly into data analysis. Through machine learning, which unearths customer transaction patterns by using algorithms, credit unions are learning a great deal about their customers and are designing strategies to capitalize on that in order to drive sales.

But, for credit unions, data enables other capabilities. Patterns of fraud can be easier to spot and shut down through data analysis.

When a client invests with a credit union, regulations require the client to complete what’s called a Know Your Client form, which essentially draws a profile of risk tolerance and investment objectives. If the client’s portfolio strays from that profile and becomes riskier, big data can alert the financial institution and the problem can be corrected before any monetary loss accrues to the client – or to hundreds of thousands of clients.
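
A deliberately simplified sketch of that kind of check – comparing a portfolio's current risk weighting against the tolerance recorded on the Know Your Client form – might look like the following; the risk weights, holdings and thresholds are invented for illustration:

```python
# Hypothetical sketch: flag portfolios that drift beyond the client's
# documented risk tolerance. Weights, holdings and margins are invented.
RISK_WEIGHTS = {"equity": 1.0, "bond": 0.4, "cash": 0.0}

def portfolio_risk(holdings: dict) -> float:
    """Weighted-average risk score (0 = all cash, 1 = all equity)."""
    total = sum(holdings.values())
    return sum(RISK_WEIGHTS[asset] * amount
               for asset, amount in holdings.items()) / total

def check_drift(holdings: dict, kyc_tolerance: float, margin: float = 0.10):
    """Return an alert if the portfolio is riskier than the KYC profile allows."""
    risk = portfolio_risk(holdings)
    if risk > kyc_tolerance + margin:
        return f"ALERT: portfolio risk {risk:.2f} exceeds KYC tolerance {kyc_tolerance:.2f}"
    return None

client_holdings = {"equity": 82_000, "bond": 10_000, "cash": 8_000}
print(check_drift(client_holdings, kyc_tolerance=0.60))
```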

Chris Catliff is the president and CEO of Blueshore Financial, a B.C.-based credit union with more than $3 billion in assets. His vision of the future of credit unions is predicated on the power of data analytics in combination with machine learning.

He envisions a day, very soon, when a client approaching a branch receives a text message saying they are already checked in. As they walk through the door, their customer profile and picture pop up on a screen[ii] at a concierge desk and they're greeted by name.

Blueshore's ATMs will respond to a customer's biometrics and offer transactions based on the pattern of previous transactions. Up-sell opportunities will present themselves so that staff can suggest options – conversations that might never happen without data analysis.

Service, he said, “has to be electronic transactions with the introduction of superior, human touch at various critical points. It’s high tech and high touch.”

Explore Your Data Potential

Like the members they serve, every credit union is unique. It is imperative for a credit union to work with data specialists who can marry the individual needs of each credit union with high levels of expertise across big data, data analysis and machine learning.

One of our strengths here at Optimus is our track-record in the areas of data gathering, analysis, machine learning, dashboarding and data visualization, through which we help our clients tailor data mining and analysis to their business goals.

At the end of the day, it’s all about staying competitive and, like the Oakland Athletics, reaching the pinnacle of success by embracing and employing new strategies to achieve that success.

 

[i] https://www.pymnts.com/big-data/2018/credit-unions-big-data-authentication-aml-kyc/
[ii] http://enterprise-magazine.com/features/betting-big-on-big-data/

 

Test Data Management Best Practices

As the scale and complexity of IT software escalates, reliability and quality requirements must increase proportionally, which creates increasing pressure on test organizations to develop more effective testing methodologies. One critical test process is the collection, validation and maintenance of production-quality test data as early as possible in the life of a software project.

Such a process has become more vital as organizations evolve from serial, waterfall-based development methods to agile development. Any test data management strategy must efficiently provide a steady supply of relevant test data to support ever-tightening development cycles while avoiding testing bottlenecks.

Following are five best practices to include in such a strategy.

1. Gather and Analyze Test Data

To the extent that relevant production data already exist, these often represent low-hanging fruit for building up a test data store. The chosen sets of production data are compared to required test cases for the software under test and sorted by purpose – positive path, negative path, boundary conditions, etc. – to determine their test coverage potential. Structured production data that include metadata are the most valuable as they are more readily adapted to format changes required by the test environment.

Test cases not covered by production data must be covered by newly created test data. These data may be obtained via an existing, closely-related database system or output from various applications such as inventory systems, accounting programs, financial applications or other applications appropriate to the software under test.
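
As a rough illustration of that sorting step (the record shape and rules are invented, not drawn from any real system), production records can be bucketed by the kind of test case they are able to serve:

```python
# Hypothetical sketch: bucket production records by the test purpose they
# can serve. The record format and rules are illustrative only.
ORDER_LIMIT = 100  # pretend the system under test caps orders at 100 units

def classify(record: dict) -> str:
    qty = record["quantity"]
    if qty < 0 or record.get("customer_id") is None:
        return "negative_path"      # invalid input the system must reject
    if qty in (0, ORDER_LIMIT):
        return "boundary"           # edge values at the documented limits
    return "positive_path"          # ordinary, valid orders

production_sample = [
    {"customer_id": 17, "quantity": 3},
    {"customer_id": 42, "quantity": 100},
    {"customer_id": None, "quantity": 5},
]

buckets = {}
for rec in production_sample:
    buckets.setdefault(classify(rec), []).append(rec)

print({purpose: len(recs) for purpose, recs in buckets.items()})
# Test cases with empty buckets must be covered by newly created data.
```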

2. Sanitize Data Sets

Test data sets should never include personally identifiable or otherwise confidential data in order to protect privacy and security as required by industry and governmental regulations. These data include names, addresses, phone numbers, email addresses, customer orders and so on. At the same time, however, the test data must reflect real-world scenarios, so such fields should be replaced with realistic, but fictitious information and details.
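
One common way to do this is to overwrite the sensitive fields with generated stand-ins. A minimal sketch using the Faker library (the field names are hypothetical):

```python
# Hypothetical sketch: replace personally identifiable fields with realistic
# but fictitious values before the records enter the test data store.
from faker import Faker

fake = Faker()
Faker.seed(42)  # deterministic output so test runs are repeatable

SENSITIVE_FIELDS = {
    "name": fake.name,
    "address": fake.address,
    "phone": fake.phone_number,
    "email": fake.email,
}

def sanitize(record: dict) -> dict:
    """Return a copy of the record with sensitive fields replaced."""
    clean = dict(record)
    for field, generator in SENSITIVE_FIELDS.items():
        if field in clean:
            clean[field] = generator()
    return clean

production_record = {"name": "Jane Doe", "email": "jane@example.com",
                     "phone": "604-555-0123", "order_total": 87.50}
print(sanitize(production_record))  # order_total is untouched
```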

3. Automate Data Refreshes

It is an intrinsic characteristic of production data that they go stale. Therefore, they must be refreshed periodically to maintain up-to-date, optimized testing processes and results. The initial acquisition and creation of production data was probably time-consuming, but subsequent refreshes will be far more efficient if opportunities to automate acquisition and formatting are exploited during that initial pass.
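
The mechanics differ from one environment to the next, but the essence is a repeatable job rather than a manual exercise. A bare-bones sketch, with the extract and sanitize steps stubbed out as placeholders, might be triggered nightly by cron or a CI pipeline:

```python
# Hypothetical sketch of an automated test data refresh job. The extract and
# sanitize steps are stubs standing in for environment-specific logic.
import json
from datetime import datetime, timezone

def extract_production_sample():
    # Stub: in practice, pull a fresh, representative slice of production data.
    return [{"customer_id": 17, "quantity": 3}]

def sanitize(records):
    # Stub: reuse the same masking rules applied during the initial pass.
    return records

def refresh_test_data(path="test_data.json"):
    records = sanitize(extract_production_sample())
    snapshot = {
        "refreshed_at": datetime.now(timezone.utc).isoformat(),
        "records": records,
    }
    with open(path, "w") as f:
        json.dump(snapshot, f, indent=2)

if __name__ == "__main__":
    refresh_test_data()  # e.g. run nightly so the data never goes stale
```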

4. Categorize Data for Potency and Reuse

Taking time to define and set up descriptive test data attributes pays enormous dividends for the current and any future testing project that reuses the test data store. At a minimum, label test data to correlate them to specific test cases. At any point in the testing process, these correlations can calibrate the potency of current data sets as data are added, deprecated or modified, which can be an added factor in determining data store refresh rates.

More distinctive attributes should be added that facilitate data provisioning of subsequent releases of the software under test or for new, related software projects. This can be effected in the most efficient manner by automatically matching labeled data sets to test cases. Test data selection can be further enhanced by categorizing data sets according to their relevance to different testing stages such as functional, performance or stress tests.
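
A lightweight version of that labeling can be as simple as metadata records that map each data set to its test cases and testing stages; the attribute names below are invented for illustration:

```python
# Hypothetical sketch: descriptive attributes that tie test data sets to
# test cases and testing stages, so they can be matched automatically later.
from dataclasses import dataclass, field

@dataclass
class TestDataSet:
    name: str
    test_cases: list = field(default_factory=list)   # e.g. ["TC-101", "TC-104"]
    stages: list = field(default_factory=list)       # e.g. ["functional", "stress"]

catalog = [
    TestDataSet("orders_boundary_values", ["TC-101", "TC-104"], ["functional"]),
    TestDataSet("orders_bulk_load", ["TC-220"], ["performance", "stress"]),
]

def data_sets_for(test_case: str, stage: str):
    """Automatically match labeled data sets to a test case and stage."""
    return [d.name for d in catalog
            if test_case in d.test_cases and stage in d.stages]

print(data_sets_for("TC-101", "functional"))  # -> ['orders_boundary_values']
```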

5. Use a Central Repository with Version Control

The ability to control the quality of test data is severely hampered without a central data repository with version control. It is an organizational decision whether or not the test data reside in the same version control system as the source code under development. If they are separate, however, an automated procedure must exist to ensure release versions remain aligned.

Additionally, there must be a distinction made between static input test data and those that are generated by test scripts or adjunct test applications. In the latter case, the scripts would fall under version control rather than the generated data. In general, output data should not be stored except when errors occur in order to facilitate debugging.

The overarching goal of any test data management strategy is to streamline testing with real-world data as early in the software development process as possible. Too many organizations, once they fall behind on software delivery schedules, fall into the trap of testing against live production data, which further disrupts the schedule and reduces customer confidence. Taking the time to gather, develop, analyze and properly maintain test data sets as early as possible leads to more valuable defect detection, efficient data use/re-use plus the shipment of a higher quality product out the door.

Business Analytics in 2012


Gartner’s business intelligence predictions for 2012

With 2012 just around the corner, Gartner has identified next year's business analytics trends. From the list, I find the following the most fascinating.

Next-Generation Analytics:

As we know, profitability at any organization is the result of countless "minor" actions taken by decision makers at the front lines – at employees' desks, in front of customers, with partners and so on. But because traditional business intelligence solutions are best suited to delivering historical, narrowly focused information, most companies are unable to evaluate and respond to business changes in real time.

The enabler of both micro- and macroscopic views of profitability is analytics: sets of interactive objects (graphs, tables, reports, alerts) that support the process of taking profitable actions by helping the decision maker separate truly actionable knowledge from raw data. Analytics can therefore link insight to action. To do this properly, analytics must be available at the point of opportunity – when profit occurs, through collaboration, by many people, and in incremental actions.

Analytics is growing along three key dimensions:

  1. From traditional offline analytics to in-line embedded analytics. This has been the focus for many efforts in the past and will continue to be an important focus for analytics.
  2. From analyzing historical data to explain what happened to analyzing real-time as well as historical data from multiple systems in order to simulate and predict the future.
  3. Over the next three years, analytics will mature along a third dimension:
    1. from analysis by individuals to many systems supporting a collaborative decision process that brings multiple people together to analyze, brainstorm and make decisions
    2. from structured and simple data to analysis of complex information of many types (text, audio, etc.) – the best example in this category would be Apple's new TV, which will use Siri to help viewers decide which channels to watch based on their mood

Additionally, analytics is beginning to shift to the cloud, which enables high-performance and grid computing. Based on what we have seen this year, I strongly believe that in 2012 analytics will increasingly focus on decisions and collaboration. It will provide simulation, prediction, optimization and other analytics, not simply information, to empower even more decision flexibility at the time and place of every business process action.

Big Data:

The sheer volume of data is increasing exponentially day by day. Along with size, the complexity of formats and the speed of delivery have also exceeded the capabilities of traditional data management technologies. These changes have increased the need for new, simpler technologies to manage the volume of data. In the market today we find many new emerging technologies with the potential to be disruptive (e.g., in-memory DBMS). Analytics has become a major driving application for data warehousing, with the use of self-service data marts and cubes. One major implication of big data is that in the future users will not be able to put all useful information into a single data warehouse. Logical data warehouses, which bring together information from multiple sources as needed, will soon replace the single data warehouse model.

Cloud Computing:

Cloud is a disruptive force with the potential for broad, long-term impact across most industries. While the market remains in its early stages in 2011 and throughout 2012, it will see the full range of large enterprise providers fully engaged in delivering offerings to build cloud environments and deliver cloud services. As companies like Microsoft, Amazon and Salesforce continue to expand their cloud offerings, users will see competition heat up and enterprise-level cloud services increase.

Enterprises are moving from trying to understand the cloud to making decisions about which workloads to implement on cloud services and where they need to build out private clouds. Hybrid cloud computing – which brings together external public cloud services and internal private cloud services, as well as the capabilities to secure, manage and govern the entire cloud spectrum – will be a major focus for 2012.

Rupmeet works with Optimus in the area of Business Analytics. Please feel free to contact us to set up a brief introductory meeting.

Introduction to Business Analytics

This is the second post in a 3-part series on business analytics. My last post discussed the four categories organizations fall under in their use of business analytics. This post starts with my introduction to business analytics, then walks through an excellent example of a successful analytics implementation.

A few years ago, I was working for an investment bank. My role was to write queries and generate reports. At that time I felt that only financial institutions, such as my bank, needed customized reports and complex business dashboards to run their operations, because they had large volumes of data and could afford the cost associated with analytics. I later learned that any company, large or small, could benefit from gathering and monitoring key performance metrics.

AstraZeneca is one of my favorite examples of a successful business analytics implementation. AstraZeneca is a pharmaceutical company formed in 1999 by the merger of Astra AB and Zeneca Group plc. The merger brought data management challenges with it, as both companies had many data sources and data types.

At that time, AstraZeneca had more than 12,000 research employees across six international locations. Being in the drug development industry, AstraZeneca had a critical need to reduce time-to-market, which is directly related to maximizing project management efficiency, resource utilization and cost control. The biggest challenge faced by AstraZeneca was to find a solution that could provide immediate access to one consistent view of critical R&D information across all six locations.

AstraZeneca developed and deployed an in-house system with the help of an analytics vendor. The system collected data from various sources and stored R&D data in a central data warehouse using the ETL process (Extract, Transform and Load). The system used an analytics solution to query and analyze R&D information.
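
The specifics of AstraZeneca's system are not public, but the Extract, Transform and Load pattern it describes can be sketched generically. The source formats, field names and in-memory SQLite "warehouse" below are invented stand-ins:

```python
# Hypothetical, generic sketch of the Extract-Transform-Load pattern.
# Source systems, fields and the SQLite store are invented placeholders.
import sqlite3

def extract():
    # Rows from two differently shaped source systems (invented).
    site_a = [{"proj": "P-001", "cost_usd": 120000, "site": "A"}]
    site_b = [{"project_id": "P-002", "cost": 95000, "location": "B"}]
    return site_a, site_b

def transform(site_a, site_b):
    # Map both shapes onto one consistent schema.
    unified = [{"project": r["proj"], "cost": r["cost_usd"], "site": r["site"]}
               for r in site_a]
    unified += [{"project": r["project_id"], "cost": r["cost"], "site": r["location"]}
                for r in site_b]
    return unified

def load(rows):
    # One central store; in-memory SQLite stands in for the data warehouse.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE rd_projects (project TEXT, cost REAL, site TEXT)")
    con.executemany("INSERT INTO rd_projects VALUES (?, ?, ?)",
                    [(r["project"], r["cost"], r["site"]) for r in rows])
    return con

con = load(transform(*extract()))
print(con.execute("SELECT site, SUM(cost) FROM rd_projects GROUP BY site").fetchall())
```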

Users across the six locations were now able to access information on projects, schedules, costs, resources and more. AstraZeneca started using information from other business areas such as finance, manufacturing and marketing. AstraZeneca also implemented dashboard solutions to deliver at-a-glance displays of key performance indicators, both of functional units such as R&D, and of the overall strategic direction of the company led by executive management.

The data management and analytics solution enabled AstraZeneca to track, understand, and manage huge volumes of data created every day from various business units. AstraZeneca researchers, senior management, and other employees now had instant access to one consolidated view of up-to-date R&D information. This helped them make faster and more confident decisions.

By leveraging its information, AstraZeneca gained a significant competitive edge: faster time-to-market for new products, quicker product launches, early brand leadership and increased revenues.

OptimusBI helps companies with their business analytics needs. For more information on our services, please contact us at rupmeet.singh@optimusinfo.com


Business Data Maturity: 4 Types of Companies


Reading raw data is like looking at The Matrix. Business analytics turns raw data into actionable information.

There are four types of companies with respect to business data maturity, three of which utterly fail in business analytics.

Business analytics is the art of extracting the maximum value from raw data. Data is now very inexpensive to capture and record. According to this article from the Gartner data centre conference on business data, there will be approximately 650% growth in all forms of enterprise data over the next five years. Eric Schmidt, former CEO of Google, once said, "We now create in two days as much information as humanity did from the beginning of recorded history until 2003."

This presents a challenge: How do we make this data useful?

Very few companies use this enormous volume of data to their competitive advantage. I would categorize companies into four major segments:

Category 1 – No clue about data:

There are many companies that have no clue that they can transform their business model around data. For example, a coffee shop owner is busy with customers every day. If the owner analyzed customers' tastes – for instance, whether demand for hot coffee and croissants is higher in the morning while demand for muffins and iced tea is higher during the afternoon – inventory could be managed far more efficiently.
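
To show how little machinery that first step takes, here is a toy sketch of the coffee shop example in pandas; the sales figures are invented:

```python
# Toy sketch of the coffee shop example; the numbers are invented.
import pandas as pd

sales = pd.DataFrame({
    "item":    ["coffee", "croissant", "muffin", "iced tea",
                "coffee", "croissant", "muffin", "iced tea"],
    "daypart": ["morning"] * 4 + ["afternoon"] * 4,
    "units":   [140, 90, 25, 10, 45, 20, 70, 60],
})

# Which items actually sell in each part of the day?
demand = sales.pivot_table(index="item", columns="daypart",
                           values="units", aggfunc="sum")
print(demand)  # a starting point for smarter inventory and baking schedules
```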

Category 2 – No access to data:

Many companies have lots of data, but have no access to it. The processes in place to access the data may be too complex or slow. Executives may not be able to obtain the data in real-time.

Category 3 – Data overload:

Companies going through transitions or expansions may have access to abundant data feeds, including those that are only marginally relevant to their industry. The executives may not know how to manage the large volumes of data. Unorganized data is as harmful as inaccessible data.

Category 4 – Data as competitive advantage:

Companies that see themselves as industry leaders invest heavily in collecting data, managing it and extracting the best analysis out of it. This helps them to be ahead of the game, become more market-responsive, improve customer service, and successfully differentiate their business.

In our next blog post on business analytics, we will discuss each category and how companies in each category may benefit from structured business analytics.

OptimusBI has a data management service to help companies with their business analytics needs. For more information on our services, please contact us at rupmeet.singh@optimusinfo.com