Blog posts on the subject of cloud computing by management and developers at Optimus Information Inc.

Cloud adoption is growing rapidly as enterprises become more comfortable with the model, understand its advantages, and learn how to mitigate its drawbacks.

Learn about the benefits of cloud computing for the enterprise, some of the pitfalls of adopting cloud computing in an enterprise setting, and how to deal with those pitfalls so your cloud computing experience is a positive one.

4 Tips to Prevent “Run-Away” Azure Costs

Imagine you return to the office on Monday morning and discover a mysterious spike in your Azure consumption. You are now significantly over your monthly budget. As you feel a headache forming, you ask yourself: “Wasn’t moving to the cloud supposed to reduce spending?”

Here are four simple, fundamental actions you can take to prevent a run-away Azure cost incident.

1. Delete or deactivate unused resources

According to Microsoft’s Aleenah Ansari, “The challenge with cloud computing is that once you flip the switch on, the meter keeps running until you switch it off.” You don’t leave the heat on when you take a weekend trip, so why pay for unused resources in the cloud? Part of optimizing for the cloud is deleting unused resources so you are not unknowingly incurring cost on something that is still active but idle. If a resource needs to be kept, place it in a free or low-cost configuration (for example, a dev or test configuration that starts at a lower price tier).
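As an illustration of the housekeeping this implies, the sketch below flags resources that have sat idle past a cutoff so they can be reviewed for deallocation or deletion. The inventory and dates are hypothetical; this is not the Azure SDK:

```python
# Illustrative sketch: flag resources that have been idle past a cutoff
# so they can be reviewed for deallocation or deletion.
from datetime import datetime, timedelta

def find_idle_resources(resources, now, idle_days=14):
    """Return names of resources whose last activity is older than idle_days."""
    cutoff = now - timedelta(days=idle_days)
    return [r["name"] for r in resources if r["last_active"] < cutoff]

now = datetime(2021, 6, 1)
inventory = [
    {"name": "vm-dev-01",  "last_active": datetime(2021, 3, 2)},   # idle
    {"name": "vm-prod-01", "last_active": datetime(2021, 5, 31)},  # in use
    {"name": "disk-old",   "last_active": datetime(2021, 1, 15)},  # idle
]
print(find_idle_resources(inventory, now))  # → ['vm-dev-01', 'disk-old']
```

In practice you would drive a review like this from your actual Azure usage and billing exports rather than a hand-built inventory.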

Read this article to learn more about how to optimize your cloud investment with Azure Cost Management.

2. Create subscription budgets and alerts

Just like managing your personal finances, you can set limits for your Azure spending. Microsoft Azure allows you to set alerts and thresholds to monitor your budget based on cost or usage. You can even set up an automatic trigger when a given budget threshold is reached: for example, notify operations staff, shut down VMs, or move your infrastructure to a different pricing tier.

As a best practice, review your budget regularly to see how much you have spent and make proactive changes if needed. Use this tutorial to create and manage Azure budgets.
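Conceptually, a budget alert is just a comparison of current spend against fractions of the budget. The sketch below shows the idea; the threshold values are illustrative, not Azure defaults:

```python
def triggered_alerts(spend, budget, thresholds=(0.5, 0.75, 0.9, 1.0)):
    """Return the alert thresholds (fractions of budget) that spend has crossed."""
    return [t for t in thresholds if spend >= budget * t]

# A $1,000 monthly budget with $920 spent crosses the 50%, 75%, and 90% marks.
print(triggered_alerts(920, 1000))  # → [0.5, 0.75, 0.9]
```

Azure evaluates these thresholds for you and can wire each one to an email, text, or automated action.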


Based on the fields chosen for the budget, a graph is shown to help you select a threshold to use.

3. Use management groups to roll up costs across all subscriptions

Modern TV streaming services give you parental controls to see who is watching what and manage what they are allowed to watch. Azure offers a similar capability for managing multiple subscriptions within your organization. Azure management groups let you manage access, policies, and compliance across subscriptions by providing a level of scope above subscriptions.

Once you have organized your subscriptions into containers called ‘management groups’, you can apply your governance conditions to all your subscriptions in that management group. Note: All subscriptions within a single management group must trust the same Azure Active Directory tenant. However, once they do, the costs from individual subscriptions can also be rolled up to the containing management group, allowing for convenient budget monitoring and alerting across multiple subscriptions.
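The cost roll-up can be pictured as a simple aggregation from subscriptions to their containing management group. The group names and figures below are hypothetical:

```python
# Sketch: roll subscription costs up to their containing management groups.
from collections import defaultdict

subscriptions = [
    {"name": "sub-dev",   "group": "Engineering", "cost": 1200.0},
    {"name": "sub-test",  "group": "Engineering", "cost": 450.0},
    {"name": "sub-sales", "group": "Business",    "cost": 800.0},
]

def rollup_costs(subs):
    """Sum each subscription's cost into its management group's total."""
    totals = defaultdict(float)
    for s in subs:
        totals[s["group"]] += s["cost"]
    return dict(totals)

print(rollup_costs(subscriptions))
# → {'Engineering': 1650.0, 'Business': 800.0}
```

With the roll-up in place, a single budget or alert at the management-group level covers every subscription beneath it.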


Azure management group hierarchy.

4. Improve system monitoring, alerts, and notifications

Your smartwatch counts your steps, monitors your sleep habits, and notifies you when you’ve been sitting too long. Wouldn’t it be nice if you received similar alerts about your Azure account? Using tools like Azure Monitor and Application Insights can help you understand how your applications are performing and can even proactively identify issues affecting them and the resources they depend on.

Azure Monitor can be used to monitor the availability, performance, and usage of your web applications. Alerts in Azure Monitor also notify you of critical conditions by sending a text or email to an administrator responsible for investigating an issue. Alerts can also attempt to take corrective action, such as launching an automated process that attempts to correct an error condition.
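The alert pattern described here (evaluate a metric, notify someone, optionally take corrective action) can be sketched as follows; the metric samples and handlers are hypothetical stand-ins, not Azure Monitor APIs:

```python
# Sketch of the alert pattern: evaluate samples against a threshold,
# notify, and optionally trigger a corrective action.
def evaluate_alert(samples, threshold, notify, remediate=None):
    breaches = [s for s in samples if s > threshold]
    if breaches:
        notify(f"{len(breaches)} sample(s) exceeded {threshold}")
        if remediate:
            remediate()
    return bool(breaches)

log = []
fired = evaluate_alert(
    samples=[70, 85, 96, 98],       # e.g. CPU percentages
    threshold=90,
    notify=log.append,              # stand-in for an email or SMS
    remediate=lambda: log.append("restarting service"),
)
print(fired, log)  # → True ['2 sample(s) exceeded 90', 'restarting service']
```

In a real setup, an Azure Monitor action group would send the notification and an automation runbook or function would perform the remediation.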


A high-level view of Azure Monitor.

Managing a cloud tenant can bring unexpected challenges, but using these Azure features will help you get the most from your cloud infrastructure without blowing your budget. Reach out to us if you want to learn more about optimizing your Azure infrastructure.


All product names, trademarks and registered trademarks are property of their respective owners.

Q&A About Open Source and Azure with Mark Godfrey of Microsoft

Mark Godfrey is a TSP with Microsoft Canada. Recently he presented to an audience of more than 200 application developers about Open Source, Azure, and Optimus Information at the Vancouver Microsoft Technology Summit.

Open Source is everywhere at Microsoft. Check out what’s running on Microsoft Azure today.

Q: Welcome Mark, and thanks for joining us today.

A: My pleasure, happy to be here.

Q: For people who don’t know you and your role, can you explain what you do at Microsoft?

A: Sure. At Microsoft I’m a TSP – Technology Solution Professional. I’m the guy who helps the sales teams by talking to customers about how Microsoft technology works and how it will fit into their environment to improve agility, increase innovation and secure their IT assets.

Q: You enjoy this role?

A: I love this role. I think it is the best job at Microsoft.

Q: Thanks – I’ve known you for a few years and I know that statement is true. Can we begin with some myth busting? I was really surprised to learn that about 50% of the workloads on Azure today are actually Open Source workloads.

A: True, we are approaching that and getting very close. I expect it will be greater than 50% within the next two years.

Q: Honestly, I am shocked. I had no clue.

A: You and most of the people in the room where I was presenting. I think the challenge we’ve had is that we haven’t done a good job of marketing how integrated we are with existing Open Source software. It’s really a shame that we haven’t been better at letting folks know we want to meet them where they are, whether they are already using Chef, Puppet, or any number of Linux distros. There are a lot of app development tools that are already built and integrated into Azure. We want the Open Source world to know about this.

Q: Who should we be telling? Are there industries or vertical markets that are more likely to be using instances of Open Source?

A: No, it’s everybody; anybody who is running Open Source software today. It’s not just small business and the hipster shops. It’s government and big business that are also using a lot of Open Source technology, especially in the application development and DevOps world.

When Microsoft first introduced Azure, we actively met with customers about their Windows workloads. And, in parallel, we were ensuring that Azure was equally friendly for Open Source. Now customers are beginning to realize, ‘wait a second, Microsoft is a great platform to meet us where we are.’ Customers might be using tools like Chef, Puppet, Drupal – all these different Open Source technologies – and not realize they don’t have to change. They can continue with these technologies on Azure.

Q: What’s the advantage of putting your Open Source workloads on Azure as opposed to another public Cloud provider?

A: Speed and agility, and the rate of innovation you can drive. It’s the same challenge you have in the Windows world: standing things up on-premises can take weeks or months, while standing things up in Azure and leveraging the services that already exist there takes minutes or hours. And Microsoft is the only public cloud platform that offers a truly hybrid solution; what you’re doing on-premises can be extended into Azure. You just can’t do that the same way with AWS or Google Cloud.

I’ll give you some examples, and the picture above really illustrates this; application development platforms like Joomla and Drupal can run natively on Azure. It means that development teams can deliver applications much faster than if they were using an on-premises development platform that needed to be created or recreated for every project. By leveraging Azure, they continue to manage and leverage the technology that they use today, but at cloud speed, so there’s not a lot of retraining, replacement, or new hiring that’s required.

Q: That leads me to my next question: Microsoft has a lot of great partners and one of them co-presented with you – Optimus Information Technology.

A: That’s right. We have an incredible partner community and they each have unique skills that help our customers move to the cloud. One of the strengths that Optimus brings is their test harness. This technology enables automated testing of Open Source applications. Microsoft is committed to helping our customers, and will fund proof-of-concepts and pilots for customers who are moving or testing workloads in the cloud.

Q: How do customers take advantage of this?

A: Simply reach out to Optimus or reach out to your local Microsoft TSP. We can help you get started.

Q: Mark, thanks for your time. This was really interesting.

A: My pleasure.

For readers of this blog who want to learn more about funding a PoC for automated testing of Open Source applications using the Optimus test harness, simply reach out to our Chief Technical Strategist, Ryan O’Connor.

Microsoft Technology Summit – Open Source and Azure and the Optimus Test Harness

Ryan O’Connor presenting the Optimus Test Harness at Microsoft Tech Summit in Vancouver

It was both a pleasure and a privilege to represent Optimus at the Vancouver Microsoft Technology Summit last week.

The tech summit is one of my favourite events. It attracts a lot of people – especially those who are working individually or remotely – and gets them out to an event where they can connect and talk about what is happening with technology and hear about the latest and greatest from Microsoft.

Also, it was an opportunity for me personally to reconnect with a bunch of people I have known from my many years of delivery; people who have moved on to other companies. Vancouver is a tight technology community, with a core of folks who have worked together over the years.

I was especially delighted that Microsoft asked us to co-present, because while the name “Optimus” might be new for some, the people who work here are well known, and so it allowed me to shine a light on some of the cool things that Optimus is doing. (Thanks Microsoft).

Microsoft and Open Source

Mark Godfrey from Microsoft was talking about Open Source on Azure and sharing some pretty interesting facts. Here are some of the stats he presented:

  • 4x growth in container customers in Azure since January, all with Docker on Linux!
  • ~40% VMs in Azure run Linux today
  • 60% of Azure Marketplace Images are Linux based!


Mark was speaking about the investment that Microsoft has made and is making in Open Source. The image above really speaks to the journey. In the middle of his presentation he was explaining that if you are using Open Source software like Linux, Optimus is a great example of how you can work with a trusted partner to run a test automation framework on Azure. At that point, Mark simply asked me to join him on stage and talk about the Optimus test harness.

The Optimus Test Harness

We have been helping customers move to the cloud and do their testing in the cloud for a while now. About 85% of our customers are tech companies that offer a product or a service. Most of them have cloud-based apps, and a lot of them need to test those apps. Locating the test infrastructure inside their own tenant, closer to the services they provide and to Microsoft services like machine learning, makes a lot of sense.

Three Ways We Can Help
  1. Proof of Concept. We can stand up our test harness on a customer’s tenant because there’s no licence fee. This helps customers prove the concept that automated testing might add value to them.
  2. Component Testing. We might test a particular component they are struggling with; for example, all the APIs. We might just do API testing or performance testing of some specific component. In that process, we might be doing end-to-end services. We will build and execute the script as a managed service.
  3. Knowledge Transfer. Or, we might take a more educational approach or a capacity-building approach, where we help an internal team grow in its testing maturity. In doing so, we are teaching, showing and doing. Ultimately, we turn everything over to the internal QA team to manage. Quite often, we end up continuing to provide service because of the value we deliver. The organization realizes that they can free up their resources to do other things and working with us is a cost-effective approach.

Ultimately, our approach is very consultative. We look at what you really need and what services you might want to embrace. For example, we can bundle the services (such as the creation of scripts) and the test harness into a licence fee so that you can purchase it as a capital expense.

The bottom line is that we can get you set up and get you going so that you will be able to do your regression testing with the push of a button.

Getting Started

Microsoft is investing in this approach and has already funded several PoCs for customers who want to try the test harness. Let me know if you are interested in learning more. We can also help determine if you might qualify for Microsoft funding.

Reach out anytime – we would love to work with you.

Could BYOD Work for your Enterprise?

With constant advancements in mobility and cloud, the Bring Your Own Device (BYOD) strategy is not just for startups anymore. According to Gartner, more than 60% of employees use a personal device for work, including smartphones and tablets. When it comes to assessing the benefits and risks of a BYOD strategy, reviews are still fairly mixed: there are as many reasons to switch to BYOD as there are not to. Here are a few things to consider when creating a BYOD strategy for your enterprise.

Costs Savings

The first and most talked-about advantage of BYOD is cost savings. Taking this route allows enterprises to phase out capital spending on hardware that will inevitably require upgrading in a few years’ time. Most employees already bring mobile devices to the office and are most comfortable with the device of their choosing, so BYOD can eliminate spending on large amounts of unnecessary hardware.

Mobile Device Management Software

For IT teams, managing policy, inventory, and security across personal devices is a growing task. This is where Mobile Device Management (MDM) software comes in. MDM allows IT teams to monitor, manage, and secure employees’ devices across multiple service providers and operating systems within your enterprise. Most MDM software includes end-to-end security, which allows your IT team to fully manage the mobile apps, network, and data used on the variety of mobile devices. Some MDM solutions also include mobile security and expense management. With the right MDM software, you can reduce the security challenges that come with switching to BYOD.

Security Challenges

Managing security is arguably the most complicated challenge in implementing a BYOD strategy. With the variety of devices, operating systems, and product versions in play, your security policy will need to ensure everything is secured. Your IT team will be managing a number of security concerns, including lost or stolen devices, mobile threats, and internal security breaches. Conducting a full risk assessment before implementing a BYOD strategy can help your enterprise understand where the greatest risks are and what the related costs will be.


Before implementing your BYOD strategy, there are two important aspects to consider that can have a great effect on employee adoption and overall BYOD success. First, set out guidelines for cost sharing to avoid any misunderstanding about who is paying for what: devices, service plans, software, or accessories. A variety of structures can be used. For enterprises that require precise accounting, there is software that can monitor usage and cost, differentiating between personal and business use. Other cost-sharing strategies can be as simple as a monthly subsidy based on a fixed fee or a percentage of monthly service plans.

Finally, finding a set of guidelines and policies that works for your enterprise could be the key to BYOD success. Set out guidelines around what your enterprise considers acceptable business versus personal use. Some policies get as detailed as specifying which apps are allowed or disabling camera and video use while on-site. Consider which devices, and what type of support, your IT team will cover; opening your policy to any type of device could create more costs and challenges. Also include the risks, liabilities, and any disclaimers relevant to your enterprise: precautions for lost or stolen devices, personal data loss, regular backups, ethical use, and which data could be required from a device if an employee or your enterprise becomes involved in a lawsuit.

Although there is a lot to consider in adopting a BYOD strategy, the payoff can be worth it. BYOD can create a competitive advantage and increase employee satisfaction, allowing employees to feel more comfortable and empowered with the technology they use. Adopting a BYOD strategy now can also ease IT evolution as we move increasingly toward mobile and cloud technologies.

Optimus Information provides both mobile and cloud strategy services and can assist with the implementation of MDM software. For information on how our team can help your enterprise contact us today.

Data Sovereignty in Canada


Most companies operating in Canada can store data wherever they want as long as they take measures to secure personal data.

Service providers working with public bodies in BC and Nova Scotia have stricter data sovereignty requirements including storing data in Canada.

Concerns about accessing data through the PATRIOT Act are misplaced because there are broader mechanisms in place for requesting and sharing data between governments and law enforcement agencies that predate the PATRIOT Act.


The PATRIOT Act was enacted in 2001 and it broadly extended US law enforcement’s powers to access data.

Companies with a presence in the US are subject to the PATRIOT Act regardless of where the data is physically located or where they are headquartered.

Canada offers no protections against the PATRIOT Act and only British Columbia and Nova Scotia have enacted any form of protection against the PATRIOT Act.

Furthermore, Canada, like most countries, has enacted legislation that grants similar powers to Canadian law enforcement agencies and, like most western countries, has agreements in place to share that information with foreign allies.

So, even if your company only operates in Canada and your data resides entirely in Canada, US law enforcement agencies can ask their Canadian counterparts for the data and the Canadian authorities will likely comply.

The main lesson is that if there is reasonable suspicion of criminal wrongdoing, then it doesn’t matter where the data is stored. For typical, non-criminal businesses, locating data in Canada with a Canadian hosting company offers very little additional protection.

Canada’s Patriot Act: PIPEDA

The Personal Information Protection and Electronic Documents Act (PIPEDA) governs how data on Canadians is collected, used and disclosed.

The main obligation for Canadian companies set out by PIPEDA is the requirement that “personal information shall be protected by security safeguards appropriate to the sensitivity of the information.”

You can store data wherever you want, just make sure anything sensitive is encrypted and password protected.

One of the main exceptions to PIPEDA’s protections is law enforcement and national security. That exception extends to sharing data with foreign bodies.

The Office of the Privacy Commissioner of Canada made this clear in a submission to the Office of the Information and Privacy Commissioner for British Columbia titled Transferring Personal Information about Canadians Across Borders–Implications of the USA PATRIOT Act.

“Canadian law often permits government agencies to share personal information that is held in Canada (by government or the private sector) with foreign governments and organizations, even without the consent of the individual to whom the information relates.”

Canada has signed a number of Mutual Legal Assistance Treaties with countries like the US and the UK that provide mechanisms for requesting evidence.

The Department of Justice then needs to apply for a search warrant before obtaining the information and then sharing it with the body that made the request.

British Columbia and Nova Scotia have enacted laws that govern records held by public bodies that apply to service providers working with public bodies. Both laws require that data be stored in Canada.

Freedom of Information and Protection of Privacy Act (BC)

The Freedom of Information and Protection of Privacy Act (FOIPPA) regulates access to records held by public bodies and privacy standards for such records in the province of British Columbia (BC).

Many of the privacy-related sections of the act apply to “officers of the Legislature, their employees and, in relation to their service providers, the employees and associates of those service providers, as if the officers and their offices were public bodies.”

That means that if you provide services for a public body in BC, then FOIPPA may apply to you.

Sections that apply include the data sovereignty provisions of FOIPPA which require that data collected by public bodies in BC be stored in Canada.

In addition to storing the data in Canada, organizations subject to FOIPPA are required to report foreign demand for disclosure to the minister responsible for FOIPPA.

This means that companies subject to the PATRIOT Act would be compelled to give requested data to US authorities and report the transaction to BC authorities. In practice, US authorities would likely ask Canadian authorities (who are exempted from notification) to share data, thus circumventing any FOIPPA protections and responsibilities.

Most of the other applicable sections are related to storing data securely and unauthorized disclosure/access.

The following is the complete list of sections that apply to service providers as listed in the act:

  • Section 30: Protection of personal information.
  • Section 30.1: Storage and access must be in Canada.
  • Section 30.2: Obligation to report foreign demand for disclosure.
  • Section 30.3: Whistle-blower protection.
  • Section 30.4: Unauthorized disclosure prohibited.
  • Section 30.5: Notification of unauthorized disclosure.
  • Section 33: Disclosure of personal information.
  • Section 33.1: Disclosure inside or outside Canada.
  • Section 33.2: Disclosure inside Canada only.
  • Section 74.1: Privacy protection offences.

Personal Information International Disclosure Protection Act (NS)

The Personal Information International Disclosure Protection Act (PIIDPA) applies to personal information collected by public bodies in Nova Scotia.

The act also applies to service providers defined as “an individual or a company that is retained under a contract to perform services for a public body, and in performing those services, uses, discloses, manages, stores or accesses personal information in the custody or under the control of that public body.”

Similar to the BC law, PIIDPA requires data covered under the act be stored in Canada. It also requires that foreign requests for disclosure be reported to the Minister responsible for the act, but specifically exempts foreign law enforcement agencies that request information through federal or provincial agreements.

In addition, it specifically prohibits storing PIIDPA data in portable devices while travelling unless given specific permission.


At the moment, there are very few data sovereignty requirements that apply to Canadian companies. The most common ones are satisfied by basic security practices that you should already be doing.

Keeping your data in Canada over PATRIOT Act concerns is also unnecessary. Most Western countries already had mechanisms in place where Canadian authorities would provide data that resides in Canada. The PATRIOT Act mostly only asserted the US’s right to unilaterally request data from companies with a presence in the US where they already had the bilateral right to do so from companies in Canada.

The only ways you can guarantee that you are made aware of those requests are by running your own data center, in which case you receive the subpoena for the data yourself, or by working with data belonging to public bodies in BC and Nova Scotia, where provincial data sovereignty laws apply.

As with any post on legal topics here, this is an overview of the laws and does not replace proper legal advice in any way.


Business Analytics in 2012


Gartner’s business intelligence predictions for 2012

With 2012 around the corner, Gartner has identified next year’s business analytics trends. From the list, I find the following the most fascinating.

Next-Generation Analytics:

As we know, profitability at any organization is the result of countless “minor” actions taken by decision makers at the front lines: at employees’ desks, in front of customers, with partners, and so on. But because traditional business intelligence solutions are best suited to delivering historical, narrowly focused information, most companies are unable to evaluate and respond to business changes in real time.

The enabler of micro- and macroscopic views of profitability is analytics: sets of interactive objects (e.g., graphs, tables, reports, alerts) which support the process of taking profitable actions by helping the decision maker separate truly actionable knowledge from raw data. Analytics can therefore link insight to action. To do this properly, analytics must be available when profit occurs, through collaboration, at the point of opportunity, by many people, and in incremental actions.

Analytics is growing along three key dimensions:

  1. From traditional offline analytics to in-line embedded analytics. This has been the focus for many efforts in the past and will continue to be an important focus for analytics.
  2. From analyzing historical data to explain what happened to analyzing real-time as well as historical data from multiple systems in order to simulate and predict the future.
  3. Over the next three years, analytics will mature along a third dimension:
    1. from standalone analysis to systems supporting a collaborative decision process that brings multiple people together to analyze, brainstorm and make decisions
    2. from structured and simple data analyzed by individuals to analysis of complex information of many types (text, audio, etc.) – the best example in this category would be Apple’s new TV, which will use Siri to help viewers decide what to watch based on their mood

Additionally, analytics is beginning to shift to the cloud, which enables high-performance and grid computing. Given what we have seen this year, I strongly believe that in 2012 analytics will increasingly focus on decisions and collaboration. It will provide simulation, prediction, optimization and other analytics, not simply information, to empower even more decision flexibility at the time and place of every business process action.

Big Data:

Data volumes are increasing exponentially day by day, and the complexity of formats and delivery speeds has also exceeded the capabilities of traditional data management technologies. These changes have increased the need for new, simpler technologies to manage the volume of data. The market now includes many emerging technologies with the potential to be disruptive (e.g., in-memory DBMSs). Analytics has become a major driving application for data warehousing, with the use of self-service data marts and cubes. One major implication of big data is that users will no longer be able to put all useful information into a single data warehouse. Logical data warehouses, which bring together information from multiple sources as needed, will soon replace the single-data-warehouse model.

Cloud Computing:

Cloud is a disruptive force with the potential for broad, long-term impact in most industries. While the market remained in its early stages in 2011, throughout 2012 the full range of large enterprise providers will be fully engaged in delivering offerings to build cloud environments and deliver cloud services. As companies like Microsoft, Amazon, and Salesforce continue to expand their cloud offerings, users will see competition heat up and enterprise-level cloud services increase.

Enterprises are moving from trying to understand the cloud to deciding which workloads to implement on cloud services and where they need to build out private clouds. Hybrid cloud computing, which brings together external public cloud services and internal private cloud services, as well as the capabilities to secure, manage and govern the entire cloud spectrum, will be a major focus for 2012.

Rupmeet works with Optimus in the area of Business Analytics. Please feel free to contact us to setup a brief introductory meeting.

Cloud Data Warehousing Advantages


Data warehousing in the cloud

“Why create a data warehouse in the cloud?” I have answered this question many times at networking events.

There are a number of cloud data warehousing advantages:

  • Cost savings
  • Total cost of ownership
  • Mergers and acquisitions
  • Standardization
  • Better understanding of competitive landscape
  • Improving product and service quality

Many enterprises are finding that data warehouse appliances can significantly reduce the total cost of ownership. All resources, including expensive networking equipment, servers, and IT personnel, are shared, resulting in reduced costs, especially for small to mid-sized applications and prototypes.

In the case of mergers and acquisitions, duplicate financial data warehouses need to be rationalized. An enterprise or government agency may want to rationalize various warehouses based on disparate warehouse technology platforms by moving to a standard platform. Cloud computing enables companies to shift money from capital expenses to operating expenses, enabling the customer to focus on adding value in their areas of core competence, such as business and process insight, instead of building and maintaining IT infrastructure.

In short, cloud computing allows you to focus your money and resources on innovating. “Regardless of the reason for the migration, in every case the reporting and analysis supported by the migrating data warehouse must continue to run seamlessly.” This is a common need expressed by my professional network. Cloud computing platforms such as Microsoft Azure provide many of the core services that, under traditional development models, would normally be built in house. These services, plus templates and other tools, can significantly accelerate the development cycle.

“What should be the solution?” is the subsequent question that arises during conversation. “Use data virtualization to insulate reporting users during data warehouse migrations” is my to-the-point answer. Data virtualization in the cloud provides a more agile integration approach that overcomes data complexity and disparate silos, giving the business the timely data it needs to meet ever-changing requirements. Provisioning on demand enables faster set-up and tear-down of resources on an as-needed basis: when a project is funded, you initiate service; if the project is killed, you simply terminate the cloud contract.
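The provision-and-terminate lifecycle described above can be sketched in a few lines. The `CloudWarehouse` class and its hourly rate below are hypothetical illustrations of pay-only-while-provisioned billing, not any vendor's real API.

```python
from dataclasses import dataclass

@dataclass
class CloudWarehouse:
    """Hypothetical on-demand warehouse: the meter runs only while provisioned."""
    hourly_rate: float
    hours_used: float = 0.0
    active: bool = False

    def provision(self):
        # Project funded: initiate service (fast set-up, no capital expense).
        self.active = True

    def run(self, hours: float):
        # Usage accrues cost only while the resource is active.
        if self.active:
            self.hours_used += hours

    def terminate(self):
        # Project killed: tear down and stop the meter immediately.
        self.active = False

    @property
    def cost(self) -> float:
        return self.hours_used * self.hourly_rate

wh = CloudWarehouse(hourly_rate=2.50)
wh.provision()
wh.run(40)       # one week of nightly loads
wh.terminate()
wh.run(40)       # no charge after tear-down
print(f"${wh.cost:.2f}")  # → $100.00
```

The key contrast with a capital purchase is that terminating the contract ends the cost immediately; there is no depreciating asset left behind.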

Data virtualization using cloud computing is the opposite of traditional data warehousing. The cloud is more vibrant and valuable in that it is elastic: customers can scale up or down, with dynamic resource allocation ensuring efficient use of shared processing and storage resources. It wants data to be location independent, transparent, and function-shippable, whereas the traditional data warehouse is a centralized, persistent data store. A run-time metadata repository will be needed in order to register and access data sources as a service. As data volumes keep growing, this distinction matters more and more. Still, with cloud computing (as with web services), the service, not the database, is the primary data integration method.

The ability of the cloud to load data quickly allows the vendor to work with larger data sets during smaller time windows as dictated by the customer, as well as service more customers at any one time. While smaller companies may only refresh the data in their data mart weekly, larger customers typically refresh data daily. Moreover, the transactional database of such larger customers can only stay offline for short time periods during which data must be transferred to the data marts.

Data loaded in the data warehouse has to be organized in the best possible way to enable the optimal execution of queries. The managers of on-premise data warehouses and marts constantly look at the queries executed against their databases to determine how to best organize the data to achieve best query execution times. Some of these optimizations can be performed automatically by the database management system but most require manual intervention. Having full control of the database management system enables the SaaS BI vendor to better optimize the organization of the stored data.
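The data-organization work described above can be shown in miniature: pre-organizing data (here, a hash index on a frequently queried column) turns a full scan into a direct lookup, which is the same trade a warehouse manager makes when adding an index. This is a toy illustration of the idea, not a DBMS internal.

```python
# Toy illustration: a hash index on a frequently queried column
# replaces a full-table scan with a constant-time lookup.
rows = [
    {"order_id": 1, "region": "west", "amount": 120.0},
    {"order_id": 2, "region": "east", "amount": 75.5},
    {"order_id": 3, "region": "west", "amount": 300.0},
]

# Full scan: examine every row (what an unoptimized query does).
def scan(region):
    return [r for r in rows if r["region"] == region]

# Index build: a one-time reorganization of the stored data.
index = {}
for r in rows:
    index.setdefault(r["region"], []).append(r)

# Indexed lookup: same answer, no scan.
def lookup(region):
    return index.get(region, [])

assert scan("west") == lookup("west")
```

A vendor with full control of the database can build and rebuild such structures automatically as it observes the queries customers actually run.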

Finally, the expressiveness of the language used to query the data mart in the cloud determines the range of data analyses that can be performed and the reports that can be created by business intelligence tools (like Microsoft SSRS). Cloud infrastructure also contributes to the speed with which queries are executed, which in turn delivers analytics reports quickly.

As a business intelligence vendor in Vancouver, OptimusAnalytics would be happy to share our experiences with you. To learn more about cloud-based data warehouse solutions, contact me directly at

(image credit: Goma on Flickr)

Windows Azure: Move your applications to the cloud


Set your apps up in the cloud using Windows Azure.

Many clients have been asking us about the Windows Azure platform so we’d like to take this opportunity to give a brief overview and let you know what we’ve been working on.

The Windows Azure platform is a platform-as-a-service offering from Microsoft that competes with Rackspace, Amazon Web Services, and Salesforce. It provides everything you need to move your applications to the cloud, enabling developers to focus on building great applications instead of worrying about infrastructure.

It supports a variety of languages and tools out of the box, including Visual Studio, .NET, Java, PHP, and Ruby.

As illustrated above, using the Azure platform you can deploy your application, database, services, and access control service to the cloud. You can then control who has remote access while serving your application from Microsoft’s globally distributed datacenters. The Azure platform automatically scales infrastructure up and down depending on load and charges only for what you use. There are no upfront costs, and organizations can even start with a free trial.
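The scale-with-load, pay-for-what-you-use model described above can be illustrated with a simple meter: instance count follows hourly load, and the bill reflects only the instance-hours actually consumed. The rate, capacity figure, and scaling rule here are made-up numbers for illustration, not Azure's actual pricing.

```python
# Hypothetical pay-per-use meter: instances scale with hourly load,
# and the bill is instance-hours * rate (illustrative numbers only).
RATE_PER_INSTANCE_HOUR = 0.12
REQUESTS_PER_INSTANCE = 1000  # capacity of one instance per hour

def instances_needed(requests_per_hour):
    # Scale out and back in automatically with load; minimum one instance.
    return max(1, -(-requests_per_hour // REQUESTS_PER_INSTANCE))  # ceiling division

hourly_load = [200, 1500, 4200, 900]  # requests in each of four hours
instance_hours = sum(instances_needed(h) for h in hourly_load)
bill = instance_hours * RATE_PER_INSTANCE_HOUR
print(instance_hours, f"${bill:.2f}")  # → 9 $1.08
```

With a fixed on-premise deployment you would have to provision for the 4,200-request peak around the clock; here the quiet hours cost a single instance-hour each.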

A few other key components worth mentioning are:

  • The Azure Platform Appliance: If you need to keep your applications and data in your own data center, you can set up a private cloud using the Azure Platform Appliance. It is essentially a suite of software and hardware to deploy on your own infrastructure, designed to be extremely efficient while providing massive scale.
  • The Azure Marketplace: a marketplace for components, applications, and even services that run on Azure.
  • The Azure Data Market: a market for buying and selling raw data. From real estate to demographics, these data sets are rich with information that can be utilized in your applications.

At Optimus we focus on Software Development, Software Testing, and Business Analytics. We have been working with our clients to move all three services onto cloud based infrastructure. It has proven to be reliable, scalable, and increasingly cost efficient.

To learn more about how you can move your applications to the cloud, contact us.

(image courtesy of Microsoft)

Benefits of Cloud Computing: Where Are the Opportunities?


Enterprise data centers moving to the cloud

Amazon, Google, Microsoft, and now Apple are battling for market dominance in cloud computing. This fast-emerging segment has companies of all sizes measuring the benefits of cloud computing.

The big players offer services ranging from software-as-a-service (e.g., hosted email) to platform-as-a-service (e.g., virtual desktops) to storage and computing as a service. Companies large and small can use these services on a pay-per-use model to access flexible, scalable, and powerful data centers.

Outsourcing is gaining popularity as companies move toward storing their data on secured public or private infrastructure. Tech behemoths such as Amazon are vying for market share, deploying faster servers and networking that are available on demand. Scalability and ease of deployment are key reasons why enterprise software is well suited to the cloud.

All big market players have made sizable investments in their unique product offerings.

  • Apple’s iCloud promises consumers seamless integration across all their devices and applications.
  • Microsoft has launched its Azure platform backed by a significant investment in web infrastructure and thousands of servers.
  • Google’s App Engine resides on Google’s own infrastructure and delivers more than 250 million page views each day.
  • Amazon Web Services are constantly evolving and being enhanced to include database, auto-scaling, and monitoring.

Each player can benefit from different revenue streams from the cloud computing boom, including monetizing their infrastructure and attracting new developers. The cloud wars have led to rapid reductions in the price of utilizing the cloud infrastructure.

The cloud wars have not only benefited companies seeking to trim their internal IT expense by moving their data to the cloud; they have also led to the emergence of small, focused, quality-oriented outsourcing companies such as Optimus. These companies can offer services on their cloud infrastructure at a lower cost than larger, more established peers such as Accenture. Optimus focuses on providing excellent, tailored services to all its customers, from large enterprises to smaller companies.

OptimusBI helps companies with their data management and business analytics needs. For more information on our services, please contact us at

Google Cr-48: Next Generation of Cloud Computing


The Google Cr-48 may just be a prototype, but it is also a glimpse into the future of cloud computing.

Imagine a typical day at the office. You’re sitting down with your morning coffee, doing the final preparation for your weekly status report. You’ve got 10 minutes left before your meeting and have just received the revised budget figures. You quickly update your presentation with the new figures as you watch the seconds tick away. Just 3 minutes left before the meeting starts so you reach to close your laptop and splash… coffee all over the keyboard. A few sizzles later and the screen is off.

Whoops! Not only was your presentation on your laptop, but you were also going to demo some new software. What are you going to do?

Simple: on your way to the meeting you grab a laptop from your buddy in accounting. He doesn’t have the same security access as you and can’t see your files, but it doesn’t matter. You simply log in with your credentials, and the lightweight operating system pulls all your data from the cloud. Every file, program, and setting is exactly as it was on your own laptop. Your presentation is even ready with the latest updates. Spilling coffee on your keyboard only slowed you down by the time it took to log into a new laptop: about 18 seconds.

Google has just launched a prototype of the future of cloud computing, the Cr-48, and even though it’s not perfect, it is surely a sign of things to come. It is based on the premise that web-based (or browser-based) applications will replace desktop applications. As browsers become increasingly powerful, rich web-based applications have emerged to replace traditional software.

The most common example of a web-based platform replacing installed software is email. Managing email is easier than ever using services like Hotmail and Gmail. This is especially true for enterprises as running an internal email system requires servers, patches, upgrades, experts and inevitably leads to storage issues.

Take Your Data with You

The new prototype from Google running Chrome OS works just like email. You log in over a network (either 3G or Wi-Fi) and are instantly in your customized machine. All of your bookmarks, files, extensions, and web-based applications are there. There is no installing programs, no reconfiguring settings, and no copying files. Once you’ve set it up, you can simply log in from anywhere and access your computer as if you were at home. If you ever need to replace your machine, all you do to set up the new one is sign in.

Additionally, you don’t need to worry about backing up your data or applying patches to your software because that’s handled automatically. If you corrupt your hard drive or spill coffee on your machine, it’s not a problem because your files are not stored locally. If you lose your laptop on the subway, it’s not a security concern (unless you have a weak password) because your laptop only has an OS on it.

Wondering where you “install” programs from? The newly launched Chrome Web Store, which you can log into right now using Google’s Chrome web browser to try out web-based applications and extensions.

This latest prototype from Google is certainly not ready for mainstream adoption, but CIOs will be watching it carefully because the potential is huge. The cost of a laptop running Chrome OS will be significantly lower than a system running Windows or Mac OS, plus there will be huge savings in managing the back end. The difficulty will be in aligning the opportunities with business processes to ensure a smooth workflow. Smaller businesses can adopt quickly and take advantage of the many free services being offered.

The other major opportunity will be delivering the same applications to mobile devices. The adoption of smartphones and tablets has driven demand for mobile access to both consumer and enterprise solutions. Citrix Receiver is one such example, enabling a company to host applications and deliver them to clients on any device, including Chrome OS.

Watch this video below to see the announcement from Google and a live demo of the prototype.