10 Tips for Success with Big Data – Part 1

The age of big data has been a boon for anyone in the business intelligence world. Creating reports, apps and visuals that motivate decision-making processes is simply easier when you have a large data set to draw upon. There are, however, a number of issues to keep in mind. Here are 10 tips for anyone who’s looking to more successfully utilize big data for BI purposes.

1. Maintain Data Formats

There’s a temptation when putting data to use to reformat the data set for greater ease of use. It’s not unusual, for example, to retrieve a data set remotely as JSON and then discard the keys because the information they contain is seen as useless. Those key-value pairings often offer insights that may only become useful further down the road. By maintaining the original format of a data set, you preserve information like time sequences and references. That can be invaluable if you’re later asked to track down a specific data point as part of a discussion.
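
As a minimal sketch of the idea in Python (the payload here is hypothetical, not from any particular API), compare keeping the original pairings with the tempting values-only shortcut:

```python
import json

# Hypothetical payload from a remote API; the keys are ISO timestamps.
raw = '{"2023-01-05T10:00:00Z": 42, "2023-01-05T09:00:00Z": 57}'
records = json.loads(raw)

# Tempting shortcut: keep only the values and discard the keys.
values_only = list(records.values())  # the time sequence is now gone

# Safer: keep the original key-value pairings alongside any derived view.
ordered = sorted(records.items())   # the keys let us reconstruct the sequence
earliest_key = ordered[0][0]        # still possible to find a specific point
```

If someone later asks which reading came first, the values-only list cannot answer; the preserved pairings can.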

2. Security Matters 

It can be a lot of fun splashing around in a big pool of data, but it’s also important to be prepared to be told no and to tell others no. Security protocols exist for a reason. Your BI systems should already have industry-standard security in place; don’t undermine it by neglecting to set authorization limits. As exciting as it can be to share data, it’s always critical to be sure that you and those you share it with have the right to access it.

3. Price Traps

At the petabyte scale, the cost of storing and transmitting data can be staggering. It’s easy to buy into vendors’ argument that big data costs mere pennies per gigabyte. Likewise, vendors love to price SaaS systems on a per-user basis. You always want to make sure that your operation is paying the lowest feasible prices for its BI systems, and that often means negotiating with vendors. Whenever possible, try to arrive at flat prices or low rates with strict limits in place.

It’s also important to bear in mind that many vendors are hoping you’ll go over your limits. Make sure your BI implementations shut down access to resources before they cause your fees to go through the roof. Remotely hosted storage and processing providers have built their business models on the belief that people rarely show restraint when playing with a fun toy. Contain yourself.

4. Don’t Let Data Delay Decisions

There’s always a pressure in the world of BI to have the freshest data. The problem with this attitude is that it can inhibit decision-making processes and ultimately undermine the value of analysis. Your operation cannot afford to be impaired by a ceaseless wait for new data.

De-emphasizing the importance of using the absolute freshest data can also help you realize speed and efficiency gains. For example, it’s easy to see how caching your data can improve performance. It does, however, come at the cost of forgoing access to the absolute freshest data. If the arrival of a small sliver of new data can upend the decisions being made from it, that says more about the volatility of what’s being studied than about the freshness of the data.
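
The caching trade-off can be sketched with a simple time-to-live cache. This is only an illustration of the idea; the names and the five-minute TTL are assumptions, not taken from any particular BI product:

```python
import time

class TTLCache:
    """Minimal time-to-live cache: serve slightly stale data instead of
    re-running an expensive query for the absolute freshest numbers."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (timestamp, value)

    def get(self, key, fetch):
        now = time.time()
        entry = self._store.get(key)
        if entry and now - entry[0] < self.ttl:
            return entry[1]          # cached; possibly a few minutes old
        value = fetch()              # expensive call against the source
        self._store[key] = (now, value)
        return value

calls = 0
def fetch_report():
    # Stand-in for a slow warehouse query.
    global calls
    calls += 1
    return {"revenue": 1_000_000}

cache = TTLCache(ttl_seconds=300)
first = cache.get("daily_report", fetch_report)
second = cache.get("daily_report", fetch_report)  # served from cache
```

The second request never touches the source system; the price is that its answer may be up to five minutes old.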

5. Don’t Discard the Outliers

The presentation of most large data sets almost always leads to questions about outliers. There’s a strong temptation to discard the outliers and present the cleanest visuals possible. The willingness to retain outliers, however, can be a signal of quality and honesty. Outliers can and should be discussed when they’re discovered. If you expect to be asked about the lack of smoothness that outliers can create, the wisest choice may be to explain that throwing them out inhibits discussion and is frequently a sign of manipulation. It’s always better to follow the data wherever it goes. 
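
One common rule of thumb is the interquartile-range check, which can flag outliers for discussion rather than silently dropping them. The quartile computation below is deliberately simplified for illustration:

```python
def flag_outliers(values, k=1.5):
    """Flag points outside k * IQR of the quartiles, keeping all data.

    Uses a crude quartile estimate (index-based) for brevity.
    """
    s = sorted(values)
    n = len(s)
    q1, q3 = s[n // 4], s[(3 * n) // 4]
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [(v, v < lo or v > hi) for v in values]

data = [10, 12, 11, 13, 12, 95, 11, 10]
flagged = flag_outliers(data)
outliers = [v for v, is_out in flagged if is_out]
```

Every point survives; the flagged ones simply arrive with a label inviting the question of why they exist.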

Stay tuned for part two of this two-part blog post.

BI: Cloud vs On-premise

The model for how companies acquire and field software for business intelligence purposes has changed dramatically in the last five years. One major factor driving this change has been the emergence of cloud-based computing options and software as a service. Modern operations are increasingly comfortable having large amounts of their data, processing capacities and even actual software hosted remotely. One of the biggest challenges in the BI sector today is figuring out just how much of your infrastructure you want to have off- or on-premise. We’ll take a look at the pros and cons of both approaches and how a mixed environment may also be beneficial.

Security

Comparing cloud to on-premise options in terms of security can be a bit tricky. Cloud-based solutions benefit immensely from regular updates by the vendors supplying them. Poorly updated software is one of the leading security risks in almost every sector of the business world. On the other hand, large farms of remotely hosted systems are very attractive targets to hackers by virtue of their size alone. SaaS providers see themselves in a perpetual battle against interlopers, but they also put significant resources to work on a daily basis because their reputations are on the line. The awareness of the risks in the cloud sector goes a long way to alleviate concerns about security. In fact, all the major recent data breaches reviewed by Bankrate occurred at on-site data centers rather than in cloud-based systems.

Pricing

When companies start thinking about moving to off-premise solutions, the first thing that crosses their minds is often pricing. On-site solutions often carry very steep upfront costs, while cloud-based systems are typically billed on a recurring basis. It’s also important to appreciate that every piece of software that’s run on-premise has to be hosted on a piece of hardware. Installing, maintaining and replacing those systems are costs that accumulate rapidly. When a server dies at a SaaS provider, that’s a vendor problem, and it’s usually handled quickly and with no interruptions of service.

Costs also can vary dramatically among vendors. For example, Microsoft’s Power BI is integrated seamlessly with its Office 365 products, and it’s available to small businesses in a free version. The paid version starts at $9.99 per user, per month. Other vendors, such as SAP, sell products with pricing options that are opaque and require negotiation.

Setup

Getting up and running with BI systems for the first time is challenging regardless of whether you elect to use a cloud-based or on-premise solution. The big difference is that on-site systems require significantly more hardware setup. They may also demand multiple visits from the vendor to verify that everything is running as expected.

For companies that are looking for a fire-and-forget solution, SaaS wins hands down. The dashboards built for the cloud-based systems are typically designed to offer cross-platform compatibility. That means you can have folks in the office using Windows desktops and sharing information with individuals in the field who might be working on mobile platforms based on iOS or Android.

Customization

For all the setup headaches that on-premise systems can entail, they tend to offer much better customization. Cloud vendors, however, shouldn’t be discounted. There’s a good chance that you won’t require an on-site, customized setup unless your company is heavily into building its own solutions.

One of the main drivers of the adoption of cloud computing is the need for increased cooperation within companies. By moving services and data to the cloud, many businesses have dramatically increased the availability of information throughout their operations. For companies that have interests spread across the country or globe, this can reduce friction.

Scalability

Every computing system encounters bottlenecks. SaaS solutions are often bottlenecked by network capacity and speed. On-site solutions are frequently limited by the available hardware. Cloud services providers offer solutions that allow clients to throw more resources at big problems when they require speed and power. This sort of instant scalability is hard to replicate at your own location without building your own supercomputing cluster.

Mixed Solutions

Few companies have wholly embraced cloud-based ecosystems, and almost all operations are likely to end up using some mixture of both solutions. Many offerings take this fact into account. For example, Tableau mixes cloud storage and computing options with a desktop application that blurs the line between on- and off-site solutions. It’s also designed to let users import local files and connect to remotely served data. SaaS providers are increasingly accommodating toward the needs of firms that have ingrained desktop computing cultures.


SaaS systems are growing in popularity, and many companies are trying to move as much of their BI infrastructure into the cloud as possible. As vendors continue to prove themselves capable of handling concerns about security and availability, this trend is likely to continue over the next five years. Increased competition in the sector should also bring lower prices and more options, and that’s a win for nearly everyone looking to improve their BI resources.

Setting up a solution can be challenging, especially if you don’t have the right specialists on your team. Optimus Information has worked on implementations of both SaaS cloud BI platforms and on-premise solutions, and we have the right BI technical experts to move you forward without the financial burden of a full-time BI staff.

If you have questions about whether a cloud or on-premise BI solution is right for you, contact us today. We’re always happy to help.

Product comparison: Power BI, Tableau, QlikView and SAP

Few topics that come up during a business meeting are at once as simple and as challenging to address as the question of how the company is doing. It wasn’t long ago that someone involved in the process would toss around printouts of a few Excel sheets and declare the discussion over. In recent years, however, business intelligence has taken on a life of its own. Even in small operations, hard data is in-demand.

There are four popular tools used to handle BI work these days. They are:

  • Power BI
  • Tableau
  • QlikView
  • SAP

Let’s take a look at the pros and cons of each option.

Power BI

Power BI gains a lot of its strength from the fact that it’s built by Microsoft. That means companies that are deep into the MS Office ecosystem can quickly use their existing data from Excel and Access to produce new insights. It’s built to handle data from a number of different inputs, and you can create everything from widgets to full-fledged dashboards with ease using Power BI. It’s a popular choice for companies that need seamless integration and dedicated support.

For those who want visualizations and ease of use, Power BI is a great choice. It was originally designed as an add-on for Excel and has grown into a fully formed product in its own right. Power BI is a relative newcomer, and it’s generally not seen as a good option for companies that need to power through big data sets or produce complex visualizations. It does, however, provide support for R-based analytics, a programming language popular with statisticians and data miners, so advanced data people shouldn’t instantly dismiss its usefulness.

Microsoft is moving toward a monthly pricing model for most of its major business applications, and Power BI is no different. The basic version is available for free, and it’s designed to process 10,000 rows of information each hour. The paid version is available for $9.99 per user, per month. Both allow you to create dashboards that are compatible with Windows, Apple iOS and Android devices.

Tableau

Tableau is designed to put a lot of analytic power in the hands of people who don’t have PhDs in stats. It handles many types of analysis that are often thought of as advanced functions, including trend analysis, regression modeling and correlations. It also allows you to quickly derive metadata from your existing information sets. You can simply group points together and produce new visualizations with a few clicks.

One thing Tableau does incredibly well is mapping. It comes with worldwide map data out of the box, and it also does a very good job of recognizing what formats correspond to what countries. You can readily throw together a visualization of your company’s country-by-country performance in Tableau by simply importing an Excel sheet. Within a matter of minutes, you’ll have a presentation element that looks professional.

Tableau is widely considered the BI industry standard for visualization projects. It is, however, lacking in deeper analysis capabilities, so many users pair it with a separate analytics engine because it’s so hard to give up its gorgeous visual output. Tableau is available in both free and paid versions.

QlikView

If you’re looking for something that’s powerful and able to rapidly mine data, QlikView may be the option for you. It has a bit of a reputation for a steep learning curve, but QlikView also is likely the bit-for-bit analysis champ among the big BI tools on the market today. It’s a popular choice for users who value analytics above visualization.

QlikView does visualizations, and it even can produce dashboards. It’s just not the preferred choice for those who want to produce beautiful data. If you’re looking for simple charts, it’s a solid contender.

QlikView is available in a personal edition that’s free. The personal edition is fully featured. Both the free and paid versions allow cloud deployment of applications on the company’s QlikSense platform. The paid version’s business model is built on allowing a larger application size, more storage and unlimited users.

SAP

SAP BusinessObjects is the preferred BI tool for companies that require predictive capabilities. It’s designed specifically for managing customer relationships, tracking finances and handling supply chains. If you’re looking for a program that can quickly compile your data and answer basic questions about profitability and efficiency, SAP is likely to be your BI weapon of choice. The package also comes with a variety of reporting and data visualization options.

Where the SAP offerings take a hit is pricing. They’re far from being as cost-effective as the other BI options on this list. The range of offerings is diverse, and the company does not make a point of disclosing licensing fees upfront. Generally, you should expect to spend a couple of thousand dollars to get started with SAP. The opaque licensing regime and poorly streamlined family of programs often turn potential users away from SAP.


There isn’t a clear winner for the biggest BI tool in all the land. If clean and effortless visualizations are your thing, Tableau will be your choice. QlikView offers raw power. Power BI provides easy integration. SAP is a great choice, but only if money is no object.

If you need help figuring out which option is the best for you, ask one of our experts. We can help you choose and get you started in the right direction.

Contact us today

Best Practices for Successful Business Intelligence

Few, if any, companies doubt BI’s value to their internal and customer-facing operations and the crucial role it plays in making accurate, lightning-fast tactical and strategic decisions that affect the business’ competitive position. The difficult bit is ensuring that BI’s deployment in your company is fetching the best data, crunching it in a timely fashion and presenting actionable insights to the right decision makers at the right time.

The team at Optimus has traversed this path before and has several tips that will ensure a quick start and maximum return on your BI projects.

Get Everyone On Board

Once you develop a unique strategic vision around your BI project, share it with anyone who has a stake in its outcome. The ensuing discussions will crystallize short-term tactical and long-term strategic goals and produce a detailed roadmap with a timeline for development and installation.

Understand Your Data’s Context

As project functional requirements are shared across stakeholders, issues are bound to arise about which data is procured and how it is captured, stored and processed. These issues will differ depending on the nature of the data and its source. For instance, customer data must meet regulatory compliance requirements. Other data may carry restrictions on distribution, and still other data streams will have format or periodicity characteristics that create technical problems.

Assigning technical personnel to the requirements gathering process as early as possible with a mission to uncover such issues will increase the likelihood that they are dealt with in the most efficient manner.

Aim for Agility

Approach the next phases – requirements gathering, specification, development, testing and deployment – with an eye toward flexibility and agility. Although it is theoretically possible to anticipate all the omissions, glitches and miscalculations in the requirements stage, in practice it never works out that way.

To mitigate this fact of life, take a page from agile software development methodologies. Break the project into small, independent components and apply short iterations over progressively more functionality. This approach yields continuous results while avoiding lengthy and disruptive project halts when inevitable technical hurdles arise.

Produce Measurable Solutions

At all times, keep in mind that the project should be producing not just functionality but measurable solutions to problems within the company. It is easy to check off features, but more difficult to judge the value of each feature to the end solution, which may be, in part, judged by subjective criteria.

Various methods to deal with this aspect include the use of working wireframe mock-ups along with usability tests, identifying and validating action loops, the use of responsibility matrix models and the development of user personas. Separately or together, these techniques are used to validate the BI system’s usefulness and ease of use.

Ensure Maximum Accessibility

One of the most valuable improvements in BI over the decades has been the increased accessibility of business intelligence tools across an enterprise. Whereas in the early days of BI, data experts were required to collect, massage and interpret data, modern BI software automates those processes. Today, non-experts can develop actionable reports and dashboards relevant to their job function right at their desks.

One caveat is in order, however. Although your new BI project should facilitate maximum accessibility as an end goal, do not distribute it widely until the bugs are shaken out. Furthermore, where possible, augment existing processes, workflows, apps or websites with your solution in lieu of creating a standalone app that is easier to ignore. On the other hand, develop a mobile app version of your project for those employees who typically work away from the office.

Plan for Training

Training is essential to your project in order to raise awareness among employees, tout its value and increase usage. Assigning one or more technical evangelists to spread the word is a superb way to start. These project ambassadors must be backed up with a suite of training material including downloadable multimedia content, ready-made presentations for department leaders and face-to-face classes as necessary.


Start your BI project development by ensuring everyone is on the same page with regard to the vision and the benefits of your BI solution. Collect stakeholder input to ensure a fuller set of requirements.

Deal with technical issues to your solution, its development and deployment through short-cycle agile methods, while bearing in mind it should result in maximum accessibility and collaboration when it is finished.

Finally, plan well ahead for the training necessary for maximum uptake and efficient usage of the project whether it be a plug-in, standalone app or a web-based tool.

Transforming your company into a fast-acting, data-driven organization that stays two steps ahead of the competition is not a task for the faint of heart. It is made infinitely harder by not following several tried and true approaches that raise your chances of success.

What is Business Intelligence?

Modern companies are compiling a huge amount of data on a daily basis. Answering the question of what to do with all this information is the domain of business intelligence. BI covers a large range of functions within a firm, including everything from database compilation to the creation of presentation materials. BI tasks typically center on using powerful software suites to quickly analyze and present a company’s data to stakeholders in a fashion that allows them to readily make decisions. The ultimate goal of business intelligence is to render the decision-making process as efficient, accurate and fast as practicable.

BI Questions

Business intelligence processes should allow a company to speedily address five key concerns:

  • Show what happened
  • Track what is currently happening
  • Explain why these things happened
  • Offer insights into what may happen
  • Inform decision makers about specific beneficial options

Most companies have pretty good raw data regarding what has happened. Even small businesses with little BI infrastructure can point toward sales receipts and inventory totals to frame a reasonably constructive narrative about the past.

Where real competitive advantages are emerging is in tracking what’s currently happening. A company with a serious BI system in place can analyze sentiment and trends in real time, allowing it to respond faster when critical inflection points are passed. For example, a company can monitor social media outlets and its own customer service records to detect consumer discontent before it blows out of proportion. Similarly, companies can observe trends in real time and task team members with exploring them before competitors jump on them. A solid BI process can give a business a huge first-mover advantage even in highly competitive niches.

Benefits of BI

Anything a company does has the potential to be done better with the benefit of business intelligence. Some operations have entire business models that are based on crushing their competitors by simply streamlining essential processes within their operations. Imagine operating in a market where margins are very thin and inventory requirements are huge. It’s easy to see where a company that can use BI systems to economize its processes for handling inventory could outpace a less nimble competitor quarter by quarter.

Big data’s a big deal, right? It’s not enough, however, to have a lot of data. Lots of companies are sitting on databases full of information, and very little of it is actionable if it’s never properly compiled and presented. The main benefits of BI arise from how data is compiled and translated into actionable information. There’s already an arms race among the biggest companies in the world to produce the best BI, and corporations are spending billions of dollars to gain millisecond advantages over their competitors.

Business intelligence can be applied to a lot of highly specific problems. For example, many financial institutions now profile who their ideal customer is. They make a point of sending offers to those individuals who they feel will form long-term relationships with their companies. This requires having BI processes in place to filter through existing information about current customers in order to produce accurate predictions regarding who will be the best future customer. It also demands a commitment to running new experiments all the time to see what insights are waiting to be gathered.

Importance of BI

Business intelligence is increasingly becoming a swim-or-die proposition. The analytics boom in Major League Baseball, for example, allowed several perennial losing teams to rapidly ascend by making smarter decisions. Teams could gain an edge in a variety of ways, including how they scouted prospects, which free agents they spent money on and how they managed players’ health. The biggest name in the sport, the New York Yankees, is still struggling to catch up in the analytics game against several previously weaker opponents.

Failing to build a solid BI infrastructure in the modern business world is tantamount to unilateral disarmament. If one convenience store chain is monitoring the total customer experience while its competitors aren’t even tracking equipment failures, it’s easy to predict who’s going to thrive. If one retailer is fumbling around trying to standardize its data while another company has a smoothly implemented system in place, the company with the better implementation is going to be able to respond faster to problems and opportunities.

Good BI isn’t just elemental. It’s evolutionary.


Business intelligence requires a broad dedication to processes across an entire operation. It requires putting in software systems and standards with the intent of compelling everyone to conform to their use. It also demands a willingness among higher management to listen to the insights that are bound to bubble up from below. Building a BI infrastructure is no longer a choice for many companies because competitors are either already doing it or bound to start soon.

Dos and Don’ts of Software Outsourcing

Outsourcing software development can be a great way to save your company or organization time and money. However, if done haphazardly, it can also become a source of great headaches. It’s easy to make common mistakes, such as not clearly defining goals or expecting the process to magically produce results. It’s much harder to manage the process and see that it achieves the results you desire. Here are some dos and don’ts for you to consider as you get involved with software development outsourcing.

Do Define Your Goals and Metrics

One of the hardest things for any outsourcing services provider to deal with is a moving target. It’s important that you have clear planning documents in place that outline the goals for each project. Likewise, you also need to use clearly understood and widely trusted metrics to measure the success of a project. If you and your outsourcing provider agree on the measures of success, there’s a much better chance you’ll hit your target.

Don’t Fire and Forget

Handing a pile of specs to an outsourcing firm does not count as passing the project along. The process is inherently collaborative. It’s important that everyone on your side, the client side, stays engaged. When mockups come in, check them. When requirements are reportedly fulfilled, verify them. If errors occur, correct them. Do not plan to hand off the specs and just come back months later expecting to see a finished product.

Do Put Processes in Place

Every aspect of your project should have a clear process in place to ensure that both you and your outsourcing company understand what it means for a step to be considered complete. This means verifying that mockups are passed along and that both parties sign off on each step. Requirements for each stage should also be outlined clearly, and all parties involved should agree to them. Nothing dooms a project as fast as an ambiguous understanding of what it means for a step to be truly completed.

Do Communicate Well

This can be trickier than it sounds. Even if you’re running a North American firm that’s outsourcing to another firm on the continent, time differences matter. If your vendor doesn’t have flexible hours, then you’re likely losing collaboration time. For example, if a company in New York City is outsourcing to a firm in Vancouver, it’s important to know that the Vancouver folks are prepared to have someone show up early on some days to touch base with the New York crowd before the work day gets rolling.

Don’t Judge on Price Alone

One of the greatest temptations of outsourcing software development is to simply take the lowest price offered. This is a terrible idea. There are many tradeoffs required to get to the lowest price. Will you be dealing with people who speak your language natively? If the vendor is on the other side of the planet, will they even be able, at that price point, to have someone communicate with you live at a convenient time?

Do Expect a Performance Curve

If you’re beginning your first project with a specific vendor, it’s unrealistic to assume they will be able to just hit the ground running. Every outsourcing firm requires time to make sense of your company’s culture and figure out the best way to achieve the results you seek. Getting frustrated by this process can be very costly. There’s little benefit that can be expected by bouncing from vendor to vendor. Give vendors time to acclimate.

Don’t Outsource Core Functions

Some parts of your company just have to work. The best way to see that your operation works the right way is to ensure that core functions stay in-house. For example, if you ran a vinyl sign company, it would be insane to outsource your graphic design work. On the other hand, it might be perfectly reasonable to outsource backend coding for your website.

Do Prepare for the Future

No matter how strong your relationships may be with your current set of vendors, you need to know the market and be prepared for the future. If the day comes that your preferred vendor can no longer handle the scale of your work, you want to have a list of vetted alternatives in place. You can lose months of project time trying to find a new vendor.

Don’t Use Outsourcing as a Stopgap

Adding an outsourced software development team to your organization means incorporating it on a long-term basis. If you drop a project on an outsourcing company and then turn around expecting in-house people to maintain or even expand it, the results will be rubbish, and it may foster discontent among the in-house team. Look at long-term partnerships where certain projects or tasks stay with your outsourcer and others stay with your in-house team. Your people handling the outsourced software will then become comfortable dealing with the in-house people handling core functions, and vice versa.

Don’t Rely on Technology Alone

There’s a lot to be said for getting in at least one face-to-face meeting, ideally more as required. Your vendor and you can both demonstrate commitment to the project. You also can interface more quickly. If at all possible, try to make in-person meetings part of the process.


Outsourcing is an amazing tool for any company to have access to. It is important, however, to appreciate that it’s not magical. It’s a process that your organization has to fully accept and integrate. With the right checks and balances in place, you can ensure that your outsourced software development efforts achieve the results you want.

Ultimately, successful IT outsourcing comes down to finding the right partner for your organization and its IT needs. Optimus Information has become a trusted partner to companies of all sizes and in all verticals; we know what works and what doesn’t when it comes to your IT challenges and working with your organization.

Contact us today for your next IT project. We’re always happy to help.

Start outsourcing effectively. Download our How to Overcome IT Outsourcing Challenges whitepaper.

Tactical Outsourcing vs Strategic Outsourcing

The term outsourcing scares people for various reasons, but there is no reason that it should. Outsourcing is very effective for certain situations, especially when developing software solutions. For a company to grow its internal IT department, it must search for a qualified candidate and go through the long process of interviewing, hiring, and training the new employee. This can take a long time and cost the company a lot of money. The alternative to this is outsourcing.

There are two main types of outsourcing that you can find for your IT needs: tactical and strategic.

Tactical Outsourcing

Tactical outsourcing refers to hiring a firm to perform specific development functions as part of your existing software development process. You retain oversight of the project, giving you more control over the process. This is very helpful if you need a project done on short notice because you can avoid the long process of searching for the right candidate.

Tactical outsourcing is also helpful when you have a short-term need for a highly skilled developer in a specific technology that you do not normally use. A developer with the right skill set may be very difficult to locate, so going through a skilled tactical partner can mean fast turnaround and lower overhead than hiring internal resources.

With tactical outsourcing, you retain the planning process, including gathering requirements from stakeholders and designing the system that the developers will build. This increases management overhead, but it also gives you more control and more visibility into what is happening. The outsourced developers become familiar with your processes and answer to you throughout the engagement.

Strategic Outsourcing

Strategic outsourcing involves partnering with an IT company that provides top-down services. They handle the entire planning and development process, leaving you free to focus on other aspects of the business. This appeals to some companies because they can pay per project instead of per hour of work.

With either type of outsourcing, you should address a few key points with your prospective outsourcing partner. The first is documentation because you never know what could happen in the future. If a support issue comes up or you need new functionality added to an existing product, you will need proper documentation to know how the product is supposed to work. Ensuring that your partner provides proper documentation, both in the code and out, can be vital to handling these situations.

You should always provide full technical information about your requirements to your outsourcing partner. Leaving out any details could cause severe problems as you go through the process. Always remember to include all of your key decision makers and stakeholders in discussions to ensure that everybody’s goals align properly. Repeated changes over time can greatly increase the cost of any project, so planning everything up front is vital to a successful partnership.

You should always examine your prospective partner’s portfolio, looking for consistent, high-quality performance over time. Not all outsourcing companies are equal, and even great companies may not be the right choice if their goals do not align with yours. It is also important to note for what industries the company has worked. If the company has worked for other companies in your industry, it could mean that they are already familiar with many of your processes and terminology, making communications much easier.


Both outsourcing solutions could work for you, depending on your situation. Many companies prefer tactical outsourcing because it allows them to maintain more control over the process and usually means less of a commitment. However, the trade off is significantly more involvement from the client. 

At Optimus, we find there is a lot to gain from ongoing strategic outsourcing partnerships, and clients typically reap more benefits: less hands-on management, higher quality work, faster delivery and greater flexibility.

For an in-depth look at strategic outsourcing, check out our guide.

Download our Guide to Strategic Outsourcing


What to Look for in an Outsourcing Partner

Bad experiences with outsourcing providers are often traceable to badly designed selection processes or deficient selection criteria. While you can find ample guidance online on building a robust selection framework, here we share the most important vendor attributes for comparing and contrasting outsourcing companies, to help you select the one that is right for you.

Sizing a Vendor to Your Project

When matching an IT provider’s capabilities to your project, size matters. If your organization can fund nine-figure deals, only a handful of companies can field such a deal, whereas for smaller projects you have many more choices.

The key is to find a provider of a size that will consider your deal to be a big deal. This significantly increases the odds that your project receives the attention it deserves by having their most talented staff assigned to it. Additionally, right-sizing usually provides meaningful accommodation in contract terms and professional treatment from the executive staff.

The risk of choosing an outsourcing company that is too small, however, is that they may not have a sufficient level of technical capabilities, skilled staff, certifications or experience to deliver what you hope to accomplish.

Local Presence with Global Delivery

If your company is based in North America, choose an outsourcing vendor headquartered there. Working within a similar cultural context, they will better understand your industry, business model, goals and processes.

Your company also benefits from local contractual protections should your project hit a serious speed bump. A local vendor can also often provide onsite staff, which improves communication and the timely escalation of critical issues.

However, vendors that also deliver from offshore locations will save you money. Furthermore, vendors with a global presence can interact directly with your own global sites and add shifts in other time zones that work collaboratively with your local staff, giving you around-the-clock development.

Consistency in Quality and Delivery

Until recently, the majority of IT outsourcing firms sold themselves mainly on cost and based contracts on hourly rates. These days, more companies compete on their ability to produce results. Those results should include both timely delivery and measurably high-quality products or services.

During your due diligence, evaluate the vendor’s past work and pursue references to gauge how well the vendor has delivered on their promises. Have a detailed discussion with their senior staff about how their corporate culture reinforces the importance of on-time delivery and high quality throughout the ranks.

When you are convinced they will deliver what they say they will, it is still prudent to start the relationship with one or more smaller projects of a few months in duration to validate their work and timeliness for yourself.

Communication Capabilities

Well-planned, thorough and frequent communication is critical when using an IT outsourcing vendor. This goes double if the company you select has offshore resources, since both time and language may present communication barriers.

How much, when and how each of you communicates with the other should be driven by the client. Both sides must identify primary contacts for specific areas, and each primary contact needs a designated backup for when they are unavailable. Daily meetings with program and development managers are not unreasonable, nor are weekly meetings with business decision makers or department managers. To gauge frequency, ask yourself how much time you can afford to lose should a process go astray.

Vendors uncomfortable with your communication plan should raise a red flag with you, since this is such an essential element in your business relationship.

Their Range of Skill Sets

Except for the largest IT organizations, most companies do not have all the personnel with all the right skill sets for every project. When evaluating vendors’ technical and process capabilities, strike a balance between broad and deep skills that align with your business and project needs.

If you hope for the vendor to work on more than one type of project, or you wish to establish a long-term relationship, then one with a broader range of skills may work out better in the long run. A possible drawback is that a project may come along that is a mismatch for the vendor’s skills, and quality suffers.

Many enterprises today recognize that one size does not fit all, especially when working with small to mid-size IT outsourcing companies, so they choose to multi-source these services. This can complicate internal management of vendors, but often the point solutions that smaller vendors provide are of higher quality, delivered faster and at the same or lower cost.


Proper selection of an IT outsourcing vendor will significantly augment your company’s strategy and operations. Lack of due diligence, however, often leads to negative consequences plus lost time and money.

Use the selection criteria above along with a robust process comparing business requirements against each company’s pros and cons. This will lead to asking the right questions and building a seamless working relationship with a talented development provider.

The Optimus Information model is designed to allocate the right mix of local and offshore resources to optimize expertise, speed and cost. We give development teams the ability to quickly add specialty skills without incurring long-term costs. Our successful track record speaks for itself, and we love to share past work we’ve done. Our global team is made up of a diverse range of experienced professionals, allowing us to work on complex solutions requiring a wide variety of expertise. The result for our customers is the ability to far better manage resource capacities and outcomes.

Contact us for your next IT project. We’re always happy to help.

Contact us now

Top Ten Software Development Outsourcing Trends for 2016

Originally, the primary motivation to outsource software development was lower labor costs, but business and technology trends continuing and emerging in 2016 are placing new requirements on outsourcers. When choosing an outsourcing partner, more and more businesses are looking for closer alignment with their business goals, greater flexibility and higher quality.

Thus, clients are evaluating outsourcing companies via increasingly sophisticated criteria. The smartest software providers are reciprocating by developing new service models while taking advantage of many of the same technologies driving these current trends.

1) Moving from Hours to Results

In order to ensure that enterprises are getting what they need for their money, most are now seeking out providers who operate on a results-driven model versus rates based on time. Furthermore, clients are demanding that payment schedules be based on satisfactory achievement of those results versus upfront fees or retainers.

2) Greater Flexibility

Clients are looking for providers who offer on-demand services without locking them into long-term contracts or volume commitments. This enables client companies to respond more efficiently to rapidly changing market demands. In response, development providers who are moving operations to cloud resources are the ones most likely to meet the increased demand for flexibility.

3) Utilization of DevOps Practices Continues Apace

DevOps continues to attract adherents as it goes mainstream in up to 25 percent of companies this year, according to Gartner. Most of the IT departments in these organizations are transitioning to a service center model. Service providers who already operate in this manner will more easily blend into these organizations’ processes and decision-making apparatus.

4) Security Risk Perception Increases

A key concern within any outsourcing strategy is security. With the growing presence of the Internet of Things and the potential for an exponentially larger attack surface, software development outsourcing companies must ensure that their own security vulnerabilities are addressed in a manner that will win the confidence of client decision makers. Demonstrating a solid track record and established policies is of high importance when selecting a vendor.

5) Managing Infrastructure as Code

Amazon’s AWS has enabled the application of software development change-management practices to development and deployment infrastructure, and AWS is making this paradigm steadily easier with new APIs and services. Outsourcers who adopt this practice are reaping large benefits in software support, testing and deployment efficiency by syncing server, storage and networking infrastructure to precise versions of the source code.
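The core of the idea can be illustrated without any cloud SDK at all: the environment is described as declarative data that lives in version control alongside the application, so every source revision maps to an exact, reproducible infrastructure definition. A minimal sketch, with hypothetical resource names and sizes:

```python
import hashlib
import json

def infrastructure_spec(app_version: str) -> dict:
    """Declarative infrastructure definition tied to an application version.

    The roles, counts and instance types below are illustrative examples,
    not a reference deployment.
    """
    return {
        "app_version": app_version,
        "servers": [
            {"role": "web", "count": 2, "instance_type": "t3.medium"},
            {"role": "worker", "count": 1, "instance_type": "t3.large"},
        ],
        "storage": {"type": "object", "bucket": "example-app-artifacts"},
    }

def spec_fingerprint(spec: dict) -> str:
    """Deterministic hash of the spec, so drift between environments
    (or between code version and deployed infrastructure) is detectable."""
    canonical = json.dumps(spec, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:12]

spec = infrastructure_spec("1.4.2")
print(spec_fingerprint(spec))
```

In a real pipeline the spec would be rendered into a provisioning tool’s input (for example, a CloudFormation template) rather than merely fingerprinted, but the payoff is the same: any change to infrastructure is a reviewable, versioned code change.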

6) Multi-Sourcing Technologies Impacts Integration

Client companies are utilizing a more complex mix of software products and services this year. This multi-sourcing of technologies presents in-house management challenges and has given rise to new vendor management offices (VMOs). The challenge for software providers is meeting new performance and integration standards set by VMOs; failure to comply may result in the outsourcer being dropped in the interests of streamlining operations.

7) Business Process Outsourcing Being Replaced by Robotic Process Automation

The software outsourcing industry in 2016 will continue to feel the influence of the rise of RPA. In fact, one of RPA’s touted benefits is the reduction of outsourcing, especially via cloud-based RPA services. Those outsourcers who can adapt by offering relevant automated services in the most responsive, scalable and efficient manner are the ones who can survive and profit from this trend.

8) Outsourcing Selection is Speeding Up

Along with the adoption of agile methodologies within software development, business decisions are also being made with more agility and higher velocity. Outsourcers will increasingly recognize this trend as more clients endeavor to close smaller deals faster in order to stay ahead of their competition.

9) Adept Companies Are Being More Selective with What They Outsource

Many organizations who originally turned to outsourcing to compensate for a lack of internal expertise and resources have grown more sophisticated over time. They are progressively learning to be more selective regarding what to do in-house versus handing off to an outsourcing provider. Organizations are looking deeper into what their core competencies are and what they can outsource to make themselves more efficient in-house. Their motivations are usually the desires for greater flexibility, responsiveness or cost reductions, all of which software providers need to be sensitive to in contract negotiations.

10) Outsourcing Company Accommodation Increasing

It is no longer the case that companies seek out only the lowest cost provider. Sophisticated outsourcing companies will respond tactically and strategically to all the trends discussed here to grow or to survive. This trend can be seen in the greater tendency for outsourcers to adapt and adjust terms or offer new services in an effort to deliver the best product and service.


The outsourcing industry is more fluid than ever this year with clients focusing less on price per se and more on results, quality, integration, security and agility from software development providers. As you adapt to your own fast-moving markets and the rise of paradigm-shaking technologies such as IoT and on-demand infrastructure, so do we. Optimus stays two steps ahead in order to support your business in all your software and IT requirements.

At Optimus, we consistently stay on top of these trends while leveraging the forces driving them to bring you the solutions you need. Contact us to help with your next development, testing, cloud, BI or mobile project.


How IoT Will Drive Big Data Adoption

According to Internet of Things true believers, the time is just around the corner when our cars, homes, appliances, TVs, PCs, phones and every other electronic or mechanical device in our lives will be spewing out data in all directions. That makes sense, since IoT devices – at least those now envisaged – are designed primarily to emit data, having minimal compute capacity of their own.

Cisco estimates that nearly 15 billion connected devices already comprise the nascent IoT, a number it projects will grow to 50 billion by 2020. That sounds impressive until you realize it is less than 3 percent of the “things” on our planet potentially able to participate in IoT. Unfamiliar numerical terms such as zettabytes must enter our lexicon to describe the volume of data to be generated, consumed and analyzed.
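Some back-of-the-envelope arithmetic shows why such terms become necessary. Assuming, purely for illustration, 50 billion connected devices (Cisco’s widely cited 2020 projection) each emitting a modest average of 1 MB per day:

```python
# Illustrative arithmetic only: the device count follows Cisco's projection,
# while the per-device data rate is an assumed average.
devices = 50_000_000_000               # projected connected devices
bytes_per_device_per_day = 1_000_000   # assume ~1 MB per device per day

bytes_per_day = devices * bytes_per_device_per_day
exabytes_per_year = bytes_per_day * 365 / 1e18

print(f"{exabytes_per_year:.2f} EB/year")  # → 18.25 EB/year
```

Even at this deliberately conservative rate, the IoT alone produces tens of exabytes annually; richer sensors (video, telemetry at high sample rates) push the totals toward the zettabyte scale.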

What the IoT Data Wave Means for Big Data

The processing of the rivers of big data coming from today’s embedded sensors, telemetry, RFID chips, PCs, mobile devices, wearables and so on already leaves 90 percent of this data in the dustbin. That is primarily because current big data hardware and software stacks are inadequate to manipulate it all, let alone comprehend it.

Big data compute, storage and networking capabilities improve daily. However, even those enterprises on big data’s bleeding edge are today ill-equipped to handle the expected data flood gushing from the IoT let alone the larger Internet of Everything that Cisco tracks.

Even if IoT is realized in twice or thrice the time of most projections, then big data enterprises are going to be perennially behind the curve for the foreseeable future. The constant running to catch up will be the prime driver of the big data ecosystem beyond the next decade. If that does not kill big data, it will only make it stronger. Enterprises large and small will join the data mining gold rush if real-time analytics improve and a big data meta-architecture, as hinted at by Hadoop, emerges.

The Obstacles to a Happy Marriage between IoT and Big Data

Lack of Standards

Having to figuratively reinvent the wheel over and over is the bane of any competitive industry. Without standards, IoT will struggle to reach escape velocity due to technology fragmentation. Standards must be in place for efficient access to “things”, consistent API interfaces, machine-to-machine communication, addressing privacy and security issues, and lowering entry barriers for smaller, innovative players.

Closed or Inefficient Architectures

IoT is a game changer for big data architecture. All stakeholders are just now starting to recognize that dealing with IoT will require as much collaboration as competition.

The sheer magnitude of IoT data volumes dictates a layered hardware/software stack too gigantic, geographically dispersed and complex for any single enterprise or cloud provider. It calls for an ultra-distributed meta-architecture that, step by step, digests, absorbs and disperses unstructured data as it is collected, cleaned, normalized, correlated with other data, stored when necessary, deeply analyzed and presented. Along the way, vendors who today specialize in each of these processing layers will contribute via enormous arrays of small-scale data centers.
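The stage-by-stage digestion described above can be sketched as a toy pipeline: raw readings are cleaned, normalized and summarized as they flow through, rather than being stored wholesale. The stage names and sample records below are illustrative, not a reference architecture:

```python
def clean(readings):
    """Drop malformed records (missing sensor id or value)."""
    return [r for r in readings if "sensor" in r and "value" in r]

def normalize(readings):
    """Convert all temperature readings to a common unit (Celsius)."""
    out = []
    for r in readings:
        value = r["value"]
        if r.get("unit") == "F":
            value = (value - 32) * 5 / 9
        out.append({"sensor": r["sensor"], "value": value})
    return out

def summarize(readings):
    """Collapse the stream to one aggregate (mean value) per sensor,
    so only a tiny fraction of the raw volume moves up the stack."""
    totals = {}
    for r in readings:
        bucket = totals.setdefault(r["sensor"], [0.0, 0])
        bucket[0] += r["value"]
        bucket[1] += 1
    return {sensor: total / count for sensor, (total, count) in totals.items()}

raw = [
    {"sensor": "a", "value": 212.0, "unit": "F"},
    {"sensor": "a", "value": 0.0, "unit": "C"},
    {"sensor": "b"},  # malformed record, dropped by clean()
]
print(summarize(normalize(clean(raw))))  # → {'a': 50.0}
```

In the meta-architecture the article describes, each of these stages would run in a different layer, operated by different vendors; the point of the sketch is only that data is reduced at every hop instead of being shipped whole to a central store.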

Analytics Capability Growth Rate

Above all else, business intelligence processing is the critical bottleneck to realizing the full potential of big data. The rate at which supporting analytics can improve is questionable without significant breakthroughs, but the search for data gold represents an immeasurable incentive. The deluge of IoT real-time data headed down the analytic pipeline will create even more pressure but is likely to engender even more opportunities for value extraction.


The Internet of Things is not an invention but a logical consequence of highly available, low-power, low-cost sensor technology and improvements in wireless connectivity penetration. Related technology improvements and cost reductions in compute, storage and network hardware will complement the growth of IoT and make it something useful and valuable. And, finally, IPv6 is going to receive the appreciation it justly deserves.

All this power to generate, gather and process new, real-time micro-data is for naught, however, if it must be set aside awaiting analysis capabilities to catch up. Fortunately, although big data infrastructure and software are likely to be overwhelmed initially, that and analytic capabilities seem to have a bit of a head start. Increased collaboration among stakeholders, an effective, shared processing architecture and the inevitable analytical breakthroughs may just carry the day in the end.