After the “cloud” hype comes the “big data & analytics” wave, and it’s not just hype. Big data & analytics enables companies to make better business decisions faster than ever before; helps identify opportunities for new products and services and bring innovative solutions to market faster; assists IT and the helpdesk in reducing mean time to repair and troubleshoot, while providing reliable metrics for better IT spending plans; guides companies in improving their security posture by giving more visibility into the corporate network and identifying suspicious activities that go undetected by traditional signature-based technologies; and helps meet compliance requirements. In short, it makes companies more competitive! One simply has to go on YouTube to see the amazing things companies are doing with Splunk, for example.
I remember when I started working in IT sales in the mid-90s, a “fast” home Internet connection was 56k and the Internet was rapidly gaining in popularity. A small company owner called me and asked, “What are the competitive advantages of having a website?” to which I replied, “It’s no longer a competitive advantage, it’s a competitive necessity.” To prove my point, I asked him to search for his competitors on the Internet: he saw that all of his competitors had websites!
The same can now be said of big data & analytics. With all the benefits it brings, it is becoming a business necessity. But before you start rushing into big data & analytics, know the following important facts:
- According to Gartner, 69% of corporate data has no business value whatsoever
- Also according to Gartner, only 1.5% of corporate data is high-value data
This means that you will have to sort through a whole lot of data to find the valuable stuff you need to grow your business, reduce costs, outpace the competition, find new revenue sources, etc. It is estimated that every dollar invested in a big data & analytics solution brings four to six dollars in infrastructure investments (new storage to hold all that priceless data, CPU to analyze it, security to protect it, etc.). So before you plan a $50,000 investment in a big data & analytics solution only to find out it comes with a $200,000 to $300,000 investment in infrastructure, you should talk to subject matter experts. They can help design strategies to home in on the 1.5% of high-value data, reducing the required investment while maximizing the results.
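The 4-to-6x multiplier above works out as a simple back-of-envelope calculation; here is a minimal sketch of that arithmetic (the function name and the $50,000 figure are just the example from the text):

```python
# Rough cost sketch: infrastructure spend is estimated at 4x to 6x the
# analytics solution cost (the multipliers are the article's estimate).
def total_project_cost(solution_cost, infra_multiplier):
    """Return (infrastructure cost, total project cost)."""
    infra = solution_cost * infra_multiplier
    return infra, solution_cost + infra

low_infra, low_total = total_project_cost(50_000, 4)
high_infra, high_total = total_project_cost(50_000, 6)
print(f"Infrastructure: ${low_infra:,} to ${high_infra:,}")  # $200,000 to $300,000
print(f"Total project:  ${low_total:,} to ${high_total:,}")  # $250,000 to $350,000
```

Which is exactly why a $50,000 line item deserves a conversation about the other $200,000+ before the purchase order is signed.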
Charles Tremblay, ESI Account Manager
In the summer, I enjoy doing volunteer work as a soccer coach for kids and teenagers. I do the same in the winter when hockey season begins. I find it challenging to bring different personalities to work as a group towards achieving common goals as a team. Being a coach doesn’t come without training, however. I remember one trainer commenting on coaching: “If a given player isn’t doing what you asked him to, the first question you need to ask is: did I tell him? The second question is: did the player understand? The third: did I explain it well?” He ended by saying, “If your answer is yes to all three questions, then repeat as often as necessary.”
When I found myself with a client’s network manager talking about how sophisticated phishing campaigns have become, I remembered this wise comment from that coach trainer. This network manager admitted that even his seasoned team came close to being caught by one of these sophisticated phishing campaigns. It was a well-designed one using their GoDaddy account. It was only when someone took the time to check the links that they noticed something fishy. The average user might very well have fallen victim to this. With regards to end users, ask yourself: “If a given user isn’t doing what you asked him to with regards to suspicious emails, the first question you need to ask is: did I tell him? The second: did the user understand the potential consequences? The third: did I explain it in terms the average user understands?” I end by saying, “If your answer is yes to all three questions, then keep repeating, as users forget over time and new users join your community.”
Charles Tremblay, Account Manager
Does it even beat Y2K? It’s been a year now since I rejoined the IT integration industry. When I left it in 2003 to focus on PKI technologies, it was still the good old days of client-server IT infrastructure, right after Y2K and the dot-com bubble burst. For a year now I have been trying to understand clients’ challenges to see how I can help. For a year now I have observed my clients trying to understand the mutations that appear to be reshaping the IT industry and how they affect them, not only on a business level but on professional and personal levels as well. I find them fearful and closed. Witnessing this, I told a colleague of mine, “It seems our clients are capable of telling us what they don’t want, but rarely have a clear vision of what they’re aiming for!”
Big data, the Internet of Things, things called cloud, anything anywhere anytime on any device, the software-defined company, etc. – all these new terms are being bombarded at our clients and are supposed to showcase the many new trends in the industry. I recently attended a seminar where the audience was separated into three categories: traditional IT folks, who resist these changes and new trends because they reshape traditional IT infrastructure and may even jeopardize their job definition or security; line-of-business managers, who embrace change and are shopping for apps that get the job done; and senior management, who talk the language of numbers (growth percentage, market share and other measurable KPIs) and with whom you need to be able to prove ROI (not TCO – that is the IT folks’ concern).
And there we have it: widespread confusion and fear. Y2K all over again? People forget: BI has been around for a while, and so have the Internet, thin-client environments, databases, etc. It’s just happening on a different scale, and the challenge remains to bridge the gap between corporate and business objectives as defined by senior management, the right tools and processes to get the job done as chosen by line-of-business owners, and IT, which still has an important role in solution selection, integration and support, be it on site or off site.
My challenge over the last year has been to overcome those fears so as to allow my clients to have open discussions about their business objectives, avoiding buzzwords and refocusing on “Where do you want to be in three to five years as a company, what IT tools will be required to help you get there, and which ones can I help you with?”
Charles Tremblay, ESI account manager
I found myself recently in conversation with the General Manager of a small but rapidly growing business. He mentioned three key objectives. First, growth in revenue. Second, since theirs is a consulting services business with key information scattered across employee laptops, making sure the company “owns” that information. Third, bringing to market a new web-based software application for their clients.
Who is responsible for revenue growth? Are they pursuing new business or growth within their current client base? How can, or does, the company track the efforts and steps taken by their team to make sure everyone is pulling in the same direction to reach their growth target? The conversation then turned to sales processes and CRMs. What key information was held by their consultants? Should it be archived and, if so, for how long and where? We proceeded to data storage, archiving, secure network access for their consultants, data loss protection, electronic document management solutions, etc. Finally, we asked ourselves where their new web application should reside. Cloud or not? Private, public or hybrid? On-premise or off-site? Should they consider a cloud service provider, what should their SLAs be and, if those are not met, what should their exit strategy be?
IT solutions are not to be confused with business needs and corporate objectives! We left off with me having a better understanding of their corporate objectives and what I could do to help them attain them. On their side, they walked away with a much better understanding of which IT solutions were the most likely to help them reach their goals.
Charles Tremblay, ESI Account Manager
As I continue to attend conferences and sessions with many of our core partners, I continue on my quest for data centre innovation. Most recently I visited the sunny Bay Area to meet with Brocade Communications, Hitachi Data Systems and VMware, specifically the NSX division. This is part one of a three-part overview of the technology offerings.
In my role in the “Office of the CTO” I am always exploring new trends and innovations in designs and solutions for our clients, in particular how “software defined everything” becomes part of our clients’ data centre evolution. For many years we have been speaking about the cloud and its adoption in mainstream IT. New technologies appear, and some simply take on a new face. Today, I would like to explore the concept of the Software Defined Data Centre (SDDC) – or in this case, specifically Software Defined Networking (SDN) – with an overview of some of the most interesting solutions on the market.
Like many of you, I have watched virtualization of the compute platform become more and more common. It seems like just yesterday that my manager at the time asked me to assist with SAN connectivity for version 1 of Microsoft’s virtual machine management! Today we are experiencing the continued evolution of virtualization. Server and storage virtualization are commonplace within the data centre. We are seeing Canadian companies 100% virtualized within the compute space. These same companies are looking for the next step in consolidation, agility and cost containment. That next step is network virtualization. But what is SDN? Software defined networking (SDN) is a model for network control, based on the idea that network traffic flow can be made programmable at scale, thus enabling new dynamic models for traffic management.
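To make “programmable traffic flow” concrete, here is a toy sketch of the core SDN idea: a controller installs match/action rules, and the data plane simply applies them. This is a deliberately simplified illustration, not any vendor’s actual API (the class and field names are made up):

```python
# Toy illustration of SDN: the control plane installs match/action
# rules; the data plane applies the highest-priority matching rule.
from dataclasses import dataclass, field

@dataclass
class FlowRule:
    match: dict          # header fields to match, e.g. {"dst_port": 80}
    action: str          # e.g. "forward:2" or "drop"
    priority: int = 0    # higher priority wins

class FlowTable:
    def __init__(self):
        self.rules = []

    def install(self, rule):
        """Controller-to-switch: add a rule (the 'southbound' direction)."""
        self.rules.append(rule)
        self.rules.sort(key=lambda r: -r.priority)

    def lookup(self, packet):
        """Data plane: first matching rule (by priority) decides the fate."""
        for rule in self.rules:
            if all(packet.get(k) == v for k, v in rule.match.items()):
                return rule.action
        return "drop"    # default policy: no rule, no forwarding

table = FlowTable()
table.install(FlowRule({"dst_port": 80}, "forward:2", priority=10))
table.install(FlowRule({"src_ip": "10.0.0.5"}, "drop", priority=20))

print(table.lookup({"src_ip": "10.0.0.1", "dst_port": 80}))  # forward:2
print(table.lookup({"src_ip": "10.0.0.5", "dst_port": 80}))  # drop
```

The point is that traffic policy becomes data the controller can rewrite on the fly, rather than configuration locked inside each switch.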
[Figure: SDN architecture diagram. Source: https://www.opennetworking.org/sdn-resources/sdn-definition]
VMware NSX – a product VMware acquired to add to its virtual network strategy. The product is sound and provides close coupling between VMware and the networking and security of East/West traffic within a VM. The NSX data and management planes provide an excellent framework allowing the hypervisor to lock down VM traffic, along with virtual properties such as a vRouter, vVPN and vLoad Balancer, all of which work within the VM construct.
Brocade Vyatta – a technology acquired by Brocade two years ago. Today we see the vRouter and Vyatta OpenDaylight controller lead the pack. Brocade offers v5400 and v5600 editions of the predefined Vyatta OpenFlow controller. The Vyatta implementation provides a vRouter, vFirewall and vVPN, and Brocade has also developed a vADX load balancer.
Cisco ACI or Nexus 9000L – in 2014 Cisco announced the spin-in of the Insieme product to provide an ACI (Application Centric Infrastructure) platform. The first release was a 40 Gb Ethernet switch with no real ACI functionality. Today we see the product with an enhanced port/policy control strategy using the Cloupia spin-in technology (UCS Director) policy-based engines to control the various functions within an ACI architecture.
The real mystery of software defined networking starts with a basic understanding of the business need for a “programmable network” based on x86 architecture within the virtualization layer. In the next installment I will break down VMware NSX and what ESI is exploring with this leading-edge SDN contributor.
Nicholas Laine, Director Solutions Architect – Office of the CTO
Nutanix Engineering has discovered a rare condition that can potentially cause data integrity issues for containers using the Nutanix Metro Availability Data Protection feature for all versions of NOS 4.1 prior to 4.1.2. This condition can occur in environments that have experienced aborted operations during failover instances between Metro Availability sites.
Note: This issue does not affect customers that are using the Nutanix Async DR Data-Protection feature.
If you are using Nutanix, it is important to perform the following check to determine if your cluster is using this feature. In the Prism web console, navigate to Home > Data Protection > Overview, where the “Data Protection Summary” box will display a protection domain count.
- A value of 1 or greater for “Metro Availability” indicates that the feature is enabled and this field advisory is applicable to your cluster.
- A value of 0 indicates that you are not using the Metro Availability feature and can safely ignore this Field Advisory.
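The applicability check above boils down to two conditions: the Metro Availability protection domain count and the NOS version range. As a sketch (the helper function below is illustrative, not a Nutanix-provided tool; it only encodes the criteria stated in the advisory):

```python
# Encodes the advisory criteria: Metro Availability in use (count >= 1)
# on a NOS 4.1.x release earlier than 4.1.2.
def advisory_applies(nos_version, metro_pd_count):
    """Return True if the cluster matches the field advisory criteria."""
    version = tuple(int(part) for part in nos_version.split("."))
    affected_release = (4, 1) <= version < (4, 1, 2)
    return affected_release and metro_pd_count >= 1

print(advisory_applies("4.1.1", 2))  # True  - affected release, feature in use
print(advisory_applies("4.1.2", 2))  # False - fixed release
print(advisory_applies("4.1", 0))    # False - feature not in use
```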
There is no workaround to this condition. This issue is fixed in NOS 4.1.2. All customers that are using the Metro Availability Data Protection feature MUST upgrade to NOS 4.1.2 to prevent this issue. Nutanix intends to release NOS 4.1.2 on or before April 17th, 5PM PST. Nutanix will send an update to this field advisory if NOS 4.1.2 is delayed beyond this date.
Please contact your ESI representative if you wish to validate with one of our experts.
While watching a pickup truck commercial on TV recently, I couldn’t help but ask myself, “How can all pickups have the best fuel efficiency in their category?” In a funny way, I hear the same in our industry with “most IOPS or terabytes per dollar.” It seems everyone is the best at it. In one case, a client got the most IOPS per dollar he could and ended up having to change his whole data centre infrastructure because the IOPS he got were not of the right type!
In the storage industry, IOPS seems to be the equivalent of the horsepower (HP) buzzword in the automotive industry. So you’re going to try to get the most horsepower per dollar when you purchase a car. You can get 350 HP out of a small sports car or a pickup truck. Just don’t try to race with the pickup truck or tow something with the sports car! There’s a reason you won’t see a Ferrari with a hitch! Though they both have 350 HP, one has torque, the other doesn’t. One is built for performance and speed, the other for heavy workloads. The same goes for data centres. Manufacturers will give you the IOPS you asked for, and they can usually prove it! But do you know what type of IOPS you’re looking for (sequential, random, read or write)? Why do you require those IOPS? Performance or heavy workloads? If you’re not sure, it is an integrator’s core business and value to help you make sense of all the marketing blurb thrown at you, to help you choose wisely and protect your investment.
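A quick bit of arithmetic shows why a raw IOPS number says so little on its own: the same IOPS figure delivers wildly different throughput depending on block size and access pattern. The workload figures below are illustrative assumptions, not benchmark results:

```python
# Back-of-envelope: the same "IOPS" number means very different things
# depending on the I/O size. 10,000 small random reads is a database-style
# workload; 10,000 large sequential reads is a backup/streaming workload.
def throughput_mb_s(iops, block_size_kb):
    """MB/s delivered by a given IOPS rate at a given block size."""
    return iops * block_size_kb / 1024

print(throughput_mb_s(10_000, 4))    # 4 KB random reads  -> ~39 MB/s
print(throughput_mb_s(10_000, 256))  # 256 KB sequential  -> 2500 MB/s
```

Same 10,000 IOPS, a 64x difference in throughput; which is why an array sized for one pattern can fall flat on the other.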
Charles Tremblay, ESI Account Manager
I was initially presented with a client seeking technical help to perform regular recovery tests in their completely virtualized environment – a complex environment to say the least, with many different, very specific CRMs using different databases. Being cautious, I wanted to make sure I understood the client’s expectations before scheduling any professional services. The conversation rapidly turned to business continuity and recovery point and time objectives, not at all to technical help with recovery tests.
Taking a step back, we agreed to meet to review and understand their business continuity needs and challenges. During the meeting, the conversation widened even more, and we discovered that the agility and simplicity afforded by virtualization had allowed internal shadow IT to emerge. Business lines were bypassing the IT director and having the network administrators set up new servers with reserved CPU, memory and storage faster than ever, to the point where it was hard to keep track of what was being set up where and for what purpose. The conversation was leaning further away from backup & recovery. Or was it? I listened on very carefully.
Suddenly the fog seemed to lift. What applications needed to be recovered, and in what time? What was the order of priority? What were the dependencies between applications and the services they needed to function? What services needed to be back online for which applications? Where was the data for each application? All of this in an environment that was growing very complex. They needed help building their disaster recovery plan. Or did they? I listened on very carefully.
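Questions like these are, at bottom, a dependency-ordering problem: bring each service up before the applications that need it. A minimal sketch, using entirely hypothetical application names, shows how a recovery order falls out of the dependency map:

```python
# Recovery sequencing as a topological sort: each entry maps an
# application to the services it depends on (names are made up).
from graphlib import TopologicalSorter

dependencies = {
    "database": set(),                  # no prerequisites
    "auth":     {"database"},
    "crm":      {"database", "auth"},
    "web_app":  {"crm", "database"},
}

# static_order() yields every node after all of its prerequisites.
recovery_order = list(TopologicalSorter(dependencies).static_order())
print(recovery_order)  # database comes first, web_app last
```

On paper this looks trivial; in a sprawling, shadow-IT-grown environment, simply producing an accurate `dependencies` map is most of the disaster recovery planning work.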
We then learned they already had a remote site, they had already purchased a complete recovery infrastructure, they were to move everything into their new data center and keep the current systems they were operating as their failover site. In short, they had a well thought out disaster recovery plan.
What they needed was someone who listened carefully and understood that they were really looking for an experienced team of extra hands to help them set up the new data center, move the data and applications, and test the environment while their own team continued to manage and operate the corporate infrastructure – not consulting services for a disaster recovery plan they already had, nor technical assistance for tests.
Charles Tremblay, ESI Account manager
The manufacturer Nutanix sent a notice to its integration partners asking them to inform Nutanix customers of a possible data integrity problem. This advisory affects NOS version 4.0.3 and later and applies only to customers that meet all of the criteria below:
- On disk deduplication is enabled
- Using Nutanix Protection Domains (all hypervisors) or the VAAI plugin (applies only to the VMware ESXi hypervisor)
- Using NOS 4.0.3 or higher
If you are a Nutanix user, it is important that you verify if your environment is affected by this advisory.
Customers should avoid the configurations that are susceptible to this issue. Please contact your ESI representative if you wish to validate with one of our experts.
An update to this field advisory will be sent on March 11th with details of the NOS release that will resolve this issue.
For the second time in the last couple of weeks, I was sitting in one of my client’s many small conference rooms equipped with telepresence. I found myself participating in a solutions meeting with his team members located in four different cities. And there we were, holding a meeting as if everyone were around the same table, able to appreciate everyone’s body language. How quickly we forgot we were not in the same room, because it felt as though we were.
This is when it dawned on me that telepresence is underestimated. That feeling of truly being on the same page with everyone is priceless, especially when you are engaged in a client’s strategic projects that will have an impact on the way their team operates and performs. I strive to work in harmony with a client’s values and corporate culture. Being in the presence of the client’s team made me understand why this client was so successful in their industry, civil engineering: they too need to be as close as possible to their own clients and partners working on remote construction sites all over the world.
It was only afterwards that I learned the telepresence room had been purpose-built for them by a business partner of ours. Having witnessed firsthand the effectiveness of telepresence as a way to bring teams of people together, I will no longer think of telepresence as mere hype or just a cool thing to have. It can be a game changer for many businesses!
Charles Tremblay, ESI account manager