Agile IT: a better way of doing business

One of the most powerful new ideas to emerge from the cloud computing revolution is IT agility. Agile IT organizations are able to adapt easily to changing business needs by delivering applications and infrastructure quickly to those who need them. Does your organization have what it takes to be truly agile?

There are many components of agile IT infrastructure, but three that we think are particularly important are containers, microservices and automation. These form the foundation of the new breed of cloud-native applications, and they can be used by any organization to revolutionize the speed and agility of application delivery to support the business.

Containers: Fast and Flexible

Containers can be thought of as lightweight virtual machines, but they differ from VMs in fundamental ways. Containers run as groups of namespaced processes sharing an operating system kernel, with each container getting an isolated view of resources such as processor, memory and the other supporting elements an application needs. Container images are typically stored in registries for reuse, and containers can be spun up and shut down in seconds. They’re also portable, meaning that an application running in a container can be moved to any other environment that supports that type of container.
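
As an illustration, launching and discarding a container takes only a couple of commands. This sketch assumes Docker is installed; the image name, container name and port mapping are arbitrary examples:

```shell
# Pull a small web-server image from a public registry (a library of reusable images)
docker pull nginx:alpine

# Launch it in the background, mapping container port 80 to host port 8080
docker run --detach --name demo-web --publish 8080:80 nginx:alpine

# Seconds later, the same container can be stopped and removed just as quickly
docker stop demo-web
docker rm demo-web
```

The whole cycle takes seconds, which is exactly why developers favour containers over waiting on a VM provisioning request.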

Containers have only been on the IT radar screen for about three years, but they are being adopted with astonishing speed. One recent study found that 40% of organizations are already using containers in production and just 13% have no plans to adopt them during the coming year. Containers are especially popular with developers because coders can configure and launch their own workspaces without incurring the delay and overhead of involving the IT organization.

Microservices: a Better Approach to Applications

Use of containers frequently goes hand-in-hand with the adoption of microservices architectures. Applications built from microservices are based upon a network of independently deployable, modular services that communicate through lightweight mechanisms such as HTTP APIs or messaging protocols. Think of it as an object assembled from Lego blocks. Individual blocks aren’t very useful by themselves, but when combined, they can create elaborate structures.
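
To make the “Lego block” idea concrete, here is a minimal sketch of a single microservice using only the Python standard library. The service name, catalogue and endpoint are hypothetical; a real deployment would typically use a web framework, but the shape is the same: one small, independently deployable service behind a lightweight HTTP interface.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class PriceServiceHandler(BaseHTTPRequestHandler):
    """A toy 'pricing' microservice exposing one JSON endpoint."""

    PRICES = {"widget": 9.99, "gadget": 24.50}  # hypothetical catalogue

    def do_GET(self):
        # Expect paths like /price/widget
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "price" and parts[1] in self.PRICES:
            body = json.dumps({"item": parts[1], "price": self.PRICES[parts[1]]}).encode()
            self.send_response(200)
        else:
            body = json.dumps({"error": "unknown item"}).encode()
            self.send_response(404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # silence per-request logging for this demo

def serve(port=0):
    """Start the service on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), PriceServiceHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

An ordering application would call this service over HTTP rather than linking its code in, which is what makes each block independently deployable and replaceable.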

Service-oriented architecture is nothing new, but the technology has finally matured to the point that it’s practical to rethink applications in that form. The microservices approach is more flexible and efficient than the vertically integrated applications that have dominated IT for decades. Assembling applications from libraries of services minimizes duplication and lets software move into production much more quickly. There’s less testing overhead and more efficient execution, since developers can focus on improving existing microservices rather than reinventing the wheel with each project.

Containers are an ideal platform for microservices. They can be launched quickly and custom-configured to use only the resources they need. A single microservice may be used in many ways by many different applications. Orchestration software such as Kubernetes keeps things running smoothly, handles exceptions and constantly balances resources across a cluster.
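
For instance, a few lines of declarative configuration are enough to tell Kubernetes to keep several replicas of a containerized microservice running and to restart or reschedule them when something fails. The service name, image and replica count below are illustrative:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: price-service            # hypothetical microservice
spec:
  replicas: 3                    # Kubernetes continuously reconciles toward this count
  selector:
    matchLabels:
      app: price-service
  template:
    metadata:
      labels:
        app: price-service
    spec:
      containers:
      - name: price-service
        image: registry.example.com/price-service:1.0   # illustrative image reference
        resources:
          requests:              # ask only for the resources the service needs
            cpu: "100m"
            memory: "128Mi"
```

The operator declares the desired state; the orchestrator does the constant balancing and exception handling described above.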

Automation: Departure from Routine

Automation is essential to keeping this complex environment running smoothly. Popular open-source tools such as Puppet and Ansible make it possible to script many tasks that were once performed by systems administrators, such as defining security policies, managing certificates, balancing processing loads and assigning network addresses. Automation tools were developed by cloud-native companies so they could run large-scale IT operations without legions of administrators, but the tools are useful in any context.
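
As a sketch of what this looks like in practice, an Ansible playbook can capture a routine task, here deploying a firewall rules file and reloading the service, as a repeatable, version-controlled script. The inventory group, file paths and service name are illustrative:

```yaml
---
- name: Apply baseline security configuration
  hosts: webservers                # illustrative inventory group
  become: true
  tasks:
    - name: Deploy firewall rules from a version-controlled source
      ansible.builtin.copy:
        src: files/firewall.rules          # illustrative path
        dest: /etc/firewall/firewall.rules
        mode: "0644"
      notify: Reload firewall

  handlers:
    - name: Reload firewall
      ansible.builtin.service:
        name: firewalld
        state: reloaded
```

Run once or run across a thousand hosts, the result is the same, which is what frees administrators from the routine work described above.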

Automation not only saves money but improves job satisfaction. Manual, routine tasks can be assigned to scripts so that administrators can tend to more important and challenging work. And in a time of severe IT labor shortages, who doesn’t want happier employees?

Agile IT makes organizations nimbler, more responsive and faster moving. When planned and executed with the help of an experienced integration partner, it saves money as well.

 

Account of the NetApp Insight 2016 Conference

The 2016 Edition of NetApp Insight took place in Las Vegas from September 26 to 29.
Again this year, NetApp presented its ‘Data Fabric’ vision, unveiled two years ago. According to NetApp, the growth in capacity, velocity and variety of data can no longer be handled by the usual tools. As stated by NetApp’s CEO George Kurian, “data is the currency of the digital economy”, and NetApp likens itself to a bank that helps organizations manage, move and grow their data globally. The current challenge of the digital economy is thus data management, and NetApp clearly intends to be a leader in this field. This vision is realized more clearly every year across the products and platforms added to the portfolio.

New hardware platforms

NetApp took advantage of the conference to officially introduce its new hardware platforms, which integrate 32Gb FC SAN ports, 40GbE network ports, NVMe SSD embedded read cache and SAS-3 12Gb ports for back-end storage. Additionally, the FAS9000 and AFF A700 use a new fully modular chassis (including the controller module) to facilitate future hardware upgrades.

Note that the SolidFire platforms drew attention from both NetApp and the audience: NetApp was keen to explain their position in the portfolio, while attendees wanted to learn more about this extremely agile and innovative technology. https://www.youtube.com/watch?v=jiL30L5h2ik

New software solutions

  • SnapMirror for AltaVault, available soon through the SnapCenter platform (replacing SnapDrive/SnapManager): this solution allows NetApp volume data (including application databases) to be backed up directly to the cloud (AWS, Azure & StorageGrid) https://www.youtube.com/watch?v=Ga8cxErnjhs
  • SnapMirror for SolidFire is currently under development. No further details were provided.

The features presented reinforce the objective of offering a unified data management layer through the NetApp portfolio.

The last two solutions are more surprising since they do not require any NetApp equipment to be used. These are available on the AWS application store (SaaS).

In conclusion, we feel that NetApp is taking steps to be a major player in the “software defined” field, while upgrading its hardware platforms to get ready to meet the current challenges of the storage industry.

Olivier Navatte, Senior Consultant – Storage Architecture

Where’s the promised agility?

The world of technology solutions integrators has changed dramatically in the last 10 years. Customers are more educated than ever before, thanks to the world of information available on the Internet. It is estimated that 80% of customer decision-making happens online before they even reach out to us. This is not just true of our industry. The Internet is now woven into the fabric of society, and clients now go to the veterinary clinic believing they have already identified their pet’s disease, since “the Internet” provided them with a diagnosis!

What about the promises of industry giants? Simplified IT, reduced OPEX, increased budgets for projects instead of maintenance, etc.?

How can we explain that we don’t witness this in our conversations with customers? How is it that clients who have embraced those technologies still admit today that they face greater complexity than before? Perhaps the flaw lies precisely in the fact that 80% of decisions are based on well-designed, well-crafted web marketing strategies…

Regardless of technological evolution, the key, it seems, is still architecture design conceived with a business purpose, and an IT integration strategy tailored to your specific needs with the help of professionals. Just as a veterinarian is certainly a better source of information than the Internet for looking after your pet…

For over 20 years, ESI has designed solutions that are agile, scalable and customized to the specific needs of organizations. ESI works closely with customers to bridge the gap between business needs and technology, maximizing ROI and providing objective professional advice.

Review of NetApp Insight 2015

The 2015 Edition of NetApp Insight was held in Las Vegas from October 12 to 15. The event comprises general sessions, more than 400 breakout sessions, the Insight Central zone with partner booths, hands-on labs and a “meet the engineer” section, and offers the possibility of completing certification exams onsite.
The general sessions were presented by various NetApp leaders (the CEO, the CIO, technical directors, engineers and cofounder Dave Hitz), as well as partners and guests (including Cisco, Fujitsu, VMware and 3D Robotics).
Last year, the “Data Fabric” term was unveiled to identify NetApp’s vision of cloud computing. This year, most of the presentations were intended to make that vision more concrete, through examples, demonstrations and placed in context.
For NetApp, Data Fabric is synonymous with data mobility, wherever the data resides, whether in traditional datacentres or in the cloud. The key to this mobility lies in SnapMirror, which should soon be supported across various NetApp platforms (FAS, Cloud ONTAP, NetApp Private Storage, AltaVault, etc.) and orchestrated by global tools such as OnCommand Cloud Manager and adaptations of existing tools.
Still on the topic of cloud, a Cisco speaker presented current issues and future trends: with the exponential growth in devices (tablets, smartphones and connected objects) and the increasingly frequent move of data (and even of compute) to the edge, accessibility, availability, security and data mobility become increasingly important issues. In short, the cloud trend belongs to the past; we must now talk about the edge!
NetApp also put forward its All-Flash FAS enterprise solutions which, thanks to new optimizations, can now seriously compete in high-performance, very-low-latency environments.
The number of breakout sessions was impressive and in four days, one can only expect to attend about 20 of the 400 sessions available.
Insight has been open to clients since last year, but some sessions remain reserved for NetApp partners and employees. Some information is confidential, but without giving details, and non-exhaustively, we can mention that a new generation of controllers and tablets is to be expected soon, that SnapCenter will eventually replace SnapManager (in cDOT only) and that new, much more direct transition options from 7-Mode to cDOT will be made available.
Other sessions also helped to deepen knowledge or to discover some very interesting tools and features.
In conclusion, NetApp Insight is a must, as much to immerse oneself in the NetApp line of solutions as to find out what NetApp’s vision and future direction will be.

Olivier Navatte, ESI Storage Specialist

What about Big Data & Analytics?

After the “cloud” hype, here comes the “big data & analytics” hype, and this one is not just hype. Big data & analytics enables companies to make better business decisions faster than ever before. It helps identify opportunities for new products and services and bring innovative solutions to the marketplace faster; assists IT and the helpdesk in reducing mean time to repair and troubleshoot, while providing reliable metrics for better IT spending plans; guides companies in improving their security posture by giving more visibility into the corporate network and identifying suspicious activities that go undetected by traditional signature-based technologies; and serves to meet compliance requirements. In short, it makes companies more competitive! One simply has to go on YouTube to see the amazing things companies are doing with Splunk, for example.

I remember when I started working in IT sales in the mid-90’s, a “fast” home Internet connection was 56k and the Internet was rapidly gaining in popularity. A small company owner called me and asked, “What are the competitive advantages of having a website?”, to which I replied, “it’s no longer a competitive advantage, it’s a competitive necessity”. To prove my point, I asked him to search for his competitors on the Internet: he saw that all of his competitors had websites!

The same can now be said of big data & analytics. With all the benefits it brings, it is becoming a business necessity. But before you start rushing into big data & analytics, know the following important facts:

  1. According to Gartner, 69% of corporate data has no business value whatsoever.
  2. Still according to Gartner, only 1.5% of corporate data is high-value data.

This means that you will have to sort through a whole lot of data to find the valuable stuff you need to grow your business, reduce costs, outpace the competition, find new revenue sources, etc. It is estimated that every dollar invested in a big data & analytics solution brings four to six dollars in infrastructure investments (new storage to hold all that priceless data, CPU to analyze it, security to protect it, etc.).

So before you plan a $50,000 investment in a big data & analytics solution and find out it comes with a $200,000 to $300,000 investment in infrastructure, you should talk to subject matter experts. They can help design strategies to home in on the 1.5% of high-value data, reducing the required investment while maximizing the results.
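
The arithmetic above can be sketched in a few lines. The 4x-6x multiplier and the 1.5% figure are the estimates cited in the text; everything else is illustrative:

```python
def total_big_data_cost(solution_cost, infra_multiplier):
    """Total spend: the analytics solution plus the infrastructure it drags along.

    infra_multiplier is the estimated 4 to 6 dollars of infrastructure
    per dollar of solution cited in the text.
    """
    return solution_cost + solution_cost * infra_multiplier

# A $50,000 solution at both ends of the estimate:
low_end = total_big_data_cost(50_000, 4)    # $50k solution + $200k infrastructure
high_end = total_big_data_cost(50_000, 6)   # $50k solution + $300k infrastructure

def high_value_bytes(total_bytes, fraction=0.015):
    """The slice worth keeping, using Gartner's 1.5% high-value estimate."""
    return total_bytes * fraction
```

Narrowing the data you ingest toward that 1.5% slice is precisely what shrinks the multiplier, since less stored and analyzed data means less infrastructure.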

Charles Tremblay, ESI Account Manager

The greatest IT confusion ever?

Does it even beat Y2K? It’s been a year now since I rejoined the IT integration industry. When I left it in 2003 to focus on PKI technologies, those were still the good old days of client-server IT infrastructure, right after Y2K and the dot-com bubble burst. For a year now I have been trying to understand clients’ challenges to see how I can help. For a year now I have watched my clients trying to understand the mutations that appear to be reshaping the IT industry, and how those changes affect them not only on a business level but on professional and personal levels as well. I find them fearful and closed. Witnessing this, I told a colleague of mine: “it seems our clients are capable of telling us what they don’t want, but rarely have a clear vision of what they’re aiming for”!

Trending concepts
Big data, the Internet of Things, anything called cloud, anything anywhere anytime on any device, the software-defined company and so on: all these new terms are being thrown at our clients and are supposed to showcase the many new trends in the industry. I recently attended a seminar where the audience was divided into three categories:

  • traditional IT folks, who resist these changes and new trends because they reshape traditional IT infrastructure and may even jeopardize their job definition or security;
  • line-of-business managers, who embrace change and are shopping for apps that get the job done;
  • senior management, who talk the language of numbers (growth percentage, market share and other measurable KPIs) and with whom you need to be able to prove ROI (not TCO, which is the IT folks’ concern).

And there we have it: widespread confusion and fear. Y2K all over again? People forget: BI has been around for a while, and so have the Internet, thin-client environments, databases, etc. It’s just happening on a different scale, and the challenge remains to bridge the gap between corporate and business objectives as defined by senior management, the right tools and processes sought by line-of-business owners to get the job done, and IT, which still has an important role in solution selection, integration and support, be it on site or off site.

My challenge over the last year has been to overcome those fears, so as to allow my clients to have open discussions about their business objectives, avoid the buzzwords, and refocus on “where do you want to be in three to five years as a company, what IT tools will be required to help you get there, and which of those can I help you with?”

Charles Tremblay, ESI Account Manager

Don’t fall for marketing blurb

While watching a pickup truck commercial on TV recently, I couldn’t help but ask myself, “How can all pickups have the best fuel efficiency in their category?” Funnily enough, I hear the same in our industry with “most IOPS or terabytes per dollar”. It seems everyone’s the best at it. In one case, a client got the most IOPS per dollar he could get, and he ended up having to change his whole data centre infrastructure because the IOPS he got were not of the right type!

IOPS

In the storage industry, IOPS seems to be the buzzword equivalent of horsepower (HP) in the automotive industry. So you’re going to try to get the most horsepower per dollar when you purchase a car. You can get 350 HP out of a small sports car or a pickup truck. Just don’t try to race with the pickup truck or tow something with the sports car! There’s a reason why you won’t see a Ferrari with a hitch! Though they both have 350 HP, one has torque, the other one doesn’t. One is built for performance and speed, the other for heavy workloads. The same goes for data centres. Manufacturers will give you the IOPS you asked for, and they can usually prove it! But do you know what type of IOPS you’re looking for (sequential, random, read or write)? Why do you require those IOPS? Performance or heavy workloads? If you’re not sure, it’s an integrator’s core business and value to help you make sense of all the marketing blurb thrown at you, to help you choose wisely and protect your investment.
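
To see why an IOPS number alone says little, note that the throughput it implies depends entirely on the I/O block size. A rough sketch (the IOPS figure and block sizes are illustrative):

```python
def throughput_mb_per_s(iops, block_size_kb):
    """Approximate throughput implied by an IOPS figure at a given I/O size."""
    return iops * block_size_kb / 1024

# The same 10,000 IOPS means very different things at different block sizes:
small_random = throughput_mb_per_s(10_000, 4)        # 4 KB random I/O: ~39 MB/s
large_sequential = throughput_mb_per_s(10_000, 256)  # 256 KB sequential: 2500 MB/s
```

Two arrays quoting identical IOPS can thus differ by a factor of sixty in real throughput, which is the storage equivalent of the sports car versus the pickup.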

Charles Tremblay, ESI Account Manager