Understanding and adopting Splunk

Splunk has been trending in the industry for quite some time, but what do we know about its uses and the market Splunk is targeting?

Splunk comes from the word “spelunking”, which refers to the activities of locating, exploring, studying and mapping caves; Splunk does much the same with your data, in three steps (illustrated with a short sketch after the list):

  1. Data indexing: Splunk collects data from different locations, combines it and stores it in a centralized index.
  2. Using indexes for searches: the use of indexes gives Splunk a high degree of speed when searching for the sources of problems.
  3. Filtering results: Splunk provides users with several tools for filtering results, for faster detection of problems.
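
To make these three steps concrete, here is a minimal sketch using the Splunk SDK for Python (the splunk-sdk package). It assumes a reachable Splunk instance; the host, credentials, index name and field names are placeholders, not values from the article.

```python
# Minimal sketch of the three steps, assuming a reachable Splunk instance
# and the splunk-sdk package (pip install splunk-sdk). Host, credentials,
# index and field names below are placeholders.
import splunklib.client as client
import splunklib.results as results

service = client.connect(host="localhost", port=8089,
                         username="admin", password="changeme")

# 1. Data indexing: send an event into a centralized index.
index = service.indexes["main"]
index.submit("start_station=6184 end_station=6015 duration_ms=734000",
             sourcetype="bixi_trips", host="opendata")

# 2. Search the index, and 3. filter the results (trips over 30 minutes).
stream = service.jobs.oneshot(
    "search index=main sourcetype=bixi_trips duration_ms>1800000 | head 10")
for event in results.ResultsReader(stream):
    print(event)
```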

For more than a year I have been experimenting with Splunk in several areas: security, storage, infrastructure, telecom and more. We at ESI have a very comprehensive laboratory, which allowed me to push my experiments further.

In addition to all this data, I used open data to experiment with Splunk’s ability to interpret it.

I tested the open data from the “montreal.bixi.com” site; it is raw data formatted as follows:

Start date –  Start station number –  Start station –  End date –  End station number –  End station –  Account type – Total duration (ms)

With this data, we are able to find the most common routes, estimate the average duration of a trip, and identify the anchor points most in demand for picking up or returning bicycles.
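
As a rough illustration of those questions, here is a short Python sketch over the trip file, outside Splunk. The column names are assumptions based on the format described above, not necessarily the exact headers published by BIXI.

```python
# Plain-Python sketch of two of the questions above: most common routes and
# average trip duration. Column names are assumed from the format described.
import csv
from collections import Counter

routes = Counter()
total_ms = 0
trips = 0

with open("bixi_trips.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Most common routes: count (start station, end station) pairs.
        routes[(row["Start station"], row["End station"])] += 1
        # Average trip duration: accumulate the duration column (milliseconds).
        total_ms += int(row["Total duration (ms)"])
        trips += 1

if trips:
    print("Top 5 routes:", routes.most_common(5))
    print("Average trip duration: %.1f minutes" % (total_ms / trips / 60000))
```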

For the operations team of the service, this provides, in real time or as a prediction for the next day, which anchor points should have more bicycles and, above all, where these bicycles will go. They could predict shortages or surpluses of bikes at the anchor points. If data is collected in real time, alerts could be issued to indicate a potential shortage or surplus at an anchor point. The system thus facilitates planning and allows the team to be proactive in meeting demand, rather than reactive. We would even be able to detect an undelivered bicycle; for instance, a bike that has not been anchored for more than 24 hours could trigger an alert so that the operations team can attempt to trace it.
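
As an illustration of the “not anchored for more than 24 hours” alert, here is a hedged Python sketch. It assumes a real-time feed keyed by a bike identifier with checkout and dock-in events; the public trip extract described above does not carry a bike ID, so these field names are hypothetical.

```python
# Sketch of the 24-hour "undelivered bicycle" check. Assumes a hypothetical
# real-time event feed with a bike identifier; field names are illustrative.
from datetime import datetime, timedelta

def overdue_bikes(events, now=None, threshold=timedelta(hours=24)):
    """Return bike IDs checked out but not docked within the threshold.

    events: chronological dicts like
        {"bike_id": "B123", "type": "checkout", "time": datetime(...)}
        {"bike_id": "B123", "type": "dock_in",  "time": datetime(...)}
    """
    now = now or datetime.utcnow()
    last_checkout, docked = {}, set()
    for e in events:
        if e["type"] == "checkout":
            last_checkout[e["bike_id"]] = e["time"]
            docked.discard(e["bike_id"])      # bike is out again
        elif e["type"] == "dock_in":
            docked.add(e["bike_id"])
    return [bid for bid, t in last_checkout.items()
            if bid not in docked and now - t > threshold]
```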

For marketers, one might think this data is useless, but the opposite is true: the same data can be used to put in place offers to attract customers, since we have data giving departure and arrival times, trip durations, and the most used routes. One can thus identify the most used time slots and run promotions or adjust rates according to traffic or customer loyalty objectives.

For management, the open data unfortunately does not give the price of trips according to the status of the users (members or non-members), but the beauty of Splunk is that one can enrich the collected data with data coming from a third-party system, a database or simply manually collected data. Management could then obtain reports and dashboards based on various factors, such as user status, travel time, days of the week, and much more. We could even make comparisons with previous months or the same month of the previous year. The applications are virtually limitless with data that resides in Splunk: the only limitation is that of our imagination!
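
To illustrate the enrichment idea, here is a small Python sketch joining trip records with a hypothetical rate table, the kind of third-party or manually collected data the article mentions; in Splunk itself this would typically be done with a lookup. The rates and field names are invented for illustration only.

```python
# Sketch of enriching trip data with an external rate table to estimate
# revenue per account type. Rates and field names are hypothetical.
RATE_PER_MINUTE = {"Member": 0.00, "Casual": 0.10}   # invented pricing

def revenue_by_account_type(trips):
    """trips: iterable of dicts with 'Account type' and 'Total duration (ms)'."""
    totals = {}
    for t in trips:
        minutes = int(t["Total duration (ms)"]) / 60000
        rate = RATE_PER_MINUTE.get(t["Account type"], 0.0)
        totals[t["Account type"]] = totals.get(t["Account type"], 0.0) + minutes * rate
    return totals
```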

These are of course fictitious examples made with available open data, but which could be real with your own systems and data.

Collecting information from a website can provide visibility for all users of a company: operations receive system overload alerts, marketers get information about the origin of connections to target their campaigns, and management gets a view of the user experience, as well as performance metrics that confirm SLAs.

Whether your needs are in security, operations, marketing, analytics or anything else, Splunk can address them. In addition to the 1,200 applications available in its portal, you can create your own tables, reports or alerts. You can use its Pivot tool to allow people to easily explore the data and build their own dashboards.

The platform is easy to use and does not require special expertise: you only need to get your data into it.

Do not hesitate to contact ESI for a presentation or a demo; it will be my pleasure to show you how to “Splunk”.

Guillaume Paré
Senior Consultant, Architecture & Technologies – ESI Technologies

Account of the NetApp Insight 2016 Conference

The 2016 Edition of NetApp Insight took place in Las Vegas from September 26 to 29.
Again this year, NetApp presented its ‘Data Fabric’ vision, unveiled two years ago. According to NetApp, the growth in capacity, velocity and variety of data can no longer be handled by the usual tools. As stated by NetApp’s CEO George Kurian, “data is the currency of the digital economy”, and NetApp wants to be compared to a bank helping organizations manage, move and grow their data globally. The current challenge of the digital economy is thus data management, and NetApp clearly intends to be a leader in this field. This vision becomes clearer every year across the products and platforms added to the portfolio.

New hardware platforms

NetApp took advantage of the conference to officially introduce its new hardware platforms, which integrate 32Gb FC SAN ports, 40GbE network ports, NVMe SSD embedded read cache and 12Gb SAS-3 ports for back-end storage. Additionally, the FAS9000 and AFF A700 use a new, fully modular chassis (including the controller module) to facilitate future hardware upgrades.

Note that the SolidFire platforms received attention from both NetApp and the public: the former keen to explain their position in the portfolio, the latter eager to find out more about this extremely agile and innovative technology. https://www.youtube.com/watch?v=jiL30L5h2ik

New software solutions

  • SnapMirror for AltaVault, available soon through the SnapCenter platform (replacing SnapDrive/SnapManager): this solution allows backup of NetApp volume data (including application databases) directly to the cloud (AWS, Azure & StorageGRID) https://www.youtube.com/watch?v=Ga8cxErnjhs
  • SnapMirror for SolidFire is currently under development. No further details were provided.

The features presented reinforce the objective of offering a unified data management layer through the NetApp portfolio.

The last two solutions are more surprising since they do not require any NetApp equipment to be used. These are available on the AWS application store (SaaS).

In conclusion, we feel that NetApp is taking steps to become a major player in the software-defined field, while upgrading its hardware platforms to be ready to meet the current challenges of the storage industry.

Olivier Navatte, Senior Consultant – Storage Architecture

Cloud Strategy: legal impacts across the organization

Here is part three of our series covering the key issues to consider before adopting cloud technologies. This article focuses specifically on legal impacts on your organization.

“Location, location, location”. We’re more accustomed to hearing this in the context of the housing market. However, where your company’s headquarters reside, where your company does business and where its subsidiaries are located directly impact how you need to manage sensitive information, such as strategic projects, HR/personnel information, etc.; essentially, IT needs to account for data sovereignty laws and regulations.

Various countries have already passed, or are moving towards passing, more restrictive data sovereignty legislation controlling the transit of information beyond their borders. For example, the Canadian Personal Information Protection and Electronic Documents Act (PIPEDA) already governs how IT organisations can collect, use and disclose personal information in the course of commercial business. In addition, the Act contains various provisions to facilitate the use of electronic documents. Essentially, all personally identifiable information must stay in the country, at rest and in transit, meaning that using a cloud provider in the US or any other country for said data could expose the company – and you – to a lawsuit, unless the cloud provider can guarantee that this data never leaves the country at any time, including for redundancy/DR purposes.

While the previous Act covers what must be protected, American law (the USA Freedom Act and its previous incarnation, the Patriot Act) enables the US government to access any and all data residing on its soil, without the owner’s authorization, without need for a warrant, and without even the need to notify the owner before or after the fact. The few data privacy provisions in the bill apply to American citizens and entities only. This means all data housed in the US is at risk, especially if said data is owned by an organisation whose headquarters are out of country.

In Europe, while laws vary from country to country, data protection regulations are becoming more stringent, requiring the establishment of procedures and controls to protect personal data and the explicit consent of individuals to collect and use their information. All this imposes guidelines on the use of the cloud within each country and beyond its borders.

Typically, data sovereignty should be a concern for most organisations when looking at cloud and, as the current trend is for countries to vote in more stringent laws, any and all cloud strategy should account for local, national and international regulations.

Benoit Quintin – Director Cloud Services – ESI Technologies

Cloud Strategy: business impacts across the organization

Here is the second part of our series covering the key issues to consider before adopting cloud technologies. This article focuses specifically on business impacts on your organization.

Most markets are evolving faster than ever before, and the trend seems to be accelerating, so organisations globally need to adapt and change the way they go to market. From a business standpoint, the flexibility and speed with which new solutions can be delivered via cloud help enable the business units to react faster and better. So much so, that where IT organisations have not considered automating aspects of provisioning to provide more flexibility and faster access to resources, business units have started going outside of IT, to some of the public cloud offerings, for resources.

Planning for cloud should consider people and processes, as both will likely be directly impacted. From the requisition of resources all the way to charging back the different business units for the resources they consume (managed independently from project budgets), processes that were created and used before the advent of cloud in your organisation will need to be adapted, if not discarded and rebuilt from scratch. IT will need to change and evolve as it becomes an internal service provider (in many instances, a P&L entity) and resources broker for the business units.

Considering the large capital investments IT has typically received as budget to ‘keep the lights on’, and considering that, until recently, this budget had been growing at a double-digit rate since the early days of the mainframe, the switch from a capital investment model to an operational model can significantly impact the way IT does business. Indeed, we have seen this shift force IT to focus on what it can do best, review its relationships with vendors, and ultimately free up valuable investment resources. In many organisations, this has also translated into enabling net new projects to come to life, in and out of IT.

Once this transformation is underway, you should start seeing some of the benefits other organisations have been enjoying, starting with faster speed to market on new offerings. Indeed, in this age of mobile everything, customers expect access to everything all the time, and your competition is likely launching new offerings every day. A move towards cloud enables projects to move forward at an accelerated pace, letting you go to market with updated offerings much faster.

Benoit Quintin, Director Cloud Services, ESI Technologies

Review of NetApp Insight 2015


The 2015 Edition of NetApp Insight was held in Las Vegas from October 12 to 15. The event comprises general sessions, more than 400 breakout sessions, the Insight Central zone with partner booths, hands-on labs and a “meet the engineer” section, and offers the possibility to complete certification exams onsite.
The general sessions were presented by various NetApp personalities (CEO, CIO, technical directors, engineers, and NetApp cofounder Dave Hitz), as well as partners and guests (including Cisco, Fujitsu, VMware and 3D Robotics).
Last year, the term “Data Fabric” was unveiled to identify NetApp’s vision of cloud computing. This year, most of the presentations were intended to make that vision more concrete through examples, demonstrations and real-world context.
For NetApp, Data Fabric is synonymous with data mobility, wherever the data resides, whether in traditional datacentres or in the cloud. The key to this mobility lies in SnapMirror, which should soon be supported across the various NetApp platforms (FAS, Cloud ONTAP, NetApp Private Storage (NPS), AltaVault, etc.) and orchestrated by global tools such as OnCommand Cloud Manager and adaptations of existing tools.
Still on the topic of cloud, a Cisco speaker presented the current issues and future trends: with the exponential use of devices (tablets, smartphones and connected devices) and the increasingly frequent move of data (and even of compute) to the edge, accessibility, availability, security and data mobility become increasingly important issues. In short, the cloud trend belongs to the past; we must now talk about the edge!
NetApp also put forward its enterprise All-Flash FAS solutions which, thanks to new optimizations, can now seriously compete in high-performance, very low latency environments.
The number of breakout sessions was impressive: in four days, one could only expect to attend about 20 of the 400 sessions available.
Insight has been open to clients since last year, but some sessions remain reserved for NetApp partners and employees. Some information is confidential, but without giving details and non-exhaustively, we can mention that a new generation of controllers and disk shelves is to be expected soon, that SnapCenter will eventually replace SnapManager (in cDOT only) and that new, much more direct transition options from 7-Mode to cDOT will be made available.
Other sessions also helped to deepen knowledge or to discover some very interesting tools and features.
In conclusion, NetApp Insight is a must, both to immerse yourself in the NetApp line of solutions and to find out NetApp’s vision and future direction.

Olivier Navatte, ESI Storage Specialist