It’s time to rethink cybersecurity.

For many years, organizations have focused their security efforts on perimeter and endpoint protection. Firewalls, antivirus software, intrusion detection and anti-spyware tools are all effective up to a point, but they fail to stop the vast majority of threats.

A recent ServiceNow survey of 300 chief information security officers found that 81% are highly concerned that breaches are going unaddressed and 78% are worried about their ability to detect breaches in the first place. IBM’s 2017 X-Force Threat Intelligence Index reported a 566% increase in the number of compromised records in 2016 compared to the previous year. FireEye reported that the average time it takes an organization to detect an intrusion is over 200 days.

Endpoint security measures will only become less effective as endpoints proliferate. Smartphones introduced a whole new class of threats, and the internet of things (IoT) will add billions of endpoint devices to networks over the next few years, many of which have weak security or none at all.

That’s why cybersecurity, in the words of Cisco CEO Chuck Robbins, “needs to start in the network.” The approach that Cisco is championing recognizes the reality that breaches today are inevitable but that they needn’t be debilitating. The increasing popularity of security operations centers shows that IT organizations are shifting their attention to creating an integrated view of all the activity on their networks – including applications, databases, servers and endpoints – and adopting tools that can identify patterns that indicate a breach. For example, multiple access attempts from a certain IP address or large outbound file transfers may indicate an intrusion, and that activity can be stopped before much damage is done.
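As a toy illustration (not any particular vendor's product), the two patterns mentioned above can be expressed as simple rules over log records; the field names and thresholds here are invented for the sketch:

```python
from collections import Counter

# Hypothetical, simplified log records: (source_ip, event_type, bytes_out).
# Field names and thresholds are illustrative, not from any specific product.
events = [
    ("10.0.0.5", "login_failed", 0),
    ("10.0.0.5", "login_failed", 0),
    ("10.0.0.5", "login_failed", 0),
    ("10.0.0.9", "login_ok", 0),
    ("10.0.0.5", "file_transfer", 900_000_000),
]

FAILED_LOGIN_THRESHOLD = 3               # repeated access attempts
OUTBOUND_BYTES_THRESHOLD = 500_000_000   # unusually large outbound transfer

# Rule 1: many failed access attempts from the same IP.
failed = Counter(ip for ip, kind, _ in events if kind == "login_failed")
suspect_ips = {ip for ip, n in failed.items() if n >= FAILED_LOGIN_THRESHOLD}

# Rule 2: large outbound file transfers.
big_transfers = {ip for ip, kind, b in events
                 if kind == "file_transfer" and b >= OUTBOUND_BYTES_THRESHOLD}

alerts = suspect_ips | big_transfers
print(alerts)  # IPs matching either intrusion pattern
```

A real security operations center correlates far more signal sources than this, but the principle is the same: rules over an integrated view of activity, evaluated before much damage is done.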

Fortunately, technology is evolving to support the network-centric approach. Big data platforms like Hadoop have made it practical and affordable for organizations to store large amounts of data for analysis. Stream-processing platforms like Apache Spark and Apache Kafka can capture and analyze data in near real time. Machine learning programs, applied to large data stores like Hadoop, can continuously sort through network and server logs to find anomalies, becoming “smarter” as they go.
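A minimal sketch of the anomaly-finding idea, using a plain statistical threshold rather than a full machine-learning pipeline: flag any hour whose request volume deviates sharply from the historical mean. The counts are invented:

```python
import statistics

# Illustrative stand-in for log-based anomaly detection: flag hours whose
# request volume is more than two standard deviations from the mean.
hourly_requests = [120, 115, 130, 125, 118, 122, 940, 119]  # 940 is anomalous

mean = statistics.mean(hourly_requests)
stdev = statistics.stdev(hourly_requests)

anomalies = [(hour, n) for hour, n in enumerate(hourly_requests)
             if abs(n - mean) > 2 * stdev]
print(anomalies)
```

Production systems replace the fixed threshold with models that learn baselines per host, per user and per time of day, but the core task is the same: separate normal variation from the spike worth investigating.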

And the cloud presents new deployment options, which is why security is rapidly migrating from dedicated hardware to cloud-based solutions delivered under a software-as-a-service model. Grand View Research estimates that the managed security services market was worth more than $17.5 billion in 2015 and will grow to more than $40 billion by 2021. As organizations increasingly virtualize their networks, these services will become integrated into basic network services. That means no more firmware upgrades, no more site visits to fix balky firewalls and no more anti-malware signature updates.

It’s too early to say that the tide has turned in the fight against cyber-criminals, but the signs are at least promising. It’s heartening to see Cisco make security such an important centerpiece of its strategy. Two recent acquisitions – Jasper and Lancope – give the company a prominent presence in cloud-based IoT security and deep learning capabilities for network and threat analysis. The company has said that security will be integrated into every new product it produces going forward. Perhaps that’s why Robbins has called his company “the only $2 billion security business that is growing at double digits.”

Is your network ready for digital transformation?

If your company has more than one location, you know the complexity that’s involved in maintaining the network. You probably have several connected devices in each branch office, along with firewalls, Wi-Fi routers and perhaps VoIP equipment. Each patch, firmware update or new malware signature needs to be installed manually, necessitating a service call. The more locations you have, the bigger the cost and the greater the delay.

This is the state of technology at most distributed organizations these days, but it won’t scale well for the future. Some 50 billion new connected smart devices are expected to come online over the next three years, according to Cisco. This so-called “Internet of things” (IoT) revolution will demand a complete rethinking of network infrastructure.

Networks of the future must flexibly provision and manage bandwidth to accommodate a wide variety of usage scenarios. They must also be manageable from a central point. Functionality that’s currently locked up in hardware devices must move into software. Security will become part of the network fabric, rather than being distributed to edge devices. Software updates will be automatic.

Cisco calls this vision “Digital Network Architecture” (DNA). It’s a software-driven approach enabled by intelligent networks, automation and smart devices. By virtualizing many functions now provided by physical hardware, your IT organization can gain unparalleled visibility and control over every part of its network.

For example, you can replace hardware firewalls with a single socket connection. Your network administrators can get a complete view of every edge device, and your security operations staff can use analytics to identify and isolate anomalies. New phones, computers or other devices can be discovered automatically and appropriate permissions and policies enforced centrally. Wi-Fi networks, which are one of the most common entry points for cyber attackers, can be secured and monitored as a unit.

One of the most critical advantages of DNA is flexible bandwidth allocation. Many organizations today provision bandwidth for the worst-case scenario, resulting in excess network capacity that sits idle much of the time. In a fully software-defined scenario, bandwidth is allocated only as needed, so a branch office that’s experiencing a lull doesn’t steal resources from a busy one. Virtualized server resources can be allocated the same way, improving utilization and reducing waste.

IoT will demand unprecedented levels of network flexibility. Some edge devices – such as point-of-sale terminals – will require high-speed connections that carry quick bursts of information for tasks such as credit card validation. Others, like security cameras, need to transmit much larger files but have greater tolerance for delay. Using a policy-based DNA approach, priorities can be set to ensure that each device gets the resources it needs.
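As a hypothetical sketch (the class names and profile values are invented, not Cisco's actual DNA policy model), policy-based prioritization amounts to a mapping from device classes to quality-of-service profiles that the network resolves per device:

```python
# Invented device classes and QoS profiles, purely for illustration.
POLICIES = {
    "pos_terminal":    {"priority": "high",   "latency_ms": 50,   "burst": True},
    "security_camera": {"priority": "medium", "latency_ms": 500,  "burst": False},
    "default":         {"priority": "low",    "latency_ms": 1000, "burst": False},
}

def qos_profile(device_class: str) -> dict:
    """Return the QoS profile for a device class, falling back to a default."""
    return POLICIES.get(device_class, POLICIES["default"])

print(qos_profile("pos_terminal")["priority"])   # point-of-sale: fast bursts
print(qos_profile("thermostat")["priority"])     # unknown class: default policy
```

The point of the centralized policy table is that adding a new device class, or retuning an existing one, is a single edit applied network-wide rather than a per-device configuration change.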

Getting to DNA isn’t an overnight process. Nearly every new product Cisco is bringing to the market is DNA-enabled. As you retire older equipment, you can move to a fully virtualized, software-defined environment in stages. In some cases, you may find that the soft costs of managing a large distributed network – such as travel, staff time and lost productivity – already justify a switch. Whatever the case, ESI has the advisory and implementation expertise to help you make the best decision.

Understanding and adopting Splunk

Splunk has been trending in the industry for quite some time, but what do we actually know about how it is used and the market it targets?

The name Splunk comes from the word “spelunking”, which refers to locating, exploring, studying and mapping caves. True to its name, Splunk works through machine data in three stages:

  1. Data indexing: Splunk collects data from different sources, combines it and stores it in a centralized index.
  2. Searching on indexes: using indexes gives Splunk a high degree of speed when searching for the sources of problems.
  3. Filtering results: Splunk provides users with several tools for filtering results, for faster detection of problems.
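As a toy illustration of why these stages make search fast (vastly simplified compared to the real product), here is an inverted index over a few log lines:

```python
from collections import defaultdict

# Simplified far beyond the real product: instead of scanning every log
# line for a term, an index lets us jump straight to matching lines.
logs = [
    "ERROR disk full on server01",
    "INFO backup completed",
    "ERROR timeout contacting server02",
    "INFO user login",
]

# 1. Indexing: map each term to the line numbers where it appears.
index = defaultdict(set)
for lineno, line in enumerate(logs):
    for term in line.lower().split():
        index[term].add(lineno)

# 2. Searching: a lookup instead of a full scan.
hits = sorted(index["error"])
print([logs[i] for i in hits])

# 3. Filtering: narrow the results with a second term.
filtered = sorted(index["error"] & index["timeout"])
print([logs[i] for i in filtered])
```

Splunk's index is of course far more sophisticated (time-based buckets, field extraction, compression), but this is the essential reason indexed search beats grepping raw files.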

For more than a year I have been experimenting with Splunk in several areas: security, storage, infrastructure, telecom and more. We at ESI have a comprehensive lab, which allowed me to push my experiments further.

In addition to all that internal data, I used open data to experiment with Splunk’s ability to interpret it.

I tested the open data from the “montreal.bixi.com” site; it is raw data with the following fields:

Start date –  Start station number –  Start station –  End date –  End station number –  End station –  Account type – Total duration (ms)

With this data, we can find the most common routes, estimate the average duration of a trip, and identify the docking stations most in demand for bike pickups and returns.
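A rough sketch of those calculations in Python, using a handful of invented trips shaped like the fields above:

```python
from collections import Counter
from statistics import mean

# Hypothetical trips: (start_station, end_station, duration_ms).
# Station names and durations are made up for illustration.
trips = [
    ("Berri", "Peel", 600_000),
    ("Berri", "Peel", 720_000),
    ("Peel", "Berri", 540_000),
    ("McGill", "Peel", 900_000),
]

# Most common route.
routes = Counter((start, end) for start, end, _ in trips)
print(routes.most_common(1))

# Average trip duration, in minutes.
avg_minutes = mean(d for _, _, d in trips) / 60_000
print(round(avg_minutes, 1))

# Busiest stations for departures and arrivals.
departures = Counter(start for start, _, _ in trips)
arrivals = Counter(end for _, end, _ in trips)
print(departures.most_common(1), arrivals.most_common(1))
```

In Splunk itself these become one-line searches with `stats` and `top`, but the underlying aggregations are the same.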

For the service’s operations team, this shows in real time, or predicts for the next day, which docking stations should be stocked with more bicycles, and above all where those bicycles will go. The team could predict shortages or surpluses of bikes at the docking stations. If data is collected in real time, alerts could be issued to flag a potential shortage or surplus at a station. The system thus facilitates planning and allows the team to be proactive in meeting demand, rather than reactive. We could even detect a missing bicycle: for instance, a bike that has not been docked for more than 24 hours could trigger an alert so that the operations team attempts to trace it.
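The 24-hour alert could be sketched like this; the bike IDs and timestamps are invented:

```python
from datetime import datetime, timedelta

# Invented checkout times for bikes currently out of a dock.
now = datetime(2017, 6, 1, 12, 0)
last_checkout = {
    "bike-101": datetime(2017, 5, 30, 9, 0),   # out ~51 hours -> alert
    "bike-202": datetime(2017, 6, 1, 11, 30),  # out 30 minutes -> fine
}

# Flag any bike checked out more than 24 hours ago with no return logged.
missing = [bike for bike, out in last_checkout.items()
           if now - out > timedelta(hours=24)]
print(missing)
```

In practice this would run as a scheduled search over the live event stream, with the alert wired to the operations team's ticketing or paging system.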

Marketers might think this data is useless, when the opposite is true: the same data can be used to build offers that attract customers, since it gives departure and arrival times, trip durations and the most used routes. One can thus identify the busiest time slots and run promotions, or adjust rates, according to traffic or customer-loyalty objectives.

For management, the open data unfortunately does not give the price of trips according to user status (member or non-member), but the beauty of Splunk is that the collected data can be enriched with data from a third-party system, a database or even manually collected data. Management could then obtain reports and dashboards based on various factors, such as user status, travel time, day of the week and much more. We could even compare with previous months or the same month of the previous year. The applications are virtually limitless with data that resides in Splunk: the only limit is our imagination!
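A sketch of that enrichment idea, outside Splunk itself: join trip records with an invented member-status lookup, then break down average duration by status:

```python
from collections import defaultdict

# Invented trips carrying only an account id, plus a separate lookup
# table supplying the member status -- the kind of third-party data
# Splunk lookups can join against indexed events.
trips = [
    {"account": "A1", "duration_min": 12},
    {"account": "A2", "duration_min": 35},
    {"account": "A1", "duration_min": 8},
]
status_lookup = {"A1": "member", "A2": "non-member"}

# Enrich each trip with the status from the lookup table.
enriched = [{**t, "status": status_lookup.get(t["account"], "unknown")}
            for t in trips]

# Average duration per status: a breakdown management might ask for.
totals, counts = defaultdict(int), defaultdict(int)
for t in enriched:
    totals[t["status"]] += t["duration_min"]
    counts[t["status"]] += 1
averages = {s: totals[s] / counts[s] for s in totals}
print(averages)
```

In Splunk the same join is a `lookup` command in the search, and the breakdown a `stats avg(...) by status`; the sketch just makes the mechanics visible.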

These are of course fictitious examples made with available open data, but which could be real with your own systems and data.

Collecting information from a website, for example, can provide visibility for everyone in a company: operations receives system-overload alerts, marketers get information about where connections originate so they can target their campaigns, and management gets a view of the user experience, as well as performance metrics that confirm SLAs.

Whether it is security, operations, marketing or analytics, Splunk can address your needs. In addition to the 1,200 applications available in its portal, you can create your own tables, reports and alerts. You can also use Splunk’s Pivot interface to let people explore the data easily and build their own dashboards.

The platform is easy to use and requires no special expertise: you just need to get your data into it.

Do not hesitate to contact ESI for a presentation or a demo; it will be my pleasure to show you how to “Splunk”.

Guillaume Paré
Senior Consultant, Architecture & Technologies – ESI Technologies

What about Big Data & Analytics?

After the “cloud” hype, here comes the “big data & analytics” hype, and this one is not just hype. Big data & analytics enables companies to make better business decisions faster than ever before. It helps identify opportunities for new products and services and bring innovative solutions to market faster. It assists IT and the help desk in reducing mean time to repair and troubleshoot, while providing reliable metrics for better IT spending planning. It guides companies in improving their security posture, giving them more visibility into the corporate network to identify suspicious activities that go undetected by traditional signature-based technologies. It serves to meet compliance requirements. In short, it makes companies more competitive! One simply has to go on YouTube to see the amazing things companies are doing with Splunk, for example.

I remember when I started working in IT sales in the mid-90s, a “fast” home Internet connection was 56k and the Internet was rapidly gaining in popularity. A small company owner called me and asked, “What are the competitive advantages of having a website?” I replied, “It’s no longer a competitive advantage, it’s a competitive necessity,” and to prove my point I asked him to search for his competitors on the Internet: he saw that all of his competitors had websites!

The same can now be said of big data & analytics. With all the benefits it brings, it is becoming a business necessity. But before you start rushing into big data & analytics, know the following important facts:

  1. According to Gartner, 69% of corporate data has no business value whatsoever
  2. Also according to Gartner, only 1.5% of corporate data is high-value data

This means that you will have to sort through a whole lot of data to find the valuable material you need to grow your business, reduce costs, outpace the competition, find new revenue sources, and so on. It is estimated that every dollar invested in a big data & analytics solution brings four to six dollars in infrastructure investments (new storage to hold all that priceless data, CPU to analyze it, security to protect it, etc.).

So before you plan a $50,000 investment in a big data & analytics solution and find out it comes with a $200,000 to $300,000 investment in infrastructure, talk to subject matter experts. They can help design strategies to home in on the 1.5% of high-value data, reducing the required investment while maximizing the results.

Charles Tremblay, ESI Account Manager

The greatest IT confusion ever?

Does it even beat Y2K? It’s been a year now since I rejoined the IT integration industry. When I left it in 2003 to focus on PKI technologies, those were still the good old days of client-server IT infrastructure, right after Y2K and the dot-com bubble burst. For a year now I have been trying to understand clients’ challenges to see how I can help. For a year now I have watched my clients trying to understand the mutations that appear to be transforming the IT industry, and how those changes affect them not only on a business level but on professional and personal levels as well. I find them fearful and closed. Witnessing this, I told a colleague of mine: “It seems our clients are capable of telling us what they don’t want, but rarely have a clear vision of what they’re aiming for!”

Trending concepts

Big data, the internet of things, stuff called cloud, anything anywhere anytime on any device, the software-defined company and so on: all these new terms are bombarding our clients and are supposed to showcase the many new trends in the industry. I recently attended a seminar where the audience was separated into three categories: traditional IT folks, who resist these changes and new trends because they reshape traditional IT infrastructure and may even jeopardize their job description or security; new line-of-business managers, who embrace change and are shopping for apps that get the job done; and senior management, who talk the language of numbers (growth percentage, market share and other measurable KPIs) and with whom you need to be able to prove ROI (not TCO, which is the IT folks’ concern).

And there we have it: widespread confusion and fear. Y2K all over again? People forget that BI has been around for a while; so have the Internet, thin-client environments, databases and the rest. It is just happening on a different scale, and the challenge remains to bridge the gap between the corporate and business objectives defined by senior management, the right tools and processes for line-of-business owners to get the job done, and IT, which still has an important role in solution selection, integration and support, be it on site or off site.

My challenge over the last year has been to overcome those fears, so that my clients can have open discussions about their business objectives, set the buzzwords aside and refocus on the real questions: where do you want to be in three to five years as a company, what IT tools will be required to help you get there, and which of those can I help you with?

Charles Tremblay, ESI Account Manager