Denial of service attacks – understanding and avoiding them

In October, a cyberattack on the DNS provider Dyn made many web services and sites inaccessible, including several broadcasters (Fox News, HBO, CNN, the Weather Channel, etc.) and major sites such as Netflix, PayPal, Yelp and Starbucks, to name just a few.

This attack is considered the largest denial of service attack ever recorded. To better understand what happened, we will first recall some basics of Internet communication, then discuss botnets and their evolution, before looking at the specifics of this recent attack. Finally, we will see how to guard against such attacks.

Internet Communication Basics

Most Internet communications are of the client-server type. The web browser usually acts as the "client" and sends requests to a server, asking it to deliver a YouTube video, for example.

Each server has its own IP address. When browsing Google, for instance, the server that responds to our request may differ depending on our geographical location. This is made possible by the Domain Name System (DNS).

DNS servers translate a name such as "www.google.com" into an IP address. This notion is important for understanding the attack that targeted Dyn.
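As a minimal illustration, the following Python sketch asks the operating system's resolver (which in turn queries DNS servers) to translate a hostname into IP addresses; the hostname used is just an example.

```python
import socket

def resolve(hostname):
    """Ask the system resolver (which in turn queries DNS servers)
    for the IP addresses behind a hostname."""
    results = socket.getaddrinfo(hostname, None)
    # Each entry is (family, type, proto, canonname, sockaddr);
    # the IP address is the first element of sockaddr.
    return sorted({entry[4][0] for entry in results})

if __name__ == "__main__":
    # The answer may differ depending on where the query is made from.
    print(resolve("www.google.com"))
```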

History of botnets

A “botnet” (a combination of “robot” and “network”) is a network of computers infected by a virus that turns them into passive agents listening for future instructions. The person controlling the botnet can then send commands to this army of infected computers, for example instructing the bots to send spam or to launch distributed denial of service (DDoS) attacks. The distributed nature of this architecture makes DDoS attacks difficult to detect.

With the miniaturization and ever-decreasing cost of computing devices, more and more objects are becoming “connected”. This creates an ever-growing network of printers, IP cameras and all kinds of other objects connected to the web. All these devices are ultimately small computers and, like all computers, they are vulnerable to attack.

Moreover, since few people take the time to configure these connected objects, most of them keep their default passwords, making it even simpler for an attacker to compromise them and infect them with viruses.

We thus find ourselves in a situation where many objects connected to the Internet are infected by a virus. And these devices, such as IP cameras, are always on, unlike our computers. During the most recent DDoS attack, this botnet managed to generate up to 1.2 terabits of data per second! That is a data rate equivalent to nearly 2,000 DVD-quality movies sent every minute!
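As a rough sanity check on that figure, assuming a single-layer DVD of about 4.7 GB:

```python
# Rough conversion of the reported attack rate into "DVDs per unit of time".
attack_rate_bps = 1.2e12        # 1.2 terabits per second
dvd_size_bits = 4.7e9 * 8       # a single-layer DVD holds about 4.7 GB

movies_per_second = attack_rate_bps / dvd_size_bits
movies_per_minute = movies_per_second * 60

print(f"{movies_per_second:.0f} DVDs/second, {movies_per_minute:.0f} DVDs/minute")
# -> roughly 32 DVDs per second, i.e. close to 2,000 per minute
```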

Why did this attack hurt so badly?

Denial of service attacks have traditionally targeted the servers or websites of companies chosen either for activist (hacktivist) reasons or for the purpose of extorting money.

The motives behind this attack are not yet known, but what sets it apart from previous ones is its target. For the first time, it was not the sites' own servers that were targeted, but the DNS servers of the Dyn company.

The sites of Twitter, PayPal and Netflix, for example, remained fully functional. But by preventing clients from learning the addresses of the servers to connect to, the attack made all these sites unreachable.

How to defend against these attacks?

DDoS attacks often follow a well-established pattern. A first way to protect yourself is therefore to use systems that detect the signatures of these attacks.
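As a simplified illustration of this kind of detection (rate-based rather than full signature matching), the following sketch flags any source address that sends an abnormal number of requests within a short window; the window size, threshold and test address are arbitrary assumptions.

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10   # assumed sliding window
THRESHOLD = 100       # assumed maximum requests per source within the window

recent = defaultdict(deque)   # source IP -> timestamps of recent requests

def record_request(source_ip, now=None):
    """Record one incoming request and return True if the source looks abusive."""
    now = time.time() if now is None else now
    timestamps = recent[source_ip]
    timestamps.append(now)
    # Discard timestamps that have fallen out of the window.
    while timestamps and now - timestamps[0] > WINDOW_SECONDS:
        timestamps.popleft()
    return len(timestamps) > THRESHOLD

if __name__ == "__main__":
    # Simulate a burst of 150 requests from one address within one second.
    for i in range(150):
        flagged = record_request("203.0.113.7", now=1000.0 + i / 150)
    print("abusive source detected:", flagged)
```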

Another preventive measure is to build redundancy into your servers. By using load balancers, you can intelligently route traffic to multiple servers, improving the system's resilience to high traffic volumes.
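The core idea of a load balancer can be sketched in a few lines: distribute requests round-robin across redundant back-end servers and skip any that are marked unhealthy. The server addresses below are placeholders.

```python
import itertools

class RoundRobinBalancer:
    """Distribute incoming requests across redundant back-end servers,
    skipping any server currently marked as unhealthy."""

    def __init__(self, servers):
        self.servers = list(servers)
        self._cycle = itertools.cycle(self.servers)
        self.healthy = set(self.servers)

    def mark_down(self, server):
        self.healthy.discard(server)

    def mark_up(self, server):
        self.healthy.add(server)

    def pick(self):
        # Try each server at most once per call, skipping unhealthy ones.
        for _ in range(len(self.servers)):
            server = next(self._cycle)
            if server in self.healthy:
                return server
        raise RuntimeError("no healthy back-end servers available")

if __name__ == "__main__":
    balancer = RoundRobinBalancer(["10.0.0.1", "10.0.0.2", "10.0.0.3"])  # placeholder addresses
    balancer.mark_down("10.0.0.2")  # simulate a failed server
    print([balancer.pick() for _ in range(4)])  # traffic flows only to healthy servers
```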

But that’s not all! We also need to guard against infection, to prevent one of our systems from becoming a botnet member. To do this, start by protecting computers with antivirus software.

However, many connected devices are too limited to run antivirus software. It is therefore essential to analyze the inbound traffic on your corporate network, both to detect known threats and to catch zero-day vulnerabilities.

You can further reduce the risk of infection by correlating and monitoring event logs through continuous network and systems monitoring, which is part of the services offered by ESI Technologies.

Finally, remember to keep systems up to date to mitigate the risk of known vulnerabilities being exploited, and use unique, complex passwords. Password management software exists to make your life easier.
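If a password manager is not at hand, a unique, complex password can at least be generated programmatically; here is a minimal sketch using Python's standard secrets module.

```python
import secrets
import string

def generate_password(length=20):
    """Generate a random password from letters, digits and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

if __name__ == "__main__":
    print(generate_password())   # a different, hard-to-guess password each run
```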

A specialized information security firm such as ESI Technologies will be able to assist you in analyzing your needs and selecting the most effective and efficient solutions to mitigate the risks of botnet attacks on your systems.

Tommy Koorevaar, Security Advisor – ESI Technologies

Cloud Strategy: data collection

Here is part 6 of our series covering the key issues to consider before adopting cloud technologies. This month, we discuss how to build your strategy and data points that must be considered.

When building a cloud strategy, organisations need to consider the desired business objectives and outcomes, set quantifiable and time-bound goals, and identify the specific initiatives the enterprise can and should undertake to execute the strategy and achieve those goals. As Gartner surveys on the subject from 2013 and 2014 have shown, process and culture are likely to be the biggest hurdles in any move to the cloud. Involving all parts of the business and gathering the right information can therefore help build the right strategy and identify potential problems ahead of time.

The first concrete step in building this strategy is to gather the data points needed to identify and define those objectives, goals and initiatives for the enterprise in the near and mid terms. Once the data is collected, you can review and analyze it, identify the desired business outcomes, set the (quantifiable) goals and define the specific initiatives you want to put in place to achieve them. This should not be a strict price or technology evaluation.

Data Collection
The data points needed will have to come from various parts of the organisation (business units, finance, HR and IT). Some of the required information may take the form of files, but much of it will reside directly with your staff, so interviews should be part of the data collection process. These interviews should take up to a few hours each and focus on the interviewee's functions, the processes used and the required/desired business outcomes, to provide insight into the actual impacts on the business before creating your cloud strategy.

With this data, you will be in a position to account for all aspects related to cloud computing, to see what it will affect and how, to evaluate its effect on the balance sheet (positive or negative) and to decide on your strategy moving forward.

Benoit Quintin, Director Cloud Services – ESI Technologies

Account of the NetApp Insight 2016 Conference

The 2016 Edition of NetApp Insight took place in Las Vegas from September 26 to 29.
Again this year, NetApp presented the ‘Data Fabric’ vision it unveiled two years ago. According to NetApp, the growth in capacity, velocity and variety of data can no longer be handled by the usual tools. As stated by NetApp’s CEO George Kurian, “data is the currency of the digital economy”, and NetApp wants to be seen as a bank helping organizations manage, move and grow their data globally. The current challenge of the digital economy is thus data management, and NetApp clearly intends to be a leader in this field. This vision becomes more concrete every year across the products and platforms added to the portfolio.

New hardware platforms

NetApp took advantage of the conference to officially introduce its new hardware platforms, which integrate 32Gb FC SAN ports, 40GbE network ports, NVMe SSD embedded read cache and 12Gb SAS-3 ports for back-end storage. Additionally, the FAS9000 and AFF A700 use a new, fully modular chassis (including the controller module) to facilitate future hardware upgrades.

Note that the SolidFire platforms drew attention from both NetApp and the audience: the former to explain their position in the portfolio, the latter to learn more about this extremely agile and innovative technology. https://www.youtube.com/watch?v=jiL30L5h2ik

New software solutions

  • SnapMirror for AltaVault, available soon through the SnapCenter platform (replacing SnapDrive/SnapManager): this solution allows backup of NetApp volume data (including application databases) directly to the cloud (AWS, Azure & StorageGrid) https://www.youtube.com/watch?v=Ga8cxErnjhs
  • SnapMirror for SolidFire is currently under development. No further details were provided.

The features presented reinforce the objective of offering a unified data management layer through the NetApp portfolio.

The last two solutions are more surprising since they do not require any NetApp equipment to be used. These are available on the AWS application store (SaaS).

In conclusion, we feel that NetApp is taking steps to be a major player in the “software defined” field, while upgrading its hardware platforms to get ready to meet the current challenges of the storage industry.

Olivier Navatte, Senior Consultant – Storage Architecture

Cryptolocker virus: how to clear the infection

Cryptolocker is a now well-known type of virus that can be particularly harmful to data stored on computers. The virus carries code that encrypts files, making them inaccessible to users, and demands a ransom (in bitcoin, for example) to decrypt them, hence the name “ransomware”.
Cryptolocker-type viruses infiltrate through different vectors (emails, file sharing websites, downloads, etc.) and are becoming more resistant to antivirus solutions and firewalls; it is safe to say that these viruses will continue to evolve and become increasingly good at circumventing corporate security measures. Cryptolocker is already in its 6th or 7th variation!

Is there an insurance policy?

All experts agree that a solid backup plan is always the best prescription for dealing with this type of virus. But what does a good backup plan imply, and what does a well-executed plan look like?
The backup plan must be tested regularly and preferably include an offsite backup copy. Using the ESI cloud backup service is an easy solution to implement.
The automated copy acts as an insurance policy in case of intrusion. Regular backups provide a secondary offsite datastore and act as a fallback mechanism in case of a malicious attack.

What to do in case of infection?

By the time your systems are infected with a Cryptolocker, you are already dealing with numerous encrypted files. If you do not have a mechanism in place to detect or monitor abnormal file changes (e.g. 100 files changed per minute), the damage can be very extensive.
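As an illustration of such a safeguard, the following sketch periodically scans a directory and raises an alert when more files change within a minute than a set threshold; the directory path is hypothetical and the 100-file threshold is taken from the example above.

```python
import os
import time

WATCH_DIR = "/data/shares"   # hypothetical directory to monitor
THRESHOLD = 100              # alert if more than 100 files change within a minute

def snapshot(path):
    """Map every file under path to its last-modification time."""
    mtimes = {}
    for root, _dirs, files in os.walk(path):
        for name in files:
            full_path = os.path.join(root, name)
            try:
                mtimes[full_path] = os.path.getmtime(full_path)
            except OSError:
                pass   # the file may have been removed or renamed mid-scan
    return mtimes

if __name__ == "__main__":
    previous = snapshot(WATCH_DIR)
    while True:
        time.sleep(60)
        current = snapshot(WATCH_DIR)
        changed = sum(1 for f, m in current.items() if previous.get(f) != m)
        if changed > THRESHOLD:
            print(f"ALERT: {changed} files changed in the last minute")
        previous = current
```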

  1. Notify the Security Officer of your IT department.
  2. Above all, do not pay this ransom, because you might be targeted again.
  3. You will have no choice but to restore your files from a backup copy. This copy becomes invaluable in your recovery efforts, as it will provide you a complete record of your data.

After treatment, are you still vulnerable?

Despite good backup practices, you still remain at risk after restoring your data.
An assessment of your security policies and your backup plan by professionals such as ESI Technologies will provide recommendations to mitigate such risks in the future. Some security mechanisms exist to protect you from viruses that are still unknown to detection systems. Contact your ESI representative to discuss it!

Roger Courchesne  – Director, Security and Internetworking Practice – ESI Technologies

Cloud Strategy: human impacts across the organization

Here is part five of our series covering the key issues to consider before adopting cloud technologies. This month, we discuss the impact on human resources.

Resources in your organisation, both on the IT side and on the business side, will be impacted by this change. While helping companies move to the cloud, we have had to assist with adapting IT job descriptions, processes and roles within the organisation.

As the IT organisation moves into a P&L role, its success becomes tied to stakeholder adoption of the services it offers. To achieve this, IT needs to get closer to the business units, understand their requirements and deliver access to resources on demand. None of this can happen unless things change within the IT group.

As companies automate their practice, and create a self-service portal to provision resources, some job descriptions need to evolve. A strong and clear communication plan with set milestones helps employees understand the changes coming to the organisation, and involving them in the decision process will go a long way to assist in the transition. We have seen that IT organisations with a clear communication plan at the onset that involved their employees in the process had a much easier transition, and faster adoption rate than those who did not.

Our experience helping customers with cloud computing shows that cloud significantly alters IT's role and its relationship with the business, and that employees' roles need to evolve. Training, staff engagement in the transition and constant communication will go a long way in helping your organisation move to this new paradigm.

Benoit Quintin, Director Cloud Services – ESI Technologies

Cloud Strategy: technological impacts

Here is part four of our series covering the key issues to consider before adopting cloud technologies. This article focuses specifically on technological impacts to consider.

Not all software technology is created equal. Not every application will migrate gracefully to the cloud: some will never tolerate the latency, while others were never designed to have multiple smaller elements working together rather than a few big servers. This means your business applications will need to be evaluated for cloud readiness. This is possibly the largest technological hurdle but, as with all technology, it may prove easier to solve than some of the other organisational issues.

To evaluate options for migrating an application to the cloud, one should look at the application's architecture (n-tiered or monolithic), its tolerance to faults and issues (e.g. latency, network errors, services down, servers down) and how users consume it (always from a PC at the office, or fully decentralized, with offline and mobile access). The current growth rate and state of the organisation are often mirrored in its IT consumption rate and requirements. An organisation experiencing high growth, or launching a project whose growth is not easily predictable, can benefit significantly from a scalable, elastic cloud model, whereas an organisation with slower growth, familiar/standard projects and predictable IT requirements will likely not assess the value of cloud computing the same way; accountability of resources and traceability of all assets in use may be of bigger concern.

Architecture, applications and legacy environments are all technological considerations that should be factored into any cloud computing viability and readiness assessment, but they should probably not be the main driver of your cloud strategy.

Benoit Quintin, Director Cloud Services – ESI Technologies

Cloud Strategy: legal impacts across the organization

Here is part three of our series covering the key issues to consider before adopting cloud technologies. This article focuses specifically on legal impacts on your organization.

“Location, location, location”. We’re more accustomed to hearing this in the context of the housing market. However, where your company’s headquarters reside, where your company does business and where its subsidiaries are located directly impact how you need to manage sensitive information, such as strategic projects, HR/personnel information, etc.; essentially, IT needs to account for data sovereignty laws and regulations.

Various countries have already passed, or are moving towards passing, more restrictive data sovereignty legislation that controls the transit of information across their borders. For example, the Canadian Personal Information Protection and Electronic Documents Act (PIPEDA) already governs how IT organisations can collect, use and disclose personal information in the course of commercial business. In addition, the Act contains various provisions to facilitate the use of electronic documents. Essentially, all personally identifiable information must stay in the country, at rest and in transit, meaning that using a cloud provider in the US or any other country for such data could expose the company – and you – to a lawsuit, unless the cloud provider can guarantee that this data never leaves the country at any time, including for redundancy/DR purposes.

While the previous Act covers what must be protected, American law (the USA Freedom Act and its previous incarnation, the Patriot Act) enables the US government to access any and all data residing on its soil without the owner's authorization, without a warrant and without even having to notify the owner before or after the fact. The few data privacy provisions in the bill apply to American citizens and entities only. This means all data housed in the US is at risk, especially if that data is owned by an organisation whose headquarters are out of country.

In Europe, although laws vary from country to country, regulations on data protection are becoming more stringent, requiring the establishment of procedures and controls to protect personal data and the explicit authorization of individuals to collect and use their information. All of this imposes guidelines on the use of the cloud within each country and outside its borders.

Typically, data sovereignty should be a concern for most organisations looking at the cloud and, as the current trend is for countries to pass more stringent laws, any cloud strategy should account for local, national and international regulations.

Benoit Quintin – Director Cloud Services – ESI Technologies

Cloud Strategy: business impacts across the organization

Here is the second part of our series covering the key issues to consider before adopting cloud technologies. This article focuses specifically on business impacts on your organization.

Most markets are evolving faster than ever before, and the trend seems to be accelerating, so organisations everywhere need to adapt and change the way they go to market. From a business standpoint, the flexibility and speed with which new solutions can be delivered via the cloud enable business units to react faster and better. So much so that, where IT organisations have not considered automating aspects of provisioning to provide more flexibility and faster access to resources, business units have started going outside of IT, to some of the public cloud offerings, for resources.

Planning for cloud should consider people and processes, as both will likely be directly impacted. From the requisition of resources all the way to charging back the different business units for the resources they consume, managed independently of project budgets, processes that were created and used before the advent of cloud in your organisation will need to be adapted, if not discarded and rebuilt from scratch. IT will need to change and evolve as it becomes an internal service provider (in many instances, a P&L entity) and a resource broker for the business units.

Considering the large capital investments IT has typically received as budget to ‘keep the lights on’, and considering that, until recently, this budget had been growing at a double-digit rate since the early days of the mainframe, the switch from a capital investment model to an operational model can significantly change the way IT does business. Indeed, we have seen this shift force IT to focus on what it does best, review its relationships with vendors and, ultimately, free up valuable investment resources. In many organisations, this has also translated into enabling net new projects to come to life, in and out of IT.

Once this transformation is underway, you should start seeing some of the benefits other organisations have been enjoying, starting with faster speed to market on new offerings. Indeed, in this age of mobile everything, customers expect access to everything all the time, and your competition is likely launching new offerings every day. A move towards cloud enables projects to move forward at an accelerated pace, letting you go to market with updated offerings much faster.

Benoit Quintin, Director Cloud Services, ESI Technologies

Review of Télécom 2016

This was the 13th edition of this annual event organized by Comtois-Carignan. ESI Technologies participated in the Industry Day on Tuesday April 26 during which 34 presentations on topics related to telecom, IT and contact centres were offered.

For the third consecutive year, we gave a presentation, this time on threat evolution and data protection. Installing security devices such as firewalls or first-generation IPS was once common and sufficient to protect organizations against threats that might affect a company's operations. Today, the rapid evolution of malicious activity requires deploying new solutions to better protect our assets. Our presentation provided an excellent overview of these solutions: next-generation firewalls and IPS, protection systems against advanced threats, security for web browsing, email security and unified authentication services.

Participants were able to ask questions about these pioneering technologies: protection solutions that provide the control and visibility needed to react better to a threat detected in the environment.

During the industry cocktail, 42 partner booths were available for participants to discuss technologies and service offerings. This cocktail formula is highly appreciated by participants, giving them the opportunity to discuss and share views on presentations of the day.

If you missed the ESI presentation, please contact us so we can share its content with you.

Roger Courchesne – Networking and Security Practice Manager

Cloud computing: strategy and IT readiness – Transformation in IT

Here is the first of a series of articles that provide both business and IT executives with insights into the key issues they should consider when evaluating cloud services, paying particular attention to the business and legal ramifications of moving to a cloud environment, whether private, hybrid or public.

For the last few decades, IT organisations have been the only option for provisioning IT resources for projects. Indeed, all new projects would involve IT, and the IT team was responsible for acquiring, architecting and delivering the solution that would sustain the application/project during its lifecycle, planning for upgrades along the way.
This led to silo-based infrastructures and teams, often designed for peak demand, with no possibility of efficiency gains across projects. The introduction of compute virtualization, first for test/dev and then for production, showed that other options were possible and that, by aggregating requirements across projects, IT could achieve significant economies of scale and cost while gaining flexibility and speed to market, as provisioning a virtual server suddenly became a matter of days rather than weeks or months.
Over time, IT started applying these same methods to storage and networking, which showed similar flexibility, scalability and efficiency improvements. These gains, combined with automation capabilities and self-service portals, evolved over time into what we now know as ‘cloud offerings’.
In parallel, IT in some organisations has become structured, organized, usually siloed and, unfortunately, somewhat slow to respond to business needs. This has led to a slow erosion of IT's power and influence over the acquisition, delivery and management of IT resources. Coupled with the commercial/public cloud options available today, capital is rapidly leaving the organisation for third-party public cloud vendors, a phenomenon also known as shadow IT. This raises concerns, not the least of which is that funds are sent outside the organisation to address tactical issues, typically without regard to legal implications, data security or cost efficiency. These issues highlight IT's need to react faster, become more customer driven, deliver more value and provide its stakeholders with flexibility matching that of the public cloud. Essentially, IT needs to evolve into a business partner, with cloud computing providing the tools by which IT offers the flexibility, scalability and speed to market that the business units are looking for in today's market.

Benoit Quintin, Director Cloud Services, ESI Technologies