Denial of service attacks – understanding and avoiding them

In October, a cyberattack on the DNS provider Dyn made many web services and sites inaccessible, including several media outlets (Fox News, HBO, CNN, the Weather Channel, etc.) and world-class sites such as Netflix, PayPal, Yelp and Starbucks, to name just a few.

This attack is considered the largest denial of service attack ever recorded. To better understand what happened, we will first recall some basic notions of Internet communications. We will then talk about botnets and their evolution, before looking at the specifics of this recent attack. Finally, we will see how to guard against such attacks.

Internet Communication Basics

Most Internet communications follow the client-server model. The web browser typically acts as the "client", sending requests to a server, asking it to play a YouTube video, for example.

Each server has its own IP address. When browsing Google, for instance, the server that answers our request may differ depending on our geographical location. This is made possible by the Domain Name System (DNS).

DNS servers translate a human-readable name such as "www.google.com" into an IP address. This notion is important for understanding the attack that targeted Dyn.
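That translation step can be sketched conceptually. The records below are invented for illustration; a real client would instead ask the operating system's resolver, for example via Python's `socket.getaddrinfo`:

```python
# Conceptual sketch of what a DNS lookup does: map a human-readable name
# to an IP address. The records below are invented for illustration; a
# real client would ask the operating system's resolver instead, e.g.
# socket.getaddrinfo("www.google.com", 443).

RECORDS = {
    "www.example.com": "93.184.216.34",
    "www.dyn.example": "203.0.113.10",
}

def resolve(hostname: str) -> str:
    """Return the IP address registered for a hostname."""
    try:
        return RECORDS[hostname]
    except KeyError:
        # Without a DNS answer, a browser has no address to connect to,
        # which is exactly why the Dyn outage made healthy sites unreachable.
        raise LookupError(f"no DNS record for {hostname}")

print(resolve("www.example.com"))  # the browser can now open a connection
```

When the lookup fails, the client is stuck even if the target site itself is perfectly healthy.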

History of botnets

A "botnet" (a blend of "robot" and "network") is a network of computers infected by a virus that turns them into passive entities listening for future instructions. The person controlling the botnet can then send commands to this army of infected computers, for example ordering the bots to send spam or to launch distributed denial of service (DDoS) attacks. The distributed nature of this architecture makes DDoS attacks difficult to detect and block.

With the miniaturization and ever-decreasing cost of computing devices, more and more objects become “connected”. This creates an ever-growing network of printers, IP cameras and all kinds of objects that are connected to the web. All these devices are ultimately small computers, and like all computers, they are vulnerable to attacks.

Moreover, since few people take the time to configure these connected objects, most of them keep their default passwords, making it even simpler for an attacker to compromise and infect them with viruses.

We thus find ourselves in a situation where many objects connected to the Internet are infected. And unlike our computers, these devices, such as IP cameras, are always on. During the Dyn attack, the botnet managed to generate up to 1.2 terabits of data per second, a rate equivalent to roughly 30 full DVDs' worth of data every second!
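The DVD comparison is easy to verify with a back-of-the-envelope calculation (assuming a standard single-layer DVD capacity of about 4.7 GB):

```python
# Sanity-check of the 1.2 Tb/s figure reported for the Dyn attack.
# Assumption: a single-layer DVD holds about 4.7 GB.
bits_per_second = 1.2e12                 # 1.2 terabits per second
bytes_per_second = bits_per_second / 8   # 150 GB per second
dvd_bytes = 4.7e9

dvds_per_second = bytes_per_second / dvd_bytes
print(f"{bytes_per_second / 1e9:.0f} GB/s, about {dvds_per_second:.0f} DVDs per second")
```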

Why did this attack hurt so badly?

Denial of service attacks have traditionally targeted the servers or websites of companies chosen either for activism (hacktivism) reasons or for the purpose of extorting money.

The motives for this attack are not yet known, but what sets it apart from previous ones is the target. For the first time, it was not the websites' own servers that were hit, but the DNS servers of the company Dyn.

The servers of Twitter, PayPal and Netflix, for example, remained fully functional. But by preventing us from learning the addresses of the servers to connect to, the attack made all these sites unreachable.

How to defend against these attacks?

DDoS attacks often follow a well-established pattern. A first line of defence is therefore to deploy systems that detect the signatures of these attacks.

Another preventive measure is to build redundancy into your servers. Load balancers can intelligently route traffic across multiple servers, improving the system's resilience to heavy traffic.
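The simplest load-balancing policy, round-robin, can be sketched in a few lines (the server addresses here are invented for illustration; production setups would use a dedicated load balancer such as HAProxy or a cloud service):

```python
from itertools import cycle

# Minimal round-robin load balancing sketch (server addresses invented):
# each incoming request is handed to the next server in the pool, so no
# single machine absorbs all of the traffic.
servers = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]
pool = cycle(servers)

def route_request() -> str:
    """Pick the next server in the pool for an incoming request."""
    return next(pool)

assignments = [route_request() for _ in range(6)]
print(assignments)  # the six requests are spread evenly across the pool
```

Real load balancers add health checks on top of this, so that traffic is only sent to servers that are actually responding.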

But that's not all! We also need to guard against infection, to prevent our own systems from becoming botnet members. The first step is to protect computers with antivirus software.

However, many connected devices are too limited to run antivirus software. It is therefore essential to analyze the inbound traffic on your corporate network, to detect both known threats and zero-day exploits.

You can further reduce the risk of infection of your systems by collecting, correlating and monitoring event logs through continuous network and systems monitoring, which is part of the services offered by ESI Technologies.
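The idea behind log correlation can be illustrated with a toy example (the log lines and threshold below are invented; real deployments use SIEM platforms that correlate events across many sources):

```python
from collections import Counter

# Toy illustration of log correlation: flag source IPs that generate an
# abnormal number of failed logins across different systems. The log
# lines and the threshold are invented for illustration.
logs = [
    "sshd 203.0.113.7 FAILED",
    "sshd 203.0.113.7 FAILED",
    "web  198.51.100.2 OK",
    "sshd 203.0.113.7 FAILED",
    "vpn  203.0.113.7 FAILED",
]

# Count failures per source IP, then flag anything above the threshold.
failures = Counter(line.split()[1] for line in logs if line.endswith("FAILED"))
suspects = [ip for ip, count in failures.items() if count >= 3]
print(suspects)
```

Correlating the SSH and VPN logs together is what surfaces the pattern here; either log on its own looks far less suspicious.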

Finally, remember to keep systems up to date, to mitigate the risk of known vulnerabilities being exploited, and to use unique, complex passwords. Password managers exist to make your life easier.
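Generating a unique, complex password per device or account is straightforward with Python's cryptographically secure `secrets` module, as this small sketch shows:

```python
import secrets
import string

# Sketch: generate a unique, complex password using the cryptographically
# secure `secrets` module (never `random`, which is predictable).
ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length: int = 16) -> str:
    """Return a random password drawn uniformly from the alphabet."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(generate_password())
```

This is essentially what password managers do for you, while also remembering the result so each device can get its own credential.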

A specialized information security firm such as ESI Technologies will be able to assist you in analyzing your needs and selecting the most effective and efficient solutions to mitigate the risks of botnet attacks on your systems.

Tommy Koorevaar, Security Advisor – ESI Technologies

Cloud Strategy: data collection

Here is part 6 of our series covering the key issues to consider before adopting cloud technologies. This month, we discuss how to build your strategy and the data points that must be considered.

When building a cloud strategy, organisations need to define the business outcomes they want, set quantifiable and time-bound goals, and identify the specific initiatives the enterprise can and should undertake to execute the strategy and achieve those goals. As Gartner surveys from 2013 and 2014 showed, process and culture are likely to be the biggest hurdles in any move to the cloud. Involving all parts of the business and gathering the right information will therefore help build the right strategy and identify potential problems ahead of time.

The first concrete step in building this strategy is to gather the data points needed to identify and define those objectives, goals and initiatives for the enterprise in the near and medium term. Once the data is collected, you can review and analyze it, identify the desired business outcomes, set the (quantifiable) goals and define the specific initiatives you want to put in place to achieve them. This should not be a strict price or technology evaluation.

Data Collection
The data points needed will come from various parts of the organisation (business units, finance, HR and IT). Some of the information may exist in files, but much of it resides with your staff directly, so interviews should be part of the data collection process. These interviews, lasting up to a few hours each, should focus on the interviewees' functions, the processes they use and the business outcomes they require or desire, to provide insight into the actual impact on the business before you create your cloud strategy.

With this data in hand, you will be able to account for every aspect cloud computing touches, see what it will affect and how, evaluate its effect on the balance sheet (positive or negative) and decide on your strategy moving forward.

Benoit Quintin, Director Cloud Services – ESI Technologies

Cloud computing: strategy and IT readiness – Transformation in IT

Here is the first of a series of articles providing both business and IT executives with insights into the key issues they should consider when evaluating cloud services, paying particular attention to the business and legal ramifications of moving to a cloud environment, whether private, hybrid or public.

For the last few decades, IT organisations have been the only option for provisioning IT resources for projects. Indeed, all new projects would involve IT, and the IT team was responsible for acquiring, architecting and delivering the solution that would sustain the application/project during its lifecycle, planning for upgrades along the way.
This led to silo-based infrastructures (and teams), often designed for peak demand, with no possibility of efficiency gains between projects. The introduction of compute virtualization, first for test/dev and then for production, showed that other options were available: by aggregating requirements across projects, IT could achieve significant economies of scale and cost while gaining flexibility and speed to market, as provisioning a virtual server suddenly became a matter of days rather than weeks or months.
Over time, IT started applying these same methods to storage and network and these showed similar flexibility, scalability and efficiency improvements. These gains, together with automation capabilities and self-service portals, were combined over time to become what we know as ‘cloud offerings’.
In parallel, IT in some organisations has become structured, organized, usually siloed and, unfortunately, somewhat slow to respond to business needs. This has led to a slow erosion of IT's power and influence over the acquisition, delivery and management of IT resources. Coupled with the commercial/public cloud options available today, capital is rapidly leaving organisations for third-party public cloud vendors, a phenomenon also known as shadow IT. This raises concerns, not the least of which is that funds are sent outside the organisation to address tactical issues, typically without regard for legal implications, data security or cost efficiency. These issues highlight IT's need to react faster, become more customer-driven, deliver more value and provide its stakeholders with flexibility matching that of the public cloud. Essentially, IT needs to evolve into a business partner, with cloud computing providing the tools by which IT offers the flexibility, scalability and speed to market that business units are looking for in today's market.

Benoit Quintin, Director Cloud Services, ESI Technologies

Where’s the promised agility?

The world of technology solutions integrators has changed dramatically in the last 10 years.

Customers are more educated than ever, thanks to the wealth of information available on the Internet. It is estimated that 80% of customer decision-making happens online before they even reach out to us. This is not unique to our industry. The Internet is now woven into the fabric of society: clients now arrive at the veterinary clinic believing they have already identified their pet's disease, since "the Internet" provided them with a diagnosis!

What about the promises of industry giants? Simplified IT, reduced OPEX, increased budgets for projects instead of maintenance, etc.?

How can we explain that we don't see this in our conversations with customers? How is it that clients who have embraced those technologies still admit they now face greater complexity than before? Perhaps the flaw lies precisely in the fact that 80% of decisions are based on well-designed, well-crafted web marketing strategies…

Regardless of technological evolution, the key, it seems, is still architecture design: thought through with a business purpose, and an IT integration strategy tailored to your specific needs with the help of professionals. Just as a veterinarian is certainly a better source of information than the Internet to look after your pet…

For over 20 years, ESI has been designing agile, scalable solutions customized to the specific needs of organisations. ESI works closely with customers to bridge the gap between business needs and technology, maximizing ROI and providing objective professional advice.

The IT Catch-22

OK, so everyone's talking about it: our industry is undergoing major changes. It started with a first reference architecture of mainframes and minicomputers, designed to serve thousands of applications used by millions of users worldwide. With the advent of the Internet, it evolved into the "client-server" architecture, designed to run hundreds of thousands of applications used by hundreds of millions of users. And where are we now? We appear to be witnessing the birth of a third generation of architecture, which IDC describes as "the next generation compute platform that is accessed from mobile devices, utilizes Big Data, and is cloud based". It is referred to as "the third platform", and it is destined to deliver millions of applications to billions of users.

Virtualization seems to have been the spark that ignited this revolution. The underlying logic of this major shift is that virtualization abstracts away the hardware, pooling performance and assets so they can be shared by different applications, for different uses, according to the needs of different business units within an organization. The promise is that companies can do more with less. Therefore, IT budgets can be reduced!
These changes are huge. In this third platform, IT is built, run, consumed and, finally, governed differently. Everything changes from the ground up. It would seem obvious that one needs to invest in careful planning of the transition from the second to the third platform. What pace can we sustain? What can be moved out to public clouds? What investments are required in our own infrastructure? How will it affect our IT staff? What training and knowledge will they require? What about security and risk?
The catch is the following: the third platform allows IT to do much more with less, so IT budgets are reduced or, at best, flattened. Yet moving to the third platform requires investment. Get it? Every week we help CIOs and IT managers raise this issue within their organization, so that they can obtain the investments they need to move to the third platform and reap its benefits.

What about Big Data & Analytics?

After the "cloud" hype comes the "big data & analytics" hype, except this one is not just hype. Big data & analytics enables companies to make better business decisions faster than ever before; helps identify opportunities for new products and services and bring innovative solutions to market faster; assists IT and the helpdesk in reducing mean time to repair and troubleshoot, while providing reliable metrics for better IT spending planning; guides companies in improving their security posture by giving more visibility into the corporate network and identifying suspicious activities that go undetected by traditional signature-based technologies; and serves to meet compliance requirements… in short, it makes companies more competitive! One simply has to go on YouTube to see the amazing things companies are doing with Splunk, for example.

I remember when I started working in IT sales in the mid-90s, a "fast" home Internet connection was 56k and the Internet was rapidly gaining in popularity. A small business owner called me and asked, "What are the competitive advantages of having a website?" to which I replied, "It's no longer a competitive advantage, it's a competitive necessity." To prove my point, I asked him to search for his competitors on the Internet: he saw that they all had websites!
The same can now be said of big data & analytics. With all the benefits it brings, it is becoming a business necessity. But before you rush into big data & analytics, consider the following important facts:

  1. According to Gartner, 69% of corporate data has no business value whatsoever
  2. Still according to Gartner, only 1.5% of corporate data is high-value data

This means you will have to sort through a great deal of data to find the valuable material you need to grow your business, reduce costs, outpace the competition, find new revenue sources, etc. It is estimated that every dollar invested in a big data & analytics solution brings four to six dollars of infrastructure investment (new storage to hold all that priceless data, CPU to analyze it, security to protect it, etc.). So before you plan a $50,000 investment in a big data & analytics solution only to find out it comes with a $200,000 to $300,000 investment in infrastructure, talk to subject matter experts. They can help design strategies to home in on the 1.5% of high-value data, reducing the required investment while maximizing the results.
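The rule of thumb cited above is easy to work through for any budget (the $50,000 figure is the example used in the text; the 4x-6x multiplier is the estimate quoted above):

```python
# Worked example of the estimate quoted above: every dollar invested in a
# big data & analytics solution brings roughly $4 to $6 of infrastructure
# investment (storage, compute, security).
solution_cost = 50_000
infra_low = solution_cost * 4
infra_high = solution_cost * 6
print(f"${solution_cost:,} solution: ${infra_low:,} to ${infra_high:,} in infrastructure")
```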

Charles Tremblay, ESI Account Manager

Cloud adoption: getting through the maze

Companies can no longer ignore the increasing importance of cloud computing when planning their technological investments, and they must choose from the options available on the market. Evaluating products and services based on the needs of the organization, not only for today but above all for the future, is quite a challenge!

Beyond the technological considerations (product compatibility, required investment, scalability of existing systems, etc.), there is the evaluation of the different providers, the services they offer and the costs associated with their use. The best-known cloud solutions on the market may seem attractive because of their high visibility, often backed by a recognized brand, which is perceived as a guarantee of reliability. The savings these solutions announce, and their accessibility, are often the deciding criteria when the time comes to choose. It is, however, almost impossible to assess their real costs, because several important variables remain unknown: the price of retained data, the cost of download per GB, transaction pricing, etc.
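To see why those unknowns matter, consider a toy monthly cost model combining the variables just mentioned. All rates below are invented for illustration only; real providers publish their own (and frequently changing) price lists:

```python
# Hypothetical monthly cost model for a public cloud storage service,
# combining the cost variables discussed above. All rates are invented
# for illustration; they are not any real provider's pricing.
def monthly_cost(stored_gb: float, egress_gb: float, requests_k: float,
                 storage_rate: float = 0.023,   # $ per GB stored per month
                 egress_rate: float = 0.09,     # $ per GB downloaded
                 request_rate: float = 0.0004): # $ per 1,000 requests
    """Estimate one month's bill from storage, egress and request volume."""
    return (stored_gb * storage_rate
            + egress_gb * egress_rate
            + requests_k * request_rate)

# 1 TB stored, 200 GB downloaded, 500,000 requests in the month.
estimate = monthly_cost(stored_gb=1000, egress_gb=200, requests_k=500)
print(f"${estimate:,.2f}")
```

Even in this simplified sketch, the download and transaction terms depend entirely on usage patterns that are hard to predict in advance, which is exactly what makes the real cost of these offerings so difficult to assess.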
Cloud offerings are diverse and not equally suitable for all businesses. Some heterogeneous environments are not easily transferable, and it can be risky, if not impossible, to migrate to the cloud without a fundamental transformation of the architecture and of the ways of working within the organization. Caution is therefore required when undertaking such an important shift. Do not see cloud computing as a simple upgrade to a more powerful technology, but as a business strategy. This demands a thorough evaluation of existing processes and of the legal and technological framework of the company, coupled with an action plan with clear goals to achieve.
Few companies have an IT team able to perform the necessary analysis of the organization's current processes and of the technological and governance challenges related to them.
It is in this context that specialized integrators provide a valuable contribution to the company’s thinking. A trusted partner will help you assess your needs and your cloud adoption process to optimize your investments while reaching your business goals.
Benoit Quintin – ESI Cloud Services Director

Got your head in the cloud? Keep your feet on the ground!

A couple of weeks ago, ESI, in partnership with NetApp, hosted a very special event on cloud computing and the associated data privacy legal issues. The guest speaker for this event was none other than Ms. Sheila FitzPatrick, recognized by data protection authorities worldwide as one of the leading experts on data protection legislation and the compliance process.
I was briefed on this presentation by peers at ESI, to which some of our clients were invited, and one thing hit me the same way it hit all the participants at this event:
The most important thing to remember about cloud services is that your company, and you as a manager of that company, will be held accountable for any data privacy failings of the cloud service provider you sign on with.

There you have it. You remain the owner and the person responsible for that data even though you no longer have control over it.

Given that there is no transfer of legal responsibility from you to the cloud provider with regard to data, a long checklist ensued, including questions such as: How does the cloud provider separate my data from other clients' data? Where is it stored (under which jurisdiction)? How strong is the encryption? How does the data get moved to the cloud provider? Where are my backups located? How secure is the data transfer?… And this is only a very small sample of that checklist.
A local presence by a cloud provider doesn't mean your data stays entirely local. Backups are often sent offshore to another country governed by different laws, and in some cases this violates the legislation your company must comply with.
In short, cloud technology is much less about technology than about legal compliance, SLAs and contract management, though there is obviously still a strong technology component. ESI and its network of partners, like Ms. Sheila FitzPatrick of NetApp, can help companies navigate all this and set their cloud strategy in motion with a full understanding of what is at stake, since it all comes down to risk management: what to move into a public cloud, and what to keep in a private one.

Charles Tremblay, ESI Account Manager