
Updates from Datacenter.com

What is a DDoS attack and how to mitigate it?
Datacenter.com – A Distributed Denial-of-Service (DDoS) attack is a malicious attempt to disrupt the traffic of a targeted server, service or network by overwhelming it with a flood of internet traffic (Cloudflare, 2019).

DDoS attacks are much like traffic on a highway. Imagine regular traffic moving at a steady pace, with cars on their way to their desired destinations. If a flood of cars enters the highway at a particular point, it significantly delays or prevents the cars behind them from reaching their destinations on time.

In 2018, more than 400,000 DDoS attacks were reported worldwide (CALYPTIX, 2018). In 2018’s 4th quarter, Great Britain accounted for 2.18% of these attacks, a staggering difference compared to the 0.66% recorded in 2019’s 1st quarter (Gutnikov, 2019).

The goal of this attack is to create congestion by consuming all available bandwidth the target uses to reach the wider internet (Cloudflare, 2019). Large amounts of data are sent to the target by utilizing a form of amplification or another means of creating massive traffic, such as requests from a botnet (a group of malware-infected devices that an attacker controls remotely).
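Mitigation typically starts with distinguishing flood traffic from legitimate requests. One common building block is per-source rate limiting; the sketch below is a minimal token-bucket example in Python. The class, rates and thresholds are illustrative only, not a production DDoS defense (real mitigation usually happens upstream at the network edge):

```python
import time
from collections import defaultdict

class TokenBucket:
    """Allow roughly `rate` requests per second per client, with bursts up to `capacity`."""

    def __init__(self, rate=10.0, capacity=20.0):
        self.rate = rate
        self.capacity = capacity
        # Each client starts with a full bucket of tokens.
        self.tokens = defaultdict(lambda: capacity)
        self.last = defaultdict(time.monotonic)

    def allow(self, client_ip):
        now = time.monotonic()
        elapsed = now - self.last[client_ip]
        self.last[client_ip] = now
        # Refill tokens for the elapsed time, capped at capacity.
        self.tokens[client_ip] = min(self.capacity,
                                     self.tokens[client_ip] + elapsed * self.rate)
        if self.tokens[client_ip] >= 1.0:
            self.tokens[client_ip] -= 1.0
            return True
        return False  # Over the limit: drop or challenge this request.
```

A flood from one source exhausts that source's bucket quickly, while other clients keep their own budget untouched.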


The Hidden Costs of Hosting Your Infrastructure On-Premise
Datacenter.com – There are many myths around the choice between hosting your mission-critical infrastructure in-house and accommodating your IT infrastructure in a professional data center. Managing and implementing your business-critical infrastructure in-house is a huge responsibility on top of your daily work. The specialist requirements for managing power, cooling and security are demanding.

In addition, the excessive overhead on company resources to optimally manage the infrastructure and its environment creates its own set of challenges. The choice should not only be made on the basis of costs. It depends on your business requirements and specific usage options, as well as the costs of the service.

The role of IT has come a long way, with the success of companies now heavily dependent on the ability of the organization to digitally transform. IT and the infrastructure it builds are under extreme pressure to perform, with most IT departments facing a tough battle to meet demand from the business. Digital technologies are drastically changing the way we do business, with business rules changing every day. With companies continuing to reinvent themselves, IT requirements are becoming increasingly difficult to predict, now and in the future.

Research shows that nearly 70% of digital transformation projects are not finished within the set deadlines, with only one fifth of companies claiming that their transformation projects are successful in the long term. That said, taking on the extra financial and resource costs needed to build and maintain your mission-critical infrastructure makes little sense as your digital journey and your business continue to mature.

As such, moving your business-critical on-premise infrastructure to a specialized colocation data center has quickly become the preferred option for many organizations seeking to successfully digitize their business activities.


Private Cloud vs Public Cloud: what is the best solution?
Datacenter.com – Cloud computing spans a range of classifications, types and architecture models. The transformative networked computing model can be categorized into three major types: Public Cloud, Private Cloud and Hybrid Cloud.

Hybrid IT has rapidly proven that it offers the flexibility to deliver new software applications and enhanced features quickly: critical agility in the age of digital business. With that in mind, enterprises now need to identify the best distribution of services and applications, and their strategy for connecting to the clouds they use.

This article explores the key differences between Public and Private cloud environments.

Public Cloud refers to the cloud computing model in which IT services are delivered across the Internet. The computing functionality may range from common services such as email, apps and storage to enterprise-grade OS platforms or infrastructure environments used for software development and testing. The cloud vendor is responsible for developing, managing and maintaining the pool of computing resources shared between multiple tenants across the network.

The advantages of Public Cloud solutions for business customers include:

  • No investments required to deploy and maintain the IT infrastructure;
  • Flexible pricing options based on different SLA offerings.

However, there are disadvantages as well, including:

  • The total cost of ownership (TCO) can rise exponentially for large-scale usage, specifically for midsize to large enterprises
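The TCO point can be illustrated with a back-of-the-envelope comparison. The sketch below uses purely hypothetical prices (not quotes from any real vendor) to show why linear pay-as-you-go pricing eventually overtakes a fixed-plus-marginal colocation model at scale:

```python
# Illustrative break-even comparison: public cloud vs. colocation.
# All figures are hypothetical placeholders, not real price quotes.

def monthly_cloud_cost(servers, per_server=250.0):
    """Pay-as-you-go: cost scales linearly with usage."""
    return servers * per_server

def monthly_colo_cost(servers, fixed=3000.0, per_server=80.0):
    """Colocation: fixed rack/power commitment plus a smaller per-server cost."""
    return fixed + servers * per_server

# At small scale the cloud is cheaper; past the break-even point
# (here about 18 servers) colocation wins.
for n in (5, 10, 20, 40):
    print(n, monthly_cloud_cost(n), monthly_colo_cost(n))
```

The exact break-even point depends entirely on the real prices and commitments involved; the shape of the two curves is what matters.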



Planning a hybrid cloud implementation? Don’t forget the importance of the network
Almost every company is working on some form of cloud transformation, and we’ve noticed that almost everyone is pursuing a hybrid cloud strategy. Because hybrid places a wide range of on-premise, hosted and cloud-based services side by side, it will only be cost-effective if you can establish reliable, secure connectivity between the various elements of your hybrid architecture.
datacenter.com – Today everything has to be on demand

The network is often forgotten when IT teams plan hybrid cloud transformation projects. Without properly dimensioned legacy-to-public-cloud connectivity, transformation projects can be compromised and run into serious customer experience problems.

“Transformation projects can be paralyzed without properly dimensioned legacy-to-public cloud connectivity”

This is why organizations are now trying to request and order on-demand capacity from the network as they need it, reducing traditional constraints such as capex, delays at external suppliers and long project timelines.

From one to multiple networks

By connecting on demand, you can also adjust the bandwidth up or down to suit your project. For example, if you perform a major update on your cloud platform, or if your IT services move into production as quickly as they were built during the development and testing phase of your project, you can adjust your bandwidth based on the (temporary) need.

“By connecting on-demand, you can also adjust the bandwidth up or down to suit your project”
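Sizing such a temporary bandwidth boost is simple arithmetic. A minimal sketch in Python, with illustrative assumptions (a 25% pad for protocol overhead, hypothetical data volumes):

```python
# Back-of-the-envelope sizing for a temporary bandwidth upgrade, e.g. for a
# major cloud platform update or a move from test into production.
# The overhead factor and data volumes are illustrative assumptions.

def required_bandwidth_gbps(data_tb, window_hours, overhead=1.25):
    """Sustained bandwidth needed to move `data_tb` terabytes in `window_hours` hours.

    `overhead` pads for protocol overhead and retransmissions (assumed 25%).
    """
    bits = data_tb * 8e12           # terabytes -> bits
    seconds = window_hours * 3600   # hours -> seconds
    return bits * overhead / seconds / 1e9  # -> gigabits per second

# Moving 50 TB over a weekend window (48 h) needs roughly 2.9 Gbps sustained.
print(round(required_bandwidth_gbps(50, 48), 1))
```

Run the numbers for your own migration window; halving the window doubles the bandwidth you need to order, which is exactly where on-demand capacity pays off.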


The staffing challenge in data center market becomes a quality opportunity
The data center industry is growing rapidly. The number of vacancies for specialized data center personnel in the Amsterdam area, for example, is rising. A shortage of specialized, skilled data center personnel is expected: a staffing challenge. How can the data center market respond? Will it result in decreasing quality of data center services? We believe in the opportunity to improve!

datacenter.com – It is expected that the data center industry will rapidly grow and will result in a shortage of skilled IT personnel.

Almost all data centers have plans to expand, and cloud providers are considering or starting to operate their own data centers. So both the number of companies and their size are growing, which means that specialists are needed to design, maintain and operate these high-tech buildings and services. The employees that are the hardest to find are the ones with very broad knowledge of power, cooling and construction. And the greater the shortage of skilled employees, the more expensive they become.

Instead of trying to do everything yourself, you can use the experts. The companies that build technical infrastructures and buildings are the ones that (mostly) have in-depth knowledge within their field of expertise. Instead of engineering your data center with a few multi-talented employees and asking the experts to fill in the blanks, let the experts design and let a project manager within the field connect them.


Why can’t a data center guarantee the uptime of your environment?
Datacenter.com – One of the main reasons for choosing a data center is to limit the risk of downtime and unavailability of the company’s critical environment. A data center offers redundant power feeds, multiple power sources (main grid and emergency generators) and redundant fiber paths to make sure one feed, source and path will always be available. To that extent, a data center can guarantee a certain uptime. The guaranteed uptime often covers the availability of at least one feed, source or path; in terms of data center design: N.

Does that ensure the uptime of your environment? To maximize your environment’s uptime, the resources a data center delivers must actually be used.

When choosing a high-standard data center, the equipment you will use in that data center must be able to use the safeguards the data center offers. The infrastructure of power, fiber paths and cooling is only as strong as its weakest link.

For example, when using a server that is connected to only one feed, the uptime guaranteed across the two power feeds no longer applies to the power on that server. When using a fiber connection from one fiber path, that fiber path is a single point of failure, even though the data center has two redundant fiber paths. The same goes for an ATS: the power feeding the ATS is dual, but the power path behind the ATS becomes the single point of failure. To achieve the highest uptime, you must use the safeguards that a data center offers as much as possible.
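The difference redundancy makes can be put into rough numbers. A minimal sketch in Python, assuming an illustrative 99.9% availability per individual power feed (not a figure from any specific facility):

```python
# Rough availability arithmetic behind the "weakest link" point.
# The 99.9% per-feed availability is an illustrative assumption.

feed = 0.999  # availability of a single power feed

# Server connected to both feeds (parallel): it loses power only if BOTH fail.
dual_fed = 1 - (1 - feed) ** 2

# Server on one feed only, or behind a single ATS output path (serial):
# the redundant second feed no longer helps this server.
single_fed = feed

print(f"dual-fed  : {dual_fed:.6f}")    # about 99.9999%
print(f"single-fed: {single_fed:.6f}")  # 99.9000%
```

Under this assumption, connecting to both feeds cuts expected power downtime by roughly a factor of a thousand; one weak serial link undoes that entirely.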


30 Years of Open Internet in Europe
By Piet Beertema – On Saturday, 17 November at 2.28 pm, it was exactly thirty years since the Netherlands became the first country in Europe to be connected to the Internet. System administrator Piet Beertema of Centrum Wiskunde & Informatica (CWI) in Amsterdam received the confirmation that CWI – as the first institute outside the US – officially gained access to NSFnet, an academic computer network that later evolved into the worldwide Internet.

In 1988, the pioneers of CWI gained access to the – then still American – Internet after years of preparation (CWI was already the central hub within the European network ‘EUnet’ and its predecessor NLnet), thanks to their good contacts in the network world. Teus Hagen, head of IT at CWI at that time, explains in the documentary that during the development period, hard work was done to establish the internet connection and the associated technology, so that communication – especially between scientists – would be faster and easier. “Data and information were exchanged freely at that time. If we had known that privacy and hacking would play such a big role in the future, we would certainly have opted for a different approach.”

Steven Pemberton was one of the first Internet users in Europe. He later developed important standards for the World Wide Web, one of the most important applications of the Internet. “In retrospect, establishing that first connection was a historic moment, something we did not realize at the time.”


Data Centers Integral to Successful Digital Transformation Strategy
datacenter.com – Digital transformation has gotten a lot of attention. It involves not just the implementation of new technologies, but the alteration of business processes and models to fully leverage those technologies. This enables organizations to gain unprecedented levels of productivity, enhance customer experience, drive innovation and create competitive advantages.

According to research firm IDC, by 2020, 60% of the top manufacturers will rely on digital platforms that enhance their investments in ecosystems and experiences, and these platforms will support as much as 30% of their overall revenue.

A recent white paper issued by the Center for Global Enterprise, entitled Digital Supply Chains: A Frontside Flip, discussed how forward-looking companies are re-thinking and transforming their supply chains as they see new digital technologies and organizational models coming to the forefront of business.

An enterprise-wide digital supply chain can lead to a 20% reduction of procurement costs, a 50% reduction in supply chain costs, and an increase in revenue of 10%.

Digital transformation is changing the nature of the data center, and new technologies are constantly placing new demands on data centers and data center services.


How Cloud Demand Positively Impacts further growth of Amsterdam
datacenter.com – As one of the top data center markets in the world, the Amsterdam area is poised for more growth over the coming years as cloud demand increases. The amount of data going through the cloud is expected to reach 14.1 ZB by 2020.

After Microsoft opened its €2 billion campus, Google opened a data center in the north of the Netherlands (Eemshaven), and recently Google announced that it will expand with a second campus in the Netherlands (Amsterdam region). Other large cloud companies are also expanding their presence in and around Amsterdam.

Datacenter.com opened Datacenter.com AMS1, the best connected data center campus, in Amsterdam South-East this year; due to fast growth and high interest it will soon be upgraded to its second phase.


Cabinet airflow management done right
By Hans Vreeburg – Let’s start with some basic understanding of airflows within data centers. Nowadays almost all data centers apply hot and cold corridors to optimize cooling. In the ideal situation, cold air goes straight to the servers’ inlets, and the hot air exiting the servers is returned directly to the cooling unit. This setup enables systems to run at the highest possible efficiency, using the least amount of power. The cooling setup has a big influence on the PUE (Power Usage Effectiveness): a lower PUE results in lower total energy consumption of the data center, which benefits the environment and lowers OPEX. Could a small gap in your server rack really have that much influence?

As said above, the ideal setup is cold air entering the servers while hot air exits. Gaps can lead to a higher demand for cold air than the servers actually require. See it as a large water pipe: it normally needs a specific amount of water, but if you make holes in the pipe, you have to pump in more water to get the same amount out at the end.
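The relationship between cooling overhead and PUE can be made concrete with a quick calculation. A minimal sketch in Python; the kW figures are illustrative assumptions, not measurements from any specific facility:

```python
def pue(total_facility_kw, it_load_kw):
    """Power Usage Effectiveness: total facility power divided by IT equipment power."""
    return total_facility_kw / it_load_kw

# Same 1000 kW IT load in both cases; sealing gaps means the cooling system
# works less hard, so total facility power drops (illustrative figures).
leaky = pue(total_facility_kw=1500, it_load_kw=1000)   # gaps in racks
sealed = pue(total_facility_kw=1300, it_load_kw=1000)  # blanking panels fitted

print(leaky, sealed)
```

In this hypothetical example, closing the gaps takes the PUE from 1.5 to 1.3: a continuous 200 kW of facility overhead saved for the exact same IT load.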
