With the rapid rise of the Internet of Things creating mountains of data generated from billions of devices, enterprises that depend on data centers need to take advantage of the most efficient systems available. One option earning the attention of companies of all sizes is the micro data center.
A micro data center is a self-contained, secure computing environment that includes all the storage, processing, and networking required to run customer applications. It may initially appear to be a traditional data center in a box, but there are crucial differences that set it apart from its larger fixed-location cousin.
Regional data centers are expensive to build and maintain, while embedded devices (e.g., routers, switches) are limited in processing power and storage. Micro data centers, on the other hand, can be deployed quickly and cheaply in existing edge computing environments while still providing ample processing power and storage. They are a natural fit for a distributed computing model, one that is becoming increasingly popular among companies looking to cope with the IoT data maelstrom.
From cloud to edge and micro
While most would agree the 2010s belonged to the cloud, some experts are saying the 2020s will belong to the micro data center. On the surface, the very idea of it might seem surprising. The cloud “revolution” was supposed to be the final stage of storage — the big finale.
But in many scenarios, the advantages of micro data centers are obvious. Take the case of retail, where low latency and resiliency are vitally important. Micro data centers are purpose-built to address both concerns: the delay in the time it takes data to be processed, and how reliable the system is overall. Micro data centers are also "low touch," meaning branch staff aren't saddled with the burden of managing them.
Latency factor
While data security remains a top priority for most businesses, in a world that is soon to be 5G fast, latency is becoming the challenge IT departments want to tackle. With the number of IoT-connected devices expected to rise almost 300 per cent between 2019 and 2025, network congestion is no longer just a nuisance; it becomes a real challenge in the form of latency and bandwidth constraints.
So how do you improve the time it takes for packets of data to be stored and retrieved? Or, from a business perspective, how long does it take for a user to retrieve source data from a server?
When it comes to reducing latency in a centralized enterprise data center, installing bigger switches to increase bandwidth has limitations. That’s why many leading enterprises are now looking for ways to expand their data processing infrastructure closer to where data is actually generated. This is precisely where micro data centers come in.
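To see why moving processing closer to the data helps, consider a back-of-the-envelope model of round-trip time. This is a minimal sketch under illustrative assumptions (the fibre propagation figure of roughly 200 km per millisecond is a common rule of thumb; the hop counts, distances, and per-hop overhead below are hypothetical, not measurements of any real deployment):

```python
# Rough model of network round-trip time: propagation delay in optical
# fibre is about 200 km per millisecond (roughly two-thirds the speed
# of light), plus a fixed per-hop processing/queuing overhead.
# All figures here are illustrative assumptions, not measurements.

FIBRE_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float, hops: int, per_hop_ms: float = 0.5) -> float:
    """Estimate round-trip time for a request/response over fibre."""
    propagation = 2 * distance_km / FIBRE_KM_PER_MS  # out and back
    processing = 2 * hops * per_hop_ms               # per-hop overhead each way
    return propagation + processing

# A regional data center ~1,000 km away, traversing ~10 network hops:
central_rtt = round_trip_ms(1000, hops=10)  # ~20 ms

# A micro data center on site, ~1 km away, ~2 hops:
edge_rtt = round_trip_ms(1, hops=2)         # ~2 ms
```

Even in this simplified model, a tenfold or greater reduction in round-trip time comes almost entirely from shortening the distance and the hop count, which is exactly the lever a micro data center at the edge pulls and one that bigger switches in a central facility cannot.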
In support of edge computing
Schneider Electric’s digital guide “Cost Benefit Analysis of Edge Micro Data Center Deployments” explores why micro data centers are best suited to support edge computing — even more so than server rooms and traditional data centers. Among the topics covered in this guide:
- Drivers of micro data centers – scalability; speed of deployment; reliability; outsourcing to the cloud and co-location
- IT enablers – hyperconverged IT, compaction, virtualization
- Deployment methods
- Capital cost advantages – methodology, assumptions, findings
- Future micro data center architecture
This guide explains how micro data centers take advantage of existing infrastructure, and demonstrates how this architecture reduces capital expenses by 42 per cent over a traditional build. Other benefits discussed include shorter project timelines.
Download “Cost Benefit Analysis of Edge Micro Data Center Deployments”