Big Data, the Internet of Things, artificial intelligence, deep learning, and blockchain – the number of technologies introduced with great fanfare and much optimism keeps growing. Beyond the big promises of efficiency and the hype, they share one striking similarity: they all require enormous data center capacity, and they often consume or produce gigantic data volumes.
The term “Big Data” already implies the consequences: Analyzing the mass data generated, for example, in social networks or through purchasing processes aggregated in a web shop requires a great deal of storage and computing performance. The same goes for the Internet of Things, that modern data monster. Especially in the industrial sector, the IoT generates enormous amounts of machine data that must be transmitted via a communication network and stored in a data center or in a cloud – somehow.
Artificial intelligence and deep learning also make high demands on storage technologies and on computing power. Neural networks must be trained with large data sets in the gigabyte range before they can be used in practice. And blockchain, too, poses substantial requirements for computing performance and storage: one of its main characteristics is that every blockchain transaction is stored permanently. Data can never be deleted, and each modification is completely recorded along with the original data.
Industrial IoT increases data volumes
These few remarks already show that new technologies are among the drivers of the much-cited growth in data volume. According to a Dell EMC study, the total data volume will increase tenfold by 2020. This represents an enormous challenge for enterprises: in order to compete with new applications and business models, they must develop a strategy for handling this data growth.
At present, the IoT data volume in communication networks is being driven up in particular by “predictive maintenance”. Here, sensors permanently monitor the condition of machines and facilities and send data to a cloud application via communication networks. The application then evaluates the data and may, for example, recommend replacing worn parts within a certain time.
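The evaluation step described above can be sketched as a simple threshold check. This is a minimal illustration only – the sensor readings, the vibration limit, and all function names are hypothetical assumptions, not any specific vendor's API:

```python
# Minimal sketch of a predictive-maintenance evaluation step.
# The threshold value and all names below are hypothetical illustrations.

VIBRATION_LIMIT_MM_S = 7.1  # assumed warning threshold in mm/s


def evaluate_readings(readings):
    """Return a maintenance recommendation based on peak vibration."""
    worst = max(readings)
    if worst > VIBRATION_LIMIT_MM_S:
        return (f"Schedule inspection: peak vibration {worst} mm/s "
                f"exceeds limit of {VIBRATION_LIMIT_MM_S} mm/s")
    return "No action required"


# Example: a few per-second vibration samples from one sensor
samples = [2.4, 2.6, 8.3, 2.5]
print(evaluate_readings(samples))
```

In a real deployment, of course, this logic runs as a cloud application fed continuously by the sensors, and the models are usually statistical rather than simple thresholds – but the input/output shape of the evaluation is the same.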
This field of application is quite popular among companies, as it quickly leads to shorter downtimes, higher production efficiency, and lower costs. Digital monitoring can also be retrofitted to old facilities by installing modern sensors. This, in turn, causes a sharp increase in data volume, because these sensors transmit their readings at one-second intervals. In big facilities, enormous amounts of data are thus generated that must be transmitted via communication networks.
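A back-of-envelope calculation illustrates the scale. Assuming, purely for illustration, 1,000 sensors in a facility, each sending a 200-byte reading every second, a single site already produces over 17 GB of raw data per day:

```python
# Back-of-envelope estimate of IoT sensor data volume.
# Sensor count and message size are assumed values for illustration.

sensors = 1_000               # sensors in the facility (assumed)
bytes_per_msg = 200           # payload size per reading (assumed)
msgs_per_day = 24 * 60 * 60   # one reading per second

daily_bytes = sensors * bytes_per_msg * msgs_per_day
print(f"{daily_bytes / 1e9:.1f} GB per day")  # → 17.3 GB per day
```

Scaled to hundreds of facilities, this quickly reaches the terabyte range per day – which is why the bandwidth of the communication network and the capacity of the receiving data center become central design questions.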
The boom in cloud services causes a further phenomenon: many companies use several different services in the public cloud, often in combination with a private cloud or traditional application hosting. These diverse workload requirements call for well-designed hybrid and multi-cloud strategies, and in many cases the new technologies described above are among the reasons. As a consequence, many well-known cloud providers now offer “pre-assembled” services that are relatively easy to use for business analytics, deep learning, or blockchain.
Here again, a look at our predictive maintenance example is helpful. As the data generated by the sensors is already located in the cloud, it makes sense to evaluate it there as well: it can be analyzed by business analytics applications and presented in a cloud dashboard. Many well-known cloud providers offer such combined solutions based on an IoT platform plus analytics. In addition, there are interfaces to other applications such as ERP, CRM, or even classic Office solutions, so the data can also reach people outside the circle of cloud users. As a consequence, demand is emerging for a flexible infrastructure design for future-oriented data logistics.
Hypes as a Service – Drivers of the Multi Cloud
How can enterprises handle this new situation? Many companies want to avoid partnering with a single cloud provider. A multi-cloud environment can satisfy this desire for diversification and avoid the vendor lock-in that is often perceived as restrictive. Faced with a vast number of cloud providers, complex application requirements, and strict data protection regulations, the question of how to find the “right” vendor becomes increasingly important. In order to remain flexible and to react promptly to new developments, housing and connectivity should interact optimally. Changes to the cloud infrastructure are often complex unless a cloud exit (“Clexit”) strategy has been planned in advance; with such a strategy, the long-term consequences of choosing a particular cloud provider can be avoided in a multi-cloud environment.
One useful solution could be an outsourcing partner that combines classic and modern application portfolios. On the one hand, the partner should maintain the usual data center operation and offer hosting of traditional business applications. On the other hand, the partner should offer gateways to the big cloud providers – ideally acting as a cloud hub that ensures high availability and reliability of the applications. This prevents situations in which several applications run side by side at different providers without integration, connected by links that are prone to disruption. There is a definite need for tailor-made, individual cloud solutions.
The market leaders in the cloud business in particular offer “as a Service” solutions for deep learning and blockchain. This makes these technologies easier to adopt: users neither need to license and install the necessary software nor provide the corresponding IT resources. The challenge remains, however, to manage the variety of services and interconnects – multi cloud here also translates into multi administration.
A modern, fully networked data center platform such as ITENOS Data LogistIX eases these requirements for users adopting the latest technologies. It offers the common benefits of a colocation data center, and with its “Cloud Connect” service the company can ensure highly reliable, high-performance connections to the big cloud providers. Thus, vendor lock-in can be avoided and maximum flexibility is ensured. If required, 10 Gbit/s can be provided – sufficient bandwidth to cope not only with development environments but also with the productive use of digital solutions.
Alexander Frese has broad international management experience, with more than 18 years in marketing strategy, digital transformation, brand building, and business development.
Over the years, Alexander has helped shape the digital transformation in a variety of roles. He knows the challenges of the digital industry from the perspective of a consultant and from the business side.
Please note: The opinions expressed in Industry Insights published by dotmagazine are the author’s own and do not reflect the view of the publisher, eco – Association of the Internet Industry.