Even in the era of smartphones and a never-ending stream of data published through a myriad of apps, one of the most interesting drivers of data is the Internet of Things, aka the IoT. And what “thing” could better function as a sample of “silent data collection” than the car? And here is why:
Many of us use one daily, and we all have some idea of how much and how often cars are driven. For example, the average annual mileage in Germany is about 14,200 km. And we are all conscious of the general relation of our driving to issues like the CO2 footprint.
But what does the “data footprint” of a modern car look like? And how do we cope with handling that data? Leaving the issue of data protection temporarily aside, let us concentrate for the moment on the technical question of how to handle the ensuing data volumes most efficiently, both today and in the future.
Are we ready for 50 Terabytes of data per car per day?
Already, intelligent cars collect data on a widespread basis: Driving profiles, maintenance data, and state-of-the-art assistance functions lead to an average of 1 Gigabyte of data per second in each car. Most of that is driven by video-based systems which allow cars to drive on a partially autonomous basis, which suggests that this number will grow exponentially with the additional capabilities and complexity which are on the horizon. Looking into the automotive industry, there is talk of up to a factor of 20 for the future, leaving us with a total indication of potentially 50 Terabytes of data per car per day in the era of autonomous driving.
Given an average speed of 60 km/h, the 14,200 km per year translate into 852 Terabytes (which is 0.85 Petabytes) per year. That equates to an average of approx. 2.3 Terabytes per day, so growth by a factor of roughly 21 lands us at the anticipated 50 Terabytes per day in the future. But this is just one (!) car.
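The per-car arithmetic can be checked in a few lines. A minimal sketch, using only the figures from the text (1 Gigabyte per second, 14,200 km per year, an assumed average speed of 60 km/h):

```python
# Back-of-envelope check of the per-car data volume described above.
DATA_RATE_GB_PER_S = 1      # average data generated per second of driving
ANNUAL_KM = 14_200          # average annual mileage in Germany
AVG_SPEED_KMH = 60          # assumed average speed

# Hours on the road per year, converted to seconds of data collection.
driving_seconds = ANNUAL_KM / AVG_SPEED_KMH * 3600

tb_per_year = driving_seconds * DATA_RATE_GB_PER_S / 1000  # GB -> TB
tb_per_day = tb_per_year / 365

print(f"{tb_per_year:.0f} TB per car per year")            # 852 TB
print(f"{tb_per_day:.2f} TB per car per day")              # 2.33 TB
print(f"with factor 21: {tb_per_day * 21:.0f} TB per day") # 49 TB, i.e. ~50
```

Note that it is the roughly 2.3 Terabytes per day, multiplied by the anticipated growth factor of about 21, that yields the 50 Terabytes per car per day.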
What if we try to extrapolate these figures to the total number of cars in Germany today (numbering at 46.5 m)?
Multiplied by the 46.5 million cars in Germany alone, we end up with a current data volume of up to 40 Zettabytes per year today, or 848 Zettabytes in the future. That is approaching 0.85 Yottabytes – welcome to the next level of Big Data volume: a Yottabyte is 10^24 Bytes (1,000,000,000,000,000,000,000,000 – try to get your head around that!).
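The fleet-wide extrapolation works the same way. A sketch based on the article's own numbers (the future figure of 848 Zettabytes in the text comes from rounding the per-car volume up to exactly 50 Terabytes per day; scaling the 852 TB per year directly gives a slightly lower value):

```python
# Extrapolating the per-car volume to the German car fleet.
CARS_IN_GERMANY = 46_500_000
TB_PER_CAR_YEAR_TODAY = 852   # from the per-car calculation above
GROWTH_FACTOR = 21

zb_today = CARS_IN_GERMANY * TB_PER_CAR_YEAR_TODAY / 1e9  # TB -> ZB
zb_future = zb_today * GROWTH_FACTOR

print(f"today:  ~{zb_today:.0f} ZB per year")   # ~40 ZB
print(f"future: ~{zb_future:.0f} ZB per year")  # ~832 ZB, approaching 0.85 YB
```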
The car as a mini data center
Too much heavy lifting and a few too many Bytes? Let’s translate this differently: Modern cars are in themselves a type of data center – they carry the processing power of more than 100 CPUs and need to validate data within milliseconds before they can send the results, not only to their internal driving systems, but also for communication purposes to other vehicles on the road.
Even today, the largest test fleet of cars – from Tesla – has gathered “merely” about 1 billion kilometers of driving intelligence. The most recent death caused by an autonomous vehicle shows the need for further traffic observation by the car, which means even more data collection and quicker validation. And data transport.
The next wave of the Internet – the Edge
Will 5G be the answer to this? No. If the amount of data explodes as indicated above, there will be a clear need for efficient data management systems, which are both cost- and network-efficient. A recent study concluded that a small local “data handling unit (DHU)” would need to be installed within a 15 kilometer range in order to get data transported and validated in the most efficient manner, including latency aspects for overall safety. That excludes traditional network and data center infrastructure and opens the door for the next wave of the Internet – the Edge.
Imagine a stand-alone unit, with full capabilities in terms of power, cooling, and connectivity. Portable, ready to rack, stack, and operate by plug & play into public power and network supplies. That is the sample use case for the next generation of small modular build-outs of data centers, combined with antenna technology for Car2Cell communication within short distances such as the indicated 15 kilometer radius. Whilst there are some manufacturers already in the field offering that sort of infrastructure, the next logical step is to deploy fully equipped DHUs on a pay-per-use basis. Currently that is possible up to a capacity of 3 Petabytes per unit with technology of innovIT360, for example (going back to the numbers …).
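To get a feel for the scale of such a build-out, here is an idealized back-of-envelope sketch. It assumes Germany's area of roughly 357,000 km² and treats each DHU's 15-kilometer radius as a non-overlapping circle, which real-world coverage planning of course would not; the figures are purely illustrative:

```python
import math

GERMANY_AREA_KM2 = 357_000  # assumption: approximate area of Germany
DHU_RADIUS_KM = 15          # coverage radius from the study cited above
DHU_CAPACITY_PB = 3         # current capacity per unit (innovIT360 example)
TB_PER_CAR_DAY_FUTURE = 50  # anticipated future data volume per car per day

# Idealized circular coverage per unit, ignoring overlap and terrain.
coverage_km2 = math.pi * DHU_RADIUS_KM ** 2
units_needed = math.ceil(GERMANY_AREA_KM2 / coverage_km2)

# How many cars' worth of one day's data fits into a single 3 PB unit.
cars_per_unit_day = DHU_CAPACITY_PB * 1000 // TB_PER_CAR_DAY_FUTURE

print(f"one DHU covers ~{coverage_km2:.0f} km^2")         # ~707 km^2
print(f"~{units_needed} units for nationwide coverage")   # ~506 (idealized)
print(f"one 3 PB unit holds a day's data from ~{cars_per_unit_day} cars")
```

Even under these generous assumptions, a single 3-Petabyte unit buffers just one day of future data from around 60 cars – which is exactly why capacity per unit needs to grow as quickly as possible.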
So, in the near future, we are looking at deploying a network of modular DHUs for a multitude of data, enriching capacity per unit as quickly as possible. Once we can cope with car data traffic, the next level will be the remaining topics along German highways: truck logistics, digital restaurants, break areas, and so on. With the move to the Edge underway, the IoT offers many more sample cases for upcoming issues of dotmagazine. Stay tuned.
Frank J. Zachmann is CEO of innovIT360 AG, a privately owned German provider of consulting, implementation and service for innovative IT infrastructures. Prior to joining innovIT360 at the turn of the year 2016/2017, Mr. Zachmann held various senior management positions with leading data center operators. Among other things, he reported directly to the CEO of the Interxion Group and previously spent seven years at Equinix, the world's largest provider of neutral data centers - most recently as Vice President Business Development EMEA.
In addition, Mr. Zachmann, chairman of the association DigitalHub FrankfurtRheinMain e.V. - a local cooperation partner of the association of the internet industry, eco - is actively involved in the interests of the digital economy and is an international economic ambassador for the city of Frankfurt.
Please note: The opinions expressed in Industry Insights published by dotmagazine are the author’s own and do not reflect the view of the publisher, eco – Association of the Internet Industry.