IT trends research – Edge Computing & Quantum Computing

Since 2010, Supply Value has conducted an annual survey of the most important trends within the purchasing field. This year, we have added an IT trends survey. The aim of this research is to support information management and technology professionals in setting the most important priorities and the right focus. In this blog we provide a first look at two trends from the research: Edge computing and Quantum computing.

Disclaimer: This is a summary of the trend study to be published later. Would you like to receive the full trend survey? Let us know.

Edge Computing

'Unknown is unloved' certainly applies to edge computing. While everyone nowadays is talking about IoT and outbidding each other over who has the most sensors in their network, relatively few people are familiar with edge computing: 75% of the respondents indicated that they were completely unfamiliar or only slightly familiar with the concept. This is striking, because the two trends are strongly linked: edge computing is the way to deal efficiently and quickly with all the data collected by IoT sensors. Today, edge computing mostly means that IoT devices do not send raw sensor data to the cloud, but process it directly on the device itself into summarized information, which is then sent to a central point or used locally to make (automated) data-driven decisions.
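
As a rough illustration (our own, not from the study), the sketch below shows this pattern in miniature: the device keeps the raw sensor readings to itself and only sends a compact summary onwards. All names and values are hypothetical.

```python
# Minimal sketch of edge-style local processing: summarize raw sensor
# readings on the device, ship only the summary to a central point.
from statistics import mean

def summarize_readings(readings):
    """Reduce a batch of raw sensor readings to a compact summary."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 2),
    }

raw_temperatures = [21.4, 21.6, 21.5, 22.0, 21.8]  # raw data stays on the device
summary = summarize_readings(raw_temperatures)
print(summary)  # only this small payload leaves the device
```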

What is edge computing

But how exactly does it work? To make this clear, it is important to understand the difference with cloud computing. With cloud computing, capacity, or computing power, is purchased from a cloud provider, which typically delivers it by operating huge data centers. Because the cloud provider has economies of scale and can work efficiently, this is often cheaper, more secure and more scalable for organizations than running their own servers. However, these data centers are often geographically distant from the user. In general, the delay caused by this physical distance is negligible. In some situations, however, every millisecond is of great importance, and the choice is made to move the computing power to the 'edge' of the network. The aim is to have data processing take place as physically close as possible to the data source (often IoT sensors). This also prevents networks from becoming overloaded. In view of the expected exponential growth of IoT (4.9 billion connected devices in 2025) and the fact that 5G has not yet been rolled out everywhere, this will increasingly become a point of attention in the near future.
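
To make the millisecond argument concrete, here is a back-of-the-envelope calculation (our own illustration, with assumed figures): light in fiber travels at roughly 200 km per millisecond, so distance alone sets a floor under the round-trip delay, before any processing has even started.

```python
# Rough round-trip latency due to physical distance alone (assumed figures).
SPEED_IN_FIBER_KM_PER_MS = 200  # ~2/3 of the speed of light in vacuum

def round_trip_ms(distance_km):
    """Lower bound on round-trip time to a server at the given distance."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

print(round_trip_ms(5))      # edge node 5 km away:       ~0.05 ms
print(round_trip_ms(2000))   # data center 2,000 km away: ~20 ms
```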

Trend development

The first studies on edge computing were published as early as 2014-2015, and companies like Google and Microsoft have been working with the underlying idea for much longer (the idea of a 'globally accessible network', from which cloud and edge computing are derived, has been around since the 1960s). However, it has only really been picked up by potential users since 2018, and it is expected to continue developing strongly in 2019.

Although respondents generally do not indicate that they will give much priority to edge computing, we expect this trend to accelerate towards 2020. Forbes (2019) also indicates that the market value of edge computing is expected to reach about $3.24 billion in 2025. Over time, it will become increasingly clear how widely (or not widely) applicable edge computing actually is. Currently, the outlook is positive, because cloud computing and IoT are also proving extremely broadly applicable and valuable.

Quantum Computing

The amount of data within organizations is growing exponentially, but not all of this data is actually used. Classical analysis methods are often no longer sufficient given the amount of data to be processed. That is why new ways are being sought to better analyze ever-growing data sources. Quantum computing is a promising example. However, it is not easily applicable; IBM only unveiled its first-ever quantum computer for commercial use at the beginning of this year (January 2019), so it is far from widely available or applicable. That is probably also the reason that respondents are not yet familiar with this trend: more than 80% indicated that they were completely unfamiliar or only somewhat familiar with quantum computing. It is striking that such a large percentage is completely unfamiliar with it, because this trend has received a great deal of media attention, and quantum computing is seen as a huge breakthrough.

What is quantum computing

The classic view of information storage on a normal computer is that information is made up of 1s and 0s. Data consists of so-called 'bits', where each bit is a 1 or a 0. A bit can therefore take on two states, comparable to an on/off switch or the answer to a yes/no question. A traditional computer stores numbers as long combinations of 0s and 1s. This is quite easy for the numbers 0, 1 and 2, which fit in combinations of 2 bits: 00 is zero, 01 is one and 10 is two. However, if you want to represent the number 250,000, you already need a combination of 18 bits, in other words a series of 18 1s and 0s. Storing large amounts of data therefore takes a considerable amount of storage space, and combining large amounts of data with computation-heavy workloads leads to slow processing on traditional computers. Quantum computing may offer a solution for this!
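
You can check this yourself in a few lines of Python (our own illustration):

```python
# How many bits a classical computer needs to represent the number 250,000.
n = 250_000
print(bin(n))          # 0b111101000010010000 -> the series of 1s and 0s
print(n.bit_length())  # 18 bits, as mentioned above
```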

Quantum computers differ from traditional computers in that they work with qubits instead of bits for data storage. The advantage of qubits is that they can take not only the value 0 or 1 (and thus be on or off), but also all possible combinations of these at the same time. Because a qubit can be a 0 and a 1 at the same time (for example, 70 percent 0 and 30 percent 1), it is possible to perform calculations in parallel. This makes the computing power of quantum computers vastly greater, and therefore a lot faster, than that of traditional computers. Exactly how much faster depends on the number of qubits in the system.
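
A minimal sketch (our own illustration, using NumPy) of how such a qubit state is commonly described: a vector of two amplitudes whose squared magnitudes give the chances of finding a 0 or a 1, mirroring the 70/30 example above.

```python
import numpy as np

# A qubit state as a vector of two amplitudes: |a0|^2 = 0.7 and |a1|^2 = 0.3.
qubit = np.array([np.sqrt(0.7), np.sqrt(0.3)])

probabilities = np.abs(qubit) ** 2
print(probabilities)  # ~[0.7 0.3] -> 70% chance of a 0, 30% chance of a 1
```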

So far this sounds like a no-brainer, so why isn't everyone applying quantum computing already? That is because working with qubits is a very delicate process. Qubits lose their special property (being a 0 and a 1 at the same time) when they are subjected to a 'measurement'. You can see a measurement as a 'photo' or snapshot of the state of a qubit at a certain moment, and on this snapshot the qubit will always show 0 or 1, even though the qubit can take on multiple values. As a result, the special property of the qubit is lost and it effectively becomes 'worthless'. Scaling up quantum systems is therefore difficult, because a way has to be found to link multiple qubits without them influencing each other and thus losing their quantum state.
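
Continuing the sketch above (again our own illustration), the snapshot effect looks like this: the measurement forces a definite 0 or 1, after which the 70/30 superposition is gone.

```python
import numpy as np

rng = np.random.default_rng()
qubit = np.array([np.sqrt(0.7), np.sqrt(0.3)])  # 70% 0, 30% 1

outcome = rng.choice([0, 1], p=np.abs(qubit) ** 2)  # the 'photo': always 0 or 1
qubit = np.eye(2)[outcome]                          # the state collapses accordingly
print(outcome, qubit)  # the original superposition is lost
```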

Trend development

Although quantum computing is currently scarcely available, it will become more widely available in the coming years, for example through IBM's commercial quantum computer, on which computing time can be purchased via the internet.
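
As a minimal sketch of what such internet-accessible quantum computing looks like in practice (assuming IBM's open-source Qiskit library is installed; the exact workflow for submitting to IBM's cloud backends depends on your account and Qiskit version), circuits like this can be built locally and then sent to IBM's hardware:

```python
from qiskit import QuantumCircuit

# One qubit, one classical bit to record the measurement result.
circuit = QuantumCircuit(1, 1)
circuit.h(0)           # Hadamard gate: put the qubit in superposition
circuit.measure(0, 0)  # measuring collapses it to a definite 0 or 1
print(circuit.draw())  # text diagram of the circuit
```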

Its potential will increasingly have to prove itself in practice, and there will be a lot of experimentation in the coming period. However, broad and accessible deployment of quantum computers will take some time.

Would you like to read more about Edge computing and Quantum computing, and to know what the IT professionals who assessed these trends think of them? Download the entire research report below!

Are you interested or do you want to know more about Supply Value?