From cloud to fog computing… waiting for “SUNplification”

Bringing clouds close to the ground and creating fog – anything just to see the sun! No, this is not my weather forecast, even though, judging from how confusing that sentence is, you might well think so.

For several years now, cloud computing has been described in the IT industry as a new paradigm capable of disrupting how digital services are delivered and consumed, based on on-demand self-service and pay-per-use logic. As consumers, it did not take us long to ‘adapt’: just look at the growth figures for mobile device use and app downloads. According to the latest report published by Flurry, the use of mobile applications grew on average by 58% year over year between 2014 and 2015, with truly remarkable peaks in some app categories: +332% for so-called personalization apps (those that let you customize your mobile experience, such as emoji apps and keyboards for entering the graphical elements used mainly in instant messaging, where Facebook, Messenger and WhatsApp prevail); +135% for news and content applications; +125% for personal productivity apps (not just e-mail access), which are also booming; and a decidedly positive trend for mobile shopping apps as well (+81%), evidence of strongly accelerating growth in mobile commerce.

Staying at a more ‘enterprise’ level, data from market analysts simply confirm the growth. According to Gartner’s analyses, public cloud services will reach US$ 204 billion in 2016 (in 2015 the market was worth around US$ 175 billion). Specifically, SaaS will grow by 20% (from US$ 31.4 billion in 2015 to an expected US$ 37.7 billion by year-end); IaaS will continue to grow significantly, by more than 38% (from US$ 16.2 billion in 2015 to an expected US$ 22.4 billion in 2016); and the prospects for PaaS are also interesting, with growth of a little more than 20%, from a current turnover of US$ 3.8 billion to an expected US$ 4.8 billion in 2016.

‘I’m losing my marbles’, it’s true – and I haven’t even told you yet what clouds have to do with fog…

That cloud computing is keeping its initial promises, emerging triumphantly from the realm of ‘trendy buzzwords’ – a stage in which every disruptive technology trend has to ‘serve’ a physiological period of incubation – is quite evident and confirmed by the numbers. What remains to be seen is what real facts will support the growth predictions made by analysts, who go so far as to forecast continuous growth for the public cloud services market at an average annual rate of 15% until 2020.

Apparently, fog will be supporting these numbers: fog computing (a term coined by Cisco), a new paradigm that will allow public cloud services to reach the so-called ‘periphery’ – in other words, to bring services closer to those who actually use them.

If there is a limit the cloud model must reckon with, it is connectivity – in particular the latency involved in exchanging data between an enterprise data center and cloud resources. Especially at the enterprise level, accessing resources ‘scattered’ in the cloud is not always an effective (and efficient) answer; the distance between data sources (which, as we know, keep growing) and infrastructure, architectural and application services is often a critical issue. And while today companies ‘bypass’ these limits by opting for private cloud models with dedicated network lines, the advance of IoT makes action urgent.

The reason why can be deduced from a comparison with Big Data analytics: moving the analysis ‘into volatile memory’ – that is, to the place closest to where data are ‘captured’ – is the most effective way of analyzing huge amounts of unstructured data from multiple heterogeneous sources (hardly exploitable through traditional IT models and databases).

Now, fog computing does just that: it brings computational and analytical capacity closer to the data to be captured, validated, transported and analyzed. How? By enabling network devices (routers and other network infrastructure) to process as much of the connected devices’ data as possible locally, transferring onward only the information that is actually needed (e.g. readings that may portend a malfunction or outage). In other words, data processing does not occur in cloud environments but within ‘intelligent’ network devices equipped with enough computational capacity to run this type of task.
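To make the idea concrete, here is a minimal sketch of that ‘process locally, forward only what matters’ pattern, in Python. It is purely illustrative: the sensor stream is simulated, and the window size and anomaly threshold are my own assumptions, not parameters of any real fog platform.

```python
import random
import statistics
from collections import deque

# Illustrative assumptions, not values from any real deployment.
WINDOW = 50        # recent readings the edge device keeps in memory
Z_THRESHOLD = 3.0  # how many standard deviations count as "anomalous"
BASELINE = 10      # readings needed before anomaly detection starts

def edge_filter(readings):
    """Yield only the readings that may portend a malfunction.

    Everything else is processed and discarded locally, so only a
    tiny fraction of the raw stream ever travels to the cloud.
    """
    window = deque(maxlen=WINDOW)
    for value in readings:
        if len(window) >= BASELINE:
            mean = statistics.mean(window)
            stdev = statistics.pstdev(window) or 1e-9  # avoid divide-by-zero
            if abs(value - mean) / stdev > Z_THRESHOLD:
                yield value  # forward to the cloud: possible outage signal
        window.append(value)

# Simulated sensor stream: mostly values near 20, with rare spikes to 80.
stream = (80.0 if random.random() < 0.01 else random.gauss(20.0, 0.5)
          for _ in range(10_000))

forwarded = list(edge_filter(stream))
print(f"forwarded {len(forwarded)} of 10,000 readings to the cloud")
```

Only the handful of anomalous readings ever leaves the device; the thousands of routine ones are handled and dropped locally, which is precisely the bandwidth and latency gain described above.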

The paradigm is perhaps better known as edge computing (a name that nicely captures the idea of ‘bringing’ computational and analysis capacity to the edge, that is, ‘on board’ the device), and it actually inverts the traditional model, in which data are brought close to the processing systems in order to extract information and knowledge. With fog computing the opposite happens: processing is brought closer to the place where the data transit.

Not surprisingly, analysts speak of edge computing as a reference model for IoT development. Taking IDC alone (while emphasizing that all industry analysts devote extensive research to the subject), the American firm estimates that by 2018 about 40% of the data created by connected ‘things’ will be stored and processed close to the edge of the network, if not directly on the device or the network infrastructure itself.

Exactly as happened with cloud computing, simplifying data processing and resource management is one of the ‘promises’ of fog computing. Yet what we have actually witnessed over the last decade often points in exactly the opposite direction: a significant increase in technological complexity and in the burden of IT governance.

After clouds and fog, we look forward to figuring out when and how we will be able to see the sun!
