
Mobile First – Cloud First Simplified (3 of 3)

Previous post in this series.

The conversation about business outreach (service first) and infrastructure elasticity (cloud) does not feel complete without including:

Big Data
Every generation of technology has, in some way, created the need for the next. User-created content and social media initially drove the need for big data techniques. The drivers for this movement multiplied quickly, however, once businesses understood what big data analysis and prediction could do for them.

A Quick Introduction
Big data commonly refers to the mining and analysis of extremely high volumes of data. The data in question is largely unstructured, since it is collected from a variety of sources, many of which follow no standard storage format or schema. This data is also described by its characteristics, primarily volume, variety, velocity, variability, and complexity. The techniques for analyzing it rely on algorithms that distribute the data across multiple processors and servers and process it in a logical way. MapReduce is one of the most popular programming models for this, and Hadoop is one of the most popular implementations of MapReduce.
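The map, shuffle, and reduce phases described above can be sketched in a few lines of plain Python. This is a minimal, single-machine illustration of the programming model using the classic word-count example, not how Hadoop itself is written; in a real cluster the framework runs the map and reduce functions on many servers and handles the shuffle over the network.

```python
from collections import defaultdict

def map_phase(document):
    # Map step: emit a (word, 1) pair for every word in the document.
    return [(word.lower(), 1) for word in document.split()]

def shuffle(mapped_pairs):
    # Shuffle step: group all values by key, as the framework
    # would do between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in mapped_pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce step: sum the counts collected for each word.
    return {word: sum(counts) for word, counts in groups.items()}

documents = ["big data needs big tools", "data drives decisions"]
mapped = [pair for doc in documents for pair in map_phase(doc)]
result = reduce_phase(shuffle(mapped))
# result maps each word to its total count across all documents,
# e.g. "big" -> 2, "data" -> 2
```

Because each map call touches only one document and each reduce call touches only one key, both phases can run in parallel across as many machines as the data demands, which is exactly what makes the model a fit for elastic infrastructure.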

Big data techniques are not something that only the corporations collecting social media data need; they are something every business should look into, sooner rather than later. Whether every business needs to factor in social media data in some form is a separate topic, but even if that part is ignored, the volume of structured data is increasing by the day.


With all that in mind, it is worth explaining how well big data fits the elasticity of the cloud. Imagine an operation where data needs to be segregated by some specific parameter onto different servers. Those servers might run some processing depending on the type of the data, or simply store the data to improve access times. A true cloud environment is the perfect host for such an operation: you can spin up new servers with a specific configuration at run time with just a few lines of script.

Where are we heading
In 2011 Google Fiber announced that Kansas City would be the first to receive 1 gigabit per second internet speed, followed by Austin, TX and Provo, UT. As per the company's announcement in February 2014, Google Fiber will be reaching another 34 cities. AT&T stepped up by announcing that its new service, GigaPower, will bring gigabit internet speed to as many as 100 municipalities in 25 metropolitan areas. Besides Google and AT&T, many other large and small providers, such as Cox, Greenlight Networks, Canby Telcom, CenturyLink, and Sonic.net, are working on delivering superfast internet speeds in targeted areas.

Considering this new scale of bandwidth, the way application technology works is going to change, especially the parts that involve mobile and cloud. It will be much more convenient to run a memory- and processor-intensive operation in a cloud environment, streaming status and results to a browser running on your laptop or a small handheld mobile device.

Moving the heavy lifting to the cloud while keeping control on low-resource devices is not something that is going to happen; it is happening now. Only the scale and outreach are going to increase exponentially. Everyone connected to this field, whether providers, consumers, decision makers, technology workers, business users, or consultants, should pay attention to the changes and keep a strategy for the future.

Mobile first cloud first simplified.