Data is big. And getting bigger. It grew exponentially just in the past few seconds. Trust me…it did. Those in the know say data will grow 40 percent, compounded annually, for the foreseeable future. That brings up a few key points.
There was a time when data practitioners focused on a single functional role and typically associated that function with a particular tool, like Oracle or SQL Server. A guy or gal could be an experienced data analyst on the Oracle RDBMS and pretty much not have to worry about DB2 or SQL Server. Sure, they might gain some peripheral knowledge of other platforms, but for the most part, their expertise was siloed. That wasn’t really a problem, because in those days organizations routinely deployed and managed one major RDBMS. But those days are gone.
Roger Lenihan, Big Data Specialist and Data Evangelist at SPR, was one such data professional. To a point. He began as a DBA in the Oracle environment and specialized there for the better part of a decade. He delved deep into the intricacies of the Oracle RDBMS and explored all facets of the platform: from partitioning, RAC, Data Guard, monitoring and database design to performance tuning and query optimization. He was an expert and quite comfortable with it, thank you very much. To a point.
See, Roger is the inquisitive type. He always wanted to learn more. No, he needed to learn and experience more. So he set out to explore new realms of data, briefly investigating the details and nuances of IBM DB2 and Microsoft SQL Server before turning his full attention to Hadoop and big data. Here’s where he took a deep dive and built the breadth of skills that made him a certified authority. Roger is certified in Hadoop, SQL Server, Linux and Oracle.
Today, the data explosion has turned everything upside down, and data management professionals with narrow expertise are less in demand. Not only is data growing like nobody’s business; the complexity of harnessing, analyzing and capitalizing on data resources is a tough nut to crack. It’s not for the faint of heart, or better stated, for the unskilled. Information management these days is simply too crucial, wielding the means for organizations to create sustained competitive advantage. Great apps and superb user experiences are mere shells without access to the data that supports knowledge and action.
Consider these examples:
The landscape is scattered with positions linked to data management: data quality specialist, insights manager, data analyst, master of data governance and a whole gamut of others. And you’re one of them (or hope to be); otherwise you wouldn’t be reading this post. So where do you go from here to strengthen your position as a skilled resource in this burgeoning space?
Think about this.
According to Stacy Blanchard, an executive at Accenture Analytics (a 20,000-plus-employee unit of the management consulting and technology services firm of the same name), the impetus is to find people who can tell the CEO what’s going to happen next, not what happened last week or last month. With that demand, she believes a generation gap of sorts is emerging within the BI and information management workforce. With regard to the next generation, Blanchard states, “They’re typically statisticians who are deep into data modeling, they’re close to the technology, and they know the right algorithms to use with the data available.”
IM pros who know how to use emerging big data platforms like Hadoop and NoSQL databases will help organizations put swelling volumes of data to work, providing deeper insights and more accurate predictions. These professionals will easily be the top earners.
But get this. Roger (himself certified in Hadoop, Oracle and SQL Server) warns against claiming to know Hadoop. He suggests that anyone with even a minimal understanding of Hadoop would find such a claim suspect.
Hortonworks, the California software company focused on the development and support of Apache Hadoop, describes Open Enterprise Hadoop as an ecosystem of projects. The short, high-level definition: “the open source framework for storing and extracting insight from massive volumes of data.”
A more comprehensive explanation:
Numerous Apache Software Foundation projects make up the services required by an enterprise to deploy, integrate and work with Hadoop. Each project has been developed to deliver an explicit function and each has its own community of developers and individual release cycles.
Translate: there’s a lot to know.
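To give a flavor of what “extracting insight from massive volumes of data” means in practice, here is a minimal word-count sketch in the MapReduce style that Hadoop popularized. This is an illustration, not Hadoop itself: it runs locally in plain Python, the function names are our own, and the grouping step stands in for the shuffle phase that a real Hadoop cluster performs across machines.

```python
from collections import defaultdict

def mapper(line):
    # Map phase: emit a (word, 1) pair for every word in a line of input.
    for word in line.lower().split():
        yield word, 1

def reducer(word, counts):
    # Reduce phase: sum all the counts emitted for one word.
    return word, sum(counts)

def run_job(lines):
    # Local stand-in for Hadoop's shuffle: group mapped pairs by key,
    # then hand each key's values to the reducer.
    grouped = defaultdict(list)
    for line in lines:
        for word, count in mapper(line):
            grouped[word].append(count)
    return dict(reducer(w, c) for w, c in grouped.items())

result = run_job(["big data is big", "data grows"])
# result maps each word to its total count across all lines
```

On a real cluster, the same mapper/reducer logic would be distributed over many nodes and petabytes of input; the point is that the programming model itself is small, while the surrounding ecosystem of projects is where the learning curve lives.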
Bottom line? For professionals like Roger who possess advanced skills, the world is rolling at their feet. Before choosing to join SPR Consulting, Roger had many enviable options; some of the biggest names in the tech industry were vying for his experience and skills. And yet he selected the award-winning consultancy, based in Chicago’s bustling Loop, for a few key reasons:
Finally, it’s important to note that SPR is committed to the advancement of talent, be it someone with years of experience stretching into these evolving data roles, or someone fresh out of school armed solely with tenacity and a dream. Training is key, and SPR offers the best of it. So decide what you want to do, and where you want to play. Then reach out. We’re here.