Though IT's functions and responsibilities have changed over the years, one area remains consistent: IT primarily focuses on major enterprise applications and on large machines, whether mainframes or super servers.
When IT deals with big data, the primary arena is, once again, large servers running parallel processing in a Hadoop environment. Fortunately for the company at large, IT also focuses on the reliability, security, governance, failover, and performance of data and applications, because if it didn't, nobody else internally would do the job that is required. Within this environment, IT's work centers on the structured transactions that flow in daily from the order, manufacturing, purchasing, service, and administrative systems that keep the enterprise running. Analytics, unstructured data, and smaller servers in end-user departments remain secondary.
“When we visit with large organizations, we often find common problems,” said Dan Ortega, vice president of marketing for Blazent, which provides big data intelligence solutions. “The fundamental issue is that it is the business that generates the data but IT that delivers it.”
Between those two functions, data is cleaned, vetted, and stored. Since IT performs all of these tasks, it is held responsible if data can't be readily accessed when the business wants it.
“When this happens, the end users in the business tend not to view IT as a strategic partner, and the users in the business adopt an attitude that they will only deal with IT when they absolutely have to,” Ortega said.