Keys to Working With Big Data
Read insights from 15 executives who have created big data solutions for clients, covering topics ranging from data sources and data integration to innovation.
To gather insights for DZone’s Big Data Research Guide, scheduled for release in August 2016, we spoke to 15 executives who have created big data solutions for their clients.
Here’s who we talked to:
Uri Maoz, Head of U.S. Sales and Marketing, Anodot | Dave McCrory, CTO, Basho | Carl Tsukahara, CMO, Birst | Bob Vaillancourt, Vice President, CFB Strategies | Mikko Jarva, CTO Intelligent Data, Comptel | Sham Mustafa, Co-Founder and CEO, Correlation One | Andrew Brust, Senior Director Marketing Strategy, Datameer | Tarun Thakur, CEO/Co-Founder, Datos IO | Guy Yehiav, CEO, Profitect | Hjalmar Gislason, Vice President of Data, Qlik | Guy Levy-Yurista, Head of Product, Sisense | Girish Pancha, CEO, StreamSets | Ciaran Dynes, Vice President of Products, Talend | Kim Hanmark, Director, Professional Services, TARGIT | Dennis Duckworth, Director of Product Marketing, VoltDB.
We asked these executives, “What are the keys to working with big data?”
Here’s what they told us:
There’s a high volume of data to manage.
Applications must handle the elastic nature of enterprise and e-commerce workloads. eBay, for example, uses MongoDB to scale out horizontally.
Resilience—recovery and management of data. Companies cannot afford any downtime.
Define what you want to get from big data (e.g., algorithms for facial recognition). Financial services and investment houses, for instance, may be building quantitatively based hedge funds that require analyzing many datasets and providing predictive analytics.
Transactional data tends to be less voluminous. We partner with Teradata for big data analysis, where we serve as the fast front end. Because we provide fast, real-time analytics, clients can deploy our product to see when errors may be forthcoming.
The ability to pull data from anywhere, enrich data sets, and apply spreadsheet functions with declarative models to model, structure, analyze, and visualize the data.
As data sets grow, think about where big data is going. Assembling large data warehouses is difficult, slow, expensive, and rigid; large credit card companies instead load data into Hadoop at speed. We have ETL tools for moving information, but the burden is making the data analytically ready before putting it on a platform. Multiple sources of data are available in a shorter period of time, and processing that data so it is ready to inform business decisions is difficult.
Innovation in dealing with volume and elasticity.
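The "analytically ready" point above is worth making concrete. A minimal sketch, not from any of the vendors quoted: raw records arriving from multiple sources are normalized, enriched with defaults, and filtered before they land on an analytics platform. The field names (`source`, `amount`, `ts`) are hypothetical.

```python
# Hypothetical ETL step: turn messy multi-source records into a clean,
# uniform shape that an analytics platform can consume.
from datetime import datetime, timezone

def make_analytics_ready(raw_records):
    """Normalize heterogeneous raw records into a uniform schema."""
    clean = []
    for rec in raw_records:
        # Drop records missing required fields -- bad rows are common
        # when loading from multiple sources at speed.
        if "amount" not in rec or "ts" not in rec:
            continue
        clean.append({
            # Enrich with a default when the source is unknown.
            "source": str(rec.get("source", "unknown")).lower(),
            # Coerce amounts that arrive as strings from some feeds.
            "amount": float(rec["amount"]),
            # Normalize timestamps to UTC ISO-8601.
            "ts": datetime.fromtimestamp(rec["ts"], tz=timezone.utc).isoformat(),
        })
    return clean

raw = [
    {"source": "Web", "amount": "19.99", "ts": 1_700_000_000},
    {"source": "POS", "amount": 5.00},       # missing ts -> dropped
    {"amount": 12.5, "ts": 1_700_000_100},   # missing source -> "unknown"
]
ready = make_analytics_ready(raw)
```

In practice this kind of transform runs inside a pipeline tool rather than hand-rolled code, but the shape of the work (validate, coerce, enrich, normalize) is the same.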