Rocana Releases Rocana Ops 1.5: Real Data Warehousing for IT Operations

A couple of weeks ago I had the pleasure of meeting Eric Sammer, co-founder and CTO of Rocana, a very original company in the field of big data and the Apache Hadoop project.

Rocana is a software company that has developed, on top of these technologies, an innovative set of solutions for analyzing IT operations data. With what appears to be an effective combination of event data warehouse and machine-learning capabilities, its flagship product Rocana Ops captures and analyzes enormous amounts of operations data in order to improve performance, identify existing and emerging problems, and optimize development and operations.

Sammer was kind enough to give me a full briefing on Rocana Ops and on the new features in version 1.5, which launched just recently.

Self-identifying as “the digital transformation company,” Rocana aims to provide enhanced visibility into IT operations by enabling IT departments to collect, store, measure, and analyze large amounts of operations data within a central repository, which can be translated into common IT metrics.

These metrics help to optimize IT operations via analysis and visualization. The solution contains a series of key functional elements along with an event-based data warehouse architecture that, combined with a series of data-collection and advanced-analysis features, provides a centralized infrastructure to perform effective analysis of large amounts of IT operations data.

Once collected, information can be easily understood using Rocana’s pre-built analytics infrastructure and advanced-analytics repository. All of Rocana’s founders have strong roots in the open-source scene, especially the Apache Hadoop project, and this shows at the core of Rocana Ops: the architecture mixes in-house development with several Apache projects. For example, the Hadoop Distributed File System (HDFS) provides long-term storage and downstream processing, Apache Parquet provides columnar storage, and Apache Kafka transports all data through the system.
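The data flow described above can be sketched conceptually. The field names and helpers below are illustrative assumptions, not Rocana's actual schema or API; the snippet only shows why an event warehouse that ships records over Kafka and lands them in Parquet benefits from transposing row-oriented events into a column-oriented layout:

```python
# Hedged sketch: a hypothetical normalized operations event and a
# row-to-column transposition mimicking how columnar formats like
# Parquet lay data out on disk. Nothing here is Rocana-specific.
from datetime import datetime, timezone


def make_event(host, service, level, message):
    """Build a normalized operations event record (illustrative fields)."""
    return {
        "ts": datetime.now(timezone.utc).isoformat(),
        "host": host,
        "service": service,
        "level": level,
        "message": message,
    }


def to_columnar(events):
    """Transpose row-oriented events into one array per field.

    This mirrors the column-oriented layout Parquet uses, which makes
    scanning a single field (e.g. every event's level) cheap because
    the other columns never have to be read.
    """
    columns = {}
    for event in events:
        for field, value in event.items():
            columns.setdefault(field, []).append(value)
    return columns


events = [
    make_event("web-01", "nginx", "ERROR", "upstream timed out"),
    make_event("db-02", "postgres", "WARN", "checkpoint took 42s"),
]
cols = to_columnar(events)
print(cols["host"])  # ['web-01', 'db-02']
```

In a real deployment the row-oriented records would flow through Kafka topics and a batch writer would perform this kind of transposition when flushing Parquet files to HDFS.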
