Over the last few years there has been a lot of industry buzz about the future of the enterprise data warehouse (EDW). Perhaps it is time to give the classic EDW acronym a new expansion: Extended Data Warehouse.

If you have any doubts about the data flood that is covering the globe, here are a few amazing stats. Around the world, in just one minute…

There have been several advancements in the Hadoop ecosystem that have positioned Hadoop closer to the data warehousing community than ever before. With the series of Hadoop 2.0 releases that began in October 2013, Hadoop is now much closer to being a viable platform for a data warehouse.
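To make that concrete, here is a minimal sketch of what warehouse-style work on Hadoop can look like, assuming a HiveServer2 endpoint and the open-source PyHive client; the host name and the `sales` table with its `region` and `amount` columns are hypothetical placeholders for illustration, not a specific product configuration.

```python
# A minimal sketch: a classic warehouse-style aggregation submitted to
# Hadoop through Hive. The endpoint and the `sales` table are assumptions.
from pyhive import hive  # open-source Python client for HiveServer2

conn = hive.connect(host='hive.example.com', port=10000, database='default')
cursor = conn.cursor()

# Familiar SQL, but executed as distributed jobs across the Hadoop cluster.
cursor.execute("""
    SELECT region, SUM(amount) AS total_sales
    FROM sales
    GROUP BY region
    ORDER BY total_sales DESC
""")

for region, total_sales in cursor.fetchall():
    print(region, total_sales)

cursor.close()
conn.close()
```

The point is less the Python plumbing than the fact that a familiar SQL aggregation now runs directly on the cluster that stores the raw data.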

How old is your data warehouse? It’s a simple question, and probably one you don’t think about much. The majority of production data warehouses are now 15-20 years old and probably very transaction-centric. Over the years, you’ve probably remodeled “the house” more than a few times, adding some “rooms” and “upgrades” here and there. It’s starting to feel its age as more business intelligence requirements have been added, including mobile applications and specialized analytics. And more ideas seem to show up in your inbox every day, especially Big Data questions.

The success of any company is becoming more and more dependent on unlocking the value of data and turning it into trusted information for critical decision making. The ability to deliver the right information at the right time and in the right context is crucial. Today, organizations are bursting with data, yet most executives would agree they need to improve how they leverage information: preventing multiple versions of the truth, improving trust and control, and responding quickly to change.

If you are an IBM customer, it is very likely you have received some level of education about IBM’s Information Management solutions platform, which includes IBM’s Big Data strategy.

As a much younger writer and marketing guy watching the database technology boom of the ’80s and ’90s, I was fascinated by the data warehousing surge that started about twenty years ago. I saw it coming and watched it bloom. The promise of a “sandbox of meaningful data” that line-of-business managers could use quickly and easily was exciting.

The total cost of ownership (TCO) of Business Intelligence (BI) systems is often measured in three categories: time-to-completion of projects, on-budget completion of projects, and cost per user of BI applications. One key process in every project impacts all three categories: Business Requirements Engineering.

An effective requirements methodology ensures that project scope is clearly understood and costs are accurately estimated. At the same time, when we deliver what users want, adoption increases and the user base grows. Why, then, do so many programs not take a closer look at the effectiveness of their approach to this key part of the process?

BI professionals are used to working with a wide range of products and platforms, and typically carry a substantial tool belt for working across many different technologies. Over the past couple of months I took the opportunity to experiment with technologies that are entering the data warehousing ecosystem: the Cloudera Sandbox, the Hortonworks Sandbox, the IBM BigInsights sandbox, and Amazon Redshift.
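As one illustration of how approachable these platforms have become, the sketch below queries an Amazon Redshift cluster from Python. Redshift speaks the PostgreSQL wire protocol, so the standard psycopg2 driver works against it; the cluster endpoint, database, and credentials shown are hypothetical placeholders.

```python
# A minimal sketch of querying Amazon Redshift from Python. Because
# Redshift is wire-compatible with PostgreSQL, the ordinary psycopg2
# driver connects to it; every value below is a hypothetical placeholder.
import psycopg2

conn = psycopg2.connect(
    host='examplecluster.abc123.us-east-1.redshift.amazonaws.com',
    port=5439,  # Redshift's default port
    dbname='dev',
    user='awsuser',
    password='example-password',
)

with conn.cursor() as cursor:
    # pg_table_def is a Redshift system view with one row per table column.
    cursor.execute(
        "SELECT COUNT(*) FROM pg_table_def WHERE schemaname = 'public'"
    )
    print('column definitions in public schema:', cursor.fetchone()[0])

conn.close()
```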

The business world continues to evaluate and implement the cloud for some of its IT requirements. The idea of the cloud as a viable IT storage solution, as well as a way to cut costs, is gaining momentum. That momentum raises a question: is the cloud the right place for a data warehouse?

This is an interesting question for many, and a problematic question for some.

iOLAP Inc. has signed a Consulting Partner agreement with web infrastructure provider Amazon Web Services, Inc. (AWS) for its new Amazon Redshift platform, a petabyte-scale data warehouse service in the cloud.