Because I will have to do this again in no time, here I jot down the quickest way for me to install 12c. This is the usual cycle: follow the steps, run into an error, search for a fix, lather, rinse, repeat.
This blog post is about OBIEE reporting. Specifically, it is about skipping the data warehouse and reporting from the transactional database instead. Oracle's OBIEE, like most BI reporting tools, is designed to report from star/snowflake schemas as the underlying structures. Additionally, OBIEE's metadata layer is rich and well thought out, allowing for a great deal of flexibility. Oracle's metadata tool (the Admin Tool) lets us leverage that flexibility to bridge the gap between an OLTP and an OLAP model. I am not negating the need for a data warehouse; I am just wondering whether every BI reporting project merits one.
So the question to ask is: Is OBIEE up to the task?
I’ve added the ability to consume URLs whose output is XML as a data source for the Flex OLAP cube. Check it out :)
Going through this exercise has helped me understand a bit about creating data cubes and the uses they serve. This is but one of the many interesting (to me) things I am exposed to at work. Doing this from scratch gives me insight unattainable with a ‘shrink-wrap’ tool.
Note – I had originally fetched data from across the web, but I had to store the XML files locally for show and tell. Enjoy.
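The idea behind the XML data source can be sketched outside of Flex: fetch a URL (or, as above, read a locally stored XML file) and flatten the document into rows that a cube component can aggregate. This is a minimal Python sketch of that flattening step; the `<rating>` element and its attribute names are assumptions for illustration, not the actual feed format.

```python
import xml.etree.ElementTree as ET

# Stand-in for the XML a URL would return; the element and attribute
# names here are hypothetical, not the real feed's schema.
SAMPLE = """
<ratings>
  <rating movie="Toy Story" genre="Animation" user="1" stars="4"/>
  <rating movie="Heat" genre="Crime" user="2" stars="5"/>
</ratings>
"""

def rows_from_xml(xml_text):
    """Flatten an XML ratings document into a list of dicts (one per fact)."""
    root = ET.fromstring(xml_text)
    return [
        {
            "movie": r.get("movie"),
            "genre": r.get("genre"),
            "user": r.get("user"),
            "stars": int(r.get("stars")),
        }
        for r in root.findall("rating")
    ]

rows = rows_from_xml(SAMPLE)
print(rows[0]["movie"])  # Toy Story
```

In the live version, `xml_text` would come from an HTTP fetch of the configured URL instead of an inline string; everything downstream stays the same.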
I wanted to write the post ‘Slicing Your Own OLAP Cube’ but I am not there yet. A recurring theme in my friends’ comments on my last post was that the dimensions and measures that can be inspected were set in stone. I thought I was doing fairly well, but I can see their point. Having a cube sliced in a way you don’t need is kind of useless.
For this post, I will describe how to use the previously provided database to create data cubes from the MovieLens dataset. With these cubes, I will then create a few reports using Adobe Flex to illustrate the advantages of reporting from data cubes rather than the more traditional practice of querying the live database directly.
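What “creating a data cube” boils down to here is pre-aggregating a fact table (the ratings) along chosen dimensions, so reports read small summaries instead of hitting the live database. A minimal sketch of that aggregation, with illustrative field names rather than the actual MovieLens schema:

```python
from collections import defaultdict

# Tiny stand-in fact table; field names are illustrative only.
ratings = [
    {"genre": "Drama", "year": 1995, "stars": 4},
    {"genre": "Drama", "year": 1995, "stars": 5},
    {"genre": "Comedy", "year": 1996, "stars": 3},
]

def build_cube(facts, dims, measure):
    """Aggregate a measure (count and average) over each combination of dims."""
    cells = defaultdict(lambda: {"count": 0, "sum": 0})
    for fact in facts:
        key = tuple(fact[d] for d in dims)
        cells[key]["count"] += 1
        cells[key]["sum"] += fact[measure]
    # One cell per dimension combination, holding the pre-computed measures.
    return {k: {"count": v["count"], "avg": v["sum"] / v["count"]}
            for k, v in cells.items()}

cube = build_cube(ratings, ("genre", "year"), "stars")
print(cube[("Drama", 1995)])  # {'count': 2, 'avg': 4.5}
```

A report then just looks up cells in `cube` instead of re-running group-by queries against the transactional database, which is the advantage being illustrated.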
1. I have broken the MySQL dump file down into a set of individual files, one per table, after some complaints about unreasonable file size.
2. I’ve now included the 10 million movie ratings as well, which I had previously left out for the same size reason. They now live in a file of their own, so you can skip it if you find it difficult to import.
The set can be downloaded from Infochimps here. I will post any updates there as well.
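With one dump file per table, the import becomes a loop over the files, skipping the large ratings dump if your machine struggles with it. A sketch of building those import commands; the file names, database name, and the plain `mysql db < file` invocation are assumptions, not the actual names in the download.

```python
# Hypothetical per-table dump file names; substitute the real ones.
TABLE_DUMPS = ["movies.sql", "users.sql", "ratings_10m.sql"]
SKIP = {"ratings_10m.sql"}  # the optional 10M-ratings file

def import_commands(dumps, db="movielens", skip=()):
    """Build one `mysql` invocation per dump file, honoring the skip set."""
    return [f"mysql {db} < {f}" for f in dumps if f not in skip]

for cmd in import_commands(TABLE_DUMPS, skip=SKIP):
    print(cmd)
```

Running the printed commands in a shell (with credentials added as needed) loads each table independently, which is the point of splitting the dump.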