Do police departments across the US (or the world?) have the bandwidth to pore over crime reports in order to spot trends and mitigate crime using all the available information? Given the ever-increasing amount of data, now the norm, it will be increasingly difficult to make the best use of it. As it applies to crime, being able to effectively utilize this data should improve everyone's quality of life.
For week 40, commercial burglaries in sector 2 have increased from an expected count of 7.89 to an actual count of 16, roughly twice the expected volume. (Click image to try the analysis, or here for the image)
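A spike like that can be spotted programmatically. Here is a minimal sketch of the idea, flagging any category whose actual weekly count reaches a multiple of its expected count; the sample rows, the sector numbers, and the 2x threshold are illustrative assumptions, not taken from the actual analysis:

```python
# Hedged sketch: flag crime categories whose actual weekly count far
# exceeds the expected count. Data and threshold are assumptions.

weekly_counts = [
    # (sector, category, expected, actual)
    (2, "commercial burglary", 7.89, 16),
    (1, "auto theft", 4.50, 5),
    (3, "vandalism", 10.00, 9),
]

def flag_spikes(rows, ratio_threshold=2.0):
    """Return rows whose actual/expected ratio meets the threshold."""
    return [r for r in rows if r[3] / r[2] >= ratio_threshold]

for sector, category, expected, actual in flag_spikes(weekly_counts):
    print(f"Sector {sector}: {category} at {actual / expected:.1f}x expected")
```

With the sample data above, only the sector 2 burglaries clear the threshold (16 / 7.89 ≈ 2.0).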
Let's look at recent crime in Orlando and adjacent cities. What can we find out by exploring Orlando's crime data? Is the Central Florida area a relatively safe place? Can we tell from 90 days' worth of data? Are some areas safer than others? Do we have any false ideas about places in Orlando?
At the very least, let's attempt to get a better understanding of the crime issue that affects us everywhere.
Simply put, cohort analysis is a technique for analyzing activity over time by a common characteristic. Used mostly in sales and marketing, cohort analysis can help with tasks such as analyzing customer loyalty, customer acquisition cost, and marketing campaign effectiveness, and can be used to explore many other aspects of sales.
I am using the superstore sales dataset created by Michael Martin, found here or here. This Excel file contains three sheets, of which only the first one, Orders, will be used in this analysis.
The store providing their sales data runs monthly advertising campaigns and wants to track what impact these campaigns have on the number of orders placed over time. They want to use this information to evaluate their different campaigns and improve their efforts.
Given the superstore sales data and the requirements, let's present the number of orders placed per customer join date. Presenting the number of orders per join date will show the effectiveness of the advertising campaigns leading up to that date.
Any database server can be used to follow along. The code used here can easily be revised to work on any vendor's product, such as MySQL. For visualization purposes, Tableau can easily be replaced by LibreOffice or a similar tool.
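To make the cohort idea concrete, here is a rough sketch of the query: derive each customer's join date (the date of their first order), then count all orders placed by customers sharing that join date. The table and column names are assumptions, since the real schema comes from the Orders sheet, and sqlite3 stands in for whatever database server you prefer:

```python
import sqlite3

# Minimal sketch with made-up rows; table/column names are assumptions.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders (order_id INTEGER, customer_id TEXT, order_date TEXT);
INSERT INTO orders VALUES
  (1, 'A', '2014-01-05'),
  (2, 'A', '2014-02-10'),
  (3, 'B', '2014-01-05'),
  (4, 'C', '2014-03-01');
""")

# Join date = MIN(order_date) per customer; group all orders by it.
rows = con.execute("""
SELECT j.join_date, COUNT(*) AS order_count
FROM orders o
JOIN (SELECT customer_id, MIN(order_date) AS join_date
      FROM orders GROUP BY customer_id) j
  ON o.customer_id = j.customer_id
GROUP BY j.join_date
ORDER BY j.join_date
""").fetchall()

for join_date, order_count in rows:
    print(join_date, order_count)
```

Customers A and B both joined on 2014-01-05, so their cohort accounts for three orders in total, while C's cohort accounts for one; that per-cohort count is what gets charted against the campaigns.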
Another great, and wet, run! Non-stop rain did not dampen the enthusiasm of the thousands of runners that met this year for the 5K and 15K Miracle Miles Event in Orlando.
Highlights for me:
- First race-in-the-rain!
- Second race without wearing headphones (kept in my fanny pack as a security blanket). The first one was Disney, by choice; more on that later.
- Longest race without any walking.
- (Sub 🙂 ) 9 min/mi.
- Lots of fun.
I even knew a bunch of runners at the event; neat. Lastly, I feel happy about my performance. This was my first 15K, hence a reference for the distance and a time to beat for my next 15Ks; a hopeful PR!
Building on my previous race analysis for the Oviedo 5K, I debated whether to use Tableau (rocks!) again or try Oracle’s Cloud Analytics.
Disclaimer – Happy Oracle Employee.
Even though it has a 30-day trial, I think free developer accounts would be more suitable to accelerate adoption. I've inquired and await a response, but they know best, I guess. Oracle does offer such a service for APEX; maybe it's a matter of time…
Maybe next time; Tableau to the rescue it is.
I had been meaning to write about this semi-recent 5K run in beautiful Oviedo at First Baptist Church. At Raul's insistence, I decided to check this local favorite out.
Held this past May 24th, 2014, on a hot Saturday morning, the race offered a Florida-flat course (42 ft of elevation gain) with plenty of shade and that small-town feeling Oviedo is known for.
With a few weeks to ‘train’ I set out to ramp up the miles in order to reduce my time. No medals for me yet but I enjoyed the race sights, the crowd and the location. Nothing like an early short run to set one up for a nice weekend.
For fun, I decided to play around with the results in Tableau and try out the OS X version of the software. Feature-wise, the Windows and OS X versions are the same, with the Windows version being just a bit more stable with big sets of data.
Super exciting news today: Tableau Public's capabilities have been increased. This free solution had been limited to 100k records per data source and to a combined size of 50 MB for all your projects.
They have just increased data sources to 1 million records! Just as impressive, everyone now gets 1 GB of space for their projects. This is awesome news; it will surely save me lots of time spent scaling down a problem to fit Tableau. Yay!
While looking for Paul Revere, I recently learned that you can define many-to-many database relationships in Tableau. I should not have been surprised; Tableau seems to do most things well. It should wear a cape all the time.
Anyways… before I forget, here is how to define a many-to-many database relationship in Tableau.
First, let's define the domain for this exercise. Let's say we have three tables: one for people (PERSON), one for organizations (ORG), and an association table defining each person's membership in an organization (ORGPERSON). Each person can belong to zero, one, or many organizations and, of course, an organization can have any number of members as well.
Here is how this would look in a database:
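For readers who want to poke at the same shape of data themselves, here is a minimal sketch of that three-table schema; the column names are illustrative assumptions, and sqlite3 simply stands in for a real database:

```python
import sqlite3

# Sketch of the PERSON / ORG / ORGPERSON schema described above.
# Column names and sample rows are illustrative assumptions.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE person (person_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE org (org_id INTEGER PRIMARY KEY, name TEXT);
-- Association table: one row per (org, person) membership.
CREATE TABLE orgperson (
  org_id INTEGER REFERENCES org(org_id),
  person_id INTEGER REFERENCES person(person_id),
  PRIMARY KEY (org_id, person_id)
);
""")

con.executemany("INSERT INTO person VALUES (?, ?)",
                [(1, 'Paul Revere'), (2, 'John Hancock')])
con.executemany("INSERT INTO org VALUES (?, ?)",
                [(1, 'North Caucus'), (2, 'Long Room Club')])
# Paul Revere belongs to both organizations; John Hancock to one.
con.executemany("INSERT INTO orgperson VALUES (?, ?)",
                [(1, 1), (2, 1), (2, 2)])

# Members per organization, resolved through the association table.
rows = con.execute("""
SELECT o.name, COUNT(p.person_id)
FROM org o
JOIN orgperson op ON o.org_id = op.org_id
JOIN person p ON p.person_id = op.person_id
GROUP BY o.name
ORDER BY o.name
""").fetchall()
print(rows)
```

The same two joins (ORG to ORGPERSON, ORGPERSON to PERSON) are what Tableau needs to know about to model the relationship.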