- 2001 – The red line is at 88%, while the blue line is at 79%
- 2011 – The red line and blue line are both at about 114%
No dual-axis needed! Chart junk be gone! What a difference!
Look at the chart below from Chart of the Day. One question, five seconds: how much has the variance between the two lines changed from 2001 to 2011?
I’m going to guess you came up with:
If that's what you guessed, you’re wrong, completely wrong! Why?
This is incredibly confusing and incredibly misleading!
What’s the real answer?
This is one of the worst and most misleading dual-axis charts I’ve ever seen. At a glance it looks like blue was slightly ahead of red, and then the gap got really wide in blue’s favor. However, it’s the complete opposite: red led blue by about nine points in 2001, and by 2011 the gap had essentially closed, with both lines at about 114%.
Scary, scary, scary!
I was contacted yesterday by a follower of my blog who lives in the UK and works for Autonet Insurance Group. She asked if I would host their infographic on this blog, but I only agreed once she confirmed it was okay for me to critique it a bit. I have very limited experience creating infographics, but I am a firm believer that you should always present data in the simplest format to understand.
Keep in mind that Wikipedia defines Information Graphics as “visual representations of information, data or knowledge. These graphics present complex information quickly and clearly.” Maybe this is why I’m not a huge fan of them. Scroll down below Autonet’s infographic for my improvements.
Infographic by Autonet Insurance
First, I love the content and the descriptions; they’re very informative. However, the chart choices make the data much harder to compare than it needs to be.
While the view below doesn’t give any of the great details of the infographic and doesn’t look as snazzy, it does make the comparisons much, much easier.
And if you insist on a donut chart, how about something like this?
Via the Guardian Datablog, questions to ponder:
Click on a state either in the map or the bar chart to update the trend. Filter to the years you want to see with the table on the lower-left.
It’s pretty clear that the peace index is worst in the South. I wonder if there is any correlation with gun control laws.
From Nielsen. There’s no explanation in the article of the scale, nor are any numbers mentioned. How am I supposed to compare men 35+ who are planning to purchase HDTVs to women 35+?
This chart could be showing whole numbers, percentages, or people planning to buy on the 2nd Saturday of next week. This is a really poor effort.
Have you ever received a chart similar to this? I did and was flabbergasted. What disturbed me most was that it was intentionally misleading. The other people who saw the chart didn’t notice the major problem: the dual axes are not synchronized (and a dual-axis chart is unnecessary in this case anyway).
I quickly corrected the chart and shared how it should look. They were stunned at the different stories these charts tell at first glance.
Bottom line: be skeptical when someone sends you a chart and quickly correct the situation if needed. Take Mar-11 as an example:
Now take a quick look at Dec-10 (don’t cheat and look at the data). In the 1st chart you’d think Group Y is killing Group E, when in fact Group E is outperforming Group Y.
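The Dec-10 illusion above can be sketched numerically. A dual-axis chart places each series on a 0–100% chart height using its own axis range, so a smaller value can be drawn higher on the page. The values and axis ranges below are assumed for illustration; the post doesn’t reproduce the underlying data.

```python
def to_height(value, axis_min, axis_max):
    """Position of a value on a 0-100 chart height, for a given axis range."""
    return (value - axis_min) / (axis_max - axis_min) * 100

group_e = 140   # assumed Dec-10 value, plotted on the left axis
group_y = 90    # assumed Dec-10 value, plotted on the right axis

# Unsynchronized axes: each series is scaled to its own range,
# so Group Y is drawn higher even though its value is smaller.
e_unsync = to_height(group_e, 100, 200)   # left axis runs 100-200 -> 40
y_unsync = to_height(group_y, 0, 100)     # right axis runs 0-100  -> 90

# A single shared range restores the true ordering.
e_sync = to_height(group_e, 0, 200)       # -> 70
y_sync = to_height(group_y, 0, 200)       # -> 45
```

With separate ranges, Group Y appears to tower over Group E; on a shared scale, Group E is clearly ahead, which is exactly the reversal described above.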
It’s a scary world out there when people try to intentionally mislead you in an effort to support their personal agenda. A friend of mine likes to say “Facts are friendly”. Presenting facts needs to be friendly as well.
Another interesting report from Nielsen: this time they measured the internet connection speeds of nine countries and presented the results as a stacked column chart.
One of the biggest problems with stacked bar or column charts is that it’s very difficult to compare values other than those at the bottom or top. In the Nielsen chart, it takes some time for your eyes to rank any of the measures other than “Above 8Mb”. I’ve turned to Tableau and presented several alternatives. Scroll through the tabs to see each.
I also used a parameter control to allow sorting by any of the measures. This allows you to quickly rank the countries for any of the speeds. My personal favorite is the “Favorite” tab. Which is yours? How else would you present the data?
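The parameter-control idea can be mimicked outside Tableau with a plain table sort: pick any speed band and rank the countries by it. The countries and percentages below are made up for illustration; Nielsen’s actual figures aren’t reproduced here.

```python
import pandas as pd

# Hypothetical share of connections in each speed band, per country.
df = pd.DataFrame({
    "country":   ["USA", "Germany", "Brazil"],
    "Above 8Mb": [30, 40, 10],
    "2-8Mb":     [50, 45, 35],
    "Below 2Mb": [20, 15, 55],
})

def rank_by(measure):
    """Countries ranked high-to-low on the chosen speed band."""
    return df.sort_values(measure, ascending=False)["country"].tolist()
```

Calling `rank_by("Below 2Mb")` instantly re-ranks the countries by the slowest band, which is exactly the comparison a stacked column chart makes hard for anything but the bottom or top segment.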
Contrary to popular belief, Scott Boras does not represent every big league player. He doesn't even represent all of the biggest names. In fact, Boras Corporation doesn't even represent the most baseball players. That distinction belongs to SFX World.
The Pareto principle is nearly in effect here – 29 of the 117 agencies (25%) account for 80% of the players. Click on any of the lines or bars to see the players and teams those agencies represent. Visit MLB Trade Rumors by clicking on the little red star on the logo.
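The Pareto check above boils down to a cumulative sum: sort the agencies by player count and see how many it takes to reach 80% of all players. The helper and the counts below are illustrative; the real MLB Trade Rumors data isn’t reproduced in this post.

```python
def agencies_for_share(player_counts, share=0.80):
    """Smallest number of top agencies whose players cover `share` of the total."""
    counts = sorted(player_counts, reverse=True)
    target = share * sum(counts)
    running = 0
    for i, count in enumerate(counts, start=1):
        running += count
        if running >= target:
            return i
    return len(counts)

# Hypothetical distribution: 4 big agencies and 16 small ones.
counts = [60, 40, 25, 15] + [2] * 16
```

On this made-up distribution, `agencies_for_share(counts)` returns 4, i.e. 20% of the agencies cover 80% of the players – the same kind of skew as the 29-of-117 figure above.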
A blog post from Nielsen says:
According to a new mobile video report from The Nielsen Company, the number of U.S. mobile subscribers watching video on their mobile devices rose more than 40 percent year-over-year in both the third and fourth quarters of 2010, ending the year at nearly 25 million people.
They presented their results with this table:
What’s wrong here?
I would reformat the table like this.
While Nielsen’s report says “the number of U.S. mobile subscribers watching video on their mobile devices rose more than 40 percent year-over-year in both the third and fourth quarters of 2010”, they fail to point out that growth has actually slowed in 2010, both quarter over quarter and year over year.
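The distinction between a big year-over-year number and slowing growth is easy to see with a quick calculation. The subscriber counts (in millions) below are hypothetical – the report only tells us the year ended at nearly 25 million – but they show how a series can post >40% YoY in Q3 and Q4 of 2010 while growth is decelerating the whole time.

```python
def growth(new, old):
    """Percent change from old to new."""
    return (new - old) / old * 100

# Hypothetical quarterly subscribers (millions), including the Q3 09
# figure that the report left out.
quarters = {"Q3 09": 15.0, "Q4 09": 17.5, "Q1 10": 20.0,
            "Q2 10": 22.0, "Q3 10": 23.5, "Q4 10": 24.8}

yoy_q3 = growth(quarters["Q3 10"], quarters["Q3 09"])  # year over year, Q3
yoy_q4 = growth(quarters["Q4 10"], quarters["Q4 09"])  # year over year, Q4
qoq_q3 = growth(quarters["Q3 10"], quarters["Q2 10"])  # quarter over quarter
qoq_q4 = growth(quarters["Q4 10"], quarters["Q3 10"])
```

Here both YoY figures exceed 40%, yet YoY growth falls from Q3 to Q4 and QoQ growth falls as well – the headline claim and the slowdown are both true at once.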
Bottom line: be careful what you read. I hope this wasn’t intentional, but Nielsen may have left out Q3 09 to paint a better picture.
I’ve previously published research regarding the Cobb County School Board’s decision to revert to the “traditional” calendar. Much to the dismay of the board members, many of the parents have not let this unpopular change go away.
In a very rare move, the Cobb school board was called to testify in front of a grand jury to discuss the recent calendar decision and the board’s overall decision-making process. The outcome/recommendation from the grand jury will likely not be known until early May.
The remainder of this blog post contains excerpts from research conducted by a friend and fellow Cobb County parent with respect to academic gains achieved to-date under the “balanced” calendar. View the research in its entirety here or see the bottom of this post.
In November of 2009, the CCSD approved a balanced calendar for a three-year period to begin in 2010-2011. The district committed to monitor the impact of the balanced calendar on key operational areas including student achievement.
On February 17, 2011, the Board of Education overturned this decision before the first year under the balanced calendar had even been completed, approving “traditional” calendars for the following two school years. Although this reversal was shocking, even more stunning was the fact that the board overturned its 2009 decision without honoring the commitment made to the district when the balanced calendar plan was established.
Because the 2010-2011 school year has not yet been completed, a full assessment of student achievement under the balanced calendar is not possible. However, the three assessments completed to date – The Iowa Tests of Basic Skills (ITBS), The Georgia High School Writing Test (GHSWT), and the End-of-Course Test (EOCT) – can provide some insight into the impact of the balanced calendar on student achievement.
The Iowa Tests of Basic Skills (ITBS)
Students are assessed in the fall on the Iowa Tests of Basic Skills (ITBS). This is a norm-referenced assessment that measures student achievement in comparison to other students nationwide. Students in grades 3, 5, and 7 are tested in Reading, Mathematics, Language Arts, Science, Social Studies and Sources of Information. Results include:
The Georgia High School Writing Test (GHSWT)
High school students are assessed on the Georgia High School Graduation Tests (GHSGT) and the Georgia High School Writing Test (GHSWT). Most students take the GHSWT during the fall of their junior year and are required to pass the test to earn a regular education diploma. Results include:
End-of-Course Test (EOCT)
All students take the End-of-Course Test (EOCT) after completing various courses in four different categories: English and Language Arts (ELA), Math, Science, and Social Studies. Results of the EOCT are used for diagnostic, remedial and accountability purposes to gauge the quality of education in the state and also count as part of the student's final grade in the course. Results include:
Quantifiable year-to-date student achievement in the Cobb County School District during the first year of the Balanced Calendar can be summarized as follows:
Of course, the results of only three assessments cannot definitively establish the impact of the balanced calendar on student achievement. Ten total assessments are published each year in the CCSD, and full evaluations of all ten at the end of the 2010-2011 and 2011-2012 school years would have provided the most reliable gauge of the balanced calendar’s impact.
However, these are the only assessments that have been completed to date, and one thing can be said for certain: the balanced calendar has not had an adverse effect on student achievement in the Cobb County School District to date in 2010-2011. In fact, all indications so far suggest that the balanced calendar has had a positive impact on student achievement and is promoting improved results.
From Chart of the Day:
For me, this chart is quite hard to read because (1) you have to tilt your head 90 degrees to read anything and (2) the background is unnecessarily dark. A bar chart would be far superior; you’d be able to read left-to-right, just like a book.