MIT Sloan Sports Analytics Conference, Boston
Interesting research topics abound here, on panels and in poster presentations. How about "How Much Do Coaches Matter" from the University of Chicago, "Leveraging Pitcher-Batter Matchups for Optimal Game Strategy" from BYU, or "Using Deep Learning to Understand Patterns of Offensive Player Movement in the NBA" from Google and MIT?
Many of the underlying research papers are very well done, often because their authors take the time to define and specify the problem, pose specific research questions, apply the appropriate methodology to those questions, and interpret the results to answer them - or at least to advance our understanding of the phenomenon. Too many works, however, get lost in the statistical tools and bury the lede.
A positive example (images 1-3 below)? Two researchers presented a paper on "Analytics for the Front Office: Valuing Protections on NBA Drafts." Protected picks are growing in frequency in the Association, so exploring how best to value them makes sense. It wouldn't be my first choice for discussion, but the authors did a superb job of inviting the reader into their work by stating the problem, defining the research questions, and applying the results. They identified workable research questions and applied the right statistical technique to each one - running a Monte Carlo simulation for one question, for example, and building a financial asset pricing model for another (a rough sketch of that kind of simulation appears below). They understood that the statistical tools and the resulting quantitative analyses are the means and not the end. The end comes with creating new knowledge that can inform better decision-making.
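To make that concrete, here is a minimal sketch, in Python, of the kind of Monte Carlo exercise the authors describe: estimating the expected value a team actually receives from a top-5-protected pick. The lottery weights, pick values, and protection threshold below are made-up placeholders for illustration, not the authors' model and not real NBA figures.

```python
# Illustrative only: slot weights and pick values are invented placeholders,
# not actual NBA lottery odds or the authors' valuation model.
import random

SLOT_WEIGHTS = [1, 1, 1, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 3]   # relative chance of landing at slots 1-14
PICK_VALUE = [10.0, 8.5, 7.5, 6.8, 6.2, 5.6, 5.1, 4.7,
              4.3, 4.0, 3.7, 3.5, 3.3, 3.1]                  # surplus value of each slot (arbitrary units)
PROTECTED_THROUGH = 5                                        # a "top-5 protected" pick

def expected_conveyed_value(n_trials: int = 100_000) -> float:
    """Monte Carlo estimate of the value the receiving team gets;
    slots inside the protection convey nothing this year."""
    total = 0.0
    for _ in range(n_trials):
        slot = random.choices(range(1, 15), weights=SLOT_WEIGHTS)[0]
        if slot > PROTECTED_THROUGH:
            total += PICK_VALUE[slot - 1]
    return total / n_trials

print(f"Expected conveyed value this year: {expected_conveyed_value():.2f}")
```

The point of the exercise is the one the authors make: the simulation is a means to a plain-English answer ("this protection is worth roughly X in expected value to the receiving team"), not the end in itself.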
A negative example (image 4 below)? I'm interested in the court movements of NBA players, virtually every instant of which is captured by optical tracking cameras. The Golden State Warriors have shown us that winning in the NBA - at least for the moment - means moving without the ball, creating space, and generating touches. I eagerly reviewed the paper on this subject but was left wanting more ... and better. The researchers studied 2.3 million transitions across all 1,230 NBA games in 2015-16, which is remarkable. Great, except their effort disappeared into their Markov model and Poisson intensity calculations and failed to tell us why the research mattered and, essentially, what they learned. Don't get me wrong; I get what they did and what it means. Their challenge, and the challenge facing all data analysts, however, is to translate their work into English. If they don't, they greatly reduce the ability and willingness of their coaches, general managers, and other organizational leaders to understand and use it. (The toy sketch below shows what one piece of that translation might look like.)
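For readers curious about the machinery itself, here is a toy sketch under stated assumptions: the offensive states and transition counts are invented, and the paper's actual model works in continuous time with Poisson intensities, whereas this simple discrete-time Markov chain only illustrates how such a model can be turned into a plain-English answer about how possessions tend to end.

```python
# Toy example (not the paper's model): a small Markov chain over hypothetical
# offensive states, built from made-up transition counts, used to answer a
# plain-English question: how often does a possession starting with the ball
# handler end in a catch-and-shoot versus an isolation?
import numpy as np

states = ["handler", "off_ball_cut", "spot_up", "catch_and_shoot_end", "isolation_end"]

# Made-up transition counts (rows = from state, columns = to state)
counts = np.array([
    [ 0, 40, 50, 10, 20],
    [30,  0, 30, 30,  0],
    [20, 10,  0, 50,  5],
    [ 0,  0,  0,  1,  0],   # absorbing: possession ends in a catch-and-shoot
    [ 0,  0,  0,  0,  1],   # absorbing: possession ends in an isolation
], dtype=float)

P = counts / counts.sum(axis=1, keepdims=True)   # row-normalize into transition probabilities

def ending_distribution(P, start=0, n_steps=50):
    """Approximate probability of ending in each absorbing state, starting from `start`."""
    dist = np.zeros(len(states))
    dist[start] = 1.0
    for _ in range(n_steps):
        dist = dist @ P
    return dict(zip(states, dist.round(3)))

print(ending_distribution(P))
```

A coach doesn't need the matrix algebra; a sentence like "starting from the primary ball handler, this offense ends far more possessions in catch-and-shoots than in isolations" is the deliverable the analysis should produce.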