I enjoy connecting events in my data & analytics work life with events in my non-work life. There are often good lessons to be learned there. On the non-work side, I’m a huge fan of K-12 robotics programs. I coached a FIRST Lego League robotics team for five years, and for the past two years I’ve been a mentor for a FIRST Robotics Competition team.
One aspect of the work the team has done this season has a direct crossover relationship with data. At FRC tournaments, there’s a task called scouting. While all of the different robots are competing on the field, your team needs to observe and rate the performances so you have an idea of each team’s strengths and weaknesses. At the end of the qualifying rounds, the top teams draft two additional teams onto their alliance for the playoffs. If you don’t scout, it’s harder to know which teams would be the best fit to draft onto your alliance. Scouting is a quintessential data exercise. You need to gather data (somehow), organize it to see if there’s a story there, and then act on it. Here are some good lessons learned from watching our team scout:
- There’s a cost to sourcing data
At the tournament, I saw teams scouting with smartphone apps, laptops, and paper/pencil. Additionally, the tournament organizers store and share some performance data that everyone can access. This brings up lots of good questions about the cost of sourcing your data. I can build a fancy app, but is the development cost worth it given the problem I’m trying to solve? I can use manual techniques on the cheap, but will that hinder my ability to analyze the data (it’s hard to auto-sort sheets of paper)? I can use the default data that’s available (instead of collecting my own), but will that put me at a competitive disadvantage?
- Know how you’re going to utilize the data
When the eight teams lined up to start drafting other teams onto their alliance, they were using smartphones, laptops, and clipboards to make their decisions on the spot. If they need to sort/filter, a laptop might be good. If they only have a few seconds to make a decision, a well-formatted sheet of paper on a clipboard might be best. There are a few variables here, but it comes down to thinking ahead about how you are going to use the data and letting that inform the channel you choose to disseminate results.
- Don’t forget data validation
After the tournament ends, you want to understand how good or bad your data were. Validation is hard and it takes effort. When tournaments are done, it’s hard for teenagers (and adults) to dedicate the time to go back and see what worked. The team can look at the outcome of the tournament and see whether their scouting data did a good job of predicting results. They can also look at a second source (the data generated by the tournament) and see if their data tracked in line with those numbers. Either way, it’s hard to profess faith in your scouting system if there is no validation.
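One way to sketch that second-source check is a rank correlation: if the teams your scouts rated highly also finished near the top of the official standings, your data tracked reality. Here’s a minimal illustration using Spearman’s rank correlation, with entirely invented team numbers, scouting scores, and official ranks (none of these come from a real tournament):

```python
# Hypothetical data: team number -> our scouts' average rating,
# and team number -> official qualifying-round rank (1 = best).
scout_scores = {254: 9.1, 1678: 8.7, 118: 7.9, 2056: 8.5, 33: 6.2}
official_rank = {254: 1, 1678: 3, 118: 4, 2056: 2, 33: 5}

def ranks(scores):
    """Convert scores to ranks (1 = highest score). Assumes no ties."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {team: i + 1 for i, team in enumerate(ordered)}

def spearman(rank_a, rank_b):
    """Spearman rank correlation for two rankings over the same teams."""
    n = len(rank_a)
    d_squared = sum((rank_a[t] - rank_b[t]) ** 2 for t in rank_a)
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))

scout_rank = ranks(scout_scores)
correlation = spearman(scout_rank, official_rank)
print(f"Scouting vs. official rank correlation: {correlation:.2f}")  # 0.90 here
```

A correlation near 1.0 suggests the scouting system tracked the official numbers well; a value near zero means it’s time to rethink how you’re rating robots.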
It’s nice to look at instances when the same lessons can be learned in two different settings. It gives credence to the validity of the lessons…that good data habits can transfer from one situation to the next. In this case, it’s really not a bad thing when worlds collide.
One last pitch: if you want to feel great about yourself and the impact you can have on our future generations, think about getting involved with FIRST (or similar organizations). You won’t regret it.