Sep 25, 2017
Off and Running
Brad Stenger

The use of sports science is taking off in strength and conditioning departments. But before getting carried away with data, this roundtable of experts urges coaches to first understand what it means.

This article first appeared in the October 2017 issue of Training & Conditioning.

Since Jim Harbaugh became Head Football Coach at the University of Michigan three years ago, he has continually brought innovative ideas to the program. One of those was the 2016 hiring of Fergus Connolly, PhD, as Director of Performance.

Connolly is not strictly a strength coach. He is an expert on using technology and analysis to boost athlete training, with a PhD in computer-based supply-chain optimization. Formerly the Director of Elite Performance with the San Francisco 49ers, he is charged with utilizing data in an effective way to help Michigan players get bigger, faster, and stronger.

The key phrase in Connolly’s directive is “in an effective way.” Over the past few years, teams at the top levels have increasingly integrated sports science into their training programs. But sometimes, the results have been frustrating. There is often not enough technical know-how on staff to make the science meaningful.

Connolly is a strong believer in using technology in training. In fact, he recently published a book on the topic called Game Changer. But he is also a realist. At Michigan, he is incorporating sports science into the football team’s training carefully and without pretense. This is fairly rare in the burgeoning field.

So how can strength coaches make good decisions regarding the use of sports science, if experts are hard to find? We asked Connolly and Ryan Smyth, a strength coach for the Anaheim Ducks, to participate in a roundtable discussion on the topic. Smyth helped the Ducks reach the NHL conference finals earlier this year, and he operates The Park Sports Facility, a sports performance technology and integration consulting firm in Whitby, Ontario, Canada.

In the following conversation, we discuss how to implement sports science, which tools are most effective, and how to make the data meaningful. We’ll also offer some definitions along the way.

What’s a good place to get started with technology?

Connolly: The best approach is to work backward. Ask yourself, what are we trying to solve? What problem are we trying to fix? From there, you need to figure out the best ways to measure what you are trying to solve and how technology could help.

Sometimes, coaches measure everything and then say, “Let’s see what we’ve got.” I don’t think that’s effective. You need to have a specific objective in mind.

And to get the biggest bang for your buck, you have to know how to use the tools and how to integrate them. The key is to optimize and exploit what they have to offer.

One popular new approach pairs heart-rate monitors with location-tracking devices to assess an athlete’s training. Outdoors, GPS devices can track athletes’ movements in two dimensions (forward, backward, and side to side, but not vertically). Indoors, accelerometers and radio-frequency (RF) tracking are often used. What are your thoughts on these?

Smyth: Heart-rate monitors are a good way to know how difficult a practice is for players. When you combine them with GPS or RF, you can get an even better picture of what your athletes are doing in terms of physical exertion.

Connolly: It’s important to use the data intelligently. Very often, you give sport coaches the numbers, and they’re going to look for the guy who covers the most distance. There’s a tendency to think, “If this guy did the most running, then we need to rest him more.” Well, hold on: he might just have a higher tolerance than his teammates.
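
As a rough illustration of Connolly’s point, here is a minimal Python sketch that compares each athlete’s latest session distance to his own recent average rather than to his teammates’. The file name, column names, and the 20 percent threshold are assumptions for the example, not a standard.

```python
# Minimal sketch: flag athletes whose latest session distance is high
# *for them*, rather than simply ranking everyone by total distance.
# The file, column names, and 20% threshold are illustrative assumptions.
import pandas as pd

sessions = pd.read_csv("gps_sessions.csv")  # one row per athlete per session

# Each athlete's average session distance to date (a simple personal baseline)
baseline = (
    sessions.groupby("athlete", as_index=False)["distance_m"]
    .mean()
    .rename(columns={"distance_m": "baseline_m"})
)

# Most recent session for each athlete
latest = sessions.sort_values("date").groupby("athlete").tail(1)

report = latest.merge(baseline, on="athlete")

# Flag athletes who exceeded their own norm, not whoever ran the farthest
report["over_baseline"] = report["distance_m"] > 1.2 * report["baseline_m"]
print(report[["athlete", "distance_m", "baseline_m", "over_baseline"]])
```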

Force plates, floor-mounted sensors that precisely measure the ground forces athletes produce when they jump on them, are also getting a lot of attention. Do either of you use them?

Smyth: I am a big fan of them. The force plates I use give me instant information. I look at each player individually and say, “This is where they’re at, and this is where they should be in terms of their development, fatigue, recovery, or performance.”

Because they have sensors beneath each foot, force plates also show me the left-to-right differentials. This can be helpful with developmental curves and return-to-play decisions after injury.

Connolly: I agree that force platforms are useful, especially for return to play. It’s one of those assessments that can be very difficult, though. It’s not easy to analyze the information, but it is easy to mess up the measurement because of the large forces and super short time frames involved. If you don’t make sure a measurement is within defined bounds, you could wind up using invalid data points.

Smyth: Also, some people make the mistake of raising a big red flag when they see left-to-right differentials. Most athletes are dominant on one side, and that’s okay. In reality, no one jumps perfectly evenly on both sides.

Connolly: It kills me when people say they’re trying to find symmetry. There’s no such thing. A balanced ice hockey player is probably sitting in the stands.
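
To make the two points above concrete, here is a minimal Python sketch of the kind of checks involved: a bounds test on the raw measurement and a simple left-to-right asymmetry index. The specific limits and the 10 percent band are illustrative assumptions, not published norms.

```python
# Minimal sketch: validate a force-plate jump trial, then compute a simple
# left/right asymmetry index. The bounds and the 10% band are illustrative
# assumptions, not published thresholds.

def valid_trial(peak_force_n: float, contact_time_s: float) -> bool:
    """Reject obviously bad trials before they enter the athlete's record."""
    return 500.0 <= peak_force_n <= 8000.0 and 0.05 <= contact_time_s <= 2.0

def asymmetry_pct(left_peak_n: float, right_peak_n: float) -> float:
    """Left-right difference as a percentage of the stronger side."""
    stronger = max(left_peak_n, right_peak_n)
    return 100.0 * abs(left_peak_n - right_peak_n) / stronger

left, right = 1250.0, 1100.0
if valid_trial(left + right, 0.45):
    diff = asymmetry_pct(left, right)
    # Most athletes show some dominance; only large, persistent gaps matter.
    label = "worth watching" if diff > 10 else "normal dominance"
    print(f"Asymmetry: {diff:.1f}% ({label})")
else:
    print("Trial outside defined bounds - discard and re-test")
```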

Along with the monitoring tools, some companies sell athlete management systems: database-backed software that helps gather and organize athletes’ physical and athletic performance measurements. What are your thoughts on these?

Connolly: The significance of the athlete management system is the output. So you have to start by asking, “What are we trying to do? What do we need to use the information for?” I think that’s the key and the most important thing people need to keep in mind.

One other piece of advice: If you’re going to buy an athlete management system, try to set up the data in Excel first so you can show the company your needs. Say to them, “This is what we do. Can you replicate it?” You want them to at least match what you’re doing in a spreadsheet.
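
As a minimal sketch of what “set it up in Excel first” might look like, the snippet below builds a small athlete-testing table and writes it to a spreadsheet. The test names and columns are illustrative assumptions; the point is only to have a concrete file to show a vendor.

```python
# Minimal sketch: prototype your athlete data in a spreadsheet before
# buying an athlete management system. Test names and columns are
# illustrative; use whatever your program actually measures.
import pandas as pd

tests = pd.DataFrame(
    [
        {"athlete": "Player A", "date": "2017-09-01", "vertical_jump_cm": 61.0,
         "sprint_10m_s": 1.78, "squat_1rm_kg": 160},
        {"athlete": "Player B", "date": "2017-09-01", "vertical_jump_cm": 55.5,
         "sprint_10m_s": 1.85, "squat_1rm_kg": 150},
    ]
)

# Hand this file to the vendor and ask: "Can you replicate this?"
tests.to_excel("combine_testing.xlsx", index=False)
```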

Smyth: Athlete management systems have come a long way, especially recently. Some of the athlete monitoring systems that are out there try to do too much. I appreciate the ones that say, “What do you need to do?” They’re honest. They say what they do and do what they say.

The one I use manages data from hundreds of athletes. During our hockey combines, every athlete takes a battery of performance tests that measure explosiveness, agility, strength, and endurance. The system keeps all of that information organized and accessible.

When using an athlete management system, my advice is to keep the reports simple. And you need to have someone dedicated to doing the input. It’s a lot of work to input all the information coming in. It needs to be organized.

As coaches realize the importance of recovery, new technology aims to measure it, especially through tracking the amount of sleep an athlete gets. Most of this is software- and question-based, but some sleep technologies use sensors. How much are you using these?

Smyth: I use some sleep technologies, depending on where the team that I’m dealing with is located. If the team is on the West Coast and traveling to the East Coast, it’s good to understand how sleep is affecting them. Compliance is always an issue with sleep technology, though.

How about using technology for talent identification?

Connolly: The key aspect of talent identification is first knowing what your model athlete is. You should identify all of the qualities that make up this athlete.

But you have to remember that we are only able to measure physical qualities, and that’s not a complete model of the athlete. You also need to consider decision-making skills, tactical awareness, and technical awareness, to name a few.

Smyth: I put on multiple combines every year and use technology to supply coaches with information beyond what they can see and easily test. What I try to do with the technology is put the athletes in different categories. If you need a powerful athlete, here are the most powerful athletes in the combine. If you need a forceful or acceleration-driven athlete, here they are.

Most scouts have an eye test and do their due diligence in their categories. They know who they want to draft in the first 10 picks. The harder part is looking at guys for the later rounds who can fill out the roster and eventually make big impacts.

Connolly: If you look at the teams that have sustained success, the second tier of players has the most to do with it. It’s not simply a matter of developing future players; it’s actually about practice time. How well are those guys going to practice against your first team? That’s where they add value. That’s where the difference is made.

The technology provides a metric. Sometimes the metric produces insight into an aspect of an athlete’s performance. If the insight holds up, we use it to coach the next guy, and the next guy after him, and so on.
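
As a minimal illustration of the categorization Smyth describes above, here is a short Python sketch that surfaces the top athletes in each category a scout asks for. The file name, column names, and choice of metrics are assumptions for the example.

```python
# Minimal sketch: rank combine athletes by the quality a team is looking for.
# Column names and the choice of metrics are illustrative assumptions.
import pandas as pd

combine = pd.read_csv("combine_results.csv")

# "If you need a powerful athlete, here are the most powerful athletes."
most_powerful = combine.nlargest(5, "peak_power_w")[["athlete", "peak_power_w"]]

# Lower sprint time = better acceleration, so take the smallest values.
best_acceleration = combine.nsmallest(5, "sprint_10m_s")[["athlete", "sprint_10m_s"]]

print(most_powerful, best_acceleration, sep="\n\n")
```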

What emerging technologies are you paying the most attention to? What has you most excited?

Connolly: When it comes to choosing new technologies, the basics are still key. What is it that I’m trying to fix, and what technologies can be used? How do you implement it, and what’s the sequence?

Smyth: The wearable technology industry has to be careful about where it could lead. Athletes need to buy into it, and we can’t overstep our bounds. I could tell you that I’m intrigued by embedding IMUs [inertial measurement units] or accelerometers into equipment, but if the players aren’t on board with the information I’m gathering, or they think it’s going to be used against them, then I’m not excited about it.

That brings up an important last question: Are there any ethical ramifications for tracking athletes’ data? Do these tools cross the line by collecting personal health information?

Connolly: The classification of the data should depend on the role of the professional who collected it. So if a doctor gathers the data, it’s medical data. If it’s collected by a strength coach, then it’s performance data.

But overall, it’s always still the player’s information. I would love to see the scenario where an athlete comes to the team with a memory stick, hands it to me, and says, “This is the data about me. It’s my background, and it’s my own.” I don’t care who has collected the data. It would help me help the athlete be the best he or she can be.

GROUP EFFORT

The hardest part of integrating sports science into team training is often getting everyone on the same page about how and when to use it. Effective collaboration around technology isn’t a given.

The first step is making sure all participants understand the basics of the data being collected. Everyone also needs to know how and where the data is stored, whether that’s in a database, a spreadsheet, or a vendor’s app. Then there must be agreement on what to do with the data. This may include daily reports for everyone, custom reports for individuals, or sharing analyses online.
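
As a minimal sketch of the “daily report” option, assuming the day’s monitoring data lives in a shared file (the file and column names are placeholder assumptions):

```python
# Minimal sketch: turn a shared day's monitoring file into one short report
# per athlete. File and column names are placeholder assumptions.
import pandas as pd

today = pd.read_csv("monitoring_today.csv")  # placeholder filename

for _, row in today.iterrows():
    print(
        f"{row['athlete']}: distance {row['distance_m']:.0f} m, "
        f"avg HR {row['avg_hr_bpm']:.0f} bpm, sleep {row['sleep_h']:.1f} h"
    )
```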

In athletic departments, just like everywhere else, people work with other people. And people work with technology. Sometimes people work with other people and with technology at the same time. Computing experts use the phrase “computer-supported cooperative work” or CSCW to describe this situation.

The possible outcomes for CSCW boil down to a simple 2×2 matrix: the people involved are either easy or hard to work with, and the technology is either easy or hard to use, which produces four quadrants.

The key to working collaboratively with technology is staying in the “People: Easy, Tech: Easy” quadrant and out of the “People: Hard, Tech: Hard” quadrant.

To do so, it’s best to not go too fast. “Like a lot of other things in life, if you start at a micro level and don’t jump right into hyperspeed, you may not run into as many issues,” says Ryan Curtis, MS, ATC, CSCS, CES, Associate Director of Athlete Performance and Safety at the Korey Stringer Institute (KSI), a sports health and science research center at the University of Connecticut. “Organizations that develop a sequential process for things and take it relatively slowly have a better chance of succeeding.”

There must also be buy-in from the athletes. “Athletes will ask, ‘What’s the point? Why do I have to wear this?'” says Courteney Benjamin, MS, CSCS, Associate Director of Communication and Assistant Director of Athlete Performance and Safety at KSI. “We have to communicate its value.”

Benjamin previously worked at Florida State University, where coaches promoted compliance and let her share data with interested athletes. These top-down and bottom-up interactions helped Benjamin more easily collect data.

Curtis feels the process goes better when athletic administrators are on board. The more familiarity they have with technology, the better adoption will be throughout the organizational hierarchy. “Are they data-driven themselves? If not, then the structure might be tougher to put into place,” he says.


Brad Stenger is a researcher and journalist, currently working for the Georgia Tech Wearable Computing Center and New York University Center for Data Science. His articles have appeared in MIT Technology Review, Wired.com, Ars Technica, and TIME. He can be reached online at: bradstenger.com and on Twitter @bradstenger.

