As we roll into the last month of summer before school starts, I am still working on developing the survey and writing my literature review. Ideally, these questions will be as fine-tuned as possible before they go out, to make sure there is no bias in their wording or structure. I’ve asked my dad, a career market researcher, for help with this, and he has been a huge help in editing any potential questions I send his way. With just four weeks left until the school year begins (and the research concludes), I will have to commit to a fairly strict update schedule in order to fulfill the requirement.
In the last few weeks, I’ve been working on what is likely the most tedious part of the entire project: choosing the schools and pulling the names of the various NCAA personnel I want to send the survey to. I had several options for doing this, most of which involved manually choosing the schools and teams. I went a different route, not just in the interest of time, but also because hand-picking teams would likely skew the results, since they would not be randomly selected. Thus, I looked for a way to generate a list of teams completely at random. After some searching, I managed to find a diamond in the rough: an online dataset that listed every team in the NCAA, by sport, with the grades and overall academic trends for each. After downloading the dataset and deleting what I didn’t need (none of the academic data was relevant), I had a dynamic table in Excel that I could easily filter down to select teams. I then wrote a quick formula to generate a random list of teams, organized by sport, so that I could go online and manually find an email address or telephone number for each one to contact over the coming weeks. I’m pretty proud of the formula (a sketch of the idea is below), as it allowed me to use something from my major in a research setting.
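For anyone who wants to replicate the idea, the core of it looks something like this. This is a minimal sketch, assuming the filtered team names sit in column A of a sheet named Teams with a header in row 1 (the sheet and column names here are just placeholders, not my actual layout):

=INDEX(Teams!A:A, RANDBETWEEN(2, COUNTA(Teams!A:A)))

Each recalculation picks a random non-empty row from the column, so dragging the formula down a handful of cells produces a quick random list. It can occasionally land on the same team twice, so duplicates have to be removed by hand afterward.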
For the survey, I elected to narrow the field down to the 4 most popular men’s NCAA sports (Football, basketball, baseball, and soccer) as well as the most popular women’s NCAA sport, basketball. While it is a shame to not weigh men’s and women’s sports equally, my research says that there are huge lapses in data for women’s collegiate sports as a whole. If I am able to generate enough responses from the existing potential respondents, ideally I could go back and use the same Excel sheet to find more potential respondents from more women’s sports. Additionally, ideally I would like to get analytics use data from a wider variety of sports, but the scope and timing of the overall project need to remain feasible.
I’m excited for the next few weeks, as I will finally be collecting actual data on a topic that, until now, I have only been able to read about.
Here are the first 15 colleges that I will contact for each sport:
Football:
California State University, Fresno
Stanford University
Washington State University
Texas Christian University
University of Nebraska, Lincoln
University of Cincinnati
Southern Utah University
University of South Dakota
University of Tennessee at Chattanooga
Furman University
Florida Atlantic University
Wake Forest University
University of Illinois, Champaign
Abilene Christian University
Monmouth University
Men’s Basketball:
Missouri State University
Stetson University
Fordham University
Illinois State University
Louisiana Tech University
Stanford University
University of Arkansas, Little Rock
The Ohio State University
University of Denver
Wagner College
Boston College
Drexel University
Ball State University
University of Notre Dame
Eastern Illinois University
Soccer:
Columbia University-Barnard College
Miami University (Ohio)
University of Louisville
Northern Kentucky University
University of Pittsburgh
Georgetown University
South Dakota State University
Siena College
Elon University
University of Mississippi
Furman University
Loyola University Maryland
Southern Utah University
Murray State University
North Carolina State University
Baseball:
Rider University
Monmouth University
Marist College
New Jersey Institute of Technology
Eastern Illinois University
Columbia University-Barnard College
University of Minnesota, Twin Cities
Towson University
The University of Southern Mississippi
Coppin State University
Indiana State University
University of Connecticut
Eastern Kentucky University
Gardner-Webb University
University of California, Berkeley
Women’s Basketball:
Indiana University-Purdue University, Fort Wayne
Mount St. Mary’s University
North Carolina Central University
San Diego State University
Fordham University
University of Virginia
University of Rhode Island
Harvard University
Bryant University
Savannah State University
Youngstown State University
University of Memphis
University of Wisconsin-Green Bay
Marshall University
Western Carolina University
Sounds like a cool project! Any chance of running a pilot survey to test for the bias you mentioned, or to develop a scheme for coding the open-ended responses? Piloting the survey could also help with refining the multiple-choice answer options. Given the tight timeline you mentioned in this post, however, I can see how adding a pilot stage may not suit this study.
The open-ended responses are mostly being used for a more qualitative analysis, since several questions serve organizational purposes (school, conference) or capture general initial observations, e.g., each respondent’s own definition of analytics usage. I’ve gone through several drafts of the survey and enlisted the help of a market research professional to preemptively deal with any bias in the questions.