This post chronicles a multi-year journey I embarked upon as a doctoral student: a journey to create a database of higher education journals and conferences—which you can find linked at the end of the post.
It all started in fall 2016, during my first semester as a Ph.D. student in higher education leadership at The University of Texas at Austin, when I completed my first scholarly manuscript for publication. I spoke with a few of my mentors at UT-Austin about where I should submit the work, and they told me to consider a few well-known journals in the field of student affairs, including the Journal of College Student Development and the Journal of Student Affairs Research and Practice.
I had never heard of either.
Like any neophyte doctoral student, I started Googling “higher education journals.” Thousands of results later, I had found the Journal of College Student Development and the Journal of Student Affairs Research and Practice, alongside hundreds of other journals somewhat related to higher education and student affairs. Surely, I was going about this the wrong way. Surely, there had to be a database housing all of these journals in one place. As it turns out, there was…kind of.
Through our institutional library subscription, I was able to access Cabell’s International, a website known for gathering journal information, including submission details and publication metrics. I also stumbled across the Scimago Journal & Country Rank website, another source of aggregated journal information across a wide variety of academic disciplines, including education. From these two databases, I located nearly 1,000 different education-focused journals. This approach was no better than my earlier Googling: I had one manuscript and thousands of journal targets.
Ultimately, I adopted a different approach for crafting a unique, higher education-focused journals database: I went back to my mentors. More specifically, I went to their CVs.
Using the U.S. News & World Report ranking of higher education administration programs, I located the CV of every faculty member at every ranked institution in the United States, and I databased every single journal they had published in. I also researched each faculty member’s co-authors and databased every journal those co-authors had published in. This approach was valuable for two reasons. First, I knew these journals were reputable and not “predatory”—tenured and tenure-track professors had chosen to publish in them. What is good enough for a professor should be good enough for some no-name doctoral student. Second, because I had focused on higher education-specific faculty, I knew the aims and scope of these journals would be receptive to my manuscript, one focused on higher education.
After scouring CVs for a month, my database had grown to over 150 unique journals, each with the following information: hyperlinks to the journal’s aims-and-scope webpage and “submission instructions for authors” webpage, the journal’s acceptance rate, the journal’s style (APA, MLA, Chicago, etc.), the journal’s word count limit, and any other notes I had gathered along the way. The Cabell’s database included some of these metrics, and the Scimago database included others, but the database I created included all of them. My logic was simple: if you want to submit an article for publication, what do you need to know?
- The journal’s aims and scope
- The submission instructions
- The formatting style
- The word count limit
Finally, adding the acceptance rate may give folks an idea of the competitiveness of the journal and how long the peer review process may take.
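If the spreadsheet were mirrored in code, one row of the database might be sketched as a simple record like the one below. The field names here are my own illustrative assumptions, not the actual column headers of the Google Sheet:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class JournalEntry:
    """One row of the journals database; field names are illustrative."""
    name: str
    aims_scope_url: str    # hyperlink to the aims-and-scope page
    submission_url: str    # hyperlink to the instructions-for-authors page
    style: str             # e.g. "APA", "MLA", "Chicago"
    word_limit: Optional[int] = None
    acceptance_rate: Optional[float] = None  # 0.0-1.0, where published
    notes: str = ""        # any other details gathered along the way
```

Optional fields default to `None` or empty because, as noted above, not every journal publishes an acceptance rate or a hard word limit.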
Furthermore, it was important to me to create a database that could be widely shared with the higher education community. I decided to build it in Google Sheets, meaning that I could share it via a hyperlink or an Excel file for maximum compatibility and collaboration. Also, early in the process, I learned that some journals listed by Cabell’s and Scimago had dead hyperlinks because they were no longer being published. Having somewhat of a background in computer science, I was able to program a function in the Google Sheets database that automatically checks the status of every hyperlink. If a journal moves its website or ceases publication, the function tells me immediately.
In early 2018, after I built the database, I felt it was ready to be shared. As a test drive, I forwarded the database to UT-Austin’s higher education program mailing list (listserv) and a few close colleagues at other institutions. The responses to the database were mixed.
Some people were extremely appreciative of my effort, going as far as to say that the database had helped them find the perfect journal for their unpublished manuscript. Others said my database only helped perpetuate the “rat race” of scholarly work and the “publish or perish” mindset so pervasive in academia. However, two pieces of feedback were incredibly valuable. First, people began forwarding me links to journals that were not included in the database; from there, the database grew immensely to over 300 journals in less than six months. Second, people told me to do for conferences what I had done for journals.
Wash, rinse, repeat.
CVs, conferences, databasing.
A few short months later, by mid-2018, I had compiled a database of over 300 journals and 100 conferences related to the scholarly output of higher education and education researchers. Since then, I have shared my database with colleagues at dozens of other institutions in hopes that this single, updated, higher education-focused database can provide scholarly guidance for anyone seeking to publish a manuscript.
Here it is: Higher Education Journals and Conferences Database
In time, I hope to collaborate with the ECHER Network to house the database permanently. As a one-stop shop for scholarly publication in higher education, I foresee the database expanding to country-specific publications, as well as more comprehensive data on acceptance rates and additional notes. Beyond supporting early career higher education researchers, this database can help the entire higher education community find a home for their work. I know there are incredible researchers doing important work every day, but their work may struggle to find an audience because of the convoluted smattering of journals scattered across countless websites.
ZW Taylor is a doctoral student at The University of Texas at Austin. His research examines the intersection of linguistics and informatics as they pertain to pre-enrollment student materials, such as application and financial aid instructions. His work can be found on Google Scholar at ZW Taylor.
3 Replies to “Three years, one manuscript, and hundreds of CVs later: a higher education journals and conferences database”
This is the type of work that needs much more recognition. We only realise how helpful it is when somebody does it. Congratulations! And here’s hoping that this might be the beginning of bringing US and European HE researchers closer together, one list at a time. If there were a way to merge and (partially) automate the journal and conference lists from ECHERs, hundreds of people would have a one-stop resource.
Missing two associations and their conferences that would be relevant additions to your database. Association for General and Liberal Studies (www.agls.org) and Association for Core Texts and Courses (www.coretexts.org).
I believe a crucial variable was missing – the average or suggested time through each stage of the submission process, through to final publication. This is critical, in fact arguably the key element, for doctoral students, who are under extreme pressure to publish as required in order to graduate within the minimum time. Specifically, scouring CVs ignores this element: it looks at people who are under less pressure, and maybe no pressure at all, people who can afford the time to submit to and wait on the processes of even the highest-value journals. Also, quite literally, name counts, so another skew was introduced – and another inappropriateness for the student or early-stage researcher who simply doesn’t have a known name. If you catalogued where those CVs published in their early years, that might well have value, but researchers change their publishing patterns over the years. Equally, if that were done, aligned admittedly with other variables, those journals might be swamped by submissions from students and early-career researchers, creating a new submission issue.