The statement was issued community-wide by the Office of Equity, Diversity, and Inclusion (EDI) at Vanderbilt University's Peabody College. Students have criticized the letter as generic and lacking genuine empathy.
The letter repeatedly emphasizes the importance of creating a “safe and inclusive environment on campus.”
A disclaimer on the letter states that the statement was a “paraphrase from OpenAI’s ChatGPT AI language model.”
Students and other community members have accused the university of turning the tragedy into a publicity stunt. Laith Kayat, a Vanderbilt senior whose sister attends MSU, called the letter “disgusting,” according to a report in the Vanderbilt Hustler.
"There is a sick and twisted irony to making a computer write your message about community and togetherness because you can't be bothered to reflect on it yourself. [Administrators] only care about perception and their institutional politics of saving face."
One day after the letter was issued, Nicole M. Joseph, associate dean for EDI, sent out an email responding to the criticism. She acknowledged that the office had made a major lapse in judgment.
"As with all new technologies that affect higher education, this moment gives us all an opportunity to reflect on what we know and what we still must learn about AI,”
Meanwhile, Camilla Benbow, dean of Peabody College, said the letter was not reviewed before it went out: “The development and distribution of the initial email did not follow Peabody’s normal processes providing for multiple layers of review before being sent.” The college is now conducting an investigation, and both Associate Dean Joseph and Assistant Dean Hasina Mohyuddin have been placed on temporary leave.
Dean Benbow added: “I am also deeply troubled that a communication from my administration so missed the crucial need for personal connection and empathy during a time of tragedy.”
While ChatGPT can be helpful for drafting some communications, the technology has clear limitations. Some CEOs use it as a “thought partner” when writing speeches, but the software cannot exercise uniquely human capacities like judgment. It can summarize information and produce coherent text, but it has no emotional intelligence: it cannot explain the “why” behind the facts or convey genuine feeling. These remain areas where humans offer unmatched value.