
Editorial: AI is a tool, not the future of journalism

Maia Alvarez
Sophomore Meredith Ho edits an interview transcription on Otter.ai. With the addition of AI guidelines to the student handbook, our editorial board has been questioning how AI technology can impact journalism.

At the beginning of the 2023-2024 school year, the academic dishonesty section of Archer’s student handbook was updated to incorporate guidelines for artificial intelligence, stating that academic dishonesty includes “plagiarizing or presenting another person’s work or ideas as the student’s own, including the use of artificial intelligence.”

In light of this addition, our editorial board has considered our publication’s practices with AI and how this technology will affect professional and scholastic journalism.

Some say AI will soon replace human journalists because of its speed and efficiency. However, while AI can produce grammatically correct writing, it lacks the creativity, critical thinking and introspection that come with the human experience. For example, when The New York Times used AI chatbots to write short college essays for Harvard, Yale and Princeton, many of the responses were vague, generic and inaccurate, including one that cited false song lyrics and misinterpreted them.

According to The New York Times, Google is testing a “personal assistant” for journalists that uses AI to “take in information — details of current events, for example — and generate news content.” Now that AI tools are being tailored to journalists, concerns about the credibility of future reporting only grow. Over the summer, a Georgia man sued OpenAI for defamation after ChatGPT generated a false legal summary accusing him of embezzlement. ChatGPT has also been known to fabricate articles and claim they were real pieces from The Guardian.


When relied upon without fact-checking and ethical consideration, AI tools perpetuate the cycle of misinformation not only in articles but also in photography. At The Oracle, we use Photoshop to crop and adjust light levels in images but always maintain the authenticity of the photo captured. However, as AI-generated images become more realistic, ethical dilemmas arise: these images are not newsworthy, and using them in articles would give readers a false perception of the world around them.

As our editorial board looks at these examples, we begin to wonder if AI can devalue journalism. What function does photojournalism have if it doesn’t capture the truth? What’s the purpose of journalism if it isn’t factual, credible or reliable? Is there a point to storytelling if it doesn’t represent and resonate with the human experience?

However, the intersection of AI and journalism also offers benefits. With AI, articles can be translated into multiple languages or converted to audio through text-to-speech, and audio content can be transcribed, making news more accessible. AI can also find and analyze patterns of bias in headlines and stories, which can help publications be more objective in their reporting. Some news publications, including Sports Illustrated, have already begun using AI to help produce full articles that draw on decades of previous Sports Illustrated coverage.

But if news articles are sold to improve AI technology, can journalism still be considered independent? Trustworthy?

There are still many unknowns about the future of AI technology. Much like writers concerned about AI in the entertainment industry, some journalists are fighting for job security against AI. Perhaps with time, research and a stronger application of ethics, professional news publications will find the best practices for operating alongside this new technology.

As of today, we, the student editors of The Oracle, know that only we can tell the stories and opinions of the students in our community in a way that is original, truthful and impactful.

The Oracle does not use AI chatbots like ChatGPT in any form. We use only Otter.ai, a voice-to-text service that uses AI, to transcribe interviews, and we always listen to the recorded audio and edit the transcription to ensure the quotes are accurate. Additionally, in WordPress, the platform where we write articles, some students use Grammarly for spelling and minor grammatical edits, though this is not a publication-wide policy.

The brainstorming, gathering of sources and information, fact-checking, writing and editing of articles are completed by our staff of student journalists with the assistance of our adviser, Kristin Taylor. In the coming weeks, The Oracle will update its staff manual and editorial policy to state and clarify our uses of AI.

In our increasingly digital world, interaction with AI is inevitable. Nevertheless, we believe journalists should prioritize their own brainstorming and writing and use AI only for transcription and minimal editing help. With these guidelines, strong ethics and a clear policy, The Oracle will continue to maintain its journalistic integrity and amplify students’ authentic voices.

This story was originally published on The Oracle on September 30, 2023.