Why users and students want humans to take responsibility in an accountable information environment — not machines.
- Now Age Storytelling team

- Sep 14
- 2 min read

The majority of users don't trust artificial intelligence in journalism – current studies show this unequivocally. Even the mere hint that AI helped with writing makes users skeptical.
The Digital News Report 2025 shows that users expect more errors and less transparency from AI-generated news. Trust drops by a full 18 percentage points when AI is involved. As a background tool? Fine. As an author? Absolutely not.
According to the Reuters Institute, 58 percent of people worldwide believe that AI makes news less credible. In the US, the figure is as high as 73 percent.
The message is clear: Users want humans to take responsibility for our news – not machines.
The same skepticism prevails in education.
Teachers and professors see AI as a useful tool – but for truly productive use, the entire system would need to change.
What's missing?
Among other things: reliable fact-checking, adapted curricula, usable teaching materials, individual support matched to each learning level, and equal access to the technology itself.
The reality looks sobering.
Educators complain about a "hidden additional workload" because they have to laboriously rework AI output, rewrite texts, and redesign assignments pedagogically.
Students fear that AI weakens their critical thinking and undermines ethical integrity and fairness.

Conclusion
In journalistic media and in the education system, human cognitive and ethical qualities remain central.
Journalists and news users, educators and students value human judgment, especially in complex, multi-layered contexts. They have good reasons to retain control over legal and ethical responsibility, empathetic morality, personal engagement, and source protection – and not to delegate them to machines.
Three Reading Recommendations
AI and the Future of Education
The report shows how AI is changing schools and universities: from personalized learning to ethical dilemmas. It makes clear that opportunities remain unequally distributed, as one-third of the world is still offline.
21 experts examine the philosophical, ethical and practical dilemmas of AI in education.
UNESCO: AI and the Future of Education: Disruptions, Dilemmas and Directions. UNESCO, 2025.
Learning, Media and Technology
The special issue makes clear why AI teaching tools often fail in practice: teachers must laboriously correct generic, flawed, or inappropriate materials. Instead of saving time, the "hidden workload" grows.
Ebben, M. & Murphy, J. (eds.): Learning, Media and Technology, Vol. 50(3). Routledge, 2025.
AI + Learning Differences
The Stanford symposium report identifies nine key areas where AI can support learners with disabilities and special needs:
co-designing with families, improving individual education plans, early needs identification, supporting social-emotional development, expanding assistive technologies, improving teacher training, preparing students for work life, promoting life satisfaction.
McGee, N. J., Kozleski, E., Lemons, C. J., Hau, I. C.: AI + Learning Differences. Stanford University, 2025.
