ARTICLE

Volume 3, Issue 9

21 November 2025

Exploring Personalized Ideological and Political Education for Undergraduates in the Era of Large-Model Agents

Xuan Hu 1, Zhiqiang Geng 1, Yongming Han 1, Mengzhi Wang 1
1 College of Information Science and Technology, Beijing University of Chemical Technology, China
ETR 2025, 3(47), 118–121; https://doi.org/10.61369/ETR.2025470001
© 2025 by the authors. Licensee Art and Technology, USA. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0) (https://creativecommons.org/licenses/by-nc/4.0/)
Abstract

Facing the new trend of deep integration between large-model agent technology and undergraduate ideological and political education, and to address the key problem that the uniform content of traditional undergraduate ideological and political education cannot accommodate individual differences among students, this study explores and proposes concrete pathways for combining large-model agents with personalized ideological and political education, including a perception system for students' ideological dynamics, personalized guidance based on cognitive preferences, and personalized generation of ideological and political content. The research centers on overcoming the one-size-fits-all limitations of traditional ideological and political education: by building an ideological-guidance framework matched to individual students and designing practical schemes that embed ideological and political education deeply into students' daily study and life, it ultimately synchronizes ideological guidance with students' developmental progress. Against the backdrop of the widespread adoption of large-model agents, undergraduate ideological and political educators should actively harness these technologies to shift the educational model from one-way indoctrination to two-way dialogue, providing more targeted ideological support for talent cultivation in higher education.
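The three-stage pathway described in the abstract (perceiving students' ideological dynamics, matching cognitive preferences, and generating personalized content) can be illustrated with a minimal, hypothetical Python sketch. All names here (`StudentProfile`, `perceive`, `generate_content`, the preference labels) are illustrative assumptions, not the paper's implementation, and the large-model agent call is stubbed with templates:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the pipeline: (1) perceive student dynamics,
# (2) use a cognitive-preference label to shape delivery,
# (3) generate tailored content (LLM-agent call stubbed with templates).

@dataclass
class StudentProfile:
    name: str
    # Signals gathered from daily study/life touchpoints (stage 1).
    recent_topics: list = field(default_factory=list)
    # Cognitive preference used to shape delivery (stage 2).
    preference: str = "case-based"  # e.g. "case-based", "discussion", "reading"

def perceive(profile: StudentProfile, new_signals: list) -> StudentProfile:
    """Stage 1: update the student's dynamic profile from observed signals."""
    profile.recent_topics.extend(new_signals)
    return profile

def generate_content(profile: StudentProfile, theme: str) -> str:
    """Stage 3: stand-in for a large-model agent that tailors content
    to the student's preference and most recent concern."""
    focus = profile.recent_topics[-1] if profile.recent_topics else theme
    templates = {
        "case-based": f"A case study linking '{theme}' to {focus}",
        "discussion": f"Discussion prompts on '{theme}' around {focus}",
        "reading": f"A reading list on '{theme}' related to {focus}",
    }
    return templates.get(profile.preference, templates["reading"])

student = perceive(StudentProfile("A", preference="case-based"),
                   ["career anxiety"])
print(generate_content(student, "social responsibility"))
```

In a real deployment the template dictionary would be replaced by a prompt to a large-model agent conditioned on the profile; the sketch only shows how the perception, preference, and generation stages compose.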

Keywords
Large language models
Agents
Personalized guidance
Ideological and political education
Educational reform
