Si Shi et al.

Knowledge tracing, the task of estimating a learner's knowledge state over time, can be modeled by a machine. Traditionally it has been accomplished with Bayesian Knowledge Tracing, a method that relies mainly on statistics. With the rise of Artificial Intelligence, more powerful machine-learning-driven models have emerged, many of which show satisfactory results. However, three issues remain: 1) data sparsity and data quality in the knowledge tracing domain are still a problem; 2) question hardness is not modeled properly; 3) current models lack synthesized knowledge tracing components. To address these issues, we propose an enhanced method, Synthetic Separated Self-Attentive Neural Knowledge Tracing (SYNSAINT), which improves data quality, mines inner relations among skills with clustering techniques, and synthesizes the necessary latent embeddings. In this paper, we design two novel embeddings for our estimation system: skillcluster and hardness. We harness multistep clustering techniques to obtain the skillcluster embedding, and we use a sub-neural network to assign a randomly initialized weight to each item for the hardness measurement. We build deep sequential attentive structures and verify the method on two representative open-source knowledge tracing datasets, on which, to the best of our knowledge, it outperforms other prevailing deep neural network knowledge tracing methods. We also conduct extensive ablation studies to demonstrate the effects of the newly added embeddings.
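
To make the two added embeddings concrete, the following is a minimal sketch of how a cluster-derived skill embedding and a learned hardness embedding could be combined with a self-attentive encoder, assuming a PyTorch/scikit-learn setup; all names, sizes, and the per-skill/per-item statistics are hypothetical placeholders rather than the authors' actual SYNSAINT implementation, and details such as the multistep clustering, causal masking, and training loop are omitted.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

NUM_SKILLS, NUM_CLUSTERS, EMB_DIM = 100, 8, 64   # hypothetical sizes

# 1) Skillcluster embedding: cluster skills by per-skill statistics
#    (random placeholders here) and embed the resulting cluster id.
skill_features = np.random.rand(NUM_SKILLS, 16)
skill_to_cluster = KMeans(n_clusters=NUM_CLUSTERS, n_init=10).fit_predict(skill_features)
cluster_emb = nn.Embedding(NUM_CLUSTERS, EMB_DIM)

# 2) Hardness embedding: a small sub-network maps per-item statistics to a
#    learned (randomly initialized) difficulty vector rather than a
#    hand-designed difficulty score.
hardness_net = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, EMB_DIM))

# 3) Self-attentive encoder over the summed embeddings (skill + cluster + hardness).
skill_emb = nn.Embedding(NUM_SKILLS, EMB_DIM)
encoder = nn.TransformerEncoderLayer(d_model=EMB_DIM, nhead=4, batch_first=True)
predict = nn.Linear(EMB_DIM, 1)

# Forward pass on a toy batch (causal masking and the training loop are omitted).
skills = torch.randint(0, NUM_SKILLS, (2, 20))                # (batch, seq) skill ids
clusters = torch.as_tensor(skill_to_cluster).long()[skills]   # cluster id per step
item_stats = torch.rand(2, 20, 4)                             # per-item statistics
x = skill_emb(skills) + cluster_emb(clusters) + hardness_net(item_stats)
p_correct = torch.sigmoid(predict(encoder(x)))                # (2, 20, 1) predictions
```

In the actual method, the cluster assignments would come from the multistep clustering described above over real interaction data, and the attention structure would follow the paper's deeper sequential attentive design rather than a single encoder layer.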