DTransformer

Code for the paper "Tracing Knowledge Instead of Patterns: Stable Knowledge Tracing with Diagnostic Transformer" (accepted at WWW '23).

Cite this work:

@inproceedings{yin2023tracing,
  author = {Yin, Yu and Dai, Le and Huang, Zhenya and Shen, Shuanghong and Wang, Fei and Liu, Qi and Chen, Enhong and Li, Xin},
  title = {Tracing Knowledge Instead of Patterns: Stable Knowledge Tracing with Diagnostic Transformer},
  year = {2023},
  isbn = {9781450394161},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  url = {https://doi.org/10.1145/3543507.3583255},
  doi = {10.1145/3543507.3583255},
  booktitle = {Proceedings of the ACM Web Conference 2023},
  pages = {855–864},
  numpages = {10},
  keywords = {contrastive learning, knowledge tracing, DTransformer},
  location = {Austin, TX, USA},
  series = {WWW '23}
}

Installation

poetry install
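
If you do not yet have the code locally, a typical setup is to clone the repository first and then install the dependencies with Poetry (assuming Poetry is already installed; the clone URL follows from the repository name yxonic/DTransformer):

git clone https://github.com/yxonic/DTransformer.git
cd DTransformer
poetry install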

Usage

Train

Train DTransformer with the contrastive learning (CL) loss. Pick one of the listed datasets for -d; arguments in square brackets are optional:

python scripts/train.py -m DTransformer -d [assist09,assist17,algebra05,statics] -bs 32 -tbs 32 -p -cl --proj [-o output/DTransformer_assist09] [--device cuda]
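
For example, a concrete invocation that trains on assist09 with the CL loss on a GPU might look like this (the output directory name is illustrative):

python scripts/train.py -m DTransformer -d assist09 -bs 32 -tbs 32 -p -cl --proj -o output/DTransformer_assist09 --device cuda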

For more options, run:

python scripts/train.py -h

Evaluate

Evaluate a trained DTransformer model, passing the saved checkpoint with -f:

python scripts/test.py -m DTransformer -d [assist09,assist17,algebra05,statics] -bs 32 -p -f [output/best_model.pt] [--device cuda]
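
For example, evaluating a saved checkpoint on assist09 with a GPU might look like this (the checkpoint path is illustrative and should point at the model file written during training):

python scripts/test.py -m DTransformer -d assist09 -bs 32 -p -f output/best_model.pt --device cuda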
