====================================================================
 DNN+HMM Toolbox v0.1
 Antoni B. Chan, Janet H. Hsiao
 City University of Hong Kong, University of Hong Kong
 Copyright (c) 2022-06-16
====================================================================

--- DESCRIPTION ---
This is the DNN+HMM toolbox for simulating visual routine development
via HMMs with DNNs that jointly learn visual representations and eye
movement strategies for face recognition.

--- REQUIREMENTS ---
The DNN+HMM code is written as Python libraries/scripts and interactive
Jupyter notebooks. The required packages/toolboxes are:

  python            3.7
  tensorflow-gpu    1.14.0
  keras-gpu         2.2.4
  scikit-learn      1.0.2
  matplotlib        3.5.1
  scikit-image      0.19.2
  pydot             1.4.1
  nb_conda_kernels  2.3.1   (for Jupyter notebook functionality)

These can be installed by creating an environment using Anaconda 3.
Note that the code has not been tested with TensorFlow 2; support for
TF2 will be considered later.

Clustering the HMMs requires MATLAB and the EMHMM toolbox. The EMHMM
toolbox can be downloaded here:
  http://visal.cs.cityu.edu.hk/research/emhmm/

--- CONTENTS ---
  data/                 - data files (the original dataset and the
                          processed data for the experiment)
  nn/                   - Python library for building the DNN+HMM model,
                          plus ipynb files for testing the library
  matlab/               - helper functions for MATLAB
  results/              - directory that stores all results
  results_from_paper/   - the saved models used to generate the results
                          in the paper
  data_preprocess.ipynb - extract the 100 faces from the dataset and
                          pre-process them
  test_nn_models.ipynb  - shows how to use the library to build a
                          DNN+HMM model
  run_trial.py          - run a single trial to train a DNN+HMM
  test_run_trial.ipynb  - run multiple trials in a Jupyter notebook
  run_analysis.ipynb    - extract HMMs from the trials and run
                          preliminary analysis
  run_hemfaces.m        - cluster HMMs to find common strategies and
                          perform AB-scale analysis
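As a sketch, the requirements listed above could be captured in a conda
environment file. The environment name and the channel list below are
assumptions (they are not part of this toolbox); the pinned versions
follow the REQUIREMENTS list.

```yaml
# Hypothetical environment.yml for the DNN+HMM toolbox (Anaconda 3).
# Name and channels are assumptions; versions match the list above.
name: dnnhmm
channels:
  - defaults
  - conda-forge
dependencies:
  - python=3.7
  - tensorflow-gpu=1.14.0
  - keras-gpu=2.2.4
  - scikit-learn=1.0.2
  - matplotlib=3.5.1
  - scikit-image=0.19.2
  - pydot=1.4.1
  - nb_conda_kernels=2.3.1
```

If this file is saved as environment.yml, the environment can be created
with "conda env create -f environment.yml" and entered with
"conda activate dnnhmm".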
--- BRIEF INSTRUCTIONS ---
1)  Run "test_nn_models.ipynb" to make sure the computing environment
    is set up correctly.
2a) Run "test_run_trial.ipynb" to train individual DNN+HMM models.
2b) OR run "run_trial.py" with multiple seeds to train individual
    DNN+HMM models; this method allows running in parallel on a GPU
    cluster.
3)  Run "run_analysis.ipynb" to extract the HMMs from the trained
    DNN+HMM models and run preliminary analysis.
4)  Run "run_hemfaces" in MATLAB to perform the HMM clustering and the
    final analysis.

The DNN+HMM models from step 2 are included in the
"results_from_paper/" directory; running run_analysis and run_hemfaces
as above will redo the analysis using these existing models.

--- REFERENCES ---
If you use this toolbox, please cite the following papers:

  Janet H. Hsiao, Jeehye An, Veronica Kit Sum Hui, Yueyuan Zheng, and
  Antoni B. Chan. "The role of eye movement consistency in face
  recognition: Computational and experimental examinations." npj
  Science of Learning, accepted 2022.

  Janet H. Hsiao, Jeehye An, and Antoni B. Chan. "The role of eye
  movement consistency in learning to recognise faces: Computational
  and experimental examinations." In: 42nd Annual Conference of the
  Cognitive Science Society (CogSci), Jul 2020.

--- CONTACT INFO ---
Please send comments, bug reports, and feature requests to Antoni Chan
(abchan at cityu dot edu . hk).

--- ACKNOWLEDGEMENTS ---
This research was supported by the Research Grants Council of Hong Kong
SAR (General Research Fund #17609117 and Collaborative Research Fund
#C7129-20G).
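--- APPENDIX: EXAMPLE MULTI-SEED LAUNCH ---
Step 2b above could be scripted as follows. This is only a sketch: it
assumes run_trial.py accepts the random seed as its first command-line
argument, which should be checked against the script itself. The
commands are echoed (a dry run) so they can be inspected or fed to a
cluster scheduler; remove "echo" to run the trials directly.

```shell
# Dry-run sketch of launching one training trial per seed.
# ASSUMPTION: run_trial.py takes the seed as its first argument.
for seed in 0 1 2 3 4; do
    echo "python run_trial.py $seed"
done
```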