A jam submission

Distillation by duplication: The importance of layer selection

Knowledge Distillation
Submitted by roksanagow — 50 seconds before the deadline

Results

Criteria                        Rank    Score*    Raw Score
Generality                      #9      2.858     3.500
Novelty                         #10     2.858     3.500
Mechanistic interpretability    #12     2.858     3.500
ML Safety                       #14     1.633     2.000
Reproducibility                 #15     1.633     2.000

Ranked from 2 ratings. Score is adjusted from raw score by the median number of ratings per game in the jam.

Judge feedback

Judge feedback is anonymous.

  • This work is quite interesting and presents a unique approach to interpreting how performance differs between student models trained from different baseline teacher layers. The visualizations are also very creative. It would be interesting to see this work expanded with more statistical analyses on additional teacher models or reinitializations to determine if the results are not just random effects (e.g. for 3,10 and 10,3). Overall, great job and I look forward to seeing this work developed further!
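
For readers unfamiliar with the setup the feedback refers to, here is a minimal sketch of layer-selective distillation, assuming (as the title suggests) that a student is initialized by duplicating chosen teacher layers (e.g. the pairs 3,10 and 10,3 mentioned above) and then trained against the teacher's softened logits. The `.layers` attribute, function names, and hyperparameters are illustrative assumptions, not the submission's actual code.

    # Hypothetical sketch of "distillation by duplication": copy selected
    # transformer blocks out of a teacher to form a student, then run standard
    # knowledge distillation (Hinton-style softened-logit matching).
    import copy
    import torch
    import torch.nn.functional as F

    def make_student(teacher, layer_indices):
        """Build a student whose blocks are duplicates of the teacher layers
        at `layer_indices` (e.g. (3, 10) vs (10, 3)). Assumes the teacher
        exposes its blocks as a `.layers` ModuleList."""
        student = copy.deepcopy(teacher)
        student.layers = torch.nn.ModuleList(
            copy.deepcopy(teacher.layers[i]) for i in layer_indices
        )
        return student

    def distill_step(teacher, student, x, optimizer, temperature=2.0):
        """One distillation step: KL divergence between the student's and the
        (frozen) teacher's temperature-softened output distributions."""
        with torch.no_grad():
            t_logits = teacher(x)
        s_logits = student(x)
        loss = F.kl_div(
            F.log_softmax(s_logits / temperature, dim=-1),
            F.softmax(t_logits / temperature, dim=-1),
            reduction="batchmean",
        ) * temperature ** 2
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

The judge's suggestion amounts to repeating this over several teacher checkpoints or reinitializations per layer pair, so that differences between, say, the (3,10) and (10,3) students can be tested statistically rather than read off a single run.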

What are the full names of your participants?
Roksana Goworek, Paul Martin, Jonathan Frennert

What is your team name?
teamEd

What is your and your team's career stage?
Undergraduate students

Does anyone from your team want to work towards publishing this work later?
Yes

Where are you participating from?
Edinburgh
