graftr

graftr: an interactive shell to view and edit PyTorch checkpoints.

graftr can be used to remove, rename, and move around the layers and parameters
of your saved model. It's also a handy tool to peek into the structure of pre-trained PyTorch
models that you can find online (e.g. Transformer, DCGAN, etc.).

The screencast above shows an example of taking a pre-trained DenseNet
and preparing it for integration into a larger model. We remove the final classification layer
and move the feature extractor into its own densenet module.
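
In terms of the commands documented below, that session boils down to a handful of edits. The sketch here is only an approximation: how graftr is launched, the checkpoint path, the layer names (borrowed from torchvision's DenseNet state_dict), and the exact path syntax are assumptions, not output captured from graftr.

  $ graftr densenet121.pt          # open a checkpoint (hypothetical path)
  ls                               # top level shows e.g. classifier and features
  rm classifier                    # drop the final classification layer
  mv features densenet/features    # nest the feature extractor under a new densenet module
  save                             # write the edited checkpoint back to disk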

Install

pip install graftr

Documentation

graftr presents a hierarchical directory structure for state_dicts and parameters in your
checkpoint. You can list (ls), move/rename (mv), and print (cat) parameters. And, of course,
you can navigate (cd) through the hierarchy. It also supports standard shell behavior like
command history, up-arrow, tab-completion, etc.

All changes are kept in-memory until you're ready to write them back to your checkpoint with save.
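
For example, a flat state_dict key such as features.denseblock1.denselayer1.conv1.weight presumably shows up as nested directories that you can walk with cd and ls. The key and layout below are assumptions based on torchvision's DenseNet, not graftr output.

  cd features/denseblock1/denselayer1/conv1   # dots in the state_dict key become directories
  ls                                          # list the values stored at this level, e.g. weight
  cat weight                                  # print the tensor
  cd /                                        # back to the root of the checkpoint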

Supported commands

  • cd - change working directory.
  • pwd - print working directory.
  • ls - list directory contents.
  • cat - print the contents of a value or directory.
  • mv - move/rename value or directory.
  • rm - remove value or directory.
  • parameters - print the number of model parameters under a directory.
  • shape - print tensor shape.
  • device - get or set the device of a tensor or group of tensors.
  • save - write back changes to disk.
  • where - print the location on disk where changes will be saved.
  • exit - exit the shell.
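
A few of the inspection commands in action, again with hypothetical paths and an example shape rather than captured output:

  parameters features          # total number of parameters under the features directory
  shape features/conv0/weight  # tensor shape, e.g. [64, 3, 7, 7] for DenseNet-121's first conv
  where                        # show which file save will write to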
