Thanks so much David, looks amazing!
Thank you so much for implementing PyTorch @DavidBraun!
I am fairly new to PyTorch and started training my own models to use with your ~cpumem version, and on your GitHub you mentioned that the new GPU version could also work as a style transfer implementation. Can you give me a little hint as to how this might be possible?
Do I just have to export my models to TorchScript, or is there something that needs to be changed within the PyTorchTOP?
Yes, it should be easy if the GPU PyTorchTOP is working. Follow the TorchScript export instructions from the CPU version and you'll get .pth files. Then, back in the GPU version, modify the WrapperModel: https://github.com/DBraun/PyTorchTOP/blob/34df7ca31a07b12fbfae1600b99077abad634494/src/WrapperModel.h#L17 because the function signature should take a single input.
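On the export side, a minimal Python sketch of what that looks like. `TinyStyleNet` here is a hypothetical stand-in, not the actual neural style network; the point is that the module's forward takes a single tensor, matching the single-input signature the WrapperModel expects:

```python
import torch

# Hypothetical stand-in for a style network: any module whose forward
# takes one image tensor exports the same way.
class TinyStyleNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 3, kernel_size=3, padding=1)

    def forward(self, x):  # single input, matching the WrapperModel change
        return torch.sigmoid(self.conv(x))

model = TinyStyleNet().eval()
example = torch.rand(1, 3, 256, 256)  # NCHW, values in [0, 1]
traced = torch.jit.trace(model, example)
traced.save("style.pth")  # load this file in the GPU PyTorchTOP
```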
The neural style project doesn't need any preprocessing, but some projects do. Sometimes in the WrapperModel you'd want to normalize the input or permute the dimension order differently. For example, some models expect values between -1 and 1, not 0 and 1.
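Shown here in Python for brevity (the libtorch C++ calls in the WrapperModel are near-identical), a sketch of that kind of preprocessing; the shapes are illustrative assumptions:

```python
import torch

frame = torch.rand(1, 256, 256, 4)        # NHWC RGBA frame, values in [0, 1]
rgb = frame[..., :3].permute(0, 3, 1, 2)  # drop alpha, permute NHWC -> NCHW
scaled = rgb * 2.0 - 1.0                  # rescale [0, 1] -> [-1, 1]
```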
The neural style project is a lot like the Background Matting project: you can give it an 8-bit RGBA input, and the model computation is done at 32-bit RGBA (4 channels, 4 bytes each). However, for neural style the output will be color, not mono. If 32-bit RGB output is not working, try using torch::cat and torch::ones to add an alpha channel in the forward method.
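The alpha-channel trick, sketched in Python (the C++ equivalents are `torch::cat` and `torch::ones`); tensor sizes are illustrative assumptions:

```python
import torch

rgb = torch.rand(1, 3, 256, 256)       # NCHW model output, 3 color channels
alpha = torch.ones(1, 1, 256, 256)     # fully opaque alpha channel
rgba = torch.cat([rgb, alpha], dim=1)  # concatenate along the channel dim
```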
Some projects are difficult to export correctly with torch.jit.script or torch.jit.trace, often because they use uncommon layer types or nest preloaded models for other purposes. Luckily, the neural style one is simpler.
Also consider forking the repo and pushing your changes when you're done.