| Patent No. | Title | Inventors | Date |
| --- | --- | --- | --- |
| 11880762 | Choosing execution mode of a neural network based on total memory usage | Yasushi Negishi, Haruki Imai, Taro Sekiyama, Kiyokuni Kawachiya | 2024-01-23 |
| 11836613 | Neural programmer interpreters with modeled primitives | — | 2023-12-05 |
| 11521062 | Neural network training using a data flow graph and dynamic memory management | Gradus Janssen, Vladimir Zolotov | 2022-12-06 |
| 11461637 | Real-time resource usage reduction in artificial neural networks | Taro Sekiyama, Kiyokuni Kawachiya, Yasushi Negishi | 2022-10-04 |
| 11362670 | ReLU compression to reduce GPU memory | Yasushi Negishi, Haruki Imai, Kiyokuni Kawachiya | 2022-06-14 |
| 11164079 | Multi-GPU deep learning using CPUs | Haruki Imai, Taro Sekiyama, Yasushi Negishi | 2021-11-02 |
| 11106970 | Localizing tree-based convolutional neural networks | Taro Sekiyama | 2021-08-31 |
| 10949746 | Efficient parallel training of a network model on multiple graphics processing units | Haruki Imai, Yasushi Negishi | 2021-03-16 |
| 10884755 | Graph rewriting for large model support using categorized topological sort | Haruki Imai, Yasushi Negishi, Kiyokuni Kawachiya | 2021-01-05 |
| 10558914 | Real-time resource usage reduction in artificial neural networks | Taro Sekiyama, Kiyokuni Kawachiya, Yasushi Negishi | 2020-02-11 |
| 10268951 | Real-time resource usage reduction in artificial neural networks | Taro Sekiyama, Kiyokuni Kawachiya, Yasushi Negishi | 2019-04-23 |