Comparison of the efficiency of our ultra-light MTANN model with that of a traditional state-of-the-art model.
Caption
Despite being trained on a significantly less powerful computational setup (a MacBook Air with an M1 chip), the 3D Massive-Training Artificial Neural Network (MTANN) achieves superior performance (area under the curve (AUC) = 0.92), faster inference, and a drastically reduced training time and parameter count compared with a 3D ResNet.
Credit
Kenji Suzuki from the Institute of Science Tokyo, Japan
Usage Restrictions
Cannot be reused without permission
License
Original content