Just wanted to repost this from the other thread on TensorFlow, since I joined the party a bit late:
I think some of the raving that's going on is unwarranted. This is a very nice, very well put together library with a great landing page. It might eventually displace Torch and Theano as the standard toolkits for deep learning. It looks like it might offer performance / portability improvements. But it does not do anything fundamentally different from what has already been done for many years with Theano and Torch (which are standard toolkits for expressing computations, usually for building neural nets) and other libraries. It is not a game-changer or a spectacular moment in history as some people seem to believe.
Well, it looks way more scalable than Theano or Torch while being as easy to use as Theano. I'd say that's pretty exciting, considering how many groups are working on much lower-level scalable neural-net systems.
This is "not a game-changer" in the same way map-reduce isn't a game-changer wrt for loops.
Also check out TensorBoard, their visualization tool (animation halfway down the page):
They hope to make the distributed version available, but that's not a promise. Google open-sourced Blaze, their distributed build system, as Bazel earlier this year, and there's still no sign of a distributed version there. As a result, Bazel has had almost no adoption.
I think the fundamental differentiator might be how "production ready" TensorFlow is: the promise of simply declaring an ML pipeline and having it run in very heterogeneous compute environments, from mobile phones to GPU blades to plain-old clusters, could indeed be a huge game changer if fulfilled. The promise is that you literally do not have to write any new code when you are done with a research project or a series of experiments and are ready to deploy your pipeline at large scale. Neither Theano nor Torch (nor the others) makes that promise.
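For a flavor of how that's supposed to work: the released docs drive this through device placement, so the same graph definition gets pinned to whatever hardware is available without touching the model code. A minimal sketch (toy matrices; the device strings are the ones from the docs):

    import tensorflow as tf

    # One graph definition; moving between a laptop CPU and a GPU blade
    # is a matter of changing the device string, not rewriting the model.
    with tf.device('/cpu:0'):   # swap for '/gpu:0' on a GPU machine
        x = tf.constant([[1.0, 2.0], [3.0, 4.0]])
        w = tf.constant([[0.5], [0.5]])
        y = tf.matmul(x, w)

    with tf.Session() as sess:
        print(sess.run(y))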
That's not really a fundamental differentiator. Torch and Theano are definitely production ready. I think the portability is an advantage, though.
I think you would not use Theano in an end-user product. It's made for developers to run on developer machines. It's very fragile, and it has a very long start-up time, which can be on the order of several minutes on first start.
Maybe it would work well enough in a service backend, but even there it would not scale that well. For example, it doesn't support multi-threading (calling a theano.function from multiple threads at the same time).
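For context, the start-up cost lands in theano.function, which generates and compiles C code for the graph before anything runs. A toy illustration (timings vary wildly with the machine and compilation cache):

    import time
    import theano
    import theano.tensor as T

    x = T.dmatrix('x')
    y = T.nnet.sigmoid(x)

    t0 = time.time()
    f = theano.function([x], y)   # C code generation + compilation happens here
    print('compile took %.2fs' % (time.time() - t0))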
Torch is used in production by Facebook, DeepMind, and possibly Baidu, among others. Facebook in particular has released code to make Torch run much faster on AWS with GPU cards. There's also no start-up time. The design keeps the high-level logic in Lua while the computation is done mostly in C. I'd be very surprised if TensorFlow is actually faster than Torch on a single machine.
Torch has an extremely difficult learning curve due to Lua. With TensorFlow's underlying engine written in C++, it is likely as efficient as Torch. Special extensions such as NVIDIA's cuDNN can also be used with TensorFlow.
If somebody finds learning Lua to be difficult, then learning C++, learning basic statistics, or learning machine learning will be impossible for them.
I'm curious how the performance and scalability compares with Theano and Torch. I'm thinking the reason Google built this is that they wanted to scale computations to really large clusters (thousands, tens of thousands of machines) and the other options didn't really cut it.
This looks to be a single-machine test, whereas the video and the poster above specifically talked about running against compute clusters. I don't think a single-machine benchmark is going to be nearly as interesting.
As someone who has gone through the pain of using Caffe, struggled with the steep learning curve of Lua/Torch, and been frustrated by Theano's lack of simple features (train on GPU, test on CPU): you could not be more wrong. Having a well-written, consistent, and portable system is a huge plus. TensorFlow is likely to do for deep learning what Hadoop did for Big Data.
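For the curious, the train-on-GPU / test-on-CPU pattern looks roughly like this in TensorFlow, via the documented tf.device and tf.train.Saver; checkpoints store variables by name, not by device, so they restore cleanly on different hardware. A sketch (the checkpoint path is made up, and phase 1 assumes a GPU is actually present):

    import tensorflow as tf

    def build(device):
        # Identical model definition for both phases; only the device changes.
        with tf.device(device):
            w = tf.Variable(tf.zeros([784, 10]), name='w')
        return w

    # Phase 1: "train" on the GPU and checkpoint the weights.
    with tf.Graph().as_default():
        w = build('/gpu:0')
        saver = tf.train.Saver()
        with tf.Session() as sess:
            sess.run(tf.initialize_all_variables())
            # ... training steps would go here ...
            saver.save(sess, 'model.ckpt')   # hypothetical path

    # Phase 2: restore the same weights on a CPU-only box.
    with tf.Graph().as_default():
        w = build('/cpu:0')
        saver = tf.train.Saver()
        with tf.Session() as sess:
            saver.restore(sess, 'model.ckpt')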