www.capterra.com

TensorFlow Reviews 2025. Verified Reviews, Pros & Cons | Capterra

12/15/2023 · Updated 5/27/2025

Excerpt

“Deprecation of the code is frustrating. To use one form just to throw an error message.” — September 5, 2018

“The best part is that it covers a great range of machine learning use cases, from supervised to unsupervised learning, and has great support for lots of languages and integrations.” — January 6, 2020

“Control flow operations and loop functions are missing.” — March 16, 2018

“Secondly, being developed by Google, it integrates easily with Google ML and the other Google products. … The learning curve is a bit steep. This isn't specifically an issue with TensorFlow itself; the idea of neural networks is not simple. TensorFlow has made improvements in 2.0 that make it easier to use than previous versions. Azure ML Studio is more of a drag-and-drop tool, which would make it useful for a lot of people who don't code, but we didn't like the interface, and there's a learning curve in getting it set up as well. Microsoft's Cognitive Toolkit was pretty decent but doesn't receive as much support as TensorFlow.”

“I don't like that TensorFlow requires expertise, as it is not easy for beginners. Also, TensorFlow is slow, which is not good when deploying deep learning models compared to other frameworks. I often work with ML Engine, and it appears very complex to me. Because of that, I suggest newbies start with AutoML first.”

“When constructing the ML project, it was first run on local hardware with the TensorFlow GPU version, so training could be sped up a lot. But because of the high cost of GPUs, when the project grows by an order of magnitude, training time grows exponentially; to reduce it, the only options are to optimize the algorithm or the hardware. After that, I moved the whole project to a cloud platform. Of course, there was also a problem: Aliyun's resources were all priced according to fixed configurations. Finally, I migrated to Google Cloud ML Engine, which was cheap and perfectly compatible with other Google products, such as Cloud Storage, Cloud Dataflow, and Cloud Datalab. For later extensions and derivatives of the project, this provided great convenience.”

“Although Python is very powerful and easy to use, using Python with TensorFlow still causes some efficiency problems. For example, every mini-batch needs to be fed from Python to the network. During this process, when the mini-batch is small or its calculation time is short, long latency results.”

“When starting to learn basic deep learning tools using TensorFlow, I found it not straightforward in terms of session and variable management. It is quite tricky to debug the code if it has problems. Also, TensorFlow does not support dynamic graphs. This was not an issue for me in the beginning; however, it became a challenging problem when dealing with dynamic graphs (e.g., text modeling).”

“Some of the cons are as follows: 1. Lack of symbolic loops (which Theano and Caffe provide). 2. Lack of support on Windows.”

“Using machine learning in several projects. They all needed TensorFlow even when I was not using it directly, but rather other machine learning packages.”

Source URL

https://www.capterra.com/p/170397/TensorFlow/reviews/

Related Pain Points

Scalability Cost Challenges in Cloud Deployment

6

When scaling TensorFlow projects on cloud platforms with high-cost GPU configurations, training time grows exponentially, forcing developers to either optimize algorithms or migrate infrastructure, leading to significant cost and complexity issues.

performance, TensorFlow, GPU, Cloud

Static Computational Graph Rigidity

6

TensorFlow's static computational graph model requires developers to define the entire computational graph before execution, which is less flexible than dynamic graph alternatives like PyTorch and challenging for complex, evolving models.
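To make the static-vs-dynamic distinction concrete, here is a minimal plain-Python sketch. The `Node` class is hypothetical, not TensorFlow's or PyTorch's actual API; it only illustrates deferred graph execution versus eager execution.

```python
# Minimal sketch (hypothetical Node class, not a real framework API)
# contrasting static-graph and eager execution styles.

class Node:
    """A deferred operation in a static graph: built first, run later."""
    def __init__(self, fn, *inputs):
        self.fn, self.inputs = fn, inputs

    def run(self, feed):
        vals = [i.run(feed) if isinstance(i, Node) else feed.get(i, i)
                for i in self.inputs]
        return self.fn(*vals)

# Graph-construction phase: no numbers flow yet, so you cannot print
# or branch on intermediate values here -- the classic flexibility pain.
x = "x"                       # placeholder name
y = Node(lambda v: v * 2, x)  # y = x * 2
z = Node(lambda v: v + 1, y)  # z = y + 1

# Execution phase: the whole graph runs in one call, like session.run().
print(z.run({"x": 10}))       # -> 21

# Eager style (TF 2.x / PyTorch): each line computes immediately, so
# ordinary Python control flow and print-debugging just work.
x_val = 10
z_val = x_val * 2 + 1         # -> 21
```

The design point is that in the static style the graph's shape is fixed before any data flows, which is what makes evolving models (e.g. text with variable-length structure) awkward.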

architecture, TensorFlow, PyTorch

Slow Training Speed Compared to Competitors

6

TensorFlow reportedly takes longer to train neural networks than competing frameworks across a range of hardware setups, with slower execution speeds impacting model deployment timelines.

performance, TensorFlow

PyTorch data loading bottlenecks starve GPU compute

6

When the data pipeline is slower than the model, the GPU sits idle waiting for the CPU to serve batches, wasting expensive compute cycles. This is a common but often overlooked performance killer in PyTorch training workflows.
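A toy stdlib-only simulation can show the effect. No PyTorch is involved; `run_epoch` and its timings are invented for illustration, with loading and compute modeled as sleeps.

```python
# Toy simulation (no PyTorch required) of a data-starved training loop:
# when each batch takes longer to load than to compute, the "GPU" step
# spends most of its wall time waiting.
import time

def run_epoch(load_s, compute_s, n_batches=5):
    idle = 0.0
    for _ in range(n_batches):
        time.sleep(load_s)      # CPU-side loading (DataLoader work)
        idle += load_s          # compute sits idle during loading
        time.sleep(compute_s)   # "GPU" forward/backward pass
    # Fraction of wall time the compute device sat idle.
    return idle / (n_batches * (load_s + compute_s))

# Slow loader: ~80% of the wall clock is spent waiting for data.
print(f"idle fraction: {run_epoch(load_s=0.04, compute_s=0.01):.0%}")
```

In real PyTorch pipelines, the usual levers are `DataLoader(num_workers=..., pin_memory=True)` and prefetching, which overlap loading with compute so the device rarely waits.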

performance, PyTorch, DataLoader

Missing Symbolic Loops Support

6

TensorFlow lacks prebuilt support for symbolic loops: rather than representing a loop as a single graph node, loops are effectively unrolled, with forward activations kept in separate memory locations for each iteration. This limits certain control flow operations and increases memory use for long loops.

architecture, TensorFlow

No Windows Support

5

TensorFlow offers very limited features and support for Windows users; a significantly wider range of features is available only on Linux.

compatibility, TensorFlow, Windows, Linux

Complex Debugging Mechanisms

5

TensorFlow's debugging mechanisms are complex and far from straightforward, particularly around session and variable management, making problem code tricky to debug.
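A toy deferred-execution sketch (hypothetical `add_op`/`run` helpers, not TensorFlow's session API) shows why graph-mode errors are hard to trace: the failure surfaces only at run time, under a node name, far from the construction line that introduced the bug.

```python
# Hypothetical deferred-execution helpers, for illustration only.
ops = []

def add_op(fn, name):
    """Record an operation to run later (nothing executes yet)."""
    ops.append((name, fn))

add_op(lambda s: s["a"] + s["b"], "add")     # built here, fine
add_op(lambda s: s["a"] / s["zero"], "div")  # bug planted here, silently

def run(state):
    """Execute the whole recorded graph in one call, like session.run()."""
    for name, fn in ops:
        try:
            state[name] = fn(state)
        except Exception as e:
            # The error only appears now, reported by graph-node name
            # rather than as a normal traceback at the construction site.
            return f"error in node 'div'-style report: '{name}': {e}"
    return state

print(run({"a": 1, "b": 2, "zero": 0}))
```

Eager execution (TF 2.x default) avoids this by raising the exception on the very line that computes the bad value.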

dx, TensorFlow