Introduction to the Python Deep Learning Library TensorFlow
Last Updated on July 27, 2023
TensorFlow is a Python library for fast numerical computing created and released by Google.
It is a foundation library that can be used to create Deep Learning models directly or by using wrapper libraries built on top of TensorFlow that simplify the process.
In this post, you will discover the TensorFlow library for Deep Learning.
- Update Jun 2023: Updated for TensorFlow 2.x
Kick-start your project with my new book Deep Learning With Python, including step-by-step tutorials and the Python source code files for all examples.
Let’s get started.

Introduction to the Python deep learning library TensorFlow
Photo by Nicolas Raymond, some rights reserved.
What Is TensorFlow?
TensorFlow is an open-source library for fast numerical computing.
It was created and is maintained by Google and was released under the Apache 2.0 open source license. The API is nominally for the Python programming language, although there is access to the underlying C++ API.
Unlike other numerical libraries intended for use in Deep Learning such as Theano, TensorFlow was designed for use both in research and development and in production systems, not least of which is RankBrain in Google search and the fun DeepDream project.
It can run on single CPU systems and GPUs, as well as mobile devices and large-scale distributed systems of hundreds of machines.
How to Install TensorFlow
Installation of TensorFlow is straightforward if you already have a Python SciPy environment.
TensorFlow works with Python 3.3+. You can follow the Download and Setup instructions on the TensorFlow website. Installation is probably easiest via PyPI, and specific instructions for the pip command to use on your Linux or Mac OS X platform are on the Download and Setup webpage. In the simplest case, you just need to enter the following on your command line:
```
pip install tensorflow
```
An exception is the newer Mac with an Apple Silicon CPU. The package name for this particular architecture is tensorflow-macos instead:
```
pip install tensorflow-macos
```
There are also virtualenv and Docker images that you can use if you prefer.
To make use of the GPU, you need to have the CUDA Toolkit installed as well.
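Once installed, a quick sanity check is to print the library version and the devices TensorFlow can see. This is a minimal sketch, not part of the original post, and it assumes a TensorFlow 2.x installation:

```python
import tensorflow as tf

# Print the installed version and the devices TensorFlow can use
print(tf.__version__)
print(tf.config.list_physical_devices("CPU"))
print(tf.config.list_physical_devices("GPU"))  # empty list if no GPU/CUDA setup is available
```

If the GPU list is empty, TensorFlow will simply fall back to running on the CPU.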
Your First Examples in TensorFlow
Computation is described in terms of data flow and operations in the structure of a directed graph; the sketch after the list below shows how these pieces appear in code.
- Nodes: Nodes perform computation and have zero or more inputs and outputs. Data that moves between nodes is known as tensors, which are multi-dimensional arrays of real values.
- Edges: The graph defines the flow of data, branching, looping, and updates to state. Special edges can be used to synchronize behavior within the graph, for example, waiting for computation on a number of inputs to complete.
- Operation: An operation is a named abstract computation that can take input attributes and produce output attributes. For example, you could define an add or multiply operation.
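In TensorFlow 2.x, such a graph is usually built for you when a Python function is traced with tf.function. The following is a minimal sketch (the function name affine is just an illustration) that traces a small computation and lists the named operations, i.e., the nodes, in the resulting graph:

```python
import tensorflow as tf

# A small computation: multiply then add
def affine(x, w, b):
    return tf.add(tf.multiply(w, x), b)

# Tracing with tf.function builds a graph whose nodes are operations
# and whose edges carry tensors between them
traced = tf.function(affine)
concrete = traced.get_concrete_function(
    tf.TensorSpec(shape=[], dtype=tf.float32),
    tf.TensorSpec(shape=[], dtype=tf.float32),
    tf.TensorSpec(shape=[], dtype=tf.float32),
)

# Each operation is a named node with a type, e.g. "Mul" and "AddV2"
for op in concrete.graph.get_operations():
    print(op.name, op.type)
```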
Computation with TensorFlow
This first example is a modified version of the example on the TensorFlow website. It shows how you can define values as tensors and execute an operation.
```python
import tensorflow as tf
a = tf.constant(10)
b = tf.constant(32)
print(a + b)
```
Running this example prints:
```
tf.Tensor(42, shape=(), dtype=int32)
```
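The same operations apply elementwise to multi-dimensional tensors, which matches the description of tensors above. A small sketch, not part of the original example:

```python
import tensorflow as tf

# Elementwise addition on 2x2 tensors
m = tf.constant([[1.0, 2.0], [3.0, 4.0]])
n = tf.ones([2, 2])
total = m + n

print(total)          # tf.Tensor of shape (2, 2)
print(total.numpy())  # convert the result back to a NumPy array
```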
Linear Regression with TensorFlow
This next example comes from the introduction in the TensorFlow tutorial.
This example shows how you can define variables (e.g., W and b) as well as variables that are the result of the computation (y).
We get some sense that TensorFlow separates the definition and declaration of the computation. Below, there is automatic differentiation under the hood. When we use the function mse_loss() to compute the difference between y and y_data, a graph is created connecting the value produced by the function to the TensorFlow variables W and b. TensorFlow uses this graph to deduce how to update the variables inside the minimize() function. (A sketch after the output below makes this gradient step explicit.)
```python
import tensorflow as tf
import numpy as np

# Create 100 phony x, y data points in NumPy, y = x * 0.1 + 0.3
x_data = np.random.rand(100).astype(np.float32)
y_data = x_data * 0.1 + 0.3

# Try to find values for W and b that compute y_data = W * x_data + b
# (We know that W should be 0.1 and b 0.3, but TensorFlow will
# figure that out for us.)
W = tf.Variable(tf.random.normal([1]))
b = tf.Variable(tf.zeros([1]))

# A function to compute mean squared error between y_data and computed y
def mse_loss():
    y = W * x_data + b
    loss = tf.reduce_mean(tf.square(y - y_data))
    return loss

# Minimize the mean squared errors.
optimizer = tf.keras.optimizers.Adam()
for step in range(5000):
    optimizer.minimize(mse_loss, var_list=[W, b])
    if step % 500 == 0:
        print(step, W.numpy(), b.numpy())

# Learns best fit is W: [0.1], b: [0.3]
```
Running this example prints the following output:
```
0 [-0.35913563] [0.001]
500 [-0.04056413] [0.3131764]
1000 [0.01548613] [0.3467598]
1500 [0.03492216] [0.3369852]
2000 [0.05408324] [0.32609695]
2500 [0.07121297] [0.316361]
3000 [0.08443557] [0.30884594]
3500 [0.09302785] [0.3039626]
4000 [0.09754606] [0.3013947]
4500 [0.09936733] [0.3003596]
```
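To make the automatic differentiation mentioned above explicit, the same update can be written with tf.GradientTape, computing and applying the gradients by hand. This is a sketch under the same assumptions as the example above, not part of the original post:

```python
import tensorflow as tf
import numpy as np

# Same synthetic data as above: y = x * 0.1 + 0.3
x_data = np.random.rand(100).astype(np.float32)
y_data = x_data * 0.1 + 0.3

W = tf.Variable(tf.random.normal([1]))
b = tf.Variable(tf.zeros([1]))
optimizer = tf.keras.optimizers.Adam()

for step in range(5000):
    with tf.GradientTape() as tape:
        y = W * x_data + b
        loss = tf.reduce_mean(tf.square(y - y_data))
    # The tape records the operations connecting loss to W and b,
    # so the gradients can be computed automatically
    grads = tape.gradient(loss, [W, b])
    optimizer.apply_gradients(zip(grads, [W, b]))
    if step % 500 == 0:
        print(step, W.numpy(), b.numpy())
```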
You can learn more about the mechanics of TensorFlow in the Basic Usage guide.
More Deep Learning Models
Your TensorFlow installation comes with a number of Deep Learning models that you can use and experiment with directly.
First, you need to find out where TensorFlow was installed on your system. For example, you can use a short Python script along the following lines (a minimal sketch that simply prints the package directory):
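```python
# Locate the directory where the tensorflow package is installed
import os
import inspect
import tensorflow
print(os.path.dirname(inspect.getfile(tensorflow)))
```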
For example, this could be:
```
/usr/lib/python3.9/site-packages/tensorflow
```
Change to this directory and take note of the models subdirectory. Included are a number of deep learning models with tutorial-like comments, such as:
- Multi-threaded word2vec mini-batched skip-gram model
- Multi-threaded word2vec unbatched skip-gram model
- CNN for the CIFAR-10 network
- Simple, end-to-end, LeNet-5-like convolutional MNIST model example
- Sequence-to-sequence model with an attention mechanism
Also, check the examples directory, which contains an example using the MNIST dataset; a minimal Keras sketch in the same spirit follows below.
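For a quick taste of what such an example looks like, here is a minimal sketch (not the bundled example itself) that trains a small fully connected network on MNIST with the Keras API:

```python
import tensorflow as tf

# Load the MNIST digits and scale pixel values to [0, 1]
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small fully connected network, just to show the Keras API
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, validation_data=(x_test, y_test))
```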
There is also an excellent list of tutorials on the main TensorFlow website. They show how to use different network types and different datasets and how to use the framework in various ways.
Finally, there is the TensorFlow playground where you can experiment with small networks right in your web browser.
Need help with Deep Learning in Python?
Take my free 2-week email course and discover MLPs, CNNs and LSTMs (with code).
Click to sign-up now and also get a free PDF Ebook version of the course.
TensorFlow Resources
More Resources
- TensorFlow Course on Udacity
- TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems (2016)
Summary
In this post, you discovered the TensorFlow Python library for deep learning.
You learned that it is a library for fast numerical computation, specifically designed for the types of operations required to develop and evaluate large deep learning models.
Do you have any questions about TensorFlow or this post? Ask your questions in the comments, and I'll do my best to answer them.
