What is XlaBuilder for? (Asked 2022-Mar-20 at 18:41)

What is the XLA class XlaBuilder for? The docs describe its interface but don't provide a motivation.
The presentation in the docs, and indeed the comment above XlaBuilder in the source code,

// A convenient interface for building up computations.

suggest it's no more than a utility. However, this doesn't appear to explain its behaviour in other places. For example, we can construct an XlaOp with an XlaBuilder via e.g.

XlaOp ConstantLiteral(XlaBuilder* builder, const LiteralSlice& literal);
Here, it's not clear to me what role builder plays (note that functions for constructing XlaOps aren't documented in the published docs). Further, when I add two XlaOps (with Add), it appears the ops must be constructed with the same builder, or else I see

F tensorflow/core/platform/statusor.cc:33] Attempting to fetch value instead of handling error Invalid argument: No XlaOp with handle -1
Indeed, it appears an XlaOp retains a handle to an XlaBuilder. This suggests to me that the XlaBuilder has a more fundamental significance.
Beyond the title question: is there a use case for using multiple XlaBuilders, or would you typically use one global instance for everything?
ANSWER (Answered 2021-Dec-15 at 01:32)
XlaBuilder is the C++ API for building up XLA computations -- conceptually this is like building up a function, full of various operations, that you could execute over and over again on different input data.
Some background: XLA serves as an abstraction layer for creating executable blobs that run on various target accelerators (CPU, GPU, TPU, IPU, ...); conceptually it's a kind of "accelerator virtual machine", with similarities to earlier systems like PeakStream or the line of work that led to ArBB.
XlaBuilder is a way to enqueue operations into a "computation" (similar to a function) that you want to run against the various accelerators XLA can target. The operations at this level are often referred to as "High Level Operations" (HLOs).
XlaOp represents the result of the operation you've just enqueued. (Aside/nerdery: this is a classic technique in "builder" APIs that represent the program in "Static Single Assignment" form under the hood: the operation itself and the result of the operation can be unified as one concept.)
XLA computations are very similar to functions, so you can think of what you're doing with an XlaBuilder as building up a function. (Aside: they're called "computations" because they do a little more than a straightforward function; conceptually they're coroutines that can talk to an external "host" world, and to each other via networking facilities.)
So the fact that XlaOps can't be used across XlaBuilders may make more sense with that context: in the same way that, when building up a function, you can't grab intermediate results from the internals of other functions, you have to compose them with function calls / parameters. In XlaBuilder you can Call another built computation, which is one reason you might use multiple builders.
As you note, you can choose to inline everything into one "mega builder", but programs are often structured as functions that get composed together and ultimately called from a few different "entry points". XLA currently specializes aggressively for the entry points it sees API users invoking, but this is a design artifact, similar to inlining decisions: XLA could conceptually reuse computations built up / invoked from multiple callers if it decided that was the right thing to do. Usually it's most natural to enqueue things into XLA however is convenient for your description from the "outside world", and let XLA inline and aggressively specialize the "entry point" computations you've built up as you execute them, in just-in-time compilation fashion.