keras-self-attention | Attention mechanism for processing sequential data | Machine Learning library
kandi X-RAY | keras-self-attention Summary
Attention mechanism for processing sequential data that considers the context for each timestamp.
Top functions reviewed by kandi - BETA
- Calculate the attention layer
- Call additive emission
- Calculate attention regularizer
- Call multiplicative emission
- Build self attention layer
- Build the additive attention weights
- Build the multiplicative attention matrix
- Find version string
- Read a file
- Read requirements file
keras-self-attention Key Features
keras-self-attention Examples and Code Snippets
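A minimal usage sketch, loosely following the package README: SeqSelfAttention, its ATTENTION_TYPE_MUL constant, and attention_regularizer_weight are part of the library's API, while the surrounding model, shapes, and hyperparameters are illustrative assumptions.

from tensorflow import keras
from keras_self_attention import SeqSelfAttention
# Note: older versions of the package may require os.environ['TF_KERAS'] = '1'
# before the import above in order to bind to tensorflow.keras.

model = keras.models.Sequential()
model.add(keras.layers.Embedding(input_dim=10000, output_dim=300, mask_zero=True))
model.add(keras.layers.Bidirectional(keras.layers.LSTM(units=128, return_sequences=True)))
model.add(SeqSelfAttention(
    attention_activation='sigmoid',
    # Additive attention is the default; multiplicative is also available:
    # attention_type=SeqSelfAttention.ATTENTION_TYPE_MUL,
    attention_regularizer_weight=1e-4,  # optional penalty on the attention matrix
))
model.add(keras.layers.Dense(units=5, activation='softmax'))
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['categorical_accuracy'])
model.summary()

The layer preserves the (batch, timesteps, features) shape of the LSTM output, so its result can feed any per-timestep head.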
Community Discussions
Trending Discussions on keras-self-attention
QUESTION
I'm using keras-self-attention to implement an attention LSTM in Keras. How can I visualize the attention part after training the model? This is a time series forecasting case.
...ANSWER
Answered 2020-Feb-03 at 20:41
One approach is to fetch the outputs of SeqSelfAttention for a given input and organize them so as to display predictions per channel (see below). For something more advanced, have a look at the iNNvestigate library (usage examples included).
Update: I can also recommend See RNN, a package I wrote.
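A minimal sketch of the first approach, fetching the attention layer's output through an intermediate keras.Model and plotting each channel over timesteps; the architecture, layer name, and shapes below are illustrative assumptions rather than the original answer's code (which uses the show_features_1D helper explained next).

import numpy as np
import matplotlib.pyplot as plt
from tensorflow import keras
from keras_self_attention import SeqSelfAttention

# Toy forecasting-style model; shapes are placeholders.
inp = keras.layers.Input(shape=(60, 4))
seq = keras.layers.LSTM(240, return_sequences=True)(inp)
att = SeqSelfAttention(attention_activation='sigmoid', name='self_attention')(seq)
out = keras.layers.Dense(1)(keras.layers.Flatten()(att))
model = keras.Model(inp, out)
model.compile(optimizer='adam', loss='mse')

# Expose the attention layer's output and run a single batch through it.
attn_model = keras.Model(model.input, model.get_layer('self_attention').output)
x_sample = np.random.randn(1, 60, 4)        # single batch of shape (1, *input_shape)
attn_out = attn_model.predict(x_sample)[0]  # -> (timesteps, channels), here (60, 240)

# Predictions per channel: timesteps along x, output values along y.
for ch in range(8):
    plt.plot(attn_out[:, ch], label=f'channel {ch}')
plt.axhline(0, color='red', linewidth=0.5)  # y=0 reference line
plt.xlabel('timestep')
plt.legend()
plt.show()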
Explanation: show_features_1D fetches the outputs of the layer named layer_name (can be a substring) and shows predictions per channel (labeled), with timesteps along the x-axis and output values along the y-axis.
- input_data = single batch of data of shape (1, input_shape)
- prefetched_outputs = already-acquired layer outputs; overrides input_data
- max_timesteps = max number of timesteps to show
- max_col_subplots = max number of subplots along the horizontal
- equate_axes = force all x- and y-axes to be equal (recommended for fair comparison)
- show_y_zero = whether to show y=0 as a red line
- channel_axis = layer features dimension (e.g. units for LSTM, which is last)
- scale_width, scale_height = scale displayed image width & height
- dpi = image quality (dots per inch)
Visuals (below) explanation:
- The first is useful to see the shapes of extracted features regardless of magnitude, giving information about e.g. frequency contents.
- The second is useful to see feature relationships, e.g. relative magnitudes, biases, and frequencies. The result below stands in stark contrast with the image above it: running print(outs_1) reveals that all magnitudes are very small and don't vary much, so including the y=0 point and equating axes yields a line-like visual, which can be interpreted as self-attention being bias-oriented.
- The third is useful for visualizing features too numerous to be shown as above; defining the model with batch_shape instead of input_shape removes all ? in printed shapes, and we can see that the first output's shape is (10, 60, 240) and the second's is (10, 240, 240). In other words, the first output returns LSTM channel attention, and the second a "timesteps attention". The heatmap result below can be interpreted as showing attention "cooling down" w.r.t. timesteps.
SeqWeightedAttention is a lot easier to visualize, but there isn't much to visualize; you'll need to get rid of Flatten above to make it work. The attention's output shapes then become (10, 60) and (10, 240), for which you can use a simple histogram, plt.hist (just make sure you exclude the batch dimension, i.e. feed (60,) or (240,)).
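As a minimal sketch of that histogram idea, assuming SeqWeightedAttention accepts a return_attention flag that also returns the per-timestep weights (the model and shapes are placeholders):

import numpy as np
import matplotlib.pyplot as plt
from tensorflow import keras
from keras_self_attention import SeqWeightedAttention

# Toy model; note there is no Flatten between the LSTM and the attention layer.
inp = keras.layers.Input(shape=(60, 4))
seq = keras.layers.LSTM(240, return_sequences=True)(inp)
summary, weights = SeqWeightedAttention(return_attention=True)(seq)
out = keras.layers.Dense(1)(summary)
model = keras.Model(inp, out)

# Fetch the per-timestep weights for a batch, then drop the batch dimension so
# plt.hist receives a (60,)-shaped vector.
weights_model = keras.Model(inp, weights)
w = weights_model.predict(np.random.randn(10, 60, 4))  # -> (10, 60)
plt.hist(w[0], bins=20)
plt.xlabel('attention weight')
plt.show()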
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install keras-self-attention
You can use keras-self-attention like any standard Python library. You will need a development environment consisting of a Python distribution (including header files), a compiler, pip, and git. Make sure that your pip, setuptools, and wheel are up to date. When using pip, it is generally recommended to install packages in a virtual environment to avoid making changes to the system Python.
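The package is published on PyPI, so a typical setup is to create and activate a virtual environment and run pip install keras-self-attention. A minimal import smoke test (the layer arguments are just illustrative defaults) might look like:

from keras_self_attention import SeqSelfAttention, SeqWeightedAttention

# Smoke test: both attention layers can be imported and instantiated.
print(SeqSelfAttention(attention_activation='sigmoid'))
print(SeqWeightedAttention())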