Google's TensorNetwork library speeds up computation by up to 100 times

The Transformer, a type of AI architecture introduced in a 2017 paper (“Attention Is All You Need”) coauthored by scientists at Google, excels at writing prose and product reviews, synthesizing voices, and crafting harmonies in the style of classical composers. But a team of Google researchers believed it could be taken a step further with AutoML, a technique in which a “controller” system identifies a “child” architecture…
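
For readers curious what the controller/child relationship mentioned above looks like in practice, here is a minimal sketch of an architecture-search loop. The search space, the scoring function, and the random controller are illustrative assumptions for this sketch only, not the method described in the paper.

```python
# A simplified illustration of the AutoML "controller proposes a child
# architecture" idea. The search space, controller, and scoring function
# below are hypothetical placeholders, not Google's actual system.
import random

# Hypothetical search space: each "child" architecture is a choice of
# layer count, hidden width, and attention heads.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6],
    "hidden_size": [128, 256, 512],
    "num_heads": [2, 4, 8],
}

def propose_child(rng: random.Random) -> dict:
    """Controller step: sample one candidate ("child") architecture."""
    return {name: rng.choice(options) for name, options in SEARCH_SPACE.items()}

def evaluate_child(arch: dict, rng: random.Random) -> float:
    """Stand-in for training the child and measuring validation quality.
    A real system would train a model here; this returns a random score."""
    return rng.random()

def search(num_trials: int = 20, seed: int = 0) -> dict:
    """Controller loop: propose children, score them, keep the best."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(num_trials):
        arch = propose_child(rng)
        score = evaluate_child(arch, rng)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch

if __name__ == "__main__":
    print(search())
```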

View On VentureBeat