Posts

Showing posts from September, 2020

Machine Learning in Compilers: Past, Present and Future

By Hugh Leather (University of Edinburgh) and Chris Cummins (Facebook AI Research)

Paper Link

Image courtesy: Embecosm

Abstract: Writing optimising compilers is difficult. The range of programs that may be presented to the compiler is huge, and the systems on which they run are complex, heterogeneous, non-deterministic, and constantly changing. The space of possible optimisations is also vast, making it very hard for compiler writers to design heuristics that take all of these considerations into account. As a result, many compiler optimisations are out of date or poorly tuned. The paper gives a retrospective of machine learning in compiler optimisation from its earliest inception, through some of the works that set themselves apart, to today's deep learning, finishing with our vision of the field's future.

Video from Facebook

Developers have known since they first used optimising compilers that the compiler does not always choose the best opt…
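To make the idea concrete, here is a minimal sketch (not the paper's method) of what replacing a hand-written compiler heuristic with a learned model can look like. The feature names, the toy labelling rule, and the synthetic data are all hypothetical, chosen only to illustrate the supervised-learning framing the abstract describes.

```python
# Illustrative sketch: a learned stand-in for a compiler heuristic.
# Hypothetical loop features: [trip_count, body_instructions, memory_ops].
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic training set: each row describes one loop; the label says
# whether unrolling it was measured to be profitable (toy rule below).
X = rng.integers(1, 200, size=(500, 3))
y = (X[:, 0] > 16) & (X[:, 1] < 64)  # toy ground truth, for illustration only

model = DecisionTreeClassifier(max_depth=3).fit(X, y)

# At compile time, the learned model replaces the hand-tuned heuristic.
candidate_loop = np.array([[32, 40, 5]])
print("unroll?", bool(model.predict(candidate_loop)[0]))
```

In a real system the labels would come from measuring program runtimes under each decision, which is exactly the data-collection burden the survey discusses.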

Fast Quantum Algorithm for Learning

By Hayata Yamasaki, Sathyawageeswar Subramanian, Sho Sonoda, and Masato Koashi

Paper Link

Image courtesy: Fossguru.com

Abstract: Kernel methods augmented with random features give scalable algorithms for learning from big data. However, it has been computationally hard to sample random features from a probability distribution that is optimised for the data, so as to minimise the number of features required to reach a desired learning accuracy. Here, we develop a quantum algorithm for sampling from this optimised distribution over features in runtime O(D), linear in the dimension D of the input data. Our algorithm achieves an exponential speedup in D compared to any known classical algorithm for this sampling task. In contrast to existing quantum machine learning algorithms, our algorithm circumvents sparsity and low-rank assumptions and thus has wide applicability. We also show that the sampled features can be combined with regression by stochastic gradient descent…
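For context, here is a minimal classical sketch of the pipeline the abstract builds on: random Fourier features approximating an RBF kernel, with a linear model fitted by stochastic gradient descent. The paper's contribution is sampling the features from a data-optimised distribution; in this baseline sketch they are drawn from the standard Gaussian distribution instead, and the data, learning rate, and epoch count are illustrative assumptions.

```python
# Classical baseline: random Fourier features + SGD regression.
import numpy as np

rng = np.random.default_rng(0)
D, n_features, n_samples = 5, 200, 1000

# Toy regression problem in dimension D.
X = rng.normal(size=(n_samples, D))
y = np.sin(X.sum(axis=1))

# Random Fourier features for an RBF kernel:
# phi(x) = sqrt(2/M) * cos(W x + b), with W ~ N(0, I), b ~ U[0, 2*pi).
# (The quantum algorithm would sample W from an optimised distribution.)
W = rng.normal(size=(n_features, D))
b = rng.uniform(0, 2 * np.pi, size=n_features)

def phi(x):
    return np.sqrt(2.0 / n_features) * np.cos(W @ x + b)

# Linear regression on the features, trained by plain SGD.
w = np.zeros(n_features)
lr = 0.05
for epoch in range(20):
    for i in rng.permutation(n_samples):
        f = phi(X[i])
        w -= lr * (f @ w - y[i]) * f  # gradient of 0.5 * (f.w - y)^2

pred = np.array([phi(x) @ w for x in X])
print("train MSE:", np.mean((pred - y) ** 2))
```

The point of optimising the feature distribution, classically or quantumly, is to shrink n_features while keeping the same accuracy, which is exactly the quantity the paper's O(D) sampler targets.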