Pebbling Club 🐧🪨

  • BART
    Highlight: The BART model was proposed in BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension by Mike Lewis, Yinhan Liu, Naman Goyal, Marjan Ghazvininejad, Abdelrahman Mohamed, Omer Levy, Ves Stoyanov and Luke Zettlemoyer on 29 Oct 2019.
  • BART Model for Text Summarization
    This tutorial covers the origins and uses of the BART model for text summarization tasks, and concludes with a brief demo of using BART with Paperspace Notebooks (a minimal summarization sketch follows this list).
  • BART to Consider New Policy for Cutting Cellphone Service (Erin Sherbert/SF Weekly)
  • BART cops get shirt-pocket snitch-cams
  • Photo (Conan + Bart)
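
A minimal sketch for the BART-the-language-model entries above, assuming the Hugging Face transformers library and the facebook/bart-large-cnn checkpoint (neither is specified in the bookmarks themselves):

  # Minimal sketch: summarize text with a BART checkpoint via the Hugging Face
  # transformers summarization pipeline. The checkpoint name is an assumption;
  # the bookmarked tutorial may use a different one.
  from transformers import pipeline

  summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

  article = (
      "BART is a denoising sequence-to-sequence pre-training method for natural "
      "language generation, translation, and comprehension. It corrupts text with "
      "an arbitrary noising function and learns to reconstruct the original text."
  )

  result = summarizer(article, max_length=50, min_length=10, do_sample=False)
  print(result[0]["summary_text"])

The same call runs in a Paperspace Notebook or any environment with transformers and a PyTorch (or TensorFlow) backend installed.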