• jqubed@lemmy.world
    20 days ago

    TL;DR: it looks like Smith’s team took actual still photos of the crowds from the concert and ran them through generative AI products to turn them into video clips. The original, professional photos are clear and signs are legible.

    They uploaded these clips to YouTube as well as Instagram/Facebook. Separately, it turns out YouTube has been quietly experimenting on Shorts by running them through AI upscalers and sharpeners without telling anyone. In statements responding to complaints that the quality looks worse, Google insists it’s not “generative AI” but “traditional machine learning.” The Will Smith clips on YouTube look noticeably worse, and more like AI-garbled slop, than the same clips on Instagram/Facebook.

    • Mothra@mander.xyz
      20 days ago

      Ugh. Yeah, I see why they’d want to add motion to the crowds if all they had were stills. It’s dumb though; it would have been much easier to just shoot footage of the crowds in the first place.

      I don’t like YT upscaling and sharpening content without consent. I upload some portfolio pieces occasionally, and having to contend with this garbage now doesn’t make me look forward to it.

      Thanks for the TLDR