• iii@mander.xyz · 1 day ago

    To prevent people from simply copying the “age proof” and letting others reuse it, a nonce/private-key combo is needed. To protect that key, a DRM-style locked-down device is necessary, conveniently removing your ability to know what your device is doing: just a “trust us”.

    Seeing as the EU doesn’t make any popular hardware, their plan will always rely on either Asian or US manufacturers implementing the black-box “safety” chip.
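The replay problem described above can be sketched in a few lines. This is a hypothetical toy, not any real scheme: an HMAC with a device-held secret stands in for the hardware-sealed signing key, and the site issues a fresh nonce per session so a copied proof is useless elsewhere.

```python
import hmac
import hashlib
import secrets

# Hypothetical sketch: the device holds a secret key (in a real scheme,
# sealed inside DRM-style hardware) and signs a site-supplied nonce.
DEVICE_KEY = secrets.token_bytes(32)  # in theory, never leaves the device

def make_age_proof(nonce: bytes) -> bytes:
    """Bind the 'over 18' claim to this specific challenge."""
    return hmac.new(DEVICE_KEY, b"age>=18|" + nonce, hashlib.sha256).digest()

def verify(proof: bytes, nonce: bytes) -> bool:
    """Site side: recompute and compare in constant time."""
    expected = hmac.new(DEVICE_KEY, b"age>=18|" + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(proof, expected)

# Each session gets a fresh nonce, so a copied proof cannot be replayed.
nonce_a = secrets.token_bytes(16)
nonce_b = secrets.token_bytes(16)
proof = make_age_proof(nonce_a)
print(verify(proof, nonce_a))  # True: proof matches its own challenge
print(verify(proof, nonce_b))  # False: replay against a new nonce fails
```

Without the nonce, the proof is a static token anyone can copy; with it, reuse requires access to the key itself, which is why the scheme pushes the key into locked-down hardware.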

    • redjard@lemmy.dbzer0.com · 1 day ago

      If it is about hiding some data handled by the app, that data will be instantly extracted.
      There are plenty of people passing full integrity checks on rooted phones. It’s really annoying to set up and keep working, and requiring it would fuck over most rooted-phone/custom-OS users, but someone able to fully inspect and leak everything about the app will always pop up.

      • iii@mander.xyz · 8 hours ago

        If it is about hiding some data handled by the app, that will be instantly extracted.

        Look at the design of DRM chips: they bake the key into hardware. Some keys have leaked (I think the PlayStation 2 is an example), but typically via a source inside the company.

        • redjard@lemmy.dbzer0.com · 7 hours ago

          That applies to Play Integrity, and a lot of getting it working is juggling various signatures and keys.
          The suggestion above, which I replied to, was instead about software-managed keys: something handed to the app, which the app then stores, with the Google DRM polled to obtain that sacred piece of data. Since it is present in the software, it can be plainly read by the user on rooted devices, which hardware-backed keys cannot.

          Play Integrity is hardware-based, but the EU app is software-based, merely polling Google’s hardware-based stuff somewhere in the process.
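The software-key vs hardware-key distinction can be made concrete with a toy model (purely illustrative, no real APIs): a software-managed key is just data the app holds, readable by anyone with root, while a secure element only exposes a signing oracle and never the key material.

```python
import hmac
import hashlib
import secrets

class SoftwareKey:
    """Key material lives in app memory/storage: root can simply read it."""
    def __init__(self) -> None:
        self.key = secrets.token_bytes(32)  # plainly visible to a debugger

class HardwareBackedKey:
    """Toy stand-in for a secure element: exposes signing, never the key."""
    def __init__(self) -> None:
        self.__key = secrets.token_bytes(32)  # sealed; no getter exists
    def sign(self, msg: bytes) -> bytes:
        return hmac.new(self.__key, msg, hashlib.sha256).digest()

soft = SoftwareKey()
hard = HardwareBackedKey()
print(soft.key.hex())                  # extraction is one attribute read
print(hard.sign(b"challenge").hex())   # only an oracle; key stays inside
```

The point of the thread: an app that receives and stores the “sacred piece of data” in software is in the first category, no matter how hardware-backed the attestation it polled along the way was.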

          • iii@mander.xyz · 5 hours ago

            merely polling googles hardware based stuff

            I understand. In the context of digital sovereignty, even if the linked shitty implementation is discarded (as it should be), every correct implementation will require a magic DRM-like chip. That chip will be made by a US or Asian manufacturer, as the EU has no such manufacturing.

    • General_Effort@lemmy.world · 1 day ago

      The key doesn’t have to be on your phone. You can just send the request to some service to sign, identifying yourself to that service in whatever way.

      • iii@mander.xyz · 8 hours ago

        It’s that “whatever way” that is difficult. This proposal merely shifts the problem: now the login to that third-party service can be shared, and age verification subverted.

        • General_Effort@lemmy.world · 8 hours ago

          A phone can also be shared. If it happens at scale, it will be flagged pretty quickly. It’s not a real problem.

          The only real problem is the very intention of such laws.

          • iii@mander.xyz · 5 hours ago

            If it happens at scale, it will be flagged pretty quickly.

            How? In a correct implementation, the third parties only receive proof of age, not identity. How would reuse and sharing be detected?

            • General_Effort@lemmy.world · 3 hours ago

              There are 3 parties:

              1. the user
              2. the age-gated site
              3. the age verification service

              The site (2) sends a request to the user (1), who passes it on to the service (3), where it is signed and returned the same way. The request comes with a nonce and a timestamp, making reuse difficult. An unusual volume of requests from a single user will be detected by the service.
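The flow above can be sketched end to end. This is a toy under stated assumptions: an HMAC with a service-held secret stands in for the service’s signature (a real deployment would use an asymmetric signature so the site only needs the service’s public key), and a simple per-user counter stands in for the service’s volume detection.

```python
import hmac
import hashlib
import secrets
import time
from collections import Counter

SERVICE_KEY = secrets.token_bytes(32)  # held by the verification service (3)
request_counts = Counter()             # service-side volume tracking per user
MAX_AGE_S = 60                         # timestamps older than this are rejected

def _message(nonce: bytes, ts: float) -> bytes:
    return nonce + str(int(ts)).encode() + b"|age-ok"

def service_sign(user_id: str, nonce: bytes, ts: float) -> bytes:
    """Service (3): signs the site's challenge after identifying the user.
    An unusual request volume per user would flag account sharing."""
    request_counts[user_id] += 1
    return hmac.new(SERVICE_KEY, _message(nonce, ts), hashlib.sha256).digest()

def site_verify(sig: bytes, nonce: bytes, ts: float) -> bool:
    """Site (2): checks freshness, then the service's signature."""
    if time.time() - ts > MAX_AGE_S:
        return False  # stale timestamp: replay blocked
    expected = hmac.new(SERVICE_KEY, _message(nonce, ts), hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected)

# Flow: site -> user -> service -> user -> site
nonce, ts = secrets.token_bytes(16), time.time()
sig = service_sign("user-1", nonce, ts)
print(site_verify(sig, nonce, ts))                    # True: fresh, signed
print(site_verify(sig, secrets.token_bytes(16), ts))  # False: wrong nonce
```

The nonce and timestamp stop replay at the site, while the per-user counter at the service is where “an unusual volume of requests” would surface; the site itself never learns the user’s identity.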