“Like so many applications of AI, this new power is likely to be a double-edged sword: It may help people identify the locations of old snapshots from relatives, or allow field biologists to conduct rapid surveys of entire regions for invasive plant species, to name but a few of many likely beneficial applications.

“But it also could be used to expose information about individuals that they never intended to share, says Jay Stanley, a senior policy analyst at the American Civil Liberties Union who studies technology. Stanley worries that similar technology, which he feels will almost certainly become widely available, could be used for government surveillance, corporate tracking or even stalking.”

    • PoopMonster@lemmy.world · 11 months ago

      OpenStreetMap lets you write some insanely precise queries. There’s a company whose plan was to team up with governments to pinpoint mass shooters while they were streaming (as a use case).

      So say it’s clear from the video that they’re in X city, and you can see things in the video like McDonald’s, Starbucks, fenced-in playgrounds, churches, what have you. You can give the query a bounding box with all that info and very quickly narrow down where the video could have been taken.

      I think there were also some people who would pinpoint images from mountain outlines as a game. Kind of like GeoGuessr on steroids.
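      For a concrete sense of the bounding-box idea, here’s a minimal Python sketch that assembles an Overpass QL query from a box and a list of feature tags. The coordinates and tag choices are made up for illustration, and `build_overpass_query` is a hypothetical helper, not part of any OpenStreetMap library:

```python
# Sketch: build an Overpass QL query that finds tagged features
# inside a bounding box. Coordinates and tags are hypothetical.
def build_overpass_query(bbox, features):
    """bbox = (south, west, north, east); features = list of (key, value) tags."""
    s, w, n, e = bbox
    clauses = "".join(
        f'  nwr["{key}"="{value}"]({s},{w},{n},{e});\n'
        for key, value in features
    )
    # Union of all matching nodes/ways/relations, with center coordinates.
    return f"[out:json];\n(\n{clauses});\nout center;"

query = build_overpass_query(
    (40.7, -74.02, 40.75, -73.97),            # made-up box over part of Manhattan
    [("brand", "McDonald's"),
     ("brand", "Starbucks"),
     ("amenity", "place_of_worship")],
)
print(query)
# A query like this can then be POSTed to a public Overpass endpoint
# such as https://overpass-api.de/api/interpreter
```

      Intersecting the results of several such feature queries within a small radius is what narrows a whole city down to a handful of candidate blocks.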

  • skydivekingair@lemmy.world · 11 months ago

    This isn’t unique to AI; like most LLM programs, it’s just accomplishing it faster and on a larger scale. Personally, I think if you want privacy you should limit the personal things you post to what you’re okay with being out there, and form habits such as waiting until you’re home from vacation to post pictures.

  • afraid_of_zombies@lemmy.world · 11 months ago

    Yes, and people like me have continued to point out that this problem stems from a bad view of the expectation of privacy.

    A non-famous person has a reasonable expectation of privacy on public property. If you take a photo and a non-famous person’s face is in it, you should have written consent for that specific photo, or blur it out. If Disney can own an image of a mouse for 95 fucking years, I can own my own image.

    Don’t take pictures of people or their property without consent. Just because technology allows you to be a disgusting creep doesn’t mean you should. If you want jerk off material just use the internet like the rest of us.

    • DessertStorms@kbin.social · 11 months ago

      If you want jerk off material just use the internet like the rest of us.

      The kind of thing this can be used for is about ten stages past jerking off, and into stalker territory. So a person already using the internet to jerk off can now pinpoint exactly where the person they’re jerking off to lives, potentially turn up at their house, and escalate from there. This is beyond just creepy (and exploitative, in the case of corporations using the info); it’s potentially putting lives at risk.

      • afraid_of_zombies@lemmy.world · 11 months ago

        Ok, I don’t know what I’m supposed to do about that. Let’s just work on the problem we can solve for now.

        • DessertStorms@kbin.social · 11 months ago

          I never asked you to do anything? Just pointing out that things are much more serious than your comment makes out. I also don’t see how what you said is a problem we can solve now and it’s okay to focus on, but what I added somehow isn’t…

  • AFK BRB Chocolate@lemmy.world · 11 months ago

    To get that kind of accuracy from a student project with such a small sample set is pretty remarkable and pretty frightening. Yes, there are people who are good at this, but (1) this AI just beat one of the most skilled humans and (2) having it in an AI brings the capability to anyone, regardless of their motives.

    Plus, with an AI you can incorporate more heuristics than any human could reasonably master. The article mentions types of foliage, which is a good example. An AI could incorporate thousands of things like that easily. Seems like a tool that’s ripe for abuse, but I don’t know what you could do about it.
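    The article doesn’t describe PIGEON’s internals, but the general idea of fusing many weak cues (foliage, terrain, road markings, and so on) can be sketched as a naive-Bayes-style accumulation of log-likelihoods over candidate regions. Everything here, including the cue probabilities, is made up for illustration:

```python
import math

# Sketch (not PIGEON's actual method): fuse many weak visual cues by
# summing per-region log-likelihoods, naive-Bayes style.
def fuse_cues(regions, cue_likelihoods):
    """cue_likelihoods: list of dicts mapping region -> P(cue | region)."""
    scores = {r: 0.0 for r in regions}
    for likelihood in cue_likelihoods:
        for r in regions:
            # A small floor keeps one zero-probability cue from
            # vetoing a region outright.
            scores[r] += math.log(max(likelihood.get(r, 0.0), 1e-9))
    return max(scores, key=scores.get)

regions = ["Idaho", "New Zealand"]
cues = [
    {"Idaho": 0.6, "New Zealand": 0.5},   # canyon-like terrain (made up)
    {"Idaho": 0.7, "New Zealand": 0.2},   # sagebrush-style foliage (made up)
    {"Idaho": 0.4, "New Zealand": 0.45},  # road markings (made up)
]
print(fuse_cues(regions, cues))  # → Idaho
```

    No single cue is decisive, but summing thousands of them is exactly the kind of bookkeeping a model can do that no human player could.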

  • AutoTL;DR@lemmings.world (bot) · 11 months ago

    This is the best summary I could come up with:


    The project, known as Predicting Image Geolocations (or PIGEON, for short) was designed by three Stanford graduate students in order to identify locations on Google Street View.

    But it also could be used to expose information about individuals that they never intended to share, says Jay Stanley, a senior policy analyst at the American Civil Liberties Union who studies technology.

    It’s a neural network program that can learn about visual images just by reading text about them, and it’s built by OpenAI, the same company that makes ChatGPT.

    Rainbolt is a legend in geoguessing circles — he recently geolocated a photo of a random tree in Illinois, just for kicks — but he met his match with PIGEON.

    And it guessed that a picture of the Snake River Canyon in Idaho was of the Kawarau Gorge in New Zealand (in fairness, the two landscapes look remarkably similar).

    They’ve written a paper on their technique, which they co-authored along with their professor, Chelsea Finn — but they’ve held back from making their full model publicly available, precisely because of these concerns, they say.


    The original article contains 1,049 words, the summary contains 181 words. Saved 83%. I’m a bot and I’m open source!