Just a sample from the Echomail archive: cooperative anarchy at its finest, and still active today. Darkrealms is the Zone 1 Hub.
|    CONSPRCY    |    How big is your tinfoil hat?    |    2,445 messages    |
|    Message 2,051 of 2,445    |
|    Mike Powell to All    |
|    FBI warns kidnapping scam    |
|    09 Dec 25 09:05:56    |
TZUTC: -0500
MSGID: 1808.consprcy@1:2320/105 2d9d69d4
PID: Synchronet 3.21a-Linux master/123f2d28a Jul 12 2025 GCC 12.2.0
TID: SBBSecho 3.28-Linux master/123f2d28a Jul 12 2025 GCC 12.2.0
BBSID: CAPCITY2
CHRS: ASCII 1
FORMAT: flowed

FBI warns of kidnapping scams as hackers turn to AI to provide 'proof of life'

Date:
Mon, 08 Dec 2025 14:30:00 +0000

Description:
AI-generated deepfake videos are being used as "proof" someone was kidnapped.

FULL STORY

Hackers are using Generative Artificial Intelligence (GenAI) to create
convincing deepfake videos, which are then used as proof of life in kidnapping
and extortion scams.

This is according to the US Federal Bureau of Investigation (FBI), which
recently released a new Public Service Announcement (PSA) warning citizens
not to fall for the trick.

Here is how the scam works: the criminals pick a target and scour social
media and other sources for images and videos. If they find enough material,
they feed it into an AI tool to create videos and images depicting their
target's loved ones as kidnapped. Then they contact the victims and demand an
immediate ransom payment to release their "hostage."

The scam might not be that widespread, but it's been around for a little
while; The Guardian reported on it two years ago. Still, with AI getting
better by the minute, it's safe to assume these scams are becoming more
common, prompting a reaction from the FBI.

The FBI also said that these photos and videos are not perfect. With a little
pixel hunting, they can be identified as fake. However, crooks know this too,
so the messages they send are usually timed and expire before any meaningful
analysis can be done.

"Examples of these inaccuracies include missing tattoos or scars and
inaccurate body proportions," the PSA reads. "Criminal actors will sometimes
purposefully send these photos using timed message features to limit the
amount of time victims have to analyze the images."

To defend against these attacks, the FBI first suggests citizens be more
mindful about their privacy, both when posting photos online and when giving
personal information to strangers while traveling. It also suggests
establishing a code word only the family knows and, most importantly, trying
to contact the loved one directly before making any payment.

======================================================================
Link to news story:
https://www.techradar.com/pro/security/fbi-warns-of-kidnapping-scams-as-hackers-turn-to-ai-to-provide-proof-of-life

$$
--- SBBSecho 3.28-Linux
 * Origin: capitolcityonline.net * Telnet/SSH:2022/HTTP (1:2320/105)
SEEN-BY: 105/81 106/201 128/187 129/14 305 153/7715 154/110 218/700
SEEN-BY: 226/30 227/114 229/110 134 206 300 307 317 400 426 428 470
SEEN-BY: 229/664 700 705 266/512 291/111 320/219 322/757 342/200 396/45
SEEN-BY: 460/58 633/280 712/848 902/26 2320/0 105 304 3634/12 5075/35
PATH: 2320/105 229/426
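Everything below the date header travels inside the message itself: the kludge lines (TZUTC, MSGID, PID, TID, BBSID, CHRS, FORMAT), the quoted article, the tearline and origin line, and the SEEN-BY/PATH lines that record which nodes have already received the echo. Purely as an illustration, and not the archive's or SBBSecho's actual code, here is a rough Python sketch of how those control lines could be pulled apart. The function names and the kludge list are assumptions for the example, and a real packet prefixes each kludge with a SOH (0x01) byte that this web view does not show.

KLUDGES = ("TZUTC", "MSGID", "REPLY", "PID", "TID", "BBSID", "CHRS", "FORMAT")

def parse_echomail(text):
    """Split an Echomail message into kludges, body text, and routing lines."""
    msg = {"kludges": {}, "body": [], "seen_by": [], "path": []}
    for line in text.splitlines():
        line = line.lstrip("\x01")          # real packets prefix kludges with SOH
        if line.startswith("SEEN-BY:"):
            msg["seen_by"].extend(line.split(":", 1)[1].split())
        elif line.startswith("PATH:"):
            msg["path"].extend(line.split(":", 1)[1].split())
        elif line.split(":", 1)[0] in KLUDGES:
            name, value = line.split(":", 1)
            msg["kludges"][name] = value.strip()
        else:
            msg["body"].append(line)
    return msg

def expand_2d(nodes):
    """Expand SEEN-BY/PATH shorthand: a bare node number reuses the net of
    the previous entry, so '129/14 305' means 129/14 and 129/305."""
    expanded, net = [], None
    for item in nodes:
        if "/" in item:
            net, node = item.split("/", 1)
        else:
            node = item
        expanded.append(f"{net}/{node}")
    return expanded

Run against the message above, expand_2d turns the last SEEN-BY line into 460/58, 633/280, 712/848, 902/26, 2320/0, 2320/105, 2320/304, 3634/12 and 5075/35, and the PATH into 2320/105 followed by 229/426, i.e. the two hops this echo took to reach the archive.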
(c) 1994, bbs@darkrealms.ca