This week:
It’s Groundhog Day, so here are 3 recent incidents that seem to occur day after day. (Where’s Bill Murray when we need him?)
3 – When someone leaves, make sure they can’t get back in.
2 – 18 hours from start to suicide.
1 – The rush to DeepSeek proves we don’t really care about our privacy.
3 – When someone leaves, make sure they can’t get back in.
“Ex-worker arrested after ‘shutdown’ of British Museum computer systems”
Summary: The British Museum experienced a significant security breach when a former IT contractor, who had been recently dismissed, “allegedly trespassed onto the museum site and was able to shut down various systems”. This incident led to the partial closure of the museum, affecting both its permanent collection and temporary exhibitions.
So what? When someone leaves, make sure they can’t get back in. Revoke their system access, deactivate their swipe cards, and remind them of their ongoing contractual obligations to keep your systems and data secure and confidential.
Source: BitDefender’s Hot for Security blog
2 – 18 hours from start to suicide.
“Jack Sullivan stepped in front of a train while being blackmailed on Instagram and Snapchat [..] just 18 hours after Jack was first contacted on Instagram by someone pretending to be a woman his age.”
Summary: Another day, another sad story about a young person being blackmailed on social media — tricked by a scammer into thinking they were sharing intimate photos with a girl their age, and then threatened with exposure unless they sent money to the criminals. These scams “have killed more American teens in the past two years than ISIS ever has [..] This is the largest coordinated blackmail operation in human history”. Worst of all, some simple changes on the social media platforms would prevent the scams from succeeding.
So what? If you’ve read my article “When will Meta Care About Dead Teenagers”, you’ll know how Paul (and many others) are fighting to get tech giants to make some simple changes to prevent these scams. While these firms continue to put profits over people, talk to your loved ones so they are aware of how the scam works and how they will be targeted. And reassure them that if this crime happens to them, they’re not alone and they will get through this.
Source: Paul Raffile on LinkedIn
1 – The rush to DeepSeek proves we don’t really care about our privacy.
“DeepSeek, a Chinese AI chatbot similar to OpenAI’s ChatGPT, is the most downloaded free app in the U.S. — but its swift rise to the top of the app store charts has raised potent privacy concerns.”
Summary: DeepSeek’s arrival has caused significant stock market disruption after people realised the AI advances made by US tech companies may be easy for others to repeat or improve. But in the rush to sign up, people seemed to disregard the privacy implications of using a tool that explicitly states user data will be stored on servers in China. This user data includes conversations and generated responses, date of birth, username, email address, telephone number, IP address, device identifier, device model, and operating system. And most intriguing of all, “keystroke patterns or rhythms”. Yes, just like TikTok, the service records the timing of when you press keys on your keyboard! China has different privacy and security standards, and if the Chinese government wants a copy of the data, DeepSeek is legally obliged to provide it. The Italian regulator is so concerned about the risks that it has banned the app from app stores while it investigates further.
So what? Let’s not be naive: all tech platforms gather information about us. But many are subject to some form of privacy regulation. Whatever platforms you choose to use, make an informed decision on whether the benefits are worth the risks.
Source: Mashable