The worst part of preparing a tenure portfolio is realizing you actually have to create that 'permanent record' your elementary school teachers threatened you with.
And it has pesky formatting requirements.
@secparam.bsky.social
UMD CS Prof. Security and applied cryptography.
Isn't it worse than that? If your professional account is marked ChatControl-exempt, isn't that a giant gaping red flag telling adversaries to go look at the personal accounts of you, your spouse, anyone you might be having an affair with or owe money to?
17.09.2025 22:02

Best cover for a stego system.
11.09.2025 23:47

There's a very niche case where
1) you succeed at building the quantum computer
2) crypto does migrate to PQ
3) you can still sell recovery services on non-migrated addresses
4) those addresses don't get robbed by others first, and FUD from competing PQ-secure chains doesn't claim they were
What's the value of recovering X% of crypto, discounted by: legal risk it's deemed theft, the chance crypto migrates to PQ-resistant algorithms first, and the risk that BTC/ETH prices collapse the moment everyone realizes the same quantum tech makes ALL legacy crypto vulnerable?
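The discounting in that question is multiplicative, which is why the payoff shrinks so fast. A minimal sketch, with every number a made-up placeholder (none of these probabilities come from the post):

```python
# Hypothetical expected-value sketch for "quantum recovery" of legacy crypto.
# All inputs are illustrative placeholders, not real estimates.

def recovery_value(total_value, recoverable_frac,
                   p_not_deemed_theft, p_no_pq_migration, p_price_holds):
    """Expected payoff after discounting the three risks named above:
    legal risk, PQ migration happening first, and price collapse."""
    return (total_value * recoverable_frac
            * p_not_deemed_theft * p_no_pq_migration * p_price_holds)

# Example: $100B of vulnerable coins, 10% recoverable, and even generous
# odds on each risk leave a heavily discounted payoff.
v = recovery_value(100e9, 0.10, 0.5, 0.3, 0.2)
print(f"${v / 1e9:.1f}B")  # prints $0.3B: multiplicative discounts compound
```

Even before modeling correlations (a price collapse is more likely exactly when recovery succeeds), the independent discounts alone gut the headline number.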
11.09.2025 22:48

If true, this says more about VC funding fads than cryptography. It highlights how hard it is to find valuable applications that classical computers can't approximate well enough. And I have questions for the junior deal partner who modeled the ROI for PQ crypto "recovery."
11.09.2025 22:48

Interesting anecdote from a friend: quantum computing startups are now raising funds by pitching their ability to break cryptocurrency encryption (n=1 plus VC gossip, but still). Apparently other applications like quantum chemistry don't offer big enough ROI for investors.
11.09.2025 22:48 β π 2 π 1 π¬ 1 π 0By the way, if this is predictive typing(unclear) then not just is it on by default, it appears to default to federated learning on your data ( which I of course turned off ) support.google.com/gboard/answe...
10.09.2025 00:32

Some "AI" on my phone is reading inbound Signal messages. I left predictive typing on, trading a little of my privacy for convenience. Yet something is generating responses using what others wrote in chats with disappearing messages, persisting or sharing them who knows where. Not a good default, Google.
09.09.2025 23:59

The Brooklyn one is actually a waterfront park development and a vacant office space, at least as of 4 months ago. So even more on brand.
01.09.2025 21:18

We've crossed a threshold. A paid subscription used to be the ultimate proof of humanity online; now it's not enough to allow a single link click inside the NYT Cooking app. The next few years are going to be an interesting race to extract more and more invasive proofs of humanity.
01.09.2025 21:14

The 2010s internet: Let's mock dissertation-length arguments about weird-ass fanfic tags.
The 2025 internet: 'dubcon' is an ancillary part of the financial privacy discourse.
The past was a better place.
And now is when someone should point out that private compute for AIs and TEEs are not secure enough to make on-by-default chat monitoring a good idea. Because they aren't. It's terribly insecure, especially against hostile governments. It's just... better than without.
06.06.2025 19:27

Private AI needs to be the norm because opting out is impossible for many apps. Take messaging or photo sharing: even if you opt out, the recipient likely has AI enabled, maybe even on by default. Your data ends up in their app's AI cloud. Private compute for AI must be a default.
06.06.2025 19:27

And before anyone says TEEs have imperfect security, the point is they're a massive improvement. And essential in a future where AI assistants get baked into your chat apps and browser, watching every move you make, video you view, and message you send.
06.06.2025 19:27

This is doable today. Apple already has Private Cloud Compute, and Nvidia's H100 GPUs come with Trusted Execution Environments built right in. The pieces are there: your AI conversations could run where even the NYT, OpenAI, and hackers can't snoop.
06.06.2025 19:27

Making LLM chats private is a good idea. We've accepted too much data harvesting already; this moment lets us reset the norm around who controls our data online. But let's go further: put LLM chats in private compute, so you get technical guarantees that you control your data.
x.com/sama/status/...
Classic Google: an A/B test (a rare overt one)
Classic Google AI: it doesn't actually work (you can't submit)
Incidentally, if your Signal is now flooded with work chats, you can organize chats into folders
Settings->Chats->Chat Folders. Looks like it's Android-only for now.
Friend messaged me: Signal's going mainstream. They've got 150+ active chats. Work life invaded their friend space.
It's not just Signal being in the news: people don't trust other apps. There are too many places to half-ass privacy, be it backups, ads, or an AI reading over your shoulder.
It's the year 2030. AIs write all our sitcoms now, but they're just endless FRIENDS clones because the underpaid content moderators in offshore offices learned that's the pinnacle of American comedy.
06.05.2025 17:48

In particular, once you have digital passports/driver's licenses loaded into everyone's devices for privacy-preserving uses, it becomes easier for governments to require non-private uses too. Like attaching your real name to everything you do online.
01.05.2025 23:00

Privacy-preserving identity also raises some questions:
1) Where is it appropriate to require such checks? To pick spicy extremes from US politics: should we age-gate LGBT sex-ed content online? 2nd Amendment content?
2) What guarantees do we have that privacy isn't turned off later?
So what comes next?
Well, one problem with digital IDs (even setting privacy aside) is checking that the ID is actually yours and not someone else's that you copied.
Phones, however, have biometric sensors in them. They can check that the ID is yours without sending data to a third party.
So, this would let you prove to a website you are over 18 while hiding your identity.
Notions of zk-identity have been around for a while, but 1) this is an amazing step toward world-scale usage
2) Some hard work went into making this practical: eprint.iacr.org/2024/2010.
Google announced they will support privacy preserving age verification via zero-knowledge proofs.
You prove you have a signed digital copy of a driver's license and that it says you are over 18, without revealing anything else about you (name, birthdate, etc.).
blog.google/products/goo...
hasn't launched
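The age-check flow described in this thread can be sketched with toy code. To be clear about assumptions: this is NOT a zero-knowledge proof (a real deployment, like the eprint.iacr.org/2024/2010 scheme, proves statements about a signed mDL credential in zero knowledge, with a public-key signature and biometric binding on the phone). Here the issuer simply signs the single derived predicate, to show the shape of selective disclosure; all names are illustrative:

```python
# Toy selective-disclosure sketch: the website learns only "over 18",
# never the name or birthdate. NOT zero knowledge; a real scheme uses
# ZK proofs over a signed credential, and a public-key signature so the
# verifier does not need the issuer's secret (a MAC is used here only
# to keep the sketch short).
import hmac
import hashlib
import secrets

ISSUER_KEY = secrets.token_bytes(32)  # issuing authority's secret (toy)

def issue_credential(name: str, birth_year: int) -> dict:
    """Issuer inspects the full ID, then signs only the derived predicate."""
    over18 = 2025 - birth_year >= 18
    tag = hmac.new(ISSUER_KEY, f"over18={over18}".encode(), hashlib.sha256)
    # Note: no name or birthdate appears in what the holder presents.
    return {"over18": over18, "sig": tag.hexdigest()}

def verify(cred: dict) -> bool:
    """Verifier checks the issuer's tag on the predicate, nothing more."""
    expected = hmac.new(ISSUER_KEY, f"over18={cred['over18']}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cred["sig"]) and cred["over18"]

cred = issue_credential("Alice Example", 1990)
print(verify(cred))  # True: age gate passes with no identifying data sent
```

Even this toy shows the two gaps the thread raises: the credential is replayable by anyone who copies it (hence the biometric binding on the phone), and nothing stops the issuer from later demanding the full attributes instead of the predicate.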
23.04.2025 23:33

But the WhatsApp AI opt-out is a fig leaf: it has to be enabled per chat. And they won't even be explicit that it's opting out of a security downgrade caused by AI features in WhatsApp. Instead it's phrased as an advanced feature most users can ignore.
23.04.2025 22:41

A responsible roll-out of AI in an **encrypted** chat app should require opt-in by the message author.
Instead, WhatsApp is saying: look, the recipient gets to decide if they can upload your texts/photos to Facebook's AI. They clearly know this is creepy, hence the opt-out.
"When the setting is on, you can block others from exporting chats, auto-downloading media to their phone, and using messages for AI features"
But the last bit only makes sense for AI features inside WhatsApp that WhatsApp built; anything else is covered by "exporting chats."