"Cheetah" Performance Upgrades to GhostPII

NEW RELEASE: Take a moment to get the details on our extremely sexy new release of Ghost PII improvements.

Dubbed "Cheetah" it includes a variety of world-class performance improvements - the real action is deep in the back-end, but the benefits for applications like encrypted search and machine learning are tangible and substantial. Private computation needs to be something everyday people can use and this includes the ability to build familiar models at scale on a timeline that fits into your day.

Cheetah is going to take good care of you in a way you may not be able to find elsewhere.

#ArtificialIntelligence #Python #Privacy #CyberSecurity

Facial recognition nightmares...

#FacialRecognition technology keeps coming up in the news as an #ArtificialIntelligence technology that seems to generate especially nightmarish #DataPrivacy issues (and maybe just general nightmares). In this video I tell the story of a woman who tried to attend a show with her daughter's Girl Scout troop, only to be instantly singled out by security and ejected... because she worked for a law firm that was involved in litigation against the venue owner. She got to cool her heels outside waiting out the show, separated from her daughter. Oof...

FTX looking forward: Possible clawbacks and consequences

In many ways #FTX appears to be a very classic sort of scam writ large. In this video, I will talk a bit about what history tells us to expect going forward from here. I think the possibility of #LawEnforcement clawbacks of money spent by #SBF is particularly fascinating here given how freely he spent, and into the pockets of so many influential people. I sketch out what might happen and talk a bit about some other historical clawbacks, including the fallout from #BernieMadoff's charitable giving being clawed back to compensate victims.

Meta's fine, the special role of Ireland, and scraping as breach

The #DataPrivacy regulator in #Ireland recently fined #Meta, parent company of #Facebook, $275 million for a breach last year. In this video I elaborate on a couple of weird situations embodied in this event, including the awkward role that Ireland plays enforcing #GDPR disproportionately for #Europe as a whole, and also the legal haze regarding when letting people scrape your website too easily should be called a #DataBreach.

Litecoin as a privacy coin - How is it going?

#Crypto will likely find itself at a crossroads following the #FTX collapse, so I thought it would be a good time to revisit the world of #privacy coins, #Litecoin especially, and check in on how some of the themes discussed in my past videos have played out. To recap, #Litecoin is an old #Bitcoin clone that recently adopted sexy new privacy features to differentiate itself. You can see the cons have played out to a small degree, with two small exchanges in South Korea de-listing it over #AntiMoneyLaundering concerns, but also maybe some signs of the pros, in that the forward look around privacy has helped feed perceptions that Litecoin is a "good" project worthy of surviving crypto winter.

Basic ideas of multi-party computation

THE MINI-SERIES CONTINUES: multi-party computation edition.

I focus primarily on defining and motivating multi-party computation, but I also briefly outline some of my remaining topics, like federated learning and zero-knowledge proofs. I will start to emphasize a little more the boundaries between these ideas (or the lack thereof) and tell you a bit more of the big story about the big shared ideas.
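
For readers who want something concrete before watching, here is a toy Python sketch of additive secret sharing, one of the standard building blocks behind multi-party computation. This is an illustration of the general idea only, not how Ghost PII itself works; the party names and modulus are arbitrary.

```python
# Toy additive secret sharing: each party splits its private value into
# random shares, no single share reveals anything, yet the parties can still
# jointly compute the sum of everyone's inputs.
import random

MODULUS = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(secret, n_parties):
    """Split `secret` into n additive shares that sum to it mod MODULUS."""
    shares = [random.randrange(MODULUS) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % MODULUS)
    return shares

def reconstruct(shares):
    """Recombine shares to recover the value they encode."""
    return sum(shares) % MODULUS

# Three parties with private inputs they never reveal directly.
private_inputs = {"alice": 42, "bob": 17, "carol": 99}

# Each party splits its input into shares and distributes them.
all_shares = {name: share(value, 3) for name, value in private_inputs.items()}

# Party i locally adds the i-th share it received from everyone...
local_sums = [sum(all_shares[name][i] for name in all_shares) % MODULUS
              for i in range(3)]

# ...and only the combined result, the sum of all inputs, is ever opened.
print(reconstruct(local_sums))  # 158, with no party seeing another's input
```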

Securing voting machines with encrypted data-in-use

As it is Election Day in the United States, I have decided to set aside regularly scheduled programming to talk about the power of encrypted data-in-use techniques to better secure voting and voting machines. A little more obvious is how these techniques could improve the privacy of your vote, and a little less obvious is how powerful they are for preventing tampering and improving auditability. I close by explaining the more general cybersecurity relevance of these ideas and give a bit of a teaser trailer for some future videos with the really sexy big ideas about transforming what it means to own data.

Shared pros and cons of synthetic data and differential privacy

I continue my mini-series on privacy-enhancing technologies with a two-part mini-mini-series on synthetic data and differential privacy. In addition to giving some very basic definitions, I will focus this week especially on some shared pros and cons between these two techniques. Next week, I will admit to some minor lies I told about differential privacy and clean things up to give the full story of that technique.
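
As a concrete companion to the basic definitions, here is a minimal Python sketch of the Laplace mechanism, the textbook construction behind differential privacy. It is not necessarily the example from the video, and the dataset and epsilon below are made up for illustration.

```python
# Minimal sketch of the Laplace mechanism. A counting query has sensitivity 1
# (adding or removing one person changes the count by at most 1), so adding
# Laplace noise with scale 1/epsilon yields an epsilon-differentially-private
# answer to the query.
import math
import random

def laplace_noise(scale):
    """Sample from a Laplace(0, scale) distribution via inverse transform."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon):
    """Return a noisy, privacy-preserving count of records matching `predicate`."""
    true_count = sum(1 for r in records if predicate(r))
    sensitivity = 1.0  # one person can change a count by at most one
    return true_count + laplace_noise(sensitivity / epsilon)

# Example: how many people in a toy dataset are over 40?
ages = [23, 35, 41, 52, 29, 67, 44]
print(private_count(ages, lambda age: age > 40, epsilon=0.5))
```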

Privacy tech mini-course overview

MINI-SERIES EVENT! Over the course of Quarter 4, I will be posting a series of videos giving an overview of what are increasingly grouped together as "privacy-enhancing technologies" or PETs. (This will include homomorphic encryption, secure multi-party computation, zero-knowledge proof, differential privacy, synthetic data, and maybe a little bit on trusted execution environments.) I will take special care both to point out why the business side might decide to care about these things and to clarify the rather problematic and messy nomenclature in my last sentence's parenthetical word salad.

This video is a whirlwind tour of what is to come. A specific schedule is coming soon.

#DataScience #Privacy #Cybersecurity

Uber's 2016 Incident and Joe Sullivan's Criminal Liability

It's very unusual these days for an executive to face criminal liability related to corporate wrongdoing, so the story of ex-Uber CISO Joe Sullivan is a notable one. I discuss the relevant 2016 data breach and extortion incident, the key legal role played by notification of law enforcement (or the lack thereof in this case), the gray zone around bug bounty programs, and the broader debate about when criminal liability is appropriate. I think this story has many interesting angles, and I will also digress a little into executive compensation, the Theranos trials, and the legal meaning of the "O" in your favorite "C?O" title.

Uber's unpleasantly classic trainwreck of a data breach

Uber's recent breach is about as classic as it gets. I talk about what is classic in it, including the role of social engineering, lateral movement and using some access to get more, and the sad predictability of who got taken. I also discuss some unique wrinkles, including the hacker announcing the breach in an internal company chat, only to be met with jeering and disbelief.

Yowza! Data Breaches: Public Sector Edition

The government loses your data, too, and when it does it probably does it in a way that intersects something else you were nervous about. In this video I discuss the recent thefts of 1) names and addresses for ALL concealed carry permit holders in California, taken from a misbegotten Department of Justice portal, and 2) billions of records of comprehensive personal information, probably describing more or less everyone in China and probably stolen from the Shanghai National Police database. Both stories are evolving...

The (uniquely flexible) Ghost PII permissions system

You share data with someone or you don't, right? Wrong!

In this video I continue my mini-series on how Ghost PII provides encrypted data-in-use techniques and a flexible permissions system, so you can allow a partner to compute just what you want to let them compute and just how you want them to compute it. It's a great way to regulate risk, maintain auditability into how partners use your data, and escape the either / or of wholesale sharing or not sharing at all - and escaping that either / or will allow you to get some exciting things done you might not have otherwise. #DataScience #Python #DataPrivacy #CyberSecurity
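
To make the idea of per-operation permissions concrete, here is a tiny hypothetical sketch in Python. The class, partner names, and operations below are illustrative inventions, not Ghost PII's actual API; the point is only that a grant can name specific computations while everything else stays off the table and every request leaves an audit trail.

```python
# Hypothetical per-operation permissions: a partner may run only the
# computations it has been granted, and every request is recorded for audit.
# These names are illustrative only and are not Ghost PII's API.
import statistics

class ComputePolicy:
    """Maps each partner to the set of operations it is allowed to run."""

    OPERATIONS = {
        "mean": statistics.mean,
        "stdev": statistics.stdev,
    }

    def __init__(self, grants):
        self.grants = grants      # e.g. {"analyst_co": {"mean"}}
        self.audit_log = []       # every request is logged, allowed or not

    def run(self, partner, operation, data):
        allowed = operation in self.grants.get(partner, set())
        self.audit_log.append((partner, operation, "allowed" if allowed else "denied"))
        if not allowed:
            raise PermissionError(f"{partner} may not run '{operation}'")
        return self.OPERATIONS[operation](data)

policy = ComputePolicy({"analyst_co": {"mean"}})
print(policy.run("analyst_co", "mean", [3, 5, 8]))   # permitted
# policy.run("analyst_co", "stdev", [3, 5, 8])       # would raise PermissionError
```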

Restricted analytics on encrypted data with no decryption

Have you ever felt like you had a good reason to put data someplace but ran into too much static regarding data privacy or cybersecurity? Maybe you had a prototype that was only allowed on a test environment, but your ideal test data was only allowed on prod. Maybe you wanted to share data with an external partner, but one that compliance felt couldn't maintain cybersecurity standards comparable to your own. If you have run into these kinds of headaches, maybe we can help. In this example, we encrypt some data, pool it while encrypted, and show how the owners can give a third-party analyst the ability to do some computations (mean, stdev) on that pooled, encrypted data but not others (decryption). #Cybersecurity #DataScience #Python
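
To give a flavor of this workflow in code, here is a minimal sketch using the open-source python-paillier library (pip install phe) rather than Ghost PII itself. The values, variable names, and two-owner setup are made up, and Ghost PII's own protocol is different; this just shows the core idea that an analyst can combine encrypted values into an encrypted result without ever being able to decrypt the underlying data.

```python
# Minimal homomorphic-encryption sketch with python-paillier: ciphertexts can
# be added together and scaled by constants, but only the private key holder
# can decrypt anything.
from phe import paillier

# The data owners hold the keypair; the analyst only ever sees the public key.
public_key, private_key = paillier.generate_paillier_keypair()

# Two owners encrypt their values under the shared public key and pool them.
owner_a_values = [public_key.encrypt(x) for x in [12.0, 19.5, 7.25]]
owner_b_values = [public_key.encrypt(x) for x in [30.0, 4.5]]
pooled = owner_a_values + owner_b_values

# The analyst computes an encrypted mean: summing ciphertexts and multiplying
# by a constant are allowed, decryption is not.
encrypted_total = pooled[0]
for ciphertext in pooled[1:]:
    encrypted_total = encrypted_total + ciphertext
encrypted_mean = encrypted_total * (1.0 / len(pooled))

# Only the owners, who kept the private key, can open the result.
print(private_key.decrypt(encrypted_mean))  # ~14.65, the mean of the pooled data
```

Note that this sketch only covers the mean; a standard deviation needs products of ciphertexts, which a purely additive scheme like Paillier cannot compute on its own.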
