PVML combines an AI-centric data access and analysis platform with differential privacy.


Image credit: PVML

Enterprises are collecting more data than ever to fuel their AI ambitions, but at the same time, they are worried about who can access that data, which is often private in nature. PVML offers an interesting solution: it combines a ChatGPT-like tool for analyzing data with the guarantees of differential privacy. Using retrieval-augmented generation (RAG), PVML can access a corporation’s data without moving it, taking another security concern off the table.

The Tel Aviv-based company recently announced that it has raised an $8 million seed round led by NFX with participation from FJ Labs and Gefen Capital.

Image credit: PVML

The company was founded by husband-and-wife team Shachar Schnapp (CEO) and Rena Galperin (CTO). Schnapp earned his doctorate in computer science, specializing in differential privacy, and then worked on computer vision at General Motors, while Galperin earned her master’s in computer science with a focus on AI and natural language processing and worked on machine learning projects at Microsoft.

“A lot of our experience in this domain comes from our work in large corporations, where we saw that things weren’t as efficient as we had hoped for as naive students,” Galperin said. “The main value we want to bring to organizations as PVML is to democratize data. That can only happen if you protect this highly sensitive data on the one hand but allow easy access to it on the other, which these days is synonymous with AI. Everyone wants to analyze data using free text, and our secret sauce, differential privacy, is what enables that integration so easily.”

Differential privacy is far from a new concept. The basic idea is to guarantee the privacy of individual users in large data sets, with mathematical backing for that guarantee. One of the most common ways to achieve this is to introduce a degree of randomness into the data set, but in a way that does not materially alter the results of analyzing the data.
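As a rough illustration of that idea (this is a generic textbook mechanism, not PVML's own implementation), the classic Laplace mechanism adds calibrated noise to a query result: the noise scales with how much any single individual can influence the answer and with the chosen privacy budget. A minimal sketch in Python, with made-up data:

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Return a differentially private estimate of a query result.

    Adds Laplace noise calibrated to the query's sensitivity (the most any
    single individual can change the result) and the privacy budget epsilon:
    smaller epsilon means more noise and stronger privacy.
    """
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# Example: privately release a count over a toy dataset.
# Adding or removing one person changes a count by at most 1, so sensitivity = 1.
ages = [34, 29, 41, 52, 38, 27, 45]
true_count = sum(1 for a in ages if a > 40)
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(f"true: {true_count}, private: {private_count:.2f}")
```

The smaller the epsilon, the noisier the released answer; tuning that trade-off between privacy and accuracy is exactly what differentially private systems have to get right in practice.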

The team argues that today’s data access solutions are inefficient and create a lot of overhead. Often, for example, data has to be redacted in the process of giving employees secure access to it, but that can be counterproductive because the redacted data may not be usable for certain tasks (and the added lead time to access additional data means that real-time use cases are often impossible).

Image credit: PVML

The promise of using differential privacy is that PVML’s users do not need to make any changes to the original data. That avoids almost all of the overhead and safely unlocks this information for AI use cases.

Virtually all major tech companies now use some form of differential privacy and make their tools and libraries available to developers. The PVML team argues, though, that most of the data community has yet to put it into practice.

“Current knowledge about differential privacy is more theoretical than practical,” Schnapp said. “We decided to turn it from theory to practice. And that’s exactly what we’ve done: we develop practical algorithms that work best on data in real-life scenarios.”

None of the differential privacy guarantees would matter much, though, if PVML’s actual data analytics tools and platform weren’t useful. The most obvious use case here is the ability to chat with your data, all with the guarantee that no sensitive information can leak into the chat. Using RAG, PVML can bring hallucinations down to almost zero, and the overhead is minimal since the data stays in place.
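To make the RAG part concrete, here is a minimal, generic sketch (not PVML's actual pipeline; the retriever, the records, and the prompt format are all invented for illustration). The key point is that only the handful of retrieved records are placed into the prompt, so the model's answer stays grounded in the source data rather than in its own guesses:

```python
from typing import List

def retrieve(query: str, documents: List[str], k: int = 2) -> List[str]:
    """Rank documents by naive keyword overlap with the query.

    A real system would use vector embeddings and a proper index; this
    stand-in just keeps the example self-contained.
    """
    q_terms = set(query.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q_terms & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query: str, context: List[str]) -> str:
    """Ground the model on retrieved records so answers stay tied to the data."""
    context_block = "\n".join(f"- {c}" for c in context)
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context_block}\n"
        f"Question: {query}\n"
    )

# Hypothetical corporate records, queried in place rather than copied out.
records = [
    "Q3 revenue for the retail unit was 4.2M USD.",
    "Headcount in the Tel Aviv office grew 12% in 2023.",
    "Q3 revenue for the logistics unit was 1.8M USD.",
]
prompt = build_prompt("What was Q3 revenue for the retail unit?",
                      retrieve("Q3 revenue retail", records))
print(prompt)  # This prompt would then be sent to the chat model.
```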

But there are other use cases. Schnapp and Galperin note how differential privacy also allows companies to share data between business units. It may also allow some companies to monetize access to their data for third parties, for example.

“In the stock market today, 70% of transactions are made by AI,” said Gigi Levy-Weiss, general partner and co-founder of NFX. “This is a taste of things to come, and organizations that embrace AI today will be a step ahead tomorrow. But companies are afraid to connect their data to AI, because they fear exposure, and for good reason. PVML’s unique technology creates an invisible layer of security and democratizes access to data, enabling monetization use cases today and paving the way for tomorrow.”


