Ameesh Divatia is the co-founder and CEO of Baffle, a company focused on integrating data protection into every aspect of the data pipeline to simplify cloud data security and minimize the impact of data breaches.
Its platform offers a no-code, easy-to-deploy solution that secures sensitive data without affecting performance or requiring changes to applications. Baffle's technology is compatible with major cloud providers such as AWS, Azure, IBM, and GCP. Serving a wide range of customers, from Fortune 25 companies to small and medium businesses, Baffle protects over 100 billion records worldwide, working with system integrators for efficient deployment.
What motivated you to co-found Baffle, Inc., and how did your earlier entrepreneurial experiences shape your approach in the early stages of the company?
After my last company's exit, I took a much-needed break to recharge and think about what I really wanted to do next. I've always loved building companies, so I started having conversations with an early-stage VC friend of mine, and he introduced me to Priyadarshan "PD" Kolte, who would become my co-founder. He challenged us with an intriguing question, disguised as a challenge: "How do you get value from data while still protecting it?" That challenge hooked me; solving tough problems is what I live for. There was a glaring gap in data security, especially around simplifying encryption and protecting data 'in use'. Nine years later, here we are, answering that question with Baffle.
With the rise of generative AI, how can companies ensure that their data remains secure while still leveraging the benefits of AI technologies?
This is a question every company dabbling in AI should be asking. Security and innovation often feel like two opposing forces, but they don't have to be. The key is a breakthrough innovation called Privacy Enhanced Computation (PEC) that starts with encryption: keeping data protected at rest, in transit, and while in use. By encrypting sensitive data before it reaches the AI models, and then using PEC to process it, you can still get the insights you need without compromising security. It's about staying ahead of the game, updating security protocols, and leveraging tools like Baffle to mitigate risks. You don't have to sacrifice innovation for security.
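To make the idea of protecting data before it reaches a model concrete, here is a minimal illustrative sketch (not Baffle's product or PEC itself): sensitive values are replaced with stable placeholder tokens before a prompt is ever sent to a generative model, with the real values held back in a separate mapping. The regex, token format, and vault structure are all assumptions for illustration.

```python
import re

# Illustrative sketch only: tokenize sensitive fields before a prompt
# reaches a generative model, so the model operates on placeholders
# rather than raw PII. The pattern and token scheme are hypothetical.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def tokenize_pii(text: str, vault: dict) -> str:
    """Replace each SSN with a stable placeholder, recording the mapping."""
    def repl(match: re.Match) -> str:
        token = f"<SSN_{len(vault)}>"
        vault[token] = match.group(0)  # real value never leaves the vault
        return token
    return SSN_RE.sub(repl, text)

vault: dict = {}
prompt = tokenize_pii("Customer 123-45-6789 reported an issue.", vault)
print(prompt)            # Customer <SSN_0> reported an issue.
print(vault["<SSN_0>"])  # 123-45-6789
```

The model only ever sees `<SSN_0>`; the mapping back to the real value stays under the security layer's control.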
Can you explain the specific role of encryption in protecting AI-generated data and models? How does it differ from traditional data protection methods?
Encryption for AI data is like wrapping your most valuable asset in bubble wrap: no matter how much it's tossed around, it stays protected. Think of it as locking the data even while you're using it. Traditional methods focus on securing data when it's not in use (at rest) or when it's moving (in transit). But with AI, we're adding a new layer of complexity because the data needs to stay encrypted even while it's being crunched by models. Baffle focuses on this "data-in-use" protection, ensuring performance isn't impacted and security isn't sacrificed.
Baffle recently launched a data protection solution specifically for GenAI projects. Can you share more details about how this solution works and what makes it unique in the market?
Our GenAI solution is all about making encryption simple and efficient, even when you're working with AI. It plugs into an existing AI pipeline by protecting data as it is being ingested. That is followed by a capability known as real queryable encryption that processes the data without exposing it. Most importantly, you don't need to change anything in your AI pipeline: no rewriting code, no hassle. Just plug it in and go. We've focused on ease of use and on making sure security doesn't get in the way of innovation, which is why customers are finding this solution so attractive.
Your platform emphasizes "no code" changes for implementing data protection. How does this approach benefit companies, especially those with large, complex data pipelines?
No one wants to break something that's already working. With our "no code" approach, companies don't need to tear apart their existing applications or data movers to add encryption. That's a huge benefit for large organizations with complex data pipelines because it means they can bolster security without risking disruptions. It's faster, easier, and removes a lot of the headaches that typically come with integrating new tech.
How does Baffle's Real Queryable Encryption differ from other encryption methods, and what advantages does it offer for companies handling large-scale data analytics?
Real Queryable Encryption is our secret sauce. Unlike traditional encryption, which requires you to decrypt data in the data store before analyzing it (and thereby exposing it), we let you run queries on the encrypted data itself. It's like having your cake and eating it too: you get the insights without risking security. That's a game-changer, especially for companies dealing with huge amounts of sensitive data, such as in finance or healthcare, where compliance is non-negotiable.
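Baffle's Real Queryable Encryption is proprietary, but the general idea behind equality queries over protected data can be sketched with a keyed, deterministic transform: equal plaintexts produce equal tokens, so the data store can match values without ever seeing cleartext. The key, schema, and HMAC-based scheme below are illustrative assumptions, not Baffle's implementation.

```python
import hmac
import hashlib

# Conceptual sketch: a keyed deterministic transform lets the data store
# answer equality queries without decrypting. Key and schema are hypothetical.
KEY = b"demo-key-held-by-the-security-layer"

def protect(value: str) -> str:
    """Deterministically tokenize a value; equal inputs yield equal tokens."""
    return hmac.new(KEY, value.encode(), hashlib.sha256).hexdigest()

# The "database" stores only protected tokens, never plaintext.
records = [
    {"id": 1, "ssn": protect("123-45-6789")},
    {"id": 2, "ssn": protect("987-65-4321")},
]

# An equality query is rewritten to compare tokens; the store resolves
# the match without decrypting anything.
query_token = protect("123-45-6789")
matches = [r["id"] for r in records if r["ssn"] == query_token]
print(matches)  # [1]
```

Deterministic schemes like this trade some privacy (equal values are linkable) for queryability, which is why production systems layer additional protections on top.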
Data-in-use protection is a critical feature of Baffle's platform. Can you explain how this works and why it's essential for companies, particularly in the context of GDPR and other data privacy regulations?
When data is in use, meaning it is being processed by systems, it's usually at its most vulnerable. That's why protecting it in real time is essential, especially with regulations like GDPR, which require a posture known as 'data protection by design'. Our platform ensures that even while data is being processed, it is still encrypted. This approach eliminates the risky window of exposure where data breaches often happen, helping companies stay compliant and safe.
As AI models become more complex, what are the main challenges in securing these models against adversarial attacks, and how does Baffle address these challenges?
AI models are getting smarter, but so are attackers. Adversarial attacks, where bad actors try to manipulate the data that influences an AI model's output, are a growing concern. We tackle this by focusing on the data side. By encrypting the data that the AI models rely on, we make it much harder for anyone to tamper with the model's integrity. It's like giving the AI model a locked vault of data: no one is getting in without the key.
Can you discuss the importance of role-based access control (RBAC) in modern data security strategies, especially for organizations using multi-tenant cloud environments?
In multi-tenant cloud environments, RBAC is a must-have. Imagine you've got a group of people all sharing the same cloud infrastructure. Without RBAC, it's like giving everyone access to the whole building instead of just their office. Our platform integrates RBAC so that only authorized people, based on their individual role or credential, can access sensitive data, keeping things locked down tight and reducing the risk of breaches.
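The core of RBAC can be sketched in a few lines: each role maps to the set of operations it may perform, and every access request is checked against that mapping before any data is released. The role names and operations below are made up for illustration and do not reflect Baffle's actual policy model.

```python
# Minimal RBAC sketch (illustrative only; roles and operations are hypothetical).
# Each role maps to the operations it may perform; access is checked
# against the caller's role before any sensitive data is released.
ROLE_PERMISSIONS = {
    "analyst": {"read_masked"},            # sees masked values only
    "dba":     {"read_masked", "manage"},  # administers, but no cleartext
    "auditor": {"read_cleartext"},         # narrowly scoped cleartext access
}

def can_access(role: str, operation: str) -> bool:
    """Return True only if the role explicitly grants the operation."""
    return operation in ROLE_PERMISSIONS.get(role, set())

print(can_access("analyst", "read_cleartext"))  # False
print(can_access("auditor", "read_cleartext"))  # True
```

Unknown roles fall back to an empty permission set, so the default is deny, which is the posture you want in a shared, multi-tenant environment.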
Baffle has seen significant growth in recent years, with your revenues doubling in the past year. What do you attribute this growth to, and how do you plan to continue this trajectory?
We're riding a wave of demand because we've built the right solution for the right problem. Our growth comes down to one thing: we're solving a problem that every company faces, which is data security. With cyber threats on the rise and regulations getting tougher, companies are turning to us for solutions that work without slowing them down. Our focus on real queryable encryption and ease of use is a big reason for that growth. Moving forward, we plan to keep pushing the envelope on innovation, expanding our products, and building strong partnerships that take us into new markets.
Thank you for the great interview; readers who wish to learn more should visit Baffle.