Imagine you’re on a phone call with your sales rep—you catch up, swap some stories, and eventually discuss some numbers and updates on the new leads. A few minutes later, you receive an automated email summarizing the conversation, including relevant action items.
This is the dream of Surbhi Rathore from Symbl AI, a company developing the technology to interpret natural conversations and convert them into useful data.
In this week’s UpTech Report, Surbhi discusses the origins of this idea, how it works, and why she’s developing it as an API for all to use.
More information: https://symbl.ai/
TRANSCRIPT
DISCLAIMER: Below is an AI generated transcript. There could be a few typos but it should be at least 90% accurate. Watch video or listen to the podcast for the full experience!
Surbhi Rathore 0:00
If we develop a platform and make this available as APIs, we can help developers and businesses go to that next stage without having to spend so much time, effort and mindspace thinking about how to develop this in-house or how to acquire companies like that, and democratize the intelligence so that products can actually become natively intelligent for conversations.
Alexander Ferguson 0:31
Imagine you’re on a phone call with a sales rep: you catch up, swap some stories, and eventually discuss some numbers and updates on the new leads. A few minutes later, you receive an automated email summarizing the conversation, including relevant action items. This is the dream of Surbhi Rathore from Symbl AI, a company developing the technology to interpret natural conversations and convert them into useful data. In this week’s UpTech Report, Surbhi discusses the origins of this idea, how it works, and why she’s developing it as an API for all to use. Surbhi, I’m excited to be with you and hear more about Symbl AI and the direction that you’re headed. To start us off, I’m going to ask you to describe your company in five seconds.
Surbhi Rathore 1:15
APIs for natural conversation understanding.
Alexander Ferguson 1:19
Oh, there we go, and I got it: APIs for natural conversation understanding. So this is a company that you started and co-founded two years ago? That’s right, the journey began almost two years ago. And this is the first business that you’ve led? That’s right. And the market that you’re focused on, what would you say to that? What is the industry or real target market that you’re looking to serve?
Surbhi Rathore 1:47
These are developers that are building voice intelligence or conversational intelligence products to superpower and create next-generation experiences.
Alexander Ferguson 1:59
Love it. And even on your site, it says you’re creating a programmable platform around natural language conversation. So help me understand the problem even more. If you had to describe the problem that you saw, the one that made you say, “we’ve got a solution for this,” what would it be?
Surbhi Rathore 2:13
Yeah, just to step back a little bit: I have been working in the conversational AI space, and we are familiar with chatbots and virtual assistants and how they’re powering and replacing small tasks that people do. When we were working in the space, we saw an opportunity, and more so a problem: there was no solution to analyze natural conversations like the one that we are having. What happens to all the conversation data that gets generated? It’s kind of the last mile of data available anywhere. So it’s about how to capitalize on that, how to create growth opportunities for businesses, how to make businesses effective and not do manual work while they are engaging in a conversation, and superpower their own experiences.
Alexander Ferguson 2:57
It was intriguing when I first started looking into this, because there are a lot of people developing solutions that try to analyze conversations as well, but they’re building the whole solution, the end product. You’re not actually trying to create an end product for the end customer, but rather a platform that others can build upon. Can you speak to that vision of why you’re doing this?
Surbhi Rathore 3:19
Yeah, when we started the company, I think we started a little late. That’s why. There were already products in the market like Gong, Chorus, Sonia, Fireflies, and they were trying to build experiences for different domains, different market segments. And then we looked at it even more granularly, looking at what’s the core technology that everyone is using, and it’s kind of a foundational layer of conversational intelligence, which is then tailored to support their own domain experiences. We also saw a lot of acquisitions happening: Dialpad acquired TalkIQ, and SalesLoft acquired NoteNinja, and that was just to get this technology integrated within the platform. You’re like, okay, if we develop a platform and make this available as APIs, we can help developers and businesses go to that next stage without having to spend so much time, effort and mindspace thinking about how to develop this in-house or how to acquire companies like that, and democratize the intelligence so that products can actually become natively intelligent for conversations.
Alexander Ferguson 4:25
What kind of customer base have you been able to build up that’s now able to use your platform on a regular basis?
Surbhi Rathore 4:32
Collaboration and communication products are the market segment that we were very focused on, because that’s where conversations between people happen in huge volumes, and now remote collaboration has increased so much. We’ve seen usage of video, and not just video conferencing but also text collaboration platforms. Everyone is on channels like Slack, Glip, Flock; Zoom is increasing its usage day by day, in addition to Microsoft Teams, and there are 50,000 products like this across the world. So that was our first target market to start with: to say, how can we provide you with a simple use case of extracting action items from the conversation, grouping the conversation into contextual topics, identifying important questions that were asked, or putting together the ideas that were generated in a Confluence page, things like that.
Alexander Ferguson 5:29
What’s your business model for then making all this work?
Surbhi Rathore 5:33
Yeah, so we have pay as you go. It’s kind of straightforward. We do offer committed volumes for customers: when our developers build an application and move it into production with some customers, they have some tentative usage idea, and they want to cap the usage beyond that to make it predictable. So for them, we have committed volumes. And we have always been working with enterprises. Since we started, it was kind of a weird way that we started: instead of working with developers first, we started working with enterprises first. So we’ve always had enterprise plans, and we’ve supported enterprises also because both me and my co-founder come from working in enterprises.
Alexander Ferguson 6:13
So now, moving forward, can you explain more about your current technology and the direction that you’re going, how you’re planning on continuing to build on it?
Surbhi Rathore 6:23
Sure. So we provide an interface of APIs and SDKs which integrates over both voice and text interfaces. For real-time voice streaming applications that happen on telephony or SIP-based systems, or on WebSocket- or WebRTC-based systems, we provide an interface for that. We’re now providing interfaces for recorded audio, and even text support. Some companies have already built or integrated transcription vendors, but they’re looking for transcription plus plus, like what comes after transcription. So here we come after transcription, but we also support an interface of giving us just transcripts: don’t worry about giving us audio all the time. So that’s kind of our interface. At the core, we have developed something that we call the contextual conversation intelligence platform. And really the core of the platform is, I don’t know if you’ve heard about hybrid AI or deep understanding, which is basically a combination of using classical AI and machine learning along with deep learning. There’s an industry term for it, which we’ll definitely talk about, called deep understanding, where you’re really using the best of both worlds to build that intelligence platform. That’s kind of the foundation of the product.
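The text interface Surbhi describes, sending already-transcribed messages instead of streaming audio, might look something like this minimal sketch. Note that the function name, payload shape, and field names here are assumptions for illustration only, not Symbl's actual API:

```python
# Hypothetical sketch of the "just give us transcripts" interface.
# The payload structure below is an assumption, not Symbl's real schema.

def build_text_payload(messages):
    """Package (speaker, text) pairs as a JSON-ready payload for a
    hypothetical text-analysis endpoint."""
    return {
        "messages": [
            {"payload": {"content": text}, "from": {"name": speaker}}
            for speaker, text in messages
        ]
    }

payload = build_text_payload([
    ("Alex", "Can you send the revised proposal by Friday?"),
    ("Sam", "Sure, I'll email it to you tomorrow."),
])
# The resulting dict could then be POSTed to the API with any HTTP client.
```

The point of such an interface is exactly what Surbhi calls "transcription plus plus": a developer who already has an ASR vendor can skip audio entirely and submit text for analysis.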
Alexander Ferguson 7:48
I love that. Can you expand even further, giving a use case of how that works in a real-world situation?
Surbhi Rathore 7:56
Sure. So imagine that you and I are having a business conversation, and there’s a stream of audio coming in. We first use speech-to-text to convert it into text. Then we have our own system that works on the text data; we have a text correction layer there. Because we are agnostic to multiple ASR vendors, we know what the common mistakes are, so we fix those and normalize the data. Then it goes to our system, which we call the comprehension engine; that’s what we refer to that system as internally. That’s where all our context understanding is built. We model the conversation in detail in order to figure out what’s happening, where it is happening, how the events relate to each other, what the reasons for these events are, how they categorize into the same kind of bucket, and everything. Once we model the conversation, there is an insight engine sitting on top of it which extracts the key information, like action items: okay, at this point there were to-dos, at this point there were ideas, and at this point there were questions. The combination of the comprehension and insight engines together generates these insights, which are then pushed back through the API in a JSON format. We also have an out-of-the-box user experience, so developers can just take that and embed it into their product, which really reduces time to go to market. So that’s kind of the end-to-end flow.
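On the consuming side, the JSON that the insight engine pushes back could be handled with something like the sketch below. The field names ("type", "text") and insight type labels are assumptions for illustration, not the documented response schema:

```python
# Hypothetical sketch: bucketing a flat list of insight objects by type,
# the kind of post-processing a developer might do with the JSON output.
from collections import defaultdict

def group_insights(insights):
    """Group insight objects by their type (e.g. action_item, question)."""
    grouped = defaultdict(list)
    for item in insights:
        grouped[item["type"]].append(item["text"])
    return dict(grouped)

insights = [
    {"type": "action_item", "text": "Send the proposal by Friday"},
    {"type": "question", "text": "What is the budget for Q3?"},
    {"type": "action_item", "text": "Schedule a follow-up call"},
]
print(group_insights(insights))
```

Grouped this way, the to-dos, ideas, and questions Surbhi mentions can each be rendered in their own section of a meeting summary.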
Alexander Ferguson 9:23
Is there a preset number of variables that you’ve created, like “here are our action items, here are the to-dos that come out of a meeting,” that you’re constantly updating and modifying? Or how does that work?
Surbhi Rathore 9:37
So we have some basic insight items, as we call them, which are available out of the box, but any developer can choose to create more kinds of insights based on the conversation metadata. We were very focused initially on making sure that we give out as much metadata from the conversation as possible: every single word, the timestamp of the word, all the variations of the spoken timings, the alignment to script, things like who is the person giving the most action items, all these different metrics. You can figure all this out when you have all the conversation metadata, so all of this is always pushed through the APIs. But in addition to that, we give you usable components, the insight items, which can create immediate functional experiences. That is a little difficult if you use standard NLP APIs, right? Because they will give you parts of speech, which is like: okay, now you have the word and the noun and the phrases, now figure out what to do with it. What we are trying to say is that we have figured out use cases, like sales intelligence, or augmenting a call center agent while they’re on a call with the relevant knowledge base articles, or real-time actions in a meeting, or post-meeting follow-ups, or even recommending calendar invites that need to go out after the meeting. Small things like that, which really help an individual, an end user, to perform more than what they can today.
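A metric like "who talked the most" falls straight out of word-level metadata of the kind Surbhi describes. Here is a minimal sketch; the field names (word, speaker, start, end) are assumptions standing in for whatever shape the real metadata takes:

```python
# Hypothetical sketch: deriving a per-speaker metric from word-level
# conversation metadata (word text, speaker, start/end timestamps).

def talk_time_by_speaker(words):
    """Sum spoken seconds per speaker from word-level timestamps."""
    totals = {}
    for w in words:
        duration = w["end"] - w["start"]
        totals[w["speaker"]] = totals.get(w["speaker"], 0.0) + duration
    return totals

words = [
    {"word": "hello", "speaker": "agent", "start": 0.0, "end": 0.4},
    {"word": "there", "speaker": "agent", "start": 0.4, "end": 0.7},
    {"word": "hi", "speaker": "customer", "start": 1.0, "end": 1.2},
]
print(talk_time_by_speaker(words))
```

The same raw timestamps could just as easily feed a silence detector or an interruption counter, which is the point of exposing the metadata rather than only finished insights.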
Alexander Ferguson 11:13
And that layer on top that helps figure out those insights, that’s your... what do you call that again? That part is the insight engine. Can other developers, using the metadata that you’re providing freely, then create new insights with your insight engine?
Surbhi Rathore 11:30
So that’s not exposed outside, but we expose all the conversation metadata outside, so they can definitely build on top
Alexander Ferguson 11:38
of that. So they can create their own insights? Absolutely.
Surbhi Rathore 11:41
So say someone is trying to solve a problem in sales. They will want to create multiple domain-specific insights for sales, which can be alignment to script, how many times the agent said “like,” and things like that. They can build all of that with this metadata. Our focus is not to build experiences for people; our focus is to give them all the information they need, all the Lego blocks they need, to go and build the world out of it.
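One of the sales-specific insights mentioned, counting a filler word like "like" in the agent's speech, can be built from nothing but utterance-level metadata. A minimal sketch, where the utterance field names are assumptions for illustration:

```python
# Hypothetical sketch of a custom domain insight built purely from
# exposed conversation metadata: filler-word counting for sales coaching.
import re

def filler_count(utterances, speaker, filler="like"):
    """Count occurrences of a filler word in one speaker's utterances."""
    pattern = re.compile(r"\b%s\b" % re.escape(filler), re.IGNORECASE)
    return sum(
        len(pattern.findall(u["text"]))
        for u in utterances
        if u["speaker"] == speaker
    )

utterances = [
    {"speaker": "agent", "text": "So, like, the pricing is, like, flexible."},
    {"speaker": "customer", "text": "I like the flexibility."},
]
print(filler_count(utterances, "agent"))  # two fillers in the agent's turns
```

This is exactly the Lego-blocks model: the platform supplies the transcript metadata, and the domain-specific scoring lives entirely in the developer's own code.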
Alexander Ferguson 12:09
Where do you see your company in the near term and the long term, like the next year and the next five years? What do you see?
Surbhi Rathore 12:15
So like I said, we started on the enterprise side, working with big companies in the beginning. But as we moved forward, we realized that, yes, this has to be built to be easy enough for the developer to consume. So all our focus this year is strictly on how to make our platform more friendly, easily accessible and integrable for developers. That’s our near-term strategy. We actually just opened up our self-serve platform four weeks back, so now it’s generally available: anyone can go and sign up, get access keys and free credits to start with. But really taking this to the next level also means building a whole lot of add-ons, building a whole lot of experiences and use cases. And in addition to helping people build, we need to educate developers and businesses on the different use cases possible, because it’s a fairly new technology. Long term, we want to be the platform for conversational intelligence. Not just being able to support one or two use cases, but bringing together an ecosystem which will help you build very unique conversational workflows that include not just us, but upstream, downstream, every other platform. Think about sitting in a room and, just by conversation, being able to build 3D models in AutoCAD. That’s the kind of experience that we would love to power.
Alexander Ferguson 13:45
I’m excited for the future that you’re painting, Surbhi. So where can people go to learn more, and what’s a good first step for them to take?
Surbhi Rathore 13:55
Go to our website, symbl.ai, get started for free, sign up on the platform, get API access, and go to the docs and start building. That’s pretty much it.
Alexander Ferguson 14:06
Awesome, thank you so much for your time.
Surbhi Rathore 14:08
Absolutely, lovely talking to you. And thanks for having me.
Alexander Ferguson 14:12
Be sure to catch part two of our conversation with Surbhi, in which she describes her unlikely journey from having a corporate job to becoming a first-time co-founder and CEO of a startup.
PART 2
SUBSCRIBE
YouTube | LinkedIn | Twitter| Podcast