Perhaps the most exciting and frequently discussed technology of the 21st century is artificial intelligence. We’re using AI for solutions in everything from diagnosing serious medical conditions to selecting the right shade of nail polish.
But every solution brings a new set of problems, and AI is no exception. Predictive models become stale, they adopt biases, and the results can be costly and embarrassing.
Yotam Oren discovered these sorts of problems were widespread—so he started Mona Labs to fix them, a technology startup that lets you monitor AI performance to ensure accuracy and transparency.
On this edition of UpTech Report, Yotam talks about the original problems he set out to solve and why AI should never be left alone in the room.
More information: https://www.monalabs.io/
Yotam is the Co-Founder and CEO of Mona, a SaaS startup empowering data scientists. Yotam is a tech executive with 15+ years of experience bringing data-driven products to the enterprise software market.
Before Mona, Yotam held multiple roles at ADP, most recently as VP Operations and Growth of ADP’s cloud payroll software. Prior to ADP, Yotam was an Associate Partner at McKinsey & Company, a business development associate at Google, and a software engineer at Telmap Ltd (acquired by Intel in 2011).
Yotam holds a B.Sc. (magna cum laude) in Bioinformatics from Ben-Gurion University in Israel, and an MBA (with distinction) from the University of Michigan.
TRANSCRIPT
DISCLAIMER: Below is an AI-generated transcript. There could be a few typos, but it should be at least 90% accurate. Watch the video or listen to the podcast for the full experience!
Yotam Oren 0:00
The integration is really easy. Because of that, we’re not asking you to change the way you work today; we’re adding new capabilities that you can pretty easily integrate into whatever tech stack you’re already on.
Alexander Ferguson 0:17
Welcome, everyone, to UpTech Report. This is our Applied Tech series. UpTech Report is sponsored by TeraLeap. Learn how to leverage the power of video at teraleap.io. Today, I’m very excited to be joined by our guest, Yotam Oren, who’s based in Atlanta, Georgia. He’s the co-founder and CEO of Mona Labs. Welcome, Yotam. It’s good to have you on.
Yotam Oren 0:36
Thanks for having me.
Alexander Ferguson 0:37
Your product provides production monitoring for AI. It’s offering oversight and quality control for AI systems. So basically, anyone out there, if you’re in AI or ML ops, DevOps, or you’re a data scientist, this might be an intriguing platform that you want to check out. On your site, you say, with Mona, you gain complete transparency into how your data and models behave in the real world. I’m curious, what was the problem you initially saw and set out to solve, and how has that changed over time?
Yotam Oren 1:11
Sure. So you know, I’ve been around big data my entire career, and I’ve always had first-hand experience with the promise and the pitfalls of big data and AI. The story of Mona specifically starts a couple of years ago with a call from Nemo, one of my good friends and now our CTO. He said, hey, let’s go solve a problem that really embarrassed me and my team at Google, and caused some costs and some PR issues for Google as well. They had this algorithm that was predicting the location and timing of flu outbreaks based on people’s searches. For a few years, it was predicting the outbreaks very accurately, and you can imagine the value of this to the healthcare industry, right, being able to predict where flu is occurring. And then all of a sudden, after a couple of years, it started failing spectacularly. The biggest learning from that was, of course, that predictive models, AI systems, can’t just be left alone. They become stale. They are very sensitive to changes in the world around them, and they need better oversight, because they need to be fine-tuned over time. So we had to ask, is there something here that we can commercialize? Is there a bigger problem beyond Google Flu Trends? We started talking to a lot of data science teams, and I’ve talked to over 100 of them, and realized everyone has a different story. AI is huge and is used in so many different verticals, but there are some common themes. The data scientists feel that they’re flying blind, they feel that they’re reactive, that they uncover problems too late. And then they go on lengthy, manual investigations to figure out what’s wrong and, of course, correct it. Better oversight and better quality control tools are needed, and this is what we set out to do. And frankly, while our storytelling, some of our messaging, and some of our product definitions have changed, this is still the problem we’re solving today. That hasn’t dramatically changed.
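The staleness Yotam describes here often shows up first as input drift: the live data stops resembling what the model was trained on. As a minimal sketch (the data and thresholds are hypothetical; this is not Mona’s product or the actual Google Flu Trends pipeline), one common way to catch this early is to compare a training-time reference sample of a feature against a recent production window:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(seed=42)

# Hypothetical reference sample: a feature's distribution at training time.
train_feature = rng.normal(loc=0.0, scale=1.0, size=5000)

# Hypothetical production window: the same feature after the world changed
# (e.g., search behavior shifted), with a different mean and spread.
live_feature = rng.normal(loc=0.6, scale=1.3, size=5000)

# Two-sample Kolmogorov-Smirnov test: a tiny p-value means the live inputs
# no longer look like the training data, a leading indicator of staleness.
stat, p_value = ks_2samp(train_feature, live_feature)
if p_value < 0.01:
    print(f"Input drift detected (KS statistic = {stat:.3f}); "
          "the model may be going stale and need retraining.")
```

A check like this can fire before accuracy metrics visibly degrade, which is exactly the kind of early warning Yotam argues these systems need.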
Alexander Ferguson 3:35
The use of AI and machine learning is only going to increase, and so being able to better understand, to have clarity, is going to be paramount. Now, you started this about two years ago, a little over
Yotam Oren 3:44
two years. That’s right, yes. In fall of 2018.
Alexander Ferguson 3:48
Well, it’s been an interesting journey already, from the concept of your friend calling you and you realizing that need. For those that want to hear more about Yotam’s story, stick around for part two of our interview; we’ll be diving deeper into that. But to give a taste, I’m curious: what’s one thing you wish you had known two or three years ago, before you started, that you know now?
Yotam Oren 4:14
Ah, I mean, I think I wish I had known more about the sheer variety of use cases that are out there for this problem, and how, despite that, there are enough commonalities that you can address them all with some common capabilities. At first, my intuition was that this is pretty niche. When I first started talking about this, I thought it was going to have to be solved locally, that every AI use case was going to have its own set of issues, its own set of solutions. I wish I’d had the foresight two years ago that there’s a lot of commonality, and that in the evolution toward a standardized, more universal language for talking about these things, there’s real potential early on.
Alexander Ferguson 5:28
Well, I’m excited to hear more of your story, so definitely, those who want to hear it, stick around for part two of our interview. Coming back to the product that Mona is and exists for, tell me a bit more about an everyday use case. If you can describe one of your clients using it right now, what does it look like when it’s in play?
Yotam Oren 5:48
Yeah, I mean, imagine that you have an ecommerce site, for example. Let’s say you’re Gap (they’re not our customer, but for illustrative purposes, say you’re gap.com), and you have an AI algorithm that makes product recommendations to your users, to your consumers, when they come to the site. It’s analyzing all sorts of things about the user: their browsing history, maybe their purchasing history, and a lot of non-user-specific information like trending products and promotions, these kinds of things. The success of this system is measured by clicks and conversions, right? Are users actually liking the recommendations, and do they turn them into actions? In this case, the owners of this algorithm are flying blind when they only react to issues once those clicks and conversions start to decline, when the business is complaining, when the marketing person is complaining that conversions are down. So is there a way to help them? This is what we’re trying to do: is there a way to give them leading indicators of their issues, a way to help them get more transparency into more granular issues they might have? It starts with tracking more than just conversions and clicks. One example of a metric you want to track is confidence. When an algorithm makes recommendations, it usually tells you how confident it is in the predictions it’s making. So if you track those confidence scores and you see a decline, there might be an issue worth investigating. So we help them track more metrics. The second thing would be to look more granularly, to go beyond the global average into sub-populations, because usually, when you have issues, they start in sub-populations. For example, we might want to track the age distribution of consumers on the site and say, hey, there’s an increase in the number of retiree-age people on your site, and frankly, our model isn’t as good for this kind of segment, this sub-population. If there’s an increase in their numbers, they’re going to impact the overall results over time, and you might want to fine-tune your ability to recommend for this kind of population. And the third thing is that we hope to be able to track a lot of things that maybe are not directly associated with the algorithm but could be very relevant, for example, metadata: where are the users coming from? Maybe they’re coming from a new Apple device that’s out in the market, with a new operating system, and data is stored differently on this new operating system and this new browser, and maybe that’s throwing off your algorithm just for users coming from this new device. So you want to get ahead of it. You want to understand that there’s this sub-segment of the population that might cause trouble for your algorithm, and take corrective action before that device or that browser becomes the standard, when everyone’s using it and your algorithms are failing.
So in this use case, a lot of it is about tracking things that might be leading indicators of the algorithm failing: tracking a lot more information, enabling the data scientists to get a much better view of strengths and weaknesses, and taking action ahead of time, practically.
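To make the pattern concrete, here is a minimal sketch of the kind of segment-level monitoring described above. This is not Mona’s actual API; the prediction log, column names, and thresholds are all hypothetical. The idea is simply to track a leading indicator (recommendation confidence) per sub-population rather than only the global average:

```python
import pandas as pd

# Hypothetical prediction log: one row per recommendation served.
# All column names and values are illustrative.
logs = pd.DataFrame({
    "age_band":   ["18-25", "26-35", "65+", "65+", "36-50", "65+"],
    "device":     ["ios_old", "android", "ios_new", "ios_new", "android", "ios_new"],
    "confidence": [0.91, 0.87, 0.54, 0.49, 0.88, 0.51],
    "clicked":    [1, 1, 0, 0, 1, 0],
})

BASELINE_CONFIDENCE = 0.80  # e.g., measured on a known-healthy reference window
ALERT_MARGIN = 0.15         # flag segments more than 15 points below baseline

# Go beyond the global average: aggregate the metrics per sub-population.
by_segment = logs.groupby("age_band").agg(
    avg_confidence=("confidence", "mean"),
    click_rate=("clicked", "mean"),
    volume=("confidence", "size"),
)

# Leading indicator: confidence sagging inside one segment can warn you
# before the site-wide click and conversion numbers visibly decline.
alerts = by_segment[by_segment["avg_confidence"] < BASELINE_CONFIDENCE - ALERT_MARGIN]
for segment, row in alerts.iterrows():
    print(f"Segment '{segment}': avg confidence {row.avg_confidence:.2f} "
          f"across {int(row.volume)} recommendations is below baseline.")
```

The same grouping could be done by device or OS version to catch the new-Apple-device scenario Yotam mentions, where only one slice of traffic is throwing off the model.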
Alexander Ferguson 9:21
Now for you, what’s the business model? It’s something that someone signs up for? And is it one-time, monthly, yearly? What does it look like?
Yotam Oren 9:30
Yeah, the standard model is SaaS, a software-as-a-service subscription. Users pay a subscription and get access to the platform. The integration is easy, and we host the dashboard and the analytics that they get.
Alexander Ferguson 9:48
What’s the difference as far as the technology, compared to other options out there that they could be using, whether it’s competition or just what they would be doing without it? What’s the power of the technology?
Yotam Oren 10:03
Yeah. So in terms of, number one, the alternative, right, what would you do without it, and what users get from this: number one is risk avoidance. The stakes are really high. Companies are starting to use AI in more and more core business processes, and there’s a lot of money at stake. Think about fraud detection systems, for example. If you have bad predictions of fraud, you’re rejecting transactions you’re not supposed to reject, or, God forbid, you’re approving transactions you’re supposed to reject. So the stakes are high, and you manage the risk of faulty decisions. Number two, you get more efficient, because when you’re called on to fix those errors and those issues, you might staff a team, or you might engage in a lot of manual work to diagnose what the problem is. So there’s a lot of efficiency involved. Your alternative today is often to try to do it yourself, or to use the kind of oversight and monitoring capabilities that you get from your data platforms, from the cloud providers: from Amazon’s AWS, from Google, or from Microsoft Azure. The differentiation of something like Mona begins with, number one, being a best-of-breed standalone tool that’s highly focused on monitoring; it’s not a secondary focus for us. And, being standalone, it’s completely agnostic to how you develop, test, and deploy your AI today. So the integration is really easy. Because of that, we’re not asking you to change the way you work today; we’re adding new capabilities that you can pretty easily integrate into whatever tech stack you’re already on.
Alexander Ferguson 12:02
For the AI, ML ops, DevOps, or data scientist folks out there, any words of wisdom for them? Because you probably know what they’re feeling, their pains, their concerns, the things they’re working on. So, a word of wisdom that you can share?
Yotam Oren 12:19
Yeah, number one, take control of your models and your data throughout their whole lifecycle. Leverage as many engineering tools as you can to automate the things that shouldn’t be your core day-to-day work, but take ownership of it. Don’t expect to hand it off to someone else. Aspire to get as much transparency as possible, and continue to research into production. I mean, this is interesting, right? Obviously, that’s what a lot of them want to do, because they understand; data scientists know that it’s really difficult, on day one, when you start researching a certain problem or developing a certain predictive model, to know everything that’s going to be around after deployment, when this model is used. So you try to simulate the real world as best as you can, but you want to continue to research there. So my advice is, go and try to do that, and make sure that you use the tools that are out there so you’re not blind after deployment.
Alexander Ferguson 13:39
What’s the latest feature you’ve published lately, or are about to publish? Anything that you’re really excited about that you’d want to share?
Yotam Oren 13:49
Yeah, I mean, AI is still an emerging field, with a ton of investment in a lot of places and tools for AI, and what we’re building is also an emerging category with new solutions. Because of that, I think in the early going, a lot of the adoption, a lot of the deployments we have, are almost like a managed service in a lot of ways. We do a lot of the work: we help with onboarding, we consult, we advise. And we still would love to do that, because we want to make sure that our customers and our users are making the most out of the capabilities we provide. But we’re also making a lot more of it DIY, and this is one of the things that’s coming out. We want universal, very broad adoption; we believe people need to own this field of quality control and oversight. So we are launching, or extending, a lot more do-it-yourself capabilities to let them take ownership end to end.
Alexander Ferguson 14:59
Looking forward from here, five years from now, what can you share of the roadmap, of where you guys are headed?
Yotam Oren 15:09
This is a foundational need, right? You can’t imagine not monitoring any system or predictive model that’s in a critical business process. So we want to be an integral part of every AI project, and we want to be an enabler of this movement to bridge the gap between data research and production, or operations, of the products that consumers and customers use. We want to be part of this cultural, organizational revolution where new roles are created that actually help bring data and research into products. There will be new roles, and I think we want to be not just an integral part of every AI project, but the de facto solution for a lot of these new roles that are coming out.
Alexander Ferguson 16:15
Well, it’s an interesting direction. You’re definitely in the right space for where things are headed with AI and ML, and that visibility and clarity you guys are bringing is needed. Thank you so much for your time, for sharing what you guys are up to. For those that want to learn more, you can go check out their website. Just double-checking: yes, it is monalabs.io.
Yotam Oren 16:38
Monalabs.io is our site, and of course, follow us on LinkedIn; we regularly post updates there at Mona Labs. So we try to be active on both fronts.
Alexander Ferguson 16:53
You’ll be able to learn a lot more. Love it. Thank you so much again, everyone. This was our Applied Tech series. If you want to hear more about Yotam’s background, his history, and the journey he’s been on, stick around for part two of our discussion on the founder’s journey. This is UpTech Report. Our sponsor today is TeraLeap. If you want to learn how to better leverage the power of video to increase sales and marketing results, head over to TeraLeap.io and learn about the new product, Customer Stories. Thanks, everyone, and we’ll see you next time. That concludes the audio version of this episode. To see the original and more, visit our UpTech Report YouTube channel. If you know a tech company we should interview, you can nominate them at uptechreport.com. Or, if you just prefer to listen, make sure you’re subscribed to this series on Apple Podcasts, Spotify, or your favorite podcasting app.
Transcribed by https://otter.ai
PART 2
SUBSCRIBE
YouTube | LinkedIn | Twitter | Podcast