
Financial Investing Radio


Dec 15, 2022

In this episode, I talk with the CEO and founder of an organization that has been applying AI to help them develop products. Will AI help you develop your products faster? Come and see.


Grant
Hey, everybody, welcome to another episode of ClickAI Radio. So today I have this opportunity to speak with one of those brains out there in the market that's being disruptive, right? They're making changes in the industry, not only in terms of the problems they're solving, but in the way they're solving those problems using AI. Very fascinating. Anyway, everyone, please welcome Paul Ortchanian here to the show.

Paul
Hi, nice of you to have me. Happy to be here on the show.

Grant
Absolutely. It's very good to have you here today. When I was first introduced to you and started to review the material your organization has put together, I was fascinated with the approach, because I have a product development background in the software world, and AI was a latecomer to that, right, meaning over generations. When I saw the approach you're taking, I wanted to dig more into it. But before we do that big reveal, could you maybe step back and talk about the beginning of your journey? What got you on this route and this map, both in terms of product development, and technology and AI itself?

Paul
Yeah, absolutely. So I started out as an engineer and headed down to San Francisco in the early 2000s. And I was more of a thinker than an actual engineer, the type of guy who would figure things out by himself. But if you were to ask me to do the real things engineers do, you know, the creativity was there, but not the solutioning. So being in San Francisco was a humbling experience, I guess; in Silicon Valley you get to see some really, really good engineers. So I had to make a shift in my career. And since I had a passion for user experience and the business aspect, product management was a great fit, a function I didn't really understand. I got to learn and respect it, and did that for about 10 years.

In the mid-2010s, I moved back to Montreal for family reasons, and for cost of living, of course, in San Francisco. And I started a company called Bain Public, which in French means public bath. What I realized in Canada was that people here in accelerators, incubators, and startups just didn't understand what product management was. They didn't really understand what product managers do and how they do it. And I saw a lot of organizations being led by the marketing teams or the sales teams, being very service-oriented and not really product-led.

So it basically stands for public bath, which means every quarter you want to apply some hygiene to your roadmap. You have a galaxy of ideas, so why not go out there and just, you know, take the good ones, remove the old ones, and get rid of the dirt. We started with that premise. And we said, well, what does a product manager do on a quarterly basis? Because a lot of the material you'll read out there really talks about, you know, what product managers should do in terms of personas and understanding the customer's data and this and that, but nobody really tells you in which order you should do it, right? That was my initial struggle as a product manager: do you try to do it all in the same day, and then realize there's not enough time? So the question is, in a one-quarter, 12-week cycle, maybe my first three weeks should be about understanding the market shifts, the industry, the product competitors, and the users; then maybe the next three weeks are spent working with leadership on making sure there are no pivots in the organization or any major strategic changes; and then going into analyzing this parking lot of ideas, figuring out which ones are short term, and making business cases to present so the company can make a decision on what to do next on the roadmap.

So there is a process, and we just call that process SOAP, which goes in line with our public bath theme. The idea was, let's give product managers SOAP to basically wash their roadmap on a quarterly basis. And that's what Bain Public does. We've worked with over 40 organizations so far on really implementing this product-led process within their organizations. We work with their leaders on identifying a product manager within the organization and making sure that marketing, support, sales, the CFO, and the CEO really understand how to engage with them, what to expect from them, and how a product manager can add value to the organization, so it just doesn't become, you know, this race to pump out as many features as you can, right?

Grant
Oh, boy, yeah. Which is a constant problem. The other thing that I've noticed, and I'm sure your SOAP methodology addresses this, is the problem of shifting an organization in terms of their funding model, right? They'll come from sort of these project-centric or service-centric funding styles, and then you've got to help them through that shift to a different funding model around products. Do you guys address that as well?

Paul
Yeah, we address that a lot. One of the things we always tell them is, if you are a professional services firm, and you know, I have no issue basically calling them that, I ask them: do you quantify staff utilization in percentages, like 70% of our engineers are being billed, right? Do we basically look at the sales team and how many new deals they have in terms of pipeline? Are we looking at on-time delivery across those deals the sales team closed? And what is our technical staff attrition? Those usually tend to be the identifiers of you being a service firm. And we often say, well, let's make the shift: we identify one little initiative that you have that you want to productize, because all these service firms, really, all they want is recurring revenue, since the services business is tough, right?

You constantly have to bring in new clients. So the path to recurring revenue is, you know, being able to say, okay, I'm going to take two engineers, one salesperson, one marketing person, one support person, and a product manager. Those people collectively will cost me a million dollars a year, and I'm going to expect them to bring me $3 million in recurring revenue. That means they're no longer going to be evaluated on staff utilization, and they're no longer going to be evaluated on the number of deals they're bringing in. They're really going to be evaluated on: how are they releasing features? Are they creating value with those features? Are we increasing the number of paid customers? And are we, you know, staying abreast of competitors and market and industry changes?
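
To put numbers on that shift, here is a minimal Python sketch using only the hypothetical figures from Paul's example, showing the unit's economics and the evaluation criteria that replace utilization metrics:

```python
# Hypothetical figures from Paul's example, not real client data.
team_cost_per_year = 1_000_000        # 2 engineers, 1 sales, 1 marketing,
                                      # 1 support, 1 product manager (USD)
target_recurring_revenue = 3_000_000  # expected annual recurring revenue (USD)

revenue_multiple = target_recurring_revenue / team_cost_per_year
print(f"Expected return on the dedicated team: {revenue_multiple:.1f}x")  # 3.0x

# The evaluation criteria shift from utilization and billing to these:
product_kpis = [
    "features released and value created per feature",
    "growth in paying customers",
    "awareness of competitor, market, and industry changes",
]
```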

And so that's a complete paradigm shift, and that transition takes a while. But the first seed is really being able to say: can you create an entity within your organization where the CFO accepts that those engineers are dedicated and no longer being, you know, reviewed in terms of their utilization rate or how much they're billing to customers? Once they make that shift, the recipe is pretty easy to follow.

Grant
Yeah, so it becomes easy. The thing I've seen and experienced with product and product development is the relationship of innovation to product development. I see some groups take innovation and move it out as some separate activity or function in the organization, whereas others have it innate within the product team itself. What have you found effective? And does SOAP address that?

Paul
Yeah, I mean, we always ask them the question of how you are going to defend yourself against the competition, what the VCs like to call your moat, right? And that defensibility could be innovation; it could also be your global footprint, or, you know, it could be how you operationalize your supply chain to make things really, really cheap, right? Every company can have a different strategy. And we really ask them from the get-go. We call this playing the strategy: we'll give them like eight potential ways a company can, you know, find strategies to differentiate themselves. And the first one is first to market.

And the question is, it's not about you being first to market today. But do you want to outpace your closest rivals on a regular basis? If so, you know, you need an R&D team, an innovation team, that is basically going to be pumping out commercializable features or R&D work. And then we always give them two examples. The example of Dolby: Dolby being completely analog in the 70s, but really banking on their R&D team to bring them into the digital age, and from the digital age to set-top boxes to Hollywood and now into Netflix compression, right?

So they basically put their R&D team in the lead to keep them a step ahead of their competition. But on the other hand, we also, you know, talk about Tesla, where Tesla is basically doing the same thing, but they're not doing it for intellectual property like Dolby. They're not suing anybody; they're actually open sourcing it. And there's a reason behind it: that open sourcing allows them to basically play, you know, what we call the Betamax/VHS game, making sure there's compatibility across car manufacturers for Tesla parts, and overproduction of Tesla parts, just to strengthen their supply chain, right? So we ask them: do you want to be that company? If you don't want to be that company, then there are other ways for you to create defensibility. It could be regulatory compliance, if your industry requires it; you can go global; you can go cross-industry; you can basically create customer lock-in, just how SAP and Salesforce love to integrate workflows with, like, boots-on-the-ground, professional-services certified teams, right?

Or you can basically review your process and make sure, just like Amazon, that you're creating robots to do human work in order to do it cheaper than anybody else. So there are ways of doing it. And I would say that if you're in the AI space especially, you know, it's important to ask: are you really trying to innovate through AI? Because you can get a lot of researchers doing a lot of things, but that's not really going to help you create commercializable ideas. So from the get-go, the leadership team needs to, you know, at least make a bet on expansion, innovation, or creating efficiencies, and just, you know, decide and let the product management team know which direction they're planning on going for the next six years.

Grant
I love your last comment there, Paul, about getting the leadership team involved. It seems that many times in organizations there's this challenge of making the change sticky, right? Making it last, making it resonate, where people truly change their operating model. They're going to start operating in a different way; their roles and responsibilities change; the order in which things get done changes. All of those change when they start moving into this AI space, and, you know, product-driven just by itself, even without AI, has its own set of challenges. So here's the question I have for you. As you move companies through this transformation, that's part of your business, right? You are transforming the way companies operate to bring about better outcomes. How do you make those changes sticky? Because this is a cultural change. What have you guys found is effective?

Paul
Well, it goes back to our name, public bath, and SOAP, right? Because the idea is, you take a bath on a regular basis; hygiene is something you do regularly, right? So we ask these organizations: if we give you a process where you know exactly what the product management team is going to do with the leadership team in order to prioritize your next upcoming features, then can you do it in a cyclical way? Every quarter, you have the product manager do the exact same process of revisiting the competitors, the industry, the market, as well as the problems that you have with your current customers, bringing it back to the organization, asking if the strategy is still about expansion, innovation, or efficiencies, identifying new ideas, clearing up the parking lot of bad ideas, etc., and eventually making the business case for the new features in order for the company to make a commitment. If we do this in a cyclical way, then the product role becomes the role of what I like to call the CRO, the chief repeating officer, because all the product manager is doing is repeating that strategy and questioning the CEO: are we still on? Are we pivoting? And if we pivot,

what does that mean? And if you're doing it on a three-month basis, what that allows your company to do is make sure that the marketing, sales, and support teams are going along with what the engineering team is going to be delivering. Because this is what I usually see in most product organizations: a decision has been made that the engineers are going to build a particular feature, and the sales and marketing teams just wait for the engineers to be code complete. And once code complete is done, they're like, okay, now we're going to promote it. But my point is that it's too late, right? So I always talk about Apple, how Apple would go out in front of millions of people and just say, here's the new iPhone 13, and we came up with a new version of Safari, and we're updating our iOS, and we're doing 40 other changes. And the next thing you know, you walk into an Apple Store and everything has changed. The marketing has changed; the people doing the conferences and the lectures and the training are all talking about the new Safari, the new iPhone. And you ask yourself: how did Apple manage to organize the marketing, support, and sales teams in such a way that the day the announcement is made, everything has changed? That means it's not just the engineering team's responsibility to get to code complete.

It is a collective responsibility, where marketing, support, and sales are also preparing for the upcoming releases. And the only way you can get that type of alignment is if, every three months, these parties, technology, product, CEO, CFO, sales, marketing, and support, can get together and make a clear decision on what they're going to do, and be honest about what they're not going to do, and then work collectively on making sure those things are delivered and prepared: the size of the promotion we're going to do, how we're going to do outreach, how the sales collateral is going to change, how the support team is going to support these upcoming features. Everybody has work to do in that three-month timeframe. And if we can get to that cyclical element, I think most companies can create momentum. Once that momentum is generating small increments of value to the customers, then you start building what I like to call reputational capital, with the clients, with the customers, with the prospects. Eventually everything you release they love, everything you release adds value, everybody loves everything you're doing, and as an organization you become that, you know, big unicorn that people want to be.

Grant
Yeah, so the net of that is, I believe, what you said: as you operationalize it, it gets integrated into everyone's role and responsibility. It's this enterprise-level, cross-functional alignment that gets on a cadence. And the cadence in your case, you mentioned, is quarterly; quarterly sounds like that's been a real gem for you. I've seen some organizations do that in shorter timeframes and some much longer. It sounds like at least quarterly; is that a good nugget you've found there?

Paul
Yeah, quarterly works because, you know, markets are set in quarters; they operate that way, and you want results on a quarterly basis in terms of sales, in terms of engagement, etc. But what's important is that, you know, a lot of engineering teams like to work agile or Kanban, and in a quarter, a 12-week timeframe, you could fit, let's say if your sprints are three weeks, four 3-week sprints, or you could fit six 2-week sprints. But I feel that if you were to shorten it, then the marketing, sales, and support teams might not have enough time to prepare themselves for code complete. The engineers might be able to deliver, but then the product manager gets overwhelmed, because doing the industry research, competitor research, etc. every, say, month and a half or two months just becomes overwhelming for them. Things don't change enough in two months for them to be able to say, oh, look, this competitor just came up with that, and now we need to react. So I think three months is enough time for the world to change, for, you know, a country to go to war, for COVID to come over and just destroy everything. So pivot decisions are usually pretty good to make on a quarterly basis.

Grant
Yeah, that's good. I think COVID followed that rule, right? Hey, I have a question for you around AI. How are you leveraging AI in the midst of all this? Can you talk about that?

Paul
Yeah, absolutely. So what we noticed is that a lot of organizations who have products, SaaS products, IoT products, any type of product, are generating data. I mean, it comes hand in hand with software development. All that data is going into these databases, and nobody knows what to do with it. Eventually, you know, they want to start creating business intelligence, and from business intelligence, AI initiatives just come about. It's very natural to say, you know what, with all this data, if we were to train a machine learning model, we would be able to recommend the best flight price or the best time for somebody to buy a flight, because we have enough data to do it. So we're not working with AI-first organizations, where it's "our entire product is going to be around AI"; we're just trying to work with organizations that have enough data to warrant one, two, three, or four AI initiatives and an ongoing investment in those. The best example I like to talk about is the Google Gmail suggested replies, right, which adds value to the user, with AI in the back end and a lot of data.

But ultimately, it's not that Gmail is an AI product; it simply has AI features in it. And when organizations start identifying AI or machine learning, predictive elements to their product, then we go from engineering being a deterministic function, which is "if we deliver this feature, then customers will be able to do that," to a probabilistic function: let's experiment and see what the data can give us. If this algorithm ends up really nailing it, we will achieve this result. But if it doesn't, then do we release it? Do we not release it?
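
To make that distinction concrete, here is a toy Python sketch; the feature, threshold, and numbers are illustrative, not from the conversation. A deterministic feature guarantees its behavior, while a probabilistic one ships only after its measured accuracy is judged against a pre-agreed bar:

```python
# Deterministic feature: ship it and the behavior is guaranteed.
def apply_discount(price: float, pct: float) -> float:
    return price * (1 - pct / 100)

# Probabilistic feature: experiment first, then judge the measured
# accuracy against a pre-agreed threshold before deciding to release.
def release_decision(measured_accuracy: float, threshold: float) -> str:
    return "release" if measured_accuracy >= threshold else "iterate or hold"

print(apply_discount(100.0, 20.0))   # always 80.0
print(release_decision(0.33, 0.40))  # "iterate or hold"
```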

And then it gets a little bit hairy, because product managers just lose themselves in it. Oftentimes they'll release a feature, and the sales team will just ask them to pull it right away because it has not met the expectations of a customer or two. Ultimately, what we ask product managers to do is work with leadership on really identifying a few key elements that are very, very important to baseline before you begin an AI project. And those are pretty simple. It's really: are you trying to have the machine learning model make a prediction? Or are you trying for it to make a prediction plus pass judgment? Or are you trying to have it make a prediction, pass judgment, and take action, right? That's decision automation, which is what, you know, self-driving cars do. They will see a biker: they will make a prediction that it's a biker, make a judgment that it's indeed a biker, and take action to avoid the biker, right?
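
The three levels Paul describes can be written down as an explicit contract. A minimal sketch, assuming a hypothetical model interface; the names and the 0.9 confidence cutoff are illustrative, not any real library's API:

```python
from enum import Enum

class AutomationLevel(Enum):
    PREDICT = 1            # model predicts; a human judges and acts
    PREDICT_JUDGE = 2      # model predicts and judges; a human acts
    PREDICT_JUDGE_ACT = 3  # full decision automation (the self-driving case)

def handle(observation, model, level):
    """Route a model output according to the agreed automation level."""
    prediction, confidence = model.predict(observation)  # e.g. ("biker", 0.97)
    if level is AutomationLevel.PREDICT:
        return prediction              # hand off: a human judges and acts
    judgment = confidence > 0.9        # the machine passes judgment
    if level is AutomationLevel.PREDICT_JUDGE:
        return prediction, judgment    # hand off: a human takes the action
    if judgment:
        model.take_action(prediction)  # e.g. steer to avoid the biker
```

Agreeing on the level up front, before any training starts, is exactly the expectation-setting Paul is arguing for.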

But when you're creating ML projects, you can easily say, you know, we're just going to keep it to prediction, right? This machine is going to predict something, then a human will make the judgment, and the human will take the action. There's nothing wrong with doing that. So just set the expectations from the get-go in terms of: are we basically going to predict, judge, or take action? That's number one. And then the next question is, whatever we decide, if it's just prediction, is that guess worth making? And who makes that guess today? If it's a human, how accurate is that human? Let's quantify it, so we can compare it against what this machine is going to do. What is the value the company gets out of that guess being the right guess? And what's the cost of getting it wrong? Oftentimes we forget that humans get it wrong too, and when humans get it wrong, there are huge consequences that organizations will overlook; but as soon as machine learning does the same thing, we're ready to just cancel hundreds of thousands of dollars of investment.
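
Those baselining questions reduce to a small expected-value comparison. A sketch with hypothetical figures, just to show the shape of the calculation:

```python
def expected_annual_value(accuracy, value_right, cost_wrong, guesses_per_year):
    """Expected yearly value of a guesser (human or model) at a given accuracy."""
    return guesses_per_year * (accuracy * value_right
                               - (1 - accuracy) * cost_wrong)

# Hypothetical figures: quantify the human baseline first, then compare.
human = expected_annual_value(0.60, value_right=500, cost_wrong=400,
                              guesses_per_year=5_000)
model = expected_annual_value(0.55, value_right=500, cost_wrong=400,
                              guesses_per_year=5_000)
print(f"human baseline: ${human:,.0f}/yr vs model: ${model:,.0f}/yr")
```

Running this shows the human baseline at $700,000 per year against the model's $475,000: the model loses on accuracy alone, which is where throughput and cost differences (the 24/7 argument Paul makes later) enter the comparison.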

Grant
Yeah, that's right. Yeah, we toss it out. So the use case, I'm assuming, would be that you would leverage AI to, say, enhance a product manager's ability to predict outcomes of some product development activities, or releases, or things like that. Would that be the kind of use case where you'd look to apply it?

Paul
Well, not the product manager's abilities; I would say the product manager would look at the software. Let's take the software of a website that tries to predict if people qualify for a mortgage loan, for example, right? You have enough data at that point to be able to automate the underwriting process that humans do of validating whether or not somebody's eligible for a loan. We could take all that data and just make a prediction of that person's fit for a particular loan. Now, we might say, well, it's just going to be the prediction: we're not going to automatically give this person the loan; we're still going to ask a human being to pass judgment that the prediction was the correct one, and then take action to give or not give them the loan.

So let's say that's the machine learning model we're going to add to our feature. Now, the question is, this underwriting department, in the past 10 years, how often did they really screw up and, you know, issue loans to people who couldn't pay them back, right? Say we realize it's 40%. We're like, wow, 40%. Could this machine learning be as accurate as that, plus one, right? And then we end up realizing that whatever we delivered is 33% accurate, not 40%-plus-one accurate. Now, is it still worth putting out there? We spent $100,000 on it. And then, you know, it's up to the product manager to be able to put this thing in place and say: look, underwriting is a nine-to-five job currently in our business, and it costs us this much money.

On the other hand, this machine learning is 33% accurate, but it's actually doing the job 24/7, 365 days a year, and it's only going to improve from 33 to 40. And if it goes above 40, then the savings for our organization are this much money. So it is really the product manager's job to be able to talk about not only the business KPIs, but also the AI and machine learning KPIs we need to achieve, and what the impact would be if we get it right. And I think the biggest issue we have as product managers in the AI space is this: we go and do everything we need to create AI. The data ops: selecting the data, sourcing it, synthesizing it, cleaning it, etc. The model ops, which, you know, comes down to trying multiple algorithms, training those algorithms, evaluating and tuning them. And then the operationalization. If you do all these steps and you get to, say, 20% accuracy, and your target is 70% accuracy, right? What do you do with it?
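
The lifecycle Paul lists maps onto a standard supervised-learning loop. A minimal scikit-learn sketch, with a synthetic dataset standing in for the data ops stage and a placeholder 70% target, showing where the "what do you do with it?" conversation lands:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# -- Data ops: select, source, synthesize, clean (synthetic stand-in) --
X, y = make_classification(n_samples=2_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# -- Model ops: train, evaluate, tune --
search = GridSearchCV(LogisticRegression(max_iter=1_000),
                      param_grid={"C": [0.01, 0.1, 1.0, 10.0]}, cv=5)
search.fit(X_train, y_train)
accuracy = accuracy_score(y_test, search.predict(X_test))

# -- Operationalization gate: the conversation Paul describes --
TARGET = 0.70  # agreed with leadership before the project began
print(f"measured {accuracy:.0%} vs target {TARGET:.0%}")
if accuracy < TARGET:
    print("Decide: live with it, support with humans, or iterate on the data.")
```

The point of fixing TARGET before training starts is that a miss triggers a pre-agreed decision rather than political drama.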

Because you had to do all this work anyway; it cost you tons of money and time. So how do we get the leadership team to say: this AI initiative has enough value for us that we're willing to live with the consequences of it getting things wrong, or we're willing to have it supported by a human for the next six months to a year until it basically trains itself and gets better? It's about getting that openness from the leadership team. Because what I've often found delivering AI projects is that every time you deliver an AI project and its output is misunderstood, everybody thinks it has to be 100% accurate, and the second it goes wrong, the political drama you have to go through in order to keep it alive is just overwhelming, right? So we try to set those expectations up front and tool the product managers with the right arguments to make sure the expectations are set correctly.

Grant
Have you ever worked with or heard of the company called digital.ai? Are you familiar with them? digital.ai; maybe not. Anyway, they have been working in a similar space as you, but not so much at the product management level. What they're doing, though, is looking to apply AI to the whole delivery function. So you can see, the product manager is above this, setting sort of these KPIs and other estimating and planning activities, but then there are all these functions under there that, of course, do the delivery of the product. So they're working on the tooling spectrum. I think they acquired five different companies in the last nine months, and they're integrating these and building this AI seam, or layer, across that delivery data, with the purpose and intent to do not only backwards-looking analysis around AI, but predictive: what's the probability I might run into some problem with this particular release of this product that we're about to send out, right? They might be an interesting group for you to get connected with.

Paul
Yeah, you know, it's funny, because there's a local company here in Montreal that does the same thing. It's really about how data scientists are really expensive, they're really hard to find, and there's a shortage of them. So, you know, a lot of organizations are trying to find a self-serve AI solution where you can build your AI using their AI. Ultimately, what they're doing is taking your data and delivering one, two, three, or ten versions of the machine learning model; it's up to you to basically judge which one is going to work best for you, but they actually operationalize it, put it out there for you, and really automate the whole thing. So this way you're not dependent on humans. I love that, I really love that; I think every organization should have one of those. But that still means there's a dependency on the product manager to know the data end to end, be able to clean it, be able to tag it, and then feed it to these machines, right? And I think that part is also misunderstood. Do we have enough data? Is there bias in the data? All of that needs to be understood and figured out. Because, you know, you could say, hey, we fed it to this big machine, and we ended up with 20% accuracy on the best ML model it output, but that's still not good enough, because we're aiming for 87. And what does that mean? What do we need to do to get it to 87? We're going to have to review the data, bring in some third-party data, you know, and that costs a lot as well. So, yeah.
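
The "do we have enough data, and is it biased?" questions can be made routine before any money is spent. A sketch of such pre-training checks, assuming a pandas DataFrame; the thresholds and the loans_df example are hypothetical:

```python
import pandas as pd

def readiness_report(df: pd.DataFrame, label_col: str,
                     min_rows: int = 10_000) -> dict:
    """Illustrative pre-training checks: volume, completeness, class balance."""
    label_share = df[label_col].value_counts(normalize=True)
    return {
        "enough_rows": len(df) >= min_rows,
        "overall_missing_ratio": float(df.isna().mean().mean()),
        # Severe class imbalance is one crude, visible form of bias.
        "smallest_class_share": float(label_share.min()),
    }

# Usage (hypothetical DataFrame): run before paying for an AutoML pass.
# report = readiness_report(loans_df, label_col="defaulted")
```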

Grant
Do you think AutoML solutions play a role here? Like Aible; I don't know if you're familiar with that platform. The goal is to try to reduce the amount of dependency on the data scientists themselves, right? It still doesn't remove all of the data cleansing, but it does help take care of some of the low-level data science requirements. Do you think that's a viable solution in this area?

Paul
I think it is. I mean, you know, we went from rule-based AI, where data scientists had to do good old-fashioned AI, which was feature engineering, right, putting in the rules themselves, to machine learning AI, where we had to train on the data we had and were so dependent on these data scientists. And now we're getting to v3, where we have these tools. There's still a data dependency, but they don't have such a high dependency on data scientists figuring out algorithms, etc.; we can basically have these prepackaged algorithms output all types of solutions. I've seen this a lot in a lot of companies. Some companies are very, very industry-specific, right? They're providing AI for e-commerce, to provide better search with predictive elements based on the person's browsing history. I'm less sure about the ones providing every ML imaginable, so you could use it for supply chain or you could use it for something else. I know it's dependent on data, but again, you can't have all the algorithms for all scenarios.

Even if it's supply chain, one person has perishables and is ordering bananas, and the other person is ordering, I don't know, water coolers, and those don't have the same rules, right? So I think that maybe in the coming years we'll have a lot of companies that each really own one industry, like "we're in e-commerce," others in med tech, etcetera, and the tools are more or less the same. The customers are going to get used to basically having these UIs where you input the data, these ML models come out, you choose which one, they give you a probability, and you can retrain them and all that stuff. And I think it's going to get to a point where we're going to have product managers who are responsible for kind of training the machine learning model themselves. You know, whether it's going to be the product manager or some other function, I think it definitely fits inside the product manager's role.
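
Under the hood, those self-serve platforms do something like the following loop over candidate models. This is a hedged sketch using scikit-learn, not any vendor's actual API:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

# Synthetic stand-in for "input the data in".
X, y = make_classification(n_samples=2_000, n_features=20, random_state=0)

# The platform trains several candidates; the customer judges which to keep.
candidates = {
    "logistic_regression": LogisticRegression(max_iter=1_000),
    "random_forest": RandomForestClassifier(random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}
for name, clf in candidates.items():
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {score:.0%}")  # pick the best, then operationalize it
```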

Grant
Well, I think it does, because they still need to have what we would call the domain knowledge, in this case the domain of building products. At least in this phase of the life of AI, where we are today and for the foreseeable future, I think the product manager needs to be involved with that. Sure.

Paul
It comes down to intuition, right? Somebody has to build that intuition about what a model is relying on when making a judgment. And I think that, you know, the product manager is the closest one, really. Maybe in bigger organizations it's the person who's managing analytics and data, but in a smaller startup organization, I can definitely see the product manager owning that.

Grant
Yeah, absolutely. Paul, I really appreciate you taking the time here today; this has been a fascinating conversation. Any last comments you want to share?

Paul
We have tons of articles that talk about this; we're very open source as an organization. So if you want to learn more, we have about 70 articles on our website. Just go to BainPublic.com, click on "Articles," and you can just, you know, self-serve and basically improve as a product manager in the AI space.

Grant
Excellent, fascinating. I love the conversation, your insight, and the vision of where you guys are taking this; I think you're going to continue to disrupt. Everyone, thanks for joining another episode of ClickAI Radio, and until next time, check out BainPublic.com.


Thank you for joining Grant on ClickAI Radio. Don't forget to subscribe and leave feedback. And remember, to download your free ebook, visit ClickAIRadio.com now.