Your AI Injection

AI-Driven Manufacturing: Predictive Maintenance and Defect Detection with Ryan Thompson

May 16, 2024 Season 3 Episode 16

In this episode of "Your AI Injection," Deep Dhillon chats with Ryan Thompson from the CRB Group, an architecture, engineering, construction, and consulting company. The two discuss the current state of AI adoption in the life sciences and food and beverage manufacturing sectors, highlighting the challenges and opportunities. They delve into how AI can enhance productivity through predictive maintenance and defect detection, and the importance of integrating data across different systems. The conversation explores the potential of AI to transform manufacturing by improving efficiency, quality, and speed to market, envisioning a future where factories are self-optimizing.

Learn more about Ryan:
and about CRB Group:

Learn more about AI for product and processes here: 


[Automated Transcript]

Deep Dhillon: Ryan, thanks so much for coming onto the show. Maybe let's get started. Can you tell me a little bit about your typical client and walk me through where they are before you help them and how AI might fit into their future? 

Ryan Thompson: Sure. Uh, so CRB is an architecture, engineering, construction, and consulting company.

We're focused primarily in two verticals: life sciences manufacturing makes up around 80 percent of our business, and food and beverage manufacturing makes up about 20 percent. So we're engaged in both of those verticals. There's, I guess, a vast range of where companies are with their digital maturity.

And, you know, as a subset of that, AI maturity. I think when you look at manufacturers, and supply chain in general, they're pretty far behind lots of other industries: retail, banking, marketing, all of that. Manufacturing is, I don't want to say late to the party, but it's a more risk averse industry.

And especially with life sciences, there's a lot of regulation involved. So people are taking a standoffish approach until the FDA comes up with regulations that can be more easily followed. And there's a lot of nervousness about being first actors with respect to artificial intelligence.

Ryan Thompson: That said, on the life sciences side, I see our clients investing a lot in the drug discovery process. That's something that's pretty standard across the board, but they're not investing as heavily in the manufacturing side. And food and beverage is, I guess, a little bit further behind on their AI journey than life sciences would be.

Deep Dhillon: Maybe we'll just take one of these manufacturers. Let's start with something like the food and beverage side, because it's a little less abstract; everyone can kind of relate to it. So maybe walk us through what the factory floor looks like, where data is typically being gathered in this scenario, and where you envision data being gathered.

Ryan Thompson: Sure. So I think manufacturing data is in kind of a poor state in general across all verticals, but if we want to stick with food and beverage, we see the factory floor often broken into two major areas: a process area and a packaging area. Take a bakery as a food and beverage example.

So the process area would consist of equipment that makes dough, adds chocolate chips, puts cookies onto a tray, and then bakes them. And then the packaging area would take the cookies, put them in a tray, put them in a wrapper, seal them, verify the package, and get that off to, you know, Target or Walmart or whomever is buying the product.

So the data exists in a lot of different places. Each of those process and packaging areas will have time series data. On the process side, things like temperatures, humidities, pressures, agitator speeds. On the packaging side, you have a lot more discrete data: how many cookies am I putting in a package, how efficiently is my machine running.

And you'll also have kind of excursion type event data. So when something has gone wrong, you'll have discrete events that tie into that. And that's on the factory floor. But then on top of that, you've got your business data. So what product are you making?

Who is the end customer? How big is the batch size? How many cookies am I making? What's the duty? And you have this kind of metadata that surrounds that. And then you've got the third level, which is really the consumer level data. So how fast am I selling these cookies?

Who's buying them? What does my customer look like? And that's kind of in a separate silo; that would be more of a CRM, customer relationship management, type of silo. So that's really the challenge with manufacturing: the data exists in so many different silos. So where do you get started with AI?

Because looking at any one of those silos individually, although there are some benefits in some of this stuff, the ROI is not necessarily there, because the problem you're solving is smaller scale.

Deep Dhillon: Yeah, thanks so much for describing that world. So I'm going to start maybe on the factory floor, because it's interesting and different and we get to talk about cookies on a conveyor belt.

My first question, because you mentioned that these are maybe more traditional businesses and tend to be a little slower on the centralization of all the data: there's not necessarily a centralized data warehouse that has everything in it, the way you see with tech companies or software companies, where it's fairly far along.

Walk me through some of the reasons that didn't happen during the, quote, big data era, when everybody was busy instrumenting everything. Are there challenges where you have legacy equipment that just doesn't have data gathering capabilities built in, or doesn't have network capabilities to get the data off the machines? What were some of the challenges, and where is the state of that today?

Ryan Thompson: So that state, I guess, varies greatly. I think you hit on a couple of key points. Some of this equipment isn't networkable. Usually what runs factory floor equipment is something called a programmable logic controller, or PLC, and those tend to have long lifespans. So you might see some that were put in in the early 2000s or 1990s that still exist today.

And so it's difficult to get information from those. Some companies will take an IoT approach by putting in sensors, not necessarily directly connected to those control systems, and putting the data into the cloud and doing something with it. But it's still not tied into the whole data warehouse infrastructure. And I think the other issue, from a security point of view, is that a lot of these older systems also have software requirements, like they can't run on anything past Windows 7.

And so now, how do you connect that to your network in a secure way and get the information you need off of it? Of course, there are ways to do it with firewalls and rules and things like that, but it becomes a much more complicated problem. IT teams are generally hesitant to put those types of old systems on their network without a really strong reason to do so.

So you're left in a world where, on those types of systems, you can collect data, but it's often a manual process. It's time consuming, and so it can't really be done at scale. Well, it can be done to analyze one problem one time, but that's not really a good use case for AI, or even advanced analytics in general. It's more just a human looking at data and trying to interpret it.

Deep Dhillon: So what's the state of the market for addressing these problems? Is it increasing the sensors inside more modern machinery and just expecting the old equipment to grandfather its way out? Or is it building external sensors? Maybe, like, cams that stare at a dial and automatically interpret the results, but have modern Wi-Fi connectivity and, you know, security, et cetera.

Ryan Thompson: I mean, new equipment that's being purchased and produced today is all capable. Anything that's really been produced in the past 10 years is capable, but factories have lifespans that are longer than 10 years.

People do absolutely put in extra instrumentation. I have seen that use case of a camera, not pointed at a dial, but pointed at some other type of device. And that works if you want it to, but you really have to have a compelling reason why it's not just cheaper to install a separate sensor. If you look at what we'll call IoT sensors that connect directly to a cloud platform, a temperature sensor might cost you a hundred dollars, and maybe another $10 a year in SaaS costs to get data from it.

So those types of camera solutions don't often make sense. So we do see clients doing second sets of instrumentation on things that are important to them. And there's one use case we see a lot that's a really well defined case: predictive maintenance for motors. A 100,000 square foot factory might have 1,000 motors in it. Those could be things like running conveyor belts, or running mixers in that dough or cookie manufacturing that we talked about.

Putting a sensor on a motor that measures the temperature, vibration, and current it's generating is really cheap, less than 1 percent of the installed cost of the motor. And predicting failures for motors based on temperature and vibration is a really well defined machine learning case, so you can provide a lot of value really quickly with just a separate sensor connected.

You do miss some of the context. For instance, mixing dough: is this recipe a thinner dough or a thicker dough? Does that affect the failure profile of the motor? So you do miss some of that context, but you do know when the device is going to fail, and you can adjust your maintenance strategy to address that.
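The motor-monitoring idea Ryan describes can be sketched in a few lines. This is a hypothetical illustration, not CRB's actual system: it flags a motor whose vibration readings drift well outside its own historical baseline, which is the simplest precursor to a trained failure-prediction model. The function name, threshold, and sample readings are all invented for the example.

```python
from statistics import mean, stdev

def vibration_alerts(readings, baseline_n=20, z_threshold=3.0):
    """Flag readings that sit far outside the motor's own historical baseline.

    readings: vibration measurements (e.g. mm/s RMS), oldest first.
    The first `baseline_n` readings establish normal behavior; later
    readings are scored against that baseline as a rough health check.
    """
    baseline = readings[:baseline_n]
    mu, sigma = mean(baseline), stdev(baseline)
    alerts = []
    for i, value in enumerate(readings[baseline_n:], start=baseline_n):
        z = (value - mu) / sigma
        if z > z_threshold:
            alerts.append((i, value))  # (sample index, reading) worth a maintenance look
    return alerts

# A healthy motor hovering around 2.0 mm/s, then a bearing starting to fail:
history = [2.0, 2.1, 1.9, 2.0, 2.2, 1.8, 2.0, 2.1, 1.9, 2.0,
           2.1, 2.0, 1.9, 2.2, 2.0, 1.8, 2.1, 2.0, 1.9, 2.0,
           2.1, 2.0, 4.5, 5.2]
print(vibration_alerts(history))  # the 4.5 and 5.2 readings are flagged
```

A real deployment would fuse temperature and current with the vibration signal and train on labeled failure history, but the shape of the problem is the same: per-motor baselines, cheap sensors, and an alert that feeds the maintenance schedule.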

Deep Dhillon: So, from the fact that you want to be able to predict when a motor is going to fail, I think I'm supposed to read that the primary driver is to prevent the whole conveyor belt, or the whole line, from failing. And that was kind of my question: what's the thing that would motivate getting AI onto the floor? Is the main driver maintaining operations, with optimizing throughput second?

Ryan Thompson: Yeah. So these factories are generally capital intensive buildings; they need to run 24/7 to make a return on investment. So keeping those running is a powerful use case. Optimization is another powerful use case, more so in life sciences than in food and beverage.

In life sciences, the cost of goods is very high, so getting the most out of everything makes a lot of sense. The processes in life sciences are also more complicated; they're multivariate. People can't look at a biological process, at temperatures and oxygen levels and things like that while we're growing cells, and predict what's going to happen.

Whereas AI platforms, or machine learning platforms, are much better at that than people could ever be. And it can also be done at scale: instead of doing one process at a time, you can do hundreds or thousands of processes.

Deep Dhillon: Okay. So now tell me what you do with these clients, now that I, and our audience here, kind of have the lay of the land.

I'm envisioning your clients are food and beverage companies that have many, probably all, of the same things other companies have. They have sales groups and marketing groups, and they're measuring how well their product is doing.

But then you have this special world where you're physically manufacturing something, and we've talked about that. On the life sciences side there's a bit more complication, because you have to decide how to commit capital to a particular drug, which involves a whole R&D phase that's probably fairly extensive.

But what's your role as somebody who's trying to bring these more traditional industries into more state of the art technological applications and AI driven tools?

Ryan Thompson: So, like I said, we see clients in a range of digital maturities. I think the first thing I want to look at is, well, what do you want to do as an organization? My role is a consulting role. So what goals do we want to accomplish here? We don't want technology that's in search of a problem. I want to know what business goals we need to accomplish. Is my factory throughput at maximum, and do I want to get more throughput out of it? Do I want to reduce my labor costs? Do I want to get products to market faster? Do I want to be able to use equipment to make different products?

Industry 4.0 is kind of the term we use for the fourth industrial revolution, where we're taking advantage of orders of magnitude increases in computing, you know, since Moore's law. We have all this connectivity and computing power, so how can manufacturing leverage that? The idea is that it's the fourth industrial revolution, the three before being the steam revolution, then electricity and mechanization, and then automation.

There are really four benefits for manufacturers. One is productivity: getting more product through the plant, faster and cheaper. Then quality: what do your rejects look like? In life sciences, the cost of quality is very high, so this can actually be the number one driver for people adopting Industry 4.0 technologies.

And then you also have speed, so that's time to market. That can be really important in food, for instance, taking advantage of changes in consumer trends. I always think of this because of projects I was involved in: getting avocado oil into foods was a huge thing. I don't know if that was like five years ago, but marketing, and now

Deep Dhillon: There was such a thing as avocado oil?

Ryan Thompson: There is. You can squeeze an avocado, but it's

Deep Dhillon: I picture the green stuff.

Ryan Thompson: Yeah, but it has health benefits, and consumers loved it. Right. But if you could be first on the shelf with a product made with avocado oil, it made a huge difference to your sales.

So that speed to market can be very important. And also in life sciences, if you're the first company to market with a drug for a particular condition, you're going to capture market share that you're never going to relinquish. That's millions, if not billions, of dollars of difference. So speed can be another important factor. And then finally, flexibility.

So that's really the ability to change what product mix you're making and what your equipment can do. That one's a little more difficult to define, and it's usually the one clients rank lowest of those four priorities. So we look at, you know, what does your business want to do?

Which of these are important to you? And then we can start having conversations about where you are as a business in your digital maturity. There are a couple of different what we'll call digital plant maturity models, which will gauge different areas of your business on a scale of 1 to 5.

Those models let you look at where you're going to get the most bang for your buck. Say you're really good at scheduling, but your defects are poor. Then you're going to get more return on your investment by investing in some sort of quality program that will reduce your defects.

The other thing about those models is they let you track how you do over time, to answer that question: what do you want to gain from digital manufacturing, and where are you now? And then we can start figuring out where you want to go.

Deep Dhillon: So I'm envisioning, like, you know, you're having a series of conversations, maybe you're consuming some of the documentation of your clients and you're sort of walking them through, uh, but ultimately your goal at this stage of your engagement is to figure out what the most bang for the buck problem is that they could focus on.

Is that a fair assessment? 

Ryan Thompson: That can be a fair assessment. Companies will generally have a vision: is this a six month project? A three year project? A five year project? When you start looking at those longer time horizons, it may not be about the most bang for your buck in the short term. It's more, how do we change the culture of this business to be digital first? I think about Amazon's API mandate when it came out: everything is going to be accessible via an API, and that cultural shift.

Deep Dhillon: Yeah. For the sake of our audience: at Amazon, I think it was in the fairly early days, Bezos put out a mandate that every single team in Amazon had to put APIs around their service so that any other team could access their data and use it. I think originally it was internally facing, but then of course it grew into the massive industry we know of as AWS at this point.

Ryan Thompson: Right?

And that fundamentally changed their business. I think when you look at manufacturers, they need to do this. They need to fundamentally change the way they're operating. Every manufacturer I've worked with operates in Excel sheets.

Deep Dhillon: Why do you think that is? Why is it that manufacturing is slower to jump on some of these technology advancement trends?

Ryan Thompson: I don't know how much of it is history and being burned in the past. Like, Windows updates used to break manufacturing equipment, so there's been a hesitancy there. There's also been a hesitancy around, I don't know if you're familiar with it, IT versus OT.

So OT is operational technology. That's really the technology stack that makes your factory work. And there was often incompatibility between the two: IT people wouldn't support OT systems, and OT people were hesitant about IT. There was this cultural rift that was created.


Deep Dhillon: What were the OT systems?

Ryan Thompson: Operational technology.

Deep Dhillon: Okay. So this would be like the machines on the floor, the

Ryan Thompson: The machines on the floor, and the whole technology stack that makes them run. For instance, there are communication protocols, like OPC UA, that nobody in the IT world has ever heard of but that are a huge thing in the OT world.

So just the technology stacks are different. And I also think what changed, maybe from 2005 onward, was that in the past, systems operated by the OT group had a very high uptime requirement, 99.9 percent, and IT systems weren't in that area yet for businesses. An outage was a huge thing on the factory floor, like we talked about a little bit. But now, when you look at IT systems versus OT systems, IT systems have got a lot of extra nines, especially when you look at cloud based systems, AWS, Azure, GCP, whatever, or even private data centers run by these big manufacturers. Their uptime is much higher.

And what I like to guide our clients towards is: if you look at solutions built on AWS or Azure or whatever, you have hundreds of millions of people around the world using these systems. When you look at the OT technology stack, you have tens or hundreds of thousands of people looking at it. So you've got that order of magnitude difference in user base, in solutions that are out there, and in reliability.

So I kind of guide them to embrace IT solutions, because they're built for more reliability than OT systems are now. Like, when was the last time you noticed Google was down? It just doesn't go down, right?
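The "extra nines" Ryan mentions translate directly into annual downtime budgets. This is generic availability arithmetic, not any vendor's actual SLA figures:

```python
def annual_downtime_hours(availability):
    """Hours per year a system at the given availability fraction can be down."""
    return (1 - availability) * 365 * 24

# Three nines vs. four vs. five: each extra nine cuts allowed downtime tenfold.
for nines in (0.999, 0.9999, 0.99999):
    print(f"{nines:.5f} -> {annual_downtime_hours(nines):.2f} h/year of allowed downtime")
```

At 99.9 percent, the old OT benchmark, a system can be down nearly nine hours a year; a cloud platform targeting 99.99 percent gets under an hour. That gap is the quantitative version of Ryan's argument for leaning on IT-grade infrastructure.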

Deep Dhillon: I mean, yeah, I hear you. So one question I have, turning a little more towards AI, because this is an AI podcast after all: how does that fit in? You mentioned you go through this process with these manufacturers and identify some of the key problems or areas they want to focus on. We've also talked a lot about the data gathering challenges in the operational context, and of course there are challenges in other contexts as well. But what drives the AI into one of these places?

Is it AI that's pulling in more sensors and more data to feed better models? Or is it the other way around, where there are just a lot of sensors and models and data, and executives are asking, what can we do with all this investment we've made in gathering this data?

Ryan Thompson: I think it goes more from the executive push, especially when we start looking at 2022 onward, when ChatGPT entered popular culture and AI became a topic in Fortune 500 companies' earnings reports. You'll also see consultants talking with the C suite, saying, oh, you've got to do AI, you've got to do AI. But there is a bit of a disconnect between what's going on at the executive level and what actually exists on the factory floor.

So, I mentioned earlier that some of the use cases we're tackling in silos, they're great. We talked about predicting when motors are going to fail. There's also a strong use case for defect detection using vision AI, whether that's a cloud based service or an edge service if it's a high speed application. Those are great use cases that save a ton of money; there's a lot of return on investment there.

But I think what we're really striving towards is a more holistic view of how your whole factory operates: being able to use data from all those levels, from the factory floor, from business systems, from consumers, and give suggestions to people on how to better operate the manufacturing floor. That's where we're going to get the most long term gains, where we have systems supervising at the management level, really, to optimize the output of a factory.

Deep Dhillon: Let's break that down a little bit, because I'm not sure I'm following you there. Maybe we imagine the factory of the future here, where every machine is fully instrumented.

All of that data is being gathered and centralized: all of the sales data, the data on the product and how it comes to be. Maybe there are even cameras around for the humans that are moving, looking at efficiencies there. So all of that raw data exists. What does it mean to move the business forward?

Is it sort of a leap of faith: hey, we're going to bring in a bunch of data scientists, have them start stack ranking the problems by severity, then maybe assess the feasibility of a machine learning solution and go off and prototype, now that we have this data continuously being gathered? Is it something like that? Or is it more top down, like, we actually know there are particular problems in particular areas, and we're going to go selectively gather data and try to address those? Or is it something else?

Ryan Thompson: Yeah, I think it's more the first one, where we spend time and investment on instrumenting things, making sure we've got our data contextualized and organized in a way that we can ingest it, and then we can start having your data science teams really look at things.

I think that's another opportunity for manufacturers, too, as often they don't have data science teams. Their data science teams might be focused on process development, but they're not traditional data scientists. And in the areas where we know there are problems, it may or may not be an opportunity for artificial intelligence.

The example I like to give, since we talked about life sciences: they're biological processes. So what we'll call a bioreactor, which is really a tank where we grow cells to make medicines. Those processes are very tricky to optimize. We know that improving those processes by 1 to 3 percent can make a difference of millions of dollars for companies across an organization.

But where we mostly are now is doing proof of concept types of projects, analyzing one bioreactor. That's great, exciting, and might benefit at the scale of one bioreactor, but what if we have a thousand across our organization? How do we model our data like that? So there are a couple of known problems where AI can have a direct impact, because those problems are just very difficult for humans to solve, and to solve at scale.
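Ryan's "1 to 3 percent across a thousand bioreactors" claim is easy to make concrete. The batch value and batch count below are made-up illustration numbers, not figures from the episode:

```python
def fleet_savings(batches_per_year, value_per_batch, yield_gain):
    """Annual value of a fractional yield improvement applied fleet-wide."""
    return batches_per_year * value_per_batch * yield_gain

# Hypothetical: 1,000 bioreactor batches per year, each producing $2M of product.
for gain in (0.01, 0.03):
    print(f"{gain:.0%} yield gain -> ${fleet_savings(1000, 2_000_000, gain):,.0f}/year")
```

Even at these modest assumed numbers, a 1 percent improvement is worth $20M a year fleet-wide, which is why a proof of concept on a single tank undersells the opportunity: the same model applied across the organization is where the return shows up.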

Deep Dhillon: Well, so one question I have, putting the data science on the shelf for a moment: is there a tradition in these arenas of having efficiency teams, or some kind of tiger team that goes in and tries to solve particular problems, like the root of, you know, the cookie optimization problem? Is there a notion of that?

Ryan Thompson: Yeah, absolutely.

Deep Dhillon: So how do they operate? Maybe you can give us a little insight there.

Ryan Thompson: So continuous improvement is a big portion of manufacturing. Toyota kind of made that famous in automotive, but it applies to other industries as well. What we see a lot of right now is Six Sigma, or Lean Six Sigma, which are programs developed to make data driven decisions, with very specific processes to follow. It's really about looking at problems in an analytical way to solve them.

If we looked at that kind of panacea system, where we have all the equipment on the factory floor instrumented and all of our data connected, I think AI can help point us towards problems to solve as well. I think that's a really compelling use case: hey, this isn't right. And then you bring in that continuous improvement team to go look at those particular problems.

Deep Dhillon: Let's say you have a continuous improvement team. They go study a problem, they come up with recommendations. What's the typical mechanism for effecting change on the floor and in the operational structure? Is it that they present a case to some senior execs and then somebody pulls a lever?

Ryan Thompson: That's generally right, exactly. They would make a recommendation. We'd put together a project with the actions we have to take, whether that's more instruments, or we need to write code, or we need to install something, and then the project would get executed. And then you would reconcile the results after you executed it.

Deep Dhillon: So those executives, when they see that set of recommendations, inevitably the recommendations are going to cost some money and take some resources. What's the level of data support for the recommendations, typically? Is it observational, or is there actual measured experimental data that they're using to make those cases?

Ryan Thompson: It can be both, but I think most of the time there is data. The types of continuous improvement cases would be: we're currently making 100 cookies per minute. If we make these improvements, we will make 110 cookies per minute, which means this machine needs to run X hours less per year, which is going to save us this much labor and this much utility cost. It's usually very tactical and concrete like that.
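The business case Ryan sketches reduces to simple arithmetic. The annual cookie demand below is an invented illustration number; the 100 to 110 cookies-per-minute improvement is the one from the conversation:

```python
def hours_saved_per_year(annual_units, old_rate, new_rate):
    """Machine-hours saved when throughput rises from old_rate to new_rate
    (units per minute), holding annual output constant."""
    old_hours = annual_units / old_rate / 60
    new_hours = annual_units / new_rate / 60
    return old_hours - new_hours

# Hypothetical: 30M cookies per year, line improved from 100 to 110 cookies/min.
saved = hours_saved_per_year(30_000_000, 100, 110)
print(f"{saved:,.0f} machine-hours saved per year")
```

Multiply those saved machine-hours by the loaded labor and utility rate for the line and you have the concrete, tactical number that goes in front of the executives.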

Deep Dhillon: Just thinking out loud here, in this factory of the future: it seems like, as an exec of one of these manufacturing facilities, you make a concerted investment in instrumenting and gathering data, and a consultant like you helps them get to the point of making that investment.

Once you have that data, then you hire some data scientists, but maybe you embed them on these efficiency teams. So they're going through with the team, helping make the case a little clearer. Why do you think, exactly, that it's going to go from 100 cookies per minute to 110?

Oh, well, we have this statistical model that took all these variables into account, and we've been able to generate past training data across the variations in those variables. And now we can stand on something a little more solid. It feels like a route. Is that legit?

Ryan Thompson: Yeah, I, I absolutely think so. So I think when we look at some of these six Sigma teams, I have a six Sigma green belt, for instance, but I didn't, would never call myself a data scientist, I'm more. I can fit models like usually regression analysis and bell curves and things like that, but I don't have, The rigor of dealing with, you know, large data sets and designing experiments and things like that.

Um, but absolutely, having data scientists support continuous improvement teams would be a huge advantage.

Deep Dhillon: Yeah. I mean, let's take the cookies-per-minute idea. So, you know, you have some measure of how many cookies are coming out per minute, and you have a bunch of variables that you're tracking over time.

So now you have a curve of cookies per minute, maybe run in different contexts, late at night, early in the morning, stuff happening, kind of like a natural experiment that's happening all the time. So now you build a model to predict those cookies per minute, and then you can go back and start torquing some of those variables that you think you could actually affect, I don't know, turning on some fan or whatever those variables are. That seems quite promising, you know. And then you can leverage the machine learning or AI systems

to go backwards and build explainability into them, so you can sort of reverse engineer why you think you're going to get from 100 to 110, but specifically from the variables and their values and their sort of interconnectedness. Is it something like that?

Ryan Thompson: Absolutely. I think what I really liked in what you said there was the natural experiment piece.

You know, when you start looking at design of experiments, now you've started purposefully tweaking things and making measurements. So you do have some of that observer effect, who knows what you're changing just because you're actually running a real experiment. But it's also difficult to manage from an operational point of view, because instead of my normal production schedule, now I've got to have my engineering team come in.

They're going to tweak this recipe, but maybe we didn't finish our batch on time, and they're kind of causing more headaches and interference. Whereas if you're able to run a natural experiment, where now I'm making hundreds of batches of something a year, you get a lot more data and can really see what those variables affect.

And then the other thing is, you remove a lot of the human bias out of it, too. Like, you know, you might have a process engineer that's convinced it's this one pressure variable: we need to change the pressure and it's going to optimize the process. Whereas an AI algorithm doesn't have those biases.

It will just look at everything holistically and make recommendations.

Deep Dhillon: Well, it has other biases, but yeah, 

Ryan Thompson: there's, there's less when you're looking 

Deep Dhillon: at. That's kind of what I was. You know, when I, when you were explaining the sort of classical technique of these sort of efficiency teams, I mean, putting myself in the shoes of the exec, I just feel like I would be wanting them to prove to me that this is going to be the case.

And it seems like the more data you have to make that case, and not just the more data, but the more effectively you can make that case with data, the better.

Deep Dhillon: So it seems like there's something else. We talked about the natural experiment side, but it seems like there's the slightly unnatural experiment that you could do as well. Maybe you get the factory after hours or on certain weekends, and you can muck around with variables, stick a few more people in the catch-the-cookies-falling-off-the-conveyor-belt location, certain things like that, where you can start to capture a little bit more data.

It feels like one of the challenges here with these kinds of systems is being able to get enough spread across the variables, in these other contexts, to be able to figure out what's actually going to work. Because in the natural context, maybe you just don't muck with the pressure variable very much, it just stays within certain bounds, but you wouldn't know if you let it swing high or low whether it actually can make a big difference.

Do you have to, or...

Ryan Thompson: The more diverse your product portfolio is, the better you're going to do with those natural experiments. Like, you might have some processes that naturally run at, say, 80 Fahrenheit, and some that naturally run at 60, and you get that variation there. And, you know, cookie A might be different from cookie B in that parameter for whatever reason. But you're right.

And, you know, in a well-controlled process, your parameters may not be varying that much, and you don't get it. Like, you might get noise that is wrong, by, you know, temperature being off by 0.1 degrees. Is that really affecting anything, or is that just noise? So like I said, the more diverse your portfolio is, the better you're going to get those natural experiments.

The tricky part is with the onesie-twosies, where you do have access to the machine to do what we'll call engineering runs. You don't get to do enough of those for AI to make a good assessment of the effect of those variables. It's more a traditional design of experiment: I tweaked this value from A to B, and these were my results.

You don't get to train it on 10, 100, 1,000 sets of data where it can really be more sure of the cause and effect.
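Ryan's point about parameter spread can be made concrete with a small check. This is an illustrative sketch with invented tag names and a rough coefficient-of-variation threshold, not an established method: before trusting natural-experiment data, verify each logged variable actually varies enough for a model to estimate its effect.

```python
from statistics import mean, stdev

def spread_report(history: dict[str, list[float]], min_cv: float = 0.005) -> dict[str, bool]:
    """Flag which logged variables vary enough, relative to their mean,
    for a model to estimate their effect from natural-experiment data."""
    report = {}
    for name, values in history.items():
        cv = stdev(values) / abs(mean(values))  # coefficient of variation
        report[name] = cv >= min_cv
    return report

# Hypothetical historian extracts: pressure barely moves within its control
# band; oven temperature swings across batches.
history = {
    "pressure_psi": [30.00, 30.01, 29.99, 30.00, 30.01],
    "oven_temp_f": [348.0, 353.0, 347.0, 352.0, 350.0],
}
print(spread_report(history))
```

A variable that fails this check is exactly the "well-controlled pressure" case above: the natural data can't tell you what a large swing would do.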

Deep Dhillon: Makes sense. Yeah. And I imagine there are other very different applications of AI as well, where you are putting in your own sensors and maybe have a more niche or tailored computer vision problem.

Like, I don't know, boxes flying down a conveyor belt line. They're all supposed to have six parts in them, these are the six parts that are supposed to be in them, a camera shoots them, you count them: did you get all six parts? Yes or no.

Ryan Thompson: Computer vision is a huge opportunity. This is something I talked about in a webinar a couple of days ago. But, you know, in the past, I'd be sitting there programming a camera and saying, you know, how do I tell the difference between an S and a 5?

Well, I've got to tell it, you know, an S has this many pixels in this direction and this many pixels in another direction, and a 5 is totally different. Um, and so I've spent hundreds of hours programming cameras, saying this is what it's supposed to look like. And it's not very robust.

You might have your ink change on something that you're printing, and now all of a sudden that S looks like a five to the camera. Using AI to do something like that is super fast, quick, easy. And as I mentioned earlier, if it's a slow operation, you can leverage cloud tools with a hundred-dollar camera.

If it's fast, and a lot of these applications we're talking about are sub-seconds between parts, you'll need some sort of edge compute to do that. But it's a super, super easy use case. In a previous life, I had a customer that was shipping products to Walmart, and they would get dinged because the labeling on the secondary packaging couldn't be scanned.

Putting in a camera system solved that. Using an AI-based camera system reduces your engineering costs by 50 percent. Really easy use case, really easy to implement. And again, I talked earlier about having hundreds of millions of people that use cloud tools versus tens or hundreds of thousands that use OT types of tools.

So you get these use cases that are already well developed. There's really no programming involved at all. Like, can AWS read this barcode? Yes or no. And you're done.
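The cloud-versus-edge tradeoff Ryan describes comes down to latency budgets. Here is a toy sketch of that dispatch decision, with assumed round-trip numbers that would need measuring in any real deployment:

```python
def inference_target(cycle_time_s: float, cloud_rtt_s: float = 0.3,
                     edge_latency_s: float = 0.02) -> str:
    """Pick where a vision check can run: the whole inspection (capture,
    inference, reject decision) must finish within one part cycle."""
    if cloud_rtt_s < cycle_time_s:
        return "cloud"      # a cheap camera plus a cloud vision API is enough
    if edge_latency_s < cycle_time_s:
        return "edge"       # sub-second gaps between parts need on-prem compute
    return "redesign"       # even edge inference can't keep up with the line

print(inference_target(5.0))   # slow case-packing line
print(inference_target(0.1))   # high-speed line, sub-second parts
```

The latency constants here are placeholders; the point is just that the part cycle time, not the model, usually decides the deployment architecture.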

Deep Dhillon: So going back to something you said a few minutes ago, you mentioned that these executives in these manufacturing facilities are really aware of, you know, GPT.

And based on that, they're more open, and pushing some of these AI capabilities down into the manufacturing centers. Walk me through a little bit more of that. Are we talking about the chat capabilities and the natural language conversations that you can have? Are we talking more about the generative imagery stuff?

And if it's the natural language stuff, what are they thinking? Is it like customer support, customer experience, technical support stuff? Is it internal use cases to, like, transfer knowledge efficiently among employees? What kind of stuff are you seeing there?

Ryan Thompson: So I think the natural language is really exciting.

I think, you know, pharmaceutical manufacturers specifically use that for drug discovery. As I said at the top of the podcast, that's where they're investing a lot of money, trying to figure out what the next molecule looks like. And that's when we start looking at, okay, let's look at all the research that's out there for drug discovery.

Humans can't stay on top of everything; it's just too vast. Natural language processing is a great tool for going over tens of thousands of research papers, lab papers, putting that all together and then giving some recommendations: hey, we think these are the priorities. And then you have really smart people as part of that.

Part of that will be data scientists; part of that will be, like, PhD scientists: hey, how will this molecule affect this function of the body? So that is really well-defined research, and the companies are investing heavily there, because for life sciences manufacturers, R&D is a big cost center. But if we're going to make innovations, we need to continually invest, and AI is a huge tool helping that.

Deep Dhillon: Yeah, that's a big one. In fact, we were building solutions like this for biopharma back in the late '90s. Um, and back then it was really hard to do some of this stuff. So we were doing, like, grammatical parsings of text, think seventh-grade grammar students armed with really large dictionaries, so that you could say, this particular gene inhibits the expression of this particular gene, you know, in the mouse model or whatever.

But now, of course, with LLMs, like, oh my gosh, the stuff we were dreaming about is all pretty well realized.

Ryan Thompson: It's always tough, like, going back 10 years and looking at how hard you worked at something, and now it's like, I could just do that.

Deep Dhillon: Oh, we were. 

Ryan Thompson: Yeah. 

Deep Dhillon: Yeah. I mean, if we were going to solve it the right way, we would have had to invent, you know, deep neural nets and LLMs. Like, a lot of evolution had to happen.

You know, we were building these heavy dependency tree parsers that were largely rule-based, but, you know, statistics too. But yeah, on the other side, I'm not in the bio, not in the pharmaceutical arena, but on the food and beverage side, what is the role of LLMs there, do you think?

Ryan Thompson: Food is more of a consumer-facing good.

So you do see, like, the typical use case of chatbots and things like that, but even handling customer complaints in a more consistent way. You think about it: every box of food that you buy has a 1-800 number to call for complaints, but being able to handle those using AI and tying that back to manufacturing data.

So I know that this box of cookies was made on this date, and then you can kind of tie back the manufacturing data there. So you can start building those trends as you get more and more customer complaints, 'cause people never call to compliment the cookies. They only call if something's gone wrong.

Deep Dhillon: Oh, I see. Right. So you want to see if there was actually something going on?

Ryan Thompson: Well, you can get trends a lot better, right? Being able to tie customer complaints to specific manufacturing lots. But even if you're not tying it to manufacturing lots, being able to just monitor customer complaints against SKUs or locations can be a huge benefit.

It's also like a time consuming task that people don't like doing. Imagine if you're the person that's responsible for reviewing this Excel list every week of all the complaints that you got. 
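The complaint-monitoring idea, replacing the weekly Excel review with something automatic, can be sketched in a few lines. The SKUs, lot numbers, and threshold below are all invented for illustration:

```python
from collections import Counter

# Hypothetical complaint log: one (sku, manufacturing_lot) pair per call.
complaints = [
    ("choc-chip", "LOT-0412"), ("oatmeal", "LOT-0388"),
    ("choc-chip", "LOT-0412"), ("choc-chip", "LOT-0412"),
    ("ginger", "LOT-0401"), ("choc-chip", "LOT-0415"),
]

def flag_lots(log: list[tuple[str, str]], threshold: int = 3) -> list[str]:
    """Surface manufacturing lots whose complaint count crosses a threshold,
    instead of someone eyeballing the weekly spreadsheet export."""
    counts = Counter(lot for _, lot in log)
    return [lot for lot, n in counts.items() if n >= threshold]

print(flag_lots(complaints))
```

A production version would normalize free-text complaints with an LLM first and compare counts against shipment volumes, but the lot-level roll-up is the core of tying complaints back to manufacturing data.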

Deep Dhillon: Is that how they do it?

Ryan Thompson: I can't say that for sure, but that's certainly a way. And especially if you're a smaller company, you know, you've got an Excel sheet that's tracking these complaints.

It's just not an enjoyable job, and you're not going to spot trends as well as a computer will. The other thing, I think, with large language models is more on the marketing side. I talked about it earlier, like the avocado oil fad: seeing what consumer trends are by monitoring what's going on on the internet in general, Reddit or Facebook or whatever popular tools are out there where people are talking about, oh, I just tried this new habanero snack. Habanero is going to peak in popularity, so we need to be ready with our version of that.

So that kind of marketing and product planning piece is another huge area, because it's just impossible for people to monitor all the information that's out there, and AI is just a much better tool at it.

Deep Dhillon: Funny that you mentioned that. This was decades ago, in a prior life, but we were working for, I think it was General Mills or one of the big food companies. And they were looking at flavor predictions. This would have been the early 2000s, I think.

And so we were mining all this blog content to figure out what the next big flavor was going to be. We spent tons of time and energy on it, and then we said, it's going to be pomegranate. They were like, pomegranate? Why would you say that? And I'm like, check it out, here's the data. And then sure enough, I think it was six months later, pomegranate was everywhere. Like, everywhere, every fruit drink, everything.

And they were super excited about it. But that is a huge thing.

Ryan Thompson: I love that story. But it's a use case that companies continue to invest in, and again, now using large language models.

Deep Dhillon: So people are still doing that. Yeah. I mean, that was a great use case, and they were so excited when we showed them all the data behind pomegranate.

I don't know if they went off to invest in it or not, I have no idea, but they appreciated the methodology and stuff.

Ryan Thompson: But having the right flavor at the right time, it's millions of dollars.

Deep Dhillon: Yeah, and same thing for clothing too. Trend prediction is a big deal, 'cause you gotta ramp all this physical machinery and supply chain and everything to be able to produce the thing at volume. If you can be out two months in advance, then you're the one that everybody's buying from when the thing takes off.

Ryan Thompson: If you remember, at the start of the pandemic, the Truly peak, right? Like, everybody was buying those, and they were coming up with all sorts of new flavors.

Deep Dhillon: The hand sanitizer stuff or whatever?

Ryan Thompson: No, no, the Truly, the alcoholic seltzers.

Deep Dhillon: Oh, okay.

Ryan Thompson: There was a big peak in that market, like, start of the pandemic, kind of spring, summer 2020. It went huge.

Companies retooled their factories to make more seltzers, and then, you know, eight months later, the market died, because people didn't love them as much as they used to. It went back to normal, but there was a ton of capital and investment spent there. And I don't know how much of that you could really have predicted.

That's just the nature of the pandemic being kind of unexpected.

Deep Dhillon: A once-in-100-year type of event. Very few things have proved forecasters wrong more. So, um, this has been a super fun conversation, and I feel like I've learned a ton about this space. I want to ask you a question.

So a lot of our listeners are, you know, product managers, or basically folks trying to understand how to use AI in their business, in their product, in their processes. What can you say to them? Maybe speak to folks you can imagine in the industries you're talking about, but it could be more general than that.

You know, what do you think they should do? How should they get involved? What are some good starting points, some tips?

Ryan Thompson: So I think, like I said, manufacturing is kind of a mess on the factory floor. So software that makes it easier to organize your information and present it in a way that can be ingested by these types of systems is a big deal.

So right now, even if I do have a piece of equipment that is instrumented, it's still hard to get that data into Azure, for instance; it's a time-consuming process. So having that kind of direct connectivity into those cloud ecosystems, I think, is going to be really helpful. I have also seen a couple of products that, I don't know what they use, but they make their own kind of niche AI-type software that is an on-premises kind of solution.

But I think the most important thing is understanding the state of data on the factory floor, how it's structured, and making it easier to de-silo it and connect it with business and consumer information as well. I think that's the missing link right now for manufacturers to really leverage artificial intelligence.
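Ryan's point about de-siloing and structuring factory-floor data can be illustrated with a toy normalization step: map a vendor-specific historian row (the field names here are hypothetical) into one common, cloud-ready record shape.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """One de-siloed sensor sample in a shared, cloud-ready schema."""
    line: str
    tag: str
    value: float
    unit: str

def from_historian_row(row: dict) -> Reading:
    """Map one vendor-specific historian row into the common schema,
    so downstream cloud tools see a single shape of data."""
    line, tag = row["asset"].split("/")
    return Reading(line=line,
                   tag=tag,
                   value=float(row["val"]),
                   unit=row.get("eng_unit", "unknown"))

# Hypothetical raw row as an OT historian might export it.
row = {"asset": "line-3/oven_temp", "val": "351.2", "eng_unit": "F"}
r = from_historian_row(row)
print(r.line, r.tag, r.value, r.unit)
```

In practice each vendor's historian needs its own mapping function, which is exactly the connectivity work Ryan says makes the data hard to land in Azure today.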

Deep Dhillon: So it's kind of like, get your data warehouse populated. 

Ryan Thompson: Yeah, I agree. I think that's it.

Deep Dhillon: All the fancy stuff we've talked about, the prerequisite for all of it is that you actually have some data. Ideally, you've instrumented the heck out of your factory, and you can now go and start to realize some of these predictions, whether it's cookies per minute or what have you.

Ryan Thompson: Exactly. 

Deep Dhillon: Great. Well, thanks so much for coming on the show. I'll end with one last question. Let's fast-forward; you know, this is my favorite question to end on, so my listeners are probably like, yep, I know what's coming. Five to 10 years out, if everything that you think, or want to happen, or regularly recommend happens, what does a state-of-the-art manufacturing facility look like?

Ryan Thompson: All right. Well, five to 10 years is so hard, 'cause I couldn't have imagined where we'd be. If I take a step back five or 10 years, I wouldn't have predicted where we are now. So I like to think that things will be self-optimizing, and optimize is a tricky word.

You know, what does that mean? What's being optimized? But, you know, if we're designing factories digitally, so we have digital twins of tanks and packaging lines, the facility should be able to set all of the parameters it needs to set, operate all of the equipment it needs to operate, and predict what that factory is going to be doing one minute, five minutes, an hour, a day into the future. So you really have kind of a complete digital, almost end-to-end supply chain: when you've got raw materials coming in, you know what you're shipping out.

You always know the instantaneous state of the factory, and you're able to predict with a high degree of accuracy what the future state is. And of course, the further into the future you're predicting, the less accurate that prediction is going to be. But right now, I think we're in a state where we kind of sorta know the instantaneous state of the factory, and our predictions are just not great.

Deep Dhillon: Awesome. Well, thanks so much for coming on.