Again, a very warm welcome. Thank you for joining us today for our webinar titled AI Solutions in Manufacturing. My name is Saurabh Kumar, and I'm on the marketing team here at H2O.ai. I'd love to start off by introducing our speakers. Our first speaker is Eric Choo. Eric is armed with the technical know-how of data science, machine learning, and big data analytics, and is equipped with the skill sets to add value to businesses exploring AI through a consultative approach. Our second speaker for the day is John Hang. He works as a solutions engineer and data scientist at H2O.ai and is based in Singapore. He helps business, government, academia, and nonprofit organizations with their transformation into AI.
Before I hand it over to our speakers, I'd like to go over the following housekeeping items. Number one, please feel free to send us your questions throughout the session via the questions tab in your console. We'd be very happy to answer them towards the end of the session. This presentation is being recorded, and a copy of the recording and the slide deck will be available shortly after the presentation is over. Without further ado, I'd now like to hand it over to Eric.
Thank you [inaudible]. All right, I will start my presentation now. Just a quick introduction to who we are: we are H2O.ai, and we are on a mission to democratize AI for everyone, making your company an AI company. Now, on to Industry 4.0. Pretty much whenever we talk about AI in manufacturing, everyone links it to Industry 4.0. On Singapore's strategy, this is what [inaudible] mentioned. Industry 4.0 describes a trend of automation and data exchange taking over everyday tasks. What I want to point out is that artificial intelligence and IoT are clearly part of the strategy. Of course, one of the biggest concerns with Industry 4.0 is its impact on the workforce. According to McKinsey Global Institute research, between 400 million and 800 million jobs will be lost to automation by 2030, and that's a scary number to look at. But the same research also mentioned that only about 5% of current occupations will be fully automated; many existing job roles will instead be redefined.
Governments are responding in various ways, such as expanding science and technology programs. The majority of the respondents in that [inaudible] are confident that Singapore will continue to remain a competitive R&D and product development global hub in the next five to ten years.
Let's look back a little at the history of Industry 4.0. Industry 1.0 began in 1784, more than 200 years ago, with mechanization. Close to a century later, in 1870, the electrification race began; that's Industry 2.0. Industry 3.0 began around 1970, when automation started to take shape and electronics and IT were introduced into manufacturing; that's where the automated production process began. Today we are at Industry 4.0. So what do we call it: connection, integration, or revolution? There are different terms people are using in forums, in the news, and everywhere. What we are mainly looking at today is automation, [inaudible] change, and digitalization. I think the most important thing in Industry 4.0 is really integrating and fusing these different technologies and techniques together.
I'm not going to drill deep into the history of Industry 4.0. If we look at it, the fourth revolution is evolving at an exponential, rather than a linear, pace. It is disrupting almost every industry in every country. Some even say it is already reshaping our cultural interests. Today we are no longer looking at manufacturing driven purely by demand; a lot of the time, manufacturing is driven by technology innovation. The effects of the fourth industrial revolution on business are therefore heavily impacting customer expectations: customers expect more because of technology innovation. There are also effects on product enhancement; products today are evolving quite a bit, and there is a lot of collaborative innovation, so innovators get their products to market much faster. Then we look at how different organizations fare: conventional manufacturing firms have difficulty surviving in today's market.
Moving closer to Industry 4.0: how is AI improving manufacturing? There are quite a few use cases in the market today. It's not just the four use cases on this list; in fact, 29% of AI implementations in manufacturing are for maintaining machinery and production assets, according to Capgemini research. General Motors is implementing generative design algorithms that rely on machine learning techniques to factor in design constraints and optimize product design. Nissan is piloting the use of AI to design new models in real time, reducing time to market for next-generation models. Audi is analyzing images in real time to perform product quality inspection in the automotive industry; the same technique is also used to help many other manufacturers stay in compliance with stringent regulatory requirements.
Nokia has introduced a video application that uses machine learning to alert the operator [inaudible] if there are any inconsistencies in the production process. Danone Group, a French multinational food products manufacturer, is improving demand forecast accuracy with machine learning; the same technique is also used in other industries and shows solid results in demand forecasting. [Pierce], a leading supplier of electronic systems to a wide spread of industries, is using machine learning for predictive maintenance of high-speed rail lines throughout Europe. China Electric has created a predictive IoT analytics solution based on machine learning to improve worker safety and reduce call centers [inaudible]. [Kenna] has invented an [inaudible] defect detection system that brings new levels of quality control to its manufacturing centers.
In a nutshell, AI in manufacturing is doing two main things. AI helps a lot of industries optimize costs, drastically reducing the cost of maintaining machinery and materials. We are also looking at improving yield; maximizing yield performance in manufacturing processes is what a lot of companies are after. Then we look at AI plus data plus people equals transformation: a lot of companies are on an AI transformation today, especially in manufacturing.
I'm going to touch on some use cases. This is a typical use case in the crude palm oil (CPO) refinery process. As you can see, there are quite a few different processes involved in palm oil refining. In this process, the known criteria are that the longer the heating time, the more FFA (free fatty acid) is removed, and the lower the residual FFA, the better the quality of the CPO. We also know that before unloading the CPO into the storage tank, it goes through a few processes, in the oven and at weighing, to make sure moisture is at 0.5%. What if we increase such values? Would there be a significant improvement in the quality of the CPO? If we hold the CPO in the oven for too long, it's a waste of resources. How much does it impact the quality of the CPO?
We are then looking at utilizing the different parameters from the PLC [inaudible] and RTD sensors to monitor the removal of the moisture as it goes through production. All of these are parameters we are collecting, and for every CPO refinery process there will be a best practice: a guideline for what the best parameter values should look like. With machine learning, we can ensure that all the different values and parameters are set to the optimum to ensure consistency in the quality of the CPO.
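The parameter-monitoring idea described here can be sketched in a few lines. This is a hypothetical illustration, not H2O.ai code: the parameter names, baseline readings, and the 3-sigma rule are all invented for the example; a real deployment would learn the optimal ranges with a trained model rather than a fixed rule.

```python
# Hypothetical sketch: flag refinery parameters that drift away from a
# best-practice baseline using a simple z-score check. Parameter names
# and numbers are illustrative, not real plant values.
from statistics import mean, stdev

# Readings from a known-good production run (illustrative).
baseline = {
    "oven_temp_c":  [92.1, 93.0, 92.7, 91.8, 92.4, 93.2, 92.9],
    "moisture_pct": [0.45, 0.48, 0.44, 0.47, 0.46, 0.49, 0.45],
}

def out_of_spec(parameter, reading, z_limit=3.0):
    """True if a live reading deviates more than z_limit standard
    deviations from the best-practice baseline."""
    history = baseline[parameter]
    mu, sigma = mean(history), stdev(history)
    return abs(reading - mu) > z_limit * sigma

live = {"oven_temp_c": 97.5, "moisture_pct": 0.46}
alerts = [p for p, v in live.items() if out_of_spec(p, v)]
# The overheated oven is flagged; moisture is within the normal band.
```

In practice the "baseline" would come from the historical data of the best runs, exactly the best-practice guideline mentioned above.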
Next, we look at biodiesel manufacturing. In the biodiesel process, the byproduct is glycerin, which is one of the ingredients in detergent. There are also different, complex processes within transesterification, catalysis, washing, and drying. One known constraint is that full chemical refining based on each client's requirements isn't really cost-effective. What we would like to find out, when we are collecting all this data from the transesterification and chemical refining processes, is how we can optimize the amount of catalyst to be included in the transesterification process so that, after washing and drying, we get quality biodiesel through a cost-effective process.
Next, we look at the robotic welding process. (The video is supposed to be running.) This is a typical welding process where we collect some of the welding parameters. If you know welding, the process typically requires high current and a relatively low voltage. From the welding unit itself, we are collecting data points like voltage, current, wire feed speed, uptime, and even wire deposition. With robotic arms, the motions are very much controlled, so all of this can actually be controlled. The way this use case works is that while the robot is doing its welding work, machine learning can predict the quality of every weld, so we do not need to wait until the QC stage to determine whether a weld is of good quality and what rework is required. When we talk about welding, the material plays a big part as well, so understanding the material, and of course the deterioration of the welding tip, also plays a very big part.
Looking at the welding process in automotive, every car chassis has close to 1,000 welding points, and only a handful of those welding points can actually be inspected. It's tough for automotive manufacturers to make sure that every weld is of good quality. That is where machine learning comes in to help.
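As a rough illustration of the weld-quality idea, here is a minimal sketch that scores a weld from its voltage and current trace. The setpoints, tolerance, and acceptance rule are invented for the example; the approach described in the talk uses a trained machine learning model over features like wire feed speed and deposition, not a fixed rule.

```python
# Hypothetical sketch: accept or reject a weld from its sensor trace.
# Assumed setpoints (volts, amps) and tolerance are illustrative.

WELD_SETPOINT = {"voltage": 25.0, "current": 250.0}

def weld_ok(trace, tolerance=0.05):
    """trace: list of (voltage, current) samples from one weld.
    Accept the weld only if every sample stays within `tolerance`
    (fractional deviation) of the setpoint."""
    for v, c in trace:
        if abs(v - WELD_SETPOINT["voltage"]) / WELD_SETPOINT["voltage"] > tolerance:
            return False
        if abs(c - WELD_SETPOINT["current"]) / WELD_SETPOINT["current"] > tolerance:
            return False
    return True

good = weld_ok([(25.1, 251.0), (24.9, 249.5), (25.2, 252.0)])
bad  = weld_ok([(25.1, 251.0), (27.5, 260.0)])  # voltage spike mid-weld
```

The payoff described above is that every one of the ~1,000 chassis welds gets scored this way in-line, instead of inspecting only the handful that QC can reach.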
Now a short introduction to the H2O AI and ML platform. When we look at these different processes, a big chunk of the work is machine learning: training the right model and using that model to make predictions for the processes. We are looking at loading the data and running automatic machine learning algorithms; Driverless AI, our AutoML platform, can handle a big portion of that. The trained model is then compiled into a form that can be deployed on edge devices. Here we are very much looking at IoT devices and a system that manages the IoT parameters. We also have another product called Q. Q is able to build AI apps and deploy these algorithms to edge devices. The management of model deployment is handled by what we call MLOps, which can provide a learning feedback loop to retrain these models when accuracy falls below a certain value.
These products can help a lot of organizations. As many of you know, H2O.ai employs around 10% of the Kaggle Grandmasters in the world, and all this expertise is built into our platform. Beyond the platform itself, we have domain expertise and data science expertise helping our customers. What used to take weeks or months of building machine learning models can now be reduced to days or even hours. And whatever our customers build on our platform is transparent: all these models are explainable, and there are features within our platform to help our customers justify the models being built. We also have our customer success team, whose goal is to ensure customer success; the result is overall happiness, leading to expanded product usage. There are a few key points our customer success team looks after: resolution of technical errors, installation support and user assistance, and of course a regular cadence of calls with our customers.
We also have our open source products. As many of you know, H2O.ai's main open source product is H2O-3, which can run as a standalone platform. We also have Sparkling Water, which is a deployment architecture on Spark itself. Our open source products are free for everyone to use, and for customers on our open source platforms who are looking for enterprise support, that is available through our enterprise team.
Overall, we look to partner with our customers on their AI journey, bringing together all the experts, not just our data science experts and Kaggle Grandmasters, but also our domain experts. We draw on learnings from our different customers and communities across hundreds of successful use cases, and we share that experience and impart that knowledge to our new customers. Using our world-class technology platforms like Driverless AI, we discuss how we can work together on the journey to help our customers achieve their future goals. That's all I have.
[inaudible]. All right, thank you Eric. I'm going to take over from here. Can everyone see the yellow background? Okay. In this segment I'm going to talk about a specialized branch of manufacturing called biopharmaceutical manufacturing. Before we get started, let me briefly introduce myself. I joined H2O.ai in February 2020. Before that I worked for Bank of America Merrill Lynch in Hong Kong, and before that for Teradata, where I carried out data science projects in a few countries. These are some of my projects in the United States and Europe that are somewhat related to what I'm talking about today.
Okay, this is the outline of my discussion. I'm going to start with a brief comparison between conventional and biopharmaceutical manufacturing, then go slightly more in depth, but not too deep, into how AI can be used to overcome some of the very pressing problems and challenges in biopharmaceutical manufacturing. I will also talk a little bit about markets and economics [inaudible].
When you go to see a doctor and the doctor prescribes medications like Panadol or aspirin or [inaudible] and so on, those are conventional pharmaceuticals; they are chemical-based drugs. There is another type of medicine called biologic drugs, which are derived from biological sources. What are the biological sources? These are some examples: proteins, allergens, vaccines, blood products, cells, and tissues. Recently, a new type of biologic has been developed to fight COVID-19. In view of the pandemic, I think biopharmaceutical manufacturing is a very timely topic.
These are examples of commonly used biologics. Insulin is very well known and has a long history of treating diabetes; antibodies treat cancer; and stem cells repair damaged tissues. The hottest topic now is the production of vaccines to arm the immune system against viruses. These are examples for different types of disease, and obviously this all comes from Mother Nature.
Let's say you want to produce an antibody to treat cancer. First you need to identify the target, usually a surface protein on a cancer cell that you want the antibody to bind to in order to kill the cancer cell. Once you have identified the right target, you need to find out which gene produces this target. Then you take that gene out and put it inside a special vehicle called a vector. Vectors are very useful: they help you make multiple copies of the antigen, which is your target. You put this vehicle inside a live cell so that when the cell divides, you also propagate the vehicle.
How do you propagate it? You put it in a medium that the cell can grow in. When you put these vectors into the cells, not all, but some of the cells will take them up. And of those cells which get the vectors, not all will be productive. This is a very crucial step, usually a bottleneck, where you need to find the few among the millions that are able to produce this antigen in large quantity. You propagate the cells and divide them into multiple flasks, very much like sterile processing; this is like the live-cell version of [inaudible], yeah.
Okay, so after that the cells go on to produce a lot of the antigen. You take the antigen out and inject it into mice. The mice will develop antibodies to kill this antigen. The antibody-producing cells are inside the spleen, so you perform surgery on the mouse, take out the spleen, and extract the cells; these cells carry the genes that produce an antibody against that antigen. You use PCR technology to extract them. There are two genes, actually. In an antibody, the inner structure that looks like a Y is what we call the heavy chain, and the outside one is called the light chain; different genes are responsible for the heavy chain and the light chain.
You take these two genes out and humanize them. What does it mean to humanize? Because this came from a mouse, you can't use it directly on humans, so you need to modify the DNA by introducing changes so that the resulting antibody molecule is closer to human than mouse. Again, you take these two genes, put them inside another vehicle, and propagate it. Then you grow the cells in a small bioreactor, and after the cells reach a certain population density, you transfer them to a bigger bioreactor. At the same time, you might want to take some of this antibody out and inject it into animals to see if they develop any adverse response against the drug, in other words, whether the drug or antibody is toxic to the animal [inaudible], and go on producing in the larger bioreactor after that [inaudible].
That is the overall antibody production process. So where can we apply artificial intelligence in this biomanufacturing process? One of the things you can do is predictive maintenance. This is a schematic of a typical unit operation with bioreactors. Inside these tanks are live cells that keep on producing the antibody. The blue equipment holds the feed, the food that the cells need. Over time, the cells grow into large numbers and consume a lot of the nutrients, so you need to replenish them. The nutrients need to be fed in at a certain rate, not too fast and not too slow, so you have a lot of IoT sensors placed all over the system to track parameters like temperature, biomass, and so on.
Now, all the components must work properly at all times in order to get the product. Therefore, we have to do something to make sure the system runs properly at all times; we must do a lot of things to prevent any of the components [inaudible] or any of the equipment from breaking down. That brings us to maintenance. There are several types of maintenance. Reactive maintenance means you go and repair things physically when they break down. It can be very expensive, costly in terms of finance and even in terms of lives; components are connected to each other, and when one fails it might affect the others. Preventive maintenance is scheduled maintenance, following the recommendations of the vendors and so on. It is less expensive, but it has a drawback: you may do maintenance too early, wasting the machine's remaining life. In the era of AI, we can do predictive maintenance.
How does predictive maintenance work? Let's go to the next slide. The concept is that over time, for a bioreactor or any other kind of device, performance will degrade. This is where you work together with domain experts to define where to draw the line: which state needs attention or fixing. There are two strategies for drawing the line. One is binary classification: the machine needs maintenance, yes (one) or no (zero). Or you can do a multiclass classification of the status of the machine.
One of the things you can do is define a cycle. What do we mean by a cycle? Finding a good cycle definition gives you good data for machine learning training, so that you can build robust and accurate models. Here is one suggestion: before you produce antibody, you need to [inaudible] clean the bioreactor, sterilize it with high heat, fill it up with the medium and nutrients, put the cells in, and get the product out. That is one cycle. Because of these mechanical processes, each cycle degrades the performance of the system.
In order to predict whether a particular machine or system is going to break down soon, one thing we can do is compute a measure of the machine's condition called the remaining useful life (RUL). Here is a [inaudible] on how you can calculate it. We count a cycle as one full round from cleaning and sterilization through production; after each cycle, you collect the data from the IoT sensors, then you go to the next cycle. Let's say at cycle 103 the machine breaks down. You then have the historical record of every cycle before it finally broke down.
This collected data is where you can [inaudible] the degradation of the equipment performance. You collect the extensive sensor data and structure it properly. Then you backtrack from the failure cycle. Say the machine fails at cycle 100; the remaining useful life at each earlier cycle is the failure cycle minus the current cycle, so at cycle 1 the RUL is 99, then 98, 97, and so on. After that, you can label the data as urgent, short, or long, or simply one and zero.
This becomes your remaining-useful-life label, and you can join it to your structured sensor data to form the training data for a machine learning model that predicts whether a particular machine or bioreactor is going to fail. This is the actual prediction on a dataset that I ran in Driverless AI, and this is the performance on the metrics. This slide just shows a sample of the results you can get from Driverless AI for predictive maintenance. This is actually from an aircraft engine rather than a bioreactor, but they work the same way.
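The RUL labeling just described can be sketched as follows. The urgent/short/long thresholds here are illustrative choices of the kind you would make with domain experts, not values from the talk; the failure cycle of 103 follows the example above.

```python
# Sketch of remaining-useful-life (RUL) labeling: the machine fails at
# a known cycle, and each earlier cycle's RUL is the failure cycle
# minus the current cycle. Thresholds are illustrative assumptions.

FAIL_CYCLE = 103  # the cycle at which this machine broke down

def label_cycles(fail_cycle, urgent_within=5, short_within=20):
    rows = []
    for cycle in range(1, fail_cycle):
        rul = fail_cycle - cycle
        if rul <= urgent_within:
            label = "urgent"
        elif rul <= short_within:
            label = "short"
        else:
            label = "long"
        rows.append({"cycle": cycle, "rul": rul, "label": label})
    return rows

rows = label_cycles(FAIL_CYCLE)
# First cycle has RUL 102 ("long"); the cycles just before failure
# are labeled "urgent".
```

Each row would then be joined to that cycle's structured sensor readings to form the training table.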
One of the advantages of using AI here is that you can perform many experiments, and automated feature engineering can help you optimize a critical metric. For example, in a bioreactor or an aircraft engine, it is very crucial to reduce false negatives. It's okay to have a higher false positive rate, but a false negative is very dangerous. For an aircraft, it means the engine needs urgent maintenance but the model says it still has a long life, which could spell disaster, a crash. In a bioreactor, you might not be able to get your product out if the equipment fails.
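To make the false-negative point concrete, here is a small sketch counting false negatives and false positives at two probability cutoffs, using made-up predictions: lowering the cutoff trades extra false positives for fewer of the dangerous false negatives.

```python
# Illustrative only: y_true and p_pred are invented numbers, not model
# output. Class 1 = "needs maintenance".

def confusion(y_true, p_pred, threshold):
    """Return (false_negatives, false_positives) at a given cutoff."""
    fn = sum(1 for y, p in zip(y_true, p_pred) if y == 1 and p < threshold)
    fp = sum(1 for y, p in zip(y_true, p_pred) if y == 0 and p >= threshold)
    return fn, fp

y_true = [1, 1, 1, 0, 0, 0, 0, 1]
p_pred = [0.9, 0.6, 0.4, 0.3, 0.2, 0.55, 0.38, 0.45]

fn_strict, fp_strict = confusion(y_true, p_pred, threshold=0.5)
fn_loose,  fp_loose  = confusion(y_true, p_pred, threshold=0.35)
# The lower cutoff eliminates the false negatives at the cost of an
# extra false alarm: an acceptable trade for an engine or bioreactor.
```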
Another important output you can get from Driverless AI is variable importance. Based on the predictions of whether this bioreactor or piece of equipment will break down, it tells you the most important variable, the second most important, the third, and so on. From here you can use the variable importance to make decisions. For example, if the prediction says a particular machine is going to break down, you have the option to look back and check the variable importance. Say sensor 11 corresponds to the CO2 analyzer and another sensor corresponds to the air flow; this suggests the reason for the failure, so you can attend specifically to that component. For example, check whether the CO2 tank still has enough pressure to provide the necessary gases, air, oxygen, and nitrogen, for the cells to grow properly.
That was the bioreactor. The bioreactor must be placed inside a very specialized facility in biomanufacturing. We call it a GMP facility, a good manufacturing practice facility. It is a very confined, very clean environment, and regulatory compliance dictates that for such therapeutic protein production, the facility must not exceed 100 particulates per cubic meter. Depending on the country and the context, some dictate that it cannot be more than 1,000; the 100 limit is one of the most stringent.
Because of these regulations, biomanufacturing facilities all have sensors and edge devices placed throughout to monitor the filters, the ventilation, the lighting, and also who can access the facility, and so on. All this data can be used for predictive maintenance. How? You can collect data such as pressure and temperature and use them as features; human traffic, who came in, who went out, and for how long, can also be a feature. The ventilation and filter status is the target. You can then use, for example, Driverless AI to build a model to predict contamination risk: is this facility at risk of being contaminated? That is the second kind of predictive maintenance.
For the facility, we do predictive maintenance in order to predict contamination. Let's say in your historical reports you have a contamination at this time, another at this time, and another at this time. What you can do is sample the IoT data just before each contamination and build a table. You then do the labeling: the rows just before a contamination are labeled contaminated. The data on what happened just before each contamination becomes your training data, which you feed into Driverless AI to build a machine learning model. Driverless AI gives you the ability to take that model out of Driverless AI and place it elsewhere, on edge devices, on a sensor, on your personal computer, and so on. The IoT data you collect can then be fed into this portable model, the MOJO, to make predictions.
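The labeling scheme just described, treating the sensor snapshots taken just before each recorded contamination as the positive class, can be sketched like this. The timestamps, window size, and sensor readings are all invented for illustration.

```python
# Illustrative labeling for contamination-risk prediction: snapshots
# inside a window before a recorded contamination event get label 1,
# everything else label 0. All values are made up.

WINDOW = 2  # hours before a contamination event to label as "at risk"

contamination_times = [10, 25]  # hours at which contamination occurred

# (hour, pressure, temperature) snapshots from the facility sensors
snapshots = [(7, 1.2, 21.0), (9, 1.1, 22.5), (14, 1.2, 21.1),
             (24, 1.0, 23.0), (30, 1.2, 21.0)]

def label(hour):
    """1 if this snapshot falls inside a pre-contamination window."""
    return int(any(0 <= t - hour <= WINDOW for t in contamination_times))

training = [{"pressure": p, "temperature": temp, "y": label(h)}
            for h, p, temp in snapshots]
# Only the snapshots at hours 9 and 24 (just before events at 10 and
# 25) are labeled positive.
```

This table is the kind of structure you would feed into an AutoML tool to learn the contamination-risk model.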
What else can you do? Driverless AI also provides machine learning interpretability for the models as part of the results. It automatically generates interpretability reports that you can use to communicate with regulatory bodies, so they can devise new regulations and policies for future facility designs. Another thing we can use AI for is toxicity prediction. Every biologic medicine must be very safe for patients, and therefore must undergo toxicity tests: the drug must be injected into animals to see if they have any adverse reactions. How do we do that? Again, there is IoT data available, for example pH, temperature, and medium. We also now have the ability to characterize and study cells at the genome level, and the data we get from genome experiments is very, very good. We can combine this data with the other physical data, and then predict whether the antibody binds, whether the antibody is working, or whether the animal developed toxicity.
You build training data again, and from the training data build a machine learning model. Then, in the next project, when new data comes in, you are able to predict which particular batch or clone has the lowest probability of toxicity. You select those clones and inject them into animals, hoping they will not develop toxicity. Do they still develop toxicity? If yes, go back to the model, get more features, improve the algorithm. If no, go to production. If you repeat this many, many times, eventually, hopefully, we will be able to minimize our dependence on the animal model. This has a few overriding benefits: it reduces development costs, addresses the shortage of talent, reduces time to market, and reduces corporate reputation risk, as well as protecting the safety and well-being of the scientists themselves. Again, you can use the auto-generated reports to communicate directly about this in order to develop better cytotoxicity testing. Then one of the most crucial steps is to be able to extract the productive cells.
Okay, just to recap the earlier part of the presentation: the genes that were inserted into the vehicle must then be inserted into the cells. Let me repeat that not all cells will get the vehicle, and of those cells that get the vehicle, not all become productive. Here is one strategy we can use to create labeled data and build a machine learning model that predicts whether a particular cell is productive. For example, you randomly pick cells from the propagated cells and extract the DNA. Again, you join the physical data with the genomic data, and then label whether or not a particular clone is productive. Feed it into Driverless AI and build a model. You can use the model in the next project to determine whether a particular clone or batch of cells is going to be productive, instead of running many, many lab experiments to find out.
This is how I envision machine learning changing this practice in the future. The current practice is that you put the vehicle into the cells and then try to find the one or the few among the millions that are useful. How will AI change this? I envision that by combining the genomic DNA data with the physical data, we use machine learning to pick the few that are going to be productive, and then we only insert the vehicle into those few selected by machine learning, instead of performing many, many trials and errors in the laboratory. It reduces the number of iterations of trial and error you need to perform in laboratory work.
Okay, I will now talk a little bit about the markets and the economics of antibodies. This is a list of the current treatment strategies used by hospital doctors and oncologists to treat cancer, from surgery to radiation therapy, hormone therapy, chemotherapy, and immunotherapy, which is the use of antibody treatments. The cancer treatment market was about $97 billion, and of that $97 billion, antibody therapy takes up about $41 billion, so it is a big business. However, each antibody costs the pharmaceutical companies years of development and billions of dollars of investment. Therefore, as an investor, how do we go about investing in a particular target? There are calculations an investor in pharmaceutical companies can use, so in [inaudible] this is something you can do: try to predict which candidate will most likely be approved by the FDA.
I'm going to show you one example using NLP-based classification, which is available in Driverless AI. This is PubMed, a search engine for scientific journals and abstracts, and this is an example of a scientific article you can pull from PubMed. There is an API you can use to convert this free text into a structured table with the title, abstract, journal, who wrote the article, which year, and so on. Once you have this structure, you can easily feed it into Driverless AI to create a machine learning model. Then you can use the model on new publications to identify which publication has the most potential as a cancer target that will become an FDA-approved antibody. This screenshot shows some of the experiment settings I used to make these predictions; Driverless AI comes with NLP-specific settings, so this is slightly different from the previous example. These are some of the metrics I got in trying to predict the next antibody to be approved by the FDA.
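As a small stand-in for the NLP pipeline described here, the sketch below trains a tiny bag-of-words Naive Bayes classifier over abstracts, just to make the text-classification idea concrete. The abstracts and labels are invented toy data, not real PubMed records, and this is not the algorithm Driverless AI uses.

```python
# Toy bag-of-words Naive Bayes over abstracts (label 1 = promising
# antibody target, 0 = unrelated). All training text is invented.
from collections import Counter
import math

train = [
    ("antibody binds egfr and inhibits tumor growth", 1),
    ("monoclonal antibody targeting pdl1 shows response", 1),
    ("survey of hospital staffing levels in rural areas", 0),
    ("dietary fiber intake and cardiovascular outcomes", 0),
]

def fit(data):
    counts = {0: Counter(), 1: Counter()}
    for text, y in data:
        counts[y].update(text.split())
    vocab = set(counts[0]) | set(counts[1])
    return counts, vocab

def predict(text, counts, vocab):
    """Return the more likely class, with add-one smoothing."""
    scores = {}
    for y in (0, 1):
        total = sum(counts[y].values())
        scores[y] = sum(
            math.log((counts[y][w] + 1) / (total + len(vocab)))
            for w in text.split())
    return max(scores, key=scores.get)

counts, vocab = fit(train)
pred = predict("novel antibody against egfr", counts, vocab)
```

The real workflow would use thousands of structured PubMed abstracts and the NLP recipes of an AutoML tool, but the shape of the problem, text in, approval likelihood out, is the same.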
This is another metric that I used on our sample data, and this is the result of the predictions, the outcome. I looked at the ones with the highest probability of becoming FDA-approved drugs, and I examined each of their abstracts. My conclusion is that, in my predictions, the next FDA-approved targets will be EGFR and PD-L1. From this, there are identification numbers that you can track. You might want to examine these publications further, see who wrote each article, get to the organization, the university and so on, talk to them, and invest money to develop the antibody drug.
Okay, so in conclusion, let me sum up what I have talked about for pharmaceutical manufacturing. In the midst of a pandemic, we need to get drugs out very, very quickly, and patients cannot afford a hold-up, so we can use AI to do predictive maintenance. Our cells need to be productive, and there is no time to lose, so AI can help find those few, or the one among the many. Biologic medicines must be safe, so we can use AI to predict cytotoxicity, and hopefully also reduce the dependence on animal models. Biologic medicines need to reach patients fast and in large quantities, so AI has the potential to reduce their time to market.
Before we conclude, a few points to ponder, based on my experience in both the life science industry and the field of AI. I think life science could be the next frontier for AI, and in light of the pandemic, it is even more so now. Conversely, AI could be the next frontier for life science: to perform life science or biopharmaceutical experiments efficiently, AI can definitely help. In a pandemic, we can use AI to help with this one thing, which is to do things right the first time. Because of this, I envision that automated machine learning is becoming increasingly indispensable. That concludes the presentation on how you can use AI for biopharmaceutical manufacturing. Thank you for your attention, and back to you.
Thank you so much, Eric and John, for such a great presentation. We have a few questions in the chat window; let’s try to get to most of them with the time we have at hand. The first question asks, “Have many industry deployments utilized deep and/or reinforcement learning?”
Okay, I don’t have statistics on how many are using deep reinforcement learning, but from my previous experience with Bank of America and my communications with hedge fund people, deep reinforcement learning is being [inaudible], especially on the [inaudible].
Sorry, I think there were some audio quality issues there. But thank you for taking a stab at it, John. We have another question that says, “Can you discuss how AI is used to automate environmentally sustainable supply choices, while keeping business costs competitive?” Either of you could take that.
Can you repeat the question?
Yes, absolutely. How can AI be used to automate environmentally sustainable supply choices, while keeping business costs competitive?
Oh, okay. I’ll take this one. Hold on, let me share my screen. Okay, so these are some of the things that you can use AI for, for example route optimization: finding the most optimal route to transport your goods, so you actually reduce fuel consumption. Others are demand optimization, catastrophe modeling, and so on, with attention to reducing waste. If you can find the shortest route to transfer your goods, you reduce fuel consumption, and you can also reduce waste by having better inventory management. One of the things that you can do in managing your inventory is time series forecasting, which reduces turnover and wastage. These are some of the things that you can do to make it environmentally sustainable: you don’t stock too much and then end up throwing things away, and so on and so forth.
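The inventory idea just described can be sketched very simply. This is a minimal illustration, assuming invented demand numbers and using a plain moving average as a stand-in for a trained time series model.

```python
# Minimal sketch of time-series demand forecasting for inventory planning,
# using a simple moving average in place of a trained model. All numbers
# here are invented for illustration.

def moving_average_forecast(history, window=3):
    # Forecast next period's demand as the mean of the last `window` periods.
    recent = list(history)[-window:]
    return sum(recent) / len(recent)

weekly_demand = [120, 130, 110, 140, 135, 125]  # past weekly unit sales
forecast = moving_average_forecast(weekly_demand, window=3)

# Order just enough stock to cover the forecast plus a small safety buffer,
# rather than over-stocking and throwing unsold goods away later.
safety_buffer = 0.10
order_quantity = round(forecast * (1 + safety_buffer))
print(order_quantity)
```

A production model would use far richer features (seasonality, promotions, external signals), but the waste reduction comes from the same mechanism: stocking to a forecast instead of a guess.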
I guess the way you look at it, there are multiple ways of being sustainable: in terms of wastage, in terms of optimizing processes. Like I mentioned, with AI you can actually reduce costs and also maximize utilization. With AI, it’s not just one use case or one particular model that you’re looking at; you are looking at a combination of different models put together that digitalize experience and knowledge, feed into the whole process, and deliver a complete outcome from the whole AI journey.
Thank you so much guys. The next one says, “What part of procurement analysis can be predicted, and how can we analyze near-sourcing?” I’m assuming this is for supply chain.
Eric, you want to take that?
Yeah, I guess when you look at the procurement process, a lot of the time we are looking to buy at the lowest cost. When you buy a batch of inventory, you are looking for an optimal quantity: a certain quantity at a certain price that is the lowest you can get. But of course, you might buy a quantity that sits on your shelf for too long because demand is low. How do you prevent that? You do not want to buy at a low price and then keep all this stock in your warehouse for a long time, not moving, because of low demand. When we look at using AI, you can actually build a baseline model of demand. What is the demand for this particular product? Will it stay on my shelf for the next few weeks, or once I get the units in, will they start moving, that sort of thing.
So we are looking at two different models to predict: should I buy this particular product at this particular low price, at this quantity? And when I buy at this quantity, will it stay on my shelf for a long time because of low demand? Or, how can I model the demand so that I can forecast the sales for this particular product? That’s very much about optimizing cost itself. To maximize the yield, you’re looking at demand forecasting; that means forecasting the sales of this particular product based on the demand in the market. When we look at a sales forecast alone, we can typically say this is how much we sold during this period last year, so this should be how much we’ll sell this year. But no one knew that COVID-19 was coming, and no one could predict sales at that level.
There are different solutions. With H2O, with Q, we have demand augmentation, where we bring in public data: we take news data and the like, put it into Q, and then we can do a more effective sales forecast. These are ways to optimize your yield.
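The two-model procurement decision described above can be sketched as a simple rule. This is an illustrative example with invented numbers and thresholds: one part stands in for the demand forecast, the other decides whether a discounted bulk purchase is worth the shelf time.

```python
# Illustrative sketch of the procurement decision: a demand forecast
# feeds a rule that decides whether a discounted bulk buy is worthwhile.
# All prices, quantities, and thresholds are invented for illustration.

def weeks_to_sell(quantity, weekly_demand_forecast):
    # How long the purchased quantity would sit before selling out.
    return quantity / weekly_demand_forecast

def should_buy(quantity, unit_price, regular_price, weekly_demand_forecast,
               max_weeks_on_shelf=6):
    # Buy at the discount only if the stock clears within an acceptable
    # shelf time; otherwise the "cheap" inventory just ties up the warehouse.
    discounted = unit_price < regular_price
    return discounted and (
        weeks_to_sell(quantity, weekly_demand_forecast) <= max_weeks_on_shelf)

# 500 units at a discount, but forecast demand is only 60 units/week:
print(should_buy(500, unit_price=8.0, regular_price=10.0,
                 weekly_demand_forecast=60))  # over 8 weeks on shelf -> False
```

In practice both inputs, the demand forecast and the shelf-time tolerance, would come from models and business constraints rather than hard-coded numbers, but the decision structure is the same.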
Okay, let me add to what Eric has said. This [inaudible] shows how you can use forecasting to optimize for demand. This is bus ridership data: we have IoT data coming from GPS locations and from bus passenger counts. You can feed this into Driverless AI to train a time series model, whose output can be used for planning, operations, and procurement as well. For example, in this case, this is an actual experiment in Driverless AI. This is bus ridership: for location one, this is the ridership forecast for next week, and for the same week, location two has a forecast that is higher than location one’s. Usually you would put, say, four buses at location one and four buses at location two. With the forecast you can better manage your resource allocation: you transfer two of the buses from location one to location two. That is one way of optimizing your resources. Instead of buying new buses to serve the second location, you use AI time series forecasting to reallocate your resources.
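The reallocation step on top of the forecast can be sketched as follows. This is a hedged illustration: the ridership figures and the proportional-split rule are invented, and a real deployment would add constraints like minimum service levels per route.

```python
# Hypothetical sketch: split a fixed fleet of buses across locations in
# proportion to each location's forecast ridership, instead of buying
# new buses. Ridership numbers are invented for illustration.

def allocate_buses(forecasts, total_buses):
    # Allocate proportionally to forecast ridership, rounding down, then
    # hand any leftover buses to the busiest locations first.
    total = sum(forecasts.values())
    alloc = {loc: int(total_buses * f / total) for loc, f in forecasts.items()}
    leftover = total_buses - sum(alloc.values())
    for loc, _ in sorted(forecasts.items(), key=lambda kv: -kv[1])[:leftover]:
        alloc[loc] += 1
    return alloc

# Next week's forecast: location two is forecast higher, so it receives
# more of the eight available buses (roughly the 4+4 -> 3+5 shift above).
forecast = {"location_one": 1200, "location_two": 2000}
print(allocate_buses(forecast, total_buses=8))
```

The forecast itself would come from the time series model; this function only turns the forecast into an operational decision.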
Thank you so much, Eric and John. We’re at the top of the hour. If you have any concluding thoughts for our audience, now would be the time to conclude.
Yeah, automated machine learning proves its worth, especially in a time of pandemic where … I will just repeat myself there. At a time like this, we need all the help we can get to do things right the first time. AI is one, and automated machine learning will further reinforce and empower data scientists. So [inaudible] I believe this will be a [inaudible].
I guess if you reference to industrial-
[crosstalk] tools and now-
… AI is actually part of the grouping, at least in the Singapore strategy. I guess pretty much all manufacturing firms today should already be working with AI. If you’re not, I guess you really need to look into it, because unlike conventional frameworks, what works for my competitor may not work for me. When we talk about AI, every dataset is unique; what works for you doesn’t work for others. Of course, we can look at and apply best practices, but there is still a lot of work to be done.
When we are looking at time to market, I think today everyone is struggling because of the pandemic shutdowns. You can’t really do much research, and research is only just starting up again, so you have to make full use of your time. Especially when you look into life science and pharmaceuticals: the data science alone is complex, and the process of manufacturing medicine is even more complex. With that complexity, if you are not using AI it’s going to be very tough to derive favorable outcomes. I guess the AI wave is here to stay; it’s going to be here for at least the next five to ten years. I think it’s really high time that manufacturing firms look into AI seriously.
Thank you. Thank you to our presenters, thank you to our audience for taking the time and discussing these solutions in manufacturing. The presentation slides and the recording will be sent out shortly after. Have a good day.
Yeah, thank you all.