
Auto Care ON AIR
"Auto Care ON AIR" is a candid podcast dedicated to exploring the most relevant topics within the auto care industry. Each episode features insightful discussions with leading experts and prominent industry figures. Our content is thoughtfully divided into four distinct shows to cover four different categories of topics, ensuring collective professional growth and a comprehensive understanding of the auto care industry.
The Driver's Seat: Navigating Business and the Journey of Leadership
To understand organizations, you need to understand their operators. Join Behzad Rassuli as he sits down for in-depth, one-on-one conversations with leaders who are shaping the future. This show is a must-listen for anyone curious about how top executives navigate the growth, success, and setbacks that come with the terrain of business.
Carpool Conversations: Collaborative Reflections on the Road to Success
Hosted by Jacki Lutz, this series invites a vibrant and strategic mix of guests to debate and discuss the power skills that define success today. Each episode is an entertaining, multi-voice view of a professional development topic and a platform for our members to learn about our industry's most promising professionals.
Indicators: Discussing Data that Drives Business
This show explores data relevant to the automotive aftermarket. Join Mike Chung as he engages with thought leaders in identifying data that will help you monitor and forecast industry performance. Whether global economic data, industry indicators, or new data sources, listen in as we push the envelope in identifying and shaping the metrics that matter.
Traction Control: Reacting with Precision to the Road Ahead
Every single day, events happen, technologies are introduced, and the base assumptions behind our best-laid plans can change. Join Stacey Miller for a show focused on recent news, from the global to the local level, and what it may mean for auto care industry businesses. Get hot takes on current events, stay in the know with timely discussions, and hear from guests on the frontlines of these developments.
Auto Care ON AIR
Data, AI, and the Future of Product Development
In this engaging episode, we uncover the transformative power of data and technology in shaping modern product development strategies. Join host, Mike Chung, as he welcomes Stephen Goldsmith, Vice President of Product at Hanover Research, who shares his extensive knowledge on market intelligence and the innovative approaches being utilized in the realm of user feedback and productization.
We dig deep into the importance of high-quality data and how organizations can pivot from outdated survey methods towards harnessing AI for more effective decision-making. Stephen emphasizes the need for an agile feedback cycle, highlighting that effective product development isn't a linear process but a continuous evolution. The conversation takes intriguing turns as we explore the integration of qualitative and quantitative insights, allowing businesses to better understand their consumers’ needs.
Stephen also provides insights into how cutting-edge digital products are being developed at Hanover Research, equipping clients with essential tools to analyze market demands. From labor market trends to academic program evaluations, we learn how data aggregation can facilitate smarter decision-making in a fast-paced environment. As we discuss trends towards shorter surveys and more intuitive platforms for gathering insights, we consider the implications of these shifts on user experience.
By the end of the episode, listeners will walk away with a clearer understanding of the modern product development landscape, enriched by thoughtful reflections on the future of market research and technology. Don’t miss your chance to engage with this insightful dialogue and learn how to adapt your strategies for success in a data-driven world.
To learn more about the Auto Care Association visit autocare.org.
To learn more about our show and suggest future topics and guests, visit autocare.org/podcast
Welcome to Auto Care ON AIR, a candid podcast for a curious industry. I'm Mike Chung, Senior Director of Market Intelligence at the Auto Care Association, and this is Indicators, where we identify and explore data that will help you monitor and forecast industry performance. This includes global economic data, industry indicators and new data sources. I'm really delighted to have Stephen Goldsmith join me for today's episode. Stephen is Vice President of Product at Hanover Research, and Hanover Research is a custom market research provider for corporate and education clients. Welcome to the show, Stephen.
Stephen Goldsmith:Thanks, Mike, happy to be here.
Mike Chung:Really glad you could make it for this one and tell me a little bit about your role and what it entails.
Stephen Goldsmith:So, Mike, my role in Hanover is working on the full product lifecycle, so everything from the beginning of that market discovery, product market fit through, product development and then finally business and go-to-market strategies. So taking those products to market and then overseeing the business aspects of those products once they're in market, potentially improving them as we go along. And we do have a number of different initiatives and products in the pipeline at any given time, so I'm usually doing some part of all of those things at any given moment in my day.
Mike Chung:Really helpful. And I've worked at Hanover Research, we worked together, and I don't know that all of our listeners necessarily know that much about Hanover Research. I gave that very short 10-second overview, but can you tell us a little bit about Hanover Research, the types of clients you're serving and the types of products and solutions you're providing to them?
Stephen Goldsmith:Maybe a good place to start would be with Hanover's trajectory, and I think that will shed some light on the subsequent conversation. When I joined over a decade ago, it was a market research firm really focused on services, providing professional services oriented towards applied social science methods: things like survey research, qualitative research, secondary research and statistical analysis, and then using those tools to answer a variety of business or organizational questions that Hanover clients had. That meant research into customers, what are their needs, what are their expectations, and also research into markets, typically adjacent markets the clients were interested in getting into. Since that point in time, Hanover, as it has grown, has gained increasing interest in productizing some of those approaches and offering true data products in addition to those traditional social science services.
Mike Chung:Okay, that's really helpful, because when we think about product development, you're the first person I've had on this podcast who's been in product development, and I'm hopeful that our listeners will be able to do several things after our conversation. One is reflect on data collection from users to inform new product development, existing product enhancements and the like, whether they're making physical products, software products or anything in between, if there is an in-between there. So you mentioned services, surveys and productizing solutions. Can you just dive into that a little more, please?
Stephen Goldsmith:Sure. So, being in the information services industry, the products that I work on are digital products, and they're focused on data. I think it would be fair to categorize them into two broad buckets. The first bucket is aggregating data, formatting it, potentially running analysis on it, and then providing that information back to clients in an easy-to-use format, typically a dashboard or software interface. One example of a product Hanover currently offers is a tool that allows higher education administrators who are focused on academic program development to analyze, all in one place, things like the labor market, prospective student demand, and the number of programs that already exist in the space, as they try to assess: is this a new program we want to launch, or do we potentially want to sunset an existing program? So it answers questions like that around how the market impacts their program development, and it's entirely a self-service tool they can use to make those decisions in an informed way.
Mike Chung:And, if I can just jump in here, so I really appreciate that explanation. So what you're referring to and you said data aggregator earlier, so I'm imagining databases that are out there. It could be a government database, there could be multiple databases and your team provides a solution that perhaps goes and picks out data that's relevant to your client so that they may assess something like okay, do we want to open a law school? Is there enough interest in this particular geographic area? Perhaps getting economic data, population data and using that data to inform a decision with regard to a program introduction. Is that correct?
Stephen Goldsmith:That's exactly right. So you could think of labor market data from, let's say, the BLS going in there, or public data on academic program demand or completions going in there. That's one typical data source we would use: publicly or near-publicly available data. Other data that we provide is proprietary to the company, and in many cases, from our perspective, that's the more valuable data, because it's less of a commodity and there's a little bit more of a moat around data like that. There are also certain analyses that can be run over the data, and those are again proprietary to the company. But that's just one bucket, the aggregation of data.
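The aggregation pattern Stephen describes, pulling labor market figures and program-completion counts into one place keyed by field so an administrator can compare demand against supply, can be sketched roughly as follows. The field names and numbers here are invented for illustration, not Hanover's actual schema or data:

```python
# Sketch: merge two hypothetical public data sources by academic field
# so an administrator can compare labor demand against program supply.

labor_market = {  # projected job openings per field (hypothetical figures)
    "data science": 12000,
    "law": 3000,
}

completions = {  # degrees completed per field last year (hypothetical figures)
    "data science": 4000,
    "law": 9000,
}

def program_outlook(field):
    """Combine demand and supply into one dashboard-ready record."""
    openings = labor_market.get(field)
    grads = completions.get(field)
    if openings is None or grads is None:
        return None  # field not covered by both sources
    return {
        "field": field,
        "openings": openings,
        "completions": grads,
        # crude saturation signal: how many openings per graduate?
        "openings_per_graduate": round(openings / grads, 2),
    }

print(program_outlook("law"))
```

A real product would of course pull these inputs from live databases and layer proper methodology on top, but the core move, joining disparate sources on a shared key and emitting one decision-ready record, is the same.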
Stephen Goldsmith:The other bucket of product development that we're currently doing I would think of as software that facilitates research. This is productizing some of those social science approaches. It could be something like survey software that is DIY, or qualitative research software that again has a DIY element to it. Those are important for the company strategy because we can offer them as products but also use them for our internal efficiency, so it serves both goals at the same time.
Mike Chung:That's really helpful. And when I was at Hanover Research, as you know, I oversaw a lot of surveys to consumers. So it might be a consumer survey where there is that survey design, there's a survey administration where you're figuring out, are we using a customer list, Are we using a panel? And then there's the analysis phase, right. So I'm thinking about that software solution, say, for any of those phases. So if it's informing a, you know we're in this world of chat, GPT, so where it's almost like Star Trek, like computer, I want a survey that learns about this, make it so right. So can you perhaps expand on how some of the technologies you're using are informing product development in that type of a scenario, whether it's design, data collection, analysis and so forth?
Stephen Goldsmith:That's a very interesting question, and we could take that topic in a number of directions, but maybe I'll just start with the example of the survey platform and incorporating AI into that.
Stephen Goldsmith:So one thing we are doing, for example, is exploring using AI to design surveys, and having it reference past surveys while it's doing that, to prevent it from going way off on the wrong track. We're also using AI to analyze a wide variety of information related to survey data. But I think it's very interesting in that AI is really opening the door to doing a much broader range of analysis than just analysis on a specific type of data, which is often what you saw play out with social science-focused research services: you would run a survey and get the results of that survey, or run a qualitative research study and get the results of that qualitative research study. I think there's an opportunity now with AI to do a number of those different things in the same platform and then run AI over different kinds of data, qualitative and quantitative, and not only potentially find new insights that the user wouldn't have found otherwise, but also examine data in aggregate in a way that just wasn't possible or feasible before.
Mike Chung:Can you just give me an example of what that might look like? Because I'm thinking about a consumer survey where we might look at the entire nation and get a census-balanced sample across geography, income and age. How are you using AI to get more insights? Is it going into, perhaps, a demographic that wasn't surveyed? And I know I'm coming at this from different directions here, so please forgive me, but you also mentioned the survey design. Is it perhaps using a pool of existing surveys to inform that design? Can you give me a little bit more detail on what that could look like?
Stephen Goldsmith:On the analysis side, part of it is simply descriptive: quickly making sense of the data that are coming back from that specific survey. To use your example, the broad-scale consumer survey, let's just call it an awareness and usage survey for a certain product: there could be industry research already out there, probably 20 published reports on that particular industry, and maybe your organization has also done a number of qualitative interviews with that same target demographic. Rather than just looking at that survey data in isolation and forgetting all those other data points that are out there, I think there's a real possibility of using AI to consider all of those different data points, the qualitative data you already collected, the secondary research and the survey data, and put the pieces together. And it's not as if a person or an analyst could not have done that in the past. It's just that it was a very time-consuming task, and typically it wasn't part of the research process.
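One way to picture "putting the pieces together" across survey results, interview notes, and published reports is to pool everything into one source-tagged corpus before any analysis step runs over it. This toy sketch uses invented records and a simple keyword filter standing in for the AI step; the point is only the pooling-and-traceability pattern:

```python
# Pool evidence from different research methods into one corpus,
# tagging each record with its source type so findings stay traceable.

corpus = [
    {"source": "survey",    "text": "Price was the main reason I switched brands."},
    {"source": "interview", "text": "She said price mattered more than features."},
    {"source": "report",    "text": "Industry report: feature set drives loyalty."},
]

def evidence_for(topic):
    """Return every record mentioning a topic, across all source types."""
    return [r for r in corpus if topic in r["text"].lower()]

# Which research methods independently support a "price" finding?
print([r["source"] for r in evidence_for("price")])
```

In a real pipeline the filter would be an AI model reading the pooled corpus, but keeping the source tag on every record is what lets a finding be triangulated across methods rather than read from one survey in isolation.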
Mike Chung:Right. So it sounds like, if I'm putting the pieces together correctly, you and your team are looking at engaging with these technologies to bring data sets together in a faster, more efficient way. Is that perhaps one way to think about it?
Stephen Goldsmith:I think that's fair to say. Faster and more efficient is one part of it, but it's also creating insights by looking at broader data sets that are different in kind, which just typically wasn't done in the past, either because resources weren't available or those data might not have been available. So those two things not only get you the information faster, but I think can get you better information when it comes to making decisions.
Mike Chung:Thinking about it from a single-project perspective, and then from a solution-providing perspective, where this is now a process that your research teams are using: what does that look like? Because I'm thinking, if I do an individual study and then I go get data here and I go get data there, I have to think about the pitfalls of merging data sets, right? So it's a little bit of a two-part, fuzzy question, but can you tell me about the process you have in product development for taking those types of considerations into account?
Stephen Goldsmith:That's a really interesting question. One of the things I've found, as I've increasingly worked with AI and seen AI applied, is that in some ways the AI analysis is not the hard part of creating a product like this. The hard part is actually getting the data all in one place where you can use it effectively together, and then setting the AI up in such a way that it can accurately read that data set. So in some ways it's more of an information-gathering, aggregating and formatting problem than it is an AI problem. While it can be very valuable, and I think necessary, to create prompts that give the AI an understanding of the task it's going to accomplish, most of the AI tools out there are capable of conducting this type of task if the information they're accessing is in the right format.
Mike Chung:That brings up two questions for me then. One is, tell me a little bit about best practices and data aggregation pitfalls for anybody doing this, whether it's in a research capacity or otherwise. We're in the automotive aftermarket at the Auto Care Association, which I think I've described to you, so we have manufacturers, we have researchers, we have distribution companies, and I think a lot of companies are looking at new technologies and how to better harness data and get better insights out of it more quickly. So, from this data aggregation perspective, what are some of the recommendations you would have for somebody doing that? And then the second question would be: you mentioned AI platforms. I'm personally not that familiar with all of them, so can you tell me, and tell our listeners, considerations when thinking about an AI platform to use?
Stephen Goldsmith:Sure. So on the data side, certainly the biggest consideration is garbage in, garbage out. If the data that you're collecting and aggregating is not of high quality and high value, then regardless of the quality of the AI you're running over it, you're not going to get value out of it.
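The "garbage in, garbage out" point implies a screening step before any model ever sees the data. A minimal sketch of that idea, with validity rules and records invented purely for illustration:

```python
# Screen raw survey records before analysis: drop rows that are
# incomplete or implausible so low-quality data never reaches the model.

raw_records = [
    {"respondent": "r1", "age": 34,   "rating": 4},
    {"respondent": "r2", "age": None, "rating": 5},  # missing field
    {"respondent": "r3", "age": 212,  "rating": 3},  # implausible age
    {"respondent": "r4", "age": 45,   "rating": 9},  # rating off the 1-5 scale
]

def is_clean(rec):
    """Hypothetical validity rules for a consumer survey row."""
    return (
        rec["age"] is not None
        and 16 <= rec["age"] <= 100
        and 1 <= rec["rating"] <= 5
    )

clean = [r for r in raw_records if is_clean(r)]
print(len(clean))  # only r1 survives screening
```

Real survey cleaning involves much more (speeders, straight-liners, duplicate panelists), but the principle is the same: validate at ingestion, not after the analysis has already been run.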
Stephen Goldsmith:And I think that in many cases the hardest part of building a data product is still collecting a very large data set in the first place that is of high value to your target market. Once you've done that, you've gotten maybe, I'll call it, 80% of the way there, and the remaining 20% is setting up the analysis. You do, of course, want AI to be reliable in its analysis, and one of the things it's now well known for is hallucinating: making things up if the exact answer that you're looking for isn't in the reference information. There are a number of ways to try and control for that.
Stephen Goldsmith:One way in which we've controlled for that is by building in the capability for it to cite where it's pulling a certain data point from. So it will say here's your answer, and here are the three pertinent pieces of information that I drew on to provide you with this answer. So I think that's really a critical way to quickly double check that the information you're getting is accurate. Otherwise, it puts a lot of burden on a user to go back and do that sort of fact checking, or you're potentially taking a risk that the information the AI is providing is not 100% accurate.
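The cite-your-sources control Stephen describes can be imitated even without a language model in the loop: whatever produces the answer also returns the IDs of the records it drew on, so a user can spot-check quickly instead of fact-checking from scratch. A keyword-retrieval toy version (not Hanover's implementation; documents and matching rule are invented):

```python
# Toy retrieval that returns supporting citations alongside each
# matched snippet, so every answer can be spot-checked by the user.

documents = {
    "doc1": "Q3 satisfaction averaged 4.2 out of 5.",
    "doc2": "Wait times at pickup counters rose 30% in Q3.",
    "doc3": "Fleet size grew by 500 vehicles this year.",
}

def answer_with_citations(query):
    """Return snippets overlapping the query, each tagged with its source ID."""
    terms = set(query.lower().split())
    hits = []
    for doc_id, text in documents.items():
        if terms & set(text.lower().replace(".", "").split()):
            hits.append({"source": doc_id, "snippet": text})
    return hits

for hit in answer_with_citations("wait times"):
    print(hit["source"], "->", hit["snippet"])
```

Production systems do this with retrieval-augmented generation, where the model is handed retrieved passages and asked to ground its answer in them; the shape of the output, answer plus pertinent source records, is what makes the quick double-check possible.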
Mike Chung:Right. So just a side note about hallucinations. Not being an AI expert here, I'm just going off what I've read, but hallucinations are somewhat to be expected, I would think, right? Just by virtue of how large language models work and the perhaps infinite branches they can take in filling in the next word. So is that fair? Should we expect hallucinations, and is it reasonable to expect those types of errors in the future?
Stephen Goldsmith:So, with the caveat that I'm not an AI data scientist either, I'll say if you're using just a very generalized version of, let's say, something like OpenAI's ChatGPT, and you're asking it questions it simply doesn't know the answer to, you're probably going to see a hallucination in the answer, because it's going to try to give you an answer to the best of its ability even though it doesn't know how to answer your question. As I said, one way to control for that is to have it cite sources. Another way to control for that is to constrain its focus in terms of the data that it's referencing.
Stephen Goldsmith:And if it's not in that data set, then you prompt it to simply say the pertinent information isn't part of this data set. So, going back to that survey about attitudes and usage of a certain product: if you ask it a question about a completely different product that's not in that data set, it should say, "I can't tell you that; the information's not part of the data set."
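The constrain-the-scope control can also be enforced outside the prompt: check whether the question overlaps the data set at all, and refuse before the model is ever asked to improvise. A hedged sketch, where the product list, matching rule, and refusal wording are all invented for illustration:

```python
# Guardrail sketch: refuse questions whose subject is not covered by
# the data set, instead of letting a model improvise an answer.

KNOWN_PRODUCTS = {"sedan rental", "suv rental"}  # what the survey covered

def scoped_answer(question):
    """Answer only questions about products present in the data set."""
    q = question.lower()
    if not any(product in q for product in KNOWN_PRODUCTS):
        return "The pertinent information isn't part of this data set."
    # placeholder for the real, data-grounded answer step
    return f"(answer grounded in survey data about: {q})"

print(scoped_answer("How satisfied are SUV rental customers?"))
print(scoped_answer("What do customers think of our boat tours?"))
```

In practice this pre-check sits alongside the prompt-level instruction ("only answer from the provided context"), giving a second, deterministic line of defense against hallucinated answers.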
Mike Chung:So setting the boundaries, setting the parameters, is critical. And it sounds like, along with that, for the first of those two questions, the garbage-in, garbage-out data quality, it could be a combination of a human researcher or analyst combing through and cleaning the data, as well as perhaps using an automated solution to help get through it faster, right?
Stephen Goldsmith:I think that's true, and it of course depends on the type of data you're talking about. If it's quantitative data, databases have been around for quite a long time now, and there are a number of well-known techniques for entering quantitative data into a database. Where I think AI is particularly interesting is in analyzing large volumes of qualitative data, which in the past would have required somebody to comb through and try to make sense of it, and that just wasn't feasible based on the number of hours in the day and the week. Now it is feasible, using AI tools.
Mike Chung:So for the qualitative data I'm thinking of in-depth interviews and there could be the change in tone, there could be the pause, the hmm, things like that. Are the AI tools able to kind of peer into those recordings and glean additional insights based on things like that?
Stephen Goldsmith:When you're thinking of the pause or the tone, a pause might indicate uncertainty and tone might indicate some level of emotion. Is that what you're thinking?
Mike Chung:Yeah.
Stephen Goldsmith:Yeah, that's certainly possible and I believe there are AI tools out there that do attempt to do that type of sentiment analysis. I personally don't have a lot of experience using those tools, so it's hard for me to speak to the quality of that. I think in theory it should be something that AI should be able to handle if it's referencing again kind of a good data set where, let's say, certain emotions are expressed a thousand times, a hundred thousand times in videos and then those are clearly marked for the AI to understand. That's what's being expressed. In theory, it should be able to then read that in a new context.
Mike Chung:So let me shift a little bit to that new product development: how does user feedback, and I suppose user volume, factor in as you're bringing new products to market?
Stephen Goldsmith:It depends on the phase of the product's life cycle, because there are definitely different approaches. So, just to take it from the end: let's say you have a product in the market, it's out there, it's selling, and you're collecting data about it as it's being sold in the marketplace. At that point, ideally, particularly if it's a digital product, you would want to have a variety of quantitative data about how it's performing. That could be everything from marketing data, like the rate at which people are clicking on an ad or the rate at which they take the next step when they come to your website, to data from users actually interacting with the product: you can see where people are clicking, maybe where they're getting stuck or having a problem. So there's a lot of good information at that stage too. And it could really come from a wide variety of sources. It could be as simple as a product manager going out and talking to clients, and that's part of the process. It could be as sophisticated as a product manager using a technology platform that captures customer views on social media, sales conversations that salespeople are having with the market, and interactions with the IT help desk for the product, and then reviewing all of those pieces of information and looking for trends. So there's a wide spectrum, but I think both a qualitative and a quantitative approach is important there.
Stephen Goldsmith:Taking a step back to the actively ongoing or when, let's say, you're building a new product, you're actively developing it.
Stephen Goldsmith:At that point I think it's really critical to have an agile cycle of feedback as you're going.
Stephen Goldsmith:So you don't want to get past the first stage and say we know exactly what we're going to build and we're just going to go out and do it.
Stephen Goldsmith:If you do that, you risk a lot of waste because as you go, you're making decisions about features, you're making decisions about design, and it's much cheaper and more effective to test those decisions early, before you've gone and done all the work of actually building a product. So I think while you're developing, it's a really iterative process. So maybe you're starting with some basic designs and then you move on to testing a more formal prototype and then you actually start building it and then towards the end you do maybe pricing tests or other tests to really hone in on what your strategy is going to be to take that to market. And then I think the first phase and this is probably potentially a whole conversation unto itself, but also to me the most interesting phase is the original assessment of what is the problem I'm trying to solve and trying to find product market fit. So what is the solution that's going to be effective and more effective than the solution my competitors have to solving this problem and that I think that is a very interesting conversation.
Mike Chung:Maybe we can dive into that for a little bit then, because I was going to get into some of the features and functionalities. I was thinking about, say, a data dashboard or data visualization, whether it's Power BI or Tableau, and it could be a question of, okay, we have this in market, we have a number of our clients using it, and there's that holistic approach, perhaps a client touchpoint where somebody is interviewing: how are you using it, what do you like, what do you dislike? Or it could be the little avatar that comes up and asks, how do you like this? How would you rate this? Do you have questions? And one of the things I'm thinking about is that enhancement.
Mike Chung:So if it's a question of, do we want a drill-down for X feature, that could certainly be something I'd like to think is pretty easy to look into in terms of user studies, focus groups, things like that. But rewinding a little bit: you were talking about market entry and competitor products. Tell me about what type of timeframes you're working with, the balance between getting something out to market, how many features you can reasonably build, and what might already be out there. Can you speak to that a little bit, Stephen?
Stephen Goldsmith:Yes, and I might start even a little bit before that, if you'll bear with me. One of the famous quotes about market research, Steve Jobs, I believe, said it, or the quote was attributed to him, was "I don't need market research," something along the lines of: if Henry Ford had asked his customers what they wanted, they would have said a faster horse. So that was saying you can't figure out what people need just by asking them what they need.
Stephen Goldsmith:But he also said something to the effect of: you've got to work backwards from the people, your customers, to design the product. You can't design a product and then figure out how it's going to solve some problem for your customers. And I think what he was really hitting on, and it's a cliche in the product world, is that you can't get too focused on the solution or fall in love with some solution you've dreamed up. You really have to focus on what is the problem you're solving. So if it's an automobile, it's transportation, right? That seems pretty straightforward. Or if it's a horse, maybe it has some other uses that the automobile may not. But the point is, you have to really focus on that underlying issue and figure out what it is, and that's what the market research at that stage should be focused on.
Stephen Goldsmith:So it could be interviews with really, really very wide ranging, open-ended interviews, let's say, with people in that target demographic that you're thinking of selling to and exposing potential needs. And if you think of, there have definitely been some well-known recent startup founders who figured out there's a problem in their own lives and then built a whole company around it, and those are good examples of finding that need. I think the founders of Airbnb famously hosted people on their couch during some kind of marketing conference in San Francisco and said, wait, there's probably a business here. And then off they went. So you have to figure out that need, and maybe that's by accident in the case of the Airbnb founders. But if it's not by accident, then you have to use market research to figure out what that need is and then you have to iterate on your potential solutions, your financial product solutions, to figure out which ones are the best ones to satisfy that need.
Mike Chung:That's really helpful. And just thinking about the future, right, you're in product development. I'm thinking about a couple of things here. When I go to do a survey, so much has changed technologically, right? Landline phones, random-digit dialing, mailing of surveys: that doesn't seem to be quite as relevant these days. From a data collection perspective, what types of movements have you seen? Are people really using a mobile device? Are people taking a survey on the computer? Are our attention spans shorter? And perhaps that could lead into: what might we expect from data capture and analysis for primary market research?
Stephen Goldsmith:That is a very interesting question, and I think something that continues to evolve and will continue to evolve. A lot has been made of the fact that, if you're in a B2B industry, the era of the 30-minute online survey of, say, an IT decision maker or somebody fairly high up in a company is just dead. That's more or less impossible to get, is what most people will tell you, without a very significant financial incentive behind it for participation. So as things evolve, maybe with a more limited attention span, as one door closes, perhaps another one opens. And I think that goes back to what we were just talking about around AI and the analysis of qualitative data, in that there's now a greater opportunity to take qualitative, or similar-to-qualitative, approaches that in some ways can mimic things that used to be done with quantitative approaches.
Stephen Goldsmith:So to give you one example: let's say you had rented a car recently. You probably still would get a survey with a series of close-ended questions: please rate your level of satisfaction, how likely would you be to recommend this company, et cetera. You could instead try a single question: tell me the most important thing about your experience, good or bad. Just a single open-ended question; take 30 seconds, type it on your mobile phone. If you can analyze that type of data set at scale using AI, maybe you actually get much more actionable information out of it, versus "on average, our score is a 4.2 out of 5."
Mike Chung: Right, instead of 20 questions covering the checkout experience, the selection, the return, et cetera.
Stephen Goldsmith: Exactly. It gives the respondent the opportunity to focus on what he or she cares about in the experience, versus what you think he or she may care about, across a variety of parameters. And it shortens the research experience and makes it a much more pleasant one for the respondent.
Mike Chung: Right, so I'm thinking about it from a data perspective. You mentioned close-ended questions, which are very easy to put into Excel and turn the crank on. But if it's open-ended, I'm thinking about unstructured data, and then using an AI tool to find themes across a pool of responses.
Stephen Goldsmith: That's exactly what I had in mind. And once you have those themes categorized, you can count the number of times a certain theme occurs. So it's potentially not a purely qualitative analysis; it can turn into a quantitative analysis as well. If 80% of the people who say they're unsatisfied with their car rental experience say, "When I got in there, I had to wait half an hour to talk to anyone," you've just figured out what your problem was, and you didn't have to ask 50 questions to figure that out.
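The pipeline Stephen describes, categorizing open-ended responses into themes and then counting them, can be sketched in a few lines. In practice an LLM or clustering model would assign the themes; this minimal Python sketch substitutes a simple keyword lookup (all theme names, keywords, and responses are hypothetical) purely to show how free text becomes countable, quantitative data:

```python
from collections import Counter

# Hypothetical themes that a model might surface from rental-car feedback.
# A real pipeline would have an LLM or clustering step assign these; the
# keyword match below is a stand-in to illustrate the counting step only.
THEME_KEYWORDS = {
    "long wait": ["wait", "line", "queue"],
    "dirty vehicle": ["dirty", "smell", "stain"],
    "friendly staff": ["friendly", "helpful", "polite"],
}

def tag_themes(response: str) -> list[str]:
    """Return every theme whose keywords appear in a free-text response."""
    text = response.lower()
    return [theme for theme, words in THEME_KEYWORDS.items()
            if any(w in text for w in words)]

def theme_counts(responses: list[str]) -> Counter:
    """Turn qualitative responses into quantitative theme frequencies."""
    counts = Counter()
    for r in responses:
        counts.update(tag_themes(r))
    return counts

responses = [
    "Had to wait 30 minutes before anyone helped me.",
    "Car was dirty inside but the staff were friendly.",
    "Long line at pickup, then the car had a weird smell.",
]
print(theme_counts(responses).most_common())
```

The output is exactly the kind of result he mentions: a ranked tally showing which complaint dominates, without ever asking a close-ended question.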
Mike Chung: And I suppose, thinking about earlier parts of our conversation, if I'm able to take those findings from that set of data, I could then ask: what are we seeing in other consumer product industries? Is waiting an issue? Is getting a car that's dirty an issue? Where does that rank? Perhaps I can get some corroborating findings across other user experiences.
Stephen Goldsmith: That's a really interesting point, and an interesting direction as well, thinking about data products and building them out over time. I'll back up and say that traditionally there have been companies like NPD Group, in the B2B space, or IRI, traditionally more in the consumer space, and those two companies are actually now one, Circana, I think that's right.
Stephen Goldsmith:um, where they've essentially looked across a number of companies trying to figure out the same thing and then benchmarked a similar study across, let's say, a number of participants in a certain industry, you would be able to figure out relatively how well one is doing on certain themes versus another, and it could be effectively a new way, or maybe a more informative way, of benchmarking women in industry.
Mike Chung:And then thinking about AI models. What should research analysts, research heads across various industries be thinking about when choosing an AI model to work with?
Stephen Goldsmith:So again, with the caveat that I'm not an AI data scientist.
Mike Chung:Sorry, I keep putting you in that spot. An AI data scientist. Sorry, I keep putting you in that spot, I would say.
Stephen Goldsmith:You know there are a lot of great articles out there on the pros and cons of one, very specifically of one AI model versus another. One concern, certainly in the corporate space, is data privacy, and so some of them do allow or ensure that any data entered is deleted, only used for purposes of processing, not used for any other trading purposes or saved otherwise, and I think that's going to be very important for most businesses who are working with AI. That that's part of their agreement with whatever provider they're working with. That that's part of their agreement with whatever provider they're working with. So that's certainly one consideration. There's new models coming out all the time. Some of them are more effective, shown to be more effective in certain tasks than others, and I think it really depends on the application. I just encourage the audience to a degree to do their own research and try and figure out what's going to be the best model for their particular application.
Mike Chung: And is there anything else our audience should keep in mind, especially if they're in product development, whether in automotive, software, or other industries? Any parting words or additional things you'd like to share, Stephen?
Stephen Goldsmith: I guess I'll leave you with one last thought. We've talked a lot about AI, particularly its application in digital products, but when you're thinking of physical products (and even digital products too), there's still real room for traditional approaches and techniques. One story I heard at one point, I think it was either Samsung or Sony: their televisions weren't selling nearly as well as they thought they should be, based on what was in the box and where their price point was. They finally sent somebody into the showroom to observe what was going on. What are people doing in the showroom while they're making this decision? Because people aren't making purely rational decisions; they'd figured that out. And what they finally figured out, watching people in the showroom, was that TVs were a piece of furniture, and people were making decisions based on how the TV would look in their room.
Stephen Goldsmith:Sure, Not. What is the latest resolution that this TV could provide, or, you know, the latest gimmick that's going to be in that box?
Stephen Goldsmith:And so I think, when you, particularly when you think of physical products. There's still room for that and probably no substitute for that type of approach of having somebody just observing human behavior and trying to find and make sense of things that are happening in the market that you just may not be able to figure out looking at quantitative data or an analysis by an AI alone.
Mike Chung:Right, it's like that path to purchase. What's going on in the consumer's head as they are making that decision, as they're researching a product, and what's that shopping experience like and what are the drivers? So yeah, terrific. Well, I feel like we covered a lot of interesting material, stephen. Is there anything else you'd like to share before we close up this one?
Stephen Goldsmith:I think that's the last thought I'll leave you with, unless there are any specific questions. Mike, you have that. We then get to.
Mike Chung:I think just a fun sort of random off the wall question. Let's say you're hosting friends over for dinner this weekend and it's a dinner party. What kind of foods might you serve?
Stephen Goldsmith:Mike, do you want to come over this weekend for dinner?
Mike Chung:You know me too well, Stephen.
Stephen Goldsmith:Well, I actually got some great tuna from Costco the other day and, yeah, that would probably be on the menu.
Mike Chung:Would it be like a grilled tuna?
Stephen Goldsmith:Seared on the outside, but raw on the inside, I'm getting hungry just thinking about it.
Mike Chung:I'll be there promptly this weekend and look forward to it.
Stephen Goldsmith:All kidding aside, Stephen, really great to have.
Mike Chung:you Really enjoyed the conversation. Thanks for tuning in to another episode of Auto Care On Air. Make sure to subscribe to our podcast so that you never miss an episode. Don't forget to leave us a rating and review. It helps others discover our show. Auto Care On Air is proud to be a production of the Auto Care Association, dedicated to advancing the auto care industry and supporting professionals like you. To learn more about the association and its initiatives, visit autocareorg.