Meet the Tool Helping Companies Analyze Positioning Across AI Platforms

Podcast: Politely Pushy with Eric Chemi

September 25, 2025 | Hosted by Eric Chemi

In this special episode of Politely Pushy, Bospar Principal Curtis Sparrer and Freshwater Creative Founder Jennifer Devine discuss a critical challenge businesses are facing today: the influence of AI-powered search engines. 

They also explain how Bospar’s new tool, Audit*E, addresses this challenge.

Audit*E evaluates any company’s presence and content performance across eight major AI platforms – including Claude, ChatGPT, Gemini, Grok, Meta AI, Mistral and DeepSeek – providing a modern brand management solution.

For more information on how AI is changing how businesses find solutions and suppliers – and why Audit*E is the right tool at the right time – also see:

(Find these interviews and a whole lot more at Bospar’s video page.)

Transcript

0:07 Welcome to Politely Pushy. I’m your host, as always, Eric Chemi. Today, we’ve got two guests,

0:12 Curtis Sparrer, the principal of Bospar, and Jennifer Devine, who’s working with Bospar to help us create AuditE, which I

0:19 think is the first ever, what do you call it? Brand safety tracking in the AI

0:26 engine marketplace. Right. This has come up. What do they say? Necessity is the mother of all invention. And you two ran

0:33 into a big necessity situation and created this. So, Curtis, I’ll start with you. What was the problem that you

0:40 were trying to solve? And then Jennifer, how did you solve it? Well, what we discovered was that our client RealSense was about to launch,

0:47 and yet when you would ask ChatGPT or other engines about it, it would say that the company was dead. It was

0:53 defunct. And so at a certain point, this is a real serious issue because a lot of

1:00 executives say that they now depend on ChatGPT and other engines to

1:05 find out information about who they’re going to hire or partner with next. And so we really needed to change how the AI

1:13 engine saw this. And so Jennifer, that’s where you really stepped in to rewrite the

1:20 record, so to speak. Yeah. So as Curtis said, we discovered this issue in how RealSense was

1:28 appearing within responses when individuals were querying for, you know,

1:35 how can we get a great depth-vision camera? Or even, who

1:42 is RealSense? RealSense used to be a division within Intel, which was

1:48 discontinued, and it was a misinterpretation of some information that had been published quite some time

1:54 ago, when Intel and RealSense discontinued a specific line of product,

2:00 but it was interpreted as the entire division had been spun down. And now that

2:06 we were ready to spin out independently, we were up against this issue. So at

2:12 the time, you know, it took quite a while to hit each of the platforms, put

2:18 in the queries, understand where the... The AI platforms, you’re saying? Hit

2:24 each of them: ChatGPT and Claude, Anthropic, Gemini, Copilot, whatever it is.

2:30 Yeah, exactly. All of them across the board, because the issue was present

2:36 across all of them. Some more than others, but we needed to find out and

2:43 gather what the issues were and how it was appearing across all of the platforms that are now highly used in

2:49 search. And why does it take so long, though? It seems like it should only take a couple of minutes to just type in what do you

2:56 think about this company, open up eight browser tabs, do that, and then wait. Yeah. But then you’re

3:02 gathering and analyzing and assessing the sources that this

3:08 information is coming from as well. So there’s a level of interpretation and analysis that has to take place in order

3:15 to formulate a strategy on what the fix is. And in doing that, we realized

3:22 there’s got to be a better way that can expedite the process, helping us

3:28 get to the solution faster, but also gathering the information

3:37 into one place so that it can be analyzed more quickly over time as well.
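In software terms, the gathering step Jennifer describes is a fan-out-and-collect loop: the same query goes to every platform and the responses land in one dataset. A minimal sketch in Python; the platform calls are stubbed with canned text, since Audit*E's actual integrations are not public:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

def query_platform(platform: str, prompt: str) -> str:
    # Hypothetical stand-in for real API clients; each platform is
    # stubbed with canned text purely for illustration.
    canned = {
        "chatgpt": "RealSense is a defunct former Intel division.",
        "claude": "RealSense makes depth cameras.",
    }
    return canned.get(platform, "No information found.")

@dataclass
class AuditRecord:
    platform: str
    prompt: str
    response: str
    retrieved_at: str

def run_audit(platforms: list[str], prompt: str) -> list[AuditRecord]:
    # Fan the same query out to every platform and collect the
    # responses into one dataset for side-by-side analysis.
    now = datetime.now(timezone.utc).isoformat()
    return [AuditRecord(p, prompt, query_platform(p, prompt), now)
            for p in platforms]

records = run_audit(["chatgpt", "claude"], "Who is RealSense?")
for r in records:
    print(f"{r.platform}: {r.response}")
```

Having every response in one structure, with a timestamp, is what makes the repeated, continuous auditing described next practical.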

3:43 So, it wasn’t just about we need to find the fix here. It was about how do we um

3:49 audit continuously to make sure that the fixes that we are recommending are

3:55 actually working. It feels like so much of this is very similar to what

4:01 we would deal with with SEO, right? Where people would say, “Hey, let’s go look up what Google says about you and let’s try to fix it.” Is this basically

4:07 the same thing or is this somehow fundamentally different?

4:13 It’s not fundamentally different, but it’s more different than basically the same. It’s in between somewhere.

4:22 There are aspects of it that are quite different. First and foremost, these

4:28 platforms are formulating their own responses. Whereas in SEO if you’re

4:33 ranking well it is your content that is being indexed. Right. They’ll either link to your site

4:38 or they’ll link to a news site about you or a blog about you. But they’re not making up stuff. Right. Exactly. Exactly. And when you get a

4:45 highly ranked link within the search engines, the page title shows, the description of that page

4:51 shows, and you formulate all of that yourself. So the game there is making sure that you’re ranking high within

4:59 within the search engines. The game with the AI platforms is making sure that the

5:06 information is easily accessible, highly credible and consistent so that when

5:11 they access it and formulate their responses, you know that you have a higher

5:17 likelihood of showing up properly. So Curtis, how did this work

5:22 when you went to RealSense and said, “Hey, by the way, AI doesn’t think you guys exist.” You know, AI has confused

5:29 the product, one product being discontinued, with the whole company having been discontinued. So now it seems like

5:36 we’re spinning out a company that doesn’t exist. When you found that out, did you go to them

5:41 and figure it out? Did they come to you and say we’re having this problem? It was something that was communicated

5:46 back and forth with both sides, and they really regarded it as their cross

5:53 to bear and we said no this is something that we can absolutely fix and this is

6:00 how we can do it. And there was almost a palpable feeling of, well, I hope

6:07 you can, this would be huge for us, but also a feeling that this was something that they couldn’t fix otherwise. And I

6:14 think if we were to like widen out the aperture and think about what this means, I think this means how important

6:19 it is for every brand to take a look at its errors and omissions policy and

6:27 ensure that there are no stray narratives out there that are untrue that have not been corrected or

6:32 remediated. Because this just shows how something like a bad story can lead to

6:40 this kind of result where it could be potentially catastrophic. And I think

6:45 that’s the real challenge of these wild AI times we’re now living in: AI

6:51 can still hallucinate. AI can still get things wrong. And yet more and more executives are treating AI like the

6:58 Oracle of Delphi. And they are admitting that they’re going to go to AI first.

7:03 And if they are convinced enough, they’re going to make a purchase or business decision based on the first

7:09 results AI serves them. So has this changed fundamentally

7:15 the business of PR? Right? Before it was all about, hey, when people Google you, you want to have good stories about you.

7:22 You want to have the good links. So it’s thought leadership. It’s getting top tier placements. It’s your own content on your site. All of that stuff. Now

7:29 it’s, and we all know it. I’m sure we’re all using AI engines way more and using

7:35 Google a lot less, right? So, are you seeing this change of well, we don’t

7:42 even know what they’re going to say? So, the way we’re approaching the kinds of stories that we should do, is it much

7:47 more about quality or quantity? Is it about fixing your own thought leadership, your owned content instead

7:52 of your earned media? How have you had to adjust what you’re doing in your

7:58 day-to-day job in concert with this tool that we’re using now to figure out all

8:03 these mistakes that the AI is making? Well, the first thing is that this new

8:08 world order has made it a lot easier for people to understand the importance of PR. Beforehand, CEOs would say,

8:16 “Whoa, I’m skeptical PR can really work.” But now that they know that earned

8:22 media placements from journalists who are either super well-regarded in a

8:28 niche space or in a, you know, quote-unquote top-tier publication could influence AI, well, now they’re on board.

8:35 So there is that. The other part of course is that there used to be an exercise where we would think well

8:41 should we publish a release or not? And then we’d think well who would write it? Who would read it? And now that kind of

8:48 thinking is rather quaint, because now we know AI is going to read all the releases. And so...

8:54 That’s a good point. We should pause on that, right? People weren’t reading those releases, but AI is

9:00 reading all the releases. So maybe in the past we would have said, oh, you know, don’t waste your money and time on

9:05 a press release. Nobody cares. Just stick it on your site. Maybe now it’s like maybe you should do all these press releases because it’s going to feed that

9:12 engine a little more. It’s going to feed the beast. And that is an important part of our calculation.

9:19 I think the biggest challenge that AuditE is solving, though, is that you

9:24 could put something out there, but it takes a lot of work to figure out how each different AI engine is going to

9:32 regard this information and how long it’s going to take them to absolutely fix it or ignore it. Yeah,

9:40 they have different sources. They have different preferences for where they are getting their information during

9:47 these in-session queries. Some one

9:52 algorithm might want your website, another one might look at what media writes about you, another one might look

9:58 at social media. Is that what you mean by what they’re prioritizing? Is that what you’re trying to say? Like

10:03 different AI engines will use different sources. We found in our analysis that, while they’re

10:12 heavily weighted to external sources, the internal or owned sources are still

10:17 very, very important. So your own website is still going to be referenced and accessed. It’s the easiest thing to find

10:24 about you and source information about the organization. However, it wants to then go out and verify or

10:31 validate in that external content, external to your own website and

10:38 assets. News releases or press releases are considered

10:43 an owned asset. So beyond that we are looking for other external sources

10:50 of verification. So it might be top-tier media, it might be a Wikipedia or

10:57 a Crunchbase profile, and having that consistency. Each of them has a different preference for the type of

11:04 external source or material that they’re looking for. So what we found, for example, and this is supported by

11:11 other research being completed in the industry, is that, for example, Anthropic’s Claude prefers academic, medical, you

11:20 know, research papers, white papers; there is a flavor for that. ChatGPT for the longest time wanted and was hungry

11:27 for Wikipedia references and sources, so if you didn’t have a Wikipedia page

11:33 you might be in trouble in what it might come back with in its

11:38 response about your organization. However, that’s shifted over time. They are now more reliant on similar

11:45 sources, like the top-tier media and external profile pages, and

11:50 looking for consistency across there. So really knowing where it’s getting the

11:56 content from, and where you need to be looking and maybe shoring up your presence, is, you know, something we can

12:02 pinpoint a lot more quickly now than we could before we had AuditE.

12:08 So I’m curious. Do you have a sample we can look at? I know for people listening

12:14 they’re not going to be able to see it, but if you’re watching this recording, is that something you can pull up for us, Jennifer? Yeah, absolutely.

12:20 Put you on the spot a little bit. I will quickly share this. Let’s see.

12:26 We might have to blur out some sensitive stuff, I guess. Yeah, we can do that in post.

12:32 But let me just find... Okay, I selected the tab. And the name, you know, for people who

12:38 are listening: it’s not spelled like the normal word “oddity.” Curtis, tell us

12:44 how it’s spelled and why it’s spelled that way while Jennifer gets us loaded. Yeah, it’s audit, the word that no

12:50 one wants to have happen to themselves, and the letter E. And we earlier launched PushE.

12:56 push. Okay. So when I go to this screen, are you seeing it? Okay. I just want to make sure. Sorry to interrupt, Curtis.

13:02 Yeah, we see the screen. You see? Yeah, we see it. Yeah, but finish. Curtis, tell us your audit audit story.

13:07 Yeah. So, AuditE is, you know, a brand that really says what it does.

13:14 This is an electronic audit of your footprint and we’re able to take a look at how you’re showing up in all the

13:21 queries and that’s going to help you as you look for weak spots or look for

13:27 places that you can improve. I think one of the most interesting meetings that Jennifer and I took, for example, was

13:33 with a media company and Jennifer zoomed in on the fact that their about page was

13:40 really a weakness that a lot of companies don’t invest enough in. And Jennifer, you were making all sorts of

13:47 recommendations about how that about page could be bigger. And I suspect this

13:52 is true for all sorts of companies that don’t think the about page is important but really are overlooking something

13:58 that’s really vital. Absolutely. Um you know typically what

14:03 we see with our clients when we’re looking at redoing their sites or optimizing their sites, they’re

14:09 deprioritizing that about page, mistakenly. You’re saying they’re making the mistake of deprioritizing it.

14:15 Okay. Yes, especially now. It is a very, very important source of content. One of the first places that

14:22 the platforms will look to, the AI platforms will look to, for information, that baseline information:

14:30 what date was it founded, who is the leadership, who is on the team. They can quickly then branch out and

14:36 source social media platforms and links so that they can make the

14:42 connection with the content and build an understanding of the organization. So if a company or a

14:48 client has deprioritized that, or they’ve, you know, just put a really quick overview as though nobody really

14:54 cares, it’s not important, what really matters is our services and products, they’re really missing out on an

14:59 opportunity to help these platforms create a deeper understanding, and a

15:05 factual, accurate understanding, of the organization itself. So what I’m showing you here is a profile that is created

15:13 for Bospar within AuditE, and you can see here you add a source of truth,

15:19 you add the, you know, overview about the company, you can add competitors.

15:24 We would manually do that, the source of truth? Yes, because we don’t know if the AI is

15:30 accurate, so we have to set up a “here’s the baseline that I want you to compare to.”

15:36 Absolutely. So, in some of the queries: we have a number of queries that we’ve created here

15:42 and we believe are the baseline important ones to have. Now, we can customize queries for other

15:50 clients, but the baseline ones are the ones that we’re going to be tracking, and then we would customize for each

15:56 profile. But we need to know whether what these

16:02 platforms understand about the organizations is accurate or not. So in

16:07 some of the queries we will come back to this as the source of truth. Do you know anything about this organization?

16:15 Yes or no? You know we have um the ability to um calculate that and and

16:21 analyze what it is understanding about the organization. So is there a presence or mention? Is there a general

16:27 understanding? And then we check for accuracy across the platforms. Um, and

16:33 if there are inaccuracies, we can look to providing suggestions to the client

16:39 on how they can improve the accuracy. So that could be that it’s sourcing

16:44 outdated profiles. You heard me mention crunchbase earlier. It could be that it just doesn’t have enough information or

16:50 connection within the information that it’s sourcing to know which is right and which is wrong. If you have an old

16:56 profile out there that has old team members on it, you might find those old team members, even past CEOs, showing

17:04 up in the responses that these platforms are formulating. So we can look

17:10 across all of these platforms at what the responses are.
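The presence and accuracy check Jennifer describes can be sketched as a comparison of each platform's response against the client's source of truth. This is illustrative only: the fact fields, the keyword matching, and the flags are assumptions, not Audit*E's actual scoring criteria:

```python
# Client-supplied "source of truth" profile, as described above.
SOURCE_OF_TRUTH = {
    "name": "Bospar",
    "founders": ["Curtis Sparrer", "Chris Boehlke"],
}

def score_response(response: str, truth: dict) -> dict:
    # Naive keyword matching stands in for whatever language analysis
    # a real audit tool would use.
    text = response.lower()
    presence = truth["name"].lower() in text
    correct = [f for f in truth["founders"] if f.lower() in text]
    flags = []
    if "defunct" in text or "discontinued" in text:
        flags.append("claims the company is defunct")
    return {
        "presence": presence,                                  # mention at all?
        "founder_accuracy": len(correct) / len(truth["founders"]),
        "flags": flags,                                        # known bad narratives
    }

# The hallucinated-co-founder case discussed later in the episode:
score = score_response(
    "Bospar was co-founded by Curtis Sparrer and Gabriel Ayala.",
    SOURCE_OF_TRUTH,
)
print(score)  # presence True, founder_accuracy 0.5, no flags
```

Averaging such scores per platform, and flagging anything below a threshold, gives the kind of scored dashboard described next.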

17:15 I see. So in one place you can look at all the different responses right there. Exactly. And it analyzes them as

17:23 well. So you’ll see the averages that are coming back based on criteria that we’ve input into the platform, and

17:30 then scoring it. And if the score is coming back below the performance level that we want to see it at, it will

17:37 provide recommendations and tasks that can be taken, and we’ll take those

17:42 recommendations and go back to our clients with these tasks. They can

17:48 work them into their own work schedules, or we can take on and support them in completing these tasks as well, much

17:54 like you were saying. I see. So there are prescriptions to fix the problem. It’s not just, oh,

18:00 here you go, here’s a problem. Yeah,

18:06 it’s different from just a prescription in that it’s not the same for everyone. And I guess I would

18:12 want to clarify that. Well, depending on what each company’s issue is on each particular platform, it

18:18 will give different advice about try to update this on your site or try to get this kind of earned media or try to get

18:24 this kind of third-party validation. Those types of advice points.

18:30 Absolutely. And then you can track over time, so you see the performance over time. You can have a view into whether

18:38 the work and tasks that you’ve taken, the actions that you’ve taken within your strategies, are working, because it can impact

18:43 your content strategy. It can impact, you know, your tech stack, how your website is set up, the type of content

18:49 that you have on your site, where you’re posting, your, you know, social strategies. Making sure that these are

18:55 all aligning. When you take those action items into your

19:01 workflows, you can measure over time whether you’re getting the results that you want. And if you’re not getting the

19:07 results that you want, you can then, you know, look a little more deeply into what could be going on there.

19:14 I’m just going to stop sharing my screen because I can’t see you, and I want to see your faces. So

19:20 Curtis, go ahead. What were you thinking? You know, I think the thing about this is that AuditE is a

19:30 necessary tool for us as PR practitioners, but we see it also as a

19:35 necessary tool, if not a lifeline, for other companies out there that may not

19:41 be able to invest in PR itself, but still want to know what they can do,

19:47 because, yes, there’s all sorts of PR activity around earned media, but there’s

19:52 all sorts of back-end work companies can do as well, such as an FAQ, or making sure

19:58 that they have an AI-friendly schema or making sure that the content is AI

20:04 ready. And so we’re going to be offering AuditE on a subscription basis to

20:10 companies that are not our clients at the end of the year. So for those companies that are clients and, you know,

20:18 need regular reports, we’ll have that. But for, you know, up-and-coming businesses all the way to big enterprises,

20:24 we’ll be offering subscription models in different configurations so that

20:30 people can figure out what’s going on underneath the hood in eight different engines.
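The "AI-friendly schema" Curtis mentions a moment earlier generally refers to schema.org structured data embedded in a page as JSON-LD, which gives crawlers and answer engines machine-readable baseline facts (founding date, founders, official profiles). A minimal sketch; the field values and URLs are placeholders, not Bospar's actual markup:

```python
import json

# Minimal schema.org Organization markup; all values are placeholders.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example PR Agency",
    "url": "https://example.com",
    "foundingDate": "2015",
    "founder": [{"@type": "Person", "name": "Jane Founder"}],
    "sameAs": ["https://www.linkedin.com/company/example"],
}

# Embedded in the page <head> so engines can parse it:
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(organization, indent=2)
    + "\n</script>"
)
print(snippet)
```

The `sameAs` links matter here because, as Jennifer notes, the engines try to verify owned content against consistent external profiles.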

20:35 How hard do you think it would be for somebody else to come up with something like this? Like, do you have a moat? Do

20:40 you feel like this is unique IP, or, you know, are we afraid of copycats? Well, I

20:46 don’t think we’re afraid of copycats because I think this is a big enough problem that everyone should be thinking

20:52 about it. I will say that we shared this with Rob Enderle, who is a tech analyst quoted by the New York Times, Washington

20:59 Post, and the Wall Street Journal. And Rob said he was surprised that a PR agency was offering this. He said he

21:06 felt this level of technology was better suited for an AI company that was venture-backed. And so I think that

21:13 that’s the level at which we’re offering this where we want to make sure that our

21:18 clients are able to see what’s out there and determine how they can improve

21:24 things because for a lot of people this is how things are this is how search is

21:30 going to be and we want to make sure that everyone is equipped and ready to handle the challenge.

21:36 What about the inverse kind of question? So, one thing is to say, hey, you look up

21:41 company XYZ, right? I look up Bospar, I look up RealSense, I look up whatever, here’s what they said about it. The

21:47 other problem is the decision-making problem. So, let’s say I go in and I’m a possible buyer. I’m a

21:55 customer. Hey, what is the best AI robotics software provider that I should

22:00 be buying from? And then you want RealSense to show up in that, right? You want robotics hardware in

22:07 that case, the cameras, but they didn’t type in the company’s name, they just typed in a question. But

22:13 now you want your client to be in that answer. So it’s a slightly different kind of problem, a different question. Does this

22:20 help with that? Does this help solve, or get those people, the AI’s recommendation of, like, you

22:27 should consider blah blah blah? Yes, absolutely. So one of the important queries

22:33 in the baseline set that we have in the platform is, in comparison

22:39 to competitors, and with queries related to the products and services of the

22:46 organization that we’re auditing: what is the presence and what is the visibility within recommendations

22:54 that do not mention the brand name itself? So it’s a very important aspect

22:59 of what is available, and it provides insight into that. I think that from

23:05 the user standpoint, or the individual who is querying and

23:10 researching at that time: we also want to know, as users of these platforms in our day-to-day work lives and

23:17 personal lives, is there accuracy in what we’re experiencing?

23:23 And, you know, AuditE can help with that, because it’s not just helping our

23:28 clients ensure that they’re showing up accurately, properly, and visibly in

23:34 comparison to their competitors; it’s helping individuals have more

23:41 confidence in the responses that are coming back. Because, as Curtis had mentioned, there is a

23:47 propensity for these platforms to hallucinate or make things up, or just sort of

23:53 peter out before they’ve completely analyzed and gathered all of the information. So the

23:59 recommendations might be at a top level. But once things are formulated

24:05 properly and digested and taken into consideration, that accuracy is going to improve for the users of the

24:12 platforms as well. Before we go, are you able to pull up, Jennifer, the screen one more time and show us, let’s say in

24:17 the case of Bospar, a specific example of this is what was correct? Like one thing that was correct, one thing that was

24:23 wrong and and one piece of recommendation advice that that the engine said, hey, do this if you want to

24:30 fix that incorrect information. Yeah, that might take a little

24:36 bit of time because there’s lots of audits here. Let me just... Or you can maybe explain it, or if Curtis remembers you can talk

24:42 about it, how that’s something that you’ve noticed or something you’re actively trying to fix.

24:49 Okay. I don’t mean to put people on the spot, but I think No, no, no, not at all. Not at all. It

24:54 might take a little bit of digging, but let me see. There were... It’s just going to be a matter of pinpointing

25:00 the exact audit that came back. And while you’re looking that

25:06 up, I’ll ask: Curtis, as you’ve talked to other clients about this, are they all saying, “Oh my gosh,

25:12 I need to do this. I need to figure it out, because we know AI is botching it for us a little bit.”

25:17 There is this definite sense of, I need to get ready for this and I already feel

25:23 behind the eight ball. And there’s, you know, two approaches we’re looking at.

25:28 One is answer engine optimization, AEO, and that is really the Google answers

25:34 you might see at the top of search, for example. An easier way to think of it is, like, what is the capital of New

25:41 York? Albany. And that’s not very narrative. Generative engine optimization, however, is the narrative

25:47 approach, where we are asking which PR agencies would you recommend for

25:53 a pharmaceutical company that’s looking to introduce a new drug, or what PR agencies would be

26:01 helpful in taking a company out of stealth. And that’s more narrative, where

26:07 an AI would be preparing that kind of answer. And so we want to tackle both, and

26:12 companies of course will want to do that. But I think the big sort of wins

26:19 to be had as people work with AI is definitely on the generative engine optimization side, the GEO side, because

26:27 that’s what people find the most convincing and the most compelling.

26:33 I found an example. Let’s pull up... I’m going to pull up your screen, and let’s see.

26:38 Talk us through this. Great. All right. Okay. So, you’re seeing my screen properly here?

26:44 This is just a quick example. Here is an inaccuracy that we’re seeing:

26:49 “Bospar was co-founded by Curtis Sparrer and Gabriel Ayala,” whoever that is. I

26:55 mean, I think that’s our research partner. I know her, but she’s not a founder. Yeah.

27:02 Yeah. Not a founder. It has found a connection. It wants to provide an

27:07 answer, and a fulsome answer, and it’s subbed in some information that’s not quite correct. So we went in

27:14 and worked a bit on the website, made sure that it was sourcing the right information. So if you go to a more

27:22 recent summary, you will see it starts

27:28 to shift its answer. So now it’s Curtis and other key leadership,

27:34 other key leaders. Then as we get closer to the most recent reports, it

27:42 starts to recognize the changes and tweaks that we’ve done. And let me just

27:48 see... I just have to find the one where it mentions Chris Boehlke as well. It starts pulling in the right

27:55 information. So you will see, here we go: “Curtis Sparrer co-founded the company.

28:00 Many other key leadership members include Chris Boehlke, who is also a co-founder.” So we’re seeing a shift in

28:07 accuracy as we pull in the reports,

28:13 analyze, and make tweaks over time. We’re getting more accurate. So it’s no longer bringing in incorrect individuals.

28:21 It’s recognizing that problem and now sourcing more accurate

28:26 information. Where do you see the AuditE tool? Let’s say a year from now. Do you

28:32 want more engines? Do you want, you know, quicker report times? What’s the goal? Let’s say the next 12 months.

28:38 I think the big goal is the kind of creative PR thinking. We had one person,

28:46 Rob Enderle, who was looking at it, and he was, you know, looking at AuditE’s recommendations, and he said, well, why

28:52 didn’t AuditE just tell the company to completely rebrand and change its name? And that was a big ask for an AI:

29:01 hey, you need to start over, just change your name, it’s all done, it doesn’t even matter. And whereas

29:09 I doubt we could get AI to that level of complexity and rip-up-the-playbook

29:15 decision making, I do think that’s an interesting challenge, to bring that level of creative big-picture thinking

29:23 into a diagnostic and remediation tool.

29:28 Yeah. And who knows what PushE would have said, right? So, we do have our PR assistant, PushE, who is likely more

29:36 primed, being educated on

29:41 Bospar’s own content strategies and case studies. PushE may have made

29:47 that recommendation. But we’re also relying on our team’s expertise

29:53 and knowledge to put forward the strategies that we’re communicating to our clients. And

30:00 these are tools that certainly support us. They make us faster, more knowledgeable, and more insightful.

30:05 And I think that a year from now, hopefully, that just continues, that we continue to build on

30:10 that and, you know, offer those advantages to our clients

30:16 and then those who want to subscribe to it, making sure that the tool is accessible and available to them

30:23 as well. It’s compelling to hear what Rob Enderle said: well, why doesn’t the tool

30:29 just tell them to do XYZ thing and give them higher-level advice? Because already it’s starting to give a smaller

30:35 level of advice. Why can’t it go bigger and bigger? And it makes me think about what is the future of a PR agency all of

30:40 a sudden. Do you need teams of people? Or is it just, hey, I’ve got the PushE assistant, I’ve got AuditE

30:47 auditing. The assistant will use the auditing to figure out what’s going on, and the assistant will just tell you

30:52 what all the advice is. And you know Curtis will just have a picture of his face on the website but there’s no one

30:58 actually doing anything. You know a lot of journalists tell me that they are getting an influx of AI

31:04 written pitches, and there is even a set formula where the AI says, hey, so-and-so,

31:11 I really loved your story on blank. And it goes to such a degree of brown-nosing,

31:18 if you will, that it becomes kind of obvious. And the people who are writing

31:24 the stories are hip to that game and they’re already hitting delete. So I

31:29 think that as long as we are relying on human beings to be the arbiter of truth

31:35 and write stories, I think there will be use for human beings to handle the PR as

31:41 well. And to Rob’s exhortation about why don’t you just rip up the playbook and

31:46 change the name: I love PushE, but I just can’t imagine PushE saying,

31:52 “Nope, game over. You need to change.” So, I think that level of higher level

31:57 thinking is always going to be needed. I laugh because there’s so many

32:04 directions this can go. You can see where the world is headed in the next few years. I mean, we see it even for this podcast. We get so many pitches

32:10 that are clearly AI-written. "I loved your episode with Curtis and Jennifer. You should

32:17 have this guest on your podcast." I mean, again, you get so many of these. It doesn't fly, right? And

32:23 the rule that we're always providing, or the input that we're always

32:29 providing to our clients, is humans first and foremost. Every article, every

32:34 piece of content that you're creating and every interaction has to consider the humans first, because

32:41 when humans get those types of emails or get to a site that's, you know, overly

32:46 constructed for SEO or bots, or has awkward-sounding content that's not

32:52 structured in a way that's easily digestible, they're out. You know, they delete or they

32:57 close the window, and you've lost them. So this is really going to be

33:02 providing a huge benefit, but that human touch and that human

33:08 experience is still the leading aspect of everything we do. At some point, do you see a world where

33:15 the articles are being written by AI anyway? So the AI is receiving pitches,

33:21 but the pitches are being written by AI, and the pitches are being advised by AI-based PR agencies, and

33:29 the engines are the ones that are, let's say, reading the articles. And in the end, you know, it's someone

33:36 using an AI to figure out, well, what decision should I make, what vendor should I buy. And I'm using AI to help me

33:42 make that decision, but it's just reading articles that were written by AI that were pitched by AI. Like, at some point,

33:47 doesn't it feel like we're getting to that kind of ecosystem? It does. But I think the challenge

33:54 with that sort of scenario is that you would still need

34:01 people to buy into it. And if people are going to buy into it and think that’s great, then sure. I think the challenge

34:08 with AI, though, is that there are certain things it does that are tells, for

34:15 example, the em dash. And so I think that people are kind of hip to the AI prose

34:22 and copy, and I don't think it really arrests people in the same way that

34:29 human-generated content will. So I can see where that would happen. I just

34:35 don't know if it would be very popular. Yeah. And I think, at least the more time

34:43 I've spent, and I'm sure it's not a unique experience with these tools, the more

34:50 aware I am of the weaknesses that require the human intervention, the

34:55 human eye, to ensure that, again, that recommendation is a

35:01 solid recommendation. And

35:06 I agree with Curtis. It's a possibility that people could be

35:13 using these tools in these ways, and the proliferation of AI making decisions and actually making the purchases and what

35:19 have you. But at the end of the day, it's going to be the human beings

35:24 that need to live with the solutions that are recommended. If it's not the best

35:30 solution that's been recommended, or there's a problem in there, people

35:36 will realize that and correct their ways, I think, quite quickly. Yeah. Fundamentally, it's all of

35:42 our money at stake, right? Like, people are the ones that have the investments, that have the money, that have the budgets that flow into all

35:49 of these things, the money and the experiences. So, you know, if it's leaving out a

35:56 plethora of options simply because it hasn't been able to access or consider that information,

36:04 and you're ending up with systems that don't work together, and communications tools and products in

36:10 your home that don't actually operate the best that they could, you

36:16 know, people will desert the platforms quite quickly if they're not ending up with a quality product in

36:22 the end. Yeah, Curtis, Jennifer, I really appreciate the time today. Thank you so much. Thanks so much.

36:29 Thank you to my guests, and thanks for listening. Subscribe to get the latest episodes each week. And we'll see you

36:34 next time.
