
Welcome to Bospar’s AI University

Podcast: Politely Pushy with Eric Chemi

June 3, 2025 | Hosted by Eric Chemi

Welcome to AI University! At Bospar, we understand that AI isn’t going anywhere. So instead of resisting it, we are embracing AI and leveraging its power to help us do our jobs better. Courtney Merolle and Kyle Ankney, our professors at AI University, join Eric Chemi to talk through how we thoughtfully leverage AI in our day-to-day operations and how we ensure every Bospartan is well-versed in the complexities of AI.

Transcript

00:00:10.639 Today we are talking about AI of

00:00:13.040 course right AI is the only thing that

00:00:14.559 people seem to be talking about these

00:00:16.000 days and today I’m with Kyle and

00:00:17.840 Courtney Bospar employees who are in

00:00:19.840 charge of the Bospar AI initiatives AI

00:00:22.480 University AI programming everything AI

00:00:24.960 that’s happening here at Bospar so you

00:00:27.039 know Courtney I’ll start with you every

00:00:28.320 headline I read right now is that AI is

00:00:30.560 going to eliminate everybody’s jobs

00:00:32.479 including all the people that work in

00:00:34.399 places like this well what is your take

00:00:36.079 on that

00:00:37.920 I definitely don’t think it’s

00:00:39.120 eliminating our jobs but I am going to

00:00:42.239 sound like a broken record because all

00:00:43.840 the other CEOs are saying the same thing

00:00:45.920 it’s going to make us more productive

00:00:47.840 it’s going to make us more efficient

00:00:49.600 there’s ways to create our outputs and

00:00:52.000 and think creatively and and do our

00:00:54.320 deliverables in a way that we’ve never

00:00:56.719 been able to do before and it doesn’t

00:00:59.840 eliminate our jobs we still very much

00:01:02.079 need a human touch within communication

00:01:04.080 we speak about emotion um and end

00:01:06.640 product and what that looks like and

00:01:08.880 the synergy that exists within it you

00:01:11.040 can’t remove us but we can be more

00:01:13.840 productive

00:01:15.439 you can’t remove us yet right right Kyle

00:01:17.520 what you’re you’re on the cutting edge

00:01:19.200 what do you see because I feel like a

00:01:20.320 lot of AI now it can do emotion it can

00:01:23.040 do reasoning it can do it can do so many

00:01:26.159 things that we would have thought needed

00:01:29.119 a human to do no I think that’s true and

00:01:31.680 Courtney and I were having

00:01:33.600 this discussion earlier that yes with

00:01:36.159 each new model and each new progression

00:01:38.320 there’s always things that are happening

00:01:39.680 we’re like “Wow that’s different that’s

00:01:40.960 new that’s exciting.” But at the end of

00:01:42.640 the day you know especially in places

00:01:44.720 like a PR agency I think people are

00:01:47.280 paying for expertise and a lot of people

00:01:50.159 can experiment with you know different

00:01:52.479 models and different prompts and things

00:01:54.479 and get really good outputs but at the

00:01:56.720 end of the day the application and

00:01:58.960 figuring out how to really package it up

00:02:01.600 in a way that not only makes sense to

00:02:04.479 the end user but the media there’s a

00:02:07.520 skill that isn’t quite at the AI level

00:02:10.479 and I won’t say that we’ll never get

00:02:12.080 there but we’re not there yet so so what

00:02:15.520 are some of the initiatives that you two

00:02:16.959 are working on internally is it about

00:02:18.879 educating employees is it about you know

00:02:21.440 showing them the right tools is it

00:02:23.040 about AI ethics what exactly is the role

00:02:25.920 of leading these Bospar AI initiatives

00:02:29.920 feel free to jump in you two can

00:02:31.440 disagree with each other and argue with

00:02:32.879 each other don’t wait for me to ask you

00:02:34.160 guys can just jump right in and chat

00:02:35.280 with each other I would say right now

00:02:37.680 we’re creating what I like to refer to

00:02:39.920 as the AI playground I think a lot of

00:02:42.800 people are very fearful about AI to a

00:02:45.440 point where they’re scared to even use

00:02:47.440 it dabble in it we hear about you know

00:02:49.760 data concerns and we as a PR firm

00:02:52.239 have NDAs and you know there’s a lot

00:02:54.720 from the journalist side about

00:02:56.080 plagiarism and you know not giving

00:02:58.400 commentary that’s human-made um so

00:03:01.599 there’s a lot of concerns and scary

00:03:03.040 things attached to AI and it’s created a

00:03:05.680 world where people are not adopting it

00:03:07.519 as fast as you would expect considering

00:03:09.680 it does have incredible power in

00:03:11.920 speeding things up and we’re in a world

00:03:13.680 where you know we’re incredibly busy and

00:03:15.920 and would love to be able to power

00:03:17.519 through some of our workload smarter and

00:03:20.159 faster um so AI University is a lot

00:03:23.840 about creating guard rails and doing

00:03:27.120 some handholding but more so empowering

00:03:30.319 employees to find ways for them to

00:03:33.760 create a better workday for themselves

00:03:35.920 with the AI tool that they prefer oh

00:03:39.599 it’s interesting and I’ll let Kyle jump in

00:03:41.200 too like so you’re telling me actually

00:03:42.959 people aren’t using it enough I would

00:03:44.640 have thought you might have said the

00:03:45.519 opposite which is people are

00:03:47.200 using it too much you’re actually saying

00:03:48.560 no no they need to use it more

00:03:51.040 yeah we actually deployed a survey

00:03:53.200 internally throughout the agency just to

00:03:55.120 confirm you know what exactly is going

00:03:57.120 on with AI is it a world where people

00:03:59.760 are playing with a lot of different

00:04:01.280 tools are they using one and sticking to

00:04:03.680 it are they not using it at all um and

00:04:06.400 we did find a couple of trends show

00:04:08.400 up one of which was people do find the

00:04:11.040 one tool that they like and they kind of

00:04:12.720 stick with it they don’t allow themselves

00:04:14.560 to play with other options out there um

00:04:17.440 or use new LLM systems to see what the

00:04:19.759 results look like uh but yes

00:04:22.000 particularly from younger employees I

00:04:24.000 think there is still a level of

00:04:26.720 questions as to how to use it

00:04:28.560 responsibly to a point where they’re

00:04:30.800 slower to use it

00:04:33.440 and I want to just add to that I think

00:04:35.040 another component to this too is the

00:04:36.560 client side of things as Courtney was

00:04:38.080 saying um not only are we trying

00:04:40.479 to create guard rails for what works for

00:04:42.960 us internally but also letting clients

00:04:45.440 know we don’t want to say we’re not

00:04:46.639 using it and then get caught and say

00:04:48.000 like “Oh yes you did because this is

00:04:49.520 why.” We want to be very transparent and

00:04:51.840 intentional in saying “Yes we do

00:04:54.560 use it in these ways for this reason

00:04:56.800 this is how your information

00:04:58.560 is safe.” and also empower our clients

00:05:01.840 to use it in ways that are smart and

00:05:06.720 practical for the media because the

00:05:08.240 media isn’t quite caught up with the AI

00:05:10.880 jargon or how to implement you know AI

00:05:13.840 written content we’re still figuring all

00:05:15.680 that out but to be able to utilize it in

00:05:18.160 a way that they’re getting the most PR

00:05:20.400 impact for their money and their

00:05:22.240 contract length they want to see how can

00:05:23.919 I utilize the tool as well to enhance

00:05:28.240 that experience so it really is like a

00:05:30.320 two-fold situation that we’re trying to

00:05:32.800 discover and unfold together so I see

00:05:36.160 the issue with clients

00:05:38.199 isn’t we’re concerned that you’re using

00:05:40.560 AI because we want you to be more

00:05:42.080 creative it’s we’re concerned you’re

00:05:43.759 using AI because we don’t want our

00:05:45.360 information now being put into a public

00:05:47.440 system is that really what the big issue

00:05:49.120 is I think so I think that’s more so

00:05:52.240 what I hear from clients

00:05:54.960 it’s not necessarily a creative

00:05:56.639 challenge so much as a security and

00:05:59.080 safety concern so what are the let’s

00:06:02.479 say basics let’s say a new employee

00:06:04.639 starts tomorrow what are you telling

00:06:06.400 them hey here’s the three things that

00:06:08.000 you definitely should be using AI for

00:06:09.840 and what are the three things that we

00:06:11.520 don’t want you to use it for

00:06:14.319 we don’t want you to use AI in the sense

00:06:17.280 of inputting any documentation from a

00:06:20.400 client that is not for external

00:06:23.319 consumption especially within our

00:06:25.199 industry there’s a lot of internalized

00:06:27.120 decks there’s a lot of company

00:06:29.520 information that’s not yet public

00:06:31.360 anything of that nature is an absolute

00:06:32.960 red flag we do not want that within an

00:06:34.720 AI model funding valuations things of

00:06:37.199 that nature we just can’t risk it uh so

00:06:39.440 that’s a pretty blanket no um but

00:06:42.720 things that they should be using it for

00:06:45.759 I like to guide people to using AI as an

00:06:49.560 assistant to allow you to

00:06:52.199 create more unique deliverables or think

00:06:56.160 outside of the box um I find ChatGPT,

00:07:01.000 Claude they are really practical not

00:07:03.759 just in speeding things up while that’s

00:07:05.680 a great thing it almost creates a little

00:07:08.639 buddy especially within a remote world

00:07:10.800 someone who can almost think a little

00:07:12.720 bit different from you so that you can

00:07:15.280 be a little bit more on your toes about

00:07:17.280 how you’re creating things what the end

00:07:19.360 product looks like so it’s not so cookie

00:07:21.360 cutter um we have to be really

00:07:23.840 innovative and different when we

00:07:25.759 approach media and I do think ChatGPT is

00:07:28.080 a really great way to get you thinking

00:07:29.360 outside of your standard thought

00:07:30.720 processes

00:07:32.319 yeah I know and I will say one of my

00:07:33.919 favorite additional prompts once I’ve

00:07:35.840 already got a piece of work that I like

00:07:37.840 or a pitch that I prefer or an approach

00:07:40.800 with a journalist that I think will work

00:07:42.639 I always like to flip it around

00:07:44.240 and say “Okay well now I want you to

00:07:46.000 think like this journalist at this

00:07:48.000 outlet who covers this topic and poke

00:07:51.599 holes in this entire situation

00:07:55.280 question everything and more often

00:07:58.639 than not I’ll get really strong results

00:08:00.479 of hey here’s where the pain points are

00:08:02.800 here’s what you’re not answering or

00:08:04.160 here’s where it’s being belabored and

00:08:06.240 it just helps me streamline from a

00:08:08.319 perspective that I’m aware of but am not

00:08:10.879 in day-to-day

00:08:12.639 I like that I like that approach hey be

00:08:16.000 this reporter be this journalist and

00:08:19.440 basically give me the rejection now

00:08:21.919 before I actually send the email pitch

00:08:24.160 later and officially get the rejection

00:08:26.080 you might as well let’s practice it now

00:08:27.599 so I can avoid those pitfalls and

00:08:29.520 try to either answer them in advance or

00:08:31.840 maybe just don’t even pitch that person

00:08:33.279 in the first place like I I like that

00:08:35.039 way of taking on these

00:08:38.919 roles how much are we spending

00:08:41.440 on AI internally is everyone just doing

00:08:43.760 like the $20 a month you know little

00:08:46.000 ChatGPT+ or are we getting some

00:08:48.320 corporate like what is the

00:08:49.920 situation right now that’s such an

00:08:51.760 interesting question Eric

00:08:53.920 um so currently there’s no

00:08:58.720 internal system in place where the

00:09:00.839 company is funding everyone to have an

00:09:05.200 AI assistant or buddy or however we want

00:09:07.519 to frame that however we are

00:09:10.320 currently actively looking for and

00:09:12.440 researching AI-specific tools

00:09:15.360 particularly for teams that are geared

00:09:18.000 for the PR space um so harder to find

00:09:21.680 but they are starting to trickle out um

00:09:24.480 and there are many benefits to those and

00:09:27.279 the goal I think for the agency I don’t

00:09:29.279 want to speak for people above me but

00:09:31.360 the goal is to hopefully find something

00:09:33.440 that works for everyone but allows us to

00:09:35.680 be collaborative with our AI use as a

00:09:38.160 team so that we’re not using our

00:09:39.680 individual ChatGPTs and Claudes in a way

00:09:43.440 that is siloed from everything else the

00:09:46.880 company is doing oh so I see so like

00:09:49.519 if the three of us were each

00:09:51.519 doing something it should at least know

00:09:53.120 about it like in some corporate

00:09:55.200 system where it’s aggregating all of our

00:09:57.200 individual work and that smartness

00:09:59.519 improves over time is that the

00:10:01.360 idea yes one thing we get asked about

00:10:04.320 quite frequently is the concept of a

00:10:06.080 prompt library you know ChatGPT or any

00:10:09.920 other system it only gives you output as

00:10:12.160 good as what you ask it and so

00:10:14.880 putting together just a really short

00:10:17.120 sentence that’s missing a lot of details

00:10:19.040 is not going to get you the output

00:10:20.880 you’re looking for because there is some

00:10:23.440 level of repetition to what we’re doing

00:10:25.360 on the day-to-day it would behoove us to

00:10:28.399 eventually have a prompt library if you

00:10:31.279 will where you know you could just kind

00:10:33.360 of swap in and out some details but

00:10:35.519 ultimately it gives you a really good

00:10:37.920 in-depth thoughtful response when it

00:10:39.680 comes to survey questions or pitch

00:10:42.640 writing or you know award research

00:10:45.279 things of that nature things that are

00:10:46.560 really duplicative for us

00:10:48.560 that’s a good point but then you know

00:10:49.839 it’s funny a lot of times you can use

00:10:51.360 the AI to give you the prompts right now

00:10:53.920 it’s like hey I don’t have prompts make

00:10:56.240 me a prompt library and then boom it

00:10:58.079 does that for you so yeah I love kind of

00:11:01.560 flipping

00:11:03.160 AI and so there have been times where I

00:11:06.160 put it into a position where it’s asking

00:11:08.000 me questions rather than me asking it

00:11:10.160 questions oh really so how do

00:11:12.000 you do that what are you saying to

00:11:13.600 set that up so I will similar to the

00:11:16.880 role playing concept if I have a piece

00:11:19.440 of um let’s say a press release or a

00:11:23.040 pitch I can put that into my ChatGPT and

00:11:26.320 say you know if I were doing a live

00:11:28.399 interview with such and such reporter

00:11:30.720 what are they going to be asking me and

00:11:32.560 then it’ll you know spit out a couple of

00:11:34.079 questions I would give it a response as

00:11:35.760 if I’m the key spokesperson and I

00:11:38.240 really just get a better sense of what’s

00:11:39.760 the forward thinking conversation at

00:11:41.920 play here that way we can get ahead of

00:11:43.839 those conversations and create talking

00:11:45.600 points create a briefing document that’s

00:11:47.680 more specific and catered to the you know

00:11:50.880 conversation that we’re going to have

00:11:52.240 things of that nature but it’s a

00:11:54.320 slight twist on AI that people don’t

00:11:56.399 necessarily think of I like that I like

00:11:59.120 that yeah make them the one asking the

00:12:00.959 questions as opposed to us

00:12:03.279 you mentioned different tools you

00:12:04.839 mentioned how some people they’ll get

00:12:07.200 stuck on one tool and they won’t try the

00:12:08.959 others what tools are the two of you

00:12:12.000 experimenting with right now do you

00:12:13.839 feel like you’ve tried all of them I

00:12:15.360 mean I guess no one has tried them all

00:12:16.720 there’s so many different niche ones but

00:12:18.079 but what are the tools that you feel

00:12:19.279 like you’ve spent the most time with

00:12:21.120 what are the ones that you feel like you

00:12:22.800 still need to get to and then and then

00:12:24.240 I’m curious how would you rank them

00:12:26.560 right like hey this was really good at

00:12:28.000 something but this was really bad I’m

00:12:29.920 curious your adventures in this space oh

00:12:32.320 Kyle I haven’t heard from him in a while

00:12:34.560 okay I will say I have two um I like

00:12:37.519 ChatGPT because it’s kind of where I

00:12:39.360 started it’s where a lot of people

00:12:40.560 started but also I’m a huge Perplexity

00:12:43.200 fan um so what I find myself doing I

00:12:45.920 think Perplexity is much better at data

00:12:48.320 and data points and stronger research

00:12:51.120 information so typically I will if I’m

00:12:54.600 researching start with Perplexity take

00:12:57.600 those data points or those links or

00:13:00.000 whatever it feeds me that I find to be

00:13:02.399 relevant and actually plug that back

00:13:05.279 into ChatGPT and say here’s the data

00:13:07.920 that I have or that I want to use please

00:13:11.360 like let’s make this more creative

00:13:13.519 depending on the task whatever it is the

00:13:15.519 prompt library but I use Perplexity for

00:13:18.000 data mining and then ChatGPT for more of

00:13:21.360 the creative flow if you will

00:13:26.399 that’s actually my exact answer really

00:13:29.120 and I think a lot of people you know

00:13:31.440 when we did kind of first iterations of

00:13:33.200 AI University throughout the agency many

00:13:35.440 did uncover that Perplexity was

00:13:38.800 significantly better with the research

00:13:40.760 function um people also liked the voice

00:13:44.000 function within Perplexity uh but

00:13:47.240 ChatGPT stole everyone’s heart it was

00:13:49.920 the the first one it was the app that

00:13:52.240 people downloaded it was the interface

00:13:54.079 people just got really used to so I find

00:13:57.120 it’s just kind of what people gravitate

00:13:59.279 towards you find yourself

00:14:01.440 pulling up ChatGPT more than

00:14:03.199 anything else um but I do think ChatGPT

00:14:06.079 is very creative

00:14:08.199 um really good with more of the one

00:14:11.120 pagers and the strategy documents and

00:14:13.600 also the more you use it the more it

00:14:15.600 understands you the more it understands

00:14:17.120 what you’re looking for and so you know

00:14:19.360 it’s a self-fulfilling prophecy you know

00:14:21.680 because I’m using ChatGPT more than

00:14:24.079 anything else it’s it’s over time

00:14:27.360 becoming the best version for me right

00:14:29.920 right what about anyone using Grok

00:14:33.680 haven’t touched it won’t touch it but

00:14:36.399 that’s a different conversation won’t

00:14:37.920 touch it I find you know for

00:14:40.000 certain research projects I find

00:14:41.920 its thinking can be better than ChatGPT

00:14:44.560 there were some scenarios because I

00:14:45.760 would do both um what about

00:14:49.199 Gemini from Google

00:14:52.240 I’ve dabbled in Gemini but I’ve never

00:14:55.440 loved it it was a short

00:14:58.880 fling I will agree with that but I

00:15:00.959 will say if you live in Google Drive

00:15:03.920 like we do they’re starting to really

00:15:06.480 implement Gemini in Google Drive so

00:15:09.440 depending on how that relationship

00:15:12.079 builds itself out with my day-to-day

00:15:14.880 activities and behavior I could see

00:15:16.399 myself using it more if through Google

00:15:19.600 Drive and Sheets and whatever else it

00:15:21.839 becomes more intuitive with the data

00:15:23.920 that we already are using then who knows

00:15:26.720 but currently it’s not my favorite

00:15:29.839 um it’s not as great I think as the

00:15:31.760 other ones but I find if you go into

00:15:33.680 NotebookLM Yeah and you can create a

00:15:36.480 podcast which I find fascinating

00:15:39.120 that’s crazy and you know you

00:15:40.959 can start to ask

00:15:43.279 questions of those podcast hosts they’ll

00:15:45.680 hear your question stop what they’re

00:15:47.360 talking about and then start interacting

00:15:48.639 with you so and I’m sure you know it’s

00:15:51.519 not exactly Gemini but it’ll all get

00:15:53.199 related at some point you know on some

00:15:54.800 back end it’s similar like you said

00:15:56.480 Google Drive Microsoft Copilot I find

00:15:58.800 that that starts to pop up everywhere if

00:16:00.560 I’m in an Edge browser I’m doing Office

00:16:02.560 that starts to pop up that can be

00:16:04.399 helpful sometimes I don’t know if you

00:16:06.160 two have used much of that

00:16:09.279 no no okay and then Claude I feel like a

00:16:12.720 lot of people love Claude a lot of

00:16:16.959 people love Claude that’s the tech

00:16:16.959 darling for sure um I personally don’t

00:16:19.839 know anyone at the agency who really

00:16:21.519 stuck with Claude in any way but I do

00:16:24.399 think Curtis is a huge Claude fan

00:16:28.800 yeah

00:16:30.240 but one thing I would like to point

00:16:32.000 out which I think is interesting and

00:16:34.160 Kyle can probably elaborate more but one

00:16:36.160 of the products we were test running to

00:16:37.759 me is kind of more so the future of AI

00:16:41.279 which is one system that gives you

00:16:43.440 access to all of these LLMs so you feed

00:16:46.720 in your one question and then you can

00:16:48.800 toggle between what the Claude response

00:16:50.720 is what the Perplexity response is what

00:16:52.880 the ChatGPT response is so that you have

00:16:55.839 more of an easy navigation to you know

00:16:58.399 all the variables that exist which I

00:17:01.199 think we’ll see a lot more often as new

00:17:03.920 products come out there’s a company I

00:17:05.520 just heard about yesterday it’s called

00:17:07.119 Profound and their idea is getting your

00:17:10.799 brand featured in the AI responses so

00:17:14.000 you know before it was like Google SEO

00:17:15.839 so now it’s like oh when someone asks a

00:17:17.760 question in ChatGPT or Perplexity let’s

00:17:20.240 make sure your brand is the one that

00:17:21.760 gets mentioned and I think that’s going

00:17:23.919 to be a fascinating world of well how do

00:17:26.559 they even do that how do you

00:17:28.960 infiltrate the back end to pop up you

00:17:31.200 know Courtney’s Bakery or whatever it is

00:17:34.320 well I was

00:17:35.559 thinking about it within

00:17:37.679 our own industry because we always talk

00:17:39.760 about data and we always really go after

00:17:42.000 the quirky data and I was curious one

00:17:44.160 day and I put in the prompt you know I

00:17:46.559 wanted to see if one of our clients

00:17:47.919 would pull as the response because we

00:17:50.240 had some of the questions that no one

00:17:52.000 else had and so I put in a specific

00:17:55.360 question and it did in fact pull our

00:17:57.360 previous client and the data that we had

00:17:59.760 created on their behalf so for us

00:18:02.480 specifically not on the marketing side

00:18:04.080 but the PR side it does infiltrate how

00:18:07.120 we should be thinking about you know the

00:18:09.039 messaging the talking points the data

00:18:10.880 that we’re creating because exactly to

00:18:13.520 your point if you do have someone with a

00:18:15.520 very specific question in an AI platform

00:18:18.240 and it pulls your client’s data that’s

00:18:20.640 gold in the future

00:18:23.120 yeah I was just going to say I was on my

00:18:24.799 first client call today actually and it

00:18:26.799 was the first time I heard that a new uh

00:18:29.280 survey was released and it pulled the

00:18:32.160 client survey data and it wasn’t like

00:18:34.160 they didn’t ask for it to pull that

00:18:35.520 specific data but it organically came up

00:18:38.320 and they were like that felt

00:18:39.840 game-changing because it felt like this

00:18:42.240 is where people are going to not search

00:18:45.039 but get organic information in a

00:18:47.840 conversational way which is very

00:18:50.160 different than Google SEO has ever been

00:18:52.400 in the past

00:18:54.640 break that down again so tell me

00:18:56.559 what was the difference just for

00:18:58.160 people who are listening who

00:18:59.440 want to clearly get it so what was new

00:19:01.360 yeah sure so I’m not exactly

00:19:04.160 sure what the client data was but

00:19:06.400 essentially um the client

00:19:09.840 was asking questions not related to what

00:19:12.799 they found in their own data but just an

00:19:14.880 industry type question and it started

00:19:17.760 feeding their latest report which I

00:19:20.000 think was less than 24 hours old um back

00:19:23.360 to them as conversational

00:19:25.799 information with cited sources and it

00:19:28.880 happens to be their own data um so when

00:19:32.559 you’re going in and you’re having these

00:19:34.080 conversations about how do I and this is

00:19:36.240 oversimplified but how do I bake a

00:19:38.400 brownie and it pulls something that’s

00:19:41.360 from you know a top chef for example

00:19:44.880 that’s great because that’s what people

00:19:46.240 are going to look for because they’re

00:19:47.440 not going to go in and Google that

00:19:48.880 they’re going to have a conversation

00:19:50.480 with ChatGPT first and say “What do I

00:19:53.120 need to do to make this happen” and

00:19:55.360 whether you know it or not you’re

00:19:56.720 getting data from somewhere like you’re

00:19:58.799 getting that information from somewhere

00:20:00.480 so I really think that’s going to be the

00:20:02.080 gold mine moving

00:20:04.120 forward I see I see yeah no I agree I’ve

00:20:07.200 been using it a lot more now for

00:20:08.960 questions I have like hey you know my

00:20:11.679 Sonos speakers at home they’re not

00:20:13.440 showing up on my app what do I do it’s

00:20:16.080 I’m not googling it I’m asking AI and

00:20:17.600 it’ll just say okay like go and do these

00:20:19.360 things like boom here’s your answer this

00:20:20.720 is how you’re going to fix it right or

00:20:22.240 just other things there was something

00:20:23.760 like okay there’s some tech question I

00:20:25.280 forget what it was like hey how do I fix

00:20:27.039 this other issue in my house and it’s

00:20:29.200 just like okay boom just go do these

00:20:30.799 things so I’m not googling anything

00:20:32.159 right so you do wonder how will

00:20:33.840 people make money in that situation hey

00:20:35.840 there’s no ad to click on what’s that

00:20:37.520 going to look like um I’m I’m curious

00:20:40.240 though outside of let’s say these large

00:20:41.760 language models a lot of the tech

00:20:43.280 stuff is anyone playing with video

00:20:46.480 content creation audio content creation

00:20:49.200 like you know editing tools a lot of

00:20:50.880 this stuff

00:20:51.919 A lot of our job is to make sure things

00:20:53.679 become good video good audio and at some

00:20:56.240 level a podcast like this AI could do

00:20:58.480 that podcast right AI could do a news

00:21:00.159 segment has anyone messed with any of

00:21:02.080 those kinds of things I would say it

00:21:04.960 doesn’t necessarily come internally but

00:21:06.960 we field these questions a lot with

00:21:08.799 clients since they’re the ones usually

00:21:10.880 delivering you know B-roll social media

00:21:13.520 content things of that nature there’s

00:21:15.760 questions about is this allowed um would

00:21:19.280 a reporter or a news anchor think less

00:21:23.440 of a potential segment with us if we’re

00:21:25.600 giving AI generated content questions

00:21:28.159 like that are bubbling up quite often

00:21:30.080 very specifically we used to work with

00:21:33.360 um a client and a video vendor on you

00:21:36.799 know standard video content

00:21:38.960 creation and quotes would be anywhere

00:21:40.880 between $5,000 and $10,000 uh and of course

00:21:44.159 one very smart CEO came around within

00:21:47.039 about 24 hours and said you know we got

00:21:50.320 off that brainstorm and I saw the price

00:21:52.480 quote and I decided to play with AI and

00:21:55.039 this was the end result and I think it

00:21:57.520 looks pretty good and honestly we agreed

00:22:01.280 so there is definitely a world where

00:22:03.520 that is happening you will have

00:22:05.600 reporters who do you know think less of

00:22:08.720 AI generated photos video content they

00:22:11.360 might go after another story because it

00:22:13.280 is you know more traditional assets but

00:22:17.039 with the cost savings that we’re seeing

00:22:19.200 clients are moving towards that level of

00:22:21.679 content creation more and more are

00:22:24.240 clients saying hey why do I even need a

00:22:25.840 PR agency I can just ask you know ChatGPT

00:22:28.480 for a PR strategy and reporters to

00:22:31.039 pitch and write the pitch for me right

00:22:33.120 like I’m sure people think that

00:22:36.720 people definitely think that but

00:22:38.280 surprisingly it hasn’t come up in any of

00:22:41.280 our conversations um I think people

00:22:44.320 still realize one you know AI just can’t

00:22:46.799 be trusted yet with the outputs uh I’ve

00:22:49.679 seen a lot of errors and and things

00:22:51.760 going wrong within searches um we’ve

00:22:54.880 also seen a lot of horror stories hit

00:22:56.960 media and LinkedIn about you know

00:22:59.440 pitches gone wrong they strike the wrong

00:23:02.080 tone they’re really aggressive a whole

00:23:04.559 agency gets blacklisted because of

00:23:07.120 a pitch that went south um so I think

00:23:10.400 even clients they still see that right

00:23:13.360 now AI is not ready um and of course if

00:23:16.400 you ask me obviously I have skin in the

00:23:18.240 game but reporters are human and they’re

00:23:20.960 always going to want to deal with humans

00:23:22.720 on the other side as well can we

00:23:24.640 optimize with AI to make it a lot more

00:23:26.559 efficient definitely I want to word this

00:23:29.200 very carefully all right let’s

00:23:31.360 give you time let’s make sure

00:23:32.720 you got your careful words here I want

00:23:34.159 to word this very carefully but I will

00:23:35.679 say I think it’s no secret to all of us

00:23:38.960 that some people just don’t fully

00:23:41.600 understand PR even if they are engaged

00:23:43.679 in a PR contract they’re not really sure

00:23:45.760 what we do or how we do it or why it’s

00:23:48.159 important and there are other clients on

00:23:50.880 the flip side that fully understand the

00:23:53.039 power of PR and what we do and I think

00:23:56.559 something that cannot be underscored enough and

00:23:59.039 this is I don’t remember where I

00:24:00.960 think maybe it was the New York Times

00:24:02.720 but it was a podcast I was listening to

00:24:04.400 where it said because information is so

00:24:07.840 readily available now the importance of

00:24:11.760 relationships and true connection based

00:24:15.200 on recommendation or word of mouth is

00:24:19.279 actually going to become full

00:24:21.039 circle and become more powerful because

00:24:25.240 everyone you know broadly speaking has

00:24:28.720 the same power at their fingertips if

00:24:31.039 you know how to use it but the

00:24:32.960 relationships the one-on-one

00:24:34.960 connections the power of the years

00:24:38.720 of experience that we have collectively

00:24:41.120 and how to get certain things placed or

00:24:44.080 pitched or what have you that’s

00:24:47.039 never going to be an output that AI can

00:24:50.320 comprehend or deliver to you because

00:24:54.320 that personalized touch will never be

00:24:56.880 there so I think it’s understanding how

00:25:00.559 PR works but also understanding there

00:25:03.120 are just certain things that AI will not

00:25:05.279 be able to deliver that people inside an

00:25:08.159 agency will I like that yeah it’s almost

00:25:11.120 like there’s so much information that

00:25:13.279 having information doesn’t matter

00:25:14.720 anymore it’s about the human

00:25:17.840 touch or like you know like they can’t

00:25:19.360 replicate your connections your network

00:25:21.679 your relationships that kind of thing

00:25:24.960 not yet not yet it’s been fascinating to

00:25:27.200 watch how more

00:25:28.880 companies are taking on initiatives

00:25:30.400 like what you two are doing there was a

00:25:32.320 friend of mine who works at another agency

00:25:34.400 and they got the sense from

00:25:36.320 management there basically that we’re

00:25:38.240 not hiring any entry-level people

00:25:39.600 anymore like AI is the entry level

00:25:41.360 person so we’re not going to hire

00:25:42.960 someone so that they can learn when all

00:25:45.039 they’re going to do is the stuff that AI

00:25:46.480 can do so it’s this weird thing now well

00:25:47.840 how are you going to get experience if you

00:25:49.679 can’t get started

00:25:51.600 you know they only want veteran people

00:25:52.960 now because veteran people have

00:25:54.320 relationships have a network have

00:25:55.760 connections have all that but just to

00:25:57.440 come in and say I’ll help you with

00:25:58.720 information that’s not a job at least

00:26:00.720 right now

00:26:02.799 there’s also a lot of conversations

00:26:04.159 right now about the death of middle

00:26:05.840 management yeah and I asked the same

00:26:07.600 question I said okay if you look at the

00:26:09.120 corporate ladder and you remove middle

00:26:11.080 management how do we get people to be

00:26:14.320 above middle management and and no one

00:26:16.559 has an answer I’ve asked many very

00:26:18.559 intelligent thoughtful people

00:26:21.279 and no one’s been able to give me

00:26:22.960 something smart as a response well when

00:26:25.440 you find out come back we’ll do the

00:26:27.600 podcast again like Courtney found out

00:26:29.279 the information she figured out how

00:26:30.880 we’re going to get up the career ladder

00:26:32.240 here by jumping from entry level to CEO

00:26:35.279 I don’t do middle management I just

00:26:36.720 jumped ahead yeah apparently although

00:26:39.279 the data is currently showing Gen Z

00:26:41.279 doesn’t want leadership or C-suite at

00:26:43.520 all so that’s another conversation yeah

00:26:46.320 I was going to say as a middle manager

00:26:48.240 like I would love to have that

00:26:49.360 conversation let’s figure it out yeah

00:26:51.799 exactly this is great Kyle, Courtney

00:26:54.640 awesome awesome time thank you so much

00:26:56.480 for sharing your experiences I need to

00:26:59.279 do more with Perplexity that’s the one

00:27:00.880 actually I haven’t done enough with

00:27:02.240 so it’s interesting to hear what you

00:27:03.200 guys are saying about it so like okay

00:27:04.480 that’s my that’s my to-do list here when

00:27:06.320 we’re done definitely cross reference

00:27:08.720 and let us know what you define as your

00:27:11.360 favorite after you experiment for sure

00:27:13.600 for sure awesome thanks so much Kyle and Courtney

00:27:15.840 thank you

00:27:17.840 thank you to my guest and thanks for

00:27:19.520 listening subscribe to get the latest

00:27:21.360 episodes each week and we’ll see you

00:27:23.200 next time
