Episode 113
June 21, 2019

Cheap Fakes

Phillip and Brian are back to talking about deep fakes (and cheap fakes), Instagram may be having a massive influence on purchasing decisions, and is the term direct-to-consumer no longer relevant? Listen now!



Show Notes:

Main Takeaways:

  • There's deep fakes, cheap fakes, and shallow fakes, and it's all getting a little ridiculous.
  • Should Anna Delvey (aka Anna Sorokin) become the next CMO of Victoria's Secret?
  • Everyone should probably start sending Brian hate mail.
  • The Instagram aesthetic is no longer a thing; brands should get on board.
  • Are we going back to advertising on billboards?

Cheap Fakes And Deep Fakes And Shallow Fakes Oh My:

We Don't Need Deep-Fakes: We're Being Deep Faked All The Time:

  • But if the point of deep fakes is to convince a viewer, do we need them at all? Especially because, as Phillip points out, we're being convinced all day, every day on social media platforms.
  • Phillip blames Instagram (again) for turning him into a sneakerhead.
  • Brian says that there is going to be a place for deep fakes in advertising that is going to allow brands to leverage different personalities and people at a much larger scale than they do now.
  • We're seeing similar techniques applied to body data, where services like Allure Systems can use models' body data at scale to create diverse catalogs.

Spoiler Commerce, Clever Marketing, and Bigger Better Brands:

Digitally Native or Direct-To-Consumer: Is Anything Real Anymore?

We love hearing from our listeners. So, do you think we're at the pinnacle of a new way of advertising? Is the Instagram aesthetic over, and should brands get on board with that?

Have any questions or comments about the show? Let us know, or reach out to us on Twitter, Facebook, Instagram, or LinkedIn. We love hearing from our listeners!


Phillip: [00:00:00] Hello and welcome to Future Commerce, the podcast... I don't even, the cutting edge and next generation cheap fakes and deep fakes and shallow fakes, and oh my gosh we have to get into something. I'm Phillip.

Brian: [00:00:11] I'm Brian.

Phillip: [00:00:12] Oh my gosh. Okay. We were arguing and I said let's just record OK. I have to start just right off the bat with... OK, does, is this fooling anybody?

Phillip: [00:00:22] And I think that... Well I'll start again, I'll start again. Here you go. Listen.

Recording: [00:00:27] I am Donald Trump and I think that my digital voice is quite impressive. I don't know exactly how they made it but I'm really impressed.

Phillip: [00:00:35] If anyone is faked out by that being a real Donald Trump then I feel sorry for them.

Brian: [00:00:41] I think that there are people that will be fooled by that. There's a whole generation of people out there that have dealt with bad quality audio. It's like... Let's say you put that on the radio, and someone's listening on AM radio. It's going to be challenging for somebody to discern the difference.

Phillip: [00:01:02] So there's a fidelity thing. Okay. Hold on. Here's someone you might recognize, too, one second.

Recording: [00:01:07] I'm Barack Obama and I think that my visual voice is quite impressive. I don't know exactly how they made it but I'm really impressed.

Phillip: [00:01:14] So Barack... he calls himself Barak, Barak Obama. OK.

Phillip: [00:01:22] So anyway, this is a service called Lyrebird. L Y R E B I R D dot AI, Lyrebird. And so today we're harping on... I still want to stand on our pedal, pedestal... Not a pedal stool... I wanna stand on our pedestal...

Phillip: [00:01:38] ...about how smart we are, that we've been talking about deep fakes for a year and a half before anyone else was talking about it in our context. But there are so many stories out this week in particular about different types of fakes. You had one in the news I think you wanted to bring up...

Brian: [00:01:58] Oh Pelosi, drunk Pelosi.

Phillip: [00:02:00] Yeah, the Nancy Pelosi that.. trust me... Pelosi, yeah.

Brian: [00:02:05] No I mean lots of interesting faking going on right now. And you know a lot of media attention directed at it. Obviously it affects media a lot, so they're gonna talk about it a lot.

Phillip: [00:02:21] If it bleeds, it leads... this disinformation fits right into a whole media narrative right now.

Brian: [00:02:28] Yes right. Right about so... Yeah.

Phillip: [00:02:31] Fake news, and you know fear fear fear, AI is going to destroy the world...

Phillip: [00:02:37] I cannot believe that this fakes anyone out, like Drunk Pelosi is being shared as a meme. And it's what, what do you call it? It's not a deep fake. It was something else...

Brian: [00:02:49] A cheap fake, shallow fake. Oh you know there's...

Phillip: [00:02:53] Shallow, I think shallow fakes are something else, but like fakes, like fakes.

Phillip: [00:02:59] There is an interesting... It's funny, because this week is just laden with this kind of news. There was also a Mark Zuckerberg... what they were calling a deep fake. It was actually done with real AI, not clever editing. The source footage was from a CNN interview about a year and a half ago, and they make him basically say a bunch of dystopic things about how he realized that owning people's data was so important because of all the work he did with the U.S. government with PRISM.

Phillip: [00:03:37] And it's the worst voice actor pretending to be Mark Zuckerberg that I've ever heard. It would have been better if they used that AI voice generation.

Brian: [00:03:47] Nice.

Phillip: [00:03:49] I might have actually been more convinced. It's so bad. I just... at some point, is there even a point in worrying about whether people can be faked out by this sort of stuff? Because the conversation around the fear of a future where we could be faked out is enough to stoke fear and uncertainty.

Brian: [00:04:15] Well, are we putting legitimate videos at risk as well, like legitimate media at risk? Everything's a swirl of confusion right now.

Brian: [00:04:27] Anyway, when it comes to commerce and the application here, I mean, we've kind of covered this quite a bit recently, but I just think the point we're making is that it's getting easier and cheaper to incorporate this stuff into media, so creating content en masse is going to be a lot easier coming up here.

Phillip: [00:04:54] I think so. I think so. And when you see what is possible with deep fakes, it makes me wonder about this sort of posthumous celebrity image licensing, like the kind that Elvis Presley's and Marilyn Monroe's estates have around those likenesses, endorsing products that didn't exist...

Brian: [00:05:20] Right.

Phillip: [00:05:21] ...you know, when they were around. I mean, Chanel is one thing, but I saw Elvis Presley being used in American Crew ads now.

Brian: [00:05:30] This is what I'm talking about, man. This is body data. Actually, can we just jump right in? Has anyone watched the new Black Mirror episode on Netflix with Miley Cyrus yet?

Phillip: [00:05:44] I haven't seen it. Give me a synopsis.

Brian: [00:05:47] I forget the name offhand, but effectively there's this pop star, and her manager is really manipulative and controlling... So what they do is they scan her brain and put it into a robot, and they also capture her body data, and then they basically knock her out, put her into an unconscious state, and go use her body data in various ways.

Brian: [00:06:20] Aka, like, you know, put anything out there: concerts in real time for everyone at any scale, at any size, whether it's a hologram or on a screen or whatever it is... Spoiler alert, by the way.

Phillip: [00:06:38] Thank you. Thank you for ruining the whole show. I'll have to say insert spoiler alert to Chris here... Insert spoiler alert.

Brian: [00:06:47] That was a pretty big spoiler actually but yeah.

Phillip: [00:06:51] But here's the here's the problem right... Is that... Do you think... Do you really believe that that's the eventuality? Like, that's where we're heading?

Brian: [00:07:01] I don't know about scanning the brain and putting it in a robot. That's a long way away. That part was ridiculous. The part about leveraging people's likenesses for all kinds of things... I've been harping on that since, you know, our Episode 8.

Phillip: [00:07:20] Right. Well, yeah. Which actually has not aged so well if you go back and listen to it. We need to find some other tentpole episode for us to reference. But if you want to go in that direction... I don't know that we need AI and deep fakes to...

Brian: [00:07:40] No, we don't.

Phillip: [00:07:42] What would the outcome be? Is it to be more convinced, to convince people more deeply of things? And I think the thing the consumer doesn't understand is that they're being convinced every day without seeing the likeness or hearing the voice of someone convincing them. You're being convinced in other ways.

Phillip: [00:08:03] For instance, a key metric that Instagram uses to understand your likes and dislikes and to tailor your news feed is the backtrack: the gesture where, as you're scrolling through the feed, you stop and go back to see what you just passed. That is a very important metric, because the backtrack tells them that something caught your eye. And they have all sorts of machine vision to understand the context of the image. It doesn't need to be hashtagged "bicycle" for them to understand that there's a bicycle in there, and it doesn't need to be hashtagged "hot girl" to know that that was a hot girl riding a bicycle. If that's something that caught your eye, they know contextually that they can feed more of it to you, and they can reinforce that over and over and over. It's the reason why I believe I am into sneakers. It's not because I've grown up my whole life loving sneakers. That happened when I was 35. It's because Instagram kept shoving it down my throat.

Brian: [00:09:07] Whoa!

Phillip: [00:09:07] Maybe I'm happy being a sneakerhead, but it's the same reason I bought the denim jacket I was joking about two years ago. It's the reason I buy anything nowadays. You don't need deep fakes. We're being deep faked all the time.

Brian: [00:09:22] I think deep fakes to...

Phillip: [00:09:23] So, that is Future Commerce. Like it doesn't mean that it has to like, take a person's likeness.

Brian: [00:09:29] It doesn't, but I think with taking a person's likeness there's a 100 percent efficiency thing. Also, as we get into this... actually, I know for a fact it's already happening. People are...

Brian: [00:09:43] It's not deep fakes, but body data is already being used. People's likenesses are already being used en masse to create product imagery. Right?

Phillip: [00:09:54] So yeah, we know. Yeah. In fact, we had one of those services on the show... Yannis Constantinidas from SuperPersonal, right? They are launching a commerce-focused product that allows people to place themselves into websites as the model.

Brian: [00:10:09] That's different than what I'm talking about. This is using models' data. This is Allure Systems. We had them on the show as well. Using models' data en masse, it's an efficiency thing. Anyway, I think we've covered this at length. It's already happening. It's going to continue.

Phillip: [00:10:25] Let's keep going. Yeah I have more. Let's keep going.

Phillip: [00:10:28] So a trending video on YouTube this past week, or the last couple of weeks, is from a YouTube channel that I guess has been around for a while and has over a million subscribers.

Phillip: [00:10:43] So they were fairly popular already, but it's a YouTube channel called Corridor Crew, which is a team of visual effects artists and filmmakers. They started doing reaction videos to bad visual effects, talking about what makes good visual effects versus bad visual effects.

Phillip: [00:11:00] And they did a whole episode last week on digital face recreation, with really good examples of it and really bad examples. A really good example is Paul Walker in Fast and Furious, when they bring him back from the dead: they basically recreate his entire head and superimpose it over his brother's body. They give you the reason why the one shot where he pulls up next to Vin Diesel has been criticized as bad VFX, and then they show you all the other examples in the same movie where you didn't even know visual effects were being used. What's really interesting is that they compared deep fake versions of face replacement to actual VFX artists' creations of face replacement, and someone has created a deep fake of Carrie Fisher replacing the end of the Star Wars movie. You know, when she turns around. And what's the bridge movie for A New Hope? It's the heist movie... What's it called?

Brian: [00:12:29] Oh gosh.

Phillip: [00:12:30] Everyone's yelling at the radio right now... the film before A New Hope. I'm having to Google for it. It's terrible.

Brian: [00:12:39] No, it's after New Hope. It's between A New...

Phillip: [00:12:42] No, it's before A New Hope.

Brian: [00:12:44] No it's not. It's between that and...

Phillip: [00:12:46] The one with Jyn, the one with Jyn Erso... What's the name of it... I know the name of it... Oh, Rogue One.

Phillip: [00:12:53] God. I need my brain on a Friday. My gosh. So at the end of Rogue One, spoiler alert. Oh, I've gotta put in, "Chris, put in spoiler alert."

Brian: [00:13:11] This is the Spoiler Commerce.

Phillip: [00:13:12] So yeah. If you haven't seen the movie, sorry: the very last shot in Rogue One is a digital recreation of Carrie Fisher as Princess Leia turning around. Someone says, "Oh, what was the thing we just received?" Because they had to get those plans for the Death Star off the planet, and they send them at the last second to a Rebel ship, which turns out to be Princess Leia's ship. And they say, "What did we get?" And she says, "Hope." Right? But it's a dead-eyed, scary-looking Carrie Fisher. It's not the live Carrie Fisher. It's not even live Carrie Fisher de-aged, like they did to Samuel L. Jackson in the Marvel movies. It's just a dead, lifeless, scary-looking version of Carrie Fisher. And when the Corridor Crew show you the deep fake version of that, modeled on Carrie Fisher's performance throughout the original trilogy, it's terrifyingly good.

Phillip: [00:14:15] So we are at that place where people with just modern iMacs can generate better visual effects with deep fake apps and deep fake technology than visual effects artists with millions and millions of dollars in budget can do with, you know, recreated puppetry.

Phillip: [00:14:34] So that's where we are. Right? What I'm trying to get to, the point we're belaboring, and why we keep coming back to this topic, is that I don't know that we need deep fakes to be tricking and coercing people into doing things. I believe people are being tricked and coerced every day by social media, and we are relying on social media more and more as part of our commerce engagement.

Phillip: [00:15:11] And in fact, people are being tricked every day into believing brands are bigger than they are, that they're more ubiquitous than they are, based on social media advertising alone. And social proof is very, very powerful.

Brian: [00:15:24] Yeah. Yeah. So I hear everything you're saying, and I'm going to say that I think there is going to be a place for deep fakes in advertising that's going to allow people to leverage different personalities and people at a much bigger scale than they do now. And it's going to be a regular part of our advertising world.

Brian: [00:15:47] Almost. I can almost put the Brian Lange stamp of guarantee on that.

Phillip: [00:15:54] Take that to the bank, ladies and gentlemen.

Brian: [00:15:58] Deep fakes will be a part of advertising and marketing strategies.

Phillip: [00:16:03] Right. I can see that.

Brian: [00:16:06] Here's a question for our audience, and this is something I'm gonna go here with, and hopefully we get a little feedback on this one. I'm curious if our audience would be interested if we worked with Lyrebird to do something with our voices, and maybe create some content there. So give us a ring, shoot us a message, give us a little feedback here. Just email Brian, Brian with an "i", at Future Commerce...

Brian: [00:16:39] Do you want to hear deep fake voice content, voice cloning content from us, with our voices? You know, where it's not actually us talking; it's our voices being used. I'd like it.

Phillip: [00:17:00] Yeah, I'm gonna say no. But OK, I want to hear what people think too. You know, I want to harp back on this topic because I'm...

Phillip: [00:17:14] Social media is is this incredible tool that is connecting people, but at the same time when you look at... Did you watch the Fire Festival documentary?

Brian: [00:17:27] Not yet.

Phillip: [00:17:28] I'm talking about this a lot recently. But social media can trick people into believing that things exist that don't exist, right?

Brian: [00:17:38] Can meaning does... Like it does trick people...

Phillip: [00:17:44] It does. It has already. Yes.

Phillip: [00:17:45] You know, there's that... What was the story about the socialite in New York City who tricked everyone into believing she was some heiress to some fortune?

Brian: [00:17:55] Oh yeah, I saw that story.

Phillip: [00:17:56] Yes. Sorry, sorry. Anna Sorokin. Yeah. Fake Manhattan socialite. In the Washington Post: Anna Delvey, the fake socialite who scammed New York City restaurants. And yeah, it's just clever marketing, right?

Phillip: [00:18:15] This is incredible to me, by the way... I mean, it's one thing, I guess, if you're scamming to get free stuff, because you're assuming a false identity; that's patently wrong. But when big brands put out a press release that they're doing something they're not actually doing, or a software company sells you the vision of software they haven't actually created yet but pretends it's true... it's all the same thing.

Phillip: [00:18:45] Right?

Phillip: [00:18:46] Like, it's building a bigger persona than you actually have and she was a clever marketer.

Brian: [00:18:50] You're putting brands on trial? Or are you saying she should get off scot-free?

Phillip: [00:18:57] I'm putting brands on trial and I'm saying that someone needs to give that girl a freakin job. Just like Kevin Mitnick in the 80s.

Phillip: [00:19:04] No, Kevin Mitnick went to jail in the 80s for phone phreaking, for hacking, social engineering, and lock picking. Right? And now he's a world-renowned security expert. This woman needs to be the CMO of Victoria's Secret, because she's figured something out.

Phillip: [00:19:21] She needs to turn it...

Phillip: [00:19:22] She needs to turn a brand around and trick everybody into believing that they're socially conscious because they're not. Victoria's Secret is not a woke brand. And I'm going to keep standing on that soapbox until they yell at us, very publicly.

Phillip: [00:19:36] But do you see what I'm saying? Like, she's a genius.

Brian: [00:19:40] Yeah, yeah, I hear exactly what you're saying. I think there's a whole set of influencers out there that maybe didn't... they didn't Bill Belichick it, right? They didn't skirt the very edges of...

Phillip: [00:19:55] They didn't pay enough people off. She didn't pay enough people off is what you're trying to say.

Brian: [00:20:00] But effectively there's a whole set of influencers out there that are doing this within their bounds, legally. But they've created brands, and it's all self-made. And a lot of them have become CMOs.

Phillip: [00:20:16] We had talked about truth in advertising and digital photo retouching. CVS has that campaign to give awareness when a photo has not been retouched; they mark it with a watermark in some way, so you know there's a truth-in-advertising campaign there. And I see more and more... I'm shocked... by the way, this is a whole other thing. Have you been to Target recently? Go to Target and look around. I want you to look at the pictures that are everywhere in Target now. You will see more diverse models, and not just in skin color and ethnicity, but also in body shape and body size. They look like real humans who aren't perfect models. And I think the imperfection is what makes you kind of human, right? This element of perfection is not a standard you have to try to live up to anymore. We're kind of evolving that in our advertising. I find it really interesting that the very opposite happens on social media with people, that they try to portray perfection. And that's kind of why you have direct-to-consumer brands, startups, who have to portray having it all together and being perfect. And I don't know... they're trying to connect the neurons, and it's not happening.

Brian: [00:21:52] And actually, there was an article in The Atlantic. I might have already mentioned this article, but it's about how the Instagram aesthetic is over: everything perfect and candy colors forever.

Brian: [00:22:06] It was this whole trend. Instagram became what it was through a very specific influencer aesthetic, and that aesthetic is in the midst of transformation.

Phillip: [00:22:18] So yeah, there's a backlash, is what you're saying.

Brian: [00:22:19] Yeah, the backlash by influencers, and also by the people looking at it. People are kind of sick of that entire aesthetic.

Phillip: [00:22:33] But are people sick of that from brands? Because I think they've come to expect it from a brand. And in fact, there's a great article on Modern Retail by Hilary Milnes who...

Phillip: [00:22:48] She goes into a whole... So the article is called "The Term DTC Is a Misnomer: Brands recalibrate strategies as direct businesses become more complex." There's this really interesting story about, you know...

Phillip: [00:23:05] Basically, when Harry's was acquired by Edgewell, it turned out that 80 percent of their brand sales were offline.

Phillip: [00:23:15] Dollar Shave Club is owned by Unilever. Casper sells in Target now. Direct to consumer isn't truly direct to consumer.

Phillip: [00:23:25] It's a bit of a misnomer. And so what she proposes is that you reorient that as just a digital brand. Because they're not really truly direct to consumer, and they're not digitally native online brands either. To get actual customer traction, you still have to be omnichannel.

Brian: [00:23:44] Right. And there was a good article recently about Outdoor Voices and how they're... Yeah.

Phillip: [00:23:51] Yeah, I saw that too. Also on Modern Retail, who are killing it right now. Call us. We'd love to make a podcast for you guys. So there's a really interesting take in this article on a lot of the things we've been talking about: influencer culture, organic brand building, and earned media from unpaid influencer promotion. That was really what drove social commerce for a number of years.

Phillip: [00:24:25] But now that all the major platforms have ad services, that's now being crowded out.

Phillip: [00:24:33] And what used to be really organic, really strategic brand-building campaigns that were influencer driven are now social media budget driven. People are spending hundreds of millions, or billions, of dollars on social advertising, and there are so many people competing for that same ad space. There's no true organic advertising anymore. It's getting more and more expensive. It's like an arms race, and eventually all this DTC brand building will be looked at as a nostalgic era of a bunch of brands that were able to get in before the land grab. Now that those days are over, the only ads you're ever gonna see on Facebook from now on are from brands that are already established. It'll be Coke and Pepsi and Ford and GM, because that's what all advertising eventually hurtles towards.

Phillip: [00:25:36] So I find that sort of... And in that case, maybe the dystopic future I was talking about, about being tricked every day, is coming to an end, and we'll have to be tricked by other things. Maybe it's going back to outdoor advertising.

Brian: [00:25:56] So we're going back to billboards.

Phillip: [00:25:59] We're going backwards now.

Brian: [00:26:01] I mean, I think you're right. The main brands have already kind of taken over Facebook; it's happened in many ways. But I think there are new verticals, new ways of looking at things, and new ways of advertising. Plenty of new spaces. We just spent our entire last episode talking about that. Plenty of new spaces to expand into.

Brian: [00:26:25] And also, we've got Gen Z coming up. I mean, we've been talking about millennials for so long. Gen Z is an entirely different beast. The way they think, communicate, and purchase is just not the same as millennials.

Brian: [00:26:45] And actually yes you're writing a book...

Phillip: [00:26:47] I have a whole book about that, actually.

Brian: [00:26:48] Exactly. So there are going to be new ways to get after these customers. Getting out in front of that is going to be really important, and building up that sort of collateral with Gen Z now is going to be important as they get more mature and have more money to spend.

Brian: [00:27:09] Yeah, but anyway, we didn't cover half the things we wanted to cover today.

Phillip: [00:27:16] I know, but I do think these topics are top of mind not just for consumers but for retailers, because everybody's looking for some sort of understanding of how the consumer reacts to things like this. We're gonna see more of this technology. If you want a bold prediction: in the next two years we're gonna see content marketing platforms evolve toward more psychographic-based customer segmentation to do direct selling. Truly one to one, with a lot of generation of those ads.

Phillip: [00:28:06] And more and more... it's not just how Wells Fargo sends me targeted display ads that show a family with kids, where someone else sees a single person with their dog. I'm not talking about that.

Phillip: [00:28:23] I'm talking about ads where it could literally be me in that ad, me enjoying a happy life with my... I really think that's where marketing platforms will go. And that might be a powerful selling tool to a retailer, but it might also be the kind of thing that the collective masses are going to rebel against. So that's just a hot topic.

Brian: [00:28:52] That is interesting. I agree with you on the one-to-one bit. And I'm also in the midst of writing an article on what peak clienteling looks like. So there's a little preview there. Phillip and I are both writing, which is good.

Phillip: [00:29:09] Oh yeah. Well yeah that's Yeah.

Phillip: [00:29:12] There's some really cool stuff coming. Catch us at IRCE. We'll give you a preview. We'll be there... I guess when you listen to this, we'll be there next week. So that should be really, really interesting.

Phillip: [00:29:27] Got a lot of really interesting content coming up soon, too. So stick around for everything, and make sure you never miss anything by subscribing at Future Commerce FM. Subscribe to the podcast wherever podcasts are found.

Phillip: [00:29:37] Google Podcasts, Apple Podcasts, Google Play... and, well, Spotify these days. Anyway.

Phillip: [00:29:47] Retail tech moves fast,

Brian: [00:29:49] But Future Commerce is moving faster.

Phillip: [00:29:51] Thanks Brian.

