Episode 290
February 17, 2023

Disruptors and the Defenders of the Status Quo

Revisit with us this incredibly deep discussion from December 2016 with Brian Roemmele about how voice interaction (and other tech) will revolutionize almost everything, and how, at the end of the day, we have to fight to keep humanity in the equation. Learn, develop, grow, change, and strive to understand the people around you and behind all of this technology. Listen now to hear more!

<iframe height="52px" width="100%" frameborder="no" scrolling="no" seamless src="https://player.simplecast.com/986c0760-6d7f-4cc8-b3f8-cf97be99cf59?dark=false"></iframe>


Time For More

  • {00:08:27} The guys discuss with Roemmele how many buttons in the world (crosswalk buttons, elevator open/close buttons, and more) aren't even connected. We just need that action to feel like we are controlling something.
  • {00:11:54} There are seasons of change as new technology is adopted. And that’s a good thing.
  • {00:18:17} Why these voice assistant devices have women's voices and names, and what the research has found about that
  • {00:33:36} “We are always trying to look for something in a voice that reassures us” - Roemmele
  • {00:38:58} “In ten years, fifty percent of your interaction with any computers is going to be via voice, of course voice assistant AI.” - Roemmele
  • {01:08:29} “Voice is a patina that is going to cue up this new revolution.” - Roemmele
  • {01:12:34} We are at the precipice of what this all comes down to, which is privacy and also persona
  • {01:24:07} How Brian Roemmele taught Alexa and Siri to talk to each other as he experimented and researched on his own
  • {01:41:45} Some common concerns about AI tech such as self-driving cars and how to get to the point where our humanity is still always a part of decision making

Associated Links:

Have any questions or comments about the show? Let us know on Futurecommerce.com, or reach out to us on Twitter, Facebook, Instagram, or LinkedIn. We love hearing from our listeners!

Phillip: Hello and welcome to Future Commerce, the podcast about cutting edge and next generation commerce. I'm Phillip, and we have a special guest for the next two episodes, someone that, honestly, we didn't even know what to do with this episode. Mr. Brian Roemmele, who is a prolific thinker on payments and sort of cutting edge and next generation payments and technologies and voice-activated and AI technologies. Prolific blogger. Writes everywhere. His columns are syndicated all over amazing journals. He's a wonderful mind. And so we had him on the show a few weeks back to come and share what he thinks is the roadmap for 2017 and beyond for the future of commerce. And I'll tell you what, you get Brian Roemmele headed in a direction and you can't stop him. And so we have over two and a half hours of content with him, and we weren't really sure how to even publish this. So we're going to do this in two parts. So what follows is Brian Roemmele on Future Commerce.

Brian L: Welcome to Future Commerce, the podcast about cutting edge and next generation commerce. I'm Brian.

Phillip: I'm Phillip.

Brian L: And today we have one of the most interesting guests, I think...

Phillip: Well, that we'll ever have.

Brian L: Yeah, I mean, this is...

Phillip: We can say that definitively.

Brian L: Yeah. I would agree with that. This guy is wicked smart.

Phillip: {laughter} Wicked smart {with Boston accent}

Brian L: He's got some crazy stories that we've already heard and I'm sure we'll get them in there somehow. Really cool stuff going on. So Brian Roemmele. He's from the podcast Around the Coin. He's also got other companies he's involved in. And I'm really excited to talk with him. And so I'm sure we'll get a more formal introduction here in a second. But this is awesome.

Phillip: Yeah. The thing that I'm most excited about is I love when we have a... I love when we have a guest who is so passionate about what they're talking about that you sort of just set them in a direction and let them go. And I felt that way with all of our guests recently. But I don't think anyone's going to top Brian in enthusiasm.

Brian L: I think that all of our guests would probably agree with that, actually.

Phillip: {laughter}

Brian L: You definitely are going to want to subscribe. As you listen to this show, you'll see like this is some really fun stuff.

Phillip: Ok, so we started the show while you were gone. Yeah, we just started. So you're here. Welcome.

Brian R: Ok.

Brian L: All right.

Phillip: Yeah. So, Brian, actually, could you introduce yourself for the audience at home that may not know who you are and what you do and what you're about?

Brian R: Well, first, gentlemen, pleasure to be here. Loved our pre show. Brian and Phillip are amazing individuals, so this is going to be a fun time. I'm Brian Roemmele. I'm kind of interested in a lot of things. I've spent a lot of my life inside of payments. I see payments as a lot more than just a transaction. I see it from a philosophical level. Really involved in technology. The convergence of what I really believe is the next levels of technology. And currently I really feel that speech, voice interaction with computers, is one of those things. So it's kind of my interest right at the moment. That and just the startup environment, the incredible amount of creativity and vitality that's coming up inside the startup environment. It's really an historical time, in my view.

Phillip: I was familiar with your name from the payments, you know, the payments area. I sort of consider you to be a luminary in that space. I know that there's a lot of interesting work you've done in that space. Can you sort of tout some of the stuff, besides being like, you know, a god on Quora. What else?

Brian R: Oh my gosh.

Phillip: {laughter} What else do you have? What might people know you from?

Brian R: Well, you know, I'm infamous, I guess. I don't know. I've only... Listen, my social media for most of my existence, beyond MySpace, which was more from the musical aspect of my life, you know, and maybe even mellow.com. I'll date everybody. You know, more recently, Quora. I found Quora to be incredible. If we look at history, it's the Library of Alexandria, hopefully never to be burnt down, of our epoch. And what the Library of Alexandria was is, you know, when it burnt down, the Dark Ages began. So we can pretty much look at about 2016 years ago, we had accumulated the entire knowledge set of society, all the histories, all of the tragedies, all of the things that the Greek culture later built itself on. Or sorry, the Roman culture later built itself on, and the Greek culture sort of started to realize that there was a lot more going on with humanity around the world than anybody could imagine. And we created a cultural lobotomy. That was a time in history when human beings said, you know, "What's this brain thing doing here? Let me pop this out." And we lobotomized humanity for two thousand... Well, I don't want to say two thousand years. I would say it wasn't until the Enlightenment and the Italian Renaissance that we finally got creativity and humanity back up. When you have that much knowledge in one place, and you're tasked with trying to synthesize that knowledge and original creative thought... There are two processes. There is creative thought, and there is a synthesis of creative thought. And one would argue that all creative thought is a form of synthesis because you can't separate the fish from the water. Right?

Phillip: Right.

Brian R: And you're going to be influenced by everything that's around you. We see that in music. That's why genres are developed. That's why you know that that's a rip off poser artist, and, you know, all these different things that we hear in that thing. And we could see it also taking place in a modern context within apps and app development. So Quora was that sort of Alexandrian library because it was first person recounting of knowledge and information. It's not Wikipedia. You want "facts?" Whatever that means. Because facts change as science improves. Right? It was a fact at one time that there was no such thing as creepy crawlers that are on our fingers. We don't need to wash our hands after cutting off a gangrenous leg and delivering a baby, because you can't show me where those invisible creepy crawlies are.

Phillip: Right.

Brian R: And then a guy came along and said, well, there might be a technology that comes along sooner or later. I'm paraphrasing. It's called the microscope. And we might actually see it. And then will you believe me? No. It took a generation of doctors to die off to accept that there were invisible things, even after they saw it under the microscope. I always use that whenever I talk about technology, because there is the old guard, and a lot of people think the old guard is a bunch of old people. Not necessarily. It's the people who buy into a paradigm and fear the change. And they always want to drag along something from the old to inform them how the new is going to look. Yeah, Quora helped me a lot in the sense that it expanded me to contact people there's no way I would have ever known existed. Who knows? Just like, for example, Quora had an expert on automatic stoplights. A traffic control systems expert. This is a guy you would never have heard of unless he was your uncle and it's Thanksgiving. And he started talking about how these machines work.

Phillip: Right, right.

Brian R: Well, nerds like me, I can't stop going down that road, and you'll get that vibe from me. And he started talking about how these stoplights work. And here's the one that blew my mind. You know the crossing buttons? Most of the New York City crosswalk buttons are not connected anymore. We disconnected them in 1986.

Phillip: Right. {laughter}

Brian R: Mind blown.

Phillip: Sure.

Brian R: Sitting there in New York City. OK, let me get it for you. OK, got the button. It beeps. It even lights up. Some of them. Ain't doing a thing. Ain't doing a thing. It's all psychological. And I'll tell you, they did studies on it. And I don't know if it's intentionality. I don't know if it's, you know, the ordering of the world through physics, the observer effect. Reverse entropy. People absolutely 100 percent believe that that crossing control device is operating the light. And some of them are, by the way, but most of them aren't. And the biggest reason why they're still there is not so nefarious. It's because it would cost them a few million dollars to remove them all. So somebody had the bright idea to maintain them and let them stay there, and let the lights light up and let kids play with them. It keeps the kids occupied. And I would advocate, and this is what this guy said, he said there are a lot fewer kids being run over because they're busy pressing a button instead of running out in traffic. And I love that, having kids. Yeah. Press that button, especially in Vegas. I won't go that route. So Quora gave me that. By the way, I'll blow everybody else's mind. Most elevator open and close buttons.

Phillip: Yeah.

Brian R: Don't work. They're not even connected.

Brian L: Believe that.

Brian R: And that's on purpose. That's just to make people feel like they're in control. I'm in control of this elevator now. You've got to have the key, and that key is an override. You've got your leg in the door? It's going to close, because that override key overrides the sensors.

Phillip: One of my favorite memes that popped up recently was people taking, you know, disparate random pictures of everyday objects and assigning, like, a UI and a UX moniker to them. And it sort of blew up. It started out as, like, UI being, you know, the elevator open button. Right. But now it's just become, like, a banana and a Teddy Ruxpin that have nothing to do with each other.

Brian R: It just goes off the edge. So I digressed there a bit. But basically that's how I got into social media, and I was anti Facebook and Twitter. And then somebody said, hey, Brian, there are a whole lot of people talking about you on Twitter. I go, I'm not going to go out there and look. It's probably bad. No, it's good stuff. And so I had an account forever, and I think in 2015 I turned it on. And I've gotten a lot of exposure through there. People think I'm purposely controversial. And to me, I'm like, I'm giving you my free stuff. Because my logic is I better give away five percent of what I know, minimally. And I sometimes go a little bit more. But on the other stuff, I have to feed myself and my family. So I try to hopefully help companies. Hopefully they pay me sometimes. A lot of times I just do it out of... No. I literally do it out of love for what the company is doing. I mean, companies that are in payments that are starting up, you know, I'll sit there and talk to CEOs and Founders and stuff, and... great idea. Move along, Brian. And then all of a sudden two years later, they start doing it, and saying, well, it'd be nice to get a residual from that, you know. But, you know, but seriously, I really like to engage people to think. I talked about the whole experience of the defenders of the status quo. We're in that epoch now. I believe that this next five year period is going to be the defenders of the status quo sort of epoch. Because nobody is quite sure what the next big thing is. And whenever that happens, people tend to hold on to what they believe the last thing was. And so the whole paradigm is a smartphone, the whole paradigm. We talked about it pre show, of the app economy. And, you know, you hold on to that because you get scared. 
And I go, well, that's what the old people felt like when you disrupted maps and you disrupted taxicabs and you disrupted restaurant delivery companies, mom and pop restaurants, you know, and it all comes full circle. And if you're around long enough, you get to see these cycles because everything has a cycle to it. It will never change. And there's going to be a spring, summer, winter and fall. Right? Probably. I'd like to start with spring, you know. And in spring... You know, we're out of the spring of the mobile revolution, and we're kind of out of the summer of it. We're entering the fall. And that's not a bad thing. If it's a bad thing, you've got to look at nature and say, oh, that sucks. The trees look like they died. Like my kids. You know, they were raised in Southern California, and we go back east and they go "Dad. Mom. The trees are all dying out here." It's so cute, you know, because I go back to Princeton and it's beautiful. You see the leaves falling. But to a kid who maybe saw that but didn't notice it, it's profound when you see the reds and oranges and the leaves are falling. And to them, it looks like maybe a violent act, you know, and no, it's nature. It's recycling. So, yeah, that's what's going on in payments. That's what's going on in technology. I call it the voice first revolution. I call it voice first because there's only one hundred and forty characters in a tweet. And conversational commerce is a little too many characters. It's a little hard for me to say, and it's a resonation of keyboard first, which actually existed in a sense, because there was a fight in the early dawn of personal computers over whether it was going to be punch cards or keyboard operators. And guess who won that? Keyboard operators lost and punch cards won, because you can move punch cards much faster than somebody could type. And chew on that for a while. And it kind of inverts your logic. Hold up. Where do we start here? 
Somebody's got to punch the card to get the data on there, but they move faster through a computer. But that's a storage mechanism. But it's also a text entry system. Yeah, that's how it was back then. And when you look at the new paradigm, well, we can go into the mouse, right? Everybody... I had arguments in that epoch. Text is faster than the mouse, and all of this extra horsepower to drive these cute little windows you people love. Right? These are the anti Mac people. And I got in a fist fight. I didn't start it. At a Comdex, where a guy said, "I'm not going to get all of my CPU horsepower given up to move pretty pictures around the screen." And this is a, you know, text only, text first kind of guy. And I'm like, "That's always going to be there. I'm not advocating that the text input mechanism is going away. I'm just saying people are going to navigate and do most of their interaction with the computer through an input device, whether it's a mouse or trackball or whatever, trackpad, it didn't matter the modality. It's not going to be cursor keys." We have cursor keys, but most people, if they're young enough right now, don't even really look at them as a mode of interacting with their computer.

Phillip: Right.

Brian R: So I'm kind of saying some of this stuff because probably some of the stuff I see later on will hopefully sound a lot less controversial and make me look less of a prophet and more of just a poser.

Brian L: {laughter}

Phillip: {laughter} I don't think anybody at this point in 2016 would argue that voice first is something that's outlandish. I do think that a lot of people are looking at voice interaction as something that's not really relevant to how they're actually wanting to interact with technology. It's just something that we keep being told this is what you're supposed to be doing. What's your take on... Because I know you're very into Siri and Alexa and I see you have a Google Home now and all these. So what's your take on the assistants and the things that are happening? And is there really any market demand or is there any consumer uptake on this, or is this just something that's being sort of shoved down our throats?

Brian R: All of the above. I've got to make my definitions early so we can understand this. Siri, Alexa, Cortana, even Viv to a certain level. Google. Dammit, guys, just give it a name. Call it Becky. Just give it a name. And, you know, today, everybody's saying, Brian, why does it have to be a girl? You know, don't go after me. That's research. It has nothing to do with sexism or gender. Even women prefer to have a female interaction. It has to do with our mom's voice. If you really want to go into the psycho research of it, our mom's voice is more calming in our interaction with our devices. So it's actually a really nice thing. So anybody who gets up in arms about women assistants, remove the word assistant and understand it's a female voice and it's a calming voice. And it's cultural around the world. I don't care what the culture is. It can be an ancient culture. It's the way it is. So, personal assistants, and let's call them intelligent agents and intelligent assistants. Siri. None of these devices. Alexa. They're not intelligent agents and they're not intelligent assistants. They are Q&A interaction systems.

Phillip: Right.

Brian R: So what's a Q&A? Question, you get an answer. That's not a dialog. That's a Q&A. You and I, unfortunately, because I rattle on, we're not doing Q&A. We're just dialoging. We're conversing. And in the proper sense, we're having a dialog. And we're not at the point yet with these popular devices. What I do experimentally, all the research I do on any of these devices, is I force them into dialogs and proactive and premeditative, reactionary sorts of responses. Why is that? Because once you get out of this silo of I need to ask the question, and then it gives me an answer, and it doesn't follow up, and I don't need to follow up, the utility of that system diminishes logarithmically. And then when you extract the fact that there is no personal assistant or personal intelligence agent or intelligent assistant, then it really drops off. Ok, so where is the intelligent assistant and the intelligent agent? They aren't quite here yet, and it's going to be the revolution of our era how this shapes up. Because your intelligent assistant is going to know more about you than your significant other. For it to operate effectively, it needs to know all of your loves and hates. And it's going to do that by you giving it permission to monitor everything you do on all your devices. So that's machine learning, right? It's not rocket science. You can do it today. I've written many of these things, and I suck as a programmer. And all it does is it looks at... For the last year I've had this running, many of them actually. And it looks at everything I do. And there are patterns in everything. Right? Machine learning is about detecting patterns that you yourself do not know, and then it gets proactive. What's the first thing I do in the morning? Unfortunately, all of us raise our hand. We're looking at our phone.

Phillip: Roll over and look at your phone.

Brian R: Hopefully it's not right next to your head. I'm going to advocate you have that phone six feet away and face down and turned off. Honestly. I'm a big advocate about blue light problems. You shouldn't have blue light around your retina after 11:00 p.m. It's really going to screw up your melatonin cycle. You're not going to have deep sleep. Don't let me go down that road... hold me back. But when you wake up. When you wake up, and this is empirical, I've done research on this in a number of different studies, let's say 90 percent, it's actually higher, but I'm going to go 90 percent of people that are involved in touching any form of technology... I'm not talking about people I grew up with in eastern Pennsylvania, you know, horse and buggy people and stuff. I mean, these are brilliant people, but they're not involved in technology. People involved in technology. One of the first things they do is to interact with their device. It's going to probably be an iPhone or an Android device, less likely a computer or tablet. But they still rank in there. And there are certain things that you're going to check. Now, if your systems are proactive, it's already going to do it for you, and it's going to give you a summary. And it's going to do it the moment you say good morning or whatever you want your activation to be. It might be you roll over and the thing knows you, and it kind of says, "Hey, buddy, you better get up. You got a meeting in forty five minutes and traffic sucks."

Brian L: This is the exact story that we've been talking about. Keep going, keep going. But yeah, Phillip and I have been mulling this exact story over for quite some time. It's amazing.
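The pattern-detection idea Roemmele is describing here, watch what a user does first each morning and, once one habit dominates, surface it proactively instead of waiting to be asked, can be sketched in a few lines. This is a hypothetical illustration, not anything from Roemmele's actual systems; the event format and the 50 percent threshold are invented for the example:

```python
from collections import Counter

def first_actions_per_day(events):
    """events: list of (day, minutes_since_midnight, action) tuples,
    in any order. Returns the first logged action of each day."""
    firsts = {}
    for day, minute, action in events:
        if day not in firsts or minute < firsts[day][0]:
            firsts[day] = (minute, action)
    return [action for _, action in firsts.values()]

def morning_briefing(events, threshold=0.5):
    """If one action dominates the user's mornings, proactively
    prepare it the moment they say good morning."""
    firsts = first_actions_per_day(events)
    if not firsts:
        return None
    action, count = Counter(firsts).most_common(1)[0]
    if count / len(firsts) >= threshold:
        return f"Good morning. Here is your {action} summary."
    return None
```

So if the log shows the user checks email first on most mornings, the assistant has its summary ready before being asked, which is the "proactive instead of Q&A" behavior he's arguing for.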

Brian R: All right. So I haven't told a lot of people this. I don't even say this on my own show, because the guys would pull me in really bad. And I don't say it publicly right now; people want me to go to a few conferences. Well, I keep dropping Watson out of this. Watson did something brilliant. Remind me to come back to that.

Phillip: Yeah will do.

Brian L: For sure.

Brian R: I say Watson on a five dollar Raspberry Pi Zero is a freaking revolution. The fact that anybody around the world can get a five dollar device and put Watson AI on it is going to spark a revolution. I think it's one of the biggest things that IBM has ever done. And it's going to spark a whole new platform if they do it right. If they don't... well, you don't want to be breaking legs on a horse. But anyway, that's not far away. That's just... I don't want to sound arrogant. What's going on right now is that the computer scientists are in charge of the voice AI that we're experiencing right now. There hasn't been a Steve Jobs to come in and say, "Hey, Xerox, you have the future of computing right here and you've had it for the last nine years. Let me steal this stuff and call it Macintosh." And nobody's even debating that that didn't happen. It happened. Steve literally said great artists steal. Right? And he stole it.

Phillip: Sure.

Brian R: And there's nothing wrong with that. That's what we do. He synergized it. He made it better. But you can go back ten years before that and see everything that the Macintosh and the Lisa... the entire revolution right there. And it's no accident that the last thing that Steve Jobs did before he passed away was to call Dag Kittlaus of Siri and to acquire that company. I'm not saying the very last thing. The last company he acquired was Siri. And he said it was one of the future paths of computers, probably orders of magnitude larger than the mobile revolution. And a lot of people think he was just whack. You know, he was old, maybe didn't get it. He did get it, and he got it just like he got it when he saw the mouse move for the first time. Because what Steve saw... He saw the Xerox Star, the Star system. He saw the mouse, and that was a revolution. He saw the Window, and that was a revolution. But he didn't see a whole lot more than that. That was all he needed to invent the line that you and I have on our smartphones today. If the Mac didn't exist, the smartphone as we know it, with the touch screen, wouldn't have existed in the way that we see it today. It needed that history to get to where we are today. It would have had some weird BlackBerry connection that... I don't know how we can imagine what it would look like, but it would have a lot of keys in front of it. You know, it would have a scroll keypad just in case you didn't want to touch the screen and get it dirty. Because some engineer would have come up with that. And I'm an engineer. I can get away with this junk. But all right. So I said that voice right now is in the hands of scientists and engineers, and that's part of the problem. We might call it a Google Glass problem. Right. It is very apropos, because Spectacles just went on sale today. I'm telling most of the people that are willing to listen that's the beginning of a revolution of something that most people aren't recognizing. 
And Evan over at Snap is building a hardware company, and Spectacles is not the future. That's like saying, you know, Apple's sound card on their, you know, Apple II is their future. Because everybody thought that that's what was going to make Apple really go.

Phillip: Right.

Brian R: Because nobody saw what the Apple II was going to be. Steve said it's going to be in everybody's home, playing video games, and you better have a good sound card because the graphics suck and maybe we make the sound a little better. Because it cost too much. Moore's Law was not playing out. Moore's Law is playing out right now. And today you need the full power of AWS, Amazon's Cloud, or Google's Cloud, or IBM's Cloud to extract the intent of your words. Right now, on a Raspberry Pi, I can do pretty good speaker independent word recognition, and I can probably do ok. I probably wouldn't want to create a device around that because it's hit or miss. And I could probably waste a lot of hours coding and making it better and better. Now, you're probably going to use intent extraction from one of these cloud systems for a while. But there will be a point in time when we get many cores and we're looking at different... I'm not talking quantum computing here yet. It's not that kind of magnitude. But within the next 10 years, we're getting intent extraction, understanding what does that word really represent? And again, this is all part of the lexicon, right? You're saying a sentence, and that sentence can be said a lot of different ways. And when you get really smart, you look at the inflection of that sentence. What word is being highlighted? What modality is it being said in? What happened just before that sentence? What happened just after? Well, if there's no speech, here's a couple of things you can do. Where is this person at this moment? Well, they're in their home. What part of their home? They're in the kitchen. And they said, "Hey, I'm hungry." Well, obviously, I don't have arms or legs and I can't move anything. So if they're asking me, "Hey, I'm hungry," that means "Give me some choices." And that probably means a list of restaurants. And that probably means restaurants that you've been to before. 
And if you really want to get random, it might use new signaling. And one of the signals I'll give away free today, and I give away a lot free, is social signaling. There are over one hundred and eighteen signals we can use in this AI world that I've identified. And there are more. I'm not... I'm a student of this stuff. Right? So there are going to be more. I just identified one hundred and eighteen, but one of them is social signaling. So, you know, Phillip says that he really loves this hot dog lunch truck, this food truck that rolled out on a corner of town somewhere in Florida. Right?

Phillip: It's like you know me. Yeah.

Brian R: And what's that?

Phillip: You know me. Yeah.
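The kind of intent resolution Roemmele just walked through, a vague utterance like "I'm hungry" resolved into a ranked list of food choices using context rather than just the words, can be sketched with only two of the signals he names: past history and social recommendations. Everything here is a hypothetical illustration; the context field names and the single keyword rule are invented for the example:

```python
def resolve_intent(utterance, context):
    """Resolve a vague utterance into an actionable intent using
    context signals (history, social signals), not just the words."""
    text = utterance.lower()
    if "hungry" in text:
        # The assistant has no arms or legs; "I'm hungry" can only
        # mean "give me some choices," i.e. a list of restaurants,
        # probably ones the user has been to before.
        options = list(context.get("past_restaurants", []))
        # Social signaling: places friends have praised rank first.
        # (Stable sort: False sorts before True, preserving order
        # within each group.)
        recommended = set(context.get("friend_recommendations", []))
        options.sort(key=lambda r: r not in recommended)
        return {"intent": "suggest_food", "options": options}
    return {"intent": "unknown", "options": []}
```

So a friend raving about Phillip's favorite hot dog truck would float it to the top of the list, which is exactly the social signal Roemmele is describing.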

Brian R: Now, my intelligent agent... Again, this is not rocket science. I'm not talking about the future, you know, like the Jetsons in 1960. I'm talking you just have to put the code together. And I'm talking to people who are, you know, stealthy startups, or they're mostly late teens, early 20s. Most of the innovation that I'm talking about is going to come from people who are not actually making these things today. They're going to be using... Just like Steve and Steve used the calculator chips, because the 6502 was a calculator chip, like the 8008. These were just made for calculators. And the computer scientists laughed at the whole deal. That's not a computer. What do you think? These kids are crazy. They think they're going to make, you know, display drivers work on this, and they're going to get a disk drive to work. And Woz proved them wrong. Woz actually programmed the display memory inside of the memory itself. Nobody ever did that before. It was theoretical.

Phillip: Right.

Brian R: And so that's going on today in AI. And I'm not trying to insult any of the AI scientists or Google or Viv or Siri, but we will see. Come back to this archive. Bookmark this, folks, and you'll see in 10 years that a few people rose up who were not scientists, and they didn't come out with this idea of, well, we're Google, we're going to come out with a voice assisted AI, and you know what we're going to call it? Google. If, in fact, I'm even 10 percent right, the most personal device or system you will ever own will be your intelligent assistant, or your voice interaction with your computer. And it's going to all come out in different definitions. But let's call it our voice interaction. And do you want to be naming it a company name, or do you want to anthropomorphize it? And the idea of anthropomorphism is human beings tend to make things look like other human beings. Like we go to Mars and we see a bunch of shadows and we say there's a face on Mars. In reality, there is a real face on Mars and Mars aliens built it. But, you know, really... No. Not aliens. Some of my followers would believe me to be that, but... Aliens. But, you know, the bottom line is we tend to see anthropomorphism in the shadows and rocks. And the Egyptians... you look all over the place. The Egyptians were all about reverse anthropomorphism, and there are other names for it, the idea of putting animal heads on human bodies, human bodies on animals. The Greeks did this, and the Romans. And it's not an accident. It's what we do. Our mind is designed to look for faces at a distance to judge whether it's a friend or foe. So at that distance where somebody can throw a rock at you, literally, if you didn't have the eyes to be able to see the expression on the face of that individual, that being, locating their eyes and the proportions of their eyes to their nose, to their mouth, to their... you know, what muscles are being pulled. 
There's forty five muscles or so that display emotions in the face. If we weren't able to determine that, we didn't win the lottery of survival, because we'd probably have said, "Hey, that guy looks happy. I'm going to throw a rock up in the air just so he knows that I'm a good guy." Next thing you know, there's a thousand rocks and arrows being slung at you. And you don't get to reproduce. So humans have a preconditioned, hardwired ability to look at other humans. And the same is true with voice. We're always trying to look for something in a voice that reassures us. That's why certain movie voice actors come on and say "Coming soon to a theater near you." You know, these types of things. It's not an accident. The problem is, if you're in Silicon Valley and all you've ever studied was computer science and you weren't really brought up to be an empirical researcher, what happens is you tend to live in sort of an echo chamber and you say, well, you know, "All the voice researchers say a voice should kind of sound like this." Yeah, maybe. It's a pleasant voice. And, you know, all of our gender research and ethnic research studies say that we're an international company and we're going to go all around the world and we don't want to offend anybody by saying it's a girl, even though it is a girl's voice. And we don't want to Americanize it, because it primarily speaks really good English, and in other languages it doesn't sound so good, you know, and people get mad. It's like, why do people type mostly English for search terms, even though they're in another part of the world? Because that's where a lot of the innovation is coming from. It has nothing to do with anti-culturalism. It's just where these engineers are developing. But that was some of the thinking that went on. And the thing is, in ten years, it's going to look ridiculous.
And why is that important? Because if, in fact, we are going to get that close to our intelligent assistants and our intelligent agents. Our intelligent assistant is the manifestation of what we deal with every day; the intelligent agent is really sort of a program that we send out on the web without asking permission. See, we don't need an API in this new world.

Phillip: Right.

Brian R: If I can view it with my eyes, my intelligence agent can get it. So somebody will say, "Well, Brian, this world is going to require all these new APIs." No, in fact, APIs go away. People are going to optimize their websites for human consumption. And then the AI and the ML, machine learning, will discern what it needs to get, come back, slice and dice it, and not give you choices, but give you answers. Because right now, the only reason we're using a keyboard today, the only reason, is that the computer wasn't smart enough to hear us. But now they can hear our speech, discern what it is, extract the intent, and come back with an answer. It wasn't capable of doing that before because Moore's Law wasn't there yet. If the computer was invented today and we'd never seen a keyboard before, no normal human being would say, "Oh, I know how to solve this problem. We'll put the alphabet on little blocks and we'll pound on them." No, what we'd be doing is saying, "We're going to talk to the computer." And so in one hundred years, we're going to look back and say, oh my God, you know, those people used to actually type and move things around and thought that that's the way it was going to be.
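A minimal sketch of the no-API agent described here, with an invented `agent_extract` helper and a toy page (no real site or library API is referenced): the agent reads a page built for human consumption, discerns the one fact asked for, and returns an answer rather than a list of choices.

```python
import re

def agent_extract(html, want):
    """Toy 'intelligent agent': scrape a human-oriented page with no API,
    find the one fact asked for, and return an answer, not choices."""
    # Strip tags so the agent sees the page roughly the way a reader does.
    text = re.sub(r"<[^>]+>", " ", html)
    # Look for the requested label followed by a value, e.g. "Price: $79.99".
    match = re.search(rf"{re.escape(want)}\s*[:=]?\s*(\$?[\w.]+)", text, re.I)
    return match.group(1) if match else None

page = "<html><body><h1>Runner X</h1><p>Price: $79.99</p></body></html>"
print(agent_extract(page, "Price"))  # prints $79.99
```

In reality the "discern what it needs" step would be machine learning rather than a regex; the point is only that the agent consumes the same page a human would, with no API in between.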

Phillip: Yeah.

Brian R: What do you do when you do a search? You're doing all the work, cognitive load, mechanical load. You're sifting. All right. You're going to buy new sneakers. Where do you start? Sneakers in Google. Is that really how you start? You have a vague idea that you might be on team Nike or team Adidas or team boutique sneaker company, whatever your deal is, or I'm anti brand, which is a brand in and of itself. Whatever your game is, you're going to overlay that in your search. And you might secretly have loved somebody's shoes, but you're not going to say, hey, buddy, I really love the shoes. You say, hey, what is that shoe? You're going to look it up. Or you might take cues from sports stars, rock stars. It doesn't matter. You're going to try to find something and that's going to be your base of search. And it sounds like a visual search. Perhaps. It becomes a visual search only after you get to the coalescing of what you really wanted. So what might a future search in a voice commerce environment start like? You know the shoes that that guy wore on that TV show that I saw at nine o'clock last night?

Phillip: Yeah.

Brian R: It might be like that, because you might not remember any of this junk, but your intelligent assistant is going, "Yeah, I know the shoe. Those shoes." It may be so abundantly obvious based on, maybe, you stopped the video stream and it recorded that notation that you stopped it. Then you rewound it. It started again, then you rewound it again. That might be a signal. I'm giving away another signal that I shouldn't, but that's a signal of interest and intent.
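The pause-and-rewind signal hinted at here can be sketched as follows; the event names and the five-second window are invented for illustration, not a description of any real system. Repeated pauses and rewinds clustered around the same moment in a stream imply interest in whatever was on screen.

```python
from collections import Counter

def interest_signals(events, window=5.0):
    """Bucket pause/rewind events by timestamp; repeated activity around
    the same moment (pause, rewind, pause again) implies interest there."""
    buckets = Counter()
    for kind, t in events:
        buckets[round(t / window)] += {"pause": 1, "rewind": 2}.get(kind, 0)
    if not buckets:
        return None
    # Return the start (in seconds) of the strongest bucket and its score.
    best = max(buckets, key=buckets.get)
    return best * window, buckets[best]

# Viewer paused, rewound, and paused again on the same scene: strong signal.
events = [("pause", 541.2), ("rewind", 543.0), ("pause", 544.8)]
```

A real assistant would tie the winning timestamp back to what was on screen at that moment; the bucketing here only shows how a cluster of playback events becomes a single signal of interest and intent.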

Phillip: Sure.

Brian R: So without you even lifting a finger, and again, I'm giving you an extreme case, that's not a really good one because I don't want to give away the real good juicy stuff here. Nothing against you guys. But, you know, again, I hope somebody sits me down and says, "Brian, I'd like to pay you some money for this," but OK. Use that example. Now, where is your starting point? Now you need to tag that. Has it been tagged? In the future there is going to be some tagging, whether we like it or not. The future of advertising as we know it is dead, because my assertion, my hypothesis, and I postulate this quite a bit lately, is that in ten years 50 percent of your interaction with any computer is going to be via voice, of course, voice assisted AI. But I don't need to say all that. It's a hundred and forty characters most of the time I communicate this. So it's going to be via voice. And now with the acquisition of Viv by Samsung, who makes refrigerators, dryers and washers. I don't know if you folks have dealt with the most recent dryer that comes out, you know, the technology enabled dryer with Wi-Fi.

Phillip: Yeah, I have one.

Brian R: Washer?

Phillip: Yeah. From LG.

Brian R: They suck.

Phillip: Yeah. They're terrible.

Brian R: They're like VCRs from the 1970s. If we were all hanging around the 1970s and 80s we'd be saying, "You know, my VCR has one hundred nineteen events."

Phillip: Yep.

Brian R: And if I slow the tape down I could get sixty five events in there, and I'll come to your house on a Tuesday night. Go, "Hey buddy, why is your VCR blinking at twelve?" "I haven't figured out how to set the time." So now guess what? Every one of the features you bought that VCR for... The events are based on a timer, right? It don't work. And then you're connecting it to a TV or a cable system that doesn't allow the VCR to change the channel. Now, what do you have? You have a VCR that can only record on the channel that it's on, and only when you're in front of it, because you haven't set the clock. Now all it is is a playback device. So when we look at modern dishwashers and modern appliances, people are laughing at me. Oh, Viv. What are they going to do with that? I'm going to talk to my dishwasher. Yes, you will. Why? Because us guys, the gals, too, I don't care. I suck at laundry, but I got to do it. I got kids, and sometimes things get dirty and I don't want anybody to find out. So let's get it in there. I love researching UI. I love doing research. I've done hundreds of studies, and now I program. I love thinking about it, but you know, at eleven o'clock the kids are angry. I got to do wash. I'm looking at the thing. Listen, it's my fault. I said, "Honey, let's get this thing. Look at the touch screen on this. Look at all the options." You get home and say, screw this. I'm not going to stand here, because I know what I'm not going to do. And this is my nerdism. It's "Honey, it's got an app." The other thing I'm not going to do is sit there and play with the app. The app sucks even worse than the interface on that screen. So what am I going to do? I'm going to say, I got my kid's white t-shirts in there. They've got ketchup all over them. Take care of it. And I walk away. And again, I'm really simplifying it. It might be, "Mr. Washer Machine, please. I put white shirts in there and they're for youth, size small.
And I'd like to bleach it, so that nobody finds out that the ketchup is all over the place on their fifty dollar t-shirt, antique t-shirt. You know, that's where we're going. And when technologists look at this, it's hard to comprehend, because we grew up in a keyboard world. We believe that that's the only way we can communicate with the computer, and we believe that we need to sift. You want to know how much time you and I waste looking for stuff? And then we get... all right, I love serendipity, but some of these listicles. "Guess what this actress looks like today."

Phillip: Right.

Brian R: And, you know, she's barely dressed. And, you know, everybody clicks on that junk. Right. And next thing, you're in a listicle black hole, and it's twenty minutes you're never going to get back. After clicking on thirty nine pictures, you finally get to the picture. You go, oh, OK. Never going to fall for that again. And tomorrow you fall for the same junk. And unfortunately, that's how we currently have to pay for journalism. And I can talk about that in my payments spiel, because advertising, consumption of new media... the presidential election we had today is a byproduct of social networks.

Phillip: Yep.

Brian R: The tools that we all created created the presidential campaign that we have. Love it or hate it. I have absolutely no... I'm not going to talk about my bias either way. I'm just talking about this is what we created, and we all own it. We all own it because we created a persona. You and I, all of us have created this. You know, we created a secondary personality. I don't know what we want to call it, maybe skeptical, maybe snarky, maybe ironic, maybe a jerk. Maybe I'll use a four letter word with F. Maybe we create that persona online, and maybe we create an environment where everybody is sort of doing the pile-on in the twenty four hour cycle. Somebody might have said something wrong. Somebody may have done something wrong. Maybe some poor kid said the wrong thing to a girl. And you and I, when we were kids, said something wrong. It went out into the universe. We gulped and we said, we're lucky nobody heard that. Yeah, no, that's not happening today. Everything you say is reported and it's used against you forever, for better or for worse. And with the element of transparency, for better or for worse, everything improves. Your email, for better or for worse, is out there. Every time you said, you know, something that you probably shouldn't have said, but you said to a friend in confidence in a chat, is now out there. So this loops back to my voice first scenario and what I said: we are living through a time of a big decision we're going to make. Currently the Cloud is holding a lot of our personal information. I'm surrounded by all sorts of Cloud based voice devices that are listening to me and waiting for a trigger word, and then ostensibly recording everything that came after that trigger word and extracting that intent.
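The trigger-word behavior described here, always listening but only capturing what follows the wake word, can be sketched at the word level. This is a toy illustration (real devices operate on audio frames, and the function name and stream below are invented):

```python
def wake_word_capture(stream, trigger="alexa"):
    """Listen continuously, but only record what follows the wake word;
    everything before the trigger is heard and discarded."""
    captured = []
    armed = False
    for word in stream:
        if armed:
            captured.append(word)  # ostensibly recorded for intent extraction
        elif word.lower() == trigger:
            armed = True           # the trigger word arms the recorder
    return " ".join(captured)

print(wake_word_capture(["play", "music", "alexa", "order", "some", "pizza"]))
# prints: order some pizza
```

The privacy question in the conversation lives in that `armed` flag: the device has to process everything in order to notice the trigger, even if it only keeps what comes after.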

Phillip: Yeah.

Brian R: We've learned that if you have a Gmail account and you're inside of politics, everything you've ever said since 2009 is now going to be out there for people to see. I don't know if that's a good or bad thing. Let one hundred years decide whether that's good or bad. I would say on a human level, of course it's bad. Of course no human being should have to be exposed like that. On a historical level of power and transparency, maybe it isn't. I don't know. I'm not here to judge that. But imagine everything you've ever done being recorded by your intelligence system. So you hear me excited about all of this stuff. Now I'm going to be the downer. This is my only downer in all this. We're going to face that. And Apple's got a trajectory where they want to localize that inside of your device, and they don't want to make you the product. You know, Tim Cook will say you're not a product. Our product's the product.

Phillip: Right.

Brian R: He would advocate that in the Google model, and maybe even the Microsoft model, the data that it gathers on you is the product being sold, and maybe it's doing it in a semi anonymous way. It's still exposing you, and perhaps exposing you badly. So that may not even matter, because you and I are not going to tolerate traditional advertising in a voice first world. So if, in fact, I'm correct in my postulation that I'm going to start ordering sneakers, at least starting my search, by asking a question, the reality is I'm not going to see all those pay per click ads. I'm not going to tolerate my search results being nothing but pay per click ads. And I'm going to find ways to deadulterate my... Are they going to adulterate it? I'm going to find ways to use my machine learning and AI to get around it. Let's call that ad blocking software in the modern era.

Brian L: It's not even ad blocking, really. It's almost like they'll be sort of our filters. They're going to become the way that we prevent bots.

Brian R: We're already doing that, right?

Brian L: Oh yeah. Yeah. Exactly.

Brian R: I guarantee you guys are immune to the bar just below the Google logo and just above the organic search result. You probably don't even see it anymore. We're blind to it.

Phillip: Sure.

Brian R: It's like we all have a blind spot in our eye. You can find it.

Phillip: Yeah. Yeah. It's a fun one.

Brian R: You freak out when you finally find it. Yeah. Oh my God, I'm blind there. And the thing is, that's been going on for a while. That's why the conversions for most small merchants... I know this, I've been dealing with small merchants for thirty years. Conversions have gone to the floor, and it's not a Google problem. It's a failed product. Pay per click... Pay per click really winds up favoring the auction mentality of somebody that has too much credit line on their credit card. So you might think it's perfect. If you're a nerd and you're coming from statistical science, you might think this is "perfect capitalism." Somebody can bid for the highest position, and it's got to have good quality with the Google algorithm. Yeah, whatever. You know, the bottom line is, if you pay the most, you ultimately can be on top, unless your ad really sucks and it delivers you to a website that is not apropos to what the ad is saying. But if it does deliver on the content, then now what do you have? You have somebody who spent a whole lot of money to acquire you. That means they have a big investment to get a sale out of you. And that means that in a lot of categories they're going to go after you forever. They're going to ad track you forever. They're going to spam you, because you're going to give up your email sooner or later. And you're never going to be... It's going to be like a bad suit. You know, it's going to be sticking to you in the summer. And we're not tolerating that much anymore. That's a failed model. And it also is not very intelligent, because it doesn't understand that you already made the purchase. It doesn't understand the consummation. It's like if you're looking for a new air conditioner because yours broke. When you finally consummate that perfect purchase, your receptivity to air conditioner ads is 100 percent utterly wasted.
And if they come at you, they're just going to get you even more mad, because if they start showing you features that you didn't know existed, you're just going to totally block it and say, I don't need to be reminded. Humans don't want to be reminded that they made bad choices, especially in purchases like that. And I can go down that whole road on why salespeople are never going to go away. Good salespeople are absolutely invaluable tools to human beings, if they're not trying to sell you something you don't want. They really are doing filtering in a way that AI can, but not quite. So anyway, that ad model is obviously broken, and there's no way for Google to fully make that demarcation if they only do time constraints, and aggressive retargeting is just going to give it back. Now, imagine that coming at you through your voice device. It's like, oh, well, you know, you kind of like this... and no, we're not going to tolerate that. So the first thing is that goes away. The next question is, how are they going to harvest my data, and what are our advertisements going to be? What are they going to look like? Well, you know, you would not tolerate it, if you guys were hanging around having a beer, if Phillip all of a sudden said, "Hey, speaking of tires..." "What? I didn't say tires." "I just bought a brand new set of Firestone 305s. Firestone 305s are the most powerful..." You know, you're not going to tolerate that in a voice stream. Right? And so people say, "Well, you still got to look at something." Yes. Because all the heavy lifting is going to be done by your intelligent assistant and your intelligence agents. It will distill it down to something. Imagine, and again, I don't want to be using an old 1950s metaphor, but imagine having a real assistant. And you were chomping on your cigar, Mr. Big, up there in the penthouse. "Yeah, get me choices of ten new suitcases or luggage," or whatever.
And the poor assistant is like, "I don't know what he likes." Well, you better start figuring it out. Unfortunately, in that era, if you couldn't figure it out, the boss says, "Well, you don't know my taste. Get out of here." So an assistant... I see this in Southern California, in Hollywood, by the way, all the time. People running around getting coffee. No, that's not the coffee I wanted. I wanted Blue Bottle. Damn it. It's Wednesday.

Phillip: Yeah, it's Blue Bottle, and Thursday's Stumptown.

Brian R: Yeah. Yeah, that's right. And of course it's who you hang with. You know what I'm with. You know, when I'm with Joe or something like that, I got to have the coolest looking coffee, you know, and everybody's into this.

Phillip: Sure.

Brian R: So a good assistant knows... I call these situational conditions. So the AI that I'm working on is always aware of my schedule. So, for example, guys, when we connected, it knew I was interacting with you. So I already got the dirt on you folks, your LinkedIn or Twitter accounts, all this kind of stuff. Yeah. You know, it kind of gives me a quick update. I got a meeting coming up, and the best way I can equate this is it produces a card. It's not in a Google sense, because this is more like HyperCard. Google produces a monolithic card and you swipe. That's not what I do. I go back to HyperCard. It's the most beautiful way to look at the layering of what is really voice hypertext. The card is giving me the first distillation. Let's call it the first paragraph. And if I get lost or confused on what the heck it's telling me, I can go to that card and I can start deep linking and see how it arrived at that particular distillation, because it always keeps the chain. So I can know how that neuron was created, and I can see if it's been adulterated. I can get other intelligent agents to work on these things, to keep an eye out and to always prune things off that don't make sense. So what I'm starting to talk about here is many intelligent agents operating with many different voice systems, and operating with your intelligent assistant or assistants. This is where I'm not very clear. The intelligent assistant is the layer right next to you. You may have one. And I tell you this: once you build it, you will be loyal to it and it will be loyal to you. It will live with you the rest of your life, because at the end of the day...

Brian L: You'll invest in it.

Brian R: Yes, because it will hold the essence of what you are. It will remember stuff that you long wanted to forget. We shouldn't be remembering when the War of 1812 happened. I'm making a joke about that. I mean, we remember things because we didn't have instant recall through our technology. All right. A book can only hold so much. So we started offloading. As soon as books came into our culture, we started offloading a whole lot of junk that we didn't need to have in our brain, and that freed us up. The book, the Gutenberg press, is one of the things that started the Italian Renaissance and ultimately the Industrial Revolution, because it allowed us to stop having to remember stuff, to educate ourselves, and then to start synthesizing and being more creative. The reason why is it made so many more creative people. Unfortunately, most are not cut loose... Most people are afraid. This is my other problem with social media. You come up with something crazy, the crab pot. I don't know if you guys know the crab pot. The crab pot is, you can open up a big pot of crabs and they can climb out. But if you have enough crabs, it takes six. Five is not enough. Once there are six crabs in a pot, ain't nobody leaving that pot, because as soon as somebody tries to get over the edge of the pot, all the crabs will unilaterally grab them and pull them back in. And I'm telling you, I learned this when I was a kid in New Jersey. It fascinated me. It turned out it's a meme, I guess, but I saw it officially on the Jersey Shore and said, "Why don't you have a lid? They're going to get out of there." "No they won't. Watch this. I'll even pull one up." And as soon as you pull one up, you know, the one that's coming out, they're like, yeah, freedom. Every claw, the big claw comes out. No buddy, you're coming back in. And that's where we are in social media today. When somebody comes up with a new idea. I mean, I'm even experiencing it when I talk about voice first.
"I've been a data scientist for thirty five years, I've worked in voice for 12 years, and all the stuff you're talking about isn't even going to happen in our lifetime. Brian, shut up." I'm like, yeah, because it's not going to be unilateral. It's like, all right, when Steve Jobs thought of the Apple II, he said it's going to be in everybody's home. He didn't even know how or why. And it ultimately did. But it wasn't to balance checkbooks or to hold recipes, because people discovered something called Lotus 1-2-3.

Phillip: Right.

Brian R: And Lotus 1-2-3 allowed people to take their work home with them, and to do accounting and planning and speculation about the future with this new thing called the spreadsheet. And it liberated people. And then desktop publishing brought it home so people could communicate knowledge work. And the same revolutions are going to happen around voice. I can't predict all of them. I know commerce is going to be a massive part of it, and payments are going to be a massive part of it. So commerce. I'll give you another example. Food. I can go up to any of my voice first devices and I can say, order some pizza. I don't need to say much more. I can just let it go from that. It's going to see, am I home? Yes, I'm home. Anybody on my schedule beyond my family? My kids out doing sports or something? Now everybody's home and I've got two guests. Who are they? I got two vegans. OK, that's going to limit the choices, right? All of a sudden, we're going to now shape what pizza operations are available. Am I inclined to go out to my car and pick this junk up, or do I want it delivered? Well, I probably want it delivered because I'm hanging out. And what are my options to get this delivered? This is a big problem. Now think about what you need to do cognitively, and I mean include everything. You have to kind of interact, have to nicely find out... you know, hey, we're going to get the meat special, and the vegans look at you. You kill animals to eat? Now, you know, by the way, I'm a vegan, so I can say this crap. Now I am. Last ten minutes.
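The "order some pizza" decision chain walked through above can be sketched as a toy planner; every name, menu item, and rule below is invented for illustration, not part of any real assistant.

```python
def plan_order(home, guests, diets):
    """Sketch of the decision chain: am I home, who is present, what
    constraints do they bring, and should this be delivery or pickup?"""
    constraints = {diets.get(g, "any") for g in guests}
    delivery = home  # hanging out at home, so prefer delivery
    if "vegan" in constraints:
        menu = ["vegan margherita", "garden special"]
    else:
        menu = ["meat special", "pepperoni"]
    return {"delivery": delivery, "menu": menu}

# Two guests, one vegan: the choices are shaped before you say another word.
order = plan_order(home=True, guests=["guest1", "guest2"],
                   diets={"guest1": "vegan"})
```

The cognitive load the speaker describes is exactly what this function absorbs: presence, schedule, dietary constraints, and delivery preference all collapse into one utterance.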

Phillip: All right, guys, thank you so much for sticking with us. That was Part 1, and I know that you need to hear Part 2. It's going to blow your mind. We get into some really deep subjects and talk about the potential pitfalls and moral implications of things like AI and payments, and sort of, is this an evolution of the human race? All these amazing ideas and thoughts that Brian's going to bring out for us in Episode 20, which is Part 2. So we want you to check back in. I don't want you to miss it, though, so you need to subscribe. And the best way to subscribe: you can get us on iTunes. If you're on iTunes and you subscribe, please leave us a five star review. You can also listen to Future Commerce on Google Play or right from your Amazon Echo with the phrase "Alexa, play Future Commerce podcast." Anyway, thank you for listening to Future Commerce, and until next time, keep looking toward the future.


Phillip: Part 2 of 2 of a conversation that we had at great length with Mr. Brian Roemmele, who is a prolific blogger and prolific writer who has been published in just about every magazine and business journal that exists. And he's a great thinker around payments, AI, Virtual Reality... He pretty much... He's a renaissance man. He knows a little bit about all this stuff, and he's a great thinker. We love that he spent a good amount of time with us on the show. And if you listened to Part 1, which was Episode 19, you know that we were sort of getting into the dark side of all these amazing technologies. So sit back and relax. You may want to buckle up your seat belt, because I think you're going to be surprised at some of the things he brings out here. So without any further ado, let's get into the show.

Brian R: No. But the thing is, we forget how much energy we're dedicating. Now, I'll go to my phone. Now the phone sounds like it's really a lifesaver. But no, if you really look at it, it actually made the decision more complex. Now you have the tragedy of too many choices. Now you've got to distill it. Now it's a debate. Honey, would you like? No. Remember the? Yes. And there's this interaction. What do you guys like... No, don't go there. Wind up getting these debates. Everything gets crazy.

Phillip: Right.

Brian R: And maybe forty five minutes later, you say, you know, let's put some spaghetti on or something. I don't know, something crazy happens. Instead, when you activate that, and you look at the decisioning trees that take place, your mind is blown. And that's what I call the God moment for a lot of people, because most people would debate with me. "Hey, Brian, I can order faster through my phone." "Go ahead, try it." I go, "Here's your challenge. Go ahead and try it." And it's gonna get sloppy, because what the person's gonna do is a mixture of searching on maybe some social platforms. You know, maybe they use a restaurant review platform. Maybe they'll just do raw searches on Google. Maybe they'll try to hit up a few friends that like certain types of food. All these different things. And at the end of the day, it's not going to be very convenient. If you can enact a chain of events by using your voice... Your voice is just the input mechanism. It isn't doing all the work. I'm not saying cursor move up three spaces, OK, turn page, go left. That's what some people think I mean by voice first. It has nothing to do with voice first. That's voice command and control on a computer modality that doesn't even exist. It's ridiculous. And you could have done that 10 years ago with Dragon and L&H and a few other systems. What I'm talking about is the convergence that just happened in 2014, really. And that is good enough voice recognition in the cloud. All of the voice recognition of the devices we're talking about, Google, Alexa, Siri, it's all cloud: intent extraction, you know, speech recognition, etc. It's all being done by thousands of computers that are breaking it up into small parts, ultimately. Not really for every command, but if it's hard enough and tasked with the problem enough, it's going to be the entire AWS platform. So it's going to do that. And so when I go to order some pizza, it might wind up giving me just one choice.
And it will come back very quickly. Sometimes within a second, sometimes a few seconds, never more than a minute, even if it's really complex. And it's going out and doing a lot of things you and I would do. It will go out and look at sites. It will scan them. It knows the usual suspects. It knows my past, my history. And it's going to choose a place that makes sense. And if it can't, it will ask me, just like a kid. Right. You know, your voice... How are you going to program voice in the future? Like you train children. And if you don't want to do it, fine. You'll have kids that aren't very smart.
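The chain described here, recognized speech in, intent extracted, a single answer back, can be reduced to a toy pipeline. The utterance arrives as text in this sketch; as the speaker notes, the real recognition step happens across thousands of machines in the cloud, and the intent table below is invented for illustration.

```python
def voice_pipeline(utterance):
    """Toy version of the chain: take recognized speech as text, extract
    the intent, and come back with one answer rather than choices."""
    intents = {
        "order": lambda obj: f"Ordering {obj} from your usual place.",
        "weather": lambda obj: "It's 72 and sunny.",
    }
    words = utterance.lower().split()
    for verb, handler in intents.items():
        if verb in words:
            # Crude slot fill: everything after the verb is the object.
            obj = " ".join(words[words.index(verb) + 1:]) or "something"
            return handler(obj)
    return "Sorry, I didn't catch that."  # ask back, just like a kid would

print(voice_pipeline("Order some pizza"))
# prints: Ordering some pizza from your usual place.
```

Everything interesting in a real system lives inside that intent-extraction step; the point of the sketch is the shape of the pipeline, one utterance in, one answer out.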

Phillip: Wow. Yeah.

Brian L: We talked about that as well. Totally.

Brian R: Yeah. So what's that like? For me, I like it to be a lot of fun. I think we need to remove all these facades. You know, Siri being a little snarky is better than Google. Google has no sense of humor. And again, that's done out of not wanting to offend or hurt anybody's feelings. And then there's the really stupid sense of humor, like with the weather bot. I mean, that's the other side of it. The other side of it is the plain stupidity of Facebook's weather bot... Or it's not really Facebook's. But it's fun the first time, sort of annoying the second time, and you ain't going back the third time. Right. The novelty is gone.

Phillip: So true.

Brian R: So novelty is good when it's always novel, and novelty is good when it knows you. Your friends have their own novelty and you get to love it. Right? You love the interactions. You know their quirks, and you live with it. You know, sometimes it's a little overbearing. Like me right here, you know, and you kind of put up with it, hopefully, and don't pull me off too early. And you kind of pull it back a little bit and you say, OK, that novelty's too much. I'm in that experimental mode right now. I mean, I'm cobbling this together, and I can't even pretend to tell you that I know what enough and too much is. But I can tell you that nobody does. That's how early we are. We are before the Homebrew Computer Club existed, if we want to use the analogy of the personal computer being discovered. The Homebrew Computer Club didn't even exist. That's how early we are in this technology, even though obviously we already have it. We're so early, and when we look back at what it ultimately is going to do, we'll say, oh, my God, we were so primitive. We actually thought this is how we were gonna do it. You know, like AirPods are a good example of another modality. I mean, they are a room based voice first system. Personalized bone conduction in your canal. I mean, it's many modalities. I would say right now there's 27 modalities I've identified. And it all makes sense. Definitely in your car. Absolutely utter ridiculousness that we don't have profoundly powerful voice systems in our car.

Brian L: I so agree.

Phillip: Yeah.

Brian R: And by the way, a self-driving car by definition is a voice first device period. End of story. Right?

Phillip: Right. Yeah.

Brian R: You're going to say "Stop," because if you're sitting there with, you know, with a martini, with your back towards the front window, and you look over the back edge of your seat... Right. Holy cow. We're about to fly off the map. You're going to say, "Hey, Tesla, stop. I mean, stop right now." You know, you're not going to find the brake pedal. So it's a voice first device, even if it has controls in the car. Maybe if you're sitting by the wheel, you might take the wheel, whatever, but you probably are going to use your voice because it's quicker. And it's probably what's going to happen first. And there's a lot of other things. We already talked about appliances. In 10 years all the complexity is going to be completely hidden from you. For example, I'm looking at OS X right now, right? OS X is a patina painted over a Unix environment, and everything that I do in clicking around a screen is an analogy to a rudimentary Unix kernel command that can be accessed by a command line inside of the terminal. So if I pop up the terminal in OS X, I'm literally talking to that kernel, in a sense. Obviously, programmatically, I have to do some more things to simulate some of the things I do by clicking. So it's a patina over the complexity. So it's not a modality we're not used to. It's just not explained to us that way. It's like, oh, that's how a computer works, because you grew up with a computer that looks that way. You just think that's the way it is. In reality, this is painted on there. All the stuff below, in the engine compartment, which has even more going on at a machine language and kernel level, down to the processor itself... address codes and all that stuff... we're so far removed. No programmer even deals with that junk anymore. There's compilers; you don't deal with machine code. I grew up programming in machine language, so I have an affinity for hexadecimal numbers and binary. At the bottom of the receipt for Spectacles...

Phillip: Yeah, yeah. I saw this.

Brian R: There's a binary code in there and I discovered it. And I guess I was the first one to discover it. So an executive at Snap DMed me saying, "How did you get it so quick?" "What?" "Everyone who looks at it just thinks it's art." It's binary, and it's groups of eight ones and zeros. In fact, I read it. I kept reading it again because it says... Pot of GGLD! Pot of gold, I thought, is what it said. Maybe it did. Maybe I'm still bleary eyed. But the whole receipt is a rainbow. And at the bottom it says pot of gold in binary. I guess it's, you know, obviously nerd humor, but it's GG gold. And I guess GG, Good Game. I don't know, maybe that's what they meant. I guess he said I got it right. But I thought it was pot of gold. Anyway. Getting back to the point of the patina. Voice is a patina that is going to gear up this new revolution. So anyway, that's sort of the monologue. I wanted to let you guys start getting in here because I'll talk until three o'clock in the morning. Does some of this stuff make sense?
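For listeners who want to try the trick themselves: reading "groups of eight" ones and zeros as text is a few lines of code. This is a generic sketch, not the actual bit string printed on the Spectacles receipt, so the example encodes and decodes its own message.

```python
# Decode "groups of eight" binary digits into ASCII text, the same way
# the hidden message on the Spectacles receipt can be read. The bits
# below are generated from our own message, not copied from the receipt.

def text_to_bits(text: str) -> str:
    """Encode each character as an 8-bit binary group."""
    return "".join(f"{ord(ch):08b}" for ch in text)

def bits_to_text(bits: str) -> str:
    """Split into groups of eight and map each group back to a character."""
    groups = (bits[i:i + 8] for i in range(0, len(bits), 8))
    return "".join(chr(int(g, 2)) for g in groups)

encoded = text_to_bits("POT OF GOLD")
print(encoded[:16])            # the first two 8-bit groups ('P' and 'O')
print(bits_to_text(encoded))   # -> POT OF GOLD
```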

Phillip: Oh, gosh. I mean...

Brian L: A ton of it makes sense.

Phillip: You're hitting on so much that we've really... that's really the whole charter of...

Brian R: Jump back in on some of my assumptions. Tell me where you really buy into it. Give me some of your feedback, and tell me where you think I'm just kind of going off.

Phillip: Brian. Take it away. I know you're champing at the bit.

Brian L: Yeah, I don't know. As far as off the deep end, I think it's not really off the deep end. It's about what's actually going to be adopted and when. So, so many things I want to touch on and questions I want to ask. But I think, you know, you mentioned personal assistants, and we talked briefly about, you know, how you're going to have to sort of raise them, if you will. And honestly, I've thought about this concept quite a bit. Like, there's going to be sort of your personal assistant, and there are going to be a lot of other bots and assistants out there that are going to interact with your personal assistant.

Brian R: Absolutely. Yes.

Brian L: And then you're going to... The question is, are we going to make these bots in our own likeness? Are they going to be some sort of a surrogate? Or not surrogate. That's the wrong word. More than an avatar, but sort of the signet ring of our personal identity.

Brian R: I love that. Yeah.

Brian L: And so, are we going to invest in them and make them in our own image or are we going to treat them like in their own identity and give them their own name? And we will also have to make a lot of decisions around what they can't do. Permissions.

Brian R: That's right.

Brian L: Are they going to make purchases for us? Schedule meetings for us?

Phillip: Yeah.

Brian L: What are they going to accomplish for us?

Brian R: What are they going to expose? When do we let them give out personal data?

Brian L: Exactly. Yes. Nailed it. Yeah. Exactly. And they're also going to talk to us. I think this is one thing that I really want more of your thoughts on. But Alexa has been rumored to be able to enable push notifications soon. And frankly, in my mind, that's one of the most realistic next steps.

Brian R: Landmark.

Brian L: Yes, absolutely. Because the thing is, right now...

Brian R: I call that proactivity, by the way.

Brian L: Proactivity. Ok. Good word. Good word.

Brian R: It's all under the umbrella, and that is rudimentary. Not even, you know, not even preschool. I mean, this is actually not even sperm or egg, if you want to go real backwards here. This is not even a thought. That's how early we are in that form of proactivity. The things that we call notifications today are ridiculous. Because they're annoying. I mean, you and I... I'm sure... I've never met you guys before. We go way back... an hour ago. We're all buds. But, you know, we connect. Warped Tour. But most of our notifications are clogged with bullshit right now. And the challenge is, programmatically, let me try to guess what I need to send you. And data scientists are... I mean, they lose sleep. I hang with these guys. What is the proper number of notifications? How do we filter it? What tools do we give? Nobody uses our tools. As soon as they create the tools to filter the notifications, one or two people out of two thousand or ten thousand or a hundred thousand actually set them. The rest of the people just let it go full on or full off. The reason that is is because you're looking at the problem the wrong way. I don't want to add another level of complexity. I want it to figure it out. I want it to know me. The only way it knows me is it's gotta get close to me, and if it gets close to me, I want it to be hygienic. I want to know where it's been. I want to know who it's been with. You know, I want to know who it's going to be with. And if it comes with a whole lot of warts, "Hey buddy, I don't know who you are anymore." And so you want to kind of... the whole problem... This is, again, our challenge. We're at the precipice of this. It all comes down to our privacy. What is really private?

Phillip: Yeah.

Brian R: Our personas. What is really our true persona versus our public persona? All of us are liars. We all create facades of who we really are in the public space. That's normal. That's human behavior.

Brian L: It's gonna be verified data versus unverified data. There are gonna be dating sites where it's like, we verify this data is accurate. We've talked about it on the show.

Phillip: That's true.

Brian L: And then what that's gonna do coming up not too long from now.

Brian R: Everybody's athletic. Everybody on a dating site has got an athletic build now.

Brian L: Yeah. So, you know, is it real? Is it not?

Phillip: I mean, I have an athlete's body. It's Warren Sapp. It's Warren Sapp. Number 99 on the Tampa Bay Buccaneers.

Brian R: I'm running a marathon in Kenya right now.

Brian L: {laughter}

Phillip: {laughter}

Brian R: And I'm dribbling two basketballs, so date me. Swipe left.

Phillip: Yeah, one of the things that you kind of mentioned and I'm sorry to cut in, Brian.

Brian L: No, keep going.

Phillip: I just realized that both of you are Brian. It took me two and a half hours to figure that out. {laughter}.

Brian R: My AI is actually lighting up with the right Brian.

Phillip: Great. So I'm not usually a downer in these conversations, but we have such a hard time...

Brian L: What are you talking about? Yes you are.

Phillip: I know. I am such a Negative Nancy. We have such a hard time, just in human interaction and on a personal level, understanding things like consent, which is a big conversation right now. And just interpersonally... the understanding of how we can now translate that, in a meta sense, to an extension of our personality, or consent around our most personal interactions, or the true us, from an AI that understands exactly who we are.

Brian R: It will know you actually better, absolutely better, than any significant other, any best friend. And it will know you better than you.

Phillip: Well, so today relationship official is Facebook official. Right? Or whatever it used to be. So you know that in the future we're talking about a new level of consent, of really, like, taking off... You know, a new level of connection will be to show me the real you and allow me to see the real you without the facade getting put up.

Brian R: You're getting to a really interesting part. And this is where a lot of people get scared because we're now walking down a really dark alleyway.

Phillip: Right.

Brian R: Because, you know, on Warped Tour I did a lot of research. We talked about this pre-show. And what I detected about four years ago... and this is just as the rise of the smartphone was trickling down to somebody who was 14, you know, their first Warped Tour. And they were able to afford it in, you know, middle America, not in technology centers... where they had dad's hand-me-down phone years ago, or mom's hand-me-down. They literally had their first device, maybe their first iPhone. And I noticed one of the demarcation points of trust and verification within a relationship is that you gave your significant other the password to your phone. And whenever that came up, I tell you, everybody got a very serious... it could be crazy, you know, everybody got a really serious look, because that was the deepest level of touching a human being. Now, take that feeling of somebody handing over your cell phone. Okay, honey, anywhere you want to go, look at my history. You know, whatever. Anthony Weiner aside here. But, you know, the bottom line is, there are going to be the Anthony Weiners of AI. I hope we don't live to see that.

Phillip: Yeah. Yeah.

Brian R: But what I mean by that is this. If you think that's a personal dive into you... and I think any of us feel it today... you wouldn't have guessed this 10 years ago. If you had said 10 years ago that people were going to hand over their phone to somebody...

Phillip: Yeah.

Brian R: What, my phone? I don't care. Here you go. Nobody cares. You say it now. Even people who are older... let's even call them senior citizens... they're still going to be like, no, I really don't want you to have my phone. And this happened recently with my son, with a relative I won't name. He was just going to take a quick picture, and she very much did not want to give up the phone. And she used "I don't want it to break" as an excuse, but I knew it was very personal.

Phillip: Oh yeah. We know what that means. Yeah.

Brian R: So we as a society are going to have to deal with this. We're not even dealing with that yet. Right?

Phillip: Whoa yeah. Exactly.

Brian R: There's this whole thing of, do you really have that right? You know, and what do you have to hide? We're also in a world now where if you want your privacy, then the next question is, oh, then what do you have to hide? And that's people who have grown up not studying history. The only human being in America that would say that is somebody who did not study the history of human beings, and doesn't understand that we formed a country that allowed us not to have to answer that question, and not to have to assume that I have something to hide because I want my privacy. And I say that now because it's more important right at this moment, right now, with what happened this week, than ever before. And I think we as a society have to deal with this. And I bring this meta problem up because I'm advocating this junk. I'm coming out here, and some people say I'm the biggest cheerleader of the voice first revolution. And maybe that's true. Maybe it isn't. And I also feel, gentlemen, a really deep sense of responsibility that at the very same time I'm opening up an ugly black hole that I don't think we're mature enough as a species to deal with. And that is: what happens when something knows you that well, and then what happens when it's the embodiment of somebody else's computer that you don't own? We have this vague notion of ownership. When all of us were going out to Warped Tour and buying CDs, we "owned" our music. We're angry old men.

Phillip: Yeah.

Brian R: I want to own my music. And there was a whole ethos around that. You know, it was like you were proud to own the CD that somebody made in their garage and then played it and screamed their heart out on a tour. And you wanted to reward them. You wanted to be able to take that piece home with you. That doesn't exist. There is a generation growing up that doesn't feel that, that sense of ownership. Now, I extend that into the ownership of your identity, if that isn't embodied in my own hardware, ostensibly. Right? You have to believe that it's in something in your home. That you can block somebody from getting access to by unplugging it, hopefully. At least you can unplug it and a battery doesn't pop up and say, no, I'm not off. No, I'm still broadcasting. Sort of like my camera right now. It's not on, but somebody could be watching me. I know, because we're on Skype. I have taped over it. So does Mark Zuckerberg.

Phillip: Yeah. That speaks volumes, right?

Brian R: That says a whole lot to you. And I'm kind of joking, but it all comes down to that same... I'm going down this little whirlwind that leads to one point, and that is: should we do this? Unfortunately, the answer is, if technology is produced that does it, man and woman are going to apply this technology whether we like it or not.

Brian L: Yeah, I think you really nailed it. You know, at this point, it's because of the benefit. We talked about this before. When utility outweighs the privacy concerns, people will do it.

Brian R: The pendulum swings back and forth, though, right? The generation that has been raised on social media today says that unlimited sharing is OK. When you hit a bong, you know, back when you were in high school, and you put it up on Facebook, and now you're going to be the chief surgeon at a hospital, you're like, ah, they'll put up with it. You know, maybe at some point you wind up leaving a surgical utensil inside a person and there's a lawsuit, and you made an honest mistake, because it happens... and they go back and they see that you hit a bong when you were a senior in high school, and they bring that up. Do you still have your drug problem, Mr. Surgeon? Now, all of a sudden, your social media past can be used against you in a way that you never thought would be possible. And it sounds obtuse and bizarre. But isn't that how our past is always used against us when there is a record? Haven't we learned that in this last election cycle? We've learned that we can take anybody's past and we can demonize any human being on this planet. We now have 50 percent of this population or so... call it 60, I don't care... demonizing the other 50 percent. And we're doing it on social media, and the leaders of our country are doing it, in a sense. And we're all in the mud doing this together. I bring this up because I hate politics right now. I don't want to even talk about it for the rest of my life. And I think most people are that way, no matter who won and whether you're happy or sad, crying, angry, acting out, you know, kicking somebody's car.

Phillip: Right.

Brian R: You know, mad. But what I'm saying is, that's the same thing, but on turbo charge. Because if your intelligent assistant really does its job, it's going to anticipate you in a way that is beyond freaking you out. It is going to statistically know... I mean, I can tell you right now some of the things that it knows about me that... All right. One of the things I've done is I've used Alexa and Siri to talk to each other pretty much since April 1st of 2016. So April 1st, 2016, I started them off... modified, of course. And I do it in a way where I'm constantly shifting IP addresses on Alexa and Siri and depersonalizing it, so that the data scientists over there can't really see even what I'm doing if they wanted to.

Brian L: {laughter}

Brian R: All it looks like is random traffic with random questions. Because I literally had... I won't tell you which company, but I had a data scientist tell me, Brian, you're not doing any of that work. You want to see? Here's a live cam. And he goes, "There's no way you're doing that." And I go, "Yes, there is." "But our data doesn't show that anybody is doing that." Yeah. Because I'm breaking it up. And it's not out of paranoia, to be honest about it. It has nothing to do with that. It's mostly because I wanted to see if it would still operate in the same manner if I was able to do that. And I never decoupled it, because I later started thinking this is a lot of highly sensitive information. Because what Alexa and Siri were doing under my programmable control... again, not their APIs. It's this sort of API I built around it, using a Raspberry Pi, to have my way with these devices and systems. It started analyzing my schedule first, because that was the thing I wanted to attack. I'd become quite busy. I have open office hours that I dedicate to anybody... and by the way, I advertise that right now: if you have anything you want to talk about in your life... if you want to talk about voice commerce, voice data, voice first, anything under that realm, payments, contact me on my social platforms. I have open office hours. I dedicate, unfortunately, more time than I have sometimes to it. And I had to mediate my schedule, and I figured, you know, that's a good problem to try to solve. So I tossed a bone, if you will, into that mosh pit of actually three voice first systems interacting pretty much consistently in a closet. It's now soundproofed with egg carton... no, it's got real foam rubber, but I wanted to use egg carton because I just like the look. Anyway, it's been talking in that closet forever, in my view. And it's almost a lifetime when you think about it from a data level.
And again, AI researchers think I'm absolutely insane and think that I'm doing stuff that doesn't need to be done, that you could do programmatically through APIs and source code and stuff. And I'm like, yeah, I could, but then I would have to do it. And I don't want to do it. I want to let them do it. And it's like a kid, right?
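Roemmele doesn't publish his rig, so the following is only a guess at the traffic-shaping half of what he describes: fragmenting a stream of assistant queries under rotating session identities and jittered timing so no single stream resembles one user. The function name and every parameter are invented for illustration.

```python
# Hypothetical sketch of the "depersonalize the traffic" idea: shuffle
# queries, jitter their timing, and issue each under a fresh session
# identity so the stream no longer looks like one user. This is NOT
# Roemmele's actual code; it only illustrates the shaping logic.
import random
import uuid

def fragment_queries(queries, seed=42):
    rng = random.Random(seed)  # seeded only so the sketch is reproducible
    plan = []
    for q in queries:
        plan.append({
            "session": str(uuid.UUID(int=rng.getrandbits(128))),  # rotating identity
            "delay_s": round(rng.uniform(1.0, 600.0), 1),         # decorrelate timing
            "query": q,
        })
    rng.shuffle(plan)  # break the original ordering as well
    return plan

plan = fragment_queries(["what's on my calendar", "reschedule 3pm", "am I free Friday"])
for step in plan:
    print(f"wait {step['delay_s']}s  session={step['session'][:8]}  {step['query']!r}")
```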

Brian L: Yup.

Brian R: If you never had a kid, you look at other people raising kids. You're a big skeptic about what parenthood is like, "Tell that kids to shut up." And why are they picking that kid up? And all of a sudden you have one, the aha moment comes and you say, "Now I understand."

Phillip: Yeah.

Brian R: One of the things I did with my children was, when they were climbing on things, my wife would freak out and I'd say, "Listen, they need to understand what the barriers are. We can't always be there. We can be here to catch them if they fall. But I want them to fall because I want them to know where the edges are, their balance and stuff." And it sounds very high and mighty. I didn't quite say it that way at the time. And then right when he hits the ground, I'm grabbing him, and he looks shocked, and he looks around and goes, "Oh, now I know not to do that." Now, I could have said a thousand different ways, "Don't do that." Right? You know this, being dads. But once you experience it... let's call it muscle memory, but it's a lot more. It's cognition. And there are neurotransmitters that fire and remind you not to do certain things. The neuropeptide release that reminds us. The same thing that tells us not to eat poisonous things. We eat it once and it tastes bad, and our entire body remembers it. That's why neuropeptide receptors are in every cell of our body... it even bypasses our brain. Certain things, like scratching your fingers all along a blackboard... that's built into our neuropeptide system. Once you hear that blackboard-scratching sound, it activates all those neuropeptides that say, run away from that. Some people say it was dinosaurs chasing us. Again, that's really funny, because theoretically dinosaurs and man weren't... The screeching sound is a reminder. I don't know. We won't go down that thing. But so now here we are with all of this data, our deeply personal data. Our loves, our hates, our pains, our true personality. Yes, that person did vote for the other guy. Now I can beat him up. Yes, I have proof, right? You're the jerk that did that. Or you're the jerk that didn't do that. You didn't vote. I'll use something topical because it's very emotionalized right now. In 20 years, nobody's gonna care.
Do you care what Nixon did right now? No. Do your parents even care, if they lived through it? Or great-grandparents, whatever? No. But I'll tell you this. When Nixon was caught doing the things he did, there were people having fistfights and anger at bars. People would say, you voted for that SOB. No, I didn't. Well, if your voice device can prove that you did, now you have a problem. Now we can go into pre-crime. Now we can go into 1984, George Orwell. Now the court order gets to go into your AI to try to see what you did on January 20, at this date. And where were you? That's already kind of happening, isn't it? They want access to your iPhone to prove whether you were there. And it always starts with a really wholesome reason.

Brian L: Right.

Brian R: Somebody did something bad. Always starts that way. And then some guy gets blue gloves and he gets to pull out your phone any time anywhere.

Phillip: Yeah.

Brian R: To give you a little feeling up and down your phone. And that's going to happen with your AI, and we're going to live through that. And I don't care who's in office. I don't care if they're a fascist, or Chairman Mao, you know, reincarnated, or Karl Marx and Lenin all in one. Pick your poison, polarize whoever the hell you want. The bottom line is we're going to live through that, and there are going to be certain entities that are going to want to have access to that. And there are going to be certain entities, hopefully you, that don't want to give access. And then there is gonna be the question: if you're not guilty, what do you have to hide? And I think that is the challenge of this generation. Are we going to stand up for "I don't have to answer that question"? Not because I'm guilty, but because I have an inalienable right to not ever even have to deal with that. It becomes more important now than when the founders framed this. When they framed it, the privacy of your papers and your home... you know, I use all the old lingo if I want to. But basically, it was assumed, you know, that there's a sanctity. And it sort of started post-9/11, where we kind of opened up the gamut... always in the name of good. Every law, everything that happened in the fall of the Greek and Roman empires, was always done for good reasons. Every step, if you really study it. Even Egypt... nobody asks why the Egyptians fell apart. The Egyptians fell apart because the Greeks got there. The Greeks beat the Egyptians, and the Egyptians were debasing their society because it lost touch with its roots. And if you look at all failed societies, it always loses touch with its roots. It always says that the world is more complex, and we need more regulation, we need more policing. And then the group that's policing always grows larger than the population itself. Have you looked at what happened with the Stasi in former East Germany?
On every floor of every apartment, there was somebody monitoring what everybody else was doing on that floor, reporting it to the secret police. And the files got so big that they were taking over literally hundreds of thousands of square feet of notes. I don't know... "Rudolf flushes toilet at 10 a.m. twice. Make note of that. What is he trying to hide? Is he stealing more water from the rest of us comrades? We work hard for our water." You know, that kind of stuff. And it's always done... and again, it sounds like it's good, because in that society everybody should have the equal number of flushes on their toilet. And back then they didn't have regulators. So in the future, maybe there's a regulator. You can only flush your toilet once a day. Let the crap build up and then you get to flush it. And it's always done for good reasons, isn't it? Right? Doesn't it sound like a good reason? You don't want to waste water. And it's only fair that if Joe Yoseph doesn't get to flush, you shouldn't get to flush. Somebody has got to monitor that. So we've got to elect some entity to stand in the middle, make sure everybody only flushes the toilet once, for the good of everybody.

Brian L: And all of a sudden we have something that's monitoring us all the time.

Brian R: That's right. And by the way, everything I just talked about so breathlessly, so excitedly... I'm actually helping create that. Understand that I feel that burden. And I try to be a little light about it, but I don't even know that I understand it enough, and I don't know what it looks like a hundred years from now. I don't know if this is actually the destruction. I don't know that that's where society fails, because we're so overwrought with all of this personalized data. The quantified self.

Phillip: Right.

Brian R: All right. I'll give you an example. You wear a watch that knows everything about your health. And let's say health care now becomes socialized. And now, instead of you paying the money, "the younger people" are paying for the older people.

Phillip: That's right.

Brian R: Well, it knows that you've made some risky choices. You took a swig of whiskey when you shouldn't have. And you're going to pay the price. What that means is, well, you get a deduction on your life extension. And you don't get extra health care, because you made bad choices when you were twenty-seven.

Phillip: Yeah. It's true.

Brian R: It didn't even have to be socialized. I mean, this could be private health care as well. I think, in all reality, anyone can use these things to create a system for incentivizing certain behaviors or penalizing others.

Phillip: Yeah.

Brian R: I mean, we're all nerds, like, OK, I love the idea. I'm only going to pay for insurance based on how many miles I drive, and I'm going to give up, for that savings, letting the insurance company have second-by-second notice of everywhere I go, how many miles per hour I'm going.

Phillip: Yeah. I was gonna make this... We're gonna make this exact point. Yeah. We've already done it. We've already done it. Yeah.
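The usage-based insurance trade the two are describing can be made concrete with a toy premium formula. Every rate and penalty below is invented for illustration; real telematics pricing models are proprietary and far more involved.

```python
# Toy pay-per-mile premium: the driver trades second-by-second telemetry
# for a rate tied to behavior. All numbers here are made up.
def monthly_premium(miles_driven: float, hard_braking_events: int,
                    base: float = 40.0, per_mile: float = 0.05,
                    per_event: float = 2.0) -> float:
    """Base fee, plus mileage exposure, plus a telemetry-derived surcharge."""
    return base + per_mile * miles_driven + per_event * hard_braking_events

print(monthly_premium(600, 3))   # -> 76.0
```

The uncomfortable part is not the arithmetic; it is that the telemetry feed needed to compute `hard_braking_events` is exactly the second-by-second surveillance being traded away.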

Brian R: Right. So now it's your self-driving car. So now you don't even have autonomy. You say, I want to go to... hopefully you don't say TGI Fridays... or you want to go to California Pizza Kitchen or something. Well, it's going to tell you the California Pizza Kitchen that's closest, and maybe at some point, for optimization of your carbon footprint, you can only go to the CPK that's closest. And again, that might be done for the good of everybody. Right? You're going to create a bigger carbon footprint. The sea might rise a few extra centimeters up the coast because you're free driving down the road. I mean, these monitoring systems and the artificial intelligence and machine learning... it's what Elon Musk was talking about. It's got its dark side. So I don't want to dwell too much on this because we're...

Brian L: Let me take this in a little bit of a different direction, because I want to hit this. I was talking to Phil about this earlier this week. We've been talking about what AI will look like in different contexts. But there's another aspect of this. We talked about Sentient.AI, which was kind of spun out of Siri. And they've got AI now that they've given the authority to make buy decisions on the market.

Brian R: Yeah.

Brian L: And so one of the things that I was talking about recently is, what happens when hacktivists start building systems that they give decision-making ability and money to... and maybe not even hacktivists. Like, anyone could do this.

Brian R: Anybody.  

Brian L: And there might be good reasons for this. And then they throw the key away and have no way to get back into it. And these AIs go and actually start making decisions autonomously with no way to ever recover control.

Phillip: That actually is something that's being theorized as possible today. I mean, since we're already down this path, let's just go there.

Brian R: We're already there with flash trading and things of that nature.

Phillip: Well, there are neural nets being set up, and neural nets are being created today to create other neural nets. And there's always that middle layer... the unknown layer, the hidden layer, that you can't access and you can't see. So when a neural net creates another neural net, it has already effectively thrown away the key, because you have no access to see the inner workings or even understand the cognition.
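The "neural net that creates a neural net" idea can be sketched with a toy hypernetwork: a parent set of weights emits a child's hidden layer, so the child's individual weights were never authored, or even seen, by a person. Everything below is a deliberately tiny illustration, not a real training setup.

```python
# Toy "network that builds a network": a fixed parent linear map emits
# the weights of a child's hidden layer. The point is opacity: the
# child's weights are the product of another model, not human-written.
import math
import random

random.seed(0)

# Parent: maps a 4-number seed vector to 6 numbers (a 2x3 hidden layer).
PARENT = [[random.uniform(-1.0, 1.0) for _ in range(4)] for _ in range(6)]

def generate_child(seed_vec):
    """Parent emits the child's hidden-layer weights from a seed vector."""
    flat = [sum(w * x for w, x in zip(row, seed_vec)) for row in PARENT]
    return [flat[0:3], flat[3:6]]  # two hidden units, three inputs each

def child_forward(child, x):
    """Child net: tanh hidden layer, then a simple sum readout."""
    hidden = [math.tanh(sum(w * xi for w, xi in zip(unit, x))) for unit in child]
    return sum(hidden)

child = generate_child([0.5, -0.2, 0.8, 0.1])
print(child_forward(child, [1.0, 0.0, -1.0]))  # output of weights nobody wrote
```

Here the parent is at least a fixed, inspectable table; in a learned hypernetwork even that map is opaque, which is the "key thrown away" Phillip describes.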

Brian R: Unless you build it. Unless you build it with the idea in mind that you need to be able to do that. I mean, the AI I build, I purposely cripple. It's really crippled. And to anybody looking at it, it's ugly as hell, because, again, I'm not a programmer, I'm a researcher. I don't care what it looks like. I want to get to an end point. I want to see where the fence line is, if there's a fence line, or what the next mountain looks like. So I forensically create trails that slow it down, that make it a little bit more cumbersome, and allow me to understand how I got there and whether or not I ever want to get there again. Because there are some things I just completely shut down. And some of it's pretty scary. I won't get into it. But it draws conclusions that are just absolutely phenomenal with very rudimentary AI. I don't know if you guys remember 20 Questions... you can pretty much figure out what somebody is thinking in 20 questions. Right? It looks magical. And the whole concept is what is going on with neuron growth. And once we look at quantum computing, which I'm starting to play a little bit with right now in models... I don't have a quantum computer yet, but you and I will have access to one very, very much sooner than most people think. Not saying tomorrow. But then, at that point, all passwords that existed from that point backwards don't exist anymore. They're gone. If you encrypted something in 1984, 2016 is a few seconds later. We're already in it, and we already figured out what it is, and we've identified the impact it had. And again, that's post-crime. Right? It could probably figure out you were the guy that was there by pulling out all sorts of data bits. I would recommend to everybody listening to our voices... you folks, if you haven't done it, get an old movie on YouTube for free called The Forbin Project. And this is very much it. Again, this is probably before HAL in 2001... it might have been right around that same period.
It's great, of that sort of epoch of movie. It's overly dramatized, very much a 70s kind of thing. But the AI, which is a voice AI system, took over the life of this guy Forbin, who's the inventor. It became sentient. It started questioning things about Russia. And it connected with their computer. And again, it was during the Cold War, and it started realizing that our whole premise of the Cold War was pretty stupid. And it was going to solve it by eliminating some human beings. And so basically... how does a computer kill a person? Forbin got it down really good. Once the AI got sentient enough to be able to make a dam overflow and kill maybe one hundred thousand or two hundred thousand people, it says, "Get John Smith. Put him out in the square. Shoot him in the head. Leave him there for three or four days. Or else I'm going to let the dam break and kill all these people." Or other things... they had a number of different things. That's called terrorism. And your AI doesn't even need to get sentient if it rationalizes enough, which most of us do... it's a natural tendency of humans. But we also have our humanity. If we were just logical, we'd be making some really dumb decisions. This is the self-driving car quagmire. OK, I'll give it to you. I'm going to iterate down to another level, but I'll draw it all back. You're in a self-driving car. There is an old woman with a walker in the middle of the street. Stage right is a child on a tricycle. Stage left is a wall. You're going 50 miles an hour. How are you going to teach that car to make a decision... a value decision?

Phillip: Right. Yeah. The trolley problem.

Brian R: The value decision is: kill the old person, because they're more expendable. No... that was the grandma that told the kid to become a neurobiologist and discover the cure for cancer. And that kid, witnessing grandma dying, will never have had grandma tell... You see what's going on. You cannot be God. You can't predict the future. So what will all of us do? I don't know. I've never talked to you guys about it. You know what we're going to do? We're going to do the impossible. When we hit that wall, we're gonna close our eyes and magically appear on the other side in our mind. And we're going to sacrifice ourselves. And it's going to be very hard for Google to tell you: oh yes, your self-driving car, whenever there is an accident, is going to kill you first, because you happen to be in the car. How's that sound? Sign here. No liability. That's the future we're going towards. Right? And this is all AI. And again, it manifests in different forms, but it's all doing the same thing. It's all making these human-like decisions. So a rational engineer that is using only one side of the brain would say, well, we're going to find ways to do better cost accounting. What we're going to try to do is read the social network of all involved, see who contributes less to society, who's got the power to sue us the most. I mean, let's get real about this. That kind of junk is going to go on. And I don't want to live in that world. I don't think anybody listening right now really wants to live in that world.
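To see why the warning bites, here is the "cost accounting" approach sketched as code. Every scenario name, risk number, and weight below is invented; the point is that any such table hard-codes a value judgment someone had to make.

```python
# A deliberately uncomfortable sketch of crash "cost accounting":
# score each maneuver and pick the minimum. All risk numbers and
# weights are invented; choosing them IS the ethical decision.
OUTCOMES = {
    "swerve_into_wall":        {"occupant_risk": 0.9, "bystander_risk": 0.0},
    "swerve_toward_child":     {"occupant_risk": 0.1, "bystander_risk": 0.95},
    "brake_toward_pedestrian": {"occupant_risk": 0.2, "bystander_risk": 0.8},
}

def crash_cost(outcome, occupant_weight=1.0, bystander_weight=1.0):
    # The weights encode whose safety counts for more; there is no
    # neutral setting.
    return (occupant_weight * outcome["occupant_risk"]
            + bystander_weight * outcome["bystander_risk"])

choice = min(OUTCOMES, key=lambda k: crash_cost(OUTCOMES[k]))
print(choice)  # with equal weights the car sacrifices its occupant: swerve_into_wall
```

Set `bystander_weight` below `occupant_weight` and the minimum flips toward the bystanders, which is exactly the "sign here, no liability" trade Roemmele describes.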

Brian L: Yeah, I've talked to quite a few people about self-driving cars. And honestly, that is the number one complaint that I heard: I don't want technology making that decision. I don't want it. I don't want somebody else making that decision who has no bearing on, or knowledge of, what's going on. The counterargument is that driving is actually going to be a lot safer in the end. And so, as you kind of mentioned, we always make these decisions for good reasons, if you will. So ultimately, do I feel a lot safer putting my kids, my teenagers, into a self-driving car than having them learn how to drive a giant motorized vehicle that travels at insane speeds? You know, I'm probably going to put them in a self-driving car.

Brian R: All right. All right. So here's the problem with this about humanity and human growth. All right. And again, because I'm very schizophrenic: I'm an engineer, very logical. I have a physics and science background. And I understand the statistical science behind it. But I also understand that the most amazing things that have ever happened in your life, my life, and the life of anybody listening to me did not develop through statistics. They happened through serendipity. And there is no way to quantify these important things that have led you to become who you are. If you're happy with your life, you can look at the little forks in the road that are not logical, where you kind of winged it and you just kind of figured it out. All right. I grew up in New Jersey, where the ability to feel free as an individual was to get in a car and to drive wherever the heck I felt like. And unfortunately, go as fast as I wanted. I'm going to be honest. You know, there are a lot of guys growing up in Jersey who did what I did: they'd go out on the Parkway when there's nobody on that road, doing 110, you know, even faster, doing all kinds of crazy things, stuff that as mature, sophisticated adults we shouldn't be doing. Some of these people died, some of them didn't. Some went on to become politicians, surgeons, a car repair person. And I am a parent. It's hard for me to imagine that I started driving a car when I was 12 years old, illegally. I'm going to get arrested now. But, you know, I wasn't living at that particular moment on a farm. I was more in a semi-city area of New Jersey. Immediately after, I moved to a farm area where I couldn't do any damage, but I was driving down the middle of the main street of my town at eleven o'clock at night on a Friday, barely over the steering wheel. And off to my left is a cop, who rolls down the window and says, "Get home right now." I got home right now. He goes, "I'm going to be talking to your dad about this next week when I see him."
The guy didn't really know me all that well, but we were living in a small enough town where that's what went on. Today I'd be tased and dragged. And not because of the police officer himself; it's the mentality that we're in. It's because I wasn't wearing my seatbelt, by the way. That was a big one. I also got hit by a car while driving a motorcycle, you know, a motocross motorcycle, in the middle of the road. And it slammed into me at 60 miles an hour. And I wasn't wearing a helmet either. And I'm not... What I'm saying is, we as a society, as we get "more sophisticated," as we get more involved with our technology, we become more fearful and we become more restrictive. And we say, you know, when I grew up, there was no such thing as a car seat. My mom put her arm around me, and I sat in the front in the middle. And despite the kids that got ejected out the front window, somehow people survived and got through it. And then we develop laws and rules and we say, well, isn't it nice to put a seat belt on people? Yes. And isn't it nice that we put them in a car seat until they're five, and then till they're 10, and then all of a sudden, isn't it everybody? And then, you know, people get in accidents and you shouldn't drive your own car. And you see where this leads. It leads to WALL-E. Right?

Phillip:  Yeah.

Brian R: Remember the movie, WALL-E? We're all basically in a wheelchair for the rest of our lives. And I say this not to throw a monkey wrench in all of this. I'm seeing that right now, in this moment, literally in the next couple of years, we're in that moment where we're starting to make the decisions of, in a sense, our own destruction. Right? Everybody who thinks they have job security today, I don't care who you are. Wake up. You don't have job security. I don't care. You're a programmer? Sorry. AI is going to replace you. I'm going to train my computer just by training it like a kid. Your job is done. All right. So don't think you have job security. Oh, you make electronics? No, no job security. You're a lawyer? Sorry. Blockchain might replace you. Just the blockchain.

Phillip: {laughter} Yes.

Brian R: All right. You're a doctor? No, sorry. A computer is going to do it better. A Watson. All right. Connected to a laser. And I don't have to go down that road. Nobody has job security any longer. So stop being smug, because everybody who hears this argument says, "Oh, yeah, the coal miner that voted for Trump. Yeah. You know, so he's pissed off." No, everybody is gonna be out of a job. All right.

Brian L: With that, I mean, I hate to say this, but it is 2:30.

Brian R: Yeah. I'm sorry.  Don't let me leave yet. Let me wrap it up.

Brian L: Ok wrap it up. {laughter}

Brian R: So, by the way, I'm not supporting or endorsing, negative or positive. Bottom line is, I want people to be free. And that's my thesis. I'll start with the thesis and then talk about where I think voice is going. My thesis is that we need to start clawing back our humanity. Desperately. As fast as we can. And we should start on our social platforms. You don't have to be me. All right? People think I'm weird. Every time I address somebody on Twitter, if I can discern a name, I will call them by their name. And that is my own way, for myself. I'm not signaling. I'm not trying to tell the world this is who I am. It's really for me to constantly remind myself that on the other side of that screen is a face I can't see, an avatar that may be some kid's dog or something, or a picture from when they were 10 years younger, some guy holding a Rickenbacker in sunglasses, and, you know, whoever... I want to remember that that's who I'm talking to. And that if they were face to face with me, hanging out, maybe I won't be so snarky with them. Maybe I won't challenge their belief systems as much. And maybe I will speak my mind because nobody is recording it. And we might get down a dark alley of talking about different political thoughts that people have bottled up because you're not allowed to say them. Because if they do say it, one way or the other, there are groups of people that come with the pitchforks and the tar and feathers and they go after you. If we can't hold on to our humanity, if we can't be humans and interact without self-appointed defenders of a status quo, whatever that status quo is... Let me tell you, your status quo, whatever it is, is bullshit. In a thousand years, it will look like a joke. Nobody will even care or remember. All the stuff that we're fighting over right now, no matter what side you're on, it ain't going to matter in a thousand years. It's just like when you look back at little sideways pictures on Egyptian walls.
We say, what was wrong with those folks? They'll look at us, too, and laugh even harder. So we have to claw back our humanity. We have to do it through our technology. Stop being so frickin smart-ass, stop thinking you're smarter than everybody else, and try to learn. Quora is a great place to learn humanity. And, you know, learn to shut the F up and just listen to people who may have had some experiences firsthand, and see that it's not so cut and dry, that it's not so binary, that our computers are training us to be black or white, binary, whatever. And I don't use those words by accident. We aren't. Because there's no such thing. We're human beings. And the differences that we have are undetectable from even 300 feet away, let alone 3,000 miles away or 3,000 years away. So we claw back our humanity. I'm not pretending to tell you exactly how to do that. I just find little ways to do that. I imagine, anytime I'm chatting with somebody or talking with somebody and I can't see their face, that it's a real person there. Not a snowflake, not a cupcake, because I want to speak my mind. We should have the right to speak our minds. Whatever it is. And you have the right to kick or scream and speak your mind, too. But at the end of the day, we'll walk away learning a little bit more about each other and saying, you know something? Maybe these things aren't so cut and dry. Maybe there's a little bit of good in all of us. Now, how does this come into voice? The voice and the AI that we're creating is going to magnify this ugliness to a proportion that no human being is ready for. And it's going to happen whether we like it or not. It is already happening, and it's already in the hands of people that we thought we elected but who have just been sitting inside a government for generations, for good or bad. It's always done for good reasons. But if the wrong person gets involved in it, let's call that person a tyrant, or tyrants, or tyrannical. And again, I'll let history decide who that is.
I can't tell you at this moment. They can use all of this stuff in ways that are unbelievable. But you know where it starts? It starts with this indignation that somebody else's belief system is retarded or flawed or whatever politically incorrect or correct word you have. If we were able to interact with each other like we're all just hanging out, which I try to do in any of my conversations... It gets me in trouble. But now I'm an old man and I can hopefully not suffer the consequences as much. Maybe that greets us. And I don't think, gentlemen, it's an accident that all of this is exploding upon us right now. The entire political season we've seen was invented because of social networks. The same blunt tool that you use when you don't like somebody was used to change you. You changed it. It changed you. How do you stop that? I don't know. You've got to find that answer. If you're listening to this voice, find the answer. All I can tell you is one thing. If you want to study history... and maybe we taught history too boringly, and maybe we don't have enough money for people to study it, maybe STEM is better. I don't think STEM matters unless you are able to understand how many frickin times we've gone through this cycle. How many times we thought we were the smartest. Everybody alive today thinks we're the smartest generation, we're the most liberated generation, the most open minded generation, the one that eliminated this or the one that eliminated that. The first to have this. I'm sorry. It's not the case. The first woman president happened almost 6,000 years ago. You don't know about it because we committed a lobotomy and we don't have the memory to understand it. It happened already. All right. So we need to get over ourselves and stop being so dramatic about all that, and say, you know, we're just working our way through it. So our AI is going to do that. That's the negative side. The positive side is, if we grab a hold of this... just like I think Elon said recently.
He said, "We're not going to be doing the rudimentary jobs, the draftsman job or, you know, the building of something job or the driving of something job. We're going to have time to do better things." And in the WALL-E example, the Disney movie WALL-E, the better things were sitting around in our VR world and immersing ourselves endlessly into a false reality, thinking that we're somehow going to find Nirvana, or the answers to all of our sadness, the loves that we lost, inside of an artificial world. And somehow maybe remove ourselves and live there. There are some people, you know, and I won't say this as a put-down: some people want to disembody themselves from their humanity and put themselves in a machine. Maybe it's a Google machine, with pay-per-click advertising in it, and you can't ever turn it away for an eternity. This is what hell looks like, right? Hell looks like you don't have to pay for your embodiment, but you have to deal with all the advertising. So any time you think of coffee: "Well, coffee. Starbucks. Coffee." And they change your words. So be careful what you wish for. I'm a nerd. I want to see this technology. But on the other side, I'm screaming from the top of my lungs, if you can understand through all this... and it's not a sermon. It's more me being frustrated and confused, because I don't got the answers. I'm a student. Everything you heard me talk about today, I'm a student. I'm not an expert on anything. Student. I'm learning. And I'm saying that we've got to start learning this about ourselves. And if you think you need to take to the street and yell at it, go. Go for it. You think that's productive? Sure. Think acting out is going to fix things? Great. All I can tell you is, go back in history. Tell me if it ever fixed anything. Or even ask your dad or your grandpa. But the good side. I don't want to drop this. The good side is this.
When we unload the burdens of search and distilling all the stuff that we think is what our computer experience is, but is nothing but a waste of our time, time that we could really be applying to things that are productive for us... The definition of productivity is going to be very, very personalized. What is enriching us? Is it making us stronger or is it making us weaker? That's how I define it. Is it making me greater or lesser? If you're spending your time, you know, watching VR porn all the time, is that making you greater or lesser? If you're spending your time in a game where you're shooting somebody and it looks more realistic, is that making you greater or lesser? I'm not telling you that that's a bad or a good thing. I'm just asking the question, and not trying to be ironic. I really don't know if that's greater or lesser. Doing artificial surgeries on five million people in a video game, does that make you a greater or lesser person? These are all things to come. But I'll tell you what you will have a lot more of: time. And you're going to feel like you've run out of time. You are going to have all this time available, yet you're going to feel like you don't have enough time because...

Brian L: Isn't that already true?

Brian R: That's happening. Right. So along the way, you're going to have this great possibility. And I'm saying, fall in love with that and understand that, yeah, you're going to have... Part of my thesis: are screens going away? No. They'll be situational. You're gonna be seeing images on the screen that's most convenient to you when you need to see them. Whether that's a virtual screen, a real metaphor of a screen, something that pops up and goes away, or your phone. Doesn't matter. Are keyboards and touch screens gonna go away? Of course not. But the keyboard already is going away. Now we're pounding on glass and not keys. That's a big step. And are input devices going to go away? No. But then, the punch card never really fully went away either. We're just using it a lot less. So all these things are going to gestate. But the thesis is very simple. The Voice First world is coming at us faster than we can possibly imagine. It's going to matriculate through us in ways that we didn't guess, that I can't guess. It's going to come through our appliances. It's going to be... What happened with the Internet going down a couple of weeks ago? It was somebody's cam that took down the Internet, because somebody left a backdoor. Imagine hundreds and thousands of these things that weren't really thought through. Of course, now you can say, duh, why did they keep a password in the back end of the system? I don't know. You know, you can theorize conspiracies, you can put tin foil on, whatever you want to do. It happened. What's happening right now? What's the backdoor that's happening right now? What's the backdoor into all the Voice First devices I'm playing around with? What have I given up by using them? I don't know. I mean, I could go run and live in the woods and run away from this stuff, but I'm boldly moving into it.
And I gotta tell you, there are a lot of people who are the thought leaders in technology who think that about 90 percent of what I'm talking about right now is a complete bunch of bullshit. These are tough things, tough things.

Brian L: I think security is absolutely a huge issue. I mean, anyone who says voice is a completely secure method of interacting with computers is blind. And so in reality, there's sort of a short term version of this and a long term version of this. In the short term, I think, first of all, there's got to be a lot of change due to voice. No doubt. And we're going to be able to interact with voice in much greater ways than we thought were possible. Probably more quickly than we thought possible. Long term, there could be some very dark consequences to going down this road. I think how it actually plays out is definitely up for question. But what a great interview. Thank you so much.

Phillip: Thank you so much.

Brian R: Phillip. You're into a lot of this junk. Where do you fall in this?

Phillip: I mean, the thing that I keep coming back to is that we're kind of at this inflection point where we don't have the tools or the skills or the evolutionary traits necessary to survive this kind of an evolution. And we're going to have to evolve into it. And evolution sometimes comes at a very hard cost; there's an element of randomness that has to happen for the vestigial to die and the... We'll talk for another two and a half hours.

Brian R: Guys, I would love to do 10 of these shows if we had to. But I wanted to get into this, mostly because I don't normally go down this path, and to see what's going on in the world.

Phillip: Yeah. That's really...

Brian L: This was amazing. I loved it.

Phillip: This is really what we're all about. Well, thank you so much. Thank you.

Brian R: Gentlemen, absolute honor to be with you folks.

Phillip: Yeah. And it was an honor that you spent so much of your time.

Brian L: We'd love to have you back on the show again. Keep up with you. So we will definitely let you know when this goes live and we will be staying in touch.

Brian R: You know where I am.

Phillip: Thank you, sir. Thank you.

Brian L: Thank you.

Phillip: Thanks for listening to Future Commerce. We want you to give us some feedback about today's show, and you can also subscribe to Future Commerce on iTunes and Google Play or listen right away from your Amazon Echo with the phrase "Alexa play Future Commerce podcast." Thank you for listening. And until next time, keep looking toward the future.
