Harpreet: [00:00:06] What's up, everybody? Welcome, welcome to the Artists of Data Science Happy Hour. It is Friday, August 27th, and this is happy hour number forty-seven. Just a few more weeks until we hit that one-year mark, and I can't believe how awesome the community has grown in just under a year. Hopefully you guys got a chance to check me out on the great and powerful Ken Jee's podcast, Ken's Nearest Neighbors. I was on that and had a great conversation with Ken, so please do tune in and let me know what you think. Hopefully you also got a chance to tune into the episode I did with Jeffrey Lee, released earlier today. If you have not yet listened to it, I think you'll enjoy the opening sizzler: Jeff drops a freestyle rap right there, with a pretty sick beat in the background, so hopefully you have a chance to tune in to that.

Harpreet: [00:00:58] And I'm excited to have all you guys here. So here's the question I want to open up with. This is kind of inspired by one of my friends, Shantha, friend of the show, friend of the happy hours. She made a post earlier today talking about how she had made a blunder with respect to interpreting a histogram during a meeting, during a presentation. And it was funny, because just a few minutes prior to that, I had made a blunder as well. I was doing a bit of prospecting, as it were, trying to get, you know, some people on the podcast for paid sponsorships. And I've got a template that I work with, and in this template, where I want to put the person's name, it will have "[name]" in brackets. And there is more than one email that I sent out where I forgot to change the bracketed name to the actual person's name. So I'm wondering: what's something that's just super derpy that you've done this week? And what was your reaction to [00:02:00] it? What has your reaction to your derpiness been like?

Harpreet: [00:02:05] Let's go to Eric for this one.

Eric: [00:02:09] Well, I guess I'll second that. Actually, just earlier today I was working on a dashboard to deliver, and I met with the stakeholder; I was super stoked, ready to go. And he looks at it and he's like, on the third row it says zero minus fifty is thirty-one. I was like, well, crap. So clearly there's a glaring error of basic arithmetic, where I had just forgotten to swap one field in Tableau for another when I recalculated it. And fortunately, you know, my reaction was just: oh, crap, all right, well, we're digging into it. And I just kind of owned that I felt kind of stupid, and then just moved along with it. I think when I own feeling like a goofball, people just kind of roll with it, you know?

Harpreet: [00:03:02] Yeah, I like that kind of reaction to this: yeah, it happens. So hopefully some of those people that I was prospecting will still consider my proposals. Antonio, what about you? Something derpy that you've done this week?

Antonio: [00:03:20] I'm familiar, very familiar, with the situation that you're in, where I tried this, like, automation. I was going down a list of people I wanted to connect with.
Antonio: [00:03:36] I'm going through this, and it's going to take me like thirty minutes just to go through this list, so why not just automate this thing? And the same thing happened to me: it says, hey, [first name], and I left in the brackets, I left in the parentheses. So now people are accepting my LinkedIn requests, and it feels so stupid, because it says "first name" in parentheses. I mean, people are being nice; they're not [00:04:00] saying anything about it. But I feel so bad. Sometimes people just laugh, and I just continue the conversation like it never happened, like, hey, you know, because I do genuinely want to meet these people; it was just very time consuming. So I just continue the conversation: yeah, like you told me about wanting to get into data science and things like that. So I definitely, definitely can relate to you.

Antonio: [00:04:28] The other one I think I have, and it wasn't this week, it was more from when I started out: I was working in fraud analytics. This always makes me laugh, and I think it's a good lesson. I was doing some kind of fraud work, stopping some kind of fraudulent emails, and I did analytics on it; I forget what the number was now. Based on my calculations, I went up to the senior manager, all cocky and stuff, you know, just fresh out of college. I was like, I just saved you a hundred thousand dollars, and I've been here, like, a month. I've been working on this for, you know... compared to how much you're paying me, I'm doing this company a huge, you know, deal.

Harpreet: [00:05:17] And he's like, impressive.

Antonio: [00:05:19] He's like, let me look at the numbers. And I had thought that for every email I was stopping, I was saving him like 0.04 cents. Well, it turned out that for every email I was stopping, I was actually saving like $0.00000004. So when we ended up calculating it correctly, the project I worked on for one month, that I got so cocky about, ended up saving the company about four dollars.

Antonio: [00:05:53] Yeah. So he was like, yeah, thank you, Antonio, I'm going to go buy myself a Starbucks coffee now with all this money you saved us. [00:06:00] So since then, I just keep quiet. I don't take credit, I don't blame.

Harpreet: [00:06:05] You know, that's a good lesson, because you go in all like, yeah, I'm saving you all this money.

Antonio: [00:06:11] Yeah, especially when I was coming out of college, I thought, like, oh my God, I'm two months into data and killing it.

Harpreet: [00:06:19] Russell, what about you? And for everybody joining in on the live: thank you so much for joining. And everybody just coming into the Zoom room here, thanks for coming into the Zoom room.
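Antonio's decimal slip is easy to reproduce with a quick back-of-the-envelope check in Python. The per-email figures and the email volume below are made up purely to recreate the order-of-magnitude gap in his story, since the episode doesn't give the exact counts:

    # Sanity check on per-email savings claims before reporting them upward.
    # All figures are illustrative, not from the episode.

    emails_blocked = 100_000_000   # hypothetical volume of blocked emails

    claimed_per_email = 0.001      # dollars: what the first pass assumed
    actual_per_email = 0.00000004  # dollars: what the data actually said

    claimed_total = emails_blocked * claimed_per_email
    actual_total = emails_blocked * actual_per_email

    print(f"Claimed savings: ${claimed_total:,.2f}")  # Claimed savings: $100,000.00
    print(f"Actual savings:  ${actual_total:,.2f}")   # Actual savings:  $4.00

The point of the story survives any choice of numbers: a misplaced decimal, or a cents-versus-dollars mix-up, passes quietly through every step of an analysis and only shows itself when someone multiplies the per-unit figure back out to a total.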
Harpreet: [00:06:29] The question we're opening up with is: what's something that you've done that's been super derpy this week, and what was your reaction to it? I'd be so happy to have you guys share, so if you'd like to share, go ahead and let me know right there in the chat. And also, if you have questions, let me know wherever it is that you're watching, whether that's on YouTube or LinkedIn or right here in the Zoom room, and I'll add you to the queue.

Harpreet: [00:06:54] Let's go to Russell. Russell, are you able to speak? His audio might be going through some issues. How about Manny or Abe Diaz, want to share something derpy? Super, super derpy. What's derpy? Manny asks if derpy means, like, a silly mistake; yes, and he's asking if the last hour counts as part of the week, so that works as well.

Antonio: [00:07:22] In regards to, like, anything, or data related?

Harpreet: [00:07:24] Anything; disregard data. Like I was talking about earlier, I was trying to do some prospecting, so sending out emails to people, trying to see if they want to do some sponsorship for the podcast. And I have a template that I use, and instead of changing the bracketed name, I left it in, in more than one email. Yeah, as derpy as that is, that's quite derpy. What did I do this week? I always drop stuff and I'm clumsy, and my wife tells me not to break stuff. So that's like every [00:08:00] day, though. Manny, how about you?

Eric: [00:08:02] No, nothing that I can think of right now; I'm in transition between jobs right now. A few weeks ago, though, I did mess up a message.

Antonio: [00:08:15] LinkedIn, connecting with someone, and a messed-up introduction.

Eric: [00:08:19] And I don't think they noticed, but I know for sure I saw the mistake afterward, and I'm like, what an idiot.

Harpreet: [00:08:29] It happens, it happens. Russell, I think your audio is back on. It was a bit glitchy; I was not able to tell.

Russell: [00:08:37] Uh, yeah, I think it's great. Can you hear me? Yeah. Okay, good. Good to speak to you again. Last week I think I was wiped out for the entire session. So I don't think there's anything I've done this week, but there's a good one from my past that I remember whilst working in the office, you know, back in the days when we'd actually go into the office, working on a docking station. I always carried around my own cordless or wireless mouse and keyboard to use with my own setup, because we had a hot-desking system at the main office. I was working, and the screen was doing some real weird stuff. Couldn't figure out what was happening. And it turned out that there was still a keyboard plugged into the docking station, one that someone had stuck on top of a pedestal beneath the desk, and then someone had put a book on it. So the weight of that book was constantly pressing down on a key. A real stupid thing that we should have been able to diagnose in a few seconds, but three of us sat around for about five minutes trying to figure out what was going on before we realized. It was such a simple mistake.

Harpreet: [00:09:41] Look out for those. Eric, I see you've got another one. Eric, go for it.
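Since the unfilled bracket-name template has now come up twice, here is a minimal sketch of that failure mode using Python's standard-library string.Template (this is not anyone's actual outreach tooling). safe_substitute() quietly leaves a missing placeholder in the text, which is exactly how a bracketed name ships, while substitute() refuses and raises instead:

    from string import Template

    template = Template("Hi $name, I'd love to have you sponsor the podcast.")

    fields = {}  # oops: forgot to fill in the prospect's name

    # Silent mode: the placeholder goes out as-is, so the blunder ships.
    print(template.safe_substitute(fields))  # Hi $name, I'd love to have you ...

    # Loud mode: a missing field raises KeyError before anything is sent.
    try:
        print(template.substitute(fields))
    except KeyError as missing:
        print(f"Refusing to send: missing field {missing}")

The design lesson is the same one Eric lands on below about proofreading: prefer tooling that fails loudly on a missing field over tooling that silently ships the template.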
And if you guys have questions, go ahead and let me know right there in the chat, or in the comments section, wherever it is that you're joining us from. Everybody watching us on LinkedIn and on YouTube: I see you guys, I'm taking your questions, so let us know. Go for it, Eric.

Eric: [00:09:57] Yeah, so Manny's thing reminded me. I [00:10:00] once upon a time had a group of people come to my master's program to share some stuff about their company. One of them was a recruiter, and then some other people, like directors and things like that. And I really enjoyed it, took lots of good notes. And I messaged the recruiter afterwards, later, when I was interested in applying, you know, like a couple of months later or whatever. And this actually was for LendingTree, the company I work for. And earlier in the program we had used the Lending Club data set for a different project. And I wrote this nice note out to the recruiter, and I accidentally, in one spot, said Lending Club, and in the other part I said LendingTree. And he responded back, like, oh, did you mean Lending Club or LendingTree? And I just responded real quick, like, oh man, sorry, I meant LendingTree, that was stupid, blah, blah, blah. He never responded to me. Never. I got ghosted from there. But I got the job; ended up talking to somebody else and found my way in. But I felt so stupid about it. But I was like, yeah, whatever: if it's going to work out, it's going to work out. If not, then that's whatever, you know, that's his choice. But it just made me think, I guess that's why it's important to, one, proofread, but, two, keep multiple irons in the fire when talking to people, and just be a good human.

Harpreet: [00:11:19] Yeah, absolutely. I mean, there's something to be said about being on the receiving end of someone else's mistake, you know, and how compassionate or empathetic to that mistake you want to be. I guess in my case, where I'm prospecting people, trying to get them to pay money for my podcast, there's probably not as much leniency there. Let's go ahead and move on to some questions. I know that Antonio said he had a question lined up; I think Eric had a question lined up as well. But we'll start with Antonio's question; it's right here at the very top. Go for it, Antonio.

Antonio: [00:11:54] Sure. So I hope you guys don't get annoyed with me, because I know a lot of people get annoyed because it's [00:12:00] all over the place, I mean, especially if you use Twitter; I don't see it too much on LinkedIn. But how interested are you, personally, in the NFT space? And also, more relevant to these office hours: do you see any applications for it in the data field? I'd love to hear from multiple people, but I want to hear it from a data perspective. It's just such a big space right now; I want to see what you guys think.

Harpreet: [00:12:32] Yeah, let's go to Ben. Me personally, I find them fascinating, I find it interesting, and I don't think they're going anywhere, just because I think humans like collecting stuff; that's always going to be an aspect of it, just the human element. I don't know enough about them to comment on the data aspect of things.
I was, you know, intellectually curious about blockchain for a while, but I had to deprioritize it for other things. But let's go to Ben and see what he thinks.

Harpreet: [00:12:56] You might be having some audio issues. While Ben comes back, does anybody else have any thoughts on NFTs?

Eric: [00:13:05] Can you hear me now? OK, I had to push the on button on my microphone. So I have a project in Q4 that I'm hoping to get approved, where we would be producing an NFT just for fun, an AI-produced one, which other people have done, that we'd like to take to the next level as far as practical applications. I haven't put my hands deep enough into it to have a strong opinion. A lot of the stuff I do is more like a marketing stunt with a little magic of AI sprinkled on. Anyway, I'm curious what other people say.

Harpreet: [00:13:39] Same here, as far as it being a marketing stunt. How was that crazy expedition you went out on, the fly fishing?

Eric: [00:13:44] I'll post a video in the chat. So I went last Saturday, and the hike is twice as long as I told everyone, which I think is hilarious. And it's not moderate, it's strenuous, which I think is also hilarious. But [00:14:00] yeah, we're still going forward. We've got an awesome videographer, photographer; it's going to be an amazing shoot. And I'm doing testing this weekend, like technical testing on the doability of filming, you know, teaching an AI to predict if you're going to catch a fish.

Harpreet: [00:14:15] So, yeah, super excited to see what comes of that, because that sounds very, very fascinating.

Eric: [00:14:21] I'll post a link in the chat. I think when you guys see the footage of where we're actually going, you'll realize why this is insane.

Harpreet: [00:14:28] Anybody else have any thoughts on NFTs? Andrew, how about you, or Mikiko, or Manny, on NFTs, even with no relation to data, if anybody has any thoughts.

Andrew: [00:14:39] I do have a question for the group, maybe extending on what Ben just mentioned. So I was playing around with the VQGAN+CLIP notebooks, which have been really popular lately. And I was curious if anyone's thought of, or heard or seen anything in the space about, using it to customize experiences in games. Because it'd be pretty neat if, instead of getting, like, a platinum trophy in a PS4 game when I kick some alien ass in, like, Mass Effect or something, I get a trophy that's also an NFT. I was thinking originally of, like, customized art based on what drops you get in games and everything, from the CLIP side, but the trophy idea would be pretty cool, too.

Antonio: [00:15:25] Well, yeah. I was thinking about a use case for people who do courses online. Somebody can buy your course once, let's say for three hundred dollars, right? They use it for two weeks and they don't want it anymore; maybe they learned what they needed. Well, with an NFT, you can sell that course to somebody else. So that way, as the learner, you don't feel like, for example, well, I just dropped three hundred dollars on Harpreet's course and now I've lost the money. If somebody else wants it, they can buy it. And what's good for Harpreet: because it [00:16:00] worked for you, you can attach, let's say, a ten percent royalty on that.
So now, if your podcast starts becoming bigger, more people want it. Maybe you don't have too much time to mentor everybody, so now the only way for somebody to get a personal session with you is through this one NFT, and the price keeps going up as people keep selling it. Right? And so they're not feeling like they're just dropping money without anything in return; I mean, they're still learning something. But you are now making royalties forever, so you're kind of betting on yourself. That was one idea that came to mind, right? Because I know the artwork side is, yeah, a lot of hype, versus what could happen. But that's one of the ideas that came through my mind: same with selling your course. Or what if, like, Ben designed some kind of data set, a one-of-a-kind data set, and he doesn't want to give it all away for free, so he drops it as an NFT, and people basically compete for it, and then he keeps getting royalties for the next thirty years whenever somebody uses this NFT. You know, so I'm still learning about it, but I wanted to bring it up in case somebody else is interested in it. If not, we'll talk about it in another session; I don't want to stop the conversation.

Harpreet: [00:17:22] A friend showed me something earlier: Royal, the .io site. It's the same kind of thing, but for artists, like, getting away from labels. I just looked at it, and they're not starting until October. But it's pretty interesting: you get a royalty from, like, Mariah Carey's Christmas song. I don't know how much it is, but that was pretty cool. Yeah, that's pretty fascinating. I like that; that's an interesting use. It's almost like the used-textbook market, kind of like, you've done learning this thing, let's see if we can resell it. Where's Carlos when you need him? But Eric, it sounds like you were jumping in.

Eric: [00:17:57] One thing that I think is interesting is: the [00:18:00] NFTs are kind of a little bit of a solution in search of a problem at the moment, right? Because the only thing we know how to sell with them is the digital equivalent of baseball cards, at the moment. So I dropped a blog post from Not Boring, which I actually got from a post on Carlos's LinkedIn, where he talks about using NFTs as status symbols and stuff. And it's an interesting thing: it basically shows the current futility of the use of NFTs. However, the author also gets into a little bit of what the metaverse kind of thing is going to look like. And I personally try not to get too into the metaverse as a buzzword type thing, but I've been really interested in VR for a few years. And so I've been seeing the progression that's been made towards, like, Facebook's Horizon Workrooms and things, where you can actually see your keyboard even though you have your VR headset on, and things like that. Right?
So as we move towards a Ready Player One type world, where I can be in a digital space and I may have digital possessions that you cannot just digitally copy and paste into your digital space, then a non-fungible token will be the way you prove your ownership, I think. And so eventually it will matter. For now, it's just a tool that we don't know how to use, because the world doesn't quite have a use case ready for it yet. But I think that we will get there. For now, yeah, it's just speculation if you want to buy cartoon art or whatever. I do really think it will have a purpose once the rest of the technology that we will actually want to use kind of catches up.

Harpreet: [00:19:41] Yeah. I heard Ralph Lauren has gotten into this, you know, for polo gear. So I'll just stock up on some polo threads for my virtual self. But no, that's really, really fascinating. Really interesting, Eric; I'd really like to see that article Carlos mentioned, and I appreciate you dropping a link to it. Mikiko, go [00:20:00] for it.

Mikiko: [00:20:00] Yeah. I mean, I think I'm, what's the term, bearish on NFTs? I don't know. I've been following some, like, trader meme Instagram handles recently, and I'm like, oh, all these things that I used to know and now I don't. But they're really funny to poke fun at, like Excel sheets. Don't tell Dave Langer I said that, by the way. But they kind of tease out exactly what Eric was saying: they become more useful when you start talking about them in specific domains, and when you talk about them in the vision of what this future world looks like. Because I feel like NFTs are kind of this weird next step in the evolution of the various attempts to, you know, create virtual worlds, either through the Sims or other games; Second Life, for example, was another one. It's an extension of that sort of belief of, why don't we create this sort of unified world vision, where, exactly as Eric points out, proving possession of digital items becomes really important. And actually, even in the luxury space: for example, LVMH is this, you know, monster portfolio company that owns various brands, like, I think, Chanel, Dior, and others. They have some teams that are specifically looking at, well, how can we create, essentially, digital clothing? And when you first hear it, you're like, OK, that is definitely Emperor's New Clothes. Until you realize how much of the revenue is actually driven by influencers, and by social media, and through the production of digital media, at which point it's like, OK, it becomes really exciting. For now, some people could say, well, you could take, for example, that mesh of the clothing (so in this case the NFT would be the clothing mesh, with all the designs and all that), and you could screenshot it and then sort of Photoshop it onto something else.
It [00:22:00] becomes a lot more interesting when you start talking about the AR aspect of it, or even the fact that a lot of media nowadays is video, or sort of TikTok shorts, and you can't just copy stuff. And so it's one of these things where various industries are adopting it, and some people are like, yes, this is an exciting new technology, because we see a lot of, for example, business and fashion and art going digital. But then you also have a lot of people, like myself, where it's like, yeah, there's just one hundred percent no use case for it. And it's almost like the realm of art, where it's hard to put a dollar value on subjective taste until the artist dies, at which point it always goes up, like a hundred and fifty million at an auction at, like, Sotheby's or Christie's or whatever. Right. But yeah.

Harpreet: [00:22:48] I like that.

Antonio: [00:22:50] Mikiko, and like what Russell said, Harpreet: so you could use those NFTs slash smart contracts to verify, like, data ownership and things like that. Russell's dropping some of those comments in the chat; he's staying quiet, maybe due to his microphone issues.

Harpreet: [00:23:05] Yeah, I think that may have something to do with it. There are great comments in the chat, and you, dear viewer on LinkedIn, need to be part of this wonderful, scintillating conversation that's happening in the chat; join the Zoom room. A question from C.A. (I'm not sure if your microphone's ready as well), and he's got some interesting comments and questions. He says: where does blockchain end and NFTs begin? Do we talk about one and the other? I think NFTs are just an implementation of blockchain, just like a Zoom call would be an implementation of the Internet, I guess. I'm not sure if I'm saying that right, but I hope you kind of get what I mean. Ben, go for it.

Eric: [00:23:41] I was just laughing, because Mikiko said something important that I've thought about with these AIs: if you have, like, an AI artist in the future that's quite good, the art in the future becomes more interesting when you destroy the artist. So imagine you're training this system with all of this compute, you generate this piece of art, and [00:24:00] then you intentionally destroy the system. I don't know; that's made me laugh in the past, based on the point that you just brought up.

Andrew: [00:24:07] Future Banksy. AI Banksy.

Eric: [00:24:09] That's right. Watch me shred my AI.

Harpreet: [00:24:12] I mean, but can you actually ever fully destroy the AI if we still have the source code, right?

Eric: [00:24:17] Depends on the complexity, I think. Because if you haven't saved off those weights and you've just hashed it, you can. I don't know; I think you could come up with a scenario where it's just too hard to get to something similar, where it'd be too hard to guarantee that you had achieved an identical one. Yeah, there are too many possibilities.

Harpreet: [00:24:36] So, I'm still listening to Max Tegmark's book, Life 3.0. Fascinating, interesting book; I encourage all you guys to check it out. He just throws out a bunch of possible scenarios that could happen with AI in the future.
And something he was talking about was AIs having subjective experience. Right? So typically, when we think of an artist making art, he's making that art from some place; some subjective experience is pushing him to make that art. Is that the case with AIs? Like, will an AI emote teenage angst, painting this crazy artwork or making these, you know, crazy riffs on power chords? I don't know, man. Really interesting thing to think about. I mean, I can't wait for the future. Are there any other points on this topic of NFTs? I see a bunch of wonderful comments in the chat; I can't scroll through all of them. "Intersubjectivity with GANs": come off mute and talk to us about that. Okay, no, go ahead.

Antonio: [00:25:36] Sorry, I didn't know someone else was lined up.

Harpreet: [00:25:40] Yeah, I think that's Andrew's. Andrew, can you elaborate on this "intersubjectivity with GANs"?

Andrew: [00:25:44] I know you just said, like, subjective experiences with an AI. Then it feels like a GAN could be, like, a micro, like, intersubjective experience between, yeah, some sort of ensemble model. But take that VQGAN+CLIP thing again: that's almost like an intersubjective interpretation of two models working together. It's just an ensemble model. I could be overthinking it.

Harpreet: [00:26:09] So you say intersubjective, like, in the sense of how, like, Noah Harari talks about it in his book Sapiens, like how money, how NFTs, are really an intersubjective reality that we're all agreeing on?

Andrew: [00:26:23] Yeah, it's kind of like a mix of, like, Karl Popper and what you're talking about. Yeah.

Harpreet: [00:26:27] Interesting stuff to think about. Tom, you were saying something, so go for it.

Antonio: [00:26:32] Oh, it's just a real quick thought while we're talking about these NFTs, now that I know what we're talking about. I actually had a friend doing a startup with a concept like this for music. But I think the bigger picture, and I'm guessing this is on Ethereum when we talk this way, is that right? Because it seems like the better way to do it; well, if you don't know, that's fine. But I think this is one of those things where the elephant is there, but it's behind the bushes and no one can see it: the type of blockchain that has smart contracts and can do these kinds of things. This is just one example. I think we've not learned to exploit blockchain or smart contracts well enough. I think that's where this group should remain open as we talk about this over the coming months, years, or decades. Just thinking out loud: this is a very interesting topic, but what is the foundation that makes it possible? It's this new technology of chains and smart contracts.

Harpreet: [00:27:33] Yeah, definitely something I'm interested in. But again, I haven't dived deep into studying it; just having a conceptual understanding of the basics, I think, is enough to have some mental hooks, and maybe enough to just kind of think about how it could affect or impact
your work as a data scientist or machine learning practitioner. The next question is coming from Eric; I think, Eric, you said in a comment on today's post that you had a question [00:28:00] in mind. So let's go to you.

Eric: [00:28:02] So, yeah, I'm in the analytics part of the organization, and, you know, LendingTree has grown quite a bit recently. And I suspect this is a thing at other companies as well. I'm just curious to know what different people have experienced, whether you're on the more tenured side of it or you're the new person coming into the company, from, like, a SQL standpoint. Not, like, trying to get to know some code base of functional, scripted things, but just: what are your best practices for sharing starter material? Is it Confluence? Is it GitLab, GitHub? Is it kind of, like, to use a Star Wars metaphor, like a Jedi, where everybody has to build their own lightsaber? You know, does a hero kind of tell you some stuff and you build your own starter queries? Or, you know, what tools have you used, and what's been most successful? Yeah, just kind of open-ended on that.

Harpreet: [00:29:04] So, from my experience being, you know, a founding member on two data science teams: at one, we used Confluence, just a nice little wiki, and really started outlining stuff, outlining the work that we were doing with a focus on the future. Like, OK, here are some important things, here's where they live, here's what these things mean, and just documenting all of that. And then we'd have, like, a consensus: we would submit it on Confluence, and then somebody had to review it and approve it. At Price, I was doing everything inside Teams, so we had this really comprehensive wiki on Teams. And, being a founding member of a data science team, it was a lot of best practices: like, OK, this is how we're going to structure our repositories, this is how we name our files, this is how we're going to work through data science problems, there's going to be a pipeline, a workflow, and things like that. I'm not sure if I'm answering your question, Eric, but I'd love to hear from other folks. I [00:30:00] guess we could start with Antonio and then maybe go to Mikiko or Ben, and if anybody else would like to chime in on this, by all means let me know and I'll go ahead and add you to the queue. Oh, I did not notice Joe here. Joe, I'd love to hear from you as well. But let's start with Antonio.

Antonio: [00:30:16] Yeah, sure. So, where I was, what we did is we wanted to get the non-technical people on board, and we were already running T-SQL; right, so you're going to be running SQL Server or whatever it is. And we created, like, six lessons, from beginner to advanced, in PowerPoint slides. And the first one is basically like, OK: just what is a database, how to download SQL, how to get access to it at work. And then afterwards, we created trainings on all the basic tables that we used, like the customer tables or employee tables that we were working with. So it was kind of like a guided walkthrough, with real examples.
So it's like: hey, if you ever want to look up customer information, you can actually open SQL, do a SELECT * FROM that table, and look at it. And progressively we'd make different kinds of videos. And then we'd put those videos somewhere you can get at them, like on a Confluence page or on a shared network drive. Afterwards, every time somebody wanted to learn, we'd send them the videos and let them learn. And what we also did: at the end of each one, there'd be, like, practice problems if they want to practice. And once a week, on a Friday, we hosted office hours, so anybody who wasn't sure or had any questions, one hour a week, they could come in and ask questions. And it was pretty successful.

Harpreet: [00:31:49] So, Ben, Joe, or Mikiko, anyone want to chime in here? "I'll pass on this one." OK. Joe, were you able to hear the question? I'm not [00:32:00] sure when you joined in.

Joe: [00:32:01] Yeah. I mean, we're going to do this right now at our company, actually, leveling up people and trying to share knowledge and skills. I'd say videos are good: especially if you find yourself doing a workflow more than once, make a video so you can share it with people. I think showing people how to do stuff is a really good way to do it. I'm a big fan of putting stuff together in links as well, or documents with lots of links, and having people read through that their first day, so they can kind of level-set. Making sure everyone has common knowledge, I think, is the most important thing; there's nothing worse than starting off on the wrong foot, and everyone has different skill sets. I'm actually about to start with a client right now where our skill sets are all across the board, and that's going to be fun, trying to level everybody up to the same place. Looking forward to that. I'm joking.

Harpreet: [00:32:52] Yeah. Andrew, I see some great comments here; go for it.

Andrew: [00:32:57] Learning is generally messy, you know. It depends on the team and the audience; it's always about your audience. But generally, we've used wikis; sometimes that's been in Teams, sometimes it's been elsewhere. Lots of links, which you can gauge interest by, which is kind of key if you want to figure out which members of the team really want to skill up. And then we do a lot of stuff; I've been teaching a lot of people Python, so lots of sharing notebooks. You know, getting someone on their first Jupyter notebook is always like a rite of passage if they've just joined the team, just to make sure they can do it. And then live sessions and brown bags focused on particular topics. You know, again, there's nothing better than one-on-one, or a live session.

Harpreet: [00:33:44] Yeah, pair-programming sessions are always, always good. Eric, turning back over to you for any follow-ups.

Eric: [00:33:51] That's definitely really helpful. I particularly like being able to set it up, whether it's, like you said, a PowerPoint thing, or putting it into a video format, or [00:34:00] a little bit of both. Because I just think it's so valuable, even if I just record myself writing up a query and talking through stuff that I wouldn't necessarily think to teach.
But for somebody learning SQL, because that's how I learned it: when someone was like, oh, and this table would be connected to this because blah, blah, blah, I was like, wait, say that again, write that down. So I really like that it helps capture some of that institutional knowledge that maybe you wouldn't think of if you were just typing. So thank you very much.

Harpreet: [00:34:27] I was going to ask: what's a brown bag? Like lunch-and-learns, where you just bring your own lunch and learn?

Andrew: [00:34:36] And you actually get to eat. That's true; I think people forget to do that. Yeah.

Harpreet: [00:34:42] So, obviously, if there are any other questions or comments on any particular topic... I don't see anything coming in on LinkedIn or in the chat. So if anybody wants to ask a question, now's the time to do so. Ben, go for it.

Eric: [00:34:55] So, Harpreet, you mentioned something earlier that is a really fun topic, and that is experience-driven knowledge, with AI systems of the future. Anima from NVIDIA, we had her on our podcast; she's kind of an AGI expert, and she's a professor at Caltech, I think. And I asked her if we'll be able to understand these systems of the future, you know, because people on the call are like, well, we have storytelling with deep learning, gradients, we have all these great tools. And she said: absolutely not. And she leaned into this experience thing: the only way that you could understand a system like this would be if you could comprehend its experience through time. Which I think is a really fun discussion, because, anyone on the call, even if we happened to be clones of each other, I actually can't understand you. You know, I could guess how you might be feeling and thinking, but I would actually have to live your life, through your shoes. Which I think is fascinating. Anyway, it's just a fun topic of conversation, whether it's AI or, you know, just humans in general: experience-driven insights and [00:36:00] knowledge.

Antonio: [00:36:00] What Ben is saying is scientifically proven by a Star Trek: The Next Generation episode, where Riker is split in two, the two of them separated for an entire decade. And when the two of them met, they were totally different. So what Ben just said is perfectly well founded.

Harpreet: [00:36:19] It reminds me of a quote from, rather, when I was listening to this Max Tegmark book; he may have been quoting somebody else. But essentially he said, like, if a lion could talk, we just wouldn't be able to understand it. I think it's the same kind of thing: just the experience of it. It looks like Russell has a comment here. I'd love to hear from you, so go for it, Russell.

Russell: [00:36:40] So I kind of came in to loop back to Antonio's initial question about NFTs. Is it a good time to throw this back, or do you want to carry on with the current thread?

Harpreet: [00:36:51] Anybody? I mean, I'd love to talk about that, or does anybody want to talk about experience-based learning in AIs, the experience thing?

Eric: [00:37:01] Let's take Russell's point, because it's top of mind for Russell. Let's go back to that, and then if we fall down the rabbit hole, we can come back to this one.
Harpreet: [00:37:11] Yeah, I'm sure I'll fall down that rabbit hole, as will the rest of us. Go to your question, then; after that, we can go back down the rabbit hole. And if anybody has questions, whether you're watching on LinkedIn, YouTube, or you're here in the Zoom room with us, please let me know in the chat. Russell.

Russell: [00:37:28] I'll make it real quick and ugly, and this might generate some additional questions related to AI also. So, I put a comment in the channel a little while ago: I've just recently watched a documentary on the Wu-Tang Clan. It was quite a long documentary, but one of the big parts in it was Once Upon a Time in Shaolin, I think the album was called. It was this one-of-a-kind album that they put in this kind of beautifully polished metal case, and it ended up being bought by, [00:38:00] you know, someone we might not want to talk about; as I said in the chat, someone with dubious ethics and such. But the point I was making was: that could be done nowadays with an NFT; it needn't be a physical artifact. But if it was an NFT, do you think it would have sold for as much, do you think it would have generated as much interest, in this modern digital age? I tend to think that it might. And if it was an NFT, you'd still have the ownership of it, but people could still listen to it, much like they've done with the NFT artworks.

Harpreet: [00:38:32] I think it would, just because it's, like, super hyped right now. And if Wu-Tang, you know, did something like that, a publicity stunt like that, coupled with hyped technology, it would definitely blow up. I like that analogy; I think that's a good analogy. Any comments on this? I'd love to hear from folks.

Antonio: [00:38:47] It kind of loops... I hope this doesn't seem too far out, but it feels like it loops back around to the experiential thing. If you could somehow NFT a certain AI experience, it's a giant model, then you could transfer it to another agent. Yeah, I'd like to experience this AI now instead, and put on the shelf the one I had been running. It's quite fascinating, because then, if you think of it, OK, these different AIs have different experiences: we can open-source some of them and share them, or we can trade them, or check them out for a short time. All sorts of possibilities.

Harpreet: [00:39:27] Super fascinating.

Antonio: [00:39:30] The one thing, the Wu-Tang album: I don't think it matters whether it was an NFT or not, I think it still would have gone for that. How much is, like, a first-edition copy of a book worth, whether it's electronic or not? Yeah, you can have a million people who've read it, but it's the human thing, I guess social status or something: hey, I was the first one, I'm the only one who has this official first-release copy. So it's a fascinating [00:40:00] space.

Harpreet: [00:40:00] Any other comments on that? Tom, were you saying something? I just saw lips move, but I don't know if you're on mute or...

Eric: [00:40:08] No, sorry about that.
I'm financing a car for my son.

Harpreet: [00:40:12] Yeah, no problem. Anybody else want to continue the discussion on this NFT topic? Does not look like it; let's move on. If anybody has questions, go ahead and let me know. Does anybody want to comment and carry on the thread that Ben had unraveled there? Ben, talk to us a little bit more about that, and kind of reframe the question in a way that would be easier for a dumb person like me to understand.

Antonio: [00:40:36] Yes, me too. Yeah.

Eric: [00:40:40] Well, I guess, when you think of conscious AI, or sentience, or AGI, there's this continuous scale. So imagine I have a Roomba that is rewarded for searching for novelty. So it's kind of curious; it's trying to learn new categories. And it's stuff that we can all explain: you're like, OK, you're doing some data collection, you're doing some unsupervised deep learning, you have this novelty reward where you're measuring the Euclidean distance from things it's already seen, and it has this reward function. But you can imagine this thing existing in my home for a couple of months. If I drop something new, or my kids make a mess, there's some type of personification in the way it behaves. And the way that it behaves is really driven by its experience: the way that it goes hunting for novelty, the way that it's going to pay attention. Like, I put a new plant in my house, and now this little thing is very interested in this plant. But if I lived in a different universe, with a different, you know... there's so much stochasticness to it. Right. So I just think it's interesting that you can have this intermediate thing that feels alive, like, wow, look at this little thing, it's got a personality. But everyone on this call knows it doesn't; it's just kind of a parlor trick. So it's a parlor trick. Going from: I could make an AGI this year that would fool my parents, but that's, like, the bottom one percent of the tech population. That fraction will start going up, and then eventually it will get to, like, the 98th and 99th percentile, where now you have a lot of professionals in deep and heated debates, the top one percent of the AI community, arguing about whether or not this counts. So I know that's kind of a meandering thought, but I was just thinking: will AGI in the future just be a parlor trick?

Antonio: [00:42:20] Forgive me, I'm fascinated by what you're saying. And I get fixated on watching these latest, greatest magicians on America's and Britain's Got Talent. They're just mind-bogglingly good, like Shin Lim, and there's another one; they blow away David Copperfield and all those guys from the past. And I'm thinking: are we going to get that good at our arts, to where we fool people with AI? Because I agree with you, Ben: we're nowhere near AGI, and if we get to what looks like it, it's a parlor trick. It's not... I don't think it's real. Yeah, you put it on Penn and Teller and they'll see through it.

Eric: [00:43:04] But the one distinction that's different from the way people normally think about AI is: you really have to go back to how a child learns language. Right? A child learns through focus-based learning, and so a child can't learn a language listening to the radio. And as a parent, you hold up your kid, and within a few weeks they're eye-tracking you, and you celebrate as a parent; you're like, oh my gosh, they're tracking me, isn't this great? It's like, no: they're in full-on "I'm going to learn everything from you and everything that you're looking at" mode. And we don't teach them; we don't go to school to learn how to teach kids a language. Their brains are just so good at focus-based learning. So imagine a system in the future that is very much focused on you: you're walking through the home, you're doing different things, you're interacting with your coffee, and it's focused on you. So, yeah, I see Mikiko has her hand raised. I really want to hear what she has to say about this topic, or a different topic, this afternoon.
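Eric's curious Roomba maps onto a standard novelty-search reward: score each new observation by its Euclidean distance to everything the agent has seen so far. Below is a minimal sketch of that idea; the class name and the toy vectors are mine, and a real robot would score learned feature embeddings with a capped memory, not raw vectors in an ever-growing list:

    import numpy as np

    class NoveltyReward:
        """Reward observations that are far from anything seen before."""

        def __init__(self):
            self.seen = []  # memory of past observation vectors

        def score(self, obs):
            if not self.seen:
                reward = 1.0  # the very first observation is maximally novel
            else:
                # Euclidean distance to the nearest previously seen observation
                reward = min(np.linalg.norm(obs - s) for s in self.seen)
            self.seen.append(obs)
            return reward

    agent = NoveltyReward()
    living_room = np.array([0.2, 0.1, 0.9])
    print(agent.score(living_room))                # 1.0   -- first sight of anything
    print(agent.score(living_room + 0.01))         # ~0.02 -- same old room, low reward
    print(agent.score(np.array([5.0, 4.0, 0.0])))  # ~6.2  -- the new plant stands out

The "personality" Eric describes falls out of the memory term: two copies of this same agent that wander different homes accumulate different `seen` buffers, so they end up prizing different corners of the world, which is exactly his point about behavior being driven by experience.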
It's not I don't think it's real. Yeah, you put it on Penn and Teller Harpreet: [00:43:03] And they'll go, yeah. Eric: [00:43:04] But the one distinction that's different than the way people normally think about A.I. is you really have to go back to how the child learns language. Right. So like a child, it learns to refocus based learning. And so a child can't learn a language listening to the radio. Harpreet: [00:43:21] And as a parent, Eric: [00:43:22] You hold up your kid and within a few weeks they're eye tracking you and you celebrate as a parent, you're like, oh, my gosh, they're tracking me. Isn't this great? It's like, no, they're like in full on. I'm going to learn everything from you and everything that you're looking at. Harpreet: [00:43:34] And we don't teach them. Eric: [00:43:36] We don't go to school and how to teach kids a language. But they just their brains are so good at folks based learning. So I imagine a system in the Harpreet: [00:43:41] Future is very much focused on you. So you're walking Eric: [00:43:44] Through the home, you're doing different things, you're interacting with your coffee and it's focused. So it's. Yeah, I see. Mikiko has her Hendry's. I really want to Harpreet: [00:43:53] Hear what she has to Eric: [00:43:54] Say about this topic or a different topic in the afternoon. Mikiko: [00:43:58] Is it because I'm going to bring my liberal arts training [00:44:00] into it and start going like, well, you know, this will officer this philosopher said, but but that's the thing where like I feel like a lot of these questions about. So like one question I would almost ask is like if you had a Harpreet: [00:44:11] Guy like walking Mikiko: [00:44:12] Down the street in a slapped you in the face, would you even recognize it? Right. Like, would you even recognize it was Ajai? And then at that point, like would you even care? Like maybe you're someone who I don't once it gets the rocks off getting slapped, but maybe you're someone who the way you engage with it. Yeah, I like it. Does certain actions and behaviors. It lines up with with just what you would anticipate. So on how they would communicate with you and engage with you. And you're like, oh, yeah, that's that's a New Yorker right there. Yeah. From the east side, 100 percent. But then like let's say you come up against, let's say, someone, for example, who. Their persona is not that of someone born in New York and grown up in in Brooklyn, all that, but they maybe actually came from like San Francisco. And you're talking to them, you're like, well, and let's say you're a New Yorker and you're like, Harpreet: [00:45:03] Wow, these West Mikiko: [00:45:04] Coasters are really weird. They're almost like not even human, like they're so slow and they're so chilled, laid back. And so it's one of these things where, like, Harpreet: [00:45:13] I think at some point, Mikiko: [00:45:15] I don't know if people are almost going to care about whether or not there's a real human being on the other side, as long as they get something out of it, you know. And and it was interesting. Like there is a I think I need to track down this like blog post. But someone had done the study on children who had like Alexi's or series like in their room. Harpreet: [00:45:36] And I Mikiko: [00:45:36] Think at age three or four, like they thought that there was a Harpreet: [00:45:40] Person like in Mikiko: [00:45:41] The lexer Assiri, Harpreet: [00:45:44] Which is Mikiko: [00:45:44] Fine, like most kids think that I like I too. 
once believed Santa Claus came and left cookies on the counter, even though they were the same flavor as my dad's favorite cookies. I never figured that one out, you know. But in that same way, I feel like a lot of [00:46:00] these questions are so fundamental, because, for example, on the anthropology side: any time someone reports, like, oh, we've seen this macaque using a tool to dig stuff out of the ground, they're like, oh, is that culture? Is that learning? Is that intentional behavior? And then, if you start seeing two or three generations of monkeys or apes doing that, that's when we start arguing, like, OK, this is culture, this is a learned behavior, this is X, Y, Z, you know. And so, even within anthropology, when we look at living beings that exhibit different behaviors, we want, to some degree, to imbue their behaviors with a certain perspective, and to some degree that's based on incentives. Because if you report, for example, "this is a new learned behavior that shows culture in transmitting knowledge," then you get a paper published. Whereas if you're like, oh yeah, they kind of just started doing that and went with it because it was the main thing everyone was doing... that might not be intentional. Like, is it learning? Is it culture at that point? So, I mean, I don't know. And the part that I sort of wonder about is still this: let's say, for example, you're walking down the street and you get slapped by an AGI. Or maybe they don't do that; maybe they just hand you money. And then let's say, for example, there's a bill going around saying, OK, well, you know, we need to make sure that these AIs don't have rights. At that point, will you argue for, let's say, AIs having more rights if it means you get a better experience? Or do you kind of stick with, like, what is human? Sorry, I think about this a lot.

Harpreet: [00:47:40] Liberal arts background, right? I mean, how do we define intelligence? I guess that would be something to get on equal footing about, so we're all talking about the same thing. Right? So when we talk about intelligence: what is it that makes something intelligent?

Eric: [00:47:54] If you're able to acquire new knowledge through experience. OK, so there are [00:48:00] some very dumb things that are intelligent.
They just so they're saying the ability to learn acquire skills. That's kind of like the Boxford definition of of intelligence. So like I was watching a documentary on it's called Wild Sri Lanka yesterday, and they're talking about these giant sperm whales. These sperm whales are like Harpreet: [00:49:01] Floating around thousands Harpreet: [00:49:03] And thousands of Harpreet: [00:49:03] Miles. And these really old ass Harpreet: [00:49:05] Turtles that are just floating around. Are they intelligent? And I was asking myself this question is like, is that constitute intelligence? Are those things intelligent? Um, intelligence different from from sentience. I don't know if consciousness. Anyways, I don't think I could. Eric: [00:49:20] Oh, sorry, Antonie. Here you go. Antonio: [00:49:21] I'm going to save the life. Three zero that Harpreet Sahota about it goes like if guy is running the world, how can Harpreet: [00:49:31] We really know Antonio: [00:49:33] Or understand that they are going Harpreet: [00:49:34] To like they're going to create Antonio: [00:49:36] Some future that we won't even understand. So how can we be sure if it's good or bad for us, Harpreet: [00:49:42] Like it can have some kind Antonio: [00:49:43] Of a concept? So other bands that we're going to be like, well, I thought this was good for me and it might be good. US as humans. I don't know, I just think those things can be learned so fast in theory and like advanced so much that Harpreet: [00:50:00] We'll [00:50:00] never be able Antonio: [00:50:01] To comprehend to know. What does that become like? Harpreet: [00:50:04] I don't know now, Antonio: [00:50:05] Like you're driving me down and it's up to me as a religious, because you know how this like everything happens for a Harpreet: [00:50:11] Reason. And you say, you know, Antonio: [00:50:13] If your religious use like, OK, God like this, there's a reason why this Harpreet: [00:50:17] Happened. Well, then there's the Antonio: [00:50:18] Eib, come God that you understand. And everything that you're doing is like, I don't know the aiai God said this is should happen and it's happening. So do we still do we end up with another unknown, like, I guess, secret thing that we don't understand? I mean, and then you just blindly trust them because you trust them because you're like, well, this is Harpreet: [00:50:41] This is above Antonio: [00:50:43] Me, above me, and this is this is beyond me. Do you turn it into some kind of God? Eric: [00:50:50] There is a topic about a religion at exactly that point, Antonio, because the I in the future Harpreet: [00:50:55] Will be so complicated, Eric: [00:50:56] So complex. You will have a fraction of humans that align Harpreet: [00:51:00] In worship Eric: [00:51:01] That and I think there already is in the eye religion. I'm not part of it, by the way. Are pretest going to say real quick to intelligence, it's nice to think of that as a continuous scale. And so you look in the animal kingdom, the dumbest layer of intelligence is you're hungry. And they go above that and you can actually have fear like you're worried about something else hurting you. And if you go up through all those layers, I almost like thinking of the brain is like an onion or like, you know, these layers. So if you go up all those layers, you then get to look at dolphins and elephants and people where we are social creatures and we share we share experience. We have language, we have empathy, compassion. You know, a member of our tribe dies. 
We actually care. Whereas, I'd argue, a shark, which has a very small brain, doesn't give a crap. Dolphins and elephants will actually show grief, whereas, you know, imagine a shark watched its friend get cut up; it couldn't care less five minutes later. Not [00:52:00] that I know what a shark thinks, but you look at the size of the brain and the behavior. So if aliens came to Earth, they of course would be more intelligent than us. You could think of it like a two-dimensional scale: your understanding of space and time. So we humans hopefully understand time further than other animals do; we actually have conversations about where we're going to be in three years, or whether we're going to go to Mars. If aliens come here, they naturally should be more intelligent than us, because they've been able to make this jump. But it doesn't mean they're necessarily conscious. I don't know, I guess that's a really weird thing people could throw back: sometimes we think consciousness is a required checkmark for things that are very intelligent, but you have all of these different elements. Anyway. Harpreet: [00:52:41] Let's go to Andrew here. Andrew: [00:52:44] Just such a fascinating conversation. I want to go back to one of the earliest memories that I have, since I'm always thinking about what is intelligence, what is self-awareness, sentience, which I've thought a lot about because I'm a huge Star Trek fan. I think back, I was in a math class, I think in second, third, or fourth grade, and for the first time I remember realizing that the young woman who sat in the seat in front of me had thoughts in her head and had goals that were individually distinct from mine, and that they deserved respect just as much as my own actions and desires and outcomes and my own drives. And in that quality there's self-awareness, but then there's a leap to also empathy. And I don't want to go into the area of, like, human dignity, that's not quite it. But there's something special in the diverse perspective, the diverse self-awareness, that different individuals have. And that's the thing where I get into boundaries like sentience and self-awareness. I can kind of see those being translated [00:54:00] into the evolution of certain things that we're conceiving now. But the empathy concept, allowing another intelligence the deference that its actions are just as valid as mine, that's where I get into really complicated, moralistic territory that I don't understand. So I'd be interested in anybody's perspective. Harpreet: [00:54:24] Anyone want to take a stab at this? This is definitely an interesting conversation. Antonio: [00:54:28] I love it, Andrew. I don't think anyone in any AGI conversation I've heard has touched on: hey, will a group of AIs seek to form a community and benefit from one another? Now, that's next level. I think the empathy part, too: will they want to empathize with us, and will they appreciate that we brought them into existence, if we can even create them, by the way? But it's a fun thought experiment. Will they care about us as their creators, those that helped them maintain and get to where they're going?
Will they pay it forward and help humanity? You would hope so. Harpreet: [00:55:09] Yeah. What about perspective-taking? Would an artificial intelligence system be able to take the perspective of somebody else, to understand their goals? Russell, go for it. Russell, I believe you might be frozen. Russell: [00:55:24] Yeah, no, sorry, the lag there was just on my end. So I think Eric and I both put a couple of comments in there in response to Ben's initial statements, which you've discussed since. I said I think there are different levels of intelligence, and I would maybe classify some of the stuff Ben was talking about as instinctual intelligence, the stuff you're born with: move away from the cold, move away from loud noises, don't eat things that smell bad, that type of stuff. You don't really need to learn that; [00:56:00] most organic beings on this planet, animal or human, know those things. They're born with them. Then there's an autonomic intelligence, which is, you know, your brain making your heart beat, your lungs breathe, or rather your chest muscles move to make your lungs breathe, essentially. You don't need to think about it; your brain is built with that inbuilt intelligence so that you don't die, essentially. And then there's a kind of conscious intelligence, which is the stuff that you need to learn, that you need experience to build upon. The other two are there in your DNA, whereas the conscious level of intelligence needs to be experiential, needs to be learned. And that's going to be the stuff that's most difficult for AGI. So for, say, an AGI analog: the instinctual we can probably program into the model, and the autonomic, again, can be programmed into the model. But the experiential, I think, is going to be more challenging. And then, just touching on what Tom said most recently about empathy: I'd love to see that working in an AI, but I'm not hopeful. I mean, we might be able to, as Ben was saying, train something to precisely replicate an analog of empathy, but I don't think that would be true empathy. I don't think we'll be able to get a model that will empathize either with another model or with a human being. I'd love to see it, but I just don't think we're even close to that yet. Eric: [00:57:40] You could do a parlor trick, which is cheating, because you've intentionally demonstrated some type of emotional capability or empathy designed to convince an audience, and it's not real. You guys would see right through it. This is a terrible thought, but I love terrible thoughts, so I'll throw it out to the group. You can imagine this would be possible maybe in the next couple of years, to do a keynote talk. So imagine me on stage [00:58:00] giving a talk, and I'm saying: I have taught an AI to learn to speak English just through experience.
Eric: [00:58:10] And it's been able to learn 15 words. That feels possible, right? Just 15 words, I'm not talking about a hundred thousand words. And I'm on stage demonstrating this, and it just mimics things back: I say "dada," it says "dada"; I say "mama," it says "mama." And I explain to the audience how I taught the system, how I interacted with it in my kitchen over six months. When it would make the right sound, I would smile and say a lot of things to it, and when it would get close, I'd smile at it. And then I tell the audience: and this is what I do when it doesn't do what I want. I have a lever, and I pull the lever, and you hear this system squeal in pain. And I've announced to the audience that it knows these ten or 15 words, and the audience knows what they are, and what they see on the screen is that one of those words is not "please." And now you can imagine me in front of an audience, pushing this lever, and they hear the AI squealing, and then they hear it starting [00:59:00] to say "please," because it's trying to say "please stop." And you can imagine an audience in complete panic that there's something in this thing. But you guys would know that I had just engineered kind of an evil trick. And I like how you brought up empathy, Russell, because what's happening is the empathy of the audience rushing to save this thing in a box, which they don't know what it is, but they think it's trying to say "please." I don't know, is anyone else's brain that weird, that they're thinking up terrible tricks to panic an audience about AGI? I'm amused by the idea, but I know it's a terrible thing to do. Harpreet: [00:59:40] It's like the ghost-in-the-machine type of thing. I'll put up a link; I think it was Joscha Bach who gave that talk. Definitely check out the conversations he had on the Lex Fridman podcast, super interesting. The first and second ones are really, really good. So essentially you're teaching your system by punishing [01:00:00] it for not doing the right thing, which is essentially how machine learning works right now, at the base level. We don't really teach machines to do the right thing; we teach them what not to do by penalizing them. Eric: [01:00:11] You just don't hear the pains of the loss function. Yeah. But maybe that'd be a good thing to add to your deep learning models when you train them: the AI is in pure agony, and then, as it fixes that loss function, it's breathing a sigh of relief. That'd be a good way to train a model, right? Harpreet: [01:00:29] Can't the agony already be reflected in the heat being produced by a GPU as it's trying to calculate the right weights and biases? Eric: [01:00:37] Maybe. My brain is running through all of these terrible things I could do in my basement, with my kids like, "Dad, what's happening down there?" "Don't go down there, it'll take a few hours for the AI to settle down."
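[Aside: to make the loss-function point above concrete, here is a minimal, purely illustrative sketch, not anything built on the show. The toy data and one-weight model are invented for the example; the point is only that the model's sole feedback is a penalty (the loss) being driven downward.]

```python
# Toy illustration of "we teach them what not to do by penalizing them":
# a one-weight linear model fit with plain gradient descent on squared error.
# All data here is made up for the example.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + rng.normal(scale=0.1, size=100)  # "ground truth" slope is 3

w = 0.0   # the model's single weight, starting out wrong
lr = 0.1  # learning rate

for epoch in range(10):
    y_hat = w * x
    loss = np.mean((y_hat - y) ** 2)     # the "pain" signal
    grad = np.mean(2 * (y_hat - y) * x)  # which direction reduces the pain
    w -= lr * grad                       # step away from the penalty
    print(f"epoch {epoch}: loss={loss:.4f}  w={w:.3f}")

# The shrinking loss is the "sigh of relief" Eric jokes about: the model
# never learns what is right, only what gets penalized less.
```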
Andrew: [01:00:49] It's like, ah, ever read that dystopian story about the one child who... Mikiko: [01:00:58] Oh, my God, yes. That was my econ class in high school. Andrew: [01:01:01] "Omelas," the one with the child. Basically, you get utopia, but somewhere deep down in a basement there's one child suffering so everyone else can have utopia. And, you know, it's interesting: when you went down that line, when you said you had a parlor trick, I thought you were going to talk about engineering AI to have some sort of proclivity towards collaboration, so that empathy and engagement would be kind of helpful for its general directives. But you went dark. Eric: [01:01:33] That sort of stuff has me ruined for life. Mikiko: [01:01:36] Well, it's funny, because after a couple of years now of watching people do terrible things with, like, random forests, I really kind of wonder. I mean, seriously awful things; I'm like, you're doing this with some very simple models, essentially, and maybe ruining people's lives. So I kind of don't feel like we'll ever... OK, so I guess: if we created an AGI right now that was super altruistic and, you know, all these good things, yada yada yada, would it really actually be human [01:02:00] at that point? And maybe this is the cynic in me, but in order to create something that is human, I think we would have to include some pretty awful things in there, and I don't know if we really want to do that, or if it's meant to represent the best of what we could be. You know, even things like, and this is why I shouldn't have even gone on LinkedIn, with essentially what's going on in Afghanistan right now, people have very strong opinions on the many angles of that situation. And one of the most heart-rending comment threads I read: Airbnb had announced that they were going to support temporary shelter for some of the refugees, and I had seen this one comment thread that was basically, well, why should we help them, they deserve it, X, Y, Z. I'm like, OK, well, first off, you all should just admit that you're Islamophobic. Just admit it, just get it out there. Don't couch it in all this pretty little language that ultimately ends up being the same thing. But, you know, where would an AGI fall in that? I don't know. So to me, I'm kind of like, do we really want it to represent us in our current state? Because sometimes it's really beautiful, but sometimes it's really ugly, and there's a lot of ugliness in the world right now, especially right now. This year, these last few years, there is so much ugliness. So I don't know if we really want it to represent who we are right now; maybe we want it to represent something aspirational. And if we want it to represent something aspirational, we might have to, you know, do things like come up with custom data sets, right, like seed it with something that is [01:04:00] not human behavior, essentially. Antonio: [01:04:01] But if humans trained them, they're still going to be human.
Someone's bias is going to be put in there. Eric: [01:04:07] You have the bias today. The bad behavior of humans is already being transferred by default into current AI systems, and people have to weed it out. You know, one of the important things to think about is: how do we save ourselves from our own bad behavior? I love this topic, and I'll shut up in 10 seconds, but it's like, man, I wish humans weren't so emotional and didn't have these terrible biases. But then it's some of those same emotions that lead to these beautiful innovations, or this competitive nature. If you tried to throw it all away, so we're rational robots, it's this terrible conflict: if you made humans less emotional, more rational, you'd start to lose the part of humanity that a lot of us celebrate. Antonio: [01:04:49] So I have something off of what you're saying: as humans, we're driven by emotion. So why would an AI be driven by anything? What I'm thinking is, it doesn't have an ego. Why would an AGI try to accomplish anything at all? I mean, unless you program in a reward. But is it going to have any intrinsic, interesting motivation to do something, or is it just going to be lazy and sit on a beach and watch the sunset and do nothing? Harpreet: [01:05:22] Go for it. Antonio: [01:05:23] Yeah, this is great. So, several points on this. If we were going to create an AI that was like an archetype human, would we really have the ability to figure out what the archetype was? And would we agree on what the archetype was, meaning a super-moral, super-high-character AI? But I'd like to ask you to aim your thinking in a slightly different direction, more towards Ben's purpose. If we're going to shoot for AGI, why don't we start at maybe baby AGI? And I think that was an excellent point. Come back a step before that and think: OK, let's say I [01:06:00] could point to an architecture that had the best hope of getting to baby AGI. I have a guess: it would be a hybrid of a bunch of transformers and other deep learning mechanisms, with a lot of sensors coming into the system, like an android. And yeah, now it has the capability, because it's got all these hybrid processing things that can interconnect. But so what? Now you have to figure it out: they'd require a value system, and they'd require specific training methods. And if we're going to use the technology we're roughly using now, just at a more advanced stage, then: how do I convert all these questions into a mathematical statement that can result in training? To me, I think empathy is good. We could discuss for another 30 minutes what an agreed empathy between AIs and humans would be.
For an AGI to survive, being part of a community is essential. But how do you turn that into rewards and loss statements in a training system, so that they get that? And then, if it was just math, are they really ever going to become sentient? These are the kind of questions I run up against the wall with when I think about AGI. And then it's like, at the end of the day, wasn't it just a mathematical parlor trick? So that was a lot all at once, sorry. But those are the kinds of things that run through my head with this, and why I keep saying that we need to ask the right questions clearly enough that we can turn them into math statements, because at the end of the day, the AI is a giant set of math machines. Russell: [01:07:49] I think it's interesting that you said that, about it being math, because essentially our organic neurological systems probably break [01:08:00] things down to mathematical functions too. You know, it is or it isn't something, and we respond accordingly, although it's far more complex than anything we can build in the electronic space at the moment. And then, just to jump back to building a baby AGI, let's put the empathy to the side for a minute. If I was to try and build a baby AGI at the moment, I think the key things you'd need are: some image recognition model, so it would recognize, or could learn to recognize, a few key people; some audio monitoring, so it could listen to the vocal feedback it was getting; and some audio creation, so it could make vocalizations. Then, if it was recognizing a few key people and just experimenting with random vocalizations, as a baby would do, every time it hit on something that was kind of close to a word, a person would give it some feedback that it could then interpret. That would be a really interesting thing: just leave that learning on its own, you know, completely unsupervised, just to see if organically it could start to come up with some analog of language. That would be a really interesting experiment. Harpreet: [01:09:17] I think self-supervised learning would be the right term there. That might be the key to AGI: self-supervised learning.
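[Aside: for the curious, here is what Russell's "baby AGI" babbling experiment might look like as a program. This is a minimal, entirely hypothetical sketch: recognize_person and caregiver_reaction are invented stand-ins for the image-recognition and audio models he describes, and the learning is simple reinforcement of whichever babble drew a positive reaction, loosely the label-free flavor Harpreet mentions.]

```python
# Hypothetical skeleton of the "baby AGI" experiment: babble random
# vocalizations, let recognized caregivers react, and reinforce whatever
# babble drew a positive reaction. All components are simulated stand-ins.
import random
from collections import defaultdict

PHONEMES = ["ma", "da", "ba", "goo", "la"]

def recognize_person(frame):
    """Stand-in for the image-recognition model: returns a caregiver
    id if one is present, else None. Simulated here."""
    return random.choice(["mom", "dad", None])

def caregiver_reaction(utterance):
    """Stand-in for audio monitoring of vocal feedback: positive when
    the babble lands near a word. Simulated: 'mama'/'dada' please people."""
    return 1.0 if utterance in ("mama", "dada") else 0.0

# Preference weights over two-phoneme babbles, learned only from feedback.
weights = defaultdict(lambda: 1.0)

def babble():
    """Sample a vocalization in proportion to its learned weight."""
    candidates = [a + b for a in PHONEMES for b in PHONEMES]
    total = sum(weights[c] for c in candidates)
    r, acc = random.uniform(0, total), 0.0
    for c in candidates:
        acc += weights[c]
        if acc >= r:
            return c

for step in range(5000):
    if recognize_person(frame=None) is None:
        continue                 # nobody around, so no feedback to learn from
    utterance = babble()         # audio creation: try a vocalization
    weights[utterance] += caregiver_reaction(utterance)  # reinforce smiles

print(sorted(weights.items(), key=lambda kv: -kv[1])[:3])
# After enough steps, "mama" and "dada" dominate, learned purely from
# reactions rather than labels, which is the spirit of the experiment.
```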
Harpreet: [01:09:30] Guys, it's been a great discussion. I definitely enjoyed this: a lot of great comments, a lot of great chats. I highly recommend you check out futureoflife.org/ai-aftermath-scenario. It lays out 12 different scenarios, ranging from libertarian utopia to self-destruction, and it's kind of a companion website to Max Tegmark's book Life 3.0, which is also a good book to read if you enjoyed this conversation, so [01:10:00] definitely check that book out. I'm trying to get Max on the podcast, so hopefully that happens. So, guys, thanks again for joining in. Next week we have a special guest host who's going to be taking over the airwaves of The Artists of Data Science podcast: Antonio will be hosting next week while I head out to ABC with the wife for a wedding. Antonio: [01:10:25] So I guess I've got to get myself a Hawaiian T-shirt or something. Harpreet: [01:10:28] Yeah, it's actually cold and rainy outside here today, so I'm going back into flannel mode and sweaters. But guys, thanks for joining in. Don't forget to check out the episode I released with Jeff Lee earlier today; he does a freestyle rap right at the beginning, which I thought was super cool. I was also on Ken Jee's podcast, Ken's Nearest Neighbors, so check that out as well. And don't forget, I'm launching a course soon; I've just got to do some recordings for it. It's all about how [01:11:00] to essentially become an employable data scientist: how to manage yourself, how to think like a scientist, how to think like an engineer, how to think like a business person. I'm excited to start recording that stuff and get that course out there. Thanks again for tuning in, everybody. Remember, my friends: you've got one life on this planet, why not try to do something big? Antonio: [01:11:18] I saved the chat, just so you know. Harpreet: [01:11:20] Yes, I've got the chat saved as well. I'll be sure to copy and paste it and put up a link to it as a blog post on the website. Bye, guys.