HH103_mixdown.mp3
Harpreet: [00:00:06] What's up, everybody? Welcome to the Artists of Data Science Happy Hour, happy hour number 103. 103 happy hours, man, it's been going strong. You know what, I'm just going to make the announcement now: I think next week's happy hour, I'm going to make it the last one, the final happy hour, right before the holiday season kicks off. We'll have one last happy hour, because baby number two is coming in the first couple of weeks of December at some point. And yeah, the happy hour has been great, and I've loved having everyone here, but it's time to start, you know, kind of sunsetting this. Maybe kick it off and bring it back next year — who knows, man — but definitely take a little bit of a break for the holidays. So I hope you all had a good week. I had an awesome week, man. I just can't stress how much I love my job. The company is great, absolutely love the company, but just this whole DevRel endeavor — this job is absolutely fantastic. It is amazing. This week I spent most of my time preparing for these Ask Me Anything sessions that I'm doing as part of the Deep Learning Daily community for Deci. So this week I did a live stream with Serge Masís, talked about interpretable machine learning. Did an Ask Me Anything session about object detection and the original YOLO paper with one of the internal experts at Deci.
Harpreet: [00:01:30] That was amazing. Today I spoke to Susan Shu Chang about deep reinforcement learning, and we talked about that from the ground up, from first principles. So it's awesome, man. And all of this content is going to be going towards the as-yet-unnamed podcast I'll be doing as part of my, you know, DevRel duties at Deci. It's either going to be called Deep Learning Daily, the same name as the community, as you see there, or I might just call it the Deep Learning Podcast. I've claimed both podcast names already; I kind of like the Deep Learning Podcast. That podcast is going to be a lot different from what the Artists of Data Science podcast is. The Artists of Data Science I think I billed as a personal development podcast for data scientists, and at that point in my life that was what I was really into — just this personal growth, personal development. I still am, but at this point it's pretty much just, you know, something running in my head constantly, you know what I mean? And also, you know, talking about breaking into data science, that's not really my thing anymore.
Harpreet: [00:02:38] So I'm going all in on deep learning, and we're going to be having a lot more technical content coming around that, and I'm trying to make things as intuitive as possible. Speaking of deep learning: the most amazing thing was released earlier this week — Galactica, released by Papers with Code and Meta. It is a large language model that was trained on 103 billion tokens of highly curated data. The data was taken from research papers, from textbooks, from code bases — just really high-quality scientific knowledge — and they've created this model. I got a chance to test it out. I don't think the UI is available for testing anymore, I think they closed that down, but it's open source, completely open source. The weights are available.
You can go to the Papers with Code GitHub repository and see Galactica there. The model is available on — not Torch Hub — the Hugging Face Hub. You can download it from there, super easy to use. Maybe in a few minutes we can go take a look at it. I downloaded the 6.7 billion parameter model, and it was, if I'm doing the math here correctly, about 28 gigs or something like that. It's a huge, huge model, but amazing. I'm pumped for this, man.
Harpreet: [00:04:10] Like, just as a content creator who's creating content about deep learning — or really scientific content in general — I think just having a tool like this is going to be amazing. Just keeping in mind that these large language models are prone to this phenomenon of hallucination, where they spit out blatantly incorrect things that sound like fact. So we've got to fact-check a little bit of everything, but I think it's a good way to kick-start the creativity when you're coming up with a topic. Also, Galactica — they used the model itself to help write the paper about the model. The 60-page paper that was released earlier this week, they actually used the model to help write it, which I thought was just insane. Fascinating. But yeah, Galactica. I haven't heard too many people complaining about Galactica yet, but as always, I don't think it's the model itself that is the issue — it's the humans behind it and how it's being used. But yeah, I'm curious if anybody has thoughts on Galactica, if anybody's been playing around with it, what they think about something like this. I'm keen to hear from Serge — I know you've got a deep interest in generative kinds of models and things like that.
Speaker2: [00:05:28] Actually, I haven't tried it. I haven't tried any of the recent models. I'm dying to try it, and I think I'll finally be able to play around with those models over the winter. It's not really a break, because I'll continue working, but I'm going to have less of this craziness of traveling and everything in December, and so I'll be able to read some more and play around with new toys, if you will.
Harpreet: [00:06:02] What else is on the potential list of new toys to play with? What else are you thinking about?
Speaker2: [00:06:07] Oh, well, I just got some new — what's it called — I just got the Jetson Nano Developer Kit. So I'm going to be playing around with that; I'm going to be doing more edge stuff. I've been, you know, kind of torturing my Raspberry Pi for a while now. It's about time I got something better equipped for deep learning inference, so I'm eager to get going with my Jetson Nano.
Harpreet: [00:06:46] Yeah, well, let me know. I'm happy to put you in touch with the folks at Deci. We can take the model you have and use our AutoNAC engine to help make it smaller and reduce the latency.
Speaker2: [00:06:58] I'd love that, I'd love that. Yeah, I haven't done a lot of quantization, I think it's called. The last time I did that was for a project where I put my model on my phone — a project I did like four years ago — and I needed to do it then, but I honestly forgot how I went through the process. And it probably doesn't apply anymore; it's so old, and things move so quickly.
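For anyone who wants to follow along with the download Harpreet describes, here is a minimal sketch of pulling the checkpoint from the Hugging Face Hub with the transformers library. The repo id "facebook/galactica-6.7b" and the prompt are assumptions based on how the release is described above, not something confirmed in the conversation.

```python
# Minimal sketch: loading the ~6.7B-parameter Galactica checkpoint from the
# Hugging Face Hub. The repo id below is an assumption -- check the Hub for
# the exact model name before running.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "facebook/galactica-6.7b"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)

# In fp32, 6.7B parameters * 4 bytes ≈ 27 GB -- roughly the "about 28 gigs"
# mentioned above. Loading in fp16 halves that footprint.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

prompt = "The benefits of deep learning for object detection are"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

On the quantization point Serge raises, frameworks like PyTorch also offer post-training quantization to shrink a model further for edge devices, but that is a separate step from simply loading in half precision.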
Harpreet: [00:07:34] Yeah, it's definitely a challenge taking these models — I can't imagine Galactica deployed on my phone; I think it would take up all of the available memory. That model footprint is huge. But I'm sure Kosta has some thoughts about deploying vision types of applications on resource-constrained devices. Anybody else got any input? Vin, I'd love to hear from you, or Russell. Shout out to everybody else in the room — Toshi, Antti, Kosta. If you're watching on LinkedIn and you've got questions, do let me know.
Speaker3: [00:08:06] Yeah, it looks like it got pulled already. I saw something today — I think it was yesterday it started — where they were having some serious issues with Galactica, and it was outputting a lot more inaccurate results than normal, and today it got pulled. So they kept the research out there and the model is still available, so it's still a learning tool as far as just incremental progress goes, but it looks like their implementation didn't work out. Maybe it was the implementation, maybe it was the model — it doesn't really say which one was the point of failure — but it looks like it didn't make it. And I think it's kind of funny: it would have taken me longer to read the paper than the demo was up. We're kind of at that place in deep learning where we want to publish everything, but maybe we shouldn't.
Harpreet: [00:09:02] Yeah. I mean, I've heard there are a lot of people posting their experiences with it. I've been, like, willfully ignoring that, just to stay in the honeymoon phase right now and play around with it. But yeah, I'll try to get it deployed locally and see what I can do. Russell, or anybody else, got any thoughts? I'd love to hear from you all. All right, what else is going on, man? Next week I'm talking to Kyle Cannon about graph neural networks. I'm not sure if anybody has experience with those. I've been doing research, trying to learn about that — that's what I spent most of today on. I just ended up going down a wormhole about the actual intuition behind convolution — not the convolution from a convolutional neural network, but the actual convolution of two functions — and it was great. I loved it, man. I felt like I was back in grad school, just reading all that math and stuff. It was great. So what's going on, guys? Let's get some conversation going — questions on anything. Toshi, I haven't seen you in forever, man. I know you've been out there crushing it. What's going on, dude?
Speaker4: [00:10:29] Yeah, it's been a while. I mean, I've been working as a data analyst at Bloomberg — it's been over 14 months now. Things were a little hectic, and I did kind of miss out on a few of these data science sessions. It was a bit hectic on the personal side, which did affect my professional life. I'm going back to Nepal in a week, and I'll probably be staying there for like two or three months — I'm going back after nine years — so I did have to quit Bloomberg. But now I'm catching up a little bit on the engineering side of things. I picked up Fundamentals of Data Engineering, reading that right now, and just upskilling in general. It's been a journey, man. When I first joined, I started attending the Artists of Data Science back when I was just a student in school.
And now, you know, I'm older, so it's been a journey.
Harpreet: [00:11:25] So you moved back to Nepal, like, for good? Or is it just to visit?
Speaker4: [00:11:29] No, just for like two or three months. Yeah.
Harpreet: [00:11:33] That's awesome, man. Well, congrats. Good to hear from you and see you around. There's some other big news this week that I picked up on Twitter, but I just didn't have time to go down the wormhole about it. What was that company that just, like, melted — it was a crypto company of some sort? If anybody can...
Speaker3: [00:11:53] FTX.
Harpreet: [00:11:54] FTX, that's what it's called. Yeah. What the hell is that all about?
Speaker3: [00:11:58] It was a massive scam. I hate to say it that simply, but it was just a massive scam. It looks like there were some backdoors that they put into the software and into the exchange itself that allowed them to embezzle money, potentially — that's the allegation right now — and that they had pulled a ton of money out. And there are other exchanges that are invested in FTX or have some sort of exposure to FTX, and no one's really talking about who has it and who doesn't. But I think there have been two other exchanges that have folded in the last ten days or the last week — it feels like two months go by in five days right now — but yeah, they folded. They took down, I think it was $16 billion worth of total assets, or 20 billion in total assets, and everybody on that exchange is basically not going to get paid. They were over-leveraged, because they would take the money and invest it into other crypto exchanges, and they were propping up all these other exchanges to keep them from crashing. And eventually they couldn't prop anyone up anymore, and they needed some propping up too.
Speaker3: [00:13:13] So people started withdrawing money. They couldn't withdraw crypto, they couldn't withdraw money, and that essentially became a run on the bank for this exchange. And overnight it just evaporated, went bankrupt, they closed everything down. There was another exchange the next day that had to stop trading. They're talking about this potentially having exposure to some of the largest exchanges. And after all of this, some clarity is starting to come out, and so there's now the potential for a subcommittee-type hearing and a fraud investigation. And there are now price targets on crypto, like Bitcoin down to 13,000. A lot of analysts are basically saying liquidate everything, and yeah, it's got these massive ripple effects. Even Bored Ape Yacht Club is seeing some impacts here, because their apes are now selling for less than the minimum, or the stake, or something like that — there have been, what was it, 12 or more of those digital monkey art images that got sold for less than they went for; they were like 500 grand, I think, a year and a half ago, and they're selling for 40 or 50,000 now, which has got to hurt a little bit.
Speaker3: [00:14:35] I can imagine that's probably not what people wanted to have happen to their digital monkey art, but I think we're kind of in that stage now where we did the fraud and now we're finding out — if it sounds ridiculous, it probably is. And somebody — I can't remember who said it this morning — said: when you give investors too much money, they will invest it.
So when we had too much liquidity in the economy, instead of having a limited amount of resources that you had to deploy to the best possible investments, you were just flush with cash, and you start throwing it at anything and hoping it works, because you have to put it somewhere. If the dollar is not appreciating, if you're not getting interest on any of the Treasury notes, you have to put it somewhere or it's not doing anything. And I think we did that a little too much, and we're finding out. So that's the long, ugly story of FTX: somebody sort of revealed that the biggest problem in crypto is that nobody on these exchanges really has any sort of transparency, which is rather bizarre considering that was the whole point, right?
Harpreet: [00:15:42] Yeah, that's what I thought the promise of this was — you know, seeing how trades are happening on the actual ledger itself. Wow.
Speaker3: [00:15:53] I still think the technology has promise, but I don't understand crypto. It doesn't make any sense. Why would you connect a decentralized currency to a centralized currency? Why would you centralize a decentralized currency on an exchange? You don't need the exchange — the whole point of the protocol is that you don't need an exchange — and now we're putting it on exchanges. And it's funny, IBM has had blockchain — I remember shilling for them and their blockchain initiatives back in 2015 — and they're still using blockchain, they're still doing a ton of work, and people are paying money for blockchain for supply chain uses and applications: any place you have to guarantee ownership or have a chain of custody or any of those types of tracking methods. For food it's kind of great, because with perishables you want to understand where it's been and what conditions it's been in, and all of that data can be put on the blockchain, and then everybody who needs it has access to it. But it's a semi-private blockchain, not a completely public blockchain. So there are use cases for the technology. It's just, looking at crypto and NFTs — we've got proven reasons why we would use the technology, so why would you do all these other things? Really, why? And I think data science can actually benefit from it. I think there's some interesting work we could do putting data on blockchains, being able to do some tracking with it, and having better visibility and transparency into personal data — where if you want to access my personal data, that should be something that gets added to a blockchain somewhere. Like in the US, you can look at property records — that's how Zillow figures some things out, that's how the pricing and the estimates work, and some of those companies like Zillow and Redfin, I think, get data from publicly available sources. I don't want them to have my data. They built a business model with our data; they didn't pay any of us for it. And so I understand it's publicly available, but I should know when someone accesses my data, and I should be able to say I don't want them to be able to do that again, because I want them to prove to me first that they have a legitimate non-business reason for it. And you could do that on a blockchain: every time somebody accesses your data, your publicly available data, it gets logged to a blockchain, and there's just complete transparency to it.
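As a toy illustration of the idea Vin is describing — an append-only, tamper-evident log of who accessed a piece of personal data — here is a minimal sketch. It is just a hash-chained list in plain Python, not any real blockchain or smart-contract API, and all names in it are made up for illustration.

```python
# Toy sketch of a tamper-evident access log: each entry embeds the hash of
# the previous entry, so rewriting history invalidates everything after it.
# This illustrates the concept only; it is not a real blockchain integration.
import hashlib
import json
import time

class AccessLog:
    def __init__(self):
        self.entries = []

    def record_access(self, data_owner: str, accessor: str, purpose: str) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {
            "timestamp": time.time(),
            "data_owner": data_owner,   # whose record was accessed
            "accessor": accessor,       # who looked at it
            "purpose": purpose,         # their stated reason
            "prev_hash": prev_hash,
        }
        body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append(body)
        return body

    def verify(self) -> bool:
        # Recompute every hash and check the chain links; any edit breaks it.
        prev_hash = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if entry["prev_hash"] != prev_hash:
                return False
            if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != entry["hash"]:
                return False
            prev_hash = entry["hash"]
        return True

log = AccessLog()
log.record_access("vin", "some-real-estate-site.example", "property valuation")
print(log.verify())  # True until someone tampers with an entry
```

On an actual chain the same record would live on a smart contract or a semi-private ledger, as Vin suggests, so the data owner could audit reads without trusting any single company.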
Speaker3: My data can be an NFT, and so there's ownership — I own it — but through the contract you can say that obviously the government has unlimited access to it, because that's part of the law; they need access to it. But this whole public domain thing — anyone who gets access to my particular publicly available data NFT, like my car registration, my house registration, the layout of my house that's available online, all of those things — if you had something like that, I would be able to tell that someone looked at my data, and I would have recourse, to be able to say: I want you to prove to me you had a non-business reason for this.
Harpreet: [00:19:12] Like, wasn't Bitcoin kind of created in the aftermath of a similar financial meltdown and financial tomfoolery, for lack of better words? And now it has caused this on a whole new scale. That's just ironic to me. Yeah, I think Mark Freeman did a project using data analysis on some blockchain stuff — you guys should check that out. Let's see here — Kosta says Deliveroo Australia died overnight. What? I don't even know what Deliveroo is.
Speaker5: [00:19:51] Basically, it's essentially Uber Eats, right? Their only thing is the eats part — they don't do the rideshare kind of thing. They've got guys on motorbikes with a little cooler bag on the back delivering food from, you know, any store. It's there in the UK, it's there in Australia, it's there in a few countries. And basically — I'm not 100% sure why or what the situation is — but kind of overnight they went, okay, yep, we're shutting up shop. The app still works, but it just goes to this landing page saying, hey, we're under administration. And I mean, all I'm hearing is off the news, right — straight off Channel Seven or ABC News or whatever it is — and they're basically like, yeah, we're going into administration. And the crazy part to me is that all of the drivers are contractors, just like Uber, just like a lot of those other services. So there's no severance pay, there's none of that. They're considered creditors under the employment system, right? They're essentially like any other creditor in the business, waiting for the outcomes of the administration process. It's a really strange new world. I don't think we've seen this on this kind of scale much before. I don't know — have patterns like this existed in the past?
Harpreet: [00:21:23] Good question, man. Vin says 2001. Let's hear about it.
Speaker3: [00:21:30] So back when I was around your age, we had this dot-com...
Harpreet: [00:21:33] Bust.
Speaker3: [00:21:34] And it was, I mean, really feeling similar to this. And although analysts are all saying that it's not going to be as bad as it was last time — yeah, you can read about it in the hardback, you can't get it digital, there are some old books — there's a ton of analysts who are saying it's not going to happen again. But we have so many companies that are not profitable, and I think every single one of those, over the next 12 months, is a candidate for insolvency if they can't figure out how to raise money. Most of them don't really have a path to more cash, or they can't figure out how to become profitable. I don't know. I mean, what are you going to do?
You can't just keep the doors open with magic money — especially a company like Uber. How do you lose that much money and keep the doors open? I think that's where a lot of companies are right now. I think we're in denial, because there aren't many people my age left who remember 2001, and I'm not being as funny as it sounds like I am, because think about it: me in ten years is retired. So the majority of people who are in the field right now are too young to really remember it — they might have been in high school at the time, or graduating from high school, or in college — so they didn't really see it, didn't really get to study it. And all the people who are older than my, like, small slice of Gen X are all retired or out of tech; they're doing some other kind of consulting. So there are entire companies that have no one who's ever seen something like this before, and I think that's going to cause some problems, because we're assuming companies are more stable than they are. And I think we're going to see a whole lot of — what was that company called, Deliveroo? Great name. Yeah, I think we're going to see a whole lot more Deliveroos, Underoos, and companies that don't have money going under.
Harpreet: [00:23:47] That's sad, man. Some of these companies are awesome — they're doing all this amazing stuff. I don't know, man. Why do you guys have to lose money like that?
Speaker3: [00:23:59] It's just too bad it doesn't make money. Yeah, I mean, I could make a billion dollars if you give me two, you know, and that's the truth. If you want me to make a billion dollars, give me two billion — no problem, I'll have that for you. But that's not what a business does.
Speaker2: [00:24:18] But at what point does a business become a utility? Like, I think if Twitter goes under, you know how many people will be clamoring, "we need a Twitter" or something like that? I don't know — I don't care, personally; I don't use Twitter that much — but I'm saying there are a whole bunch of people who think Twitter is, like, the best invention since sliced bread, right?
Speaker3: [00:24:47] Yeah. We've always had nerds — I mean, we are nerds here — we've always had nerds that love technology and kind of overestimate the utility of it. But you're right, Serge. You say, when does it become a utility? Well, define the word. When it has utility? When people are willing to pay more for it? That doesn't necessarily mean it'll be a utility.
Speaker5: [00:25:11] Does it necessarily need to be a utility to have that kind of backlash, though? Like, I mean, 90% of app usage is behavioral, habitual, right? What is the point of YouTube Shorts and Instagram Reels? They repeat, you can't rewind them — so if you miss something, you've got to watch it again — and the idea is to get you to interact, to switch on to the next one, so you can't just leave it in the background and do something else. You're fully engaged with the app, you're fully engaged with the advertising and everything else that's going on there, right? So how much of this is "oh, we use that because it has some kind of utility value to us," as opposed to "hey, we're using that because it's the latest trend," something that, you know, socially we get sucked into using? So I don't know whether it necessarily needs to have utility value for people to clamor about it and go, oh no, I really need that.
What would I do without that in my life, right?
Speaker3: [00:26:03] But that's like arguing that cocaine is too big to fail, right? That's the same argument.
Speaker5: [00:26:10] Exactly. And that's not right.
Speaker2: [00:26:12] But I do think, for me, a utility is something that we all find useful, no matter who we are, and that, you know, to a certain extent we take for granted — and so we will miss it when it's gone. I would think, yeah, Twitter is ridiculous under that circumstance. But if the pandemic showed us anything, it's that delivery services, you know, rideshare — in the absence of taxis, of course — are somewhat of a utility. So there are essentials, just like the Internet is an essential, right? And so what would we do without them? Do we want to go back to the way things were, pre delivery services, pre easy access to rides? Because taxis had to step up their game to compete, and I think that was great, you know? Although, of course, the sharing economy and all that has been going backwards in terms of workers and their rights and everything. But I think from a user perspective, it's been great.
Harpreet: [00:27:30] Just a quick comment on the rideshare thing. I was at the Jets game yesterday — watching Antti's fellow countryman, Teemu Selanne, get his jersey retired — and I was trying to get a ride on Uber, and it was like 40 bucks on Uber to get home, which is ridiculous. I used the app for the local taxi company and it was $21, and I was like, all right, that's more like it. But of course, I digress, because I don't even know where I'm going with this. Kosta, go for it.
Speaker5: [00:28:03] That's actually exactly where I was going to go with it, right? I don't necessarily agree — I don't think it is an essential. There's a difference between the service being provided and the business through which that service is provided, right? Like, tomorrow, if you had publicly owned trains, for example, and the government-owned train company went bust, you're in trouble, because they own all of the trains. If Roads and Maritime Services, for example, goes down, the maintenance of your roads — that's a service that's absolutely essential, because it's singularly owned by one entity, right? Whereas in this situation you've got taxis, you've got Ubers — I mean, take Deliveroo, right? Sure, Deliveroo went down. All right, cool, I guess I'll use Uber Eats today. There's enough competition still in the market to service the needs of the people, so the service isn't gone — you're just using a different app for it. For all you know, taxis are the more long-term profitable way of doing this, as opposed to rideshare services. And what if, for example, other competitors in the social media space came along, and then Facebook upped their game and survived, right? Facebook wasn't the only social media service — it still isn't the only social media service.
Speaker5: [00:29:21] Right — the point is multiple are going to survive, and eventually the firmest business models are going to survive in the long run. And if that is taxis, and the taxis add additional services because they realize, hey, we've got guys in cars that could do stuff beyond just transporting people, then they could find a workable business model around it.
It's just a question of human creativity, right? So, I don't know — I think if there is a genuine need, and I think we've found a valuable need in food delivery services particularly, even more than rideshare, I'd argue — will someone step up to the game? I'm almost certain someone will. Especially if we go into another lockdown: the food delivery services were incredibly useful for people who had to isolate alone and couldn't get someone to drop food off for them, right? That, in its very definition, is a need. So.
Speaker4: [00:30:20] Yeah.
Harpreet: [00:30:22] Vin, let's hear from you.
Speaker3: [00:30:25] I think this gets to — and I'll take it a little to the left, or maybe to the right, I don't know, in a different direction — I think technology is really good at covering up the exploitation of human labor. That's one of our biggest problems, and taxis are a great example, because taxi companies used to buy a ton of medallions. It cost a ton of money in the United States to buy a medallion, so an individual taxi owner would usually not be able to afford one. Some did, and it was something you passed down from generation to generation, where you would save up a ton of money, you'd buy a medallion, and that was a business — essentially like a McDonald's franchise, except a taxicab on four wheels. So these companies would buy medallions, and if you couldn't afford to buy your own medallion, you had to go work for a taxicab company, and you worked ridiculous hours, you didn't get paid a lot — it was terrible. That's why the taxis were always so absolutely wrecked, and the people driving them weren't the nicest people on earth either; it's because it was not a great job to have. And then Uber came along and made it sound like it would be a great thing, and they just replaced the medallion system — they are now, through their technology platform, doing the same exploitation.
Speaker3: [00:31:42] People don't have a perception of value for a driver and a car that matches a fair wage for that driver and car — the perception of value is too low. And so we're talking right now about technology as the deflationary force, where Uber creates this marketplace and they'll somehow be able to offer these things at a lower cost because of the technology. But they can't. The technology does what's always been done, just on an app instead of a switchboard that somebody's running. So from an efficiency standpoint it's not a whole lot of difference, and from a marketplace perception-of-value standpoint, people aren't paying Uber drivers better than they pay taxicab drivers. And now we have Uber drivers who are contractors; they have almost no real bargaining power when it comes to their wages. They're just being exploited by a technology platform, and we seem to ignore the fact that that happens. Now look at DoorDash, look at Uber Eats — same idea. Amazon and their driver workforce — same idea. Look at our perception of value for drivers in the trucking industry — exactly the same thing: we don't treat truckers very well, even though the myth is that they get paid well. The majority of truckers are terribly underpaid and exploited by trucking companies. So we still have this fundamental problem that the marketplace doesn't put enough value on a person driving a vehicle, no matter what the purpose is,
and we simply do not compensate them enough to avoid exploiting them. But now we're using technology to kind of cover that fact up. So when I look at companies like Uber, I just say: I like the idea. And products like Uber Black — I like that, that's great, because those companies already existed, there were already professional drivers, and it was something that did good for those companies. It made them a whole lot more accessible, they made more money, they got a lot more business, and those drivers are paid a living wage — those drivers are paid well. So that's an area where I think technology has improved the business cycle; I think that part of the marketplace is viable. But then you begin to cover up the exploitation side of it and say, at some point we'll be able to get to an economy of scale where this platform makes sense. Same thing with Twitter: we're exploiting the fact that people are addicted to the platform, we're addicting kids, and we're covering up the exploitation of our views and the amount of time we spend on the platform — and really, we're using technology to cover that up.
Speaker3: [00:34:30] We're saying, oh no, we're providing all this amazing content, people are entertained, they love it, we're giving advertisers access to people, we're doing all this great stuff — but we're taking their attention and monetizing it. And in order to get more attention from them, we are now optimizing for addictive behaviors and trying to pull people onto the platform. And now we're even doing it with creators, where there are platforms that want to drag you into a cycle of spending hours and hours and hours of your time as a creator, and they're not compensating you. YouTube's a good example of this now, where there is no monetization for creators until they hit a certain point. So all those people are being sold the dream that you're going to be an influencer and a creator, but they're not making any money — they're putting a ton of time in, and the people making money are Google. So, like I said, taking it on a tangent, but I think we use technology way too often to cover up exploitation, and we make it sound like it's a societal good and there is utility and it is beneficial. But pull back all that technology and look at what's actually going on — a lot of times there's a ton of exploitation under the covers.
Harpreet: [00:35:44] Kosta, let's hear from you. Go for it.
Speaker5: [00:35:48] Right. So, I mean, what I'm gathering from that is: what are the first-principles values of what a service is providing, or a business is providing, right? Like, with Uber and rideshare services — are they really providing rideshare? Is that the service? Is transport the service, or is it accessibility to transport? I would argue that with the technology itself — app-connected people driving cars — the provision of transport was already sorted. It's people in a car; that hasn't changed. That's no different from a taxi, no different at all. So the real value added by the technology aspect of it is the accessibility. And this is what just — well, it doesn't baffle me, it just amazes me, actually — the promise of the Internet bringing connected accessibility was made in, what, the eighties, right? In a huge way.
That promise really took a spike 20 years later, in the late nineties, and then it had to burn, because they had to mess it up, because they weren't providing value. They figured out that we could connect people, but the dot-com bubble was essentially because they couldn't figure out what to connect people for. And the services that survived, the websites that survived, tended to be the ones that did connect some kind of value. So we're essentially seeing the next evolutionary step of that: okay, now that we know we can connect things like transport services with the Internet — with the magic of the Internet — now we're getting to the point of how we use that to connect transport.
Speaker5: [00:37:35] So as we go to that next level down, we're finding out that that process doesn't work. The Internet was the right thing to connect them, yes, but the way we've used that to employ people — and that's where, you're right, it's weird: if that sense of greed didn't get in the way, we probably could have avoided this by just going for the true value addition, which is the connectivity of these apps, and just said, okay, hey, taxi companies, you'd be a lot more serviceable if you had this connectivity in there — and just focused on evolving those. That's the bit that amazes me: it's constantly a battle between humanity's genius and greed, almost simultaneously. And we're going to see these waves, right, of technology saying, hey, there's a promise, here's how we can use it; people figuring out a billion ways how not to use it, then figuring out how to use it, then going to the next level of applying it, but applying it wrong, and then figuring out how to use it again. If you look at it from a historian's perspective, that's not surprising, but at the same time it is amazing. Like, to me — I'm 28, right — it's amazing that this takes a career's worth of time. These kinds of big changes happen over years and decades. And then you start questioning, okay, well, is that really the high priority in applying some of these technologies? But yeah, I guess I'm in a philosophical rabbit hole right now.
Harpreet: [00:39:09] Advertising, man. We need to go back in time and just kill the advertising companies. Like, in some parallel universe out there, there's no advertisers — or no advertising — and we're just living blissfully with technology, companies are making money, and there's no FTX crash out there somewhere. Serge, love to hear from you, man. Any thoughts to riff off?
Speaker2: [00:39:34] Yeah, no, I agree with Kosta, I think. To me, it's not surprising that people are exploited — that's been the theme throughout history; people have always been exploited, with or without technology, right? What's surprising to me is that this is a day and age where technology is powerful enough to free ourselves from that tyranny, yet we find ways to squeeze out as much human labor as possible, when, to be honest, we either don't need the humans because of AI, or we still need them even more — and we should value them more, because of the productivity gains thanks to AI. So I think it's either going to go in a direction where most of us are living off UBI, or power is concentrated with those that, you know, own AI
versus those that, you know, don't have anything to do with it. Or maybe it's not even about AI — maybe the future is bleak regardless of what goes on in the technology space. I mean, sometimes I think we're really getting dangerously close to, you know, nuclear war, and it's like, aren't we past that? Aren't we past the 1960s? I guess in a certain way we're not. We're really the same — we may have evolved some software, but we're still running on fundamentally the same hardware. So I guess I wish I had a more profound take on it, but that's my two cents.
Harpreet: [00:41:33] I love it, man. That's a great way to put it: we still have the same firmware from thousands of years ago, even though we've gotten new software updates over time.
Speaker5: [00:41:47] Holy shit — Vin, you just taught me the true meaning of "leave no stone unturned." Vin in the chat just wrote, "Humanity will always do the right thing after exhausting every other option." And oh my God, that is the genuine meaning of "leave no stone unturned," and I cannot be convinced otherwise. That's it. That's the line in the sand. That's real.
Harpreet: [00:42:11] Yeah, I like that. That's a good quote — I just saw it myself. Vin, elaborate on this. If anybody else wants to jump in on the conversation, do let me know. And if anybody watching on LinkedIn is enjoying this, just smash like or drop a comment right there in the chat. Vin, go for it.
Speaker3: [00:42:29] I think that's the history of humanity summed up in one line. We always do the right thing after literally trying all the wrong ones. It's almost like we invent new wrong stuff before we do the right thing, and then for 100 years we go, why didn't we try that sooner? That's a great question — why didn't we try that sooner? The one I love is, why didn't we think of that? No, we did. We thought of that. Repeatedly. And that's just what we do as a species, because we're optimizing. And advertising — I like that example. We shouldn't kill advertising, because we need to somehow create awareness for new companies, new products; we wouldn't have innovation if nobody knew it existed yet. So we have to advertise. But I think in a smarter advertising economy, everybody would get the same amount of time — you'd have a very strict limit on the amount of advertising you get to put out there, and that's it, you only have that much. And so now you have to optimize for impact, not for volume. And if we were smarter about the way we created our complex systems — if we actually engineered the rules so that there was both social good and this concept of innovation and corporate good and progress and everything else — then we'd be fine, and we could do all of these things very easily. It's not like any of this stuff is hard to do; it's only a social norm until you get rid of it. So all of this stuff is doable — it's not really that hard either. But it seems like instead of doing the obvious, easy things, we will continue to do the wrong things over and over and over again from a policy standpoint, until we're left with nothing else and the problem is still there.
Harpreet: [00:44:27] Vin, thank you very much, man. Let's see — any other questions or comments coming in? Well, from Antti, I like this one. And Kosta has a question about learning resources.
Well, let's go to you, Kosta, but first let's read this quote that Antti shared — the one attributed to Churchill: "You can always count on the Americans to do the right thing after they've tried everything else." This is true, of course. Go for it.
Speaker5: [00:44:55] I think we could find a few dozen of those quotes — we could make a lot of t-shirts for merch for this happy hour, right? Okay, so, a question for you guys. It's very easy to find a hundred books on how to do something, and YouTube videos, and Udemy courses, Pluralsight — the new company I'm working for gives me full access to Pluralsight, and I'm super excited about that, because it's the one platform I never really got onto for learning stuff; probably should have, but I think I'm going to make the most of it. Do you guys know any resources that specifically spell out what not to do — like great anti-pattern examples? Because I was looking at a dependency injection thing at work, right, and you can find a million YouTube videos explaining how dependency injection works and the rigors of doing it properly. I can't fully describe it, because, obviously, it's work stuff, but there are bits of it where I'm like, okay, I see what they're going for here — they're going for a bit of dependency injection to reduce, you know, dependency management issues — but I was questioning whether it's the right way to do it, because it was really strange and I was really on the fence, right? So I was sitting there looking for anti-pattern examples. I would love to see an anti-patterns book — like microservices architecture anti-patterns, right? I reckon that'd be worth ten times as much. So I'm wondering if you guys know any such resources, or if you keep lots of notes from your experiences. I'd love to collab with a few people and put something like this together, because I reckon that's the kind of stuff that would be worth its weight in gold to me. Any thoughts?
Speaker2: [00:46:45] You should definitely write it. Yeah, I would interview Joe and Vin, you know, and anybody here that's encountered that sort of thing. I mean, I've got to admit, I've seen a lot of anti-patterns in web development, because it's the career I spent the longest in — in data science, a lot less, but I still have. I think where you'd probably see the most is in the practice of MLOps and data engineering, and in the science part too, of course, but there it becomes less about engineering, you know, it becomes more about statistics and so on. But yeah, I think you should definitely go for it. I've never seen anything like it.
Harpreet: [00:47:41] That would be a dope-ass YouTube channel — like "How Not to Do It: Tech Edition" or, you know, whatever. That would be dope.
Speaker5: [00:47:49] My main concern is that it would turn into a textbook and then it would take me 30 years to write it. But you never know, right?
Harpreet: [00:47:59] I mean, I think that's probably a good topic that comes up a lot in LinkedIn posts and things like that — I feel like that's probably the place you'd find more of it. If anything, that should be a hashtag: how not to do it, hashtag how-not-to-do-it. But yeah, I like that idea, man. Let me know if you want me to put out a post on LinkedIn to try to get people to reply to that on your behalf — I'm happy to do that.
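For anyone unfamiliar with the pattern Kosta mentions, here is a minimal sketch of dependency injection versus the hard-coded alternative. Every class and name below is invented purely for illustration — none of it comes from his work example.

```python
# Invented example of the pattern. First, the hard-coded version that is
# often the anti-pattern: the class constructs its own dependency, so you
# can't swap it out or test it in isolation.
class PostgresClient:
    """Stand-in for a real database client, purely for illustration."""
    def __init__(self, dsn: str):
        self.dsn = dsn

    def fetch_events(self, user_id: int) -> list:
        raise RuntimeError("would query a real database here")


class ReportBuilderHardcoded:
    def __init__(self):
        # Dependency created internally -- hard to test, hard to replace.
        self.db = PostgresClient("postgres://prod")


# With dependency injection, the caller supplies the dependency instead:
class ReportBuilder:
    def __init__(self, db):
        self.db = db  # anything exposing fetch_events()

    def build(self, user_id: int) -> dict:
        events = self.db.fetch_events(user_id)
        return {"user_id": user_id, "event_count": len(events)}


class FakeDB:
    """Test double we can inject instead of a real client."""
    def fetch_events(self, user_id: int) -> list:
        return [{"event": "login"}, {"event": "purchase"}]


print(ReportBuilder(FakeDB()).build(42))  # runs with no database at all
```

The anti-pattern question Kosta raises is usually about the second form being taken too far — injecting everything everywhere — rather than the pattern itself.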
Harpreet: And Antti says his ears perk up every time we say "anti-pattern" — it's so close to how you pronounce his name. So, questions — there are actually more questions coming in on the live stream here. For whatever reason, on LinkedIn Live I don't see the comments pop up unless I go through a few steps; it's weird. Mike Nash is asking: do you think that technology is helping us to be socially aware of the use of technology? As technologists, do we play an important role in helping humanity? It's an interesting question. Any thoughts on that?
Speaker4: [00:49:15] I'll go for it — I've got a quick thought. So, if you view technology as a tool — which it is, nothing more — it doesn't do anything other than what we design it to do and then what we direct it to do. Once it's built, it's inert. It will do something good if we make it; it'll do something bad if we make it. And this ties back into some of the earlier comments as well, one of Vin's comments. I wanted to say that humanity has demonstrated, through much of its evolution, the propensity to take the lazy solution. So whilst some will go for the hard task and do it properly, others will just take something from someone else — it's easier, it's less effort. Not the majority, but certain people will do it, and humanity as a whole has demonstrated a willingness to do that. So I think the ability for humanity to do that is amplified by technology, when technology makes more complex actions far simpler for people to do — which then ties back to the thing we started with at the front.
Speaker4: [00:50:32] You know, technology allowed these people to do those things, and they seem to have had some intention of doing things like this — honestly, if the reports are correct, and I don't know for certain that they are, but a lot of the reports suggest that was the case. And if we then say they did this on a blockchain — and the whole thing with blockchain is visibility — well, it is, but if there's no one there to look at it, what good is it? The visibility is there, but there needs to be some kind of moderation and management of these things by some kind of objective observer that can identify them. And because it's by definition deregulated, that doesn't exist at the moment — certainly not at wide scale for all of these different platforms — and that's a challenge right now. So will that improve as crypto becomes more mature, or as Web3 becomes more mature? That's a really interesting question. But even if it does, I think humanity is generally flawed enough to want to take the lazy, quick option, and we'll always be working against that.
Harpreet: [00:51:43] Shortcuts, man — always looking for shortcuts. I wonder if Kosta has any thoughts on that.
Speaker5: [00:51:52] Well, regulating — regulating entire industries — essentially, what you're getting at is: how do you regulate business value, right? Or rather, how do you regulate the perception of business value? Because what's the reason all of these investments took off? We wanted quick returns; people wanted investments that make money on a shorter timescale, bottom line. And why is that? Because business value is essentially measured only by money, by returns, right?
On the other end of the spectrum, you seem to have public services, which do focus on value addition for people. There are two patterns here, right? One is countries that don't have well-established public services — there's often corruption and other things that come into play, and they just don't serve the needs of the people. And on the other hand, countries that do have well-established public services — Australia is a great example: good medical care, a bunch of things that are pretty well done, our trains are pretty great, especially in Sydney, public transport is fantastic, roads are well taken care of. And that's the value out there, but there is a huge money sink in all of those areas, where inefficiency is brought into the economy in many ways. So you've kind of got these two ends of the spectrum, but nothing in the middle, and it seems to me that regulation is often battling that middle ground between a completely publicly owned, public-interest-focused thing and a privately owned, capital-focused thing. So that's where I'm like, yeah, we're constantly fighting that weird middle-ground battle. But then, as the original question asked, is it up to the people building the technologies? Yes — but the people building the technologies isn't the engineer that's employed, it's the business that's building the technology. Let's be very, very clear about that. If I were to run a business, it's not that I own the business and then my engineers build the stuff — I need to take personal responsibility for the stuff that's being built. And it's a mindset thing that not every business owner has. As an engineer, when I see leaders of a business take that level of ownership — as opposed to, you know, just coming in and saying, oh yeah, we own the business, the engineers will build some stuff, we'll make some money and then we'll sell it — when the owners take ownership of, hey, we want to see a particular mission in this, and this is where mission statements become valuable, you see that they stand for something more than just capital growth.
Speaker5: [00:54:36] And that gives me a lot of confidence as an engineer to say, hey, that's a company I want to work for. Like, if I had to draw one linear model that just says "work for this" versus "don't work for this," in the simplest possible way, it would basically just be: do the people at the top actually care about the mission of the company, is that clear, and is it more than just "let's make bank"? If it's just "let's make bank," there's probably more interesting tech for me to build elsewhere. So yeah, we do, in a kind of very distributed, democratic way, have a say in which companies succeed and fail, because we make that individual choice of who we go to work for and what salary packages we choose. You can get paid ridiculous salaries to go work for some companies, and, you know, you can choose to work on the stuff that you're passionate about and the missions that you actually see value in. I find more value in the latter, personally. But hey, I may be super naive and young, and in 20 years I might be like, damn, man, you should have taken the other route. But, you know, time will tell on that.
Harpreet: [00:55:47] There's a comment coming in here from Yousef on the live stream, on the earlier point: one general pattern is, don't do the first thing that comes to mind. It usually popped up because it was the most available option for whatever reason, not because it's the best idea. Dig deeper. So don't just go with your first thoughts or your first ideas — I like that. Well, I think with that, we'll go ahead and wrap it up, guys. Thanks for hanging out. All right, so we won't do this next week, since it's Black Friday, holiday season, all that — I know people are with their families — so no data science happy hour next Friday, but the Friday after that, December 2nd, will be the final Artists of Data Science happy hour for a bit of time. You know, we might bring it back next year; if I do bring it back, you will all know for sure. So if you've been listening to this for a while — I know there are a lot of you who listen to these episodes and have wanted to join — December 2nd, 2022, coming up in like two weeks, is going to be your chance. So just come hang out. Let's send the show off with a bang, man, let's do it big. And then — I mean, I'm not done podcasting; you guys are going to see a lot of me everywhere, always, that just never changes — but yeah, in general, the Artists of Data Science is, you know, starting to sunset. The last happy hour will be December 2nd, and then I've got a batch of episodes that I'm releasing, and those are going to be the last batches of episodes for this podcast. But you'll see me on other podcasts, especially this new one coming up, which is going to be all about deep learning, and you'll hear about that.
Speaker4: [00:57:37] So is this the last one, or is the one on December 2nd the last one before Christmas? Do you plan to make it like a Christmas-themed episode?
Harpreet: [00:57:46] Yeah, yeah. Typically in December I'll be wearing my ugly Christmas sweaters, so yeah, I'm always going to do a Christmas theme, but I'll switch up the lights — I have one green, one red — and we'll do it up Christmas style. So yeah, let everyone you know know that this thing is wrapping up. It was cool, man. I remember just how bumping this was two years ago at this time — like 50 people in the room just going crazy. It was awesome, man. So, y'all, thank you so much. That's the official announcement that we're winding down the podcast: the Artists of Data Science is wrapping up. There are other things on the horizon — there's that podcast I'm doing with Mark and Mikiko, you know, we'll be bringing that, and then the Deep Learning podcast, and you'll find me everywhere else. That's it for this one, my friends. Remember: you've got one life on this planet — why not try to do some big shit?