Mensing.AI


Learnings & Musings on AI, ML, Data Science & Python

Deepfakes: An Overview 📽

Real talk, Deepfakes terrify me. 🙀 The implications of fake video that looks real to the human eye have “dystopian sci-fi” written all over them. If you thought the last election was a circus, wait until the first Deepfakes election. 🎪 A few outcomes come to mind: political mudslinging reaches its zenith and societal upheaval follows in its wake … Read More


Explaining AI to Mom 👵

Fear-mongering and buzzword-y, loosey-goosey overuse have been hurting AI’s appeal with the broader population. Are there terrifying possibilities? Yes (I’m terrified of the implications of Deepfakes). But guaranteeing dystopia is a disservice. It plays to the reptilian brain while we’re trying to recreate the higher-level brain in silicon. 🦎 I agree with Bryan: Our thriving depends upon … Read More


Nvidia Takes it to the Cloud

Cloud and software providers are moving in on the chip game, so why shouldn’t a chip maker move in on the cloud game? Nvidia has announced that they’re combining 16 of their super AI-friendly GPUs into high-performance computing clusters for cloud use. The new tech race is well underway, and the prize is becoming THE cloud platform for AI. … Read More


The Whole Will Be Greater Than the Sum of the Parts

I’m incredibly optimistic about the potential of human-machine teams, or centaurs. And not just because the name for them is cool and mythical. I think they make AI more immediately useful: the human can handle the intangible “soft skills” we’ve yet to figure out how to recreate in code, and the machine can handle the crazy computation and pattern … Read More


Common Sense is Uncommon in AI

It seems like a two-party system is developing in the AI-sphere: one camp argues for purposefully imbuing systems with common sense; the other suggests that the cognitive abilities of human infants will eventually appear spontaneously in these models. Not quite sure how that last part works yet. I think the big difference between the two camps is the time … Read More


Facebook Chips in on Live Video

I’m starting to get obsessed with watching who is making a chip for what AI purpose these days. Facebook has announced they’re designing a chip to help with analysis and filtering of live-streaming video. Why? Two reasons: (1) to minimize reliance on outside suppliers (here’s looking at you, Intel), and (2) to avoid the embarrassing mistake of letting someone livestream their suicide or … Read More


Book Notes 📚: Weapons of Math Destruction

Light-Hearted Moment 😂 The above is to offset a little of the doom-and-gloom that might follow. Also, it fits pretty well. I recently read Weapons of Math Destruction by Cathy O’Neil. It covers some of the concerns I’ve mentioned previously. What is a weapon of math destruction? 💣 It’s basically an algorithm that utilizes Big Data at scale to cause … Read More


Thought I can’t stop thinking 🤔

The current goal/trend seems to be to mimic the human brain in software/algorithmic form, but that standard and comparison might be part of the problem. It will certainly be a massive task to accomplish. Maybe we need to think about neural networks and the like as more of a colony of bees or farm/hill/whatever of ants. A bunch of individual … Read More


What We Get Wrong About Technology 🚫

This article ties in nicely with some of the deadly sins shared previously, especially #6 (that article linked to this one). Here’s what I noted: When asked to think about how new inventions might shape the future, our imaginations tend to leap to technologies that are sophisticated beyond comprehension (remember Clarke’s Third Law?). The most influential technologies are often humble and cheap … Read More


The Quartz guide to artificial intelligence 🗺️

Solid overview of AI: what it is, what it means, etc. So what is AI? Apparently it’s software with a mechanism to learn (careful with this word), which it then uses to make a decision in a new situation (more suitcase words). Harking back to #3 above, a computer doesn’t have a flexible concept of “similar”. Humans know an … Read More
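
Since the guide boils AI down to “a mechanism to learn” that then “makes a decision in a new situation,” here’s a minimal sketch of that loop using scikit-learn’s nearest-neighbor classifier. The fruit data and numbers below are made up purely for illustration (they’re not from the Quartz piece), and notice that “similar” here is nothing more than geometric distance in feature space, which is exactly the kind of inflexible notion of similarity the excerpt is pointing at.

```python
# A minimal sketch of "learn from examples, then decide in a new situation."
# Hypothetical toy data; the model's idea of "similar" is just distance.
from sklearn.neighbors import KNeighborsClassifier

# Toy training examples: [weight_grams, diameter_cm] with labels.
X_train = [
    [150, 7.0],  # apple
    [170, 7.5],  # apple
    [120, 6.0],  # orange
    [130, 6.5],  # orange
]
y_train = ["apple", "apple", "orange", "orange"]

# "Mechanism to learn": store the labeled examples.
model = KNeighborsClassifier(n_neighbors=3)
model.fit(X_train, y_train)

# "Decision in a new situation": label a fruit it has never seen,
# purely by its distance to the stored examples.
print(model.predict([[160, 7.2]]))  # -> ['apple']
```

The point of the toy: the classifier “generalizes” only in the narrow sense of measuring closeness to what it has memorized, nothing like a human’s flexible sense of what counts as similar.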