There's always a moment about 45 minutes into my workshops when the energy shifts.
Someone, usually an experienced AI user, leans back in their chair, eyebrows raised, genuinely surprised:
"Wait, I had no idea you could do all of this work to get these kinds of results."
Exactly.
AI often gets sold as easy: type your prompt, press enter, watch the magic unfold.
But when you try using it for meaningful, high-quality work (a nuanced strategy brief, creative positioning, thoughtful content) it can feel frustratingly elusive.
Suddenly you're 40 minutes in, stitching together pieces from four different drafts, but it's still not quite right.
It's taken far longer than expected. Maybe even longer than if you'd done it yourself.
That's not failure. And it's not you doing it wrong.
It's what it actually looks like when you're learning how to use completely new systems we're only beginning to understand.
The messy part (the carefully designed prompts, the layers of detailed context, the trial and error and careful iteration) isn't a detour. It's the foundation.
You slow down to get better.
This isnât about doing more, faster.
AI isn't just a productivity tool. It's a mirror for your thinking, and a pressure test for your ideas.
The more you experiment, the more you realize:
You're not just learning how to use AI better.
You're learning a more powerful way of thinking, one that helps you:
break problems apart more effectively
ask sharper questions
articulate what "good" looks like
communicate your intentions, expectations, and taste with more clarity and precision
Even now, it often takes me longer than I'd like to admit.
But I've come to see that time as an investment, not a cost.
I've stopped measuring progress in minutes saved.
I measure it by how much better the work gets, and how much more me it becomes.
Because the truth is: you're building a kind of foundation most people won't take the time to build.
Most will settle for good enough.
And soon, everyone will be at good enough.
The real edge, the one that lasts, won't come from just knowing how to use AI.
It'll come from being intentional about how you think with it.
And how far you're willing to test the boundaries of what's now possible.
Thatâs the new skill worth building.
And youâre already on your way.
What You Need to Know About AI This Week
Clickable links appear underlined in emails and in orange in the Substack app.
Google's Gemini chatbot is launching for kids. No one is ready.
Google is rolling out Gemini to kids under 13 via parent-managed accounts, making it the first major AI chatbot aimed at this age group.
The company says it won't use their data for training, but the rollout comes with a long list of warnings that frankly feel more like a wall of disclaimers for parents, and too much for most kids to realistically follow.
In an email to parents last week, Google acknowledged several risks and laid out what families should do to keep kids safe:
"Gemini can make mistakes," so parents should "help your child think critically" about its responses.
Remind children that "Gemini isn't human." It may sound like a person, but "it can't think for itself or feel emotions."
Teach kids how to "double-check" what Gemini says.
Make sure they know "not to enter sensitive or personal info in Gemini."
Filters try to block inappropriate content, but "they're not perfect." Children "may encounter content you don't want them to see."
Tech companies clearly see kids as the next big market.
But launching tools like Gemini before safeguards are fully in place just shifts the risk onto parents.
What's so urgent about putting this in kids' hands right now?
OpenAI is turning its for-profit arm into a Public Benefit Corporation (just like Anthropic and xAI). OpenAI's non-profit board will still control the PBC and remain a large (though not majority) shareholder.
The company also hired Instacart CEO Fidji Simo as CEO of Applications. Already a board member at OpenAI, Simo spent a decade at Meta, where she served as head of the Facebook App, overseeing News Feed, Stories, Groups, Video, Marketplace, Gaming, News, Dating, and Ads.
Last week it was Duolingo.
Before that, it was Shopify.
This week, it's Fiverr.
Micha Kaufman, Fiverr's CEO, sent an internal memo this week that went viral, and it doesn't pull any punches:
"AI is coming for your jobs. Heck, it's coming for my job too. This is a wake-up call."
"It does not matter if you are a programmer, designer, product manager, data scientist, lawyer, customer support rep, salesperson, or a finance person - AI is coming for you."
"You must understand that what was once considered 'easy tasks' will no longer exist; what was considered 'hard tasks' will be the new easy, and what was considered 'impossible tasks' will be the new hard. If you do not become an exceptional talent at what you do, a master, you will face the need for a career change in a matter of months."
Not every leader will say it this bluntly, but many are having this conversation behind closed doors.
So how do you manage this moment?
If your calendar doesn't already include time to learn, experiment, and build a stronger foundation with AI at the intersection of your domain expertise, now's the time to start.
Give it 20 minutes a day. One task. One small experiment. One meaningful update to a prompt or context document you're already using.
You don't need to master everything about AI.
But you do need to be in the habit of daily learning.
Quiet consistency > loud panic.
You can read the full memo here.
One of the week's best reads is from New York Magazine. Its headline: Everyone is cheating their way through college.
It profiles Roy Lee, a Columbia student who used ChatGPT to cheat his way through college, built a tool to help others cheat during job interviews, and raised $5.3 million to scale what he calls "cheating on everything."
Recent surveys show over 90 percent of students routinely lean on AI assistants for assignments.
Blame opportunistic students if you like, but incentives shape behavior.
Most assignments still reward how well you can repackage information, not how well you think critically, solve problems, or develop insights.
AI excels at repackaging. Naturally, students reach for shortcuts to get things done.
What would raise the bar?
Collaborative, in-class challenges with real-world ambiguity and shared accountability
Process-driven assignments that document research, reflection, and discoveries
Structured AI training that covers prompting, limitations, bias, and ethical use, so students build the skills most valuable today.
Rethink what we ask students to do, and how we measure growth, and AI becomes a partner in insight instead of a shortcut to "Done."
Keep the status quo, and schools will keep losing relevance to the apps on every studentâs phone.
Google just launched "100 Zeros," a multi-year film and TV initiative with Range Media, the talent and production company behind A Complete Unknown and Longlegs.
The goal is to co-produce projects to sell to studios and streamers (not distribute them on YouTube) as part of a bigger play to push adoption of products like Gemini.
The U.S. Copyright Office has now registered over 1,000 works that were "enhanced" with AI.
Fully AI-generated content still isn't protected. But the line between authorship and assistance is already starting to blur, especially as AI tools become a bigger part of the creative process.
Walmart is using generative AI to scan red carpets, runway shows, and social media, then combine those signals with internal data to jumpstart the design process. Its new tool, Trend-to-Product, generates mood boards and early product concepts in minutes, cutting design timelines and compressing the entire product development cycle to as little as six weeks.
Whatâs smart here is how Walmart is moving fast, staying culturally tuned, and grounding it all in its own brand DNA.
Google's iOS app is getting a new AI "Simplify" tool that rewrites complex web content into plain English without taking you off the page.
Powered by Gemini, it's designed to keep users inside Google's ecosystem instead of turning to tools like ChatGPT.
Just select any text in the Google app and tap the "Simplify" icon to see a rewritten version.
Google also launched a Gemini app for iPad.
Blumhouse, the horror studio behind Get Out and M3GAN, teamed up with Meta to test a chatbot that texts you during a movie to boost Gen Z engagement.
It ran one night, didn't stick, and won't return, at least for now.
There are far more interesting ways AI could actually resonate with this crowd.
Still, credit to Blumhouse for being willing to experiment.
Starting this fall, the UAE will mandate AI education for all K-12 students, reaching 400,000 kids with lessons on prompt engineering, bias, ethics, and building AI systems. The priority is re-engaging a generation that's already living in a tech-first world, and adapting schools to catch up.
Meanwhile, over here in the U.S., we're still debating whether using AI is cheating.
Pinterest launched tools to identify and filter AI-generated content after user complaints about âAI slopâ overwhelming the platform.
In case you missed last week's edition, you can find it here:
AI's Biggest Flaw (Hallucinations) Is Also Its Greatest Superpower
We've all learned to watch out for, and even fear, AI's tendency to make things up and then say them with confidence.
That's all for this week.
I'll see you next Friday. Thoughts, feedback, and questions are always welcome and much appreciated. Shoot me a note at avi@joinsavvyavi.com.
Stay curious,
Avi
P.S. A huge thank you to my paid subscribers and those of you who share this newsletter with curious friends and coworkers. It takes me 25+ hours each week to research, curate, simplify the complex, and write this newsletter. So, your support means the world to me, as it helps me make this process sustainable (almost).