Vibe Code Camp Distilled

Logan Kilpatrick & Ammaar Reshi - Google AI Studio

Key Insights

Summary

Logan Kilpatrick (Developer Products, Google AI) and Ammaar Reshi (Product & Design Lead, AI Studio) demonstrated Google’s AI Studio platform with a focus on vibe coding—building AI-powered applications through natural language prompts rather than traditional IDE workflows. The session showcased how AI Studio is transforming both internal Google workflows and the broader landscape of rapid prototyping and application development.

The demonstrations ranged from converting Figma mockups to interactive prototypes in seconds, to building data visualization apps from CSV files without ever opening the data, to creating full-stack multiplayer applications from single prompts. A particular highlight was a live multiplayer FigJam clone that handled dozens of concurrent users from the stream audience, built entirely through AI Studio’s interface. The discussion also explored deeper questions about what constitutes quality in AI-generated software and how tool calling capabilities are the hidden driver of recent breakthroughs in AI-assisted development.

Main Topics

AI Studio Positioning and Philosophy

AI Studio is Google’s entry point for discovering and experimenting with Gemini models, positioned between casual building and full IDE environments. Ammaar describes it as having two main components: a playground for discovering models and tweaking parameters (with everything exportable to code), and a vibe coding environment for building AI-powered apps with single prompts (00:03:21).

The team is explicit about not trying to compete with full IDEs like Cursor or Antigravity. As Ammaar explains: “We’re really going after folks who are entering vibe coding for the first time or are more casual builders. We’re simplifying it, we’re making it approachable and easy and we’re bringing you all the power without the complexity. We don’t ever want to necessarily be a full blown IDE” (00:04:27).

Logan adds that the forthcoming version of AI Studio will be powered by the Antigravity coding agent, with easy transitions between tools: “We’ll make it super easy to go from AI Studio to Antigravity, which we’re really excited about. So like, as you build the initial version of what you want, if you’re a developer, you want to sort of continue in an IDE, we’ll let you do that like very easily single click” (00:04:45).

Workflow Revolution: From Figma to Live Prototypes

Ammaar demonstrated how Google teams are fundamentally changing their design workflow by building directly in AI Studio instead of Figma. The process is remarkably simple: take a screenshot of a Figma mockup, paste it into AI Studio’s build mode, and get an interactive prototype (00:05:44).

The team has also built a Figma plugin that extracts JSON with exact styles and typography, ensuring precision: “What this ensures is that the model has effectively the exact styles, the right typography, everything. So it really looks and feels like your figma file. Because I think that’s the biggest thing designers wrestle with, right? It’s like that precision” (00:07:56).

The workflow advantage is speed of iteration. As Ammaar showed with a new homepage design for AI Studio: “I just asked, what does the new user experience look like? Because I haven’t designed that yet and it came up with something like that. So I think this gets back to our workflow is changing because it’s so much faster to just ask for that new user experience and then go and tweak it here and there” (00:07:05).

A striking statistic: “Some like double-digit percentage points of all Googlers are using AI Studio to accelerate product development” (00:09:11).

Data to Interactive App in Minutes

Logan demonstrated building an interactive YouTube Shorts analytics dashboard by simply dropping in a CSV file from Kaggle without ever opening it. The prompt was simple: “Here’s the CSV file. Build an interactive visualization to look at all this data” (00:10:31).

The real power came in the next iteration—adding AI features on top of the visualization: “Now I want an option to automatically use the [Gemini] three one to create YouTube shorts that mirror the performance of those videos, riffing on the ideas using AI” (00:11:52). The result was a dashboard that could analyze high-performing content and generate new video concepts based on the patterns.

Logan acknowledged a common vibe coding problem: “One of my biggest gripes is that I ended up building UIs these days where I don’t actually know how the UI works and I just end up fumbling through like this” (00:11:45). The response from the audience was interesting—some have started prompting for walkthroughs: “I have seen people who have started asking for in their prompts, like create a walkthrough of the feature as well. And it actually then just goes and clicks through things and shows you stuff” (00:12:22).

Full-Stack Multiplayer Apps: The FigJam Demo

The standout demonstration was a multiplayer collaborative whiteboard (FigJam clone) that Ammaar built and then opened to the live stream audience. The creation process: “This was one shot. Like, this was literally a screenshot of the FigJam website. And I was like, make me this app. And then obviously, like I added more and more features to it and was like fixing on the left. But the one shot experience had people joining in” (00:15:27).

The demo successfully handled dozens of concurrent users from the stream, all drawing and collaborating in real-time. This showcased what Ammaar called the new capability level: “These are the kinds of apps that we’re really excited that people will be able to make in just one shot. Easy multiplayer apps, games, all sorts of things. And great UI, right? All in one” (00:15:51).
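At its core, a one-shot multiplayer whiteboard needs little more than a broadcast loop: every stroke a client draws is fanned out to every other connected client. This in-memory sketch models that pattern only; it is not AI Studio's generated code, and a real app would use WebSockets and a hosted backend.

```python
# Minimal in-memory model of multiplayer stroke broadcasting.
# A production version would replace the dict with WebSocket
# connections and push each stroke over the network.

class Whiteboard:
    def __init__(self):
        self.clients = {}  # client_id -> list of strokes received

    def join(self, client_id):
        self.clients[client_id] = []

    def draw(self, sender, stroke):
        # Fan the stroke out to everyone except the sender,
        # who already rendered it locally.
        for cid, inbox in self.clients.items():
            if cid != sender:
                inbox.append(stroke)

board = Whiteboard()
for user in ("ammaar", "logan", "viewer42"):
    board.join(user)
board.draw("ammaar", {"from": (0, 0), "to": (10, 10)})
```

With dozens of concurrent users, as in the stream demo, the same fan-out runs once per connected client for each stroke.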

Dan, the moderator, issued a challenge to the audience: “The first person to do a copy of this app fully from any agent of their choice, throw it in the chat, and we’ll send you every hat” (00:17:35).

The Google Ecosystem Advantage

Logan outlined AI Studio’s strategic advantage through Google’s ecosystem integrations: “With one click, you’ll be able to connect to Google Workspace; with one click, you can add Google Calendar; with one click, you can add Gemini, which already works today. So I think you’ll see more of us going in that direction of trying to just make it really easy to connect to all the things which everyone wants to connect to across the Google ecosystem” (00:18:38).

The goal is eliminating deployment friction: “If you’re in an IDE, there’s like four or five hops that you have to do to then take the thing that’s running locally on your computer and actually go and make it available to more people other than running like a local dev server and deploy it and do all the other things, spin up a production database” (00:18:04).

The Tool Calling Breakthrough

When asked what changed in the last six months to enable this leap in capability, Logan identified tool calling as the critical factor: “Tool calling is the fruits, the basket of fruits that bears all the gifts. As you make tool calling better, and specifically over long-running tasks, it’s the holy grail” (00:20:29).

He explained why this matters: “There’s all of these micro decisions being made. Like, should I look through these files? Should I go and edit this thing? Should I use this tool to search for documentation so that I better understand what this package or this random thing is doing? So in every one of your user turn requests of like, I want to do this thing in some software environment, there’s many, many, many subsequent requests of the model basically doing on the fly context engineering to make sure that it has the right stuff and it’s making the right edits” (00:21:12).
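The “micro decisions” Logan describes can be pictured as an agent loop: each user turn triggers many model-chosen tool calls whose results are fed back into context before the final edit. This is an illustrative sketch only; the tool names, the `pick_action` policy, and the scripted model are all hypothetical, not AI Studio internals.

```python
# Hypothetical agentic tool-calling loop: the model repeatedly picks a
# tool, the result is appended to context ("on the fly context
# engineering"), and the loop ends when the model emits a final answer.

def search_docs(query):
    # Stand-in for a documentation-search tool.
    return f"docs for {query!r}"

def read_file(path):
    # Stand-in for a file-reading tool.
    return f"contents of {path}"

TOOLS = {"search_docs": search_docs, "read_file": read_file}

def run_turn(user_request, pick_action, max_steps=10):
    """One user turn: loop tool calls until the model answers."""
    context = [("user", user_request)]
    for _ in range(max_steps):
        action = pick_action(context)          # the model's micro decision
        if action["type"] == "final":
            return action["text"], context
        result = TOOLS[action["tool"]](action["arg"])
        context.append((action["tool"], result))  # feed result back in
    raise RuntimeError("no final answer within step budget")

# A scripted stand-in for the model, for demonstration only.
def scripted_model(context):
    steps = [
        {"type": "tool", "tool": "read_file", "arg": "app.py"},
        {"type": "tool", "tool": "search_docs", "arg": "websockets"},
        {"type": "final", "text": "edited app.py"},
    ]
    return steps[len(context) - 1]

answer, trace = run_turn("add multiplayer support", scripted_model)
```

The quality Logan points to lives in `pick_action`: as models make those per-step choices better over long-running tasks, the whole loop compounds.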

Redefining Slop and Craft

The discussion turned philosophical about what constitutes quality in AI-generated software. Ammaar defined slop as laziness: “When I hear the word slop it’s like, it’s lazy. You look at it and you’re like, Oh, that was this first generation. You didn’t really iterate on it. You basically are copying all the AI isms of the thing and you didn’t refine it and actually make it great” (00:23:47).

Logan offered a more technical definition: “It’s where your thing that you built is in distribution to the model itself. The things that are most in distribution for the model, I would consider to most be slop. The easiest thing to make the model do. It’s like the default behavior” (00:24:47).

But Logan noted this is temporary: “The models can like barely generate high quality, like really aesthetic apps with high code quality. But that is a transitory effect. The expectation you would imagine is that as we’re sitting here in 12 months, like that’s actually no longer the case” (00:25:21). When asked if he believes this based on what he’s seeing internally, Logan confirmed: “Yeah” (00:25:40).

The implication: “If the thing that looks nice and has all the right aesthetics and high code quality becomes the thing that’s mostly in distribution, the definition of slop will change” (00:25:49).

Actionable Details

Tools & Products

AI Studio Workflow

  1. Start screen: Enter initial prompt
  2. Left panel: Chat interface for iterative refinement
  3. Right panel: Live app preview
  4. Can paste screenshots directly into build mode
  5. Everything exportable to code
  6. Upcoming: One-click export to Antigravity

Figma to Prototype Process

  1. Create basic mockup in Figma
  2. Take screenshot OR use Figma plugin to export JSON with exact styles
  3. Paste into AI Studio build mode
  4. Prompt for specific features or behaviors
  5. Iterate in chat to refine
  6. Test interactions in real-time preview
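The plugin’s real export schema isn’t public, so this hypothetical payload just illustrates why the JSON path beats a screenshot: the model receives typography and colors as exact data rather than guessing them from pixels. Both the field names and `build_prompt` are assumptions for illustration.

```python
# Hypothetical shape of a Figma style export, folded into a
# vibe-coding prompt so the model gets exact design tokens.
import json

figma_export = {
    "frame": "Homepage",
    "typography": {"heading": {"font": "Google Sans", "size": 32}},
    "colors": {"primary": "#1A73E8", "surface": "#FFFFFF"},
}

def build_prompt(export):
    """Embed exact design tokens in the prompt text."""
    return (
        "Build this screen as an interactive prototype. "
        "Match these styles exactly:\n" + json.dumps(export, indent=2)
    )

prompt = build_prompt(figma_export)
```

This is the precision Ammaar highlights: the prototype inherits the real hex values and font names instead of near-miss approximations.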

CSV to Interactive Dashboard

  1. Find data source (e.g., Kaggle dataset)
  2. Drop CSV directly into AI Studio
  3. Prompt: “Build an interactive visualization to look at all this data”
  4. Add AI features: “Add an option to use [model] to [specific task] based on this data”
  5. Get working dashboard without opening the CSV
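The generated dashboard’s core job is an aggregation like the one sketched below. The column names are hypothetical, since the actual Kaggle CSV was never opened in the session; this is only a minimal stand-in for the kind of ranking the app performs.

```python
# Minimal sketch: rank YouTube Shorts by views from CSV text,
# the sort of aggregation the vibe-coded dashboard would run.
import csv, io

SAMPLE = """title,views
Cat short,120000
Dog short,45000
Bird short,300000
"""

def top_shorts(csv_text, n=2):
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    rows.sort(key=lambda r: int(r["views"]), reverse=True)
    return [r["title"] for r in rows[:n]]

print(top_shorts(SAMPLE))  # → ['Bird short', 'Cat short']
```

The AI layer Logan added in the demo would then take these top performers and prompt a model to riff on new concepts matching their patterns.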

Current AI Studio Capabilities

Coming Soon to AI Studio

Quotes Worth Saving

On workflow transformation:

“Since the beginning of this year, we’ve barely been in Figma. We’ve just been making a lot of our prototypes and UI work inside of AI Studio, which has been really fun. It’s really changed how the team works as well.” (00:02:19) — Ammaar Reshi

On the capability shift:

“I felt like back then you could like taste it a little bit where it was like, ah, I’m like kind of getting close to being able to make the thing that I want… I feel like the moment that we’re in and especially the last six months is like, you can now do the ambitious thing, which I think is like what all of us originally wanted to do.” (00:20:01) — Ammaar Reshi

On validation in vibe coding:

“When someone sends a PR for like one of my vibe coded projects, I just want to see like a Loom of them going through the feature a few different times, rather than a code diff, honestly. I think the new unit of proof of quality is a walkthrough.” (00:12:45) — Dan (moderator)

On the tool calling breakthrough:

“Tool calling is the fruits, the basket of fruits that bears all the gifts. As you make tool calling better, and specifically over long-running tasks, it’s the holy grail.” (00:20:29) — Logan Kilpatrick

On what constitutes slop:

“It’s where your thing that you built is in distribution to the model itself. The things that are most in distribution for the model, I would consider to most be slop… But that is a transitory effect. The expectation is that as we’re sitting here in 12 months, like that’s actually no longer the case.” (00:24:47) — Logan Kilpatrick

On the craft bar moving:

“I think the slop era is honestly behind us. It’s just, if you have taste, you can go and make the thing.” (00:22:02) — Ammaar Reshi