End-to-end lead prospecting from Luma events. Searches Luma for events by topic and location, extracts all attendees/hosts, qualifies them against a qualification prompt, outputs results to a Google Sheet, and sends a Slack alert with top leads. Use this skill whenever someone wants to find qualified leads from events, prospect event attendees, or run an event-based lead gen workflow. Also triggers for "find people at events and qualify them" or "who's attending X events that matches our ICP."
```bash
npx gooseworks install --claude
# Then in your agent: /gooseworks <prompt> --skill get-qualified-leads-from-luma
```
Search Luma for events by topic and location, extract all attendees and hosts, qualify them against your ICP, export to a Google Sheet, and send a Slack alert with the top leads.
This is a 5-step pipeline that chains together luma-event-attendees, lead-qualification, Google Sheets output, and Slack alerting.
Before doing anything, make sure you have clear answers to these questions. If the user's prompt already covers them, skip ahead. Otherwise, ask:
- **Topic and location** — what kind of events, and where?
- **Timeframe** — how far back should events count? (default: past 30 days)
- **Qualification prompt** — is there an existing prompt in `skills/lead-qualification/qualification-prompts/`? If not, what's their ICP at a high level? (You can use lead-qualification intake mode to build one.)
- **Slack destination** — which webhook, and how many top leads to send? (default: 5)

Present these as a numbered list. The user can answer in one shot.
Use the luma-event-attendees skill with multiple keyword variations to maximize coverage.
Generate 3-5 keyword variations combining the user's topic with their location. Run them all in parallel:
```bash
# Run each search variation in parallel
python3 skills/luma-event-attendees/scripts/scrape_event.py --search "AI San Francisco" --output /tmp/luma_search_1.csv
python3 skills/luma-event-attendees/scripts/scrape_event.py --search "Growth Marketing San Francisco" --output /tmp/luma_search_2.csv
python3 skills/luma-event-attendees/scripts/scrape_event.py --search "GTM San Francisco" --output /tmp/luma_search_3.csv
```

After collecting results, filter out events outside the user's specified timeframe using the `event_date` column. Luma search returns events from all time periods, so this step is essential to avoid stale leads. If no timeframe was specified, default to the past 30 days.
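The timeframe filter might be sketched as follows; the ISO `YYYY-MM-DD` format for `event_date` is an assumption about the scraper's output:

```python
from datetime import datetime, timedelta

def filter_recent(rows, days=30, date_key="event_date"):
    """Keep rows whose event_date falls within the last `days` days.

    Assumes ISO-format dates (YYYY-MM-DD); rows with a missing or
    unparseable date are dropped rather than guessed at.
    """
    cutoff = datetime.now() - timedelta(days=days)
    kept = []
    for row in rows:
        try:
            when = datetime.fromisoformat(row[date_key])
        except (KeyError, TypeError, ValueError):
            continue  # no usable date -> treat as stale
        if when >= cutoff:
            kept.append(row)
    return kept
```

Dropping undated rows is a deliberate choice here: a lead tied to an event of unknown age is as risky as a stale one.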
Merge and deduplicate by name (case-insensitive). Handle None names gracefully — skip entries with no name.
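The merge-and-dedupe step might look like this minimal sketch (keeping the first occurrence of each name is an assumption):

```python
def dedupe_attendees(rows):
    """Merge attendee rows from multiple searches, deduplicating by
    name (case-insensitive). Rows with a missing or empty name are
    skipped, since they can't be matched or contacted.
    """
    seen = {}
    for row in rows:
        name = row.get("name")
        if not name:
            continue
        key = name.strip().lower()
        # Keep the first occurrence; later duplicates are dropped.
        seen.setdefault(key, row)
    return list(seen.values())
```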
Save the deduplicated result as a CSV:
`/tmp/luma_all_attendees.csv`

Report to the user how many events matched and how many unique attendees remain after deduplication.
Work with CSVs throughout the pipeline — Google Sheets creation happens only at the end (Step 4) because writing large datasets to Sheets mid-process is slow and error-prone.
The CSV from Step 1 (/tmp/luma_all_attendees.csv) is your working file. Columns should include:
| name | event_role | bio | title | company | linkedin_url | twitter_url | instagram_url | website_url | username | event_name | event_date | event_url |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
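A quick sanity check that the working CSV actually has these columns can catch scraper changes early; this is a sketch, and the expected-column list mirrors the table above:

```python
import csv

EXPECTED = [
    "name", "event_role", "bio", "title", "company", "linkedin_url",
    "twitter_url", "instagram_url", "website_url", "username",
    "event_name", "event_date", "event_url",
]

def missing_columns(path):
    """Return expected columns absent from the CSV header (warn, don't fail)."""
    with open(path, newline="") as f:
        header = next(csv.reader(f))
    return [col for col in EXPECTED if col not in header]
```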
Use the lead-qualification skill (Mode 2: reuse prompt) to qualify all attendees.
Load the qualification prompt the user pointed to (e.g. `skills/lead-qualification/qualification-prompts/ai-event-attendees-gtm.md`).

Launch all batches simultaneously using the Task tool with sonnet-model subagents:
```
Task: "Qualify leads batch 1/N"
- Include the full qualification prompt text
- Include the batch of leads as JSON
- Ask for output as a JSON array: [{id, name, qualified, confidence, reasoning}]

Task: "Qualify leads batch 2/N"
... (launch ALL at once)
```

Write the outputs to:

- `/tmp/all_qual_results.json` — all results from every batch
- `/tmp/qualified_leads.json` — only qualified leads, sorted by confidence

Report to the user how many leads qualified and the overall qualification rate.
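The batch fan-out and result merge might be sketched like this; the batch size and the High/Medium/Low confidence labels are assumptions drawn from the output schema above:

```python
CONF_RANK = {"High": 0, "Medium": 1, "Low": 2}

def make_batches(leads, batch_size=25):
    """Chunk leads into fixed-size batches; each batch becomes one
    qualification subagent task."""
    return [leads[i:i + batch_size] for i in range(0, len(leads), batch_size)]

def merge_results(batch_outputs):
    """Flatten per-batch JSON arrays, keep only qualified leads, and
    sort them High -> Medium -> Low by confidence."""
    all_results = [r for batch in batch_outputs for r in batch]
    qualified = [r for r in all_results if r.get("qualified")]
    qualified.sort(key=lambda r: CONF_RANK.get(r.get("confidence"), 3))
    return all_results, qualified
```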
Now create the Google Sheet with all data — both raw attendees and qualification results.
Use `RUBE_SEARCH_TOOLS` to find Google Sheets tools (search for "google sheet create"). Name the sheet `Luma Leads - [Topic] - [Date]` and include all attendee columns plus three qualification columns:

- **Qualified** — Yes / No
- **Confidence** — High / Medium / Low
- **Reasoning** — 2-3 sentence explanation

The Google Sheets API can be slow for large datasets. Write rows in chunks (the skill uses 50-row chunks) rather than one API call per row.
If Rube/Sheets is unavailable, save as CSV:
`/tmp/luma_qualified_leads_[date].csv`

Present the Google Sheet link (or CSV path) to the user.
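For the CSV fallback, joining the raw attendee rows with the qualification verdicts might look like this sketch; linking results to attendees by lowercased name is an assumption:

```python
import csv

def write_fallback_csv(attendees, results_by_name, path):
    """Write attendees plus their qualification verdicts to a CSV.

    `results_by_name` maps lowercased attendee name -> result dict
    with qualified/confidence/reasoning keys.
    """
    extra = ["qualified", "confidence", "reasoning"]
    fields = list(attendees[0].keys()) + extra
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        for row in attendees:
            result = results_by_name.get(row["name"].strip().lower(), {})
            out = dict(row)
            out["qualified"] = "Yes" if result.get("qualified") else "No"
            out["confidence"] = result.get("confidence", "")
            out["reasoning"] = result.get("reasoning", "")
            writer.writerow(out)
```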
Send a formatted Slack message with the top N qualified leads (default: 5, or whatever the user specified).
Use Python with urllib.request to POST to the webhook:
```python
import json
import urllib.request

webhook_url = "https://hooks.slack.com/..."   # provided by the user
sheet_url = "https://docs.google.com/..."     # from Step 4
leads = json.load(open("/tmp/qualified_leads.json"))[:5]  # top N

blocks = [
    {"type": "header", "text": {"type": "plain_text",
        "text": "Top N Qualified Leads from [Topic] Events"}},
    {"type": "section", "text": {"type": "mrkdwn",
        "text": "_From X attendees across Y events, Z qualified (P%). Here are the top N:_"}},
]
for i, lead in enumerate(leads, 1):
    blocks.append({"type": "section", "text": {"type": "mrkdwn", "text":
        f"*{i}. {lead['name']}* [{lead['confidence']}]\n"
        f"LinkedIn: {lead.get('linkedin_url', 'n/a')}\n"
        f"Bio: {lead.get('bio', '')}\n"
        f"Why: {lead['reasoning']}"}})
    blocks.append({"type": "divider"})
# Link to the spreadsheet at the bottom
blocks.append({"type": "section", "text": {"type": "mrkdwn",
    "text": f"<{sheet_url}|View full spreadsheet> (X attendees, Y qualified)"}})

req = urllib.request.Request(webhook_url,
                             data=json.dumps({"blocks": blocks}).encode(),
                             headers={"Content-Type": "application/json"})
urllib.request.urlopen(req)
```

Alternatively, use `RUBE_SEARCH_TOOLS` to find Slack tools, then send via `SLACK_SEND_MESSAGE` or similar.
The Slack alert should include, for each top lead: name, confidence level, LinkedIn URL, a short bio, and the qualification reasoning.
End with a link to the full Google Sheet.
| Component | Cost |
|---|---|
| Luma scraper (Apify) | $29/mo flat subscription |
| LinkedIn enrichment (optional) | ~$0.03 per 100 leads |
| Google Sheets | Free (via Rube/Composio) |
| LLM qualification | ~$0.10-0.30 per run (depends on batch size) |
| Slack webhook | Free |
Typical run: ~200 attendees across 3-5 search variations costs essentially just the Apify subscription + a few cents in LLM tokens.
Quick run with existing prompt:
"Find qualified leads from AI and growth events in SF. Use the ai-event-attendees-gtm qualification prompt. Send top 5 to Slack webhook: https://hooks.slack.com/..."
Full specification:
"Search Luma for startup, SaaS, and AI events in New York. Extract all attendees. Qualify them against our Series A founders ICP. Put everything in a Google Sheet and Slack me the top 10."
Minimal (triggers clarifying questions):
"Find me leads from SF tech events"
```bash
export APIFY_API_TOKEN="your_token"
# Or check skills/luma-event-attendees/.env
```

Some Luma events have `show_guest_list` disabled. The Apify scraper can still get featured guests, but full attendee lists may not be available for all events.
Slow Google Sheets writes are normal for large datasets; the skill writes in 50-row chunks. If writing is too slow or fails, the results are always available as CSVs in `/tmp/`.
Verify the webhook URL is correct and the Slack app is still installed in the workspace. Test with a simple curl:
```bash
curl -X POST -H 'Content-Type: application/json' -d '{"text":"test"}' YOUR_WEBHOOK_URL
```