Game launch marketing strategy: the complete playbook
Steam released 20,282 games in 2025. Only 3% reached 1,000 reviews. This playbook lays out a 12-month launch framework using cloud-powered playtesting, press previews, playable ads, and demo distribution to build the audience your game needs before it ships.
- Why the standard launch playbook is breaking
- The T-12 to launch framework
- T-12 to T-6: from pre-production to vertical slice
- T-6 to T-3: playtesting for product-market fit
- T-3 to T-1: the press preview campaign
- T-1 to launch: playable ads and demo distribution
- Launch week: multi-channel demo deployment
- Post-launch: reactivation and back catalog
- Channel strategy by funnel stage
- Budget allocation: physical vs. cloud
- KPIs by launch stage
- One build, four use cases
- Sources
Why the standard launch playbook is breaking
The standard game launch playbook: build for three years, announce six months out, buy media, ship, move on. That playbook is producing worse results every year. Not because the games are worse. Because the environment changed and most marketing teams haven't caught up.
Steam released 20,282 games in 2025. Only 3% reached 1,000 reviews. 48.8% got fewer than 10 (Source: HowToMarketAGame / GamesRadar 2026). That isn't saturation as an abstract concept. That is a specific, measurable collapse in discoverability for any title that doesn't start building an audience early and build it deliberately.
Steam discounts are nearly 4x less effective at generating page impressions than they were in 2019 (Source: GameDiscoverCo / GDC 2025). The mechanism that publishers used to reactivate a game after a soft launch is broken. You can't discount your way out of obscurity anymore.
UA costs are rising in parallel. Gaming CPI hit $0.56 in 2025, up 30% year-over-year (Source: Adjust 2026). Mobile studios spent $25 billion on UA last year (Source: AppsFlyer 2026). That capital is chasing a finite number of eyeballs and it is driving up costs for every segment, including premium PC and console.
The result is a market where 67% of total PC gaming hours in 2024 went to titles that are six or more years old (Source: GDC 2025 / GameDiscoverCo). Players are defaulting to what they already know. Getting them to try something new requires reducing friction to near zero.
The publishers who are winning have recognized that the product itself is now the best marketing vehicle. Demos, playtests, press access, and playable ads are not support channels. They are primary acquisition channels. This article lays out exactly how to sequence them.
The T-12 to launch framework
Most launch planning starts too late. A team finishes its vertical slice, someone says "we should start marketing," and the clock starts at T-6 or T-3. By then, the best conversion windows are gone.
The framework in this article runs T-12 months to launch day. Each phase has a specific objective, specific deliverables, and specific success metrics. The phases are sequential but overlap: press outreach starts before playtesting concludes, demo distribution overlaps with paid media, post-launch reactivation planning starts before ship.
The logic is simple. Wishlist conversion data from GameDiscoverCo shows that top-performing games averaged 214 days of pre-release visibility. Underperformers averaged 411 days (Source: GameDiscoverCo 2025). More time visible is not automatically better. Quality of engagement per day is what compounds.
| Phase | Window | Primary objective | Key output |
|---|---|---|---|
| Pre-production to vertical slice | T-12 to T-6 | Steam page live, first wishlist baseline | Steam Next Fest slot, early community |
| Playtesting for product-market fit | T-6 to T-3 | Eliminate retention killers before press sees the game | Playtest report, retention curve |
| Press preview campaign | T-3 to T-1 | Secured coverage, review copy access pipeline | Media coverage, influencer seeding |
| Playable ads and demo distribution | T-1 to launch | Reduce download friction, UA conversion efficiency | Playable ad units, demo links |
| Launch week | Launch day ±7 | Multi-channel synchronized activation | Sales velocity, day-7 retention |
| Post-launch reactivation | T+1 to T+6 months | Back catalog monetization, lifecycle marketing | Reactivation campaigns, sequel seeding |
Each phase is detailed in the sections that follow. The tools and tactics shift by phase, but the thread running through all of it is the same: make the game playable as early as possible, in as many contexts as possible, with as little friction as possible.
T-12 to T-6: from pre-production to vertical slice
The first six months of marketing work happen before most studios think marketing has started. The deliverable from this phase is not a trailer. It is a Steam page that starts accumulating wishlists and a first playable build that can be used in Steam Next Fest.
Chris Zukowski from HowToMarketAGame frames this correctly: "Get your Steam page up nice and early: as early as you possibly can. Basically what you're doing is you're trying to collect wishlists." The platform rewards early intent signals. A game with 7,000-10,000 wishlists can appear on Steam's "Popular Upcoming" list (Source: HowToMarketAGame). A game that launches cold cannot.
The median wishlist-to-first-week-sales conversion for games with 25,000+ wishlists is 0.15x (Source: GameDiscoverCo 2025). That conversion rate is not guaranteed. It is a ceiling that requires the right release conditions to approach. But you cannot approach it without the wishlist base.
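The figure above lends itself to a quick back-of-envelope estimate. The sketch below applies the 0.15x median conversion from the GameDiscoverCo data; the wishlist counts are placeholders, and the result is a ceiling under the right release conditions, not a forecast.

```python
# Back-of-envelope first-week sales estimate from a wishlist base.
# The 0.15x median conversion is the GameDiscoverCo 2025 figure cited
# above; treat the result as a ceiling, not a guarantee.

def estimate_first_week_sales(wishlists: int, conversion: float = 0.15) -> int:
    """Estimate first-week unit sales from the pre-launch wishlist count."""
    return round(wishlists * conversion)

# Example: a 25,000-wishlist base at the median 0.15x conversion.
print(estimate_first_week_sales(25_000))  # 3750
```

Running the same arithmetic at a pessimistic conversion (say 0.08x, the warning threshold used later in this playbook) gives the lower bound worth planning inventory and spend around.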
Steam page fundamentals
The page has one job at this stage: communicate the core loop and genre identity clearly enough that the right player wishlists and the wrong player does not. Misleading page assets create wishlists that don't convert. They inflate your numbers and distort your planning.
Key assets for the initial page:
- A header capsule that reads the genre correctly at thumbnail size
- A trailer under 90 seconds, gameplay-first within the first 10 seconds
- Three to five screenshots that show actual gameplay, not cinematic renders
- A description that names the closest genre comparables in the first two sentences
This is not the place for lore or world-building. Players can't wishlist a feeling. They wishlist a game they recognize as something they want to play.
Demo and playable builds for Steam Next Fest
Steam Next Fest is the highest-return event in the indie and mid-tier calendar. The October 2024 edition produced a demo-to-wishlist ratio of 5.26 (Source: GameDiscoverCo 2025). At that ratio, a well-optimized demo with 10,000 plays generates roughly 52,600 wishlists. No paid channel produces that conversion efficiency.
To participate, you need a demo. Building a demo early forces scope decisions that improve the product. Studios that plan the demo from the beginning tend to build cleaner first-hour experiences than studios that retrofit a demo after the fact.
The demo doesn't need to be long. Twenty to thirty minutes of polished content outperforms an hour of rough content. The goal is to deliver the core loop twice, confirm the tone, and end on a strong moment that generates anticipation.
For studios thinking ahead to demo distribution beyond Steam, the same build can be deployed in multiple contexts: press access, playtesting, playable ad units, and retailer kiosks. The demo distribution strategy article covers the full platform comparison. The key design decision at this stage is to build the demo as a standalone slice, not a cut-down version of the main game with content locked.
Community seeding
The T-12 to T-6 window is when you build the early community that will amplify launch. This means:
- A Discord server with genuine developer presence
- Weekly or bi-weekly devlog content (short-form, specific, not vague progress updates)
- Direct outreach to genre-specific content creators, not mass lists
The community you build here is not your primary acquisition channel. It is your signal amplifier. When you have 5,000 people who are genuinely excited and active in your Discord, their organic content reaches an audience you can't buy efficiently.
T-6 to T-3: playtesting for product-market fit
Most publishers treat playtesting as a QA function. Find bugs, fix them, ship. That framing is too narrow and it's expensive. A bug fixed post-launch costs 10-100x more than the same fix made during development (Source: DeepSource). But the more critical problem isn't bugs. It is retention.
If your first hour has a retention cliff that nobody has told you about, and you find out from Metacritic reviews, you have already lost the critical launch window. Press coverage that says "the opening is sluggish" doesn't go away. It compounds. It depresses conversion. It affects the algorithm.
Top studios run 12 playtests per game. The typical studio runs three to four (Source: PlaytestCloud 2025). The delta between those two behaviors corresponds closely with the delta in launch outcomes.
What to test at T-6
At six months out, the build is vertical-slice quality: the core loop works, the first zone or level is polished, the UI is functional if not final. The questions you're testing at this stage are:
- Does the player understand what they're supposed to do without tutorial text?
- Where is the first moment they get confused, frustrated, or bored?
- Is the core loop the thing they describe when asked "what is this game"?
- Do they want to keep playing after the session ends?
You are not testing polish. You are testing whether the product-market fit hypothesis is correct. If players in your target demographic consistently describe the game as something other than what you intended, that is a positioning problem and potentially a design problem. Better to know at T-6 than T-1.
Remote playtesting at scale
Physical playtests are geographically limited and expensive to run frequently. The remote playtesting guide covers the full methodology. The short version: remote playtesting with cloud-streamed builds eliminates the need for test machines at the studio, opens the tester pool to any geography or demographic, and produces session analytics (completion rates, drop-off timestamps, replay behavior) that physical observation misses.
One studio's testimonial: "Playruo for Playtest works like a charm and has a direct impact on team efficiency. For a game developer whose logistics can be complicated, Playruo is an important tool for us."
The practical advantage is iteration speed. With a cloud-streamed build, you can update the build centrally and run a new cohort the next day. No redistribution, no version control across physical machines, no NDA enforcement friction.
Integrating playtest data into the build
Playtest data has no value if it doesn't change the build. Assign a producer to own the playtest-to-decision pipeline. For every finding, the output should be one of three things: fix it, accept it and prepare messaging around it, or flag it as a product risk that escalates to design leadership.
The T-6 to T-3 window is the last realistic window to make significant changes. After T-3, you're in press preview territory. The build journalists play is close to the build you ship. Structural changes at that stage create consistency problems between preview coverage and the final product.
Building your playtest panel
Recruit from your community (Discord, newsletter subscribers), genre-specific Reddit communities, and playtest recruitment platforms. Aim for diversity within your target demographic: mix of experience levels, platforms, ages, and geographies.
Run cohorts of 20-50 testers per round. Smaller cohorts allow for qualitative follow-up. Quantitative patterns emerge after 20+ sessions. Run at least three rounds: one at T-6 (core loop), one at T-5 (revised build), one at T-4 (near-final first hour before press access).
T-3 to T-1: the press preview campaign
Press coverage is not dying. Its mechanics are changing. 62% of game journalists receive 11-50 pitches per day (Source: Big Games Machine 2024). The median pitch is immediately deleted. The ones that get a response offer something the journalist can actually use: exclusive access, a strong hook, and frictionless access to the game.
67% of journalists want review copies at least three weeks before launch (Source: Big Games Machine 2024). The "review embargo lift 24 hours before launch" model that was standard five years ago actively damages launch performance now. It compresses the social proof window. Organic discussion doesn't have time to build before the algorithm decides whether to amplify you.
The press preview timeline
The full workflow for a well-executed press preview campaign is detailed in the press preview guide. The strategic frame here: start earlier, distribute access more widely, and use cloud streaming to remove platform barriers.
A typical timeline from T-12 weeks:
- T-12 weeks: press list finalization, media kit preparation
- T-10 weeks: first wave outreach to tier-1 outlets (embargoed preview, exclusive first-look options)
- T-8 weeks: preview sessions begin, embargoed coverage starts publishing
- T-6 weeks: second wave to tier-2 outlets and genre specialists
- T-4 weeks: review copies distributed, final press FAQ and asset pack
- T-3 weeks: embargo lifts on preview coverage
- T-1 week: review embargo lift
This spread creates multiple coverage waves. The algorithm interprets sustained coverage velocity as a relevance signal. A single spike at launch followed by silence is less valuable than three waves of coverage across eight weeks.
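The milestone schedule above can be computed mechanically backward from a launch date. The sketch below mirrors the week offsets in the timeline; the launch date is a placeholder, and the milestone labels are condensed from the text.

```python
from datetime import date, timedelta

# Press-preview milestones and their week offsets before launch,
# condensed from the timeline above.
MILESTONES = [
    ("Press list finalized, media kit ready", 12),
    ("First wave outreach (tier-1 outlets)", 10),
    ("Preview sessions begin", 8),
    ("Second wave (tier-2, genre specialists)", 6),
    ("Review copies distributed", 4),
    ("Preview embargo lifts", 3),
    ("Review embargo lifts", 1),
]

def press_schedule(launch: date) -> list[tuple[str, date]]:
    """Return (milestone, calendar date) pairs, counted back from launch."""
    return [(name, launch - timedelta(weeks=w)) for name, w in MILESTONES]

# Example with a placeholder launch date.
for name, when in press_schedule(date(2026, 9, 15)):
    print(f"{when.isoformat()}  {name}")
```

Generating the schedule this way makes it trivial to re-issue every outreach deadline when the launch date slips, which it usually does.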
Cloud-streamed press access
The logistics problem with press access at scale is platform distribution. You have a Windows build. A journalist is on Mac. Another is in South Korea at a trade show. Another wants to record a preview but their review machine is occupied with another game. Physical builds on USB keys or installer links don't solve any of these problems well.
Cloud-streamed access solves all of them. The journalist opens a link, the game runs in their browser at full quality, session analytics confirm they played for 47 minutes, and forensic watermarking traces any leak back to the specific access token.
Nacon used this approach for three titles: Hell is Us, Styx: Blades of Greed, and GreedFall 2, coordinating EU and US press access simultaneously. No build redistribution. No regional logistics. Coverage from both regions aligned to the same embargo window.
In Microids' words: "Playruo allowed us to present Empire of the Ants to journalists from all over the world, and the experience was excellent. Playruo has been an invaluable asset to our communications strategy, and a new tool we'll be using for future projects."
Microids also deployed Playruo at Gamescom 2024 for Empire of the Ants: 15-minute social demo links for general audience alongside dedicated journalist virtual sessions. That dual-track approach turned a single event into two parallel acquisition surfaces.
Influencer and creator outreach
The creator economy operates on a different timeline than press. Top creators who cover launch-day content plan their schedules four to eight weeks out. Reaching them at T-2 weeks is too late for a day-one video.
Build a tiered creator list by audience size and genre alignment. The highest-value tier is not the largest channels. It is channels where the audience overlap with your game's target demographic is highest. A 500,000-subscriber channel where 80% of the audience plays your genre outperforms a 5,000,000-subscriber channel where 15% does.
Provide creators with:
- Early access, ideally T-4 to T-6 weeks before launch
- A dedicated analytics link so you can see their play session (opt-in, disclosed)
- Exclusive content permission: early announcements, first looks on specific features
- A direct line for questions. Creators who can't get answers fall back on existing press coverage, which is rarely framed around the game's actual strengths.
T-1 to launch: playable ads and demo distribution
The final month before launch is when the awareness you've built has to convert. Wishlists are intent. Purchase is conversion. The gap between the two is where most launches lose.
40% of players are lost at the download step alone (Source: Xsolla). The download barrier exists at every touchpoint: the App Store, Steam, the Xbox store, the demo page. Anything that requires a download before the player decides to convert introduces attrition that compounds across your entire funnel.
Playable ads and cloud-streamed demos attack the same problem from two directions. Playable ads convert the unaware player at the top of the funnel. Cloud-streamed demos convert the interested player at mid-funnel. Both remove the download barrier. Both deliver the experience before asking for commitment.
Playable ads at the T-1 window
Playable ads produce 32% higher conversion than standard video ads (Source: Liftoff 2025). That conversion premium is not free: playable ads are more expensive to produce and require a build that can run in a constrained web environment.
The playable ads strategy guide covers format decisions, technical constraints, and platform distribution in detail. For this phase, the strategic decision is timing. Playable ads launched too early (T-3 or earlier) peak before the launch window. The algorithm optimization cycle for playable ad campaigns runs two to three weeks. Launch the campaign at T-4 weeks so it peaks at launch.
Playable ads accounted for approximately 18% of all mobile ad spend in gaming in 2024 (Source: Gamewheel 2025). The format is mainstream in mobile and growing in PC and console contexts where browser-based delivery allows the same mechanic. The fundamental behavior the format exploits (players preferring to experience before committing) is not platform-specific.
Key decisions for playable ad production:
- Extract a self-contained two-to-three-minute loop from the core game (a tutorial or introductory level)
- Hard-gate on the CTA: end the playable segment before the player finishes the loop
- Test three to five creative variants of the hook (the opening seconds are critical for completion)
- Match the CTA destination to the platform: Steam page for PC, store page for console, wishlist for pre-launch
Demo distribution strategy
The demo serves a different function than the playable ad. The playable ad converts the unaware. The demo retains and deepens engagement for the already-aware.
Demo-driven wishlists convert at a higher rate than ad-bought wishlists (Source: GameDiscoverCo developer quote). The player who played your demo and wishlisted it has demonstrated intent through action, not just attention. That behavioral signal is more predictive of purchase.
Demo distribution at T-1 has three channels:
Steam: The Steam demo is the default. It appears on your game page, it participates in Next Fest if timed correctly, and it generates wishlist conversions at point of contact. The mechanics are detailed in the demo distribution guide.
Cloud-streamed demo links: Shareable demo links remove the download requirement entirely. A journalist, influencer, or player receives a URL. They click. The game runs. This is particularly effective for influencer seeding: a creator can start streaming your demo four seconds after opening the email.
For Microids at Gamescom 2024, a 15-minute cloud-streamed demo ran for general show attendees via shareable link while dedicated journalist sessions ran in parallel. One build, two audiences, zero setup per viewer. The Gamescom 2024 benchmark for Steam visibility: 1,293 wishlists from 2,745 Steam visits during the event period (Source: Caldera Interactive).
Retailer and partner demo programs: Major retailers (GameStop in the US, Fnac and MediaMarkt in Europe) run in-store demo programs. The barrier has historically been hardware logistics. Cloud-streamed builds remove that barrier. Any screen with a browser becomes a demo kiosk.
Launch week: multi-channel demo deployment
Launch week is not a single moment. It is a seven-day window where velocity determines algorithmic outcomes on every platform. Steam's "New & Trending" list, the Xbox and PlayStation recommendation surfaces, and Google's Play Store all use sales velocity as a primary ranking input. What happens in the first 168 hours shapes the next 90 days.
The median wishlist-to-first-week-sales conversion for games with strong wishlists is 0.15x (Source: GameDiscoverCo 2025). Simon Carless from GameDiscoverCo frames the conversion difficulty as stemming from "increased competition rather than degrading efficiency metrics." The wishlist base is necessary. It is not sufficient.
Day-one execution checklist
24 hours before launch:
- Embargo lifts on review coverage: all review content live
- Paid social amplification live: retarget wishlist audience with "available now" creative
- Email campaign to newsletter list: launch announcement with direct purchase link
- Creator content live: coordinate with top-tier creators for simultaneous publication
- Playable ad campaigns shifted to "purchase" CTA from "wishlist"
Launch day:
- Steam page updated: launch trailer, final screenshot set, launch pricing confirmed
- Social media posts coordinated across all channels, same message window
- Discord community notification: launch announcement with first-hour giveaway or community event
- Press and community monitoring active: respond to coverage within two hours
Days 2-7:
- Monitor Steam tag performance: early user-generated tags shape discovery
- Respond to all press reviews publicly (thank positive, address negative constructively)
- Community management intensive: first-week players have questions; support directly affects review sentiment
- Mid-week media push: pitch "day-3 sales milestone" story to tier-2 outlets if numbers support it
Demo deployment during launch week
The demo doesn't stop being useful at launch. It becomes more useful because there is more traffic.
Keep the Steam demo active through launch week. Players who are on the fence will try the demo before buying. A player who plays 25 minutes of a demo before buying has a significantly lower refund rate than a player who bought on trailer alone. Lower refund rates improve your Steam refund standing, which affects visibility.
Cloud-streamed demo links serve a specific launch-week function: social media share campaigns. A playable demo link in a tweet or TikTok comment converts curiosity into trial in four seconds. That conversion path doesn't exist with a store download.
Coordinating physical and digital touchpoints
For titles with physical distribution, launch week requires coordination between digital activation and retail sell-in. Retail partners need launch assets, demo access for in-store displays, and real-time performance data they can use to adjust shelf placement.
A physical press event typically costs €50,000 to €100,000 when you factor in venue, travel, catering, and staffing at a global scale (Source: Events Industry Council / Oxford Economics 2018; Eventbrite Event Budget Template). Virtual events reduce costs by 35-75% (Source: Markletic 2024). The hybrid model, a physical presence at one anchor location with virtual access for everyone else, captures the press relationship value of in-person while scaling coverage to the full media list.
Post-launch: reactivation and back catalog
The game that sold 80% of its lifetime units in the first two weeks has a lifecycle problem. That pattern was acceptable in the boxed-retail era. It's not acceptable in a streaming-and-subscription world where catalog value compounds over time.
Post-launch strategy has two distinct phases: the reactivation window (T+1 to T+3 months) and the back-catalog phase (T+3 months onward). Each has different objectives and different tools.
Reactivation window: T+1 to T+3 months
The reactivation window captures players who were aware but didn't convert at launch. They saw coverage. They watched a creator video. They added it to their wishlist and then got distracted.
Price promotions are the traditional reactivation tool. But price promotions are nearly 4x less effective at driving page impressions than they were in 2019 (Source: GameDiscoverCo / GDC 2025). A 30% discount that nobody sees moves no units.
More effective reactivation levers:
Content drops: A new chapter, a challenge mode, or a major patch gives press and creators a new hook to cover the game. A second wave of coverage three months after launch reaches players who missed the first wave.
Subscription platform placement: Game Pass, PlayStation Plus, and similar programs generate large trial audiences. The placement doesn't need to convert every subscriber to succeed: a segment moves to outright ownership while the rest stays engaged inside the subscription pool.
Cloud gaming catalog: Cloud gaming market revenue hit $15.74 billion in 2025 and is projected to reach $159.26 billion by 2034 (Source: Fortune Business Insights). Microsoft's cumulative Xbox Cloud Gaming streaming hours reached 140 million, with one third coming from devices that cannot run the content locally (Source: GDC 2025 (Microsoft)). Placing a title in cloud gaming surfaces reaches audiences who would never have purchased it otherwise.
For more on how cloud streaming fits publisher strategy, the cloud gaming for publishers guide covers platform selection, deal structures, and revenue models in detail.
Community-driven events: Tournaments, speedrun competitions, community challenges. These events generate content that the community creates and distributes for free. They have near-zero direct cost and produce sustained social engagement signals.
Back-catalog phase: T+3 months onward
By T+3 months, the game is back catalog. The marketing objectives shift from acquisition to lifetime value maximization.
Key back-catalog tactics:
Franchise trailers: Bundle the current title with an announcement or teaser for the sequel or next game in the series. Players who completed the first game are your highest-quality warm audience for the next one.
Platform expansion: If the game launched on PC only, console ports three to six months later create a new launch event with new press coverage and new wishlist audiences.
Localization expansion: New language localizations open new regional markets. Korea, Japan, Brazil, and Eastern Europe each have distinct gaming communities that respond to localized content.
Demo permanence: Keep the demo available indefinitely. Back-catalog discovery happens continuously. A player who discovers your 18-month-old game through a creator video will try the demo before buying. If it's gone, they don't buy.
Channel strategy by funnel stage
Every channel has a funnel position it is good at. Mismatching channel and funnel stage is one of the most common and expensive mistakes in game marketing. Paid social at the top of funnel building awareness for a game that launches in three days produces low-quality traffic with no conversion time. Press coverage that drops two weeks before launch without a review copy pipeline misses the conversion window entirely.
| Stage | Channel | Primary KPI | Timing |
|---|---|---|---|
| Awareness | Organic social, devlogs, creator seeding | Reach, impressions | T-12 to T-3 |
| Awareness | PR (tier-1 outlets) | Coverage volume, sentiment | T-10 to T-3 |
| Consideration | Playable ads | CTR, play-through rate | T-4 to T+1 month |
| Consideration | Steam demo / cloud demo link | Wishlist conversion, session length | T-4 to ongoing |
| Intent | PR (reviews) | Review score, coverage velocity | T-3 to launch |
| Intent | Creator content (gameplay) | View-to-wishlist rate | T-4 to launch |
| Conversion | Paid search, retargeting | CPA, ROAS | T-1 to T+2 weeks |
| Conversion | Email (wishlist list) | Open rate, purchase CTR | Launch day |
| Retention | Community (Discord, Reddit) | DAU, message volume | Launch onward |
| Reactivation | Content drops, price events | Page impressions, conversion lift | T+1 to T+6 months |
The channels that are systematically underused are in the consideration stage. Most publishers over-invest in awareness (social, PR) and conversion (paid media) while skipping the middle. Players who receive awareness without a way to sample the product cannot move to intent.
Cloud gaming is a consideration-stage channel that most publishers still treat as a late-stage tactical tool. Streaming a demo to a player who found you through organic social converts better than retargeting them with a video ad, and eliminates the download barrier that costs 40% of players before they ever try the game (Source: Xsolla).
Budget allocation: physical vs. cloud
AAA marketing budgets run at 75-100% of development cost (Source: GamesRadar / industry estimate). For a game with a $30 million development budget, that means $22.5-30 million in marketing. Most of it goes to channels with declining efficiency.
The structural problem is inertia. Physical events, broadcast media, and store placements are familiar. Everyone understands their costs. Everyone has accepted their inefficiency. Cloud-based marketing tools are newer and their ROI is harder to benchmark against existing line items, so they get underfunded.
Here is a realistic allocation framework for a mid-tier publisher with a $2-5 million marketing budget:
| Category | % of budget | Key line items | Notes |
|---|---|---|---|
| Paid media (digital) | 30-35% | Social ads, search, display, playable ads | Shift 5-10% of video budget to playable |
| PR and influencer | 20-25% | Agency retainer, creator fees, event access | Cloud access reduces event logistics cost |
| Content production | 15-20% | Trailer, screenshots, playable ad creative, social assets | Playable ad production: $15k-$50k per unit |
| Demo and trial infrastructure | 5-10% | Cloud streaming, Steam demo pipeline, playtest platform | Historically underbudgeted; highest ROI per dollar |
| Community and owned media | 5-8% | Discord moderation, newsletter, social management | Compounds over time; do not cut first |
| Events (physical and virtual) | 5-10% | Trade shows, hybrid press events | Virtual saves 35-75% vs. physical equivalent |
| Reserve and testing | 5% | Mid-campaign pivots, performance channel testing | Keep liquid through launch week |
Where to shift budget in the current market
Three budget shifts are supported by current performance data:
From video ads to playable ads. Playable ads produce 32% higher conversion than video (Source: Liftoff 2025). If your video ad budget is $200,000, reallocating $40,000-$60,000 to playable formats with a modest production investment likely improves blended campaign ROAS.
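The blended effect of that shift is simple arithmetic. The sketch below applies the 32% conversion premium from the Liftoff figure cited above; the budget numbers are the illustrative ones from this paragraph, and the baseline conversions-per-dollar rate is a placeholder.

```python
# Illustrative blended-conversion arithmetic for the video-to-playable
# budget shift. The 32% premium is the Liftoff 2025 figure; a baseline
# rate of 1.0 conversion per dollar is assumed purely for illustration.

def blended_lift(total_budget: float, shifted: float, premium: float = 0.32) -> float:
    """Relative conversion lift from moving `shifted` dollars of video
    budget into playable ads that convert (1 + premium) times as well."""
    video = total_budget - shifted
    baseline = total_budget  # conversions proportional to spend at rate 1.0
    blended = video + shifted * (1 + premium)
    return blended / baseline - 1

# Shifting $50k of a $200k video budget to playable formats:
print(f"{blended_lift(200_000, 50_000):.0%}")  # 8%
```

An 8% lift in blended conversions before accounting for playable production costs; whether the net is positive depends on the $15k-$50k per-unit production figure from the budget table above.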
From physical events to virtual-plus-one. A single physical anchor presence (one person, key market) plus cloud streaming for everyone else delivers comparable press relationship value and coverage reach at a fraction of the €50,000 to €100,000 a fully physical press event typically costs at global scale (Source: Events Industry Council / Oxford Economics 2018; Eventbrite Event Budget Template).
From late paid media to early trial infrastructure. Budget spent on paid acquisition in the final two weeks before launch competes against every other game launching that month. Budget spent on demo infrastructure at T-4 months converts organically for the entire period at near-zero incremental cost per player.
KPIs by launch stage
Measuring the wrong thing at the wrong time produces bad decisions. Impressions at launch week are a vanity metric. Wishlist conversion at T-6 months is a leading indicator. Most marketing dashboards track what's easy to report, not what's predictive.
This is the measurement framework across the launch timeline:
| Stage | Primary KPI | Target benchmark | Warning signal |
|---|---|---|---|
| T-12 to T-6 | Steam wishlist rate (daily) | 50-200/day organic | < 20/day after page launch |
| T-12 to T-6 | Demo session length (Next Fest) | > 20 min median | < 10 min median |
| T-6 to T-3 | Playtest first-hour completion | > 70% | < 50% |
| T-6 to T-3 | Playtest NPS | > 40 | < 20 |
| T-3 to T-1 | Press coverage pieces | 30+ pieces pre-launch | < 10 pieces at T-4 weeks |
| T-3 to T-1 | Preview sentiment score | > 75% positive | < 60% positive |
| T-1 to launch | Playable ad play-through rate | > 60% | < 40% |
| T-1 to launch | Demo wishlist conversion rate | > 15% | < 8% |
| Launch week | Day-1 sales vs. wishlist base | > 12% of wishlists | < 8% of wishlists |
| Launch week | Steam refund rate | < 10% | > 15% |
| T+1 month | Day-14 retention | > 20% | < 10% |
| T+3 months | Review score stability | Within 5% of launch score | Declining > 5% |
Leading vs. lagging indicators
The most actionable KPIs are leading indicators: metrics you can see weeks or months before launch that predict launch outcomes. Wishlist rate, playtest completion, and demo session length are all leading indicators. By the time you see day-1 sales numbers, there is nothing left to optimize.
Build a weekly KPI review process starting at T-12 months. The questions are simple: is the leading indicator trend positive, flat, or negative, and what's the one action you'd take to improve the weakest number? Weekly cadence forces decisions. Monthly cadence is too slow.
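The weekly review above can be reduced to two questions per metric, which makes it easy to automate. The sketch below classifies each leading indicator's trend and flags the metric furthest below target; the metric names mirror the KPI table, but the data structure and thresholds are assumptions, not a real dashboard schema.

```python
# Illustrative sketch of the weekly KPI review: classify each leading
# indicator's trend (positive / flat / negative) and surface the weakest
# metric relative to its target. Inputs are hypothetical weekly readings.

def trend(values, flat_band=0.05):
    """Classify a series of weekly readings by relative change, first to last."""
    if len(values) < 2:
        return "flat"
    change = (values[-1] - values[0]) / max(abs(values[0]), 1e-9)
    if change > flat_band:
        return "positive"
    if change < -flat_band:
        return "negative"
    return "flat"

def weekly_review(metrics):
    """metrics: {name: (weekly_values, target)}. Returns a trend per metric
    and the metric with the lowest latest-reading-to-target ratio."""
    report = {name: trend(vals) for name, (vals, _) in metrics.items()}
    weakest = min(metrics, key=lambda n: metrics[n][0][-1] / metrics[n][1])
    return report, weakest

metrics = {
    "wishlists_per_day": ([45, 52, 48, 61], 50),
    "playtest_completion": ([0.62, 0.60, 0.58, 0.55], 0.70),
    "demo_session_min": ([22, 24, 23, 25], 20),
}
report, weakest = weekly_review(metrics)
print(report, "-> focus this week:", weakest)
```

The output answers the two review questions directly: which trends are negative, and which single number deserves this week's action.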
Attribution in a multi-touch launch
Attributing sales to a specific channel in a multi-touch launch is difficult and the models are imperfect. A player who saw a creator video at T-6 months, played a Steam Next Fest demo, saw a playable ad at T-2 weeks, and bought on launch day has four potential attribution points. Last-touch attribution gives all credit to the final ad. It is usually wrong.
Use a data-driven attribution model if your analytics platform supports it. If it doesn't, use this heuristic: the channel that drove the demo play is usually more responsible for the conversion than the channel that was last seen. Experience converts. Impressions remind.
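The heuristic above ("experience converts, impressions remind") can be expressed as a few lines of logic: credit the channel that drove the demo play, and fall back to last touch only when no demo play exists. The touchpoint schema here is an assumption for illustration, not a real analytics API.

```python
# Illustrative sketch of the demo-first attribution heuristic described above.
# touchpoints: ordered (channel, event) pairs for one player's journey.

def attribute(touchpoints):
    """Credit the channel whose touch was the demo play; otherwise last touch."""
    for channel, event in touchpoints:
        if event == "demo_play":
            return channel
    return touchpoints[-1][0] if touchpoints else None

journey = [
    ("creator_video", "view"),         # T-6 months
    ("steam_next_fest", "demo_play"),  # T-5 months
    ("playable_ad", "click"),          # T-2 weeks
    ("retargeting_ad", "view"),        # launch day
]
print(attribute(journey))  # credits the demo-driving channel, not the last ad
```

Last-touch attribution would credit the retargeting ad; this heuristic credits Next Fest, which is usually closer to the truth for demo-led launches.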
One build, four use cases
The traditional model treats each marketing use case as a separate production effort. Press access requires a build with a watermark system. Playtesting requires a build with session recording. Playable ads require a web-adapted truncated build. Demo distribution requires packaging and platform submission. Each has separate tooling, separate workflows, and separate costs.
The cloud streaming model collapses this. One build, streamed from a central server, serves all four use cases simultaneously: press previews, remote playtesting, playable ads, and demo distribution. The game doesn't change. The delivery context changes.
What one-build means operationally
A cloud-streamed build is the canonical version of your game at any point in development. When you update the build, every surface that streams it updates automatically. No redistribution. No version fragmentation across press, testers, and demo players.
This has specific implications for each use case:
Press previews: Every journalist accesses the same current build via their personal access link. Per-session forensic watermarking means any leak is traceable without any additional integration.
Playtesting: Testers access the build through a standard browser. No test machines, no version distribution, no NDAs tied to physical media. Session analytics (duration, completion rate, drop-off timestamps, geography) come from the same infrastructure as press access.
Playable ads: A truncated, time-limited version of the same build can be distributed as a shareable URL embedded in ad units. The player experiences the actual game, not a reproduction of it. The game engine, the controls, the feel are all real.
Demo distribution: The same link architecture used for press serves as a public demo. Time-gated, play-limited, geography-targeted if needed. No download requirement. The 40% of players lost at the download step (Source: Xsolla) is a drop-off this architecture eliminates entirely.
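The time-gated, play-limited link behavior described above can be sketched as a simple gate check. This is purely illustrative: Playruo's actual gating interface is not documented in this piece, so the field names and rules below are assumptions.

```python
# Illustrative sketch of a time-gated, play-limited demo link check.
# The link schema (expires_at, max_sessions) is a hypothetical example,
# not a real Playruo API.

import time

def session_allowed(link, sessions_used=0, now=None):
    """link: dict with 'expires_at' (epoch seconds) and 'max_sessions'.
    Returns (allowed, reason)."""
    now = now if now is not None else time.time()
    if now > link["expires_at"]:
        return False, "link expired"
    if sessions_used >= link["max_sessions"]:
        return False, "play limit reached"
    return True, "ok"

demo_link = {"expires_at": time.time() + 14 * 86400,  # 14-day demo window
             "max_sessions": 3}                       # three plays per link
print(session_allowed(demo_link, sessions_used=1))
```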
The economics of consolidation
Running four separate tools for four use cases multiplies fixed costs, vendor relationships, and integration overhead. Consolidating onto one infrastructure reduces each of those by roughly 75%.
More importantly, it changes the unit economics of each use case individually. The marginal cost of running a playtest cohort drops when the infrastructure is already running for press access. The marginal cost of distributing a demo link drops when the streaming infrastructure is already provisioned.
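The "roughly 75%" figure above is simply the fixed-cost arithmetic of four tools collapsing to one. The sketch below makes that explicit; the per-tool cost figures are illustrative assumptions, not vendor quotes.

```python
# Illustrative arithmetic for the consolidation claim: four separate tools
# vs. one shared infrastructure. All dollar figures are assumptions.

def total_cost(fixed_per_tool, n_tools, integration_per_tool):
    """Annual fixed + integration cost across n separate tools."""
    return n_tools * (fixed_per_tool + integration_per_tool)

separate = total_cost(fixed_per_tool=12_000, n_tools=4, integration_per_tool=5_000)
consolidated = total_cost(fixed_per_tool=12_000, n_tools=1, integration_per_tool=5_000)
savings = 1 - consolidated / separate
print(f"separate: ${separate:,}  consolidated: ${consolidated:,}  savings: {savings:.0%}")
```

Whatever the real per-tool numbers are, going from four fixed-cost lines to one yields the same ~75% reduction on that portion of the budget.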
Playruo's architecture for publishers
Playruo is built for this use case. The platform runs on QUIC protocol with H.264, HEVC, VP9, and AV1 codec support. Playruo reports glass-to-glass latency of 8 ms, which it describes as the lowest in the industry (Source: playruo.com/technology). The browser-based delivery requires no download, no account creation, and no app.
There's no SDK, no porting requirement, and no code changes. A publisher provides the build. Playruo provides the streaming infrastructure, the session analytics, the per-session watermarking, and the white-label interface.
Pricing is usage-based: per session, not per seat. For a press preview with 200 journalists, you pay for 200 sessions. For a demo distribution campaign with 10,000 players, you pay for 10,000 sessions.
Details on the full platform capability are at /why-playruo.
When the model applies
The one-build model applies to any publisher with more than one active marketing use case. If you are running press previews AND planning a Steam demo, you have two use cases and the consolidation math is immediately favorable.
The model is particularly valuable for:
- Mid-tier publishers handling three to five titles per year (the overhead savings compound across the catalog)
- Teams launching simultaneously in multiple regions (no regional build distribution, single infrastructure)
- Studios with ongoing live-service titles that run continuous playtesting and creator access
The global games market reached $197 billion in 2025, up 7.5% year-over-year (Source: Newzoo 2025). Gaming captures less than 5% of global media ad spend despite having 3.4 billion players (Source: Dentsu Gaming 2025). That gap closes as gaming-native marketing tools become more efficient and more measurable. One-build infrastructure is a step in that direction.
Sources
| Source | URL | Note |
|---|---|---|
| HowToMarketAGame / GamesRadar 2026 | https://howtomarketagame.com | Steam 2025 release stats: 20,282 games, 3% reached 1,000+ reviews, 48.8% got fewer than 10 |
| GameDiscoverCo / GDC 2025 | https://gamediscover.co | Steam discounts 4x less effective at generating page impressions vs 2019 |
| Newzoo 2025 | https://newzoo.com | Global games market $197 billion, +7.5% YoY |
| Dentsu Gaming 2025 | https://www.dentsu.com | Gaming < 5% of global media ad spend despite 3.4 billion players |
| GamesRadar / industry estimate | https://www.gamesradar.com | AAA marketing budgets 75-100% of development cost |
| GameDiscoverCo 2025 | https://gamediscover.co | Median wishlist-to-first-week-sales conversion: 0.15x for 25,000+ wishlists |
| GameDiscoverCo 2025 | https://gamediscover.co | Top performers: 214 days pre-release visibility; underperformers: 411 days |
| GameDiscoverCo 2025 | https://gamediscover.co | Steam Next Fest October 2024: demo-to-wishlist ratio of 5.26 |
| HowToMarketAGame | https://howtomarketagame.com | 7,000-10,000 wishlists minimum for Steam Popular Upcoming |
| Liftoff 2025 | https://liftoff.io | Playable ads 32% higher conversion than standard video ads |
| Xsolla | https://xsolla.com | 40% of players lost at the download step |
| GameDiscoverCo developer quote | https://gamediscover.co | Demo-driven wishlists convert better than ad-bought wishlists |
| Gamewheel 2025 | https://gamewheel.io | Playable ads ~18% of all mobile ad spend in gaming in 2024 |
| Adjust 2026 | https://www.adjust.com | Gaming CPI $0.56 in 2025, up 30% YoY |
| AppsFlyer 2026 | https://www.appsflyer.com | Mobile games spent $25 billion on UA in 2025 |
| Events Industry Council / Oxford Economics 2018 | https://insights.eventscouncil.org/Portals/0/OE-EIC%20Global%20Meetings%20Significance%20%28FINAL%29%202018-11-09-2018.pdf | Global business events benchmark; average spend per participant and cost structure context for large-scale in-person events |
| Eventbrite Event Budget Template | https://www.eventbrite.com/resources/budgets/event-budget-template/ | Event budget framework covering venue, refreshments, promotion, staff, travel, and accommodation |
| Markletic 2024 | https://www.markletic.com | Virtual events reduce costs by 35-75% vs in-person, 3,960 respondents |
| Caldera Interactive / Gamescom 2024 | https://calderainteractive.com/blog/gamescom-wishlist-case-study/ | 1,293 wishlists from 2,745 Steam visits during Gamescom 2024 event |
| GDC 2025 / GameDiscoverCo | https://gdconf.com | 67% of total PC gaming hours in 2024 went to titles 6+ years old |
| Fortune Business Insights | https://www.fortunebusinessinsights.com | Cloud gaming market $15.74 billion in 2025, projected $159.26 billion by 2034 |
| GDC 2025 (Microsoft) | https://gdconf.com | 140 million cumulative Xbox Cloud Gaming streaming hours; 1/3 from incapable devices |
| PlaytestCloud 2025 | https://www.playtestcloud.com | Top studios run 12 playtests per game; typical: 3-4 (5,400 playtests analyzed) |
| DeepSource | https://deepsource.com | Bug fix post-launch costs 10-100x more than during development |
| Big Games Machine 2024 | https://www.biggamesmachine.com | 62% of journalists receive 11-50 pitches per day |
| Big Games Machine 2024 | https://www.biggamesmachine.com | 67% of journalists want review copies 3+ weeks before launch |
| Playruo technology page | https://playruo.com/technology | Glass-to-glass latency 8 ms (self-reported); QUIC protocol; H.264, HEVC, VP9, AV1 codecs |