[Header image: A futuristic game studio where a holographic AI assistant generates game environments while human developers review and modify them on large screens.]

Highlights

  • Generative AI for game design: Researchers introduced the World and Human Action Model (WHAM), a generative AI tool designed to enhance creativity in game development.
  • Bridging AI and human ideation: WHAM supports divergent thinking (generating diverse gameplay ideas) and iterative practice (refining game mechanics).
  • User-driven innovation: A study of 27 game developers revealed that WHAM helps maintain creative agency by generating consistent, diverse, and persistent gameplay sequences.
  • Learning from human play: WHAM was trained on real gameplay data and can adapt to user modifications, making AI a valuable creative partner.
  • Open-source future: The WHAM model and its datasets are publicly available, paving the way for further advancements in AI-driven game design.

TL;DR

WHAM, a generative AI model for video game development, enhances creative ideation by balancing diversity, consistency, and user control. Trained on real gameplay data, WHAM supports interactive and iterative game design, making AI a more effective creative partner for game developers.


Revolutionizing Game Design: AI as a Creative Partner

The rise of generative artificial intelligence (AI) has transformed creative fields, from art and music to video game development. However, integrating AI into creative workflows remains a challenge. Existing AI models often fail to align with human creativity, making them difficult to use for generating new ideas.

A new study published in Nature presents an innovative approach to this problem. Researchers have developed the World and Human Action Model (WHAM)—a generative AI tool designed specifically to support gameplay ideation. By leveraging human gameplay data, WHAM enables game designers to create cohesive, dynamic, and interactive game elements while maintaining their creative agency.

Understanding WHAM: AI That Thinks Like a Game Designer

WHAM is a state-of-the-art generative AI model trained on large-scale datasets of human gameplay. Unlike previous AI tools, WHAM does not merely generate random game elements. Instead, it learns from real gameplay interactions to produce consistent and diverse sequences that match how humans actually play games.

Researchers identified three critical capabilities that AI must have to support creative ideation in game design:

  1. Consistency: AI-generated gameplay must follow the rules of the game world. For example, characters should not suddenly float or pass through walls.
  2. Diversity: The AI should offer multiple variations of gameplay sequences, allowing developers to explore different creative directions.
  3. Persistency: If a designer makes a change, such as adding a character, the AI should retain and incorporate that modification rather than disregarding it in later generations.

The Creative Process: AI + Human Collaboration

To understand how AI can best support creativity, the researchers conducted interviews with 27 game developers. These developers emphasized the importance of divergent thinking (exploring new ideas) and iterative practice (refining concepts over time).

Traditionally, game design is a time-consuming process. A single indie game can take two or more years to develop, while large AAA games may take five years or longer. Designers must go through multiple iterations, adjusting everything from character animations to level layouts.

WHAM streamlines this process by generating gameplay sequences that are both novel and functional. Game developers can use WHAM to quickly prototype new ideas, reducing the trial-and-error phase of development.

Training WHAM: Learning from Human Gameplay

To build WHAM, researchers trained the model using data from real human players. Working with the game studio Ninja Theory, they collected gameplay footage from the multiplayer action game Bleeding Edge. The dataset included:

  • 500,000 gameplay sessions, equivalent to more than seven years of continuous play.
  • 1.4 billion frames of video, downsampled for training efficiency.
  • Player input data, capturing how gamers controlled characters.
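At each timestep, the training data pairs a video frame with the controller state the human player produced at that instant. A minimal sketch of what one such record might look like (the field names here are hypothetical; the article does not describe the actual schema at this level of detail):

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class GameplayRecord:
    """One timestep of training data: an observation plus the human action.

    Field names are illustrative, not the paper's actual schema.
    """
    frame: bytes               # downsampled video frame (e.g. an encoded image)
    buttons: List[str]         # controller buttons held at this timestep
    left_stick: Tuple[float, float]  # (x, y) analog stick position in [-1, 1]
    timestamp_ms: int          # milliseconds since the session started

def downsample(records: List[GameplayRecord], keep_every: int = 3) -> List[GameplayRecord]:
    """Keep every Nth record, reducing the effective frame rate for training."""
    return records[::keep_every]

# Toy usage: 9 records downsampled 3x leaves 3.
session = [GameplayRecord(b"", [], (0.0, 0.0), t * 100) for t in range(9)]
assert len(downsample(session)) == 3
```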

WHAM uses a transformer-based AI model, similar to those found in modern AI chatbots, but optimized for video game dynamics. By learning from this massive dataset, WHAM can generate coherent game worlds where AI-generated characters move naturally and respond realistically.
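At a high level, a model of this kind autoregressively predicts the next token in a stream that interleaves observation tokens and action tokens. A toy sketch of that generation loop (the stub `predict_next` stands in for the trained transformer, whose internals are not covered here):

```python
# Toy illustration of interleaved observation/action token generation.
# `predict_next` is a stand-in for a trained model; it emits a
# deterministic dummy token so the loop is runnable.

def predict_next(context):
    """Stub: a real model would return a learned prediction here."""
    step = len(context)
    return f"obs_{step}" if step % 2 == 0 else f"act_{step}"

def generate(prompt, n_tokens):
    """Autoregressive loop: each new token is conditioned on all prior ones."""
    context = list(prompt)
    for _ in range(n_tokens):
        context.append(predict_next(context))
    return context

seq = generate(["obs_0", "act_1"], 4)
# Observation and action tokens alternate, mirroring the training data.
assert seq == ["obs_0", "act_1", "obs_2", "act_3", "obs_4", "act_5"]
```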

Testing WHAM: How Well Does It Work?

To evaluate WHAM’s effectiveness, researchers tested the model’s ability to generate consistent, diverse, and persistent gameplay sequences.

  • Consistency Testing: The team used a metric called Fréchet Video Distance (FVD) to compare WHAM’s gameplay sequences to real human play. Lower FVD scores indicated higher realism in AI-generated gameplay.
  • Diversity Testing: To assess WHAM’s ability to generate different gameplay paths, researchers measured the Wasserstein distance, a statistical method for comparing variations in player actions.
  • Persistency Testing: Researchers manually inserted objects (like power-ups or new characters) into a game scene and checked whether WHAM retained those changes in subsequent gameplay generations.
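For equal-sized one-dimensional samples, the Wasserstein-1 distance reduces to the mean absolute difference between the sorted samples. A minimal sketch of that computation (the paper's actual measurement is over distributions of player actions, not these toy numbers):

```python
def wasserstein_1d(a, b):
    """Wasserstein-1 distance between two equal-sized 1-D empirical samples.

    For equal-sized samples this is the mean absolute difference between
    the sorted values: the "work" needed to move one distribution onto
    the other. Larger values mean the two behaviors differ more.
    """
    assert len(a) == len(b), "this shortcut requires equal sample sizes"
    return sum(abs(x - y) for x, y in zip(sorted(a), sorted(b))) / len(a)

# Identical distributions are zero distance apart...
assert wasserstein_1d([1, 2, 3], [3, 2, 1]) == 0.0
# ...and shifting every value by 1 gives a distance of 1.
assert wasserstein_1d([1, 2, 3], [2, 3, 4]) == 1.0
```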

The results were highly promising. WHAM-generated gameplay sequences closely matched real human play, offering multiple paths while maintaining in-game physics and logic. Moreover, when users modified a game scene, WHAM preserved their changes more than 85% of the time.
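The persistency test amounts to: insert an object into the conditioning frames, generate a continuation, and measure the fraction of generated frames that still contain it. A toy sketch of that measurement (frames are modeled here as sets of object names; the real evaluation operates on generated images):

```python
def persistency_rate(generated_frames, inserted_object):
    """Fraction of generated frames that retain a manually inserted object."""
    kept = sum(1 for frame in generated_frames if inserted_object in frame)
    return kept / len(generated_frames)

# Toy rollout: a power-up inserted by the designer survives 4 of 5 frames.
rollout = [
    {"player", "power_up"},
    {"player", "power_up", "enemy"},
    {"player", "power_up"},
    {"player"},                # the inserted object was dropped here
    {"player", "power_up"},
]
assert persistency_rate(rollout, "power_up") == 0.8
```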

The WHAM Demonstrator: Bringing AI to Game Studios

To showcase WHAM’s potential, the researchers developed a user-friendly interface called the WHAM Demonstrator. This tool allows game designers to:

  • Input visual prompts (e.g., screenshots) to guide AI-generated sequences.
  • Modify gameplay elements and see how the AI adapts in real time.
  • Experiment with multiple iterations, choosing from diverse gameplay variations.
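The Demonstrator's workflow can be thought of as a generate-select-edit loop. A hypothetical sketch of that loop with stub functions (the Demonstrator's real interface and API are not described in this article):

```python
import random

def generate_variants(scene, n=3, seed=0):
    """Stub generator: a real system would call the WHAM model here."""
    rng = random.Random(seed)
    return [scene + [f"event_{rng.randint(0, 99)}"] for _ in range(n)]

def ideation_loop(start_scene, iterations=2):
    """Generate variants, pick one, edit it, and regenerate from the edit."""
    scene = list(start_scene)
    for i in range(iterations):
        variants = generate_variants(scene, seed=i)
        scene = variants[0]                  # the designer picks a variant
        scene.append(f"designer_edit_{i}")   # ...and modifies it by hand
    return scene

# The loop starts from a visual prompt and accumulates both AI-generated
# events and designer edits, which persist into later iterations.
final = ideation_loop(["screenshot_prompt"])
assert final[0] == "screenshot_prompt"
assert "designer_edit_1" in final
```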

By making WHAM’s code and datasets publicly available, the researchers hope to encourage further innovation in AI-powered game design.

Implications: A New Era of AI-Driven Creativity

WHAM represents a significant leap forward in AI-assisted creativity. Unlike traditional AI models that generate static content, WHAM supports interactive and iterative game design.

The implications extend beyond gaming. WHAM’s approach could be adapted to other creative fields, such as:

  • Film and animation: AI-generated scenes that follow consistent visual storytelling rules.
  • Music composition: AI that generates diverse but coherent musical variations.
  • Virtual reality (VR) and simulation design: AI-generated environments that adapt to user input.

By prioritizing human creativity and control, WHAM exemplifies how AI can enhance rather than replace artistic expression.

Final Thoughts: The Future of AI in Creative Fields

Generative AI is still in its early stages, but models like WHAM demonstrate its potential to revolutionize creative industries. By understanding how humans create and iterate, AI can become a true creative partner, supporting innovation while respecting human intuition and expertise.

As game developers continue to explore the possibilities of AI, WHAM offers a glimpse into the future of interactive entertainment—one where AI and human creativity work together to craft immersive, dynamic, and endlessly innovative worlds.


Source: Kanervisto, A., Bignell, D., Wen, L. Y., Grayson, M., Georgescu, R., Valcarcel Macua, S., … & Hofmann, K. (2025). World and Human Action Models towards gameplay ideation. Nature. https://doi.org/10.1038/s41586-025-08600-3
