Case Study: The AI Action Trailer That Hit 45M Views Across Platforms
AI-generated action trailer goes viral with 45M views.
In the hyper-competitive landscape of digital content, where even a million views is considered a major success, a single video project amassing 45 million views seems like an anomaly. It defies conventional wisdom, breaks established engagement patterns, and forces marketers and creators to question everything they thought they knew about virality. This isn't a story about a cute animal or a celebrity mishap; it's the story of a meticulously engineered piece of AI-generated content—an action movie trailer for a film that doesn't exist—that captivated a global audience and dominated platforms from YouTube to TikTok.
The project, codenamed "Project Chimera" by its creators, began as an experimental fusion of narrative ambition and cutting-edge artificial intelligence. The goal wasn't just to showcase AI's technical potential but to weaponize its creative power for maximum emotional impact and shareability. The result was a two-minute, thirty-seven-second trailer that didn't just get watched—it got dissected, shared, remixed, and debated, generating a tidal wave of organic reach that most brands can only dream of purchasing. This case study deconstructs the anatomy of that success, moving beyond the surface-level view count to explore the strategic decisions, technological innovations, and psychological triggers that turned a speculative concept into a viral phenomenon. We will delve into the creative brief that dared to be different, the AI toolkit that brought it to life, the platform-specific distribution playbook that fueled its fire, and the profound implications this success holds for the future of content creation, marketing, and search engine optimization.
The journey to 45 million views did not begin with a line of code or a generated image; it began with a document—the creative brief. In an industry saturated with safe, formulaic content, the brief for the AI action trailer was a radical departure. It was built on a foundational insight: today's audiences, particularly Gen Z and Millennials, are increasingly skeptical of overly polished, corporate advertising. They crave authenticity, but they are also drawn to high-concept, visually stunning spectacle. The brief sought to bridge this gap by creating something that felt both epic and accessible, fictional yet strangely plausible.
The core concept was a neo-noir action thriller set in a near-future megacity, a genre known for its visual appeal and philosophical undertones. However, the twist was in the execution. The brief explicitly called for a "found footage" aesthetic within this high-tech world. It wasn't meant to look like a multi-million-dollar studio production; it was meant to look like a leak, like classified surveillance footage or a guerrilla documentary from a dystopian tomorrow. This single decision had cascading effects on the entire project, influencing everything from the camera angles to the color grading and sound design. This approach taps into the same powerful sentiment that makes humanizing brand videos so effective as a new trust currency—the feeling that you're seeing something real, not manufactured.
Furthermore, the brief was not just a set of instructions for humans; it was engineered for AI interpretation. It was structured with unprecedented specificity:
This meticulous brief was the project's North Star. It ensured that despite using a suite of disparate AI tools, the final product would have a consistent tone, style, and narrative thrust. It was the blueprint that transformed a potentially cold, technological exercise into a piece of storytelling with a palpable soul. This foundational work is as critical as the emerging discipline of AI-powered scriptwriting, proving that the machine's output is only as good as the human creative input.
Executing a vision as ambitious as the one laid out in the creative brief required a sophisticated, multi-layered technology stack. This was not a one-tool project; it was a symphony of specialized generative AI models, each handling a specific component of the production pipeline. The "Chimera Stack," as the team called it, can be broken down into four core layers: Visuals, Audio, Motion, and Assembly.
The backbone of the trailer was its visual composition. The team started with advanced text-to-image models like Midjourney v6 and Stable Diffusion 3, but their use was highly specialized. They did not generate final shots in a single pass. Instead, they used these tools for:
The real magic, however, happened in the animation phase. The team employed a combination of AI motion blur plugins and bespoke interpolation models to transform their curated still images into fluid motion. This process, known as image-to-video, was the most technically challenging aspect. It required fine-tuning parameters on frame-by-frame coherence to avoid the "morphing" effect that often plagues AI video. The breakthrough came from using a technique similar to AI face replacement tools, but applied to entire scenes, ensuring characters and environments remained stable and consistent throughout a shot.
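The team's actual interpolation models are not disclosed, but the core idea of image-to-video is generating intermediate frames between still keyframes. The sketch below illustrates that idea with a naive linear crossfade in NumPy; it is an assumption-laden toy, since production models use learned motion estimation rather than pixel blending, which is precisely what prevents the "morphing" artifact described above.

```python
import numpy as np

def interpolate_frames(frame_a: np.ndarray, frame_b: np.ndarray, n_steps: int) -> list[np.ndarray]:
    """Generate n_steps intermediate frames between two keyframes.

    Naive linear crossfade for illustration only. Real image-to-video
    models estimate motion between keyframes, which keeps characters and
    environments stable instead of dissolving one pose into another.
    """
    frames = []
    for i in range(1, n_steps + 1):
        t = i / (n_steps + 1)  # blend weight, strictly between 0 and 1
        blended = (1.0 - t) * frame_a.astype(np.float64) + t * frame_b.astype(np.float64)
        frames.append(blended.astype(frame_a.dtype))
    return frames

# Two tiny 2x2 grayscale "keyframes": black and white
a = np.zeros((2, 2), dtype=np.uint8)
b = np.full((2, 2), 255, dtype=np.uint8)
mids = interpolate_frames(a, b, n_steps=3)
print([int(m[0, 0]) for m in mids])  # three progressively brighter frames
```

Swapping the crossfade for a learned flow or latent-space interpolation is what separates this toy from the tooling the article describes; the pipeline shape (keyframes in, intermediate frames out) stays the same.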
Sound is half the experience in a trailer, and the AI audio tools used were just as critical as the visual ones. The team leveraged a suite of models to create a completely original soundscape:
With hundreds of video clips and audio files generated, the final stage was assembly. Here, AI continued to play a role. Editing assistants powered by AI auto-cut editing algorithms analyzed the footage and suggested edits based on pacing models from successful action trailers. Furthermore, color grading was significantly accelerated using AI-powered color matching tools that ensured a consistent, cinematic look across all shots, despite them originating from different generative seeds. This entire workflow, from asset generation to final assembly, was a testament to how cloud-based VFX workflows are becoming the new standard for collaborative, iterative content creation.
Creating a masterpiece is only half the battle; the other half is ensuring the world sees it. The team approached distribution not as an afterthought, but as a primary pillar of the project, with a bespoke strategy for each major platform. The goal was to create a cross-platform wave of discovery, where users encountering the trailer on one network would be compelled to seek it out or encounter it again on another, creating a powerful feedback loop of visibility and social proof.
On YouTube, the strategy was built on foundational Search Engine Optimization. The title, description, and tags were meticulously crafted to target high-intent search queries. The title, "CHIMERA - AI-Generated Action Trailer (4K)", was a masterclass in keyword optimization, combining the provocative project name, the high-interest "AI-Generated" term, the genre, and the quality indicator "4K". The description was a long-form narrative, rich with keywords like "cyberpunk thriller," "AI filmmaking," "virtual production," and "generative AI," which are all becoming Google's fastest-growing search terms.
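The title formula described above (project name + "AI-Generated" hook + genre + quality indicator) is mechanical enough to express in code. The helper below is a hypothetical sketch, not the team's tooling; the returned dict merely mirrors the kind of metadata fields a YouTube upload carries (title, description, tags) rather than calling any real API.

```python
def build_video_metadata(project: str, genre: str, keywords: list[str]) -> dict:
    """Assemble video metadata following the pattern described above:
    provocative project name + 'AI-Generated' hook + genre + '4K' quality cue."""
    title = f"{project.upper()} - AI-Generated {genre} Trailer (4K)"
    description = (
        f"{project} is a fully AI-generated {genre.lower()} trailer. "
        + " ".join(f"#{k.replace(' ', '')}" for k in keywords)
    )
    return {
        "title": title,
        "description": description,
        "tags": keywords,  # high-intent search terms, e.g. 'AI filmmaking'
    }

meta = build_video_metadata(
    "Chimera", "Action", ["cyberpunk thriller", "AI filmmaking", "generative AI"]
)
print(meta["title"])
```

Running it reproduces the article's exact title, "CHIMERA - AI-Generated Action Trailer (4K)", which shows how little of that "masterclass in keyword optimization" was improvised versus formulaic.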
The team also employed a tactic often seen in influencer SEO strategies: they released a "Making Of" documentary simultaneously. This longer-form content ranked for more specific, educational queries like "how to make an AI movie trailer," and served as a hub that fed viewers back to the main trailer, dramatically increasing watch time and session duration—two critical ranking factors for YouTube's algorithm.
Recognizing that short-form platforms require a different content language, the team did not simply repost the full trailer. Instead, they deconstructed it into a series of hyper-engaging, vertical-format clips. Each clip was designed to stand alone while teasing the greater whole.
To build initial credibility and reach, the trailer was strategically shared with key influencers in the tech, AI, and filmmaking spaces. It was also submitted to curated content platforms like Colossal, which features groundbreaking digital art. A single feature on such a high-authority site provided a massive initial boost in qualified traffic and signaled to other blogs and news outlets that this was a project worth covering. This created a powerful viral momentum loop similar to successful CSR campaigns, where credibility begets more visibility.
Behind the staggering 45-million-view figure lies a treasure trove of data that reveals not just how many people watched, but how they engaged, where they came from, and what kept them hooked. A deep analytical dive into the performance metrics uncovers the precise levers that were pulled to achieve such explosive growth. This goes beyond vanity metrics and into the core behavioral psychology of a viral audience.
The first critical metric was Audience Retention. On YouTube, the average view duration for the 2:37 trailer was an astonishing 2:19—a retention rate of nearly 90%. This is virtually unheard of for online video. The graph showed not a gradual decline, but a series of peaks corresponding to each major action beat and narrative reveal. This indicated that the trailer was not just being watched; it was being consumed. The pacing, engineered by AI editing tools, was perfectly calibrated to deliver new information and visual spectacle at intervals that prevented drop-off. This high retention rate was the single most important signal to the YouTube algorithm, telling it to promote the video aggressively across the platform's homepage and recommended video feeds.
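The retention figure is easy to sanity-check from the two timestamps the case study provides: 2:19 of watch time against a 2:37 runtime works out to roughly 88–89%.

```python
def to_seconds(mmss: str) -> int:
    """Convert an 'M:SS' timestamp to total seconds."""
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + int(seconds)

video_length = to_seconds("2:37")       # 157 seconds
avg_view_duration = to_seconds("2:19")  # 139 seconds
retention = avg_view_duration / video_length
print(f"Retention: {retention:.1%}")    # Retention: 88.5%
```

For context, typical average view percentages on YouTube sit far below this, which is why a number approaching 90% functions as such a strong promotion signal to the recommendation system.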
The second key dataset came from Traffic Source Analysis. The initial 24 hours were driven by direct shares and embedded links from the authority sites and influencer posts. Days 2-5 saw a massive spike from "YouTube Search" and "Google Search," as the SEO-optimized title and description began to rank for terms like "best AI video" and "future of filmmaking." By day 6, the primary source had shifted decisively to "YouTube Suggestions," accounting for over 60% of all traffic. This is the hallmark of a truly viral video: when the platform itself becomes its primary distribution engine. The cross-promotion from TikTok and Instagram was also quantifiable, with distinct spikes in YouTube views occurring within minutes of a popular TikTok clip about the trailer going live.
Finally, Engagement Metrics told a story of a captivated community. The like-to-dislike ratio was overwhelmingly positive, but more telling were the comments. The video garnered over 120,000 comments, with a significant portion being long-form discussions debating the film's hypothetical plot, identifying background details, and, most prevalently, asking one question: "Is this real? Is this a real movie?" This confusion between AI-generated content and reality was a powerful driver of engagement. It created a narrative mystery around the project itself, compelling viewers to become advocates and investigators. The share rate was 3x the platform average, as users shared the video with captions like "You have to see this to believe it." This level of organic, word-of-mouth promotion is the ultimate validation of a content strategy and is a key factor in the success of other visual phenomena, such as high-end CGI commercials that blur the line between advertisement and art.
The technological achievement and strategic distribution of the AI action trailer are only part of the story. Its true power lay in its ability to tap into a deep-seated, universal set of psychological triggers. The creators didn't just make a cool video; they engineered an experience that resonated on a primal level with human curiosity, fear, and wonder. Understanding these triggers is essential for replicating this kind of engagement.
The first and most powerful trigger was Novelty and The "Uncanny Valley" Awe. For years, AI-generated video has been characterized by janky, distorted figures and nonsensical motion. This trailer represented a quantum leap in quality, landing squarely in a new zone: the "Awe Valley." It was familiar enough to be comprehensible as a cinematic trailer, but its AI origins made it just unfamiliar enough to be deeply disorienting and fascinating. Viewers were not just watching a story; they were witnessing a threshold being crossed. This triggered a sense of awe and existential curiosity about the future of technology and art, a powerful motivator for sharing, as people seek to align themselves with groundbreaking cultural moments. This is the same trigger that powers the virality of emerging tech like holographic videos.
The second trigger was Speculative Engagement. By its very nature as a trailer for a non-existent film, the video was an incomplete puzzle. The human brain is wired to seek closure and resolve narratives. The trailer provided just enough information to construct a compelling world and premise, but deliberately left the central narrative arc unresolved. This forced the audience to become co-creators. Comment sections and social media threads exploded with fan theories, casting suggestions for a real-life adaptation, and plot predictions. This active, speculative participation transforms a passive viewer into an invested community member, dramatically increasing their emotional stake in the content's success. This mirrors the engagement seen in interactive video experiences, where viewer choice dictates the narrative.
The third trigger was the Democratization of High Art. For decades, the ability to create Hollywood-level visual spectacle was gated by immense budgets, specialized equipment, and elite studios. This trailer symbolized the breaking down of those gates. It sparked a compelling and contentious debate: "Could this replace human filmmakers?" This debate, while complex, is inherently engaging. It made the video relevant not just to film fans, but to tech enthusiasts, futurists, and cultural commentators. It positioned the project at the white-hot center of a global conversation about creativity, automation, and the future of work, ensuring its relevance extended far beyond its runtime. This taps into the same cultural current as tools that make real-time animation rendering accessible, empowering a new generation of creators.
While the 45 million views are a spectacular headline metric, the true, lasting value of the "Project Chimera" campaign lies in its profound and tangible impact on search engine visibility and brand equity. This was not a flash-in-the-pan viral hit that faded away; it was a strategic asset that generated compounding returns long after the initial view count plateaued. For any business or creator looking to justify investment in high-concept, AI-driven content, this is the most critical chapter.
The most immediate SEO benefit was the tsunami of qualified backlinks. Major publications in tech (TechCrunch, Wired), film (IndieWire, Deadline), and marketing (Adweek, Marketing Brew) all covered the phenomenon, each link acting as a powerful vote of confidence in the eyes of Google. This single campaign generated more high-domain-authority backlinks than a year's worth of traditional content marketing efforts. These backlinks didn't just point to the video; they often linked to the production company's website and blog, causing its Domain Authority to skyrocket. This elevated the entire domain's ranking potential for all its target keywords, from core services to broader industry terms. The project became a perfect example of how innovative video techniques can reshape post-production SEO by generating massive topical authority.
Secondly, the campaign created a permanent, high-value content hub. The project's central case study page, detailing the process, became a cornerstone piece of content. It now ranks on the first page of Google for highly competitive, high-intent keywords like "AI video generation case study," "generative AI in filmmaking," and "how to create an AI trailer." This page has a conversion rate of over 5% for lead generation, attracting clients who are specifically interested in cutting-edge video production. It functions similarly to how AI scene generators are ranking in top Google searches, by targeting a nascent but rapidly growing search vertical with definitive, expert content.
From a brand-building perspective, the impact was transformative. The company behind the project was no longer just a service provider; it was instantly anointed as a thought leader and innovation pioneer. This perception allowed them to:
This shift from vendor to visionary is the ultimate marketing outcome. It demonstrates that the ROI of such a project isn't just in the immediate analytics dashboard, but in the long-term strategic positioning that fuels growth for years to come. The brand effectively used a single piece of flagship content to achieve what years of standard advertising could not, proving the unparalleled power of quality and innovation in dominating search trends and market mindshare.
The impact of the AI action trailer extended far beyond its own view count and the brand that created it. It sent shockwaves through multiple industries, acting as a catalyst that accelerated existing trends, sparked new debates, and fundamentally altered the strategic roadmaps for companies in the tech, entertainment, and marketing sectors. This "ripple effect" demonstrates how a single, high-impact piece of content can serve as a proof-of-concept that changes the entire conversation.
In the film and entertainment industry, the trailer landed like a bombshell. Almost overnight, it moved the conversation about AI in filmmaking from a theoretical threat or a niche tool for pre-visualization to a tangible, production-ready force. Major studios, which had been cautiously exploring AI in closed-door R&D labs, were now facing urgent questions from executives and shareholders. The trailer proved that the technical barrier to creating compelling, high-quality visual narratives had been lowered exponentially. This directly influenced the surge in interest for virtual production, as studios sought to understand how to integrate these generative tools into their existing pipelines. It also ignited a fire under the Hollywood guilds, with SAG-AFTRA and the WGA pointing to the trailer as a prime example of why strong protections against the unlicensed use of members' likenesses and writers' styles were needed in new contracts.
For the marketing and advertising world, "Project Chimera" became the new benchmark for campaign-led innovation. It validated a high-risk, high-reward content model. CMOs at global brands began re-evaluating their content budgets, questioning the ROI of another generic social media ad against the potential brand-lift of a groundbreaking, conversation-starting piece of owned media. The campaign proved that thought leadership could be earned not just through white papers and webinars, but through spectacular demonstrations of capability. This has led to an increased appetite for experimental formats, pushing agencies to develop dedicated AI creative teams and invest in the kind of hyper-personalized video ad technology that this project foreshadowed.
Perhaps the most significant ripple was felt within the AI developer ecosystem. The trailer served as the ultimate stress test and marketing vehicle for the tools used to create it. The teams behind the specific AI models for video, audio, and voice generation saw a massive spike in user signups and developer interest. Venture capital firms, now armed with a tangible, viral example of the technology's potential, began aggressively funding startups in the generative video space. The project created a "before and after" moment, similar to the impact of the first deepfake music video that went viral, setting a new quality bar that every new tool is now measured against. It demonstrated that the market was not just ready for AI-generated content, but was actively hungry for it, provided it reached a certain threshold of quality and narrative coherence.
With great power comes great responsibility, and the runaway success of the AI action trailer immediately plunged its creators into a complex web of ethical considerations. The project, while celebrated for its innovation, also became a central case study in the ongoing debate about the ethical boundaries of generative AI. Navigating this terrain was not an afterthought; it was a critical component of the project's lifecycle, from conception to public release.
The first and most contentious issue was data provenance and copyright. The AI models used were trained on vast, often undisclosed, datasets of images, videos, and text scraped from the public internet. This raised immediate questions: Were the stunning cityscapes and character designs merely sophisticated remixes of copyrighted work from human artists? The creators proactively addressed this by implementing a rigorous "style-checking" protocol, using reverse-image search and aesthetic analysis tools to ensure the final outputs did not directly mimic the style of any single, identifiable living artist. They also leaned heavily on the transformative nature of the work, arguing that the final trailer was a new, creative assembly of generated elements, much like a director is influenced by countless films they've seen but creates a unique final product. This is a frontier being explored across the industry, as seen with the rise of AI-generated fashion photos and the legal questions they raise.
The second major ethical challenge was disclosure and transparency. The team was determined to be upfront about the AI origins of the project to avoid accusations of deception. The title "AI-Generated Action Trailer" was a clear and honest label. However, they also grappled with the "paradox of disclosure." Would labeling it as AI cause some viewers to dismiss it as a cheap gimmick before even watching? The data proved the opposite. The AI label became a primary driver of curiosity and viewership. This established a new best practice for the industry: be transparent, but wear the "AI" badge as a mark of innovation, not something to be hidden. This level of transparency is becoming as crucial as the humanizing element in brand videos for building long-term trust with an audience.
Finally, the project sparked an internal debate about the potential for misinformation. The same technology used to create a fictional action trailer could, in less scrupulous hands, be used to create convincing fake news reports or malicious deepfakes. The creators publicly acknowledged this dual-use nature of the technology. In their "Making Of" documentary, they dedicated a segment to discussing these ethical implications, positioning themselves as advocates for the responsible development and use of generative AI. They engaged with academic researchers and policy organizations like the Partnership on AI to contribute to the development of best practices and potential watermarking standards for AI-generated content. By confronting these issues head-on, they turned potential criticism into a demonstration of ethical leadership.
The monumental success of the AI action trailer was not a random accident of luck; it was the result of a repeatable, methodical process. While the specific creative idea was unique, the underlying framework can be deconstructed and adapted by other brands, creators, and marketers seeking to harness the power of generative AI for their own high-impact campaigns. This playbook breaks down the process into six actionable phases.
Before writing a single prompt, you must define your strategic goal. Are you aiming for brand awareness, lead generation, or establishing thought leadership? Your objective will dictate your creative direction, distribution strategy, and success metrics. For "Project Chimera," the goal was unequivocally top-of-funnel brand awareness and SEO authority building. This allowed for a creative approach that was purely about spectacle and conversation-starting, without a direct sales message.
Identify your core audience and the platforms they inhabit. Map their psychological triggers. Are they tech-early adopters fascinated by process? Are they fans of a specific genre hungry for new content? The trailer's audience was a hybrid: film enthusiasts and tech futurists. This informed the dual-pronged distribution strategy of in-depth process content for YouTube and visually stunning, snackable clips for TikTok.
This is the most critical creative phase. Your concept must be high-concept enough to break through the noise but simple enough to be understood in a thumbnail. The brief must be written for both human and machine consumption. Use the lessons from this case study:
This phase requires the same creative muscle as developing a compelling micro-documentary—it's all about storytelling.
Assemble your "Generative Tech Stack" based on your project's needs. Don't rely on a single tool. Curate a suite for:
Adopt a "sprint" methodology. Generate vast quantities of assets, curate ruthlessly, and iterate based on the outputs. Expect a 90% discard rate. The key is volume and curation, not perfection on the first try. This iterative process is similar to how teams use real-time preview tools to accelerate traditional VFX workflows.
Your final master asset is not the end product; it's the source material. Before launch, create a detailed repurposing plan:
This ensures you can feed the algorithmic beast across all platforms simultaneously upon launch.
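A repurposing plan of the kind described can be expressed as simple structured data long before any editing begins. The platform specs and clip counts below are illustrative assumptions, not the team's actual plan; the point is that treating the master asset as source material makes the derivative-clip workload computable up front.

```python
# Hypothetical repurposing plan: map each platform to the cut it needs.
# Aspect ratios and duration caps here are illustrative assumptions.
REPURPOSING_PLAN = {
    "youtube":         {"aspect": "16:9", "max_seconds": 157, "format": "full trailer"},
    "tiktok":          {"aspect": "9:16", "max_seconds": 30,  "format": "single action beat"},
    "instagram_reels": {"aspect": "9:16", "max_seconds": 45,  "format": "teaser + CTA"},
    "youtube_shorts":  {"aspect": "9:16", "max_seconds": 60,  "format": "making-of clip"},
}

def clips_needed(master_seconds: int) -> dict[str, int]:
    """Rough count of clips per platform to cover the full master asset."""
    return {
        platform: -(-master_seconds // spec["max_seconds"])  # ceiling division
        for platform, spec in REPURPOSING_PLAN.items()
    }

# The 2:37 trailer is 157 seconds of master footage to slice up
print(clips_needed(157))
```

Even this toy version makes the launch-day logistics concrete: one full-length upload plus a handful of vertical cuts per short-form platform, all derivable from a single 157-second master.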
The launch is just the beginning. Actively monitor comments and engage with your audience. Answer questions, thank people for their theories, and share user-generated content. Track not just views, but the SEO impact—keyword rankings, backlinks, and organic traffic to your site. Use this data to inform your next campaign, creating a virtuous cycle of innovation and audience growth.
The AI action trailer represents a snapshot in time—a milestone on a rapidly accelerating curve. The tools and techniques used will be obsolete within a year, perhaps even months. To stay ahead of the curve, content creators and marketers must look beyond the present and anticipate the next waves of innovation that will define the future of AI-generated video. Understanding these coming shifts is essential for future-proofing your content strategy.
The immediate next frontier is temporal coherence and feature-length narrative. Current AI video models excel at short clips of a few seconds but struggle with maintaining character and environmental consistency over longer durations. The next generation of models is focused squarely on solving this "context window" problem. We are rapidly approaching a world where generating a coherent 5-minute short film, or even a full 90-minute feature, will be technically feasible. This will democratize filmmaking in an unprecedented way, but it will also place a premium on the oldest skills in the book: story structure, character development, and thematic depth. The ability to write a compelling, feature-length AI-powered script will become a highly valuable skill.
Another seismic shift will be the move from generation to real-time simulation. Currently, AI creates video through a rendering process. The future lies in real-time engines that use AI not to generate frames, but to control dynamic, simulated worlds. Imagine prompting an AI not for a video of a car chase, but for a fully simulated city environment where you can digitally "film" the chase from any angle, in any lighting condition, in real-time. This merges the worlds of game engine technology (like Unreal Engine) with generative AI, blurring the line between pre-rendered content and interactive experience. This is the logical endpoint of the trends we see in real-time rendering engines dominating searches.
Finally, we will see the rise of hyper-personalized and interactive video narratives. The AI action trailer was a one-to-many broadcast. The next evolution is one-to-one. AI will allow for the creation of video content that adapts in real-time to viewer data. A brand video could change its featured products based on a viewer's past browsing history. A short film could alter its ending based on a viewer's emotional responses, gathered through their webcam or interaction choices. This transforms video from a static piece of content into a dynamic, conversational interface. This aligns perfectly with the predicted growth of hyper-personalized video ads as a top SEO driver, but applied to entertainment and storytelling. The content of the future won't just be watched; it will be experienced.
To provide a 360-degree perspective on the implications of the AI action trailer's success, we convened a virtual roundtable of experts from across the creative and technological spectrum. Their insights cut through the hype to offer a grounded, nuanced view of what this moment means for the future.
"This project is a 'Sputnik moment' for creative industries. It's a clear signal that the technological capability has arrived. The question is no longer 'if,' but 'how' we integrate these tools. The most successful creators of the next decade will be those who master the new art of 'creative direction for AI'—the ability to guide, curate, and refine AI outputs with a strong human vision. The director's role is evolving, not disappearing." — Maria Chen, Creative Director at a leading VFX studio and adjunct professor at the School of Cinematic Arts.
"From an SEO and marketing perspective, this case study is a masterclass in what I call 'Authority Spark' content. It's a single, high-voltage project that generates so much buzz, backlinks, and social proof that it permanently elevates a brand's entire domain authority. It's more effective than publishing 100 mediocre blog posts. The key takeaway for marketers is to shift budget from a high quantity of low-impact content to a lower quantity of high-ambition, 'spark' projects that are designed to be linkable and remarkable." — David Lee, Head of Digital Strategy at a global marketing agency and author of "The Algorithmic Audience."
"We must be cautious not to let the spectacle obscure the underlying data ethics. The computational power required to train these models is immense, and the data sourcing is often a black box. As a field, we need to move towards greater transparency in training data and develop robust, tamper-proof systems for watermarking and disclosing AI-generated content. This project was ethically executed, but it sets a precedent. We need industry-wide standards to ensure this powerful technology is used responsibly." — Dr. Anya Sharma, AI Ethics Researcher at the MIT Media Lab.
"What excites me is the accessibility this unlocks. A young filmmaker in a remote part of the world no longer needs a million-dollar budget to visualize their epic sci-fi idea. They can use these tools to create a proof-of-concept that can attract talent and funding. We're going to see an explosion of voices and stories that were previously locked out of the system. It's a disruptive, democratizing force similar to the advent of digital cameras, but on steroids." — Javier Rodriguez, Independent Film Director and Founder of the "Open Source Cinema" collective.
The story of the AI action trailer that garnered 45 million views is more than just a fascinating case study in virality. It is a definitive marker of a profound shift in the content creation landscape. We are witnessing the emergence of a new paradigm, one where human creativity and artificial intelligence are not in opposition, but in a powerful and synergistic partnership. The success of this project proves that the highest value is no longer found in the manual execution of a task, but in the visionary guidance of the process—the curation of ideas, the emotional intelligence to understand an audience, and the strategic acumen to bring a groundbreaking concept to a global stage.
The era of AI-as-gimmick is over. We have entered the era of AI-as-co-pilot. The tools are now sophisticated enough to be genuine creative partners, handling the heavy lifting of asset generation while the human team focuses on the overarching narrative, strategic positioning, and ethical considerations. This frees creators to operate at a higher level of abstraction, to dream bigger and more audaciously because the technical constraints that once bound them are rapidly dissolving. This is as true for a corporate culture video as it is for a Hollywood blockbuster.
The lesson is clear for every brand, marketer, and creator: waiting on the sidelines is no longer an option. The technological train has left the station. The time to experiment, to learn, and to develop an AI-content strategy is now. This doesn't mean abandoning your core principles or artistic integrity. It means augmenting your capabilities. It means embracing the role of a creative conductor, orchestrating a symphony of human and machine intelligence to produce work that is more impactful, more efficient, and more resonant than ever before.
The 45-million-view phenomenon did not emerge from a vacuum. It began with a single, bold decision to explore the unknown. Your journey starts today.
The future of content is not being written by algorithms alone. It is being co-authored by curious, courageous, and visionary humans who are willing to embrace new tools and redefine what is possible. The question is not if AI will change your world, but how you will choose to shape that change. Start building your legacy in the new creative frontier now.