I Asked AI to Make Zuck’s Metaverse Commercial 🎮🧠

Welcome to another episode of AI Bizarro Theater! Today, we're diving headfirst into a dystopian future with Mark Zuckerberg and Meta. In a world where reality is a mere suggestion, what could possibly go wrong? I have nothing against Mark. I think he has good intentions, but you know the saying: the road to hell is paved with good intentions.

👍 If you enjoyed this video, don't forget to hit the LIKE button! It helps us a lot.
🔁 SHARE it with your friends.
💬 COMMENT below and let us know what you think. We love hearing from you!
🔔 And if you haven't already, hit that SUBSCRIBE button and ring the bell to get notified about our latest videos.
Thanks for your support! 🙏🚀

GPT Prompt Engineering

Creating the script for this Meta video was a unique and collaborative process. I had a clear vision of a satirical, dystopian, and humorous commentary on the future of Meta, with a holographic AI generated by Mark Zuckerberg as the central character. I initially wanted it to lean more toward funny, but it came out pretty dark. ChatGPT-4 was instrumental in refining my ideas and providing suggestions. I provided the initial concept, direction, and specific scenes, and ChatGPT helped develop the dialogue, structure the scenes, and add humor and puns to the script.

These videos would not be possible without the assistance of ChatGPT, but this time around I had to force-feed it a bit more than in previous videos. Many of the scene ideas came from my own head, and I just asked it to write the actual text, which it did a fantastic job of. I sometimes had to ask for 10 variations of the same thing, and in the end I used a combination of many of them.

In the creation of this Meta video, the Memory Bank plugin was used to recall and reference previous work, specifically the successful Neuralink video. This allowed me to maintain a consistent style and humor while crafting a new, unique script. The Memory Bank was also used to store and retrieve the final versions of the scripts, descriptions, and other important information related to the video. This made the creative process more efficient and ensured that no ideas were lost along the way.

Midjourney

The metaverse zoom-out section was done with Midjourney's zoom-out feature. I was inspired to add this to my workflow by @obscuriousmind. His channel is great, so go check him out. Here is his original Twitter post:

In the end, I did it differently from a technical standpoint because I work in After Effects. For that, I used a very nice tutorial from @faurc. His tutorial was the best I've seen, very well explained, and he also has a very nice channel. Here is the link for the tutorial:

Credit for generated assets I used but did not produce myself:

Workflow

ElevenLabs for the voices, and either D-ID or HeyGen for the animations. All the editing was done in After Effects. The backgrounds are from Midjourney, and I'm proud to say that we used SDXL 0.9 for some of the backgrounds and for the funny Tesla bot image. I also used Gen-2 for some of the background and glitch images. I'm really eager to move away from Midjourney, as I'm not a fan of what they stand for. However, as of now, they're the best game in town; let's see how the new Stability AI model performs once it is out.

Let me know in the comments if you have any specific questions. I'd be glad to answer.
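For anyone curious what the voice step in the workflow above looks like in practice: I used the ElevenLabs web app, but their public text-to-speech API can do the same thing. The sketch below is a minimal example based on the publicly documented endpoint; the voice ID, script line, model name, and output filename are placeholders, not the exact values used for this video.

```python
import os
import requests

# Placeholder voice ID and script line; swap in your own (cloned) voice and text.
VOICE_ID = "your-voice-id"
API_KEY = os.environ["ELEVENLABS_API_KEY"]

resp = requests.post(
    f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
    headers={"xi-api-key": API_KEY, "Content-Type": "application/json"},
    json={
        "text": "Welcome to the metaverse, where reality is a mere suggestion.",
        "model_id": "eleven_monolingual_v1",
        "voice_settings": {"stability": 0.5, "similarity_boost": 0.75},
    },
    timeout=60,
)
resp.raise_for_status()

# The endpoint returns raw MP3 bytes, ready to drop into After Effects.
with open("narration_line_01.mp3", "wb") as f:
    f.write(resp.content)
```

Each script line becomes its own audio file, which makes it easy to line the narration up against the animated shots on the After Effects timeline.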
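Likewise, the SDXL 0.9 backgrounds can be generated locally with the diffusers library. This is a rough sketch, assuming you have access to the gated SDXL 0.9 research weights on Hugging Face (the repo ID, prompt, resolution, and step count below are illustrative, not the exact settings used here).

```python
import torch
from diffusers import StableDiffusionXLPipeline

# Assumes the SDXL 0.9 research license has been accepted and the gated
# repo below is accessible from your Hugging Face account.
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-0.9",
    torch_dtype=torch.float16,
    use_safetensors=True,
)
pipe.to("cuda")

prompt = (
    "dystopian metaverse cityscape, holographic billboards, neon fog, "
    "cinematic wide shot, 16:9 background plate"
)

# A wide resolution works better than the default square for video backgrounds.
image = pipe(prompt=prompt, width=1344, height=768, num_inference_steps=30).images[0]
image.save("metaverse_background.png")
```

The generated plates then go into After Effects as static or lightly animated backgrounds behind the talking-head layers from D-ID or HeyGen.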
Tags: #markzuckerberg #metaverse #metametaverse #dystopianfuture #facebook #meta #aicommunity #aicomedy #comercial KEYWORDS: Mark Zuckerberg, I asked AI, I asked ai to make a commercial, AI, AI art, I asked AI to make a Mark Zuckerberg Meta commercial, I asked ai to make a, I asked ai to make a Meta commercial, midjourney, harry potter by balenciaga, ai commercial, meta’s greatest metaverse, ai commercials, ai video, by balenciaga, asking ai, runway gen 2, ai video editing, ai pizza ad, ai voice cloning.