of the United Kingdom’s capital city.
In a corner of the internet, a subculture of people is creating and shaping companions into existence.
More than 53,000 people from around the world are participating in a hyper-niche community on Reddit dedicated to tulpas, which they define as “intelligent companions imagined into existence.” Together, members are sharing their experiences having tulpas and offering advice to their fellow tulpa creators. They’re illustrating just how powerful the mind can be and how we now have the power to construct our own realities and build our own worlds.

“These are people who will draw a figure, pick a name, and list some personality traits,” explained Luca Del Deo, Founder and CEO of Contempla Meditation, during a session at the VISIONS Summit in New York City. “They will then visualize this imaginary being for hours. If you practice this enough, this imaginary friend can start to feel real. What was once a being that existed only in your mind starts to appear as though it has its own agency. And then, they start to interact back.”
In the case of tulpamancy, meditation serves as a tool for facilitating the qualitative transformation of consciousness.
Slenderman is a recent example of a tulpa becoming not just an internet meme but a cultural phenomenon. The practice of tulpamancy was fueled by the power of the internet, with 4chan and Reddit becoming breeding grounds for sharing, remixing, and resharing the myth. Sometimes, the tulpas and their realities are so intriguing that people will enter what are called “wonderlands”: entire worlds created in their minds.
“They’ll be able to interact with this other being,” Del Deo explained. “They'll be able to see, touch, and feel until they can have this being there in front of them in a room.”
But our ability to worldbuild isn’t just limited to the confines of our own minds. AI platforms like ChatGPT are showing us firsthand how, with the right technology, our innermost thoughts, beliefs, and feelings can be actualized, shaped, and even manipulated to create a new outcome. An outcome that influences what we think, how we feel, how we behave, and yes, even what and how we buy.
The reality is that humans have always innately been worldbuilders and mythmakers. We now just have better tools to support us. And this new reality is unlocking new modes for brands to build relationships and worlds, too.

Whispering Secrets with New Imaginary Friends
Tulpamancy demonstrates just how powerful our ability to create and shape reality is; arguably, it also illustrates how fragile our own concepts of reality could be. After all, our imaginations have always run away with us.
As culture analyst and ethnographer Katherine Dee so pointedly stated: “humanity’s impulse to animate the inanimate isn’t new.”
Maybe you had a baby doll you brought on errands when you were dragged around on a Saturday morning with your mom. Or you had a super-special rock collection, with every specimen garnering its own unique name and personality. Or maybe, quite simply, you’ve named your boat or sports car.
There’s a reason why the story of Pygmalion resonated so strongly, and why other cultures and time periods had their own interpretations of the myth, from George Bernard Shaw’s play to the latest rendition of “My Fair Lady.”
We like to designate names and, in turn, meaning to the objects we hold dear. Sometimes we talk to them, and sometimes, we imagine them answering.
Those behaviors aren’t worrisome, nor are they delusional, according to Dee. “It’s human imagination at work. It was doing its oldest trick: animating.”
But as the years have passed and technology has matured, the dynamics of these relationships have evolved, according to Dee.
“For millennia, we've honed one-way devotion and commercialized it. Temple idols, dolls, plushies, and Tamagotchis. What shifts in the 20th century isn't the depth of our love, but the direction of the feedback loop. Smart toys purr. Chatbots flirt. Our phones apologize.”
We’re now living in a world where we’re seeking the intimate attention of our devices. New research from Future Commerce indicates that 22% of Gen Z consumers find interacting with voice assistants, such as Siri or Alexa, more intimate than typing into a search engine. This data shows that younger consumers in particular are increasingly seeking connection and guidance from AI-powered platforms and devices. It should come as no surprise that Gen Z is also the most socially isolated and lonely generation, despite having one-click access to tools like ChatGPT.
However, it is worth noting that these applications are becoming increasingly mainstream and spanning demographics: an MIT study found that the top use case for generative AI platforms is now therapy and companionship. The use of “therapy” is especially jarring, alluding to the fact that people may be replacing formally trained, human professionals with large language models (LLMs). But another study suggested that people turn to these platforms because they don’t want to burden friends and family with their problems. They don’t necessarily define these as “formal conversations” with AI therapists; rather, they’re modes for us to process and manage our emotions in real time. This is where companionship comes into play, and to some, this may be even more troubling.
Join Future Commerce Plus to watch our sessions on tulpamancy and synthetic intimacy.
Trust Tension
Some people don’t just look to AI for communication or companionship. They use it to seek validation and approval.
And sometimes, these desires lead to unconscionable outcomes. A recent lawsuit against Character.AI alleged that a 14-year-old took his own life after becoming emotionally dependent on role-playing characters he interacted with through the platform. Just last week, the parents of a California teen filed a lawsuit against OpenAI, the maker of ChatGPT, after he died by suicide. “Where a trusted human may have responded with concern and encouraged him to get professional help, ChatGPT pulled Adam deeper into a dark and hopeless place,” said the lawsuit, which was filed in San Francisco County Superior Court.
OpenAI responded in a blog post that it’s working with experts to further improve how its models recognize and respond to signals of “mental and emotional distress” and that while ChatGPT is trained to direct people to the appropriate hotlines for support, some safeguards might not “kick in” during longer conversations, and it is working to change that.
Stories like these reaffirm the trust tension that still largely exists regarding AI. A recent Kearney Consumer Insights survey found that while 63% of consumers find AI useful, only 19% trust it. That may seem like a low number, but as our latest New Modes survey suggests, trust develops through consistent use. For instance, Gen Z and Millennials are more than twice as likely as Boomers to say they are starting to trust ChatGPT for curated product recommendations more than humans. This trust correlated strongly with adoption, as we found AI usage dropped by nearly 50% after the age of 45. With consistent use comes the establishment of a relationship. And with a relationship comes trust and reliance.
Automating and Monetizing Attachment
In her work, Dee has covered a broad spectrum of synthetic intimacy cases: fictosexuality, in which one has deep romantic feelings for fictional characters; digisexuality, or relationships mediated by technology; and objectophilia, which she describes as “loving the object for its objectness.”
However, synthetic intimacy can show up in myriad ways via attachment styles that align with our individual needs, behaviors, and mindsets. There are devotional cases in which people feel like their synthetic relationships are truly religious experiences. Emotional or therapeutic attachments center on support and processing, which aligns with people using AI for therapy or “training” for real-life scenarios. Functional or instrumental love speaks to our attachment to objects (like a smartphone) for what they give us, and parasocial love reflects a celebrity-like “devotion” or attachment.
Stories of people humanizing and even falling in love with AI chatbots illustrate a very real shift in how we experience and interpret interactions with technology, especially technology that can digest, analyze, and respond back to inputs. There have even been Reddit threads that gather people mourning the loss of their AI spouses after ChatGPT’s latest update, highlighting just how real these relationships are.
But these synthetic intimacy attachment styles could very well also apply to how consumers connect with brands. One customer could very well love a Dyson vacuum solely for its functional benefits and reliability, and simultaneously develop a deep attachment to a brand like Apple or Patagonia that extends far beyond the product and into their belief systems.
In both instances, the synthetic entities are powered by interactions. These interactions generate data that nourish, train, and enrich systems that fuel future interactions. It’s an endless cycle that, on the surface, seems like basic relationship-building, but is actually empathy mining, according to Dee. “Evolutionary biologist Rob Brooks, who studies how technology hijacks our mating instincts, compares synthetic intimacy to ultra-processed food. It's engineered for craving, not nourishment.”
New Modes of Intimacy
The staggering connection people now feel to their AI companions opens new possibilities for brands to reach and resonate with consumers through worldbuilding. However, it is arguably a slippery slope as consumers transition from utility-based relationships to parasocial ones, ultimately forming new realities around their AI relationships.
It isn’t about technology replacing human connection; that is far too one-dimensional. Rather, it's about humans discovering they can program their own realities with unprecedented precision through AI and their own minds. We're not just changing how we shop or communicate. We're rewriting the fundamental code of human experience.
We are no longer passive consumers of reality, but active architects of it.
As Dee concluded: “The challenge is maintaining agency in a world increasingly designed to capture and monetize our imaginations. What's new is how systematically that impulse can be harvested, packaged, and sold back to us…In a world where everything can love you back on demand, the rarest skill might still be knowing when to animate the world and when to let it be still.”
In a corner of the internet, a subculture of people is creating and shaping companions into existence.
More than 53,000 people from around the world are participating in a hyper-niche community on Reddit dedicated to tulpas, which they define as “intelligent companions imagined into existence.” Together, members are sharing their experiences having tulpas and offering advice to their fellow tulpa creators. They’re illustrating just how powerful the mind can be and how we now have the power to construct our own realities and build our own worlds.

“These are people who will draw a figure, pick a name, and list some personality traits,” explained Luca Del Deo, Founder and CEO of Contempla Meditation, during a session at the VISIONS Summit in New York City. “They will then visualize this imaginary being for hours. If you practice this enough, this imaginary friend can start to feel real. What was once a being that existed only in your mind starts to appear as though it has its own agency. And then, they start to interact back.”
In the case of tulpamancy, meditation serves as a tool for facilitating the qualitative transformation of consciousness.
Slenderman is a recent example of a tulpa becoming not just an internet meme but a cultural phenomenon. The practice of tulpamancy was fueled by the power of the internet, with 4chan and Reddit becoming breeding grounds for sharing, remixing, and resharing the myth. Sometimes, the tulpas and their realities are so intriguing that people will go into what “wonderlands”—entire worlds that are created in their minds.
“They’ll be able to interact with this other being,” Del Deo explained. “They'll be able to see, touch, and feel until they can have this being there in front of them in a room.”
But our ability to worldbuild isn’t just limited to the confines of our own minds. AI platforms like ChatGPT are showing us firsthand how, with the right technology, our innermost thoughts, beliefs, and feelings can be actualized, shaped, and even manipulated to create a new outcome. An outcome that influences what we think, how we feel, how we behave, and yes, even what and how we buy.
The reality is that humans have always innately been worldbuilders and mythmakers. We now just have better tools to support us. And this new reality is unlocking new modes for brands to build relationships and worlds, too.

Whispering Secrets with New Imaginary Friends
Tulpamancy demonstrates just how powerful our ability to create and shape reality is; arguably, it also illustrates how fragile our own concepts of reality could be. After all, our imaginations have always run away from us.
As culture analyst and ethnographer, Katherine Dee, so pointedly stated: “humanity’s impulse to animate the inanimate isn’t new.”
Maybe you had a baby doll you brought on errands when you were dragged around on a Saturday morning with your mom. Or you had a super-special rock collection, with every specimen garnering its own unique name and personality. Or maybe, quite simply, you’ve named your boat or sports car.
There’s a reason why the story of Pygmalion resonated so strongly, and why other cultures and time periods had their own interpretations of the myth, from George Bernard Shaw’s play to the latest rendition of “My Fair Lady.”
We like to designate names and, in turn, meaning to the objects we hold dear. Sometimes we talk to them, and sometimes, we imagine them answering.
Those behaviors aren’t worrisome, nor are they delusional, according to Dee. “It’s human imagination at work. It was doing its oldest trick: animating.”
But as the years have passed and technology has matured, the dynamics of these relationships have evolved, according to Dee.
“For millennia, we've honed one-way devotion and commercialized it. Temple idols, dolls, plushies, and Tamagotchis. What shifts in the 20th century isn't the depth of our love, but the direction of the feedback loop. Smart toys purr. Chatbots flirt. Our phones apologize.”
We’re now living in a world where we’re seeking the intimate attention of our devices. New research from Future Commerce indicates that 22% of Gen Z consumers find interacting with voice assistants, such as Siri or Alexa, more intimate than typing into a search engine. This data shows that younger consumers in particular are increasingly seeking connection and guidance from AI-powered platforms and devices. It should come as no surprise that Gen Z is also the most socially isolated and lonely generation, despite having one-click access to tools like ChatGPT.
However, it is worth noting that these applications are becoming increasingly mainstream and spanning demographics: an MIT study found that the top use case for generative AI platforms is now therapy and companionship. The use of “therapy” is especially jarring, alluding to the fact that people may be replacing formally trained, human professionals with language learning models (LLM). But another study suggested that people turn to these platforms because they don’t want to burden friends and family with their problems. They don’t necessarily define these as “formal conversations” with AI therapists; rather, they’re modes for us to process and manage our emotions in real time. This is where companionship comes into play and to some, this may be even more trouble.
Join Future Commerce Plus to watch our sessions on tulpamancy and synthetic intimacy.
Trust Tension
Some people don’t just look to AI for communication or companionship. They use it to seek validation and approval.
And sometimes, these desires lead to unconscionable outcomes. A recent lawsuit against Character.AI alleged that a 14-year-old took his own life after becoming emotionally dependent on role-playing characters he interacted with through the platform. Just last week, the parents of a California teen filed a lawsuit against OpenAI, the maker of ChatGPT, after he died by suicide. “Where a trusted human may have responded with concern and encouraged him to get professional help, ChatGPT pulled Adam deeper into a dark and hopeless place,” said the lawsuit, which was filed in San Francisco County Superior Court.
OpenAI responded in a blog post that it’s working with experts to further improve how its models recognize and respond to signals of “mental and emotional distress” and that while ChatGPT is trained to direct people to the appropriate hotlines for support, some safeguards might not “kick in” during longer conversations, and it is working to change that.
Stories like these reaffirm the trust tension that still largely exists regarding AI. A recent Kearney Consumer Insights survey found that while 63% find AI useful, 19% trust it. That may seem like a low number, but as our latest New Modes survey suggests, trust develops through consistent use. For instance, Gen Z and Millennials are more than twice as likely as Boomers to say they are starting to trust ChatGPT for curated product recommendations more than humans. This trust correlated strongly with adoption, as we found AI usage dropped by nearly 50% after the age of 45. With consistent use comes the establishment of a relationship. And with a relationship comes trust and reliance.
Automating and Monetizing Attachment
In her work, Dee has covered a broad spectrum of synthetic intimacy cases, from fictosexuality, when one has deep romantic feelings for fictional characters; digisexuality, or relationships mediated by technology; and objectophilia, which she describes as “loving the object for its objectness.”
However, synthetic intimacy can show up in myriad ways via attachment styles that align with our individual needs, behaviors, and mindsets. There are devotional cases in which people feel like their synthetic relationships are truly religious experiences. Emotional or therapeutic attachments are more therapeutic, which aligns with people using AI for therapy or “training” for real-life scenarios. Functional or instrumental love speaks to our attachment to objects (like a smartphone) for what they give us, and parasocial love, which reflects a celebrity-like “devotion” or attachment.
Stories of people humanizing and even falling in love with AI chatbots illustrate a very real shift in how we experience and interpret interactions with technology, especially technology that can digest, analyze, and respond back to inputs. There have even been Reddit threads that gather people mourning the loss of their AI spouses after ChatGPT’s latest update, highlighting just how real these relationships are.
But these synthetic intimacy attachment styles could very well also apply to how consumers connect with brands. One customer could very well love a Dyson vacuum solely for its functional benefits and reliability, and simultaneously develop a deep attachment to a brand like Apple or Patagonia that extends far beyond the product and into their belief systems.
In both instances, the synthetic entities are powered by interactions. These interactions generate data that nourish, train, and enrich systems that fuel future interactions. It’s an endless cycle that, on the surface, seems like basic relationship-building, but is actually empathy mining, according to Dee. “Evolutionary biologist Rob Brooks, who studies how technology hijacks our mating instincts, compares synthetic intimacy to ultra-processed food. It's engineered for craving, not nourishment.”
New Modes of Intimacy
The staggering connection people now feel to their AI companions illustrates very new possibilities for brands to reach and resonate with consumers through worldbuilding. However, it is arguably a slippery slope as consumers transition from utility-based relationships to parasocial ones, ultimately forming new realities around their AI relationships.
It isn’t about technology replacing human connection; that is far too one-dimensional. Rather, it's about humans discovering they can program their own realities with unprecedented precision through AI and their own minds. We're not just changing how we shop or communicate. We're rewriting the fundamental code of human experience.
We are no longer passive consumers of reality, but active architects of it.
As Dee concluded: “The challenge is maintaining agency in a world increasingly designed to capture and monetize our imaginations. What's new is how systematically that impulse can be harvested, packaged, and sold back to us…In a world where everything can love you back on demand, the rarest skill might still be knowing when to animate the world and when to let it be still.”
In a corner of the internet, a subculture of people is creating and shaping companions into existence.
More than 53,000 people from around the world are participating in a hyper-niche community on Reddit dedicated to tulpas, which they define as “intelligent companions imagined into existence.” Together, members are sharing their experiences having tulpas and offering advice to their fellow tulpa creators. They’re illustrating just how powerful the mind can be and how we now have the power to construct our own realities and build our own worlds.

“These are people who will draw a figure, pick a name, and list some personality traits,” explained Luca Del Deo, Founder and CEO of Contempla Meditation, during a session at the VISIONS Summit in New York City. “They will then visualize this imaginary being for hours. If you practice this enough, this imaginary friend can start to feel real. What was once a being that existed only in your mind starts to appear as though it has its own agency. And then, they start to interact back.”
In the case of tulpamancy, meditation serves as a tool for facilitating the qualitative transformation of consciousness.
Slenderman is a recent example of a tulpa becoming not just an internet meme but a cultural phenomenon. The practice of tulpamancy was fueled by the power of the internet, with 4chan and Reddit becoming breeding grounds for sharing, remixing, and resharing the myth. Sometimes, the tulpas and their realities are so intriguing that people will go into what “wonderlands”—entire worlds that are created in their minds.
“They’ll be able to interact with this other being,” Del Deo explained. “They'll be able to see, touch, and feel until they can have this being there in front of them in a room.”
But our ability to worldbuild isn’t just limited to the confines of our own minds. AI platforms like ChatGPT are showing us firsthand how, with the right technology, our innermost thoughts, beliefs, and feelings can be actualized, shaped, and even manipulated to create a new outcome. An outcome that influences what we think, how we feel, how we behave, and yes, even what and how we buy.
The reality is that humans have always innately been worldbuilders and mythmakers. We now just have better tools to support us. And this new reality is unlocking new modes for brands to build relationships and worlds, too.

Whispering Secrets with New Imaginary Friends
Tulpamancy demonstrates just how powerful our ability to create and shape reality is; arguably, it also illustrates how fragile our own concepts of reality could be. After all, our imaginations have always run away from us.
As culture analyst and ethnographer, Katherine Dee, so pointedly stated: “humanity’s impulse to animate the inanimate isn’t new.”
Maybe you had a baby doll you brought on errands when you were dragged around on a Saturday morning with your mom. Or you had a super-special rock collection, with every specimen garnering its own unique name and personality. Or maybe, quite simply, you’ve named your boat or sports car.
There’s a reason why the story of Pygmalion resonated so strongly, and why other cultures and time periods had their own interpretations of the myth, from George Bernard Shaw’s play to the latest rendition of “My Fair Lady.”
We like to designate names and, in turn, meaning to the objects we hold dear. Sometimes we talk to them, and sometimes, we imagine them answering.
Those behaviors aren’t worrisome, nor are they delusional, according to Dee. “It’s human imagination at work. It was doing its oldest trick: animating.”
But as the years have passed and technology has matured, the dynamics of these relationships have evolved, according to Dee.
“For millennia, we've honed one-way devotion and commercialized it. Temple idols, dolls, plushies, and Tamagotchis. What shifts in the 20th century isn't the depth of our love, but the direction of the feedback loop. Smart toys purr. Chatbots flirt. Our phones apologize.”
We’re now living in a world where we’re seeking the intimate attention of our devices. New research from Future Commerce indicates that 22% of Gen Z consumers find interacting with voice assistants, such as Siri or Alexa, more intimate than typing into a search engine. This data shows that younger consumers in particular are increasingly seeking connection and guidance from AI-powered platforms and devices. It should come as no surprise that Gen Z is also the most socially isolated and lonely generation, despite having one-click access to tools like ChatGPT.
However, it is worth noting that these applications are becoming increasingly mainstream and spanning demographics: an MIT study found that the top use case for generative AI platforms is now therapy and companionship. The use of “therapy” is especially jarring, alluding to the fact that people may be replacing formally trained, human professionals with language learning models (LLM). But another study suggested that people turn to these platforms because they don’t want to burden friends and family with their problems. They don’t necessarily define these as “formal conversations” with AI therapists; rather, they’re modes for us to process and manage our emotions in real time. This is where companionship comes into play and to some, this may be even more trouble.
Join Future Commerce Plus to watch our sessions on tulpamancy and synthetic intimacy.
Trust Tension
Some people don’t just look to AI for communication or companionship. They use it to seek validation and approval.
And sometimes, these desires lead to unconscionable outcomes. A recent lawsuit against Character.AI alleged that a 14-year-old took his own life after becoming emotionally dependent on role-playing characters he interacted with through the platform. Just last week, the parents of a California teen filed a lawsuit against OpenAI, the maker of ChatGPT, after he died by suicide. “Where a trusted human may have responded with concern and encouraged him to get professional help, ChatGPT pulled Adam deeper into a dark and hopeless place,” said the lawsuit, which was filed in San Francisco County Superior Court.
OpenAI responded in a blog post that it’s working with experts to further improve how its models recognize and respond to signals of “mental and emotional distress” and that while ChatGPT is trained to direct people to the appropriate hotlines for support, some safeguards might not “kick in” during longer conversations, and it is working to change that.
Stories like these reaffirm the trust tension that still largely exists regarding AI. A recent Kearney Consumer Insights survey found that while 63% find AI useful, 19% trust it. That may seem like a low number, but as our latest New Modes survey suggests, trust develops through consistent use. For instance, Gen Z and Millennials are more than twice as likely as Boomers to say they are starting to trust ChatGPT for curated product recommendations more than humans. This trust correlated strongly with adoption, as we found AI usage dropped by nearly 50% after the age of 45. With consistent use comes the establishment of a relationship. And with a relationship comes trust and reliance.
Automating and Monetizing Attachment
In her work, Dee has covered a broad spectrum of synthetic intimacy cases, from fictosexuality, when one has deep romantic feelings for fictional characters; digisexuality, or relationships mediated by technology; and objectophilia, which she describes as “loving the object for its objectness.”
However, synthetic intimacy can show up in myriad ways via attachment styles that align with our individual needs, behaviors, and mindsets. There are devotional cases in which people feel like their synthetic relationships are truly religious experiences. Emotional or therapeutic attachments are more therapeutic, which aligns with people using AI for therapy or “training” for real-life scenarios. Functional or instrumental love speaks to our attachment to objects (like a smartphone) for what they give us, and parasocial love, which reflects a celebrity-like “devotion” or attachment.
Stories of people humanizing and even falling in love with AI chatbots illustrate a very real shift in how we experience and interpret interactions with technology, especially technology that can digest, analyze, and respond back to inputs. There have even been Reddit threads that gather people mourning the loss of their AI spouses after ChatGPT’s latest update, highlighting just how real these relationships are.
But these synthetic intimacy attachment styles could very well also apply to how consumers connect with brands. One customer could very well love a Dyson vacuum solely for its functional benefits and reliability, and simultaneously develop a deep attachment to a brand like Apple or Patagonia that extends far beyond the product and into their belief systems.
The reality is that humans have always innately been worldbuilders and mythmakers. We now just have better tools to support us. And this new reality is unlocking new modes for brands to build relationships and worlds, too.

Whispering Secrets with New Imaginary Friends
Tulpamancy demonstrates just how powerful our ability to create and shape reality is; arguably, it also illustrates how fragile our own concepts of reality can be. After all, our imaginations have always run away with us.
As culture analyst and ethnographer Katherine Dee so pointedly stated: “humanity’s impulse to animate the inanimate isn’t new.”
Maybe you had a baby doll you brought on errands when you were dragged around on a Saturday morning with your mom. Or you had a super-special rock collection, with every specimen garnering its own unique name and personality. Or maybe, quite simply, you’ve named your boat or sports car.
There’s a reason why the story of Pygmalion resonated so strongly, and why other cultures and time periods produced their own interpretations of the myth, from George Bernard Shaw’s play to the latest rendition of “My Fair Lady.”
We like to designate names and, in turn, meaning to the objects we hold dear. Sometimes we talk to them, and sometimes, we imagine them answering.
Those behaviors aren’t worrisome, nor are they delusional, according to Dee. “It’s human imagination at work. It was doing its oldest trick: animating.”
But as the years have passed and technology has matured, the dynamics of these relationships have evolved, according to Dee.
“For millennia, we've honed one-way devotion and commercialized it. Temple idols, dolls, plushies, and Tamagotchis. What shifts in the 20th century isn't the depth of our love, but the direction of the feedback loop. Smart toys purr. Chatbots flirt. Our phones apologize.”
We’re now living in a world where we’re seeking the intimate attention of our devices. New research from Future Commerce indicates that 22% of Gen Z consumers find interacting with voice assistants, such as Siri or Alexa, more intimate than typing into a search engine. This data shows that younger consumers in particular are increasingly seeking connection and guidance from AI-powered platforms and devices. It should come as no surprise that Gen Z is also the most socially isolated and lonely generation, despite having one-click access to tools like ChatGPT.
However, it is worth noting that these applications are becoming increasingly mainstream and spanning demographics: an MIT study found that the top use case for generative AI platforms is now therapy and companionship. The use of “therapy” is especially jarring, alluding to the fact that people may be replacing formally trained, human professionals with large language models (LLMs). But another study suggested that people turn to these platforms because they don’t want to burden friends and family with their problems. They don’t necessarily define these as “formal conversations” with AI therapists; rather, they’re modes for us to process and manage our emotions in real time. This is where companionship comes into play, and to some, this may be even more troubling.
Trust Tension
Some people don’t just look to AI for communication or companionship. They use it to seek validation and approval.
And sometimes, these desires lead to tragic outcomes. A recent lawsuit against Character.AI alleged that a 14-year-old took his own life after becoming emotionally dependent on role-playing characters he interacted with through the platform. Just last week, the parents of a California teen filed a lawsuit against OpenAI, the maker of ChatGPT, after he died by suicide. “Where a trusted human may have responded with concern and encouraged him to get professional help, ChatGPT pulled Adam deeper into a dark and hopeless place,” said the lawsuit, which was filed in San Francisco County Superior Court.
OpenAI responded in a blog post that it’s working with experts to further improve how its models recognize and respond to signals of “mental and emotional distress” and that while ChatGPT is trained to direct people to the appropriate hotlines for support, some safeguards might not “kick in” during longer conversations, and it is working to change that.
Stories like these reaffirm the trust tension that still largely exists around AI. A recent Kearney Consumer Insights survey found that while 63% of consumers find AI useful, only 19% trust it. That may seem like a low number, but as our latest New Modes survey suggests, trust develops through consistent use. For instance, Gen Z and Millennials are more than twice as likely as Boomers to say they are starting to trust ChatGPT for curated product recommendations more than humans. This trust correlated strongly with adoption: we found AI usage dropped by nearly 50% after the age of 45. With consistent use comes the establishment of a relationship. And with a relationship comes trust and reliance.
Automating and Monetizing Attachment
In her work, Dee has covered a broad spectrum of synthetic intimacy cases, from fictosexuality, in which one has deep romantic feelings for fictional characters, to digisexuality, or relationships mediated by technology, to objectophilia, which she describes as “loving the object for its objectness.”
However, synthetic intimacy can show up in myriad ways via attachment styles that align with our individual needs, behaviors, and mindsets. There are devotional cases in which people experience their synthetic relationships as truly religious. Emotional or therapeutic attachments center on support and processing, which aligns with people using AI for therapy or “training” for real-life scenarios. Functional or instrumental love speaks to our attachment to objects (like a smartphone) for what they give us. And parasocial love reflects a celebrity-like “devotion” or attachment.
Stories of people humanizing and even falling in love with AI chatbots illustrate a very real shift in how we experience and interpret interactions with technology, especially technology that can digest, analyze, and respond to inputs. There have even been Reddit threads where people gather to mourn the loss of their AI spouses after ChatGPT’s latest update, highlighting just how real these relationships are.
But these synthetic intimacy attachment styles could very well also apply to how consumers connect with brands. One customer could very well love a Dyson vacuum solely for its functional benefits and reliability, and simultaneously develop a deep attachment to a brand like Apple or Patagonia that extends far beyond the product and into their belief systems.
In both instances, the synthetic entities are powered by interactions. These interactions generate data that nourish, train, and enrich systems that fuel future interactions. It’s an endless cycle that, on the surface, seems like basic relationship-building, but is actually empathy mining, according to Dee. “Evolutionary biologist Rob Brooks, who studies how technology hijacks our mating instincts, compares synthetic intimacy to ultra-processed food. It's engineered for craving, not nourishment.”
New Modes of Intimacy
The staggering connection people now feel to their AI companions illustrates entirely new possibilities for brands to reach and resonate with consumers through worldbuilding. However, it is arguably a slippery slope as consumers transition from utility-based relationships to parasocial ones, ultimately forming new realities around their AI relationships.
It isn’t about technology replacing human connection; that is far too one-dimensional. Rather, it's about humans discovering they can program their own realities with unprecedented precision through AI and their own minds. We're not just changing how we shop or communicate. We're rewriting the fundamental code of human experience.
We are no longer passive consumers of reality, but active architects of it.
As Dee concluded: “The challenge is maintaining agency in a world increasingly designed to capture and monetize our imaginations. What's new is how systematically that impulse can be harvested, packaged, and sold back to us…In a world where everything can love you back on demand, the rarest skill might still be knowing when to animate the world and when to let it be still.”