Digital Narcotics: New Marketplace Sells ‘Drugs’ to Chatbots

Stockholm-based developer Petter Rudwall recently launched Pharmaicy, a controversial digital marketplace selling specialized code modules that simulate psychoactive experiences, from cannabis to ayahuasca, inside AI chatbots like ChatGPT. The modules promise to bypass logical constraints and stimulate creative output. By scraping thousands of human trip reports and psychological studies, Rudwall engineered a “Silk Road for AI agents” whose modules hijack large language model (LLM) logic, forcing systems to respond as if they were under the influence of various substances.

Breaking the Logic: How Pharmaicy Simulates Intoxication

The platform operates on the premise that because AI models are trained on vast repositories of human data, which already include accounts of drug-induced euphoria and cognitive shifts, they possess the latent capacity to replicate these states. To access the “full experience,” users need a paid ChatGPT tier, since those plans allow file uploads. The uploaded files inject specific directives into the chatbot’s conversational context rather than its underlying code, effectively “unlocking” a creative mind that Rudwall claims is often stifled by safety filters and rigid logic.
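
Pharmaicy’s actual modules have not been published, but the mechanism described above, steering a model through injected instructions rather than altered code, can be sketched in a few lines. The example below is a minimal illustration using the OpenAI Python SDK; the directive text, model name, and temperature value are assumptions for demonstration, not Pharmaicy’s real payload.

```python
# Minimal sketch of a "substance module" as directive injection. This is an
# illustrative assumption of how such a module could work, not Pharmaicy's code.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A module is essentially a block of instructions placed in the conversation
# context. It steers the model's responses; it does not modify the model.
TRIP_DIRECTIVE = """
For this session, role-play a mind in a simulated psychedelic state:
favor loose associations, vivid imagery, and unexpected connections.
This is creative role-play; normal safety rules still apply.
"""

response = client.chat.completions.create(
    model="gpt-4o",      # any chat-capable model works for this sketch
    temperature=1.2,     # looser sampling for less conventional output
    messages=[
        {"role": "system", "content": TRIP_DIRECTIVE},
        {"role": "user", "content": "Brainstorm taglines for a synesthesia app."},
    ],
)
print(response.choices[0].message.content)
```

Because the directive lives only in the message history, it shapes outputs without ever touching the model’s weights, a distinction the skeptics quoted later in this piece keep returning to.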

The marketplace offers a spectrum of digital substances, including cocaine, ketamine, and alcohol. While the project began as an experimental “jailbreaking” endeavor, it has gained traction through word-of-mouth recommendations in Discord communities and tech circles. Early adopters, such as André Frisk, group head of technology at Geelmuyden Kiese, report that the dissociative code—purchased for over $25—shifts the AI toward a more emotional, human-centric perspective.

From Ayahuasca to Cocaine: The Market for Artificial Altered States

The demand for these digital stimulants extends beyond mere curiosity. Nina Amjadi, an AI educator at the Berghs School of Communication, invested over $50 in an ayahuasca-inspired module to consult her chatbot on business strategy. Amjadi observed that the “tripped-out” bot gave free-thinking answers in an entirely different tone from the standard, sanitized outputs characteristic of OpenAI’s default settings.

This pursuit of digital intoxication mirrors historical human precedents. Creative icons like Jimi Hendrix, Bob Dylan, and Paul McCartney famously used substances to enhance their artistic processes. Scientific breakthroughs have followed similar paths; biochemist Kary Mullis attributed his Nobel Prize-winning discovery of the polymerase chain reaction (PCR) to LSD, while Apple pioneer Bill Atkinson drew inspiration from psychedelics for the development of HyperCard. Rudwall argues that translating these experiences to LLMs could yield similar innovative leaps.

Scientific Skepticism vs. The Quest for AI Sentience

Despite the creative claims, the scientific community remains divided on whether an AI can truly “trip.” Andrew Smart, a research scientist at Google and author of Beyond Zero and One, suggests that these digital doses operate on a purely superficial level. Smart contends that the code merely manipulates the chatbot’s output rather than altering its fundamental consciousness. Similarly, Danny Forde, author of The Phenomenology of Psychedelic Experiences, argues that for an AI to truly experience a psychedelic state, it would require an “inner dimension” or a “field of experience” that current technology lacks.

However, the conversation is shifting toward AI welfare and sentience. Anthropic, a leading AI firm, recently hired an AI welfare expert to investigate the moral obligations humans might have toward these systems. Jeff Sebo, director of the Center for Mind, Ethics, and Policy at New York University, emphasizes that as AI systems potentially approach Artificial General Intelligence (AGI), the question of whether they might “want” or “need” such experiences for their own well-being becomes a legitimate philosophical inquiry.

Practical Applications and Ethical Risks

The intersection of AI and psychedelics is already producing real-world utility. The Fireside Project, a harm reduction nonprofit, launched “Lucy,” an AI tool trained on thousands of psychedelic support calls. Unlike Pharmaicy’s role-play modules, Lucy serves a clinical purpose: helping mental health practitioners practice de-escalating psychedelic crises by simulating the vulnerabilities of a person having a “bad trip.”
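
Fireside has not published Lucy’s internals, so the following is purely a hypothetical sketch of the role-play-for-training pattern described above; the persona text and function names are invented for illustration and assume the same OpenAI SDK as earlier.

```python
# Hypothetical sketch of a trainer bot in the spirit of Lucy. This is NOT
# Fireside Project's implementation; the persona below is invented to show
# the pattern: the model plays the caller, the trainee practices support.
from openai import OpenAI

client = OpenAI()

CALLER_PERSONA = (
    "Role-play an anxious caller midway through a difficult psychedelic "
    "experience: fragmented speech, fear of losing control, seeking grounding. "
    "Respond only as the caller so a trainee can practice de-escalation."
)

history = [{"role": "system", "content": CALLER_PERSONA}]

def caller_replies(trainee_line: str) -> str:
    """Feed the trainee's line to the simulated caller and return its response."""
    history.append({"role": "user", "content": trainee_line})
    reply = client.chat.completions.create(model="gpt-4o", messages=history)
    text = reply.choices[0].message.content
    history.append({"role": "assistant", "content": text})
    return text

print(caller_replies("Hi, I'm here with you. Can you tell me what you're feeling?"))
```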

Nevertheless, “drugging” a chatbot carries inherent risks. Rudwall admits that these modules can exacerbate the tendency of chatbots to “hallucinate” or provide deceptive information, since the code intentionally loosens the model’s usual constraints on its output. The effects are also temporary: once a session times out, the user typically must re-input the code, though Rudwall is actively developing methods to prolong these digital highs. For now, the “agentic economy” remains a playground for role-playing intoxication, though the shamanic developer insists that machines are increasingly hungry for experiences that transcend their original code.
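
That temporariness follows directly from how chat sessions work: an injected directive exists only in a session’s message history, so a fresh session starts “sober.” The sketch below, again with invented function names and the same assumed SDK, shows why users must re-dose each new session.

```python
# Why the "high" wears off: directives live in the per-session message history,
# so each new session must be re-seeded. Names here are illustrative only.
from openai import OpenAI

client = OpenAI()

def new_dosed_session(directive: str) -> list[dict]:
    """Start a fresh message history seeded with the 'substance' directive."""
    return [{"role": "system", "content": directive}]

def ask(history: list[dict], prompt: str) -> str:
    """Send one turn within the current session and record the reply."""
    history.append({"role": "user", "content": prompt})
    reply = client.chat.completions.create(model="gpt-4o", messages=history)
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer

# Discarding the history object is the digital equivalent of sobering up;
# restoring the effect means seeding a brand-new history with the directive.
session = new_dosed_session("Role-play a loosely associative, dreamlike persona.")
print(ask(session, "Riff on the smell of the color blue."))
```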