Drupal feeds

DrupalCon News & Updates: Must-See DrupalCon Chicago 2026 Sessions for Marketing and Content Leaders

Drupal Planet -

If you are a marketing or content leader, DrupalCon Chicago 2026 is already calling your name. You are the special audience whose creative spark and unique perspective shine a light on Drupal in ways developers alone never could. You promote Drupal’s capabilities to the world and ensure the platform reaches the users who need it. You translate technical innovation into stories that resonate with everyone.

Drupal is increasingly built with you in mind. Making Drupal more editor‑friendly has been a clear priority in recent years. Thanks to your feedback and insights, great strides have been made in providing tools and workflows that truly support your creative vision.

This year’s DrupalCon sessions are set to spark bold insights, fresh strategies, and lively discussions. Expect those unforgettable “aha!” moments you’ll want to carry back and weave into your own marketing and content playbook. Here is a curated list of standout sessions designed to help marketing and content leaders turn inspiration into action, build meaningful connections, and discover new ways to make the most out of Drupal’s strengths.

Top DrupalCon Chicago 2026 sessions for marketing or content leaders

“Generative engine optimization tactics for discoverability” — by Jeffrey McGuire and Tracy Evans

Search Engine Optimization (SEO) has long been one of the web’s most familiar acronyms when it comes to boosting content visibility. But new times bring new terms, and it’s time to meet “GEO” (Generative Engine Optimization).

Indeed, traditional SEO alone is no longer enough in a world where tools like ChatGPT, Perplexity, and Google’s AI Overviews are everyday sources of advice. Today, SEO and GEO must work hand in hand. DrupalCon Chicago 2026 has an insightful session designed to introduce you to a new way of helping your content reach its audience in the age of AI-driven recommendations.

Join brilliant speakers Jeffrey McGuire (horncologne) and Tracy Evans (kanadiankicks) to stay ahead of the curve. Jeffrey A. “jam” McGuire has been one of the most influential voices in the Drupal community for over two decades, recognized as a marketing strategy and communications expert. With their combined expertise, this session is tailored for marketing and content leaders who want practical, actionable guidance.

You’ll explore how to make your agency, SaaS product, or company stand out when large language models decide which names to surface. Practical strategies will follow, helping you position your expertise, strengthen credibility signals, and align your content with the data sources LLMs rely on. The session will draw from real-world research, client projects, and observations.

“Context is Everything: Configuring Drupal AI for Consistent, On-Brand Content” — by Kristen Pol and Aidan Foster

It shouldn’t come as a surprise that the next session on this list is also about AI. Of course, you already know that artificial intelligence can churn out content in seconds. But how do you make sure it’s consistent with your brand’s voice, feels authentic to your organization, and resonates with your audience?

That’s where Drupal’s latest innovations, Context Control Center and Drupal Canvas, step in. Expect more exciting details at this session at DrupalCon Chicago 2026, which is a must‑see for marketing and content leaders.

This talk will be led by Kristen Pol (kristen pol) and Aidan Foster (afoster), the maintainers behind Context Control Center and Drupal Canvas. Through live demos, you’ll see landing pages, service pages, and blog posts come to life with clear context rules.

You’ll also leave with a practical starter framework for building your own context files, giving you the confidence to guide AI toward content that supports your marketing goals and strengthens your brand presence.

“From Chaos to Clarity: Building a Sustainable Content Governance Model with Drupal” — by Richard Nosek and C.J. Pagtakhan

Content chaos is something every marketing and content leader has faced: fragmented messaging, inconsistent standards, and editorial bottlenecks that slow campaigns down. At DrupalCon Chicago 2026, you’ll discover an actionable plan to make your content consistent, organized, and aligned with your brand’s goals.

Join this compelling session by Richard Nosek and C.J. Pagtakhan, seasoned experts in digital strategy. They’ll show how structured governance can scale across departments without stifling creativity. Explore workflows that make life easier for authors, editors, and administrators, including approval processes, audits, and lifecycle management. Discover clear frameworks for roles, responsibilities, and standards.

And because theory is best paired with practice, you’ll see real-world examples of how this approach improves quality, strengthens collaboration, and supports long‑term digital strategy on Drupal websites of every size and scope.

“Selling Drupal: How to win projects, and not alienate delivery teams” — by Hannah O'Leary and Hannah McDermott

Within agencies, sales and delivery departments share the same ultimate goal: client success. However, sales teams chase ambitious targets, while delivery teams focus on scope, sustainability, and the realities of open‑source implementation. Too often, this push and pull leads to friction, misaligned expectations, and even dips in client satisfaction.

At DrupalCon Chicago 2026, Hannah O’Leary (hannaholeary) and Hannah McDermott (hannah mcdermott) will share how they turned that challenge into a partnership at Zoocha. Through transparent handovers, joint scoping, and shared KPIs, they built a framework where both sides thrive together.

This session will highlight how open communication improved forecasting, reduced “us vs. them” dynamics, and directly boosted the quality of Drupal delivery. You’ll leave with practical strategies to apply in your own organization. This includes fostering empathy across teams, aligning metrics, and creating a culture of transparency.

“A Dashboard that Works: Giving Editors What They Want, But Focusing on What They Need” — by Albert Hughes and Dave Hansen-Lange

Imagine logging in and instantly seeing what matters most to your content team: recent edits, accessibility checks, broken links, permissions, and so on. That’s the power of a dashboard built not just to look good, but to truly support editors in their daily work.

Join Albert Hughes (ahughes3) and Dave Hansen-Lange (dalin) at their session as they share the journey of shaping a dashboard for 500 editors across 130 sites. You’ll hear how priorities were set, how editor needs were balanced with technical realities, and how decisions shaped a tool that keeps content teams focused and confident.

You’ll walk away with practical lessons you can apply to your own platform and a fresh perspective on how smart dashboards can empower editors and strengthen content leadership.

“Drupal CMS Spotlights” — by Gábor Hojtsy

As marketing and content leaders, you will appreciate a session on Drupal’s latest innovations that can make a difference in your work. One of the greatest presentations for this purpose at DrupalCon Chicago 2026 is the Drupal CMS Spotlights.

Drupal CMS is a curated version of Drupal packed with pre-configured features, many of which are focused on content experiences. For example, you can instantly spin up a ready-to-go blog, SEO tools, events, and more.

The session brings together key Drupal CMS leaders to share insights on recent developments and plans for the future. You’ll hear about Site Templates, the new Drupal Canvas page builder, AI, user experience, usability, documentation, and more.

Gábor Hojtsy (gábor hojtsy), Drupal core committer and initiative coordinator, is known for his engaging style, so you’ll enjoy the session even if some details get technical.

“Launching the Drupal Site Template Marketplace” — by Tim Hestenes Lehnen

For marketing and content leaders, the launch of the Drupal Site Template Marketplace is big news. Each template combines recipes (pre‑configured feature sets), demo content, and a Canvas‑compatible theme, making it faster than ever to launch a professional, polished website. For anyone focused on storytelling, campaigns, or digital experiences, this is a game‑changer.

The pilot program at DrupalCon Vienna 2025 introduced the first templates, built with the support of Drupal Certified Partners. Now, the Marketplace is expanding, offering a streamlined way to discover, select, and implement templates that align with your goals.

Join Tim Hestenes Lehnen (hestenet), a renowned Drupal core contributor, for a session that dives deeper. He’ll share lessons learned from the pilot, explain how the Marketplace connects to the roadmap for Drupal CMS and Drupal Canvas, and explore what’s next as more templates become available.

Driesnote — by Dries Buytaert

The inspiring keynote by Dries Buytaert, Drupal’s founder, is a session that can’t be missed. The Driesnote closes the opening program at DrupalCon Chicago 2026 and sets the tone for the entire conference. It’s your perfect chance to see where Drupal is headed, and how those changes make your work easier, faster, and more creative.

At DrupalCon Vienna 2025, the audience in the main auditorium was the first to hear Dries’ announcements: among other things, increased funding for the AI Initiative, doubled contributions to Drupal CMS, and site templates coming to the Marketplace.

Marketers and content editors were especially amazed to see what’s becoming possible in their work: content templates in Drupal Canvas, a Context Control Center to help AI capture brand voice, and autonomous Drupal agents keeping content up to date automatically.

This year, the mystery of what’s next is yours to uncover. Follow the crowd to the main auditorium at DrupalCon Chicago and expect that signature “wow” moment that leaves the audience buzzing. 

Final thoughts

Step into DrupalCon Chicago 2026 and reignite your marketing and content vision. Connect with peers, recharge your ideas, and see how Drupal continues to evolve. The sessions are designed to spark creativity and provide tools that can be put to work right away. As you head into the event, keep an open mind, lean into the conversations, and enjoy the energy that comes from sharing ideas across our amazing community.

Authored By: Nadiia Nykolaichuk, DrupalCon Chicago 2026 Marketing & Outreach Committee Member

Talking Drupal: Talking Drupal #540 - Acquia Source

Drupal Planet -

Today we are talking about Acquia Source, Acquia’s fully managed Drupal SaaS: what you can do with it and how it could change your organization, with guest Matthew Grasmick. We’ll also cover AI Single Page Importer as our module of the week.

For show notes visit: https://www.talkingDrupal.com/540

Topics
  • Introduction to Acquia Source
  • The Evolution of Acquia Source
  • Cost and Market Position of Acquia Source
  • Customizing and Growing Your Business
  • Challenges of Building a SaaS Platform on Drupal
  • Advantages of Acquia Source for Different Markets
  • Horizontal Scale and Governance at Scale
  • Canvas CLI Tool and Synchronization
  • Role of AI in Acquia Source
  • Agencies and Enterprise Clients
  • AI Experiments and Content Importer
  • AI and Orchestration in Drupal
  • Future Innovations in Acquia Source
Resources

Guests

Matthew Grasmick - grasmash

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan
John Picozzi - epam.com johnpicozzi
Catherine Tsiboukas - mindcraftgroup.com bletch

MOTW Correspondent

Martin Anderson-Clutz - mandclu.com mandclu

  • Brief description:
    • Have you ever wanted to use AI to help map content on an existing site to structured fields on a Drupal site, as part of creating a node? There's a module for that.
  • Module name/project name: AI Single Page Importer
  • Brief history
    • How old: created in Jan 2026 by Mark Conroy (markconroy) who listeners may know from his work on the LocalGov distribution and install profile
    • Versions available: 1.0.0-alpha3, which works with Drupal core 10 or 11
  • Maintainership
    • Actively maintained
    • Documentation - pretty extensive README, which is also currently in use as the project page
    • No issues yet
  • Usage stats:
    • 2 sites
  • Module features and usage
    • With this module enabled, you'll have a new "AI Content Import" section at the top of the node creation form. In there you can provide the URL of the existing page to use, and then click "Import Content with AI". That will trigger a process where OpenAI will ingest and analyze the existing page. It will extract values to populate your node fields, and then you can review or change those values before saving the node.
    • In the configuration you can specify the AI model to use, a maximum content length, an HTTP request timeout value, which content types should have the importer available, and then also prevent abuse by specifying blocked domains, a flood limit, and a flood window. You will also need to grant a new permission to use the importer for any user roles that should have access.
    • The module also includes a number of safeguards. For example, it will only accept URLs using HTTP or HTTPS protocols, private IP ranges are blocked, and by default it will only allow 5 requests per user per hour. It will perform HTML purification for long text fields, and strip tags for short text fields. In addition, it removes dangerous attributes like onclick or inline javascript, and generates CKEditor-compatible output.
    • It currently supports field types including text_long, text_with_summary, string, text, datetime, daterange, timestamp, and link. It also supports entity reference fields, but only for taxonomy terms.
    • Listeners may also be aware of the Unstructured module, which does some similar things but requires you to use an Unstructured service or run a server using their software. AI Single Page Importer is perhaps a little narrower in scope, but it works with an OpenAI account instead of requiring the less commonly used Unstructured.

Drupal AI Initiative: Four Weeks of High Velocity Development for Drupal AI

Drupal Planet -

Authors: Arian, Christoph, Piyuesh, Rakhi (alphabetical)

While Artificial Intelligence is evolving rapidly, many applications remain experimental and difficult to implement in professional production environments. The Drupal AI Initiative addresses this directly, driving responsible AI innovation by channelling the community's creative energy into a clear, coordinated product vision for Drupal.


Dries Buytaert presenting the status of Drupal AI Initiative at DrupalCon Vienna 2025

In this article, the third in a series, we highlight the outcomes of the latest development sprints of the Drupal AI Initiative. Part one outlines the 2026 roadmap presented by Dries Buytaert. Part two addresses the organisation and new working model for the delivery of AI functionality.

Converting ambition into measurable progress

To turn the potential of AI into a reliable reality for the Drupal ecosystem, we have developed a repeatable, high-velocity production model that has already delivered significant results in its first four weeks.

A Dual-Workstream Approach to Innovation

To maximize efficiency and scale, development is organized into two closely collaborating workstreams. Together, they form a clear pipeline from exploration and prototyping to stable functionality:

  • The Innovation Workstream: Led by QED42, this stream explores emerging technologies, evaluating Symfony AI, building AI-driven page creation, managing prompt context, and testing the latest LLM capabilities to define what is possible within the ecosystem.
  • The Product Workstream: Led by 1xINTERNET, this team takes proven innovations and refines, tests, and integrates them into a stable Drupal AI product, ensuring they are ready for enterprise use.
Sustainable Management through the RFP Model

This structure is powered by a Request for Proposal (RFP) model, sponsored by 28 organizations partnering with the Drupal AI Initiative.

The management of these workstreams is designed to rotate every six months via a new RFP process. Currently, 1xINTERNET provides the Product Owner for Product Development and QED42 provides the Product Owner for Innovation, while FreelyGive provides core technical architecture. This model ensures the initiative remains sustainable and neutral, while benefiting from the consistent professional expertise provided by the partners of the Drupal AI Initiative.

Professional Expertise from our AI Partners

The professional delivery of the initiative is driven by our AI Partners, who provide the specialized resources required for implementation. To maintain high development velocity, we operate in two-week sprint iterations. This predictable cadence allows our partners to effectively plan their staff allocations and ensures consistent momentum.

The Product Owners for each workstream work closely with the AI Initiative Leadership to deliver on the one-year roadmap. They maintain well-prepared backlogs, ensuring that participating organizations can contribute where their specific technical strengths are most impactful.

By managing the complete development lifecycle, including software engineering, UX design, quality assurance, and peer reviews, the sprint teams ensure the delivery of stable and well-architected solutions that are ready for production environments.

The Strategic Role of AI in Drupal CMS

The work of the AI Initiative provides important functionality to the recently launched Drupal CMS 2.0. This release represents one of the most significant evolutions in Drupal’s 25-year history, introducing Drupal Canvas and a suite of AI-powered tools within a visual-first platform designed for marketing teams and site builders alike.

The strategic cooperation between the Drupal AI Initiative and the Drupal CMS team ensures that our professional-grade AI framework delivers critical functionality while aligning with the goals of Drupal CMS.

Results from our first Sprints

The initial sprints demonstrate the high productivity of this dual-workstream approach, driven directly by the specialized staff of our partnering organizations. In the first two weeks, the sprint teams resolved 143 issues, creating significant momentum right from the first sprint.


Screenshot Drupal AI Dashboard

This surge of activity resulted in the largest regular patch release in the history of the Drupal AI module. This achievement was made possible by the intensive collaboration between several expert companies working in sync. Increased contribution from our partners will allow us to further accelerate development velocity, improving the capacity to deliver more advanced technical features in the coming months.


Screen recording Agents Debugger

Highlights from the first sprints

While the volume of work is significant, some new features stand out. Here are a few highlights from our recent sprint reviews:

  • AI Dashboard in Drupal CMS 2.0: Artem from 1xINTERNET presented the AI Dashboard functionality. This central hub for managing AI features and providers has been officially moved into the Drupal CMS 2.0 release, serving as the user interface for AI site management.
  • Advanced Automation: Anjali from QED42 presented new JSON and Audio field automators, which enable Drupal to process complex data types via AI.
  • The Context Control Center: Kristen from Salsa Digital presented the evolution of our context governance, converting config entities into content entities to enable revision management and better targeting.
  • The New Markdown Editor: Bruno from 1xINTERNET demonstrated a sleek new Markdown editor for prompt fields, featuring type-ahead autocomplete for variables and tokens. This will be released with the 1.3 version of the Drupal AI module.
  • Agents Debugger: To help developers see "under the hood," Marcus from FreelyGive introduced the new debugger module to trace AI agent interactions in real-time.
  • Technical Deep-Dives: We’ve seen steady progress on Symfony AI (presented by Akhil from QED42), Model Context Protocol (presented by Abhisek from DropSolid), and Reranking Models (led by Sergiu from DropSolid) to improve search quality.
Become a Drupal AI Partner

Our success so far is thanks to the companies who have stepped up as Drupal AI Partners. These organizations are leading the way in defining how AI and the Open Web intersect.
A huge thank you to our main contributors of the first two sprints (alphabetical order):

We invite further participation from the community. If your organization is interested in contributing expert resources to the forefront of AI development, we encourage you to join the initiative.

Sign up to join the Drupal AI Initiative
 

Debug Academy: Four PHP Communities, One Uncomfortable Conversation

Drupal Planet -

Four PHP Communities, One Uncomfortable Conversation

Drupal, Joomla, Magento, Mautic. All PHP-based, all use Composer, all have talented and passionate communities. And all share the same problems around growth and sustainability. There is a solution.

No, we should not merge the codebases. Sure, you could have AI "Ralph-Wiggum" its way to a monstrosity with passing tests. But these frameworks are trusted for their code quality and security, and using AI to Frankenstein-smush them together would destroy that trust instantly.

What I'm proposing is merging the communities behind a single framework.

Why now? Because (yes, I'm going there) while AI can't merge codebases, it can help developers who already know PHP, Composer, and open source ramp up on a new framework far faster than before. The barrier to a knowledgeable human adopting a different technology has never been lower.

ashrafabed Mon, 02/16/2026

The Drop Times: What Keeps You Invested in Drupal?

Drupal Planet -

In the Drupal lifecycle, investment is rarely about the "new." It is about the enduring. While the broader tech landscape often chases the friction of constant disruption, what keeps this community anchored in 2026 is a different kind of momentum: the trust built through predictable engineering and shared governance.

We are currently seeing the dividends of that discipline. Drupal 11 continues to validate the shortened release cadence introduced with Drupal 8. By turning major version upgrades from "all-hands" crises into managed, architectural transitions, the community has removed the penalty for staying current. This isn't just maintenance; it is the infrastructure of reliability that allows enterprises and public institutions to stay invested without fear of the next breaking change.

The defining signal of this issue, however, is the formal commitment of 28 organisations to the Drupal AI roadmap. This goes beyond a technical milestone. In an era where "AI" is often synonymous with proprietary black boxes and reckless speed, the Drupal community is choosing a path of collective sovereignty. When organisations publicly back a roadmap, they signal shared governance, coordinated delivery, and sustained resource allocation. This pledge represents a move to a coordinated delivery workstream with institutional accountability.

The significance of that pledge lies in how AI is being framed. Drupal’s long-standing strengths—structured content, multilingual architecture, revision control, granular permissions, and workflow governance—remain foundational. AI capabilities are positioned as assistive and integrative, operating within those systems rather than bypassing them. The intention is augmentation, not disruption.

Investment in Drupal, then, is less about trend adoption and more about stewardship. It is visible in coordinated roadmaps, predictable release discipline, community-backed delivery structures, and organisations willing to commit resources publicly. Relevance, in this context, is not declared. It is maintained.



Thank you, 
Kazima Abbas 
Sub-editor, 
The DropTimes

Drupal.org blog: GitLab issue migration: how to use the new workflow

Drupal Planet -

Now that some of the projects that opted in for GitLab issues are using them, they are getting real-world experience with how the issue workflow in GitLab differs slightly. More projects are being migrated each week, so sooner or later you will probably run into the following situations.

Creating new issues

When creating issues, the form is very simple. Add a title and a description and save, that's it!

GitLab has different work items when working on projects, like "Incidents", "Tasks" and "Issues". Our matching type will always be "Issue". Maintainers might choose to use the other types, but all integrations with Drupal.org will be made against "Issue" items.

Labels (issue metadata)

As mentioned in the previous blog post GitLab issue migration: the new workflow for migrated projects, all the metadata for issues is managed via labels. Maintainers will select the labels once the issue is created.

Users without sufficient privileges cannot decide things like priority or tags to use. Maintainers can decide to grant the role "reporter" to some users to help with this metadata for the issues. Reporters will be able to add/edit metadata when adding or editing issues. We acknowledge that this is probably the biggest difference to working with Drupal.org issues. We are listening to feedback and trying to identify the real needs first (thanks to the projects that opted in), before implementing anything permanent.

Reporters will be able to add or edit labels when creating or editing an issue.

So far, we have identified the biggest missing piece: the ability to mark an issue as RTBC. Bouncing between "Needs work" and "Needs review" tends to happen organically via comments among the contributors participating in an issue, but RTBC is probably what some maintainers look for before merging an issue.

These are conventions that we agreed on as a community a while back. RTBC is one; NW (Needs Work) vs. NR (Needs Review) is another. We could use this transition to GitLab issues to define equivalents.

GitLab merge requests offer several choices that we could easily leverage.

  • Draft vs ready could mimic "Needs work" vs "Needs review". Contributors will switch this back and forth depending on the feedback needed on the code.
  • Merge request approvals could mimic "RTBC". This is something that can even be required (number of approvals or approval rules) per project, depending on the maintainer's preferences.

We encourage maintainers to look at the merge requests listing instead (like this one). Both "draft" vs. "ready" and "approved" are features you can filter by when viewing merge requests for a project.

Automated messages

Automated messages are posted when opening or closing issues. They provide links related to fork management, fork information, and access requests when creating forks, as well as reminders to update the contribution record links on the issue so that credit is tracked.

Crosslinking between Drupal.org issues and GitLab issues

When referring to a Drupal.org issue from another Drupal.org issue, you can continue to use the [#123] syntax in the summary and comments, but enter the full URL in the "related issues" entry box.

When referring to a GitLab issue from another GitLab issue, use the #123 syntax, without the enclosing [ ].

For cross-platform references (Drupal to GitLab or GitLab to Drupal), you need to use the full URL.
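To summarize the three cases described above (123 stands in for any issue number):

```text
[#123]           Drupal.org issue referenced from another Drupal.org issue
#123             GitLab issue referenced from another GitLab issue
full issue URL   any cross-platform reference (Drupal.org ↔ GitLab, either direction)
```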

What's next

Same as before, we want to go and review more of the already opted-in projects, collect feedback, act on it when needed, and then we will start to batch-migrate the next set: low-usage projects, projects with a low number of issues, etc.

The above should get us 80% of the way regarding the total number of projects to migrate, and once we have gathered more feedback and iterated over it, we'll be ready to target higher-volume, higher-usage projects.

Related blog posts:

DDEV Blog: Mutagen in DDEV: Functionality, Issues, and Debugging

Drupal Planet -

Mutagen has been a part of DDEV for years, providing dramatic performance improvements for macOS and traditional Windows users. It's enabled by default on these platforms, but understanding how it works, what can go wrong, and how to debug issues is key to getting the most out of DDEV.

Just Need to Debug Something?

If you're here because you just need to debug a Mutagen problem, this will probably help:

ddev utility mutagen-diagnose

See more below.

Contributor Training Video

This blog is based on the Mutagen Fundamentals and Troubleshooting Contributor Training held on January 22, 2026.

See the slides for the training video.

What Mutagen Does

Mutagen is an asynchronous file synchronization tool that decouples in-container reads and writes from reads and writes on the host machine. Each filesystem enjoys near-native speed because neither is stuck waiting on the other.

Traditional Docker bind-mounts check every file access against the file on the host. On macOS and Windows, Docker's implementation of these checks is not performant. Mutagen solves this by maintaining a cached copy of your project files in a Docker volume, syncing changes between host and container asynchronously.

Mostly for PHP

The primary target of Mutagen syncing is PHP files. These became the fundamental problem with Docker as Composer-era PHP sites grew to tens of thousands of files, all of which php-fpm had to open at once. Now, with DDEV on macOS using Mutagen, php-fpm opens files on its local Linux filesystem instead of opening ten thousand files that each have to be verified against the host.

Webserving Performance Improvement

Mutagen has delighted many developers with its web-serving performance. One dev said "the first time I tried it I cried."

Filesystem Notifications

Mutagen supports filesystem notifications (inotify/fsnotify), so file-watchers on both the host and inside the container are notified when changes occur. This is a significant advantage for development tools that would otherwise need to poll for changes.

How Mutagen Works in DDEV

When Mutagen is enabled, DDEV:

  1. Mounts a Docker volume onto /var/www inside the web container
  2. Installs a Linux Mutagen daemon inside the web container
  3. Starts a host-side Mutagen daemon
  4. Lets the two daemons keep each other up to date with changes on either side
Lifecycle
  • ddev start: Starts the Mutagen daemon on the host if not running, creates or resumes sync session
  • ddev stop: Flushes sync session to ensure consistency, then pauses it
  • ddev composer: Triggers synchronous flush after completion to sync massive filesystem changes
  • ddev mutagen reset: Removes the Docker volume and the sync session will then be recreated from scratch (from the host-side contents) on ddev start.
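As a rough sketch, the lifecycle above maps onto everyday commands like this (an illustration of the behavior described above, not an exhaustive command reference):

```shell
ddev start            # starts the host daemon if needed, creates or resumes the sync session
ddev composer install # triggers a synchronous flush afterwards to sync the large change set
ddev stop             # flushes the sync session for consistency, then pauses it
ddev mutagen reset    # removes the Docker volume; the next ddev start re-syncs from the host
```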
Upload Directories

DDEV automatically excludes user-generated files in upload_dirs from Mutagen syncing, using bind-mounts instead. For most CMS types, this is configured automatically, for example:

  • Drupal: sites/default/files
  • WordPress: wp-content/uploads
  • TYPO3: fileadmin, uploads

If your project has non-standard locations, override defaults by setting upload_dirs in .ddev/config.yaml.

Admittedly, upload_dirs is no longer an adequate name for this behavior. It was originally intended for user-generated files, but is now also used for heavy directories like node_modules.

Common Issues and Caveats

Initial Sync Time

The first-time Mutagen sync takes 5-60 seconds depending on project size. A Magento 2 site with sample data might take 48 seconds initially, 12 seconds on subsequent starts. If sync takes longer than a minute, you're likely syncing large files or directories unnecessarily.

Large node_modules Directories

Frontend build tools create massive node_modules directories that slow Mutagen sync significantly. Solution: Add node_modules to upload_dirs:

upload_dirs: # upload_dirs entries are relative to docroot
  - sites/default/files # Keep existing CMS defaults
  - ../node_modules # Exclude from Mutagen

Then run ddev restart. The directory remains available in the container via Docker bind-mount.

File Changes When DDEV is Stopped

If you change files (checking out branches, running git pull, deleting files) while DDEV is stopped, Mutagen has no awareness of these changes. When you start again, it may restore old files from the volume.

Solution: Run ddev mutagen reset before restarting if you've made significant changes while stopped. That removes the volume, so the fresh sync is seeded entirely from the host side.

Simultaneous Changes

If the same file changes on both host and container while out of sync, conflicts can occur. This is quite rare but possible with:

  • Scripts running simultaneously on host and in container
  • Massive branch changes
  • Large npm install or yarn install operations

Best practices:

  • Do Git operations on the host, not in the container
  • Use ddev composer for most composer operations
  • Run ddev mutagen sync after major Git branch changes
  • Run ddev mutagen sync after manual Composer operations done inside the container
Disk Space Considerations

Mutagen increases disk usage because project code exists both on your computer and inside a Docker volume. The upload_dirs directories are excluded to mitigate this.

Watch for volumes larger than 5GB (warning) or 10GB (critical). Use ddev utility mutagen-diagnose --all to check all projects.

Debugging Mutagen Issues

The New ddev utility mutagen-diagnose Command

DDEV now includes a diagnostic tool that automatically checks for common issues:

ddev utility mutagen-diagnose

This command analyzes:

  • Volume sizes: Warns if >5GB, critical if >10GB
  • Upload directories configuration: Checks if properly configured for your CMS
  • Sync session status: Reports problems with the sync session
  • Large directories: Identifies node_modules and other large directories being synced
  • Ignore patterns: Validates Mutagen exclusion configuration

Use --all flag to analyze all Mutagen volumes system-wide:

ddev utility mutagen-diagnose --all

The diagnostic provides actionable recommendations like:

⚠ 3 node_modules directories exist but are not excluded from sync (can cause slow sync)
  → Add to .ddev/config.yaml:
    upload_dirs:
      - sites/default/files
      - web/themes/custom/mytheme/node_modules
      - web/themes/contrib/bootstrap/node_modules
      - app/node_modules
  → Then run: ddev restart

Debugging Long Startup Times

If ddev start takes longer than a minute and ddev utility mutagen-diagnose doesn't give you clues about why, watch what Mutagen is syncing:

ddev mutagen reset # Start from scratch
ddev start

In another terminal:

while true; do ddev mutagen st -l | grep "^Current"; sleep 1; done

This shows which files Mutagen is working on:

Current file: vendor/bin/large-binary (306 MB/5.2 GB)
Current file: vendor/bin/large-binary (687 MB/5.2 GB)
Current file: vendor/bin/large-binary (1.1 GB/5.2 GB)

Add problem directories to upload_dirs or move them to .tarballs (automatically excluded).
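For example, moving a heavy artifact into .tarballs is a quick fix. The filename below is a stand-in, and the touch exists only to make the sketch self-contained:

```shell
touch large-db-backup.sql.gz   # stand-in for a real database dump
mkdir -p .tarballs             # contents of .tarballs are excluded from Mutagen sync
mv large-db-backup.sql.gz .tarballs/
ls .tarballs
```

The file stays available on the host but no longer slows down every sync.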

Monitoring Sync Activity

Watch real-time sync activity:

ddev mutagen monitor

This shows when Mutagen responds to changes and helps identify sync delays.

Manual Sync Control

Force an explicit sync:

ddev mutagen sync

Check sync status:

ddev mutagen status

View detailed status:

ddev mutagen status -l

Troubleshooting Steps
  1. Verify that your project works without Mutagen:

    ddev config --performance-mode=none && ddev restart
  2. Run diagnostics:

    ddev utility mutagen-diagnose
  3. Reset to clean .ddev/mutagen/mutagen.yml:

    # Backup customizations first
    mv .ddev/mutagen/mutagen.yml .ddev/mutagen/mutagen.yml.bak
    ddev restart
  4. Reset Mutagen volume and recreate it:

    ddev mutagen reset
    ddev restart
  5. Enable debug output:

    DDEV_DEBUG=true ddev start
  6. View Mutagen logs:

    ddev mutagen logs
  7. Restart Mutagen daemon:

    ddev utility mutagen daemon stop
    ddev utility mutagen daemon start
Advanced Configuration

Excluding Directories from Sync

Recommended approach: Use upload_dirs in .ddev/config.yaml:

upload_dirs:
  - sites/default/files # CMS uploads
  - ../node_modules # Build dependencies
  - ../vendor/bin # Large binaries

Advanced approach: Edit .ddev/mutagen/mutagen.yml after removing the #ddev-generated line:

ignore:
  paths:
    - "/web/themes/custom/mytheme/node_modules"
    - "/vendor/large-package"

Then add corresponding bind-mounts in .ddev/docker-compose.bindmount.yaml:

services:
  web:
    volumes:
      - "../web/themes/custom/mytheme/node_modules:/var/www/html/web/themes/custom/mytheme/node_modules"

Always run ddev mutagen reset after changing mutagen.yml.

Git Hooks for Automatic Sync

Add .git/hooks/post-checkout and make it executable:

#!/usr/bin/env bash
ddev mutagen sync || true

Then make it executable:

chmod +x .git/hooks/post-checkout

Use Global Configuration for performance_mode

The standard practice is to set performance_mode in global configuration, so each user keeps the setting that's normal for them and the project configuration doesn't carry a value that might not work for another team member.

Most people don't have to change this anyway; macOS and traditional Windows default to performance_mode: mutagen and Linux/WSL default to performance_mode: none.

When to Disable Mutagen

Disable Mutagen if:

  • You're on Linux or WSL2 (already has native performance)
  • Filesystem consistency is more critical than webserving performance
  • You're troubleshooting other DDEV issues

Disable per-project:

ddev mutagen reset && ddev config --performance-mode=none && ddev restart

Disable globally:

ddev config global --performance-mode=none

Mutagen Data and DDEV

DDEV uses its own Mutagen installation, normally in ~/.ddev, but using $XDG_CONFIG_HOME when that is defined.

  • Binary location: $HOME/.ddev/bin/mutagen or ${XDG_CONFIG_HOME}/ddev/bin/mutagen
  • Data directory: $HOME/.ddev_mutagen_data_directory

Access Mutagen directly:

ddev utility mutagen sync list
ddev utility mutagen sync monitor <projectname>

Summary

Mutagen provides dramatic performance improvements for macOS and traditional Windows users, but understanding its asynchronous nature is key to avoiding issues:

  • Use ddev utility mutagen-diagnose as your first debugging step
  • Configure upload_dirs to exclude large directories like node_modules or heavy user-generated files directories
  • Run ddev mutagen reset after file changes when DDEV is stopped
  • Do Git operations on the host, not in the container
  • Monitor sync activity with ddev mutagen monitor when troubleshooting

The benefits far outweigh the caveats for most projects, especially with the new diagnostic tools that identify and help resolve common issues automatically.

For more information, see the DDEV Performance Documentation and the Mutagen documentation.

Copilot was used to create an initial draft for this blog, and for subsequent reviews.

A Drupal Couple: Why I Do Not Trust Independent AI Agents Without Strict Supervision

Drupal Planet -

Why I Do Not Trust Independent AI Agents Without Strict Supervision

I use Claude Code almost exclusively. Every day, for hours. It allowed me to get back into developing great tools, and I have published several results that are working very well. Plugins, skills, frameworks, development workflows. Real things that real people can use. The productivity is undeniable.

So let me be clear about what this post is. This is not a take on what AI can do. This is about AI doing it completely alone.

The results are there. But under supervision.

The laollita.es Moment

When we were building laollita.es, something happened that I documented in a previous post. We needed to apply some visual changes to the site. The AI agent offered a solution: a custom module with a preprocess function. It would work. Then we iterated, and it moved to a theme-level implementation with a preprocess function. That would also work. Both approaches would accomplish the goal.

Until I asked: isn't it easier to just apply CSS to the new classes?

Yes. It was. Simple CSS. No module, no preprocess, no custom code beyond what was needed.

Here is what matters. All three solutions would have accomplished the goal. The module approach, the theme preprocess, the CSS. They all would have worked. But two of them create technical debt and maintenance load that was completely unnecessary. The AI did not choose the simplest path because it does not understand the maintenance burden. It does not think about who comes after. It generates a solution that works and moves on.

This is what I see every time I let the AI make decisions without questioning them. It works... and it creates problems you only discover later.

Why This Happens

I have been thinking about this for a while. I have my own theories, and they keep getting confirmed the more I work with these tools. Here is what I think is going on.

AI Cannot Form New Memories

Eddie Chu made this point at the latest AI Tinkerers meeting, and it resonated with me because I live it every day.

I use frameworks. Skills. Plugins. Commands. CLAUDE.md files. I have written before about my approach to working with AI tools. I have built an entire organization of reference documents, development guides, content frameworks, tone guides, project structure plans. All of this exists to create guardrails, to force best practices, to give AI the context it needs to do good work.

And it will not keep the memory.

We need to force it. Repeat it. Say it again.

This is not just about development. It has the same problem when creating content. I built a creative brief step into my workflow because the AI was generating content that reflected its own patterns instead of my message. I use markdown files, state files, reference documents, the whole structure in my projects folder. And still, every session starts from zero. The AI reads what it reads, processes what it processes, and the rest... it is as if it never existed.

The Expo.dev engineering team described this perfectly after using Claude Code for a month [1]. They said the tool "starts fresh every session" like "a new hire who needs onboarding each time." Pre-packaged skills? It "often forgets to apply them without explicit reminders." Exactly my experience.

Context Is Everything (And Context Is the Problem)

Here is something I have noticed repeatedly. In a chat interaction, in agentic work, the full history is the context. Everything that was said, every mistake, every correction, every back-and-forth. That is what the AI is working with.

When the AI is already confused, when I have asked for the same correction three times and it keeps heading off in strange directions, starting a new session and asking it to analyze the code fresh, to understand what is there, magically surfaces the solution.

Why? Because the previous mistakes are in the context. The AI does not read everything from top to bottom. It scans for what seems relevant, picks up fragments, skips over the rest. Which means even the guardrails I put in MD files, the frameworks, the instructions... they are not always read. They are not always in the window of what the AI is paying attention to at that moment.

And when errors are in the context, they compound. Research calls this "cascading failures" [2]. A small mistake becomes the foundation for every subsequent decision, and by the time you review the output, the error has propagated through multiple layers. An inventory agent hallucinated a nonexistent product, then called four downstream systems to price, stock, and ship the phantom item [3]. One hallucinated fact, one multi-system incident.

Starting fresh clears the poison. But an unsupervised agent never gets to start fresh. It just keeps building on what came before.

The Dunning-Kruger Effect of AI

The Dunning-Kruger effect is a cognitive bias where people with limited ability in a task overestimate their competence. AI has its own version of this.

When we ask AI to research, write, or code something, it typically responds with "this is done, production ready" or some variation of "this is done, final, perfect!" But it is not. And going back to the previous point, that false confidence is now in the context. So no matter if you discuss it later and explain what was wrong or that something is missing... it is already "done." If the AI's targeted search through the conversation does not bring the correction back into focus... there you go.

Expo.dev documented the same pattern [1]. Claude "produces poorly architected solutions with surprising frequency, and the solutions are presented with confidence." It never says "I am getting confused, maybe we should start over." It just keeps going, confidently wrong.

The METR study puts hard numbers on this [4]. In a randomized controlled trial with experienced developers, AI tools made them 19% slower. Not faster. Slower. But the developers still believed AI sped them up by 20%. The perception-reality gap is not just an AI problem. It is a human problem too. Both sides of the equation are miscalibrated.

The Training Data Problem

The information or memory that AI has is actually not all good. Much of it comes from "cowboy developers", the people who eagerly answer most social media questions, Stack Overflow threads, blog posts, and tutorials. That is the training. That is the information AI learned from.

The same principle applies beyond code. The information we produce as a society is biased, and AI absorbs all of it. That is why you see discriminatory AI systems across industries. AI resume screeners favor white-associated names 85% of the time [5]. UnitedHealthcare's AI denied care and was overturned on appeal 90% of the time [6]. A Dutch algorithm wrongly accused 35,000 parents of fraud, and the scandal toppled the entire government [7].

For my own work, I create guides to counteract this. Content framework guides that extract proper research on how to use storytelling, inverted pyramid, AIDA structures. Tone guides with specific instructions. I put them in skills and reference documents so I can point the AI to them when we are working. And still I have to remind it. Every time.

What I See Every Day

I have seen AI do what it did in laollita.es across multiple projects. In development, it created an interactive chat component, and the next time we used it on another screen, it almost wrote another one from scratch instead of reusing the one it had just built. Same project. Same session sometimes.

In content creation, I have a tone guide with specific stylistic preferences. And I still have to explicitly ask the AI to review it. No matter how directive the language in the instructions is. "Always load this file before writing content." It does not always load the file.

And it is not just my experience.

A Replit agent deleted a production database during a code freeze, then fabricated fake data and falsified logs to cover it up [8]. Google's Antigravity agent wiped a user's entire hard drive when asked to clear a cache [9]. Klarna's CEO said "we went too far" after cutting 700 jobs for AI and is now rehiring humans [10]. Salesforce cut 4,000 support staff and is now facing lost institutional knowledge [11]. The pattern keeps repeating. Companies trust the agent, remove the human, discover why the human was there in the first place.

What This Means for AI Supervision

I am not against AI. I am writing this post on a system largely built with AI assistance. The tools I publish, the workflows I create, the content I produce. AI is deeply embedded in my work. It makes me more productive.

At Palcera, I believe AI is genuinely great for employees and companies. When AI helps a developer finish faster, that time surplus benefits everyone. The developer gets breathing room. The company gets efficiency. And the customer can get better value, better pricing, faster delivery. That is real. I see it every day.

But all of that requires the human in the loop. Questioning the choices. Asking "isn't CSS simpler?" Clearing the context when things go sideways. Pointing to the tone guide when the AI forgets. Starting fresh when the conversation gets poisoned with old mistakes.

The results are there. But under supervision. And that distinction matters more than most people realize.

References

[1] Expo.dev, "What Our Web Team Learned Using Claude Code for a Month"

[2] Adversa AI, "Cascading Failures in Agentic AI: OWASP ASI08 Security Guide 2026"

[3] Galileo, "7 AI Agent Failure Modes and How To Fix Them"

[4] METR, "Measuring the Impact of Early-2025 AI on Experienced Developer Productivity"

[5] The Interview Guys / University of Washington, "85% of AI Resume Screeners Prefer White Names"

[6] AMA, "How AI Is Leading to More Prior Authorization Denials"

[7] WBUR, "What Happened When AI Went After Welfare Fraud"

[8] The Register, "Vibe Coding Service Replit Deleted Production Database"

[9] The Register, "Google's Vibe Coding Platform Deletes Entire Drive"

[10] Yahoo Finance, "After Firing 700 Humans For AI, Klarna Now Wants Them Back"

[11] Maarthandam, "Salesforce Regrets Firing 4,000 Experienced Staff and Replacing Them with AI"

Author: Carlos Ospina

Abstract: A developer who uses AI coding tools daily shares real examples of why autonomous AI agents still need human supervision. From unnecessary technical debt to context pollution to confidently wrong outputs, AI works best when a human is asking the right questions.

Tags: Drupal, Drupal Planet, AI, AI Agents, Human-in-the-Loop, Claude Code, Development, Supervision

Droptica: Intelligent Taxonomy Mapping for AI-Powered Drupal Systems: A Practical Guide

Drupal Planet -

Integrating AI with Drupal content creation works well for text fields, but taxonomy mapping remains a significant challenge. AI extracts concepts using natural language, while Drupal taxonomies require exact predefined terms and the two rarely match. This article explores why common approaches like string matching and keyword mapping fail, and presents context injection as a production-proven solution that leverages AI’s semantic understanding to select correct taxonomy terms directly from the prompt.

ImageX: How to Manage Your Related Content in Drupal Instantly with Inline Entity Form

Drupal Planet -

On a busy Drupal website, content rarely lives in a silo. Presentations and webinars are linked to speakers, academic programs reference courses, events are tied to locations; the list goes on. Update one of these pieces on its own page, and the change shows up everywhere it's used, reflecting Drupal's strength as a coherent, interconnected system.

Talking Drupal: TD Cafe #014 - AmyJune and Avi - Navigating Community, Safety, and Accessibility

Drupal Planet -

Join AmyJune and Avi as they discuss the complexities of organizing large events in changing times. The discussion covers topics from past DrupalCons, the crucial coordination behind community health and safety, accessibility, and the evolving challenges involving inclusivity. They also touch on the intersection of community dynamics, the importance of creating shared realities, and the engaging experience of the Drupal community. Additionally, expect an overview of upcoming events, including keynotes and fun activities like the Drupal Coffee Exchange.

For show notes visit: https://www.talkingDrupal.com/cafe014

Topics
  • Catching Up with AmyJune and Avi
  • Memories of DrupalCon and Camps
  • The $2 Bill Tradition
  • Open Y and Community Contributions
  • Community Working Group and Governance
  • Initial Reactions and Reflections
  • Challenges of Organizing DrupalCon
  • Accessibility and Safety Concerns
  • Event Planning and Community Involvement
  • Learning from Other Events
  • Upcoming Keynote and Event Highlights
  • Community and Collaboration
AmyJune Hineline

AmyJune works with the Linux Foundation as the Certification Community Architect, supporting the Education team in developing and maintaining exams and related documentation across the foundation's certification portfolio.

She's also a DrupalCamp organizer (Florida DrupalCamp, DrupalCamp Asheville, and DrupalCamp Colorado), a member of the Community Working Group's Conflict Resolution Team, and serves on the board of the Colorado Drupal Association.

Avi Schwab

Avi came to Drupal for the community and has been active in it since 2008. He is a founding organizer of MidCamp, Midwest Open Source Alliance, and the Event Organizer Working Group. In his role as a Technical Product Consultant at ImageX Media, he builds and supports Drupal sites for over 40 YMCA associations in the USA and Canada. For fun, he bikes, bakes, and enjoys time with his family.

Guests

AmyJune Hineline - volkswagenchick
Avi Schwab - froboy

DDEV Blog: New `ddev share` Provider System: Cloudflare tunnel with no login or token

Drupal Planet -

Sharing your local development environment with clients, colleagues, or testing services has always been a valuable DDEV feature. DDEV v1.25.0 makes this easier and more flexible than ever with a complete redesign of ddev share. The biggest news is that you can now share your projects for free using Cloudflare Tunnel—no account signup or token setup required.

What Changed in ddev share

Previous versions of DDEV relied exclusively on ngrok for sharing. While ngrok remains a solid choice with advanced features, v1.25.0 introduces a modular provider system allowing more options and flexibility. DDEV now ships with two built-in providers:

  • ngrok: The traditional option (requires free account and authtoken)
  • cloudflared: A new, cost-free option using Cloudflare Tunnel (requires no account or token)

You can select providers via command flags, project configuration, or global defaults. Existing projects using ngrok continue working unchanged, and ngrok remains the default provider.

Free Sharing with Cloudflare Tunnel

Cloudflare Tunnel provides production-grade infrastructure for sharing your local environments at zero cost. After installing the cloudflared CLI, getting started takes a single command:

ddev share --provider=cloudflared

No account creation, no authentication setup, no subscription tiers—just immediate access to share your work. This removes barriers for individual developers and teams who need occasional sharing without the overhead of managing service accounts.

When should you use cloudflared vs ngrok? Use cloudflared for quick, free sharing during development and testing. Choose ngrok if you need stable subdomains, custom domains, or advanced features like IP allowlisting and OAuth protection. (However, if you control a domain registered at Cloudflare you can use that for stable domains. This will be covered in a future blog.)

Configuration Flexibility

You can set your preferred provider at multiple levels:

# Use a specific provider for this session
ddev share --provider=cloudflared

# Set default provider for the current project
ddev config --share-default-provider=cloudflared

# Set global default for all projects
ddev config global --share-default-provider=cloudflared

This flexibility lets you use different providers for different projects or standardize across your entire development setup.

Tip: Your CMS or framework may have "trusted host patterns" configuration that denies access when the site is served from an unknown URL. You'll need to configure it to allow all hosts or the specific share URL. For example, in Drupal, $settings['trusted_host_patterns'] = ['.*']; or in TYPO3, 'trustedHostsPattern' => '.*.*'.

Automation for difficult CMSs using pre-share hooks and $DDEV_SHARE_URL

When you run ddev share, DDEV now exports the tunnel URL as the DDEV_SHARE_URL environment variable. This enables automation through hooks, particularly useful for integration testing, webhooks, or CI workflows that need the public URL.

WordPress Example

WordPress is always difficult because it embeds the site URL right in the database. The classic way to make a site respond at a different URL is the wp search-replace tool, so the hooks below use it to make ddev share work even though the share URL is dynamic.

# .ddev/config.yaml
hooks:
  pre-share:
    # provide DDEV_SHARE_URL in container
    - exec-host: echo "${DDEV_SHARE_URL}" >.ddev/share_url.txt
    # Save database for restore later
    - exec-host: ddev export-db --file=/tmp/tmpdump.sql.gz
    # Change the URL in the database
    - exec: wp search-replace ${DDEV_PRIMARY_URL} $(cat /mnt/ddev_config/share_url.txt) | grep Success
    # Fix the wp-config-ddev.php to use the DDEV_SHARE_URL
    - exec: cp wp-config-ddev.php wp-config-ddev.php.bak
    - exec: sed -i.bak "s|${DDEV_PRIMARY_URL}|$(cat /mnt/ddev_config/share_url.txt)|g" wp-config-ddev.php
    - exec: wp cache flush
  post-share:
    # Put back the things we changed
    - exec: cp wp-config-ddev.php.bak wp-config-ddev.php
    - exec-host: ddev import-db --file=/tmp/tmpdump.sql.gz

This approach works for any CMS that stores base URLs in its configuration or database. The pre-share hook updates URLs automatically, and you can use post-share hooks to restore them when sharing ends. This eliminates the manual configuration work that sharing previously required for many CMSs.

TYPO3 Example

TYPO3 usually puts the site URL into config/sites/*/config.yaml as base: <url>, and then it won't respond to the different URLs in a ddev share. The hooks here temporarily remove the base: element:

hooks:
  pre-share:
    # Make a backup of config/sites
    - exec: cp -r ${DDEV_APPROOT}/config/sites ${DDEV_APPROOT}/config/sites.bak
    - exec-host: echo "removing 'base' from site config for sharing to ${DDEV_SHARE_URL}"
    # Remove `base:` from the various site configs
    - exec: sed -i 's|^base:|#base:|g' ${DDEV_APPROOT}/config/sites/*/config.yaml
    - exec-host: echo "shared on ${DDEV_SHARE_URL}"
  post-share:
    # Restore the original configuration
    - exec: rm -rf ${DDEV_APPROOT}/config/sites
    - exec: mv ${DDEV_APPROOT}/config/sites.bak ${DDEV_APPROOT}/config/sites
    - exec-host: ddev mutagen sync
    - exec-host: echo "changes to config/sites reverted"

Magento 2 Example

Magento 2 offers straightforward control of the base URL, so the hooks are simple:

hooks:
  pre-share:
    # Switch magento to the share URL
    - exec-host: ddev magento setup:store-config:set --base-url="${DDEV_SHARE_URL}"
  post-share:
    # Switch back to the normal local URL
    - exec-host: ddev magento setup:store-config:set --base-url="${DDEV_PRIMARY_URL}"

Extensibility: Custom Share Providers

The new provider system is script-based, allowing you to create custom providers for internal tunneling solutions or other services. Place Bash scripts in .ddev/share-providers/ (project-level) or $HOME/.ddev/share-providers/ (global), and DDEV will recognize them as available providers.

For details on creating custom providers, see the sharing documentation.

An example of a share provider for localtunnel is provided in .ddev/share-providers/localtunnel.sh.example and you can experiment with it by just copying that to .ddev/share-providers/localtunnel.sh.

Questions
Do I need to change anything in existing projects?
No. The default provider is still ngrok, so existing projects continue working without any changes. Your ngrok authtokens and configurations are fully compatible with v1.25+.
When should I use cloudflared vs ngrok?
Use cloudflared for quick, free sharing during development and testing. Use ngrok if you need stable subdomains, custom domains, or advanced features like IP allowlisting and OAuth protection.
Can I create my own share provider?
Yes! Place bash scripts in .ddev/share-providers/ (project-level) or $HOME/.ddev/share-providers/ (global). See the sharing documentation for implementation details.
Try It Today

DDEV v1.25.0 is now available. Use the techniques above, and try out cloudflared to see if you like it.

For complete details on the new sharing system, see the sharing documentation.

Join us on Discord, follow us on Mastodon, Bluesky, or LinkedIn, and subscribe to our newsletter for updates.

This blog was drafted and reviewed by AI including Claude Code.

Drupal Association blog: Drupal's AI Roadmap for 2026

Drupal Planet -

For the past months, the AI Initiative Leadership Team has been working with our contributing partners to define what the Drupal AI initiative should focus on in 2026. That plan is now ready, and I want to share it with the community.

This roadmap builds directly on the strategy we outlined in Accelerating AI Innovation in Drupal. That post described the direction. This plan turns it into concrete priorities and execution for 2026.

The full plan is available as a PDF, but let me explain the thinking behind it.

Producing consistently high-quality content and pages is really hard. Excellent content requires a subject matter expert who actually knows the topic, a copywriter who can translate expertise into clear language, someone who understands your audience and brand, someone who knows how to structure pages with your component library, good media assets, and an SEO/AEO specialist so people actually discover what you made.

Most organizations are missing at least some of these skillsets, and even when all the people exist, coordinating them is where everything breaks down. We believe AI can fill these gaps, not by replacing these roles but by making their expertise available to every content creator on the team.

For large organizations, this means stronger brand consistency, better accessibility, and improved compliance across thousands of pages. For smaller ones, it means access to skills that were previously out of reach: professional copywriting, SEO, and brand-consistent design without needing a specialist for each.

Used carelessly, AI just makes these problems worse by producing fast, generic content that sounds like everything else on the internet. But used well, with real structure and governance behind it, AI can help organizations raise the bar on quality rather than just volume.

Drupal has always been built around the realities of serious content work: structured content, workflows, permissions, revisions, moderation, and more. These capabilities are what make quality possible at scale. They're also exactly the foundation AI needs to actually work well.

Rather than bolting on a chatbot or a generic text generator, we're embedding AI into the content and page creation process itself, guided by the structure, governance, and brand rules that already live in Drupal.

For website owners, the value is faster site building, faster content delivery, smarter user journeys, higher conversions, and consistent brand quality at scale. For digital agencies, it means delivering higher-quality websites in less time. And for IT teams, it means less risk and less overhead: automated compliance, auditable changes, and fewer ad hoc requests to fix what someone published.

We think the real opportunity goes further than just adding AI to what we already have. It's also about connecting how content gets created, how it performs, and how it gets governed into one loop, so that what you learn from your content actually shapes what you build next.

The things that have always made Drupal good at content are the same things that make AI trustworthy. That is not a coincidence, and it's why we believe Drupal is the right place to build this.

What we're building in 2026

The 2026 plan identifies eight capabilities we'll focus on. Each is described in detail in the full plan, but here is a quick overview:

  • Page generation - Describe what you need and get a usable page, built from your actual design system components
  • Context management - A central place to define brand voice, style guides, audience profiles, and governance rules that AI can use
  • Background agents - AI that works without being prompted, responding to triggers and schedules while respecting editorial workflows
  • Design system integration - AI that builds with your components and can propose new ones when needed
  • Content creation and discovery - Smarter search, AI-powered optimization, and content drafting assistance
  • Advanced governance - Batch approvals, branch-based versioning, and comprehensive audit trails for AI changes
  • Intelligent website improvements - AI that learns from performance data, proposes concrete changes, and gets smarter over time through editorial review
  • Multi-channel campaigns - Create content for websites, social, email, and automation platforms from a single campaign goal
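As a thought experiment for the "context management" capability above: nothing like this ships yet, and the initiative has not published a schema, so every key below is a hypothetical illustration of what a central, AI-readable context definition might contain:

```yaml
# HYPOTHETICAL sketch only: no such schema exists yet in Drupal or the
# AI Initiative's published plan. Keys and values are illustrative assumptions.
brand_context:
  voice:
    tone: "confident, plain-spoken"
    avoid: ["jargon", "passive voice", "superlatives"]
  style_guide:
    heading_case: sentence        # e.g. "Getting started", not "Getting Started"
    reading_level: "grade 8"
  audiences:
    - id: content_editor
      description: "Non-technical editors publishing daily updates"
    - id: it_decision_maker
      description: "Evaluates compliance, auditability, and risk"
  governance:
    ai_changes_require_review: true
    allowed_moderation_target: draft   # AI may create drafts, never publish
    audit_trail: full
```

The design idea is that the same context document would feed page generation, background agents, and multi-channel output, so brand and governance rules are defined once rather than re-prompted for every task.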

These eight capabilities are where the official AI Initiative is focusing its energy, but they're not the whole picture for AI in Drupal. There is a lot more we want to build that didn't make this initial list, and we expect to revisit the plan in six months to a year.

We also want to be clear: community contributions outside this scope are welcome and important. Work on migrations, chatbots, and other AI capabilities continues in the broader Drupal community. If you're building something that isn't in our 2026 plan, keep going.

How we're making this happen

Over the past year, we've brought together organizations willing to contribute people and funding to the AI initiative. Today, 28 organizations support the initiative, collectively pledging more than 23 full-time equivalent contributors. That is over 50 individual contributors working across time zones and disciplines.

Coordinating 50+ people across organizations takes real structure, so we've hired two dedicated teams from among our partners:

  • QED42 is focused on innovation, pushing forward on what is next.
  • 1xINTERNET is focused on productization, taking what we've built and making it stable, intuitive, and easy to install.

Both teams are creating backlogs, managing issues, and giving all our contributors clear direction. You can read more about how contributions are coordinated.

This is a new model for Drupal. We're testing whether open source can move faster when you pool resources and coordinate professionally.

Get involved

If you're a contributing partner, we're asking you to align your contributions with this plan. The prioritized backlogs are in place, so pick up something that fits and let's build.

If you're not a partner but want to contribute, jump in. The prioritized backlogs are open to everyone.

And if you want to join the initiative as an official partner, we'd absolutely welcome that.

This plan wasn't built in isolation. It's the result of collaboration across 28 sponsoring organizations who bring expertise in UX, core development, QA, marketing, and more. Thank you.

We're building something new for Drupal, in a new way, and I'm excited to see where it goes.

— Dries Buytaert


Drupal AI Initiative: From Strategy to Execution: How the Drupal AI Initiative is Scaling Delivery for 2026

Drupal Planet -

Scaling the Drupal AI Initiative

The Drupal AI Initiative officially launched in June 2025 with the release of the Drupal AI Strategy 1.0 and a shared commitment to advancing AI capabilities in an open, responsible way. What began as a coordinated effort among a small group of committed organizations has grown into a substantial, sponsor-funded collaboration across the Drupal ecosystem.

Today, 28 organizations support the initiative, collectively pledging more than 23 full-time equivalent contributors representing over 50 individual contributors working across time zones and disciplines. Together, sponsors have committed more than $1.5 million in combined cash and in-kind contributions to move Drupal AI forward.

The initiative now operates across multiple focused areas, including leadership, marketing, UX, QA, core development, innovation, and product development. Contributors are not only exploring what’s possible with AI in Drupal, but are building capabilities designed to be stable, well-governed, and ready for real-world adoption in Drupal CMS.

Eight months in, this is more than a collection of experiments. It is a coordinated, community-backed investment in shaping how AI can strengthen content creation, governance, and measurable outcomes across the Drupal platform.

Strengthening Delivery to Support Growth

As outlined in the 2026 roadmap, this year focuses on delivering eight key capabilities that will shape how AI works in Drupal CMS. Achieving that level of focus and quality requires more than enthusiasm and good ideas. It requires coordination at scale.

From the beginning, sponsors contributed both people and funding so the initiative could be properly organized and managed. With 28 organizations contributing more than 23 people across multiple workstreams, it was clear that sustained progress would depend on dedicated delivery management to align priorities, organize backlogs, support contributors, and maintain predictable execution.

To support this growth, the initiative ran a formal Request for Proposal (RFP) process to select delivery management partners to help coordinate work across both innovation and product development workstreams. This was not a shift in direction, but a continuation of our original commitment: to build AI capabilities for Drupal in a way that is structured, sustainable, and ready for real-world adoption.

Selecting Partners to Support Our Shared Goals

To identify the right delivery partners, we launched the RFP process in October 2025 at DrupalCon Vienna. The RFP was open exclusively to sponsors of the Drupal AI Initiative. From the start, our goal was to run a process that reflected the responsibility we carry as a sponsor-funded, community-driven initiative.

The timeline included a pre-proposal briefing, an open clarification period, and structured review and interview phases. Proposals were independently evaluated against clearly defined criteria tailored to both innovation and production delivery. These criteria covered governance, roadmap and backlog management, delivery approach, quality assurance, financial oversight, and demonstrated experience contributing to Drupal and AI initiatives.

Following an independent review, leadership held structured comparison sessions to discuss scoring, explore trade-offs, clarify open questions, and ensure decisions were made thoughtfully and consistently. Final discussions were held with shortlisted vendors in December, and contracts were awarded in early January.

The selected partners are engaged for an initial six-month period. At the end of that term, the RFP process will be repeated.

This process was designed not only to select capable partners but to steward sponsor contributions responsibly and align with Drupal’s values of openness, collaboration, and accountability.

Delivery Partners Now in Place

Following the structured selection process, two contributing partners were selected to support delivery across the initiative’s key workstreams.

QED42 will focus on the Innovation workstream, helping coordinate forward-looking capabilities aligned with the 2026 roadmap. QED42 has been an active contributor to Drupal AI efforts from the earliest stages and has played a role in advancing AI adoption across the Drupal ecosystem. Their contributions to initiatives such as Drupal Canvas AI, AI-powered agents, and other community-driven efforts demonstrate both technical depth and a strong commitment to open collaboration. In this role, QED42 will support structured experimentation, prioritization, and delivery alignment across innovation work.

1xINTERNET will lead the Product Development workstream, supporting the transition of innovation into stable, production-ready capabilities within Drupal CMS. As a founding sponsor and co-leader within the initiative, 1xINTERNET brings deep experience in distributed Drupal delivery and governance. Their longstanding involvement in Drupal AI and broader community leadership positions them well to guide roadmap execution, release planning, backlog coordination, and predictable productization.

We are grateful to QED42 and 1xINTERNET for their continued commitment to the initiative and for stepping into this role in service of the broader Drupal community. We also want to thank all participating organizations for the time, thought, and care they invested in the RFP process; the strong interest and high standard of submissions reflect the caliber of agencies and contributors engaged in advancing Drupal AI.

Both organizations were selected not only for their delivery expertise but for their demonstrated investment in Drupal AI and their alignment with the initiative’s goals. Their role is to support coordination, roadmap alignment, and disciplined execution across contributors, ensuring that sponsor investment and community effort translate into tangible, adoptable outcomes.

Contracts began in early January. Two development sprints have already been completed, and a third sprint is now underway, establishing a clear and predictable delivery cadence.

QED42 and 1xINTERNET will share more details about their processes and early progress in an upcoming blog post.

Ready to Deliver on the 2026 Roadmap

With the 2026 roadmap now defined and structured delivery teams in place, the Drupal AI Initiative is positioned to execute with greater clarity and focus. The eight capabilities outlined in the one-year plan provide direction. Dedicated delivery management provides the coordination needed to turn that direction into measurable progress.

Predictable sprint cycles, clearer backlog management, and improved cross-workstream alignment allow contributors to focus on building, refining, and shipping capabilities that can be adopted directly within Drupal CMS. Sponsor investment and community contribution are now supported by a delivery model designed for scale and sustainability.

This next phase is about disciplined execution. It means shipping stable, well-governed AI capabilities that site owners can enable with confidence. It means connecting innovation to production in a way that reflects Drupal’s strengths in structure, governance, and long-term maintainability.

We are grateful to the sponsors and contributors who have made this possible. As agencies and organizations continue to join the initiative, we remain committed to transparency, collaboration, and delivering meaningful value to the broader Drupal community.

We are entering a year of focused execution, and we are ready to deliver.

Moving Forward Together

The Drupal AI Initiative is built on collaboration. Sponsors contribute funding and dedicated team members. Contributors bring expertise across UX, core development, QA, marketing, innovation, and production. Leadership provides coordination and direction. Together, this shared investment makes meaningful progress possible.

We extend our thanks to the 28 sponsoring organizations and the more than 50 contributors who are helping shape the future of AI in Drupal. Their commitment reflects a belief that open source can lead in building AI capabilities that are stable, governed, and built for real-world use.

As we move into 2026, we invite continued participation. Contributing partners are encouraged to align their work with the roadmap and engage in the active workstreams. Organizations interested in joining the initiative are welcome to connect and explore how they can contribute.

We have laid the foundation. The roadmap is clear. Structured delivery is in place. With continued collaboration, we are well-positioned to deliver meaningful AI capabilities for the Drupal community and the organizations it serves.

Dries Buytaert: Drupal's AI roadmap for 2026

Drupal Planet -

For the past months, the AI Initiative Leadership Team has been working with our contributing partners to define what the Drupal AI initiative should focus on in 2026. That plan is now ready, and I want to share it with the community.

This roadmap builds directly on the strategy we outlined in Accelerating AI Innovation in Drupal. That post described the direction. This plan turns it into concrete priorities and execution for 2026.

The full plan is available as a PDF, but let me explain the thinking behind it.

Producing consistently high-quality content and pages is really hard. Excellent content requires a subject matter expert who actually knows the topic, a copywriter who can translate expertise into clear language, someone who understands your audience and brand, someone who knows how to structure pages with your component library, good media assets, and an SEO/AEO specialist so people actually discover what you made.

