Feed aggregator

Dripyard Premium Drupal Themes: Meridian, the Latest Dripyard Theme

Drupal Planet -

Meet Meridian, the newest Dripyard theme. We’re really excited about this release, as many hours went into Meridian along with updates to our other themes.

Flexibility

My favorite “feature” of Dripyard themes is flexibility. We market each theme toward a specific vertical, but in practice they are highly versatile. You can easily change the look and feel of an entire site by adjusting color schemes, border radiuses, and imagery.

Pictures are worth a thousand words, so we built our site to showcase multiple demos.

Drupal blog: Inside Drupal CMS 2.0: Q&A with Product Owner Pam Barone

Drupal Planet -

Drupal CMS 2.0 launched January 28. We asked Pam Barone—CTO of Technocrat and Product Owner of Drupal CMS—to talk about what's new and what she's most excited for people to try.

What makes Drupal CMS 2.0 different from version 1.0?

Drupal CMS 1.0 was really a proof of concept, to show that we could create a version of Drupal that bundled all of the best practices that many sites were using, and that the community would come together to make it happen in a short amount of time. We did prove the concept, but the 1.0 release did not represent any major innovations, because we were mostly just packaging functionality and tools that we already had and were familiar with. That is not to downplay the accomplishment at all, because it was a huge leap forward for the project, and it provided the foundation for the next steps.

With 2.0, we are introducing two big new concepts: Drupal Canvas and site templates. These represent another huge leap for the project, each in different ways, as we continue with the strategy to empower marketers to create exceptional digital experiences without relying on developers.

What are you personally most excited about for people to try in 2.0?

Drupal Canvas! I am so excited about Canvas and can’t wait to get it into the hands of our end users. There were times during the development of 2.0 when I was working in the Canvas editor and I thought, ‘Wow, I’m actually having fun!’ I can’t say I remember thinking that with previous Drupal page building tools.

And it’s not just about end users; one of the goals of 2.0 is to introduce Canvas within the community and showcase its potential. It’s a paradigm shift, and this level of change is always challenging, but after trying it out and getting familiar with the concepts, I think it’ll be clear that it’s worth it.

Site templates are a big part of this release. Can you explain what they are and why they matter?

Site templates are near-feature-complete starting points for Drupal sites based on specific use cases. They provide a content model, some example content, a polished look and feel, as well as the functionality you would expect based on the use case. The first site template – Byte, which is included in Drupal CMS 2.0 – is for a SaaS-based product marketing site. It includes all of the baseline functionality from 1.0, plus Canvas-powered landing pages, a blog, a newsletter signup and contact form, and a new theme with a dark style. 

During the development of 1.0, we realized that we couldn’t build something that was both generic and useful. Either we would have to build something simple that would be widely applicable, or we would be making a lot of assumptions about the site’s content model and functionality, and providing things that many users wouldn't want.

We decided that in order to really make it easy to launch sites, we had to provide many different starting points, across many use cases. By identifying the use case and being opinionated about how to solve it, site templates can start you off with 95 percent of what you need to launch.

Of course, that assumes there is a site template for your use case – which means we’re going to need a lot of them. We’re currently working with a group of Drupal agencies who have signed up for a pilot to develop new site templates for the launch of the site template Marketplace.

Let's talk about Canvas—how will this change the way marketers can build with Drupal?

The most obvious thing is just that it provides marketers with a modern, intuitive visual page builder of the kind that any competitive platform needs to have. Up until now, adopting Drupal meant getting its many benefits but compromising on the user experience, because the page building tools were clunky. With Canvas, that compromise is gone. We can provide the experience that marketers have come to expect.

In some ways it feels like we are playing catch-up, especially since it’s still early (the first release was in December) and there are some big gaps. But it also feels like a new era for Drupal, and the enthusiasm and pace of adoption so far is really encouraging. So I think we don’t really even know yet what changes will come, because when the community is presented with a new way to build cool things, the possibilities are endless.

You've mentioned making integrations easier with recipes. What does that look like in practice?

One of the benefits of using Drupal is that it can be integrated with pretty much anything, and all of the common integrations have modules to make it easier. But they always require some configuration, and it can be tricky to figure out. With recipes, we can add default configuration, and we can prompt for the necessary details, so you don’t have to go hunting around for where to add them.

Drupal CMS 1.0 included two integrations that use the recipe prompt already, for Google Analytics and the AI Assistant. They’re pretty simple in that you are just adding an ID or an API key, but they still are a big improvement over the manual setup. 

For 2.0, with site templates, we have the opportunity to include additional integrations that are relevant to the use case and wanted to tackle something a bit more complicated. Byte ships with a newsletter signup that uses a webform out of the box, and has an optional “Recommended add-on” to integrate with Mailchimp. The Mailchimp module already did most of the heavy lifting, but we worked with the maintainers to develop a recipe that configures the module (and its submodules), and once you authenticate your site with Mailchimp, will automatically create signup blocks for each of your audiences. From there, you can add them to any page via the Canvas editor.

We think that easy integrations are going to be really critical to making site templates attractive as an offering, so we are planning to continue working on that. 

In your recent presentations, you've talked about "making easy things less hard" versus "making easy things easy." Where does 2.0 fall on that spectrum?

The initial site templates are very intentionally on the “making easy things less hard” side. Not only is it a totally new concept, but they are leveraging Canvas, which is also new. So we thought that the best chance for success would be to keep it simple and try to really nail the use cases. Once we’ve all built a few, and we’ve gotten feedback from real users, we can move into the more complex sites where Drupal thrives.

Drupal CMS 2.0 is available now.

Try it now: drupal.org/drupal-cms/trial 

Download: drupal.org/download

Learn more: drupal.org/drupal-cms

Twenty-five years in. Still building.

Droptica: Automated Content Creation in Drupal: Field Widget Actions Tutorial with Real Results

Drupal Planet -

Information gathering, content writing, proofreading, SEO optimization, tag preparation – all these tasks consume a significant portion of the editorial team’s time. What if you could reduce this research time by up to 90% through automated content creation? In this article, I present a practical Drupal setup that uses AI-powered modules to generate editorial content with minimal manual input. This includes automatic information retrieval based on the title, tag generation, content creation, and detailed data fetching – all directly in your CMS, without switching between different tools. Read on or watch the episode from the Nowoczesny Drupal series.

DDEV Blog: DDEV v1.25.0: Improved Windows Support, Faster Debugging, and Modern Defaults

Drupal Planet -

We're excited to announce DDEV v1.25.0, featuring a completely revised Windows installer, XHGui as the default profiler, and updated system defaults including a move to Debian Trixie.

This release represents contributions from the entire DDEV community, with your suggestions, bug reports, code contributions, and financial support making it possible.

What's New and Updated

Default versions updated:

These updates mostly affect new projects. Existing projects typically continue to work without changes.

  • Debian Trixie replaces Debian Bookworm as the base image for ddev-webserver and ddev-ssh-agent
  • XHGui is now the default profiler (replacing prepend mode). See XHGui Feature blog post
  • PHP 8.4 becomes the default for new projects, and PHP 8.5.2 is now available with full extension support including Xdebug
  • Node.js 24 becomes the default for projects that don't specify another version
  • MariaDB 11.8 becomes the default for new projects

Major new features:

What You Need to Do After Upgrading

After upgrading to v1.25.0, follow these steps:

  1. Run ddev poweroff (DDEV will prompt you for this)
  2. Update your projects: Run ddev config --auto on each project to update to current configuration
  3. Update installed add-ons: Run ddev add-on list --installed to see your add-ons, then update them as needed
  4. Free up disk space: Run ddev delete images to remove old Docker image versions
  5. Check compatibility notes below
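
Putting those steps together, one upgrade pass might look like the following sketch (the project path ~/projects/my-site is only a placeholder; repeat the per-project commands for each of your projects):

ddev poweroff

cd ~/projects/my-site          # placeholder path
ddev config --auto             # update the project to the current configuration
ddev add-on list --installed   # review installed add-ons, then update them as needed

ddev delete images             # reclaim disk space from old image versions
ddev start
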
Compatibility Notes and Things to Check

1. Debian Trixie base image

If your project has custom Dockerfiles or uses webimage_extra_packages and ddev start shows any problems, you may have a little work to do, but most projects are unaffected.

What to do: Test your project after upgrading. See Debian Trixie release notes for known issues.

Note: DDEV already includes the tzdata-legacy package to handle removed timezones in Debian Trixie, so no action is needed for timezone-related changes.

2. Profiler changed to XHGui

If you use XHProf profiling, it now defaults to XHGui mode instead of prepend mode.

What to do: If you prefer the previous prepend mode, run:

ddev config global --xhprof-mode=prepend

3. Nginx modules now come from Debian repository

If you use custom nginx modules, the package names and module loading have changed. DDEV now uses nginx bundled with Debian Trixie instead of maintaining an extra dependency on the nginx.org repository.

What to do: Update your nginx module configuration.

Example: Adding NJS (JavaScript) support to nginx in DDEV v1.25.0+:

ddev config --webimage-extra-packages="libnginx-mod-http-js,libnginx-mod-stream,libnginx-mod-stream-js" --ddev-version-constraint='>=v1.25.0'

cat <<'EOF' > .ddev/web-build/Dockerfile.nginx
RUN sed -i '1i load_module modules/ngx_stream_module.so;\nload_module modules/ngx_http_js_module.so;\nload_module modules/ngx_stream_js_module.so;\n' /etc/nginx/nginx.conf
EOF

4. Removed commands and features

If you use these commands, you'll need to switch:

5. Updated ddev config flags

If you use these flags in scripts, update them:

  • --mutagen-enabled → --performance-mode=mutagen
  • --upload-dir → --upload-dirs
  • --http-port → --router-http-port
  • --https-port → --router-https-port
  • --mailhog-port → --mailpit-http-port
  • --mailhog-https-port → --mailpit-https-port
  • --projectname → --project-name
  • --projecttype, --apptype → --project-type
  • --sitename → --project-name
  • --image-defaults → --web-image-default
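
As an illustration of these renames, a script written against the old flags would be updated along these lines (the project name and port values here are made-up examples):

# Before (old flags)
ddev config --projectname=my-site --http-port=8080 --mailhog-port=8025

# After (current flags)
ddev config --project-name=my-site --router-http-port=8080 --mailpit-http-port=8025
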
6. Traefik configuration

If you have custom Traefik configuration, note that:

  • Only .ddev/traefik/config/<projectname>.yaml is used (other files are ignored)
  • Put global Traefik configuration in $HOME/.ddev/traefik/custom-global-config/
  • Traefik v3 syntax is now required

What to do if you have extra Traefik files:

  1. Merge all your custom configuration into .ddev/traefik/config/<projectname>.yaml and remove the #ddev-generated comment from it
  2. Track issue #8047 for potential future improvements to this workflow

Note: ddev-router no longer stops automatically when the last project stops. Use ddev poweroff to stop it manually.

7. Windows installation

If you're on traditional Windows (not WSL2): The installer may prompt you to uninstall the previous system-wide installation before installing the new per-user version.

Other Improvements

This release includes many other improvements:

  • New Wagtail, CodeIgniter, and Drupal 12 project types
  • Improved Pantheon integration with new environment variables and option to pull from existing backups or fresh database dumps
  • Much faster ddev add-on list and ddev add-on search
  • Shell autocompletion for ddev add-on get <TAB>
  • SELinux environment detection with automatic bind mount labels
  • More portable database collations for MySQL/MariaDB exports
  • SSH config support in $HOME/.ddev/homeadditions/.ssh/config.d (a small sketch follows this list)
  • DBeaver support for traditional Windows
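
As a small sketch of the SSH config support mentioned above, you could drop a host entry into that directory so it is picked up for SSH connections made from inside the web container. The directory path comes from the release note; the file name, host, and user below are hypothetical:

mkdir -p ~/.ddev/homeadditions/.ssh/config.d

cat <<'EOF' > ~/.ddev/homeadditions/.ssh/config.d/example-remote.conf
# Hypothetical host entry; adjust to your own remote
Host example-remote
  HostName example.com
  User deploy
EOF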

See the full release notes for complete details.

From the entire team, thanks for using, promoting, contributing, and supporting DDEV!

If you have questions, reach out in any of the support channels.

Follow our blog, Bluesky, LinkedIn, Mastodon, and join us on Discord. Sign up for the monthly newsletter.

This article was edited and refined with assistance from Claude Code.

DDEV Blog: Podman and Docker Rootless in DDEV

Drupal Planet -

TL;DR: DDEV supports Podman and Docker Rootless as of v1.25.0. Both are more trouble than the recommended traditional Docker providers and come with some serious trade-offs. With Podman on macOS you can't use the default ports 80 and 443. With Docker Rootless on Linux you can't bind-mount directories, so the entire project has to be Mutagen-synced. But Podman Rootless on Linux is pretty solid.

Jump to setup instructions: Linux/WSL2 · macOS · Windows

Note: This support is experimental. Report issues on the DDEV issue tracker.

Understanding Docker and Podman

Open Source Alternatives to Docker Desktop

A common misconception is that Podman is the only open-source alternative to Docker Desktop. This is not true. There are several fully open-source alternatives available on every platform:

  • Docker Engine - The original open-source Docker, free to use
  • Rancher Desktop - Open source container management with choice of dockerd or containerd
  • Lima - Linux virtual machines
  • Colima - Container runtime with minimal setup (built on Lima)
  • Podman Desktop - GUI for Podman with Docker compatibility

All of these work with DDEV. The main reason to choose Podman specifically is if your organization forbids Docker entirely or if you want rootless operation by default.

Why Choose Podman?

Podman is rootless by default, making it the simplest option for secure container environments. Traditional Docker requires root daemons, which can be a security concern in corporate environments with strict policies. (Note that DDEV is targeted at local development, where there are few risks of specialized attacks using this vector anyway.)

Podman's rootless approach runs the daemon without elevated privileges:

  • No root daemon on the system, only a rootless daemon in userspace
  • Container processes cannot access root-owned files
  • Reduced attack surface if a container is compromised

While DDEV already runs containers as unprivileged users, Podman eliminates the need for a root daemon entirely.

Why Choose Docker Rootless?

Docker Rootless provides the same security benefits as Podman Rootless while maintaining full Docker compatibility. It runs the daemon without root privileges, offering:

  • No root daemon on the system
  • Container processes cannot access root-owned files
  • Reduced attack surface if a container is compromised

Unlike Podman which is rootless by default, Docker Rootless requires special setup to enable. Choose this option if you want to stay with Docker but need rootless security.

Key aim: Linux and WSL2 users

The primary focus for this article is Linux and WSL2 (we have test coverage for Linux only for now). Most features and configurations are well-tested on these platforms.

Do You Need an Alternative to Docker?

Before diving into setup, consider whether you need an alternative to traditional Docker:

  • Traditional Docker: Why: standard, widely-used option; Key trade-offs: none; Performance: excellent; Setup: simple; Recommendation: recommended for most users.
  • Docker Rootless: Why: security requirement for a rootless daemon; Key trade-offs: must use --no-bind-mounts (everything via Mutagen), can't use the default workflow; Performance: moderate (Mutagen overhead); Setup: moderate; Recommendation: only if rootless security is required.
  • Podman Rootful: Why: organization forbids Docker; Key trade-offs: slower than Docker, different behavior; Performance: slower than Docker; Setup: moderate; Recommendation: only if Docker is not allowed.
  • Podman Rootless: Why: organization forbids Docker and rootless security is wanted; Key trade-offs: may need sysctl changes for ports <1024, slower than Docker; Performance: slower than Docker; Setup: moderate; Recommendation: only if Docker is not allowed and rootless is required.

Bottom line: Stick with traditional Docker unless organizational policy or security requirements force you to use an alternative. The alternatives work, but have significant trade-offs.

Installing Podman

Install Podman using your distribution's package manager. See the official Podman installation guide for Linux.

# Ubuntu/Debian
sudo apt-get update && sudo apt-get install podman

# Fedora
sudo dnf install --refresh podman

Note: Some distributions may have outdated Podman versions. This is the case with Ubuntu 24.04, which has Podman 4.9.3. We require Podman 5.0 or newer for the best experience, because we didn't have success with Podman 4.x in our automated tests, but you can still use Podman 4.x ignoring the warning on ddev start.
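
To see which Podman version your distribution installed before deciding how to proceed, check it from the command line:

podman --version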

You can also install Podman Desktop if you prefer a GUI.

For more information, see the Podman tutorials.

Installing Docker CLI

Podman provides a Docker-compatible API, which means you can use the Docker CLI as a frontend for Podman. This approach offers several benefits:

  • Use familiar docker commands while Podman handles the actual container operations
  • Switch between different container runtimes using Docker contexts
  • Maintain compatibility with scripts and tools that expect the docker command
  1. Set up Docker's repository

  2. Install only the CLI:

    # Ubuntu/Debian
    sudo apt-get update && sudo apt-get install docker-ce-cli

    # Fedora
    sudo dnf install --refresh docker-ce-cli

    Note: You don't need to install docker-ce (the Docker engine).

Configuring Podman Rootless

This is the recommended configuration for most users.

  1. Prepare the system by configuring subuid and subgid ranges and enabling userns options; see the Arch Linux Wiki for details:

    # Add subuid and subgid ranges if they don't exist for the current user
    grep "^$(id -un):\|^$(id -u):" /etc/subuid >/dev/null 2>&1 || sudo usermod --add-subuids 100000-165535 $(whoami)
    grep "^$(id -un):\|^$(id -u):" /etc/subgid >/dev/null 2>&1 || sudo usermod --add-subgids 100000-165535 $(whoami)

    # Propagate changes to subuid and subgid
    podman system migrate

    # Debian requires setting unprivileged_userns_clone
    if [ -f /proc/sys/kernel/unprivileged_userns_clone ]; then
      if [ "1" != "$(cat /proc/sys/kernel/unprivileged_userns_clone)" ]; then
        echo 'kernel.unprivileged_userns_clone=1' | sudo tee -a /etc/sysctl.d/60-rootless.conf
        sudo sysctl --system
      fi
    fi

    # Fedora requires setting max_user_namespaces
    if [ -f /proc/sys/user/max_user_namespaces ]; then
      if [ "0" = "$(cat /proc/sys/user/max_user_namespaces)" ]; then
        echo 'user.max_user_namespaces=28633' | sudo tee -a /etc/sysctl.d/60-rootless.conf
        sudo sysctl --system
      fi
    fi

    # Allow privileged port access if needed
    if [ -f /proc/sys/net/ipv4/ip_unprivileged_port_start ]; then
      if [ "1024" = "$(cat /proc/sys/net/ipv4/ip_unprivileged_port_start)" ]; then
        echo 'net.ipv4.ip_unprivileged_port_start=0' | sudo tee -a /etc/sysctl.d/60-rootless.conf
        sudo sysctl --system
      fi
    fi
  2. Enable the Podman socket and verify it's running (Podman socket activation documentation):

    systemctl --user enable --now podman.socket

    # You should see `/run/user/1000/podman/podman.sock` (the number may vary):
    ls $XDG_RUNTIME_DIR/podman/podman.sock

    # You can also check the socket path with:
    podman info --format '{{.Host.RemoteSocket.Path}}'
  3. Configure Docker API to use Podman (Podman rootless tutorial):

    # View existing contexts
    docker context ls

    # Create Podman rootless context
    docker context create podman-rootless \
      --description "Podman (rootless)" \
      --docker host="unix://$XDG_RUNTIME_DIR/podman/podman.sock"

    # Switch to the new context
    docker context use podman-rootless

    # Verify it works
    docker ps
  4. Proceed with DDEV installation.

Podman Rootless Performance Optimization

Podman Rootless is significantly slower than Docker. See these resources:

To improve performance, install fuse-overlayfs and configure the overlay storage driver:

Install fuse-overlayfs:

# Ubuntu/Debian
sudo apt-get update && sudo apt-get install fuse-overlayfs

# Fedora
sudo dnf install --refresh fuse-overlayfs

Configure storage:

mkdir -p ~/.config/containers

cat << 'EOF' > ~/.config/containers/storage.conf
[storage]
driver = "overlay"

[storage.options.overlay]
mount_program = "/usr/bin/fuse-overlayfs"
EOF

Warning: If you already have Podman containers, images, or volumes, you'll need to reset Podman for this change to take effect:

podman system reset

This removes all existing containers, images, and volumes (similar to docker system prune -a).
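
If you want to confirm the change took effect after the reset, you can ask Podman which storage driver it is using. This is a quick check rather than something from the original post, and the template field name is assumed from podman info output:

podman info --format '{{.Store.GraphDriverName}}'
# Expected output: overlay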

Configuring Podman Rootful

Rootless Podman is recommended. Only use rootful Podman if your setup specifically requires it.

To configure rootful Podman:

  1. Create a podman group (sudo groupadd podman) and add your user to it (sudo usermod -aG podman $USER).
  2. Configure group permissions to allow non-root users to access the socket (one possible approach is sketched after this list)
  3. Activate the socket with sudo systemctl enable --now podman.socket
  4. Create a Docker context docker context create podman-rootful --description "Podman (root)" --docker host="unix:///var/run/podman/podman.sock"
  5. Switch to the new context with docker context use podman-rootful
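
Step 2 above is left open-ended; one possible approach, shown purely as an assumption rather than something from the DDEV documentation, is a systemd drop-in that grants the podman group access to the rootful socket:

# Assumed approach: give the podman group access to the rootful socket
sudo mkdir -p /etc/systemd/system/podman.socket.d
cat <<'EOF' | sudo tee /etc/systemd/system/podman.socket.d/override.conf
[Socket]
SocketMode=0660
SocketGroup=podman
EOF
sudo systemctl daemon-reload
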
Setting Up Docker Rootless

Docker Rootless on Linux offers rootless security with full Docker compatibility.

  1. Follow the official Docker Rootless installation guide.

  2. Configure system:

    # Allow privileged port access if needed
    if [ -f /proc/sys/net/ipv4/ip_unprivileged_port_start ]; then
      if [ "1024" = "$(cat /proc/sys/net/ipv4/ip_unprivileged_port_start)" ]; then
        echo 'net.ipv4.ip_unprivileged_port_start=0' | sudo tee -a /etc/sysctl.d/60-rootless.conf
        sudo sysctl --system
      fi
    fi

    # Allow loopback connections (needed for working Xdebug)
    # See https://github.com/moby/moby/issues/47684#issuecomment-2166149845
    mkdir -p ~/.config/systemd/user/docker.service.d
    cat << 'EOF' > ~/.config/systemd/user/docker.service.d/override.conf
    [Service]
    Environment="DOCKERD_ROOTLESS_ROOTLESSKIT_DISABLE_HOST_LOOPBACK=false"
    EOF
  3. Enable the Docker socket, and verify it's running:

    systemctl --user enable --now docker.socket

    # You should see `/run/user/1000/docker.sock` (the number may vary):
    ls $XDG_RUNTIME_DIR/docker.sock
  4. Configure Docker API to use Docker rootless:

    # View existing contexts
    docker context ls

    # Create rootless context if it doesn't exist
    docker context inspect rootless >/dev/null 2>&1 || \
      docker context create rootless \
        --description "Rootless runtime socket" \
        --docker host="unix://$XDG_RUNTIME_DIR/docker.sock"

    # Switch to the context
    docker context use rootless

    # Verify it works
    docker ps
  5. Proceed with DDEV installation.

  6. Docker Rootless requires no-bind-mounts mode

    Docker Rootless has a limitation with bind mounts that affects DDEV. You must enable no-bind-mounts mode:

    ddev config global --no-bind-mounts=true

    Why this is needed:

    Docker Rootless sets ownership for bind mounts to root inside containers. This is a known issue:

    The root user inside the container maps to your host user, but many services will not run as root:

    • nginx runs as root without problems
    • MySQL/MariaDB need extra configuration
    • Apache and PostgreSQL will not run as root

    Podman Rootless fixes this with the --userns=keep-id option, which keeps user IDs the same. Docker Rootless does not have this option.

    The no-bind-mounts mode fixes this by using Mutagen for the web container.

macOS

macOS users can use Podman and Podman Desktop, but setup has its own challenges. Docker Rootless is not available on macOS.

Do You Need an Alternative to Docker?

  • Traditional Docker: Why: standard, widely-used option; Key trade-offs: none; Performance: excellent; Setup: simple; Recommendation: recommended for most users.
  • Podman: Why: avoid Docker entirely (organizational policy); Key trade-offs: cannot use ports 80/443 (must use 8080/8443 instead), different behavior; Performance: slower than Docker; Setup: moderate; Recommendation: only if Docker is not allowed.

Bottom line: Use traditional Docker (OrbStack, Docker Desktop, Lima, Colima, or Rancher Desktop) unless your organization forbids it. The inability to use standard ports 80/443 with Podman creates a significantly different development experience.

Installing Podman

Install Podman using Homebrew:

brew install podman

Or install Podman Desktop if you prefer a GUI.

For more information, see the official Podman installation guide for macOS and Podman tutorials.

Installing Docker CLI

Podman provides a Docker-compatible API, which means you can use the Docker CLI as a frontend for Podman. This approach offers several benefits:

  • Use familiar docker commands while Podman handles the actual container operations
  • Switch between different container runtimes using Docker contexts
  • Maintain compatibility with scripts and tools that expect the docker command
brew install docker

Configuring Podman
  1. Handle privileged ports (<1024):

    Important: Podman on macOS cannot bind to privileged ports (80/443). You must configure DDEV to use unprivileged ports:

    ddev config global --router-http-port=8080 \
      --router-https-port=8443

    This means your DDEV projects will be accessible at https://yourproject.ddev.site:8443 instead of the standard https://yourproject.ddev.site.

    Note: switching to rootful mode with podman machine set --rootful --user-mode-networking=false doesn't help with privileged ports because the --user-mode-networking=false flag is not supported on macOS (it's only available for WSL).

  2. Initialize and start the Podman machine:

    # check `podman machine init -h` for more options
    podman machine init --memory 8192
    podman machine start

    Check for the Podman socket path using podman machine inspect:

    ~ % podman machine inspect
    ...
    "ConnectionInfo": {
      "PodmanSocket": {
        "Path": "/var/folders/z5/lhpyjf2n7xj2djl0bw_7kb3m0000gn/T/podman/podman-machine-default-api.sock"
      },
      "PodmanPipe": null
    },
    ...
  3. Configure Docker CLI to use Podman. Choose one of two approaches:

    Option 1: Create a Docker context (recommended, more flexible):

    # Create Podman context (path to socket may vary)
    # Use the socket path from `podman machine inspect` output
    docker context create podman-rootless \
      --description "Podman (rootless)" \
      --docker host="unix:///var/folders/z5/lhpyjf2n7xj2djl0bw_7kb3m0000gn/T/podman/podman-machine-default-api.sock"

    # Switch to the new context
    docker context use podman-rootless

    # Verify it works
    docker ps

    This approach uses Docker contexts to switch between different container runtimes without modifying system sockets. This is more flexible if you want to use multiple Docker providers.

    Option 2: Use the default Docker socket (simpler, but less flexible):

    # Install podman-mac-helper
    # Use the command from `podman machine start` output
    sudo /opt/homebrew/Cellar/podman/5.7.1/bin/podman-mac-helper install
    podman machine stop
    podman machine start

    # Verify it works
    docker ps
  4. Proceed with DDEV installation.

Windows

Windows users can use Podman Desktop, but setup has its own challenges. Docker Rootless is not available on traditional Windows (it works in WSL2, see the Linux and WSL2 section).

Do You Need an Alternative to Docker?

  • Traditional Docker: Why: standard, widely-used option; Key trade-offs: none; Performance: excellent; Setup: simple; Recommendation: recommended for most users.
  • Podman: Why: avoid Docker entirely (organizational policy); Key trade-offs: different behavior, less mature on Windows; Performance: slower than Docker; Setup: moderate; Recommendation: only if Docker is not allowed.

Bottom line: Use traditional Docker (Docker Desktop or alternatives) unless your organization forbids it. Podman on Windows works but is less mature than on Linux.

Installing Podman

Install Podman Desktop, which includes Podman.

Alternatively, install Podman directly following the official Podman installation guide for Windows.

For more information, see the Podman tutorials.

The setup and configuration follow similar patterns to the Linux/WSL2 setup, but with Podman Desktop managing the VM for you. Follow the Linux and WSL2 instructions.

Running Multiple Container Runtimes

You can run Docker and Podman sockets simultaneously and switch between them using Docker contexts.

For example, here's a system with four active Docker contexts:

$ docker context ls
NAME                DESCRIPTION                                DOCKER ENDPOINT
default             Current DOCKER_HOST based configuration    unix:///var/run/docker.sock
podman              Podman (rootful)                           unix:///var/run/podman/podman.sock
podman-rootless *   Podman (rootless)                          unix:///run/user/1000/podman/podman.sock
rootless            Rootless runtime socket                    unix:///run/user/1000/docker.sock

Switch between them with:

docker context use "<context-name>"

Note: Running both Docker and Podman in rootful mode at the same time may cause network conflicts. See Podman and Docker network problem on Fedora 41.

Switching Runtimes with DDEV

DDEV automatically detects your active container runtime. To switch:

  1. Stop DDEV projects:

    ddev poweroff
  2. Switch Docker context or change the DOCKER_HOST environment variable

  3. Start your project:

    ddev start
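
For example, moving from the default Docker context to the rootless Podman context created earlier might look like this; the DOCKER_HOST line shows the environment-variable alternative and assumes the rootless Podman socket path from the setup section:

ddev poweroff

# Option A: switch the Docker context
docker context use podman-rootless

# Option B: point DOCKER_HOST at the desired socket instead
# export DOCKER_HOST="unix://$XDG_RUNTIME_DIR/podman/podman.sock"

ddev start
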
Which Runtime Should You Choose?

Runtime Comparison

  • Standard Docker: Platform support: all; Rootless daemon: no; Automated testing in DDEV: yes; Mutagen: yes; Bind mounts: yes; Performance: excellent; Privileged ports (<1024): work by default; Setup complexity: simple; Maturity: most mature; Recommended for: most users.
  • Docker Rootless: Platform support: Linux and WSL2; Rootless daemon: yes; Automated testing in DDEV: yes; Mutagen: yes; Bind mounts: no, requires no-bind-mounts; Performance: moderate (because of no-bind-mounts); Privileged ports (<1024): require sysctl config; Setup complexity: moderate; Maturity: experimental; Recommended for: Docker users needing rootless.
  • Podman Rootful: Platform support: all; Rootless daemon: no; Automated testing in DDEV: no; Mutagen: yes; Bind mounts: yes; Performance: slow compared to Docker; Privileged ports (<1024): work by default; Setup complexity: moderate; Maturity: experimental; Recommended for: organizations that forbid Docker.
  • Podman Rootless: Platform support: all; Rootless daemon: yes; Automated testing in DDEV: yes; Mutagen: yes; Bind mounts: yes (with --userns=keep-id); Performance: slow compared to Docker; Privileged ports (<1024): require sysctl config or may not work; Setup complexity: moderate; Maturity: experimental; Recommended for: organizations that forbid Docker.

Recommendations

Use one of the many standard Docker providers if:

  • You're comfortable with the most widely used container runtime
  • You don't have rootless security requirements

This is the recommended option for the vast majority of users.

Use Podman Rootless if:

  • Your organization forbids Docker
  • You want rootless security by default

Use Podman Rootful if:

  • Your organization forbids Docker
  • You want traditional container permissions without user namespace mapping overhead

Use Docker Rootless if:

  • You need full Docker compatibility
  • You want rootless security without changing runtimes
The Journey to Podman Support

Supporting Podman and Docker Rootless required major changes to DDEV's Docker integration:

  • Switched to official Docker client library (#5787): DDEV previously used an unofficial library to communicate with the Docker API. We migrated to Docker's official client library for better compatibility and long-term support.
  • Replaced direct CLI calls with proper API usage (#7189): DDEV used to call docker context inspect directly, which doesn't work with Podman. We switched to using the docker/cli library to handle context operations properly.
  • Modernized SSH authentication (#7511): The ddev auth ssh command used to call docker run directly. We rewrote it to use the Docker API, making it compatible with alternative runtimes.
  • Optimized API call performance (#7587): DDEV's Docker API logic was inefficient, making redundant calls without caching. We restructured the code to cache data and reduce unnecessary API requests.
  • Removed legacy docker-compose features (#7642): Podman refuses to work with deprecated links and external_links directives in docker-compose files. We removed these legacy features and modernized DDEV's compose file generation.
  • Added Podman and Docker Rootless support (#7702): DDEV now detects and supports Podman (rootful and rootless) and Docker Rootless. We added handling for Podman-specific limitations and enabled rootless environments to work without root privileges.
  • Added support for SELinux environments (#7939): Podman has SELinux enabled by default on Fedora and some other distributions. We added support for SELinux by configuring volume mounts with the appropriate labels.

These changes enabled Podman and Docker Rootless support. These features were developed together because Podman's primary use case is rootless operation. Once DDEV could handle rootless runtimes, supporting both became natural. They share the same security model and similar technical constraints.

Supporting DDEV Development

This Podman and Docker Rootless support was made possible by community financial support. The changes required hundreds of hours of development, code reviews, and testing.

DDEV relies on support from individuals and organizations who use it. With Podman rootless support, DDEV now works in corporate environments where Docker Desktop is not allowed. If you or your organization uses DDEV, please consider sponsoring the project to help keep DDEV free and open source.

Conclusion

DDEV now supports Podman and Docker Rootless as experimental features. This opens DDEV to corporate environments where traditional Docker is not allowed.

DDEV automatically detects your runtime and handles the complexity. Whether you choose Podman for rootless security, Docker Rootless for compatibility, or standard Docker, setup is straightforward.

This article was edited and refined with assistance from Claude Code.

Drupal Association blog: Drupal Powers Global Action for World Cancer Day

Drupal Planet -

Every year on 4th February, the world unites to mark World Cancer Day (WCD), a campaign that raises awareness, amplifies voices, and inspires collective action against cancer. Behind the scenes, the World Cancer Day website, built with Drupal, serves millions of people, providing a central platform for global engagement.

Project overview

Campaign and its impact

The World Cancer Day 2025-2027 campaign embraces the theme “United by Unique”, emphasizing people-centered care. This approach prioritizes the needs, values, and active participation of individuals, families, and communities in cancer care. By putting people at the heart of the conversation, the campaign promotes a shift toward more inclusive, responsive, and compassionate health systems.

The 2025 campaign achieved remarkable global engagement:

  • +900 activities in 120 countries
  • +600 stories shared in text, video, and art
  • +1,000 participants in the Upside Down Challenge
  • +30,000 press mentions in 162 countries
  • 6 billion digital impressions and 9 million social media engagements
  • 530,000 website visitors and +300,000 campaign video views

These numbers highlight both the scale of the campaign and the critical need for a platform that can reliably support millions of users simultaneously.

Explore more about the campaign and join the global action at World Cancer Day.

Supporting global action with Drupal

Supporting a high-profile global campaign requires flexibility, scalability, and robustness, capabilities that are fundamental to Drupal’s architecture.

Key features
  • Multilingual CMS: Centralized content management across multiple languages ensures the website can reach a large global audience.
  • Scalable hosting: Drupal handles traffic surges exceeding 194 GB per hour, delivering consistent performance during peak activity.
  • Complex interactive tools: Custom features such as poster creation, event planning tools, and a global activities map make it easy for users to participate and share initiatives.
  • Dynamic content delivery: Scrollytelling, multimedia content, and personal stories from those affected by cancer create an engaging, meaningful experience.
  • Accessibility: Drupal’s accessibility capabilities reinforce inclusivity by supporting diverse audiences at scale.

"I've been pleased with my experience with Drupal. While the earlier versions were sometimes technically complex, it always felt like a solid, robust platform to build upon. I have been genuinely pleased that we chose to stick with it over the long term and to witness its evolution into a more mature and flexible platform." 

Charles Andrew Revkin, Senior Digital Strategy Manager, Union for International Cancer Control (UICC)

Scaling impact with Drupal AI

To manage the vast volume of user-submitted stories while maintaining quality, relevance and inclusivity, WCD integrated Drupal AI. This automation helps with content moderation and reduces manual workload, allowing more people to share their experiences and supporting the campaign’s people-centered mission as it scales.

Why Open Source matters for global health initiatives

For non-profit organizations in the healthcare sector, efficiency, transparency, and long-term sustainability are essential, especially when every investment must directly support the mission. As an open-source platform, Drupal eliminates licensing costs and avoids vendor lock-in, allowing resources to be focused on participation and impact rather than software fees. Supported by a global contributor community, Drupal benefits from continuous improvements in security, accessibility, and performance, making it a trusted foundation for sensitive, high-impact initiatives like World Cancer Day.

Technology that serves people at scale

The global fight against cancer requires collective action, and Drupal plays an important role in enabling that engagement. By managing large-scale data, complex interactive features, and high-traffic performance, the platform ensures that the campaign can reach millions of people, foster participation, and support socially impactful initiatives year after year.

Read the full case study on 1xINTERNET website


Talking Drupal: Talking Drupal #538 - Agentic Development Workflows

Drupal Planet -

Today we are talking about Development Workflows, Agentic Agents, and how they work together with guests Andy Giles & Matt Glaman. We'll also cover Drupal Canvas CLI as our module of the week.

For show notes visit: https://www.talkingDrupal.com/538

Topics
  • Understanding Agentic Development Workflows
  • Understanding UID Generation in AI Agents
  • Exploring Generative AI and Traditional Programming
  • Building Canvas Pages with AI Agents
  • Using Writing Tools and APIs for Automation
  • Introduction to MCP Server and Its Tools
  • Agent to Agent Orchestration and External Tools
  • Command Line Tools for Agent Coding
  • Security and Privacy Concerns with AI Tools
  • The Future of AI Tools and Their Sustainability
  • Benefits of AI for Site Builders
Resources

Guests

Matt Glaman - mglaman.dev mglaman

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan John Picozzi - epam.com johnpicozzi Andy Giles - dripyard.com andyg5000

MOTW Correspondent

Martin Anderson-Clutz - mandclu.com mandclu

  • Brief description:
    • Have you ever wanted to sync components from a site using Drupal Canvas out to another project like a headless front end, or conversely, from an outside repo into Drupal Canvas? There's an NPM library for that
  • Module name/project name:
  • Brief history
    • How old: created in July 2025 (as xb-cli originally) by Bálint Kléri (balintbrews) of Acquia
    • Versions available: 0.6.2, and really only useful with Drupal Canvas, which works with Drupal core 11.2
  • Maintainership
    • Actively maintained
    • Number of open issues: 8 open issues, 2 of which are bugs, but one of which was marked fixed in the past week
  • Usage stats:
    • 128 weekly downloads according to npmjs.com
  • Module features and usage
    • With the Drupal Canvas CLI installed, you'll have a command line tool that allows you to download (export) components from Canvas into your local filesystem. There are options to download just the components, just the global css, or everything, and more. If no flags are provided, the tool will interactively prompt you for which options you want to use.
    • There is also an upload command with a similar set of options. It's worth noting that the upload will also automatically run the build and validate commands, ensuring that the uploaded components will work smoothly with Drupal Canvas
    • I thought this would be relevant to our topic today because with this tool you can create a React component with the aid of the AI integration available for Canvas and then sync that, either to a headless front end built in something like Next.js or Astro or a tool like Storybook; or you could use an AI-enhanced tool like Cursor IDE to build a component locally and then sync that into a Drupal site using Canvas
    • There is a blog post Balint published that includes a demo, if you want to see this tool in action

The Drop Times: How Drupal Starts Now

Drupal Planet -

Five years after the idea first surfaced, Drupal CMS 2.0 has arrived, with a clear focus on the early experience. Released on 28 January 2026, the update introduces real-time page editing via Drupal Canvas, a templating system with sector-specific defaults, and optional AI guidance. It’s not a reinvention of Drupal. It’s a response to what new users most often struggle with: getting started quickly without sacrificing long-term flexibility.

The release is built on Drupal Core 11.3, bringing the platform’s biggest performance gains in over a decade—up to 33% faster request handling. Canvas replaces the standard editing workflow with a drag-and-drop interface, powered by the new Mercury component system. The first template, Byte, is preconfigured for SaaS marketing sites and installs in under three minutes. Optional AI tools support page scaffolding, alt text generation, and guided content modelling, with integration available for amazee.ai, OpenAI, and Anthropic.

On launch day, Dries Buytaert called the release “power without complexity,” noting that it changes the starting point, not the system. Contributed module compatibility is preserved, and features from Drupal CMS 1, like automatic updates and the Gin admin UI, remain intact. For teams evaluating Drupal in 2026, CMS 2.0 sets a clearer baseline: real output, faster, with less overhead.

Discover Drupal, Event, Organization News, Free Software

We acknowledge that there are more stories to share. However, due to selection constraints, we must pause further exploration for now. For timely updates, follow us on LinkedIn, Twitter, Bluesky, and Facebook. You can also join us on Drupal Slack at #thedroptimes.

Thank you.

Kazima Abbas
Sub-editor
The Drop Times

DrupalCon News & Updates: Your First DrupalCon: Chicago 2026 Sessions You Can’t Miss

Drupal Planet -

Heading to your very first DrupalCon? Lucky you. There’s nothing quite like that first DrupalCon — the energy, the discoveries, the “wow, I’m really here” feeling. Chicago, “The Windy City,” warmly welcomes you to see which way the wind is blowing in Drupal: latest trends, community initiatives, practical know-how, and hands-on tips.

At DrupalCon Chicago 2026, you’ll connect with fellow Drupal users and builders, swap stories, and finally match faces to names you may have seen online. You’ll meet the contributors behind the features that shape Drupal today, and they’re easy to talk to. Just come to their session or catch them nearby. The Drupal community is made up of people who enjoy sharing what they know and helping move Drupal forward together.

With so much happening at DrupalCon 2026 and an impressive choice of sessions, it can be hard to know where to start, especially on your first visit. The sessions below are hand-picked for first-time attendees and offer a balanced mix of context, inspiration, and practical takeaways.

Top sessions for first-time attendees at DrupalCon Chicago 2026
“Drupal CMS, site templates and beyond” — by Pamela Barone and Cristina Chumillas

Want to discover the real gems of Drupal that everyone’s buzzing about? This helps you stay oriented and gives you shared reference points for meaningful conversations, collaboration, and deeper exploration throughout DrupalCon and after it.

First, meet Drupal CMS — a special, ready-to-go version of Drupal built with usability in mind. It’s designed so non‑tech users can jump right in and enjoy smooth, out‑of‑the‑box experiences.

Drupal CMS 1.0 wowed the community with its pre‑built feature sets called Recipes, smart AI tools, easier admin navigation, and friendlier content editing. Now, it’s time for Drupal CMS 2.0 to shine, and you have a chance to hear an insightful session about it.

Guided by top contributors to the project, Pamela Barone (pameeela) and Cristina Chumillas (ckrina), you’ll explore the standout features that make Drupal CMS 2.0 special. Among them certainly is Drupal Canvas — a new-generation page builder. Another feature that will definitely be discussed is the newly implemented Site Templates, which enable you to kickstart pre-configured sites for specific industries or use cases.

Besides the ready‑to‑use features in Drupal CMS 2.0, you’ll hear about the areas of ongoing work, plans, and ways to contribute. And don’t wait too long to grab your seat — Pamela and Cristina’s sessions are known to pack the room, with people standing just to catch the insights.

“Drupal AI Initiative: A Year in Review with Panel Q&A” — by James Abrahams, Christoph Breidert, Dominique De Cooman, and Paul Johnson

AI adoption is yet another topic that will help first-time attendees feel in the loop with the Drupal community. Recent DrupalCons have featured jaw‑dropping demos: AI building page layouts from a prompt, migrating content between sites, generating webforms from a sketch, and much more. In Drupal CMS, AI is baked into the concept itself, with agents, assistants, and automators designed to take on the heavy lifting. 

The Drupal AI Initiative was launched in 2025 to organize, coordinate, and strategically guide AI adoption. It is in full swing, so it’s the perfect moment to attend this compelling session by its maintainers. Discover what AI capabilities have become available thanks to the Initiative, what to look forward to, and how to get involved.

And if you’re curious, stick around for the open “Ask Me Anything” segment in this session. James Abrahams (yautja_cetanu), Christoph Breidert (breidert), Dominique De Cooman (domidc), and Paul Johnson (pdjohnson) will be ready to answer your questions and share insights.

“Next steps for Drupal Canvas” — by Lauri Timmanee

We’ve already touched on Drupal Canvas, but it deserves a moment of its own. Canvas is on track to become the primary way page layouts are built across the entire Drupal ecosystem, so knowing how it works is a must.

Built with React, Drupal Canvas brings a visual, component-driven approach to Drupal. Among its features are:

  • intuitive drag-and-drop visual editing
  • reusable components that keep pages consistent and maintainable
  • predefined content templates
  • integration with top Drupal modules like CKEditor, Metatag, and Webform
  • a developer-friendly architecture and an in-browser code editor for creating components
  • AI integration for generating pages from prompts, as demoed at Driesnote Vienna 2025

Don’t miss this session by Lauri Timmanee (lauriii), one of the key maintainers and product leads behind Drupal Canvas. This is your chance to discover how it works, explore real demos, and see the exciting features to look forward to. 

Kicking off your contribution journey

Open‑source thrives because people show up — and in Drupal, every action counts. A small fix, a quick test, a bit of feedback — these tiny sparks can light up big changes. That’s the magic of contributing: each step adds to something larger, something shared.

In Drupal, those efforts don’t go unnoticed. Credits on drupal.org are one way your work is recognized — but the real reward is the respect and connection you’ll earn from a community that values every contribution.

DrupalCon is the perfect place to start contributing. Think of it as a launchpad — a welcoming space where you can learn, experiment, and make your first mark on Drupal.

Walk into a DrupalCon contribution workshop, and you’ll feel it right away — the buzz of laptops opening, sticky notes being scribbled, and people leaning in to help each other. It’s not just a session, it’s a hive of energy where newcomers and veterans sit side by side to move Drupal forward.

Contribution workshops
  • First‑Time Contribution Workshop — perfect if you’ve never contributed before. You’ll learn how to navigate drupal.org, find beginner‑friendly tasks, and collaborate with the community. Multiple sessions are available, so you’ll have plenty of chances to join.
  • Mentored Contribution Session — open to everyone, no matter if you’re brand new or already have some experience. You’ll work on real issues with guidance from seasoned contributors and maintainers, ask questions, and gain hands‑on practice while making a meaningful impact.
“How to Land an EPIC Contribution in Drupal (Without Losing Your Mind)” — by Mike Herchel and Matt Glaman

When it comes to discovering contribution opportunities, you might also find it very useful to attend this lively session by two seasoned and famous Drupal contributors. Mike Herchel (mherchel) and Matt Glaman (mglaman) will pull back the curtain on how contributions happen that might eventually become epic in Drupal.

So how does a future contribution start? Maybe you spot a bug, or you feel the urge to improve how something works. From that moment of drive, the journey begins — identifying the issue, pitching your idea to the right people, assembling a team, doing the work, navigating communication hurdles, and finally pushing your contribution across the finish line.

You’ll hear real stories of stubborn bug fixes, ambitious features, and the persistence it takes to get changes into Drupal core, Drupal CMS, or major contributed projects. Expect practical advice, case studies that show the highs and lows, plenty of humor, and the kind of motivation that makes you want to do something epic yourself.

Driesnote by Dries Buytaert

The central keynote of DrupalCon is a can’t‑miss session for everyone. For first‑time attendees, it’s an especially exciting chance to see Drupal’s founder in person and hear his insights.

You’ll get a firsthand look at the features, initiatives, and updates preparing to define Drupal’s next chapter. It’s a moment to see the bigger picture, feel the energy of the community, and glimpse what lies ahead together.

Each year, the Driesnote comes with its own creative theme — from space missions to Drupal villages — always kept secret until the big reveal. Whatever the theme this year, the Driesnote is guaranteed to be a breathtaking performance, delivered by the one speaker who knows how to keep the audience engaged, fascinated, and full of anticipation.

The Driesnote is where DrupalCon truly begins — vision, energy, and surprises from Drupal’s founder. Grab your seat in the big auditorium, right where the whole community will be gathering.

“Drupal in a Day” — by Acquia

The skill‑sharing spirit of the Drupal community shines brightest when welcoming new talent. Seasoned gurus are happy to help newcomers learn Drupal.

Can you really learn Drupal in a single day? You’ll keep uncovering its powerful site‑building capabilities as your journey continues, but one day can give you a real taste of Drupal — enough to explore its fundamentals and see what makes it one of the world’s leading open‑source CMS platforms.

Drupal in a Day at DrupalCon Chicago 2026 is a free, hands‑on workshop designed for beginners. This includes university or college students, or just anyone who is curious about Drupal and wants to see how it all comes together. No prior experience needed — just bring your laptop and a bit of curiosity.

Guided by experienced Drupalers, you’ll roll up your sleeves to build a site from scratch, pick up practical skills, and leave with a certificate, new connections, and the confidence to dive deeper. Who knows — this could be the first step toward a future Drupal career, where you’ll be the one teaching others or contributing to the next big Drupal feature.

Spots are limited, so register early if you want to join in.

Final thoughts

These sessions can help you get your bearings, spark new ideas, and show how the pieces of Drupal fit together today, and where they’re headed next. In addition to the sessions on this list, there is a great variety of others you might enjoy depending on your background. Pick what catches your interest, follow your curiosity, and leave room for a few surprises along the way.

Besides the sessions, it’s a great idea to visit the Expo Hall for informal chats with solution partners and companies using Drupal. Many first‑time attendees find the networking between sessions just as valuable as the sessions themselves.

With its welcoming spirit, DrupalCon has a way of turning first sessions and first conversations into lasting connections. Make your first visit exciting, and let your journey with Drupal be truly epic!

Spinning Code: Consultant vs Client

Drupal Planet -

Lots of people working in technology have a choice between working for clients or working for consultants. We work on one side of the relationship thinking how nice it would be to have the advantages of being on the other; the proverbial grass always seems greener.

I spent a little more than ten years as a client before I became a consultant. I spent just a bit longer as a consultant before becoming a client again. There are things I’ve learned in each role that help me do the other better. To ensure a mutually beneficial engagement it is helpful to understand the perspective of the other team.

Why is understanding both sides useful?

The goals of a consultant and a client organization are misaligned. That doesn’t mean you can’t do great things together, but if you don’t understand the goals of your partner you are likely to step on each other’s toes.

The goals of clients

Clients look to consultants for one of two primary reasons:

  • Add to team capacity
  • Solve a problem

We either need to complete a project that our team does not have the time to tackle, or we need expertise it does not make sense for us to keep on staff. Sometimes we’re looking to reduce costs by having a group of part-time people fill the roles of a smaller number of full-time team members.

I like to have my team’s staffing level sufficient to complete all day-to-day tasks and to bring in outside help to take on special projects. Other people like to have consultants around consistently to provide the outside perspective and the diverse expertise that consultants bring. Both of those strategies are foundationally aimed at those two needs.

A smart client wants to spend the money needed to be successful, but not more. We want the most value for our money we can possibly get.

The goals of consultants

Consultants have a different pair of primary goals:

  • Record billable hours and/or ensure profit margin
  • Have a happy client who refers more business

Some consultants will protest that their goal is to solve clients’ problems through good work. My perspective is that those are ways to achieve those two goals. Some clients are happy when you do good work (but not all). All clients are paying to have a problem solved (see above).

Profit motive isn’t evil or wrong – even when supporting nonprofits and other socially beneficial institutions (having spent much of my career in nonprofits, we think about this a lot). Consultants need to make money to stay afloat. A consulting firm has people to pay, overhead to manage, and founders/investors to reward. Independent consultants need to eat, pay their mortgage, and so on. The larger the firm, the more pressure there is for larger profit margins.

To get new clients, consultants need “referable” clients. That means having clients who are so happy with the work done they will serve as a reference. I wish that always meant creating the best solution possible. What it means in practice is building the solution that makes the client happy. As a consultant I gave clients my best advice, and when they disagreed and insisted on a different solution, we built that instead. If they ran out of money along the way we still tried to keep them happy, even if we had to duct-tape the last bits.

In the end, consultants build what clients pay for, and that’s not always the best solution.

Finding the balance

With consultants trying to make the most money they can, and clients trying to get a successful solution for the least money, there is an inherent tension in the system. Still, there is a balance to be had, where everyone wins, and great things happen. The trick is to make it a healthy tension that forces everyone to be better. Finding that balance doesn’t require that everyone involved has spent time on the other side of the relationship, but it certainly helps.

When you understand the needs and goals of the other side of the relationship you can adjust your approach to make sure everyone is aligned to win.

Lessons to take from being a client

One of the things I learned along the way was that a lot of the advice given to new consultants contradicts what I knew from being a client. Spending time as a client gives you insights into how to best serve customers, insights that many pure consultants don’t have.

Be the consultant you want to hire

When you work at an organization that hires consultants you see different approaches taken by different firms. You learn your preferences about what you like and don’t like in a consulting partner. While no one style is the best fit for everyone, it’s unlikely that you are so unique that there aren’t lots of other people who like that same style.

Default to the Golden Rule: treat clients the way you wanted to be treated by consultants.

You can’t always do that 100% of the way – sure, as a client I want everything free, but that’s not reasonable. But consistently approaching clients the way I would have wanted to be treated went a long way toward smoothing over challenges.

Start there, and over time you’ll learn to adapt your approach when specific clients prefer a different style.

Be honest about your limitations

Do. Not. Lie. To. Me.

Do not guess without admitting it. If I wanted made-up answers, I’d ask an AI.

Consultants always want to appear to be the expert in the room, and so they feel they have to answer every question. Too often that leads to consultants making up answers to show how smart they are; clients will catch you eventually.

One of the best ways to build trust with a client is to admit when you don’t know the answer to a question, and then come back later with the answer. Do not say “I don’t know” and leave it there; go for some form of “I will need to go look that up/ask around/figure that out.”

Great consultants find solutions; they don’t always have the answer right away. We can wait for you to do some research when we stump you. That is a lot easier to explain than having to walk back a wrong answer you already gave us.

Focus on what the client needs to succeed

Clients should always have an outcome in mind that supports their work. Consultants are focused on the solution they are building. When everything is going well, that solution is what the client needs to support their work. If those stop being the same thing you have a very big problem.

Both clients and consultants can easily forget to consistently re-check that alignment. As a client and as a consultant I’ve been part of projects where the delivered solution didn’t solve the actual problem – even when it fulfilled the spec and SOW. These moments frequently lead to energetic discussions that often become loud. No one wins when that happens.

Regularly check with the client, and with yourself, to see if the solution will solve the client’s problem. When you see misalignment raise your hand early and often.

Lessons to take from being a consultant

Of course consultants know and learn stuff that isn’t obvious to any given client. Consultants bring wider experiences, different perspectives, and a different energy to a project. That is part of what makes them valuable. Clients should hire a consultant they trust, and listen to their consultant. Think hard before deciding you know better.

Always learn new things, even if they aren’t important today

As clients we tend to learn deeply about the tools we use and our own work. Consultants work on a lot of projects with a lot of clients. Along the way they use a lot of tools and see a lot of ideas. That creates a culture of, and need for, constant learning. Often they are learning about things that don’t seem useful right away.

The higher your role as a consultant, the more you are expected to be at least conversant with technology you haven’t used yet. You also need to be conversant with the work of your clients. That’s a lot of learning.

I had good learning habits going into consulting. They served me extremely well as a consultant, and are serving me well again as a client.

The broad knowledge of a consultant is extremely useful and everyone benefits from more people knowing more stuff. Having that breadth of knowledge also helps when you do run into the places where you don’t know something. It gives you the confidence that you can go learn the next thing you need to know quickly (see Be honest about your limitations above).

Know how to work to a deadline

Consultants are always working within time and budget constraints – usually tight ones. That forces them to learn to be efficient. Sometimes that means they cut corners (see the next section); usually it just means they move fast. Good consultants have a high degree of dexterity with their tools, they learn to line up their work to knock out tasks, and they learn what’s needed and what’s just nice to have.

New consultants often feel like they are sprinting all the time, but experienced consultants learn to balance the sprints with jogging. The pace is nearly always high (at least if sales are going well), but it still ebbs and flows. Consultants learn to hit their deadlines, but rarely are ready to deliver early.

As a consultant, if a deadline was far in the future it gave me time to do careful work, balance other clients, do research, or just take time off. Far-off deadlines gave me time to recover from sprints and make sure I had the energy for high-intensity moments. That intensity is important to driving client success – but hitting the deadline is more important.

Hitting deadlines is also important for clients. Consultants need you to hit your deadlines so they can balance their workload to hit theirs. They may also have penalties embedded in the contract (see Read the Contract below) that could cost you time or money over the course of your project.

Perfect is the enemy of the good

Okay, this isn’t something just consultants know, but it is something consultants often learn to deal with the hard way.

Consultants need a solution that meets the requirements, fits in the budget, and pleases the client. They are not there to create a solution that is perfect, or even elegant. In any project there is a balance to be had between carefully polished, and just barely good enough to be successful. Consultants learn to thread that needle. As long as the project is successful that’s a good thing.

I have seen developers spend hours, days, even months trying to build to the perfect level of abstraction, with the perfect naming conventions, driving for the perfect code, only to have the project fail because it was overdue, over budget, and outmoded by someone who worked twice as fast.

Yes, we all want good solutions to our technical problems. But no solution is going to be perfect. You should aim for perfection and know you are going to miss. When you learn to accept that, it’ll be easier to move forward and be successful.

Things everyone should know regardless of role

For all that each side brings something to the table, there are habits everyone should have regardless of role. There are lessons I learned, or was taught, in both roles that are super important.

Read, and understand, the contract

Everyone on a project benefits from having working knowledge of the contract. In the end, when push comes to shove, all that matters is the words on the paper. You can usually avoid the pushing and shoving by understanding what everyone agreed to up front.

The biggest issues I’ve seen on consulting projects were when one side, or the other, didn’t pay attention to the agreement.

Sometimes this happens because everyone is working in good faith, and no one remembers to amend the agreement when needs change. In those cases you can often recover by continuing to work with each other in good faith.

Sometimes this happens when someone signed a contract they didn’t read and understand. I once had a client yell at me because I added a paragraph to the contract outlining the resources they were responsible for providing and he didn’t read it before we asked him for those resources (these clauses are really standard, and the one I wrote was extremely simple).

If everyone on the team takes the time to read and understand the contract it greatly reduces friction. Clients who understand the bounds and assumptions in a contract are able to get the most from their vendor without creating tension. Consultants who track the required deliverables of the contract don’t frustrate clients by skipping required elements. It doesn’t take long, and the more contracts you read, the faster you’ll be at reading the next one.

Once you have read a bunch of contracts you’ll know what’s normal and what’s not. At this point, if I don’t understand the contract language I see that as a red flag even before I send it out for legal review.

Discuss problems and be solution oriented

Projects go best when everyone is open about what problems exist and then pivots to solving them.

Technology should be deployed to solve problems. That means starting by talking about problems. But being problem-focused at the start makes it easy to get hung up talking only about those problems, or about new problems that come up while solving the first one.

Having a good problem statement is critical to creating good solutions. But once you have the problem outlined you need to focus on solving it. Yes, raise problems, concerns, challenges, threats, weaknesses, etc. Talk openly about all those things. Then make the pivot into problem solving mode once the issue is well understood.

The best projects come together when everyone collaborates on finding the best solutions to the problems at hand.

Quality matters

Everyone needs to focus on the quality of the outcome. Consultants, for all their fast moving creation of imperfect solutions, must still do good work. Clients should hold their vendors, and themselves, to high standards.

Every message that goes back and forth is a chance for a misunderstanding that gets in the way. Every input into discovery and every deliverable is a chance for gaps to form. If anyone takes their eye off the ball, mistakes can happen and the solution no longer threads the quality needle correctly.

Mistakes will happen, and everyone will have to help course correct. But the higher the quality of the work done before the mistake, the faster the recovery will be and the better the overall solution the client will get.

The Grass is Greener

One final note on the way out. If you are trying to decide between being a consultant or being a client, I recommend the switch – whichever you are today try being the other if you haven’t yet. Not everyone loves both roles, and different roles have been right for me at different times.

As a client I loved what I did. We were helping make the world better. I was pushing things forward and helping the organizations succeed. But eventually the things they needed me to learn, and the pace at which I wanted to grow, weren’t aligned with the organization’s needs.

I’d been there a decade, I left on great terms, but it was time to go.

When I first became a consultant it was exciting. I got to work on a variety of projects, with more technologies than any one organization generally needs. The pace was higher and I was frequently pushing myself in new directions. Consulting gave me insights into how different organizations worked (for both better and worse). And I made more money.

Interesting work, exciting environment, more money, great!

As a consultant I spent less time in each position, the billable grind was exhausting, and I missed being focused. When I returned to the client side, I got to focus again. I have one org to worry about, one set of organizational politics to understand, and so on. I get to learn the work of the organization deeply again and really understand the market we serve. In my case I, again, got more money – but that was at least partially luck; consultants are often paid better than in-house team members.

Focused work, no billable hours target, calmer work environment, great!

Each really does have its advantages. But so does understanding what it’s like to be the person on the other side of the relationship. Try them both, learn from both, and decide what’s the best fit for you.

Drupal AI Initiative: The Drupal AI Hackathon: Play to Impact 2026

Drupal Planet -

On the 27th of January, 70+ developers, designers, UX specialists, and project leads joined forces in nine teams for the European Commission hackathon called Play to Impact, held at The One building in the heart of the European Commission’s executive arm in Brussels.

Article by Marcus Johansson.

Day 1: Challenge setting and ideation

The two tasks for the teams were clear: build something that helps content editors using AI, or build something that reimagines how websites are created in Canvas.

While the tasks were mainly about developing new features and modules, other criteria were also scored, including a final PowerPoint presentation in front of everyone. This meant that a multidisciplinary team was needed to have a chance to win.

One of the other criteria was that you had to use Mistral AI for your solution. Mistral, the powerhouse of European AI innovation in large language models, was a sponsor of the event. Mistral is one of the key companies behind digitally sovereign AI solutions in Europe.

They helped make sure that all the teams had enough credits to develop and show off their impressive solutions using likewise impressive models, and they also provided on-site support and helped with jury duty when selecting the winners.

amazee.ai and DrupalForge/Devpanel were also sponsoring the event, making sure that the provider setup was smooth for the teams and that the teams were given platforms where they could deploy their solutions for the jury to test.

The teams hard at work

The event was the second time the Commission had held a hackathon specifically around Drupal and AI, and this time it was a two-day event, meaning people had much more time to prepare, plan, code, and present their solutions.

This time there were also prep events where you could ask actual stakeholders, like platform editors, what main problems they were facing.

As one of the core maintainers of the AI module, seeing the number of people using something you helped create was a feeling of pride, joy, and satisfaction. And as someone who was on site to help technically for the second year running, two things stood out to me:

  1. At the first event I had to provide a lot of assistance, and the event helped us identify areas for improvement at the code level. If that year was a stress test, this year was smooth sailing. The modules are robust and people are more familiar with them.
  2. The use of actual working AI code generation meant that the demos looked nicer, worked better, and let teams produce far more impressive proofs of concept.

Group photo of most of the participants and organizers. Photo credit: Antonio De Marco.

Day 2: Sprinting to be presentation ready

On the second day all the teams had to stop at the deadline of 14:40 and have their presentation ready, code committed and Drupal instances set up.

After that started the presentation round, where each team had exactly five minutes to present their solution to the jury and answer questions. The jury consisted of people from the European Commission, one person representing Mistral, Tim Lehnen from the Drupal Association, and Jamie Abrahams from the AI Initiative.

Bram ten Hove and Ronald te Brake presenting their ACE! Solution.

And the winners are ...

The winner in the end was team #4, aptly named Token Burners, which ended up making a solution that spawned not just one actual contributed module, but two! They also had a very impressive presentation.

We now have FlowDrop Agents, which puts the AI Agents we have had in Drupal into the awesome workflow management system FlowDrop, and also FlowDrop Node Sessions, which makes sure workflows can be initialized via a Drupal entity.


The winning team Token Burners and the hackathon jury.

From my point of view the hackathon was a huge success – the energy in the room, the collaboration, and the brainstorming were just impressive.

A huge thanks to the organizers Sabina La Felice, Monika Vladimirova, Raquel Fialho, Antonio De Marco, and Rosa Ordinana-Calabuig, and to the European Commission in general, for such a great event!

Droptica: AI Document Processing in Drupal: Technical Case Study with 95% Accuracy

Drupal Planet -

AI document processing is transforming content management in Drupal. Through integration with AI Automators, Unstructured.io, and GPT models, editorial teams can automate tedious tasks like metadata extraction, taxonomy matching, and summary generation. This case study reveals how BetterRegulation implemented AI document processing in their Drupal 11 platform, achieving 95%+ accuracy and 50% editorial time savings.
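
For readers curious what this kind of pipeline can look like, here is a minimal Python sketch of LLM-based metadata extraction and taxonomy matching. It is an illustrative assumption rather than the BetterRegulation implementation (which runs inside Drupal 11 via AI Automators and Unstructured.io): the OpenAI Python SDK, the gpt-4o-mini model name, the prompt wording, and the placeholder taxonomy terms are all stand-ins.

    import json
    from openai import OpenAI  # assumes the official OpenAI Python SDK is installed

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Placeholder taxonomy terms -- a real site would load these from Drupal.
    TAXONOMY = ["Banking", "Insurance", "Securities", "Environment"]

    def extract_metadata(document_text: str) -> dict:
        """Ask the model for a title, a short summary, and a taxonomy match."""
        prompt = (
            "Extract metadata from the document below. Reply with a JSON object "
            "containing 'title', 'summary' (max 50 words), and 'category', where "
            f"'category' is one of: {', '.join(TAXONOMY)}.\n\n"
            + document_text[:8000]  # truncate long documents to stay within context
        )
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[{"role": "user", "content": prompt}],
            response_format={"type": "json_object"},  # request machine-readable output
        )
        return json.loads(response.choices[0].message.content)

    if __name__ == "__main__":
        print(extract_metadata("Example regulatory notice text goes here."))

In a Drupal setup like the one described above, AI Automators would wire comparable calls into the editorial workflow, presumably with editors reviewing the suggested values before they are saved.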

Dries Buytaert: AI creates asymmetric pressure on Open Source

Drupal Planet -

AI makes it cheaper to contribute to Open Source, but it's not making life easier for maintainers. More contributions are flowing in, but the burden of evaluating them still falls on the same small group of people. That asymmetric pressure risks breaking maintainers.

The curl story

Daniel Stenberg, who maintains curl, just ended the curl project's bug bounty program. The program had worked well for years. But in 2025, fewer than one in twenty submissions turned out to be real bugs.

In a post called "Death by a thousand slops", Stenberg described the toll on curl's seven-person security team: each report engaged three to four people, sometimes for hours, only to find nothing real. He wrote about the "emotional toll" of "mind-numbing stupidities".

Stenberg's response was pragmatic. He didn't ban AI. He ended the bug bounty. That alone removed most of the incentive to flood the project with low-quality reports.

Drupal doesn't have a bug bounty, but it still has incentives: contribution credit, reputation, and visibility all matter. Those incentives can attract low-quality contributions too, and the cost of sorting them out often lands on maintainers.

Caught between two truths

We've seen some AI slop in Drupal, though not at the scale curl experienced. But our maintainers are stretched thin, and they see what is happening to other projects.

Some have deep concerns about AI itself: its environmental cost, its impact on their craft, and the unresolved legal and ethical questions around how it was trained. Others worry about security vulnerabilities slipping through. And for some, it's simply demoralizing to watch something they built with care become a target for high-volume, low-quality contributions.

These concerns are legitimate, and they deserve to be heard. Some of them, like AI's environmental cost or its relationship to Open Web values, also deserve deeper discussion than I can give them here.

That tension shows up in conversations about AI in Drupal Core. People hesitate around AGENTS.md files and adaptable modules because they worry about inviting more contributions without adding more capacity to evaluate them.

This is the AI-induced asymmetric pressure showing up in our community. I understand the hesitation. Some feel they've already seen enough low-quality AI contributions to know where this leads. When we get this wrong, maintainers are the ones who pay. They've earned the right to be skeptical.

I feel caught between two truths.

On one side, maintainers hold everything together. If they burn out or leave, Drupal is in serious trouble. We can't ask them to absorb more work without first creating relief.

On the other side, the people who depend on Drupal are watching other platforms accelerate. If we move too slowly, they'll look elsewhere.

Both are true. Protecting maintainers and accelerating innovation shouldn't be opposites, but right now they feel that way. As Drupal's project lead, my job is to help us find a path that honors both.

I should be honest about where I stand. I've been writing software with AI tools for over a year now. I've had real successes. I've also watched some of the most experienced Drupal contributors become dramatically more productive with AI, doing things they could not have done without it. That perspective comes from direct experience, not hype.

But having a perspective is not the same as having all the answers. And leadership doesn't mean dragging people where they don't want to go. It means pointing a direction with care, staying open to evidence, and never abandoning the people who hold the project together.

We've sort of been here before

New technology has a way of lowering barriers, and lower barriers always come with tradeoffs. I saw this early in my career. I was writing low-level C for embedded systems by day, and after work I'd come home and work on websites with Drupal and PHP. It was thrilling, and a stark contrast to my day job. You could build in an evening what took days in C.

I remember that excitement. The early web coming alive. I hadn't felt the same excitement in 25 years, until AI.

PHP brought in hobbyists and self-taught developers, people learning as they went. Many of them built careers here. But it also meant that a lot of early PHP code had serious security problems. The language got blamed, and many experts dismissed it entirely. Some still do.

The answer wasn't rejecting PHP for enabling low-quality code. The answer was frameworks, better security practices, and shared standards.

AI is a different technology, but I see the same patterns. It lowers barriers and will bring in new contributors who aren't experts yet. And like scripting languages, AI is here to stay. The question isn't whether AI is coming to Open Source. It's how we make it work.

AI in the right hands

The curl story doesn't end there. In October 2025, a researcher named Joshua Rogers used AI-powered code analysis tools to submit hundreds of potential issues. Stenberg was "amazed by the quality and insights". He and a fellow maintainer merged about 50 fixes from the initial batch alone.

Earlier this week, a security startup called AISLE announced they had used AI to find 12 zero-days in the latest OpenSSL security release. OpenSSL is one of the most scrutinized codebases on the planet. It encrypts most of the internet. Some of the bugs AISLE found had been hiding for over 25 years. They also reported over 30 valid security issues to curl.

The difference between this and the slop flooding Stenberg's inbox wasn't the use of AI. It was expertise and intent. Rogers and AISLE used AI to amplify deep knowledge. The low-quality reports used AI to replace expertise that wasn't there, chasing volume instead of insight.

AI has created new burdens for maintainers. But used well, it may also be part of the relief.

Earn trust through results

I reached out to Daniel Stenberg this week to compare notes. He's navigating the same tensions inside the curl project, with maintainers who are skeptical, if not outright negative, toward AI.

His approach is simple. Rather than pushing tools on his team, he tests them on himself. He uses AI review tools on his own pull requests to understand their strengths and limits, and to show where they actually help. The goal is to find useful applications without forcing anyone else to adopt them.

The curl team does use AI-powered analyzers today because, as Stenberg puts it, "they have proven to find things no other analyzers do". The tools earned their place.

That is a model I'd like us to try in Drupal. Experiments should stay with willing contributors, and the burden of proof should remain with the experimenters. Nothing should become a new expectation for maintainers until it has demonstrated real, repeatable value.

That does not mean we should wait. If we want evidence instead of opinions, we have to create it. Contributors should experiment on their own work first. When something helps, show it. When something doesn't, share that too. We need honest results, not just positive ones. Maintainers don't have to adopt anything, but when someone shows up with real results, it's worth a look.

Not all low-quality contributions come from bad faith. Many contributors are learning, experimenting, and trying to help. They want what is best for Drupal. A welcoming environment means building the guidelines and culture to help them succeed, with or without AI, not making them afraid to try.

I believe AI tools are part of how we create relief. I also know that is a hard sell to someone already stretched thin, or dealing with AI slop, or wrestling with what AI means for their craft. The people we most want to help are often the most skeptical, and they have good reason to be.

I'm going to do my part. I'll seek out contributors who are experimenting with AI tools and share what they're learning, what works, what doesn't, and what surprises them. I'll try some of these tools myself before asking anyone else to. And I'll keep writing about what I find, including the failures.

If you're experimenting with AI tools, I'd love to hear about it. I've opened an issue on Drupal.org to collect real-world experiences from contributors. Share what you're learning in the issue, or write about it on your own blog and link it there. I'll report back on what we learn on my blog or at DrupalCon.

Protect your maintainers

This isn't just Drupal's challenge. Every large Open Source project is navigating the same tension between enthusiasm for AI and real concern about its impact.

But wherever this goes, one principle should guide us: protect your maintainers. They're a rare asset, hard to replace and easy to lose. Any path forward that burns them out isn't a path forward at all.

I believe Drupal will be stronger with AI tools, not weaker. I believe we can reduce maintainer burden rather than add to it. But getting there will take experimentation, honest results, and collaboration. That is the direction I want to point us in. Let's keep an open mind and let evidence and adoption speak for themselves.

Thanks to phenaproxima, Tim Lehnen, Gábor Hojtsy, Scott Falconer, Théodore Biadala, Jürgen Haas and Alex Bronstein for reviewing my draft.

Evolving Web: Designing a digital archive in partnership with an Indigenous community

Drupal Planet -

Lessons for building a digital repository of archival material, stories, or user-generated knowledge.


Digital archives play an increasingly important role in preserving cultural knowledge, personal histories, and community memory. But not all archives are created equal. Beyond simply storing information, the most effective digital archives are designed to be welcoming, respectful, and alive — spaces that invite exploration while honouring the people and knowledge they represent.

At Evolving Web, we recently collaborated with the University of Denver on the Our Stories, Our Medicine Archive (OSOMA), a community-owned digital archive that centres traditional Indigenous knowledge related to health, wellness, culture, and identity. Built in close collaboration with community partners, OSOMA offers a powerful example of how digital repositories can move beyond institutional models toward something more participatory and human.

If you’re working on a digital archive — whether it’s focused on cultural heritage, community storytelling, or user-generated knowledge — here are some key lessons from OSOMA that can help guide your approach.

Design for discoverability, not just storage

A strong digital archive doesn’t assume users know exactly what they’re looking for. Instead, it supports exploration and discovery.
On OSOMA, visitors can browse content by broad themes such as Plants, Food, Ceremony, Identity, and Land. From there, they can narrow their focus using more specific filters, for example, exploring knowledge connected to particular healing practices or types of medicine.

This structure allows users to move easily between big ideas and specific stories. Someone might begin by browsing “Plant Medicine” and then discover individual narratives, videos, or related knowledge shared by community members. The archive encourages curiosity rather than forcing users into rigid pathways.
By organizing content around themes that reflect Indigenous worldviews rather than academic or institutional categories, OSOMA makes it easier for users to find meaning, not just information.

OSOMA’s theme-based browsing invites exploration, allowing visitors to move from broad concepts like ceremony, animals, or hope into more specific stories and knowledge shared by community members.

Use plain language to build trust

Plain language plays an important role in making digital archives accessible, but it also shapes how users feel when they engage with the content.

Across OSOMA, headlines, descriptions, and navigation labels are written in clear, approachable language. The content doesn’t feel instructional or authoritative, and it avoids positioning itself as a definitive source of medical advice. Instead, it presents stories, experiences, and teachings in a way that feels open-ended and respectful.

This tone is especially important for an archive focused on health and wellness. By avoiding prescriptive language, OSOMA creates space for users to learn without pressure, and reinforces that the knowledge being shared belongs to the community, not the platform.

Make it easy to access knowledge quickly

OSOMA includes rich media such as videos and interviews, and the way users access that content is intentional.

For example, users can watch videos directly from search and results pages, without needing to click through multiple screens. This makes it easier to sample content, follow related threads, and continue exploring without losing context.

These small experience details matter. They reduce friction and make the archive feel responsive and intuitive, especially for users who may be less comfortable navigating complex digital interfaces.

Focus on personal stories over institutions

Many digital archives unintentionally feel institutional, even when they contain deeply personal material. OSOMA takes a different approach by placing individual voices front and centre.

Each community member has a dedicated profile page that brings together their stories, interviews, and related knowledge items. These profiles help users understand who is sharing the knowledge, where it comes from, and how it connects to lived experience.

Stories aren’t treated as supplementary content; they are the foundation of the archive. This storytelling-first approach reflects Indigenous knowledge traditions, where stories are a primary way of sharing history, values, and healing practices. The result is an archive that feels human and relational, rather than abstract or academic.

An OSOMA community member profile brings individual voices to the forefront, weaving together personal stories, interviews, and related knowledge to show how lived experience anchors the archive.

Make participation visible and welcoming

OSOMA was designed as a living, community-owned archive, and that intention is visible throughout the site.

Links and prompts to contribute are displayed prominently, making it clear that community members are invited to share their own stories and knowledge. Even visitors who never log in or submit content can immediately sense that OSOMA is shaped by ongoing participation.

Behind the scenes, the platform supports this model by allowing Indigenous users to log in, contribute content, and access protected cultural knowledge. Using Drupal’s Group functionality, the site ensures that sensitive information remains visible only to appropriate community members.

Participation isn’t treated as an add-on; rather, it’s built into the structure of the archive itself.

OSOMA invites community members to contribute their own stories, ensuring the archive continues to grow through shared knowledge, relationships, and lived experience.

Use design to support confidence and cohesion

Strong visual design helps establish trust, especially when an archive contains many voices and content types.
OSOMA uses photography and video of people, land, and cultural assets to ground the experience in real places and lived relationships. Circular image frames and a consistent colour palette draw from OSOMA’s visual identity and help tie together diverse content.

These design choices do important work quietly. They lend confidence to the stories being shared and ensure the site feels cohesive, even as new contributions are added over time. Rather than competing with the content, the design supports it, creating space for stories to speak for themselves.

Accessibility is foundational, not optional

OSOMA was built to be welcoming to a wide range of users, including Elders, youth, and non-specialist visitors.

The site meets WCAG AA accessibility standards, with clear layouts, strong colour contrast, and plain-language content. Navigation and browsing tools were designed to be intuitive, so users can explore without needing technical expertise.

Accessibility here isn’t treated as a compliance exercise. It’s part of a broader commitment to inclusion, respect, and ease of use: values that align closely with OSOMA’s community-led goals.

Building archives that honour living knowledge

OSOMA demonstrates that digital archives don’t have to replicate colonial or extractive models of knowledge storage. With the right approach, they can become spaces of connection, care, and continuity.

By prioritizing discoverability, plain language, personal storytelling, participation, strong design, and accessibility, OSOMA offers a powerful example of what’s possible when technology is shaped by community values.

If you’re thinking about building a digital archive or knowledge platform, this project is a reminder to look beyond the technical requirements and ask deeper questions about ownership, voice, and experience.

Get in touch to talk about building digital platforms that are inclusive, future-friendly, and people-first.
 

Learn more about the OSOMA project by reading the case study. 

+ more awesome articles by Evolving Web
