Drupal feeds

Freelock Blog: Running Out of Time? Giving Users Control

Drupal Planet -

Day 18 - Timing Adjustable, Pause, Stop, Hide Dec 18, 2025 0

You're filling out a multi-page application form - carefully reviewing each section, gathering documents, double-checking information. Suddenly, a popup appears: "Your session has expired. Please log in again." All your work is gone. You have to start over.

Or you're reading an important article when an auto-playing carousel sweeps the content away before you finish reading it. You try to find the pause button, but there isn't one - the carousel just keeps cycling, forcing you to time your reading to match its pace.

Read More

Dries Buytaert: Adaptable Drupal modules: code meant to be adapted, not installed

Drupal Planet -

Over the years, I've built dozens of small, site-specific Drupal modules. None of them live on Drupal.org.

It makes me wonder: how many modules like that exist across the Drupal ecosystem? I'm guessing a lot.

For example, I recently open-sourced the content of this blog by exporting my posts as Markdown files and publishing them on GitHub. To do that, I built two custom Drupal modules with Claude Code: one that converts HTML to Markdown, and another that exports content as YAML with Markdown.

Both modules embed architectural choices and algorithms I explicitly described to Claude Code. Both have unit tests and have been used in production. But both only work for my site.

They're built around my specific content model and field names. For example, my export module expects fields like field_summary and field_image to exist. I'd love to contribute them to Drupal.org, but turning site-specific code into something reusable can be a lot of work.

On Drupal.org, contributed modules are expected to work for everyone. That means abstracting away my content model, adding configuration options I'll never use, handling edge cases I'll never hit, and documenting setups I haven't tested.

There is a "generalization tax": the cost of making code flexible enough for every possible site. Drupal has always had a strong culture of contribution, but this tax has kept a lot of useful code private. My blog alone has ten custom modules that will probably never make it to Drupal.org under the current model.

Generalization work is extremely valuable, and the maintainers who do it deserve a lot of credit. But it can be a high bar, and a lot of useful code never clears it.

That made me wonder: what if we had a different category of contributed code on Drupal.org?

Let's call them "adaptable modules", though the name matters less than the idea.

The concept is simple: tested, working code that solves a real problem for a real site, shared explicitly as a starting point. You don't install these modules. You certainly don't expect them to work out of the box. Instead, an AI adapts the code for you by reading it and understanding the design decisions embedded in it. Or a human can do the same.

In practice, that might mean pointing Claude Code at my Markdown export module and prompting: "I need something like this, but my site uses Paragraphs instead of a regular Body field". Or: "I store images in a media field instead of an image field". The AI reads the code, understands the approach, and generates a version tailored to your setup.

This workflow made less sense when humans had to do all the adaptation. But AI changes the economics. AI is good at reading code, understanding what it does, and reshaping it for a new context. The mechanical work of adaptation is becoming both cheap and reliable.

What matters are the design decisions embedded in the code: the architecture, the algorithms, the trade-offs. Those came from me, a human. They are worth sharing, even if AI handles the mechanical adaptation.

This aligns with where engineering is heading. As developers, we'll spend less time on syntax and boilerplate, and more time on understanding problems, making architectural choices, and weighing trade-offs. Our craft is shifting from writing code to shaping code, and orchestrating the AI agents that write it. Adaptable modules fit that future.

Modules that work for everyone are still important. Drupal's success will always depend on them. But maybe they're not the only kind worth sharing. The traditional contribution model, generalizing everything for everyone, makes less sense for smaller utility modules when AI can generate context-specific code on demand.

Opinionated, site-specific modules have always lived in private repositories. What is new is that AI makes them worth sharing. Code that only works for my site becomes a useful starting point when AI can adapt it to yours.

I created an issue on Drupal.org to explore this further. I'd love to hear what you think.

(Thanks to phenaproxima, Tim Lehnen, Gábor Hojtsy and Wim Leers for reviewing my draft.)

LostCarPark Drupal Blog: Advent Calendar day 18 – That's Not a Theme, It's a Template

Drupal Planet -


In today’s door we return to DrupalCon Nara, where Elliott Mower of Mediacurrent describes himself as a non-engineer who became an “accidental designer” of the new starter theme for Drupal Canvas.

The talk introduces the idea that modern Drupal Site Templates are more than just themes: they are flexible foundations that allow non-developers to build, customize, and evolve websites without deep technical knowledge. Using Drupal CMS 2.0 and Canvas, tasks that once required expertise in Composer, CSS, or front-end development can now be done visually and intuitively. He emphasizes that creators…


Drupal AI Initiative: Oaisys 2025 - a revelation in AI and Drupal

Drupal Planet -

At the end of November I had the privilege of going to the first ever Oaisys AI Practitioners conference in Pune, India. I wanted to share some of the memories and moments from that event.

Jamie Abrahams and I were invited to both hold sessions and lead the contribution day. It was my first time in India, and since the event itself was new as well, it was hard to know what to expect. But whatever my expectations might have been, the event surpassed them.

We arrived two days early to get over the jetlag and also meet some people before the event. On our first evening we went to dinner with Dipen Chaudhary, CEO, and Piyuesh Kumar, Director of Technology, both from QED42. QED42 were the main organisers of the event and had done a fantastic job putting it all together. It was great to meet them a bit before the event started, and let me tell you - Indian food in Germany is not the same thing as Indian food in India. It was absolutely delicious.

The next day we met up with Pritam Prasun, founder of OpenSense Labs and CEO of RAIL. We discussed a lot about the Drupal ecosystem, about Indian culture vs western culture, but also about the RAIL project and how it can be integrated with the AI module to make the ecosystem more secure.

We also got to visit the offices of QED42 which was a great experience. The team there was super friendly and it was nice to see some faces that you had seen on video calls or Slack channels in real life.

The conference itself started on the 29th of November in the ICC Trade Tower in Pune. When I walked in the first day, I was greeted right away by people I knew from the Drupal community, but also people that had worked with the AI module or were interested in AI in general. It was a great feeling to be surrounded by so many like-minded people.


Because of managing the contribution room, I only attended two sessions during the conference - Jamie's introductory session, which was always great and very well received and the end session by Piyush about How LLMs learn, which I learned myself from as well.

Meeting prominent contributors to the Drupal AI ecosystem

The highlight of the conference was to meet some of the most prominent contributors to the AI ecosystem. Specifically Prabhavathi Vanipenta and Anjali Prasannan that both have been doing amazing work on the AI module, and Akhil Babu, who has been working on the agents system in Drupal, and built many of the agents you see in Canvas. It was truly a blessing to meet them in person and thank them for all their contributions.

Prashant Chauhan wasn't there in person, but Prabha and Anjali made sure that I got to thank him over a video call. The four of them have worked on and finished over 200 issues on the AI module, which is just mind-blowing.

Well thought through ideas of how AI can be used to improve their workflows, businesses and lives

Another thing that was mind-blowing was the level of interest, energy and enthusiasm around AI in general in the contribution room. People were building the craziest things and had really thought through how AI could be used to improve their workflows, businesses and lives. It was inspiring to see so many people passionate about the same thing as you. I have never seen anything like it at any event I have attended.


The Browser AI CKEditor for instance is a project that was thought through and built during the contribution day.

A lot of discussion led directly to new issues in the Drupal AI module issue queue, that some of the people attending are working on now.

Because of family commitments I couldn't stay longer than four days, but those four days were really great. I want to thank Dipen, Piyuesh and the whole QED42 team for organising such a fantastic event and being such great hosts. A special thanks as well to Priyanka Jeph, who organized a lot of the event.


Freelock Blog: What Went Wrong? Error Identification and Helpful Suggestions

Drupal Planet -

Day 17 - Error Identification and Suggestions Dec 17, 2025 0

You're checking out on an e-commerce site. You click Submit, and the page reloads with an error message at the top: "There were errors in your submission." That's it. No indication of which fields have problems. No explanation of what's wrong. You start hunting through the form, checking each field, trying to figure out what went wrong.

This frustrating experience is unfortunately common, especially on e-commerce sites, membership portals, and complex forms. But it's also completely avoidable - and fixing it makes your site more accessible and more usable for everyone.

Read More

Centarro: The Hidden Costs of Enterprise Ecommerce Platforms

Drupal Planet -

When evaluating enterprise eCommerce platforms, the sticker price is often the smallest part of their total cost. Expenses always go beyond monthly subscription or license fees. What you see advertised is almost never what you’re going to pay.

Sometimes, these hidden, unexpected costs aren’t a huge problem. But other times, they can turn something seemingly affordable into a budget-busting commitment.

Count the cost: the true cost beyond the advertised price. Here are eight cost factors that, if ignored, could sabotage your project budget.

Hidden Cost #1: Revenue-Based Fees That Scale With Your Success

Many enterprise platforms use pricing models that scale as your business grows. They do this as a proxy for the importance of the technology to your business, not because you are making use of additional features, and not necessarily even because you’re putting extra load on their servers.

Consider these examples:

Read more

drunomics: Lupus Decoupled 1.4: Component Previews, Canvas-Ready, and a Better JSON API Enabling Native Vue Slots

Drupal Planet -

Lupus Decoupled 1.4 introduces Component Previews, Canvas-ready features, and an improved JSON API with native Vue slot support, enhancing developer flexibility and front-end integration.

Drupal blog: Drupal 11.3.0 is now available

Drupal Planet -

The third feature release of Drupal 11 is here with the biggest performance boost in a decade. Serve 26-33% more requests with the same database load. New native HTMX support enables rich UX with up to 71% less JavaScript. Plus, enjoy the new stable Navigation module, improved CKEditor content editing, native content export, and cleaner OOP hooks for themes.

New in Drupal 11.3 Biggest performance boost in a decade

Database query and cache operations on both cold and warm caches have been significantly reduced. Our automated tests show that the new optimizations reduce operations by about one third on cold caches and by up to one fourth on partially-warm cache requests. Independent testing shows even bigger improvements on complex sites.

The render and caching layers now combine database and cache operations, notably in path alias and entity loading. BigPipe also now uses HTMX on the frontend, leading to a significant reduction in JavaScript weight.

Read more about performance improvements in Drupal 11.3.0.

Native HTMX: Rich UX with up to 71% less JavaScript

Drupal 11.3.0 now natively integrates HTMX, a powerful, dependency-free JavaScript library. HTMX dramatically enhances how developers build fast, interactive user interfaces. It enables modern browser features directly in HTML attributes, significantly reducing the need for extensive custom JavaScript.

Read more about HTMX support in Drupal 11.3.0.

Navigation module is now stable

The Navigation module is now stable, offering a superior and more modern experience than the old Toolbar. While it is worth installing on all sites, it is most useful for sites with complex administration structures. It is not yet the default, but we strongly encourage users to switch and benefit from its improvements.

Improved content editing

CKEditor now natively supports linking content on the site by selecting it from an autocomplete or dropdown (using entity references). CKEditor also has new, user-friendly options for formatting list bullets and numbering. Finally, a dedicated Administer node published status permission is introduced to manage the publication status of content (which no longer requires Administer nodes).

Object-oriented hooks in themes

Themes can now use the same #[Hook()] attribute system as modules, with theme namespaces registered in the container for easier integration. This change allows themers to write cleaner, more structured code. Themes' OOP hook implementations are placed in the src/Hook/ directory, similarly to modules'. Themes support a defined subset of both normal and alter hooks.
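To illustrate, here is a minimal sketch of a theme-side OOP hook implementation. The theme name (mytheme) and the choice of form_alter are assumptions for illustration; check the change record for the exact subset of hooks that themes support.

// src/Hook/MythemeHooks.php inside the hypothetical "mytheme" theme.
namespace Drupal\mytheme\Hook;

use Drupal\Core\Form\FormStateInterface;
use Drupal\Core\Hook\Attribute\Hook;

class MythemeHooks {

  /**
   * Implements hook_form_alter() for the theme.
   */
  #[Hook('form_alter')]
  public function formAlter(array &$form, FormStateInterface $form_state, string $form_id): void {
    // Example: add a theme-specific class to the search block form.
    if ($form_id === 'search_block_form') {
      $form['#attributes']['class'][] = 'mytheme-search';
    }
  }

}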

Native support for content export

Drupal core now includes a command-line tool to export content in the format previously introduced by the contributed Default Content module. Drupal can export a single entity at a time, but it is also possible to export the dependencies of the entity automatically (for example, images or taxonomy terms it references). To use the export tool, run the following from the Drupal site's root:

php core/scripts/drupal content:export ENTITY_TYPE_ID ENTITY_ID
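For example, exporting node 1 prints its YAML representation, which you can redirect to a file for use in a recipe. The redirection is just shell usage; check the command's help output for options such as exporting dependencies.

php core/scripts/drupal content:export node 1 > node-1.yml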

New experimental database driver for MySQL/MariaDB for parallel queries

A new, experimental MySQLi database driver has been added for MySQL and MariaDB. It is not yet fully supported and is hidden from the user interface.

While the current default drivers use PDO to connect to MySQL or MariaDB, this new database driver instead uses the mysqli PHP extension. MySQLi is more modern and allows database queries to be run in parallel instead of sequentially as with PDO. We plan to add asynchronous database query support in a future Drupal release.
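For the curious, opting in would look roughly like the settings.php sketch below. This is a hypothetical example: the namespace and autoload path are assumptions based on how non-default database drivers are registered, so verify them against the experimental module before use.

// settings.php: hypothetical sketch of opting in to the mysqli driver.
$databases['default']['default'] = [
  'driver' => 'mysqli',
  // Assumed values; confirm against the experimental core module.
  'namespace' => 'Drupal\mysqli\Driver\Database\mysqli',
  'autoload' => 'core/modules/mysqli/src/Driver/Database/mysqli/',
  'database' => 'drupal',
  'username' => 'drupal',
  'password' => 'drupal',
  'host' => 'localhost',
  'prefix' => '',
];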

Core maintainer team updates

Since Drupal 11.2, we reached out to all subsystem and topic maintainers to confirm whether they wished to continue in their roles. Several long-term contributors stepped back and opened up roles for new contributors. We would like to thank them for their contributions.

Additionally, Roy Scholten stepped back from his Usability maintainership and Drupal core product manager role. He has been inactive for a while, but his impact on Drupal since 2007 has been profound. We thank him for his involvement!

Mohit Aghera joined as a maintainer for the File subsystem. Shawn Duncan is a new maintainer for the Ajax subsystem. David Cameron was added as a maintainer of the Link Field module. Pierre Dureau and Florent Torregrosa are now the maintainers for the Asset Library API. Finally, codebymikey is the new maintainer for Basic Auth.

Going forward, we plan to review core maintainer appointments annually. We hope this will reduce the burden on maintainers when transitioning between roles or stepping down, and also provide more opportunities for new contributors.

Want to get involved?

If you are looking to make the leap from Drupal user to Drupal contributor, or you want to share resources with your team as part of their professional development, there are many opportunities to deepen your Drupal skill set and give back to the community. Check out the Drupal contributor guide.

You would be more than welcome to join us at DrupalCon Chicago in March 2026 to attend sessions, network, and enjoy mentorship for your first contributions.

The Core Leadership Team is always looking for new contributors to help steward the project, and various new opportunities have recently opened up. If you are looking to deepen your Drupal skill set, we encourage you to read more about the open subsystem and topic maintainer roles and consider stepping up to contribute your expertise.

Drupal 10.6 is also available

The next maintenance minor release of Drupal 10 has also been released, and will be supported until December 9, 2026, after the release of Drupal 12. Long-term support for Drupal 10 gives more flexibility for sites to move to Drupal 11 when they are ready while staying up-to-date with Drupal's dependencies.

This release schedule also allows sites to move from one long-term support version to the next if that is the best strategy for their needs. For more information on maintenance minors, read the previous post on the new major release schedule.

Drupal Core News: Drupal 11.3.0: Biggest performance boost in a decade

Drupal Planet -

Drupal 11.3 includes a number of significant performance improvements, altogether making it the most significant step forward for Drupal performance in the last 10 years (since the Drupal 8.0.0 release).

These improvements have been driven by enhancements to Drupal's render and caching layers in 11.2.x, notably taking advantage of Fibers, a PHP feature added in PHP 8.1. By rendering more parts of the page in placeholders, similar database and cache operations that used to occur individually can now be combined, with particular improvements in path alias and entity loading. We have also learned from Drupal's automated performance testing framework, which allowed us to identify and execute several optimizations in Drupal's hook and field discovery processes, significantly reducing database and cache I/O and memory usage on cold caches.

On the front end we have converted Drupal's BigPipe implementation to use HTMX, reducing JavaScript weight significantly. We also intercept placeholders with warm render caches prior to BigPipe replacement, so that BigPipe's JavaScript is not loaded at all on requests that will be served quickly without it, allowing BigPipe to be used more widely for the requests that do need it. These changes may also allow us to enable BigPipe for anonymous site visitors in a future release.

Combined, these changes reduce database query and cache operations on cold cache requests by around one third, with smaller but still significant improvements when caches become warmer, up to and including dynamic and internal page cache hits.

Drupal's automated performance tests show many of these improvements, and will ensure that we continue to build on and maintain these gains over time.

Drupal Umami demo’s anonymous cold cache front page request

Let's look at an example. These are the changes in Drupal core's included Umami demo's anonymous cold cache front page request performance test between 11.2.x and 11.3.x.

                                           11.2.0   11.3.0   Reduction
SQL Query Count                               381      263       31%
Cache Get Count                               471      316       33%
Cache Set Count                               467      315       33%
CacheTag Lookup Query Count                    49       27       45%
Estimated ms (assuming 1ms per operation)    1368      921       33%

Particularly notable is the benefit for requests to pages with partially warmed caches, where site-wide caches are full, but page-specific caches are invalid or empty. In core's performance test for this scenario, we saw an almost 50% reduction in database queries. Requests like this make up a large percentage of the slowest responses from real Drupal sites, and core now provides a much lower baseline to work against. Medium to large sites often hit constraints with the database first, because it is harder to scale than simply adding webservers, and these improvements reduce load when it is at its most constrained.

Drupal Umami demo’s anonymous node page with partially-warmed cache

                                           11.2.0   11.3.0   Reduction
SQL Query Count                               171       91       47%
Cache Get Count                               202      168       17%
Cache Set Count                                41       42       -2%
CacheTag Lookup Query Count                    22       22        0%
Estimated ms (assuming 1ms per operation)     436      323       26%

While different sites, and even different pages on the same site, will show different results, we would expect all Drupal sites to see a significant reduction in database and cache i/o per request once they've updated to Drupal 11.3.

Independent testing and further improvements with Paragraphs

Independent testing by MD Systems on their internal Primer Drupal distribution shows even better improvements with Drupal 11.3, especially for complex pages. This is also thanks to further improvements enabled and inspired by Drupal 11.3 in the Entity Reference Revisions module which resulted in considerable performance improvements for Paragraphs. Their results show a dramatic reduction in database and cache operations across different cache states. Their cold cache total query count dropped by 62% (from 1097 to 420), and total cache lookups decreased by 47% (from 991 to 522) compared to Drupal 11.2. At the same time their partially-warm cache total query count dropped by 61% (from 696 to 274) and total cache lookups decreased by 34% (from 562 to 373).

Even more technical details

For further details on how these improvements happened, check out some of the core issues that introduced them, or watch Nathaniel Catchpole's DrupalCon Vienna presentation.

#1237636: Lazy load multiple entities at a time using fibers
#2620980: Add static and persistent caching to ContentEntityStorageBase::loadRevision()
#3496369: Multiple load path aliases without the preload cache
#3537863: Optimize field module's hook_entity_bundle_info() implementation
#3538006: Optimize EntityFieldManager::buildBundleFieldDefinitions()
#3526080: Reduce write contention to the fast and consistent backend in ChainedFastBackend
#3493911: Add a CachedPlaceholderStrategy to optimize render cache hits and reduce layout shift from big pipe
#3526267: Remove core/drupal.ajax dependency from big_pipe/big_pipe
#3506930: Separate hooks from events
#3505248: Ability to preload frequently used cache tags (11.2.x)

There is more to do for Drupal 11.4!

If you'd like to help 11.4 become even faster yet, check out core issues tagged with 'performance' and try to help us get them done. We have multiple issues in progress that didn't quite make it into 11.3.0 but could form the basis of another significant set of improvements in 11.4.0.

Drupal Core News: Native HTMX in Drupal 11.3.0: Rich UX with up to 71% less JavaScript

Drupal Planet -

Drupal developers have always faced the dilemma of building classic multi-page applications or headless solutions with a modern JavaScript stack, especially when they need to build UIs that feel fast and are highly reactive. While there were some Drupal-specific solutions for parts of this need (Form State API, AJAX API, and BigPipe), these were dated, solved only very specific use cases, and were comparatively heavy in implementation.

HTMX is a tiny, dependency-free, and extensible JavaScript library that allows you to access modern browser features directly from HTML, rather than using extensive amounts of JavaScript. It essentially enables you to use HTML to make AJAX requests, CSS transitions, WebSockets, and Server-Sent Events (SSE) directly.

As a replacement for Drupal's mostly home-grown solutions, HTMX reduced the loaded JavaScript size by up to 71% for browser-server interactions, including HTML streaming with BigPipe, while enabling a whole set of new functionality at the same time.

HTMX was originally created by Carson Gross. The motivation was to provide a modern, yet simple, way to build dynamic user interfaces by leveraging the existing capabilities of HTML and the server-side architecture, effectively offering an alternative to the complexity of heavy, client-side JavaScript frameworks. By sending less data (HTML fragments instead of large JSON payloads and complex client-side rendering logic), HTMX often results in faster perceived performance and less bandwidth consumption. It is being adopted by developers across diverse ecosystems.

Principles of HTMX

HTMX operates on a few core, simple principles, all expressed via HTML attributes. While pure HTMX does not require the data- prefix, Drupal uses it to achieve valid HTML. That is how you'll see it used in Drupal, so we'll use that notation in this post.

  1. Any Element Can Make a Request: Unlike standard HTML forms and anchors, HTMX allows any element (a <div>, a <span>, a <button>) to trigger an HTTP request. All five HTTP verbs are available.
    • Attributes: data-hx-get, data-hx-post, data-hx-put, data-hx-delete, data-hx-patch.
  2. Any Event Can Trigger a Request: You are not limited to click (for anchors/buttons) or submit (for forms). Requests can be triggered by any JavaScript event, such as mouseover, keyup, or a custom event.
    • Attribute: data-hx-trigger.
  3. Any Target Can Be Updated: By default, HTMX replaces the inner HTML of the element that triggered the request. However, you can use a CSS selector to specify any element on the page to be updated with the response HTML.
    • Attribute: data-hx-target.
  4. Any Transition Can Be Used: HTMX allows you to define how the new content is swapped into the target element (e.g., replace, prepend, append, outerHTML) and works with the new View Transition API.
    • Attribute: data-hx-swap.
Short Code Example

This Drupal-independent example demonstrates how to fetch and swap new content into a div when a button is clicked, without writing any custom JavaScript.

<!-- The button that triggers the request -->
<button data-hx-get="/clicked"
        data-hx-target="#content-area"
        data-hx-swap="outerHTML">
  Load New Content
</button>

<!-- The area that will be updated -->
<div id="content-area">
  This content will be replaced.
</div>

In this code example, when the button is clicked:

  1. A GET request is made to the server at the URL: /clicked
  2. The server responds with a fragment of HTML (e.g., <div>New Content Loaded!</div>)
  3. The #content-area element itself is replaced by the response, because the swap style is outerHTML.
Introducing HTMX in Drupal 11.3.0

HTMX was added as a dependency to Drupal core in 11.2, but is now fully featured in 11.3. A new factory class is provided for developers building or extending render arrays. The Htmx class provides methods that build every HTMX attribute or response header, therefore documenting and exposing the HTMX API to Drupal.

Drupal 11.3 also extends the FormBuilder class to support dynamic forms built with HTMX. When a form is rebuilt from an HTMX request, all the form values will be available to the form class for dynamically restructuring the form. Here’s an example of both features:

function buildForm(array $form, FormStateInterface $form_state) {
  $make = ['Audi', 'Toyota', 'BMW'];
  $models = [
    ['A1', 'A4', 'A6'],
    ['Landcruiser', 'Tacoma', 'Yaris'],
    ['325i', '325ix', 'X5'],
  ];
  $form['make'] = [
    '#title' => 'Make',
    '#type' => 'select',
    '#options' => $make,
  ];
  $form['model'] = [
    '#title' => 'Models',
    '#type' => 'select',
    '#options' => $models[$form_state->getValue('make', 0)] ?? [],
    // We'll need that later.
    '#wrapper_attributes' => ['id' => 'models-wrapper'],
  ];
  return $form;
}

(new Htmx())
  // An empty method call uses the current URL.
  ->post()
  // We select the wrapper around the select.
  ->target('#models-wrapper')
  // And replace the whole wrapper, not simply updating the options in
  // place, so that any errors also display.
  ->select('#models-wrapper')
  // We replace the whole element for this form.
  ->swap('outerHTML')
  ->applyTo($form['make']);

In this form, whenever the make selector is changed, the models selector will be updated.

Drupal 11.3 also adds a dedicated renderer and associated wrapper format that can be used to keep the response to an HTMX request as small as possible. This renderer returns only the main content and its CSS/JavaScript assets. There are two ways to take advantage of this renderer.

One is to add an attribute to the HTMX enhanced element, which will cause the wrapper format to be used:

(new Htmx())
  ->post()
  ->onlyMainContent()
  ->target('#models-wrapper')
  ->select('#models-wrapper')
  ->swap('outerHTML')
  ->applyTo($form['make']);

There is also a new route option that can be used when creating a route specifically to service HTMX requests. This route option will also be useful with the dynamic routes in Views as we refactor to use HTMX.

demo.route_option:
  path: '/htmx-demo/route-option'
  defaults:
    _title: 'Using _htmx_route option'
    _controller: '\Drupal\module_name\Controller\DemoController::replace'
  requirements:
    _permission: 'access content'
  options:
    _htmx_route: TRUE

Drupal is still committed to supporting decoupled architectures

HTMX is an excellent solution for progressive enhancement and dynamic front-ends. It is a powerful tool in the core toolkit, not a replacement for the flexibility offered by a fully decoupled backend. Drupal remains committed to supporting decoupled and headless architectures, especially where necessary, such as mobile applications, client-side state management, deep offline capabilities, etc.

LostCarPark Drupal Blog: Advent Calendar day 17 – Curious Findings

Drupal Planet -


For today’s door of the Drupal Advent Calendar, we are joined by John Picozzi, who wants to share the keynote from this year’s New England Drupal Camp. John writes…

What sparks curiosity? This is the question my friend Jason Pamental tackled in this year’s keynote talk, “Curious Findings.” Jason invites us to approach design and digital experiences not as a set of fixed goals, but as a journey guided by questions, hunches, and open-ended exploration. As we follow along, we see how embracing uncertainty — and giving ourselves permission to poke around the edges of the unknown — can unlock…


Phase2 Earns Gold-Level Partner Status with Optimizely

Phase II Technology -


ARLINGTON, VA, December, 2025 – Phase2, an enterprise technology partner for digital modernization, emerging innovation, and purpose-built AI products and frameworks, has been elevated by Optimizely to Gold Partner status.

Through this accomplishment, Phase2 joins as a qualified, value-add seller of the Optimizely Digital Experience Platform. With a national footprint and deep expertise in health and wellness, Phase2 brings proven strategy, design, and technology leadership to organizations adopting Intelligent Digital Experience leveraging Optimizely solutions.

“Achieving Gold Partner status with Optimizely reflects our dedication to helping clients in healthcare and other regulated industries unlock the full value of their digital investments,” said Chris Jorgenson, Director of Strategic Partnerships at Phase2. “By combining our deep expertise in highly regulated industries with the power of Optimizely One, we’re able to deliver best-in-class digital experiences that improve engagement, streamline operations, and create measurable business impact. This partnership strengthens our ability to help organizations not only keep pace with evolving consumer expectations, but lead with innovation.”

Optimizely’s Digital Experience Platform, known as Optimizely One, provides a wide range of powerful features for marketers, including industry-leading content management, content marketing, commerce, data and personalization tools, and web and feature experimentation functionality, all in an easy-to-use and fully integrated suite. A commissioned Total Economic Impact™ study conducted by Forrester Consulting on behalf of Optimizely found that over three years, a composite organization realized a 370% return on investment (ROI) and $9.84 million in net present value (NPV). Optimizely also generated $1.1 million in savings due to increased developer productivity as a result of deploying the company’s DXP.

Phase2 has successfully earned seven Opal Developer certifications as part of Optimizely's early adopter program and is contributing to the development of the tool via agent creation. With Opal, Optimizely's smart AI assistant, embedded across the entire marketing lifecycle, teams can work faster, better, and smarter through generative AI, smart insights, and automated recommendations.

With a network of over 700 partner companies in 30 countries, Optimizely seeks to connect with qualified partners whose firms possess a wealth of experience, team members with a creative outlook, global reach, and a collective eye toward future opportunities to ensure mutual customers are successful in the short and long term.

“Optimizely is thrilled to continue our relationship with a first-class partner like Phase2 who has now earned Gold status,” said Jessica Dannemann, Chief Partner Officer at Optimizely. “We look forward to seeing what our customers can achieve with Optimizely and Phase2 at their side.”

About Phase2
Phase2 is your technology partner for digital modernization, emerging innovation, and purpose-built AI products and frameworks. Over the last 20 years we have partnered with enterprise organizations at the forefront of their industries who strive to change how the world does business and how society is served. Through solutions that demand both deep expertise and advanced AI capabilities, we help organizations leap ahead with automatic, fast, safe, and scalable solutions that have never been possible before. Learn more at phase2.io.

About Optimizely
At Optimizely, we're on a mission to help people unlock their digital potential. We do that by reinventing how marketing and product teams work to create and optimize digital experiences across all channels. With Optimizely One, our industry-first operating system for marketers, we offer teams flexibility and choice to build their stack their way with our fully SaaS, fully decoupled, and highly composable solution. We help companies around the world orchestrate their entire content lifecycle, monetize every digital experience and experiment across all customer touchpoints – all through Optimizely One, the leading digital experience platform that powers every phase of the marketing lifecycle through a single, AI-accelerated workflow.

Optimizely has nearly 1,500 employees across our 21 global offices and has 700+ partners. We are proud to help more than 10,000 businesses, including H&M, PayPal, Zoom, and Toyota, enrich their customer lifetime value, increase revenue and grow their brands. At Optimizely, we live each day with a simple philosophy: large enough to serve, small enough to care. Learn more at optimizely.com.
 


LakeDrops Drupal Consulting, Development and Hosting: ECA Use Case: Authentication

Drupal Planet -


This article explores how ECA (Event-Condition-Action) can handle common authentication workflows in Drupal, including access denied redirects, user registration forms, and post-login actions. It demonstrates how ECA models can replace multiple contributed modules while offering greater flexibility — such as role-based redirects after login, hiding unnecessary password fields during account creation, and automatically assigning roles based on email domains. The key benefits include fewer dependencies, easier customization, simpler upgrades, and self-documenting configuration. However, ECA still needs improvement in discoverability and usability to become accessible to all Drupal site builders.

joshics.in: Beyond the Request: Best Practices for Third-Party API Integration in Drupal

Drupal Planet -


As businesses continue to innovate in the digital space, no website is an island. Whether it’s pulling payment data from Stripe, syncing leads with Salesforce, or fetching live race results from a sports data provider, your Drupal site almost certainly needs to communicate with external services.

Drupal 10 (and the upcoming 11) is a powerful platform for these integrations, but simply connecting to an API isn’t enough. Poorly built integrations can result in a fragile system, where a third-party service outage brings your entire site down.

At Joshi Consultancy Services, we’ve seen the difference between "it works" and "it scales." Here’s how we ensure our API integrations are robust, reliable, and future-proof.

01

Embrace the Guzzle Client

Don’t resort to raw cURL. Since Drupal 8, the Guzzle HTTP client has been bundled in core. It’s a robust, standards-compliant client that simplifies API interactions and offers better extensibility.

Why it matters: Guzzle allows us to standardize outgoing requests across your site. We can easily add middleware for tasks like logging, authentication, and debugging without redoing the connection logic for every API call. This leads to cleaner, maintainable code.
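A minimal sketch of the pattern (the endpoint URL is a placeholder; in real code you would inject the http_client service rather than call \Drupal statically):

// Core's http_client service is a preconfigured Guzzle client.
$client = \Drupal::httpClient();
$response = $client->request('GET', 'https://api.example.com/v1/leads', [
  'timeout' => 5,
  'headers' => ['Accept' => 'application/json'],
]);
$data = json_decode((string) $response->getBody(), TRUE);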

02

Never Hardcode Credentials

It's tempting to paste your API keys directly into the code or configuration settings to get things up and running quickly. But this creates a serious security risk, exposing sensitive credentials in code repositories or database backups.

The Solution: We use the Key module to securely store API credentials outside the web root. The module references API keys from environment variables or secure locations, ensuring they remain hidden from unauthorized access.
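Retrieving a credential then looks roughly like this ('stripe_secret' is a hypothetical key machine name):

// Look up the key entity by machine name and read its value.
$key = \Drupal::service('key.repository')->getKey('stripe_secret');
$api_key = $key ? $key->getKeyValue() : NULL;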

03

Caching is Non-Negotiable

External APIs can be slow, and relying on them for every page load will degrade your site's performance. Moreover, many APIs impose rate limits (e.g., “1000 requests per hour”), making it crucial to minimize the number of calls.

Best Practice: Decouple the view from the request.

  • When we fetch data, we store it in Drupal’s Cache API.
  • Subsequent page loads fetch the cached data, resulting in faster load times.
  • We set a “Time to Live” (TTL) for the cached data based on business needs.

Result: Your site stays fast, and you don’t exceed API rate limits.
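Here is a minimal sketch of that cache-first pattern (the cache ID, TTL, and fetch function are illustrative):

$cid = 'my_module:live_results';
if ($cache = \Drupal::cache()->get($cid)) {
  $results = $cache->data;
}
else {
  // Expensive external call: only happens on a cache miss.
  $results = my_module_fetch_results_from_api();
  // Keep for five minutes; tune the TTL to business needs.
  \Drupal::cache()->set($cid, $results, \Drupal::time()->getRequestTime() + 300);
}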

04

Fail Gracefully

What happens if the third-party API goes down? Does your site crash with a “500 Error” or a blank screen?

Defensive Coding: We wrap all API requests in try/catch blocks. If an external service times out or returns a 404, we handle it gracefully. The user might see old cached data or a friendly message like “Live data is temporarily unavailable” instead of a crash.
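In code, that graceful fallback might look like this sketch (names are illustrative; the second argument to get() allows returning expired cache items):

try {
  $response = \Drupal::httpClient()->request('GET', $url, ['timeout' => 3]);
  $results = json_decode((string) $response->getBody(), TRUE);
}
catch (\GuzzleHttp\Exception\GuzzleException $e) {
  \Drupal::logger('my_module')->warning('API unavailable: @msg', ['@msg' => $e->getMessage()]);
  // Serve stale cached data if available, instead of crashing.
  $cache = \Drupal::cache()->get('my_module:live_results', TRUE);
  $results = $cache ? $cache->data : NULL;
}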

05

Use the Queue API

Certain tasks should not block the user experience. If an action takes longer than a couple of seconds, it shouldn’t be performed while the user waits for the page to load.

Example: If a user submits a form and the data needs to be sent to multiple third-party services (CRM, ERP, marketing platform), don’t make them wait for each one.

The Solution: We use Drupal’s Queue API to handle time-consuming tasks in the background. The user’s submission is saved immediately, while a background process (using Cron) picks up the task and sends the data to the external APIs without blocking the user’s experience.
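Both halves of that pattern, sketched with a hypothetical my_module_crm_sync queue:

// At submission time: enqueue the work instead of calling the services inline.
\Drupal::queue('my_module_crm_sync')->createItem([
  'submission_id' => $submission_id,
  'payload' => $values,
]);

// A QueueWorker plugin then processes items in the background on cron.
/**
 * @QueueWorker(
 *   id = "my_module_crm_sync",
 *   title = @Translation("Sync submissions to external services"),
 *   cron = {"time" = 30}
 * )
 */
class CrmSync extends \Drupal\Core\Queue\QueueWorkerBase {

  public function processItem($data) {
    // Send $data['payload'] to the CRM, ERP, or marketing platform here.
  }

}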

Final Thoughts

API integration is straightforward, but resilient integration requires careful planning. By treating external APIs as unreliable services that need to be managed, cached, and secured, we ensure your Drupal site remains robust, even when things go wrong on the other side of the connection.

Are you struggling with slow API integrations or need to connect your Drupal site to complex third-party services? Let’s discuss how to architect a solution that scales, ensuring both performance and reliability.

LostCarPark Drupal Blog: Advent Calendar day 16 – Drupal CMS now and beyond

Drupal Planet -


Yesterday’s door was looking back at the launch of Drupal CMS 1.0, and today we look forward to CMS 2.0, which currently has an Alpha release available.

It is expected to launch early in 2026, though I suspect it won’t make it for Drupal’s 25th birthday.

At DrupalCon Nara, Cristina Chumillas and Pamela Barone talked about developments since the first release, and how community and company-backed contributions have increased significantly, strengthening the ecosystem. CMS 2.0 builds on this momentum, prioritising usability, and enabling non-technical users to build sites in hours rather than…


Drupal.org blog: GitLab CI: Drupal's strategy to empower a whole ecosystem

Drupal Planet -

In this post, we share our CI strategy for all Drupal contributed modules. We believe that other large open-source projects may want to adopt or learn from the way we implemented a solution to centrally-manage CI while still allowing per-project customization.

How do Drupal contributed modules do CI today?

Let's give some more details about how we got here.

The past

In summer 2023, only two and a half years ago, we enabled GitLab CI for the Drupal ecosystem, which includes all contrib modules and Drupal core. We announced and promoted it at DrupalCon Lille 2023.

This new system entirely replaced DrupalCI, the custom testing solution that Drupal core and contrib projects had used for nearly 10 years.

Core tests went from taking nearly 1h to taking 10 minutes. Adoption for contrib modules was as easy as adding a six-line file to their project.

The present

Core continued to evolve at its own pace, and the CI runs are now down to 5 minutes. They’ve been able to leverage concurrency, caching, and many other features available on GitLab CI.

Contrib modules also saw significant changes to improve their quality. Adoption was continuously growing, and the standard templates really took off, adding many new features.

As of today, we have more than 2000 contrib projects using GitLab CI.

Jobs

We offer, without maintainers writing a single line of code, the same type of tests and checks that core does.

These are: Composer lint, PHPCS, CSpell, PHPStan, ESLint, Stylelint, Nightwatch, PHPUnit, Test-only.

In addition to those, we also have: Upgrade status, Drupal CMS, GitLab Pages.

Checks like “Upgrade status” or “Drupal CMS” compatibility are key for our contrib modules, and they are available out of the box.

Also, the GitLab Pages job allows modules to publish a full documentation site based on their markdown files. If the files are there, the documentation site will be published. An example of this is our own documentation site for the shared CI templates: https://project.pages.drupalcode.org/gitlab_templates

Most of these jobs will offer artifacts that can be downloaded by maintainers to fix the issues reported.

Customizations

Most of the above jobs can be disabled, if they are not wanted, with only a few lines of code (set the relevant variables to 0).

We can also test multiple versions of Drupal, like the next or previous minors or majors, again with a few lines of code (set the relevant variables to 1).
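For example, a project's .gitlab-ci.yml might add the following after the include (these variable names follow the gitlab_templates conventions; treat the exact set as illustrative and check the template documentation):

variables:
  # Disable a job we don't want.
  SKIP_ESLINT: '1'
  # Opt in to testing against the previous core minor.
  OPT_IN_TEST_PREVIOUS_MINOR: '1'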

We achieved this by extending base jobs that can be configured via variables, like this:

composer:
  extends: .composer-base
  variables:
    DRUPAL_CORE: $CORE_STABLE
    IGNORE_PROJECT_DRUPAL_CORE_VERSION: 1

composer (max PHP version):
  extends: .composer-base
  rules:
    - *opt-in-max-php-rule
    - *check-max-php-version-rule
    - when: always
  variables:
    PHP_VERSION: $CORE_PHP_MAX
    DRUPAL_CORE: $CORE_STABLE
    IGNORE_PROJECT_DRUPAL_CORE_VERSION: 1

composer (previous minor):
  extends: .composer-base
  rules:
    - *opt-in-previous-minor-rule
    - when: always
  variables:
    DRUPAL_CORE: $CORE_PREVIOUS_MINOR
    IGNORE_PROJECT_DRUPAL_CORE_VERSION: 1

We always keep up with the latest core releases, so maintainers don’t need to change anything to test the latest core versions. But if they want to “fix” the versions tested so these don’t change, they can pin the version of the templates that they are using with just one line of code.

They can choose which PHP version or database engine to run tests with.

External integrations

The contrib templates can be used in external instances. This is actually a five-line file (similar to the one mentioned above), but the integration remains the same. We have several community members using the templates in their own GitLab instances with their own company projects, and everything works the same.

The future

Ever since we made the switch, it has positively shaped contribution to Drupal. Module standards are very much aligned with core standards. We get really quick in-browser feedback about what to fix; we no longer need to upload extra (test-only) patches, etc.

The possibilities are endless, and we continue looking at the future as well. We are always open to hearing about improvements. For example, only recently, thanks to suggestions from the community, we added Drupal CMS compatibility check and support for recipes.

We are also checking if we can convert some of the jobs to reusable GitLab CI components (they weren’t stable when we launched the templates).

All in all, the future looks bright, and we are really glad that we made this move as part of our broader GitLab initiative. 

How other open source projects can adopt a similar solution (aka "implementation details")

Whether you have an open source project and want to do something similar, or you are just curious, here are some of the details about how we implemented this for the Drupal Ecosystem.

We had several goals in mind, some of them as must-haves, some of them as nice-to-haves. The must-haves were that it needed to be easy to adopt, and that it should allow the same functionality as the previous system. The nice-to-haves were that it would be easy to iterate and push changes to all projects using it, without project interaction, and that we could easily add new features and turn them on/off from a central place.

At the time, GitLab components were still in the works and didn't have a timeline to be stable, so we needed to consider which other options were available. GitLab has the include functionality, which allows including external YAML files in a project's CI configuration. This was our starting point.

Template inheritance

We control the templates centrally at the GitLab Templates project. In there, you can see a folder called "includes", and those are the files that projects include. That's it! To make this easier, we provide a default template that gets prepopulated in GitLab and that contains the right "includes" in the right places. The six-line template is here.

You can create a ".gitlab-ci.yml" file in the repo and add these:

include:
  - project: $_GITLAB_TEMPLATES_REPO
    ref: $_GITLAB_TEMPLATES_REF
    file:
      - '/includes/include.drupalci.main.yml'
      - '/includes/include.drupalci.variables.yml'
      - '/includes/include.drupalci.workflows.yml'

From that moment on, the project "inherits" all the templates (that we control centrally) and will start running the above CI jobs automatically.

You can see that there are three main files: one with variables, one with global workflow rules, and one containing all the jobs.

That is just the base. Each project can deviate, configure, or override any part of the template as desired, giving them flexibility that we might not be able to accommodate centrally. 

We created extensive documentation and generated a GitLab Pages site to help with this: https://project.pages.drupalcode.org/gitlab_templates.

Should you want to include this in any other external GitLab instance, you just need to adapt the above to be fully qualified links as explained in our documentation page here.

As mentioned before, we can push a change (e.g. a bug fix or a new feature) centrally, and as long as projects reference our files, they will automatically receive the changes. This gives us great flexibility and extensibility, and best of all, maintainers don't need to worry about it as it is automatic for their projects.

We define variables that control the Drupal versions to test against, the workflow rules that determine which jobs run and under which conditions, and most important of all, the logic for all the jobs run in the pipelines.

We did it this way because it was the solution that delivered all the must-haves and all the nice-to-haves. It allows literally thousands of projects to benefit instantly from shared CI checks and integration while barely writing any code.

Versioning

We don't need a complex system for this, as the code is relatively small and straightforward compared to other projects, but we realised early that we needed some versioning, because pushing the latest changes to everybody was risky should a bug or unplanned issue arise.

We document our versioning system in the "Templates version" page. We use semver tagging, but we only maintain one branch. Depending on the changes introduced since the last tag, we increment X.Y.Z (X for breaking changes, Y for new features, Z for bug fixes), and we also generate a set of tags that allow maintainers to pin specific versions or use moving tags within the same major or minor. You can see the tagging script we use here.

Excerpt:

# Compute tags.
TAG="$1"
IFS=. read major minor micro <<<"${TAG}"
MINOR_TAG="${major}.${minor}.x-latest"
MAJOR_TAG="${major}.x-latest"
...
echo "Setting tag: $TAG"
git tag $TAG
git push origin $TAG
...
echo "Setting latest minor tag: $MINOR_TAG"
git tag -d $MINOR_TAG || TRUE
git push origin --delete $MINOR_TAG || TRUE
git tag $MINOR_TAG
git push origin $MINOR_TAG
...
echo "Setting latest major tag: $MAJOR_TAG"
git tag -d $MAJOR_TAG || TRUE
git push origin --delete $MAJOR_TAG || TRUE
git tag $MAJOR_TAG
git push origin $MAJOR_TAG

This process has been working well for us for around 2 years already.

Pushing changes to all contributed projects

Once the above versioning system was implemented, it was easier and quicker to iterate, and it also gave maintainers a chance to pin things. We normally push changes to the "main" branch, so all users wanting the latest changes can both benefit from them and also help us discover any possible regressions.

Once we are happy with the set of changes since the last tag, we can create new tags that maintainers can reference. And once we are confident that a tag is stable enough, we have a special tag named "default-ref": all we need to do is change it to point to the specific stable version we want. Once we do, all contributed projects using the default setup automatically receive that version.

The script that we use to set the default tag can be seen here.

Excerpt:

TAG="$1" DEFAULT_TAG="default-ref" echo "Setting default tag to be the same as: $TAG" # Checkout the tag. git checkout $TAG # Override the default one. git tag -d $DEFAULT_TAG || TRUE git push origin --delete $DEFAULT_TAG || TRUE git tag $DEFAULT_TAG git push origin $DEFAULT_TAG # Back to the main branch. git checkout main Implement it in your project

In the spirit of open source, we've documented the overarching strategy we used so that other teams fostering open source projects can adopt similar principles. We wanted to share how we did it, in case it helps any other project.

The key is to have a central place where you can control the default setup, and from there on, let projects decide what's best for their needs. They can stick to the default and recommended setup, but they could deviate from it should they need to.

Talking Drupal: Talking Drupal #532 - AI Marketing and Stuff

Drupal Planet -

Today we are talking about AI Marketing, Marketing Trends, and the caber toss with guest Hayden Baillio. We'll also cover Drupal core 11.3 as our module of the week.

For show notes visit: https://www.talkingDrupal.com/532

Topics
  • AI in Marketing: Hayden's Insights
  • The Role of AI in Content Creation
  • Challenges and Ethical Considerations of AI
  • AI Training Data and Bias
  • AI in Security Testing
  • AI Replacing Jobs
  • The Future of Marketing with AI
  • Highland Games and Personal Hobbies
Resources Guests

Hayden Baillio - hounder.co hgbaillio

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan John Picozzi - epam.com johnpicozzi Fei Lauren - feilauren

MOTW Correspondent

Martin Anderson-Clutz - mandclu.com mandclu

  • Brief description:
    • Have you been wanting a version of Drupal core that moves away from the hooks system, has PHP 8.5 support, or has better support for asynchronous queries? The newly released Drupal core 11.3 has all these and more.
  • Module name/project name:
  • Brief history
    • Created in the last few days (hopefully) by the time this episode is released
  • Changes
    • Performance improvements
    • New MySQLi database driver. In combination with the PHP Fibers support added in Drupal 10.2, this should allow Drupal sites to run much faster. Not all hosting environments will have PHP configured to work with the new driver, so for now it ships in an experimental core module that you will need to install to try it
    • Drupal can now lazy load multiple entities at a time using Fibers
    • PHP 8.5 support should also improve performance, as will a number of caching improvements
    • Some early testing in the community indicates some significant improvements for pages loaded from cold cache, anywhere from 30 to 40% fewer queries
    • One of the significant changes in Drupal core 11.2 was the addition of HTMX as the intended successor to Drupal's older AJAX system. Drupal core 11.3 includes some significant steps on the path to replacing all the places that use the AJAX system in core
    • There's a new HTMX factory object with methods to abstract the specifics of the attributes and headers needed to implement HTMX
    • HTMX is now used for the Form Builder and ConfigSingleExportForm
    • BigPipe no longer uses the older AJAX API, which itself uses jQuery
    • New Workspace Provider concept, will be interesting to see what new possibilities this creates
    • New administer node published status permission, previously required the much broader "administer nodes" permission
    • Drupal core 11.3 also includes some capabilities that previously required contrib modules
    • Links created within CKEditor5 now dynamically link to the entity and when rendered will automatically point to the most recent alias. Previously Drupal sites needed the Linkit module, which has been part of Drupal CMS since its release at the start of the year
    • Drupal CMS is also heavily based on Drupal's recipe system, which includes the ability to automatically import content included within a recipe. Until now you still needed the default_content module to export content as YAML for inclusion in a recipe. With Drupal 11.3 you can export all entities of a particular type, optionally filtered by bundle, and optionally including all dependencies
    • Many of Drupal's remaining hooks, particularly those for themes, now have OOP class replacements, so we're now very close to being able to deprecate .module and .theme files
    • Listeners may remember that the Navigation module was added as an experimental module in Drupal core 10.3. In 11.3, the module is now officially stable, so the rethought admin menu that originally debuted as part of the Gin admin theme is now fully realized in Drupal core
    • SDCs can now be marked to be excluded from the UI, for example if they are meant to only be nested within other components
    • Drupal core 11.3 also introduces some new deprecations:
    • Migrate Drupal and Migrate Drupal UI officially deprecated now that Drupal 7 is EOL
    • Also field_layout, which was ultimately superseded by Layout Builder
    • Promoted and Sticky fields are now hidden by default (an issue created more than 20 years ago! A five-digit issue ID) - the user who created it had a drop.org username lol
    • Another issue that sets the "Promoted" default value to FALSE for new content types was also resolved, at a mere 15 years old. It had a six-digit issue ID - barely!
    • Theme engines have been deprecated!
    • This may be the last feature release of Drupal core before version 12, which could drop as early as June 2026
    • We'll include a link to the release highlights, but by the time you hear this there should also be an official announcement from Gabor and the DA with additional details
