Drupal feeds

Dries Buytaert: Connecting Drupal with Activepieces

Drupal Planet -

Activepieces is an open source workflow automation platform, similar to Zapier or n8n. It connects different systems so they can work together in automated workflows. For example, you might create a workflow where publishing a Drupal article automatically creates a social media post, updates a Google Sheet, and notifies your team in Slack.

There are two main ways to run Activepieces:

  • Activepieces Cloud: The easiest option for production use or for evaluating Activepieces. The limitation is that it cannot reach Drupal sites running on your localhost.

  • Run Activepieces locally: Useful when you are developing or testing Drupal integrations. There are two ways to do this:

    1. Docker environment: If you are developing Drupal sites locally with tools like DDEV, the easiest option is to run Activepieces locally using Docker so both can communicate easily. See running Activepieces locally with Docker.

    2. Development environment: If you want to modify the Activepieces codebase or contribute to the Drupal Piece, you will need the full development toolchain. See setting up the Activepieces development environment.

Once you have Activepieces running, you'll want to connect it to your Drupal site. This note explains two ways to do that: a basic integration using Drupal's built-in APIs, and an advanced setup that unlocks deeper automation capabilities.

Setting up basic integration

You can connect Drupal with Activepieces without installing any extra Drupal modules.

Drupal ships with JSON:API support, a REST API that exposes your content and data through HTTP requests. This means Activepieces can query your content, fetch individual nodes, explore field definitions, and follow entity relationships without any custom code.

While JSON:API is part of Drupal Core, it may not be enabled yet. You can enable it with:

drush pm-enable jsonapi -y

Next, set up a dedicated Drupal user account with only the permissions needed for what you want Activepieces to do.

Activepieces can then connect to Drupal over Basic Authentication using that account's username and password.

Basic Auth sends credentials with each request, which makes it simple to set up. For production environments, I recommend using a more secure authentication method like OAuth, though I have not tried that yet.

Drupal Core comes with a Basic Auth module, but you might also need to enable it:

drush pm-enable basic_auth -y
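
With both modules enabled, you can sanity-check the setup from the command line before configuring Activepieces. This is a sketch: the base URL, the automation_user account, and the "article" content type are assumptions you should replace with values from your own site.

```shell
# Sanity-check JSON:API + Basic Auth before wiring up Activepieces.
# BASE_URL, automation_user, and the "article" content type are
# placeholders; substitute your own values.
BASE_URL="https://mysite.ddev.site"
JSONAPI_URL="$BASE_URL/jsonapi/node/article?page[limit]=5&sort=-created"

# -g disables curl's URL globbing so the [limit] brackets are sent literally;
# || true keeps this sketch harmless if the site is unreachable.
curl -g -s -u automation_user:secret \
  -H "Accept: application/vnd.api+json" \
  "$JSONAPI_URL" || true
```

If the request returns a JSON:API document rather than a 401 or 403, the account and its permissions are set up correctly for Activepieces to use.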

Once both modules are enabled, you can create a connection to Drupal from within Activepieces. In the Activepieces interface, drag a Drupal trigger or action onto the canvas, and you'll be prompted to set up the connection.

Setting up advanced integration

For more advanced scenarios, we created the optional Orchestration module. Installing it unlocks deeper integrations that let external systems trigger Drupal ECA workflows, use Drupal AI agents, call Drupal Tools, and more.

The module is organized using specialized submodules, each connecting to a different part of Drupal's ecosystem. You can pick and choose the capabilities you want to use.

For starters, here is how to install the Drupal AI and ECA integrations:

composer require drupal/orchestration drupal/ai drupal/ai_agents drupal/tool drupal/eca

drush pm-enable ai ai_agents tool eca orchestration_ai_agents orchestration_ai_function orchestration_tool orchestration_eca -y

Before you can use any of the AI agents, you also need to install and configure one or more AI providers:

composer require drupal/ai_provider_anthropic drupal/ai_provider_openai drupal/ai_provider_ollama

drush pm-enable ai_provider_anthropic ai_provider_openai ai_provider_ollama -y

Clear the cache:

drush cache-rebuild

With these modules installed, you can build much more sophisticated workflows that leverage Drupal's internal automation and AI capabilities.

Dries Buytaert: Setting up an Activepieces development environment

If you just want to use Activepieces with Drupal on your local development machine, the easiest option is to follow my guide on running Activepieces locally with Docker. That approach allows you to use Activepieces, but you can't make code changes to it.

If you want to contribute to the Drupal Piece integration or create a new Piece, the Docker setup won't work. To develop or modify Pieces, you'll need to set up a full Activepieces development environment, which this note explains.

First, fork the Activepieces repository on GitHub using the UI. Then clone your fork locally:

git clone https://github.com/YOUR-USERNAME/activepieces.git

Move into the project directory and install all dependencies:

cd activepieces
npm install

After the installation finishes, start your local development instance:

npm start

Open your web browser and go to http://localhost:4200.

Sign in with the default development account:

  • Email: dev@ap.com
  • Password: 12345678

This account is preconfigured so you can start building and testing custom Pieces right away.

The Drupal Piece code lives in ./packages/pieces/community/drupal. When you make changes to the code, they're automatically compiled and hot-reloaded, so you can see your changes immediately without restarting the development server.

To complete your setup, see my guide on connecting Drupal with Activepieces.

Troubleshooting common issues

I've run into a few issues while working with the Activepieces development environment. Here is what usually fixes them.

Start by deleting all caches:

rm -rf node_modules cache dev

This removes node_modules (all installed dependencies), cache (build and runtime caches), and dev (temporary development files).

Activepieces uses Nx, an open source build system for monorepos. If Nx's cache is out of sync, reset it to start with a clean slate for builds and tests:

npx nx reset

Dries Buytaert: Running Activepieces locally with Docker

For Drupal developers, Activepieces makes it easy to connect Drupal to other systems. Think of it as an open source alternative to tools like Zapier or n8n, but with an MIT license.

For example, you can create a workflow that runs when new content is published in Drupal and automatically sends it to Slack, Google Sheets, or social media. You can also trigger Drupal actions, such as creating new content or updating user data, when something changes in Salesforce, GitHub, or Airtable.

This guide covers running Activepieces locally using Docker. This setup is ideal if you're developing Drupal sites locally with DDEV and want to build workflows that connect to your local Drupal instance.

When you develop Drupal sites locally, Activepieces Cloud can't reach them. You could use a tunneling service like ngrok to expose your local environment to the internet, but that adds extra complexity.

Instead, we can run an open source copy of Activepieces locally using Docker. This gives you a fully configured Activepieces instance that can communicate directly with your local Drupal site. You can get up and running in just a few minutes with a single command.

Contributing to the Drupal Piece

In Activepieces, a Piece is an integration that connects to an external application or service. I helped build the original Drupal Piece, which now ships with Activepieces out of the box. It lets you create workflows that move data between Drupal and other applications.

If you want to contribute to the Drupal Piece, this Docker setup is not what you need. The Docker instance runs like a production environment. It's perfect for building and testing workflows in Activepieces, but it doesn't let you modify the Activepieces code or the Drupal Piece itself.

To make changes to Activepieces, including the Drupal Piece, you'll need to set up a full Activepieces development environment instead.

However, if your goal is simply to run Activepieces locally and connect it to your Drupal site, the Docker setup below is all you need.

Run Activepieces locally with Docker

This one-line command will download and run Activepieces on your computer:

docker run -d -p 8080:80 -v ~/.activepieces:/root/.activepieces -e AP_QUEUE_MODE=MEMORY -e AP_DB_TYPE=SQLITE3 -e AP_FRONTEND_URL="http://localhost:8080" activepieces/activepieces:latest

This pulls the latest Activepieces image from Docker Hub (if it isn't already cached) and starts a container with the following settings:

  • Runs in detached mode (-d)
  • Maps port 8080 on your computer to port 80 in the container
  • Persists data by mounting ~/.activepieces to the container
  • Uses in-memory queue processing and SQLite database
  • Sets the frontend URL to http://localhost:8080

It may take a couple of minutes for the container to boot and for Activepieces to come up. Once it has, navigate to http://localhost:8080 (not https) to create an account and log into your local instance.

To start using Activepieces with your Drupal site, you still need to connect them. See my guide on connecting Drupal with Activepieces.

Upgrading the Activepieces Docker container

Activepieces regularly releases new versions. The Docker instance on your local machine does not update itself automatically, so you'll want to manually upgrade it from time to time.

First, list your running containers to find the container ID for Activepieces:

docker ps

Next, stop that container by replacing <container-id> with the actual ID you found:

docker stop <container-id>

Finally, pull the latest Activepieces image from Docker Hub:

docker pull activepieces/activepieces:latest

Start a new container using the same docker run command from above. Your flows and settings remain intact because they're stored in the mounted ~/.activepieces directory.

The Drop Times: DrupalCon Nara 2025: Asia’s Drupal Community Unites in Japan’s Ancient Capital

Join the global Drupal community in the historic city of Nara, Japan, for DrupalCon Nara 2025. From 16–19 November 2025 at Hotel Nikko Nara, immerse yourself in bilingual English/Japanese sessions, hands‑on contribution days, and a city‑wide treasure hunt through a UNESCO World Heritage landscape—all tailored for Drupal users, developers and contributors across Asia and the world.

Drupal Association blog: Showcasing Drupal Excellence: Refreshed Industry Pages and a Renewed Commitment

We've overhauled Drupal's industry landing pages to better showcase the real-world impact of Drupal across critical business sectors. These refreshed pages represent a new, more strategic approach to how we position Drupal for enterprise audiences.

These redesigned industry pages create focused spaces where prospects in specific industries can see Drupal solving problems they recognize—at the scale and complexity they need. Instead of generic CMS messaging, decision-makers in retail, healthcare, government, and other sectors now find pages that speak directly to their pain points, featuring case studies from organizations facing similar challenges.

What's Changed

Curated excellence
We are moving away from allowing agencies to book slots and toward carefully selecting the best projects that demonstrate Drupal's capabilities. This means visitors see the most compelling case studies—recognized brands, innovative solutions, and clear business results that sell Drupal effectively.

Updated design and brand
The pages now reflect Drupal's updated brand and modern website design, presenting a professional, enterprise-grade appearance that matches the quality of the projects we showcase.

Industry-specific messaging
Each page features value propositions tailored to that industry's pain points, rather than generic CMS benefits. Retail pages talk about campaign velocity and Black Friday traffic. Healthcare pages address compliance and patient experiences. The messaging speaks directly to what matters in each sector.

Current Industry Coverage

The refreshed pages now cover:

  • Enterprise - Multi-brand governance and Fortune 500 scale
  • Government - Citizen services and public sector digital transformation
  • Education - Campus platforms and academic digital experiences
  • Nonprofit - Mission-driven organizations maximizing impact
  • Ecommerce - Commerce-driven digital experiences
  • Fintech - Financial services and secure digital banking
  • Healthcare - Patient experiences and healthcare digital transformation
  • Retail - Omnichannel retail and campaign velocity
  • Travel & Tourism - Destination marketing and travel experiences

Have ideas for new verticals or feedback on current pages?
Reach out to Ryan directly (ryan.witcombe@association.drupal.org)

How We Select Case Studies

To maintain quality and support the partners who support the Drupal project, we follow a clear selection process:

DCP exclusivity
Case studies featured on industry pages come exclusively from Drupal Certified Partners. These agencies support the Drupal project and allow us to maintain Drupal.org, create resources like these pages, and invest in the ecosystem. Featuring DCP work on these pages is one way we deliver value back to our partners.

Quality and credibility
We prioritize case studies that feature:

  • Well-known, trusted brands that prospects will recognize
  • Innovative approaches and technical sophistication
  • Clear business results and compelling transformation stories
  • Projects that best demonstrate Drupal's enterprise capabilities

Diversity and representation
Within each industry vertical, we aim for:

  • Geographic diversity (not all projects from one region)
  • A mix of project types and challenges
  • Different DCPs represented (avoiding concentration with one partner)
  • Variety in organization size and complexity
     

Regular review and updates
We review these pages quarterly to ensure they showcase the best current work. However, if an exceptional case study is posted to Drupal.org between reviews, we may add it immediately. This keeps the pages fresh while ensuring we never miss an opportunity to showcase outstanding work.

Also New: Monthly "Best of Drupal" Social Campaigns

The refreshed industry pages are part of a broader commitment to consistently showcasing Drupal excellence. We've also launched monthly "Best of Drupal" carousels on social media that highlight outstanding projects from across the community.

These monthly campaigns:

  • Celebrate exceptional work from DCPs and the broader Drupal community
  • Build momentum by regularly showcasing what Drupal can do
  • Create shareable content that partners can amplify through their own channels
  • Keep Drupal visible in social feeds where decision-makers spend time

Together, the industry pages and monthly social campaigns create a consistent drumbeat of Drupal excellence—making it easier for prospects to discover what's possible and for partners to demonstrate their expertise.

Get Involved

These pages showcase industries where we have strong case studies and proven success. To keep them fresh and expand coverage, we need:

  • Quality case studies from DCPs with recognized brands and clear results
  • Client quotes - We're looking for compelling testimonials from your clients—the actual site owners, CMOs, CTOs, and end users who experience Drupal daily. Quotes that speak to business impact, technical capabilities, or how Drupal solved their specific challenges add authenticity and credibility to industry pages. Submit quotes alongside your case studies or send them separately.
  • Your feedback on additional verticals that should be represented

Want your work featured? Maintain your DCP status, submit compelling case studies to Drupal.org with quantifiable results, and send us powerful quotes from your clients about their Drupal experience.

Not yet a Drupal Certified Partner? Becoming a DCP supports the Drupal project, gives you access to benefits like featured placement on these industry pages, and demonstrates your commitment to the Drupal ecosystem. Learn more about becoming a DCP.

Dripyard Premium Drupal Themes: Preparing Dripyard themes for Drupal Canvas

At Dripyard we’ve been preparing our premium Drupal themes for Canvas. If you haven’t heard, Drupal Canvas is Drupal’s next-generation page builder built to rival tools like Gutenberg, Webflow, and AEM.

With Canvas, Drupal’s page-building capabilities finally match its powerful content modeling system. It feels fresh, intuitive, and fast compared to previous approaches.

The Drop Times: How I Met Drupal: A Collective Portrait of Drupal’s Evolution

A LinkedIn prompt by The DropTimes asking "What was your first version of Drupal?" drew nostalgic replies that span over two decades. From experimental beginnings with Drupal 3.4 to modern adoption in versions 9 and beyond, practitioners shared how each version marked their entry point—and growth—in the Drupal ecosystem. These reflections reveal not only technical evolution, but also how each wave of contributors shaped the community.

Drupal AI Initiative: Building Smarter Drupal Sites with the amazee.ai AI Provider

Drupal has always been about flexibility and control. The amazee.ai AI Provider takes that same spirit and applies it to artificial intelligence. It lets you connect a Drupal site to powerful AI models in less than two minutes. No hidden dependencies and no waiting around for credentials to propagate. All of this is free for the first 30 days, so you can experiment, use recipes that require LLMs and vector databases, and build!

Fast, Open, and Built for Drupal

The provider installs on any site running Drupal 10.2 or higher. Once enabled, it connects you to enterprise-grade AI models and a vector database built directly into the service. There’s no need to configure an external database or manage API tokens across multiple vendors. Everything works inside your existing Drupal environment - no need to change your hosting provider.

It’s also open source and built by the Drupal community in partnership with amazee.ai. That means full transparency, data sovereignty, and no surprises about how your data is handled. You can choose processing regions in Switzerland, Germany, the US, or Australia to meet compliance needs without compromise. If you need a different region, just ask the amazee folks.

Try It Without Limits

Every new install comes with 30 days of unlimited AI tokens. That’s a full month to experiment, automate, and build without worrying about quotas. If you’re a developer contributing to Drupal AI, maintaining modules, or running trainings/workshops, you can request a developer account that gives you ongoing access at no cost.

When the trial ends, a regular account costs only $30 per month for a Pro Account, $100 per month for a Growth Account, and if you need more, amazee.ai can tailor an Enterprise account as well. It’s predictable, simple, and keeps you connected to the same infrastructure used for professional Drupal AI development.

In workshops, we’ve had participants install the provider, connect it, and build working AI features before the session break. The setup is fast enough that you spend time building, not troubleshooting. If you’re doing a talk, running a workshop, or conducting a training - reach out and we can explain how to spin up fully operational sites for you and your students in 2-3 minutes with no credit card.

Built for How the Community Works

The amazee.ai AI Provider was created to support Drupal’s open ecosystem. It’s maintained in public view, designed for collaboration, and made for people who want control over how AI runs on their sites. It works on any hosting platform, whether you’re using Acquia, Pantheon, Platform.sh, or a self-hosted stack.

It’s the easiest path yet to bring AI into Drupal without giving up data ownership or flexibility.

https://www.drupal.org/project/ai_provider_amazeeio

Centarro: Streamlining Purchases with URL-Based Cart Operations

The standard eCommerce workflow requires users to navigate to a product page and click an “Add to Cart” button. Simple. Direct. Most people are familiar with it. But for certain applications, you want to streamline this experience and remove as much friction as possible.

Membership renewals, email or SMS campaigns for specific products, quotes generated from sales reps, and embedded purchases within content. We developed the Commerce Cart Links module for these situations, and more.

The module exposes a /cart-links route that accepts product variation IDs, quantities, and optional parameters for controlling cart behavior and redirects. When a user visits a cart link URL, the module processes the specified product variations, adds them to a cart, and optionally redirects to a specified destination.
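
To make the route description above concrete, here is a purely hypothetical cart link. The parameter names below are illustrative assumptions, not the module's documented format; check the Commerce Cart Links project page for the real URL structure.

```shell
# Hypothetical cart link for illustration only: variation ID 42,
# quantity 2, redirect to checkout. Parameter names are assumptions,
# not the module's documented format.
CART_LINK="https://shop.example.com/cart-links/42?quantity=2&destination=/checkout"
echo "$CART_LINK"
```

A link like this could be dropped into a renewal email or SMS campaign, so the recipient lands with the right product already in their cart.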

Use cases

Membership renewal workflows

Membership organizations with tiered structures can use cart links to send direct purchase links to members during renewal periods. Each member receives a URL specific to their membership tier, bypassing the need for them to navigate your website.

This approach reduces support overhead from members purchasing incorrect membership tiers and streamlines the renewal process for organizations managing thousands of members.

Read more

The Drop Times: The State of Drupal Websites in 2025

The DrupalFit Challenge – Vienna Edition 2025 has offered a revealing snapshot of how Drupal websites perform today. Conducted by OpenSense Labs ahead of DrupalCon Vienna, the audit examined 148 sites across key areas—security, performance, SEO, and accessibility. The findings show that while many Drupal sites maintain strong technical foundations, accessibility and performance remain widespread challenges. With 84.5% of sites showing accessibility issues and 83.1% facing performance concerns, the report underscores where developers and agencies should focus their next improvements to keep Drupal websites fast, secure, and inclusive.

Talking Drupal: Talking Drupal #527 - AI in Drupal

Today we are talking about AI, New Drupal Features, and the future of AI in Drupal with guest Jamie Abrahams. We'll also cover Orchestration as our module of the week.

For show notes visit: https://www.talkingDrupal.com/527

Topics
  • Exciting Announcement: Object-Oriented Hooks in Themes
  • The Drupal AI Initiative
  • Canvas AI and Migration Challenges
  • AI Powered Features and Future Directions
  • AI's Role in Drupal vs. Other Platforms
  • Human in the Loop AI in Drupal
  • Canvas AI and Human Control
  • Challenges with Customizability and AI Integration
  • Transparency and Ethics in AI
  • Modernizing Drupal's Core for AI
  • Future of AI in Drupal
  • Community Engagement and Events
Resources

Guests

Jamie Abrahams - freelygive.io yautja_cetanu

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan John Picozzi - epam.com johnpicozzi Maya Schaeffer - evolvingweb.com mayalena

MOTW Correspondent

Martin Anderson-Clutz - mandclu.com mandclu

  • Brief description:
    • Have you ever wanted to expose Drupal's capabilities to external automation platforms? There's a module for that.
  • Module name/project name: Orchestration
  • Brief history
    • How old: created in Aug 2025 by Jürgen Haas of LakeDrops, in collaboration with Dries, who some of our listeners may be familiar with
    • Versions available: 1.0.0, which supports Drupal 11.2 or newer
  • Maintainership
    • Actively maintained
    • Security coverage
    • Documentation site
    • Number of open issues: 11 open issues, none of which are bugs
  • Usage stats:
    • 3 sites
  • Module features and usage
    • With the Orchestration module installed, external systems can trigger Drupal workflows, call AI agents, and execute business logic through a unified API
    • The module functions as a bi-directional bridge, so Drupal events like content updates, user registrations, or form submissions can also trigger external processing
    • Using the Orchestration module with the Activepieces automation platform was featured at about the one-hour mark in the most recent Driesnote, from DrupalCon Vienna, and we'll include a link to watch that in the show notes. In the complex example Dries shows, content is pulled from a WordPress site, AI evaluates whether each post meets certain criteria, and one of a couple of ECA functions is conditionally called, in addition to using AI to rewrite the incoming content to change WordPress terminology into Drupalisms
    • Under the hood Orchestration provides an endpoint that will return a JSON list of services, including the properties that are needed for each service. The external service also needs to provide the username and password for a Drupal account, so you can control what services will be available based on permissions for the Drupal user that will be used
    • Already Orchestration works with ECA, AI Agents, Tool API, and AI function calls
    • There is also work underway for integrations using webhooks, for integration platforms that aren't ready to directly support Drupal's orchestration services
    • In his presentation Dries mentioned that they are looking for feedback. Specifically, they would like feedback on what platforms should have integrations available
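
As a rough sketch of the discovery flow described above: an external platform authenticates with a Drupal account and fetches the JSON list of available services. The endpoint path and credentials below are assumed placeholders, not the module's documented route; consult the Orchestration module's documentation for the actual endpoint.

```shell
# Hypothetical discovery request: list the orchestration services visible
# to a given Drupal account. The /orchestration path and the credentials
# are assumed placeholders.
BASE_URL="https://mysite.ddev.site"
SERVICES_URL="$BASE_URL/orchestration"

# || true keeps this sketch harmless when no site is reachable.
curl -s -u automation_user:secret "$SERVICES_URL" || true
```

Because the request is authenticated as a regular Drupal user, the services returned can be scoped by that account's permissions, as described above.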

Security public service announcements: Normal Drupal core security window rescheduled for November 12, 2025 due to DrupalCon - PSA-2025-11-03

Date: 2025-November-03
Description:

The upcoming Drupal core security release window has been rescheduled from November 19, 2025 to November 12, 2025. As normal, the window will occur between 1600 UTC and 2200 UTC.

Schedule change for back-to-back DrupalCons

This schedule change is due to DrupalCons Vienna and Nara overlapping the October and November core security windows. We do not schedule core security windows during DrupalCons so that site owners and agencies can attend these conferences without having to worry about their sites or clients.

December is also not typically used for core security releases due to the quick sequencing of the Drupal core minor releases and the end-of-year holidays. This would mean a period of four months where we could not provide any regularly scheduled security update.

No special release procedures

The schedule change is not due to any highly critical issue that would require special release procedures.

As a reminder, a Drupal core security window does not necessarily mean a Drupal security release will occur, only that one is possible.

Coordinated By: 

The Drop Times: Orchestration is the Message

Dries Buytaert has made a clear and timely case: orchestration is no longer a supporting layer in software architecture. It is becoming the hub where business logic resides, workflows are developed, and decisions are executed. This shift elevates orchestration tools from optional utilities to essential infrastructure. For those involved in building and maintaining digital platforms, this is not just a new idea. It is a new foundation.

The impact on existing platforms, including Drupal, is significant. As orchestration becomes the layer where integration, automation, and intelligence reside, every platform must reconsider its position within a broader network of systems. Drupal is well-suited to operate as a content and data hub; however, it must evolve to function as part of a distributed ecosystem, rather than assuming a central or controlling role. This requires architectural flexibility and a willingness to adapt.

What matters now is how the community responds. The orchestration layer is becoming the connective tissue of digital operations. That demands shared standards, openness, and collaboration. If this is where modern software systems come together, then the values behind it will shape how inclusive, resilient, and extensible those systems can be. Dries has shown where things are heading. The responsibility to build in that direction belongs to all of us.

Before we close, I'd like to extend a quick invitation. On Wednesday, November 5, we're holding our monthly TDT Townhall, an open planning meeting where we share progress, shape priorities, and listen to the community. If you're aligned with our mission to expand Drupal’s reach and want to contribute ideas around content, outreach, or technology, we’d love to have you on board. It’s a one-hour session, fully open, and you’re welcome to listen or bring something to the table. Join us on Google Meet: https://meet.google.com/vrw-naam-ire

Tags: Tutorial, Case Study, Discover Drupal, Organization News, Drupal Community, Free Software, Event

We acknowledge that there are more stories to share. However, due to selection constraints, we must pause further exploration for now. To get timely updates, follow us on LinkedIn, Twitter, Bluesky, and Facebook. You can also join us on Drupal Slack at #thedroptimes.

Thank you.
Sincerely,
Alka Elizabeth,
Sub-editor, The DropTimes.

Ramsalt Lab: How Ramsalt and artificial intelligence transformed “secret” public documents

Yngve W. Bergheim, CEO, and Thomas Andre Karlsen, Developer. 03.11.2025

The public sector is overflowing with knowledge – from reports to evaluations. The challenge has long been to find this information in the sea of PDFs. Ramsalt Lab has been central to the development of Kudos – a national knowledge portal. By implementing groundbreaking AI-based document analysis, we have helped turn a chaotic document base into a goldmine of searchable insight.

The public sector produces enormous amounts of knowledge – reports, studies, evaluations and analyses. But where does all this knowledge go? Too often it ends up in digital drawers, hard to find and impossible to search.

This is the problem Kudos (Knowledge Documents in the Public Sector) solves.

Ramsalt has been a key technical partner in the development of this groundbreaking service for the Directorate for Public Administration and Financial Management (DFØ). Our developer, Thomas Andre Karlsen, has been a central member of the team that built the technical engine powering Kudos.

Ramsalt Lab is a digitalization partner of the Directorate for Public Administration and Financial Management. The contract has a value of up to 150 million NOK over a 10-year period, and Kudos is one of several projects Ramsalt is doing for the directorate.

Kudos is a joint search solution that brings together knowledge and management documents from ministries, directorates and government agencies in one place. The service, which is a collaboration between DFØ and the National Library, aims to streamline the entire knowledge process – from production to sharing and reuse.

The Challenge: A Bunch of PDFs

Building a portal for tens of thousands of documents (at the time of writing, over 40,000!) is a big task in itself. But the real challenge lies not only in the volume. It is in the metadata.

Many of the documents that are retrieved (often via "scraping" of various public websites) lack basic information:

  • Who actually wrote the report?
  • Who was the client?
  • What are the most important keywords?
  • Can I get a good summary?

A document without good metadata is almost as bad as a lost document. This is where the magic – and Ramsalt’s contribution – comes in.

The solution: AI that cleans, analyzes and enriches

To solve the metadata tangle, Ramsalt's developer Thomas Andre Karlsen has been central to building an advanced AI-based document analysis tool at the heart of Kudos.

This tool is not just a simple tagging function. Here's how it works:

  1. Analyse: When a new document is uploaded to Kudos, the first few pages are sent to an AI language model (such as GPT-4o or GPT-5).
  2. The model reads and understands the content. It identifies and extracts:
    • A more descriptive and searchable title.
    • The actual client (actor).
    • The individual authors.
    • Relevant keywords and topics to which the document can be linked.
  3. The AI also writes a completely new, concise summary of the document.
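As a rough illustration, the extraction step described above might look like the following Python sketch. The prompt wording, the `extract_metadata` helper, and the JSON fields are assumptions made for illustration; the actual Kudos implementation is not public.

```python
import json

# Hypothetical prompt asking the model for structured metadata as JSON.
METADATA_PROMPT = (
    "You are a document analyst. From the text below, return JSON with keys: "
    '"title" (a descriptive, searchable title), "client" (the commissioning actor), '
    '"authors" (list of individual authors), "keywords" (list of topics), '
    'and "summary" (a concise new summary).\n\nDocument excerpt:\n{excerpt}'
)

def extract_metadata(first_pages_text, ask_model):
    """Send the first pages of a document to a language model and parse the
    structured metadata it returns. `ask_model` is any callable that takes a
    prompt string and returns the model's raw text reply."""
    reply = ask_model(METADATA_PROMPT.format(excerpt=first_pages_text[:8000]))
    data = json.loads(reply)
    # Normalize the fields the enrichment pipeline relies on.
    return {
        "title": data.get("title", "").strip(),
        "client": data.get("client", "").strip(),
        "authors": list(data.get("authors", [])),
        "keywords": list(data.get("keywords", [])),
        "summary": data.get("summary", "").strip(),
    }
```

In production, `ask_model` would wrap a real API client; injecting it as a callable keeps the parsing logic testable without network access.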

The result is that documents that were previously black boxes suddenly become data-rich, perfectly indexed, and extremely easy to find for users. This saves countless hours of manual work and dramatically improves the quality of the entire database.

Ask questions about knowledge

Ramsalt's contributions don't stop there. The team has also experimented with implementing an LLM-based search engine built on a RAG (Retrieval-Augmented Generation) pipeline.

This allows users to "talk" to the database. Instead of just searching for a word, one can ask a question like: "What is the criticism of the Health Platform?"

The system will then find the most relevant documents in the Kudos database, read them, and then generate a fact-based answer for the user, complete with source references. This is a completely new way to extract precise information from a massive knowledge base.
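The retrieve-then-generate flow described above can be sketched in a few lines of Python. This is a minimal illustration only: it uses a toy bag-of-words similarity where a real pipeline would use vector embeddings, and the document structure and function names are assumptions.

```python
import math
from collections import Counter

def vectorize(text):
    """Toy bag-of-words vector; a real RAG pipeline would use embeddings."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question, documents, k=2):
    """Rank documents by similarity to the question and keep the top k."""
    q = vectorize(question)
    ranked = sorted(documents, key=lambda d: cosine(q, vectorize(d["text"])), reverse=True)
    return ranked[:k]

def build_prompt(question, sources):
    """Assemble the grounded prompt for the answer model, so its reply
    can cite the retrieved documents by id (the source references)."""
    context = "\n".join(f"[{d['id']}] {d['text']}" for d in sources)
    return f"Answer using only these sources, citing their ids:\n{context}\n\nQuestion: {question}"
```

The generated answer is then grounded in the retrieved documents, which is what makes fact-based replies with source references possible.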

A gold mine for the state, researchers and journalists

As the newspaper Media 24 has pointed out, Kudos is a gold mine for anyone who needs to know what the state knows.

Public employees can easily find relevant knowledge from other organizations and avoid ordering reports that already exist. Researchers gain access to a huge dataset for their analyses. Journalists can use the service to uncover connections and dig into public administration.

At Ramsalt, we pride ourselves on delivering technology that has real societal value. The Kudos project is a prime example of how we use our expertise in Drupal, data mining, and artificial intelligence to build robust and intelligent solutions for the public sector.

Does your business have large amounts of data that are difficult to utilize? Do you need help structuring, enriching and making your information accessible? Contact us at Ramsalt for a chat about how we can use AI and smart search solutions to transform your data into valuable knowledge.

Tag1 Insights: Drupal CMS 2.0 Performance Testing

Take Away:

Back at the end of 2024, Tag1 added performance testing to Drupal CMS 1.0 prior to its release. We then did a comparison between Drupal CMS and WordPress in which we dug into out-of-the-box performance between the two, and Drupal CMS came out pretty well. Now Drupal CMS is preparing for its 2.0 release, including a new theme, Mercury, and Drupal Canvas enabled by default. In preparation for this release we updated our performance tests, and wrote a new one for Drupal CMS’s new site template, Byte.

Gander: PHPUnit Performance Testing for Drupal

Drupal core’s performance tests allow PHPUnit assertions to be made for both back- and front-end performance metrics, including the number and size of CSS and JavaScript files, as well as the number of database queries and cache operations. Between these metrics, we get a high-level overview of how a site performs. Additionally, timings from tests, which can vary based on external factors and aren’t precise enough to assert against exact values within a PHPUnit test, are sent to an OpenTelemetry Grafana/Tempo dashboard. Gander is used to test Drupal Core, Drupal CMS and site templates, contributed modules such as the popular redirect module, and real sites via Drupal Test Traits integration, such as london.gov.uk.

We added performance testing while Drupal CMS 2.0 was in alpha and Drupal Canvas was in release candidate, and that helped uncover a few issues.

Front End Performance

Drupal CMS’s new Mercury theme is based on Tailwind CSS, which allows for a lot of customization without having to write new CSS. While this should allow Drupal CMS theming to be very flexible, it does involve serving quite a lot of CSS out of the box.

When testing the Byte site template, we wrote a performance test that covers one of the more common front end performance problems that sites face.

Drupal’s CSS and JavaScript aggregation combines smaller files into minified larger files, to reduce HTTP requests and filesize. This is an effective strategy for most Drupal libraries and components, which often include many small CSS and JavaScript files attached to specific page elements, and which may only appear on specific pages or for users with certain permissions, etc. Serving these files on demand reduces unused CSS and JS when those elements aren’t there.

However, when a large CSS or JavaScript file is included on every page (or most pages), that file can be duplicated between different asset aggregates, meaning the same visitor downloads it over and over again when visiting multiple pages with slightly different combinations of libraries. Our test shows over 2 MB of CSS being downloaded across six pages, even though each page individually has only around 350 KB of CSS.
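The duplication effect described above can be sketched with a small back-of-the-envelope calculation. The page names, file names, and sizes here are invented for illustration and do not come from the Byte template itself:

```python
# Hypothetical per-page asset aggregates: each page's aggregate bundles the
# large shared theme file together with page-specific CSS, so the shared
# bytes are re-downloaded for every distinct aggregate.
PAGES = {
    "front":   {"main.min.css": 300_000, "hero.css": 50_000},
    "article": {"main.min.css": 300_000, "article.css": 40_000},
    "contact": {"main.min.css": 300_000, "form.css": 30_000},
}

def total_downloaded(pages):
    """Bytes a first-time visitor downloads when every page ships its own
    aggregate (shared files are duplicated across aggregates)."""
    return sum(sum(files.values()) for files in pages.values())

def total_deduplicated(pages):
    """Bytes downloaded if each unique file is served (and cached) once,
    as happens when the shared file is excluded from aggregation."""
    unique = {}
    for files in pages.values():
        unique.update(files)
    return sum(unique.values())
```

With these made-up numbers, excluding the shared file from aggregation cuts the total from 1 020 000 to 420 000 bytes across three pages, which is the same shape of saving the Mercury fix targets.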

We filed an issue and merge request (MR) against the Mercury theme to exclude the main.min.css file from minification (since it’s already minified, avoiding wasted CPU and memory re-minifying it) and from aggregation, so that only a single copy is downloaded per visitor. This issue has already been committed, and the improvement should be visible in the performance test for Byte once there’s a new release.

While we were looking for the source of those large CSS files in Chromium devtools, we also noticed some chained HTTP requests for both fonts and JavaScript files and opened issues to document both. When the browser has to first parse a CSS or JavaScript file before loading fonts or another file, this requires at least two round trips from the browser before the chained request can be served, which can significantly affect page loading performance.

Byte also includes a webform newsletter signup on the front page. We noticed that the Webform module attaches a details library to every page showing a webform, whether or not the webform will render a details element. Because the details library depends on jQuery, this adds around 100 KB of JavaScript for anonymous users that might not otherwise be needed. This discovery is an example of how adding performance tests for Drupal CMS can test not only Drupal CMS itself, but also many of Drupal’s most popular contributed modules, finding performance issues that can affect many sites in the wild.

Canvas

Our original Drupal CMS 1.0 tests cover both anonymous and authenticated visitors. For Drupal CMS 2.0 we noticed that authenticated visitor page requests required many more database queries and cache operations than Drupal CMS 1.0. We tracked this down to the Canvas module, which sets max_age: 0 when rendering its ComponentTreeItemList field in some circumstances, disabling the dynamic page cache for any request that renders Canvas’ page entity type. We also noticed that the tree rendering itself is quite expensive, although this may become less of an issue once render caching is fixed.

These were the only back-end performance issues we noticed, so assuming they’re fixed prior to release, back-end performance should be broadly similar between Drupal CMS 1.0 and 2.0.

Conclusion

These findings show how important it is to validate performance before releasing code to production, so that unexpected regressions in application or browser performance can be caught and fixed before reaching site visitors. At the time of writing, several of the issues we opened already had MRs attached or had already been committed by the Drupal CMS team. Drupal’s Gander testing framework, originally developed by Tag1, provides an ideal mechanism to add continual testing with repeatable results to any kind of Drupal project.

Keep Your Drupal Sites Fast and Reliable

Performance testing isn’t just a step in development, it’s the foundation of a seamless user experience. Tag1 helps engineering, product, and marketing teams ensure that sites are fast, stable, and ready to scale. Using performance monitoring solutions like Gander, we make performance enterprise-ready so your sites stay smooth, secure, and always available.

Learn more about website performance management and let us know how we can help!

Image by Lalmch from Pixabay.
