Dev Tools & APIs

  • Mailgun

    Mailgun

    Mailgun is a transactional email API built for developers and businesses that need reliable, high-volume email delivery. Unlike marketing email platforms, Mailgun focuses on sending emails triggered by application events: password resets, order confirmations, invoice delivery, and automated notifications. The Mailgun node lets your automation workflows send, receive, and track emails programmatically through the Mailgun API. Businesses use Mailgun in their automations to send personalised transactional emails at scale, process incoming emails as workflow triggers, track delivery and engagement metrics, and manage suppression lists. It is a popular choice for SaaS companies, e-commerce platforms, and any organisation that needs to send emails from automated systems rather than manually from a marketing tool. Osher integrates Mailgun into automated data processing workflows where email is a key communication channel. We build systems that generate and send emails based on real business events, whether that is a completed form submission, a status change in your CRM, or an AI-processed document that needs to be delivered. Every email is tracked, and bounce and complaint data feeds back into your systems to keep your sender reputation healthy.
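
    For a sense of what the node does under the hood, here is a minimal sketch of a send via Mailgun's messages endpoint — the domain, API key, and addresses are placeholders, and in practice the n8n node makes this call for you.

    ```typescript
    // Minimal sketch: sending a transactional email via Mailgun's HTTP API.
    // Domain, API key, and addresses are placeholders.
    const MAILGUN_DOMAIN = 'mg.example.com';
    const MAILGUN_API_KEY = process.env.MAILGUN_API_KEY ?? '';

    async function sendPasswordReset(to: string, resetLink: string): Promise<void> {
      const body = new URLSearchParams({
        from: `Example App <noreply@${MAILGUN_DOMAIN}>`,
        to,
        subject: 'Reset your password',
        text: `Click the link to reset your password: ${resetLink}`,
      });

      const res = await fetch(`https://api.mailgun.net/v3/${MAILGUN_DOMAIN}/messages`, {
        method: 'POST',
        headers: {
          Authorization: 'Basic ' + Buffer.from(`api:${MAILGUN_API_KEY}`).toString('base64'),
          'Content-Type': 'application/x-www-form-urlencoded',
        },
        body,
      });
      if (!res.ok) throw new Error(`Mailgun returned ${res.status}: ${await res.text()}`);
    }
    ```
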
  • Matrix

    Matrix

    Matrix is an open, decentralised communication protocol that provides end-to-end encrypted messaging, voice, and video. Unlike proprietary platforms, Matrix lets organisations run their own servers while still communicating with users on other Matrix servers through federation. The Matrix node allows automation workflows to send messages, manage rooms, invite users, and interact with the Matrix network programmatically. Organisations that require sovereign communications use Matrix for its encryption, self-hosting capability, and federation model. Automation use cases include posting system alerts and monitoring notifications to Matrix rooms, creating dedicated rooms for projects or incidents, bridging messages between Matrix and other platforms, and building bot-driven workflows that respond to commands in chat. Osher builds AI agent workflows that interact through Matrix for organisations where data sovereignty and end-to-end encryption are non-negotiable. We connect your internal systems to Matrix rooms so operational notifications, alerts, and AI-generated insights arrive securely in the channels where your team operates. For defence, government, and regulated industries, Matrix provides the communication backbone that proprietary tools cannot match on security and data control.
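
    As a rough sketch of the underlying client-server API call (the kind of request the node issues for a send-message operation), assuming a placeholder homeserver, access token, and room ID:

    ```typescript
    // Minimal sketch: posting an alert into a Matrix room via the client-server API.
    // Homeserver URL, access token, and room ID are placeholders.
    const HOMESERVER = 'https://matrix.example.org';
    const ACCESS_TOKEN = process.env.MATRIX_ACCESS_TOKEN ?? '';
    const ROOM_ID = '!abc123:example.org';

    async function postAlert(text: string): Promise<void> {
      const txnId = Date.now().toString(); // transaction ID for idempotent sends
      const url = `${HOMESERVER}/_matrix/client/v3/rooms/${encodeURIComponent(ROOM_ID)}` +
        `/send/m.room.message/${txnId}`;
      const res = await fetch(url, {
        method: 'PUT',
        headers: {
          Authorization: `Bearer ${ACCESS_TOKEN}`,
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({ msgtype: 'm.text', body: text }),
      });
      if (!res.ok) throw new Error(`Matrix send failed: ${res.status}`);
    }
    ```
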
  • RabbitMQ

    RabbitMQ

    RabbitMQ is an open-source message broker that sits between services in your software architecture and handles the reliable delivery of messages between them. Instead of Service A calling Service B directly (and failing if B is down or overloaded), Service A publishes a message to RabbitMQ, and Service B consumes it when ready. This decoupling makes systems more resilient, scalable, and easier to maintain. RabbitMQ supports the AMQP protocol and offers features like message persistence, routing, dead-letter queues, and clustering for high availability. The n8n RabbitMQ node lets workflows publish messages to RabbitMQ queues and consume messages from them. This is valuable when n8n is part of a larger microservices architecture — it can consume messages published by other services (e.g., a new order event from your e-commerce platform) and trigger automations, or it can publish messages that other services pick up (e.g., sending a processed data payload to a downstream service for further handling). At Osher, we use RabbitMQ with n8n for clients who have event-driven architectures or need reliable message delivery between systems that process data at different speeds. If your backend services need to communicate reliably without tight coupling, or if you need n8n to process events from a message queue rather than polling APIs, our system integration services can architect and implement the RabbitMQ layer that ties everything together.
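
    A minimal sketch of the publish side using the amqplib package — the connection URL, queue name, and order payload are placeholders:

    ```typescript
    import amqp from 'amqplib';

    // Minimal sketch: Service A publishes an order event to a durable queue;
    // a consumer picks it up whenever it is ready.
    async function publishOrderEvent(order: { id: string; total: number }): Promise<void> {
      const conn = await amqp.connect('amqp://guest:guest@localhost:5672');
      const channel = await conn.createChannel();
      const queue = 'orders.created';

      await channel.assertQueue(queue, { durable: true });           // survive broker restarts
      channel.sendToQueue(queue, Buffer.from(JSON.stringify(order)), {
        persistent: true,                                            // write the message to disk
      });

      await channel.close();
      await conn.close();
    }
    ```
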
  • Compression

    The Compression node in n8n compresses and decompresses files within a workflow. It supports ZIP and GZIP formats, allowing you to bundle multiple files into a single archive or extract files from an incoming compressed archive. The node operates on binary data that flows through the workflow — files read from disk, downloaded from APIs, received as email attachments, or generated by other workflow nodes. This node is essential in file processing pipelines where you need to package files for delivery (e.g., zipping a batch of reports before emailing them to a client), reduce file sizes before uploading to storage or transferring over slow connections, or unpack compressed files received from external systems before processing their contents. It pairs naturally with the Read Binary Files node, the HTTP Request node, and email or SFTP nodes. At Osher, we use the Compression node as part of automated file handling workflows in n8n. Common builds include compressing daily report exports into a single ZIP before emailing them to stakeholders, decompressing data files received from suppliers or partners before parsing and importing the contents, and archiving processed files by compressing them before moving to long-term storage. If your team manually zips, unzips, or manages compressed file transfers, our automated data processing services can build an n8n pipeline that handles it without manual effort.
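
    A minimal sketch of the GZIP half of this using Node's built-in zlib (ZIP archives need an extra library such as adm-zip or archiver); the file paths are placeholders:

    ```typescript
    import { gzipSync, gunzipSync } from 'node:zlib';
    import { readFileSync, writeFileSync } from 'node:fs';

    // Compress a report before emailing or uploading it over a slow link
    const report = readFileSync('/data/reports/daily-report.csv');
    writeFileSync('/data/reports/daily-report.csv.gz', gzipSync(report));

    // Decompress a file received from a supplier before parsing it
    const received = readFileSync('/data/inbound/supplier-feed.csv.gz');
    const original = gunzipSync(received).toString('utf8');
    console.log(`Decompressed ${original.length} characters of CSV`);
    ```
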
  • Jira Trigger

    Jira Trigger

    Jira Trigger is an automation node that fires whenever specific events occur in your Jira project management boards. It listens for new issues, status changes, comment additions, and sprint updates, then passes that data into your workflow for processing. Teams running agile development cycles use Jira Trigger to eliminate the manual checking and copy-pasting that slows down cross-tool communication. Common use cases include syncing new Jira tickets to Slack channels, updating external spreadsheets when issue statuses change, and routing bug reports to the right team based on priority or label. QA teams use it to kick off automated testing pipelines the moment a ticket moves to “Ready for QA”, while project managers rely on it to keep stakeholders informed without writing status emails. At Osher, we connect Jira Trigger into broader system integration workflows that tie your project management stack to CRMs, communication tools, and reporting dashboards. Rather than building fragile point-to-point connections, we design event-driven automations that scale as your team grows. The result is fewer missed updates, faster response times, and a project management setup that actually reflects what your team is doing in real time.
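
    To illustrate the routing logic such a trigger enables, here is a hedged sketch that inspects a Jira issue webhook payload and picks a destination — the field names assume Jira Cloud's standard payload shape, and the routing rules are purely illustrative:

    ```typescript
    // Hedged sketch of routing an incoming Jira issue event.
    interface JiraWebhook {
      webhookEvent: string; // e.g. "jira:issue_created"
      issue: {
        key: string;
        fields: { summary: string; priority?: { name: string }; labels: string[] };
      };
    }

    function routeIssue(payload: JiraWebhook): string {
      const { key, fields } = payload.issue;
      if (payload.webhookEvent !== 'jira:issue_created') return 'ignore';
      if (fields.priority?.name === 'Highest') return `page-oncall:${key}`;
      if (fields.labels.includes('bug')) return `qa-channel:${key}`;
      return `triage-board:${key}`;
    }
    ```
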
  • Supabase

    Supabase

    Supabase is an open-source backend platform that provides a PostgreSQL database, authentication, real-time subscriptions, and file storage through a single API. Developers and product teams use it as a faster alternative to building backend infrastructure from scratch, while retaining full control over their data. The Supabase node lets you read, write, update, and delete database records directly from your automation workflows. Typical automation use cases include syncing form submissions into Supabase tables, pushing CRM updates into a centralised database, triggering workflows when new rows appear, and backing up data from third-party APIs into structured storage. Startups and mid-size companies often use Supabase as their application database, which makes it a natural integration point for connecting frontend apps with backend automation logic. Osher builds custom AI and automation solutions that use Supabase as the data layer. We connect your Supabase tables to workflow engines, AI processing pipelines, and business applications so your data moves where it needs to go without manual exports or fragile scripts. Whether you need real-time data syncing, automated record management, or AI-powered processing of your database contents, we design systems that treat your Supabase instance as a living, connected part of your tech stack.
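
    A minimal sketch of the read/write pattern with the supabase-js client — the project URL, key, and the form_submissions table and columns are placeholder assumptions:

    ```typescript
    import { createClient } from '@supabase/supabase-js';

    const supabase = createClient(
      process.env.SUPABASE_URL ?? '',
      process.env.SUPABASE_SERVICE_ROLE_KEY ?? '',
    );

    // Write a form submission into a table
    async function saveSubmission(name: string, email: string): Promise<void> {
      const { error } = await supabase
        .from('form_submissions')
        .insert({ name, email, processed: false });
      if (error) throw new Error(error.message);
    }

    // Read back rows that still need processing
    async function pendingSubmissions() {
      const { data, error } = await supabase
        .from('form_submissions')
        .select('id, name, email')
        .eq('processed', false);
      if (error) throw new Error(error.message);
      return data;
    }
    ```
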
  • MQTT

    MQTT

    MQTT (Message Queuing Telemetry Transport) is a lightweight messaging protocol designed for devices with limited processing power and networks with constrained bandwidth. It uses a publish-subscribe model: devices publish messages to a topic on a central broker, and any client subscribed to that topic receives the message. MQTT runs over TCP/IP and is the dominant protocol for IoT (Internet of Things) communication, used in everything from factory sensor networks to smart building systems. The n8n MQTT node allows workflows to subscribe to MQTT topics and trigger automations when messages arrive, or publish messages to topics as part of a workflow. This makes it possible to connect IoT device data to business systems — for example, reading temperature sensor data from a warehouse and triggering an alert in Slack, or receiving machine status updates from factory equipment and logging them to a database for reporting. At Osher, we use the MQTT node in n8n to bridge the gap between operational technology (sensors, PLCs, edge devices) and business systems (databases, dashboards, notification channels). If your business collects data from physical devices and you are manually exporting or checking it, our system integration services can connect those data streams directly into your workflows so the information reaches the right people and systems in real time.
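
    A minimal sketch of the publish-subscribe model using the mqtt package — the broker URL, topic names, and alert threshold are placeholders:

    ```typescript
    import mqtt from 'mqtt';

    const client = mqtt.connect('mqtt://broker.example.com:1883');

    client.on('connect', () => {
      // A device (or gateway script) publishes a reading to a topic...
      client.publish(
        'warehouse/coolroom-1/temperature',
        JSON.stringify({ celsius: 9.4, at: new Date().toISOString() }),
      );

      // ...and any subscriber to a matching topic receives it.
      client.subscribe('warehouse/+/temperature');
    });

    client.on('message', (topic, payload) => {
      const reading = JSON.parse(payload.toString());
      if (reading.celsius > 8) {
        console.log(`ALERT ${topic}: ${reading.celsius}°C exceeds threshold`);
      }
    });
    ```
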
  • Read Binary Files

    The Read Binary Files node in n8n reads files from the local filesystem in binary format — images, PDFs, spreadsheets, ZIP archives, or any other file type — and passes the binary data into the workflow for processing by subsequent nodes. It is the starting point for any n8n workflow that needs to work with files stored on the server rather than files received via API or webhook. Common use cases include reading invoice PDFs for data extraction, loading image files for processing or upload to another service, picking up CSV exports dropped into a folder by a legacy system, and reading configuration files needed by other workflow steps. The node supports wildcard patterns to read multiple files at once and can be combined with a schedule trigger to poll a directory for new files on a regular interval. At Osher, we use the Read Binary Files node as part of file processing pipelines built in n8n. A typical build might watch a shared folder for new documents, read them with this node, extract text using AI or OCR, classify the document type, and route the extracted data to the right system — a CRM, accounting package, or database. If your team is manually downloading, opening, and re-entering data from files, our automated data processing services can replace that manual work with an n8n pipeline that handles it automatically.
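
    Roughly what the node does under the hood — expand a wildcard pattern and load each match as binary data — sketched here with the fast-glob package and placeholder paths:

    ```typescript
    import fg from 'fast-glob';
    import { readFile } from 'node:fs/promises';
    import { basename } from 'node:path';

    async function readExports(): Promise<{ fileName: string; data: Buffer }[]> {
      const paths = await fg('/data/exports/*.csv');   // wildcard over a directory
      return Promise.all(
        paths.map(async (p) => ({ fileName: basename(p), data: await readFile(p) })),
      );
    }

    readExports().then((files) => {
      for (const f of files) console.log(`${f.fileName}: ${f.data.length} bytes`);
    });
    ```
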
  • Home Assistant

    Home Assistant

    Home Assistant is an open-source home and building automation platform that runs locally on your own hardware (Raspberry Pi, mini PC, NAS, or virtual machine). It connects to over 2,000 device types — smart lights, thermostats, security cameras, door locks, energy monitors, HVAC systems, and industrial sensors — and provides a single interface for monitoring and controlling all of them. Unlike cloud-dependent platforms, Home Assistant processes everything locally, which means faster response times, no subscription fees, and full data privacy. The n8n Home Assistant node lets you read device states, trigger automations, call services, and fire events from within an n8n workflow. This bridges the gap between building/facility automation and business systems. For example, you can trigger an n8n workflow when a meeting room sensor detects occupancy, log energy consumption data from smart meters into a database for reporting, or send an alert to your facilities team when a temperature sensor exceeds a threshold. At Osher, we connect Home Assistant to business workflows for clients who need their physical environment data to feed into operational systems. This is relevant for commercial offices, warehouses, retail spaces, and any facility with smart devices. If you are running Home Assistant but the data stays siloed in the Home Assistant dashboard, our system integration services can connect it to your business tools so the data drives real actions.
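
    A minimal sketch of the REST calls the node wraps: read an entity state and call a service in response. The host, long-lived access token, and entity IDs are placeholders:

    ```typescript
    const HA_URL = 'http://homeassistant.local:8123';
    const HA_TOKEN = process.env.HA_TOKEN ?? '';
    const headers = { Authorization: `Bearer ${HA_TOKEN}`, 'Content-Type': 'application/json' };

    async function checkServerRoomTemp(): Promise<void> {
      // Read the current state of a temperature sensor
      const res = await fetch(`${HA_URL}/api/states/sensor.server_room_temperature`, { headers });
      const state = (await res.json()) as { state: string };

      if (parseFloat(state.state) > 27) {
        // Call a service (turn on a fan) when the reading is too high
        await fetch(`${HA_URL}/api/services/switch/turn_on`, {
          method: 'POST',
          headers,
          body: JSON.stringify({ entity_id: 'switch.server_room_fan' }),
        });
      }
    }
    ```
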
  • Facebook Graph API

    Facebook Graph API

    The Facebook Graph API node in n8n lets you interact with Facebook and Instagram programmatically — reading page insights, publishing posts, managing ad campaigns, and pulling engagement data into your automation workflows. The Graph API is Facebook’s primary developer interface, and the n8n node wraps it so you can use it without writing custom API code. The practical problem this solves is the gap between your social media activity and your business systems. Marketing teams post content, run ads, and respond to comments inside Facebook’s own tools, but the data from those activities rarely flows into the CRM, the reporting dashboard, or the sales pipeline automatically. The Facebook Graph API node in n8n bridges that gap by pulling engagement data, ad performance metrics, and audience insights into workflows that feed your other tools. At Osher Digital, we use the Facebook Graph API node when building sales automation and marketing data pipelines for Australian businesses. Common setups include pulling daily ad spend and conversion data into reporting dashboards, syncing Facebook lead form submissions into CRMs in real time, and automating social media posting schedules from a content calendar. If your marketing team is manually exporting data from Facebook Business Manager and re-entering it elsewhere, this node eliminates that work.
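
    As a hedged illustration, here is roughly what a page-insights pull looks like as a plain Graph API request — the API version, page ID, metric names, and token are assumptions to adjust for your own setup:

    ```typescript
    const PAGE_ID = '1234567890';
    const TOKEN = process.env.FB_PAGE_ACCESS_TOKEN ?? '';

    async function fetchPageInsights() {
      const url = new URL(`https://graph.facebook.com/v19.0/${PAGE_ID}/insights`);
      url.searchParams.set('metric', 'page_impressions,page_post_engagements');
      url.searchParams.set('period', 'day');
      url.searchParams.set('access_token', TOKEN);

      const res = await fetch(url);
      if (!res.ok) throw new Error(`Graph API error: ${res.status}`);
      const { data } = (await res.json()) as { data: unknown[] };
      return data; // one entry per metric, with daily values
    }
    ```
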
  • NocoDB

    NocoDB

    NocoDB is an open-source platform that turns any SQL database (MySQL, PostgreSQL, SQL Server, SQLite) into a spreadsheet-like interface with a REST API. Think of it as a self-hosted alternative to Airtable. The n8n NocoDB node lets you create, read, update, and delete records in NocoDB tables directly from your automation workflows, giving you a flexible data layer that non-technical team members can view and edit through a familiar spreadsheet interface while your automations work with the same data through the API. The problem NocoDB solves is the gap between developers who want proper databases and business users who want spreadsheets. Instead of building a custom admin panel every time someone needs to view or edit data, you point NocoDB at your database and it generates a usable interface automatically. The n8n node then lets your automations read and write to those same tables, so you have one source of truth that both humans and machines can work with. At Osher Digital, we use NocoDB as the data backbone in many of the business automation systems we build for clients. It works particularly well as a lightweight CRM, a project tracker, an inventory database, or a content management system — anywhere you need structured data that both your team and your n8n workflows need to access. Because NocoDB is self-hosted, your data stays on your infrastructure, which matters for Australian businesses with data sovereignty requirements.
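
    A hedged sketch of reading and creating rows over NocoDB's REST API — the paths assume the v2 API, and the base URL, table ID, token header, and field names are placeholders, so check them against your NocoDB version:

    ```typescript
    const NOCODB_URL = 'https://nocodb.example.com';
    const TABLE_ID = 'tbl_abc123';
    const headers = {
      'xc-token': process.env.NOCODB_API_TOKEN ?? '',
      'Content-Type': 'application/json',
    };

    async function listOpenDeals() {
      const res = await fetch(
        `${NOCODB_URL}/api/v2/tables/${TABLE_ID}/records?where=(Status,eq,Open)`,
        { headers },
      );
      const body = (await res.json()) as { list: Record<string, unknown>[] };
      return body.list;
    }

    async function createDeal(company: string, value: number): Promise<void> {
      await fetch(`${NOCODB_URL}/api/v2/tables/${TABLE_ID}/records`, {
        method: 'POST',
        headers,
        body: JSON.stringify({ Company: company, Value: value, Status: 'Open' }),
      });
    }
    ```
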
  • GitHub Trigger

    GitHub Trigger

    GitHub Trigger is an n8n node that starts a workflow whenever a specific event happens in your GitHub repositories — pushes, pull requests, issues, releases, code reviews, or any of the dozens of webhook events GitHub supports. If your development team uses GitHub and you want automated actions to happen when code is pushed, PRs are opened, or issues are created, this trigger connects your code repository to your operational workflows. The problem this solves is the disconnect between development activity and business operations. When a developer pushes code or a PR gets merged, other things usually need to happen — deployment notifications sent to the team, release notes posted, JIRA tickets updated, clients notified of new features, or QA tasks created. Doing these manually is error-prone and slow. The GitHub Trigger fires automatically and kicks off whatever downstream actions you define in n8n. At Osher Digital, we use the GitHub Trigger node as part of system integration projects that connect development workflows to business tools. Common builds include posting deployment summaries to Slack or Mattermost when code is merged to main, creating project management tasks when new issues are filed, syncing release notes to a client-facing changelog, and triggering automated test or build pipelines. If your dev team lives in GitHub but your business runs on different tools, this trigger bridges the gap.
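
    Behind any GitHub webhook trigger sits signature verification and event routing; here is a minimal sketch of both, with the secret and handler logic as placeholders (the n8n trigger node handles this for you):

    ```typescript
    import { createHmac, timingSafeEqual } from 'node:crypto';

    // Verify the X-Hub-Signature-256 header against the shared webhook secret
    function verifySignature(secret: string, rawBody: string, signatureHeader: string): boolean {
      const expected = 'sha256=' + createHmac('sha256', secret).update(rawBody).digest('hex');
      return (
        expected.length === signatureHeader.length &&
        timingSafeEqual(Buffer.from(expected), Buffer.from(signatureHeader))
      );
    }

    // Branch on the event name (sent in the X-GitHub-Event header)
    function handleEvent(eventName: string, payload: any): void {
      if (eventName === 'pull_request' && payload.action === 'closed' && payload.pull_request.merged) {
        console.log(`PR #${payload.number} merged into ${payload.repository.full_name}`);
      }
    }
    ```
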
  • MQTT Trigger

    MQTT Trigger

    MQTT Trigger is an n8n node that starts a workflow whenever a message arrives on a specific MQTT topic. MQTT (Message Queuing Telemetry Transport) is a lightweight messaging protocol built for IoT devices, sensors, and machine-to-machine communication. If you have temperature sensors, GPS trackers, industrial equipment, or smart building systems publishing data over MQTT, this trigger node lets n8n listen for those messages and act on them automatically. The practical problem MQTT Trigger solves is connecting IoT and operational technology (OT) data to your business systems. Your sensors might be publishing readings every few seconds, but that data is useless sitting on an MQTT broker. The MQTT Trigger node in n8n picks up those messages and routes them into databases, dashboards, alerting systems, or AI models for analysis — turning raw device data into actionable business intelligence. At Osher Digital, we use the MQTT Trigger node when building system integrations that bridge the gap between physical infrastructure and business software. This includes monitoring environmental sensors for compliance reporting, tracking asset locations in real time, and triggering maintenance workflows when equipment readings fall outside normal ranges. If your organisation has IoT devices generating data that nobody is acting on, this is where we start.
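
    A minimal sketch of the trigger pattern with the mqtt package: subscribe to a topic and hand each message to downstream logic. The broker URL, topic, and thresholds are placeholders:

    ```typescript
    import mqtt from 'mqtt';

    const client = mqtt.connect('mqtt://broker.example.com:1883');

    client.on('connect', () => client.subscribe('plant/+/vibration'));

    client.on('message', (topic, payload) => {
      const { rms } = JSON.parse(payload.toString()); // e.g. { "rms": 7.2 }
      const assetId = topic.split('/')[1];
      if (rms > 6.0) {
        // In n8n, this is where the rest of the workflow runs: create a
        // maintenance ticket, notify the team, log the event to a database.
        console.log(`Maintenance check needed for ${assetId}: vibration RMS ${rms}`);
      }
    });
    ```
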
  • RabbitMQ Trigger

    RabbitMQ Trigger

    RabbitMQ Trigger is an n8n node that starts a workflow whenever a new message arrives in a RabbitMQ queue. RabbitMQ is an open-source message broker used by development teams to decouple services, manage background job queues, and handle asynchronous processing. If your engineering team already uses RabbitMQ to pass messages between microservices, this trigger lets n8n listen to those queues and execute automation workflows in response. The core problem this solves is getting business logic and operational workflows connected to your message queue infrastructure. Developers build RabbitMQ queues for technical reasons — handling order processing, managing email queues, distributing background tasks — but the business side needs visibility and action. The RabbitMQ Trigger bridges that gap by letting n8n consume messages from any queue and route them into CRM updates, notification systems, reporting dashboards, or any other business tool. At Osher Digital, we use the RabbitMQ Trigger when clients have existing message queue infrastructure and need to connect it to business workflows without writing custom code. This fits into our system integration work, where we connect developer-facing infrastructure to business-facing tools. Common use cases include processing order events from an e-commerce backend, handling webhook retries through a dead-letter queue, and orchestrating multi-step data processing pipelines.
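
    A minimal sketch of the consumer side using amqplib — consume, process, acknowledge — with the connection URL and queue name as placeholders:

    ```typescript
    import amqp from 'amqplib';

    async function consumeOrders(): Promise<void> {
      const conn = await amqp.connect('amqp://guest:guest@localhost:5672');
      const channel = await conn.createChannel();
      const queue = 'orders.created';

      await channel.assertQueue(queue, { durable: true });
      await channel.prefetch(10); // cap unacknowledged messages per consumer

      await channel.consume(queue, async (msg) => {
        if (!msg) return;
        try {
          const order = JSON.parse(msg.content.toString());
          console.log(`Processing order ${order.id}`);   // CRM update, notification, etc.
          channel.ack(msg);
        } catch {
          channel.nack(msg, false, false); // dead-letter or drop on failure
        }
      });
    }
    ```
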
  • Mattermost

    Mattermost

    Mattermost is an open-source, self-hosted team messaging platform that works as an alternative to Slack. The n8n Mattermost node lets you send messages, create posts, manage channels, and react to messages programmatically from your automation workflows. If your team uses Mattermost for internal communication and you want automated notifications, alerts, or status updates posted directly into your channels, this node handles it. The reason organisations choose Mattermost over Slack is data sovereignty — Mattermost runs on your own servers, so messages and files never leave your infrastructure. This makes it popular with government agencies, defence contractors, healthcare providers, and any organisation with strict data residency requirements. The n8n Mattermost node preserves that benefit because n8n itself can also be self-hosted, giving you a fully self-contained automation and communication stack. At Osher Digital, we integrate Mattermost into n8n workflows as part of our system integration projects. Common use cases include posting automated deployment notifications from CI/CD pipelines, sending alert messages when monitoring systems detect issues, and routing customer support messages from external channels into internal Mattermost threads. If your team uses Mattermost and you want to pipe automated updates into it without building custom bots, the n8n node is the fastest path.
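
    A minimal sketch of the REST call behind a "create post" operation — the server URL, bot token, and channel ID are placeholders:

    ```typescript
    const MM_URL = 'https://mattermost.example.com';
    const MM_TOKEN = process.env.MATTERMOST_BOT_TOKEN ?? '';

    async function postToChannel(channelId: string, message: string): Promise<void> {
      const res = await fetch(`${MM_URL}/api/v4/posts`, {
        method: 'POST',
        headers: {
          Authorization: `Bearer ${MM_TOKEN}`,
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({ channel_id: channelId, message }),
      });
      if (!res.ok) throw new Error(`Mattermost post failed: ${res.status}`);
    }

    postToChannel('channel-id-here', ':rotating_light: Disk usage on db-01 is above 90%');
    ```
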
  • Twilio

    Twilio

    Twilio is a cloud communications platform that provides APIs for sending and receiving SMS messages, making and receiving phone calls, sending WhatsApp messages, and handling video. Rather than building telephony infrastructure or negotiating carrier agreements, developers use Twilio’s APIs to add communication capabilities to their applications programmatically. You get a phone number from Twilio, and then use their API to send messages or handle incoming calls with webhooks. The problem Twilio solves is communication at scale. When a business needs to send appointment reminders via SMS, verify phone numbers with one-time codes, notify field workers about new jobs, or route incoming calls to the right department, doing this manually does not scale. Twilio makes these interactions programmable — triggered by events in your other systems. In n8n, the Twilio node can send SMS and WhatsApp messages using Twilio’s Messaging API. This makes it straightforward to add SMS notifications to any automation workflow — send an alert when a monitoring check fails, confirm an appointment when a calendar event is created, or notify a sales rep when a high-value lead comes in. Incoming messages can trigger n8n workflows via Twilio webhooks. Osher integrates Twilio into business automation projects where clients need SMS or WhatsApp as part of their workflows. We also use it for AI agent systems that communicate with users via text message, and in sales automation setups where lead notifications need to reach people on their phones immediately.
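
    A minimal sketch of an SMS send with the official twilio package — the account SID, auth token, and phone numbers are placeholders:

    ```typescript
    import twilio from 'twilio';

    const client = twilio(process.env.TWILIO_ACCOUNT_SID, process.env.TWILIO_AUTH_TOKEN);

    async function sendAppointmentReminder(to: string, time: string): Promise<void> {
      const message = await client.messages.create({
        from: '+61400000000',          // your Twilio number
        to,                            // e.g. '+61412345678'
        body: `Reminder: your appointment is at ${time}. Reply YES to confirm.`,
      });
      console.log(`Queued message ${message.sid}`);
    }
    ```
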
  • WooCommerce Trigger

    WooCommerce Trigger

    The WooCommerce Trigger node in n8n listens for events from a WooCommerce store and starts a workflow when those events occur. WooCommerce is the most widely used e-commerce plugin for WordPress, and its webhook system can notify external services whenever orders are placed, products are updated, customers register, or coupons are used. The trigger node receives these webhook payloads and passes the full event data into your n8n workflow. The problem this solves is the gap between your online store and your other business systems. When a customer places an order, several things need to happen: the order data should reach your accounting software, the warehouse needs a pick list, the customer should get a confirmation (possibly via SMS), and your CRM should log the purchase. WooCommerce handles the storefront, but it does not natively push data to all these other systems. The trigger node bridges that gap. WooCommerce webhooks fire in near real-time when events happen — order created, order updated, order deleted, product created, product updated, customer created, and coupon events. The trigger node receives the full payload including all order line items, customer details, shipping information, and payment status. This gives your n8n workflow everything it needs to route the data to the right downstream systems. Osher integrates WooCommerce stores into business automation workflows that connect orders to fulfillment, accounting, and customer communication systems. We also build system integrations that keep WooCommerce product and inventory data synchronised with external platforms.
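
    For illustration, here is a hedged sketch of receiving such a webhook outside n8n: check the signature header (base64 HMAC-SHA256 of the raw body using the webhook secret) and pull out the fields downstream systems need. The secret and field selection are placeholders:

    ```typescript
    import { createHmac } from 'node:crypto';

    // Verify the x-wc-webhook-signature header
    function verifyWooSignature(secret: string, rawBody: string, signature: string): boolean {
      const expected = createHmac('sha256', secret).update(rawBody).digest('base64');
      return expected === signature;
    }

    // Reduce the full order payload to what downstream systems need
    function summariseOrder(order: any) {
      return {
        orderId: order.id,
        total: order.total,                      // string, e.g. "129.95"
        customerEmail: order.billing?.email,
        items: order.line_items?.map((li: any) => ({ sku: li.sku, qty: li.quantity })),
      };
    }
    ```
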
  • Local File Trigger

    The Local File Trigger node in n8n watches a folder on your server’s file system and starts a workflow whenever a file is created, modified, or deleted. It turns your file system into an event source: drop a file into a watched folder and n8n automatically picks it up and processes it. This is particularly useful for businesses that receive data as files: CSV exports from legacy systems, PDF invoices from suppliers, XML feeds from government portals, or images that need processing. Instead of someone manually checking a folder, opening each file, and entering data into another system, the Local File Trigger detects new files immediately and kicks off the processing workflow. The node monitors a specified directory path and can filter by file extension. When it detects a change, it outputs the file path, file name, and event type (created, changed, deleted). You then use subsequent nodes (Read Binary File, Spreadsheet File, CSV, or code nodes) to read and process the file contents. At Osher, we use the Local File Trigger in automated data processing workflows where files arrive from external systems via SFTP, shared drives, or manual uploads. A typical build watches an SFTP landing folder, picks up new CSV files, parses the data, validates it, and loads it into a database or CRM. Our RPA team uses this pattern to replace manual file-processing tasks across finance, operations, and admin teams.
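
    A minimal sketch of the watch-and-react pattern using the chokidar package — the directory, extension filter, and handler are placeholders, and the n8n node gives you the same events without this code:

    ```typescript
    import { watch } from 'chokidar';

    const watcher = watch('/data/inbox', {
      ignoreInitial: true,        // only react to files that arrive after startup
      awaitWriteFinish: true,     // wait until the upload/copy has finished
    });

    watcher.on('add', (path) => {
      if (!path.endsWith('.csv')) return;
      console.log(`New file detected: ${path}`);
      // Downstream steps would read the file, parse rows, validate, and load them.
    });

    watcher.on('unlink', (path) => console.log(`File removed: ${path}`));
    ```
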
  • n8n

    n8n

    The n8n node is a management and utility node within n8n itself. It lets you interact with your n8n instance programmatically from inside a workflow: list workflows, get workflow details, activate or deactivate workflows, and retrieve execution data. This turns n8n into a self-managing platform where workflows can monitor and control other workflows. This node is useful for operational management of n8n at scale. When you have dozens or hundreds of workflows running, you need visibility into which ones are active, which have failed recently, and which need attention. Instead of manually checking the n8n dashboard, you can build a monitoring workflow that uses the n8n node to pull execution logs, check for failures, and send alerts to Slack or email. The node can also automate workflow lifecycle management. You can build workflows that activate seasonal automations on a schedule, deactivate workflows during maintenance windows, or generate reports on workflow execution counts and success rates. For teams that manage n8n for multiple clients or departments, this provides centralised control. At Osher, we use the n8n node in every client deployment for operational monitoring. Our n8n consulting team builds monitoring dashboards and alerting workflows that report on workflow health, execution failures, and performance metrics. When we manage n8n for clients through our business automation services, this node is how we keep tabs on everything running across the platform.
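
    A hedged sketch of the kind of call a monitoring workflow makes, assuming the v1 public API with an X-N8N-API-KEY header and a placeholder instance URL:

    ```typescript
    const N8N_URL = 'https://n8n.example.com';
    const headers = { 'X-N8N-API-KEY': process.env.N8N_API_KEY ?? '' };

    async function reportRecentFailures(): Promise<void> {
      const res = await fetch(`${N8N_URL}/api/v1/executions?status=error&limit=20`, { headers });
      if (!res.ok) throw new Error(`n8n API error: ${res.status}`);
      const { data } = (await res.json()) as {
        data: { workflowId: string; stoppedAt: string }[];
      };

      for (const execution of data) {
        console.log(`Workflow ${execution.workflowId} failed at ${execution.stoppedAt}`);
        // In a real monitoring workflow this would post to Slack or email the team.
      }
    }
    ```
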
  • Redis

    Redis

    Redis is an open-source, in-memory data store that functions as a database, cache, message broker, and streaming engine. Unlike traditional disk-based databases, Redis holds data in RAM, which means read and write operations happen in microseconds rather than milliseconds. It supports data structures including strings, hashes, lists, sets, sorted sets, and streams — making it far more flexible than a simple key-value cache. The most common problem Redis solves is speed. When your application queries a relational database for the same data repeatedly, response times degrade as load increases. Redis sits between your application and your database, serving frequently accessed data from memory. Session stores, leaderboards, rate limiters, real-time analytics counters, and pub/sub messaging channels all run well on Redis because they need sub-millisecond response times. For automation workflows built on n8n, Redis is useful as a shared state store between workflow executions. You can cache API responses, deduplicate incoming webhook data, or manage queue-based processing where multiple workflows need to coordinate. Redis Streams can also act as a lightweight message broker for event-driven architectures. At Osher, we connect Redis into broader system integration projects where performance matters — particularly for real-time data pipelines and AI agent architectures that need fast access to context data between inference calls.
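
    A minimal sketch of the deduplication pattern mentioned above, using the node-redis client — the key prefix and TTL are placeholders:

    ```typescript
    import { createClient } from 'redis';

    const redis = createClient({ url: 'redis://localhost:6379' });

    async function isDuplicate(eventId: string): Promise<boolean> {
      // SET ... NX only succeeds if the key does not exist yet; EX sets a TTL so
      // old event IDs expire instead of accumulating forever.
      const result = await redis.set(`webhook:seen:${eventId}`, '1', { NX: true, EX: 86_400 });
      return result === null; // null means the key already existed → duplicate
    }

    async function main(): Promise<void> {
      await redis.connect();
      console.log(await isDuplicate('evt_123')); // false on first sight
      console.log(await isDuplicate('evt_123')); // true on replay
      await redis.quit();
    }
    main();
    ```
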
  • SendGrid

    SendGrid

    SendGrid is a cloud-based email delivery service owned by Twilio that handles both transactional and marketing email at scale. Rather than managing your own mail server (and dealing with IP reputation, SPF/DKIM configuration, and deliverability monitoring), SendGrid provides an API and SMTP relay that routes email through their infrastructure with built-in bounce handling, spam compliance, and delivery analytics. Transactional emails — order confirmations, password resets, shipping notifications — need to arrive instantly and reliably. Marketing emails — newsletters, product announcements, drip campaigns — need to land in inboxes rather than spam folders. SendGrid handles both, with separate IP pools so your marketing sends do not affect transactional delivery rates. In n8n automation workflows, the SendGrid node lets you send emails via the API, manage contacts and lists, and work with dynamic templates. This is particularly useful for workflows that respond to CRM events, form submissions, or e-commerce triggers — you can personalise emails with data pulled from other systems in the same workflow. Osher integrates SendGrid into business automation projects where clients need reliable email as part of a larger workflow. We also use it in sales automation setups where lead nurture sequences need to fire based on CRM activity or website behaviour.
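
    A minimal sketch of a templated send with the @sendgrid/mail package — the template ID, sender address, and dynamic data are placeholders:

    ```typescript
    import sgMail from '@sendgrid/mail';

    sgMail.setApiKey(process.env.SENDGRID_API_KEY ?? '');

    async function sendOrderConfirmation(to: string, orderNumber: string, total: string) {
      await sgMail.send({
        to,
        from: 'orders@example.com',                 // must be a verified sender
        templateId: 'd-1234567890abcdef',           // dynamic template created in SendGrid
        dynamicTemplateData: { orderNumber, total },
      });
    }
    ```
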
  • Jira Software

    Jira Software

    Jira Software is Atlassian’s project management and issue tracking platform, built primarily for software development teams but used widely across IT, operations, and business teams. It provides Scrum boards, Kanban boards, sprint planning, backlog management, and detailed reporting — all structured around the concept of issues (tasks, bugs, stories, epics) that move through configurable workflows. The problem Jira solves is visibility. When development and IT teams track work across spreadsheets, emails, or chat messages, things get lost. Jira centralises all work items with status tracking, assignees, priorities, due dates, and links to related code commits or pull requests. Managers get dashboards and burndown charts; team members get a clear queue of what to work on next. For automation, Jira’s REST API is comprehensive. The n8n Jira node supports creating issues, updating fields, adding comments, transitioning statuses, and reading issue data. This means you can build workflows that automatically create Jira tickets from form submissions, escalate support tickets from a helpdesk into engineering sprints, or sync project status with external reporting tools. Osher connects Jira into broader system integration projects where development teams need their project management data flowing into other business systems. We also build process automation workflows that eliminate manual ticket creation and status updates.
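
    A minimal sketch of creating an issue through the Jira Cloud REST API v3 (basic auth with email and API token) — the site URL, project key, and issue type are placeholders, and v3 expects descriptions in Atlassian Document Format:

    ```typescript
    const JIRA_URL = 'https://your-site.atlassian.net';
    const auth = Buffer.from(
      `${process.env.JIRA_EMAIL}:${process.env.JIRA_API_TOKEN}`,
    ).toString('base64');

    async function createBug(summary: string, detail: string): Promise<string> {
      const res = await fetch(`${JIRA_URL}/rest/api/3/issue`, {
        method: 'POST',
        headers: { Authorization: `Basic ${auth}`, 'Content-Type': 'application/json' },
        body: JSON.stringify({
          fields: {
            project: { key: 'OPS' },
            issuetype: { name: 'Bug' },
            summary,
            description: {
              type: 'doc',
              version: 1,
              content: [{ type: 'paragraph', content: [{ type: 'text', text: detail }] }],
            },
          },
        }),
      });
      if (!res.ok) throw new Error(`Jira error ${res.status}: ${await res.text()}`);
      return ((await res.json()) as { key: string }).key; // e.g. "OPS-123"
    }
    ```
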
  • AWS S3

    AWS S3

    AWS S3 (Simple Storage Service) is Amazon’s cloud object storage, designed to store and retrieve any amount of data from anywhere. Unlike a traditional file system with folders and drives, S3 stores data as objects in flat-namespace buckets, each object identified by a unique key. It is built for durability (99.999999999% — eleven nines) and scales automatically without capacity planning. The problem S3 solves is reliable, scalable file storage without managing servers. Businesses use it for backup and archival, hosting static website assets, storing documents and media files, feeding data into analytics pipelines, and as a staging area for data that moves between systems. S3’s lifecycle policies can automatically move older data to cheaper storage tiers (S3 Glacier, Deep Archive) to control costs. In n8n automation workflows, the AWS S3 node lets you upload, download, list, copy, and delete objects in S3 buckets. Common patterns include archiving processed documents, storing generated reports for later retrieval, feeding files into AI processing pipelines, and syncing files between S3 and other storage systems. S3 event notifications can also trigger n8n workflows when new files arrive. Osher uses S3 in automated data processing projects where files need to flow between systems reliably. We also connect S3 to AI agent workflows that process documents, images, or other unstructured data stored in buckets.
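
    A minimal sketch of an upload and download with the AWS SDK v3 — bucket, key, and region are placeholders, with credentials resolved from the environment as usual:

    ```typescript
    import { S3Client, PutObjectCommand, GetObjectCommand } from '@aws-sdk/client-s3';
    import { readFile } from 'node:fs/promises';

    const s3 = new S3Client({ region: 'ap-southeast-2' });

    async function archiveReport(localPath: string, key: string): Promise<void> {
      await s3.send(new PutObjectCommand({
        Bucket: 'example-processed-reports',
        Key: key,                                  // e.g. "2024/06/daily-report.pdf"
        Body: await readFile(localPath),
        ContentType: 'application/pdf',
      }));
    }

    async function fetchReport(key: string): Promise<Buffer> {
      const res = await s3.send(
        new GetObjectCommand({ Bucket: 'example-processed-reports', Key: key }),
      );
      return Buffer.from(await res.Body!.transformToByteArray());
    }
    ```
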
  • Notion Trigger

    Notion Trigger

    The Notion Trigger node in n8n starts a workflow automatically whenever something changes in a Notion workspace. It polls your Notion database or page at regular intervals and fires when it detects new or updated entries. This turns Notion from a static workspace into an active part of your automation stack. Many teams use Notion as their central hub for project tracking, content calendars, meeting notes, and task management. The problem is that updates in Notion rarely flow to other tools automatically. When a project status changes, someone has to manually update the CRM, notify the team on Slack, or create a task in the project management tool. Notion Trigger eliminates that manual step. The node connects through Notion’s API using an internal integration token. You configure it to watch a specific database and trigger when pages are created or updated. It returns the page properties (title, status, dates, people, select fields) as JSON, which you can then route to any other node in your workflow. At Osher, we connect Notion to other business tools for teams that have outgrown manual processes. A common build is Notion-to-Slack notifications when project statuses change, or Notion-to-CRM syncs that keep sales pipelines updated. Our business automation team can set this up so your Notion workspace drives action across your entire tool stack.
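
    A hedged sketch of the polling pattern: query a database for pages edited since the last check. The database ID and the "Status" select property are placeholder assumptions, and the Notion-Version header value is simply a recent API version:

    ```typescript
    const NOTION_TOKEN = process.env.NOTION_TOKEN ?? '';
    const DATABASE_ID = 'your-database-id';

    async function pagesEditedSince(lastCheckIso: string) {
      const res = await fetch(`https://api.notion.com/v1/databases/${DATABASE_ID}/query`, {
        method: 'POST',
        headers: {
          Authorization: `Bearer ${NOTION_TOKEN}`,
          'Notion-Version': '2022-06-28',
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({
          filter: {
            timestamp: 'last_edited_time',
            last_edited_time: { on_or_after: lastCheckIso },
          },
        }),
      });
      const { results } = (await res.json()) as { results: any[] };
      return results.map((page) => ({
        id: page.id,
        status: page.properties?.Status?.select?.name, // assumes a "Status" select property
      }));
    }
    ```
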
  • XML

    The XML node in n8n converts data between XML and JSON formats inside your automation workflows. It can parse incoming XML strings into structured JSON that other n8n nodes can work with, and it can convert JSON data back into XML for systems that require it. XML is still widely used in enterprise integrations, government APIs, SOAP web services, EDI transactions, and legacy system data exports. If your business receives XML files from suppliers, parses XML API responses from government services, or needs to submit data in XML format to compliance systems, this node handles the conversion without custom code. The node operates in two modes. “XML to JSON” takes an XML string and produces a structured JSON object with all elements, attributes, and nested structures preserved. “JSON to XML” does the reverse, converting your JSON data into valid XML with configurable options for root element names, attribute handling, and declaration headers. At Osher, we work with XML regularly in system integration projects, particularly when connecting modern APIs (which use JSON) to older enterprise systems (which expect XML). Government and financial services APIs in Australia often still return XML, and our automated data processing workflows handle the translation between formats so your team does not have to deal with raw XML manually.
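
    A minimal sketch of both directions using the xml2js package (the n8n node exposes equivalent options); the invoice structure is a placeholder:

    ```typescript
    import { parseStringPromise, Builder } from 'xml2js';

    async function demo(): Promise<void> {
      const xml = `<invoice number="INV-001"><total currency="AUD">150.00</total></invoice>`;

      // XML → JSON: attributes end up under "$" and element text under "_" by default
      const asJson = await parseStringPromise(xml, { explicitArray: false });
      console.log(asJson.invoice.$.number, asJson.invoice.total._); // INV-001 150.00

      // JSON → XML with a configurable root element
      const builder = new Builder({ rootName: 'order', headless: true });
      console.log(builder.buildObject({ id: 42, customer: 'Acme Pty Ltd' }));
    }
    demo();
    ```
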
  • OpenWeatherMap

    OpenWeatherMap

    OpenWeatherMap is a weather data API that provides current conditions, forecasts, and historical weather data for locations worldwide. In n8n, the OpenWeatherMap node lets you pull weather data directly into your automation workflows, so you can build processes that react to real-world weather conditions. Businesses that depend on weather (agriculture, logistics, construction, outdoor events, insurance, energy) often need weather data to feed into their operational decisions. The problem is that checking weather manually and then adjusting schedules, alerts, or processes is slow and error-prone. The OpenWeatherMap node automates this by fetching weather data on a schedule or on demand and feeding it into your workflow logic. The node can retrieve current weather by city name, coordinates, or zip code. It returns temperature, humidity, wind speed, precipitation, weather descriptions, and cloud coverage as structured data. You can use IF nodes downstream to branch your workflow based on conditions: send a storm warning if wind exceeds a threshold, reschedule outdoor work if rain is forecast, or log temperature data for compliance reporting. At Osher, we have direct experience building weather-driven automation. We built a weather data pipeline for an insurance tech company that processed weather data from the Bureau of Meteorology using n8n. Our data processing team builds similar workflows that turn weather APIs into operational triggers for Australian businesses.
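
    A minimal sketch of the fetch-and-branch pattern described above — the city, units, and wind threshold are placeholders:

    ```typescript
    const OWM_KEY = process.env.OPENWEATHERMAP_API_KEY ?? '';

    async function checkSiteConditions(city = 'Brisbane,AU'): Promise<void> {
      const url = `https://api.openweathermap.org/data/2.5/weather` +
        `?q=${encodeURIComponent(city)}&units=metric&appid=${OWM_KEY}`;
      const res = await fetch(url);
      const weather = (await res.json()) as { wind: { speed: number } };

      const windKmh = weather.wind.speed * 3.6;   // API returns m/s with metric units
      if (windKmh > 40) {
        console.log(`High wind warning for ${city}: ${windKmh.toFixed(0)} km/h`);
        // Downstream: notify site managers, reschedule outdoor work, etc.
      }
    }
    ```
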
  • Execute Workflow Trigger

    The Execute Workflow Trigger node is an n8n trigger that lets one workflow call another as a sub-workflow. It acts as the entry point for a workflow that is designed to be invoked by a parent workflow using the Execute Workflow node, rather than running on its own schedule or webhook. As your n8n automations grow, individual workflows can become long and difficult to maintain. The Execute Workflow Trigger solves this by allowing you to break large workflows into smaller, reusable modules. A parent workflow handles the main logic and calls sub-workflows for specific tasks, passing data in and receiving results back. This is the same concept as functions in programming. You build a sub-workflow once (for example, “look up a customer in the CRM and return their details”) and call it from any parent workflow that needs that data. The Execute Workflow Trigger node sits at the start of the sub-workflow and receives the input data from the parent. At Osher, we use this pattern extensively in complex automation projects. Our n8n consultants design modular workflow architectures where shared logic lives in sub-workflows. This keeps each workflow focused, testable, and maintainable. If you are building more than a handful of n8n workflows, our business automation team can help you structure them properly with sub-workflow patterns.
  • GraphQL

    GraphQL

    The GraphQL node in n8n lets you send queries and mutations to any GraphQL API directly from your automation workflows. Instead of making multiple REST API calls to assemble the data you need, a single GraphQL query can request exactly the fields you want from multiple related resources in one request. GraphQL APIs are used by platforms like Shopify, GitHub, Contentful, Strapi, Hasura, and many modern SaaS products. If you are integrating with any of these services, the GraphQL node gives you precise control over what data you fetch. You write a GraphQL query, set your variables, and the node returns structured JSON with only the fields you asked for. The node supports queries (read operations), mutations (write operations), and custom headers for authentication. You can pass Bearer tokens, API keys, or custom auth headers. Variables can be set dynamically from upstream nodes, so your GraphQL queries can be parameterised based on data flowing through the workflow. At Osher, we use the GraphQL node for Shopify integrations, headless CMS connections, and any API that offers a GraphQL endpoint alongside or instead of REST. Our system integration team picks GraphQL over REST when the data requirements are complex and we need to reduce the number of API calls. If you are connecting to GraphQL-based services, our custom development team can build the queries and wire them into your workflows.
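
    A minimal sketch of a parameterised query over plain HTTP, which is essentially what the node does — the endpoint, token, fields, and variables are placeholder assumptions for whatever schema you are calling:

    ```typescript
    async function fetchOrders(first: number) {
      const query = `
        query RecentOrders($first: Int!) {
          orders(first: $first) {
            id
            total
            customer { email }
          }
        }
      `;

      const res = await fetch('https://api.example.com/graphql', {
        method: 'POST',
        headers: {
          Authorization: `Bearer ${process.env.API_TOKEN}`,
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({ query, variables: { first } }),
      });

      const { data, errors } = (await res.json()) as {
        data: any;
        errors?: { message: string }[];
      };
      if (errors?.length) throw new Error(errors[0].message);
      return data.orders;
    }
    ```
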
  • FTP

    The FTP node in n8n connects to FTP and SFTP servers to upload, download, list, rename, move, and delete files programmatically. It authenticates with a hostname, port, username, and password (or private key for SFTP), and handles both active and passive transfer modes. The node works with binary file data, so files retrieved from an FTP server can be passed directly to other n8n nodes for processing, storage, or delivery. FTP may seem like old technology, but it remains deeply embedded in many business operations. Suppliers send EDI files via SFTP. Accounting systems export reports to FTP directories. Government portals require SFTP uploads for compliance submissions. Legacy systems that can’t use modern APIs often have FTP as their only integration option. The FTP node lets you connect these systems into modern automated workflows without replacing the underlying infrastructure. At Osher, we use the FTP node in system integration projects where a client’s suppliers, partners, or legacy systems rely on FTP for file exchange. We’ve built workflows that poll SFTP directories for new files, process the data through transformation and validation steps, and push the results into modern databases or cloud applications. Our automated data processing team handles the full pipeline from file retrieval to downstream delivery.
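
    A minimal sketch of polling an SFTP directory with the ssh2-sftp-client package — the host, credentials, and paths are placeholders:

    ```typescript
    import Client from 'ssh2-sftp-client';

    async function pullInboundFiles(): Promise<void> {
      const sftp = new Client();
      await sftp.connect({
        host: 'sftp.partner.example.com',
        port: 22,
        username: 'acme',
        password: process.env.SFTP_PASSWORD,
      });

      const entries = await sftp.list('/outgoing');
      for (const entry of entries.filter((e) => e.name.endsWith('.csv'))) {
        const data = (await sftp.get(`/outgoing/${entry.name}`)) as Buffer;
        console.log(`Downloaded ${entry.name} (${data.length} bytes)`);
        // Downstream: parse, validate, and load into a database or CRM.
      }

      await sftp.end();
    }
    ```
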
  • MongoDB

    MongoDB

    The MongoDB node in n8n connects to MongoDB databases (including MongoDB Atlas cloud instances) and lets your workflows insert, find, update, aggregate, and delete documents. It authenticates via a connection string with username, password, and database name, and supports all standard MongoDB operations including query filters, projection, sorting, and the aggregation pipeline framework. MongoDB is the most widely used NoSQL database, and many web applications, APIs, and data platforms store their operational data in it. The n8n MongoDB node lets you pull that data into automated workflows, push processed data back into collections, and react to changes in your MongoDB data as part of broader business automations. This is particularly useful for companies whose applications use MongoDB as their primary data store but need that data to flow into reporting tools, notification systems, or external APIs. At Osher, we use the MongoDB node in custom AI development and system integration projects where client applications run on MongoDB. We connect MongoDB collections to reporting dashboards, sync data between MongoDB and other databases, and build workflows that react to document changes by sending notifications or triggering downstream processes. We built a data pipeline in a similar architecture for our BOM weather data project, where structured data needed to flow reliably between systems.
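
    A minimal sketch of a find-and-update pass with the official driver, mirroring the operations the node exposes — the connection string, database, collection, and fields are placeholders:

    ```typescript
    import { MongoClient } from 'mongodb';

    async function flagStaleOrders(): Promise<void> {
      const client = new MongoClient(process.env.MONGODB_URI ?? 'mongodb://localhost:27017');
      await client.connect();
      try {
        const orders = client.db('shop').collection('orders');

        // Orders still pending after 48 hours
        const stale = await orders
          .find({ status: 'pending', createdAt: { $lt: new Date(Date.now() - 48 * 3600_000) } })
          .project({ _id: 1, customerEmail: 1 })
          .toArray();

        for (const order of stale) {
          console.log(`Order ${order._id} pending for 48h — notify ${order.customerEmail}`);
        }
        await orders.updateMany(
          { _id: { $in: stale.map((o) => o._id) } },
          { $set: { flagged: true } },
        );
      } finally {
        await client.close();
      }
    }
    ```
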
  • Microsoft SQL

    Microsoft SQL

    Microsoft SQL Server (MSSQL) is a relational database management system built for storing, querying, and managing structured data at scale. In n8n, the Microsoft SQL node lets you run SQL queries, insert and update rows, and pull data from MSSQL databases directly inside your automation workflows. Most organisations that run MSSQL have business-critical data locked inside it: customer records, transaction histories, inventory levels, financial reporting tables. The problem is that getting data out of MSSQL and into other systems usually means manual exports, CSV files emailed between teams, or expensive middleware. The n8n MSSQL node fixes this by connecting your database to any other service in your workflow, whether that is a CRM, an accounting platform, or a reporting dashboard. Common use cases include syncing MSSQL customer records with your CRM on a schedule, pulling order data into automated invoicing workflows, and running parameterised queries to feed reporting tools with fresh data. The node supports SELECT, INSERT, UPDATE, and DELETE operations, so you can both read from and write back to your database. At Osher, we build MSSQL-connected workflows for Australian businesses that need their database talking to the rest of their tech stack. If your team is spending hours on manual data exports or copy-pasting between systems, our system integration services can connect MSSQL to your existing tools and automate the data flow.
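
    A minimal sketch of a parameterised query with the mssql package — the connection details, table, and columns are placeholders:

    ```typescript
    import sql from 'mssql';

    const config = {
      server: 'sql.example.internal',
      database: 'Sales',
      user: 'automation',
      password: process.env.MSSQL_PASSWORD ?? '',
      options: { trustServerCertificate: true },
    };

    async function recentLargeOrders(minTotal: number) {
      const pool = await sql.connect(config);
      const result = await pool
        .request()
        .input('minTotal', sql.Decimal(10, 2), minTotal)   // parameterised, not string-built
        .query('SELECT OrderId, CustomerName, Total FROM dbo.Orders WHERE Total >= @minTotal');
      await pool.close();
      return result.recordset;
    }
    ```
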
  • Rename Keys

    Rename Keys is a utility node in n8n that changes the property names (keys) of JSON objects as they pass through a workflow. When you connect two systems that use different field names for the same data, Rename Keys sits between them and translates one naming convention to another. This is a common problem in integration work. Your CRM might call a field “company_name” while your accounting software expects “organisation”. Your API returns “firstName” but your database column is “first_name”. Without a translation step, data either fails to map or ends up in the wrong fields. Rename Keys solves this without writing any code. The node lets you define one or more key renaming rules. You specify the current key name and what it should become. It can handle nested JSON properties and rename multiple keys in a single step. You can also use regex patterns for more advanced renaming across keys that share a common pattern. At Osher, we use Rename Keys constantly in our system integration projects. Whenever we connect two platforms that structure their data differently, this node handles the field mapping without adding code nodes to the workflow. If your n8n workflows are breaking because of mismatched field names between systems, this is usually the fix.
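
    The underlying transformation is simple enough to show directly; here is a minimal sketch with illustrative field names:

    ```typescript
    // Map old key names to new ones as an item passes through
    function renameKeys<T extends Record<string, unknown>>(
      item: T,
      mapping: Record<string, string>,
    ): Record<string, unknown> {
      return Object.fromEntries(
        Object.entries(item).map(([key, value]) => [mapping[key] ?? key, value]),
      );
    }

    // CRM payload → field names the accounting system expects
    const crmContact = { company_name: 'Acme Pty Ltd', firstName: 'Dana', phone: '0400 000 000' };
    const mapped = renameKeys(crmContact, { company_name: 'organisation', firstName: 'first_name' });
    // { organisation: 'Acme Pty Ltd', first_name: 'Dana', phone: '0400 000 000' }
    ```
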
  • GitHub

    GitHub

    The GitHub node in n8n connects to the GitHub REST and webhook APIs using personal access tokens or OAuth2 authentication. It lets your workflows interact with repositories, issues, pull requests, releases, and users programmatically. You can create and update issues, trigger workflows on push or PR events via webhooks, read repository contents, manage labels, and post comments, all from within an n8n automation. Development teams already live in GitHub, but the operational work around code, including issue triage, release notifications, deployment tracking, and cross-team communication, still involves manual steps. The GitHub node closes that gap by connecting your code repository to the rest of your business tools. When a PR gets merged, Slack gets notified. When an issue is labelled as urgent, it gets pushed to your project management board. When a release is published, clients get an update email. At Osher, we use the GitHub node in system integration projects where development activity needs to flow into project management, client communication, or deployment pipelines. We also use it in our own internal tooling, connecting our repositories to Slack alerts and task tracking. Our custom development team can build GitHub-connected workflows for any DevOps or project management use case.
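
    A minimal sketch of two common operations with Octokit, the official client for the same REST API — the owner, repo, and numbers are placeholders:

    ```typescript
    import { Octokit } from '@octokit/rest';

    const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });

    async function raiseDeploymentIssue(summary: string): Promise<void> {
      await octokit.rest.issues.create({
        owner: 'example-org',
        repo: 'example-app',
        title: `Deployment follow-up: ${summary}`,
        labels: ['ops'],
      });
    }

    async function commentOnPr(prNumber: number, body: string): Promise<void> {
      await octokit.rest.issues.createComment({
        owner: 'example-org',
        repo: 'example-app',
        issue_number: prNumber, // PR comments go through the issues API
        body,
      });
    }
    ```
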
  • SSH

    The SSH node in n8n executes commands on remote servers over an encrypted SSH connection. You configure it with a hostname, port, and authentication credentials (password or private key), and it runs shell commands on the target machine and returns the stdout and stderr output to your workflow. This gives n8n the ability to interact with any Linux or Unix server as part of an automated pipeline. This matters when your automation needs to touch infrastructure directly. Restarting a service, pulling fresh data from a server-side script, checking disk usage, deploying a configuration change, or running a database backup command are all tasks that require shell access. The SSH node lets you include these steps in a broader workflow without writing a separate script or logging into the server manually. At Osher, we use the SSH node in system integration projects where part of the workflow involves interacting with on-premise servers or VPS instances. For example, triggering a data export script on a client’s server, pulling log files for analysis, or restarting services as part of an automated deployment pipeline. Our n8n consulting team builds these workflows with proper key-based authentication and scoped command permissions.
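
    A minimal sketch of running one command over SSH with key-based auth using the ssh2 package — the host, user, key path, and command are placeholders:

    ```typescript
    import { readFileSync } from 'node:fs';
    import { Client } from 'ssh2';

    function runRemote(command: string): Promise<string> {
      return new Promise((resolve, reject) => {
        const conn = new Client();
        conn
          .on('ready', () => {
            conn.exec(command, (err, stream) => {
              if (err) return reject(err);
              let output = '';
              stream
                .on('data', (chunk: Buffer) => (output += chunk.toString()))
                .on('close', () => { conn.end(); resolve(output); });
              stream.stderr.on('data', (chunk: Buffer) => (output += chunk.toString()));
            });
          })
          .on('error', reject)
          .connect({
            host: 'app01.example.internal',
            port: 22,
            username: 'deploy',
            privateKey: readFileSync('/home/n8n/.ssh/id_ed25519'),
          });
      });
    }

    runRemote('df -h /var/lib/postgresql').then(console.log);
    ```
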
  • HTML Extract

    The HTML Extract node in n8n pulls specific data out of HTML content using CSS selectors. You point it at an HTML string (from a web page, email body, or API response) and define which elements to extract using selectors like class names, IDs, tag names, or attribute values. It returns the extracted content as structured JSON data your workflow can use. This node solves a common problem in automation: getting usable data out of web pages and HTML-formatted content. Many systems do not offer a clean API but do have web pages with the data you need. HTML Extract lets you grab that data programmatically. It also works for parsing HTML emails, scraping structured content from web applications, and extracting data from HTML-formatted API responses. At Osher, we use HTML Extract in data processing workflows where the source data is embedded in web content. Practical examples include extracting product prices and stock levels from supplier websites that lack APIs, pulling structured data from HTML email notifications (order confirmations, shipping updates, alert emails), and parsing web application pages to capture data for downstream processing. The node supports extracting text content, HTML content, or attribute values (like href from links or src from images). You can define multiple extraction rules in a single node to pull several data points from the same HTML source. Our integration team can build web scraping and HTML parsing workflows for your specific data sources.
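
    A minimal sketch of the same idea with the cheerio package — the markup and selectors stand in for a supplier product page:

    ```typescript
    import * as cheerio from 'cheerio';

    const html = `
      <div class="product">
        <h2 class="title">Widget Pro 3000</h2>
        <span class="price">$149.95</span>
        <a class="datasheet" href="/files/widget-pro-3000.pdf">Datasheet</a>
      </div>`;

    const $ = cheerio.load(html);

    const product = {
      title: $('.product .title').text().trim(),          // text content
      price: $('.product .price').text().trim(),
      datasheetUrl: $('.product .datasheet').attr('href'), // attribute value
    };
    console.log(product);
    ```
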
  • Read Binary File

    The Read Binary File node in n8n reads a file from the local filesystem of the machine where n8n is running and loads it into the workflow as binary data. You specify the file path, and the node makes that file available for downstream processing by other nodes. It is the starting point for any workflow that needs to work with files stored on the n8n server. This node is most relevant for self-hosted n8n deployments where your n8n instance has access to a local or mounted filesystem. If files land on a shared drive, a mounted network volume, or a specific directory on your server (from an SFTP upload, a cron job, or another application), Read Binary File picks them up and brings them into your n8n workflow for processing. At Osher, we use Read Binary File in data processing pipelines where files arrive on the server from external sources. A typical setup involves a partner system dropping CSV files into a directory via SFTP. The n8n workflow uses an Interval trigger to check the directory periodically, Read Binary File to load any new files, and then processes them through conversion, validation, and database insertion steps. We also use it for reading configuration files, loading email templates, and accessing locally stored reference data. For workflows that need to read files from cloud storage (Google Drive, S3, Dropbox), you would use those platform-specific nodes instead. Read Binary File is specifically for files on the local filesystem. Our n8n team can help you design file processing workflows regardless of where your files are stored.
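
    A minimal sketch of the node's job — load one local file as binary data and pass it on with basic metadata — with a placeholder path:

    ```typescript
    import { readFile } from 'node:fs/promises';
    import { extname, basename } from 'node:path';

    async function loadFile(path: string) {
      const data = await readFile(path);          // Buffer, regardless of file type
      return {
        fileName: basename(path),
        extension: extname(path).toLowerCase(),   // e.g. ".csv", ".pdf"
        sizeBytes: data.length,
        data,                                     // handed to the next processing step
      };
    }

    loadFile('/data/sftp-landing/contacts-2024-06-01.csv').then((f) =>
      console.log(`${f.fileName}: ${f.sizeBytes} bytes`),
    );
    ```
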