How to Sync Issue Types and Select Lists (Dropdown) Between Jira On-premise and Jira Cloud 


This article was originally published on the Atlassian Community.

Frictionless issue tracking and resolution form the basis of project management tools like Jira. But when teams use different Jira instances like Jira Cloud and Jira on-premise, managing information between them can be challenging. 

So how would you sync issue types and custom fields, like select lists, between these different Jira instances? 

Enter Exalate, a synchronization solution that helps map and sync various fields between multiple Jira deployments. 

Let’s dive in and see how!

The Use Case

The Jira Cloud and Jira on-premise instances must exchange the following issue information:

  • Summary, description, labels, comments, and attachments must be synced bi-directionally. 
  • Issue types must be mapped and synced bi-directionally. So if an issue type in one instance is a Task, it must become an Incident in the other Jira instance. And if the issue type is Bug, then it must be a Change in the other system. 
  • Custom fields such as select lists, aka dropdowns, must be synced bi-directionally.

The Challenges

Basic issue fields like summary, description, etc. are easy to sync. The real challenge is to map the issue types in the two instances and sync them properly. The same mapping logic must apply to the select list entries as well. 

If a user selects a particular item from the list, the other instance’s list must reflect the correct item.

Let’s take a look at why Exalate excels in a use case like this. 

Exalate: a Customizable Integration Solution 

Exalate is a highly customizable integration solution to synchronize data across multiple platforms like Jira, ServiceNow, Zendesk, Salesforce, Azure DevOps, GitHub, etc. 

With Exalate, you can control your sync at a granular level and automate the information flow based on the native query languages of the platform you use. 

Some key features of Exalate are: 

  • Customizability: Exalate provides flexible configuration options, allowing teams to define their sync rules, filters, mappings, and workflows. Such customization ensures that the integration meets the specific needs and requirements of each team. 
  • Security and Control: Exalate prioritizes security and provides features such as encrypted communication, access controls, and audit logs. This ensures that sensitive data remains secure during the synchronization process.
  • Scalability: Exalate is designed to handle large volumes of data and can scale according to the needs of your organization. It supports synchronization between multiple projects, teams, and even different companies.

How to Sync Issue Types and Select Lists Between Jira Cloud and Jira On-premise 

Start by installing Exalate on both the Jira Cloud and Jira on-premise instances. 

Then connect the two instances using the Script mode.

For more information on how to do that, refer to the step-by-step Getting Started guide. You can also watch the Exalate Academy videos if you prefer. 

After establishing the connection, click the “Configure Sync” button. You’ll be redirected to the “Rules” tab, where you decide what information to send and receive between the platforms.

Exalate provides two sets of script windows in both Jira instances. The “Outgoing sync” defines what data must go out from a particular Jira instance, and the “Incoming sync” defines how to interpret data coming from the other Jira instance. 
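
For reference, here is a minimal sketch of what these rules typically look like for the basic fields in this use case. Exalate’s sync rules are Groovy scripts, and the exact defaults generated for your instance may differ slightly by version and platform.

Outgoing sync (on either Jira instance):

    // send the basic fields to the other side
    replica.summary     = issue.summary
    replica.description = issue.description
    replica.labels      = issue.labels
    replica.comments    = issue.comments
    replica.attachments = issue.attachments

Incoming sync (on either Jira instance):

    // apply the received fields to the local issue
    issue.summary     = replica.summary
    issue.description = replica.description
    issue.labels      = replica.labels
    issue.comments    = commentHelper.mergeComments(issue, replica)
    issue.attachments = attachmentHelper.mergeAttachments(issue, replica)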

Sync rules in Jira

Map and Sync Issue Types

Next, you must map the issue types between the platforms. You can perform this mapping in either of the “Incoming sync” script windows, depending on your requirements.

In this use case, we will create a Task or a Bug in Jira Cloud, and it should be reflected as an Incident or a Change in the Jira on-premise instance. So, when you create and sync an issue for the first time, a corresponding Incident or Change will be created in the other instance based on the mapping you have set.

You can also apply the same logic when the issue type is updated later by adding the mapping under the (!firstSync) condition. The firstSync variable indicates whether the issue is being synced for the first time. 

As seen in the code, a typeMap variable stores the mapping. It’s in the format remoteissuetype:localissuetype.

Then we call the getIssueType method of the nodeHelper class of Exalate. This method allows you to fetch the issue type of the replica. 

Note: The replica carries the information to be passed between the systems, serving as the payload.

The getIssueType method looks up the local issue type that we have populated in the typeMap variable and assigns it to the issue type in Jira on-premise. 

A fallback value is also assigned to the issue type if none of the values match. In our case, the fallback value is Task, but it can be any other value of your choice. 

Note:  We have shown a uni-directional sync here, but it can be bi-directional as well. 
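
For reference, a minimal sketch of this mapping in the “Incoming sync” of the Jira on-premise instance might look roughly like this, assuming Exalate’s Groovy scripting API for Jira. Adjust the type names to your own projects; the fallback here is Task, as in our use case.

    // remote issue type : local issue type
    def typeMap = [
      "Task" : "Incident",
      "Bug"  : "Change"
    ]

    // look up the local issue type, falling back to "Task" when nothing matches
    def localType = typeMap[replica.type?.name] ?: "Task"

    if (firstSync) {
      // the issue is created with the mapped type on the first sync
      issue.type = nodeHelper.getIssueType(localType)
    }

    if (!firstSync) {
      // keep the issue type aligned when it changes after the first sync
      issue.type = nodeHelper.getIssueType(localType)
    }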

Sync issue types in Jira

Sync Select Lists (Dropdown) 

Select lists, also called dropdowns, are user-defined fields that allow you to select specific values from a list. These are called custom fields in the Exalate context. 

To ensure that the information in the custom field is synced to the destination instance, you only need to add a single line of code.

For instance, we have a dropdown called ‘Support Group’ that holds multiple values like IT, HR, Finance, etc. When the user selects “IT” in one of the Jira instances, the corresponding value will be displayed in the other system.

Custom field in Jira

Remember, just like you mapped the different issue types, you can also set up custom mappings for syncing select lists. 
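
For reference, here is a minimal sketch of this custom field sync, assuming a select list named “Support Group” exists on both sides: one line in the “Outgoing sync” and one in the “Incoming sync”. If the option names differ between the instances, you can route the received value through a map, just like the issue type mapping above.

    // Outgoing sync: send the select list value
    replica.customFields."Support Group" = issue.customFields."Support Group"

    // Incoming sync: apply the received value to the local select list
    issue.customFields."Support Group".value = replica.customFields."Support Group".value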

Output

When a Task is created in the Jira Cloud instance, an Incident is created in the Jira on-premise instance.

Sync issue types in Jira

 

Likewise, the support group “Finance” selected in the Jira Cloud instance correctly reflects the value in the Jira on-premise instance. 

Conclusion

By integrating platforms, modern businesses stay competitive and responsive to customer needs. The use case we demonstrated is just one example of how integration tools can enable collaboration between teams. 

Curious to know if Exalate is the right solution for your business? It’s just a click away!

Recommended Reads:

How Integration Service Providers Can Help Simplify Data Integration

Integration Service Provider

In the digital world, connections matter. The problem is that the proliferation of platforms, devices, and software systems has increased incompatibility. This makes it difficult to connect with clients and software partners.

However, data integration is pivotal to fostering innovation. It also ensures software architects don’t have to do the same thing over and over again. Rather, they can repurpose existing processes and data flows into new projects. 

On the other hand, organizations generally find it untenable to constantly perform custom integration to overcome cross-platform incompatibilities in pursuit of information exchange. 

Integration service providers (ISPs) bridge this gap by harnessing the power of integration. 

We have previously discussed integrated service management (ISM), but here we’ll go into the role of ISPs in data transformation, along with their challenges, advantages, and common use cases. 

What is an Integration Service Provider?

An integration service provider is a platform that connects different software applications and systems so they can seamlessly exchange data. As purveyors of specialized applications, ISPs offer clients software with an accessible interface, usually an application programming interface (API), to execute data flow integration. 

ISPs typically offer features that range from customer data integration to cross-platform integration, including data management and database analytics solutions. They also aim to shorten deployment times, making it easy to exchange information using a no-code, low-code, or script-based integration solution.

Other features offered by ISPs include:

  • Real-time analytics
  • Flexible pricing models
  • Robust data mapping functionality
  • Easily configurable workflow triggers
  • Ability to implement custom connectors
  • Dashboards for managing integrations
  • Scalability through pre-built data connectors
  • Support for multiple protocols, including FTP, HTTP/S, Advanced Message Queuing Protocol (AMQP), and Open Data Protocol (OData).

How Do Integration Service Providers Work?

ISPs use various technologies to facilitate the exchange of data across systems. They primarily work by using APIs to create connections between disparate and seemingly incompatible software artifacts. 

In addition to being API-driven, ISPs also use frameworks like service-oriented architectures (SOA). SOAs use service interfaces to make components deployed in web service solutions reusable. These service interfaces are defined using the Web Services Description Language (WSDL), an XML-based standard for describing services. 

Unlike most APIs, which follow REST-based conventions and typically exchange JSON over HTTP, the service interfaces exposed via SOA often use another protocol called SOAP (Simple Object Access Protocol), which sends XML messages over HTTP. 

Instead of using API integrations, an ISP can also employ another architectural pattern known as the Enterprise Service Bus (ESB), which is a vital component of SOA. 

ESB uses centralized software components to execute the integration between applications. It also handles connectivity, performs message routing, and transforms data models. 

While ISPs can use different architectural components to pursue integration, they invariably rely on one type of delivery mechanism and subscription model.

To reach customers, ISPs predominantly operate an integration platform as a service (iPaaS) business model. In fact, integration service providers have mostly become synonymous with iPaaS since they primarily use this subscription-based pricing as a business model.

The iPaaS configuration provides enterprises with a set of automated tools to establish connections between core business applications in a turnkey manner. Hence, enterprises aren’t required to install or set up hardware or manage additional resources.

An integration service provider should ideally operate both cloud-native and on-premise deployments. Better still if they can provide customers with hybrid options. 

Types of Integration Service Providers

With a better understanding of how ISPs work, let’s discuss the existing variations of ISPs.

System Integrators 

These are companies that have expertise in bringing together different subsystems into a functional whole. They also provide enterprises with the planning, coordination, scheduling, implementation, and testing required to integrate their computing systems.

Managed Service Providers (MSP) 

These are third parties that manage an organization’s IT infrastructure remotely. They are typically geared to help small and medium-sized businesses (SMBs) that can’t afford in-house IT personnel to manage their day-to-day tech operations.


In discharging their duties, MSPs provide routine integration services on behalf of their clients to ensure their systems work effectively.  

Platform-as-a-Service Providers 

These are a particular type of integration service that gives companies both deployment and development environments. Since they are cloud-based solutions, they deliver resources that enable data transformation through sophisticated enterprise applications. 

Business Process Integration Providers 

These mainly deal with large enterprise clients who want to leverage their vast information sources efficiently. Business process integrators assist such clients by connecting disparate systems and integrating numerous business processes into a single, cohesive unit. 

Common Use Cases For ISPs and iPaaS

Just as data uses aren’t homogeneous, not all data integration and transformation needs are the same. Common use cases include: 

  • Application-to-application integration: This ensures different applications can seamlessly establish connections through permeable interfaces.
  • Data integration: This permits real-time synchronization and data flow between systems through managing complex data format translations.
  • Microservice integration: The proliferation of microservice architecture has increased the need for data integration to generate, support, and publish APIs automatically.
  • Multiple cloud integrations: ISPs enable organizations to manage complex integrations from multiple public cloud sources.
  • Big data integration: Fostering data analytics and business intelligence with complex Big Data integrations.

Advantages of Using an Integration Service Provider

Adopting an integration service provider empowers organizations with: 

  • Fast and uncomplicated data integrations: Simplifying and speeding up the process of data integration.
  • Business intelligence: Cleaning and harmonizing data from various sources to provide the best quality for business operations and intelligence.
  • Faster innovation: The real-time business insights gained from streamlined data integration improve business efficiency and creativity.
  • Shorter deployment times: Allowing software engineers to develop, test, and deploy integrations easily.
  • Convenience: Offering out-of-the-box connectivity solutions with low friction of use or maintenance.
  • Event monitoring: Monitor and manage integration solutions with access to process and system event logs.
  • Cost-effectiveness: ISPs eliminate hiring expensive specialists to perform in-house integrations. They provide businesses with the ability to implement high-performance integration at an optimum price with an adequate degree of reliability.
  • 360-degree view of data: Offering organizations a means to establish a shared and holistic view of data from disparate data sources.
  • Building a more effective tech stack: Integration service providers enable organizations to break down tech silos using specialized tools and resources that are more suitable to their business objectives. This creates a more effective and stable tech stack.
  • Improved customer experience: Better integrated data allows organizations to offer customers better, personalized experiences tailored to their needs. This also allows them to gain better data insight for customer segmentation.

Selecting a Competent Integration Service Provider

In choosing an integration service provider, qualities like dependability, security, and industry expertise should be priorities. Exalate has a track record as an integration service provider, supporting increased scalability, flexibility, and maintainability across industries. 

In addition, Exalate also provides AI-powered customization options through its AI Assist feature. Users can rely on this solution to generate scripts for complex integration scenarios.

Check out how a cross-platform integration tool like Exalate operates to offer you the flexibility of decentralized integration.

Recommended Reads:

Exploring The Best No-Code Integration Tools For Businesses 

No-code integration

Companies and solopreneurs need to integrate different work management systems in order to organize processes and gather data. Since configuring an integration manually is complicated, teams are looking for no-code integration solutions to ease this burden.

And with automation driving innovation in the modern workplace, no-code integration software makes it easy to automate and optimize operations.

We’ll discuss no-code integration in detail, exploring its importance and challenges. Continue reading to find out the best no-code integration solutions for automating your business.

What is No-Code Integration?

The concept of no-code integration involves using software to connect multiple apps, software, or platforms without writing any code. You can implement no-code integration using drag-and-drop features and default templates.

With usability becoming a core value in user experience, most integration service providers are building solutions that require as little coding as possible. This cultural shift has led to concepts like no-code and low-code.

Code-Based Integration vs. Low-Code Integration vs. No-Code Integration: How Are They Related?

Code-based integration involves the exclusive use of a scripting engine to configure syncs. This option is a no-brainer for seasoned developers who want absolute control over their integrations. It gives them the leeway to play around with mappings and configurations for advanced use cases.

Low-code integration is a variation of code-based integrations that combines scripting with accessibility features on a visual interface. Users of low-code integration tools get to write some code to change the default properties of the prebuilt integration functionality. They are suited for those who want to optimize processes and eliminate human error in long codes.

However, no-code integration strictly keeps you away from expressions, methods, and variables. As the name suggests, you won’t need to write any code to get your integration to work. Users without programming backgrounds can use no-code integrations to configure syncs instantly. 

Generally, you’ll need to learn some programming language native to the code-based or low-code integration solution in order to make changes to default sync configurations. But as AI gains popularity, many code-based and low-code integration solutions now have some AI element embedded within them.

With no-code tools, you won’t have to bother with any programming or scripting languages.

Why Use No-Code Integration Solutions?

Now that you have enough information about no-code integration, let’s explore why you need it.

You can set it up easily

With no-code integration tools, anyone can configure syncs with consummate ease. By dragging and dropping a few components on the visual interface, you will be able to sync data between two management platforms instantly.

It saves time

Since setting up a no-code integration is fast, you can save time and invest your efforts in more pressing issues. For instance, you won’t need to start learning a new programming language — or looking for someone who does — before you can sync multiple Jira instances.

Just map the correct fields with your no-code solution and call it a day.

It simplifies troubleshooting

When you write code, you have to debug and maintain it to ensure it runs correctly; this also makes troubleshooting harder when you are looking for the source of an issue. No-code ISPs usually provide error pages to help you detect the source of a problem as soon as it occurs. As a result, you can fix it without scrolling through dense documentation.

Anybody can use it

No-code integration solutions are beneficial to business owners because they eliminate the need to train an entire team of developers. Since no programming language is involved, you can host a single training session for your employees, and they can work independently going forward.

It saves you money

The time you spend troubleshooting and crushing bugs is useful elsewhere. And that’s where no-code solutions can help; you won’t need to invest in testing tools since you are not writing any code.

You get to save money because you won’t be paying coders to work on your use case. 

Drawbacks of Using No-code Integration Tools

The main challenge of using no-code integration is that you sacrifice autonomy at the altar of usability. You can only work with pre-built templates, which limits the applicable use cases. 

In addition, no-code iPaaS vendors own and maintain proprietary software. As a result, you are at the mercy of the vendor whenever a feature malfunctions or breaks down completely. And since you cannot write custom code to control what’s being synced, you are stuck with the defaults.

Seeing that you don’t have much wiggle room with no-code integration tools, you will struggle to scale your infrastructure as more complex systems and data become available. This also affects productivity, performance, and user experience.

7 Things to Consider When Choosing No-code Integrations

Before choosing an integration solution, here are some factors to consider.

Pricing

When shopping for a no-code iPaaS solution, cost considerations should always be the first item on your list. Look for vendors that provide a suitable pricing model for your business. Some common pricing variations include:

  • Per-user: you have to pay for every user working with the integration.
  • Pay-per-instance (installation): you have to pay for every unique installation of the no-code solution on a work management system or operating system.

Apart from the price on paper, you also need to check out additional fees for maintenance, support, and miscellaneous costs. This will help you align your needs with the company’s budget.

Customer Support

You need to choose no-code integration platforms with responsive customer support. This will come in handy when your instance malfunctions. Most times, the level of support you receive depends on the service level agreement (SLA) you signed. So make sure you are getting the best value for your money.

Customization Features

Look for platforms with as many features and pre-built templates as possible. This high level of customizability will also afford you more flexibility when mapping fields for specific use cases. The more customization features, the simpler your integrations and syncs. 

Ease of Use

No-code integration solutions should be easy to use by default, but usability is a spectrum. Some platforms come with user interfaces that are not user-friendly, which makes them a nightmare to use. 

So find no-code integration platforms that a tech newbie can use without breaking a sweat. You need to ensure the key features have a flat learning curve. To find out more, you can always request a product demo. 

Compatibility

If you are working with multiple work management systems, you must ensure that the no-code integration tool you choose is compatible with them. For instance, some platforms work exclusively for Jira-Jira integrations, while others are more versatile. So figure out which one meets your business needs and use it to integrate your CRMs and ERPs.

Security

Before choosing a tool, always check out how it handles integration security. Here are some security features to consider:

  • Compliance certifications (SOC 2 and ISO 27001)
  • Encryption (SSH, AES, RSA, 3DES)
  • Multi-factor authentication (MFA or 2FA)
  • Tokenization (JSON web tokens, UUID).
  • Audit trails and access controls

Apart from these features, you must also confirm how the prospective vendor handles disaster recovery. Look for error report dashboards and troubleshooting consoles.

Automation

Automated integrations allow you to import and process data automatically. No-code integration platforms rely on automation to improve productivity by boosting the speed of syncs. This automation usually consists of triggers that initiate data synchronization once the conditions are met.

10 Low-code and No-code Integration Tools

We’ve gathered a list of no-code and low-code integration tools for businesses. Here are the 10 best no-code tools for automating integrations.

1 – Zapier 

Zapier is an automated integration solution that allows businesses to automate workflows (zaps) and sync their work management systems. This tool provides an interactive interface for business owners in IT, finance, and other spheres.

zapier dashboard

With Zapier, your zaps are protected using 256-bit AES encryption and TLS 1.2. Enterprises can also implement company-wide SSO, which relies on SAML 2.0. You can also use access level controls and app restrictions to manage who has access to what.

As a premier ISP, Zapier supports integration with OpenAI (GPT-3 & DALL·E), Google Apps, Instagram, YouTube, and over 5000 other apps. You can also use internal Zapier integrations like Paths, Filters, Webhooks, and Multi-step Zaps.

Pricing

  • Free
  • Starter – $19.99 monthly
  • Professional – $499 monthly
  • Team – $399 monthly
  • Company – $799 monthly

2 – ONEiO 

ONEiO is a cloud-native Integration Automation platform that combines AI capabilities to manage tasks and workflows. 

Oneio dashboard

As a cloud-based iPaaS provider, ONEiO’s core infrastructure is built on AWS. The available security certifications include SAS70 Type II, ISO 27001, and PCI DSS Level 1. ONEiO protects all outbound and inbound data with authentication standards such as Basic Authentication, API Keys, and OAuth. 

ONEiO’s no-code integration solution can connect different endpoint types, including Slack, Zendesk, Zabbix, Azure DevOps, Hubspot, and Jira. You can also work with JSON, XML, CSV, and other text-based file formats.

Pricing

ONEiO has two pricing plans: Service Provider and Enterprise. Both of them are available on a pay-per-use basis. So you need to contact the ONEiO team to negotiate the subscription cost.

3 – Workato

Workato is an automation solution that helps you integrate business workflows across on-premise and cloud applications and services.  

Workato integration

Some security features available to Workato users include the following:

  • IP allowlist
  • Supported cloud regions
  • Encryption key management 
  • AWS IAM role authentication

Workato also comes with pre-built connectors for Wrike, Zendesk, Salesforce, Oracle, Adobe Cloud, AWS Lambda, OneDrive, etc.

Pricing

Workato also uses a pay-per-use pricing model under two plans: Workato For Your Business and Workato For Your Product.

4- Make (Integromat)

Make, formerly known as Integromat, is a no-code integration solution that infuses automation to create and manage workflows. It connects with apps, services, social media platforms, work management systems, and marketplaces like Twitter, Mailchimp, Stripe, Shopify, Trello, and Airtable.

Make automation

When working with Make, you get free automation templates for syncing and customizing workflows across multiple platforms.

Some security features that are available to Make users include SSO, access control, VPN access, AES-256, TLS (versions 1.2 and 1.3), and AWS key management service (KMS). The core infrastructure is compliant with ISO 27001 and SOC 2.

Pricing

  • Free
  • Core – $10.59 monthly ($9 billed annually)
  • Pro – $18.82 monthly ($16 billed annually)
  • Teams – $34.12 monthly ($29 billed annually)
  • Enterprise – Contact sales for more information.

5 – Microsoft Power Automate

Microsoft Power Automate is a service that enables users to automate workflows, integrate data, and share files between several apps, services, and platforms. Power Automate has helped companies reduce time-to-market by 33%, proving its status as a market leader in integrations.

Microsoft power automate

This low-code solution gives you access to AI Builder’s generative AI, which you can use to create language models for advanced use cases. You can also link data from platforms like GitHub, Slack, Google Drive, Salesforce, and Microsoft Tools like Azure DevOps and Dynamics 365.

Pricing

Power Automate has a per-user subscription plan and a per-flow plan. Here is the pricing range.

  • Per-user plan (license by user) – $15 per user/per month
  • Per-user plan with attended RPA (license by user) – $40 per user/per month
  • Per-flow plan (license by flow) – $500 per month

Businesses with an ongoing Azure subscription can also pay for every unique flow run via premium connectors for the following prices:

  • Per-flow plan (license by flow run) – $0.60 per flow run
  • Per-flow plan in unattended mode (license by flow run) – $3 per user/per month

You can also get the AI Builder for $500 per unit/month. To find out more details, check out the Power Automate pricing page.

6 – ZigiOps

ZigiOps is a no-code integration solution that automates workflows, supports advanced mappings, and synchronizes data from multiple sources. It also provides customizable templates for simplifying integration processes.

ZigiOps no-code sync solution

With ZigiOps, you get access to 300+ app integrations for DevOps, Cloud, and monitoring systems. Some notable ones include AppDynamics, Azure DevOps, Cherwell, and DataDog.

Regarding security, ZigiOps has every angle covered with protocols like 128-bit TLS 1.3 and TLS 1.2, as well as SFTP and FTPS. The iPaaS solution also has ISO 27001 and FIPS 140-2 certifications.

Pricing

ZigiOps is available at a fixed yearly price. You can get the ZigiOps Regular or ZigiOps Basic plans for your business. To find out more, book a pricing meeting with their sales team.

7 – Jitterbit

Jitterbit is a data integration platform that enables businesses to simplify workflow optimization with the help of a single automated integration tool. Since acquiring Zudy and PrimeApps, Jitterbit now supports low-code integration as well.

Jitterbit data integration platform

Jitterbit connects with Adobe, Acumatica, Zoho, Amazon AWS, Square, Workday, and other platforms and services. You can also customize your integrations with the help of plug-ins and client certificates.

In addition, Jitterbit is compliant with SOC1 Type I & II, SOC2 Type I & II, GDPR, HIPAA, and ISO 27001. Other security protocols and features include:

  • Distributed Denial of Service (DDoS) protection
  • SSL/TLS encryption (HTTPS)
  • Access controls
  • FIPS 140-2 encryption
  • Password encryption
  • Two-factor authentication

Pricing

You can get Jitterbit for a Standard (3 connections), Professional (3 connections), or Enterprise (8+ connections) subscription.

8 – Unito

Unito is a bi-directional integration tool that allows real-time data synchronization between apps, workflow management platforms, and cloud services. You can use Unito’s pre-built field mapping rules to decide what to sync.

Unito automation tool

Unito is SOC 2 certified, and the infrastructure is hosted in AWS data centers in the US. It is also compliant with PCI DSS, OWASP, and CSA. Since Unito relies on AWS, it uses tools like Cloudwatch and GuardDuty to protect user data. Also, data at rest is secured, thanks to 256-bit AES encryption.

With Unito’s no-code interface, you can sync user data, file data, contacts, workflow status, and work items. It gathers this data from popular apps and tools such as Jira, Azure DevOps, Zendesk, Excel, Monday.com, Airtable, and Notion.

Pricing

Unito offers users a 14-day free trial, after which you need to pick a plan with prices starting at $10 per month for 100 items in sync.

9 – Exalate

Exalate is an integration solution that allows users to bidirectionally sync work items, issues, cases, and other forms of data. Exalate covers the entire spectrum, from no-code to script-based integration. 

Exalate integration solution


Here are different Exalate modes:

  • Basic Mode – the no-code version that allows users to sync data between two work management systems.
  • Visual Mode – the low-code version that allows users to map fields and establish basic sync rules without going too hard on the code.
  • Script Mode – the script-based version for advanced scripting. This mode is also powered by AI, enabling you to implement deep integrations using just human prompts. Just type in your sync requirements and AI will generate the script based on your input, existing configuration, and Exalate’s scripting API.

Exalate connects with Jira (cloud and on-premise), GitHub, ServiceNow, Zendesk, Salesforce, HP ALM, and Azure DevOps.

Pricing

Exalate has a free plan that gives you up to 1000 new monthly issue pairs. The cost of the Premium plan varies, depending on the platform you are working with. You can get the entire pricing breakdown here.

10 – Skyvia

Skyvia is a cloud-based data integration platform that helps businesses collect, manage, and back up data from multiple sources. Some supported data sources include Mailchimp, Spotify, G Suite, Oracle, MySQL, Dropbox, and Snowflake.

Skyvia sync platform

Businesses can use Skyvia to import, export, replicate, and synchronize data internally and externally.

Pricing

Skyvia has a comprehensive pricing list depending on the service you want. 

Backup Pricing: 

  • Free
  • Standard – $9 per month ($7 billed yearly)
  • Professional – $99 per month ($79 billed yearly)
  • Enterprise – $499 per month ($399 billed yearly)

Data Integration Pricing: 

  • Free
  • Basic – $19 per month ($15 billed yearly)
  • Standard – $99 per month ($79 billed yearly)
  • Professional – $499 per month ($399 billed yearly)

Query Pricing: 

  • Free
  • Standard – $19 per month ($15 billed yearly)

Connect Pricing: 

  • Free
  • Basic – $19 per month ($15 billed yearly)
  • Standard – $49 per month ($39 billed yearly)
  • Professional – $99 per month ($79 billed yearly)

Conclusion

No-code integration makes it easy to sync data from multiple sources without writing a single line of code. Thanks to automated, no-code integration tools, users can now improve their efficiency while gaining more control over the data they share or receive. 

And with the infusion of AI in no-code IPaaS solutions like Exalate, Power Automate, and Zapier, integrations will continue to blend seamlessly with automation for better results and improved customer satisfaction.

Exalate is an integration solution that helps businesses integrate with multiple MSPs and MSSPs. This solution allows you to share data bi-directionally using custom scripts and event-specific triggers.

Recommended Reads:

Integration Showdown: Unito App vs. Exalate – Which Tool is Right for You? 

Unito app vs. Exalate

Let me tell you a story. Once upon a time, there was a startup called “Happy Co.”. The team at Happy Co. used Jira for project management and GitHub for software development. But they had a problem. 

They were finding it difficult to keep everything in sync. They were constantly copying and pasting information between the 2 platforms and were losing valuable time and productivity.

That’s when they discovered Exalate and Unito app, 2 integration tools, and asked themselves which one is right for them. 

In this blog post, we’ll take a deep dive and discuss Unito app vs. Exalate, comparing their features, pricing, user experience, etc. It’ll help you make an informed decision. Whether you’re a small startup like Happy Co. or a large enterprise, we’ve got you covered. 

So, grab a cup of coffee, sit back, and read on to find which integration solution is actually the best fit for your company! 

Get Familiar with the Unito App and Exalate 

Before we compare these tools, let’s see what they were built for. 

Unito

Unito is a no-code two-way workflow automation platform that enables teams to stay in sync with the tools they already use. So, a work item in one application is synced with another item in a different application. 

Unito app logo

For instance, a task in Asana becomes a ticket in Zendesk, with the relevant details passed between both tools via an automated, real-time sync.

Exalate

Exalate allows you to connect your tools and manage your work by setting up a customizable bi-directional sync. So, an issue in Jira can be triggered to Azure DevOps as a work item with any or all the information reflected and synced in real-time within each application. 

Exalate logo

It comes with a no-code builder but also allows you to enhance your sync with the help of scripts.

Features are what make these tools unique and appealing to a broader audience. So let’s get started by comparing their key features. 

Unito App vs. Exalate: Compare the Key Features

Getting a better understanding of the features both the Unito app and Exalate offer is an important milestone for companies like Happy Co. to choose the right one.

Scripts or Flows  

Exalate provides Groovy-based scripts that are used to fine-tune the sync behavior to a granular level between 2 instances of a platform, like Jira, ServiceNow, Zendesk, etc. 

These scripts allow users to add custom logic as well as define how data is mapped, transformed, and synchronized between the instances.  

Exalate also provides advanced features that use the power of AI and machine learning to increase the efficiency and accuracy of custom configurations. The AI Assist chatbot allows users to generate various forms of Groovy code snippets and mappings for complex use cases.

Overall, Exalate scripts provide a powerful way for users to customize and automate their synchronization behavior and tailor it to their specific needs. 
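
As an illustration of that custom logic, here is a hedged sketch of a value mapping for priorities in an “Incoming sync” script, assuming Exalate’s Groovy scripting API for Jira; the priority names are hypothetical and should be replaced with the ones used in your instances.

    // map remote priority names to local ones; fall back to the remote name
    def priorityMap = [
      "Highest" : "Blocker",
      "High"    : "Critical"
    ]
    def remotePriority = replica.priority?.name
    issue.priority = nodeHelper.getPriority(priorityMap[remotePriority] ?: remotePriority)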

Customize your sync

In Unito, flows are the fundamental building blocks used to synchronize data between different work management systems like Trello, Asana, Jira, GitHub, etc. A flow is a set of rules that dictate how data is transferred between 2 or more systems. They define which data is synced, how often it is synced, and how it is mapped between the different systems. 

Flows are a powerful way for users in Unito to automate the workflows and save time by avoiding manual copy-pasting of information. 

Integration Breadth or Depth

When considering integration solutions, it is often the case that users are stuck between the breadth or the depth the tools offer. 

Breadth, in a way, is an easy decision since it’s all black and white. If a particular application is not supported, you must go with the other option. 

Integrations Supported

As such, Exalate supports the following platforms: 

  • Azure DevOps
  • GitHub
  • Jira (Cloud and on-premise)
  • Jira Work Management 
  • Jira Service Management 
  • Salesforce
  • ServiceNow
  • Zendesk
  • Docker deployments for multiple platforms

Note: You can also get notified of upcoming integrations for your favorite tool. 

Unito supports 50 integrations. A few of them are: 

  • Airtable
  • Asana
  • Azure DevOps
  • Basecamp
  • Bitbucket 
  • ClickUp
  • GitHub and GitLab
  • Google Calendar, Sheets, and Contacts
  • Wrike 
  • HubSpot
  • Jira and Jira Service Management 
  • Miro
  • Salesforce
  • Zendesk 
  • Trello 
  • … 

Note: You can also submit a request for an integration with Unito. 

The depth of an integration refers to the extent to which 2 or more software systems are interconnected and can exchange data and functionality seamlessly.

It can also involve interacting with the underlying APIs of the platform.

Coverage Scope

Exalate scripts help achieve deeper integration between multiple instances. These scripts allow you to control every detail of your synchronization, giving you the freedom to adjust and tweak your sync behavior to meet your specific business requirements. 

In the Unito app, flows are capable of handling a variety of use cases but are limited to what has been implemented through its UI. This can constrain your synchronization mapping options because you can only use the defaults.

With Unito: 

  • You can’t sync some fields bi-directionally.
  • Attachment sync depends on the plan you fall under. They are either streamed live within the respective application or are provided as a link.  
  • Triggers are limited to what is offered within the interface. 
  • Connecting or reconnecting existing entities in bulk isn’t supported. 
  • There is a limit on advanced mappings, for instance, for statuses and entity types.

Ease-of-Use or Just Right

Exalate provides a script editor that allows users to create, write, and test their scripts directly within the Exalate application. Sure, people with technical backgrounds can work seamlessly with scripts, but with practice, even non-technical individuals can get around them. If not, Exalate provides a no-code builder to implement simple synchronization use cases. 

It provides 3 modes to make it “just right” for all kinds of users: the Basic mode, the Visual mode, and the Script mode. 

The Unito app, a no-code solution, makes it easy to set up flows through a single drag-and-drop interface. It is simple and fast to get started with, and its intuitive interface guides you through the synchronization process. 

It also provides a range of pre-built templates and customization options. That makes it easy to set up the desired mapping between fields or triggers without the need to write any code. 

Decentralized or Centralized Integration

Exalate enables decentralized integration between different applications. So it doesn’t rely on a central authority or a single point of control to set up and maintain the integration. 

It helps each integrating system or application retain control over its own data, synchronization rules, and functions. This approach allows for greater flexibility, scalability, and resilience, as systems are loosely coupled and operate independently without disrupting the integration setup. 

Unito, on the other hand, provides a centralized interface that helps set up flows to decide the rules for your synchronization and triggers to automate the workflow. Every time there is a change in the sync rules or information flow conditions, a change in the central interface is required. It creates a dependency on the centralized application resulting in a single point of failure.  

Decentralized platforms like Exalate and centralized ones like Unito both offer unique ways to connect and synchronize data between applications. While Exalate allows admins on both sides to adjust sync rules independently, Unito restricts this ability to only the user who created the flow. 

Integrated Retry Mechanism or Manual Intervention

Exalate offers an integrated retry mechanism that helps ensure data synchronization between different systems is successful. These apply even when there are temporary interruptions or errors in the network. 

It achieves this with incoming and outgoing sync queues that help restore the sync and apply changes from the point of interruption without any human intervention. 

There are many potential reasons why a synchronization might fail, and therefore the whole process is asynchronous, using queues to go through every step of a sync transaction. 

For Unito, this is not true since you need a manual page refresh or to clear the cache in case of common transient sync errors or downtimes. 

Synchronization or Automation

Synchronization or automation has always been a debatable issue. 

And there is no simple answer to this. It all depends on what you want to achieve by integrating your platforms. 

Let’s review this in the context of Unito App and Exalate. 

Flow-based integration solutions, like Unito, are optimally set up for automation. They can be used for simple, one-off tasks such as triggering an action in one system based on an event in another system. 

Synchronization is designed with a different purpose. It requires a more customizable and robust approach that involves mapping and transforming data with different data types and structures at the architectural level. 

Exalate’s scripting engine handles such complexities with scalable and distributed architecture. 

Additionally, flow-based solutions may not be able to handle large volumes of data, typically involved in a synchronization process. 

So if automation is what you are looking for, Unito is the obvious choice. For ongoing bi-directional synchronization tasks, Exalate should be your top pick. 

There are other considerations while exploring integration solutions. We’ll discuss them in the upcoming sections. 

Explore the Security of the 2 Apps

Security is a critical aspect of integration solutions. Without proper security measures, systems can be vulnerable to cyber-attacks and other security threats. 

Data Residency, Security, and Privacy

Exalate ensures security in the following ways:

  • It uses 2 hosting providers: Google Cloud (based in Belgium) and rsync.net (in Zurich). 
  • It uses JWT-based tokens to sign every message between the instances. 
  • Data is encrypted in transit and at rest. 
  • It uses encrypted (HTTPS) traffic between the Exalate node and the instance with properly defined certificates. 
  • It relies on a single-tenant architecture such that each Exalate instance and its corresponding environment supports only one application. So process space, file system, and network are not shared between applications. 
  • Distributed integration (peer-to-peer connectivity) to increase security and reduce coupling and dependencies.
  • It has a dedicated security team that conducts regular vulnerability checks and penetration tests. Any security breach is promptly escalated to all stakeholders. 
  • It logs every data exchange for audit purposes. 
  • It is also a participant in the Bug Bounty program.
  • It uses OAuth2 or token-based authentication mechanisms wherever applicable. 

You can check out more details in the Exalate Security Whitepaper.

Unito has the following security measures in place: 

  • It is hosted on Amazon Web Services (AWS) data centers located in the USA. 
  • It uses Transport Layer Security (TLS) version 1.2 to encrypt data in transit. Data at rest is encrypted with AES 256 algorithms. 
  • It has a dedicated team for cyber security events and threats. There are on-call engineers to escalate prompt reactions to security events. 
  • It uses checksums at both integrating ends to ensure data integrity. 
  • It allows access to application APIs from a fixed set of identifiable IP addresses. 
  • It performs an external penetration test once every year. 
  • It allows OAuth2 or self-administered email and password for authentication. 

You can check more details about Unito’s security measures if you like. 

Certifications and Compliances

Unito is SOC2 Type 2 certified. It complies with PCI DSS requirements. 

Exalate is ISO 27001 certified and GDPR compliant. 

In the upcoming section, we’ll be diving into the exciting world of use cases these integration solutions support. 

Unito and Exalate: Discover the Use Cases

From improving customer experience to simplifying backend operations, these tools have a variety of industries they can cater to. 

Exalate Use Cases

Exalate essentially supports custom-made integrations of any nature because of its scripting capabilities. 

However, we’ll outline the use cases under the following categories. 

Intra-company and Cross-company

Exalate provides a no-code drag-and-drop interface (Basic or Visual mode) that is perfect for implementing integrations across multiple teams or departments within a single company (intra-company). 

But its real strength lies in the ability to handle integrations across a network of companies (cross-company) via the Script mode that supports decentralized integration. Since security is paramount for multiple companies connected via integration, the decentralized control it offers helps alleviate any security concerns. 

Some Common Use Cases

Escalation: When a team wants to raise a bug, blocker, issue, or request with the responsible team without manual follow-ups. These teams can reside within a single company or be with a different company. 

For instance:

  • The support team using Zendesk wants to escalate tickets to the development team using Jira without following up via Slack or emails. The development team can either be within the same company or it can be an outsourced service provider residing in a completely different company. 
  • Customer success executives usually have useful insights into customer needs and issues. So these teams can raise customer issues to be taken up by the product team and prioritized in the product roadmap. 
  • Every so often employees need to raise requests for new software, apply for leave or other HR-related issues, or simply have creative requests. The requests must be escalated automatically to the required department without any manual follow-ups. 

Project Collaboration: When teams aim to collaborate on shared projects to promote transparency and streamline processes. 

For instance:

  • Orchestrating workflows by connecting the project management team working in Jira, the dev team working in GitHub, and the customer support team working in ServiceNow. 
  • Order procurement process: Where order receipts are generated automatically in one system, like ServiceNow, to be reflected with the correct financial and inventory requirements for the operations team working in Jira.  
  • Supply chain management process: Aligning multiple service vendors or suppliers using disparate applications through automatic, real-time information exchange. Doing so will help businesses meet their timelines, streamline the supply-distribution chain, and not lose profit. 

Information Consolidation: When teams want to consolidate information within multiple platforms for better visibility for the top management and efficient decision-making. 

For instance, developers often use open-source software applications like GitHub. However, clear visibility of the pull requests, issue statuses, and blockers to the project management team will enable them to reprioritize the backlog or align issues for releases or sprints. 

There is also a need for business-critical information residing within different systems to be consolidated and visible to the right stakeholders in a merger and acquisition (M&A) setup. 

Service Management: When teams want to excel in service delivery and customer experience. In turn, achieving faster resolution of incidents or tickets by orchestrating service management workflows end-to-end. 

Service integration and management is an urgent need for today’s businesses. Services exist to be delivered in a variety of different domains like manufacturing, retail, healthcare, finance, etc.  

Such multiple service providers, if integrated in the correct manner, can help streamline collaboration and reduce manual errors. For instance, handling customer incidents raised in SaaS applications in a timely manner, according to the stipulated SLA, must involve an automated integration process.

Unito App Use Cases

We will decode Unito use cases in the same manner as Exalate in this section. 

Connecting Development and Other Teams: Software developers often use Jira due to its agile methodology for tracking issues. The development team needs to communicate Jira updates to external stakeholders or other teams. 

Creating a Single Source of Truth: Consolidating information from multiple tools into a single view. 

For instance, fetching tasks from three Asana projects into a single Trello board. All essential information is captured between the tools and kept in sync.

Seamless Collaboration: Allowing different departments to collaborate seamlessly with one another. For instance, coordinating a new feature release with the dev team working in Jira, the marketing team working in Trello, and the project managers having an overview in Asana. 

One-Way or Two-Way Automation: Automating processes, status updates, and other fields across multiple instances helps overcome manual copy-pasting between systems. It also reduces human error and increases team productivity. 

Now that we have discussed a few important use cases for Exalate and Unito, the team at Happy Co. wants to know whether the integration tools can align with their budget and requirements.

Let’s dig into the pricing and licensing and help them out.  

Unito vs. Exalate: Pricing and Licensing

Unito’s pricing plans are based on how many items you want to sync and the desired feature sets, like the number of active flows. 

There are 3 plans Unito offers: 

  • Team plan: If you consider 1000 items under sync, the cost will be $106/ month. 
  • Company plan: If you consider 6000 items under sync, the cost will be $674/ month.
  • Enterprise plan: It’s a custom plan for which you must contact the Unito team.

The time taken by Unito to detect what changes to apply to synced items depends on your plan: 

  • Personal Plan: takes 15 mins.
  • Team plan: takes 5 mins.
  • Company and Enterprise plans: live sync.

Unito offers a 14-day free trial with 500 items to sync. 

Exalate has a 30-day free trial with its full functionality at your disposal. So you get unlimited two-way or one-way syncs, custom mappings, and all 3 modes (wherever applicable) with the free trial. 

Note: Since Exalate has a distributed architecture, you must buy licenses for both the platforms you want to connect. 

Exalate has a Free plan that supports the Basic mode and has out-of-the-box, pre-made configurations for simple synchronization scenarios. You can sync up to 1000 new entities per month. 

The Premium plan can be purchased after the free trial and it is fully customizable, including all the feature sets. 

Exalate pricing depends on the platform you want to set up the integration for. 

For instance, you can check the Jira pricing on the Atlassian marketplace.

The final consideration is to see how Happy Co. will fare once they have purchased the solution. 

How will the support, the documentation, and the community help them with their integration endeavor? 

Ask for Support, Documentation, and Community 

Unito app has an easy-to-set-up, no-code flow builder and quick onboarding. 

The product’s customer support depends on the plan, with email and in-app chat available for all plans. 

For Company plans, they have a 60-minute onboarding process. Other support features like a dedicated CSM, custom invoicing, and SLAs are only available to Enterprise plans.

Unito documentation is pretty straightforward, simple, easy to follow, and stable. However, they don’t have dedicated community support. 

Exalate offers 2 support options. 

Standard support assists with installation issues, troubleshooting, and workarounds for sync issues. Premier support offers higher SLAs, a dedicated support engineer, configuration assistance, and much more. 

You can also purchase cloud enhancements for any of your Exalate nodes on top of the premier support. This offers three-fold infrastructure on the Exalate cloud, proactive monitoring and alerting, enhanced resource profiles, and more. 

Exalate documentation is elaborate and extensive, with numerous script examples to help you get started. 

In addition to this, it also has a strong community presence to answer any questions (even configuration ones) you may have. 

Final Thoughts

Happy Co. and you have been on an interesting journey learning what these tools bring to the table.

Here’s an overview of what we have gathered so far. 

Exalate Vs Unito

Conclusion

And there you have it! We’ve taken a deep dive into the world of integration tools and compared the features, pricing, security, and more of 2 popular solutions: the Unito app vs. Exalate.

From startups like Happy Co. to large enterprises, choosing the right integration solution helps streamline workflows and team collaborations. 

We hope this comparison has given you valuable insights and helped you make an informed decision. So what are you waiting for? Start integrating now. 

Recommended Reads:

Behind the Scenes: Brainsquare Shares Their Experience Using Exalate

Partners in the spotlight

This article is based on an interview between Mariia Onyshchenko, the Product Marketing Manager at Exalate, and Jorden Van Bogaert, the Atlassian Service Delivery Manager at Brainsquare.

Jorden Van Bogaert is the Atlassian Service Delivery Manager at Brainsquare — an Exalate Partner that designs, develops, and manages complex, critical application landscapes. During our meeting, Jorden shared his thoughts about using integrations and why Exalate is an excellent integration solution.

I started our discussion by asking Jorden why companies choose one particular integration from the sea of available options.

“Teams working in different systems need to align their efforts to bring everyone together into one system.”

Jorden Van Bogaert, Atlassian Service Delivery Manager at Brainsquare

Jorden further elaborated that normally, the requirement is to have one system as the single source of truth, even though teams work on different systems, so all information syncs to a central place where they report.

I then asked him about the main hurdle with choosing a central system, to which he responded with an example. Suppose a partner is trying to convince one team to move from one system (Jira, for example) to a new one (ServiceNow) or vice versa. They need to address the question: “Can we optimize it so we can just work in Jira, and they can do their stuff in ServiceNow?”

In such scenarios, a tool like Exalate can break this impasse by making sure both teams stay on their respective systems but still share data smoothly.

This led us down the path of using REST APIs for integrations. I learned that customers usually encounter issues when building something independently. In Jorden’s opinion:

“They [customers] need to make it work bidirectionally, and with REST API, you can only push things. You also need to get things. But the problem with this option is that it will cost you a lot more even though it is feasible. You are basically rebuilding Exalate.”

While discussing customer experience, I found out that customers often inquire about the usability of Exalate—and their main concern usually revolves around scripting. Some customers don’t want anything to do with code, while others want access to advanced configuration and scripts.

“I needed the possibility to do advanced configuration, [but] the other tools I have tried were either too limited or way too expensive,”

added Jorden to buttress the point. 

Still on the issue of advanced configuration and scripting, I was amazed to find out that devs enjoy using Exalate because it allows them to translate one programming language to another. And since most use cases, even the ones that customers consider basic, often end up requiring additional configuration, a tool like Exalate covers most of the custom coding.

“You can script everything. Well, I’m a big fan of scripting. I really like it because it supports almost any use cases.”

Said Jorden, in regards to Exalate’s flexibility being another key selling point for customers. 

Taking a breather from all the coding talk, I shifted his attention to the Exalate UI and visual field mapping. Jorden explained to me that these features made it possible for users to know what to map when customizing the script for specific use cases.

“To make it easy, you need to sacrifice a lot. You can make everything clickable in UI, but that means that you’re going to make a product that defines the use cases for you. And I think people want challenges as well.”

We then drifted back to the technical side, discussing Exalate’s distributed architecture.

“That [Exalate’s distributed architecture] is one of the selling points we use, especially when it comes to linking two partner companies–they want to make sure that the other partner does not need access to their environment.”

Unsurprisingly, I learned that companies are particular about Exalate sending their data to partners unless given permission to do so. Essentially, the entities syncing data can independently control what goes over and what comes in.

I followed up with a question about how customers appreciate the way the Exalate console gatekeeps the data for all sides involved in the sync. In Jorden’s view, this feature helps maintain transparency by ensuring that both sides of the sync approve any changes to the script.

“If they change something on one side, Exalate will throw an error. It sounds bad that it breaks, but it doesn’t break. It stops until you fix it,” he emphasized. 

When I finally broached the subject of pricing, Jorden conveyed that prospects react neutrally to Exalate’s pricing. Every time they send a quote, there aren’t many questions; he thinks the price is what customers expect for what they are told they can do with the product. He also emphasized that Exalate is not in the high-end price range compared to other premium solutions.

Then I asked if customers were skeptical about involving Exalate partners whenever they needed integration solutions.

“It seems like [Exalate] is not self-explanatory. It is not. Because it is as complex as your use case”, Jorden added, underlining that Exalate could take a lot of time and tweaking to figure out. Alternatively, a partner could get customers started without breaking a sweat.

And with that, we landed on the usefulness of partners. Jorden also buttressed their importance by discussing how to set up triggers and translate the use case to code based on the outlined sync rules.

“The partnership is very useful in terms of understanding how the integration behaves and helping the users with their experience. For instance, understanding why Jira suddenly has a different status than ServiceNow.”

All in all, my conversation with Jorden showed me the importance of partners as intermediaries between Exalate and customers.

Jorden said that customers sometimes think something is not synced properly and then try to prove that the configuration is very difficult, because it seems like Exalate didn’t handle it correctly.

Well, that is where partners excel; they help understand what the customers want, which helps in mapping the required fields. And most importantly, the admins of both instances retain control over their spaces.

Exploring Ways To Implement Managed Services Integration

Managed Services Integration

Current technical advancements have ramped up the pressure on organizations to innovate at a faster pace while keeping costs at a reasonable minimum. To this end, some companies opt to keep innovation in-house, while others prefer to outsource managed services and even go for managed services integration.

According to a report by Research and Markets, the market value for managed services will reach $410 billion by 2027, up from $343 billion in 2022, a CAGR of 2.6%. 

However, although building and maintaining your services in-house can work for large corporations like Facebook and Google, SMEs often lack the personnel and financial leeway to bear the costs of building them from scratch.

If you have yet to decide on managed services integration, this article will be your guide. We’ll explore the importance of managed services integration and go ahead to discuss ways to implement them into your business processes. 

Let’s define some key terms first.

What Are Managed Services?

Managed services refer to a cooperation model in which a third-party company takes partial or complete control of one facet of your organization’s operations. 

This third party is known as the managed service provider (MSP). If the service provider handles security, then they are known as the managed security service provider (MSSP). MSPs (and MSSPs) can offer their services as an extension of your existing IT team or as a fully-managed service. 

Essentially, managed services can extend to any department as long as it doesn’t affect the smooth flow of operations within your organization. They work well for time-consuming, business-critical processes like cybersecurity, risk management, and regulatory reporting.

Examples Of Managed Services

  • Cloud computing: Companies like IBM and Oracle are mainstays in cloud computing, providing managed cloud services to corporations, SMBs, non-profits, and government agencies. Computacenter helps Renault with analytics in Formula 1.
  • Marketing: Businesses can outsource copywriting, planning, distribution, sales, and advertising to managed marketing services.
  • Supply chain: FedEx and Amazon also serve as intermediaries for business entities in need of logistics services, sourcing of resources, and distribution. 
  • Payroll: Companies all over the world use HR management software like Ceridian to manage payroll services, as well as monitor employee performance and handle taxes. According to Deloitte, 73% of organizations outsource some payroll responsibilities. Whether a company does this internally or outsources it, choosing the right payroll system is essential.
  • Communication: Managed communication service (MCS) vendors enable smooth collaborations between teams and stakeholders. They also handle services like instant messaging, VoIP, and AI chatbots as part of Unified Communications as a Service (UCaaS). Popular MCS vendors include AT&T, Verizon, and CISCO.
  • Security: MSSPs like Cipher, Trustwave, NVISO, and Symantec handle cybersecurity audits, incident response, firewall management, threat monitoring, and compliance monitoring. 
  • Financial: Accenture and the Big Four (Deloitte, Ernst & Young, KPMG, and PwC) provide finance service management (FSM) in the form of audits, forecasts, analysis, consultations, as well as other HR and payroll services.

What Is Managed Services Integration?

Managed services integration is the process of connecting multiple service providers to ensure frictionless cooperation between teams. This involves using an integrator to manage and integrate all these managed services.

In the IT sphere, the service integration and management (SIAM) approach makes it possible for companies to manage risk effectively, reduce costs significantly, foster transparency, and boost consumer (or end-user) satisfaction.

However, SIAM is not the only IT service management (ITSM) model out there. Another highly-touted alternative to service management is the IT infrastructure library (ITIL) — a framework for delivering top-of-the-line IT services. The latest iteration is ITIL4.

Similar to SIAM, ITIL4 enables businesses to manage risks, satisfy customers better, and establish a stable environment for growth.

The only difference is that SIAM extends beyond customer satisfaction and focuses on areas like business requirements, service delivery, and quality control reporting.

How To Implement Managed Services Integration

Successfully integrating managed services for your organization requires careful planning and execution. Let’s look at the systematic steps involved in implementing managed services.

Identify Your Needs

The first step in implementing managed services integration is to conduct an in-depth analysis of your business needs and current workflows. This analysis will help you identify areas that require service integration and management. 

Let’s say you have multiple managed services that you wish to integrate. You need to consult your team to determine which ones to prioritize.

Create an Integration Plan

With a better understanding of how your business needs align with the current state of your infrastructure, you can now start working on an integration plan for your managed services. 

Your integration plan should cover specifics like delivery timelines, coverage scope, budget, scalability, and future prospects. In some cases, you might need to work with your MSP to come up with a solid plan for integrating managed services.

Select a Trusted Integration Vendor

Choosing a tried-and-trusted MSP determines the success or failure of integrating managed services into your organization. At the same time, you also need an integration solution that can sync data with these MSPs. 

For businesses that rely on multiple managed services, IPaaS solutions like Exalate could help your teams sync data bi-directionally.

More on that later. Even if you decide to go with less-prominent providers, always check their track record and industry expertise. Better safe than sorry.

Establish SLAs

Your cooperation with an MSP should be covered by service level agreements (SLAs) — which outline the scope of your collaboration under the integration plan created earlier. These SLAs will make sure you are singing from the same hymn sheet as your integration vendor.

Besides, putting an SLA in place gives you recourse if the integration solution provider does not fulfill their obligations.

Suppose you are integrating your managed communication services with payroll services; signing an SLA could make sure you are entitled to compensation if the IPaaS vendor exposes your infrastructure to malware or ransomware.

Implement the Solution

After consulting with your team and signing an SLA with a trusted provider, it’s time to integrate your managed services.

This stage often involves preparing your team for seamlessly integrating the MSP with the organizational infrastructure. Suppose a managed marketing services provider is handling your outreach campaigns; your internal team maintains communication channels for sharing data and feedback.

As mentioned earlier, you can sync data with your MSP using an integration solution. How does it work?

Let’s say you use Jira or ServiceNow and your service provider is also on either of these 2 platforms. To move away from manual data sharing and to set up a seamless connection, you can integrate Jira and ServiceNow bidirectionally. You can then create a customized connection, automate the sync, and set your own rules for this specific use case.

Monitor Performance

Although your MSP will be in charge of handling everything covered in the SLA, you still need to monitor performance. Always track key metrics like uptime, traffic, engagement, downtime, vulnerabilities, and every other performance indicator related to the managed services integration.

You can use performance monitoring tools like New Relic and AppDynamics to monitor your business IT infrastructure.

Benefits of Managed Services Integration

Businesses, non-profit organizations, and public service agencies integrate managed services to:

Reduce Expenses

MSPs use pricing models that tie into the services they provide. This clear pricing makes it easy for subscriber companies to curb unnecessary costs by following a strict budget. 

Also, you get to save the time needed to build and maintain every managed service integration individually—which could pile on to your overall business expenses. As a result, you no longer need to spend tons of money on hiring, training, and tooling new teams.

Improved Productivity

Integrating managed services into your business increases productivity by ensuring that your teams are channeling the right resources (and time) into the most vital parts of the business. Besides, since specialists handle managed services, fewer errors will occur.

Improved Service Reliability and Customer Satisfaction

Since MSPs have the cushion of industry expertise and hands-on experience, integrating managed services increases the reliability of that facet of your organization. 

For instance, if the MSP handles backups, you can rest assured that your app or site data will stay safe even if the server goes down.

Speaking of servers going down, MSPs and integration service providers go the extra nautical mile to minimize downtime. After all, it is in their best interest to preserve their reputation by boosting uptime.

And all things considered, your customers will be more satisfied with the services you provide.

Broader Scalability

Managed services integration gives your organization extra flexibility, which will come in handy whenever you decide to expand (or shrink) operations.

If you want to expand operations, you can update the SLA to adjust the scope of coverage and, in turn, the subscription cost. Similarly, your integration service provider can also scale your operations to adapt to changing business requirements. 

Employee Satisfaction

When you integrate managed services, you relieve the overall workload on your employees as well as increase productivity. This improves employee morale because they now have more time to focus on their core responsibilities.

Tighter Security

Integrating managed services also bolsters your security. MSSPs can beef up the security of your internal systems to make sure your network and data stay secure from unauthorized entities. 

Challenges of Implementing Managed Services Integration

Here are some of the hurdles you need to scale when implementing managed services integration:

You Are at the Mercy of a Third Party

Integrating managed services into your organization hands over control to a third party. This means that if the MSP experiences any issues, that part of your business will become inaccessible until they find a resolution.

So if you want absolute control over every facet of your organization, managed services integration will pose a worthy challenge for you.

Cost Is Still an Issue

Even though MSPs can help you maintain a strict budget, the services they offer are often quite expensive. Big corporations can afford these fees without blinking, but SMEs and non-profits often end up in the red after paying to integrate multiple managed services.

Privacy Is Not Guaranteed

In January 2023, MailChimp reported a data breach that exposed user data. Now imagine you are using MailChimp for marketing; your customer data would have been hit by the shrapnel from this breach. 

Communication Gaps Still Exist

If you don’t have the proper organizational structure for communication and data handling, integrating managed services will not bail you out of the impending mess. And if you don’t set up proper communication channels, information siloes will develop within your company.

Interoperability Is a Headache

When integrating new managed services with existing ones, you need to worry about compatibility. Most times, this lack of interoperability can be fixed with the help of a versatile, customizable integration solution such as Exalate. Otherwise, you will experience a significant drop in performance and reliability.

SLAs Won’t Protect You from Conflicts

In 2020, Boardman Molded Products sued MSP Involta for $1.7 million in damages for the latter’s negligence and culpability in a security breach.

Failure to meet business requirements often leads to finger-pointing between organizations and MSPs. Even in the presence of ironclad SLAs, the disagreements often devolve into toxic partnerships—and even lawsuits in extreme cases. 

Conclusion

Managed services integration will help your business combine the services of multiple MSPs to boost performance, decrease costs, and optimize employee productivity. Companies can also use integration solutions when working with numerous incompatible MSPs. This will help maintain smooth communication and efficient operations. 

Exalate is an integration solution that helps businesses integrate with MSPs and MSSPs. It allows you to share data bi-directionally using custom scripts and event-specific triggers.

Recommended Reads:

How to Sync SLA Records and Maintain State Updates From ServiceNow Incident to Jira Issue

Sync SNOW Case to Jira Epic

This article was originally published in the Atlassian Community.

By integrating Jira and ServiceNow, teams can fetch related SLA records within a ServiceNow incident and sync them in a user-defined Jira issue field, enabling them to track SLA information and ensure timely incident resolution.

In this blog post, we will see how to implement this use case using an integration solution called Exalate.

So let’s jump right in! 

The Use Case

The following are the use case requirements: 

  • An incident created in ServiceNow is synced to Jira as an issue. The short description and description of the incident must be reflected within the Jira issue. 
  • Comments (from Jira) and work notes (from ServiceNow) must be synced bi-directionally between the two platforms. 
  • SLA information related to the incident is synced to the correct Jira issue in a user-defined field. The following SLA information must be passed over:
    • Name of the SLA
    • Breach time
    • State 
  • State changes to the SLA record in ServiceNow must be correctly updated in the SLA record on the Jira side. 

Note: You can choose to populate the SLA information in any kind of Jira field you want. For this use case, we have considered a user-defined field called ‘SLA Info’. 

The Challenge

Syncing an incident from ServiceNow to Jira is pretty straightforward and can be achieved easily. 

However, syncing the SLA information to the Jira issue needs to be handled carefully.  

An incident triggers (creates) an SLA record under two conditions: 

  • An incident of high priority is created. 
  • An incident is assigned to a specific person or an assignment group.

Once the SLA record is created, it must automatically be synced to the Jira issue in a user-defined field. Therein lies the real challenge: finding the correct Jira issue to add the SLA information to. 

Also, state changes in the SLA record must update the SLA details on the Jira side. 

The use case is complex, so we must find a solution that handles this complexity easily. 

And we have just the right solution for you. 

Exalate: A Customizable Integration Solution

Exalate is a bi-directional, fully customizable integration solution that helps integrate applications like Jira, ServiceNow, Salesforce, Zendesk, GitHub, Azure DevOps, etc.

We opted for Exalate to execute the use case because of the following reasons: 

  • User-friendly Scripting Engine: It features a Groovy-based scripting engine that simplifies the configuration of intricate logical mappings between entities that require synchronization. 
  • Advanced Automatic Sync Triggers: It provides a range of fine-grained triggers that enable automatic data sync. 
  • Independent Control of Information Flow: It allows admins on both integrating sides to control the information flow independently and fine-tune the sync to their requirements without consulting each other. 
  • Bulk Sync of Entities: It allows syncing entities in bulk, simplifying the sync process for large datasets. 

Note: You can learn more about Exalate through its Academy videos. 

How to Sync SLA Information From ServiceNow to Jira Using Exalate

Prerequisites

  • Install Exalate on Jira (Cloud) and ServiceNow
  • Create a connection between Jira and ServiceNow using Script mode

Note: You can learn more about setting up a connection by referring to the Getting Started guide or the Jira ServiceNow integration guide. 

The Implementation

After setting up the connection, you must configure the sync rules, which are Groovy-based scripts that determine which information to exchange between Jira and ServiceNow. 

You can access these rules by clicking the ‘Configure Sync’ button after the connection is established or by editing the connection.

Jira ServiceNow integration

The ‘Rules’ tab is where you’ll find the scripts we discussed earlier. They exist in both Jira and ServiceNow. 

The ‘Outgoing sync’ determines what information is sent from the source to the destination, while the ‘Incoming sync’ specifies how to receive information from the source. 

The Scripts

Let’s see the actual scripts required to implement this use case. 

Remember, the scripts in the ‘Rules’ section provide some default behavior, like syncing comments, descriptions, etc. We must add our own scripting rules for the functionality we want to implement.

ServiceNow: Outgoing Sync Script

As such, from ServiceNow, we send the SLA information in the ‘Outgoing sync’. 

ServiceNow outgoing sync script

The code:

class SlaRecord {
    String name
    String breach_time
    String stage
    String linkValue
}

if(entity.tableName == "incident") {
    replica.key            = entity.key
    replica.summary        = entity.short_description
    replica.description    = entity.description
    replica.attachments    = entity.attachments
    replica.comments       = entity.comments
    replica.state          = entity.state
    def RelatedSlaRecords = []

    def limitResult = 20

    // lookup all related SLA records
    def response = httpClient.get("/api/now/table/task_sla?sysparm_query=task.number=${entity.key}&sysparm_limit=${limitResult}")

    if (!response || !response.result) return null  

    // For each SLA record, look up the corresponding value in the contract_sla table
    // and collect all the data required within the RelatedSlaRecords array
    response.result.each {
        SlaRecord temp = new SlaRecord()
        temp.breach_time = it.planned_end_time
        temp.stage       = it.stage
        temp.linkValue   = it.sla.value

        def slaRecord = httpClient.get("/api/now/table/contract_sla/${it.sla.value}")
        temp.name = slaRecord.result.sys_name

        RelatedSlaRecords.add(temp)
    }
    replica.slaResults = RelatedSlaRecords
}

Note: You can pick up any SLA information via Exalate and sync it to the other end. 

In the code above, we run an API query on the ‘task_sla’ table to fetch the related SLA records for that incident. After doing this, you can pick up the breach time, stage, and SLA value. 

However, if you want to pick up the actual SLA name, you need to run another API query to the ‘contract_sla’ table. From there, you can fetch the actual SLA name. 

We then package all of this within an object called ‘RelatedSlaRecords’ and send it to the other side. 

Also, it’s possible to have more than one SLA. The idea is then to use an array to populate all the SLA objects and send them to Jira. 

The job, on the other side, is relatively easy. All we need to do is to unravel the array that comes from the ServiceNow end.

Jira: Incoming Sync Script

We need to run a loop and iterate over all the fields from the array of objects sent from the ServiceNow side. Then display it in a user-defined ‘SLA Info’ field.

Jira incoming sync script

The code:

if(firstSync){
   issue.projectKey   = "UD" 
   // Set type name from source issue, if not found set a default
   issue.typeName     = nodeHelper.getIssueType(replica.type?.name, issue.projectKey)?.name ?: "Task"
}
issue.summary      = replica.summary
issue.description  = replica.description
issue.comments     = commentHelper.mergeComments(issue, replica)
issue.attachments  = attachmentHelper.mergeAttachments(issue, replica)
issue.labels       = replica.labels


// Reset the 'SLA Info' custom field, then append one block of text per SLA record received from ServiceNow
issue.customFields."SLA Info".value = ""
for(int i=0; i<replica.slaResults?.size; i++){
    issue.customFields."SLA Info".value += "Name: ${replica.slaResults[i].name} \n Breach Time: ${replica.slaResults[i].breach_time} \n State: ${replica.slaResults[i].stage} \n\n"
}

You don’t need any modifications on the Jira outgoing and ServiceNow incoming sides. 

Let’s run the program and see the output.

Output

Begin by creating a simple incident.

Incident in ServiceNow

Set it in such a way that it triggers two SLA records.

The incident priority is now high, and the incident is assigned to a specific user, so the SLA records are created.

SLA information in Incident

Both the SLA records must be synced to the Jira instance. 

After syncing, the Jira side looks like this.

Jira issue with SLA information from ServiceNow

Remember, you can change the state of the SLA anytime. For instance, by changing it to ‘Cancelled’, you can see the updated SLA state in the Jira ‘SLA Info’ field.

SLA information in Jira from ServiceNow

Conclusion

By integrating platforms, modern businesses look to stay competitive and responsive to customer needs. The use case we demonstrated is proof of why integration tools will become even more critical in enabling successful collaboration between teams in the near future. 

Want to know if Exalate is the right solution for your business? It is just a click away.

Recommended Reading:

Groovy Scripting Made Easy: A Beginner’s Guide to Mastering the Basics

Groovy scripting

Welcome to this comprehensive Groovy scripting guide! If you are curious to know what Groovy scripting has to offer and how it can be used in real-world scenarios, then you’re in the right place! 

We’ll dive into the world of Groovy and explore its features and capabilities.

We’ll also focus on the role of Groovy scripts in integrations. And see how it supplements integration solutions like Exalate to perform advanced integrations. 

With a lot of practical examples, you’ll better understand how Groovy scripting can streamline your coding workflows. So let’s get started!

Note: This is a complete handbook, so feel free to jump to the chapter of your choice by using the floating menu on the bottom left of this page.

Chapter 1: Get On with Groovy Scripting 

I am always curious about expanding my knowledge, whether it’s related to the field I work in or not. And sometimes, I find myself wondering, “What could I learn today that’s completely different from what I already know?”. 

It’s fun to explore new skills and interests, like maybe even taking up swimming! It’s just a thought that pops into my head every so often. 

Then, I look around the world of programming. And I’m amazed by the endless opportunities to learn and grow. Programming languages are like fashion trends that come and go at lightning speed. Keeping up with these trends can be challenging, but staying on top of the game is quite important. 

Learning the Groovy language has proven to be a wise decision for me. With my prior experience in Java and fondness for the language, it seemed like the perfect choice. Not only is it syntactically similar to Java, but it also reduces the amount of boilerplate code.

Apache defines Groovy as:
A multi-faceted language for the Java platform.
Apache Groovy is a powerful, optionally typed, and dynamic language, with static-typing and static compilation capabilities, for the Java platform aimed at improving developer productivity thanks to a concise, familiar, and easy-to-learn syntax. It integrates smoothly with any Java program, and immediately delivers to your application powerful features, including scripting capabilities, Domain-Specific Language authoring, runtime and compile-time meta-programming, and functional programming.

Groovy scripting simplifies Java coding, automates recurring tasks, and makes domain-specific language modeling easy. Plus, it supports ad-hoc scripting. 

With Groovy, you get advanced language features like closures, dynamic methods, and the Meta Object Protocol (MOP) on the Java platform (we will learn all of this, rest assured). 

And your Java knowledge won’t become obsolete as Groovy builds on it seamlessly. 

But it is wrong to say that Groovy is only a scripting language. While it certainly functions as one, there’s much more to it than meets the eye.

It can pre-compile into a Java bytecode, integrate into different applications (especially Java-based), be the basis of building a whole new application, and so much more. 

Groovy is also used in major projects like Grails, Jenkins, and Gradle.

As you can see, it clearly does much more than just scripting. So labeling Groovy a mere scripting language is like trying to fit a square peg into a round hole; it’s simply too versatile to be restricted to a single category. 

It’s safe to say that when you write a program in Groovy, you are writing a special kind of Java program, with all the power of the Java platform at your disposal, including the massive set of available libraries. 

The only catch is that you have to learn to write concise code, as opposed to the verbose Java syntax. 

Let’s take a closer look at why Groovy is such an interesting language. 

Why Groovy Scripting Makes a Programmer’s Life Easy

It’s Friends with Java.

What I mean by being friends with Java:

  • Smooth integration with the JVM (Java Virtual Machine), i.e., it works as a dynamic scripting language for the JVM
  • Blends seamlessly with existing Java code and libraries
  • Extends the java.lang.Object class 
  • Implements operator overloading as Java methods, which can be called in Groovy as if they were operators
  • Uses Java features like abstract classes and interfaces seamlessly

Calling Java classes or methods from within Groovy code, and doing so in the opposite direction, is easy. 

For instance, you can still use a Groovy Date to access all the methods of the java.util.Date class. And you can easily call a Groovy class, say ‘MyGroovyClass’, from within a Java class by ensuring MyGroovyClass is on the classpath of your Java application.

A really cool thing about Groovy is that it plays well with Java syntax! So you don’t need to worry about learning a new syntax altogether. 

The seamless interplay of Groovy and Java opens up two dimensions: using Java to optimize code for runtime performance, and using Groovy to optimize code for flexibility and readability.
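To make this concrete, here is a minimal, illustrative sketch of that interplay: a Groovy script using plain JDK classes (java.util.Date, java.text.SimpleDateFormat, java.util.ArrayList) directly, with a little Groovy sugar on top.

import java.text.SimpleDateFormat

def today = new Date()                      // java.util.Date, imported by default
def format = new SimpleDateFormat("yyyy-MM-dd")
println format.format(today)                // e.g. 2024-01-31

def names = new ArrayList<String>()         // a plain java.util.ArrayList
names.add("Ada")                            // Java method call
names << "Linus"                            // Groovy operator sugar on the same list
println names.size()                        // 2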

It Supports Dynamic Typing (and Static Typing).

Dynamically typed languages, like Groovy, move type checks from compile-time to run-time. 

Type safety includes handling type (data type) mismatch errors in a programming language. Enforcing type safety can happen at compile time or run-time. 

For instance, in languages like Java (that enforce static typing), you must give a data type to every variable you define. Then the code gets compiled, and a type mismatch error occurs if the type assigned to the variable and the value do not match. 

So you cannot, for instance, assign an integer value to a variable you have defined as a String, i.e., String str = 123 will not compile.

Groovy allows you to defer specifying the data type of a variable until runtime, providing greater flexibility.

Of course, this can be disadvantageous since it can cause the entire system to crash, but it’s a fair price to pay for the features and flexibility it offers in return. 
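Here is a small, illustrative sketch of what run-time typing looks like in practice; the variable names are arbitrary.

def value = 42
println value.getClass().simpleName   // Integer

value = "now I am a string"           // the same variable now holds a String
println value.getClass().simpleName   // String

println value.toUpperCase()           // fine now; calling this while value was 42 would only fail at run-time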

It Allows Optional Typing

An extension of the above feature is optional typing. 

It means you can leave out mentioning the data types while writing your code. It’s done with the help of the keyword “def”. We will look at this in detail a little later. 

// When a data type is not specified in Groovy, the variable still belongs to the type Object.
// It doesn't mean there is no type.
def hello = "Hello World"

It’s Object-Oriented

In Groovy scripting, you can leverage all the object-oriented properties and features available in Java.

So you can create classes, call class methods, set properties, and instantiate class objects in Groovy. 

It’s Loaded With Some Cool Features.

There are a lot of great features that Groovy scripts offer. Discussing all of them is beyond the scope of this blog post. 


A few amazing ones are: 

  • Consider a common example (I know what you are smiling at, my fellow programmers):
println "Hello World."

In Groovy, you don’t need a semicolon or a parenthesis. Even System.out.println (in Java) is reduced to println.

  • It doesn’t require importing common packages, nor does it make specifying data types mandatory.
import java.util.*; // Java
Date today = new Date(); //Java

myday = new Date() // Groovy doesn't need a semicolon, nor does it need to import the package.
  • It supports closures, a really awesome feature we will see in a while.
  • It generates getters and setters automatically at compile time; a class written this way is called a POGO (Plain Old Groovy Object). 
  • It’s super easy to work with Lists and Maps in Groovy (see the sketch after this list). 
  • It supports operator overloading, and makes it easier than in Java.
  • It’s exciting to witness an increasing number of developers adopting Groovy lately! It’s gaining a lot of attention and momentum in the industry. Plus, it’s corporate-backed and has robust community support. 
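As a quick, illustrative sketch of two of these features (the class and variable names are made up): a POGO whose getters and setters are generated for you, plus Groovy’s literal syntax for lists and maps.

class Person {                             // a POGO: no explicit getters or setters needed
    String name
    int age
}

def p = new Person(name: "Ada", age: 36)   // named-argument constructor comes for free
println p.getName()                        // generated getter
p.setAge(37)                               // generated setter
println p.age                              // property-style access also works

def languages = ["Groovy", "Java", "Kotlin"]   // a List literal
def ages = [Ada: 36, Linus: 55]                // a Map literal
println languages[0]                           // Groovy
println ages["Linus"]                          // 55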

Talking about the features of Groovy is like asking a chef how many ingredients they have in their pantry. Just like a skilled chef uses a variety of ingredients to create unique and flavorful dishes, programmers can use diverse Groovy features to craft something powerful and efficient. The features blend so well that coding in Groovy soon becomes second nature, not to forget pure fun. 

Building on the flexibility that Groovy offers, there are different ways in which you can start your journey. 

Groovy Environment: How to Run Groovy 

To run Groovy 2.4, ensure you have a Java Runtime Environment (JRE) for Java version 6, 7, or 8 installed on your computer. It is available for free here. 

After this, simply set the JAVA_HOME environment variable to point toward the location of your Java installation. 

A detailed installation guide for Groovy is available on its official website. It will walk you through all the latest instructions. 

Note: You can also refer to the Groovy documentation if you prefer. 

As if I haven’t stressed it enough, you can start using Groovy in different ways. Just open a new tab in your browser and head to the Groovy web console.

The web console is a handy way of getting hands-on with Groovy, whether for learning or for debugging small code segments. It’ll save you the trouble of installing a full-blown IDE or an interpreter on your machine. All the code snippets can be safely and correctly run within it. 

All you have to do is type your required script in the white area and click “Execute Script” to view the output.


Once you have installed Groovy, you can run scripts directly. You can do so via “groovy”, “groovysh”, or “groovyConsole”.

You can even compile Groovy with “groovyc” or run a compiled Groovy script with Java. 

If you feel adventurous, you can install a Groovy plug-in for your favorite IDE: IntelliJ IDEA plug-in, Netbeans IDE plug-in, Eclipse plug-in, and other editors. 

By now, you might have an idea about how easy it is to work with Groovy. It can be a handy tool. But hey, we all need to be practical as well, right? 

So in the next section, we’re going to check out how to use Groovy scripts in real-world situations.

Chapter 2: Infinite Possibilities with Groovy Scripting 

Depending on your situation and domain, you might want to use the features of Groovy differently. 

Groovy for the Plain Old Java Programmer 

One of the most obvious ways to make use of Groovy is by pairing it up with Java programming. 

As someone who has been a Java developer, I empathize with the challenges that arise when trying to introduce a new language to the team, only to face resistance from management who insist on sticking with Java. It’s often the case because Java is widely acclaimed and cherished.

Groovy can be your savior and trusted ally here, allowing you to introduce dynamic behavior on top of your existing Java code while making it concise.  So with Groovy, you can open a plethora of use cases and get the ball rolling on new projects. 

Let’s consider a simple example here. 

I’m sure, as a developer, there must have been countless occasions where you needed to access a file and perform some operations before closing it (or sometimes forget to close it, resulting in some awkward stares from peers). 

Here is how you would use Java to achieve this seemingly simple task. 

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class FileProcessor {
    public static void main(String[] args) throws IOException {
        BufferedReader reader = new BufferedReader(new FileReader(args[0]));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
        }
        reader.close();
    }
}

And with Groovy, bingo!

new File(args[0]).eachLine { line ->
    println line
}

See the difference? 

It’s a win-win for everyone. 

So the software application that you have been burning the midnight oil for can be taken up a notch with the features that Groovy offers. And your managers remain happy that you haven’t abandoned Java altogether.  

Its support for features like functional programming and metaprogramming allows you to write concise and expressive code, at the same time providing seamless integration with Java libraries and frameworks. 

Automate Your Way with Groovy Scripts 

Groovy is a perfect language to automate daily, repetitive tasks like extracting data from a data source, processing a batch file, or generating your quarterly sales report. 

With built-in scripting capabilities, you can automate your way forward with Groovy scripting and make your life easier. 
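As a minimal, illustrative sketch, assuming a hypothetical sales.csv file with lines like "north,1200": the script below totals the second column and writes a tiny report.

def total = 0
new File("sales.csv").eachLine { line ->
    def parts = line.split(",")              // e.g. ["north", "1200"]
    total += parts[1].trim().toInteger()
}
new File("report.txt").text = "Total sales: ${total}"
println new File("report.txt").text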

If you are the DevOps or the Agile programmer kind and your daily work is managing a bunch of cards and statuses across different workflows, Groovy can be your genie. 

It can build simple automation for everyday tasks or even pull up continuous integration (CI) and reporting capabilities. 

ScriptRunner, an add-on app on the Atlassian Marketplace, showcases Groovy and its vast capabilities by offering automation for everyday Jira tasks. It helps you create custom scripted fields, design unique workflows, automate bulk actions, and much more. 

Groovy can even help you with your testing needs, both unit and functional testing, so your testers feel right at home. 

Integrations Made Easy with Groovy Scripting 

You can’t think of standalone applications in a digitally evolving world. 

As a developer, you might have already felt the need for different software applications and programming languages you use to be interoperable with each other. 

Groovy originated from this motivation. 

So it can play a huge role in a lot of different integrations. 

You might have used APIs and felt the pain that goes along with making them talk to one another. To make this easier, you can use Groovy to integrate with RESTful APIs, SOAP services, and other web services. This is possible because of built-in support for HTTP, JSON, and XML, making it easy to handle and manipulate data. 
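For instance, here is a minimal sketch of calling a REST endpoint and parsing the JSON response with Groovy’s built-in JsonSlurper. The URL and field names are placeholders, not a real API.

import groovy.json.JsonSlurper

def json = new URL("https://api.example.com/tickets/42").text   // fetch the raw JSON (placeholder URL)
def ticket = new JsonSlurper().parseText(json)                  // parse it into maps and lists

println ticket.id
println ticket.status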

Groovy features like support for JDBC and SQL make it easy to integrate it with your data sources like MySQL, Oracle, and PostgreSQL. And a cherry on top is that you can use this extracted data to generate reports with Groovy as we saw a while ago. 
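And a minimal sketch of the database side, assuming a hypothetical MySQL database with its JDBC driver on the classpath (the connection details are placeholders):

import groovy.sql.Sql

def sql = Sql.newInstance("jdbc:mysql://localhost:3306/sales", "user", "secret",
        "com.mysql.cj.jdbc.Driver")
sql.eachRow("SELECT region, amount FROM orders LIMIT 5") { row ->
    println "${row.region}: ${row.amount}"   // build a quick report from the query results
}
sql.close()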

We have all struggled with conflicting message formats and structures. Groovy’s dynamic typing gives the flexibility to work with these formats. So you can use it to integrate with middleware technologies like Apache Kafka, RabbitMQ, and Apache Camel.  

Cloud services have taken the world by storm. You can use Groovy to integrate with various Cloud services and applications like AWS, Azure, and Google Cloud. You can also use it to integrate with other cloud applications like Jira, Azure DevOps, Salesforce, ServiceNow, Zendesk, GitHub, etc. 

You can use Groovy scripting for enterprise integration tasks like ETL, data integration, and application integration. Its support for functional programming and collections, along with the Java libraries and frameworks within its reach, makes it a powerful integration tool and resource. 

Have these concrete examples opened your minds to the world of Groovy and the value it brings to the table? 

You might have already started thinking of newer ways to implement your next project using Groovy scripting. Or you might ponder about how to use the integration prowess it natively supports. 

Let’s dig into this thought a little more.

Groovy Scripting in Exalate

Throughout this blog post, we will explore an interesting way in which Groovy adapts to diverse scenarios. 

We’ll discuss a solution called Exalate that uses Groovy scripts to synchronize information between different applications. 

Feel free to skip the Exalate-related sections if you only want to keep your mind occupied with Groovy and get a hands-on experience faster. However, it wouldn’t hurt to quickly scan through and acquire some extra knowledge along the way.

Before we discuss how Exalate uses Groovy, let’s briefly understand what Exalate is in the first place. 

Exalate is an integration solution that aims to provide uni or bi-directional synchronizations between different software applications. It supports integrations for Jira, Salesforce, ServiceNow, Zendesk, GitHub, Azure DevOps, HP ALM, etc. 

Now there are many integration solutions available in the market. Then why talk only about Exalate? 

Exalate is the only integration solution in the market that uses Groovy scripting to set up advanced, tailor-made integrations with multiple dependencies and custom data mappings. 

So it would be intriguing to study how you can use Groovy scripts for synchronizing information across our favorite platforms.

Exalate has its reasons for choosing Groovy as the preferred language. Regarding this, I had a conversation with the chief software engineer at Exalate, and this is what he had to say: 

Basically, when the product was conceptualized, we started looking into options for selecting a language that the Jira admins would be comfortable with. Exalate started with Jira on-premise as its first connector. 

And because Scriptrunner was the most popular addon in the marketplace at that time, and it used Groovy, the choice was a no-brainer for us. 

Another reason why we chose Groovy was because it provided seamless integration with the Jira API. You could call the Jira API without the need for any translation between Jira’s own language and Exalate scripts, as long as the scripting language was Java-based. 

Finally, we wanted to be able to run within Jira without making the customer install anything on their systems and without relying on the internet (since some Jira’s are only available within the company network), which means whatever language we use needs to be executable within Jira’s Java process.

Exalate supports decentralized integration. It uses Incoming and Outgoing sync processors on both sides that wish to interchange data. These processors allow independent and full control over information exchange.  

How to Set Up Your Development Environment in Exalate

The Exalate admin console exposes the Outgoing and Incoming sync processors as the “Outgoing sync” and “Incoming sync” script panels. These windows are present under the “Rules” tab that is displayed when you configure the connection.  

For instance, if you want to set up a Jira Zendesk integration, you must first install Exalate on both Jira and Zendesk instances. Then create a connection in the Script mode. 

You then need to configure the Outgoing sync script to determine what information to pass to the other side and an Incoming sync script that interprets the information received from the other side. You can choose to add, delete, or edit these sync “Rules” according to your integration use case.

Script mode in Exalate
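To give you a feel for what these rules often look like, here is a minimal sketch on the Jira side. The exact fields and helpers depend on your connectors and use case, so treat it as an illustration rather than a ready-made configuration.

// Outgoing sync (Jira side): what to send to the other side
replica.key         = issue.key
replica.summary     = issue.summary
replica.description = issue.description
replica.comments    = issue.comments

// Incoming sync (Jira side): how to apply what arrives from the other side
issue.summary     = replica.summary
issue.description = replica.description
issue.comments    = commentHelper.mergeComments(issue, replica)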

Connections in Script Mode Using AI Assist

The Script Mode allows you to generate and optimize scripts using the AI Assist feature — which appears as a tab under both the incoming and outgoing sync rules.

How does it work?

Enter your sync requirements into the chat box, and AI Assist will generate scripts based on your input, existing configurations, and Exalate’s scripting API.

It is also important to note that AI is not perfect. So, you need precise and detailed prompts to ensure the best results. 

Let’s say you want to sync statuses between Azure DevOps and Salesforce; the prompt could look something like this: 

“I want to sync the status of my work item with the status of a Salesforce case.”

Script mode in Exalate

After a moment, the script will be generated, with suggested changes highlighted in green and red. The green scripts are suggested additions, while the red scripts are suggested deletions. 

If the new snippet works for you, click on “Insert Changes”. Otherwise, you can discard the suggested code. If needed, you can refine your prompt and, once satisfied, publish the changes.

Proceed to create automatic synchronization triggers now. 

After configuring the scripts, you must “Publish” the changes and then test the sync. 

While using Exalate, you’ll come across something called “replica”. You can see it in the image above. 

A copy of the original entity transferred to the other side is called a replica. It is a payload containing details of the information exchange.

Replica in Exalate

You can learn more about Exalate through its Academy tutorials or get hands-on experience with a step-by-step Getting Started guide on its documentation. 

Note: From here on, I’ll include Exalate Groovy scripting examples wherever applicable.

Chapter 3: Understanding the Fundamentals Before You Start Coding in Groovy – aka Groovy Scripting Basics

Each programming language has its distinct look and feel, but the general structure remains the same. If you are familiar with a few such languages, following this blog will be easy for you. It’ll also help to have some background knowledge of Java. 

Nevertheless, I will provide the necessary information for each concept we cover, enough for you to get started with the language. But you must be aware of general programming concepts: braces, indentation, operators, parenthesis, comments, statement terminators, and the like. 

If you are eager to learn these concepts, move on to the next chapter. 

Of course, you can always revisit any section if you feel like you are struggling. 

We’ll cover a few starters here so you are comfortably settled within the Groovy environment. 

Commenting a Groovy Code

Like all programming languages, you can use single-line or multi-line comments in Groovy.

// this is a single-line comment
Some Groovy code here. 
/* this is a
multi-line comment */
Some Groovy code here. 

Less is More in Groovy

As we have already discussed, you can write shorter, more concise, and more expressive code using Groovy scripts. 

  • Parentheses, package prefixes, and semicolons are optional in Groovy. However, in certain situations, like in methods with no parameters or constructors, parentheses are a good practice. 
  • Using “return” statements is optional in Groovy. 
  • Type (data type) declarations are optional in Groovy. 
  • Type casts are optional in Groovy. 
  • Methods and attributes in Groovy are public by default. 
  • You can omit the “throws” clause in the method signature even if the method throws a checked exception.  

With this under your belt, let’s study the principal tool we will be using throughout this blog: the println or print statement.

Displaying the Output in Groovy

You can use the print or println statement to display the output in Groovy. Classically these methods print the toString value of the object. 

Ignore the other lines of code written in the example below. We will study them a little later. You might already know what a class is if you are familiar with Java.    

class DemoClass {
    static void main(String[] args) {
        // use the print or the println statement to display the output
        println 'car'
        print 'car'
    }
}

Groovy Scripts

Groovy scripts are files that hold the “.groovy” extension. 

  • They can contain loose statements, plain expressions, class definitions, or method definitions, without requiring an enclosing class.
  • They can be run from the command line or within a Groovy environment (see the sketch after this list). 
  • When a Groovy script is executed, the Groovy interpreter reads the script from top to bottom and executes each statement in turn. If the script contains method definitions or class declarations, these are compiled into bytecode and loaded into the JVM at runtime. 
  • They can also import or use external libraries, such as Java libraries or other Groovy scripts to extend their functionality. 
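For illustration, a hypothetical hello.groovy file could be as small as this, mixing a loose statement with a method definition:

// hello.groovy
def greet(String name) {
    "Hello, ${name}!"          // the last expression is the return value
}

println greet("world")

You would run it from the command line with groovy hello.groovy.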

Note: You can externalize Groovy scripts in Exalate so you can use (or reuse) them outside the product scope.

Import Statements in Groovy

You can use import statements in Groovy to implement some functionality provided by libraries. 

By default, Groovy imports the following libraries, so you don’t need to worry about importing them. 

import java.lang.*
import java.util.*
import java.io.*
import java.net.*
import groovy.lang.*
import groovy.util.*
import java.math.BigInteger
import java.math.BigDecimal

Using Import Statements in Exalate

The most common example of Groovy packages used in Exalate is for transformers. You can use these transformers to convert information from one specific format to another. 

For instance, every application uses a different text format: Jira uses wiki markup; Azure DevOps, Salesforce, and ServiceNow use HTML; and GitHub and Zendesk use Markdown. 

The following packages can be used to handle these formatting differences: 

import com.atlassian.jira.component.ComponentAccessor
import com.atlassian.jira.security.groups.GroupManager
import com.atlassian.jira.user.ApplicationUser
import java.text.SimpleDateFormat
import java.text.DateFormat

Note: You can also check here for more information on the packages. 

Groovy Keywords

Keywords are special words that are reserved to perform a certain function. So you cannot use them as a variable or function name. 

We will be learning about a few important keywords in the coming sections. 

abstract, as, assert, boolean, break, byte, case, catch, char, class, const, continue, def, default, do, double, else, enum, extends, false, final, finally, float, for, goto, if, implements, import, in, instanceof, int, interface, long, native, new, null, package, private, protected, public, return, short, static, strictfp, super, switch, synchronized, this, threadsafe, throw, throws, transient, true, try, void, volatile, while

Phew! That was long. I hope you are all set to move further. If not, take a break and come back soon. 

There is only one way to hit the road now, to start coding! So let your fingers groove with Groovy and follow on.

Chapter 4: Groovy Variables and Data Types

Consider the following statement:

String str = 'This is a string variable'

We have declared a variable called str. It belongs to the data type: String. Its value is: ‘This is a string variable’. 

Groovy scripting variables

Variables are named memory locations that have a data type. You’ll want to use variables to store some information. You can then use these variables to perform some operations throughout the program.  

Keep in mind:

  • Variables are case-sensitive. So, int x = 5 and int X = 5 are two different variables: x and X. 
  • Variable names can include letters, numbers, and underscores. A name can start with a letter or an underscore, but not with a number. 
  • Variables need to be declared. That is, you must specify the data type of the variable either explicitly or using the “def” keyword (we will cover it soon). 

Data types denote what kind of data that variable holds: a string, a character, a boolean, an integer, a decimal, etc. 

Groovy has several built-in data types. We will quickly look at each of them. 

Numbers

Numbers can be integers (whole numbers) or decimal (floating point) values. 

Number data type in Groovy

The following summarizes each data type, its range of values, and an example declaration. 

  • byte: represents a byte value; range -128 to 127. Example: byte b = 2
  • short: represents a short number; range -32,768 to 32,767. Example: short s = 2
  • int: represents an integer; range -2,147,483,648 to 2,147,483,647. Example: int i = 3
  • long: represents a long number; range -9,223,372,036,854,775,808 to 9,223,372,036,854,775,807. Example: long l = 455552
  • float: represents a 32-bit floating point number; range 1.40129846432481707e-45 to 3.40282346638528860e+38. Example: float f = 15.35
  • double: represents a 64-bit floating point number; range 4.94065645841246544e-324d to 1.79769313486231570e+308d. Example: double d = 6.78889

Note: You cannot assign a higher value like 45552 to a short data type since it will be out of its range. For instance, short s = 45552. Try it yourself and see the result!
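As an illustrative snippet mirroring the declarations above (you can paste it into the Groovy web console to try it):

byte   b = 2
short  s = 2
int    i = 3
long   l = 455552
float  f = 15.35
double d = 6.78889

println i + l    // 455555
println d / f    // floating point division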

Strings

Strings are used to give a text value to variables. The value can be either a single character (char) or a block of text (String). 

Groovy scripts string

Strings can be enclosed in single, double, or triple quotes. Strings enclosed in triple quotes can span across multiple lines. 

Strings are formed out of single characters placed in a sequence. So you can access individual characters one at a time. The index position of the first character is 0, and the last character is one less than the length of the string. 
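Here is a small, illustrative snippet showing the quoting styles and index-based access:

char   c      = 'A'                  // a single character
String single = 'single quotes'
String triple = '''a string
spanning multiple lines'''

def word = "Groovy"
println word[0]                      // G  (index of the first character is 0)
println word[word.length() - 1]      // y  (last character)
println word.length()                // 6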

String Interpolation

Groovy string interpolation

String interpolation allows you to include variables or expressions (like 5+5) within a string. You can include variables or expressions with dynamic content without concatenating strings and variables manually.

Whenever an expression is given within ${expression} in a string literal (double quotes), it works as a placeholder. When the code is executed, the expression is evaluated and replaced by the actual value. 

For instance, if the expression says hello ${age}, the aim is to replace age with the actual value. Likewise, if there is an expression like ${5+5}, it will be replaced with the actual answer 10 at run-time. String interpolation aims to achieve this. 

Groovy supports string interpolation. 

Note: String interpolation works only for strings in double quotes. Single and triple quotes aren’t valid candidates for it.
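A quick, illustrative snippet (the variable names are arbitrary):

def name = "Ada"
def age  = 36

println "Hello ${name}, next year you will be ${age + 1}"   // expressions are evaluated
println "Quick math: ${5 + 5}"                              // prints 10
println 'No interpolation here: ${name}'                    // single quotes keep the text literal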

Using String Interpolation in Exalate Scripting

Suppose you want to synchronize comments from one system to another and have a specific requirement for this sync. You want to mention the original author of the comment from the source side and send it to the destination instance. You also want to sync the comment creation time.  

The Groovy script used in Exalate for this use case would be:

entity.comments += replica.addedComments.collect { comment ->
    comment.body = "${comment.author.email} commented at ${comment.created}: \n${comment.body}"
    comment
}

Boolean

Boolean is a special data type. You can assign only 2 values to Boolean variables: true or false. It is mostly used in conditional statements to check whether a value is true or false. You can use it like a regular data type and assign it to variables, methods, or any other field.

Groovy Boolean
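A tiny, illustrative example:

boolean isSynced = true
def hasErrors    = false

if (isSynced && !hasErrors) {
    println "Everything is up to date"
} else {
    println "Something needs attention"
}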

Optional Typing with the def Keyword

I have been harping about how Groovy scripting is so versatile. Here’s a good one. 

There are 2 ways to declare (or define) variables (or methods) in Groovy. 

The first one is the traditional Java-based static approach, where it is mandatory to assign a data type to a variable name. 

The second one, the Groovy way, says that assigning a data type is optional. 

So, how do you do that? By using the “def” keyword.

Let’s understand it with the help of an example. We have defined 2 variables def X = 6 and def str = “Hello World” and assigned a numeric value and a text value to both of them respectively. Note here that we didn’t explicitly state the data type and simply used the keyword “def”

Groovy scripting def keyword
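
The screenshot isn’t reproduced here, but based on the description above, the code would look roughly like this:

def X = 6                  //Groovy infers an Integer at run-time
def str = "Hello World"    //Groovy infers a String

println X.getClass()       //class java.lang.Integer
println str.getClass()     //class java.lang.String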

Optional typing is the idea of deferring the knowledge of data type until you run the program. So in programming lingo, (data) type checking will happen at run-time instead of compile-time. 

This is achieved using the keyword “def”

The keyword “def” can also be used with methods, where it is not mandatory to mention the data types for parameters. 

When a variable is declared using “def”, Groovy infers the type of the variable based on the value that is assigned to it. 

Using the keyword def doesn’t imply that a data type doesn’t exist; it’s just a Groovy equivalent to an Object in Java. 

 Note: Groovy can blow your mind away by allowing static type-checking using the @TypeChecked annotation. 

Note: You can use optional typing with Exalate scripting just like you would in Groovy. 

Chapter 5: Groovy Operators

Operators in a programming language allow you to manipulate data. They help perform some kind of operation on integers, strings, or booleans.

Arithmetic Operators

In Groovy, you can perform normal math operations like:

  • Addition (+): Adds two numbers
  • Subtraction (-): Subtracts one number from the other
  • Multiplication (*): Multiplies two numbers
  • Division (/): Divides two numbers
  • Remainder (%): Returns the remainder of a division operation

And then there is a power operator “**”. 

The power operator has two parts: the base and the exponent, like in math. The result will depend on the value of the operands (base and exponent) and the data type they belong to. 

Groovy operators

Plus, you can also use the usual postfix and prefix operators: ++ (increment) and -- (decrement) within expressions in Groovy. 

For instance, x++ uses a postfix operator. It means that the value of ‘x’ is first used in the expression, and then incremented afterward. 

A prefix operator, ++x means that the value of ‘x’ is first incremented, and then used in the expression. 

You can refer to the example shown below.

Groovy increment operator
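
As a minimal sketch of the postfix and prefix behavior described above (values are illustrative):

def x = 5
println x++    //prints 5, then x becomes 6
println x      //prints 6

def y = 5
println ++y    //y becomes 6 first, then prints 6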

You can also use unary operators in Groovy. A unary operator operates on a single operand, i.e., it takes a single input and produces a single output. Unary operators are used to modify the value of a variable or perform a specific operation on it. 

Groovy unary operator

Relational Operators

Relational operators are used for comparing two variables, values, or objects. So the two values can be equal, greater than, smaller than, or not equal to. 

It returns a boolean value (true or false) based on the comparison made.

  • == : checks if the two values are equal
  • != : checks if the two values are not equal
  • < : checks if one value is less than the other
  • > : checks if one value is greater than the other
  • <= : checks if one value is less than or equal to the other
  • >= : checks if one value is greater than or equal to the other

Here is an example to demonstrate some of the operators. 

Groovy relational operators
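
Here is a small sketch of the kind of comparisons the screenshot illustrates (values are illustrative):

def a = 10
def b = 20

println a == b    //false
println a != b    //true
println a < b     //true
println a >= b    //false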

Logical Operators

Logical operators are used to evaluate boolean values and return a boolean result. Groovy supports 3 logical operators:

  • Logical AND (&&): returns true if both operands are true, false otherwise
  • Logical OR (||): returns true if at least one operand is true, false otherwise
  • Logical NOT (!): returns the opposite boolean value as that of the operand
Groovy scripting logical operators
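
A minimal sketch of the three logical operators (variable names are illustrative):

def isAdmin = true
def isActive = false

println isAdmin && isActive    //false: both operands must be true
println isAdmin || isActive    //true: at least one operand is true
println !isActive              //true: negation of false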

Bitwise Operators

Bitwise operators are operators that perform operations on the binary representation of integer values. 

Bitwise operators work on int, byte, short, long, or BigInteger operands. If you combine an int and a long, the result will be a long; if you combine a BigInteger and a long, the result will be a BigInteger. 

In short, the result will always be the largest numerical data type. 

There are four bitwise operators that Groovy supports: 

  • AND operator (&): returns a value where each bit is set to 1 only if both operands have a corresponding bit set to 1 (e.g: x&y)
  • OR operator (|): returns a value where each bit is set to 1 only if either operand has a corresponding bit set to 1 (e.g: x|y) 
  • XOR (exclusive or) operator (^): returns a value where each bit is set to 1 only if exactly one of the operands has a corresponding bit set to 1 (e.g: x^y)
  • Negation operator (~): returns a value where each bit is flipped from 1 to 0 or from 0 to 1 (e.g ~x)

Groovy also offers three bitwise shift operators: 

  • Left shift operator (<<): shifts the bits of the first operand to the left by a number of positions specified by the second operand (e.g: x << 2)
  • Right shift operator (>>): shifts the bits of the first operand to the right by a number of positions specified by the second operand (e.g: x >> 2)
  • Right shift unsigned (>>>): shifts the bits of the first operand to the right by a number of positions specified by the second operand, filling the leftmost bits with 0’s instead of preserving the sign bit like the regular right shift operator (>>)

Bitwise operators are typically used while working with low-level binary data, such as when implementing networking protocols or device drivers. They can also be used in other contexts when optimizing certain algorithms or data structures. 
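
To make the bit patterns concrete, here is a small sketch using binary literals (values are illustrative):

def x = 0b0110    //6
def y = 0b0011    //3

println x & y     //2  (0b0010)
println x | y     //7  (0b0111)
println x ^ y     //5  (0b0101)
println ~x        //-7 (two's complement negation)
println x << 2    //24
println x >> 1    //3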

Note: You can learn more about Bitwise operators here

Range Operator 

The range operator is used to create a sequence of values that have a starting and an ending point. It is represented by two dots (..) and can be used to create a range of integers, characters, and other data types.

Groovy range operator

Range operators can be used in conjunction with other Groovy scripting data structures like lists, arrays, or collections. It can be useful when working with large data sets or when generating sequences of values.
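
A minimal sketch of the range operator:

def numbers = 1..5
println numbers.toList()      //[1, 2, 3, 4, 5]
println numbers.contains(3)   //true

('a'..'e').each { println it }   //prints a through e, one per line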

Safe Navigation Operator (?.)

Before we learn about the safe navigation operator, let’s see what a dot(.) operator is. 

Like Java, Groovy also uses the dot (.) operator to access the class members and functions. 

In the example shown below, we have declared a class called Company that has a name and address. To access a class variable (or method) we create an object of the class. Then use the dot operator after the name of the object followed by the variable (or method) name. 

So if the object is def acme = new Company(), we access the name as acme.name

Groovy safe navigation operator

Now, what happens when you execute the code below? 

It throws a NullPointerException, a classic pain in the neck situation. Sometimes this exception can make your entire system crash. 

The safe navigation operator was born out of the need to avoid the NullPointerException.

Instead of a single dot, it has a question mark followed by a dot(?.). If the first argument or operand is null the entire expression will be null. It won’t throw an exception but just return the value null, not breaking anything in the process. 

Another reason the safe navigation operator is so popular is that it can simplify your code. 

Consider the example shown below. 

Safe navigation operator in Groovy
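
The screenshot isn’t reproduced here; a minimal sketch of the Company example described above (the property values are illustrative) could look like this:

class Company {
    String name
    String address
}

def acme = new Company(name: "Acme", address: "Main Street")
println acme.name        //Acme

def unknown = null       //no Company object assigned
//println unknown.name   //this would throw a NullPointerException
println unknown?.name    //prints null, nothing breaks

//without the safe navigation operator, you would need an explicit check
if (unknown != null) {
    println unknown.name
}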

Note: We will learn more about the if statement in a while. 

Using Safe Navigation Operator In Exalate Scripting

As we saw in this section, the Safe Navigation operator (?.) is used to avoid the NullPointerException. 

Exalate uses this operator in a simple yet innovative way. 

For instance, you want to access the email property of a user while syncing the reporter from a Jira instance. 

We can use the operator in the following manner. 

issue.reporter = nodeHelper.getUserByEmail(replica.reporter?.email)

The above line ensures that if the email address of the reporter isn’t found, then the Safe Navigation operator would return a null value instead of throwing an exception.

Elvis Operator (?:)

Many of us have grown up listening to Elvis. Let’s read about the Elvis operator in Groovy now.

Let’s start with the following example:

Elvis operator in Groovy
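
The example in the screenshot isn’t reproduced here; a minimal sketch of it would be:

def value1 = null
def value2 = "default value"

def result = value1 ?: value2
println result    //prints "default value" because value1 is null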

The Elvis operator (?:) is a shorthand operator that allows you to simplify null checks in your code. It’s often referred to as “ternary operator for null safety”. 

As seen in the example above, if value1 is not null, it is assigned to the result. If value1 is null, then value2 is assigned to the result instead. So you can provide a sensible default value in case one of the values is null. 

The Elvis operator can also be used in method calls or as a part of complex expressions. It is useful to write a more concise and readable code that handles null values elegantly. 

Groovy allows you to overload the various operators you learned about in this section. This concept is called operator overloading. 
Operator overloading allows you to redefine the behavior of built-in operators when they are applied to your custom objects. 

Using Elvis Operator in Exalate Scripting

We saw how the Elvis operator allows you to simplify null checks in your code and assign sensible default values in case the code encounters a null value. 

Say, you want to assign a default value when syncing some information from Jira to Zendesk. In Jira, the description for the ticket is optional, while in Zendesk, it’s mandatory. 

So your Incoming sync script in the Zendesk instance would look like this.

entity.description = replica.description ?: "No description provided"

The above script ensures that if an issue doesn’t have a description in Jira, “No description provided” will be auto-filled as the fallback description in Zendesk. 

Now that your mind has had enough of Groovy operators, let’s take you for a loop ride. 

Chapter 6: Groovy Control Flow Statements

Groovy supports all the control flow structures that Java offers. So you can use the if-else, for, while, do-while, and switch statements.

Control flow structures alter the flow of the program. So instead of statements executing sequentially, they run in an order specified by the control statement. 

If, If/else or Nested If Statement

The ‘if’ statement evaluates a condition; if the result is true, the statements inside the ‘if’ block are executed, otherwise, the statements inside the ‘else’ block are executed.

You can skip the ‘else’ statement and only use the ‘if’ condition. 

You can even use nested ‘if’ statements in Groovy.

Groovy if or if/else statement

You can also use a shorthand way of writing a long if-else statement: the ternary operator.

If the condition is true, then expression1 is executed, otherwise expression2.

condition ? expression1 : expression2

Using the If Statement in Exalate Scripting

The use of the ‘If’ statement in Exalate can be varied because you can do a variety of things based on the values received from the other side or modify the values you want to send to the other side the way you want. 

Suppose in Jira, you want to create an issue in a particular project. You also want to create an issue of a particular issue type based on the value of a field present in the replica. 

You can use the following code: 

//'x' is the value of the field in the replica you are testing against. 
if (replica.x == "abc") {
    issue.projectKey = "ABC"
    issue.typeName = "Task"
}
else if (replica.x == "def") {
    issue.projectKey = "DEF"
    issue.typeName = "Story"
}

Switch Statement

A switch statement allows a program to perform different actions based on the value of a variable or an expression. 

It provides a way to test the value of an expression against multiple cases and execute different blocks of code depending on which case matches the value. The expression being evaluated is compared against each of the cases, and when a match is found, the code block associated with that case is executed. 

The switch statement is often used as an alternative to a series of if-else statements, particularly when there are multiple conditions to check.

The variable or expression you need to evaluate must be given in the round brackets after the keyword switch. For the cases, use the keyword ‘case’ followed by the actual value you want to test, and finally a colon (:). Use the break statement after every case. The entire switch block is enclosed within curly braces {}.

Switch statement in Groovy
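
The screenshot isn’t reproduced here; based on the description below, the example is along these lines:

def day = "Tuesday"

switch (day) {
    case "Monday":
        println "Start of the work week"
        break
    case "Tuesday":
        println "It is Tuesday"
        break
    default:
        println "Some other day"
}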

A variable day is defined and you switch on the value of the day, i.e. Tuesday. The second case turns true when the value of the variable day matches “Tuesday”. 

So the output of the second case is printed and then the “break” statement is executed. 

We use the “break” statement to exit the case once the code block has been executed. When none of the case conditions are true, the “default” statement gets executed. It’s optional.

For Loop

Groovy supports ‘for’ loops where you can iterate over a sequence of values, such as a range of numbers or a list of items. You can use ‘for’ loops with arrays, collections, Maps, etc.

The actual condition for which you need to run the for loop is given in round brackets. For instance, the second example shown below prints the value of ‘i’ 5 times. The condition that checks whether the value of ‘i’ has reached 5, along with the increment (or decrement) of the value after each iteration of the ‘for’ loop, is given within the round brackets. The actual statements that need to be executed within the ‘for’ loop are placed in curly brackets {}.

Groovy scripting for loop
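
The screenshots aren’t reproduced here; a minimal sketch of the two examples described above could be:

//first example: iterate over a list of items
for (item in ["apples", "bananas", "bread"]) {
    println item
}

//second example: print the value of 'i' 5 times
for (int i = 0; i < 5; i++) {
    println i
}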

Using For Loop in Exalate Scripting

You can use the “for” loop with Exalate.  

If you want to store some information from a user-defined field in the description field of an issue in Jira, you can do so as follows:

def p = ""

for(int i=0; i<replica.customFields."10035".value[0].approvers.size(); i++)
{
p += replica.customFields."10035".value[0].approvers[i].approver.displayName.toString() + " : " + replica.customFields."10035".value[0].approvers[i].approverDecision.toString() + "\n"
}

issue.description = p

Note: For loops are not always the most elegant solution. There are other methods, like .each(), .find(), and .collect(), which we’ll see in a while; they serve the same purpose and are often a better option.

While Loop

The ‘while’ statement executes a block of code repeatedly as long as a condition is true. The condition is evaluated at the beginning of each iteration of the loop. If it is true, the code inside the loop is executed. This repeats as long as the condition remains true. 

The condition is given inside round brackets after the keyword ‘while’. The entire loop is then enclosed within curly brackets {}.

In the example shown below, the condition checks the value of “i”. It prints this value till “i” becomes equal to 5. For every iteration, the value of “i” is incremented. 

Groovy scripts while loop
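
A minimal sketch of the loop described above:

def i = 0
while (i < 5) {
    println i    //prints 0 through 4
    i++          //increment i on every iteration
}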

Do While Loop

The ‘do-while’ statement is a variation of the while statement, where the condition is evaluated after the first iteration of the loop, ensuring the block is executed at least once. 

Thereafter, for every iteration, the code in the ‘do-while’ block is executed as long as the condition remains true.
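
A minimal sketch (note that the do-while statement requires Groovy 3.0 or later):

def i = 0
do {
    println i    //the block runs at least once
    i++
} while (i < 5)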

Chapter 7: Object Orientation in Groovy Scripting 

If you’re familiar with Java, you might already know most object-oriented programming concepts like classes, objects, interfaces, etc. 

Groovy is a full-fledged object-oriented programming language; everything is an object. 

You can create classes in Groovy like you can in Java.

A class is like a blueprint that defines the structure and behavior of objects. It has a set of properties (or attributes) and methods (or functions). 

Properties hold the data within the class, and the methods are operations you perform on that data. 

In the following example, we have defined a class called Student that has a few properties: name, age, and grade. It has a method called sayHello to print the students’ information. We create an instance (object) of the class called “student” (class names are case sensitive, so “student” is different from the class “Student”) and give values to its properties. 

Then we call the sayHello method to print those values. As seen, you can access individual properties and methods of the class with the dot(.) operator. 

object orientation in Groovy scripting
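
The screenshot isn’t reproduced here; a minimal sketch of the Student class described above (the property values are illustrative) looks like this:

class Student {
    String name
    int age
    int grade

    void sayHello() {
        println("Hello, my name is ${name}, I am ${age} years old, and I am in grade ${grade}.")
    }
}

def student = new Student()
student.name = "Alice"
student.age = 15
student.grade = 9

student.sayHello()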

You can control visibility, that is, which methods and properties are accessible outside the scope they are defined in, through the usual access modifiers: public, private, and protected. 

By default, properties and methods are public, but you can use other access modifiers to change the visibility. 

You can also create constructors for the classes you define. Constructors are methods used to initialize the objects of a class. A constructor has the same name as that of the class. 

A Plain Old Groovy Object (POGO) is a simple class in Groovy where you don’t need to define getter and setter methods, since Groovy automatically generates them for you. 

Groovy supports inheritance, where the child class inherits properties and methods from the parent class. An inherited class is defined by the keyword: “extends”.

class Student extends Person {
    int grade

    void sayHello() {
        println("Hello, my name is ${name}, I am ${age} years old, and I am in grade ${grade}.")
    }
}

You can also create interfaces in Groovy. An interface acts like a contract that the class must adhere to. Interfaces consist only of a list of methods for which no implementation (method body) is provided. The class that “implements” (a keyword) an interface must provide the method body, i.e., the implementation. Interface methods are public and abstract. The properties of an interface are public, static, and final. 

//define an interface Speaker with a single method speak
interface Speaker {
    void speak()
}
//a class called George implements the interface and defines
//a method body for the speak method
class George implements Speaker{
    void speak(){
        println "George is speaking"
    }
}

You can also create abstract classes or methods in Groovy scripting. They are similar to interfaces but can contain method implementations. You cannot create an object of an abstract class. Abstract classes are created using the “abstract” keyword. 

You must provide an implementation for the abstract methods if you create a class that inherits the abstract class.

/* We have defined an abstract class Animal with a single abstract method called speak() and a non-abstract method run(). Any class that extends the Animal class must provide an implementation for the speak() method, but it can also call the run() method provided by the Animal class. */
abstract class Animal {
    abstract void speak()

    void run() {
        println("The animal is running.")
    }
}

Object Orientation in Exalate Scripting

At the heart of Exalate is a replica that works as a payload to pass information between two applications. The replica itself is an object, and everything within it is also an object.

So, let’s consider the example of “status” in a Jira issue. When you say replica.status.name, you’re using the same object-oriented concepts we discussed earlier. We use the dot (.) operator to access the name property of the status object.

Chapter 8: Groovy Scripting Closures

Groovy closures are an interesting concept. Closures are anonymous blocks of code performing some function. They are defined within curly brackets: {}. A closure can contain multiple statements. 

You can assign a closure to a variable and call it like a function (or method) with a return value, or you can even pass it as an argument to a function. The block of code gets passed around and executed at a later time, more like a “function-on-the-go”. 

Groovy closures are a powerful way to write flexible or reusable code; they also save you a lot of time and make the code concise. 

Groovy closures
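
A few minimal sketches of closures in plain Groovy:

//a closure assigned to a variable
def greet = { name -> println "Hello, ${name}!" }
greet("Exalate")

//a closure with the implicit 'it' parameter
def square = { it * it }
println square(4)    //16

//a closure passed as an argument to a method
[1, 2, 3].each { println it }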

You can see a lot of use for closures in Groovy data structures. 

That brings us to our next chapter: Groovy data structures. 

Using Groovy Closures in Exalate Scripting

Perhaps, the most common and popular way of using Groovy Closures in Exalate is when you want to manage the comment visibility between different applications. 

In Jira Service Management, you can create comments as internal or public. And you want to filter and send only public comments from Jira to the destination instance. 

You can do so using the following code snippet:

//The closure { !it.internal } ensures only public comments are saved in
//the replica to be sent over to the destination side.

replica.comments = issue.comments.findAll { !it.internal }

Chapter 9: Groovy Scripts Data Structures

We have seen data types like int, long, short, etc. These are called primitive data types. 

Data structures are collections of these primitive data types in a list, an array, or a map format. We’ll look at them one by one. 

Lists

Groovy Lists allow you to store a collection of data. You can think of a List as a sequence of items, like your grocery or to-do list. 

To create a List in Groovy scripts, enclose it within square brackets [] and separate the items within the list with a comma (,).

def myGroceryList = ["apples", "bananas", "bread", "milk"]

Lists are a one-dimensional data structure. The items in a List can be primitive data types, or they can be object references. 

lists in Groovy scripts

Some List Methods

As we saw, Lists are sequences of items. So to perform some operations on Lists you can iterate over the list items one-by-one through indices. 

The first item in the list has an index of 0, the second has an index of 1, and so on. 

There are a lot of other operations you can perform on Lists. They allow you to read, add, remove items from the list, and do much more.

We’ll see a few of them.

each() Method

The each() method helps you iterate over all the items in the List and perform some operation on them. It’s a convenient way to apply the same operation to every item on the List. 

List method in Groovy
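
A minimal sketch, reusing the grocery list from above:

def myGroceryList = ["apples", "bananas", "bread", "milk"]
myGroceryList.each { item ->
    println "Buy: ${item}"
}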
find() Method

The .find() method allows you to search for a specific item in a list based on a condition you specify. It returns the first item in the list that matches the condition. 

The find method in Groovy
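
A minimal sketch (the list and condition are illustrative):

def numbers = [3, 7, 12, 18, 25]
def firstOver10 = numbers.find { it > 10 }
println firstOver10    //12, the first item matching the condition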
findAll() Method

The findAll() method works in the same way as the find() method. The only difference is that the findAll() method returns all the items that match the criteria instead of only the first item. 

Continuing the above example.

The findAll method in Groovy
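
Continuing the same illustrative list:

def numbers = [3, 7, 12, 18, 25]
def allOver10 = numbers.findAll { it > 10 }
println allOver10    //[12, 18, 25], every item matching the condition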
collect() Method

The .collect() method is used to manipulate the list and return the manipulated list. It transforms the list into something else. 

In the example shown below, we create a new list by multiplying all items in the old list by 2.

Collect method in Groovy
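
A minimal sketch of that transformation:

def oldList = [1, 2, 3, 4, 5]
def newList = oldList.collect { it * 2 }
println newList    //[2, 4, 6, 8, 10]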

Some More List Methods

A few more list methods are: 

add(): Adds a new item to the end of the list.

def myList = [1, 2, 3, 4, 5]
myList.add(6)
println("List with added element: ${myList}")

contains(): Allows you to check if a particular item is present in the list or not. It returns a boolean value.

def myList = [1, 2, 3, 4, 5]
def result = myList.contains(3)
println("Result: ${result}")

get(): Allows you to retrieve a specific item from a list based on its index position.

def myList = [1, 2, 3, 4, 5]
def result = myList.get(2)
println("Result: ${result}")

isEmpty(): Allows you to check if a list is empty or not. It returns a boolean value.

def myList = [1, 2, 3, 4, 5]
def emptyList = []
def result1 = myList.isEmpty()
def result2 = emptyList.isEmpty()
println("Result1: ${result1}")
println("Result2: ${result2}")

minus(): Allows you to create a new list by removing specific items from the list.

def myList = [1, 2, 3, 4, 5]
def newList = myList.minus([3, 4])
println("Original List: ${myList}")
println("New List: ${newList}")

plus(): Allows you to create a new list by adding specific items to an existing list.

def myList = [1, 2, 3, 4, 5]
def newList = myList.plus([6, 7])
println("Original List: ${myList}")
println("New List: ${newList}")

pop(): Allows you to remove an item from the list and return that item.

def myList = [1, 2, 3, 4, 5]
def lastElement = myList.pop()
println("Original List: ${myList}")
println("Last Element: ${lastElement}")

remove(): Allows you to remove an item from the list. When you pass an integer, it removes the element at that index; pass an object to remove its first occurrence.

def myList = [1, 2, 3, 4, 5]
myList.remove(3)
println("Original List: ${myList}")

reverse(): Allows you to reverse the order of the items in the list. It returns a new list with the items in reverse order.

def myList = [1, 2, 3, 4, 5]
def reversedList = myList.reverse()
println("Reversed List: ${reversedList}")

size(): Allows you to fetch the number of items in the list. It returns an integer denoting the size of the list.

def myList = [1, 2, 3, 4, 5]
def listSize = myList.size()
println("List Size: ${listSize}")

sort(): Allows you to sort the items in the list. It modifies the original list and returns the sorted list.

def myList = [5, 2, 3, 1, 4]
myList.sort()
println("Sorted List: ${myList}")

Using Groovy Lists in Exalate Scripting

One of the most common examples of Lists in Exalate scripting could be when syncing sprints in Jira.

//Only the sprints belonging to the following board IDs will be synced.
def boardIds = ["50", "80", "130", "144"]

//Here, boardIds is the list of board IDs we want to sync.
//We use the .find method of the List data structure.

if (entityType == "sprint" && boardIds.find { it == sprint.originBoardId }) {
    replica.name = sprint.name
    replica.goal = sprint.goal
    replica.state = sprint.state
    replica.startDate = sprint.startDate
    replica.endDate = sprint.endDate
    replica.originBoardId = sprint.originBoardId
}

Maps

Maps represent an unordered collection of items in the form of key:value pairs. A key and its value are separated by a colon, each key-value pair is separated by a comma, and the entire set of keys and values is enclosed in square brackets. 

The key works like an index to search for the value. They are also called associative arrays or dictionaries in some programming languages. 

Groovy script maps
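
A minimal sketch of a map literal (keys and values are illustrative):

def studentMap = [name: "Alice", age: 15, grade: 9]

println studentMap["name"]    //access a value by its key
println studentMap.age        //dot notation works too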

Some Map Methods

Like Lists, there are methods you can use to manipulate the items in maps. 

We’ll discuss a few examples. 

each() Method

The .each() method is used to iterate over maps and perform a specific operation on each of its key-value pairs. 

each method in Groovy
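
A minimal sketch (the map contents are illustrative):

def scores = [math: 90, science: 85, history: 78]
scores.each { subject, score ->
    println "${subject}: ${score}"
}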
find() Method

This method can be used to search for a key-value pair in a map that matches a given value based on a condition. The find() method returns the first key-value pair in the map that matches the given condition, or null if a match isn’t found. 

find method in Groovy maps
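
A minimal sketch using the same illustrative map:

def scores = [math: 90, science: 85, history: 78]
def firstHigh = scores.find { subject, score -> score > 80 }
println firstHigh    //math=90, the first entry matching the condition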
findAll() Method

The findAll() method is used to search for all the key-value pairs in a map that match a given condition. It returns a new map containing all the key-value pairs from the original map that match the condition. 

findAll method in Groovy maps
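
And the corresponding findAll() sketch:

def scores = [math: 90, science: 85, history: 78]
def highScores = scores.findAll { subject, score -> score > 80 }
println highScores    //[math:90, science:85]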

Some More Map Methods

A few more map methods are

collect(): Iterates over each key-value pair in the map and executes a closure that transforms each pair into a new value, returning a collection of the results.

def originalMap = [a: 1, b: 2, c: 3]
def newMap = originalMap.collect { key, value -> ["${key}", value * 2] }
println "Original map: ${originalMap}"
println "New map: ${newMap}"

inject(): Iterates over each key-value pair in the map and accumulates a value by executing a closure on each pair.

//use inject to calculate the sum of all values in a map
def map = [a: 1, b: 2, c: 3]
def sum = map.inject(0) { acc, key, value ->
    acc + value
}
println "Sum of values: ${sum}"

get(): Returns the value for the given key, or null if the key is not found (an overload lets you supply a default value).

def map = [a: 1, b: 2, c: 3]
def value = map.get("b")
println "Value of 'b' is: ${value}"

put(): Adds a new key-value pair to the map or updates an existing one.

def map = [a: 1, b: 2, c: 3]
map.put("d", 4)
map.put("b", 5)
println "Map after using the put method: ${map}"

remove(): Removes a key-value pair from the map for a given key.

def map = [a: 1, b: 2, c: 3]
map.remove("b")
println "Map after removing 'b': ${map}"

containsKey(): Returns true if the map contains the given key, false otherwise.

def map = [a: 1, b: 2, c: 3]
def hasB = map.containsKey("b")
def hasD = map.containsKey("d")
println "Map contains 'b': ${hasB}"
println "Map contains 'd': ${hasD}"

containsValue(): Returns true if the map contains the given value, false otherwise.

def map = [a: 1, b: 2, c: 3]
def has2 = map.containsValue(2)
def has4 = map.containsValue(4)
println "Map contains value 2: ${has2}"
println "Map contains value 4: ${has4}"

Using Groovy Maps in Exalate Scripting

There are a lot of use cases for Maps in Exalate. You can map issue types, priorities, statuses, etc between two systems and perform some sync operations on them. 

/*We have mapped the different statuses in a map variable called statusMap. 
Based on the value of the remote instance's status value, the 
corresponding correct local status is fetched and assigned to the entity. */

def statusMap = [
    "Done": "Resolved",
    "In Progress": "In Action"
]
def remoteStatusName = replica.status.name
issue.setStatus(statusMap[remoteStatusName] ?: remoteStatusName)

Arrays

An array is a fixed-size collection of items of the same data type. So you can create an array of integers, longs, or strings. 

You must use square brackets [] to create an array, just like with lists. The only difference is that the data type declaration is compulsory in arrays. You can also create arrays with the new keyword. 

//an array of integers
int[] array = [1, 2, 3, 4, 5]

//use the new keyword to create arrays
def array1 = new int[5]

//accessing the first element of the array
//the indices start at 0
def array2 = [1, 2, 3, 4, 5]
def firstElement = array2[0]

//you can even use loops to manipulate arrays
def array3 = [1, 2, 3, 4, 5] as int[]
for (int i = 0; i < array3.length; i++) {
    println array3[i]
}

You can use the functions we discussed, such as collect, findAll, each, inject, and more, with arrays as well. 

Chapter 10: Groovy Scripting Regular Expressions

Groovy supports regular expressions through the use of the java.util.regex package. This package allows you to create, manipulate and search for regular expressions through inbuilt classes and methods. 

A regular expression is a pattern that defines a set or subset of strings. You can use regular expressions in a variety of ways for different purposes. For instance, find all instances of a particular word, phrase, or pattern in a large block of text, extract data from strings, and even replace a certain block of text with another block. 

Regular expressions in Groovy can be denoted with a /…/, where the dots represent the pattern. For instance, the regular expression /world/ matches the string “world” wherever it occurs. 

To search for a regular expression within a string, you can use the =~ operator. 

You can also use the ==~ operator to match a regular expression against a string and return true or false. 
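
A minimal sketch of both operators (the strings and patterns are illustrative):

def text = "hello world"

def matcher = text =~ /world/    //=~ returns a java.util.regex.Matcher
println matcher.find()           //true: the pattern occurs somewhere in the string

println "world" ==~ /w.rld/      //true: ==~ requires the whole string to match
println text ==~ /world/         //false: the whole string is more than just "world"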

You can use various special characters in regular expressions to create and match complex patterns. The most common ones are:

  • . : matches any character except a newline
  • * : matches the preceding character 0 or more times
  • + : matches the preceding character 1 or more times
  • ? : matches the preceding character zero or one time
  • \d : matches any digit (0-9)
  • \s : matches any whitespace character (space, tab, newline, etc.)
  • \w : matches any word character (letter, digit, underscore)

You can also use character classes to match a set of characters. For instance, [aeiou] matches any vowel, whereas [a-z] matches any lowercase letter. 

You can also include grouping or alternation of characters. To group, use parentheses, and for alternation, use the pipe (|) character. For instance, the regular expression /(hello|world)/ matches either “hello” or “world”. 

Consider an example where you want to search for all the strings that match the pattern of an email address. 

def text = "Please contact us at [email protected] or [email protected]"
def emailRegex = /\b[\w.%-]+@[\w.-]+\.[a-zA-Z]{2,4}\b/

def matcher = (text =~ emailRegex)

matcher.each { match ->
    println "Found email address: ${match[0]}"
}

Groovy supports a lot of inbuilt methods to work with regular expressions, such as find, findAll, replaceAll, split, etc. 

Regular expressions can be complex and difficult to read, so it’s important to use them with caution. They can also be computationally expensive, so make sure you consider their performance in performance-intensive code or while dealing with large strings. 

Phew! That’s a lot of coding we already learned. 

But during this coding journey, have you encountered an error message that left you puzzled? 

The next step is to learn exception handling in Groovy scripting; so you don’t crash your program right away (hopefully never, fingers crossed).

Using Groovy Regular Expressions in Exalate Scripting

Regular expressions allow you fine-grained control over matching patterns within strings. 

This can be particularly useful when you need to send information between two applications based on specific sub-string or pattern matches. 

One example of regular expressions (Regex) in Exalate is when you want to sync only selected comments that match a filter. 

Let’s say you only want to send those comments that include the word ‘[SEND]’ in the comment text. Any other comments should not be sent. 

replica.comments = issue.comments.findAll {it.body =~ /\[SEND:\]*/ }

Chapter 11: Groovy Exception Handling

Programs crash all the time, and the only way for them to recover is to handle exceptions gracefully. 

Exceptions are errors or events that occur during the execution of a program causing it to behave in unexpected ways. These exceptions can occur due to various reasons, such as file i/o errors, invalid input, wrong program logic, network errors, etc. 

Groovy supports “try-catch” blocks to handle exceptions. The “try” block includes the code that might throw an exception, and the “catch” block contains the code to handle the exception. 

When an exception occurs in the try block, the code execution stops in that block and the program jumps to execute the code in the “catch” block. 

An example of a try-catch block: 

Groovy try catch block
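
The screenshot isn’t reproduced here; a minimal sketch of the example described below would be:

try {
    def result = 10 / 0        //division by zero
    println result
} catch (ArithmeticException e) {
    println "Cannot divide by zero: ${e.message}"
} finally {
    println "This block always runs"
}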

As seen in the example, the try block attempts to divide a number by 0, resulting in an ArithmeticException. The catch block catches the exception and prints out the required message. The “finally” block is optional and is executed regardless of whether the exception occurs or not. 

You can have multiple catch blocks to catch different types of exceptions. 

In addition to the general try-and-catch block, you can also throw your own exceptions using the “throw” keyword. It allows you to create custom exceptions and handle them in a manner you deem fit. 

Exception handling is important for any programming language to write more reliable and robust code. 

Chapter 12: Groovy Testing Framework: the Assert Statement

The Groovy programming language is loaded with awesome features that are super handy for test-driven development. Yeah, it’s true!

When it comes to writing tests and making sure your code is rock solid, Groovy has got your back. It offers a bunch of cool features and state-of-the-art testing libraries and frameworks that have proven to be valuable in the world of test-driven development. 

One such feature is the “assert” keyword. 

An assertion (or an assert statement) is a nifty tool that lets you test your assumptions about your program. Let’s say you’re working on a method to determine whether an individual is eligible to vote. With assertions, you can make sure the age of the person is always 18 or above. 

Basically, you create an assertion with a statement that should be true when it is executed. If the statement turns out to be false, the system will throw an error your way. By double-checking that the statement is true, the assert keyword gives you that extra confidence that your program is error-free and behaving just as you expect it to.

Here’s the cool part: writing assertions while you’re coding is like a superpower to finding and fixing bugs.

So how do you use the assert keyword in Groovy? 

There are two forms in which you can use the assert keyword. 

The first form is:

assert Expression1

Where Expression1 is just a fancy term for a Boolean expression. When your program hits this assertion, it checks if Expression1 is true. If the expression is true, it continues executing the next statement in the program and doesn’t print anything. If it’s false, then an AssertionError is thrown. 

And the second form of the assert statement is:

assert Expression1: Expression2

Where Expression1 is still our older Boolean expression and Expression2 is something that actually has a value. It can’t be something like a void method call. Bummer!
But here’s where it gets interesting. You can use this second form of the assert statement to display a super cool detailed message for the AssertionError. To do this, the program will grab the value of Expression2 and use it as an error message. That way, you’ll get more details about what went wrong in your assertion. Pretty handy, right?

Consider the following code:

def age = 15
assert age >= 18: "Age should be 18 or above"

If the age isn’t 18 or above, you’d be shown a pretty direct error message as to why your assertion failed. 

The detailed message is all about capturing and sharing the details of the assertion failure. It can help you diagnose and fix the error that made the assertion go kaput. It isn’t catered toward regular users but is to be viewed alongside your stack trace and the source code. So you don’t have to worry about making it more understandable for the general public.

Chapter 13: JSON Handling in Groovy

JSON stands for JavaScript Object Notation and is a lightweight format for storing and transporting data. 

It’s a popular way to represent data in a human-readable format. It consists of data in the form of key-value pairs, as we saw with Groovy Maps. 

JSON formatting in Groovy is extremely useful. It simplifies the handling and manipulation of JSON data. It enhances the capabilities of Groovy when working with JSON-based technologies and facilitates data exchange, configuration management, testing, and more. 

In Groovy, you can work with JSON easily because it has in-built support for handling JSON data.

JsonBuilder Class

If you have your data handy, Groovy can convert it into JSON using the JsonBuilder class. You can start using this class by creating its object and using its methods to build your JSON structure.

JSON Builder class
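
The screenshot isn’t reproduced here; a minimal sketch of the JsonBuilder usage (field names and values are illustrative) could be:

import groovy.json.JsonBuilder

def builder = new JsonBuilder()
builder {
    name "Alice"
    age 30
    active true
}

println builder.toPrettyString()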

Note: toPrettyString() is optional here. It’s only used to add some indentation and line breaks to make the JSON more readable. 

JsonSlurper Class

JsonSlurper is another fantastic class in Groovy that makes working with JSON data a breeze. It allows you to parse JSON strings and convert them into Groovy objects that you can easily manipulate and access. 

JSONSlurper class
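
The screenshot isn’t reproduced here; a minimal sketch of the JsonSlurper usage described below (the JSON string is illustrative) could be:

import groovy.json.JsonSlurper

def jsonSlurper = new JsonSlurper()
def jsonObject = jsonSlurper.parseText('{"name": "Alice", "age": 30}')

println jsonObject.name    //Alice
println jsonObject.age     //30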

In the example above, we create an object of the JsonSlurper class and call the parseText method of that class. We pass the JSON string to the method. 

All you need to do now is to access individual elements of the JSON object using the dot (.) operator. So you can access the name via jsonObject.name. 

And there you have it! 

With JsonSlurper, you can parse JSON strings and work on the data as Groovy objects. The JsonSlurper class also has a lot of other helpful methods like parse(File file) to parse the JSON data structure given within a file.

JSON Formatting Using Exalate

We saw earlier that a replica is the payload passed from one system to another. It contains the data and the metadata in the JSON format. 

There are two replicas per platform: the local replica and the remote replica.

Let me explain how Exalate accesses and works with the replica with an example. 

Let’s sync the priority field from ServiceNow to Jira.  If you view the replica on the ServiceNow instance, the local replica is the one in ServiceNow, and the remote replica will be the one on the Jira side. Similarly, the replicas will be interchanged if you view them in the Jira instance. 

The replica on the ServiceNow instance looks like this:

Replica in ServiceNow

The image shows the hubIssue (aka the replica) with all the incident fields. 

Now, our use case is to sync the priority from ServiceNow to Jira. 

The first thing you must do is send the priority information in the “Outgoing sync” in ServiceNow. 

ServiceNow Outgoing Sync

The following incident information is sent from ServiceNow to Jira. You can see the priority being sent too.

Sync rules in ServiceNow

Now, if you check the replica details, you can get the priority information in priority.name tag. 

Jira Incoming Sync

Accordingly, the “Incoming sync” will be used to get the priority details in Jira.

def priorityMapping = [
    // Snow incident priority <-> Jira issue priority
    "1 - Critical": "Highest",
    "2 - High": "High",
    "3 - Moderate": "Medium",
    "4 - Low": "Low",
    "5 - Planning": "Lowest"
]

// set a default priority in case the proper priority could not be found
def defaultPriority = "Low"
def priorityName = priorityMapping[replica.priority?.name] ?: defaultPriority
issue.priority = nodeHelper.getPriority(priorityName)

Here, we have mapped the priorities in ServiceNow to the priorities in Jira. After that, the issue priority in Jira is assigned based on the mapping.

Chapter 14: Groovy Scripts Networking 

You can create a host of networked applications using the powerful set of networking features that Groovy supports. 

Networking in Groovy is built on top of Java’s networking APIs. So if you are familiar with Java’s networking concepts, this one should be easy for you. 

Some of the key features of Groovy networking include: 

  • Support for HTTP/HTTPS clients: Groovy provides an HTTP(S) client library, making it easy to send requests and receive responses. This library supports both HTTP and HTTPS and allows you to set cookies, headers, and other parameters. 
  • Socket programming: You can create and manage sockets easily with Groovy. Sockets are endpoints for communication between two systems over a network. You can create both client and server-side sockets and use them to send and receive data. 
  • URL processing: You can use a rich set of classes to work with URLs. You can create and manipulate URLs, parse query parameters, and extract information from the URL. 
  • DNS lookup: Groovy provides classes for DNS lookups. 
  • Email handling: You can use the JavaMail API to send email messages using SMTP, POP3, and IMAP protocols. 

Groovy provides you with powerful networking features that make it easy to create and work with networked applications and get your job done. 

Using Groovy Networking Concepts in Exalate

Since Exalate supports Groovy-based scripts to extract, transform, and exchange information between multiple platforms, it can use Groovy’s networking features in many ways. 

A common example is making HTTP client requests to exchange information with external services.

You can also perform URL processing using external scripts in Jira Cloud.

So are you ready to explore the infinite possibilities with Groovy scripting and be amazed at what you can achieve? 

Whether you are a seasoned developer or just getting started, there’ll always be something Groovy has to offer.

Chapter 15: Best Practices and Tips for Groovy Scripting Development

  • We can’t stress enough the flexibility that Groovy offers. Use the cool features it supports, like closures, dynamic typing, the safe navigation operator, and other in-built methods like find(), each(), etc. Use them fully and wisely to get the best out of Groovy. 
  • Groovy allows operator overloading for +, -, *, /, and %. This can be used to create domain-specific languages (DSLs).
  • Groovy supports the @Delegate annotation that allows you to delegate method calls to another object. This can be useful to create adapters or for providing a simpler interface to complex objects. 
  • Groovy allows you to modify the behavior of objects at run-time using metaprogramming. You can use it to create dynamic DSLs or for adding some other behavior to objects at run-time. 

We have been working with the Groovy console for quite some time now. It’s a fantastic tool to quickly test out code snippets and a perfect way to play around with Groovy and learn more about the language. You’ll love how easy it is to use. Give it a try and see for yourself. 

Chapter 16: Most Popular Exalate Scripts

1. Transformers – Converting HTML to Wiki and Others

The most common example of Groovy scripts in Exalate is transformers. These transformers convert information from one specific format to another, so that it is understood by the destination instance. 

For instance, every application uses a different text format: Jira uses Wiki markup; Azure DevOps, Salesforce, and ServiceNow use HTML; GitHub and Zendesk use Markdown. Transformers help convert the HTML format to Wiki, the Markdown format to Wiki, or the Wiki format to HTML. 

Following are the different transformers that can be implemented via Exalate scripting: 

1.1. HTML to Wiki

1.2. Markdown to Wiki

1.3. Wiki to HTML

2. Syncing User Mentions, Rich Text, and Inline Images

2.1: Syncing User Mentions

It’s common knowledge that team members often tag (or mention) each other in comments for various reasons. 

You can use Exalate scripts to sync user mentions in comments between systems like Jira, Azure DevOps, Salesforce, etc. 

The following scripts would do the magic.

Note: The only pre-requisite for this use case is that the user property (like email IDs) should be the same in both systems. 

Azure DevOps Incoming Sync For User Mentions

String start1="#exalate_comment#"
String end1="#exalate_comment_end#"
for(comment in replica.addedComments)
{
  def matcher  = comment.body =~ /(?<=#exalate_comment#).*?(?=#exalate_comment_end#)/
 
 matcher.each {
 x->
   def userId=nodeHelper.getUserByEmail(x,"Project_key")?.key
   if(userId){
     def string = "<a href=\"#\" data-vss-mention=\"version:2.0,"+userId+"\"></a>"
     def test = comment.body.replaceAll(start1+ x + end1,string)
     comment.body = test
   }    
 }
}
Azure DevOps Outgoing Sync for User Mentions

def newComment
def allComments = workItem.comments.collect {
    comment ->
def comment1=comment.body
def matcher  = comment1 =~ /(?<=data-vss-mention="version:2.0,).*?(?=\")/
matcher.each {
 x->
def userId=nodeHelper.getUser(x,"project_key")?.email
 if (userId)
{
   def matcher1  = comment =~ /  <a href="#" data-vss-mention="version:2.0,${x}.*?<\/a> /
 
matcher1.each{
    y->
    comment1=comment1.replaceAll(y,"[~accountid:"+userId+"]")
  }
}
         
}
    comment.body=comment1
    comment
}
replica.comments   	= nodeHelper.stripHtmlFromComments(allComments)


Jira Incoming Sync For User Mentions
for (comment in replica.addedComments) {
    def newCommentBody = comment.body
    def matcher = comment.body =~ /\[~accountid:([a-zA-Z0-9+._-]+@[a-zA-Z0-9._-]+\.[a-zA-Z0-9_-]+)\]/
    matcher.each { x ->
        def target = nodeHelper.getUserByEmail(x[1])?.key ?: x[1]
        newCommentBody = newCommentBody.replace(x[1], target)
    }
    comment.body = newCommentBody
}
 
def addedComments = commentHelper.mergeComments(issue, replica)

Jira Outgoing Sync For User Mentions
String start1="#exalate_comment#"
String end1="#exalate_comment_end#"
replica.comments = issue.comments.collect {
comment ->
      
    def matcher  = comment.body =~ /\[~accountid:([\w:-]+)\]/
    def newCommentBody = comment.body
      
    matcher.each {
     target = nodeHelper.getUser(it[1])?.email ?: "Stranger"         
     target = start1+target+end1
     newCommentBody = newCommentBody.replace(it[0],target)
    }
  
    comment.body = newCommentBody
    comment
}



The details of what happens behind the scenes in this use case can be found here

You can also sync user mentions from Jira Cloud comments to the Salesforce Chatter feed.

2.2 Syncing Rich Text and Inline Images

Another common requirement is to handle rich-text and inline images and sync them correctly over to the destination instance. 

We’ll consider Jira and Azure DevOps for this use case as well. 

You can use the following code.

Jira Outgoing Sync For Rich Text and Inline Images
replica.description    = nodeHelper.getHtmlField(issue, "description")


Azure DevOps Incoming Sync For Rich Text and Inline Images

if(firstSync){
   // Set type name from source entity, if not found set a default
   workItem.projectKey  =  "Demo"
   workItem.typeName = "Task"
}


workItem.summary      = replica.summary

workItem.attachments  = attachmentHelper.mergeAttachments(workItem, replica)
workItem.comments     = commentHelper.mergeComments(workItem, replica)
workItem.labels       = replica.labels

def await = { f -> scala.concurrent.Await$.MODULE$.result(f, scala.concurrent.duration.Duration.apply(1, java.util.concurrent.TimeUnit.MINUTES)) }
def creds = await(httpClient.azureClient.getCredentials())
def issueTrackerUrl = creds.issueTrackerUrl()

def processInlineImages = { str ->

    def processLtGtTags = {
        def counter = 0
        while (counter < 1000) {
              def matcher = (str =~ /<!-- inline image filename=#(([^#]+)|(([^#]+)#([^#]+)))# -->/)
       
            if (matcher.size() < 1) {
                break;
            }
            def match = matcher[0]
           
            if (match.size() < 2) {
                break;
            }
            def filename = match[1]
            def attId = workItem.attachments.find { it.filename?.equals(filename) }?.idStr
     
            if (!attId) {
                log.error("""Could not find attachment with name ${filename},
           known names: ${replica.attachments.filename},
           match: ${replica.attachments.find { it.filename?.equals(filename) }}
       """)
                str = str.replace(match[0], """""".toString())
          
                
            } else {
                def tmpStr = str.replace(match[0], """""".toString())
              
                if (tmpStr == str) {
                    break;
                }
                str = tmpStr
            
            }
            counter++
        }
        
        str
        
    }
    def processNoImage = {
    
        def counter = 0
        while (counter < 1000) {
            def matcher = (str =~ //)
            if (matcher.size() < 1) {
                break;
            }
            def match = matcher[0]
            if (match.size() < 2) {
                break;
            }
            def filename = match[2]
            def attId = workItem.attachments.find { it.filename?.equals(filename) }?.idStr
            if (!attId) {
                log.error("""Could not find attachment with name ${filename},
           known names: ${replica.attachments.filename},
           match: ${replica.attachments.find { it.filename?.equals(filename) }}
       """)
                str = str.replace(match[0], """""".toString())
            } else {
              
           
                def tmpStr = str.replaceAll(match[0], """""".toString())
                if (tmpStr == str) {
                    break;
                }
                str = tmpStr
            }
            counter++
        }
        str
       
    }
    if (str == null) {
        return null
    }
    str = processLtGtTags()
    str = processNoImage()
 
    log.error("#processimages $str")
    str
}
 
String value = processInlineImages(replica.description)
workItem.description=value



3. Syncing a Parent-Child Relationship

Agile and project management systems like Jira and Azure DevOps often have entities that have a parent-child relationship with one another. The relationship can also have multiple levels of hierarchy. 

We’ll discuss two examples of how Exalate maintains a parent-child relationship. 

The relationship in Azure DevOps is Epic → Feature → Task. The same needs to be mirrored as Story → Task → Bug in Jira on-premise. You can see behind the scenes of this use case in this community post.

Azure DevOps Outgoing Sync
replica.parentId = workItem.parentId
 
def res = httpClient.get("/_apis/wit/workitems/${workItem.key}?\$expand=relations&api-version=6.0",false)
if (res.relations != null){
    replica."relation" = res.relations[0].attributes.name
    replica."relationid" = (res.relations[0].url).tokenize('/')[7]
    }

Jira Incoming Sync
import com.atlassian.jira.issue.link.IssueLinkManager
import com.atlassian.jira.component.ComponentAccessor
import com.atlassian.jira.security.JiraAuthenticationContext
import com.atlassian.jira.issue.link.IssueLinkTypeManager
import com.atlassian.jira.issue.link.IssueLinkType
import org.slf4j.Logger
 
class LogIn {
 
        static logIn(u) {
            def authCtx = com.atlassian.jira.component.ComponentAccessor.getJiraAuthenticationContext()
            try {
                //Jira 7
                authCtx.setLoggedInUser(u)
            } catch (Exception ignore) {
                // Jira 6
                //noinspection GroovyAssignabilityCheck
                authCtx.setLoggedInUser(u.getDirectoryUser())
            }
        }
 
        static <R> R tryLogInFinallyLogOut(Closure<R> fn) {
            def authCtx = com.atlassian.jira.component.ComponentAccessor.getJiraAuthenticationContext()
            def proxyAppUser = getProxyUser()
            def loggedInUser = authCtx.getLoggedInUser()
            try {
                logIn(proxyAppUser)
                fn()
            } finally {
                logIn(loggedInUser)
            }
        }
 
        static getProxyUser() {
            def nserv = com.atlassian.jira.component.ComponentAccessor.getOSGiComponentInstanceOfType(com.exalate.api.node.INodeService.class)
            nserv.proxyUser
        }
 
    }
class CreateIssue {
 
        static def log = org.slf4j.LoggerFactory.getLogger("com.exalate.scripts.Epic")
 
        private static def doCreate = {
            com.exalate.basic.domain.hubobject.v1.BasicHubIssue replica,
            com.exalate.basic.domain.hubobject.v1.BasicHubIssue issue,
            com.exalate.api.domain.request.ISyncRequest syncRequest,
            com.exalate.node.hubobject.v1_3.NodeHelper nodeHelper,
            com.exalate.basic.domain.hubobject.v1.BasicHubIssue issueBeforeScript,
            com.exalate.api.domain.INonPersistentReplica remoteReplica,
            List traces,
            List blobMetadataList,
            Logger log ->
                def firstSync = com.exalate.processor.jira.JiraCreateIssueProcessor.createProcessorContext.get() == true
                def issueLevelError = { String msg ->
                    new com.exalate.api.exception.IssueTrackerException(msg)
                }
                def issueLevelError2 = { String msg, Throwable c ->
                    new com.exalate.api.exception.IssueTrackerException(msg, c)
                }
                def toExIssueKey = { com.atlassian.jira.issue.MutableIssue i ->
                    new com.exalate.basic.domain.BasicIssueKey(i.id, i.key)
                }
                final def authCtxInternal = com.atlassian.jira.component.ComponentAccessor.getJiraAuthenticationContext()
                final def imInternal = com.atlassian.jira.component.ComponentAccessor.issueManager
                final def umInternal = com.atlassian.jira.component.ComponentAccessor.userManager
                final def nservInternal = com.atlassian.jira.component.ComponentAccessor.getOSGiComponentInstanceOfType(com.exalate.api.node.INodeService.class)
 
                final def hohfInternal2 = com.atlassian.jira.component.ComponentAccessor.getOSGiComponentInstanceOfType(com.exalate.api.hubobject.IHubObjectHelperFactory.class)
                //noinspection GroovyAssignabilityCheck
                final def hohInternal2 = hohfInternal2.get(remoteReplica.payload.version)
 
                if (issue.id != null) {
                    def existingIssue = imInternal.getIssueObject(issue.id as Long)
                    if (existingIssue != null) {
                        return [existingIssue, toExIssueKey(existingIssue)]
                    }
                }
 
                def proxyAppUserInternal = nservInternal.getProxyUser()
 
                def loggedInUser = authCtxInternal.getLoggedInUser()
 
                log.debug("Logged user is " + loggedInUser)
 
                def reporterAppUser = null
                if (issue.reporter != null) {
                    reporterAppUser = umInternal.getUserByKey(issue.reporter?.key)
                }
                reporterAppUser = reporterAppUser ?: proxyAppUserInternal
 
                issue.project = issue.project ?: ({ nodeHelper.getProject(issue.projectKey) })()
                issue.type = issue.type ?: ({ nodeHelper.getIssueType(issue.typeName) })()
 
                def jIssueInternal = null
                try {
                    LogIn.logIn(reporterAppUser)
 
                    if (issue.id != null) {
                        def existingIssue = imInternal.getIssueObject(issue.id as Long)
                        if (existingIssue != null) {
                            issue.id = existingIssue.id
                            issue.key = existingIssue.key
                            return [existingIssue, toExIssueKey(existingIssue)]
                        }
                    }
                    def cir
                    try{
                        cir = hohInternal2.createNodeIssueWith(issue, hohInternal2.createHubIssueTemplate(), null, [:], blobMetadataList, syncRequest)
                    } catch (MissingMethodException e){
                        cir = hohInternal2.createNodeIssueWith(issue, hohInternal2.createHubIssueTemplate(), null, [:], blobMetadataList, syncRequest.getConnection())
                    }
 
                    def createdIssueKey = cir.getIssueKey();
 
                    jIssueInternal = imInternal.getIssueObject(createdIssueKey.id)
 
                    if (issue.id != null) {
                        def oldIssueKey = jIssueInternal.key
                        def oldIssueId = jIssueInternal.id
                        try {
                            jIssueInternal.key = issue.key
                            jIssueInternal.store()
                        } catch (Exception e) {
                            log.error("""Failed to sync issue key: ${e.message}. Please contact Exalate Support. Deleting issue `$oldIssueKey` ($oldIssueId)""".toString(), e)
                            imInternal.deleteIssue(proxyAppUserInternal, jIssueInternal as com.atlassian.jira.issue.Issue, com.atlassian.jira.event.type.EventDispatchOption.ISSUE_DELETED, false)
                        }
                    }
 
                    issue.id = jIssueInternal.id
                    issue.key = jIssueInternal.key
 
                    return [jIssueInternal, toExIssueKey(jIssueInternal)]
                } catch (com.exalate.api.exception.IssueTrackerException ite) {
                    if (firstSync && jIssueInternal != null) {
                        imInternal.deleteIssue(proxyAppUserInternal, jIssueInternal as com.atlassian.jira.issue.Issue, com.atlassian.jira.event.type.EventDispatchOption.ISSUE_DELETED, false)
                    }
                    throw ite
                } catch (Exception e) {
                    if (firstSync && jIssueInternal != null) {
                        imInternal.deleteIssue(proxyAppUserInternal, jIssueInternal as com.atlassian.jira.issue.Issue, com.atlassian.jira.event.type.EventDispatchOption.ISSUE_DELETED, false)
                    }
                    throw issueLevelError2("""Failed to create issue: ${
                        e.message
                    }. Please review the script or contact Exalate Support""".toString(), e)
                } finally {
                   LogIn.logIn(loggedInUser)
                }
        }
 
        /**
         * @param whenIssueCreatedFn - a callback closure executed after the issue has been created
         * */
        static com.exalate.basic.domain.BasicIssueKey create(
                com.exalate.basic.domain.hubobject.v1.BasicHubIssue replica,
                com.exalate.basic.domain.hubobject.v1.BasicHubIssue issue,
                com.exalate.api.domain.request.ISyncRequest syncRequest,
                com.exalate.node.hubobject.v1_3.NodeHelper nodeHelper,
                com.exalate.basic.domain.hubobject.v1.BasicHubIssue issueBeforeScript,
                com.exalate.api.domain.INonPersistentReplica remoteReplica,
                List traces,
                List blobMetadataList,
                Closure whenIssueCreatedFn) {
 
            def firstSync = com.exalate.processor.jira.JiraCreateIssueProcessor.createProcessorContext.get() == true
            def (_jIssue, _exIssueKey) = doCreate(replica, issue, syncRequest, nodeHelper, issueBeforeScript, remoteReplica, traces, blobMetadataList, log)
            com.atlassian.jira.issue.MutableIssue jIssue = _jIssue as com.atlassian.jira.issue.MutableIssue
            com.exalate.basic.domain.BasicIssueKey exIssueKey = _exIssueKey as com.exalate.basic.domain.BasicIssueKey
            try {
 
                whenIssueCreatedFn()
                UpdateIssue.update(replica, issue, syncRequest, nodeHelper, issueBeforeScript, traces, blobMetadataList, jIssue, exIssueKey)
            } catch (Exception e3) {
                final def imInternal = com.atlassian.jira.component.ComponentAccessor.issueManager
                final def nservInternal2 = com.atlassian.jira.component.ComponentAccessor.getOSGiComponentInstanceOfType(com.exalate.api.node.INodeService.class)
                def proxyAppUserInternal = nservInternal2.getProxyUser()
                if (firstSync && _jIssue != null) {
                    imInternal.deleteIssue(proxyAppUserInternal, _jIssue as com.atlassian.jira.issue.Issue, com.atlassian.jira.event.type.EventDispatchOption.ISSUE_DELETED, false)
                }
                throw e3
            }
            return exIssueKey
        }
    }
// Applies the processed replica values to the just-created local issue via Exalate's hub object helper.
class UpdateIssue {
 
        private static def log = org.slf4j.LoggerFactory.getLogger("com.exalate.scripts.Epic")
 
        private static def doUpdate = { com.exalate.basic.domain.hubobject.v1.BasicHubIssue replica,
                                        com.exalate.basic.domain.hubobject.v1.BasicHubIssue issue,
                                        com.exalate.api.domain.request.ISyncRequest syncRequest,
                                        com.exalate.node.hubobject.v1_3.NodeHelper nodeHelper,
                                        com.exalate.basic.domain.hubobject.v1.BasicHubIssue issueBeforeScript,
                                        List traces,
                                        List blobMetadataList,
                                        com.atlassian.jira.issue.MutableIssue jIssue,
                                        com.exalate.basic.domain.BasicIssueKey exIssueKey ->
            try {
                final def hohfInternal2 = com.atlassian.jira.component.ComponentAccessor.getOSGiComponentInstanceOfType(com.exalate.api.hubobject.IHubObjectHelperFactory.class)
                //noinspection GroovyAssignabilityCheck
                final def hohInternal2 = hohfInternal2.get("1.2.0")
                final def nservInternal2 = com.atlassian.jira.component.ComponentAccessor.getOSGiComponentInstanceOfType(com.exalate.api.node.INodeService.class)
 
                def proxyAppUserInternal2 = nservInternal2.getProxyUser()
 
                log.info("performing the update for the issue `" + jIssue.key + "` for remote issue `" + replica.key + "`")
                //finally create all
                def fakeTraces2 = com.exalate.util.TraceUtils.indexFakeTraces(traces)
                def preparedIssue2 = hohInternal2.prepareLocalHubIssueForApplication(issueBeforeScript, issue, fakeTraces2)
                //@Nonnull IIssueKey issueKey, @Nonnull IHubIssueReplica hubIssueAfterScripts, @Nullable String proxyUser, @Nonnull IHubIssueReplica hubIssueBeforeScripts, @Nonnull Map traces, @Nonnull List blobMetadataList, IRelation relation
                def resultTraces2
                try{
                    resultTraces2 = hohInternal2.updateNodeIssueWith(exIssueKey, preparedIssue2, proxyAppUserInternal2.key, issueBeforeScript, fakeTraces2, blobMetadataList, syncRequest)
                } catch (MissingMethodException e){
                    resultTraces2 = hohInternal2.updateNodeIssueWith(exIssueKey, preparedIssue2, proxyAppUserInternal2.key, issueBeforeScript, fakeTraces2, blobMetadataList, syncRequest.getConnection())
                }
                traces.clear()
                traces.addAll(resultTraces2 ?: [])
                new Result(issue, traces)
            } catch (com.exalate.api.exception.IssueTrackerException ite2) {
                throw ite2
            } catch (Exception e2) {
                throw new com.exalate.api.exception.IssueTrackerException(e2.message, e2)
            }
        }
 
        static Result update(com.exalate.basic.domain.hubobject.v1.BasicHubIssue replica,
                             com.exalate.basic.domain.hubobject.v1.BasicHubIssue issue,
                             com.exalate.api.domain.request.ISyncRequest syncRequest,
                             com.exalate.node.hubobject.v1_3.NodeHelper nodeHelper,
                             com.exalate.basic.domain.hubobject.v1.BasicHubIssue issueBeforeScript,
                             List traces,
                             List blobMetadataList,
                             com.atlassian.jira.issue.MutableIssue jIssue,
                             com.exalate.basic.domain.BasicIssueKey exIssueKey) {
            doUpdate(replica, issue, syncRequest, nodeHelper, issueBeforeScript, traces, blobMetadataList, jIssue, exIssueKey)
        }
 
        static class Result {
            com.exalate.basic.domain.hubobject.v1.BasicHubIssue issue
            List traces
 
            Result(com.exalate.basic.domain.hubobject.v1.BasicHubIssue issue, java.util.List traces) {
                this.issue = issue
                this.traces = traces
            }
        }
    }
     
// Links the current issue to the local counterpart of the remote parent (or related) issue.
// Returns 1 when the remote parent has not been synced locally yet, 0 otherwise.
int createIssueLink(){
 if (replica.parentId || replica."relation"){
    def parentLinkExists = false
    if (replica.parentId)
        parentLinkExists = true
    def localParentKey = nodeHelper.getLocalIssueKeyFromRemoteId(replica.parentId ?: replica?."relationid" as Long, "issue")?.key
    if (localParentKey==null) return 1
    final String sourceIssueKey = localParentKey
    final String destinationIssueKey = issue.key
     
    def linkTypeMap = [
        "Parent" : "Relates",
        "Duplicate" : "Duplicate"
        ]
    String issueLinkName
 
    if (!parentLinkExists)
        issueLinkName = linkTypeMap[replica."relation"]
    else
        issueLinkName = "Blocks"
 
    final Long sequence = 1L
 
    def loggedInUser = ComponentAccessor.jiraAuthenticationContext.loggedInUser
    def issueLinkTypeManager = ComponentAccessor.getComponent(IssueLinkTypeManager)
    def issueManager = ComponentAccessor.issueManager
     
    def sourceIssue = issueManager.getIssueByCurrentKey(sourceIssueKey)
    def destinationIssue = issueManager.getIssueByCurrentKey(destinationIssueKey)
 
    def availableIssueLinkTypes = issueLinkTypeManager.issueLinkTypes
     
    int i, f = 999
    // NOTE: the original snippet was truncated at this point; the lines below are a best-effort
    // reconstruction and may need to be adapted to your own link scheme.
    def linkTypes = availableIssueLinkTypes as List
    for (i = 0; i < linkTypes.size(); i++) {
        if (linkTypes[i].name == issueLinkName) {
            f = i
            break
        }
    }
    if (f == 999) return 1
    def linkType = linkTypes[f] as IssueLinkType
    ComponentAccessor.issueLinkManager.createIssueLink(sourceIssue.id, destinationIssue.id, linkType.id, sequence, loggedInUser)
 }
 return 0
}

Another use case is for Jira Cloud and Azure DevOps. 

In Jira, issue links relate issues to one another and define the kind of relationship between them. This use case picks up the issue links and their relationship types from Jira Cloud and transfers them to Azure DevOps with the help of a mapping, as sketched below.
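
As a rough sketch (not the exact scripts from this use case), the Jira Cloud outgoing sync could pass the links along, and the Azure DevOps incoming sync could translate Jira link names into work item relation names. The "issueLinks" property, the "linkName" field, and the mapping below are assumptions used to illustrate the idea; verify them against your Exalate version:

// Jira Cloud outgoing sync (sketch): send the issue links to the remote side
replica.issueLinks = issue.issueLinks

// Azure DevOps incoming sync (sketch): translate Jira link names into work item relation names
def relationMap = ["Blocks": "Predecessor", "Relates": "Related"]
replica.issueLinks?.each { link ->
    def relationName = relationMap[link.linkName] ?: "Related"
    // attach "relationName" to the work item here, using whatever helper your connection provides
}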

4. Syncing Multiple Tickets to a Single Issue Using httpClient 

There is often a need to connect multiple customer tickets to a single development issue. There are different ways to achieve this using Exalate. 

Here, we’ll use the httpClient method to sync multiple Zendesk tickets to a single Jira issue. 

Zendesk Outgoing Sync
replica.customFields."Issue to connect to" = issue.customFields."Issue to connect to"
Jira Incoming Sync
def remoteIssueUrn = replica.customFields."Issue to connect to"?.value
if(remoteIssueUrn && firstSync){
  def localIssue = httpClient.get("/rest/api/2/issue/"+remoteIssueUrn)
  if(localIssue == null) throw new com.exalate.api.exception.IssueTrackerException("Issue with key "+remoteIssueUrn+" was not found")
  issue.id = localIssue?.id
  issue.key = localIssue?.key
  return;
}
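
On the first sync, the incoming script calls Jira's REST API through httpClient to look up the issue key stored in the "Issue to connect to" field. Setting issue.id and issue.key to that existing issue's values tells Exalate to attach the incoming Zendesk ticket to it instead of creating a new issue, which is how several tickets end up pointing at the same development issue.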

5. Syncing an Insight Custom Field

You can sync custom fields created in Insight (Assets) using Exalate. 

In the following example, we'll sync an Insight custom field called "Assets" in Jira on-premise. You can implement the same for Jira Cloud as well, but the code for that is a little different. 

Jira On-premise Outgoing Sync
// SETTINGS
final def insightCustomFieldName = "Assets"
 // END SETTINGS
 
replica.customKeys."My Custom Field values as Strings" = issue.customFields[insightCustomFieldName]?.value?.collect { v ->
    def cfm = com.atlassian.jira.component.ComponentAccessor.getCustomFieldManager()
    def cf = cfm.getCustomFieldObject(issue.customFields[insightCustomFieldName].id)
    def cft = cf.getCustomFieldType()
    def vStr = cft.getStringFromSingularObject(v)
    vStr
}
Jira On-premise Incoming Sync
// SETTINGS
final def insightCustomFieldName = "Assets"
 // END SETTINGS
 
issue.customFields[insightCustomFieldName].value = replica.customKeys."My Custom Field values as Strings".collect { String vStr ->
    def cfm = com.atlassian.jira.component.ComponentAccessor.getCustomFieldManager()
    def cf = cfm.getCustomFieldObject(issue.customFields[insightCustomFieldName].id)
    def cft = cf.getCustomFieldType()
    def v = cft.getSingularObjectFromString(vStr)
    v
}

Note: Both the incoming and outgoing sync scripts work for Jira on-premise only. 
You can also sync multiple Insight custom fields with Exalate.

Conclusion

Finally, we are at the end! 

I hope you’ve enjoyed this journey through the world of Groovy and discovered just how amazing this language can be. From its ‘beauty with brevity’ syntax to its versatile features, Groovy scripting is truly a language like no other. 

Whether you’re building web applications, automating your daily tasks, or just tinkering around, Groovy scripting has something to offer for everyone. Amaze yourself and see what you can accomplish. 

Till then, feel the groove and code your heart out!

Recommended Reading:

CISCO Smart Bonding: An Introduction


Software is becoming more and more fundamental to business. No longer simply a tool, it’s the infrastructure we build our projects around. We use it to organize our teams, manage tasks and store the information that powers our work.

While there’s no shortage of specialized tools for business, you can run into problems when working with teams using different software. If you’re using incompatible platforms, then you can’t easily access each other’s data. 

The information can be copied manually, but that takes time and is prone to error. For teams to share data effectively, a better solution is needed. That's where an integration can help, and here we are, digging into CISCO Smart Bonding.


What is CISCO Smart Bonding?

CISCO Smart Bonding lets you connect your IT service management (ITSM) systems to CISCO. You can share data between items on each system, keeping them in sync.

As well as exchanging case notes, you can share diagnostic and log files, and update them from your own setup. You do this via an encrypted API that provides push/pull capabilities. Your engineers need to work with it to control which data is shared.

Note: Read all about API integration.

It’s compatible with several major platforms, including Salesforce, HP and ServiceNow. It’s also free to use. CISCO provides documentation and test scripts to help you get started with it.

Why Do You Need CISCO Smart Bonding?

CISCO Smart Bonding removes "swivel chair" inefficiencies from data sharing. Once set up, engineers no longer have to waste time entering duplicate data into CISCO's ITSM system.

With Smart Bonding, data transfer happens regularly and is error-free. If done manually, it’s the kind of task that gets overlooked, or done badly. 

A missing or inaccurate piece of data can have serious consequences for your business. Smart Bonding ensures data gets moved correctly.

With manual transfers, you also have to wait for data to be synchronized. An automated system ensures data is always updated on a regular schedule. That means your teams have the information they need at their fingertips when they need it.

Users that set up Smart Bonding report a 20% to 54% reduction in case ownership time, a clear demonstration of its benefits.

How to Get Started with Smart Bonding

Smart Bonding uses an API to push and pull encrypted data into and out of your systems. To work with it, you need to configure your ITSM system to consume CISCO APIs. That’s a lot of upfront work for your developers, but the benefits should outweigh the costs.

As well as developers, CISCO recommends assigning a project manager to manage risk and ensure sufficient resources are available for the project. It also recommends a case manager to help developers interpret business requirements when building their solutions.

The typical development process consists of the following phases:

  • Analysis – where you ensure your ITSM system is compatible with CISCO's APIs.
  • Implementation – where you connect your system to CISCO's and test information transfer.
  • Testing – where you use CISCO's test plan to ensure everything is working as expected.
  • Live – where you deploy the system to production, test it, and start working with real data.

CISCO’s team is available throughout this process to help you ensure it goes smoothly.

What to Look for in an Integration

Your integration needs to share the data you need, and do so when you need it. You may not want to share all your data, so there are several capabilities you need.

Firstly, you need to be able to control what is shared. That means specifying which fields are synced. An advanced integration can even modify or substitute some data, or make conditional decisions based on the data’s content. For example, you might want to share data from a specific region, or of a specific type.

You might also want to limit which fields are shared. If your engineers are integrating with a sales team, they might want to know about customer requests, but they don't need the customers' contact info.

Other Integration Options

Exalate is another tool for connecting software platforms. It lets you connect platforms using a no-code visual interface, or by scripting for more advanced use cases. It’s free to try out, and quick to set up.

Since it’s a ready-made system, you can have it up and running in minutes. Exalate enables teams working across different platforms, like Jira, ServiceNow, Azure DevOps, GitHub, Salesforce, and Zendesk, to stay aligned. You can sync entities between those systems using the default mappings or your own custom filters.
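
As a minimal sketch of what such a custom filter can look like in Exalate's Groovy-based Script mode (the issue type used here is illustrative, not taken from a specific connection):

// Outgoing sync (sketch): always share the summary, but only share the description for incidents
replica.summary = issue.summary
if (issue.type?.name == "Incident") {
    replica.description = issue.description
}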

Conclusion

CISCO Smart Bonding makes integrating with other teams far easier. It automates the transfer of information, meaning you can share data across disparate software platforms.

Its flexible API lets you control the data flow in and out of your systems. Once set up, you can leave it in place to do the work for you, and your teams can get on with other things.

If setting up the initial integration seems like a hurdle, then there are other tools to help you with that part of the process. Exalate provides an easy-to-use system that can share data between your platforms without any developer involvement. It can also customize your data sharing to your exact requirements.

Book a call with an integration expert to see how Exalate can help you with your specific use case.

Recommended Reads:

Backbone Issue Sync vs. Exalate: An In-Depth Comparison


Companies that use different work management systems need to find ways of integrating data smoothly without exposing their system infrastructure to security risks. That’s why integration solutions are gaining more popularity in the modern workplace.

But when it comes to choosing an integration solution for your company, you need to consider key factors like security, user-friendliness, and overall performance. 

In this guide, we’ll compare Exalate and Backbone Issue Sync (by K15t) because they have similar backgrounds. These two integration-as-a-service (IaaS) solutions will go head-to-head to determine which one provides the best value for money.

Backbone Issue Sync vs. Exalate: A Tale of 2 Integration Solutions

K15t’s Backbone Issue Sync for Jira (Backbone for short) is an integration solution that enables users to synchronize data on Jira. Backbone supports integration between Jira Cloud, Jira Server, and Jira Data Center. As an exclusive Jira sync tool, this solution hosts its core functionalities on the Atlassian infrastructure.

Exalate is an integration solution that enables bi-directional synchronization between work management systems such as Jira, Salesforce, ServiceNow, and Zendesk. This IaaS solution comes with a no-code interface for simple syncs, as well as a code-based scripting engine for advanced use cases.

Both integration solutions started as synchronization tools for Jira at around the same time. 

But since then, K15t’s Backbone Issue Sync for Jira has remained a Jira-only integration solution; it doesn’t support non-Jira platforms. Over the same period, Exalate has expanded beyond Jira-to-Jira integration and now rubs shoulders with other premium IaaS providers that offer multi-platform support.

Backbone vs. Exalate: Comparing Key Features and Attributes

Now that you know the backstory of both IaaS solutions, it’s time to compare their key features in detail. 

Security 

Backbone uses standard encryption and firewalls to protect user data from unauthorized access. To integrate projects in Jira, Backbone uses the official Jira REST API. After you set up the sync, Backbone communicates with Jira via the encrypted channel provided by the Jira setup. 

Backbone relies on Atlassian’s security features as well as PGP (Pretty Good Privacy) encryption – a mix of symmetric and asymmetric algorithms – for both file and email protocols. The algorithms are 3000-bit RSA and 128-bit AES.

Exalate verifies users with the help of JWT tokens and OAuth. It also uses a single-tenant architecture, which keeps your environment separate from other tenants, and handles incidents with the help of advanced endpoint detection and response (EDR).

Exalate also has a dedicated security team that conducts regular evaluations and penetration testing. The team also participates in a bug bounty program. 

All things considered, Exalate is more secure than Backbone.

Data Residency and Privacy

As a cloud-fortified integration, Backbone abides by Atlassian’s data residency rules. Exalate Cloud, in turn, relies on the Google Cloud data center in Belgium, so Google’s data residency policies apply there. Exalate also stores backups in the Rsync.net offline data center in Switzerland.

In terms of privacy, neither Backbone nor Exalate claims the right to store or process your data. According to Exalate’s End User License Agreement, the company will not disclose your information to any third party without express permission from you, the original owner. 

So, it is safe to say that both integration solutions value privacy and data residency laws.

Licensing and Certification

You need a valid Backbone license for each instance you’re integrating with, whether the sync is unidirectional or bidirectional. 

But there is a workaround. You can use the remote licensing option to install Backbone on only one Jira instance and synchronize with other instances.

Here is a complete list of all Backbone licensing options: 

  • Atlassian Marketplace License: Cloud to Cloud
  • Atlassian Marketplace License: Data Center to Data Center/Cloud
  • Remote License
  • Behind Firewall (Email)
  • Behind Firewall (File)

Similarly, Exalate requires a valid license on both ends of the connection. While configuring your console, your license key is validated automatically for your instance. 

Here are Exalate’s licensing options:

  • Evaluation license (30 days of full access and 1000 twin issue pairs)
  • Free plan license (pre-built basic configuration for 1000 issue pairs per month)
  • Paid license (sync any data available via API with no volume limitations)

Neither solution is ISO 27001-certified yet. Since Exalate is based in Belgium, it is GDPR-compliant.

User-friendliness and Customizability

Backbone has an intuitive no-code UI that blends seamlessly with Jira. You can use this tool to customize your syncs to specific needs by tweaking the settings and selecting the issues or fields to map.

Exalate has 3 different modes: pre-set (Basic mode), no-code (Visual mode), and code-based (Script mode). This variety makes Exalate a perfect fit for every user, regardless of technical expertise. 

The interface is easy to use and maintains a consistent look across all platforms. You can also customize outgoing and incoming syncs using Exalate’s Groovy-based scripting engine.

Exalate also provides advanced features that use the power of AI and machine learning to increase the efficiency and accuracy of custom configurations. The AI Assist chatbot allows users to generate various forms of Groovy code snippets and mappings for complex use cases.

Coverage Scope

Backbone sends data via HTTP(S), files, and email. You can use this tool to sync sprints and versions. This integration tool also allows you to limit the sync using custom JQL (Jira Query Language). Other things you can sync include attachments, epic links, issue types, resolutions, and Confluence pages.

However, the drawback with Backbone is that you cannot sync statuses between next-gen and original boards. Since you have no control over what is being mapped, you won’t be able to opt out of sending some data over to the other instance. This limitation makes Backbone a less favorable tool for cross-company integrations.

That’s why Exalate takes the cake as the perfect choice for intra-company and cross-company integrations. The Exalate Console allows you to configure incoming and outgoing syncs using Groovy expressions to specify what should go over and what to receive from both sides. 

This option works perfectly for inter-company syncs because both parties retain independent control over their respective instances. You can also set triggers to automate the sync, or use the Bulk Exalate option to sync multiple issues at once.
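
A trigger is simply a query in the platform's own language; on Jira, for example, a JQL trigger such as project = SUP AND labels = "sync-to-partner" (hypothetical project and label values) would automatically sync every matching issue.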

Both Exalate and Backbone support bi-directional syncs, which means you can transfer data in both directions, regardless of the platform.

Pricing

As discussed earlier, Backbone Issue Sync has 2 main licensing models:

  • Atlassian Marketplace licensing (installation on both ends).
  • Remote licensing (installation on one end and manual configuration). 

The billing plan for Backbone centers on the maximum number of users per instance. For example, if you have 50 users in a single instance, you’ll need to pay the price for 50 users. 

Here are the sample pricing options:

  • 50 users (no remote license) – $500 annually ($50 monthly)
  • 100 users (no remote license) – $1000 annually ($100 monthly)
  • 2000 users (no remote license) – $5925 annually ($592.50 monthly)

Exalate’s free plan gives you up to 1000 new issue pairs per month. How does this work? Whenever you create an issue on one side, a replica or twin is automatically created on the remote side. This removes the need to create these pairs manually.

Exalate’s price depends on the use case and the platforms you wish to integrate. You can request a quote directly from the pricing page or contact the sales team at [email protected].

Customer Support

Backbone does not provide 24/7 customer support. The support agents work only from 9 am to 5 pm GMT+1, from Monday to Friday. You can always submit a support request via their portal and await a response.

Exalate provides better support to all customers, with 15 hours of free assistance. Under the new service level agreement, the premium tier gets you onboarding calls, solution assistance, bi-weekly syncs, and other services available 24/7. 

Versatility (Supported Platforms)

Backbone is only compatible with Jira products such as:

  • Jira Service Management
  • Jira Data Center
  • Jira Cloud
  • Jira Server

Exalate supports the following platforms:

  • Jira (Cloud, Server, Data Center, Service Management)
  • Zendesk
  • HP ALM
  • Salesforce
  • GitHub
  • ServiceNow
  • Azure DevOps

You can also deploy Exalate on Docker for multiple environments. Integration with other services like Monday.com, Asana, Trello, Azure DevOps on-prem, and GitLab is also in the works. 

Since Exalate integrates with more platforms, it is the perfect solution for companies working with different work management systems simultaneously. 

Error Handling

Backbone Issue Sync provides good troubleshooting and error handling. When something goes wrong with the sync, you will receive an error message with links to the doc pages containing solutions to known issues. 

When writing code in the Exalate Console, you will receive error notification balloons pointing out the specific line where the error occurred. This makes it easier to debug and fix your custom code when a single misspelled variable is ruining your coffee break. 

With Exalate’s robust Sync Queue mechanism, you can also keep an eye on events within the sync in order to debug the code easily.

Documentation and Community

Backbone has a comprehensive documentation page that details everything you need to know about the product. From sample use cases to How-tos, you can find answers to virtually any problem on the Backbone documentation page.

Similarly, Exalate has a detailed documentation page that addresses several potential issues. First-time users can also gather more information from the Getting Started guide. 

If you want to learn more about Exalate, you can watch video tutorials at the Exalate Academy. The product’s website and dedicated community also have a vast repository of case studies and use cases addressing various synchronization scenarios.

Backbone Issue Sync or Exalate? The Final Scorecard

After comparing both platforms, let’s aggregate the findings into a scorecard:

Backbone Issue Sync vs. Exalate

Over many years of providing integration solutions to companies, the gap between Exalate and Backbone has widened into a chasm. 

As it stands, Exalate is clearly the more versatile product, making it a perfect solution for cross-company syncs. Even though Backbone equals Exalate in user-friendliness, it is no match for Exalate in terms of customization and security features.

Conclusion

Before choosing any integration solution, always check the compatibility with your work management systems and specific use cases, bearing in mind the possibility of future changes to requirements. You should also evaluate how the solution handles data privacy and security. And most importantly, always choose integration solutions that are easy to use and guarantee top-notch performance.

Recommended Reads: