My Double Life in Microservices: From Homegrown to MuleSoft

As a software engineer who has been building microservices for more than six years now, I’ve led sort of a double life. Before I explain this dichotomy, I should provide a little background about myself. I’ve spent my whole career as a software plumber writing infrastructure code for big service platforms and more recently as a back-end developer implementing APIs and REST services. In this post, I will compare my experience building microservices through two very different approaches, which I think of as two different lives.

Reuse Knowledge and Experience in Technology

Throughout my career, I would typically reuse my knowledge of and experience with the technology I was working in to move from one thing to another, often quite unrelated, thing. For example, at IBM, I spent a lot of time working on systems management platforms using Java and all things related to that environment. Then I jumped to a server product that synchronized a mail system with mobile devices, again in the Java ecosystem.

In my most recent job switch coming to Green Irony, I was hired because I have extensive experience building microservices, particularly for solutions in the insurance industry. For the first time in my career, I was asked to work on the same thing, microservices for insurance, but use a completely different technology stack.

Building an Integration Network from Scratch

In my previous job, we initially implemented a monolithic Enterprise Service Bus (ESB) to build an insurance solution. It became cumbersome to maintain and enhance, so we migrated to microservices, rolling our own implementation. To do this, we:

  • Used Node.js running on AWS EC2
  • Created templates for new microservice containers, including code templates for the Node.js-based microservice as well as Terraform scripts for building and packaging the microservice into a Docker container
  • Built an open-source Node.js server stack that was the basis of each microservice
  • Used Swagger to define our APIs, along with a library that provided request validation based on a Swagger specification (one of the best decisions we made)
  • Built a set of reusable libraries for configuration management, authentication and authorization, and request-based security, to name a few
  • Built up a list of NPM dependencies that we relied on to perform various tasks

Examining this list of accomplishments, it’s apparent that only one of the items directly contributed to solving the customer’s business needs, and that was defining our APIs in Swagger. Had we known about MuleSoft, we could have eliminated a lot of work along with the implicit cost of maintaining it and keeping it updated as dependencies and technology changed.
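To show what that Swagger-driven validation bought us, here is a minimal TypeScript sketch of the pattern: a spec-derived description of an endpoint's parameters driving request validation. The names, shapes, and insurance fields are illustrative, not the actual library or specs we used.

```typescript
// A tiny Swagger-style parameter description (illustrative subset).
type SwaggerParam = { name: string; required: boolean; type: "string" | "number" };

interface EndpointSpec {
  parameters: SwaggerParam[];
}

// Validate an incoming request's query parameters against the spec,
// returning a list of human-readable validation errors.
function validateRequest(
  spec: EndpointSpec,
  query: Record<string, unknown>
): string[] {
  const errors: string[] = [];
  for (const p of spec.parameters) {
    const value = query[p.name];
    if (value === undefined) {
      if (p.required) errors.push(`missing required parameter: ${p.name}`);
      continue;
    }
    if (p.type === "number" && typeof value !== "number") {
      errors.push(`parameter ${p.name} must be a number`);
    }
    if (p.type === "string" && typeof value !== "string") {
      errors.push(`parameter ${p.name} must be a string`);
    }
  }
  return errors;
}

// A hypothetical spec of the kind we derived from our Swagger docs.
const getQuoteSpec: EndpointSpec = {
  parameters: [
    { name: "policyId", required: true, type: "string" },
    { name: "term", required: false, type: "number" },
  ],
};
```

Because every endpoint was validated from the same spec that documented it, the docs and the runtime behavior could never drift apart.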

Understanding the Value of MuleSoft

When I first considered moving to Green Irony, I was given a demo on MuleSoft. I was shown how to build flows using Anypoint Studio, dragging and dropping connectors. My first thought was, “What do you need me for? Almost anybody can build stuff with a drag and drop IDE like that.”

It took me some time to ponder and absorb MuleSoft. Then, at a second, more technical demonstration, I was shown how DataWeave makes it fairly easy to access and transform data, integrating data from third-party APIs to and from our project’s data types. And it all clicked: MuleSoft’s System API (SAPI), Process API (PAPI), and Experience API (EAPI) architecture matched what we had been doing for our microservices.

We built APIs called “bindings” that were responsible for communicating directly with third-party APIs. These were our SAPIs. We built mediation services that contained business logic and utilized one or more bindings. These were our PAPIs. Finally, we built an API that would use one or more bindings and was provided to the ultimate user of our APIs. These were our EAPIs.
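The layering above can be sketched in a few lines of TypeScript. This is a hypothetical illustration of the binding / mediation / experience split, with made-up names and canned data standing in for real upstream calls.

```typescript
// SAPI ("binding"): talks directly to a third-party system.
// The upstream HTTP call is stubbed here with canned data.
function policySystemBinding(policyId: string): { id: string; premium: number } {
  return { id: policyId, premium: 1200 }; // stand-in for a real API call
}

// PAPI ("mediation"): business logic composed from one or more bindings.
function quoteMediation(policyId: string, months: number): number {
  const policy = policySystemBinding(policyId);
  return (policy.premium / 12) * months; // pro-rate the annual premium
}

// EAPI: the API handed to the ultimate consumer, shaped for their needs.
function getQuoteExperience(policyId: string, months: number) {
  return { policyId, quote: quoteMediation(policyId, months), currency: "USD" };
}
```

Each layer only knows about the layer directly beneath it, which is exactly the decoupling that MuleSoft's API-led architecture formalizes.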

Once all this clicked for me, I realized that MuleSoft was less about building flows with Anypoint Studio, and more about accelerating our ability to build the solutions for our customers.

Old Life vs. New Life

In my previous “roll your own” (RYO) life, we spent a lot of time and effort integrating various libraries to read and parse Swagger, provide request validation, and map data to and from TypeScript definitions. As Swagger grew into the OpenAPI Specification, we were stuck on Swagger 2 because we never had the time to pay down the technical debt required to update our Swagger integration. We spent quite a bit of time and effort trying to find usable solutions for building TypeScript definitions from Swagger docs or, vice versa, generating Swagger documentation from TypeScript types. On top of that, we had to build client code that understood how to talk to the various services we were building, and we had to figure out how to secure the various APIs to provide access only to authorized users.

In my new MuleSoft life, we design APIs using RAML in the Design Center. When we publish the API to Exchange, a library is automatically generated providing connectors to utilize the API. When we implement the API, Studio and APIkit automatically create the routes needed to implement each of the API’s endpoints and provide automatic request validation to ensure that incoming requests are properly formed and have all required values.
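For illustration, here is a minimal RAML sketch of the kind of spec we now start from in Design Center. The API, resource, and type names are made up:

```raml
#%RAML 1.0
title: Quote API
version: v1

types:
  Quote:
    properties:
      policyId: string
      amount: number

/quotes/{policyId}:
  get:
    queryParameters:
      term:
        type: integer
        required: false
    responses:
      200:
        body:
          application/json:
            type: Quote
```

From a spec like this, APIkit scaffolds a route per endpoint and rejects malformed requests before they ever reach our flow logic.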

In my previous RYO life, much of our effort was spent manually writing code to transform and map data from one API to another. We had a homegrown data mapper library, but it was often difficult to figure out how to make it do exactly what you needed. This difficulty led to varied and at times convoluted implementations that made reading and understanding another developer’s mapping code quite confusing. Enhancing and maintaining these mappings was cumbersome.
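To make the pain concrete, here is a hypothetical TypeScript sketch of the kind of field-by-field mapping code that accumulated in our RYO stack; the source and target shapes are invented for illustration.

```typescript
// An upstream carrier's shape (illustrative) vs. our internal shape.
interface UpstreamRisk {
  ins_first: string;
  ins_last: string;
  prem_cents: number;
}

interface Quote {
  insuredName: string;
  premium: number;
}

// Every field renamed and reshaped by hand, endpoint by endpoint.
function mapRiskToQuote(src: UpstreamRisk): Quote {
  return {
    insuredName: `${src.ins_first} ${src.ins_last}`.trim(),
    premium: src.prem_cents / 100, // cents to dollars
  };
}
```

Multiply this by every pair of APIs being integrated, each written in a slightly different style, and the maintenance burden adds up quickly.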

In my new MuleSoft life, there is DataWeave. DataWeave is MuleSoft’s expression language for accessing and transforming data that travels through a Mule application. It makes it very easy to read XML and write JSON, and vice versa, and you can easily combine data from multiple sources and multiple formats. It’s functional in nature, so it provides a lot of powerful features for implementing complex mappings.
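As a rough illustration (the field names are made up), a DataWeave transform that renames and reshapes an incoming payload into a JSON quote can be just a few lines:

```dataweave
%dw 2.0
output application/json
---
{
  insuredName: payload.ins_first ++ " " ++ payload.ins_last,
  premium: payload.prem_cents / 100
}
```

The same script works whether the payload arrived as JSON or XML; DataWeave handles the parsing and serialization for you.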

In RYO life, unit testing was difficult. Typically, more effort was spent figuring out how to scaffold code and mock external resources than actually writing the test cases and verifying the code. Often, instead of unit tests, we wrote integration tests that relied on access to actual services or used fairly kludgy mock solutions. Also, when we ran unit tests, the code was not in the same runtime environment as when it ran in the server container on AWS; sometimes code passed under test but failed in the actual runtime because the environment was too hard to simulate.

In MuleSoft life, unit testing is significantly easier. In Studio, you right-click a flow and select MUnit -> “Create blank test for this flow”. If the flow uses connectors to issue HTTP or SOAP requests, or uses a custom Mule connector to invoke a PAPI or a SAPI, you just mock that connector and provide your mocked response. If you need more granularity in your code to facilitate testing, you can refactor larger flows into sub-flows. You can easily set up the Mule event to mimic actual incoming requests, and you can initialize flow variables to further emulate the request environment. When you run the MUnit tests, Studio actually starts up your Mule application, so your code runs in the same environment as when it’s live. There is no need to spend lots of time figuring out how to emulate and scaffold your runtime environment.

MuleSoft for the Win

In my RYO life, I spent lots of time implementing system infrastructure and building custom solutions for security, authentication and authorization on services and endpoints, Swagger parsing, and runtime request validation. As libraries and technologies progressed, we would steadily incur technical debt to bring our custom implementations up to the latest and greatest version of each library or technology we depended on. Often, we would stay behind on older technology, like Swagger instead of moving to OpenAPI, because we couldn’t afford the time or effort to modernize our custom solution.

MuleSoft is a wonderful accelerator that allows us to focus on the problem we are trying to solve for our customers instead of building and maintaining a custom technical stack just to enable working on the actual solution. You still need experienced developers who understand how to tackle enterprise-level problems, but MuleSoft allows for robust solutions with much less time and effort. Developers can focus on the important parts of building solutions for their customers.

If you’d like to dive into the topic more, check out our “Adapt vs. Adopt: The Value of Anypoint Platform” whitepaper. It’s a deep dive into the technical, business, and financial considerations an organization must weigh when evaluating building a homegrown network vs. investing in MuleSoft.


What about Boomi, Apigee, Informatica, and…? MuleSoft vs The Competition’s Partial Solutions to the Integration Puzzle

Jigsaw puzzle sales soared when the COVID-19 pandemic hit. Puzzles allow families, with plenty of time on their hands, to unwind and collaborate on the solution to a common challenge. Puzzle manufacturers work hard to ensure there are no missing pieces in a box, but when mistakes are made, it’s disappointing to have invested so much time in a solution only to find out that the solution was not available because pieces were missing.

This analogy is very relevant to the integration challenges faced by today’s IT executives. Many have a plethora of integration software either already purchased or currently under evaluation as possible solutions to their challenges; however, it’s typically very unclear how each piece fits into a cohesive whole. It’s equally unclear whether every piece of the puzzle is included in the “box,” or whether they’ll near the end of the puzzle only to find missing pieces that prevent them from fully addressing their integration challenges.

In our whitepaper, “Adapt vs. Adopt: The Value of MuleSoft Anypoint Platform”, we discuss how MuleSoft allows companies to design and roll out comprehensive API-led integration strategies that reduce go-live timelines, increase developer efficiency and asset reuse, remove common roadblocks that gate digital transformation initiatives, and ultimately drive a better integration strategy that leads to an overall lower total cost of ownership. 

A common question we’ve received in response to this whitepaper goes something like this:

“Well I’m already using Boomi for integration. Why can’t I just do this with Boomi?”

The answer is simple: Boomi only solves a small piece of the puzzle.

Partial Solutions are Partial Homegrown

We don’t mean to pick on Dell Boomi in particular. You can substitute many alternatives: 

  • Apigee
  • Informatica
  • IBM API Connect
  • Azure API Management
  • SnapLogic
  • And many other integration and API solutions 

With any of those products, you’ll still come up with the same answer: these solutions are only solving a small piece of the puzzle. The rest of the puzzle still needs to be solved. Solving it involves pulling in more partial solutions and integrating them together (using custom code) or creating homegrown solutions to implement the missing features (also using custom code).

Categories of Partial Solutions

These partial solutions typically fall into three discrete categories:

  • Tactical integrators: short-term tactical approach to connect two systems together
  • API Specialists: policy and routing but not much else
  • Legacy integration platforms: heavyweight solutions of yesteryear that were created to address yesterday’s integration problems

Tactical integrators provide a toolset that allows for accelerated development and fuel a short-term tactical mindset for integration needs. These technologies rely on prebuilt connectors into data sources and some level of drag-and-drop tooling to accelerate point-to-point development patterns.

While these patterns may lead to short-term gains in delivery speed, in the long term your integration strategy will suffer the same pitfalls as any point-to-point approach: what MuleSoft founder Ross Mason calls the “big ball of mud.” This situation is outside the scope of this blog entry; a wealth of information can be found in the “Adapt vs. Adopt: The Value of MuleSoft Anypoint Platform” whitepaper. Common examples here include Boomi, SnapLogic, and Jitterbit.

API specialists provide gateway and proxy use cases that facilitate control and governance over APIs once they are deployed. While these platforms do a great job of handling API gateway scenarios around governance, policy enforcement, and routing, they do not handle anything else involved in executing a robust API-led strategy: connectivity, orchestration, and payload transformation are fully reliant on custom coding of API implementations. See pages 11 and 12 in our whitepaper for the comprehensive list of necessary functionality that’s missing from platforms that focus merely on the API gateway aspect of API-led connectivity. Common examples of this type of solution include Apigee, CA API Gateway (formerly Layer7), and TIBCO Mashery.

Legacy integration platforms are very heavyweight solutions involving proprietary products that must be integrated together to work effectively. These solutions are very complex, requiring significant setup and maintenance effort in addition to high infrastructure costs. Given their legacy nature, many either do not support modern APIs or have bolt-on API functionality built on top of technology that was never intended to address modern connectivity challenges. These platforms all suffer from an “integrating your integration platforms together” mindset and don’t provide everything that is needed to roll out an API-led integration strategy.

MuleSoft Anypoint Platform: The Entire Puzzle In One Box

Using any of the partial solutions mentioned above, your development team is responsible for determining the size and shape of the missing puzzle pieces and for constructing them from scratch to fill the gaps in your puzzle. And unlike the family at the beginning of our story, with time to burn and a jigsaw puzzle to solve as a family-bonding activity, IT executives don’t have time to burn, and integration is not a recreational activity. They need real solutions to quickly address their integration challenges and remove roadblocks to transformation efforts.

The true value of MuleSoft Anypoint Platform is that it provides an organization with everything required to be successful in executing a modern API-led integration strategy. Instead of reinventing the wheel, MuleSoft customers enjoy the ability to apply nearly 100% of their focus to what matters: business scenarios. For more in-depth information on how MuleSoft Anypoint Platform provides everything needed to tackle modern integration challenges and pave the way for lower overall TCO, check out “Adapt vs. Adopt: The Value of MuleSoft Anypoint Platform” whitepaper and let us know what you think.