How to win in the API economy with API Developer Portals

In an increasingly API-driven economy, an all-inclusive API Developer Portal can differentiate an enterprise from its competitors.




    The evolution and adoption of enterprise digital transformation have made APIs critical for integration within and across enterprises as well as for product/service innovation. As APIs grow in scale and complexity, establishing a developer portal would significantly ease the process of their roll-out and adoption. This perspective paper explores the significance of an API Developer Portal in the modern digital landscape driving the API economy.

    A Developer Portal makes it easier to understand APIs, reduces integration time, and supports developers in training and in resolving API-related issues. This provides significant business value by improving agility and enhancing customer experience. With the help of a Portal, enterprises can efficiently publish and consume APIs and integrate them across incremental API versions, ensuring that they benefit from all of their digital investments.

    In an increasingly API-driven economy, an all-inclusive API Developer Portal can differentiate an enterprise from its competitors, help build trust with partners, and achieve long-term success. Depending on the API platforms being used, enterprises could adopt a built-in platform or develop a custom one. Developing a custom API Portal would be easy at the start. However, developing enhanced features would entail a significant investment of time and resources. Hence, to make the right decisions and succeed in the broader API implementation/integration journey, a well-thought-out approach is necessary.

    To learn more about the key drivers, components and features, implementation options and potential benefits of API Developer Portals, download the perspective paper here.

    Download Perspective Paper




      Contact

      Our experts can help you find the right solutions to meet your needs.

      Get in touch

      BANKING

      Automated financial analysis reduces manual effort

      Analysts in a large North American bank's commercial lending and credit risk operations can source intelligent information across multiple documents.

      Client
      Commercial lending and credit risk units of a large North American bank
      Goal
      Automated retrieval of information from multiple financial statements enabling data-driven insights and decision-making
      Tools and Technologies
      OpenAI API (GPT-3.5 Turbo), LlamaIndex, LangChain, PDF Reader
      Business Challenge

      A leading North American bank had large commercial lending and credit risk units. Analysts in those units typically refer to numerous sections of a financial statement — including balance sheets, cash flows, and income statements, supplemented by footnotes and leadership commentaries — to extract decision-making insights. Switching between multiple pages of different documents was laborious, making the analysis unnecessarily difficult.

      Solution

      Many of these tasks were automated using Gen AI tools. The steps involved:

      • Ingest multiple URLs of financial statements
      • Convert these to text using the PDF Reader library
      • Build vector indices using LlamaIndex
      • Create text segments and corresponding vector embeddings using OpenAI’s API for storage in a multimodal vector database (e.g., Deep Lake)
      • Compose graphs of keyword indices for vector stores to combine data across documents
      • Break down complex queries into multiple searchable parts using LlamaIndex’s DecomposeQueryTransform library
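
      The steps above can be sketched in miniature. The chunking, embedding, and similarity functions below are toy stand-ins — a production pipeline would use LlamaIndex's node parsing and OpenAI's embeddings endpoint, with vectors persisted in a store such as Deep Lake — but the index-then-retrieve structure is the same:

```python
import math

def chunk(text, size=40):
    """Split a document into fixed-size text segments (a toy stand-in
    for LlamaIndex's node parsing)."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def embed(segment):
    """Placeholder embedding: bag-of-words term counts. A real pipeline
    would call OpenAI's embeddings endpoint here instead."""
    vec = {}
    for word in segment.lower().split():
        vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(v * b.get(k, 0) for k, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Ingest two toy "financial statement" documents into an in-memory index.
docs = {
    "balance_sheet": "total assets grew while liabilities held steady",
    "cash_flow": "operating cash flow improved on lower capital spend",
}
index = [(doc_id, seg, embed(seg))
         for doc_id, text in docs.items()
         for seg in chunk(text)]

def retrieve(query, k=1):
    """Return the k segments most similar to the query, across documents."""
    q = embed(query)
    ranked = sorted(index, key=lambda item: cosine(q, item[2]), reverse=True)
    return [(doc_id, seg) for doc_id, seg, _ in ranked[:k]]
```

      A query about cash flow then ranks segments from every ingested document at once, which is what removes the page-by-page switching described above.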
      Outcomes

      The solution delivered impressive results in financial analysis, notably reducing manual effort when multiple documents were involved. Since the approach is still largely linguistic in nature, considerable prompt engineering may be required to generate accurate responses.

      The lack of semantic awareness in Large Language Models (LLMs) can limit responses, which may require users to include qualifying information in their queries.


      BROKERAGE & WEALTH

      Next generation chatbot eases data access

      Gen AI tools help users of retail brokerage trading platform obtain information related to specific needs and complex queries.

      Client
      Large U.S.-based Brokerage and Wealth Management Firm
      Goal
      Enable a large number of users to readily access summarized information contained in voluminous documents
      Tools and Technologies
      Google Dialogflow ES, Pinecone, LlamaIndex, OpenAI API (GPT-3.5 Turbo)
      Business Challenge

      A large U.S.-based brokerage and wealth management firm has a large user base on its retail trading platform, which offers sophisticated trading capabilities. Although extensive information was documented in hundreds of pages of product and process manuals, it was difficult for users to access and understand information related to their specific needs (e.g., How is margin calculated? or What are Rolling Strategies? or Explain Beta Weighting).

      Solution

      Our Gen AI solution encompassed:

      • Building a user-friendly interactive chatbot using Dialogflow in Google Cloud
      • Ringfencing a knowledge corpus comprising specific documents to be searched against and summarized (e.g., 200-page product manual, website FAQ content)
      • Using a vector database to store vectors from the corpus and extract relevant context for user queries
      • Interfacing the vector database with OpenAI API to analyze vector-matched contexts and generate summarized responses
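
      The last step — interfacing the vector database with the OpenAI API — hinges on prompt construction: the vector-matched passages are inlined so the model summarizes from the ring-fenced corpus. The sketch below is a simplified, hypothetical illustration of that grounding step; the instruction wording and context budget are illustrative, not the production values:

```python
def build_prompt(question, contexts, max_context_chars=2000):
    """Inline the vector-matched passages into the prompt so the LLM
    answers from the ring-fenced corpus rather than from its own
    parametric knowledge."""
    context_block = ""
    for passage in contexts:
        if len(context_block) + len(passage) > max_context_chars:
            break  # stay within the model's context budget
        context_block += passage.strip() + "\n---\n"
    return (
        "Answer the question using only the context below. "
        "If the context is insufficient, say so.\n\n"
        f"Context:\n{context_block}\n"
        f"Question: {question}\nAnswer:"
    )

# Hypothetical passages returned by the vector search for a user query.
prompt = build_prompt(
    "How is margin calculated?",
    ["Margin is computed as a percentage of the position's notional value...",
     "Rolling strategies extend an existing option position..."],
)
```

      The assembled string is what gets sent as the user message in the chat-completion call, with the chatbot front end (Dialogflow) relaying the model's summarized answer back to the user.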
      Outcomes

      The OpenAI GPT-3.5 Turbo LLM delivered impressive linguistic search and summarization capabilities in dealing with information requests. Prompt engineering and training are crucial to securing those outcomes.

      In the case of a rich domain such as a trading platform, users may expect additional capabilities, such as:

      • API integration to support requests requiring retrieval of account/user specific information, and
      • Augmentation of linguistic approaches with semantics to deliver enhanced capabilities.
      The state of Central Bank Digital Currency

      Innovations in digital currencies could redefine the concept of money and transform payments and banking systems.




        Central banking institutions have emerged as key players in the world of banking and money. They play a pivotal role in shaping economic and monetary policies, maintaining financial system stability, and overseeing currency issuance. A manifestation of the evolving interplay between central banks, money, and the forces that shape financial systems is the advent of Central Bank Digital Currency (CBDC). Many drivers have led central banks to explore CBDC: declining cash payments, the rise of digital payments and alternative currencies, and disruptive forces in the form of fin-tech innovations that continually reshape the payment landscape.

        Central banks are receptive towards recent technological advances and well-suited to the digital currency experiment, leveraging their inherent role of upholding the well-being of the monetary framework to innovate and facilitate a trustworthy and efficient monetary system.

        In 2023, 130 countries, representing 98% of global GDP, are known to be exploring a CBDC solution. Sixty-four of them are in an advanced phase of exploration (development, pilot, or launch), focused on lower costs for consumers and merchants, offline payments, robust security, and a higher level of privacy and transparency. Over 70% of the countries are evaluating digital ledger technology (DLT)-based solutions.  

        While still at a very nascent stage in terms of overall adoption for CBDC, the future of currency promises to be increasingly digital, supported by various innovations and maturation. CBDC has the potential to bring about a paradigm shift, particularly in the financial industry, redefining the way in which money, as we know it, exchanges hands.

        Read our perspective paper to learn more about CBDCs – the rationale for their existence, the factors driving their implementation, potential ramifications for the financial landscape, and challenges associated with their adoption.

        Download Perspective Paper




          Cloud migration challenges and solutions

          Insights into the top challenges and their mitigations in the cloud journey.




            Selecting an appropriate path for an application or a portfolio of applications is one of the most critical decision points in a cloud journey. Assessing the nature and criticality of an existing application is usually the starting place. Another critical factor to consider is the implementation (migration) cost and time for each path to cloud. The four cloud adoption options are re-host, re-platform, re-factor and re-write in the order of increasing cost, effort, cloud benefits, and TCO reduction. Out of these, re-host usually does not involve code change and is relatively simple.

            Mapping cloud operating metrics onto a 3x3 matrix is a good starting point when planning a cloud journey. In this matrix, cloud operating metrics move to the right when they are critical for customer intelligence applications; that criticality is the X factor. Another critical dimension in planning a cloud migration is identifying the interface dependencies between the selected application(s) and others — both inbound and outbound. These could be synchronous, asynchronous, or batch.

            Understanding the application architecture, its internal organization, and its inter-dependencies is critical before migration. Done manually, this can be a very complex, labor-intensive, and error-prone task. Not fully understanding the existing code can lead to issues related to transactions, data corruption, session handling, and performance.

            To read more on the top challenges and their mitigations in the cloud journey, download the perspective paper here.

            Download Perspective Paper




              Containerized microservices optimize infrastructure

              Migration of legacy interfaces from Oracle SOA to AWS EKS increases scalability, availability and maintainability and establishes a single version of truth across systems and functions for a global publishing house.




                A leading global publishing house with significant operations in the U.S. had multiple Systems of Record (SOR) with a point-to-point integration between them and various operational and analytics marts built on legacy technologies. This led to the divergence of critical operational data across functions, delays in month-end and quarter-end processing as well as scalability and performance issues.

                Iris Software’s team collaborated with the client to accelerate the migration of their legacy interfaces and services to AWS EKS through a phased approach that was best suited to meet the client’s need for a much faster deployment time to market. It involved:

                • Defining the microservices and containerization technology stack for the migration of services.
                • Developing containerized microservices using Spring Boot with externalized configuration to deploy into AWS EKS.
                • Registering services hosted on AWS EKS with Kong API Gateway, enabling service discovery, auto-scaling and self-healing, as well as consumption of materialized views and response preparation.

                The migration from Oracle SOA to a loosely coupled set of capabilities with a microservices architecture on AWS EKS enabled the delivery of a single version of truth to various consuming applications and channels, reducing operational issues and maintenance costs. It also ensures that the business can meet future demand spikes, minimize downtime, and maintain optimal performance.
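
                The externalized-configuration pattern applied here keeps settings out of the container image so the same image runs unchanged in every environment; in AWS EKS, the variables would typically be injected from a ConfigMap or Secret. The sketch below illustrates the pattern in Python rather than the Spring Boot used on the project, and all variable names and defaults are hypothetical:

```python
import os

def load_config(env=None):
    """Read service settings from the environment with safe defaults.
    In Kubernetes, these variables would typically be injected from a
    ConfigMap or Secret, so one container image runs unchanged across
    dev, staging, and production."""
    env = os.environ if env is None else env
    flags = env.get("APP_FLAGS", "")
    return {
        "db_url": env.get("APP_DB_URL", "postgres://localhost/dev"),
        "cache_ttl_seconds": int(env.get("APP_CACHE_TTL", "300")),
        "feature_flags": flags.split(",") if flags else [],
    }

# Same code, different environments: only the injected values differ.
prod = load_config({"APP_DB_URL": "postgres://prod-db/orders",
                    "APP_CACHE_TTL": "60"})
dev = load_config({})
```

                Because no environment-specific value is baked into the artifact, promoting a build from staging to production is a deployment-manifest change rather than a rebuild.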

                Learn more by downloading the full success story here.

                Download Success Story




                  Platform migration increases API capacity by 6x

                  Migration of legacy API platform to Apigee and GCP and API productizing led to 6x higher capacity and 70% reduction in support tickets, enhancing business growth and customer satisfaction for a global logistics firm.




                    Our client, a leader in truck transportation and logistics services with more than 50,000 customers across 33 countries, had developed customer-facing APIs using earlier generations of API platforms. These APIs connected their transportation management system with several other critical systems such as GPS tracking, warehouse management, and real-time customer portals. Scalability and reliability issues were plaguing the client’s API management system due to the legacy infrastructure and increasing numbers of APIs, leading to poor customer satisfaction and decreased competitiveness.

                    The client sought a technology partner who could understand the complex business logic within the existing API structure and execute a seamless migration and modernization that would improve the performance and scalability of the APIs for its customers. A team of Apigee experts at Iris Software addressed the challenges with a comprehensive, customized four-step approach that consisted of:

                    1. Outlining the migration strategy to move 18+ APIs to a more robust API Gateway
                    2. Automating the migration from the legacy API Gateway to Apigee to ease customer transitions
                    3. Balancing internal system loads to increase scalability and throughput
                    4. Implementing Apigee analytics for improved traceability and faster mitigation of issues

                    Iris’ solution provided multi-market, multi-channel and multi-partner integration as well as other positive outcomes for the client:

                    • 70% reduction in support tickets related to shipment delays
                    • 6x increase in API throughput
                    • New revenue streams from the creation of four new API products

                    Learn more by downloading the full success story here.

                    Download Success Story




                      Succeeding in ML operations journeys

                      As machine learning becomes increasingly prevalent in the business world, more and more enterprises are considering the cloud as a way to scale their machine learning efforts.




                        The promise of AI and ML to deliver significant competitive advantages and contribute to the growth of the organization is now well established. Organizations, big and small, are adopting these to drive their strategic business goals.

                        However, moving machine learning workloads to cloud has its own challenges. Organizations are realizing that operational and support requirements increase rapidly as data science and modeling teams adopt emerging AI/ML platforms on cloud.

                        Machine learning operations, or MLOps for short, is significantly different from traditional software development practice and requires a different way of thinking about how machine learning models are developed, deployed, and maintained. This shift can be difficult for teams involved in the governance, enablement, and support of public cloud-based capabilities who are used to more traditional approaches, and it requires a change in culture and mindset.

                        The power of AI/ML models and platforms, and the potential for deriving significant value from them, is growing rapidly. Adopting them quickly is increasingly a critical imperative for firms that want to remain competitive. However, successfully adopting them at scale requires the use of public cloud technologies. Public cloud adoption in many industries is still in its early stages, with many internal groups to work with and evolving processes and standards. By establishing clear ownership for MLOps and DataOps and simplifying adoption through templates and automation, firms can overcome these challenges and scale AI/ML on the cloud while ensuring overall agility and cost-effectiveness.

                        To learn more about the key challenges and our learnings and best practices, download the perspective paper.

                        Download Perspective Paper




                          Navigating distributed ledger technologies

                          Distributed ledger technology (DLT) strengthens data security, promotes transparency in transactions and can potentially revolutionize industries.




                            Today’s enterprises rely heavily on information systems to enable their business processes, and these systems are usually managed and controlled by the respective enterprises. However, the modern business value cycle involves many more multilateral transactions. These span enterprises and require fast, reliable access to the latest, most comprehensive information about the transactions to make them more effective and, eventually, lead to better collaboration among enterprises.

                            The reality, however, is that with siloed information systems and each participant managing its own version of the truth, enterprises end up with an opaque information architecture, resulting in information discrepancies, countless reconciliations, unproductive person-hours spent resolving them, increased operational risk, weakened trust, and higher costs.

                            Decentralization by way of DLT is a step towards addressing these issues. It enables companies to jointly manage, operate, and use a platform that maintains a single version of truth across participants and relies on strong cryptography to create trust and immutability. The objective is to deliver tamper-proof data and transparency to all network participants in a consensually agreed manner.
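
                            The tamper-evidence that cryptography brings to a shared ledger can be illustrated with a toy hash chain — a drastic simplification of a real DLT platform, with no consensus protocol or networking, but it shows why a later edit to any record is detectable by every participant:

```python
import hashlib
import json

def block_hash(record, prev_hash):
    """Hash a record together with the previous entry's hash, chaining
    the ledger so any later edit is detectable."""
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append(ledger, record):
    """Append a record, linking it to the hash of the previous entry."""
    prev = ledger[-1]["hash"] if ledger else "genesis"
    ledger.append({"record": record, "prev": prev,
                   "hash": block_hash(record, prev)})

def verify(ledger):
    """Recompute every hash in order; any tampered entry breaks the chain."""
    prev = "genesis"
    for entry in ledger:
        if entry["prev"] != prev or entry["hash"] != block_hash(entry["record"], prev):
            return False
        prev = entry["hash"]
    return True

ledger = []
append(ledger, {"from": "A", "to": "B", "amount": 100})
append(ledger, {"from": "B", "to": "C", "amount": 40})
assert verify(ledger)                  # every participant agrees
ledger[0]["record"]["amount"] = 999    # someone edits history...
assert not verify(ledger)              # ...and the chain exposes it
```

                            Because each entry's hash depends on everything before it, reconciliations reduce to re-running this verification rather than comparing each party's private copy line by line.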

                            Distributed ledger technologies have evolved and matured over the last few years. While DLT came about with cryptocurrencies, applying the technology to alternative use cases can benefit enterprises, and its adoption is on the rise across industries.

                            This perspective paper addresses the evolution and application of DLT in enterprises, and how it can be further embraced to realize potential across multilateral solutions. To learn more about the pillars and eminent platforms of DLT, key challenges and industry use cases, download the perspective paper. 

                            Download Perspective Paper





                              PROFESSIONAL SERVICES

                              Release automation reduces testing time by 80%

                              DevOps implementation and release automation improved testing time, product quality, and global reach for a leading multi-level marketing company.

                              Client
                              A leading multi-level marketing company
                              Goal
                              Shorten the release cycle and improve product quality
                              Tools and Technologies
                              Amazon CloudWatch, Elasticsearch, Bitbucket, Jenkins, Amazon ECR, Docker, and Kubernetes
                              Business Challenge

                              The client's commercial-off-the-shelf (COTS) applications were built using substandard code-branching methods, causing product quality issues. The absence of a release process, combined with manual integration and deployment, elongated release cycles. Manual configuration and setup of these applications also led to extended downtime. Missing functional, smoke, and regression test cases contributed to an unstable development environment. The database migration process was manual, resulting in delays, data quality issues, and higher costs.

                              Solution
                              • Code branching and integration strategy for defects/hotfixes in major and minor releases
                              • Single-click application deployment, including environment creation, approval and deployment activities
                              • Global DevOps platform implementation with a launch pad for applications to onboard other countries
                              • Automated configuration and deployment of COTS applications and databases
                              • Automation suite with 90% coverage of smoke and regression test cases
                              • Static and dynamic analysis implementations to ensure code quality and address configuration issues
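
                              As a generic illustration of what such an automation suite does (the client's actual suite is not shown, and the check names here are hypothetical), a smoke stage is essentially a set of fast, independent checks run against every build, with the CI pipeline failing on any regression:

```python
def smoke_suite(app):
    """Run a minimal set of fast, independent checks against an
    application; a CI stage (e.g., in Jenkins) would fail the build
    if any check regresses."""
    checks = {
        "health endpoint responds": lambda: app["health"]() == "ok",
        "login flow works": lambda: app["login"]("user", "secret") is True,
        "catalog is non-empty": lambda: len(app["catalog"]()) > 0,
    }
    failures = [name for name, check in checks.items() if not check()]
    return {"passed": len(checks) - len(failures), "failed": failures}

# A stubbed application standing in for the real deployment under test.
stub_app = {
    "health": lambda: "ok",
    "login": lambda user, password: True,
    "catalog": lambda: ["item-1"],
}
report = smoke_suite(stub_app)
```

                              Running such a suite on every commit is what lets a release cadence move from monthly to weekly: regressions surface within minutes of the change that caused them.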
                              Outcomes

                              Automation of release cycles delivered the following benefits to the client:

                              • Release cycle shortened from once a month to once per week
                              • MTTR reduced by 6 hrs
                              • Downtime decreased to <4 hours from 8 hours
                              • Product quality and defect leakage improved by 75%
                              • Testing time reduced by 80%
                              • Reach expanded to global geographies
                              • Availability, scalability, and fault tolerance enhanced for microservices-based applications