New IT ecosystem earns 100% satisfaction rating

STANDARDS & MEMBERSHIP


Redesigned application and support system overcome multiple process challenges that were causing costly delivery delays and widespread dissatisfaction.

Client
A leading global standards organization
Goal
Improve coordination across teams, reduce delivery delays, and streamline support processes for global users
Tools and Technologies
.NET Core, Vue.js, Python, Docker, Kubernetes, Azure, Angular, Cosmos DB, MS SQL Server, Power BI, Redis, Azure Functions, Azure Data Factory, Azure App Service, Spring Boot, Java
Business Challenge

The IT ecosystem was spread across multiple vendors and in-house teams, creating significant coordination overhead. These issues led member organizations to express dissatisfaction with delivery delays and high turnaround times on incidents and problems.

The teams also had to support users across various geographies and time zones, further complicating operations. A lack of standardized customer service processes and knowledge base documentation hindered issue resolution. Additionally, the teams struggled to stay current on client-specific standards and applications while handling ad hoc requests and supporting teams on customer service-specific projects.

Solution
  • Completed a three-week discovery phase and a six-month transition plan covering 10+ applications, 24/7 Level 2 support, and infrastructure support
  • Established service and operations management processes, including governance, tools, and KPIs for Agile and ITIL processes
  • Set up and maintained the Freshdesk ticketing system to manage user requests
  • Created detailed SOPs and canned messages in Freshdesk for business-as-usual (BAU) tracking
  • Collaborated with the L3 team to define resolution SLAs for critical tickets
  • Created weekly and monthly reports to track requests, issues, and risks
  • Set up a standard process with the client to handle access-related requests
Outcomes
  • Zero-downtime deployments
  • Faster onboarding of member organizations
  • Continuous quarter-over-quarter reductions in infrastructure costs
  • 100% response and resolution SLA compliance, leading to 100% customer satisfaction
  • Timely access to all applications for business users
  • A one-stop shop via Freshdesk for all ticket information and data for all stakeholders
  • Expertise in global standards that yields ideas and suggestions for continuous process improvement
Contact

Our experts can help you find the right solutions to meet your needs.

Get in touch
Explore the world with Iris. Follow us on social media today.

Automated POD improves turnaround time 95%

PROFESSIONAL SERVICES


Optimized Proof of Delivery documentation improves productivity by 87.5% and shortens the billing cycle by 35%.

Client
Leading supply chain brokerage
Goal
Automate Proof of Delivery documentation process to increase efficiency and accuracy in data upload, validation and invoicing
Tools and Technologies
UiPath Orchestrator, UiPath Document Understanding, Microsoft Power BI, Oracle Transportation Management
Business Challenge

Proof of Delivery (POD) is a document that confirms an order has arrived at its destination and was successfully delivered before the invoice can be billed for payment.

The lack of an electronic POD system led to inefficient manual processing, as varied legal and contractual documentation requirements resulted in longer billing cycles. Diverse formats and layouts from different carriers complicated data extraction from paper-based PODs.

Solution
  • Developed a Document Processing Bot with UiPath AI Center, leveraging Document Understanding and Optical Character Recognition (OCR) to manage various carrier documents
  • Optimized data models for major carriers, focusing on the top five document types that represent 80% of the volume
  • Implemented UiPath Action Center's human-in-the-loop capability to handle exceptions and conducted 6-8 weeks of rigorous training on the Document Understanding model to ensure accuracy and meet confidence targets
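In plain Python, the confidence-gated, human-in-the-loop routing described above looks roughly like this; the field names and the 0.85 threshold are illustrative assumptions, not UiPath specifics:

```python
# Route extracted POD fields to human review when model confidence is low.
# Field names and the confidence threshold are illustrative, not the client's.

CONFIDENCE_THRESHOLD = 0.85

def triage_extraction(fields):
    """Split extracted fields into auto-approved and human-review queues."""
    auto_ok, needs_review = {}, {}
    for name, (value, confidence) in fields.items():
        if confidence >= CONFIDENCE_THRESHOLD:
            auto_ok[name] = value
        else:
            needs_review[name] = value
    return auto_ok, needs_review

extracted = {
    "carrier":       ("Acme Freight", 0.97),
    "delivery_date": ("2024-03-02",   0.91),
    "signature":     ("J. Smith",     0.62),  # low confidence -> human check
}
auto_ok, needs_review = triage_extraction(extracted)
```

Only the low-confidence fields reach a human, which is what keeps exception volume manageable while the model is retrained.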
Outcomes
  • Achieved a 95% reduction in POD turnaround time, dropping from 48 hours to 2 hours, significantly boosting customer satisfaction
  • Enhanced productivity by 87.5%, confirming receipt and condition of freight efficiently
  • Reached 80% process accuracy, with continuous enhancement via automatic retraining
  • Cut the billing cycle by 35%, allowing immediate use of data for customer invoicing

Automated scheduling bots boost productivity by 50%

PROFESSIONAL SERVICES


Major productivity gains and increased customer satisfaction from using 24/7 automated bots in place of manual scheduling.

Client
Leading supply chain brokerage
Goal
Automate the manual, supply-chain scheduling process to improve staff productivity and customer satisfaction
Tools and Technologies
UiPath Orchestrator, UiPath Assistant, Microsoft Power BI, Office 365
Business Challenge

Performing crucial supply chain logistics, a provider's operations team was struggling with the high volume of appointment scheduling with shippers, receivers, and carriers, which involved back-and-forth emails, phone calls, and manual data entry into multiple Transport Management Systems (TMS).

These appointment-scheduling complexities vary with the parties involved, ranging from emailing a request for appointment times to accessing a TMS and selecting an available slot that fits the schedule.

Lacking proper analytics, sales representatives were unable to pinpoint peak appointment times, track cancellation rates, or discern customer preferences, often leading to shipment delays and incurred detention charges.

Solution
  • Deployed multiple rule-based, automated workflows to pull information from incoming appointment requests (from emails, web forms, etc.) and automatically input it into the various TMS used to book pick-up and delivery appointments
  • Developed a Power BI dashboard to visualize appointment trends, peak times, and cancellation rates, providing insights into customer behaviors, including frequent reschedules, preferred times, and typical lead times for booking appointments
  • Delivered a reusable solution that could be leveraged for other business areas
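The rule-based extraction step above can be sketched with regular expressions; the patterns and the sample email below are illustrative assumptions, since real carrier formats vary far more widely:

```python
import re

# Illustrative patterns; real appointment-request emails vary far more widely.
PATTERNS = {
    "order_id": re.compile(r"Order\s*#?\s*(\w+)", re.IGNORECASE),
    "pickup":   re.compile(r"Pickup:\s*([0-9/]+ [0-9:]+ [AP]M)", re.IGNORECASE),
    "location": re.compile(r"Location:\s*(.+)", re.IGNORECASE),
}

def parse_appointment_request(body: str) -> dict:
    """Extract appointment fields; missing fields map to None for follow-up."""
    return {field: (m.group(1).strip() if (m := pat.search(body)) else None)
            for field, pat in PATTERNS.items()}

email_body = """Hello,
Please schedule Order #A1234.
Pickup: 03/05/2024 10:30 AM
Location: Dock 7, Springfield DC
Thanks"""

fields = parse_appointment_request(email_body)
```

The structured output would then be written into the appropriate TMS by the bot; fields that come back as None fall through to a manual queue.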
Outcomes
  • Bots operating 24/7 have led to over 15,000 monthly appointments being scheduled, resulting in a 50% reduction in manual scheduling hours
  • The productivity of the operations team has improved by 50%, enabling staff to concentrate on high-value tasks rather than manual appointment-booking
  • The increased accuracy in scheduled appointments has significantly decreased detention charges, thereby boosting overall customer satisfaction

Unified automation strategy enhances efficiency by 30%

PROFESSIONAL SERVICES


Automated platform engineering solution improves reliability by 20-50% and lowers costs by 25-50% for a leading payroll and HR solutions provider.

Client
Leading payroll and HR solutions provider
Goal
Develop automation strategy and framework that accommodates growth and ensures efficiency
Tools and Technologies
Ansible, AWS, Dynatrace, Gremlin, Groovy, Jenkins, Keptn, KICS, Python, Terraform
Business Challenge

The SRE (Site Reliability Engineering) shared services team faced a diverse set of needs relating to automation of infrastructure and services provisioning, configuration, and deployment.

The team was encountering resource constraints, as limited in-house expertise in certain automation tools and technologies was causing delays in meeting critical automation requirements. It also needed to ensure system reliability and was challenged to scale automation solutions to accommodate increasing demands as operations grew.

Solution
  • Developed a comprehensive automation strategy aligned with objectives, encompassing Terraform, Ansible, Python, Groovy, and other relevant technologies in the AWS environment
  • Leveraged our expertise to bridge the knowledge gap, provide training, and augment the client team in handling complex automation tasks
  • Implemented a chaos engineering framework using Gremlin, Dynatrace, Keptn, and EDA tools to proactively identify weaknesses and enhance system resilience
  • Created a scalable automation framework that accommodates growing needs and ensures long-term efficiency
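The chaos engineering idea above, inject a fault and then verify a steady-state hypothesis, can be sketched in plain Python; the service stub, injected latency, and SLO threshold are all hypothetical, and the Gremlin/Dynatrace/Keptn wiring is omitted:

```python
import random

def call_service(extra_latency_s: float = 0.0) -> float:
    """Hypothetical service call; returns observed latency in seconds."""
    base = random.uniform(0.01, 0.03)  # stand-in for real I/O latency
    return base + extra_latency_s

def run_chaos_experiment(injected_latency_s: float, slo_s: float,
                         calls: int = 50) -> dict:
    """Inject latency, then check the steady-state hypothesis:
    at least 95% of calls must finish within the SLO."""
    latencies = [call_service(injected_latency_s) for _ in range(calls)]
    within_slo = sum(1 for l in latencies if l <= slo_s) / calls
    return {"within_slo_ratio": within_slo,
            "hypothesis_holds": within_slo >= 0.95}

baseline = run_chaos_experiment(injected_latency_s=0.0, slo_s=0.05)
faulted  = run_chaos_experiment(injected_latency_s=0.2, slo_s=0.05)
```

A failing hypothesis under fault (as in `faulted` here) is the signal that a weakness exists and a self-healing response, such as a retry policy or failover, is needed.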
Outcomes
  • A unified automation strategy that streamlined processes, reduced manual effort, and enhanced overall efficiency by 30%
  • The implementation of chaos engineering and self-healing practices, which increased reliability by 20% to 50%
  • A reduction in manual interventions, along with improved efficiency, expected to yield cost savings of 25%-50%
The evolution of quality engineering


Quality engineering in software development empowers organizations to achieve heightened quality, scalability and resilience.




    Quality Assurance (QA) has long been essential in software engineering, ensuring that products and applications are developed to established standards and metrics. But QA has been reactive, focusing on defect detection through manual and automated testing. With the evolution of software development technology and methodologies, the limitations of traditional QA have become evident. This perspective paper delves into the evolution of Quality Engineering (QE), which has transformed the approach to software quality. QE goes beyond QA and test automation to integrate quality practices throughout the Software Development Lifecycle (SDLC); it also addresses complexities in modern architectures such as microservices and cloud environments.

    The journey from QA to QE is marked by several key milestones. Initially, software testing was a separate phase, conducted after development was complete. With the advent of Agile and DevOps methodologies, the need for continuous testing and early defect detection became apparent. This shift, together with the emerging adoption of cloud-native architectures, embedded quality checks throughout the development cycle and paved the way for what is now called Quality Engineering. Unlike the reactive nature of traditional QA and QA automation, QE represents a proactive, integrated approach across the development lifecycle.

    Organizations can significantly enhance product or application quality, optimize development workflows, and mitigate risks by addressing QE concerns at every phase of the SDLC, leveraging structured QE approaches and a holistic view of quality in modern architectures.

    Read our Perspective Paper for more insights on the evolution of quality engineering in software development.

    Download Perspective Paper





      Quality engineering optimizes a DLT platform

      Banking & Financial Services


      Reliability, availability, scalability, observability, and resilience ensured; release cycles and testing time improve by 75% and 80%, respectively.

      Client
      A leading provider of financial services digitization solutions
      Goal
      Reliability assurance for a distributed ledger technology (DLT) platform
      Tools and Technologies
      Kotlin, Java, HTTP client, AWS, Azure, GCP, G42, OCP, AKS, EKS, Docker, Kubernetes, Helm Chart, Terraform
      Business Challenge

      A leader in blockchain-based digital financial services required assurance for the non-GUI (Graphical User Interface) components of a Distributed Ledger Technology (DLT) platform: its Command Line Interface (CLI), microservices, and Representational State Transfer (REST) APIs. It also required platform reliability assurance on Azure and AWS services (AKS, EKS) to ensure availability, scalability, observability, monitoring, and resilience (disaster recovery). In addition, it wanted to identify capacity recommendations and any performance bottlenecks (whether impacting throughput or individual transaction latency), and it needed comprehensive automation coverage for older and newer product versions, along with management of frequent monthly deliveries of multiple DLT product versions.

      Solution
      • 130+ DApps were developed and enhanced on the existing automation framework for terminal CLI and cluster utilities
      • Quality engineering was streamlined with real-time dashboarding via Grafana and Prometheus
      • Coverage for older and newer versions of the DLT platform was automated for smooth, frequent deliverables and confidence in releases
      • The test case management tool Xray was implemented for transparent automation coverage
      • Utilities were developed to execute a testing suite for AKS, EKS, and local Mac/Windows/Linux cluster environments on a daily or as-needed basis
      Outcomes
      • Automation shortened release cycles from once a month to once a week, and testing time was reduced by 80%
      • Test automation coverage with 2,000 test cases was developed, with a 96% pass rate in daily runs
      • Compatibility was established across AWS EKS, Azure AKS, Mac, Windows, Linux, and local clusters
      • Increased efficiency in deliverables was demonstrated, along with annual savings of $350K in test case management (TCM)
      • An average throughput of 25 complete workflows per second was sustained
      • The 95th-percentile flow-completion time was kept under the 10-second target
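As an aside, a 95th-percentile figure like the one above can be computed from raw flow-completion times with a nearest-rank percentile; the sample data here is illustrative, not the client's:

```python
import math

def percentile(values, pct: float) -> float:
    """Nearest-rank percentile: smallest value with at least pct% of the
    data at or below it."""
    ordered = sorted(values)
    rank = math.ceil(pct / 100 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# Illustrative flow-completion times in seconds (not the client's data).
completion_times = [1.2, 2.5, 3.1, 4.0, 4.8, 5.5, 6.2, 7.0, 8.4, 9.6]

p95 = percentile(completion_times, 95)  # compare against the 10 s target
```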
      Productionizing Generative AI Pilots


      Get scalable solutions and unlock insights from information siloed across an enterprise by automating data extraction, streamlining workflows, and leveraging models.




        Enterprises have vast amounts of unstructured information, such as onboarding documents, contracts, financial statements, customer interaction records, and Confluence pages, with valuable information siloed across formats and systems.

        Generative AI is now starting to unlock new capabilities, with vector databases and Large Language Models (LLMs) tapping into unstructured information using natural language, enabling faster insight generation and decision-making. The advent of LLMs, exemplified by the publicly available ChatGPT, has been a game-changer for information retrieval and contextual question answering. As LLMs evolve, they are no longer limited to text; they are becoming multi-modal, capable of interpreting charts and images. With a large number of offerings, it is very easy to develop Proofs of Concept (PoCs) and pilot applications. However, to derive meaningful value, the PoCs and pilots need to be productionized and delivered at significant scale.

        PoCs and pilots deal with only the tip of the iceberg; productionizing must address much more that does not readily meet the eye. To scale the extraction and indexing of information, we need to establish a pipeline that is ideally event-driven, triggered as new documents become available, possibly through an S3 document store and SQS (Simple Queue Service). The pipeline parses documents for metadata, chunks them, creates vector embeddings, and persists the metadata and embeddings to suitable stores. It also needs logging, exception handling, notification, and automated retries when issues occur.
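A minimal sketch of that event-driven loop, with an in-memory list standing in for SQS and a stub in place of a real embedding model; the chunk sizes, retry count, and function names are assumptions:

```python
# Sketch of an event-driven ingestion loop: parse -> chunk -> embed -> persist,
# with logging and retries. An in-memory list stands in for SQS, and the
# embedding step is a stub; sizes and names here are illustrative.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

MAX_RETRIES = 3

def chunk_text(text: str, size: int = 200, overlap: int = 50) -> list:
    """Split a document into overlapping chunks for embedding."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

def embed(chunk: str) -> list:
    """Stub embedding; a real pipeline would call an embedding model."""
    return [float(len(chunk))]

def process_event(event: dict, vector_store: list) -> None:
    """Handle one 'new document' event, retrying on failure."""
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            text = event["body"]  # stand-in for fetching the S3 object
            for chunk in chunk_text(text):
                vector_store.append({"doc": event["key"], "vec": embed(chunk)})
            return
        except Exception:
            log.exception("attempt %d failed for %s", attempt, event.get("key"))
    log.error("giving up on %s after %d attempts", event.get("key"), MAX_RETRIES)

queue = [{"key": "contracts/c-001.pdf", "body": "lorem ipsum " * 100}]
store = []
while queue:
    process_event(queue.pop(0), store)
```

In a real deployment the queue receive, dead-lettering, and notification would come from the messaging service rather than application code.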

        While developing pilot applications using Generative AI is easy, teams need to carefully work through a number of additional considerations to take these applications to production, scale the volume of documents and the user base, and deliver full value. This is easier to do across multiple RAG (Retrieval-Augmented Generation) applications when conventional NLP (Natural Language Processing) and classification techniques direct user requests to the appropriate RAG pipeline for each query type. Implementing the capabilities required to productionize Generative AI applications using LLMs in a phased manner ensures that value can scale as the overall solution architecture and infrastructure are enhanced.
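That routing idea can be sketched with a simple keyword classifier standing in for a real NLP model; the pipeline names and keywords are illustrative:

```python
# Illustrative routing of user queries to per-domain RAG pipelines.
ROUTES = {
    "contracts": {"contract", "clause", "renewal", "termination"},
    "finance":   {"revenue", "invoice", "balance", "statement"},
}
DEFAULT_ROUTE = "general"

def route_query(query: str) -> str:
    """Pick the pipeline whose keywords overlap the query the most."""
    words = set(query.lower().split())
    best, best_hits = DEFAULT_ROUTE, 0
    for pipeline, keywords in ROUTES.items():
        hits = len(words & keywords)
        if hits > best_hits:
            best, best_hits = pipeline, hits
    return best
```

A production router would typically use a trained classifier or embedding similarity instead of keyword overlap, but the dispatch structure is the same.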

        Read our perspective paper for more insights on Productionizing Generative AI Pilots.

        Download Perspective Paper




          How Gen AI Can Transform Software Engineering


          Unlocking efficiency across the software development lifecycle, enabling faster delivery and higher quality outputs.




            Generative AI has enormous potential for business use cases, and its application to software engineering is equally promising.

            In our experience, development activities, including automated test and deployment scripts, account for only 30-50% of the time and effort spent across the software engineering lifecycle. Within that, only a fraction of the time and effort is spent in actual coding. Hence, to realize the true promise of Generative AI in software engineering, we need to look across the entire lifecycle.

            A typical software engineering lifecycle involves a number of different personas (Product Owner, Business Analyst, Architect, Quality Assurance/ Tech Leads, Developer, Quality/ DevSecOps/ Platform Engineers), each using their own tools and producing a distinct set of artifacts. Integrating these different tools through a combination of Gen AI software engineering extensions and services will help streamline the flow of artifacts through the lifecycle, formalize the hand-off reviews, enable automated derivation of initial versions of related artifacts, etc.

            As an art-of-the-possible exercise, we developed extensions (for VS Code IDE and Chrome Browser at this time) incorporating the above considerations. Our early experimentation suggests that Generative AI has the potential to enable more complete and consistent artifacts. This results in higher quality, productivity and agility, reducing churn and cycle time, across parts of the software engineering lifecycle that AI coding assistants do not currently address.

            Complementary approaches that automate repetitive activities through smart templating, combining Generative AI with traditional artifact generation and completion techniques, can save time, let the team focus on higher-value activities, and improve overall satisfaction. However, there are key considerations for doing this at scale across many teams and team members. To enable teams to become high-performing, the Gen AI software engineering extensions and services need to provide capabilities for standardizing and templatizing common solution patterns (archetypes) and for formalizing the definition and automation of steps of doneness for each artifact type.

            Read our perspective paper for more insights on How Gen AI Can Transform Software Engineering through streamlined processes, automated tasks, and augmented collaboration, bringing faster, higher-quality software delivery.

            Download Perspective Paper





              Gen AI interface enhances API productivity and UX

              Transportation & Logistics


              Integrating Generative AI technology and developer portal reduces logistics provider’s API onboarding to 1-2 days.

              Client
              Leading logistics services provider
              Goal
              Improve API functionality and developer team’s productivity and user experience
              Tools and Technologies
              OpenAI (GPT-3.5 and GPT-4 Turbo LLMs), AWS Lambda, Streamlit, Python, Apigee
              Business Challenge

              A leading logistics provider offers an API Developer Portal as a central hub for managing APIs, enabling collaboration, documentation, and integration efforts, but faces limitations, including:

              • Difficulty comprehending schemas, necessitating continued reliance on developers
              • No means to individually search for API operations on the API Developer Portal
              • Difficulties keeping track of changes in newly-released API versions
              • Potential week-long delays as business analysts or product owners must engage developers to check if existing APIs can support new website functionalities
              Solution

              Integrating Gen AI technology with API, we provided a user-friendly chat interface for business users. Features include:

              • Conversational interface for API interaction, eliminating the need for technical expertise to interact directly with APIs
              • Search mechanism for API operations, query parameters, and request attributes
              • Version comparison and tailored response generation
              • Backend API execution according to user query needs
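The version-comparison feature can be illustrated by diffing the operation sets of two API versions; the operation names below are made up for the example:

```python
def compare_versions(old_ops: set, new_ops: set) -> dict:
    """Summarize operations added, removed, and unchanged between versions."""
    return {
        "added":     sorted(new_ops - old_ops),
        "removed":   sorted(old_ops - new_ops),
        "unchanged": sorted(old_ops & new_ops),
    }

# Hypothetical operation inventories for two portal API versions.
v1 = {"getShipment", "createShipment", "cancelShipment"}
v2 = {"getShipment", "createShipment", "trackShipment"}

diff = compare_versions(v1, v2)
```

In the actual interface, a summary like this would be rendered as a tailored natural-language response rather than raw sets.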
              Outcomes
              • Business users are now empowered with a chat-based interface for querying API details
              • Users can seamlessly explore APIs, streamlining collaboration with the API team and reducing onboarding time to one or two days, ultimately enhancing the customer experience for all stakeholders
              • Developer productivity improved with the AI-powered tools in the API Developer Portal
              • Functionality is enhanced from the version comparison, individual API operation search, and tailored responses

              Real-world asset tokenization can transform financial markets

              Integration with Distributed Ledger Technologies is critical to realizing the full potential of tokenization.




                The global financial markets create and deal in multiple asset classes, including equities, bonds, forex, derivatives, and real estate investments, each constituting a multi-trillion-dollar market. These traditional markets encounter numerous challenges in time and cost that impede accessibility, fund liquidity, and operational efficiency. Consequently, the expected free flow of capital is hindered, leading to fragmented, and occasionally limited, inclusion of investors.

                In response to these challenges, today's financial services industry seeks to explore innovative avenues, leveraging advancements such as Distributed Ledger Technology (DLT). Using DLTs, it is feasible to tokenize assets, thus enabling issuance, trading, servicing and settlement digitally, not just in whole units, but also in fractions.

                Asset tokenization is the process of representing the unique properties of a real-world asset, including ownership and rights, on a Distributed Ledger Technology (DLT) platform. Digital and physical real-world assets, such as real estate, stocks, bonds, and commodities, are depicted by tokens with distinctive symbols and cryptographic features. These tokens exhibit specific behavior as part of an executable program on a blockchain.
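As a toy illustration of fractional ownership (not modeled on any specific DLT platform), a token's ledger can be reduced to balances that transfer in fractions:

```python
from dataclasses import dataclass, field

@dataclass
class AssetToken:
    """Toy ledger for one tokenized asset; total supply is 1.0 (100%)."""
    symbol: str
    balances: dict = field(default_factory=dict)

    def issue(self, owner: str, fraction: float) -> None:
        """Mint a fraction of the asset to an owner, capped at total supply."""
        assert sum(self.balances.values()) + fraction <= 1.0 + 1e-9
        self.balances[owner] = self.balances.get(owner, 0.0) + fraction

    def transfer(self, sender: str, receiver: str, fraction: float) -> None:
        """Move a fraction between owners, checking the sender's balance."""
        assert self.balances.get(sender, 0.0) >= fraction, "insufficient balance"
        self.balances[sender] -= fraction
        self.balances[receiver] = self.balances.get(receiver, 0.0) + fraction

# One property issued as fractions to two investors, then partially resold.
building = AssetToken("BLDG-42")
building.issue("alice", 0.6)
building.issue("bob", 0.4)
building.transfer("alice", "carol", 0.25)
```

On a real DLT platform this logic would live in a smart contract, with cryptographic signatures replacing the bare assertions.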

                Many sectors, especially financial institutions, have started recognizing the benefits of tokenization and begun to explore the technology. These benefits include fractional ownership, increased liquidity, efficient transfer of ownership, ownership representation, and programmability.

                With the recent surge in the adoption of tokenization, a diverse array of platforms has emerged, paving the way for broader success, but at the same time creating fragmented islands of ledgers and related assets. As capabilities mature and adoption grows, interconnectivity and interoperability across ledgers representing different institutions issuing/servicing different assets could improve, creating a better integrated market landscape. This would be critical to realizing the promise of asset tokenization using DLT.

                Read our Perspective Paper for more insights on asset tokenization and its potential to overcome the challenges, the underlying technology, successful use cases, and issues associated with implementation.

                Download Perspective Paper



