
Best Practices in Data Ingestion

Organizations are generating enormous volumes of data every second. These data points serve as the backbone of major business transformation initiatives, helping derive deeper insights that enable superior customer experiences, improved products and personalized offerings. Yet, for data to be useful, it must be accessible, clean and abundant. Data ingestion (the process of extracting, transforming, and loading data) plays an important role in making data available for timely analysis.

In an article for CMSWire, Moshe Kranc, Chief Technology Officer, Ness Digital Engineering, shares some of the essential best practices that organizations must consider in the data ingestion process. “Today, data has gotten too large, both in size and variety, to be curated manually. You need to develop tools that automate the ingestion process wherever possible. For example, rather than manually defining a table’s metadata, e.g., its schema or rules about minimum and maximum valid values, a user should be able to define this information in a spreadsheet, which is then read by a tool that enforces the specified metadata,” notes Moshe.
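The spreadsheet-driven approach Moshe describes can be sketched in a few lines of Python. This is an illustrative toy, not a tool from the article: the column names and the min/max rule format are invented for the example.

```python
import csv
import io

def load_rules(rules_csv: str) -> dict:
    """Parse a rules 'spreadsheet' (CSV) into {column: (min, max)}."""
    rules = {}
    for row in csv.DictReader(io.StringIO(rules_csv)):
        rules[row["column"]] = (float(row["min"]), float(row["max"]))
    return rules

def validate(records: list[dict], rules: dict) -> list[str]:
    """Return violation messages for values outside the declared ranges."""
    violations = []
    for i, rec in enumerate(records):
        for col, (lo, hi) in rules.items():
            value = float(rec[col])
            if not lo <= value <= hi:
                violations.append(f"row {i}: {col}={value} outside [{lo}, {hi}]")
    return violations

# A user defines metadata in a spreadsheet; the tool enforces it.
rules = load_rules("column,min,max\nage,0,120\nscore,0,100\n")
records = [{"age": "34", "score": "88"}, {"age": "150", "score": "97"}]
print(validate(records, rules))  # flags the out-of-range age in row 1
```

The point is that the enforcement logic is generic: adding a new rule means editing the spreadsheet, not the code.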

Read the full article

Digital Technologies Are Enabling Insurers To Make Their ‘Personalization’ Dream a Reality

The impact of digital technologies has percolated through to the most traditional sectors and insurance is most definitely one of those. Traditionally held back by slow-paced technology improvements and few intelligent customer touchpoints, the insurance industry is in the midst of a major technology transformation that is changing every aspect of the insurance model: from the way products are designed and delivered, to how customer interactions are managed, and now radical changes in the insurance business model itself.

The emergence and apparent convergence of new and evolving digital technologies (like sensors and monitoring systems, telematics, wearables and AI) are enabling insurers to provide high-touch, on-demand personalized experiences to a new generation of insurance customers. Insurers now have new and more effective ways to measure, control and price risk, so they can build innovative, targeted offerings and form seamless, deeper customer interactions.

For a while now, auto-insurers have been sponsoring the retrofit of cars with data-capturing devices and are even partnering with original equipment manufacturers to gain greater intelligence on driving behavior and offer highly personalized premiums. With access to this data, insurers also have greater monetization opportunities: they can offer value-added services ranging from personalized driving recommendations to helping track stolen vehicles or alerting users ahead of potential breakdowns.

In this blog, we look at how digital changes are impacting different aspects of the insurance industry to benefit both insurers and their customers.

High-Touch Customer Experience

Digital technologies and data analytics are helping insurers know their customers better. Products can be priced more effectively, and fraudulent claims can be identified with greater accuracy. Armed with insights into customer behavior, insurers can plan for long-term growth and offer tailored, personalized products. For example, data showing that a customer has put his house up for sale can prompt the insurer to proactively suggest coverage options for a new home.

With valuable customer data from connected devices, digital records and social media, insurance companies can offer their customers real-time, on-demand services. Health insurers like AXA are now utilizing data from wearables and fitness apps to proactively track customers and help manage their health conditions and insurance profiles. Insurers can also leverage insights from omni-channel user behavior to personalize products and services to the customer’s needs.

Data analysis now provides a 360-degree view of the customer’s preferences and risk levels, which creates opportunities to cross-sell services to customers who hold policies with different divisions of the same company. That can be achieved with a personalized user experience (landing pages, priority of search results) based on insights from that customer’s data. These capabilities require digital excellence and a consistent experience across multiple channels and intelligent platforms. The opportunity to differentiate with great customer experience is huge, because consumers are attracted to offerings they find more in tune with their unique situation.

Product Development – Usage-Based Insurance (UBI)

Automobile insurance premiums have traditionally been driven by parameters that segment customers into groups based on age, education level, past accident history and so on. However, digital technologies are enabling customization of premiums based on much more current factors, like driving behavior: a telematics device in the car reports usage data to the insurer, which feeds it into the premium calculation.

This helps the insurance company incentivize safe driving by giving customers feedback on their driving skills and areas for improvement, and it should ultimately reduce the number of accidents and payouts.

Large Property & Casualty insurance carriers such as State Farm, Liberty Mutual, Allstate and Progressive have all now launched UBI products.
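As a toy illustration of the idea (not any insurer's actual formula), a usage-based premium might scale a base rate by simple risk factors derived from telematics data; the factor names and weights here are invented:

```python
def ubi_premium(base_premium: float, miles: float,
                hard_brakes_per_100mi: float, night_fraction: float) -> float:
    """Toy usage-based premium: start from a base rate and scale it
    by simple risk factors derived from telematics data."""
    risk = 1.0
    risk += 0.02 * hard_brakes_per_100mi   # harsh braking raises risk
    risk += 0.30 * night_fraction          # night driving raises risk
    risk *= min(miles / 12000, 1.5)        # scales with mileage, capped
    return round(base_premium * risk, 2)

# A low-mileage, smooth driver pays less than a high-mileage, harsh one.
safe = ubi_premium(1000, miles=6000, hard_brakes_per_100mi=1, night_fraction=0.05)
risky = ubi_premium(1000, miles=18000, hard_brakes_per_100mi=8, night_fraction=0.40)
print(safe, risky)  # 517.5 1920.0
```

The mechanism, not the numbers, is the point: premiums respond to how the customer actually drives rather than to the demographic group they belong to.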

Process Transformation

Fraud is one of the major challenges faced by insurance providers.

Insurance companies in the United States suffer estimated losses of 80 billion USD yearly, with fraud comprising about 10 percent of property-casualty insurance losses and loss adjustment expenses each year, amounting to 34 billion USD. Using data and insights from diverse sources such as past claim history and social media feeds, and applying big data analytics techniques, insurance companies are now better poised to identify fraudulent patterns, behavior and claims.

Let’s consider a simple example: a claim for the loss of a property and its contents because of fire. CCTV footage helps claim investigators confirm that the valuables the claimant listed (like ten brand-new 27-inch iMacs) were never inside the office. Additionally, the claimant had made two automobile claims in the recent past, and a social media post (made after the claim for loss of property) shows him gifting valuables to a relative. Taken together, this data points towards a high probability of a fraudulent claim, or at least one worth investigating further.

This kind of analysis would likely be missed by a fraud investigator following a manual process, but it can be done efficiently by a big data system. The use of artificial intelligence in fraud analytics is another exciting development with the potential for higher levels of fraud detection. Blockchain technology promises speed, efficiency and transparency benefits when searching for fraudulent activity, by independently verifying the authenticity of customers, policies and claims against a decentralized digital repository.
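A minimal sketch of how such signals might be aggregated; the indicator names and weights below are invented for illustration and are not taken from any real fraud system:

```python
def fraud_score(signals: dict) -> float:
    """Combine weighted binary fraud indicators into a score between 0 and 1."""
    weights = {
        "items_missing_on_cctv": 0.40,   # claimed valuables absent from footage
        "recent_prior_claims": 0.25,     # multiple claims in the recent past
        "suspicious_social_post": 0.25,  # e.g. gifting valuables after the loss
        "claim_soon_after_policy": 0.10,
    }
    return sum(w for key, w in weights.items() if signals.get(key))

claim = {
    "items_missing_on_cctv": True,
    "recent_prior_claims": True,
    "suspicious_social_post": True,
}
score = fraud_score(claim)
print(round(score, 2))  # 0.9, high enough to route to investigators
```

A real system would learn such weights from labeled claims rather than hard-code them, but the shape is the same: many weak signals aggregated into one triage score.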

Transformation of Business Model: Peer to Peer Insurance

“We work with a hypothesis- and test-driven approach to make sure we are meeting our customers’ needs with our product.” These are the words of the CEO of Friendsurance, the company credited with pioneering the peer-to-peer insurance model, bringing the two different worlds of insurance and social media together to improve efficiency and the experience for both the consumer and the insurer.



Companies like Friendsurance, Lemonade, Jointly and Guevara are disrupting the business model of the insurance industry. In Friendsurance’s model, a group of customers with the same type of insurance connect with each other. If no claims are made by the customers in the group, they receive a pre-agreed maximum payback. In case of any claims, the payback is reduced accordingly. The benefit is that group performance incentivizes customers to avoid unnecessary claims and they are able to secure a discount for the insurance premium based on their collective volume buying power.
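The payback arithmetic can be sketched as follows; this is a deliberately simplified illustration, not Friendsurance's actual formula:

```python
def group_payback(max_payback_per_member: float, members: int,
                  total_claims: float) -> float:
    """Peer-to-peer cashback: the group's pooled payback shrinks by the
    claims its members made, never going below zero."""
    pool = max_payback_per_member * members
    return max(pool - total_claims, 0.0) / members

# No claims: everyone gets the full pre-agreed payback.
print(group_payback(100.0, members=10, total_claims=0.0))    # 100.0
# Claims reduce the payback shared by the whole group.
print(group_payback(100.0, members=10, total_claims=400.0))  # 60.0
```

Because every claim reduces everyone's payback, each member has a direct financial incentive to avoid small, unnecessary claims.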

The above scenarios are examples of a new direction for insurers. As newer technologies emerge, the possibilities for digital disruption, efficiencies and new products are truly endless. Insurance companies that aspire to stay ahead of the game should invest in beefing up their digital prowess, building the right platforms and solutions to interact with their customers seamlessly across channels, deliver personalized products on demand, and add new revenue streams by being better positioned to capitalize on emerging business opportunities.

Ness works with leading organizations in the insurance sector to help them build innovative platforms and solutions to leverage the expanding opportunities in the fast-evolving digital insurance landscape.

To learn more about how we can help transform your business, contact us.

Important Considerations for Enterprises in Leading the Technology Change

Advanced technology implementations are at the center of the vast number of enterprise transformation stories taking shape today. Organizations want to innovate at a rapid pace, and in the race to create a differentiating advantage using the latest technologies, they often fail to assess the real impact of these technologies and how they play out together to deliver the much-desired transformational impact. Technologies like Artificial Intelligence, DevOps and Robotic Process Automation hold a lot of value for organizations if applied within the right context and with due consideration of their possible downsides.

In an interaction with Analytics India, Moshe Kranc, Ness’s Chief Technology Officer, shares valuable insights on the benefits and challenges in deploying an RPA solution, the importance of DevOps in driving digital disruption, and why organizations need to champion the open data movement. “RPA took off in 2017 because it promises short term ROI for a very modest initial cost. But it often requires more time than anticipated to understand the business process, because it requires deep domain knowledge that human employees may be hesitant to share for fear of “bots” taking over some of their responsibilities,” notes Moshe.

Read full article


HR Organizations Must Focus on Providing Improved Candidate Experience & Continuous Learning

In an interaction with India in Making, Dr. Christina Augustine, HR Head, Ness Digital Engineering, breaks down the considerations for HR organizations in developing and nurturing next-generation talent that can pave the way for accelerated digital transformation. She talks about the intense competition in the technology space and how a progressive digital engineering company like Ness addresses fast-changing industry needs by investing in the right talent through an innovative, employee-focused culture. She also discusses the importance of engaging people through internal events focused on driving innovation, and of learning opportunities beyond the regular line of work that add value to individual profiles, the organization and its customers.

How do you ensure that you build a powerhouse of technology talent that is ready to take on the challenges of a digital future? Read the excerpt from Christina’s interview to learn more.

Q. In an extremely competitive and dynamic technology industry, how do you attract the right talent and offer more value to your employees?

Christina: It is a very competitive market, which makes it important for organizations to find effective ways to engage with the right talent. One of the ways we differentiate ourselves is by offering a high-quality candidate experience supported by our pre-hire engagement activities, which include connecting with candidates through multiple entry points and in different ways to ensure that we find people with the right cultural fit.

Another important differentiator for us is that we focus on providing not just jobs but careers. We are firm believers in growing leaders from within, and we have employees with long tenures because we provide multiple opportunities for career development. For instance, someone who joins us as a software engineer can explore multiple roles, such as business analyst, architect or project manager, and can choose from the numerous opportunities that we open up for our employees at different stages of their careers.

Watch the interesting video where Christina elaborates more on these points.

Technology Trends for 2018: Ness’s CTO Associates Weigh In

The CTO (Chief Technology Office) Associates is a forum of Ness’s technical stars across all Ness Innovation Centers, who are available to tackle global, technology-focused challenges outside their regular day job. This team regularly solves insanely hard technical problems and provides subject matter expertise for Ness customers; they write technical articles that promote thought leadership, develop accelerators for nascent strategic technologies and contribute to Open Source projects. They help drive a culture of technical excellence within the Ness engineering ecosystem, and represent a valuable resource for our customers: access to Ness’s collective brain power.

We asked the CTO Associates to anticipate the hot technical trends for the coming year.

Here are some of their thoughts on the Technology Trends for 2018:

Daniel Nanut, Senior Software Engineer

  • I think that Kotlin is going to see the biggest growth among programming languages next year. Why?
    • It’s much simpler than Java.
    • Full interoperability with Java.
    • Fast compile times (having worked with Scala for the last few years, I can really appreciate this).
    • It is officially supported by Google for developing Android apps. (Considering the legal tussle between Google and Oracle over the Java language, it would not surprise me if Google switches completely to Kotlin.)
    • It can target JavaScript (you can have a React app written in Kotlin) and native code (this part landed recently and still needs to be test-driven).
  • I think that the next language to watch is TypeScript.
    • For large projects it is the sane way to tackle complexity.
    • More and more open source projects are using it instead of plain JavaScript (like Angular).
    • Again, it brings sanity to the backend if you use NodeJS.
    • It is well supported by Microsoft.
  • Another language that will continue to grow is Python, because of the intense interest surrounding machine learning, deep learning and investments in AI (Artificial Intelligence).

Ilie Halip, Architect

I see Rust becoming an important language. It has quickly become a major systems programming language because it can guarantee safe memory access at runtime by doing compile-time analysis. Some of the important problems it claims to detect at compile-time are: memory leaks, double frees, invalid pointers and even data races between threads. I expect a growing adoption for Rust in 2018, even in the embedded world, due to its supposed speed and efficiency improvements and the fact that it can easily interface with the C language commonly found in embedded systems.

Iulian Dogariu, Principal Engineer

  • Serverless is gaining traction quickly. Serverless apps will show up more often, especially in the long tail of apps that see infrequent use or low data volumes.
  • Backend-as-a-service products will see increased adoption (e.g., Auth0, Google Firebase), as companies look to shift an internal overhead to a reliable and easy-to-budget external partner. We will see the emergence of new kinds of backend-as-a-service products, with AWS leading the pack. But, a standard for functions-as-a-service will not be established in the coming year, so concerns about the downsides of vendor lock-in will not yet disappear in 2018.
  • Native apps are losing ground to progressive web apps, because native apps are more expensive to develop and to put in the hands of consumers. And mobile browsers are constantly improving, which means progressive web apps can showcase more complex and useful features. We will probably see the tipping point in the second half of 2018, when Apple is expected to reveal iOS version “12” with service workers added to Safari, a massive boost for progressive web apps, but perhaps a challenge to Apple’s App Store revenues.
  • React will make gains at the expense of other UI frameworks. React’s 2017 relicensing to MIT removed a big non-technical hurdle to its adoption, and many more adopters will now follow. The emergence of React Native makes React a contender to become the de facto standard for building rich UIs for both web and native experiences.

Daniel Masarik, Lead-Development

  • Kubernetes (and related tools) has become the default ecosystem for containers. Just a few weeks ago at the AWS re:Invent conference, Amazon announced it has joined Google and Microsoft on the Kubernetes bandwagon, with the introduction of AWS Elastic Container Service for Kubernetes.
  • Rust has also recently made its way into Firefox via the Firefox Quantum project, an ongoing effort to rewrite parts of Firefox in Rust. The first Quantum release (57.0) shipped at the start of December 2017, and the speed gains are significant.

Sagar Mahapatro, Lead-Development

I think one of the hottest and most disruptive technology trends in 2018 is going to be blockchain. It has already captured the imaginations of tech enthusiasts as the underlying technology of Bitcoin and hundreds of other “altcoin” (alternative coin) cryptocurrencies, like Litecoin and Dogecoin. 2018 is going to be the year when blockchain applications will go beyond currency, finance, and markets, and make inroads into other areas like government, enterprise, health care, science and IoT security.

Vishnudas Lokhande, Senior Software Engineer

My predictions for OpenSource would be:

  • Open source ForgeRock products (Access Management, Identity Management, Directory Services and Identity Gateway) will gain in popularity for authentication, authorization, federation, API security and the Internet of Things.
  • Open source WSO2 products such as API Management, Integration (service bus), IoT and Analytics will develop a growing audience.

With their deep technical knowledge and real-world experiences working on a diverse range of forward-looking technology projects for customers across sectors, the Ness CTO team is not afraid to make bold predictions in an ever-evolving enterprise technology landscape. We’ll check back in a year to see how skilled this team is at predicting the future.

Data Ingestion Best Practices

In a new article for CMSWire, Moshe Kranc, chief technology officer of Ness Digital Engineering, discusses data ingestion best practices, i.e., preparing data for analysis, from a transform (cleansing and normalizing data) perspective. These include expecting difficulties and planning accordingly, automating data ingestion, using artificial intelligence (AI), making data ingestion self-service, governing the data to keep it clean, and advertising your cleansed data. Taken together, these best practices can be the difference between the success and failure of your data ingestion projects.

Read more »

Artificial Intelligence Is Not Just Hype – It’s Real

In an interaction with Analytics India, Moshe Kranc, Ness’s Chief Technology Officer, shares valuable insights on the benefits and challenges in deploying an RPA solution, the importance of DevOps in driving digital disruption, and why organizations need to champion the open data movement. “RPA took off in 2017 because it promises short-term ROI for a very modest initial cost. But it often requires more time than anticipated to understand the business process, because it requires deep domain knowledge that human employees may be hesitant to share for fear of ‘bots’ taking over some of their responsibilities,” Moshe adds.

Read more »

Robotic Process Automation: What Could Go Wrong?

Robotic Process Automation (RPA) has stormed to the forefront of enterprise technical roadmaps in a remarkably short time. Every company seems to have a dedicated team looking for routine human processes, like copying and pasting data from spreadsheets, that can be done more accurately and less expensively by software robots. The idea has been around for years, embodied in products like web crawlers and automated quality test systems, but it has recently gone mainstream, with dozens of products and hundreds of vendors.

To understand the appeal of RPA, let’s compare it to its less popular cousin, Business Process Management (BPM). A typical BPM project seeks to standardize the workflow of a business process, such as handling a customer complaint, from start to finish. Some of the stages may be handled by software, while others may require human intervention, e.g., getting approval from a manager. At every stage, the BPM process decides what the next step should be based on the result of the previous step. As you can imagine, such a process can take a long time to analyze and define within a given BPM tool, and even more time to debug all the possible states and paths along the way. By contrast, an RPA project has a far more modest goal: to automate one single step in a process, so that a software robot can replace the human being who currently performs that step manually. In some cases, this may require some Artificial Intelligence (AI), to simulate the decisions a human would make in performing that step. But RPA projects are far shorter than BPM projects and provide a much quicker return on investment.
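To make the contrast concrete, here is a toy sketch of the kind of single step an RPA robot might automate: lifting rows from a source spreadsheet (a CSV here, for simplicity), normalizing a field a clerk would otherwise clean by hand, and emitting them in a target layout. The field names and formats are invented for the example.

```python
import csv
import io

def copy_step(source_csv: str) -> str:
    """One automatable step: lift rows from a source 'spreadsheet',
    normalize a field, and emit them in the target system's layout."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["customer", "amount_usd"])
    for row in csv.DictReader(io.StringIO(source_csv)):
        # The normalization a clerk would do by hand: strip currency marks.
        amount = row["Amount"].replace("$", "").replace(",", "")
        writer.writerow([row["Customer"].strip(), amount])
    return out.getvalue()

src = 'Customer,Amount\n Acme Corp ,"$1,250"\nGlobex,$900\n'
print(copy_step(src))
```

Nothing here models the end-to-end workflow, and that is exactly the point: the robot owns one step, and the surrounding process stays as it is.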

Now that RPA has gone mainstream and has had (many) successes and (a few) failures, it’s time to step back and try to understand what can go wrong in an RPA project. Here are some anti-patterns and danger signs:

  • Poor understanding of the business process being automated: If you don’t fully understand the current process, you cannot automate it, so don’t cut corners on the Business Analysis (BA) phase.
  • Changing the business process as you are automating it: The temptation is huge to make a few improvements to the process as you are automating it, but the result is a moving target that may never converge into a working robot. And, how can you be sure your untested improvement is actually a better process?
  • Lack of cooperation from humans whose jobs may be automated and made redundant by RPA: It’s hard to automate a process if your only sources of information about the process have decided you are the enemy.
  • Creating a fully-automated process with no kill switch: A software robot must be smart enough to know when it has to give control back to human beings. Take the example of automated stock trading – on more than one occasion, software robots have all arrived at the same conclusion that it is time to sell and caused the stock market to melt down. Or, consider SkyNet from the “Terminator” movies.
  • Trying to implement Artificial Intelligence from the start, before you fully understand all the possible states and error conditions: You are better off automating a subset of the process that does not require AI, then expanding the automation to include successively smarter AI, as you learn more about the robot’s real-world behavior.
  • Eliminating all human understanding of the automated process once the automation is complete: Processes change over time. When you need to modify the automated process in six months, you will need organizational knowledge of the process and how to change it. Otherwise, the change will require an archaeologist rather than an architect.
  • Neglecting to make the robot enterprise grade: Just because the robot runs doesn’t mean you are done. How will you ensure the reliability and scalability of the robot? How will you deploy new versions? How will you track whether the robot completed successfully or which version was invoked? How will you orchestrate the execution of a team of robots that need to run together to accomplish a business task? You’ll need answers for all these questions in order to deploy an enterprise-grade robot.

By avoiding these traps, you will greatly increase your RPA project’s chances of success.