
GenAI & the Midas Touch: Overcoming the Ambiguity of Natural Language


With all of its promise, Generative AI calls upon us to be careful what we wish for.


We all know the story of King Midas: though already extremely wealthy, when granted his fondest wish he asked that everything he touched turn to gold. Initially, he was delighted with this gift, but then he realized that he could not eat or drink because his food turned to gold as soon as it touched his hands or lips.

Finally, when his precious daughter jumped into his arms and instantly became a golden statue, he realized that what seemed like a great blessing was, in many ways, a curse. He was overjoyed when his wish was revoked, and his daughter (and presumably his food) were restored to their normal state.

The moral of the story: Be careful what you wish for.

In many ways, we face the same situation as King Midas in the age of GenAI. The promise of this technology is that many of our wishes, once expressed in the form of a “prompt,” can be instantly answered. Today, those wishes include unparalleled access to and summarization of information, the creation of custom textual and graphic content, and even the automatic generation of software code.

It’s also not too far-fetched that in the near future, those same verbal instructions will begin to shape the physical world around us, as in the case of King Midas. This is not speculative; even today, GenAI is being used to take verbal instructions and generate the code needed to control industrial robots, which, in turn, build products. In the immediate future, this GenAI-driven fabrication will be done using robots, 3D printers, and CNC machines housed in factories. However, it is not much of a stretch to think that in the near future, manufacturing will be “democratized” and begin to move into our homes or shops, as well. We simply describe the clothes or other items we want, and our machines produce them.

Generative AI Systems Will Give Us What We Ask For

While these generative systems are near-miraculous in many ways, the problem is that, like King Midas, they will give us what we ask for. They will generate the code we tell them to produce, the story we tell them to write, the picture we tell them to draw, the goods we ask them to produce, and so on. 

Further in the future, this may not be so. Our systems will surely grow more adept at reading our intent rather than following our literal instructions. (In software design circles, this used to be called “DWIM,” for “Do What I Mean.”) However, for now – and for at least the next few years – our increasingly smart systems will largely give us what we ask for rather than intuiting what we really want.

The reason disciplines like engineering, law, and others exist in the first place is that experience has shown that it’s very challenging to describe specifically what you want. 

Recommended reading: Prompt Engineering for Generative AI Defined

In the case of King Midas, the unforeseen consequences of his ambiguous verbal ‘prompt’ are obvious from the fable. But consider what happens when we, as laymen, try to write a complex Excel spreadsheet or draft a legal agreement: we rarely get it right the first time. Indeed, it would be very surprising if a computer – or another user – understood your spreadsheet exactly as you intended on the first attempt, or if another reader of your contract interpreted every provision the way you meant it.

Those variances often revolve around how challenging it is to unambiguously express what we want and expect, either to a machine or another person. AIs, at least in the short run, will face the same challenges that we humans do in figuring out the user’s intent.

Precision, Refinement & Specificity are Needed More Than Ever in a GenAI World

In software, through the years, tools and approaches like IFTTT (“If This, Then That”), Low Code / No Code, graphical programming, and others have attempted to make it easier for humans to describe what they want a software program to do. These were successful in small-scale systems and restricted domains but had limited success in the general-purpose complex engineering space.

Consequently, they were not generally used to build more complicated systems. This is primarily because the simplified language these paradigms supported was not expressive enough for the person using them to ‘communicate’ complex ideas to the computer as clearly and as concisely as they could by using a computer language. Classic computer languages like C, C#, and Java are made for great specificity when communicating with a computing device. 

The downside is that these precise ‘programming’ languages require much study and rigor. Still, with effort, they can be made to communicate the programmer/author’s intent quite clearly, which is why they are widely used in complex systems.

Another challenge faced by the Low Code / No Code paradigm and graphical programming tools will also, I predict, be faced by GenAI: these were positioned as ‘democratizing’ software development. In other words, these and other paradigms are seen as more accessible to a wider pool of ‘developers,’ including those who do not know conventional programming languages. And these claims are, indeed, true. In the areas where they are applicable, it’s easier to author a simple low-code system than one in, say, C#. This means you can use less skilled – and therefore lower cost – people to create your system, or you can even do it yourself as a non-programmer. 

Recommended reading: Using AI to Maximize Business Potential: A Guide to Artificial Intelligence for Non-Technical Professionals

However, except in certain niche areas, precision of thought and expression is still required. Otherwise, users tend to find that even though programs are easier to create, they do not deliver the expected or desired results without significant additional effort and refinement.

Natural language prompts seem, on the surface, to take the ‘democratization’ of programming to the extreme. Since nearly all of us learn to talk as young children (and to write not long after that), it seems that with GenAI, we can all be programmers. This is certainly true for non-critical questions – we don’t need to be a programmer or a data scientist to get surprisingly good answers from any of the popular public GenAI systems.

The problem, as King Midas found, is that while natural language is easy to use, it also carries inherent ambiguity. This ambiguity is built into the language due to many factors, including the fact that natural languages evolve organically over many generations, resulting in layer upon layer of meaning and nuance being attached to the same word or phrase. 

Also, because natural language is a ‘human’ language, the speaker inherently assumes that the listener is a person who has ‘common sense’ and shares her or his human experiences, and will therefore automatically avoid the negative interpretations of a literal statement – King Midas’s issue. The social function of language also makes it intentionally imprecise at times, to avoid offending the listener. Natural language is designed for expressiveness and social interaction with other humans, not primarily for rigorous clarity.

While it is certainly possible to be clear and precise when using natural language, it takes thoughtful effort. That is why good lawyers can command high salaries, and why writing a good blog, report, or even an email that doesn’t require further context or clarification takes time and attention. The rapidly emerging discipline of ‘prompt engineering’ is a human (or machine-aided) activity that develops prompts precise and specific enough for a given GenAI system to answer correctly, within that system’s limitations. This activity begins to look more like engineering or programming than the truly democratized natural-language paradigm that excites us so much about Generative AI.
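
To make the contrast concrete, consider an illustrative pair of prompts (both invented for this example). The casual request “Write me a sales report” leaves the system to guess at scope, audience, and format. An engineered version pins those down: “Summarize Q3 sales by region in five bullet points for an executive audience, using only the figures in the attached table, and flag any region with a quarter-over-quarter decline.” The second prompt leaves far less room for a Midas-style surprise.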

However, before blaming the paradigm, we should realize that even humans find natural language ambiguous. Consider that the next time you get a cryptic text from a friend or your spouse or significant other! Given the ambiguities inherent in natural language itself, it’s not surprising that a discipline needs to emerge that allows people and AIs to communicate without mistakes on either side—and that this discipline looks a lot like engineering.

I truly believe that GenAI will give us all the “Midas Touch” and lead to unparalleled and democratized access to information – and, eventually, personalized goods and services, as well. We should all keep in mind, though, that like King Midas, we must be careful what we ask for. In the near future, we just might get it!

Learn more:

Are you considering the future of your organization’s digital transformation? Microservices architecture offers unparalleled scalability, agility, and efficiency, transforming how organizations develop and deploy software.

GlobalLogic’s latest playbook, Microservices Execution, distills years of expertise into actionable guidance, empowering you to harness the full potential of Microservices.

Top Reasons to Download:

  • Explore a modern paradigm for software development with Microservices architecture.
  • Learn invaluable strategies and best practices from GlobalLogic’s extensive experience.
  • Accelerate your Microservices journey with proven methodologies and insights.

You’ll find track-wise guidelines and insights tailored to your organizational needs in this collection of industry-leading strategies and best practices curated to facilitate your Microservices journey. Discover best practices for designing, implementing, testing, monitoring, securing, and scaling Microservices, ensuring your organization overcomes execution challenges.

Whether conceptualizing, designing, or delivering digital projects, our playbook equips you with the tools needed for success. Download now and unlock the power of Microservices to drive innovation and agility in your organization.

While any software development initiative has unique features, some situations recur so often that I feel like I should have a recording that I can play back the next time that same situation comes up. One of these is the “What,” “How,” and “When” of software development.

Projects get into trouble when it’s not clear who owns these critical decisions, and—perhaps more importantly—when the wrong person or function tries to own one or more of them. When the business people try to own the technical “how” of a project, you know you’re headed for trouble. 

Similarly, when the technical people start designing end-user features (the “what”) without input from the users or the business, that often ends in disaster as well. And when either function tries to dictate “when” without regard to “what” or “how,” that spells trouble big-time.

Just the other day, I heard a business person say, “It’s obvious what they need to do—why can’t they just start coding?” Here the business person was saying, essentially, that the “what” is known (at least in their own mind), so the “how” should be obvious—meaning that engineering should just start doing it. 

In such situations, unless the engineers are truly incompetent (rare), it’s very doubtful that the business person speaking actually understands either the “what” or the “how.” The engineers certainly do not, or they would indeed be coding. 

Recommended reading: Software, the Last Handmade Thing

When a business person makes a statement like this, if he or she is in a position of sufficient power that the engineers do indeed “just start coding” even in the absence of clarity around the what or the how, the project rarely ends well. In particular, it rarely, if ever, delivers what the business person had in mind, when and how they wanted it. 

And—you guessed it—it’s the engineers who generally get blamed for the failure, not the person who insisted they go ahead no matter what.

Projects work best when the business says “what,” the engineers say “how,” and the business and technical people negotiate jointly in good faith over “when.” Sometimes the “when” is fixed—for example, a trade show-driven launch date or an investor deadline. In that case, the business and technical people need to negotiate over the “what” and “how.” 

Similarly, either the “how” or the “what” might be fixed—for example, because you are making modifications to an existing system and have limited technical options, or you have committed to deliver a certain feature. In this case, the “when” and the other of the three independent variables (either “what” or “how” respectively) need to be negotiable. Otherwise, a predictable failure—and/or development burnout—will occur.

Perhaps the most frequent issue is when a single person or function tries to own all three—the what, the how, and the when—telling engineering what they need to develop, how they are going to develop it, and when the project is to be delivered. Unless the person doing so is a universal genius—rare—this inevitably leads to problems. 

I worked with Steve Jobs for four years at NeXT, and even he rarely tried to dictate all three. Two out of three he would try for—but rarely, if ever, all three (and then not for long). Steve would generally defer to engineering on the “how” and would often (though sometimes grudgingly) accommodate strong pushback on the “when.” While I’ve never worked with Elon Musk, I get the sense he also listens to a core team of engineers he trusts. Unless you consider yourself smarter than Steve Jobs and Mr. Musk, you should pause to reconsider your own actions when you try to dictate what, how, and when to your engineering team.

Another often-overlooked facet of this puzzle is the fact that all three activities require communication. Even if the “what” seems clear in your own mind, it still needs to be expressed in terms that the engineering team can understand. This process of ‘backlog elaboration’ nearly always reveals gaps in the clarity of the initial vision, even if it might have seemed ‘obvious’ to you. Similarly, the ‘how’ may be clear to your technical leads, but it still needs to be expressed in architecture diagrams, sequence diagrams, API specs, and other artifacts that communicate the technical vision to the engineering team. 

Only when the “what” and “how” are expressed in sufficient detail can a reliable “when” be produced. The fact that the “what” is clear in our business person’s mind, or the “how” is clear in the mind of the architect, does not mean that the vision can be successfully operationalized without further work. This is why “just start coding” reveals a real gap in understanding of how successful software projects are implemented.

All this can be really fast—even verbally and at the whiteboard in some cases. But in general, the more input and understanding you get from the people actually doing the work, the better your backlog and the more accurate your timeline will be.

A proper appreciation for the value of each ingredient (“what,” “how,” and “when”), combined with due respect for the roles of their proper owners, is the key recipe for successful software development.

More helpful resources:

Despite uncertainty around regulation, millions are already interacting inside the metaverse, a market Ernst & Young expects could contribute over $3 trillion to global GDP by 2031. With the metaverse poised to dramatically change how banks, insurance companies, and other financial institutions engage with customers, IT leaders are focused intently on the challenges and opportunities ahead. 

Banking and financial services IT professionals gathered recently for an immersive, one-hour VR roundtable discussion in the metaverse, co-hosted by GlobalLogic and The CXO Institute. In The Future of Banking: Doing Business in the Metaverse, hosted by GlobalLogic CTO Steven Croke and facilitated by yours truly, participants took a deep dive into innovative next-generation banking and finance solutions on the horizon, and into how banks will meet consumer needs for personalization, interaction, convenience, and security.

In this article, you’ll find the highlights from our session, including top questions surfacing in banking and finance organizations as each plans its metaverse roadmap – plus your personal invitation to join GlobalLogic’s Monthly Metaverse Meetups for banking leaders and innovators. Let’s begin by exploring the most pressing challenges and opportunities digital leaders face as metaverse and VR adoption gradually increase.

Why Metaverse Planning is on Banking & Financial Service Roadmaps

Changing consumer demographics and rapidly advancing VR technologies drive massive opportunities for forward-thinking brands, and banking is ripe for disruption. A recent GlobalLogic survey revealed that 90% of Gen Z are willing to turn to big tech and nonbanks for better and faster banking services, and most participants in that demographic had “no idea why” they’d go into a branch when most basic things can be done more quickly and easily online.

The same survey found that 80% of Gen Z respondents felt there was insufficient advice available about banking and financial products and that they did not understand how things like mortgages were structured. Investing was a key theme across our research interviews, and most participants brought the idea up unprompted. Inflation, skyrocketing housing costs, and increasing volatility in the job market are weighing heavily on consumers’ minds, and freelancing in various forms is becoming more common. 

For all their diverse needs across employment, banking, shopping, and entertainment, people are looking for more immersive, engaging, and personalized experiences. Increasingly, they’re finding those in the metaverse – particularly Gen Z and Millennials (spanning ages 14 to 40), around 40% of whom have already used VR technology in some way. According to Deloitte, close to 50% of this cohort say they spend more time interacting with others on social media than in real life. Further, Gartner predicts that by 2026, 25% of people will spend at least one hour a day in the metaverse for work, shopping, education, social, and/or entertainment.

This state of affairs, in which consumers are seeking out financial advice and services and increasingly doing so online, ought to concern banks, Croke shared with The Future of Banking participants. How will your business respond to an emerging group of consumers who do not feel they need banks or understand them properly, and have little or no desire to enter a branch?

Metaverse Presents Opportunities for Education, Support & Customer Experience

The banking sector’s adoption of cryptocurrency and blockchain has increased significantly in recent years, and the two are projected to account for 4% and 4.5% of metaverse revenue, respectively, in 2025. But beyond these earliest and best-known DeFi products, how will your bank build trust in the metaverse and make virtual interactions more compelling than the bricks-and-mortar equivalent?

Several banks are setting up lounges or virtual branches as an entry point to the metaverse and using the space to establish a presence and nurture customer relationships. Offering education, support, and advice on financial products in the metaverse can enable financial services brands to engage Gen Z even as VR banking matures.

HSBC, for example, purchased virtual real estate in The Sandbox to engage and connect with sports, e-sports, and gaming enthusiasts. Is this the right idea?

IT leaders attending The Future of Banking event had mixed feelings regarding virtual banking services. They expressed skepticism about the likelihood of adoption without a specific incarnation of virtual offerings that fires the customer’s imagination. Banks will need to give customers compelling reasons to go to the metaverse to complete actions they can already do with mobile banking applications or develop actions they cannot experience with mobile or web interfaces. The next biggest hurdle will be understanding what that will look like across the industry. 

Transitioning to a VR Financial Services Mindset

For one institution, KB Kookmin Bank in South Korea, it meant creating a virtual branch where simple transactions, such as remittances, can be managed at a teller window. 

“We’re already seeing several banks now setting up branches… they’re essentially providing lounges for users to go into those branches and try and make them, effectively, a place to get a conversation going with customers,” Croke shared. Roundtable participants were asked whether they see replicated real-world experiences as the model for transactions in the Metaverse.

One delegate, a CTO for a large insurance brand, said he felt that HSBC’s approach made more sense. “History is littered with examples of trying to replicate something in a new medium and it not working as well… doing something different in a different medium would probably be a more fruitful direction forward.”

Perhaps a hybrid approach would be easier than a metaverse-native experience? Banks may consider creating products that mimic something in the real world with a VR twin; for example, mortgage applicants could access and explore a digital twin of the property they’re considering. 

IT leaders must also consider how metaverse-native experiences might be handled in the future. “If you buy a ticket to a concert in the Metaverse, why would you not purchase that with a payment product that is Metaverse native?” Croke said.

Exploring Options for Metaverse Finance & Banking Products

Even if product development is still far off on the long-term planning horizon, bank leaders should be thinking today about broadening their ideas of what financial products could look like within the metaverse context. 

“We’re already seeing value items being created in the Metaverse,” Croke said. “We’re seeing collectibles being created. We’re seeing equities, we’re seeing art being created. How are these going to be financed? How can one purchase those products? And if you think about storage, where do we store that value?”

Delegates questioned whether we should expect to see a dual business model, with banks in the Metaverse handling cryptocurrencies transacted through the banks or Metaverse ATMs. We tend to think today that the metaverse will not be able to handle traditional banking products. However, as one delegate pointed out, we may see this change once banking finds a strong use case to drive initial adoption and create demand for more services. “I think it’s about starting with a very niche, single-use case that’s killer, and everybody wants to use it, but I think that’s yet to be found,” he said. 

Visualizations offer an interesting way to explore the possibilities today. “If I want to have a 3D visualization of risk, rather than today’s 2D diagram, for example… in 3D, I can move stuff with my fingers and share that information with other traders,” one delegate shared. “I think that’d be very helpful. So, I visualize value addition. I think that’s a pivotal point where Metaverse can start adding value to existing processes.”

The Maturation and Growth of Decentralized Finance

The huge uptake in cryptocurrency and NFTs has led to a new virtual economy, even after the initial buzz died down. This is a borderless, secure, and fast environment in which DeFi enables financial transactions to be performed by entities directly using smart contracts without financial intermediaries. 

Still, outside of Gen Z and gamers, we’ve not yet reached a point of maturity where people feel comfortable undertaking many activities and transactions in the metaverse. We have a community of early adopters who are already quite demanding and discerning in their metaverse experiences, alongside a far larger population still trying to wrap their minds around the possibility.

Whether adoption and user behavior drive regulation, or increased regulation opens the door to greater adoption, remains to be seen. The rise of central bank digital currencies and the expressed interest of Singapore’s monetary authority (likely the furthest ahead at this point) raise many questions about DeFi and its impact on metaverse adoption and maturation. Even so, it is clear today that banks must put their arms around this economy, experimenting and learning now so they can be best positioned to move fast and innovate as opportunities open up.

Final Thoughts & Continuing the Conversation

Internet users rely on multiple apps for authentication. But does the Metaverse require that we now own our digital identity? And if we move across multiple platforms, doesn’t a unique digital identity become a prerequisite? 

How do we combat money laundering and fraud in a virtual environment where a criminal can open a crypto wallet, fund their wallet with cryptocurrency, and buy a parcel on a chosen metaverse platform? Once the parcel is bought, they can build a store to hold their NFTs and sell NFTs as a cover for illicit products in real life. 

These are just a few of the metaverse questions and challenges facing IT leaders in banking worldwide right now. 

GlobalLogic will continue the conversation in our monthly Metaverse Innovation Meetups beginning Friday, November 10, to be held in VR. Join us for ongoing discussions about:

  • current trends in banking in the metaverse 
  • successes in the industry and lessons learned
  • brainstorming prototypes that will help define the app that will ultimately successfully drive VR adoption

We’re growing a community of like-minded innovators and business leaders to talk through ideas and help move banking in the metaverse forward in real ways. Will you join us? 

Click here to email me your request for an invitation to GlobalLogic’s Monthly Metaverse Meetup.

Digital product development can be a game-changer for organizations in the way it facilitates a seamless, software-driven user experience. Taking a user-centric approach to planning and developing digitally-driven solutions yields products that delight users, create new lines of revenue, and scale with your growing business.

Consistently applying a data-driven approach to digital product development helps your organization uncover customer insights, identify market trends, and validate hypotheses that result in products that better meet customer needs and drive business growth. Moreover, continuously iterating based on real-time insights ensures the products you’ve invested in are sustainable and evolve with your customers’ needs.

In today’s world, organizations are accumulating and sitting on large volumes of data from an increasing number of systems and interfaces. However, this comes with its fair share of challenges, including (but not limited to) data quality and reliability, scalability and infrastructure, data privacy and security, and the growing talent and expertise gap. We’ll take a closer look at these key considerations and more, so you can achieve a more data-driven approach to digital product development.

1. Data Quality, Reliability & Governance

While the availability of vast amounts of data offers opportunities for valuable insights, it also introduces the risk of incomplete, inaccurate, or inconsistent data. Ensuring data quality and reliability is essential to leveraging the full potential of a data-driven approach.

Incomplete or missing data can result in incomplete or skewed insights, leading to flawed decision-making. Without reliable data, organizations risk basing their strategies on faulty assumptions or incomplete information.

Overcoming this challenge calls for robust data governance processes. This includes defining data standards, establishing data collection and storage protocols, and implementing quality checks. Data validation techniques, such as data profiling, outlier detection, and consistency checks, are crucial in identifying and rectifying data anomalies. Regular data audits and monitoring processes help maintain data integrity and reliability over time.

Additionally, organizations can employ automated data validation tools and techniques to streamline the process and ensure a higher level of data quality. These tools can flag data inconsistencies, identify missing values, and validate data against predefined rules or business requirements.
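
As a minimal sketch of what such automated checks can look like in practice – assuming Python with pandas, and with a hypothetical file and column names – consider:

    import pandas as pd

    # Hypothetical dataset; the file and column names are illustrative only.
    df = pd.read_csv("orders.csv")

    # Data profiling: count missing values per column.
    print(df.isna().sum())

    # Outlier detection: flag order totals more than three standard
    # deviations from the mean (a basic z-score check).
    z = (df["order_total"] - df["order_total"].mean()) / df["order_total"].std()
    print(df[z.abs() > 3])

    # Consistency check against a predefined business rule:
    # shipped orders must carry a ship date.
    print(df[(df["status"] == "shipped") & (df["ship_date"].isna())])

In a production pipeline these checks would run automatically on every new batch of data, with violations routed to a data steward rather than printed.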

2. Scalability and Infrastructure

The ability to process and analyze large volumes of data is essential for effective digital product development. As organizations gather increasing amounts of data from diverse sources, scalability and infrastructure become critical factors in harnessing the full potential of this data.

Traditional systems and infrastructure may not be equipped to handle the velocity, variety, and volume of data that digital product development demands. Processing and analyzing massive datasets require powerful computing resources, storage capacity, and efficient data processing frameworks.

Investing in scalable infrastructure ensures organizations can handle ever-growing data volumes without compromising performance. Cloud-based solutions, such as scalable cloud computing platforms and storage services, offer the flexibility to scale resources up or down based on demand. This elasticity allows organizations to handle peak workloads during intense data processing and analysis periods while avoiding excessive costs during periods of lower activity.

Modern technologies like distributed computing frameworks, such as Apache Hadoop and Apache Spark, provide the ability to parallelize data processing across clusters of machines, improving processing speed and efficiency. These frameworks enable organizations to leverage distributed computing power to tackle complex data analytics tasks effectively.
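
For illustration, here is a minimal PySpark sketch of such a parallelized aggregation – counting daily active users over a large event log. The storage path and column names are hypothetical; the same code runs unchanged on a laptop or a multi-node cluster:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("usage-analytics").getOrCreate()

    # Spark distributes the scan and aggregation across its executors.
    events = spark.read.parquet("s3://example-bucket/events/")

    daily_active = (
        events
        .groupBy(F.to_date("timestamp").alias("day"))
        .agg(F.countDistinct("user_id").alias("active_users"))
        .orderBy("day")
    )
    daily_active.show()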

Recommended reading: The Evolution of Data & Analytics Technologies

3. Data Privacy and Security

A strong focus on data privacy and security in digital product development helps organizations maintain compliance, protect sensitive data, and foster customer trust. This, in turn, allows for more effective data-driven decision-making and enables organizations to leverage the full potential of their data assets while mitigating the inherent risks.

It’s not a matter of if a breach will happen but when: IBM reports that 83% of organizations have experienced more than one data breach. Those using security AI and automation had a 74-day shorter breach lifecycle and saved an average of USD 3 million more than those without.

Safeguarding customer information and maintaining trust is crucial in a data-driven approach. This data often includes sensitive and personal information about individuals, such as personally identifiable information (PII) or financial data. Protecting this data from unauthorized access, breaches, or misuse is of paramount importance.

Organizations must comply with data privacy regulations, such as the General Data Protection Regulation (GDPR) or the California Consumer Privacy Act (CCPA). These regulations outline guidelines and requirements for the collection, storage, processing, and sharing of personal data. Adhering to these regulations ensures that organizations handle customer data responsibly and legally.

Companies can implement encryption techniques to protect data at rest and in transit, access controls, and user authentication mechanisms. Conducting regular security audits and vulnerability assessments is also best practice. Supporting these initiatives requires a culture of data privacy and security awareness among employees. Training programs and clear communication channels can help employees understand their roles and responsibilities in protecting data and recognizing potential security risks.
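
As a small, hedged sketch of one such control – field-level encryption at rest using the Python cryptography library’s Fernet recipe – consider the following; the PII value is invented, and a real deployment would source the key from a secrets manager or KMS rather than generating it inline:

    from cryptography.fernet import Fernet

    # In production, fetch this key from a secrets manager or KMS;
    # generating it inline is for illustration only.
    key = Fernet.generate_key()
    fernet = Fernet(key)

    # Encrypt a sensitive field before writing it to storage ("at rest").
    ssn_plain = b"123-45-6789"  # illustrative PII value
    ssn_encrypted = fernet.encrypt(ssn_plain)

    # Decrypt only at the point of authorized use.
    assert fernet.decrypt(ssn_encrypted) == ssn_plain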

4. Interpreting and Extracting Insights

Extracting meaningful insights from complex and diverse datasets is crucial for driving product innovation and success. However, this task can be challenging without the expertise of skilled data scientists and analysts to apply advanced analytical techniques and statistical models. These professionals possess the skills to navigate vast amounts of data, identify relevant patterns, and extract actionable insights that inform product development strategies.

Data scientists and analysts involved in digital product development must have a deep understanding of statistical analysis, data mining, machine learning, and visualization techniques. They should also possess domain-specific knowledge to contextualize the data and derive meaningful insights relevant to the product and its target audience.

These professionals leverage analytical tools and programming languages to manipulate and analyze data, such as Python, R, SQL, and data visualization tools like Tableau or Power BI. They employ exploratory data analysis techniques, statistical modeling, predictive analytics, and other advanced analytical methods to uncover patterns, correlations, and trends within the data.
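
A minimal example of this kind of exploratory analysis, in Python with pandas (the dataset and column names are hypothetical, with “converted” as a 0/1 flag):

    import pandas as pd

    sessions = pd.read_csv("sessions.csv")

    # Summary statistics and correlations between engagement metrics.
    print(sessions[["duration_min", "pages_viewed", "converted"]].describe())
    print(sessions[["duration_min", "pages_viewed"]].corr())

    # Segment behavior: conversion rate by acquisition channel.
    print(sessions.groupby("channel")["converted"].mean().sort_values(ascending=False))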

They can identify user behavior patterns, preferences, and pain points, allowing organizations to make data-driven decisions about feature enhancements, user experience improvements, and product roadmaps. Collaboration between data scientists, analysts, and product development teams is crucial for the successful interpretation and application of data insights. 

And, of course, this leads us to…

5. Talent and Expertise Gap

Successfully blending software engineering and data analytics expertise enables organizations to build data-driven products that offer exceptional user experiences. However, bridging the talent and expertise gap by finding skilled professionals with a strong understanding of both disciplines can be a significant challenge.

Software engineers possess the technical prowess to design and build robust and scalable applications, while data analytics professionals can extract meaningful insights from data and apply them to inform product development strategies. The intersection of these skill sets is relatively new, and the demand for professionals who can bridge the gap is high. This creates a talent shortage and a competitive job market for individuals with software engineering and data analytics expertise.

To address this challenge, organizations must invest in talent acquisition strategies that attract individuals with hybrid skill sets. They can collaborate with educational institutions to develop specialized programs that equip students with the necessary knowledge and skills in both domains. Providing internships, training programs, and mentorship opportunities can also help nurture talent and bridge the expertise gap.

Organizations can foster cross-functional collaboration to encourage knowledge sharing between software engineering and data analytics teams. This allows professionals from different disciplines to learn from each other and leverage their collective expertise to drive innovation in digital product development.

Additionally, promoting a culture of continuous learning and professional development is crucial. According to McKinsey, which takes regular pulse checks of product-development senior executives, 53% of decision-makers believe skill building is the most useful way to address capability gaps, ahead of hiring, talent redeployment, and contracting in skilled workers. Encouraging employees to enhance their skills through training programs, industry certifications, and participation in conferences and workshops helps keep them updated with the latest advancements in software engineering and data analytics.

Recommended reading: A Digital Product Engineering Guide for Businesses

6. Data Integration and Compatibility

Integration and compatibility between disparate data sources and systems are a major challenge for organizations. Establishing seamless data integration pipelines and ensuring system compatibility are crucial for successful data-driven digital product development.

Organizations often have many data sources, including internal databases, third-party APIs, customer feedback platforms, social media platforms, and more. These sources can generate data in various formats, structures, and locations, making it complex to integrate and harmonize the data effectively.

Legacy systems further compound the challenge. Older systems may have limited compatibility with modern data analytics tools and techniques. Extracting, transforming, and loading data from legacy systems for analysis can be cumbersome and time-consuming.

To address these challenges, organizations need to adopt a strategic approach to data integration, including:

  • Data architecture and planning to develop a robust data architecture that outlines data flows, integration points, and data transformation processes. This architecture should account for different data sources, formats, and systems in the product development lifecycle.
  • Data integration tools and technologies to simplify the integration of disparate data sources. These tools can help automate data extraction, transformation, and loading (ETL) processes (see the sketch after this list), ensuring smooth data flow across systems.
  • API and middleware integration, which can facilitate seamless integration between systems and data sources. APIs provide standardized interfaces for data exchange, allowing different systems to communicate and share data effectively.
  • Data transformation and standardization. Data transformation techniques play a vital role in harmonizing data from different sources. Standardizing data formats, resolving inconsistencies, and ensuring data quality during the transformation process enables more accurate and reliable analysis.
  • Modernization efforts to improve compatibility with data analytics tools and techniques. This digital transformation could involve system upgrades, adopting cloud-based solutions, or implementing data virtualization approaches.
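
To make the ETL point concrete, here is a minimal Python sketch that extracts from two hypothetical sources, standardizes a join key during transformation, and loads the harmonized result into a warehouse (SQLite stands in for one here):

    import sqlite3

    import pandas as pd

    # Extract: two disparate sources (a CRM export and a web-event dump).
    crm = pd.read_csv("crm_export.csv")
    web = pd.read_json("web_events.json")

    # Transform: standardize the join key so the sources can be merged.
    crm["email"] = crm["email"].str.lower().str.strip()
    web["email"] = web["email"].str.lower().str.strip()
    combined = crm.merge(web, on="email", how="inner")

    # Load: write the harmonized table to the warehouse.
    with sqlite3.connect("warehouse.db") as conn:
        combined.to_sql("customer_activity", conn, if_exists="replace", index=False)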

7. Data Visualization and Communication

Data visualization is pivotal in effectively communicating complex data insights to non-technical stakeholders. By using data visually to tell a story through charts, graphs, dashboards, and other interactive visual elements, organizations can distill complex information into intuitive and easy-to-digest formats.

In its raw form, data can be overwhelming and difficult to comprehend for individuals without a technical background. Complex datasets, statistical analyses, and intricate patterns can easily get lost in rows of numbers or dense spreadsheets. This is where data visualization comes into play, allowing stakeholders to grasp the key insights and trends at a glance.

Effective data visualization relies on understanding the audience and tailoring the visual representations accordingly. Different stakeholders have varying levels of familiarity with data and different areas of interest. The visualizations should be designed to align with their needs, ensuring the right information is conveyed clearly and concisely.

There are several key principles to consider when designing data visualizations for effective communication, including simplifying complex data, a visual hierarchy that highlights important information, contextualization and relevant comparisons, interactivity, and compelling storytelling.
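
As a small illustration of these principles – assuming Python with matplotlib, and with invented figures – a plain-language title states the takeaway while labeled axes supply units and context:

    import matplotlib.pyplot as plt

    # Illustrative data: monthly active users for two product lines.
    months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
    product_a = [12, 14, 15, 19, 24, 30]
    product_b = [20, 19, 18, 18, 17, 16]

    fig, ax = plt.subplots()
    ax.plot(months, product_a, marker="o", label="Product A")
    ax.plot(months, product_b, marker="o", label="Product B")

    # Visual hierarchy: the title carries the message, not jargon.
    ax.set_title("Product A's user base is growing while Product B's declines")
    ax.set_ylabel("Monthly active users (thousands)")
    ax.legend()
    plt.show()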

Recommended reading: 4 Best Practices to Guide IoT and Dashboarding Projects

8. Ethical Use of Data

The collection and analysis of vast amounts of data give rise to ethical considerations. As organizations harness the power of data to drive product development strategies, it is essential to uphold the highest standards of ethical conduct. This includes respecting user privacy, protecting sensitive information, and ensuring data usage complies with applicable laws and regulations.

Obtaining informed consent from users is essential. Organizations must be transparent about the data they collect, how it is used, and the measures in place to protect it. 

Fairness is another crucial aspect of ethical data use, ensuring that the organization is using unbiased algorithms, models, and analytical techniques that do not discriminate against individuals or perpetuate societal biases. Proactively assess and mitigate potential biases in data collection, analysis, and decision-making processes to ensure fairness and equity.

Social responsibility is another guiding principle in data-driven product development. Advocate for the ethical use of data to address societal challenges, foster positive social impact, and avoid harm to individuals or communities. Consider the broader implications of data practices and determine how your organization can actively contribute to creating a responsible and inclusive digital ecosystem.

Implementing ethical data practices requires a comprehensive approach that includes clear policies, regular audits, and ongoing training for employees. It’s well worth getting right. Ethical data practices contribute to the long-term sustainability and reputation of organizations, while also aligning with broader societal expectations and regulatory requirements.

9. Cost and ROI

Implementing big data and analytics solutions in digital product development comes with significant upfront costs, including investments in infrastructure, tools, and talent acquisition. Organizations must carefully evaluate the return on investment (ROI) to ensure that the benefits derived from analytics initiatives outweigh the associated expenses.

While the costs of implementing big data and analytics solutions can be substantial, the potential benefits are equally significant. Leveraging data efficiently allows organizations to gain valuable insights, make informed decisions, and drive business growth. Research from the Business Application Research Center (BARC) shows that companies leveraging their data efficiently see an average increase in profitability of 8% and a 10% reduction in costs.

Begin by clearly defining the specific business objectives and key performance indicators (KPIs) your big data and analytics initiatives aim to address. This provides a basis for evaluating the impact and effectiveness of the investments made.

Conduct a thorough cost-benefit analysis to assess the potential returns and associated costs of implementing big data and analytics solutions. Consider both tangible and intangible benefits, such as improved decision-making, enhanced customer experience, and increased operational efficiency.
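
A simple worked example of such a calculation, with entirely hypothetical figures:

    # Hypothetical figures for a three-year analytics initiative.
    upfront_costs = 1_200_000   # infrastructure, tooling, hiring
    annual_run_costs = 300_000
    annual_benefits = 900_000   # cost savings plus incremental revenue
    years = 3

    total_cost = upfront_costs + annual_run_costs * years   # 2,100,000
    total_benefit = annual_benefits * years                 # 2,700,000
    roi = (total_benefit - total_cost) / total_cost

    print(f"ROI over {years} years: {roi:.1%}")  # about 28.6% on these assumptions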

When investing in infrastructure, consider scalability to accommodate future growth and increasing data volumes. Cloud-based solutions offer the flexibility to scale resources based on demand, minimizing upfront infrastructure costs while providing the necessary capabilities to handle growing data requirements.

Establish mechanisms to measure and track the ROI of big data and analytics initiatives. You’ll need to regularly assess the impact on key business metrics, such as revenue growth, cost savings, customer satisfaction, and operational efficiency.

10. Continuous Learning and Adaptation

Staying current with the latest advancements, best practices, and industry trends is vital in digital product development, where technological advancements, new methodologies, and emerging opportunities drive constant evolution. To remain competitive and harness the full potential of data, thought leaders must foster a culture of continuous learning and adaptability within their organizations.

Encourage teams to pursue professional development opportunities. It’s important to allocate time and resources for training and learning activities and provide access to relevant educational resources to facilitate these programs. Give employees space and time to establish knowledge-sharing platforms and communities of practice to facilitate the exchange of ideas and encourage collaboration, as well.

Agile methodologies, such as Scrum or Kanban, are great for promoting iterative development and continuous improvement. Apply these methodologies to data analytics projects to enable teams to adapt quickly to changing requirements, incorporate feedback, and continuously learn from data insights and even failures.

Continuous learning should extend beyond the boundaries of data and analytics, as cross-disciplinary collaboration and combining data-driven insights with domain expertise can lead to more innovative approaches in digital product development. Developing data literacy across the organization is crucial, and empowers individuals to make informed decisions, contribute to data-driven discussions, and effectively communicate insights to drive organizational success. Advocate for understanding and interpreting data among all stakeholders, regardless of their roles or technical backgrounds. 

Conclusion

Applying a big data and analytics lens to digital product development means taking a strategic, data-driven approach encompassing technical solutions, organizational cultural shifts, investment in talent and infrastructure, adherence to ethical principles, and a culture of continuous learning.

Yes, it’s a tall order. Working alongside an experienced digital engineering partner like GlobalLogic through ideation, design, development, testing, deployment, and ongoing maintenance can help. We help organizations unlock the true potential of their data and get to market faster with innovative, compliant digital products that drive business success.

Want to learn more? Contact the GlobalLogic team today and see what we can do for you.

In an era dominated by data-driven decision-making, valuable and actionable insights have never been more essential for business success. This need has led to the rise of data marketplaces as a revolutionary solution that connects data providers with data consumers. But what is a data marketplace, and how can you use one to your company’s advantage?

This blog dives into the essentials of Data Marketplaces – what they are, and the compelling reasons why integrating one might be a game-changer for your business.

What is a Data Marketplace?

A data marketplace is a platform that brings together data providers and data consumers, facilitating the buying and selling of data. It serves as a dynamic hub where data creators and consumers converge, and where a variety of data products are listed and made available to potential buyers.

Data marketplaces typically host a diverse range of datasets from different sources and providers, spanning various domains and industries. They often offer features like search and filtering capabilities, allowing users to discover relevant datasets based on their specific needs. They may include additional functionalities such as rating and review systems, pricing models, and data preview options. They focus on creating a marketplace environment where users can explore and select datasets from multiple providers, promoting transparency, accessibility, and ease of data discovery.
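
To illustrate the search-and-filter idea, here is a minimal Python sketch of a listing structure and a filter over a tiny catalog; all names, prices, and ratings are invented:

    from dataclasses import dataclass, field

    @dataclass
    class DatasetListing:
        title: str
        provider: str
        domain: str             # e.g. "finance", "healthcare"
        price_per_month: float
        rating: float           # average of consumer reviews, 0-5
        tags: list[str] = field(default_factory=list)

    catalog = [
        DatasetListing("Equity tick data", "AcmeQuant", "finance", 499.0, 4.6, ["real-time"]),
        DatasetListing("Anonymized claims", "MedData", "healthcare", 899.0, 4.2, ["clinical"]),
    ]

    # Discovery: filter listings by domain, budget, and minimum rating.
    def search(catalog, domain=None, max_price=None, min_rating=0.0):
        return [
            d for d in catalog
            if (domain is None or d.domain == domain)
            and (max_price is None or d.price_per_month <= max_price)
            and d.rating >= min_rating
        ]

    print(search(catalog, domain="finance", max_price=500))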

Getting to Know the Data Marketplace Ecosystem

The data marketplace ecosystem typically consists of the following key stakeholders:

[Figure: The data marketplace ecosystem]

Data Providers

These are organizations or individuals who offer their data assets for consumption; they can be data aggregators, data brokers, research institutions, or even individual users who possess valuable data. For example, anonymized patient data published by healthcare organizations can be consumed by pharmaceutical companies and researchers for clinical trials and drug development.

Data Consumers

These are organizations or individuals who seek access to specific datasets for analysis, research, or business purposes. For example, e-commerce platforms use such data to analyze user behavior and purchase history and offer personalized product recommendations, while pharmaceutical companies and researchers use patient data for clinical trials and drug development.

Platform Operators

These are the entities that develop, maintain, and operate the data marketplace platform. They provide the infrastructure, security measures, and services necessary for data providers and consumers to interact within the marketplace.

Data Governance Authorities

Data marketplaces typically have data governance authorities or regulatory bodies that establish policies, standards, and compliance requirements for data exchange and usage. These entities ensure that data privacy, security, and legal considerations are upheld within the marketplace ecosystem, and that data within the marketplace is managed, protected, and used in a responsible and compliant manner.

Data Marketplace Benefits

The data marketplace facilitates the seamless exchange of data across various entities whether within an organization, across industries, or even beyond geographical boundaries. Think of it as an organized marketplace for data, where valuable insights and information are readily available. Here are some of its benefits:

Catalyst for AI and Analytics

Data fuels AI and analytics initiatives. A data marketplace provides a rich pool of data for training AI models and conducting advanced analytics.

Unlocking Data’s Potential

Your business generates a plethora of data – structured, unstructured, and everything in between. A data marketplace harnesses this potential by making data accessible to those who can derive value from it. It’s a catalyst for turning raw data into actionable insights.

Accelerating Innovation

In a data marketplace, different stakeholders can access diverse datasets. This fuels innovation as creative minds from various domains collaborate, leading to fresh perspectives and inventive solutions.

Efficient Resource Utilization

Rather than each department or team siloing their data, the data marketplace centralizes data resources. This streamlines data collection, avoids duplication, and optimizes storage costs.

Data Monetization

Data marketplaces allow organizations to develop and execute a data monetization strategy. They can choose to sell raw data, derived insights, or data-driven services, depending on their business objectives. For example, healthcare organizations can aggregate and anonymize patient data to sell to pharmaceutical companies and researchers for clinical trials and drug development, while e-commerce platforms can analyze user behavior and purchase history to offer personalized product recommendations. The latter can also sell this data to third-party retailers or advertisers looking to target specific customer segments.

Agility in Decision-Making

Timely access to pertinent data fuels quick and well-informed decisions. Relevant data is just a few clicks away, eliminating bottlenecks caused by data retrieval.

Collaboration Beyond Boundaries

If your business operates on a global scale, a data marketplace bridges geographical gaps. Teams from different locations can effortlessly exchange data, fostering collaboration.

Enhanced Data Governance

A well-structured data marketplace enforces data governance policies. It ensures data quality, security, and compliance, thus maintaining integrity across the board.

Expectations and Use Cases

The rise of data marketplaces has created significant expectations and opportunities across various industries. Organizations can leverage data marketplaces to enhance their business intelligence capabilities, for example. They gain access to external datasets that complement their internal data, enabling them to generate comprehensive insights and make data-driven decisions.

Data marketplaces serve as valuable resources for researchers and developers, as well. They can access specialized datasets for scientific research, machine learning model training, and innovation, accelerating their projects and fostering collaboration.

This is why they have gained traction across various industries, empowering organizations to access and leverage valuable datasets for a wide range of applications. Industries such as finance, healthcare, marketing, and transportation can leverage data marketplaces to enhance their services. Users can access real-time data feeds, consumer behavior data, geospatial data, and other relevant datasets to improve customer experiences and drive innovation.

Here are a few practical examples of industries where data marketplaces are extensively used:

  • Financial Services: Utilizing external data sources for risk assessment, fraud detection, and customer insights.
  • Healthcare: Leveraging medical records, research data, and patient-generated data for personalized medicine and healthcare analytics.
  • Retail and E-commerce: Using customer behavior data, market trends, and competitor insights for targeted marketing and business intelligence.
  • Smart Cities: Integrating data from various sources to optimize city operations, traffic management, and resource allocation.

Data marketplaces are rapidly evolving and offer tremendous potential for organizations to tap into the power of external data assets. However, careful consideration of data quality, privacy, security, and compliance is essential to ensure the success and trustworthiness of these marketplaces.

Challenges and Security Considerations

While data marketplaces offer immense value, ensuring security and privacy is of paramount importance. These are among the challenges facing organizations:

Data Privacy: Data marketplaces must establish robust privacy measures to protect the sensitive information contained in the datasets. Compliance with data protection regulations, anonymization techniques, and secure data transmission protocols are critical to maintaining privacy.

Data Quality and Trust: Data marketplaces need to implement mechanisms to verify the quality and authenticity of datasets. This includes data validation processes, transparency in data provenance, and reputation systems that establish trust between data providers and consumers.

Secure Infrastructure: The marketplace platform itself must have robust security measures in place. This includes secure authentication and access controls, encryption of data at rest and in transit, regular security audits, and protection against cyber threats.
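
Picking up the anonymization point from the Data Privacy item above, here is a small sketch (not a compliance recipe) that pseudonymizes a direct identifier with a salted hash so records can still be linked without exposing raw identities; the dataset and column names are hypothetical:

    import hashlib

    import pandas as pd

    patients = pd.read_csv("patients.csv")

    # In production, store and rotate the salt in a secrets manager.
    SALT = b"illustrative-salt-value"

    def pseudonymize(value: str) -> str:
        return hashlib.sha256(SALT + value.encode()).hexdigest()

    patients["patient_id"] = patients["patient_id"].astype(str).map(pseudonymize)

    # Drop direct identifiers that aren't needed downstream, then publish.
    patients = patients.drop(columns=["name", "street_address"])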

Examples of Data Marketplaces to Know

Data marketplaces have proven to be effective platforms for data exchange across diverse industries. Depending on your specific requirements and field of expertise, you can discover platforms customized for your industry or use case. Here are some noteworthy real-world instances of successful data marketplaces:

AWS Data Exchange: Amazon Web Services (AWS) Data Exchange is a data marketplace that allows data providers to securely publish and monetize their data products. Data consumers can easily find, subscribe to, and use the data they need for various applications and analytics.

Microsoft Azure Marketplace: Microsoft Azure Marketplace offers a wide range of data products, including datasets, APIs, and machine learning models. It enables data consumers to discover and access data assets that can be integrated into their Azure-based applications and workflows.

Google Cloud Public Datasets: Google Cloud Public Datasets presents a dynamic data marketplace within the Google Cloud Platform, offering a diverse range of public datasets for analysis. Spanning various industries and disciplines, this platform empowers users to execute big data analytics and machine learning workloads without the complexities of data movement.

Snowflake Data Marketplace: The Snowflake Data Marketplace grants seamless access to live, ready-to-query datasets from various providers across multiple industries. This platform allows users to explore and utilize a diverse array of data without the need for data copying or movement, offering a convenient and efficient solution for data consumers.

Kaggle Datasets: Kaggle, a platform for data science and machine learning competitions, hosts a dataset repository where users can discover and download various datasets contributed by the community.

Quandl: Quandl is a data marketplace that offers a vast collection of financial, economic, and alternative datasets. It caters to financial professionals, data analysts, and researchers looking for historical and real-time data.

Experian Online Marketplace: Experian is a global information services company that offers a wide range of services, including credit reporting, data analytics, and decision-making solutions.

Data.gov: Data.gov is a public data portal provided by the U.S. government, offering access to a wide range of open datasets from various federal agencies.

Datarade.ai: Datarade.ai is a data exchange marketplace and platform that connects data buyers with data providers. It caters to businesses and organizations in need of various types of data for analytics, research, and other purposes.

The Future of Data Marketplaces

As the data-driven landscape continues to evolve, the future of data marketplaces holds immense potential to reshape industries, foster innovation, and democratize data access. Here is a glimpse into what lies ahead:


  1. AI-Driven Data Discovery
  • Advanced AI algorithms will enable personalized data discovery, suggesting datasets based on user preferences and context.
  • Smart search engines will enhance data accessibility, making it easier for users to find relevant information.
  2. Blockchain-Based Data Marketplaces
  • Blockchain technology may enhance data trust and transparency across marketplace transactions.
  3. Edge Data Marketplaces
  • Data marketplaces may extend to edge computing environments, offering data closer to where it’s generated.
  4. AI-Powered Data Monetization
  • AI algorithms could assist in pricing and monetization strategies for data providers, optimizing revenue generation.

Taken together, these developments are poised to reshape industries, foster innovation, and democratize data access. Embracing this evolving landscape is key to staying at the forefront of data-driven innovation.

Is a Data Marketplace Right for You?

Data marketplaces have emerged as transformative platforms for putting data assets to work, offering a powerful solution for businesses, researchers, and industries alike. By making diverse datasets easy to access, share, and monetize, they drive innovation, collaboration, and growth in the digital age.

They help organizations overcome data challenges, unlock new insights, create revenue streams through data monetization, and foster a data-driven culture. Embracing the data marketplace concept positions an organization for success in the data-driven era, with data powering growth, competitiveness, and strategic decision-making.

Are you considering incorporating a marketplace in your organization’s data strategy? GlobalLogic helps our client partners build end-to-end solutions that improve customer engagement, optimize operations, and bring innovative new products to market faster. Learn more about our Data & Analytics Services here.


One of the most exciting developments in healthcare is the emergence of Software as a Medical Device (SaMD) as a more convenient and cost-effective means to deliver superior care to the tens of millions of people worldwide who suffer from various health conditions.

SaMD is defined by the International Medical Device Regulators Forum (IMDRF) as “software intended to be used for one or more medical purposes that is capable of running on general purpose (non-medical) computing platforms.” In layman’s terms, SaMD is regulated software — installed and operated on “off-the-shelf” (OTS) computing platforms like mobile phones, tablets, laptops, desktops, servers, and/or the cloud — that aids in diagnosing, screening, monitoring, or treating physiological conditions. These SaMD applications cover a wide spectrum of clinical patient conditions, from diabetes management solutions to cloud applications that analyze and generate patient-related insights viewed via a clinician’s portal.

Over the past decade, there has been a major evolution in medical devices. Previously, the vast majority of a medical device’s feature set resided in the device itself. This landscape has since undergone a paradigm shift. With advancements in software engineering, the features and functionalities of SaMD are re-partitioned to take advantage of software and hardware components readily available in the market. Integrating third-party OTS hardware, software, libraries, and services within SaMD applications has created additional clinical value by optimizing patient care in a more efficient and cost-effective manner. Medical device manufacturers using OTS hardware can take advantage of commercial operating systems, third-party software and services, and hardware advances in memory, computing power, connectivity, communications, and screen technology.

SaMD applications are regulated and must follow the same standards that govern medical device software, including the ISO/IEC standards embraced by global regulators, such as ISO 13485, ISO 14971, IEC 62304, and IEC 62366. Depending on the product, other standards may also apply.

SaMD applications often take the form of patient companion apps, so-called because they often serve as a patient’s primary link and interaction point with the medical device system. For example, a chronic pain sufferer can use a SaMD patient companion app to adjust the energy levels of an implantable neurological stimulator within the guard bands set up by their clinician.
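To make the guard-band idea concrete, here is a purely illustrative Python sketch. Every name in it is hypothetical rather than drawn from any real device SDK, and a production SaMD app would sit on a validated, regulated device-communication stack; the sketch simply shows the core rule that patient adjustments are accepted only within clinician-prescribed limits.

```python
# Purely illustrative sketch of clinician-defined guard bands in a
# patient companion app. All names here are hypothetical.
from dataclasses import dataclass

@dataclass
class GuardBand:
    min_amplitude_ma: float  # lowest stimulation setting the clinician allows
    max_amplitude_ma: float  # highest stimulation setting the clinician allows

def apply_patient_adjustment(requested_ma: float, band: GuardBand) -> float:
    """Clamp the patient's requested amplitude to the prescribed band."""
    return max(band.min_amplitude_ma, min(requested_ma, band.max_amplitude_ma))

band = GuardBand(min_amplitude_ma=0.5, max_amplitude_ma=3.0)
print(apply_patient_adjustment(4.2, band))  # 3.0 -- request capped at the band
```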

For a person living with diabetes, the patient companion application is usually part of a distributed diabetes management system that integrates with and presents information from a wearable insulin pump, a CGM (Continuous Glucose Monitor) and Cloud Analytics, helping the patient better manage their glucose levels.

Traditional insulin pumps contained all of their therapy and diagnostic functionality on the pump itself. While clinically effective, this made it difficult to integrate data from third-party CGM sensors and hard to connect and upload data to the applications clinicians use to support remote patients. Moving this functionality into a patient companion app lets it ride on top of a commercial mobile platform used by millions of people.

Another value proposition associated with the use of SaMD applications on OTS hardware is the cost advantage of re-partitioning functionality that historically was on proprietary hardware. This re-design and re-partitioning can minimize the size of the medical device and reduce the CoGS (cost of goods sold) by reducing the number of physical components. SaMD applications on OTS hardware can also improve system usability and garner greater acceptance by patients, clinicians and payors.


Many medical device manufacturers who have developed SaMD apps have gained market share and top-line revenue, and some are realizing even greater gains by offering specialized SaMD applications or subscription services. This direct revenue is projected to grow from $4.4 billion in 2021 to $8.2 billion in 2027.[1]

This all seems very positive, but how are companies managing the transition? Since SaMD patient companion apps are usually part of a distributed system, focusing on systems engineering, system risks, and design fundamentals is key to partitioning functionality across multiple components. Another key is embracing a full lifecycle support plan that tracks hardware, software, and services changes/updates and re-releases applications as required. Finally, UIX design is critical because users expect the same level of design and implementation they experience with their everyday phone apps.

Now let’s consider how SaMD applications impact clinicians. With the incredible growth in the use of implantable and wearable medical devices over the past decade, and an aging population, clinicians and healthcare organizations are challenged by the increased volume of patients requiring periodic follow-ups. With ubiquitous high-bandwidth connectivity through mobile phones, and a SaMD application uploading data to a central server or the cloud, patient data can be automatically analyzed against pre-defined guard bands or limits. If limits are exceeded, clinicians can be alerted to take appropriate action.
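The alerting logic itself can be sketched simply. The thresholds and messages below are hypothetical examples (here, glucose readings in mg/dL), not clinical guidance, and a real implementation would follow the IEC 62304 software lifecycle mentioned above.

```python
# Illustrative sketch of server-side guard-band checks on uploaded
# readings. Thresholds and messages are hypothetical, not clinical advice.
def check_readings(readings_mg_dl: list[float],
                   low: float = 70.0,
                   high: float = 180.0) -> list[str]:
    """Return alert messages for glucose readings outside the guard bands."""
    alerts = []
    for value in readings_mg_dl:
        if value < low:
            alerts.append(f"LOW reading {value} mg/dL: notify clinician")
        elif value > high:
            alerts.append(f"HIGH reading {value} mg/dL: notify clinician")
    return alerts

uploaded = [95.0, 62.0, 140.0, 210.0]
for alert in check_readings(uploaded):
    print(alert)
```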

For device manufacturers, the time is now to embrace patient companion applications

As mentioned, patient companion apps are projected to grow at a robust rate for the next several years — which means that device makers who have yet to embrace companion apps risk competitive displacement if they don’t course correct.

The good news? Companies just now embarking on patient companion app strategies can apply lessons learned from those who have gone before:

  • Focus on utilizing the appropriate and defined ISO/IEC standards for all aspects of the product development of these applications
  • Focus on applications that deliver tangible, verifiable clinical capability based upon science, and/or operational value (not just buzz)
  • Leverage the UIX design adopted by iOS and/or Android as appropriate while adhering to the applicable standards
  • Partner with design, engineering and technical experts who have significant experience in developing SaMD applications, taking advantage of the learnings generated by developing and supporting numerous SaMD apps

GlobalLogic, a Hitachi Group Company, is a leader in digital product engineering that helps clients design and build innovative products, platforms, and digital experiences by integrating our strategic design, complex engineering, and vertical industry expertise with Hitachi’s Operating Technology and Information Technology capabilities. We bring extensive digital engineering experience to help companies develop companion apps, with hundreds of successful projects brought to market over the last half decade.

The time is now to consider how SaMD patient companion apps can help your company achieve its clinical and operational goals.

Read about how a medtech and engineering services partnership is saving lives with a breakthrough cardiac recovery system.

For more information on how GlobalLogic, a Hitachi Group Company, can help you better engage your customers, innovate within predictable budgets, and bring the next generation of companion apps to market in the shortest possible time, visit https://www.globallogic.com/services/industries/healthcare-life-sciences/
