
Emerging technology and content buying

Author: Penny Leach, Associate Director, EBRD, and BIR Editorial Board Member

Please note this post contains the personal views of the author and is not connected with her employer.

Emerging technology and innovation are affecting content buying – and selling – in multiple ways. This was the conclusion of a lively session held at the SLA Conference this year. The situation is evolving rapidly, with varying levels of appetite and capacity to seize the exciting opportunities. As is so often the case, collaboration between multiple parties is more likely to lead to success: harnessing data in ways that free human intelligence for higher-value activity, and creating appropriate commercial models. However, there are challenges and concerns – the fear of unknown costs, of loss of control over proprietary content, and of missing out (and being disenfranchised) through a lack of knowledge, resources and appropriate infrastructure – raising concerns in both the private and public sectors.

The SLA Conference this year was held in Baltimore in June. The Conference is a great way to meet information professionals and other members of the information community from across the globe and to build better connections in person. Every year the SLA Leadership & Management Division’s Content Buying Section brings together an experienced panel representing different perspectives in the community of content vendors and buyers, to provide reality-based insight. This year the panellists were Amy Davis, Senior External Content Advisor at EY; Tim Baker, Global Head of Innovation at Thomson Reuters (now Refinitiv); and Bill Noorlander, Director of BST America (Conference sponsor).

The panel focussed on four emerging technologies that are creating content and new ways of deriving value from content: the Internet of Things (IoT), data analytics, Artificial Intelligence (AI) and Robotic Process Automation (RPA). Early on, the largely buy-side audience was reminded that content is not normally for sale but rather is leased for specific purposes – hence the complex contractual terms that are needed to protect all parties (content creator, provider and user).

Several themes emerged from the discussion, and from audience questions during the interactive session. Generally, the new content and technologies are seen to enable several kinds of ‘smarter’: a better client experience through more visual and user-friendly products, more machine-ready data that customers can use in their own apps, and more efficient companies that use their own data effectively to reduce cost (through automated processes) and add value (e.g. by finding more content to enrich products).

There is increasing use of sensor-based devices in personal, industrial and civic applications (the IoT). This is creating new and extremely high-volume data streams to add to the fast-growing mass of structured and unstructured data that is already part of our digitised world. This data ‘exhaust’, a by-product of core businesses, offers opportunities for monetisation – for example in the financial sector – but with caveats that (as ever) mean ‘free’ is not really the case. These alternative data sets are messy, fragmented, and lack standardisation and history; they are hard both to use effectively (signals can be weak) and to price. For vendors, it is costly to develop and maintain new commercial offerings where client needs might be very specific. There are hurdles, too, around data privacy and ownership, and around legal terms such as the definition of users. ‘Bots’, for example – one of the tools created by AI, and an example of RPA that can free humans from repetitive tasks – may be prohibited by legacy contracts. And just how do you count ‘eyeballs’ and fingertips?

On the buy side, the panellists concurred that it is better to have multiple stakeholders at the table – information professionals familiar with content licensing and with the concept of reference interviews to articulate data needs, IT, procurement, legal advisors and, of course, the business process owners – to determine the requirement, negotiate new or amended licence rights, match price to available budgets and, not least, implement the new tools.

New players are emerging: new intermediary service companies, such as data ‘wranglers’ with data science and analytics skills (e.g. Quandl), and new roles such as the Chief Data Officer (CDO). More tools are needed to commoditise processes, reduce development costs and deal with the challenges. Blockchain, for example, may help with the tracking of data elements. As ever, watch this space!

Realising the value of data – Third Theme in our BIR Annual Survey

This is the third in our series of themes from the latest BIR annual survey. The value of data is constantly being discussed within organisations: How do we make the most of the data we have? How do we realise the benefits? How do we know what we know? How do we commercialise it?

All are interesting questions, and all equally important. The popularity of ‘big data’ began to rise around 2005 – organisations have been collating data for much longer than that, but technological advances that culminated around this time made it possible to gather and use large, potentially disparate data sets. Since then, organisations have increasingly been gathering data on their customers, competitors, markets and business environments, to name a few, and trying to work out how to realise the value of the data they have collected. Even today, with advances in artificial intelligence (AI), organisations still struggle to assess the value of data. Done correctly, such an assessment can inform strategy and investment in future business assets and acquisitions; done badly, it can be very costly indeed. There are a number of ways of measuring the value of data, but as yet none is accepted as the way forward.

McKinsey have written articles and conducted research in this area. They have found that organisations able to leverage customer insights to inform and improve the business outperform their peers by 85% in business growth and sales. McKinsey note that most organisations find it difficult to realise the potential value of their data because different technologies, legacy systems and siloed working mean that data is fragmented all over the place. It is this situation in particular that hinders organisations from taking real advantage of the data they already hold, and it can lead many to invest externally in research and competitive analysis in order to leverage value from data.

What is the answer? You, the information professional, are the key. An understanding of how to search internal data, of where it is located and how it is structured, and of the context in which it was found and stored, is vital to making sense of the wealth of data an organisation holds. Jinfo reported on the importance of the information professional in Data Analytics – Ready Your Information Service (see references below), looking at the importance of source expertise for gathering and analysing external data. In gathering and analysing data, context and source are key to providing accurate insights that inform organisational strategy.

Read more about what information teams are considering and doing today to have an impact on data value in our annual research report, published in the September issue.

References

Your Data Is Worth More Than You Think, MIT Sloan Management Review: https://sloanreview.mit.edu/article/your-data-is-worth-more-than-you-think/

Capturing Value from Your Customer Data, McKinsey: https://www.mckinsey.com/business-functions/mckinsey-analytics/our-insights/capturing-value-from-your-customer-data

How Valuable Is Your Company’s Data?, InformationWeek: https://www.informationweek.com/big-data/big-data-analytics/how-valuable-is-your-companys-data/a/d-id/1331246

Data Analytics – Ready Your Information Service, Jinfo: https://web.jinfo.com/go/sub/report/2760