Building a floating city on the Great Lake of data: Pharma’s 2016 challenge
The life sciences industry is a knowledge business. Data from scientific research, drug development, billions of customer interactions and regulatory compliance are growing exponentially. Simply put, pharma companies create, curate and consume data at a faster rate every day.
Imagine the industry as an old city built next to a lake where the water level is rising dramatically. The city is in danger of being overwhelmed by falling innovation, fewer products making it to the global market and the crushing cost of failure.
How can the top talent in pharma IT help the industry avoid the inevitable and create a new future? I found the group at Pharma IT 2016 aligned and ready to build a floating megacity.
Building for flexibility
With rising water levels, building static platforms – even on stilts – is a recipe for failure. Trying to organise the mass of data types, from the patient to the pipette, is a labour of Hercules, and it won’t save the city. Instead, the data architects have stopped pile-driving the lakebed and are planning to build rafts. We need to develop systems that float above the data, accessing it using technologies brought in from other sectors. Of course, this sets a challenge for innovators to overcome, and there’s plenty of healthy scepticism about the approach, but it’s still raining data and change is needed.
Herding the analytics cats
Data is a commodity: information has value. Critical to success in the new tomorrow will be how analytics target the right data, work with sparse or dirty data and bring it all together. The diversity of siloed, stand-alone analytics systems is becoming as complex as the data they’re using. There’s a strong move from the community toward building a network of these systems to drive interoperability. This ‘community’ of harnessed modelling systems, from R&D to market, will allow data to be piped, purified and processed, reducing the costs and wasted time caused by many existing approaches to data modelling.
Working together (almost)
To build a workable community, people must work together. There were good (and some worrying) examples of this behaviour on show. Positive action is being taken by pharma companies to share their own (anonymised) data on safety and drug metabolism for the benefit of the community. This self-help will bear fruit. However, some initiatives are established to use ‘open’ data but not to share their experiences of success or failure externally. This leaves the community to reproduce failure over and over again, behind closed doors, without telling anyone. Best practice from other sectors of the industry must in the end win out if the community that lives on data is to succeed.
IT as a partner not an order-taker
In many areas IT is now shedding the yoke of service provider and becoming a valued team member. An organisation’s ability to use data will, in the future, determine its success. In short, IT enables the science (what we know) to produce new products (what we want) and to engage clinicians (who are our customers) and their patients (whom we benefit).
Drying off after the wet work
In R&D there remains a large amount of lab and clinical experimentation that could be replaced by high-quality modelling. This is anathema to some and common sense to others. Just as the industry was pressured from outside into reducing animal experiments deemed unnecessary, the pharma informaticians are campaigning internally. They see a future where R&D and regulators’ decisions are based on the output of qualified algorithms drawing on large volumes of data, rather than on variable ‘wet’ clinical and research testing. Even the FDA is now examining how to let the learning algorithms do the teaching (@precisionFDA). It’s going to take time, but as experiments become more expensive and challenging to do, the focus will move to ensuring that the ‘wet’ stuff is done only where it will yield the most insight into the unknown.
Look outside the bubble
The IT sector in pharma moves surprisingly slowly. Many of the technologies and approaches used today were developed twenty years ago and, even with some nips and tucks, they look their age. Desktops with BI elements and graphs are OK but still lag behind many of the systems used outside our industry. Semantic and RDF approaches, developed for other sectors, are starting to pervade but are less common than the use of the same technologies we all use at home.
The boom of the guns from battle cruisers Microsoft and Apple has now been heard but the bow-wave they will make is still some way off. Whatever their entry may bring, the sector is showing some signs of using technology from gaming (for clinician and patient engagement) and modelling (from 3D-printing) to inject some innovation. How we engage the astrophysics, fintech, media and retail communities to share best practice will be critical. They all face massive data challenges and have focused resources to tackle them.
Influencing culture as much as technology
The meeting discussed how the landscape is changing. As the water rises and we build the megacity of pharma, floating on our Great Lake of data, we need more than technology. Moving from a tradition of firm footing on land to becoming waterborne, technology changes must be supported by – or drive – cultural change. Communities don’t like change but it’s coming. Senior leadership and sponsorship are needed to help large pharma accept that they need to live and work differently if they’re to survive the flood. Some firms won’t survive, but those that do will represent a new brand of pharma: borne on data and comfortable to be afloat.
To find out what good talent in digital healthcare looks like see our Talent Equity report.