Proper document management practices are essential to the creation, assembly, and maintenance of an eCTD. An Electronic Document Management System (EDMS) generally includes features such as check-in/check-out, version control, and audit trails. An EDMS also provides document-level security for all kinds of file types, including word processing documents, XML, images, and much more. To establish […]
The boss gathered us all in the boardroom and said: “Folks, times are tough, we’ve got to improve our performance.” Or did he say “our efficiency”? Or maybe it was “productivity” that he said. Well, never mind! We all knew what he meant. I rushed to my department and called John. John is a good guy, always running around with his sleeves rolled up, trying to solve some problem. I told him to sum up some numbers to see how we’re doing.
John is smart. He had anticipated my request and had been gathering data in his notebook for some time. Hours worked, who is friends with whom, that kind of stuff. Trouble is, I couldn’t make much of it. The guys at the workshop were working their butts off. Not much to improve there. “With the holidays coming up, we risk burning them out,” he told me. But I’m smart too. “We’ve got to watch them closely,” I said, “and change the processes. That’s how we’ll improve the whatchamacallit.” “I don’t understand,” said John, “what’s the problem? What are we trying to improve?” “Well, the thing, the productivity. We’ve got to make more money. Produce more. Those competitors are killing us.” “Do you want to improve our internal efficiency, track our performance, or increase our productivity?” asked John, and I did not know what to answer him.
How many times have you heard the terms efficiency, performance and productivity? How much time have you devoted to discussing them and looking for ways to improve them? And how many good solutions have you found?
Often, as in the example above, these terms are confused, and initiatives to improve them are met with suspicion by overloaded employees. From the smallest to the largest companies, all have at one time or another wondered how to increase their revenue and profit by optimizing their ways of working. Not-for-profit organizations are less keen to visit such topics, but in fact they are the ones who could profit the most, as they tend to have looser organizations and less stringent, though noble, objectives.
In most cases, an imminent financial threat is what triggers actual action: a competitor shows up offering far cheaper prices or better services, costs suddenly rise due to external factors such as energy prices, and so on. Several methods have been developed that involve the business owners themselves in the improvement process (e.g. Lean and Six Sigma). However, experience has shown that external guidance is key to avoiding complacent solutions and emotional attachment to current practices. But let’s see the definitions:
Efficiency is the extent to which time, effort, or cost is well used for an intended task or function. This is where you can have the most impact. Improving efficiency implies a clear definition of the intended task and of what is meant by “well used”. Process improvement, operational excellence and quality-by-design are some popular methods for improving efficiency. A good mix of internal knowledge and external advice is paramount to achieving efficient use of the limited time, effort and cost available.
All modern improvement methodologies more or less follow the PDCA (plan, do, check, act) approach and the Deming wheel shown below for continuous improvement. The PDCA cycle (or its OPDCA variant, which includes an observation step) consists of setting the plan, executing it, then assessing the result and making changes. The knowledge gained is incorporated into operational standards such as SOPs and raises the inherent quality of the work. A new cycle can then begin to further improve the process. While the methodology is well thought out, logical and has proven beneficial in countless cases, it remains somewhat theoretical and consumes resources upfront. ROI is not always obvious.
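For readers who think in code, the cycle can be sketched as a simple feedback loop. This is a toy illustration with made-up names and numbers, not a tool from any improvement methodology: `measure` plays the Do/Check steps, `improve` the Act step, and the target is the Plan.

```python
# A toy PDCA loop: run the process, compare the result to the plan,
# and feed the gap back into the process until the target is met.
def pdca(measure, improve, target, max_cycles=5):
    """Run PDCA cycles until the measured output reaches the target."""
    history = []                      # record of each cycle's result
    for _ in range(max_cycles):
        observed = measure()          # Do + Check: execute and measure
        history.append(observed)
        if observed >= target:        # target met: standardize and stop
            break
        improve(target - observed)    # Act: adjust based on the gap

    return history

# A made-up process whose output improves by half the gap each cycle:
state = {"output": 6.0}
def measure(): return state["output"]
def improve(gap): state["output"] += gap / 2

history = pdca(measure, improve, target=10.0)  # each cycle narrows the gap
```

Each iteration closes part of the remaining gap, which mirrors the article's point: PDCA improves a process gradually, cycle after cycle, rather than in one jump.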
Performance is the degree to which one obtains the right things, the right way, at the optimal cost. Measuring performance implies setting targets and thresholds for all three dimensions.
Contrary to efficiency, which focuses on setting the processes right, performance measures the degree to which an organization follows internal norms and budgets, and its level of achievement over time. Efficiency tends to move in one direction: towards always better, towards improvement; performance, by contrast, is basically contemplative and analytical. Good monitoring of performance allows detecting threats and opportunities, both internal and external. While efficiency applies across the board and efficiency optimization will improve the way things work, performance management will look for the right spots where action can make the difference.
Finally, productivity is the ratio between the invested effort and the final result. In this respect, productivity represents the sum of all the actions in a given process, including those devoted to efficiency optimization and performance management, and depends not only on the efficiency of each individual process but also on the combined effect of all the processes involved. Productivity is a long-term indicator that can only improve (or deteriorate) slowly over time. It is a multi-factor outcome that cannot be acted upon directly.
So how does the magic equation work? Efficiency x performance = productivity. You may use PDCA or any other methodology to improve the efficiency of your internal processes. Do it regularly (e.g. once a year) and involve the people in charge. Have them review their ways of working, plan changes, then implement and assess them. At the same time, monitor performance in terms of results (whatever the output of each department is), know-how (monitor adherence to internal norms) and cost (monitor budget compliance). This will allow you to assess whether processes are adequate and output is satisfactory, and to discover issues and opportunities. At the end of the year, you will be able to assess the overall productivity and hopefully you will see it improve.
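To make the equation concrete, here is a minimal numeric reading with entirely made-up figures: efficiency as the fraction of effort that is well used, performance as the fraction of targets (results, norms, budget) actually met.

```python
# Toy illustration of: efficiency x performance = productivity.
# Both inputs are fractions between 0 and 1; the figures are invented.
def productivity(efficiency, performance):
    return efficiency * performance

baseline = productivity(efficiency=0.70, performance=0.80)
# Improving either factor raises overall productivity:
improved = productivity(efficiency=0.80, performance=0.85)
```

The point of the multiplication is that neither factor can compensate for the other: a highly efficient process run against the wrong targets, or the right targets pursued wastefully, both drag the product down.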
Technology progressed at an accelerating speed in industrialized countries throughout the 19th and 20th centuries, and there are no signs of it slowing down in the 21st. Among the areas where technology has made a difference in our lives, healthcare is certainly one of the most important.
There is, of course, the science: Biology, dubbed “the science of the 20th century” in comparison to the Physics and Chemistry that dominated previous times. Our detailed understanding of the mechanisms of life has progressed greatly but, beyond the discovery of the DNA code, no other revolution has occurred. Instead, laboratory technologies have allowed cloning of the genomes of humans and other species with a performance that goes beyond the wildest dreams of my own university professors. Genetic engineering has provided the tools for making new drugs and new crops, revolutionizing several medical fields and world agriculture.
Then there is Medicine itself: the fine art of disease diagnosis. Here again, imaging and diagnostic technologies have made the difference; the good old doctor’s intuition has been put on the back burner, replaced by standardized guidelines and recommendations. And then there is surgery, which is also heavily supported by technology.
The result of all the above has been a real miracle. Life expectancy has steadily increased in the past hundred years. Quality of life has also improved in a way that we usually underestimate.
The problem: Life expectancy and QoL
I was recently in Vienna, Austria, and took the time to visit the Hapsburg crypt, where 145 members of the illustrious imperial family have been buried since 1633. The crypt provides a perfect model for studying life expectancy and quality of life inside a privileged family.
The first thing that impressed me was the number of infant deaths. Many members of the family did not survive their first year. Several died at birth, and over 25% of those entombed here were five years of age or less when they died. As the visit moves through time, the lifespan of the members increases steadily, and the last two, Otto von Hapsburg and Zita Bourbon-Parma, lived to be 99 and 97 years old.
The second curiosity comes from the molded skulls decorating many of the sarcophagi in the vault. These were made from the actual skull of the deceased, and one immediately notices the missing teeth. Members of the imperial family as young as seventeen or twenty years old were missing several teeth, and some of them could obviously not eat meat at all. Those teeth did not fall out overnight. They came with painful toothaches for which painkillers did not exist. Those skulls remind us how miserable life could be, even for kings, only decades ago.
Now let’s take a look at life expectancy in France, an industrial country of 60 million people with a traditionally strong healthcare system. In the decade 1997-2007, it increased by 2.5 years. This means that we added 150 million years of human life on French territory during this decade. A hundred and fifty million years of old people’s life, that is (infant mortality did not improve during the period). A hundred and fifty million years of retired people’s life. People who do not work and do not create value, but consume pensions and increasing healthcare costs instead. Assuming a modest cost of 10,000 euros per person per year, France, during this decade, added a cost of 1,500 billion euros: three quarters of its total public debt of 2,000 billion euros.
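The back-of-the-envelope arithmetic behind those figures can be reproduced in a few lines (the 10,000-euro yearly cost is the paragraph's own assumption, not a measured statistic):

```python
# Reproducing the paragraph's back-of-the-envelope estimate.
population = 60_000_000          # France, roughly
gain_in_years = 2.5              # life expectancy gained, 1997-2007
cost_per_person_year = 10_000    # assumed yearly cost, in euros

extra_person_years = population * gain_in_years          # 150 million
extra_cost_euros = extra_person_years * cost_per_person_year  # 1.5 trillion
```

Note that the cost is spread over the extra years of life actually lived, not incurred within the decade itself; the figure is an order-of-magnitude illustration rather than an accounting result.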
So medical science, supported by technology, has performed miracles in the past decades and, for all we know, it can perform even more miracles in the years to come. The question now is not “what are we able to achieve” but “what are we able to afford”. And this is not an ethical question. We have come to a point where we cannot pay for more life unless we change the economic model we live under; until we do, the momentum for more medical miracles will keep diminishing.
Technical miracles are not the only thing our western societies have achieved in the past century. Healthcare insurance is the second pillar of our improving health. Together with retirement pensions, it provides the framework for a long, healthy life; a framework that kept growing and has now become a bottomless pit in our economies. For the first time in modern history, projections show that our children may have a worse life than we did. Fueled by double-digit growth and cold-war confrontation with communist regimes that claimed to provide everything (education, health, transportation, etc.) for free, the western healthcare/pension system has reached its limits and cannot survive in the absence of further growth. And growth is exactly what we do not have. At the same time, health-related expenses are still forecast to grow from 6.7% of GDP to between 10 and 13% by 2050. So how can we preserve the benefits built over the past century in healthcare in the absence of growth?
Technology, once again, may provide the answers. But which technologies?
In a recent article in the MIT Technology Review entitled “The costly paradox of health-care technology”, Jonathan Skinner notes that in every industry but one, technology makes things better and cheaper, and asks: “Why is it that innovation increases the cost of health-care?” He concludes that, among the different technologies available, only those that allow a better use of existing information and infrastructure can contribute to lowering the cost of health-care. I will add those that help lower the cost of developing new drugs.
Drugs: Why are they so expensive?
DAAs (direct-acting antivirals) are potent drugs capable of eradicating diseases like hepatitis C, which is estimated to have infected 185 million people around the globe. One of the major barriers to doing so is the astronomical price of DAAs, which makes it difficult to contemplate large-scale action even within the European Union. Why are many new drugs so costly that even western economies struggle to afford them? And how can we ensure access to effective drugs for low-income populations while maintaining the incentive for new discoveries?
The high price of drugs is mainly due to high development cost. All R&D included, a new drug costs between 4 and 11 billion dollars (Matthew Herper, Forbes Staff, Pharma & Healthcare 2/10/2012), of which an average of 54% is spent in the final stages of development (CMR International, 2012 Pharmaceutical R&D factbook). Which technologies can help reduce the cost and the time of drug development?
Computing power: The atomic bomb paradigm
During the second half of the 20th century, countries developing nuclear bombs regularly tested their new weapons, with devastating consequences for the environment. Today, no more nuclear tests are performed, although the development of atomic weapons has not stopped. This is because computers have become powerful enough to simulate the explosions and their effects, even for new bombs. Could computers be as helpful in saving lives as they are in the nuclear field? Several initiatives have been launched to simulate early-stage clinical trials, and the FDA is fully supportive of this approach, which can allow developers to skip over the initial stages of drug development. In later-stage development, «virtual» clinical trials have been performed with real patients but with no need for costly visits to the investigational sites, as all data were collected remotely over the internet. Today, Web 2.0 and social media are widely used to recruit patients for clinical trials and thus shorten development cycles.
Biology and Moore’s law
Moore’s law is about the evolution of electronic circuits. It is the observation that, over the history of computing hardware, the number of transistors in a dense integrated circuit doubles approximately every eighteen months while the cost is divided by two. We all know what this exponential evolution has done for computers. Well, biology has done much better in the past six years. The cost of gene sequencing closely followed Moore’s law up until 2008, and then the progress accelerated. What cost 100 million dollars in 2001 can now be done for about 5,000 dollars and will soon be included in the price of a routine examination.
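To see how remarkable that acceleration is, one can compute what a strict Moore's-law pace (cost halving every 18 months) would have predicted. This is a sketch under the paragraph's own figures; the 2008 projection is derived, not a quoted statistic.

```python
# Cost of a task whose price halves every 'halving_period' years,
# starting from the cited 100 million dollars in 2001.
def moores_law_cost(initial_cost, years, halving_period=1.5):
    return initial_cost * 0.5 ** (years / halving_period)

# At a pure Moore's-law pace, sequencing in 2008 (7 years on) would
# still have cost a few million dollars:
cost_2008 = moores_law_cost(100_000_000, years=7)
```

Extrapolating the same curve to today would still leave the cost far above the ~5,000 dollars actually observed, which is the article's point: after 2008, sequencing costs fell faster than Moore's law.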
Big data: from personal data to personalized medicine
Computing power and analytical techniques have progressed to a point where huge amounts of data can be collected from each individual and processed in various ways. In the near future, this may allow the quicker development of drugs by selecting the most representative individuals for clinical trials, the production of slightly varying medications better adapted to each patient, better diagnosis, and even the prediction of risks for conditions to come.
Collecting and processing such large amounts of data was unthinkable only years ago, but today, big data analytics are changing the game. It is possible to extract meaning from very large collections of apparently disparate, non-standardized data. It is possible to introduce intelligence into the way these data are processed, make deductions by comparing patterns and, in the end, draw enormous value from what used to be considered the haystack where the needle was lost. The next version of spell check will be smart enough to understand who you are writing to and adapt the spelling to «professional», «casual» or «phonetic». Software companies are partnering with doctors and hospitals to devise new ways to extract information from massive medical data. Chronic diseases, cancer and neuro-degenerative diseases can hope to find treatments and even cures and, most important, personalized treatments better adapted to each patient.
There is a caveat, however, in having such insight into people’s health: the confidentiality of personal information, which can lead to discrimination. Our ethics will certainly struggle to keep up with our technological progress, but in the end we will need to find the right answers, because technological progress is unstoppable.
Standardization: The difficult «easy solution»
Standardization can help exchange and process data, and it was recognized very early as the key to the effective use of medical information. A number of institutions have taken up the task of creating and maintaining standards; WHO, ISO and HL7 are some of them, and they each manage tens of different standards. This prompted the famous saying by computer scientist Andrew Tanenbaum: “The nice thing about standards is that you have so many to choose from.”
Data standardization has progressed at a slower pace than anticipated. It has progressed nevertheless and a new landscape is slowly emerging that allows greater digitization of medical information and better use of it. Some countries like Israel now have fully digitized medical records and can expect to be the first to take advantage of new information technologies in the healthcare area.
The European Union has also made considerable efforts to standardize medical records across the 27 member states in the framework of its fundamental mission: to allow the free movement of European citizens inside the European space. The Union recently published guidance for information exchange that Paola Testori Coggi, Director General for DG Health & Consumers, European Commission, sees as a key step for collaboration for the benefit of patients.
The European Union has launched a vast program, called eHealth, to allow better use of computer technologies in healthcare management. This program includes several projects aiming to make information more reliable, more accurate and more exchangeable, in order to support people’s mobility and make better use of existing infrastructure within the boundaries of the Union.
Is there still hope?
The recent economic crisis seems to have opened Pandora’s box, highlighting a great number of issues that came along with the progress of our quality of life; their resolution resembles a Chinese puzzle sitting on top of a ticking bomb. But in the Greek myth of Pandora, one last element remained inside the ill-fated box and came out last: Hope.
A recent study entitled “Old age mortality and macroeconomic cycles” found that “In developed countries, mortality rates increase during upward cycles in the economy, and decrease during downward cycles. This effect is similar for the older and middle-aged population. Traditional explanations as work-stress and traffic accidents cannot explain our findings. Lower levels of social support and informal care by the working population during good economic times can play an important role, but this remains to be formally investigated.”
Imagine a world where you could write a 500-page book in a few hours. Where one can prepare the next season’s catalog of an online retail business, with new products and prices, in a few clicks. A world where you can quickly review and update a large clinical trial report, and where you are always on time with the review and revision of all your SOPs. Well, this world exists. We just need to learn to live in it. The foundation, as is often the case with revolutionary novelties, has existed for a long time. It is called “xml”, which stands for “Extensible Markup Language”. I will not bore you with technical explanations. I do not need to know how the ABS of my car works. All I want to know is that when I hit the brakes on a slippery road, the car will, within limits, avoid going off the road. Same here. How xml works is the business of IT. What it can do for you and me is our business.
And it can do an awful lot. In a nutshell, xml is the combination of text and data; of “structured” and “unstructured” information, to use a more technical term. In other words, xml adds intelligence to plain text. Artificial intelligence, that is. A text can be very intelligent, but only for human eyes. And human eyes can only read at a limited speed. When you are asked to read through a 600-page clinical study report and make sure that all changes from the last review have been implemented appropriately, you have a problem. So you ask for help from the study team. Now there are three people, including you, reviewing the same document. But they cannot just split it in three, as the same information appears in several parts of the beast. So three people are now making more changes to this already complex document and, no surprise, there will be another round of checking to make sure that all the changes match. Do you get the picture? Have you already experienced the pain?
Now imagine that you have a way to identify each part of the document as it is being written, a way to use pre-defined sentences or whole paragraphs, and even to use language that has been previously approved by regulatory authorities, so you take no risk of saying something wrong or inconsistent. What xml does is just that. It allows a writer to “tag” text and give it a meaning; a meaning that computers will understand, so that they can assist the writer by grouping related parts of the text, building libraries of terms, phrases, paragraphs and chapters, and assigning actions to each of them. For instance, when a regulation changes, you could be alerted that some of your SOPs need to be updated; you would know exactly which ones and which part of the text needs changing, and when you come to make the change, you only need to do it once for it to be repeated in all the places where it appears.
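Here is a minimal sketch of what “tagged” text looks like and why machines can act on it, using Python's standard xml.etree library. The element and attribute names (sop, role, responsibility, the regulation reference) are illustrative inventions, not any real authoring standard.

```python
import xml.etree.ElementTree as ET

# A fragment of tagged SOP text; names and the regulation id are made up.
sop = ET.fromstring("""
<sop id="SOP-001">
  <role name="study-physician">
    <responsibility reg="hypothetical-regulation-42">
      Reviews adverse event reports.
    </responsibility>
  </role>
</sop>
""")

# Because the text is tagged, a program can find every responsibility
# tied to a given regulation, instead of a human re-reading the document:
hits = [r.text.strip() for r in sop.iter("responsibility")
        if r.get("reg") == "hypothetical-regulation-42"]
```

When the hypothetical regulation changes, a query like this lists every affected passage across thousands of documents in milliseconds, which is exactly the alerting scenario described above.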
Now here comes the best part: all the tools for making what I described above happen already exist and have been used extensively in the publishing business and other industries. Pharma has always been slow to adopt innovative technologies. There are many reasons for that: our very long development cycles allow time for manual work to be done; our pharaonic budgets and revenues are too large to risk having issues with new, poorly understood technologies; and our highly specialized staff finds it hard to change old habits that have proven to work in the past. Still, we eventually do adopt them, and when we do, we are pretty good at it. For the exact same reasons. We want to make the most of what we use because the stakes are so high.
So what if we tried? Take SOPs as an example. Every pharma company has hundreds or even thousands of SOPs describing the precise way in which business is conducted. We are in the most highly regulated of all businesses, and health authorities perform regular audits to make sure that our SOPs reflect the full set of rules and regulations worldwide. We are supposed to review and update our SOPs on a regular basis and, of course, to make sure that they are consistent with one another as well as with our job descriptions and other sources of information. Can you put your hand over your heart and say that all this is true in your organization and always will be? If not, then what is your risk? At best, you are doing a huge amount of work for little benefit, as people tend to view SOPs not as a useful aid but as a necessary evil imposed upon them by regulatory bureaucrats. At worst, you may get some nasty findings during audits and even end up with a consent decree forcing tight FDA control of all your work for years, until they feel that you are back on track. Not to mention potential fines that can be very high by any standard.
Now let me show you how you can do complex work with reasonable resources while feeling confident about the output. How you can produce SOPs that are actually useful and will be used to increase the quality and decrease the cost of your work. SOPs that people will trust and use and be willing to refer to.
Take an xml editor. It looks just like any word processing software, like Microsoft Word, only with a few additions. As you start typing, the software asks whether you wish to use text from the library it has in store and guides you in doing so. You do not have to use the suggested text, but as you write, you realize that it is easier, faster and more reliable. Whole paragraphs from previous texts can be reused and, as you move into the complexity of a clinical document for example, you reference text instead of re-writing it, so that if a change is made later, it can be made in one place and repeated everywhere inside the document. The more you work in the system, the richer it becomes.
Back to our SOP example. Here we have a collection of documents, all with the same structure (object of the SOP, scope of the SOP, roles and responsibilities, and so on). The main challenge in this case is to avoid inconsistencies and to track inter-dependencies. This is very easily done using the SCA libraries. For example, the role of a person within the organization must be described in a consistent, although not identical, manner throughout the SOP system. A study physician has certain responsibilities, some of which appear in one SOP and others in another. If the job description is stored in the library with the right granularity, the different responsibilities being listed separately and linked to the relevant regulations, it is very easy to refer to these when writing the SOPs. If a regulation later changes, the system will issue an alert that the responsibility needs to be updated, point to the SOPs that contain it and allow changing them simply and quickly. Periodic SOP review is made easy, as it can be done by theme rather than by document.
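The single-sourcing mechanism at the heart of this can be sketched in a few lines of toy code. All names here (the library keys, SOP ids, responsibility wording) are invented for illustration; a real SCA system would of course be far richer.

```python
# Toy model of single-sourcing: each SOP references shared library
# entries by key, so a change made once reaches every SOP using it.
library = {
    "resp.ae-review": "The study physician reviews adverse event reports.",
}
sops = {
    "SOP-001": ["resp.ae-review"],
    "SOP-017": ["resp.ae-review"],
}

def render(sop_id):
    """Assemble an SOP's text from the library entries it references."""
    return " ".join(library[key] for key in sops[sop_id])

# A regulation changes: edit the library entry once...
library["resp.ae-review"] = ("The study physician reviews adverse event "
                             "reports within 24 hours.")
# ...and list the SOPs impacted, exactly the alert described above:
impacted = [sid for sid, keys in sops.items() if "resp.ae-review" in keys]
```

Because SOPs are assembled from references rather than copies, consistency is a structural property of the system instead of something a reviewer has to re-verify document by document.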
MAIA Consulting can assist your company in understanding in depth the possibilities of SCA, guide you through the choice of the right tools and help implement a state-of-the-art authoring system based on the latest technologies.