For the past few years, one of the most important research areas in radiology has been the search for reliable imaging techniques to diagnose early Alzheimer's disease. This has involved attempts to identify deposits of both beta amyloid and tau protein (see: Diagnosing Alzheimer's Disease with Imaging and Biomarkers; Brain Plaque Diagnostic Imaging Procedure Approved by FDA; Alzheimer's Amyloid Tangle Theory Will Be Tested with Merck Drug Trial). A recent article provides a progress report on this research relating to tau protein (see: Tau Imaging Strong Predictor of Memory Loss):
Tau protein in key brain areas of cognitively normal older individuals is strongly correlated to memory loss over time, preliminary research shows. [A researcher recently] found that F18 T807, a novel positron emission tomography (PET) tau tracer that binds to tau in both the entorhinal cortex and the temporal neocortex, is related to episodic memory decline in normal older people. "Our findings indicate the spread of tau, under the influence of, or in the presence of [beta amyloid], to widespread cortical regions may be the sign, the smoking gun, that cognitive impairment is imminent or already under way," [he said]. ... In addition to beta amyloid (Aβ) plaques, tau neurofibrillary tangles are hallmark pathologies of AD that are laid down many years before symptoms of cognitive decline present. However, until now, detection of tau in the brain has only been possible at autopsy. A normal component of nerve cells, tau helps stabilize the internal neural transport system and supports the architecture of neural cells. However, in dementia, the protein transforms and begins to accumulate in abnormal forms. Previous research suggests that levels of tau in the brain may be more closely associated with cognitive decline in AD than Aβ. ... Going into the study, the investigators hypothesized that tau was going to be "closer to the action" than Aβ and indicate incident impairment, and that "when the tau compound binds to the sites in the brain, it was going to show us that these are the people that are about to have trouble or who have already started to have cognitive impairment." The investigators' hypothesis bore out, and they found that there was a "highly significant" relationship between a worsening of memory performance and higher levels of T807 binding in the brain.
Identifying early buildup of these proteins in the brain, before cognitive decline occurs, may offer a road to early detection and diagnosis of AD and help identify candidates for prevention studies, investigators note.
Being able to diagnose Alzheimer's disease (AD) in its early stages is important for two reasons. First, it's an important step for differentiating it from other forms of dementia. "The most common form of dementia is Alzheimer's disease (75%). Other forms include Lewy body dementia, vascular dementia, frontotemporal dementia, progressive supranuclear palsy, corticobasal degeneration, normal pressure hydrocephalus and Creutzfeldt–Jakob disease (see: Dementia)." Second, a definitive diagnostic method is critical for conducting clinical trials of drugs that could cure or prevent the development of the disease. Here is a description of how AD is currently diagnosed (see: Alzheimer Disease Imaging):
Current diagnosis of Alzheimer disease ... is made by clinical, neuropsychological, and neuroimaging assessments. Routine structural neuroimaging evaluation is based on nonspecific features such as atrophy, which is a late feature in the progression of the disease. Therefore, developing new approaches for early and specific recognition of Alzheimer disease at the prodromal stages is of crucial importance.
— Rasu Shrestha MD MBA (@RasuShrestha) July 31, 2014
This tweet struck me as I consider some of the technologies at the core of healthcare. As a patient, I find many of the healthcare technologies in use extremely disappointing. As an entrepreneur, I’m excited by the possibilities that newer technologies can and will provide to healthcare.
I understand the history of healthcare technology, and so I understand much of why healthcare organizations are using some of the technologies they do. In many cases, there’s just too much embedded knowledge in the older technology. In other cases, many believe that the older technologies are “more reliable” and more trusted than newer technologies. They argue that healthcare needs extremely reliable technologies. The reality is that the age of these technologies doesn’t stop anyone from purchasing the software (yet?). So, why should these organizations change?
I’m excited to see how the next 5-10 years play out. I see an opportunity for a company to leverage newer technologies to disrupt some of the dominant companies we see today. I’m reminded of this post on my favorite VC blog. The reality is that software is a commodity, and so newer and better software can replace and displace the incumbent.
I think we’ve seen this already. Think about MEDITECH’s dominance and how Epic is having its heyday now. It does feel like software displacement in healthcare is a little slower than in other industries, but it still happens. I’m interested to see who replaces Epic at the top of the heap.
I do offer one word of caution. As Fred says in the blog post above, one way to create software lock-in is to create a network of users that’s hard to replicate. Although, he also suggested that data could be another way to make your software defensible. I’d describe it as data lock-in and not just data. We see this happening all over the EHR industry. Many EHR vendors absolutely lock in the EHR data in a way that makes it really challenging to switch EHR software. If exchange of EHR data becomes widespread, that’s a real business risk to these EHR software companies.
While it’s sometimes disappointing to look at the old technology that powers healthcare, it also presents a fantastic opportunity to improve our system. It is certainly not easy to sell a new piece of software to healthcare. In fact, you’ll likely see the next disruptive software come from someone with deep connections inside healthcare partnered with a progressive IT expert.
A virtual rehabilitation program after amputation: a phenomenological exploration.
Disabil Rehabil Assist Technol. 2013 Nov;8(6):511-5
Authors: Moraal M, Slatman J, Pieters T, Mert A, Widdershoven G
Abstract. PURPOSE: This study provides an analysis of bodily experiences of a man with a lower leg amputation who used a virtual rehabilitation program. METHOD: The study reports data from semi-structured interviews with a 32-year-old veteran who used a virtual environment during rehabilitation. The interviews were analyzed using interpretative phenomenological analysis (IPA). RESULTS: During this rehabilitation program, he initially experienced his body as an object, which he had to handle carefully. As he went along with the training sessions, however, he was more stimulated to react directly without being aware of the body's position. In order to allow himself to react spontaneously, he needed to gain trust in the device. This was fostered by his narrative, in which he stressed how the device mechanically interacts with his movements. CONCLUSION: The use of a virtual environment facilitated the process of re-inserting one's body into the flow of one's experience in two opposite, but complementary ways: (1) it invited this person to move automatically without taking into account his body; (2) it invited him to take an instrumental or rational view on his body. Both processes fostered his trust in the device, and ultimately in his body. IMPLICATIONS FOR REHABILITATION: Providing (more) technological explanation of the technological device (i.e. the virtual environment), may facilitate a rehabilitation process. Providing (more) explicit technological feedback, during training sessions in a virtual environment, may facilitate a rehabilitation process.
Characteristics of Successful Technological Interventions in Mental Resilience Training.
J Med Syst. 2014 Sep;38(9):113
Authors: Vakili V, Brinkman WP, Morina N, Neerincx MA
Abstract. In the last two decades, several effective virtual reality-based interventions for anxiety disorders have been developed. Virtual reality interventions can also be used to build resilience to psychopathology for populations at risk of exposure to traumatic experiences and developing mental disorders as a result, such as for people working in vulnerable professions. Despite the interest among mental health professionals and researchers in applying new technology-supported interventions for pre-trauma mental resilience training, there is a lack of recommendations about what constitutes potentially effective technology-supported resilience training. This article analyses the role of technology in the field of stress-resilience training. It presents lessons learned from technology developers currently working in the area, and it identifies some key clinical requirements for the supported resilience interventions. Two processes made up this research: 1) developers of technology-assisted resilience programs were interviewed regarding human-computer interaction and system development; 2) discussions with clinicians were prompted using technology-centered concept storyboards to elicit feedback, and to refine, validate and extend the initial concepts. A qualitative analysis of the interviews produced a set of development guidelines that engineers should follow and a list of intervention requirements that the technology should fulfill. These recommendations can help bridge the gap between engineers and clinicians when generating novel resilience interventions for people in vulnerable professions.
Not so long ago, CCR and CCD were both common terms used in health IT. In fact, both CCR and CCD were compliant formats for sending patient summary data in the 2011 Edition of EHR certification. But since CCR was left out of the 2014 Edition of EHR certification as a valid way to send summary of care documents, the term has started to fade away.
As a refresher, CCR stands for Continuity of Care Record, and it was an electronic document used when transferring patients between care settings. It started out as a paper document and had a long history of providing the appropriate clinical data so that the patient’s care could continue. CCR was an ASTM standard.
At the same time, HL7 developed their CDA standard that could apply to all types of documents utilized in a healthcare setting. To make a long story short, the content from a CCR was merged into a CDA format and the newly created document was called the Continuity of Care document, or CCD.
Even though CCD was meant to be the harmonization of CCR and CDA, CCR still stuck around for a while and was included in the original rules for Meaningful Use Stage 1. As a vendor testing for the 2011 Edition criteria related to Meaningful Use Stage 1, Corepoint Health had the choice in our test procedures to utilize a CCD or CCR to test for §170.306 (d)(1) Electronic Copy of Health Information. As an integration application, we also had OEM customers who utilized our application to test for this criterion. We would always ask, “Did you choose CCR or CCD for your testing?” Everyone would answer CCR. Why? Their response: “It’s just simpler.”
But now with Meaningful Use Stage 2 and the 2014 Edition of EHR certification there is no choice. Consolidated CDA, which includes CCD as one of its document types, is the standard of choice. To gauge the progress of using Consolidated CDA so far, JAMIA recently published a study. The study found that many EHR systems do not exchange data correctly using Consolidated CDA.
The study sampled 107 healthcare organizations using 21 EHR systems and found 615 errors and data expression variations across 11 key areas. The errors included missing data, incorrect coding terminology, and errors within the XML. The researchers commented that “any expectation that Consolidated CDA documents could provide complete and consistently structured patient data is premature.”
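To make the “errors within the XML” finding concrete, here is a minimal sketch, using only Python’s standard library, of the kind of basic structural checks a receiving system might run on an incoming CDA document: is it well-formed XML, is the root element ClinicalDocument, and are a document-type code and a patient recordTarget present? This is an illustration only, not a substitute for the schema- and Schematron-based conformance testing actually used for C-CDA, and the sample document string is made up.

```python
# Minimal structural sanity checks for a CDA document (a sketch, not a
# real validator; true C-CDA conformance testing uses HL7's schema and
# Schematron rules).
import xml.etree.ElementTree as ET

CDA_NS = "urn:hl7-org:v3"  # the HL7 CDA XML namespace

def basic_cda_checks(xml_text):
    """Return a list of problems found in a CDA document string."""
    problems = []
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as e:
        return [f"not well-formed XML: {e}"]
    if root.tag != f"{{{CDA_NS}}}ClinicalDocument":
        problems.append("root element is not ClinicalDocument")
    # Every CDA document carries a document-type code and a patient.
    if root.find(f"{{{CDA_NS}}}code") is None:
        problems.append("missing document-type <code>")
    if root.find(f"{{{CDA_NS}}}recordTarget") is None:
        problems.append("missing <recordTarget> (patient)")
    return problems

# A made-up fragment that is well-formed but incomplete:
sample = '<ClinicalDocument xmlns="urn:hl7-org:v3"><code code="34133-9"/></ClinicalDocument>'
print(basic_cda_checks(sample))  # prints ['missing <recordTarget> (patient)']
```

Checks like these catch only the grossest class of errors the study describes; the coding-terminology and missing-data problems require validating against the full implementation guide.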
With such apparent struggles for vendors and implementers to get the standard correct, you may be wondering, “Why did healthcare IT go the more complex route?” It was already apparent from the 2011 Edition EHR testing that vendors preferred the simplicity of CCR when given the choice. So why then does Consolidated CDA make sense for the path moving forward?
Consolidated CDA is made up of nine document types – including CCD, Discharge Summary, Diagnostic Imaging Report, Consultation Note, and others. And while the schema for CDA is much more complex than CCR’s, it can be applied to many more use cases. So while the bar is set high to get the summary of care document truly interoperable for Meaningful Use Stage 2, the bar should be much lower to allow for other types of Consolidated CDA documents to be added once interoperability is achieved for summary of care.
Did healthcare IT choose the wrong path by not choosing CCR? It certainly seems so at this point. But one must also step back and ask where the industry should be three to five years from now. Do we want Diagnostic Imaging Reports to be commonly exchanged as electronic documents? How about Consultation Notes?
If we are at the end of the road and in the future we only ever want to exchange documents for transition of care, then I say we definitely should have chosen CCR. It’s just easier. But if we have a longer term vision of interoperability that includes more document types than just for transition of care, then the price paid now to get Consolidated CDA working should speed up the progress of the utilization of all the document types included in the Consolidated CDA family.
When I was speaking at the gMed user conference, I learned about many of the users who participated on a GI focused online forum for gMed users. Essentially it’s an independent user group for gMed EHR users. Although, with so many GI doctors in one place, you can be sure there are all sorts of focused discussions that would be of interest to gastroenterology doctors.
Of course, there are a lot of other online forums that are similar to the GI forum. For example, Amazing Charts has a really active user forum. The open source EHR, OpenEMR also has a forum. I’m sure there are a lot more. I’d love to hear about other EHR forums you know about in the comments.
Many people probably don’t know that I built up much of my EMR knowledge participating in the now defunct EMR Update. It was a fantastic way for me to learn and share my knowledge. I’m sure that those who participate in the various EMR forums above get the same benefit. Although, it’s probably even more valuable since the forums above are all on the same EHR software.
I can’t tell you how valuable it is for a clinic to be able to turn to other users when they run into trouble. One of the best ways to optimize your EHR is to interact and exchange ideas with other end users. That’s why I’ve started creating a list of EHR user conferences as well. However, for those who can’t take the time off to go to a user conference, an online discussion forum is a great alternative. I’m surprised that more EHR vendors don’t create these types of forums.
The topic of companion diagnostics has been of continuing interest for me. I have posted a number of notes about this type of lab test (see, for example: More Details About Roche's Companion Diagnostics Strategy; A Closer Look at Companion Diagnostics Strategies; Consideration of a Broader Definition for "Companion Diagnostics"; Some Interesting Insights into Companion Diagnostics; Companion Diagnostics Gains Wider Acceptance in FDA Despite Challenges). Now comes news of a new collaboration between AstraZeneca and Roche on a diagnostic test to be used as a companion for AZD9291, an investigational compound (see: AstraZeneca, Roche Collaborate on Companion Diagnostic Test). Below is an excerpt from the article:
AstraZeneca announced it has entered into collaboration with Roche to develop a plasma-based companion diagnostic test to support AZD9291, AstraZeneca’s investigational compound in clinical development for non-small-cell lung cancer (NSCLC). The companion diagnostic test is designed to identify epidermal growth factor receptor (EGFR) mutations in both tumor tissue and plasma derived from patients with NSCLC, and to optimize the clinical development of AZD9291 for patients who are resistant to first-generation EGFR tyrosine kinase inhibitors (TKI). Currently, patients who have been treated with EGFR-TKIs in whom the disease has progressed have to undergo a repeat biopsy to assess whether they have a specific mutation, T790M. Diagnostic tests based on circulating DNA (ctDNA) in plasma samples provide an alternative method of identifying the T790M mutation....“Currently, late-stage lung cancer patients have to undergo surgery to collect tissue from a tumor so it can be sent for molecular testing,” said Paul Brown, head of Roche Molecular Diagnostics (RMD). “In some cases, collecting enough tissue for testing is not possible. This collaboration will enable molecular testing through plasma specimens and provide the information needed to inform treatment decisions without the complications of surgery, consequently increasing the level of care clinicians can give to the patient.” NSCLC represents approximately 80 to 85% of all lung cancers. Unfortunately, at the time of diagnosis approximately 70% of NSCLC patients have developed advanced or metastatic disease not amenable to surgical resection.
To summarize, AstraZeneca is working with Roche on a companion diagnostic to identify epidermal growth factor receptor (EGFR) mutations in both tumor DNA and DNA circulating in plasma. The source of the latter is DNA shed by tumor cells, hence the term circulating tumor DNA (ctDNA). Analysis of plasma ctDNA to detect tumor recurrence avoids the need to biopsy tumor tissue and thus the complications of additional surgery. This article illustrates the dual use of companion diagnostics. First of all, the test can be used to determine whether a patient is a candidate for a particular drug therapy -- compound AZD9291 in this case. The fact that the drug does not yet have a proprietary name is evidence of the very early stage at which the companion diagnostic is being developed. Secondly, the diagnostic test can also be used to identify circulating tumor DNA in the plasma.
NSCLC, a confusing term, is the most common type of lung cancer. Unfortunately, and as noted above, some 70% of NSCLC patients have advanced disease at the time of diagnosis that is not amenable to surgical resection. The term NSCLC encompasses the following three histologic types of neoplasm (see: Lung cancer - non-small cell): adenocarcinoma, squamous cell carcinoma, and large cell carcinoma.
And the winner is... Sabine Petry and co-workers, Petry Lab, Princeton Department of Molecular Biology.
Description: Microtubules are hollow filaments that serve as the skeleton of the cell. They were thought to grow linearly, but this movie shows that they can branch: microtubules (red with growing tips in green) grow off the wall of existing microtubules. In addition, microtubules are moved along the glass surface by molecular motors. Microtubule branching amplifies the microtubules while preserving their polarity and explains how microtubules can cause the mitotic spindle of a dividing cell to reliably segregate chromosomes (Petry et al., Cell 2013).
Scale: A microtubule has a diameter of 25 nanometers and is the largest cytoskeletal filament in the cell.
More on the Princeton University Art of Science competition: http://artofsci.princeton.edu/
Sidewalk collisions involving pedestrians engrossed in their electronic devices have become an irritating (and sometimes dangerous) fact of city life. To prevent them, what about just creating a no-cellphones lane on the sidewalk? Would people follow the signs? That's what a TV crew decided to find out on a Washington, D.C., street Thursday, as part of a behavioral science experiment for a new National Geographic TV series. [via Quartz]
As expected, some pedestrians ignored the chalk markings designating a no-cellphones lane and a lane that warned pedestrians to walk at their own risk. Others didn't even see them because they were too busy staring at their phones. But others stopped, took pictures and posted them (from their phones, of course).
Real-time functional MRI neurofeedback: a tool for psychiatry.
Curr Opin Psychiatry. 2014 Jul 14;
Authors: Kim S, Birbaumer N
Abstract. PURPOSE OF REVIEW: The aim of this review is to provide a critical overview of recent research in the field of neuroscientific and clinical application of real-time functional MRI neurofeedback (rtfMRI-nf).
RECENT FINDINGS: RtfMRI-nf allows self-regulating activity in circumscribed brain areas and brain systems. Furthermore, the learned regulation of brain activity has an influence on specific behaviors organized by the regulated brain regions. Patients with mental disorders show abnormal activity in certain regions, and simultaneous control of these regions using rtfMRI-nf may affect the symptoms of related behavioral disorders. SUMMARY: The promising results in clinical application indicate that rtfMRI-nf and other metabolic neurofeedback, such as near-infrared spectroscopy, might become a potential therapeutic tool. Further research is still required to examine whether rtfMRI-nf is a useful tool for psychiatry because there is still lack of knowledge about the neural function of certain brain systems and about neuronal markers for specific mental illnesses.
As much as I dislike being a patient, I have to admit it’s a good experience for a health care professional to go through. To be on the receiving end of the healthcare system not only helps me develop compassion for patients and families, but it also gives me a clearer vision of what we’re doing right and what we need to work on when delivering care. And it’s also given me some insight into patient satisfaction vs. patient-centered care.
A month or so ago I wrote a post on patient satisfaction scores and how health care providers are focusing on improving those scores, sometimes to the dismay of clinicians. I now have the opportunity to experience the health care system as a patient and be a satisfied or a dissatisfied consumer (or somewhere in between).
Recently, I paid a visit to a dietitian and a diabetes nurse educator because of a diagnosis of gestational diabetes. I was told to call the hospital’s central scheduling department to make the appointments. When I did, I was given two different dates: the first with the diabetes educator and the second, two weeks later, with the dietitian. I was given a long explanation about how the hospital was under construction and the bridge to the main hospital was closed, which is why I needed to enter through the medical office building entrance. I was also advised to come 45 minutes early.
Then over the next few days I received follow-up calls from the scheduling department regarding the two appointments. When I returned the calls, a different employee took my call and wasn’t sure what the follow-up was regarding. I had assumed it was just to verify the times two weeks apart.
When I showed up for my appointment, I followed the instructions, then realized I was never given a suite number for the visit. I did my best to guess which one it was, but they were all closed. Why did I need to be there 45 minutes in advance? Perhaps I needed to stop in the main hospital’s registration department first; after all, they did mention something about the bridge being closed. So I got in my car and drove over to registration, where I sat for 15 minutes before they called me over to discover that I had been in the right spot in the first place and should go back. As I readied to get back into my car, I asked what the suite number was. The registration employee had no idea and had to call the educator’s office to find out.
Once I settled in for my appointment, the educator asked about the instructions I was given and I told her I wasn’t given a suite number and was told to come 45 minutes early. She was none too pleased to hear that. Then she informed me that my appointment with the dietitian was immediately following my appointment with her. So that’s what the scheduling department was calling about! I told her that was great but I had no idea. She also told me that the employees in central scheduling could not see the same scheduling screen as the folks in the diabetes center and often did not know that earlier appointments were available.
Was this a satisfying experience? Not really. But the clinical care from the educator and dietitian was very good. They really listened when I explained how I felt about being there (annoyed at the diagnosis) and worked to try to come up with an eating and glucose testing schedule that fit my night shift job. After seeing them I knew what to do, when to do it, and who to call if something was off. With their focus on my needs, they were providing patient-centered care.
If asked if I was satisfied with my experience how would I respond? The scheduling process could have used some work but the clinical care was quite good.
This experience got me thinking about patient satisfaction and patient-centered care and how there is a distinct difference between the two. The writers of a July 2012 JAMA article, Patient Satisfaction and Patient-Centered Care: Necessary but Not Equal, share my opinion.
Patient-centered care is defined by the Institute of Medicine in its 2001 report Crossing the Quality Chasm as “providing care that is respectful of and responsive to individual patient preferences, needs, and values, and ensuring that patient values guide all clinical decisions,” whereas, the JAMA authors explain, patient satisfaction is based on consumer marketing and measures the quality of the service against the consumer’s expectations.
As pointed out in the 2011 article The Values and Value of Patient-Centered Care in the Annals of Family Medicine, what some hospitals have been calling patient-centered care is “superficial and unconvincing.” While things like greeters and hotel-like decor “might enhance the patient’s experience, they do not necessarily achieve the goals of patient-centered care.”
When looking at my experience, the scheduling department did not meet my expectations — to give the appropriate information like suite number and arrival time. Yes, I was not satisfied with the scheduling experience but not getting the correct information did not harm me in any way. However, if the diabetes center had not taken my unique situation into account and given me a health regime that fit my night shift schedule, then my health outcomes could have been compromised. Had I gotten the correct scheduling information but not appropriate clinical care, in theory I could have been happy but not as healthy.
In an opinion piece in Circulation: Cardiovascular Quality and Outcomes, the author writes that satisfied patients can still have poor clinical results, though more research on the correlation between patient satisfaction, patient-centered care and patient outcomes needs to be done.
I agree and would also argue that when health care providers focus on patient-centered care — because patients feel heard and valued with this type of care — high satisfaction scores will follow. Plus, there will be better patient outcomes when patients are participating in their care. If you had to pick one area to focus on, patient satisfaction vs. patient-centered care, I’d focus on patient-centered care.
Two resources to help providers achieve patient-centered care are The Journal of General Internal Medicine‘s A 2020 Vision of Patient-Centered Primary Care and Planetree’s Patient-Centered Care Improvement Guide. Both help providers assess their implementation of patient-centered care and give pointers on how to improve care delivery.
Perhaps it’s because I have always worked in the clinical setting that I believe good clinical care can trump, or at least balance, parts of an experience that are less satisfying. Healthy patients equal happy patients and I feel they, like me, would be more willing to compartmentalize different aspects of the care experience. Because they are treated like individuals and listened to by their clinicians, they’ll be less likely to give an overall poor satisfaction score if something, like scheduling, goes amiss. And let’s not forget that despite how health care has changed over the years, good health outcomes are really what it is all about.
We’re just now starting down the road of the EHR replacement cycle. Meaningful use has driven many to adopt an EHR too quickly, and now the buyer’s remorse is setting in and we’re going to see a wave of EHR replacements. Some organizations are going to wait until meaningful use runs its course, but many won’t even be able to wait.
With this prediction in mind, I was interested by this Allscripts whitepaper: Key Hidden Reasons Your EHR Is Not Sustainable and What To Do About It. I always learn a lot about a company when I read whitepapers like this one. It says a lot about the way the company thinks and where they’re taking their company.
For example, in the whitepaper, Allscripts provides a list of questions to consider when looking to replace your EHR:
You can see that these questions share a certain view of where healthcare IT and EHRs are headed. Imagine how these criteria would compare with the criteria for EHR selection even five years ago. Although, I wonder how many doctors really share this type of approach to EHR selection. Do doctors really want their EHR to handle the above list? Should they be worrying about the above items?
I don’t doubt that doctors are going to be more involved in population health and they’re going to need to engage patients more. However, this list does seem to lack some of the practical realities that doctors still need from their EHR. In fact, as I write this, I wonder if it’s still too early to know what a next generation EHR will need to include. Of course, that won’t stop frustrated EHR users from replacing their EHR just the same.
I believe that there is inadequate strategic planning taking place in pathology and by pathologists. It's true that many of the individual pathology societies support some strategic planning activities, but there are few published articles addressing the challenges facing our specialty. Here are what I consider the four most important challenges facing the field from a strategic perspective: (1) cancer oncology; (2) digital pathology; (3) the future of the LIS; (4) increased lab complexity with reduced lab budgets.
By way of contrast, radiologists have been addressing strategic planning in their field head-on for many years. For example, the International Society for Strategic Studies in Radiology (ISSSR) held its ninth biennial meeting in August 2011. Here is the citation for the published proceedings of that meeting: Eur Radiol (2012) 22:2283-2294. This article is a revealing compendium about what the future holds for radiology. A number of topics are addressed in the proceedings, including molecular imaging, diagnostic algorithms, information technology in radiology, integrated diagnostics, and health information exchanges. Here's a passage from the article describing population imaging:
Population imaging employs computational radiology techniques such as unstructured and structured data mining, image segmentation, and statistical modeling to map and summarise imaging features from large image databases and thus extract meaningful imaging biomarkers. The biomarkers may be anatomic structures, disease manifestations, tumour characteristics, or haemodynamic abnormalities. The summation of one or more imaging features, or biomarkers, from a global data set can be considered a phenotypic “population image” representing a particular disease or health state. In clinical care or clinical trials, population images may be used as a reference to classify individuals or patient groups into diagnostic categories. Radiologists can play a key leadership role in providing the needed intuition to productively integrate the computational information from population images with personal medical information....These imaging biomarkers may facilitate prediction of future disease onset, the development and implementation of preventive measures, and even the development of pre-clinical diagnostics....Furthermore, because of the statistical power provided by large sample sizes, population studies using global databases could potentially replace individual prospective studies as a means of validating new biomarkers, saving both time and money.
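The classification use described above, in which population images serve as a reference for placing an individual or patient group into a diagnostic category, amounts in its simplest form to comparing an individual's imaging biomarker against a population distribution. Here is a toy sketch in Python; the biomarker, the values, and the cutoff are all made up for illustration:

```python
# Toy illustration of population-referenced classification of an imaging
# biomarker. All numbers are hypothetical; real population imaging draws
# on large image databases and far more careful statistical modeling.
import statistics

# Hypothetical biomarker values (e.g., a regional volume in cm^3) mapped
# and summarised from a population image database:
population = [3.9, 4.1, 4.0, 4.3, 3.8, 4.2, 4.0, 4.1, 3.7, 4.2]

MEAN = statistics.mean(population)
SD = statistics.stdev(population)

def classify(value, cutoff=-2.0):
    """Place an individual's biomarker value relative to the population:
    flag it as atypical if its z-score falls below the cutoff."""
    z = (value - MEAN) / SD
    label = "atypical" if z < cutoff else "within reference range"
    return label, round(z, 2)

print(classify(3.5))  # an individual well below the population mean
```

The same z-score idea, scaled up to many biomarkers and validated cutoffs, is what lets a "population image" act as a diagnostic reference.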
Here's a more succinct definition of population imaging (Population Imaging):
The ultimate aim of the European Population Imaging Infrastructure is to help the development and implementation of strategies to prevent or effectively treat disease. It supports imaging in large, prospective epidemiological studies on the population level. Image specific markers of pre-symptomatic diseases can be used to investigate causes of pathological alterations and for the early identification of people at risk.
I certainly applaud the strategic planning efforts of our colleagues in radiology, and this focus on population imaging has tremendous merit. However, the apparent absence of clinical and anatomic pathology data in this project is a significant problem.
The following is a guest blog post by Trevor James.
If you work in the health/dental/medical space, you already know that HIPAA violations are a serious matter. Fines for not complying with HIPAA laws and regulations now range from $100 to $50,000 per violation or record, up to a maximum of $1.5 million per year for violations of the same provision. Some violations also carry criminal charges, resulting in jail time for the violators.
Many dental offices are breaching HIPAA rules without realizing it, or have employees doing so without their knowledge.
If you’re a dentist, office manager, or someone who’s been tasked with ensuring HIPAA security within your group, here are the 10 most common ways dental offices are breaching HIPAA regulations so your practice doesn’t make the same mistakes as others.
1. Devices with patient information being stolen
This is a common HIPAA violation for dental offices. It’s important to ensure the devices your dental office uses, like USB flash drives, mobile devices and laptops, are carefully handled and securely stored to prevent them and the patient information on them from being stolen.
2. Losing a device with patient information
Along the same lines as above, it’s also easy (and common) for an employee to lose those kinds of devices. USB flash drives and mobile devices are smaller items, so it’s easy to misplace them. When that happens, it’s easy for sensitive patient information to end up in the wrong hands.
Train your employees on the importance of properly handling these devices, and set up a tracking service, like Find My iPhone or Where’s My Droid, to help you locate a device if it ends up lost.
3. Improperly disposing of papers and devices with patient information
When it comes time to get rid of papers or devices containing dental records or billing information, be sure you properly dispose of them. Crumpling paper in a ball and throwing it in the trash isn’t the correct way to do things nor is shutting down a device and then tossing it in the garbage. Use a paper shredder and wipe your devices clean of all information before disposing of them.
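For offices that retire computers in-house, the "wipe before disposal" idea can be sketched in a few lines of code. The function below (my own illustrative example, not an endorsed tool) overwrites a file's contents with random bytes before deleting it; note that on solid-state drives and journaling filesystems copies may still survive, so certified wiping software or physical destruction remains the safer route for real patient data:

```python
import os
import secrets

def shred_file(path: str, passes: int = 3) -> None:
    """Overwrite a file with random bytes several times, then delete it.

    Illustrative only: real-world media disposal should use certified
    wiping tools or physical destruction of the drive.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # replace contents with noise
            f.flush()
            os.fsync(f.fileno())  # force the overwrite to disk
    os.remove(path)
```

The point of the sketch is simply that "delete" alone leaves the bytes recoverable; overwriting first is the minimum bar before a device leaves your control.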
4. Not restricting access to patient information
Unauthorized access to a patient’s dental information will get you in serious trouble under HIPAA. Patients trust your office with this personal information, so handle it carefully so that other patients, employees, and relatives who aren’t allowed access don’t come across it.
One dental practice breached HIPAA this way when it put a red sticker reading “AIDS” on the outside cover of patient folders; people with no need to know could read the label while employees handled the folders. Don’t make simple, costly mistakes like that.
5. Hacking/IT incidents
Most patient dental information is now stored on computers, laptops, mobile devices, and in the cloud. Today’s technology allows dental practices to communicate more easily and to look up and share patient information and status on these devices.
The downside of this technology is that people just as smart as (or smarter than) your safeguards can hack into your devices or systems to get their hands on patient information. Make sure every device requires a passcode or other authentication, encrypt stored data, and enable personal firewalls and security software.
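One small, concrete step is to generate device passcodes rather than reuse familiar ones. A minimal sketch (the function name is my own) using Python's standard secrets module, which is designed for security-sensitive randomness, unlike the predictable random module:

```python
import secrets
import string

def make_passcode(length: int = 16) -> str:
    """Generate a random alphanumeric passcode.

    Uses the cryptographically secure `secrets` module; each call
    produces an independent, hard-to-guess value.
    """
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Example usage (output is random each run):
# print(make_passcode())
```

A passcode picked this way resists the guessing attacks that defeat birthdays, pet names, and "1234"-style codes.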
6. Sending sensitive patient information over email
While it’s not a violation to send these kinds of emails, it is a violation if the email is intercepted and/or read by someone without authorized access. Use encryption, and double check that whoever you’re sending the email to is supposed to receive it.
7. Leaving too much patient information over a phone message
A patient may give you the okay to call them, but be sure you don’t leave a message disclosing too much of their information. A friend or family member could check your patient’s messages and hear things they shouldn’t, upsetting the patient. Equally bad, you could call the wrong number and say more than you should, which would probably upset your patient even more. Your safest bet, when you call a patient and they don’t answer, is to leave a message asking them to call you back.
8. Not having a “Right to Revoke” clause
When your dental office creates its HIPAA forms, you have to give your patients the right to revoke the permissions they’ve given to disclose their private dental information to certain parties. Not including this clause means your HIPAA forms are invalid, and releasing subsequent information to another party puts you in breach of HIPAA.
9. Employees sharing stories about patient cases
People talk. It’s a simple fact. Employees talk with one another, and they also talk to patients every workday. Remind them, though, that discussing a patient’s information with an employee who lacks authorized access, or with other patients, is unprofessional and puts your whole practice at risk of HIPAA fines.
10. Employees snooping through files
It might seem shocking — or maybe not to some — but employees have been caught snooping through patient and co-worker files before. They do this to find out information for themselves, but also because relatives or friends ask them to dig up details about a certain person. Snooping is wrong and unprofessional on all levels.
Make sure your employees are clear on this and that they understand how bad the consequences can be for them and your office for doing so.
HIPAA violations in dental offices are all too common. Now that you know the top 10 ways dental offices are breaching HIPAA, you can take every precaution necessary to prevent your practice from violating any HIPAA laws and regulations.
Developing and launching a competitive product, and getting initial traction in the market are not inconsiderable milestones. And yet for the entrepreneur and their investors, this is just the beginning. What was record setting last quarter is barely acceptable this quarter, and next quarter had better be back on track.
Developing a solid plan for growth depends on two things: a good understanding of the basic means to drive growth, and a deep understanding of the market. This post seeks to combine both of these in a brief survey of the key factors to drive messaging middleware revenue growth in health care. We’re going to consider three basic growth strategies: organic growth, product line extension, and the roll-up strategy.
For start ups, organic growth can be realized first by targeting a market segment that has broad appeal and large numbers of early and late adopters. Going back to Moore’s market adoption model, it’s relatively easy to identify a market need and generate initial sales to innovators and early adopters. These early buyers want technology and performance, something new the buyer can leverage to gain a competitive advantage of their own.
These early buyers tend to be large institutions with a corporate culture of innovation and the internal resources to support such endeavors. Accounts like the Cleveland Clinic, Mayo Clinic, and Partners Healthcare come immediately to mind. Kaiser Permanente would also fall into this group, except they are held back by their need for solutions that can scale to considerable extremes, a requirement not applied to these other health care provider titans. There is even a cadre of smaller, nimble early buyers: Overlake Hospital in Bellevue, Washington, and El Camino in Mountain View spring to mind. Spend enough time in this industry and the early buyers tend to make themselves known. The problem is that this population of early buyers is quite limited; early buyers will only take a company so far.
Once most of these early buyers in a market segment have bought, the market adoption chasm arises because the next group of buyers to adopt – the much larger early majority – don’t want technology and performance; they want complete, proven, easy-to-adopt solutions. This shift gives rise to the conventional wisdom that, “hospitals want solutions to problems, not tools they can use to solve their own problems.” For vendors, the importance of this is self evident when considering how to maintain or even increase their growth rate over time. For providers, it’s important to recognize which side of the chasm your organization is operating from and to proceed accordingly.
To cross the chasm, vendors must add to the original innovative technology the required features and services to create a whole product solution that is laser focused on a recognizable problem. Figuring out exactly what it is that’s required to transform an innovation worthy of inspiring early buyers into the safe and reliable solution required by the early majority is a challenge. Recognizing the gaps and knowing how best to fill them is not easy, although there are processes that can be used to identify those requirements and confirm that they’re met.
Moore calls the process of creating and going to market with the whole product solution “the bowling alley.” The bowling alley lets you shift your growth from the early market, which may be nearing full penetration, to the much larger early majority portion of the market. Crossing the chasm is an essential objective for new companies. In a crowded market like messaging middleware, numerous companies will be struggling with crossing the chasm.
Achieving strong organic growth is an excellent indicator that, beyond a solid whole product solution, sales and marketing are also top notch. Sales and marketing are especially important because health care is not a “field of dreams” market, where “if you build it, they will come.” Brand awareness, demand generation and market education are key marketing tasks. Sales requires effective sales tools and proofs in support of a sales strategy or process that leads first-time buyers to the right decision in an efficient and reliable manner.
A main characteristic of the messaging middleware market is the variety of different problems that can be solved by the same basic technology. These different problems are reflected as market segments. Each of the different market segments listed in the previous blog post can potentially support a start up, or represents a potential product line extension. Moore frames these other market segments as additional bowling alleys that leverage the same foundation of product and services that make up the original whole product solution. Some product line extensions may require changes to the whole product solution to gain early majority market adoption.
Much like selecting the initial target market for a start up, the key is to identify new bowling alleys with sufficient market demand (of course, competition is also a factor). Synergy with preexisting whole product solutions is also desired. It’s also helpful if the new bowling alleys under consideration target the same markets (e.g., physician practices or hospitals) so that existing sales and marketing resources can be easily leveraged to take advantage of cross-sell and up-sell opportunities that emerge. If different bowling alleys target different markets – say, physician practices for one and hospitals for another – each target market will require major investments in marketing and sales; potential synergies from targeting a common market are lost.
Sometimes a product line extension includes product changes that add substantive new features to the platform. For example, a secure messaging solution that is designed to support a single enterprise might add the capability to support users across multiple enterprises, or the addition of a scheduling module to support a more complete secure messaging solution for on-call physicians.
A roll-up strategy entails a series of acquisitions used to construct a bigger company made up of complementary products or solutions. A relevant example of this strategy can be found in Amcom Software. After their merger with Xtend Communications, Amcom came to dominate the hospital operator console market (due to their HL7 integration capability) and related telephony applications. Subsequent acquisitions extended Amcom’s reach with various communications solutions for health care, government and other vertical markets.
Amcom Software was acquired by USA Mobility in 2011 for $176.8 million. The combined company is now called Spok (pronounced spoke with a long “o”). Starting with the merger with Xtend, the Amcom Software strategy was to build a company through acquisitions and then sell the company. With 2010 revenue of $60 million (roughly a 3x revenue multiple on the sale price), things appear to have worked out well for Amcom’s investors.
Because of the nature of this market, a roll-up strategy can be challenging. Unlike the product line extension strategy, where a company’s existing technology is reconfigured or enhanced to target new market segments, the roll-up strategy entails the acquisition of other companies. How those acquired products, employees and customers are optimized is the challenge.
Mergers and acquisitions occur frequently in the health care industry. The goals of these transactions include:
The first two bullets are obviously related; however, the degree to which and the ways in which they’re related depend on the specific companies and their business models. A company that goes to market selling mostly capital goods (hardware and licensed software) is quite different from a company selling their solution as a cloud-based service.
As discussed in a previous post, most messaging middleware solutions are built using a similar architecture that is often made up of software engines. These engines can be licensed from commercial vendors or from open source projects. The resulting solutions can be built relatively quickly and for modest sums. Consequently, the value in purchasing a messaging middleware vendor for their technology may be limited.
Creating interfaces between multiple messaging middleware acquisitions can be problematic. To date, messaging middleware systems have been designed to operate alone; manufacturers do not intend for their messaging middleware system to be one of a constellation of messaging solutions serving the same user base. Some manufacturers have added to existing designs by implementing APIs and other integration points to facilitate the incorporation of other messaging middleware apps – often to fill feature gaps demanded by prospective buyers. Implementing multiple messaging middleware solutions via acquisitions raises questions about message routing, escalation, and the existence of more than one rules engine impacting message flow. A system of systems made up of messaging middleware solutions gets very complicated very quickly, increasing the complexity of configuration and of verification and validation testing.
An acquiring company with older software technology may see value in the acquired software platform, or in the intellectual property and expertise behind the development of that software. Further, the acquired company may have software capabilities that are extensions to messaging middleware solutions – such as the staff scheduling for on-call physician messaging example used earlier.
The acquisition of mVisum by Vocera is worth a closer look. It should be noted that Vocera does not appear to be executing a classic roll-up strategy, but the rationale that may have driven this acquisition is of interest. mVisum was a start up with an attractive messaging middleware product. Unlike many other messaging middleware solutions, mVisum was FDA cleared for alarm notification, conveyed snippets of medical device waveforms with medical device alarms (important for screening non-actionable false-positive alarms), and also included remote medical device surveillance features. The company subsequently ran into some patent infringement issues with AirStrip Technologies. mVisum was acquired by Vocera for $3.5 million, less than a year later.
There is considerable overlap between Vocera and mVisum solutions. Potential areas of value for Vocera include mVisum’s FDA clearance for alarm notification, one of the strongest messaging middleware market segments. mVisum also filed a number of patent applications that may be of value to Vocera. Vocera was founded in 2000, so there may be some value in mVisum’s software architecture – if not the actual software, then the requirements and design may be leveraged in future versions of Vocera’s software.
To summarize the roll-up strategy applied to messaging middleware, there is likely not a lot of value in acquiring other messaging middleware companies when compared to the product line extension strategy. The main reason is that most software architectures will be similar. There are exceptions to this, some of which are alluded to in the Vocera/mVisum discussion above. Because the messaging middleware market is relatively undeveloped – we’re far short of a penetrated market – there’s little opportunity to buy cash flow or market share through acquisitions. Nor is the market so developed that human resources are a likely justification for acquisition.
The roll-up strategy does make more sense when one looks beyond messaging middleware. Just as Amcom Software took a broader view of vertical market messaging and communications solutions that included messaging middleware as a portion of the whole, one could frame a roll-up strategy from a similar, higher level. For example, a roll-up targeting health care could encompass point of care solutions, rolling-up messaging middleware with nurse call, medical device data systems (MDDS), data aggregation and patient flow with enabling technologies like real time location systems (RTLS) and unified communications (enterprise phone systems). The resulting entity could define a new enterprise software category: point of care workflow automation.
Another practical application of the roll-up strategy is the secure messaging market targeting physicians. There is little apparent differentiation between solutions, and vendors with good adoption in a particular geographic market will be difficult to dislodge. Here a classic roll-up, where the acquiring company offers economies of scale superior to those of regional players, has a lot of potential. Such a strategy would be complex to implement, due to the technical product integration issues noted above. Provided they could dedicate sufficient cash flow, this could be an attractive strategy for Spok, although any company with access to several tens of millions could pull this off.
With 100+ competitors, the messaging middleware market is remarkably crowded. Over time, many of these firms will fade away as they fail to gain initial market traction, cross the chasm, or get acquired. There will certainly be mergers and acquisitions. There will be some who plan and execute well, and grow their companies to tens and hundreds of millions in annual revenue. Some degree of luck will be a factor. But regardless of the strategy or outcome, the imperative shared by them all will be the drive for growth.
You can find a post Messaging Middleware Defined here and the post on Messaging Middleware Market Segmentation & Adoption here. In the coming week a post on HIPAA will be published. Be sure to check back!
— Wen Dombrowski MD (@HealthcareWen) July 22, 2014
I agree with Wen that the EMR and claims data need to be cleaned up. I think it sends the wrong message to say it’s not meaningful, though. Once it’s cleaned up, it has a lot of value.
— Jobs in Washington (@W4_Jobs_in_DC) July 28, 2014
How many of you have applied for a job because you saw it posted on Twitter? I’m really interested in this since I do a lot of health IT job posts on Twitter. We see quite a bit of traffic from Twitter to our healthcare IT job board, but I haven’t added a good way to track who signs up and applies for jobs. That’s next.
— Practice Fusion (@PracticeFusion) July 27, 2014
I love how academic Practice Fusion tries to make the discussion. I thought my take on the EMR vs EHR discussion was much simpler.
One of the major goals of Big Pharma is to enlarge its customer base, which is to say, sell more product. One way to accomplish this is through the medicalization of "conditions" that previously have not been viewed as diseases. One example of such a condition is obesity. This medicalization process has also been referred to as "disease mongering" (see: Disease Mongering (i.e., Medicalization) by Pharmaceutical Companies; Medical Device Mongering, a Variant of Disease Mongering). The reason that Big Pharma spends huge amounts of money each year on direct-to-consumer (DTC) advertisements on TV is to circumvent physicians by creating demand for prescription drugs among consumers (see: Effectiveness of "Direct-to-Consumer" Drug Advertisements). Although consumers can't write prescriptions, they can certainly request a particular drug from their physician.
Given all of this, it should not be surprising that the pharmaceutical industry is teaming up with Silicon Valley companies to develop wearable IT devices to monitor health. The details of a recent Google/Novartis deal were discussed in a recent article in the Financial Times (see: Big pharma teams up with big data). Below is an excerpt from it:
Big pharma and Silicon Valley have been circling one another for some time, looking for ways in which they might harness the power of data technology to medical ends. Now a fusion of West Coast entrepreneurship and lab-coated medical expertise has spawned its first big publicly announced transaction. Google has struck a deal with Novartis to develop a “smart” contact lens designed to help diabetics track their blood sugar levels. The lens works by analysing the level of glucose in a wearer’s tear fluid and communicating the data to a mobile device. It replaces the need for diabetics to test their own blood sugar several times a day....But this is only one of the reasons to applaud the marriage of pharma and big data, and the emergence of such “wearable” medical devices. The increasing incidence of chronic diseases and an ageing population has created the need for real-time health monitoring. At a time of stretched healthcare budgets, having a device that tracks the state of the wearer’s health can help to give physicians better early intelligence of problems, reducing the need for costly interventions and long hospital stays later on. Monitoring is, moreover, only part of the story. Wearable technology may also have a role to play in treating conditions. For instance, Novartis is also looking at using Google’s technology to produce an “autofocusing lens”....Another area of investigation is into so-called “electroceuticals”. These are implants that use electronic impulses to affect and modify the functions of the body.
I have no problem with wearable devices for monitoring health status. I think that this is both inevitable and useful for increasing health awareness among the general population, if not for improving health outright. I am also enthusiastic about the notion of having healthcare consumers take more ownership of their own health status. Early autodiagnosis and ongoing health monitoring are certainly one way to reduce the cost of healthcare by avoiding the expense of nagging chronic diseases. Why then would I be concerned about pharmaceutical companies getting involved in the development of such devices? My greatest fear is that the companies will "fudge" these devices such that the diagnosis of various diseases and the need for treatment will be overstated. This would be an extension of the medicalization discussed in the first paragraph. One example of such chicanery was a rigged depression survey that Eli Lilly posted on the web. The company manufactures the anti-depressant Cymbalta. Regardless of how one answered the survey, the conclusion was that the subject may be depressed (see: Rigged Depression Survey on the Web Steers Readers to Lilly's Cymbalta).