Nuance announces that it has acquired image sharing vendor Accelarad and will immediately begin marketing a cloud-based document and image sharing platform called the Nuance PowerShare Network. Financial details were not disclosed.
Athenahealth reports Q1 earnings: revenue was up 30 percent to $163 million but missed analyst estimates of $170 million, while adjusted EPS of $0.12 fell short of the $0.38 expected.
In Australia, Epic wins a $48 million deal at Melbourne’s Royal Children’s Hospital, concluding a vendor search that reportedly included all major US vendors as well as representation from local Australian vendors.
San Francisco, Calif.-based One Medical Group, a startup building technology-laden primary care offices across the nation, raises a $40 million investment round to continue its expansion.
From Worth HIT: “Re: tradeshow blooper. At HIMSS Middle East, 3M’s booth described a new service offering, ‘Coding and Groping Quality.’ Got to love the high-tech fix … white tape.” The sign is full of inconsistencies: “groping” and “intelligence” are the only words not capitalized, “ICD-10” also appears as “ICD10,” some random commas found their way onto the page, and some lines end with periods while others don’t. You’re gonna need a bigger roll of tape.
From Pure Power: “Re: your 2009 thoughts about EHR data. Worth looking at again.” Well, here you go then, as I was referring five years ago to a research study about using EHR data in nephrology:
I don’t have access to the full text of the article, but I truly believe that once the pain of getting EMRs running as data collection appliances is over (meaning we’ve got data collection clerks known as doctors and nurses in place, which is the “pain” part), the benefit will be incredible. This article apparently deals with having nephrologists automatically consulted when the EHR finds problems. There are other benefits. You could do society-improving medical research by just slicing and dicing data from millions of patients, at least the parts of it that aren’t just clinical-sounding billing events that are useless or even misleading. You could find candidates for research trials. Patients could be followed over many years, even as they move around and use the services of a variety of providers. And for individual patients, there could be great value in putting research findings into the hands of front line doctors. Not to mention giving patients a platform whereby they can participate in their own care and add non-episodic information related to lifestyle, personal health assessment, etc. Clinical systems will not save time, as clinicians know – they exist to create data whose value mostly accrues to someone else. My advice to providers: much of your future income may be based on the data you create and the ownership in it you retain. Don’t be like the Native Americans and let greedy outsiders buy your land for trinkets.
HIStalk Announcements and Requests
A few highlights from HIStalk Practice this week include: US physicians produced $1.6 trillion in direct and indirect economic activity in 2012. Steven Posnack creates a fun proof-of-concept graph that matches Medicare payment data with MU incentive payments. Boston doctors prescribe bike riding. AAFP’s president points out the disparity in compensation between family practice physicians and specialists, as evidenced by the recent release of Medicare payment data. CMS offers guidance on the Attestation Batch Upload option. A urology practice employee sends details on 1,114 patients to a competing practice to help the competitor solicit business. Thanks for reading.
This week on HIStalk Connect: Nuance acquires image-sharing vendor Accelarad, which will power a new cloud-based image and report exchange platform that integrates with its existing transcription product lines. In England, the NHS kicks off a campaign to use telehealth and mHealth apps to reduce ED visits. The Mayo Clinic is funding a medical research assistant app designed to help consumers responsibly look up their symptoms and conditions. Dr. Travis recounts past mistakes the health IT industry has made with EHR data exchange and questions whether the same mistakes are being made with newer payment and care delivery models.
May 1 (Thursday) 1:00 p.m. ET. Think Beyond EDW: Using Your Data to Transform, Part 2 – Build-Measure-Learn to Get Value from Healthcare Data. Sponsored by Premier. Presenters: Alejandro Reti, MD, senior director of population health, Premier; and Alex Easton, senior director of enterprise solutions, Premier. Once you deploy an enterprise data warehouse, you need to arrive at value as quickly as possible. Learn ways to be operationally and technically agile with integrated data, including strategies for improving population health.
Acquisitions, Funding, Business, and Stock
Athenahealth announces Q1 results: revenue up 30 percent, adjusted EPS $0.12 vs. $0.38, missing analyst estimates for both.
Liaison Technologies raises $15 million in funding.
HCA subsidiary Health Insight Capital makes an equity investment in Intelligent InSites.
One Medical Group, a 27-location practice that heavily promotes its use of healthcare IT in providing care, raises $40 million in growth capital, bringing its total to $117 million.
Great Point Partners makes a “significant investment” in Orange Health Solutions to finance the acquisition of MZI Healthcare, developers of EZ-Cap and other technologies for ACOs and IPAs.
CareCloud reports that it added 170 clients in Q1.
Australia’s Royal Children’s Hospital in Melbourne awards Epic a $48 million contract.
Sisters of Charity of Leavenworth Health System (CO) selects Allscripts EPSi as its financial decision support system.
University Health System (TX) will deploy PeraHealth’s PeraTrend real-time patient status system, which calculates a patient acuity score called the Rothman Index.
Crain’s Cleveland Business names Cleveland Clinic CIO Martin Harris, MD as its CIO of the year.
Healthcare data analytics firm GNS Healthcare hires Mark Pottle (N-of-One/Optum Insight) as CFO.
Aventura names Bill Bakken (Nordic Consulting) COO.
NaviNet promotes Sean Bridges to CFO, Sridhar Natarajan to VP of software development, and Thomas Smolinsky to VP/CISO.
Announcements and Implementations
Steward Health Care System launches the StewardCONNECT patient portal based on Get Real Health’s InstantPHR patient engagement platform.
Park Nicollet Health Services (MN) will implement StrataJazz from Strata Decision Technology for cost accounting, contract modeling, long-range financial planning, and rolling forecasting.
The Patient-Centered Outcomes Research Institute (PCORI) provides an update on its $100 million initiative to develop the National Patient-Centered Clinical Research Network that was originally announced in December. PCORI’s executive director Joe Selby, MD outlines details on governance, data security, privacy, and interoperability as participants work to build a database of 26 to 30 million EHR records in support of retrospective clinical research.
The 25-bed Dan C. Trigg Memorial Hospital (NM), which is owned by Presbyterian Healthcare Services, implements Epic.
The Whitman-Walker Clinic (DC) is implementing Forward Health Group’s PopulationManager and The Guideline Advantage.
Cincinnati’s fire department rolls out Tempus Pro, a real-time vital signs monitoring system developed for battlefield use that allows hospital-based physicians to monitor patients being transported by ambulance.
Mayo Clinic and startup Better announce a $50 per month membership-based app that includes a symptom checker, health information, and access to a personal health assistant.
Government and Politics
The HHS Office of Inspector General warns that some state Medicaid agencies may be putting patient health information at risk by outsourcing administrative functions offshore.
Innovation and Research
A VA survey of 18,000 randomly chosen users of its My HealtheVet system finds that a third of them use Blue Button, with three-quarters of those saying its main value is collecting their information in one place. Barriers to adoption were identified as low awareness and usability issues.
HIMSS Analytics says that healthcare IT systems with the highest growth potential are bed management, ERP, and financial modeling.
TechCrunch profiles One Medical Group, which has raised $117 million (the latest funding announcement is above) in funding to create a new kind of technology-powered medical practice, with its custom-developed EHR and portal offering appointment scheduling, refills, lab results, and access to a patient’s records from any of its 27 locations. Patients pay $149 per year for access and can use their health insurance.
It’s not exactly health IT related, but appalling: Yahoo fires its COO of only 15 months after he fails to improve the company’s advertising revenue. He didn’t get a bonus because he didn’t make his numbers, but he still walked out with a severance check of $58 million.
The Bloomberg School of Public Health at Johns Hopkins University tweets that it has exceeded 1 million enrollments in its free Coursera courses. Starting soon: Community Change in Public Health, Mathematical Biostatistics Boot Camp 2, The Data Scientist’s Toolbox, Getting and Cleaning Data, Exploratory Data Analysis, and The Science of Safety in Healthcare.
BIDMC CIO John Halamka, MD offers common sense HIPAA-related tips to hospitals using patient data for fundraising:
Interesting facts from an article on clinicians who use social media in the OR:
UNC Healthcare (NC) reduces patient volumes as it adjusts to its April 4 Epic go-live.
A seventh grader undergoing cancer treatment “attends” classes in his school more than 1,100 miles away from Children’s Hospital of Philadelphia by using VGo, an audiovisual-equipped robot he can steer down the school hallway and into classrooms as he says hello to classmates. The same VGo robot is used by hospitals for patient monitoring and telemedicine.
Weird News Andy calls this story “Doc on the Run.” An Arkansas gynecologist allegedly takes smartphone pictures of his patients without their consent while they are in the stirrups. Police investigating a patient’s complaint find her photos on the doctor’s phone, but don’t initially find him (and thus WNA’s headline). Since then, however, he has been arrested and charged with video voyeurism.
Highlights from the Atlanta iHT2 Health IT Summit
By Jennifer Dennard
This was my third year in a row attending the Health IT Summit in Atlanta. It continues to be a great experience.
The conference, hosted by the Institute for Health Technology Transformation (iHT2), was held at Georgia Tech’s Academy of Medicine. It was an intimate gathering of providers, government healthcare reps, and vendors, with a few lab and pharma folks thrown in for good measure.
The topics of discussion both on stage and during networking breaks have moved over the last two years from Meaningful Use and EMRs to accountable care and patient engagement. Providers are concerned with:
Mary Jane Neff, senior director of regional IS; Katheryn Markham, VP of IS planning; Lynda Anderson, senior director of regional IS, all of Kindred Healthcare.
Thea-Marie Pascal, certified Epic clinical documentation application coordinator; Susan Still, RN, Epic ASAP lead application coordinator; Makeba Lippitt, certified Epic clinical documentation application coordinator, all of Piedmont Healthcare.
The panel on "Transforming Health Care Through HIE: Driving Interoperability" featured (from left to right) moderator Kimberly Bell, executive director, Georgia Health Information Technology Extension Center at Morehouse School of Medicine; panelists Eddy Brown, VP of business development, TeraMedica; Steve Sarros, VP/CIO, Baptist Health Care; and Sonya Christian, CIO, West Georgia Health.
The keynote presentations were solid, though a high bar was set a few years ago by Naomi Fried, chief innovation officer at Boston Children’s Hospital (MA). My favorite session was the last one, with West Georgia Health’s CIO, CFO, and director of nursing all participating on the same panel, answering questions about workplace culture, Lean Six Sigma, and patient safety.
Ten companies exhibited, among them Merge Healthcare, TeraMedica, VMware, Information Management Consultants, and Jvion. Nicole Cirillo from LabCorp explained that, because Georgia is not a right-to-know state, LabCorp now offers its own portal through which patients can, with guidance from their physicians, access their lab results.
I had a run-in with one of our employed physicians yesterday. Some of these folks are really starting to wear me down. He’s been with us for a while, and unfortunately the EHR we purchased for our large multispecialty group many years ago does not have specific content for his specialty.
We knew this when we implemented him. We gave him the ability to use speech recognition to essentially dictate all of his office visit documentation except for orders, physical exam, and review of systems, which must be entered discretely. His staff enters other discrete data for patient history, allergies, etc.
Most of our other physicians (even those who do have content for their specialties) would kill for this arrangement. Still, it’s not enough for this guy, who demanded that I come to his office and personally shadow him to see how deficient the system is. I’m trying to win hearts and minds, so I agreed to go out. Rather than take the opportunity to show me how he sees patients and let me assess what his needs truly are, he preferred to spend the time we had standing in the hallway complaining about templates.
It turns out he has been using internal medicine templates to try to document his visits because he doesn’t like the dictation arrangements. He has the option to either dictate in the exam room with the patient present (many of our surgical consultants like this because it gives another opportunity for the patient and family to hear the diagnosis and plan of care one more time and ask questions), to release the patient to checkout and dictate in the exam room after the patient leaves, or to go to his administrative office to dictate. He has his own reasons why each of these is inadequate, but doesn’t have any suggestions for what he wants.
Of course, the internal medicine templates are completely overkill for what he’s trying to do. He has to weed through primary care clinical protocols and other information that’s not relevant to his specialty and feels frustrated. I reminded him that we didn’t train him to do this, that we recommended he use a specialty set that’s closer to his own instead, but he doesn’t like those either.
Most of our other specialists who don’t have content for their specialties are perfectly happy to dictate because it changed their workflow minimally from the paper world. Our primary care docs would love to be allowed to dictate as much as these guys can, but unfortunately for them, we need discrete data from more parts of the chart to meet payer incentive programs and other quality initiatives that we’re working on.
I’m not sure what he really wanted to get out of the visit other than to vent, which is fine, but it doesn’t change anything as far as documenting in the EHR. He wasn’t interested in any of the options I had to present and isn’t going to change his opinion. He doesn’t want a scribe. He doesn’t want to point and click. He doesn’t want to dictate. He doesn’t want a pen solution like Shareable Ink. His continued push-back (going on two years now) is an exercise in futility.
As I was driving back to my office, I got to thinking about that. This is a physician who deals regularly with patients who have life-altering injuries and conditions that cannot be fixed. His specialty is centered on helping people maximize the functionality they currently have and to compensate for what they have lost. He’s very good at what he does, yet he can’t see his EHR issues with the same perspective he uses when treating patients – helping them use what they have to the best of their abilities and not dwelling on what they don’t have or have never had.
We learn in medical school and residency to identify when interventions are futile. We call the code when there’s no hope of getting the patient back. We don’t perform surgeries when they’re not going to improve the patient’s condition. We understand that there are limits to technology and our ability to treat and cure. We’re pretty good at helping patients understand the options when they’re faced with a lack of good choices.
When it comes to limitations in information technology, however, we’re struggling mightily with the thought of applying those same concepts. The EHR of the future is going to look a lot different than what we have today – just like the laparoscopic surgeries we do now are completely different from the open surgeries we did in the past. Maybe in the future we’ll beam your gallbladder out of your abdomen instead of having to cut you at all. But for the time being, we have to work with what we have as best as we can. We have to realize there are limits to everything. There’s no psychic module for EHR that’s going to document directly from your thoughts, at least not for now.
Fighting is good when it’s appropriate, but at some point, we have to realize when it’s futile and either accept our current situation or move on. I’m not sure what else to do with or for this physician since we’ve not been able to make him happy as long as we’ve been trying. I suspect there are other factors at play that have nothing to do with EHR, but they’re not within my realm to tackle. We’ll keep reinforcing his options, pair him up with peers that are successful, and encourage him. Until he’s ready to leave the group or retire, I’m not sure what else we can do.
Well, I guess there’s one more thing we could do – pastry therapy. I just dropped a little surprise at his office to thank him for his time yesterday. A girl can hope.
Dear Accelarad customer,

You should have received an email from me on Monday of this week, when I provided our customers an early insight into the announcement that Accelarad is now a part of Nuance Communications. At this time, I wanted to provide you some additional information and invite you to learn more.
You can read the full press release here: (Nuance Unveils PowerShare – April 17, 2014). As discussed, this new union brings together our cloud-based medical image sharing technology and Nuance’s PowerScribe radiology reporting and communication platform. The partnership will give you, our valued customer, access to Nuance’s expansive healthcare technology and professional services, while continuing to provide you with the proven software and solid relationships you have come to expect from Accelarad. With this partnership, Accelarad’s SeeMyRadiology solution has been rebranded to align with the Nuance diagnostic brand and will be part of the Nuance PowerShare Network. To learn more about PowerShare | Image Sharing, sign up to join one of our webinars.
Most importantly, know that the products and people you have come to rely on will not change. Accelarad's leadership team and valued employees will be deeply involved in creating a smooth transition for our customers, and our focus remains on making sure you continue to receive the excellent service you deserve.
Thank you again for your support and confidence in us. We will keep you informed about any incremental changes along the way and are open to your feedback.
Willie Tillery, CEO, Accelarad
Rodney Hawkins, General Manager, Diagnostic Solutions, Nuance
Nuance PowerShare Network Unveiled for Cloud-Based Medical Imaging and Report Exchange

Definitely an interesting constellation of services! I wonder where this might lead. Coincidentally, Rodney is also an old friend from the AMICAS days...
Industry’s Largest Medical Imaging Network Helps Providers and Patients Coordinate Care and Share Information Across Distances and Disparate Healthcare Systems
BURLINGTON, Mass. – April 17, 2014 – Nuance Communications, Inc. (NASDAQ: NUAN) announced today the immediate availability of Nuance PowerShare™ Network, the industry’s largest cloud-based network for securely connecting physicians, patients, government agencies, specialty medical societies and others to share essential medical images and reports as simply as people exchange information using social networks. Nuance PowerShare Network promotes informed and connected physicians and patients who can instantly view, share and collaborate while addressing patients’ healthcare needs.
“Organizations are being tasked to communicate efficiently both in and out of their networks to provide clinical insight to physicians beyond one person or office to a much broader team involved in the continuum of care,” said Keith Dreyer, DO, PhD, FACR, vice chairman of radiology at Massachusetts General Hospital and Chair of the American College of Radiology (ACR) IT and Informatics Committee. “Nuance PowerShare Network addresses the information sharing challenge physicians face today with a network that supports things we’ve dreamed of doing for years,” he adds.
Fully Connected Patients & Providers
Nuance PowerShare Network is already used by more than 1,900 provider organizations for sharing images via the cloud using open standards. Made possible through the acquisition of Accelarad, this medical imaging exchange eliminates the costly and insecure process of managing images on CDs and removes silos of information in healthcare that inhibit providers from optimizing the efficiency and quality of care they provide. Anyone can join the network regardless of IT systems in place to instantly view and manage images needed to consult, diagnose or treat patients, enabling clinicians to more seamlessly evaluate and deliver care for patients who transition between facilities or care settings.
Nuance is already used by more than 500,000 clinicians and is a critical component within the radiology workflow and a trusted partner for 1,600+ provider organizations that rely on Nuance PowerScribe for radiology reporting and communications. Healthcare organizations that use Nuance PowerScribe, a group that produces more than 50 percent of all radiology reports in the U.S., can immediately leverage their existing investment and begin sharing radiology reports along with images, such as X-rays, MRIs, CT scans, EKGs, wound care images, dermatology images or any other type of image. This simplifies secure health information exchange between multiple providers, patients and disparate systems without costly and time-consuming interfaces, CD production or the need to install additional third-party systems.
“The challenge of sharing images with interpretive reports is something we’ve heard about consistently from our customers and EHR partners, and we know Nuance PowerShare Network will overcome this major obstacle, helping physicians treat patients more efficiently and effectively,” said Peter Durlach, senior vice president of marketing and strategy, Nuance Communications. “This nationwide network, one that is fully integrated into the EHR workflow and already connected to approximately half of all clinicians producing diagnostic imaging information, is a ground-breaking solution that delivers immediate benefits at an unprecedented scale to our healthcare system.”
“Integrated image and report sharing helps us deliver quality care and drive down costs, especially when patients transfer from one facility to another. Whether at their desktop or on their mobile device, our physicians can see the study that was done along with the interpretive report, which provides the information they need to treat the patient and avoid duplicate testing,” says Deborah Gash, vice president and CIO, Saint Luke’s Health System in Kansas City. “By integrating this with our EHR, PowerShare will enable physicians to manage inbound imaging through one point of access and login. Physicians in our 11 hospitals and 100-mile radius referral network see this cutting-edge technology as a way to deliver the highest level of patient care,” she adds.
To learn more about the PowerShare Network and the new image sharing solution, visit http://www.nuance.com/products/PowerShareNetwork to join one of our webinars. Connect with Nuance on social media through the healthcare blog, What’s next, as well as Twitter and Facebook.
By refusing to pay for readmissions within 30 days of discharge from a hospital, Medicare has sent a strong message across the healthcare industry: sub-30-day readmissions should be avoided at all costs. As a result, providers and vendors are doing everything in their power to avoid them.
This seems like a simple way to reduce costs, right? Well, not quite…
The vast majority of costs of care delivery are fixed: capital expenditures, facilities and diagnostics, 24/7 staffing, administrative overhead, etc. In other words, it’s extremely expensive just to “keep the lights on.” There are some variable costs in healthcare delivery – such as medications and unnecessary tests – but the marginal costs of diagnostics and treatments are small relative to the enormous fixed costs of delivering care.
Thus, Medicare’s sub-30-day readmission policy doesn’t really address the fundamental cost problem in healthcare. If costs scaled linearly with resource utilization, then reducing readmissions (and thus utilization) would lead to meaningful cost reduction. But given the reality of enormous fixed costs, it’s extremely difficult to move down the cost curve. To visualize:
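The fixed-versus-marginal-cost argument can be sketched with a toy model (the dollar figures and admission volumes below are assumptions chosen purely for illustration, not real hospital economics): cutting admissions 10 percent trims total spending by only 5 percent, while the average cost of each remaining admission actually rises.

```python
# Toy model of hospital cost structure: large fixed costs, small marginal costs.
# All figures are illustrative assumptions, not real hospital data.

FIXED_COSTS = 100_000_000   # annual facilities, equipment, 24/7 staffing, admin
MARGINAL_COST = 2_000       # assumed variable cost per admission (meds, supplies)

def total_cost(admissions: int) -> float:
    """Total annual cost of care delivery at a given admission volume."""
    return FIXED_COSTS + MARGINAL_COST * admissions

def avg_cost_per_admission(admissions: int) -> float:
    """Average cost the hospital incurs per admission."""
    return total_cost(admissions) / admissions

# Eliminate 10 percent of admissions (e.g., avoided readmissions)...
before, after = 50_000, 45_000
savings_pct = 100 * (1 - total_cost(after) / total_cost(before))

print(f"Total spending falls only {savings_pct:.0f}%")                    # 5%
print(f"Average cost per admission rises: "
      f"${avg_cost_per_admission(before):,.0f} -> "
      f"${avg_cost_per_admission(after):,.0f}")                           # $4,000 -> $4,222
```

In this sketch, only the avoided marginal costs ($10 million of a $200 million budget) come out of total spending; the fixed costs simply get spread over fewer admissions, which is the essence of why utilization cuts alone don't move the cost curve much.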
Medicare’s sub-30-day readmission policy is a band-aid, not a cure, for the underlying cost problem. The policy does, however, reduce Medicare’s outlays to providers. Rather than reduce (or expand, depending on your point of view) the size of the pie, Medicare has dictated that it will keep a larger share of the metaphorical pie for itself. Medicare is simply squeezing providers. One could argue that providers are bloated and that Medicare needs to squeeze them to drive down costs, but this is an intrinsically superficial strategy, not one that addresses the underlying cost problems in healthcare delivery.
So how can we actually address the fixed-cost problem of healthcare? Please leave a comment. Input is welcome.
I admire those who can explain the complex simply. In researching the latest developments in neuroscience and technology, I discovered the brilliant Dr. Story Landis, a neurobiologist and the Director of the National Institute of Neurological Disorders and Stroke.
Dr. Landis is part of the leadership for the President’s new “BRAIN Initiative,” a Grand Challenge of the 21st Century, and provides an easy overview of the latest advances in neurotechnology in this video (starting at 5:05).
She presented at the Society of Neuroscience’s Annual Convention as part of a distinguished panel to discuss the new brain initiatives in the United States and in Europe for 2014.
The acronym, BRAIN, stands for Brain Research through Advancing Innovative Neurotechnologies.
According to the National Institutes of Health, “By accelerating the development and application of innovative technologies, researchers will be able to produce a revolutionary new dynamic picture of the brain that, for the first time, shows how individual cells and complex neural circuits interact in both time and space.”
The goal of the initiative is to develop tools for researchers to discover new ways to treat, cure, and even prevent brain disorders. Through these technologies, researchers will explore “how the brain enables the human body to record, process, utilize, store, and retrieve vast quantities of information, all at the speed of thought.”
Neuroscientists need a consistent map of brain anatomy, but there isn’t one yet. Why? According to the Kavli Foundation, one of the partners of the initiative, “In the fast-moving field of neuroscience, researchers constantly reorganize brain maps to reflect new knowledge. They also face a vocabulary problem. Sometimes, different research groups will use several words to describe a single location; other times, a single word may mean different things to different researchers. Nor do maps remain consistent when moving across species.”
A Connectome is a structural description of the brain first proposed by Olaf Sporns. The Human Connectome Project (HCP) is a consortium comprehensively mapping brain circuitry in 1,200 healthy adults using noninvasive neuroimaging, and making their datasets freely available to the scientific community. Get the HCP data here.
Four imaging modalities are used to acquire data with unprecedented resolution in space and time. Resting-state functional MRI (rfMRI) and diffusion imaging (dMRI) provide information about brain connectivity. Task-evoked fMRI reveals much about brain function. Structural MRI captures the shape of the highly convoluted cerebral cortex. Behavioral data provides the basis for relating brain circuits to individual differences in cognition, perception, and personality. In addition, 100 participants will be studied using magnetoencephalography and electroencephalography (MEG/EEG). – HumanConnectome.org
Brainbow is the process by which individual neurons in the brain can be distinguished from neighboring neurons using fluorescent proteins. The idea is to color-code the individual wires and nodes, and was developed at the Center for Brain Science at Harvard.
CLARITY (Clear, Lipid-exchanged, Anatomically Rigid, Imaging/immunostaining compatible, Tissue hYdrogel) is a method of making brain tissue transparent, and offers a three-dimensional view of neural networks. It was developed by Karl Deisseroth and colleagues at the Stanford University School of Medicine.
The ability for CLARITY imaging to reveal specific structures in such unobstructed detail has led to promising avenues of future applications including local circuit wiring (especially as it relates to the Connectome Project). Pictured is a mouse brain with CLARITY.
Optogenetics uses light to control neurons that have been genetically sensitized to light. Optogenetics is credited with providing new insights for Parkinson’s disease, autism, schizophrenia, drug abuse, anxiety, and depression.
Also part of the leadership for the BRAIN initiative is neuroscientist William Newsome of Stanford University:
Most of us who have been in this field in the last few decades understand that there is a revolution going on right now, so these tools we’ve mentioned already did not exist 8 years ago, and some did not exist 6 months ago. The pace of technological change is so rapid right now that those of us who were traditional experimental scientists say, “Whoa, what does it even mean to be an experimental scientist in this day and age?” We have to totally rethink what experiments are even possible, and it opens up vistas that were unimaginable 10 years ago.
To get a deeper understanding of the brain before and after disorders, neuroscientists from the University of California San Francisco have established a new “Brain Health Registry.” Their goal is to address one of the biggest obstacles to cures for brain disorders – the costs and time involved in clinical trials. To register your brain, participate in games, and help scientists, read more in the FAQs.
The average adult brain weighs about 1,300 to 1,400 grams (three pounds) and is about 15 centimeters (5.9 inches) long. It is often quoted that there are 100 billion neurons in the human brain, but Dr. Suzana Herculano-Houzel of Brazil recently discovered there are 14 billion fewer. According to her research, the human brain has 86 billion neurons, or nerve cells.
What is the impact of brain disorders in the U.S.?
According to the World Health Organization, brain disorders are a leading contributor to the global disease burden, and the fourth highest for Western developed countries. About 50 million people in the U.S. suffer from damage to the nervous system, and there are more than 600 neurological diseases.
Psychiatric Illness – About 1 in 4 American adults suffer from a diagnosable mental disorder in any given year, according to the NIMH.
Alzheimer’s – In 2014, there are 5.2 million people in the U.S. with Alzheimer’s Disease, according to the Alzheimer’s Association. With the aging of the Baby Boomer generation, it is expected that between 11 and 16 million will be affected by 2050.
Parkinson’s – The Parkinson’s Foundation estimates 1 million Americans live with Parkinson’s Disease.
Autism – One in 68 children in the U.S. is affected by Autism Spectrum Disorder, a 30% increase from two years ago.
The BRAIN Initiative involves a number of government agencies and private partners fostering a multi-disciplinary approach to research and technology. Specifically, it is a unique collaboration across disciplines involving the National Institutes of Health and the National Science Foundation. Learn more in this video with Dr. Tom Insel, Director of the NIMH, and Dr. Fleming Crim of the NSF, as they discuss exploring the connections between the life sciences and physical sciences in understanding the brain.
Through a Call to Action, the White House has asked to hear from companies, health systems, patient advocacy organizations, philanthropists, and developers about the unique activities and capabilities underway that could be leveraged to catalyze new breakthroughs in our understanding of the brain.
Do you have an idea? You have until May 1st to send your ideas to: email@example.com.
PCORI will invest $100 million to build a nationwide database containing 26 to 30 million EHR records in an effort to begin supporting retrospective clinical research.
MeMD announces that its subscription-based telemedicine service has expanded to include licensed providers in all 50 states.
The Congressional Budget Office expects that expanding insurance under the ACA will cost $100 billion less than previously forecast over the next 10 years, according to a report published Monday that cites an increase in non-elderly coverage and a decrease in the forecasted cost of insurance subsidies.
Can Intuitive Software Design Support Better Health?
By Scott Frederick
Biometric technology is the new “in” thing in healthcare, allowing patients to monitor certain health characteristics—blood pressure, weight, activity level, sleep pattern, blood sugar—outside of the healthcare setting. When this information is communicated with providers, it can help with population health management and long-term chronic disease care. For instance, when patients monitor their blood pressure using a biometric device and upload that information to their physician’s office, the physician can monitor the patient’s health remotely and tweak the care plan without having to physically see the patient.
For biometric technology to be effective, patients must use it consistently in order to capture a realistic picture of the health characteristics they are monitoring. Without regular use, it is hard to see if a reading is an anomaly or part of a larger pattern. The primary way to ensure consistent use is to design user-friendly biometric tools because it is human nature to avoid things that are too complicated, and individuals won’t hesitate to stop using a biometric device if it is onerous or complex.
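The anomaly-versus-pattern point can be made concrete with a toy sketch (all readings hypothetical): with a consistent history of readings, an outlier stands out statistically; with sparse use, there is no baseline to judge against.

```python
from statistics import mean, stdev

def is_anomaly(history, new_reading, z_threshold=2.0):
    """Flag a reading that deviates sharply from the recent trend.

    Returns None when there are too few prior readings -- which is exactly
    why inconsistent device use makes anomalies hard to spot.
    """
    if len(history) < 5:
        return None  # not enough data to distinguish anomaly from pattern
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return new_reading != mu
    return abs(new_reading - mu) / sigma > z_threshold

# Systolic blood pressure readings from consistent daily use (hypothetical)
readings = [122, 118, 125, 121, 119, 123, 120]
print(is_anomaly(readings, 158))       # well outside the usual range -> True
print(is_anomaly(readings, 124))       # within normal variation -> False
print(is_anomaly(readings[:3], 158))   # sparse use: no baseline -> None
```

With only three readings, a 158 could be a fluke or a trend; with a week of daily data, the same value is clearly worth a call from the physician's office.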
Let’s look at an example.
An emerging growth area for healthcare biometrics is wireless activity trackers—like Fitbit—that can promote healthier lifestyles and spur weight loss. About three months ago, I started using one of these devices to see if monitoring metrics like the number of steps I walked, the calories I consumed, and the hours I slept would make a difference in my health.
The tool is easy-to-use and convenient. I can monitor my personal metrics any time, anywhere, allowing me to make real-time adjustments to what I eat, when I exercise, and so on. For instance, at any given time, I can tell how many steps I’ve taken and how many more I need to take to meet my daily fitness goal. This shows me whether I need to hit the gym on the way home from work or whether my walk at lunch was sufficient. I can even make slight changes to my routine, choosing to stand up during conference calls or take the stairs instead of the elevator.
I download my data to a website, which provides easy-to-read and customizable dashboards, so I can track overall progress. I find I check that website more frequently than I look at Facebook or Twitter.
Now, imagine if the tool was bulky, slow, cumbersome, and hard to navigate. Or if the dashboard where I view my data was difficult to understand. I would have stopped using it a while ago—or may not have started using it in the first place.
Like any other hot technology, wireless activity trackers are flooding the market, each one promising to be the best. In reality, only the most well-designed applications will stand the test of time. These will be completely user-centric, designed to easily and intuitively meet user needs.
For example, a well-designed tracker will facilitate customization so users can monitor only the information they want and change settings on the fly. Such a tool will have multiple data entry points, so a user can upload his or her personal data any time and from anywhere. People will also be able to track their progress over time using clear, easy-to-understand dashboards.
Going forward, successful trackers may also need to keep providers’ needs in mind. While physicians have hesitated to embrace wireless activity monitors—encouraging patients to use the technology but not leveraging the data to help with care decisions—that perspective may be changing. It will be interesting to see whether physicians start looking at this technology in the future as a way to monitor their patients’ health choices. Ease of obtaining the data and having it interface with existing technology will drive provider use and acceptance.
While biometric tools are becoming more common in healthcare and stand to play a major role in population health management in the future, not every tool will be created equal. Those designed with the patient and provider in mind will rise to the top and improve the overall health of their users.
Scott Frederick, RN, BSN, MSHI is director of clinical insight for PointClear Solutions of Atlanta, GA.
Addressing Data Quality in the EHR
By Greg Chittim
What if you found out that you might have missed out on seven of your 22 ACO performance measures, not because of your actual clinical and financial performance, but because of the quality of data in your EHRs? It happens, but it’s not an intractable problem if you take a systematic approach to understanding and addressing data quality in all of your different ambulatory EHRs.
In HIStalk’s recent coverage of HIMSS14, an astute reader wrote:
Several vendors were showing off their “big data” but weren’t ready to address the “big questions” that come with it. Having dealt with numerous EHR conversions, I’m keenly aware of the sheer magnitude of bad data out there. Those aggregating it tend to assume that the data they’re getting is good. I really pushed one of the major national vendors on how they handle data integrity and the answers were less than satisfactory. I could tell they understood the problem because they provided the example of allergy data where one vendor has separate fields for the allergy and the reaction and another vendor combines them. The rep wasn’t able to explain how they’re handling it even though they were displaying a patient chart that showed allergy data from both sources. I asked for a follow up contact, but I’m not holding my breath.
All too often as the HIT landscape evolves, vendors and their clients are moving too quickly from EHR implementation to population health to risk-based contracts, glossing over (or skipping entirely) a focus on the quality of the data that serves as the foundation of their strategic initiatives. As more provider organizations adopt population health-based tools and methodologies, a comprehensive, integrated, and validated data asset is critical to driving effective population-based care.
Health IT maturity can be defined as four distinct steps.
High-quality data is a key foundational piece that is required to manage a population and drive quality. When the quality of data equals the quality of care physicians are providing, one can leverage that data as an asset across the organization. Quality data can provide detailed insight that allows pinpointing opportunities for intervention — whether it’s around provider workflow, data extraction, or patient follow-up and chart review. Understanding the origins of compromised data quality helps organizations recognize how to boost measure performance, maximize reimbursements, and lay the foundation for effective population health reporting.
It goes without saying that reporting health data across an entire organization is not an easy task. However, there are steps that organizations must take to ensure they are extracting sound data from their EHR systems.
Outlined below are the key issues that contribute to poor data quality impacting population health programs, how they are typically resolved, and more optimal ways organizations can resolve them.
Variability across disparate EHRs and other data sources
EHRs are inconsistent. Data feeds are inconsistent. Despite their intentions, standardized message types such as HL7 and CCDs still have a great deal of variability among sources. Even when they meet the letter of national standards, they rarely meet the true spirit of those standards when you try to use them.
Take diagnoses, for example. Patient diagnoses can often be recorded in three different locations: on the problem list, as an assessment, and in medical history. Problem lists and assessments are both structured data, but generally only diagnoses recorded on the problem list are transported to the reports via the CCD. This translates to underreporting on critical measures that require records of DM, CAD, HTN, or IVD diagnoses. Accounting for this variability is critical when mapping data to a single source of truth.
Standard approach: Most organizations try to use consistent mapping and normalization logic across all data sources. Validation is conducted by doing sanity checks, comparing new reports to old.
Best practice approach: To overcome the limitations of standard EHR feeds like the CCD, reports need to pull from all structured data fields in order to achieve performance rates that reflect the care physicians are rendering. Either workflow needs to be standardized across providers, or reporting tools need to be comprehensive and flexible in the data fields they pull from.
The optimal way to resolve this issue is to tap into the back end of the EHR. This allows you to see what data is structured vs. unstructured. Once you have an understanding of the back-end schema, data interfaces and extraction tools can be customized to pull data where it is actually captured, as well as where it should be captured. In addition, validation of individual data elements needs to happen in collaboration with providers, to ensure completeness and accuracy of data.
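The "pull from every structured field" idea can be sketched in a few lines. This is a minimal illustration with hypothetical field names, not any vendor's actual schema: union the diagnoses from all three capture locations rather than relying on the problem list alone, as a CCD typically does.

```python
def collect_diagnoses(chart):
    """Union diagnoses from all structured locations in a (hypothetical)
    chart export, recording where each one was captured."""
    sources = ("problem_list", "assessments", "medical_history")
    found = {}
    for source in sources:
        for dx in chart.get(source, []):
            # Keep the first occurrence; remember its source for validation
            found.setdefault(dx, source)
    return found

chart = {
    "problem_list": ["HTN"],
    "assessments": ["HTN", "DM"],      # DM never made it to the problem list
    "medical_history": ["CAD"],
}
print(collect_diagnoses(chart))
# A problem-list-only pull would report just HTN, under-counting DM and CAD.
```

Tracking the source field for each diagnosis is what makes the provider-collaboration step practical: when a measure looks low, you can show exactly where the missing data was (or wasn't) captured.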
Variability in provider workflows
EHRs are not perfect and providers often have their own ways of doing things. What may be optimal for the EHR may not work for the providers or vice versa. Within reason, it is critical to accommodate provider workflows rather than forcing them into more unnatural change and further sacrificing efficiency.
Standard approach: Most organizations ignore this and go to one extreme or another: (1) using consistent mapping and normalization logic across all data sources and user workflows, assuming that all providers use the EHR consistently, or (2) letting workflows dictate everything and fighting the losing battle to make the data integration infinitely adaptable. Again, validation is conducted using sanity checks, comparing new reports to old.
Best practice approach: Understand how each provider uses the system and identify where the provider is capturing all data elements. Building in a core set of workflows and standards dictated by an on-the-ground clinical advisory committee, with flexibility for effective variations is critical. With a standard core, data quality can be enhanced by tapping into the back end of the EHR to fully understand how data is captured as well as spending time with care teams to observe their variable workflows. To avoid disruption in provider workflows, interfaces and extraction tools can be configured to map data correctly, regardless of how and where it is captured. Robust validation of individual data elements needs to happen in collaboration with providers to ensure completeness and accuracy of data (that is, the quality of the data) matches the quality of care being delivered.
Build provider buy-in/trust in system and data through ownership
If providers do not trust the data, they will not use population health tools. Without these tools, providers will struggle to effectively drive proactive, population-based care or quality improvement initiatives. Based on challenges with EHR implementation and adoption over the last decade, providers are often already skeptical of new technology, so getting this right is critical.
Standard approach: Many organizations simply conduct a data validation process by doing a sanity test comparing old reports to new. Reactive fixes are made to correct errors in data mapping, but often too late, after provider trust in the system has been lost.
Best practice approach: Yet again, it is important to build out a collaborative process to ensure every single data element is mapped correctly. First meetings to review data quality usually begin with a statement akin to “your system must be wrong — there’s no way I am missing that many patients.” This is OK. Working side by side with the providers to ensure they understand where data is coming from and how to modify both workflow and calculations ensures that they are confident that reports accurately reflect the quality of care they are rendering. This confidence is a critical success factor to the eventual adoption of these population health tools in a practice.
Missed incentive payments under value-based reimbursement models
An integrated data asset that combines data from many sources should always add value and give meaningful insight into the patient population. A poorly mapped and validated data asset can actually compromise performance, lower incentive reimbursements, and ultimately result in a negative ROI.
Standard approach: A lackluster data validation process can result in lost revenue opportunities, as data will not accurately reflect the quality of care delivered or accurately report the risk of the patient population.
Best practice approach: Using the previously described approach when extracting, mapping, and validating data is critical for organizations that want to see a positive ROI in their population health analytics investments. Ensuring data is accurate and complete will ensure tools represent the quality of care delivered and patient population risk, maximizing reimbursement under value-based payments.
We have worked with a sample ACO physician group of over 50 physicians to assess the quality of data being fed from multiple EHRs within their system into an existing analytics platform via CCDs and pre-built feeds. Based on an assessment of 15 clinically sensitive ACO measures, it was discovered that the client’s reports were under-reporting on 12 of the 15 measures, due solely to data quality. Measures were under-reported by an average of 28 percentage points, with one measure under-reported by a full 100 percentage points.
Reports erroneously showed that only six of the 15 measures met 2013 targets, while a manual chart audit revealed that 13 of the 15 measures met them, indicating that data was not being captured, transported, and reported accurately. By simply addressing these data quality issues, the organization could potentially see additional financial returns through quality incentive reimbursements as well as a reduced need for labor-intensive chart audits.
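The under-reporting arithmetic is simple to express. A sketch with hypothetical measure names and rates, in the spirit of the audit above: the gap is just audited rate minus reported rate, in percentage points.

```python
def underreporting_gap(reported_rate, audited_rate):
    """Gap in percentage points between what the reports show and what a
    manual chart audit finds (all figures hypothetical)."""
    return round(audited_rate - reported_rate, 1)

# Reported vs. audited performance for three illustrative measures
measures = {
    "DM: HbA1c control": (45.0, 78.0),
    "HTN: BP control":   (60.0, 71.0),
    "IVD: aspirin use":  (0.0, 100.0),  # worst case: nothing transported
}
gaps = {m: underreporting_gap(r, a) for m, (r, a) in measures.items()}
print(gaps)
avg_gap = round(sum(gaps.values()) / len(gaps), 1)
print(avg_gap)  # average under-reporting across measures, in points
```

When a measure's incentive payment hinges on crossing a target rate, a gap of even a few points between reported and audited performance translates directly into missed revenue.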
As the industry continues to shift toward value-based payment models, the need for an enterprise data asset that accurately reflects the health and quality of care delivered to a patient population is increasingly crucial for financial success. Providers have suffered enough with drops in efficiency since going live on EHRs. Asking them to make additional significant changes in their daily workflows to make another analytics tool work is not often realistic.
Analytics vendors need to meet providers where they are to add real value to their organizations. Working with providers and care teams not only to validate the integrity of the data, but also to instill a level of trust and give them the confidence they need to adopt these analytics tools into their everyday workflows is extremely valuable and often overlooked. These critical steps allow providers to begin driving population-based care and quality improvement in practices, positioning them for success in the new era of healthcare.
Greg Chittim is senior director of Arcadia Healthcare Solutions of Burlington, MA.
If you’ve ever traveled to a country that doesn’t speak your native tongue, you can appreciate the importance of basic communication. If you learn a second language to the degree that you’re adding nuance and colloquialisms, you’ve experienced how much easier it is to explain a point or to get answers you need. What if you’re expected to actually move to that foreign country under a strict timeline? The pressure is on to get up to speed. The same can be said for learning the detailed coding language of ICD-10.
The healthcare industry has been preparing in earnest to move from ICD-9 coding to the latest version of the international classification of diseases. People have been training, testing, and updating information systems, essentially packing their bags to comply with the federal mandate to implement ICD-10 this October — but the trip was postponed. On April 1, President Barack Obama signed into law a bill that includes an extension for converting to ICD-10 until at least Oct. 1, 2015. What does this mean for your ICD-10 travel plans?
Despite the unexpected delay, you’ll be living in ICD-10 country before you know it. With at least another year until the deadline, the timing is just right to start packing and hitting the books to learn the new codes and to prepare your systems. For those who have a head start, your time and focus have not gone to waste, so don’t throw your suitcases back into the closet. The planning, education, and money involved in preparation for the ICD-10 transition don’t dissolve with the delay – you’ve collected valuable tools that will be put to use.
Although many people, including myself, are disappointed by the delay, we need to continue making progress toward the conversion; learning and using ICD-10 will enable the United States to have more accurate, current, and appropriate medical conversations with the rest of the world. Considering that it is almost four decades old, there is only so much communication that ICD-9 can handle; some categories are actually full as the number of new diagnoses continues to grow. ICD-9 uses three to five numeric characters for diagnosis coding, while ICD-10 uses three to seven alphanumeric characters. ICD-10 classifications will provide more specific information about medical conditions and procedures, allowing more depth and accuracy to conversations about a patient’s diagnosis and care.
Making the jump to ICD-10 fluency will be beneficial, albeit challenging. In order to study, understand, and use ICD-10, healthcare organizations need to establish a learning system for their teams. The Breakaway Group, A Xerox Company, provides training for caregivers and coders that eases learning challenges, such as the expanded clinical documentation and new code set for ICD-10. Simply put, there are people who can help with your entire ICD-10 travel itinerary, from creating a checklist of needs to planning a successful route.
ICD-10 is the international standard, so the journey from ICD-9 codes to ICD-10 codes will happen. Do not throw away your ICD-10 coding manuals and education materials just yet. All of these items will come in handy to reach the final destination: ICD-10.
Xerox is a sponsor of the Breakaway Thinking series of blog posts.
This is a guest blog by Nial Toner of PathXL, a vendor of cloud-based digital pathology systems. I asked him to discuss the benefits of cloud computing in digital pathology and barriers to its deployment. There will be some emphasis placed on digital pathology at the upcoming Pathology Informatics Summit 2014 (see: Digital Pathology Well Represented at Pathology Informatics Summit 2014)--BAF
In digital pathology, cloud computing can help to deliver cost effective healthcare and also help to manage the growing amount of data that is generated by the technology. Cloud computing provides many benefits but also some drawbacks. The benefits of cloud computing in digital pathology are the following:
Despite these benefits, some reservations and barriers to using cloud technology in digital pathology persist and include:
While the benefits are substantial, cloud computing has yet to make any major inroads in pathology. Despite this, cloud computing in support of digital pathology is increasingly being used for medical education and in research settings. The future for cloud technology does look bright, and the value of cloud computing for the healthcare industry has been predicted to reach $5.4 billion by 2017. We are all increasingly adapting to a mobile world, and digital pathology will make a major contribution to this shift.
A post-stroke rehabilitation system integrating robotics, VR and high-resolution EEG imaging.
IEEE Trans Neural Syst Rehabil Eng. 2013 Sep;21(5):849-59
Authors: Steinisch M, Tana MG, Comani S
We propose a system for the neuro-motor rehabilitation of upper limbs in stroke survivors. The system is composed of a passive robotic device (Trackhold) for kinematic tracking and gravity compensation, five dedicated virtual reality (VR) applications for training of distinct movement patterns, and high-resolution EEG for synchronous monitoring of cortical activity. In contrast to active devices, the Trackhold omits actuators for increased patient safety and acceptance levels, and for reduced complexity and costs. VR applications present all relevant information for task execution as easy-to-understand graphics that do not need any written or verbal instructions. High-resolution electroencephalography (HR-EEG) is synchronized with kinematic data acquisition, allowing for the epoching of EEG signals on the basis of movement-related temporal events. Two healthy volunteers participated in a feasibility study and performed a protocol suggested for the rehabilitation of post-stroke patients. Kinematic data were analyzed by means of in-house code. Open source packages (EEGLAB, SPM, and GMAC) and in-house code were used to process the neurological data. Results from kinematic and EEG data analysis are in line with knowledge from currently available literature and theoretical predictions, and demonstrate the feasibility and potential usefulness of the proposed rehabilitation system to monitor neuro-motor recovery.
Brain-computer interfaces: a powerful tool for scientific inquiry.
Curr Opin Neurobiol. 2014 Apr;25C:70-75
Authors: Wander JD, Rao RP
Abstract. Brain-computer interfaces (BCIs) are devices that record from the nervous system, provide input directly to the nervous system, or do both. Sensory BCIs such as cochlear implants have already had notable clinical success and motor BCIs have shown great promise for helping patients with severe motor deficits. Clinical and engineering outcomes aside, BCIs can also be tremendously powerful tools for scientific inquiry into the workings of the nervous system. They allow researchers to inject and record information at various stages of the system, permitting investigation of the brain in vivo and facilitating the reverse engineering of brain function. Most notably, BCIs are emerging as a novel experimental tool for investigating the tremendous adaptive capacity of the nervous system.
Android Wear will show you info from a wide variety of Android apps, such as messages, social apps, chats, notifications, health and fitness, music playlists, and videos.
It will also enable Google Now functions — say “OK, Google” to check flight times, send a text, get the weather, view email, get directions, estimate travel time, make a reservation, and so on.
Google says it’s working with several other consumer-electronics manufacturers, including Asus, HTC, and Samsung; chip makers Broadcom, Imagination, Intel, Mediatek and Qualcomm; and fashion brands like the Fossil Group to offer watches powered by Android Wear later this year.
If you’re a developer, there’s a new section on developer.android.com/wear focused on wearables. Starting today, you can download a Developer Preview so you can tailor your existing app notifications for watches powered by Android Wear.
A Hybrid Brain Computer Interface System Based on the Neurophysiological Protocol and Brain-actuated Switch for Wheelchair Control.
J Neurosci Methods. 2014 Apr 5;
Authors: Cao L, Li J, Ji H, Jiang C
BACKGROUND: Brain Computer Interfaces (BCIs) are developed to translate brain waves into machine instructions for external devices control. Recently, hybrid BCI systems are proposed for the multi-degree control of a real wheelchair to improve the systematical efficiency of traditional BCIs. However, it is difficult for existing hybrid BCIs to implement the multi-dimensional control in one command cycle.
NEW METHOD: This paper proposes a novel hybrid BCI system that combines motor imagery (MI)-based bio-signals and steady-state visual evoked potentials (SSVEPs) to control the speed and direction of a real wheelchair synchronously. Furthermore, a hybrid modalities-based switch is firstly designed to turn on/off the control system of the wheelchair.
RESULTS: Two experiments were performed to assess the proposed BCI system. One was implemented for training and the other one conducted a wheelchair control task in the real environment. All subjects completed these tasks successfully and no collisions occurred in the real wheelchair control experiment.
COMPARISON WITH EXISTING METHOD(S): The protocol of our BCI gave much more control commands than those of previous MI and SSVEP-based BCIs. Comparing with other BCI wheelchair systems, the superiority reflected by the index of path length optimality ratio validated the high efficiency of our control strategy.
CONCLUSIONS: The results validated the efficiency of our hybrid BCI system to control the direction and speed of a real wheelchair as well as the reliability of hybrid signals-based switch control.
Glyph looks like a normal headset and operates like one, too. That is, until you move the headband down over your eyes and it becomes a fully-functional visual visor that displays movies, television shows, video games or any other media connected via the attached HDMI cable.
Using Virtual Retinal Display (VRD), a technology that mimics the way we see light, the Glyph projects images directly onto your retina using one million micromirrors in each eye piece. These micromirrors reflect the images back to the retina, producing a reportedly crisp and vivid quality.
I’ve written regularly about the need for secure text messaging in healthcare. I can’t believe that it was two years ago that I wrote that Texting is Not HIPAA Secure. Traditional SMS texting on your cell phone is not HIPAA secure, but there are a whole lot of alternatives. In fact, in January I made the case for why, even setting HIPAA aside, secure text messaging was a much better alternative to SMS.
Those that know me (or read my byline at the end of each article) know that I’m totally biased on this front since I’m an adviser to the secure text messaging company docBeat. With that disclaimer, I encourage all of you to take a frank and objective look at the potential for HIPAA violations and the potential benefits of secure text over SMS and decide for yourself if there is value in these secure messaging services. This amazing potential is why I chose to support docBeat in the first place.
While I’ve found the secure messaging space really interesting, what I didn’t realize when I started helping docBeat was how many parts of the healthcare system could benefit from something as simple as a secure text message. When we first started talking about secure text, we were completely focused on providers texting in ambulatory practices and hospitals. We quickly realized the value of secure texting with other members of the clinic or hospital organization like nurses, front desk staff, HIM, etc.
What’s been interesting in the evolution of docBeat is how many other parts of the healthcare system could benefit from a simple secure text message solution. Some of these areas include long-term care facilities, skilled nursing facilities, quick care clinics, EDs, radiology, labs, rehabilitation centers, surgery centers, and more. This shouldn’t have been a surprise, since the need to communicate healthcare information that includes PHI is universal and a simple text message is often the best way to do it.
The natural next extension for secure messaging is to connect it to patients. The beautiful part of secure text messaging apps like docBeat is that patients aren’t intimidated by the messages they receive from docBeat. The same can’t be said for most patient portals, which require all sorts of registration, logins, forms, etc. Every patient I know is happy to read a secure text message. I don’t know many that want to log in to a portal.
Over the past couple of years the secure text messaging tide has absolutely shifted, and there’s now a land grab among organizations looking to implement some form of secure text messaging. In some ways it reminds me of the way organizations were adopting EHR software a few years back. However, we won’t need $36 billion to incentivize the adoption of secure text messaging. Instead, market pressures will make it happen naturally. Plus, with ICD-10 delayed another year, hopefully organizations will have time to focus on small but valuable projects like secure text messaging.
I recently chaired a couple of conferences, and my next HealthIMPACT event is coming up later this month in NYC. At each of these events, and many times a year via Twitter and e-mail, I am asked whether the Direct Project is successful, whether it’s worth implementing in health IT projects, and whether many people are actually sending secure messages using Direct. To help answer these questions, I reached out to Rachel A. Lunsford, Director of Product Management at Amida Technologies. Amida has amassed an impressive team of engineers to focus on health IT for patient-centered care, so their answer will be well grounded in facts. Here’s what Rachel said when I asked whether Direct is a myth or if it’s real and in use:
Despite wide adoption in 44 States, there is a perception that Direct is not widely used. In a recent conversation, we discussed a potential Direct secure messaging implementation with a client when they expressed concern about being a rare adopter of Direct messaging. While the team reassured them that their organization would in fact be joining a rich ecosystem of adopters, they still asked us to survey the market.
In 2012, the Office of the National Coordinator for Health Information Technology (ONC) awarded grants to State Health Information Exchanges to further the exchange of health information. There are two primary ways to exchange information: directed and query-based. ‘Directed’ exchange is what it sounds like – healthcare providers can send secure messages with health information attached to other healthcare providers that they know and trust. The most common type of ‘Directed’ exchange is Direct, which is a secure, scalable, standards-based way to send messages. Query-based exchange is a federated database or central repository approach to information exchange, which is much harder to implement; growth in this area is slower. Thanks in part to the grants and also in part to the simplicity of the Direct protocol, 44 States have adopted Direct and widely implemented it. And yet the myth persists that Direct is not well adopted or used.
As with other new technologies, it may be hard to see the practical applications. When Edison and Tesla were dueling to find out which standard – direct or alternating current – would reign supreme, many were unsure if electricity would even be safe enough, never mind successful enough, to replace kerosene in the street lamps. It was impossible for people to foresee a world where many live in well-lit homes on well-lit streets, and none could have imagined using tools like the computer or the Internet. Thankfully, the standards debate was sorted out and we continue to benefit from it today.
There are two groupings of data we can look to for more detail on the use of Direct. The first is the States themselves, which self-report transaction and usage statistics to the ONC. In the third quarter of 2013, the States reported actively exchanging some 165 million ‘Directed’ messages.
Another organization collecting data on Direct implementation is DirectTrust.org. Charged by the ONC, DirectTrust.org oversees development of the interoperability framework and rules used by Direct implementers, works to reduce implementation costs, and removes barriers to implementation. Additionally, DirectTrust supports organizations that want to serve as sending and receiving gateways, known as health information service providers (HISPs). By DirectTrust.org’s count, users number well over 45,000, with at least 16 organizations accredited as HISPs. Further, over two million messages have been exchanged among the roughly 1,500 Direct-enabled sites. With Meaningful Use encouraging the use of Direct, we can expect even more physicians and healthcare organizations to join in.
As more doctors are able to exchange records, everyone will benefit. When a provider can receive notes and records from other providers to see a fuller, more complete view of her patient’s health, we have a greater possibility of lowering healthcare costs, improving health outcomes, and saving lives. Once we open up the exchange to patients through things like the Blue Button personal health record, the sky is the limit.
Editor’s Note: The following is an update to a previous EMR and HIPAA blog post titled “EMR Companies Holding Practice Data for ‘Ransom.’” In this update, James Summerlin (aka “JamesNT”) reports on EHR vendors’ willingness to let providers access their EHR data.
Over the years I have been approached with questions by several solo docs and medical groups about things such as the following:
And there have been plenty of times I’ve had to give unfavorable answers to those questions. In many cases the problem was an online EMR or PM: I could not get to the database, and the vendor either refused to export a copy to me or wanted thousands of dollars for the export. With on-premises PM and EMR systems, getting to the data was a matter of working my way around whatever database was being used and figuring out which table held which data. Although working with an on-premises PM or EMR may sound easier, it too often isn’t. The on-premises vendors have some tricks up their sleeves to keep you away from your data, such as password-protecting the database and, in some cases, flat out threatening legal action.
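The schema spelunking described above — figuring out which table holds which data — can be sketched in a few lines. This uses SQLite purely as a stand-in for whatever engine the PM/EMR actually runs on (often SQL Server), and the table and column names are hypothetical:

```python
import sqlite3

# Stand-in for an on-premises PM/EMR database; the schema below is
# hypothetical and exists only so the enumeration has something to find.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Patients (PatientID INTEGER PRIMARY KEY, LastName TEXT);
    CREATE TABLE Encounters (EncounterID INTEGER, PatientID INTEGER, Note TEXT);
""")

# Step 1: list every table so you can start guessing which one holds what.
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
print(tables)

# Step 2: inspect each table's columns to map data to fields.
for table in tables:
    cols = [c[1] for c in conn.execute(f"PRAGMA table_info({table})")]
    print(table, cols)
```

On a real system the catalog queries differ by engine (e.g., `INFORMATION_SCHEMA.TABLES` on SQL Server), but the workflow — enumerate tables, then columns, then sample rows — is the same one described above.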
A few years back, I wrote a post on a forum about my thoughts on how once you entered your data into a PM or EMR, you may never get it back. You can see John Lynn’s blog post on that here.
My being critical of EMR and PM software vendors is nothing new. I’ve written several posts on forums and blogs, and even articles in BC Advantage Magazine, about how hard it can be to deal with various EMR and PM systems. The at-times downright contemptuous attitude many PM and EMR vendors have toward their own clients can be very harmful. Let’s consider three aspects:
In situations like those above, the best path to resolution may be for the practice to obtain its own technical talent and build its own tools to extend the capabilities of the data contained in its various databases and repositories, such as those of the PM and EMR. Unfortunately, as I have reported before, most PM and EMR systems lock up the practice’s data so that it is unobtainable.
At long last, however, there appears to be a light at the end of the tunnel that doesn’t sound like a train. Some EMR vendors are beginning to realize that building a turtle shell around a client’s data does neither the client nor the vendor any good in the long run. One such EMR I’ve been working with for a long time is Amazing Charts. Amazing Charts finds itself in a unique situation: many of its clients are quite technical themselves or have no problem obtaining the technical talent they need to bend the different systems in their practices to their will. The idea of having three or four databases, each an island unto itself, is not acceptable to this adventurous lot. They want all this data pooled together so they can make real business decisions.
Amazing Charts, therefore, has decided to be more open regarding data access. Read-only access to the Amazing Charts database will soon be considered a given by the company itself. Write access, of course, is another matter. Clients will have to prove, and rightly so, that they won’t go spelunking through the database making changes that do little more than rack up tech-support calls. Even with the caution placed on write access, this is a far cry from the flat-out “no” most other companies will give you when you ask for access to their database. I consider this a great leap forward for Amazing Charts and one that, I’m certain, will set the company apart from competitors that still consider lock-in and a standoffish attitude the way to treat clients who pay them a lot of money.
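The read-only-versus-write distinction described above can be enforced at the connection level rather than by trust alone. A minimal sketch using SQLite’s read-only URI mode as a stand-in (Amazing Charts’ actual database engine and access mechanics are not specified here, and the schema is hypothetical):

```python
import os
import sqlite3
import tempfile

# Create a stand-in database file (hypothetical schema).
path = os.path.join(tempfile.mkdtemp(), "emr.db")
rw = sqlite3.connect(path)
rw.execute("CREATE TABLE Patients (PatientID INTEGER, LastName TEXT)")
rw.execute("INSERT INTO Patients VALUES (1, 'Doe')")
rw.commit()
rw.close()

# Open the same file read-only: SELECTs work, writes are rejected.
ro = sqlite3.connect(f"file:{path}?mode=ro", uri=True)
rows = ro.execute("SELECT LastName FROM Patients").fetchall()
print(rows)  # [('Doe',)]

try:
    ro.execute("UPDATE Patients SET LastName = 'X'")
except sqlite3.OperationalError as e:
    print("write rejected:", e)
```

Server-class engines offer the same idea through permissions (e.g., a SQL Server login granted only `db_datareader`), which is presumably closer to how a vendor would grant clients read-only access in practice.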
Perhaps one day other PM and EMR vendors will see the light and realize the data belongs to the practice, not the vendor, and will stop taking people’s stuff only to rent access to it back to them or withhold it altogether. Until then, Amazing Charts seems to be leading the way.
I have posted previous notes about U.S. academic pathology departments and LIS vendors entering into various types of business arrangements with Chinese businesses. One involved the UPMC pathology department providing second opinions for surgical pathology cases to KingMed Diagnostics three years ago (see: UPMC Enters China Market for Second Opinions in Surgical Pathology Cases). Another involved the establishment of a global pathology network by PathCentral, a cloud-based AP-LIS, with participation by China-based Kindstar Globalgene Technology (see: PathCentral Debuts Agnostic Global Pathology Network). About three years ago, Mayo Clinic also signed a collaboration agreement with Kindstar (see: Mayo Clinic Signs Strategic Collaboration Agreement with Wuhan Kindstar Globalgene Technology, Inc. (Kindstar)). Here's a quote from that article:
...Kindstar offers 750 tests in specialties including hematology, oncology, genetics, infectious diseases, and cardiovascular disease. Through the collaboration with Mayo Clinic, Kindstar will expand its test menu offerings, promoting high-quality diagnostic and therapeutic care for the people of China by providing patients and physicians the broadest access to advanced esoteric testing services.
Now comes news that UCLA has signed a business agreement with Centre Testing to operate an esoteric lab in Shanghai with a special focus on molecular and genetic testing (see: UCLA signs agreement with Centre Testing to create clinical laboratory in Shanghai). Here is an excerpt from the article:
The University of California and UCLA Department of Pathology have signed an agreement with Centre Testing International Corp., a Chinese firm, to create a company that will operate a clinical laboratory in Shanghai. The new lab will support clinical trials and enhance medical care for Chinese patients with cancer and other diseases. The new company, CTI-Pathology/UCLA Health, is jointly owned by CTI and the University of California. The 25,000-square-foot facility - the first of its kind in China - will offer genetic and molecular diagnostics and other sophisticated tests that exceed the scope of the average lab in China, and UCLA pathologists will train Chinese lab specialists to accurately interpret the tests....The partnership is the first between a Chinese company and a U.S. academic medical center to create a specialized laboratory in China. ....UCLA will oversee management of the laboratory to ensure that its operations meet international standards for quality, and CTI will provide capital funding and marketing expertise.
This deal between UCLA and Centre strikes me as unusual in a couple of ways. First, the new lab will focus on clinical trials as well as molecular and genetic testing. These are areas in which an academic lab like UCLA would have special expertise of great value in the Chinese market. China is now one of the major sites for clinical trials, not only because of the reduced cost of managing trials there but also because the Chinese market for prescription drugs is becoming one of the largest in the world (see: Clinical Trials Increasingly Move Offshore, Many to China; China's pharmaceutical industry -- Poised for the giant leap -- pdf). Second, Centre Testing International (CTI) is a testing, inspection, certification, and consulting firm, not a hospital or clinical laboratory. CTI is providing capital and marketing expertise for the project, but UCLA will oversee management of the laboratory entirely. This is a different scenario from an American academic facility providing expertise on a case-by-case or referral basis to a Chinese lab or hospital.