“The digital revolution needs a trust revolution. Huge shifts are occurring as the world moves towards comprehensive information sharing via social media, cloud computing and big data. Systems of record (such as email) have become systems of engagement (such as social media) and are now moving towards systems of intelligence (data analytics). However, this progress cannot occur unless customers trust how their data is used. The challenge: more than 90% of consumers feel they have lost control of their data.”
Vendors and apps often say that you can always opt out. However, most people would prefer the choice to opt in. If technology wants to build trust, opt-in will need to be the model.
At the WEF Annual Meeting, panelists called for a set of universal data protection principles.
“We all have to step up to another level of transparency, especially the vendors. So whether you are an enterprise vendor or a consumer vendor, we all need to open up a lot more to be able to say exactly where is the data, what’s going on with the data, who has the data, and if there’s a problem with the data – a security problem or some other issue with the data – immediate disclosure, complete and total transparency. No secrets. Because only through that transparency are we going to get to a higher level of trust. That is not where we are today.”

“We’re the enterprise cloud. Our customers are the GEs, the Philips, the BMWs, it’s their data. We can’t do anything without our customers saying what we can do. It’s their data. They tell us where they want it, how they want to use it, what applications are using it. We can’t see it, the data is black to us, it’s encrypted. But that very much is a model to where the consumer companies are going to have to go. Enterprise companies can’t do anything without their customers saying it’s okay. That’s our agreement with our customers that we sign with them. In the consumer world, you don’t know what’s going on, and that is going to have to change. Total disclosure is critical.”
“Trust is about weighing trade-offs – how much privacy do I have, how secure do I feel, and what benefits do I get in exchange for that? You need to afford the individual choice and control. Users own their data. They should be able to examine it, take it with them, bring it to other sites, bring it to other vendors that they trust more. Basically, have a system and a market that helps people make these trade-offs and these decisions. But they should have control over how they use the system, or whether they use the system at all. People have trouble making some of these trade-offs because the vendors are not being transparent enough, not providing enough controls and choice.”
Tim Berners-Lee said that at MIT they are working on a new architecture for how we store data, and proposed “Beneficent Apps.”
Is what I am doing beneficent? Basically, is it good for users? Suppose we have a brand: this is a beneficent app. That means while I am writing the app, you are going to pay me for the app, and I am going to think about what you want. That’s the business model we are going to see.
The moderator of the WEF panel, Nick Gowing, said that Terms and Conditions are not the small print: “Terms and Conditions? No, that’s the Big Print.”
Terms of service and privacy policies may not identify what third parties can do with data. So even if you trust an app or service, you may not know what a third party can do with your data. This will become increasingly important with the growth in consumer health data that is not necessarily patient data. In a world of convergence, the Internet of Things, wearable technologies and integrated health app platforms, we need to build with the consent of the user.
Consent means, we won’t use your data for any other purpose, unless you approve it.
A recent blog post analyzed the opinions of the JASON Report Part II with regard to CDA. The review of CDA was lukewarm at best. However, the report did spend a significant amount of time on future possibilities, and the main focus of those possibilities was HL7 FHIR.
FHIR was discussed extensively in the report because JASON thought it lends itself well to the health IT vision which was stated as:
Focus on the health of individuals rather than the care of individuals.
Key to this vision is the establishment of a robust health data infrastructure that could also be used to enable a Learning Health System. But one major impediment that remains is the critical need for open APIs for EHR connectivity and to stimulate entrepreneurial ideas. One solution to this impediment is seen as the FHIR standard, which JASON sees as a “significant improvement over CDA.”
The JASON report describes CDA as a container for information. The problem with the container is that it is hard to sort out all the data in the container into usable chunks. FHIR solves this by organizing the data into smaller usable chunks called resources. These resources standardize the exchange of information as modular components.
Resources contain basic pieces of information and can be extended to fulfill specialized requirements. Resources can also be bundled together to satisfy the same messaging and document workflows that the health IT industry uses today. In a previous post, I detailed the interoperability paradigms of FHIR, including REST, messaging, documents, and services. Examples of resources include Patient, Medication, and CarePlan to name a few. Like CDA, each resource has a human readable element as well as coded entries.
Because these resources are simple in structure and clearly defined, they are easy to parse and to extract data from; failing that, it is always possible to extract the human-readable portion. The resources, which can be encoded in XML or JSON (not to be confused with JASON, the organization writing the report), are lightweight and easily adaptable to web applications, something that has not existed in health IT to this point.
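The resource structure JASON praises is easy to see in miniature. Below is a sketch of a FHIR Patient resource built as a Python dict and serialized to JSON; the field names follow the published Patient resource, but the values and the trimmed-down shape are purely illustrative:

```python
import json

# A minimal, illustrative FHIR Patient resource. A real resource would
# carry many more elements, extensions, and a proper XHTML narrative.
patient = {
    "resourceType": "Patient",
    "id": "example",
    # Human-readable narrative, analogous to CDA's readable portion
    "text": {
        "status": "generated",
        "div": "<div>Jane Doe, born 1970-01-01</div>",
    },
    # Coded, machine-readable entries
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "gender": "female",
    "birthDate": "1970-01-01",
}

# FHIR resources serialize directly to lightweight JSON for exchange
wire_format = json.dumps(patient)
print(wire_format[:60])
```

Because each resource is a small, self-describing chunk, a consumer can pull out just the fields it needs rather than sifting through a monolithic document container.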
According to the report, of even greater importance than the lightweight and clearly defined resources is FHIR’s support for representational state transfer (REST). The report lists several design features that make REST such a good choice:
With REST in place as a paradigm for interoperability, along with the simple modular structure of resources, JASON believes that FHIR sets the stage for a major shift in the way healthcare data is exchanged, making data more readily available when and where it is needed to support the future vision of healthcare.
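The RESTful paradigm boils down to predictable, resource-oriented URLs, so any HTTP client can read or search for data. A minimal sketch, assuming a hypothetical server base URL (`https://ehr.example.org/fhir` is not a real endpoint); the `[base]/[type]/[id]` read pattern and `?param=value` search pattern follow FHIR’s REST conventions:

```python
# Sketch of FHIR's RESTful conventions: every resource type is
# addressable at a predictable URL, so a plain HTTP GET retrieves it.
BASE = "https://ehr.example.org/fhir"  # hypothetical server

def read_url(resource_type: str, resource_id: str) -> str:
    """Direct read: GET [base]/[type]/[id]"""
    return f"{BASE}/{resource_type}/{resource_id}"

def search_url(resource_type: str, **params: str) -> str:
    """Search: GET [base]/[type]?param=value&..."""
    query = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
    return f"{BASE}/{resource_type}?{query}"

print(read_url("Patient", "123"))
print(search_url("Patient", name="doe"))
```

The appeal for web developers is that nothing here is healthcare-specific: the same tooling used for any modern web API applies directly to clinical data.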
As Colorado runs its own exchange, and has had what most consider a successful rollout, we’ll discuss what is next and how the exchange works to improve the long-term health of the people of Colorado. In this chat we discuss choice architectures and how to build an exchange that is really, truly consumer-centric – a great vision for health in any state, and I’m glad to see it emerging here in Colorado.
LK: Have you looked into behavioral economics and what are called choice architectures like what they describe in Nudge? Nudge has a pretty long section on creating a framework for effective decisions based on the goals of the user.
PD: Absolutely. Our marketplace solution is a good traditional transactional system, but it’s not been designed as a true engagement platform, utilizing choice/behavioral best practices, so we’ll likely need to append our architecture with some niche solutions. These could come from the startup community and non-traditional sources of innovation in the local community, and that’s very exciting.
LK: You and I have talked about the opportunity for the exchange to be more of a platform, presumably with APIs that would allow outside developers to come in and build new solutions and applications using data supplied via the API combined with other outside sources. What can you share with us about that?
PD: We are implementing an API into various parts of our marketplace, hopefully in the next year or so.
Digital engagement is very important to us. We are going to move forward with a hackathon so that we can engage the local digital health community to bring innovative new ideas that could be leveraged in the long term to create an engaging, transparent experience. As CTO, however, I have to balance being innovative against having an enterprise-scalable architecture. Anything that we put into production has to be robust, and it has to scale well. We have recently engaged with a startup, CodeBaby, who are based here in Colorado Springs. They helped us go live today with Kyla, our avatar, who helps people navigate our website. For now this is limited, but we hope to integrate this further into our key, core portal marketplace screens and into our streamlined eligibility application.
LK: That’s great to see, and I can’t wait to see what comes next. I think that this kind of opportunity will be very exciting for entrepreneurs because health care is something that literally everyone has a stake in, and it’s great to have these kinds of opportunities in Colorado to get more people involved in improving it, with code.
PD: Denver is a really exciting place to be in the development of new health technologies and new innovations given the work that Mike Biselli is doing (creating Stride, an emerging digital health campus with some big soon-to-be-named digital health tenants) to establish Denver as a hub.
LK: Yes, the Prime Health Collaborative and Stride and Health 2.0 Denver do seem to have started something special in the community here. It seems like a great fit because people do come here to be active, and the active, consumer-centricity has started to show with the startups that have formed here. It’s a great confluence of forces around digital health and consumer-focused solutions.
So let’s talk a little about what makes this environment unique and how we’re going to sustain it. Connect for Health Colorado is a non-profit that will need to be self-sustaining. What are the opportunities for extending the business model?
PD: We do need to be self-sustaining in January of 2015, and we have a plan to do so based on a broad market assessment and our carrier-fee billing for plans that we offer on the marketplace. In the future, however, there may also be opportunities for monetizing our (anonymized) information assets and our technologies, perhaps to other, newer exchanges, thereby funneling additional resources back into the exchange to support our ongoing vision and mission.
LK: What improvements are you looking to roll out in the near future?
PD: In addition to what we’ve already discussed, a key focus area for us will be the utilization of user preferences to identify the important decision-making criteria for individuals.
We’re also putting in an out-of-pocket calculator so people can understand what kind of plan they should choose given their predicted healthcare expenditures for the year.
We do have a provider search tool, so people can see which providers are in-network for individual plans. However, there are opportunities to make these searches broader and more inclusive, with real-time information on which providers are taking new patients and the exact services they are providing. For example, someone could find a child ABA (Applied Behavioral Analysis) provider in Denver who also has current openings within their practice. That’s just not possible using the provider search tools in use today.
We have recently gone live with a formulary tool to help people find out which medications are covered by individual plans. In the future I would like to see the development of richer decision-support tools around the formulary, linking in efficacy and safety information for particular drugs given the genetic predisposition of the individual. Quality ratings for plans, carriers and providers are also areas that exchanges are looking to move into in the future: Amazon-like, consumer-driven payer/provider ratings. The ACA has driven a number of initiatives to introduce more transparency in the marketplace. We’ve discussed a little about the All Payer Claims Database, or APCD, here in Colorado, which was driven by the ACA. Transparency and quality metrics are an area CMS will be providing guidance on in 2016. The establishment of the health insurance exchanges is, in and of itself, a broad move toward more transparency in the marketplace, by creating a common benefit package for qualified health plans so that it’s easier for the consumer to compare plans, apples to apples. CMS and the ACA are playing a large role in helping to make the healthcare marketplace more transparent.
LK: Despite the hype to the contrary, it really is a free-market approach, and for the free market to work, you need transparency. If we want to fix health care, we need to make all of it more transparent and that creates a lot of opportunity for health IT that can facilitate that transparency.
PD: Reflecting on the success of PatientsLikeMe, which builds communities of patients to share information, there’s no reason we couldn’t explore providing similar communities for people in Colorado.
LK: So more of building a community and helping people connect with others in the state? That sounds great.
PD: In parallel, the development of storefronts for the provision of direct-to-provider services, including telehealth and concierge medicine, seems like a natural future evolution for exchanges.
LK: Seems like there are a lot of niches that could be served, making this more of a communications system between patients, providers, payers and many other stakeholders in the system, as well as a face to the health care system in Colorado.
One other thing I wanted to ask about is, have you received any interesting demographic trends about who is signing up for insurance on the exchange?
PD: Some interesting facts from our last open enrollment period (2013-14): 38% of enrollees were in the 0-34 age range and 35% were in the 35-54 age range, so more than 73% of our consumers were under 55. Only about 26% were in the 55-64 age range. The other surprising thing was that 40% of those who enrolled received no financial assistance (tax credits or cost-sharing reductions), indicating that people are choosing us as a trusted place to shop for their insurance.
LK: I see a lot of entrepreneurs have been getting their insurance through the exchange, so we’ll look forward to seeing how having this kind of access has improved the labor market, as people no longer need to be tied to a traditional job to qualify for affordable insurance.
PD: And, of course, the other big benefit is that people with pre-existing conditions can no longer be discriminated against, and a lot of people have come to us for that reason.
LK: Well thanks for the interview and all the great work that you’ve done. We are fortunate to live in a pretty progressive state in terms of health care and have some really great people working to improve things in Colorado.
I’ve been involved in building many life-critical and mission-critical products over the last 25 years and have found that, finally, cybersecurity is getting the kind of attention it deserves. We’re slowly and steadily moving from “HIPAA Compliance” silliness into a more mature and disciplined professional focus on risk management, continuous risk monitoring, and actual security tasks concentrating on real technical vulnerabilities and proper training of users (instead of just “security theater”). I believe that security, like quality, is an emergent property of the system and its interaction with users and not something you can buy and bolt on. I’m both excited and pleased to see a number of healthcare focused cybersecurity experts, like Kamal Govindaswamy from RisknCompliance Consulting Group, preaching similar proactive and holistic guidance around compliance and security. I asked Kamal a simple question – if cybersecurity is an emergent property of a system, who should be held responsible/accountable for it? Here’s what Kamal said, and it’s sage advice worth following:
Information Security in general has historically been seen as something that the organization’s CISO (or equivalent) is responsible for. In reality, the Information Security department often doesn’t have the resources or the ability (regardless of resources) to be the owner of, or be ultimately “accountable” or “responsible” for, information security. In almost all cases, the CISO can and must be an advisor to business and technology leaders and management in the organization. The CISO could also operate, manage, or oversee certain behind-the-scenes, security-specific technologies.
If your CISO doesn’t “own” Information Security in your organization, who should?
At the end of the day, everyone has a role to play in Information Security. However, I think the HealthIT managers and leaders in particular are critical to making security programs effective in healthcare organizations today.
Let me explain…
Of all the problems we have with security these days, I think the biggest stumbling block often has to do with not having an accurate inventory of the data we need to protect and defining ownership and accountability for protection. This problem is certainly not unique to Healthcare. No amount of technology investments or sophistication can solve this problem as it is a people and process problem more than anything else.
Healthcare is unfortunately in an unenviable position in this regard. Before the Meaningful Use program, which has led to rapid adoption of EHRs over the last five years, many healthcare organizations didn’t necessarily have standard methods or technologies for collecting, processing or storing data. As a result, you will often see PHI or other sensitive information in all kinds of places that no one knows about any longer, let alone “owns”: network file shares, emails, a legacy application or database that is no longer used, etc. The fact that HealthIT in general has been overstretched over the last five years with implementation of EHRs and other programs hasn’t helped matters either.
In my opinion and experience, the average Healthcare organization is nowhere close to solving the crux of the problem with security programs – which is to ensure ownership, accountability and real effectiveness or efficiencies.
Most of us in the security profession have long talked about the critical need for “the business” to take ownership among business and technology leaders. For the most part, however, I think this remains an elusive goal for many organizations. This is a serious problem because we can’t hope to have effective security programs or efficiencies without ownership and accountability.
So, how do we solve this problem in Healthcare? I think the answer lies in HealthIT leadership taking point on both ownership and accountability.
HealthIT personnel plan, design and build the systems that collect, migrate, process and store data; they interact with clinical and business leadership and stakeholders to formulate strategies, gather requirements and set expectations; and they are ultimately responsible for delivering on them. Who better than HealthIT leaders and managers to own and be accountable for safeguarding the data? Right?
So, let’s stop saying that we need “the business” to take ownership. Instead, I think it makes much more pragmatic sense to focus on assigning ownership and accountability to HealthIT leadership.
I present below a few sample mechanics of how we could do this:
I hope these examples from Kamal illustrate how HealthIT can take effective ownership of, and accountability for, security.
Drop us some comments if you agree but especially if you don’t.
John Lynn, prolific blogger and health IT media magnate, and I are teaming up again for the second year to produce and deliver a marketing conference focused on helping digital health, health IT, and medical device innovators. We’re going to be providing actionable advice and specific techniques you can use to cut through the noise when trying to market healthcare and medical tech products to physicians, hospitals, health systems, ACOs, patients, and similar customers. Called The Healthcare IT Marketing Conference, last year’s event covered very important subjects by some of the world’s best experts on those topics and we’ll continue the tradition again in 2015.
Learn the difference between Marketing, Advertising, PR, and Branding
Everyone tells small companies that they need to “do marketing” but that’s really hard to do so I started with a quick visual to explain what it means. It comes from Marty Neumeier on pages 24 and 25 of ZAG by way of the Brand Autopsy Blog (which I highly recommend reading) and illustrates the differences between Marketing, Advertising, PR, and Branding. It’s a wonderful visual and clearly shows that small companies should focus on marketing and free PR, shoot for branding and probably eschew advertising until they have enough money. Our expert speakers at HITMC know the difference and will teach you how to make sure you’re not taking the wrong steps.
Learn how to conduct appropriate market research
Lots of companies (even innovative ones) don’t do basic market research, so we will cover:
Learn about the different kinds of Business Models to consider
Learn about major healthcare industry fallacies
Selling to the healthcare community is very hard and there are many myths that our conference will dispel:
Learn how to align the Payers, Beneficiaries, and Users (PBU) of your Health IT or MedTech product
There are three distinct groups you’re marketing and selling your products to:
I call this the “PBU alignment” problem. In a complex environment like healthcare, the three groups are often not the same — if you can find a market in which the payers, the beneficiaries, and the users are all the same then your sales job is easy. However, that’s commonly not the case. Let’s take a look at the typical example of a complex product like an electronic medical records (EMR) software package in the era of ARRA, HITECH, and meaningful use (MU). The “payer” may ultimately be government reimbursements through Medicare, the “beneficiaries” are the healthcare insurance firms and the government agencies that need the MU data, and the “users” are the doctors and staff at physicians’ offices and hospitals. Why has it taken decades for EMRs to be sold to just a tiny fraction of the total industry? Because PBU alignment hasn’t been reached: until the users, beneficiaries, and payers of the products all understand the value and are willing to work together to achieve a goal, it will be tough.
Join us at the conference to talk with experts about the PBU lesson and get advice for your product. Figure out the PBU alignment problem, see how you’ll sell to each of the groups, and make the right arguments: do it right and you’ll make money. Ignore the complexities of the PBU and you’ll be languishing.
Go home with many tips and tricks:
Earlier this year NueMD created a nice looking Meaningful Use Infographic — asking the question whether MU was helping or hurting EHR Adoption. I loved the summary but I wanted to dig in a little further so I asked Dr. William Rusnak, a resident physician in radiology and a healthcare IT writer for NueMD, to tell us what that infographic meant for innovators and folks building solutions. Here’s what Dr. Rusnak said:
When the Centers for Medicare and Medicaid Services (CMS) launched their Electronic Health Records (EHR) Incentive Programs, coined “Meaningful Use” (MU) back in January 2011, the main goal was to reward healthcare practitioners and administrators for adopting EHRs and increasing efficiency within their practice. NueMD, a medical billing software company, decided to take a closer look at the effectiveness of this program. They compiled research from the Department of Health and Human Services (HHS), CMS, and the American College of Physicians (ACP) looking to identify adoption trends and determine potential obstacles to successful implementation.
The results are quite interesting and have shed some light upon the massive opportunity for technical breakthroughs in healthcare. If tech innovators want to join the movement, they should be continually searching for processes in medicine that still involve some sort of manual transmission of information. Talk to your friends that are nurses, doctors, office managers, billers, or administrators. You would be surprised simply by the amount of information still being written on papers and stuffed in pockets throughout the day!
Adoption, attestation, and a younger generation of physicians
According to a survey of more than 1,200 physicians, EHR adoption is certainly taking place, but when it comes to officially attesting to Meaningful Use, the numbers suggest there’s still room for improvement. Practices with more than 50 physicians had the highest rate of EHR adoption at 85%, with 62% attesting to MU. The big disparity exists among small practices (fewer than 10 providers), in which half have implemented EHR technology while only 25% have attested to MU.
This will improve, though. With younger physicians beginning to practice and take on leadership positions, it is very likely that adoption rates will increase substantially over the next decade. In the past, one of the biggest challenges EHR vendors have faced is working with a user base that wasn’t keen on technology. Soon, however, the majority of practicing physicians will be of the generation that was introduced to technology much earlier in life. Additionally, Medical Economics states that even many older physicians have become comfortable using technology in their practices, claiming that this age group has begun to see some of the highest rates of EHR adoption. Thus, the market, not only for EHRs but for nearly any kind of health technology, is just about ready to surge.
User satisfaction and efficiency, or lack thereof
Although this data suggests EHR adoption is on the rise, providers’ feelings about implementing and using EHRs show another trend. Between March 2010 and December 2012, user satisfaction decreased 13 percentage points, from 61% to 48%, while dissatisfaction rose 14 points, from 23% to 37%. What’s to blame? Of those surveyed, 67% cited system functionality as their primary reason for switching EHR vendors.
One could look at this on the surface and think that since satisfaction is decreasing, healthcare information technology (HIT) is a struggling industry. But let’s not kid ourselves. HIT is here to stay, and most of the gripes and complaints about EHRs are typical for any developing technology. If anything, these data suggest that within this storm of inefficiencies exist ample opportunities for improvement. Developers should take this into consideration for future healthcare software. More emphasis needs to be placed on the true effectiveness of the software. This problem could be solved rather quickly with focus groups consisting of healthcare providers. Let them pick apart your software and find bottlenecks, setbacks, and other negative features. In the end, the electronic version of any process must absolutely be less time-consuming than the old-fashioned paper method.
Another very common complaint about many EHR systems is that the usability is far from intuitive. This could be the lowest-hanging fruit in the tree of improvements to this kind of software. Although each user will differ in education — from patient to nurse to physician — all of them should be able to easily access any and all of the health information stored from patient encounters. Innovators can overcome this obstacle by making a significant effort to create simple, user-friendly interfaces. Again, use focus groups or chat with current clients and find out where users struggle with simple tasks. Are there too many unused features on the “home” page? Is there a particular action that users frequently perform but must search through several menu options to reach?
The data entry dilemma
The next problem, data entry, is rather complicated. Currently, physicians and other providers use either dictation software or typing to get information into the EHR. Streamlining this process even further would decrease the time needed for documentation, leaving more time for the patient interview. Innovators should be looking to design alternative ways to input information into EHRs.
Since most devices now have voice recognition, an app that allowed physicians to quickly record the patient interview and then review and submit it into any EHR would be an amazing product. It would be even more impressive if the app could create custom documents and help avoid repetition. For example, a physician could record physical exam findings as s/he speaks them during the exam, eliminating some documentation after the interview.
In the future, similar apps for wearables will be even more helpful. Imagine devices, such as otoscopes, thermometers, blood pressure cuffs, and stethoscopes recording data directly into the EHR as you use them. This reality is not too far off and any software that facilitates this data collection is likely to thrive.
Government intervention: Does it help or hurt?
Let’s get back to the question at hand: is the government’s intervention helping or hurting? Unfortunately, the positive effects of the incentives seem to have plateaued, given the lower number of attestations in 2013. Furthermore, in a few rare instances, they have indirectly led some healthcare leaders to commit fraud. A hospital CFO in Texas helped his hospital receive $800,000 in MU incentives even though the system barely used its EHR; he was also reported to have committed identity fraud in order to receive MU incentives. Additionally, on the innovation side of things, much of the funding, in the form of grants, has run out, leaving many of the HIT companies that received them struggling to sustain themselves.
There are some good points. MU initiated the transition to EHRs for both vendors and providers, spurring a surge of development in healthcare. In the process, however, providers were given software that was quickly designed and lacked key features. Therein lies the opportunity. Innovators now have customers with large demand for features such as usability, interoperability between software packages, and mobile implementation. Even though the EHR space in particular is crowded, there is still room for companies to create better patient portals, educational apps, analytics apps for wearables, and additional software that can be integrated into existing EHRs. And as for the drought of government funds, venture funding for healthcare start-ups and companies is still plentiful.
Bad news can be good news
Overall, this data should be a wake-up call to everyone in the industry. Hospitals and smaller practices are struggling with the transition to a completely electronic system. Not to mention, they are unable to achieve true interoperability – open communication channels between everyone involved in patient care. However, this massive amount of problems is really a gold mine for HIT entrepreneurs. My advice to these innovators in the industry is to start connecting with physicians (or any other healthcare professional) willing to provide constructive input. Being that kind of doctor myself, I can tell you that I want nothing more than for developers to collaborate with those of us on the front lines of patient care. It’s only going to result in better software and devices.
“Large collections of electronic patient records have long provided abundant, but under-explored information on the real-world use of medicines. But when used properly these records can provide longitudinal observational data which is perfect for data mining,” Duan said. “Although such records are maintained for patient administration, they could provide a broad range of clinical information for data analysis. A growing interest has been drug safety.”
In this paper, the researchers proposed two novel algorithms – a likelihood ratio model and a Bayesian network model – for adverse drug effect discovery. Although each performs comparably to the state-of-the-art algorithm, the Bayesian confidence propagation neural network, the researchers report that combining the three yields better, more diverse results.
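As a rough illustration of the disproportionality analysis this family of methods builds on (not the authors' actual algorithms), a signal score can be computed from a 2×2 contingency table of drug/event co-occurrence counts. The counts below are invented for illustration:

```python
from math import log

def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 drug/event table.
    a: reports with drug and event, b: drug without event,
    c: event without drug, d: neither drug nor event."""
    return (a / (a + b)) / (c / (c + d))

def log_likelihood_ratio(a, b, c, d):
    """Log-likelihood ratio comparing the event rate among exposed vs.
    unexposed patients (simple binomial model with plug-in MLEs)."""
    n1, n0 = a + b, c + d
    p1, p0 = a / n1, c / n0
    p = (a + c) / (n1 + n0)  # pooled event rate under the null hypothesis
    def ll(k, n, q):
        return k * log(q) + (n - k) * log(1 - q)
    return ll(a, n1, p1) + ll(c, n0, p0) - ll(a, n1, p) - ll(c, n0, p)

# Invented counts: 40 of 1,000 exposed patients had the event,
# vs. 100 of 10,000 unexposed patients.
print(prr(40, 960, 100, 9900))                 # ratio > 1 suggests a signal
print(log_likelihood_ratio(40, 960, 100, 9900))  # larger = stronger evidence
```

A PRR well above 1 combined with a large likelihood ratio flags a drug/event pair for closer review; real implementations add shrinkage and multiple-testing corrections on top of scores like these.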
I saw this a few weeks ago, and while I haven't had the time to delve deep into the details of this particular advance, it did at least give me more reason for hope with respect to the big picture of which it is a part.
It brought to mind the controversy over Vioxx starting a dozen or so years ago, documented in a 2004 article in the Cleveland Clinic Journal of Medicine. Vioxx, released in 1999, was a godsend to patients suffering from rheumatoid arthritic pain, but a longitudinal study published in 2000 unexpectedly showed a higher incidence of myocardial infarctions among Vioxx users compared with the former standard-of-care drug, naproxen. Merck, the patent holder, responded that the difference was due to a "protective effect" it attributed to naproxen rather than a causative adverse effect of Vioxx.
One of the sources of empirical evidence that eventually discredited Merck's defense of Vioxx's safety was a pioneering data mining epidemiological study conducted by Graham et al. using the live electronic medical records of 1.4 million Kaiser Permanente of California patients. Their findings were presented first in a poster in 2004 and then in the Lancet in 2005. Two or three other contemporaneous epidemiological studies of smaller, non-overlapping populations showed similar results. A rigorous 18-month prospective study of the efficacy of rofecoxib (Vioxx's generic name) in preventing the recurrence of colon polyps showed an "unanticipated" significant increase in heart attacks among study participants.
Merck's withdrawal of Vioxx was an early victory for Big Data, though it did not win the battle alone. What the controversy did do was demonstrate the power of data mining in live electronic medical records. Graham and his colleagues were able to retrospectively construct what was effectively a clinical trial based on over 2 million patient-years of data. The fact that EMR records are not as rigorously accurate as clinical trial data capture was rendered moot by the huge volume of data analyzed.
Today, the value of Big Data in epidemiology is unquestioned, and the current focus is on developing better analytics and in parallel addressing concerns about patient privacy. The HITECH Act and Obamacare are increasing the rate of electronic biomedical data capture, and improving the utility of such data by requiring the adoption of standardized data structures and controlled vocabularies.
We are witnessing the dawning of an era, and hopefully the start of the transformation of our broken healthcare system into a learning organization.
I believe if we reduce the time between intention and action, it causes a major change in what you can do, period. When you actually get it down to two seconds, it’s a different way of thinking, and that’s powerful. And so I believe, and this is what a lot of people believe in academia right now, that these on-body devices are really the next revolution in computing.
I am convinced that wearable devices, in particular heads-up devices of which Google Glass is an example, will be playing a major role in medical practice in the not-too-distant future. The above quote from Thad Starner describes the leverage point such devices will exploit: the gap that now exists between deciding to make use of a device and being able to carry out the intended action.
Right now it takes me between 15 and 30 seconds to get my iPhone out and do something useful with it. Even in its current primitive form, Google Glass can do in under five seconds at least some of the most common tasks for which I get out my iPhone, such as taking a snapshot or doing a Web search.
Closing the gap between intention and action will open up potential computing modalities that do not currently exist, entirely novel use case scenarios that are difficult even to envision before a critical mass of early adopter experience is achieved.
The Technology Review interview from which I extracted the quote raises some of the potential issues wearable tech needs to address, but the value proposition driving adoption will soon be truly compelling.
I'm adding some drill-down links below.
Practices tended to use few formal mechanisms, such as formal care teams and designated care or case managers, but there was considerable evidence of use of informal team-based care and care coordination nonetheless. It appears that many of these practices achieved the spirit, if not the letter, of the law in terms of key dimensions of PCMH.
One bit of good news about the Patient Centered Medical Home (PCMH) model: here is a study showing that in spite of considerable challenges to PCMH implementation, the transformations it embodies can be and are being implemented even in small primary care practices serving disadvantaged populations.
[Slide: for “cognitive” specialists, the care of the patient revolves around the “granularity” of the narrative – individual attention and focus, the ability to share purposes, and the use of individual differences and idiosyncrasies – versus patients as widgets: dashboards, big data, ICD-10, and the Electronic Medical Record with its templates and “smart sets”. In the Patient – BIG DATA – Doctor relationship, empathy is lost to “noise”. Related themes: life-course (“social”) epidemiology; health care systems in Europe and the United States; health economists and hospital administrators; patients as “units of care”; physicians as “providers”; clinical demand as “throughput”.]
"All patient and care records digital, real time and interoperable by 2020."
"Clinicians in primary, urgent and emergency care, and other key transitions of care contexts will be operating without paper records by 2018."
"Patients have access to their hospital, community, mental health and social care services records by 2018."
"By April 2016, commissioners and providers must publish 'road maps' showing how they will develop interoperable digital records and services by 2020."
Twitter, like the Internet in general, has become a vast source of and resource for health care information. As with other Internet tools, it also has the potential to spread misinformation. In some cases this happens by accident, by people with the best intentions. In other cases it is deliberate, such as when companies promote their products or services through fake accounts they have created.
In order to help determine the credibility of tweets containing health-related content, I suggest using the following checklist (adapted from Rains & Karmikel, 2009):
Ultimately it is up to the individual to determine how to use health information found on Twitter or other Internet sources. For patients, anecdotal or experiential information shared by others with the same illness may be considered very credible, while researchers may find it a less valuable source. Conversely, a researcher may be looking only for tweets that reference peer-reviewed journal articles, whereas patients and their caregivers may have little or no interest in that type of resource.
Rains, S. A., & Karmikel, C. D. (2009). Health information-seeking and perceptions of website credibility: Examining Web-use orientation, message characteristics, and structural features of websites. Computers in Human Behavior, 25(2), 544-553.
The altmetric movement is intended to develop new measures of production and contribution in academia. The following article provides a primer for research scholars on what metrics they should consider collecting when participating in various forms of social media.
If you participate on Twitter you should keep track of the number of tweets you send, how many replies and retweets your tweets receive, and how many @mentions (tweets that include your Twitter handle) you obtain. ThinkUp is an open source application that lets you track these metrics on Twitter as well as other social media tools such as Facebook and Google+. Please read my extensive review of this tool. This service is free.
You should register with a link shortening service such as bit.ly, which will provide you with an API key that you can enter into the applications you use to share links. This gives you one place to keep track of your click-through statistics. Bit.ly records how many times a link you created was clicked, along with the referrer and location of each user. Consider registering your own domain name and using it to shorten your links as a means of branding; you can then use your custom links on electronic copies of your CV or at your own web site, and be informed whenever they are clicked. You should also consider using bit.ly to create the links at your web site, providing feedback on which are used most often. For example, all of the links in this article were created using my custom bit.ly domain. You can also tweet a link to any research study you publish, both to publicize it and to keep track of how many clicks it receives. Bit.ly is a free service.
Another tool to measure your tweets is TweetReach. This service allows you to track the reach of your tweets by Twitter handle or tweet. It provides output in formats that can be saved for use elsewhere (Excel, PDF or the option to print or save your output by link). To use these latter features you must sign up for an account but the service is free.
Buffer is a tool that allows you to schedule your tweets in advance. You can also connect Buffer to your bit.ly account so the links you use are included in your overall analytics. Buffer provides its own click-through counts, though these may differ from what appears in bit.ly. The service is free, with paid upgrade options that provide more detailed analytics.
Google Scholar Citation Profile
You can set up a profile with Google Scholar based on your publication record. The metrics provided by this service include a citation count, h-index and i10-index. When someone searches your name using Google Scholar, your profile will appear at the top, before any of the citations. This provides a quick way to separate your articles from those of someone else with the same name.
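For reference, both indices are simple functions of your per-paper citation counts; a minimal sketch with invented counts:

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citations):
    """Number of papers with at least 10 citations."""
    return sum(1 for cites in citations if cites >= 10)

# Invented citation counts for seven papers
cites = [25, 18, 12, 9, 6, 3, 1]
print(h_index(cites), i10_index(cites))  # prints: 5 3
```

Computing these yourself from a citation export is a useful sanity check on what the profile reports.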
Google Feedburner for RSS feeds
If you maintain your own web site and use RSS feeds to announce new postings, you can also collect statistics on how many times each article is clicked. Feedburner, recently acquired by Google, provides one way to measure this: you enter your RSS feed URL and a report is generated, which can be saved in CSV format.
Journal article download statistics
Many journals provide statistics on the number of downloads of articles. Keep track of those associated with your publication by visiting the site. For example, BioMed Central (BMC) maintains an access count of the last 30 days, one year and all time for each of your publications.
Other means of contributing to the knowledge base in your field include participating in web-based forums or sites such as Quora. Quora provides threaded discussions on topics and allows participants both to pose questions and to respond to them. Other users vote on your responses and points are accrued; if you want another user to answer your question, you must “spend” some of your points. Providing a link to your public Quora profile on your CV demonstrates another form of contribution to your field.
Paper.li is a free service that curates content and renders it in a newspaper-like web format. The focus of my Paper.li is the use of technology in Canadian healthcare, and I have also created a page that appears at my web site. Metrics are available on the number of times your paper has been shared via Facebook, Twitter, Google+ and LinkedIn.
Twylah is similar to paper.li in that it takes content and displays it in a newspaper format except it uses your Twitter feed. There is an option to create a personalized page. I use tweets.lauraogrady.ca. I also have a Twylah widget at my web site that shows my trending tweets in a condensed magazine layout. It appears in the side bar. This free service does not yet provide metrics but can help increase your tweet reach. If you create a custom link for your Twylah page you can keep track of how many people visit it.
Analytics for your web site
Log file analysis
If you maintain your own web site you can use a variety of tools to capture and analyze its use. One of the most popular is Google Analytics. If you use a content management system such as WordPress, many plug-ins will add the tracking code to your pages and produce reports; WordPress also provides built-in analytics through its dashboard.
If you have access to the raw log files you could use a shareware log file program or the open source tool Piwik. These tools will provide summaries about what pages of your site are visited most frequently, what countries the visitors come from, how long visitors remain at your site and what search terms are used to reach your site.
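As a minimal sketch of what such tools do under the hood (assuming the common Apache/Nginx "combined" log format; the sample lines below are invented), you can tally page hits yourself:

```python
import re
from collections import Counter

# Regular expression for the Apache/Nginx "combined" log format
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def top_pages(log_lines, n=5):
    """Count successful (2xx) hits per path and return the n most visited."""
    hits = Counter()
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if m and m.group("status").startswith("2"):
            hits[m.group("path")] += 1
    return hits.most_common(n)

# Invented sample log lines
sample = [
    '1.2.3.4 - - [10/Oct/2012:13:55:36 -0700] "GET /blog/ HTTP/1.1" 200 2326 "http://example.com/" "Mozilla/5.0"',
    '1.2.3.5 - - [10/Oct/2012:13:56:01 -0700] "GET /blog/ HTTP/1.1" 200 2326 "-" "Mozilla/5.0"',
    '1.2.3.6 - - [10/Oct/2012:13:57:12 -0700] "GET /cv HTTP/1.1" 404 512 "-" "Mozilla/5.0"',
]
print(top_pages(sample))  # prints: [('/blog/', 2)]
```

Dedicated packages add the referrer, geography and search-term breakdowns mentioned above, but the core is the same line-by-line parse.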
All of this information should be included in the annual report you prepare for your department and in your tenure application. Doing so will increase awareness of altmetrics and improve the chances that these efforts “count” as contributions to your field.