This is the next post in my series of Do’s and Don’ts of Healthcare IT. As we all know, many of our most important citizens live in rural settings, small cities, the countryside, and other remote areas. These areas have smaller populations and less direct access to vital healthcare resources. In the past 15 years or so we’ve made great strides in remotely accessible healthcare; these offerings, collectively known as telemedicine, provide important clinical care at a distance. Here are some do’s and don’ts of telemedicine:
What do’s and don’ts would you add to a telemedicine strategy? Drop me a comment below.
I recently wrote, in Do’s and Don’ts of hospital health IT, that you shouldn’t make long-term decisions on mobile app platforms like iOS and Android because the mobile world is still quite young and the war between Apple, Microsoft, and Google is nowhere near being resolved. A couple of readers, in the comments section (thanks Anne and DDS), asked me to elaborate on mobile and mHealth strategy for healthcare professionals (HCPs) and hospitals.
A couple of the key points were:
The approach I recommend right now for mobile apps, if you’re developing them yourself, is to stay focused on HTML5 browser-based apps rather than native apps. So, to answer Anne’s and DDS’s question specifically: no, you shouldn’t wait to allow anyone to use mobile apps. But if you’re looking to build your own apps and deploy them widely (beyond simple experiments or pilots), don’t write to iOS, Android, or WP7; instead, use HTML5 frameworks like AppMobi and PhoneGap, which give you almost the same functionality but insulate you from the underlying platform wars. In the end, HTML5 will likely win; it’s cross-platform and quite functional for the most common use cases. If you’re not developing the apps yourself and are using third-party apps, then of course you must support iOS native, Android native, and soon Windows native apps on your network.
So, from a general perspective you should embrace mHealth, but do so in a strategic, not tactical, manner. Here are the most critical questions to answer in an mHealth strategy — it’s not a one-size-fits-all approach:
If there is interest in this topic, I will expand on my list of Do’s and Don’ts — mHealth is a very complex topic and requires a good strategy. Just saying that you allow the use of mobile devices like smartphones in your hospital is not an mHealth strategy.
In case you haven’t seen it, MU attestations data is now available on Data.gov and it includes analyzable vendor statistics.
The data set merges information from the Centers for Medicare and Medicaid Services (CMS) Medicare and Medicaid EHR Incentive Programs attestations with the Office of the National Coordinator for Health IT (ONC) Certified Health IT Products List. This new dataset enables systematic analysis of the distribution of certified EHR vendors and products among providers that have attested to meaningful use within the CMS EHR Incentive Programs. The data set can be analyzed by state, provider type, provider specialty, and practice setting.
The data set does not include dollar amounts or the difficulty of attestation (e.g. how many attempts it took to pass). I’ll try to find out if that data might be available in the future. It’s also unclear whether each row represents a single provider or whether multiple providers were aggregated into a single row.
The dataset is available now on Data.gov at http://www.data.gov/raw/5486 and is worth checking out. The file has already been downloaded over 75 times, so it’s clear some of you know about it; if you’ve done any analysis or posted results, please drop me a note below so that everyone can benefit.
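If you want to explore the data yourself, here’s a minimal sketch of the kind of breakdown I have in mind, using Python and pandas. The column names (“Vendor_Name”, “Provider_Type”, “Business_State_Territory”) are my assumptions about the file’s schema, so check the actual CSV header and adjust.

```python
# Sketch: summarize MU attestations by EHR vendor and by state.
# NOTE: the column names below are assumed, not taken from the
# actual Data.gov file -- inspect the real header and adjust.
import pandas as pd

def summarize_attestations(csv_path):
    df = pd.read_csv(csv_path)

    # Top vendors by number of attestation rows
    by_vendor = df["Vendor_Name"].value_counts().head(10)

    # Attestation counts per state, per provider type
    by_state = (
        df.groupby(["Business_State_Territory", "Provider_Type"])
          .size()
          .rename("attestations")
          .reset_index()
    )
    return by_vendor, by_state
```

Nothing fancy, but a couple of `groupby` calls like this will answer the obvious first questions: which certified vendors dominate, and how attestations are distributed geographically.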
Last year I started a series of “Do’s and Don’ts” in hospital tech by focusing on wireless technologies. Folks asked a lot of questions about do’s and don’ts in other tech areas, so here’s a list of more tips and tricks:
One of the most important activities you can undertake before you begin your EHR implementation journey is to standardize and simplify your processes to prepare for automation. Unlike humans, who can handle variation, computers hate it. Before you begin your software selection process, get help from a practice consultant to reduce the number of appointment types you manage, reduce the number of different forms you use, ensure that your charting categories (“Labs”, “Notes”, etc.) don’t differ per patient type or physician, determine how you will manage medication lists and problem lists across the patient population, and decide how you’ll manage paper in your digital world.
If you spend even just a few hours a week doing the prep-work before you buy any software, you will be better prepared in your selection process. Without some level of standardization your EHR implementation will either fail, be delayed, or have many unhappy users; the more you can standardize and simplify, the more likely you will have a successful outcome. A strong project manager with authority to make decisions will be the difference maker in the simplification process.
To help you with your workflow assessment and standardization efforts, check out The Agency for Healthcare Research and Quality (AHRQ.gov) Workflow Assessment for Health IT Toolkit. Even if you’ve done workflow assessments before, the toolkit is worth checking out.
As most of my regular readers know, I work as a technology strategy advisor for several different government agencies; in that role I get to spend quality time with folks from NIST (the National Institute of Standards and Technology), what I consider one of the government’s most prominent think tanks. They’re doing yeoman’s work trying to get the massive federal government’s different agencies working in common directions, and the technology folks I’ve met seem cognizant of the influence (good and bad) they have; they seem to try to wield that power as carefully as they know how. Since most of you are in the technology industry, albeit specific to healthcare, I recommend that you learn more about NIST and the role it plays – they can make your life easier because of the coordination and consensus-building work they do for us all. I, for one, was thrilled when NIST was picked as the governing body for the MU certification criteria. These guys know what they’re doing and I wish they got more involved in driving healthcare standards.
A few years ago NIST came up with the first drafts of the seminal definitions of Cloud Computing; they ended up setting the stage for communicating complex technical concepts and helping make “Cloud” a household name. After 15 drafts, the 16th and final definition was published as The NIST Definition of Cloud Computing (NIST Special Publication 800-145) in September. It’s worth reading because it’s only a few pages and is understandable by the layperson. No computer science degree is required.
Yesterday I was speaking to a senior executive in the EHR space and we had a great discussion on what healthcare providers are doing in terms of cloud computing and how to communicate these ideas to small practices as well as hospitals. It reminded me of the numerous similar conversations I’ve had with other senior executives we serve in the medical devices and other regulated IT sectors. In almost every conversation I can remember about this topic over the past couple of years, I had to remind people that NIST has already done the hard work and that we can, indeed, rely on them. Most of the time the senior executive was unaware of where the definitions came from so I figured I’d put together this quick advisory.
My strong recommendation to all senior healthcare executives is that we not come up with our own definitions for cloud components – instead, when communicating anything about the cloud we should instruct our customers about NIST’s definition and then tie our product offerings to those definitions. The essential characteristics, deployment models, and service models have already been established and we should use them. When we do that, customers know that we’re not trying to confuse them and that they have an independent way of verifying our cloud offerings as real or vapor.
Below I have copied/pasted the key definitions from NIST 800-145. Imagine how many debates with your clients’ technicians you would avert if you communicated some of the following information first, showed them it was a “standard definition” by handing them a copy of the publication, and then mapped your offerings and discussions to the different areas. Your sales and marketing teams would appreciate the clarity, too.
Note that you do not need to map every offering you have to every definition – just start with the obvious ones and then figure out how to communicate the “gaps”: either as not applicable to your products / services, or as items that will be filled in the future as part of your roadmap. Treat these definitions as canonical but not all-or-nothing – just because your SaaS offering doesn’t fit every essential characteristic doesn’t mean that you’re not “cloud”; it just means partially cloud.
If you’ve got questions about how to map your product offerings, drop me some comments and I’ll assist as best as I can.
Here are the key definitions from NIST 800-145, copied directly from the original source:
Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models.
On-demand self-service. A consumer can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with each service provider.
Broad network access. Capabilities are available over the network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, tablets, laptops, and workstations).
Resource pooling. The provider’s computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to consumer demand. There is a sense of location independence in that the customer generally has no control or knowledge over the exact location of the provided resources but may be able to specify location at a higher level of abstraction (e.g., country, state, or datacenter). Examples of resources include storage, processing, memory, and network bandwidth.
Rapid elasticity. Capabilities can be elastically provisioned and released, in some cases automatically, to scale rapidly outward and inward commensurate with demand. To the consumer, the capabilities available for provisioning often appear to be unlimited and can be appropriated in any quantity at any time.
Measured service. Cloud systems automatically control and optimize resource use by leveraging a metering capability at some level of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can be monitored, controlled, and reported, providing transparency for both the provider and consumer of the utilized service.
Software as a Service (SaaS). The capability provided to the consumer is to use the provider’s applications running on a cloud infrastructure. The applications are accessible from various client devices through either a thin client interface, such as a web browser (e.g., web-based email), or a program interface. The consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, storage, or even individual application capabilities, with the possible exception of limited user-specific application configuration settings.
Platform as a Service (PaaS). The capability provided to the consumer is to deploy onto the cloud infrastructure consumer-created or acquired applications created using programming languages, libraries, services, and tools supported by the provider. The consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, or storage, but has control over the deployed applications and possibly configuration settings for the application-hosting environment.
Infrastructure as a Service (IaaS). The capability provided to the consumer is to provision processing, storage, networks, and other fundamental computing resources where the consumer is able to deploy and run arbitrary software, which can include operating systems and applications. The consumer does not manage or control the underlying cloud infrastructure but has control over operating systems, storage, and deployed applications; and possibly limited control of select networking components (e.g., host firewalls).
Private cloud. The cloud infrastructure is provisioned for exclusive use by a single organization comprising multiple consumers (e.g., business units). It may be owned, managed, and operated by the organization, a third party, or some combination of them, and it may exist on or off premises.
Community cloud. The cloud infrastructure is provisioned for exclusive use by a specific community of consumers from organizations that have shared concerns (e.g., mission, security requirements, policy, and compliance considerations). It may be owned, managed, and operated by one or more of the organizations in the community, a third party, or some combination of them, and it may exist on or off premises.
Public cloud. The cloud infrastructure is provisioned for open use by the general public. It may be owned, managed, and operated by a business, academic, or government organization, or some combination of them. It exists on the premises of the cloud provider.
Hybrid cloud. The cloud infrastructure is a composition of two or more distinct cloud infrastructures (private, community, or public) that remain unique entities, but are bound together by standardized or proprietary technology that enables data and application portability (e.g., cloud bursting for load balancing between clouds).
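To make the mapping exercise concrete, here is a rough sketch of how you might track an offering against the NIST vocabulary as a gap-analysis checklist. The offering name (“Acme EHR Online”) and its coverage flags are invented for illustration; substitute your own product list and honest assessments.

```python
# Sketch: a simple gap-analysis checklist mapping a product offering
# to the NIST 800-145 vocabulary. The offering below ("Acme EHR
# Online") and its coverage flags are hypothetical examples.

ESSENTIAL_CHARACTERISTICS = [
    "On-demand self-service",
    "Broad network access",
    "Resource pooling",
    "Rapid elasticity",
    "Measured service",
]

def gap_report(offering_name, service_model, deployment_model, covered):
    """Return the characteristics an offering does and doesn't cover.

    `covered` is the set of essential characteristics the offering
    meets today; everything else is a gap to explain, either as not
    applicable or as a roadmap item.
    """
    gaps = [c for c in ESSENTIAL_CHARACTERISTICS if c not in covered]
    return {
        "offering": offering_name,
        "service_model": service_model,        # SaaS / PaaS / IaaS
        "deployment_model": deployment_model,  # private/community/public/hybrid
        "covered": sorted(covered),
        "gaps": gaps,
    }

report = gap_report(
    "Acme EHR Online", "SaaS", "Private cloud",
    covered={"Broad network access", "Resource pooling", "Measured service"},
)
```

A one-page table built this way is exactly the artifact to hand a client’s technicians alongside the NIST publication: here is the standard vocabulary, here is where we fit, and here is how we account for the gaps.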
AUSTIN, TX– Oct. 19, 2011–Drummond Group Inc. (DGI), the trusted interoperability test lab, last week submitted to the DEA its e-Prescribing of Controlled Substances (ePCS) Certification Process documentation. The DEA is currently reviewing this ePCS Certification Process for approval. Upon approval, DGI will be providing ePCS certification to healthcare software companies with the capability of e-Prescribing controlled substances.
Since 2005, Drummond Group has been the lead auditor and certification organization for the DEA’s final rule regulations on the Controlled Substance Ordering System (CSOS). CSOS enables drug manufacturers, distributors and pharmacies to electronically automate the order and fulfillment supply chain of controlled substances. Drummond Group also serves as an Authorized Testing and Certification Body (ATCB) under the Health and Human Services’ (HHS) electronic health records (EHR) certification program and has certified more than 450 software applications, including e-Prescribing solutions, since its inception in 2010.
For complete press release, please click here.
Last month, Steven Posnack, director of the Federal Policy Division within the Office of the National Coordinator for Health IT (ONC), wrote a very helpful blog post on fact and fiction related to the ONC certification program. We have recently had many questions about Drummond Group’s involvement in the ONC Permanent Certification Program. Here is our own Q&A session addressing those questions and how the transition affects certification under the current Temporary Program.
Question: What is the difference between the Permanent Certification Program and the Temporary Certification Program? What about ATCBs and ACBs? Is ANSI now involved in certification?
Answer: The Temporary Certification Program and the Permanent Certification Program are ultimately about the governance of the testing and certification program, specifically, the bodies that are testing and certifying, like Drummond Group. Their work and requirements are, in most ways, outside the concern of EHR vendors and HIT users. Meaningful Use measures and ONC certification criteria are completely separate from the Final Rules governing both certification programs.
The requirements within both programs are very similar. The chief difference is the accreditation method. In the Temporary Program, an organization like Drummond Group was required to take comprehensive tests and submit two sets of quality manuals: 1) for testing plans and processes and 2) for our certification processes. These were approved by ONC itself to accredit us as an ONC Authorized Testing and Certification Body (ATCB). In the Permanent Program, ONC no longer acts as the accreditation body for either testing or certification, although it will still oversee the program. Instead, a new ONC-Approved Accreditor (ONC-AA), ANSI, will accredit the certification bodies as ONC-Authorized Certification Bodies (ACBs), and NVLAP, a division of NIST, will run the accreditation program for the testing bodies.
Question: Will Drummond Group be a part of the Permanent Certification Program? Will you also do testing?
Answer: Drummond Group’s intention is to be an ACB (Authorized Certification Body), as well as an NVLAP accredited testing body for EHRs. We are currently working on preparations for approval in the Permanent Program.
Question: When will Drummond Group or others be named as ACBs? Do you have a timeframe?
Answer: We are really hesitant to even speculate on a specific date when ACBs will be open for business given there are many unknowns. Here is what we do know: NIST will be releasing the final testing accreditation requirements for testing agencies around December and will begin processing the applications on Jan. 15, 2012. We have no word from ANSI or ONC on details for applying to become an ACB, nor additional certification body accreditation requirements apart from the core ISO Guide 65.
Also, accreditation is just the first step. Only after you are accredited by ANSI for your certification quality procedures can you submit your application to ONC to be part of the Permanent Certification Program. In the Temporary Program, the process from submitting an application to ONC to official approval as an ATCB took approximately two months.
Question: Will there be new criteria to test and certify in the Permanent Program and will certified EHRs have to return and be recertified with an ACB to remain on the CHPL?
Answer: As stated above, the Temporary and Permanent Program Final Rules are ultimately about the governance of the testing and certification program but not about the criteria which the ATCBs or ACBs will certify. The testing requirements and certification criteria come from ONC separate from anything to do with the current state of the certification program.
Even in the Temporary Program, ONC could revise and update the certification criteria requiring products to be retested and recertified. In fact, they actually did make a revision to the public health surveillance criteria (170.302.l) in an interim rule in October 2010 although it did not require recertification. Eventually, the criteria will be updated when new meaningful use stages are introduced, but that is not connected with the timing or availability of the Permanent Program. Also, certified EHRs will not need to be recertified by an ACB simply because the ATCBs are dissolved with the closing of the Temporary Program.
Question: Once we are in the Permanent Program and new criteria are introduced, such as with Meaningful Use Stage 2, will certified EHRs have to retest everything previously tested and certified in the Temporary Program?
Answer: On retesting previously certified criteria, the Permanent Program Final Rule does make a reference to allowing “gap” certification of new or revised criteria added in later stages, versus fully recertifying and retesting all criteria, including those unchanged from previous ONC rulings. However, it ultimately leaves this decision to the ACB. We (Drummond Group or any other ATCB) cannot speak definitively on this until we are an ACB and receive further guidance from ONC and possibly ANSI, the selected ONC-AA that will accredit us.
Question: In his blog, Steven Posnack stated that current CHPL certification will not expire. However, the certification seals issued in the Temporary Program make reference to 2011/2012. What does that mean?
Answer: Those 2011/2012 Certification Seal dates come from the ONC Final Rule on the Temporary Program, but they are not explicit expirations. Rather, they reflect what was anticipated as the timeline of the criteria and associated Stage 1 Meaningful Use measures.
It ultimately depends upon the current module criteria requirements. If they are not updated, then the certification is still valid.
As a testing company with more than a decade’s worth of experience, we’ve certified many software products across multiple industries. In the process, we’ve worked with quite a few certification and standards-setting bodies.
For the past several months, we’ve been busy in the healthcare industry, working under the auspices of the government’s electronic health records incentive program. As a matter of fact, since becoming an Authorized Testing and Certification Body (ATCB), we’ve certified more than 300 electronic health records systems using a testing script developed by the National Institute of Standards and Technology (NIST) for the Office of the National Coordinator for Health Information Technology (ONC).
Although we’ve tested systems for a variety of standards-setting groups, we weren’t quite sure what to expect when we started working as a testing company under the purview of the Health Information Technology for Economic and Clinical Health (HITECH) legislation – a behemoth government program to say the least.
Much to our delight: We’re really happy with the way things are going so far.
Why? First, we think the leaders in the government program got it right. The powers-that-be managed to develop clear technical requirements without imposing restrictive implementation methods, making it possible to ensure that certified EHRs all perform at a certain level, but also leaving enough flexibility for EHRs to meet specific user needs and for developers to continue to innovate. It’s a tricky balance but one that the ONC seems to have mastered quite well.
As a result, there’s plenty of room for developers to come up with products that push the envelope with new features and functions or to tailor systems to meet the very specific needs of certain specialists such as OB-GYNs, chiropractors or plastic surgeons. At the same time, end-users can rest easy, knowing that software systems that have been certified actually live up to the specifications that will make it possible for them to meet the government’s meaningful use requirements and, subsequently, qualify for their share of the federal government’s incentive funds.
We’re also happy with our work in the program. We feel that Drummond Group has been able to add value to the overall process by infusing a healthy dose of neutrality into the testing and certification process. Steadfastly maintaining neutrality has, after all, been a concept that we have built our company on since the beginning.
Although happy to be a member of the healthcare IT community, we purposely shy away from becoming deeply involved in professional coalitions or advocacy efforts. As such, when we test a product, we test a product. We don’t have to worry about the fact that an industry coalition spoke out against one of the ONC test requirements; our neutrality keeps us apart from those debates.
In essence, we make sure we don’t attach to anything else, so that the only thing we are attached to is testing. It’s a singular focus that serves software developers and the overall mission of the HITECH program well.
It’s complicated. That’s how many hospital leaders describe their electronic health records initiatives. These hospitals – instead of having a neat all-in-one EHR solution driving their efforts – have moved toward electronic records by cobbling together a variety of off-the-shelf, customized and possibly even home-grown solutions.
If you work at one of these facilities, you are probably all too familiar with the complications. And, when it comes time to get the stamp of approval needed to qualify for incentive funds, you probably don’t know exactly where to start. No worries. Drummond Group is ready to help. We’re taking applications from hospitals that want to achieve ONC-ATCB 2011/2012 certification for their unique EHR solutions. We stand ready to help hospitals in this situation move forward by testing their solutions to gain the certification required to move toward meaningful use.
Best of all, though, we are gearing up to truly offer more than a cursory certification. With more than 10 years of software testing experience, we have the interoperability know-how that you can tap into to truly get your miscellaneous solutions working together as one unified EHR. Having worked in a variety of complicated industries, we have encountered many difficult software and integration testing scenarios – and have had to evaluate a wide variety of software configurations from the simplest, out-of-the-box applications to complicated solutions derived from a variety of cobbled-together software applications.
What’s more, we are truly committed to meeting the specific needs of healthcare providers. We are presently answering inquiries from hospitals and working on setting up certifications for them. And, while we are ready to start working with you today to qualify your customized or home-grown system for certification, we plan on rolling all of our know-how up into a formal service offering early next year.
Remember, though, to achieve ONC-ATCB 2011/2012 certification, EHR software has to be tested based on the official criteria as defined by ONC. Authorized Testing and Certification Bodies (ATCBs) test and certify the software and then HHS approves and lists these certifications on the Certified Health IT Products Listing (CHPL). Customized programs for hospitals or specific specialties – while designed to help meet the unique needs of various classes of HIT vendors – are not required for the certification that will enable your organization to meet meaningful use incentives.
We’re often asked why we jumped into the healthcare industry. Our answer: It’s simple Business 101 logic. We saw a need and we knew we could fill it.
First, we started hearing from healthcare information technology vendors about the need for software testing. We investigated and discovered that the Department of Health and Human Services (HHS) Office of the National Coordinator (ONC) for Health Information Technology was recruiting organizations to serve as Authorized Testing and Certification Bodies (ATCBs) to provide the stamp of approval to electronic health records (EHRs) that would be used by healthcare provider organizations as they seek to qualify for incentive funds under the American Recovery and Reinvestment Act.
All of this opportunity, of course, piqued our interest. Realizing that we really had something special to offer the healthcare industry, however, made us take the plunge.
Most importantly, we felt that we could offer the efficient and effective software testing that the industry needs – as vendors are scrambling to meet the needs of providers with officially certified EHR solutions. Because we have tested complex software products in a plethora of industries for more than a decade, we have what it takes to get the job done. As a result, vendors can quickly and affordably get their EHR solutions listed on the Certified Health IT Products Listing (CHPL) – and providers can use the solutions to qualify for incentive monies.
Our vision extends beyond the short-term, though. We also realized that we could offer healthcare information technology companies the testing services that they will need as meaningful use requirements evolve—and become more complex. Because of our extensive testing experience, we have become experts in interoperability and privacy issues. Also, we feel that we will be able to help healthcare IT vendors with these issues as meaningful use evolves.
Drummond Group’s ONC-ATCB Certified EHR products have now been posted on the HHS Certified Health IT Products Listing.
Problems viewing the HHS Certified Health IT Products Listing?
Go to the home page of healthit.hhs.gov and look under “What’s New”
Questions? Shoot us an email at EHR@drummondgroup.com
Peeps – sorry for the radio silence. Will make it up to you with this: http://ht.ly/4EMV8
Hew (hyū) v.
“In every block of marble I see a statue as plain as though it stood before me, shaped and perfect in attitude and action. I have only to hew away the rough walls that imprison the lovely apparition to reveal it to the other eyes as mine see it.” – Michelangelo
An unfinished Michelangelo sculpture.
I just re-read this quote – I think it is a powerful metaphor for any innovator out there trying to change the world. Innovators are the ones who can see the fully defined, fully articulated, and fully functional end product within the building blocks that others pass off as mere landscape material. It is this gift of vision – this ability to “see” what others cannot – combined with the doggedness to stick to the mindless chipping away until others can see it well enough to give you the tools you need to finish it off, that sets innovators apart.
We are privileged to be working on a HUGE project right now with a highly innovative company that sees the value of what we are doing and wants to be a part of changing health care. It has been fun to work with them to begin the process of “hewing” away and to literally see the game-changing product we have always envisioned begin to take shape from the dust, the chipped stone, the dirty hands, and the bleeding fingers. The process of discovery and refinement is almost as fun as seeing how the end product will move people.
I am on an email list of Bill DeMarco’s, a reputable industry insider who has written and consulted extensively in the physician group and medical management space. He recently sent me a note about several physician aggregation events in New Jersey.
For some reason it struck a nerve with me . . . which led me to fire off the response below:
I thought we already saw this movie?
My question for you . . . besides banding together in some megagroup – what are these physicians doing to actually change the delivery of medicine? ACO is just the latest buzzword excuse to aggregate physicians under a new moniker and a supposedly new model.
I am highly skeptical that these physicians are doing anything to change the relationship with their patients, to use enabling technology to create team-based care, or to actually be accountable for the outcomes they produce. What systems are they using to tie themselves together? What financial alignment do they have? What measures are they using to demonstrate superior outcomes? What about the patient experience – 7-minute visits that push pills as the “treatment” won’t get it done in the future.
I think your closing statement, “Representatives from Summit and Optimus were unavailable for comment” says it all.
Am I seeing this the wrong way? Is there anything new about this model this time around? Am I getting old enough to see these things cycle through?
PS – and no, I don’t mean a wolf. The sheep get nervous and band together waiting to get pounced on by wolves.
1. Having to do with the matter at hand; to the point
I read with amusement Susanna Fox’s redux review of the relevance of Health 2.0 in general and in changing patient behavior specifically. Her questions reveal a bias toward a very limited definition of Health 2.0, one I attempted to abolish originally in some of my bantering with Matthew Holt. I always saw Health 2.0 as a “movement” that would not be defined so much by its technology as enabled by it. As an “enabler,” the technology can help people do new things in new ways, but I never believed technology in and of itself had the power to truly change health, health behaviors, or health care delivery.
That is why my definition of Health 2.0 was always more expansive and contemplated an entire “movement” to the next generation health care “system”. This new system must include new delivery models, new financing mechanisms, and the new tools and technology that bring all of this together in a simple, efficient, and affordable way. Clearly this next generation of care would include technology, the new tools, but until we have a new delivery system that is financed in a new way, we are going to continue to have the same behaviors across the patient, physician, provider, and payor continuum.
So Susanna, I don’t think your version of Health 2.0 (tools and technology) does much to get us to the behavior change you seek. In fact, getting to the root of behavior change requires almost a religious experience. Interestingly enough, the health care industry provides plenty of “religious” experiences, including passing close to death, unbelievably poor customer experiences that invoke deep passions (i.e., the birth of ePatient Dave), and the promise of a far better world than we currently enjoy. So while the tools and technology show us what is possible, health care delivery and health finance are the catechismal doctrines we must reform first, for they are what actually incent the behavioral change we all seek.
So is Health 2.0 Relevant? I think it depends on your definition!
Extirpating (ĕk’stər-pāt’) v.
I recently took a great road trip with my two boys. We rented one of the new Kia Souls, which my boys recognized from a very funny commercial developed to highlight its hipster (hamster?) vibe. The commercial reminded me of the old Hamburger A or Hamburger B commercials from Wendy’s back in the late ’80s, wherein a ludicrous contrast is set up to demarcate the dichotomy between two distinct choices.
This modern reinvention of that age-old contrast struck me because it is something I deal with every day in explaining Crossover Health to people. It all stems from a pervasive misconception about the term “Health Insurance.”
The challenge is that “Health Insurance” is a confused term which most people equate with both Health Care (care delivery) and Health Finance (how you pay for it). Our current employer-based system (wherein your employer provides and in most cases pays for your insurance), combined with a third-party insurance payment system (we have the insurance pay for us), creates all kinds of weird incentives and results in no accountability in terms of cost, quality, or outcome. It is currently imploding before our eyes.
Our reaction, both opportunistic as well as obligatory, is to do something totally different by blowing up the current Health Insurance model and separating Health Care from how you pay for it (Health Financing). We say that there is a better way to do BOTH – pay your physician directly for the care you need, and then get smart about how you pay for it with the right insurance product. In fact, you should “self insure” with the highest-deductible plan you can find and then take responsibility for your health for all the small stuff, or hire someone to do that for you (like the Crossover Personal Health Advisory Service). There is no reason to intermediate with parasitic organizations that take your premium dollars and waste them on overhead, fancy offices, mindless phone trees, and my all-time favorite, “this is not a bill” disinformation pamphlets.
As people begin to take this in (they always get how the practice model is a radical improvement), they immediately revert back to the combined “Health Insurance” concept. Does Crossover Health want to replace my current “Health Insurance”? The answer is slightly nuanced, but a resounding YES! I want to replace what you call “Health Insurance” with a direct “Health Care” product (Crossover Health) and a smarter Health Finance product (the highest deductible you can get).
We believe there are large and significant opportunities to roll this into a single product that can be purchased by employers, families, and other organizations seeking fresh alternatives that can demonstrate not only trend bending improvements but trend busting outcomes.
By Sheldon Needle
The real problem of an established medical practice moving into the realm of EHR is not the cost of the medical software package; it is not the training necessary for staff; and it is not security and backups.
The real problem of moving into EMR/EHR is the problem of unstructured medical data.
If you are involved in a new or relatively new practice, this is a no-brainer. Begin with a serious search to compare medical software vendors who are available to answer your questions honestly. It is not really that difficult to work with a friendly medical screen to enter your patient’s blood pressure or lab test values. You can get used to that.
Neither is it difficult to take notes on a notebook computer that uploads them to the EHR system.
The real problem is taking your notes and dictation on a patient that go back 15 years and finding a way to get his possible symptoms, his worry about IBS, his headache history, and his worries over his children into a metrically available rendition that does not take you or a member of your practice days to decipher. These notes usually live in dictation, handwritten notes, and referral letters.
The concerns are many: this can take what feels to be forever, and the anxiety issues and unclear symptoms may not translate easily into metrics but may be critically important in future diagnoses.
There are two critical questions here:
In the long run, it doesn’t even matter if it is worth it. It will happen. Medicine, like the rest of our cultural world, is becoming electronically based whether we like it or not. But in the long run, it is worth it. Think of a patient going into the hospital after a car accident, all by himself, and having all his data available to the admitting doctor in an instant: blood type, history, etc.
Think of a patient being referred to you, the specialist, and having all his patient history available in less than a minute. What a time saver! What insight!
Medical informatics has a number of methodologies it is using to translate unstructured data into useful and structured data.
Three basic methodologies exist to accomplish this:
These methods will be refined, utilized, and integrated in some way into most decent medical vendor software packages over the next few years. For you, the physician or practice manager, this may start to pay off in a while, but you still have to get from handwritten records into the database.
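As a toy illustration of the simplest of these approaches, pattern-based extraction, here is a minimal sketch; the note text and patterns are invented for the example, and real systems rely on far more sophisticated natural language processing:

```python
import re

# An invented fragment of free-text dictation, for illustration only.
note = "Pt seen 3/4/2011. BP 142/88, HR 76. c/o recurrent headaches; hx IBS."

# Crude pattern-based extraction: pull discrete vitals out of free text.
bp = re.search(r"BP\s*(\d{2,3})/(\d{2,3})", note)
hr = re.search(r"HR\s*(\d{2,3})", note)

structured = {
    "systolic": int(bp.group(1)) if bp else None,
    "diastolic": int(bp.group(2)) if bp else None,
    "heart_rate": int(hr.group(1)) if hr else None,
}
print(structured)  # {'systolic': 142, 'diastolic': 88, 'heart_rate': 76}
```

Even a crude pass like this shows why the methodologies matter: the discrete values fall out easily, while the worries and unclear symptoms in the surrounding prose do not.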
The obvious way to proceed makes use of our cultural idea of “going forward”:
The real message to practitioners moving to electronic health records is, don’t look at the top of the mountain when you start climbing, just put one foot in front of the other. Delaying the climb will not get you anywhere, but starting the march will move faster than you think!
Having recently spent time as an observer in a hospital setting, I was struck by the lack of intelligent planning and forethought made for doctors trying to move into an EMR / EHR environment.
Though I saw a well-known EHR panel on the computer screens within an ICU, and the EHR being used to record certain patient data, doctors were taking their notes in longhand. Later that same day I saw the same doctors transcribing their notes onto their computers. The doctors, doing double duty on note-taking, were not available to their patients because they were acting as secretaries.
When a large clinical environment is incorporating an EHR it has to be done in a modular way that does not impact productivity any more than it has to. The task is hard enough. If you are using an EHR to record point of care patient information, give your doctors a Notebook so they can take their notes electronically. In fact, insist on electronic note-taking. Incorporate change with some forethought to peoples’ time and effort.
This real-life observation just underscores the need to plan for transition to an EMR rather than throwing an institution into the chaos of change for its own sake, or for the sake of Meaningful Use incentive payments. As in all things, the old US Coast Guard motto holds true: Semper Paratus! Always be ready and prepared.
Most good EMR / EHR systems can offer medical clients some guidance as to best practices in incorporating EMR / EHR systems within their practices.
By Sheldon Needle
The prospects for EHR in the coming year are exciting but more than a little daunting. The issue is really how to find an EMR/EHR system that will organize and centralize the functions of your practice, without bankrupting you and throwing your staff and yourself into turmoil.
If you look at the websites of EMR vendors today, you can see that the functions they describe within their systems – the integration of clinical records with practice management data, e-prescription, patient portals – could conceptually do wonderful things for you and for your patients in the way you handle their individual cases, but many of the details are still not working smoothly.
Here are some of the things to be aware of:
Remember, always read the fine print and ask every question you need to. Know that EMR software is a very competitive business. The vendors need you just as much as you need them!
By Sheldon Needle
5010 is not only a date 3,000 years in the future: ANSI 5010 is the newest version of the HIPAA transaction standards regulating electronic transmission of medical and healthcare transactions. The existing standard is called 4010, and 4010 does not support ICD-10 coding.
The current coding standard for diagnosis and procedure coding is the ICD-9, and it has outlived its possibilities –it limits the number of new procedure and diagnostic codes that can be created.
This is how CMS.gov (the Centers for Medicare & Medicaid Services, at http://www.cms.gov) defines ICD-10:
ICD-10-CM/PCS (International Classification of Diseases, 10th Edition, Clinical Modification/Procedure Coding System) consists of two parts:
ICD-10-CM is for use in all U.S. health care settings. Diagnosis coding under ICD-10-CM uses 3 to 7 digits instead of the 3 to 5 digits used with ICD-9-CM, but the format of the code sets is similar.
ICD-10-PCS is for use in U.S. inpatient hospital settings only. ICD-10-PCS uses 7 alphanumeric digits instead of the 3 or 4 numeric digits used under ICD-9-CM procedure coding. Coding under ICD-10-PCS is much more specific and substantially different from ICD-9-CM procedure coding.
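For a rough feel of the format difference described above, here is a minimal sketch that distinguishes diagnosis codes by shape alone; the regular expressions are simplified assumptions for illustration and do not validate against the actual published code lists:

```python
import re

# Simplified shape checks -- illustrative only, not full ICD validation.
# ICD-9-CM diagnosis: 3 to 5 digits (V- and E-prefixed codes omitted here).
ICD9_DX = re.compile(r"^\d{3}(\.\d{1,2})?$")
# ICD-10-CM diagnosis: letter + digit + alphanumeric, then an optional
# decimal point and 1-4 more alphanumerics (3 to 7 characters total).
ICD10_DX = re.compile(r"^[A-Z]\d[A-Z0-9](\.[A-Z0-9]{1,4})?$")

def code_set(code: str) -> str:
    """Guess which diagnosis code set a code's format belongs to."""
    if ICD10_DX.match(code):
        return "ICD-10-CM"
    if ICD9_DX.match(code):
        return "ICD-9-CM"
    return "unknown"

print(code_set("250.00"))  # ICD-9-CM (diabetes mellitus)
print(code_set("E11.9"))   # ICD-10-CM (type 2 diabetes)
```

A real migration would validate against the published code tables, but even the shape check hints at why 4010-era transactions sized for ICD-9 cannot carry ICD-10 codes.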
The transition to 5010 is supposed to happen by January 1, 2012. This means that electronic transmissions, including claims, eligibility inquiries, and remittance advices, must be made in a 5010-compliant format. Healthcare providers, health plans, and clearinghouses for transactions are all expected to upgrade their transmissions. Non-compliance may result in denied claims or slower payment.
Systems that are certified as ONC-ATCB for 2011/2012 are already 5010 compliant. If you are contemplating buying a system that is so certified, you do not have to worry about the software compliance, but you do need to educate your staff, including yourself, if you are the physician or the P.A., on what the differences between 4010 and 5010 mean to their everyday work.
If you are using old medical software that has not been updated, or are contemplating installing software that is not certified as ONC-ATCB for 2011/2012, you need to update to a newer version, or face delays and uncertainties in your billing and claims submission. In other words, do some serious upgrading, or else!
By Sheldon Needle
November 30, 2011: Today HHS Secretary Kathleen Sebelius announced incentives to speed the adoption and use of health IT, in the form of meaningful-use qualified EHRs, in doctors’ offices and hospitals nationwide, which will improve health care and create jobs.
The new administrative actions announced today, which will be made possible by provisions of the HITECH Act, will loosen requirements for doctors and other health care professionals to receive incentive payments for adopting and meaningfully using health IT.
“When doctors and hospitals use health IT, patients get better care and we save money,” said Secretary Sebelius. “We’re making great progress, but we can’t wait to do more. Too many doctors and hospitals are still using the same record-keeping technology as Hippocrates. Today, we are making it easier for health care providers to use new technology to improve the health care system for all of us and create more jobs.”
The press release continues to state: “HHS also announced its intent to make it easier to adopt health IT. Under the current requirements, eligible doctors and hospitals that begin participating in the Medicare EHR (electronic health record) Incentive Programs this year would have to meet new standards for the program in 2013. If they did not participate in the program until 2012, they could wait to meet these new standards until 2014 and still be eligible for the same incentive payment. To encourage faster adoption, the Secretary announced that HHS intends to allow doctors and hospitals to adopt health IT this year, without meeting the new standards until 2014. Doctors who act quickly can also qualify for incentive payments in 2011 as well as 2012.” (The italics are ours.)
We need to understand what acting quickly means: buying in 2011? Incorporating EHR within the next month, so that meaningful use occurs in 2011? This is not yet clear.
HHS is redoubling its efforts to reach out with information, education, and the possibility of incentive payments to doctors, hospitals, and vendors about stepping up the pace of transitioning practices and EHR software to meet standards of Meaningful Use. What Meaningful Use means to the individual practice depends on size, degree of implementation of the EHR, and the nature of the client base (how many Medicare or Medicaid patients, for instance, figures into the formula of Meaningful Use).
The Obama Administration is working to create a nationwide network of 62 Regional Extension Centers, comprised of local nonprofits, to help eligible health care providers learn how to participate in the Medicare and Medicaid EHR Incentive Programs and meaningfully use health IT.
See the HHS press release, at: http://www.hhs.gov/news/press/2011pres/11/20111130a.html to learn more.
Keep your eyes on the newspapers, government announcements and on this blog to learn about EMR and EHR news and updates.
By Sheldon Needle
You know that your medical practice will have to bite the EMR bullet sooner or later (actually, sooner). The digital handwriting is on the tablet, isn’t it? So what is stopping you from moving ahead at a planned pace rather than being forced into converting your medical practice to an EMR at the 11th hour?
Here are some of the most common obstacles people face in converting their practices to the use of electronic medical record software, and here are some strategies to deal with them or get the process going:
1. How will we migrate from paper to digital images? Conversion of paper medical records to digital format: If you have your eye on an EMR, learn how tolerant it is of varying formats: does it accept PDF files? JPG format? ASCII text files? Extracts from Excel files?
Don’t bite off more than you can chew to begin with. If yours is a practice with reams of folders full of paper files to convert, decide how many years back you need to go to get your EMR up and running. Perhaps you can start with one year of files in the EMR? Or perhaps you need to go much further back?
Look into the possibility of having a consultant specializing in data conversion take charge of your files. There are companies that specialize in just such medical data conversions. If you are really desperate, hire a responsible college student, make the specs clear, and pay him or her decently!
2. How will we train everyone in such a new system? Training yourself and your staff: Once you have chosen your EMR system, engage the company’s own training staff; that way, you are sure you are being oriented in the current system, using the right documentation. Before you choose your EMR, see what kind of training options the company offers. You might go for a short orientation up front, with a good help desk that is available 24/7. Check reliable electronic medical records ratings to see which companies provide good in-person, phone, and online support.
3. Do we have to set up all the hardware and maintain the software? I don’t think we can manage that. Consider a cloud-based EMR solution: If you are reluctant to invest in a server and commit to the upkeep of hardware and software, consider a Web-based EMR, in which you log onto a system whose vendor worries about security and about updates to hardware and software.
4. How can I compare products so that my practice knows what it is getting into? How much can I trust referrals from other practices? Don’t put all of your EMR decision eggs into one basket: While personal referrals are extremely helpful and reassuring, not all are meaningful for your unique EMR practice situation. There are many good EMR products to choose from, and each has its strengths and its weaknesses.
The right choice will depend as much on the nature of your medical practice as on the answers to many questions: What is your medical specialty? How many employees do you have? How expensive is the EMR, per year? How much money can you dedicate to investing in your EMR annually? Can you integrate your medical billing software with your proposed new EMR? Can you afford to hire a dedicated IT employee? How comfortable are you and the others in your practice with using an electronic device as the main source of medical input to your system? These are just a few of the many questions you need to ask yourself.
Talk to people in other practices, yes; but learn to ask the right questions and compare apples to apples and oranges to oranges. Great EMR comparison tools are available to you at no charge, and they can educate you to ask the right questions and maintain a solid baseline for comparison when choosing an EMR.
Ahem… what do we say about privacy and data selling? Bingo – it appears, if you read through the entire article, that this type of data was shared with Wall Street investors to make a market for selling some new analytic algorithms. Now get this: the investors got to see “private” information that a patient can’t even get access to. This reminds me of ePatientDave’s “give me my damn data,” and this is a total abuse here, as the data is not being used for better care but for “better money.”
Now this also says something about access to revenue cycling; payers and integrators might want to revisit this scenario and make sure, for one, that the data stays on a server, and decide what levels of access will be granted. It gets worse, as the types of information and patients involved related to mental health, HIV, Parkinson’s and more. How many investors glazed over these records? Accretive gets paid on the revenue boost it provides. There are a lot of these types of 3rd parties around in healthcare, and here’s another one, used by Blue Cross, that had some bad algorithms.
Actually, when it comes down to payer disputes, you wonder: did the hospital bill erroneously on purpose, or did they get some bad algorithms and a bunch of promises? If I were one of these patients, court might be on my mind, and I would want to know which investors on Wall Street potentially saw, or did see, my data! On their website they talk about bringing increased discipline to the revenue cycle, so is that the revenue cycle on Wall Street?
Wells Fargo just dumped one of these types of companies recently, and remember the big data breach at Stanford, also the fault of a 3rd party. With history like this being built, who wants to trust a 3rd party today if you don’t have to, as patient records end up on the web and in the hands of investors on the street? The 3rd-party folks are the algorithm makers, though, who promise better profits and better use of money. This whole scenario is kind of sad, as they were supposed to be helping a couple of nonprofits boost their revenue, but the hospitals probably had no clue that methodologies like showing patient records to investors were part of the plan.
“The screen shot also includes numeric scores to predict the “complexity” of the patient and the probability of an inpatient hospitalization, and a box to describe the “frailty” of the patient.”
Time to start licensing and taxing those data sellers and have a federal disclosure site so we all know what’s going on – it’s beginning to make more sense every day! The link below describes a bit of this brainstorm. BD
ST. PAUL, Minn. - Minnesota Attorney General Lori Swanson has filed a lawsuit against a debt collector accused of failing to protect the confidential information of 23,500 hospital patients after a company laptop was stolen from a rental car parked in the Seven Corners area of Minneapolis.
The lawsuit filed Thursday alleges Accretive Health, Inc., a debt collection agency that is part of a New York private equity fund conglomerate, failed to protect the confidentiality of patient health care records and failed to disclose its involvement in their health care.
Last July, Accretive lost a laptop computer containing unencrypted health data of about 23,500 patients of two Minnesota hospital systems -- Fairview Health Services and North Memorial Health Care.
Under both contracts, Accretive controls and directs the work of hospital employees and “infuses” its own employees into the staffs of the hospitals. Accretive gets base compensation and incentive pay for helping the hospitals boost revenue or cut costs.
“The debt collector found a way to essentially monetize portions of the revenue and health care delivery systems of some nonprofit hospitals for Wall Street investors, without the knowledge or consent of patients who have the right to know how their information is being used and to have it kept confidential,” Swanson said.
The state seeks an order requiring Accretive to fully disclose to patients:
- What information it has about Minnesota patients
- What information it has lost about Minnesota patients
- Where and to whom it has sent information about Minnesota patients
- The purposes for which it amasses and uses information about Minnesota patients.
Once more, mergers and acquisitions show how health insurance companies have diversified their portfolios and are no longer “just an insurance company,” with numerous subsidiaries both in the health IT area and in what you might consider “unrelated” businesses. Here’s one example below: a diversified interest with a new division created to distribute hearing aids and offer incentives for those in certain areas of the US to sign up for insurance plans. I sometimes wonder how other insurers view this.
Here’s another example of a business outside what we might normally consider related: low-income housing investments in New Mexico. One thing to keep in mind today is all the aggregated data that flows, and the algorithms and SQL statements that bring some of this together. Data is big business.
Just a couple of weeks ago we read about the investment in mobile health, and again we venture into the data business, as the Optum division, which has many subsidiaries, has a huge focus on data. Part of the renamed group is the old “Ingenix” company, which has consulted and provided data services for years and last year settled its case with the AMA over short-paying providers for out-of-network services.
This kind of brings me around again to what I call the “Alternative Millionaire’s Tax” on companies that buy and sell data, and this seems to be a good place for a mention, as the Optum division has been making money for years aggregating and selling prescription and other data. With big profits like these, we could certainly entertain a license-and-tax arrangement for those making billions in the data-selling business. As a short comparison from another healthcare company, Walgreens has estimated its data-selling business to be valued at just under $800 million, so again, something to give some thought to as hospitals, providers, and patients struggle to afford medical care today.
For another good read about the oversell and the naïve, gullible nature of the US, both government and consumers, see what Nanex has to say – they are the folks who monitor and study rogue algorithms in the stock market and look for indicators of the “next flash crash.” A couple of paragraphs are below. Will the SEC be suckered into this huge expense by programmers who want to make big dollars writing code? Digital illiteracy and steroid marketing are all over the place, and algorithms built for profit alone have teeth. At a certain point we might need to REALLY think about the value and cost of some of the data we analyze today, and that is worth a mention, as this is the big growth area for United: algorithms and software analytics via consulting services. It is also worth a note that United last year hired the former Assistant Attorney General for the State of Minnesota as its general counsel.
Below is one example of the algorithm/software business: the company created a clearinghouse business and collaborated with a medical records company to integrate its services with Epic. Of course this means more data revenue for the company, and it puts a bit of stress on the smaller existing clearinghouse businesses in the US as well.
One more thing: let’s not forget that they also own a bank with over $1 billion on deposit in health savings accounts, and I would guess this also leaves them open to lend against the monies held there and somewhat compete with other banks. As you can read in the quote below, the funds are largely generated by employers, in other words large US corporations, so they seem to go hand in hand, right?
“OptumHealth offers three types of HSAs, as well as tax-advantaged health care savings and spending accounts, debit-card services, benefits administration services, and payment products. About three-quarters of the bank’s 1.6 million accounts are employer-generated, while the other quarter are individual accounts.”
There’s also the Chinese investment the company made early in 2010.
If you were to stop and look, you might also notice one more subsidiary that consults with biotech and device companies on introducing new products to the FDA, and when you think about it, they might just have subsidiaries to handle the entire process, from FDA approval all the way down to provider reimbursement.
One other related item is the purchase of physician groups, which is growing; the acquisition of Monarch in Orange County is one big example of buying a huge managed care group.
Again, in summary: with such large profits, and a lot coming from the data end of the business, this looks like one company where licensing and taxing the data sold for huge profits could fit, and there are many more, as hedge funds, Facebook, and tons of other companies are cashing in royally. This all leads to bottom-line profits from running algorithms on servers 24/7 that the consumer can’t see, touch, or talk to. Automated algorithms for data mining and selling are yielding huge profits for corporate USA while, as consumers, we are becoming “data chasers,” trying to fix a lot of the flawed data that is out there today. It’s a good idea to read up and see how the corporate USA scene has changed tremendously due to the huge array of mergers and acquisitions; companies are not the same ones they were 2 to 3 years ago by any means. BD
UnitedHealth Group (NYSE:UNH) today reported fourth quarter and full year 2011 results, highlighted by strong enrollment and revenue growth in each of UnitedHealthcare’s benefits businesses and strong revenue growth at all Optum business units. Full year and fourth quarter 2011 net earnings were $4.73 per share and $1.17 per share, respectively. Cash flows from operations were $7 billion in 2011.
The Company continues to estimate 2012 revenues in the range of $107 billion to $108 billion and net earnings in the range of $4.55 to $4.75 per share.
Is there money in those algos? This story might answer that. Why would this employee, a contracted programmer, take this code? It’s worth money, and if you read here often enough you know I discuss those algos; software is nothing more than a group of algorithms, in the words of Bill Gates.
A co-worker said the accused claimed he lost the drive containing the code, and get this: it’s the software (aka algorithms) that cost roughly $10 million to develop and tracks the billions of dollars the US government dispenses “daily” to government agencies. These are some pretty commanding algorithms. So the programmer apparently took the code, and who knows where it would go next? A lot of government code is open source, but don’t think that is the case here. What’s the next security breach to occur? BD
Bo Zhang, 32, of Queens, New York, worked as a contract programmer at the bank. He was accused of illegally copying software to an external hard drive, according to a criminal complaint filed in U.S. district court in Manhattan.
Authorities said the software, owned by the U.S. Treasury Department, cost about $9.5 million to develop.
A New York Fed spokesman said in a statement that the bank immediately investigated the suspected breach when it was uncovered and promptly referred the matter to authorities.
Zhang told investigators he took the code "for private use and in order to ensure that it was available to him in the event that he lost his job," the complaint said.
The code, called the Government-wide Accounting and Reporting Program (GWA), was developed to help track the billions of dollars the United States government transfers daily. The GWA provides federal agencies with a statement of their account balance, the complaint said.
This is kind of an alarming finding, but when you read further, it does not stop the treatment process, and the secondary growths are surgically removed. About half of those with melanoma carry the mutation that makes them eligible for the drug and need to be on alert, but not all of them develop the secondary skin cancer – only about a quarter of that 50% risk group.
This sounds like a big step in recognizing undesired side effects of oncology treatments. BD
Drug Used to Treat Melanoma with One Mutation Sets off a Cascade that Results in a Different Type of Skin Cancer in Cells with Another Mutation
Patients with metastatic melanoma taking the recently approved drug vemurafenib (Zelboraf®) responded well to the twice daily pill, but some of them developed a different, secondary skin cancer.
Now, researchers at UCLA’s Jonsson Comprehensive Cancer Center, working with investigators from the Institute of Cancer Research in London, Roche and Plexxikon, have elucidated the mechanism by which vemurafenib excels at fighting melanoma but also allows for the development of skin squamous cell carcinomas.
The very action by which the pill works, blocking the mutated BRAF protein in melanoma cells, sets off a cellular cascade in other skin cells if they have another pre-disposing cancer mutation and ultimately accelerates the secondary skin cancers, said Dr. Antoni Ribas, co-senior author of the paper and a professor of hematology/oncology.
About 50 percent of patients who get melanoma have the BRAF mutation and can be treated with vemurafenib, Ribas said. Of those, a fourth of the patients develop skin squamous cell carcinomas. The squamous cell carcinomas were removed surgically, and vemurafenib was not discontinued for this side effect.
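Taken together, those two fractions imply the overall incidence among melanoma patients; a quick back-of-the-envelope sketch (the figures are the ones quoted above, rounded):

```python
# Rough incidence estimate from the figures quoted by Ribas.
braf_mutation_rate = 0.50    # melanoma patients with the BRAF mutation (treatable)
scc_rate_if_treated = 0.25   # treated patients who develop squamous cell carcinoma

overall = braf_mutation_rate * scc_rate_if_treated
print(f"Overall: {overall:.1%} of all melanoma patients")  # 12.5%
```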
“We wondered why it was that we were treating and getting the melanoma to shrink, but another skin cancer was developing,” said Ribas, who studies melanoma at the Jonsson Cancer Center. “We looked at what was likely making them grow and we discovered that the drug was making pre-existing cells with a RAS mutation grow into skin squamous cell cancers.”
The 18-month study appears in the Jan. 19, 2012 edition of the New England Journal of Medicine.
The combined research team performed a molecular analysis to identify the oncogenic mutations in the squamous cell lesions of patients treated with the BRAF inhibitor. Among 21 tumor samples studied, 13 had RAS mutations. In a different set of 14 samples, eight had RAS mutations, Ribas said.
“Our data indicate that RAS mutations are present in about 60 percent of cases in patients who develop skin squamous cell cancers while treated with vemurafenib,” Ribas said. “This RAS mutation is likely caused by prior skin damage from sun exposure, and what vemurafenib does is accelerate the appearance of these skin squamous cell cancers, as opposed to being the cause of the mutation that starts these cancers.”
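The two sample sets quoted above are consistent with that "about 60 percent" figure; a quick check of the arithmetic:

```python
# Pooling the two tumor sample sets reported in the study:
# (RAS-mutant tumors, total tumors) per set.
samples = [(13, 21), (8, 14)]

for mutant, total in samples:
    print(f"{mutant}/{total} = {mutant / total:.0%}")

pooled_mutant = sum(m for m, _ in samples)   # 21
pooled_total = sum(t for _, t in samples)    # 35
print(f"pooled: {pooled_mutant}/{pooled_total} = {pooled_mutant / pooled_total:.0%}")
# pooled: 21/35 = 60%
```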
Ribas’ group found that blocking the non-mutated BRAF in cells with mutated RAS caused them to send signals around BRAF that induced the growth of the squamous cell cancers.
The discovery of the squamous cell cancer mechanism has led to strategies to inhibit both the BRAF mutation with vemurafenib and block the cellular cascade with a different drug, a MEK inhibitor, before it initiates the secondary skin cancers, said co-senior author Professor Richard Marais from the Institute of Cancer Research in London, who developed the animal model for the study.
“By understanding the mechanism by which these squamous cell cancers develop, we have been able to devise a strategy to prevent the second tumors without blocking the beneficial effects of the BRAF drugs,” Marais said. “This may allow many more patients to benefit from these important drugs.”
Ribas said that this is one of the very few times that oncologists understand molecularly why a side effect of a cancer treatment is happening.
“The side effect in this case is caused by how the drug works in a different cellular setting,” he said. “In one case it inhibits cancer growth, and in another it makes the malignant cells grow faster.”
Studies currently are under way testing BRAF and MEK inhibitors in combination in patients with metastatic melanoma, Ribas said.
“Our data provide a molecular mechanism for the clinical toxicity of a targeted oncogene inhibitor that apparently contradicts the intended effects,” the study states.
The study was supported by Roche, Plexxikon, the Seaver Institute, the Louise Belley and Richard Schnarr Fund, the Fred L. Hartley Family Foundation, the Wesley Coyle Memorial Fund, the Ruby Family Foundation, the Albert Stroberg and Betsy Patterson Fund, the Jonsson Cancer Center Foundation and the Caltech-UCLA Joint Center for Translational Medicine.
UCLA's Jonsson Comprehensive Cancer Center has more than 240 researchers and clinicians engaged in disease research, prevention, detection, control, treatment and education. One of the nation's largest comprehensive cancer centers, the Jonsson center is dedicated to promoting research and translating basic science into leading-edge clinical studies. In July 2011, the Jonsson Cancer Center was named among the top 10 cancer centers nationwide by U.S. News & World Report, a ranking it has held for 11 of the last 12 years. For more information on the Jonsson Cancer Center, visit our website at http://www.cancer.ucla.edu.
Jon goes back to Foxconn, revisited in his usual style, and says we need to make our factories look more like those in China: workers live in dormitories and don’t know each other, which cuts down on commuting and friendship.
The factories are finding ways of "improving" conditions: hotlines that try to stop suicides, and nets around the buildings to catch jumpers. I think we remember this from a year ago, and as he says, in the US we call this "treating the symptoms."
“It’s me, Siri, in your pants pocket working on giving you testicular cancer.” If those factory conditions were fixed, electronics would cost more. Modern workfare, a game to the rescue. It’s great humor, but some of it does make one ponder: there’s just one level and this is it (the middle class), as algorithms are marketed, designed, and sold to consumers.
He shows, in a humorous way, how the algorithms are already at work. Why are health insurance companies getting into the low-income housing business, though? I hope this is not a Foxconn-style plan to create communities with jobs that pay little and medical care on campus. What is up with this?
The same company owns a subsidiary that will basically give you a free hearing aid made in China if you sign up for its health insurance (more below); the subsidiary was built to distribute the devices and, as I understand it, is coming to Walmart soon.
He moves on to the next part, a game that has one level. Hmmm, we have another insurance company banking on this too. Data to sell? Will this make you healthy? I prefer real knowledge.
I just ask: is this where we are headed with mining and selling data today, with big corporations taking over our day-to-day decisions? The more information they have with which to judge and discriminate, the more their ability to control and humiliate the middle class grows.
At any rate, with algorithms today that have teeth and the amount of flawed data out there, are we going in this direction? I put this out to raise awareness and perhaps generate some thought. I like technology and the good things it brings, but I am not oblivious to how it can also be abused, and an NYU professor says it even better than I do; read and listen up.
I sure hope Richard Cordray understands math and the power of algorithms, both when they are used in an intuitive and good manner and the reality of those designed for pure profit that hurts consumers. You can’t see, smell, or touch them, but they are running on servers 24/7, every day, making life-impacting decisions, crafted by some of the smartest programmers and developers that Wall Street money can buy.
For another good read on the oversell, and on the naïve and gullible nature of both the US government and consumers, see what Nanex has to say; they are the folks who monitor and study rogue algorithms in the stock market and look for indicators of the “next flash crash.” A couple of paragraphs are below. Will the SEC be suckered into this huge expense by programmers who want to make big dollars writing code? Digital illiteracy, steroid marketing, and algorithms built for huge profits alone are all over the place, and they have teeth. BD
“Wall street hires the best software developers money can buy. They write clever algorithms. These algorithms will only get more clever as time goes on. Which means they will always be changing. Now, writing software to detect what other software is doing is 100 times more difficult. Which in the software world means 100 times more expensive. Which means hiring people that do not exist, since Wall Street already snapped up the best, and you need the best times 100 (you can't make it up in quantity and just get 100 times more wizards, because many will have poor social skills, and you need these people to communicate).”
“You see the folly of trying to regulate the markets in real-time? Real-time raises the cost exponentially times a million. To a level that all the kings in the world couldn't afford. It would be one thing to track in real-time, things that had known behavior. Like your checking account being overdrawn. Maybe credit card fraud in the making (which, by-the-way, hasn't been perfected yet, despite lots of money and time thrown at the problem). “
To go back a little bit in time, the chip was also set up to communicate with personal health records like HealthVault. The latest development was the ability to communicate real-time glucose readings; the FDA has approved the product and the Health Link software.
In addition, Medcomp, which makes vascular access catheters, will use the chip in vascular ports to identify the port in a patient for proper medication dispensing. As I read it here, though, the use with Medcomp still needs to secure FDA approval. This chip keeps coming back around with many lives. BD
DELRAY BEACH, Fla., Jan 17, 2012 (BUSINESS WIRE) -- VeriTeQ Acquisition Corporation ("VeriTeQ" or "Company"), a marketer of implantable, radio frequency identification ("RFID") technologies for patient identification and sensor applications, announced today it has acquired the VeriChip implantable microchip and related technologies, and Health Link personal health record from PositiveID Corporation. VeriTeQ is majority owned and led by Scott R. Silverman, former Chairman and CEO of PositiveID and VeriChip Corporation. PositiveID has retained an ownership interest in VeriTeQ.
VeriTeQ will focus on three main areas: patient identification and personal health record (PHR) access through the VeriChip implantable microchip and Health Link web-based PHR; implantable sensor applications; and identification of medical devices within the body. VeriTeQ will also focus on identification and sensor applications for animals.
VeriTeQ's acquisition also includes the rights to a Development and Supply Agreement with Medical Components, Inc. ("Medcomp"), a leading manufacturer of vascular access catheters. Under the terms of the agreement, Medcomp will embed the VeriChip microchip in its vascular ports to facilitate identification of the port in a patient and proper medication dispensing.
Doctors’ adoption of health information technology doubled in two years, according to a new report Department of Health and Human Services Secretary Kathleen Sebelius released Wednesday. Sebelius also announced an extension of the meaningful use Stage 2 qualification date to 2014. See link for more info – http://www.healthcareitnews.com/news/hhs-extends-mu-stage-2-deadline-spur-faster-emr-adoption?topic=01,08
The survey I posted earlier has now been completed – here are the results.
High physician fees, rather than factors such as practice costs, volume of services or tuition expenses, were the main drivers of higher U.S. healthcare spending and physician income, according to research presented in the September issue of Health Affairs.
The study, conducted by Miriam J. Laugesen, PhD, and Sherry A. Glied, PhD, both of the Mailman School of Public Health at Columbia University in New York City, found that in some cases, physicians in the U.S. are paid as much as double their counterparts in other countries. There is also a larger gap between fees paid for primary care and fees paid for specialty care, particularly orthopedic surgeons, in the U.S. compared to other countries evaluated by the study.
Fees paid by public and private payors for primary care office visits and hip replacements were compared in six countries: Australia, Canada, France, Germany, the U.K. and the U.S.
Laugesen and Glied found that primary care physicians in the U.S. were paid, on average, 27 percent more by public payors for an office visit, and 70 percent more by private payors for an office visit, compared to the other countries. The largest difference in fees paid between countries was for hip replacements. Physicians in the U.S. were paid 70 percent more by public payors and 120 percent more by private payors for these procedures as compared with physicians in the other countries.
Across the fees analyzed by the study, the biggest disparities in pay to U.S. physicians existed on the private side. Fees paid by private insurers in six markets in the U.S. averaged about 33 percent above Medicare rates for primary care and 50 percent above Medicare rates for hip replacements.
“Our analysis suggests that policymakers in all countries need to consider how differential prices paid by both public- and private-sector payors to specialists influence specialty choices,” wrote the authors. “Furthermore, this analysis suggests a need for greater standardization of cross-national data on the nature of physician services provided, fees, education and incomes to allow ongoing comparative research on the relationship between prices and healthcare spending growth.”
Incomes were also higher for U.S. primary care and orthopedic physicians compared to their foreign counterparts.
The authors said other factors thought to contribute to physicians’ fees, such as high medical education tuition costs for American physicians or increased work volume, could not fully explain the disparity in fees when compared across the countries.
“Although the tuition cost of medical education in the U.S. borne by individuals is substantial, it cannot fully account for the observed differences between the earnings of U.S. physicians and physicians in all other countries,” wrote Laugesen and Glied.
For the services examined by the study, higher physician incomes did not appear to be due to a higher volume of services, though the authors acknowledged the rates of other procedures not studied may be higher and contribute to the elevated fees and incomes.
One possible explanation offered by the authors for the high U.S. physician fees was the notion that higher fees may reflect the cost of attracting highly skilled candidates. When physician fees in each country were compared to the mean incomes of the top 1 percent of households within that country, the results were broadly consistent, suggesting higher U.S. fees were the result of a “society with a relatively more skewed income distribution,” according to Laugesen and Glied.
The New York Times (9/6, B1, Freudenheim, Subscription Publication) reports, “Under heavy pressure from government regulators and insurance companies, more and more physicians across the country are learning to think like entrepreneurs.” One result is the rapid growth in joint M.D./M.B.A. programs, now 65 with an estimated 500 students. Some intersperse business courses with medical courses, while others have students complete their medical training and then add a year or more of business education. “Dr. Barry R. Silbaugh, chief executive of the American College of Physician Executives, a professional society that provides medical education courses and career counseling, said more start-ups were being run by doctors.” He explained that some “are focused on adapting technology to health care, not just electronic medical records,” adding, “The use of social media is of great interest to many younger physicians, and so is health care analytics.”