Healthcare and Technology news

National Quality Forum Urges Providers Forward on Data and Analytics in Healthcare


On Aug. 6, the Washington, D.C.-based National Quality Forum released a white paper, “Data Needed for Systematically Improving Healthcare,” intended to highlight strategies to help make healthcare data and analytics “more meaningful, usable, and available in real time for providers and consumers.”


According to a press release issued on that date, “The report identifies several opportunities to improve data and make it more useful for systematic improvement. Specific stakeholder action could include the government making Medicare data more broadly available in a timely manner, states building an analytic platform for Medicaid, and private payers facilitating open data and public reporting. In addition, electronic health record (EHR) vendors and health information technology policymakers could promote “true” interoperability between different EHR systems and could improve the healthcare delivery system’s ability to retrieve and act on data by preventing recurring high fees for data access.”


The press release noted further that “The report identifies actions that all stakeholders could take to make data more available and usable, including focusing on common metrics, ensuring that the healthcare workforce has the necessary tools to apply health data for improvement, and establishing standards for common data elements that can be collected, exchanged, and reported.”


The report emerged out of an initiative supported by the Peterson Center on Healthcare and the Gordon and Betty Moore Foundation, and spurred by a 2014 report by the President’s Council of Advisors on Science and Technology that called for systems engineering approaches to improve healthcare quality and value.


The press release included a statement by Christine K. Cassel, M.D., president and CEO of NQF. “Data to measure progress is fundamental to improving care provided to patients and their outcomes, but the healthcare industry has yet to fully capture the value of big data to engineer large-scale change,” Dr. Cassel said in the statement. “This report outlines critical strategies to help make data more accessible and useful, for meaningful system wide improvement.” 

Following the publication of the report, Rob Saunders, a senior director at the National Quality Forum, and one of the co-authors of the report, spoke with HCI Editor-in-Chief Mark Hagland about the report and its implications for healthcare IT leaders. Below are excerpts from that interview.


What do you see as the most essential barriers to moving forward to capture and correctly use “big data” for clinical transformation and operational improvement in healthcare?

There are sort of two buckets we looked at through this project. We looked at the availability of data, and we’re seeing more availability of electronic data. Interoperability remains a major challenge. But it wasn’t just about interoperability between electronic health records, but also being able to link in data from elsewhere.


Does that mean data from pharmacies, from medical devices, from wearables?

Some of these may be kinds of data from community health centers, or folks offering home-based and community-based services. So, getting a broader picture of people’s health, as they’re living their lives in their communities. And there are exciting things on the horizon, too, like wearable devices. But the first barrier we heard about was just getting more availability of data. Perhaps the harder problem right now is actually using more data, and turning that raw data into meaningful information that people can use. There’s so much raw data out there, but it so often is not actionable or immediately usable to clinicians.


So what is the solution?

That is an excellent question. Unfortunately, there’s no silver bullet. We’ve looked at a wide range of possible solutions, but it will take action from healthcare organizations trying to improve their internal capacity, for example, creating more training for clinicians to use data in their practices, or even state governments taking action. I think it will require a lot of action from all the stakeholders around healthcare to make progress.

 

The white paper mentioned barriers involving information systems interoperability, data deidentification and aggregation, feedback cycles, data governance, and data usability issues. Let’s discuss those.

I think one of the challenges with all of those is that there are some big strategic issues, and some large national conversations, around each of them, especially interoperability. But there are also a lot of technical details to iron out. And unfortunately, that’s not something we can just solve tomorrow. But there’s opportunity with these new delivery system models, and that will hopefully be helpful.


How might all this play out with regard to ACOs, population health, bundled payments, and other new delivery and payment models?

What we’ve heard is that those new models are becoming increasingly common, and because of them, clinicians and hospitals have far more incentive to look holistically at the entire person, to think about improvement, and to really start digging into some of this data.


Marrying EHR [electronic health record] and claims data for accountable care and population health is a very major topic for our magazine and its readers right now. Let’s talk about those issues.

We didn’t necessarily go into great depth on that particular challenge. But clearly, that’s one of the big issues in trying to link all these different data sources together, and it also speaks to the challenge in getting this data together.


Is there anything that healthcare IT vendors need to do better?

And we actually called out healthcare IT vendors and EHR vendors, because they’re a really important sector here. Promoting interoperability speaks to both policy and technical challenges.


Are you also concerned about data blocking?

Yes, that’s how ONC and HHS have characterized it. But yes, we’re really talking about data access. Clearly, that’s a barrier. And then there are still some technical pieces here around how to create APIs that can really start to allow more innovative ways to analyze the data that’s already in a lot of these EHR and health IT systems, and that will allow some customization and capabilities.


What’s your vision of change for the use of data in healthcare?

There are a number of folks doing really exciting work using data for systemic improvement. So we showcased Virginia Mason as a model. And some of their work involves manual collection of data. And that can produce really remarkable results; and as you become more sophisticated, you’re able to incorporate that data collection into the EHR [electronic health record]  and other systems. That speaks to what we said earlier, that availability of data is a good thing, but it’s the use of data that seems to be more of an issue. Premier Inc. has done some really good things, collecting data through some of their groups, to share; and oftentimes, that was data people didn’t even have before.  You can also activate clinicians’ professional motivation—many physicians, nurses, really want to make care better for their patients. And data really can make a difference in that.

And the last point is the fact of the important role that brings this down to patients and consumers, involving the broader public in this. What we’ve talked about so far has been very technical. But patients have a lot of data about themselves, and they’re also able to help out with a lot of this.

 

So you’re talking about patient and consumer engagement in this?

 

Yes, I am, but it’s not just that. I’m also talking about patients as an untapped data resource, and an untapped resource in general of folks who are highly motivated and who want to make care better, if they have the tools available and are able to do so.

 

The “blessed cycle” of data collection, data analysis, and data reporting; the sharing of data with end-users and clinician leaders for clinical and operational performance improvement; and the recycling of results into further data collection and reporting is very important. Any thoughts on that concept?

 

We didn’t necessarily talk about that concept per se, but we did talk about the general idea of this all being a process. And improvement needs to start somewhere, and oftentimes, you need to start small. And your data will be rough and dirty when you start; and that’s not necessarily a bad thing. The real pioneers in this area started out with rough, dirty data, and learned by using that data, and were able to increase their sophistication over time. So that’s part of the issue—bringing data together, oftentimes, you don’t know what data you need, until you start to use it.

 

So what should CIOs, CMIOs and their colleagues be doing right now, to help lead their colleagues forward in all these activities?

 

We really want to encourage more organizations to start doing this type of systems improvement work. There’s more that can be done, so we want to encourage that. And the second message that permeated the entire project was not only that more data should be made available, but also that its use should be built up, encouraging more folks to get into systematic improvement.


How Big Data Is Changing Healthcare


If you want to find out how Big Data is helping to make the world a better place, there’s no better example than the uses being found for it in healthcare.


The last decade has seen huge advances in the amount of data we routinely generate and collect in pretty much everything we do, as well as our ability to use technology to analyze and understand it. The intersection of these trends is what we call “Big Data” and it is helping businesses in every industry to become more efficient and productive.

Healthcare is no different. Beyond improving profits and cutting down on wasted overhead, Big Data in healthcare is being used to predict epidemics, cure disease, improve quality of life and avoid preventable deaths. With the world’s population increasing and everyone living longer, models of treatment delivery are rapidly changing, and many of the decisions behind those changes are being driven by data. The drive now is to understand as much about a patient as possible, as early in their life as possible – hopefully picking up warning signs of serious illness at an early enough stage that treatment is far simpler (and less expensive) than if it had not been spotted until later.

So to take a journey through Big Data in healthcare, let’s start at the beginning – before we even get ill.

Prevention is better than cure


Smart phones were just the start. With apps enabling them to be used as everything from pedometers to measure how far you walk in a day, to calorie counters to help you plan your diet, millions of us are now using mobile technology to try to live healthier lifestyles. More recently, a steady stream of dedicated wearable devices has emerged, such as Fitbit, Jawbone and Samsung Gear Fit, that allow you to track your progress and upload your data to be compiled alongside everyone else’s.


In the very near future, you could also be sharing this data with your doctor, who will use it as part of his or her diagnostic toolbox when you visit with an ailment. Even if there’s nothing wrong with you, access to huge, ever-growing databases of information about the state of the general public’s health will allow problems to be spotted before they occur, and remedies – either medicinal or educational – to be prepared in advance.


This is leading to groundbreaking work, often by partnerships between medical and data professionals, with the potential to peer into the future and identify problems before they happen. One recently formed example of such a partnership is the Pittsburgh Health Data Alliance – which aims to take data from various sources (such as medical and insurance records, wearable sensors, genetic data and even social media use) to draw a comprehensive picture of the patient as an individual, in order to offer a tailored healthcare package.


That person’s data won’t be treated in isolation. It will be compared and analyzed alongside thousands of others, highlighting specific threats and issues through patterns that emerge during the comparison. This enables sophisticated predictive modelling to take place – a doctor will be able to assess the likely result of whichever treatment he or she is considering prescribing, backed up by the data from other patients with the same condition, genetic factors and lifestyle.


Programs such as this are the industry’s attempt to tackle one of the biggest hurdles in the quest for data-driven healthcare: The medical industry collects a huge amount of data but often it is siloed in archives controlled by different doctors’ surgeries, hospitals, clinics and administrative departments.


Another partnership that has just been announced is between Apple and IBM. The two companies are collaborating on a big data health platform that will allow iPhone and Apple Watch users to share data to IBM’s Watson Health cloud healthcare analytics service. The aim is to discover new medical insights from crunching real-time activity and biometric data from millions of potential users.

The way we visit and interact with doctors is likely to change in the near future, too. Telemedicine is a buzzword at the moment, and refers to receiving medical treatment remotely, usually in your own home with the aid of a computer and internet connection. Strictly speaking this can refer to anything as simple as visiting webmd.com and self-diagnosing, but increasingly it will take place as a one-on-one service with a qualified professional. This type of service is offered by Healthtap.


All these interactions will of course leave a data trail, which can be analyzed to provide valuable information into general trends in public health and the way we access healthcare.


Big Data in clinical trials


Once your doctor decides that whatever you are complaining about is best treated by medicine, it is likely that the pills and potions he or she offers have been designed with the help of Big Data, too. Huge amounts of data on applicants allow researchers to pick the best subjects for clinical trials. And recently, data-sharing arrangements between the pharmaceutical giants have led to breakthroughs such as the discovery that desipramine, commonly used as an anti-depressant, has potential uses in treating types of lung cancer.


Personalized medicine is another hot topic in the healthcare field. It involves tailoring medicines to a person’s unique genetic makeup – and is developed by integrating a person’s genetic blueprint and data on their lifestyle and environment, then comparing it alongside thousands of others to predict illness and determine the best treatment.


Big Data is also helping in the fight against the spread of epidemics. In Africa, mobile phone location data is proving highly valuable in efforts to track population movements, which helps to predict the spread of the Ebola virus. This gives insight into the best areas to provide treatment centres and allows movement restrictions to be put in place when necessary. These strategies were pioneered in the wake of the 2010 Haiti earthquake where they were used to help plan disaster relief.


And of course, a Big Data solution has even been proposed for the search for the Holy Grail of medicine – a cure for cancer. Flatiron Health has developed a service called the OncologyCloud, based on the idea that 96% of potentially available data on patients with cancer is not yet analyzed. It aims to take this data gathered during diagnosis and treatment, and make it available to clinicians to further their study.


Privacy and security


Of course, no data is more personal than our medical data, so extremely secure safeguards have to be put in place to make sure the information reaches only those who are meant to see it. Despite that, cyber thieves routinely target medical records, and reportedly earn more money from stolen health data than by pilfering credit card details. In February, the largest ever healthcare-related data theft took place, when hackers stole records relating to 80 million patients from Anthem, the second largest US health insurer. Fortunately they only took identity information such as names and addresses; details of illnesses and treatments were not exposed. However, there is a fear that it is only a matter of time until a security breach on that scale takes place in which patient records are lost. Some experts, such as Dr Leslie Saxon of the University of Southern California Center for Body Computing, have called for the establishment of an international organization in the style of the UN to regulate privacy and security issues relating to health data.


Despite that, the potential for good that Big Data can bring far outweighs the potential for bad. The growing trend towards centralization of medical data will cause concern, but as long as privacy and security can be maintained, it is certain to play a big part in the development of new treatments and add to our growing understanding of how our bodies work, and how we can make sure they carry on working as long as possible.

Pascal Kerhervé's curator insight, June 19, 2015 4:23 AM

Interesting to see how big data and data sharing lead to the discovery of a drug's potential in new indications!


How nine out of ten healthcare pages leak private data

A study by Timothy Libert, a doctoral student at the University of Pennsylvania, has found that nine out of ten visits to health-related web pages result in data being leaked to third parties like Google, Facebook and Experian:

There is a significant risk to your privacy whenever you visit a health-related web page. An analysis of over 80,000 such web pages shows that nine out of ten visits result in personal health information being leaked to third parties, including online advertisers and data brokers.

What Libert discovered is a widespread repetition of the flaw that the US government's flagship Healthcare.gov website was dragged over the coals for in January.

The sites in question use code from third parties to provide things like advertising, web analytics and social media sharing widgets on their pages. Because of the way those kinds of widgets work, their third party owners can see what pages you're visiting.

The companies supplying the code aren't necessarily seeking information about what you're looking at but they're getting it whether they want it or not.

So if you browse the pages about genital herpes on the highly respected CDC (Centers for Disease Control and Prevention) site, you'll also be telling marketing mega-companies Twitter, Facebook and AddThis that you've an interest in genital herpes too.

It happens like this: when your browser fetches a web page, it also fetches any third party code embedded in it directly from the third parties' websites. The requests sent by your browser contain an HTTP header (the annoyingly misspelled 'referer' header) that includes the URL of the page you're looking at.

Since URLs tend to contain useful, human-readable information about what you're reading, those requests can be quite informative.

For example, looking at a CDC page about genital herpes triggers a request to addthis.com like this:

GET /js/300/addthis_widget.js HTTP/1.1
Host: s7.addthis.com
Referer: http://www.cdc.gov/std/Herpes/default.htm
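The inference a third party can draw from that header can be sketched in a few lines of Python. This is purely illustrative: the `topic_from_referer` function is hypothetical, not any real tracker's code, and simply shows that the condition being read is often recoverable from the URL path alone.

```python
from urllib.parse import urlparse

def topic_from_referer(referer: str) -> str:
    """Guess the health topic a visitor was reading from a Referer URL.

    Path segments are human-readable and often name the condition,
    so the second-to-last segment (the directory) is a fair guess.
    """
    path = urlparse(referer).path  # e.g. /std/Herpes/default.htm
    segments = [s for s in path.split("/") if s]
    return segments[-2] if len(segments) >= 2 else path

# The request for addthis_widget.js carries the CDC page URL:
print(topic_from_referer("http://www.cdc.gov/std/Herpes/default.htm"))  # → Herpes
```

No network access is needed to make the point: the sensitive information travels in a header the browser sends automatically.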

The fact that embedded code gets URL data like this isn't new - it's part of how the web is designed and, like it or not, some third parties actually rely on it - Twitter uses it to power its Tailored Suggestions feature for example.

What's new, or perhaps what's changed, is that we're becoming more sensitive to the amount of data we all leak about ourselves and, of course, health data is among the most sensitive.

While a single data point such as one visit to one web page on the CDC site doesn't amount to much, the fact is we're parting with a lot of data and sharing it with the same handful of marketing companies.

We do an awful lot of healthcare research online and we tend to concentrate those visits around popular sites.

A 2012 survey by the Pew Research Center found that 72% of internet users say they looked online for health information within the past year. A fact that explains why one of the sites mentioned in the study, WebMD.com, is the 106th most popular website in the USA and ranked 325th in the world.

The study describes the data we share as follows:

...91 percent of health-related web pages initiate HTTP requests to third-parties.  Seventy percent of these requests include information about specific symptoms, treatment, or diseases (AIDS, Cancer, etc.). The vast majority of these requests go to a handful of online advertisers: Google collects user information from 78 percent of pages, comScore 38 percent, and Facebook 31 percent.  Two data brokers, Experian and Acxiom, were also found on thousands of pages.

If we assume that it's possible to imply an individual's recent medical history from the healthcare pages they've browsed over a number of years then, taken together, those innocuous individual page views add up to something very sensitive.

As the study's author puts it:

Personal health information ... has suddenly become the property of private corporations who may sell it to the highest bidder or accidentally misuse it to discriminate against the ill.

There is no indication or suggestion that the companies Libert named are using the health data we're sharing, but they are at least being made unwitting custodians of it, and that carries some serious responsibilities.

Although there is nothing in the leaked data that identifies our names or identities, it's quite possible that the companies we're leaking our health data to have them already.

Even if they don't though, we're not in the clear.

Even if Google, Facebook, AddThis, Experian and all the others are at pains to anonymise our data, I wouldn't bet against individuals being identified in stolen or leaked data.

It's surprisingly easy to identify named individuals within data sets that have been deliberately anonymised.

For example, somebody with access to my browsing history could see that I regularly visit Naked Security for long periods of time and that those long periods tend to happen immediately prior to the appearance of articles written by Mark Stockley.
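That kind of timing correlation can be mechanized trivially. The sketch below uses entirely invented data and a hypothetical `likely_author` function to show how an "anonymous" visit log can be matched against public publication timestamps.

```python
from datetime import datetime, timedelta

# Invented data: anonymised visit log (visitor id -> session start times)
visits = {
    "user_a": [datetime(2015, 3, 1, 9), datetime(2015, 3, 8, 9)],
    "user_b": [datetime(2015, 3, 2, 14), datetime(2015, 3, 9, 20)],
}
# Public timestamps of articles by one named author
articles = [datetime(2015, 3, 1, 11), datetime(2015, 3, 8, 11)]

def likely_author(visits, articles, window=timedelta(hours=4)):
    """Score visitors by how many sessions fall just before a publication."""
    def score(times):
        return sum(
            1 for a in articles for t in times
            if timedelta(0) <= a - t <= window
        )
    return max(visits, key=lambda v: score(visits[v]))

print(likely_author(visits, articles))  # → user_a
```

With only two data points per visitor the match is already unambiguous; real logs contain far more.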

For a longer and more detailed look at this phenomenon, take a look at Paul Ducklin's excellent article 'Just how anonymous are "anonymous" records?'

It's possible to stop this kind of data leak by setting up your browser so it doesn't send referer headers but I wouldn't rely on that because there are other ways to leak data to third parties.

Instead I suggest you use browser plugins like NoScript, Ghostery or the EFF's own Privacy Badger to control which third party sites you have any interaction with at all.

What the study hints at is bigger than that though - what it highlights is that we live in the era of Big Data and we're only just beginning to understand some of the very big implications of small problems that have been under our noses for years.



Ingredients for streamlining care management | Healthcare IT News


In an era where medicine is highly specialized and different specialties are involved in the care of a patient, intelligent use of information technology is essential to help providers, payers and patients achieve better care management outcomes while simultaneously improving cost and quality of care.

While some entities, such as the Department of Veterans Affairs, have implemented solutions where patients have the ability to view their personal health records online and offline, the majority of the healthcare industry continues to face multiple challenges while implementing care management processes. Care management for large, diverse populations is highly complex and subjective, largely because needs vary for each patient and encounters may span across multiple care settings and plans.

Although a large proportion of health information today is captured electronically, integrated data around patients and their underlying disorders is often not available to providers at the point of care. However, efforts to code clinical content with standard terminology have, to some extent, helped streamline information across applications. There is also a lack of alignment between payers and providers with regard to the cost of care management services and shared-risk arrangements, leading to sub-optimal care quality.

How organizations manage their healthcare data, and what they use this data for, therefore becomes extremely critical to the success of these programs. While technology plays a very important role in areas like decision support, care coordination and population health management, providers and payers are still faced with the challenge of managing both complex people and process challenges.

Effective use of patient data

Patient data adds value across multiple areas such as decision support, planned interventions and medication reconciliation. Examples include:

  • Using CPOE Based Order Sets: Effective clinical decision support tools contained within an order set can help enforce the use of quality measures or meaningful use criteria by providers. An example would be the use of a venous thromboembolism (VTE) risk assessment and subsequent prophylaxis for high risk patients embedded within an order set. Monitoring the prophylaxis regimen based on the VTE risk score can help reduce incidence of venous thrombosis.
  • Clinical Information Exchange: Effective care coordination requires healthcare data to flow seamlessly across all parts of the healthcare ecosystem, including providers, payers and consumers. By aligning incentives, all parties can reduce costs and improve quality of care. By leveraging health information exchanges across radiology, laboratory, perioperative, inpatient and outpatient applications, healthcare organizations have the ability to access patient data in a timely and secure fashion.
  • Medication Reconciliation: This feature is commonly available in electronic health records (EHRs) and can play a very important role in preventing adverse drug reactions. For example, the use of over-the-counter (OTC) medications like acetaminophen may not get recorded in an EHR, but can be retrieved from the pharmacy or the medication management application. This is extremely critical information for a physician, given the hepatotoxic profile of the drug.
  • Patient Registries: A patient registry fed with data from EHR applications can show the treatment prescribed to patients and identify care gaps, based on evidence-based guidelines. Care management programs can use this kind of analysis to highlight areas of improvement, thus positively impacting cost and quality of care.
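The care-gap analysis a patient registry performs can be sketched as follows. The record fields and the guideline (annual HbA1c testing for diabetic patients) are simplified assumptions for illustration, not any particular registry's schema.

```python
from datetime import date

def find_care_gaps(patients, today=date(2015, 8, 1)):
    """Return IDs of diabetic patients overdue for an HbA1c test.

    A gap exists if the patient has no recorded test, or the last
    test is more than a year old (a simplified guideline).
    """
    gaps = []
    for p in patients:
        if "diabetes" not in p["conditions"]:
            continue
        last = p.get("last_hba1c")
        if last is None or (today - last).days > 365:
            gaps.append(p["id"])
    return gaps

# Hypothetical registry rows, as might be fed from an EHR extract:
registry = [
    {"id": "P1", "conditions": ["diabetes"], "last_hba1c": date(2015, 3, 1)},
    {"id": "P2", "conditions": ["diabetes"], "last_hba1c": date(2014, 1, 15)},
    {"id": "P3", "conditions": ["asthma"]},
]
print(find_care_gaps(registry))  # → ['P2']
```

A real registry would draw the same logic from coded EHR data and evidence-based guideline libraries rather than hard-coded rules.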

Promoting patient engagement

Patient education plays a very important role in effective care management. Patients who are actively focused on learning more about their conditions are more likely to participate in initiatives that promote preventive steps and healthy behaviour. The use of patient portals, for instance, allows patients to have anytime, anywhere access to their medical records, along with the ability to schedule appointments, request medication reconciliation, etc.

Processes such as discharge management and preventive care can also provide strong opportunities to increase patient participation. Such processes play a crucial role in keeping readmissions and acute care costs to a minimum. Automated alerts informing patients to make appointments or follow up on lab visits can help prevent potential acute and chronic conditions.

Patients today are increasingly using consumer devices and mobile apps to store and monitor their health parameters. Wearable devices have the ability to change the way health data is collected and managed, and care management processes will soon need to incorporate consumer technology to enhance patient engagement and self management.

Managing Stakeholder Expectations

To drive a sustainable care management program, it is important to demonstrate value to key stakeholders including providers, payers and patients. However, the definition of value differs from one entity to another. For instance, providers and payers often do not see eye to eye on issues such as risk sharing and care management goals. It is essential to build consensus on many of these issues and agree on clearly defined goals around care objectives, processes and costs.

Addressing issues around provider and payer expectations could lead to significant advantages for the healthcare industry as a whole. According to the Centers for Disease Control and Prevention, the government spends nearly three-fourths of its total healthcare expenditure on chronic disease, an area where care management programs can make a large impact. A concerted effort from all major stakeholders to streamline care management objectives and processes would have a very large impact on healthcare cost and quality.


Innovation Pulse: Can platforms foster an ecosystem of IT advancement?


Meet the new linguistic mashup: "plecosystem," meaning a technological ecosystem comprising different platforms.

I first heard the phrase spoken by John Mattison, MD, Kaiser Permanente's chief medical information officer. Platforms are coming from technology leaders such as Apple, IBM, Google and Microsoft.

"The best prospect from one of these players is that it's a platform," said Steve Savas, managing director of clinical services at Accenture. Rather than trying to create apps that target specific conditions or diseases, the tech stalwarts are building platforms on top of which independent developers can build those very targeted apps. "And that's the place they could have the biggest impact."

IBM and Apple, for instance, have been forthright in saying that "the art of the possible" figures prominently into their plans to foster an ecosystem of applications on top of HealthKit, the iOS and Apple hardware platform. And they do stand to profit handsomely if independent software developers give hospital chief information officers reason to buy armadas of iPads.

"I think the innovators would welcome it from Apple, Microsoft or others because then there are standards and the uptake costs for adopting new innovative technology drops precipitously and it can create an ecosystem around it, much like the app ecosystem around the iPhone," Savas said.

Such a plecosystem, of course, presumes that these new platforms go the way of the iPhone and iPad – stimulating waves of original applications – and don't follow the iPod, which devoured the MP3 market and effectively squashed innovation in that realm.

We're already seeing evidence of the potential for this new plecosystem in mobile health. Provider-centric apps geared for Apple's HealthKit have emerged, ranging from American Well's AmWell, which enables live video visits with doctors, to Patient IO, an app with which physicians can send specific treatment reminders directly to patients.

While the Mayo Clinic has aligned with Apple and Epic in a high-profile triptych, other leading providers are also innovating on Apple's platform. The Cleveland Clinic, for example, has a quiver of apps that run on top of both iOS and Android to target specific conditions including cancer, epilepsy, heart disease, sleeping problems and general wellness.

Electronic health record makers are hopping on board, too. In addition to Epic, athenahealth and Cerner revealed plans to integrate with Apple's platform almost immediately after HealthKit launched; over the summer Apple was also rumored to be in talks with Allscripts.

What's more, with industry analyst firms such as Black Book projecting that the provider landscape will see widespread switching to new EHR vendors, more and more EHR makers are going to tune their software for tablets, phablets and smartphones.

Let's just speculate, for instance, that Apple manages to arrange things such that iPads and iPhones, pre-loaded with mobile-optimized software from Allscripts, Epic or a small and controlled choice of EHRs, are sold and supported by IBM. Well, that's a pretty compelling product.

Likewise, Microsoft and Dell could roll together something similar on the Windows platform. Same goes for Google's Android. Apple products, despite their cool factor, are not for everyone, after all.

Regardless of platform – Android, iOS or Windows – as those configurations become available are you really going to buy your next EHR any other way? And what are your patients going to demand, moving forward, when it comes to consumer-facing products?

We're about to find out. For instance, enough people asked for new means of connectivity and more touch points with doctors that Ochsner Health System integrated its Epic EHR within weeks of HealthKit's release, according to Richard Milani, MD, Ochsner's chief clinical transformation officer.

Much like Ochsner, Duke Medicine integrated Epic with HealthKit right away because it has been trying to better understand how patients are doing on a day-to-day basis at home rather than just during the rare office visits when doctors see them.

"Technologies like HealthKit open the door to patients choosing to share data," said Ricky Bloomfield, MD, director of Duke's mobile technology strategy. "That will let doctors make more informed treatment decisions in cooperation with them."

That last point is where the platform surpasses any single digital nanny or wellness app feeding a wearer's health data into the black hole that so many proprietary vendor databases truly are.

Indeed, the overarching plecosystem theme here is the integration of those apps and devices within EHRs and clinical decision support tools, as Mattison told me when I met him at the Healthcare IT News Big Data and Healthcare Analytics Forum in Boston this past November.

That will enable "doctors to lead more with empathy, and to understand values and outcomes to deliver what patients want, not what we think is best for them," he said.

Right now, the plecosystem momentum is powerful – even though Apple, IBM, Google and Microsoft are just getting started – and there's every reason to believe it will get stronger in the months ahead.

"The platform approach," said Accenture's Savas, "will proliferate the innovators."


Jeri Garner's curator insight, March 22, 2015 7:55 AM

EHRs are getting smarter, and consumers need them to be, but they need to be easy and available at the touch of a button. Clinicians and CXOs don't have time to sit around analyzing data. The technology needs to make it easy for those caring for patients to make the right decision at the right time, and to drive improved patient outcomes and financial gains.

Cameron's curator insight, March 25, 2015 8:34 PM

This is a great step for technology, one that will undoubtedly impact the health industry and change the way we live our lives. The author goes in-depth on the possibilities of the platform, letting readers know that this technology is achievable. Innovative ideas like these make millions, even billions, in the long run.


Why 2014 was a groundbreaking year in digital health


2014 was the most exciting year in digital health since 2000, when the first draft of the human genome was sequenced. In February 2001, the Human Genome Project and Craig Venter's Celera Genomics published that landmark sequence. What followed was over a decade of glimmers of the potential for personalized medicine and new insights into disease, but also a realistic tempering of expectations, as is wont to happen in health care.


There is every indication that the next decade will be different — there will be an acceleration in innovation and development of devices to assess our healthy and ailing selves. What happened in 2014? A huge increase in funding and corporate investment in digital health technology (e.g., mobile, social media, genetics, and big data), massive growth in the accessible population, and a surge in open data:

  • Funding for digital health startups, tracked by the accelerator Rock Health since 2011, had grown steadily at double-digit rates until this past year, when records were shattered with $4.1B in funding, more than double the 2013 total.
  • Almost every major consumer technology company announced a large health initiative, notably Google, Apple, and Samsung.
  • Electronic health records and sensors were positioned to join, or actually entered, the "Internet of Things." The partnership between Apple and Epic alone could reach 20 percent of patients entering a health care system in the U.S. An estimated 10M activity-monitor units were sold in 2014, and phones became personal health monitors with the release of Apple's HealthKit and Google's Google Fit.
  • Lastly, the number of large open data sets in health care, and the tools to analyze them, came of age in 2014. For example, the FDA launched openFDA in June 2014, which made it easier to analyze data about adverse events, drug and medical device recalls, and prescription and over-the-counter product labeling, and to access open source code for analyzing this data.
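openFDA, mentioned in the last bullet, exposes its data sets through a simple REST API. As a rough illustration of how a query is assembled (the endpoint path and search field follow openFDA's documented adverse-event schema, but treat the exact query as an example; no network request is made here):

```python
from urllib.parse import urlencode

def build_openfda_query(endpoint: str, search: str, limit: int = 10) -> str:
    """Assemble a query URL for an openFDA endpoint, e.g. drug adverse events."""
    base = "https://api.fda.gov"
    params = urlencode({"search": search, "limit": limit})
    return f"{base}/{endpoint}.json?{params}"

# Example: up to 10 adverse-event reports mentioning aspirin.
url = build_openfda_query("drug/event", 'patient.drug.medicinalproduct:"ASPIRIN"')
```

Fetching `url` with any HTTP client returns JSON that can then be analyzed like any other open data set.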

What does this mean for our health? Funding will help drive innovation, and greater connectivity between patients and the people and systems that deliver their care will help drive efficiencies. Both of these will enable developers to more easily amass huge data sets to advance personalized diagnosis and treatment, and support efforts to prevent disease.

Innovation. The last five years have been a “wild west” of digital health startups with greatly varying business rationales and user adoption. We are now beginning to see some sound inventions that make economic sense and will be used by patients, providers, and systems. Activity monitors will go stealth, such as the contact lens being developed by Google and Novartis that detects blood glucose levels. Quantification of conditions will advance so that we have a better understanding of the level and type of disease we are dealing with, such as Oculogica’s brain injury detection system for concussion and other brain afflictions. (Disclosure: The author is a consultant to Oculogica.)

Efficiencies. One of the most needed, but most difficult to realize, implications of health care systems partnering with technology companies is change that reduces time and costs for health care systems and patients: from the simple check-in at a clinic or hospital, to the number and nature of tests ordered, to smarter follow-up. For example, the first Epic-Apple integration, at Ochsner Health System in Louisiana, estimated a 40 percent decrease in readmissions based on a pilot study with 100 heart failure patients.

Personalization and outcomes. 2014 won't be the end of guidelines and recommendations based on the general population, but we are at a turning point on the way to ubiquitous genomic assessment of individuals and prediction of optimal treatment. The interim step will be larger data sets, ideally combining clinical records and sensor readings, which will allow segmentation of patients by age, gender, stage of disease, and other factors. This enables providers to tailor care to individuals or small segments, rather than large swaths of the population.

Prevention. Prevention is the holy grail in health care with significant impact on health and cost. Programs have been difficult to implement because adoption has been too onerous. Return on investment is difficult to quantify and is often not realized by one hospital or payer. Less obtrusive sensors and more connected systems lower or even remove barriers to adoption. Return on investment will be more quantifiable as individuals, rather than health systems, are followed and quantified.

What do we need in health care? Fewer people who get ill in the first place. When they do, they should receive better care, tailored to who they are and the specifics of their disease, delivered at a lower cost. The challenges notwithstanding, we moved a step closer to this fantastic vision for health care in 2014.



How Healthcare Organizations Can Turn Big Data Into Smart Data


Only a very small percentage of healthcare organizations today seem to be leading the way in healthcare data analytics, while the vast majority are very early in the business intelligence (BI)/analytics process, or haven’t even started. As a result, organizations seem to see big data as something that’s off in the very distant future; for most of them, anything outside of five years is almost nonexistent, says Shane Pilcher, vice president at the Bethel Park, Pa.-based Stoltenberg Consulting.

It is important to remember that big data is more than just a sea of information; it is an opportunity to find insights in new and emerging types of data and content.  So what are hospitals and healthcare organizations forgetting in their paths for eventual success with big data? According to Pilcher, the answer is “smart data.” In the below interview with HCI Assistant Editor Rajiv Leventhal, Pilcher talks about the difference between big data and smart data, strategies for collecting the right data, and advice for physicians in getting on board with the movement.

When you say “smart data,” what do you mean? How does smart data differ from big data?

The data that organizations are collecting today for future big data use is going into a black hole (usually the data warehouse) somewhere. They are happy they're collecting it and preparing for when big data finally comes around to their organization, but if they aren't careful and don't monitor what they're recording, the quality and quantity of the data will not be sufficient when it's needed five years from now. These organizations might think they have five years of historical data to start their analytics, but in reality, the data is often not of the quality, quantity, or even the type that is needed. That's smart data: the step that focuses on the type of data you have, its volume, and its validity. You have to make sure that what you're collecting is what you're expecting.
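The "make sure what you're collecting is what you're expecting" discipline can be made concrete with a routine data audit. A minimal sketch, assuming a hypothetical warehouse schema with a type and a plausibility check per field (the field names and ranges below are invented for illustration):

```python
# Hypothetical schema: field -> (expected type, plausibility check).
EXPECTED = {
    "patient_id": (str, lambda v: len(v) > 0),
    "systolic_bp": (int, lambda v: 60 <= v <= 260),
    "hba1c": (float, lambda v: 3.0 <= v <= 20.0),
}

def audit(records):
    """Count missing, mistyped, and implausible values per field."""
    issues = {f: {"missing": 0, "bad_type": 0, "out_of_range": 0} for f in EXPECTED}
    for rec in records:
        for field, (ftype, plausible) in EXPECTED.items():
            if field not in rec or rec[field] is None:
                issues[field]["missing"] += 1
            elif not isinstance(rec[field], ftype):
                issues[field]["bad_type"] += 1
            elif not plausible(rec[field]):
                issues[field]["out_of_range"] += 1
    return issues

sample = [
    {"patient_id": "P1", "systolic_bp": 120, "hba1c": 6.1},
    {"patient_id": "P2", "systolic_bp": 999, "hba1c": None},  # implausible BP, missing lab
]
report = audit(sample)
```

Run regularly, a report like this catches a collection problem years before the data is needed, rather than at analysis time.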

Do healthcare organizations recognize this need?

Big data is a common theme with CIOs at healthcare organizations everywhere; they know it's coming. However, there are CEOs who hear about "big data" at conferences and have no idea what it is, yet still come back and tell their CIOs that they "have to be doing big data." And thus it's left in the lap of CIOs. But CIOs have Stage 2 of meaningful use and ICD-10 coming [for many providers, Stage 2 is here already], so they are not in the best position to be dealing with big data. So for the most part, except for about 5 percent of organizations out there, they tend to move it to the sidelines. It's like looking at the side-view mirror on your car and not seeing the message, "objects in mirror are closer than they appear." They see big data reflected, but it's a lot closer than they think. For places with limited resources and time, this is something being pushed aside until they can get to it down the road.

How can organizations better ensure they are collecting the right quantity and quality of data?

First, you need to start developing your strategy now. The standard data models and approaches other industries use don't necessarily translate to healthcare IT. The amount of data, the data structures, and the data models are off the charts compared to even something as large as automotive manufacturing; the complexity isn't comparable. You have to develop as you go. The biggest thing I can suggest, as this industry develops and our tools grow, is to build peer networks with other healthcare leaders who are already further down the road than you. About 5 percent of healthcare organizations are currently in "stage two" of the data maturity model, where they can start looking at predictive and prescriptive approaches to data. Those on the forefront of data analysis and intelligence are going to be critical to the rest of the industry following along. So learn from and use your peers.

And again, the quality of the data is critical. Organizations often think that because data collection was initiated, implemented, and working, they can turn to the next project, assuming the data will be there in the warehouse when they're ready. But when it gets closer to the time to use the data, they don't have the quantity they thought they had. If you are collecting the wrong information, or the information is incorrect, your analysis will produce wrong results and you won't even know it. Decisions based on that inaccurate data and flawed analysis could be devastating.

So you also need to assess the data regularly and ensure that what you think you're collecting is actually what you're getting. Then you can depend on the accuracy of that data when it's time to start analyzing. Analyzing unstructured data for trends is very difficult, borderline impossible. Yet about 80 percent of hospitals expect to use unstructured data in their data warehouse. Turning that data into structured data, or finding a tool that can do so accurately, becomes a huge push. Organizations that are not prepared for that will be racing against time at the last minute.
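As a toy illustration of what "turning unstructured data into structured data" means in practice (production systems use full NLP pipelines rather than regular expressions; the note text and field names here are invented):

```python
import re

def extract_vitals(note: str) -> dict:
    """Pull a few structured fields out of a free-text clinical note.
    A regex sketch of the idea only; real extraction is far more robust."""
    out = {}
    bp = re.search(r"\bBP\s*(\d{2,3})/(\d{2,3})\b", note)
    if bp:
        out["systolic"], out["diastolic"] = int(bp.group(1)), int(bp.group(2))
    wt = re.search(r"\bweight\s*(\d{2,3}(?:\.\d+)?)\s*kg\b", note, re.IGNORECASE)
    if wt:
        out["weight_kg"] = float(wt.group(1))
    return out

note = "Patient seen today. BP 142/88, weight 81.5 kg, reports good adherence."
fields = extract_vitals(note)
```

Once notes are reduced to typed fields like these, they can flow into the same warehouse checks and trend analyses as any structured source.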

You need to trust the accuracy of your data. You know that your electronic health record (EHR) is collecting certain data and dumping it into the data warehouse. But is anything happening during that transfer that changes the data in any way? Is it remaining accurate? Was it accurate to begin with? I wouldn't say there is an issue of incorrect data in EHRs, but people can't say with 100 percent certainty, "Yes, it's ready to be analyzed."

What are some other challenges organizations are facing with big data?

Time and money are the two big ones, of course. Everyone has a limited amount of time, with more projects and initiatives than time to do them in. And dollars are tight for healthcare organizations, so the things that tend to be more in the future get less priority when it comes to budgeting than things needed for today.

But staffing is also a problem: having trained staff who know how to analyze data and how to approach intelligence processes can be challenging. A CHIME CIO survey from September 2012 found that 67 percent of healthcare CIOs were reporting IT staff shortages. The issue is that organizations either didn't have enough staffers or didn't have anyone internally with that skill set. At the end of the day, almost all organizations are having problems building a BI department.

What is your advice to helping physicians get on board with big data?

This definitely adds to the challenge for physicians. In many cases, they view EHRs as taking up more of their time and adding to their workload rather than making them more efficient. Often, that is accurate: EHRs do not save you time, not at the beginning. That's why physicians tend to be resistant; they understand the need for meaningful use dollars, and that has pushed them in this direction, even though they have been reluctant to go there in the past.

But the day we can take that information and turn it into a tool that helps them take better care of their patients, creating better outcomes at lower cost, all of their effort and work will pay off. That is why hospitals that implement BI initiatives have to focus not just on the financials but on patient care strategies and initiatives. Not until then do doctors see a purpose for their extra work and start to get on board.




Why Health Care May Finally Be Ready for Big Data


There has been a lot of buzz about “big data” over the last few years. This is hardly surprising, given the sheer scale of the data sets that are being produced daily. A total of 2.5 quintillion bytes of data were generated every day in 2012 alone, and it is estimated that as much data is now generated in just two days as was created from the dawn of civilization until 2003. While other industries have been far more successful at harnessing the value from large-scale integration and analysis of big data, health care is just getting its feet wet. Yes, providers and payers are increasingly investing in their analytical capabilities to help them make better sense of the changing health care environment, but it is still early days. Here are some key elements that are crucial for health care to truly capture the value of big data.

Integrating data. Businesses and even political campaigns have successfully linked disparate data sources to learn “everything possible” about their citizens, customers, and clients, and apply advanced analysis and computation to modify existing strategies or create new ones. Similarly, leveraging heterogeneous datasets and securely linking them has the potential to improve health care by identifying the right treatment for the right individual or subgroup.

One of the first challenges we face is the lack of standardization of health care data. The vast amount of data generated and collected by a multitude of agents in health care today comes in many different forms: insurance claims, physician notes within the medical record, images from patient scans, conversations about health in social media, and information from wearables and other monitoring devices.

The data-collecting community is equally heterogeneous, making the extraction and integration a real challenge. Providers, payers, employers, disease-management companies, wellness facilities and programs, personalized-genetic-testing companies, social media, and patients themselves all collect data.


Integration of data will require collaboration and leadership from the public and private sectors. The National Institutes of Health recently launched a Big Data to Knowledge Initiative (BD2K) to enable the biomedical research community to better access, manage, and utilize big data. Some early work is also being pursued through large collaborations such as the National Patient-Centered Research Network (PCORnet) and the consortium Optum Labs, a research collaborative that has brought together academic institutions, health care systems, provider organizations, life sciences companies, and membership and advocacy organizations.

Generating new knowledge. One of the earliest uses of big data to generate new insights has been around predictive analytics. In addition to the typical administrative and clinical data, integrating additional data about the patient and his or her environment might provide better predictions and help target interventions to the right patients. These predictions may help identify areas to improve both quality and efficiency in health care in areas such as readmissions, adverse events, treatment optimization, and early identification of worsening health states or highest-need populations.
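At its core, a predictive model like the readmission example combines patient features into a probability. A minimal sketch using a logistic function with invented, hand-set weights (a real model would be fit on historical admissions data, and the feature names here are hypothetical):

```python
import math

# Illustrative coefficients only; production models are learned, not hand-written.
WEIGHTS = {"prior_admissions": 0.45, "num_chronic_conditions": 0.30, "age_over_65": 0.6}
BIAS = -2.0

def readmission_risk(patient: dict) -> float:
    """Logistic score in (0, 1) combining a few admission-level features."""
    z = BIAS + sum(WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

low = readmission_risk({"prior_admissions": 0, "num_chronic_conditions": 1, "age_over_65": 0})
high = readmission_risk({"prior_admissions": 4, "num_chronic_conditions": 3, "age_over_65": 1})
```

Enriching the feature dictionary with environmental and patient-reported data is exactly the "additional data" opportunity the paragraph describes.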

Equally critical is the focus on new methods. One reason health care lags behind other industries is that it has relied too long on standard regression-based methods, which have their limits. Many other industries, notably retail, have long been leveraging newer methods such as machine learning and graph analytics to gain new insights. But health care is catching up.

For example, hospitals are starting to use graph analytics to evaluate the relationship across many complex variables such as laboratory results, nursing notes, patient family history, diagnoses, medications, and patient surveys to identify patients who may be at risk of an adverse outcome. Better knowledge and efficient assessment of disparate facts about patients at risk could mean the difference between timely intervention and a missed window for treatment.
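The graph-analytics idea, relating patients through shared clinical findings, can be sketched with a toy bipartite graph (the patient IDs and findings below are invented; a real graph store would hold labs, notes, diagnoses, medications, and survey responses):

```python
from collections import defaultdict

# Edges link patients to clinical findings.
edges = [
    ("patA", "low_sodium"), ("patA", "tachycardia"),
    ("patB", "tachycardia"), ("patB", "fever"),
    ("patC", "fever"),
]
adverse = {"patA"}  # patients known to have had an adverse outcome

factor_to_patients = defaultdict(set)
patient_to_factors = defaultdict(set)
for p, f in edges:
    factor_to_patients[f].add(p)
    patient_to_factors[p].add(f)

def shared_risk(patient: str) -> int:
    """Count findings this patient shares with any adverse-outcome patient."""
    others = adverse - {patient}
    return sum(
        1 for f in patient_to_factors[patient]
        if factor_to_patients[f] & others
    )
```

A rising shared-risk count across many variables is the kind of signal that could flag a patient for timely intervention.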

Natural language processing and other artificial intelligence methods have also become more mainstream, though they are mostly useful in harvesting unstructured text data that are found in medical records, physician notes, and social media. Mayo Clinic teamed up with the IBM cognitive computer known as Watson, which is being trained to analyze clinical-trial criteria in order to determine appropriate matches for patients. As the artificial-intelligence computer gets more information and learns about matching patients to studies, Watson may also help locate patients for hard-to-fill trials such as those involving rare diseases.

Translating knowledge into practice. While standardized data collection and new analytical methods are critical to the big data movement, practical application will be key to its success. This is an important cultural challenge for both those who generate and those who consume the new knowledge. Users such as physicians, patients, and policy makers need to be engaged right at the beginning, and the entire research team should have a clear idea about how the new knowledge might be translated into practice.

The insights from big data have the potential to touch multiple aspects of health care: evidence of safety and effectiveness of different treatments, comparative outcomes achieved with different delivery models, and predictive models for diagnosing, treating, and delivering care. In addition, these data may enhance our understanding of the effects of consumer behavior, which in return may affect the way companies design their benefits packages.

Translating these new insights into practice will necessitate a shift in current practices. Relying on the evidence from randomized controlled trials has been a gold standard for making practice-changing decisions. While in many cases such trials may be necessary and justified, it will be critical to identify where the evidence generated by big data is adequate to change practice. In other cases, big data may generate new paradigms for increasing the efficiency of randomized clinical trials.

For example, as new knowledge is gained about the comparative benefits of second-line agents for treatment of diabetes, policy makers and expert groups may consider using this information to develop guidelines or recommendations or to guide future randomized trials.

Optum Labs: a start to achieving these goals. A proverbial village is required to make sense of the messy medley of data points that is big data in health care today, to integrate and manage them in a safe and secure environment, and to translate the findings into practice. One such village is Optum Labs, a research collaborative that has brought together administrative claims data from more than 100 million patients and electronic medical records from over 30 million patients, along with researchers, patient advocates, policy makers, providers, payers, and pharmaceutical and life sciences companies. The vision of Optum Labs is to boost the generation of high-quality comparative-effectiveness evidence, accelerate the translation of knowledge into new predictive models and tools, and improve the delivery of care.

For almost two years now, researchers from Optum Labs's 15 partner organizations have been working in this environment. The knowledge generation within Optum Labs is in its infancy, but early signs point to a significant potential to change health care, with more than 100 studies currently under way. Optum Labs's priority is to enable clinicians to connect insights from big data directly to the care of an individual patient. This is particularly important for complex patients whose care requires careful prioritization (e.g., people with comorbidities, or multiple chronic conditions). For example, researchers are conducting a longitudinal analysis of variation in care for hip and knee surgery in patients with diabetes and obesity. By analyzing data on their treatment outcomes, we may be able to learn something that will help us create better protocols for caring for them. The hope is to gain insights about what works and for whom. Ultimately, what should drive this initiative and others is addressing the complexities, unmet needs, and challenges facing patients.




Where big data falls short


Big data and analytic tools have not yet been harnessed to bring meaningful improvement to the healthcare industry.

That's according to a new report from the National Quality Forum outlining the challenges to making health data and analytics more usable and available in real time for providers and consumers.


Whereas big data has supported improvement in certain settings, such as reducing ventilator-associated pneumonia, data analytics has been largely overlooked in the area of healthcare costs, even though this data can inform and assess efforts to improve the affordability and quality of care.


What's more, effective data management is necessary for the success of other incentives to enhance care, such as payment programs, as providers need timely information to understand where to improve and track their progress.


NQF found multiple challenges to making better use of health information, such as interoperability and linking disparate data sources, leveraging data for benchmarking, providing the ability to gather data directly from patients and de-identify it to generate knowledge, and the need to ensure that the data itself is trustworthy.
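One common building block for the "gather data directly from patients and de-identify it" challenge is replacing direct identifiers with a salted one-way hash, so records from different sources can still be linked without exposing who the patient is. A minimal sketch (the field names are illustrative, and real de-identification, e.g. under HIPAA's Safe Harbor rules, must handle far more than direct identifiers):

```python
import hashlib

def deidentify(record: dict, salt: str) -> dict:
    """Replace direct identifiers with a salted SHA-256 token.
    Records hashed with the same salt remain linkable to each other."""
    out = dict(record)
    for field in ("patient_id", "ssn", "name"):
        if field in out:
            token = hashlib.sha256((salt + str(out[field])).encode()).hexdigest()[:16]
            out[field] = token
    return out

# Two records for the same patient from different feeds stay linkable.
a = deidentify({"patient_id": "12345", "hba1c": 6.1}, salt="study-7")
b = deidentify({"patient_id": "12345", "systolic_bp": 128}, salt="study-7")
```

Keeping the salt secret is what prevents an outsider from re-deriving tokens from known patient IDs.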


Then there's the matter of electronic health records software. "While greater EHR adoption is positive, these records do not contain all of the data needed for improvement," the report said. NQF pointed to operational or clinical data not captured in an EHR, such as the time a nurse spends caring for a particular patient or the time to transfer a patient from surgery to a post-operative recovery unit to a hospital room, as common examples.


The report noted there have been many ongoing attempts to develop interoperability between EHRs and clinical data sources recording patients' experiences and outcomes. Beyond linking healthcare data, however, "there is a need to learn from data spanning other determinants of health, as the most significant and sustained individual and population health improvements occur when healthcare organizations collaborate with community or public health organizations."


NQF also highlighted a widespread need to appreciate the value of nonfinancial incentives, such as peer and public reporting, in improvement initiatives.


"Overall, there was a desire to move from a retrospective approach of quality metrics and analytics to one that uses real-time data to identify potential challenges and gauge progress," the report said.


The report was supported by the Peterson Center on Healthcare and the Gordon and Betty Moore Foundation; the initiative was spurred by a 2014 report by the President's Council of Advisors on Science and Technology that called for systems engineering approaches to improve healthcare quality and value.


Data-based medicine: The end of evidence-based medicine?

As all kinds of information are collected about every aspect of our lives, the data generated at this exorbitant rate can lead to advancements in research and health care. That is the idea behind “big data” and its disruptive benefits for the health care industry. The term encompasses a vast, searchable data collection that can be mined for relevant information to quickly identify trends. Like all other disruptive innovations, the focus is speed. However, medicine, unlike most industries, has never been quick to adapt to trends.


The history of medicine started out based on religious or spiritual theories. Medical decision-making was highly subjective, and a few thousand years later, advancements in clinical judgment were still based on individual preference. Today, we would consider this an example of clinical-based medicine: practice based on individual or group observations. It wasn't until later in the 20th century that doctors and health care researchers began to use the limited data that had been collected to evaluate the effectiveness of individual patient treatments. Epidemiological methods were then devised to track explicit evidence of the effectiveness of clinical practice guidelines and policies. This disruption in medicine would lead to policies and practice guidelines being anchored in experimental evidence gathered from data rather than expert opinions.

Big data is a collection of data too large to manage by traditional means, and it is a seismic disruption in the field of medicine.  One of the first published instances of big data affecting a doctor’s decision-making came in 2011 at Lucile Packard Children’s Hospital at Stanford, where Dr. Frankovich searched her medical records of pediatric lupus patients to decide whether to prescribe anticoagulant medication.  Because there were no published guidelines and only scant literature on the subject, she analyzed the patterns revealed in her own collection of medical charts.

Lloyd Marino, CEO of Avetta Inc., a global strategy company, says big data is not a quick fix for immediate answers, especially in health care.  Unlocking the value of big data requires an ongoing process of the three A’s: automation, analytics, and action.

Automation sorts through and cleanses data from numerous sources.  Normalizing the collected data allows it to be integrated with current health care models on a continuous basis to produce real-time outcomes.  For example, medical records hold dozens, if not hundreds, of data points per patient and can be routinely updated inside an electronic medical record.  Beyond just collecting information, medical records can be combed by machine-learning systems for patterns and filtered by disease, risk factor, or outcome.
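As a minimal sketch of that filtering step, the toy example below stores normalized records as dictionaries and filters them by any field; the field names and data are invented for illustration and are not a real EMR schema:

```python
# Hypothetical normalized EMR records; field names are illustrative, not a real schema.
records = [
    {"patient_id": 1, "diagnosis": "lupus", "on_anticoagulant": False, "age": 14},
    {"patient_id": 2, "diagnosis": "asthma", "on_anticoagulant": False, "age": 9},
    {"patient_id": 3, "diagnosis": "lupus", "on_anticoagulant": True, "age": 16},
]

def filter_records(records, **criteria):
    """Return the records whose fields match every given criterion."""
    return [r for r in records
            if all(r.get(field) == value for field, value in criteria.items())]

# Filter by disease, the way a chart review might begin.
lupus_patients = filter_records(records, diagnosis="lupus")
print(len(lupus_patients))  # 2
```

Once records are normalized into a shape like this, the same filter works for risk factors or outcomes by swapping the criteria.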

However, machine-learning algorithms for auto-generated data still need to be built and mastered.  Big data analytics digs deeper into the stream of health care information and finds solutions undiscoverable by traditional search, moving beyond merely managing data to mastering it.  Analytics does not just offer insight; it can help create better, more efficient hospital infrastructure and streamline drug testing.

Most importantly, action must be deployed wisely and rapidly to achieve a high return on investment (ROI), which would speed the pharmaceutical industry’s notoriously slow pace.  Success also depends on how well these solutions align with key health care objectives, how easy they are for practitioners and other health care workers to use, and how well they integrate with existing protocols and procedures.

Evidence-based medicine is facing a disruptive force, but it will never be fully uprooted, much as clinical-based medicine continues to exist today.  Big data has the advantages of size and speed over evidence-based medicine, but big data alone will not solve the health care problems that individual patients and communities face.  Proper implementation of automation, analytics, and action can help leverage big data for new solutions to health care models.

jean marc mosselmans's curator insight, March 22, 2015 8:02 AM

The major danger is forgetting the difference between observational studies and intervention studies. Modern medicine is full of very promising observational studies and hypotheses that, unfortunately, are not confirmed by interventional double-blind studies.


Cerner big data platform gets new client


Truman Medical Centers and the University of Missouri-Kansas City's Center for Health Insights have teamed up on a new initiative that will harness data from electronic medical records, de-identifying it and digesting it into a database that can help inform better care decisions.

Both organizations will partner with EMR giant Cerner to leverage its Health Facts data warehouse platform to drive the analytics initiative. Health Facts extracts data from both clinical and financial IT systems, de-identifies it, standardizes terms by mapping them to a common nomenclature, and can generate reports on adverse drug events and outcomes.
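The two preparation steps described — dropping direct identifiers and mapping local terms to a shared vocabulary — can be sketched in miniature. The field names, identifier list, and term mapping below are invented for illustration; they are not Health Facts' actual schema or logic:

```python
# Invented local-to-standard term mapping and identifier list (illustrative only).
TERM_MAP = {"MI": "myocardial infarction", "HTN": "hypertension"}
IDENTIFIERS = {"name", "ssn", "mrn"}  # direct identifiers to strip

def deidentify_and_standardize(record):
    """Drop direct identifiers, then map local diagnosis terms to a common vocabulary."""
    clean = {k: v for k, v in record.items() if k not in IDENTIFIERS}
    clean["diagnosis"] = TERM_MAP.get(clean["diagnosis"], clean["diagnosis"])
    return clean

row = {"name": "Jane Doe", "mrn": "12345", "diagnosis": "HTN", "charge": 1520.0}
print(deidentify_and_standardize(row))
# {'diagnosis': 'hypertension', 'charge': 1520.0}
```

Standardizing terms this way is what makes records from different source systems comparable in a shared warehouse.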

The platform, as Cerner officials described it, will allow the two-hospital TMC to take its current clinical and financial data and transform it into a usable form that can be leveraged to improve patient safety and care outcomes. What's more, TMC officials anticipate the data analysis can also be used to reduce specific health disparities and lower costs for certain procedures.

Specifically, the platform will allow TMC officials to use the generated reports to compare their organization's performance with that of other clients who use the warehouse. The warehouse already includes millions of EMR records from inpatient, ED and outpatient visits from patients nationwide.

"The centerpiece of this partnership provides tools to accelerate clinical and translational research and ultimately provide better health outcomes," said Lawrence Dreyfus, UMKC vice chancellor for Research and Economic Development, in a Feb. 18 press statement. "We couldn't be more excited about the prospects that this partnership holds for healthcare decisions that ultimately improve care and reduce costs."


Digital health in 2015: What's hot and what's not?


I think it’s fair to say that digital health is warming up. And not just in one area. The sheer number and variety of trends are almost as impressive as the heat trajectory itself. The scientist in me can’t help but make the connection to water molecules in a glass — there may be many of them, but not all have enough kinetic energy to ascend beyond their liquid state. The majority are doomed to sit tight and get consumed by a thirsty guy with little regard for subtle temperature changes.


With this in mind, let’s take a look at which digital health trends seem poised to break out in 2015, and which may be fated to stay cold in the glass. As you read, keep in mind that this assessment is filtered through my perspective of science, medicine, and innovation. In other words, a “cold” idea could still be hot in other ways.

Collaboration is hot, silos are not. Empowerment for patients and consumers is at the heart of digital health. As a result, the role of the doctor will shift from control to collaboration. The good news for physicians is that the new and evolved clinician role that emerges will be hot as heck. The same applies to the nature of innovation in digital health and pharma. The lone wolf is doomed to fail, and eclectic thinking from mixed and varied sources will be the basis for innovation and superior care.

Scanners are hot, trackers are not. Yes, the tricorder will help redefine the hand-held tool for care. From ultrasound to spectrometry, the rapid and comprehensive assimilation of data will create a new “tool of the trade” that will change the way people think about diagnosis and treatment. Trackers are yesterday’s news stories (and they’ll continue to be written), but scanners are tomorrow’s headlines.

Rapid and bold innovation is hot, slow and cautious approaches are not. Innovators are often found in basements and garages where they tinker with the brilliance of what might be possible. Traditionally, pharmaceutical companies have worked off of a different model, one that offers access and validation with less of the freewheeling spirit that thrives in places like Silicon Valley. Looking ahead, these two styles need to come together. The result, I predict, will be a digital health collaboration in which varied and conflicting voices build a new health reality.

Tiny is hot, small is not. Nanotechnology is a game-changer in digital health. Nanobots, among other micro-innovations, can now be used to continuously survey our bodies to detect (and even treat) disease. The profound ability for this technology to impact care will drive patients to a new generation of wearables (scanners) that will offer more of a clinical imperative to keep using them.

Early is hot, on-time is not. Tomorrow’s technology will fuel both rapid detection and the notion of “stage zero disease.” Health care is no longer about the early recognition of overt signs and symptoms, but rather about microscopic markers that may preempt disease at the very earliest cellular and biochemical stages.

Genomics are hot, empirics are not. Specificity — from genomics to antimicrobial therapy — will help improve outcomes and drive costs down. Therapy will be guided less and less by statistical means and population-based data and more and more by individualized insights and agents.

AI is hot, data is not. Data, data, data. The tsunami of information has often done more to paralyze us than provide solutions to big and complex problems. From wearables to genomics, that part isn’t slowing down, so to help us manage it, we’ll increasingly rely on artificial intelligence systems. Keeping in mind some of the inherent problems with artificial intelligence, perhaps the solution is less about AI in the purest sense and more around IA — intelligence augmented. Either way, it’s inevitable and essential.

Cybersecurity is hot, passwords are not. As intimate and specific data sets increasingly define our reality, protection becomes an inexorable part of the equation. Biometric and other more personalized and protected solutions can offer something that passwords just can’t.

Staying connected is hot, one-time consults are not. Medicine at a distance will empower patients, caregivers, and clinicians to provide outstanding care and will create significant cost reductions. Telemedicine and other online engagement tools will emerge as a tool for everything from peer-to-peer consultation in the ICU to first-line interventions.

In-home care is hot, hospital stays are not. “Get home and stay home” has always been the driving care plan for the hospitalized patient. Today’s technology will help provide real-time and proactive patient management that can put hospital-quality monitoring and analytics right in the home. Connectivity among stakeholders (family, EMS, and care providers) offers both practical and effective solutions to care.

Cost is hot, deductibles are not. Cost will be part of the “innovation equation” that will be a critical driver for market penetration. Payers will drive trial (if not adoption) by simply nodding yes for reimbursement. And as patients are forced to manage higher insurance deductibles, options to help drive down costs will compete more and more with efficacy and novelty.

Putting it all together: What will it take to break away in 2015?

Beyond speed lies velocity, a vector that has both magnitude and direction. Smart innovators realize that their work must be driven by a range of issues from compatibility to communications. Only then can they harness the speed and establish a market trajectory that moves a great idea in the right direction. Simply put, a great idea that doesn’t get noticed by the right audience at the right time is a bit like winking to someone in the dark. You know what you’re doing, but no one else does.



4 keys to achieving the Holy Grail of big data


About every six months, one of the young, brilliant, and enthusiastic "big data" people who hang out in the MIT neighborhood near Kendall Square or in the Bay Area of California comes to me for advice on how to break into the healthcare market. He or she is inevitably prepared to deliver the Holy Grail to a waiting healthcare world: a real-time decision support system that will codify the world of evidentiary medicine and help clinicians reduce length of stay, the number of unnecessary readmissions, and the cost of care. The person has sometimes, but not always, set up a "comparable" company in another field, analyzing big data and improving industrial processes, and s/he has often sold that business for a handsome sum to a multinational corporation or private equity firm.

I love meeting with these young people. They are true believers with no shortage of confidence, and they are fun to hang out with. So, I'm a bit reluctant to offer this blog post because I am going to set forth my advice in writing--knowing that I might perhaps make future personal meetings redundant. (But I'm hoping they'll still call.)

To obtain the Holy Grail, you need to satisfy the following interrelated conditions:

  1. A sophisticated data management system that, indeed, provides clinical advice that will be accurate in the vast majority of cases;
  2. A plan to integrate that system into the various support systems that exist in a hospital so that it can be used in real time, i.e., as patient care is being delivered;
  3. A plan to convince doctors and others to use the system;
  4. A strategy for getting the procurement approved by the various high-ranking clinical and administrative officials in the hospital.

On the first point, what level of accuracy do you think is required to offer a decision support system that could have the confidence of doctors? How would you test that accuracy?
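One concrete way to answer the accuracy question is to score the system's recommendations against a set of expert-adjudicated cases and report sensitivity and specificity, the metrics clinicians tend to ask for. A minimal sketch with made-up labels (not real clinical data):

```python
# Made-up example: system recommendations vs. expert-adjudicated ground truth (1 = treat).
predictions  = [1, 0, 1, 1, 0, 0, 1, 0]
ground_truth = [1, 0, 0, 1, 0, 1, 1, 0]

def confusion(preds, truth):
    """Count true/false positives and negatives for binary labels."""
    tp = sum(p == t == 1 for p, t in zip(preds, truth))
    tn = sum(p == t == 0 for p, t in zip(preds, truth))
    fp = sum(p == 1 and t == 0 for p, t in zip(preds, truth))
    fn = sum(p == 0 and t == 1 for p, t in zip(preds, truth))
    return tp, tn, fp, fn

tp, tn, fp, fn = confusion(predictions, ground_truth)
sensitivity = tp / (tp + fn)  # share of truly positive cases the system caught
specificity = tn / (tn + fp)  # share of truly negative cases correctly left alone
print(sensitivity, specificity)  # 0.75 0.75
```

Whether 75% is good enough is exactly the kind of threshold question a vendor would need to settle with clinicians before any pilot.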

On the second point, how long will it take to invent the interface between your system and the variety of clinical and administrative information systems that exist in your targeted hospital(s)? Think about it this way: How likely do you think it will be that you will get the time and attention of the CIO to install your system, as s/he is a bit busy with Meaningful Use projects?

On the third point, well, you know the issues. Please don't think that because you've satisfied #1 above, adoption by MDs is assured.

On the final point, who within your targeted hospital(s) will carry the water for this project in the strategic and budgetary reviews with the CIO, CNO, CMO, CFO, and CEO? Lots of people in a hospital can say "No."  Which of these people will say "Yes" and become your internal advocates?

If you can figure this all out and stay capitalized long enough to make sales and bring in revenue in a timely fashion, all will be well . . . if your approach truly offers a comparative advantage to the dozens of others trying to enter this arena.



How Big Data Will Transform Our Economy And Our Lives In 2015


The great Danish physicist Niels Bohr once observed that “prediction is very difficult, especially if it’s about the future.” Particularly in the ever-changing world of technology, today’s bold prediction is liable to prove tomorrow’s historical artifact. But thinking ahead about wide-ranging technology and market trends is a useful exercise for those of us engaged in the business of partnering with entrepreneurs and executives that are building the next great company.

Moreover, let’s face it: gazing into the crystal ball is a time-honored, end-of-year parlor game. And it’s fun.

So in the spirit of the season, I have identified five big data themes to watch in 2015. As a marketing term or industry description, big data is so omnipresent these days that it doesn’t mean much. But it is pretty clear that we are at a tipping point. The global scale of the Internet, the ubiquity of mobile devices, the ever-declining costs of cloud computing and storage, and an increasingly networked physical world create an explosion of data unlike anything we’ve seen before.

The creation of all of this data isn’t as interesting as the possible uses of it. I think 2015 may well be the year we start to see the true potential (and real risks) of how big data can transform our economy and our lives.

Big Data Terrorism

The recent Sony hacking case is notable because it appears to be the first state-sponsored act of cyber-terrorism in which a company was successfully threatened under the glare of the national media. I’ll leave it to the pundits to argue whether Sony’s decision to postpone releasing an inane farce was prudent or cowardly. What’s interesting is that the cyber terrorists caused real fear at Sony by publicly releasing internal enterprise data — including salaries, email conversations and information about actual movies.

Every Fortune 2000 management team is now thinking: Is my data safe? What could happen if my company’s data is made public and how could my data be used against me? And of course, security software companies are investing in big data analytics to help companies better protect against future attacks.

Big Data Becomes a Civil Liberties Issue

Data-driven decision tools are not only the domain of businesses but are now helping Americans make better decisions about the school, doctor or employer that is best for them. Similarly, companies are using data-driven software to find and hire the best employees or choose which customers to focus on.

But what happens when algorithms encroach on people’s privacy, their lifestyle choices and their health, and get used to make decisions based on their race, gender or age — even inadvertently? Our schools, companies and public institutions all have rules about privacy, fairness and anti-discrimination, with government enforcement as the backstop. Will privacy and consumer protection keep up with the fast-moving world of big data’s reach, especially as people become more aware of the potential encroachment on their privacy and civil liberties?

Open Government Data

Expect the government to continue to make government data more “liquid” and useful – and for companies to put the data to creative use. The public sector is an important source of data that private companies use in their products and services.

Take Climate Corporation, for instance. Open access to weather data powers the company’s insurance products and Internet software, which helps farmers manage risk and optimize their fields. Or take Zillow as another example. The successful real estate media site uses federal and local government data, including satellite photography, tax assessment data and economic statistics to provide potential buyers a more dynamic and informed view of the housing market.

Personalized Medicine

Even as we engage in a vibrant discussion about the need for personal privacy, “big data” pushes the boundaries of what is possible in health care. Whether we label it “precision medicine” or “personalized medicine,” these two aligned trends — the digitization of the health care system and the introduction of wearable devices — are quietly revolutionizing health and wellness.

In the not-too-distant future, doctors will be able to create customized drugs and treatments tailored to your genome, your activity level, and your actual health. After all, how the average patient responds to a particular treatment regimen isn’t that relevant; I want the single best course of treatment (and outcome) for me.

Health IT is already a booming space for investment, but clinical decisions are still mostly based on guidelines, not on hard data. Big data analytics has the potential to disrupt the way we practice health care and change the way we think about our wellness.

Digital Learning, Everywhere

With over $1.2 trillion spent annually on public K-12 and higher education, and with student performance failing to meet the expectations of policy makers, educators and employers are still debating how to fix American education. Some reformers hope to apply market-based models, with an emphasis on testing, accountability and performance; others hope to elevate the teaching profession and trigger a renewed investment in schools and resources.

Both sides recognize that digital learning, inside and outside the classroom, is an unavoidable trend. From Massive Open Online Courses (MOOCs) to adaptive learning technologies that personalize the delivery of instructional material to the individual student, educational technology thrives on data. From names that you grew up with (McGraw Hill, Houghton Mifflin, Pearson) to some you didn’t (Cengage, Amplify), companies are making bold investments in digital products that do more than just push content online; they’re touting products that fundamentally change how and when students learn and how instructors evaluate individual student progress and aid their development. Expect more from this sector in 2015.

Now that we’ve moved past mere adoption to implementation and utilization, 2015 will undoubtedly be big data’s break-out year.


Irina Donciu's curator insight, January 15, 2015 4:33 AM

The great Danish physicist Niels Bohr once observed that “prediction is very difficult, especially if it's about the future.”

Maryruth Hicks's curator insight, September 8, 2015 11:27 AM

Digital learning and big data in education might lead to educational reform!


What is Big Data for Healthcare IT? | EHR Blog | AmericanEHR Partners


Big data is a term commonly used by the press and analysts, yet few people really understand what it means or how it might affect them. At its core, Big Data represents a very tangible pattern for IT workers and demands a plan of action. For those who understand it, the ability to create an actionable plan to use the knowledge tied up in the data can provide new opportunities and rewards.

Let’s first solidify our understanding of Big Data. Big Data is not about larger ones and zeros, nor is it a tangible measurement of the overall size of data under your stewardship. Simply stated, one does not suddenly have “big data” when a database grows past a certain size. Big Data is a pattern in IT. The pattern captures the fact that many data collections containing information related to an enterprise’s primary business are now accessible and actionable for that enterprise. The data is often distributed and in a variety of formats, which makes it hard to curate or use; hence Big Data represents a problem as much as it does a situation. In many cases, just knowing that the data exists is a preliminary problem that many IT workers are finding hard to solve. The peripheral data is often available from governments, from sensor readouts, in the public domain, or simply via APIs into other organizations’ data. How do we know it is there, how can we get at it, and how can we extract the interesting parts are all first-class worries with respect to the big data problem.

To illustrate the concepts involved in Big Data, we will use a hospital as an example. A hospital may need to plan for future capacity and needs to understand aging patterns from demographic data available from the national census organization in the country where it operates. It also knows that supplementary data is available on how many people search for disease-related terms on search engines, and on the percentage of the population that smokes, does not live a healthy lifestyle, or participates in certain activities.  This may have to be compared with current client lists and the ability to predict health outcomes for known patients of a specific hospital, augmented with demographic data from the larger surrounding population.

The ability to plan for future capacity at a health institute may require that all of this data plus numerous other data repositories are searched for data to support or disprove the hypothesis that more people will require more healthcare from the hospital in ten years.
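As a toy illustration of testing that capacity hypothesis, a back-of-the-envelope projection can combine census growth figures with current utilization. Every number below is invented; a real analysis would pull these inputs from the census and EMR sources described above:

```python
# All figures are invented for illustration, not real census or hospital data.
current_population_65plus = 40_000
annual_growth_rate = 0.03         # hypothetical figure from census projections
visits_per_person_per_year = 2.5  # hypothetical figure from current hospital records
years_ahead = 10

# Compound the population forward, then apply today's utilization rate.
projected_population = current_population_65plus * (1 + annual_growth_rate) ** years_ahead
projected_visits = projected_population * visits_per_person_per_year
print(round(projected_visits))
```

Even this crude model makes the hypothesis falsifiable: if projected visits exceed current capacity, the data supports planning for expansion; if not, it does not.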

Another situation, juxtaposed to illustrate other aspects of Big Data, is one in which a single patient arrives at the hospital with an unknown disease or infection. Hospital workers may benefit from knowing the patient’s background yet be unaware of where that data is. Such data may reside in the patient’s social media accounts, such as FourSquare, a website that gamifies visits to businesses. The hospital IT workers in this scenario need to find a proverbial needle in a haystack. By searching across all known data sources, they might be able to piece together a history of the patient’s social media declarations, which could provide valuable information about the person’s drinking patterns (scraped from FourSquare visits to licensed establishments), exercise data (from a site like socialcyclist.com), and general lifestyle (scraped from Facebook, Twitter and other such sites). When this data is retrieved and combined with data from LinkedIn (about the patient’s business life), a fairly accurate history can be established.  By combining photos from Flickr and Facebook, doctors could actually see the physical changes in the way a patient looks over time.

The last example illustrates that the Big Data pattern is not always about using large amounts of data. Sometimes it involves finding the smaller atoms of data from large data collections and finding intersections with other data. Together, these two hospital examples show how Big Data patterns can provide benefits to an enterprise and help them carry out their primary objectives.

Gaining access to the data is one matter. Just knowing the data is available and how to get at it is a primary problem. Knowing how the data relates to other data, and being able to tease out knowledge from each data repository, is a secondary problem that many organizations face.

Some of our staff members recently worked on a big data project for the United States Department of Energy related to geothermal prospecting. The Big Data problem there involved finding areas that might be promising in terms of supporting a commercially viable geothermal energy plant, which must operate for ten or more years to provide a valid ROI for investors. Once the rough locations are listed, a huge amount of other data needs to be collected to help determine the viability of each location.

Some examples of the other questions that need to be answered with Big Data were:

  1. What is the permeability of the materials near the hot spot and what are the heat flow capabilities?
  2. How much water or other fluids are available on a year round basis to help collect thermal energy and turn it into kinetic energy?
  3. How close is the point of energy production to the energy consumption?
  4. Is the location accessible by current roads or other methods of transportation?
  5. How close is the location to transmission lines?
  6. Is the property currently under any moratoriums?
  7. Is the property parkland or other special use planning?
  8. Does the geothermal potential overlap with existing gas and oil claims or other mineral rights or leases?
  9. Etc…

All of this data is available, some of it in prime structured digital formats and some of it not even in digital form. An example of a non-digital format might be a drill casing stored in a drawer in the basement of a university, representing the underground materials near the heat dome. By studying its structure, the rate of heat exchange through the material can provide clues about the potential rate of thermal energy available to the primary exchange core.

To keep track of all the data that exists and how to get at it, many IT shops are starting to use graphs and graph database technologies. The graph database might not store the actual data itself; instead, it may store the knowledge of what protocols and credentials to use to connect to the data, what format the data is in, where the data is located, and how much data is available. Additionally, the power of a graph database is that its structure is very good at tracking the relationships between clusters of data, capturing how each data set relates to others. This is a very important piece of the puzzle.
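Such a registry can be sketched in a few lines. The example below uses a plain in-memory adjacency structure rather than a real graph database, and every node field (protocol, format, location) and relationship label is invented for illustration:

```python
# Each node holds connection metadata about a data source, not the data itself
# (all fields and endpoints are hypothetical).
nodes = {
    "census_demographics": {"protocol": "https", "format": "csv", "location": "census.example.gov"},
    "ehr_warehouse": {"protocol": "jdbc", "format": "sql", "location": "db.hospital.example"},
    "smoking_survey": {"protocol": "https", "format": "json", "location": "survey.example.org"},
}

# Edges capture how the data sets relate to one another.
edges = {
    ("ehr_warehouse", "census_demographics"): "joins_on_zip_code",
    ("ehr_warehouse", "smoking_survey"): "shares_risk_factor_fields",
}

def related_sources(source):
    """Return (neighbor, relationship) pairs for a given data source."""
    return [(b if a == source else a, rel)
            for (a, b), rel in edges.items() if source in (a, b)]

print(related_sources("ehr_warehouse"))
```

A production graph database adds indexing, traversal queries, and access control on top, but the core idea is the same: the registry stores how to reach the data and how data sets relate, not the data itself.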

The conclusion of this introductory post is that Big Data already exists; it is not something that will be created. The new Big Data IT movement is about implementing systems to track and understand what data exists, how it can be retrieved, how it can be ingested and used, and how it relates (semantically) to other data. Every IT shop in the world has done this to some degree, from a “just use Google for everything” low-tech approach to a full-blown data registry/repository that tracks all metadata about the data.

The real wins will be when systems can be built that can automatically find and use the data that is required for a specific endeavor in a real time manner. To be truly Big Data ready is going to require some planning and major architecture work in the next 3-5 years.


