Healthcare and Technology news

The Top 10 Benefits of an Internet Business Phone System


The Voice over Internet Protocol (VoIP) market is expected to reach $55 billion by 2025. More companies are making the switch to VoIP, allowing them to make calls using broadband internet instead of a conventional phone system.


VoIP works by converting sound into digital data, which is then transmitted over a broadband internet connection. In other words, VoIP lets companies make phone calls over the internet.
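The conversion described above can be sketched in a few lines. This is a toy illustration only: real VoIP stacks use codecs such as G.711 or Opus and transport protocols such as RTP over UDP, and the sample rate and frame size below are just common conventions.

```python
# Toy sketch of the VoIP pipeline: sound is sampled into digital data,
# then split into small packets for transmission over the internet.
import math
import struct

SAMPLE_RATE = 8000      # samples per second (telephone quality)
PACKET_SAMPLES = 160    # 20 ms of audio per packet, a common frame size

def sample_tone(freq_hz, duration_s):
    """Digitize a pure tone into signed 16-bit samples."""
    n = int(SAMPLE_RATE * duration_s)
    return [int(32767 * math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE))
            for t in range(n)]

def packetize(samples):
    """Encode samples as bytes and split them into fixed-size packets."""
    packets = []
    for i in range(0, len(samples), PACKET_SAMPLES):
        chunk = samples[i:i + PACKET_SAMPLES]
        packets.append(struct.pack(f"<{len(chunk)}h", *chunk))
    return packets

packets = packetize(sample_tone(440, 0.1))  # 100 ms of a 440 Hz tone
print(len(packets))  # 5 packets of 20 ms each
```

Each packet then travels across the network independently, which is why VoIP quality depends on connection stability as well as bandwidth.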


Why make the switch? Keep reading to find out. Here are the 10 benefits of switching to a virtual phone system!


1. Easy Installation And Integration


Many businesses hesitate to make major technological changes. Any change takes time, testing, and money. Installing, configuring, and maintaining a VoIP system, however, is incredibly easy.


In fact, it’s so easy that VoIP is now the number one telephone service choice for businesses in the country. Already 36 percent of businesses are using VoIP.


Even someone who is less technically savvy can set up a VoIP system on their own. You can either call an expert technician or try it yourself. VoIP phones are pretty much plug-and-play.


It’s also very easy to add new users using hosted VoIP software. The web portal will make it easy for you to move, add, and change systems as needed. The simplicity means you also won’t have to worry about maintenance.


As a result, you’ll rarely need professional support when making changes.


VoIP also makes it easy for you to utilize other systems and technologies. Integrating other software can enhance your operations, boosting efficiency throughout your company. VoIP integrates with a wide variety of business systems, allowing you to customize your setup as you see fit.


In other words, you’ll have all the benefits of VoIP without needing someone to modify your existing IT infrastructure.


2. Scalability


One of the top benefits of using an internet business phone system is its scalability.


Your virtual phone system will scale along with the needs of your business. A traditional phone system, on the other hand, is usually more difficult to scale. You’ll likely need an IT expert to handle any changes you might need.


This scalability will support your company’s efficiency and productivity efforts. You won’t have to waste time or money making company-wide changes to your system.


Instead, you can use your small business phone system to add a line the next time you hire a new employee. You can eliminate lines if you’re downsizing, too. Either way, your VoIP will scale along with you.


3. Reliability


As your company grows, you’ll need a system you can rely on.


Some companies think that if they’re without internet, they’ll end up without a phone system as well. One of the benefits of VoIP is that you can still rely on your system even if the internet does go down. In case of an event like this, you can have your calls forwarded to your mobile phone or another device.


That means you won’t have to worry about weather issues or power outages impacting your business operations.


4. Effective Communication


Whether your team is big or small, you’ll need to make sure everyone can communicate. With more people working from home, it’s important to have a system that prioritizes communication.


With a virtual phone system, the line will ring at your desk phone a few times before ringing on your mobile device, laptop, or tablet.


As a result, you won’t have to worry about missing urgent calls. You’ll spend less time checking your voicemail, too!


5. Flexibility


With an internet business phone system, your underlying network doesn’t need to follow a specific technology layout. Instead, you can use your existing Ethernet, ATM, WiFi, or SONET infrastructure as the foundation of the network.


Traditional phone networks require a lot of complexity, which can make it difficult for your IT team to make adjustments. The network flexibility with VoIP allows you to create a standardized system. As a result, you can support a number of communication types and require less equipment management.


6. Additional Features


There are a number of benefits and additional features that come included with your internet business phone system. For example, VoIP systems allow clients to connect with a variety of devices. This makes it easier for you to keep your company’s productivity levels up.


VoIP programs often include:


  • Caller ID
  • Virtual numbers
  • Contact lists
  • Voicemail

You can customize these features to improve your company’s operational efficiency.


For example, you can have voicemails forwarded to multiple co-workers. You can also use voicemail-to-text transcriptions and send these documents to your email with ease.


7. Work From Anywhere Access


Are more of your employees working from home? A work-at-home program can help you save money on office space and decrease utility costs. Before you make that transition, however, it helps to have a VoIP system in place.


VoIP can ensure your team communicates effectively. Employees can use the voice, fax, and data services through their internet connection.


Employees can communicate straight from their home offices or even abroad.


As a result, you don’t have to worry about a drop in communication with your team members.


8. Simplified Conferences


Traditional phone systems allow you to conference with teams and clients. However, you usually need to pay for an additional service in order to host multiple callers. With a small business phone system, you can simplify this process.


VoIP removes the need for dedicated phone lines. Instead, you’ll operate on a converged data network. The features are usually native.


With the cost already built-in, you won’t have to worry about paying more for conferencing features.


9. Functionality


With VoIP, you’re not limited to phone calls. You can also host video conferences, allowing you to communicate with co-workers and clients better than before. Video conferencing will allow you to share meetings, files, documents, and agendas right from your VoIP system.


10. Cost-Efficiency


Above all else, switching to a virtual phone system will help your company cut costs. These systems are cheaper than conventional phone systems. The ability to install and remove lines as needed will help you adjust your system to cut costs, too.


How nine out of ten healthcare pages leak private data

A study by Timothy Libert, a doctoral student at the University of Pennsylvania, has found that nine out of ten visits to health-related web pages result in data being leaked to third parties like Google, Facebook and Experian:

There is a significant risk to your privacy whenever you visit a health-related web page. An analysis of over 80,000 such web pages shows that nine out of ten visits result in personal health information being leaked to third parties, including online advertisers and data brokers.

What Libert discovered is a widespread repetition of the flaw that the US government's flagship website was dragged over the coals for in January.

The sites in question use code from third parties to provide things like advertising, web analytics and social media sharing widgets on their pages. Because of the way those kinds of widgets work, their third party owners can see what pages you're visiting.

The companies supplying the code aren't necessarily seeking information about what you're looking at but they're getting it whether they want it or not.

So if you browse the pages about genital herpes on the highly respected CDC (Centers for Disease Control and Prevention) site you'll also be telling marketing mega-companies Twitter, Facebook and AddThis that you've an interest in genital herpes too.

It happens like this: when your browser fetches a web page, it also fetches any third party code embedded in it directly from the third parties' websites. The requests sent by your browser contain an HTTP header (the annoyingly misspelled 'referer' header) that includes the URL of the page you're looking at.

Since URLs tend to contain useful, human-readable information about what you're reading, those requests can be quite informative.

For example, looking at a CDC page about genital herpes triggers a request to AddThis like this:

GET /js/300/addthis_widget.js HTTP/1.1
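The mechanics can be sketched in a few lines of Python. The URLs below are hypothetical stand-ins for the request pattern described in the article, not the actual addresses involved:

```python
# Illustrative sketch: how a browser's fetch of a third-party widget
# leaks the URL of the page you are reading via the Referer header.
page_url = "https://www.cdc.gov/std/herpes/stdfact-herpes.htm"   # page being read
widget_url = "https://example-cdn.addthis.com/js/300/addthis_widget.js"  # embedded script

# When the browser fetches the widget, it attaches the page URL
# in the (misspelled) 'Referer' header automatically:
third_party_request = {
    "method": "GET",
    "url": widget_url,
    "headers": {"Referer": page_url},
}

# The third party can now infer the visitor's reading topic from the path:
leaked_topic = third_party_request["headers"]["Referer"].rsplit("/", 1)[-1]
print(leaked_topic)  # stdfact-herpes.htm
```

Nothing here requires the third party to ask for the information; the browser volunteers it with every request.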


The fact that embedded code gets URL data like this isn't new - it's part of how the web is designed and, like it or not, some third parties actually rely on it - Twitter uses it to power its Tailored Suggestions feature for example.

What's new, or perhaps what's changed, is that we're becoming more sensitive to the amount of data we all leak about ourselves and, of course, health data is among the most sensitive.

While a single data point such as one visit to one web page on the CDC site doesn't amount to much, the fact is we're parting with a lot of data and sharing it with the same handful of marketing companies.

We do an awful lot of healthcare research online and we tend to concentrate those visits around popular sites.

A 2012 survey by the Pew Research Center found that 72% of internet users say they looked online for health information within the past year. That explains why one of the sites mentioned in the study is the 106th most popular website in the USA, ranked 325th in the world.

The study describes the data we share as follows:

...91 percent of health-related web pages initiate HTTP requests to third-parties.  Seventy percent of these requests include information about specific symptoms, treatment, or diseases (AIDS, Cancer, etc.). The vast majority of these requests go to a handful of online advertisers: Google collects user information from 78 percent of pages, comScore 38 percent, and Facebook 31 percent.  Two data brokers, Experian and Acxiom, were also found on thousands of pages.

If we assume that it's possible to imply an individual's recent medical history from the healthcare pages they've browsed over a number of years then, taken together, those innocuous individual page views add up to something very sensitive.

As the study's author puts it:

Personal health information ... has suddenly become the property of private corporations who may sell it to the highest bidder or accidentally misuse it to discriminate against the ill.

There is no indication or suggestion that the companies Libert named are using the health data we're sharing but they are at least being made unwitting custodians of it and that carries some serious responsibilities.

Although there is nothing in the leaked data that identifies our names or identities, it's quite possible that the companies we're leaking our health data to have them already.

Even if they don't though, we're not in the clear.

Even if Google, Facebook, AddThis, Experian and all the others are at pains to anonymise our data, I wouldn't bet against individuals being identified in stolen or leaked data.

It's surprisingly easy to identify named individuals within data sets that have been deliberately anonymised.

For example, somebody with access to my browsing history could see that I regularly visit Naked Security for long periods of time and that those long periods tend to happen immediately prior to the appearance of articles written by Mark Stockley.

For a longer and more detailed look at this phenomenon, take a look at Paul Ducklin's excellent article 'Just how anonymous are "anonymous" records?'

It's possible to stop this kind of data leak by setting up your browser so it doesn't send referer headers but I wouldn't rely on that because there are other ways to leak data to third parties.

Instead I suggest you use browser plugins like NoScript, Ghostery or the EFF's own Privacy Badger to control which third party sites you have any interaction with at all.

What the study hints at is bigger than that though - what it highlights is that we live in the era of Big Data and we're only just beginning to understand some of the very big implications of small problems that have been under our noses for years.


Cerner seals the deal with Siemens


Cerner Corp. has completed its $1.3 billion acquisition of Siemens Health Services – and now looks forward to as much as $5 billion in revenue in 2015.

The newly-merged unit now boasts combined annual research and development investment of more than $650 million, according to Cerner.

“By combining client bases, investments in R&D and associates, we are in a great position to lead clients through one of the most dynamic eras in health care,” said Neal Patterson, Cerner chairman, CEO and co-founder, in a statement.

"Cerner remains focused on key development areas including population health, physician experience, open platforms, revenue cycle and mobility," he said. "We see these as critical areas of investment to ensure providers can meet growing regulatory demands and control costs, while continuing to improve quality of care.”

Cerner and Siemens AG, the former parent company of Siemens Health Services, have also formed a strategic alliance to combine Cerner’s IT capabilities with Siemens' device and imaging expertise. Each organization expects to invest up to $50 million during an initial three-year term, officials say, with an initial focus on integrating diagnostics and therapeutics with electronic health records.

"A unique feature of this acquisition is we’ll continue working with Siemens AG in an R&D capacity, in order to advance the interoperability of electronic health records with medical devices," said Patterson.

Cerner expects revenue in 2015 to be approximately $4.8 billion to $5 billion, with a client base spanning more than 30 countries across more than 18,000 facilities.

"The Cerner client family has grown and so has our team," said Patterson. "We’re now more than 21,000 associates strong across a global network, all with the singular focus of advancing the state of the art in health and care."

John Glaser, former CEO of Siemens Health Services, has joined Cerner as a senior vice president and member of the company’s executive cabinet; he will support former Siemens Health Services clients as they transition to Cerner.

Meanwhile, support for Siemens Health Services core platforms will remain in place, say Cerner officials. Current implementations will continue, and Cerner will support and advance the Soarian platform for at least the next decade.


True Interoperability: Public APIs provide the open platform health IT requires | Healthcare IT News


Do we finally have the spark?

Interoperability is the current health IT buzzword because it’s the essential ingredient in creating a system that benefits patients, doctors and hospitals. Almost everyone in healthcare is pressing for it and is frustrated, though probably not surprised, that meaningful use did not get us there.

The ONC says within three years we’ll have a roadmap for providing interoperability “across vendor platforms,” which should probably elicit a collective groan.

Look, a map is a fine tool but of limited use if I don’t speak the language. Change in this industry requires market drivers instituted now, if not sooner. We must move from MU to a health care payment model driving True Interoperability, not the garden-variety stuff.

What should True Interoperability be in healthcare? From the following definitions we can pick the best of the lot.

1.   The ability of two or more systems or components to exchange information and to use the information that has been exchanged.
- The Institute of Electrical and Electronics Engineers (IEEE) posting on

Too narrowly tailored, this definition covers “interface-ability” or basic data exchange. It lacks context and collaboration, which is required for care across systems. There is no mention of the technical challenge and costs that can make even this narrow goal a difficult one.

Compare that with another interoperability definition, also found on

2.   Interoperability is a property of a product or system, whose interfaces are completely understood, to work with other products or systems, present or future, without any restricted access or implementation.

With this definition, we’ve moved a step beyond simple data exchange, which is helpful. But health care has arguably unique challenges with interface variance and restrictions on access and implementation created by complexity, huge costs, and closed platforms and business models. Established data exchange standards within a restrictive business model yield closed records.

So, can we get closer to a definition that really has traction?

3. Interoperability is the ability to make systems and organizations work together (inter-operate). While the term was initially defined for information technology or systems engineering services to allow for information exchange, a broader definition takes into account social, political, and organizational factors that impact system-to-system performance: the task of building coherent services for users when the individual components are technically different and managed by different organizations.

This definition nails the requirements for continuity, coordination and collaboration to help transform our health care “system.” In particular, I think we should pay close attention to the message in that last sentence: coherent services built from technically different components managed by different organizations = True Interoperability.

We must shift from basic interfaces to open and public access that allows systems to interoperate. We need Application Programming Interfaces (APIs), which some think is pure fantasy.

Indeed, health care and health IT are plagued by delusion, but not among those who have watched the automation of every other industry. As Micky Tripathi, president and CEO of the Massachusetts eHealth Collaborative and co-chair of the independent scientific task force JASON, said, "Kendall Square [home of MIT] and Silicon Valley are laughing at us."

In April, JASON released a report on the "robust health data infrastructure" required for health care. The report calls for publishing as many APIs as possible and proposes a strategy “modeled after the principles that have allowed the Internet to scale—a core set of tightly specified services that enable multiple heterogeneous ecosystems to emerge.”

Tripathi says Washington should align all programs around an API strategy and use the government’s tremendous purchasing power to move the market.

"I'd like to see a world where you get paid because you have good informaticians," Doug Fridsma, MD, former chief scientist at ONC and now CEO of AMIA, recently told Healthcare IT News.

Hmmm. Require public APIs and let the market drive True Interoperability? Sounds like we have a winning definition. 

Now, if only we had some public APIs lying around …

It turns out HL7 is testing a product called Fast Healthcare Interoperability Resources, or FHIR (pronounced "fire").

According to Charles Jaffe, CEO of HL7:

FHIR represents a departure from the notion of messaging and document-centric ideas. It uses technology that everyone is familiar with and it's very, very easy to implement. That's the real key to this, the fact that it's not only an effective solution, but it's a very cost-effective solution. That's what makes it unique.
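The "technology that everyone is familiar with" is ordinary REST and JSON: a FHIR resource lives at a predictable HTTP URL and comes back as a JSON document. The sketch below illustrates that shape; the server URL, patient id, and response body are all hypothetical, not from any real FHIR endpoint.

```python
# Sketch of FHIR's REST+JSON model: a resource is an HTTP URL plus JSON.
import json

BASE = "https://fhir.example.org"   # hypothetical FHIR server

def resource_url(resource_type, resource_id):
    """FHIR exposes each resource at a predictable RESTful path."""
    return f"{BASE}/{resource_type}/{resource_id}"

# A simplified JSON body such a server might return for GET /Patient/123:
body = json.loads("""
{
  "resourceType": "Patient",
  "id": "123",
  "name": [{"family": "Example", "given": ["Pat"]}]
}
""")

print(resource_url("Patient", "123"))  # https://fhir.example.org/Patient/123
print(body["name"][0]["family"])       # Example
```

Any developer who can call a web API can work with this, which is exactly the cost-effectiveness argument Jaffe is making.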

So, are we finally lining up all the ingredients necessary for True Interoperability? A set of public APIs can open up interoperability to all developers who know web technology. Government purchasing power can dramatically alter the existing market. A small spark and gusting winds enable the fire to spread.

The question is still which vendors will seize this opportunity and serve the market and which will protect their locked-in client bases. Insisting that the wind isn’t changing direction to put you right in the fire’s path is not always a sound strategy. Can’t wait to see whether or not the wind starts to blow.


Why Health Care May Finally Be Ready for Big Data


There has been a lot of buzz about “big data” over the last few years. This is hardly surprising, given the sheer scale of the data sets that are being produced daily. A total of 2.5 quintillion bytes of data were generated every day in 2012 alone, and it is estimated that as much data is now generated in just two days as was created from the dawn of civilization until 2003. While other industries have been far more successful at harnessing the value from large-scale integration and analysis of big data, health care is just getting its feet wet. Yes, providers and payers are increasingly investing in their analytical capabilities to help them make better sense of the changing health care environment, but it is still early days. Here are some key elements that are crucial for health care to truly capture the value of big data.

Integrating data. Businesses and even political campaigns have successfully linked disparate data sources to learn “everything possible” about their citizens, customers, and clients, and apply advanced analysis and computation to modify existing strategies or create new ones. Similarly, leveraging heterogeneous datasets and securely linking them has the potential to improve health care by identifying the right treatment for the right individual or subgroup.

One of the first challenges we face is the lack of standardization of health care data. The vast amount of data generated and collected by a multitude of agents in health care today comes in so many different forms — from insurance claims to physician notes within the medical record, images from patient scans, conversations about health in social media, and information from wearables and other monitoring devices.

The data-collecting community is equally heterogeneous, making the extraction and integration a real challenge. Providers, payers, employers, disease-management companies, wellness facilities and programs, personalized-genetic-testing companies, social media, and patients themselves all collect data.

Insight Center

  • A collaboration of the editors of Harvard Business Review and the New England Journal of Medicine, exploring best practices for improving patient outcomes while reducing costs.

Integration of data will require collaboration and leadership from the public and private sectors. The National Institutes of Health recently launched a Big Data to Knowledge Initiative (BD2K) to enable the biomedical research community to better access, manage, and utilize big data. Some early work is also being pursued through large collaborations such as the National Patient-Centered Research Network (PCORnet) and the consortium Optum Labs, a research collaborative that has brought together academic institutions, health care systems, provider organizations, life sciences companies, and membership and advocacy organizations.

Generating new knowledge. One of the earliest uses of big data to generate new insights has been around predictive analytics. In addition to the typical administrative and clinical data, integrating additional data about the patient and  his or her environment might provide better predictions and help target interventions to the right patients. These predictions may help identify areas to improve both quality and efficiency in health care in areas such as readmissions, adverse events, treatment optimization, and early identification of worsening health states or highest-need populations.
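The kind of predictive model described above can be sketched as a simple logistic risk score. Everything here is invented for illustration: the features, weights, bias, and patients are hypothetical, whereas real models are fit to large integrated datasets with far richer clinical and environmental variables.

```python
# Toy sketch of a readmission-risk predictor: a logistic model that maps
# a few patient features to a probability between 0 and 1.
import math

# Hypothetical model weights per feature (not from any real study)
WEIGHTS = {"age": 0.02, "prior_admissions": 0.6, "chronic_conditions": 0.3}
BIAS = -4.0

def readmission_risk(patient):
    """Logistic model: risk = sigmoid(bias + sum(weight * feature))."""
    z = BIAS + sum(WEIGHTS[k] * patient[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

low = readmission_risk(
    {"age": 45, "prior_admissions": 0, "chronic_conditions": 1})
high = readmission_risk(
    {"age": 80, "prior_admissions": 4, "chronic_conditions": 5})
print(low < 0.5 < high)  # True
```

A score like this is what lets a system rank patients and target interventions at the highest-need populations first.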

Equally critical is the focus on new methods. One of the reasons health care is lagging behind other industries is it has relied for too long on standard regression-based methods that have their limits. Many other industries, notably retail, have long been leveraging newer methods such as machine learning and graph analytics  to gain new insights. But health care is catching up.

For example, hospitals are starting to use graph analytics to evaluate the relationship across many complex variables such as laboratory results, nursing notes, patient family history, diagnoses, medications, and patient surveys to identify patients who may be at risk of an adverse outcome. Better knowledge and efficient assessment of disparate facts about patients at risk could mean the difference between timely intervention and a missed window for treatment.

Natural language processing and other artificial intelligence methods have also become more mainstream, though they are mostly useful in harvesting unstructured text data that are found in medical records, physician notes, and social media. Mayo Clinic teamed up with the IBM cognitive computer known as Watson, which is being trained to analyze clinical-trial criteria in order to determine appropriate matches for patients. As the artificial-intelligence computer gets more information and learns about matching patients to studies, Watson may also help locate patients for hard-to-fill trials such as those involving rare diseases.

Translating knowledge into practice. While standardized data collection and new analytical methods are critical to the big data movement, practical application will be key to its success. This is an important cultural challenge for both those who generate and those who consume the new knowledge. Users such as physicians, patients, and policy makers need to be engaged right at the beginning, and the entire research team should have a clear idea about how the new knowledge might be translated into practice.

The insights from big data have the potential to touch multiple aspects of health care: evidence of safety and effectiveness of different treatments, comparative outcomes achieved with different delivery models, and predictive models for diagnosing, treating, and delivering care. In addition, these data may enhance our understanding of the effects of consumer behavior, which in return may affect the way companies design their benefits packages.

Translating these new insights into practice will necessitate a shift in current practices. Relying on the evidence from randomized controlled trials has been a gold standard for making practice-changing decisions. While in many cases such trials may be necessary and justified, it will be critical to identify where the evidence generated by big data is adequate enough to change practice. In other cases, big data may generate new paradigms for increasing the efficiency of randomized clinical trials.

For example, as new knowledge is gained about the comparative benefits of second-line agents for treatment of diabetes, policy makers and expert groups may consider using this information to develop guidelines or recommendations or to guide future randomized trials.

Optum Labs: a start to achieving these goals. A proverbial village is required to make sense of the messy medley of data points that is big data in health care today, integrate and manage them in a safe and secure environment, and translate the findings into practice. One such village is Optum Labs, a research collaborative that has brought together data from the administrative claims of more than 100 million patients and the electronic medical records of over 30 million patients, along with researchers, patient advocates, policy makers, providers, payers, and pharmaceutical and life sciences companies. The vision of Optum Labs is to boost the generation of high-quality comparative-effectiveness evidence, accelerate translation of knowledge into the development of new predictive models and tools, and improve the delivery of care.

For almost two years now, researchers from Optum Labs’s 15 partner organizations have been working in this environment. The knowledge generation within Optum Labs is in its infancy, but early signs point to a significant potential to change health care, with more than 100 studies currently under way. Optum Labs’s priority is to enable clinicians to connect insights from big data directly to the care of an individual patient. This is particularly important for complex patients whose care requires careful prioritization (e.g., people with “comorbidities,” or multiple chronic conditions). For example, researchers are studying the longitudinal variation in care for hip and knee surgery in patients with diabetes and obesity. By analyzing the data on their treatment outcomes, we may be able to learn something that will help us create better protocols for caring for them. The hope is to gain insights about what works and for whom. Ultimately, what should drive this initiative and others is the goal of addressing the complexities, unmet needs, and challenges facing patients.


Why and Who Should Ensure Quality Health Data?


Contrary to common belief, technology does not own health data. Data exists as a result of the input of multiple sources of information throughout each patient’s healthcare continuum. The data does not exist only because of the technology but rather because of the careful selection of meaningful data items that need to be captured and at what frequency (e.g., instantly, daily, weekly).

We in healthcare collect granular data on anything ranging from demographics, past medical, surgical, and social history, medication dosage and usage, health issues and problem lists, disease and comorbidity prevalence, vital statistics, and everything in between. We collect data on financial performance with benchmarks and reimbursement trends using individual data elements from accounting transactions. Healthcare organizations have been collecting the same or similar data for decades but never before have we been able to operate with such efficiency as we do now thanks to advances in technology.

We have become data rich in the healthcare environment in a short amount of time, and this data continues to multiply daily. But are we still information poor? When we continue to generate data but fail to aggregate it into quality information, we are essentially wasting bandwidth and storage space on meaningless and disconnected data.

Every time patients have interactions with healthcare providers and facilities, data is generated. Over time, the data that is generated could (and should) be used to paint a picture of trends in patient demographics, population health, best practices in care, comorbidities and disease management, payment models, and clinical outcomes. This information becomes useful in meeting regulatory requirements, overcoming reimbursement hurdles, clinical quality initiatives, and even promotional and marketing material for healthcare organizations. This data could have opposite effects if not properly governed and utilized.

It goes back to the saying "garbage in, garbage out." If the data cannot be standardized or trusted, it is useless. Input of data must be controlled with data models, hard stops, templates, and collaborative development of clinical content. Capturing wrong or inconsistent data in healthcare can endanger patients, distort healthcare quality measurements, and expose clinicians to unwanted legal action.
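The "hard stops" described above can be sketched in code. The following is a minimal illustration only; the field names, required fields, and plausibility ranges are invented for the example and are not drawn from any real clinical data model:

```python
# Hypothetical sketch: a "hard-stop" validator that rejects a vital-signs
# record before it ever reaches storage. Fields and ranges are illustrative.

def validate_vitals(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    for field in ("patient_id", "systolic_bp", "heart_rate"):
        if field not in record:
            errors.append("missing required field: " + field)
    # Range checks act as hard stops against implausible entries.
    if not 40 <= record.get("systolic_bp", 0) <= 300:
        errors.append("systolic_bp out of plausible range (40-300 mmHg)")
    if not 20 <= record.get("heart_rate", 0) <= 300:
        errors.append("heart_rate out of plausible range (20-300 bpm)")
    return errors

good = {"patient_id": "A1", "systolic_bp": 120, "heart_rate": 72}
bad = {"patient_id": "A2", "systolic_bp": 1200, "heart_rate": 72}
assert validate_vitals(good) == []
assert validate_vitals(bad) != []
```

In a real system, a record that fails validation would be bounced back to the clinician at the point of entry rather than silently stored, which is exactly what a hard stop in an EHR template does.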

So who is the right person for the job of ensuring quality data and information? I have seen bidding wars take place over ownership of the data and the tasks surrounding data analysis, database administration, and data governance. Information Technology/Systems claims data ownership because of its skills in developing and implementing the technology needed to generate and access data. Clinical Informatics professionals feel they are appropriate for the task because of their understanding of clinical workflow and EHR system optimization. Financial, Accounting, Revenue Integrity, and Decision Support departments feel comfortable handling data but may have motives focused too heavily on the financial impact. Other areas may provide input on clinical quality initiatives and govern clinician education and compliance but may be primarily focused on the input of data instead of the entire data lifecycle.

When searching for an appropriate home for health data and information governance, organizations should look no further than Health Information Management (HIM) professionals. Information management is what HIM does and has always done. We have adapted and developed the data analytics skills needed to support the drive for quality data abstraction and data usage (just look at the education and credentialing criteria). HIM departments are a hub of information, both financial and clinical; therefore, governing data and information is an appropriate responsibility for this area. HIM also ensures an emphasis on HIPAA guidelines to keep data secure and in the right hands. Ensuring quality data is one of the most important tasks in healthcare today, and trusting this task to HIM, in collaboration with IT, Informatics, and other departments, is the logical and appropriate choice.

cdebie's curator insight, August 17, 2015 4:32 AM

As we get inundated with health data from multiple sources, aggregation, classification, and interpretation will require specialised skills and dedicated resources.

Health IT Security: What Can the Association for Computing Machinery Contribute?

A dazed awareness of security risks in health IT has bubbled up from the shop-floor administrators and conformance directors (who have always worried about them) to C-suite offices and the general public, thanks to a series of oversized data breaches that recently peaked in the Anthem Health Insurance break-in. Now the US Senate Health Committee is taking up security, explicitly referring to Anthem. The inquiry is extremely broad, though, promising to address "electronic health records, hospital networks, insurance records, and network-connected medical devices."

The challenge of defining a strategy has now been picked up by the US branch of the Association for Computing Machinery, the world's largest organization focused on computing. (Also probably its oldest, having been founded in 1947 when computers used vacuum tubes.) We're an interesting bunch, including people who have helped health care sites secure data as well as researchers whose role is to consume data–which is often hard to get.

So over the next few weeks, half a dozen volunteers on the ACM US Public Policy Council will discuss what to suggest to the Senate. Some of us hope the task of producing a position statement will lead the ACM to form a longer-range committee to apply the considerable expertise of the ACM to health IT.

Some of the areas I have asked the USACM to look at include:

Cyber-espionage and identity theft
This issue has all the publicity at the moment–and that’s appropriate given how many people get hurt by all the data breaches, which are going way up. We haven’t even seen instances yet of malicious alteration or destruction of data, but we probably will.

Members of our committee believe there is nothing special about the security needs of the health care field or the technologies available to secure it. Like all fields, it needs fine-grained access controls, logs and audit trails, encryption, multi-factor authentication, and so forth. The field has also got to stop doing stupid stuff like using Social Security numbers as identifiers. But certain aspects of health care make it particularly hard to secure:

  • The data is a platinum mine (far more valuable than your credit card information) for data thieves.
  • The data is also intensely sensitive. You can get a new credit card but you can’t change your MS diagnosis. The data can easily feed into discrimination by employers and insurers, or other attacks on the individual victims.
  • Too many people need the data, from clinicians and patients all the way through to public health and medical researchers. The variety of people who get access to the data also makes security more difficult. (See also anonymization below.)
  • Ease of use and timely access are urgent. When your vital signs drop and your life is at stake, you don’t want the nurse on duty to have to page somebody for access.
  • Institutions are still stuck on outmoded security systems. Internally, passwords are important, as are firewalls externally, but many breaches can bypass both.
  • The stewards/owners of health care data keep it forever, because the data is always relevant to treatment. Unlike other industries, clinicians don’t eventually aggregate and discard facts on individuals.
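A minimal sketch of what two items from the list above, fine-grained access controls and audit trails, might look like together; the roles, permission sets, and log format here are hypothetical, not drawn from any real system:

```python
# Illustrative role-based access check where every attempt, allowed or not,
# is written to an audit trail. Roles and permissions are made up.
import datetime

PERMISSIONS = {
    "clinician": {"read", "write"},
    "researcher": {"read_deidentified"},
    "billing": {"read_financial"},
}

audit_log = []

def access(user: str, role: str, action: str, record_id: str) -> bool:
    allowed = action in PERMISSIONS.get(role, set())
    # Log the attempt either way, so auditors can spot probing behavior.
    audit_log.append({
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user, "role": role, "action": action,
        "record": record_id, "allowed": allowed,
    })
    return allowed

assert access("dr_smith", "clinician", "write", "rec-42") is True
assert access("analyst_1", "researcher", "write", "rec-42") is False
assert len(audit_log) == 2
```

The point of logging denied attempts as well as granted ones is that a string of denials against the same record is itself a security signal, which is what makes audit trails more than a compliance checkbox.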
Anonymization
Numerous breaches of public data, such as in Washington State, raise questions about the security of data that is supposedly anonymized. The HIPAA Safe Harbor, which health care providers and their business associates can use to avoid legal liability, is far too simplistic, being too strict for some situations and too lax for others.

Clearly, many institutions sharing data don’t understand the risks and how to mitigate them. An enduring split has emerged between two camps of experts, each bringing considerable authority to the debate. Researchers in health care point to well-researched techniques for deidentifying data (see Anonymizing Health Data, a book I edited).

In the other corner stand many computer security experts–some of them within the ACM–who doubt that any kind of useful anonymization will stand up over the years against the increase in computer speeds and in the sophistication of data mining algorithms. That side of the debate leads nowhere, however. If the cynics were correct, even the US Census could not ethically release data.
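To make the Safe Harbor idea concrete, here is a toy sketch of two typical deidentification transformations: dropping direct identifiers and generalizing ZIP codes and dates. It deliberately simplifies; the actual HIPAA Safe Harbor method covers 18 classes of identifiers and carries extra rules, such as zeroing out rare three-digit ZIP prefixes and top-coding ages over 89:

```python
# Toy Safe Harbor-style transformations; field names are hypothetical.

def deidentify(record: dict) -> dict:
    out = dict(record)
    out.pop("name", None)                   # direct identifier: drop entirely
    out.pop("ssn", None)
    if "zip" in out:
        out["zip"] = out["zip"][:3] + "00"  # generalize to 3-digit ZIP prefix
    if "birth_date" in out:
        out["birth_date"] = out["birth_date"][:4]  # keep only the year
    return out

rec = {"name": "Jane Doe", "ssn": "123-45-6789",
       "zip": "45229", "birth_date": "1984-07-12", "dx": "asthma"}
print(deidentify(rec))
# {'zip': '45200', 'birth_date': '1984', 'dx': 'asthma'}
```

The computer-security camp's objection is precisely that records like the output above, once joined against voter rolls or other public data, can sometimes be re-identified anyway, which is why simple field-level rules are contested.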

Patient consent
Strong rules to protect patients were put in place decades ago after shocking abuses (see The Immortal Life of Henrietta Lacks). Now researchers are complaining that data on patients is too hard to get. In particular, combining data from different sites to get a decent-sized patient population is a nightmare both legally and technically.
Device security
No surprise–like every shiny new fad, the Internet of Things is highly insecure. And this extends to implanted devices, at least in theory. We need to evaluate the risks of medical devices, in the hospital or in the body, and decide what steps are reasonable to secure them.
Trusted identities in cyberspace
This federal initiative would create a system of certificates and verification so that individuals could verify who they are while participating in online activities. Health care is a key sector that could benefit from this.

Expertise exists in all these areas, and it’s time for the health care industry to take better advantage of it. I’ll be reporting progress as we go along. The Patient Privacy Rights summit next June will also cover these issues.


Analysis: HHS' Threat Info Sharing Plan


The Department of Health and Human Services is reassessing how its many internal agencies, and the entire healthcare sector, can boost cyberthreat intelligence sharing and analysis as more patient records are digitized and shared.

That assessment includes HHS evaluating whether it should create a new information sharing and analysis "structure" to harness the growing volume of cyber-intelligence coming from multiple sources. It's also evaluating another option: leveraging an existing organization to improve collection, analysis and dissemination of cyber-intelligence, HHS officials tell Information Security Media Group.

The recognition of the importance of cyberthreat intelligence sharing, combined with an evolving healthcare ecosystem, prompted HHS' Office of the National Coordinator for Health IT to include plans for "the establishment of an information sharing and analysis center" in ONC's federal health IT strategic plan for 2015 to 2020, which was released this week, HHS officials explain.

That section of the strategic plan created confusion, however, because the healthcare sector already is served by the National Health-ISAC.

Ongoing Efforts

NH-ISAC already is working with several federal agencies, including the Food and Drug Administration, a unit of HHS, and the National Institute of Standards and Technology, on cybersecurity-related initiatives for the healthcare sector.

But HHS also works with other government and non-government entities on cyber-intelligence related activities. That includes sharing cyber-intelligence with the FBI and Department of Homeland Security, as well as conducting healthcare sector cyberdrills with the private sector's Health Information Trust Alliance, or HITRUST.

The healthcare sector has changed dramatically since ONC's last federal health IT strategic plan was issued in 2011, HHS officials point out. For instance, electronic health records are far more common, thanks to the HITECH Act financial incentive program.

With more records being digitized and exchanged, HHS wants to make sure that information about potential cyberthreats is shared in a timely way. In addition to keeping healthcare organizations of all sizes well-informed, HHS wants to ensure that its many units, including ONC, FDA, the Centers for Disease Control and Prevention, the Centers for Medicare and Medicaid Services, and others, are kept up to date and ready to respond to emerging cyberthreats and vulnerabilities.

"Big or small ... it's important actionable information gets to all levels of stakeholders in the health ecosystem," Julie Chua, lead information security specialist in ONC's office of the chief privacy officer, tells ISMG.

No deadline has been set for a decision about the approach HHS will take to boost cyberthreat information sharing, HHS officials say. HHS will take into consideration the public comments it receives on the federal health IT strategic plan.


In Cincinnati, Housing and Health Data Work Hand-in-Hand


In Hamilton County, Ohio—home to the city of Cincinnati—it’s not unusual to see kids from the same neighborhoods and complexes coming to the hospital for asthma attacks.

In fact, Andrew Beck, M.D., a pediatrician at Cincinnati Children's Hospital Medical Center, says that a few years back, he and other researchers began to see a loose correlation between certain neighborhoods and higher rates of asthma admission into the hospital. Meanwhile, he was seeing patients from well-to-do neighborhoods with significantly lower rates of admission.

The variation got him thinking about the possible link between air quality in a specific housing complex and rates of asthma admission. Dr. Beck and his fellow researchers decided to see if there was a correlation between the air quality in certain neighborhoods and the number of children from those neighborhoods that were admitted to the hospital for asthma. To do this, they had to compare housing and health data sets.

“We were looking at the correlation and relationship between census-tract-level housing code violation density and the rate at which children from those census tracts either came to the emergency department or were admitted for asthma,” Beck says.

The researchers studied 4,355 children between the ages of 1 and 16 who visited the emergency department or were hospitalized for asthma at Cincinnati Children’s between January 2009 and December 2012. They tracked those kids for 12 months to see if they returned to the ED or were readmitted for asthma.

Not only were the researchers able to establish a firm correlation between the two data sets, they were also able to advance the research to predict which kids were at high risk of returning to the ED or hospital based on where they live. Children who lived in areas with poor air quality were 84 percent more likely to return, according to Beck’s findings. Even with most of the population in predominantly poverty-stricken areas, the researchers were able to differentiate between high-risk and low-risk housing complexes.
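The linkage at the heart of the study, joining tract-level housing data to patient records by census tract, can be sketched simply. All tract IDs, densities, and the risk threshold below are invented for illustration and do not come from the study:

```python
# Hypothetical join of housing code violation density to patients by tract.
violations_per_tract = {"39061-001": 12.4, "39061-002": 1.1}

patients = [
    {"id": "p1", "tract": "39061-001"},
    {"id": "p2", "tract": "39061-002"},
]

THRESHOLD = 5.0  # invented cutoff for flagging a tract as high-risk

for p in patients:
    density = violations_per_tract.get(p["tract"], 0.0)
    p["high_risk"] = density >= THRESHOLD

assert [p["high_risk"] for p in patients] == [True, False]
```

The real analytic work, of course, is in validating a threshold against outcomes such as readmission, but the join itself is this straightforward once both data sets carry a census-tract identifier.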

“The reality was that most of the population we studied in this study were poor kids in the inner city. We were still able to see differences within that population,” Beck says. “That was surprising.”

Integration of Housing and Health Data

At a time when healthcare data still lives in silos, the integration of housing and health data sets is a novel concept. However, Beck says that for population health to work, leaders at accountable care organizations (ACOs), health systems, and other provider organizations headed down this path must consider it.

“This is quite relevant to the new era of accountable care and population health,” Beck says. “Figuring out ways in which we can [prevent hospitalizations and ED utilization] will take us away from standard medical management and place us into the realm of social and environmental services.”

Non-traditional medical data integration has begun to take place in some medical collaborative environments already. The New York-Presbyterian Regional Health Collaborative created a medical village, which “goes beyond the established patient-centered medical home model.” It connects an academic medical center, a large ambulatory network, medical homes, and other providers not only with each other but also with community resources such as school-based clinics and specialty-care centers (the ones that are part of NYP’s network).

Integrating housing and census data as part of this push toward evidence-based medicine is conceivable, says Beck. Electronic identification of a patient’s location isn’t the tough part, he notes. The tough part is integrating this data into the electronic health record (EHR) in a way that prompts the clinician to intervene.

“Figuring out how to get the data in and use it to drive appropriate interventions is a challenge, but I think it’s certainly realistic,” says Beck.

Of course, getting to this point will take work. Beck sees help needed from both the policy side and from the IT side to achieve integrated housing and health data.

From the policy side, he says there needs to be “pay-for-performance” reimbursement for housing inspectors, community health workers, and others on the front line. Also, at a conceptual level, he’d like those collecting housing data to think about how it will be utilized at the bedside.

From the IT side, Beck points to innovation as the key ingredient towards an integrated future. IT leaders must figure out how to stay innovative when it comes to developing systems and learn from other industries. “It’s the concept of buying something on Amazon with two clicks and then it shows up at the door a day later. How can we bring that seamless connection from objective to outcome in healthcare?” he says.
