Technology Today…

The power in your hands

IT Challenges in Meeting the Prescribed Treatment for What Ails Healthcare

Ever wonder why IT hasn’t revolutionized healthcare the way it has nearly every other industry in the world? I do – it’s one of the reasons I’ve made a career transition from IT into healthcare. That, and trying to see whether I can help find how IT still can improve healthcare.

Information technology, and advances in technology in general, have had a revolutionary impact on almost every industry on the planet: decreasing costs throughout the value chain from supplier to consumer, streamlining communications, and empowering consumers by giving them greater access to information – and therefore a greater say in what products and services are provided to them, and how.

Even the theatre industry has benefited from IT.

Managing the sound and lighting effects in theatre productions is much less expensive and more user-friendly now than, say, 30 years ago, allowing theatre companies to lower the costs of stage productions and making culture more available to the masses.

Similar cost reductions and operational efficiencies are being witnessed globally. Currently, Uber and Airbnb are causing major disruptions by bringing a new, tech-based model to the transportation and hotel industries. (And in the process, creating an entirely new economy.)

But this hasn’t happened in healthcare.

Healthcare still has not fully adopted IT, and much of it operates as it did in the 1950s. (Look, for instance, at the similarities between a hospital patient room from the 1950s and one today.)

And the underlying cost structure hasn’t changed – or fallen – as it has in other industries. In fact, healthcare costs continue to rise year over year at a rate faster than inflation.

This isn’t entirely for lack of trying – either on the part of IT firms or the medical community.

The question is: Why?

Why has IT been unable to penetrate healthcare and effect cost reductions when it has a well-established track record of doing so in other industries? And its close corollary: What can we do about it?

To understand the situation well enough to answer the question, we have to realize that the IT and healthcare industries differ in one important way.

The information technology field follows the “If you build it, they will come” mantra made famous by the Kevin Costner film, “Field of Dreams”. (As we’ll see, this approach would be a nightmare in healthcare.)

And, more recently, Steve Jobs was famous for not listening to the consumer – because consumers don’t know what they want until he puts it in front of them. [There is an interesting analysis of Jobs’ quote on this subject on the HelpScout blog.] And really, most customers – the public in general – don’t know what they want from IT, or what kinds of things IT can accomplish in the first place. So anticipating – or creating – customer needs is a viable approach, one Apple did and continues to do exceptionally well.

But, this doesn’t work in healthcare where the medical community is dealing with human life. Doctors can only use drugs, medical devices, and treatments that have been widely tested in controlled settings, proven to be effective, and with side effects that are well known and documented.

This requirement for testing is foreign to the IT industry. In IT, it is not uncommon to release products with known bugs and let consumers identify the problems and deficiencies in the software or solution. Microsoft was famous for taking this approach, and Google has taken the practice one step further by releasing beta software for public use. [Disclosure: I love Google and Microsoft – research for this and all my blogs starts at Google, and drafts are written in Word.]

Beta refers to software that, while functional, is not fully ready for commercial use.

Yes, there are sound business reasons for the practice of releasing software that may not be fully ready for commercial use – especially in Google’s case, as their products are released for free. Flaws in software products can be hard to find, or manifest only under certain conditions, so relying on customer reports is often the most efficient way to identify them.

But in healthcare, this doesn’t work. Anything intended for use with real patients must be thoroughly tested, on patients, in controlled and monitored settings. We don’t want to wait for people to die to learn that a drug or other treatment has serious side effects.

So even if you put an IT solution in front of doctors and other providers that may potentially increase the quality of care, they won’t use it unless you can also point to documented evidence of its efficacy. The uptake, or adoption, of new drugs and technological approaches requires a greater level of evidence than the IT industry may be used to providing.

What’s required is a merging of the medical industry’s approach to clinical trials (e.g., testing with all results documented) with the iterative, Agile development process from the technology industry.

Agile development is a process for building a software or technical solution that breaks the overall effort into multiple short stages, called sprints. Each sprint ends with an opportunity to test what was just built (e.g., to run a small clinical trial).
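As a rough illustration of this sprint-plus-trial loop, here is a minimal sketch in Python. All the names here (`run_trial`, `TrialResult`, the efficacy threshold) are hypothetical, invented for illustration – this is not a real clinical-trial framework, just a model of the idea that every sprint ends with a documented pass/fail gate before the next one begins.

```python
# Hypothetical sketch: an Agile loop where every sprint ends with a small,
# documented trial gate. All names and numbers are illustrative only.
from dataclasses import dataclass


@dataclass
class TrialResult:
    sprint: int
    passed: bool
    notes: str


def run_trial(sprint: int, efficacy: float, threshold: float = 0.8) -> TrialResult:
    """Stand-in for a small clinical trial: the outcome is recorded either way."""
    passed = efficacy >= threshold
    return TrialResult(sprint, passed, f"efficacy={efficacy:.2f}")


def develop(sprint_efficacies):
    """Run sprint after sprint; every trial result is documented, pass or fail,
    so a failed gate sends the team back to rework rather than to release."""
    history = []
    for sprint, efficacy in enumerate(sprint_efficacies, start=1):
        history.append(run_trial(sprint, efficacy))
    return history


# Simulated efficacy measured at the end of each of three sprints:
history = develop([0.55, 0.75, 0.86])
print([(r.sprint, r.passed) for r in history])
```

The point of the sketch is the shape of the loop, not the numbers: early sprints fail the gate and get reworked, and only a sprint whose trial clears the evidence threshold moves the solution toward clinical use.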

Doing so may (or may not) lengthen the timeline and increase the cost of developing a health IT solution – but the increased quality, and the long-term uptake of the solution by the medical industry, will more than make up for it.

Performing clinical trials earlier in the development of health IT solutions lets the medical community contribute early on – helping adapt the technology to the specific needs of their patients, and potentially making providers more amenable to incorporating the solution into their practice of medicine going forward. At the least, being involved earlier in the development process will train providers in the use of the technology – especially important when the new technology will change the workflows associated with patient care.

In essence, IT can still be the remedy for what ails the US healthcare system and incorporating a clinical trial approach to the Agile development process may be just what the doctor ordered.

August 7, 2015 | Audience Communication

The Wounded Warrior Cyber Combat Academy

News reports abound on the shortage of skilled cybersecurity professionals and the potential consequences this can have for the security of the US and our way of life.

The Wounded Warrior Cyber Combat Academy is a unique approach to addressing this issue. The effort recruits returning war-injured veterans and trains them in IT security skills so they can join the workforce and help defend the US from cyber-attack.

Join us on Tuesday, October 8th at 6:00pm EST as we welcome Mr. Jim Wiggins, Executive Director of the Federal IT Security Institute and leader of the Wounded Warrior Cyber Combat Academy (W2CCA), to learn about this effort and how it benefits our returning soldiers by providing them employable skills, our economy by creating a pool of skilled workers to fill US-based jobs, and our nation by increasing the ranks of cyber defenders.

Join us! This Tuesday at 6pm EST on Technology Today. Listeners are invited to send in their thoughts by email, by text to 240-731-0756, on Facebook, on our blog, on our show page, and on Twitter. You can also call us Live on Air at 646-652-4385.


Technology Today airs Live on the BlogTalkRadio network – the online home for Internet radio. Join us Live Here.

October 7, 2013 | Episode Descriptions