To get small with medicine, this "Machine" will have to get big — really, really, really big.
A major research project backed by Hewlett Packard Enterprise aims, among other things, to one day help doctors tailor medical treatment for individual patients by quickly crunching data from massive amounts of patient information — related to genes, diseases, personal behaviors and treatment history — assembled from around the world.
Dubbed The Machine, the project represents more than half of the research efforts of Hewlett Packard Labs, the R&D arm of HPE.
"It's our biggest and most ambitious research project," said Jaap Suermondt, vice president of software and analytics at Hewlett Packard Labs.
Suermondt said the company hopes to offer The Machine commercially to customers, including health providers, in a few years, but plans to display a prototype of The Machine later this year.
And "multiple central components of The Machine are well on their way," he said. "In fact, we are embedding components of The Machine's architecture into our current product road map as they become available."
The Machine, Suermondt said, is "first and foremost about flipping the computer inside out."
That architectural flip has to be done in order to efficiently process the huge data sets that are becoming available about patients, along with other types of digitized information, he said.
Unlike the traditional model of computing, which puts electronic processors at the center of the machine and surrounds them with relatively small amounts of memory, The Machine will make memory — the data itself — the center of the technology's infrastructure.
"What we're building is a new generation of computers," Suermondt said. "Rather than having the processor in the middle, we have the memory in the middle ... [and] you can put as much memory in as you want."
The new computer's architecture will rely on photonics, the use of light to move data at very high speeds, and possibly, further down the line, on memristors, a long-envisioned form of digital memory.
Facilitating that flip — and also putting pressure on developers to make it happen — is the fact that "it's only recently that we've seen that really exponential growth in digital data, particularly in health care," he said.
"By 2020, 30 billion connected devices will generate unprecedented amounts of data," HPE notes on its website about The Machine, which is Hewlett Packard Labs' largest single ongoing project.
For the past seven years, the health-care sector, which had notably lagged other economic sectors in digitizing its data, has been increasingly adopting electronic health records, making it dramatically easier to detect health trends and outcomes across wide swaths of the patient population.
"The infrastructure required to collect, process, store and analyze this data requires transformational changes in the foundations of computing. Bottom line: Current systems can't handle where we are headed, and we need a new solution," HPE said.
Suermondt said The Machine, whose components will use open-source code and collaboration, "basically flattens complex data hierarchies."
In addition to patient and other health-related data, The Machine will be able to process and analyze data from all kinds of industries to help develop solutions or new approaches in those fields, Suermondt said.
But Suermondt added that health care and "maybe national intelligence" are the two fields where The Machine offers the most promise for changing outcomes, both because of the size of the data available and the stakes involved.
"There are very few things that are as complex as medicine," he said.
The Machine's development is occurring amid a new national focus on precision medicine, which offers the promise of developing patient-specific medicines and treatments.
Its potentially massive computing power offers the opportunity to significantly speed the application of precision medicine by quickly sifting through terabytes of genomic data and suggesting courses of treatment based on an individual patient's profile.
President Barack Obama announced a Precision Medicine Initiative last year. Then, in December, he backed up that program by signing a bill that provides $200 million in funding to encourage the development of treatments based on an individual's own genes, health history, environment and lifestyle.
In February the National Institutes of Health announced an award to Vanderbilt University to develop the first phase of the Precision Medicine Initiative Cohort, with an eye toward getting 1 million or more volunteers in the United States who will agree to have their health data shared with researchers.
And federal defense officials are working to expand an existing cohort of 450,000 military veterans, whose health data is already being studied, by enrolling active-duty personnel.
Suermondt said that assembling those big new patient cohorts and tapping others that already exist will give The Machine and other precision medicine technologies a wide array of data that can be used to develop targeted medicine and treatments.
Instead of opting for a course of treatment that worked for the "average" patient in the past, "all of a sudden we have the data and the connectivity so that every doctor can have access to 100 people just like you, as if they've seen them themselves," Suermondt said, allowing a physician to see which medicines have worked, and which have not, for those similar patients.
"We're going to be able to compare you, in real time, to those 100 people," he said.
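The comparison Suermondt describes is, at heart, a similarity search over patient records. A minimal sketch of the idea follows; the feature vectors, the cohort data and the Euclidean distance metric are illustrative assumptions, not HPE's actual method.

```python
from math import sqrt

# Toy patient records: (age, bmi, systolic_bp) plus whether a given drug
# worked for that patient. Purely illustrative; real profiles would span
# genomic, behavioral and treatment-history features.
cohort = [
    ((54, 31.0, 140), True),
    ((25, 22.5, 118), False),
    ((58, 29.4, 150), True),
    ((61, 33.2, 145), False),
    ((30, 24.0, 121), True),
]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def most_similar(patient, cohort, k=3):
    """Return the k cohort members whose features are closest to `patient`."""
    return sorted(cohort, key=lambda rec: distance(patient, rec[0]))[:k]

# For a new patient, look at outcomes among the most similar prior
# patients before choosing a treatment.
neighbors = most_similar((56, 30.1, 142), cohort, k=3)
success_rate = sum(worked for _, worked in neighbors) / len(neighbors)
```

The point of a memory-centric design here is scale: the same lookup over millions of in-memory records, rather than five toy ones, is what would make the "100 people just like you" comparison feasible in real time.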
Suermondt said that The Machine, in its final form, would not actually be a tool "that every doctor will be using." Instead, he said, it will be a technological platform that will give physicians and other health providers access to data and analysis.
In addition to patient-specific analysis, Suermondt said The Machine is being designed to handle "memory-intensive" medical research on things such as drug outcomes. The platform will be able to develop algorithms "that normally are prohibitively expensive to run" because of the traditional computing power required, he said.
Having massive amounts of memory "allows us to pre-compute things, like an almost infinite set of 'what if' scenarios," Suermondt said. "We have experimented in areas like operating-room scheduling and the cascade of consequences for recovery, ICU, beds on the units, staff schedules, etc."
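The pre-computation Suermondt mentions can be sketched simply: with memory to spare, run an expensive model over a whole grid of scenarios up front, then answer any "what if" with a lookup. The scenario parameters and the placeholder demand formula below are invented for illustration.

```python
from itertools import product

# Illustrative stand-in for an expensive simulation: given an
# operating-room scheduling scenario, estimate downstream ICU-bed demand.
def simulate_icu_demand(surgeries, avg_duration_hr, icu_rate):
    # A real model would simulate the cascade of recovery, ICU and
    # staffing consequences; this formula is only a placeholder.
    return round(surgeries * icu_rate * (avg_duration_hr / 4.0), 2)

# With abundant memory, pre-compute every scenario in the grid once...
scenario_grid = product(range(10, 41, 10),   # surgeries per day
                        (2.0, 4.0, 6.0),     # average duration, hours
                        (0.1, 0.2, 0.3))     # fraction needing ICU
precomputed = {s: simulate_icu_demand(*s) for s in scenario_grid}

# ...so any "what if" question becomes a constant-time lookup.
answer = precomputed[(30, 4.0, 0.2)]
```

The trade is classic time-for-space: the grid here has only 36 cells, but the approach pays off precisely when memory is cheap enough to hold an "almost infinite" scenario table.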
And the faster access to memory shared among many central and graphics processing units that The Machine aims to deliver "benefits anything we do with large graphs — for example, the relationship between genes, diseases and behaviors" of patients, he said.
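The graph workload Suermondt alludes to can be pictured as traversal over a network linking genes, diseases and behaviors. The nodes and edges below are invented for illustration; in a real system the graph would hold billions of relationships held entirely in memory.

```python
from collections import deque

# A toy heterogeneous graph linking genes, diseases and behaviors.
# All nodes and edges here are hypothetical examples.
edges = {
    "gene:BRCA1": ["disease:breast_cancer"],
    "disease:breast_cancer": ["gene:BRCA1", "behavior:smoking"],
    "behavior:smoking": ["disease:breast_cancer", "disease:copd"],
    "disease:copd": ["behavior:smoking"],
}

def reachable(start, edges):
    """Breadth-first search: every node connected to `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Everything linked, directly or indirectly, to one gene.
linked = reachable("gene:BRCA1", edges)
```

Traversals like this are memory-bound rather than compute-bound, which is why an architecture built around a large shared pool of memory suits them.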
"Everybody thinks this is hugely promising at unlocking the ability to get at all the data, which in a lot of ways we already have," Suermondt said.
"I think we're building a tool that can really help."