The University of Michigan has announced a computer so small it could potentially be implanted in the eye of a glaucoma patient to monitor intraocular pressure.
Glaucoma causes excess pressure inside the eyeball, which can damage the optic nerve and lead to blindness. That pressure needs to be monitored regularly, and this little device could make the job much easier. Described as the first millimeter-scale computer, it measures just one cubic millimeter and, incredibly, consumes only 5.3 nanowatts.
The computer recharges via solar power and transmits its data wirelessly. A machine this small could have many different applications, meaning that we should have flea-size cyborgs sometime soon.
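The announcement doesn't say how the chip manages its power budget, but devices in this class typically sleep almost all the time and wake briefly to take a reading. Here's a back-of-the-envelope sketch of that duty-cycling arithmetic; every figure in it is an illustrative assumption, not a published spec.

```python
# Back-of-the-envelope duty-cycle arithmetic for a millimeter-scale
# sensor node. Every figure here is an illustrative assumption.

SLEEP_POWER_NW = 0.5        # standby draw while the chip sleeps (nW)
ACTIVE_POWER_NW = 50000.0   # draw while measuring/transmitting (nW)
ACTIVE_MS_PER_WAKE = 10.0   # time spent awake per measurement (ms)
WAKES_PER_HOUR = 4          # e.g. one pressure reading every 15 minutes

active_seconds = WAKES_PER_HOUR * ACTIVE_MS_PER_WAKE / 1000.0
duty_cycle = active_seconds / 3600.0   # fraction of each hour spent awake

avg_power_nw = (ACTIVE_POWER_NW * duty_cycle
                + SLEEP_POWER_NW * (1.0 - duty_cycle))

print(f"duty cycle:    {duty_cycle:.8f}")
print(f"average power: {avg_power_nw:.2f} nW")
```

Even with an active draw tens of thousands of times higher than standby, the hourly average lands around one nanowatt, which is how a solar-charged speck can keep running indefinitely.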
Holy crap. I knew smartphones were ubiquitous, but it looks like there will soon be a case sold that transforms any iPhone into an ECG, or EKG (electrocardiograph).
Developed by AliveCor along with Oregon Scientific, the iPhonECG works via a pair of electrodes that relay their readings to the iPhone wirelessly: press the phone against your chest and you can monitor your heart. The software can store a complete history of heart rates and beats, and if you've got heart problems, the iPhone could someday be programmed to send alerts to your physician in case of emergency.
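That emergency-alert feature is still hypothetical, but the logic behind it would be simple. Here's a minimal sketch of how such an app might store a reading history and flag out-of-range heart rates; the thresholds, class name, and notification stub are all invented for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Tuple

# Minimal sketch of the kind of logic such an app could use.
# Thresholds and the alert step are illustrative assumptions.

LOW_BPM, HIGH_BPM = 50, 120   # hypothetical alert thresholds

@dataclass
class HeartLog:
    readings: List[Tuple[datetime, float]] = field(default_factory=list)

    def record(self, bpm: float) -> None:
        # Keep a complete history of (timestamp, rate) readings.
        self.readings.append((datetime.now(), bpm))
        if bpm < LOW_BPM or bpm > HIGH_BPM:
            self.alert_physician(bpm)

    def alert_physician(self, bpm: float) -> None:
        # Stand-in for a real notification (push message, SMS, etc.).
        print(f"ALERT: heart rate {bpm:.0f} bpm outside "
              f"{LOW_BPM}-{HIGH_BPM} bpm range")

log = HeartLog()
log.record(72)    # normal reading, just stored
log.record(138)   # triggers the hypothetical alert
```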
Incredibly, the units are expected to sell for less than $100 apiece. When you think about how much a traditional ECG machine costs, that's pretty amazing.
Kinect could bring touch-free interface to operating theaters
The hands-free interface developed by the Virtopsy research project to review medical images using Microsoft's Kinect
The development of open source drivers for Microsoft's Kinect motion controller is already opening up new (if not entirely unpredictable) applications for the device. This example, developed by members of the Virtopsy research project at the Institute of Forensic Medicine at the University of Bern in Switzerland, is a functional Kinect-based prototype that gives users a hands-free way to review radiological images.
With software based on ofxKinect, libfreenect and openFrameworks, the prototype uses a mix of voice control via a wireless headset and gesture control via the Kinect's 3D camera to drive OsiriX, an image processing application designed for navigating and visualizing medical images. The user switches modes with voice commands and then navigates the images – zooming in or out and moving the view through a 3D image – using one- or two-handed gestures.
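The Virtopsy team hasn't published its control code here, but the division of labor is easy to picture: the voice command picks a navigation mode, and hand positions drive it. Here's a rough sketch of that state machine in Python; the event names, scaling factors and viewer stub are invented, and the real prototype talks to OsiriX rather than printing.

```python
# Rough sketch of a voice-plus-gesture state machine like the one
# described above. Event names, scaling, and the send_to_viewer stub
# are invented for illustration.

MODES = {"zoom", "pan", "idle"}

class GestureController:
    def __init__(self):
        self.mode = "idle"

    def on_voice_command(self, word: str) -> None:
        # A wireless headset supplies recognized words; each one
        # switches the active navigation mode.
        if word in MODES:
            self.mode = word

    def on_hands(self, left, right) -> None:
        # left/right are (x, y, z) hand positions from the depth camera.
        if self.mode == "zoom" and left and right:
            # Two-handed gesture: hand separation sets the zoom level.
            spread = abs(right[0] - left[0])
            self.send_to_viewer("zoom", spread * 2.0)
        elif self.mode == "pan" and right:
            # One-handed gesture: hand position moves the view.
            self.send_to_viewer("pan", (right[0], right[1]))

    def send_to_viewer(self, action, value):
        print(f"{action}: {value}")  # stand-in for the image-viewer bridge

ctrl = GestureController()
ctrl.on_voice_command("zoom")
ctrl.on_hands(left=(-0.2, 0.0, 1.1), right=(0.3, 0.1, 1.2))
```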
While the focus of the Virtopsy project is to use new technologies to replace the standard autopsy with minimally invasive procedures, such a touch-free interface also has obvious benefits for surgeons who need to navigate medical images in the operating room. Gesture and voice controls would let them maintain sterility by doing away with the need to touch any keyboards, buttons, joysticks or touchscreens.
With news circulating that Microsoft is also working on a firmware update for Kinect that would quadruple the resolution of its 3D camera from 320x240 to 640x480 pixels, it seems inevitable that the device will find more and more uses beyond its gaming roots.
To me, the most enthralling idea presented in the massively important project you're about to experience is this: "I don't think of myself as the Patient of the Future – it is the perspective of the providers that puts me in this box of the Patient of the Future – I'm a patient of the present!" The project and video presentation is called "Design We Can All Live With," and it's about how a Minneapolis-based design firm is working toward a better patient-based healthcare system through technology, right this moment.
The firm goes by the name "Worrell," and its solution session imagines a world steeped in technology, but one that intimately connects doctors with their patients. The presentation is an extended conversation between design professionals working with Worrell, one doctor, and one patient – who, of course, represent the greater whole, expressing the wants and needs of the entire community.
The talk distills the past few years Worrell has spent speaking with patients – in their homes, in medical offices, and in hospitals – about their needs: the needs of the people Worrell knows are the real stakeholders in the medical world. Worrell is an industrial design firm that has worked in interactive and medical technologies for the past 35 years; it has been working on this particular project for the past four or five.
The images you see below in the gallery and later in the movie are of a set of technologies called "Pathway." These web-powered devices will not only store your medical records and keep them current, they'll also provide you, the patient, with helpful information, like articles on your condition sent directly from your doctor.
Take a look at the video "Design We Can All Live With" and be completely inspired and excited – not for the future of medical care, but for the soon-to-be present!
This post is about a wheelchair that allows its user to stand up easily. "The Leeding E.D.G.E" also features easy-drive handles with different gearing options to promote accessibility and combat the shoulder injuries commonly caused by traditional wheeling techniques. Designer Tim Leeding proposes that this wheelchair alleviates pressure sores and moves toward closing the social boundaries that "inhibit the lives of the disabled day to day."
The world of a person who must live in a wheelchair is one I don't pretend for a moment to understand. I believe each person lives a different life, and each deals differently with a situation they might not find ideal – becoming confined to a wheelchair partway through life, for example. Does giving a person with no use of their legs the ability to stand up temporarily work toward a better life for that person?
That question asked, this wheelchair seems to me to be quite the fabulous-looking bit of engineering. "The Leeding E.D.G.E" features "dynamic drive" handles that work with a rowing sort of motion, more energy efficient and less strenuous than traditional pushing. The chair's geared hubs offer 2:1 drive, 1:1 drive, neutral, and reverse gears – and, of course, that excellent standing mode.
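The hub details aren't spelled out, but the gearing trade-off is simple arithmetic: a higher ratio means more distance per stroke at the cost of more effort. Here's a quick sketch of how far one handle turn might carry the chair in each gear, assuming a typical 24-inch wheel and reading "2:1" as two wheel turns per handle turn – both assumptions on my part.

```python
import math

# How far one full handle turn carries the chair in each gear.
# Wheel diameter and the interpretation of the ratios are
# assumptions for illustration, not published specs.

WHEEL_DIAMETER_M = 0.6                 # roughly a 24-inch wheel
circumference = math.pi * WHEEL_DIAMETER_M

gears = {"2:1 drive": 2.0, "1:1 drive": 1.0,
         "neutral": 0.0, "reverse": -1.0}

for name, ratio in gears.items():
    distance = ratio * circumference
    print(f"{name:>9}: {distance:+.2f} m per handle turn")
```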
3-D Brain Model Could Revolutionize Neurology
By Stuart Fox
A 3-D model of a region of the cerebellum. Credit: The Whole Brain Catalog
LOS ANGELES – A new project aims to produce a Google Maps-like guide to the brain's labyrinthine structure. At a presentation here at the SIGGRAPH interactive technology and computer graphics conference, researchers highlighted how a complete 3-D model of the brain could spark a new era in neurological research.
Called The Whole Brain Catalog, the project compiles data from across the research spectrum in a variety of forms. It takes MRI data, pictures of stained neurons and theoretical diagrams of brain circuitry and presents them in a unified form that scientists, doctors and 3-D animators can all digest. Those users then contribute back to the site, wiki-style, producing an increasingly complete model of the brain at every scale, down to the molecular level.
“My dream is that you can build these simulations at any level of resolution,” said Stephen Larson, a neuroscience researcher at the University of California, San Diego, who works on the Whole Brain Catalog. “The key is to have something smaller than the brain that is still big enough for simulations.”
Mapping out all the connections in the brain has proved one of the most difficult challenges facing neuroscientists today. Indeed, they often compare the intricacy of the brain to the human genome — the name given to the entire library of human DNA — and have nicknamed the total map of all the brain’s links the “connectome”, Larson said.
Even before the Whole Brain Catalog is completed, scientists could use its data to run tests in the computer instead of on live subjects. That would drastically expand the range of possible neurological experiments while significantly cutting the cost, time and variables of studies.
Ultimately, the connectome can only be understood through a complete 3-D model that maps out every molecule, neuron and wrinkle, Larson said.
“The brain is complex because it isn’t rectilinear like the world we live in, which makes it counterintuitive,” Larson said.
The Whole Brain Catalog remains a long way from finishing its task of mapping the connectome, and faces difficult challenges moving forward. For one, most of the data comes to the project in wildly different formats, and there is no way to automate the process of unifying the different types of input. Additionally, some regions of the brain receive far more research than others, leaving some areas mostly blank, Larson said.
However, that last problem also presents an opportunity for the Whole Brain Catalog. Because the model reflects whatever data scientists generate, it is as much a diagram of trends in neurological research as it is a map of the brain.
By looking at what areas are more or less defined in the Whole Brain Catalog model, scientists can determine what regions of the brain need more attention, and plan future research that crosses traditional boundaries that isolate different kinds of brain research, Larson said.
i-Mos is a pair of funky-looking glasses that a speech-impaired person can use for easy communication. Although the user scenario suggested by the designer looks more tragic – involving physical disabilities as well – I'm going to stick with just the basics. The device tracks your eye movements as Morse-code input and then voices it out as speech: one eye is dedicated to "dot" and the other to "dash." Aids like sentence completion and built-in Morse code training add value to the idea and make usage more independent. A well-intended plan!
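The concept doesn't come with decoding details, but the core idea – turning per-eye blink events into Morse symbols and then into text – fits in a few lines. Here's a minimal sketch, with blink detection assumed to happen upstream of this code.

```python
# Minimal sketch of the i-Mos decoding idea: one eye emits dots,
# the other emits dashes, and a pause ends the letter. Blink
# detection itself is assumed to happen upstream.

MORSE = {".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
         "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
         "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
         ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
         "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
         "--..": "Z"}

def decode(events: str) -> str:
    """events: 'L' (left-eye blink = dot), 'R' (right-eye blink = dash),
    or ' ' (pause = end of letter)."""
    text, symbol = "", ""
    for e in events + " ":           # trailing pause flushes the last letter
        if e == "L":
            symbol += "."
        elif e == "R":
            symbol += "-"
        elif symbol:                 # pause: look up the finished letter
            text += MORSE.get(symbol, "?")
            symbol = ""
    return text

print(decode("LLLL LL"))  # ".... .." -> "HI"
```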
We've seen quadriplegic transportation directed by brainwaves, speech and even the occasional Wiimote, but your best bet might be to follow your nose. Israeli researchers at the Weizmann Institute of Science unveiled a "sniff controller" this week that measures nasal pressure to drive a wheelchair joystick with surprising precision, along with a specially developed typing interface. The latter is likely the more important advancement, as Discover heartwarmingly reports, because it gives patients with locked-in syndrome (a la The Diving Bell and the Butterfly) the long-lost ability to speak. Best of all, the technology is inexpensive compared to alternatives on the market: while a Stephen Hawking-esque eye-tracking system can cost tens of thousands of dollars, the Weizmann scholars reportedly pieced their prototype together for $358. The device is already being considered for public availability by the institute's technology transfer company, Yeda R&D.
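The study describes translating nasal-pressure changes into control signals; here's a toy sketch of how sniff strength and direction could become joystick-style commands. The thresholds and the two-sniff command scheme below are invented for illustration – the published device uses its own calibrated sniff codes.

```python
# Toy sketch of mapping nasal-pressure samples to wheelchair commands.
# Thresholds and the command scheme are invented for illustration.

SNIFF_IN_THRESHOLD = +1.0    # pressure rise above baseline (arbitrary units)
SNIFF_OUT_THRESHOLD = -1.0   # pressure drop below baseline

def classify(pressure: float) -> str:
    """Label one pressure sample as a sniff in, a sniff out, or nothing."""
    if pressure > SNIFF_IN_THRESHOLD:
        return "in"
    if pressure < SNIFF_OUT_THRESHOLD:
        return "out"
    return "none"

# A pair of consecutive sniffs forms one command, e.g. in-in = forward.
COMMANDS = {("in", "in"): "forward", ("out", "out"): "backward",
            ("in", "out"): "left", ("out", "in"): "right"}

samples = [1.4, 1.2]                       # two sharp inhales
pair = tuple(classify(p) for p in samples)
print(COMMANDS.get(pair, "stop"))          # -> "forward"
```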