Everyone knows what a Tricorder is supposed to do; Star Trek’s cultural influence has spread far and wide. Designing a device that could actually do the job is another matter entirely, although the people behind the Qualcomm Tricorder XPRIZE had enough faith to put up $10 million as an incentive to pull the idea further towards reality. One hand-held system that’s definitely along the right lines is called the Cyclops, and uses LEDs to shine different wavelengths of light onto an unknown substance and rapidly analyse the reflections it gets back. For now it’s limited to surface and only slightly sub-surface analysis, but it turns out there’s a lot you could do with just that functionality, and developers Visualant are working on ways to build it into a medical probe too. I interviewed their CEO about it, and what the ramifications might be.
Biophotonics is a growth area and likely to stay that way for a while, especially at places like University College London, which is becoming a centre for UK research in the field. I wrote about three of their projects for Optics.org, including some near-infrared imaging technology that’s already left the lab, and more on the potential use of optogenetics to switch off epilepsy, an approach where the regulators will need some further convincing.
Some eye diseases have a nasty habit of being all but untreatable by the time the patient is aware of the symptoms; a dirty trick on the part of the designer. Age-related macular degeneration is one, but AMD spends a long time brewing in the tissues of the retina before making itself known as actual vision loss. Now it turns out that it might be detectable during that gestation period, through a simple test of how a patient’s eyesight adjusts to darkness. I wrote about that test for Optics.org.
Also in the eyesight department: retinitis pigmentosa is being tackled through electronic implants of two different types, the epiretinal and subretinal varieties. Each sits in a slightly different position in the retina and does its job in slightly different ways; neither is guaranteed to succeed nor to be hassle-free for the patient. Intriguingly, the two approaches ultimately stem from a fairly fundamental difference of opinion about what the disease does to the retina.
Robotic surgical systems are becoming more advanced all the time; the top-of-the-line ones already have a distinct air of science-fiction production design about them. The overlap with the equally rapid advances in biomedical imaging techniques seems obvious, but the synergies are not materializing as fast as you might think. I spoke to Intuitive Surgical to find out why, and what benefits will follow once the two camps get in sync.
And: a report from a conference on biophotonics held at University College London. Among other things discussed, a potential use of optogenetics for turning epilepsy off in the brain as soon as it starts to manifest itself.
Two more technology stories left over from my trip to San Francisco:
A team at UCLA have spent the last few years developing microscopes that don’t use lenses to do their magnifying, relying instead on something closer to holography to get the job done. Among other advantages, this allows a functional microscope to be small enough to be slotted onto a modern cell phone and use the phone’s camera to record the images; and that in turn allows meaningful microscopy to potentially be done on a cell phone just about anywhere. This is “telemedicine,” and it could be huge. Case in point: images of malaria-infected red blood cells taken in the field, fed into an online system and presented to volunteer observers around the world for them to visually sift through. The real trick is making people want to get involved in such an exercise, and accommodating the fact that they might not be any good at it. The answer: turn it into a game.
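The reconstruction step that replaces the missing lens is, at heart, digital back-propagation of the recorded diffraction pattern. Below is a minimal sketch using the standard angular-spectrum method; the function name and all parameter values are illustrative assumptions, not anything specific to the UCLA system.

```python
import numpy as np

def backpropagate(hologram, wavelength, pixel_size, distance):
    """Angular-spectrum back-propagation: numerically refocus an in-line
    hologram recorded at `distance` from the sample, with no lens at all."""
    ny, nx = hologram.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    # Keep propagating spatial frequencies only; clip the evanescent ones.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.clip(arg, 0.0, None))
    transfer = np.exp(-1j * kz * distance)  # negative z: propagate back
    field = np.fft.ifft2(np.fft.fft2(hologram) * transfer)
    return np.abs(field) ** 2  # reconstructed intensity image
```

A real pipeline would add background subtraction and twin-image suppression, but the core idea — that the blur recorded by a bare camera sensor still contains a recoverable image — is all in those few transforms, which is what makes a phone’s processor sufficient for the job.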
Optical coherence tomography (OCT) is not recent technology; it’s already ubiquitous for looking at the structure of the retina to a depth of a millimetre or two, among other things. But millimetres was more or less the technique’s limit. Tweaking things so that the same method can image over distances measured in centimetres, taking an image of, say, the entire eye from front to back, is a big step forward. The idea of extending the range even further, over meters of free space, seems almost ridiculous; but that could be on the way too.
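The basic trick in Fourier-domain OCT — a reflector at a given depth stamps a fringe pattern onto the detected spectrum, and a Fourier transform turns that fringe frequency back into a depth — fits in a few lines. The wavenumber sweep and reflector depth below are invented for the sketch:

```python
import numpy as np

# A reflector at depth z modulates the detected spectrum as cos(2*k*z);
# an FFT over wavenumber k recovers the depth profile (the "A-scan").
# All numbers here are illustrative, not from any particular instrument.
n = 2048
k = np.linspace(7.5e6, 8.5e6, n)        # wavenumber sweep, rad/m
depth_true = 0.8e-3                     # reflector 0.8 mm below surface
spectrum = 1.0 + 0.5 * np.cos(2 * k * depth_true)  # interference fringes

a_scan = np.abs(np.fft.rfft(spectrum - spectrum.mean()))
dk = k[1] - k[0]
depths = np.arange(len(a_scan)) * np.pi / (n * dk)  # FFT bin -> depth
print(f"recovered depth: {depths[a_scan.argmax()]*1e3:.2f} mm")
```

The reason range extension is hard is visible in the maths: deeper reflectors mean faster fringes, so imaging over centimetres rather than millimetres demands finer spectral sampling and far better source coherence, not just a brighter laser.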
I spent the first week of February in San Francisco; some of it looking at Alcatraz Island and pondering the best bits of Point Blank, which is all of them, but most of it deep in the warrens of the Moscone Center while the Photonics West trade show was held upstairs. Among other things I was there to help write the show’s daily newspaper and sit in on some of the technical conference sessions, a job which always amounts to a crash course in the state of the art of five or six different interconnected industries. The fruits of that will feed through for a while in one form or another; here are a couple of the early results.
It’s not news that Raman spectroscopy should be able to spot malignant tumours in early stages of development and be a real boon to cancer diagnosis, but it will definitely be news when it becomes a practical clinical proposition. Vancouver-based Raman specialist Verisante won an award in San Francisco for its Aura probe, which looks like it might be the answer for skin cancer, and a version for internal cancers of the lung and bowel is now in trials, with FDA approval on the agenda. I interviewed Verisante’s CEO about it, and the advantages of Vancouver as a location for effective cancer research. Among the things cut for space was the fact that he spent considerable time and money trying to make terahertz technology work in this field, before realising that he was backing the wrong horse and switching to Raman instead.
Bioimaging is booming, since the technology is at that stage of development where each incremental step opens up numerous potential applications. But the keyword is “potential”, and the inherent risk is that clinical usefulness gets left behind in the rush forward, with the technology delivering more and more information and the scientists having less and less idea exactly where it comes from. This sentiment was articulated neatly by Bruce Tromberg from the Beckman Laser Institute as being “the really beautiful but also somewhat horrifying thing about this field.”
And I interviewed Paul Thurk, a venture capitalist from the US now based in Ireland, about the cultural differences in his line of work on either side of the Atlantic. Turns out there are plenty, not just in things like employment law and pension plans, but also in people’s attitudes to those sorts of topics. This is far from trivial when it comes to successfully prodding a promising technology forward on an international stage and not having the enterprise crumble in your hands. The interview is in the third issue of the Photonics West daily publication, which for now is only available here as a pdf download, but the topic will be back.
Two days in Bordeaux at an investment conference duly produced two published pieces, one on the current state of biophotonics markets and one doing a similar thing for cleantech.
A capricious mix of optimism and nervous glancing at overhead thunderclouds has become the new standard in both sectors. Folks connected with LED lighting predict a revolutionary couple of years ahead, while delegates from the solar sector mostly looked like a revolution had just run them over with a steamroller. The future course for biomedical optics looks certain; an aging population in need of ophthalmic surgery will help take care of that, especially once the high-volume consumable items associated with the procedures are properly incorporated into the value chain. But the circumstantial evidence suggests that for now no one involved is actually making much money.
Not shown: a graph predicting that by the end of 2015 there will be two working IP devices for every person on the planet, all down to video streaming, machine to machine traffic, and device to device communication. Also not shown: the other graph indicating that the current capacity of data networks is nowhere close to handling that kind of avalanche effectively.
“The carrier’s business model is broken,” said Ian Jenks, whose company Intune Networks is in the network virtualization business and whose presentation could have come with a klaxon, if the message wasn’t becoming familiar from other quarters too. “By the end of 2013, the cost of transmitting one bit of data will be greater than the revenues received for doing so.” In this case the cloud brings not just a silver lining but the route to salvation.
A semi-regular round-up of some technology stories I’ve written about recently:
X-ray vision didn’t do Ray Milland much good in the long run; just as well then that the notion is a bit of a pipe dream. But seeing through an opaque screen is theoretically feasible, at least on a very small scale. You just have to decode the scrambled information that the photons in the blurry image are still carrying about the hidden object. All of which is far from trivial, needing a secondary laser shining on the screen and a whole lot of computing power, but it’s still a nifty trick.
More vision things: an artificial lens made to model the specialized design of natural lenses in some animal eyes, which have a range of refractive index values instead of being stuck with just one. This involves the small matter of eight thousand layers of nano-scale polymer, co-extruded and moulded very precisely.
Motion sensors are cropping up everywhere, for the most part based on long-established principles that limit how small they can be. So a new design small enough to fit onto chip-scale devices could ultimately be big business. This sensor is so tiny that, thanks to a fine example of the things that can happen in small-scale physics, it technically produces a localised temperature of -270 °C for itself to work in.
Also: new machinery for additive manufacturing in Germany; more money for additive manufacturing in the UK. The odd thing about this field is that I first saw it in action two decades ago, and the language about its great promise doesn’t seem to have changed much in all that time.
Plus: Measuring the physical stress that’s unavoidably caused in an embryonic heart by its own beating, which turns out to be connected to some long-term health issues once the embryo stage is long gone.
And: A potential way to diagnose bladder cancer early using confocal Raman spectroscopy, a technique whose suitability for this kind of thing has been known for a while but which never quite solved a couple of fundamental hurdles.
A semi-regular round-up of some technology stories I’ve written about recently:
Curiosity continues to trundle across Mars, and another article about its on-board laser is here on Optics.org. As always with space applications, it’s the timescales that surprise but probably shouldn’t. ChemCam is the first planetary application of laser-induced breakdown spectroscopy, an attention-grabbing media-friendly experiment to zap bits of Martian geography, and a landmark experiment in every sense. Thales Optronics were told “Nice laser; now make it 95 percent lighter and need no cooling,” and achieved it within a couple of years. It’s state of the art – for 2003, when the design was locked down. Although some of the smaller sensor components developed for ChemCam have gone on to Earth-bound uses in the decade since, the new class of laser that Thales designed has not found its way into commercially viable applications because it’s remained stubbornly too expensive. It’s a relic from the past, currently engaged in one of the most futuristic experimental scenarios on the books.
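The analysis side of laser-induced breakdown spectroscopy comes down to matching emission peaks in the plasma’s light against a table of known atomic lines. A toy version of that step, with real well-known line wavelengths but a made-up set of detected peaks, and nothing to do with ChemCam’s actual pipeline:

```python
# Reference emission lines, in nanometres (genuine, well-known values).
lines_nm = {"Si": 288.2, "Fe": 404.6, "Ca": 422.7, "Na": 589.0}

def identify(peaks_nm, tolerance=0.5):
    """Return elements whose reference line sits within `tolerance` nm
    of any detected peak in the spectrum."""
    found = set()
    for peak in peaks_nm:
        for element, ref in lines_nm.items():
            if abs(peak - ref) <= tolerance:
                found.add(element)
    return sorted(found)

print(identify([288.0, 422.5, 589.3]))  # -> ['Ca', 'Na', 'Si']
```

The real instrument juggles thousands of candidate lines, plasma temperature effects and calibration against known targets, which is why the clever part is the spectroscopy and not the zapping.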
My grasp of gene sequencing tends to be a consistent three years out of date, so it’s only when studies like ENCODE prompt round-ups of where things stand that the vast scale of the topic comes into focus. But now, a decade and a bit into the human genome project, there are some promising practical applications becoming commercially viable. A children’s hospital in Kansas believes it can sequence enough of the genome of a sick child to identify some life-threatening hereditary conditions in a short enough time to start meaningful treatment, through a combination of a custom-built piece of software and a sequencing machine designed to get the job done as fast as reasonably possible. That involves elegant use of a fluorescence-tagging technique that’s been around for a while, but in Kansas they pressed the accelerator. Potentially, clinicians could get results in their hands in as little as 50 hours, although the trial hit some logistical issues that got in the way of that nice round number.
Adaptive optics is a feature of some astronomical telescopes, compensating for natural atmospheric disturbances by deliberately deforming a mirror very, very rapidly with mechanical actuators to counteract the interference. Ophthalmologists face similar problems when trying to get a good look at the retina to spot the onset of certain retinal diseases, but deformable mirror technology was never quite good enough to pull off the same trick dealing with the wide array of optical distortions found in human eyes. Imagine Eyes think they’ve cracked it, in a system that should mark the first reasonably practical transfer of adaptive optics into a normal clinical environment – practical, in that the machine to do the job isn’t the size of a small car.
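The control loop behind any adaptive optics system, astronomical or ophthalmic, can be caricatured in a few lines: a sensor measures the wavefront error, a least-squares reconstructor maps it onto mirror actuator commands, and an integrator applies a fraction of the correction each cycle. Everything below — matrix sizes, gain, the random “aberration” — is an invented toy, not Imagine Eyes’ design:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sensors, n_actuators = 24, 8
influence = rng.normal(size=(n_sensors, n_actuators))  # mirror -> sensor
recon = np.linalg.pinv(influence)                      # sensor -> mirror
aberration = rng.normal(size=n_sensors)                # static distortion

commands = np.zeros(n_actuators)
gain = 0.5                                             # integrator gain
for _ in range(20):                                    # control cycles
    residual = aberration + influence @ commands       # what sensor sees
    commands -= gain * (recon @ residual)              # push error down
print(f"residual rms: {np.linalg.norm(aberration + influence @ commands):.3e}")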
A mention for three recent photonics stories I’ve covered:
The business end of the Mars Science Laboratory touched down on Mars, surviving a landing sequence that looked tough on paper and practically suicidal in the animations. Curiosity promptly took some high-definition pictures that made Mars look like the service road at your average quarry, splendidly enough. There’s a laser on Curiosity, designed to fire at nearby rocks and do spectroscopy on the plasma that comes off, and it too seems to have survived the trip without blowing a fuse. It fired at a rock that looked a lot like basalt; it was basalt. Early days. But fair play to Nasa; this is all a very long way from the bad old days of the Climate Orbiter, a probe sent to Mars that was inadvertently programmed for Andromeda. A longer article about the science behind the ChemCam instrument may be along in due course, all being well.
Meanwhile, DARPA is calling for proposals from companies with a vision of what ultra-fast lasers might be useful for in the future, and how to manufacture the systems that might stand a realistic chance of getting those jobs done. This is hard-core blue-sky thinking: “Literally any technology that uses electromagnetic radiation could be impacted,” says the man from DARPA, although in practice the technologies of value to the US military will inevitably be the ones that get the initial momentum.
Tiny laser-projectors able to shine a high-definition image from a hand-held unit have been promised for ages, but stumbling blocks kept cropping up. The most significant one used to be obtaining the right kind of green laser source, since the designer of the universe went out of Her way to make that tricky. Those sources are finally turning up, but now the physical size of the unit is not always very conducive to putting it in a pocket phone – as The Goodies pointed out decades ago, the trick is to wear the special trousers. But in the meantime, automotive companies have zeroed in on pico-projectors as a way to build a better heads-up display, and that’s an application where the projectors could fit right in. MicroVision has just received $5 million from Pioneer towards that very aim; this after the company rejigged its entire business model to try and stop the cash from flowing in the wrong direction.
A mention for some of the optics and photonics stories I’ve covered lately:
The B612 Foundation is raising funds for what would be the first privately funded deep space mission, putting its own infrared observatory into orbit around the Sun at roughly the orbital distance of Venus. Ambitious, is probably the appropriate word. B612 are campaigners on the topic of asteroid deflection, and having spotted that NASA isn’t going to complete the mapping of potentially hazardous bodies in the inner solar system to its satisfaction any time soon, it wants to get out there and start having a look itself. A space mission funded by a private organisation with something of an evangelical approach to a certain catastrophic turn of events has some pop-cultural bite about it, and the Foundation rather smartly takes its name from the asteroid in The Little Prince, so my dream of one day writing about the ground-breaking efforts of the Acme Corporation and Roxxon Oil is back on track.
The dietary habits of a hominin who lived in South Africa two million years ago have been figured out, by analysing carbon dioxide lasered out of his tooth enamel. Turns out he was picky. My take-away from this is that I have the wrong idea entirely about the fragility of tooth enamel.
A system at Palomar Observatory should be able to yield direct optical evidence of the presence of exoplanets orbiting other stars, rather than relying on indirect detection of changes in starlight or gravitational effects. This ultimately revolves around getting the speckle caused by the atmosphere out of the picture, using the first “extreme” adaptive optics system deployed for astronomy. They know their AO tech at Palomar, and reckon they might eventually be able to achieve Hubble-calibre resolution from the ground.
Keeping unmanned drone aircraft in the air for longer used to mean installing bigger batteries, but somehow recharging the ones they’ve already got without needing them to land might be a better bet. Shooting a laser at them from the ground and re-juicing a drone that way sounds fairly simple; in principle it ties back into the ideas of wireless power transmission that used to keep Nikola Tesla awake at night, and which then kept a lot of other people awake when he explained to them what he meant. Lockheed Martin and LaserMotive have made it work in a wind tunnel, but the technical bottleneck might be the cells on the drone that turn an incoming laser into electricity, tech which has not been particularly high on anyone’s research program lately.
Two holdovers from my trip to Baltimore for SPIE DSS 2012 have appeared on Optics.org:
Since shining a laser pointer in the general direction of an aircraft pilot seems to be the jape that won’t go away, optical filters that can limit the dazzle are in demand, and the military are just as interested in them as commercial airlines – probably more. Filters usually work by blocking the particular laser wavelengths used in your average pointer, but a better approach might be to block incident light based on its power instead. KiloLambda showed off a wide-band filter that uses a layer of carefully manufactured nanostructures and exploits their non-linear optics to block laser light when it passes a designated power threshold. Below that, the filter stays clear at all wavelengths; above it, transmission is either limited to a certain value or blocked completely.
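KiloLambda’s nanostructure physics is its own business, but the transfer behaviour described — transparent below a threshold, then limited or blocked above it — can be modelled as a simple piecewise function. The threshold numbers here are invented for the sketch:

```python
def limiter(power_in, threshold=1.0, block_above=None):
    """Transmitted power for a power-selective filter: pass everything
    below the threshold unchanged, clamp transmission above it, and
    optionally block entirely past a second, higher cutoff."""
    if block_above is not None and power_in >= block_above:
        return 0.0  # hard shutdown regime
    return min(power_in, threshold)

# Below threshold the filter is transparent; above it, output is pinned.
print(limiter(0.3))                      # -> 0.3
print(limiter(5.0))                      # -> 1.0
print(limiter(50.0, block_above=10.0))   # -> 0.0
```

The appeal over wavelength-blocking filters falls out of the model: the decision is made on incident power alone, so the filter doesn’t care which colour of pointer is being waved at the cockpit.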
Nanostructures are also the key to a scintillator material made by the applied research arm of Georgia Tech, a cerium-doped gadolinium halide material cast in a glass that scintillates when hit by incoming gamma-rays. If the nanoparticles used can be held below a certain size, and about 20 nanometers or so seems to do it, then the scattering of the scintillated light which can bedevil any kind of accurate reading is drastically reduced. Plus a glass or glass-ceramic material is much easier to handle than a fragile scintillation crystal. Plenty of room for improvement in the resolution, though.
And away from DSS: further work on retinal implants, a topic that has now entered the watch-lists of TV news producers everywhere. The principle of restoring sight to someone suffering from a condition like age-related macular degeneration hinges on the fact that it’s the photoreceptors in the retina that have died, not the neurons behind them. Implants to take over the job of the receptors and fire the neurons when light hits the retina are well past the status of pipe dreams, but there’s a wealth of problems – not least among them, how to power the array of photoreceptors while it goes about its business. A group from Stanford University, applying scrupulous logic, think the answer may be to build an implant which draws its power from the same incoming infra-red light which brings the visual data, and their lab trials suggest that they’re on the right lines. Not quite a solar-powered retinal implant, but not too far off. Not a bionic eye either, although Steve Austin is the gift that keeps on giving for headline writers on a deadline.