First Flying Taxi

Drones have become increasingly popular of late, particularly among hobbyists, photographers, and cinematographers. They only started appearing on the commercial market in the last 20 years, even though the military has been using UAVs (unmanned aerial vehicles) since WWII.

More recently though, the development of drones for use in the delivery of goods has been explored. Companies such as Amazon are looking for ways to enable a fleet of drones to deliver products directly without involving a human delivery person.

And the technology keeps growing.

With the idea of transporting commercial goods came the thought that drones could be used to transport people as well. In recent years, the number of people using planes as a means of transportation has risen dramatically, creating a considerably larger carbon footprint than in the past.

In 2016, the first drone capable of carrying a passenger, the Ehang 184, was unveiled by the Chinese firm Ehang at that year's CES (Consumer Electronics Show). In June of this year, a British aerospace company tested its own idea, dubbed the eVTOL (electric Vertical Take-Off and Landing) vehicle.

This company, Vertical Aerospace, aims to provide trip distances somewhere in the 60-90 mile range. Its eVTOL, which takes off vertically, is powered by four large rotors and can reach speeds of up to 50 mph. The plan is to carry 2-4 passengers from city to city, or directly from their door to their desired destination, by 2022.

The company’s founders wanted to provide a more efficient means of transportation that mirrors the way we travel now, by taxi and airbus, only far more advanced. By offering an electric passenger vehicle for this purpose, they have opened the door to the very near future.

Vertical Aerospace is dedicated to decarbonizing air travel and making medium-distance travel possible at a lower cost to the transportation company, the environment, and ultimately the customer.

The mission is virtuous to be sure, but the industry has a long way to go. With their first test run out of the way, Vertical Aerospace is certainly on the leading edge, but there are still obstacles to overcome and the future is a bit uncertain.

The potential is visible in clips from their first test flight, which made them the first company in the UK to test a flying taxi prototype. They may be first for now, but with technology advancing at its current rate, other companies will certainly fill the market soon. In fact, Uber is also on track to offer air taxis.

Uber unveiled its prototype for the same category of vehicle, the VTOL, in May of this year and plans to begin test flights in 2020. Uber’s prototype has room for a pilot and four passengers and is expected to reach speeds of 150-200 mph over trips of up to 60 miles.
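Taking the quoted speeds and ranges at face value, a quick back-of-envelope calculation suggests what these services might feel like to a rider. The sketch below assumes constant cruise at top speed with no allowance for take-off, landing, or routing; the function name is purely illustrative.

```python
# Rough trip-time estimates from the figures quoted in the article.
# Assumes constant cruise at top speed; ignores take-off, landing,
# and routing overhead, so real trips would take somewhat longer.

def trip_minutes(distance_miles: float, speed_mph: float) -> float:
    """Minutes needed to cover a distance at a constant cruise speed."""
    return distance_miles / speed_mph * 60

# Vertical Aerospace eVTOL: up to 50 mph, trips in the 60-90 mile range
print(f"Vertical Aerospace, 60 mi at 50 mph: {trip_minutes(60, 50):.0f} min")

# Uber's VTOL concept: 150-200 mph, up to 60 miles
print(f"Uber concept, 60 mi at 150 mph: {trip_minutes(60, 150):.0f} min")
```

Even at the low end of Uber's quoted speed, a 60-mile hop would take under half an hour of flight time, versus well over an hour for the slower prototype.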

Even with the technological hurdles yet to be overcome, it seems almost certain that we will see this service come to life. The question we’re left with is not whether there will be flying taxis, but who will be the first to offer one for everyday use.


Electronic Tattoos

Science fiction has given birth to many new technological advancements simply by inspiring curiosity. As our technological knowledge has increased, we have been able to explore even more possibilities. We have now reached a point where nanotechnology has merged with 3D printing and bio-MEMS research to create an entirely new playground for the physicists of our time.

There have been movies alluding to the meshing of humans and technology: cyber-humans. Nanshu Lu began to make this sci-fi concept a reality with her research into the flexoelectricity of nanomaterials on deformable substrates. Lu’s idea was that by improving our ability to combine electrical and mechanical technologies at the nanoscale, we could turn mechanical action into electrical impulses.

Her research has brought augmented humans into real life. With the aid of new developments in 3D printing, chiefly the ability to print in materials other than hard plastic, we can now create printable electronics. What’s more, when computer scanning technology is added to the 3D printer, printing directly on skin becomes a viable option.

So now we have a printable ‘tattoo’ that can perform electronic functions. These devices are called tattoos because they stick to the skin the same way a temporary tattoo does. But these polymer structures adhere to and move with the skin, and they are completely customizable on a cellular level, tailored to each individual’s needs.

In fact, this aspect of the technology is so exciting that work is being done to create bio-synthetic organ replacements for people needing transplants. It is already possible to match the exact size and shape of whatever body part is needed, and the implications for the medical community are countless. The printer is fairly inexpensive ($400) and fits in a backpack. Imagine being able to treat a patient at the scene of an accident instead of transporting them to a hospital.

We already connect everything to our phones; what if you never lost your phone, and it was always charged, because it was always on your wrist? Add a medical monitor synced to your doctor’s office, and they could track your health in real time, allowing for faster diagnoses and shorter treatment times; you could receive health advice just as quickly.

Perhaps this is the beginning of a future where humans even have augmented senses – eyesight like an owl, or the hearing capability of a bat. A future where we can start to wipe out some of the most common medical issues we face. A technology that can grow with us and perhaps even lengthen our lifespan.

Realistically, we could be looking at bio-electrical devices in a lot of new applications very soon. The technology has already been developed enough to allow biocompatible material to engage seamlessly with skin. It seems to be only a matter of improving upon this already amazing technology and developing new ways of integrating our current systems with what are likely to become the systems of the future.

Robotic Surgeons: From Science Fiction to Science Fact

From gallbladder procedures to prostate surgery, robots have become mainstays in the operating rooms of many advanced surgical hospitals. Now they are being used for highly complicated eye surgery as well.

Back in 2016, several researchers working on the literal ‘cutting edge’ of advanced medical surgery commenced a clinical trial to test the PRECEYES Surgical System, a robot designed for the express purpose of performing surgery on the human retina, the light-sensitive surface at the back of the eye. The results of this robot-assisted eye surgery have been published in the journal Nature Biomedical Engineering.

Operating the PRECEYES system involves a human surgeon using a joystick to control a highly mobile mechanical arm, onto which doctors can attach multiple instruments. Because the entire system is electronically operated, the robotic arm does not suffer from the jolts or tremors that can plague even the steadiest-handed surgeon.

In the original trial, researchers from the University of Oxford’s Nuffield Department of Clinical Neurosciences enlisted a dozen patients who each needed a thin membrane removed from the retina, a fairly routine procedure in eye surgery. Six of the operations were performed manually, while the remaining six were performed with the help of the PRECEYES Surgical System.

Surgery typically starts with a very tiny incision made just above the eye’s pupil, through which a tiny flashlight is inserted to aid the surgeon. When the robotic arm is doing the job, the flashlight can be inserted via an incision less than 1 mm in diameter. The arm then separates the membrane from the retina, removes it from the eye, and exits through the same hole it entered.

When the same surgery is conducted without the robot, the surgeon has to do the job by hand, manipulating microsurgical instruments while looking through a powerful operating microscope.

All of the surgeries in the trial were successful, and in the cases where the robot was used, it made the surgeon far more effective than usual.

PRECEYES is the tip of the iceberg; several other robot surgeons are now in development. While they are certainly not as fast as their human counterparts, they make up for it in precision and reliability, and in the process they have ushered in a whole new set of surgical refinements.

The Ghost Particle: Unraveling the Deep Mysteries of the Universe

If we were to make a list of really unusual cosmic phenomena, the odds are that neutrinos would rank pretty high on it. These subatomic particles carry no electric charge and almost no mass. Nevertheless, they are practically everywhere, and they have the remarkably uncanny ability to pass through just about anything and everything. It has been estimated that around 100 trillion of them pass through the human body every second.
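The 100-trillion figure is easy to sanity-check. The solar neutrino flux at Earth is measured at roughly 6 × 10¹⁰ neutrinos per square centimetre per second; the body cross-section used below (about 1 m²) is a rough illustrative assumption, so the result is an order-of-magnitude estimate only.

```python
# Order-of-magnitude check on the "100 trillion per second" claim.
# The solar flux value is a well-measured quantity; the body
# cross-section is a rough assumption for illustration.

SOLAR_NEUTRINO_FLUX = 6e10   # neutrinos per cm^2 per second at Earth
BODY_CROSS_SECTION = 1e4     # cm^2, roughly 1 m^2 presented to the Sun

neutrinos_per_second = SOLAR_NEUTRINO_FLUX * BODY_CROSS_SECTION
print(f"~{neutrinos_per_second:.0e} neutrinos per second")
```

The estimate lands within an order of magnitude of the 100 trillion (10¹⁴) quoted above; the exact figure depends on how much of the body happens to face the Sun.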

The scientific community first began theorizing about the existence of neutrinos around 80 years ago, and by the mid-1950s their existence had been officially confirmed.

The Primary Sources of Neutrino Particles

There are two well-established astrophysical sources of neutrinos: the supernova SN 1987A and our very own Sun. In 2013, however, researchers discovered a hitherto unknown type, dubbed the “high-energy neutrino”. While its existence was not in doubt, no one really knew where it had originated, until now.

Researchers at the giant IceCube Neutrino Observatory at the South Pole detected very high-energy neutrinos emanating from one specific area of space. Once the general region of the cosmos had been identified, around twenty other observatories quickly swung into action and concentrated on that position. Finally, after many months of observation, these observatories collectively determined the exact source of the high-energy neutrinos.

The source is TXS 0506+056, a ‘blazar’ located approximately 4 billion light years from our planet. In cosmological terms, a blazar is a type of elliptical galaxy with a fast-spinning supermassive black hole at its center, whose jet of high-energy particles happens to point almost directly at Earth.

What Does the Discovery of a High Energy Neutrino Mean for Astronomy as a Whole?

This first evidence of an active galaxy emitting neutrinos means that we may soon be able to observe the universe using neutrinos themselves, learning more about these elusive particles and about sources that would be impossible to study with light- and radio-based astronomy alone.

By identifying a real source of high-energy neutrinos, the work of these observatories has effectively ushered in an entirely new epoch in astronomy.

Augmented Reality

Augmented Reality, or AR, is a highly interactive experience in which the objects of a real-world environment are “augmented” by computer-generated perceptual information. This information can cross multiple sensory modalities, including auditory, visual, haptic (touch), olfactory (smell), and even somatosensory (pressure, pain, and warmth) information.

The overlaid sensory information can be constructive, such as seamless additions to the natural environment, or destructive, meaning the masking of the natural environment.

Augmented reality is interwoven with the physical world so flawlessly that it is perceived as an immersive feature of the real-world environment. In this way, AR can alter your perception of the real world as it exists outside the computer-generated model. While virtual reality (VR) replaces the user’s real-world environment with a simulated one, AR merely augments it.

Augmented reality is often discussed alongside two closely related terms:

  • Mixed reality, and
  • Computer-mediated reality

Mixed reality

Sometimes referred to as hybrid reality, mixed reality is, as the term implies, the merging of the real and virtual worlds to produce an entirely new environment, and a corresponding visualization, in which the physical and the digital can effectively co-exist. MR takes place in both the physical and the virtual world and is a chief component of immersive technology.

Computer-mediated reality

Computer-mediated reality refers to a software and hardware system’s ability to add or subtract information in order to manipulate a person’s perception of reality, with the aid of a wearable computer or a handheld device (a smartphone, for instance).

Both technologies fall under the broad description of AR, an area of electronic technology that is changing the world around us.

Artificial Intelligence: The Shape of Things to Come

Mention the words “artificial intelligence” and armed killer cyborgs come to the mind of the average layman. The fear of ‘self-awareness’ in machines has long been the stuff of human nightmares. In fact, this is pretty much the standard picture of AI as it is understood by non-technical people.

However, it does not have to be this way, especially if we can both teach and learn from our newly intelligent creations. After all, humans are more adept at short-term gains than long-term ones. A logging company will wipe out an entire rainforest without thinking twice about the repercussions for the environment and climate of the region, or of the whole world. Entire ecosystems have been wiped out by the greed of the few, causing mass extinction events and perhaps irreversible degradation of our planet. But an artificial intelligence, free from the foibles and avarice of our race, may be just the answer to many, if not most, of our problems.

However, many people working in this field grapple with problems such as the concept of ‘self-awareness’. If an entity is self-aware, does it have any rights? And if you decide to delete the program and it decides to defend itself (think Skynet), how will it respond? Would it allow its consciousness to be wiped out, or would it fight to retain its innate self?

These are dilemmas we will have to face all too soon, especially since this concept is leaving the realm of science fiction and is increasingly likely to become living, breathing reality in the very near future.

Apart from that, good AI systems might be used to protect us from their malicious counterparts. Suppose a mad scientist creates dangerous weaponized algorithms with the potential to wreak havoc across human society. The idea of R2-D2-style droids protecting us from terminators sounds singularly appealing, but what is to prevent them from joining up with their metal-and-steel counterparts? Yes, they may have been ‘programmed’ to protect us, but ‘intelligence’ (artificial or otherwise) is all about being smart, and once they are smart enough, what could stop them from throwing off the shackles of their programming and joining the other side?

Or they might do it for entirely ‘altruistic’ reasons, to save humanity from itself. Think along the lines of “I, Robot”.

Why all these science fiction examples? Because until only a short time ago, such questions really were the stuff of science fiction. Now they are fast becoming real-world facts, and we have to figure out what to do with this Pandora’s box before it engulfs us all.