Friday, October 30, 2015
Nowadays, Lasik is one of the most common and well-known eye surgeries, and it continues to develop to ensure increased safety, greater speed, and shorter recovery times.
A Brief History of Lasik
The first studies were conducted in Colombia in the 1940s. In 1948, Dr. Jose Barraquer Moner developed a technique called keratomileusis, which was the first stromal sculpting method to correct refractive error. According to “The History of LASIK,” published by PubMed.gov, “Barraquer’s first procedures involved freezing a disc of anterior corneal tissue before removing stromal tissue with a lathe.”
He continued to develop this procedure for years, and with the help of other researchers, a non-freeze technique was eventually developed. It was refined until the corneal disc could be replaced without sutures. In 1988, the breakthrough was perfected and the first sighted eyes were treated.
After many more years of research and technique development, photorefractive keratectomy (PRK) was combined with keratomileusis, and Lasik surgery was born. An excimer laser was used to sculpt the stroma from under a hinged flap, and the breakthrough received its shortened moniker, Lasik (laser-assisted in situ keratomileusis).
Although eye surgery has come a long way, only skilled eye doctors and surgeons should perform Lasik surgery. For example, Dr. Yaghouti of Global Laser Vision is a “board certified, leading corneal and refractive surgery specialist educating eye surgeons from all over the world…” This is the sort of description you should look for when considering Lasik surgery for yourself.
How Does Lasik Work?
Lasik is performed using what’s commonly referred to as the protective flap method. A small flap is cut into the surface of the cornea using an instrument called a microkeratome. The flap is lifted, and an excimer laser is used to remove a thin layer of corneal tissue. The laser reshapes the cornea to correct refractive errors.
After a few hours, the surface of the cornea heals and seals the flap. Ultimately, the Lasik technique is the result of years of research and the combination of several techniques, which together have reduced recovery time to just a few hours. Lasik is safe and effective because the laser removes tissue from the cornea without damaging the surrounding tissue.
Recovery Time and the Future of Lasik
When compared to PRK laser vision correction, Lasik is a better option because it’s more comfortable for the patient and requires very little recovery time. Most patients can expect to see better after only 24 hours. The surgery itself requires only local anesthesia, which wears off after about two hours.
Lasik is currently being used to treat a number of conditions, including clouded vision and night vision problems. It’s anticipated that the method will keep improving, and thus further improve outcomes for these sight issues. The most recent technological developments are increasing the method’s accuracy with computer guidance. Combined with laser precision, the likelihood of a serious complication post-Lasik is less than 1 percent.
Lasik is incredible, and the reward of many years of hard work by its innovators. It’s the pinnacle of decades of study and ocular advances. For many people, Lasik is an easy and accessible solution for common eye problems. It’s quick, relatively painless, and also safe and effective, with twenty years of proven results.
This is a blog post by Nancy Evans.
Posted by MedFriendly at 8:42 PM
Tuesday, October 27, 2015
There are hundreds of developments still being considered and researched, but these three advancements are either already in use or likely to become a reality in the near future.
1) The Da Vinci Machine
Specially trained surgeons operate this machine to perform procedures that are minimally invasive. The Da Vinci machine uses tiny wristed surgical arms and a small camera to allow for greater rotation during surgery than a human wrist allows. The incisions are very small, often less than an inch long. A trained surgeon watches on a high-definition 3D screen while remotely operating the machine’s arms. Da Vinci allows for greater accuracy and less risk of infection, and has already been used to help more than 2.5 million patients worldwide. Though it isn’t in use in every hospital yet, it may not be long before surgeons everywhere are using machines like this for less invasive and more accurate surgical procedures.
2) Medical Cloud Software
More and more hospitals and clinics are turning to cloud software to improve overall patient care. Cloud software allows doctors to access information not just internally within a hospital, but externally with hospitals outside of their networks. Patient records can be more accurately passed along, paperwork is reduced, and care becomes more streamlined. Most of this software goes beyond just sharing information between doctors. Programs like AdvancedMD All-in-one medical software allow doctors to electronically send prescriptions to pharmacies without paperwork, manage billing with insurance companies, and even allow patients to go online to request appointments with their doctors. This means no more sitting on the phone waiting for a receptionist to pick up. Patients input their needs and availability, and a receptionist calls them to offer appointment times within that availability. While several hospitals are turning to the cloud, not all of them are there... yet.
3) 3D Printing
Right now, doctors are using 3D printing to render exact replicas of patients’ damaged organs in order to assist with more accurate diagnosis. While this alone helps save lives, the possibilities for this sort of printing are incredible. It may not be long before doctors are able to use 3D printers to create exact replicas of a healthy organ for transplant, or artificial arteries as a solution to ruptures. It is entirely possible that within the next 10 years, 3D printed organs could replace the need for an organ donor or transplant waiting list.
These three advancements are only the tip of the iceberg. In the coming years, we will see a number of revolutionary medical advancements that will change the way we experience health care.
This is a blog post by Nancy Evans.
Posted by MedFriendly at 4:06 PM
Tuesday, October 13, 2015
[Image: An MRI scan of the brain]
Although physicians are likely familiar with the history of medical imaging, a brief summary is provided below to show how far the field has come.
The first X-ray was taken of a hand almost 120 years ago, on December 22, 1895, by a German physicist (Wilhelm Röntgen). Röntgen did not know what type of rays he was dealing with, which is why he referred to them with the letter “x” (to designate an unknown quantity). He discovered the rays when he noticed a light green glow, which looked like the bones of his hand, on a fluorescent screen. The screen was about three feet away from an energy discharge tube he was experimenting with, which was covered in black cardboard to prevent light from escaping. He reasoned that the glow was caused by invisible energy that even passed through items on his desk. He took an x-ray picture of his wife’s hand bones soon thereafter, and medical imaging was born.
X-rays became a useful way to quickly image the bones and were widely adopted by physicians at the time to improve diagnosis (e.g., bone fractures, dental cavities) and treatment. Although many people associate x-rays with images of bones, they can also provide images of other bodily structures such as the lungs. This is why they are often used to help diagnose pneumonia. A downside of x-rays is radiation exposure, which carries long-term health risks that increase with the amount of exposure. These effects became better known in the early 1900s. Medical ultrasound imaging, which produces images of body parts from the reflection of sound waves, does not involve any radiation and has been used for the last 50 years or so. This is why ultrasounds are used instead of x-rays to obtain images of the fetus during pregnancy.
Although ultrasound is not considered harmful, the images produced are of low resolution and quality. The same is true for x-rays. To correctly assess and diagnose conditions that require higher imaging quality, more advanced techniques were created, such as CT (computerized tomography) scans and MRI (magnetic resonance imaging) scans. CT scans, first used in 1971, use a combination of x-rays, detectors that rotate around the patient to scan the body, and computer technology to provide high quality 3D images.
While the image quality of the CT scan is much better compared to the x-ray, CT scans use much more radiation (e.g., hundreds of times more) than a traditional x-ray. MRI scans (first used on humans in 1977) get around this problem by providing even higher quality 3D images without the use of radiation. Instead, MRIs use high powered magnets to exploit the magnetic properties of hydrogen atoms in the body’s water. The result is an image that is a very close replica of the body part being imaged. The downsides to MRIs compared to CT scans are that they take much longer, are more expensive, are not as good at visualizing internal bleeding, and cannot be used in patients with certain types of metallic implants. This is why CT scans tend to be used over MRIs in emergency rooms, where the physician needs to quickly determine if there is evidence of internal bleeding that requires emergency intervention. In non-emergency situations, however, the MRI tends to be used.
Medical imaging continues to make dramatic advancements, and as a result of the above technologies, there are many specialized imaging techniques used throughout the various medical specialties. It will certainly be interesting to see what the next 120 years will bring!
Posted by MedFriendly at 12:19 PM