Archive for ‘Management and Treatment’


Letrozole Market – Teva, Actavis, Apotex, Sun Pharma, Hengrui, Top Pharm


This in-depth analysis of the Letrozole market is based on a professional research report titled "Letrozole Market 2016 – 2020."

To begin with, the report introduces the Letrozole market: definitions, industry classification, applications and the industry chain structure are given. The present-day status of the Letrozole market in key regions is stated, and industry policies and news are analysed.

The report then covers the manufacturing process and cost structures. The process is analysed thoroughly, including upstream raw materials, equipment, downstream consumers, the various manufacturing-associated costs (material cost, labour cost, etc.) and the actual production process.

Request for Sample Report @  http://www.marketresearchstore.com/report/global-letrozole-market-research-report-forecast-2016-2021-89959#RequestSample

This report studies Letrozole in the global market, especially in the United States, China, Europe and Japan, and focuses on the top players in these regions/countries, with sales, price, revenue and market share for each manufacturer, covering:
Teva
Actavis
Mylan
Fresenius Kabi
Endo
Apotex
Sun Pharma
Hengrui
Top Pharm
By market segment and region, the report splits the global market into several key regions, with sales (consumption), revenue, market share and growth rate of Letrozole in each, from 2011 to 2021 (forecast):
United States
China
Europe
Japan

To provide information on the competitive landscape, this report includes detailed profiles of the Letrozole market's key players. For each player, product details, capacity, price, cost, gross and revenue figures are given.

Inquiry for Buying Report @  http://www.marketresearchstore.com/report/global-letrozole-market-research-report-forecast-2016-2021-89959#InquiryForBuying

The report also provides an analysis of traders and distributors, along with their contact details, and contact details for material and equipment suppliers. A new-investment feasibility analysis and an assessment of industry growth are included as well.


Microsoft Patents EMG Human-Computer Controllers


Muscle-Computer Interfaces (muCIs)

Many human-computer interaction technologies are currently mediated by physical transducers such as mice, keyboards, pens, dials, and touch-sensitive surfaces. While these transducers have enabled powerful interaction paradigms and leverage our human expertise in interacting with physical objects, they tether computation to a physical artifact that has to be within reach of the user.

As computing and displays begin to integrate more seamlessly into our environment and are used in situations where the user is not always focused on the computing task, it is important to consider mechanisms for acquiring human input that may not necessarily require direct manipulation of a physical implement. We explore the feasibility of muscle-computer input: an interaction methodology that directly senses and decodes human muscular activity rather than relying on physical device actuation or user actions that are externally visible or audible.

A machine learning model is trained by instructing a user to perform prescribed gestures, sampling signals from EMG sensors arranged arbitrarily on the user’s forearm with respect to locations of muscles in the forearm, extracting feature samples from the sampled signals, labeling the feature samples according to the corresponding gestures instructed to be performed, and training the machine learning model with the labeled feature samples. Subsequently, gestures may be recognized using the trained machine learning model by sampling signals from the EMG sensors, extracting from the signals unlabeled feature samples of a same type as those extracted during the training, passing the unlabeled feature samples to the machine learning model, and outputting from the machine learning model indicia of a gesture classified by the machine learning model.
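
The patent does not name a specific classifier or feature set, so the following is only a rough sketch of the train-then-recognize flow described above, using per-timeslice feature vectors and a generic off-the-shelf classifier (scikit-learn's SVC); the feature choices, array shapes and function names are our own illustrative assumptions, not Microsoft's implementation.

```python
import numpy as np
from sklearn.svm import SVC

def extract_features(timeslice):
    """Time-independent features for one timeslice of shape (n_sensors, n_samples):
    per-channel RMS amplitude plus spectral energy in a few coarse bands
    (illustrative choices, not taken from the patent)."""
    rms = np.sqrt((timeslice ** 2).mean(axis=1))
    spectrum = np.abs(np.fft.rfft(timeslice, axis=1)) ** 2
    bands = np.array_split(spectrum, 4, axis=1)                 # 4 coarse frequency bands
    band_energy = np.concatenate([b.sum(axis=1) for b in bands])
    return np.concatenate([rms, band_energy])

def train_gesture_model(timeslices, gesture_labels):
    """Label each training timeslice with the gesture the user was instructed
    to perform and fit a classifier on the extracted feature samples."""
    X = np.stack([extract_features(t) for t in timeslices])
    return SVC(kernel="rbf").fit(X, gesture_labels)

def recognize(model, timeslice):
    """Classify an unlabeled timeslice with the trained model."""
    return model.predict(extract_features(timeslice)[np.newaxis, :])[0]
```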

1. A method for identifying individual finger movements, the method comprising: training a machine learning model by instructing a user to perform proscribed gestures, sampling signals from EMG sensors arranged arbitrarily on the user’s forearm with respect to locations of muscles in the forearm, extracting feature samples from the sampled signals and labeling the feature samples according to the corresponding gestures instructed to be performed, and training the machine learning model with the labeled feature samples; and recognizing gestures using the trained machine learning model by sampling signals from the EMG sensors, extracting from the signals unlabeled feature samples of a same type as those extracted during the training, passing the unlabeled feature samples to the machine learning model, and outputting from the machine learning model indicia of a gesture classified by the machine learning model.

2. A method according to claim 1, wherein the proscribed gestures for training comprise intended movement of two or more different fingers, and the indicia of the gesture indicates which of the different intended fingers the gesture corresponds to.

3. A method according to claim 1, wherein the feature samples for the training and the recognizing correspond to respective timeslices of EMG signals of the EMG sensors, and the feature samples for the training and the recognizing comprise time-independent features of the timeslices of EMG signals.

4. A method according to claim 3, wherein the time-independent features comprise features based on amplitudes, energy frequency distribution, and/or phase coherence.

5. A method according to claim 1, wherein the EMG sensors are fixed in a wearable device such that when the wearable device is donned by the user the EMG sensors are placed arbitrarily on the user’s arm in accordance with arbitrary placement of the wearable device.

6. A method according to claim 1, further comprising: outputting from the machine learning model a sequence of indicia of gestures; and analyzing aggregations of the sequence to determine which gestures were performed by the user based on the indicia of gestures in the window.

7. One or more computer readable media storing information to enable a computing device to perform a process, the process comprising: receiving a plurality of EMG signals derived from respective EMG sensors, the sensors being arranged in a wearable device placed arbitrarily on the forearm; dividing the EMG signals into a sequence of signal samples, a signal sample comprising signal segments of the EMG signals of the respective EMG sensors that span an increment of time; for each signal sample, forming a corresponding feature vector by extracting from the signal sample a plurality of values of different types of features based on the signal segments of the signal sample; and passing the feature vectors to a machine learning module previously trained with feature vectors labeled with known gestures and outputting from the machine learning module gesture classifications of the respective feature vectors.

8. One or more computer readable media according to claim 7, wherein based on a first of the feature vectors the machine learning module outputs a first classification comprising identity of a first of a plurality of particular types of finger movements for which the machine learning module has been trained to classify, and based on a second of the feature vectors the machine learning module outputs a second classification comprising identity of a second of the plurality of types of finger movements.

9. One or more computer readable media according to claim 8, wherein the process further comprises training the machine learning module by, when a user is performing known gestures, performing the dividing, and forming, and also labeling the corresponding feature vectors according to the known gestures being performed when the signal samples of the feature vectors were captured, where the labeled feature vectors train the machine learning module.

10. One or more computer readable media according to claim 9, the process further comprising ignoring, during the training, an initial period of signal data after a known gesture training event begins.

11. One or more computer readable media according to claim 10, wherein the training data is generated by: instructing a user to perform a single particular specified finger gesture; and during performance of the specified gesture, capturing the signal for a given sensor and obtaining a plurality of segments of the captured signal, for each segment deriving a sample comprising a plurality of signal characteristics of the corresponding segment, and labeling each of the samples as corresponding to the particular specified gesture, wherein the classification model is trained with the samples, and whereby a plurality of labeled samples are obtained corresponding to the single specified finger gesture.

12. One or more computer readable media according to claim 7 wherein the different types of features in a feature vector include both individual features extracted only from corresponding individual segments of a signal sample, and combined features that are based on pairs of the individual features.

13. One or more computer readable media according to claim 7, the process further comprising analyzing a plurality of the outputted classifications to identify a gesture.

14. A device comprising storage and a processor, the device further comprising: a feature extraction module receiving an EMG signal from each of a plurality of respective EMG sensors arbitrarily placed on a forearm, the feature extraction module analyzing the EMG signals and outputting a plurality of discrete samples, each sample comprising a plurality of time-independent features derived from the EMG signals, where different samples correspond to different respective time slices of the signals, a time slice comprising segments of each of the signals of the respective EMG sensors for a same period of time; and a trained machine learning module which receives the samples and determines a gesture, and based on an output of the trained machine learning module determining which of a plurality of pre-defined gestures correspond to the samples.

15. A device according to claim 14, the device further comprising a wearable band containing the EMG sensors.

16. A device according to claim 14, wherein the pre-defined gestures, for which the machine learning module is trained to classify samples, comprises one or more combinations of gestures from a set of gestures comprising: a tap gesture comprising a tap of a finger; a curl gesture comprising a curl of a finger; an extension gesture comprising an extension of a finger; a press gesture comprising a press of a finger; or a lift gesture comprising a lift of a finger.

17. A device according to claim 16, wherein the set of gestures further comprises a plurality of pressure levels applied by a finger.

18. A device according to claim 16, wherein the set of gestures further comprises a particular finger associated with one or more of the tap gesture, the curl gesture, the extension gesture, the press gesture, or the lift gesture.

19. A device according to claim 14, further comprising basing a determination that a single gesture was performed based on a plurality of the outputs of the machine learning module.

20. A device according to claim 14, wherein the pre-define gestures comprise intended movements of two or more individual fingers and/or two or more types of finger movements.
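
Claims 3, 4 and 12 describe feature vectors built from time-independent, per-channel features (amplitude, energy frequency distribution, phase coherence) plus combined features over pairs of channels. As a companion to the sketch above, here is one hedged way such a feature vector could be assembled; the specific formulas (e.g. phase coherence as the magnitude of the mean phase-difference vector between two channels) are illustrative assumptions rather than the patent's actual definitions.

```python
import itertools
import numpy as np
from scipy.signal import hilbert

def feature_vector(signal_sample):
    """Build one feature vector from a signal sample of shape (n_sensors, n_samples):
    per-channel amplitude and spectral-energy features, plus pairwise
    phase-coherence features. Details are illustrative, not from the patent."""
    amp = np.abs(signal_sample).mean(axis=1)                          # per-channel amplitude
    energy = (np.abs(np.fft.rfft(signal_sample, axis=1)) ** 2).sum(axis=1)
    phases = np.angle(hilbert(signal_sample, axis=1))                 # instantaneous phase
    coherence = []
    for i, j in itertools.combinations(range(signal_sample.shape[0]), 2):
        # magnitude of the mean phase-difference vector: ~1.0 means the pair is phase-locked
        coherence.append(np.abs(np.exp(1j * (phases[i] - phases[j])).mean()))
    return np.concatenate([amp, energy, np.array(coherence)])
```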

Microsoft has recently applied for a couple of patents that seek exclusive rights to use electromyography (EMG) as an input modality to control computers, consumer gadgets, and, hopefully, assistive devices for disabled folks. The system uses EMG sensors on the forearm, as well as other parts of the body, to detect and transmit motor unit action potentials.

The video below explains the advantages and demonstrates the use of hands-free EMG over traditional buttons and controllers to manipulate device settings.

source : http://appft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PG01&p=1&u=%2Fnetahtml%2FPTO%2Fsrchnum.html&r=1&f=G&l=50&s1=%2220090327171%22.PGNR.&OS=DN/20090327171&RS=DN/20090327171


New Video of RSLSteeper bebionic3 Artificial Hand


RSLSteeper, having previewed their latest prosthetic hand earlier this Spring, is finally releasing the bebionic3 to market. The new hand is essentially a stronger, tougher, more accurate version of the earlier model, featuring new electronics and control software.

The same options are available for your choice of skins, so you can stay rugged like the cowboy in the picture or get a hand that looks remarkably like your own.

Full feature list:

Individual Motors in each finger allow you to move the hand and grip in a natural, coordinated way. The motors are positioned to optimise weight distribution – making the hand feel lighter and more comfortable.

Powerful microprocessors continuously monitor the position of each finger, giving you precise, reliable control over hand movements.

14 Selectable grip patterns and hand positions enable you to perform a huge number of everyday activities with ease.

Proportional Speed Control gives you precision control over delicate tasks, so you can pick up an egg or hold a polystyrene cup as easily as crushing an empty can.

bebalance software and wireless technology located within the bebionic3 myoelectric hand makes it easy to customise the functions to suit your preferences and lifestyle.

Selectable thumb positions and a built-in sensor enable you to complete more tasks than ever before.

Auto grip means no more accidents, as bebionic3 automatically senses when a gripped item is slipping and adjusts the grip to secure it.

Foldaway fingers provide natural looking movement, and flex when you brush past people or bump into objects.

Durable construction and advanced materials make bebionic3 strong enough to handle up to 45kg – so you can confidently use the hand to carry heavy objects, and push yourself up from a seated position.

Innovative palm design protects bebionic3 from impact damage, and makes the hand quieter than ever.

Soft finger pads and a wide thumb profile maximise the surface area and enhance grip.


Independence through technology

Pioneering technology and innovative design features combine to make bebionic3 the world’s most lifelike, affordable, functional and easy to use myoelectric hand commercially available today.

Reliable, speedy and versatile, bebionic3 can be configured to handle almost anything you need to do. bebionic3 is designed to be stronger and more durable than other hands available, meaning that it can be worn daily, and withstand the stresses and strains of constant use.

RSLSteeper, the UK firm that makes advanced hand prostheses, has released a new video that shows off their new bebionic3 hand. Previous vids of the device included a high energy advertisement as well as a preview of the hand independent of an actual wearer.

The new video features a man with a missing arm who demonstrates his futuristic new device, including cracking eggs and opening cold bottles of Budweiser, the king of beers. Check it out:

Source : http://bebionic.com/the_hand


Viveve System for Vaginal Laxity Receives CE Approval


Viveve, from Palo Alto, CA, has received CE approval of its system for a condition we had never heard of before: vaginal laxity. On the other hand, of course, in this age where vaginoplasties are quickly becoming as normal as nose jobs, we should not really be surprised. The device uses radiofrequency to treat laxity of the vaginal introitus after childbirth and thereby improve female sexual function. The system consists of an RF generator, a handpiece and single-use disposable tips. The procedure can be performed without anesthesia in approximately 30 minutes. According to studies cited by the company in its press release, women have reported improved feelings of vaginal tightness several months after treatment (no word about male satisfaction, though). Bingo, a new market has been created!


How Common is Vaginal Laxity?

In a survey of over four hundred women who had vaginal deliveries, nearly half reported some level of concern with vaginal looseness, or what is known as vaginal laxity.

Source : http://www.viveve.com/understanding-vaginal-laxity/a-common-concern


Photosynthesis Thought to Exhibit Quantum Entanglement Phenomenon


Light harvesting components of photosynthetic organisms are complex, coupled, many-body quantum systems, in which electronic coherence has recently been shown to survive for relatively long time scales despite the decohering effects of their environments. Within this context, we analyze entanglement in multi-chromophoric light harvesting complexes, and establish methods for quantification of entanglement by presenting necessary and sufficient conditions for entanglement and by deriving a measure of global entanglement. These methods are then applied to the Fenna-Matthews-Olson (FMO) protein to extract the initial state and temperature dependencies of entanglement. We show that while FMO in natural conditions largely contains bipartite entanglement between dimerized chromophores, a small amount of long-range and multipartite entanglement exists even at physiological temperatures. This constitutes the first rigorous quantification of entanglement in a biological system. Finally, we discuss the practical utilization of entanglement in densely packed molecular aggregates such as light harvesting complexes.
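
For readers wondering how entanglement is quantified at all in such a system: in the single-excitation subspace commonly used to model light-harvesting complexes, a standard pairwise measure is the concurrence between two chromophore sites, which reduces to a simple function of the density-matrix coherence (this is the generic textbook-style expression, not necessarily the exact global measure derived in the paper):

$$ C_{ij} = 2\,\lvert \rho_{ij} \rvert $$

where $\rho_{ij}$ is the coherence between the states with the single excitation localized on site $i$ or site $j$; $C_{ij}=0$ for an unentangled pair and $C_{ij}=1$ for a maximally entangled one.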

“Using high-speed lasers, a Berkeley research team has discovered cell-level ‘quantum computing’ occurring during photosynthesis. (Light energy first moves through every possible path, and then retroactively ‘decides’ which is most efficient.) ‘Life’s very existence may be the consequence and continued operation of a quantum computer,’ notes a transhumanist science magazine, giving the historic context for research into high-speed non-binary calculations occurring in nature, and arguing that ‘Ultrafast computing, accelerated by our explorations into the new science of quantum biology, could well be the critical technology that pushes us over the edge into the Singularity.’”

Back in January we blogged about an article in Discover Magazine that reviewed the latest discoveries involving quantum mechanical effects in biological systems. Now researchers from the Berkeley Center for Quantum Information and Computation, the Department of Chemistry at UC Berkeley, and the Lawrence Berkeley National Laboratory believe they have identified quantum entanglement events occurring in photosynthesis.

From The Physics arXiv Blog:

Various studies have shown that in light harvesting complexes, chromophores can share coherently delocalised electronic states. K. Birgitta Whaley at the Berkeley Center for Quantum Information and Computation and pals say this can only happen if the chromophores are entangled.

They point out that these molecules do not seem to exploit entanglement. Instead, its presence is just a consequence of the electronic coherence.

This is a big claim that relies somewhat on circumstantial evidence. It’ll be important to get confirmation of these ideas before they can become mainstream.

Nevertheless, if correct, the discovery has huge implications. For a start, biologists could tap into this entanglement to make much more accurate measurements of what goes on inside molecules during photosynthesis, using the various techniques of quantum metrology that physicists have developed.

More exciting still is the possibility that these molecules could be used for quantum information processing at room temperature. Imagine photosynthetic quantum computers!

Source : http://arxiv.org/abs/0905.3787


IBM Clinical Genomics Helps with Clinical Decision Making


HAIFA, Israel & MILAN – 14 Mar 2012: IBM (NYSE: IBM) today announced it has developed a unique biomedical analytics platform for personalized medicine that could enable doctors to better advise on the best course of medical treatment. This could lead to smarter and more personalized healthcare in a wide range of areas, including cancer management, hypertension, and AIDS care.

Scientists from IBM Research are collaborating with the Fondazione IRCCS Istituto Nazionale dei Tumori, a major research and treatment cancer center in Italy, on the new decision support solution. This new analytics platform is being tested by the Institute’s physicians to personalize treatment based on automated interpretation of pathology guidelines and intelligence from a number of past clinical cases, documented in the hospital information system.


Selecting the most effective treatment can depend on a number of characteristics including age, weight, family history, current state of the disease and general health. As a result, more informed and personalized decisions are needed to provide accurate and safe care.

IBM’s latest healthcare analytics solution, Clinical Genomics (Cli-G), can integrate and analyze all available clinical knowledge and guidelines, and correlate it with available patient data to create evidence that supports a specific course of treatment for each patient. Developed at IBM Research – Haifa, the new prototype works by investigating the patient’s personal makeup and disease profile, and combines this with insight from the analysis of past cases and clinical guidelines. The solution may provide physicians and administrators with a better picture of the patient-care process and reduce costs by helping clinicians choose more effective treatment options.

“Making decisions in today’s complex environment requires computerized methods that can analyze the vast amounts of patient information available to ease clinical decision-making,” notes Dr. Marco A. Pierotti, Scientific Director at the Istituto Nazionale dei Tumori. “By providing our physicians with vital input on what worked best for patients with similar clinical characteristics, we can help improve treatment effectiveness and the final patient outcome.”

Founded in 1925, the Fondazione IRCCS Istituto Nazionale dei Tumori in Milan is recognized as a scientific research and treatment institution in the field of pre-clinical and clinical oncology. The Institute’s special status as a research center enables it to transfer research results directly to clinical care. The Institute initiated this collaboration with IBM to enhance patient care through better use of innovative IT solutions. Once physicians make a diagnosis, they will receive personalized insights for their patients, based on medical information, automated interpretation of pathology clinical guidelines, and intelligence from a number of past clinical cases, documented in the hospital information system.

In addition to supporting decision-making about treatment, it can provide administrators at Fondazione IRCCS Istituto Nazionale dei Tumori with an aggregated view of patient care, enabling them to evaluate performance and use this knowledge to streamline processes for maximum safety. For example, hospital administrators can drill down into the data to better understand what the guidelines were, what succeeded, and whether treatment quality has improved.

“Our clinical genomics solution may enable care-givers to personalize treatment and increase its chances of success,” explains Haim Nelken, senior manager of integration technologies at IBM Research – Haifa. “The solution is designed to provide physicians with recommendations that go beyond the results of clinical trials. It may allow them to go deeper into the data and more accurately follow the reasoning that led to choices previously made on the basis of subjective memory, intuition, or clinical trial results.”

Any patient data securely collected from hospitals and health organizations is ‘de-identified’ or made anonymous through the removal of personal identifying details. The IBM system does not need to know which individuals the information came from in order to draw conclusions. It works by identifying similar cases based on age, sex, symptoms, diagnosis, or other related factors.
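
IBM has not published Cli-G's matching algorithm, so the following is only a toy illustration of the "identify similar cases" idea described above: score de-identified past cases by overlap on a handful of attributes and return the closest matches. The field names, weights and scoring rule are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class CaseRecord:
    """A de-identified past case; the fields shown are hypothetical examples."""
    age: int
    sex: str
    diagnosis: str
    symptoms: set = field(default_factory=set)
    treatment: str = ""
    outcome: str = ""

def similarity(patient, case):
    """Crude attribute-overlap score between a new patient and a past case."""
    score = 0.0
    score += 1.0 if patient.sex == case.sex else 0.0
    score += 1.0 if patient.diagnosis == case.diagnosis else 0.0
    score += max(0.0, 1.0 - abs(patient.age - case.age) / 20.0)       # age proximity
    union = patient.symptoms | case.symptoms
    if union:
        score += len(patient.symptoms & case.symptoms) / len(union)   # symptom overlap
    return score

def most_similar_cases(patient, past_cases, k=5):
    """Return the k past cases most similar to the new patient."""
    return sorted(past_cases, key=lambda c: similarity(patient, c), reverse=True)[:k]
```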

IBM Research – Haifa, Israel has developed a new clinical decision support tool that correlates a patient’s unique disease profile with various clinical guidelines and a wide range of previously acquired clinical data from a multitude of patients. The tool, called Clinical Genomics (Cli-G), is designed to provide clinicians with actionable results that outline how to address individual patients’ conditions.

The system is currently being tested at the Fondazione IRCCS Istituto Nazionale dei Tumori, a research and cancer treatment center in Italy.


source : http://www-03.ibm.com/press/us/en/pressrelease/37199.wss


New In-The-Canal Hearing Aid by Siemens


Siemens’ new hearing instrument, Eclipse, offers customized comfort and sound

Individual manufacturing and new fitting techniques allow the instrument to be placed deep in the ear canal, providing a pleasant listening experience right from the start.

Boston, USA / Erlangen, Germany, 2012-Mar-28

Siemens presented its new in-the-canal hearing aid, “Eclipse,” at the AudiologyNOW! congress of the American Academy of Audiology (AAA). It sits hidden in the ear canal, making it a discreet solution for many hearing-aid wearers. Thanks to a replaceable foam cylinder, it can be placed directly in front of the eardrum, deeper than previous hearing aids, and its remote location removes the occlusion effect typical of many in-the-ear (ITE) devices. The latest generation of Siemens’ BestSound Technology, XCEL, provides a high degree of speech comprehension and balanced sound right from the start.


Most people who are hard of hearing want a hearing device that is as small as possible or, better yet, completely invisible. That’s why Siemens has offered ITE hearing aids for the past 45 years. The Siemens Healthcare Sector’s many years of experience have enabled it, as part of the Agenda 2013 initiative, to make the innovative Eclipse so small and precise that it can be pushed right up to the eardrum, deeper than previous in-the-ear instruments. The individually fitted housing, with a small foam cylinder on the end, sits securely in its designated place against the wall of the ear canal. This deep location prevents the occlusion effect, which more or less distorts the sound of one’s own voice in contemporary ITE instruments and is often considered unpleasant.

Despite this deep placement, Eclipse can be put into place and removed quickly and easily, without risk of causing any physical damage, making daily use simpler and making it easier to exchange the battery or, for hygienic reasons, the foam cylinder, which is available in three different sizes.

But Eclipse’s position in the ear’s natural sound canal is not the only thing that makes the sound more pleasant: Eclipse hearing aids are also equipped with the latest generation of BestSound Technology, XCEL, developed by Siemens. It contains the latest in digital sound management, which automatically finds the right balance between the necessary amplification of speech signals and the wearer’s individual sound preferences. And all that from day one. Because the program automatically ensures that the sound is pleasant in every phase of adjustment, the wearer can get used to the hearing aid’s new sound impressions much more quickly than with previous models.

Eclipse hearing aids are available in two different performance levels for light and mild hearing loss.

Launched by Siemens Healthcare Sector in November 2011, Agenda 2013 is a two-year global initiative to further strengthen the Healthcare Sector’s innovative power and competitiveness. Specific measures will be implemented in four fields of action: Innovation, Competitiveness, Regional Footprint, and People Development.

The Siemens Healthcare Sector is one of the world’s largest suppliers to the healthcare industry and a trendsetter in medical imaging, laboratory diagnostics, medical information technology and hearing aids. Siemens offers its customers products and solutions for the entire range of patient care from a single source – from prevention and early detection to diagnosis, and on to treatment and aftercare. By optimizing clinical workflows for the most common diseases, Siemens also makes healthcare faster, better and more cost-effective. Siemens Healthcare employs some 51,000 employees worldwide and operates around the world. In fiscal year 2011 (to September 30), the Sector posted revenue of 12.5 billion euros and profit of around 1.3 billion euros. For further information please visit: http://www.siemens.com/healthcare

The products mentioned here are not commercially available in all countries. Due to regulatory reasons the future availability in any country cannot be guaranteed. Further details are available from the local Siemens organizations.

Siemens released a new hearing aid at the AudiologyNOW! conference of the American Academy of Audiology being held in Boston these days. The new device can be placed discreetly deep in the ear canal. A replaceable foam cylinder makes it possible to position the aid directly in front of the eardrum. Two different performance levels are available, for light and mild hearing loss.

The hearing aid is equipped with XCEL technology which manages the balance between speech amplification and background noise. The algorithms in the technology facilitate easy listening even for inexperienced hearing-aid wearers without overwhelming their ears with new unfamiliar sounds. Speech remains clear and comprehensible, while the sound quality can be adjusted to the individual wants and needs of the wearer. The XCEL technology is also available in the following hearing aids upon market launch: Motion SX, Motion P, Pure and Pure Carat.

Source : http://www.siemens.com/press/en/pressrelease/?press=/en/pressrelease/2012/healthcare/h20120323.htm


Bandu Watches Your Stress Levels: Interview with Founder, Dr. Robert Goldberg


bandu by Neumitra

Are you in control of your best performances? The effects of our daily commutes, personal relationships, and work demands keep us from getting there. That’s why we developed bandu, a watch that empowers you to slow down, take a moment, and prepare your best efforts. We show you what works. Like a little buddy on your wrist, bandu learns to help you take a step back and collect your thoughts. Built by a neuroscientist and engineers, bandu alerts you to listen to music, play a game, or call a friend. Using your phone, you set the messages and write custom ones that appear right on the watch face. Our goal is to give you a daily friend to help you strengthen your Buddha brain.

How bandu works

bandu measures the autonomic nervous system, which drives the physical effects of stress, including changes in perspiration, respiration, and heart activity. The current version monitors your skin conductance, movement, and temperature. When you become stressed, bandu alerts you to take a break. You can listen to a favorite song, play a fun game, call a trusted friend, stand up to stretch, or even meditate. The results are shown on your smartphone in real time so you can monitor your stress level, see when it rises, and determine the people, places, and things that help you feel better. And over time, bandu automatically learns what works.
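
Neumitra has not disclosed bandu's detection algorithm, so purely as a sketch of the general idea (flag sustained rises in skin conductance above a rolling personal baseline, then trigger an alert), something like the following could serve as a starting point; the window length, threshold factor and sensor values are hypothetical.

```python
import statistics
from collections import deque

def stress_alerts(eda_readings, window=120, rise_factor=1.5):
    """Yield True for samples where skin conductance (in microsiemens) rises
    well above the wearer's recent rolling baseline. All parameter values
    are illustrative, not Neumitra's actual settings."""
    baseline = deque(maxlen=window)
    for eda in eda_readings:
        if len(baseline) >= window // 2:
            yield eda > rise_factor * statistics.median(baseline)   # possible stress spike
        else:
            yield False                                             # not enough history yet
        baseline.append(eda)

# Example: a calm stretch followed by a sharp rise in conductance
readings = [2.0 + 0.1 * (i % 5) for i in range(150)] + [5.0] * 10
print(sum(stress_alerts(readings)), "alert samples")   # the final spike triggers alerts
```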

Over 40 million Americans are suffering from a clinical anxiety disorder, and about 200 million prescriptions are written each year to address anxiety problems. Just thinking about society’s problem with stress is enough to induce stress. Fortunately, there are innovative companies, like Neumitra, which have taken up the challenge of addressing stress through some of our favorite technologies: quantified self, wearatronics, big data, machine learning, etc.

This author first met the Neumitra team this past summer at the Rock Health Boston offices and was impressed with their plans to build a watch, called the bandu, to measure people’s autonomic nervous system activity throughout the day. They have some interesting ideas, such as generating heat maps showing where the most stressful locations are based on individual and aggregated stress data (e.g. the office, public transportation stations, New York City, etc). Now they’ve launched an ambitious indiegogo campaign to raise money for their first production run. We caught up with founder Robert Goldberg to discuss their company, the bandu, and their upcoming plans.

Shiv Gaglani, Medgadget: How did neumitra come to be? Where did you get the name from?

Robert Goldberg: We all met in the Neurotechnology Ventures class at MIT taught by Ed Boyden and Joost Bonsen. I’m a neuroscientist by training (neuroimaging and fMRI). My co-founders worked in product development for the Human Genome Project and in machine learning and pattern recognition for the MIT Lincoln Lab. Anand found the benefits of meditation in his own life. So we started by discussing the value of a wearable sensor for measuring the effects of daily life on the mind. We saw that mental health was the third-largest category of U.S. healthcare spending, with anxiety disorders as the largest category of mental health symptoms.

Neumitra literally comes from the Latin “neu” for new and the Sanskrit “mitra” for friend. So new friend or neurofriend. We saw the smartphone as a new friend for your daily health. It’s east meets west both in the name and in the execution – data to drive behavioral changes. It also helped that we came up with the name sitting in Building 46 at MIT (Brain and Cognitive Sciences) and MIT is right in the middle of the name.

Medgadget: Your first product is the bandu, correct? What does it do and how does it work? Where did its name come from?

Goldberg: bandu is designed as the first effort to measure the autonomic nervous system as part of your daily life. We found that as compelling as the data is for us, users wanted something more. So we built in the watch face to alert you during your most stressful times and trigger apps on your phone to help you to feel better and show what works for you, your friends, and across larger groups.

bandu is from the Sanskrit “bandhu” meaning friend or buddy. We see bandu as a living, learning buddy for your daily life. The AI is designed to get smarter the more you wear it, to better alert you to impending changes in your stress and offer the best apps to help support you. Your data allows us to understand and affect stress more broadly across your life, your friends, even your whole company and city.

Medgadget: Who is your target market, and how big is the landscape?

Goldberg: Our goal is to deliver into the clinic new technologies to help patients who are really struggling. The challenge is the reimbursement and regulatory hurdles to reach that outcome. Consumers allow us to focus on the daily use case for why someone would want to wear the technologies long-term. Our focus is then providing daily value in a way that activity and sleep devices have been challenged to provide. We know that over 40 million Americans have a clinical anxiety disorder and more than 200 million prescriptions are being written each year for sedative drugs. But stress isn’t just associated with mental health. Stress makes symptoms worse for cardiac diseases, respiratory ailments, digestive issues, and even pain and fertility. Stress, and the autonomic nervous system, affects the whole body. Nowhere is this central role more apparent than when we look at the physiology of the sympathetic versus the parasympathetic nervous systems.

Medgadget: Do you have any competitors? If so, what makes you different?

Goldberg: Biofeedback devices are available to consumers but not in a wearable form to enable real-time alerts and daily, passive data tied to smartphone usage. We then show you the statistics.

Medgadget: You recently launched an indiegogo campaign for bandu. What is your primary goal for the campaign?

Goldberg: The goal for our indiegogo campaign is to engage early adopters with the first version of our technologies, given that we’re only making small batches prior to ramping up widespread production. We aim to effectively bring in the feedback from those early users to build better technologies. Our earliest backers will make the technology and scientific progress possible.

Medgadget: What is your background in medicine and medical technology?

Goldberg: I first found myself in a psychiatric hospital at the age of 20 to better understand my family history of mental illness. At night I worked in a sleep lab and realized we could measure brain waves while you sat, meditated, or slept. Those experiences drove me away from psychiatry and into neuroimaging and fMRI where I did research for ten years. In 1999 we were collecting one GB of fMRI data across 30,000 locations every 3 seconds in one person’s brain. I watched as the math was invented to analyze that data. When we first started discussing wearable sensors I immediately saw the data associated with daily life and with the smartphone. The stress map we compute clearly has a precedent in neuroimaging heat maps. We’re building the framework for a real-time stress map of the entire planet.

For more information, check out this entertaining and informative video about bandu (with an appropriate soundtrack):

http://www.youtube.com/watch?v=b1zfvmG7BYs&feature=player_embedded

Source : http://www.indiegogo.com/bandu


GE’s Q.Freeze Motion Correction for PET/CT


Q.Freeze imaging technology virtually eliminates a patient’s respiratory motion within PET/CT imaging, potentially resulting in healthcare cost savings

WAUKESHA, WI – April 18, 2012: GE Healthcare today announced it has received FDA clearance of Q.Freeze, one of the PET/CT quantitative imaging technologies designed to enable treatment evaluation earlier in a patient’s cancer treatment.

Q.Freeze combines the quantitative benefits of 4D phase-matched PET/CT imaging, MotionMatch, into a single static image. By collecting CT and PET data at each phase of the breathing cycle, then matching the data for attenuation correction purposes, Q.Freeze is designed to improve quantitative consistency compared to conventional static PET imaging techniques while facilitating the reading of the 4D PET/CT imaging. None of the acquisition data is wasted, as 100 percent of the counts collected are combined to create a single static image. The goal is an image that has the dual benefit of frozen patient respiratory motion and reduced image noise.
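
GE has not published the Q.Freeze algorithm, but conceptually, folding all of the phase-gated data into one static image amounts to mapping each respiratory-phase image back onto a reference phase and summing the counts so that nothing is discarded. A rough numpy sketch of that idea, with registration reduced to hypothetical per-phase rigid shifts, might look like this:

```python
import numpy as np

def combine_phases(phase_images, displacements):
    """Combine respiratory-phase PET volumes into a single static image.

    phase_images: list of 3D numpy arrays, one per breathing-cycle phase.
    displacements: per-phase integer (dz, dy, dx) shifts mapping each phase
    back onto the reference phase (a stand-in for real motion estimation).
    All counts are kept; none are discarded."""
    combined = np.zeros_like(phase_images[0], dtype=float)
    for volume, shift in zip(phase_images, displacements):
        # crude rigid "registration"; a real system would use non-rigid,
        # attenuation-matched registration of phase-matched PET/CT data
        combined += np.roll(volume, shift, axis=(0, 1, 2))
    return combined
```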

The spark behind the idea, which has been under development since 2006, was that correcting for lesion movement tied to respiratory motion at the same comfort and dose level as a routine static procedure would support clinicians’ diagnostic confidence. Because patients may not breathe the same way as they did during their baseline study, Q.Freeze could help enable easy and proper longitudinal response-to-therapy comparisons.

“GE Healthcare has demonstrated its excellence in addressing one of the biggest clinical challenges in PET/CT: respiratory motion,” said Michael Barber, vice president and general manager, Molecular Imaging, GE Healthcare. “Respiratory motion impacts image clarity and quantification accuracy of lesions in organs subject to respiratory motion such as the lung, liver and pancreas. Now with Q.Freeze, GE Healthcare is offering an innovative technology that makes patient respiratory motion correction a routine procedure for every scan.”

In a recent study by Professor Cristina Messa, Head of the Nuclear Medicine Department, and Dr. Luca Guerra, Nuclear Medicine Physician at the Center for Molecular Bio-imaging and San Gerardo Hospital (HSG-CBM), a patient underwent PET/CT for nodule characterization. Their findings included a comparison of static acquisition, 4D PET/CT acquisition and Q.Freeze acquisition to determine clear evidence of a metabolic lesion. The Q.Freeze acquisition was able to increase image quality, making the lesion easily identifiable. According to the doctors, this improvement leads to a potential workflow benefit, allowing the physician to review only one set of images free of the patient’s respiratory motion.

Q.Freeze is included in the GE Healthcare Q.Suite*, a collection of next-generation capabilities designed to further quantitative PET by generating more consistent Standardized Uptake Value (SUV) readings — enabling clinicians to assess treatment response accurately. During the course of cancer treatment, clinicians traditionally gauge progress by looking for physical change in the size of a tumor, typically using Computed Tomography (CT) or Magnetic Resonance (MR). However, with quantitative PET imaging, they are also able to consider a tumor’s metabolic activity. In many cases, metabolic changes in a tumor can be perceived earlier than physical ones, thus quantitative PET can give physicians an earlier view of how well a treatment is working.

For quantitative PET to be effective, consistency of SUV measurements between a patient’s baseline scan and subsequent follow-up scans is critical. Variation can occur throughout the PET workflow, in areas from patient management and biology to equipment protocols and performance. Controlling these variables to increase consistency can improve the clinician’s confidence that an SUV change has true clinical meaning.
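
For context, the Standardized Uptake Value referred to here is commonly defined as the measured tissue activity concentration normalized by the injected dose per unit body weight:

$$ \mathrm{SUV} = \frac{C_{\text{tissue}}\ [\mathrm{kBq/mL}]}{\text{injected dose}\ [\mathrm{kBq}] \;/\; \text{body weight}\ [\mathrm{g}]} $$

Small SUV changes between a baseline scan and follow-up scans are only meaningful if acquisition and reconstruction conditions are kept consistent, which is exactly the variability Q.Suite aims to control.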

“Doctors are seeking quantitative tools, such as Q.Freeze, to obtain reproducible measurements over a longitudinal patient study. Q.Freeze is one of the first elements of a suite of tools that may enable doctors to confidently assess biological changes in a patient during a course of treatment, allowing them to quickly and accurately modify treatment regimens,” said Vivek Bhatt, general manager, PET/CT, GE Healthcare. “These tools, ultimately, could potentially contribute to personalized oncology care, increase quality of patient care and reduce wasted expenditure on ineffective treatment.”

About GE Healthcare

GE Healthcare provides transformational medical technologies and services that are shaping a new age of patient care. Our broad expertise in medical imaging and information technologies, medical diagnostics, patient monitoring systems, drug discovery, biopharmaceutical manufacturing technologies, performance improvement and performance solutions services help our customers to deliver better care to more people around the world at a lower cost. In addition, we partner with healthcare leaders, striving to leverage the global policy change necessary to implement a successful shift to sustainable healthcare systems.

Our “healthymagination” vision for the future invites the world to join us on our journey as we continuously develop innovations focused on reducing costs, increasing access and improving quality around the world. Headquartered in the United Kingdom, GE Healthcare is a unit of General Electric Company (NYSE: GE). Worldwide, GE Healthcare employees are committed to serving healthcare professionals and their patients in more than 100 countries. For more information about GE Healthcare, visit our website at www.gehealthcare.com.

GE Healthcare received FDA clearance for its Q.Freeze motion correction technology that combines multiple PET/CT images from different points in the patient’s respiratory cycle to create a composite high resolution image.

This is accomplished thanks to a camera that tracks the movement of a block resting on the patient’s chest, which provides movement correction data to the algorithm that’s aggregating the images together.


Source : http://www.healthimaginghub.com/126-medical-imaging/3732-ge-healthcare-receives-fda-clearance-of-q-freeze-helps-clinicians-assess-cancer-treatment-response.html


Vasanova Receives European Approval for Arrow Vascular Catheter Positioning System



Placement of a peripherally inserted central line so its tip is located at the lower third of the superior vena cava, just prior to the right atrium, is always a guessing game. Nowadays, the only way to confirm position of the line is to do an X-ray of the chest. VasoNova out of Sunnyvale, California wants to change the rules of the game. The company is introducing a device that uses Doppler ultrasound to monitor direction of blood flow and ECG to help position the catheter at the optimal location.


The VasoNova VPS consists of several components: the VPS Stylet, the VPS Power Injectable PICC catheter and the VPS Console.

The VPS Stylet contains two sensors at its tip: a Doppler ultrasound sensor and an intravascular ECG lead. Due to the ingenuity of our R&D group, VasoNova was able to miniaturize the two sensors and construct a highly technical and flexible stylet that can fit into the lumen of the VPS Power Injectable PICC line, which has dimensions comparable to other commercially available PICC lines. Once the VPS Stylet is loaded into the VPS catheter and connected to the VPS console per the Instructions for Use, the stylet is able to detect the patient’s physiological data, such as blood flow characteristics and the ECG waveform. From the point of insertion, patient data are sent to the VPS console for analysis. Using highly advanced hardware and a complex algorithm, the VPS console analyzes multiple vectors derived from these data and determines the location of the catheter tip while it is advanced through the patient’s vasculature. Using the data gathered by the sensors and the analysis done by the console, the VasoNova VPS displays visual indicators. These indicators guide the clinician in real time as the catheter is advanced through the vasculature (a simplified sketch of this indicator logic follows the list below):

The green arrow indicates the catheter tip is moving with the blood flow towards the heart, as appropriate.

The orange “do-not-enter” sign indicates the catheter is moving against blood flow, such as into the internal jugular vein and away from the heart, or has passed the lower third of the SVC and is going into the right atrium.

The yellow triangle indicates there is not enough information available, which may occur if the catheter tip is against the vessel wall.

The blue bull’s eye indicates the tip has arrived in the lower 1/3 of the SVC or at the caval-atrial junction.
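
VasoNova's console analyzes multiple Doppler- and ECG-derived vectors; purely as an illustrative reduction of the four on-screen indicators above to code, the decision logic might be summarized as follows (the enum names and inputs are hypothetical, not part of the VPS software):

```python
from enum import Enum

class Indicator(Enum):
    GREEN_ARROW = "advancing with blood flow toward the heart"
    ORANGE_STOP = "moving against flow, or past the lower third of the SVC"
    YELLOW_TRIANGLE = "insufficient information (e.g. tip against the vessel wall)"
    BLUE_BULLSEYE = "tip in the lower third of the SVC / caval-atrial junction"

def choose_indicator(flow_direction, signal_ok, at_target, past_target):
    """Map simplified tip-tracking inputs to an on-screen indicator.

    flow_direction: +1 if moving with flow toward the heart, -1 if against it.
    signal_ok: whether the Doppler/ECG data are adequate for a decision.
    at_target / past_target: booleans from the console's position estimate."""
    if not signal_ok:
        return Indicator.YELLOW_TRIANGLE
    if at_target:
        return Indicator.BLUE_BULLSEYE
    if flow_direction < 0 or past_target:
        return Indicator.ORANGE_STOP
    return Indicator.GREEN_ARROW
```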

VasoNova™ VPS™ (Vascular Positioning System™) takes away the guesswork in central catheter placement.

Finally, a central venous access catheter positioning system that hits the mark the first time with unprecedented accuracy, consistency and ease of use.

VasoNova, Inc. has created, developed and commercialized an innovative vascular navigation system. Founded in 2005, VasoNova, Inc. is the leader in algorithm-based vascular navigation and measurement technology, committed to innovating high quality medical devices to help clinicians address patient needs and improve patient care.


Source : http://www.teleflex.com/en/investor/news/index.html


